A statistical package for computing time and frequency domain analysis
NASA Technical Reports Server (NTRS)
Brownlow, J.
1978-01-01
The spectrum analysis (SPA) program is a general purpose digital computer program designed to aid in data analysis. The program does time and frequency domain statistical analyses as well as some preanalysis data preparation. The capabilities of the SPA program include linear trend removal and/or digital filtering of data, plotting and/or listing of both filtered and unfiltered data, time domain statistical characterization of data, and frequency domain statistical characterization of data.
Notes on numerical reliability of several statistical analysis programs
Landwehr, J.M.; Tasker, Gary D.
1999-01-01
This report presents a benchmark analysis of several statistical analysis programs currently in use in the USGS. The benchmark consists of a comparison between the values provided by a statistical analysis program for variables in the reference data set ANASTY and their known or calculated theoretical values. The ANASTY data set is an amendment of the Wilkinson NASTY data set that has been used in the statistical literature to assess the reliability (computational correctness) of calculated analytical results.
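The computational-correctness issue that benchmarks like ANASTY probe can be illustrated with a short sketch (not from the report; the data and function names are invented): the one-pass "textbook" variance formula suffers catastrophic cancellation on NASTY-style values with a large mean, while the two-pass formula stays accurate.

```python
def naive_var(xs):
    # One-pass "textbook" formula E[x^2] - E[x]^2, prone to
    # catastrophic cancellation when the mean is large.
    n = len(xs)
    s = sum(xs)
    s2 = sum(x * x for x in xs)
    return (s2 - s * s / n) / (n - 1)

def twopass_var(xs):
    # Numerically stable two-pass formula.
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / (n - 1)

# NASTY-style column: a tiny spread around a huge mean.
data = [1e8 + k for k in (1.0, 2.0, 3.0, 4.0, 5.0)]
print(twopass_var(data))           # the exact sample variance is 2.5
print(naive_var(data))             # may differ from 2.5 due to cancellation
```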
Statistical energy analysis computer program, user's guide
NASA Technical Reports Server (NTRS)
Trudell, R. W.; Yano, L. I.
1981-01-01
A high-frequency random vibration analysis method, statistical energy analysis (SEA), is examined. The SEA method accomplishes high-frequency prediction for arbitrary structural configurations. A general SEA computer program is described. A summary of SEA theory, example problems of SEA program application, and a complete program listing are presented.
Tanavalee, Chotetawan; Luksanapruksa, Panya; Singhatanadgige, Weerasak
2016-06-01
Microsoft Excel (MS Excel) is a commonly used program for data collection and statistical analysis in biomedical research. However, the program has many limitations, including fewer analysis functions and a smaller maximum number of cells than dedicated statistical programs. MS Excel cannot complete analyses with blank cells, and cells must be selected manually for analysis. In addition, it requires multiple steps of data transformation and formulas to plot survival analysis graphs, among other tasks. The Megastat add-on program, which will soon be supported by MS Excel 2016, would eliminate some of the limitations of using statistical formulas within MS Excel.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-05
...] Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability AGENCY... Programs and Data Files.'' This guidance is provided to inform study statisticians of recommendations for documenting statistical analyses and data files submitted to the Center for Veterinary Medicine (CVM) for the...
Application of Statistics in Engineering Technology Programs
ERIC Educational Resources Information Center
Zhan, Wei; Fink, Rainer; Fang, Alex
2010-01-01
Statistics is a critical tool for robustness analysis, measurement system error analysis, test data analysis, probabilistic risk assessment, and many other fields in the engineering world. Traditionally, however, statistics is not extensively used in undergraduate engineering technology (ET) programs, resulting in a major disconnect from industry…
NASA Astrophysics Data System (ADS)
Hendikawati, P.; Arifudin, R.; Zahid, M. Z.
2018-03-01
This study aims to design an Android Statistics Data Analysis application that can be accessed through mobile devices, making statistical tools easier for users to reach. The application covers various basic statistics topics along with parametric statistical data analysis. Its output is parametric statistical analysis results that students, lecturers, and other users who need statistical calculations can obtain quickly and understand easily. The Android application is developed in the Java programming language. The server-side programming language is PHP with the CodeIgniter framework, and the database is MySQL. The system development methodology is the Waterfall model, with stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lecturing activities and to help students understand statistical analysis on mobile devices.
Statistical principle and methodology in the NISAN system.
Asano, C
1979-01-01
The NISAN system is a new interactive statistical analysis program package constructed by an organization of Japanese statisticians. The package is widely applicable to both confirmatory and exploratory analysis, and is designed to capture statistical expertise and to help senior statisticians choose an optimal process of statistical analysis. PMID:540594
2010-03-01
A Statistical Analysis of the Effect of the Navy's Tuition Assistance Program: Do Distance Learning Classes Make a Difference? Master's thesis, Jeremy P. McLaughlin, March 2010. This thesis analyzes the impact of participation in the Navy's Tuition Assistance (TA) program on the retention of first-term Navy
ERIC Educational Resources Information Center
Mascaró, Maite; Sacristán, Ana Isabel; Rufino, Marta M.
2016-01-01
For the past 4 years, we have been involved in a project that aims to enhance the teaching and learning of experimental analysis and statistics, of environmental and biological sciences students, through computational programming activities (using R code). In this project, through an iterative design, we have developed sequences of R-code-based…
Statistical Software and Artificial Intelligence: A Watershed in Applications Programming.
ERIC Educational Resources Information Center
Pickett, John C.
1984-01-01
AUTOBJ and AUTOBOX are revolutionary software programs which contain the first application of artificial intelligence to statistical procedures used in analysis of time series data. The artificial intelligence included in the programs and program features are discussed. (JN)
Research Education in Undergraduate Occupational Therapy Programs.
ERIC Educational Resources Information Center
Petersen, Paul; And Others
1992-01-01
Of 63 undergraduate occupational therapy programs surveyed, the 38 responses revealed some common areas covered: elementary descriptive statistics, validity, reliability, and measurement. Areas underrepresented include statistical analysis with or without computers, research design, and advanced statistics. (SK)
IVHS Countermeasures for Rear-End Collisions, Task 1; Vol. II: Statistical Analysis
DOT National Transportation Integrated Search
1994-02-25
This report is from the NHTSA sponsored program, "IVHS Countermeasures for Rear-End Collisions". This Volume, Volume II, Statistical Analysis, presents the statistical analysis of rear-end collision accident data that characterizes the accidents with...
Statistical Symbolic Execution with Informed Sampling
NASA Technical Reports Server (NTRS)
Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco
2014-01-01
Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space, and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis, and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
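The Bayesian estimation step described in this abstract can be sketched in a few lines (a toy stand-in, not Symbolic PathFinder: the "program", its inputs, and the uniform Beta prior are invented for illustration): sample inputs, count how often the target event is reached, and update a Beta prior on the event probability.

```python
import random
from math import sqrt

def toy_program(x, y):
    # Toy stand-in for a program path condition ("assert-style" event).
    return x * x + y * y < 0.25

def estimate_event_probability(n_samples, alpha=1.0, beta=1.0, seed=0):
    # Monte Carlo sampling of inputs plus a Bayesian (Beta-Binomial)
    # update of the probability of reaching the target event.
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        x, y = rng.uniform(-1, 1), rng.uniform(-1, 1)
        hits += toy_program(x, y)
    a, b = alpha + hits, beta + n_samples - hits
    mean = a / (a + b)                                # posterior mean
    var = a * b / ((a + b) ** 2 * (a + b + 1))        # posterior variance
    return mean, sqrt(var)

mean, sd = estimate_event_probability(20000)
print(f"P(event) ~ {mean:.3f} +/- {sd:.3f}")  # true value is pi/16 ~ 0.196
```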
Tools for Basic Statistical Analysis
NASA Technical Reports Server (NTRS)
Luz, Paul L.
2005-01-01
Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates calculates the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from Two Data Points extends and generates a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA curve-fits data to the linear equation y = f(x) and performs an ANOVA to check its significance.
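Two of the toolset's calculations are simple enough to sketch directly (a hedged re-implementation of the ideas, not the spreadsheet code itself; all values below are invented examples): the normal-distribution estimate for a given cumulative probability, and recovery of a normal distribution from two data points with known probabilities.

```python
from statistics import NormalDist

def normal_estimate(p, mu, sigma):
    # "Normal Distribution Estimates": the value x such that
    # P(X <= x) = p for X ~ Normal(mu, sigma).
    return NormalDist(mu=mu, sigma=sigma).inv_cdf(p)

def normal_from_two_points(x1, p1, x2, p2):
    # "Normal Distribution from Two Data Points": recover (mu, sigma)
    # from two values and their cumulative probabilities.
    z1, z2 = NormalDist().inv_cdf(p1), NormalDist().inv_cdf(p2)
    sigma = (x2 - x1) / (z2 - z1)
    return x1 - sigma * z1, sigma

print(normal_estimate(0.975, 0.0, 1.0))           # ~1.96
mu, sigma = normal_from_two_points(10.0, 0.5, 11.96, 0.975)
print(mu, sigma)                                  # ~10.0, ~1.0
```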
Robbins, L G
2000-01-01
Graduate school programs in genetics have become so full that courses in statistics have often been eliminated. In addition, typical introductory statistics courses for the "statistics user" rather than the nascent statistician are laden with methods for analysis of measured variables while genetic data are most often discrete numbers. These courses are often seen by students and genetics professors alike as largely irrelevant cookbook courses. The powerful methods of likelihood analysis, although commonly employed in human genetics, are much less often used in other areas of genetics, even though current computational tools make this approach readily accessible. This article introduces the MLIKELY.PAS computer program and the logic of do-it-yourself maximum-likelihood statistics. The program itself, course materials, and expanded discussions of some examples that are only summarized here are available at http://www.unisi.it/ricerca/dip/bio_evol/sitomlikely/mlikely.html. PMID:10628965
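For discrete genetic counts, the do-it-yourself likelihood approach the article advocates amounts to only a few lines. This sketch (with invented F2 segregation counts, not the article's examples) compares an unrestricted binomial fit against a 3:1 segregation hypothesis with a likelihood-ratio statistic.

```python
from math import log

def log_lik(counts, probs):
    # Multinomial log-likelihood, up to an additive constant.
    return sum(n * log(p) for n, p in zip(counts, probs))

# Hypothetical F2 counts: dominant vs recessive phenotype.
counts = [290, 110]
n = sum(counts)
p_hat = counts[0] / n                      # unrestricted ML estimate
G = 2 * (log_lik(counts, [p_hat, 1 - p_hat])
         - log_lik(counts, [0.75, 0.25]))  # LR test against a 3:1 ratio
print(p_hat, G)  # G is ~chi-square with 1 df under the 3:1 hypothesis
```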
NASA Technical Reports Server (NTRS)
Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.
1983-01-01
The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical tests results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples including complete listings of programs, and input and output data.
SPA- STATISTICAL PACKAGE FOR TIME AND FREQUENCY DOMAIN ANALYSIS
NASA Technical Reports Server (NTRS)
Brownlow, J. D.
1994-01-01
The need for statistical analysis often arises when data is in the form of a time series. This type of data is usually a collection of numerical observations made at specified time intervals. Two kinds of analysis may be performed on the data. First, the time series may be treated as a set of independent observations using a time domain analysis to derive the usual statistical properties including the mean, variance, and distribution form. Secondly, the order and time intervals of the observations may be used in a frequency domain analysis to examine the time series for periodicities. In almost all practical applications, the collected data is actually a mixture of the desired signal and a noise signal which is collected over a finite time period with a finite precision. Therefore, any statistical calculations and analyses are actually estimates. The Spectrum Analysis (SPA) program was developed to perform a wide range of statistical estimation functions. SPA can provide the data analyst with a rigorous tool for performing time and frequency domain studies. In a time domain statistical analysis the SPA program will compute the mean, variance, standard deviation, mean square, and root mean square. It also lists the data maximum, data minimum, and the number of observations included in the sample. In addition, a histogram of the time domain data is generated, a normal curve is fit to the histogram, and a goodness-of-fit test is performed. These time domain calculations may be performed on both raw and filtered data. For a frequency domain statistical analysis the SPA program computes the power spectrum, cross spectrum, coherence, phase angle, amplitude ratio, and transfer function. The estimates of the frequency domain parameters may be smoothed with the use of Hann-Tukey, Hamming, Bartlett, or moving average windows. Various digital filters are available to isolate data frequency components.
Frequency components with periods longer than the data collection interval are removed by least-squares detrending. As many as ten channels of data may be analyzed at one time. Both tabular and plotted output may be generated by the SPA program. This program is written in FORTRAN IV and has been implemented on a CDC 6000 series computer with a central memory requirement of approximately 142K (octal) of 60 bit words. This core requirement can be reduced by segmentation of the program. The SPA program was developed in 1978.
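A rough modern analogue of SPA's pipeline can be sketched with NumPy (the function, signal, window choice, and normalization below are one illustrative option among those SPA offers, not the FORTRAN IV code): time-domain statistics, least-squares linear detrending, then a Hamming-windowed power spectrum.

```python
import numpy as np

def spa_like_analysis(x, dt):
    # Time-domain statistics, as in SPA: mean, variance, RMS, extremes.
    stats = {"mean": x.mean(), "var": x.var(ddof=1),
             "rms": np.sqrt(np.mean(x ** 2)),
             "min": x.min(), "max": x.max(), "n": x.size}
    # Least-squares linear detrend, then Hamming-windowed power spectrum.
    t = np.arange(x.size) * dt
    trend = np.polyval(np.polyfit(t, x, 1), t)
    w = np.hamming(x.size)
    X = np.fft.rfft((x - trend) * w)
    psd = (np.abs(X) ** 2) / (np.sum(w ** 2) / dt)
    freqs = np.fft.rfftfreq(x.size, d=dt)
    return stats, freqs, psd

# Synthetic series: linear trend + 5 Hz sine + noise, sampled at 100 Hz.
rng = np.random.default_rng(1)
t = np.arange(1024) * 0.01
x = 0.5 * t + np.sin(2 * np.pi * 5.0 * t) + 0.1 * rng.standard_normal(1024)
stats, freqs, psd = spa_like_analysis(x, 0.01)
print(freqs[np.argmax(psd)])  # spectral peak lands near the 5 Hz component
```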
NASA Technical Reports Server (NTRS)
1990-01-01
Structural Reliability Consultants' computer program creates graphic plots showing the statistical parameters of glue laminated timbers, or 'glulam.' The company president, Dr. Joseph Murphy, read in NASA Tech Briefs about work related to analysis of Space Shuttle surface tile strength performed for Johnson Space Center by Rockwell International Corporation. Analysis led to a theory of 'consistent tolerance bounds' for statistical distributions, applicable in industrial testing where statistical analysis can influence product development and use. Dr. Murphy then obtained the Tech Support Package that covers the subject in greater detail. The TSP became the basis for Dr. Murphy's computer program PC-DATA, which he is marketing commercially.
An Analysis of the Navy’s Voluntary Education Program
2007-03-01
Excerpted table of contents: Naval Analysis VOLED Study (Data; Statistical Models); Employer Financed General Training (Data; Statistical Model); Data; Statistical Model; Findings.
Analysis of reference transactions using packaged computer programs.
Calabretta, N; Ross, R
1984-01-01
Motivated by a continuing education class attended by the authors on the measurement of reference desk activities, the reference department at Scott Memorial Library initiated a project to gather data on reference desk transactions and to analyze the data by using packaged computer programs. The programs utilized for the project were SPSS (Statistical Package for the Social Sciences) and SAS (Statistical Analysis System). The planning, implementation and development of the project are described.
NASA Technical Reports Server (NTRS)
Gyekenyesi, John P.; Nemeth, Noel N.
1987-01-01
The SCARE (Structural Ceramics Analysis and Reliability Evaluation) computer program on statistical fast fracture reliability analysis with quadratic elements for volume distributed imperfections is enhanced to include the use of linear finite elements and the capability of designing against concurrent surface flaw induced ceramic component failure. The SCARE code is presently coupled as a postprocessor to the MSC/NASTRAN general purpose, finite element analysis program. The improved version now includes the Weibull and Batdorf statistical failure theories for both surface and volume flaw based reliability analysis. The program uses the two-parameter Weibull fracture strength cumulative failure probability distribution model with the principle of independent action for poly-axial stress states, and Batdorf's shear-sensitive as well as shear-insensitive statistical theories. The shear-sensitive surface crack configurations include the Griffith crack and Griffith notch geometries, using the total critical coplanar strain energy release rate criterion to predict mixed-mode fracture. Weibull material parameters based on both surface and volume flaw induced fracture can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and grouped fracture data. The statistical fast fracture theories for surface flaw induced failure, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
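The two-parameter Weibull machinery described here can be sketched compactly (synthetic rupture-type data and a plain least-squares fit on the linearized CDF, as an illustration of the idea rather than the SCARE code): estimate the Weibull modulus and scale from fracture-strength data, then evaluate a failure probability.

```python
import numpy as np

def weibull_fit(strengths):
    # Least-squares fit of the two-parameter Weibull (m, sigma_0) via the
    # linearized form ln(-ln(1-F)) = m*ln(sigma) - m*ln(sigma_0).
    s = np.sort(np.asarray(strengths, dtype=float))
    n = s.size
    F = (np.arange(1, n + 1) - 0.5) / n      # median-rank-style estimator
    y = np.log(-np.log(1.0 - F))
    m, c = np.polyfit(np.log(s), y, 1)
    return m, np.exp(-c / m)                 # Weibull modulus and scale

def failure_probability(sigma, m, sigma0):
    # Cumulative failure probability at applied stress sigma.
    return 1.0 - np.exp(-(sigma / sigma0) ** m)

rng = np.random.default_rng(0)
sample = 300.0 * rng.weibull(10.0, size=200)   # synthetic rupture data
m, s0 = weibull_fit(sample)
print(m, s0)                                   # roughly m ~ 10, sigma0 ~ 300
print(failure_probability(250.0, m, s0))
```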
Taeyoung Kim; Christian Langpap
2015-01-01
This report provides a statistical analysis of the data collected from two survey regions of the United States, the Pacific Northwest and the Southeast. The survey asked about individual agricultural landowners' characteristics, characteristics of their land, and the landowners' willingness to enroll in a tree planting program under incentive payments for carbon...
Linkage analysis of systolic blood pressure: a score statistic and computer implementation
Wang, Kai; Peng, Yingwei
2003-01-01
A genome-wide linkage analysis was conducted on systolic blood pressure using a score statistic. The randomly selected Replicate 34 of the simulated data was used. The score statistic was applied to the sibships derived from the general pedigrees. An add-on R program to GENEHUNTER was developed for this analysis and is freely available. PMID:14975145
The Shock and Vibration Digest. Volume 16, Number 1
1984-01-01
Includes an investigation of the measurement of frequency-band-average loss factors of structural components for use in the statistical energy analysis method. Key words: finite element technique; statistical energy analysis; experimental techniques; framed structures; computer programs. In order to further understand the practical application of statistical energy analysis, a two-section plate-like frame structure is
Multiple linear regression analysis
NASA Technical Reports Server (NTRS)
Edwards, T. R.
1980-01-01
Program rapidly selects best-suited set of coefficients. User supplies only vectors of independent and dependent data and specifies confidence level required. Program uses stepwise statistical procedure for relating minimal set of variables to set of observations; final regression contains only most statistically significant coefficients. Program is written in FORTRAN IV for batch execution and has been implemented on NOVA 1200.
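The stepwise procedure can be sketched as a forward-selection loop (an illustrative re-creation, not the FORTRAN IV program; the F threshold and data are invented): repeatedly add the predictor with the most significant partial F statistic, stopping when none qualifies.

```python
import numpy as np

def forward_stepwise(X, y, f_threshold=8.0):
    # Forward stepwise selection: at each step add the predictor with the
    # largest partial F statistic; stop when no candidate exceeds the
    # (crude, illustrative) threshold.
    n, p = X.shape
    chosen = []
    while True:
        base = np.column_stack([np.ones(n)] + [X[:, j] for j in chosen])
        resid0 = y - base @ np.linalg.lstsq(base, y, rcond=None)[0]
        rss0 = float(resid0 @ resid0)
        best = None
        for j in range(p):
            if j in chosen:
                continue
            trial = np.column_stack([base, X[:, j]])
            resid1 = y - trial @ np.linalg.lstsq(trial, y, rcond=None)[0]
            rss1 = float(resid1 @ resid1)
            F = (rss0 - rss1) / (rss1 / (n - trial.shape[1]))
            if best is None or F > best[1]:
                best = (j, F)
        if best is None or best[1] < f_threshold:
            return chosen
        chosen.append(best[0])

rng = np.random.default_rng(2)
X = rng.standard_normal((200, 5))
y = 3.0 * X[:, 1] - 2.0 * X[:, 4] + rng.standard_normal(200)
print(sorted(forward_stepwise(X, y)))  # columns 1 and 4 carry the signal
```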
Problem area descriptions : motor vehicle crashes - data analysis and IVI program analysis
DOT National Transportation Integrated Search
In general, the IVI program focuses on the more significant safety problem categories as : indicated by statistical analyses of crash data. However, other factors were considered in setting : program priorities and schedules. For some problem areas, ...
The Empirical Review of Meta-Analysis Published in Korea
ERIC Educational Resources Information Center
Park, Sunyoung; Hong, Sehee
2016-01-01
Meta-analysis is a statistical method that is increasingly utilized to combine and compare the results of previous primary studies. However, because of the lack of comprehensive guidelines for how to use meta-analysis, many meta-analysis studies have failed to consider important aspects, such as statistical programs, power analysis, publication…
Development of new on-line statistical program for the Korean Society for Radiation Oncology
Song, Si Yeol; Ahn, Seung Do; Chung, Weon Kuu; Choi, Eun Kyung; Cho, Kwan Ho
2015-01-01
Purpose: To develop a new on-line statistical program for the Korean Society for Radiation Oncology (KOSRO) to collect and extract medical data in radiation oncology more efficiently. Materials and Methods: The statistical program is a web-based program. The directory was placed in a sub-folder of the homepage of KOSRO and its web address is http://www.kosro.or.kr/asda. The server operating system is Linux and the web server is the Apache HTTP server. MySQL is adopted for the database server, and the dedicated scripting language is PHP. Each ID and password are controlled independently, and all screen pages for data input or analysis are made to be user-friendly. Scroll-down menus are used extensively for the convenience of the user and the consistency of data analysis. Results: Year of data is one of the top categories, and the main topics include human resources, equipment, clinical statistics, specialized treatment, and research achievement. Each topic or category has several subcategorized topics. A real-time on-line report of the analysis is produced immediately after each data entry, and the administrator is able to monitor the status of data input at each hospital. Backups of data as spreadsheets can be accessed by the administrator and used for academic work by any member of the KOSRO. Conclusion: The new on-line statistical program was developed to collect data from nationwide departments of radiation oncology. An intuitive screen and consistent input structure are expected to promote data entry by member hospitals, and annual statistics should be a cornerstone of advances in radiation oncology. PMID:26157684
ERIC Educational Resources Information Center
Madhere, Serge
An analytic procedure, efficiency analysis, is proposed for improving the utility of quantitative program evaluation for decision making. The three features of the procedure are explained: (1) for statistical control, it adopts and extends the regression-discontinuity design; (2) for statistical inferences, it de-emphasizes hypothesis testing in…
Some issues in the statistical analysis of vehicle emissions
DOT National Transportation Integrated Search
2000-09-01
Some of the issues complicating the statistical analysis of vehicle emissions and the effectiveness of emissions control programs are presented in this article. Issues discussed include: the variability of inter- and intra-vehicle emissions; the skew...
1988-06-01
and PCBs. The pilot program involved screening, testing, and repairing of EMs/PCBs for both COMNAVSEASYSCOM and Commander, Naval Electronic Systems... were chosen from the Support and Test Equipment Engineering Program (STEEP) tests performed by SIMA San Diego during 1987. A statistical analysis and a
1987-08-01
HVAC duct hanger system over an extensive frequency range. The finite element, component mode synthesis, and statistical energy analysis methods are... A high-frequency (800-5,000 Hz) analysis was conducted with Statistical Energy Analysis (SEA) coupled with a closed-form harmonic beam analysis program. These... resonances may be obtained by using a finer frequency increment. The basic assumption used in SEA analysis is that within each band
RAWS II: A MULTIPLE REGRESSION ANALYSIS PROGRAM,
This memorandum gives instructions for the use and operation of a revised version of RAWS, a multiple regression analysis program. The program... of preprocessed data, the directed retention of variables, listing of the matrix of the normal equations and its inverse, and the bypassing of the regression analysis to provide the input variable statistics only. (Author)
Gorman, Dennis M; Huber, J Charles
2009-08-01
This study explores the possibility that any drug prevention program might be considered "evidence-based" given the use of data analysis procedures that optimize the chance of producing statistically significant results, by reanalyzing data from a Drug Abuse Resistance Education (DARE) program evaluation. The analysis produced a number of statistically significant differences between the DARE and control conditions on alcohol and marijuana use measures. Many of these differences occurred at cutoff points on the assessment scales for which post hoc meaningful labels were created. Our results are compared with those from evaluations of programs that appear on evidence-based drug prevention lists.
Davis, J.C.
2000-01-01
Geologists may feel that geological data are not amenable to statistical analysis, or at best require specialized approaches such as nonparametric statistics and geostatistics. However, there are many circumstances, particularly in systematic studies conducted for environmental or regulatory purposes, where traditional parametric statistical procedures can be beneficial. An example is the application of analysis of variance to data collected in an annual program of measuring groundwater levels in Kansas. Influences such as well conditions, operator effects, and use of the water can be assessed and wells that yield less reliable measurements can be identified. Such statistical studies have resulted in yearly improvements in the quality and reliability of the collected hydrologic data. Similar benefits may be achieved in other geological studies by the appropriate use of classical statistical tools.
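The kind of analysis of variance described, e.g. testing for an operator effect on groundwater-level measurements, reduces to the classic one-way F statistic. This sketch uses hypothetical data, not the Kansas program's measurements:

```python
import numpy as np

def one_way_anova(groups):
    # Classic one-way ANOVA F statistic: between-group mean square
    # over within-group mean square.
    all_x = np.concatenate(groups)
    grand = all_x.mean()
    k, n = len(groups), all_x.size
    ss_between = sum(g.size * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

rng = np.random.default_rng(3)
# Hypothetical water levels from three operators, one with a systematic bias.
ops = [rng.normal(100.0, 2.0, 30), rng.normal(100.0, 2.0, 30),
       rng.normal(103.0, 2.0, 30)]
F = one_way_anova(ops)
print(F)  # a large F flags the operator effect
```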
Trend Analysis Using Microcomputers.
ERIC Educational Resources Information Center
Berger, Carl F.
A trend analysis statistical package and additional programs for the Apple microcomputer are presented. They illustrate strategies of data analysis suitable to the graphics and processing capabilities of the microcomputer. The programs analyze data sets using examples of: (1) analysis of variance with multiple linear regression; (2) exponential…
Computer program uses Monte Carlo techniques for statistical system performance analysis
NASA Technical Reports Server (NTRS)
Wohl, D. P.
1967-01-01
Computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon the overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
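The Monte Carlo idea is easy to sketch (a toy system function with invented component tolerances, not the original program): draw each component from its disturbance distribution and accumulate the full statistics of the simulated system performance.

```python
import random

def system_gain(r1, r2):
    # Toy "system performance" function: a voltage-divider ratio.
    return r1 / (r1 + r2)

def monte_carlo(n, seed=0):
    # Sample each component from its tolerance distribution and collect
    # the mean and spread of the resulting system output.
    rng = random.Random(seed)
    out = [system_gain(rng.gauss(1000.0, 10.0), rng.gauss(2000.0, 20.0))
           for _ in range(n)]
    m = sum(out) / n
    sd = (sum((v - m) ** 2 for v in out) / (n - 1)) ** 0.5
    return m, sd

m, sd = monte_carlo(50000)
print(f"gain = {m:.4f} +/- {sd:.4f}")  # nominal gain is 1000/3000 ~ 0.333
```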
SWToolbox: A surface-water tool-box for statistical analysis of streamflow time series
Kiang, Julie E.; Flynn, Kate; Zhai, Tong; Hummel, Paul; Granato, Gregory
2018-03-07
This report is a user guide for the low-flow analysis methods provided with version 1.0 of the Surface Water Toolbox (SWToolbox) computer program. The software combines functionality from two software programs—U.S. Geological Survey (USGS) SWSTAT and U.S. Environmental Protection Agency (EPA) DFLOW. Both of these programs have been used primarily for computation of critical low-flow statistics. The main analysis methods are the computation of hydrologic frequency statistics such as the 7-day minimum flow that occurs on average only once every 10 years (7Q10), computation of design flows including biologically based flows, and computation of flow-duration curves and duration hydrographs. Other annual, monthly, and seasonal statistics can also be computed. The interface facilitates retrieval of streamflow discharge data from the USGS National Water Information System and outputs text reports for a record of the analysis. Tools for graphing data and screening tests are available to assist the analyst in conducting the analysis.
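A much-simplified sketch of the 7Q10-style computation (synthetic flows and a plain log-normal frequency fit, whereas SWToolbox uses more elaborate frequency methods): form the annual series of minimum 7-day moving-average flows, then take the 1-in-T-years non-exceedance quantile.

```python
import numpy as np

def seven_day_minima(daily_flows_by_year):
    # Annual series of the minimum 7-day moving-average flow.
    mins = []
    for q in daily_flows_by_year:
        ma = np.convolve(q, np.ones(7) / 7.0, mode="valid")
        mins.append(ma.min())
    return np.array(mins)

def q7_T(minima, T=10):
    # Simplified 7Q10-style estimate: fit a log-normal distribution to the
    # annual 7-day minima and take the 1/T non-exceedance quantile.
    z = {2: 0.0, 10: -1.2816, 20: -1.6449}[T]   # standard normal quantiles
    logs = np.log(minima)
    return float(np.exp(logs.mean() + z * logs.std(ddof=1)))

# Synthetic record: 30 years of log-normal daily flows.
rng = np.random.default_rng(4)
years = [np.exp(rng.normal(3.0, 0.4, 365)) for _ in range(30)]
minima = seven_day_minima(years)
print(q7_T(minima, T=10))  # flow exceeded in roughly 9 of 10 years
```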
VanTrump, G.; Miesch, A.T.
1977-01-01
RASS is an acronym for Rock Analysis Storage System and STATPAC, for Statistical Package. The RASS and STATPAC computer programs are integrated into the RASS-STATPAC system for the management and statistical reduction of geochemical data. The system, in its present form, has been in use for more than 9 years by scores of U.S. Geological Survey geologists, geochemists, and other scientists engaged in a broad range of geologic and geochemical investigations. The principal advantage of the system is the flexibility afforded the user both in data searches and retrievals and in the manner of statistical treatment of data. The statistical programs provide for most types of statistical reduction normally used in geochemistry and petrology, but also contain bridges to other program systems for statistical processing and automatic plotting. © 1977.
Alterations/corrections to the BRASS Program
NASA Technical Reports Server (NTRS)
Brand, S. N.
1985-01-01
Corrections applied to statistical programs contained in two subroutines of the Bed Rest Analysis Software System (BRASS) are summarized. Two subroutines independently calculate significant values within the BRASS program.
1990-01-01
TECHNICAL REPORT NATICK/TR-90/014: Image Analysis Program for Measuring Particles with the Zeiss CSM 950 Scanning... The image analysis program for measuring particles using the Zeiss CSM 950/Kontron system is as follows: A>CSM calls the image analysis program. Press D to... The list of tables includes: 1. Image Analysis Program for Measuring Spherical Particles; 2. Printout of Statistical Data From Table 1
[A SAS macro program for batch processing of univariate Cox regression analysis for large databases].
Yang, Rendong; Xiong, Jie; Peng, Yangqin; Peng, Xiaoning; Zeng, Xiaomin
2015-02-01
To realize batch processing of univariate Cox regression analysis for large databases with a SAS macro program. We wrote a SAS macro program in SAS 9.2 that can filter and integrate results and export P values to Excel. The program was used to screen survival-correlated RNA molecules in ovarian cancer. The SAS macro program completed the batch processing of univariate Cox regression analyses, along with the selection and export of the results. It has potential applications in reducing the workload of statistical analysis and providing a basis for batch processing of univariate Cox regression analysis.
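The batch-screening workflow this abstract describes can be sketched in Python with the standard library (SAS itself is not shown; the toy data, variable names, and the permutation test standing in for univariate Cox regression are all illustrative assumptions):

```python
import csv
import random
import statistics

random.seed(1)

def permutation_p(values, times, n_perm=500):
    """Two-sided permutation p-value for association between a median-split
    grouping of one variable and the survival times (a simple stand-in for
    the univariate Cox regression performed by the SAS macro)."""
    med = statistics.median(values)
    groups = [v > med for v in values]

    def gap(flags):
        hi = [t for t, f in zip(times, flags) if f]
        lo = [t for t, f in zip(times, flags) if not f]
        if not hi or not lo:
            return 0.0
        return abs(statistics.mean(hi) - statistics.mean(lo))

    observed = gap(groups)
    shuffled = groups[:]
    hits = 0
    for _ in range(n_perm):
        random.shuffle(shuffled)
        if gap(shuffled) >= observed:
            hits += 1
    return hits / n_perm

# Toy data set: 40 survival times and 20 candidate variables.
times = [random.expovariate(1.0) for _ in range(40)]
variables = {f"var{i}": [random.gauss(0, 1) for _ in range(40)] for i in range(20)}

# Batch loop: test every variable, then export the p-values (the abstract's
# program writes to Excel; a CSV opens there directly).
results = sorted(((name, permutation_p(vals, times)) for name, vals in variables.items()),
                 key=lambda r: r[1])
with open("pvalues.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["variable", "p_value"])
    writer.writerows(results)
```

The point is the pattern, not the particular test: any per-variable screening statistic can be dropped into the loop.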
An Overview of R in Health Decision Sciences.
Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam
2017-10-01
As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. A large community of users who have generated an extensive collection of well-documented packages and functions supports it. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.
The statistical analysis of global climate change studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardin, J.W.
1992-01-01
The focus of this work is to contribute to the enhancement of the relationship between climatologists and statisticians. The analysis of global change data has been underway for many years by atmospheric scientists. Much of this analysis includes a heavy reliance on statistics and statistical inference. Some specific climatological analyses are presented and the dependence on statistics is documented before the analysis is undertaken. The first problem presented involves the fluctuation-dissipation theorem and its application to global climate models. This problem has a sound theoretical niche in the literature of both climate modeling and physics, but a statistical analysis in which the data is obtained from the model to show graphically the relationship has not been undertaken. It is under this motivation that the author presents this problem. A second problem concerning the standard errors in estimating global temperatures is purely statistical in nature, although very little material exists for sampling on such a frame. This problem not only has climatological and statistical ramifications, but political ones as well. It is planned to use these results in a further analysis of global warming using actual data collected on the earth. In order to simplify the analysis of these problems, the development of a computer program, MISHA, is presented. This interactive program contains many of the routines, functions, graphics, and map projections needed by the climatologist in order to effectively enter the arena of data visualization.
ERIC Educational Resources Information Center
Santos-Delgado, M. J.; Larrea-Tarruella, L.
2004-01-01
The back-titration methods are compared statistically to establish glycine in a nonaqueous medium of acetic acid. Important variations in the mean values of glycine are observed due to the interaction effects between the analysis of variance (ANOVA) technique and a statistical study through a computer software.
NASA Technical Reports Server (NTRS)
Grosveld, Ferdinand W.; Schiller, Noah H.; Cabell, Randolph H.
2011-01-01
Comet Enflow is a commercially available, high frequency vibroacoustic analysis software founded on Energy Finite Element Analysis (EFEA) and Energy Boundary Element Analysis (EBEA). Energy Finite Element Analysis (EFEA) was validated on a floor-equipped composite cylinder by comparing EFEA vibroacoustic response predictions with Statistical Energy Analysis (SEA) and experimental results. Statistical Energy Analysis (SEA) predictions were made using the commercial software program VA One 2009 from ESI Group. The frequency region of interest for this study covers the one-third octave bands with center frequencies from 100 Hz to 4000 Hz.
ERIC Educational Resources Information Center
Gorman, Dennis M.; Huber, J. Charles, Jr.
2009-01-01
This study explores the possibility that any drug prevention program might be considered "evidence-based" given the use of data analysis procedures that optimize the chance of producing statistically significant results by reanalyzing data from a Drug Abuse Resistance Education (DARE) program evaluation. The analysis produced a number of…
Conducting Simulation Studies in the R Programming Environment.
Hallgren, Kevin A
2013-10-12
Simulation studies allow researchers to answer specific questions about data analysis, statistical power, and best-practices for obtaining accurate results in empirical research. Despite the benefits that simulation research can provide, many researchers are unfamiliar with available tools for conducting their own simulation studies. The use of simulation studies need not be restricted to researchers with advanced skills in statistics and computer programming, and such methods can be implemented by researchers with a variety of abilities and interests. The present paper provides an introduction to methods used for running simulation studies using the R statistical programming environment and is written for individuals with minimal experience running simulation studies or using R. The paper describes the rationale and benefits of using simulations and introduces R functions relevant for many simulation studies. Three examples illustrate different applications for simulation studies, including (a) the use of simulations to answer a novel question about statistical analysis, (b) the use of simulations to estimate statistical power, and (c) the use of simulations to obtain confidence intervals of parameter estimates through bootstrapping. Results and fully annotated syntax from these examples are provided.
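As a flavor of example (b), a power simulation like those in the paper can be sketched in Python (the paper itself uses R; the normal model, effect size, and critical value of 1.96 here are illustrative assumptions):

```python
import math
import random
import statistics

def welch_t(x, y):
    """Welch's t statistic for two independent samples."""
    vx, vy = statistics.variance(x), statistics.variance(y)
    return (statistics.mean(x) - statistics.mean(y)) / math.sqrt(vx / len(x) + vy / len(y))

def estimate_power(n, effect, n_sims=2000, crit=1.96):
    """Fraction of simulated experiments that reject the null at |t| > crit."""
    hits = 0
    for _ in range(n_sims):
        x = [random.gauss(effect, 1.0) for _ in range(n)]  # treatment group
        y = [random.gauss(0.0, 1.0) for _ in range(n)]     # control group
        if abs(welch_t(x, y)) > crit:
            hits += 1
    return hits / n_sims

random.seed(0)
power = estimate_power(n=50, effect=0.5)
print(power)  # roughly 0.7 for this design
```

Changing `n` or `effect` and re-running answers the practical question a power analysis asks: how large a sample is needed to detect a given effect reliably.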
ERIC Educational Resources Information Center
Zaback, Tosha; Becker, Thomas M.; Dignan, Mark B.; Lambert, William E.
2010-01-01
In this article, the authors describe a unique summer program to train American Indian/Alaska Native (AI/AN) health professionals in a variety of health research-related skills, including epidemiology, data management, statistical analysis, program evaluation, cost-benefit analysis, community-based participatory research, grant writing, and…
A new SAS program for behavioral analysis of Electrical Penetration Graph (EPG) data
USDA-ARS?s Scientific Manuscript database
A new program is introduced that uses SAS software to duplicate output of descriptive statistics from the Sarria Excel workbook for EPG waveform analysis. Not only are publishable means and standard errors or deviations output, but the user is also guided through four relatively simple sub-programs for ...
SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (IBM VERSION)
NASA Technical Reports Server (NTRS)
Manteufel, R.
1994-01-01
The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.
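The weighting idea described above — per-statistic counts multiplied by user-supplied weights and summed into a single figure of complexity — can be sketched in a few lines of Python (the statistic names and weights are hypothetical, not SAP's actual weight-file contents):

```python
def figure_of_complexity(counts, weights):
    """Weighted sum of per-module statistics -- the 'figure of complexity'."""
    return sum(counts[name] * weights.get(name, 0.0) for name in counts)

# Hypothetical per-module statistics and user-chosen weights (SAP reads
# these from a statistical weight file; the names here are made up).
counts = {"executable_stmts": 120, "gotos": 4, "comments": 35}
weights = {"executable_stmts": 1.0, "gotos": 10.0, "comments": -0.5}
foc = figure_of_complexity(counts, weights)
print(foc)  # 120*1.0 + 4*10.0 + 35*(-0.5) = 142.5
```

Because the weights live outside the analyzer, the user can redefine "complexity" without touching the counting code — the flexibility the abstract emphasizes.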
SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (DEC VAX VERSION)
NASA Technical Reports Server (NTRS)
Merwarth, P. D.
1994-01-01
The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.
Meta-Analysis in Stata Using Gllamm
ERIC Educational Resources Information Center
Bagos, Pantelis G.
2015-01-01
There are several user-written programs for performing meta-analysis in Stata (Stata Statistical Software: College Station, TX: Stata Corp LP). These include metan, metareg, mvmeta, and glst. However, there are several cases for which these programs do not suffice. For instance, there is no software for performing univariate meta-analysis with…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grant, C W; Lenderman, J S; Gansemer, J D
This document is an update to the 'ADIS Algorithm Evaluation Project Plan' specified in the Statement of Work for the US-VISIT Identity Matching Algorithm Evaluation Program, as deliverable II.D.1. The original plan was delivered in August 2010. This document modifies the plan to reflect changes in the deliverables resulting from delays in obtaining a database refresh, and describes the revised schedule of the program deliverables. The detailed description of the processes used, the statistical analysis processes, and the results of the statistical analysis will be described fully in the program deliverables. The US-VISIT Identity Matching Algorithm Evaluation Program is work performed by Lawrence Livermore National Laboratory (LLNL) under IAA HSHQVT-07-X-00002 P00004 from the Department of Homeland Security (DHS).
WINPEPI updated: computer programs for epidemiologists, and their teaching potential
2011-01-01
Background The WINPEPI computer programs for epidemiologists are designed for use in practice and research in the health field and as learning or teaching aids. The programs are free, and can be downloaded from the Internet. Numerous additions have been made in recent years. Implementation There are now seven WINPEPI programs: DESCRIBE, for use in descriptive epidemiology; COMPARE2, for use in comparisons of two independent groups or samples; PAIRSetc, for use in comparisons of paired and other matched observations; LOGISTIC, for logistic regression analysis; POISSON, for Poisson regression analysis; WHATIS, a "ready reckoner" utility program; and ETCETERA, for miscellaneous other procedures. The programs now contain 122 modules, each of which provides a number, sometimes a large number, of statistical procedures. The programs are accompanied by a Finder that indicates which modules are appropriate for different purposes. The manuals explain the uses, limitations and applicability of the procedures, and furnish formulae and references. Conclusions WINPEPI is a handy resource for a wide variety of statistical routines used by epidemiologists. Because of its ready availability, portability, ease of use, and versatility, WINPEPI has a considerable potential as a learning and teaching aid, both with respect to practical procedures in the planning and analysis of epidemiological studies, and with respect to important epidemiological concepts. It can also be used as an aid in the teaching of general basic statistics. PMID:21288353
J. D. Shaw
2006-01-01
Benefits of a strategic national forest inventory to science and society: the USDA Forest Service Forest Inventory and Analysis program. Forest Inventory and Analysis, previously known as Forest Survey, is one of the oldest research and development programs in the USDA Forest Service. Statistically-based inventory efforts that started in Scandinavian countries in the...
AutoBayes Program Synthesis System Users Manual
NASA Technical Reports Server (NTRS)
Schumann, Johann; Jafari, Hamed; Pressburger, Tom; Denney, Ewen; Buntine, Wray; Fischer, Bernd
2008-01-01
Program synthesis is the systematic, automatic construction of efficient executable code from high-level declarative specifications. AutoBayes is a fully automatic program synthesis system for the statistical data analysis domain; in particular, it solves parameter estimation problems. It has seen many successful applications at NASA and is currently being used, for example, to analyze simulation results for Orion. The input to AutoBayes is a concise description of a data analysis problem composed of a parameterized statistical model and a goal that is a probability term involving parameters and input data. The output is optimized and fully documented C/C++ code computing the values for those parameters that maximize the probability term. AutoBayes can solve many subproblems symbolically rather than having to rely on numeric approximation algorithms, thus yielding effective, efficient, and compact code. Statistical analysis is faster and more reliable, because effort can be focused on model development and validation rather than manual development of solution algorithms and code.
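For intuition about the parameter-estimation problems mentioned, the simplest instance — one whose closed form AutoBayes could derive symbolically rather than approximate numerically — is Gaussian maximum likelihood. A direct sketch (the data are made up; this is not AutoBayes output):

```python
def gaussian_mle(data):
    """Closed-form ML estimates for a Gaussian: the sample mean and the
    (biased, divide-by-n) sample variance maximize the likelihood."""
    n = len(data)
    mu = sum(data) / n
    var = sum((x - mu) ** 2 for x in data) / n
    return mu, var

mu, var = gaussian_mle([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
print(mu, var)  # 5.0 4.0
```

Symbolic solutions like this one are why the generated code can be both fast and exact: no iterative optimizer is needed when the maximizer has a closed form.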
ERIC Educational Resources Information Center
Goshu, Ayele Taye
2016-01-01
This paper describes the experiences gained from the statistical collaboration center established at Hawassa University in May 2015 as part of the LISA 2020 [Laboratory for Interdisciplinary Statistical Analysis] network. The center has a setup similar to that of LISA at Virginia Tech. Statisticians are trained on how to become more effective scientific…
ERIC Educational Resources Information Center
Meletiou-Mavrotheris, Maria
2004-01-01
While technology has become an integral part of introductory statistics courses, the programs typically employed are professional packages designed primarily for data analysis rather than for learning. Findings from several studies suggest that use of such software in the introductory statistics classroom may not be very effective in helping…
Shock and Vibration Symposium (59th) Held in Albuquerque, New Mexico on 18-20 October 1988. Volume 4
1988-12-01
program to support TOPEX spacecraft design, Statistical energy analysis modeling of nonstructural mass on lightweight equipment panels using VAPEPS...and Stress estimation and statistical energy analysis of the Magellan spacecraft solar array using VAPEPS; Dynamic measurement -- An automated
Ronald E. McRoberts; William A. Bechtold; Paul L. Patterson; Charles T. Scott; Gregory A. Reams
2005-01-01
The Forest Inventory and Analysis (FIA) program of the USDA Forest Service has initiated a transition from regional, periodic inventories to an enhanced national FIA program featuring annual measurement of a proportion of plots in each state, greater national consistency, and integration with the ground sampling component of the Forest Health Monitoring (FHM) program...
NASA Technical Reports Server (NTRS)
Keegan, W. B.
1974-01-01
In order to produce cost-effective environmental test programs, the test specifications must be realistic and, to be useful, available early in the life of a program. This paper describes a method for achieving such specifications for subsystems by utilizing the results of a statistical analysis of data acquired at subsystem mounting locations during system-level environmental tests. The paper describes the details of this statistical analysis. The resultant recommended levels are a function of the subsystems' mounting location in the spacecraft, and methods of determining this mounting 'zone' are described. Recommendations are then made as to which of the various problem areas encountered should be pursued further.
FUNSTAT and statistical image representations
NASA Technical Reports Server (NTRS)
Parzen, E.
1983-01-01
General ideas of functional statistical inference analysis of one sample and two samples, univariate and bivariate are outlined. ONESAM program is applied to analyze the univariate probability distributions of multi-spectral image data.
AIDS Education for Tanzanian Youth: A Mediation Analysis
ERIC Educational Resources Information Center
Stigler, Melissa H.; Kugler, K. C.; Komro, K. A.; Leshabari, M. T.; Klepp, K. I.
2006-01-01
Mediation analysis is a statistical technique that can be used to identify mechanisms by which intervention programs achieve their effects. This paper presents the results of a mediation analysis of Ngao, an acquired immunodeficiency syndrome (AIDS) education program that was implemented with school children in Grades 6 and 7 in Tanzania in the…
The Role of Data Analysis Software in Graduate Programs in Education and Post-Graduate Research
ERIC Educational Resources Information Center
Harwell, Michael
2018-01-01
The importance of data analysis software in graduate programs in education and post-graduate educational research is self-evident. However the role of this software in facilitating supererogated statistical practice versus "cookbookery" is unclear. The need to rigorously document the role of data analysis software in students' graduate…
Modular reweighting software for statistical mechanical analysis of biased equilibrium data
NASA Astrophysics Data System (ADS)
Sindhikara, Daniel J.
2012-07-01
Here a simple, useful, modular approach and software suite designed for statistical reweighting and analysis of equilibrium ensembles is presented. Statistical reweighting is useful and sometimes necessary for the analysis of equilibrium enhanced-sampling methods, such as umbrella sampling or replica exchange, and also in experimental cases where biasing factors are explicitly known. Essentially, statistical reweighting allows extrapolation of data from one or more equilibrium ensembles to another. Here, the fundamental separable steps of statistical reweighting are broken up into modules, allowing for application to the general case and avoiding the black-box nature of some "all-inclusive" reweighting programs. Additionally, the programs included are, by design, written with few dependencies. The required compilers are either pre-installed on most systems or freely available for download with minimal trouble. Examples of the use of this suite applied to umbrella sampling and replica exchange molecular dynamics simulations are shown, along with advice on how to apply it in the general case. New version program summary: Program title: Modular reweighting version 2. Catalogue identifier: AEJH_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJH_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License, version 3. No. of lines in distributed program, including test data, etc.: 179 118. No. of bytes in distributed program, including test data, etc.: 8 518 178. Distribution format: tar.gz. Programming language: C++, Python 2.6+, Perl 5+. Computer: Any. Operating system: Any. RAM: 50-500 MB. Supplementary material: An updated version of the original manuscript (Comput. Phys. Commun. 182 (2011) 2227) is available. Classification: 4.13. Catalogue identifier of previous version: AEJH_v1_0. Journal reference of previous version: Comput. Phys. Commun. 182 (2011) 2227. Does the new version supersede the previous version?: Yes. Nature of problem: While equilibrium reweighting is ubiquitous, there are no public programs available to perform the reweighting in the general case. Further, specific programs often suffer from many library dependencies and numerical instability. Solution method: This package is written in a modular format that allows for easy applicability of reweighting in the general case. Modules are small, numerically stable, and require minimal libraries. Reasons for new version: Some minor bugs, some needed upgrades, and added error analysis. analyzeweight.py/analyzeweight.py2 has been replaced by "multihist.py"; this new program performs all the functions of its predecessor while being versatile enough to handle other types of histograms and probability analysis. "bootstrap.py" was added; this script performs basic bootstrap resampling, allowing for error analysis of data. "avg_dev_distribution.py" was added; this program computes the averages and standard deviations of multiple distributions, making error analysis (e.g., from bootstrap resampling) easier to visualize. WRE.cpp was slightly modified, purely for cosmetic reasons. The manual was updated for clarity and to reflect version updates. Examples were removed from the manual in favor of online tutorials (packaged examples remain); examples were updated to reflect the new format, and an additional example is included to demonstrate error analysis. Running time: preprocessing scripts 1-5 minutes, WHAM engine <1 minute, postprocessing script ~1-5 minutes.
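The core reweighting step this abstract describes — extrapolating averages from one equilibrium ensemble to another by weighting each sample with the ratio of the two densities — can be sketched in a toy one-dimensional Gaussian setting (this illustrates the principle only, not the packaged WHAM-style modules):

```python
import math
import random

random.seed(3)

# Samples drawn from the "simulated" ensemble: a standard normal,
# i.e. density proportional to exp(-x**2 / 2).
samples = [random.gauss(0.0, 1.0) for _ in range(200000)]

# Target ensemble shifted by a known bias: density proportional to
# exp(-(x - 1)**2 / 2).  Per-sample weight = target density / sampled density.
def weight(x):
    return math.exp(-(x - 1.0) ** 2 / 2.0 + x ** 2 / 2.0)  # = exp(x - 0.5)

w = [weight(x) for x in samples]

# Reweighted average of the observable A(x) = x under the target ensemble.
mean_target = sum(wi * xi for wi, xi in zip(w, samples)) / sum(w)
print(round(mean_target, 2))  # close to 1.0, the mean of the target ensemble
```

The same weighted-average formula underlies umbrella-sampling analysis; there the weights come from the known biasing potentials rather than from a closed-form density ratio.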
ERIC Educational Resources Information Center
Long, Thomas E.
In this institute, the participants were trained to use peripheral computer related equipment. They were taught Fortran programming skills so they might write and redimension statistical formulary programs, and they were trained to assemble data so they might access computers via both card and punched-tape input. The objectives of the Institute…
New software for statistical analysis of Cambridge Structural Database data
Sykes, Richard A.; McCabe, Patrick; Allen, Frank H.; Battle, Gary M.; Bruno, Ian J.; Wood, Peter A.
2011-01-01
A collection of new software tools is presented for the analysis of geometrical, chemical and crystallographic data from the Cambridge Structural Database (CSD). This software supersedes the program Vista. The new functionality is integrated into the program Mercury in order to provide statistical, charting and plotting options alongside three-dimensional structural visualization and analysis. The integration also permits immediate access to other information about specific CSD entries through the Mercury framework, a common requirement in CSD data analyses. In addition, the new software includes a range of more advanced features focused towards structural analysis such as principal components analysis, cone-angle correction in hydrogen-bond analyses and the ability to deal with topological symmetry that may be exhibited in molecular search fragments. PMID:22477784
Telecommunication market research processing
NASA Astrophysics Data System (ADS)
Dupont, J. F.
1983-06-01
The data processing in two telecommunication market investigations is described. One of the studies concerns the office applications of communication and the other the experiences with a videotex terminal. Statistical factorial analysis was performed on a large mass of data. A comparison between utilization intentions and effective utilization is made. Extensive rewriting of statistical analysis computer programs was required.
Scout trajectory error propagation computer program
NASA Technical Reports Server (NTRS)
Myler, T. R.
1982-01-01
Since 1969, flight experience has been used as the basis for predicting Scout orbital accuracy. The data used for calculating the accuracy consists of errors in the trajectory parameters (altitude, velocity, etc.) at stage burnout as observed on Scout flights. Approximately 50 sets of errors are used in Monte Carlo analysis to generate error statistics in the trajectory parameters. A covariance matrix is formed which may be propagated in time. The mechanization of this process resulted in computer program Scout Trajectory Error Propagation (STEP) and is described herein. Computer program STEP may be used in conjunction with the Statistical Orbital Analysis Routine to generate accuracy in the orbit parameters (apogee, perigee, inclination, etc.) based upon flight experience.
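The covariance-matrix step described above amounts to computing a sample covariance from the observed error vectors. A minimal Python sketch (with made-up two-parameter errors, not actual Scout flight data):

```python
def covariance(samples):
    """Sample covariance matrix (n-1 denominator) of equal-length error vectors."""
    n, dim = len(samples), len(samples[0])
    means = [sum(s[i] for s in samples) / n for i in range(dim)]
    return [[sum((s[i] - means[i]) * (s[j] - means[j]) for s in samples) / (n - 1)
             for j in range(dim)] for i in range(dim)]

# Hypothetical altitude and velocity errors at burnout from four flights.
errors = [(1.0, 0.5), (-1.0, -0.5), (2.0, 1.0), (-2.0, -1.0)]
cov = covariance(errors)
print(cov)  # symmetric 2x2 matrix; diagonal entries are the variances
```

Once formed, such a matrix can be propagated through the trajectory dynamics to express burnout errors as orbit-parameter accuracies, which is what the STEP program automates.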
SYNOPTIC RAINFALL DATA ANALYSIS PROGRAM (SYNOP). RELEASE NO. 1
An integral part of the assessment of storm loads on water quality is the statistical evaluation of rainfall records. Hourly rainfall records of many years duration are cumbersome and difficult to analyze. The purpose of this rainfall data analysis program is to provide the user ...
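One preprocessing step such a rainfall-analysis program performs is grouping hourly depths into storm events separated by a minimum dry period. A toy Python sketch (the 3-hour separation and the record are assumptions for illustration, not the program's actual defaults):

```python
def storms(hourly, min_dry=3):
    """Group hourly rainfall depths into storm events; two events are
    distinct when separated by at least min_dry consecutive dry hours."""
    events, current, dry = [], [], 0
    for depth in hourly:
        if depth > 0:
            current.append(depth)  # extend (or start) the current event
            dry = 0
        elif current:
            dry += 1
            if dry >= min_dry:     # enough dry hours: close the event
                events.append(sum(current))
                current, dry = [], 0
    if current:                    # close any event still open at record end
        events.append(sum(current))
    return events

record = [0, 2, 5, 0, 0, 0, 1, 0, 3, 0, 0, 0, 0, 4]
events = storms(record)
print(events)  # [7, 4, 4]
```

Event totals like these are then summarized statistically (intensities, durations, interevent times) instead of working with the raw hourly record.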
Vermont's use-value appraisal property tax program: a forest inventory and analysis
Paul E. Sendak; Donald F. Dennis; Donald F. Dennis
1989-01-01
A statistical report and analysis of the timberland enrolled in the Vermont Use Value Appraisal (UVA) property tax program. The study was conducted using data collected in the fourth forest survey of Vermont (1983). Estimates are presented on land area, timber volumes, tree quality, numbers of live trees, and biomass for timberland enrolled in the UVA program and for...
Forest Fire History... A Computer Method of Data Analysis
Romain M. Meese
1973-01-01
A series of computer programs is available to extract information from the individual Fire Reports (U.S. Forest Service Form 5100-29). The programs use a statistical technique to fit a continuous distribution to a set of sampled data. The goodness-of-fit program is applicable to data other than the fire history. Data summaries illustrate analysis of fire occurrence,...
2015-06-18
Engineering Effectiveness Survey. CMU/SEI-2012-SR-009. Carnegie Mellon University. November 2012. Field, Andy. Discovering Statistics Using SPSS, 3rd... enough into the survey to begin answering questions on risk practices. All of the statistical data analysis will be performed using SPSS. Prior to... probabilistically using distributions for likelihood and impact. Statistical methods like Monte Carlo can more comprehensively evaluate the cost and
34 CFR Appendix A to Subpart N of... - Sample Default Prevention Plan
Code of Federal Regulations, 2010 CFR
2010-07-01
... relevant default prevention statistics, including a statistical analysis of the borrowers who default on...'s delinquency status by obtaining reports from data managers and FFEL Program lenders. 5. Enhance... academic study. III. Statistics for Measuring Progress 1. The number of students enrolled at your...
Dale D. Gormanson; Scott A. Pugh; Charles J. Barnett; Patrick D. Miles; Randall S. Morin; Paul A. Sowers; James A. Westfall
2018-01-01
The U.S. Forest Service Forest Inventory and Analysis (FIA) program collects sample plot data on all forest ownerships across the United States. FIA's primary objective is to determine the extent, condition, volume, growth, and use of trees on the Nation's forest land through a comprehensive inventory and analysis of the Nation's forest resources. The FIA program...
Two-Bin Kanban: Ordering Impact at Navy Medical Center San Diego
2016-06-17
pretest (2013 data set) and posttest (2015 data set) analysis to avoid having the findings influenced by price changes. DMLSS does not track shipping...statistics based on those observations (Kabacoff, 2011, p. 112). Replacing the groups of observations with summary statistics allows the analyst...listed on the Acquisition Research Program website (www.acquisitionresearch.net). Acquisition Research Program Graduate School of Business & Public
Analysis of Nursing Curriculum and Course Competencies.
ERIC Educational Resources Information Center
Trani, G. M.
The objectives of this study were to relate the competencies of the Nursing Program at Delaware County Community College to national morbidity statistics and to recommend curriculum changes based on this analysis. Existing terminal objectives of the program and each nursing module were compared with college-wide terminal objectives, overlap was…
A Categorization of Dynamic Analyzers
NASA Technical Reports Server (NTRS)
Lujan, Michelle R.
1997-01-01
Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on its syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) metrics; 2) models; and 3) monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques included under them. In addition, the role of each technique in the software life cycle is discussed.
Familiarization with the tools that measure, model, and monitor programs provides a framework for understanding a program's dynamic behavior from different perspectives through analysis of the input/output data.
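The static-analysis statistics mentioned in the abstract (statement counts, cyclomatic complexity) can be gathered for Python source with the standard-library ast module. A toy illustration, not one of the analyzers surveyed:

```python
import ast

SOURCE = """
def classify(x):
    if x > 0:
        return "pos"
    for i in range(3):
        x += i
    return "other"
"""

tree = ast.parse(SOURCE)
# Count statement nodes and branch points in the parsed source.
statements = sum(isinstance(node, ast.stmt) for node in ast.walk(tree))
branches = sum(isinstance(node, (ast.If, ast.For, ast.While)) for node in ast.walk(tree))

# A common approximation: cyclomatic complexity = branch points + 1.
stats = {"statements": statements, "branches": branches, "complexity": branches + 1}
print(stats)  # {'statements': 6, 'branches': 2, 'complexity': 3}
```

A dynamic analyzer, by contrast, would instrument or execute `classify` and report on what actually ran, which is exactly the distinction the abstract draws.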
A Monte Carlo study of Weibull reliability analysis for space shuttle main engine components
NASA Technical Reports Server (NTRS)
Abernethy, K.
1986-01-01
The incorporation of a number of additional capabilities into an existing Weibull analysis computer program and the results of a Monte Carlo computer simulation study to evaluate the usefulness of Weibull methods for samples with a very small number of failures and extensive censoring are discussed. Since the censoring mechanism inherent in the Space Shuttle Main Engine (SSME) data is hard to analyze, it was decided to use a random censoring model, generating censoring times from a uniform probability distribution. Some of the statistical techniques and computer programs used in the SSME Weibull analysis are described. The documented methods were supplemented by adding computer calculations of approximate confidence intervals (using iterative methods) for several parameters of interest. These calculations are based on a likelihood ratio statistic which is asymptotically a chi-squared statistic with one degree of freedom. The assumptions built into the computer simulations are described, along with the simulation program and the techniques used in it. Simulation results are tabulated for various combinations of Weibull shape parameters and numbers of failures in the samples.
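The random censoring model described above is easy to sketch: failure times from a Weibull distribution, censoring times from a uniform distribution, and the smaller of the two recorded (the parameter values below are illustrative assumptions, not SSME estimates):

```python
import random

random.seed(4)
shape, scale = 2.0, 1.0   # assumed Weibull parameters
censor_limit = 1.5        # censoring times ~ Uniform(0, censor_limit)

observations = []
for _ in range(1000):
    t_fail = random.weibullvariate(scale, shape)  # alpha=scale, beta=shape
    t_cens = random.uniform(0.0, censor_limit)
    observed = min(t_fail, t_cens)
    failed = t_fail <= t_cens     # False means the observation is censored
    observations.append((observed, failed))

n_failures = sum(f for _, f in observations)
print(n_failures, "failures out of", len(observations))
```

Repeating this generation step many times and fitting the Weibull parameters to each synthetic sample is the essence of the Monte Carlo study the abstract describes.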
ERIC Educational Resources Information Center
Warne, Russell T.
2016-01-01
Recently, Kim (2016) published a meta-analysis on the effects of enrichment programs for gifted students. She found that these programs produced substantial effects for academic achievement (g = 0.96) and socioemotional outcomes (g = 0.55). However, given current theory and empirical research, these estimates of the benefits of enrichment programs…
ERIC Educational Resources Information Center
Murphy, Philip J.
The paper reports the final evaluation of a program for approximately 143 learning disabled (LD) students (grades 6-12) from six school districts. A number of test instruments were used to evaluate student progress during the program, including the Wide Range Achievement Test (WRAT), the Durrell Analysis of Reading Difficulty, and the…
The microcomputer scientific software series 3: general linear model--analysis of variance.
Harold M. Rauscher
1985-01-01
A BASIC language set of programs, designed for use on microcomputers, is presented. This set of programs will perform the analysis of variance for any statistical model describing either balanced or unbalanced designs. The program computes and displays the degrees of freedom, Type I sum of squares, and the mean square for the overall model, the error, and each factor...
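The BASIC listings themselves are not reproduced in this record. As a rough modern equivalent of the balanced one-factor case, a one-way analysis of variance can be sketched in Python; the group means, sizes, and data below are invented for illustration.

```python
import numpy as np
from scipy.stats import f_oneway

# three treatment groups from a balanced one-factor design (synthetic data)
rng = np.random.default_rng(0)
a = rng.normal(10.0, 2.0, 12)
b = rng.normal(12.0, 2.0, 12)
c = rng.normal(15.0, 2.0, 12)

f_stat, p_value = f_oneway(a, b, c)

# degrees of freedom for the factor and the error term,
# as the program would display them
df_factor = 3 - 1
df_error = 3 * 12 - 3
```

A dedicated package would additionally report Type I sums of squares and handle unbalanced designs, as the abstract describes.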
Environmental Technician Training in the United Kingdom.
ERIC Educational Resources Information Center
Potter, John F.
1985-01-01
Stresses the need for qualified environmental science technicians and for training courses in this area. Provides program information and statistical summarization of a national diploma program for environmental technicians titled "Business and Technician Education Council." Reviews the program areas of environmental analysis and…
40 CFR 610.10 - Program purpose.
Code of Federal Regulations, 2013 CFR
2013-07-01
... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...
40 CFR 610.10 - Program purpose.
Code of Federal Regulations, 2014 CFR
2014-07-01
... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...
40 CFR 610.10 - Program purpose.
Code of Federal Regulations, 2011 CFR
2011-07-01
... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...
40 CFR 610.10 - Program purpose.
Code of Federal Regulations, 2012 CFR
2012-07-01
... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...
Dale D. Gormanson; Scott A. Pugh; Charles J. Barnett; Patrick D. Miles; Randall S. Morin; Paul A. Sowers; Jim Westfall
2017-01-01
The U.S. Forest Service Forest Inventory and Analysis (FIA) program collects sample plot data on all forest ownerships across the United States. FIA's primary objective is to determine the extent, condition, volume, growth, and use of trees on the Nation's forest land through a comprehensive inventory and analysis of the Nation's forest resources. The...
A statistical data analysis and plotting program for cloud microphysics experiments
NASA Technical Reports Server (NTRS)
Jordan, A. J.
1981-01-01
The analysis software developed for atmospheric cloud microphysics experiments conducted in the laboratory as well as aboard a KC-135 aircraft is described. A group of four programs was developed and implemented on a Hewlett Packard 1000 series F minicomputer running under HP's RTE-IVB operating system. The programs control and read data from a MEMODYNE Model 3765-8BV cassette recorder, format the data on the Hewlett Packard disk subsystem, and generate statistical data (mean, variance, standard deviation) and voltage and engineering unit plots on a user-selected plotting device. The programs are written in HP FORTRAN IV and HP Assembly Language, with the graphics software using the HP 1000 graphics package. The supported plotting devices are the HP 2647A graphics terminal, the HP 9872B four-color pen plotter, and the HP 2608A matrix line printer.
Poster - Thur Eve - 54: A software solution for ongoing DVH quality assurance in radiation therapy.
Annis, S-L; Zeng, G; Wu, X; Macpherson, M
2012-07-01
A program has been developed in MATLAB for use in quality assurance of treatment planning in radiation therapy. It analyzes patient DVH files and compiles dose volume data for review, trending, comparison, and analysis. Patient DVH files are exported from the Eclipse treatment planning system and saved according to treatment site and date. Currently, analysis is available for four treatment sites (prostate, prostate bed, lung, and upper GI), with two functions for data reporting and analysis: patient-specific and organ-specific. The patient-specific function loads one patient DVH file and reports the user-specified dose volume data of organs and targets. These data can be compiled to an external file for third-party analysis. The organ-specific function extracts a requested dose volume of an organ from the DVH files of a patient group and reports the statistics over this population. A graphical user interface is used to select clinical sites, functions, and structures, and to input the user's requests. We have implemented this program in planning quality assurance at our center. The program has tracked the dosimetric improvement in GU sites after VMAT was implemented clinically. It has generated dose volume statistics for different groups of patients associated with technique or time range. This program allows reporting and statistical analysis of DVH files. It is an efficient tool for planning quality control in radiation therapy. © 2012 American Association of Physicists in Medicine.
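The MATLAB source is not part of the record. The kind of dose-volume query such a tool performs can be sketched in Python; the DVH shape below and the specific metrics (D50, V20) are chosen purely for illustration.

```python
import numpy as np

def dose_at_volume(dose, volume, v_percent):
    """D(v): minimum dose received by the hottest v_percent of a structure,
    interpolated from a cumulative DVH (volume decreases as dose increases)."""
    # np.interp needs increasing x, so interpolate on the reversed arrays
    return float(np.interp(v_percent, volume[::-1], dose[::-1]))

def volume_at_dose(dose, volume, d):
    """V(d): percent of the structure volume receiving at least dose d."""
    return float(np.interp(d, dose, volume))  # dose axis is increasing

# synthetic cumulative DVH: dose in Gy, volume in percent, smooth fall-off
dose = np.linspace(0.0, 70.0, 71)
volume = 100.0 * np.exp(-((dose / 50.0) ** 4))

d50 = dose_at_volume(dose, volume, 50.0)  # dose covering the hottest 50%
v20 = volume_at_dose(dose, volume, 20.0)  # volume receiving >= 20 Gy
```

Looping such queries over a group of exported DVH files and summarizing the results is essentially what the organ-specific function described above does.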
Development of a Predictive Corrosion Model Using Locality-Specific Corrosion Indices
2017-09-12
3.2.1 Statistical data analysis methods ... 3.2.2 Algorithm development method ... components, and methods were compiled into an executable program that uses mathematical models of materials degradation and statistical calculations
Approaching Career Criminals With An Intelligence Cycle
2015-12-01
including arrest statistics and “arrest statistics have been used as the main barometer of juvenile delinquent activity, (but) many juvenile... Statistical Briefing Book,” 187. 26 guided by theories about the causes of delinquent behavior, but there was no determination if those efforts achieved the...children.”110 However, the most evidence-based comparison of juvenile delinquency reduction programs is the statistical meta-analysis (a systematic
Model-Based Linkage Analysis of a Quantitative Trait.
Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H
2017-01-01
Linkage analysis is a family-based method of analysis to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but it has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint methods (using multiple markers simultaneously). In model-based linkage analysis, the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best-fitting statistical model for the trait.
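The S.A.G.E. programs themselves are not shown in the record. As an illustration of the underlying quantity that a single-marker analysis reports, a two-point LOD score for phase-known meioses can be sketched in Python; the recombinant counts are invented for the example.

```python
import numpy as np

def lod_score(recombinants, total, theta):
    """Two-point LOD score: log10 likelihood ratio of recombination fraction
    theta versus free recombination (theta = 0.5), phase-known meioses."""
    r, n = recombinants, total
    log_l = r * np.log10(theta) + (n - r) * np.log10(1.0 - theta)
    log_l0 = n * np.log10(0.5)
    return log_l - log_l0

# 2 recombinants observed in 20 informative meioses
thetas = np.linspace(0.01, 0.5, 50)
scores = lod_score(2, 20, thetas)
best_theta = thetas[np.argmax(scores)]  # maximizes at r/n = 0.1
max_lod = scores.max()
```

A LOD above 3 is the conventional evidence threshold for linkage; real packages extend this binomial calculation to pedigree likelihoods under a specified genetic model.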
Graduate Programs in Education: Impact on Teachers' Careers
ERIC Educational Resources Information Center
Tucker, Janice; Fushell, Marian
2013-01-01
This paper examined teachers' decisions to pursue graduate programs and their career choices following completion of their studies. Based on document analysis and statistical examination of teacher questionnaire responses, this study determined that teachers choose graduate studies for different reasons, their program choice influences future…
Learning from Commercials: The Influence of TV Advertising on the Voter Political "Agenda."
ERIC Educational Resources Information Center
Shaw, Donald L.; Bowers, Thomas A.
The effects of the television advertisements for Richard Nixon and George McGovern during the 1972 presidential election were tested by a content analysis of television programming and statistical analysis of viewer attitudinal response. Programming content for Nixon developed more general issues and did not especially feature the personality of…
An Educational Program of Mathematical Creativity
ERIC Educational Resources Information Center
Petrovici, Constantin; Havârneanu, Geanina
2015-01-01
In this article we intend to analyze the effectiveness of an educational program of mathematical creativity, designed for learners aged 10 to 12 years, which has been implemented in an urban school of Iasi, Romania. This analysis has both a psycho-educational dimension and a statistical analysis one. The psycho-educational dimension refers to the…
Larry J. Gangi
2006-01-01
The FIREMON Analysis Tools program is designed to let the user perform grouped or ungrouped summary calculations of single measurement plot data, or statistical comparisons of grouped or ungrouped plot data taken at different sampling periods. The program allows the user to create reports and graphs, save and print them, or cut and paste them into a word processor....
Mixed-Integer Nonconvex Quadratic Optimization Relaxations and Performance Analysis
2016-10-11
Analysis of Interior Point Algorithms for Non-Lipschitz and Nonconvex Minimization,” (W. Bian, X. Chen, and Ye), Math Programming, 149 (2015) 301-327...Chen, Ge, Wang, Ye), Math Programming, 143 (1-2) (2014) 371-383. This paper resolved an important open question in cardinality constrained...Statistical Performance, and Algorithmic Theory for Local Solutions,” (H. Liu, T. Yao, R. Li, Y. Ye) manuscript, 2nd revision in Math Programming
Probability and Statistics in Sensor Performance Modeling
2010-12-01
language software program is called Environmental Awareness for Sensor and Emitter Employment. Some important numerical issues in the implementation...3 Statistical analysis for measuring sensor performance...complementary cumulative distribution function cdf cumulative distribution function DST decision-support tool EASEE Environmental Awareness of
Statistical Analysis Tools for Learning in Engineering Laboratories.
ERIC Educational Resources Information Center
Maher, Carolyn A.
1990-01-01
Described are engineering programs that have used automated data acquisition systems to implement data collection and analyze experiments. Applications include a biochemical engineering laboratory, heat transfer performance, engineering materials testing, mechanical system reliability, statistical control laboratory, thermo-fluid laboratory, and a…
Tang, Qi-Yi; Zhang, Chuan-Xi
2013-04-01
A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.
Yu, Xiaojin; Liu, Pei; Min, Jie; Chen, Qiguang
2009-01-01
To explore the application of regression on order statistics (ROS) in estimating nondetects for food exposure assessment, ROS was applied to a cadmium residue data set from global food contaminant monitoring; the mean residue was estimated using SAS programming and compared with results from substitution methods. The results show that the ROS method clearly outperforms substitution methods, being robust and convenient for posterior analysis. Regression on order statistics is worth adopting, but more effort should be devoted to the details of applying this method.
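The original SAS code is not included in the record. A simplified ROS sketch in Python, assuming a single detection limit and roughly lognormal data (both assumptions for illustration), shows the imputation idea: regress the log of the detected values on normal quantiles of their plotting positions, then predict the censored values from the fitted line.

```python
import numpy as np
from scipy import stats

def ros_mean(values, censored):
    """Regression on order statistics (ROS) for left-censored data.
    Simplified sketch: single detection limit, lognormal assumption."""
    n = len(values)
    order = np.argsort(values)
    values, censored = values[order], censored[order]
    # plotting positions i/(n+1) for the full ordered sample
    pp = np.arange(1, n + 1) / (n + 1)
    q = stats.norm.ppf(pp)
    det = ~censored
    # fit log(concentration) ~ normal score using detects only
    slope, intercept, *_ = stats.linregress(q[det], np.log(values[det]))
    # impute nondetects from the fitted line at their plotting positions
    imputed = np.where(det, values, np.exp(intercept + slope * q))
    return imputed.mean()

# synthetic lognormal residues censored below a detection limit of 0.5
rng = np.random.default_rng(1)
x = rng.lognormal(0.0, 1.0, 200)
dl = 0.5
censored = x < dl
obs = np.where(censored, dl, x)

est = ros_mean(obs, censored)  # compare with substituting dl or dl/2
```

Unlike substitution at the detection limit (or half of it), the imputed values retain the shape of the lower tail, which is what makes ROS more robust for subsequent analysis.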
Ahmed, K S
1979-01-01
In Bangladesh the Population Control and Family Planning Division of the Ministry of Health and Population Control has decided to delegate increased financial and administrative powers to the officers of the family planning program at the district level and below. Currently, about 20,000 family planning workers and officials are at work in rural areas. The government believes that the success of the entire family planning program depends on the performance of workers in rural areas, because that is where about 90% of the population lives. Awareness of the need to improve statistical data in Bangladesh has been increasing, particularly in regard to the development of rural areas. An accurate statistical profile of rural Bangladesh is crucial to the formation, implementation and evaluation of rural development programs. A Seminar on Statistics for Rural Development will be held from June 18-20, 1980. The primary objectives of the Seminar are to make an exhaustive analysis of the current availability of statistics required for rural development programs and to consider methodological and operational improvements toward building up an adequate data base.
Senior Computational Scientist | Center for Cancer Research
The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP), Basic Science Program, HLA Immunogenetics Section, under the leadership of Dr. Mary Carrington, studies the influence of human leukocyte antigens (HLA) and specific KIR/HLA genotypes on risk of and outcomes to infection, cancer, autoimmune disease, and maternal-fetal disease. Recent studies have focused on the impact of HLA gene expression in disease, the molecular mechanism regulating expression levels, and the functional basis for the effect of differential expression on disease outcome. The lab’s further focus is on the genetic basis for resistance/susceptibility to disease conferred by immunogenetic variation. KEY ROLES/RESPONSIBILITIES The Senior Computational Scientist will provide research support to the CIP-BSP-HLA Immunogenetics Section, performing biostatistical design, analysis, and reporting of research projects conducted in the lab. This individual will be involved in the implementation of statistical models and data preparation. The successful candidate should have 5 or more years of competent, innovative biostatistics/bioinformatics research experience beyond doctoral training; considerable experience with statistical software, such as SAS, R, and S-Plus; and sound knowledge and demonstrated experience of theoretical and applied statistics. Responsibilities include writing program code to analyze data using statistical analysis software and contributing to the interpretation and publication of research results.
Granato, Gregory E.
2009-01-01
Streamflow information is important for many planning and design activities including water-supply analysis, habitat protection, bridge and culvert design, calibration of surface and ground-water models, and water-quality assessments. Streamflow information is especially critical for water-quality assessments (Warn and Brew, 1980; Di Toro, 1984; Driscoll and others, 1989; Driscoll and others, 1990a,b). Calculation of streamflow statistics for receiving waters is necessary to estimate the potential effects of point sources such as wastewater-treatment plants and nonpoint sources such as highway and urban-runoff discharges on receiving water. Streamflow statistics indicate the amount of flow that may be available for dilution and transport of contaminants (U.S. Environmental Protection Agency, 1986; Driscoll and others, 1990a,b). Streamflow statistics also may be used to indicate receiving-water quality because concentrations of water-quality constituents commonly vary naturally with streamflow. For example, concentrations of suspended sediment and sediment-associated constituents (such as nutrients, trace elements, and many organic compounds) commonly increase with increasing flows, and concentrations of many dissolved constituents commonly decrease with increasing flows in streams and rivers (O'Connor, 1976; Glysson, 1987; Vogel and others, 2003, 2005). Reliable, efficient, and repeatable methods are needed to access and process streamflow information and data. For example, the Nation's highway infrastructure includes innumerable stream crossings and stormwater-outfall points for which estimates of stream-discharge statistics may be needed. The U.S. 
Geological Survey (USGS) streamflow data-collection program is designed to provide streamflow data at gaged sites and to provide information that can be used to estimate streamflows at almost any point along any stream in the United States (Benson and Carter, 1973; Wahl and others, 1995; National Research Council, 2004). The USGS maintains the National Water Information System (NWIS), a distributed network of computers and file servers used to store and retrieve hydrologic data (Mathey, 1998; U.S. Geological Survey, 2008). NWISWeb is an online version of this database that includes water data from more than 24,000 streamflow-gaging stations throughout the United States (U.S. Geological Survey, 2002, 2008). Information from NWISWeb is commonly used to characterize streamflows at gaged sites and to help predict streamflows at ungaged sites. Five computer programs were developed for obtaining and analyzing streamflow from the National Water Information System (NWISWeb). The programs were developed as part of a study by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, to develop a stochastic empirical loading and dilution model. The programs were developed because reliable, efficient, and repeatable methods are needed to access and process streamflow information and data. The first program is designed to facilitate the downloading and reformatting of NWISWeb streamflow data. The second program is designed to facilitate graphical analysis of streamflow data. The third program is designed to facilitate streamflow-record extension and augmentation to help develop long-term statistical estimates for sites with limited data. The fourth program is designed to facilitate statistical analysis of streamflow data. The fifth program is a preprocessor to create batch input files for the U.S. Environmental Protection Agency DFLOW3 program for calculating low-flow statistics. 
These computer programs were developed to facilitate the analysis of daily mean streamflow data for planning-level water-quality analyses but also are useful for many other applications pertaining to streamflow data and statistics. These programs and the associated documentation are included on the CD-ROM accompanying this report. This report and the appendixes on the
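The five USGS programs are distributed on the report's CD-ROM and are not reproduced here. The flavor of the statistical computations they automate — flow-duration percentiles and an n-day low-flow index of the kind DFLOW3 consumes — can be sketched in Python on synthetic daily flows (all numbers below are invented):

```python
import numpy as np

def flow_duration(daily_q, exceedance_percents):
    """Flow-duration statistics: discharge equaled or exceeded the given
    percent of the time (e.g., Q95 is a common low-flow index)."""
    return np.percentile(daily_q, 100.0 - np.asarray(exceedance_percents))

def lowest_n_day_mean(daily_q, n=7):
    """Lowest n-day mean flow in the record (the '7Q' in 7Q10 statistics)."""
    rolling = np.convolve(daily_q, np.ones(n) / n, mode="valid")
    return rolling.min()

# ten years of synthetic daily flows (streamflow is roughly lognormal)
rng = np.random.default_rng(3)
q = rng.lognormal(3.0, 0.8, 3650)

q50, q95 = flow_duration(q, [50, 95])
low7 = lowest_n_day_mean(q, 7)
```

A full low-flow statistic such as 7Q10 would additionally fit a frequency distribution to the annual minima; the sketch stops at the per-record building blocks.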
Systems Analysis of NASA Aviation Safety Program: Final Report
NASA Technical Reports Server (NTRS)
Jones, Sharon M.; Reveley, Mary S.; Withrow, Colleen A.; Evans, Joni K.; Barr, Lawrence; Leone, Karen
2013-01-01
A three-month study (February to April 2010) of the NASA Aviation Safety (AvSafe) program was conducted. This study comprised three components: (1) a statistical analysis of currently available civilian subsonic aircraft data from the National Transportation Safety Board (NTSB), the Federal Aviation Administration (FAA), and the Aviation Safety Information Analysis and Sharing (ASIAS) system to identify any significant or overlooked aviation safety issues; (2) a high-level qualitative identification of future safety risks, with an assessment of the potential impact of the NASA AvSafe research on the National Airspace System (NAS) based on these risks; and (3) a detailed, top-down analysis of the NASA AvSafe program using an established and peer-reviewed systems analysis methodology. The statistical analysis identified the top aviation "tall poles" based on NTSB accident and FAA incident data from 1997 to 2006. A separate examination of medical helicopter accidents in the United States was also conducted. Multiple external sources were used to develop a compilation of ten "tall poles" in future safety issues/risks. The top-down analysis of the AvSafe program was conducted using a modification of the Gibson methodology. Of the 17 challenging safety issues that were identified, 11 were directly addressed by the AvSafe program research portfolio.
How Do Microfinance Programs Contribute to Poverty Reduction
2016-09-01
areas have experienced statistically higher incidents of crime tied to class conflict.90 Land tax systems under the British were also responsible for...countries.173 This low delinquency rate is credited to the lack of alternative opportunities that are available to the poor.174 According to Muhammad...TOTAL: 909 54.6 60.2 55 Figure 2. Program Duration and Objective Poverty.197 The statistical analysis conducted by Chowdhury, Gosh and Wright finds
1992-01-09
Crystal Polymers, Tracy Reed, Geophysics Laboratory (GEO) ... Analysis of Model Output Statistics Thunderstorm Prediction Model, Frank Lasley ... four hours to twenty-four hours. It was predicted that the dogbones would turn brown once they reached the approximate annealing temperature. This was ... LYS, Hanscom AFB, Frank A. Lasley. Abstract: Model Output Statistics (MOS) thunderstorm prediction information and Service A weather observations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goltz, G.; Kaiser, L.M.; Weiner, H.
A major mission of the U.S. Coast Guard is the task of providing and maintaining Maritime Aids to Navigation. These aids are located on and near the coastline and inland waters of the United States and its possessions. A computer program, Design Synthesis and Performance Analysis (DSPA), has been developed by the Jet Propulsion Laboratory to demonstrate the feasibility of low-cost solar array/battery power systems for use on flashing lamp buoys. To provide detailed, realistic temperature, wind, and solar insolation data for analysis of the flashing lamp buoy power systems, the two DSPA support computer program sets, MERGE and STAT, were developed. A general description of these two packages is presented in this program summary report. The MERGE program set will enable the Coast Guard to combine temperature and wind velocity data (NOAA TDF-14 tapes) with solar insolation data (NOAA DECK-280 tapes) onto a single sequential MERGE file containing up to 12 years of hourly observations. This MERGE file can then be used as direct input to the DSPA program. The STAT program set will enable a statistical analysis to be performed on the MERGE data and produce high, low, or mean profiles of the data and/or do a worst-case analysis. The STAT output file consists of a one-year set of hourly statistical weather data which can be used as input to the DSPA program.
An exploratory investigation of weight estimation techniques for hypersonic flight vehicles
NASA Technical Reports Server (NTRS)
Cook, E. L.
1981-01-01
The three basic methods of weight prediction (fixed-fraction, statistical correlation, and point stress analysis) and some of the computer programs that have been developed to implement them are discussed. A modified version of the WAATS (Weights Analysis of Advanced Transportation Systems) program is presented, along with input data forms and an example problem.
Remote Sensing/gis Integration for Site Planning and Resource Management
NASA Technical Reports Server (NTRS)
Fellows, J. D.
1982-01-01
The development of an interactive/batch gridded information system (an array of cells georeferenced to USGS quad sheets) and interfacing application programs (e.g., hydrologic models) is discussed. This system allows non-programmer users to request any data set(s) stored in the data base by inputting any random polygon's (watershed, political zone) boundary points. The data base information contained within this polygon can be used to produce maps, statistics, and model parameters for the area. Present and proposed conditions for the area may be compared by inputting future usage (land cover, soils, slope, etc.). This system, known as the Hydrologic Analysis Program (HAP), is especially effective in the real-time analysis of proposed land cover changes on runoff hydrographs and in graphics/statistics resource inventories of random study areas/watersheds.
Lu, Yan; He, Tian
2014-09-15
Much attention has recently been paid to ex-post assessments of socioeconomic and environmental benefits of payment for ecosystem services (PES) programs on poverty reduction, water quality, and forest protection. To evaluate the effects of a regional PES program on water quality, we selected chemical oxygen demand (COD) and ammonia-nitrogen (NH3-N) as indicators of water quality. Statistical methods and an intervention analysis model were employed to assess whether the PES program produced substantial changes in water quality at 10 water-quality sampling stations in the Shaying River watershed, China during 2006-2011. Statistical results from paired-sample t-tests and box plots of COD and NH3-N concentrations at the 10 stations showed that the PES program has played a positive role in improving water quality and reducing trans-boundary water pollution in the Shaying River watershed. Using the intervention analysis model, we quantitatively evaluated the effects of the intervention policy, i.e., the watershed PES program, on water quality at the 10 stations. The results suggest that this method could be used to assess the environmental benefits of watershed or water-related PES programs, such as improvements in water quality, seasonal flow regulation, erosion and sedimentation, and aquatic habitat. Copyright © 2014 Elsevier B.V. All rights reserved.
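The study's data are not reproduced in the record. The paired-sample t-test it applies can be sketched in Python on invented before/after station means (all concentrations below are hypothetical, not the Shaying River data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# hypothetical mean COD concentrations (mg/L) at 10 stations,
# before and after a PES program takes effect
before = rng.normal(32.0, 4.0, 10)
after = before - rng.normal(6.0, 2.0, 10)  # built-in simulated improvement

# paired test: each station serves as its own control
t_stat, p_value = stats.ttest_rel(before, after)
mean_reduction = (before - after).mean()
```

Pairing by station removes between-station variability from the comparison, which is why the paired test is appropriate for before/after monitoring designs like this one.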
Anima: Modular Workflow System for Comprehensive Image Data Analysis
Rantanen, Ville; Valori, Miko; Hautaniemi, Sampsa
2014-01-01
Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps starting from data import and pre-processing to segmentation and statistical analysis, and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis development, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies focusing on testing different algorithms developed in different imaging platforms and an automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is fully open source and available with documentation at www.anduril.org/anima. PMID:25126541
A Comparative Analysis of the Minuteman Education Programs as Currently Offered at Six SAC Bases.
1980-06-01
Principles of Marketing 3 Business Statistics 3 Business Law 3 Management Total... Principles of Marketing 3 Mathematics Methods I Total prerequisite hours 26 Required Graduate Courses Policy Formulation and Administration 3 Management...Business and Economic Statistics 3 Intermediate Business and Economic Statistics 3 Principles of Management 3 Corporation Finance 3 Principles of Marketing
An Environmental Decision Support System for Spatial Assessment and Selective Remediation
Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates environmental assessment tools for effective problem-solving. The software integrates modules for GIS, visualization, geospatial analysis, statistical analysis, human health and ecolog...
ERIC Educational Resources Information Center
Bellafiore, Margaret
2012-01-01
Soldiers are returning from war to college. The number of veterans enrolled nationally is hard to find. Data from the National Center for Veterans Analysis and Statistics identify nearly 924,000 veterans as "total education program beneficiaries" for 2011. These statistics combine many categories, including dependents and survivors. The…
System analysis for the Huntsville Operational Support Center distributed computer system
NASA Technical Reports Server (NTRS)
Ingels, E. M.
1983-01-01
A simulation model was developed and programmed in three languages: BASIC, PASCAL, and SLAM. Two of the programs, the BASIC and PASCAL versions, are included in this report; SLAM is not supported by NASA/MSFC facilities and hence was not included. The statistical comparisons of simulations of the same HOSC system configurations are in good agreement with each other and with the operational statistics of HOSC that were obtained. Three variations of the most recent HOSC configuration were run, and some conclusions were drawn as to system performance under these variations.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to conduct a study that assesses process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities during their academic training that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. 
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
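The abstract does not disclose the simulation's internals, but the kind of Monte Carlo baseline it describes can be sketched under stated assumptions: hypothetical driver weights, means, and standard deviations (all values below are illustrative, not the study's), drawing satisfaction drivers from normal distributions and summarizing the index with a mean and percentile interval.

```python
import random
import statistics

def simulate_index(weights, driver_means, driver_sds, n_runs=10000, seed=42):
    """Monte Carlo baseline for a satisfaction-style index: each run draws
    driver scores from normal distributions (clipped to a 0-100 scale) and
    combines them into a weighted index. All parameters are hypothetical."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n_runs):
        index = sum(w * min(100.0, max(0.0, rng.gauss(m, s)))
                    for w, m, s in zip(weights, driver_means, driver_sds))
        scores.append(index)
    scores.sort()
    return {"mean": statistics.mean(scores),
            "p05": scores[int(0.05 * n_runs)],   # lower 5th percentile
            "p95": scores[int(0.95 * n_runs)]}   # upper 95th percentile

baseline = simulate_index(weights=[0.4, 0.35, 0.25],
                          driver_means=[78, 74, 81],
                          driver_sds=[6, 8, 5])
```

The percentile band gives the kind of predicted-outcome range a dashboard could display alongside the point estimate.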
NASA Technical Reports Server (NTRS)
Shipman, D. L.
1972-01-01
The development of a model to simulate the information system of a program management type of organization is reported. The model statistically determines the following parameters: type of messages, destinations, delivery durations, type of processing, processing durations, communication channels, outgoing messages, and priorities. The total management information system of the program management organization is considered, including formal and informal information flows and both facilities and equipment. The model is written in General Purpose System Simulation 2 computer programming language for use on the Univac 1108, Executive 8 computer. The model is simulated on a daily basis and collects queue and resource utilization statistics for each decision point. The statistics are then used by management to evaluate proposed resource allocations, to evaluate proposed changes to the system, and to identify potential problem areas. The model employs both empirical and theoretical distributions which are adjusted to simulate the information flow being studied.
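The queue and utilization statistics such a model collects can be illustrated with a minimal single-server discrete-event sketch. This is not the GPSS model itself; it assumes exponential interarrival and processing times and illustrative rates, standing in for one message-processing decision point.

```python
import random

def simulate_message_point(arrival_rate, service_rate, horizon, seed=1):
    """Single-server queue sketch: messages arrive with exponential
    interarrival times and receive exponential processing times. Collects
    server utilization and mean waiting time, the kind of statistics a
    GPSS decision point would report. Parameters are illustrative."""
    rng = random.Random(seed)
    next_arrival = rng.expovariate(arrival_rate)
    server_free_at = 0.0
    busy_time = 0.0
    waits = []
    while next_arrival < horizon:
        t = next_arrival
        start = max(t, server_free_at)           # wait if the server is busy
        service = rng.expovariate(service_rate)
        waits.append(start - t)
        busy_time += service
        server_free_at = start + service
        next_arrival += rng.expovariate(arrival_rate)
    utilization = min(1.0, busy_time / horizon)
    return utilization, sum(waits) / len(waits)

util, mean_wait = simulate_message_point(arrival_rate=0.8, service_rate=1.0,
                                         horizon=10000)
```

With these rates the traffic intensity is 0.8, so the observed utilization should hover near that value over a long run.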
A Mokken scale analysis of the peer physical examination questionnaire.
Vaughan, Brett; Grace, Sandra
2018-01-01
Peer physical examination (PPE) is a teaching and learning strategy utilised in most health profession education programs. Perceptions of participating in PPE have been described in the literature, focusing on areas of the body students are willing, or unwilling, to examine. A small number of questionnaires exist to evaluate these perceptions; however, none has described the measurement properties that may allow them to be used longitudinally. The present study undertook a Mokken scale analysis of the Peer Physical Examination Questionnaire (PPEQ) to evaluate its dimensionality and structure when used with Australian osteopathy students. Students enrolled in Year 1 of the osteopathy programs at Victoria University (Melbourne, Australia) and Southern Cross University (Lismore, Australia) were invited to complete the PPEQ prior to their first practical skills examination class. R, an open-source statistics program, was used to generate the descriptive statistics and perform a Mokken scale analysis. Mokken scale analysis is a non-parametric item response theory approach that is used to cluster items measuring a latent construct. Initial analysis suggested the PPEQ did not form a single scale. Further analysis identified three subscales: 'comfort', 'concern', and 'professionalism and education'. The properties of each subscale suggested they were unidimensional with variable internal structures. The 'comfort' subscale was the strongest of the three identified. All subscales demonstrated acceptable reliability estimation statistics (McDonald's omega > 0.75) supporting the calculation of a sum score for each subscale. The subscales identified are consistent with the literature. The 'comfort' subscale may be useful to longitudinally evaluate student perceptions of PPE. Further research is required to evaluate changes with PPE and the utility of the questionnaire with other health profession education programs.
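The building block of Mokken scaling is Loevinger's scalability coefficient for an item pair. As a rough sketch (for dichotomous items only; the study's analysis in R handles the full polytomous case), the pairwise H compares observed Guttman errors to those expected under independence:

```python
def loevinger_h_pair(responses):
    """Loevinger's H for one pair of dichotomous items:
    H = 1 - (observed Guttman errors) / (errors expected under independence).
    `responses` is a list of (item1, item2) scores in {0, 1}."""
    n = len(responses)
    p1 = sum(a for a, _ in responses) / n
    p2 = sum(b for _, b in responses) / n
    # A Guttman error endorses the harder (less popular) item while
    # rejecting the easier one.
    if p1 >= p2:
        errors = sum(1 for a, b in responses if b == 1 and a == 0)
        expected = n * p2 * (1 - p1)
    else:
        errors = sum(1 for a, b in responses if a == 1 and b == 0)
        expected = n * p1 * (1 - p2)
    return 1 - errors / expected

# A perfect Guttman pattern (nobody passes the hard item while failing
# the easy one) yields H = 1.
perfect = [(1, 1), (1, 0), (1, 0), (0, 0)]
```

Items whose pairwise coefficients stay above a chosen threshold (commonly 0.3) are clustered into a scale.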
Powerlaw: a Python package for analysis of heavy-tailed distributions.
Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar
2014-01-01
Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
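The core calculation such fitting tools rest on can be sketched without the package itself. Assuming the standard continuous maximum-likelihood estimator from the power-law fitting literature (this is a stand-in, not the powerlaw package's API):

```python
import math
import random

def fit_alpha(data, xmin):
    """Continuous MLE of the power-law exponent over the tail x >= xmin:
    alpha_hat = 1 + n / sum(ln(x / xmin))."""
    tail = [x for x in data if x >= xmin]
    return 1 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Draw from a pure power law with alpha = 2.5 via inverse-CDF sampling,
# then check that the estimator recovers the exponent.
rng = random.Random(0)
alpha, xmin = 2.5, 1.0
sample = [xmin * (1 - rng.random()) ** (-1 / (alpha - 1)) for _ in range(50000)]
alpha_hat = fit_alpha(sample, xmin)
```

A full analysis, as the package emphasizes, also requires estimating xmin and comparing candidate distributions rather than fitting the exponent alone.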
Human Systems Engineering and Program Success - A Retrospective Content Analysis
2016-01-01
collected from the 546 documents and entered into SPSS Statistics Version 22.0 for Windows. HSI words within the sampled documents ranged from zero to...engineers. The approach used a retrospective content analysis of documents from weapon systems acquisition programs, namely Major Defense Acquisition...January 2016, Vol. 23 No. 1: 78-101. The interaction between humans and the systems they use affects program success, as well as life-cycle
ERIC Educational Resources Information Center
Bitler, Marianne; Domina, Thurston; Penner, Emily; Hoynes, Hilary
2015-01-01
We use quantile treatment effects estimation to examine the consequences of the random-assignment New York City School Choice Scholarship Program across the distribution of student achievement. Our analyses suggest that the program had negligible and statistically insignificant effects across the skill distribution. In addition to contributing to…
ERIC Educational Resources Information Center
Blackwell, Cindy DeRuiter; Bilics, Andrea
2018-01-01
Directors of entry-level occupational therapy (OT) programs were surveyed regarding how their programs prepare students to become mental health practitioners in schools. Analysis of quantitative data included descriptive statistics to examine participants' ratings of their program's ability to prepare students for mental health practice. We found…
ERIC Educational Resources Information Center
Oderman, Dale
2003-01-01
Part Two B of a three-part study examined how 40 universities with baccalaureate programs in aviation management include ethics education in the curricula. Analysis of responses suggests that there is strong support for ethics instruction and that active department head involvement leads to higher levels of planned ethics inclusion. (JOW)
Longitudinal Analysis of Student Performance in a Dental Hygiene Distance Education Program.
ERIC Educational Resources Information Center
Olmsted, Jodi L.
2002-01-01
Examined over the course of five years whether learners who receive face-to-face instruction in a dental hygiene program performed statistically better on established benchmark assessments than learners at a distance. Found no significant differences. (EV)
Johnson Space Center's Risk and Reliability Analysis Group 2008 Annual Report
NASA Technical Reports Server (NTRS)
Valentine, Mark; Boyer, Roger; Cross, Bob; Hamlin, Teri; Roelant, Henk; Stewart, Mike; Bigler, Mark; Winter, Scott; Reistle, Bruce; Heydorn, Dick
2009-01-01
The Johnson Space Center (JSC) Safety & Mission Assurance (S&MA) Directorate's Risk and Reliability Analysis Group provides both mathematical and engineering analysis expertise in the areas of Probabilistic Risk Assessment (PRA), Reliability and Maintainability (R&M) analysis, and data collection and analysis. The fundamental goal of this group is to provide National Aeronautics and Space Administration (NASA) decisionmakers with the necessary information to make informed decisions when evaluating personnel, flight hardware, and public safety concerns associated with current operating systems as well as with any future systems. The Analysis Group includes a staff of statistical and reliability experts with valuable backgrounds in the statistical, reliability, and engineering fields. This group includes JSC S&MA Analysis Branch personnel as well as S&MA support services contractors, such as Science Applications International Corporation (SAIC) and SoHaR. The Analysis Group's experience base includes nuclear power (both commercial and navy), manufacturing, Department of Defense, chemical, and shipping industries, as well as significant aerospace experience specifically in the Shuttle, International Space Station (ISS), and Constellation Programs. The Analysis Group partners with project and program offices, other NASA centers, NASA contractors, and universities to provide additional resources or information to the group when performing various analysis tasks. The JSC S&MA Analysis Group is recognized as a leader in risk and reliability analysis within the NASA community. Therefore, the Analysis Group is in high demand to help the Space Shuttle Program (SSP) continue to fly safely, assist in designing the next generation spacecraft for the Constellation Program (CxP), and promote advanced analytical techniques.
The Analysis Section's tasks include teaching classes and instituting personnel qualification processes to enhance the professional abilities of our analysts as well as performing major probabilistic assessments used to support flight rationale and help establish program requirements. During 2008, the Analysis Group performed more than 70 assessments. Although all these assessments were important, some were instrumental in the decisionmaking processes for the Shuttle and Constellation Programs. Two of the more significant tasks were the Space Transportation System (STS)-122 Low Level Cutoff PRA for the SSP and the Orion Pad Abort One (PA-1) PRA for the CxP. These two activities, along with the numerous other tasks the Analysis Group performed in 2008, are summarized in this report. This report also highlights several ongoing and upcoming efforts to provide crucial statistical and probabilistic assessments, such as the Extravehicular Activity (EVA) PRA for the Hubble Space Telescope service mission and the first fully integrated PRAs for the CxP's Lunar Sortie and ISS missions.
2011-07-01
joined the project team in the statistical and research coordination role. Dr. Collin is an employee at the University of Pittsburgh. A successful...3. Submit to Ft. Detrick Completed Milestone: Statistical analysis planning 1. Review planned data metrics and data gathering tools...approach to performance assessment for continuous quality improvement. Analyzing data with modern statistical techniques to determine the
Economic effectiveness of disease management programs: a meta-analysis.
Krause, David S
2005-04-01
The economic effectiveness of disease management programs, which are designed to improve the clinical and economic outcomes for chronically ill individuals, has been evaluated extensively. A literature search was performed with MEDLINE and other published sources for the period covering January 1995 to September 2003. The search was limited to empirical articles that measured the direct economic outcomes for asthma, diabetes, and heart disease management programs. Of the 360 articles and presentations evaluated, only 67 met the selection criteria for meta-analysis, which included 32,041 subjects. Although some studies contained multiple measurements of direct economic outcomes, only one average effect size per study was included in the meta-analysis. Based on the studies included in the research, a meta-analysis provided a statistically significant answer to the question of whether disease management programs are economically effective. The magnitude of the observed average effect size for equally weighted studies was 0.311 (95% CI = 0.272-0.350). Statistically significant differences of effect sizes by study design, disease type and intensity of disease management program interventions were not found after a moderating variable, disease severity, was taken into consideration. The results suggest that disease management programs are more effective economically with severely ill enrollees and that chronic disease program interventions are most effective when coordinated with the overall level of disease severity. The findings can be generalized, which may assist health care policy makers and practitioners in addressing the issue of providing economically effective care for the growing number of individuals with chronic illness.
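The summary effect size and confidence interval reported above come from standard meta-analytic pooling. As a minimal sketch (a basic fixed-effect, inverse-variance calculation with invented study values, not the paper's weighting scheme or data):

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance-weighted summary effect with a 95% confidence
    interval: each study is weighted by the reciprocal of its sampling
    variance, and the summary standard error is sqrt(1 / total weight)."""
    weights = [1 / v for v in variances]
    total = sum(weights)
    summary = sum(w * e for w, e in zip(weights, effects)) / total
    se = math.sqrt(1 / total)
    return summary, (summary - 1.96 * se, summary + 1.96 * se)

# Three hypothetical studies: effect sizes with their sampling variances.
summary, ci = fixed_effect_meta([0.30, 0.35, 0.25], [0.01, 0.02, 0.01])
```

Moderator analyses like the disease-severity comparison in the study then test whether effect sizes differ systematically across subgroups of studies.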
Gregory A. Reams; Ronald E. McRoberts; Paul C. van Deusen [Editors]
2001-01-01
Documents progress in developing techniques in remote sensing, statistics, information management, and analysis required for full implementation of the national Forest Inventory and Analysis program's annual forest inventory system.
Comparative Research of Navy Voluntary Education at Operational Commands
2017-03-01
return on investment, ROI, logistic regression, multivariate analysis, descriptive statistics, Markov, time-series, linear programming
Mapping the global health employment market: an analysis of global health jobs.
Keralis, Jessica M; Riggin-Pathak, Brianne L; Majeski, Theresa; Pathak, Bogdan A; Foggia, Janine; Cullinen, Kathleen M; Rajagopal, Abbhirami; West, Heidi S
2018-02-27
The number of university global health training programs has grown in recent years. However, there is little research on the needs of the global health profession. We therefore set out to characterize the global health employment market by analyzing global health job vacancies. We collected data from advertised, paid positions posted to web-based job boards, email listservs, and global health organization websites from November 2015 to May 2016. Data on requirements for education, language proficiency, technical expertise, physical location, and experience level were analyzed for all vacancies. Descriptive statistics were calculated for the aforementioned job characteristics. Associations between technical specialty area and requirements for non-English language proficiency and overseas experience were calculated using Chi-square statistics. A qualitative thematic analysis was performed on a subset of vacancies. We analyzed the data from 1007 global health job vacancies from 127 employers. Among private and non-profit sector vacancies, 40% (n = 354) were for technical or subject matter experts, 20% (n = 177) for program directors, and 16% (n = 139) for managers, compared to 9.8% (n = 87) for entry-level and 13.6% (n = 120) for mid-level positions. The most common technical focus area was program or project management, followed by HIV/AIDS and quantitative analysis. Thematic analysis demonstrated a common emphasis on program operations, relations, design and planning, communication, and management. Our analysis shows a demand for candidates with several years of experience with global health programs, particularly program managers/directors and technical experts, with very few entry-level positions accessible to recent graduates of global health training programs. It is unlikely that global health training programs equip graduates to be competitive for the majority of positions that are currently available in this field.
NASA Astrophysics Data System (ADS)
Voegel, Phillip D.; Quashnock, Kathryn A.; Heil, Katrina M.
2004-05-01
The Student-to-Student Chemistry Initiative is an outreach program started in the fall of 2001 at Midwestern State University (MSU). The on-campus program trains high school science students to perform a series of chemistry demonstrations and subsequently provides kits containing necessary supplies and reagents for the high school students to perform demonstration programs at elementary schools. The program focuses on improving student perception of science. The program's impact on high school student perception is evaluated through statistical analysis of paired preparticipation and postparticipation surveys. The surveys focus on four areas of student perception: general attitude toward science, interest in careers in science, science awareness, and interest in attending MSU for postsecondary education. Increased scores were observed in all evaluation areas, including a statistically significant increase in science awareness following participation.
Developing nurse leaders: a program enhancing staff nurse leadership skills and professionalism.
Abraham, Pauline J
2011-01-01
This study aims to determine whether participation in the Nursing Leadership Perspectives Program (NLPP) at Mayo Clinic in Rochester, Minnesota, produced a change in leadership skills, increased professional activities, leadership promotion, and retention rates of participants. The NLPP is an educational program designed to enhance leadership skills and promote professionalism of registered nurses. The 6-month program provides participants with theoretical knowledge, core competencies, and opportunities to practice application of leadership skills. Outcome metrics were collected from registered nurses who completed the program (n = 15). Data analysis included descriptive and nonparametric methods. Participants reported statistically significant changes in their leadership skills after participation in the program (P = .007) on the Leadership Practices Inventory. Changes in professional behavior were also statistically significant as rated by the Nursing Activity Scale (P = .001). Participants demonstrated a change in leadership skills and professional behavior following the program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koch, C.D.; Pirkle, F.L.; Schmidt, J.S.
1981-01-01
A Principal Components Analysis (PCA) program has been written to aid in the interpretation of multivariate aerial radiometric data collected by the US Department of Energy (DOE) under the National Uranium Resource Evaluation (NURE) program. The variations exhibited by these data have been reduced and classified into a number of linear combinations by using the PCA program. The PCA program then generates histograms and outlier maps of the individual variates. Black and white plots can be made on a Calcomp plotter by the application of follow-up programs. All programs referred to in this guide were written for a DEC-10. From this analysis a geologist may begin to interpret the data structure. Insight into geological processes underlying the data may be obtained.
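The reduction step at the heart of PCA, extracting the linear combination that captures the most variance, can be sketched in isolation. This is a minimal power-iteration illustration of the mathematics, not the NURE program, which also classifies, maps, and plots the components:

```python
def first_principal_component(rows, iters=200):
    """Leading principal component of mean-centered data, found by power
    iteration on the sample covariance matrix. `rows` is a list of
    equal-length numeric records."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    centered = [[r[j] - means[j] for j in range(d)] for r in rows]
    cov = [[sum(r[i] * r[j] for r in centered) / (n - 1) for j in range(d)]
           for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]          # renormalize each iteration
    return v

# Strongly correlated 2-D data: the first component should lie near the
# diagonal direction (1, 1) / sqrt(2).
data = [[x, x + 0.1 * ((-1) ** i)] for i, x in enumerate(range(10))]
pc1 = first_principal_component(data)
```

Projecting each record onto the returned vector gives the first-component scores from which histograms and outlier maps can be built.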
NASA Astrophysics Data System (ADS)
Adedokun, Omolola A.; Liu, Jia; Parker, Loran Carleton; Burgess, Wilella
2015-02-01
Although virtual field trips are becoming popular, there are few empirical studies of their impacts on student outcomes. This study reports on a meta-analytic evaluation of the impact of a virtual field trip on student perceptions of scientists. Specifically, the study examined the summary effect of zipTrips broadcasts on evaluation participants' perceptions of scientists, as well as the moderating effect of program type on program impact. The results showed a statistically significant effect of each broadcast, as well as a statistically significant summary (combined) effect of zipTrips on evaluation participants' perceptions of scientists. Results of the moderation analysis showed that the effect was greater for students who participated in the evaluation of the 8th grade broadcasts, providing additional insight into the role of program variation in predicting differential program impact. This study illustrates how meta-analysis, a methodology that should be of interest to STEM education researchers and evaluation practitioners, can be used to summarize the effects of multiple offerings of the same program. Other implications for STEM educators are discussed.
Impact of management development on nurse retention.
Wilson, Alexis A
2005-01-01
Nurse retention is essential to maintain quality healthcare organizations. In an effort to mitigate the loss of nurse managers, a management education program was created for new and transitioning nurse managers that included scholarships for nurses from long-term and rural acute care settings. Program evaluation was based upon the outcomes of anticipated turnover and employee satisfaction. Using a preprogram and postprogram evaluation, the Index of Work Satisfaction (IWS) and the Anticipated Turnover Scale (ATS) were used to survey participants. Descriptive statistics as well as Wilcoxon statistics for group comparisons were used for analysis. ATS scores were significantly reduced (P < .05) for all program participants. Further analysis of scholarship recipients indicated that the management program significantly increased their intent to stay (P < .08) in their current positions. However, because of a large rate of attrition, findings can only be considered preliminary. While the high level of attrition among the scholarship recipients is disappointing, potential attendance barriers are discussed, particularly from long-term care settings. Management development programs may improve the satisfaction and retention of critically needed managers and enhance development of future nursing leaders.
A PROPOSED CHEMICAL INFORMATION AND DATA SYSTEM. VOLUME I.
CHEMICAL COMPOUNDS, *DATA PROCESSING, *INFORMATION RETRIEVAL, *CHEMICAL ANALYSIS, INPUT OUTPUT DEVICES, COMPUTER PROGRAMMING, CLASSIFICATION...CONFIGURATIONS, DATA STORAGE SYSTEMS, ATOMS, MOLECULES, PERFORMANCE(ENGINEERING), MAINTENANCE, SUBJECT INDEXING, MAGNETIC TAPE, AUTOMATIC, MILITARY REQUIREMENTS, TYPEWRITERS, OPTICS, TOPOLOGY, STATISTICAL ANALYSIS, FLOW CHARTING.
Statistical Evaluation of Time Series Analysis Techniques
NASA Technical Reports Server (NTRS)
Benignus, V. A.
1973-01-01
The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.
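The quantity such spectrum analysis programs estimate is the power spectrum of a sampled signal. As a sketch of the underlying definition only (a naive O(n^2) DFT, whereas the NASA program and its benchmark use FFT-based methods):

```python
import cmath
import math

def periodogram(x):
    """Naive DFT power spectrum of a real signal: |X[k]|^2 / n for the
    non-negative frequency bins. Illustrative only -- production code
    would use an FFT."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(n // 2 + 1)]

# A pure sinusoid at bin 5 of a 64-point record produces a spectral peak
# at index 5.
signal = [math.sin(2 * math.pi * 5 * t / 64) for t in range(64)]
power = periodogram(signal)
```

Monte Carlo evaluation of such an estimator, as in the study, repeats this on many simulated records with known spectra and compares the averaged estimates to the truth.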
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This document comprises Pacific Northwest National Laboratory's report for Fiscal Year 1996 on research and development programs. The document contains 161 project summaries in 16 areas of research and development. The 16 areas of research and development reported on are: atmospheric sciences, biotechnology, chemical instrumentation and analysis, computer and information science, ecological science, electronics and sensors, health protection and dosimetry, hydrological and geologic sciences, marine sciences, materials science and engineering, molecular science, process science and engineering, risk and safety analysis, socio-technical systems analysis, statistics and applied mathematics, and thermal and energy systems. In addition, this report provides an overview of the research and development program, program management, program funding, and Fiscal Year 1997 projects.
Statistical properties of alternative national forest inventory area estimators
Francis Roesch; John Coulston; Andrew D. Hill
2012-01-01
The statistical properties of potential estimators of forest area for the USDA Forest Service's Forest Inventory and Analysis (FIA) program are presented and discussed. The current FIA area estimator is compared and contrasted with a weighted mean estimator and an estimator based on the Polya posterior, in the presence of nonresponse. Estimator optimality is...
Some Experience with Interactive Computing in Teaching Introductory Statistics.
ERIC Educational Resources Information Center
Diegert, Carl
Students in two biostatistics courses at the Cornell Medical College and in a course in applications of computer science given in Cornell's School of Industrial Engineering were given access to an interactive package of computer programs enabling them to perform statistical analysis without the burden of hand computation. After a general…
Sun Series program for the REEDA System. [predicting orbital lifetime using sunspot values
NASA Technical Reports Server (NTRS)
Shankle, R. W.
1980-01-01
Modifications made to data bases and to four programs in a series of computer programs (Sun Series) which run on the REEDA HP minicomputer system to aid NASA's solar activity predictions used in orbital lifetime predictions are described. These programs utilize various mathematical smoothing techniques and perform statistical and graphical analyses of various solar activity data bases residing on the REEDA System.
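The simplest of the smoothing techniques such programs apply is a centered moving average over an activity index. A minimal sketch with invented values (the actual Sun Series programs use more elaborate smoothers over real sunspot series):

```python
def moving_average(series, window):
    """Centered moving average with an odd window size, in samples.
    Edge points that lack a full window are left unsmoothed for brevity."""
    half = window // 2
    smoothed = list(series)
    for i in range(half, len(series) - half):
        smoothed[i] = sum(series[i - half:i + half + 1]) / window
    return smoothed

# A 3-point average damps the spike at index 2 of this toy series.
smooth = moving_average([10, 12, 30, 11, 13, 12, 14], 3)
```

Smoothed indices like these feed the regression or extrapolation step that produces the actual activity prediction.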
[Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].
Golder, W
1999-09-01
To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basal and advanced statistics. Sample size, significance level and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basal, and 0.5% advanced statistics. The corresponding figures in 1993 and 1998 are 62 and 49%, 32 and 41%, and 6 and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors make increasing use of statistical experts' advice and programs. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.
MetaGenyo: a web tool for meta-analysis of genetic association studies.
Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro
2017-12-16
Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of studies of this type has increased exponentially, but the results are not always reproducible due to experimental designs, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools to combine results across studies to increase statistical power and to resolve discrepancies in genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although there are several software packages that implement meta-analysis, none of them are specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering the Hardy-Weinberg test, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool to conduct comprehensive genetic association meta-analysis. The application is freely available at http://bioinfo.genyo.es/metagenyo/.
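The first step in that workflow, the Hardy-Weinberg test, is a small, well-defined calculation. A minimal sketch of the standard chi-square version from genotype counts (MetaGenyo's internal implementation may differ, e.g. using an exact test):

```python
def hardy_weinberg_chi2(n_aa, n_ab, n_bb):
    """Chi-square goodness-of-fit statistic for Hardy-Weinberg equilibrium
    from genotype counts (AA, AB, BB). One degree of freedom: values of
    3.84 or more reject HWE at the 0.05 level."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)          # frequency of allele A
    q = 1 - p
    expected = [n * p * p, 2 * n * p * q, n * q * q]
    observed = [n_aa, n_ab, n_bb]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Counts exactly matching p = q = 0.5 proportions give a statistic of 0,
# while a heterozygote deficit inflates it.
stat = hardy_weinberg_chi2(25, 50, 25)
```

Control groups failing this test are commonly flagged or excluded before the association meta-analysis proceeds.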
Title I ESEA, High School; English as a Second Language: 1979-1980. OEE Evaluation Report.
ERIC Educational Resources Information Center
New York City Board of Education, Brooklyn, NY. Office of Educational Evaluation.
The report is an evaluation of the 1979-80 High School Title I English as a Second Language Program. Two types of information are presented: (1) a narrative description of the program which provides qualitative data regarding the program, and (2) a statistical analysis of test results which consists of quantitative, city-wide data. By integrating…
Additional Support for the Information Systems Analyst Exam as a Valid Program Assessment Tool
ERIC Educational Resources Information Center
Carpenter, Donald A.; Snyder, Johnny; Slauson, Gayla Jo; Bridge, Morgan K.
2011-01-01
This paper presents a statistical analysis to support the notion that the Information Systems Analyst (ISA) exam can be used as a program assessment tool in addition to measuring student performance. It compares ISA exam scores earned by students in one particular Computer Information Systems program with scores earned by the same students on the…
Evaluation of a clinical medical librarianship program at a university Health Sciences Library.
Schnall, J G; Wilson, J W
1976-01-01
An evaluation of the clinical medical librarianship program at the University of Washington Health Sciences Library was undertaken to determine the benefits of the program to patient care and to the education of the recipients of the service. Results of a questionnaire reflected overwhelming acceptance of the clinical medical librarianship program. Guidelines for the establishment of a limited clinical medical librarianship program are described. A statistical cost analysis of the program is included. PMID:938773
Brohet, C R; Richman, H G
1979-06-01
Automated processing of electrocardiograms by the Veterans Administration program was evaluated for both agreement with physician interpretation and interpretative accuracy as assessed with nonelectrocardiographic criteria. One thousand unselected electrocardiograms were analyzed by two reviewer groups, one familiar and the other unfamiliar with the computer program. A significant number of measurement errors involving repolarization changes and left axis deviation occurred; however, interpretative disagreements related to statistical decision were largely language-related. Use of a printout with a more traditional format resulted in agreement with physician interpretation by both reviewer groups in more than 80 percent of cases. Overall sensitivity based on agreement with nonelectrocardiographic criteria was significantly greater with use of the computer program than with use of the conventional criteria utilized by the reviewers. This difference was particularly evident in the subgroup analysis of myocardial infarction and left ventricular hypertrophy. The degree of overdiagnosis of left ventricular hypertrophy and posteroinferior infarction was initially unacceptable, but this difficulty was corrected by adjustment of probabilities. Clinical acceptability of the Veterans Administration program appears to require greater physician education than that needed for other computer programs of electrocardiographic analysis; the flexibility of interpretation by statistical decision offers the potential for better diagnostic accuracy.
FabricS: A user-friendly, complete and robust software for particle shape-fabric analysis
NASA Astrophysics Data System (ADS)
Moreno Chávez, G.; Castillo Rivera, F.; Sarocchi, D.; Borselli, L.; Rodríguez-Sedano, L. A.
2018-06-01
Shape-fabric is a textural parameter related to the spatial arrangement of elongated particles in geological samples. Its usefulness spans a range from sedimentary petrology to igneous and metamorphic petrology. Independently of the process being studied, when a material flows, the elongated particles are oriented with the major axis in the direction of flow. In sedimentary petrology this information has been used for studies of paleo-flow direction of turbidites, the origin of quartz sediments, and locating ignimbrite vents, among others. In addition to flow direction and its polarity, the method enables flow rheology to be inferred. The use of shape-fabric has been limited due to the difficulties of automatically measuring particles and analyzing them with reliable circular statistics programs. This dampened interest in the method for a long time. Shape-fabric measurement has increased in popularity since the 1980s thanks to the development of new image analysis techniques and circular statistics software. However, the programs currently available are unreliable, outdated, incompatible with newer operating systems, or require programming skills. The goal of our work is to develop a user-friendly program, in the MATLAB environment, with a graphical user interface, that can process images and includes editing functions and thresholds (elongation and size) for selecting a particle population and analyzing it with reliable circular statistics algorithms. Moreover, the program also has to produce rose diagrams, orientation vectors, and a complete series of statistical parameters. All these requirements are met by our new software. In this paper, we briefly explain the methodology from collection of oriented samples in the field to the minimum number of particles needed to obtain reliable fabric data. We obtained the data using specific statistical tests and taking into account the degree of iso-orientation of the samples and the required degree of reliability.
The program has been verified by means of several simulations performed using appropriately designed features and by analyzing real samples.
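The circular statistics at the core of such a program reduce to a few formulas. Below is a minimal sketch (in Python rather than MATLAB, and with made-up orientation values) of the axial mean orientation and resultant length, using the standard angle-doubling trick for 0-180 degree orientation data:

```python
import math

def axial_mean(orientations_deg):
    """Mean orientation and resultant length R for axial (0-180 deg) data.

    Angles are doubled so that, e.g., 10 and 190 degrees count as the
    same fabric direction, then the mean direction is halved back.
    """
    n = len(orientations_deg)
    c = sum(math.cos(2 * math.radians(a)) for a in orientations_deg)
    s = sum(math.sin(2 * math.radians(a)) for a in orientations_deg)
    r = math.hypot(c, s) / n            # 0 = uniform, 1 = perfectly aligned
    mean = (math.degrees(math.atan2(s, c)) / 2) % 180
    return mean, r
```

The resultant length R ranges from 0 (no preferred orientation) to 1 (perfect alignment); a Rayleigh-type test on n·R² is one common way to judge whether a measured fabric is statistically significant.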
Image analysis library software development
NASA Technical Reports Server (NTRS)
Guseman, L. F., Jr.; Bryant, J.
1977-01-01
The Image Analysis Library consists of a collection of general purpose mathematical/statistical routines and special purpose data analysis/pattern recognition routines basic to the development of image analysis techniques for support of current and future Earth Resources Programs. Work was done to provide a collection of computer routines and associated documentation which form a part of the Image Analysis Library.
Online Statistical Modeling (Regression Analysis) for Independent Responses
NASA Astrophysics Data System (ADS)
Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus
2017-06-01
Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model the relationship between response and explanatory variables. Statistical models have been developed in various directions to handle various types of data and complex relationships. A rich variety of advanced and recent statistical modelling tools is available mostly in open-source software (one of them being R). However, these advanced statistical modelling tools are not very friendly to novice R users, since they are based on programming scripts or a command-line interface. Our research aims to develop a web interface (based on R and Shiny) so that the most recent and advanced statistical modelling tools are readily available, accessible and applicable on the web. We have previously built interfaces in the form of e-tutorials for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of data analysis, including models using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible in our online Virtual Statistics Laboratory. The web interface makes statistical modelling easier to apply and makes it easier to compare models in order to find the most appropriate one for the data.
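As an illustration of the computer-intensive methods mentioned above, a percentile bootstrap confidence interval can be written in a few lines. This is a generic Python sketch with hypothetical data, not the R/Shiny implementation described in the paper:

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for an arbitrary statistic."""
    rng = random.Random(seed)
    # Resample with replacement, compute the statistic on each replicate
    reps = sorted(stat(rng.choices(data, k=len(data))) for _ in range(n_boot))
    lo = reps[int(alpha / 2 * n_boot)]            # lower percentile
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]  # upper percentile
    return lo, hi
```

Because the same resampling loop works for any statistic, web interfaces like the one described can expose bootstrap intervals for arbitrary user-chosen summaries.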
STATISTICAL PERSPECTIVE ON THE DESIGN AND ANALYSIS OF NATURAL RESOURCE MONITORING PROGRAMS
Natural resource monitoring includes a wide variation in the type of natural resource monitored as well as in the objectives for the monitoring. Rather than address the entire breadth, the focus will be restricted to programs whose focus is to produce state, regional, or nationa...
The open-source movement: an introduction for forestry professionals
Patrick Proctor; Paul C. Van Deusen; Linda S. Heath; Jeffrey H. Gove
2005-01-01
In recent years, the open-source movement has yielded a generous and powerful suite of software and utilities that rivals those developed by many commercial software companies. Open-source programs are available for many scientific needs: operating systems, databases, statistical analysis, Geographic Information System applications, and object-oriented programming....
NASA Astrophysics Data System (ADS)
Pardo-Igúzquiza, Eulogio; Rodríguez-Tovar, Francisco J.
2012-12-01
Many spectral analysis techniques have been designed assuming sequences taken with a constant sampling interval. However, there are empirical time series in the geosciences (sediment cores, fossil abundance data, isotope analysis, …) that do not follow regular sampling because of missing data, gapped data, random sampling or incomplete sequences, among other reasons. In general, interpolating an uneven series in order to obtain a succession with a constant sampling interval alters the spectral content of the series. In such cases it is preferable to follow an approach that works with the uneven data directly, avoiding the need for an explicit interpolation step. The Lomb-Scargle periodogram is a popular choice in such circumstances, as there are programs available in the public domain for its computation. One new computer program for spectral analysis improves the standard Lomb-Scargle periodogram approach in two ways: (1) it explicitly adjusts the statistical significance for any bias introduced by variance-reduction smoothing, and (2) it uses a permutation test to evaluate confidence levels, which is better suited than parametric methods when neighbouring frequencies are highly correlated. Another novel program for cross-spectral analysis offers the advantage of estimating the Lomb-Scargle cross-periodogram of two uneven time series defined on the same interval, and it evaluates the confidence levels of the estimated cross-spectra by a non-parametric, computer-intensive permutation test. Thus, the cross-spectrum, the squared coherence spectrum, the phase spectrum, and the Monte Carlo statistical significance of the cross-spectrum and the squared-coherence spectrum can be obtained. Both programs are written in ANSI Fortran 77, in view of its simplicity and compatibility. The program code is in the public domain, provided on the website of the journal (http://www.iamg.org/index.php/publisher/articleview/frmArticleID/112/).
Different examples (with simulated and real data) are described in this paper to corroborate the methodology and the implementation of these two new programs.
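The permutation approach described above can be sketched outside Fortran as well. The following Python illustration (not the authors' code; it assumes SciPy's `lombscargle`, which takes angular frequencies) estimates the periodogram peak of an unevenly sampled series and attaches a permutation p-value by shuffling the data values over the fixed sampling times:

```python
import numpy as np
from scipy.signal import lombscargle

def ls_peak_significance(t, y, freqs, n_perm=200, seed=0):
    """Lomb-Scargle peak frequency with a permutation-based p-value.

    Shuffling y over the fixed (uneven) sampling times destroys any
    periodic signal while preserving the sampling pattern, giving a
    non-parametric null distribution for the periodogram maximum.
    """
    y = y - y.mean()
    pgram = lombscargle(t, y, freqs)
    best, peak = freqs[pgram.argmax()], pgram.max()
    rng = np.random.default_rng(seed)
    n_ge = 0
    for _ in range(n_perm):
        if lombscargle(t, rng.permutation(y), freqs).max() >= peak:
            n_ge += 1
    p_value = (1 + n_ge) / (1 + n_perm)
    return best, p_value

# Unevenly sampled sinusoid with angular frequency 0.2*pi (period 10)
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 100, 80))
y = np.sin(0.2 * np.pi * t)
freqs = np.linspace(0.05, 2.0, 400)      # angular-frequency grid
best, p = ls_peak_significance(t, y, freqs)
```

Because the null distribution is built from the data themselves, no parametric assumption about the noise is needed, which is the advantage the abstract highlights for correlated neighbouring frequencies.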
NASA Technical Reports Server (NTRS)
Colvin, E. L.; Emptage, M. R.
1992-01-01
The breaking load test provides quantitative stress corrosion cracking data by determining the residual strength of tension specimens that have been exposed to corrosive environments. Eight laboratories have participated in a cooperative test program under the auspices of ASTM Committee G-1 to evaluate the new test method. All eight laboratories were able to distinguish between three tempers of aluminum alloy 7075. The statistical analysis procedures that were used in the test program do not work well in all situations. An alternative procedure using Box-Cox transformations shows a great deal of promise. An ASTM standard method has been drafted which incorporates the Box-Cox procedure.
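The Box-Cox step is straightforward to reproduce. A hedged Python sketch using SciPy and simulated, hypothetical right-skewed residual-strength data (not the ASTM round-robin results):

```python
import numpy as np
from scipy import stats

# Hypothetical right-skewed residual-strength data (simulated)
rng = np.random.default_rng(42)
strengths = rng.lognormal(mean=4.0, sigma=0.5, size=40)

# stats.boxcox estimates the transform parameter lambda by maximum
# likelihood and returns the transformed data along with lambda
transformed, lam = stats.boxcox(strengths)
```

After the transform, standard normal-theory procedures (t-tests, ANOVA across tempers) apply more safely, which is presumably why the drafted ASTM method incorporates it.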
medplot: a web application for dynamic summary and analysis of longitudinal medical data based on R.
Ahlin, Črt; Stupica, Daša; Strle, Franc; Lusa, Lara
2015-01-01
In biomedical studies the patients are often evaluated numerous times and a large number of variables are recorded at each time-point. Data entry and manipulation of longitudinal data can be performed using spreadsheet programs, which usually include some data plotting and analysis capabilities and are straightforward to use, but are not designed for the analyses of complex longitudinal data. Specialized statistical software offers more flexibility and capabilities, but first-time users with a biomedical background often find it difficult to use. We developed medplot, an interactive web application that simplifies the exploration and analysis of longitudinal data. The application can be used to summarize, visualize and analyze data by researchers who are not familiar with statistical programs and whose knowledge of statistics is limited. The summary tools produce publication-ready tables and graphs. The analysis tools include features that are seldom available in spreadsheet software, such as correction for multiple testing, repeated measurement analyses and flexible non-linear modeling of the association of the numerical variables with the outcome. medplot is freely available and open source, it has an intuitive graphical user interface (GUI), it is accessible via the Internet and can be used within a web browser, without the need for installing and maintaining programs locally on the user's computer. This paper describes the application and gives detailed examples describing how to use the application on real data from a clinical study including patients with early Lyme borreliosis.
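One of the features mentioned, correction for multiple testing, is simple enough to sketch. Here is a Python implementation of Benjamini-Hochberg adjusted p-values, a common choice for this purpose (the abstract does not specify which correction medplot applies):

```python
def benjamini_hochberg(pvals):
    """Benjamini-Hochberg adjusted p-values (false discovery rate control).

    Each p-value is scaled by m/rank and then made monotone from the
    largest p-value downward, so adjusted values never decrease with rank.
    """
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adj = [0.0] * m
    running_min = 1.0
    for rank_from_end, i in enumerate(reversed(order)):
        rank = m - rank_from_end          # 1-based rank of pvals[i]
        running_min = min(running_min, pvals[i] * m / rank)
        adj[i] = running_min
    return adj
```

Reporting adjusted rather than raw p-values is exactly the kind of safeguard that spreadsheet software omits and that the application automates for non-statisticians.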
DnaSAM: Software to perform neutrality testing for large datasets with complex null models.
Eckert, Andrew J; Liechty, John D; Tearse, Brandon R; Pande, Barnaly; Neale, David B
2010-05-01
Patterns of DNA sequence polymorphisms can be used to understand the processes of demography and adaptation within natural populations. High-throughput generation of DNA sequence data has historically been the bottleneck with respect to data processing and experimental inference. Advances in marker technologies have largely solved this problem. Currently, the limiting step is computational, with most molecular population genetic software allowing a gene-by-gene analysis through a graphical user interface. An easy-to-use analysis program that allows both high-throughput processing of multiple sequence alignments and the flexibility to simulate data under complex demographic scenarios is currently lacking. We introduce a new program, named DnaSAM, which allows high-throughput estimation of DNA sequence diversity and neutrality statistics from experimental data along with the ability to test those statistics via Monte Carlo coalescent simulations. These simulations are conducted using the ms program, which is able to incorporate several genetic parameters (e.g. recombination) and demographic scenarios (e.g. population bottlenecks). The output is a set of diversity and neutrality statistics with associated probability values under a user-specified null model, stored in an easy-to-manipulate text file. © 2009 Blackwell Publishing Ltd.
Nonlinear Statistical Estimation with Numerical Maximum Likelihood
1974-10-01
…probably most directly attributable to the speed, precision and compactness of the linear programming algorithm exercised; the mutual primal-dual… discriminant analysis is to classify the individual as a member of π1 or π2 according to the relative… Introduction to the Dissertation 1; Introduction to Statistical Estimation Theory 3; Choice of Estimator… Density Functions 12; Choice of Estimator
Analysis of USAREUR Family Housing.
1985-04-01
…Standard Installation/Division Personnel System; SJA: Staff Judge Advocate; SPSS: Statistical Package for the… for Projecting Family Housing Requirements. a. Attempts to define USAREUR's programmable family housing deficit based on the FHS have caused anguish… responses using the Statistical Package for the Social Sciences (SPSS) computer program. E-2 ANNEX E: RESPONSE TO ESC HOUSING QUESTIONNAIRE, Section, Page I
ERIC Educational Resources Information Center
Dancer, Diane; Morrison, Kellie; Tarr, Garth
2015-01-01
Peer-assisted study session (PASS) programs have been shown to positively affect students' grades in a majority of studies. This study extends that analysis in two ways: controlling for ability and other factors, with focus on international students, and by presenting results for PASS in business statistics. Ordinary least squares, random effects…
AGR-1 Thermocouple Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeff Einerson
2012-05-01
This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in the future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, AGR-1 test configuration and test procedure, overview of AGR-1 measured data, and overview of physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases, in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties resulted from the combined analysis are identified.
This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in presentation of AGR-1 measured data (Chapter 2) and interpretation of simulation results (Chapter 3). The statistics-based simulation-aided experimental control procedure described for the future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters and accounting for possible changes in both physical and thermal conditions and in instrument performance.
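The regression-based control idea can be illustrated with a toy example. The sketch below (Python, with entirely hypothetical thermocouple readings, gas fractions, and target fuel temperatures) fits a linear relationship by least squares, standing in for the report's regression functions:

```python
import numpy as np

# Entirely hypothetical numbers: thermocouple reading (C), neon gas
# fraction, and target (peak fuel) temperature (C)
tc = np.array([950., 970., 990., 1010., 1030., 1050.])
neon = np.array([0.0, 0.2, 0.1, 0.4, 0.3, 0.5])
fuel = np.array([1100., 1131., 1153., 1187., 1209., 1240.])

# Least-squares regression: fuel ~ b0 + b1*tc + b2*neon
X = np.column_stack([np.ones_like(tc), tc, neon])
beta, *_ = np.linalg.lstsq(X, fuel, rcond=None)
predicted = X @ beta
```

Inverting the fitted relationship gives the thermocouple set-point needed to hold a desired fuel temperature, which is the essence of the control procedure in Chapter 4.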
Ari, Arzu
2009-09-01
Respiratory care education programs are being held accountable for student retention. Increasing student retention is necessary for the respiratory therapy profession, which suffers from a shortage of qualified therapists needed to meet the increased demand. The present study investigated the relationship between student retention rate and program resources, in order to understand which components of program resources predict student retention rate, and to what extent. The target population of this study was baccalaureate of science degree respiratory care education programs. Using a survey research method, the data were analyzed with Pearson correlations and multiple regression analysis. With a 63% response rate (n = 36), this study found a statistically significant relationship between program resources and student retention rate. Financial and personnel resources had a statistically significant positive relationship with student retention. The mean financial resources per student was responsible for 33% of the variance in student retention, while the mean personnel resources per student accounted for 12% of the variance in student retention. Program financial resources available to students was the single best predictor of program performance on student retention. Respiratory care education programs spending more money per student and utilizing more personnel in the program have higher mean performance in student retention. Therefore, respiratory care education programs must devote sufficient resources to retaining students so that they can produce more respiratory therapists and thereby make the respiratory therapy profession stronger.
Statistical Analysis in Dental Research Papers.
1983-08-08
AD-A136019. Statistical Analysis in Dental Research Papers. Lewis Lorton; U.S. Army Institute of Dental Research, Washington, DC; August 1983; unclassified. Type of report and period covered: submission of paper, January-August 1983.
Orchestrating high-throughput genomic analysis with Bioconductor
Huber, Wolfgang; Carey, Vincent J.; Gentleman, Robert; Anders, Simon; Carlson, Marc; Carvalho, Benilton S.; Bravo, Hector Corrada; Davis, Sean; Gatto, Laurent; Girke, Thomas; Gottardo, Raphael; Hahne, Florian; Hansen, Kasper D.; Irizarry, Rafael A.; Lawrence, Michael; Love, Michael I.; MacDonald, James; Obenchain, Valerie; Oleś, Andrzej K.; Pagès, Hervé; Reyes, Alejandro; Shannon, Paul; Smyth, Gordon K.; Tenenbaum, Dan; Waldron, Levi; Morgan, Martin
2015-01-01
Bioconductor is an open-source, open-development software project for the analysis and comprehension of high-throughput data in genomics and molecular biology. The project aims to enable interdisciplinary research, collaboration and rapid development of scientific software. Based on the statistical programming language R, Bioconductor comprises 934 interoperable packages contributed by a large, diverse community of scientists. Packages cover a range of bioinformatic and statistical applications. They undergo formal initial review and continuous automated testing. We present an overview for prospective users and contributors. PMID:25633503
Pyrotechnic Shock Analysis Using Statistical Energy Analysis
2015-10-23
…SEA subsystems. A couple of validation examples are provided to demonstrate the new approach. KEY WORDS: Peak Ratio, phase perturbation… "Ballistic Shock Prediction Models and Techniques for Use in the Crusader Combat Vehicle Program," 11th Annual US Army Ground Vehicle Survivability
Methods of analysis and resources available for genetic trait mapping.
Ott, J
1999-01-01
Methods of genetic linkage analysis are reviewed and put in context with other mapping techniques. Sources of information are outlined (books, web sites, computer programs). Special consideration is given to statistical problems in canine genetic mapping (heterozygosity, inbreeding, marker maps).
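Of the statistical quantities mentioned, heterozygosity is the simplest to compute. A minimal Python sketch of expected heterozygosity (gene diversity) at a marker locus:

```python
from collections import Counter

def expected_heterozygosity(alleles):
    """Expected heterozygosity (gene diversity): H = 1 - sum(p_i^2),
    where p_i are the sample allele frequencies at a locus."""
    n = len(alleles)
    counts = Counter(alleles)
    return 1.0 - sum((c / n) ** 2 for c in counts.values())
```

Markers with higher heterozygosity are more informative for linkage analysis, which is why it is one of the quantities flagged for canine mapping, where inbreeding depresses it.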
ERIC Educational Resources Information Center
Ling, Guangming; Rijmen, Frank
2011-01-01
The factorial structure of the Time Management (TM) scale of the Student 360: Insight Program (S360) was evaluated based on a national sample. A general procedure with a variety of methods was introduced and implemented, including the computation of descriptive statistics, exploratory factor analysis (EFA), and confirmatory factor analysis (CFA).…
RAD-ADAPT: Software for modelling clonogenic assay data in radiation biology.
Zhang, Yaping; Hu, Kaiqiang; Beumer, Jan H; Bakkenist, Christopher J; D'Argenio, David Z
2017-04-01
We present a comprehensive software program, RAD-ADAPT, for the quantitative analysis of clonogenic assays in radiation biology. Two commonly used models for clonogenic assay analysis, the linear-quadratic model and the single-hit multi-target model, are included in the software. RAD-ADAPT uses the maximum likelihood estimation method to obtain parameter estimates under the assumption that cell colony count data follow a Poisson distribution. The program has an intuitive interface, generates model prediction plots, tabulates model parameter estimates, and allows automatic statistical comparison of parameters between different groups. The RAD-ADAPT interface is written using the statistical software R and the underlying computations are accomplished by the ADAPT software system for pharmacokinetic/pharmacodynamic systems analysis. The use of RAD-ADAPT is demonstrated using an example that examines the impact of pharmacologic ATM and ATR kinase inhibition on human lung cancer cell line A549 after ionizing radiation. Copyright © 2017 Elsevier B.V. All rights reserved.
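The fitting approach RAD-ADAPT describes, Poisson maximum likelihood for the linear-quadratic model, can be sketched in Python (illustrative colony counts, not data from the paper; the real program wraps ADAPT rather than SciPy):

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative clonogenic-assay data (made up):
# dose (Gy), cells plated, colonies counted
dose = np.array([0., 2., 4., 6., 8.])
plated = np.array([100., 200., 500., 2000., 10000.])
colonies = np.array([52., 55., 60., 70., 40.])

def neg_loglik(params):
    """Poisson negative log-likelihood (up to a constant) for the
    linear-quadratic model mu = plated * PE * exp(-(alpha*D + beta*D^2)),
    where PE is the plating efficiency."""
    pe, alpha, beta = params
    mu = plated * pe * np.exp(-(alpha * dose + beta * dose**2))
    return float(np.sum(mu - colonies * np.log(mu)))

res = minimize(neg_loglik, x0=[0.5, 0.2, 0.02],
               method="L-BFGS-B",
               bounds=[(1e-6, 1.0), (0.0, None), (0.0, None)])
pe_hat, alpha_hat, beta_hat = res.x
```

Treating the counts as Poisson, rather than least-squares fitting of surviving fractions, weights heavily plated high-dose points appropriately, which is the statistical rationale behind the program's design.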
The Statistical Package for the Social Sciences (SPSS) as an adjunct to pharmacokinetic analysis.
Mather, L E; Austin, K L
1983-01-01
Computer techniques for numerical analysis are well known to pharmacokineticists. Powerful techniques for data file management have been developed by social scientists but have, in general, been ignored by pharmacokineticists because of their apparent lack of ability to interface with pharmacokinetic programs. Extensive use has been made of the Statistical Package for the Social Sciences (SPSS) for its data handling capabilities, but at the same time, techniques have been developed within SPSS to interface with pharmacokinetic programs of the users' choice and to carry out a variety of user-defined pharmacokinetic tasks within SPSS commands, apart from the expected variety of statistical tasks. Because it is based on a ubiquitous package, this methodology has all of the benefits of excellent documentation, interchangeability between different types and sizes of machines and true portability of techniques and data files. An example is given of the total management of a pharmacokinetic study previously reported in the literature by the authors.
Factorial analysis of trihalomethanes formation in drinking water.
Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James
2010-06-01
Disinfection of drinking water reduces pathogenic infection, but may pose risks to human health through the formation of disinfection byproducts. The effects of different factors on the formation of trihalomethanes were investigated using a statistically designed experimental program, and a predictive model for trihalomethanes formation was developed. Synthetic water samples with different factor levels were produced, and trihalomethanes concentrations were measured. A replicated fractional factorial design with center points was performed, and significant factors were identified through statistical analysis. A second-order trihalomethanes formation model was developed from 92 experiments, and the statistical adequacy was assessed through appropriate diagnostics. This model was validated using additional data from the Drinking Water Surveillance Program database and was applied to the Smiths Falls water supply system in Ontario, Canada. The model predictions were correlated strongly to the measured trihalomethanes, with correlations of 0.95 and 0.91, respectively. The resulting model can assist in analyzing risk-cost tradeoffs in the design and operation of water supply systems.
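A second-order model of the kind described is an ordinary quadratic response surface. A Python sketch with simulated, made-up coded factors, showing that least squares recovers known coefficients:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical coded factors (e.g., x1 = chlorine dose, x2 = reaction time)
x1 = rng.uniform(-1, 1, 60)
x2 = rng.uniform(-1, 1, 60)
true = np.array([30., 8., 5., 2., -3., 1.5])   # b0, b1, b2, b12, b11, b22
y = (true[0] + true[1] * x1 + true[2] * x2 + true[3] * x1 * x2
     + true[4] * x1**2 + true[5] * x2**2 + rng.normal(0.0, 0.1, 60))

# Second-order (quadratic) response-surface model fitted by least squares
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With a fractional factorial design plus center points, as in the study, the interaction and curvature terms can be estimated from far fewer runs than a full factorial would require.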
Comparison of requirements and capabilities of major multipurpose software packages.
Igo, Robert P; Schnell, Audrey H
2012-01-01
The aim of this chapter is to introduce the reader to commonly used software packages and illustrate their input requirements, analysis options, strengths, and limitations. We focus on packages that perform more than one function and include a program for quality control, linkage, and association analyses. Additional inclusion criteria were (1) programs that are free to academic users and (2) programs that are currently supported, maintained, and developed. Using those criteria, we chose to review three programs: Statistical Analysis for Genetic Epidemiology (S.A.G.E.), PLINK, and Merlin. We will describe the required input format and analysis options. We will not go into detail about every possible program in the packages, but we will give an overview of the packages' requirements and capabilities.
Mossotti, Victor G.; Eldeeb, A. Raouf; Oscarson, Robert
1998-01-01
MORPH-I is a set of C-language computer programs for the IBM PC and compatible microcomputers. The programs in MORPH-I are used for the fractal analysis of scanning electron microscope and electron microprobe images of pore profiles exposed in cross-section. The program isolates and traces the cross-sectional profiles of exposed pores and computes the Richardson fractal dimension for each pore. Other programs in the set provide for image calibration, display, and statistical analysis of the computed dimensions for highly complex porous materials. Requirements: IBM PC or compatible; minimum 640 K RAM; math coprocessor; SVGA graphics board providing mode 103 display.
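The fractal dimension computed by MORPH-I can be illustrated with the related box-counting estimator (a sketch, not the program's Richardson divider algorithm): for a smooth profile the estimate should be near 1, while rough pore outlines give values above 1.

```python
import math

def box_count_dimension(points, sizes):
    """Fractal dimension of a digitized profile by box counting: the
    slope of log N(s) versus log(1/s), where N(s) is the number of
    grid boxes of side s that the profile touches."""
    logs, logN = [], []
    for s in sizes:
        boxes = {(int(x // s), int(y // s)) for x, y in points}
        logs.append(math.log(1.0 / s))
        logN.append(math.log(len(boxes)))
    n = len(sizes)
    mean_x = sum(logs) / n
    mean_y = sum(logN) / n
    return (sum((a - mean_x) * (b - mean_y) for a, b in zip(logs, logN))
            / sum((a - mean_x) ** 2 for a in logs))

# A smooth closed profile (a circle) should give a dimension near 1
circle = [(math.cos(2 * math.pi * k / 2000), math.sin(2 * math.pi * k / 2000))
          for k in range(2000)]
dim = box_count_dimension(circle, [0.2, 0.1, 0.05, 0.025])
```

The statistical-analysis programs in the set would then operate on many such per-pore dimensions rather than a single curve.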
On-Line Analysis of Southern FIA Data
Michael P. Spinney; Paul C. Van Deusen; Francis A. Roesch
2006-01-01
The Southern On-Line Estimator (SOLE) is a web-based FIA database analysis tool designed with an emphasis on modularity. The Java-based user interface is simple and intuitive to use and the R-based analysis engine is fast and stable. Each component of the program (data retrieval, statistical analysis and output) can be individually modified to accommodate major...
ROOT — A C++ framework for petabyte data storage, statistical analysis and visualization
NASA Astrophysics Data System (ADS)
Antcheva, I.; Ballintijn, M.; Bellenot, B.; Biskup, M.; Brun, R.; Buncic, N.; Canal, Ph.; Casadei, D.; Couet, O.; Fine, V.; Franco, L.; Ganis, G.; Gheata, A.; Maline, D. Gonzalez; Goto, M.; Iwaszkiewicz, J.; Kreshuk, A.; Segura, D. Marcos; Maunder, R.; Moneta, L.; Naumann, A.; Offermann, E.; Onuchin, V.; Panacek, S.; Rademakers, F.; Russo, P.; Tadel, M.
2009-12-01
ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. A central piece of these analysis tools is the set of histogram classes, which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like PostScript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks — e.g.
data mining in HEP — by using PROOF, which will take care of optimally distributing the work over the available resources in a transparent way.
Program summary
Program title: ROOT
Catalogue identifier: AEFA_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFA_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: LGPL
No. of lines in distributed program, including test data, etc.: 3 044 581
No. of bytes in distributed program, including test data, etc.: 36 325 133
Distribution format: tar.gz
Programming language: C++
Computer: Intel i386, Intel x86-64, Motorola PPC, Sun Sparc, HP PA-RISC
Operating system: GNU/Linux, Windows XP/Vista, Mac OS X, FreeBSD, OpenBSD, Solaris, HP-UX, AIX
Has the code been vectorized or parallelized?: Yes
RAM: > 55 Mbytes
Classification: 4, 9, 11.9, 14
Nature of problem: Storage, analysis and visualization of scientific data
Solution method: Object store, wide range of analysis algorithms and visualization methods
Additional comments: For an up-to-date author list see: http://root.cern.ch/drupal/content/root-development-team and http://root.cern.ch/drupal/content/former-root-developers
Running time: Depending on the data size and complexity of analysis algorithms
References: http://root.cern.ch
Spouge, J L
1992-01-01
Reports on retroviral primate trials rarely publish any statistical analysis. Present statistical methodology lacks appropriate tests for these trials and effectively discourages quantitative assessment. This paper describes the theory behind VACMAN, a user-friendly computer program that calculates statistics for in vitro and in vivo infectivity data. VACMAN's analysis applies to many retroviral trials using i.v. challenges and is valid whenever the viral dose-response curve has a particular shape. Statistics from actual i.v. retroviral trials illustrate some unappreciated principles of effective animal use: dilutions other than 1:10 can improve titration accuracy; infecting titration animals at the lowest doses possible can lower challenge doses; and finally, challenging test animals in small trials with more virus than controls safeguards against false successes, "reuses" animals, and strengthens experimental conclusions. The theory presented also explains the important concept of viral saturation, a phenomenon that may cause in vitro and in vivo titrations to agree for some retroviral strains and disagree for others. PMID:1323844
Mishima, Hiroyuki; Lidral, Andrew C; Ni, Jun
2008-05-28
Genetic association studies have been used to map disease-causing genes. A newly introduced statistical method, called exhaustive haplotype association study, analyzes genetic information consisting of different numbers and combinations of DNA sequence variations along a chromosome. Such studies involve a large number of statistical calculations and subsequently high computing power. It is possible to develop parallel algorithms and codes to perform the calculations on a high performance computing (HPC) system. However, most existing commonly-used statistic packages for genetic studies are non-parallel versions. Alternatively, one may use the cutting-edge technology of grid computing and its packages to conduct non-parallel genetic statistical packages on a centralized HPC system or distributed computing systems. In this paper, we report the utilization of a queuing scheduler built on the Grid Engine and run on a Rocks Linux cluster for our genetic statistical studies. Analysis of both consecutive and combinational window haplotypes was conducted by the FBAT (Laird et al., 2000) and Unphased (Dudbridge, 2003) programs. The dataset consisted of 26 loci from 277 extended families (1484 persons). Using the Rocks Linux cluster with 22 compute-nodes, FBAT jobs performed about 14.4-15.9 times faster, while Unphased jobs performed 1.1-18.6 times faster compared to the accumulated computation duration. Execution of exhaustive haplotype analysis using non-parallel software packages on a Linux-based system is an effective and efficient approach in terms of cost and performance.
Choi, Mi-Ri; Jeon, Sang-Wan; Yi, Eun-Surk
2018-04-01
The purpose of this study is to analyze differences among hospitalized cancer patients in their perception of exercise and in physical activity constraints, based on their medical history. A questionnaire survey was used as the measurement tool for 194 cancer patients (male or female, aged 20 or older) living in the Seoul metropolitan area (Seoul, Gyeonggi, Incheon). The collected data were analyzed using frequency analysis, exploratory factor analysis, reliability analysis, t-test, and one-way analysis of variance (ANOVA) with the statistical program SPSS 18.0. The following results were obtained. First, there was no statistically significant difference between cancer stage and exercise recognition/physical activity constraint. Second, there was a significant difference between cancer stage and sociocultural constraint/facility constraint/program constraint. Third, there was a significant difference between cancer operation history and physical/socio-cultural/facility/program constraint. Fourth, there was a significant difference between cancer operation history and negative perception/facility/program constraint. Fifth, there was a significant difference between ancillary cancer treatment method and negative perception/facility/program constraint. Sixth, there was a significant difference between hospitalization period and positive perception/negative perception/physical constraint/cognitive constraint. In conclusion, this study provides information needed to create a patient-centered healthcare service system by analyzing the exercise perception of hospitalized cancer patients based on their medical history and by investigating the constraint factors that prevent patients from actually making efforts to exercise.
Brown, Geoffrey W.; Sandstrom, Mary M.; Preston, Daniel N.; ...
2014-11-17
In this study, the Integrated Data Collection Analysis (IDCA) program has conducted a proficiency test for small-scale safety and thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results from this test for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Class 5 Type II standard. The material was tested as a well-characterized standard several times during the proficiency test to assess differences among participants and the range of results that may arise for well-behaved explosive materials.
Program for Weibull Analysis of Fatigue Data
NASA Technical Reports Server (NTRS)
Krantz, Timothy L.
2005-01-01
A Fortran computer program has been written for performing statistical analyses of fatigue-test data that are assumed to be adequately represented by a two-parameter Weibull distribution. This program calculates the following: (1) Maximum-likelihood estimates of the Weibull distribution; (2) Data for contour plots of relative likelihood for two parameters; (3) Data for contour plots of joint confidence regions; (4) Data for the profile likelihood of the Weibull-distribution parameters; (5) Data for the profile likelihood of any percentile of the distribution; and (6) Likelihood-based confidence intervals for parameters and/or percentiles of the distribution. The program can account for tests that are suspended without failure (the statistical term for such suspension of tests is "censoring"). The analytical approach followed in this program for the software is valid for type-I censoring, which is the removal of unfailed units at pre-specified times. Confidence regions and intervals are calculated by use of the likelihood-ratio method.
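Item (1) with type-I censoring is the core calculation the abstract describes. A minimal sketch of that likelihood, written against SciPy's general-purpose optimizer, is shown below; it is an illustrative reimplementation under the stated Weibull/censoring assumptions, not the Fortran program itself:

```python
import numpy as np
from scipy.optimize import minimize

def weibull_mle_censored(times, observed):
    """Maximum-likelihood fit of a two-parameter Weibull distribution
    with type-I (right) censoring: `observed` is False for units whose
    test was suspended without failure at the recorded time."""
    t = np.asarray(times, dtype=float)
    d = np.asarray(observed, dtype=bool)

    def neg_log_lik(p):
        k, lam = np.exp(p)  # optimize log-parameters to keep both positive
        z = (t / lam) ** k
        # failed units contribute the density; suspended units contribute
        # the survivor function exp(-(t/lam)^k)
        ll = np.sum(np.log(k / lam) + (k - 1.0) * np.log(t[d] / lam) - z[d])
        ll -= np.sum(z[~d])
        return -ll

    res = minimize(neg_log_lik, x0=np.log([1.0, float(t.mean())]),
                   method="Nelder-Mead")
    return tuple(np.exp(res.x))  # (shape k, scale lambda)

# Simulated fatigue test: true shape 2, scale 10, suspension at t = 12
rng = np.random.default_rng(0)
life = 10.0 * rng.weibull(2.0, size=2000)
times = np.minimum(life, 12.0)
k_hat, lam_hat = weibull_mle_censored(times, life <= 12.0)
```

The program's likelihood-ratio confidence regions and profile likelihoods (items 2-6) are extensions of this same log-likelihood, re-maximized with one parameter or percentile held fixed.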
The Role of Institutional Research in a High Profile Study of Undergraduate Research
ERIC Educational Resources Information Center
Webber, Karen L.
2012-01-01
Armed with a strong toolkit of knowledge and skills, institutional research (IR) professionals often serve as collaborators with campus colleagues who may need assistance with survey design, statistical analysis, program review, and assessment of individual programs or the institution. This paper discusses the role that an IR professional played…
Adolescent Pregnancy in an Urban Environment: Issues, Programs, and Evaluation.
ERIC Educational Resources Information Center
Hardy, Janet B.; Zabin, Laurie Schwab
An in-depth discussion of national and local statistics regarding teenage and adolescent pregnancy and the developmental issues involved opens this analysis. Problems and adverse consequences of adolescent pregnancy in an urban setting are explored using a city-wide random sample of adolescent births. A model pregnancy and parenting program and…
Federal Programs Supporting Educational Change, Vol. 2: Factors Affecting Change Agent Projects.
ERIC Educational Resources Information Center
Berman, Paul; Pauly, Edward W.
This second volume in the change-agent series reports the interim results of an exploratory statistical analysis of a survey of a nationwide sample of 293 change-agent projects funded by four federal demonstration programs--Elementary Secondary Education Act (ESEA) Title III, Innovative Projects; ESEA Title VII, Bilingual Projects; Vocational…
ERIC Educational Resources Information Center
Montoya, Isaac D.
2008-01-01
Three classification techniques (Chi-square Automatic Interaction Detection [CHAID], Classification and Regression Tree [CART], and discriminant analysis) were tested to determine their accuracy in predicting Temporary Assistance for Needy Families program recipients' future employment. Technique evaluation was based on proportion of correctly…
An Application of Indian Health Service Standards for Alcoholism Programs.
ERIC Educational Resources Information Center
Burns, Thomas R.
1984-01-01
Discusses Phoenix-area applications of 1981 Indian Health Service standards for alcoholism programs. Results of standard statistical techniques note areas of deficiency through application of a one-tailed z test at .05 level of significance. Factor analysis sheds further light on design of standards. Implications for revisions are suggested.…
Scripted or Non-Scripted: A Comparative Analysis of Two Reading Programs
ERIC Educational Resources Information Center
Bosen, Pamela K.
2014-01-01
The focus of this quantitative comparative study was to analyze school achievement on third-grade reading assessments in 60 similar schools over a three-year period on Washington state standardized criterion-referenced assessments. This study provides statistical data showing the non-scripted programs were consistent for all three years while…
[A Review on the Use of Effect Size in Nursing Research].
Kang, Hyuncheol; Yeon, Kyupil; Han, Sang Tae
2015-10-01
The purpose of this study was to introduce the main concepts of statistical testing and effect size and to provide researchers in nursing science with guidance on how to calculate the effect size for the statistical analysis methods mainly used in nursing. For the t-test, analysis of variance, correlation analysis, and regression analysis, which are used frequently in nursing research, the generally accepted definitions of the effect size are explained. Some formulae for calculating the effect size are described with several examples from nursing research. Furthermore, the authors present the required minimum sample size for each example using G*Power 3, the most widely used program for calculating sample size. It is noted that statistical significance testing and effect size measurement serve different purposes, and relying on only one of them may be misleading. Some practical guidelines are recommended for combining statistical significance testing and effect size measures in order to make more balanced decisions in quantitative analyses.
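For the two-independent-samples case, the pooled-SD effect size and a normal-approximation sample-size formula can be sketched as follows. This is a generic illustration, not the paper's worked examples; the exact computation in G*Power uses the noncentral t distribution and gives a slightly larger n (64 per group for d = 0.5, alpha = .05, power = .80, versus 63 from the approximation below):

```python
import math
from statistics import NormalDist

def cohens_d(x, y):
    """Cohen's d for two independent groups, using the pooled SD."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    pooled_sd = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    return (mx - my) / pooled_sd

def n_per_group(d, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sided,
    two-sample t-test: n = 2 * ((z_{1-a/2} + z_{power}) / d)^2."""
    z_a = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    z_b = NormalDist().inv_cdf(power)
    return math.ceil(2.0 * ((z_a + z_b) / d) ** 2)
```

The abstract's central caution holds here: a tiny d can be "significant" with a huge n, and a large d can miss significance with a small n, which is why both quantities should be reported.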
Outcomes assessment of a residency program in laboratory medicine.
Morse, E E; Pisciotto, P T; Hopfer, S M; Makowski, G; Ryan, R W; Aslanzadeh, J
1997-01-01
During a down-sizing of residency programs at a State University Medical School, hospital based residents' positions were eliminated. The aim was to determine the characteristics of the residents who graduated from the Laboratory Medicine Program, to compare women graduates with men graduates, and to compare IMGs with United States graduates. An assessment of a 25 year program in laboratory medicine which had graduated 100 residents showed that there was no statistically significant difference by chi-square analysis in positions (laboratory directors or staff), in certification (American Board of Pathology [and subspecialties], American Board of Medical Microbiology, American Board of Clinical Chemistry), nor in academic appointments (assistant professor to full professor) when the male graduates were compared with the female graduates or when graduates of American medical schools were compared with graduates of foreign medical schools. There were statistically significant associations by chi-square analysis between directorship positions and board certification and between academic appointments and board certification. Of 100 graduates, there were 57 directors, 52 certified, and 41 with academic appointments. Twenty-two graduates (11 women and 11 men) attained all three.
NASA Technical Reports Server (NTRS)
Gyekenyesi, J. P.
1985-01-01
A computer program was developed for calculating the statistical fast fracture reliability and failure probability of ceramic components. The program includes the two-parameter Weibull material fracture strength distribution model, using the principle of independent action for polyaxial stress states and Batdorf's shear-sensitive as well as shear-insensitive crack theories, all for volume distributed flaws in macroscopically isotropic solids. Both penny-shaped cracks and Griffith cracks are included in the Batdorf shear-sensitive crack response calculations, using Griffith's maximum tensile stress or critical coplanar strain energy release rate criteria to predict mixed mode fracture. Weibull material parameters can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and fracture data. The reliability prediction analysis uses MSC/NASTRAN stress, temperature and volume output, obtained from the use of three-dimensional, quadratic, isoparametric, or axisymmetric finite elements. The statistical fast fracture theories employed, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
ERIC Educational Resources Information Center
Borman, Stuart A.
1985-01-01
Discusses various aspects of scientific software, including evaluation and selection of commercial software products; program exchanges, catalogs, and other information sources; major data analysis packages; statistics and chemometrics software; and artificial intelligence. (JN)
Billot, Laurent; Lindley, Richard I; Harvey, Lisa A; Maulik, Pallab K; Hackett, Maree L; Murthy, Gudlavalleti Vs; Anderson, Craig S; Shamanna, Bindiganavale R; Jan, Stephen; Walker, Marion; Forster, Anne; Langhorne, Peter; Verma, Shweta J; Felix, Cynthia; Alim, Mohammed; Gandhi, Dorcas Bc; Pandian, Jeyaraj Durai
2017-02-01
Background In low- and middle-income countries, few patients receive organized rehabilitation after stroke, yet the burden of chronic diseases such as stroke is increasing in these countries. Affordable models of effective rehabilitation could have a major impact. The ATTEND trial is evaluating a family-led caregiver delivered rehabilitation program after stroke. Objective To publish the detailed statistical analysis plan for the ATTEND trial prior to trial unblinding. Methods Based upon the published registration and protocol, the blinded steering committee and management team, led by the trial statistician, have developed a statistical analysis plan. The plan has been informed by the chosen outcome measures, the data collection forms and knowledge of key baseline data. Results The resulting statistical analysis plan is consistent with best practice and will allow open and transparent reporting. Conclusions Publication of the trial statistical analysis plan reduces potential bias in trial reporting, and clearly outlines pre-specified analyses. Clinical Trial Registrations India CTRI/2013/04/003557; Australian New Zealand Clinical Trials Registry ACTRN1261000078752; Universal Trial Number U1111-1138-6707.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hertzler, C.L.; Poloski, J.P.; Bates, R.A.
1988-01-01
The Compliance Program Data Management System (DMS) developed at the Idaho National Engineering Laboratory (INEL) validates and maintains the integrity of data collected to support the Consent Order and Compliance Agreement (COCA) between the INEL and the Environmental Protection Agency (EPA). The system uses dBase III Plus programs and dBase III Plus in an interactive mode to enter, store, validate, manage, and retrieve analytical information provided on EPA Contract Laboratory Program (CLP) forms and CLP forms modified to accommodate 40 CFR 264 Appendix IX constituent analyses. Data analysis and presentation are performed using SAS, a statistical analysis software package. Archiving of data and results is performed at appropriate stages of data management. The DMS is useful for sampling and analysis programs where adherence to EPA CLP protocol, along with maintenance and retrieval of waste site investigation sampling results, is desired or requested. 3 refs.
From micro to mainframe. A practical approach to perinatal data processing.
Yeh, S Y; Lincoln, T
1985-04-01
A new, practical approach to perinatal data processing for a large obstetric population is described. This was done with a microcomputer for data entry and a mainframe computer for data reduction. The Screen Oriented Data Access (SODA) program was used to generate the data entry form and to input data into the Apple II Plus computer. Data were stored on diskettes and transmitted through a modem and telephone line to the IBM 370/168 computer. The Statistical Analysis System (SAS) program was used for statistical analyses and report generation. This approach was found to be most practical, flexible, and economical.
Analytic programming with FMRI data: a quick-start guide for statisticians using R.
Eloyan, Ani; Li, Shanshan; Muschelli, John; Pekar, Jim J; Mostofsky, Stewart H; Caffo, Brian S
2014-01-01
Functional magnetic resonance imaging (fMRI) is a thriving field that plays an important role in medical imaging analysis, biological and neuroscience research and practice. This manuscript gives a didactic introduction to the statistical analysis of fMRI data using the R project, along with the relevant R code. The goal is to give statisticians who would like to pursue research in this area a quick tutorial for programming with fMRI data. References of relevant packages and papers are provided for those interested in more advanced analysis.
An improved multiple linear regression and data analysis computer program package
NASA Technical Reports Server (NTRS)
Sidik, S. M.
1972-01-01
NEWRAP, an improved version of a previous multiple linear regression program called RAPIER, CREDUC, and CRSPLT, allows for a complete regression analysis including cross plots of the independent and dependent variables, correlation coefficients, regression coefficients, analysis of variance tables, t-statistics and their probability levels, rejection of independent variables, plots of residuals against the independent and dependent variables, and a canonical reduction of quadratic response functions useful in optimum seeking experimentation. A major improvement over RAPIER is that all regression calculations are done in double precision arithmetic.
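The abstract's closing point, that doing all regression calculations in double precision matters, is easy to motivate with a small sketch. This is a generic least-squares fit with an intercept and t-statistics, not NEWRAP itself; modern solvers such as NumPy's QR-based `lstsq` work in double precision and avoid explicitly forming the ill-conditioned normal equations:

```python
import numpy as np

def ols(x, y):
    """Least-squares fit of y on x with an intercept, returning
    coefficients, their t-statistics, and residuals."""
    X = np.asarray(x, dtype=np.float64)
    if X.ndim == 1:
        X = X[:, None]
    A = np.column_stack([np.ones(len(X)), X])   # prepend intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    dof = len(y) - A.shape[1]
    s2 = resid @ resid / dof                    # residual variance
    cov = s2 * np.linalg.inv(A.T @ A)           # coefficient covariance
    t_stats = beta / np.sqrt(np.diag(cov))
    return beta, t_stats, resid
```

With an intercept in the model, the residuals sum to (numerically) zero, which is a quick sanity check on any regression code.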
Prototyping with Data Dictionaries for Requirements Analysis.
1985-03-01
statistical packages and software for screen layout. These items work at a higher level than another category of prototyping tool, program generators... Program generators are software packages which, when given specifications, produce source listings, usually in a high order language such as COBOL...with users and this will not happen if he must stop to develop a detailed program. [Ref. 241] Hardware as well as software should be considered in
NASA Astrophysics Data System (ADS)
Zavaletta, Vanessa A.; Bartholmai, Brian J.; Robb, Richard A.
2007-03-01
Diffuse lung diseases, such as idiopathic pulmonary fibrosis (IPF), can be characterized and quantified by analysis of volumetric high resolution CT scans of the lungs. These data sets typically have dimensions of 512 x 512 x 400. It is too subjective and labor-intensive for a radiologist to analyze each slice and quantify regional abnormalities manually. Thus, computer aided techniques are necessary, particularly texture analysis techniques which classify various lung tissue types. Second and higher order statistics which relate the spatial variation of the intensity values are good discriminatory features for various textures. The intensity values in lung CT scans range between [-1024, 1024]. Calculation of second order statistics on this full range is too computationally intensive, so the data are typically binned into 16 or 32 gray levels. There are more effective ways of binning the gray level range to improve classification. An optimal and very efficient way to nonlinearly bin the histogram is to use a dynamic programming algorithm. The objective of this paper is to show that nonlinear binning using dynamic programming is computationally efficient and improves the discriminatory power of the second and higher order statistics for more accurate quantification of diffuse lung disease.
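One standard formulation of such nonlinear binning, sketched here under the assumption that the objective is to minimize total within-bin sum of squared error (the paper's exact criterion may differ), partitions the sorted intensity values into k contiguous bins by dynamic programming:

```python
def optimal_bins(values, k):
    """Partition sorted 1-D data into k contiguous bins minimizing total
    within-bin sum of squared error (SSE), via O(k*n^2) dynamic
    programming. Returns (sorted bin-start indices, optimal SSE)."""
    v = sorted(values)
    n = len(v)
    # prefix sums give O(1) SSE for any segment v[i..j]
    ps = [0.0] * (n + 1)
    ps2 = [0.0] * (n + 1)
    for i, x in enumerate(v):
        ps[i + 1] = ps[i] + x
        ps2[i + 1] = ps2[i] + x * x

    def sse(i, j):
        m = j - i + 1
        s = ps[j + 1] - ps[i]
        return (ps2[j + 1] - ps2[i]) - s * s / m

    INF = float("inf")
    dp = [[INF] * n for _ in range(k + 1)]   # dp[b][j]: best SSE of v[0..j] in b bins
    cut = [[0] * n for _ in range(k + 1)]    # cut[b][j]: start index of bin b
    for j in range(n):
        dp[1][j] = sse(0, j)
    for b in range(2, k + 1):
        for j in range(b - 1, n):
            for i in range(b - 1, j + 1):
                c = dp[b - 1][i - 1] + sse(i, j)
                if c < dp[b][j]:
                    dp[b][j] = c
                    cut[b][j] = i
    # backtrack the bin boundaries
    starts, j = [], n - 1
    for b in range(k, 1, -1):
        i = cut[b][j]
        starts.append(i)
        j = i - 1
    starts.append(0)
    return sorted(starts), dp[k][n - 1]
```

On CT data the same recurrence would run over the 2049-level intensity histogram rather than raw voxels, which keeps n small regardless of scan size.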
7 CFR 4279.43 - Certified Lender Program.
Code of Federal Regulations, 2010 CFR
2010-01-01
... guaranteed by any Federal agency, with information on delinquencies and losses and, if applicable, the... lender will provide a written certification to this effect along with a statistical analysis of its...
NASA Technical Reports Server (NTRS)
Ford, F. E.; Harkness, J. M.
1977-01-01
A brief discussion on the accelerated testing of batteries is given. The statistical analysis and the various aspects of the modeling that was done and the results attained from the model are also briefly discussed.
A review of small canned computer programs for survey research and demographic analysis.
Sinquefield, J C
1976-12-01
A variety of small canned computer programs for survey research and demographic analysis appropriate for use in developing countries are reviewed in this article. The programs discussed are SPSS (Statistical Package for the Social Sciences); CENTS, CO-CENTS, CENTS-AID, CENTS-AID II; MINI-TAB EDIT, FREQUENCIES, TABLES, REGRESSION, CLIENT RECORD, DATES, MULT, LIFE, and PREGNANCY HISTORY; FIVFIV and SINSIN; DCL (Demographic Computer Library); MINI-TAB Population Projection, Functional Population Projection, and Family Planning Target Projection. For each program, a description and evaluation of its uses, instruction manual, computer requirements, and procedures for obtaining the manual and program are provided. This information is intended to facilitate and encourage computer use by data processors in developing countries.
Preliminary Survey of Icing Conditions Measured During Routine Transcontinental Airline Operation
NASA Technical Reports Server (NTRS)
Perkins, Porter J.
1952-01-01
Icing data collected on routine operations by four DC-4-type aircraft equipped with NACA pressure-type icing-rate meters are presented as preliminary information obtained from a statistical icing data program sponsored by the NACA with the cooperation of many airline companies and the United States Air Force. The program is continuing on a much greater scale to provide large quantities of data from many air routes in the United States and overseas. Areas not covered by established air routes are also being included in the survey. The four aircraft that collected the data presented in this report were operated by United Air Lines over a transcontinental route from January through May, 1951. Analysis showed that the pressure-type icing-rate meter was satisfactory for collecting statistical data during routine operations. Data obtained on routine flight icing encounters from these four instrumented aircraft, although insufficient for a conclusive statistical analysis, provide a greater quantity and considerably more realistic information than that obtained from random research flights. A summary of statistical data will be published when the information obtained during the 1951-52 icing season and that to be obtained during the 1952-53 season can be analyzed and assembled. The 1951-52 data already analyzed indicate that the quantity, quality, and range of icing information being provided by this expanded program should afford a sound basis for ice-protection-system design by defining the important meteorological parameters of the icing cloud.
An analysis of student performance benchmarks in dental hygiene via distance education.
Olmsted, Jodi L
2010-01-01
Three graduate programs, 35 undergraduate programs and 12 dental hygiene degree completion programs in the United States use varying forms of Distance Learning (DL). Relying heavily on DL leaves an unanswered question: Is learner performance on standard benchmark assessments impacted when using technology as a delivery system? A 10 year, longitudinal examination looked for student performance differences in a Distance Education (DE) dental hygiene program. The purpose of this research was to determine if there was a difference in performance between learners taught in a traditional classroom as compared to their counterparts taking classes through an alternative delivery system. A longitudinal, ex post facto design was used. Two hundred and sixty-six subject records were examined. Seventy-seven individuals (29%) were lost through attrition over 10 years. One hundred and eighty-nine records were used as the study sample, 117 individuals were located face-to-face and 72 were at a distance. Independent variables included time and location, while the dependent variables included course grades, grade point average (GPA) and the National Board of Dental Hygiene Examination (NBDHE). Three research questions were asked: Were there statistically significant differences in learner performance on the National Board of Dental Hygiene Examination (NBDHE)? Were there statistically significant differences in learner performance when considering GPAs? Did statistically significant differences in performance exist relating to individual course grades? T-tests were used for data analysis in answering the research questions. From a cumulative perspective, no statistically significant differences were apparent for the NBDHE and GPAs or for individual courses. Interactive Television (ITV), the synchronous DL system examined, was considered effective for delivering education to learners if similar performance outcomes were the evaluation criteria.
MATHEMATICS PANEL PROGRESS REPORT FOR PERIOD MARCH 1, 1957 TO AUGUST 31, 1958
DOE Office of Scientific and Technical Information (OSTI.GOV)
Householder, A.S.
1959-03-24
ORACLE operation and programming are summarized, and progress is indicated on various current problems. Work is reviewed on numerical analysis, programming, basic mathematics, biometrics and statistics, ORACLE operations and special codes, and training. Publications and lectures for the report period are listed. (For preceding period see ORNL-2283.) (W.D.M.)
ERIC Educational Resources Information Center
Hung, Y.-C.
2012-01-01
This paper investigates the impact of combining self explaining (SE) with computer architecture diagrams to help novice students learn assembly language programming. Pre- and post-test scores for the experimental and control groups were compared and subjected to covariance (ANCOVA) statistical analysis. Results indicate that the SE-plus-diagram…
ERIC Educational Resources Information Center
Kleiner, Brian; Thomas, Nina; Lewis, Laurie
2007-01-01
This report presents findings from a 2006 national survey of all Title IV degree-granting 4- year postsecondary institutions on how teacher candidates within teacher education programs for initial licensure are being prepared to use educational technology once they enter the field. The "Educational Technology in Teacher Education Programs…
Institute for Training Minority Group Research and Evaluation Specialists. Final Report.
ERIC Educational Resources Information Center
Brown, Roscoe C., Jr.
The Institute for Training Minority Group Research and Evaluation Specialists comprised 4 programs in 1: (1) a 6-week graduate course at New York University (NYU) during the 1970 summer session for 20 minority group persons that provided training in research design, statistics, data collection and analysis, and report writing; (2) a program of…
Do Vouchers and Tax Credits Increase Private School Regulation? A Statistical Analysis
ERIC Educational Resources Information Center
Coulson, Andrew J.
2011-01-01
School voucher and education tax credit programs have proliferated in the United States over the past 2 decades. Advocates have argued that they will enable families to become active consumers in a free and competitive education marketplace, but some fear that these programs may bring a heavy regulatory burden that could stifle market forces.…
William H. McWilliams; Stanford L. Arner; Charles J. Barnett
1997-01-01
The USDA Forest Service's Forest Inventory and Analysis (FIA) program and the Forest Health Monitoring (FHM) program maintain networks of sample locations providing coarse-scale information that characterize general indicators of forest health. Tree mortality is the primary FIA variable for analyzing forest health. Recent FIA inventories of New York, Pennsylvania...
Space station interior noise analysis program
NASA Technical Reports Server (NTRS)
Stusnick, E.; Burn, M.
1987-01-01
Documentation is provided for a microcomputer program which was developed to evaluate the effect of the vibroacoustic environment on speech communication inside a space station. The program, entitled Space Station Interior Noise Analysis Program (SSINAP), combines a Statistical Energy Analysis (SEA) prediction of sound and vibration levels within the space station with a speech intelligibility model based on the Modulation Transfer Function and the Speech Transmission Index (MTF/STI). The SEA model provides an effective analysis tool for predicting the acoustic environment based on proposed space station design. The MTF/STI model provides a method for evaluating speech communication in the relatively reverberant and potentially noisy environments that are likely to occur in space stations. The combination of these two models provides a powerful analysis tool for optimizing the acoustic design of space stations from the point of view of speech communications. The mathematical algorithms used in SSINAP are presented to implement the SEA and MTF/STI models. An appendix provides an explanation of the operation of the program along with details of the program structure and code.
Probabilistic/Fracture-Mechanics Model For Service Life
NASA Technical Reports Server (NTRS)
Watkins, T., Jr.; Annis, C. G., Jr.
1991-01-01
Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.
7 CFR 4279.43 - Certified Lender Program.
Code of Federal Regulations, 2011 CFR
2011-01-01
... complete financial statements; and completion by the Agency of the environmental analysis. The Agency may... lender will provide a written certification to this effect along with a statistical analysis of its... 80 percent. (4) If the lender is a bank or savings and loan, it must have a financial strength rating...
7 CFR 4279.43 - Certified Lender Program.
Code of Federal Regulations, 2013 CFR
2013-01-01
... complete financial statements; and completion by the Agency of the environmental analysis. The Agency may... lender will provide a written certification to this effect along with a statistical analysis of its... 80 percent. (4) If the lender is a bank or savings and loan, it must have a financial strength rating...
7 CFR 4279.43 - Certified Lender Program.
Code of Federal Regulations, 2014 CFR
2014-01-01
... complete financial statements; and completion by the Agency of the environmental analysis. The Agency may... lender will provide a written certification to this effect along with a statistical analysis of its... 80 percent. (4) If the lender is a bank or savings and loan, it must have a financial strength rating...
7 CFR 4279.43 - Certified Lender Program.
Code of Federal Regulations, 2012 CFR
2012-01-01
... complete financial statements; and completion by the Agency of the environmental analysis. The Agency may... lender will provide a written certification to this effect along with a statistical analysis of its... 80 percent. (4) If the lender is a bank or savings and loan, it must have a financial strength rating...
The Equivalence of Three Statistical Packages for Performing Hierarchical Cluster Analysis
ERIC Educational Resources Information Center
Blashfield, Roger
1977-01-01
Three different software programs which contain hierarchical agglomerative cluster analysis procedures were shown to generate different solutions on the same data set using apparently the same options. The basis for the differences in the solutions was the formulae used to calculate Euclidean distance. (Author/JKS)
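The mechanism behind that finding is easy to reproduce: averaging plain Euclidean distances and averaging squared Euclidean distances are not monotone transforms of each other, so under average linkage the two formulae can reverse which clusters merge first. A tiny sketch with hypothetical distances (not the paper's data) shows the reversal:

```python
def avg(xs):
    """Mean of a list of pairwise distances (average linkage)."""
    return sum(xs) / len(xs)

# Hypothetical pairwise distances from one cluster to two merge candidates.
to_cluster_b = [3.0, 4.0]
to_cluster_c = [1.0, 5.0]

# On plain Euclidean distances, C is the nearer cluster...
plain_b, plain_c = avg(to_cluster_b), avg(to_cluster_c)
# ...but on squared Euclidean distances, B is.
sq_b = avg([d * d for d in to_cluster_b])
sq_c = avg([d * d for d in to_cluster_c])
assert plain_c < plain_b and sq_b < sq_c
```

So two packages that both claim "Euclidean distance, average linkage" can legitimately produce different dendrograms if one of them squares the distances internally.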
Opportunities for Applied Behavior Analysis in the Total Quality Movement.
ERIC Educational Resources Information Center
Redmon, William K.
1992-01-01
This paper identifies critical components of recent organizational quality improvement programs and specifies how applied behavior analysis can contribute to quality technology. Statistical Process Control and Total Quality Management approaches are compared, and behavior analysts are urged to build their research base and market behavior change…
NASA Astrophysics Data System (ADS)
Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.
2014-10-01
Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for BSA technique is proposed. Three popular BSA techniques such as frequency ratio, weights-of-evidence, and evidential belief function models are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and is created by a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
NASA Astrophysics Data System (ADS)
Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.
2015-03-01
Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for BSA technique is proposed. Three popular BSA techniques, such as frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and created by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
Substance-dependence rehab treatment in Thailand: a meta analysis.
Verachai, Viroj; Kittipichai, Wirin; Konghom, Suwapat; Lukanapichonchut, Lumsum; Sinlapasacran, Narong; Kimsongneun, Nipa; Rergarun, Prachern; Doungnimit, Amawasee
2009-12-01
The aim was to synthesize substance-dependence research focusing on the rehabilitation (rehab) treatment phase. Several criteria were used to select studies for the meta-analysis. First, the research had to focus on the rehab period of substance-dependence treatment; second, only quantitative studies that reported statistics from which effect sizes could be calculated were selected; and third, all studies came from Thai libraries and were conducted during 1997-2006. The data-collection instrument comprised two sets: the first collected general information about the studies, including the crucial statistics and test statistics, and the second assessed the quality of the studies. Synthesis of 32 separate studies yielded 323 effect sizes, computed in terms of the correlation coefficient "r". The psychology-approach rehab program had a larger effect size than the network approach (p < 0.05). Additionally, quasi-experimental studies had larger effect sizes than correlational studies (p < 0.05). Among the quasi-experimental studies, TCs showed the highest effect size (r = 0.76); among the correlational studies, the motivation program showed the highest effect size (r = 0.84). The substance-use rehab treatment programs in Thailand that showed large effect sizes should be incorporated into the current program. Narcotic studies focusing on the rehab phase should be synthesized every 5-10 years to integrate new concepts into the development of future substance-dependence rehab treatment programs, especially at the research units of the Drug Dependence Treatment Institute/Centers in Thailand.
User's manual for the Simulated Life Analysis of Vehicle Elements (SLAVE) model
NASA Technical Reports Server (NTRS)
Paul, D. D., Jr.
1972-01-01
The simulated life analysis of vehicle elements model was designed to perform statistical simulation studies for any constant loss rate. The outputs of the model consist of the total number of stages required, stages successfully completing their lifetime, and average stage flight life. This report contains a complete description of the model. Users' instructions and interpretation of input and output data are presented such that a user with little or no prior programming knowledge can successfully implement the program.
Brady, Teresa J; Murphy, Louise B; O'Colmain, Benita J; Hobson, Reeti Desai
2017-09-01
To evaluate whether implementation factors or fidelity moderate chronic disease self-management education program outcomes. Meta-analysis of 34 Arthritis Self-Management Program and Chronic Disease Self-Management Program studies. Community. N = 10 792. Twelve implementation factors: program delivery fidelity and setting and leader and participant characteristics. Eighteen program outcomes: self-reported health behaviors, physical health status, psychological health status, and health-care utilization. Meta-analysis using pooled effect sizes. Modest to moderate statistically significant differences for 4 of 6 implementation factors; these findings were counterintuitive with better outcomes when leaders and participants were unpaid, leaders had less than minimum training, and implementation did not meet fidelity requirements. Exploratory study findings suggest that these interventions tolerate some variability in implementation factors. Further work is needed to identify key elements where fidelity is essential for intervention effectiveness.
NASA Astrophysics Data System (ADS)
Wright, Robyn; Thornberg, Steven M.
SEDIDAT is a series of compiled IBM-BASIC (version 2.0) programs that direct the collection, statistical calculation, and graphic presentation of particle settling velocity and equivalent spherical diameter for samples analyzed using the settling tube technique. The programs follow a menu-driven format that is understood easily by students and scientists with little previous computer experience. Settling velocity is measured directly (cm/sec) and also converted into Chi units. Equivalent spherical diameter (reported in Phi units) is calculated using a modified Gibbs equation for different particle densities. Input parameters, such as water temperature, settling distance, particle density, run time, and Phi/Chi interval, are changed easily at operator discretion. Optional output to a dot-matrix printer includes a summary of moment and graphic statistical parameters, a tabulation of individual and cumulative weight percents, a listing of major distribution modes, and cumulative and histogram plots of raw time, settling velocity, Chi, and Phi data.
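The Chi and Phi scales used here are both base-2 logarithmic transforms from sedimentology: Phi is the negative log2 of grain diameter in millimetres, and Chi, by direct analogy, is the negative log2 of settling velocity in cm/s. A sketch of the two conversions (in Python rather than the original IBM-BASIC; the modified Gibbs equation that maps settling velocity to equivalent spherical diameter is omitted):

```python
import math

def to_phi(diameter_mm):
    """Phi scale: negative log2 of grain diameter in millimetres,
    so finer grains get larger Phi values."""
    return -math.log2(diameter_mm)

def to_chi(settling_velocity_cm_s):
    """Chi scale (analogous to Phi): negative log2 of settling
    velocity in cm/s, so slower-settling particles get larger Chi."""
    return -math.log2(settling_velocity_cm_s)
```

For example, a 0.25 mm (fine sand) grain is 2.0 Phi, and a settling velocity of 4 cm/s is -2.0 Chi.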
Samuel A. Cushman; Kevin S. McKelvey
2006-01-01
The primary weakness in our current ability to evaluate future landscapes in terms of wildlife lies in the lack of quantitative models linking wildlife to forest stand conditions, including fuels treatments. This project focuses on 1) developing statistical wildlife habitat relationships models (WHR) utilizing Forest Inventory and Analysis (FIA) and National Vegetation...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-30
... Based on Customary Charges In Sec. 447.271(a), DHHS is adding an introductory phrase to read ``Except as... hospital that is located outside of a Core-Based Statistical Area (for Medicaid) and outside a Metropolitan Statistical Area (for Medicare) and has fewer than 100 beds. DHHS is not preparing an analysis for section 1102...
Hou, Deyi; O'Connor, David; Nathanail, Paul; Tian, Li; Ma, Yan
2017-12-01
Heavy metal soil contamination is associated with potential toxicity to humans or ecotoxicity. Scholars have increasingly used a combination of geographical information science (GIS) with geostatistical and multivariate statistical analysis techniques to examine the spatial distribution of heavy metals in soils at a regional scale. A review of such studies showed that most soil sampling programs were based on grid patterns and composite sampling methodologies. Many programs intended to characterize various soil types and land use types. The most often used sampling depth intervals were 0-0.10 m, or 0-0.20 m, below surface; and the sampling densities used ranged from 0.0004 to 6.1 samples per km², with a median of 0.4 samples per km². The most widely used spatial interpolators were inverse distance weighted interpolation and ordinary kriging; and the most often used multivariate statistical analysis techniques were principal component analysis and cluster analysis. The review also identified several determining and correlating factors in heavy metal distribution in soils, including soil type, soil pH, soil organic matter, land use type, Fe, Al, and heavy metal concentrations. The major natural and anthropogenic sources of heavy metals were found to derive from lithogenic origin, roadway and transportation, atmospheric deposition, wastewater and runoff from industrial and mining facilities, fertilizer application, livestock manure, and sewage sludge. This review argues that the full potential of integrated GIS and multivariate statistical analysis for assessing heavy metal distribution in soils on a regional scale has not yet been fully realized.
It is proposed that future research be conducted to map multivariate results in GIS to pinpoint specific anthropogenic sources, to analyze temporal trends in addition to spatial patterns, to optimize modeling parameters, and to expand the use of different multivariate analysis tools beyond principal component analysis (PCA) and cluster analysis (CA). Copyright © 2017 Elsevier Ltd. All rights reserved.
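Of the two spatial interpolators the review found most common, inverse distance weighted interpolation is the simpler: the estimate at an unsampled location is a weighted average of nearby sample values, with weights decaying as a power of the distance. A minimal sketch, assuming plain (x, y) coordinates and the usual default power of 2; the function name is illustrative:

```python
def idw(points, values, target, power=2):
    """Inverse distance weighted estimate at `target` from known
    (x, y) sample points; an exact coordinate match returns the
    sample value itself (infinite weight)."""
    num = den = 0.0
    for (x, y), v in zip(points, values):
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0:
            return v
        w = d2 ** (-power / 2)  # 1 / distance**power
        num += w * v
        den += w
    return num / den
```

A point midway between two samples of 10 and 20 gets the average, 15; moving the target closer to one sample pulls the estimate toward that sample's value.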
1997-09-01
program include the ACEIT software training and the combination of Department of Defense (DOD) application, regression, and statistics. The weaknesses...and Integrated Tools (ACEIT) software and training could not be praised enough. AFIT vs. Civilian Institutions. The GCA program provides a Department...very useful to the graduates and beneficial to their careers.
Modelling the Effects of Land-Use Changes on Climate: a Case Study on Yamula DAM
NASA Astrophysics Data System (ADS)
Köylü, Ü.; Geymen, A.
2016-10-01
Dams block the flow of rivers and create artificial reservoirs that affect the climate and land-use characteristics of the river basin. In this research, the effect of the large water body impounded by Yamula Dam in the Kızılırmak Basin is analysed with respect to the surrounding land use and climate change. The non-parametric Mann-Kendall test, the Theil-Sen slope method, inverse distance weighting (IDW), and the Soil Conservation Service Curve Number (SCS-CN) method are integrated for spatial and temporal analysis of the study area. Humidity, temperature, wind speed, and precipitation observations collected at 16 weather stations near the Kızılırmak Basin are analysed, and the resulting statistics are combined with GIS data over the years. An application for the GIS analysis was developed in the Python programming language and integrated with ArcGIS software; the statistical analyses were computed in R and integrated with the developed application. Statistical analysis of the extracted time series of meteorological parameters reveals statistically significant spatiotemporal trends in climate and land-use characteristics. In this study, we demonstrate the effect of large dams on the local climate of the semi-arid Yamula Dam region.
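The two trend tools named above are both rank-based and compact to state: the Mann-Kendall S statistic counts concordant minus discordant pairs in a time series, and the Theil-Sen estimator takes the median of all pairwise slopes. A sketch of both, assuming observations at unit time steps (as for an annual series):

```python
import statistics

def mann_kendall_s(series):
    """Mann-Kendall S: number of increasing pairs minus number of
    decreasing pairs; a large positive S suggests an upward trend."""
    return sum((series[j] > series[i]) - (series[j] < series[i])
               for i in range(len(series) - 1)
               for j in range(i + 1, len(series)))

def theil_sen_slope(series):
    """Theil-Sen estimator: the median of all pairwise slopes,
    robust to outliers in the series."""
    slopes = [(series[j] - series[i]) / (j - i)
              for i in range(len(series) - 1)
              for j in range(i + 1, len(series))]
    return statistics.median(slopes)
```

A strictly increasing four-point series has all six pairs concordant (S = 6) and every pairwise slope equal to 1; significance testing of S against its null variance is omitted here.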
Statistical correlation analysis for comparing vibration data from test and analysis
NASA Technical Reports Server (NTRS)
Butler, T. G.; Strang, R. F.; Purves, L. R.; Hershfeld, D. J.
1986-01-01
A theory was developed to compare vibration modes obtained by NASTRAN analysis with those obtained experimentally. Because many more analytical modes can be obtained than experimental modes, the analytical set was treated as expansion functions for putting both sources in comparative form. The dimensional symmetry was developed for three general cases: a nonsymmetric whole model compared with a nonsymmetric whole structural test, a symmetric analytical portion compared with a symmetric experimental portion, and an analytical symmetric portion compared with a whole experimental test. The theory was coded and a statistical correlation program was installed as a utility. The theory is demonstrated on small classical structures.
A Guideline to Univariate Statistical Analysis for LC/MS-Based Untargeted Metabolomics-Derived Data
Vinaixa, Maria; Samino, Sara; Saez, Isabel; Duran, Jordi; Guinovart, Joan J.; Yanes, Oscar
2012-01-01
Several metabolomic software programs provide methods for peak picking, retention time alignment and quantification of metabolite features in LC/MS-based metabolomics. Statistical analysis, however, is needed in order to discover those features significantly altered between samples. By comparing the retention time and MS/MS data of a model compound to those from the altered feature of interest in the research sample, metabolites can then be unequivocally identified. This paper reports a comprehensive overview of a workflow for statistical analysis to rank relevant metabolite features that will be selected for further MS/MS experiments. We focus on univariate data analysis applied in parallel to all detected features. Characteristics and challenges of this analysis are discussed and illustrated using four different real LC/MS untargeted metabolomic datasets. We demonstrate the influence of considering or violating the mathematical assumptions on which univariate statistical tests rely, using high-dimensional LC/MS datasets. Issues in data analysis such as determination of sample size, analytical variation, assumptions of normality and homoscedasticity, and correction for multiple testing are discussed and illustrated in the context of our four untargeted LC/MS working examples. PMID:24957762
A Guideline to Univariate Statistical Analysis for LC/MS-Based Untargeted Metabolomics-Derived Data.
Vinaixa, Maria; Samino, Sara; Saez, Isabel; Duran, Jordi; Guinovart, Joan J; Yanes, Oscar
2012-10-18
Several metabolomic software programs provide methods for peak picking, retention time alignment and quantification of metabolite features in LC/MS-based metabolomics. Statistical analysis, however, is needed in order to discover those features significantly altered between samples. By comparing the retention time and MS/MS data of a model compound to those from the altered feature of interest in the research sample, metabolites can then be unequivocally identified. This paper reports a comprehensive overview of a workflow for statistical analysis to rank relevant metabolite features that will be selected for further MS/MS experiments. We focus on univariate data analysis applied in parallel to all detected features. Characteristics and challenges of this analysis are discussed and illustrated using four different real LC/MS untargeted metabolomic datasets. We demonstrate the influence of considering or violating the mathematical assumptions on which univariate statistical tests rely, using high-dimensional LC/MS datasets. Issues in data analysis such as determination of sample size, analytical variation, assumptions of normality and homoscedasticity, and correction for multiple testing are discussed and illustrated in the context of our four untargeted LC/MS working examples.
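A multiple-testing correction commonly applied when thousands of features are tested in parallel is the Benjamini-Hochberg procedure, which controls the false discovery rate: sort the p-values, find the largest rank k with p(k) <= (k/m)·alpha, and reject the k smallest. A sketch, assuming a plain list of p-values, one per metabolite feature:

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg FDR procedure: returns one reject/keep
    flag per input p-value, in the original order."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices, ascending p
    # Largest rank k whose p-value sits under the BH line (rank/m)*alpha.
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * alpha:
            k = rank
    # Reject the k smallest p-values.
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k:
            reject[i] = True
    return reject
```

With p-values (0.001, 0.02, 0.04, 0.8) at alpha = 0.05, the BH thresholds are 0.0125, 0.025, 0.0375, 0.05, so only the first two features are declared significant; a plain Bonferroni cut at 0.0125 would have kept only one.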
NASA Astrophysics Data System (ADS)
Zaborowicz, M.; Przybył, J.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.; Lewicki, A.; Przybył, K.
2014-04-01
The aim of the project was to develop software that extracts the characteristics of a greenhouse tomato from its image. Data gathered during image analysis and processing were used to build learning sets for artificial neural networks. The program processes pictures in JPEG format, acquires statistical information from each picture, and exports it to an external file. The software is intended for batch analysis of the collected research material, with the obtained information saved as a CSV file. The program computes 33 independent parameters to describe each tested image. The application is dedicated to the processing and image analysis of greenhouse tomatoes, but it can also be used for other fruits and vegetables of spherical shape.
Tho, Poh Chi; Ang, Emily
2016-02-01
Advancements in technology and medical treatment have made cancer care more complex. With the current trend of sub-specialization in health care, cancer patients commonly receive care from multiple specialists and have wider treatment options. In view of this, there is a need to coordinate care and integrate information to enhance care and the quality of outcomes for patients. Since the successful implementation of programs for increasing the survival rate of breast cancer patients at Harlem Hospital Center, New York, USA, patient navigation programs have been widely introduced in healthcare settings. Some literature has identified nurses as a primary candidate for assuming the role of a navigator. However, there is a need to further explore the effectiveness of patient navigation programs in improving quality of life, patient satisfaction and outcomes during the commencement of cancer treatment. The objective of this review was to synthesize the best available evidence on the effectiveness of patient navigation programs in adult cancer patients undergoing treatments such as radiotherapy and/or chemotherapy. This review considered studies that included adults aged 18 years and over, diagnosed with any type of cancer and undergoing treatment in an acute care hospital setting, including inpatient and outpatient/ambulatory care. This review considered studies that evaluated nurse-led patient navigation programs versus no patient navigation program or non-structured care coordination. A patient navigation program includes patient education, psychosocial support, and care coordination. This review considered randomized controlled trials and quasi-experimental studies. The review focused on the effects of patient navigation programs on clinical/patient outcomes.
The review included studies on patient wellbeing and clinical outcomes, but excluded studies that had examined the impact of these programs on efficiency-related outcomes, such as length of hospital stay and resource use. A three-step search strategy was utilized to find both published and unpublished studies in the databases CINAHL, MEDLINE, Academic Search Complete, EMBASE, Cochrane Central Register of Controlled Trials (CENTRAL), Science Direct, Google Scholar (SCIRUS), MEDNAR (first 200 hits) and ProQuest Dissertations and Theses, published between 1990 and 2013. Only studies published in English were included in this review. Two reviewers independently evaluated the methodological quality of studies that met the inclusion criteria for the review, using a standardized critical appraisal instrument from the Joanna Briggs Institute. Data were extracted from the included papers using the standardized data extraction tool from the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument. Quantitative data were pooled in a statistical meta-analysis using Review Manager 5.3. Effect sizes expressed as weighted mean differences (for continuous data) and their 95% confidence intervals were calculated for analysis. Heterogeneity was assessed statistically using the standard Chi-square test. Where statistical pooling was not possible, the findings are presented in narrative form. After the process of study selection, four studies (two randomized controlled trials and two quasi-experimental studies) with a total of 667 participants were included in the review. The results demonstrated no statistically significant difference in the quality of life of patients with cancer who had undergone patient navigation programs (pooled weighted difference = 0.41 [95% CI = -2.89 to 3.71], P = 0.81).
However, the two included studies that assessed patient satisfaction as an outcome measure both showed statistically significant improvements (p-values = 0.03 and 0.001, respectively). In the study that assessed patient distress level, no statistically significant difference was found between the nurse-led navigation and non-navigation groups (P = 0.675). Nurse-led patient navigation programs were not effective in addressing outcomes such as quality of life and distress levels; the systematic review did not find any significant difference between the two groups on these outcomes. There was, however, a statistically significant difference in patient satisfaction. There is limited evidence that patient navigation programs improve quality of life or reduce distress for adult patients with cancer undergoing treatment, but there is good evidence that they improve patients' satisfaction. It is therefore recommended that patient navigation programs be used for adult cancer patients in the acute care setting to improve satisfaction. There may be a need for a more rigorous evaluation of nurse-led navigation programs to determine their effectiveness; researchers should consider multi-site studies and larger sample sizes for better generalizability.
Suzuki, Tomoyuki; Kamiya, Nobuyuki; Yahata, Yuichiro; Ozeki, Yukie; Kishimoto, Tsuyoshi; Nadaoka, Yoko; Nakanishi, Yoshiko; Yoshimura, Takesumi; Shimada, Tomoe; Tada, Yuki; Shirabe, Komei; Kozawa, Kunihisa
2013-03-01
The objective of this study was to assess the need for and usefulness of training programs for Local Infectious Disease Surveillance Center (LIDSC) staff. A structured questionnaire survey was conducted to assess the needs and usefulness of training programs. The subjects of the survey were participants of a workshop held after an annual conference for the LIDSC staff. Data on demographic information, the necessity of training programs for LIDSC staff, the themes and contents of the training program, and self-assessment of knowledge on epidemiology and statistics were covered by the questionnaire. A total of 55 local government officials responded to the questionnaire (response rate: 100%). Among these, 95% of participants believed that the training program for the LIDSC staff was necessary. Basic statistical analysis (85%), descriptive epidemiology (65%), outline of epidemiology (60%), interpretation of surveillance data (65%), background and objectives of national infectious disease surveillance in Japan (60%), methods of field epidemiology (60%), and methods of data analysis (51%) were selected by over half of the respondents as suitable themes for training programs. A total of 34 LIDSC staff answered the self-assessment question on knowledge of epidemiology. A majority of respondents selected "a little" or "none" for all questions about knowledge. Only a few respondents had received education in epidemiology. The results of this study indicate that LIDSC staff have basic demands for fundamental and specialized education to improve their work. Considering the current situation regarding the capacity of LIDSC staff, these training programs should be started immediately.
Pease, J M; Morselli, M F
1987-01-01
This paper deals with a computer program adapted to a statistical method for analyzing an unlimited quantity of binary-recorded data on an independent circular variable (e.g. wind direction) and a linear variable (e.g. maple sap flow volume). Circular variables cannot be statistically analyzed with linear methods unless they have been transformed. The program calculates a critical quantity, the acrophase angle (PHI, φ₀). The technique is adapted from original mathematics [1] and is written in Fortran 77 for easier conversion between computer networks. Correlation or regression analysis can be performed following the program; because of the circular nature of the independent variable, regression becomes periodic regression. The technique was tested on a file of approximately 4050 data pairs.
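Periodic regression of a linear response on a circular predictor is usually done by ordinary least squares on the sine and cosine of the angle, y = a + b·cos(t) + c·sin(t), with the acrophase angle recovered as atan2(c, b). A self-contained sketch (in Python rather than the paper's Fortran 77; the three-parameter solver below uses Gauss-Jordan elimination on the normal equations, not the paper's algorithm):

```python
import math

def periodic_regression(angles_rad, y):
    """Fit y = a + b*cos(t) + c*sin(t) by ordinary least squares.
    Returns (a, b, c); the acrophase angle is atan2(c, b)."""
    X = [[1.0, math.cos(t), math.sin(t)] for t in angles_rad]
    # Normal equations (X^T X) beta = X^T y, solved by Gauss-Jordan.
    A = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
    for i in range(3):
        p = A[i][i]
        for j in range(i, 3):
            A[i][j] /= p
        b[i] /= p
        for k in range(3):
            if k != i:
                f = A[k][i]
                for j in range(i, 3):
                    A[k][j] -= f * A[i][j]
                b[k] -= f * b[i]
    return tuple(b)
```

On data generated from y = 2 + 3·cos(t) - sin(t) at the four quadrant angles, the fit recovers (a, b, c) = (2, 3, -1) to machine precision.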
Metikaridis, T Damianos; Hadjipavlou, Alexander; Artemiadis, Artemios; Chrousos, George; Darviri, Christina
2016-05-20
Studies have shown that stress is implicated in the cause of neck pain (NP). The purpose of this study is to examine the effect of a simple, zero-cost stress management program on patients suffering from NP. This study is a parallel-type randomized clinical study. People suffering from chronic non-specific NP were randomly assigned to an eight-week stress management program (N = 28), including diaphragmatic breathing and progressive muscle relaxation, or to a no-intervention control condition (N = 25). Self-report measures were used for the evaluation of various variables at the beginning and at the end of the eight-week monitoring period. Descriptive and inferential statistical methods were used for the analysis. At the end of the monitoring period, the intervention group showed a statistically significant reduction of stress and anxiety (p = 0.03, p = 0.01), reported stress-related symptoms (p = 0.003), percentage of disability due to NP (p = 0.000) and NP intensity (p = 0.002). At the same time, daily routine satisfaction levels were elevated (p = 0.019). No statistically significant difference was observed in cortisol measurements. Stress management has positive effects on NP patients.
NASA Astrophysics Data System (ADS)
Vigan, A.; Chauvin, G.; Bonavita, M.; Desidera, S.; Bonnefoy, M.; Mesa, D.; Beuzit, J.-L.; Augereau, J.-C.; Biller, B.; Boccaletti, A.; Brugaletta, E.; Buenzli, E.; Carson, J.; Covino, E.; Delorme, P.; Eggenberger, A.; Feldt, M.; Hagelberg, J.; Henning, T.; Lagrange, A.-M.; Lanzafame, A.; Ménard, F.; Messina, S.; Meyer, M.; Montagnier, G.; Mordasini, C.; Mouillet, D.; Moutou, C.; Mugnier, L.; Quanz, S. P.; Reggiani, M.; Ségransan, D.; Thalmann, C.; Waters, R.; Zurlo, A.
2014-01-01
Over the past decade, a growing number of deep imaging surveys have started to provide meaningful constraints on the population of extrasolar giant planets at large orbital separation. Primary targets for these surveys have been carefully selected based on their age, distance and spectral type, and often on their membership to young nearby associations where all stars share common kinematic, photometric and spectroscopic properties. The next step is a wider statistical analysis of the frequency and properties of low-mass companions as a function of stellar mass and orbital separation. In late 2009, we initiated a coordinated European Large Program using angular differential imaging in the H band (1.66 μm) with NaCo at the VLT. Our aim is to provide a comprehensive and statistically significant study of the occurrence of extrasolar giant planets and brown dwarfs at large (5-500 AU) orbital separation around ~150 young, nearby stars, a large fraction of which have never been observed at very deep contrast. The survey has now been completed and we present the data analysis and detection limits for the observed sample, for which we reach the planetary-mass domain at separations of >~50 AU on average. We also present the results of the statistical analysis that has been performed over the 75 targets newly observed at high contrast. We discuss the details of the statistical analysis and the physical constraints that our survey provides for the frequency and formation scenario of planetary-mass companions at large separation.
ERIC Educational Resources Information Center
Knight, Jennifer L.
This paper considers some decisions that must be made by the researcher conducting an exploratory factor analysis. The primary purpose is to aid the researcher in making informed decisions during the factor analysis instead of relying on defaults in statistical programs or traditions of previous researchers. Three decision areas are addressed.…
ERIC Educational Resources Information Center
Aragón, Sonia; Lapresa, Daniel; Arana, Javier; Anguera, M. Teresa; Garzón, Belén
2017-01-01
Polar coordinate analysis is a powerful data reduction technique based on the Zsum statistic, which is calculated from adjusted residuals obtained by lag sequential analysis. Its use has been greatly simplified since the addition of a module in the free software program HOISAN for performing the necessary computations and producing…
Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis
Steele, Joe; Bastola, Dhundy
2014-01-01
Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools that stem from alignment-free methods based on statistical analysis of word frequencies. We provide several clear examples to demonstrate applications and interpretations over several different areas of alignment-free analysis such as base–base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic. Additionally, we provide a detailed discussion and an example of analysis by Lempel–Ziv techniques from data compression. PMID:23904502
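The D2 statistic discussed in the review is, at heart, the inner product of two k-mer (word) count vectors: count every overlapping word of length k in each sequence, then sum the products of matching counts. A minimal sketch, assuming plain string sequences; normalized variants such as D2* and D2S are omitted:

```python
from collections import Counter

def word_counts(seq, k):
    """Count overlapping k-mers (words) in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2(seq_a, seq_b, k):
    """D2 statistic: inner product of the two k-mer count vectors,
    i.e. the number of matching word-occurrence pairs."""
    ca, cb = word_counts(seq_a, k), word_counts(seq_b, k)
    return sum(ca[w] * cb[w] for w in ca if w in cb)
```

Two identical sequences score highest (every word pairs with every copy of itself), while sequences with disjoint word content score zero, which is why D2 works as an alignment-free similarity measure.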
A Field-Effect Transistor (FET) model for ASAP
NASA Technical Reports Server (NTRS)
Ming, L.
1965-01-01
The derivation of the circuitry of a field effect transistor (FET) model, the procedure for adapting the model to automated statistical analysis program (ASAP), and the results of applying ASAP on this model are described.
Kamiru, H N; Ross, M W; Bartholomew, L K; McCurdy, S A; Kline, M W
2009-11-01
Implementation of HIV care and treatment programs in sub-Saharan Africa is a complex undertaking that requires training of health care providers (HCPs). Many sub-Saharan African countries have introduced training programs to build human resources for health. Evaluation of the ongoing trainings is warranted so that programs can be improved. The purpose of this study was to evaluate Baylor International Pediatric AIDS Initiative's (BIPAI) HCP training program in Swaziland. The specific aims were: (1) to assess coverage and delivery of the training program; and (2) to determine the impact of the training program on HCPs' knowledge about HIV and pediatric practices, attitudes toward HIV/AIDS patients, and self-efficacy to provide antiretroviral therapy (ART). The evaluation was a multimethod design with two types of data collection and analysis: (1) a one-group pretest-posttest survey with 101 HCPs; and (2) semi-structured in-depth interviews with seven trainers from Baylor College of Medicine and 16 local HCPs in Swaziland. Quantitative data were analyzed using Stata Statistical Software version 8.2 for descriptive and multivariate analysis, while factor analysis was done using the Statistical Package for the Social Sciences version 14. The transcribed interviews were analyzed using a didactic approach. Process evaluation showed that the training had good coverage, was delivered as intended, and improved as the work progressed. The training program led to a significant increase (p = 0.0000) in HCPs' knowledge about HIV/AIDS, ART, and relevant clinical pediatrics practices between pretest (mean 68.7%, SD 13.7) and post training (mean 84.0%, SD 12.0). The training program also increased trainees' self-efficacy to provide ART and improved their attitudes toward AIDS patients (p = 0.0000 and 0.02, respectively). In conclusion, the BIPAI training program in Swaziland had good coverage of all health care facilities and HCPs in Swaziland.
The training was effective in imparting knowledge and skills to HCPs and in improving their attitudes toward HIV/AIDS patients.
RepExplore: addressing technical replicate variance in proteomics and metabolomics data analysis.
Glaab, Enrico; Schneider, Reinhard
2015-07-01
High-throughput omics datasets often contain technical replicates included to account for technical sources of noise in the measurement process. Although summarizing these replicate measurements by using robust averages may help to reduce the influence of noise on downstream data analysis, the information on the variance across the replicate measurements is lost in the averaging process and is therefore typically disregarded in subsequent statistical analyses. We introduce RepExplore, a web service dedicated to exploiting the information captured in the technical replicate variance to provide more reliable and informative differential expression and abundance statistics for omics datasets. The software builds on previously published statistical methods, which have been applied successfully to biomedical omics data but are difficult to use without prior experience in programming or scripting. RepExplore facilitates the analysis by providing fully automated data processing and interactive ranking tables, whisker plots, heat maps and principal component analysis visualizations to interpret omics data and derived statistics. Freely available at http://www.repexplore.tk. Contact: enrico.glaab@uni.lu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
Shadish, William R; Hedges, Larry V; Pustejovsky, James E
2014-04-01
This article presents a d-statistic for single-case designs (SCDs) that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments, and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between-case variance to total variance (between-case plus within-case variance), how to compute power using a macro, and how to use the d to conduct a meta-analysis of studies using single-case designs in the free program R, including syntax in an appendix. This syntax includes how to read data, compute fixed and random effect average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and do various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
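The meta-analytic combination step described above can be sketched generically. The following minimal Python illustration computes an inverse-variance (fixed-effect) average effect size; it is not the authors' SPSS macros or R syntax, and the function name is ours:

```python
def fixed_effect_meta(effects, variances):
    # Inverse-variance weighted average effect size and its standard error.
    # effects: per-study d estimates; variances: their sampling variances.
    weights = [1.0 / v for v in variances]
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5
    return pooled, se
```

A random-effects version would add an estimate of between-study variance (e.g., DerSimonian-Laird) to each study's variance before weighting.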
ERIC Educational Resources Information Center
Coulson, Andrew J.
2010-01-01
School voucher and education tax credit programs have proliferated in the United States over the past two decades. Advocates have argued that they will enable families to become active consumers in a free and competitive education marketplace, but some fear that these programs may in fact bring with them a heavy regulatory burden that could stifle…
Exact and Monte Carlo resampling procedures for the Wilcoxon-Mann-Whitney and Kruskal-Wallis tests.
Berry, K J; Mielke, P W
2000-12-01
Exact and Monte Carlo resampling FORTRAN programs are described for the Wilcoxon-Mann-Whitney rank sum test and the Kruskal-Wallis one-way analysis of variance for ranks test. The program algorithms compensate for tied values and do not depend on asymptotic approximations for probability values, unlike most algorithms contained in PC-based statistical software packages.
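The resampling approach the abstract describes can be sketched outside FORTRAN. This minimal Python illustration (function names are ours, not the authors') assigns midranks to tied values and estimates a two-sided Monte Carlo permutation p-value for the rank-sum statistic:

```python
import random

def midranks(values):
    # Assign ranks so that tied values share the average of their rank positions
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def mann_whitney_resampling(x, y, n_resamples=20000, seed=0):
    # Two-sided Monte Carlo permutation p-value for the rank-sum statistic
    rng = random.Random(seed)
    pooled = list(x) + list(y)
    n = len(x)
    ranks = midranks(pooled)
    observed = sum(ranks[:n])              # rank sum of the first sample
    expected = n * (len(pooled) + 1) / 2   # its null expectation
    idx = list(range(len(pooled)))
    count = 0
    for _ in range(n_resamples):
        rng.shuffle(idx)
        stat = sum(ranks[i] for i in idx[:n])
        if abs(stat - expected) >= abs(observed - expected):
            count += 1
    return observed, count / n_resamples
```

An exact version would enumerate all partitions of the pooled sample instead of shuffling; the Monte Carlo estimate converges to it as the number of resamples grows.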
Sandoval-Castellanos, Edson; Palkopoulou, Eleftheria; Dalén, Love
2014-01-01
Inference of population demographic history has vastly improved in recent years due to a number of technological and theoretical advances, including the use of ancient DNA. Approximate Bayesian computation (ABC) stands among the most promising methods due to its simple theoretical foundation and exceptional flexibility. However, the limited availability of user-friendly programs that perform ABC analysis renders it difficult to implement, and hence programming skills are frequently required. In addition, there is limited availability of programs able to deal with heterochronous data. Here we present the software BaySICS: Bayesian Statistical Inference of Coalescent Simulations. BaySICS provides an integrated and user-friendly platform that performs ABC analyses by means of coalescent simulations from DNA sequence data. It estimates historical demographic population parameters and performs hypothesis testing by means of Bayes factors obtained from model comparisons. Although providing specific features that improve inference from datasets with heterochronous data, BaySICS also has several capabilities making it a suitable tool for analysing contemporary genetic datasets. Those capabilities include joint analysis of independent tables, a graphical interface and the implementation of Markov chain Monte Carlo without likelihoods.
CTTITEM: SAS macro and SPSS syntax for classical item analysis.
Lei, Pui-Wa; Wu, Qiong
2007-08-01
This article describes the functions of a SAS macro and an SPSS syntax that produce common statistics for conventional item analysis, including Cronbach's alpha, the item difficulty index (p-value or item mean), and item discrimination indices (D-index, point-biserial and biserial correlations for dichotomous items, and item-total correlation for polytomous items). These programs represent an improvement over the existing SAS and SPSS item analysis routines in terms of completeness and user-friendliness. To promote routine evaluations of item qualities in instrument development of any scale, the programs are available at no charge for interested users. The program codes, along with a brief user's manual that contains instructions and examples, are downloadable from suen.ed.psu.edu/~pwlei/plei.htm.
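The classical item analysis statistics named above are straightforward to compute; the following small Python sketch is illustrative only (not the CTTITEM macro or syntax) and covers alpha, difficulty, and the point-biserial discrimination index:

```python
from statistics import mean, pvariance

def cronbach_alpha(items):
    # items: one list of scores per item, all over the same respondents
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(pvariance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

def item_difficulty(scores):
    # For 0/1 items the item mean is the proportion answering correctly
    return mean(scores)

def point_biserial(item, total):
    # Correlation between a dichotomous item and the total score
    n = len(item)
    mx, my = mean(item), mean(total)
    cov = sum((a - mx) * (b - my) for a, b in zip(item, total)) / n
    return cov / (pvariance(item) ** 0.5 * pvariance(total) ** 0.5)
```

The D-index and biserial correlation are omitted for brevity; a real routine would also correct the item-total correlation by removing the item from the total.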
O'Connor, Brian P
2004-02-01
Levels-of-analysis issues arise whenever individual-level data are collected from more than one person from the same dyad, family, classroom, work group, or other interaction unit. Interdependence in data from individuals in the same interaction units also violates the independence-of-observations assumption that underlies commonly used statistical tests. This article describes the data analysis challenges that are presented by these issues and presents SPSS and SAS programs for conducting appropriate analyses. The programs conduct the within-and-between-analyses described by Dansereau, Alutto, and Yammarino (1984) and the dyad-level analyses described by Gonzalez and Griffin (1999) and Griffin and Gonzalez (1995). Contrasts with general multilevel modeling procedures are then discussed.
ITA, a portable program for the interactive analysis of data from tracer experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wootton, R.; Ashley, K.
ITA is a portable program for analyzing data from tracer experiments, with most of the mathematical and graphical work carried out by subroutines from the NAG and DASL libraries. The program can be used in batch or interactive mode, with commands typed in an English-like language, in free format. Data can be entered from a terminal keyboard or read from a file, and can be validated by printing or plotting them. Erroneous values can be corrected by appropriate editing. Analysis can involve elementary statistics, multiple-isotope crossover corrections, convolution or deconvolution, polyexponential curve-fitting, spline interpolation and/or compartmental analysis. On those installations with the appropriate hardware, high-resolution graphs can be drawn.
Guidelines for Design and Analysis of Large, Brittle Spacecraft Components
NASA Technical Reports Server (NTRS)
Robinson, E. Y.
1993-01-01
There were two related parts to this work. The first, conducted at The Aerospace Corporation, was to develop and define methods for integrating the statistical theory of brittle strength with conventional finite element stress analysis, and to carry out a limited laboratory test program to illustrate the methods. The second part, separately funded at Aerojet Electronic Systems Division, was to create the finite element postprocessing program for integrating the statistical strength analysis with the structural analysis. The second part was monitored by Capt. Jeff McCann of USAF/SMC, as Special Study No. 11, which authorized Aerojet to support Aerospace on this work requested by NASA. This second part is documented in Appendix A. The activity at Aerojet was guided by the Aerospace methods developed in the first part of this work. This joint work of Aerospace and Aerojet stemmed from prior related work for the Defense Support Program (DSP) Program Office, to qualify the DSP sensor main mirror and corrector lens for flight as part of a shuttle payload. These large brittle components of the DSP sensor are provided by Aerojet. This document defines rational methods for addressing the structural integrity and safety of large, brittle, payload components, which have low and variable tensile strength and can suddenly break or shatter. The methods are applicable to the evaluation and validation of such components, which, because of size and configuration restrictions, cannot be validated by direct proof test.
Physics Teachers and Students: A Statistical and Historical Analysis of Women
NASA Astrophysics Data System (ADS)
Gregory, Amanda
2009-10-01
Historically, women have been denied an education comparable to that available to men. Since women have been allowed into institutions of higher learning, they have been studying and earning physics degrees. The aim of this poster is to discuss the statistical relationship between the number of women enrolled in university physics programs and the number of female physics faculty members. Special care has been given to examining the statistical data in the context of the social climate at the time that these women were teaching or pursuing their education.
Linear combination reading program for capture gamma rays
Tanner, Allan B.
1971-01-01
This program computes a weighting function, Qj, which gives a scalar output value of unity when applied to the spectrum of a desired element and a minimum value (considering statistics) when applied to spectra of materials not containing the desired element. Intermediate values are obtained for materials containing the desired element, in proportion to the amount of the element they contain. The program is written in the BASIC language in a format specific to the Hewlett-Packard 2000A Time-Sharing System, and is an adaptation of an earlier program for linear combination reading for X-ray fluorescence analysis (Tanner and Brinkerhoff, 1971). Following the program is a sample run from a study of the application of the linear combination technique to capture-gamma-ray analysis for calcium (report in preparation).
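The weighting-function idea, weights that return unity on the target element's spectrum and zero on the other reference spectra, reduces to solving a small linear system when there are as many reference spectra as channels. A hedged Python sketch follows (not the original BASIC program; it assumes one spectrum per reference material and a square, well-conditioned system):

```python
def solve_linear(A, b):
    # Gaussian elimination with partial pivoting for a small square system
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def weighting_function(spectra, target_index):
    # Weights Q such that the target spectrum dotted with Q gives 1
    # and every other reference spectrum gives 0
    b = [1.0 if i == target_index else 0.0 for i in range(len(spectra))]
    return solve_linear(spectra, b)
```

With more channels than spectra, the same condition would be imposed in a least-squares sense, which also accommodates the counting-statistics minimization the abstract mentions.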
Experimental design, power and sample size for animal reproduction experiments.
Chapman, Phillip L; Seidel, George E
2008-01-01
The present paper concerns statistical issues in the design of animal reproduction experiments, with emphasis on the problems of sample size determination and power calculations. We include examples and non-technical discussions aimed at helping researchers avoid serious errors that may invalidate or seriously impair the validity of conclusions from experiments. Screen shots from interactive power calculation programs and basic SAS power calculation programs are presented to aid in understanding statistical power and computing power in some common experimental situations. Practical issues that are common to most statistical design problems are briefly discussed. These include one-sided hypothesis tests, power level criteria, equality of within-group variances, transformations of response variables to achieve variance equality, optimal specification of treatment group sizes, 'post hoc' power analysis and arguments for the increased use of confidence intervals in place of hypothesis tests.
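The power and sample-size reasoning discussed above can be approximated with the normal distribution. A Python sketch for a two-sided, two-sample comparison of means (a normal approximation to t-test power, not the paper's SAS programs; function names are ours):

```python
import math
from statistics import NormalDist

def power_two_sample(effect_size, n_per_group, alpha=0.05):
    # Approximate power for a two-sided, two-sample z-test of means,
    # with effect_size the standardized mean difference (Cohen's d)
    nd = NormalDist()
    z_a = nd.inv_cdf(1 - alpha / 2)
    ncp = effect_size * math.sqrt(n_per_group / 2)  # noncentrality
    return (1 - nd.cdf(z_a - ncp)) + nd.cdf(-z_a - ncp)

def sample_size_for_power(effect_size, target=0.8, alpha=0.05):
    # Smallest per-group n whose approximate power reaches the target
    n = 2
    while power_two_sample(effect_size, n, alpha) < target:
        n += 1
    return n
```

The normal approximation slightly understates the required n relative to an exact noncentral-t calculation; dedicated power software should be used for final designs.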
A Study of Persistence in the Northeast State Community College Health-Related Programs of Study
NASA Astrophysics Data System (ADS)
Hamilton, Allana R.
2011-12-01
The purpose of the study was to identify factors that were positively associated with persistence to graduation by students who were admitted to Health-Related Programs leading to the degree associate of applied science at Northeast State Community College. The criterion variable in this study was persistence, which was categorized into two groups: the persister group (program completers) and the nonpersister group (program noncompleters). The predictor variables included gender, ethnic origin, first- (or nonfirst-) generation-student status, age, specific major program of study, number of remedial and/or developmental courses taken, grades in selected courses (human anatomy and physiology I and II, microbiology, probability and statistics, composition I, clinical I, clinical II), and number of mathematics and science credit hours earned prior to program admission. The data for this ex post facto nonexperimental design were located in Northeast State's student records database, the Banner Information System. The subjects of the study were students who had been admitted into Health-Related Programs of study at a 2-year public community college between the years of 1999 and 2008. The population size was 761. Health-Related Programs of study included Dental Assisting, Cardiovascular Technology, Emergency Medical Technology -- Paramedic, Medical Laboratory Technology, Nursing, and Surgical Technology. A combination of descriptive and inferential statistics was used in the analysis of the data. Descriptive statistics included measures of central tendency, standard deviations, and percentages, as appropriate. Independent samples t-tests were used to determine whether the mean of a variable for one group of subjects differed from the mean of the same variable for a different group of subjects. It was found that gender, ethnic origin, first-generation status, and age were not significantly associated with persistence to graduation.
However, findings did reveal a statistically significant difference in persistence rates among the specific Health-Related Programs of study. Academic data including grades in human anatomy and physiology I, probability and statistics, and composition I, suggested a relationship between the course grade and persistence to graduation. Findings also revealed a relationship between the number of math and science courses completed and students' persistence to graduation.
Ponirou, Paraskevi; Diomidous, Marianna; Mantas, John; Kalokairinou, Athena; Kalouri, Ourania; Kapadochos, Theodoros; Tzavara, Chara
2014-01-01
Education in First Aid through health education programs can help promote the health of the population. Meanwhile, the development of alternative forms of education with an emphasis on distance learning, implemented through e-learning, creates an innovative system for building knowledge and skills in different population groups. The main purpose of this research was to investigate the effectiveness of an educational program for candidate educators regarding knowledge of, and preparedness for, emergencies at school. The study used the Solomon four-group design (2 intervention groups and 2 control groups). Statistical analysis showed significant differences among the four groups. The intervention groups improved their knowledge significantly, showing that the program was effective and that participants would be able to handle a threatening situation correctly. There were no statistically significant findings for the other independent variables (p>0.05). The health education program, implemented through synchronous distance learning, succeeded in enhancing the knowledge of candidate educators.
A PERT/CPM of the Computer Assisted Completion of The Ministry September Report. Research Report.
ERIC Educational Resources Information Center
Feeney, J. D.
Using two statistical analysis techniques (the Program Evaluation and Review Technique and the Critical Path Method), this study analyzed procedures for compiling the required yearly report of the Metropolitan Separate School Board (Catholic) of Toronto, Canada. The computer-assisted analysis organized the process of completing the report more…
The US EPA’s ToxCast™ program seeks to combine advances in high-throughput screening technology with methodologies from statistics and computer science to develop high-throughput decision support tools for assessing chemical hazard and risk. To develop new methods of analysis of...
Modular Open-Source Software for Item Factor Analysis
ERIC Educational Resources Information Center
Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.
2015-01-01
This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…
Summary of the COS Cycle 22 Calibration Program
NASA Astrophysics Data System (ADS)
Sonnentrucker, Paule; Becker, George; Bostroem, Azalee; Debes, John H.; Ely, Justin; Fox, Andrew; Lockwood, Sean; Oliveira, Cristina; Penton, Steven; Proffitt, Charles; Roman-Duval, Julia; Sahnow, David; Sana, Hugues; Taylor, Jo; Welty, Alan D.; Wheeler, Thomas
2016-09-01
We summarize the calibration activities for the Cosmic Origins Spectrograph (COS) on the Hubble Space Telescope during Cycle 22, which ran from November 2014 through October 2015. We give an overview of the COS calibration plan and COS usage statistics, and we briefly describe major changes with respect to the previous cycle. High-level executive summaries for each calibration program comprising Cycle 22 are also given here. Results of the analysis attached to each program are published in separate ISRs.
ProteoSign: an end-user online differential proteomics statistical analysis platform.
Efstathiou, Georgios; Antonakis, Andreas N; Pavlopoulos, Georgios A; Theodosiou, Theodosios; Divanach, Peter; Trudgian, David C; Thomas, Benjamin; Papanikolaou, Nikolas; Aivaliotis, Michalis; Acuto, Oreste; Iliopoulos, Ioannis
2017-07-03
Profiling of proteome dynamics is crucial for understanding cellular behavior in response to intrinsic and extrinsic stimuli and maintenance of homeostasis. Over the last 20 years, mass spectrometry (MS) has emerged as the most powerful tool for large-scale identification and characterization of proteins. Bottom-up proteomics, the most common MS-based proteomics approach, has always been challenging in terms of data management, processing, analysis and visualization, with modern instruments capable of producing several gigabytes of data from a single experiment. Here, we present ProteoSign, a freely available web application dedicated to allowing users to perform proteomics differential expression/abundance analysis in a user-friendly and self-explanatory way. Although several non-commercial standalone tools have been developed for post-quantification statistical analysis of proteomics data, most of them are not appealing to end users, as they often require cumbersome installation of programming environments, third-party software packages and sometimes further scripting or computer programming. To avoid this bottleneck, we have developed a user-friendly software platform accessible via a web interface in order to enable proteomics laboratories and core facilities to statistically analyse quantitative proteomics data sets in a resource-efficient manner. ProteoSign is available at http://bioinformatics.med.uoc.gr/ProteoSign and the source code at https://github.com/yorgodillo/ProteoSign. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
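A core step in any post-quantification differential-expression analysis is correcting per-protein p-values for multiple testing. The following small Python sketch of the Benjamini-Hochberg step-up procedure is a generic illustration of that step, not ProteoSign's actual pipeline:

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    # Benjamini-Hochberg step-up procedure controlling the false discovery
    # rate at level alpha; returns the indices declared significant.
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank / m * alpha:
            k = rank  # largest rank whose p-value clears its threshold
    return sorted(order[:k])
```

All hypotheses up to the largest passing rank are rejected, which is why a p-value can be significant here even if it misses its own threshold.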
Evaluating disease management program effectiveness: an introduction to survival analysis.
Linden, Ariel; Adams, John L; Roberts, Nancy
2004-01-01
Currently, the most widely used method in the disease management (DM) industry for evaluating program effectiveness is the "total population approach." This model is a pretest-posttest design, whose most basic limitation is that, without a control group, there may be sources of bias and/or competing extraneous confounding factors that offer plausible alternative explanations for the change from baseline. Survival analysis allows for the inclusion of data from censored cases: those subjects who either "survived" the program without experiencing the event (e.g., achievement of target clinical levels, hospitalization) or left the program prematurely due to disenrollment from the health plan or program, or who were lost to follow-up. Additionally, independent variables may be included in the model to help explain the variability in the outcome measure. To maximize the potential of this statistical method, the validity of the model and research design must be assured. This paper reviews survival analysis as an alternative, and more appropriate, approach to evaluating DM program effectiveness than the current total population approach.
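The key idea, retaining censored cases rather than dropping them, is captured by the Kaplan-Meier product-limit estimator. A minimal Python sketch (illustrative only; a full DM evaluation would add covariates via, e.g., a Cox model):

```python
def kaplan_meier(times, events):
    # times: follow-up time per subject; events: 1 = event occurred, 0 = censored.
    # Returns (time, survival probability) pairs at each event time.
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = total = 0
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            total += 1
            i += 1
        if deaths:
            survival *= 1 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= total  # everyone observed at t leaves the risk set
    return curve
```

Censored subjects contribute to the risk set up to their last observed time, which is exactly the information the total population approach discards.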
The relative effectiveness of self-management programs for type 2 diabetes.
McGowan, Patrick
2015-10-01
The objectives of the study were to investigate the effectiveness of 2 types of peer-led self-management programs in bringing about improvements in subjects with type 2 diabetes mellitus and to determine whether there were differences in effectiveness between the 2 programs. The study used a 3-arm randomized controlled trial design with clinical measures taken at baseline and at 6 and 12 months post-program. In total, 405 persons completed baseline questionnaires and were randomly allocated to a diabetes self-management program (n=130), to a general self-management program (n=109) or to a control group (n=122). A 2-way factorial analysis of variance was employed as the primary statistical analysis. The findings showed that the self-management programs had affected 5 of the 30 measures: fatigue, cognitive symptom management, self-efficacy with regard to the disease in general, communication with physician, and the score on the Diabetes Empowerment Scale. In addition, 3 variables (social role limitations, total hospital nights and glycated hemoglobin levels) showed marginally significant interaction effects. The second analysis found similar results, with 4 of the 5 measures remaining statistically significant: fatigue, cognitive symptom management, communication with physician and diabetes empowerment, with neither program being more effective than the other. The major findings are that although both programs were effective in bringing about positive changes in the outcome measures, there was little difference in effectiveness between the Diabetes Self-Management Program and the Chronic Disease Self-Management Program. This finding is consistent with the principle that behaviour-change strategies using self-efficacy are key components in health education programs. Copyright © 2015 Canadian Diabetes Association. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Null, Cynthia H.
2009-01-01
In June 2004, the Space Flight Leadership Council (SFLC) assigned an action to the NASA Engineering and Safety Center (NESC) and the External Tank (ET) project jointly to characterize the available dataset [of defect sizes from dissections of foam], identify resultant limitations to statistical treatment of ET as-built foam as part of the overall thermal protection system (TPS) certification, and report to the Program Requirements Change Board (PRCB) and SFLC in September 2004. The NESC statistics team was formed to assist the ET statistics group in August 2004. The NESC's conclusions are presented in this report.
Cichonska, Anna; Rousu, Juho; Marttinen, Pekka; Kangas, Antti J; Soininen, Pasi; Lehtimäki, Terho; Raitakari, Olli T; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ala-Korpela, Mika; Ripatti, Samuli; Pirinen, Matti
2016-07-01
A dominant approach to genetic association studies is to perform univariate tests between genotype-phenotype pairs. However, analyzing related traits together increases statistical power, and certain complex associations become detectable only when several variants are tested jointly. Currently, modest sample sizes of individual cohorts and restricted availability of individual-level genotype-phenotype data across the cohorts limit conducting multivariate tests. We introduce metaCCA, a computational framework for summary-statistics-based analysis of a single study or multiple studies that allows multivariate representation of both genotype and phenotype. It extends the statistical technique of canonical correlation analysis to the setting where original individual-level records are not available, and employs a covariance shrinkage algorithm to achieve robustness. Multivariate meta-analysis of two Finnish studies of nuclear magnetic resonance metabolomics by metaCCA, using standard univariate output from the program SNPTEST, shows an excellent agreement with the pooled individual-level analysis of original data. Motivated by strong multivariate signals in the lipid genes tested, we envision that multivariate association testing using metaCCA has great potential to provide novel insights from already published summary statistics from high-throughput phenotyping technologies. Code is available at https://github.com/aalto-ics-kepaco. Contacts: anna.cichonska@helsinki.fi or matti.pirinen@helsinki.fi. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
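The covariance shrinkage mentioned above can be illustrated with simple linear shrinkage toward a scaled identity target. This is one common choice, not necessarily metaCCA's specific algorithm, and the function name is hypothetical:

```python
def shrink_covariance(cov, lam):
    # Linear shrinkage: (1 - lam) * S + lam * mu * I, where mu is the
    # average diagonal entry of S and lam in [0, 1] controls the shrinkage.
    n = len(cov)
    mu = sum(cov[i][i] for i in range(n)) / n
    return [[(1 - lam) * cov[i][j] + (lam * mu if i == j else 0.0)
             for j in range(n)] for i in range(n)]
```

Pulling the estimate toward a well-conditioned target stabilizes downstream operations such as the matrix inversions inside canonical correlation analysis.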
Voice Response System Statistics Program : Operational Handbook.
DOT National Transportation Integrated Search
1980-06-01
This report documents the Voice Response System (VRS) Statistics Program developed for the preflight weather briefing VRS. It describes the VRS statistical report format and contents, the software program structure, and the program operation.
NASA Astrophysics Data System (ADS)
Hayes, Catherine
2005-07-01
This study sought to identify a variable or variables predictive of attrition among baccalaureate nursing students. The study was quantitative in design, and multivariate correlational statistics and discriminant statistical analysis were used to identify a model for prediction of attrition. The analysis then weighted variables according to their predictive value to determine the most parsimonious model with the greatest predictive value. Three public university nursing education programs in Mississippi offering a Bachelors Degree in Nursing were selected for the study. The population consisted of students accepted and enrolled in these three programs for the years 2001 and 2002 and graduating in the years 2003 and 2004 (N = 195). The categorical dependent variable was attrition (including academic failure or withdrawal) from the program of nursing education. The ten independent variables selected for the study and considered to have possible predictive value were: Grade Point Average for Pre-requisite Course Work; ACT Composite Score, ACT Reading Subscore, and ACT Mathematics Subscore; Letter Grades in the Courses: Anatomy & Physiology and Lab I, Algebra I, English I (101), Chemistry & Lab I, and Microbiology & Lab I; and Number of Institutions Attended (Universities, Colleges, Junior Colleges or Community Colleges). Descriptive analysis was performed, and the means of each of the ten independent variables were compared for students who attrited and those who were retained in the population. The discriminant statistical analysis created a matrix using the ten-variable model that correctly predicted attrition in the study's population in 77.6% of the cases. Variables were then combined and recombined to produce the most efficient and parsimonious model for prediction.
A six-variable model resulted, which weighted each variable according to predictive value: GPA for Prerequisite Coursework, ACT Composite, English I, Chemistry & Lab I, Microbiology & Lab I, and Number of Institutions Attended. Results of the study indicate that it is possible to predict attrition among students enrolled in baccalaureate nursing education programs and that additional investigation on the subject is warranted.
Acoustic environmental accuracy requirements for response determination
NASA Technical Reports Server (NTRS)
Pettitt, M. R.
1983-01-01
A general purpose computer program was developed for the prediction of vehicle interior noise. This program, named VIN, has both modal and statistical energy analysis capabilities for structural/acoustic interaction analysis. The analytic models and their computer implementation were verified through simple test cases with well-defined experimental results. The model was also applied in a space shuttle payload bay launch acoustics prediction study. The computer program processes large and small problems with equal efficiency because all arrays are dynamically sized by program input variables at run time. A data base is built and easily accessed for design studies. The data base significantly reduces the computational costs of such studies by allowing the reuse of the still-valid calculated parameters of previous iterations.
Automated training site selection for large-area remote-sensing image analysis
NASA Astrophysics Data System (ADS)
McCaffrey, Thomas M.; Franklin, Steven E.
1993-11-01
A computer program is presented to select training sites automatically from remotely sensed digital imagery. The basic ideas are to guide the image analyst through the process of selecting typical and representative areas for large-area image classifications by minimizing bias, and to provide an initial list of potential classes for which training sites are required to develop a classification scheme or to verify classification accuracy. Reducing subjectivity in training site selection is achieved by using a purely statistical selection of homogeneous sites, which can then be compared to field knowledge, aerial photography, or other remote-sensing imagery and ancillary data to arrive at a final selection of sites to be used to train the classification decision rules. The selection of the homogeneous sites uses simple tests based on the coefficient of variation, the F-statistic, and the Student's t-statistic. Comparisons of site means are conducted with a linearly growing list of previously located homogeneous pixels. The program supports a common pixel-interleaved digital image format and has been tested on aerial and satellite optical imagery. The program is coded efficiently in the C programming language and was developed under AIX-Unix on an IBM RISC 6000 24-bit color workstation.
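The statistical screening described above can be sketched as follows. The thresholds, class statistics, and function names are illustrative assumptions, not the program's actual C implementation: a candidate block is first screened with a coefficient-of-variation cutoff, then compared against previously accepted pixels with an F-test on variances and a Student's t-test on means.

```python
import numpy as np
from scipy import stats

def block_is_homogeneous(block, cv_max=0.10):
    """Screen a candidate training block: low coefficient of variation => homogeneous."""
    cv = block.std(ddof=1) / block.mean()
    return cv <= cv_max

def same_class(block, accepted, alpha=0.05):
    """Compare a homogeneous block against previously accepted pixels:
    equal variances (two-sided F-test) and equal means (Student's t-test)."""
    f = block.var(ddof=1) / accepted.var(ddof=1)
    df1, df2 = len(block) - 1, len(accepted) - 1
    p_f = 2 * min(stats.f.cdf(f, df1, df2), stats.f.sf(f, df1, df2))
    p_t = stats.ttest_ind(block, accepted, equal_var=True).pvalue
    return p_f > alpha and p_t > alpha

rng = np.random.default_rng(1)
water = rng.normal(50.0, 2.0, 100)      # previously accepted pixels of one class
candidate = rng.normal(50.0, 2.0, 60)   # new block drawn from the same class
forest = rng.normal(90.0, 9.0, 60)      # a block from a different class

print(block_is_homogeneous(candidate))  # low coefficient of variation
print(same_class(candidate, water))
print(same_class(forest, water))        # large mean/variance gap => new class
```

A block that passes the homogeneity test but fails the comparison starts a new candidate class, mirroring the growing-list strategy described in the abstract.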
Classification software technique assessment
NASA Technical Reports Server (NTRS)
Jayroe, R. R., Jr.; Atkinson, R.; Dasarathy, B. V.; Lybanon, M.; Ramapryian, H. K.
1976-01-01
A catalog of software options is presented for the use of local user communities to obtain software for analyzing remotely sensed multispectral imagery. The resources required to utilize a particular software program are described. Descriptions of how a particular program analyzes data and the performance of that program for an application and data set provided by the user are shown. An effort is made to establish a statistical performance base for various software programs with regard to different data sets and analysis applications, to determine the status of the state-of-the-art.
Statistical analysis and model validation of automobile emissions
DOT National Transportation Integrated Search
2000-09-01
The article discusses the development of a comprehensive modal emissions model that is currently being integrated with a variety of transportation models as part of National Cooperative Highway Research Program project 25-11. Described is the second-...
2009 Oregon traffic crash summary
DOT National Transportation Integrated Search
2010-09-01
The Crash Analysis and Reporting Unit compiles data and publishes statistics for reported motor vehicle traffic crashes per ORS 802.050(2) and 802.220(6). The data supports various local, county and state traffic safety programs, engineering and ...
2008 Oregon traffic crash summary
DOT National Transportation Integrated Search
2009-09-01
The Crash Analysis and Reporting Unit compiles data and publishes statistics for reported motor vehicle traffic crashes per ORS 802.050(2) and 802.220(6). The data supports various local, county and state traffic safety programs, engineering and ...
2010 Oregon traffic crash summary
DOT National Transportation Integrated Search
2011-08-01
The Crash Analysis and Reporting Unit compiles data and publishes statistics for reported motor vehicle traffic crashes per ORS 802.050(2) and 802.220(6). The data supports various local, county and state traffic safety programs, engineering and ...
P-MartCancer: Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets.
Webb-Robertson, Bobbie-Jo M; Bramer, Lisa M; Jensen, Jeffrey L; Kobold, Markus A; Stratton, Kelly G; White, Amanda M; Rodland, Karin D
2017-11-01
P-MartCancer is an interactive web-based software environment that enables statistical analyses of peptide or protein data, quantitated from mass spectrometry-based global proteomics experiments, without requiring in-depth knowledge of statistical programming. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification, and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access and the capability to analyze multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium at the peptide, gene, and protein levels. P-MartCancer is deployed as a web service (https://pmart.labworks.org/cptac.html), alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/). Cancer Res; 77(21); e47-50. ©2017 American Association for Cancer Research.
Parallel line analysis: multifunctional software for the biomedical sciences
NASA Technical Reports Server (NTRS)
Swank, P. R.; Lewis, M. L.; Damron, K. L.; Morrison, D. R.
1990-01-01
An easy to use, interactive FORTRAN program for analyzing the results of parallel line assays is described. The program is menu driven and consists of five major components: data entry, data editing, manual analysis, manual plotting, and automatic analysis and plotting. Data can be entered from the terminal or from previously created data files. The data editing portion of the program is used to inspect and modify data and to statistically identify outliers. The manual analysis component is used to test the assumptions necessary for parallel line assays using analysis of covariance techniques and to determine potency ratios with confidence limits. The manual plotting component provides a graphic display of the data on the terminal screen or on a standard line printer. The automatic portion runs through multiple analyses without operator input. Data may be saved in a special file to expedite input at a future time.
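The core computation of a parallel line assay, checking that the two log dose-response lines are parallel and reading the potency ratio off their horizontal separation, can be sketched in a few lines. The data below are invented, and the crude slope-difference check stands in for the program's full analysis-of-covariance test.

```python
import numpy as np

# Log-doses and responses for standard (S) and test (T) preparations (made-up data)
log_dose = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
resp_S = np.array([10.1, 14.9, 20.2, 25.0, 29.8])
resp_T = np.array([13.0, 18.1, 23.0, 27.9, 33.1])   # same slope, shifted upward

# Separate least-squares line fits (slope, intercept)
bS, aS = np.polyfit(log_dose, resp_S, 1)
bT, aT = np.polyfit(log_dose, resp_T, 1)

# Parallelism check: slopes must agree before a potency ratio is meaningful
assert abs(bS - bT) < 0.5, "lines not parallel; potency ratio undefined"

# Common slope, then log potency ratio = horizontal shift between the lines
b = (bS + bT) / 2
log_rho = (aT - aS) / b
print(f"slopes: {bS:.2f}, {bT:.2f}; potency ratio: {10 ** log_rho:.2f}")
```

A real assay program would also attach confidence limits to the ratio, as the abstract describes, typically via Fieller's theorem.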
Computing Reliabilities Of Ceramic Components Subject To Fracture
NASA Technical Reports Server (NTRS)
Nemeth, N. N.; Gyekenyesi, J. P.; Manderscheid, J. M.
1992-01-01
CARES calculates fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. Program uses results from commercial structural-analysis program (MSC/NASTRAN or ANSYS) to evaluate reliability of component in presence of inherent surface- and/or volume-type flaws. Computes measure of reliability by use of finite-element mathematical model applicable to multiple materials in sense model made function of statistical characterizations of many ceramic materials. Reliability analysis uses element stress, temperature, area, and volume outputs, obtained from two-dimensional shell and three-dimensional solid isoparametric or axisymmetric finite elements. Written in FORTRAN 77.
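The fast-fracture calculation CARES performs can be illustrated with the two-parameter Weibull volume-flaw model, in which each finite element contributes a risk-of-rupture term from its stress and volume. All numbers below (stresses, volumes, Weibull modulus, and scale parameter) are assumed for illustration; CARES itself takes them from NASTRAN or ANSYS output and material characterizations.

```python
import numpy as np

def failure_probability(stress, volume, m, sigma0):
    """Two-parameter Weibull volume-flaw model:
    Pf = 1 - exp(-sum_i V_i * (sigma_i / sigma0)^m) over finite elements."""
    risk = np.sum(np.asarray(volume) * (np.asarray(stress) / sigma0) ** m)
    return 1.0 - np.exp(-risk)

# Hypothetical element stresses (MPa) and volumes (mm^3) from a structural solve
stress = np.array([180.0, 220.0, 260.0, 150.0])
volume = np.array([2.0, 1.5, 0.5, 3.0])
m, sigma0 = 10.0, 400.0   # assumed Weibull modulus and scale parameter

pf = failure_probability(stress, volume, m, sigma0)
print(f"fast-fracture failure probability: {pf:.4f}")
```

Because the risk integrand is summed element by element, the reliability responds strongly to the most highly stressed elements, which is why the Weibull modulus m dominates the prediction.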
NASA Technical Reports Server (NTRS)
Ryan, Robert S.; Townsend, John S.
1993-01-01
The prospective improvement of probabilistic methods for space program analysis/design entails the further development of theories, codes, and tools which match specific areas of application, the drawing of lessons from previous uses of probability and statistics data bases, the enlargement of data bases (especially in the field of structural failures), and the education of engineers and managers on the advantages of these methods. An evaluation is presently made of the current limitations of probabilistic engineering methods. Recommendations are made for specific applications.
Computational tools for multi-linked flexible structures
NASA Technical Reports Server (NTRS)
Lee, Gordon K. F.; Brubaker, Thomas A.; Shults, James R.
1990-01-01
A software module which designs and tests controllers and filters in Kalman Estimator form, based on a polynomial state-space model is discussed. The user-friendly program employs an interactive graphics approach to simplify the design process. A variety of input methods are provided to test the effectiveness of the estimator. Utilities are provided which address important issues in filter design such as graphical analysis, statistical analysis, and calculation time. The program also provides the user with the ability to save filter parameters, inputs, and outputs for future use.
Parker, Elizabeth O; Chang, Jennifer; Thomas, Volker
2016-01-01
We examined the trends of quantitative research over the past 10 years in the Journal of Marital and Family Therapy (JMFT). Specifically, we investigated the types and trends of research design and statistical analysis within the quantitative research published in JMFT from 2005 to 2014. We found that while the number of peer-reviewed articles has increased over time, the percentage of quantitative research has remained constant. We discussed the types and trends of statistical analysis and the implications for clinical work and training programs in the field of marriage and family therapy. © 2016 American Association for Marriage and Family Therapy.
Vibration Response Models of a Stiffened Aluminum Plate Excited by a Shaker
NASA Technical Reports Server (NTRS)
Cabell, Randolph H.
2008-01-01
Numerical models of structural-acoustic interactions are of interest to aircraft designers and the space program. This paper describes a comparison between two energy finite element codes, a statistical energy analysis code, a structural finite element code, and the experimentally measured response of a stiffened aluminum plate excited by a shaker. Different methods for modeling the stiffeners and the power input from the shaker are discussed. The results show that the energy codes (energy finite element and statistical energy analysis) accurately predicted the measured mean square velocity of the plate. In addition, predictions from an energy finite element code had the best spatial correlation with measured velocities. However, predictions from a considerably simpler, single subsystem, statistical energy analysis model also correlated well with the spatial velocity distribution. The results highlight a need for further work to understand the relationship between modeling assumptions and the prediction results.
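The statistical energy analysis prediction referenced above rests on a power balance among coupled subsystems. A minimal two-subsystem sketch (all loss factors, modal densities, and masses are assumed values, not the test plate's) solves the balance for subsystem energies and converts them to mean-square velocities:

```python
import numpy as np

# Two-subsystem SEA power balance:  P = omega * C @ E
# with C = [[eta1 + eta12, -eta21], [-eta12, eta2 + eta21]]
omega = 2 * np.pi * 500.0          # analysis band center frequency (rad/s)
eta1, eta2 = 0.01, 0.02            # damping loss factors (assumed)
eta12 = 0.005                      # coupling loss factor, subsystem 1 -> 2 (assumed)
n1, n2 = 0.1, 0.3                  # modal densities (modes/Hz, assumed)
eta21 = eta12 * n1 / n2            # SEA reciprocity relation: n1*eta12 = n2*eta21

C = np.array([[eta1 + eta12, -eta21],
              [-eta12,       eta2 + eta21]])
P = np.array([1.0, 0.0])           # 1 W of shaker power into subsystem 1 only

E = np.linalg.solve(omega * C, P)  # band-averaged subsystem energies (J)
m1, m2 = 5.0, 8.0                  # subsystem masses (kg, assumed)
v2 = E / np.array([m1, m2])        # spatially averaged mean-square velocities
print(f"energies: {E}, mean-square velocities: {v2}")
```

The directly driven subsystem carries the most energy; how accurately the coupling loss factors are modeled is exactly the stiffener-modeling issue the paper investigates.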
Compositional Solution Space Quantification for Probabilistic Software Analysis
NASA Technical Reports Server (NTRS)
Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem
2014-01-01
Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
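The variance-reduction idea, using interval constraint propagation to bound the solution region and then sampling only inside those bounds, can be illustrated on a toy constraint. The domain, constraint, and bounding box below are assumptions made for the sketch; the paper's approach handles arbitrarily complex floating-point constraints compositionally.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000
constraint = lambda x, y: x * x + y * y <= 1.0   # toy target-event condition

# Naive estimator: sample the full input domain [-10, 10]^2
x, y = rng.uniform(-10, 10, (2, N))
p_naive = constraint(x, y).mean()

# Focused estimator: suppose interval propagation bounds all solutions
# inside the box [-1, 1]^2; sample only there and rescale by the
# box-measure / domain-measure ratio to keep the estimate unbiased
x, y = rng.uniform(-1, 1, (2, N))
p_focused = constraint(x, y).mean() * (2 * 2) / (20 * 20)

exact = np.pi / 400                              # circle area over domain area
print(p_naive, p_focused, exact)
```

Both estimators are unbiased, but concentrating the same sample budget on the bounded region shrinks the estimator's variance by roughly the squared measure ratio, which is the accuracy gain the abstract reports.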
NASA Astrophysics Data System (ADS)
Tyralis, Hristos; Mamassis, Nikos; Photis, Yorgos N.
2016-04-01
We investigate various uses of electricity demand in Greece (agricultural, commercial, domestic, and industrial use, as well as use by public and municipal authorities and for street lighting) and examine their relation with variables such as population, total area, population density, and the Gross Domestic Product. The analysis is performed on data which span from 2008 to 2012 and have annual temporal resolution and spatial resolution down to the level of prefecture. We visualize the results of the analysis, perform cluster and outlier analysis using the Anselin local Moran's I statistic, and perform hot spot analysis using the Getis-Ord Gi* statistic. The definition of the spatial patterns and relationships of the aforementioned variables in a GIS environment provides meaningful insight and a better understanding of the regional development model in Greece, and justifies the basis for an energy demand forecasting methodology. Acknowledgement: This research has been partly financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARISTEIA II: Reinforcement of the interdisciplinary and/or inter-institutional research and innovation (CRESSENDO project; grant number 5145).
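The Anselin local Moran's I computation used for cluster and outlier analysis can be sketched on a toy lattice. A real analysis would use prefecture polygons and a GIS-derived weights matrix; the grid values and rook-contiguity weights here are illustrative only.

```python
import numpy as np

def local_morans_i(values, W):
    """Anselin local Moran's I for a 1-D array of areal values and a
    binary spatial weights matrix W (with zero diagonal)."""
    z = values - values.mean()
    m2 = (z ** 2).sum() / len(z)
    return (z / m2) * (W @ z)

# Toy 4x4 lattice: a high-demand cluster in the top-left corner
grid = np.array([[9, 9, 1, 1],
                 [9, 9, 1, 1],
                 [1, 1, 1, 1],
                 [1, 1, 1, 1]], dtype=float)
n = grid.size
vals = grid.ravel()

# Rook-contiguity weights on the lattice
W = np.zeros((n, n))
for i in range(4):
    for j in range(4):
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < 4 and 0 <= nj < 4:
                W[i * 4 + j, ni * 4 + nj] = 1.0

I = local_morans_i(vals, W)
print("local I at cluster core (0,0):", I[0])
```

Positive local I flags a cell that resembles its neighbors (high-high or low-low clusters); negative local I flags spatial outliers at cluster boundaries, which is the classification the paper maps.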
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 2 2011-10-01 2011-10-01 false What statistical and narrative reporting... (IV-D) PROGRAM Statistical and Narrative Reporting Requirements § 309.170 What statistical and... organizations must submit the following information and statistics for Tribal IV-D program activity and caseload...
Code of Federal Regulations, 2010 CFR
2010-10-01
... 45 Public Welfare 2 2010-10-01 2010-10-01 false What statistical and narrative reporting... (IV-D) PROGRAM Statistical and Narrative Reporting Requirements § 309.170 What statistical and... organizations must submit the following information and statistics for Tribal IV-D program activity and caseload...
Ames Research Center SR&T program and earth observations
NASA Technical Reports Server (NTRS)
Poppoff, I. G.
1972-01-01
An overview is presented of the research activities in earth observations at Ames Research Center. Most of the tasks involve the use of research aircraft platforms. The program is also directed toward the use of the Illiac 4 computer for statistical analysis. Most tasks are weighted toward Pacific coast and Pacific basin problems with emphasis on water applications, air applications, animal migration studies, and geophysics.
W. Keith Moser; Renate Bush; John D. Shaw; Mark H. Hansen; Mark D. Nelson
2010-01-01
A major challenge for today's resource managers is the linking of stand- and landscape-scale dynamics. The U.S. Forest Service has made major investments in programs at both the stand- (national forest project) and landscape/regional (Forest Inventory and Analysis [FIA] program) levels. FIA produces the only comprehensive and consistent statistical information on the...
An Analysis of Competencies for Managing Science and Technology Programs
2008-03-19
competency modeling through a two-year task force commissioned by the Society for Industrial and Organizational Psychology (Shippmann and others, 2000:704)...positions—specifically within Research and Development (R&D) programs. If so, the final investigative question tests whether those differences are...statistics are used to analyze the comparisons through hypothesis testing and t-tests relevant to the research investigative questions. These
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plemons, R.E.; Hopwood, W.H. Jr.; Hamilton, J.H.
For a number of years the Oak Ridge Y-12 Plant Laboratory has been analyzing coal, predominantly for the utilities department of the Y-12 Plant. All laboratory procedures, except a Leco sulfur method which used the Leco Instruction Manual as a reference, were written based on the ASTM coal analyses. Sulfur is analyzed at the present time by two methods, gravimetric and Leco. The laboratory has two major endeavors for monitoring the quality of its coal analyses. (1) A control program by the Plant Statistical Quality Control Department: Quality Control submits one sample for every nine samples submitted by the utilities departments, and the laboratory analyzes a control sample along with the utilities samples. (2) An exchange program with the DOE Coal Analysis Laboratory in Bruceton, Pennsylvania: the Y-12 Laboratory submits to the DOE Coal Laboratory, on even-numbered months, a sample that Y-12 has analyzed. The DOE Coal Laboratory submits, on odd-numbered months, one of their analyzed samples to the Y-12 Plant Laboratory to be analyzed. The results of these control and exchange programs are monitored not only by laboratory personnel, but also by Statistical Quality Control personnel who provide statistical evaluations. After analysis and reporting of results, all utilities samples are retained by the laboratory until the coal contracts have been settled. The utilities departments have responsibility for the initiation and preparation of the coal samples. The samples normally received by the laboratory have been ground to 4-mesh, reduced to 0.5-gallon quantities, and sealed in air-tight containers. Sample identification numbers and a Request for Analysis are generated by the utilities departments.
The Application of a Statistical Analysis Software Package to Explosive Testing
1993-12-01
deviation not corrected for test interval; M refers to equation 2, s to equation 3, G to section 2.1. APPENDIX I: Program Structured Diagrams; APPENDIX II: Bruceton Reference Graphs; APPENDIX III: Input and Output Data File Format; APPENDIX IV: ... M is read directly from Graph II, which has been digitised and incorporated into the program. If M falls below 0.3, the curve that is closest to diff(eq. 3a) is
NASA Technical Reports Server (NTRS)
Labudde, R. A.
1972-01-01
An attempt has been made to keep the programs as subroutine oriented as possible. Usually only the main programs are directly concerned with the problem of total cross sections. In particular the subroutines POLFIT, BILINR, GASS59/MAXLIK, SYMOR, MATIN, STUDNT, DNTERP, DIFTAB, FORDIF, EPSALG, REGFAL and ADSIMP are completely general, and are concerned only with the problems of numerical analysis and statistics. Each subroutine is independently documented.
Segal, Leonie; Sara Opie, Rachelle; Dalziel, Kim
2012-01-01
Context Home-visiting programs have been offered for more than sixty years to at-risk families of newborns and infants. But despite decades of experience with program delivery, more than sixty published controlled trials, and more than thirty published literature reviews, there is still uncertainty surrounding the performance of these programs. Our particular interest was the performance of home visiting in reducing child maltreatment. Methods We developed a program logic framework to assist in understanding the neonate/infant home-visiting literature, identified through a systematic literature review. We tested whether success could be explained by the logic model using descriptive synthesis and statistical analysis. Findings Having a stated objective of reducing child maltreatment—a theory or mechanism of change underpinning the home-visiting program consistent with the target population and their needs and program components that can deliver against the nominated theory of change—considerably increased the chance of success. We found that only seven of fifty-three programs demonstrated such consistency, all of which had a statistically significant positive outcome, whereas of the fifteen that had no match, none was successful. Programs with a partial match had an intermediate success rate. The relationship between program success and full, partial or no match was statistically significant. Conclusions Employing a theory-driven approach provides a new way of understanding the disparate performance of neonate/infant home-visiting programs. Employing a similar theory-driven approach could also prove useful in the review of other programs that embody a diverse set of characteristics and may apply to diverse populations and settings. A program logic framework provides a rigorous approach to deriving policy-relevant meaning from effectiveness evidence of complex programs. 
For neonate/infant home-visiting programs, it means that in developing these programs, attention to consistency of objectives, theory of change, target population, and program components is critical. PMID:22428693
Segal, Leonie; Sara Opie, Rachelle; Dalziel, Kim
2012-03-01
Home-visiting programs have been offered for more than sixty years to at-risk families of newborns and infants. But despite decades of experience with program delivery, more than sixty published controlled trials, and more than thirty published literature reviews, there is still uncertainty surrounding the performance of these programs. Our particular interest was the performance of home visiting in reducing child maltreatment. We developed a program logic framework to assist in understanding the neonate/infant home-visiting literature, identified through a systematic literature review. We tested whether success could be explained by the logic model using descriptive synthesis and statistical analysis. Having a stated objective of reducing child maltreatment-a theory or mechanism of change underpinning the home-visiting program consistent with the target population and their needs and program components that can deliver against the nominated theory of change-considerably increased the chance of success. We found that only seven of fifty-three programs demonstrated such consistency, all of which had a statistically significant positive outcome, whereas of the fifteen that had no match, none was successful. Programs with a partial match had an intermediate success rate. The relationship between program success and full, partial or no match was statistically significant. Employing a theory-driven approach provides a new way of understanding the disparate performance of neonate/infant home-visiting programs. Employing a similar theory-driven approach could also prove useful in the review of other programs that embody a diverse set of characteristics and may apply to diverse populations and settings. A program logic framework provides a rigorous approach to deriving policy-relevant meaning from effectiveness evidence of complex programs. 
For neonate/infant home-visiting programs, it means that in developing these programs, attention to consistency of objectives, theory of change, target population, and program components is critical. © 2012 Milbank Memorial Fund.
7 CFR 2.17 - Under Secretary for Rural Development.
Code of Federal Regulations, 2012 CFR
2012-01-01
... economic, social, and environmental research and analysis, statistical programs, and associated service...; rural population and manpower; local government finance; income development strategies; housing; social... activities. (12) Assist other Federal agencies in formulating manpower development and training policies. (13...
7 CFR 2.17 - Under Secretary for Rural Development.
Code of Federal Regulations, 2011 CFR
2011-01-01
... economic, social, and environmental research and analysis, statistical programs, and associated service...; rural population and manpower; local government finance; income development strategies; housing; social... activities. (12) Assist other Federal agencies in formulating manpower development and training policies. (13...
The need for conducting forensic analysis of decommissioned bridges.
DOT National Transportation Integrated Search
2014-01-01
A limiting factor in current bridge management programs is a lack of detailed knowledge of bridge deterioration : mechanisms and processes. The current state of the art is to predict future condition using statistical forecasting : models based upon ...
SIMREL: Software for Coefficient Alpha and Its Confidence Intervals with Monte Carlo Studies
ERIC Educational Resources Information Center
Yurdugul, Halil
2009-01-01
This article describes SIMREL, a software program designed for the simulation of alpha coefficients and the estimation of its confidence intervals. SIMREL runs on two alternatives. In the first one, if SIMREL is run for a single data file, it performs descriptive statistics, principal components analysis, and variance analysis of the item scores…
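The quantity SIMREL simulates, coefficient alpha with a confidence interval, can be computed directly from an item-score matrix. A percentile bootstrap stands in here for SIMREL's Monte Carlo machinery, and the latent-factor data generation is an assumption of this sketch.

```python
import numpy as np

def cronbach_alpha(items):
    """Coefficient alpha for an (n_respondents x k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(7)
n, k = 200, 5
trait = rng.normal(size=(n, 1))                 # common latent factor
items = trait + 0.8 * rng.normal(size=(n, k))   # k noisy indicators of the trait

alpha = cronbach_alpha(items)

# Percentile bootstrap confidence interval for alpha (resample respondents)
boot = [cronbach_alpha(items[rng.integers(0, n, n)]) for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"alpha = {alpha:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```

Running such a simulation over varying n, k, and factor loadings is essentially the Monte Carlo study design the article describes.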
Implementation of Head Start Planned Variation: 1970-1971. Part II.
ERIC Educational Resources Information Center
Lukas, Carol Van Deusen; Wohlleb, Cynthia
This volume of appendices is Part II of a study of program implementation in 12 models of Head Start Planned Variation. It presents details of the data analysis, copies of data collection instruments, and additional analyses and statistics. The appendices are: (A) Analysis of Variance Designs, (B) Copies of Instruments, (C) Additional Analyses,…
Sole: Online Analysis of Southern FIA Data
Michael P. Spinney; Paul C. Van Deusen; Francis A. Roesch
2006-01-01
The Southern On Line Estimator (SOLE) is a flexible modular software program for analyzing U.S. Department of Agriculture Forest Service Forest Inventory and Analysis data. SOLE produces statistical tables, figures, maps, and portable document format reports based on user selected area and variables. SOLE's Java-based graphical user interface is easy to use, and its R-...
Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images
NASA Technical Reports Server (NTRS)
Fischer, Bernd
2004-01-01
Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. The AutoBayes schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach by symbolic computation to derive closed-form solutions whenever possible.
This is a major advantage over other statistical data analysis systems, which use numerical approximations even in cases where closed-form solutions exist. AutoBayes is implemented in Prolog and comprises approximately 75,000 lines of code. In this paper, we take one typical scientific data analysis problem, analyzing planetary nebulae images taken by the Hubble Space Telescope, and show how AutoBayes can be used to automate the implementation of the necessary analysis programs. We initially follow the analysis described by Knuth and Hajian [KH02] and use AutoBayes to derive code for the published models. We show the details of the code derivation process, including the symbolic computations and automatic integration of library procedures, and compare the results of the automatically generated and manually implemented code. We then go beyond the original analysis and use AutoBayes to derive code for a simple image segmentation procedure based on a mixture model which can be used to automate a manual preprocessing step. Finally, we combine the original approach with the simple segmentation, which yields a more detailed analysis. This also demonstrates that AutoBayes makes it easy to combine different aspects of data analysis.
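One of the schemas mentioned above, EM for a mixture model, can be written out by hand for the two-component 1-D Gaussian case to show the kind of algorithm AutoBayes instantiates. The data and the initialization strategy below are illustrative choices of this sketch, not AutoBayes' synthesized code.

```python
import numpy as np

def em_gaussian_mixture(x, iters=50):
    """EM for a two-component 1-D Gaussian mixture (the kind of algorithm
    AutoBayes instantiates from its schema library)."""
    mu = np.percentile(x, [25, 75]).astype(float)   # crude initialization
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted updates of the mixture parameters
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma

rng = np.random.default_rng(3)
# Synthetic "pixel intensities": dark background plus a brighter object component
x = np.concatenate([rng.normal(10, 2, 700), rng.normal(30, 4, 300)])
pi, mu, sigma = em_gaussian_mixture(x)
print("weights:", pi, "means:", np.sort(mu))
```

Thresholding each pixel by its larger responsibility then yields the simple segmentation preprocessing step described in the text.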
Boudreault, David J; Li, Chin-Shang; Wong, Michael S
2016-01-01
To evaluate the effect of web-based education on (1) patient satisfaction, (2) consultation times, and (3) conversion to surgery. A retrospective review of 767 new patient consultations seen by 4 university-based plastic surgeons was conducted between May 2012 and August 2013 to determine the effect a web-based education program has on patient satisfaction and consultation time. A standard 5-point Likert scale survey completed at the end of the consultation was used to assess satisfaction with their experience. Consult times were obtained from the electronic medical record. All analyses were done with Statistical Analysis Software version 9.2 (SAS Inc., Cary, NC). A P value less than 0.05 was considered statistically significant. Those who viewed the program before their consultation were more satisfied with their consultation compared to those who did not (satisfaction scores, mean ± SD: 1.13 ± 0.44 vs 1.36 ± 0.74; P = 0.02) and more likely to rate their experience as excellent (92% vs 75%; P = 0.02). Contrary to the claims of Emmi Solutions, patients who viewed the educational program before consultation trended toward longer visits compared to those who did not (mean time ± SD: 54 ± 26 vs 50 ± 35 minutes; P = 0.10). More patients who completed the program went on to undergo a procedure (44% vs 37%; P = 0.16), but this difference was not statistically significant. Viewing web-based educational programs significantly improved plastic surgery patients' satisfaction with their consultation, but patients who viewed the program also trended toward longer consultation times. Although there was an increase in converting to surgical procedures, this did not reach statistical significance.
Soils element activities for the period October 1973--September 1974
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fowler, E.B.; Essington, E.H.; White, M.G.
Soils Element activities were conducted on behalf of the U.S. Atomic Energy Commission's Nevada Applied Ecology Group (NAEG) program to provide source term information for the other program elements and maintain continuous cognizance of program requirements for sampling, sample preparation, and analysis. Activities included presentation of papers; participation in workshops; analysis of soil, vegetation, and animal tissue samples for ²³⁸Pu, ²³⁹⁻²⁴⁰Pu, ²⁴¹Am, ¹³⁷Cs, ⁶⁰Co, and gamma scan for routine and laboratory quality control purposes; preparation and analysis of animal tissue samples for NAEG laboratory certification; studies on a number of analytical, sample preparation, and sample collection procedures; and contributions to the evaluation of procedures for calculation of specialized counting statistics. (auth)
Competent statistical programmer: Need of business process outsourcing industry
Khan, Imran
2014-01-01
Over the last two decades, Business Process Outsourcing (BPO) has evolved into a mature practice. India is seen as a preferred destination for pharmaceutical outsourcing on a cost arbitrage. Within biometrics outsourcing, statistical programming and analysis require very niche skills for service delivery. The demand and supply ratios are imbalanced due to a high churn rate and a short supply of competent programmers. The industry is moving from task delivery to ownership and accountability. The paradigm shift from outsourcing to consulting is triggering the need for competent statistical programmers. Programmers should be trained in technical, analytical, problem-solving, decision-making, and soft skills, as customer expectations are changing from task delivery to accountability for the project. This paper highlights the common issues the SAS programming service industry is facing and the skills programmers need to develop to cope with these changes. PMID:24987578
NASA Astrophysics Data System (ADS)
Takabe, Satoshi; Hukushima, Koji
2016-05-01
Typical behavior of the linear programming (LP) problem is studied as a relaxation of the minimum vertex cover (min-VC), a type of integer programming (IP) problem. A lattice-gas model on the Erdös-Rényi random graphs of α -uniform hyperedges is proposed to express both the LP and IP problems of the min-VC in the common statistical mechanical model with a one-parameter family. Statistical mechanical analyses reveal for α =2 that the LP optimal solution is typically equal to that given by the IP below the critical average degree c =e in the thermodynamic limit. The critical threshold for good accuracy of the relaxation extends the mathematical result c =1 and coincides with the replica symmetry-breaking threshold of the IP. The LP relaxation for the minimum hitting sets with α ≥3 , minimum vertex covers on α -uniform random graphs, is also studied. Analytic and numerical results strongly suggest that the LP relaxation fails to estimate optimal values above the critical average degree c =e /(α -1 ) where the replica symmetry is broken.
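The integrality gap the authors analyze can be seen concretely on a toy instance. The sketch below is not the paper's statistical-mechanics machinery: it simply brute-forces both problems on a triangle graph, using the known half-integrality of the vertex-cover LP (an optimal LP solution can be assumed to take values in {0, 1/2, 1}):

```python
from itertools import product

def min_vertex_cover(edges, n, values):
    # Brute-force search over assignments x in values^n that satisfy
    # x_u + x_v >= 1 for every edge; return the minimum of sum(x).
    best = float("inf")
    for x in product(values, repeat=n):
        if all(x[u] + x[v] >= 1 for u, v in edges):
            best = min(best, sum(x))
    return best

# Triangle graph: the IP optimum is 2, while the LP optimum is 1.5
# (all x_v = 1/2), illustrating a strict integrality gap.
triangle = [(0, 1), (1, 2), (0, 2)]
ip = min_vertex_cover(triangle, 3, (0, 1))
lp = min_vertex_cover(triangle, 3, (0, 0.5, 1))
print(ip, lp)
```

On sparse random graphs below the critical average degree, the paper's result says this gap typically vanishes; the triangle shows the simplest case where it does not.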
Regression modeling of ground-water flow
Cooley, R.L.; Naff, R.L.
1985-01-01
Nonlinear multiple regression methods are developed to model and analyze groundwater flow systems. Complete descriptions of regression methodology as applied to groundwater flow models allow scientists and engineers engaged in flow modeling to apply the methods to a wide range of problems. The text proceeds from an introduction to the general topic of groundwater flow modeling, to a review of the basic statistics needed to apply regression techniques properly, and then to the main topic: exposition and use of linear and nonlinear regression to model groundwater flow. Statistical procedures are given to analyze and use the regression models. Exercises with answers are included to give the student practice in nearly all of the methods presented for modeling and statistical analysis. Three computer programs implement the more complex methods: a general two-dimensional, steady-state regression model for flow in an anisotropic, heterogeneous porous medium; a program to calculate a measure of model nonlinearity with respect to the regression parameters; and a program to analyze model errors in computed dependent variables such as hydraulic head. (USGS)
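The nonlinear-regression core of such a model can be illustrated with a generic Gauss-Newton iteration. This is a minimal sketch on a made-up exponential model and synthetic noise-free data, not the USGS programs themselves:

```python
import numpy as np

def gauss_newton(x, y, a, b, iters=50):
    # Fit y = a * exp(b * x) by Gauss-Newton: linearize the model
    # around the current parameters and solve for the least-squares step.
    for _ in range(iters):
        r = y - a * np.exp(b * x)                      # residuals
        J = np.column_stack([np.exp(b * x),            # d(model)/da
                             a * x * np.exp(b * x)])   # d(model)/db
        da, db = np.linalg.lstsq(J, r, rcond=None)[0]
        a, b = a + da, b + db
    return a, b

x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y = 2.0 * np.exp(0.5 * x)              # noise-free synthetic data
a, b = gauss_newton(x, y, a=1.5, b=0.4)
print(a, b)  # should recover a ~ 2, b ~ 0.5
```

A real flow-model regression replaces the closed-form exponential with a numerical groundwater-flow solver, but the iteration structure is the same.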
DOE Office of Scientific and Technical Information (OSTI.GOV)
Belzer, D.; Mosey, G.; Plympton, P.
2007-07-01
Home Performance with ENERGY STAR (HPwES) is a jointly managed program of the U.S. Department of Energy (DOE) and the U.S. Environmental Protection Agency (EPA). This program focuses on improving energy efficiency in existing homes via a whole-house approach to assessing and improving a home's energy performance, and helping to protect the environment. As one of HPwES's local sponsors, Austin Energy's HPwES program offers a complete home energy analysis and a list of recommendations for efficiency improvements, along with cost estimates. To determine the benefits of this program, the National Renewable Energy Laboratory (NREL) collaborated with the Pacific Northwest National Laboratory (PNNL) to conduct a statistical analysis using energy consumption data of HPwES homes provided by Austin Energy. This report provides preliminary estimates of average savings per home from the HPwES Loan Program for the period 1998 through 2006. The results from this preliminary analysis suggest that the HPwES program sponsored by Austin Energy had a very significant impact on reducing average cooling electricity for participating households. Overall, average savings were in the range of 25%-35%, and appear to be robust under various criteria for the number of households included in the analysis.
Dental hygiene students' perceptions of distance learning: do they change over time?
Sledge, Rhonda; Vuk, Jasna; Long, Susan
2014-02-01
The University of Arkansas for Medical Sciences dental hygiene program established a distant site where the didactic curriculum was broadcast via interactive video from the main campus to the distant site, supplemented with on-line learning via Blackboard. This study compared the perceptions of students towards distance learning as they progressed through the 21 month curriculum. Specifically, the study sought to answer the following questions: Is there a difference in the initial perceptions of students on the main campus and at the distant site toward distance learning? Do students' perceptions change over time with exposure to synchronous distance learning over the course of the curriculum? All 39 subjects were women between the ages of 20 and 35 years. Of the 39 subjects, 37 were Caucasian and 2 were African-American. A 15-question Likert scale survey was administered at 4 different periods during the 21 month program to compare changes in perceptions toward distance learning as students progressed through the program. An independent sample t-test and ANOVA were utilized for statistical analysis. At the beginning of the program, independent samples t-test revealed that students at the main campus (n=34) perceived statistically significantly higher effectiveness of distance learning than students at the distant site (n=5). Repeated measures of ANOVA revealed that perceptions of students at the main campus on effectiveness and advantages of distance learning statistically significantly decreased whereas perceptions of students at distant site statistically significantly increased over time. Distance learning in the dental hygiene program was discussed, and replication of the study with larger samples of students was recommended.
Cutting efficiency of Reciproc and WaveOne reciprocating instruments.
Plotino, Gianluca; Giansiracusa Rubini, Alessio; Grande, Nicola M; Testarelli, Luca; Gambarini, Gianluca
2014-08-01
The aim of the present study was to evaluate the cutting efficiency of 2 new reciprocating instruments, Reciproc and WaveOne. Twenty-four new Reciproc R25 and 24 new WaveOne Primary files were activated by using a torque-controlled motor (Silver Reciproc) and divided into 4 groups (n = 12): group 1, Reciproc activated by Reciproc ALL program; group 2, Reciproc activated by WaveOne ALL program; group 3, WaveOne activated by Reciproc ALL program; and group 4, WaveOne activated by WaveOne ALL program. The device used for the cutting test consisted of a main frame to which a mobile plastic support for the handpiece is connected and a stainless steel block containing a Plexiglas block (inPlexiglass, Rome, Italy) against which the cutting efficiency of the instruments was tested. The length of the block cut in 1 minute was measured in a computerized program with a precision of 0.1 mm. Means and standard deviations of each group were calculated, and data were statistically analyzed with 1-way analysis of variance and Bonferroni test (P < .05). Reciproc R25 displayed greater cutting efficiency than WaveOne Primary for both the movements used (P < .05); in particular, Reciproc instruments used with their proper reciprocating motion presented a statistically significant higher cutting efficiency than WaveOne instruments used with their proper reciprocating motion (P < .05). There was no statistically significant difference between the 2 movements for both instruments (P > .05). Reciproc instruments demonstrated statistically higher cutting efficiency than WaveOne instruments. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
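The one-way analysis of variance used in the study is standard; a minimal sketch of the F statistic follows, with invented cut-length values rather than the study's data (the Bonferroni post hoc step is not shown):

```python
def one_way_anova_f(groups):
    # F = (between-group mean square) / (within-group mean square)
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((v - m) ** 2 for g, m in zip(groups, means) for v in g)
    df1 = len(groups) - 1                 # between-groups df
    df2 = len(all_vals) - len(groups)     # within-groups df
    return (ss_between / df1) / (ss_within / df2)

# Two hypothetical groups of cut lengths (mm):
f = one_way_anova_f([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0]])
print(f)  # 1.5
```

The F value would then be compared against the F distribution with (df1, df2) degrees of freedom at the study's P < .05 threshold.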
Mapping Quantitative Traits in Unselected Families: Algorithms and Examples
Dupuis, Josée; Shi, Jianxin; Manning, Alisa K.; Benjamin, Emelia J.; Meigs, James B.; Cupples, L. Adrienne; Siegmund, David
2009-01-01
Linkage analysis has been widely used to identify from family data genetic variants influencing quantitative traits. Common approaches have both strengths and limitations. Likelihood ratio tests typically computed in variance component analysis can accommodate large families but are highly sensitive to departure from normality assumptions. Regression-based approaches are more robust but their use has primarily been restricted to nuclear families. In this paper, we develop methods for mapping quantitative traits in moderately large pedigrees. Our methods are based on the score statistic which in contrast to the likelihood ratio statistic, can use nonparametric estimators of variability to achieve robustness of the false positive rate against departures from the hypothesized phenotypic model. Because the score statistic is easier to calculate than the likelihood ratio statistic, our basic mapping methods utilize relatively simple computer code that performs statistical analysis on output from any program that computes estimates of identity-by-descent. This simplicity also permits development and evaluation of methods to deal with multivariate and ordinal phenotypes, and with gene-gene and gene-environment interaction. We demonstrate our methods on simulated data and on fasting insulin, a quantitative trait measured in the Framingham Heart Study. PMID:19278016
Testing an automated method to estimate ground-water recharge from streamflow records
Rutledge, A.T.; Daniel, C.C.
1994-01-01
The computer program, RORA, allows automated analysis of streamflow hydrographs to estimate ground-water recharge. Output from the program, which is based on the recession-curve-displacement method (often referred to as the Rorabaugh method, for whom the program is named), was compared to estimates of recharge obtained from a manual analysis of 156 years of streamflow record from 15 streamflow-gaging stations in the eastern United States. Statistical tests showed that there was no significant difference between paired estimates of annual recharge by the two methods. Tests of results produced by the four workers who performed the manual method showed that results can differ significantly between workers. Twenty-two percent of the variation between manual and automated estimates could be attributed to having different workers perform the manual method. The program RORA will produce estimates of recharge equivalent to estimates produced manually, greatly increase the speed of analysis, and reduce the subjectivity inherent in manual analysis.
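A glimpse of the recession-curve arithmetic underlying the method: the recession index K is the time required for streamflow to decline one log cycle on the master recession curve. This is only one ingredient of the full Rorabaugh displacement computation, shown here with hypothetical discharges:

```python
import math

def recession_index(q1, q2, dt_days):
    # K = time per log cycle of streamflow decline between two
    # points (q1, q2) on the recession curve separated by dt_days.
    return dt_days / math.log10(q1 / q2)

# Hypothetical recession: flow falls from 100 to 10 cfs in 30 days,
# i.e. exactly one log cycle, so K = 30 days per log cycle.
k = recession_index(100.0, 10.0, 30.0)
print(k)  # 30.0
```

RORA uses this index, together with the displacement of the recession curve after each recharge event, to convert hydrograph geometry into recharge volumes.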
DOE Office of Scientific and Technical Information (OSTI.GOV)
Due to the increase in the use of Coordinate Measuring Machines (CMMs) to measure fine details and complex geometries in manufacturing, many programs have been made to compile and analyze the data. These programs typically require extensive setup to determine the expected results in order to not only track the pass/fail of a dimension, but also to use statistical process control (SPC). These extra steps and setup times have been addressed through the CMM Data Analysis Tool, which only requires the output of the CMM to provide both pass/fail analysis on all parts run to the same inspection program as well as graphs which help visualize where the part measures within the allowed tolerances. This provides feedback not only to the customer for approval of a part during development, but also to machining process engineers to identify when any dimension is drifting towards an out of tolerance condition during production. This program can handle hundreds of parts with complex dimensions and will provide an analysis within minutes.
A meta-analysis of experimental studies of diversion programs for juvenile offenders.
Schwalbe, Craig S; Gearing, Robin E; MacKenzie, Michael J; Brewer, Kathryne B; Ibrahim, Rawan
2012-02-01
Research to establish an evidence-base for the treatment of conduct problems and delinquency in adolescence is well established; however, an evidence-base for interventions with offenders who are diverted from the juvenile justice system has yet to be synthesized. The purpose of this study was to conduct a meta-analysis of experimental studies testing juvenile diversion programs and to examine the moderating effect of program type and implementation quality. A literature search using PsycINFO, Web of Science, and the National Criminal Justice Reference Service data-bases and research institute websites yielded 28 eligible studies involving 57 experimental comparisons and 19,301 youths. Recidivism was the most common outcome reported across all studies. Overall, the effect of diversion programs on recidivism was non-significant (k=45, OR=0.83, 95%CI=0.43-1.58). Of the five program types identified, including case management (k=18, OR=0.78), individual treatment (k=11, OR=0.83), family treatment (k=4, OR=0.57), youth court (k=6, OR=0.93), and restorative justice (k=6, OR=0.87), only family treatment led to a statistically significant reduction in recidivism. Restorative justice studies that were implemented with active involvement of researchers led to statistically significant reductions in recidivism (k=3, OR=0.69). Other outcomes, including frequency of offending, truancy, and psycho-social problems were reported infrequently and were not subjected to meta-analysis. High levels of heterogeneity characterize diversion research. Results of this study recommend against implementation of programs limited to case management and highlight the promise of family interventions and restorative justice. Copyright © 2011 Elsevier Ltd. All rights reserved.
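The pooled odds ratios reported above can be reproduced in spirit with standard inverse-variance pooling of log odds ratios. The sketch below uses a fixed-effect model and invented 2x2 tables; given the high heterogeneity the authors report, their actual analysis likely used a random-effects model, which adds a between-study variance term not shown here:

```python
import math

def pooled_odds_ratio(tables):
    # Fixed-effect inverse-variance pooling of log odds ratios.
    # Each table is (a, b, c, d): events/non-events in the two arms.
    num = den = 0.0
    for a, b, c, d in tables:
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d   # Woolf variance estimate
        w = 1.0 / var
        num += w * log_or
        den += w
    return math.exp(num / den)

# Two identical hypothetical studies pool to the single-study OR:
tables = [(10, 20, 15, 15), (10, 20, 15, 15)]
print(pooled_odds_ratio(tables))  # ~0.5, below 1 favors the intervention
```

An OR confidence interval that includes 1 (as in the overall k=45 result above) corresponds to a non-significant pooled effect.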
Smeltzer, Suzanne C; Sharts-Hopko, Nancy C; Cantrell, Mary Ann; Heverly, Mary Ann; Wise, Nancy; Jenkinson, Amanda
Support for research strongly predicts doctoral program faculty members' research productivity. Although academic administrators affect such support, their views of faculty members' use of support are unknown. We examined academic administrators' perceptions of institutional support and their perceptions of the effects of teaching doctoral students on faculty members' scholarship productivity and work-life balance. An online survey was completed by a random sample of 180 deans/directors of schools of nursing and doctoral program directors. Data were analyzed with descriptive statistics, chi-square analysis, and analysis of variance. Deans and doctoral program directors viewed the level of productivity of program faculty as high to moderately high and unchanged since faculty started teaching doctoral students. Deans perceived better administrative research supports, productivity, and work-life balance of doctoral program faculty than did program directors. Findings indicate the need for greater administrative support for scholarship and mentoring given the changes in the composition of doctoral program faculty. Copyright © 2017 Elsevier Inc. All rights reserved.
Automating approximate Bayesian computation by local linear regression.
Thornton, Kevin R
2009-07-07
In several biological contexts, parameter inference often relies on computationally intensive techniques. "Approximate Bayesian Computation", or ABC, methods based on summary statistics have become increasingly popular. A particular flavor of ABC, based on using a linear regression to approximate the posterior distribution of the parameters conditional on the summary statistics, is computationally appealing, yet no standalone tool exists to automate the procedure. Here, I describe a program to implement the method. The software package ABCreg implements the local linear-regression approach to ABC. The advantages are: 1. The code is standalone and fully documented. 2. The program will automatically process multiple data sets, and create unique output files for each (which may be processed immediately in R), facilitating the testing of inference procedures on simulated data, or the analysis of multiple data sets. 3. The program implements two different transformation methods for the regression step. 4. Analysis options are controlled on the command line by the user, and the program is designed to output warnings for cases where the regression fails. 5. The program does not depend on any particular simulation machinery (coalescent, forward-time, etc.), and therefore is a general tool for processing the results from any simulation. 6. The code is open-source and modular. Examples of applying the software to empirical data from Drosophila melanogaster, and testing the procedure on simulated data, are shown. In practice, ABCreg simplifies implementing ABC based on local-linear regression.
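The local linear-regression adjustment at the heart of this approach can be sketched in a few lines. This toy Gaussian-mean example illustrates the general technique, not ABCreg's code; the model, settings, and sample sizes are all invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: infer the mean mu of a Normal(mu, 1) from the sample mean.
mu_true, n_obs = 1.0, 20
s_obs = rng.normal(mu_true, 1.0, n_obs).mean()   # "observed" summary

# 1. Simulate from the prior, recording (parameter, summary) pairs.
prior = rng.uniform(-5.0, 5.0, 2000)
summaries = np.array([rng.normal(mu, 1.0, n_obs).mean() for mu in prior])

# 2. Accept the simulations whose summaries are closest to s_obs.
keep = np.argsort(np.abs(summaries - s_obs))[:200]
theta, s = prior[keep], summaries[keep]

# 3. Regression adjustment: fit theta ~ s on the accepted draws, then
#    project each accepted parameter to the observed summary value.
slope, intercept = np.polyfit(s, theta, 1)
theta_adj = theta - slope * (s - s_obs)

print(theta_adj.mean())  # posterior mean estimate, near mu_true
```

The adjusted draws `theta_adj` approximate the posterior at `s_obs`; ABCreg adds transformation options and handles multiple data sets, as listed above.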
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-05
... methodological explanations may be addressed to Marie L. Lihn or Peter B. Kahn, Economic and Market Analysis... not statistically significant. Rent reasonableness studies are not subject to the same constraints on...
NASA Technical Reports Server (NTRS)
Rana, D. S.
1980-01-01
The data reduction capabilities of the current data reduction programs were assessed and a search for a more comprehensive system with higher data analytic capabilities was made. Results of the investigation are presented.
Boosting Stochastic Problem Solvers Through Online Self-Analysis of Performance
2003-07-21
Boosting Stochastic Problem Solvers Through Online Self-Analysis of Performance. Vincent A. Cicirello. CMU-RI-TR-03-27. Submitted in partial fulfillment... ...lead to the development of a search control framework, called QD-BEACON, that uses online-generated statistical models of search performance to...
Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis.
Bonham-Carter, Oliver; Steele, Joe; Bastola, Dhundy
2014-11-01
Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require an analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools, which stem from alignment-free methods based on statistical analysis from word frequencies. We provide several clear examples to demonstrate applications and the interpretations over several different areas of alignment-free analysis such as base-base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide detailed discussion and an example of analysis by Lempel-Ziv techniques from data compression. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
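As a concrete instance of the word-frequency approach, the basic D2 statistic is simply the inner product of two sequences' k-mer count vectors (a minimal sketch; the normalized variants discussed in the literature, such as D2* and D2S, standardize these counts before taking the product):

```python
from collections import Counter

def kmer_counts(seq, k):
    # Count every overlapping word (k-mer) in the sequence.
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2(seq_a, seq_b, k):
    # D2 = sum over shared k-mers w of count_a(w) * count_b(w),
    # i.e. the inner product of the two k-mer count vectors.
    ca, cb = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
    return sum(ca[w] * cb[w] for w in ca.keys() & cb.keys())

print(d2("ATAT", "ATAT", 2))   # 5: AT occurs twice, TA once -> 4 + 1
print(d2("ATAT", "GGGG", 2))   # 0: no shared 2-mers
```

Because only word counts are compared, no alignment is computed, which is what makes such statistics cheap on large sequence collections.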
Narayanan, Roshni; Nugent, Rebecca; Nugent, Kenneth
2015-10-01
Accreditation Council for Graduate Medical Education guidelines require internal medicine residents to develop skills in the interpretation of medical literature and to understand the principles of research. A necessary component is the ability to understand the statistical methods used and their results, material that is not an in-depth focus of most medical school curricula and residency programs. Given the breadth and depth of the current medical literature and an increasing emphasis on complex, sophisticated statistical analyses, the statistical foundation and education necessary for residents are uncertain. We reviewed the statistical methods and terms used in 49 articles discussed at the journal club in the Department of Internal Medicine residency program at Texas Tech University between January 1, 2013 and June 30, 2013. We collected information on the study type and on the statistical methods used for summarizing and comparing samples, determining the relations between independent variables and dependent variables, and estimating models. We then identified the typical statistics education level at which each term or method is learned. A total of 14 articles came from the Journal of the American Medical Association Internal Medicine, 11 from the New England Journal of Medicine, 6 from the Annals of Internal Medicine, 5 from the Journal of the American Medical Association, and 13 from other journals. Twenty reported randomized controlled trials. Summary statistics included mean values (39 articles), category counts (38), and medians (28). Group comparisons were based on t tests (14 articles), χ2 tests (21), and nonparametric ranking tests (10). The relations between dependent and independent variables were analyzed with simple regression (6 articles), multivariate regression (11), and logistic regression (8). Nine studies reported odds ratios with 95% confidence intervals, and seven analyzed test performance using sensitivity and specificity calculations. 
These papers used 128 statistical terms and context-defined concepts, including some from data analysis (56), epidemiology-biostatistics (31), modeling (24), data collection (12), and meta-analysis (5). Ten different software programs were used in these articles. Based on usual undergraduate and graduate statistics curricula, 64.3% of the concepts and methods used in these papers required at least a master's degree-level statistics education. The interpretation of the current medical literature can require an extensive background in statistical methods at an education level exceeding the material and resources provided to most medical students and residents. Given the complexity and time pressure of medical education, these deficiencies will be hard to correct, but this project can serve as a basis for developing a curriculum in study design and statistical methods needed by physicians-in-training.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-30
... for OMB Review; Comment Request; Mass Layoff Statistics Program ACTION: Notice. SUMMARY: The... request (ICR) titled, ``Mass Layoff Statistics Program,'' to the Office of Management and Budget (OMB) for... Statistics (BLS). Title of Collection: Mass Layoff Statistics Program. OMB Control Number: 1220-0090...
CAPSAS: Computer Assisted Program for the Selection of Appropriate Statistics.
ERIC Educational Resources Information Center
Shermis, Mark D.; Albert, Susan L.
A computer-assisted program has been developed for the selection of statistics or statistical techniques by both students and researchers. Based on Andrews, Klem, Davidson, O'Malley and Rodgers "A Guide for Selecting Statistical Techniques for Analyzing Social Science Data," this FORTRAN-compiled interactive computer program was…
Visual Data Analysis for Satellites
NASA Technical Reports Server (NTRS)
Lau, Yee; Bhate, Sachin; Fitzpatrick, Patrick
2008-01-01
The Visual Data Analysis Package is a collection of programs and scripts that facilitate visual analysis of data available from NASA and NOAA satellites, as well as dropsonde, buoy, and conventional in-situ observations. The package features utilities for data extraction, data quality control, statistical analysis, and data visualization. The Hierarchical Data Format (HDF) satellite data extraction routines from NASA's Jet Propulsion Laboratory were customized for specific spatial coverage and file input/output. Statistical analysis includes the calculation of the relative error, the absolute error, and the root mean square error. Other capabilities include curve fitting through the data points to fill in missing data points between satellite passes or where clouds obscure satellite data. For data visualization, the software provides customizable Generic Mapping Tool (GMT) scripts to generate difference maps, scatter plots, line plots, vector plots, histograms, timeseries, and color fill images.
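The three error statistics named above have conventional definitions, sketched below. The exact conventions the package uses (for instance, how relative error is normalized) are not spelled out in the summary, so these formulas are assumptions:

```python
import math

def absolute_error(obs, ref):
    # Mean absolute difference between paired values.
    return sum(abs(o - r) for o, r in zip(obs, ref)) / len(obs)

def relative_error(obs, ref):
    # Mean absolute difference normalized by the reference value.
    return sum(abs(o - r) / abs(r) for o, r in zip(obs, ref)) / len(obs)

def rmse(obs, ref):
    # Root mean square error.
    return math.sqrt(sum((o - r) ** 2 for o, r in zip(obs, ref)) / len(obs))

# Hypothetical satellite retrievals vs. buoy reference values:
obs, ref = [1.0, 2.0], [1.0, 4.0]
print(absolute_error(obs, ref))  # 1.0
print(rmse(obs, ref))            # sqrt(2) ~ 1.414
```

Such statistics are typically computed after the collocation and quality-control steps the package performs on the satellite and in-situ data.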
1993-12-01
graduate education required for Ocean Facilities Program (OFP) officers in the Civil Engineer Corps (CEC) of the United States Navy. For the purpose... determined by distributing questionnaires to all officers in the OFP. Statistical analyses of numerical data and judgmental analysis of professional... Appendix B: Ocean Facility Program Officer Graduate Education Questionnaire; Appendix C: Summary of Questionnaire Responses
Jack Lewis; Jim Baldwin
1997-01-01
The State of California has embarked upon a Long-Term Monitoring Program whose primary goal is to assess the effectiveness of the Forest Practice Rules and Review Process in protecting the beneficial uses of waters from the impacts of timber operations on private timberlands. The Board of Forestry's Monitoring Study Group concluded that hillslope monitoring should...
Reducing child mortality in Nigeria: a case study of immunization and systemic factors.
Nwogu, Rufus; Ngowu, Rufus; Larson, James S; Kim, Min Su
2008-07-01
The purpose of the study is to assess the outcome of the Expanded Program on Immunization (EPI) in Nigeria, as well as to examine systemic factors influencing its high under-five mortality rate (UFMR). The principal objective of the EPI program when it was implemented in 1978 was to reduce the mortality, morbidity, and disability associated with six vaccine-preventable diseases, namely tuberculosis, tetanus, diphtheria, measles, pertussis, and poliomyelitis. The methodological approach of this study is quantitative, using secondary time series data from 1970 to 2003. The study tested three hypotheses using time series multiple regression analysis with autocorrelation adjustment as the statistical model. The results showed that the EPI program had little effect on the UFMR in Nigeria. Only the literacy rate and domestic spending on healthcare had statistically significant effects on the UFMR. The military government was not a significant factor in reducing or increasing the UFMR. It appears that Nigeria needs a unified approach to healthcare delivery, rather than fragmented programs, to overcome cultural and political divisions in society.
Physics First: Impact on SAT Math Scores
NASA Astrophysics Data System (ADS)
Bouma, Craig E.
Improving science, technology, engineering, and mathematics (STEM) education has become a national priority and the call to modernize secondary science has been heard. A Physics First (PF) program with the curriculum sequence of physics, chemistry, and biology (PCB) driven by inquiry- and project-based learning offers a viable alternative to the traditional curricular sequence (BCP) and methods of teaching, but requires more empirical evidence. This study determined the impact of a PF program (PF-PCB) on math achievement (SAT math scores) after the first two cohorts of students completed the PF-PCB program at Matteo Ricci High School (MRHS), and provided more quantitative data to inform the PF debate and advance secondary science education. Statistical analysis (ANCOVA) determined the influence of covariates and revealed that the PF-PCB program had a significant (p < .05) impact on SAT math scores in the second cohort at MRHS. Statistically adjusted, the SAT math means for PF students were 21.4 points higher than those of their non-PF counterparts when controlling for prior math achievement (HSTP math), socioeconomic status (SES), and ethnicity/race.
Identification of curriculum content for a renewable energy graduate degree program
NASA Astrophysics Data System (ADS)
Haughery, John R.
There currently exists a disconnect between renewable energy industry workforce needs and academic program proficiencies. This is evidenced by an absence of clear curriculum content on renewable energy graduate program websites. The purpose of this study was to identify a set of curriculum content for graduate degrees in renewable energy. At the conclusion, a clear list of 42 content items was identified and statistically ranked. The content items identified were based on a review of literature from government initiatives, professional society's body of knowledge, and related research studies. Leaders and experts in the field of renewable energy and sustainability were surveyed, using a five-point Likert-Scale model. This allowed each item's importance level to be analyzed and prioritized based on non-parametric statistical analysis methods. The study found seven competency items to be very important, 30 to be important, and five to be somewhat important. The results were also appropriate for use as a framework in developing or improving renewable energy graduate programs.
Hagen, Brad; Awosoga, Oluwagbohunmi A; Kellett, Peter; Damgaard, Marie
2013-04-23
This article describes the results of a qualitative research study evaluating nursing students' experiences of a mandatory course in applied statistics, and the perceived effectiveness of teaching methods implemented during the course. Fifteen nursing students in the third year of a four-year baccalaureate program in nursing participated in focus groups before and after taking the mandatory course in statistics. The interviews were transcribed and analyzed using content analysis to reveal four major themes: (i) "one of those courses you throw out?," (ii) "numbers and terrifying equations," (iii) "first aid for statistics casualties," and (iv) "re-thinking curriculum." Overall, the data revealed that although nursing students initially enter statistics courses with considerable skepticism, fear, and anxiety, there are a number of concrete actions statistics instructors can take to reduce student fear and increase the perceived relevance of courses in statistics.
Terrain-analysis procedures for modeling radar backscatter
Schaber, Gerald G.; Pike, Richard J.; Berlin, Graydon Lennis
1978-01-01
The collection and analysis of detailed information on the surface of natural terrain are important aspects of radar-backscatter modeling. Radar is especially sensitive to surface-relief changes at the millimeter-to-decimeter scale for conventional K-band (~1-cm wavelength) to L-band (~25-cm wavelength) radar systems. Surface roughness statistics that characterize these changes in detail have been generated from sets of field measurements by a comprehensive set of seven programmed calculations for radar-backscatter modeling. The seven programs are: 1) formatting of data into readable form for the subsequent topographic analysis programs; 2) relief analysis; 3) power spectral analysis; 4) power spectrum plots; 5) slope angle between slope reversals; 6) slope angle against slope interval plots; and 7) base-length slope angle and curvature. This complete Fortran IV software package, 'Terrain Analysis', is here presented for the first time. It was originally developed a decade ago for investigations of lunar morphology and surface trafficability for the Apollo Lunar Roving Vehicle.
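Two of the seven calculations (relief analysis and slope angles) can be illustrated in miniature. This Python sketch computes point-to-point slope angles and RMS relief from a hypothetical elevation profile; it is a stand-in for the idea, not the package's Fortran IV implementation:

```python
import math

def slope_angles(elev, spacing):
    """Slope angle (degrees) between successive profile points."""
    return [math.degrees(math.atan((b - a) / spacing))
            for a, b in zip(elev, elev[1:])]

def rms_roughness(elev):
    """RMS deviation of elevation about the profile mean."""
    mean = sum(elev) / len(elev)
    return math.sqrt(sum((z - mean) ** 2 for z in elev) / len(elev))

profile = [0.0, 0.02, 0.05, 0.03, 0.06, 0.04]   # elevations in metres (toy data)
angles = slope_angles(profile, spacing=0.1)      # assumed 10 cm sample interval
roughness = rms_roughness(profile)
```

Roughness at the millimeter-to-decimeter scale relevant to K- and L-band backscatter would be derived from field profiles sampled at correspondingly short intervals.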
Noise characteristics of the Skylab S-193 altimeter altitude measurements
NASA Technical Reports Server (NTRS)
Hatch, W. E.
1975-01-01
The statistical characteristics of the Skylab S-193 altimeter altitude noise are considered. These results are reported in a concise format for use and analysis by the scientific community. In most instances the results have been grouped according to satellite pointing so that the effects of pointing on the statistical characteristics can be readily seen. The altimeter measurements and the processing techniques are described. Mathematical descriptions of the computer programs used to obtain these results are included.
NASA Astrophysics Data System (ADS)
Svirina, Anna; Shindor, Olga; Tatmyshevsky, Konstantin
2014-12-01
The paper deals with the main problems of Russian energy system development, which demonstrate the need for educational programs in the field of renewable and alternative energy. The paper defines the process of curriculum development and of selecting teaching techniques on the basis of expert opinion evaluation, and suggests a competence model for master's students in renewable and alternative energy processing. Data for statistical analysis were obtained from a distributed questionnaire and in-depth interviews. On the basis of these data, the curriculum structure was optimized, and three models for optimizing the structure of teaching techniques were developed. The suggested educational program structure, which was endorsed by employers, is presented in the paper. The findings include a quantitative estimate of the importance of systemic thinking and of professional skills and knowledge as core competences of a master's program graduate; a statistically estimated necessity for a practice-based learning approach; and optimization models for structuring curricula in renewable and alternative energy processing. These findings establish a platform for the development of educational programs.
The effect of the Family Case Management Program on 1996 birth outcomes in Illinois.
Keeton, Kristie; Saunders, Stephen E; Koltun, David
2004-03-01
The purpose of this study was to determine whether birth outcomes for Medicaid recipients were improved by participation in the Illinois Family Case Management Program. Health program data files were linked with the 1996 Illinois Vital Records linked birth-death certificate file. Logistic regression was used to characterize the variation in birth outcomes as a function of Family Case Management participation while statistically controlling for measurable factors found to be confounders. Results of the logistic regression analysis show that women who participated in the Family Case Management Program were significantly less likely to give birth to very low birth weight infants (odds ratio [OR] = 0.86, 95% confidence interval [CI] = 0.75, 0.99) and low birth weight infants (OR = 0.83, CI = 0.79, 0.89). For infant mortality, however, the adjusted OR (OR = 0.98, CI = 0.82, 1.17), although under 1, was not statistically significant. These results suggest that the Family Case Management Program may be effective in reducing very low birth weight and low birth weight rates among infants born to low-income women.
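The odds-ratio arithmetic behind results like these can be shown for the unadjusted (crude) case; the study itself used logistic regression to adjust for confounders, and the 2x2 counts below are invented purely for illustration:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only -- not the study's data.
or_, lo, hi = odds_ratio_ci(120, 4880, 150, 4850)
significant = not (lo <= 1.0 <= hi)
```

As in the infant-mortality result above, an OR below 1 is not statistically significant when its confidence interval spans 1.0.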
History and Development of the Schmidt-Hunter Meta-Analysis Methods
ERIC Educational Resources Information Center
Schmidt, Frank L.
2015-01-01
In this article, I provide answers to the questions posed by Will Shadish about the history and development of the Schmidt-Hunter methods of meta-analysis. In the 1970s, I headed a research program on personnel selection at the US Office of Personnel Management (OPM). After our research showed that validity studies have low statistical power, OPM…
KaDonna C. Randolph
2006-01-01
The U.S. Department of Agriculture Forest Service, Forest Inventory and Analysis Program (FIA) utilizes visual assessments of tree crown condition to monitor changes and trends in forest health. This report describes and discusses distributions of three FIA crown condition indicators (crown density, crown dieback, and foliage transparency) for trees in the Southern...
ERIC Educational Resources Information Center
Leow, Christine; Wen, Xiaoli; Korfmacher, Jon
2015-01-01
This article compares regression modeling and propensity score analysis as different types of statistical techniques used in addressing selection bias when estimating the impact of two-year versus one-year Head Start on children's school readiness. The analyses were based on the national Head Start secondary dataset. After controlling for…
NASA Astrophysics Data System (ADS)
Aoyama, Hideaki; Fujiwara, Yoshi; Ikeda, Yuichi; Iyetomi, Hiroshi; Souma, Wataru; Yoshikawa, Hiroshi
2017-07-01
Preface; Foreword; Acknowledgements; List of tables; List of figures; Prologue; 1. Introduction: reconstructing macroeconomics; 2. Basic concepts in statistical physics and stochastic models; 3. Income and firm-size distributions; 4. Productivity distribution and related topics; 5. Multivariate time-series analysis; 6. Business cycles; 7. Price dynamics and inflation/deflation; 8. Complex network, community analysis, visualization; 9. Systemic risks; Appendix A: computer program for beginners; Epilogue; Bibliography; Index.
On Designing Multicore-Aware Simulators for Systems Biology Endowed with OnLine Statistics
Calcagno, Cristina; Coppo, Mario
2014-01-01
The paper discusses enabling methodologies for the design of a fully parallel, online, interactive tool to support bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are demonstrated on a simulation tool for the modeling, tuning, and sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, which turn into big data that should be analysed with statistical and data-mining tools. In the considered approach the two stages are pipelined so that the simulation stage streams the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and for effectiveness of the online analysis in capturing the behavior of biological systems, on a multicore platform and representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide the software designers with key features such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems, exhibiting multistable and oscillatory behavior, are used as a testbed. PMID:25050327
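The pipelined simulation-to-analysis pattern can be sketched in a few lines. This Python stand-in replaces FastFlow with a plain generator feeding Welford's one-pass online statistics, so the simulator and all numbers are purely illustrative:

```python
import random

class OnlineStats:
    """Welford's online mean/variance: one pass, O(1) memory per statistic."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def push(self, x):
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)

    @property
    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

def simulate(n_traj, seed=42):
    """Stand-in stochastic simulator: streams one sample per trajectory."""
    rng = random.Random(seed)
    for _ in range(n_traj):
        yield rng.gauss(10.0, 2.0)   # e.g. a species count at a fixed time point

stats = OnlineStats()
for sample in simulate(10_000):      # analysis consumes the stream as it is produced
    stats.push(sample)
```

Because the analysis stage never stores the full trajectory set, memory stays constant no matter how many trajectories stream through, which is the property the paper's in-memory big-data pipeline exploits.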
On designing multicore-aware simulators for systems biology endowed with OnLine statistics.
Aldinucci, Marco; Calcagno, Cristina; Coppo, Mario; Damiani, Ferruccio; Drocco, Maurizio; Sciacca, Eva; Spinella, Salvatore; Torquati, Massimo; Troina, Angelo
2014-01-01
The paper discusses enabling methodologies for the design of a fully parallel, online, interactive tool to support bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are demonstrated on a simulation tool for the modeling, tuning, and sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, which turn into big data that should be analysed with statistical and data-mining tools. In the considered approach the two stages are pipelined so that the simulation stage streams the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and for effectiveness of the online analysis in capturing the behavior of biological systems, on a multicore platform and representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide the software designers with key features such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems, exhibiting multistable and oscillatory behavior, are used as a testbed.
Student perception of initial transition into a nursing program: A mixed methods research study.
McDonald, Meghan; Brown, Janine; Knihnitski, Crystal
2018-05-01
Transition into undergraduate education programs is stressful and impacts students' well-being and academic achievement. Previous research indicates nursing students experience stress, depression, anxiety, and poor lifestyle habits which interfere with learning. However, nursing students' experience of transition into nursing programs has not been well studied. Incongruence exists between this lack of research and the desire to foster student success. This study analyzed students' experiences of initial transition into a nursing program. An embedded mixed methods design was used at a single site: a direct-entry, four-year baccalaureate Canadian nursing program. Participants were all first year nursing students enrolled in the fall term of 2016. The study combined the Student Adaptation to College Questionnaire (SACQ) with qualitative focus groups conducted with a subset of participants. Quantitative data were analyzed using descriptive statistics to identify statistically significant differences in full-scale and subscale scores. Qualitative data were analyzed using thematic analysis. Significant differences were seen between those who moved to attend university and those who did not, with those who moved scoring lower on the Academic Adjustment subscale. Focus group thematic analysis highlighted how students experienced initial transition into a baccalaureate nursing program. Identified themes included reframing supports, splitting focus/finding focus, negotiating own expectations, negotiating others' expectations, and forming identity. These findings form the Undergraduate Nursing Initial Transition (UNIT) Framework. The significance of this research includes applications in faculty development and program supports to increase student success in the first year of nursing and to provide a foundation for ongoing nursing practice. Copyright © 2018 Elsevier Ltd. All rights reserved.
7 CFR 2.17 - Under Secretary for Rural Development.
Code of Federal Regulations, 2014 CFR
2014-01-01
... research and analysis, statistical programs, and associated service work related to rural people and the communities in which they live including rural industrialization; rural population and manpower; local... formulating manpower development and training policies. (13) Related to committee management. Establish and...
7 CFR 2.17 - Under Secretary for Rural Development.
Code of Federal Regulations, 2013 CFR
2013-01-01
... research and analysis, statistical programs, and associated service work related to rural people and the communities in which they live including rural industrialization; rural population and manpower; local... formulating manpower development and training policies. (13) Related to committee management. Establish and...
7 CFR 2.17 - Under Secretary for Rural Development.
Code of Federal Regulations, 2010 CFR
2010-01-01
... research and analysis, statistical programs, and associated service work related to rural people and the communities in which they live including rural industrialization; rural population and manpower; local... formulating manpower development and training policies. (13) Related to committee management. Establish and...
Hamar, Brent; Bradley, Chastity; Gandy, William M.; Harrison, Patricia L.; Sidney, James A.; Coberley, Carter R.; Rula, Elizabeth Y.; Pope, James E.
2013-01-01
Evaluation of chronic care management (CCM) programs is necessary to determine the behavioral, clinical, and financial value of the programs. Financial outcomes of members who are exposed to interventions (treatment group) typically are compared to those not exposed (comparison group) in a quasi-experimental study design. However, because member assignment is not randomized, outcomes reported from these designs may be biased or inefficient if study groups are not comparable or balanced prior to analysis. Two matching techniques used to achieve balanced groups are Propensity Score Matching (PSM) and Coarsened Exact Matching (CEM). Unlike PSM, CEM has been shown to yield estimates of causal (program) effects that are lowest in variance and bias for any given sample size. The objective of this case study was to provide a comprehensive comparison of these 2 matching methods within an evaluation of a CCM program administered to a large health plan during a 2-year time period. Descriptive and statistical methods were used to assess the level of balance between comparison and treatment members pre matching. Compared with PSM, CEM retained more members, achieved better balance between matched members, and resulted in a statistically insignificant Wald test statistic for group aggregation. In terms of program performance, the results showed an overall higher medical cost savings among treatment members matched using CEM compared with those matched using PSM (-$25.57 versus -$19.78, respectively). Collectively, the results suggest CEM is a viable alternative, if not the most appropriate matching method, to apply when evaluating CCM program performance. (Population Health Management 2013;16:35–45) PMID:22788834
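The core idea of Coarsened Exact Matching (coarsen covariates into strata, then keep only strata populated in both groups) can be sketched as follows; the covariates, bin edges, and member records are invented for illustration:

```python
from collections import defaultdict

def coarsen(age, cost):
    """Coarsen covariates into coarse strata (bin edges are illustrative)."""
    return (age // 10, 0 if cost < 1000 else 1 if cost < 5000 else 2)

def cem(treated, comparison):
    """Keep only members whose coarsened stratum appears in both groups."""
    strata = defaultdict(lambda: ([], []))
    for m in treated:
        strata[coarsen(m["age"], m["cost"])][0].append(m)
    for m in comparison:
        strata[coarsen(m["age"], m["cost"])][1].append(m)
    kept_t, kept_c = [], []
    for t_group, c_group in strata.values():
        if t_group and c_group:          # stratum must be populated on both sides
            kept_t += t_group
            kept_c += c_group
    return kept_t, kept_c

treated = [{"age": 64, "cost": 800}, {"age": 47, "cost": 6200}]
comparison = [{"age": 61, "cost": 950}, {"age": 33, "cost": 200}]
kept_t, kept_c = cem(treated, comparison)
```

Members in unmatched strata are pruned, which is why CEM trades sample size for balance; the study's finding that CEM retained more members than PSM is an empirical result for its particular data, not a general guarantee.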
Wells, Aaron R; Hamar, Brent; Bradley, Chastity; Gandy, William M; Harrison, Patricia L; Sidney, James A; Coberley, Carter R; Rula, Elizabeth Y; Pope, James E
2013-02-01
Evaluation of chronic care management (CCM) programs is necessary to determine the behavioral, clinical, and financial value of the programs. Financial outcomes of members who are exposed to interventions (treatment group) typically are compared to those not exposed (comparison group) in a quasi-experimental study design. However, because member assignment is not randomized, outcomes reported from these designs may be biased or inefficient if study groups are not comparable or balanced prior to analysis. Two matching techniques used to achieve balanced groups are Propensity Score Matching (PSM) and Coarsened Exact Matching (CEM). Unlike PSM, CEM has been shown to yield estimates of causal (program) effects that are lowest in variance and bias for any given sample size. The objective of this case study was to provide a comprehensive comparison of these 2 matching methods within an evaluation of a CCM program administered to a large health plan during a 2-year time period. Descriptive and statistical methods were used to assess the level of balance between comparison and treatment members pre matching. Compared with PSM, CEM retained more members, achieved better balance between matched members, and resulted in a statistically insignificant Wald test statistic for group aggregation. In terms of program performance, the results showed an overall higher medical cost savings among treatment members matched using CEM compared with those matched using PSM (-$25.57 versus -$19.78, respectively). Collectively, the results suggest CEM is a viable alternative, if not the most appropriate matching method, to apply when evaluating CCM program performance.
Natural environment application for NASP-X-30 design and mission planning
NASA Technical Reports Server (NTRS)
Johnson, D. L.; Hill, C. K.; Brown, S. C.; Batts, G. W.
1993-01-01
The NASA/MSFC Mission Analysis Program has recently been utilized in various National Aero-Space Plane (NASP) mission and operational planning scenarios. This paper focuses on presenting various atmospheric constraint statistics, based on assumed NASP mission phases, using established natural environment design, parametric, and threshold values. Probabilities of no-go are calculated using atmospheric parameters such as temperature, humidity, density altitude, peak/steady-state winds, cloud cover/ceiling, thunderstorms, and precipitation. The program, although developed to evaluate test or operational missions after flight constraints have been established, can provide valuable information in the design phase of the NASP X-30 program. With the design values input as flight constraints, the Mission Analysis Program returns the probability of no-go, or launch delay, by hour and by month. This output tells the X-30 program manager whether the design values are stringent enough to meet the required test flight schedules.
A simple program to measure and analyse tree rings using Excel, R and SigmaScan
Hietz, Peter
2011-01-01
I present new software that links a program for image analysis (SigmaScan), a spreadsheet program (Excel) and a statistical analysis program (R) for applications in tree-ring analysis. The first macro measures ring widths marked by the user on scanned images, stores raw and detrended data in Excel and calculates the distance to the pith and inter-series correlations. A second macro measures darkness along a defined path to identify the latewood–earlywood transition in conifers, and a third shows the potential for automatic detection of ring boundaries. Written in Visual Basic for Applications, the code takes advantage of existing programs and is consequently very economical and relatively simple to adjust to the requirements of specific projects or to expand using already available code. PMID:26109835
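The detrending and inter-series correlation steps can be sketched in Python rather than VBA. The running-mean growth curve used for detrending here, and the ring-width series, are illustrative assumptions rather than the macro's actual method:

```python
def moving_average(series, window=5):
    """Centered running mean, truncated at the series ends."""
    half = window // 2
    return [sum(series[max(0, i - half): i + half + 1]) /
            len(series[max(0, i - half): i + half + 1])
            for i in range(len(series))]

def detrend(widths):
    """Ring-width index: raw width divided by a running-mean growth curve."""
    trend = moving_average(widths)
    return [w / t for w, t in zip(widths, trend)]

def pearson(x, y):
    """Pearson correlation between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

core_a = [2.1, 1.8, 2.5, 1.2, 1.9, 2.2, 1.4, 1.7]   # ring widths in mm (toy data)
core_b = [2.0, 1.7, 2.4, 1.3, 2.0, 2.1, 1.5, 1.6]
r = pearson(detrend(core_a), detrend(core_b))
```

Detrending removes the age-related growth trend so that the inter-series correlation reflects shared year-to-year (e.g. climatic) signal rather than shared long-term growth curves.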
A simple program to measure and analyse tree rings using Excel, R and SigmaScan.
Hietz, Peter
I present new software that links a program for image analysis (SigmaScan), a spreadsheet program (Excel) and a statistical analysis program (R) for applications in tree-ring analysis. The first macro measures ring widths marked by the user on scanned images, stores raw and detrended data in Excel and calculates the distance to the pith and inter-series correlations. A second macro measures darkness along a defined path to identify the latewood-earlywood transition in conifers, and a third shows the potential for automatic detection of ring boundaries. Written in Visual Basic for Applications, the code takes advantage of existing programs and is consequently very economical and relatively simple to adjust to the requirements of specific projects or to expand using already available code.
Assessment of NDE reliability data
NASA Technical Reports Server (NTRS)
Yee, B. G. W.; Couchman, J. C.; Chang, F. H.; Packman, D. F.
1975-01-01
Twenty sets of relevant nondestructive test (NDT) reliability data were identified, collected, compiled, and categorized. A criterion for the selection of data for statistical analysis considerations was formulated, and a model to grade the quality and validity of the data sets was developed. Data input formats, which record the pertinent parameters of the defect/specimen and inspection procedures, were formulated for each NDE method. A comprehensive computer program was written and debugged to calculate the probability of flaw detection at several confidence limits by the binomial distribution. This program also selects the desired data sets for pooling and tests the statistical pooling criteria before calculating the composite detection reliability. An example of the calculated reliability of crack detection in bolt holes by an automatic eddy current method is presented.
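The binomial probability-of-detection calculation lends itself to a small sketch. This Python version computes a one-sided Clopper-Pearson lower bound by bisection; it follows the report's binomial approach in spirit but is not the original program:

```python
from math import comb

def binom_tail(n, s, p):
    """P(X >= s) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(s, n + 1))

def pod_lower_bound(n, s, confidence=0.95, tol=1e-6):
    """One-sided Clopper-Pearson lower bound on probability of detection:
    the p at which observing >= s detections in n trials has probability
    1 - confidence; the true POD exceeds it with the stated confidence."""
    alpha = 1 - confidence
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if binom_tail(n, s, mid) > alpha:
            hi = mid          # tail too likely: the bound lies below mid
        else:
            lo = mid
    return lo

# Classic rule of thumb: 29 hits in 29 trials -> roughly 90% POD at 95% confidence.
pod90 = pod_lower_bound(29, 29)
```

Pooling data sets, as the report's program does, raises n and therefore tightens the lower bound at the same confidence level.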
Mathematics and Statistics Research Department progress report, period ending June 30, 1982
DOE Office of Scientific and Technical Information (OSTI.GOV)
Denson, M.V.; Funderlic, R.E.; Gosslee, D.G.
1982-08-01
This report is the twenty-fifth in the series of progress reports of the Mathematics and Statistics Research Department of the Computer Sciences Division, Union Carbide Corporation Nuclear Division (UCC-ND). Part A records research progress in analysis of large data sets, biometrics research, computational statistics, materials science applications, moving boundary problems, numerical linear algebra, and risk analysis. Collaboration and consulting with others throughout the UCC-ND complex are recorded in Part B. Included are sections on biology, chemistry, energy, engineering, environmental sciences, health and safety, materials science, safeguards, surveys, and the waste storage program. Part C summarizes the various educational activities in which the staff was engaged. Part D lists the presentations of research results, and Part E records the staff's other professional activities during the report period.
Mediation analysis in nursing research: a methodological review.
Liu, Jianghong; Ulrich, Connie
2016-12-01
Mediation statistical models help clarify the relationship between independent predictor variables and dependent outcomes of interest by assessing the impact of third variables. This type of statistical analysis is applicable for many clinical nursing research questions, yet its use within nursing remains low. Indeed, mediational analyses may help nurse researchers develop more effective and accurate prevention and treatment programs as well as help bridge the gap between scientific knowledge and clinical practice. In addition, this statistical approach allows nurse researchers to ask - and answer - more meaningful and nuanced questions that extend beyond merely determining whether an outcome occurs. Therefore, the goal of this paper is to provide a brief tutorial on the use of mediational analyses in clinical nursing research by briefly introducing the technique and, through selected empirical examples from the nursing literature, demonstrating its applicability in advancing nursing science.
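A minimal product-of-coefficients mediation sketch, assuming ordinary least squares and a single mediator (the tutorial above covers more robust approaches); the toy data are constructed so the effect of X on Y runs entirely through M:

```python
def cov(u, v):
    """Population covariance of two equal-length lists."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / n

def mediation(x, m, y):
    """Product-of-coefficients mediation: a = effect of X on M,
    b = effect of M on Y adjusting for X, indirect effect = a * b."""
    a = cov(x, m) / cov(x, x)
    # Two-predictor OLS for Y ~ X + M, solved with Cramer's rule.
    sxx, smm, sxm = cov(x, x), cov(m, m), cov(x, m)
    sxy, smy = cov(x, y), cov(m, y)
    det = sxx * smm - sxm ** 2
    b = (sxx * smy - sxm * sxy) / det
    c_prime = (smm * sxy - sxm * smy) / det    # direct effect of X on Y
    return a, b, a * b, c_prime

# Toy data: Y depends on X only through the mediator M (slight noise in M
# keeps X and M from being perfectly collinear).
x = [0, 1, 2, 3, 4, 5, 6, 7]
noise = [0.1, -0.1, 0.2, -0.2, 0.1, -0.1, 0.2, -0.2]
m = [2 * xi + e for xi, e in zip(x, noise)]
y = [3 * mi for mi in m]
a, b, indirect, direct = mediation(x, m, y)
```

Here the direct effect comes out near zero and the indirect effect near a*b, the "full mediation" pattern the paper uses to motivate asking how, not just whether, an outcome occurs.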
NASA Astrophysics Data System (ADS)
Kwon, O.; Kim, W.; Kim, J.
2017-12-01
Recently, construction of subsea tunnels has increased globally. For the safe construction of a subsea tunnel, identifying geological structures, including faults, at the design and construction stages is critically important. Unlike tunnels on land, it is very difficult to obtain data on geological structure because of the limits of geological survey at sea. This study addresses such difficulties by developing a technology to identify the geological structure of the seabed automatically using echo sounding data. When investigating a potential site for a deep subsea tunnel, boreholes and geophysical investigation face technical and economic limits. By contrast, echo sounding data are easily obtainable, and their reliability is higher compared with the above approaches. This study aims to develop an algorithm that identifies large-scale geological structures of the seabed using a geostatistical approach. It is based on the structural-geology principle that topographic features indicate geological structure. The basic concept of the algorithm is as follows: (1) convert the seabed topography to grid data using echo sounding data; (2) apply a moving window of optimal size to the grid data; (3) estimate the spatial statistics of the grid data in the window area; (4) set a percentile standard for the spatial statistics; (5) display the values satisfying the standard on the map; (6) visualize the geological structure on the map. The important elements of this study include the optimal size of the moving window, the choice of spatial statistics, and the determination of the optimal percentile standard. To determine these optimal elements, numerous simulations were run. Finally, a user program based on R was developed using the optimal analysis algorithm. The user program was designed to identify the variations of various spatial statistics.
It enables easy analysis of geological structure as a function of the chosen spatial statistic, by letting the user easily designate the type of spatial statistic and the percentile standard. This research was supported by the Korea Agency for Infrastructure Technology Advancement under the Ministry of Land, Infrastructure and Transport of the Korean government (Project Number: 13 Construction Research T01).
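The six algorithm steps can be condensed into a toy Python version (the actual program is written in R); the grid values, window size, choice of statistic, and percentile below are illustrative assumptions:

```python
from statistics import pstdev

def window_stat(grid, i, j, half=1):
    """Standard deviation of depths inside a (2*half+1)^2 moving window."""
    vals = [grid[r][c]
            for r in range(max(0, i - half), min(len(grid), i + half + 1))
            for c in range(max(0, j - half), min(len(grid[0]), j + half + 1))]
    return pstdev(vals)

def flag_structures(grid, percentile=0.8):
    """Flag cells whose local roughness falls in the top tail of the
    statistic's distribution, a simple stand-in for the percentile standard."""
    stats = [[window_stat(grid, i, j) for j in range(len(grid[0]))]
             for i in range(len(grid))]
    flat = sorted(v for row in stats for v in row)
    cut = flat[int(percentile * (len(flat) - 1))]
    return [(i, j) for i in range(len(grid)) for j in range(len(grid[0]))
            if stats[i][j] >= cut]

# Toy 4x4 bathymetry grid (metres); the sharp step in the last column mimics
# the relief contrast a fault scarp would leave in echo-sounding data.
depth = [[50, 50, 50, 80],
         [50, 50, 50, 80],
         [50, 50, 50, 80],
         [50, 50, 50, 80]]
flagged = flag_structures(depth)
```

Cells flagged by the percentile cut trace the simulated scarp, which is the map-visualization step of the algorithm in miniature.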
Yiu, Rex; Fung, Vicky; Szeto, Karen; Hung, Veronica; Siu, Ricky; Lam, Johnny; Lai, Daniel; Maw, Christina; Cheung, Adah; Shea, Raman; Choy, Anna
2013-01-01
In Hong Kong, elderly patients discharged from hospital are at high risk of unplanned readmission. The Integrated Care Model (ICM) program is introduced to provide continuous and coordinated care for high risk elders from hospital to community to prevent unplanned readmission. A multidisciplinary working group was set up to address the requirements on developing the electronic forms for ICM program. Six (6) forms were developed. These forms can support ICM service delivery for the high risk elders, clinical documentation, statistical analysis and information sharing.
1985-06-01
ADDRESS 10. PROGRAM ELEMENT, PROJECT, TASK AREA & WORK UNIT NUMBERS Naval Postgraduate School Monterey, California 93943 11. CONTROLLING OFFICE NAME AND...determine the socioeconomic representativeness of the Army's enlistees in that particular year. In addition, the socioeconomic overview of Republic of...accomplished with the use of the Statistical Analysis System (SAS), an integrated computer system for data analysis. 32 TABLE 2 The States in Each District
Multisite cost analysis of a school-based voluntary alcohol and drug prevention program.
Kilmer, Beau; Burgdorf, James R; D'Amico, Elizabeth J; Miles, Jeremy; Tucker, Joan
2011-09-01
This article estimates the societal costs of Project CHOICE, a voluntary after-school alcohol and other drug prevention program for adolescents. To our knowledge, this is the first cost analysis of an after-school program specifically focused on reducing alcohol and other drug use. The article uses microcosting methods based on the societal perspective and includes a number of sensitivity analyses to assess how the results change with alternative assumptions. Cost data were obtained from surveys of participants, facilitators, and school administrators; insights from program staff members; program expenditures; school budgets; the Bureau of Labor Statistics; and the National Center for Education Statistics. From the societal perspective, the cost of implementing Project CHOICE in eight California schools ranged from $121 to $305 per participant (Mdn = $238). The major cost drivers included labor costs associated with facilitating Project CHOICE, opportunity costs of displaced class time (because of in-class promotions for Project CHOICE and consent obtainment), and other efforts to increase participation. Substituting nationally representative cost information for wages and space reduced the range to $100-$206 (Mdn = $182), which is lower than the Substance Abuse and Mental Health Services Administration's estimate of $262 per pupil for the "average effective school-based program in 2002." Denominating national Project CHOICE costs by enrolled students instead of participants generates a median per-pupil cost of $21 (range: $14-$28). Estimating the societal costs of school-based prevention programs is crucial for efficiently allocating resources to reduce alcohol and other drug use. The large variation in Project CHOICE costs across schools highlights the importance of collecting program cost information from multiple sites.
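The denominator sensitivity the authors highlight (cost per participant versus per enrolled pupil) is simple arithmetic; the figures below are invented to echo the reported medians, not taken from the study:

```python
def per_unit_costs(total_cost, participants, enrolled):
    """Divide one school's program cost by two candidate denominators."""
    return total_cost / participants, total_cost / enrolled

# Hypothetical school: $23,800 total societal cost, 100 voluntary participants,
# 1,100 enrolled students.
per_participant, per_pupil = per_unit_costs(23_800, 100, 1_100)
```

For a voluntary program, the per-participant figure reflects the cost of serving those reached, while the per-pupil figure is what a district comparing school-wide programs would see.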
Multisite Cost Analysis of a School-Based Voluntary Alcohol and Drug Prevention Program*
Kilmer, Beau; Burgdorf, James R.; D'Amico, Elizabeth J.; Miles, Jeremy; Tucker, Joan
2011-01-01
Objective: This article estimates the societal costs of Project CHOICE, a voluntary after-school alcohol and other drug prevention program for adolescents. To our knowledge, this is the first cost analysis of an after-school program specifically focused on reducing alcohol and other drug use. Method: The article uses microcosting methods based on the societal perspective and includes a number of sensitivity analyses to assess how the results change with alternative assumptions. Cost data were obtained from surveys of participants, facilitators, and school administrators; insights from program staff members; program expenditures; school budgets; the Bureau of Labor Statistics; and the National Center for Education Statistics. Results: From the societal perspective, the cost of implementing Project CHOICE in eight California schools ranged from $121 to $305 per participant (Mdn = $238). The major cost drivers included labor costs associated with facilitating Project CHOICE, opportunity costs of displaced class time (because of in-class promotions for Project CHOICE and consent obtainment), and other efforts to increase participation. Substituting nationally representative cost information for wages and space reduced the range to $100-$206 (Mdn = $182), which is lower than the Substance Abuse and Mental Health Services Administration's estimate of $262 per pupil for the "average effective school-based program in 2002." Denominating national Project CHOICE costs by enrolled students instead of participants generates a median per-pupil cost of $21 (range: $14-$28). Conclusions: Estimating the societal costs of school-based prevention programs is crucial for efficiently allocating resources to reduce alcohol and other drug use. The large variation in Project CHOICE costs across schools highlights the importance of collecting program cost information from multiple sites. PMID:21906509
NIPARS: An Analysis of Procurement Performance and Cost for Nonstandard Items
1993-09-01
Assistance Program provides essential military and economic aid through the administration of six component programs (DISAM, 1993:37). The only component...47). Materiel Quality. "The NIPARS contract requires that items are manufactured under the essential elements of MIL-I-45208" (Air Force, 1992: 12...Officer, AVSCOM, St Louis MO. Personal Interview. 15 August 1993. McClave, James T. and P. George Benson. Statistics for Business and Economics (Fifth
Ceramic component reliability with the restructured NASA/CARES computer program
NASA Technical Reports Server (NTRS)
Powers, Lynn M.; Starlinger, Alois; Gyekenyesi, John P.
1992-01-01
The Ceramics Analysis and Reliability Evaluation of Structures (CARES) integrated design program for the statistical fast-fracture reliability of monolithic ceramic components is enhanced to include the use of a neutral data base, two-dimensional modeling, and variable problem size. The data base allows for the efficient transfer of element stresses, temperatures, and volumes/areas from the finite element output to the reliability analysis program. Elements are divided to ensure a direct correspondence between the subelements and the Gaussian integration points. Two-dimensional modeling is accomplished by assessing the volume-flaw reliability with shell elements. To demonstrate the improvements in the algorithm, example problems are selected from a round-robin conducted by WELFEP (WEakest Link failure probability prediction by Finite Element Postprocessors).
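CARES's fast-fracture model rests on Weibull weakest-link statistics. A minimal sketch of the two-parameter volume-flaw form follows, with element stresses, volumes, and Weibull parameters assumed purely for illustration:

```python
import math

def weibull_failure_probability(elements, m, sigma0):
    """Weakest-link (two-parameter Weibull) volume-flaw failure probability:
    P_f = 1 - exp(-sum_i V_i * (sigma_i / sigma0)**m).
    `elements` is a list of (stress, volume) pairs standing in for the
    subelement data CARES reads from finite-element output."""
    risk = sum(v * (s / sigma0) ** m for s, v in elements)
    return 1.0 - math.exp(-risk)

# Illustrative (stress in MPa, volume in mm^3) pairs for a few subelements,
# with assumed Weibull modulus m and characteristic strength sigma0.
elements = [(300.0, 2.0), (250.0, 5.0), (100.0, 10.0)]
pf = weibull_failure_probability(elements, m=10.0, sigma0=400.0)
```

Subdividing elements at the Gaussian integration points, as the abstract describes, refines the (stress, volume) pairs entering this sum and so improves the risk-of-rupture integration.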
Probabilistic structural analysis methods for improving Space Shuttle engine reliability
NASA Technical Reports Server (NTRS)
Boyce, L.
1989-01-01
Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration) provides probabilistic structural analysis for selected SSME components. While computationally efficient, it considers both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to the variability in blade thickness, modulus of elasticity, Poisson's ratio or density. Modulus of elasticity significantly contributed to blade tip variability while Poisson's ratio did not. Thus, a rational method for choosing parameters to be modeled as random is provided.
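NESSUS itself uses fast probability integration rather than brute-force sampling, but the underlying idea of propagating random material and geometric variables into response statistics can be sketched with plain Monte Carlo on a simplified cantilever idealization; all numbers and the beam model are assumptions for illustration, not the SSME turbopump blade model:

```python
import random
import statistics

def tip_displacement(load, length, elastic_modulus, inertia):
    # Cantilever-beam tip deflection: delta = P * L^3 / (3 * E * I)
    return load * length**3 / (3.0 * elastic_modulus * inertia)

random.seed(0)
samples = []
for _ in range(10_000):
    E = random.gauss(200e9, 10e9)   # modulus of elasticity, Pa (assumed)
    t = random.gauss(0.01, 0.0005)  # blade thickness, m (assumed)
    I = 0.05 * t**3 / 12.0          # rectangular section, 0.05 m wide
    samples.append(tip_displacement(1000.0, 0.3, E, I))

mean_d = statistics.mean(samples)   # tip displacement statistics
std_d = statistics.stdev(samples)
```

Comparing runs with one input variable frozen at its mean against the full simulation shows which variable drives the output scatter, which is the rationale the abstract gives for choosing which parameters to model as random.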
Software for Analyzing Sequences of Flow-Related Images
NASA Technical Reports Server (NTRS)
Klimek, Robert; Wright, Ted
2004-01-01
Spotlight is a computer program for analysis of sequences of images generated in combustion and fluid physics experiments. Spotlight can perform analysis of a single image in an interactive mode or a sequence of images in an automated fashion. The primary type of analysis is tracking of positions of objects over sequences of frames. Features and objects that are typically tracked include flame fronts, particles, droplets, and fluid interfaces. Spotlight automates the analysis of object parameters, such as centroid position, velocity, acceleration, size, shape, intensity, and color. Images can be processed to enhance them before statistical and measurement operations are performed. An unlimited number of objects can be analyzed simultaneously. Spotlight saves results of analyses in a text file that can be exported to other programs for graphing or further analysis. Spotlight is a graphical-user-interface-based program that at present can be executed on Microsoft Windows and Linux operating systems. A version that runs on Macintosh computers is being considered.
Rivera-Rodriguez, Claudia L; Resch, Stephen; Haneuse, Sebastien
2018-01-01
In many low- and middle-income countries, the costs of delivering public health programs such as for HIV/AIDS, nutrition, and immunization are not routinely tracked. A number of recent studies have sought to estimate program costs on the basis of detailed information collected on a subsample of facilities. While unbiased estimates can be obtained via accurate measurement and appropriate analyses, they are subject to statistical uncertainty. Quantification of this uncertainty, for example, via standard errors and/or 95% confidence intervals, provides important contextual information for decision-makers and for the design of future costing studies. While other forms of uncertainty, such as that due to model misspecification, are considered and can be investigated through sensitivity analyses, statistical uncertainty is often not reported in studies estimating the total program costs. This may be due to a lack of awareness/understanding of (1) the technical details regarding uncertainty estimation and (2) the availability of software with which to calculate uncertainty for estimators resulting from complex surveys. We provide an overview of statistical uncertainty in the context of complex costing surveys, emphasizing the various potential specific sources that contribute to overall uncertainty. We describe how analysts can compute measures of uncertainty, either via appropriately derived formulae or through resampling techniques such as the bootstrap. We also provide an overview of calibration as a means of using additional auxiliary information that is readily available for the entire program, such as the total number of doses administered, to decrease uncertainty and thereby improve decision-making and the planning of future studies. A recent study of the national program for routine immunization in Honduras shows that uncertainty can be reduced by using information available prior to the study. 
This method can not only be used when estimating the total cost of delivering established health programs but also to decrease uncertainty when the interest lies in assessing the incremental effect of an intervention. Measures of statistical uncertainty associated with survey-based estimates of program costs, such as standard errors and 95% confidence intervals, provide important contextual information for health policy decision-making and key inputs for the design of future costing studies. Such measures are often not reported, possibly because of technical challenges associated with their calculation and a lack of awareness of appropriate software. Modern statistical analysis methods for survey data, such as calibration, provide a means to exploit additional information that is readily available but was not used in the design of the study to significantly improve the estimation of total cost through the reduction of statistical uncertainty.
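The resampling route to uncertainty described above can be sketched in a few lines. This is a hedged illustration with invented facility costs and survey weights, not the estimator used in the Honduras study; a full complex-survey bootstrap would also respect strata and clusters:

```python
import random
import statistics

def bootstrap_se(costs, weights, n_boot=2000, seed=42):
    """Bootstrap standard error of a weighted total-cost estimator.
    Facilities are resampled with replacement; each cost is scaled by
    its survey weight (inverse inclusion probability)."""
    rng = random.Random(seed)
    n = len(costs)
    totals = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        totals.append(sum(costs[i] * weights[i] for i in idx))
    return statistics.stdev(totals)

facility_costs = [1200.0, 950.0, 1800.0, 700.0, 1500.0]  # invented
facility_weights = [4.0] * 5  # each sampled facility represents 4
point_estimate = sum(c * w for c, w in zip(facility_costs, facility_weights))
se = bootstrap_se(facility_costs, facility_weights)
```

Reporting the point estimate together with `se` (or a 95% interval, roughly the estimate ± 1.96 × `se`) gives decision-makers the contextual information the abstract argues is usually missing.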
NASA Astrophysics Data System (ADS)
Choquet, Élodie; Pueyo, Laurent; Soummer, Rémi; Perrin, Marshall D.; Hagan, J. Brendan; Gofas-Salas, Elena; Rajan, Abhijith; Aguilar, Jonathan
2015-09-01
The ALICE program, for Archival Legacy Investigation of Circumstellar Environment, is currently conducting a virtual survey of about 400 stars by re-analyzing the HST-NICMOS coronagraphic archive with advanced post-processing techniques. We present here the strategy that we adopted to identify detections and potential candidates for follow-up observations, and we give a preliminary overview of our detections. We present a statistical analysis conducted to evaluate the confidence level of these detections and the completeness of our candidate search.
NASA Astrophysics Data System (ADS)
Koparan, Timur; Güven, Bülent
2015-07-01
The aim of this study is to determine the effect of a project-based learning approach on 8th-grade secondary-school students' statistical literacy levels for data representation. To achieve this goal, a test consisting of 12 open-ended questions was developed in accordance with the views of experts. Seventy 8th-grade secondary-school students, 35 in the experimental group and 35 in the control group, took this test twice, once before the application and once after the application. All the raw scores were converted into linear points using the Winsteps 3.72 modelling program, which performs Rasch analysis; t-tests and an ANCOVA analysis were carried out on the linear points. Based on the findings, it was concluded that the project-based learning approach increases students' level of statistical literacy for data representation. Students' levels of statistical literacy before and after the application were shown through the obtained person-item maps.
Special Report: Schizophrenia.
ERIC Educational Resources Information Center
Mosher, Loren R.; Feinsilver, David
The review and analysis of the current status of knowledge about schizophrenia and its treatment begins with a brief review of some statistics on mental health, the National Institute of Mental Health's grants program in schizophrenia, an NIMH-sponsored international conference on Schizophrenia - the Implications of Research Findings for Treatment…
Computer program documentation for the pasture/range condition assessment processor
NASA Technical Reports Server (NTRS)
Mcintyre, K. S.; Miller, T. G. (Principal Investigator)
1982-01-01
The processor that drives the RANGE software allows the user to analyze LANDSAT data containing pasture and rangeland. Analysis includes mapping, generating statistics, calculating vegetative indexes, and plotting vegetative indexes. Routines for using the processor are given. A flow diagram is included.
NASA Astrophysics Data System (ADS)
Hirst, Jonathan D.; King, Ross D.; Sternberg, Michael J. E.
1994-08-01
Neural networks and inductive logic programming (ILP) have been compared to linear regression for modelling the QSAR of the inhibition of E. coli dihydrofolate reductase (DHFR) by 2,4-diamino-5-(substituted benzyl)pyrimidines, and, in the subsequent paper [Hirst, J.D., King, R.D. and Sternberg, M.J.E., J. Comput.-Aided Mol. Design, 8 (1994) 421], the inhibition of rodent DHFR by 2,4-diamino-6,6-dimethyl-5-phenyl-dihydrotriazines. Cross-validation trials provide a statistically rigorous assessment of the predictive capabilities of the methods, with training and testing data selected randomly and all the methods developed using identical training data. For the ILP analysis, molecules are represented by attributes other than Hansch parameters. Neural networks and ILP perform better than linear regression using the attribute representation, but the difference is not statistically significant. The major benefit from the ILP analysis is the formulation of understandable rules relating the activity of the inhibitors to their chemical structure.
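The cross-validation protocol described, random assignment of molecules to training and test sets with every method fit on identical training data, can be sketched for the simplest of the compared methods, linear regression. The descriptor values and activities below are invented for illustration:

```python
import random

def fit_simple_ols(xs, ys):
    # Least-squares intercept and slope for y = a + b*x
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def cv_mse(xs, ys, folds=5, seed=1):
    """k-fold cross-validated mean squared error of a one-variable
    linear model; rows are assigned to folds at random, mimicking the
    random train/test selection described in the abstract."""
    rng = random.Random(seed)
    order = list(range(len(xs)))
    rng.shuffle(order)
    sq_errs = []
    for k in range(folds):
        test = set(order[k::folds])
        train = [i for i in order if i not in test]
        a, b = fit_simple_ols([xs[i] for i in train], [ys[i] for i in train])
        sq_errs += [(ys[i] - (a + b * xs[i])) ** 2 for i in test]
    return sum(sq_errs) / len(sq_errs)

# Invented descriptor/activity pairs that are roughly linear:
xs = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
ys = [1.1, 2.1, 2.9, 4.2, 5.0, 6.1, 6.8, 8.2]
mse = cv_mse(xs, ys)
```

Holding the fold assignment fixed across methods (the same `order` for regression, the neural network, and ILP) is what makes the comparison between methods statistically fair.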
Analysis of medical students' needs for development of a career guidance program.
An, Hyejin; Kim, Eunjeong; Hwang, Jinyoung; Lee, Seunghee
2014-09-01
The purpose of this study is to provide basic data for the development of a career guidance program through a demand survey. For this purpose, three study questions were examined: Is there a difference between the satisfaction with and the importance of a career program? Does this difference vary by gender and grade level? And what types of mentors and mentoring do medical students want? The subjects were 380 students at Seoul National University College of Medicine. The data were analyzed by frequency analysis, paired t-tests, and Borich's formula. Paired-samples t-tests of satisfaction versus importance showed statistically significant differences in all domains. In particular, the difference was greater in the second year. According to the needs analysis, the most urgently needed program is meetings with seniors in various career areas. Medical students also want mentors drawn from clinical professors of the university and from successful medical practitioners, as well as personal counseling. These results show that medical students need a career guidance program. The findings of the study can be used to guide the development of career education programs and curricula for medical students.
Analysis of carpooling in Missouri and an evaluation of Missouri's carpool services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnett, D.R.
1984-12-10
The evaluation is both a statistical profile of carpooling in Missouri and an experimental use of secondary data analysis in combination with clientele surveys to measure the impact of the Division of Energy's carpooling programs. Kansas City, mid-Missouri, and St. Louis are examined. Secondary data analysis seems to indicate that during the period from 1980 to 1983 carpooling increased but vehicle occupancy counts decreased simultaneously with increasing gasoline prices. The evaluation theorizes that the Civilian Labor Force masked carpool statistics, growing at a faster rate than carpooling itself. In conjunction with clientele surveys, the secondary data analysis measures the Division of Energy's impact on carpooling at 2.6% of all carpoolers in Kansas City and 1.0% of all carpoolers in St. Louis during 1983.
Quantitative histogram analysis of images
NASA Astrophysics Data System (ADS)
Holub, Oliver; Ferreira, Sérgio T.
2006-11-01
A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram, and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed.
Program summary
Title of program: HAWGC
Catalogue identifier: ADXG_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Computers: Mobile Intel Pentium III, AMD Duron
Installations: No installation necessary; executable file together with necessary files for the LabVIEW Run-time engine
Operating systems under which the program has been tested: Windows ME/2000/XP
Programming language used: LabVIEW 7.0
Memory required to execute with typical data: ~16 MB at start-up and ~160 MB with an image loaded
No. of bits in a word: 32
No. of processors used: 1
Has the code been vectorized or parallelized?: No
No. of lines in distributed program, including test data, etc.: 138,946
No. of bytes in distributed program, including test data, etc.: 15,166,675
Distribution format: tar.gz
Nature of physical problem: Quantification of image data (e.g., for discrimination of molecular species in gels or fluorescent molecular probes in cell cultures) requires proprietary or complex software packages, which might not include the relevant statistical parameters or may make the analysis of multiple images a tedious procedure for the general user.
Method of solution: Tool for conversion of an RGB bitmap image into a luminance-linear image and extraction of the luminance histogram, probability distribution, and statistical parameters (average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram, and median of the probability distribution), with possible selection of a region of interest (ROI) and lower and upper threshold levels.
Restrictions on the complexity of the problem: Does not incorporate application-specific functions (e.g., morphometric analysis)
Typical running time: Seconds (depending on image size and processor speed)
Unusual features of the program: None
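The statistical parameters in the summary above are ordinary histogram moments. A minimal Python sketch follows (HAWGC itself is a LabVIEW executable; the BT.601 luma coefficients are one common, assumed choice for the selectable RGB-to-grey conversion, and the pixel values are invented):

```python
import math

def luminance(r, g, b):
    # ITU-R BT.601 luma coefficients, one common RGB-to-grey choice
    return 0.299 * r + 0.587 * g + 0.114 * b

def histogram_stats(pixels):
    """Mean, standard deviation, skewness, and excess kurtosis of a
    list of grey levels, a subset of the HAWGC parameter list."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    sd = math.sqrt(var)
    skew = sum((p - mean) ** 3 for p in pixels) / (n * sd ** 3)
    kurt = sum((p - mean) ** 4 for p in pixels) / (n * sd ** 4) - 3.0
    return mean, sd, skew, kurt

grey = [luminance(10, 20, 30), luminance(200, 180, 160),
        luminance(90, 90, 90), luminance(120, 130, 140)]
mean, sd, skew, kurt = histogram_stats(grey)
```

Applying a lower/upper threshold before computing the moments, as the program allows, amounts to filtering `pixels` to the chosen intensity range first.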
Desai, Nidhi; Veras, Laura V; Gosain, Ankush
2018-06-01
The Accreditation Council for Graduate Medical Education (ACGME) Common Program Requirements state that faculty must establish and maintain an environment of inquiry and scholarship. Bibliometrics, the statistical analysis of written publications, assesses scientific productivity and impact. The goal of this study was to understand the state of scholarship at Pediatric Surgery training programs. Following IRB approval, Scopus was used to generate bibliometric profiles for US Pediatric Surgery training programs and faculty. Statistical analyses were performed. Information was obtained for 430 surgeons (105 female) from 48 US training programs. The mean lifetime h-index/surgeon for programs was 14.4 +/- 4.7 (6 programs above 1 SD, 9 programs below 1 SD). The mean 5-year h-index/surgeon for programs was 3.92 +/- 1.5 (7 programs above 1 SD, 8 programs below 1 SD). Programs accredited after 2000 had a lower lifetime h-index than those accredited before 2000 (p=0.0378). Female surgeons had a lower lifetime h-index (p<0.0001), 5-year h-index (p=0.0049), and m-quotient (p<0.0001) compared to males. Mean lifetime h-index increased with academic rank (p<0.0001), with no gender differences beyond the assistant professor rank (p=NS). Variability was identified based on institution, gender, and rank. This information can be used for benchmarking the academic productivity of faculty and programs and as an adjunct in promotion/tenure decisions. Original Research. n/a. Copyright © 2018 Elsevier Inc. All rights reserved.
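The h-index and m-quotient used for benchmarking are straightforward to compute from a surgeon's per-paper citation counts. A sketch with invented citation data (the real profiles came from Scopus):

```python
def h_index(citations):
    """Largest h such that at least h papers have h or more citations."""
    cited = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cited, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def m_quotient(citations, career_years):
    # h-index divided by career length in years, as used in the study
    return h_index(citations) / career_years

h = h_index([10, 8, 5, 4, 3])  # 4 papers have at least 4 citations each
m = m_quotient([10, 8, 5, 4, 3], career_years=10)
```

The m-quotient's normalization by career length is why it is the fairer comparison across faculty of different academic rank.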
ERIC Educational Resources Information Center
Harwell, Michael
2014-01-01
Commercial data analysis software has been a fixture of quantitative analyses in education for more than three decades. Despite its apparent widespread use there is no formal evidence cataloging what software is used in educational research and educational statistics classes, by whom and for what purpose, and whether some programs should be…
Mathematical Sciences Division 1992 Programs
1992-10-01
statistical theory that underlies modern signal analysis. There is a strong emphasis on stochastic processes and time series, particularly those which...include optimal resource planning and real-time scheduling of stochastic shop-floor processes. Scheduling systems will be developed that can adapt to...make forecasts for the length-of-service time series. Protocol analysis of these sessions will be used to identify relevant contextual features and to
Validation of a proposal for evaluating hospital infection control programs.
Silva, Cristiane Pavanello Rodrigues; Lacerda, Rúbia Aparecida
2011-02-01
To validate the construct and discriminant properties of a hospital infection prevention and control program. The program consisted of four indicators: technical-operational structure; operational prevention and control guidelines; epidemiological surveillance system; and prevention and control activities. These indicators, with previously validated content, were applied to 50 healthcare institutions in the city of São Paulo, Southeastern Brazil, in 2009. Descriptive statistics were used to characterize the hospitals and indicator scores, and Cronbach's α coefficient was used to evaluate the internal consistency. The discriminant validity was analyzed by comparing indicator scores between groups of hospitals: with versus without quality certification. The construct validity analysis was based on exploratory factor analysis with a tetrachoric correlation matrix. The indicators for the technical-operational structure and epidemiological surveillance presented almost 100% conformity in the whole sample. The indicators for the operational prevention and control guidelines and the prevention and control activities presented internal consistency ranging from 0.67 to 0.80. The discriminant validity of these indicators indicated higher and statistically significant mean conformity scores among the group of institutions with healthcare certification or accreditation processes. In the construct validation, two dimensions were identified for the operational prevention and control guidelines: recommendations for preventing hospital infection and recommendations for standardizing prophylaxis procedures, with good correlation between the analysis units that formed the guidelines. The same was found for the prevention and control activities: interfaces with treatment units and support units were identified. 
Validation of the measurement properties of the hospital infection prevention and control program indicators made it possible to develop a tool for evaluating these programs in an ethical and scientific manner in order to obtain a quality diagnosis in this field.
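The internal-consistency coefficient reported for the guideline and activity indicators, Cronbach's alpha, can be computed directly from item scores. A sketch with invented binary conformity scores (the 0.7 result here is illustrative, not taken from the study):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item score columns; each inner
    list holds one item's scores across all respondents."""
    k = len(items)
    n = len(items[0])

    def variance(xs):  # sample variance, n-1 denominator
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var = sum(variance(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1.0 - item_var / variance(totals))

# Hypothetical binary conformity scores for three indicator items
# across five institutions:
items = [[1, 1, 0, 1, 1], [1, 1, 0, 1, 0], [1, 0, 0, 1, 1]]
alpha = cronbach_alpha(items)
```

Values in the 0.67 to 0.80 band reported in the abstract are conventionally read as acceptable to good internal consistency.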
Kuretzki, Carlos Henrique; Campos, Antônio Carlos Ligocki; Malafaia, Osvaldo; Soares, Sandramara Scandelari Kusano de Paula; Tenório, Sérgio Bernardo; Timi, Jorge Rufino Ribas
2016-03-01
Information technology is widely applied in healthcare. With regard to scientific research, SINPE(c) - Integrated Electronic Protocols was created as a tool to support researchers by offering clinical data standardization. Until now, SINPE(c) lacked statistical tests computed by automatic analysis. The aim was to add to SINPE(c) features for the automatic execution of the main statistical methods used in medicine. The study was divided into four topics: checking users' interest in the implementation of the tests; surveying the frequency of their use in healthcare; carrying out the implementation; and validating the results with researchers and their protocols. It was applied to a group of users of this software working on their stricto sensu master's and doctoral theses in one postgraduate program in surgery. To assess the reliability of the statistics, the data obtained automatically by SINPE(c) were compared with those computed manually by a statistician experienced in this type of study. There was interest in the use of automatic statistical tests, with good acceptance. The chi-square, Mann-Whitney, Fisher, and Student's t-tests were considered the tests most frequently used by participants in medical studies. These methods were implemented and thereafter approved as expected. The automatic statistical analysis incorporated into SINPE(c) proved reliable and equal to the manual computation, validating its use as a tool for medical research.
Stephens, D.W.; Wangsgard, J.B.
1988-01-01
A computer program, Numerical Taxonomy System of Multivariate Statistical Programs (NTSYS), was used with interfacing software to perform cluster analyses of phytoplankton data stored in the biological files of the U.S. Geological Survey. The NTSYS software performs various types of statistical analyses and is capable of handling a large matrix of data. Cluster analyses were done on phytoplankton data collected from 1974 to 1981 at four national Stream Quality Accounting Network stations in the Tennessee River basin. Analysis of the changes in clusters of phytoplankton genera indicated possible changes in the water quality of the French Broad River near Knoxville, Tennessee. At this station, the most common diatom groups indicated a shift in dominant forms, with some of the less common diatoms being replaced by green and blue-green algae. There was a reduction in genera variability between the 1974-77 and 1979-81 sampling periods. Statistical analysis of chloride and dissolved solids confirmed that concentrations of these substances were smaller in 1974-77 than in 1979-81. At Pickwick Landing Dam, the furthest downstream station used in the study, there was an increase in the number of genera of 'rare' organisms with time. The appearance of two groups of green and blue-green algae indicated that an increase in temperature or nutrient concentrations occurred from 1974 to 1981, but this could not be confirmed using available water quality data. Associations of genera forming the phytoplankton communities at three stations on the Tennessee River were found to be seasonal. Nodal analysis of combined data from all four stations used in the study did not identify any seasonal or temporal patterns during 1974-81. Cluster analysis using the NTSYS programs was effective in reducing the large phytoplankton data set to a manageable size and provided considerable insight into the structure of phytoplankton communities in the Tennessee River basin.
Problems encountered using cluster analysis were the subjectivity introduced in the definition of meaningful clusters and the lack of taxonomic identification to the species level. (Author's abstract)
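The kind of clustering NTSYS performs on genus-by-sample abundance matrices can be sketched with a toy single-linkage agglomeration; the abundance profiles below are invented, and NTSYS offers many more similarity coefficients and linkage options:

```python
def distance(a, b):
    # Euclidean distance between two abundance profiles
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def agglomerate(profiles, n_clusters):
    """Single-linkage agglomerative clustering: repeatedly merge the
    two clusters whose closest members are nearest to each other."""
    clusters = [[i] for i in range(len(profiles))]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(distance(profiles[a], profiles[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)
    return clusters

# Hypothetical genus abundances (three genera) at four sampling dates:
profiles = [[5, 1, 0], [6, 1, 1], [0, 8, 3], [1, 9, 2]]
clusters = agglomerate(profiles, 2)
```

The subjectivity the abstract notes enters exactly here: the choice of `n_clusters` (or equivalently the cut height on the dendrogram) is up to the analyst.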
Computer programs for computing particle-size statistics of fluvial sediments
Stevens, H.H.; Hubbell, D.W.
1986-01-01
Two versions of computer programs for inputting data and computing particle-size statistics of fluvial sediments are presented. The FORTRAN 77 versions are for use on the Prime computer, and the BASIC versions are for use on microcomputers. The size-statistics programs compute Inman, Trask, and Folk statistical parameters from phi values and sizes determined for 10 specified percent-finer values from the input size and percent-finer data. The programs also determine the percentage gravel, sand, silt, and clay, and the Meyer-Peter effective diameter. Documentation and listings for both versions of the programs are included. (Author's abstract)
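The Folk parameters such programs compute follow the standard Folk and Ward graphic formulas, evaluated from phi values at fixed percent-finer points. A sketch with an invented, symmetric size distribution (the original programs also compute the Inman and Trask variants):

```python
def folk_statistics(phi):
    """Folk and Ward graphic statistics from a dict mapping
    percent-finer points (5, 16, 50, 84, 95) to phi values."""
    mean = (phi[16] + phi[50] + phi[84]) / 3.0
    sorting = (phi[84] - phi[16]) / 4.0 + (phi[95] - phi[5]) / 6.6
    skewness = ((phi[16] + phi[84] - 2 * phi[50])
                / (2 * (phi[84] - phi[16]))
                + (phi[5] + phi[95] - 2 * phi[50])
                / (2 * (phi[95] - phi[5])))
    return mean, sorting, skewness

# Hypothetical phi percentiles for a well-sorted, symmetric sand:
phi = {5: 1.0, 16: 1.4, 50: 2.0, 84: 2.6, 95: 3.0}
mean, sorting, skewness = folk_statistics(phi)
```

For this symmetric example the graphic mean equals the median (phi = 2.0, i.e., 0.25 mm) and the skewness is zero; real distributions shift these values.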
STATISTICAL PROGRAMS OF THE UNITED STATES GOVERNMENT: FISCAL YEAR 2018
DOT National Transportation Integrated Search
2018-01-01
Statistical Programs of the United States Government: Fiscal Year 2018 outlines the funding proposed for Federal statistical activities in the President's Budget. This report, along with the chapter "Strengthening Federal Statistics" in the Analytica...
The impact of length of stay on recovery measures in faith-based addiction treatment.
Lashley, Mary
2018-03-30
To determine the impact of length of stay among homeless men in faith-based residential addictions recovery on physical activity, depression, self-esteem, and nicotine dependence. A time-series design was utilized to measure changes in the four quality measures at program entry and at 3, 6, and 9 months following admission. The sample consisted of 175 homeless residents enrolled in a faith-based residential recovery program. Paired t-tests were used to determine the change in average instrument response from admission to each follow-up period. Analysis of variance (ANOVA) and Tukey post hoc tests were used to assess for differences in length of stay between demographic variables. Statistically significant improvements were noted in self-esteem and depressive symptomatology at 3 and 6 months following admission and in physical activity levels at 3 months following admission. Nicotine dependence scores declined at 3 and 6 months, but the declines were not statistically significant. Time spent in this faith-based spiritual recovery program had a significant impact on depression, self-esteem, and physical activity. Recommendations for future study include conducting research to analyze the relationship between distinct program elements and quality indicators and comparing faith-based programs to other similar programs and to publicly funded secular recovery programs. © 2018 Wiley Periodicals, Inc.
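The paired t-tests used to compare admission and follow-up scores reduce to a one-sample t statistic on the within-person differences. A sketch with invented self-esteem scores, not the study's data:

```python
import math

def paired_t(pre, post):
    """Paired t statistic: mean of the per-person differences divided
    by the standard error of that mean."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean / (sd / math.sqrt(n))

# Hypothetical self-esteem scores at admission and at 3 months
# for six residents:
pre = [18, 20, 15, 22, 17, 19]
post = [22, 24, 18, 25, 20, 23]
t = paired_t(pre, post)
```

The pairing matters because it removes between-person variability: each resident serves as his own control across the follow-up periods.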
SPSS and SAS programming for the testing of mediation models.
Dudley, William N; Benuzillo, Jose G; Carrico, Mineh S
2004-01-01
Mediation modeling can explain the nature of the relation among three or more variables. In addition, it can be used to show how a variable mediates the relation between levels of intervention and outcome. The Sobel test, developed in 1990, provides a statistical method for determining the influence of a mediator on an intervention or outcome. Although interactive Web-based and stand-alone methods exist for computing the Sobel test, SPSS and SAS programs that automatically run the required regression analyses and computations increase the accessibility of mediation modeling to nursing researchers. To illustrate the utility of the Sobel test and to make this programming available to the Nursing Research audience in both SAS and SPSS. The history, logic, and technical aspects of mediation testing are introduced. The syntax files sobel.sps and sobel.sas, created to automate the computation of the regression analysis and test statistic, are available from the corresponding author. The reported programming allows the user to complete mediation testing with the user's own data in a single-step fashion. A technical manual included with the programming provides instruction on program use and interpretation of the output. Mediation modeling is a useful tool for describing the relation between three or more variables. Programming and manuals for using this model are made available.
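The statistic that such syntax files compute is a simple function of the two regression coefficients and their standard errors. A sketch of the standard Sobel z (the coefficient values below are invented; the actual SPSS and SAS programs also run the underlying regressions):

```python
import math

def sobel_test(a, se_a, b, se_b):
    """Sobel z for the indirect (mediated) effect a*b, where a is the
    IV-to-mediator coefficient and b the mediator-to-DV coefficient
    (controlling for the IV), with their standard errors."""
    se_ab = math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
    return (a * b) / se_ab

# Invented coefficients from the two mediation regressions:
z = sobel_test(a=0.5, se_a=0.1, b=0.4, se_b=0.15)
```

A |z| above roughly 1.96 is conventionally read as a significant indirect effect at the .05 level, which is how the programs' output is interpreted.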
ASTEP user's guide and software documentation
NASA Technical Reports Server (NTRS)
Gliniewicz, A. S.; Lachowski, H. M.; Pace, W. H., Jr.; Salvato, P., Jr.
1974-01-01
The Algorithm Simulation Test and Evaluation Program (ASTEP) is a modular computer program developed for the purpose of testing and evaluating methods of processing remotely sensed multispectral scanner earth resources data. ASTEP is written in FORTRAN V on the UNIVAC 1110 under the EXEC 8 operating system and may be operated in either a batch or interactive mode. The program currently contains over one hundred subroutines consisting of data classification and display algorithms, statistical analysis algorithms, utility support routines, and feature selection capability. The current program can accept data in LARSC1, LARSC2, ERTS, and Universal formats, and can output processed image or data tapes in Universal format.
7 CFR 295.5 - Program statistical reports.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 4 2011-01-01 2011-01-01 false Program statistical reports. 295.5 Section 295.5 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF... statistical reports. Current and historical information on FNS food assistance program size, monetary outlays...
7 CFR 295.5 - Program statistical reports.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 4 2010-01-01 2010-01-01 false Program statistical reports. 295.5 Section 295.5 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF... statistical reports. Current and historical information on FNS food assistance program size, monetary outlays...
Developing Competency of Teachers in Basic Education Schools
ERIC Educational Resources Information Center
Yuayai, Rerngrit; Chansirisira, Pacharawit; Numnaphol, Kochaporn
2015-01-01
This study aims to develop competency of teachers in basic education schools. The research instruments included the semi-structured in-depth interview form, questionnaire, program developing competency, and evaluation competency form. The statistics used for data analysis were percentage, mean, and standard deviation. The research found that…
Do climate extreme events foster violent civil conflicts? A coincidence analysis
NASA Astrophysics Data System (ADS)
Schleussner, Carl-Friedrich; Donges, Jonathan F.; Donner, Reik V.
2014-05-01
Civil conflicts promoted by adverse environmental conditions represent one of the most important potential feedbacks in the global socio-environmental nexus. While the role of climate extremes as a triggering factor is often discussed, no consensus has yet been reached about the cause-and-effect relation in the observed data record. Here we present results of a rigorous statistical coincidence analysis based on the Munich Re extreme events database and the Uppsala Conflict Data Program. We report evidence for statistically significant synchronicity between climate extremes with high economic impact and violent conflicts for various regions, although no coherent global signal emerges from our analysis. Our results indicate the importance of regional vulnerability and may help identify hot-spot regions for potential climate-triggered violent social conflicts.
Uzunovic, Slavoljub; Kostic, Radmila; Zivkovic, Dobrica
2010-09-01
This study aimed to determine the effects of two different programs of modern sports dancing on coordination, strength, and speed in 60 beginner-level female dancers, aged 13 and 14 years. The subjects were divided into two experimental groups (E1 and E2) of 30 subjects each, drawn from local dance clubs. To assess motor coordination, strength, and speed, we used 15 measurements. The groups were tested before and after the experimental programs. Both experimental programs lasted 18 weeks, with training sessions twice a week for 60 minutes. The subjects in the E1 group trained according to a new experimental program of disco dance (DD) modern sports dance, and the E2 group trained according to the classic DD program of the same kind for beginners. Results were assessed with a paired-samples t-test and MANCOVA/ANCOVA. Following the experimental programs, both groups showed a statistically significant improvement in the evaluated skills, but the changes in the E1 group were more pronounced. The basic assumption of this research was confirmed: the new experimental DD program has a significant influence on coordination, strength, and speed. On the basis of these changes, the new DD program is recommended for beginner dancers.
Tc-99m Labeled and VIP-Receptor Targeted Liposomes for Effective Imaging of Breast Cancer
2006-09-01
computer. The images (100,000 counts/image) were acquired and stored in a 512 x 512 matrix. Image analysis: the Odyssey software program was used to... as well as between normal and tumor-bearing rats for each of the formulations were calculated. Statistical analysis was performed using... large signal-to-noise ratio, thereby rendering data analysis impractical. Moreover, α-helicity of VIP associated with SSM is potentiated in the presence
GenomeGraphs: integrated genomic data visualization with R.
Durinck, Steffen; Bullard, James; Spellman, Paul T; Dudoit, Sandrine
2009-01-06
Biological studies involve a growing number of distinct high-throughput experiments to characterize samples of interest. There is a lack of methods to visualize these different genomic datasets in a versatile manner. In addition, genomic data analysis requires integrated visualization of experimental data along with constantly changing genomic annotation and statistical analyses. We developed GenomeGraphs, as an add-on software package for the statistical programming environment R, to facilitate integrated visualization of genomic datasets. GenomeGraphs uses the biomaRt package to perform on-line annotation queries to Ensembl and translates these to gene/transcript structures in viewports of the grid graphics package. This allows genomic annotation to be plotted together with experimental data. GenomeGraphs can also be used to plot custom annotation tracks in combination with different experimental data types together in one plot using the same genomic coordinate system. GenomeGraphs is a flexible and extensible software package which can be used to visualize a multitude of genomic datasets within the statistical programming environment R.
Ho, Andrew D; Yu, Carol C
2015-06-01
Many statistical analyses benefit from the assumption that unconditional or conditional distributions are continuous and normal. More than 50 years ago in this journal, Lord and Cook chronicled departures from normality in educational tests, and Micceri similarly showed that the normality assumption is met rarely in educational and psychological practice. In this article, the authors extend these previous analyses to state-level educational test score distributions that are an increasingly common target of high-stakes analysis and interpretation. Among 504 scale-score and raw-score distributions from state testing programs from recent years, nonnormal distributions are common and are often associated with particular state programs. The authors explain how scaling procedures from item response theory lead to nonnormal distributions as well as unusual patterns of discreteness. The authors recommend that distributional descriptive statistics be calculated routinely to inform model selection for large-scale test score data, and they illustrate consequences of nonnormality using sensitivity studies that compare baseline results to those from normalized score scales.
Lack of grading agreement among international hemostasis external quality assessment programs
Olson, John D.; Jennings, Ian; Meijer, Piet; Bon, Chantal; Bonar, Roslyn; Favaloro, Emmanuel J.; Higgins, Russell A.; Keeney, Michael; Mammen, Joy; Marlar, Richard A.; Meley, Roland; Nair, Sukesh C.; Nichols, William L.; Raby, Anne; Reverter, Joan C.; Srivastava, Alok; Walker, Isobel
2018-01-01
Laboratory quality programs rely on internal quality control and external quality assessment (EQA). EQA programs provide unknown specimens for the laboratory to test; the laboratory's result is compared with those of other (peer) laboratories performing the same test. EQA programs assign target values using a variety of statistical tools, and a performance assessment of ‘pass’ or ‘fail’ is made. EQA provider members of the international organization External Quality Assurance in Thrombosis and Hemostasis took part in a study to compare the outcome of performance analysis using the same data set of laboratory results. Eleven EQA organizations using eight different analytical approaches participated. Data for a normal and a prolonged activated partial thromboplastin time (aPTT) and a normal and a reduced factor VIII (FVIII) from 218 laboratories were sent to the EQA providers, who analyzed the data set using their own methods of evaluation for aPTT and FVIII, determining the performance for each laboratory record in the data set. Providers also summarized their statistical approach to the assignment of target values and laboratory performance. Each laboratory record in the data set was graded pass/fail by all EQA providers for each of the four analytes. There was a lack of agreement of pass/fail grading among EQA programs. Discordance in the grading was 17.9 and 11% of normal and prolonged aPTT results, respectively, and 20.2 and 17.4% of normal and reduced FVIII results, respectively. All EQA programs in this study employed statistical methods compliant with the International Organization for Standardization standard ISO 13528, yet the evaluation of laboratory results for all four analytes showed remarkable grading discordance. PMID:29232255
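One common way an EQA provider assigns a target value and pass/fail grades is a consensus (peer-group) z-score. The sketch below, with hypothetical aPTT results in seconds, illustrates the general idea only and is not any particular provider's method; robust implementations often substitute the median and a robust spread estimate for the mean and SD:

```python
import statistics

def grade_results(results, pass_z=2.0):
    """Grade each result True (pass) or False (fail) against the peer-group
    consensus mean, using a simple z-score rule (illustrative only)."""
    mu = statistics.mean(results)
    sd = statistics.stdev(results)
    return [abs((x - mu) / sd) <= pass_z for x in results]

# hypothetical aPTT results (seconds) from six peer laboratories
aptt = [30.1, 31.4, 29.8, 30.6, 44.0, 30.9]
grades = grade_results(aptt)  # the 44.0 s outlier exceeds |z| = 2 and fails
```

Because providers differ in exactly these choices (central estimate, spread estimate, pass threshold), the same laboratory result can be graded differently by different programs, which is the discordance the study documents.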
Wahner-Roedler, Dietlind L.; Thompson, Jeffrey M.; Luedtke, Connie A.; King, Susan M.; Cha, Stephen S.; Elkin, Peter L.; Bruce, Barbara K.; Townsend, Cynthia O.; Bergeson, Jody R.; Eickhoff, Andrea L.; Loehrer, Laura L.; Sood, Amit; Bauer, Brent A.
2011-01-01
Most patients with fibromyalgia use complementary and alternative medicine (CAM). Properly designed controlled trials are necessary to assess the effectiveness of these practices. This study was a randomized, double-blind, placebo-controlled, early-phase trial. Fifty patients seen at a fibromyalgia outpatient treatment program were randomly assigned to a daily soy or placebo (casein) shake. Outcome measures were scores of the Fibromyalgia Impact Questionnaire (FIQ) and the Center for Epidemiologic Studies Depression Scale (CES-D) at baseline and after 6 weeks of intervention. Analyses used standard null-hypothesis statistics and the separation test for early-phase CAM comparative trials. Twenty-eight patients completed the study. Use of standard statistics with intent-to-treat analysis showed that total FIQ scores decreased by 14% in the soy group (P = .02) and by 18% in the placebo group (P < .001). The difference in change in scores between the groups was not significant (P = .16). With the same analysis, CES-D scores decreased in the soy group by 16% (P = .004) and in the placebo group by 15% (P = .05). The change in scores was similar in the groups (P = .83). Statistical analysis using the separation test and intent-to-treat analysis revealed no benefit of soy compared with placebo. Shakes that contain soy and shakes that contain casein, when combined with a multidisciplinary fibromyalgia treatment program, provide a decrease in fibromyalgia symptoms. Separation between the effects of soy and casein (control) shakes did not favor the intervention. Therefore, large-sample studies using soy for patients with fibromyalgia are probably not indicated. PMID:18990724
United States Air Force Graduate Student Research Program. Program Technical rept. Volume 3
1988-12-01
placing scalp electrodes in a bipolar arrangement. An additional problem we faced in the F4 study was time-locking the ERP to the eliciting... and Leo Jehl of OEHL for their help with halide and metal analysis that was invaluable to this project. John Barnaby, Will Robinson, Chris Ritter, and... a patient teacher of statistical methods. Dr Eric D. Grassman and Dr Leo Spaccavento explained many technical cardiological points and were generally
ERIC Educational Resources Information Center
Dierker, Lisa; Ward, Nadia; Alexander, Jalen; Donate, Emmanuel
2017-01-01
Background: Upward trends in data-oriented careers threaten to further increase the underrepresentation of both females and individuals from racial minority groups in programs focused on data analysis and applied statistics. To begin to develop the necessary skills for a data-oriented career, project-based learning seems the most promising given…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-29
... for OMB Review; Comment Request; Local Area Unemployment Statistics Program ACTION: Notice. SUMMARY... collection request (ICR) titled, ``Local Area Unemployment Statistics Program,'' to the Office of Management... of Collection: Local Area Unemployment Statistics Program. OMB Control Number: 1220-0017. Affected...
Huang, Shi; MacKinnon, David P.; Perrino, Tatiana; Gallo, Carlos; Cruden, Gracelyn; Brown, C Hendricks
2016-01-01
Mediation analysis often requires larger sample sizes than main effect analysis to achieve the same statistical power. Combining results across similar trials may be the only practical option for increasing statistical power for mediation analysis in some situations. In this paper, we propose a method to estimate: 1) marginal means for mediation path a, the relation of the independent variable to the mediator; 2) marginal means for path b, the relation of the mediator to the outcome, across multiple trials; and 3) the between-trial level variance-covariance matrix based on a bivariate normal distribution. We present the statistical theory and an R computer program to combine regression coefficients from multiple trials to estimate a combined mediated effect and confidence interval under a random effects model. Values of coefficients a and b, along with their standard errors from each trial are the input for the method. This marginal likelihood based approach with Monte Carlo confidence intervals provides more accurate inference than the standard meta-analytic approach. We discuss computational issues, apply the method to two real-data examples and make recommendations for the use of the method in different settings. PMID:28239330
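As a rough illustration of the kind of computation involved (not the authors' R program, which uses a random-effects marginal likelihood model), the sketch below pools hypothetical per-trial a and b coefficients by fixed-effect inverse-variance weighting and then forms a Monte Carlo confidence interval for the mediated effect a*b, in the spirit of the paper's Monte Carlo intervals:

```python
import random

def pool(estimates, ses):
    """Inverse-variance (fixed-effect) pooling of per-trial coefficients.
    A simplified stand-in for the paper's random-effects model."""
    w = [1 / s**2 for s in ses]
    est = sum(wi * ei for wi, ei in zip(w, estimates)) / sum(w)
    se = (1 / sum(w)) ** 0.5
    return est, se

def mc_ci_mediated(a, se_a, b, se_b, reps=100_000, alpha=0.05, seed=1):
    """Monte Carlo percentile CI for a*b: sample a and b from normal
    distributions and take percentiles of the product."""
    rng = random.Random(seed)
    prods = sorted(rng.gauss(a, se_a) * rng.gauss(b, se_b) for _ in range(reps))
    lo = prods[int(reps * alpha / 2)]
    hi = prods[int(reps * (1 - alpha / 2))]
    return a * b, (lo, hi)

# hypothetical path coefficients and standard errors from three trials
a_hat, se_a = pool([0.42, 0.55, 0.48], [0.12, 0.15, 0.10])
b_hat, se_b = pool([0.30, 0.25, 0.35], [0.08, 0.09, 0.07])
effect, (lo, hi) = mc_ci_mediated(a_hat, se_a, b_hat, se_b)
```

Here the pooled mediated effect is about 0.15 with a confidence interval excluding zero; the product of two normals is not itself normal, which is why the percentile interval is preferred over a symmetric Sobel-type interval.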
Mediation analysis in nursing research: a methodological review
Liu, Jianghong; Ulrich, Connie
2017-01-01
Mediation statistical models help clarify the relationship between independent predictor variables and dependent outcomes of interest by assessing the impact of third variables. This type of statistical analysis is applicable for many clinical nursing research questions, yet its use within nursing remains low. Indeed, mediational analyses may help nurse researchers develop more effective and accurate prevention and treatment programs as well as help bridge the gap between scientific knowledge and clinical practice. In addition, this statistical approach allows nurse researchers to ask – and answer – more meaningful and nuanced questions that extend beyond merely determining whether an outcome occurs. Therefore, the goal of this paper is to provide a brief tutorial on the use of mediational analyses in clinical nursing research by briefly introducing the technique and, through selected empirical examples from the nursing literature, demonstrating its applicability in advancing nursing science. PMID:26176804
Frojo, Gianfranco; Tadisina, Kashyap Komarraju; Pressman, Zachary; Chibnall, John T; Lin, Alexander Y; Kraemer, Bruce A
2016-12-01
The integrated plastic surgery match is a competitive process not only for applicants but also for programs vying for highly qualified candidates. Interactions between applicants and program constituents are limited to a single interview visit. The authors aimed to identify components of the interview visit that influence applicant decision making when determining a final program rank list. Thirty-six applicants who were interviewed (100% response) completed the survey. Applicants rated the importance of 20 elements of the interview visit regarding future ranking of the program on a 1 to 5 Likert scale. Data were analyzed using descriptive statistics, hierarchical cluster analysis, analysis of variance, and Pearson correlations. A literature review was performed regarding the plastic surgery integrated residency interview process. Survey questions were categorized into four groups based on mean survey responses: (1) interactions with faculty and residents (mean response > 4); (2) information about the program (3.5-4); (3) ancillaries (food, amenities, stipends) (3-3.5); and (4) hospital tour and hotel (< 3). Hierarchical item cluster analysis and analysis of variance testing validated these groupings. Average summary scores were calculated for the items representing interactions, information, and ancillaries. Correlation analysis between clusters yielded no significant correlations. A review of the literature yielded a paucity of data on analysis of the interview visit. The interview visit consists of a discrete hierarchy of perceived importance by applicants. The strongest independent factor in determining future program ranking is the quality of interactions between applicants and program constituents on the interview visit. This calls for further investigation and optimization of the interview visit experience.
The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bihn T. Pham; Jeffrey J. Einerson
2010-06-01
This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.
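Of the three techniques named, control charting is the simplest to sketch. The fragment below uses hypothetical temperature readings, not AGR data, and is not the NDMAS/SAS implementation; it flags readings that leave Shewhart-style 3-sigma limits derived from a baseline window, the kind of rule that can warn of a failing thermocouple:

```python
import statistics

def control_limits(baseline, k=3.0):
    """Shewhart-style individual-value control limits from a baseline window."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mu - k * sigma, mu + k * sigma

def flag_out_of_control(readings, baseline, k=3.0):
    """Return indices of readings outside the control limits."""
    lo, hi = control_limits(baseline, k)
    return [i for i, x in enumerate(readings) if not (lo <= x <= hi)]

# hypothetical thermocouple temperatures (degrees C) during stable operation
baseline = [1002, 998, 1001, 999, 1000, 1003, 997, 1000, 1001, 999]
readings = [1001, 1000, 965, 1002]  # the 965 reading mimics a failing sensor
flags = flag_out_of_control(readings, baseline)
```

In practice the paper's complementary regression models (thermocouple readings versus calculated fuel temperature) catch slow drifts that a fixed-limit chart like this one misses.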
Quantitative trait nucleotide analysis using Bayesian model selection.
Blangero, John; Goring, Harald H H; Kent, Jack W; Williams, Jeff T; Peterson, Charles P; Almasy, Laura; Dyer, Thomas D
2005-10-01
Although much attention has been given to statistical genetic methods for the initial localization and fine mapping of quantitative trait loci (QTLs), little methodological work has been done to date on the problem of statistically identifying the most likely functional polymorphisms using sequence data. In this paper we provide a general statistical genetic framework, called Bayesian quantitative trait nucleotide (BQTN) analysis, for assessing the likely functional status of genetic variants. The approach requires the initial enumeration of all genetic variants in a set of resequenced individuals. These polymorphisms are then typed in a large number of individuals (potentially in families), and marker variation is related to quantitative phenotypic variation using Bayesian model selection and averaging. For each sequence variant a posterior probability of effect is obtained and can be used to prioritize additional molecular functional experiments. An example of this quantitative nucleotide analysis is provided using the GAW12 simulated data. The results show that the BQTN method may be useful for choosing the most likely functional variants within a gene (or set of genes). We also include instructions on how to use our computer program, SOLAR, for association analysis and BQTN analysis.
O'Connor, B P
2000-08-01
Popular statistical software packages do not have the proper procedures for determining the number of components in factor and principal components analyses. Parallel analysis and Velicer's minimum average partial (MAP) test are validated procedures, recommended widely by statisticians. However, many researchers continue to use alternative, simpler, but flawed procedures, such as the eigenvalues-greater-than-one rule. Use of the proper procedures might be increased if these procedures could be conducted within familiar software environments. This paper describes brief and efficient programs for using SPSS and SAS to conduct parallel analyses and the MAP test.
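Horn's parallel analysis, one of the two procedures the paper implements for SPSS and SAS, retains components whose observed eigenvalues exceed those obtained from random data of the same size. A minimal NumPy sketch (the simulated two-factor data set below is hypothetical, for illustration only):

```python
import numpy as np

def parallel_analysis(data, n_iter=200, percentile=95, seed=0):
    """Horn's parallel analysis: retain components whose observed correlation-
    matrix eigenvalues exceed the chosen percentile of eigenvalues from
    random normal data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand = np.empty((n_iter, p))
    for i in range(n_iter):
        r = rng.standard_normal((n, p))
        rand[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    thresh = np.percentile(rand, percentile, axis=0)
    return int(np.sum(obs > thresh)), obs, thresh

# hypothetical example: 300 cases, 6 variables driven by 2 underlying factors
rng = np.random.default_rng(42)
f = rng.standard_normal((300, 2))
load = np.array([[.8, 0], [.7, 0], [.6, 0], [0, .8], [0, .7], [0, .6]])
x = f @ load.T + 0.5 * rng.standard_normal((300, 6))
k, obs, thresh = parallel_analysis(x)  # recovers the two-factor structure
```

Note the contrast with the eigenvalues-greater-than-one rule criticized in the paper: the comparison here is against a size-matched random-data distribution, not the fixed cutoff of 1.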
2009 GED Testing Program Statistical Report
ERIC Educational Resources Information Center
GED Testing Service, 2010
2010-01-01
The "2009 GED[R] Testing Program Statistical Report" is the 52nd annual report in the program's 68-year history of providing a second opportunity for adults without a high school credential to earn their jurisdiction's GED credential. The report provides candidate demographic and GED Test performance statistics as well as historical…
An Assessment Blueprint for EncStat: A Statistics Anxiety Intervention Program.
ERIC Educational Resources Information Center
Watson, Freda S.; Lang, Thomas R.; Kromrey, Jeffrey D.; Ferron, John M.; Hess, Melinda R.; Hogarty, Kristine Y.
EncStat (Encouraged about Statistics) is a multimedia program being developed to identify and assist students with statistics anxiety or negative attitudes about statistics. This study explored the validity of the assessment instruments included in EncStat with respect to their diagnostic value for statistics anxiety and negative attitudes about…
Welfare Reform in California: Early Results from the Impact Analysis.
ERIC Educational Resources Information Center
Klerman, Jacob Alex; Hotz, V. Joseph; Reardon, Elaine; Cox, Amy G.; Farley, Donna O.; Haider, Steven J.; Imbens, Guido; Schoeni, Robert
The impact of California Work Opportunity and Responsibility to Kids (CalWORKS), which was passed to increase California welfare recipients' participation in welfare-to-work (WTW) activities, was examined. The impact study consisted of a nonexperimental program evaluation that used statistical models to estimate causal effects and a simulation…
DIFAS: Differential Item Functioning Analysis System. Computer Program Exchange
ERIC Educational Resources Information Center
Penfield, Randall D.
2005-01-01
Differential item functioning (DIF) is an important consideration in assessing the validity of test scores (Camilli & Shepard, 1994). A variety of statistical procedures have been developed to assess DIF in tests of dichotomous (Hills, 1989; Millsap & Everson, 1993) and polytomous (Penfield & Lam, 2000; Potenza & Dorans, 1995) items. Some of these…
Successful Vocational Rehabilitation of Clients with Retinitis Pigmentosa.
ERIC Educational Resources Information Center
Taheri-Araghi, M.; Hendren, G.
1994-01-01
Statistical analysis of 10 personal (client) variables and four program variables related to 76 people who became blind from retinitis pigmentosa revealed that 6 variables predicted clients' rehabilitation outcomes: age, gender, race, work status, amount of case-service money spent on the client's behalf, and number of changes in career objectives…
American Samoa's forest resources, 2001.
Joseph A. Donnegan; Sheri S. Mann; Sarah L. Butler; Bruce A. Hiserote
2004-01-01
The Forest Inventory and Analysis Program of the Pacific Northwest Research Station collected, analyzed, and summarized data from field plots, and mapped land cover on four islands in American Samoa. This statistical sample provides estimates of forest area, stem volume, biomass, numbers of trees, damages to trees, and tree size distribution. The summary provides...
Leveraging Code Comments to Improve Software Reliability
ERIC Educational Resources Information Center
Tan, Lin
2009-01-01
Commenting source code has long been a common practice in software development. This thesis, consisting of three pieces of work, made novel use of the code comments written in natural language to improve software reliability. Our solution combines Natural Language Processing (NLP), Machine Learning, Statistics, and Program Analysis techniques to…
Monte Carlo Approach for Reliability Estimations in Generalizability Studies.
ERIC Educational Resources Information Center
Dimitrov, Dimiter M.
A Monte Carlo approach is proposed, using the Statistical Analysis System (SAS) programming language, for estimating reliability coefficients in generalizability theory studies. Test scores are generated by a probabilistic model that considers the probability for a person with a given ability score to answer an item with a given difficulty…
ESL Student Bias in Instructional Evaluation.
ERIC Educational Resources Information Center
Wennerstrom, Ann K.; Heiser, Patty
1992-01-01
Reports on a statistical analysis of English-as-a-Second-Language (ESL) student evaluations of teachers in two large programs. Results indicated a systematic bias occurs in ESL student evaluations, raising issues of fairness in the use of student evaluations of ESL teachers for purposes of personnel decisions. (21 references) (GLR)
Artificial Neural Networks in Policy Research: A Current Assessment.
ERIC Educational Resources Information Center
Woelfel, Joseph
1993-01-01
Suggests that artificial neural networks (ANNs) exhibit properties that promise usefulness for policy researchers. Notes that ANNs have found extensive use in areas once reserved for multivariate statistical programs such as regression and multiple classification analysis and are developing an extensive community of advocates for processing text…
Statistics of Poverty: A Bibliography.
ERIC Educational Resources Information Center
Cameron, Colin; And Others
This bibliography is a collection of citations and illustrative tables which is meant to complement and supplement a 1976 work by the United States Department of Health, Education, and Welfare. That work included an analysis of several social service programs such as food stamps, public welfare, housing, and the education of the poor. The…
Teaching Graduate Business Students to Write Clearly about Technical Topics
ERIC Educational Resources Information Center
Dyrud, Marilyn A.; Worley, Rebecca B.; Jameson, Daphne
2006-01-01
Graduate programs in business emphasize technical analysis in finance, accounting, marketing, and other core courses. Important business decisions--what market to target, which products to offer, how to finance an acquisition, whether to lease or buy equipment--require mathematical and statistical problem solving. Management communication courses…
Forest land area estimates from vegetation continuous fields
Mark D. Nelson; Ronald E. McRoberts; Matthew C. Hansen
2004-01-01
The USDA Forest Service's Forest Inventory and Analysis (FIA) program provides data, information, and knowledge about our Nation's forest resources. FIA regional units collect data from field plots and remotely sensed imagery to produce statistical estimates of forest extent (area); volume, growth, and removals; and health and condition. There is increasing...
Faculty Perceptions of Transition Personnel Preparation in Saudi Arabia
ERIC Educational Resources Information Center
Alhossan, Bandar A.; Trainor, Audrey A.
2017-01-01
This study investigated to what extent faculty members include and value transition curricula in special education preparation programs in Saudi Arabia. A web-based survey was conducted and sent to special education professors across 20 universities. Descriptive statistics and a t-test analysis generated three main findings: (a) Institutions…
NASA Technical Reports Server (NTRS)
Bull, William B. (Compiler); Pinoli, Pat C. (Compiler); Upton, Cindy G. (Compiler); Day, Tony (Compiler); Hill, Keith (Compiler); Stone, Frank (Compiler); Hall, William B.
1994-01-01
This report is a compendium of the presentations of the 12th biannual meeting of the Industry Advisory Committee under the Solid Propulsion Integrity Program. A complete transcript of the welcoming talks is provided. Presentation outlines and overheads are included for the other sessions: SPIP Overview, Past, Current and Future Activity; Test Methods Manual and Video Tape Library; Air Force Developed Computer Aided Cure Program and SPC/TQM Experience; Magneto-Optical mapper (MOM), Joint Army/NASA program to assess composite integrity; Permeability Testing; Moisture Effusion Testing by Karl Fischer Analysis; Statistical Analysis of Acceptance Test Data; NMR Phenolic Resin Advancement; Constituent Testing Highlights on the LDC Optimization Program; Carbon Sulfur Study, Performance Related Testing; Current Rayon Specifications and Future Availability; RSRM/SPC Implementation; SRM Test Methods, Delta/Titan/FBM/RSRM; and Open Forum on Performance Based Acceptance Testing -- Industry Experience.
Park, Sungmin
2014-01-01
This study analyzes the efficiency of small and medium-sized enterprises (SMEs) in a national technology innovation research and development (R&D) program. In particular, an empirical analysis is presented that aims to answer the following question: "Is there a difference in efficiency between R&D collaboration types and between government R&D subsidy sizes?" Methodologically, the efficiency of a government-sponsored R&D project (i.e., GSP) is measured by Data Envelopment Analysis (DEA), and a nonparametric analysis of variance method, the Kruskal-Wallis (KW) test, is adopted to see whether the efficiency differences between R&D collaboration types and between government R&D subsidy sizes are statistically significant. This study's major findings are as follows. First, contrary to our hypothesis, when we controlled for the influence of government R&D subsidy size, there was no statistically significant difference in efficiency between R&D collaboration types. However, the "SME-University-Laboratory" Joint-Venture collaboration type was superior to the others, achieving the largest median and the smallest interquartile range of DEA efficiency scores. Second, the differences in efficiency between government R&D subsidy sizes were statistically significant, and, on the whole, diseconomies of scale were identified. As the government R&D subsidy size increased, the central measures of DEA efficiency scores decreased, but the dispersion measures tended to grow larger.
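The Kruskal-Wallis test used in the study ranks all observations jointly and compares rank sums across groups. A minimal stdlib sketch with hypothetical DEA efficiency scores for three subsidy-size bands (no tie correction, so it assumes distinct values; these numbers are illustrative, not the paper's data):

```python
def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction; a minimal sketch)."""
    pooled = sorted((x, gi) for gi, g in enumerate(groups) for x in g)
    n = len(pooled)
    rank_sums = [0.0] * len(groups)
    for rank, (_, gi) in enumerate(pooled, start=1):
        rank_sums[gi] += rank
    return 12 / (n * (n + 1)) * sum(
        rs**2 / len(g) for rs, g in zip(rank_sums, groups)
    ) - 3 * (n + 1)

# hypothetical DEA efficiency scores by government subsidy size band
small = [0.82, 0.91, 0.77, 0.88, 0.95]
medium = [0.70, 0.66, 0.74, 0.61, 0.69]
large = [0.52, 0.58, 0.49, 0.60, 0.55]
h = kruskal_wallis_h(small, medium, large)
# h exceeds the chi-square critical value 5.99 (df = 2, alpha = .05),
# consistent with the diseconomies-of-scale pattern described above
```

The KW test is appropriate here because DEA efficiency scores are bounded and typically non-normal, which rules out a standard one-way ANOVA.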
NASA Technical Reports Server (NTRS)
Michaud, N. H.
1979-01-01
A system of independent computer programs for the processing of digitized pulse code modulated (PCM) and frequency modulated (FM) data is described. Information is stored in a set of random files and accessed to produce both statistical and graphical output. The software system is designed primarily to present these reports within a twenty-four hour period for quick analysis of the helicopter's performance.
ERIC Educational Resources Information Center
Asahina, Roberta R.
A two-fold statistical analysis examined the creative development of the 15-second television commercial, providing a follow-up to a similar study conducted in 1986. Study 1 of the present analysis examined 335 actual 15-second spots extracted from 30 hours of network daytime and primetime programming in the fourth and first quarters of 1988-1989.…
Influence of the helicopter environment on patient care capabilities: Flight crew perceptions
NASA Technical Reports Server (NTRS)
Meyers, K. Jeffrey; Rodenberg, Howard; Woodard, Daniel
1994-01-01
Flight crew perceptions of the effect of the rotary wing environment on patient care capabilities have not been subject to statistical analysis. We hypothesized that flight crew members perceived significant difficulties in performing patient care tasks during air medical transport. A survey instrument was distributed to a convenience sample of flight crew members from twenty flight programs. Respondents were asked to compare the difficulty of performing patient care tasks in rotary wing and standard (emergency department or intensive care unit) settings. Demographic data collected on respondents included years of flight experience, flights per month, crew duty position, and primary aircraft in which the respondent worked. Statistical analysis was performed as appropriate using Student's t-test, type III sum of squares, and analysis of variance. Alpha was defined as p ≤ .05. Fifty-five percent of programs (90 individuals) responded. All tasks were rated significantly more difficult in the rotary wing environment. Ratings were not significantly correlated with flight experience, duty position, flights per month, or aircraft used. We conclude that the performance of patient care tasks is perceived by air medical flight crew to be significantly more difficult during rotary wing air medical transport than in hospital settings.
Influence of the helicopter environment on patient care capabilities: flight crew perceptions
NASA Technical Reports Server (NTRS)
Myers, K. J.; Rodenberg, H.; Woodard, D.
1995-01-01
INTRODUCTION: Flight crew perceptions of the effect of the rotary-wing environment on patient-care capabilities have not been subject to statistical analysis. We hypothesized that flight crew members perceived significant difficulties in performing patient-care tasks during air medical transport. METHODS: A survey was distributed to a convenience sample of flight crew members from 20 flight programs. Respondents were asked to compare the difficulty of performing patient-care tasks in rotary-wing and standard (emergency department or intensive care unit) settings. Demographic data collected on respondents included years of flight experience, flights per month, crew duty position and primary aircraft in which the respondent worked. Statistical analysis was performed as appropriate using Student's t-test, type III sum of squares, and analysis of variance. Alpha was defined as p < 0.05. RESULTS: Fifty-five percent of programs (90 individuals) responded. All tasks were rated significantly more difficult in the rotary-wing environment. Ratings were not significantly correlated with flight experience, duty position, flights per month or aircraft used. CONCLUSIONS: We conclude that the performance of patient-care tasks is perceived by air medical flight crew to be significantly more difficult during rotary-wing air medical transport than in hospital settings.
Van Ginckel, Ans; Thijs, Youri; Hesar, Narmin Ghani Zadeh; Mahieu, Nele; De Clercq, Dirk; Roosen, Philip; Witvrouw, Erik
2009-04-01
The purpose of this prospective cohort study was to identify dynamic gait-related risk factors for Achilles tendinopathy (AT) in a population of novice runners. Prior to a 10-week running program, force distribution patterns underneath the feet of 129 subjects were registered using a footscan pressure plate while the subjects jogged barefoot at a comfortable self-selected pace. Throughout the program, 10 subjects sustained Achilles tendinopathy, of whom three reported bilateral complaints. Sixty-six subjects were excluded from the statistical analysis, which was therefore performed on the remaining sample of 63 subjects. Logistic regression analysis revealed a significant decrease in the total posterior-anterior displacement of the centre of force (COF) (P=0.015) and a laterally directed force distribution underneath the forefoot at 'forefoot flat' (P=0.016) as intrinsic gait-related risk factors for Achilles tendinopathy in novice runners. These results suggest that, in contrast to the frequently described functional hyperpronation following a more inverted touchdown, a lateral foot roll-over following heel strike and diminished forward force transfer underneath the foot should be considered in the prevention of Achilles tendinopathy.
Software For Computing Reliability Of Other Software
NASA Technical Reports Server (NTRS)
Nikora, Allen; Antczak, Thomas M.; Lyu, Michael
1995-01-01
The Computer Aided Software Reliability Estimation (CASRE) computer program was developed for use in measuring the reliability of other software. It is easier for non-specialists in reliability to use than many other currently available programs developed for the same purpose. CASRE incorporates the mathematical modeling capabilities of the public-domain Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) computer program and runs in the Windows software environment. It provides a menu-driven command interface; the enabling and disabling of menu options guides the user through (1) selection of a set of failure data, (2) execution of a mathematical model, and (3) analysis of results from the model. CASRE is written in the C language.
Navigation analysis for Viking 1979, option B
NASA Technical Reports Server (NTRS)
Mitchell, P. H.
1971-01-01
A parametric study performed for 48 trans-Mars reference missions in support of the Viking program is reported. The launch dates cover several months in the year 1979, and each launch date has multiple arrival dates in 1980. A plot of launch versus arrival dates with case numbers designated for reference purposes is included. The analysis consists of the computation of statistical covariance matrices based on certain assumptions about the ground-based tracking systems. The error model statistics are listed in tables. Tracking systems were assumed at three sites: Goldstone, California; Canberra, Australia; and Madrid, Spain. The tracking data consisted of range and Doppler measurements taken during the tracking intervals starting at E-30(d) and ending at E-10(d) for the control data and ending at E-18(h) for the knowledge data. The control and knowledge covariance matrices were delivered to the Planetary Mission Analysis Branch for inputs into a delta V dispersion analysis.
Language-Agnostic Reproducible Data Analysis Using Literate Programming.
Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa
2016-01-01
A modern biomedical research project can easily contain hundreds of analysis steps, and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid-transport-related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.
Special Flood Hazard Evaluation Report, Maumee River, Defiance and Paulding Counties, Ohio
1988-01-01
into the Flood Flow Frequency Analysis (FFFA) computer program (Reference 3) to determine the discharge-frequency relationship for the Maumee River...although the flood may occur in any year. It is based on statistical analysis of streamflow records available for the watershed and analysis of rainfall...
ERIC Educational Resources Information Center
Ord, Anna S.; Ripley, Jennifer S.; Hook, Joshua; Erspamer, Tiffany
2016-01-01
Although statistical methods and research design are crucial areas of competency for psychologists, few studies explore how statistics are taught across doctoral programs in psychology in the United States. The present study examined 153 American Psychological Association-accredited doctoral programs in clinical and counseling psychology and aimed…
Survey of rural, private wells. Statistical design
Mehnert, Edward; Schock, Susan C.; ,
1991-01-01
Half of Illinois' 38 million acres were planted in corn and soybeans in 1988. On the 19 million acres planted in corn and soybeans, approximately 1 million tons of nitrogen fertilizer and 50 million pounds of pesticides were applied. Because groundwater is the water supply for over 90 percent of rural Illinois, the occurrence of agricultural chemicals in groundwater in Illinois is of interest to the agricultural community, the public, and regulatory agencies. The occurrence of agricultural chemicals in groundwater is well documented. However, the extent of this contamination still needs to be defined. This can be done by randomly sampling wells across a geographic area. Key elements of a random, water-well sampling program for regional groundwater quality include the overall statistical design of the program, definition of the sample population, selection of wells to be sampled, and analysis of survey results. These elements must be consistent with the purpose for conducting the program; otherwise, the program will not provide the desired information. The need to carefully design and conduct a sampling program becomes readily apparent when one considers the high cost of collecting and analyzing a sample. For a random sampling program conducted in Illinois, the key elements, as well as the limitations imposed by available information, are described.
Hyun, Kyung Sun; Kang, Hyun Sook; Kim, Won Ock; Park, Sunhee; Lee, Jia; Sok, Sohyune
2009-04-01
The purpose of this study was to develop a multimedia learning program for patients with diabetes mellitus (DM) diet education using standardized patients and to examine the effects of the program on educational skills, communication skills, DM diet knowledge and learning satisfaction. The study employed a randomized control posttest non-synchronized design. The participants were 108 third year nursing students (52 experimental group, 56 control group) at K university in Seoul, Korea. The experimental group had regular lectures and the multimedia learning program for DM diet education using standardized patients while the control group had regular lectures only. The DM educational skills were measured by trained research assistants. The students who received the multimedia learning program scored higher for DM diet educational skills, communication skills and DM diet knowledge compared to the control group. Learning satisfaction of the experimental group was higher than the control group, but statistically insignificant. Clinical competency was improved for students receiving the multimedia learning program for DM diet education using standardized patients, but there was no statistically significant effect on learning satisfaction. In the nursing education system there is a need to develop and apply more multimedia materials for education and to use standardized patients effectively.
Statistical analysis of multivariate atmospheric variables. [cloud cover
NASA Technical Reports Server (NTRS)
Tubbs, J. D.
1979-01-01
Topics covered include: (1) estimation in discrete multivariate distributions; (2) a procedure to predict cloud cover frequencies in the bivariate case; (3) a program to compute conditional bivariate normal parameters; (4) the transformation of nonnormal multivariate to near-normal; (5) test of fit for the extreme value distribution based upon the generalized minimum chi-square; (6) test of fit for continuous distributions based upon the generalized minimum chi-square; (7) effect of correlated observations on confidence sets based upon chi-square statistics; and (8) generation of random variates from specified distributions.
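Topic (8), generation of random variates from specified distributions, is classically done by inverse-transform sampling: if U ~ Uniform(0,1) and F is the target CDF, then F⁻¹(U) is distributed according to F. A minimal sketch for the exponential distribution (illustrative code, not the report's program):

```python
import numpy as np

rng = np.random.default_rng(5)

# Inverse transform for Exponential(lam): F(x) = 1 - e^{-lam x},
# so F^{-1}(u) = -ln(1 - u) / lam.
lam = 2.0
u = rng.random(100_000)          # U ~ Uniform(0, 1)
x = -np.log1p(-u) / lam          # log1p(-u) = ln(1 - u), numerically stable

print(x.mean())                  # should be close to the theoretical mean 1/lam = 0.5
```

The same recipe applies to any distribution with an invertible CDF; for distributions without a closed-form inverse, a numerical root-find of F(x) = u serves the same purpose.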
ERIC Educational Resources Information Center
Earl, Lorna L.
This series of manuals describing and illustrating the Statistical Package for the Social Sciences (SPSS) was planned as a self-teaching instrument, beginning with the basics and progressing to an advanced level. Information on what the searcher must know to define the data and write a program for preliminary analysis is contained in manual 1,…
The Marshall Islands Data Management Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoker, A.C.; Conrado, C.L.
1995-09-01
This report is a resource document of the methods and procedures used currently in the Data Management Program of the Marshall Islands Dose Assessment and Radioecology Project. Since 1973, over 60,000 environmental samples have been collected. Our program includes relational database design, programming and maintenance; sample and information management; sample tracking; quality control; and data entry, evaluation and reduction. The usefulness of scientific databases involves careful planning in order to fulfill the requirements of any large research program. Compilation of scientific results requires consolidation of information from several databases, and incorporation of new information as it is generated. The success in combining and organizing all radionuclide analysis, sample information, and statistical results into a readily accessible form is critical to our project.
Whist, A C; Liland, K H; Jonsson, M E; Sæbø, S; Sviland, S; Østerås, O; Norström, M; Hopp, P
2014-11-01
Surveillance programs for animal diseases are critical to early disease detection and risk estimation and to documenting a population's disease status at a given time. The aim of this study was to describe a risk-based surveillance program for detecting Mycobacterium avium ssp. paratuberculosis (MAP) infection in Norwegian dairy cattle. The included risk factors for detecting MAP were purchase of cattle, combined cattle and goat farming, and location of the cattle farm in counties containing goats with MAP. The risk indicators included production data [culling of animals >3 yr of age, carcass conformation of animals >3 yr of age, milk production decrease in older lactating cows (lactations 3, 4, and 5)], and clinical data (diarrhea, enteritis, or both, in animals >3 yr of age). Except for combined cattle and goat farming and cattle farm location, all data were collected at the cow level and summarized at the herd level. Predefined risk factors and risk indicators were extracted from different national databases and combined in a multivariate statistical process control to obtain a risk assessment for each herd. The ordinary Hotelling's T² statistic was applied as a multivariate, standardized measure of difference between the current observed state and the average state of the risk factors for a given herd. To make the analysis more robust and adapt it to the slowly developing nature of MAP, monthly risk calculations were based on data accumulated during a 24-mo period. Monitoring of these variables was performed to identify outliers that may indicate deviance in one or more of the underlying processes. The highest-ranked herds were scattered all over Norway and clustered in high-density dairy cattle farm areas. The resulting rankings of herds are being used in the national surveillance program for MAP in 2014 to increase the sensitivity of the ongoing surveillance program in which 5 fecal samples for bacteriological examination are collected from 25 dairy herds.
The use of multivariate statistical process control for selection of herds will be beneficial when a diagnostic test suitable for mass screening is available and validated on the Norwegian cattle population, thus making it possible to increase the number of sampled herds. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
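The core of the herd-ranking scheme above, Hotelling's T² as a multivariate distance of a herd's current risk-indicator vector from the population baseline, can be sketched as follows. The indicator values are synthetic, and this shows only the statistic itself, not the study's 24-month accumulation or control limits:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic herd-level risk indicators (columns): culling rate, carcass
# conformation deviation, milk-yield decrease, diarrhea/enteritis cases.
baseline = rng.normal(size=(200, 4))            # historical herd observations
mean = baseline.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))

def hotelling_t2(x):
    """Ordinary Hotelling's T^2: squared Mahalanobis distance of one herd's
    current indicator vector from the baseline mean."""
    d = x - mean
    return float(d @ cov_inv @ d)

normal_herd = mean + 0.1 * rng.normal(size=4)   # consistent with baseline
outlier_herd = mean + np.array([3.0, -2.5, 3.5, 2.0])  # deviant on all indicators

print(hotelling_t2(normal_herd))    # small T^2
print(hotelling_t2(outlier_herd))   # large T^2: herd flagged for sampling
```

Herds are then ranked by T², and the highest-ranked herds are targeted for fecal sampling, which is how a fixed sampling budget buys extra surveillance sensitivity.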
The strength of graduated drivers license programs and fatalities among teen drivers and passengers.
Morrisey, Michael A; Grabowski, David C; Dee, Thomas S; Campbell, Christine
2006-01-01
The purpose of this study is to investigate the effects of differentially stringent graduated drivers license programs on teen driver fatalities, day-time and night-time teen driver fatalities, fatalities of teen drivers with passengers present, and fatalities among teen passengers. The study uses 1992-2002 data on motor vehicle fatalities among 15-17-year-old drivers from the Fatality Analysis Reporting System to identify the effects of "good", "fair", and "marginal" GDL programs based upon designations by the Insurance Institute for Highway Safety. Analysis is conducted using conditional negative binomial regressions with fixed effects. "Good" programs reduce total fatalities among young drivers by 19.4% (c.i. -33.0%, -5.9%). "Fair" programs reduce night-time young driver fatalities by 12.6% (c.i. -23.9%, -1.2%), but have no effect on day-time fatalities. "Marginal" programs had no statistically meaningful effect on driver fatalities. All three types of programs reduced teen passenger fatalities, but the effects of limitations on the number of passengers appear to have had only minimal effects in reducing fatalities among young drivers themselves. Stronger GDL programs are more effective than weaker programs in reducing teenage motor vehicle fatalities.
Calypso: a user-friendly web-server for mining and visualizing microbiome-environment interactions.
Zakrzewski, Martha; Proietti, Carla; Ellis, Jonathan J; Hasan, Shihab; Brion, Marie-Jo; Berger, Bernard; Krause, Lutz
2017-03-01
Calypso is an easy-to-use online software suite that allows non-expert users to mine, interpret and compare taxonomic information from metagenomic or 16S rDNA datasets. Calypso has a focus on multivariate statistical approaches that can identify complex environment-microbiome associations. The software enables quantitative visualizations, statistical testing, multivariate analysis, supervised learning, factor analysis, multivariable regression, network analysis and diversity estimates. Comprehensive help pages, tutorials and videos are provided via a wiki page. The web-interface is accessible via http://cgenome.net/calypso/ . The software is programmed in Java, PERL and R and the source code is available from Zenodo ( https://zenodo.org/record/50931 ). The software is freely available for non-commercial users. Contact: l.krause@uq.edu.au. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
Randomization in cancer clinical trials: permutation test and development of a computer program.
Ohashi, Y
1990-01-01
When analyzing cancer clinical trial data where the treatment allocation is done using dynamic balancing methods such as the minimization method for balancing the distribution of important prognostic factors in each arm, conservativeness occurs if such a randomization scheme is ignored and a simple unstratified analysis is carried out. In this paper, the above conservativeness is demonstrated by computer simulation, and the development of a computer program is introduced that carries out permutation tests of the log-rank statistics for clinical trial data where the allocation is done by the minimization method or a stratified permuted block design. We are planning to use this program in practice to supplement a usual stratified analysis and model-based methods such as the Cox regression. The most serious problem in cancer clinical trials in Japan is how to carry out the quality control or data management in trials that are initiated and conducted by researchers without support from pharmaceutical companies. In the final section of this paper, one international collaborative work for developing international guidelines on data management in clinical trials of bladder cancer is briefly introduced, and the differences between the system adopted in US/European statistical centers and the Japanese system are described. PMID:2269216
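The permutation-test idea above can be sketched generically. This is a hedged illustration with synthetic data: a simple mean difference stands in for the paper's log-rank statistic, and simple shuffling stands in for re-running the allocation scheme:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic outcomes for two treatment arms (not survival data).
arm_a = rng.normal(1.0, 1.0, size=40)
arm_b = rng.normal(1.5, 1.0, size=40)

observed = arm_b.mean() - arm_a.mean()
pooled = np.concatenate([arm_a, arm_b])

# Re-assign arm labels at random many times and count statistics at least
# as extreme as the observed one (two-sided).
n_perm = 5000
count = 0
for _ in range(n_perm):
    perm = rng.permutation(pooled)
    diff = perm[40:].mean() - perm[:40].mean()
    if abs(diff) >= abs(observed):
        count += 1

p_value = (count + 1) / (n_perm + 1)   # add-one correction avoids p = 0
print(f"observed difference = {observed:.3f}, permutation p = {p_value:.4f}")
```

The paper's point is precisely that, under minimization or stratified permuted blocks, the reference set should be generated by replaying the actual allocation procedure rather than by the simple shuffling shown here; otherwise the test can be conservative.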
Center for Prostate Disease Research
Household participation in recycling programs: a case study from Turkey.
Budak, Fuat; Oguz, Burcu
2008-11-01
This study investigates the underlying factors that motivate households to participate in a pilot source separation and recycling program in Turkey. The data of this research were collected from randomly selected households in the program area via face-to-face interviews based on a comprehensive questionnaire. The results of logistic regression analysis show that having sufficient knowledge regarding recycling and the recycling program is the most statistically significant factor in determining whether a household will participate in recycling. The results also imply that some of the socio-economic and demographic characteristics of households hypothesized, in the research framework, to affect the household decision to participate in recycling are not significant.
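A logistic regression of participation on household covariates, as used above, can be sketched with synthetic data. The covariates and coefficients below are hypothetical; the fit is done by maximizing the logistic log-likelihood with `scipy.optimize.minimize`:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Hypothetical household data: x1 = knowledge-of-recycling score,
# x2 = a socio-economic covariate; y = 1 if the household participates.
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
true_beta = np.array([-0.5, 2.0, 0.2])   # knowledge dominates, echoing the finding
p = 1.0 / (1.0 + np.exp(-X @ true_beta))
y = (rng.random(n) < p).astype(float)

def neg_log_lik(beta):
    # Logistic log-likelihood: sum_i [ y_i z_i - log(1 + e^{z_i}) ]
    z = X @ beta
    return -(y @ z - np.logaddexp(0.0, z).sum())

res = minimize(neg_log_lik, np.zeros(3), method="BFGS")
print(res.x)   # fitted coefficients should roughly recover true_beta
```

A near-zero fitted coefficient on the socio-economic covariate, alongside a large one on knowledge, mirrors the study's conclusion that knowledge, not demographics, drives participation.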
Whellan, David J; Cohen, Elizabeth J; Matchar, David B; Califf, Robert M
2002-07-01
Despite the widening use of disease management (DM) programs throughout the country, little is understood about the "state of DM" in healthcare systems and managed care organizations. OBJECTIVE: To better characterize the range of users of DM in healthcare and to identify critical issues, both present and future, for DM. STUDY DESIGN: Qualitative survey. METHODS: Forty-seven healthcare systems (n = 22) and managed care organizations (n = 25) were randomly selected. Decision makers were identified and interviewed between January 1, 2000, and March 31, 2000. We limited quantitative analysis to tabulations of suitable responses, without statistical testing. RESULTS: Responses were organized around 3 themes: models for DM, implementation strategies, and measurements of success. Of 47 decision makers surveyed, 42 (89%) reported that their organizations currently have (75%) or are working to develop (14%) DM programs. Although the goals of DM programs were similar, organizations took a variety of approaches to achieving these ends. There were typically 3 steps in implementing a DM program: analysis of patient data, external analysis, and organizational analysis. Decision makers believed that DM programs had only achieved partial success in reaching the 2 main goals of improved quality of care and cost savings. CONCLUSIONS: Given the variety of DM programs, there is a need to develop a classification scheme to allow for better comparison between programs. Further quantitative studies of decision makers' opinions would be helpful in developing programs and in designing necessary studies of patient management strategies.
HydroClimATe: hydrologic and climatic analysis toolkit
Dickinson, Jesse; Hanson, Randall T.; Predmore, Steven K.
2014-01-01
The potential consequences of climate variability and climate change have been identified as major issues for the sustainability and availability of the worldwide water resources. Unlike global climate change, climate variability represents deviations from the long-term state of the climate over periods of a few years to several decades. Currently, rich hydrologic time-series data are available, but the combination of data preparation and statistical methods developed by the U.S. Geological Survey as part of the Groundwater Resources Program is relatively unavailable to hydrologists and engineers who could benefit from estimates of climate variability and its effects on periodic recharge and water-resource availability. This report documents HydroClimATe, a computer program for assessing the relations between variable climatic and hydrologic time-series data. HydroClimATe was developed for a Windows operating system. The software includes statistical tools for (1) time-series preprocessing, (2) spectral analysis, (3) spatial and temporal analysis, (4) correlation analysis, and (5) projections. The time-series preprocessing tools include spline fitting, standardization using a normal or gamma distribution, and transformation by a cumulative departure. The spectral analysis tools include discrete Fourier transform, maximum entropy method, and singular spectrum analysis. The spatial and temporal analysis tool is empirical orthogonal function analysis. The correlation analysis tools are linear regression and lag correlation. The projection tools include autoregressive time-series modeling and generation of many realizations. These tools are demonstrated in four examples that use stream-flow discharge data, groundwater-level records, gridded time series of precipitation data, and the Multivariate ENSO Index.
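Two of the HydroClimATe steps listed above, standardization preprocessing followed by discrete-Fourier-transform spectral analysis, can be sketched together. The monthly series below is synthetic (an annual cycle plus noise standing in for a streamflow record), not output of the program itself:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic monthly hydrologic series: a 12-month cycle plus noise.
n = 480                                        # 40 years of monthly values
t = np.arange(n)
series = 10 + 3 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, n)

# Preprocessing: standardize to zero mean and unit variance.
z = (series - series.mean()) / series.std()

# Spectral analysis via the DFT: the periodogram should peak at the
# annual frequency, 1/12 cycles per month.
freqs = np.fft.rfftfreq(n, d=1.0)
power = np.abs(np.fft.rfft(z)) ** 2 / n
peak_freq = freqs[np.argmax(power[1:]) + 1]    # skip the zero frequency
print(f"dominant frequency: {peak_freq:.4f} cycles/month (1/12 = 0.0833)")
```

Detecting such periodicities in real discharge or groundwater-level records is the kind of climate-variability signal the toolkit's spectral methods (DFT, maximum entropy, singular spectrum analysis) are designed to estimate.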
ERIC Educational Resources Information Center
King, James M.; And Others
The materials described here represent the conversion of a highly popular student workbook "Sets, Probability and Statistics: The Mathematics of Life Insurance" into a computer program. The program is designed to familiarize students with the concepts of sets, probability, and statistics, and to provide practice using real life examples. It also…
ERIC Educational Resources Information Center
Dunn, Karee
2014-01-01
Online graduate education programs are expanding rapidly. Many of these programs require a statistics course, resulting in an increasing need for online statistics courses. The study reported here grew from experiences teaching online, graduate statistics courses. In seeking answers on how to improve this class, I discovered that research has yet…
The evaluation of reproductive health PhD program in Iran: The input indicators analysis.
AbdiShahshahani, Mahshid; Ehsanpour, Soheila; Yamani, Nikoo; Kohan, Shahnaz
2014-11-01
Appropriate quality achievement of a PhD program requires frequent assessment and discovering the shortcomings in the program. Inputs, which are important elements of the curriculum, are frequently missed in evaluations. The purpose of this study was to evaluate the input indicators of the reproductive health PhD program in Iran based on the Context, Input, Process, and Product (CIPP) evaluation model. This is a descriptive and evaluative study based on the CIPP evaluation model. It was conducted in 2013 in four Iranian schools of nursing and midwifery of medical sciences universities. The statistical population consisted of four groups: heads of departments (n = 5), faculty members (n = 18), graduates (n = 12), and PhD students of reproductive health (n = 54). Data collection tools were five separate questionnaires including 37 indicators that were developed by the researcher. Content and face validity were evaluated based on expert judgment. The Cronbach's alpha coefficient was calculated in order to obtain the reliability of the questionnaires. Collected data were analyzed by SPSS software using descriptive statistics (mean, frequency, percentage, and standard deviation), one-way analysis of variance (ANOVA), and least significant difference (LSD) post hoc tests to compare means between groups. The results of the study indicated that the highest percentage of the heads of departments (80%), graduates (66.7%), and students (68.5%) evaluated the status of input indicators of the reproductive health PhD program as relatively appropriate, while most of the faculty members (66.7%) evaluated it as appropriate. It is suggested that the reasons for the "relatively appropriate" evaluation of input indicators be explored through further academic research and that the reproductive health PhD program be improved accordingly.
Technology, Data Bases and System Analysis for Space-to-Ground Optical Communications
NASA Technical Reports Server (NTRS)
Lesh, James
1995-01-01
Optical communications is becoming an ever-increasingly important option for designers of space-to-ground communications links, whether it be for government or commercial applications. In this paper the technology being developed by NASA for use in space-to-ground optical communications is presented. Next, a program which is collecting a long-term data base of atmospheric visibility statistics for optical propagation through the atmosphere will be described. Finally, a methodology for utilizing the statistics of the atmospheric data base in the analysis of space-to-ground links will be presented. This methodology takes into account the effects of station availability, is useful when comparing optical communications with microwave systems, and provides a rationale for establishing the recommended link margin.
Weber, Joseph J; Mascarenhas, Debra C; Bellin, Lisa S; Raab, Rachel E; Wong, Jan H
2012-10-01
Patient navigation programs are initiated to help guide patients through barriers in a complex cancer care system. We sought to analyze the impact of our patient navigator program on adherence to specific Breast Cancer Care Quality Indicators (BCCQI). A retrospective cohort of patients with stage I-III breast cancer seen the calendar year prior to the initiation of the patient navigation program was compared with patients treated in the ensuing two calendar years. Quality indicators deemed appropriate for analysis were those associated with overcoming barriers to treatment and those associated with providing health education and improving patient decision-making. A total of 134 consecutive patients between January 1, 2006 and December 31, 2006 and 234 consecutive patients between January 1, 2008 and December 31, 2009 were evaluated for compliance with the BCCQI. There was no significant difference in the mean age or race/ethnic distribution of the study population. In all ten BCCQI evaluated, there was improvement in the percentage of patients in compliance from pre to post implementation of the patient navigator program (range 2.5-27.0 %). Overall, compliance with BCCQI improved from 74.1 to 95.5 % (p < 0.0001). Indicators associated with informed decision-making and patient preference achieved statistical significance, while among process-of-treatment indicators only completion axillary node dissection in sentinel node-positive biopsies achieved statistical significance. The implementation of a patient navigator program improved breast cancer care as measured by BCCQI. The impact on disease-free and overall survival remains to be determined.
NASA Astrophysics Data System (ADS)
Sarrazine, Angela Renee
The purpose of this study was to incorporate multiple intelligences techniques in both a classroom and planetarium setting to create a significant increase in student learning about the moon and lunar phases. Utilizing a free-response questionnaire and a 25-item multiple choice pre-test/post-test design, this study identified middle school students' misconceptions and measured increases in student learning about the moon and lunar phases. The study spanned two semesters and contained six treatment groups which consisted of both single and multiple interventions. One group only attended the planetarium program. Two groups attended one of two classes a week prior to the planetarium program, and two groups attended one of two classes a week after the planetarium program. The most rigorous treatment group attended a class both a week before and after the planetarium program. Utilizing Rasch analysis techniques and parametric statistical tests, all six groups exhibited statistically significant gains in knowledge at the 0.05 level. There were no significant differences between students who attended only a planetarium program versus a single classroom program. Also, subjects who attended either a pre-planetarium class or a post-planetarium class did not show a statistically significant gain over the planetarium only situation. Equivalent effects on student learning were exhibited by the pre-planetarium class groups and post-planetarium class groups. Therefore, it was determined that the placement of the second intervention does not have a significant impact on student learning. However, a decrease in learning was observed with the addition of a third intervention. Further instruction and testing appeared to hinder student learning. This is perhaps an effect of subject fatigue.
Physics Education: A Significant Backbone of Sustainable Development in Developing Countries
NASA Astrophysics Data System (ADS)
Akintola, R. A.
2006-08-01
In the quest for technological self-reliance, many policies, programs and projects have been proposed and implemented in order to procure solutions to the problems of technological inadequacies of developing countries. It has been observed that all of these have failed. This research identifies the problems, proposes lasting solutions to emancipate physics education in developing nations, and highlights possible future gains. The statistical analysis employed was based on questionnaires, interviews and data analysis.
1991-12-01
[Table-of-contents fragment] 2.6.1 Multi-Shape Detection; 2.6.2 Line Segment Extraction and Re-Combination; 2.6.3 Planimetric Feature Extraction; 2.6.4 Line Segment Extraction From Statistical Texture Analysis; 2.6.5 Edge Following as Graph… …image after image, could benefit due to the fact that major spatial characteristics of subregions could be extracted, and minor spatial changes could be
Analysis of trends in water-quality data for water conservation area 3A, the Everglades, Florida
Mattraw, H.C.; Scheidt, D.J.; Federico, A.C.
1987-01-01
Rainfall and water quality data bases from the South Florida Water Management District were used to evaluate water quality trends at 10 locations near or in Water Conservation Area 3A in The Everglades. The Seasonal Kendall test was applied to specific conductance, orthophosphate-phosphorus, nitrate-nitrogen, total Kjeldahl nitrogen, and total nitrogen regression residuals for the period 1978-82. Residuals of orthophosphate and nitrate quadratic models, based on antecedent 7-day rainfall at inflow gate S-11B, were the only two constituent-structure pairs that showed apparent significant (p < 0.05) increases in constituent concentrations. Elimination of regression models with distinct residual patterns and data outliers resulted in 17 statistically significant station water quality combinations for trend analysis. No water quality trends were observed. The 1979 Memorandum of Agreement outlining the water quality monitoring program between the Everglades National Park and the U.S. Army Corps of Engineers stressed collection four times a year at three stations, and extensive coverage of water quality properties. Trend analysis and other rigorous statistical evaluation programs are better suited to data monitoring programs that include more frequent sampling and that are organized in a water quality data management system. Pronounced areal differences in water quality suggest that a water quality monitoring system for Shark River Slough in Everglades National Park include collection locations near the source of inflow to Water Conservation Area 3A. (Author's abstract)
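The Seasonal Kendall test used above compares only same-season observations across years, so seasonal cycles do not masquerade as trends. A minimal sketch of the core S statistic follows; the variance term and p-value are omitted, and the function names are illustrative, not taken from the study's software:

```python
def mann_kendall_s(values):
    """Mann-Kendall S: number of increasing pairs minus decreasing pairs."""
    s = 0
    for i in range(len(values)):
        for j in range(i + 1, len(values)):
            diff = values[j] - values[i]
            s += (diff > 0) - (diff < 0)
    return s

def seasonal_kendall_s(series_by_season):
    """Seasonal Kendall: sum the Mann-Kendall S over each season, so that
    observations are only ever compared with the same season in other years."""
    return sum(mann_kendall_s(season) for season in series_by_season)
```

A strongly positive total S across seasons suggests an upward trend; in practice the statistic is standardized by its variance before a significance level is attached.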
Assessment of NDE Reliability Data
NASA Technical Reports Server (NTRS)
Yee, B. G. W.; Chang, F. H.; Couchman, J. C.; Lemon, G. H.; Packman, P. F.
1976-01-01
Twenty sets of relevant Nondestructive Evaluation (NDE) reliability data have been identified, collected, compiled, and categorized. A criterion for the selection of data for statistical analysis considerations has been formulated. A model to grade the quality and validity of the data sets has been developed. Data input formats, which record the pertinent parameters of the defect/specimen and inspection procedures, have been formulated for each NDE method. A comprehensive computer program has been written to calculate the probability of flaw detection at several confidence levels by the binomial distribution. This program also selects the desired data sets for pooling and tests the statistical pooling criteria before calculating the composite detection reliability. Probability of detection curves at 95 and 50 percent confidence levels have been plotted for individual sets of relevant data as well as for several sets of merged data with common sets of NDE parameters.
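The binomial probability-of-detection calculation described above can be sketched as a one-sided lower confidence bound on the detection probability, solved by bisection on the binomial tail. This is a generic reconstruction, not the report's actual program; the classic "29 of 29" NDE case recovers the familiar 90% POD at 95% confidence:

```python
from math import comb

def binom_sf(p, n, s):
    """P(X >= s) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(s, n + 1))

def pod_lower_bound(detected, trials, confidence=0.95, tol=1e-9):
    """One-sided lower confidence bound on the probability of flaw detection
    (Clopper-Pearson style), found by bisection on the binomial tail."""
    if detected == 0:
        return 0.0
    lo, hi = 0.0, 1.0
    target = 1.0 - confidence
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if binom_sf(mid, trials, detected) < target:
            lo = mid  # tail too small: true bound lies above mid
        else:
            hi = mid
    return lo
```

With 29 detections in 29 trials, `pod_lower_bound(29, 29)` returns roughly 0.902, i.e. one can claim 90% POD at 95% confidence.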
Guam's forest resources, 2002.
Joseph A. Donnegan; Sarah L. Butler; Walter Grabowiecki; Bruce A. Hiserote; David. Limtiaco
2004-01-01
The Forest Inventory and Analysis Program collected, analyzed, and summarized field data on 46 forested plots on the island of Guam. Estimates of forest area, tree stem volume and biomass, the numbers of trees, tree damages, and the distribution of tree sizes were summarized for this statistical sample. Detailed tables and graphical highlights provide a summary of Guam...
Automatic Rock Detection and Mapping from HiRISE Imagery
NASA Technical Reports Server (NTRS)
Huertas, Andres; Adams, Douglas S.; Cheng, Yang
2008-01-01
This system includes a C-code software program and a set of MATLAB software tools for statistical analysis and rock distribution mapping. The major functions include rock detection and rock detection validation. The rock detection code has been evolved into a production tool that can be used by engineers and geologists with minor training.
Railroad safety program, task 2
NASA Technical Reports Server (NTRS)
1983-01-01
Aspects of railroad safety and the preparation of a National Inspection Plan (NIP) for rail safety improvement are examined. Methodology for the allocation of inspection resources, preparation of a NIP instruction manual, and recommendations for future NIP, are described. A statistical analysis of regional rail accidents is presented with causes and suggested preventive measures included.
ERIC Educational Resources Information Center
Palazotto, Anthony N.; And Others
This report is the result of a pilot program to seek out ways for developing an educational institution's transportation flow. Techniques and resulting statistics are discussed. Suggestions for additional uses of the information obtained are indicated. (Author)
Making Knowledge Delivery Failsafe: Adding Step Zero in Hypothesis Testing
ERIC Educational Resources Information Center
Pan, Xia; Zhou, Qiang
2010-01-01
Knowledge of statistical analysis is increasingly important for professionals in modern business. For example, hypothesis testing is one of the critical topics for quality managers and team workers in Six Sigma training programs. Delivering the knowledge of hypothesis testing effectively can be an important step for the incapable learners or…
Parent Expectations and Planning for College. Statistical Analysis Report. NCES 2008-079
ERIC Educational Resources Information Center
Lippman, Laura; Guzman, Lina; Keith, Julie Dombrowski; Kinukawa, Akemi; Shwalb, Rebecca; Tice, Peter
2008-01-01
This report uses data from the 2003 National Household Education Surveys Program (NHES) Parent and Family Involvement Survey (PFI) to examine the characteristics associated with the educational expectations parents had for their children and the postsecondary education planning practices families and schools engaged in. The results presented in…
A data storage, retrieval and analysis system for endocrine research. [for Skylab
NASA Technical Reports Server (NTRS)
Newton, L. E.; Johnston, D. A.
1975-01-01
This retrieval system builds, updates, retrieves, and performs basic statistical analyses on blood, urine, and diet parameters for the M071 and M073 Skylab and Apollo experiments. This system permits data entry from cards to build an indexed sequential file. Programs are easily modified for specialized analyses.
Descriptive statistics of tree crown condition in the Northeastern United States
KaDonna C. Randolph; Randall S. Morin; Jim Steinman
2010-01-01
The U.S. Forest Service Forest Inventory and Analysis (FIA) Program uses visual assessments of tree crown condition to monitor changes and trends in forest health. This report describes four crown condition indicators (crown dieback, crown density, foliage transparency, and sapling crown vigor) measured in Connecticut, Delaware, Maine, Maryland, Massachusetts, New...
T.M. Barrett
2004-01-01
During the 1990s, forest inventories for California, Oregon, and Washington were conducted by different agencies using different methods. The Pacific Northwest Research Station Forest Inventory and Analysis program recently integrated these inventories into a single database. This document briefly describes potential statistical methods for estimating population totals...
The Effectiveness of the Counterinsurgency Operations During the Macedonian Conflict in 2001
2010-12-10
…comes from the statistical analysis of demographic data that predicts that in the near future the high Albanian birth rate (Albanians have highest… …among the first members of the NATO Partnership for Peace program, which was a first step to joining NATO. Until the emergence of the insurgency in…
Statistical Analysis for Multisite Trials Using Instrumental Variables with Random Coefficients
ERIC Educational Resources Information Center
Raudenbush, Stephen W.; Reardon, Sean F.; Nomi, Takako
2012-01-01
Multisite trials can clarify the average impact of a new program and the heterogeneity of impacts across sites. Unfortunately, in many applications, compliance with treatment assignment is imperfect. For these applications, we propose an instrumental variable (IV) model with person-specific and site-specific random coefficients. Site-specific IV…
An Analysis of Factors Affecting Student Perceptions in a Blended Learning Environment
ERIC Educational Resources Information Center
Peruso, Florence Mary
2012-01-01
The current quantitative study measured the perceptions of students towards online-only learning and towards blended-hybrid learning. Descriptive statistics were implemented to analyze the data from a Likert-type survey, administered to students in degree-seeking programs at an institution of higher learning. A "t"-test and…
Palau's forest resources, 2003.
Joseph A. Donnegan; Sarah L. Butler; Olaf Kuegler; Brent J. Stroud; Bruce A. Hiserote; Kashgar. Rengulbai
2007-01-01
The Forest Inventory and Analysis Program collected, analyzed, and summarized field data on 54 forested plots on the islands in the Republic of Palau. Estimates of forest area, tree stem volume and biomass, the numbers of trees, tree damages, and the distribution of tree sizes were summarized for this statistical sample. Detailed tables and graphical highlights provide...
The National Center for Health Statistics collaborated with the National Human Monitoring Program of the U.S. Environmental Protection Agency (EPA) in a four-year study to assess the exposure of the general population to selected pesticides through analysis of blood serum and uri...
Overview Of Recent Enhancements To The Bumper-II Meteoroid and Orbital Debris Risk Assessment Tool
NASA Technical Reports Server (NTRS)
Hyde, James L.; Christiansen, Eric L.; Lear, Dana M.; Prior, Thomas G.
2006-01-01
Discussion includes recent enhancements to the BUMPER-II program and input files in support of Shuttle Return to Flight. Improvements to the mesh definitions of the finite element input model will be presented. A BUMPER-II analysis process that was used to estimate statistical uncertainty is introduced.
Farmers as Consumers of Agricultural Education Services: Willingness to Pay and Spend Time
ERIC Educational Resources Information Center
Charatsari, Chrysanthi; Papadaki-Klavdianou, Afroditi; Michailidis, Anastasios
2011-01-01
This study assessed farmers' willingness to pay for and spend time attending an Agricultural Educational Program (AEP). Primary data on the demographic and socio-economic variables of farmers were collected from 355 farmers selected randomly from Northern Greece. Descriptive statistics and multivariate analysis methods were used in order to meet…
Manufacturing Dissent: Labor Revitalization, Union Summer and Student Protest
ERIC Educational Resources Information Center
Van Dyke, Nella; Dixon, Marc; Carlon, Helen
2007-01-01
During the late 1990s, college students across the United States mobilized around labor issues. Our research explores whether this explosion of student protest activity was generated, in part, by concerted efforts of the AFL-CIO through its Union Summer college student internship program. A statistical analysis of factors influencing the location…
ERIC Educational Resources Information Center
Wang, Shudong; Wang, Ning; Hoadley, David
2007-01-01
This study used confirmatory factor analysis (CFA) to examine the comparability of the National Nurse Aide Assessment Program (NNAAP[TM]) test scores across language and administration condition groups for calibration and validation samples that were randomly drawn from the same population. Fit statistics supported both the calibration and…
Emery, R J
1997-03-01
Institutional radiation safety programs routinely use wipe test sampling and liquid scintillation counting analysis to indicate the presence of removable radioactive contamination. Significant volumes of liquid waste can be generated by such surveillance activities, and the subsequent disposal of these materials can sometimes be difficult and costly. In settings where large numbers of negative results are regularly obtained, the limited grouping of samples for analysis based on expected value statistical techniques is possible. To demonstrate the plausibility of the approach, single wipe samples exposed to varying amounts of contamination were analyzed concurrently with nine non-contaminated samples. Although the sample grouping inevitably leads to increased quenching with liquid scintillation counting systems, the effect did not impact the ability to detect removable contamination in amounts well below recommended action levels. Opportunities to further improve this cost effective semi-quantitative screening procedure are described, including improvements in sample collection procedures, enhancing sample-counting media contact through mixing and extending elution periods, increasing sample counting times, and adjusting institutional action levels.
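The expected-value grouping idea above, counting one pooled vial and recounting individually only when the pool is positive, is usually quantified with Dorfman's two-stage formula. The sketch below is an illustration of that arithmetic under assumed prevalence values, not the paper's procedure; the ten-sample pool mirrors the one-contaminated-plus-nine-clean grouping in the abstract:

```python
def expected_tests_per_sample(prevalence, group_size):
    """Dorfman two-stage pooling: each pool of g samples costs one analysis;
    a positive pool triggers g individual follow-up analyses.
    Returns the expected number of analyses per sample."""
    p_pool_positive = 1 - (1 - prevalence) ** group_size
    return 1 / group_size + p_pool_positive
```

When contamination is rare (say 1% of wipes), pools of ten cut the expected workload to about a fifth of individual counting; at high prevalence pooling costs more than it saves, which is why the approach suits settings with "large numbers of negative results."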
Meeting the family: promoting humanism in gross anatomy.
Crow, Sheila M; O'Donoghue, Dan; Vannatta, Jerry B; Thompson, Britta M
2012-01-01
Human dissection commonly occurs early in the undergraduate medical school curriculum, thus presenting an immediate opportunity for educators to teach and encourage humanistic qualities of respect, empathy, and compassion. The purpose of this study was to measure the impact of the Donor Luncheon, a unique program in which medical students meet the families of the anatomical donor prior to dissection in the anatomy course at the University of Oklahoma College of Medicine. Students were randomized into groups of 8 to attend the luncheon and either met with family of the donor or attended the luncheon with no donor family present. A questionnaire measured students' attitudes at 2 weeks, 6 weeks, and at the conclusion of the anatomy course. Factor analysis revealed 5 scales. Analysis revealed statistically significant differences across time for Donor as Person, Dissection Process, and Donor as Patient and statistically significant differences between groups for Donor as Person and Donor as Patient. These results suggest that this program can provide students with the opportunity to maintain more humanistic attitudes at the beginning of their medical education career.
Argonne National Laboratory Li-alloy/FeS cell testing and R and D programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gay, E.C.
1982-01-01
Groups of 12 or more identical Li-alloy/FeS cells fabricated by Eagle-Picher Industries, Inc. and Gould Inc. were operated at Argonne National Laboratory (ANL) in the status cell test program to obtain data for statistical analysis of cell cycle life and failure modes. The cells were full-size electric vehicle battery cells (150 to 350 Ah capacity) and they were cycled at the 4-h discharge rate and 8-h charge rate. The end of life was defined as a 20% loss of capacity or a decrease in the coulombic efficiency to less than 95%. Seventy-four cells (six groups of identical cells) were cycle-life tested and the results were analyzed statistically. The ultimate goal of this analysis was to predict cell and battery reliability. Testing of groups of identical cells also provided a means of identifying common failure modes which were eliminated by cell design changes. Mean time to failure (MTTF) for the cells based on the Weibull distribution is presented.
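Once Weibull scale and shape parameters have been fitted to cycle-life data like the above, MTTF follows in closed form. A sketch with placeholder parameter values (not ANL's fitted numbers):

```python
from math import gamma, exp

def weibull_mttf(scale_eta, shape_beta):
    """Mean time (or cycles) to failure for a two-parameter Weibull:
    MTTF = eta * Gamma(1 + 1/beta)."""
    return scale_eta * gamma(1 + 1 / shape_beta)

def weibull_reliability(t, scale_eta, shape_beta):
    """Probability that a cell survives beyond t cycles."""
    return exp(-((t / scale_eta) ** shape_beta))
```

With shape beta = 1 the distribution reduces to the exponential and MTTF equals the scale parameter; beta > 1 indicates wear-out failures, which is the usual finding for cycled cells.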
Materials of acoustic analysis: sustained vowel versus sentence.
Moon, Kyung Ray; Chung, Sung Min; Park, Hae Sang; Kim, Han Su
2012-09-01
Sustained vowel is a widely used material of acoustic analysis. However, vowel phonation does not sufficiently demonstrate sentence-based real-life phonation, and biases may occur depending on the test subject's intent during pronunciation. The purpose of this study was to investigate the differences between the results of acoustic analysis using each material. An individual prospective study. Two hundred two individuals (87 men and 115 women) with normal findings in videostroboscopy were enrolled. Acoustic analysis was done using the speech pattern element acquisition and display program. Fundamental frequency (Fx), amplitude (Ax), contact quotient (Qx), jitter, and shimmer were measured with sustained vowel-based acoustic analysis. Average fundamental frequency (FxM), average amplitude (AxM), average contact quotient (QxM), Fx perturbation (CFx), and amplitude perturbation (CAx) were measured with sentence-based acoustic analysis. Corresponding data of the two methods were compared with each other. SPSS (Statistical Package for the Social Sciences, Version 12.0; SPSS, Inc., Chicago, IL) software was used for statistical analysis. FxM was higher than Fx in men (Fx, 124.45 Hz; FxM, 133.09 Hz; P=0.000). In women, FxM seemed to be lower than Fx, but the results were not statistically significant (Fx, 210.58 Hz; FxM, 208.34 Hz; P=0.065). There was no statistical significance between Ax and AxM in both the groups. QxM was higher than Qx in men and women. Jitter was lower in men, but CFx was lower in women. Both shimmer and CAx were higher in men. Sustained vowel phonation could not be a complete substitute for real-time phonation in acoustic analysis. Characteristics of acoustic materials should be considered when choosing the material for acoustic analysis and interpreting the results. Copyright © 2012 The Voice Foundation. Published by Mosby, Inc. All rights reserved.
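Jitter and shimmer, as reported above, are cycle-to-cycle perturbation measures. The sketch below uses the common "local" definitions (mean absolute difference between consecutive glottal periods or amplitudes, relative to the overall mean); the analysis program used in the study may compute them differently:

```python
def jitter_percent(periods):
    """Local jitter: mean absolute difference between consecutive glottal
    periods, relative to the mean period, expressed in percent."""
    diffs = [abs(a - b) for a, b in zip(periods[1:], periods[:-1])]
    return 100 * (sum(diffs) / len(diffs)) / (sum(periods) / len(periods))

def shimmer_percent(amplitudes):
    """Local shimmer: the same perturbation measure applied to the
    peak amplitude of each cycle."""
    return jitter_percent(amplitudes)
```

A perfectly periodic signal yields 0% jitter; alternating periods of 1.0 and 1.1 (arbitrary units) yield roughly 9.5%, far above typical normative values.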
GWAR: robust analysis and meta-analysis of genome-wide association studies.
Dimou, Niki L; Tsirigos, Konstantinos D; Elofsson, Arne; Bagos, Pantelis G
2017-05-15
In the context of genome-wide association studies (GWAS), there is a variety of statistical techniques in order to conduct the analysis, but, in most cases, the underlying genetic model is usually unknown. Under these circumstances, the classical Cochran-Armitage trend test (CATT) is suboptimal. Robust procedures that maximize the power and preserve the nominal type I error rate are preferable. Moreover, performing a meta-analysis using robust procedures is of great interest and has never been addressed in the past. The primary goal of this work is to implement several robust methods for analysis and meta-analysis in the statistical package Stata and subsequently to make the software available to the scientific community. The CATT under a recessive, additive and dominant model of inheritance as well as robust methods based on the Maximum Efficiency Robust Test statistic, the MAX statistic and the MIN2 were implemented in Stata. Concerning MAX and MIN2, we calculated their asymptotic null distributions relying on numerical integration resulting in a great gain in computational time without losing accuracy. All the aforementioned approaches were employed in a fixed or a random effects meta-analysis setting using summary data with weights equal to the reciprocal of the combined cases and controls. Overall, this is the first complete effort to implement procedures for analysis and meta-analysis in GWAS using Stata. A Stata program and a web-server are freely available for academic users at http://www.compgen.org/tools/GWAR. pbagos@compgen.org. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
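The Cochran-Armitage trend test at the center of the abstract reduces to a Z statistic over genotype counts with per-genotype scores, and the MAX statistic is then the largest |Z| over the recessive, additive and dominant score sets. A minimal sketch of both (not the GWAR Stata code; helper names are illustrative):

```python
from math import sqrt

def catt_z(cases, controls, scores=(0, 1, 2)):
    """Cochran-Armitage trend test Z for a 2 x k genotype table.
    scores (0, 1, 2) is the additive model; (0, 0, 1) recessive; (0, 1, 1) dominant."""
    n = [r + s for r, s in zip(cases, controls)]
    N, R = sum(n), sum(cases)
    sxn = sum(x * m for x, m in zip(scores, n))
    T = sum(x * r for x, r in zip(scores, cases))          # observed score sum in cases
    ET = R * sxn / N                                        # expectation under no trend
    var = (R * (N - R) / (N ** 2 * (N - 1))) * (
        N * sum(x * x * m for x, m in zip(scores, n)) - sxn ** 2
    )
    return (T - ET) / sqrt(var)

def max_statistic(cases, controls):
    """MAX: largest |Z| across the three classical inheritance models."""
    return max(abs(catt_z(cases, controls, s))
               for s in ((0, 0, 1), (0, 1, 2), (0, 1, 1)))
```

MAX trades a little power under a known model for robustness when, as the abstract notes, the true genetic model is unknown; its null distribution is non-standard, which is why GWAR evaluates it by numerical integration.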
Sources of Safety Data and Statistical Strategies for Design and Analysis: Clinical Trials.
Zink, Richard C; Marchenko, Olga; Sanchez-Kam, Matilde; Ma, Haijun; Jiang, Qi
2018-03-01
There has been an increased emphasis on the proactive and comprehensive evaluation of safety endpoints to ensure patient well-being throughout the medical product life cycle. In fact, depending on the severity of the underlying disease, it is important to plan for a comprehensive safety evaluation at the start of any development program. Statisticians should be intimately involved in this process and contribute their expertise to study design, safety data collection, analysis, reporting (including data visualization), and interpretation. In this manuscript, we review the challenges associated with the analysis of safety endpoints and describe the safety data that are available to influence the design and analysis of premarket clinical trials. We share our recommendations for the statistical and graphical methodologies necessary to appropriately analyze, report, and interpret safety outcomes, and we discuss the advantages and disadvantages of safety data obtained from clinical trials compared to other sources. Clinical trials are an important source of safety data that contribute to the totality of safety information available to generate evidence for regulators, sponsors, payers, physicians, and patients. This work is a result of the efforts of the American Statistical Association Biopharmaceutical Section Safety Working Group.
Monolithic ceramic analysis using the SCARE program
NASA Technical Reports Server (NTRS)
Manderscheid, Jane M.
1988-01-01
The Structural Ceramics Analysis and Reliability Evaluation (SCARE) computer program calculates the fast fracture reliability of monolithic ceramic components. The code is a post-processor to the MSC/NASTRAN general purpose finite element program. The SCARE program automatically accepts the MSC/NASTRAN output necessary to compute reliability. This includes element stresses, temperatures, volumes, and areas. The SCARE program computes two-parameter Weibull strength distributions from input fracture data for both volume and surface flaws. The distributions can then be used to calculate the reliability of geometrically complex components subjected to multiaxial stress states. Several fracture criteria and flaw types are available for selection by the user, including out-of-plane crack extension theories. The theoretical basis for the reliability calculations was proposed by Batdorf. These models combine linear elastic fracture mechanics (LEFM) with Weibull statistics to provide a mechanistic failure criterion. Other fracture theories included in SCARE are the normal stress averaging technique and the principle of independent action. The objective of this presentation is to summarize these theories, including their limitations and advantages, and to provide a general description of the SCARE program, along with example problems.
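The two-parameter Weibull, weakest-link calculation SCARE performs can be illustrated for the simplest case, a uniformly stressed volume, with element survival probabilities multiplying across the finite element model. This sketch ignores the Batdorf multiaxial machinery and uses placeholder parameters:

```python
from math import exp

def weibull_failure_probability(stress, volume, sigma0, m):
    """Fast-fracture failure probability of a uniformly stressed volume under
    a two-parameter Weibull strength distribution:
    Pf = 1 - exp(-V * (sigma / sigma0)**m)."""
    return 1.0 - exp(-volume * (stress / sigma0) ** m)

def component_reliability(element_loads, sigma0, m):
    """Weakest-link reliability: survival probabilities of the component's
    (stress, volume) elements multiply, as in the principle of independent action."""
    r = 1.0
    for stress, volume in element_loads:
        r *= 1.0 - weibull_failure_probability(stress, volume, sigma0, m)
    return r
```

The steep dependence on the Weibull modulus m is the key point: halving the stress of an m = 10 material drives its element failure probability down by roughly a factor of a thousand.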
Marcelli, E A; Heer, D M
1998-01-01
"Using a unique 1994 Los Angeles County Household Survey of foreign-born Mexicans and the March 1994 and 1995 Current Population Surveys, we estimate the number of unauthorized Mexican immigrants (UMIs) residing in Los Angeles County, and compare their use of seven welfare programs with that of other non-U.S. citizens and U.S. citizens. Non-U.S. citizens were found to be no more likely than U.S. citizens to have used welfare, and UMIs were 11% (14%) less likely than other non-citizens (U.S.-born citizens).... We demonstrate how results differ depending on the unit of analysis employed, and on which programs constitute ¿welfare'." excerpt
The composite sequential clustering technique for analysis of multispectral scanner data
NASA Technical Reports Server (NTRS)
Su, M. Y.
1972-01-01
The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
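Step (2), the generalized K-means refinement, iteratively reassigns observations to the nearest center and recomputes each center as the mean of its assigned points, with the sequential-clustering output of step (1) supplying the initial centers. A sketch under those assumptions (the original program's distance measure and convergence rule are not specified here):

```python
def kmeans_refine(points, centers, iterations=20):
    """Generalized K-means: assign each point to its nearest center (squared
    Euclidean distance), then recompute centers as cluster means. The initial
    centers play the role of the sequential-clustering output."""
    for _ in range(iterations):
        groups = [[] for _ in centers]
        for p in points:
            best = min(range(len(centers)),
                       key=lambda k: sum((a - b) ** 2 for a, b in zip(p, centers[k])))
            groups[best].append(p)
        centers = [
            tuple(sum(c) / len(g) for c in zip(*g)) if g else centers[k]
            for k, g in enumerate(groups)
        ]
    return centers, groups
```

Because the technique is unsupervised, the quality of the initial centers matters; seeding K-means with the sequential variance analysis is what makes the composite scheme competitive with supervised maximum likelihood classification.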
2012 statistical summaries : FTA grant assistance programs.
DOT National Transportation Integrated Search
2013-12-01
The 2012 Statistical Summaries provides information about the Federal Transit Administration's (FTA) major financial aid programs for Federal Fiscal Year (FY) 2012. The report covers the following programs: Urbanized Area Formula, Non-urbanized A...
2011 statistical summaries : FTA grant assistance programs.
DOT National Transportation Integrated Search
2013-05-01
The 2011 Statistical Summaries provides information about the Federal Transit Administration's (FTA) major financial aid programs for Federal Fiscal Year (FY) 2011. The report covers the following programs: Urbanized Area Formula, Non-urbanized Are...
2010 statistical summaries : FTA grant assistance programs.
DOT National Transportation Integrated Search
2013-07-01
The 2010 Statistical Summaries provides information about the Federal Transit Administration's (FTA) major financial aid programs for Federal Fiscal Year (FY) 2010. The report covers the following programs: Urbanized Area Formula, Non-urbanized Are...
McKay, E
2000-01-01
An innovative research program was devised to investigate the interactive effect of instructional strategies enhanced with text-plus-textual metaphors or text-plus-graphical metaphors, and cognitive style, on the acquisition of programming concepts. The Cognitive Styles Analysis (CSA) program (Riding, 1991) was used to establish the participants' cognitive style. The QUEST Interactive Test Analysis System (Adams and Khoo, 1996) provided the cognitive performance measuring tool, which ensured an absence of measurement error in the programming knowledge testing instruments. Therefore, reliability of the instrumentation was assured through the calibration techniques utilized by the QUEST estimates, providing predictability of the research design. A means analysis of the QUEST data, using the Cohen (1977) approach to effect size and statistical power, further quantified the significance of the findings. The experimental methodology adopted for this research links the disciplines of instructional science, cognitive psychology, and objective measurement to provide reliable mechanisms for beneficial use in the evaluation of cognitive performance by the education, training and development sectors. Furthermore, the research outcomes will be of interest to educators, cognitive psychologists, communications engineers, and computer scientists specializing in computer-human interactions.
ERIC Educational Resources Information Center
Martin, Justin D.
2017-01-01
This essay presents data from a census of statistics requirements and offerings at all 4-year journalism programs in the United States (N = 369) and proposes a model of a potential course in statistics for journalism majors. The author proposes that three philosophies underlie a statistics course for journalism students. Such a course should (a)…
Environmental flow allocation and statistics calculator
Konrad, Christopher P.
2011-01-01
The Environmental Flow Allocation and Statistics Calculator (EFASC) is a computer program that calculates hydrologic statistics based on a time series of daily streamflow values. EFASC will calculate statistics for daily streamflow in an input file or will generate synthetic daily flow series from an input file based on rules for allocating and protecting streamflow and then calculate statistics for the synthetic time series. The program reads dates and daily streamflow values from input files. The program writes statistics out to a series of worksheets and text files. Multiple sites can be processed in series as one run. EFASC is written in Microsoft Visual Basic for Applications and implemented as a macro in Microsoft Office Excel 2007. EFASC is intended as a research tool for users familiar with computer programming. The code for EFASC is provided so that it can be modified for specific applications. All users should review how output statistics are calculated and recognize that the algorithms may not comply with conventions used to calculate streamflow statistics published by the U.S. Geological Survey.
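Hydrologic statistics of the kind EFASC writes out can be illustrated with two common daily-flow summaries: the minimum 7-day moving-average flow and a flow-duration (exceedance) quantile. As the abstract cautions about EFASC itself, these simple definitions are illustrative and may not match published USGS conventions:

```python
def seven_day_min(flows):
    """Minimum 7-day moving-average flow, a common low-flow statistic
    computed over a daily streamflow series."""
    means = [sum(flows[i:i + 7]) / 7 for i in range(len(flows) - 6)]
    return min(means)

def exceedance_flow(flows, percent):
    """Flow equalled or exceeded `percent` of the time (e.g. Q90 for
    percent=90), read from the ranked daily record."""
    ranked = sorted(flows, reverse=True)
    idx = min(len(ranked) - 1, int(round(percent / 100 * (len(ranked) - 1))))
    return ranked[idx]
```

Running such functions on a synthetic series generated under an allocation rule, and again on the observed series, is exactly the comparison EFASC is designed to support.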
Güngen, Belma Doğan; Tunç, Abdulkadir; Aras, Yeşim Güzey; Gündoğdu, Aslı Aksoy; Güngen, Adil Can; Bal, Serdar
2017-07-11
The aim of this study was to investigate the predictors of intensive care unit (ICU) admission and mortality among stroke patients and the effects of a pulmonary rehabilitation program on stroke patients. This prospective study enrolled 181 acute ischemic stroke patients aged between 40 and 90 years. Demographical characteristics, laboratory tests, diffusion-weighted magnetic resonance imaging (DWI-MRI) time, nutritional status, vascular risk factors, National Institutes of Health Stroke Scale (NIHSS) scores and modified Rankin scale (MRS) scores were recorded for all patients. One hundred patients participated in the pulmonary rehabilitation program, and the remaining 81 served as a control group. One- and three-month mortality was statistically associated with NIHSS and MRS scores at admission and at three months (p<0.001; r=0.440, r=0.432, r=0.339 and r=0.410, respectively). One- and three-month mortality and ICU admission had statistically significant relationships with parenteral nutrition (p<0.001; r=0.346, r=0.300, respectively; r=0.294 and r=0.294, respectively). Similarly, there was also a statistically significant relationship between pneumonia onset and one- and three-month mortality and ICU admission (p<0.05; r=0.217, r=0.127, r=0.185 and r=0.185, respectively). A regression analysis showed that parenteral nutrition (odds ratio [OR] =13.434, 95% confidence interval [CI] =1.148-157.265, p=0.038) was a significant predictor of ICU admission. The relationship between pulmonary physiotherapy (PPT) and ICU admission and pneumonia onset at the end of three months was statistically significant (p=0.04 and p=0.043, respectively). This study showed that PPT improved the prognosis of ischemic stroke patients. We believe that a pulmonary rehabilitation program, in addition to general stroke rehabilitation programs, can play a critical role in improving survival and functional outcomes. NCT03195907. Trial registration date: 21.06.2017 'Retrospectively registered'.
Zhu, Yuerong; Zhu, Yuelin; Xu, Wei
2008-01-01
Background: Though microarray experiments are very popular in life science research, managing and analyzing microarray data are still challenging tasks for many biologists. Most microarray programs require sophisticated knowledge of mathematics, statistics and computing to use. With accumulating microarray data deposited in public databases, easy-to-use programs to re-analyze previously published microarray data are in high demand. Results: EzArray is a web-based Affymetrix expression array data management and analysis system for researchers who need to organize microarray data efficiently and get data analyzed instantly. EzArray organizes microarray data into projects that can be analyzed online with predefined or custom procedures. EzArray performs data preprocessing and detection of differentially expressed genes with statistical methods. All analysis procedures are optimized and highly automated so that even novice users with limited prior knowledge of microarray data analysis can complete an initial analysis quickly. Since all input files, analysis parameters, and executed scripts can be downloaded, EzArray provides maximum reproducibility for each analysis. In addition, EzArray integrates with Gene Expression Omnibus (GEO) and allows instantaneous re-analysis of published array data. Conclusion: EzArray is a novel Affymetrix expression array data analysis and sharing system. EzArray provides easy-to-use tools for re-analyzing published microarray data and will help both novice and experienced users perform initial analysis of their microarray data from the location of data storage. We believe EzArray will be a useful system for facilities with microarray services and laboratories with multiple members involved in microarray data analysis. EzArray is freely available from . PMID:18218103
Barbero, Marina M. D.; Oliveira, Henrique N.; de Camargo, Gregório M. F.; Fernandes Júnior, Gerardo A.; Aspilcueta-Borquis, Rusbel R.; Souza, Fabio R. P.; Boligon, Arione A.; Melo, Thaise P.; Regatieri, Inaê C.; Feitosa, Fabieli L. B.; Fonseca, Larissa F. S.; Magalhães, Ana F. B.; Costa, Raphael B.; Albuquerque, Lucia G.
2018-01-01
Reproductive traits are of the utmost importance for any livestock farming, but are difficult to measure and to interpret since they are influenced by various factors. The objective of this study was to detect associations between known polymorphisms in candidate genes related to sexual precocity in Nellore heifers, which could be used in breeding programs. Records of 1,689 precocious and non-precocious heifers from farms participating in the Conexão Delta G breeding program were analyzed. A subset of single nucleotide polymorphisms (SNPs) located in the region of the candidate genes, at a distance of up to 5 kb from the boundaries of each gene, was selected from the panel of 777,000 SNPs of the High-Density Bovine SNP BeadChip. Linear mixed models were used for statistical analysis of early heifer pregnancy, relating the trait with isolated SNPs or with haplotype groups. The model included the contemporary group (year and month of birth) as fixed effect and parent of the animal (sire effect) as random effect. The fastPHASE® and GenomeStudio® programs were used for reconstruction of the haplotypes and for analysis of linkage disequilibrium based on r² statistics. A total of 125 candidate genes and 2,024 SNPs forming haplotypes were analyzed. Statistical analysis after Bonferroni correction showed that nine haplotypes exerted a significant effect (p<0.05) on sexual precocity. Four of these haplotypes were located in the pregnancy-associated plasma protein-A2 (PAPP-A2) gene, two in the estrogen-related receptor gamma (ESRRG) gene, and one each in the pregnancy-associated plasma protein-A (PAPP-A), Kell blood group complex subunit-related family (XKR4) and mannose-binding lectin (MBL-1) genes. Although the present results indicate that the PAPP-A2, PAPP-A, XKR4, MBL-1 and ESRRG genes influence sexual precocity in Nellore heifers, further studies are needed to evaluate their possible use in breeding programs. PMID:29293544
Takada, Luciana; Barbero, Marina M D; Oliveira, Henrique N; de Camargo, Gregório M F; Fernandes Júnior, Gerardo A; Aspilcueta-Borquis, Rusbel R; Souza, Fabio R P; Boligon, Arione A; Melo, Thaise P; Regatieri, Inaê C; Feitosa, Fabieli L B; Fonseca, Larissa F S; Magalhães, Ana F B; Costa, Raphael B; Albuquerque, Lucia G
2018-01-01
Reproductive traits are of the utmost importance for any livestock farming, but are difficult to measure and to interpret since they are influenced by various factors. The objective of this study was to detect associations between known polymorphisms in candidate genes related to sexual precocity in Nellore heifers, which could be used in breeding programs. Records of 1,689 precocious and non-precocious heifers from farms participating in the Conexão Delta G breeding program were analyzed. A subset of single nucleotide polymorphisms (SNPs) located in the region of the candidate genes, at a distance of up to 5 kb from the boundaries of each gene, was selected from the panel of 777,000 SNPs of the High-Density Bovine SNP BeadChip. Linear mixed models were used for statistical analysis of early heifer pregnancy, relating the trait with isolated SNPs or with haplotype groups. The model included the contemporary group (year and month of birth) as fixed effect and parent of the animal (sire effect) as random effect. The fastPHASE® and GenomeStudio® programs were used for reconstruction of the haplotypes and for analysis of linkage disequilibrium based on r² statistics. A total of 125 candidate genes and 2,024 SNPs forming haplotypes were analyzed. Statistical analysis after Bonferroni correction showed that nine haplotypes exerted a significant effect (p<0.05) on sexual precocity. Four of these haplotypes were located in the pregnancy-associated plasma protein-A2 (PAPP-A2) gene, two in the estrogen-related receptor gamma (ESRRG) gene, and one each in the pregnancy-associated plasma protein-A (PAPP-A), Kell blood group complex subunit-related family (XKR4) and mannose-binding lectin (MBL-1) genes. Although the present results indicate that the PAPP-A2, PAPP-A, XKR4, MBL-1 and ESRRG genes influence sexual precocity in Nellore heifers, further studies are needed to evaluate their possible use in breeding programs.
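As a generic illustration of the Bonferroni correction applied in the two studies above, the sketch below adjusts a set of raw p-values for multiple comparisons. The function name and the toy p-values are hypothetical, not data from the papers:

```python
def bonferroni(p_values, alpha=0.05):
    """Adjust raw p-values for m comparisons by multiplying each by m
    (capped at 1.0) and report which remain significant at level alpha."""
    m = len(p_values)
    adjusted = [min(p * m, 1.0) for p in p_values]
    significant = [p_adj < alpha for p_adj in adjusted]
    return adjusted, significant

raw = [0.001, 0.01, 0.03, 0.20]          # hypothetical raw p-values
adj, sig = bonferroni(raw)               # adj -> [0.004, 0.04, 0.12, 0.8]
```

Multiplying by the number of tests is equivalent to comparing each raw p-value against alpha/m, the form in which the correction is usually stated.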
NASA Astrophysics Data System (ADS)
Ghannadpour, Seyyed Saeed; Hezarkhani, Ardeshir
2016-03-01
The U-statistic method is one of the most important structural methods for separating anomalies from the background. It considers the location of samples, carries out the statistical analysis of the data without judging from a geochemical point of view, and tries to separate subpopulations and determine anomalous areas. In the present study, to use the U-statistic method in three-dimensional (3D) conditions, the U-statistic is applied to the grades of two ideal test examples, taking sample Z values (elevation) into account. To the best of our knowledge, this is the first time the method has been applied in 3D. To evaluate the performance of the 3D U-statistic method and to compare the U-statistic with a non-structural method, the threshold assessment method based on the median and standard deviation (MSD method) is applied to the same two test examples. Results show that the samples indicated as anomalous by the U-statistic method are more regular and less dispersed than those indicated by the MSD method, so that denser clusters of anomalous samples can be delineated as promising zones. Moreover, results show that at a threshold of U = 0, the total misclassification error of the U-statistic method is much smaller than that of the x̄ + n × s criterion. Finally, a 3D model of the two test examples separating anomaly from background using the 3D U-statistic method is provided. The source code of a software program, developed in the MATLAB programming language to perform the calculations of the 3D U-spatial statistic method, is additionally provided. This software is compatible with all geochemical varieties and can be used in similar exploration projects.
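The non-structural x̄ + n × s threshold that the study compares against can be sketched in a few lines. This is a generic illustration with hypothetical grade values, not the authors' MATLAB code:

```python
from statistics import mean, pstdev

def msd_anomalies(values, n=2):
    """Flag samples exceeding the mean + n * standard deviation threshold
    (the non-structural criterion written above as x̄ + n × s)."""
    threshold = mean(values) + n * pstdev(values)
    return [v > threshold for v in values], threshold

# Hypothetical grade values with one obvious anomaly
flags, thr = msd_anomalies([1.0, 1.2, 0.9, 1.1, 1.0, 5.0])
```

Because the threshold ignores sample locations entirely, anomalous points flagged this way can be scattered, which is the dispersion drawback the abstract attributes to the MSD-style criteria.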
1998 statistical summaries : Federal Transit Administration : grant assistance programs
DOT National Transportation Integrated Search
1999-03-01
The 1998 Statistical Summaries provides information about the Federal Transit Administration's (FTA) major financial aid programs for Federal Fiscal Year (FY) 1998. The report covers the following programs: Urbanized Area Formula, Non-urbanized Area ...
NASA Astrophysics Data System (ADS)
Delyana, H.; Rismen, S.; Handayani, S.
2018-04-01
This research is a development study using the 4-D design model (define, design, develop, and disseminate). At the define stage, needs were examined through syllabus analysis, textbook analysis, student characteristics analysis and literature analysis. The textbook analysis showed that students still had difficulty understanding the two textbooks they are required to own, that the form of presentation did not help students learn independently to discover concepts, and that the textbooks were not equipped with data-processing guidance using the software R. The developed module was judged valid by the experts. Field trials were then conducted to determine its practicality and effectiveness. The trial was conducted with four randomly selected students of the Mathematics Education Study Program of STKIP PGRI who had not yet taken the Basic Statistics course. The practical aspects considered were ease of use, time efficiency, ease of interpretation, and equivalence, with practical values of 3.7, 3.79, 3.7 and 3.78, respectively. Based on the trial results, students considered the module very practical for use in learning. This means that the developed module can be used by students in Elementary Statistics learning.
Management system of occupational diseases in Korea: statistics, report and monitoring system.
Rhee, Kyung Yong; Choe, Seong Weon
2010-12-01
The management system of occupational diseases in Korea can be assessed from the perspective of a surveillance system. Workers' compensation insurance reports are used to produce official statistics on occupational diseases in Korea. National working conditions surveys are used to monitor the magnitude of work-related symptoms and signs in the labor force. A health examination program was introduced to detect occupational diseases through both selective and mass screening programs. The Working Environment Measurement Institution assesses workers' exposure to hazards in the workplace. Under the Occupational Safety and Health Act, the government requires employers to conduct health examinations and working-environment measurements through contracted private agencies. It is hoped that these institutions may be able to effectively detect and monitor occupational diseases and hazards in the workplace. In view of this, the occupational disease management system in Korea is well designed, except for the national survey system. In the future, national surveys for the detection of hazards and ill-health outcomes in workers should be developed. The existing surveillance system for occupational disease can be improved by providing more refined information through statistical analysis of surveillance data.
SHAREv2: fluctuations and a comprehensive treatment of decay feed-down
NASA Astrophysics Data System (ADS)
Torrieri, G.; Jeon, S.; Letessier, J.; Rafelski, J.
2006-11-01
This is the user's manual for SHARE version 2. SHARE [G. Torrieri, S. Steinke, W. Broniowski, W. Florkowski, J. Letessier, J. Rafelski, Comput. Phys. Comm. 167 (2005) 229] (Statistical Hadronization with Resonances) is a collection of programs designed for the statistical analysis of particle production in relativistic heavy-ion collisions. While the structure of the program remains similar to v1.x, v2 provides several new features such as evaluation of statistical fluctuations of particle yields, and a greater versatility, in particular regarding decay feed-down and input/output structure. This article describes all the new features, with emphasis on statistical fluctuations. Program summary: Title of program: SHAREv2. Catalogue identifier: ADVD_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVD_v2_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computer: PC, Pentium III, 512 MB RAM; not hardware dependent. Operating system: Linux (RedHat 6.1, 7.2, FEDORA, etc.); not system dependent. Programming language: FORTRAN77. Size of the package: 167 KB directory, without libraries (see http://wwwasdoc.web.cern.ch/wwwasdoc/minuit/minmain.html, http://wwwasd.web.cern.ch/wwwasd/cernlib.html for details on library requirements). Number of lines in distributed program, including test data, etc.: 26 101. Number of bytes in distributed program, including test data, etc.: 170 346. Distribution format: tar.gzip file. Computer: Any computer with an f77 compiler. Nature of the physical problem: Event-by-event fluctuations have been recognized as the physical observable capable of constraining particle production models. Therefore, consideration of event-by-event fluctuations is required for a decisive falsification or constraining of (variants of) particle production models based on (grand-, micro-) canonical statistical mechanics phase space, the so-called statistical hadronization models (SHM).
As in the case of particle yields, to properly compare model calculations to data it is necessary to consistently take into account resonance decays. However, event-by-event fluctuations are more sensitive than particle yields to experimental acceptance issues, and a range of techniques needs to be implemented to extract 'physical' fluctuations from an experimental event-by-event measurement. Method of solving the problem: The techniques used within the SHARE suite of programs [G. Torrieri, S. Steinke, W. Broniowski, W. Florkowski, J. Letessier, J. Rafelski, Comput. Phys. Comm. 167 (2005) 229; SHAREv1] are updated and extended to fluctuations. A full particle data-table, decay tree, and set of experimental feed-down coefficients are provided. Unlike SHAREv1.x, experimental acceptance feed-down coefficients can be entered for any resonance decay. SHAREv2 can calculate yields, fluctuations, and bulk properties of the fireball from provided thermal parameters; alternatively, parameters can be obtained from fits to experimental data, via the MINUIT fitting algorithm [F. James, M. Roos, Comput. Phys. Comm. 10 (1975) 343]. Fits can also be analyzed for significance, parameter and data point sensitivity. Averages and fluctuations at freeze-out of both the stable particles and the hadronic resonances are set according to a statistical prescription, calculated via a series of Bessel functions, using CERN library programs. We also have the option of including finite particle widths of the resonances. A χ² minimization algorithm, also from the CERN library programs, is used to perform and analyze the fit. Please see SHAREv1 for more details on these. Purpose: The vast amount of high quality soft hadron production data, from experiments running at the SPS, RHIC, in the past at the AGS, and in the near future at the LHC, offers the opportunity for statistical particle production model falsification.
This task has turned out to be difficult when considering solely particle yields addressed in the context of SHAREv1.x. For this reason physical conditions at freeze-out remain contested. Inclusion in the analysis of event-by-event fluctuations appears to resolve this issue. Similarly, a thorough analysis including both fluctuations and average multiplicities gives a way to explore the presence and strength of interactions following hadronization (when hadrons form), ending with thermal freeze-out (when all interactions cease). SHAREv2 with fluctuations will also help determine which statistical ensemble (if any), e.g., canonical or grand-canonical, is more physically appropriate for analyzing a given system. Together with resonances, fluctuations can also be used for a direct estimate of the extent to which the system re-interacts between chemical and thermal freeze-out. We hope and expect that SHAREv2 will contribute to deciding whether any of the statistical hadronization model variants has a genuine physical connection to hadron particle production. Computation time survey: In the FORTRAN version we encounter computation times of up to seconds for the evaluation of particle yields. These rise by up to a factor of 300 in the process of minimization, and by a further factor of a few when χ²/N profiles and contours with chemical non-equilibrium are requested. Summary of new features (w.r.t. SHAREv1.x): Fluctuations: In addition to particle yields, ratios and bulk quantities, SHAREv2 can calculate, fit and analyze statistical fluctuations of particles and particle ratios. Decays: SHAREv2 has the flexibility to account for any experimental method of allowing for decay feed-down to the particle yields. Charm flavor: Charmed particles have been added to the decay tree, allowing as an option the study of statistical hadronization of J/ψ, χ, D, etc.
Quark chemistry: Chemical non-equilibrium yields for both u and d flavors, as opposed to generically light quarks q, are considered; η-η′ mixing, etc., are properly dealt with, and chemical non-equilibrium can be studied for each flavor separately. Misc: Many new commands and features have been introduced and added to the basic user interface. For example, it is possible to study combinations of particles and their ratios. It is also possible to combine all the input files into one file. SHARE compatibility and manual: This write-up is an update and extension of SHAREv1. The user should consult SHAREv1 regarding the principles of the user interface and for all particle-yield-related physics and program instructions, other than the parameter additions and minor changes described here. SHAREv2 is downward compatible with respect to the changes of the user interface, offering the user of SHAREv1 computer-generated revised input files compatible with SHAREv2.
"Hyperstat": an educational and working tool in epidemiology.
Nicolosi, A
1995-01-01
The work of a researcher in epidemiology is based on studying literature, planning studies, gathering data, analyzing data and writing results. He therefore needs to perform more or less simple calculations, to consult or quote the literature, to consult textbooks about certain issues or procedures, and to look up specific formulas. There are no programs conceived as a workstation to assist the different aspects of a researcher's work in an integrated fashion. A hypertextual system was developed which supports different stages of the epidemiologist's work. It combines database management, statistical analysis and planning, and literature searches. The software was developed on Apple Macintosh using Hypercard 2.1 as a database and HyperTalk as a programming language. The program is structured in 7 "stacks" or files: Procedures; Statistical Tables; Graphs; References; Text; Formulas; Help. Each stack has its own management system with an automated table of contents. Stacks contain "cards" which make up the databases and carry executable programs. The programs are of four kinds: association; statistical procedure; formatting (input/output); database management. The system performs general statistical procedures, procedures applicable to epidemiological studies only (follow-up and case-control), and procedures for clinical trials. All commands are given by clicking the mouse on self-explanatory "buttons". In order to perform calculations, the user only needs to enter the data into the appropriate cells and then click on the selected procedure's button. The system has a hypertextual structure. The user can move from one procedure to another following the preferred order of succession and according to built-in associations. The user can access different levels of knowledge or information from any stack he is consulting or operating.
From every card, the user can go to a selected procedure to perform statistical calculations, to the reference database management system, to the textbook in which all procedures and issues are discussed in detail, to the database of statistical formulas with an automated table of contents, to statistical tables with an automated table of contents, or to the help module. The program has a very user-friendly interface and leaves the user free to use the same format he would use on paper. The interface does not require special skills. It reflects the Macintosh philosophy of using windows, buttons and the mouse. This allows the user to perform complicated calculations without losing the "feel" of the data, to weigh alternatives, and to run simulations. This program shares many features with hypertexts. It has an underlying network database whose nodes consist of text, graphics, executable procedures, and combinations of these; the nodes in the database correspond to windows on the screen; the links between the nodes are visible as "active" text or icons in the windows; the text is read by following links and opening new windows. The program is especially useful as an educational tool directed at medical and epidemiology students. The combination of computing capabilities with a textbook and databases of formulas and literature references makes the program versatile and attractive as a learning tool. The program is also helpful in the work done at the desk, where the researcher examines results, consults the literature, explores different analytic approaches, plans new studies, or writes grant proposals and scientific articles.
EHME: a new word database for research in Basque language.
Acha, Joana; Laka, Itziar; Landa, Josu; Salaburu, Pello
2014-11-14
This article presents EHME, the frequency dictionary of Basque structure, an online program that enables researchers in psycholinguistics to extract word and nonword stimuli based on a broad range of statistics concerning the properties of Basque words. The database consists of 22.7 million tokens, and available properties include morphological structure frequency and word-similarity measures, apart from classical indexes: word frequency, orthographic structure, orthographic similarity, bigram and biphone frequency, and syllable-based measures. Measures are indexed at the lemma, morpheme and word level. We include reliability and validation analyses. The application is freely available, and enables the user to extract words based on concrete statistical criteria, as well as to obtain statistical characteristics from a list of words.
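As a rough illustration of one of the classical indexes listed above, token-weighted letter-bigram frequencies can be computed from a word-frequency list. The function and the toy counts below are hypothetical, not EHME data:

```python
from collections import Counter

def bigram_frequencies(word_counts):
    """Token-weighted letter-bigram frequencies: each word contributes its
    corpus count to every adjacent letter pair it contains."""
    bigrams = Counter()
    for word, count in word_counts.items():
        for a, b in zip(word, word[1:]):
            bigrams[a + b] += count
    return bigrams

# Hypothetical toy counts for two Basque words
freqs = bigram_frequencies({"etxe": 10, "mendi": 4})
```

A type-based variant would simply add 1 per word instead of the corpus count; databases of this kind typically report both.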
McNeill, Marjorie H
2009-01-01
The purpose of this research study was to determine whether the administration of a comprehensive examination before graduation increases the percentage of students passing the Registered Health Information Administrator certification examination. A t-test for independent means yielded a statistically significant difference between the Registered Health Information Administrator certification examination pass rates of health information administration programs that administer a comprehensive examination and programs that do not administer a comprehensive examination. Programs with a high certification examination pass rate do not require a comprehensive examination when compared with those programs with a lower pass rate. It is concluded that health information administration faculty at the local level should perform program self-analysis to improve student progress toward achievement of learning outcomes and entry-level competencies.
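The t-test for independent means mentioned above can be sketched as a pooled-variance t statistic. The helper and the pass-rate figures below are hypothetical illustrations, not data from the study:

```python
from statistics import mean, variance

def t_independent(sample_a, sample_b):
    """Pooled-variance t statistic for two independent samples, the kind of
    test used above to compare pass rates between two groups of programs."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)       # sample variances
    pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    se = (pooled * (1 / na + 1 / nb)) ** 0.5
    return (mean(sample_a) - mean(sample_b)) / se

# Hypothetical pass-rate percentages for two groups of programs
t = t_independent([82, 88, 90, 85], [70, 75, 72, 68])
```

The resulting t would then be compared against the t distribution with na + nb - 2 degrees of freedom to obtain a p-value.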