Pleiotropy Analysis of Quantitative Traits at Gene Level by Multivariate Functional Linear Models
Wang, Yifan; Liu, Aiyi; Mills, James L.; Boehnke, Michael; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Xiong, Momiao; Wu, Colin O.; Fan, Ruzong
2015-01-01
In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai–Bartlett trace, Hotelling–Lawley trace, and Wilks’s Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. PMID:25809955
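To make the three test criteria named above concrete, the following sketch runs a classical MANOVA of two simulated traits on genotype and covariate columns and prints Wilks' lambda, Pillai's trace, and the Hotelling-Lawley trace with their approximate F statistics. It is not the authors' functional linear model; the variable names, effect sizes, and the use of statsmodels are assumptions made purely for illustration.

```python
# Hypothetical sketch: classical MANOVA test statistics (Pillai's trace,
# Hotelling-Lawley trace, Wilks' lambda) for several quantitative traits
# regressed on genotype scores -- NOT the functional linear model of the paper.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "g1": rng.binomial(2, 0.3, n),      # additive genotype codes (assumed)
    "g2": rng.binomial(2, 0.1, n),
    "age": rng.normal(40, 10, n),       # covariate
})
# two correlated traits with a weak effect of g1
df["trait1"] = 0.2 * df["g1"] + 0.01 * df["age"] + rng.normal(size=n)
df["trait2"] = 0.5 * df["trait1"] + rng.normal(size=n)

mv = MANOVA.from_formula("trait1 + trait2 ~ g1 + g2 + age", data=df)
print(mv.mv_test())   # reports Wilks, Pillai, Hotelling-Lawley, Roy with approximate F
```

The printed table gives, for each regressor, all four classical multivariate criteria and their F approximations, which is the univariate-software analogue of testing several traits jointly rather than one at a time.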
Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan
2016-01-01
The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
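A minimal sketch of the clustered-versus-distributed comparison, reduced to a single factor and an ordinary quadratic (response-surface) fit; the true response, noise level, and point budgets are invented, and no D-optimal design construction is attempted.

```python
# Illustrative sketch (not the NASA study's code): compare a "clustered" design
# (many replicates at few conditions) with a "distributed" design (single points
# at many conditions) for recovering an assumed quadratic response surface.
import numpy as np

rng = np.random.default_rng(1)

def true_response(x):
    return 1.0 + 2.0 * x - 0.5 * x**2          # assumed true response

def fit_design(x):
    y = true_response(x) + rng.normal(scale=0.2, size=x.size)   # noisy test data
    return np.polyfit(x, y, deg=2)                               # quadratic RSM fit

clustered   = np.repeat([0.0, 2.0, 4.0], 8)     # 24 points at 3 conditions
distributed = np.linspace(0.0, 4.0, 24)         # 24 points at 24 conditions

grid = np.linspace(0.0, 4.0, 101)
for name, x in [("clustered", clustered), ("distributed", distributed)]:
    coef = fit_design(x)
    rmse = np.sqrt(np.mean((np.polyval(coef, grid) - true_response(grid)) ** 2))
    print(f"{name:12s} RMSE of fitted surface: {rmse:.3f}")
```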
Modified Distribution-Free Goodness-of-Fit Test Statistic.
Chun, So Yeon; Browne, Michael W; Shapiro, Alexander
2018-03-01
Covariance structure analysis and its structural equation modeling extensions have become one of the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) of the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic could perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.
An accurate test for homogeneity of odds ratios based on Cochran's Q-statistic.
Kulinskaya, Elena; Dollinger, Michael B
2015-06-10
A frequently used statistic for testing homogeneity in a meta-analysis of K independent studies is Cochran's Q. For a standard test of homogeneity the Q statistic is referred to a chi-square distribution with K-1 degrees of freedom. For the situation in which the effects of the studies are logarithms of odds ratios, the chi-square distribution is much too conservative for moderate size studies, although it may be asymptotically correct as the individual studies become large. Using a mixture of theoretical results and simulations, we provide formulas to estimate the shape and scale parameters of a gamma distribution to fit the distribution of Q. Simulation studies show that the gamma distribution is a good approximation to the distribution for Q. Use of the gamma distribution instead of the chi-square distribution for Q should eliminate inaccurate inferences in assessing homogeneity in a meta-analysis. (A computer program for implementing this test is provided.) This hypothesis test is competitive with the Breslow-Day test both in accuracy of level and in power.
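The sketch below computes Cochran's Q for a handful of made-up log odds ratios together with the standard chi-square p-value that the paper argues is too conservative; the gamma-approximation shape and scale formulas proposed by the authors are not reproduced.

```python
# Minimal sketch of Cochran's Q for log odds ratios from K independent studies.
import numpy as np
from scipy import stats

log_or  = np.array([0.31, 0.10, -0.05, 0.42, 0.22])   # per-study log odds ratios (made up)
var_lor = np.array([0.04, 0.09, 0.06, 0.05, 0.08])    # their estimated variances (made up)

w = 1.0 / var_lor
theta_bar = np.sum(w * log_or) / np.sum(w)             # fixed-effect pooled estimate
Q = np.sum(w * (log_or - theta_bar) ** 2)
k = len(log_or)
p_chi2 = stats.chi2.sf(Q, df=k - 1)                    # standard (often inaccurate) reference
print(f"Q = {Q:.3f}, chi-square p = {p_chi2:.3f}")
```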
Analytic Considerations and Design Basis for the IEEE Distribution Test Feeders
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schneider, K. P.; Mather, B. A.; Pal, B. C.
For nearly 20 years the Test Feeder Working Group of the Distribution System Analysis Subcommittee has been developing openly available distribution test feeders for use by researchers. The purpose of these test feeders is to provide models of distribution systems that reflect the wide diversity in design and their various analytic challenges. Because of their utility and accessibility, the test feeders have been used for a wide range of research, some of which has been outside the original scope of intended uses. This paper provides an overview of the existing distribution feeder models and clarifies the specific analytic challenges that they were originally designed to examine. Additionally, the paper will provide guidance on which feeders are best suited for various types of analysis. The purpose of this paper is to provide the original intent of the Working Group and to provide the information necessary so that researchers may make an informed decision on which of the test feeders are most appropriate for their work.
Detecting Non-Gaussian and Lognormal Characteristics of Temperature and Water Vapor Mixing Ratio
NASA Astrophysics Data System (ADS)
Kliewer, A.; Fletcher, S. J.; Jones, A. S.; Forsythe, J. M.
2017-12-01
Many operational data assimilation and retrieval systems assume that the errors and variables come from a Gaussian distribution. This study builds upon previous results showing that positive definite variables, specifically water vapor mixing ratio and temperature, can follow a non-Gaussian distribution, and more specifically a lognormal distribution. Previously, statistical testing procedures which included the Jarque-Bera test, the Shapiro-Wilk test, the Chi-squared goodness-of-fit test, and a composite test which incorporated the results of the former tests were employed to determine locations and time spans where atmospheric variables assume a non-Gaussian distribution. These tests are now investigated in a "sliding window" fashion in order to extend the testing procedure to near real-time. The analyzed 1-degree resolution data come from the National Oceanic and Atmospheric Administration (NOAA) Global Forecast System (GFS) six-hour forecast from the 0Z analysis. These results indicate the necessity for a Data Assimilation (DA) system to be able to properly use the lognormally-distributed variables in an appropriate Bayesian analysis that does not assume the variables are Gaussian.
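A hedged sketch of the sliding-window idea described above, applying the Jarque-Bera and Shapiro-Wilk tests to a synthetic series; the window length, step, and significance level are arbitrary choices, and the study's composite test and GFS data handling are not reproduced.

```python
# Illustrative sliding-window normality screen on a synthetic series whose second
# half is deliberately lognormal.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
series = np.concatenate([rng.normal(285.0, 2.0, 300),                     # Gaussian segment
                         rng.lognormal(mean=1.0, sigma=0.6, size=300)])   # lognormal segment

window, step, alpha = 100, 50, 0.05
for start in range(0, series.size - window + 1, step):
    x = series[start:start + window]
    _, p_jb = stats.jarque_bera(x)
    _, p_sw = stats.shapiro(x)
    flag = "non-Gaussian" if min(p_jb, p_sw) < alpha else "Gaussian-like"
    print(f"window starting at {start:4d}: JB p={p_jb:.3f}, SW p={p_sw:.3f} -> {flag}")
```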
Comparative analysis through probability distributions of a data set
NASA Astrophysics Data System (ADS)
Cristea, Gabriel; Constantinescu, Dan Mihai
2018-02-01
In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, and demography. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. A number of statistical methods are available to help select the best-fitting model. Some of the graphs display both input data and fitted distributions at the same time, as probability density and cumulative distribution. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution and to compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to order the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions, and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
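The following sketch illustrates the comparison described above on synthetic data: two candidate families are fitted and screened with Kolmogorov-Smirnov, chi-square, and Anderson-Darling measures. The data, the binning choice, and the lack of a degrees-of-freedom correction for estimated parameters are simplifications for illustration only.

```python
# Illustrative goodness-of-fit comparison of two candidate distributions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
data = rng.gamma(shape=2.0, scale=1.5, size=1000)       # synthetic data set

for dist in (stats.norm, stats.gamma):
    params = dist.fit(data)
    p_ks = stats.kstest(data, dist.name, args=params).pvalue
    # chi-square GOF on ten equal-probability bins under the fitted model
    # (degrees of freedom are not adjusted here for the estimated parameters)
    edges = dist.ppf(np.linspace(0.0, 1.0, 11), *params)
    edges[0], edges[-1] = data.min() - 1.0, data.max() + 1.0   # catch all observations
    observed, _ = np.histogram(data, bins=edges)
    p_chi2 = stats.chisquare(observed).pvalue                  # equal expected counts by construction
    print(f"{dist.name:6s}  KS p={p_ks:.4f}  chi-square p={p_chi2:.4f}")

# Anderson-Darling in SciPy supports a fixed set of null families, e.g. the normal:
print(stats.anderson(data, dist="norm"))
```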
HammerCloud: A Stress Testing System for Distributed Analysis
NASA Astrophysics Data System (ADS)
van der Ster, Daniel C.; Elmsheuser, Johannes; Úbeda García, Mario; Paladin, Massimo
2011-12-01
Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain at a steady state a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large-scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running jobs and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).
An Analysis of Variance Approach for the Estimation of Response Time Distributions in Tests
ERIC Educational Resources Information Center
Attali, Yigal
2010-01-01
Generalizability theory and analysis of variance methods are employed, together with the concept of objective time pressure, to estimate response time distributions and the degree of time pressure in timed tests. By estimating response time variance components due to person, item, and their interaction, and fixed effects due to item types and…
Univariate and Bivariate Loglinear Models for Discrete Test Score Distributions.
ERIC Educational Resources Information Center
Holland, Paul W.; Thayer, Dorothy T.
2000-01-01
Applied the theory of exponential families of distributions to the problem of fitting the univariate histograms and discrete bivariate frequency distributions that often arise in the analysis of test scores. Considers efficient computation of the maximum likelihood estimates of the parameters using Newton's Method and computationally efficient…
Performance of concrete members subjected to large hydrocarbon pool fires
Zwiers, Renata I.; Morgan, Bruce J.
1989-01-01
The authors discuss an investigation to determine analytically if the performance of concrete beams and columns in a hydrocarbon pool test fire would differ significantly from their performance in a standard test fire. The investigation consisted of a finite element analysis to obtain temperature distributions in typical cross sections, a comparison of the resulting temperature distribution in the cross section, and a strength analysis of a beam based on temperature distribution data. Results of the investigation are reported.
A method for developing design diagrams for ceramic and glass materials using fatigue data
NASA Technical Reports Server (NTRS)
Heslin, T. M.; Magida, M. B.; Forrest, K. A.
1986-01-01
The service lifetime of glass and ceramic materials can be expressed as a plot of time-to-failure versus applied stress that is parametric in percent probability of failure. This type of plot is called a design diagram. Confidence interval estimates for such plots depend on the type of test that is used to generate the data, on assumptions made concerning the statistical distribution of the test results, and on the type of analysis used. This report outlines the development of design diagrams for glass and ceramic materials in engineering terms using static or dynamic fatigue tests, assuming either no particular statistical distribution of test results or a Weibull distribution, and using either median-value or homologous-ratio analysis of the test results.
Analysis of shear test method for composite laminates
NASA Technical Reports Server (NTRS)
Bergner, H. W., Jr.; Davis, J. G., Jr.; Herakovich, C. T.
1977-01-01
An elastic plane stress finite element analysis of the stress distributions in four flat test specimens for in-plane shear response of composite materials subjected to mechanical or thermal loads is presented. The shear test specimens investigated include: slotted coupon, cross beam, Iosipescu, and rail shear. Results are presented in the form of normalized shear contour plots for all three in-plane stress components. It is shown that the cross beam, Iosipescu, and rail shear specimens have stress distributions which are more than adequate for determining linear shear behavior of composite materials. Laminate properties, core effects, and fixture configurations are among the factors which were found to influence the stress distributions.
Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong
2016-01-12
The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis.
Distributions of Characteristic Roots in Multivariate Analysis
1976-07-01
Distributions of the characteristic roots, studied by various authors, have been briefly discussed. Such distributional properties of four test criteria and a few less important ones, which are functions of the characteristic roots, have further been discussed in view of the power comparisons made in connection with tests of three multivariate hypotheses. In addition, the one-sample case has also been considered in terms of distributional aspects of the characteristic roots and criteria for tests of two hypotheses on the ...
ERIC Educational Resources Information Center
Texeira, Antonio; Rosa, Alvaro; Calapez, Teresa
2009-01-01
This article presents statistical power analysis (SPA) based on the normal distribution using Excel, adopting textbook and SPA approaches. The objective is to present the latter in a comparative way within a framework that is familiar to textbook level readers, as a first step to understand SPA with other distributions. The analysis focuses on the…
NEAT: an efficient network enrichment analysis test.
Signorelli, Mirko; Vinciotti, Veronica; Wit, Ernst C
2016-09-05
Network enrichment analysis is a powerful method which integrates gene enrichment analysis with the information on relationships between genes provided by gene networks. Existing tests for network enrichment analysis deal only with undirected networks, can be computationally slow, and are based on normality assumptions. We propose NEAT, a test for network enrichment analysis. The test is based on the hypergeometric distribution, which naturally arises as the null distribution in this context. NEAT can be applied not only to undirected, but also to directed and partially directed networks. Our simulations indicate that NEAT is considerably faster than alternative resampling-based methods, and that its capacity to detect enrichments is at least as good as that of alternative tests. We discuss applications of NEAT to network analyses in yeast by testing for enrichment of the Environmental Stress Response target gene set with GO Slim and KEGG functional gene sets, and also by inspecting associations between functional sets themselves. NEAT is a flexible and efficient test for network enrichment analysis that aims to overcome some limitations of existing resampling-based tests. The method is implemented in the R package neat, which can be freely downloaded from CRAN ( https://cran.r-project.org/package=neat ).
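As a pointer to the distribution underlying NEAT, the sketch below evaluates a hypergeometric enrichment p-value for a simple set-overlap case; NEAT itself tests counts of network links between gene sets, so the numbers and the setting here are illustrative assumptions, not the package's implementation.

```python
# Hedged sketch: a hypergeometric enrichment p-value, the null distribution NEAT builds on,
# applied to the simplest case of overlap between a query gene set A and a functional set B.
from scipy import stats

N = 6000        # genes in the universe (assumed)
K = 300         # genes annotated to the functional set B (assumed)
n = 120         # genes in the query set A (assumed)
k = 18          # genes of A annotated to B (assumed)

# P(overlap >= k) under random sampling without replacement
p_enrich = stats.hypergeom.sf(k - 1, N, K, n)
print(f"expected overlap = {n * K / N:.1f}, observed = {k}, p = {p_enrich:.2e}")
```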
NASA Astrophysics Data System (ADS)
Khanmohammadi, Neda; Rezaie, Hossein; Montaseri, Majid; Behmanesh, Javad
2017-10-01
The reference evapotranspiration (ET0) plays an important role in water management plans in arid or semi-arid countries such as Iran. For this reason, the regional analysis of this parameter is important. However, the ET0 process is affected by several meteorological parameters such as wind speed, solar radiation, temperature and relative humidity. Therefore, the effect of the distribution type of the effective meteorological variables on the ET0 distribution was analyzed. For this purpose, the regional probability distributions of the annual ET0 and its effective parameters were selected. The data used in this research were recorded at 30 synoptic stations of Iran during 1960-2014. Using the probability plot correlation coefficient (PPCC) test and the L-moment method, five common distributions were compared and the best distribution was selected. The results of the PPCC test and the L-moment diagram indicated that the Pearson type III distribution was the best probability distribution for fitting annual ET0 and its four effective parameters. The RMSE results showed that the abilities of the PPCC test and the L-moment method for regional analysis of reference evapotranspiration and its effective parameters were similar. The results also showed that the distribution type of the parameters which affect ET0 values can affect the distribution of reference evapotranspiration.
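A small sketch of a PPCC screen in the spirit of the study: for each candidate family, the correlation coefficient of the probability plot is computed and the family with the largest value would be retained. The synthetic ET0 values, the candidate list, and the use of ordinary MLE fitting (rather than L-moments) are assumptions for illustration.

```python
# Illustrative probability-plot correlation coefficient (PPCC) comparison.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
annual_et0 = rng.gamma(shape=9.0, scale=150.0, size=55)   # synthetic annual ET0 values (mm)

candidates = {
    "normal":    stats.norm,
    "gumbel":    stats.gumbel_r,
    "pearson3":  stats.pearson3,
    "lognormal": stats.lognorm,
}
for name, dist in candidates.items():
    params = dist.fit(annual_et0)
    shape_params = params[:-2]          # pass only shape parameters; r is invariant to loc/scale
    (osm, osr), (slope, intercept, r) = stats.probplot(annual_et0, sparams=shape_params, dist=dist)
    print(f"{name:10s} PPCC r = {r:.4f}")
```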
Ho, Andrew D; Yu, Carol C
2015-06-01
Many statistical analyses benefit from the assumption that unconditional or conditional distributions are continuous and normal. More than 50 years ago in this journal, Lord and Cook chronicled departures from normality in educational tests, and Micceri similarly showed that the normality assumption is met rarely in educational and psychological practice. In this article, the authors extend these previous analyses to state-level educational test score distributions that are an increasingly common target of high-stakes analysis and interpretation. Among 504 scale-score and raw-score distributions from state testing programs from recent years, nonnormal distributions are common and are often associated with particular state programs. The authors explain how scaling procedures from item response theory lead to nonnormal distributions as well as unusual patterns of discreteness. The authors recommend that distributional descriptive statistics be calculated routinely to inform model selection for large-scale test score data, and they illustrate consequences of nonnormality using sensitivity studies that compare baseline results to those from normalized score scales.
Mathur, Sunil; Sadana, Ajit
2015-12-01
We present a rank-based test statistic for the identification of differentially expressed genes using a distance measure. The proposed test statistic is highly robust against extreme values and does not assume a distribution for the parent population. Simulation studies show that the proposed test is more powerful than some of the commonly used methods, such as the paired t-test, the Wilcoxon signed rank test, and significance analysis of microarrays (SAM), under certain non-normal distributions. The asymptotic distribution of the test statistic and the p-value function are discussed. The application of the proposed method is shown using a real-life data set. © The Author(s) 2011.
Xu, Maoqi; Chen, Liang
2018-01-01
The individual sample heterogeneity is one of the biggest obstacles in biomarker identification for complex diseases such as cancers. Current statistical models to identify differentially expressed genes between disease and control groups often overlook the substantial human sample heterogeneity. Meanwhile, traditional nonparametric tests lose detailed data information and sacrifice the analysis power, although they are distribution free and robust to heterogeneity. Here, we propose an empirical likelihood ratio test with a mean-variance relationship constraint (ELTSeq) for the differential expression analysis of RNA sequencing (RNA-seq). As a distribution-free nonparametric model, ELTSeq handles individual heterogeneity by estimating an empirical probability for each observation without making any assumption about read-count distribution. It also incorporates a constraint for the read-count overdispersion, which is widely observed in RNA-seq data. ELTSeq demonstrates a significant improvement over existing methods such as edgeR, DESeq, t-tests, Wilcoxon tests and the classic empirical likelihood-ratio test when handling heterogeneous groups. It will significantly advance the transcriptomics studies of cancers and other complex diseases. © The Author 2016. Published by Oxford University Press. All rights reserved.
Model-Driven Test Generation of Distributed Systems
NASA Technical Reports Server (NTRS)
Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin
2012-01-01
This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.
Analysis of quantitative data obtained from toxicity studies showing non-normal distribution.
Kobayashi, Katsumi
2005-05-01
The data obtained from toxicity studies are examined for homogeneity of variance but, usually, they are not examined for normal distribution. In this study I examined the measured items of a carcinogenicity/chronic toxicity study with rats for both homogeneity of variance and normal distribution. It was observed that many hematology and biochemistry items showed a non-normal distribution. For testing the normal distribution of data obtained from toxicity studies, the data of the concurrent control group may be examined, and for data that show a non-normal distribution, robust non-parametric tests may be applied.
NASA Technical Reports Server (NTRS)
Lameris, J.
1984-01-01
The development of a thermal and structural model for a hypersonic wing test structure using the NASTRAN finite-element method as its primary analytical tool is described. A detailed analysis was defined to obtain the temperature and thermal stress distribution in the whole wing as well as the five upper and lower root panels. During the development of the models, it was found that the thermal application of NASTRAN and the VIEW program, used for the generation of the radiation exchange coefficients, were deficient. Although solutions could be found for most of these deficiencies, the existence of one particular deficiency in the current thermal model prevented the final computation of the temperature distributions. A SPAR analysis of a single bay of the wing, using data converted from the original NASTRAN model, indicates that local temperature-time distributions can be obtained with good agreement with the test data. The conversion of the NASTRAN thermal model into a SPAR model is recommended to meet the immediate goal of obtaining an accurate thermal stress distribution.
Distributed analysis functional testing using GangaRobot in the ATLAS experiment
NASA Astrophysics Data System (ADS)
Legger, Federica; ATLAS Collaboration
2011-12-01
Automated distributed analysis tests are necessary to ensure smooth operations of the ATLAS grid resources. The HammerCloud framework allows for easy definition, submission and monitoring of grid test applications. Both functional and stress test applications can be defined in HammerCloud. Stress tests are large-scale tests meant to verify the behaviour of sites under heavy load. Functional tests are light user applications running at each site with high frequency, to ensure that the site functionalities are available at all times. Success or failure rates of these test jobs are individually monitored. Test definitions and results are stored in a database and made available to users and site administrators through a web interface. In this work we present the recent developments of the GangaRobot framework. GangaRobot monitors the outcome of functional tests, creates a blacklist of sites failing the tests, and exports the results to the ATLAS Site Status Board (SSB) and to the Service Availability Monitor (SAM), providing on the one hand a fast way to identify systematic or temporary site failures, and on the other hand allowing for an effective distribution of the work load on the available resources.
Optimum structural design based on reliability and proof-load testing
NASA Technical Reports Server (NTRS)
Shinozuka, M.; Yang, J. N.
1969-01-01
A proof-load test eliminates structures with strength less than the proof load and improves the reliability value used in analysis. It truncates the distribution function of strength at the proof load, thereby alleviating the need to verify a fitted distribution function at the lower tail portion, where data are usually nonexistent.
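A worked sketch of the truncation idea, assuming a Weibull strength distribution and arbitrary proof and service loads: conditioning on survival of the proof load truncates the strength distribution and lowers the probability of failure at the service stress.

```python
# Worked sketch (not from the paper): effect of proof-load truncation on reliability.
from scipy import stats

strength = stats.weibull_min(c=8.0, scale=100.0)   # assumed strength distribution (MPa)
p, s = 70.0, 90.0                                  # assumed proof load and service stress

pof_untested = strength.cdf(s)                                         # P(strength <= s)
pof_proofed  = (strength.cdf(s) - strength.cdf(p)) / strength.sf(p)    # conditional on surviving proof
print(f"P(failure at {s} MPa): no proof test {pof_untested:.3f}, "
      f"after proof test at {p} MPa {pof_proofed:.3f}")
```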
Probabilistic thermal-shock strength testing using infrared imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wereszczak, A.A.; Scheidt, R.A.; Ferber, M.K.
1999-12-01
A thermal-shock strength-testing technique has been developed that uses a high-resolution, high-temperature infrared camera to capture a specimen's surface temperature distribution at fracture. Aluminum nitride (AlN) substrates are thermally shocked to fracture to demonstrate the technique. The surface temperature distribution for each test and AlN's thermal expansion are used as input in a finite-element model to determine the thermal-shock strength for each specimen. An uncensored thermal-shock strength Weibull distribution is then determined. The test and analysis algorithm show promise as a means to characterize thermal shock strength of ceramic materials.
Chou, C P; Bentler, P M; Satorra, A
1991-11-01
Research studying robustness of maximum likelihood (ML) statistics in covariance structure analysis has concluded that test statistics and standard errors are biased under severe non-normality. An estimation procedure known as asymptotic distribution free (ADF), making no distributional assumption, has been suggested to avoid these biases. Corrections to the normal theory statistics to yield more adequate performance have also been proposed. This study compares the performance of a scaled test statistic and robust standard errors for two models under several non-normal conditions and also compares these with the results from ML and ADF methods. Both ML and ADF test statistics performed rather well in one model and considerably worse in the other. In general, the scaled test statistic seemed to behave better than the ML test statistic and the ADF statistic performed the worst. The robust and ADF standard errors yielded more appropriate estimates of sampling variability than the ML standard errors, which were usually downward biased, in both models under most of the non-normal conditions. ML test statistics and standard errors were found to be quite robust to the violation of the normality assumption when data had either symmetric and platykurtic distributions, or non-symmetric and zero kurtotic distributions.
Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.
Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao
2015-08-01
Meta-analysis of genetic data must account for differences among studies including study designs, markers genotyped, and covariates. The effects of genetic variants may differ from population to population, i.e., heterogeneity. Thus, meta-analysis of combining data of multiple studies is difficult. Novel statistical methods for meta-analysis are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing methods of the meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies. Copyright © 2015 by the Genetics Society of America.
NASA Technical Reports Server (NTRS)
Podwysocki, M. H.
1976-01-01
A study was made of the field size distributions for LACIE test sites 5029, 5033, and 5039, People's Republic of China. Field lengths and widths were measured from LANDSAT imagery, and field area was statistically modeled. Field size parameters have log-normal or Poisson frequency distributions. These were normalized to the Gaussian distribution and theoretical population curves were made. When compared to fields in other areas of the same country measured in the previous study, field lengths and widths in the three LACIE test sites were 2 to 3 times smaller and areas were smaller by an order of magnitude.
Pinilla, Jaime; López-Valcárcel, Beatriz G; González-Martel, Christian; Peiro, Salvador
2018-05-09
Newcomb-Benford's Law (NBL) proposes a regular distribution for first digits, second digits and digit combinations applicable to many different naturally occurring sources of data. Testing deviations from NBL is used in many datasets as a screening tool for identifying data trustworthiness problems. This study aims to compare publicly available waiting lists (WL) data from Finland and Spain for testing NBL as an instrument to flag up potential manipulation in WLs. Analysis of the frequency of Finnish and Spanish WLs first digits to determine if their distribution is similar to the pattern documented by NBL. Deviations from the expected first digit frequency were analysed using Pearson's χ², mean absolute deviation and Kuiper tests. Publicly available WL data from Finland and Spain, two countries with universal health insurance and National Health Systems but characterised by different levels of transparency and good governance standards. Adjustment of the observed distribution of the numbers reported in Finnish and Spanish WL data to the expected distribution according to NBL. WL data reported by the Finnish health system fits first digit NBL according to all statistical tests used (p=0.6519 in the χ² test). For Spanish data, this hypothesis was rejected in all tests (p<0.0001 in the χ² test). Testing deviations from NBL distribution can be a useful tool to identify problems with WL data trustworthiness and signalling the need for further testing. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
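A minimal first-digit screen in the spirit of the study is sketched below, using invented waiting-list counts; the mean absolute deviation and Kuiper statistics used in the paper are omitted, and real datasets would be far larger than this toy example.

```python
# Illustrative Newcomb-Benford first-digit chi-square screen on made-up WL figures.
import numpy as np
from scipy import stats

counts = np.array([123, 87, 1450, 232, 19, 310, 98, 5120, 47, 260,
                   1020, 330, 76, 189, 440, 92, 150, 610, 28, 170])   # invented counts

first_digits = np.array([int(str(abs(c))[0]) for c in counts])
observed = np.bincount(first_digits, minlength=10)[1:10]              # counts for digits 1..9
benford_p = np.log10(1.0 + 1.0 / np.arange(1, 10))                    # P(first digit = d)
expected = benford_p * first_digits.size

chi2, p = stats.chisquare(observed, f_exp=expected)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
```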
Formulation of the Multi-Hit Model With a Non-Poisson Distribution of Hits
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vassiliev, Oleg N., E-mail: Oleg.Vassiliev@albertahealthservices.ca
2012-07-15
Purpose: We proposed a formulation of the multi-hit single-target model in which the Poisson distribution of hits was replaced by a combination of two distributions: one for the number of particles entering the target and one for the number of hits a particle entering the target produces. Such an approach reflects the fact that radiation damage is a result of two different random processes: particle emission by a radiation source and interaction of particles with matter inside the target. Methods and Materials: Poisson distribution is well justified for the first of the two processes. The second distribution depends on how a hit is defined. To test our approach, we assumed that the second distribution was also a Poisson distribution. The two distributions combined resulted in a non-Poisson distribution. We tested the proposed model by comparing it with previously reported data for DNA single- and double-strand breaks induced by protons and electrons, for survival of a range of cell lines, and variation of the initial slopes of survival curves with radiation quality for heavy-ion beams. Results: Analysis of cell survival equations for this new model showed that they had realistic properties overall, such as the initial and high-dose slopes of survival curves, the shoulder, and relative biological effectiveness (RBE). In most cases tested, a better fit of survival curves was achieved with the new model than with the linear-quadratic model. The results also suggested that the proposed approach may extend the multi-hit model beyond its traditional role in analysis of survival curves to predicting effects of radiation quality and analysis of DNA strand breaks. Conclusions: Our model, although conceptually simple, performed well in all tests. The model was able to consistently fit data for both cell survival and DNA single- and double-strand breaks. It correctly predicted the dependence of radiation effects on parameters of radiation quality.
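The following sketch simulates the compound (non-Poisson) hit distribution described above: a Poisson number of particles per target, each producing a Poisson number of hits, with survival taken as the fraction of targets receiving fewer than n hits. All parameter values are arbitrary and the comparison to the linear-quadratic model is not reproduced.

```python
# Hedged simulation sketch of a compound-Poisson multi-hit single-target model.
import numpy as np

rng = np.random.default_rng(5)

def survival_fraction(dose, n_hit=2, particles_per_gy=3.0, hits_per_particle=1.2,
                      n_cells=200_000):
    particles = rng.poisson(particles_per_gy * dose, size=n_cells)   # particles per target
    hits = rng.poisson(hits_per_particle * particles)                # compound total hits
    return np.mean(hits < n_hit)                                     # target survives with < n_hit hits

for dose in [0.5, 1.0, 2.0, 4.0, 8.0]:
    print(f"dose {dose:4.1f} Gy -> surviving fraction {survival_fraction(dose):.3f}")
```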
Privacy-preserving Kruskal-Wallis test.
Guo, Suxin; Zhong, Sheng; Zhang, Aidong
2013-10-01
Statistical tests are powerful tools for data analysis. Kruskal-Wallis test is a non-parametric statistical test that evaluates whether two or more samples are drawn from the same distribution. It is commonly used in various areas. But sometimes, the use of the method is impeded by privacy issues raised in fields such as biomedical research and clinical data analysis because of the confidential information contained in the data. In this work, we give a privacy-preserving solution for the Kruskal-Wallis test which enables two or more parties to coordinately perform the test on the union of their data without compromising their data privacy. To the best of our knowledge, this is the first work that solves the privacy issues in the use of the Kruskal-Wallis test on distributed data. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
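For reference, the plain (non-private) Kruskal-Wallis test on pooled data looks like the sketch below; the paper's contribution is a protocol that lets the parties obtain the same result without revealing their raw samples, which is not reproduced here.

```python
# Standard Kruskal-Wallis H test on three made-up samples (non-private baseline).
from scipy import stats

site_a = [12.1, 14.3, 11.8, 15.0, 13.2]
site_b = [16.4, 15.9, 17.1, 16.0, 18.2]
site_c = [12.9, 13.5, 14.1, 12.2, 13.8]

H, p = stats.kruskal(site_a, site_b, site_c)
print(f"H = {H:.2f}, p = {p:.4f}")
```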
Stochastic Analysis of Wind Energy for Wind Pump Irrigation in Coastal Andhra Pradesh, India
NASA Astrophysics Data System (ADS)
Raju, M. M.; Kumar, A.; Bisht, D.; Rao, D. B.
2014-09-01
The rapid escalation in the prices of oil and gas, as well as the increasing demand for energy, has attracted the attention of scientists and researchers to explore the possibility of generating and utilizing the alternative and renewable source of wind energy along the long coastal belt of India, which has considerable wind energy resources. A detailed analysis of wind potential is a prerequisite to harvest the wind energy resources efficiently. Keeping this in view, the present study was undertaken to analyze the wind energy potential to assess the feasibility of a wind-pump operated irrigation system in the coastal region of Andhra Pradesh, India, where high ground water table conditions are available. The wind speed data were fitted to probability distributions to describe the wind energy potential in the region. The normal and Weibull probability distributions were tested, and on the basis of the chi-square test, the Weibull distribution gave better results. Hence, it was concluded that the Weibull probability distribution may be used to stochastically describe the annual wind speed data of coastal Andhra Pradesh with better accuracy. The size of the complete irrigation system was determined using mass curve analysis to satisfy various daily irrigation demands at different risk levels.
Test Protocol for Room-to-Room Distribution of Outside Air by Residential Ventilation Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barley, C. D.; Anderson, R.; Hendron, B.
2007-12-01
This test and analysis protocol has been developed as a practical approach for measuring outside air distribution in homes. It has been used successfully in field tests and has led to significant insights on ventilation design issues. Performance advantages of more sophisticated ventilation systems over simpler, less-costly designs have been verified, and specific problems, such as airflow short-circuiting, have been identified.
Statistical analysis of multivariate atmospheric variables. [cloud cover
NASA Technical Reports Server (NTRS)
Tubbs, J. D.
1979-01-01
Topics covered include: (1) estimation in discrete multivariate distributions; (2) a procedure to predict cloud cover frequencies in the bivariate case; (3) a program to compute conditional bivariate normal parameters; (4) the transformation of nonnormal multivariate to near-normal; (5) test of fit for the extreme value distribution based upon the generalized minimum chi-square; (6) test of fit for continuous distributions based upon the generalized minimum chi-square; (7) effect of correlated observations on confidence sets based upon chi-square statistics; and (8) generation of random variates from specified distributions.
NASA Astrophysics Data System (ADS)
Coelho, Carlos A.; Marques, Filipe J.
2013-09-01
In this paper the authors combine the equicorrelation and equivariance test introduced by Wilks [13] with the likelihood ratio test (l.r.t.) for independence of groups of variables to obtain the l.r.t. of block equicorrelation and equivariance. This test, or its single-block version, may find applications in many areas such as psychology, education, medicine and genetics, and is important "in many tests of multivariate analysis, e.g. in MANOVA, Profile Analysis, Growth Curve analysis, etc" [12, 9]. By decomposing the overall hypothesis into the hypothesis of independence of groups of variables and the hypothesis of equicorrelation and equivariance, we are able to obtain the expressions for the overall l.r.t. statistic and its moments. From these we obtain a suitable factorization of the characteristic function (c.f.) of the logarithm of the l.r.t. statistic, which enables us to develop highly manageable and precise near-exact distributions for the test statistic.
Strain Modal Analysis of Small and Light Pipes Using Distributed Fibre Bragg Grating Sensors
Huang, Jun; Zhou, Zude; Zhang, Lin; Chen, Juntao; Ji, Chunqian; Pham, Duc Truong
2016-01-01
Vibration fatigue failure is a critical problem of hydraulic pipes under severe working conditions. Strain modal testing of small and light pipes is a good option for dynamic characteristic evaluation, structural health monitoring and damage identification. Unique features such as small size, light weight, and high multiplexing capability enable Fibre Bragg Grating (FBG) sensors to measure structural dynamic responses where sensor size and placement are critical. In this paper, experimental strain modal analysis of pipes using distributed FBG sensors is presented. Strain modal analysis and parameter identification methods are introduced. Experimental strain modal testing and finite element analysis for a cantilever pipe have been carried out. The analysis results indicate that the natural frequencies and strain mode shapes of the tested pipe acquired by FBG sensors are in good agreement with the results obtained by a reference accelerometer and simulation outputs. The strain modal parameters of a hydraulic pipe were obtained by the proposed strain modal testing method. FBG sensors have been shown to be useful in the experimental strain modal analysis of small and light pipes in mechanical, aeronautic and aerospace applications. PMID:27681728
Adaptive linear rank tests for eQTL studies
Szymczak, Silke; Scheinhardt, Markus O.; Zeller, Tanja; Wild, Philipp S.; Blankenberg, Stefan; Ziegler, Andreas
2013-01-01
Expression quantitative trait loci (eQTL) studies are performed to identify single-nucleotide polymorphisms that modify average expression values of genes, proteins, or metabolites, depending on the genotype. As expression values are often not normally distributed, statistical methods for eQTL studies should be valid and powerful in these situations. Adaptive tests are promising alternatives to standard approaches, such as the analysis of variance or the Kruskal–Wallis test. In a two-stage procedure, skewness and tail length of the distributions are estimated and used to select one of several linear rank tests. In this study, we compare two adaptive tests that were proposed in the literature using extensive Monte Carlo simulations of a wide range of different symmetric and skewed distributions. We derive a new adaptive test that combines the advantages of both literature-based approaches. The new test does not require the user to specify a distribution. It is slightly less powerful than the locally most powerful rank test for the correct distribution and at least as powerful as the maximin efficiency robust rank test. We illustrate the application of all tests using two examples from different eQTL studies. PMID:22933317
Research on the novel FBG detection system for temperature and strain field distribution
NASA Astrophysics Data System (ADS)
Liu, Zhi-chao; Yang, Jin-hua
2017-10-01
In order to collect temperature and strain field distribution information, a novel FBG detection system was designed. The system uses a linearly chirped FBG structure to obtain a large bandwidth. The cover of the FBG was designed with a linearly varying thickness so that different locations along the grating respond differently, allowing the temperature and the strain field distribution to be obtained simultaneously from the reflection spectrum. The structure of the cover was designed and its theoretical function was calculated, and a solution was derived for the strain field distribution. Simulation analysis examined the trends of the temperature and strain field distributions under different strain strengths and loading positions, showing that the strain field distribution can be resolved. In the experiments, FOB100 series equipment was used to measure temperature and JSM-A10 series equipment was used to measure the strain field distribution. The average error of the experimental results was better than 1.1% for temperature and better than 1.3% for strain, with individual larger errors when the strain was small. The feasibility of the system is demonstrated by theoretical analysis, simulation and experiment, and it is well suited to practical applications.
Yoganandan, Narayan; Arun, Mike W J; Pintar, Frank A; Szabo, Aniko
2014-01-01
Derive optimum injury probability curves to describe human tolerance of the lower leg using parametric survival analysis. The study reexamined lower leg postmortem human subjects (PMHS) data from a large group of specimens. Briefly, axial loading experiments were conducted by impacting the plantar surface of the foot. Both injury and noninjury tests were included in the testing process. They were identified by pre- and posttest radiographic images and detailed dissection following the impact test. Fractures included injuries to the calcaneus and distal tibia-fibula complex (including pylon), representing severities at the Abbreviated Injury Score (AIS) level 2+. For the statistical analysis, peak force was chosen as the main explanatory variable and age was chosen as the covariable. Censoring statuses depended on experimental outcomes. Parameters from the parametric survival analysis were estimated using the maximum likelihood approach, and the dfbetas statistic was used to identify overly influential samples. The best fit from the Weibull, log-normal, and log-logistic distributions was chosen based on the Akaike information criterion. Plus and minus 95% confidence intervals were obtained for the optimum injury probability distribution. The relative sizes of the intervals were determined at predetermined risk levels. Quality indices were described at each of the selected probability levels. The mean age, stature, and weight were 58.2±15.1 years, 1.74±0.08 m, and 74.9±13.8 kg, respectively. Excluding all overly influential tests resulted in the tightest confidence intervals. The Weibull distribution was the most optimum function compared to the other 2 distributions. A majority of quality indices were in the good category for this optimum distribution when results were extracted for 25-, 45-, and 65-year-olds at the 5, 25, and 50% risk levels for lower leg fracture. For ages 25, 45, and 65 years, peak forces were 8.1, 6.5, and 5.1 kN at 5% risk; 9.6, 7.7, and 6.1 kN at 25% risk; and 10.4, 8.3, and 6.6 kN at 50% risk, respectively. This study derived axial loading-induced injury risk curves based on survival analysis using peak force and specimen age, adopting different censoring schemes, considering overly influential samples in the analysis, and assessing the quality of the distribution at discrete probability levels. Because the procedures used in the present survival analysis are accepted by international automotive communities, the current optimum human injury probability distributions can be used at all risk levels with more confidence in future crashworthiness applications for automotive and other disciplines.
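A hedged sketch of the type of survival fit described above: a Weibull risk curve estimated by maximum likelihood from peak-force data, treating injury tests as exact observations and non-injury tests as right-censored (one common convention), ignoring the age covariate and the dfbetas screening used in the study. All data values are invented.

```python
# Illustrative censored Weibull maximum-likelihood fit of an injury risk curve.
import numpy as np
from scipy import stats, optimize

force  = np.array([5.2, 6.1, 6.8, 7.4, 8.0, 8.8, 9.5, 10.2, 11.0, 12.1])  # peak force, kN (invented)
injury = np.array([0,   0,   1,   0,   1,   1,   0,   1,    1,    1])     # 1 = fracture observed

def neg_log_lik(params):
    shape, scale = np.exp(params)                          # enforce positivity via log-parameters
    dist = stats.weibull_min(c=shape, scale=scale)
    # exact likelihood contribution for injuries, survivor function for censored tests
    ll = np.where(injury == 1, dist.logpdf(force), dist.logsf(force))
    return -np.sum(ll)

res = optimize.minimize(neg_log_lik, x0=np.log([5.0, 10.0]), method="Nelder-Mead")
shape, scale = np.exp(res.x)
for risk in (0.05, 0.25, 0.50):
    print(f"{risk:.0%} risk of fracture at {stats.weibull_min(c=shape, scale=scale).ppf(risk):.1f} kN")
```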
The source of electrostatic fluctuations in the solar-wind
NASA Technical Reports Server (NTRS)
Lemons, D. S.; Asbridge, J. R.; Bame, S. J.; Feldman, W. C.; Gary, S. P.; Gosling, J. T.
1979-01-01
Solar wind electron and ion distribution functions measured simultaneously with or close to times of intense electrostatic fluctuations are subjected to a linear Vlasov stability analysis. Although all distributions tested were found to be stable, the analysis suggests that the ion beam instability is the most likely source of the fluctuations.
Interfacial stress state present in a 'thin-slice' fibre push-out test
NASA Technical Reports Server (NTRS)
Kallas, M. N.; Koss, D. A.; Hahn, H. T.; Hellmann, J. R.
1992-01-01
An analysis of the stress distributions along the fiber-matrix interface in a 'thin-slice' fiber push-out test is presented for selected test geometries. For the small specimen thicknesses often required to displace large-diameter fibers with high interfacial shear strengths, finite element analysis indicates that large bending stresses may be present. The magnitude of these stresses and their spatial distribution can be very sensitive to the test configuration. For certain test geometries, the specimen configuration itself may alter the interfacial failure process from one which initiates due to a maximum in shear stress near the top surface adjacent to the indentor, to one which involves mixed mode crack growth up from the bottom surface and/or yielding within the matrix near the interface.
ERIC Educational Resources Information Center
Umesh, U. N.; Mishra, Sanjay
1990-01-01
Major issues related to index-of-fit conjoint analysis were addressed in this simulation study. Goals were to develop goodness-of-fit criteria for conjoint analysis; develop tests to determine the significance of conjoint analysis results; and calculate the power of the test of the null hypothesis of random data distribution. (SLD)
Tips and Tricks for Successful Application of Statistical Methods to Biological Data.
Schlenker, Evelyn
2016-01-01
This chapter discusses experimental design and use of statistics to describe characteristics of data (descriptive statistics) and inferential statistics that test the hypothesis posed by the investigator. Inferential statistics, based on probability distributions, depend upon the type and distribution of the data. For data that are continuous, randomly and independently selected, as well as normally distributed more powerful parametric tests such as Student's t test and analysis of variance (ANOVA) can be used. For non-normally distributed or skewed data, transformation of the data (using logarithms) may normalize the data allowing use of parametric tests. Alternatively, with skewed data nonparametric tests can be utilized, some of which rely on data that are ranked prior to statistical analysis. Experimental designs and analyses need to balance between committing type 1 errors (false positives) and type 2 errors (false negatives). For a variety of clinical studies that determine risk or benefit, relative risk ratios (random clinical trials and cohort studies) or odds ratios (case-control studies) are utilized. Although both use 2 × 2 tables, their premise and calculations differ. Finally, special statistical methods are applied to microarray and proteomics data, since the large number of genes or proteins evaluated increase the likelihood of false discoveries. Additional studies in separate samples are used to verify microarray and proteomic data. Examples in this chapter and references are available to help continued investigation of experimental designs and appropriate data analysis.
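As a rough illustration of the decision process described above (not part of the chapter), the following Python sketch tests each group for normality, tries a log transform, and falls back to a nonparametric test; the two groups are simulated.

```python
# Sketch (not from the chapter): choosing a two-group test based on normality.
# Group data are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.lognormal(mean=1.0, sigma=0.5, size=30)   # skewed data
treated = rng.lognormal(mean=1.3, sigma=0.5, size=30)

def compare_groups(a, b, alpha=0.05):
    # If both groups look normal (Shapiro-Wilk), use Student's t test;
    # otherwise try a log transform, and fall back to Mann-Whitney U.
    if stats.shapiro(a).pvalue > alpha and stats.shapiro(b).pvalue > alpha:
        return "t-test", stats.ttest_ind(a, b, equal_var=False).pvalue
    la, lb = np.log(a), np.log(b)
    if stats.shapiro(la).pvalue > alpha and stats.shapiro(lb).pvalue > alpha:
        return "t-test on log data", stats.ttest_ind(la, lb, equal_var=False).pvalue
    return "Mann-Whitney U", stats.mannwhitneyu(a, b, alternative="two-sided").pvalue

print(compare_groups(control, treated))
```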
NASA Technical Reports Server (NTRS)
Scudder, Nathan F
1934-01-01
This report presents the results of an investigation of the spinning characteristics of the NY-1 naval training biplane. The results of flight tests and an analysis based on wind-tunnel test data are given and compared. The primary purpose of the investigation was the determination in flight of the effect of changes in mass distribution along the longitudinal axis, without change of mass quantity or centroid. Other effects were also investigated, such as those due to wing loading, center-of-gravity position, dihedral of wings, control setting, and the removal of a large portion of the fabric from the fin and rudder. The wind-tunnel test results used in the numerical analysis were obtained in the 7- by 10-foot wind tunnel over a range of angles of attack.
Testing the anisotropy in the angular distribution of Fermi/GBM gamma-ray bursts
NASA Astrophysics Data System (ADS)
Tarnopolski, M.
2017-12-01
Gamma-ray bursts (GRBs) were confirmed to be of extragalactic origin due to their isotropic angular distribution, combined with the fact that they exhibited an intensity distribution that deviated strongly from the -3/2 power law. This finding was later confirmed with the first redshift, equal to at least z = 0.835, measured for GRB970508. Despite this result, the data from CGRO/BATSE and Swift/BAT indicate that long GRBs are indeed distributed isotropically, but the distribution of short GRBs is anisotropic. Fermi/GBM has detected 1669 GRBs to date, and their sky distribution is examined in this paper. A number of statistical tests are applied: nearest neighbour analysis, fractal dimension, dipole and quadrupole moments of the distribution function decomposed into spherical harmonics, binomial test and the two-point angular correlation function. Monte Carlo benchmark testing of each test is performed in order to evaluate its reliability. It is found that short GRBs are distributed anisotropically in the sky, and long ones have an isotropic distribution. The probability that these results are not a chance occurrence is equal to at least 99.98 per cent and 30.68 per cent for short and long GRBs, respectively. The cosmological context of this finding and its relation to large-scale structures is discussed.
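The following is a minimal Python sketch of one of the statistics listed above, the dipole moment of the angular distribution, benchmarked against isotropic Monte Carlo skies; the burst positions here are simulated, not the Fermi/GBM catalogue.

```python
# Sketch of a dipole-moment isotropy test, using simulated sky positions.
import numpy as np

def unit_vectors(ra, dec):
    # ra, dec in radians -> Cartesian unit vectors
    return np.column_stack([np.cos(dec) * np.cos(ra),
                            np.cos(dec) * np.sin(ra),
                            np.sin(dec)])

def dipole_amplitude(ra, dec):
    return np.linalg.norm(unit_vectors(ra, dec).mean(axis=0))

rng = np.random.default_rng(1)
n = 1669                                    # number of GRBs quoted above
ra = rng.uniform(0.0, 2.0 * np.pi, n)
dec = np.arcsin(rng.uniform(-1.0, 1.0, n))  # isotropic "observed" sample here
observed = dipole_amplitude(ra, dec)

# Monte Carlo benchmark: distribution of the dipole amplitude under isotropy
mc = np.array([dipole_amplitude(rng.uniform(0, 2 * np.pi, n),
                                np.arcsin(rng.uniform(-1, 1, n)))
               for _ in range(2000)])
p_value = np.mean(mc >= observed)           # fraction of isotropic skies at least as dipolar
print(observed, p_value)
```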
Efficient Blockwise Permutation Tests Preserving Exchangeability
Zhou, Chunxiao; Zwilling, Chris E.; Calhoun, Vince D.; Wang, Michelle Y.
2014-01-01
In this paper, we present a new blockwise permutation test approach based on the moments of the test statistic. The method is of importance to neuroimaging studies. In order to preserve the exchangeability condition required in permutation tests, we divide the entire set of data into certain exchangeability blocks. In addition, computationally efficient moments-based permutation tests are performed by approximating the permutation distribution of the test statistic with the Pearson distribution series. This involves the calculation of the first four moments of the permutation distribution within each block and then over the entire set of data. The accuracy and efficiency of the proposed method are demonstrated through simulated experiment on the magnetic resonance imaging (MRI) brain data, specifically the multi-site voxel-based morphometry analysis from structural MRI (sMRI). PMID:25289113
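A minimal Python sketch of the blockwise idea described above, using an empirical within-block permutation of group labels rather than the moments/Pearson-series approximation; the data and block structure are simulated.

```python
# Blockwise permutation test: labels are permuted only within exchangeability
# blocks (e.g., sites), and the group-mean difference is the test statistic.
import numpy as np

rng = np.random.default_rng(2)
n_blocks, per_block = 4, 20
block = np.repeat(np.arange(n_blocks), per_block)
group = np.tile(np.r_[np.zeros(per_block // 2), np.ones(per_block // 2)], n_blocks)
y = rng.normal(0, 1, block.size) + 0.4 * group + 0.5 * block   # block effect + group effect

def stat(y, group):
    return y[group == 1].mean() - y[group == 0].mean()

observed = stat(y, group)
null = np.empty(5000)
for i in range(null.size):
    g = group.copy()
    for b in range(n_blocks):                  # permute labels within each block only
        idx = np.where(block == b)[0]
        g[idx] = rng.permutation(g[idx])
    null[i] = stat(y, g)

p_value = (1 + np.sum(np.abs(null) >= abs(observed))) / (1 + null.size)
print(observed, p_value)
```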
Effect of carbide distribution on rolling-element fatigue life of AMS 5749
NASA Technical Reports Server (NTRS)
Parker, R. J.; Bamberger, E. N.
1983-01-01
Endurance tests with ball bearings made of corrosion resistant bearing steel which resulted in fatigue lives much lower than were predicted are discussed. Metallurgical analysis revealed an undesirable carbide distribution in the races. It was shown in accelerated fatigue tests in the RC rig that large, banded carbides can reduce rolling element fatigue life by a factor of approximately four. The early spalling failures on the bearing raceways are attributed to the large carbide size and banded distribution.
NASA Technical Reports Server (NTRS)
1980-01-01
MATHPAC image-analysis library is collection of general-purpose mathematical and statistical routines and special-purpose data-analysis and pattern-recognition routines for image analysis. MATHPAC library consists of Linear Algebra, Optimization, Statistical-Summary, Densities and Distribution, Regression, and Statistical-Test packages.
Frison, Severine; Checchi, Francesco; Kerac, Marko; Nicholas, Jennifer
2016-01-01
Wasting is a major public health issue throughout the developing world. Out of the 6.9 million estimated deaths among children under five annually, over 800,000 deaths (11.6 %) are attributed to wasting. Wasting is quantified as low Weight-For-Height (WFH) and/or low Mid-Upper Arm Circumference (MUAC) (since 2005). Many statistical procedures are based on the assumption that the data used are normally distributed. Analyses have been conducted on the distribution of WFH but there are no equivalent studies on the distribution of MUAC. This secondary data analysis assesses the normality of the MUAC distributions of 852 nutrition cross-sectional survey datasets of children from 6 to 59 months old and examines different approaches to normalise "non-normal" distributions. The distribution of MUAC showed no departure from a normal distribution in 319 (37.7 %) distributions using the Shapiro-Wilk test. Out of the 533 surveys showing departure from a normal distribution, 183 (34.3 %) were skewed (D'Agostino test) and 196 (36.8 %) had a kurtosis different to the one observed in the normal distribution (Anscombe-Glynn test). Testing for normality can be sensitive to data quality, design effect and sample size. Out of the 533 surveys showing departure from a normal distribution, 294 (55.2 %) showed high digit preference, 164 (30.8 %) had a large design effect, and 204 (38.3 %) a large sample size. Spline and LOESS smoothing techniques were explored and both techniques work well. After Spline smoothing, 56.7 % of the MUAC distributions showing departure from normality were "normalised" and 59.7 % after LOESS. Box-Cox power transformation had similar results on distributions showing departure from normality with 57 % of distributions approximating "normal" after transformation. Applying Box-Cox transformation after Spline or Loess smoothing techniques increased that proportion to 82.4 and 82.7 % respectively. This suggests that statistical approaches relying on the normal distribution assumption can be successfully applied to MUAC. In light of this promising finding, further research is ongoing to evaluate the performance of a normal distribution based approach to estimating the prevalence of wasting using MUAC.
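A short Python sketch of the normality checks and Box-Cox transformation mentioned above, applied to a simulated MUAC-like sample rather than the survey datasets.

```python
# Normality tests and Box-Cox transformation on a simulated MUAC-like sample (mm).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
muac = rng.gamma(shape=60.0, scale=2.3, size=900)   # right-skewed, positive values

print("Shapiro-Wilk  p =", stats.shapiro(muac).pvalue)       # overall normality
print("skewness test p =", stats.skewtest(muac).pvalue)      # D'Agostino skewness test
print("kurtosis test p =", stats.kurtosistest(muac).pvalue)  # Anscombe-Glynn kurtosis test

# Box-Cox power transformation (requires positive data); lambda chosen by MLE
transformed, lam = stats.boxcox(muac)
print("lambda =", lam, " Shapiro-Wilk after Box-Cox p =", stats.shapiro(transformed).pvalue)
```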
Developing a Methodology for Risk-Informed Trade-Space Analysis in Acquisition
2015-01-01
[No abstract available; indexed excerpt from the report's figure list and text: "6.10. Research, Development, Test, and Evaluation Cost Distribution, Technology 1 Mitigation of ... 6.11. Research, Development, Test, and Evaluation Cost Distribution, Technology 3 Mitigation of the Upgrade Alternative ... courses of action, or risk-mitigation behaviors, which take place in the event that the technology is not developed by the milestone date."]
Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis
NASA Astrophysics Data System (ADS)
Chen, Lu; Singh, Vijay P.
2018-02-01
Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as water resources management. A multitude of distributions have been employed for frequency analysis of these extremes. However, no single distribution has been accepted as a global standard. Employing the entropy theory, this study derived five generalized distributions for frequency analysis that used different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A distribution (Hal-A), Halphen type B distribution (Hal-B) and Halphen type inverse B distribution (Hal-IB), among which the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012) and the Halphen family was first derived using entropy theory in this paper. The entropy theory allowed the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, which indicated that the five generalized distributions fitted the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and the Halphen family generally gave a better fit. Therefore, these generalized distributions are among the best choices for frequency analysis. The entropy-based derivation led to a new way for frequency analysis of hydrometeorological extremes.
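The model-comparison step described above can be sketched as follows; SciPy's generic maximum-likelihood fitting stands in for the entropy-based parameter estimation, and the rainfall values are simulated.

```python
# Fit candidate distributions to extreme rainfall and compare by AIC (sketch).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
rain = stats.gengamma.rvs(a=2.0, c=1.5, scale=20.0, size=200, random_state=rng)

candidates = {
    "gamma": stats.gamma,
    "generalized gamma": stats.gengamma,
    "lognormal": stats.lognorm,
}
for name, dist in candidates.items():
    params = dist.fit(rain, floc=0)                  # fix location at zero
    loglik = np.sum(dist.logpdf(rain, *params))
    aic = 2 * (len(params) - 1) - 2 * loglik         # floc is fixed, not estimated
    print(f"{name:18s} AIC = {aic:.1f}")
```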
Load flow and state estimation algorithms for three-phase unbalanced power distribution systems
NASA Astrophysics Data System (ADS)
Madvesh, Chiranjeevi
Distribution load flow and state estimation are two important functions in distribution energy management systems (DEMS) and advanced distribution automation (ADA) systems. Distribution load flow analysis is a tool which helps to analyze the status of a power distribution system under steady-state operating conditions. In this research, an effective and comprehensive load flow algorithm is developed to extensively incorporate the distribution system components. Distribution system state estimation is a mathematical procedure which aims to estimate the operating states of a power distribution system by utilizing the information collected from available measurement devices in real-time. An efficient and computationally effective state estimation algorithm adapting the weighted-least-squares (WLS) method has been developed in this research. Both the developed algorithms are tested on different IEEE test-feeders and the results obtained are justified.
Modeling the Extremely Lightweight Zerodur Mirror (ELZM) Thermal Soak Test
NASA Technical Reports Server (NTRS)
Brooks, Thomas E.; Eng, Ron; Hull, Tony; Stahl, H. Philip
2017-01-01
Exoplanet science requires extreme wavefront stability (10 pm change/10 minutes), so every source of wavefront error (WFE) must be characterized in detail. This work illustrates the testing and characterization process that will be used to determine how much surface figure error (SFE) is produced by mirror substrate materials' CTE distributions. Schott's extremely lightweight Zerodur mirror (ELZM) was polished to a sphere, mounted, and tested at Marshall Space Flight Center (MSFC) in the X-Ray and Cryogenic Test Facility (XRCF). The test transitioned the mirror's temperature from an isothermal state at 292K to isothermal states at 275K, 250K and 230K to isolate the effects of the mirror's CTE distribution. The SFE was measured interferometrically at each temperature state and finite element analysis (FEA) has been completed to assess the predictability of the change in the mirror's surface due to a change in the mirror's temperature. The coefficient of thermal expansion (CTE) distribution in the ELZM is unknown, so the analysis has been correlated to the test data. The correlation process requires finding the sensitivity of SFE to a given CTE distribution in the mirror. A novel hand calculation is proposed to use these sensitivities to estimate thermally induced SFE. The correlation process was successful and is documented in this paper. The CTE map that produces the measured SFE is in line with the measured data of typical boules of Schott's Zerodur glass.
Modeling the Extremely Lightweight Zerodur Mirror (ELZM) thermal soak test
NASA Astrophysics Data System (ADS)
Brooks, Thomas E.; Eng, Ron; Hull, Tony; Stahl, H. Philip
2017-09-01
Exoplanet science requires extreme wavefront stability (10 pm change/10 minutes), so every source of wavefront error (WFE) must be characterized in detail. This work illustrates the testing and characterization process that will be used to determine how much surface figure error (SFE) is produced by mirror substrate materials' CTE distributions. Schott's extremely lightweight Zerodur mirror (ELZM) was polished to a sphere, mounted, and tested at Marshall Space Flight Center (MSFC) in the X-Ray and Cryogenic Test Facility (XRCF). The test transitioned the mirror's temperature from an isothermal state at 292K to isothermal states at 275K, 250K and 230K to isolate the effects of the mirror's CTE distribution. The SFE was measured interferometrically at each temperature state and finite element analysis (FEA) has been completed to assess the predictability of the change in the mirror's surface due to a change in the mirror's temperature. The coefficient of thermal expansion (CTE) distribution in the ELZM is unknown, so the analysis has been correlated to the test data. The correlation process requires finding the sensitivity of SFE to a given CTE distribution in the mirror. A novel hand calculation is proposed to use these sensitivities to estimate thermally induced SFE. The correlation process was successful and is documented in this paper. The CTE map that produces the measured SFE is in line with the measured data of typical boules of Schott's Zerodur glass.
Can power-law scaling and neuronal avalanches arise from stochastic dynamics?
Touboul, Jonathan; Destexhe, Alain
2010-02-11
The presence of self-organized criticality in biology is often evidenced by a power-law scaling of event size distributions, which can be measured by linear regression on logarithmic axes. We show here that such a procedure does not necessarily mean that the system exhibits self-organized criticality. We first provide an analysis of multisite local field potential (LFP) recordings of brain activity and show that event size distributions defined as negative LFP peaks can be close to power-law distributions. However, this result is not robust to change in detection threshold, or when tested using more rigorous statistical analyses such as the Kolmogorov-Smirnov test. Similar power-law scaling is observed for surrogate signals, suggesting that power-law scaling may be a generic property of thresholded stochastic processes. We next investigate this problem analytically, and show that, indeed, stochastic processes can produce spurious power-law scaling without the presence of underlying self-organized criticality. However, this power-law is only apparent in logarithmic representations, and does not survive more rigorous analysis such as the Kolmogorov-Smirnov test. The same analysis was also performed on an artificial network known to display self-organized criticality. In this case, both the graphical representations and the rigorous statistical analysis reveal with no ambiguity that the avalanche size is distributed as a power-law. We conclude that logarithmic representations can lead to spurious power-law scaling induced by the stochastic nature of the phenomenon. This apparent power-law scaling does not constitute a proof of self-organized criticality, which should be demonstrated by more stringent statistical tests.
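A minimal Python sketch of the rigorous check advocated above: a maximum-likelihood power-law fit followed by a Kolmogorov-Smirnov test, applied to a simulated stochastic surrogate rather than the LFP data.

```python
# Power-law tail fit by MLE plus a Kolmogorov-Smirnov goodness-of-fit check.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
events = rng.lognormal(mean=0.0, sigma=1.0, size=5000)   # surrogate "event sizes"
xmin = 1.0
tail = events[events >= xmin]

# Continuous power-law MLE (Clauset-type estimator): p(x) ~ x^(-alpha) for x >= xmin
alpha = 1.0 + tail.size / np.sum(np.log(tail / xmin))

# Under that model, x/xmin follows a Pareto distribution with shape b = alpha - 1.
# (The p-value is only approximate because alpha was estimated from the same data.)
ks_stat, ks_p = stats.kstest(tail / xmin, "pareto", args=(alpha - 1.0,))
print(f"alpha = {alpha:.2f}, KS statistic = {ks_stat:.3f}, p = {ks_p:.2g}")
```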
Joint analysis of binary and quantitative traits with data sharing and outcome-dependent sampling.
Zheng, Gang; Wu, Colin O; Kwak, Minjung; Jiang, Wenhua; Joo, Jungnam; Lima, Joao A C
2012-04-01
We study the analysis of a joint association between a genetic marker with both binary (case-control) and quantitative (continuous) traits, where the quantitative trait values are only available for the cases due to data sharing and outcome-dependent sampling. Data sharing becomes common in genetic association studies, and the outcome-dependent sampling is the consequence of data sharing, under which a phenotype of interest is not measured for some subgroup. The trend test (or Pearson's test) and F-test are often, respectively, used to analyze the binary and quantitative traits. Because of the outcome-dependent sampling, the usual F-test can be applied using the subgroup with the observed quantitative traits. We propose a modified F-test by also incorporating the genotype frequencies of the subgroup whose traits are not observed. Further, a combination of this modified F-test and Pearson's test is proposed by Fisher's combination of their P-values as a joint analysis. Because of the correlation of the two analyses, we propose to use a Gamma (scaled chi-squared) distribution to fit the asymptotic null distribution for the joint analysis. The proposed modified F-test and the joint analysis can also be applied to test single trait association (either binary or quantitative trait). Through simulations, we identify the situations under which the proposed tests are more powerful than the existing ones. Application to a real dataset of rheumatoid arthritis is presented. © 2012 Wiley Periodicals, Inc.
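A short Python sketch of the combination idea described above: Fisher's statistic for two correlated tests, with its null distribution approximated by a fitted Gamma (scaled chi-squared); the correlated null statistics and the assumed correlation are simulated, not derived from the rheumatoid arthritis data.

```python
# Fisher's combination of two correlated p-values with a Gamma-fitted null (sketch).
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
rho = 0.5                                    # assumed correlation between the two tests
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=20000)   # simulated null statistics
p = 2.0 * stats.norm.sf(np.abs(z))                          # two-sided p-values
T_null = -2.0 * np.log(p).sum(axis=1)                       # Fisher's combination statistic

shape, loc, scale = stats.gamma.fit(T_null, floc=0)         # scaled chi-squared approximation

# Combined p-value for an observed pair of p-values (hypothetical numbers)
p1_obs, p2_obs = 0.01, 0.04
T_obs = -2.0 * (np.log(p1_obs) + np.log(p2_obs))
print("combined p =", stats.gamma.sf(T_obs, shape, loc=loc, scale=scale))
```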
A note on generalized Genome Scan Meta-Analysis statistics
Koziol, James A; Feng, Anne C
2005-01-01
Background Wise et al. introduced a rank-based statistical technique for meta-analysis of genome scans, the Genome Scan Meta-Analysis (GSMA) method. Levinson et al. recently described two generalizations of the GSMA statistic: (i) a weighted version of the GSMA statistic, so that different studies could be ascribed different weights for analysis; and (ii) an order statistic approach, reflecting the fact that a GSMA statistic can be computed for each chromosomal region or bin width across the various genome scan studies. Results We provide an Edgeworth approximation to the null distribution of the weighted GSMA statistic, we examine the limiting distribution of the GSMA statistics under the order statistic formulation, and we quantify the relevance of the pairwise correlations of the GSMA statistics across different bins on this limiting distribution. We also remark on aggregate criteria and multiple testing for determining significance of GSMA results. Conclusion Theoretical considerations detailed herein can lead to clarification and simplification of testing criteria for generalizations of the GSMA statistic. PMID:15717930
Analysis of off-axis tension test of wood specimens
Jen Y. Liu
2002-01-01
This paper presents a stress analysis of the off-axis tension test of clear wood specimens based on orthotropic elasticity theory. The effects of Poisson's ratio and shear coupling coefficient on stress distribution are analyzed in detail. The analysis also provides a theoretical foundation for the selection of a 10° grain angle in wood specimens for the...
Zonation in the deep benthic megafauna : Application of a general test.
Gardiner, Frederick P; Haedrich, Richard L
1978-01-01
A test based on Maxwell-Boltzmann statistics, instead of the formerly suggested but inappropriate Bose-Einstein statistics (Pielou and Routledge, 1976), examines the distribution of the boundaries of species' ranges distributed along a gradient, and indicates whether they are random or clustered (zoned). The test is most useful as a preliminary to the application of more instructive but less statistically rigorous methods such as cluster analysis. The test indicates zonation is marked in the deep benthic megafauna living between 200 and 3000 m, but below 3000 m little zonation may be found.
Best Statistical Distribution of flood variables for Johor River in Malaysia
NASA Astrophysics Data System (ADS)
Salarpour Goodarzi, M.; Yusop, Z.; Yusof, F.
2012-12-01
A complex flood event is always characterized by a few characteristics such as flood peak, flood volume, and flood duration, which might be mutually correlated. This study explored the statistical distribution of peakflow, flood duration and flood volume at Rantau Panjang gauging station on the Johor River in Malaysia. Hourly data were recorded for 45 years. The data were analysed based on water year (July - June). Five distributions, namely Log-Normal, Generalized Pareto, Log-Pearson, Normal, and Generalized Extreme Value (GEV), were used to model the distribution of all the three variables. Anderson-Darling and Kolmogorov-Smirnov goodness-of-fit tests were used to evaluate the best fit. Goodness-of-fit tests at 5% level of significance indicate that all the models can be used to model the distribution of peakflow, flood duration and flood volume. However, the Generalized Pareto distribution is found to be the most suitable model when tested with the Anderson-Darling test, and the Kolmogorov-Smirnov test suggested that GEV is the best for peakflow. The result of this research can be used to improve flood frequency analysis. [Figure: Comparison between Generalized Extreme Value, Generalized Pareto, and Log-Pearson distributions in the cumulative distribution function of peakflow.]
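A minimal Python sketch of the fit-and-test workflow described above, using simulated annual peakflows and the Kolmogorov-Smirnov test (an Anderson-Darling comparison would follow the same pattern).

```python
# Fit candidate flood-frequency distributions and check goodness of fit (sketch).
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
peakflow = stats.genextreme.rvs(c=-0.1, loc=300.0, scale=120.0, size=45, random_state=rng)

for name, dist in [("GEV", stats.genextreme),
                   ("Generalized Pareto", stats.genpareto),
                   ("Log-normal", stats.lognorm)]:
    params = dist.fit(peakflow)
    ks = stats.kstest(peakflow, dist.name, args=params)
    print(f"{name:18s} KS D = {ks.statistic:.3f}, p = {ks.pvalue:.2f}")
```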
A Permutation-Randomization Approach to Test the Spatial Distribution of Plant Diseases.
Lione, G; Gonthier, P
2016-01-01
The analysis of the spatial distribution of plant diseases requires the availability of trustworthy geostatistical methods. The mean distance tests (MDT) are here proposed as a series of permutation and randomization tests to assess the spatial distribution of plant diseases when the variable of phytopathological interest is categorical. User-friendly software to perform the tests is provided. Estimates of power and type I error, obtained with Monte Carlo simulations, showed the reliability of the MDT (power > 0.80; type I error < 0.05). A biological validation on the spatial distribution of spores of two fungal pathogens causing root rot on conifers was successfully performed by verifying the consistency between the MDT responses and previously published data. An application of the MDT was carried out to analyze the relation between the plantation density and the distribution of the infection of Gnomoniopsis castanea, an emerging fungal pathogen causing nut rot on sweet chestnut. Trees carrying nuts infected by the pathogen were randomly distributed in areas with different plantation densities, suggesting that the distribution of G. castanea was not related to the plantation density. The MDT could be used to analyze the spatial distribution of plant diseases both in agricultural and natural ecosystems.
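A rough Python sketch in the spirit of the mean distance tests described above (not the authors' software): the mean pairwise distance among infected trees is compared with its permutation distribution; tree coordinates and infection labels are simulated.

```python
# Permutation test on the mean pairwise distance among infected trees (sketch).
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(8)
xy = rng.uniform(0.0, 100.0, size=(200, 2))          # tree positions (m)
infected = np.zeros(200, dtype=bool)
infected[rng.choice(200, size=30, replace=False)] = True

def mean_dist(points):
    return pdist(points).mean()

observed = mean_dist(xy[infected])
null = np.array([mean_dist(xy[rng.permutation(infected)]) for _ in range(5000)])

# A small observed mean distance indicates that infected trees are clustered.
p_clustered = (1 + np.sum(null <= observed)) / (1 + null.size)
print(observed, p_clustered)
```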
NASA Technical Reports Server (NTRS)
Sakuraba, K.; Tsuruda, Y.; Hanada, T.; Liou, J.-C.; Akahoshi, Y.
2007-01-01
This paper summarizes two new satellite impact tests conducted in order to investigate the outcome of low- and hyper-velocity impacts on two identical target satellites. The first experiment was performed at a low velocity of 1.5 km/s using a 40-gram aluminum alloy sphere, whereas the second experiment was performed at a hyper-velocity of 4.4 km/s using a 4-gram aluminum alloy sphere launched by a two-stage light gas gun at the Kyushu Institute of Technology. To date, approximately 1,500 fragments from each impact test have been collected for detailed analysis. Each piece was analyzed based on the method used in the NASA Standard Breakup Model 2000 revision. The detailed analysis will conclude: 1) the similarity in mass distribution of fragments between low and hyper-velocity impacts encourages the development of a general-purpose distribution model applicable for a wide impact velocity range, and 2) the difference in area-to-mass ratio distribution between the impact experiments and the NASA standard breakup model suggests describing the area-to-mass ratio by a bi-normal distribution.
A Study to Investigate the Sleeping Comfort of Mattress using Finite Element Method
NASA Astrophysics Data System (ADS)
Yoshida, Hiroaki; Kamijo, Masayoshi; Shimizu, Yoshio
Sleep is an essential physiological activity for human beings, and many studies have investigated the sleeping comfort of mattresses. Appropriate measurement of the stress distribution within the human body would provide valuable information. For such estimation of stress distribution within the human body, numerical analysis is considered one of the most desirable techniques, and the Finite Element Method (FEM), which is widely accepted as a useful numerical technique, was utilized in this study. However, since human body dimensions differ between individuals, it is presumed that the internal stress distribution also changes with these differences and that mattress preference varies among body forms. Thus, we developed three human FEM models reproducing the body forms of three types of male subjects, and investigated the sleeping comfort of mattresses based on the relationship between FEM analysis findings and sensory testing results. Comparing the results of the FEM analysis and sensory testing in the neck region, we found that the sensory testing results corresponded to the FEM analysis findings and that subjects' mattress preferences and neck-region comfort could be estimated using the FEM analysis. In this study, we believe, the FEM analysis quantified the subjects' mattress preferences and proved to be a valuable tool for examining the sleeping comfort of mattresses.
Mixture distributions of wind speed in the UAE
NASA Astrophysics Data System (ADS)
Shin, J.; Ouarda, T.; Lee, T. S.
2013-12-01
Wind speed probability distribution is commonly used to estimate potential wind energy. The 2-parameter Weibull distribution has been most widely used to characterize the distribution of wind speed. However, it is unable to properly model wind speed regimes when the wind speed distribution presents bimodal and kurtotic shapes. Several studies have concluded that the Weibull distribution should not be used for frequency analysis of wind speed without investigation of the wind speed distribution. Due to these mixture distributional characteristics of wind speed data, the application of mixture distributions should be further investigated in the frequency analysis of wind speed. A number of studies have investigated the potential wind energy in different parts of the Arabian Peninsula. Mixture distributional characteristics of wind speed were detected in some of these studies. Nevertheless, mixture distributions have not been employed for wind speed modeling in the Arabian Peninsula. In order to improve our understanding of wind energy potential in the Arabian Peninsula, mixture distributions should be tested for the frequency analysis of wind speed. The aim of the current study is to assess the suitability of mixture distributions for the frequency analysis of wind speed in the UAE. Hourly mean wind speed data at 10-m height from 7 stations were used in the current study. The Weibull and Kappa distributions were employed as representatives of the conventional non-mixture distributions. Ten mixture distributions were constructed by mixing four probability distributions: Normal, Gamma, Weibull and Extreme Value type-one (EV-1). Three parameter estimation methods, the Expectation Maximization algorithm, the Least Squares method, and the Meta-Heuristic Maximum Likelihood (MHML) method, were employed to estimate the parameters of the mixture distributions. In order to compare the goodness-of-fit of the tested distributions and parameter estimation methods for the sample wind data, the adjusted coefficient of determination, Bayesian Information Criterion (BIC) and Chi-squared statistics were computed. Results indicate that MHML gives the best parameter estimation performance for the mixture distributions used. At most of the 7 stations, mixture distributions give the best fit. When the wind speed regime shows mixture distributional characteristics, most of these regimes present a kurtotic statistical character. In particular, applications of mixture distributions for these stations show a significant improvement in explaining the whole wind speed regime. In addition, the Weibull-Weibull mixture distribution presents the best fit for the wind speed data in the UAE.
Yoganandan, Narayan; Arun, Mike W.J.; Pintar, Frank A.; Szabo, Aniko
2015-01-01
Objective Derive optimum injury probability curves to describe human tolerance of the lower leg using parametric survival analysis. Methods The study re-examined lower leg PMHS data from a large group of specimens. Briefly, axial loading experiments were conducted by impacting the plantar surface of the foot. Both injury and non-injury tests were included in the testing process. They were identified by pre- and posttest radiographic images and detailed dissection following the impact test. Fractures included injuries to the calcaneus and distal tibia-fibula complex (including pylon), representing severities at the Abbreviated Injury Score (AIS) level 2+. For the statistical analysis, peak force was chosen as the main explanatory variable and age was chosen as the co-variable. Censoring statuses depended on experimental outcomes. Parameters from the parametric survival analysis were estimated using the maximum likelihood approach and the dfbetas statistic was used to identify overly influential samples. The best fit from the Weibull, log-normal and log-logistic distributions was based on the Akaike Information Criterion. Plus and minus 95% confidence intervals were obtained for the optimum injury probability distribution. The relative sizes of the interval were determined at predetermined risk levels. Quality indices were described at each of the selected probability levels. Results The mean age, stature and weight were 58.2 ± 15.1 years, 1.74 ± 0.08 m and 74.9 ± 13.8 kg, respectively. Excluding all overly influential tests resulted in the tightest confidence intervals. The Weibull distribution was the optimum function compared with the other two distributions. A majority of quality indices were in the good category for this optimum distribution when results were extracted for the 25-, 45- and 65-year-old age groups at 5, 25 and 50% risk levels for lower leg fracture. For 25, 45 and 65 years, peak forces were 8.1, 6.5, and 5.1 kN at 5% risk; 9.6, 7.7, and 6.1 kN at 25% risk; and 10.4, 8.3, and 6.6 kN at 50% risk, respectively. Conclusions This study derived axial loading-induced injury risk curves based on survival analysis using peak force and specimen age; adopting different censoring schemes; considering overly influential samples in the analysis; and assessing the quality of the distribution at discrete probability levels. Because procedures used in the present survival analysis are accepted by international automotive communities, current optimum human injury probability distributions can be used at all risk levels with more confidence in future crashworthiness applications for automotive and other disciplines. PMID:25307381
Bivariate Rainfall and Runoff Analysis Using Shannon Entropy Theory
NASA Astrophysics Data System (ADS)
Rahimi, A.; Zhang, L.
2012-12-01
Rainfall-runoff analysis is the key component for many hydrological and hydraulic designs in which the dependence of rainfall and runoff needs to be studied. It is known that conventional bivariate distributions are often unable to model the rainfall-runoff variables because they either have constraints on the range of the dependence or a fixed form for the marginal distributions. Thus, this paper presents an approach to derive the entropy-based joint rainfall-runoff distribution using Shannon entropy theory. The distribution derived can model the full range of dependence and allow different specified marginals. The modeling and estimation proceed as follows: (i) univariate analysis of marginal distributions, which includes two steps, (a) using the nonparametric statistics approach to detect modes and the underlying probability density, and (b) fitting the appropriate parametric probability density functions; (ii) defining the constraints based on the univariate analysis and the dependence structure; (iii) deriving and validating the entropy-based joint distribution. To validate the method, rainfall-runoff data were collected from the small agricultural experimental watersheds located in the semi-arid region near Riesel (Waco), Texas, maintained by the USDA. The results of the univariate analysis show that the rainfall variables follow the gamma distribution, whereas the runoff variables have a mixed structure and follow the mixed-gamma distribution. With this information, the entropy-based joint distribution is derived using the first moments, the first moments of logarithm-transformed rainfall and runoff, and the covariance between rainfall and runoff. The results of the entropy-based joint distribution indicate: (1) the joint distribution derived successfully preserves the dependence between rainfall and runoff, and (2) the K-S goodness-of-fit statistical tests confirm that the re-derived marginal distributions reveal the underlying univariate probability densities, which further assures that the entropy-based joint rainfall-runoff distribution is satisfactorily derived. Overall, the study shows that the Shannon entropy theory can be satisfactorily applied to model the dependence between rainfall and runoff. The study also shows that the entropy-based joint distribution is an appropriate approach to capture the dependence structure that cannot be captured by conventional bivariate joint distributions. [Figure/table captions: Joint rainfall-runoff entropy-based PDF, and corresponding marginal PDF and histogram, for the W12 watershed; K-S test result and RMSE for univariate distributions derived from the maximum entropy-based joint probability distribution.]
Theoretical Studies of Ionic Liquids and Nanoclusters as Hybrid Fuels
2016-08-17
[No abstract available; indexed excerpts from the briefing charts: "Distribution A: Approved for Public Release; Distribution Unlimited. PA# 16409. Aerospace Systems Directorate: RQ-West (EAFB, CA) Rocket Engines & Motors, Satellite Propulsion, Combustion Devices, Fuels and Propellants, System Analysis R&D, Rocket Testing; RQ-East (WPAFB, OH) Air ... Identify and develop advanced chemical propellants for rocket ..."]
Three-beam interferogram analysis method for surface flatness testing of glass plates and wedges
NASA Astrophysics Data System (ADS)
Sunderland, Zofia; Patorski, Krzysztof
2015-09-01
When testing transparent plates with high quality flat surfaces and a small angle between them the three-beam interference phenomenon is observed. Since the reference beam and the object beams reflected from both the front and back surface of a sample are detected, the recorded intensity distribution may be regarded as a sum of three fringe patterns. Images of that type cannot be successfully analyzed with standard interferogram analysis methods. They contain, however, useful information on the tested plate surface flatness and its optical thickness variations. Several methods were elaborated to decode the plate parameters. Our technique represents a competitive solution which allows for retrieval of phase components of the three-beam interferogram. It requires recording two images: a three-beam interferogram and the two-beam one with the reference beam blocked. Mutually subtracting these images leads to the intensity distribution which, under some assumptions, provides access to the two component fringe sets which encode surface flatness. At various stages of processing we take advantage of nonlinear operations as well as single-frame interferogram analysis methods. Two-dimensional continuous wavelet transform (2D CWT) is used to separate a particular fringe family from the overall interferogram intensity distribution as well as to estimate the phase distribution from a pattern. We distinguish two processing paths depending on the relative density of fringe sets, which is connected with the geometry of the sample and the optical setup. The proposed method is tested on simulated data.
Shi, Xiao-Jun; Zhang, Ming-Li
2015-03-01
Zygophyllum xanthoxylon, a desert species displaying a broad east-west continuous distribution pattern in arid Northwestern China, can be considered a model species to investigate the biogeographical history of this region. We sequenced two chloroplast DNA spacers (psbK-psbI and rpl32-trnL) in 226 individuals from 31 populations to explore the phylogeographical structure. A median-joining network was constructed, and AMOVA, SAMOVA, neutrality tests and mismatch distribution analysis were used to examine genetic structure and potential range expansion. Using species distribution modeling, the geographical distribution of Z. xanthoxylon was modeled during the present and at the Last Glacial Maximum (LGM). Among 26 haplotypes, one was widely distributed, but most were restricted to either the eastern or western region. The populations with the highest levels of haplotype diversity were found in the Tianshan Mountains and its surroundings in the west, and the Helan Mountains and Alxa Plateau in the east. AMOVA and SAMOVA showed that over all populations, the species lacks phylogeographical structure, which is speculated to be the result of its specific biology. Neutrality tests and mismatch distribution analysis support past range expansions of the species. Compared with its current distribution, Z. xanthoxylon had a shrunken and more fragmented range under the cold and dry conditions of the LGM. Based on the evidence from phylogeographical patterns, distribution of genetic variability, and paleodistribution modeling, Z. xanthoxylon is speculated most likely to have originated in the east and migrated westward via the Hexi Corridor.
EVALUATION OF A NEW MEAN SCALED AND MOMENT ADJUSTED TEST STATISTIC FOR SEM.
Tong, Xiaoxiao; Bentler, Peter M
2013-01-01
Recently a new mean scaled and skewness adjusted test statistic was developed for evaluating structural equation models in small samples and with potentially nonnormal data, but this statistic has received only limited evaluation. The performance of this statistic is compared to normal theory maximum likelihood and two well-known robust test statistics. A modification to the Satorra-Bentler scaled statistic is developed for the condition that sample size is smaller than degrees of freedom. The behavior of the four test statistics is evaluated with a Monte Carlo confirmatory factor analysis study that varies seven sample sizes and three distributional conditions obtained using Headrick's fifth-order transformation to nonnormality. The new statistic performs badly in most conditions except under the normal distribution. The goodness-of-fit χ² test based on maximum-likelihood estimation performed well under normal distributions as well as under a condition of asymptotic robustness. The Satorra-Bentler scaled test statistic performed best overall, while the mean scaled and variance adjusted test statistic outperformed the others at small and moderate sample sizes under certain distributional conditions.
On sample size of the kruskal-wallis test with application to a mouse peritoneal cavity study.
Fan, Chunpeng; Zhang, Donghui; Zhang, Cun-Hui
2011-03-01
As the nonparametric generalization of the one-way analysis of variance model, the Kruskal-Wallis test applies when the goal is to test the difference between multiple samples and the underlying population distributions are nonnormal or unknown. Although the Kruskal-Wallis test has been widely used for data analysis, power and sample size methods for this test have been investigated to a much lesser extent. This article proposes new power and sample size calculation methods for the Kruskal-Wallis test based on the pilot study in either a completely nonparametric model or a semiparametric location model. No assumption is made on the shape of the underlying population distributions. Simulation results show that, in terms of sample size calculation for the Kruskal-Wallis test, the proposed methods are more reliable and preferable to some more traditional methods. A mouse peritoneal cavity study is used to demonstrate the application of the methods. © 2010, The International Biometric Society.
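A minimal Python sketch of a simulation-based power estimate for the Kruskal-Wallis test, illustrating the general idea rather than the pilot-study formulas proposed above; group sizes and shifts are hypothetical.

```python
# Monte Carlo power estimate for the Kruskal-Wallis test (sketch).
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

def kw_power(n_per_group, shifts, n_sim=2000, alpha=0.05):
    rejections = 0
    for _ in range(n_sim):
        groups = [rng.lognormal(0.0, 1.0, n_per_group) + s for s in shifts]
        if stats.kruskal(*groups).pvalue < alpha:
            rejections += 1
    return rejections / n_sim

for n in (10, 20, 40):
    print(n, kw_power(n, shifts=(0.0, 0.3, 0.6)))
```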
NASA Technical Reports Server (NTRS)
Applin, Zachary T.; Gentry, Garl L., Jr.
1988-01-01
An unswept, semispan wing model equipped with full-span leading- and trailing-edge flaps was tested in the Langley 14- by 22-Foot Subsonic Tunnel to determine the effect of high-lift components on the aerodynamics of an advanced laminar-flow-control (LFC) airfoil section. Chordwise pressure distributions near the midsemispan were measured for four configurations: cruise, trailing-edge flap only, and trailing-edge flap with a leading-edge Krueger flap of either 0.10 or 0.12 chord. Part 1 of this report (under separate cover) presents a representative sample of the plotted pressure distribution data for each configuration tested. Part 2 presents the entire set of plotted and tabulated pressure distribution data. The data are presented without analysis.
Program for Weibull Analysis of Fatigue Data
NASA Technical Reports Server (NTRS)
Krantz, Timothy L.
2005-01-01
A Fortran computer program has been written for performing statistical analyses of fatigue-test data that are assumed to be adequately represented by a two-parameter Weibull distribution. This program calculates the following: (1) Maximum-likelihood estimates of the Weibull distribution; (2) Data for contour plots of relative likelihood for two parameters; (3) Data for contour plots of joint confidence regions; (4) Data for the profile likelihood of the Weibull-distribution parameters; (5) Data for the profile likelihood of any percentile of the distribution; and (6) Likelihood-based confidence intervals for parameters and/or percentiles of the distribution. The program can account for tests that are suspended without failure (the statistical term for such suspension of tests is "censoring"). The analytical approach followed in this program for the software is valid for type-I censoring, which is the removal of unfailed units at pre-specified times. Confidence regions and intervals are calculated by use of the likelihood-ratio method.
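The censored Weibull fit at the heart of the program described above can be sketched in Python as follows; the fatigue lives, censoring time, and starting values are simulated and illustrative.

```python
# Maximum-likelihood fit of a two-parameter Weibull with type-I censoring (sketch).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(10)
true_shape, true_scale = 1.8, 5.0e6
life = true_scale * rng.weibull(true_shape, size=30)     # cycles to failure
censor_time = 6.0e6                                      # test suspended here
failed = life < censor_time
t = np.minimum(life, censor_time)

def neg_loglik(params):
    k, lam = params
    if k <= 0 or lam <= 0:
        return np.inf
    z = t / lam
    # failures contribute log f(t); suspensions contribute log S(t) = -(t/lam)^k
    ll_fail = np.log(k / lam) + (k - 1.0) * np.log(z[failed]) - z[failed] ** k
    ll_cens = -z[~failed] ** k
    return -(ll_fail.sum() + ll_cens.sum())

fit = minimize(neg_loglik, x0=[1.0, np.median(t)], method="Nelder-Mead")
print("shape, scale =", fit.x)
```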
Zanatta, Rayssa Ferreira; Barreto, Bruno de Castro Ferreira; Xavier, Tathy Aparecida; Versluis, Antheunis; Soares, Carlos José
2015-02-01
This study evaluated the influence of punch and base orifice diameters on push-out test results by means of finite element analysis (FEA). FEA was performed using 3D models of the push-out test with 3 base orifice diameters (2.5, 3.0, and 3.5 mm) and 3 punch diameters (0.5, 1.0, and 1.5 mm) using MARC/MENTAT (MSC.Software). The image of a cervical slice from a root restored with a fiberglass post was used to construct the models. The mechanical properties of dentin, post, and resin cement were obtained from the literature. Bases and punches were constructed as rigid bodies. A 10-N force was applied by the punch in the center of the post in a nonlinear contact analysis. Modified von Mises stress, maximum principal stress, as well as shear and normal stress components were calculated. Both punch and base orifice sizes influenced the stress distribution of the push-out test. Bases with larger diameters and punches with smaller diameters caused higher stress in dentin and at the dentin/cement interface. FEA showed that the diameter of the orifice base had a more significant influence on the stress distribution than did the punch diameter. For this reason, both factors should be taken into account during push-out experimental tests.
Numerical analyses of a rocket engine turbine and comparison with air test data
NASA Technical Reports Server (NTRS)
Tran, Ken; Chan, Daniel C.; Hudson, Susan T.; Gaddis, Stephen W.
1992-01-01
The study presents cold air test data on the Space Shuttle Main Engine High Pressure Fuel Turbopump turbine recently collected at the NASA Marshall Space Flight Center. Overall performance data, static pressures on the first- and second-stage nozzles, and static pressures along the gas path at the hub and tip are gathered and compared with various (1D, quasi-3D, and 3D viscous) analysis procedures. The results of each level of analysis are compared to test data to demonstrate the range of applicability for each step in the design process of a turbine. One-dimensional performance prediction, quasi-3D loading prediction, 3D wall pressure distribution prediction, and 3D viscous wall pressure distribution prediction are illustrated.
Integral criteria for large-scale multiple fingerprint solutions
NASA Astrophysics Data System (ADS)
Ushmaev, Oleg S.; Novikov, Sergey O.
2004-08-01
We propose the definition and analysis of the optimal integral similarity score criterion for large-scale multimodal civil ID systems. Firstly, the general properties of score distributions for genuine and impostor matches for different systems and input devices are investigated. The empirical statistics were taken from real biometric tests. Then we carry out the analysis of simultaneous score distributions for a number of combined biometric tests, primarily for multiple fingerprint solutions. The explicit and approximate relations for the optimal integral score, which provides the least value of the FRR while the FAR is predefined, have been obtained. The results of real multiple fingerprint tests show good correspondence with the theoretical results in the wide range of the False Acceptance and the False Rejection Rates.
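A minimal Python sketch of the operating-point calculation discussed above: from empirical genuine and impostor score samples (simulated here), find the threshold meeting a predefined FAR and report the resulting FRR.

```python
# FRR at a predefined FAR from genuine/impostor similarity-score samples (sketch).
import numpy as np

rng = np.random.default_rng(11)
impostor = rng.normal(20.0, 5.0, 100000)   # non-mate comparison scores
genuine = rng.normal(45.0, 8.0, 10000)     # mate comparison scores

def frr_at_far(genuine, impostor, far):
    threshold = np.quantile(impostor, 1.0 - far)     # accept if score >= threshold
    frr = np.mean(genuine < threshold)
    return threshold, frr

for far in (1e-2, 1e-3, 1e-4):
    thr, frr = frr_at_far(genuine, impostor, far)
    print(f"FAR = {far:.0e}: threshold = {thr:.1f}, FRR = {frr:.3f}")
```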
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-09-01
This report describes the test site, equipment, and procedures and presents the data obtained during field testing at G.P.U. Genco Homer City Station, August 19-24, 1997. This was the third of three field tests that the US Environmental Protection Agency (EPA) conducted in 1997 as part of a major study to evaluate potential improvements to Method 3, EPA's test method for measuring flue gas volumetric flow in stacks. The report also includes a Data Distribution Package, the official, complete repository of the results obtained at the test site.
Huang, Shuguang; Yeo, Adeline A; Li, Shuyu Dan
2007-10-01
The Kolmogorov-Smirnov (K-S) test is a statistical method often used for comparing two distributions. In high-throughput screening (HTS) studies, such distributions usually arise from the phenotype of independent cell populations. However, the K-S test has been criticized for being overly sensitive in applications, and it often detects a statistically significant difference that is not biologically meaningful. One major reason is that there is a common phenomenon in HTS studies that systematic drifting exists among the distributions due to reasons such as instrument variation, plate edge effect, accidental difference in sample handling, etc. In particular, in high-content cellular imaging experiments, the location shift could be dramatic since some compounds themselves are fluorescent. This oversensitivity of the K-S test is particularly overpowered in cellular assays where the sample sizes are very big (usually several thousands). In this paper, a modified K-S test is proposed to deal with the nonspecific location-shift problem in HTS studies. Specifically, we propose that the distributions are "normalized" by density curve alignment before the K-S test is conducted. In applications to simulation data and real experimental data, the results show that the proposed method has improved specificity.
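A simplified Python sketch of the idea described above: remove a nonspecific location shift before applying the K-S test; median alignment stands in for the paper's density-curve alignment, and the data are simulated.

```python
# K-S test before and after removing a nonspecific location shift (sketch).
import numpy as np
from scipy import stats

rng = np.random.default_rng(12)
control = rng.normal(100.0, 15.0, 4000)
treated = rng.normal(112.0, 15.0, 4000)        # pure location shift, same shape

print("plain K-S   p =", stats.ks_2samp(control, treated).pvalue)

# Align the treated sample on the control median so only shape differences remain
aligned = treated - (np.median(treated) - np.median(control))
print("aligned K-S p =", stats.ks_2samp(control, aligned).pvalue)
```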
NASA Technical Reports Server (NTRS)
Ladson, Charles L.; Hill, Acquilla S.; Johnson, William G., Jr.
1987-01-01
Tests were conducted in the 2-D test section of the Langley 0.3-meter Transonic Cryogenic Tunnel on a NACA 0012 airfoil to obtain aerodynamic data as a part of the Advanced Technology Airfoil Test (ATAT) program. The test program covered a Mach number range of 0.30 to 0.82 and a Reynolds number range of 3.0 to 45.0 x 10 to the 6th power. The stagnation pressure was varied between 1.2 and 6.0 atmospheres and the stagnation temperature was varied between 300 K and 90 K to obtain these test conditions. Tabulated pressure distributions and integrated force and moment coefficients are presented as well as plots of the surface pressure distributions. The data are presented uncorrected for wall interference effects and without analysis.
Slow crack growth test method for polyethylene gas pipes. Volume 1. Topical report, December 1992
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leis, B.; Ahmad, J.; Forte, T.
1992-12-01
In spite of the excellent performance record of polyethylene (PE) pipes used for gas distribution, a small number of leaks occur in distribution systems each year because of slow growth of cracks through pipe walls. The Slow Crack Growth (SCG) Test has been developed as a key element in a methodology for the assessment of the performance of polyethylene gas distribution systems to resist such leaks. This topical report describes work conducted in the first part of the research directed at the initial development of the SCG test, including a critical evaluation of the applicability of the SCG test as an element in PE gas pipe system performance methodology. Results of extensive experiments and analysis are reported. The results show that the SCG test should be very useful in performance assessment.
Distributed Ferrite Isolation in Traveling-Wave Tubes.
The approach involves coupling to broadband edge modes of ferrite slabs. Evidence of coupling to the lower branch of the edge mode, i.e., the magnetostatic branch, has been obtained with an L-band helix. Cold tests and analysis suggest that coupling to ferrite edge modes from the helix is easier at higher microwave frequencies. Plans for a hot test at the 1-2 kW power level in an L-band TWT incorporating such distributed ferrites are described.
Consistent Tolerance Bounds for Statistical Distributions
NASA Technical Reports Server (NTRS)
Mezzacappa, M. A.
1983-01-01
Assumption that sample comes from population with particular distribution is made with confidence C if data lie between certain bounds. These "confidence bounds" depend on C and assumption about distribution of sampling errors around regression line. Graphical test criteria using tolerance bounds are applied in industry where statistical analysis influences product development and use. Applied to evaluate equipment life.
Improved silicon nitride for advanced heat engines
NASA Technical Reports Server (NTRS)
Yeh, H. C.; Wimmer, J. M.; Huang, H. H.; Rorabaugh, M. E.; Schienle, J.; Styhr, K. H.
1985-01-01
The AiResearch Casting Company baseline silicon nitride (92 percent GTE SN-502 Si sub 3 N sub 4 plus 6 percent Y sub 2 O sub 3 plus 2 percent Al sub 2 O sub 3) was characterized with methods that included chemical analysis, oxygen content determination, electrophoresis, particle size distribution analysis, surface area determination, and analysis of the degree of agglomeration and maximum particle size of elutriated powder. Test bars were injection molded and processed through sintering at 0.68 MPa (100 psi) of nitrogen. The as-sintered test bars were evaluated by X-ray phase analysis, room and elevated temperature modulus of rupture strength, Weibull modulus, stress rupture, strength after oxidation, fracture origins, microstructure, and density from quantities of samples sufficiently large to generate statistically valid results. A series of small test matrices were conducted to study the effects and interactions of processing parameters which included raw materials, binder systems, binder removal cycles, injection molding temperatures, particle size distribution, sintering additives, and sintering cycle parameters.
Some analysis on the diurnal variation of rainfall over the Atlantic Ocean
NASA Technical Reports Server (NTRS)
Gill, T.; Perng, S.; Hughes, A.
1981-01-01
Data collected from the GARP Atlantic Tropical Experiment (GATE) were examined. The data were collected from 10,000 grid points arranged as a 100 x 100 array; each grid cell covered a 4 square km area. The amount of rainfall was measured every 15 minutes during the experiment periods using C-band radars. Two types of analyses were performed on the data: analysis of diurnal variation was done on each of the grid points based on the rainfall averages at noon and at midnight, and time series analysis on selected grid points based on the hourly averages of rainfall. Since there is no known distribution model which best describes the rainfall amount, nonparametric methods were used to examine the diurnal variation. The Kolmogorov-Smirnov test was used to test if the rainfalls at noon and at midnight have the same statistical distribution. The Wilcoxon signed-rank test was used to test if the noon rainfall is heavier than, equal to, or lighter than the midnight rainfall. These tests were done on each of the 10,000 grid points at which the data are available.
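A short Python sketch of the two nonparametric comparisons described above, applied to simulated noon and midnight rainfall averages for a single grid point.

```python
# K-S and Wilcoxon signed-rank comparisons of noon vs. midnight rainfall (sketch).
import numpy as np
from scipy import stats

rng = np.random.default_rng(13)
n_days = 75
noon = rng.gamma(shape=0.6, scale=4.0, size=n_days)       # mm, heavier on average
midnight = rng.gamma(shape=0.6, scale=3.0, size=n_days)   # mm

# Do the two times of day share the same distribution?
print("Kolmogorov-Smirnov p =", stats.ks_2samp(noon, midnight).pvalue)

# Paired by day: is noon rainfall systematically heavier than midnight rainfall?
print("Wilcoxon signed-rank p =", stats.wilcoxon(noon, midnight, alternative="greater").pvalue)
```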
Analysis of particulates on tape lift samples
NASA Astrophysics Data System (ADS)
Moision, Robert M.; Chaney, John A.; Panetta, Chris J.; Liu, De-Ling
2014-09-01
Particle counts on tape lift samples taken from a hardware surface exceeded threshold requirements in six successive tests despite repeated cleaning of the surface. Subsequent analysis of the particle size distributions of the failed tests revealed that the handling and processing of the tape lift samples may have played a role in the test failures. In order to explore plausible causes for the observed size distribution anomalies, scanning electron microscopy (SEM), energy dispersive X-ray spectroscopy (EDX), and time-of-flight secondary ion mass spectrometry (ToF-SIMS) were employed to perform chemical analysis on collected particulates. SEM/EDX identified Na and S containing particles on the hardware samples in a size range identified as being responsible for the test failures. ToF-SIMS was employed to further examine the Na and S containing particulates and identified the molecular signature of sodium alkylbenzene sulfonates, a common surfactant used in industrial detergent. The root cause investigation suggests that the tape lift test failures originated from detergent residue left behind on the glass slides used to mount and transport the tape following sampling and not from the hardware surface.
Potential flow analysis of glaze ice accretions on an airfoil
NASA Technical Reports Server (NTRS)
Zaguli, R. J.
1984-01-01
The results of an analytical/experimental study of the flow fields about an airfoil with leading edge glaze ice accretion shapes are presented. Tests were conducted in the Icing Research Tunnel to measure surface pressure distributions and boundary layer separation reattachment characteristics on a general aviation wing section to which was affixed wooden ice shapes which approximated typical glaze ice accretions. Comparisons were made with predicted pressure distributions using current airfoil analysis codes as well as the Bristow mixed analysis/design airfoil panel code. The Bristow code was also used to predict the separation reattachment dividing streamline by inputting the appropriate experimental surface pressure distribution.
Schörgendorfer, Angela; Branscum, Adam J; Hanson, Timothy E
2013-06-01
Logistic regression is a popular tool for risk analysis in medical and population health science. With continuous response data, it is common to create a dichotomous outcome for logistic regression analysis by specifying a threshold for positivity. Fitting a linear regression to the nondichotomized response variable assuming a logistic sampling model for the data has been empirically shown to yield more efficient estimates of odds ratios than ordinary logistic regression of the dichotomized endpoint. We illustrate that risk inference is not robust to departures from the parametric logistic distribution. Moreover, the model assumption of proportional odds is generally not satisfied when the condition of a logistic distribution for the data is violated, leading to biased inference from a parametric logistic analysis. We develop novel Bayesian semiparametric methodology for testing goodness of fit of parametric logistic regression with continuous measurement data. The testing procedures hold for any cutoff threshold and our approach simultaneously provides the ability to perform semiparametric risk estimation. Bayes factors are calculated using the Savage-Dickey ratio for testing the null hypothesis of logistic regression versus a semiparametric generalization. We propose a fully Bayesian and a computationally efficient empirical Bayesian approach to testing, and we present methods for semiparametric estimation of risks, relative risks, and odds ratios when parametric logistic regression fails. Theoretical results establish the consistency of the empirical Bayes test. Results from simulated data show that the proposed approach provides accurate inference irrespective of whether parametric assumptions hold or not. Evaluation of risk factors for obesity shows that different inferences are derived from an analysis of a real data set when deviations from a logistic distribution are permissible in a flexible semiparametric framework. © 2013, The International Biometric Society.
Integral-moment analysis of the BATSE gamma-ray burst intensity distribution
NASA Technical Reports Server (NTRS)
Horack, John M.; Emslie, A. Gordon
1994-01-01
We have applied the technique of integral-moment analysis to the intensity distribution of the first 260 gamma-ray bursts observed by the Burst and Transient Source Experiment (BATSE) on the Compton Gamma Ray Observatory. This technique provides direct measurement of properties such as the mean, variance, and skewness of the convolved luminosity-number density distribution, as well as associated uncertainties. Using this method, one obtains insight into the nature of the source distributions unavailable through computation of traditional single parameters such as V/V_max. If the luminosity function of the gamma-ray bursts is strongly peaked, giving bursts only a narrow range of luminosities, these results are then direct probes of the radial distribution of sources, regardless of whether the bursts are a local phenomenon, are distributed in a galactic halo, or are at cosmological distances. Accordingly, an integral-moment analysis of the intensity distribution of the gamma-ray bursts provides for the most complete analytic description of the source distribution available from the data, and offers the most comprehensive test of the compatibility of a given hypothesized distribution with observation.
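A simple sketch of moment-based characterization with bootstrap uncertainties, assuming NumPy/SciPy and a hypothetical intensity sample; this is only an illustration of the idea, not the paper's integral-moment estimator of the convolved luminosity-number density distribution:

```python
# Minimal sketch: sample mean, variance, and skewness of a burst-intensity sample,
# with bootstrap uncertainties. Intensities below are synthetic stand-ins.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
intensity = rng.pareto(a=1.5, size=260) + 1.0  # hypothetical peak intensities

def moments(x):
    return np.array([x.mean(), x.var(ddof=1), stats.skew(x)])

boot = np.array([moments(rng.choice(intensity, size=intensity.size, replace=True))
                 for _ in range(2000)])
est, err = moments(intensity), boot.std(axis=0)
for name, m, s in zip(["mean", "variance", "skewness"], est, err):
    print(f"{name}: {m:.3f} +/- {s:.3f}")
```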
Estimating the proportion of true null hypotheses when the statistics are discrete.
Dialsingh, Isaac; Austin, Stefanie R; Altman, Naomi S
2015-07-15
In high-dimensional testing problems, π0, the proportion of null hypotheses that are true, is an important parameter. For discrete test statistics, the P values come from a discrete distribution with finite support, and the null distribution may depend on an ancillary statistic such as a table margin that varies among the test statistics. Methods for estimating π0 developed for continuous test statistics, which depend on a uniform or identical null distribution of P values, may not perform well when applied to discrete testing problems. This article introduces a number of π0 estimators, the regression and 'T' methods, that perform well with discrete test statistics and also assesses how well methods developed for or adapted from continuous tests perform with discrete tests. We demonstrate the usefulness of these estimators in the analysis of high-throughput biological RNA-seq and single-nucleotide polymorphism data. The methods are implemented in R. © The Author 2015. Published by Oxford University Press. All rights reserved.
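For orientation, a Storey-type π0 estimator of the kind developed for continuous P values (and which, as the abstract argues, can misbehave when the P values are discrete) can be written in a few lines; the p-value samples below are hypothetical:

```python
# Sketch of a Storey-type pi0 estimate from p-values: the fraction of p-values
# above a tuning value lambda, rescaled by the width of (lambda, 1]. This is the
# kind of continuous-statistic baseline the paper compares against, not the
# regression or 'T' estimators it proposes.
import numpy as np

def storey_pi0(pvals, lam=0.5):
    pvals = np.asarray(pvals)
    return min(1.0, (pvals > lam).mean() / (1.0 - lam))

rng = np.random.default_rng(2)
p_null = rng.uniform(size=8000)          # hypothetical true nulls (uniform p-values)
p_alt = rng.beta(0.3, 5.0, size=2000)    # hypothetical non-nulls, skewed toward 0
print(storey_pi0(np.concatenate([p_null, p_alt])))   # should be near 0.8
```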
A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis.
Lin, Johnny; Bentler, Peter M
2012-01-01
Goodness of fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square; but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and the Satorra-Bentler mean scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds a new application to the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of the Satorra-Bentler statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic in order to improve its robustness under small samples. A simple simulation study shows that this third moment adjusted statistic asymptotically performs on par with previously proposed methods, and at a very small sample size offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas that were either open or closed book were used to illustrate the real-world performance of this statistic.
Statistical Tutorial | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Tutorial (ST) is designed as a follow-up to Statistical Analysis of Research Data (SARD), held in April 2018. The tutorial will apply the general principles of statistical analysis of research data, including descriptive statistics, z- and t-tests of means and mean differences, simple and multiple linear regression, ANOVA tests, and the chi-squared distribution.
On the Likelihood Ratio Test for the Number of Factors in Exploratory Factor Analysis
ERIC Educational Resources Information Center
Hayashi, Kentaro; Bentler, Peter M.; Yuan, Ke-Hai
2007-01-01
In the exploratory factor analysis, when the number of factors exceeds the true number of factors, the likelihood ratio test statistic no longer follows the chi-square distribution due to a problem of rank deficiency and nonidentifiability of model parameters. As a result, decisions regarding the number of factors may be incorrect. Several…
Marko, Nicholas F.; Weil, Robert J.
2012-01-01
Introduction: Gene expression data is often assumed to be normally-distributed, but this assumption has not been tested rigorously. We investigate the distribution of expression data in human cancer genomes and study the implications of deviations from the normal distribution for translational molecular oncology research. Methods: We conducted a central moments analysis of five cancer genomes and performed empiric distribution fitting to examine the true distribution of expression data both on the complete-experiment and on the individual-gene levels. We used a variety of parametric and nonparametric methods to test the effects of deviations from normality on gene calling, functional annotation, and prospective molecular classification using a sixth cancer genome. Results: Central moments analyses reveal statistically-significant deviations from normality in all of the analyzed cancer genomes. We observe as much as 37% variability in gene calling, 39% variability in functional annotation, and 30% variability in prospective, molecular tumor subclassification associated with this effect. Conclusions: Cancer gene expression profiles are not normally-distributed, either on the complete-experiment or on the individual-gene level. Instead, they exhibit complex, heavy-tailed distributions characterized by statistically-significant skewness and kurtosis. The non-Gaussian distribution of this data affects identification of differentially-expressed genes, functional annotation, and prospective molecular classification. These effects may be reduced in some circumstances, although not completely eliminated, by using nonparametric analytics. This analysis highlights two unreliable assumptions of translational cancer gene expression analysis: that “small” departures from normality in the expression data distributions are analytically-insignificant and that “robust” gene-calling algorithms can fully compensate for these effects. PMID:23118863
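A hedged sketch of per-gene moment screening in this spirit (not the authors' pipeline), assuming SciPy and a synthetic expression matrix:

```python
# Per-gene skewness, excess kurtosis, and a normality test across samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
expr = rng.standard_t(df=3, size=(500, 40))   # hypothetical 500 genes x 40 samples (heavy-tailed)

skew = stats.skew(expr, axis=1)
kurt = stats.kurtosis(expr, axis=1)           # excess kurtosis; 0 for a normal distribution
_, p_norm = stats.normaltest(expr, axis=1)    # D'Agostino-Pearson omnibus test per gene

frac_nonnormal = (p_norm < 0.05).mean()
print(f"median skew={np.median(skew):.2f}, median excess kurtosis={np.median(kurt):.2f}, "
      f"fraction of genes failing normality: {frac_nonnormal:.2%}")
```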
Neti, Prasad V.S.V.; Howell, Roger W.
2010-01-01
Recently, the distribution of radioactivity among a population of cells labeled with 210Po was shown to be well described by a log-normal (LN) distribution function (J Nucl Med. 2006;47:1049–1058) with the aid of autoradiography. To ascertain the influence of Poisson statistics on the interpretation of the autoradiographic data, the present work reports on a detailed statistical analysis of these earlier data. Methods: The measured distributions of α-particle tracks per cell were subjected to statistical tests with Poisson, LN, and Poisson-lognormal (P-LN) models. Results: The LN distribution function best describes the distribution of radioactivity among cell populations exposed to 0.52 and 3.8 kBq/mL of 210Po-citrate. When cells were exposed to 67 kBq/mL, the P-LN distribution function gave a better fit; however, the underlying activity distribution remained log-normal. Conclusion: The present analysis generally provides further support for the use of LN distributions to describe the cellular uptake of radioactivity. Care should be exercised when analyzing autoradiographic data on activity distributions to ensure that Poisson processes do not distort the underlying LN distribution. PMID:18483086
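A rough illustration of comparing candidate models for tracks-per-cell data (not the paper's fitting procedure), assuming SciPy; the counts are synthetic, zero-track cells are excluded, and the lognormal is fit to counts treated as continuous values:

```python
# Compare Poisson and lognormal descriptions of alpha-particle tracks per cell by AIC.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
tracks = np.round(rng.lognormal(mean=1.2, sigma=0.8, size=1000)).astype(int)
tracks = tracks[tracks > 0]                    # restrict both fits to the same positive counts

# Poisson fit: the MLE of the rate is the sample mean.
lam = tracks.mean()
aic_pois = 2 * 1 - 2 * stats.poisson.logpmf(tracks, lam).sum()

# Lognormal fit (continuous approximation to the counts).
shape, loc, scale = stats.lognorm.fit(tracks.astype(float), floc=0)
aic_ln = 2 * 2 - 2 * stats.lognorm.logpdf(tracks.astype(float), shape, loc, scale).sum()

print(f"AIC Poisson={aic_pois:.1f}, AIC lognormal={aic_ln:.1f} (lower is better; comparison is approximate)")
```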
Distributed data analysis in ATLAS
NASA Astrophysics Data System (ADS)
Nilsson, Paul; Atlas Collaboration
2012-12-01
Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interface to the multitude of execution backends (local, batch, and grid). The PanDA workload management system provides a set of utilities called PanDA Client; with these tools users can easily submit Athena analysis jobs to the PanDA-managed resources. Distributed data is managed by Don Quijote 2 (DQ2), a system developed by ATLAS; DQ2 is used to replicate datasets according to the data distribution policies and maintains a central catalog of file locations. The operation of the grid resources is continually monitored by the GangaRobot functional testing system, and infrequent site stress tests are performed using the HammerCloud system. In addition, the DAST shift team is a group of power users who take shifts to provide distributed analysis user support; this team has effectively relieved the burden of support from the developers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brady, D.N.; Church, B.W.; White, M.G.
Soil sampling activities during 1974 were concentrated in Area 5 of the Nevada Test Site (NTS). Area 5 has been assigned the highest priority because of the number of atmospheric test events held and a wide distribution of contaminants. Improved sampling techniques are described. Preliminary data analysis aided in designing a program to infer 239-240Pu results by Ge(Li) scanning techniques. (auth)
Real time testing of intelligent relays for synchronous distributed generation islanding detection
NASA Astrophysics Data System (ADS)
Zhuang, Davy
As electric power systems continue to grow to meet ever-increasing energy demand, their security, reliability, and sustainability requirements also become more stringent. The deployment of distributed energy resources (DER), including generation and storage, in conventional passive distribution feeders, gives rise to integration problems involving protection and unintentional islanding. Distributed generators need to be islanded for safety reasons when disconnected or isolated from the main feeder as distributed generator islanding may create hazards to utility and third-party personnel, and possibly damage the distribution system infrastructure, including the distributed generators. This thesis compares several key performance indicators of a newly developed intelligent islanding detection relay, against islanding detection devices currently used by the industry. The intelligent relay employs multivariable analysis and data mining methods to arrive at decision trees that contain both the protection handles and the settings. A test methodology is developed to assess the performance of these intelligent relays on a real time simulation environment using a generic model based on a real-life distribution feeder. The methodology demonstrates the applicability and potential advantages of the intelligent relay, by running a large number of tests, reflecting a multitude of system operating conditions. The testing indicates that the intelligent relay often outperforms frequency, voltage and rate of change of frequency relays currently used for islanding detection, while respecting the islanding detection time constraints imposed by standing distributed generator interconnection guidelines.
The retest distribution of the visual field summary index mean deviation is close to normal.
Anderson, Andrew J; Cheng, Allan C Y; Lau, Samantha; Le-Pham, Anne; Liu, Victor; Rahman, Farahnaz
2016-09-01
When modelling optimum strategies for how best to determine visual field progression in glaucoma, it is commonly assumed that the summary index mean deviation (MD) is normally distributed on repeated testing. Here we tested whether this assumption is correct. We obtained 42 reliable 24-2 Humphrey Field Analyzer SITA standard visual fields from one eye of each of five healthy young observers, with the first two fields excluded from analysis. Previous work has shown that although MD variability is higher in glaucoma, the shape of the MD distribution is similar to that found in normal visual fields. A Shapiro-Wilk test was used to determine any deviation from normality. Kurtosis values for the distributions were also calculated. Data from each observer passed the Shapiro-Wilk normality test. Bootstrapped 95% confidence intervals for kurtosis encompassed the value for a normal distribution in four of five observers. When examined with quantile-quantile plots, distributions were close to normal and showed no consistent deviations across observers. The retest distribution of MD is not significantly different from normal in healthy observers, and so is likely also normally distributed - or nearly so - in those with glaucoma. Our results increase our confidence in the results of influential modelling studies where a normal distribution for MD was assumed. © 2016 The Authors Ophthalmic & Physiological Optics © 2016 The College of Optometrists.
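A minimal sketch of the two checks described, assuming SciPy and 40 hypothetical retest MD values for one observer:

```python
# Shapiro-Wilk normality test on retest MD values, plus a bootstrap 95% CI for kurtosis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
md = rng.normal(loc=-0.5, scale=0.6, size=40)   # hypothetical MD values (dB)

w, p = stats.shapiro(md)
print(f"Shapiro-Wilk: W={w:.3f}, p={p:.3f}")

boot_kurt = np.array([stats.kurtosis(rng.choice(md, size=md.size, replace=True), fisher=False)
                      for _ in range(5000)])
lo, hi = np.percentile(boot_kurt, [2.5, 97.5])
print(f"bootstrap 95% CI for kurtosis: [{lo:.2f}, {hi:.2f}]  (3 for a normal distribution)")
```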
Data on the no-load performance analysis of a tomato postharvest storage system.
Ayomide, Orhewere B; Ajayi, Oluseyi O; Banjo, Solomon O; Ajayi, Adesola A
2017-08-01
In this present investigation, an original and detailed empirical data on the transfer of heat in a tomato postharvest storage system was presented. No-load tests were performed for a period of 96 h. The heat distribution at different locations, namely the top, middle and bottom of the system was acquired, at a time interval of 30 min for the test period. The humidity inside the system was taken into consideration. Thus, No-load tests with or without introduction of humidity were carried out and data showing the effect of a rise in humidity level, on temperature distribution were acquired. The temperatures at the external mechanical cooling components were acquired and could be used for showing the performance analysis of the storage system.
Experimental Testing and Modeling Analysis of Solute Mixing at Water Distribution Pipe Junctions
Flow dynamics at a pipe junction controls particle trajectories, solute mixing and concentrations in downstream pipes. Here we have categorized pipe junctions into five hydraulic types, for which flow distribution factors and analytical equations for describing the solute mixing ...
Statistical analysis of the count and profitability of air conditioners.
Rady, El Houssainy A; Mohamed, Salah M; Abd Elmegaly, Alaa A
2018-08-01
This article presents a statistical analysis of the number and profitability of air conditioners in an Egyptian company. The Kruskal-Wallis test was used to check whether the distribution is the same across the levels of each categorical variable.
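A short sketch of such a check, assuming SciPy; the profitability samples for three hypothetical categories are made up:

```python
# Kruskal-Wallis test of whether a measure has the same distribution across groups.
from scipy import stats

group_a = [12.1, 9.8, 14.3, 11.0, 10.5]   # hypothetical profitability, category A
group_b = [15.2, 16.8, 14.9, 17.1, 13.7]  # category B
group_c = [11.5, 12.2, 10.9, 13.0, 12.8]  # category C

h, p = stats.kruskal(group_a, group_b, group_c)
print(f"H={h:.2f}, p={p:.3f}")
```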
Journal of Naval Science. Volume 2, Number 1
1976-01-01
has defined a probability distribution function which fits this type of data and forms the basis for statistical analysis of test results (see ... 'Conditions to Assess the Performance of Fire-Resistant Fluids'. Wear, 28 (1974) 29). ... Appendix A: Analysis of Fatigue Test Data ... used to produce the impulse response, and the equipment required for the analysis is relatively simple. The methods that must be used to produce ...
The effect of caster wheel diameter and mass distribution on drag forces in manual wheelchairs.
Zepeda, Rene; Chan, Franco; Sawatzky, Bonita
2016-01-01
This study proposes a way to reduce energy losses in the form of rolling resistance friction during manual wheelchair propulsion by increasing the size of the front caster wheels and adjusting the weight distribution. Drag tests were conducted using a treadmill and a force transducer. Three different caster diameters (4 in., 5 in., and 6 in.) and six different mass distribution combinations (based on the percentage of total weight on the caster wheels) were studied. A two-way analysis of variance was performed to compare the contributions of caster size and weight distribution to the drag force of an ultralight wheelchair. The 4 in. caster contributed significantly more drag, but only when weight was 40% or greater over the casters. Weight distribution contributed more to drag regardless of the casters used.
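A sketch of a two-way ANOVA of this form, assuming statsmodels and pandas; the caster/weight/drag values below are synthetic, not the study's measurements:

```python
# Two-way ANOVA of drag force on caster diameter and weight distribution.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(6)
casters = np.repeat(["4in", "5in", "6in"], 18)
weights = np.tile(np.repeat([30, 40, 50, 60, 70, 80], 3), 3)   # % of mass over casters
drag = (5 + 0.05 * weights
        + (casters == "4in") * (weights >= 40) * 1.5           # extra drag for small casters, heavy front
        + rng.normal(0, 0.3, casters.size))

df = pd.DataFrame({"caster": casters, "weight": weights.astype(str), "drag": drag})
model = smf.ols("drag ~ C(caster) * C(weight)", data=df).fit()
print(anova_lm(model, typ=2))   # main effects and interaction
```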
Regional analysis of annual maximum rainfall using TL-moments method
NASA Astrophysics Data System (ADS)
Shabri, Ani Bin; Daud, Zalina Mohd; Ariff, Noratiqah Mohd
2011-06-01
Information related to distributions of rainfall amounts is of great importance for the design of water-related structures. One of the concerns of hydrologists and engineers is the probability distribution for modeling of regional data. In this study, a novel approach to regional frequency analysis using L-moments is revisited. Subsequently, an alternative regional frequency analysis using the TL-moments method is employed. The results from both methods were then compared. The analysis was based on daily annual maximum rainfall data from 40 stations in Selangor, Malaysia. TL-moments for the generalized extreme value (GEV) and generalized logistic (GLO) distributions were derived and used to develop the regional frequency analysis procedure. The TL-moment ratio diagram and Z-test were employed in determining the best-fit distribution. Comparison between the two approaches showed that the L-moments and TL-moments produced equivalent results. GLO and GEV distributions were identified as the most suitable distributions for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation was used for performance evaluation, and it showed that the method of TL-moments was more efficient for lower quantile estimation compared with the L-moments.
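For reference, sample L-moments can be computed from probability-weighted moments in a few lines; the sketch below assumes NumPy and a synthetic series of annual maxima, and shows ordinary L-moments only (the TL-moments used in the paper are a trimmed generalization not implemented here):

```python
# Sample L-moments via unbiased probability-weighted moment estimators.
import numpy as np

def sample_lmoments(x):
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    b3 = np.sum((i - 1) * (i - 2) * (i - 3) / ((n - 1) * (n - 2) * (n - 3)) * x) / n
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3 / l2, l4 / l2   # mean, L-scale, L-skewness, L-kurtosis

rng = np.random.default_rng(7)
x = rng.gumbel(loc=80, scale=25, size=40)   # hypothetical annual maximum rainfalls (mm)
print(sample_lmoments(x))
```

The L-moment ratios (L-skewness, L-kurtosis) are what get plotted on the ratio diagram mentioned in the abstract.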
Statistical analysis of secondary particle distributions in relativistic nucleus-nucleus collisions
NASA Technical Reports Server (NTRS)
Mcguire, Stephen C.
1987-01-01
The use is described of several statistical techniques to characterize structure in the angular distributions of secondary particles from nucleus-nucleus collisions in the energy range 24 to 61 GeV/nucleon. The objective of this work was to determine whether there are correlations between emitted particle intensity and angle that may be used to support the existence of the quark gluon plasma. The techniques include chi-square null hypothesis tests, the method of discrete Fourier transform analysis, and fluctuation analysis. We have also used the method of composite unit vectors to test for azimuthal asymmetry in a data set of 63 JACEE-3 events. Each method is presented in a manner that provides the reader with some practical detail regarding its application. Of those events with relatively high statistics, Fe approaches 0 at 55 GeV/nucleon was found to possess an azimuthal distribution with a highly non-random structure. No evidence of non-statistical fluctuations was found in the pseudo-rapidity distributions of the events studied. It is seen that the most effective application of these methods relies upon the availability of many events or single events that possess very high multiplicities.
Application of ideal pressure distribution in development process of automobile seats.
Kilincsoy, U; Wagner, A; Vink, P; Bubb, H
2016-07-19
In designing a car seat, the ideal pressure distribution is important, as the seat is the largest contact surface between the human and the car. Because of obstacles hindering a more general application of the ideal pressure distribution in seating design, multidimensional measuring techniques are necessary, together with extensive user tests. The objective of this study is to apply and integrate the knowledge about the ideal pressure distribution in the seat design process for a car manufacturer in an efficient way. Ideal pressure distribution was combined with pressure measurement, in this case pressure mats. In order to integrate this theoretical knowledge of seating comfort in the seat development process for a car manufacturer, a special user interface was defined and developed. Mapping the measured pressure distribution in real time, accurately scaled to the actual seats during the test setups, led directly to design implications for seat design even during the test situation. Detailed analysis of the subject's feedback was correlated with objective measurements of the subject's pressure distribution in real time. Therefore existing seating characteristics were taken into account as well. A user interface can incorporate theoretical and validated 'state of the art' models of comfort. Consequently, this information can reduce extensive testing and lead to more detailed results in a shorter time period.
Prediction of Mean and Design Fatigue Lives of Self Compacting Concrete Beams in Flexure
NASA Astrophysics Data System (ADS)
Goel, S.; Singh, S. P.; Singh, P.; Kaushik, S. K.
2012-02-01
In this paper, results of an investigation conducted to study the flexural fatigue characteristics of self compacting concrete (SCC) beams in flexure are presented. An experimental programme was planned in which approximately 60 SCC beam specimens of size 100 × 100 × 500 mm were tested under flexural fatigue loading. Approximately 45 static flexural tests were also conducted to facilitate fatigue testing. The flexural fatigue and static flexural strength tests were conducted on a 100 kN servo-controlled actuator. The fatigue life data thus obtained have been used to establish the probability distributions of fatigue life of SCC using the two-parameter Weibull distribution. The parameters of the Weibull distribution have been obtained by different methods of analysis. Using the distribution parameters, the mean and design fatigue lives of SCC have been estimated and compared with those of normally vibrated concrete (NVC), the data for which have been taken from the literature. It has been observed that SCC exhibits higher mean and design fatigue lives compared to NVC.
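A hedged sketch of one way to fit a two-parameter Weibull distribution to fatigue lives and read off mean and design lives, assuming SciPy and synthetic lives (the paper uses several estimation methods; maximum likelihood is shown here):

```python
# Two-parameter Weibull fit (location fixed at zero) to fatigue lives.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
lives = rng.weibull(1.3, size=20) * 2.0e5     # hypothetical fatigue lives (cycles)

shape, loc, scale = stats.weibull_min.fit(lives, floc=0)
print(f"Weibull shape = {shape:.2f}, scale = {scale:.3g} cycles")

# Mean life and a "design life" at 5% probability of failure (95% survival).
mean_life = stats.weibull_min.mean(shape, loc=0, scale=scale)
design_life = stats.weibull_min.ppf(0.05, shape, loc=0, scale=scale)
print(f"mean life = {mean_life:.3g} cycles, design life (P_f = 0.05) = {design_life:.3g} cycles")
```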
Test results management and distributed cognition in electronic health record-enabled primary care.
Smith, Michael W; Hughes, Ashley M; Brown, Charnetta; Russo And, Elise; Giardina, Traber D; Mehta, Praveen; Singh, Hardeep
2018-06-01
Managing abnormal test results in primary care involves coordination across various settings. This study identifies how primary care teams manage test results in a large, computerized healthcare system in order to inform health information technology requirements for test results management and other distributed healthcare services. At five US Veterans Health Administration facilities, we interviewed 37 primary care team members, including 16 primary care providers, 12 registered nurses, and 9 licensed practical nurses. We performed content analysis using a distributed cognition approach, identifying patterns of information transmission across people and artifacts (e.g. electronic health records). Results illustrate challenges (e.g. information overload) as well as strategies used to overcome challenges. Various communication paths were used. Some team members served as intermediaries, processing information before relaying it. Artifacts were used as memory aids. Health information technology should address the risks of distributed work by supporting awareness of team and task status for reliable management of results.
The Measurement of Pressure Through Tubes in Pressure Distribution Tests
NASA Technical Reports Server (NTRS)
Hemke, Paul E
1928-01-01
The tests described in this report were made to determine the error caused by using small tubes to connect orifices on the surface of aircraft to central pressure capsules in making pressure distribution tests. Aluminum tubes of 3/16-inch inside diameter were used to determine this error. Lengths from 20 feet to 226 feet and pressures whose maxima varied from 2 inches to 140 inches of water were used. Single-pressure impulses for which the time of rise of pressure from zero to a maximum varied from 0.25 second to 3 seconds were investigated. The results show that the pressure recorded at the capsule on the far end of the tube lags behind the pressure at the orifice end and experiences also a change in magnitude. For the values used in these tests the time lag and pressure change vary principally with the time of rise of pressure from zero to a maximum and the tube length. Curves are constructed showing the time lag and pressure change. Empirical formulas are also given for computing the time lag. Analysis of pressure distribution tests made on airplanes in flight shows that the recorded pressures are slightly higher than the pressures at the orifice and that the time lag is negligible. The apparent increase in pressure is usually within the experimental error, but in the case of the modern pursuit type of airplane the pressure increase may be 5 per cent. For pressure-distribution tests on airships the analysis shows that the time lag and pressure change may be neglected.
2010-09-30
and climate forecasting and use of satellite data assimilation for model evaluation. He is a task leader on another NSF EPSCoR project for the ... Data Analysis, Modeling, and Ensemble Forecasting to ... observations including remotely sensed data. OBJECTIVES: The main objectives of the study are: 1) to further develop, test, and continue twice daily
Information fusion in regularized inversion of tomographic pumping tests
Bohling, Geoffrey C.; ,
2008-01-01
In this chapter we investigate a simple approach to incorporating geophysical information into the analysis of tomographic pumping tests for characterization of the hydraulic conductivity (K) field in an aquifer. A number of authors have suggested a tomographic approach to the analysis of hydraulic tests in aquifers - essentially simultaneous analysis of multiple tests or stresses on the flow system - in order to improve the resolution of the estimated parameter fields. However, even with a large amount of hydraulic data in hand, the inverse problem is still plagued by non-uniqueness and ill-conditioning and the parameter space for the inversion needs to be constrained in some sensible fashion in order to obtain plausible estimates of aquifer properties. For seismic and radar tomography problems, the parameter space is often constrained through the application of regularization terms that impose penalties on deviations of the estimated parameters from a prior or background model, with the tradeoff between data fit and model norm explored through systematic analysis of results for different levels of weighting on the regularization terms. In this study we apply systematic regularized inversion to analysis of tomographic pumping tests in an alluvial aquifer, taking advantage of the steady-shape flow regime exhibited in these tests to expedite the inversion process. In addition, we explore the possibility of incorporating geophysical information into the inversion through a regularization term relating the estimated K distribution to ground penetrating radar velocity and attenuation distributions through a smoothing spline model. © 2008 Springer-Verlag Berlin Heidelberg.
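The generic form of this kind of regularized inversion (a penalty on deviation from a prior model, swept over regularization weights) can be sketched with plain linear algebra; the sensitivity matrix, data, and prior below are synthetic stand-ins, not a flow model:

```python
# Tikhonov-style regularized least squares: minimize ||A m - d||^2 + alpha ||m - m_prior||^2.
import numpy as np

rng = np.random.default_rng(9)
n_data, n_param = 60, 40
A = rng.normal(size=(n_data, n_param))            # hypothetical sensitivity (Jacobian) matrix
m_true = np.linspace(1.0, 3.0, n_param)           # hypothetical "true" log-K field
d = A @ m_true + rng.normal(scale=0.1, size=n_data)
m_prior = np.full(n_param, 2.0)                   # background model, e.g. informed by geophysics

def tikhonov_solve(A, d, m_prior, alpha):
    lhs = A.T @ A + alpha * np.eye(A.shape[1])
    rhs = A.T @ d + alpha * m_prior
    return np.linalg.solve(lhs, rhs)

for alpha in [0.01, 0.1, 1.0, 10.0]:              # trade-off between data fit and model norm
    m = tikhonov_solve(A, d, m_prior, alpha)
    print(f"alpha={alpha:5.2f}  data misfit={np.linalg.norm(A @ m - d):6.3f}  "
          f"model norm={np.linalg.norm(m - m_prior):6.3f}")
```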
Voltage stress effects on microcircuit accelerated life test failure rates
NASA Technical Reports Server (NTRS)
Johnson, G. M.
1976-01-01
The applicability of Arrhenius and Eyring reaction rate models for describing microcircuit aging characteristics as a function of junction temperature and applied voltage was evaluated. The results of a matrix of accelerated life tests with a single metal oxide semiconductor microcircuit operated at six different combinations of temperature and voltage were used to evaluate the models. A total of 450 devices from two different lots were tested at ambient temperatures between 200 C and 250 C and applied voltages between 5 Vdc and 15 Vdc. A statistical analysis of the surface related failure data resulted in bimodal failure distributions comprising two lognormal distributions; a 'freak' distribution observed early in time, and a 'main' distribution observed later in time. The Arrhenius model was shown to provide a good description of device aging as a function of temperature at a fixed voltage. The Eyring model also appeared to provide a reasonable description of main distribution device aging as a function of temperature and voltage. Circuit diagrams are shown.
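A minimal sketch of an Arrhenius fit to median lives at several temperatures and the resulting acceleration factor, assuming NumPy; the temperatures match the range quoted in the abstract but the lives are hypothetical, and the voltage term of the Eyring model is not included:

```python
# Arrhenius model: ln(t50) = ln(A) + Ea / (k_B * T), fit by linear regression of ln(t50) on 1/T.
import numpy as np

k_B = 8.617e-5                                   # Boltzmann constant, eV/K
T_C = np.array([200.0, 225.0, 250.0])            # test temperatures, deg C
t50 = np.array([5000.0, 1800.0, 700.0])          # hypothetical median lives, hours

inv_T = 1.0 / (T_C + 273.15)
slope, intercept = np.polyfit(inv_T, np.log(t50), 1)
Ea = slope * k_B                                 # activation energy, eV
print(f"activation energy ~ {Ea:.2f} eV, ln(A) ~ {intercept:.2f}")

# Acceleration factor from a 250 C test to a 125 C use condition (same voltage).
AF = np.exp(Ea / k_B * (1.0 / (125.0 + 273.15) - 1.0 / (250.0 + 273.15)))
print(f"acceleration factor 250C -> 125C: {AF:.1f}")
```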
Tian, Guo-Liang; Li, Hui-Qiong
2017-08-01
Some existing confidence interval methods and hypothesis testing methods in the analysis of a contingency table with incomplete observations in both margins entirely depend on an underlying assumption that the sampling distribution of the observed counts is a product of independent multinomial/binomial distributions for complete and incomplete counts. However, it can be shown that this independence assumption is incorrect and can result in unreliable conclusions because of the under-estimation of the uncertainty. Therefore, the first objective of this paper is to derive the valid joint sampling distribution of the observed counts in a contingency table with incomplete observations in both margins. The second objective is to provide a new framework for analyzing incomplete contingency tables based on the derived joint sampling distribution of the observed counts by developing a Fisher scoring algorithm to calculate maximum likelihood estimates of parameters of interest, bootstrap confidence interval methods, and bootstrap hypothesis testing methods. We compare the differences between the valid sampling distribution and the sampling distribution under the independence assumption. Simulation studies showed that average/expected confidence-interval widths of parameters based on the sampling distribution under the independence assumption are shorter than those based on the new sampling distribution, yielding unrealistic results. A real data set is analyzed to illustrate the application of the new sampling distribution for incomplete contingency tables, and the analysis results again confirm the conclusions obtained from the simulation studies.
Frequency distribution histograms for the rapid analysis of data
NASA Technical Reports Server (NTRS)
Burke, P. V.; Bullen, B. L.; Poff, K. L.
1988-01-01
The mean and standard error are good representations for the response of a population to an experimental parameter and are frequently used for this purpose. Frequency distribution histograms show, in addition, responses of individuals in the population. Both the statistics and a visual display of the distribution of the responses can be obtained easily using a microcomputer and available programs. The type of distribution shown by the histogram may suggest different mechanisms to be tested.
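A simple sketch of the approach described (summary statistics plus a frequency distribution histogram of individual responses), assuming NumPy and matplotlib with made-up values:

```python
# Mean, standard error, and a frequency distribution histogram of individual responses.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(10)
responses = rng.normal(loc=35.0, scale=8.0, size=120)   # hypothetical individual responses

mean = responses.mean()
sem = responses.std(ddof=1) / np.sqrt(responses.size)
print(f"mean = {mean:.1f}, standard error = {sem:.2f}, n = {responses.size}")

plt.hist(responses, bins=15, edgecolor="black")
plt.xlabel("response")
plt.ylabel("frequency")
plt.title("Frequency distribution of individual responses")
plt.show()
```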
Goodness-of-Fit Tests for Generalized Normal Distribution for Use in Hydrological Frequency Analysis
NASA Astrophysics Data System (ADS)
Das, Samiran
2018-04-01
The use of the three-parameter generalized normal (GNO) as a hydrological frequency distribution is well recognized, but its application is limited due to the unavailability of popular goodness-of-fit (GOF) test statistics. This study develops popular empirical distribution function (EDF)-based test statistics to investigate the goodness-of-fit of the GNO distribution. The focus is on the case most relevant to the hydrologist, namely, that in which the parameter values are unidentified and estimated from a sample using the method of L-moments. The widely used EDF tests such as Kolmogorov-Smirnov, Cramér-von Mises, and Anderson-Darling (AD) are considered in this study. A modified version of AD, namely, the Modified Anderson-Darling (MAD) test, is also considered and its performance is assessed against other EDF tests using a power study that incorporates six specific Wakeby distributions (WA-1, WA-2, WA-3, WA-4, WA-5, and WA-6) as the alternative distributions. The critical values of the proposed test statistics are approximated using Monte Carlo techniques and are summarized in chart and regression equation form to show the dependence on shape parameter and sample size. The performance results obtained from the power study suggest that the AD and a variant of the MAD (MAD-L) are the most powerful tests. Finally, the study performs case studies involving annual maximum flow data of selected gauged sites from Irish and US catchments to show the application of the derived critical values and recommends further assessments to be carried out on flow data sets of rivers with various hydrological regimes.
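The mechanics of Monte Carlo critical values for an EDF statistic with parameters re-estimated from each sample can be sketched as below; since the GNO distribution is not in SciPy, a normal distribution fit by moments is used purely as a stand-in, and L-moment estimation is not shown:

```python
# Monte Carlo approximation of a critical value for the Anderson-Darling statistic
# when the distribution's parameters are estimated from the sample.
import numpy as np
from scipy import stats

def anderson_darling(x, dist):
    x = np.sort(x)
    n = x.size
    u = dist.cdf(x)
    i = np.arange(1, n + 1)
    # A^2 = -n - (1/n) * sum (2i-1) [ln u_(i) + ln(1 - u_(n+1-i))]
    return -n - np.mean((2 * i - 1) * (np.log(u) + np.log(1 - u[::-1])))

def mc_critical_value(n, n_sim=2000, level=0.95, seed=0):
    rng = np.random.default_rng(seed)
    sims = []
    for _ in range(n_sim):
        y = rng.normal(size=n)
        mu, sigma = y.mean(), y.std(ddof=1)          # parameters re-estimated each replicate
        sims.append(anderson_darling(y, stats.norm(mu, sigma)))
    return np.quantile(sims, level)

print(f"approx. 5% critical value of A^2 for n=50: {mc_critical_value(50):.3f}")
```

Repeating this over a grid of sample sizes (and, for the GNO, shape parameters) yields the charts and regression equations the abstract refers to.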
Fatigue analysis and testing of wind turbine blades
NASA Astrophysics Data System (ADS)
Greaves, Peter Robert
This thesis focuses on fatigue analysis and testing of large, multi-MW wind turbine blades. The blades are one of the most expensive components of a wind turbine, and their mass has cost implications for the hub, nacelle, tower and foundations of the turbine, so it is important that they are not unnecessarily strong. Fatigue is often an important design driver, but fatigue of composites is poorly understood and so large safety factors are often applied to the loads. This has implications for the weight of the blade. Full scale fatigue testing of blades is required by the design standards, and provides manufacturers with confidence that the blade will be able to survive its service life. This testing is usually performed by resonating the blade in the flapwise and edgewise directions separately, but in service these two loads occur at the same time. A fatigue testing method developed at Narec (the National Renewable Energy Centre) in the UK, in which the flapwise and edgewise directions are excited simultaneously, has been evaluated by comparing the Palmgren-Miner damage sum around the blade cross section after testing with the damage distribution caused by the service life. A method to obtain the resonant test configuration that will result in the optimum mode shapes for the flapwise and edgewise directions was then developed, and simulation software was designed to allow the blade test to be simulated so that realistic comparisons between the damage distributions after different test types could be obtained. During the course of this work the shortcomings of conventional fatigue analysis methods became apparent, and a novel method of fatigue analysis based on multi-continuum theory and the kinetic theory of fracture was developed. This method was benchmarked using physical test data from the OPTIDAT database and was applied to the analysis of a complete blade.
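For readers unfamiliar with the Palmgren-Miner damage sum mentioned above, a minimal sketch follows; the load spectrum and S-N curve parameters are hypothetical, not taken from the thesis:

```python
# Palmgren-Miner damage sum at one point: D = sum_i n_i / N_i, where n_i is the
# number of applied cycles in stress-range bin i and N_i the allowable cycles
# from an S-N curve of the form N = C * S^(-m).
import numpy as np

stress_ranges = np.array([40.0, 60.0, 80.0, 100.0])      # MPa, hypothetical binned load spectrum
applied_cycles = np.array([2.0e7, 5.0e6, 8.0e5, 5.0e4])  # n_i over the service life

C, m = 1.0e16, 5.0                                        # hypothetical S-N curve constants
allowable_cycles = C * stress_ranges ** (-m)

damage = np.sum(applied_cycles / allowable_cycles)
print(f"Miner damage sum D = {damage:.2f} (D >= 1 indicates predicted fatigue failure)")
```

Comparing this sum around the cross section after a test with the sum produced by the service loads is the comparison the thesis describes.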
Spacecraft thermal balance testing using infrared sources
NASA Technical Reports Server (NTRS)
Tan, G. B. T.; Walker, J. B.
1982-01-01
A thermal balance test (controlled flux intensity) on a simple black dummy spacecraft using IR lamps was performed and evaluated, the latter being aimed specifically at thermal mathematical model (TMM) verification. For reference purposes the model was also subjected to a solar simulation test (SST). The results show that the temperature distributions measured during IR testing for two different model attitudes under steady state conditions are reproducible with a TMM. The TMM test data correlation is not as accurate for IRT as for SST. Using the standard deviation of the temperature difference distribution (analysis minus test) the SST data correlation is better by a factor of 1.8 to 2.5. The lower figure applies to the measured and the higher to the computer-generated IR flux intensity distribution. Techniques of lamp power control are presented. A continuing work program is described which is aimed at quantifying the differences between solar simulation and infrared techniques for a model representing the thermal radiating surfaces of a large communications spacecraft.
Lopes, Rubia Garcia; de Godoy, Camila Haddad Leal; Deana, Alessandro Melo; de Santi, Maria Eugenia Simões Onofre; Prates, Renato Araujo; França, Cristiane Miranda; Fernandes, Kristianne Porta Santos; Mesquita-Ferrari, Raquel Agnelli; Bussadori, Sandra Kalil
2014-11-14
Halitosis is a common problem that affects a large portion of the population worldwide. The origin of this condition is oral in 90% and systemic in 10% of cases. The unpleasant odor is mainly the result of volatile sulfur compounds produced by Gram-negative bacteria. However, it has recently been found that anaerobic Gram-positive bacteria also produce hydrogen sulfide (H2S) in the presence of amino acids, such as cysteine. Light, both with and without the use of chemical agents, has been used to induce therapeutic and antimicrobial effects. In photodynamic therapy, the antimicrobial effect is confined to areas covered by photosensitizing dye. The aim of the present study is to evaluate the antimicrobial effect of photodynamic therapy on halitosis in adolescents through the analysis of volatile sulfur compounds measured using gas chromatography and microbiological analysis of coated tongue. A quantitative clinical trial will be carried out involving 60 adolescents randomly divided into the following groups: group 1 will receive treatment with a tongue scraper, group 2 will receive photodynamic therapy applied to the posterior two-thirds of the dorsum of the tongue, and group 3 will receive combined treatment (tongue scraper and photodynamic therapy). Gas chromatography (OralChromaTM) and microbiological analysis will be used for the diagnosis of halitosis at the beginning of the study. Post-treatment evaluations will be conducted at one hour and 24 hours after treatment. The statistical analysis will include the Shapiro-Wilk test for the determination of the distribution of the data. If normal distribution is demonstrated, analysis of variance followed by Tukey's test will be used to compare groups. The Kruskal-Wallis test followed by the Student-Newman-Keuls test will be used for data with non-normal distribution. Either the paired t-test or the Wilcoxon test will be used to compare data before and after treatment, depending on the distribution of the data. The results of this trial will determine the efficacy of using photodynamic therapy alone or in combination with a tongue scraper to treat bad breath in adolescents. The protocol for this study was registered with Clinical Trials (registration number NCT02007993) on 10 December 2013.
NASA Technical Reports Server (NTRS)
Mcfarland, E.; Tabakoff, W.; Hamed, A.
1977-01-01
An investigation of the effects of coolant injection on the aerodynamic performance of cooled turbine blades is presented. The coolant injection is modeled in the inviscid irrotational adiabatic flow analysis through the cascade using the distributed singularities approach. The resulting integral equations are solved using a minimized surface singularity density criteria. The aerodynamic performance was evaluated using this solution in conjunction with an existing mixing theory analysis. The results of the present analysis are compared with experimental measurements in cold flow tests.
Three Strategies for the Critical Use of Statistical Methods in Psychological Research
ERIC Educational Resources Information Center
Campitelli, Guillermo; Macbeth, Guillermo; Ospina, Raydonal; Marmolejo-Ramos, Fernando
2017-01-01
We present three strategies to replace the null hypothesis statistical significance testing approach in psychological research: (1) visual representation of cognitive processes and predictions, (2) visual representation of data distributions and choice of the appropriate distribution for analysis, and (3) model comparison. The three strategies…
The Use of Propensity Scores in Mediation Analysis
ERIC Educational Resources Information Center
Jo, Booil; Stuart, Elizabeth A.; MacKinnon, David P.; Vinokur, Amiram D.
2011-01-01
Mediation analysis uses measures of hypothesized mediating variables to test theory for how a treatment achieves effects on outcomes and to improve subsequent treatments by identifying the most efficient treatment components. Most current mediation analysis methods rely on untested distributional and functional form assumptions for valid…
NASA Astrophysics Data System (ADS)
Park, Hyeonwoo; Teramoto, Akinobu; Kuroda, Rihito; Suwa, Tomoyuki; Sugawa, Shigetoshi
2018-04-01
Localized stress-induced leakage current (SILC) has become a major problem in the reliability of flash memories. To reduce it, clarifying the SILC mechanism is important, and statistical measurement and analysis have to be carried out. In this study, we applied an array test circuit that can measure the SILC distribution of more than 80,000 nMOSFETs with various gate areas at a high speed (within 80 s) and a high accuracy (on the order of 10^-17 A). The results clarified that the distributions of localized SILC in different gate areas follow a universal distribution assuming the same SILC defect density distribution per unit area, and the current of localized SILC defects does not scale down with the gate area. Moreover, the distribution of SILC defect density and its dependence on the oxide field for measurement (E_OX-Measure) were experimentally determined for fabricated devices.
NASA Astrophysics Data System (ADS)
Duari, Debiprosad; Gupta, Patrick D.; Narlikar, Jayant V.
1992-01-01
An overview of statistical tests of peaks and periodicities in the redshift distribution of quasi-stellar objects is presented. The tests include the power-spectrum analysis carried out by Burbidge and O'Dell (1972), the generalized Rayleigh test, the Kolmogorov-Smirnov test, and the 'comb-tooth' test. The tests reveal moderate to strong evidence for periodicities of 0.0565 and 0.0127-0.0129. The confidence level of the periodicity of 0.0565 in fact marginally increases when redshifts are transformed to the Galactocentric frame. The same periodicity, first noticed in 1968, persists to date with a QSO population that has since grown to about 30 times its original size. The prima facie evidence for periodicities in ln(1 + z) is found to be of no great significance.
A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis
Lin, Johnny; Bentler, Peter M.
2012-01-01
Goodness of fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square; but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne’s asymptotically distribution-free method and the Satorra-Bentler mean scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds a new application to the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of the Satorra-Bentler statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic in order to improve its robustness under small samples. A simple simulation study shows that this third moment adjusted statistic asymptotically performs on par with previously proposed methods, and at a very small sample size offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby’s study of students tested for their ability in five content areas that were either open or closed book were used to illustrate the real-world performance of this statistic. PMID:23144511
Evidence of Periodicity in Ancient Egyptian Calendars of Lucky and Unlucky Days
NASA Astrophysics Data System (ADS)
Porceddu, P.; Jetsu, L.; Markkanen, T.; Toivari-Viitala, J.
2008-10-01
This article presents an experiment in time series analysis, specifically the Rayleigh Test, applied to the ancient Egyptian calendars of lucky and unlucky days recorded in papyri P. Cairo 86637, P. BM 10474 and P. Sallier IV. The Rayleigh Test is used to determine whether the lucky and unlucky days are distributed randomly within the year, or whether they exhibit periodicity. The results of the analysis show beyond doubt that some of the lucky days were distributed according to a lunar calendar. The cycles of the moon thus played an important role in the religious thinking of the Egyptians. Other periods found using the Rayleigh Test are connected to the civil calendar, the mythological symbolism of the twelfth hour of the day and possibly the period of variation of the star Algol.
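A hedged sketch of a Rayleigh test for one trial period, assuming NumPy; the event days below are synthetic and clustered around a synodic-month cycle purely to show a detection, and the p-value uses the standard large-sample approximation rather than the exact formulation the authors may have used:

```python
# Rayleigh test: fold event times onto the unit circle at a trial period P and
# test the length of the mean resultant vector against uniformity.
import numpy as np

def rayleigh_test(times, period):
    phases = 2.0 * np.pi * (np.asarray(times, dtype=float) % period) / period
    n = phases.size
    R_bar = np.hypot(np.cos(phases).mean(), np.sin(phases).mean())
    z = n * R_bar**2
    p = np.exp(-z)                      # large-n approximation to the p-value
    return R_bar, z, p

rng = np.random.default_rng(11)
days = (np.arange(60) * 29.53 + rng.normal(0, 2.0, 60)).round()   # hypothetical event days
print(rayleigh_test(days, 29.53))       # small p suggests a real 29.53-day periodicity
```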
Test plane uniformity analysis for the MSFC solar simulator lamp array
NASA Technical Reports Server (NTRS)
Griner, D. B.
1976-01-01
A preliminary analysis was made on the solar simulator lamp array. It is an array of 405 tungsten halogen lamps with Fresnel lenses to achieve the required spectral distribution and collimation. A computer program was developed to analyze lamp array performance at the test plane. Measurements were made on individual lamp lens combinations to obtain data for the computer analysis. The analysis indicated that the performance of the lamp array was about as expected, except for a need to position the test plane within 2.7 m of the lamp array to achieve the desired 7 percent uniformity of illumination tolerance.
NASA Technical Reports Server (NTRS)
Srivastava, Rakesh
2004-01-01
A ceramic guide vane has been designed and tested for operation under high temperature. Previous efforts have suggested that some cooling flow may be required to alleviate the high temperatures observed near the trailing edge region. The present report describes briefly a three-dimensional viscous analysis carried out to calculate the temperature and pressure distribution on the blade surface and in the flow path with a jet of cooling air exiting from the suction surface near the trailing edge region. The data for analysis was obtained from Dr. Craig Robinson. The surface temperature and pressure distribution along with a flowfield distribution is shown in the results. The surface distribution is also given in a tabular form at the end of the document.
Li, Dongmei; Le Pape, Marc A; Parikh, Nisha I; Chen, Will X; Dye, Timothy D
2013-01-01
Microarrays are widely used for examining differential gene expression, identifying single nucleotide polymorphisms, and detecting methylation loci. Multiple testing methods in microarray data analysis aim at controlling both Type I and Type II error rates; however, real microarray data do not always fit their distribution assumptions. Smyth's ubiquitous parametric method, for example, inadequately accommodates violations of normality assumptions, resulting in inflated Type I error rates. The Significance Analysis of Microarrays, another widely used microarray data analysis method, is based on a permutation test and is robust to non-normally distributed data; however, the Significance Analysis of Microarrays method fold change criteria are problematic, and can critically alter the conclusion of a study, as a result of compositional changes of the control data set in the analysis. We propose a novel approach, combining resampling with empirical Bayes methods: the Resampling-based empirical Bayes Methods. This approach not only reduces false discovery rates for non-normally distributed microarray data, but it is also impervious to fold change threshold since no control data set selection is needed. Through simulation studies, sensitivities, specificities, total rejections, and false discovery rates are compared across the Smyth's parametric method, the Significance Analysis of Microarrays, and the Resampling-based empirical Bayes Methods. Differences in false discovery rates controls between each approach are illustrated through a preterm delivery methylation study. The results show that the Resampling-based empirical Bayes Methods offer significantly higher specificity and lower false discovery rates compared to Smyth's parametric method when data are not normally distributed. The Resampling-based empirical Bayes Methods also offers higher statistical power than the Significance Analysis of Microarrays method when the proportion of significantly differentially expressed genes is large for both normally and non-normally distributed data. Finally, the Resampling-based empirical Bayes Methods are generalizable to next generation sequencing RNA-seq data analysis.
Verification of forecast ensembles in complex terrain including observation uncertainty
NASA Astrophysics Data System (ADS)
Dorninger, Manfred; Kloiber, Simon
2017-04-01
Traditionally, verification means verifying a forecast (ensemble) against the truth represented by observations. Observation errors are quite often neglected, the argument being that they are small compared to the forecast error. In this study, as part of the MesoVICT (Mesoscale Verification Inter-comparison over Complex Terrain) project, it will be shown that observation errors have to be taken into account for verification purposes. The observation uncertainty is estimated from the VERA (Vienna Enhanced Resolution Analysis) and represented via two analysis ensembles, which are compared to the forecast ensemble. For the whole study, results from COSMO-LEPS provided by Arpae-SIMC Emilia-Romagna are used as the forecast ensemble. The time period covers the MesoVICT core case from 20-22 June 2007. In a first step, all ensembles are investigated with respect to their distribution. Several tests were executed (Kolmogorov-Smirnov test, Finkelstein-Schafer test, chi-square test, etc.), none of which identified an exact mathematical distribution. The main focus is therefore on non-parametric statistics (e.g. kernel density estimation, boxplots) and on the deviation between "forced" normally distributed data and the kernel density estimates. In a next step, the observational deviations due to the analysis ensembles are analysed. In a first approach, scores are calculated multiple times, with every single ensemble member from the analysis ensemble in turn regarded as the "true" observation. The results are presented as boxplots for the different scores and parameters. Additionally, the bootstrapping method is also applied to the ensembles. These possible approaches to incorporating observational uncertainty into the computation of statistics will be discussed in the talk.
Analysis of Stress in Steel and Concrete in Cfst Push-Out Test Samples
NASA Astrophysics Data System (ADS)
Grzeszykowski, Bartosz; Szadkowska, Magdalena; Szmigiera, Elżbieta
2017-09-01
The paper presents an analysis of stress in steel and concrete in CFST composite elements subjected to push-out tests. Two analytical models of stress distribution are presented. The bond at the interface between steel and concrete in the initial phase of the push-out test is provided by adhesion. Until the force reaches a certain value, slip between the two materials does not occur or is negligibly small, which ensures full composite action of the specimen. In the first analytical model, full bond between the two materials was assumed. This model allows the value of the force at which the local loss of adhesion in a given cross section begins to be estimated. In the second model, it was assumed that the bond stress distribution is constant along the shear transfer length of the specimen. Based on this assumption, formulas for the triangular distribution of stress in steel and concrete at the maximum push-out force were derived and compared with the experimental results. Both models can be used to better understand the mechanisms of interaction between steel and concrete in composite steel-concrete columns.
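A worked form of the second (constant bond stress) model can be written as follows; the symbols P (applied push-out force), τ (assumed constant bond stress), u (steel-concrete interface perimeter), L (shear transfer length), and A_s, A_c (steel and concrete cross-sectional areas) are notation assumed here for illustration, not necessarily that of the paper:

```latex
% Force transferred from steel to concrete over a distance x from the loaded end,
% and the resulting linear ("triangular") axial stress distributions:
\begin{align*}
N_s(x) &= P - \tau u x, & N_c(x) &= \tau u x, \\
\sigma_s(x) &= \frac{P - \tau u x}{A_s}, & \sigma_c(x) &= \frac{\tau u x}{A_c}, \\
\tau &= \frac{P_{\max}}{u L} && \text{(constant bond stress implied by full transfer over } L \text{ at } P_{\max}\text{).}
\end{align*}
```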
Research on intelligent power distribution system for spacecraft
NASA Astrophysics Data System (ADS)
Xia, Xiaodong; Wu, Jianju
2017-10-01
The power distribution system (PDS) mainly realizes the power distribution and management of the electrical load of the whole spacecraft, which is directly related to the success or failure of the mission, and hence is an important part of the spacecraft. In order to improve the reliability and intelligent degree of the PDS, and considering the function and composition of spacecraft power distribution system, this paper systematically expounds the design principle and method of the intelligent power distribution system based on SSPC, and provides the analysis and verification of the test data additionally.
NASA Astrophysics Data System (ADS)
Jiang, Quan; Zhong, Shan; Cui, Jie; Feng, Xia-Ting; Song, Leibo
2016-12-01
We investigated the statistical characteristics and probability distribution of the mechanical parameters of natural rock using triaxial compression tests. Twenty cores of Jinping marble were tested at each of five different levels of confining stress (i.e., 5, 10, 20, 30, and 40 MPa). From these full stress-strain data, we summarized the numerical characteristics and determined the probability distribution form of several important mechanical parameters, including deformational parameters, characteristic strength, characteristic strains, and failure angle. The statistical results for the mechanical parameters of rock provide new information about the marble's probabilistic distribution characteristics. The normal and log-normal distributions were appropriate for describing the random strengths of the rock; the coefficients of variation of the peak strengths had no relationship to the confining stress; the only acceptable random distribution for both Young's elastic modulus and Poisson's ratio was the log-normal function; and the cohesive strength had a different probability distribution pattern than the frictional angle. The triaxial tests and statistical analysis also provided experimental evidence for deciding the minimum reliable number of experimental samples and for picking appropriate parameter distributions to use in reliability calculations for rock engineering.
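A hedged sketch of comparing normal and log-normal descriptions of a strength sample at one confining stress, assuming SciPy; the 20 strengths below are synthetic, not the Jinping marble data, and the Kolmogorov-Smirnov check with estimated parameters is only approximate:

```python
# Fit normal and log-normal distributions to peak strengths and compare fits.
import numpy as np
from scipy import stats

rng = np.random.default_rng(12)
strength = rng.lognormal(mean=np.log(180.0), sigma=0.08, size=20)   # MPa, hypothetical

mu, sigma = stats.norm.fit(strength)
shape, loc, scale = stats.lognorm.fit(strength, floc=0)

ks_norm = stats.kstest(strength, "norm", args=(mu, sigma))
ks_lnorm = stats.kstest(strength, "lognorm", args=(shape, loc, scale))

cv = strength.std(ddof=1) / strength.mean()     # coefficient of variation
print(f"CV = {cv:.3f}")
print(f"normal:     KS D={ks_norm.statistic:.3f}, p={ks_norm.pvalue:.3f}")
print(f"log-normal: KS D={ks_lnorm.statistic:.3f}, p={ks_lnorm.pvalue:.3f}")
```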
Effect of Bimodal Grain Size Distribution on Scatter in Toughness
NASA Astrophysics Data System (ADS)
Chakrabarti, Debalay; Strangwood, Martin; Davis, Claire
2009-04-01
Blunt-notch tests were performed at -160 °C to investigate the effect of a bimodal ferrite grain size distribution in steel on cleavage fracture toughness, by comparing local fracture stress values for heat-treated microstructures with uniformly fine, uniformly coarse, and bimodal grain structures. An analysis of fracture stress values indicates that bimodality can have a significant effect on toughness by generating high scatter in the fracture test results. Local cleavage fracture values were related to grain size distributions and it was shown that the largest grains in the microstructure, with an area percent greater than approximately 4 pct, gave rise to cleavage initiation. In the case of the bimodal grain size distribution, the large grains from both the “fine grain” and “coarse grain” population initiate cleavage; this spread in grain size values resulted in higher scatter in the fracture stress than in the unimodal distributions. The notch-bend test results have been used to explain the difference in scatter in the Charpy energies for the unimodal and bimodal ferrite grain size distributions of thermomechanically controlled rolled (TMCR) steel, in which the bimodal distribution showed higher scatter in the Charpy impact transition (IT) region.
Estimation of Occupational Test Norms from Job Analysis Data.
ERIC Educational Resources Information Center
Mecham, Robert C.
Occupational norms exist for some tests, and differences in the distributions of test scores by occupation are evident. Sampling error (SE), situationally specific factors (SSFs), and differences in job content (DIJCs) were explored as possible reasons for the observed differences. SE was explored by analyzing 742 validity studies performed by the…
The Spelling Project. Technical Report 1992-2.
ERIC Educational Resources Information Center
Green, Kathy E.; Schroeder, David H.
Results of an analysis of a newly developed spelling test and several related measures are reported. Information about the reliability of a newly developed spelling test; its distribution of scores; its relationship with the standard battery of aptitude tests of the Johnson O'Connor Research Foundation; and its relationships with sex, age,…
Improving ATLAS grid site reliability with functional tests using HammerCloud
NASA Astrophysics Data System (ADS)
Elmsheuser, Johannes; Legger, Federica; Medrano Llamas, Ramon; Sciacca, Gianfranco; van der Ster, Dan
2012-12-01
With the exponential growth of LHC (Large Hadron Collider) data in 2011, and more coming in 2012, distributed computing has become the established way to analyse collider data. The ATLAS grid infrastructure includes almost 100 sites worldwide, ranging from large national computing centers to smaller university clusters. These facilities are used for data reconstruction and simulation, which are centrally managed by the ATLAS production system, and for distributed user analysis. To ensure the smooth operation of such a complex system, regular tests of all sites are necessary to validate the site capability of successfully executing user and production jobs. We report on the development, optimization and results of an automated functional testing suite using the HammerCloud framework. Functional tests are short lightweight applications covering typical user analysis and production schemes, which are periodically submitted to all ATLAS grid sites. Results from those tests are collected and used to evaluate site performances. Sites that fail or are unable to run the tests are automatically excluded from the PanDA brokerage system, thereby preventing user and production jobs from being sent to problematic sites.
Combined Loads Test Fixture for Thermal-Structural Testing Aerospace Vehicle Panel Concepts
NASA Technical Reports Server (NTRS)
Fields, Roger A.; Richards, W. Lance; DeAngelis, Michael V.
2004-01-01
A structural test requirement of the National Aero-Space Plane (NASP) program has resulted in the design, fabrication, and implementation of a combined loads test fixture. Principal requirements for the fixture are testing a 4- by 4-ft hat-stiffened panel with combined axial (either tension or compression) and shear load at temperatures ranging from room temperature to 915 F, keeping the test panel stresses caused by the mechanical loads uniform, and minimizing thermal stresses caused by non-uniform panel temperatures. The panel represents the side fuselage skin of an experimental aerospace vehicle and was produced for the NASP program. A comprehensive mechanical loads test program using the new test fixture has been conducted on this panel from room temperature to 500 F. Measured data have been compared with finite-element analysis predictions, verifying that uniform load distributions were achieved by the fixture. The overall correlation of test data with analysis is excellent. The panel stress distributions and temperature distributions are very uniform and fulfill program requirements. This report provides details of an analytical and experimental validation of the combined loads test fixture. Because of its simple design, this unique test fixture can accommodate panels from a variety of aerospace vehicle designs.
The Effects of Variability and Risk in Selection Utility Analysis: An Empirical Comparison.
ERIC Educational Resources Information Center
Rich, Joseph R.; Boudreau, John W.
1987-01-01
Investigated utility estimate variability for the selection utility of using the Programmer Aptitude Test to select computer programmers. Comparison of Monte Carlo results to other risk assessment approaches (sensitivity analysis, break-even analysis, algebraic derivation of the distribution) suggests that distribution information provided by Monte…
2015-08-01
primarily concerned with the results of a three-dimensional elasto-plastic finite element contact analysis of a typical aluminium fatigue test coupon...determine the nonlinear three-dimensional elasto-plastic contact stress distributions around a circular hole in an aluminium plate that is fitted...Australian Air Force (RAAF) airframes. An aluminium-alloy fatigue test coupon (see Figure 1) has been designed and applied in support of the validation of
NASA Technical Reports Server (NTRS)
Morgan, H. L., Jr.
1982-01-01
A 2.29 m (7.5 ft.) span high-lift research model equipped with full-span leading-edge slat and part-span double-slotted trailing-edge flap was tested in the Langley 4- by 7-Meter Tunnel to determine the low speed performance characteristics of a representative high aspect ratio supercritical wing. These tests were performed in support of the Energy Efficient Transport (EET) program which is one element of the Aircraft Energy Efficiency (ACEE) project. Static longitudinal forces and moments and chordwise pressure distributions at three spanwise stations were measured for cruise, climb, two take-off flap, and two landing flap wing configurations. The tabulated and plotted pressure distribution data is presented without analysis or discussion.
NASA Technical Reports Server (NTRS)
Kjelgaard, S. O.; Morgan, H. L., Jr.
1983-01-01
A high-lift transport aircraft model equipped with full-span leading-edge slat and part-span double-slotted trailing-edge flap was tested in the Ames 12-ft pressure tunnel to determine the low-speed performance characteristics of a representative high-aspect-ratio supercritical wing. These tests were performed in support of the Energy Efficient Transport (EET) program which is one element of the Aircraft Energy Efficiency (ACEE) project. Static longitudinal forces and moments and chordwise pressure distributions at three spanwise stations were measured for cruise, climb, two take-off flap, and two landing flap wing configurations. The tabulated and plotted pressure distribution data is presented without analysis or discussion.
ERIC Educational Resources Information Center
Bakir, Saad T.
2010-01-01
We propose a nonparametric (or distribution-free) procedure for testing the equality of several population variances (or scale parameters). The proposed test is a modification of Bakir's (1989, Commun. Statist., Simul-Comp., 18, 757-775) analysis of means by ranks (ANOMR) procedure for testing the equality of several population means. A proof is…
Statistical analysis of content of Cs-137 in soils in Bansko-Razlog region
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kobilarov, R. G., E-mail: rkobi@tu-sofia.bg
Statistical analysis of the data set consisting of the activity concentrations of Cs-137 in soils in the Bansko-Razlog region is carried out in order to establish the dependence of the deposition and the migration of Cs-137 on the soil type. The descriptive statistics and the test of normality show that the data set does not have a normal distribution. A positively skewed distribution and possible outlying values of the activity of Cs-137 in soils were observed. After reduction of the effects of outliers, the data set is divided into two parts, depending on the soil type. Tests of normality of the two new data sets show that they have a normal distribution. An ordinary kriging technique is used to characterize the spatial distribution of the activity of Cs-137 over an area covering 40 km² (the whole Razlog valley). The result (a map of the spatial distribution of the activity concentration of Cs-137) can be used as a reference point for future studies on the assessment of radiological risk to the population and the erosion of soils in the study area.
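An illustrative sketch of the descriptive-statistics part of such an analysis (the kriging step is not shown): normality testing, skewness, and IQR-based outlier screening on synthetic activity values standing in for the Cs-137 data.

```python
# Hedged sketch of the descriptive-statistics step only: test normality,
# quantify skewness, and flag possible outliers by the IQR rule.
# The activity values are synthetic placeholders, not the Bansko-Razlog data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
activity = rng.lognormal(mean=2.5, sigma=0.6, size=120)   # Bq/kg, hypothetical

print("skewness:", round(stats.skew(activity), 2))
print("Shapiro-Wilk:", stats.shapiro(activity))

q1, q3 = np.percentile(activity, [25, 75])
iqr = q3 - q1
outliers = activity[(activity < q1 - 1.5 * iqr) | (activity > q3 + 1.5 * iqr)]
trimmed = activity[(activity >= q1 - 1.5 * iqr) & (activity <= q3 + 1.5 * iqr)]
print(f"{outliers.size} possible outliers removed; Shapiro-Wilk on log of trimmed data:",
      stats.shapiro(np.log(trimmed)))   # soil activities are often closer to log-normal
```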
Sensitivity of goodness-of-fit statistics to rainfall data rounding off
NASA Astrophysics Data System (ADS)
Deidda, Roberto; Puliga, Michelangelo
An analysis based on L-moments theory suggests adopting the generalized Pareto distribution to interpret daily rainfall depths recorded by the rain-gauge network of the Hydrological Survey of the Sardinia Region. Nevertheless, a significant problem, not yet completely resolved, arises in the estimation of a left-censoring threshold able to assure a good fit of the rainfall data with the generalized Pareto distribution. In order to detect an optimal threshold, while keeping the largest possible number of data, we chose to apply a "failure-to-reject" method based on goodness-of-fit tests, as proposed by Choulakian and Stephens [Choulakian, V., Stephens, M.A., 2001. Goodness-of-fit tests for the generalized Pareto distribution. Technometrics 43, 478-484]. Unfortunately, the application of the test, using percentage points provided by Choulakian and Stephens (2001), did not succeed in detecting a useful threshold value in most of the analyzed time series. A deeper analysis revealed that these failures are mainly due to the presence of large quantities of rounded-off values among the sample data, affecting the distribution of the goodness-of-fit statistics and leading to significant departures from the percentage points expected for continuous random variables. A procedure based on Monte Carlo simulations is thus proposed to overcome these problems.
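A minimal sketch of the Monte Carlo idea described in the last sentence: simulate generalized Pareto samples, apply gauge-like rounding off, refit, and take empirical percentage points of the Anderson-Darling statistic; the shape, scale and 0.2 mm resolution used below are illustrative assumptions.

```python
# Monte Carlo percentage points of the Anderson-Darling statistic for a
# generalized Pareto fit when the data are rounded off (illustrative values).
import numpy as np
from scipy import stats

def anderson_darling(x, cdf):
    x = np.sort(x)
    n = x.size
    u = np.clip(cdf(x), 1e-12, 1 - 1e-12)
    i = np.arange(1, n + 1)
    return -n - np.mean((2 * i - 1) * (np.log(u) + np.log(1 - u[::-1])))

rng = np.random.default_rng(7)
n, shape, scale, resolution = 200, 0.1, 8.0, 0.2          # assumed sample size and GPD parameters
stats_mc = []
for _ in range(1000):
    sample = stats.genpareto.rvs(shape, scale=scale, size=n, random_state=rng)
    sample = np.round(sample / resolution) * resolution   # gauge-like rounding off
    sample = sample[sample > 0]                           # drop values rounded to zero
    c, loc, s = stats.genpareto.fit(sample, floc=0)
    stats_mc.append(anderson_darling(sample, lambda v: stats.genpareto.cdf(v, c, loc, s)))

print("empirical 90/95/99% points with rounding:", np.percentile(stats_mc, [90, 95, 99]))
```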
A systematic review and meta-analysis on the incubation period of Campylobacteriosis.
Awofisayo-Okuyelu, A; Hall, I; Adak, G; Hawker, J I; Abbott, S; McCARTHY, N
2017-08-01
Accurate knowledge of pathogen incubation period is essential to inform public health policies and implement interventions that contribute to the reduction of burden of disease. The incubation period distribution of campylobacteriosis is currently unknown, with several sources reporting different times. Variation in the distribution could be expected due to host, transmission vehicle, and organism characteristics; however, the extent of this variation and the influencing factors are unclear. The authors have undertaken a systematic review of published literature of outbreak studies with well-defined point source exposures and human experimental studies to estimate the distribution of the incubation period and also to identify and explain the variation in the distribution between studies. We tested for heterogeneity using I² and Kolmogorov-Smirnov tests, regressed incubation period against possible explanatory factors, and used hierarchical clustering analysis to define subgroups of studies without evidence of heterogeneity. The mean incubation period of subgroups ranged from 2·5 to 4·3 days. We observed variation in the distribution of incubation period between studies that was not due to chance. A significant association between the mean incubation period and age distribution was observed, with outbreaks involving only children reporting an incubation period 1·29 days longer when compared with outbreaks involving other age groups.
Toward a quantitative account of pitch distribution in spontaneous narrative: Method and validation
Matteson, Samuel E.; Streit Olness, Gloria; Caplow, Nancy J.
2013-01-01
Pitch is well-known both to animate human discourse and to convey meaning in communication. The study of the statistical population distributions of pitch in discourse will undoubtedly benefit from methodological improvements. The current investigation examines a method that parameterizes pitch in discourse as musical pitch interval H measured in units of cents and that disaggregates the sequence of peak word-pitches using tools employed in time-series analysis and digital signal processing. The investigators test the proposed methodology by its application to distributions in pitch interval of the peak word-pitch (collectively called the discourse gamut) that occur in simulated and actual spontaneous emotive narratives obtained from 17 middle-aged African-American adults. The analysis, in rigorous tests, not only faithfully reproduced simulated distributions imbedded in realistic time series that drift and include pitch breaks, but the protocol also reveals that the empirical distributions exhibit a common hidden structure when normalized to a slowly varying mode (called the gamut root) of their respective probability density functions. Quantitative differences between narratives reveal the speakers' relative propensity for the use of pitch levels corresponding to elevated degrees of a discourse gamut (the “e-la”) superimposed upon a continuum that conforms systematically to an asymmetric Laplace distribution. PMID:23654400
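A sketch of the parameterization step described above: peak word-pitches in Hz are converted to intervals H in cents relative to a slowly varying reference; using a running median as that reference is an assumption for illustration, not the authors' exact disaggregation procedure.

```python
# Convert peak word-pitches (Hz) to pitch intervals H in cents relative to a
# slowly varying reference ("gamut root" stand-in: a running median).
# The synthetic pitch values and the window length are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
exponents = rng.normal(0.3, 0.1, 300) + rng.exponential(0.2, 300)   # right-skewed, synthetic
peak_hz = 110.0 * 2 ** exponents                                    # synthetic peak word-pitches

def to_cents(f, f_ref):
    return 1200.0 * np.log2(f / f_ref)

window = 31                                    # arbitrary smoothing window for the reference
pad = np.pad(peak_hz, window // 2, mode="edge")
f_ref = np.array([np.median(pad[i:i + window]) for i in range(peak_hz.size)])

H = to_cents(peak_hz, f_ref)
print("skewness of pitch-interval distribution:", round(stats.skew(H), 2))
```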
NASA Astrophysics Data System (ADS)
Ruthven, R. C.; Ketcham, R. A.; Kelly, E. D.
2015-12-01
Three-dimensional textural analysis of garnet porphyroblasts and electron microprobe analyses can, in concert, be used to pose novel tests that challenge and ultimately increase our understanding of metamorphic crystallization mechanisms. Statistical analysis of high-resolution X-ray computed tomography (CT) data of garnet porphyroblasts tells us the degree of ordering or randomness of garnets, which can be used to distinguish the rate-limiting factors behind their nucleation and growth. Electron microprobe data for cores, rims, and core-to-rim traverses are used as proxies to ascertain porphyroblast nucleation and growth rates, and the evolution of sample composition during crystallization. MnO concentrations in garnet cores serve as a proxy for the relative timing of nucleation, and rim concentrations test the hypothesis that MnO is in equilibrium sample-wide during the final stages of crystallization, and that concentrations have not been greatly altered by intracrystalline diffusion. Crystal size distributions combined with compositional data can be used to quantify the evolution of nucleation rates and sample composition during crystallization. This study focuses on quartzite schists from the Picuris Mountains with heterogeneous garnet distributions consisting of dense and sparse layers. 3D data shows that the sparse layers have smaller, less euhedral garnets, and petrographic observations show that sparse layers have more quartz and less mica than dense layers. Previous studies on rocks with homogeneously distributed garnet have shown that crystallization rates are diffusion-controlled, meaning that they are limited by diffusion of nutrients to growth and nucleation sites. This research extends this analysis to heterogeneous rocks to determine nucleation and growth rates, and test the assumption of rock-wide equilibrium for some major elements, among a set of compositionally distinct domains evolving in mm- to cm-scale proximity under identical P-T conditions.
Aryal, Madhava P; Nagaraja, Tavarekere N; Brown, Stephen L; Lu, Mei; Bagher-Ebadian, Hassan; Ding, Guangliang; Panda, Swayamprava; Keenan, Kelly; Cabral, Glauber; Mikkelsen, Tom; Ewing, James R
2014-10-01
The distribution of dynamic contrast-enhanced MRI (DCE-MRI) parametric estimates in a rat U251 glioma model was analyzed. Using Magnevist as contrast agent (CA), 17 nude rats implanted with U251 cerebral glioma were studied by DCE-MRI twice in a 24 h interval. A data-driven analysis selected one of three models to estimate either (1) plasma volume (vp), (2) vp and forward volume transfer constant (K(trans)) or (3) vp, K(trans) and interstitial volume fraction (ve), constituting Models 1, 2 and 3, respectively. CA distribution volume (VD) was estimated in Model 3 regions by Logan plots. Regions of interest (ROIs) were selected by model. In the Model 3 ROI, descriptors of parameter distributions--mean, median, variance and skewness--were calculated and compared between the two time points for repeatability. All distributions of parametric estimates in Model 3 ROIs were positively skewed. Test-retest differences between population summaries for any parameter were not significant (p ≥ 0.10; Wilcoxon signed-rank and paired t tests). These and similar measures of parametric distribution and test-retest variance from other tumor models can be used to inform the choice of biomarkers that best summarize tumor status and treatment effects. Copyright © 2014 John Wiley & Sons, Ltd.
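A sketch of the test-retest comparison mentioned above, applying the Wilcoxon signed-rank and paired t tests to per-animal summary values from the two imaging sessions; the values are synthetic placeholders, not the U251 data.

```python
# Repeatability check of a per-animal summary (e.g. median K(trans)) between
# two scans 24 h apart, with synthetic stand-in values for the 17 animals.
import numpy as np
from scipy import stats

rng = np.random.default_rng(19)
scan1 = rng.lognormal(-2.5, 0.4, size=17)            # per-animal summary, visit 1
scan2 = scan1 * rng.lognormal(0.0, 0.1, size=17)     # visit 2: no systematic change

print("Wilcoxon signed-rank:", stats.wilcoxon(scan1, scan2))
print("paired t test:       ", stats.ttest_rel(scan1, scan2))
```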
Development of probabilistic multimedia multipathway computer codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, C.; LePoire, D.; Gnanapragasam, E.
2002-01-01
The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
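A generic, hedged sketch of steps (3)-(5): assign distributions to uncertain parameters, sample them, and propagate the samples through a model. The distributions and the toy dose function below are placeholders, not the actual RESRAD/RESRAD-BUILD parameter set or dose equations.

```python
# Generic probabilistic propagation sketch: sample parameter distributions and
# push the samples through a (toy) model to obtain a dose distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
n = 10_000
soil_ingestion = stats.triang.rvs(c=0.5, loc=50, scale=100, size=n, random_state=rng)  # mg/d, hypothetical
distribution_coeff = stats.lognorm.rvs(s=0.9, scale=30.0, size=n, random_state=rng)    # L/kg, hypothetical
occupancy = stats.uniform.rvs(loc=0.3, scale=0.5, size=n, random_state=rng)            # fraction, hypothetical

def toy_dose(ingestion, kd, occ):
    # placeholder response: more ingestion and occupancy, smaller Kd -> higher dose
    return 1e-3 * ingestion * occ / (1.0 + 0.05 * kd)

dose = toy_dose(soil_ingestion, distribution_coeff, occupancy)
print("dose percentiles (5th, 50th, 95th):", np.percentile(dose, [5, 50, 95]))
```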
Zhang, Jinju; Li, Zuozhou; Fritsch, Peter W.; Tian, Hua; Yang, Aihong; Yao, Xiaohong
2015-01-01
Background and Aims The phylogeography of plant species in sub-tropical China remains largely unclear. This study used Tapiscia sinensis, an endemic and endangered tree species widely but disjunctly distributed in sub-tropical China, as a model to reveal the patterns of genetic diversity and phylogeographical history of Tertiary relict plant species in this region. The implications of the results are discussed in relation to its conservation management. Methods Samples were taken from 24 populations covering the natural geographical distribution of T. sinensis. Genetic structure was investigated by analysis of molecular variance (AMOVA) and spatial analysis of molecular variance (SAMOVA). Phylogenetic relationships among haplotypes were constructed with maximum parsimony and haplotype network methods. Historical population expansion events were tested with pairwise mismatch distribution analysis and neutrality tests. Species potential range was deduced by ecological niche modelling (ENM). Key Results A low level of genetic diversity was detected at the population level. A high level of genetic differentiation and a significant phylogeographical structure were revealed. The mean divergence time of the haplotypes was approx. 1·33 million years ago. Recent range expansion in this species is suggested by a star-like haplotype network and by the results from the mismatch distribution analysis and neutrality tests. Conclusions Climatic oscillations during the Pleistocene have had pronounced effects on the extant distribution of Tapiscia relative to the Last Glacial Maximum (LGM). Spatial patterns of molecular variation and ENM suggest that T. sinensis may have retreated in south-western and central China and colonized eastern China prior to the LGM. Multiple montane refugia for T. sinense existing during the LGM are inferred in central and western China. The populations adjacent to or within these refugia of T. sinense should be given high priority in the development of conservation policies and management strategies for this endangered species. PMID:26187222
Zhang, Jinju; Li, Zuozhou; Fritsch, Peter W; Tian, Hua; Yang, Aihong; Yao, Xiaohong
2015-10-01
The phylogeography of plant species in sub-tropical China remains largely unclear. This study used Tapiscia sinensis, an endemic and endangered tree species widely but disjunctly distributed in sub-tropical China, as a model to reveal the patterns of genetic diversity and phylogeographical history of Tertiary relict plant species in this region. The implications of the results are discussed in relation to its conservation management. Samples were taken from 24 populations covering the natural geographical distribution of T. sinensis. Genetic structure was investigated by analysis of molecular variance (AMOVA) and spatial analysis of molecular variance (SAMOVA). Phylogenetic relationships among haplotypes were constructed with maximum parsimony and haplotype network methods. Historical population expansion events were tested with pairwise mismatch distribution analysis and neutrality tests. Species potential range was deduced by ecological niche modelling (ENM). A low level of genetic diversity was detected at the population level. A high level of genetic differentiation and a significant phylogeographical structure were revealed. The mean divergence time of the haplotypes was approx. 1·33 million years ago. Recent range expansion in this species is suggested by a star-like haplotype network and by the results from the mismatch distribution analysis and neutrality tests. Climatic oscillations during the Pleistocene have had pronounced effects on the extant distribution of Tapiscia relative to the Last Glacial Maximum (LGM). Spatial patterns of molecular variation and ENM suggest that T. sinensis may have retreated in south-western and central China and colonized eastern China prior to the LGM. Multiple montane refugia for T. sinense existing during the LGM are inferred in central and western China. The populations adjacent to or within these refugia of T. sinense should be given high priority in the development of conservation policies and management strategies for this endangered species. © The Author 2015. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Armentrout, J.M.; Smith-Rouch, L.S.; Bowman, S.A.
1996-08-01
Numeric simulations based on integrated data sets enhance our understanding of depositional geometry and facilitate quantification of depositional processes. Numeric values tested against well-constrained geologic data sets can then be used in iterations testing each variable, and in predicting lithofacies distributions under various depositional scenarios using the principles of sequence stratigraphic analysis. The stratigraphic modeling software provides a broad spectrum of techniques for modeling and testing elements of the petroleum system. Using well-constrained geologic examples, variations in depositional geometry and lithofacies distributions between different tectonic settings (passive vs. active margin) and climate regimes (hothouse vs. icehouse) can provide insight to potential source rock and reservoir rock distribution, maturation timing, migration pathways, and trap formation. Two data sets are used to illustrate such variations: both include a seismic reflection profile calibrated by multiple wells. The first is a Pennsylvanian mixed carbonate-siliciclastic system in the Paradox basin, and the second a Pliocene-Pleistocene siliciclastic system in the Gulf of Mexico. Numeric simulations result in geometry and facies distributions consistent with those interpreted using the integrated stratigraphic analysis of the calibrated seismic profiles. An exception occurs in the Gulf of Mexico study where the simulated sediment thickness from 3.8 to 1.6 Ma within an upper slope minibasin was less than that mapped using a regional seismic grid. Regional depositional patterns demonstrate that this extra thickness was probably sourced from out of the plane of the modeled transect, illustrating the necessity for three-dimensional constraints on two-dimensional modeling.
Tests for informative cluster size using a novel balanced bootstrap scheme.
Nevalainen, Jaakko; Oja, Hannu; Datta, Somnath
2017-07-20
Clustered data are often encountered in biomedical studies, and to date, a number of approaches have been proposed to analyze such data. However, the phenomenon of informative cluster size (ICS) is a challenging problem, and its presence has an impact on the choice of a correct analysis methodology. For example, Dutta and Datta (2015, Biometrics) presented a number of marginal distributions that could be tested. Depending on the nature and degree of informativeness of the cluster size, these marginal distributions may differ, as do the choices of the appropriate test. In particular, they applied their new test to a periodontal data set where the plausibility of the informativeness was mentioned, but no formal test for the same was conducted. We propose bootstrap tests for testing the presence of ICS. A balanced bootstrap method is developed to successfully estimate the null distribution by merging the re-sampled observations with closely matching counterparts. Relying on the assumption of exchangeability within clusters, the proposed procedure performs well in simulations even with a small number of clusters, at different distributions and against different alternative hypotheses, thus making it an omnibus test. We also explain how to extend the ICS test to a regression setting and thereby enhancing its practical utility. The methodologies are illustrated using the periodontal data set mentioned earlier. Copyright © 2017 John Wiley & Sons, Ltd.
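A greatly simplified stand-in (not the balanced bootstrap of the paper): informativeness of cluster size can be screened by a permutation test of association between cluster size and cluster mean, as sketched below with synthetic clustered data.

```python
# Simplified ICS screen: permutation test of association between cluster size
# and cluster mean. Synthetic data; larger clusters get slightly larger values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(17)
sizes = rng.integers(2, 12, size=60)
clusters = [rng.normal(0.15 * s, 1.0, size=s) for s in sizes]   # ICS present by construction
cluster_means = np.array([c.mean() for c in clusters])

obs_rho = stats.spearmanr(sizes, cluster_means)[0]
perm = [stats.spearmanr(rng.permutation(sizes), cluster_means)[0] for _ in range(5000)]
p_value = np.mean(np.abs(perm) >= abs(obs_rho))
print(f"observed association {obs_rho:.2f}, permutation p-value {p_value:.3f}")
```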
NASA Astrophysics Data System (ADS)
Cao, M.-H.; Jiang, H.-K.; Chin, J.-S.
1982-04-01
An improved flat-fan spray model is used for the semi-empirical analysis of liquid fuel distribution downstream of a plain orifice injector under cross-stream air flow. The model assumes that, due to the aerodynamic force of the high-velocity cross air flow, the injected fuel immediately forms a flat-fan liquid sheet perpendicular to the cross flow. Once the droplets have been formed, the trajectories of individual droplets determine fuel distribution downstream. Comparison with test data shows that the proposed model accurately predicts liquid fuel distribution at any point downstream of a plain orifice injector under high-velocity, low-temperature uniform cross-stream air flow over a wide range of conditions.
Wu, Hao
2018-05-01
In structural equation modelling (SEM), a robust adjustment to the test statistic or to its reference distribution is needed when its null distribution deviates from a χ2 distribution, which usually arises when data do not follow a multivariate normal distribution. Unfortunately, existing studies on this issue typically focus on only a few methods and neglect the majority of alternative methods in statistics. Existing simulation studies typically consider only non-normal distributions of data that either satisfy asymptotic robustness or lead to an asymptotic scaled χ2 distribution. In this work we conduct a comprehensive study that involves both typical methods in SEM and less well-known methods from the statistics literature. We also propose the use of several novel non-normal data distributions that are qualitatively different from the non-normal distributions widely used in existing studies. We found that several under-studied methods give the best performance under specific conditions, but the Satorra-Bentler method remains the most viable method for most situations. © 2017 The British Psychological Society.
NASA Technical Reports Server (NTRS)
2001-01-01
This document presents the full-scale analyses of the CFD RSRM. The RSRM model was developed with a 20 second burn time. The following are presented as part of the full-scale analyses: (1) RSRM embedded inclusion analysis; (2) RSRM igniter nozzle design analysis; (3) Nozzle Joint 4 erosion anomaly; (4) RSRM full motor port slag accumulation analysis; (5) RSRM motor analysis of two-phase flow in the aft segment/submerged nozzle region; (6) Completion of 3-D Analysis of the hot air nozzle manifold; (7) Bates Motor distributed combustion test case; and (8) Three Dimensional Polysulfide Bump Analysis.
NASA Astrophysics Data System (ADS)
Kallolimath, Sharan Chandrashekar
For the past several years, many researchers have been developing and improving board-level drop test procedures and specifications to quantify the solder joint reliability performance of consumer electronics products. Predictive finite element analysis (FEA) utilizing simulation software has become a widely accepted verification method that can reduce the time and cost of the real-time test process. However, due to testing and metrological limitations it is difficult to simulate the exact drop condition and capture critical measurement data, and tedious to calibrate the system to improve test methods. Moreover, some of the important, ever-changing factors such as board flexural rigidity, damping, drop height, and drop orientation result in non-uniform stress/strain distribution throughout the test board. In addition, one of the most challenging tasks is to quantify uniform stress and strain distribution throughout the test board and identify critical failure factors. The major contributions of this work lie in the following four aspects of drop testing in electronics. First, an analytical FEA model was developed to study the board natural frequencies and responses of the system with consideration of the dynamic stiffness, the damping behavior of the material, and the effect of the impact loading condition. An approach to find the key parameters that affect the stress and strain distributions under the predominant mode responses was proposed and verified against theoretical solutions. The Input-G method was adopted to study the board response behavior, and the cut-boundary interpolation method was used to analyze local-model solder joint stresses, with the development of a global/local FEA model in ANSYS software. Second, the no-ring phenomenon during the drop test was identified theoretically when the test board was modeled both as a discrete system and as a continuous system. Numerical analysis was then conducted by the FEA method for the detailed geometry of attached chips with solder joints. No-ring test conditions were proposed and verified for the currently used JEDEC standard. The significance of impact loading parameters such as pulse magnitude, pulse duration, and pulse shape, and of board dynamic parameters such as linear hysteretic damping and dynamic stiffness, was discussed. Third, Kirchhoff's plate theory, via the principle of minimum potential energy, was adopted to develop the FEA formulation that considers the effect of material hysteretic damping for the currently used JEDEC board test and the proposed no-ring response test condition. Fourth, a hexagonal symmetrical board model was proposed to address the uniform stress and strain distribution throughout the test board and identify the critical failure factors. The dynamic stress and strain of the hexagonal board model were then compared with those of the standard JEDEC board for both the standard and the proposed no-ring test conditions. In general, this line of research demonstrates that advanced FEA techniques can provide useful insights concerning the optimal design of drop tests in microelectronics.
Testing the mean for dependent business data.
Liang, Jiajuan; Martin, Linda
2008-01-01
In business data analysis, it is well known that the comparison of several means is usually carried out by the F-test in analysis of variance under the assumption that the data are independently collected from all populations. This assumption, however, is likely to be violated in survey data collected from various questionnaires or in time-series data. As a result, it is problematic to apply the traditional F-test directly to the comparison of dependent means. In this article, we develop a generalized F-test for comparing population means with dependent data. Simulation studies show that the proposed test has a simple approximate null distribution and feasible finite-sample properties. Applications of the proposed test to the analysis of survey data and time-series data are illustrated by two real datasets.
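A quick simulation (not the authors' generalized F-test) illustrating why a correction is needed: when the "groups" are repeated measurements on the same respondents, the classical one-way F-test is no longer calibrated; with positively correlated columns its empirical Type I error falls well below the nominal 5%, so its p-values, and hence its power, are misleading.

```python
# Empirical size of the naive one-way F-test when the group columns are
# dependent (same respondents measured in every group); values are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(23)
n_respondents, n_groups, rho = 40, 4, 0.6
cov = rho + (1 - rho) * np.eye(n_groups)        # equal means, positively correlated columns

rejections = 0
n_sim = 5000
for _ in range(n_sim):
    data = rng.multivariate_normal(np.zeros(n_groups), cov, size=n_respondents)
    _, p = stats.f_oneway(*data.T)              # treats the correlated columns as independent samples
    rejections += p < 0.05
print(f"empirical Type I error of the naive F-test: {rejections / n_sim:.3f} (nominal 0.05)")
```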
NASA Astrophysics Data System (ADS)
Pieczara, Łukasz
2015-09-01
The paper presents the results of an analysis of surface roughness parameters in the Krosno Sandstones of Mucharz, southern Poland. It was aimed at determining whether these parameters are influenced by structural features (mainly the laminar distribution of mineral components and the directional distribution of non-isometric grains) and fracture processes. The tests applied in the analysis enabled us to determine and describe the primary statistical parameters used in the quantitative description of surface roughness, as well as to specify the usefulness of contact profilometry as a method of visualizing the spatial differentiation of fracture processes in rocks. These aims were achieved by selecting a model material (Krosno Sandstones from the Górka-Mucharz Quarry) and an appropriate research methodology. The schedule of laboratory analyses included: identification analyses connected with non-destructive ultrasonic tests, aimed at the preliminary determination of rock anisotropy; point-load strength tests (cleaved surfaces were obtained from the destruction of rock samples); microscopic analysis (observation of thin sections in order to determine the mechanism of inducing fracture processes); and a test method of measuring surface roughness (two- and three-dimensional diagrams, topographic and contour maps, and statistical parameters of surface roughness). The highest values of roughness indicators were achieved for surfaces formed under the influence of intragranular fracture processes (cracks propagating directly through grains). This is related to the structural features of the Krosno Sandstones (distribution of lamination and bedding).
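For reference, the basic two-dimensional roughness parameters can be computed directly from a profilometer trace; the formulas below are standard, and the profile values are synthetic.

```python
# Standard 2-D roughness parameters from a (synthetic) profilometer trace
# after removing the mean line.
import numpy as np

rng = np.random.default_rng(9)
z = rng.normal(0.0, 2.0, size=4000) + 0.5 * np.sin(np.linspace(0, 40 * np.pi, 4000))  # µm, synthetic

z = z - z.mean()                              # mean-line correction
Ra = np.mean(np.abs(z))                       # arithmetic mean deviation
Rq = np.sqrt(np.mean(z ** 2))                 # root-mean-square roughness
Rsk = np.mean(z ** 3) / Rq ** 3               # skewness of the height distribution
Rku = np.mean(z ** 4) / Rq ** 4               # kurtosis of the height distribution
Rt = z.max() - z.min()                        # total height of the profile
print(f"Ra={Ra:.2f} µm  Rq={Rq:.2f} µm  Rsk={Rsk:.2f}  Rku={Rku:.2f}  Rt={Rt:.2f} µm")
```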
Statistics of indicated pressure in combustion engine.
NASA Astrophysics Data System (ADS)
Sitnik, L. J.; Andrych-Zalewska, M.
2016-09-01
The paper presents the classic form of the pressure waveforms in the combustion chamber of a diesel engine, but on a strict analytical basis that corrects for the displacement volume. The pressure measurements were obtained with the engine running on an engine dynamometer stand. The study was conducted using the 13-phase ESC test (European Stationary Cycle). In each test phase, 90 pressure waveforms are archived. As a result of extensive statistical analysis it was found that, while the engine is idling, the distribution of the 90 pressure values at any crank angle can be described by a uniform distribution. At each point of the engine characteristic corresponding to the individual phases of the ESC test, the 90 pressure values at any crank angle can be described by a normal distribution. These relationships are verified using the Shapiro-Wilk, Jarque-Bera, Lilliefors and Anderson-Darling tests. Subsequently, descriptive statistics of the pressure data are obtained for each value of the crank angle. In essence, a new way of approaching pressure waveform analysis in the combustion chamber of an engine is obtained. The new method can be used for further analysis, especially of the combustion process in the engine. It was found, for example, that the pressure variances are very large near the transition from the compression to the expansion stroke. This lack of stationarity of the process can be important both for exhaust gas emissions and for the fuel consumption of the engine.
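A sketch of the per-crank-angle normality check described above, applying the four named tests to 90 synthetic pressure values at a single crank angle; the Lilliefors test is approximated here by a KS test with estimated parameters, which is only a rough stand-in whose p-values are too large.

```python
# Normality test battery for the 90 pressure values at one crank angle
# (synthetic values; not the measured ESC data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(13)
p_cyl = rng.normal(62.0, 1.5, size=90)   # bar, hypothetical values at one crank angle

print("Shapiro-Wilk:     ", stats.shapiro(p_cyl))
print("Jarque-Bera:      ", stats.jarque_bera(p_cyl))
print("Anderson-Darling: ", stats.anderson(p_cyl, dist="norm").statistic)
print("KS (est. params): ", stats.kstest(p_cyl, "norm", args=(p_cyl.mean(), p_cyl.std(ddof=1))))
```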
1992-10-01
Manual CI APPENDIX D: Drawing Navigator Field Test ...methods replace manual methods, the automation will handle the data for the designer, thus reducing error and increasing throughput. However, the two...actively move data from one automation tool (CADD) to the other (the analysis program). This intervention involves a manual rekeying of data already in
Zalvidea; Colautti; Sicre
2000-05-01
An analysis of the Strehl ratio and the optical transfer function as imaging quality parameters of optical elements with enhanced focal length is carried out by employing the Wigner distribution function. To this end, we use four different pupil functions: a full circular aperture, a hyper-Gaussian aperture, a quartic phase plate, and a logarithmic phase mask. A comparison is performed between the quality parameters and test images formed by these pupil functions at different defocus distances.
Photon counting statistics analysis of biophotons from hands.
Jung, Hyun-Hee; Woo, Won-Myung; Yang, Joon-Mo; Choi, Chunho; Lee, Jonghan; Yoon, Gilwon; Yang, Jong S; Soh, Kwang-Sup
2003-05-01
The photon counting statistics of biophotons emitted from hands are studied with a view to testing their agreement with the Poisson distribution. The moments of the observed probability up to seventh order have been evaluated. The moments of biophoton emission from hands are in good agreement, while those of the dark counts of the photomultiplier tube show large deviations from the theoretical values of the Poisson distribution. The present results are consistent with the conventional delta-value analysis of the second moment of probability.
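A sketch of the Poisson check described above: sample moments of synthetic counts compared with their theoretical Poisson values (equal mean and variance, skewness 1/sqrt(lambda), excess kurtosis 1/lambda).

```python
# Compare sample moments of photon counts with Poisson theory (synthetic counts).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
counts = rng.poisson(lam=4.2, size=5000)      # counts per time bin, hypothetical rate

lam = counts.mean()
print(f"mean {lam:.3f}  variance {counts.var(ddof=1):.3f}  (Poisson: equal)")
print(f"skewness {stats.skew(counts):.3f}  vs Poisson 1/sqrt(lambda) = {lam ** -0.5:.3f}")
print(f"excess kurtosis {stats.kurtosis(counts):.3f}  vs Poisson 1/lambda = {1 / lam:.3f}")
```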
Stress distribution in composite flatwise tension test specimens
NASA Technical Reports Server (NTRS)
Scott, Curtis A.; Pereira, J. Michael
1993-01-01
A finite element analysis was conducted to determine the stress distribution in typical graphite/epoxy composite flatwise tension (FWT) specimens under normal loading conditions. The purpose of the analysis was to determine the relationship between the applied load and the stress in the sample to evaluate the validity of the test as a means of measuring the out-of-plane strength of a composite laminate. Three different test geometries and three different material lay-ups were modeled. In all cases, the out-of-plane component of stress in the test section was found to be uniform, with no stress concentrations, and very close to the nominal applied stress. The stress in the sample was found to be three-dimensional, and the magnitude of in-plane normal and shear stresses varied with the anisotropy of the test specimen. However, in the cases considered here, these components of stress were much smaller than the out-of-plane normal stress. The geometry of the test specimen had little influence on the results. It was concluded that the flatwise tension test provides a good measure of the out-of-plane strength for the representative materials that were studied.
Effect of distributive mass of spring on power flow in engineering test
NASA Astrophysics Data System (ADS)
Sheng, Meiping; Wang, Ting; Wang, Minqing; Wang, Xiao; Zhao, Xuan
2018-06-01
The mass of a spring is usually neglected in theoretical and simulation analyses, while it may be significant in practical engineering. This paper is concerned with the distributive mass of a steel spring used as an isolator to simulate the isolation performance of a water pipe in a heating system. A theoretical derivation of the effect of the distributive mass of the steel spring on vibration is presented, and multiple eigenfrequencies are obtained, which show that distributive mass results in extra modes and complex impedance properties. Furthermore, numerical simulation visually shows several anti-resonances of the steel spring in the corresponding impedance and power flow curves. When anti-resonances emerge, the spring stores a large amount of energy, which may cause damage and unexpected consequences in practical engineering and needs to be avoided. Finally, experimental tests are conducted and the results are consistent with those of the simulation of the spring with distributive mass.
Regional frequency analysis of extreme rainfalls using partial L moments method
NASA Astrophysics Data System (ADS)
Zakaria, Zahrahtul Amani; Shabri, Ani
2013-07-01
An approach based on regional frequency analysis using L moments and LH moments is revisited in this study. Subsequently, an alternative regional frequency analysis using the partial L moments (PL moments) method is employed, and a new relationship for homogeneity analysis is developed. The results were then compared with those obtained using the method of L moments and LH moments of order two. The Selangor catchment, consisting of 37 sites and located on the west coast of Peninsular Malaysia, is chosen as a case study. PL moments for the generalized extreme value (GEV), generalized logistic (GLO), and generalized Pareto distributions were derived and used to develop the regional frequency analysis procedure. The PL moment ratio diagram and the Z test were employed in determining the best-fit distribution. Comparison between the three approaches showed that the GLO and GEV distributions were identified as the suitable distributions for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation used for performance evaluation shows that the method of PL moments would outperform the L and LH moments methods for estimation of large return period events.
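For reference, a sketch of the basic building block: unbiased sample probability-weighted moments and the first four L-moments and L-moment ratios of a synthetic annual-maximum series; the partial (censored) PL-moment variant and the regional Z test are not reproduced here.

```python
# Sample L-moments from unbiased probability-weighted moments (Hosking's
# estimators); the data are a synthetic annual-maximum rainfall series.
import numpy as np

def sample_l_moments(x):
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    b3 = np.sum((i - 1) * (i - 2) * (i - 3) / ((n - 1) * (n - 2) * (n - 3)) * x) / n
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3 / l2, l4 / l2           # mean, L-scale, L-skewness, L-kurtosis

rng = np.random.default_rng(4)
annual_max = rng.gumbel(loc=60.0, scale=15.0, size=40)   # mm, hypothetical
l1, l2, t3, t4 = sample_l_moments(annual_max)
print(f"l1={l1:.1f}  l2={l2:.1f}  t3={t3:.3f}  t4={t4:.3f}")
```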
A Test Generation Framework for Distributed Fault-Tolerant Algorithms
NASA Technical Reports Server (NTRS)
Goodloe, Alwyn; Bushnell, David; Miner, Paul; Pasareanu, Corina S.
2009-01-01
Heavyweight formal methods such as theorem proving have been successfully applied to the analysis of safety critical fault-tolerant systems. Typically, the models and proofs performed during such analysis do not inform the testing process of actual implementations. We propose a framework for generating test vectors from specifications written in the Prototype Verification System (PVS). The methodology uses a translator to produce a Java prototype from a PVS specification. Symbolic (Java) PathFinder is then employed to generate a collection of test cases. A small example is employed to illustrate how the framework can be used in practice.
NASA Technical Reports Server (NTRS)
Smith, D. R.; Leslie, F. W.
1984-01-01
The Purdue Regional Objective Analysis of the Mesoscale (PROAM) is a successive correction type scheme for the analysis of surface meteorological data. The scheme is subjected to a series of experiments to evaluate its performance under a variety of analysis conditions. The tests include use of a known analytic temperature distribution to quantify error bounds for the scheme. Similar experiments were conducted using actual atmospheric data. Results indicate that the multiple pass technique increases the accuracy of the analysis. Furthermore, the tests suggest appropriate values for the analysis parameters in resolving disturbances for the data set used in this investigation.
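PROAM is described above only as a successive correction type scheme; the following is a hedged, generic sketch of one successive-correction pass using Cressman-type weights, which is an illustrative assumption rather than PROAM's actual weighting or parameter choices.

```python
# Generic successive-correction objective analysis with Cressman-type weights
# and a shrinking influence radius over multiple passes (synthetic data).
import numpy as np

def correction_pass(grid_xy, grid_vals, obs_xy, obs_vals, radius):
    d2 = ((grid_xy[:, None, :] - obs_xy[None, :, :]) ** 2).sum(axis=2)
    w = np.where(d2 < radius ** 2, (radius ** 2 - d2) / (radius ** 2 + d2), 0.0)
    # interpolate the current grid estimate to the observation points (nearest node here)
    nearest = ((obs_xy[:, None, :] - grid_xy[None, :, :]) ** 2).sum(axis=2).argmin(axis=1)
    innovation = obs_vals - grid_vals[nearest]
    wsum = w.sum(axis=1)
    update = np.where(wsum > 0, (w * innovation).sum(axis=1) / np.maximum(wsum, 1e-12), 0.0)
    return grid_vals + update

rng = np.random.default_rng(6)
grid_xy = np.array([(x, y) for x in range(10) for y in range(10)], dtype=float)
obs_xy = rng.uniform(0, 9, size=(25, 2))
obs_vals = 15.0 + 0.5 * obs_xy[:, 0] + rng.normal(0, 0.3, 25)   # synthetic surface temperatures
analysis = np.full(grid_xy.shape[0], obs_vals.mean())           # first guess
for radius in (6.0, 4.0, 2.0):                                  # successive passes, shrinking radius
    analysis = correction_pass(grid_xy, analysis, obs_xy, obs_vals, radius)
print(f"analysed field range: {analysis.min():.2f} to {analysis.max():.2f}")
```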
A practical and systematic review of Weibull statistics for reporting strengths of dental materials
Quinn, George D.; Quinn, Janet B.
2011-01-01
Objectives: To review the history, theory and current applications of Weibull analyses sufficient to make informed decisions regarding practical use of the analysis in dental material strength testing. Data: References are made to examples in the engineering and dental literature, but this paper also includes illustrative analyses of Weibull plots, fractographic interpretations, and Weibull distribution parameters obtained for a dense alumina, two feldspathic porcelains, and a zirconia. Sources: Informational sources include Weibull's original articles, later articles specific to applications and theoretical foundations of Weibull analysis, texts on statistics and fracture mechanics and the international standards literature. Study Selection: The chosen Weibull analyses are used to illustrate technique, the importance of flaw size distributions, physical meaning of Weibull parameters and concepts of "equivalent volumes" to compare measured strengths obtained from different test configurations. Conclusions: Weibull analysis has a strong theoretical basis and can be of particular value in dental applications, primarily because of test specimen size limitations and the use of different test configurations. Also endemic to dental materials, however, is increased difficulty in satisfying application requirements, such as confirming fracture origin type and diligence in obtaining quality strength data. PMID:19945745
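As a concrete illustration of the kind of analysis reviewed above, the following sketch fits a two-parameter Weibull distribution to synthetic strength data by median-rank regression on the linearized CDF; the fractographic and equivalent-volume aspects discussed in the paper are not reproduced.

```python
# Two-parameter Weibull strength analysis by median-rank regression on the
# linearised CDF: ln(-ln(1-F)) = m*ln(sigma) - m*ln(sigma_0). Synthetic data.
import numpy as np

rng = np.random.default_rng(8)
strengths = rng.weibull(a=10.0, size=30) * 500.0     # MPa, hypothetical flexure strengths

s = np.sort(strengths)
n = s.size
F = (np.arange(1, n + 1) - 0.5) / n                  # median-rank style plotting positions
y = np.log(-np.log(1.0 - F))
x = np.log(s)

m, intercept = np.polyfit(x, y, 1)                   # slope = Weibull modulus m
sigma_0 = np.exp(-intercept / m)                     # characteristic strength (63.2 % point)
print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma_0:.0f} MPa")
```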
Impact of electric vehicles on the IEEE 34 node distribution infrastructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Zeming; Shalalfel, Laith; Beshir, Mohammed J.
With the growing penetration of the electric vehicles to our daily life owing to their economic and environmental benefits, there will be both opportunities and challenges to the utilities when adopting plug-in electric vehicles (PEV) to the distribution network. In this study, a thorough analysis based on real-world project is conducted to evaluate the impacts of electric vehicles infrastructure on the grid relating to system load flow, load factor, and voltage stability. IEEE 34 node test feeder was selected and tested along with different case scenarios utilizing the electrical distribution design (EDD) software to find out the potential impacts to the grid.
Impact of electric vehicles on the IEEE 34 node distribution infrastructure
Jiang, Zeming; Shalalfel, Laith; Beshir, Mohammed J.
2014-10-01
With the growing penetration of the electric vehicles to our daily life owing to their economic and environmental benefits, there will be both opportunities and challenges to the utilities when adopting plug-in electric vehicles (PEV) to the distribution network. In this study, a thorough analysis based on real-world project is conducted to evaluate the impacts of electric vehicles infrastructure on the grid relating to system load flow, load factor, and voltage stability. IEEE 34 node test feeder was selected and tested along with different case scenarios utilizing the electrical distribution design (EDD) software to find out the potential impacts to the grid.
Normality of raw data in general linear models: The most widespread myth in statistics
Kery, Marc; Hatfield, Jeff S.
2003-01-01
In years of statistical consulting for ecologists and wildlife biologists, by far the most common misconception we have come across has been the one about normality in general linear models. These comprise a very large part of the statistical models used in ecology and include t tests, simple and multiple linear regression, polynomial regression, and analysis of variance (ANOVA) and covariance (ANCOVA). There is a widely held belief that the normality assumption pertains to the raw data rather than to the model residuals. We suspect that this error may also occur in countless published studies, whenever the normality assumption is tested prior to analysis. This may lead to the use of nonparametric alternatives (if there are any), when parametric tests would indeed be appropriate, or to use of transformations of raw data, which may introduce hidden assumptions such as multiplicative effects on the natural scale in the case of log-transformed data. Our aim here is to dispel this myth. We very briefly describe relevant theory for two cases of general linear models to show that the residuals need to be normally distributed if tests requiring normality are to be used, such as t and F tests. We then give two examples demonstrating that the distribution of the response variable may be nonnormal, and yet the residuals are well behaved. We do not go into the issue of how to test normality; instead we display the distributions of response variables and residuals graphically.
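A minimal simulation of the article's central point, with synthetic data: the raw response fails a normality test because the group means differ, while the residuals from the group means, which are what t and F tests actually require to be normal, pass it.

```python
# Raw response is bimodal (two groups, different means) and fails a normality
# test; the ANOVA residuals are normal and pass it. Synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(12)
group = np.repeat([0, 1], 100)
y = np.where(group == 0, rng.normal(5, 1, 200), rng.normal(15, 1, 200))  # bimodal response

residuals = y - np.array([y[group == g].mean() for g in group])          # residuals from group means
print("Shapiro-Wilk on raw response: p =", round(stats.shapiro(y).pvalue, 4))
print("Shapiro-Wilk on residuals:    p =", round(stats.shapiro(residuals).pvalue, 4))
```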
Foglia, L.; Hill, Mary C.; Mehl, Steffen W.; Burlando, P.
2009-01-01
We evaluate the utility of three interrelated means of using data to calibrate the fully distributed rainfall‐runoff model TOPKAPI as applied to the Maggia Valley drainage area in Switzerland. The use of error‐based weighting of observation and prior information data, local sensitivity analysis, and single‐objective function nonlinear regression provides quantitative evaluation of sensitivity of the 35 model parameters to the data, identification of data types most important to the calibration, and identification of correlations among parameters that contribute to nonuniqueness. Sensitivity analysis required only 71 model runs, and regression required about 50 model runs. The approach presented appears to be ideal for evaluation of models with long run times or as a preliminary step to more computationally demanding methods. The statistics used include composite scaled sensitivities, parameter correlation coefficients, leverage, Cook's D, and DFBETAS. Tests suggest predictive ability of the calibrated model typical of hydrologic models.
Open-source framework for power system transmission and distribution dynamics co-simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Renke; Fan, Rui; Daily, Jeff
The promise of the smart grid entails more interactions between the transmission and distribution networks, and there is an immediate need for tools to provide the comprehensive modelling and simulation required to integrate operations at both transmission and distribution levels. Existing electromagnetic transient simulators can perform simulations with integration of transmission and distribution systems, but the computational burden is high for large-scale system analysis. For transient stability analysis, currently there are only separate tools for simulating transient dynamics of the transmission and distribution systems. In this paper, we introduce an open source co-simulation framework “Framework for Network Co-Simulation” (FNCS), together with the decoupled simulation approach that links existing transmission and distribution dynamic simulators through FNCS. FNCS is a middleware interface and framework that manages the interaction and synchronization of the transmission and distribution simulators. Preliminary testing results show the validity and capability of the proposed open-source co-simulation framework and the decoupled co-simulation methodology.
Distributed Actuation and Sensing on an Uninhabited Aerial Vehicle
NASA Technical Reports Server (NTRS)
Barnwell, William Garrard
2003-01-01
An array of effectors and sensors has been designed, tested and implemented on a Blended Wing Body Uninhabited Aerial Vehicle (UAV). The UAV is modified to serve as a flying, controls research, testbed. This effector/sensor array provides for the dynamic vehicle testing of controller designs and the study of decentralized control techniques. Each wing of the UAV is equipped with 12 distributed effectors that comprise a segmented array of independently actuated, contoured control surfaces. A single pressure sensor is installed near the base of each effector to provide a measure of deflections of the effectors. The UAV wings were tested in the North Carolina State University Subsonic Wind Tunnel and the pressure distributions that result from the deflections of the effectors are characterized. The results of the experiments are used to develop a simple, but accurate, prediction method, such that for any arrangement of the effector array the corresponding pressure distribution can be determined. Numerical analysis using the panel code CMARC verifies this prediction method.
UAV Flight Control Using Distributed Actuation and Sensing
NASA Technical Reports Server (NTRS)
Barnwell, William G.; Heinzen, Stearns N.; Hall, Charles E., Jr.; Chokani, Ndaona; Raney, David L. (Technical Monitor)
2003-01-01
An array of effectors and sensors has been designed, tested and implemented on a Blended Wing Body Uninhabited Aerial Vehicle (UAV). This UAV is modified to serve as a flying, controls research, testbed. This effector/sensor array provides for the dynamic vehicle testing of controller designs and the study of decentralized control techniques. Each wing of the UAV is equipped with 12 distributed effectors that comprise a segmented array of independently actuated, contoured control surfaces. A single pressure sensor is installed near the base of each effector to provide a measure of deflections of the effectors. The UAV wings were tested in the North Carolina State University Subsonic Wind Tunnel and the pressure distributions that result from the deflections of the effectors are characterized. The results of the experiments are used to develop a simple, but accurate, prediction method, such that for any arrangement of the effector array the corresponding pressure distribution can be determined. Numerical analysis using the panel code CMARC verifies this prediction method.
Robustness of S1 statistic with Hodges-Lehmann for skewed distributions
NASA Astrophysics Data System (ADS)
Ahad, Nor Aishah; Yahaya, Sharipah Soaad Syed; Yin, Lee Ping
2016-10-01
Analysis of variance (ANOVA) is a commonly used parametric method to test for differences in means among more than two groups when the populations are normally distributed. ANOVA is highly inefficient under non-normal and heteroscedastic settings. When the assumptions are violated, researchers look for alternatives such as the nonparametric Kruskal-Wallis test or robust methods. This study focused on a flexible method, the S1 statistic, for comparing groups using the median as the location estimator. The S1 statistic was modified by substituting the median with the Hodges-Lehmann estimator, and the default scale estimator with the variance of the Hodges-Lehmann estimator and with MADn, to produce two different test statistics for comparing groups. The bootstrap method was used for testing the hypotheses, since the sampling distributions of these modified S1 statistics are unknown. The performance of the proposed statistics in terms of Type I error was measured and compared against the original S1 statistic, ANOVA and Kruskal-Wallis. The proposed procedures show improvement over the original statistic, especially under extremely skewed distributions.
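For reference, a small sketch of the location estimator being substituted into S1, the one-sample Hodges-Lehmann estimate (median of the pairwise Walsh averages); the modified S1 statistic and its bootstrap are not reproduced here.

```python
# One-sample Hodges-Lehmann estimator: median of all Walsh averages
# (x_i + x_j)/2 with i <= j, compared here with the mean and median.
import numpy as np

def hodges_lehmann(x):
    x = np.asarray(x, dtype=float)
    i, j = np.triu_indices(x.size)              # all pairs with i <= j
    walsh_averages = (x[i] + x[j]) / 2.0
    return np.median(walsh_averages)

rng = np.random.default_rng(21)
sample = rng.chisquare(df=3, size=50)           # a skewed sample
print("mean", round(sample.mean(), 2),
      "median", round(np.median(sample), 2),
      "Hodges-Lehmann", round(hodges_lehmann(sample), 2))
```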
[The research protocol VI: How to choose the appropriate statistical test. Inferential statistics].
Flores-Ruiz, Eric; Miranda-Novales, María Guadalupe; Villasís-Keever, Miguel Ángel
2017-01-01
The statistical analysis can be divided into two main components: descriptive analysis and inferential analysis. Inference consists of drawing conclusions from tests performed on data obtained from a sample of a population. Statistical tests are used in order to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was obtained. However, choosing the appropriate statistical test in general poses a challenge for novice researchers. To choose the statistical test it is necessary to take into account three aspects: the research design, the number of measurements and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric. Parametric tests can only be used if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.
Test and Analysis of a Hyper-X Carbon-Carbon Leading Edge Chine
NASA Technical Reports Server (NTRS)
Smith, Russell W.; Sikora, Joseph G.; Lindell, Michael C.
2005-01-01
During parts production for the X-43A Mach 10 hypersonic vehicle, nondestructive evaluation (NDE) of a leading edge chine detected an embedded delamination near the lower surface of the part. An ultimate proof test was conducted to verify the ultimate strength of this leading edge chine part. The ultimate proof test setup used a pressure bladder design to impose a uniform distributed pressure field over the bi-planar surface of the chine test article. A detailed description of the chine test article and experimental test setup is presented. Analysis results from a linear static model of the test article are also presented and discussed. Post-test inspection of the specimen revealed no visible failures or areas of delamination.
Robust LOD scores for variance component-based linkage analysis.
Blangero, J; Williams, J T; Almasy, L
2000-01-01
The variance component method is now widely used for linkage analysis of quantitative traits. Although this approach offers many advantages, the importance of the underlying assumption of multivariate normality of the trait distribution within pedigrees has not been studied extensively. Simulation studies have shown that traits with leptokurtic distributions yield linkage test statistics that exhibit excessive Type I error when analyzed naively. We derive analytical formulae relating the deviation from the expected asymptotic distribution of the lod score to the kurtosis and total heritability of the quantitative trait. A simple correction constant yields a robust lod score for any deviation from normality and for any pedigree structure, and effectively eliminates the problem of inflated Type I error due to misspecification of the underlying probability model in variance component-based linkage analysis.
Plasma Electrolyte Distributions in Humans-Normal or Skewed?
Feldman, Mark; Dickson, Beverly
2017-11-01
It is widely believed that plasma electrolyte levels are normally distributed. Statistical tests and calculations using plasma electrolyte data are often reported based on this assumption of normality. Examples include t tests, analysis of variance, correlations and confidence intervals. The purpose of our study was to determine whether plasma sodium (Na+), potassium (K+), chloride (Cl-) and bicarbonate (HCO3-) distributions are indeed normally distributed. We analyzed plasma electrolyte data from 237 consecutive adults (137 women and 100 men) who had normal results on a standard basic metabolic panel which included plasma electrolyte measurements. The skewness of each distribution (as a measure of its asymmetry) was compared to the zero skewness of a normal (Gaussian) distribution. The plasma Na+ distribution was skewed slightly to the right, but the skew was not significantly different from zero. The plasma Cl- distribution was skewed slightly to the left, but again the skew was not significantly different from zero. In contrast, both the plasma K+ and HCO3- distributions were significantly skewed to the right (P < 0.01 vs. zero skew). There was also a suggestion from examining frequency distribution curves that the K+ and HCO3- distributions were bimodal. In adults with a normal basic metabolic panel, plasma potassium and bicarbonate levels are not normally distributed and may be bimodal. Thus, statistical methods used to evaluate these 2 plasma electrolytes should be nonparametric tests and not parametric ones that require a normal distribution. Copyright © 2017 Southern Society for Clinical Investigation. Published by Elsevier Inc. All rights reserved.
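A minimal sketch of the kind of skewness assessment described is shown below, using SciPy's test of whether sample skewness differs from that of a normal distribution. The electrolyte values here are synthetic stand-ins, not the study data, and the distributional choices are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-ins for plasma electrolyte measurements (mmol/L); not the study data.
sodium = rng.normal(140, 2.5, 237)                     # roughly symmetric
potassium = 3.5 + rng.gamma(4, 0.15, 237)              # right-skewed for illustration

for name, values in [("Na+", sodium), ("K+", potassium)]:
    skew = stats.skew(values)
    stat, p = stats.skewtest(values)   # H0: skewness equals that of a normal distribution
    print(f"{name}: skewness = {skew:+.3f}, skewtest p = {p:.4f}")
```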
Life prediction and mechanical reliability of NT551 silicon nitride
NASA Astrophysics Data System (ADS)
Andrews, Mark Jay
The inert strength and fatigue performance of a diesel engine exhaust valve made from silicon nitride (Si3N4) ceramic were assessed. The Si3N4 characterized in this study was manufactured by Saint Gobain/Norton Industrial Ceramics and was designated as NT551. The evaluation was made utilizing a probabilistic life prediction algorithm that combined censored test specimen strength data with a Weibull distribution function and the stress field of the ceramic valve obtained from finite element analysis. The major assumptions of the life prediction algorithm are that the bulk ceramic material is isotropic and homogeneous and that the strength-limiting flaws are uniformly distributed. The results from mechanical testing indicated that NT551 was not a homogeneous ceramic and that its strength was a function of temperature, loading rate, and machining orientation. Fractographic analysis identified four different failure modes; two were inhomogeneities located throughout the bulk of NT551 and were due to processing operations. The fractographic analysis concluded that the strength degradation of NT551 observed from the temperature and loading rate test parameters was due to a change of state that occurred in its secondary phase. Pristine and engine-tested valves made from NT551 were loaded to failure and the inert strengths were obtained. Fractographic analysis of the valves identified the same four failure mechanisms as found with the test specimens. The fatigue performance and the inert strength of the Si3N4 valves were assessed from censored and uncensored test specimen strength data, respectively. The inert strength failure probability predictions were compared to the inert strength of the Si3N4 valves. The inert strength failure probability predictions were more conservative than the strength of the valves. The lack of correlation between predicted and actual valve strength was due to the nonuniform distribution of inhomogeneities present in NT551. For the same reasons, the predicted and actual fatigue performance did not correlate well. The results of this study should not be considered a limitation of the life prediction algorithm but emphasize the requirement that ceramics be homogeneous and strength-limiting flaws uniformly distributed as a prerequisite for accurate life prediction and reliability analyses.
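To illustrate the kind of Weibull weakest-link calculation that underlies such probabilistic life prediction algorithms, the sketch below combines element stresses and volumes from a finite element model with two-parameter Weibull strength parameters to obtain a failure probability. All numbers and names are hypothetical; this is not the NT551 algorithm or its data.

```python
import numpy as np

def weibull_failure_probability(stresses, volumes, sigma0, m, v0=1.0):
    """Weakest-link failure probability:
    Pf = 1 - exp(-sum_i (V_i/V0) * (sigma_i/sigma0)^m), tensile stresses only."""
    s = np.clip(np.asarray(stresses), 0.0, None)        # compressive stresses do not contribute
    risk = np.sum((np.asarray(volumes) / v0) * (s / sigma0) ** m)
    return 1.0 - np.exp(-risk)

# Illustrative values only: element stresses in MPa and volumes in mm^3
stresses = np.array([120.0, 250.0, 310.0, 80.0])
volumes = np.array([2.0, 1.5, 0.8, 3.0])
pf = weibull_failure_probability(stresses, volumes, sigma0=600.0, m=12.0)
print(f"Predicted failure probability: {pf:.3e}")
```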
The fundamental parameter method applied to X-ray fluorescence analysis with synchrotron radiation
NASA Astrophysics Data System (ADS)
Pantenburg, F. J.; Beier, T.; Hennrich, F.; Mommsen, H.
1992-05-01
Quantitative X-ray fluorescence analysis applying the fundamental parameter method is usually restricted to monochromatic excitation sources. It is shown here that such analyses can be performed as well with a white synchrotron radiation spectrum. To determine absolute elemental concentration values it is necessary to know the spectral distribution of this spectrum. A newly designed and tested experimental setup, which uses the synchrotron radiation emitted from electrons in a bending magnet of ELSA (the electron stretcher accelerator of the University of Bonn), is presented. The determination of the exciting spectrum, described by the given electron beam parameters, is limited due to uncertainties in the vertical electron beam size and divergence. We describe a method which allows us to determine the relative and absolute spectral distributions needed for accurate analysis. First test measurements of different alloys and standards of known composition demonstrate that it is possible to determine exact concentration values in bulk and trace element analysis.
Yoon, Jong H.; Tamir, Diana; Minzenberg, Michael J.; Ragland, J. Daniel; Ursu, Stefan; Carter, Cameron S.
2009-01-01
Background Multivariate pattern analysis is an alternative method of analyzing fMRI data, which is capable of decoding distributed neural representations. We applied this method to test the hypothesis of the impairment in distributed representations in schizophrenia. We also compared the results of this method with traditional GLM-based univariate analysis. Methods 19 schizophrenia and 15 control subjects viewed two runs of stimuli--exemplars of faces, scenes, objects, and scrambled images. To verify engagement with stimuli, subjects completed a 1-back matching task. A multi-voxel pattern classifier was trained to identify category-specific activity patterns on one run of fMRI data. Classification testing was conducted on the remaining run. Correlation of voxel-wise activity across runs evaluated variance over time in activity patterns. Results Patients performed the task less accurately. This group difference was reflected in the pattern analysis results with diminished classification accuracy in patients compared to controls, 59% and 72% respectively. In contrast, there was no group difference in GLM-based univariate measures. In both groups, classification accuracy was significantly correlated with behavioral measures. Both groups showed highly significant correlation between inter-run correlations and classification accuracy. Conclusions Distributed representations of visual objects are impaired in schizophrenia. This impairment is correlated with diminished task performance, suggesting that decreased integrity of cortical activity patterns is reflected in impaired behavior. Comparisons with univariate results suggest greater sensitivity of pattern analysis in detecting group differences in neural activity and reduced likelihood of non-specific factors driving these results. PMID:18822407
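The sketch below illustrates the train-on-one-run, test-on-the-other scheme and the inter-run pattern correlation described above, using synthetic data and a scikit-learn classifier. The classifier choice, data sizes, and noise levels are assumptions for illustration, not the study's pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_trials, n_voxels, n_categories = 80, 200, 4            # hypothetical sizes
labels = np.tile(np.arange(n_categories), n_trials // n_categories)

# Synthetic category-specific activity patterns plus noise, for two runs
patterns = rng.normal(0, 1, (n_categories, n_voxels))
run1 = patterns[labels] + rng.normal(0, 2.0, (n_trials, n_voxels))
run2 = patterns[labels] + rng.normal(0, 2.0, (n_trials, n_voxels))

# Train the multi-voxel pattern classifier on run 1, test on run 2
clf = LogisticRegression(max_iter=1000).fit(run1, labels)
accuracy = clf.score(run2, labels)

# Voxel-wise correlation of mean category patterns across runs (pattern stability)
mean1 = np.array([run1[labels == c].mean(axis=0) for c in range(n_categories)])
mean2 = np.array([run2[labels == c].mean(axis=0) for c in range(n_categories)])
inter_run_r = np.mean([np.corrcoef(mean1[c], mean2[c])[0, 1] for c in range(n_categories)])

print(f"cross-run accuracy = {accuracy:.2f}, inter-run correlation = {inter_run_r:.2f}")
```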
Kwon, Yong Hyun; Kwon, Jung Won; Lee, Myoung Hee
2015-01-01
[Purpose] The purpose of the current study was to compare the effectiveness of motor sequential learning according to two different types of practice schedules, a distributed practice schedule (two 12-hour inter-trial intervals) and a massed practice schedule (two 10-minute inter-trial intervals), using a serial reaction time (SRT) task. [Subjects and Methods] Thirty healthy subjects were recruited and then randomly and evenly assigned to either the distributed practice group or the massed practice group. All subjects performed three consecutive sessions of the SRT task following one of the two different types of practice schedules. Distributed practice was scheduled for two 12-hour inter-session intervals including sleeping time, whereas massed practice was administered for two 10-minute inter-session intervals. Response time (RT) and response accuracy (RA) were measured at pre-test, mid-test, and post-test. [Results] For RT, univariate analysis demonstrated significant main effects in the within-group comparison of the three tests as well as the interaction effect of two groups × three tests, whereas the between-group comparison showed no significant effect. The results for RA showed no significant differences in either the between-group comparison or the interaction effect of two groups × three tests, whereas the within-group comparison of the three tests showed a significant main effect. [Conclusion] Distributed practice led to enhancement of motor skill acquisition at the first inter-session interval as well as at the second inter-session interval the following day, compared to massed practice. Consequently, the results of this study suggest that a distributed practice schedule can enhance the effectiveness of motor sequential learning in one-day as well as two-day learning formats compared to massed practice. PMID:25931727
Domina, Thurston; Penner, Emily; Hoynes, Hilary
2014-01-01
We use quantile treatment effects estimation to examine the consequences of the random-assignment New York City School Choice Scholarship Program (NYCSCSP) across the distribution of student achievement. Our analyses suggest that the program had negligible and statistically insignificant effects across the skill distribution. In addition to contributing to the literature on school choice, the paper illustrates several ways in which distributional effects estimation can enrich educational research: First, we demonstrate that moving beyond a focus on mean effects estimation makes it possible to generate and test new hypotheses about the heterogeneity of educational treatment effects that speak to the justification for many interventions. Second, we demonstrate that distributional effects can uncover issues even with well-studied datasets by forcing analysts to view their data in new ways. Finally, such estimates highlight where in the overall national achievement distribution test scores of children exposed to particular interventions lie; this is important for exploring the external validity of the intervention’s effects. PMID:26207158
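One simple way to look at treatment effects across the outcome distribution is quantile regression on a treatment indicator, as sketched below with statsmodels. This is only an illustrative sketch on synthetic data, not the NYCSCSP data or the authors' exact quantile treatment effects estimator.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
treated = rng.integers(0, 2, n)                            # random assignment
# Synthetic test scores with a small, heterogeneous treatment effect (illustrative only)
scores = rng.normal(0, 1, n) + treated * rng.normal(0.05, 0.1, n)

X = sm.add_constant(treated.astype(float))
for q in (0.1, 0.25, 0.5, 0.75, 0.9):
    fit = sm.QuantReg(scores, X).fit(q=q)
    lo, hi = fit.conf_int()[1]
    print(f"quantile {q:.2f}: treatment effect = {fit.params[1]:+.3f} (95% CI {lo:+.3f} to {hi:+.3f})")
```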
Design, development and testing twin pulse tube cryocooler
NASA Astrophysics Data System (ADS)
Gour, Abhay Singh; Sagar, Pankaj; Karunanithi, R.
2017-09-01
The design and development of a Twin Pulse Tube Cryocooler (TPTC) is presented. Both coolers are driven by a single Linear Moving Magnet Synchronous Motor (LMMSM) with piston heads at both ends of the mover shaft. Magnetostatic analysis of the flux line distribution was carried out during the design and development of the LMMSM-based pressure wave generator (PWG). Based on the performance of the PWG, the design of the TPTC was carried out using Sage and Computational Fluid Dynamics (CFD) analysis. Detailed design, fabrication and testing of the LMMSM and TPTC and their integration tests are presented in this paper.
Harrison, Luke B; Larsson, Hans C E
2015-03-01
Likelihood-based methods are commonplace in phylogenetic systematics. Although much effort has been directed toward likelihood-based models for molecular data, comparatively less work has addressed models for discrete morphological character (DMC) data. Among-character rate variation (ACRV) may confound phylogenetic analysis, but there have been few analyses of the magnitude and distribution of rate heterogeneity among DMCs. Using 76 data sets covering a range of plants, invertebrate, and vertebrate animals, we used a modified version of MrBayes to test equal, gamma-distributed and lognormally distributed models of ACRV, integrating across phylogenetic uncertainty using Bayesian model selection. We found that in approximately 80% of data sets, unequal-rates models outperformed equal-rates models, especially among larger data sets. Moreover, although most data sets were equivocal, more data sets favored the lognormal rate distribution relative to the gamma rate distribution, lending some support for more complex character correlations than in molecular data. Parsimony estimation of the underlying rate distributions in several data sets suggests that the lognormal distribution is preferred when there are many slowly evolving characters and fewer quickly evolving characters. The commonly adopted four rate category discrete approximation used for molecular data was found to be sufficient to approximate a gamma rate distribution with discrete characters. However, among the two data sets tested that favored a lognormal rate distribution, the continuous distribution was better approximated with at least eight discrete rate categories. Although the effect of rate model on the estimation of topology was difficult to assess across all data sets, it appeared relatively minor between the unequal-rates models for the one data set examined carefully. As in molecular analyses, we argue that researchers should test and adopt the most appropriate model of rate variation for the data set in question. As discrete characters are increasingly used in more sophisticated likelihood-based phylogenetic analyses, it is important that these studies be built on the most appropriate and carefully selected underlying models of evolution. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
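The discrete approximation mentioned above is commonly built by splitting the rate distribution into k equal-probability categories and representing each by a quantile near the bin midpoint. The sketch below shows that construction for mean-one gamma and lognormal distributions; it is an assumed, simplified version of the approach, not the modified MrBayes implementation used in the study.

```python
import numpy as np
from scipy import stats

def discrete_rates(dist, k):
    """k equal-probability rate categories, each represented by its bin-midpoint quantile,
    renormalized so the mean rate is 1."""
    quantiles = (np.arange(k) + 0.5) / k
    rates = dist.ppf(quantiles)
    return rates / rates.mean()

alpha = 0.5
gamma_dist = stats.gamma(a=alpha, scale=1.0 / alpha)          # mean-one gamma
lognorm_dist = stats.lognorm(s=1.0, scale=np.exp(-0.5))       # mean-one lognormal (sigma = 1)

for k in (4, 8):
    print(f"k={k}: gamma     {np.round(discrete_rates(gamma_dist, k), 3)}")
    print(f"      lognormal {np.round(discrete_rates(lognorm_dist, k), 3)}")
```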
Kim, Young Hoon; Song, Kwang Yong
2017-06-26
A Brillouin optical time domain analysis (BOTDA) system utilizing tailored compensation for the propagation loss of the pump pulse is demonstrated for long-range and high-resolution distributed sensing. A continuous pump wave for distributed Brillouin amplification (DBA pump) of the pump pulse co-propagates with the probe wave, where gradual variation of the spectral width is additionally introduced to the DBA pump to obtain a uniform Brillouin gain along the position. In the experimental confirmation, a distributed strain measurement along a 51.2 km fiber under test is presented with a spatial resolution of 20 cm, in which the measurement error (σ) of less than 1.45 MHz and the near-constant Brillouin gain of the probe wave are maintained throughout the fiber.
The Significance of Breakdown Voltages for Quality Assurance of Low-Voltage BME Ceramic Capacitors
NASA Technical Reports Server (NTRS)
Teverovsky, Alexander A.
2014-01-01
Application of thin dielectric, base metal electrode (BME) ceramic capacitors for high-reliability applications requires development of testing procedures that can assure high quality and reliability of the parts. In this work, distributions of breakdown voltages (VBR) in a variety of low-voltage BME multilayer ceramic capacitors (MLCCs) have been measured and analyzed. It has been shown that analysis of the distributions can indicate the proportion of defective parts in the lot and the significance of the defects. Variations of the distributions after solder dip testing allow for an assessment of the robustness of capacitors to soldering-related stresses. The drawbacks of the existing screening and qualification methods for revealing defects in high-value, low-voltage MLCCs and the importance of VBR measurements are discussed. Analysis has shown that, due to a larger concentration of oxygen vacancies, defect-related degradation of the insulation resistance (IR) and failures are more likely in BME capacitors than in precious metal electrode (PME) capacitors.
Research on Fault Characteristics and Line Protections Within a Large-scale Photovoltaic Power Plant
NASA Astrophysics Data System (ADS)
Zhang, Chi; Zeng, Jie; Zhao, Wei; Zhong, Guobin; Xu, Qi; Luo, Pandian; Gu, Chenjie; Liu, Bohan
2017-05-01
Centralized photovoltaic (PV) systems have different fault characteristics from distributed PV systems due to the different system structures and controls. This makes the fault analysis and protection methods used in distribution networks with distributed PV not suitable for a centralized PV power plant. Therefore, a consolidated expression for the fault current within a PV power plant under different controls was calculated considering the fault response of the PV array. Then, supported by the fault current analysis and the on-site testing data, the overcurrent relay (OCR) performance was evaluated in the collection system of an 850 MW PV power plant. The evaluation reveals that the OCRs at the downstream side of overhead lines may malfunction. In this case, a new relay scheme was proposed using directional distance elements. In PSCAD/EMTDC, a detailed PV system model was built and verified using the on-site testing data. Simulation results indicate that the proposed relay scheme could effectively solve the problems under various fault scenarios and PV plant output levels.
Distribution of the two-sample t-test statistic following blinded sample size re-estimation.
Lu, Kaifeng
2016-05-01
We consider blinded sample size re-estimation based on the simple one-sample variance estimator at an interim analysis. We characterize the exact distribution of the standard two-sample t-test statistic at the final analysis. We describe a simulation algorithm for the evaluation of the probability of rejecting the null hypothesis at a given treatment effect. We compare the blinded sample size re-estimation method with two unblinded methods with respect to the empirical type I error, the empirical power, and the empirical distribution of the standard deviation estimator and final sample size. We characterize the type I error inflation across the range of standardized non-inferiority margins for non-inferiority trials, and derive the adjusted significance level to ensure type I error control for a given sample size of the internal pilot study. We show that the adjusted significance level increases as the sample size of the internal pilot study increases. Copyright © 2016 John Wiley & Sons, Ltd.
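A minimal simulation sketch of this kind of procedure is given below: the blinded (pooled) one-sample variance at an interim look drives the final sample size, and the type I error of the final two-sample t-test is estimated under the null. The sample size formula and power target are simplified assumptions, not the paper's exact algorithm or its adjusted significance levels.

```python
import numpy as np
from scipy import stats

def simulate_type1(n_pilot=20, delta=0.5, sigma=1.0, alpha=0.05, n_sims=20000, seed=0):
    """Blinded sample size re-estimation under H0 (no treatment difference)."""
    rng = np.random.default_rng(seed)
    rejections = 0
    z = stats.norm.ppf(1 - alpha / 2) + stats.norm.ppf(0.9)       # targeting 90% power
    for _ in range(n_sims):
        pilot = rng.normal(0.0, sigma, 2 * n_pilot)               # pooled, blinded pilot data
        s2_blinded = pilot.var(ddof=1)                            # simple one-sample variance estimator
        n_final = max(n_pilot, int(np.ceil(2 * s2_blinded * (z / delta) ** 2)))
        x = np.concatenate([pilot[:n_pilot], rng.normal(0.0, sigma, n_final - n_pilot)])
        y = np.concatenate([pilot[n_pilot:], rng.normal(0.0, sigma, n_final - n_pilot)])
        _, p = stats.ttest_ind(x, y)
        rejections += (p < alpha)
    return rejections / n_sims

print(f"empirical type I error: {simulate_type1():.4f}")
```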
Meta-analysis of diagnostic test data: a bivariate Bayesian modeling approach.
Verde, Pablo E
2010-12-30
In recent decades, the amount of published results on clinical diagnostic tests has expanded very rapidly. The counterpart to this development has been the formal evaluation and synthesis of diagnostic results. However, published results present substantial heterogeneity and can be regarded as so far removed from the classical domain of meta-analysis that they provide a rather severe test of classical statistical methods. Recently, bivariate random effects meta-analytic methods, which model the pairs of sensitivities and specificities, have been presented from the classical point of view. In this work a bivariate Bayesian modeling approach is presented. This approach substantially extends the scope of classical bivariate methods by allowing the structural distribution of the random effects to depend on multiple sources of variability. The meta-analysis is summarized by the predictive posterior distributions for sensitivity and specificity. This new approach also allows substantial model checking, model diagnostics and model selection. Statistical computations are implemented in public domain statistical software (WinBUGS and R) and illustrated with real data examples. Copyright © 2010 John Wiley & Sons, Ltd.
Schwantes-An, Tae-Hwi; Sung, Heejong; Sabourin, Jeremy A; Justice, Cristina M; Sorant, Alexa J M; Wilson, Alexander F
2016-01-01
In this study, the effects of (a) the minor allele frequency of the single nucleotide variant (SNV), (b) the degree of departure from normality of the trait, and (c) the position of the SNVs on type I error rates were investigated in the Genetic Analysis Workshop (GAW) 19 whole exome sequence data. To test the distribution of the type I error rate, 5 simulated traits were considered: standard normal and gamma distributed traits; 2 transformed versions of the gamma trait (log 10 and rank-based inverse normal transformations); and trait Q1 provided by GAW 19. Each trait was tested with 313,340 SNVs. Tests of association were performed with simple linear regression and average type I error rates were determined for minor allele frequency classes. Rare SNVs (minor allele frequency < 0.05) showed inflated type I error rates for non-normally distributed traits that increased as the minor allele frequency decreased. The inflation of average type I error rates increased as the significance threshold decreased. Normally distributed traits did not show inflated type I error rates with respect to the minor allele frequency for rare SNVs. There was no consistent effect of transformation on the uniformity of the distribution of the location of SNVs with a type I error.
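The rank-based inverse normal transformation mentioned above can be sketched as below: trait values are replaced by normal quantiles of their (offset) ranks. The Blom offset of 3/8 is an assumption here; the GAW 19 analysis pipeline and simple-linear-regression association tests are not reproduced.

```python
import numpy as np
from scipy import stats

def rank_inverse_normal(x, c=3.0 / 8.0):
    """Rank-based inverse normal transformation (Blom offset c = 3/8)."""
    ranks = stats.rankdata(x)
    return stats.norm.ppf((ranks - c) / (len(x) - 2 * c + 1))

rng = np.random.default_rng(0)
gamma_trait = rng.gamma(shape=1.0, scale=2.0, size=1000)      # strongly right-skewed trait
log_trait = np.log10(gamma_trait)
int_trait = rank_inverse_normal(gamma_trait)

for name, t in [("raw gamma", gamma_trait), ("log10", log_trait), ("inverse normal", int_trait)]:
    print(f"{name:15s} skewness = {stats.skew(t):+.3f}")
```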
Laurence, Ted A; Bude, Jeff D; Ly, Sonny; Shen, Nan; Feit, Michael D
2012-05-07
Surface laser damage limits the lifetime of optics for systems guiding high fluence pulses, particularly damage in silica optics used for inertial confinement fusion-class lasers (nanosecond-scale high energy pulses at 355 nm/3.5 eV). The density of damage precursors at low fluence has been measured using large beams (1-3 cm); higher fluences cannot be measured easily since the high density of resulting damage initiation sites results in clustering. We developed automated experiments and analysis that allow us to damage test thousands of sites with small beams (10-30 µm), and automatically image the test sites to determine if laser damage occurred. We developed an analysis method that provides a rigorous connection between these small beam damage test results of damage probability versus laser pulse energy and the large beam damage results of damage precursor densities versus fluence. We find that for uncoated and coated fused silica samples, the distribution of precursors nearly flattens at very high fluences, up to 150 J/cm2, providing important constraints on the physical distribution and nature of these precursors.
Multiscale power analysis for heart rate variability
NASA Astrophysics Data System (ADS)
Zeng, Peng; Liu, Hongxing; Ni, Huangjing; Zhou, Jing; Xia, Lan; Ning, Xinbao
2015-06-01
We first introduce the multiscale power (MSP) method to assess the power distribution of physiological signals on multiple time scales. Simulations on synthetic data and experiments on heart rate variability (HRV) data support the approach. Results show that both physical and psychological changes influence the power distribution significantly. A quantitative parameter, termed the power difference (PD), is introduced to evaluate the degree of power distribution alteration. We find that the dynamical correlation of HRV is destroyed completely when PD > 0.7.
Modeled ground water age distributions
Woolfenden, Linda R.; Ginn, Timothy R.
2009-01-01
The age of ground water in any given sample is a distributed quantity representing distributed provenance (in space and time) of the water. Conventional analysis of tracers such as unstable isotopes or anthropogenic chemical species gives discrete or binary measures of the presence of water of a given age. Modeled ground water age distributions provide a continuous measure of contributions from different recharge sources to aquifers. A numerical solution of the ground water age equation of Ginn (1999) was tested both on a hypothetical simplified one-dimensional flow system and under real world conditions. Results from these simulations yield the first continuous distributions of ground water age using this model. Complete age distributions as a function of one and two space dimensions were obtained from both numerical experiments. Simulations in the test problem produced mean ages that were consistent with the expected value at the end of the model domain for all dispersivity values tested, although the mean ages for the two highest dispersivity values deviated slightly from the expected value. Mean ages in the dispersionless case also were consistent with the expected mean ages throughout the physical model domain. Simulations under real world conditions for three dispersivity values resulted in decreasing mean age with increasing dispersivity. This likely is a consequence of an edge effect. However, simulations for all three dispersivity values tested were mass balanced and stable demonstrating that the solution of the ground water age equation can provide estimates of water mass density distributions over age under real world conditions.
iMARS--mutation analysis reporting software: an analysis of spontaneous cII mutation spectra.
Morgan, Claire; Lewis, Paul D
2006-01-31
The sensitivity of any mutational assay is determined by the level at which spontaneous mutations occur in the corresponding untreated controls. Establishing the type and frequency at which mutations occur naturally within a test system is essential if one is to draw scientifically sound conclusions regarding chemically induced mutations. Currently, mutation-spectra analysis is laborious and time-consuming. Thus, we have developed iMARS, a comprehensive mutation-spectrum analysis package that utilises routinely used methodologies and visualisation tools. To demonstrate the use and capabilities of iMARS, we have analysed the distribution, types and sequence context of spontaneous base substitutions derived from the cII gene mutation assay in transgenic animals. Analysis of spontaneous mutation spectra revealed variation both within and between the transgenic rodent test systems Big Blue Mouse, MutaMouse and Big Blue Rat. The most common spontaneous base substitutions were G:C-->A:T transitions and G:C-->T:A transversions. All Big Blue Mouse spectra were significantly different from each other by distribution and nearly all by mutation type, whereas the converse was true for the other test systems. Twenty-eight mutation hotspots were observed across all spectra generally occurring in CG, GA/TC, GG and GC dinucleotides. A mutation hotspot at nucleotide 212 occurred at a higher frequency in MutaMouse and Big Blue Rat. In addition, CG dinucleotides were the most mutable in all spectra except two Big Blue Mouse spectra. Thus, spontaneous base-substitution spectra showed more variation in distribution, type and sequence context in Big Blue Mouse relative to spectra derived from MutaMouse and Big Blue Rat. The results of our analysis provide a baseline reference for mutation studies utilising the cII gene in transgenic rodent models. The potential differences in spontaneous base-substitution spectra should be considered when making comparisons between these test systems. The ease at which iMARS has allowed us to carry out an exhaustive investigation to assess mutation distribution, mutation type, strand bias, target sequences and motifs, as well as predict mutation hotspots provides us with a valuable tool in helping to distinguish true chemically induced hotspots from background mutations and gives a true reflection of mutation frequency.
Trébucq, A; Guérin, N; Ali Ismael, H; Bernatas, J J; Sèvre, J P; Rieder, H L
2005-10-01
Djibouti, 1994 and 2001. To estimate the prevalence of tuberculosis (TB) and average annual risk of TB infection (ARTI) and trends, and to test a new method for calculations. Tuberculin surveys among schoolchildren and sputum smear-positive TB patients. Prevalence of infection was calculated using cut-off points, the mirror image technique, mixture analysis, and a new method based on the operating characteristics of the tuberculin test. Test sensitivity was derived from tuberculin reactions among TB patients and test specificity from a comparison of reaction size distributions among children with and without a BCG scar. The ARTI was estimated to lie between 2.6% and 3.1%, with no significant changes between 1994 and 2001. The close match of the distributions between children tested in 1994 and patients justifies the utilisation of the latter to determine test sensitivity. This new method gave very consistent estimates of prevalence of infection for any induration for values between 15 and 20 mm. Specificity was successfully determined for 1994, but not for 2001. Mixture analysis confirmed the estimates obtained with the new method. Djibouti has a high ARTI, and no apparent change over the observation time was found. Using operating test characteristics to estimate prevalence of infection looks promising.
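Using a test's operating characteristics to correct an apparent prevalence can be done with the classical Rogan-Gladen adjustment, sketched below. This is a simplified illustration of the idea described, with made-up numbers; it is not the authors' new method or the Djibouti survey figures.

```python
def corrected_prevalence(apparent, sensitivity, specificity):
    """Rogan-Gladen estimate of true prevalence from apparent prevalence and test operating characteristics."""
    p = (apparent + specificity - 1.0) / (sensitivity + specificity - 1.0)
    return min(max(p, 0.0), 1.0)          # clamp to [0, 1]

# Illustrative values only (not the Djibouti survey data)
apparent = 0.35        # fraction of children with induration above the chosen cut-off
sensitivity = 0.95     # derived from reaction sizes among smear-positive TB patients
specificity = 0.85     # derived from children with and without a BCG scar
print(f"estimated prevalence of infection: {corrected_prevalence(apparent, sensitivity, specificity):.3f}")
```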
Neti, Prasad V.S.V.; Howell, Roger W.
2008-01-01
Recently, the distribution of radioactivity among a population of cells labeled with 210Po was shown to be well described by a log normal distribution function (J Nucl Med 47, 6 (2006) 1049-1058) with the aid of an autoradiographic approach. To ascertain the influence of Poisson statistics on the interpretation of the autoradiographic data, the present work reports a detailed statistical analysis of these data. Methods The measured distributions of alpha particle tracks per cell were subjected to statistical tests with Poisson (P), log normal (LN), and Poisson – log normal (P – LN) models. Results The LN distribution function best describes the distribution of radioactivity among cell populations exposed to 0.52 and 3.8 kBq/mL 210Po-citrate. When cells were exposed to 67 kBq/mL, the P – LN distribution function gave a better fit; however, the underlying activity distribution remained log normal. Conclusions The present analysis generally provides further support for the use of LN distributions to describe the cellular uptake of radioactivity. Care should be exercised when analyzing autoradiographic data on activity distributions to ensure that Poisson processes do not distort the underlying LN distribution. PMID:16741316
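A rough sketch of comparing Poisson and lognormal descriptions of per-cell track counts by maximum likelihood is shown below on synthetic data. It is illustrative only: the counts are simulated, the lognormal is fitted only to positive counts as a continuous approximation, and the Poisson-lognormal compound model used in the study is not implemented here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic data: lognormally distributed per-cell activity, Poisson track counts given activity
activity = rng.lognormal(mean=1.0, sigma=0.8, size=500)
tracks = rng.poisson(activity)

# Poisson fit: the MLE of the rate is the sample mean
lam = tracks.mean()
loglik_poisson = stats.poisson.logpmf(tracks, lam).sum()

# Lognormal fit to the positive counts only, as a continuous approximation
positive = tracks[tracks > 0]
shape, loc, scale = stats.lognorm.fit(positive, floc=0)
loglik_lognorm = stats.lognorm.logpdf(positive, shape, loc, scale).sum()

# Note: a proper comparison would use the same support for both models (e.g., a
# Poisson-lognormal compound); these numbers are descriptive only.
print(f"Poisson log-likelihood (all cells):        {loglik_poisson:.1f}")
print(f"Lognormal log-likelihood (cells with > 0): {loglik_lognorm:.1f}")
```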
Ultrasonic test of resistance spot welds based on wavelet package analysis.
Liu, Jing; Xu, Guocheng; Gu, Xiaopeng; Zhou, Guanghao
2015-02-01
In this paper, ultrasonic testing of spot welds in stainless steel sheets has been studied. It is shown that traditional ultrasonic signal analysis in either the time domain or the frequency domain remains inadequate for evaluating the nugget diameter of spot welds. However, a method based on wavelet packet analysis in the time-frequency domain can easily distinguish the nugget from the corona bond by extracting high-frequency signals at different positions of the spot welds, thereby quantitatively evaluating the nugget diameter. The results of the ultrasonic tests fit the actual measured values well. The mean of the normal distribution fitted to the error statistics is 0.00187, and the standard deviation is 0.1392. Furthermore, the quality of the spot welds was evaluated, and it is shown that ultrasonic nondestructive testing based on wavelet packet analysis can be used to evaluate the quality of spot welds and is more reliable than a single destructive tensile test. Copyright © 2014 Elsevier B.V. All rights reserved.
Progress of soil radionuclide distribution studies for the Nevada Applied Ecology Group: 1981
DOE Office of Scientific and Technical Information (OSTI.GOV)
Essington, E.H.
Two nuclear sites have been under intensive study by the Nevada Applied Ecology Group (NAEG) during 1980 and 1981, NS201 in area 18 and NS219,221 in area 20. In support of the various studies, Los Alamos National Laboratory (Group LS-6) has provided consultation and evaluations relative to radionuclide distributions in soils inundated with radioactive debris from those tests. In addition, a referee effort was also conducted in both analysis of replicate samples and in evaluating various data sets for consistency of results. This report summarizes results of several of the data sets collected to test certain hypotheses relative to radionuclide distributions and factors affecting calculations of radionuclide inventories, and covers the period February 1980 to May 1981.
ERIC Educational Resources Information Center
Fouladi, Rachel T.
2000-01-01
Provides an overview of standard and modified normal theory and asymptotically distribution-free covariance and correlation structure analysis techniques and details Monte Carlo simulation results on Type I and Type II error control. Demonstrates through the simulation that robustness and nonrobustness of structure analysis techniques vary as a…
PresenceAbsence: An R package for presence absence analysis
Elizabeth A. Freeman; Gretchen Moisen
2008-01-01
The PresenceAbsence package for R provides a set of functions useful when evaluating the results of presence-absence analysis, for example, models of species distribution or the analysis of diagnostic tests. The package provides a toolkit for selecting the optimal threshold for translating a probability surface into presence-absence maps specifically tailored to their...
Separate-channel analysis of two-channel microarrays: recovering inter-spot information.
Smyth, Gordon K; Altman, Naomi S
2013-05-26
Two-channel (or two-color) microarrays are cost-effective platforms for comparative analysis of gene expression. They are traditionally analysed in terms of the log-ratios (M-values) of the two channel intensities at each spot, but this analysis does not use all the information available in the separate channel observations. Mixed models have been proposed to analyse intensities from the two channels as separate observations, but such models can be complex to use and the gain in efficiency over the log-ratio analysis is difficult to quantify. Mixed models yield test statistics for which the null distributions can be specified only approximately, and some approaches do not borrow strength between genes. This article reformulates the mixed model to clarify the relationship with the traditional log-ratio analysis, to facilitate information borrowing between genes, and to obtain an exact distributional theory for the resulting test statistics. The mixed model is transformed to operate on the M-values and A-values (average log-expression for each spot) instead of on the log-expression values. The log-ratio analysis is shown to ignore information contained in the A-values. The relative efficiency of the log-ratio analysis is shown to depend on the size of the intra-spot correlation. A new separate channel analysis method is proposed that assumes a constant intra-spot correlation coefficient across all genes. This approach permits the mixed model to be transformed into an ordinary linear model, allowing the data analysis to use a well-understood empirical Bayes analysis pipeline for linear modeling of microarray data. This yields statistically powerful test statistics that have an exact distributional theory. The log-ratio, mixed model and common correlation methods are compared using three case studies. The results show that separate channel analyses that borrow strength between genes are more powerful than log-ratio analyses. The common correlation analysis is the most powerful of all. The common correlation method proposed in this article for separate-channel analysis of two-channel microarray data is no more difficult to apply in practice than the traditional log-ratio analysis. It provides an intuitive and powerful means to conduct analyses and make comparisons that might otherwise not be possible.
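The M-value/A-value reparameterization at the heart of this approach is easy to sketch: for each spot, M is the within-spot log-ratio and A is the within-spot average log-expression. The sketch below computes both from synthetic two-channel intensities; the empirical Bayes common-correlation fit itself (as implemented in the limma package) is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_arrays = 5, 6
spot_effect = rng.normal(0, 1.0, (n_genes, n_arrays))            # shared spot-level variation
log_red = 8.0 + spot_effect + rng.normal(0, 0.5, (n_genes, n_arrays))
log_green = 8.0 + spot_effect + rng.normal(0, 0.5, (n_genes, n_arrays))

M = log_red - log_green                     # within-spot log-ratio (log2 scale)
A = 0.5 * (log_red + log_green)             # within-spot average log-expression

# The log-ratio analysis uses only M; the separate-channel (common correlation) approach
# also models A, with an intra-spot correlation linking the two channels of each spot.
intra_spot_corr = np.corrcoef(log_red.ravel(), log_green.ravel())[0, 1]
print(f"example intra-spot correlation: {intra_spot_corr:.2f}")
print("M-values (first gene):", np.round(M[0], 2))
print("A-values (first gene):", np.round(A[0], 2))
```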
Sensitivity analysis of a sound absorption model with correlated inputs
NASA Astrophysics Data System (ADS)
Chai, W.; Christen, J.-L.; Zine, A.-M.; Ichchou, M.
2017-04-01
Sound absorption in porous media is a complex phenomenon, which is usually addressed with homogenized models depending on macroscopic parameters. Since these parameters emerge from the structure at the microscopic scale, they may be correlated. This paper deals with sensitivity analysis methods for a sound absorption model with correlated inputs. Specifically, the Johnson-Champoux-Allard (JCA) model is chosen as the objective model, with correlation effects generated by a secondary micro-macro semi-empirical model. To deal with this case, a relatively new sensitivity analysis method, the Fourier Amplitude Sensitivity Test with Correlation design (FASTC), based on Iman's transform, is applied. This method requires a priori information such as the variables' marginal distribution functions and their correlation matrix. The results are compared to the Correlation Ratio Method (CRM) for reference and validation. The distribution of the macroscopic variables arising from the microstructure, as well as their correlation matrix, are studied. Finally, the test results show that correlation has a very important impact on the results of the sensitivity analysis.
Acoustic emission characteristics of copper alloys under low-cycle fatigue conditions
NASA Technical Reports Server (NTRS)
Krampfner, Y.; Kawamoto, A.; Ono, K.; Green, A.
1975-01-01
The acoustic emission (AE) characteristics of pure copper, zirconium-copper, and several copper alloys were determined to develop nondestructive evaluation schemes for thrust chambers through AE techniques. AE counts, rms voltages, frequency spectra, and amplitude distribution analysis were used to evaluate AE behavior under fatigue loading conditions. The results were interpreted with the aid of waveforms, crack propagation characteristics, and scanning electron fractographs of fatigue-tested samples. Samples of annealed alloys produced AE signals at the beginning of a fatigue test. Samples of zirconium-containing alloys that were annealed repeatedly after each fatigue loading cycle showed numerous surface cracks during the subsequent fatigue cycle, emitting strong burst AE signals. Amplitude distribution analysis exhibited responses that are characteristic of certain types of AE signals.
Etzel, C J; Shete, S; Beasley, T M; Fernandez, J R; Allison, D B; Amos, C I
2003-01-01
Non-normality of the phenotypic distribution can affect power to detect quantitative trait loci in sib pair studies. Previously, we observed that Winsorizing the sib pair phenotypes increased the power of quantitative trait locus (QTL) detection for both Haseman-Elston (HE) least-squares tests [Hum Hered 2002;53:59-67] and maximum likelihood-based variance components (MLVC) analysis [Behav Genet (in press)]. Winsorizing the phenotypes led to a slight increase in type 1 error in H-E tests and a slight decrease in type I error for MLVC analysis. Herein, we considered transforming the sib pair phenotypes using the Box-Cox family of transformations. Data were simulated for normal and non-normal (skewed and kurtic) distributions. Phenotypic values were replaced by Box-Cox transformed values. Twenty thousand replications were performed for three H-E tests of linkage and the likelihood ratio test (LRT), the Wald test and other robust versions based on the MLVC method. We calculated the relative nominal inflation rate as the ratio of observed empirical type 1 error divided by the set alpha level (5, 1 and 0.1% alpha levels). MLVC tests applied to non-normal data had inflated type I errors (rate ratio greater than 1.0), which were controlled best by Box-Cox transformation and to a lesser degree by Winsorizing. For example, for non-transformed, skewed phenotypes (derived from a chi2 distribution with 2 degrees of freedom), the rates of empirical type 1 error with respect to set alpha level=0.01 were 0.80, 4.35 and 7.33 for the original H-E test, LRT and Wald test, respectively. For the same alpha level=0.01, these rates were 1.12, 3.095 and 4.088 after Winsorizing and 0.723, 1.195 and 1.905 after Box-Cox transformation. Winsorizing reduced inflated error rates for the leptokurtic distribution (derived from a Laplace distribution with mean 0 and variance 8). Further, power (adjusted for empirical type 1 error) at the 0.01 alpha level ranged from 4.7 to 17.3% across all tests using the non-transformed, skewed phenotypes, from 7.5 to 20.1% after Winsorizing and from 12.6 to 33.2% after Box-Cox transformation. Likewise, power (adjusted for empirical type 1 error) using leptokurtic phenotypes at the 0.01 alpha level ranged from 4.4 to 12.5% across all tests with no transformation, from 7 to 19.2% after Winsorizing and from 4.5 to 13.8% after Box-Cox transformation. Thus the Box-Cox transformation apparently provided the best type 1 error control and maximal power among the procedures we considered for analyzing a non-normal, skewed distribution (chi2) while Winsorizing worked best for the non-normal, kurtic distribution (Laplace). We repeated the same simulations using a larger sample size (200 sib pairs) and found similar results. Copyright 2003 S. Karger AG, Basel
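The transformation step itself can be sketched as below: a skewed (chi-square) phenotype is Box-Cox transformed with the maximum-likelihood lambda, and a simple Winsorization is shown for comparison. The 5%/95% Winsorization limits are an assumption here, and the linkage tests themselves (H-E, LRT, Wald) are not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
phenotype = rng.chisquare(df=2, size=1000)            # skewed phenotype, as in the simulations

transformed, lam = stats.boxcox(phenotype)            # MLE of the Box-Cox parameter
winsorized = np.clip(phenotype,
                     np.percentile(phenotype, 5),
                     np.percentile(phenotype, 95))    # simple 5%/95% Winsorization for comparison

for name, values in [("raw", phenotype), ("Box-Cox", transformed), ("Winsorized", winsorized)]:
    print(f"{name:10s} skewness = {stats.skew(values):+.3f}, kurtosis = {stats.kurtosis(values):+.3f}")
print(f"estimated Box-Cox lambda = {lam:.3f}")
```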
Bringing the CMS distributed computing system into scalable operations
NASA Astrophysics Data System (ADS)
Belforte, S.; Fanfani, A.; Fisk, I.; Flix, J.; Hernández, J. M.; Kress, T.; Letts, J.; Magini, N.; Miccio, V.; Sciabà, A.
2010-04-01
Establishing efficient and scalable operations of the CMS distributed computing system critically relies on the proper integration, commissioning and scale testing of the data and workload management tools, the various computing workflows and the underlying computing infrastructure, located at more than 50 computing centres worldwide and interconnected by the Worldwide LHC Computing Grid. Computing challenges periodically undertaken by CMS in the past years with increasing scale and complexity have revealed the need for a sustained effort on computing integration and commissioning activities. The Processing and Data Access (PADA) Task Force was established at the beginning of 2008 within the CMS Computing Program with the mandate of validating the infrastructure for organized processing and user analysis including the sites and the workload and data management tools, validating the distributed production system by performing functionality, reliability and scale tests, helping sites to commission, configure and optimize the networking and storage through scale testing data transfers and data processing, and improving the efficiency of accessing data across the CMS computing system from global transfers to local access. This contribution reports on the tools and procedures developed by CMS for computing commissioning and scale testing as well as the improvements accomplished towards efficient, reliable and scalable computing operations. The activities include the development and operation of load generators for job submission and data transfers with the aim of stressing the experiment and Grid data management and workload management systems, site commissioning procedures and tools to monitor and improve site availability and reliability, as well as activities targeted to the commissioning of the distributed production, user analysis and monitoring systems.
Thermal-Structural Analysis of PICA Tiles for Solar Tower Test
NASA Technical Reports Server (NTRS)
Agrawal, Parul; Empey, Daniel M.; Squire, Thomas H.
2009-01-01
Thermal protection materials used in spacecraft heatshields are subjected to severe thermal and mechanical loading environments during re-entry into the Earth's atmosphere. In order to investigate the reliability of PICA tiles in the presence of high thermal gradients as well as mechanical loads, the authors designed and conducted solar-tower tests. This paper presents the design and analysis work for this test series. Coupled non-linear thermal-mechanical finite element analyses were conducted to estimate in-depth temperature distributions and stress contours for various cases. The first set of analyses, performed on an isolated PICA tile, showed that stresses generated during the tests were below the PICA allowable limit and should not lead to any catastrophic failure during the test. The test results were consistent with analytical predictions. The temperature distribution and magnitude of the measured strains were also consistent with predicted values. The second test series is designed to test the arrayed PICA tiles with various gap-filler materials. A nonlinear contact method is used to model the complex geometry with various tiles. The analyses for these coupons predict the stress contours in PICA and inside the gap fillers. Suitable mechanical loads for this architecture will be predicted, which can be applied during the test to exceed the allowable limits and demonstrate failure modes. Thermocouple and strain-gauge data obtained from the solar tower tests will be used for subsequent analyses and validation of FEM models.
Cuypers, Eva; Flinders, Bryn; Boone, Carolien M; Bosman, Ingrid J; Lusthof, Klaas J; Van Asten, Arian C; Tytgat, Jan; Heeren, Ron M A
2016-03-15
Today, hair testing is considered to be the standard method for the detection of chronic drug abuse. Nevertheless, the differentiation between systemic exposure and external contamination remains a major challenge in the forensic interpretation of hair analysis. Nowadays, it is still impossible to directly show the difference between external contamination and use-related incorporation. Although the effects of washing procedures on the distribution of (incorporated) drugs in hair remain unknown, these decontamination procedures prior to hair analysis are considered to be indispensable in order to exclude external contamination. However, insights into the effect of decontamination protocols on levels and distribution of drugs incorporated in hair are essential to draw the correct forensic conclusions from hair analysis; we studied the consequences of these procedures on the spatial distribution of cocaine in hair using imaging mass spectrometry. Additionally, using metal-assisted secondary ion mass spectrometry, we are the first to directly show the difference between cocaine-contaminated and user hair without any prior washing procedure.
ERIC Educational Resources Information Center
Bluestone, Barry A.
The study investigates the determinants of the earnings distribution in the U.S. paying particular attention to the less-skilled segment of the workforce. A general earnings theory is proposed which has elements of human capital theory, institutional hypotheses, and radical stratification analysis. Much attention is paid to testing the "crowding"…
EMTP based stability analysis of space station electric power system in a test bed environment
NASA Technical Reports Server (NTRS)
Dravid, Narayan V.; Kacpura, Thomas J.; Oconnor, Andrew M.
1992-01-01
The Space Station Freedom Electric Power System (EPS) will convert solar energy into electric energy and distribute the same using an 'all dc', Power Management and Distribution (PMAD) System. Power conditioning devices (dc to dc converters) are needed to interconnect parts of this system operating at different nominal voltage levels. Operation of such devices could generate under damped oscillations (instability) under certain conditions. Criteria for instability are examined and verified for a single device. Suggested extension of the criteria to a system operation is examined by using the EMTP model of the PMAD DC test bed. Wherever possible, data from the test bed is compared with the modeling results.
EMTP based stability analysis of Space Station Electric Power System in a test bed environment
NASA Technical Reports Server (NTRS)
Dravid, Narayan V.; Kacpura, Thomas J.; O'Connor, Andrew M.
1992-01-01
The Space Station Freedom Electric Power System (EPS) will convert solar energy into electric energy and distribute the same using an 'all dc', Power Management and Distribution (PMAD) System. Power conditioning devices (dc to dc converters) are needed to interconnect parts of this system operating at different nominal voltage levels. Operation of such devices could generate under damped oscillations (instability) under certain conditions. Criteria for instability are examined and verified for a single device. Suggested extension of the criteria to a system operation is examined by using the EMTP model of the PMAD dc test bed. Wherever possible, data from the test bed is compared with the modeling results.
Astrelin, A V; Sokolov, M V; Behnisch, T; Reymann, K G; Voronin, L L
1997-04-25
A statistical approach to analysis of amplitude fluctuations of postsynaptic responses is described. This includes (1) using an L1-metric in the space of distribution functions for minimisation, with application of linear programming methods, to decompose amplitude distributions into a convolution of Gaussian and discrete distributions; (2) deconvolution of the resulting discrete distribution with determination of the release probabilities and the quantal amplitude for cases with a small number (< 5) of discrete components. The methods were tested against simulated data over a range of sample sizes and signal-to-noise ratios which mimicked those observed in physiological experiments. In computer simulation experiments, comparisons were made with other methods of 'unconstrained' (generalized) and constrained reconstruction of discrete components from convolutions. The simulation results provided additional criteria for improving the solutions to overcome 'over-fitting phenomena' and to constrain the number of components with small probabilities. Application of the programme to recordings from hippocampal neurones demonstrated its usefulness for the analysis of amplitude distributions of postsynaptic responses.
Strain-controlled fatigue of acrylic bone cement.
Carter, D R; Gates, E I; Harris, W H
1982-09-01
Monotonic tensile tests and tension-compression fatigue tests were conducted on wet acrylic bone cement specimens at 37 degrees C. All testing was conducted in strain control at a strain rate of 0.02/s. Weibull analysis of the tensile tests indicated that monotonic fracture was governed more strongly by strain than by stress. The number of cycles to fatigue failure was also more strongly controlled by strain amplitude than by stress amplitude. Specimen porosity distribution played a major role in determining the tensile and fatigue strengths. The degree of data scatter suggests that Weibull analysis of fatigue data may be useful in developing design criteria for the surgical use of bone cement.
Bayesian multivariate hierarchical transformation models for ROC analysis.
O'Malley, A James; Zou, Kelly H
2006-02-15
A Bayesian multivariate hierarchical transformation model (BMHTM) is developed for receiver operating characteristic (ROC) curve analysis based on clustered continuous diagnostic outcome data with covariates. Two special features of this model are that it incorporates non-linear monotone transformations of the outcomes and that multiple correlated outcomes may be analysed. The mean, variance, and transformation components are all modelled parametrically, enabling a wide range of inferences. The general framework is illustrated by focusing on two problems: (1) analysis of the diagnostic accuracy of a covariate-dependent univariate test outcome requiring a Box-Cox transformation within each cluster to map the test outcomes to a common family of distributions; (2) development of an optimal composite diagnostic test using multivariate clustered outcome data. In the second problem, the composite test is estimated using discriminant function analysis and compared to the test derived from logistic regression analysis where the gold standard is a binary outcome. The proposed methodology is illustrated on prostate cancer biopsy data from a multi-centre clinical trial.
Bayesian multivariate hierarchical transformation models for ROC analysis
O'Malley, A. James; Zou, Kelly H.
2006-01-01
SUMMARY A Bayesian multivariate hierarchical transformation model (BMHTM) is developed for receiver operating characteristic (ROC) curve analysis based on clustered continuous diagnostic outcome data with covariates. Two special features of this model are that it incorporates non-linear monotone transformations of the outcomes and that multiple correlated outcomes may be analysed. The mean, variance, and transformation components are all modelled parametrically, enabling a wide range of inferences. The general framework is illustrated by focusing on two problems: (1) analysis of the diagnostic accuracy of a covariate-dependent univariate test outcome requiring a Box–Cox transformation within each cluster to map the test outcomes to a common family of distributions; (2) development of an optimal composite diagnostic test using multivariate clustered outcome data. In the second problem, the composite test is estimated using discriminant function analysis and compared to the test derived from logistic regression analysis where the gold standard is a binary outcome. The proposed methodology is illustrated on prostate cancer biopsy data from a multi-centre clinical trial. PMID:16217836
DeVore, Matthew S; Gull, Stephen F; Johnson, Carey K
2012-04-05
We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions.
NASA Technical Reports Server (NTRS)
Krantz, Timothy L.
2002-01-01
The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type 1 censoring. The software was verified by reproducing results published by others.
NASA Technical Reports Server (NTRS)
Kranz, Timothy L.
2002-01-01
The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.
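The core calculation such software performs, maximum-likelihood fitting of a two-parameter Weibull distribution to fatigue lives that include type I (right-) censored suspensions, can be sketched as below. This is a generic illustration, not the NASA software itself; the synthetic lives, censoring time, and log-scale parameterization are assumptions for the example.

```python
import numpy as np
from scipy import stats, optimize

def weibull_negloglik(log_params, times, failed):
    """Negative log-likelihood for a two-parameter Weibull with type I (right-) censoring.
    Parameters are optimized on the log scale to keep them positive."""
    shape, scale = np.exp(log_params)
    logpdf = stats.weibull_min.logpdf(times, shape, scale=scale)   # contribution of failures
    logsf = stats.weibull_min.logsf(times, shape, scale=scale)     # contribution of suspensions
    return -(np.sum(logpdf[failed]) + np.sum(logsf[~failed]))

rng = np.random.default_rng(0)
true_shape, true_scale = 1.8, 5.0e6
lives = stats.weibull_min.rvs(true_shape, scale=true_scale, size=30, random_state=rng)
censor_time = 6.0e6                           # suspend tests at a fixed cycle count (type I censoring)
failed = lives < censor_time
times = np.where(failed, lives, censor_time)

result = optimize.minimize(weibull_negloglik, x0=[0.0, np.log(times.mean())],
                           args=(times, failed), method="Nelder-Mead")
shape_hat, scale_hat = np.exp(result.x)
print(f"estimated Weibull shape = {shape_hat:.2f}, scale = {scale_hat:.3e} "
      f"({failed.sum()} failures, {(~failed).sum()} suspensions)")
```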
TEMPEST code simulations of hydrogen distribution in reactor containment structures. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trent, D.S.; Eyler, L.L.
The mass transport version of the TEMPEST computer code was used to simulate hydrogen distribution in geometric configurations relevant to reactor containment structures. Predicted results of Battelle-Frankfurt hydrogen distribution tests 1 to 6, and 12 are presented. Agreement between predictions and experimental data is good. Best agreement is obtained using the k-epsilon turbulence model in TEMPEST in flow cases where turbulent diffusion and stable stratification are dominant mechanisms affecting transport. The code's general analysis capabilities are summarized.
Qiao, Yuchun; Shang, Jizhen; Li, Shuying; Feng, Luping; Jiang, Yao; Duan, Zhiqiang; Lv, Xiaoxia; Zhang, Chunxian; Yao, Tiantian; Dong, Zhichao; Zhang, Yu; Wang, Hua
2016-11-04
A fluorimetric Hg2+ test strip has been developed using a lotus-inspired fabrication method that suppresses "coffee stains" and promotes a uniform distribution of probe materials by creating a hydrophobic drying pattern for fast solvent evaporation. The test strips were first loaded with model probes of fluorescent gold-silver nanoclusters and then vacuum dried on the hydrophobic pattern. On the one hand, the hydrophobic constraining forces of the lotus surface-like pattern could control the exterior transport of dispersed nanoclusters on the strips, minimizing "coffee stains". On the other hand, the vacuum-aided fast solvent evaporation could boost the interior Marangoni flow of probe materials on the strips, further improving the probe distribution. High aqueous stability and enhanced fluorescence of the probes on the test strips were achieved by hydrophilic treatment with amine-derivatized silane. A test strip-based fluorimetry has thereby been developed for probing Hg2+ ions in wastewater, showing detection performance comparable to classic instrumental analysis. Such a facile and efficient fabrication route for the bio-inspired suppression of "coffee stains" on test strips may expand the scope of applications of test strip-based "point-of-care" analysis methods and detection devices in the biomedical and environmental fields.
Numerical sedimentation particle-size analysis using the Discrete Element Method
NASA Astrophysics Data System (ADS)
Bravo, R.; Pérez-Aparicio, J. L.; Gómez-Hernández, J. J.
2015-12-01
Sedimentation tests are widely used to determine the particle size distribution of a granular sample. In this work, the Discrete Element Method interacts with the simulation of flow using the well-known one-way-coupling method, a computationally affordable approach for the time-consuming numerical simulation of the hydrometer, buoyancy and pipette sedimentation tests. These tests are used in the laboratory to determine the particle-size distribution of fine-grained aggregates. Five samples with different particle-size distributions are modeled by about six million rigid spheres projected onto two dimensions, with diameters ranging from 2.5×10⁻⁶ m to 70×10⁻⁶ m, forming a water suspension in a sedimentation cylinder. DEM simulates the particles' movement considering laminar flow interactions of buoyancy, drag and lubrication forces. The simulation provides the temporal/spatial distributions of densities and concentrations of the suspension. The numerical simulations cannot replace the laboratory tests since they need the final granulometry as initial data, but, as the results show, these simulations can identify the strong and weak points of each method and eventually recommend useful variations and draw conclusions on their validity, aspects very difficult to achieve in the laboratory.
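The hydrometer and pipette tests that the DEM reproduces are interpreted through Stokes' law for the terminal settling velocity of a small sphere, so a quick order-of-magnitude check of the stated diameter range is straightforward. The densities and viscosity below are assumed values typical of a quartz-in-water suspension, not parameters taken from the paper.

```python
import numpy as np

def stokes_velocity(d, rho_p=2650.0, rho_f=1000.0, mu=1.0e-3, g=9.81):
    """Terminal settling velocity (m/s) of a sphere of diameter d (m) in laminar flow."""
    return (rho_p - rho_f) * g * d**2 / (18.0 * mu)

for d in np.array([2.5e-6, 10e-6, 70e-6]):
    print(f"d = {d*1e6:4.1f} um -> v = {stokes_velocity(d):.3e} m/s")
```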
Three New Methods for Analysis of Answer Changes
ERIC Educational Resources Information Center
Sinharay, Sandip; Johnson, Matthew S.
2017-01-01
In a pioneering research article, Wollack and colleagues suggested the "erasure detection index" (EDI) to detect test tampering. The EDI can be used with or without a continuity correction and is assumed to follow the standard normal distribution under the null hypothesis of no test tampering. When used without a continuity correction,…
A Review of DIMPACK Version 1.0: Conditional Covariance-Based Test Dimensionality Analysis Package
ERIC Educational Resources Information Center
Deng, Nina; Han, Kyung T.; Hambleton, Ronald K.
2013-01-01
DIMPACK Version 1.0 for assessing test dimensionality based on a nonparametric conditional covariance approach is reviewed. This software was originally distributed by Assessment Systems Corporation and now can be freely accessed online. The software consists of Windows-based interfaces of three components: DIMTEST, DETECT, and CCPROX/HAC, which…
Test Section Turbulence in the AEDC/VKF Supersonic/Hypersonic Wind Tunnels
1981-07-01
[Report front-matter and table-of-contents residue; recoverable fragments: "4.3 Instrumentation"; "Pressure Fluctuation Spectral Content in AEDC Tunnels A and B (Based on FY79 Pitot Probe), Δf = 200 Hz"; "... intensity, spatial distribution, and spectral content, has become increasingly important in the analysis of test data. The sector-supported model in the ..."]
Assessment of Adolescent Perceptions on Parental Attitudes on Different Variables
ERIC Educational Resources Information Center
Ersoy, Evren
2015-01-01
The purpose of this study is to examine secondary school student perceptions of parental attitudes with regards to specific variables. Independent samples t test for parametric distributions and one-way variance analysis (ANOVA) was used for analyzing the data, when the ANOVA analyses were significant Scheffe test was conducted on homogeneous…
Poisson Approximation-Based Score Test for Detecting Association of Rare Variants.
Fang, Hongyan; Zhang, Hong; Yang, Yaning
2016-07-01
Genome-wide association study (GWAS) has achieved great success in identifying genetic variants, but the nature of GWAS has determined its inherent limitations. Under the common disease rare variants (CDRV) hypothesis, the traditional association analysis methods commonly used in GWAS for common variants do not have enough power for detecting rare variants with a limited sample size. As a solution to this problem, pooling rare variants by their functions provides an efficient way for identifying susceptible genes. Rare variants typically have low frequencies of minor alleles, and the distribution of the total number of minor alleles of the rare variants can be approximated by a Poisson distribution. Based on this fact, we propose a new test method, the Poisson Approximation-based Score Test (PAST), for association analysis of rare variants. Two testing methods, namely, ePAST and mPAST, are proposed based on different strategies of pooling rare variants. Simulation results and application to the CRESCENDO cohort data show that our methods are more powerful than the existing methods. © 2016 John Wiley & Sons Ltd/University College London.
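The core idea can be illustrated with a toy calculation. This is not the authors' ePAST/mPAST statistic; it is a hedged sketch of a pooled, Poisson-approximation-style test: the total minor allele count across rare variants is treated as approximately Poisson, so that, conditional on the pooled total, the number of minor alleles observed in cases is binomial with success probability equal to the case fraction under no association. The genotype matrices below are simulated and all names are illustrative.

```python
import numpy as np
from scipy.stats import binomtest

def pooled_rare_variant_test(geno_cases, geno_controls):
    """geno_*: (subjects x variants) minor-allele counts (0/1/2) for rare variants in one gene.
    Poisson-approximation idea: conditional on the pooled total, test the case share."""
    k_case = int(geno_cases.sum())
    k_ctrl = int(geno_controls.sum())
    n_case, n_ctrl = geno_cases.shape[0], geno_controls.shape[0]
    p0 = n_case / (n_case + n_ctrl)          # expected case share under no association
    return binomtest(k_case, k_case + k_ctrl, p0, alternative="greater").pvalue

rng = np.random.default_rng(1)
cases = rng.binomial(2, 0.01, size=(500, 20))      # cases carry a slight enrichment
controls = rng.binomial(2, 0.005, size=(500, 20))
print(f"p-value = {pooled_rare_variant_test(cases, controls):.3g}")
```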
NASA Astrophysics Data System (ADS)
Griffiths, K. R.; Hicks, B. J.; Keogh, P. S.; Shires, D.
2016-08-01
In general, vehicle vibration is non-stationary and has a non-Gaussian probability distribution; yet existing testing methods for packaging design employ Gaussian distributions to represent vibration induced by road profiles. This frequently results in over-testing and/or over-design of the packaging to meet a specification and correspondingly leads to wasteful packaging and product waste, which represent 15bn per year in the USA and €3bn per year in the EU. The purpose of the paper is to enable a measured non-stationary acceleration signal to be replaced by a constructed signal that includes as far as possible any non-stationary characteristics from the original signal. The constructed signal consists of a concatenation of decomposed shorter duration signals, each having its own kurtosis level. Wavelet analysis is used for the decomposition process into inner and outlier signal components. The constructed signal has a similar PSD to the original signal, without incurring excessive acceleration levels. This allows an improved and more representative simulated input signal to be generated that can be used on the current generation of shaker tables. The wavelet decomposition method is also demonstrated experimentally through two correlation studies. It is shown that significant improvements over current international standards for packaging testing are achievable; hence the potential for more efficient packaging system design is possible.
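A rough sketch of the decomposition idea using PyWavelets is shown below. The thresholding rule, wavelet choice, and synthetic burst signal are assumptions introduced for illustration, not the authors' exact procedure: detail coefficients above a robust threshold are treated as the "outlier" (bursty, kurtosis-raising) component, and the remainder as the "inner" quasi-Gaussian component.

```python
import numpy as np
import pywt
from scipy.stats import kurtosis

def split_inner_outlier(signal, wavelet="db4", level=5, k=3.0):
    """Decompose, keep large detail coefficients as the 'outlier' part, the rest as 'inner'."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    inner, outlier = [coeffs[0]], [np.zeros_like(coeffs[0])]   # approximation stays in 'inner'
    for d in coeffs[1:]:
        thr = k * np.median(np.abs(d)) / 0.6745                # robust sigma estimate
        mask = np.abs(d) > thr
        outlier.append(np.where(mask, d, 0.0))
        inner.append(np.where(mask, 0.0, d))
    rec = lambda c: pywt.waverec(c, wavelet)[: len(signal)]
    return rec(inner), rec(outlier)

rng = np.random.default_rng(2)
x = rng.normal(size=4096)
x[1000:1040] += rng.normal(scale=6.0, size=40)                 # transient burst -> leptokurtic
inner, outlier = split_inner_outlier(x)
print("kurtosis: raw %.2f, inner %.2f" % (kurtosis(x), kurtosis(inner)))
```

A constructed test signal in the spirit of the paper would then concatenate segments whose kurtosis levels reproduce those of the outlier and inner parts while matching the original PSD.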
Hawaiian Electric Advanced Inverter Grid Support Function Laboratory Validation and Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Austin; Nagarajan, Adarsh; Prabakar, Kumar
The objective for this test plan was to better understand how to utilize the performance capabilities of advanced inverter functions to allow the interconnection of distributed energy resource (DER) systems to support the new Customer Self-Supply, Customer Grid-Supply, and other future DER programs. The purpose of this project was: 1) to characterize how the tested grid supportive inverters performed the functions of interest, 2) to evaluate the grid supportive inverters in an environment that emulates the dynamics of O'ahu's electrical distribution system, and 3) to gain insight into the benefits of the grid support functions on selected O'ahu island distribution feeders. These goals were achieved through laboratory testing of photovoltaic inverters, including power hardware-in-the-loop testing.
Charest, Mathieu; Bélair, Marc-André
2017-06-01
Helicobacter pylori infection is the leading cause of peptic ulcer disease. The purpose of this study was, first, to assess the difference in the distribution of negative versus positive results between the older 14C-urea breath test and the newer 13C-urea breath test and, second, to determine whether use of an indeterminate-results category is still meaningful and what type of results should trigger repeated testing. Methods: A retrospective survey was performed of all consecutive patients referred to our service for urea breath testing. We analyzed 562 patients who had undergone testing with 14C-urea and 454 patients who had undergone testing with 13C-urea. Results: In comparison with the wide distribution of negative 14C results, negative 13C results were distributed farther from the cutoff and were grouped more tightly around the mean negative value. Distribution analysis of the negative results for 13C testing, compared with those for 14C testing, revealed a statistically significant difference between the two. Within the 13C group, only 1 patient could have been classified as having indeterminate results using the same indeterminate zone as was used for the 14C group. This is significantly less frequent than what was found for the 14C group. Discussion: Borderline-negative results do occur with 13C-urea breath testing, although less frequently than with 14C-urea breath testing, and we will be carefully monitoring differences falling between 3.0 and 3.5 %Δ. 13C-urea breath testing is safe and simple for the patient and, in most cases, provides clearer positive or negative results for the clinician. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
Neghab, Masoud; Jalilian, Hamed; Taheri, Shekoufeh; Tatar, Mohsen; Haji Zadeh, Zeynab
2018-06-01
This study was undertaken to ascertain whether light occupational exposure to pesticides by retailers might be associated with any liver, kidney, nervous system dysfunction or hematological abnormalities. In this cross-sectional study, 70 male pesticide retailers (cases) and 64 male subjects, randomly selected from the construction workers of city council contractors, as the referent group, were investigated. Urine and blood samples were taken from all subjects for urine analysis, hematological and biochemical parameters. Data analysis was conducted in SPSS v.19 using the t-test and chi-square test. The results of urine analysis showed that the frequency of abnormal urine tests was significantly higher in cases than in referent individuals. Similarly, the results of CBC showed that the mean values of monocyte, hemoglobin, hematocrit, mean corpuscular volume, mean corpuscular hemoglobin and platelet distribution width were significantly lower, and mean corpuscular hemoglobin concentration and red blood cell distribution width were significantly higher in retailers. No significant differences were found for other parameters. These findings indicate that an association exists between exposure to pesticides by retailers and early subtle and sub-clinical changes in the urine tests and hematological parameters. Engineering measures are recommended to eliminate exposure to pesticides and to prevent its associated outcomes. Copyright © 2018 Elsevier Inc. All rights reserved.
Investigation on Composite Throat Insert For Cryogenic Engines
NASA Astrophysics Data System (ADS)
Ayyappan, G.; Tiwari, S. B.; Praveen, RS; Mohankumar, L.; Jathaveda, M.; Ganesh, P.
2017-02-01
Injector element testing is an important step in the development and qualification of cryogenic rocket engines. For the purpose of characterising the injectors, sub-scale chambers are used. In order to assess the performance of the injectors, different configurations of the injectors are tested using a combustion chamber and a convergent-divergent nozzle. Pressure distribution along the wall of the chamber and throat insert is obtained from the CFD analysis, and temperature distribution is obtained from thermal analysis. Thermo-structural analysis is carried out for the sub-scale model of the throat insert using temperature-dependent material properties. For the experiments, a sub-scale model of the thrust chamber is realised. Injector element tests are carried out for the studies. The objective of the present study is to investigate the behaviour of different throat inserts, mainly graphite, 2-D Carbon-Carbon (2D C-C), 4-D Carbon-Carbon (4D C-C) and Silica Phenolic (SP), under pressure and thermal load for repeated operation of the engine. Analytical results are compared with the test results. The paper gives the results of theoretical studies and experiments conducted with all four types of throat material. It is concluded that 2D C-C is superior, its throat erosion being the least under the specified combustion environment.
The physics of heavy quark distributions in hadrons: Collider tests
NASA Astrophysics Data System (ADS)
Brodsky, S. J.; Bednyakov, V. A.; Lykasov, G. I.; Smiesko, J.; Tokar, S.
2017-03-01
We present a review of the current understanding of the heavy quark distributions in the nucleon and their impact on collider physics. The origin of strange, charm and bottom quark pairs at high light-front (LF) momentum fractions in the hadron wavefunction (the "intrinsic" quarks) is reviewed. The determination of heavy-quark parton distribution functions (PDFs) is particularly significant for the analysis of hard processes at LHC energies. We show that a careful study of the inclusive production of open charm and the production of γ/Z/W particles, accompanied by heavy jets at large transverse momenta, can give essential information on the intrinsic heavy quark (IQ) distributions. We also focus on the theoretical predictions concerning other observables which are very sensitive to the intrinsic charm contribution to PDFs, including Higgs production at high xF and novel fixed-target measurements which can be tested at the LHC.
The physics of heavy quark distributions in hadrons: Collider tests
Brodsky, S. J.; Bednyakov, V. A.; Lykasov, G. I.; ...
2016-12-18
Here, we present a review of the current understanding of the heavy quark distributions in the nucleon and their impact on collider physics. The origin of strange, charm and bottom quark pairs at high light-front (LF) momentum fractions in the hadron wavefunction (the "intrinsic" quarks) is reviewed. The determination of heavy-quark parton distribution functions (PDFs) is particularly significant for the analysis of hard processes at LHC energies. We show that a careful study of the inclusive production of open charm and the production of γ/Z/W particles, accompanied by heavy jets at large transverse momenta, can give essential information on the intrinsic heavy quark (IQ) distributions. We also focus on the theoretical predictions concerning other observables which are very sensitive to the intrinsic charm contribution to PDFs, including Higgs production at high xF and novel fixed-target measurements which can be tested at the LHC.
Probabilistic model of bridge vehicle loads in port area based on in-situ load testing
NASA Astrophysics Data System (ADS)
Deng, Ming; Wang, Lei; Zhang, Jianren; Wang, Rei; Yan, Yanhong
2017-11-01
Vehicle load is an important factor affecting the safety and usability of bridges. A statistical analysis is carried out in this paper to investigate the vehicle load data of the Tianjin Haibin highway in Tianjin port of China, which are collected by the Weigh-in-Motion (WIM) system. Following this, the effect of the vehicle load on the test bridge is calculated and then compared with the calculation result according to HL-93 (AASHTO LRFD). Results show that the overall vehicle load follows a weighted sum of four normal distributions. The maximum vehicle load during the design reference period follows a type I extremum distribution. The vehicle load effect also follows a weighted sum of four normal distributions, and the standard value of the vehicle load is recommended as 1.8 times the value calculated according to HL-93.
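The two distributional findings can be reproduced in outline with standard tools. The sketch below uses synthetic stand-in data rather than the Tianjin WIM records, and all parameters are assumptions: a four-component Gaussian mixture for the overall gross vehicle weight and a type I extreme value (Gumbel) fit to block maxima.

```python
import numpy as np
from scipy.stats import gumbel_r
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# stand-in WIM gross vehicle weights (kN): car/van/truck/heavy-truck classes
loads = np.concatenate([rng.normal(20, 5, 4000), rng.normal(60, 10, 2500),
                        rng.normal(150, 25, 1500), rng.normal(300, 40, 500)])

gmm = GaussianMixture(n_components=4, random_state=0).fit(loads.reshape(-1, 1))
print("mixture means (kN):", np.sort(gmm.means_.ravel()).round(1))

daily_max = loads.reshape(-1, 100).max(axis=1)        # crude "daily" block maxima
loc, scale = gumbel_r.fit(daily_max)
print("Gumbel 99th percentile of block maxima: %.1f kN" % gumbel_r.ppf(0.99, loc, scale))
```

Extrapolating the fitted Gumbel model to the design reference period is then a matter of choosing the appropriate return period for the block maxima.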
ETICS: the international software engineering service for the grid
NASA Astrophysics Data System (ADS)
Meglio, A. D.; Bégin, M.-E.; Couvares, P.; Ronchieri, E.; Takacs, E.
2008-07-01
The ETICS system is a distributed software configuration, build and test system designed to fulfil the needs of improving the quality, reliability and interoperability of distributed software in general and grid software in particular. The ETICS project is a consortium of five partners (CERN, INFN, Engineering Ingegneria Informatica, 4D Soft and the University of Wisconsin-Madison). The ETICS service consists of a build and test job execution system based on the Metronome software and an integrated set of web services and software engineering tools to design, maintain and control build and test scenarios. The ETICS system allows taking into account complex dependencies among applications and middleware components and provides a rich environment to perform static and dynamic analysis of the software and execute deployment, system and interoperability tests. This paper gives an overview of the system architecture and functionality set and then describes how the EC-funded EGEE, DILIGENT and OMII-Europe projects are using the software engineering services to build, validate and distribute their software. Finally a number of significant use and test cases will be described to show how ETICS can be used in particular to perform interoperability tests of grid middleware using the grid itself.
Research in Varying Burner Tilt Angle to Reduce Rear Pass Temperature in Coal Fired Boiler
NASA Astrophysics Data System (ADS)
Thrangaraju, Savithry K.; Munisamy, Kannan M.; Baskaran, Saravanan
2017-04-01
This research presents an investigation of one of the techniques used in the Manjung 700 MW tangentially fired coal power plant. The aim is to find the burner tilt angle that produces an efficient temperature distribution and combustion gas flow pattern in the boiler, especially at the rear pass section. The main outcome of the project is to determine the burner tilt angle that creates an efficient temperature distribution and combustion gas flow pattern and is thereby able to increase the efficiency of the boiler. The investigation is carried out using the Computational Fluid Dynamics (CFD) method, varying the burner tilt angle. The boiler model is drawn using the SolidWorks design software, and the CFD package FLUENT is used to analyse the boiler model. The analysis imitates the real combustion process in the Manjung 700 MW boiler. The temperature distribution and combustion gas flow pattern are obtained in FLUENT for each of the three burner tilt angles considered. Three burner tilt angles are selected: 0° as test case 1, +10° as test case 2 and -10° as test case 3. All three cases were run in the CFD software, and the resulting temperature distributions and velocity vectors were obtained to examine the changes among the three cases at the furnace and rear pass sections of the boiler. The results are compared by plotting graphs to determine the tilt angle that reduces the rear pass temperature.
Total hydrocarbon content (THC) testing in liquid oxygen (LOX) systems
NASA Astrophysics Data System (ADS)
Meneghelli, B. J.; Obregon, R. E.; Ross, H. R.; Hebert, B. J.; Sass, J. P.; Dirschka, G. E.
2015-12-01
The measured Total Hydrocarbon Content (THC) levels in liquid oxygen (LOX) systems at Stennis Space Center (SSC) have shown wide variations. Examples of these variations include the following: 1) differences between vendor-supplied THC values and those obtained using standard SSC analysis procedures; and 2) increasing THC values over time at an active SSC test stand in both storage and run vessels. A detailed analysis of LOX sampling techniques, analytical instrumentation, and sampling procedures will be presented. Additional data obtained on LOX system operations and LOX delivery trailer THC values during the past 12-24 months will also be discussed. Field test results showing THC levels and the distribution of the THC's in the test stand run tank, modified for THC analysis via dip tubes, will be presented.
Total Hydrocarbon Content (THC) Testing in Liquid Oxygen (LOX)
NASA Technical Reports Server (NTRS)
Meneghelli, B. J.; Obregon, R. E.; Ross, H. R.; Hebert, B. J.; Sass, J. P.; Dirschka, G. E.
2016-01-01
The measured Total Hydrocarbon Content (THC) levels in liquid oxygen (LOX) systems at Stennis Space Center (SSC) have shown wide variations. Examples of these variations include the following: 1) differences between vendor-supplied THC values and those obtained using standard SSC analysis procedures; and 2) increasing THC values over time at an active SSC test stand in both storage and run vessels. A detailed analysis of LOX sampling techniques, analytical instrumentation, and sampling procedures will be presented. Additional data obtained on LOX system operations and LOX delivery trailer THC values during the past 12-24 months will also be discussed. Field test results showing THC levels and the distribution of the THC's in the test stand run tank, modified for THC analysis via dip tubes, will be presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, J.; Xue, X.
A comprehensive 3D CFD model is developed for a bi-electrode supported cell (BSC) SOFC. The model includes complicated transport phenomena of mass/heat transfer, charge (electron and ion) migration, and electrochemical reaction. The uniqueness of the modeling study is that functionally graded porous electrode properties are taken into account, including not only linear but nonlinear porosity distributions. Extensive numerical analysis is performed to elucidate the effects of both porous microstructure distributions and operating conditions on cell performance. Results indicate that cell performance is strongly dependent on both operating conditions and porous microstructure distributions of electrodes. Using the proposed fuel/gas feeding design, a uniform hydrogen distribution within the porous anode is achieved; the oxygen distribution within the cathode is dependent on porous microstructure distributions as well as pressure loss conditions. Simulation results show that a fairly uniform temperature distribution can be obtained with the proposed fuel/gas feeding design. The modeling results can be employed to guide the experimental design of BSC tests and provide pre-experimental analysis and, as a result, to circumvent the high cost associated with trial-and-error experimental design and setup.
Statistical homogeneity tests applied to large data sets from high energy physics experiments
NASA Astrophysics Data System (ADS)
Trusina, J.; Franc, J.; Kůs, V.
2017-12-01
Homogeneity tests are used in high energy physics to verify whether simulated Monte Carlo samples have the same distribution as the measured data from a particle detector. Kolmogorov-Smirnov, χ², and Anderson-Darling tests are the most widely used techniques to assess the samples' homogeneity. Since MC generators produce plenty of entries from different models, each entry has to be re-weighted to obtain the same sample size as the measured data has. One way of homogeneity testing is through binning. If we do not want to lose any information, we can apply generalized tests based on weighted empirical distribution functions. In this paper, we propose such generalized weighted homogeneity tests and introduce some of their asymptotic properties. We present the results based on numerical analysis which focuses on estimations of the type-I error and power of the test. Finally, we present an application of our homogeneity tests to data from the DØ experiment at Fermilab.
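A binning-free, weighted-EDF statistic of the kind described can be sketched as follows. This is a simplified Kolmogorov-Smirnov-type construction under assumed data and per-event weights; the paper's exact statistics, their asymptotics, and the re-weighting scheme are not reproduced here. In practice, significance would be obtained from the asymptotic results or by resampling, which is omitted for brevity.

```python
import numpy as np

def weighted_ecdf(x, w, grid):
    """Weighted empirical distribution function F(t) = sum(w_i [x_i <= t]) / sum(w_i)."""
    order = np.argsort(x)
    cw = np.concatenate([[0.0], np.cumsum(w[order]) / np.sum(w)])
    return cw[np.searchsorted(x[order], grid, side="right")]

def weighted_ks(data, mc, mc_weights):
    """Sup-distance between the data ECDF and the weighted MC EDF (KS-type statistic)."""
    grid = np.sort(np.concatenate([data, mc]))
    f_data = weighted_ecdf(data, np.ones_like(data), grid)
    f_mc = weighted_ecdf(mc, mc_weights, grid)
    return np.max(np.abs(f_data - f_mc))

rng = np.random.default_rng(4)
data = rng.normal(size=2000)                 # stand-in measured events
mc = rng.normal(size=50000)                  # stand-in generated events
w = rng.gamma(2.0, 0.5, size=50000)          # per-event generator weights
print("weighted KS statistic: %.4f" % weighted_ks(data, mc, w))
```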
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chakraborty, Sudipta
Various interconnection challenges exist when connecting distributed PV into the electrical distribution grid in terms of safety, reliability, and stability of the electric power systems. Some of the urgent areas for research, as identified by inverter manufacturers, installers and utilities, are potential for transient overvoltage from PV inverters, multi-inverter anti-islanding, impact of smart inverters on volt-VAR support, impact of bidirectional power flow, and potential for distributed generation curtailment solutions to mitigate grid stability challenges. Under this project, NREL worked with SolarCity to address these challenges through research, testing and analysis at the Energy System Integration Facility (ESIF). Inverters from different manufacturers were tested at ESIF and NREL's unique power hardware-in-the-loop (PHIL) capability was utilized to evaluate various system-level impacts. Through the modeling, simulation, and testing, this project eliminated critical barriers on high PV penetration and directly supported the Department of Energy's SunShot goal of increasing the solar PV on the electrical grid.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
Y. Chen; S. J. Seybold
2013-01-01
Instar determination of field-collected insect larvae has generally been based on the analysis of head capsule width frequency distributions or bivariate plotting, but few studies have tested the validity of such methods. We used head capsules from exuviae of known instars of the beet armyworm, Spodoptera exigua (Hübner) (Lepidoptera: Noctuidae),...
Characterization and Dynamic Analysis of Long-Cavity Multi-Section Gain- Levered Quantum-Dot Lasers
2013-03-01
[Report figure-list residue; recoverable fragments: "test setup"; "Figure 5: Comparison of a Fabry-Perot and distributed feedback ..."; "... for example Fabry-Perot and distributed-feedback designs), with each possessing advantages and disadvantages that will be discussed in detail in ..."; "... in contrast to Fabry-Perot cavities (two discrete mirrors) that result in lasing over multiple longitudinal modes supported by the cavity. Figure 5 shows ..."]
Distributed measurement of high electric current by means of polarimetric optical fiber sensor.
Palmieri, Luca; Sarchi, Davide; Galtarossa, Andrea
2015-05-04
A novel distributed optical fiber sensor for spatially resolved monitoring of high direct electric current is proposed and analyzed. The sensor exploits Faraday rotation and is based on the polarization analysis of the Rayleigh backscattered light. Preliminary laboratory tests, performed on a section of electric cable for currents up to 2.5 kA, have confirmed the viability of the method.
The distribution of mercury in a forest floor transect across the central United States
Charles H. (Hobie) Perry; Michael C. Amacher; William Cannon; Randall K. Kolka; Laurel Woodruff
2009-01-01
Mercury (Hg) stored in soil organic matter may be released when the forest floor is consumed by fire. Our objective is to document the spatial distribution of forest floor Hg for a transect crossing the central United States. Samples collected by the Forest Service, U.S. Department of Agriculture's Forest Inventory and Analysis Soil Quality Indicator were tested...
APL-UW Deep Water Propagation 2015-2017: Philippine Sea Data Analysis
2015-09-30
[Report front-matter residue; recoverable fragments: "... the fundamental statistics of broadband low-frequency acoustical signals evolve during propagation through a dynamically-varying deep ocean."; "OBJECTIVES: Current models of signal randomization over long ranges in the deep ocean were developed for and tested in the North Pacific Ocean gyre."]
78 FR 33442 - Manufacturer of Controlled Substances; Notice of Registration; Cerilliant Corporation
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-04
... the listed controlled substances for distribution to their research and forensic customers conducting drug testing and analysis. No comments or objections have been received. DEA has considered the factors...
Reichenau, Tim G; Korres, Wolfgang; Montzka, Carsten; Fiener, Peter; Wilken, Florian; Stadler, Anja; Waldhoff, Guido; Schneider, Karl
2016-01-01
The ratio of leaf area to ground area (leaf area index, LAI) is an important state variable in ecosystem studies since it influences fluxes of matter and energy between the land surface and the atmosphere. As a basis for generating temporally continuous and spatially distributed datasets of LAI, the current study contributes an analysis of its spatial variability and spatial structure. Soil-vegetation-atmosphere fluxes of water, carbon and energy are nonlinearly related to LAI. Therefore, its spatial heterogeneity, i.e., the combination of spatial variability and structure, has an effect on simulations of these fluxes. To assess LAI spatial heterogeneity, we apply a Comprehensive Data Analysis Approach that combines data from remote sensing (5 m resolution) and simulation (150 m resolution) with field measurements and a detailed land use map. Test area is the arable land in the fertile loess plain of the Rur catchment on the Germany-Belgium-Netherlands border. LAI from remote sensing and simulation compares well with field measurements. Based on the simulation results, we describe characteristic crop-specific temporal patterns of LAI spatial variability. By means of these patterns, we explain the complex multimodal frequency distributions of LAI in the remote sensing data. In the test area, variability between agricultural fields is higher than within fields. Therefore, spatial resolutions less than the 5 m of the remote sensing scenes are sufficient to infer LAI spatial variability. Frequency distributions from the simulation agree better with the multimodal distributions from remote sensing than normal distributions do. The spatial structure of LAI in the test area is dominated by a short distance referring to field sizes. Longer distances that refer to soil and weather can only be derived from remote sensing data. Therefore, simulations alone are not sufficient to characterize LAI spatial structure. It can be concluded that a comprehensive picture of LAI spatial heterogeneity and its temporal course can contribute to the development of an approach to create spatially distributed and temporally continuous datasets of LAI.
Korres, Wolfgang; Montzka, Carsten; Fiener, Peter; Wilken, Florian; Stadler, Anja; Waldhoff, Guido; Schneider, Karl
2016-01-01
The ratio of leaf area to ground area (leaf area index, LAI) is an important state variable in ecosystem studies since it influences fluxes of matter and energy between the land surface and the atmosphere. As a basis for generating temporally continuous and spatially distributed datasets of LAI, the current study contributes an analysis of its spatial variability and spatial structure. Soil-vegetation-atmosphere fluxes of water, carbon and energy are nonlinearly related to LAI. Therefore, its spatial heterogeneity, i.e., the combination of spatial variability and structure, has an effect on simulations of these fluxes. To assess LAI spatial heterogeneity, we apply a Comprehensive Data Analysis Approach that combines data from remote sensing (5 m resolution) and simulation (150 m resolution) with field measurements and a detailed land use map. Test area is the arable land in the fertile loess plain of the Rur catchment on the Germany-Belgium-Netherlands border. LAI from remote sensing and simulation compares well with field measurements. Based on the simulation results, we describe characteristic crop-specific temporal patterns of LAI spatial variability. By means of these patterns, we explain the complex multimodal frequency distributions of LAI in the remote sensing data. In the test area, variability between agricultural fields is higher than within fields. Therefore, spatial resolutions less than the 5 m of the remote sensing scenes are sufficient to infer LAI spatial variability. Frequency distributions from the simulation agree better with the multimodal distributions from remote sensing than normal distributions do. The spatial structure of LAI in the test area is dominated by a short distance referring to field sizes. Longer distances that refer to soil and weather can only be derived from remote sensing data. Therefore, simulations alone are not sufficient to characterize LAI spatial structure. It can be concluded that a comprehensive picture of LAI spatial heterogeneity and its temporal course can contribute to the development of an approach to create spatially distributed and temporally continuous datasets of LAI. PMID:27391858
NASA Astrophysics Data System (ADS)
Liu, Shuang; Hu, Xiangyun; Liu, Tianyou; Xi, Yufei; Cai, Jianchao; Zhang, Henglei
2015-01-01
The ant colony optimisation algorithm has successfully been used to invert for surface magnetic data. However, the resolution of the distributions of the recovered physical property for deeply buried magnetic sources is not generally very high because of geophysical ambiguities. We use three approaches to deal with this problem. First, the observed surface magnetic data are taken together with the three-component borehole magnetic anomalies to recover the distributions of the physical properties. This cooperative inversion strategy improves the resolution of the inversion results in the vertical direction. Additionally, as the ant colony tours the discrete nodes, we force it to visit the nodes with physical properties that agree with the drilled lithologies. These lithological constraints reduce the non-uniqueness of the inversion problem. Finally, we also implement a K-means cluster analysis for the distributions of the magnetic cells after each iteration, in order to separate the distributions of magnetisation intensity instead of concentrating the distribution in a single area. We tested our method using synthetic data and found that all tests returned favourable results. In the case study of the Mengku iron-ore deposit in northwest China, the recovered distributions of magnetisation are in good agreement with the locations and shapes of the magnetite orebodies as inferred by drillholes. Uncertainty analysis shows that the ant colony algorithm is robust in the presence of noise and that the proposed approaches significantly improve the quality of the inversion results.
COI Structural Analysis Presentation
NASA Technical Reports Server (NTRS)
Cline, Todd; Stahl, H. Philip (Technical Monitor)
2001-01-01
This report discusses the structural analysis of the Next Generation Space Telescope Mirror System Demonstrator (NMSD) developed by Composite Optics Incorporated (COI) in support of the Next Generation Space Telescope (NGST) project. The mirror was submitted to Marshall Space Flight Center (MSFC) for cryogenic testing and evaluation. Once at MSFC, the mirror was lowered to approximately 40 K and the optical surface distortions were measured. Alongside this experiment, an analytical model was developed and used to compare to the test results. A NASTRAN finite element model was provided by COI and a thermal model was developed from it. Using the thermal model, steady state nodal temperatures were calculated based on the predicted environment of the large cryogenic test chamber at MSFC. This temperature distribution was applied in the structural analysis to solve for the deflections of the optical surface. Finally, these deflections were submitted for optical analysis and comparison to the interferometer test data.
[The Freiburg monosyllable word test in postoperative cochlear implant diagnostics].
Hey, M; Brademann, G; Ambrosch, P
2016-08-01
The Freiburg monosyllable word test represents a central tool of postoperative cochlear implant (CI) diagnostics. The objective of this study is to test the equivalence of different word lists by analysing word comprehension. For patients whose CI has been implanted for more than 5 years, the distribution of suprathreshold speech intelligibility outcomes is also analysed. In a retrospective data analysis, word correct scores for 626 CI users were evaluated using a total of 5211 lists of 20 words each. The analysis of word comprehension within each list shows differences in mean and in the kind of distribution function. Some lists show a significant difference between their mean word recognition and the overall mean. The Freiburg monosyllable word test is easy to administer at suprathreshold speech level for CI recipients, and typically has a saturation level above 80 %. The Freiburg monosyllable word test can be performed successfully by the majority of CI patients. The limited balance of the test lists leads to the conclusion that an adaptive test procedure with the Freiburg monosyllable test does not make sense. The Freiburg monosyllable test can be restructured by re-sorting all words across lists, or by omitting individual words of a test list, to increase the reliability of the test. The results show that speech intelligibility in quiet should also be investigated in CI recipients at levels below 70 dB.
Phase walk analysis of leptokurtic time series.
Schreiber, Korbinian; Modest, Heike I; Räth, Christoph
2018-06-01
The Fourier phase information plays a key role in the quantified description of nonlinear data. We present a novel tool for time series analysis that identifies nonlinearities by sensitively detecting correlations among the Fourier phases. The method, called phase walk analysis, is based on well-established measures from random walk analysis, which are now applied to the unwrapped Fourier phases of time series. We provide an analytical description of its functionality and demonstrate its capabilities on systematically controlled leptokurtic noise. Hereby, we investigate the properties of leptokurtic time series and their influence on the Fourier phases of time series. The phase walk analysis is applied to measured and simulated intermittent time series, whose probability density distribution is approximated by power laws. We use the day-to-day returns of the Dow-Jones industrial average, a synthetic time series with tailored nonlinearities mimicking the power law behavior of the Dow-Jones and the acceleration of the wind at an Atlantic offshore site. Testing for nonlinearities by means of surrogates shows that the new method yields strong significances for nonlinear behavior. Due to the drastically decreased computing time as compared to embedding space methods, the number of surrogate realizations can be increased by orders of magnitude. Thereby, the probability distribution of the test statistics can very accurately be derived and parameterized, which allows for much more precise tests for nonlinearities.
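A minimal sketch of the phase-walk idea is given below; the particular excursion statistic and the surrogate scheme are assumptions chosen for illustration rather than the published estimator. The walk is built from increments of the unwrapped Fourier phases, and its excursion is compared against phase-randomized surrogates that share the observed power spectrum.

```python
import numpy as np

def phase_walk_statistic(x):
    """Excursion of a 'walk' built from the unwrapped Fourier phases of x (illustrative)."""
    phases = np.unwrap(np.angle(np.fft.rfft(x))[1:])   # drop the zero-frequency term
    steps = np.diff(phases)
    walk = np.cumsum(steps - steps.mean())
    return np.max(np.abs(walk)) / np.sqrt(len(steps))

def surrogate_pvalue(x, n_surr=1000, seed=0):
    """Compare against Fourier-phase-randomized surrogates (same power spectrum)."""
    rng = np.random.default_rng(seed)
    amp = np.abs(np.fft.rfft(x))
    stat_obs = phase_walk_statistic(x)
    stats = []
    for _ in range(n_surr):
        ph = rng.uniform(0, 2 * np.pi, size=amp.size)
        ph[0] = 0.0                                      # keep the DC term real
        surr = np.fft.irfft(amp * np.exp(1j * ph), n=len(x))
        stats.append(phase_walk_statistic(surr))
    return np.mean(np.array(stats) >= stat_obs)

rng = np.random.default_rng(5)
x = np.cumsum(rng.standard_t(df=3, size=4096))           # leptokurtic increments
print("surrogate p-value: %.3f" % surrogate_pvalue(x))
```

Because each surrogate evaluation is only an FFT and a cumulative sum, the number of surrogate realizations can indeed be made very large at modest cost, which is the computational advantage the abstract highlights.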
2014-12-01
[Report front-matter residue (figure list and acronym list); recoverable fragments: "Figure 64 - Elemental Analysis, Typical TMS Post-Test, Post Carbon Burn-off, Hexane rinsed"; "Figure 65 - SEM (20X) ..."; Wright-Patterson Aerospace Fuels Laboratory; AFRL - Air Force Research Laboratory; AFTSTU - Aviation Fuel Thermal Stability Test Unit; "For all ARSFSS testing, SV hysteresis is measured pre- and post-test and is defined by relating ..."]
Analysis of spatial thermal field in a magnetic bearing
NASA Astrophysics Data System (ADS)
Wajnert, Dawid; Tomczuk, Bronisław
2018-03-01
This paper presents two mathematical models for temperature field analysis in a new hybrid magnetic bearing. Temperature distributions have been calculated using a three dimensional simulation and a two dimensional one. A physical model for temperature testing in the magnetic bearing has been developed. Some results obtained from computer simulations were compared with measurements.
NASA Astrophysics Data System (ADS)
Sabarish, R. Mani; Narasimhan, R.; Chandhru, A. R.; Suribabu, C. R.; Sudharsan, J.; Nithiyanantham, S.
2017-05-01
In the design of irrigation and other hydraulic structures, evaluating the magnitude of extreme rainfall for a specific probability of occurrence is of much importance. The capacity of such structures is usually designed to cater to the probability of occurrence of extreme rainfall during their lifetime. In this study, an extreme value analysis of rainfall for Tiruchirapalli City in Tamil Nadu was carried out using 100 years of rainfall data. Statistical methods were used in the analysis. The best-fit probability distribution was evaluated for 1, 2, 3, 4 and 5 days of continuous maximum rainfall. The goodness of fit was evaluated using the chi-square test. The results of the goodness-of-fit tests indicate that the log-Pearson type III distribution is the overall best-fit probability distribution for the 1-day maximum rainfall and consecutive 2-, 3-, 4-, 5- and 6-day maximum rainfall series of Tiruchirapalli. For reliability, the forecasted maximum rainfalls for the selected return periods are compared with the results of the plotting-position method.
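The distribution fitting and goodness-of-fit step can be sketched with SciPy. The rainfall record below is synthetic, the binning choice is arbitrary, and the helper names are illustrative; the sketch fits a log-Pearson type III model to annual 1-day maxima, runs a chi-square test, and reads off a return-period design value.

```python
import numpy as np
from scipy import stats

def fit_log_pearson3(annual_max):
    """Fit a log-Pearson type III distribution to annual maximum rainfall (mm)."""
    logs = np.log10(annual_max)
    skew, loc, scale = stats.pearson3.fit(logs)
    return skew, loc, scale

def chi_square_gof(annual_max, params, bins=8):
    """Chi-square goodness of fit of the log-Pearson III model."""
    logs = np.log10(annual_max)
    edges = np.quantile(logs, np.linspace(0, 1, bins + 1))
    observed, _ = np.histogram(logs, bins=edges)
    cdf = stats.pearson3.cdf(edges, *params)
    expected = len(logs) * np.diff(cdf)
    stat = np.sum((observed - expected) ** 2 / expected)
    dof = bins - 1 - 3                                   # three fitted parameters
    return stat, stats.chi2.sf(stat, dof)

rng = np.random.default_rng(6)
annual_max = 10 ** rng.normal(2.0, 0.15, size=100)       # stand-in 100-year record (mm)
params = fit_log_pearson3(annual_max)
print("chi-square GOF (statistic, p-value):", chi_square_gof(annual_max, params))
# design value for a 50-year return period (annual non-exceedance probability 0.98)
print("50-yr 1-day maximum: %.0f mm" % 10 ** stats.pearson3.ppf(1 - 1 / 50, *params))
```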
[Simulation and data analysis of stereological modeling based on virtual slices].
Wang, Hao; Shen, Hong; Bai, Xiao-yan
2008-05-01
To establish a computer-assisted stereological model for simulating the slice-sectioning process and to evaluate the relationship between the section surface and the estimated three-dimensional structure. The model was designed mathematically as Win32 software based on MFC, using Microsoft Visual Studio as the IDE, to simulate an unlimited number of sections and to analyse the data derived from the model. The linearity of the model fit was evaluated by comparison with the traditional formula. The Win32 software based on this algorithm allowed random sectioning of particles distributed randomly in an ideal virtual cube. The stereological parameters showed very high throughput (>94.5% and 92%) in homogeneity and independence tests. The density, shape and size data of the sections were tested and conformed to normal distributions. The output of the model and that of the image analysis system showed statistical correlation and consistency. The algorithm described can be used for evaluating the stereological parameters of the structure of tissue slices.
Quantitative histogram analysis of images
NASA Astrophysics Data System (ADS)
Holub, Oliver; Ferreira, Sérgio T.
2006-11-01
A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed. Program summary. Title of program: HAWGC. Catalogue identifier: ADXG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computers: Mobile Intel Pentium III, AMD Duron. Installations: No installation necessary; executable file together with necessary files for LabVIEW Run-time engine. Operating systems or monitors under which the program has been tested: Windows ME/2000/XP. Programming language used: LabVIEW 7.0. Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for loading of an image. No. of bits in a word: 32. No. of processors used: 1. Has the code been vectorized or parallelized?: No. No. of lines in distributed program, including test data, etc.: 138 946. No. of bytes in distributed program, including test data, etc.: 15 166 675. Distribution format: tar.gz. Nature of physical problem: Quantification of image data (e.g., for discrimination of molecular species in gels or fluorescent molecular probes in cell cultures) requires proprietary or complex software packages, which might not include the relevant statistical parameters or make the analysis of multiple images a tedious procedure for the general user. Method of solution: Tool for conversion of RGB bitmap image into luminance-linear image and extraction of luminance histogram, probability distribution, and statistical parameters (average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of histogram and median of probability distribution) with possible selection of region of interest (ROI) and lower and upper threshold levels. Restrictions on the complexity of the problem: Does not incorporate application-specific functions (e.g., morphometric analysis). Typical running time: Seconds (depending on image size and processor speed). Unusual features of the program: None.
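A compact Python analogue of the described routine is sketched below (the LabVIEW program itself is not reproduced); the greyscale conversion coefficients, threshold values, and synthetic image are assumptions for the example.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def histogram_stats(rgb, coeffs=(0.299, 0.587, 0.114), lo=0, hi=255):
    """Greyscale conversion, thresholding and histogram statistics for an RGB image."""
    grey = np.tensordot(rgb.astype(float), coeffs, axes=([-1], [0]))
    vals = grey[(grey >= lo) & (grey <= hi)]             # apply lower/upper thresholds
    hist, _ = np.histogram(vals, bins=256, range=(0, 255))
    return {
        "mean": vals.mean(), "std": vals.std(), "min": vals.min(), "max": vals.max(),
        "mode": float(np.argmax(hist)), "median": float(np.median(vals)),
        "skewness": skew(vals), "kurtosis": kurtosis(vals),
    }

rng = np.random.default_rng(7)
image = rng.integers(0, 256, size=(128, 128, 3), dtype=np.uint8)   # stand-in RGB bitmap
print(histogram_stats(image, lo=10, hi=250))                       # e.g. drop background
```

Restricting `rgb` to a cropped sub-array before calling the function plays the role of the interactive ROI selection described in the abstract.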
Zhang, Jian; Hou, Dibo; Wang, Ke; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo
2017-05-01
The detection of organic contaminants in water distribution systems is essential to protect public health from potentially harmful compounds resulting from accidental spills or intentional releases. Existing methods for detecting organic contaminants are based on quantitative analyses such as chemical testing and gas/liquid chromatography, which are time- and reagent-consuming and involve costly maintenance. This study proposes a novel procedure based on discrete wavelet transform and principal component analysis for detecting organic contamination events from ultraviolet spectral data. Firstly, the spectrum of each observation is transformed using a discrete wavelet transform with a coiflet mother wavelet to capture abrupt changes along the wavelength axis. Principal component analysis is then employed to approximate the spectra based on capture and fusion features. The significance value of Hotelling's T² statistic is calculated and used to detect outliers. A contamination event alarm is triggered by sequential Bayesian analysis when the outliers appear continuously in several observations. The effectiveness of the proposed procedure is tested on-line using a pilot-scale setup and experimental data.
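The detection pipeline can be sketched end to end, leaving out the sequential Bayesian alarm stage. The wavelet, number of principal components, and synthetic spectra below are assumptions, not the study's settings: each spectrum is expanded into coiflet DWT coefficients, a PCA model is learned from clean baseline spectra, and Hotelling's T² in the score space flags outlying observations.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA

def dwt_features(spectrum, wavelet="coif3", level=3):
    """Flatten the coiflet DWT coefficients of one UV absorbance spectrum."""
    return np.concatenate(pywt.wavedec(spectrum, wavelet, level=level))

def hotelling_t2(train, test, n_components=5):
    """T^2 of new observations in the PCA score space learned from clean baseline spectra."""
    pca = PCA(n_components=n_components).fit(train)
    scores = pca.transform(test)
    return np.sum(scores**2 / pca.explained_variance_, axis=1)

rng = np.random.default_rng(8)
baseline = rng.normal(size=(200, 256)) * 0.01                        # stand-in clean UV spectra
event = baseline[:5] + 0.2 * np.exp(-np.linspace(-3, 3, 256) ** 2)   # added absorbance band
X_train = np.array([dwt_features(s) for s in baseline])
X_test = np.array([dwt_features(s) for s in np.vstack([baseline[:5], event])])
t2 = hotelling_t2(X_train, X_test)
print("T^2 clean:", t2[:5].round(1), " contaminated:", t2[5:].round(1))
```

In the full procedure, an alarm would only be raised when T² exceeds its control limit for several consecutive observations, which is what the sequential Bayesian step provides.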
Extending GIS Technology to Study Karst Features of Southeastern Minnesota
NASA Astrophysics Data System (ADS)
Gao, Y.; Tipping, R. G.; Alexander, E. C.; Alexander, S. C.
2001-12-01
This paper summarizes ongoing research on the karst feature distribution of southeastern Minnesota. The main goals of this interdisciplinary research are: 1) to look for large-scale patterns in the rate and distribution of sinkhole development; 2) to conduct statistical tests of hypotheses about the formation of sinkholes; 3) to create management tools for land-use managers and planners; and 4) to deliver geomorphic and hydrogeologic criteria for making scientifically valid land-use policies and ethical decisions in karst areas of southeastern Minnesota. Existing county and sub-county karst feature datasets of southeastern Minnesota have been assembled into a large GIS-based database capable of analyzing the entire data set. The central database management system (DBMS) is a relational GIS-based system interacting with three modules: GIS, statistical and hydrogeologic modules. ArcInfo and ArcView were used to generate a series of 2D and 3D maps depicting karst feature distributions in southeastern Minnesota. IRIS Explorer™ was used to produce satisfying 3D maps and animations using data exported from the GIS-based database. Nearest-neighbor analysis has been used to test sinkhole distributions in different topographic and geologic settings. All current nearest-neighbor analyses indicate that sinkholes in southeastern Minnesota are not evenly distributed in this area (i.e., they tend to be clustered). More detailed statistical methods such as cluster analysis, histograms, probability estimation, correlation and regression have been used to study the spatial distributions of some mapped karst features of southeastern Minnesota. A sinkhole probability map for Goodhue County has been constructed based on sinkhole distribution, bedrock geology, depth to bedrock, GIS buffer analysis and nearest-neighbor analysis. A series of karst features for Winona County including sinkholes, springs, seeps, stream sinks and outcrop has been mapped and entered into the Karst Feature Database of Southeastern Minnesota. The Karst Feature Database of Winona County is being expanded to include all the mapped karst features of southeastern Minnesota. Air photos from the 1930s to the 1990s of the Spring Valley Cavern Area in Fillmore County were scanned and geo-referenced into our GIS system. This technology has proved very useful for identifying sinkholes and studying the rate of sinkhole development.
NASA Astrophysics Data System (ADS)
Mori, Kaya; Chonko, James C.; Hailey, Charles J.
2005-10-01
We have reanalyzed the 260 ks XMM-Newton observation of 1E 1207.4-5209. There are several significant improvements over previous work. First, a much broader range of physically plausible spectral models was used. Second, we have used a more rigorous statistical analysis. The standard F-distribution was not employed, but rather the exact finite statistics F-distribution was determined by Monte Carlo simulations. This approach was motivated by the recent work of Protassov and coworkers and Freeman and coworkers. They demonstrated that the standard F-distribution is not even asymptotically correct when applied to assess the significance of additional absorption features in a spectrum. With our improved analysis we do not find a third and fourth spectral feature in 1E 1207.4-5209 but only the two broad absorption features previously reported. Two additional statistical tests, one line model dependent and the other line model independent, confirmed our modified F-test analysis. For all physically plausible continuum models in which the weak residuals are strong enough to fit, the residuals occur at the instrument Au M edge. As a sanity check we confirmed that the residuals are consistent in strength and position with the instrument Au M residuals observed in 3C 273.
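The methodological point (that the significance of an added absorption or emission feature should be calibrated by Monte Carlo simulation rather than taken from a textbook F-distribution) can be illustrated generically. The sketch below uses an assumed flat continuum, Gaussian line profiles on a channel grid, and Poisson noise; a Δχ² statistic plays the role of the F statistic, and nothing here reproduces the XMM-Newton analysis chain.

```python
import numpy as np

def best_delta_chi2(counts, continuum, line_width=3.0):
    """Largest chi-square improvement from adding one Gaussian line (free position and
    amplitude, fixed width) to a fixed continuum model, scanning positions on a grid."""
    channels = np.arange(counts.size)
    sigma2 = np.maximum(counts, 1.0)                    # Gaussian approx. of Poisson errors
    chi2_null = np.sum((counts - continuum) ** 2 / sigma2)
    best = 0.0
    for center in channels[5:-5:2]:
        profile = np.exp(-0.5 * ((channels - center) / line_width) ** 2)
        amp = np.sum(profile * (counts - continuum) / sigma2) / np.sum(profile**2 / sigma2)
        chi2_alt = np.sum((counts - continuum - amp * profile) ** 2 / sigma2)
        best = max(best, chi2_null - chi2_alt)
    return best

rng = np.random.default_rng(9)
continuum = np.full(200, 50.0)                          # flat continuum, 200 channels
observed = rng.poisson(continuum)                       # spectrum with no real line
obs_stat = best_delta_chi2(observed, continuum)

# calibrate the statistic by Monte Carlo instead of a textbook F/chi-square reference
sims = np.array([best_delta_chi2(rng.poisson(continuum), continuum) for _ in range(500)])
print("Delta chi2 = %.1f, Monte Carlo p = %.3f" % (obs_stat, np.mean(sims >= obs_stat)))
```

Because the line position is searched over, the null distribution of the best Δχ² is not the nominal chi-square or F form, which is exactly why the simulated reference distribution is needed.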
NASA Astrophysics Data System (ADS)
Holá, Markéta; Kalvoda, Jiří; Nováková, Hana; Škoda, Radek; Kanický, Viktor
2011-01-01
LA-ICP-MS and solution-based ICP-MS in combination with electron microprobe are presented as a method for the determination of the elemental spatial distribution in fish scales, which represent an example of a heterogeneous layered bone structure. Two different LA-ICP-MS techniques were tested on recent common carp (Cyprinus carpio) scales: a line scan through the whole fish scale perpendicular to the growth rings. The ablation crater of 55 μm width and 50 μm depth allowed analysis of the elemental distribution in the external layer. Suitable ablation conditions providing a deeper ablation crater gave average values from the external HAP layer and the collagen basal plate. Depth profiling using spot analysis was tested in fish scales for the first time. Spot analysis allows information to be obtained about the depth profile of the elements at the selected position on the sample. The combination of all mentioned laser ablation techniques provides complete information about the elemental distribution in the fish scale samples. The results were compared with the solution-based ICP-MS and EMP analyses. The fact that the results of depth profiling are in good agreement both with EMP and PIXE results and with the assumed ways of incorporation of the studied elements in the HAP structure suggests a very good potential for this method.
Distributions-per-level: a means of testing level detectors and models of patch-clamp data.
Schröder, I; Huth, T; Suitchmezian, V; Jarosik, J; Schnell, S; Hansen, U P
2004-01-01
Level or jump detectors generate the reconstructed time series from a noisy record of patch-clamp current. The reconstructed time series is used to create dwell-time histograms for the kinetic analysis of the Markov model of the investigated ion channel. It is shown here that some additional lines in the software of such a detector can provide a powerful new means of patch-clamp analysis. For each current level that can be recognized by the detector, an array is declared. The new software assigns every data point of the original time series to the array that belongs to the actual state of the detector. From the data sets in these arrays distributions-per-level are generated. Simulated and experimental time series analyzed by Hinkley detectors are used to demonstrate the benefits of these distributions-per-level. First, they can serve as a test of the reliability of jump and level detectors. Second, they can reveal beta distributions as resulting from fast gating that would usually be hidden in the overall amplitude histogram. Probably the most valuable feature is that the malfunctions of the Hinkley detectors turn out to depend on the Markov model of the ion channel. Thus, the errors revealed by the distributions-per-level can be used to distinguish between different putative Markov models of the measured time series.
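A hedged sketch of the distributions-per-level bookkeeping described above: each raw data point is appended to an array indexed by the level the detector currently reports, and per-level statistics or histograms are then formed. The simple threshold detector and simulated two-level record below are stand-ins, not the authors' Hinkley detector.

```python
# Minimal sketch of "distributions-per-level" (not the authors' implementation).
import numpy as np

rng = np.random.default_rng(2)
true_level = rng.integers(0, 2, size=5000)                 # toy two-state gating sequence
current = true_level * 1.0 + rng.normal(0, 0.3, 5000)      # noisy single-channel record

detected = (current > 0.5).astype(int)                     # stand-in for a jump/level detector

# Assign every raw data point to the array of the detector's current level.
per_level = {lvl: current[detected == lvl] for lvl in np.unique(detected)}
for lvl, samples in per_level.items():
    hist, edges = np.histogram(samples, bins=50)           # distribution-per-level
    print(f"level {lvl}: n={samples.size}, mean={samples.mean():.3f}, sd={samples.std():.3f}")
```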
Formability analysis of sheet metals by cruciform testing
NASA Astrophysics Data System (ADS)
Güler, B.; Alkan, K.; Efe, M.
2017-09-01
Cruciform biaxial tests are increasingly becoming popular for testing the formability of sheet metals as they achieve frictionless, in-plane, multi-axial stress states with a single sample geometry. However, premature fracture of the samples during testing prevents the large-strain deformation necessary for formability analysis. In this work, we introduce a miniature cruciform sample design (test region of a few mm) and a test setup to achieve centre fracture and large uniform strains. With its excellent surface finish and optimized geometry, the sample deforms with diagonal strain bands intersecting at the test region. These bands prevent local necking and concentrate the strains at the sample centre. Imaging and strain analysis during testing confirm that uniform strain distributions and centre fracture are possible for various strain paths ranging from plane-strain to equibiaxial tension. Moreover, the sample deforms without deviating from the predetermined strain ratio at all test conditions, allowing formability analysis under large strains. We demonstrate these features of the cruciform test for three sample materials: Aluminium 6061-T6 alloy, DC-04 steel and Magnesium AZ31 alloy, and investigate their formability at both the millimetre scale and the microstructure scale.
Li, Leah
2012-01-01
Studies of cognitive development in children are often based on tests designed for specific ages. Examination of the changes in these scores over time may not be meaningful. This paper investigates the influence of early life factors on cognitive development using maths and reading test scores at ages 7, 11, and 16 years in a British birth cohort born in 1958. The distributions of these test scores differ between ages; for example, 20% of participants scored the top mark in the reading test at age 7, and the distribution of reading scores at 16 is heavily skewed. In this paper, we group participants into 5 ordered categories, approximately 20% in each category, according to their test scores at each age. Multilevel models for a repeated ordinal outcome are applied to relate the ordinal scale of maths and reading ability to early life factors. PMID:22661923
Validation Tests of Fiber Optic Strain-Based Operational Shape and Load Measurements
NASA Technical Reports Server (NTRS)
Bakalyar, John A.; Jutte, Christine
2012-01-01
Aircraft design has been progressing toward reduced structural weight to improve fuel efficiency, increase performance, and reduce cost. Lightweight aircraft structures are more flexible than conventional designs and require new design considerations. Intelligent sensing allows for enhanced control and monitoring of aircraft, which enables increased structural efficiency. The NASA Dryden Flight Research Center (DFRC) has developed an instrumentation system and analysis techniques that combine to make distributed structural measurements practical for lightweight vehicles. Dryden's Fiber Optic Strain Sensing (FOSS) technology enables a multitude of lightweight, distributed surface strain measurements. The analysis techniques, referred to as the Displacement Transfer Functions (DTF) and Load Transfer Functions (LTF), use surface strain values to calculate structural deflections and operational loads. The combined system is useful for real-time monitoring of aeroelastic structures, along with many other applications. This paper describes how the capabilities of the measurement system were demonstrated using subscale test articles that represent simple aircraft structures. Empirical FOSS strain data were used within the DTF to calculate the displacement of the article and within the LTF to calculate bending moments due to loads acting on the article. The results of the tests, accuracy of the measurements, and a sensitivity analysis are presented.
Piepho, H P
1995-03-01
The additive main effects multiplicative interaction model is frequently used in the analysis of multilocation trials. In the analysis of such data it is of interest to decide how many of the multiplicative interaction terms are significant. Several tests for this task are available, all of which assume that errors are normally distributed with a common variance. This paper investigates the robustness of several tests (Gollob, FGH1, FGH2, FR) to departures from these assumptions. It is concluded that, because of its better robustness, the FR test is preferable. If the other tests are to be used, preliminary tests for the validity of assumptions should be performed.
Lackey, Daniel P; Carruth, Eric D; Lasher, Richard A; Boenisch, Jan; Sachse, Frank B; Hitchcock, Robert W
2011-11-01
Gap junctions play a fundamental role in intercellular communication in cardiac tissue. Various types of heart disease including hypertrophy and ischemia are associated with alterations of the spatial arrangement of gap junctions. Previous studies applied two-dimensional optical and electron-microscopy to visualize gap junction arrangements. In normal cardiomyocytes, gap junctions were primarily found at cell ends, but can be found also in more central regions. In this study, we extended these approaches toward three-dimensional reconstruction of gap junction distributions based on high-resolution scanning confocal microscopy and image processing. We developed methods for quantitative characterization of gap junction distributions based on analysis of intensity profiles along the principal axes of myocytes. The analyses characterized gap junction polarization at cell ends and higher-order statistical image moments of intensity profiles. The methodology was tested in rat ventricular myocardium. Our analysis yielded novel quantitative data on gap junction distributions. In particular, the analysis demonstrated that the distributions exhibit significant variability with respect to polarization, skewness, and kurtosis. We suggest that this methodology provides a quantitative alternative to current approaches based on visual inspection, with applications in particular in characterization of engineered and diseased myocardium. Furthermore, we propose that these data provide improved input for computational modeling of cardiac conduction.
Distribution System Reliability Analysis for Smart Grid Applications
NASA Astrophysics Data System (ADS)
Aljohani, Tawfiq Masad
Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has provided high hopes of developing an intelligent network that is capable of being a self-healing grid, offering the ability to overcome the interruption problems that face the utility and cost it tens of millions in repairs and losses. To address its reliability concerns, the power utilities and interested parties have spent an extensive amount of time and effort analyzing and studying the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection joint between the power providers and the consumers, where most electricity problems occur. In this work, we examine the effect of smart grid applications on improving the reliability of power distribution networks. The test system used in this thesis is the IEEE 34-node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of automatic switching devices and quantify their proper installation based on the performance of the distribution system. The measures are the changes in the system reliability indices, including SAIDI, SAIFI, and EUE. The goal is to design and simulate the effect of installing Distributed Generators (DGs) on the utility's distribution system and measure the potential improvement in its reliability. The software used in this work is DISREL, an intelligent power distribution software package developed by General Reliability Co.
DeVore, Matthew S.; Gull, Stephen F.; Johnson, Carey K.
2012-01-01
We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions. PMID:22338694
Preliminary evaluation of the feasibility of artificial recharge in northern Qatar
Vecchioli, John
1976-01-01
Fresh ground water in northern Qatar occurs as a lens in limestone and dolomite of Eocene age. Natural recharge from precipitation averages 17 × 10^6 cubic metres per year whereas current discharge averages 26.6 × 10^6 cubic metres per year. Depletion of storage is accompanied by a deterioration in quality due to encroachment of salty water from the Gulf and from underlying formations. Artificial recharge with desalted sea water to permit additional agricultural development appears technically feasible but its practicability needs to be examined further. A hydrogeological appraisal including test drilling, geophysical logging, pumping tests, and a recharge test, coupled with engineering analysis of direct surface storage/distribution of desalted sea water versus aquifer storage/distribution, is recommended.
An Analysis of Test Equating Models for the Alabama High School Graduation Examination.
ERIC Educational Resources Information Center
Glowacki, Margaret L.
The purpose of this study was to determine which equating models are appropriate for the Alabama High School Graduation Examination (AHSGE) by equating two previously administered fall forms for each subject area of the AHSGE and determining whether differences exist in the test score distributions or passing scores resulting from the equating…
ERIC Educational Resources Information Center
Haile, Getinet Astatike; Nguyen, Anh Ngoc
2008-01-01
We investigate the determinants of high school students' academic attainment in mathematics, reading and science in the United States; focusing particularly on possible differential impacts of ethnicity and family background across the distribution of test scores. Using data from the NELS2000 and employing quantile regression, we find two…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Denslow, Kayte M.; Bontha, Jagannadha R.; Adkins, Harold E.
This document presents the visual and ultrasonic PulseEcho critical velocity test results obtained from the System Performance test campaign that was completed in September 2012 with the Remote Sampler Demonstration (RSD)/Waste Feed Flow Loop cold-test platform located at the Monarch test facility in Pasco, Washington. This report is intended to complement and accompany the report that will be developed by WRPS on the design of the System Performance simulant matrix, the analysis of the slurry test sample concentration and particle size distribution (PSD) data, and the design and construction of the RSD/Waste Feed Flow Loop cold-test platform.
Harmonic Analysis of Electric Vehicle Loadings on Distribution System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Yijun A; Xu, Yunshan; Chen, Zimin
2014-12-01
With the increasing number of Electric Vehicles (EVs) in this age, the power system is facing huge challenges from the high penetration of EV charging stations. Therefore, a technical study of the impact of EV charging on the distribution system is required. This paper uses PSCAD software and aims to analyze the Total Harmonic Distortion (THD) introduced by electric vehicle charging stations in power systems. The paper starts by choosing the IEEE 34-node test feeder as the distribution system, building an electric vehicle level-two charging battery model, and defining four different testing scenarios: overhead transmission line versus underground cable, industrial area, transformer, and photovoltaic (PV) system. A statistical method is then used to analyze the characteristics of THD during the plug-in transient, plug-out transient, and steady-state charging conditions associated with these four scenarios. Finally, the factors influencing the THD in the different scenarios are identified. The results lead to constructive suggestions for both electric vehicle charging station construction and customers' charging habits.
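For reference, THD can be computed from a sampled current waveform roughly as follows; the synthetic 60 Hz waveform and harmonic amplitudes below are assumptions for illustration, not PSCAD output.

```python
# Hedged sketch: Total Harmonic Distortion of a charging-current waveform from its spectrum.
import numpy as np

fs, f0 = 10_000, 60                                  # sampling rate and fundamental (Hz)
t = np.arange(0, 1, 1 / fs)
i_t = np.sin(2*np.pi*f0*t) + 0.08*np.sin(2*np.pi*3*f0*t) + 0.05*np.sin(2*np.pi*5*f0*t)

spectrum = np.abs(np.fft.rfft(i_t)) / len(t) * 2     # single-sided amplitude spectrum
freqs = np.fft.rfftfreq(len(t), 1 / fs)

fund = spectrum[np.argmin(np.abs(freqs - f0))]
harmonics = [spectrum[np.argmin(np.abs(freqs - k * f0))] for k in range(2, 40)]
thd = np.sqrt(np.sum(np.square(harmonics))) / fund   # ratio of harmonic to fundamental content
print(f"THD = {100 * thd:.2f} %")
```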
Considering Horn's Parallel Analysis from a Random Matrix Theory Point of View.
Saccenti, Edoardo; Timmerman, Marieke E
2017-03-01
Horn's parallel analysis is a widely used method for assessing the number of principal components and common factors. We discuss the theoretical foundations of parallel analysis for principal components based on a covariance matrix by making use of arguments from random matrix theory. In particular, we show that (i) for the first component, parallel analysis is an inferential method equivalent to the Tracy-Widom test, (ii) its use to test high-order eigenvalues is equivalent to the use of the joint distribution of the eigenvalues, and thus should be discouraged, and (iii) a formal test for higher-order components can be obtained based on a Tracy-Widom approximation. We illustrate the performance of the two testing procedures using simulated data generated under both a principal component model and a common factors model. For the principal component model, the Tracy-Widom test performs consistently in all conditions, while parallel analysis shows unpredictable behavior for higher-order components. For the common factor model, including major and minor factors, both procedures are heuristic approaches, with variable performance. We conclude that the Tracy-Widom procedure is preferred over parallel analysis for statistically testing the number of principal components based on a covariance matrix.
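A compact sketch of Horn's parallel analysis as discussed above (covariance-matrix version): observed eigenvalues are compared against a reference quantile of eigenvalues from random uncorrelated data of the same dimensions. The data-generating step and retained quantile are hypothetical choices for illustration, not the authors' simulation design.

```python
# Illustrative sketch of Horn's parallel analysis for principal components.
import numpy as np

def parallel_analysis(X, n_sim=200, quantile=0.95, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs = np.sort(np.linalg.eigvalsh(np.cov(X, rowvar=False)))[::-1]
    ref = np.empty((n_sim, p))
    for i in range(n_sim):
        R = rng.standard_normal((n, p)) * X.std(axis=0, ddof=1)   # random data, same scales
        ref[i] = np.sort(np.linalg.eigvalsh(np.cov(R, rowvar=False)))[::-1]
    threshold = np.quantile(ref, quantile, axis=0)                # reference eigenvalue quantiles
    return int(np.sum(obs > threshold)), obs, threshold

rng = np.random.default_rng(3)
scores = rng.standard_normal((300, 2)) @ rng.standard_normal((2, 8)) + 0.5 * rng.standard_normal((300, 8))
k, obs, thr = parallel_analysis(scores)
print(f"components retained: {k}")
```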
Cost-Benefit Analysis of the 2006 Air Force Materiel Command Test and Evaluation Proposal
2008-01-01
NASA Technical Reports Server (NTRS)
Shumka, A.; Sollock, S. G.
1981-01-01
This paper represents the first comprehensive survey of the Mount Laguna Photovoltaic Installation. The novel techniques used for performing the field tests have been effective in locating and characterizing defective modules. A comparative analysis on the two types of modules used in the array indicates that they have significantly different failure rates, different distributions in degradational space and very different failure modes. A life cycle model is presented to explain a multimodal distribution observed for one module type. A statistical model is constructed and it is shown to be in good agreement with the field data.
NASA Astrophysics Data System (ADS)
Sembiring, N.; Ginting, E.; Darnello, T.
2017-12-01
In a company that produces refined sugar, the production floor has not reached the required availability of its critical machines because they frequently suffer damage (breakdown). This results in sudden losses of production time and production opportunities. The problem can be addressed by reliability engineering methods, in which a statistical analysis of historical failure data is performed to identify the distribution pattern. The method provides the reliability, failure rate, and availability of a machine over the scheduled maintenance interval. The distribution test on the time-between-failure (MTTF) data shows that the flexible hose component follows a lognormal distribution while the teflon cone lifting component follows a Weibull distribution. The distribution test on the mean time to repair (MTTR) data shows that the flexible hose component follows an exponential distribution while the teflon cone lifting component follows a Weibull distribution. For the flexible hose component on a replacement schedule of every 720 hours, the actual reliability is 0.2451 and the availability 0.9960, while for the critical teflon cone lifting component on a replacement schedule of every 1944 hours, the actual reliability is 0.4083 and the availability 0.9927.
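The calculation pattern described in this abstract can be sketched as follows, assuming synthetic failure and repair times; the Weibull fit, the reliability at a replacement interval, and the steady-state availability mirror the quantities reported, but the data and numbers below are placeholders.

```python
# Hedged sketch: fit a Weibull distribution to time-between-failure data and
# evaluate reliability at a proposed replacement interval. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
ttf = stats.weibull_min.rvs(1.8, scale=1500, size=40, random_state=rng)   # hours between failures
ttr = stats.expon.rvs(scale=6, size=40, random_state=rng)                 # repair times (hours)

shape, loc, scale = stats.weibull_min.fit(ttf, floc=0)
interval = 720                                                            # replacement interval (h)
reliability = stats.weibull_min.sf(interval, shape, loc, scale)           # R(t) = P(T > t)

mttf, mttr = ttf.mean(), ttr.mean()
availability = mttf / (mttf + mttr)                                       # steady-state availability
print(f"Weibull shape={shape:.2f}, scale={scale:.0f} h, R(720 h)={reliability:.3f}, A={availability:.4f}")
```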
78 FR 69130 - Importer of Controlled Substances; Notice of Application; Cerilliant Corporation
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-18
... substances for manufacture and distribution to their research and forensic customers conducting drug testing and analysis. Any bulk manufacturer who is presently, or is applying to be, registered with DEA to...
78 FR 23959 - Manufacturer of Controlled Substances; Notice of Registration; Cayman Chemical Company
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-23
... plans to manufacture the listed controlled substances for distribution to their research and forensic customers conducting drug testing and analysis. No comments or objections have been received. DEA has...
Quality control analysis : part I : asphaltic concrete.
DOT National Transportation Integrated Search
1964-11-01
This report deals with the statistical evaluation of results from several hot mix plants to determine the pattern of variability with respect to bituminous hot mix characteristics. : Individual tests results when subjected to frequency distribution i...
NASA Technical Reports Server (NTRS)
Gauthier, J. J.; Roman, M. C.; Kilgore, B. A.; Huff, T. L.; Obenhuber, D. C.; Terrell, D. W.; Wilson, M. E.; Jackson, N. E.
1991-01-01
NASA/MSFC is developing a physical/chemical treatment system to reclaim wastewater for reuse on Space Station Freedom (SSF). Integrated testing of hygiene and potable water subsystems assessed the capability to reclaim water to SSF specifications. The test was conducted from May through July 1990 with a total of 47 days of system test operation. Water samples were analyzed using standard cultural methods employing membrane filtration and spread plate techniques and epifluorescence microscopy. Fatty acid methyl ester and biochemical profiles were used for microbial identification. Analysis of waste and product water produced by the subsystems demonstrated the effective reduction of viable microbial populations greater than 8.0E+06 colony forming units (CFU) per 100 mL to an average of 5 CFU/100 mL prior to distribution into storage tanks.
Li, Jianwei; Zhang, Weimin; Zeng, Weiqin; Chen, Guolong; Qiu, Zhongchao; Cao, Xinyuan; Gao, Xuanyi
2017-01-01
Estimation of the stress distribution in ferromagnetic components is very important for evaluating the working status of mechanical equipment and implementing preventive maintenance. Eddy current testing technology is a promising method in this field because of its advantages of safety, no need of coupling agent, etc. In order to reduce the cost of eddy current stress measurement system, and obtain the stress distribution in ferromagnetic materials without scanning, a low cost eddy current stress measurement system based on Archimedes spiral planar coil was established, and a method based on BP neural network to obtain the stress distribution using the stress of several discrete test points was proposed. To verify the performance of the developed test system and the validity of the proposed method, experiment was implemented using structural steel (Q235) specimens. Standard curves of sensors at each test point were achieved, the calibrated data were used to establish the BP neural network model for approximating the stress variation on the specimen surface, and the stress distribution curve of the specimen was obtained by interpolating with the established model. The results show that there is a good linear relationship between the change of signal modulus and the stress in most elastic range of the specimen, and the established system can detect the change in stress with a theoretical average sensitivity of -0.4228 mV/MPa. The obtained stress distribution curve is well consonant with the theoretical analysis result. At last, possible causes and improving methods of problems appeared in the results were discussed. This research has important significance for reducing the cost of eddy current stress measurement system, and advancing the engineering application of eddy current stress testing.
Li, Jianwei; Zeng, Weiqin; Chen, Guolong; Qiu, Zhongchao; Cao, Xinyuan; Gao, Xuanyi
2017-01-01
Estimation of the stress distribution in ferromagnetic components is very important for evaluating the working status of mechanical equipment and implementing preventive maintenance. Eddy current testing technology is a promising method in this field because of its advantages of safety, no need of coupling agent, etc. In order to reduce the cost of eddy current stress measurement system, and obtain the stress distribution in ferromagnetic materials without scanning, a low cost eddy current stress measurement system based on Archimedes spiral planar coil was established, and a method based on BP neural network to obtain the stress distribution using the stress of several discrete test points was proposed. To verify the performance of the developed test system and the validity of the proposed method, experiment was implemented using structural steel (Q235) specimens. Standard curves of sensors at each test point were achieved, the calibrated data were used to establish the BP neural network model for approximating the stress variation on the specimen surface, and the stress distribution curve of the specimen was obtained by interpolating with the established model. The results show that there is a good linear relationship between the change of signal modulus and the stress in most elastic range of the specimen, and the established system can detect the change in stress with a theoretical average sensitivity of -0.4228 mV/MPa. The obtained stress distribution curve is well consonant with the theoretical analysis result. At last, possible causes and improving methods of problems appeared in the results were discussed. This research has important significance for reducing the cost of eddy current stress measurement system, and advancing the engineering application of eddy current stress testing. PMID:29145500
Code C# for chaos analysis of relativistic many-body systems
NASA Astrophysics Data System (ADS)
Grossu, I. V.; Besliu, C.; Jipa, Al.; Bordeianu, C. C.; Felea, D.; Stan, E.; Esanu, T.
2010-08-01
This work presents a new Microsoft Visual C# .NET code library, conceived as a general object oriented solution for chaos analysis of three-dimensional, relativistic many-body systems. In this context, we implemented the Lyapunov exponent and the “fragmentation level” (defined using the graph theory and the Shannon entropy). Inspired by existing studies on billiard nuclear models and clusters of galaxies, we tried to apply the virial theorem for a simplified many-body system composed by nucleons. A possible application of the “virial coefficient” to the stability analysis of chaotic systems is also discussed. Catalogue identifier: AEGH_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEGH_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 30 053 No. of bytes in distributed program, including test data, etc.: 801 258 Distribution format: tar.gz Programming language: Visual C# .NET 2005 Computer: PC Operating system: .Net Framework 2.0 running on MS Windows Has the code been vectorized or parallelized?: Each many-body system is simulated on a separate execution thread RAM: 128 Megabytes Classification: 6.2, 6.5 External routines: .Net Framework 2.0 Library Nature of problem: Chaos analysis of three-dimensional, relativistic many-body systems. Solution method: Second order Runge-Kutta algorithm for simulating relativistic many-body systems. Object oriented solution, easy to reuse, extend and customize, in any development environment which accepts .Net assemblies or COM components. Implementation of: Lyapunov exponent, “fragmentation level”, “average system radius”, “virial coefficient”, and energy conservation precision test. Additional comments: Easy copy/paste based deployment method. Running time: Quadratic complexity.
Distributed Evaluation of Local Sensitivity Analysis (DELSA), with application to hydrologic models
Rakovec, O.; Hill, Mary C.; Clark, M.P.; Weerts, A. H.; Teuling, A. J.; Uijlenhoet, R.
2014-01-01
This paper presents a hybrid local-global sensitivity analysis method termed the Distributed Evaluation of Local Sensitivity Analysis (DELSA), which is used here to identify important and unimportant parameters and evaluate how model parameter importance changes as parameter values change. DELSA uses derivative-based “local” methods to obtain the distribution of parameter sensitivity across the parameter space, which promotes consideration of sensitivity analysis results in the context of simulated dynamics. This work presents DELSA, discusses how it relates to existing methods, and uses two hydrologic test cases to compare its performance with the popular global, variance-based Sobol' method. The first test case is a simple nonlinear reservoir model with two parameters. The second test case involves five alternative “bucket-style” hydrologic models with up to 14 parameters applied to a medium-sized catchment (200 km2) in the Belgian Ardennes. Results show that in both examples, Sobol' and DELSA identify similar important and unimportant parameters, with DELSA enabling more detailed insight at much lower computational cost. For example, in the real-world problem the time delay in runoff is the most important parameter in all models, but DELSA shows that for about 20% of parameter sets it is not important at all and alternative mechanisms and parameters dominate. Moreover, the time delay was identified as important in regions producing poor model fits, whereas other parameters were identified as more important in regions of the parameter space producing better model fits. The ability to understand how parameter importance varies through parameter space is critical to inform decisions about, for example, additional data collection and model development. The ability to perform such analyses with modest computational requirements provides exciting opportunities to evaluate complicated models as well as many alternative models.
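A rough sketch of the DELSA idea under stated assumptions: derivative-based first-order sensitivities are evaluated at many sampled points in parameter space, giving a distribution of local sensitivity indices rather than one global value. The toy two-parameter model and uniform parameter ranges below are hypothetical, not the paper's hydrologic models.

```python
# Illustrative sketch of distributed local sensitivity analysis (not the authors' code).
import numpy as np

def model(k, s):                        # toy nonlinear "reservoir" response
    return s * (1.0 - np.exp(-k)) / k

rng = np.random.default_rng(5)
samples = rng.uniform([0.1, 0.5], [2.0, 2.0], size=(1000, 2))   # (k, s) parameter sets
eps = 1e-4

delsa_k = []
for k, s in samples:
    dy_dk = (model(k + eps, s) - model(k - eps, s)) / (2 * eps)  # local derivatives
    dy_ds = (model(k, s + eps) - model(k, s - eps)) / (2 * eps)
    var_k = (dy_dk * (2.0 - 0.1) / np.sqrt(12)) ** 2             # variance contribution, uniform prior
    var_s = (dy_ds * (2.0 - 0.5) / np.sqrt(12)) ** 2
    delsa_k.append(var_k / (var_k + var_s))                      # first-order local sensitivity of k

delsa_k = np.array(delsa_k)
print(f"S_k: median={np.median(delsa_k):.2f}, "
      f"5-95%=({np.quantile(delsa_k, 0.05):.2f}, {np.quantile(delsa_k, 0.95):.2f})")
```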
Rine, J.M.; Shafer, J.M.; Covington, E.; Berg, R.C.
2006-01-01
Published information on the correlation and field-testing of the technique of stack-unit/aquifer sensitivity mapping with documented subsurface contaminant plumes is rare. The inherent characteristic of stack-unit mapping, which makes it a superior technique to other analyses that amalgamate data, is the ability to deconstruct the sensitivity analysis on a unit-by-unit basis. An aquifer sensitivity map, delineating the relative sensitivity of the Crouch Branch aquifer of the Administrative/Manufacturing Area (A/M) at the Savannah River Site (SRS) in South Carolina, USA, incorporates six hydrostratigraphic units, surface soil units, and relevant hydrologic data. When this sensitivity map is compared with the distribution of the contaminant tetrachloroethylene (PCE), PCE is present within the Crouch Branch aquifer within an area classified as highly sensitive, even though the PCE was primarily released on the ground surface within areas classified with low aquifer sensitivity. This phenomenon is explained through analysis of the aquifer sensitivity map, the groundwater potentiometric surface maps, and the plume distributions within the area on a unit-by-unit basis. The results of this correlation show how the paths of the PCE plume are influenced by both the geology and the groundwater flow. © Springer-Verlag 2006.
A Bayesian approach to parameter and reliability estimation in the Poisson distribution.
NASA Technical Reports Server (NTRS)
Canavos, G. C.
1972-01-01
For life testing procedures, a Bayesian analysis is developed with respect to a random intensity parameter in the Poisson distribution. Bayes estimators are derived for the Poisson parameter and the reliability function based on uniform and gamma prior distributions of that parameter. A Monte Carlo procedure is implemented to make possible an empirical mean-squared error comparison between Bayes and existing minimum variance unbiased, as well as maximum likelihood, estimators. As expected, the Bayes estimators have mean-squared errors that are appreciably smaller than those of the other two.
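A minimal conjugate-update sketch consistent with the abstract (gamma prior on the Poisson intensity, squared-error-loss Bayes estimators for the parameter and for the reliability exp(-λt)); the prior values, mission time, and data below are assumptions for illustration, not the paper's Monte Carlo study.

```python
# Hedged sketch of a gamma-prior Bayes analysis for a Poisson intensity.
import numpy as np

rng = np.random.default_rng(6)
counts = rng.poisson(lam=2.0, size=20)          # observed failure counts per unit time

a0, b0 = 1.0, 1.0                               # Gamma(shape, rate) prior for lambda
a_post = a0 + counts.sum()                      # conjugate posterior shape
b_post = b0 + counts.size                       # conjugate posterior rate

lam_bayes = a_post / b_post                     # posterior mean (squared-error loss)
t = 0.5                                         # mission time
rel_bayes = (b_post / (b_post + t)) ** a_post   # posterior mean of exp(-lambda * t)
lam_mle = counts.mean()
print(f"Bayes lambda={lam_bayes:.3f}, MLE={lam_mle:.3f}, "
      f"Bayes R({t})={rel_bayes:.3f}, MLE R({t})={np.exp(-lam_mle * t):.3f}")
```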
Brown, Zachary M; Gibbs, Jenna C; Adachi, Jonathan D; Ashe, Maureen C; Hill, Keith D; Kendler, David L; Khan, Aliya; Papaioannou, Alexandra; Prasad, Sadhana; Wark, John D; Giangregorio, Lora M
2017-11-28
We sought to evaluate the Balance Outcome Measure for Elder Rehabilitation (BOOMER) in community-dwelling women 65 years and older with vertebral fracture and to describe score distributions and potential ceiling and floor effects. This was a secondary data analysis of baseline data from the Build Better Bones with Exercise randomized controlled trial using the BOOMER. A total of 141 women with osteoporosis and radiographically confirmed vertebral fracture were included. Concurrent validity and internal consistency were assessed in comparison to the Short Physical Performance Battery (SPPB). Normality and ceiling/floor effects of total BOOMER scores and component test items were also assessed. Exploratory analyses of assistive aid use and falls history were performed. Tests for concurrent validity demonstrated moderate correlation between total BOOMER and SPPB scores. The BOOMER component tests showed modest internal consistency. Substantial ceiling effect and nonnormal score distributions were present among overall sample and those not using assistive aids for total BOOMER scores, although scores were normally distributed for those using assistive aids. The static standing with eyes closed test demonstrated the greatest ceiling effects of the component tests, with 92% of participants achieving a maximal score. While the BOOMER compares well with the SPPB in community-dwelling women with vertebral fractures, researchers or clinicians considering using the BOOMER in similar or higher-functioning populations should be aware of the potential for ceiling effects.
[Analysis for Discordance of Positive and Negative Blood Typing by Gel Card].
Li, Cui-Ying; Xu, Hong; Lei, Hui-Fen; Liu, Juan; Li, Xiao-Wei
2017-08-01
To explore the method of Gel card identification of ABO blood groups, determine the causes of discordance and the distribution of influencing diseases, and propose solutions. A total of 240 blood specimens with discordant positive and negative typing identified by Gel card were collected from patients and sent to the blood type reference laboratory for examination with the classic tube method and serological tests, such as the salivary blood-group substance test; genotyping was performed when the serological tests could not determine the type. Among the 240 specimens with discordant positive and negative typing identified by Gel card, 107 specimens (44.58%) proved consistent in positive and negative typing when examined for false agglutination, while 133 specimens (55.42%) remained discordant; of these, 35 cases (14.58%) involved an inconsistent cold agglutination test, 22 cases (9.17%) weakened AB antigenicity, 16 cases (6.67%) ABO subtypes, 12 cases (5.00%) a positive direct antiglobulin test, 11 cases (4.58%) reduced or absent antibodies, 11 cases (4.58%) false aggregation caused by drugs or protein, 11 cases (4.58%) salivary blood-type substances, 8 cases (3.33%) non-ABO alloantibodies, and 7 cases (2.92%) allogeneic bone marrow transplantation. The distribution of diseases was as follows: blood diseases (16.83%), tumors (11.88%), and cardiopulmonary diseases (11.39%); chi-square test results indicated that the distribution differed significantly. The analysis of ABO blood grouping shows a variety of factors influencing positive and negative blood typing, and Gel card identification can produce more false-positive blood types. Therefore, more attention should be paid to high-incidence diseases such as blood disease, tumor, and cardiopulmonary disease.
Subtypes of breast cancer show different spatial distributions of brain metastases.
Kyeong, Sunghyon; Cha, Yoon Jin; Ahn, Sung Gwe; Suh, Sang Hyun; Son, Eun Ju; Ahn, Sung Jun
2017-01-01
The aim of our study was to test the hypothesis that the spatial distributions of breast cancer brain metastases (BM) differ according to their biological subtypes. MR images of 100 patients with BM from primary breast cancer were retrospectively reviewed. Patients were divided according to the biological subtype of the primary tumor (triple-negative: 24, HER2-positive: 48, luminal: 28). All images marked with BMs were standardized to the human brain MRI atlas provided by the Montreal Neurological Institute 152 database. The distribution pattern of BM was evaluated with intra-group and intergroup analyses. In the intra-group analysis, hot spots of metastases from the triple-negative subtype were evenly distributed throughout the brain, whereas BMs from the HER2-positive and luminal subtypes occurred predominantly in the occipital lobe and cerebellum. In the intergroup analysis, BMs from the triple-negative subtype occurred more often in the frontal lobe, limbic region, and parietal lobe compared with the other subtypes (P < .05). Breast cancer subtypes tend to demonstrate different spatial distributions of their BMs. These findings may have direct implications for dose modulation in prophylactic irradiation as well as for differential diagnoses. Thus, this result should be validated in a future study with a larger population.
Genetic Modeling of Radiation Injury in Prostate Cancer Patients Treated with Radiotherapy
2017-10-01
approaches in the GWAS meta-analysis: 1) logistic regression to test association of each SNP with grade 1 or worse toxicity at 2 years post ...
Risk Metrics for Android (trademark) Devices
2017-02-01
allows for easy distribution of malware. This report surveys malware distribution methodologies, then describes current work being done to determine the... given a standard weight of w_i = 1. Two data sets were used for testing this methodology. Because the authors are Chinese, they chose to download apps... Order Analysis excels at handling non-obfuscated apps, but may not be able to detect malware that employs encryption or dynamically changes its payload
Avionics test bed development plan
NASA Technical Reports Server (NTRS)
Harris, L. H.; Parks, J. M.; Murdock, C. R.
1981-01-01
A development plan for a proposed avionics test bed facility for the early investigation and evaluation of new concepts for the control of large space structures, orbiter attached flex body experiments, and orbiter enhancements is presented. A distributed data processing facility that utilizes the current laboratory resources for the test bed development is outlined. Future studies required for implementation, the management system for project control, and the baseline system configuration are defined. A background analysis of the specific hardware system for the preliminary baseline avionics test bed system is included.
Qiao, Yuchun; Shang, Jizhen; Li, Shuying; Feng, Luping; Jiang, Yao; Duan, Zhiqiang; Lv, Xiaoxia; Zhang, Chunxian; Yao, Tiantian; Dong, Zhichao; Zhang, Yu; Wang, Hua
2016-01-01
A fluorimetric Hg2+ test strip has been developed using a lotus-inspired fabrication method for suppressing the “coffee stains” toward the uniform distribution of probe materials through creating a hydrophobic drying pattern for fast solvent evaporation. The test strips were first loaded with the model probes of fluorescent gold-silver nanoclusters and then dried in vacuum on the hydrophobic pattern. On the one hand, here, the hydrophobic constraining forces from the lotus surface-like pattern could control the exterior transport of dispersed nanoclusters on strips leading to the minimized “coffee stains”. On the other hand, the vacuum-aided fast solvent evaporation could boost the interior Marangoni flow of probe materials on strips to expect the further improved probe distribution on strips. High aqueous stability and enhanced fluorescence of probes on test strips were realized by the hydrophilic treatment with amine-derivatized silicane. A test strips-based fluorimetry has thereby been developed for probing Hg2+ ions in wastewater, showing the detection performances comparable to the classic instrumental analysis ones. Such a facile and efficient fabrication route for the bio-inspired suppression of “coffee stains” on test strips may expand the scope of applications of test strips-based “point-of-care” analysis methods or detection devices in the biomedical and environmental fields. PMID:27812040
NASA Astrophysics Data System (ADS)
Muñoz-Gorriz, J.; Monaghan, S.; Cherkaoui, K.; Suñé, J.; Hurley, P. K.; Miranda, E.
2017-12-01
The angular wavelet analysis is applied for assessing the spatial distribution of breakdown spots in Pt/HfO2/Pt capacitors with areas ranging from 10^4 to 10^5 μm^2. The breakdown spot lateral sizes are in the range from 1 to 3 μm, and they appear distributed on the top metal electrode as a point pattern. The spots are generated by ramped and constant voltage stresses and are the consequence of microexplosions caused by the formation of shorts spanning the dielectric film. This kind of pattern was analyzed in the past using the conventional spatial analysis tools such as intensity plots, distance histograms, pair correlation function, and nearest neighbours. Here, we show that the wavelet analysis offers an alternative and complementary method for testing whether or not the failure site distribution departs from a complete spatial randomness process in the angular domain. The effect of using different wavelet functions, such as the Haar, Sine, French top hat, Mexican hat, and Morlet, as well as the roles played by the process intensity, the location of the voltage probe, and the aspect ratio of the device, are all discussed.
NASA Astrophysics Data System (ADS)
Cheng, Liangliang; Busca, Giorgio; Cigada, Alfredo
2017-07-01
Modal analysis is commonly considered as an effective tool to obtain the intrinsic characteristics of structures including natural frequencies, modal damping ratios, and mode shapes, which are significant indicators for monitoring the health status of engineering structures. The complex mode indicator function (CMIF) can be regarded as an effective numerical tool to perform modal analysis. In this paper, experimental strain modal analysis based on the CMIF has been introduced. Moreover, a distributed fiber-optic sensor, as a dense measuring device, has been applied to acquire strain data along a beam surface. Thanks to the dense spatial resolution of the distributed fiber optics, more detailed mode shapes could be obtained. In order to test the effectiveness of the method, a mass lump—considered as a linear damage component—has been attached to the surface of the beam, and damage detection based on strain mode shape has been carried out. The results manifest that strain modal parameters can be estimated effectively by utilizing the CMIF based on the corresponding simulations and experiments. Furthermore, damage detection based on strain mode shapes benefits from the accuracy of strain mode shape recognition and the excellent performance of the distributed fiber optics.
Percentiles of the null distribution of 2 maximum lod score tests.
Ulgen, Ayse; Yoo, Yun Joo; Gordon, Derek; Finch, Stephen J; Mendell, Nancy R
2004-01-01
We here consider the null distribution of the maximum lod score (LOD-M) obtained upon maximizing over transmission model parameters (penetrance values, dominance, and allele frequency) as well as the recombination fraction. Also considered is the lod score maximized over a fixed choice of genetic model parameters and recombination-fraction values set prior to the analysis (MMLS), as proposed by Hodge et al. The objective is to fit parametric distributions to MMLS and LOD-M. Our results are based on 3,600 simulations of samples of n = 100 nuclear families ascertained for having one affected member and at least one other sibling available for linkage analysis. Each null distribution is approximately a mixture p·χ²(0) + (1 − p)·χ²(ν). The values of MMLS appear to fit the mixture 0.20·χ²(0) + 0.80·χ²(1.6). The mixture distribution 0.13·χ²(0) + 0.87·χ²(2.8) appears to describe the null distribution of LOD-M. From these results we derive a simple method for obtaining critical values of LOD-M and MMLS. Copyright 2004 S. Karger AG, Basel
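Using the fitted mixtures quoted above and the standard relation 2·ln(10)·LOD ~ χ², a critical LOD value follows because only the continuous χ² component of the mixture can exceed a positive threshold. The sketch below illustrates this arithmetic; the significance level is an arbitrary example and this is not necessarily the exact procedure the authors propose.

```python
# Hedged sketch: critical LOD values from a chi-square mixture p*chi2(0) + (1-p)*chi2(v).
import numpy as np
from scipy import stats

def lod_critical(alpha, p_zero, df):
    """Critical LOD score: only the continuous chi-square part can exceed it."""
    chi2_crit = stats.chi2.isf(alpha / (1.0 - p_zero), df)   # supports non-integer df
    return chi2_crit / (2.0 * np.log(10.0))                  # convert chi-square to LOD units

for name, p0, df in [("MMLS", 0.20, 1.6), ("LOD-M", 0.13, 2.8)]:
    print(name, round(lod_critical(alpha=0.001, p_zero=p0, df=df), 2))
```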
Exploring the Factor Structure of Neurocognitive Measures in Older Individuals
Santos, Nadine Correia; Costa, Patrício Soares; Amorim, Liliana; Moreira, Pedro Silva; Cunha, Pedro; Cotter, Jorge; Sousa, Nuno
2015-01-01
Here we focus on factor analysis from a best-practices point of view, by investigating the factor structure of neuropsychological tests and using the results obtained to illustrate how to choose a reasonable solution. The sample (n=1051 individuals) was randomly divided into two groups: one for exploratory factor analysis (EFA) and principal component analysis (PCA), to investigate the number of factors underlying the neurocognitive variables; the second to test the “best fit” model via confirmatory factor analysis (CFA). For the exploratory step, three extraction (maximum likelihood, principal axis factoring and principal components) and two rotation (orthogonal and oblique) methods were used. The analysis methodology allowed exploring how different cognitive/psychological tests correlated with and discriminated between dimensions, indicating that to capture latent structures in similar sample sizes and measures, with approximately normal data distribution, reflective models with oblimin rotation might prove the most adequate. PMID:25880732
Montoliu, Lluís
2012-06-01
The analysis of transgenic and knockout mice always involves the establishment of matings with individuals carrying different loci, segregating independently, whose presence is expected among the progeny according to a Mendelian distribution. The appearance of distorted inheritance ratios suggests the existence of unexpected lethal or sub-lethal phenotypes associated with some genotypes. These situations are common in a number of cases, including: testing transgenic founder mice for germ-line transmission of their transgenes; setting up heterozygous crosses to obtain homozygous individuals, both for transgenic and knockout mice; establishing matings between floxed mouse lines and suitable cre transgenic mouse lines, etc. Pearson's χ² test can be used to assess the significance of the observed frequencies of genotypes/phenotypes in relation to the expected values, in order to determine whether the observed cases fit the expected distribution. Here, I describe a simple Excel workbook to compare the observed and expected distributions of genotypes/phenotypes in transgenic and knockout mouse crosses involving up to three unlinked loci by means of a χ² test. The file is freely available for download from my laboratory's web page at: http://www.cnb.csic.es/~montoliu/Mendel.xls.
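A Python analogue of the kind of check the Excel workbook performs (this sketch is not the workbook itself): compare observed genotype counts from a heterozygous cross against the expected 1:2:1 Mendelian ratio with Pearson's χ² test. The counts are hypothetical.

```python
# Minimal sketch of a Mendelian-ratio chi-square test for a het x het cross.
import numpy as np
from scipy import stats

observed = np.array([18, 45, 17])                               # hypothetical counts (AA, Aa, aa)
expected = observed.sum() * np.array([0.25, 0.5, 0.25])         # expected 1:2:1 ratio
chi2, p = stats.chisquare(observed, expected)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}  ->  "
      f"{'fits' if p > 0.05 else 'deviates from'} the Mendelian 1:2:1 expectation")
```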
TRAC-PD2 posttest analysis of CCTF Test C1-16 (Run 025). [Cylindrical Core Test Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sugimoto, J.
The TRAC-PD2 code version was used to analyze CCTF Test C1-16 (Run 025). The results indicate that the core heater rod temperatures, the liquid mass in the vessel, and differential pressures in the primary loop are predicted well, but the void fraction distribution in the core and water accumulation in the upper plenum are not in good agreement with the data.
Proposed military handbook for dynamic data acquisition and analysis - An invitation to review
NASA Technical Reports Server (NTRS)
Himelblau, Harry; Wise, James H.; Piersol, Allan G.; Grundvig, Max R.
1990-01-01
A draft Military Handbook prepared under the sponsorship of the USAF Space Division is presently being distributed throughout the U.S. for review by the aerospace community. This comprehensive document provides recommended guidelines for the acquisition and analysis of structural dynamics and aeroacoustic data, and is intended to reduce the errors and variability commonly found in flight, ground and laboratory dynamic test measurements. In addition to the usual variety of measurement problems encountered in the definition of dynamic loads, the development of design and test criteria, and the analysis of failures, special emphasis is given to certain state-of-the-art topics, such as pyroshock data acquisition and nonstationary random data analysis.
Stress Analysis of Columns and Beam Columns by the Photoelastic Method
NASA Technical Reports Server (NTRS)
Ruffner, B F
1946-01-01
Principles of similarity and other factors in the design of models for photoelastic testing are discussed. Some approximate theoretical equations, useful in the analysis of results obtained from photoelastic tests are derived. Examples of the use of photoelastic techniques and the analysis of results as applied to uniform and tapered beam columns, circular rings, and statically indeterminate frames, are given. It is concluded that this method is an effective tool for the analysis of structures in which column action is present, particularly in tapered beam columns, and in statically indeterminate structures in which the distribution of loads in the structures is influenced by bending moments due to axial loads in one or more members.
Exploratory reconstructability analysis of accident TBI data
NASA Astrophysics Data System (ADS)
Zwick, Martin; Carney, Nancy; Nettleton, Rosemary
2018-02-01
This paper describes the use of reconstructability analysis to perform a secondary study of traumatic brain injury data from automobile accidents. Neutral searches were done and their results displayed with a hypergraph. Directed searches, using both variable-based and state-based models, were applied to predict performance on two cognitive tests and one neurological test. Very simple state-based models gave large uncertainty reductions for all three DVs and sizeable improvements in percent correct for the two cognitive test DVs which were equally sampled. Conditional probability distributions for these models are easily visualized with simple decision trees. Confounding variables and counter-intuitive findings are also reported.
Influence of dental occlusion on postural control and plantar pressure distribution.
Scharnweber, Benjamin; Adjami, Frederic; Schuster, Gabriele; Kopp, Stefan; Natrup, Jörg; Erbe, Christina; Ohlendorf, Daniela
2017-11-01
The number of studies investigating correlations between the temporomandibular system and body posture, postural control or plantar pressure distribution is continuously increasing. If a connection can be found, it is often of minor influence or limited to a single parameter, and small subject groups are a critical limitation. This study was conducted to define correlations between dental parameters, postural control and plantar pressure distribution in healthy males. In this study, 87 male subjects with an average age of 25.23 ± 3.5 years (ranging from 18 to 35 years) were examined. Dental casts of the subjects were analyzed. Postural control and plantar pressure distribution were recorded by a force platform. Possible orthodontic and orthopedic factors of influence were determined by either an anamnesis or a questionnaire. All tests performed were randomized and repeated three times each for the intercuspal position (ICP) and blocked occlusion (BO). For the statistical analysis of the results, non-parametric tests (Wilcoxon matched-pairs test, Kruskal-Wallis test) were used. A correction of the results via the Bonferroni-Holm procedure was considered. ICP increases body sway in the frontal (p ≤ 0.01) and sagittal planes (p ≤ 0.03) compared to BO, whereas all other 29 correlations were independent of the occlusion position. For both ICP and BO, Angle class, midline displacement, crossbite, and orthodontic therapy were found to have no influence on postural control or plantar pressure distribution (p > 0.05). However, the contact time of the left foot decreased (p ≤ 0.001) while recording the plantar pressure distribution in each position. Persistent dental parameters have no effect on postural sway. In addition, postural control and plantar pressure distribution have been found to be independent postural criteria.
COBRA ATD minefield detection model initial performance analysis
NASA Astrophysics Data System (ADS)
Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.
2000-08-01
A statistical performance analysis of the USMC Coastal Battlefield Reconnaissance and Analysis (COBRA) Minefield Detection (MFD) Model has been performed in support of the COBRA ATD Program under execution by the Naval Surface Warfare Center/Dahlgren Division/Coastal Systems Station. This analysis uses the Veridian ERIM International MFD model from the COBRA Sensor Performance Evaluation and Computational Tools for Research Analysis modeling toolbox and a collection of multispectral mine detection algorithm response distributions for mines and minelike clutter objects. These mine detection response distributions were generated from actual COBRA ATD test missions over littoral zone minefields. This analysis serves to validate both the utility and effectiveness of the COBRA MFD Model as a predictive MFD performance tool. COBRA ATD minefield detection model algorithm performance results based on a simulated baseline minefield detection scenario are presented, as well as results of an MFD model algorithm parametric sensitivity study.
R/S analysis of reaction time in Neuron Type Test for human activity in civil aviation
NASA Astrophysics Data System (ADS)
Zhang, Hong-Yan; Kang, Ming-Cui; Li, Jing-Qiang; Liu, Hai-Tao
2017-03-01
Human factors have become the most serious problem leading to accidents in civil aviation, which motivates the design and analysis of the Neuron Type Test (NTT) system to explore the intrinsic properties and patterns behind the behaviors of professionals and students in civil aviation. In the experiment, normal practitioners' reaction time sequences collected from the NTT approximately exhibit a log-normal distribution. We apply the χ² test to compute the goodness-of-fit after transforming the time sequences with the Box-Cox transformation, in order to cluster practitioners. The long-term correlation of each individual practitioner's time sequence is represented by the Hurst exponent obtained via Rescaled Range analysis, also known as Range/Standard deviation (R/S) analysis. The differing Hurst exponents suggest the existence of different collective behaviors and different intrinsic patterns of human factors in civil aviation.
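A generic R/S sketch of the Hurst-exponent estimate mentioned above: log(R/S) is regressed on log(window size) over a set of window lengths. The synthetic reaction-time series is a placeholder for NTT data; for an uncorrelated series H should come out near 0.5 (up to small-sample bias).

```python
# Illustrative sketch (not the authors' code) of rescaled-range (R/S) analysis.
import numpy as np

def rs_hurst(series, window_sizes):
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_values = []
        for start in range(0, len(series) - n + 1, n):
            w = series[start:start + n]
            z = np.cumsum(w - w.mean())          # cumulative deviation from the window mean
            r = z.max() - z.min()                # range
            s = w.std(ddof=1)                    # standard deviation
            if s > 0:
                rs_values.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_values)))
    return np.polyfit(log_n, log_rs, 1)[0]       # slope = Hurst exponent estimate

rng = np.random.default_rng(7)
reaction_times = rng.lognormal(mean=-0.5, sigma=0.2, size=2048)   # toy NTT-like sequence
H = rs_hurst(reaction_times, window_sizes=[16, 32, 64, 128, 256])
print(f"Hurst exponent H = {H:.2f}")
```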
TASK ALLOCATION IN GEO-DISTRIBUTED CYBER-PHYSICAL SYSTEMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aggarwal, Rachit; Smidts, Carol
This paper studies the task allocation algorithm for a distributed test facility (DTF), which aims to assemble geo-distributed cyber (software) and physical (hardware-in-the-loop) components into a prototype cyber-physical system (CPS). This allows low-cost testing on an early conceptual prototype (ECP) of the ultimate CPS (UCPS) to be developed. The DTF provides an instrumentation interface for carrying out reliability experiments remotely, such as fault propagation analysis and in-situ testing of hardware and software components in a simulated environment. Unfortunately, the geo-distribution introduces an overhead that is not inherent to the UCPS, i.e., a significant time delay in communication that threatens the stability of the ECP and is not an appropriate representation of the behavior of the UCPS. This can be mitigated by implementing a task allocation algorithm to find a suitable configuration and assign the software components to appropriate computational locations dynamically. This would allow the ECP to operate more efficiently with less probability of becoming unstable due to the delays introduced by geo-distribution. The task allocation algorithm proposed in this work uses a Monte Carlo approach along with dynamic programming to identify the optimal network configuration to keep the time delays to a minimum.
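A simplified stand-in for the allocation search described above (not the DTF algorithm itself, which also uses dynamic programming): candidate assignments of software components to geo-distributed nodes are sampled at random and scored by total communication delay over an assumed component-interaction and latency matrix.

```python
# Hedged sketch of a Monte Carlo search for a task-allocation problem.
import numpy as np

rng = np.random.default_rng(8)
n_components, n_nodes = 8, 3
traffic = np.triu(rng.uniform(0, 1, (n_components, n_components)), 1)   # messages between components
delay = np.array([[0, 40, 90],                                          # assumed inter-node latency (ms)
                  [40, 0, 60],
                  [90, 60, 0]])

def cost(assignment):
    """Total communication delay implied by an assignment of components to nodes."""
    return sum(traffic[i, j] * delay[assignment[i], assignment[j]]
               for i in range(n_components) for j in range(i + 1, n_components))

best, best_cost = None, np.inf
for _ in range(20_000):                                                  # Monte Carlo sampling
    candidate = rng.integers(0, n_nodes, size=n_components)
    c = cost(candidate)
    if c < best_cost:
        best, best_cost = candidate, c
print(f"best assignment {best.tolist()} with delay cost {best_cost:.1f}")
```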
Collevatti, Rosane Garcia; de Castro, Thaís Guimarães; de Souza Lima, Jacqueline; de Campos Telles, Mariana Pires
2012-01-01
Many endemic species present disjunct geographical distribution; therefore, they are suitable models to test hypotheses about the ecological and evolutionary mechanisms involved in the origin of disjunct distributions in these habitats. We studied the genetic structure and phylogeography of Tibouchina papyrus (Melastomataceae), endemic to rocky savannas in Central Brazil, to test hypothesis of vicariance and dispersal in the origin of the disjunct geographical distribution. We sampled 474 individuals from the three localities where the species is reported: Serra dos Pirineus, Serra Dourada, and Serra de Natividade. Analyses were based on the polymorphisms at cpDNA and on nuclear microsatellite loci. To test for vicariance and dispersal we constructed a median-joining network and performed an analysis of molecular variance (AMOVA). We also tested population bottleneck and estimated demographic parameters and time to most recent common ancestor (TMRCA) using coalescent analyses. A remarkable differentiation among populations was found. No significant effect of population expansion was detected and coalescent analyses showed a negligible gene flow among populations and an ancient coalescence time for chloroplast genome. Our results support that the disjunct distribution of T. papyrus may represent a climatic relict. With an estimated TMRCA dated from ∼836.491 ± 107.515 kyr BP (before present), we hypothesized that the disjunct distribution may be the outcome of bidirectional expansion of the geographical distribution favored by the drier and colder conditions that prevailed in much of Brazil during the Pre-Illinoian glaciation, followed by the retraction as the climate became warmer and moister. PMID:22837846
NASA Astrophysics Data System (ADS)
Le Pichon, C.; Belliard, J.; Talès, E.; Gorges, G.; Clément, F.
2009-12-01
Most of the rivers of the Ile de France region, intimately linked with the megalopolis of Paris, are severely altered, and freshwater fishes are exposed to habitat alteration, reduced connectivity and pollution. Several species thus present fragmented distributions and decreasing densities. In this context, the European Water Framework Directive (2000) has goals of hydrosystem rehabilitation and no further damage. In particular, the preservation and restoration of ecological connectivity of river networks is a key element for fish populations. These goals require the identification of natural and anthropogenic factors which influence the spatial distribution of species. We have proposed a riverscape approach, based on landscape ecology concepts, combined with a set of spatial analysis methods to assess the multiscale relationships between the spatial pattern of fish habitats and processes depending on fish movements. In particular, we used this approach to test the relative roles of the spatial arrangement of fish habitats and the presence of physical barriers in explaining fish spatial distributions in a small rural watershed (106 km2). We performed a spatially continuous analysis of fish-habitat relationships. Fish habitats and physical barriers were mapped along the river network (33 km) with a GPS and imported into a GIS. In parallel, a longitudinal electrofishing survey of the distribution and abundance of fishes was made using a point abundance sampling scheme. The longitudinal arrangement of fish habitats was evaluated using spatial analysis methods: patch/distance metrics and moving window analysis. Explanatory models were developed to test the relative contribution of local environmental variables and spatial context in explaining fish presence. We recorded about 100 physical barriers, on average one every 330 meters; most artificial barriers were road pipe culverts, falls associated with ponds, and sluice gates. Contrasted fish communities and densities were observed in the different areas of the watershed, related to various land uses (riparian forest or agriculture). The first results of the fish-habitat association analysis on a 5 km stream show that the longitudinal distribution of fish species was mainly impacted by falls associated with ponds. The impact was due both to the barrier effect and to the modification of aquatic habitats. The abundance distributions of Salmo trutta and Cottus gobio were particularly affected. Spatially continuous analysis of fish-habitat relationships allowed us to identify the relative impacts of habitat alteration and the presence of physical barriers to fish movements. These techniques could help prioritize preservation and restoration policies in human-impacted watersheds, in particular by identifying the key physical barriers to remove.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pradeep Rohatgi
2002-12-31
In this research, the effects of casting foundry, testing laboratory, surface conditions, and casting processes on the mechanical properties of A359-SiC composites were identified. To observe the effects, A359-SiC composites with 20 and 30% SiC particles were cast at three different foundries and tested at three different laboratories. The composites were cast in sand and permanent molds and tested in as-cast and machined conditions. To identify the effect of the volume fraction and distribution of particles on the properties of the composites, particle distribution was determined using Clemex Image analysis systems, and particle volume fraction was determined using wet chemical analysis and Clemex Image analysis systems. The microstructure and fractured surfaces of the samples were analyzed using SEM, and EDX analysis was done to analyze chemical reaction between the particles and the matrix. The results of the tensile tests exhibited that the tensile strengths depend on the density and porosity of the composites; in general, higher tensile strength is associated with lower porosity and higher density. In some cases, composites with lower density had higher tensile strength than those with higher density. In the Al-20% SiC samples, the composites with more inclusions exhibited a lower tensile strength than the ones with fewer inclusions. This suggests that macroscopic casting defects such as micro-porosity, shrinkage porosity and inclusions appear to influence the tensile strength more strongly than the microstructure and particle distribution. The fatigue properties of A359/20 vol.% SiC composites were investigated under strain-controlled conditions. Hysteresis loops obtained from strain-controlled cyclic loading of the 20% SiCp reinforced material did not exhibit any measurable softening or hardening. The fatigue life of the Al-20% SiC heat-treated alloy at a given total strain showed wide variation, which appeared to be related to factors such as inclusions, porosity, and particle distribution. The inclusions and porosity on the fracture surfaces seem to have a more significant influence on the fatigue life of cast Al-20% SiC than other variables, including SiC particle volume percentage and its distribution. Striations were generally not visible on the fracture surface of the composites. In many specimens, SiC particle fracture was also observed. Fracture was more severe around pores and inclusions than in the matrix away from them. Inclusions and porosity seem to have a much stronger influence on fatigue behavior than the particle distribution. The analysis suggests that the enhancement of fatigue behavior of cast MMCs requires a decrease in the size of defects, porosity, and inclusions. The particle volume fraction determined using wet chemical analysis gives values of SiC vol.% which are closer to the nominal SiC % than the values of SiC % obtained by ultrasonic and Clemex Image Analysis systems. In view of ALCAN's recommendation, one must use wet chemical analysis for determining the volume percent SiC.
Application of Statistically Derived CPAS Parachute Parameters
NASA Technical Reports Server (NTRS)
Romero, Leah M.; Ray, Eric S.
2013-01-01
The Capsule Parachute Assembly System (CPAS) Analysis Team is responsible for determining parachute inflation parameters and dispersions that are ultimately used in verifying system requirements. A model memo is internally released semi-annually documenting parachute inflation and other key parameters reconstructed from flight test data. Dispersion probability distributions published in previous versions of the model memo were uniform because insufficient data were available for determination of statistically based distributions. Uniform distributions do not accurately represent the expected distributions, since extreme parameter values are just as likely to occur as the nominal value. CPAS has taken incremental steps to move away from uniform distributions. Model Memo version 9 (MMv9) made the first use of non-uniform dispersions, but only for the reefing cutter timing, for which a large number of samples was available. In order to maximize the utility of the available flight test data, clusters of parachutes were reconstructed individually starting with Model Memo version 10. This allowed for statistical assessment of steady-state drag area (CDS) and parachute inflation parameters such as the canopy fill distance (n), profile shape exponent (expopen), over-inflation factor (C(sub k)), and ramp-down time (t(sub k)) distributions. Built-in MATLAB distributions were applied to the histograms, and parameters such as scale (sigma) and location (mu) were output. Engineering judgment was used to determine the "best fit" distribution based on the test data. Results include normal, log-normal, and uniform (where available data remain insufficient) fits of nominal and failure (loss of parachute and skipped stage) cases for all CPAS parachutes. This paper discusses the uniform methodology that was previously used, the process and results of the statistical assessment, how the dispersions were incorporated into Monte Carlo analyses, and the application of the distributions in trajectory benchmark testing assessments with parachute inflation parameters, drag area, and reefing cutter timing used by CPAS.
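The workflow of fitting built-in distributions to reconstructed parameters and selecting a "best fit" can be illustrated outside MATLAB. The sketch below, assuming hypothetical reconstructed inflation-parameter samples, compares normal and log-normal maximum-likelihood fits in Python/SciPy; it is illustrative only and not the CPAS tooling.

    import numpy as np
    from scipy import stats

    # Hypothetical reconstructed values of one inflation parameter (e.g., fill distance n).
    rng = np.random.default_rng(1)
    samples = rng.lognormal(mean=1.0, sigma=0.25, size=40)

    candidates = {
        "normal":    stats.norm,
        "lognormal": stats.lognorm,
    }

    for name, dist in candidates.items():
        params = dist.fit(samples)                        # maximum-likelihood fit
        loglike = np.sum(dist.logpdf(samples, *params))
        ks_stat, ks_p = stats.kstest(samples, dist.cdf, args=params)
        print(f"{name:10s} log-likelihood={loglike:8.2f}  KS p-value={ks_p:.3f}")

    # Engineering judgment (as in the paper) would weigh these diagnostics together
    # with physical plausibility before selecting the dispersion model.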
Uncertainty Analysis of Power Grid Investment Capacity Based on Monte Carlo
NASA Astrophysics Data System (ADS)
Qin, Junsong; Liu, Bingyi; Niu, Dongxiao
By analyzing the factors that influence the investment capacity of the power grid, an investment capacity analysis model is built with depreciation cost, sales price and sales quantity, net profit, financing, and GDP of the secondary industry as the variables. After carrying out Kolmogorov-Smirnov tests, the probability distribution of each influence factor is obtained. Finally, the uncertainty analysis results for grid investment capacity are obtained by Monte Carlo simulation.
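A hedged sketch of the described two-step procedure (goodness of fit via Kolmogorov-Smirnov, then Monte Carlo propagation) is given below. The influence factors, their assumed normality, and the linear capacity relation are invented for illustration; the paper's actual model is not reproduced here.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)

    # Hypothetical historical observations of two influence factors.
    factors = {
        "sales_quantity": rng.normal(100.0, 10.0, size=30),
        "net_profit":     rng.normal(20.0, 4.0, size=30),
    }

    fitted = {}
    for name, obs in factors.items():
        mu, sigma = stats.norm.fit(obs)
        _, p = stats.kstest(obs, "norm", args=(mu, sigma))   # goodness-of-fit check
        print(f"{name}: mu={mu:.1f} sigma={sigma:.1f} (KS p={p:.2f})")
        fitted[name] = (mu, sigma)

    # Monte Carlo propagation through an illustrative (made-up) capacity relation.
    n_sim = 100_000
    draws = {k: rng.normal(mu, sd, n_sim) for k, (mu, sd) in fitted.items()}
    capacity = 0.5 * draws["sales_quantity"] + 2.0 * draws["net_profit"]
    print("capacity: mean=%.1f, 5th-95th pct=(%.1f, %.1f)"
          % (capacity.mean(), *np.percentile(capacity, [5, 95])))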
Galati, Luisa; Peronace, Cinzia; Fiorillo, Maria Teresa; Masciari, Rosanna; Giraldi, Cristina; Nisticò, Salvatore; Minchella, Pasquale; Maiolo, Vincenzo; Barreca, Giorgio Settimo; Marascio, Nadia; Lamberti, Angelo Giuseppe; Giancotti, Aida; Lepore, Maria Gabriella; Greco, Francesca; Mauro, Maria Vittoria; Borelli, Annelisa; Bocchiaro, Giuseppa Lo; Surace, Giovanni; Liberto, Maria Carla; Focà, Alfredo
2017-01-01
Although analysis of the Human papillomavirus (HPV) genotype spread in a particular area has a crucial impact on public health and prevention programmes, there is a lack of epidemiological data regarding HPV in the Calabria region of Italy. We therefore update information on HPV age/genotype distribution by retrospectively analysing a cohort of women, with and without cervical lesions, living in Calabria, who underwent HPV DNA testing; moreover, we also evaluated HPV age/genotype distribution in a subset of patients with cervical lesions. Cervical scrape specimens obtained from 9590 women (age range 20-75 years) from January 2010 to December 2015 were tested for HPV DNA. Viral types were genotyped by the Linear Array HPV Genotyping® test (Roche, USA) at the Clinical Microbiology Operative Unit of six hospitals located in four provinces of the Calabria region. Cervical scrape specimens were also used to perform Pap smears for cytological analysis in a subset of 405 women; cytological classification of the samples was performed according to the Bethesda classification system. A total of 2974 women (31%) (C.I. 95% 30.09-31.94) were found to be HPV DNA positive for at least one (57.3%) or several (42.7%) HPV genotypes. Of single-genotype HPV infections, 46.5% and 36.4% were classed as high-risk (HR, Group 1) and low-risk (LR, Group 3) respectively, while 16.9% were classed as probably/possibly carcinogenic and 0.2% as undetermined risk. Stratified by age, total HPV distribution showed the highest prevalence within the range 30-39 years (37.2%), while single-genotype infection distribution displayed a peak in women in the age range 20-29 years (37.5%). The most common high-risk HPV type was HPV 16 (19.1%), followed by HPV 31 (9.1%). We provide epidemiological data on HPV age/genotype distribution in women living in the Calabria region with or without cytological abnormalities, to further enhance HPV screening/prevention programmes for the local population.
Fok, Jonathan; Toogood, Roger W; Badawi, Hisham; Carey, Jason P; Major, Paul W
2011-11-01
To better understand the mechanics of bracket/archwire interaction through analysis of force and couple distribution along the maxillary arch using elastic ligation and to compare these results with passive ligation. An orthodontic simulator was used to study a high canine malocclusion. Force and couple distributions produced by elastic ligation and round wire were measured. Forces and couples were referenced to the center of resistance of each tooth. Tests were repeated for 12 bracket sets with 12 wires per set. Data were compared with those derived from similar tests for passive ligation. Propagation of the force/couple systems around the arch using elastic ligation was extensive. Elastic ligation produced significantly more resistance to sliding, contributing to higher forces and couples at the center of resistance than were observed for passive ligation. The results of this study suggest some potential mechanical advantages of passive over elastic ligation. In particular, limited propagation around the arch in passive ligation reduces the occurrence of unwanted force/couple systems compared with elastic ligation. These advantages may not transfer to a clinical setting because of the conditions of the tests; additional testing would be required to determine whether these advantages can be generalized.
Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.
Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R
2012-08-01
Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time (AFT) model with log-normal, log-logistic and Weibull distributions, were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing data increases. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.
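The core idea of treating intensities below the detection limit as left-censored observations can be shown with a small maximum-likelihood sketch. The code below fits a left-censored normal model to hypothetical log peak intensities in Python/SciPy; it is a toy version of the likelihood underlying a log-normal AFT fit, not the authors' R code, and all data are simulated.

    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(3)

    # Hypothetical log peak intensities; values below the detection limit are left-censored.
    true_log = rng.normal(loc=5.0, scale=1.0, size=200)
    limit = 4.0
    observed = np.maximum(true_log, limit)      # censored values reported at the limit
    censored = true_log < limit                 # indicator of left-censoring

    def neg_loglik(params):
        mu, log_sigma = params
        sigma = np.exp(log_sigma)
        ll_obs = stats.norm.logpdf(observed[~censored], mu, sigma).sum()
        ll_cens = stats.norm.logcdf(limit, mu, sigma) * censored.sum()
        return -(ll_obs + ll_cens)

    res = optimize.minimize(neg_loglik, x0=[observed.mean(), 0.0], method="Nelder-Mead")
    mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
    print(f"mu = {mu_hat:.2f}, sigma = {sigma_hat:.2f}")   # compare with the true (5.0, 1.0)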
A DMAP Program for the Selection of Accelerometer Locations in MSC/NASTRAN
NASA Technical Reports Server (NTRS)
Peck, Jeff; Torres, Isaias
2004-01-01
A new program for selecting sensor locations has been written in the DMAP (Direct Matrix Abstraction Program) language of MSC/NASTRAN. The program implements the method of Effective Independence for selecting sensor locations, and is executed within a single NASTRAN analysis as a "rigid format alter" to the normal modes solution sequence (SOL 103). The user of the program is able to choose among various analysis options using Case Control and Bulk Data entries. Algorithms tailored for the placement of both uni-axial and tri-axial accelerometers are available, as well as several options for including the model's mass distribution in the calculations. Target modes for the Effective Independence analysis are selected from the MSC/NASTRAN ASET modes calculated by the "SOL 103" solution sequence. The initial candidate sensor set is also under user control, and is selected from the ASET degrees of freedom. Analysis results are printed to the MSC/NASTRAN output file (*.f06), and may include the current candidate sensor set, and its associated Effective Independence distribution, at user-specified iteration intervals. At the conclusion of the analysis, the model is reduced to the final sensor set, and frequencies and orthogonality checks are printed. Example results are given for a pre-test analysis of NASA's five-segment solid rocket booster modal test.
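The Effective Independence method itself is compact enough to sketch. Assuming the target mode-shape matrix partitioned to the candidate degrees of freedom is available, the NumPy code below iteratively deletes the candidate DOF with the smallest Effective Independence value; the random mode shapes are placeholders, and this is not the DMAP implementation described above.

    import numpy as np

    def effective_independence(phi, n_sensors):
        """Iteratively remove candidate DOFs with the smallest Effective Independence value.

        phi : (n_dof, n_modes) target mode shapes partitioned to the candidate set.
        Returns the indices of the retained sensor DOFs.
        """
        keep = np.arange(phi.shape[0])
        while len(keep) > n_sensors:
            a = phi[keep]
            # EfI distribution: diagonal of A (A^T A)^-1 A^T, one value per candidate DOF.
            efi = np.einsum("ij,ji->i", a, np.linalg.solve(a.T @ a, a.T))
            keep = np.delete(keep, np.argmin(efi))   # drop the least independent DOF
        return keep

    rng = np.random.default_rng(4)
    phi = rng.standard_normal((50, 6))   # hypothetical 50 candidate DOFs, 6 target modes
    sensors = effective_independence(phi, n_sensors=12)
    print("Selected DOFs:", sensors)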
The Statistical Value of Raw Fluorescence Signal in Luminex xMAP Based Multiplex Immunoassays
Breen, Edmond J.; Tan, Woei; Khan, Alamgir
2016-01-01
Tissue samples (plasma, saliva, serum or urine) from 169 patients classified as either normal or having one of seven possible diseases are analysed across three 96-well plates for the presence of 37 analytes using cytokine inflammation multiplexed immunoassay panels. Censoring of concentration data caused problems for analysis of the low-abundance analytes. Using fluorescence-based analysis instead of concentration-based analysis allowed these low-abundance analytes to be analysed. Mixed-effects analysis of the resulting fluorescence and concentration responses reveals that the combination of censoring and mapping the fluorescence responses to concentration values, through a 5PL curve, changed observed analyte concentrations. Simulation verifies this by showing a dependence of the observed analyte concentration levels on the mean fluorescence response and its distribution. Differences from normality in the fluorescence responses can lead to differences in concentration estimates and unreliable probabilities for treatment effects. It is seen that when fluorescence responses are normally distributed, probabilities of treatment effects from fluorescence-based t-tests have greater statistical power than the same probabilities from concentration-based t-tests. We add evidence that the fluorescence response, unlike concentration values, does not require censoring, and we show with respect to differential analysis of the fluorescence responses that background correction is not required. PMID:27243383
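For context, the 5PL (five-parameter logistic) curve that maps concentration to expected fluorescence has the closed form F(x) = d + (a - d) / (1 + (x/c)^b)^g. The sketch below fits it with SciPy's curve_fit on made-up standard-curve data; the parameter values and variable names are assumptions, not the assay's calibration.

    import numpy as np
    from scipy.optimize import curve_fit

    def five_pl(x, a, d, c, b, g):
        """5PL curve: a = lower asymptote, d = upper asymptote, c = inflection,
        b = slope, g = asymmetry. Maps concentration x to expected fluorescence."""
        return d + (a - d) / (1.0 + (x / c) ** b) ** g

    # Hypothetical standard-curve data (known concentrations vs measured MFI).
    conc = np.array([0.1, 0.5, 1, 5, 10, 50, 100, 500, 1000], dtype=float)
    rng = np.random.default_rng(5)
    mfi = five_pl(conc, 30, 20000, 80, 1.2, 0.9) * rng.normal(1.0, 0.03, conc.size)

    popt, _ = curve_fit(five_pl, conc, mfi, p0=[30, 20000, 50, 1.0, 1.0], maxfev=20000)
    print("Fitted 5PL parameters:", np.round(popt, 2))

    # Back-calculating concentrations inverts this curve; censoring arises when MFI falls
    # outside the asymptotes, which is why the raw fluorescence can be more informative.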
Scientific fraud in 20 falsified anesthesia papers : detection using financial auditing methods.
Hein, J; Zobrist, R; Konrad, C; Schuepfer, G
2012-06-01
Data from natural sources show counter-intuitive distribution patterns for the leading digits to the left of the decimal point: the digit 1 is observed more frequently than all other numbers. This pattern, which was first described by Newcomb and later confirmed by Benford, is used in financial and tax auditing to detect fraud. Deviations from the pattern indicate possible falsifications. Anesthesiology journals are affected not only by ghostwriting and plagiarism but also by counterfeiting. In the present study, 20 publications in anesthesiology known to be falsified by an author were investigated for irregularities with respect to Benford's law using the χ(2)-test and the Z-test. In the 20 retracted publications, an average first-digit frequency of 243.1 (standard deviation SD ± 118.2, range: 30-592) and an average second-digit frequency of 132.3 (SD ± 72.2, range: 15-383) were found. The observed distribution of the first and second digits to the left of the decimal point differed significantly (p < 0.01) from the expected distribution described by Benford. Only the observed absolute frequencies for digits 3, 4 and 5 did not differ significantly from the expected values. In an analysis of each paper, 17 out of 20 studies differed significantly from the expected value for the first digit and 18 out of 20 studies varied significantly from the expected value of the second digit. Only one paper did not vary significantly from the expected values for the digits to the left of the decimal. For comparison, a meta-analysis using complex mathematical procedures was chosen as a control. The analysis showed a first-digit distribution consistent with the Benford distribution. Thus, the method used in the present study seems to be sensitive for detecting fraud. Additional statements of specificity cannot yet be made, as this requires further analysis of data that is definitely not falsified. Future studies exploring conformity might help prevent falsified studies from being published.
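The first-digit screen used here is straightforward to reproduce in outline. The sketch below tallies leading digits, compares them with Benford's expected frequencies log10(1 + 1/d), and applies a χ2 test; the data are synthetic, and the paper's full procedure (including the Z-tests and the second-digit analysis) is not reproduced.

    import numpy as np
    from scipy import stats

    def first_digits(values):
        """First digit to the left of the decimal point, for values >= 1."""
        return np.array([int(str(int(abs(v)))[0]) for v in values if abs(v) >= 1])

    benford_p = np.log10(1 + 1 / np.arange(1, 10))       # expected Benford frequencies

    rng = np.random.default_rng(6)
    data = rng.lognormal(mean=3.0, sigma=2.0, size=500)  # hypothetical reported values
    digits = first_digits(data)
    observed = np.bincount(digits, minlength=10)[1:10]
    expected = benford_p * observed.sum()

    chi2, p = stats.chisquare(observed, expected)
    print(f"chi2 = {chi2:.1f}, p = {p:.3f}")   # a small p would flag deviation from Benford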
Henríquez-Henríquez, Marcela Patricia; Billeke, Pablo; Henríquez, Hugo; Zamorano, Francisco Javier; Rothhammer, Francisco; Aboitiz, Francisco
2014-01-01
Intra-individual variability of response times (RTisv) is considered a potential endophenotype for attention deficit/hyperactivity disorder (ADHD). Traditional methods for estimating RTisv lose information regarding the distribution of response times (RTs) along the task, with potential effects on statistical power. Ex-Gaussian analysis captures the dynamic nature of RTisv, estimating normal and exponential components of the RT distribution, with specific phenomenological correlates. Here, we applied ex-Gaussian analysis to explore whether intra-individual variability of RTs agrees with the criteria proposed by Gottesman and Gould for endophenotypes. Specifically, we evaluated whether normal and/or exponential components of RTs may (a) present the stair-like distribution expected for endophenotypes (ADHD > siblings > typically developing children (TD) without familial history of ADHD) and (b) represent a phenotypic correlate for previously described genetic risk variants. This is a pilot study including 55 subjects (20 ADHD-discordant sibling pairs and 15 TD children), all aged between 8 and 13 years. Participants performed a visual Go/Nogo task with 10% Nogo probability. Ex-Gaussian distributions were fitted to individual RT data and compared among the three samples. In order to test whether intra-individual variability may represent a correlate of previously described genetic risk variants, VNTRs at DRD4 and SLC6A3 were identified in all sibling pairs following standard protocols. Groups were compared by adjusting independent general linear models for the exponential and normal components from the ex-Gaussian analysis. Identified trends were confirmed by the non-parametric Jonckheere-Terpstra test. Stair-like distributions were observed for μ (p = 0.036) and σ (p = 0.009). An additional "DRD4-genotype" × "clinical status" interaction was present for τ (p = 0.014), reflecting a possible severity factor. Thus, normal and exponential RTisv components are suitable as ADHD endophenotypes.
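An ex-Gaussian fit can be obtained directly from SciPy, whose exponnorm distribution is parameterized by a shape K = τ/σ together with loc and scale. The sketch below fits simulated reaction times and converts the result to the (μ, σ, τ) parameters discussed above; the data and numbers are illustrative only, not the study's RT records.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    # Hypothetical reaction times (seconds): Gaussian component plus exponential tail.
    rt = rng.normal(0.45, 0.05, 400) + rng.exponential(0.12, 400)

    K, loc, scale = stats.exponnorm.fit(rt)
    mu, sigma, tau = loc, scale, K * scale     # ex-Gaussian parameters mu, sigma, tau
    print(f"mu = {mu:.3f}  sigma = {sigma:.3f}  tau = {tau:.3f}")

    # mu and sigma describe the normal component; tau captures the exponential tail,
    # the component often linked to attentional lapses in the ADHD literature.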
Aerodynamic characteristics of the National Launch System (NLS) 1 1/2 stage launch vehicle
NASA Technical Reports Server (NTRS)
Springer, A. M.; Pokora, D. C.
1994-01-01
The National Aeronautics and Space Administration (NASA) is studying ways of assuring more reliable and cost effective means to space. One launch system studied was the NLS, which included the 1 1/2 stage vehicle. This document encompasses the aerodynamic characteristics of the 1 1/2 stage vehicle. To support the detailed configuration definition, two wind tunnel tests were conducted in the NASA Marshall Space Flight Center's 14x14-Inch Trisonic Wind Tunnel during 1992. The tests were a static stability and a pressure test, each utilizing 0.004 scale models. The static stability test resulted in the forces and moments acting on the vehicle. The aerodynamics for the reference configuration with and without feedlines and an evaluation of three proposed engine shroud configurations were also determined. The pressure test resulted in pressure distributions over the reference vehicle with and without feedlines including the reference engine shrouds. These pressure distributions were integrated and balanced to the static stability coefficients resulting in distributed aerodynamic loads on the vehicle. The wind tunnel tests covered a Mach range of 0.60 to 4.96. These ascent flight aerodynamic characteristics provide the basis for trajectory and performance analysis, loads determination, and guidance and control evaluation.
Analysis of critical operating conditions for LV distribution networks with microgrids
NASA Astrophysics Data System (ADS)
Zehir, M. A.; Batman, A.; Sonmez, M. A.; Font, A.; Tsiamitros, D.; Stimoniaris, D.; Kollatou, T.; Bagriyanik, M.; Ozdemir, A.; Dialynas, E.
2016-11-01
An increase in the penetration of Distributed Generation (DG) in distribution networks raises the risk of voltage limit violations while contributing to line losses. Especially in low voltage (LV) distribution networks (secondary distribution networks), the impacts of active power flows on the bus voltages and on the network losses are more dominant. As network operators must meet regulatory limitations, they have to take into account the most critical operating conditions in their systems. In this study, we aim to present the impact of the worst-case operating conditions of LV distribution networks comprising microgrids. Simulation studies are performed on a field-data-based virtual test-bed. The simulations are repeated for several cases consisting of different microgrid points of connection with different network loading and microgrid supply/demand conditions.
Loophole-free Bell test using electron spins in diamond: second experiment and additional analysis
Hensen, B.; Kalb, N.; Blok, M. S.; Dréau, A. E.; Reiserer, A.; Vermeulen, R. F. L.; Schouten, R. N.; Markham, M.; Twitchen, D. J.; Goodenough, K.; Elkouss, D.; Wehner, S.; Taminiau, T. H.; Hanson, R.
2016-01-01
The recently reported violation of a Bell inequality using entangled electronic spins in diamonds (Hensen et al., Nature 526, 682–686) provided the first loophole-free evidence against local-realist theories of nature. Here we report on data from a second Bell experiment using the same experimental setup with minor modifications. We find a violation of the CHSH-Bell inequality of 2.35 ± 0.18, in agreement with the first run, yielding an overall value of S = 2.38 ± 0.14. We calculate the resulting P-values of the second experiment and of the combined Bell tests. We provide an additional analysis of the distribution of settings choices recorded during the two tests, finding that the observed distributions are consistent with uniform settings for both tests. Finally, we analytically study the effect of particular models of random number generator (RNG) imperfection on our hypothesis test. We find that the winning probability per trial in the CHSH game can be bounded knowing only the mean of the RNG bias. This implies that our experimental result is robust for any model underlying the estimated average RNG bias, for random bits produced up to 690 ns too early by the random number generator. PMID:27509823
GPS FOM Chimney Analysis using Generalized Extreme Value Distribution
NASA Technical Reports Server (NTRS)
Ott, Rick; Frisbee, Joe; Saha, Kanan
2004-01-01
Often an objective of a statistical analysis is to estimate a limit value, such as a 3-sigma 95% confidence upper limit, from a data sample. The Generalized Extreme Value distribution method can be profitably employed in many situations for such an estimate. It is well known that, according to the Central Limit theorem, the mean value of a large data set is normally distributed irrespective of the distribution of the data from which the mean value is derived. In a somewhat similar fashion, it is observed that many times the extreme value of a data set has a distribution that can be formulated with a Generalized Extreme Value distribution. In space shuttle entry with 3-string GPS navigation, the Figure Of Merit (FOM) value gives a measure of GPS navigated state accuracy. A GPS navigated state with FOM of 6 or higher is deemed unacceptable and is said to form a FOM 6 or higher chimney. A FOM chimney is a period of time during which the FOM value stays higher than 5. A longer period of FOM of value 6 or higher causes the navigated state to accumulate more error for lack of state updates. For an acceptable landing it is imperative that the state error remains low, and hence at low altitude during entry GPS data of FOM greater than 5 must not last more than 138 seconds. To test the GPS performance, many entry test cases were simulated at the Avionics Development Laboratory. Only high-value FOM chimneys are consequential. The extreme value statistical technique is applied to analyze high-value FOM chimneys. The maximum likelihood method is used to determine parameters that characterize the GEV distribution, and then the limit value statistics are estimated.
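A minimal illustration of this approach, fitting a GEV distribution by maximum likelihood to per-run extremes and reading off a high quantile, is given below using SciPy. The synthetic "chimney durations" and the 95th-percentile choice are assumptions for illustration, not the study's data or acceptance criteria.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)
    # Hypothetical per-run extremes, e.g., the longest FOM > 5 chimney (seconds) in each run.
    chimney_max = rng.gumbel(loc=60.0, scale=15.0, size=200)

    shape, loc, scale = stats.genextreme.fit(chimney_max)   # maximum-likelihood GEV fit
    limit_95 = stats.genextreme.ppf(0.95, shape, loc=loc, scale=scale)
    print(f"GEV fit: shape={shape:.2f} loc={loc:.1f} scale={scale:.1f}")
    print(f"Estimated 95th-percentile chimney duration: {limit_95:.1f} s (requirement: 138 s)")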
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sig Drellack, Lance Prothro
2007-12-01
The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.
An asymptotic analysis of the logrank test.
Strawderman, R L
1997-01-01
Asymptotic expansions for the null distribution of the logrank statistic and its distribution under local proportional hazards alternatives are developed in the case of iid observations. The results, which are derived from the work of Gu (1992) and Taniguchi (1992), are easy to interpret, and provide some theoretical justification for many behavioral characteristics of the logrank test that have been previously observed in simulation studies. We focus primarily upon (i) the inadequacy of the usual normal approximation under treatment group imbalance; and, (ii) the effects of treatment group imbalance on power and sample size calculations. A simple transformation of the logrank statistic is also derived based on results in Konishi (1991) and is found to substantially improve the standard normal approximation to its distribution under the null hypothesis of no survival difference when there is treatment group imbalance.
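For reference, the two-sample logrank statistic discussed above can be computed directly. The sketch below implements the usual observed-minus-expected form with its hypergeometric variance and applies it to simulated, deliberately unbalanced groups; it illustrates the statistic itself, not the asymptotic expansions or the transformation derived in the paper.

    import numpy as np
    from scipy import stats

    def logrank(time, event, group):
        """Two-sample logrank Z statistic; event = 1 for observed failures, 0 for censored."""
        time, event, group = map(np.asarray, (time, event, group))
        obs_minus_exp, var = 0.0, 0.0
        for t in np.unique(time[event == 1]):
            at_risk = time >= t
            n = at_risk.sum()
            n1 = (at_risk & (group == 1)).sum()
            d = ((time == t) & (event == 1)).sum()
            d1 = ((time == t) & (event == 1) & (group == 1)).sum()
            obs_minus_exp += d1 - d * n1 / n
            if n > 1:
                var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
        z = obs_minus_exp / np.sqrt(var)
        return z, 2 * stats.norm.sf(abs(z))

    # Hypothetical unbalanced two-arm study (group 1 much smaller than group 0).
    rng = np.random.default_rng(9)
    raw = np.concatenate([rng.exponential(10, 150), rng.exponential(14, 30)])
    event = (raw < 20).astype(int)            # administrative censoring at t = 20
    time = np.minimum(raw, 20)
    group = np.concatenate([np.zeros(150, int), np.ones(30, int)])
    print(logrank(time, event, group))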
Bohn, Justin; Eddings, Wesley; Schneeweiss, Sebastian
2017-03-15
Distributed networks of health-care data sources are increasingly being utilized to conduct pharmacoepidemiologic database studies. Such networks may contain data that are not physically pooled but instead are distributed horizontally (separate patients within each data source) or vertically (separate measures within each data source) in order to preserve patient privacy. While multivariable methods for the analysis of horizontally distributed data are frequently employed, few practical approaches have been put forth to deal with vertically distributed health-care databases. In this paper, we propose 2 propensity score-based approaches to vertically distributed data analysis and test their performance using 5 example studies. We found that these approaches produced point estimates close to what could be achieved without partitioning. We further found a performance benefit (i.e., lower mean squared error) for sequentially passing a propensity score through each data domain (called the "sequential approach") as compared with fitting separate domain-specific propensity scores (called the "parallel approach"). These results were validated in a small simulation study. This proof-of-concept study suggests a new multivariable analysis approach to vertically distributed health-care databases that is practical, preserves patient privacy, and warrants further investigation for use in clinical research applications that rely on health-care databases. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
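One plausible reading of the "sequential" idea, under the assumption that each data domain fits a logistic propensity model on its own covariates and passes only the resulting score downstream, is sketched below with scikit-learn. This is a speculative toy illustration, not the authors' algorithm; all data and variable names are invented.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(10)
    n = 2000
    x_a = rng.normal(size=(n, 3))              # covariates held by data domain A
    x_b = rng.normal(size=(n, 2))              # covariates held by data domain B
    logit = 0.8 * x_a[:, 0] - 0.5 * x_b[:, 1]
    treat = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    # Domain A fits a propensity model on its own covariates and shares only the score.
    ps_a = LogisticRegression().fit(x_a, treat).predict_proba(x_a)[:, 1]

    # Domain B augments its covariates with the received score (the "sequential" idea).
    design_b = np.column_stack([x_b, ps_a])
    ps_final = LogisticRegression().fit(design_b, treat).predict_proba(design_b)[:, 1]
    print("final propensity score range:", ps_final.min().round(2), ps_final.max().round(2))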
Airside HVAC BESTEST: HVAC Air-Distribution System Model Test Cases for ASHRAE Standard 140
DOE Office of Scientific and Technical Information (OSTI.GOV)
Judkoff, Ronald; Neymark, Joel; Kennedy, Mike D.
This paper summarizes recent work to develop new airside HVAC equipment model analytical verification test cases for ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs. The analytical verification test method allows comparison of simulation results from a wide variety of building energy simulation programs with quasi-analytical solutions, further described below. Standard 140 is widely cited for evaluating software for use with performance-path energy efficiency analysis, in conjunction with well-known energy-efficiency standards including ASHRAE Standard 90.1, the International Energy Conservation Code, and other international standards. Airside HVAC equipment is a common area of modelling not previously explicitly tested by Standard 140. Integration of the completed test suite into Standard 140 is in progress.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
Testing the Gamma-Ray Burst Energy Relationships
NASA Technical Reports Server (NTRS)
Band, David L.; Preece, Robert D.
2005-01-01
Building on Nakar & Piran's analysis of the Amati relation relating gamma-ray burst peak energies E(sub p) and isotropic energies E(sub iso), we test the consistency of a large sample of BATSE bursts with the Amati and Ghirlanda (which relates peak energies and actual gamma-ray energies E(sub gamma)) relations. Each of these relations can be expressed as a ratio of the different energies that is a function of redshift (for both the Amati and Ghirlanda relations) and beaming fraction f(sub B) (for the Ghirlanda relation). The most rigorous test, which allows bursts to be at any redshift, corroborates Nakar & Piran's result - 88% of the BATSE bursts are inconsistent with the Amati relation - while only 1.6% of the bursts are inconsistent with the Ghirlanda relation if f(sub B) = 1. Modelling the redshift distribution results in an energy ratio distribution for the Amati relation that is shifted by an order of magnitude relative to the observed distributions; any sub-population satisfying the Amati relation can comprise at most approx. 18% of our burst sample. A similar analysis of the Ghirlanda relation depends sensitively on the beaming fraction distribution for small values of f(sub B); for reasonable estimates of this distribution about a third of the burst sample is inconsistent with the Ghirlanda relation. Our results indicate that these relations are an artifact of the selection effects of the burst sample in which they were found; these selection effects may favor sub-populations for which these relations are valid.
Ávila-Jiménez, María Luisa; Coulson, Stephen James
2011-01-01
We aimed to describe the main Arctic biogeographical patterns of the Collembola, and analyze historical factors and current climatic regimes determining Arctic collembolan species distribution. Furthermore, we aimed to identify possible dispersal routes, colonization sources and glacial refugia for Arctic collembola. We implemented a Gaussian Mixture Clustering method on species distribution ranges and applied a distance-based parametric bootstrap test on presence-absence collembolan species distribution data. Additionally, multivariate analysis was performed considering species distributions, biodiversity, cluster distribution and environmental factors (temperature and precipitation). No clear relation was found between current climatic regimes and species distribution in the Arctic. Gaussian Mixture Clustering found common elements within Siberian areas, Atlantic areas, the Canadian Arctic, a mid-Siberian cluster and specific Beringian elements, following the same pattern previously described, using a variety of molecular methods, for Arctic plants. Species distribution hence indicates the influence of recent glacial history, as LGM glacial refugia (mid-Siberia and Beringia) and major dispersal routes to high Arctic island groups can be identified. Endemic species are found in the high Arctic, but no specific biogeographical pattern can be clearly identified as a sign of high Arctic glacial refugia. Ocean current patterns are suggested as being an important factor shaping the distribution of Arctic Collembola, which is consistent with Antarctic studies in collembolan biogeography. The clear relations between cluster distribution and geographical areas considering their recent glacial history, the lack of relationship of species distribution with current climatic regimes, and the consistency with previously described Arctic patterns in a series of organisms inferred using a variety of methods, suggest that historical phenomena shaping contemporary collembolan distribution can be inferred through biogeographical analysis. PMID:26467728
Moon, Joon-Shik; Kang, Su-Tae
2018-01-26
Considering the case of fabricating a UHSFRC (ultra-high strength fiber-reinforced concrete) beam by placing the concrete at one end and letting it flow to the other end, this study simulates the variation of the fiber orientation distribution with flow distance and the variation of the resultant tensile behaviors. The validity of the simulation approach is then shown by comparing the simulated results with experimental ones. A three-point bending test on a notched beam was adopted for the experiment, and a finite element analysis was performed to obtain the simulated results for the bending test considering the flow-dependent tensile behavior of the UHSFRC. From the simulation of the fiber orientation distribution as a function of flow distance, it was found that the major change in the fiber orientation distribution took place within a short flow distance and most of the fibers became nearly aligned to the flow direction. After some flow distance, there was no remarkable variation in the fiber orientation distribution that could influence the tensile behavior of the composite. For this flow region, the consistent flexural test results, regardless of flow distance, demonstrate the reliability of the simulation.
The reliability and validity of the SF-8 with a conflict-affected population in northern Uganda.
Roberts, Bayard; Browne, John; Ocaka, Kaducu Felix; Oyok, Thomas; Sondorp, Egbert
2008-12-02
The SF-8 is a health-related quality of life instrument that could provide a useful means of assessing general physical and mental health amongst populations affected by conflict. The purpose of this study was to test the validity and reliability of the SF-8 with a conflict-affected population in northern Uganda. A cross-sectional multi-staged, random cluster survey was conducted with 1206 adults in camps for internally displaced persons in Gulu and Amuru districts of northern Uganda. Data quality was assessed by analysing the number of incomplete responses to SF-8 items. Response distribution was analysed using aggregate endorsement frequency. Test-retest reliability was assessed in a separate smaller survey using the intraclass correlation test. Construct validity was measured using principal component analysis, and the Pearson Correlation test for item-summary score correlation and inter-instrument correlations. Known groups validity was assessed using a two sample t-test to evaluate the ability of the SF-8 to discriminate between groups known to have, and not have, physical and mental health problems. The SF-8 showed excellent data quality. It showed acceptable item response distribution based upon analysis of aggregate endorsement frequencies. Test-retest showed a good intraclass correlation of 0.61 for PCS and 0.68 for MCS. The principal component analysis indicated strong construct validity and concurred with the results of the validity tests by the SF-8 developers. The SF-8 also showed strong construct validity between the 8 items and the PCS and MCS summary scores, moderate inter-instrument validity, and strong known groups validity. This study provides evidence on the reliability and validity of the SF-8 amongst IDPs in northern Uganda.
The reliability and validity of the SF-8 with a conflict-affected population in northern Uganda
Roberts, Bayard; Browne, John; Ocaka, Kaducu Felix; Oyok, Thomas; Sondorp, Egbert
2008-01-01
Background The SF-8 is a health-related quality of life instrument that could provide a useful means of assessing general physical and mental health amongst populations affected by conflict. The purpose of this study was to test the validity and reliability of the SF-8 with a conflict-affected population in northern Uganda. Methods A cross-sectional multi-staged, random cluster survey was conducted with 1206 adults in camps for internally displaced persons in Gulu and Amuru districts of northern Uganda. Data quality was assessed by analysing the number of incomplete responses to SF-8 items. Response distribution was analysed using aggregate endorsement frequency. Test-retest reliability was assessed in a separate smaller survey using the intraclass correlation test. Construct validity was measured using principal component analysis, and the Pearson Correlation test for item-summary score correlation and inter-instrument correlations. Known groups validity was assessed using a two sample t-test to evaluate the ability of the SF-8 to discriminate between groups known to have, and not have, physical and mental health problems. Results The SF-8 showed excellent data quality. It showed acceptable item response distribution based upon analysis of aggregate endorsement frequencies. Test-retest showed a good intraclass correlation of 0.61 for PCS and 0.68 for MCS. The principal component analysis indicated strong construct validity and concurred with the results of the validity tests by the SF-8 developers. The SF-8 also showed strong construct validity between the 8 items and the PCS and MCS summary scores, moderate inter-instrument validity, and strong known groups validity. Conclusion This study provides evidence on the reliability and validity of the SF-8 amongst IDPs in northern Uganda. PMID:19055716
Cohn, T.A.; England, J.F.; Berenbrock, C.E.; Mason, R.R.; Stedinger, J.R.; Lamontagne, J.R.
2013-01-01
The Grubbs-Beck test is recommended by the federal guidelines for detection of low outliers in flood flow frequency computation in the United States. This paper presents a generalization of the Grubbs-Beck test for normal data (similar to the Rosner (1983) test; see also Spencer and McCuen (1996)) that can provide a consistent standard for identifying multiple potentially influential low flows. In cases where low outliers have been identified, they can be represented as “less-than” values, and a frequency distribution can be developed using censored-data statistical techniques, such as the Expected Moments Algorithm. This approach can improve the fit of the right-hand tail of a frequency distribution and provide protection from lack-of-fit due to unimportant but potentially influential low flows (PILFs) in a flood series, thus making the flood frequency analysis procedure more robust.
NASA Astrophysics Data System (ADS)
Cohn, T. A.; England, J. F.; Berenbrock, C. E.; Mason, R. R.; Stedinger, J. R.; Lamontagne, J. R.
2013-08-01
The Grubbs-Beck test is recommended by the federal guidelines for detection of low outliers in flood flow frequency computation in the United States. This paper presents a generalization of the Grubbs-Beck test for normal data (similar to the Rosner (1983) test; see also Spencer and McCuen (1996)) that can provide a consistent standard for identifying multiple potentially influential low flows. In cases where low outliers have been identified, they can be represented as "less-than" values, and a frequency distribution can be developed using censored-data statistical techniques, such as the Expected Moments Algorithm. This approach can improve the fit of the right-hand tail of a frequency distribution and provide protection from lack-of-fit due to unimportant but potentially influential low flows (PILFs) in a flood series, thus making the flood frequency analysis procedure more robust.
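As background for the generalization described in these two entries, the classic single Grubbs-Beck low-outlier screen can be sketched as follows. The critical deviate K_N uses the polynomial approximation associated with Bulletin 17B (treated here as an assumption), and the flows are invented; the paper's multiple-low-outlier test iterates a related criterion rather than this single screen.

    import numpy as np

    def grubbs_beck_threshold(flows):
        """Classic one-sided Grubbs-Beck low-outlier threshold (10% level) on log10 flows.

        Uses the Bulletin 17B approximation for the critical deviate K_N; the paper's
        multiple-low-outlier generalization repeats a similar screen recursively.
        """
        q = np.log10(np.asarray(flows, dtype=float))
        n = len(q)
        k_n = -0.9043 + 3.345 * np.sqrt(np.log10(n)) - 0.4046 * np.log10(n)
        return 10 ** (q.mean() - k_n * q.std(ddof=1))

    # Hypothetical annual peak flows with a couple of very low values.
    peaks = np.array([850, 1200, 990, 1430, 760, 1100, 45, 30, 1310, 905, 1180, 1020])
    thresh = grubbs_beck_threshold(peaks)
    print(f"Low-outlier threshold: {thresh:.0f}; flagged:", peaks[peaks < thresh])

Flagged values would then be recoded as "less-than" observations and handled with censored-data techniques such as the Expected Moments Algorithm, as the abstract describes.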
Hybrid network defense model based on fuzzy evaluation.
Cho, Ying-Chiang; Pan, Jen-Yi
2014-01-01
With sustained and rapid developments in the field of information technology, the issue of network security has become increasingly prominent. The theme of this study is network data security, with the test subject being a classified and sensitive network laboratory that belongs to the academic network. The analysis is based on the deficiencies and potential risks of the network's existing defense technology, characteristics of cyber attacks, and network security technologies. Subsequently, a distributed network security architecture using the technology of an intrusion prevention system is designed and implemented. In this paper, first, the overall design approach is presented. This design is used as the basis to establish a network defense model, an improvement over the traditional single-technology model that addresses the latter's inadequacies. Next, a distributed network security architecture is implemented, comprising a hybrid firewall, intrusion detection, virtual honeynet projects, and connectivity and interactivity between these three components. Finally, the proposed security system is tested. A statistical analysis of the test results verifies the feasibility and reliability of the proposed architecture. The findings of this study will potentially provide new ideas and stimuli for future designs of network security architecture.
Development of Testing Methodologies for the Mechanical Properties of MEMS
NASA Technical Reports Server (NTRS)
Ekwaro-Osire, Stephen
2003-01-01
This effort is to investigate and design testing strategies to determine the mechanical properties of MicroElectroMechanical Systems (MEMS) as well as investigate the development of a MEMS Probabilistic Design Methodology (PDM). One item of potential interest is the design of a test for the Weibull size effect in pressure membranes. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC-developed CARES/Life program for bulk ceramics. However, the primary area of investigation will most likely be analysis and modeling of material interfaces for strength as well as developing a strategy to handle stress singularities at sharp corners, fillets, and material interfaces. This will be a continuation of the previous year's work. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads.
Pasternak, Amy; Sideridis, Georgios; Fragala-Pinkham, Maria; Glanzman, Allan M; Montes, Jacqueline; Dunaway, Sally; Salazar, Rachel; Quigley, Janet; Pandya, Shree; O'Riley, Susan; Greenwood, Jonathan; Chiriboga, Claudia; Finkel, Richard; Tennekoon, Gihan; Martens, William B; McDermott, Michael P; Fournier, Heather Szelag; Madabusi, Lavanya; Harrington, Timothy; Cruz, Rosangel E; LaMarca, Nicole M; Videon, Nancy M; Vivo, Darryl C De; Darras, Basil T
2016-12-01
In this study we evaluated the suitability of a caregiver-reported functional measure, the Pediatric Evaluation of Disability Inventory-Computer Adaptive Test (PEDI-CAT), for children and young adults with spinal muscular atrophy (SMA). PEDI-CAT Mobility and Daily Activities domain item banks were administered to 58 caregivers of children and young adults with SMA. Rasch analysis was used to evaluate test properties across SMA types. Unidimensional content for each domain was confirmed. The PEDI-CAT was most informative for type III SMA, with ability levels distributed close to 0.0 logits in both domains. It was less informative for types I and II SMA, especially for mobility skills. Item and person abilities were not distributed evenly across all types. The PEDI-CAT may be used to measure functional performance in SMA, but additional items are needed to identify small changes in function and best represent the abilities of all types of SMA. Muscle Nerve 54: 1097-1107, 2016. © 2016 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Rahman, Yuli Asmi; Manjang, Salama; Yusran, Ilham, Amil Ahmad
2018-03-01
Power loss minimization has many advantages for the radial distribution system, among them reduction of power flow in feeder lines, relief of stress on feeder loading, and avoidance of power procurement from the grid as well as of the cost of loss-compensating instruments. This paper presents capacitor and photovoltaic (PV) placement as alternative means to decrease power system losses. The paper aims to evaluate the best alternative for decreasing power system losses and improving the voltage profile in the radial distribution system. To achieve the objectives of the paper, three cases are tested with Electric Transient and Analysis Program (ETAP) simulations. First, capacitor placement is simulated. Second, PV placement is simulated. Lastly, simultaneous placement of the capacitor and PV is simulated. The simulations were validated using the IEEE 34-bus test system. The results show that simultaneous installation of the capacitor and integration of PV leads to significant voltage profile correction and power loss minimization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahfuz, H.; Maniruzzaman, M.; Vaidya, U.
1997-04-01
Monotonic tensile and fatigue response of continuous silicon carbide fiber reinforced silicon nitride (SiC{sub f}/Si{sub 3}N{sub 4}) composites has been investigated. The monotonic tensile tests have been performed at room and elevated temperatures. Fatigue tests have been conducted at room temperature (RT), at a stress ratio R = 0.1 and a frequency of 5 Hz. It is observed during the monotonic tests that the composite retains only 30% of its room temperature strength at 1,600 C, suggesting a substantial chemical degradation of the matrix at that temperature. The softening of the matrix at elevated temperature also causes a reduction in tensile modulus, and the total reduction in modulus is around 45%. Fatigue data have been generated at three load levels and the fatigue strength of the composite has been found to be considerably high; about 75% of its ultimate room temperature strength. Extensive statistical analysis has been performed to understand the degree of scatter in the fatigue as well as in the static test data. Weibull shape factors and characteristic values have been determined for each set of tests and their relationship with the response of the composites has been discussed. A statistical fatigue life prediction method developed from the Weibull distribution is also presented. A Maximum Likelihood Estimator with censoring techniques and data pooling schemes has been employed to determine the distribution parameters for the statistical analysis. These parameters have been used to generate the S-N diagram with the desired level of reliability. Details of the statistical analysis and the discussion of the static and fatigue behavior of the composites are presented in this paper.
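The Weibull treatment of fatigue scatter mentioned above can be illustrated with a plain maximum-likelihood fit, ignoring the censoring and pooling schemes the authors employ. The sketch below uses SciPy's weibull_min on made-up fatigue lives to recover a shape factor and characteristic life; the data and the B10 example are assumptions for illustration only.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(11)
    # Hypothetical fatigue lives (cycles to failure) at one load level.
    lives = rng.weibull(1.8, size=15) * 2.0e5

    shape, loc, scale = stats.weibull_min.fit(lives, floc=0)   # fix location at zero
    print(f"Weibull shape (slope) = {shape:.2f}, characteristic life = {scale:.3g} cycles")

    # Life at a desired reliability level, e.g., 90% probability of survival:
    b10 = stats.weibull_min.ppf(0.10, shape, loc=0, scale=scale)
    print(f"B10 life (10% failure probability): {b10:.3g} cycles")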
Red cell distribution width does not predict stroke severity or functional outcome.
Ntaios, George; Gurer, Ozgur; Faouzi, Mohamed; Aubert, Carole; Michel, Patrik
2012-01-01
Red cell distribution width was recently identified as a predictor of cardiovascular and all-cause mortality in patients with previous stroke. Red cell distribution width is also higher in patients with stroke compared with those without. However, there are no data on the association of red cell distribution width, assessed during the acute phase of ischemic stroke, with stroke severity and functional outcome. In the present study, we sought to investigate this relationship and ascertain the main determinants of red cell distribution width in this population. We used data from the Acute Stroke Registry and Analysis of Lausanne for patients between January 2003 and December 2008. Red cell distribution width was generated at admission by the Sysmex XE-2100 automated cell counter from ethylene diamine tetraacetic acid blood samples stored at room temperature until measurement. An χ(2) -test was performed to compare frequencies of categorical variables between different red cell distribution width quartiles, and one-way analysis of variance for continuous variables. The effect of red cell distribution width on severity and functional outcome was investigated in univariate and multivariate robust regression analysis. Level of significance was set at 95%. There were 1504 patients (72±15·76 years, 43·9% females) included in the analysis. Red cell distribution width was significantly associated to NIHSS (β-value=0·24, P=0·01) and functional outcome (odds ratio=10·73 for poor outcome, P<0·001) at univariate analysis but not multivariate. Prehospital Rankin score (β=0·19, P<0·001), serum creatinine (β=0·008, P<0·001), hemoglobin (β=-0·009, P<0·001), mean platelet volume (β=0·09, P<0·05), age (β=0·02, P<0·001), low ejection fraction (β=0·66, P<0·001) and antihypertensive treatment (β=0·32, P<0·001) were independent determinants of red cell distribution width. Red cell distribution width, assessed during the early phase of acute ischemic stroke, does not predict severity or functional outcome. © 2011 The Authors. International Journal of Stroke © 2011 World Stroke Organization.
2011-06-03
Permutational multivariate analysis of variance (PerMANOVA; McArdle and Anderson, 2001) was used to test hypotheses regarding regions and invasion level ... for the differences due to invasion level after removing any differences due to regions, soil texture, and habitat. The null distribution for PerMANOVA ... soil neighborhoods, PerMANOVA tests were carried out separately for each site. We did not use a stratified randomization scheme for these tests, under ...
Zhang, Jingyang; Chaloner, Kathryn; McLinden, James H.; Stapleton, Jack T.
2013-01-01
Reconciling two quantitative ELISA tests for an antibody to an RNA virus, in a situation without a gold standard and where false negatives may occur, is the motivation for this work. False negatives occur when access of the antibody to the binding site is blocked. Based on the mechanism of the assay, a mixture of four bivariate normal distributions is proposed with the mixture probabilities depending on a two-stage latent variable model including the prevalence of the antibody in the population and the probabilities of blocking on each test. There is prior information on the prevalence of the antibody, and also on the probability of false negatives, and so a Bayesian analysis is used. The dependence between the two tests is modeled to be consistent with the biological mechanism. Bayesian decision theory is utilized for classification. The proposed method is applied to the motivating data set to classify the data into two groups: those with and those without the antibody. Simulation studies describe the properties of the estimation and the classification. Sensitivity to the choice of the prior distribution is also addressed by simulation. The same model with two levels of latent variables is applicable in other testing procedures such as quantitative polymerase chain reaction tests where false negatives occur when there is a mutation in the primer sequence. PMID:23592433
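The classification step can be illustrated with a much-reduced sketch: a two-component bivariate normal mixture and Bayes' rule for the posterior probability of carrying the antibody. The paper's model has four components tied to a two-stage latent-variable structure with informative priors and a full Bayesian analysis; the means, covariances, and prevalence below are hypothetical.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical component parameters for the pair of ELISA readings.
prevalence = 0.3
pos = multivariate_normal(mean=[2.5, 2.3], cov=[[0.5, 0.3], [0.3, 0.6]])    # antibody present
neg = multivariate_normal(mean=[0.4, 0.5], cov=[[0.2, 0.05], [0.05, 0.2]])  # antibody absent

def posterior_positive(x):
    """Posterior probability of the antibody given the two test readings (Bayes' rule)."""
    num = prevalence * pos.pdf(x)
    return num / (num + (1.0 - prevalence) * neg.pdf(x))

reading = np.array([1.9, 2.1])
p = posterior_positive(reading)
print(f"P(antibody | readings) = {p:.3f} ->", "positive" if p > 0.5 else "negative")
```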
Fine-scale habitat modeling of a top marine predator: do prey data improve predictive capacity?
Torres, Leigh G; Read, Andrew J; Halpin, Patrick
2008-10-01
Predators and prey assort themselves relative to each other, the availability of resources and refuges, and the temporal and spatial scale of their interaction. Predictive models of predator distributions often rely on these relationships by incorporating data on environmental variability and prey availability to determine predator habitat selection patterns. This approach to predictive modeling holds true in marine systems where observations of predators are logistically difficult, emphasizing the need for accurate models. In this paper, we ask whether including prey distribution data in fine-scale predictive models of bottlenose dolphin (Tursiops truncatus) habitat selection in Florida Bay, Florida, U.S.A., improves predictive capacity. Environmental characteristics are often used as predictor variables in habitat models of top marine predators with the assumption that they act as proxies of prey distribution. We examine the validity of this assumption by comparing the response of dolphin distribution and fish catch rates to the same environmental variables. Next, the predictive capacities of four models, with and without prey distribution data, are tested to determine whether dolphin habitat selection can be predicted without recourse to describing the distribution of their prey. The final analysis determines the accuracy of predictive maps of dolphin distribution produced by modeling areas of high fish catch based on significant environmental characteristics. We use spatial analysis and independent data sets to train and test the models. Our results indicate that, due to high habitat heterogeneity and the spatial variability of prey patches, fine-scale models of dolphin habitat selection in coastal habitats will be more successful if environmental variables are used as predictor variables of predator distributions rather than relying on prey data as explanatory variables. However, predictive modeling of prey distribution as the response variable based on environmental variability did produce high predictive performance of dolphin habitat selection, particularly foraging habitat.
Biostatistics Series Module 3: Comparing Groups: Numerical Variables.
Hazra, Avijit; Gogtay, Nithya
2016-01-01
Numerical data that are normally distributed can be analyzed with parametric tests, that is, tests which are based on the parameters that define a normal distribution curve. If the distribution is uncertain, the data can be plotted as a normal probability plot and visually inspected, or tested for normality using one of a number of goodness of fit tests, such as the Kolmogorov-Smirnov test. The widely used Student's t-test has three variants. The one-sample t-test is used to assess if a sample mean (as an estimate of the population mean) differs significantly from a given population mean. The means of two independent samples may be compared for a statistically significant difference by the unpaired or independent samples t-test. If the data sets are related in some way, their means may be compared by the paired or dependent samples t-test. The t-test should not be used to compare the means of more than two groups. Although it is possible to compare groups in pairs, when there are more than two groups, this will increase the probability of a Type I error. The one-way analysis of variance (ANOVA) is employed to compare the means of three or more independent data sets that are normally distributed. Multiple measurements from the same set of subjects cannot be treated as separate, unrelated data sets. Comparison of means in such a situation requires repeated measures ANOVA. It is to be noted that while a multiple group comparison test such as ANOVA can point to a significant difference, it does not identify exactly between which two groups the difference lies. To do this, multiple group comparison needs to be followed up by an appropriate post hoc test. An example is the Tukey's honestly significant difference test following ANOVA. If the assumptions for parametric tests are not met, there are nonparametric alternatives for comparing data sets. These include Mann-Whitney U-test as the nonparametric counterpart of the unpaired Student's t-test, Wilcoxon signed-rank test as the counterpart of the paired Student's t-test, Kruskal-Wallis test as the nonparametric equivalent of ANOVA and the Friedman's test as the counterpart of repeated measures ANOVA.
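The tests named in this module map directly onto scipy.stats calls; the sketch below simply indexes them on simulated data. The group means and sizes are arbitrary and this is a function reference, not an analysis recipe.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a, b, c = (rng.normal(loc=m, scale=1.0, size=30) for m in (10.0, 10.5, 11.2))

# Normality check (Kolmogorov-Smirnov against a fitted normal), then the parametric tests.
stats.kstest(a, "norm", args=(a.mean(), a.std(ddof=1)))
stats.ttest_1samp(a, popmean=10.0)        # one-sample t-test
stats.ttest_ind(a, b)                     # unpaired / independent-samples t-test
stats.ttest_rel(a, b)                     # paired / dependent-samples t-test
stats.f_oneway(a, b, c)                   # one-way ANOVA for three or more groups
stats.tukey_hsd(a, b, c)                  # post hoc pairwise comparisons after ANOVA

# Nonparametric counterparts when the parametric assumptions fail.
stats.mannwhitneyu(a, b)                  # ~ unpaired t-test
stats.wilcoxon(a, b)                      # ~ paired t-test
stats.kruskal(a, b, c)                    # ~ one-way ANOVA
stats.friedmanchisquare(a, b, c)          # ~ repeated measures ANOVA
```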
Statistical analysis of flight times for space shuttle ferry flights
NASA Technical Reports Server (NTRS)
Graves, M. E.; Perlmutter, M.
1974-01-01
Markov chain and Monte Carlo analysis techniques are applied to the simulated Space Shuttle Orbiter Ferry flights to obtain statistical distributions of flight time duration between Edwards Air Force Base and Kennedy Space Center. The two methods are compared, and are found to be in excellent agreement. The flights are subjected to certain operational and meteorological requirements, or constraints, which cause eastbound and westbound trips to yield different results. Persistence of events theory is applied to the occurrence of inclement conditions to find their effect upon the statistical flight time distribution. In a sensitivity test, some of the constraints are varied to observe the corresponding changes in the results.
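A stripped-down Monte Carlo of this kind of analysis is sketched below: a two-state weather chain supplies the persistence of inclement conditions, and the trip time is the number of days needed to complete a fixed number of flying legs. The transition probabilities and leg count are invented and do not reflect the ferry-route constraints in the report.

```python
import numpy as np

rng = np.random.default_rng(1)
p_good_good, p_bad_bad = 0.8, 0.6   # hypothetical weather persistence probabilities
legs_needed = 5                     # flying legs per ferry trip (illustrative)

def one_trip():
    days, legs_done, good = 0, 0, True
    while legs_done < legs_needed:
        days += 1
        if good:                                   # a leg is flown only in good weather
            legs_done += 1
        stay = p_good_good if good else p_bad_bad  # Markov persistence of the current state
        good = good if rng.random() < stay else not good
    return days

trips = np.array([one_trip() for _ in range(20_000)])
print("mean trip time:", trips.mean(), "days; 95th percentile:", np.percentile(trips, 95))
```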
Analysis of the Cape Cod tracer data
Ezzedine, Souheil; Rubin, Yoram
1997-01-01
An analysis of the Cape Cod test was performed using several first- and higher-order theoretical models. We compare conditional and unconditional solutions of the transport equation and employ them for analysis of the experimental data. We consider spatial moments, mass breakthrough curves, and the distribution of the solute mass in space. The concentration measurements were also analyzed using theoretical models for the expected value and variance of concentration. The theoretical models we employed are based on the spatial correlation structure of the conductivity field, without any fitting of parameters to the tracer data, and hence we can assess the predictive power of the theories. The effects of recharge on macrodispersion are investigated, and it is shown that recharge provides a reasonable explanation for the enhanced lateral spread of the Cape Cod plume. The compendium of the experimental results presented here is useful for testing theoretical and numerical models.
NASA Astrophysics Data System (ADS)
Messerotti, Mauro; Otruba, Wolfgang; Hanslmeier, Arnold
2000-06-01
The Kanzelhoehe Solar Observatory is an observing facility located in Carinthia (Austria) and operated by the Institute of Geophysics, Astrophysics and Meteorology of the Karl-Franzens University Graz. A set of instruments for solar surveillance in different wavelength bands is continuously operated in automatic mode and is presently being upgraded to supply near-real-time solar activity indexes for space weather applications. In this frame, we tested a low-end software/hardware architecture running on the PC platform in a non-homogeneous, remotely distributed environment that allows efficient or moderately efficient application sharing at the Intranet and Extranet (i.e., Wide Area Network) levels, respectively. Due to the geographical distribution of the participating teams (Trieste, Italy; Kanzelhoehe and Graz, Austria), we have been using such features for collaborative remote software development and testing, data analysis and calibration, and observing run emulation from multiple sites as well. In this work, we describe the architecture used and its performance, based on a series of application sharing tests we carried out to ascertain its effectiveness in real collaborative remote work, observations and data exchange. The system proved to be reliable at the Intranet level for most distributed tasks, limited to less demanding ones at the Extranet level, but quite effective in remote instrument control when real-time response is not needed.
Test functions for three-dimensional control-volume mixed finite-element methods on irregular grids
Naff, R.L.; Russell, T.F.; Wilson, J.D.; ,; ,; ,; ,; ,
2000-01-01
Numerical methods based on unstructured grids, with irregular cells, usually require discrete shape functions to approximate the distribution of quantities across cells. For control-volume mixed finite-element methods, vector shape functions are used to approximate the distribution of velocities across cells and vector test functions are used to minimize the error associated with the numerical approximation scheme. For a logically cubic mesh, the lowest-order shape functions are chosen in a natural way to conserve intercell fluxes that vary linearly in logical space. Vector test functions, while somewhat restricted by the mapping into the logical reference cube, admit a wider class of possibilities. Ideally, an error minimization procedure to select the test function from an acceptable class of candidates would be the best procedure. Lacking such a procedure, we first investigate the effect of possible test functions on the pressure distribution over the control volume; specifically, we look for test functions that allow for the elimination of intermediate pressures on cell faces. From these results, we select three forms for the test function for use in a control-volume mixed method code and subject them to an error analysis for different forms of grid irregularity; errors are reported in terms of the discrete L2 norm of the velocity error. Of these three forms, one appears to produce optimal results for most forms of grid irregularity.
Climate Informed Low Flow Frequency Analysis Using Nonstationary Modeling
NASA Astrophysics Data System (ADS)
Liu, D.; Guo, S.; Lian, Y.
2014-12-01
Stationarity is often assumed for frequency analysis of low flows in water resources management and planning. However, many studies have shown that flow characteristics, particularly the frequency spectrum of extreme hydrologic events, were modified by climate change and human activities, and conventional frequency analysis that ignores these non-stationary characteristics may lead to costly designs. The analysis presented in this paper was based on more than 100 years of daily flow data from the Yichang gaging station, 44 kilometers downstream of the Three Gorges Dam. The Mann-Kendall trend test under the scaling hypothesis showed that the annual low flows had a significant monotonic trend, whereas an abrupt change point was identified in 1936 by the Pettitt test. The climate informed low flow frequency analysis and the divided and combined method are employed to account for the impacts of related climate variables and the nonstationarities in annual low flows. Without prior knowledge of the probability density function for the gaging station, six distribution functions, including the Generalized Extreme Value (GEV), Pearson Type III, Gumbel, Gamma, Lognormal, and Weibull distributions, have been tested to find the best fit, with the local likelihood method used to estimate the parameters. Analyses show that the GEV had the best fit for the observed low flows. This study has also shown that the climate informed low flow frequency analysis is able to exploit the link between climate indices and low flows, which accounts for the dynamic features relevant to reservoir management and provides more accurate and reliable designs for infrastructure and water supply.
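The distribution-selection step can be sketched as below: fit each candidate family by maximum likelihood and rank by AIC. The series is synthetic, and the stationary scipy fits stand in for the paper's climate-informed, local-likelihood estimation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
low_flows = stats.genextreme.rvs(c=-0.1, loc=3000, scale=500, size=100, random_state=rng)

candidates = {
    "GEV": stats.genextreme, "Pearson III": stats.pearson3, "Gumbel": stats.gumbel_r,
    "Gamma": stats.gamma, "Lognormal": stats.lognorm, "Weibull": stats.weibull_min,
}

for name, dist in candidates.items():
    params = dist.fit(low_flows)                       # maximum-likelihood fit
    loglik = np.sum(dist.logpdf(low_flows, *params))
    aic = 2 * len(params) - 2 * loglik                 # smaller AIC = better trade-off
    print(f"{name:11s} AIC = {aic:.1f}")
```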
NASA Astrophysics Data System (ADS)
Santarelli, M.; Leone, P.; Calì, M.; Orsello, G.
The tubular SOFC generator CHP-100, built by Siemens Power Generation (SPG) Stationary Fuel Cells (SFC), is running at Gas Turbine Technologies (GTT) in Torino (Italy), in the framework of the EOS Project. The nominal load of the generator ensures a produced electric power of around 105 kWe (AC) and around 60 kWt of thermal power at 250 °C, to be used for the custom-tailored HVAC system. Several experimental sessions have been scheduled on the generator; the aim is to characterize the operation through the analysis of some global performance indexes and the detailed control of the operation of the different bundles of the whole stack. All the scheduled tests have been performed by applying the methodology of design of experiments; the main results show the effect of changes in the analysed operating factors in terms of the distribution of voltage and temperature over the stack. Fuel consumption tests give information about the sensitivity of the voltage and temperature distribution along the single bundles. On the other hand, since the generator is an air-cooled system, the results of the tests on the air stoichs have been used to analyze the generator thermal management (temperature distribution and profiles) and its effect on the polarization. The sensitivity analysis of the local voltage to the overall fuel consumption modifications can be used as a powerful procedure to deduce the local distribution of fuel utilization (FU) along the single bundles: in fact, through a model obtained by differentiating the polarization curve with respect to FU, it is possible to link the distribution of voltage sensitivities to fuel consumption to the distribution of the local FU. The FU distribution is shown to be non-uniform, and this affects the local voltages and temperatures, causing a strong warming effect in some rows of the generator. Therefore, the effectiveness of the thermal regulation provided by the air stoichs in reducing the non-uniform temperature distribution and the overheating (and thereby improving the voltage behavior along the generator) is discussed. It is demonstrated that the use of one air plenum is not effective in the thermal regulation of the whole generator, in particular in the reduction of the temperature gradients linked to the non-uniform fuel distribution.
Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's
NASA Technical Reports Server (NTRS)
Jadaan, Osama
2003-01-01
This effort is to investigate probabilistic life prediction methodologies for ceramic matrix composites and MicroElectroMechanical Systems (MEMS) and to analyze designs that determine stochastic properties of MEMS. For CMC's this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin film (bulge test) specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and if feasible, run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
Handling nonnormality and variance heterogeneity for quantitative sublethal toxicity tests.
Ritz, Christian; Van der Vliet, Leana
2009-09-01
The advantages of using regression-based techniques to derive endpoints from environmental toxicity data are clear, and slowly, this superior analytical technique is gaining acceptance. As use of regression-based analysis becomes more widespread, some of the associated nuances and potential problems come into sharper focus. Looking at data sets that cover a broad spectrum of standard test species, we noticed that some model fits to data failed to meet two key assumptions (variance homogeneity and normality) that are necessary for correct statistical analysis via regression-based techniques. Failure to meet these assumptions often is caused by reduced variance at the concentrations showing severe adverse effects. Although commonly used with linear regression analysis, transformation of the response variable only is not appropriate when fitting data using nonlinear regression techniques. Through analysis of sample data sets, including Lemna minor, Eisenia andrei (terrestrial earthworm), and algae, we show that both the so-called Box-Cox transformation and use of the Poisson distribution can help to correct variance heterogeneity and nonnormality and so allow nonlinear regression analysis to be implemented. Both the Box-Cox transformation and the Poisson distribution can be readily implemented into existing protocols for statistical analysis. By correcting for nonnormality and variance heterogeneity, these two statistical tools can be used to encourage the transition to regression-based analysis and the depreciation of less-desirable and less-flexible analytical techniques, such as linear interpolation.
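A minimal sketch of the Box-Cox route is shown below, with a synthetic heteroscedastic dose-response data set and a log-logistic model fitted on the transformed scale; the species data sets, model form, and starting values in the paper are not reproduced, and the Poisson-likelihood alternative is not shown.

```python
import numpy as np
from scipy import stats
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)
conc = np.repeat([0.5, 1.0, 3.2, 10.0, 32.0, 100.0], 5)
true_mean = 100.0 / (1.0 + (conc / 8.0) ** 2)              # underlying dose-response
y = np.abs(rng.normal(true_mean, 0.08 * true_mean + 0.5))  # variance shrinks at severe effect levels

# Box-Cox transform of the strictly positive response; lambda chosen by maximum likelihood.
y_bc, lam = stats.boxcox(y)
print(f"estimated Box-Cox lambda ~ {lam:.2f}")

def log_logistic(x, top, ec50, slope):
    return top / (1.0 + (x / ec50) ** slope)

def model_bc(x, top, ec50, slope):
    # Transform the model the same way as the data so residual variance is roughly constant.
    return stats.boxcox(np.clip(log_logistic(x, top, ec50, slope), 1e-6, None), lmbda=lam)

popt, _ = curve_fit(model_bc, conc, y_bc, p0=[100.0, 8.0, 2.0], bounds=(1e-3, 1e4))
print("EC50 estimate:", popt[1])
```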
A novel method for correcting scanline-observational bias of discontinuity orientation
Huang, Lei; Tang, Huiming; Tan, Qinwen; Wang, Dingjian; Wang, Liangqing; Ez Eldin, Mutasim A. M.; Li, Changdong; Wu, Qiong
2016-01-01
Scanline observation is known to introduce an angular bias into the probability distribution of orientation in three-dimensional space. In this paper, numerical solutions expressing the functional relationship between the scanline-observational distribution (in one-dimensional space) and the inherent distribution (in three-dimensional space) are derived using probability theory and calculus under the independence hypothesis of dip direction and dip angle. Based on these solutions, a novel method for obtaining the inherent distribution (also for correcting the bias) is proposed, an approach which includes two procedures: 1) Correcting the cumulative probabilities of orientation according to the solutions, and 2) Determining the distribution of the corrected orientations using approximation methods such as the one-sample Kolmogorov-Smirnov test. The inherent distribution corrected by the proposed method can be used for discrete fracture network (DFN) modelling, which is applied to such areas as rockmass stability evaluation, rockmass permeability analysis, rockmass quality calculation and other related fields. To maximize the correction capacity of the proposed method, the observed sample size is suggested through effectiveness tests for different distribution types, dispersions and sample sizes. The performance of the proposed method and the comparison of its correction capacity with existing methods are illustrated with two case studies. PMID:26961249
Analysis of the silicone polymer surface aging profile with laser-induced breakdown spectroscopy
NASA Astrophysics Data System (ADS)
Wang, Xilin; Hong, Xiao; Wang, Han; Chen, Can; Zhao, Chenlong; Jia, Zhidong; Wang, Liming; Zou, Lin
2017-10-01
Silicone rubber composite materials have been widely used on high-voltage transmission lines to resist pollution flashover. Aging of the silicone rubber surface degrades the service properties, causing loss of the anti-pollution ability. In this paper, laser-induced breakdown spectroscopy (LIBS), an analysis method that requires no sample preparation, can be conducted on site, and is suitable for nearly all types of materials, was used for the analysis of newly prepared and aged (out-of-service) silicone rubber composites. With scanning electron microscopy (SEM) and hydrophobicity testing, LIBS was shown to be nearly non-destructive for silicone rubber. Under the same LIBS testing parameters, a linear relationship was observed between ablation depth and the number of laser pulses. From the emission spectra, the elements present and their distribution in the samples along the depth direction, from the surface to the interior, were determined and verified against EDS results. This research showed that LIBS is suitable for detecting the aging-layer depth and the element distribution of the silicone rubber surface.
ERIC Educational Resources Information Center
Vasu, Ellen S.; Elmore, Patricia B.
The effects of violating the assumption of normality, coupled with the condition of multicollinearity, upon the outcome of testing the hypothesis Beta equals zero in the two-predictor regression equation are investigated. A Monte Carlo approach was utilized in which three different distributions were sampled for two sample sizes over…
Effect of geometrical parameters on pressure distributions of impulse manufacturing technologies
NASA Astrophysics Data System (ADS)
Brune, Ryan Carl
Impulse manufacturing techniques constitute a growing field of methods that utilize high-intensity pressure events to conduct useful mechanical operations. As interest in applying this technology continues to grow, greater understanding must be achieved with respect to output pressure events in both magnitude and distribution. In order to address this need, a novel pressure measurement has been developed called the Profile Indentation Pressure Evaluation (PIPE) method that systematically analyzes indentation patterns created with impulse events. Correlation with quasi-static test data and use of software-assisted analysis techniques allows for colorized pressure maps to be generated for both electromagnetic and vaporizing foil actuator (VFA) impulse forming events. Development of this technique aided introduction of a design method for electromagnetic path actuator systems, where key geometrical variables are considered using a newly developed analysis method, which is called the Path Actuator Proximal Array (PAPA) pressure model. This model considers key current distribution and proximity effects and interprets generated pressure by considering the adjacent conductor surfaces as proximal arrays of individual conductors. According to PIPE output pressure analysis, the PAPA model provides a reliable prediction of generated pressure for path actuator systems as local geometry is changed. Associated mechanical calculations allow for pressure requirements to be calculated for shearing, flanging, and hemming operations, providing a design process for such cases. Additionally, geometry effect is investigated through a formability enhancement study using VFA metalworking techniques. A conical die assembly is utilized with both VFA high velocity and traditional quasi-static test methods on varied Hasek-type sample geometries to elicit strain states consistent with different locations on a forming limit diagram. Digital image correlation techniques are utilized to measure major and minor strains for each sample type to compare limit strain results. Overall testing indicated decreased formability at high velocity for 304 DDQ stainless steel and increased formability at high velocity for 3003-H14 aluminum. Microstructural and fractographic analysis helped dissect and analyze the observed differences in these cases. Overall, these studies comprehensively explore the effects of geometrical parameters on magnitude and distribution of impulse manufacturing generated pressure, establishing key guidelines and models for continued development and implementation in commercial applications.
Effect of Jig Design and Assessment of Stress Distribution in Testing Metal-Ceramic Adhesion.
Özcan, Mutlu; Kojima, Alberto Noriyuki; Nishioka, Renato Sussumu; Mesquita, Alfredo Mikail Melo; Bottino, Marco Antonio; Filho, Gilberto Duarte
2016-12-01
In testing adhesion using shear bond test, a combination of shear and tensile forces occur at the interface, resulting in complex stresses. The jig designs used for this kind of test show variations in published studies, complicating direct comparison between studies. This study evaluated the effect of different jig designs on metal-ceramic bond strength and assessed the stress distribution at the interface using finite element analysis (FEA). Metal-ceramic (Metal: Ni-Cr, Wiron 99, Bego; Ceramic: Vita Omega 900, Vita) specimens (N = 36) (diameter: 4 mm, veneer thickness: 4 mm; base diameter: 5 mm, thickness: 1 mm) were fabricated and randomly divided into three groups (n = 12 per group) to be tested using one of the following jig designs: (a) chisel (CH) (ISO 11405), (b) steel strip (SS), (c) piston (PI). Metal-ceramic interfaces were loaded under shear until debonding in a universal testing machine (0.5 mm/min). Failure types were evaluated using scanning electron microscopy (SEM). FEA was used to study the stress distribution using different jigs. Metal-ceramic bond strength data (MPa) were analyzed using ANOVA and Tukey's tests (α = 0.05). The jig type significantly affected the bond results (p = 0.0001). PI type of jig presented the highest results (MPa) (p < 0.05) (58.2 ± 14.8), followed by CH (38.7 ± 7.6) and SS jig type (23.3 ± 4.2) (p < 0.05). Failure types were exclusively a combination of cohesive failure in the opaque ceramic and adhesive interface failure. FEA analysis indicated that the SS jig presented slightly more stress formation than with the CH jig. The PI jig presented small stress concentration with more homogeneous force distribution compared to the CH jig where the stress concentrated in the area where the force was applied. Metal-ceramic bond strength was affected by the jig design. Accordingly, the results of in vitro studies on metal-ceramic adhesion should be evaluated with caution. When adhesion of ceramic materials to metals is evaluated in in vitro studies, it should be noted that the loading jig type affects the results. Clinical observations should report on the location and type of ceramic fractures in metal-ceramic reconstructions so that the most relevant test method can be identified. © 2015 by the American College of Prosthodontists.
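For readers who want to reproduce the comparison logic, the sketch below runs a one-way ANOVA followed by Tukey's HSD on simulated bond-strength samples whose means and standard deviations are set near the reported group values; the generated numbers are not the study's raw data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Simulated bond strengths (MPa), n = 12 per jig, centred near the reported means/SDs.
piston = rng.normal(58.2, 14.8, 12)
chisel = rng.normal(38.7, 7.6, 12)
strip  = rng.normal(23.3, 4.2, 12)

f_stat, p_val = stats.f_oneway(piston, chisel, strip)   # does jig design matter at all?
print(f"ANOVA: F = {f_stat:.1f}, p = {p_val:.4g}")

# Tukey's HSD then shows which pairs of jigs differ (alpha = 0.05).
print(stats.tukey_hsd(piston, chisel, strip))
```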
Hypervelocity impact testing of the Space Station utility distribution system carrier
NASA Technical Reports Server (NTRS)
Lazaroff, Scott
1993-01-01
A two-phase, joint JSC and McDonnell Douglas Aerospace-Huntington Beach hypervelocity impact (HVI) test program was initiated to develop an improved understanding of how meteoroid and orbital debris (M/OD) impacts affect the Space Station Freedom (SSF) avionic and fluid lines routed in the Utility Distribution System (UDS) carrier. This report documents the first phase of the test program, which covers nonpowered avionic line segment and pressurized fluid line segment HVI testing. From these tests, a better estimate of avionic line failures is approximately 15 per year, and this could drop to around 1 or 2 avionic line failures per year (depending upon the results of the second-phase testing of the powered avionic line at White Sands). For the fluid lines, the initial McDonnell Douglas analysis calculated 1 to 2 line failures over a 30-year period. The data obtained from these tests indicate that the number of predicted fluid line failures increases slightly, to as many as 3 in the first 10 years and up to 15 for the entire 30-year life of SSF.
NASA Technical Reports Server (NTRS)
Siemers, P. M., III; Henry, M. W.
1986-01-01
Pressure distribution test data obtained on a 0.10-scale model of the forward fuselage of the Space Shuttle Orbiter are presented without analysis. The tests were completed in the AEDC 16T Propulsion Wind Tunnel. The 0.10-scale model was tested at angles of attack from -2 deg to 18 deg and angles of sideslip from -6 to 6 deg at Mach numbers from 0.25 to 1.5. The tests were conducted in support of the development of the Shuttle Entry Air Data System (SEADS). In addition to modeling the 20 SEADS orifices, the wind-tunnel model was also instrumented with orifices to match Development Flight Instrumentation (DFI) port locations that existed on the Space Shuttle Orbiter Columbia (OV-102) during the Orbiter Flight Test program. This DFI simulation has provided a means of comparison between reentry flight pressure data and wind-tunnel and computational data.
NASA Technical Reports Server (NTRS)
Hill, Gerald M.; Evans, Richard K.
2009-01-01
A large-scale, distributed, high-speed data acquisition system (HSDAS) is currently being installed at the Space Power Facility (SPF) at NASA Glenn Research Center's Plum Brook Station in Sandusky, OH. This installation is being done as part of a facility construction project to add Vibro-acoustic Test Capabilities (VTC) to the current thermal-vacuum testing capability of SPF in support of the Orion Project's requirement for Space Environments Testing (SET). The HSDAS architecture is a modular design which utilizes fully remotely managed components, enabling the system to support multiple test locations with a wide range of measurement types and a very large system channel count. The architecture of the system is presented along with details on system scalability and measurement verification. In addition, the ability of the system to automate many of its processes, such as measurement verification and measurement system analysis, is also discussed.
NASA Astrophysics Data System (ADS)
Diem, Samuel; Vogt, Tobias; Hoehn, Eduard
2010-12-01
For groundwater transport modeling on a scale of 10-100 m, detailed information about the spatial distribution of hydraulic conductivity is of great importance. At a test site (10×20 m) in the alluvial gravel-and-sand aquifer of the perialpine Thur valley (Switzerland), four different methods were applied on different scales to assess this parameter. A comparison of the results showed that multilevel slug tests give reliable results at the required scale. For their analysis, a plausible value of the anisotropy ratio of hydraulic conductivity (Kv/Kh) is needed, which was calculated using a pumping test. The integral results of pumping tests provide an upper boundary of the natural spectrum of hydraulic conductivity at the scale of the test site. Flowmeter logs are recommended if the relative distribution of hydraulic conductivity is of primary importance, while sieve analyses can be used if only a rough estimate of hydraulic conductivity is acceptable.
NASA Astrophysics Data System (ADS)
Rest, J.; Hofman, G. L.; Kim, Yeon Soo
2009-04-01
An analytical model for the nucleation and growth of intra and intergranular fission-gas bubbles is used to characterize fission-gas bubble development in low-enriched U-Mo alloy fuel irradiated in the advanced test reactor in Idaho as part of the Reduced Enrichment for Research and Test Reactor (RERTR) program. Fuel burnup was limited to less than ˜7.8 at.% U in order to capture the fuel-swelling stage prior to irradiation-induced recrystallization. The model couples the calculation of the time evolution of the average intergranular bubble radius and number density to the calculation of the intergranular bubble-size distribution based on differential growth rate and sputtering coalescence processes. Recent results on TEM analysis of intragranular bubbles in U-Mo were used to set the irradiation-induced diffusivity and re-solution rate in the bubble-swelling model. Using these values, good agreement was obtained for intergranular bubble distribution compared against measured post-irradiation examination (PIE) data using grain-boundary diffusion enhancement factors of 15-125, depending on the Mo concentration. This range of enhancement factors is consistent with values obtained in the literature.
Dynamic stall characterization using modal analysis of phase-averaged pressure distributions
NASA Astrophysics Data System (ADS)
Harms, Tanner; Nikoueeyan, Pourya; Naughton, Jonathan
2017-11-01
Dynamic stall characterization by means of surface pressure measurements can simplify the time and cost associated with experimental investigation of unsteady airfoil aerodynamics. A unique test capability has been developed at University of Wyoming over the past few years that allows for time and cost efficient measurement of dynamic stall. A variety of rotorcraft and wind turbine airfoils have been tested under a variety of pitch oscillation conditions resulting in a range of dynamic stall behavior. Formation, development and separation of different flow structures are responsible for the complex aerodynamic loading behavior experienced during dynamic stall. These structures have unique signatures on the pressure distribution over the airfoil. This work investigates the statistical behavior of phase-averaged pressure distribution for different types of dynamic stall by means of modal analysis. The use of different modes to identify specific flow structures is being investigated. The use of these modes for different types of dynamic stall can provide a new approach for understanding and categorizing these flows. This work uses airfoil data acquired under Army contract W911W60160C-0021, DOE Grant DE-SC0001261, and a gift from BP Alternative Energy North America, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-08-01
The objective of this report is to develop a generalized methodology for examining water distribution systems for adjustable speed drive (ASD) applications and to provide an example (the City of Chicago 68th Street Water Pumping Station) using the methodology. The City of Chicago water system was chosen as the candidate for analysis because it has a large service area distribution network with no storage provisions after the distribution pumps. Many industrial motors operate at only one speed or a few speeds. By speeding up or slowing down, ASDs achieve gentle startups and gradual shutdowns, thereby providing plant equipment a longer life with fewer breakdowns while minimizing the energy requirements. The test program substantiated that ASDs enhance product quality and increase productivity in many industrial operations, including extended equipment life. 35 figs.
Bivariate drought frequency analysis using the copula method
NASA Astrophysics Data System (ADS)
Mirabbasi, Rasoul; Fakheri-Fard, Ahmad; Dinpashoh, Yagob
2012-04-01
Droughts are major natural hazards with significant environmental and economic impacts. In this study, two-dimensional copulas were applied to the analysis of the meteorological drought characteristics of the Sharafkhaneh gauge station, located in the northwest of Iran. Two major drought characteristics, duration and severity, as defined by the standardized precipitation index, were abstracted from observed drought events. Since drought duration and severity exhibited a significant correlation and since they were modeled using different distributions, copulas were used to construct the joint distribution function of the drought characteristics. The parameter of copulas was estimated using the method of the Inference Function for Margins. Several copulas were tested in order to determine the best data fit. According to the error analysis and the tail dependence coefficient, the Galambos copula provided the best fit for the observed drought data. Some bivariate probabilistic properties of droughts, based on the derived copula-based joint distribution, were also investigated. These probabilistic properties can provide useful information for water resource planning and management.
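The workflow (fit marginals, estimate copula dependence, query joint probabilities) can be sketched with a simpler Gumbel-Hougaard copula whose parameter is set from Kendall's tau; the paper's Inference Function for Margins estimation and Galambos copula are not reproduced, and the duration/severity data below are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Synthetic, positively correlated drought duration (months) and severity (SPI-based).
duration = stats.gamma.rvs(a=2.0, scale=2.0, size=80, random_state=rng)
severity = 0.8 * duration + stats.gamma.rvs(a=1.5, scale=1.0, size=80, random_state=rng)

# Step 1: fit the marginals separately (the spirit of inference-for-margins estimation).
d_params = stats.gamma.fit(duration, floc=0)
s_params = stats.gamma.fit(severity, floc=0)

# Step 2: set the Gumbel-Hougaard copula parameter from Kendall's tau (theta = 1/(1 - tau)).
tau, _ = stats.kendalltau(duration, severity)
theta = 1.0 / (1.0 - tau)

def gumbel_copula(u, v, theta):
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

# Step 3: joint exceedance probability P(duration > d0 and severity > s0).
d0, s0 = 6.0, 6.0
u0 = stats.gamma.cdf(d0, *d_params)
v0 = stats.gamma.cdf(s0, *s_params)
p_joint = 1.0 - u0 - v0 + gumbel_copula(u0, v0, theta)
print(f"theta ~ {theta:.2f},  P(D > {d0}, S > {s0}) ~ {p_joint:.3f}")
```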
Computational Analysis of Arc-Jet Wedge Tests Including Ablation and Shape Change
NASA Technical Reports Server (NTRS)
Goekcen, Tahir; Chen, Yih-Kanq; Skokova, Kristina A.; Milos, Frank S.
2010-01-01
Coupled fluid-material response analyses of arc-jet wedge ablation tests conducted in a NASA Ames arc-jet facility are considered. These tests were conducted using blunt wedge models placed in a free jet downstream of the 6-inch diameter conical nozzle in the Ames 60-MW Interaction Heating Facility. The fluid analysis includes computational Navier-Stokes simulations of the nonequilibrium flowfield in the facility nozzle and test box as well as the flowfield over the models. The material response analysis includes simulation of two-dimensional surface ablation and internal heat conduction, thermal decomposition, and pyrolysis gas flow. For ablating test articles undergoing shape change, the material response and fluid analyses are coupled in order to calculate the time dependent surface heating and pressure distributions that result from shape change. The ablating material used in these arc-jet tests was Phenolic Impregnated Carbon Ablator. Effects of the test article shape change on fluid and material response simulations are demonstrated, and computational predictions of surface recession, shape change, and in-depth temperatures are compared with the experimental measurements.
Detecting Multiple Model Components with the Likelihood Ratio Test
NASA Astrophysics Data System (ADS)
Protassov, R. S.; van Dyk, D. A.
2000-05-01
The likelihood ratio test (LRT) and the F-test, popularized in astrophysics by Bevington (Data Reduction and Error Analysis in the Physical Sciences) and Cash (1977, ApJ 228, 939), do not (even asymptotically) adhere to their nominal χ2 and F distributions in many statistical tests commonly used in astrophysics. The many legitimate uses of the LRT (see, e.g., the examples given in Cash (1977)) notwithstanding, it can be impossible to compute the false positive rate of the LRT or related tests such as the F-test. For example, although Cash (1977) did not suggest the LRT for detecting a line profile in a spectral model, it has become common practice despite the lack of certain required mathematical regularity conditions. Contrary to common practice, the nominal distribution of the LRT statistic should not be used in these situations. In this paper, we characterize an important class of problems where the LRT fails, show the non-standard behavior of the test in this setting, and provide a Bayesian alternative to the LRT, i.e., posterior predictive p-values. We emphasize that there are many legitimate uses of the LRT in astrophysics, and even when the LRT is inappropriate, there remain several statistical alternatives (e.g., judicious use of error bars and Bayes factors). We illustrate this point in our analysis of GRB 970508, which was studied by Piro et al. in ApJ 514:L73-L77, 1999.
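A toy version of the failure mode is sketched below: testing for a non-negative emission-line amplitude puts the null on the boundary of the parameter space, so the LRT statistic is calibrated here by simulating null spectra rather than by the nominal chi-square. This parametric-bootstrap calibration is a frequentist stand-in for the posterior predictive p-values the paper advocates; the spectrum, line profile, and count rates are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

rng = np.random.default_rng(6)
energy = np.linspace(1.0, 10.0, 64)                   # spectral bin centres
line = np.exp(-0.5 * ((energy - 5.0) / 0.3) ** 2)     # fixed candidate line profile

def nll(mu, counts):
    """Poisson negative log-likelihood (constant terms dropped)."""
    return np.sum(mu - counts * np.log(mu))

def lrt_stat(counts):
    # Null: flat continuum only.  Alternative: continuum plus a non-negative line amplitude.
    fit0 = minimize(lambda p: nll(np.full_like(energy, p[0]), counts),
                    x0=[1.0], bounds=[(1e-3, None)])
    fit1 = minimize(lambda p: nll(p[0] + p[1] * line, counts),
                    x0=[1.0, 0.1], bounds=[(1e-3, None), (0.0, None)])
    return max(0.0, 2.0 * (fit0.fun - fit1.fun))

# The amplitude sits on a boundary under the null, so calibrate by simulation instead of chi2(1).
null_stats = np.array([lrt_stat(rng.poisson(1.0, size=energy.size)) for _ in range(500)])
observed = lrt_stat(rng.poisson(1.0 + 1.5 * line))
print("simulation-based p-value:", np.mean(null_stats >= observed))
print("nominal chi2(1) p-value :", chi2.sf(observed, df=1))
```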
Further Progress Applying the Generalized Wigner Distribution to Analysis of Vicinal Surfaces
NASA Astrophysics Data System (ADS)
Einstein, T. L.; Richards, Howard L.; Cohen, S. D.
2001-03-01
Terrace width distributions (TWDs) can be well fit by the generalized Wigner distribution (GWD), generally better than by conventional Gaussians; the GWD thus offers a convenient way to estimate the dimensionless elastic repulsion strength Ã from σ², the TWD variance (T.L. Einstein and O. Pierre-Louis, Surface Sci. 424, L299 (1999)). The GWD σ² accurately reproduces values for the two exactly soluble cases at small Ã and in the asymptotic limit. Taxing numerical simulations show that the GWD σ² interpolates well between these limits. Extensive applications have been made to experimental data, especially on Cu (M. Giesen and T.L. Einstein, Surface Sci. 449, 191 (2000)). Recommended analysis procedures are catalogued (H.L. Richards, S.D. Cohen, T.L. Einstein, and M. Giesen, Surf. Sci. 453, 59 (2000)). Extensions of the GWD to multistep distributions are tested, with good agreement for second-neighbor distributions, less good for third (T.L. Einstein, H.L. Richards, S.D. Cohen, and O. Pierre-Louis, Proc. ISSI-PDSC2000, cond-mat/0012xxxxx). Alternatively, step-step correlation functions, about which there is more theoretical information, should be measured.
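For concreteness, a sketch of the GWD itself is given below: P(s) = a s^ϱ exp(-b s²) with a and b fixed by normalization and unit mean, so that the variance is a function of ϱ alone. The mapping Ã = ϱ(ϱ - 2)/4 in the last line is quoted from memory of the standard Calogero-Sutherland correspondence and should be checked against the cited papers.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma as G

def gwd(s, rho):
    """Generalized Wigner distribution P(s) = a * s**rho * exp(-b * s**2),
    with b and a fixed so that the distribution is normalized and has unit mean."""
    b = (G((rho + 2.0) / 2.0) / G((rho + 1.0) / 2.0)) ** 2
    a = 2.0 * b ** ((rho + 1.0) / 2.0) / G((rho + 1.0) / 2.0)
    return a * s ** rho * np.exp(-b * s ** 2)

rho = 4.0
norm = quad(lambda s: gwd(s, rho), 0, np.inf)[0]              # ~1 (normalization check)
mean = quad(lambda s: s * gwd(s, rho), 0, np.inf)[0]          # ~1 (unit-mean check)
var  = quad(lambda s: (s - 1.0) ** 2 * gwd(s, rho), 0, np.inf)[0]
print(norm, mean, var)                                        # var is the sigma^2 used to read off the repulsion

A_tilde = rho * (rho - 2.0) / 4.0                             # assumed rho <-> A-tilde mapping (verify)
print("A_tilde ~", A_tilde)
```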
Liu, Li; Sabo, Aniko; Neale, Benjamin M.; Nagaswamy, Uma; Stevens, Christine; Lim, Elaine; Bodea, Corneliu A.; Muzny, Donna; Reid, Jeffrey G.; Banks, Eric; Coon, Hillary; DePristo, Mark; Dinh, Huyen; Fennel, Tim; Flannick, Jason; Gabriel, Stacey; Garimella, Kiran; Gross, Shannon; Hawes, Alicia; Lewis, Lora; Makarov, Vladimir; Maguire, Jared; Newsham, Irene; Poplin, Ryan; Ripke, Stephan; Shakir, Khalid; Samocha, Kaitlin E.; Wu, Yuanqing; Boerwinkle, Eric; Buxbaum, Joseph D.; Cook, Edwin H.; Devlin, Bernie; Schellenberg, Gerard D.; Sutcliffe, James S.; Daly, Mark J.; Gibbs, Richard A.; Roeder, Kathryn
2013-01-01
We report on results from whole-exome sequencing (WES) of 1,039 subjects diagnosed with autism spectrum disorders (ASD) and 870 controls selected from the NIMH repository to be of similar ancestry to cases. The WES data came from two centers using different methods to produce sequence and to call variants from it. Therefore, an initial goal was to ensure the distribution of rare variation was similar for data from different centers. This proved straightforward by filtering called variants by fraction of missing data, read depth, and balance of alternative to reference reads. Results were evaluated using seven samples sequenced at both centers and by results from the association study. Next we addressed how the data and/or results from the centers should be combined. Gene-based analyses of association was an obvious choice, but should statistics for association be combined across centers (meta-analysis) or should data be combined and then analyzed (mega-analysis)? Because of the nature of many gene-based tests, we showed by theory and simulations that mega-analysis has better power than meta-analysis. Finally, before analyzing the data for association, we explored the impact of population structure on rare variant analysis in these data. Like other recent studies, we found evidence that population structure can confound case-control studies by the clustering of rare variants in ancestry space; yet, unlike some recent studies, for these data we found that principal component-based analyses were sufficient to control for ancestry and produce test statistics with appropriate distributions. After using a variety of gene-based tests and both meta- and mega-analysis, we found no new risk genes for ASD in this sample. Our results suggest that standard gene-based tests will require much larger samples of cases and controls before being effective for gene discovery, even for a disorder like ASD. PMID:23593035
DOE Office of Scientific and Technical Information (OSTI.GOV)
Isozaki, Toshikuni; Shibata, Katsuyuki
1997-04-01
Experimental and computed results applicable to Leak Before Break analysis are presented. The specific area of investigation is the effect of the temperature distribution changes due to wetting of the test pipe near the crack on the increase in the crack opening area and leak rate. Two 12-inch straight pipes subjected to both internal pressure and thermal load, but not to bending load, are modelled. The leak rate was found to be very susceptible to the metal temperature of the piping. In leak rate tests, therefore, it is recommended that temperature distribution be measured precisely for a wide area.
MTL distributed magnet measurement system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nogiec, J.M.; Craker, P.A.; Garbarini, J.P.
1993-04-01
The Magnet Test Laboratory (MTL) at the Superconducting Super Collider Laboratory will be required to precisely and reliably measure properties of magnets in a production environment. The extensive testing of the superconducting magnets comprises several types of measurements whose main purpose is to evaluate basic parameters characterizing the magnetic, mechanical and cryogenic properties of the magnets. The measurement process will produce a significant amount of data which will be subjected to complex analysis. Such massive measurements require a careful design of both the hardware and software of the computer systems, with a reliable, maximally automated system in mind. In order to fulfill this requirement a dedicated Distributed Magnet Measurement System (DMMS) is being developed.
NASA Astrophysics Data System (ADS)
Basu, A.; Das, B.; Middya, T. R.; Bhattacharya, D. P.
2017-01-01
The phonon growth characteristic in a degenerate semiconductor has been calculated under the condition of low temperature. If the lattice temperature is high, the energy of the intravalley acoustic phonon is negligibly small compared to the average thermal energy of the electrons. Hence one can traditionally assume the electron-phonon collisions to be elastic and approximate the Bose-Einstein (B.E.) distribution for the phonons by the simple equipartition law. However, in the present analysis at the low lattice temperatures, the interaction of the non equilibrium electrons with the acoustic phonons becomes inelastic and the simple equipartition law for the phonon distribution is not valid. Hence the analysis is made taking into account the inelastic collisions and the complete form of the B.E. distribution. The high-field distribution function of the carriers given by Fermi-Dirac (F.D.) function at the field dependent carrier temperature, has been approximated by a well tested model that apparently overcomes the intrinsic problem of correct evaluation of the integrals involving the product and powers of the Fermi function. Hence the results thus obtained are more reliable compared to the rough estimation that one may obtain from using the exact F.D. function, but taking recourse to some over simplified approximations.
Intensity distribution of the x ray source for the AXAF VETA-I mirror test
NASA Technical Reports Server (NTRS)
Zhao, Ping; Kellogg, Edwin M.; Schwartz, Daniel A.; Shao, Yibo; Fulton, M. Ann
1992-01-01
The X-ray generator for the AXAF VETA-I mirror test is an electron-impact X-ray source with various anode materials. The source sizes of different anodes and their intensity distributions were measured with a pinhole camera before the VETA-I test. The pinhole camera consists of a 30 micrometer diameter pinhole for imaging the source and a Microchannel Plate Imaging Detector with 25 micrometer FWHM spatial resolution for detecting and recording the image. The camera has a magnification factor of 8.79, which enables measurement of the detailed spatial structure of the source. The spot size, the intensity distribution, and the flux level of each source were measured with different operating parameters. During the VETA-I test, microscope pictures were taken of each anode used immediately after it was brought out of the source chamber. The source sizes and the intensity distribution structures are clearly shown in the pictures, and they agree with the results from the pinhole camera measurements. This paper presents the results of the above measurements. The results show that under operating conditions characteristic of the VETA-I test, all the source sizes have a FWHM of less than 0.45 mm. For a source of this size 528 meters away, the angular size seen by VETA is less than 0.17 arcsec, which is small compared to the on-ground VETA angular resolution (0.5 arcsec required and 0.22 arcsec measured). Even so, the results show that the intensity distributions of the sources have complicated structures. These results were crucial for the VETA data analysis and for obtaining the on-ground and predicted in-orbit VETA Point Response Function.
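The quoted angular scale follows from simple small-angle arithmetic; a quick check using the values stated in the abstract is shown below.

```python
import math

source_fwhm_m = 0.45e-3     # upper bound on source FWHM from the pinhole measurements (m)
distance_m = 528.0          # source-to-VETA distance (m)

angle_rad = source_fwhm_m / distance_m          # small-angle approximation
angle_arcsec = math.degrees(angle_rad) * 3600.0
print(f"{angle_arcsec:.3f} arcsec")             # ~0.18 arcsec for the 0.45 mm bound,
                                                # i.e. the sub-0.2 arcsec scale quoted above
```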
Active distribution network planning considering linearized system loss
NASA Astrophysics Data System (ADS)
Li, Xiao; Wang, Mingqiang; Xu, Hao
2018-02-01
In this paper, various distribution network planning techniques with DGs are reviewed, and a new distribution network planning method is proposed. It assumes that the location of DGs and the topology of the network are fixed. The proposed model optimizes the capacities of DG and the optimal distribution line capacity simultaneously by a cost/benefit analysis and the benefit is quantified by the reduction of the expected interruption cost. Besides, the network loss is explicitly analyzed in the paper. For simplicity, the network loss is appropriately simplified as a quadratic function of difference of voltage phase angle. Then it is further piecewise linearized. In this paper, a piecewise linearization technique with different segment lengths is proposed. To validate its effectiveness and superiority, the proposed distribution network planning model with elaborate linearization technique is tested on the IEEE 33-bus distribution network system.
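A minimal sketch of the kind of piecewise linearization described (a quadratic loss term approximated by chords over segments of unequal length) is given below; the loss coefficient, angle range, and breakpoints are illustrative and are not taken from the paper or the IEEE 33-bus case.

```python
import numpy as np

k = 0.05                                                      # illustrative loss coefficient
breakpoints = np.array([0.0, 0.05, 0.12, 0.25, 0.4, 0.6])     # finer segments near typical operation

def loss(delta):
    """Quadratic network-loss surrogate as a function of the phase-angle difference."""
    return k * delta ** 2

# Slope and intercept of each linear piece (chords of the quadratic between breakpoints).
slopes = (loss(breakpoints[1:]) - loss(breakpoints[:-1])) / np.diff(breakpoints)
intercepts = loss(breakpoints[:-1]) - slopes * breakpoints[:-1]

def loss_piecewise(delta):
    i = np.clip(np.searchsorted(breakpoints, delta, side="right") - 1, 0, len(slopes) - 1)
    return slopes[i] * delta + intercepts[i]

test = np.linspace(0.0, 0.6, 13)
print("worst-case approximation error:", np.max(np.abs(loss_piecewise(test) - loss(test))))
```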
Multivariate Analysis and Its Applications
1989-02-14
… defined in situations where measurements are taken on natural clusters of individuals, like brothers in a family. A number of problems arise in the study of … intraclass correlations. How do we estimate it when observations are available on clusters of different sizes? How do we test the hypothesis that the … the random variable y(X) = θ1 X + θ2 X^2 + … + θm X^m follows an exponential distribution with mean unity. Such a class of life distributions has a …
Atmospheric neutrino oscillations from upward throughgoing muon multiple scattering in MACRO
NASA Astrophysics Data System (ADS)
MACRO Collaboration; Ambrosio, M.; Antolini, R.; Bakari, D.; Baldini, A.; Barbarino, G. C.; Barish, B. C.; Battistoni, G.; Becherini, Y.; Bellotti, R.; Bemporad, C.; Bernardini, P.; Bilokon, H.; Bloise, C.; Bower, C.; Brigida, M.; Bussino, S.; Cafagna, F.; Calicchio, M.; Campana, D.; Carboni, M.; Caruso, R.; Cecchini, S.; Cei, F.; Chiarella, V.; Chiarusi, T.; Choudhary, B. C.; Coutu, S.; Cozzi, M.; de Cataldo, G.; Dekhissi, H.; de Marzo, C.; de Mitri, I.; Derkaoui, J.; de Vincenzi, M.; di Credico, A.; Favuzzi, C.; Forti, C.; Fusco, P.; Giacomelli, G.; Giannini, G.; Giglietto, N.; Giorgini, M.; Grassi, M.; Grillo, A.; Gustavino, C.; Habig, A.; Hanson, K.; Heinz, R.; Iarocci, E.; Katsavounidis, E.; Katsavounidis, I.; Kearns, E.; Kim, H.; Kumar, A.; Kyriazopoulou, S.; Lamanna, E.; Lane, C.; Levin, D. S.; Lipari, P.; Longo, M. J.; Loparco, F.; Maaroufi, F.; Mancarella, G.; Mandrioli, G.; Manzoor, S.; Margiotta, A.; Marini, A.; Martello, D.; Marzari-Chiesa, A.; Mazziotta, M. N.; Michael, D. G.; Mikheyev, S.; Monacelli, P.; Montaruli, T.; Monteno, M.; Mufson, S.; Musser, J.; Nicolò, D.; Nolty, R.; Orth, C.; Osteria, G.; Palamara, O.; Patera, V.; Patrizii, L.; Pazzi, R.; Peck, C. W.; Perrone, L.; Petrera, S.; Popa, V.; Rainò, A.; Reynoldson, J.; Ronga, F.; Rrhioua, A.; Satriano, C.; Scapparone, E.; Scholberg, K.; Sciubba, A.; Serra, P.; Sioli, M.; Sirri, G.; Sitta, M.; Spinelli, P.; Spinetti, M.; Spurio, M.; Steinberg, R.; Stone, J. L.; Sulak, L. R.; Surdo, A.; Tarlè, G.; Togo, V.; Vakili, M.; Walter, C. W.; Webb, R.
2003-07-01
The energy of atmospheric neutrinos detected by MACRO was estimated using multiple Coulomb scattering of upward throughgoing muons. This analysis allows a test of atmospheric neutrino oscillations, relying on the distortion of the muon energy distribution. These results have been combined with those coming from the upward throughgoing muon angular distribution only. Both analyses are independent of the neutrino flux normalization and provide strong evidence, above the 4σ level, in favour of neutrino oscillations.
Vuillaume, P; Bruyere, V; Aubert, M
1998-01-01
In a plateau and hill region of France (the Doubs), two protocols of rabies vaccine bait distribution targeted at foxes were compared: helicopter distribution of vaccine baits alone (control zone) and a combined aerial distribution by helicopter with an additional deposit of vaccine baits at fox den entrances by foot (test zone). In the test zone, covering an area of 436 km2, baits were distributed by helicopter at a rate of 13.4 baits/km2. Additionally, an average of 11.4 vaccine baits per den were distributed terrestrially by 110 persons at the entrances of 871 fox dens (9,964 baits in total). In this test zone, 90% of the young foxes were marked with tetracycline, which permitted estimation of bait consumption; however, only 38% had significant titres of rabies antibodies, and fewer than one fox cub per 2.4 of those having consumed at least one bait was immunized. In the control zone, these percentages were significantly lower (35 and 17%, respectively, and one fox cub per 4.2). The relative lack of benefit between bait uptake and rate of immunological response may be due to maternal immunity, which could have interfered with active immunization of the fox cubs. A booster effect following a second distribution of baits by foot may be suggested in both adult foxes and their offspring. That these baits needed to be terrestrially distributed in order to obtain a booster effect is uncertain. Terrestrial distribution at fox den entrances is difficult to do and entails additional expenses not incurred in aerial distribution. The cost of terrestrial vaccination is 3.5 times higher than classical aerial vaccination and takes 63.5 times longer. A cost-effectiveness analysis of this type of supplementary terrestrial intervention determined that bait deposit at den entrances can be recommended for restricted areas, where residual foci exist, as a complement to the aerial distribution of baits.
Simultaneous Comparison of Two Roller Compaction Techniques and Two Particle Size Analysis Methods.
Saarinen, Tuomas; Antikainen, Osmo; Yliruusi, Jouko
2017-11-01
A new dry granulation technique, gas-assisted roller compaction (GARC), was compared with conventional roller compaction (CRC) by manufacturing 34 granulation batches. The process variables studied were roll pressure, roll speed, and sieve size of the conical mill. The main quality attributes measured were granule size and flow characteristics. Within granulations also the real applicability of two particle size analysis techniques, sieve analysis (SA) and fast imaging technique (Flashsizer, FS), was tested. All granules obtained were acceptable. In general, the particle size of GARC granules was slightly larger than that of CRC granules. In addition, the GARC granules had better flowability. For example, the tablet weight variation of GARC granules was close to 2%, indicating good flowing and packing characteristics. The comparison of the two particle size analysis techniques showed that SA was more accurate in determining wide and bimodal size distributions while FS showed narrower and mono-modal distributions. However, both techniques gave good estimates for mean granule sizes. Overall, SA was a time-consuming but accurate technique that provided reliable information for the entire granule size distribution. By contrast, FS oversimplified the shape of the size distribution, but nevertheless yielded acceptable estimates for mean particle size. In general, FS was two to three orders of magnitude faster than SA.
Preparation, testing and analysis of zinc diffusion samples, NASA Skylab experiment M-558
NASA Technical Reports Server (NTRS)
Braski, D. N.; Kobisk, E. H.; Odonnell, F. R.
1974-01-01
Transport mechanisms of zinc atoms in molten zinc were investigated by radiotracer techniques in unit and in near-zero gravity environments. Each melt in the Skylab flight experiments was maintained in a thermal gradient of 420 C to 790 C. Similar tests were performed in a unit gravity environment for comparison. After melting in the gradient furnace followed by a thermal soak period (the latter was used for flight samples only), the samples were cooled and analyzed for Zn-65 distribution. All samples melted in a unit gravity environment were found to have uniform Zn-65 distribution - no concentration gradient was observed even when the sample was brought rapidly to melting and then quenched. Space-melted samples, however, showed textbook distributions, obviously the result of diffusion. It was evident that convection phenomena were the dominant factors influencing zinc transport in unit gravity experiments, while diffusion was the dominant factor in near-zero gravity experiments.
F-16XL-2 Supersonic Laminar Flow Control Flight Test Experiment
NASA Technical Reports Server (NTRS)
Anders, Scott G.; Fischer, Michael C.
1999-01-01
The F-16XL-2 Supersonic Laminar Flow Control Flight Test Experiment was part of the NASA High-Speed Research Program. The goal of the experiment was to demonstrate extensive laminar flow, to validate computational fluid dynamics (CFD) codes and design methodology, and to establish laminar flow control design criteria. Topics include the flight test hardware and design, airplane modification, the pressure and suction distributions achieved, the laminar flow achieved, and the data analysis and code correlation.
de Araujo-Barbosa, Paulo Henrique Ferreira; de Menezes, Lidiane Teles; Costa, Abraão Souza; Couto Paz, Clarissa Cardoso Dos Santos; Fachin-Martins, Emerson
2015-05-01
Described as an alternative way of assessing weight-bearing asymmetries, measures obtained from digital scales have been used as an index to classify weight-bearing distribution. This study aimed to describe the intra-test and test/retest reliability of these measures in subjects with and without hemiparesis during quiet stance. The percentage of body weight borne by one limb was calculated for a sample of subjects with hemiparesis and for a control group matched by gender and age. A two-way analysis of variance was used to verify intra-test reliability, based on the differences between the averages of the measures obtained during single, double or triple trials. The intra-class correlation coefficient (ICC) was used and the data were plotted using the Bland-Altman method. The intra-test analysis showed significant differences, observed only in the hemiparesis group, between the measures obtained by single and triple trials. Excellent to moderate ICC values (0.69-0.84) between test and retest were observed in the hemiparesis group, while for the control group ICC values (0.41-0.74) were classified as moderate, progressing from almost poor for measures obtained by a single trial to almost excellent for those obtained by triple trials. In conclusion, good reliability, ranging from moderate to excellent, was found for participants with and without hemiparesis. Moreover, repeatability improved with fewer trials for participants with hemiparesis and with more trials for participants without hemiparesis.
heterogeneous mixture distributions for multi-source extreme rainfall
NASA Astrophysics Data System (ADS)
Ouarda, T.; Shin, J.; Lee, T. S.
2013-12-01
Mixture distributions have been used to model hydro-meteorological variables that show mixture distributional characteristics, e.g. bimodality. Homogeneous mixture (HOM) distributions (e.g. Normal-Normal and Gumbel-Gumbel) have traditionally been applied to hydro-meteorological variables. However, there is no reason to restrict the mixture to components of one identical type. It might be beneficial to characterize the statistical behavior of hydro-meteorological variables by applying heterogeneous mixture (HTM) distributions such as Normal-Gamma. In the present work, we focus on assessing the suitability of HTM distributions for the frequency analysis of hydro-meteorological variables. To estimate the parameters of HTM distributions, a meta-heuristic algorithm (a genetic algorithm) is employed to maximize the likelihood function. A number of distributions are compared, including the Gamma-Extreme value type-one (EV1) HTM distribution, the EV1-EV1 HOM distribution, and the EV1 distribution. The proposed distribution models are applied to annual maximum precipitation data in South Korea. The Akaike Information Criterion (AIC), the root mean squared error (RMSE) and the log-likelihood are used as measures of goodness-of-fit of the tested distributions. Results indicate that the HTM distribution (Gamma-EV1) presents the best fit. The HTM distribution shows significant improvement in the estimation of quantiles corresponding to the 20-year return period. It is shown that extreme rainfall in the coastal region of South Korea presents strong heterogeneous mixture distributional characteristics. Results indicate that HTM distributions are a good alternative for the frequency analysis of hydro-meteorological variables when disparate statistical characteristics are present.
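As an illustration of the idea, the following is a minimal sketch of fitting a two-component heterogeneous Gamma-Gumbel mixture by maximizing the log-likelihood with an evolutionary optimizer. The synthetic data, parameter bounds, and the use of scipy's differential_evolution in place of the paper's genetic algorithm are assumptions made only for this example.

```python
# Hedged sketch: fit a heterogeneous Gamma-Gumbel mixture to annual-maximum data
# by maximizing the log-likelihood with an evolutionary optimizer. The bounds,
# synthetic data, and use of scipy's differential_evolution (standing in for the
# paper's genetic algorithm) are assumptions for illustration only.
import numpy as np
from scipy import stats
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)
# Synthetic "annual maximum rainfall" sample with a bimodal flavour (hypothetical).
x = np.concatenate([stats.gamma.rvs(5.0, scale=20.0, size=150, random_state=rng),
                    stats.gumbel_r.rvs(loc=250.0, scale=40.0, size=100, random_state=rng)])

def neg_loglik(theta):
    w, k, scale_g, loc_ev, scale_ev = theta
    pdf = (w * stats.gamma.pdf(x, k, scale=scale_g)
           + (1.0 - w) * stats.gumbel_r.pdf(x, loc=loc_ev, scale=scale_ev))
    return -np.sum(np.log(pdf + 1e-300))   # guard against log(0)

bounds = [(0.01, 0.99), (0.5, 50.0), (1.0, 100.0), (50.0, 500.0), (1.0, 200.0)]
res = differential_evolution(neg_loglik, bounds, seed=1)
k_params = len(bounds)
aic = 2 * k_params + 2 * res.fun          # AIC = 2k - 2 log L
print("fitted parameters:", np.round(res.x, 3))
print("AIC:", round(aic, 1))
```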
Wang, Ling; Xia, Jie-lai; Yu, Li-li; Li, Chan-juan; Wang, Su-zhen
2008-06-01
To explore several numerical methods for an ordinal variable in a one-way ordinal contingency table and their interrelationships, and to compare the corresponding statistical analysis methods such as Ridit analysis and the rank sum test. Formula deduction was based on five simplified grading approaches, including rank_r(i), ridit_r(i), ridit_r(ci), ridit_r(mi), and table scores. A practical data set from clinical practice (testing the effect of Shiwei solution in the treatment of chronic tracheitis) was verified with SAS 8.2. Because of the linear relationship rank_r(i) = N ridit_r(i) + 1/2 = N ridit_r(ci) = (N + 1) ridit_r(mi), the exact chi2 values in Ridit analysis based on ridit_r(i), ridit_r(ci), and ridit_r(mi) were completely the same, and they were equivalent to the Kruskal-Wallis H test. Traditional Ridit analysis is based on ridit_r(i), and its corresponding chi2 value, calculated with an approximate variance (1/12), is conservative. The exact chi2 test of Ridit analysis should be used when comparing multiple groups in clinical research because of its special merits, such as the distribution of mean ridit values on (0,1) and clear graphical expression. The exact chi2 test of Ridit analysis can be output directly by PROC FREQ of SAS 8.2 with the ridit and modridit options (SCORES=). The exact chi2 test of Ridit analysis is equivalent to the Kruskal-Wallis H test, and should be used when comparing multiple groups in clinical research.
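A minimal sketch of the two quantities being compared follows: mean ridits computed from a one-way ordinal table and the Kruskal-Wallis H test on the same data. The 3x4 table of counts is hypothetical, and the exact chi-square form of Ridit analysis itself is not reproduced.

```python
# Hedged sketch: mean ridits for a one-way ordinal table and the Kruskal-Wallis
# H test, which the abstract reports as equivalent to the exact chi-square test
# of Ridit analysis. The 3x4 ordinal table below is hypothetical.
import numpy as np
from scipy import stats

# rows = treatment groups, columns = ordered response categories (counts)
table = np.array([[10, 20, 15, 5],
                  [ 6, 18, 20, 11],
                  [ 4, 12, 22, 17]])

col_totals = table.sum(axis=0)
N = col_totals.sum()
# Reference ridits: cumulative proportion below a category plus half its own proportion.
cum = np.cumsum(col_totals) - col_totals
ridits = (cum + 0.5 * col_totals) / N
mean_ridits = (table * ridits).sum(axis=1) / table.sum(axis=1)
print("mean ridit per group:", np.round(mean_ridits, 3))

# Kruskal-Wallis on the same data, expanding counts into per-subject ordinal scores.
groups = [np.repeat(np.arange(table.shape[1]), row) for row in table]
H, p = stats.kruskal(*groups)
print(f"Kruskal-Wallis H = {H:.3f}, p = {p:.4f}")
```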
Commissioning and Testing the 1970's Era LASS Solenoid Magnet in JLab's Hall D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ballard, Joshua T.; Biallas, George H.; Brown, G.
2015-06-01
JLab refurbished and reconfigured the LASS 1.85 m bore solenoid and installed it as the principal analysis magnet for nuclear physics in the newly constructed Hall D at Jefferson Lab. The magnet contains four superconducting coils within an iron yoke. The magnet was built in the early 1970s at the Stanford Linear Accelerator Center and used a second time at Los Alamos National Laboratory. The coils were extensively refurbished and individually tested by JLab. A new cryogenic distribution box provides cryogens and their control valving, current distribution bus, and instrumentation pass-through. A repurposed CTI 2800 refrigerator system and a new transfer line complete the system. We describe the re-configuration, the process and problems of re-commissioning the magnet, and the results of testing the completed magnet.
Basic biostatistics for post-graduate students
Dakhale, Ganesh N.; Hiware, Sachin K.; Shinde, Abhijit T.; Mahatme, Mohini S.
2012-01-01
Statistical methods are important to draw valid conclusions from the obtained data. This article provides background information on fundamental methods and techniques in biostatistics for the use of postgraduate students. The main focus is on types of data, measures of central tendency and variation, and basic tests that are useful for the analysis of different types of observations. A few topics, such as the normal distribution, calculation of sample size, level of significance, the null hypothesis, indices of variability, and different tests, are explained in detail with suitable examples. Using these guidelines, we are confident that postgraduate students will be able to classify the distribution of data and apply the proper test. Information is also given on various free software programs and websites useful for statistical calculations. Thus, postgraduate students will benefit whether they opt for academics or for industry. PMID:23087501
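As a concrete instance of one of the topics listed, the sketch below applies the textbook sample-size formula for comparing two means; the effect size and standard deviation are hypothetical values chosen only to show the calculation.

```python
# Hedged sketch: the textbook sample-size calculation for comparing two means
# (two-sided alpha, power 1-beta). The effect size and variability below are
# hypothetical numbers chosen only to illustrate the formula
#   n per group = 2 * (z_{1-alpha/2} + z_{1-beta})^2 * sigma^2 / delta^2.
import math
from scipy.stats import norm

alpha, power = 0.05, 0.80
sigma, delta = 10.0, 5.0          # common SD and minimal difference to detect (assumed)

z_alpha = norm.ppf(1 - alpha / 2)
z_beta = norm.ppf(power)
n_per_group = 2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2
print("n per group:", math.ceil(n_per_group))   # about 63 with these assumptions
```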
New advances in the statistical parton distributions approach
NASA Astrophysics Data System (ADS)
Soffer, Jacques; Bourrely, Claude
2016-03-01
The quantum statistical parton distributions approach proposed more than a decade ago is revisited by considering a larger set of recent and accurate Deep Inelastic Scattering experimental results. It enables us to improve the description of the data by means of a new determination of the parton distributions. This global next-to-leading order QCD analysis leads to a good description of several structure functions, involving unpolarized parton distributions and helicity distributions, in terms of a rather small number of free parameters. Several serious challenging issues remain. The predictions of this theoretical approach will be tested for single-jet production and charge asymmetry in W± production in p̄p and pp collisions up to LHC energies, using recent data and also forthcoming experimental results. Presented by J. Soffer at POETIC 2015.
FMRI group analysis combining effect estimates and their variances
Chen, Gang; Saad, Ziad S.; Nath, Audrey R.; Beauchamp, Michael S.; Cox, Robert W.
2012-01-01
Conventional functional magnetic resonance imaging (FMRI) group analysis makes two key assumptions that are not always justified. First, the data from each subject is condensed into a single number per voxel, under the assumption that within-subject variance for the effect of interest is the same across all subjects or is negligible relative to the cross-subject variance. Second, it is assumed that all data values are drawn from the same Gaussian distribution with no outliers. We propose an approach that does not make such strong assumptions, and present a computationally efficient frequentist approach to FMRI group analysis, which we term mixed-effects multilevel analysis (MEMA), that incorporates both the variability across subjects and the precision estimate of each effect of interest from individual subject analyses. On average, the more accurate tests result in higher statistical power, especially when conventional variance assumptions do not hold, or in the presence of outliers. In addition, various heterogeneity measures are available with MEMA that may assist the investigator in further improving the modeling. Our method allows group effect t-tests and comparisons among conditions and among groups. In addition, it has the capability to incorporate subject-specific covariates such as age, IQ, or behavioral data. Simulations were performed to illustrate power comparisons and the capability of controlling type I errors among various significance testing methods, and the results indicated that the testing statistic we adopted struck a good balance between power gain and type I error control. Our approach is instantiated in an open-source, freely distributed program that may be used on any dataset stored in the Neuroimaging Informatics Technology Initiative (NIfTI) format. To date, the main impediment for more accurate testing that incorporates both within- and cross-subject variability has been the high computational cost. Our efficient implementation makes this approach practical. We recommend its use in lieu of the less accurate approach in the conventional group analysis. PMID:22245637
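To make the "effect estimates plus their variances" idea concrete, here is a minimal sketch of an inverse-variance random-effects combination (DerSimonian-Laird moment estimator) for a single voxel. This is a generic illustration under assumed per-subject numbers, not the MEMA implementation itself.

```python
# Hedged sketch: combining per-subject effect estimates with their variances for
# one voxel using an inverse-variance random-effects model (DerSimonian-Laird
# moment estimator). Generic illustration only; not the MEMA/3dMEMA algorithm.
import numpy as np
from scipy import stats

beta = np.array([0.8, 1.2, 0.5, 1.0, 1.4, 0.9, 0.7, 1.1])          # per-subject effects (hypothetical)
var = np.array([0.10, 0.05, 0.20, 0.08, 0.12, 0.07, 0.15, 0.06])   # within-subject variances (hypothetical)

w = 1.0 / var
beta_fixed = np.sum(w * beta) / np.sum(w)
Q = np.sum(w * (beta - beta_fixed) ** 2)                 # Cochran's Q heterogeneity
k = len(beta)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

w_star = 1.0 / (var + tau2)                              # random-effects weights
beta_re = np.sum(w_star * beta) / np.sum(w_star)
se_re = np.sqrt(1.0 / np.sum(w_star))
z = beta_re / se_re
p = 2 * stats.norm.sf(abs(z))
print(f"group effect = {beta_re:.3f} +/- {se_re:.3f}, z = {z:.2f}, p = {p:.4f}, tau^2 = {tau2:.3f}")
```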
Interoperability Outlook in the Big Data Future
NASA Astrophysics Data System (ADS)
Kuo, K. S.; Ramachandran, R.
2015-12-01
The establishment of distributed active archive centers (DAACs) as data warehouses and the standardization of file formats by NASA's Earth Observing System Data Information System (EOSDIS) doubtlessly propelled the interoperability of NASA Earth science data to unprecedented heights in the 1990s. However, we obviously still feel wanting two decades later. We believe the inadequate interoperability we experience is a result of the current practice in which data are first packaged into files before distribution and only the metadata of these files are cataloged into databases and become searchable. Data therefore cannot be efficiently filtered. Any extensive study thus requires downloading large volumes of data files to a local system for processing and analysis. The need to download data not only creates duplication and inefficiency but also further impedes interoperability, because the analysis has to be performed locally by individual researchers in individual institutions. Each institution or researcher often has its or their own preference in the choice of data management practices as well as programming languages. Analysis results (derived data) so produced are thus subject to the differences of these practices, which later form formidable barriers to interoperability. A number of Big Data technologies are currently being examined and tested to address Big Earth Data issues. These technologies share one common characteristic: exploiting compute and storage affinity to more efficiently analyze large volumes and great varieties of data. Distributed active "archive" centers are likely to evolve into distributed active "analysis" centers, which not only archive data but also provide analysis services right where the data reside. "Analysis" will become the more visible function of these centers. It is thus reasonable to expect interoperability to improve because analysis, in addition to data, becomes more centralized. Within a "distributed active analysis center" interoperability is almost guaranteed because data, analysis, and results can all be readily shared and reused. Effectively, with the establishment of "distributed active analysis centers", interoperation turns from a many-to-many problem into a less complicated few-to-few problem and becomes easier to solve.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmer, A L; University of Surrey, Guildford, Surrey; Bradley, D A
Purpose: HDR brachytherapy is undergoing significant development, and quality assurance (QA) checks must keep pace. Current recommendations do not adequately verify delivered against planned dose distributions: this is particularly relevant for new treatment planning system (TPS) calculation algorithms (non TG-43 based) and an era of significant patient-specific plan optimisation. Full system checks are desirable in modern QA recommendations, complementary to device-centric individual tests. We present a QA system incorporating TPS calculation, dose distribution export, HDR unit performance, and dose distribution measurement. Such an approach, more common in external beam radiotherapy, has not previously been reported in the literature for brachytherapy. Methods: Our QA method was tested at 24 UK brachytherapy centres. As a novel approach, we used the TPS DICOM RTDose file export to compare the planned dose distribution with that measured using Gafchromic EBT3 films placed around clinical brachytherapy treatment applicators. Gamma analysis was used to compare the dose distributions. Dose difference and distance to agreement were determined at prescription Point A. Accurate film dosimetry was achieved using a glass compression plate at scanning to ensure physically flat films, simultaneous scanning of known-dose films with measurement films, and triple-channel dosimetric analysis. Results: The mean gamma pass rate of RTDose compared to film-measured dose distributions was 98.1% at 3% (local), 2 mm criteria. The mean dose difference, measured to planned, at Point A was -0.5% for plastic treatment applicators and -2.4% for metal applicators, due to shielding not accounted for in the TPS. The mean distance to agreement was 0.6 mm. Conclusion: It is recommended to develop brachytherapy QA to include full-system verification of agreement between planned and delivered dose distributions. This is a novel approach for HDR brachytherapy QA. A methodology using advanced film dosimetry and gamma comparison to DICOM RTDose files has been demonstrated as suitable to fulfil this need.
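For readers unfamiliar with gamma analysis, the sketch below is a brute-force global 2D gamma-index comparison between a planned and a measured dose grid. The grid spacing, criteria, and synthetic dose arrays are assumptions; clinical film analysis uses local normalisation, interpolation and dose thresholds that are not shown here.

```python
# Hedged sketch: a brute-force global 2D gamma-index comparison between a planned
# and a measured dose grid (dose-difference / distance-to-agreement criteria).
# Synthetic data and simplified settings; not the study's film-analysis pipeline.
import numpy as np

def gamma_pass_rate(ref, meas, spacing_mm, dd=0.03, dta_mm=2.0, search_mm=6.0):
    """Fraction of measured points with gamma <= 1 (global dose normalisation)."""
    ny, nx = ref.shape
    dmax = ref.max()
    r = int(round(search_mm / spacing_mm))          # half-width of the search window
    passed = 0
    for j in range(ny):
        for i in range(nx):
            j0, j1 = max(0, j - r), min(ny, j + r + 1)
            i0, i1 = max(0, i - r), min(nx, i + r + 1)
            jj, ii = np.mgrid[j0:j1, i0:i1]
            dist2 = ((jj - j) ** 2 + (ii - i) ** 2) * spacing_mm ** 2
            dose2 = (ref[j0:j1, i0:i1] - meas[j, i]) ** 2
            gamma2 = dose2 / (dd * dmax) ** 2 + dist2 / dta_mm ** 2
            passed += gamma2.min() <= 1.0
    return passed / (nx * ny)

# Synthetic example: measured dose is the plan shifted by ~0.5 mm and scaled by 1%.
y, x = np.mgrid[0:40, 0:40] * 1.0
plan = 100.0 * np.exp(-((x - 20) ** 2 + (y - 20) ** 2) / 80.0)
meas = 1.01 * 100.0 * np.exp(-((x - 20.5) ** 2 + (y - 20) ** 2) / 80.0)
print(f"gamma pass rate: {100 * gamma_pass_rate(plan, meas, spacing_mm=1.0):.1f}%")
```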
Numerical analysis of beam with sinusoidally corrugated webs
NASA Astrophysics Data System (ADS)
Górecki, Marcin; Pieńko, Michał; Łagoda, Grażyna
2018-01-01
The paper presents the results of numerical tests of a steel beam with a sinusoidally corrugated web, performed in Autodesk Algor Simulation Professional 2010. The analysis was preceded by laboratory tests, including the beam's behaviour under four-point bending as well as a study of material characteristics. The significant web thickness and the tools available in the software allowed the behaviour of the plate girder to be analyzed as a beam, and also made it possible to observe the occurrence of stresses in the characteristic element, the corrugated web. The stress distribution observed on both surfaces of the web was analyzed.
Stress analysis and buckling of J-stiffened graphite-epoxy panel
NASA Technical Reports Server (NTRS)
Davis, R. C.
1980-01-01
A graphite-epoxy shear panel with bonded-on J stiffeners was investigated. The panel, loaded to buckling in a picture-frame shear test, is described. Two finite element models, each of which included the doubler material bonded to the panel skin under the stiffeners and at the panel edges, were used to make a stress analysis of the panel. The shear load distributions in the panel from two commonly used boundary conditions, applied shear load and applied displacement, were compared with the results from one of the finite element models that included the picture-frame test fixture.
Coupled CFD and Particle Vortex Transport Method: Wing Performance and Wake Validations
2008-06-26
A coupled RANS/particle vortex transport method (PVTM) analysis is validated against wind tunnel test data. The results obtained using the coupled RANS/PVTM analysis compare well with the experimental data, in particular the measured pressure distributions, loadings, and vortex parameters.
Joseph E. Aldy; Randall A. Kramer; Thomas P. Holmes
1999-01-01
Some critics in the environmental equity literature argue that low-income populations disproportionately bear environmental risks, while the wealthy and better educated gain disproportionately from protecting unique ecosystems. The authors test this hypothesis in an analysis of the decline of Southern Appalachian spruce-fir forests. They calculate willingness-to-pay...
A method of using cluster analysis to study statistical dependence in multivariate data
NASA Technical Reports Server (NTRS)
Borucki, W. J.; Card, D. H.; Lyle, G. C.
1975-01-01
A technique is presented that uses both cluster analysis and a Monte Carlo significance test of clusters to discover associations between variables in multidimensional data. The method is applied to an example of a noisy function in three-dimensional space, to a sample from a mixture of three bivariate normal distributions, and to the well-known Fisher's Iris data.
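A minimal sketch of the general idea, clustering the data and assessing the clusters with a Monte Carlo test, is given below. The use of k-means and a shuffle-columns independence null are illustrative assumptions, not the specific procedure of the report.

```python
# Hedged sketch: cluster the data and test cluster "reality" with a Monte Carlo
# permutation test, comparing the observed within-cluster dispersion to that of
# data with shuffled columns (independence null). k-means and this particular
# null are assumptions for illustration, not the report's exact method.
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(0)
# Hypothetical 3-variable data in which two variables are dependent.
n = 200
x1 = rng.normal(size=n)
data = np.column_stack([x1, x1 + 0.3 * rng.normal(size=n), rng.normal(size=n)])

def within_ss(points, k=3):
    centroids, labels = kmeans2(points, k, seed=1, minit="++")
    return np.sum((points - centroids[labels]) ** 2)

obs = within_ss(data)
null = []
for _ in range(200):
    shuffled = np.column_stack([rng.permutation(col) for col in data.T])
    null.append(within_ss(shuffled))
p_value = np.mean(np.array(null) <= obs)   # small p: association beyond independence
print(f"observed WSS = {obs:.1f}, p = {p_value:.3f}")
```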
Anantha M. Prasad
2015-01-01
I test for macroscale intraspecific variation of abundance, mortality, and regeneration of four eastern US tree species (Tsuga canadensis, Betula lenta, Liriodendron tulipifera, and Quercus prinus) by splitting them into three climatic zones based on plant hardiness zones (PHZs). The primary goals of the analysis are to assess the...
Neutron Physics Division progress report for period ending February 28, 1977
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maienschein, F.C.
1977-05-01
Summaries are given of research progress in the following areas: (1) measurements of cross sections and related quantities, (2) cross section evaluations and theory, (3) cross section processing, testing, and sensitivity analysis, (4) integral experiments and their analyses, (5) development of methods for shield and reactor analyses, (6) analyses for specific systems or applications, and (7) information analysis and distribution. (SDF)
Dai, Qi; Yang, Yanchun; Wang, Tianming
2008-10-15
Many proposed statistical measures can efficiently compare biological sequences and further infer their structures, functions and evolutionary information. They are related in spirit because all the ideas for sequence comparison try to use information on the k-word distributions, a Markov model, or both. Motivated by adding k-word distributions to a Markov model directly, we investigated two novel statistical measures for sequence comparison, called wre.k.r and S2.k.r. The proposed measures were tested by similarity search, evaluation on functionally related regulatory sequences, and phylogenetic analysis. This offers a systematic and quantitative experimental assessment of our measures. Moreover, we compared our achievements with those based on alignment or alignment-free methods. We grouped our experiments into two sets. The first, performed via ROC (receiver operating characteristic) analysis, aims at assessing the intrinsic ability of our statistical measures to search for similar sequences in a database and to discriminate functionally related regulatory sequences from unrelated sequences. The second aims at assessing how well our statistical measures can be used for phylogenetic analysis. The experimental assessment demonstrates that our similarity measures, which incorporate k-word distributions into a Markov model, are more efficient.
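To show the generic k-word idea the abstract builds on, the following sketch compares two DNA sequences via their k-mer frequency vectors and a cosine distance. It is not the wre.k.r or S2.k.r statistic, and the sequences are made up.

```python
# Hedged sketch: alignment-free comparison of two DNA sequences from their k-word
# (k-mer) frequency vectors using cosine distance. Generic illustration only; not
# the wre.k.r or S2.k.r measure from the paper.
from collections import Counter
from itertools import product
import math

def kword_freqs(seq, k=3):
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    words = ["".join(p) for p in product("ACGT", repeat=k)]
    return [counts[w] / total for w in words]

def cosine_distance(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return 1.0 - dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

seq_a = "ACGTACGTTGCAACGTGGCTAACGTACGTAGCTAGCTAGGATCCA"
seq_b = "ACGTACGATGCAACGTGGCTTACGTACGTAGCTAGCTAGGTTCCA"
print(f"3-word cosine distance: {cosine_distance(kword_freqs(seq_a), kword_freqs(seq_b)):.4f}")
```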
Soares, Carlos José; Raposo, Luís Henrique Araújo; Soares, Paulo Vinícius; Santos-Filho, Paulo César Freitas; Menezes, Murilo Sousa; Soares, Priscilla Barbosa Ferreira; Magalhães, Denildo
2010-02-01
To test the hypothesis that the type of cement used for fixation of cast dowel-and-cores might influence the fracture resistance, fracture mode, and stress distribution of single-rooted teeth restored with this class of metallic dowels. The coronal portion was removed from 40 bovine incisors, leaving a 15 mm root. After endodontic treatment and standardized root canal relief at 10 mm, the specimens were embedded in polystyrene resin, and the periodontal ligament was simulated with polyether impression material. The specimens were randomly divided into four groups (n = 10) and restored with Cu-Al cast dowel-and-cores cemented with one of four options: conventional glass ionomer cement (GI), resin-modified glass ionomer cement (GR), dual-cure resin cement (RC), or zinc-phosphate cement (ZP). Subsequently, the fracture resistance of the specimens was tested with a tangential load at a 135-degree angle and a 0.5 mm/min crosshead speed. Data were analyzed using one-way analysis of variance (ANOVA) and the Fisher test. Two-dimensional finite element analysis (2D-FEA) was then performed with representative models of each group, simulating a 100 microm cement layer. Results were analyzed based on von Mises stress distribution criteria. The mean fracture resistance values were (in N): RC, 838.2 +/- 135.9; GI, 772.4 +/- 169.8; GR, 613.4 +/- 157.5; ZP, 643.6 +/- 106.7. FEA revealed that RC and GR presented lower stress values than ZP and GI. Higher stress concentration coincided with more catastrophic failures and, consequently, with lower fracture resistance values. The type of cement influenced the fracture resistance, failure mode, and stress distribution of teeth restored with cast dowel-and-cores.
NASA Astrophysics Data System (ADS)
Wei, T. B.; Chen, Y. L.; Lin, H. R.; Huang, S. Y.; Yeh, T. C. J.; Wen, J. C.
2016-12-01
In groundwater studies, hydraulic tomography (HT) based on field-site pumping tests has been used by many researchers to estimate the heterogeneous spatial distribution of hydraulic properties, supporting the view that most field-site aquifers have heterogeneous hydrogeological parameter fields. Among the proposed approaches, Huang et al. [2011] used a non-redundant verification analysis, with the pumping well locations changed and the observation wells fixed, for both inverse and forward modeling, to demonstrate the feasibility of estimating the heterogeneous spatial distribution of hydraulic properties of a field-site aquifer with a steady-state model. The published literature, however, covers only steady-state, non-redundant verification with changed pumping-well locations and fixed observation wells. The various combinations of fixed or changed pumping-well locations with fixed observation wells (redundant verification) or changed observation wells (non-redundant verification) have not yet been explored for their influence on the hydraulic tomography method. In this study, redundant and non-redundant verification methods were carried out for forward modeling to examine their influence on hydraulic tomography in the transient case. The approaches were applied to an actual case at the NYUST campus site to demonstrate the effectiveness of hydraulic tomography and to confirm the feasibility of the inverse and forward analyses from the analysis results. Keywords: Hydraulic Tomography, Redundant Verification, Heterogeneous, Inverse, Forward
An integrated tool for loop calculations: AITALC
NASA Astrophysics Data System (ADS)
Lorca, Alejandro; Riemann, Tord
2006-01-01
AITALC, a new tool for automating loop calculations in high energy physics, is described. The package creates Fortran code for two-fermion scattering processes automatically, starting from the generation and analysis of the Feynman graphs. We describe the modules of the tool, the intercommunication between them and illustrate its use with three examples. Program summary: Title of the program: AITALC version 1.2.1 (9 August 2005). Catalogue identifier: ADWO. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWO. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computer: PC i386. Operating system: GNU/Linux, tested on different distributions (SuSE 8.2 to 9.3, Red Hat 7.2, Debian 3.0, Ubuntu 5.04); also on Solaris. Programming languages used: GNU Make, DIANA, FORM, FORTRAN77. Additional programs/libraries used: DIANA 2.35 (QGRAF 2.0), FORM 3.1, LoopTools 2.1 (FF). Memory required to execute with typical data: up to about 10 MB. No. of processors used: 1. No. of lines in distributed program, including test data, etc.: 40,926. No. of bytes in distributed program, including test data, etc.: 371,424. Distribution format: tar gzip file. High-speed storage required: from 1.5 to 30 MB, depending on modules present and unfolding of examples. Nature of the physical problem: calculation of differential cross sections for e+e- annihilation in the one-loop approximation. Method of solution: generation and perturbative analysis of Feynman diagrams with later evaluation of matrix elements and form factors. Restriction on the complexity of the problem: the limit of application is, for the moment, 2→2 particle reactions in the electroweak standard model. Typical running time: a few minutes, depending strongly on the complexity of the process and the FORTRAN compiler.
Adhesion of new bioactive glass coating.
Schrooten, J; Van Oosterwyck, H; Vander Sloten, J; Helsen, J A
1999-03-05
A valuable alternative to existing biomedical implant coatings is a bioactive glass (BAG) coating produced by reactive plasma spraying. A mechanical performance requirement of the utmost importance is the adhesion strength of the coating. Considering the application as a dental implant, a new adhesion test (shear test), close to the service conditions, was designed. A Ti6Al4V rod (3 mm) with a sprayed BAG coating of 50 microm was glued with an epoxy glue to a hollow cylindrical counterpart and used as such in the tensile machine. This test was evaluated by finite element analysis (FEA). Preliminary experiments showed that a conversion from shear to tensile adhesion strength is possible by using the von Mises criterion (σ = √3·τ), indicating that thin coatings of brittle materials can behave as a ductile material. The new coating technique was shown to produce a high-quality coating with an adhesion strength of 40.1 +/- 4.8 MPa in shear and 69.4 +/- 8.4 MPa in tension. The FEA revealed that there is not one homogeneously distributed shear stress but several nonhomogeneously distributed stress components (shear and tensile) present in the coating. This analysis indicated that real service conditions are much more complicated than standard adhesion tests. Copyright 1999 John Wiley & Sons, Inc.
Guttersrud, Øystein; Petterson, Kjell Sverre
2015-10-01
The present study validates a revised scale measuring individuals' level of the 'engagement in dietary behaviour' aspect of 'critical nutrition literacy' and describes how background factors affect this aspect of Norwegian tenth-grade students' nutrition literacy. Data were gathered electronically during a field trial of a standardised sample test in science. Test items and questionnaire constructs were distributed evenly across four electronic field-test booklets. Data management and analysis were performed using the RUMM2030 item analysis package and the IBM SPSS Statistics 20 statistical software package. Students responded on computers at school. Seven hundred and forty tenth-grade students at twenty-seven randomly sampled public schools were enrolled in the field-test study. The engagement in dietary behaviour scale and the self-efficacy in science scale were distributed to 178 of these students. The dietary behaviour scale and the self-efficacy in science scale proved to be valid, reliable and well-targeted instruments usable for the construction of measurements. Girls and students with high self-efficacy reported higher engagement in dietary behaviour than other students. Socio-economic status and scientific literacy, measured as ability in science with an achievement test, did not show correlations significantly different from zero with students' engagement in dietary behaviour.
A SIGNIFICANCE TEST FOR THE LASSO
Lockhart, Richard; Taylor, Jonathan; Tibshirani, Ryan J.; Tibshirani, Robert
2014-01-01
In the sparse linear regression setting, we consider testing the significance of the predictor variable that enters the current lasso model, in the sequence of models visited along the lasso solution path. We propose a simple test statistic based on lasso fitted values, called the covariance test statistic, and show that when the true model is linear, this statistic has an Exp(1) asymptotic distribution under the null hypothesis (the null being that all truly active variables are contained in the current lasso model). Our proof of this result for the special case of the first predictor to enter the model (i.e., testing for a single significant predictor variable against the global null) requires only weak assumptions on the predictor matrix X. On the other hand, our proof for a general step in the lasso path places further technical assumptions on X and the generative model, but still allows for the important high-dimensional case p > n, and does not necessarily require that the current lasso model achieves perfect recovery of the truly active variables. Of course, for testing the significance of an additional variable between two nested linear models, one typically uses the chi-squared test, comparing the drop in residual sum of squares (RSS) to a χ2 distribution with one degree of freedom. But when this additional variable is not fixed, and has been chosen adaptively or greedily, this test is no longer appropriate: adaptivity makes the drop in RSS stochastically much larger than χ2(1) under the null hypothesis. Our analysis explicitly accounts for adaptivity, as it must, since the lasso builds an adaptive sequence of linear models as the tuning parameter λ decreases. In this analysis, shrinkage plays a key role: though additional variables are chosen adaptively, the coefficients of lasso active variables are shrunken due to the ℓ1 penalty. Therefore, the test statistic (which is based on lasso fitted values) is in a sense balanced by these two opposing properties, adaptivity and shrinkage, and its null distribution is tractable and asymptotically Exp(1). PMID:25574062
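The sketch below is a Monte Carlo check of the Exp(1) claim for the first step under the global null, using the first-step form of the covariance statistic, T1 = λ1(λ1 − λ2)/σ², reported in the paper; the dimensions and standardisation choices are illustrative assumptions.

```python
# Hedged sketch: Monte Carlo check that the covariance test statistic for the
# first predictor to enter the lasso path is approximately Exp(1) under the
# global null. For the first step the statistic reduces (as reported in the
# paper) to T1 = lambda1*(lambda1 - lambda2)/sigma^2, where lambda1 >= lambda2
# are the two largest values of |X^T y|. Dimensions below are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, p, sigma, reps = 100, 50, 1.0, 2000

stats_T1 = np.empty(reps)
for r in range(reps):
    X = rng.normal(size=(n, p))
    X -= X.mean(axis=0)
    X /= np.linalg.norm(X, axis=0)        # columns standardised to unit norm
    y = sigma * rng.normal(size=n)        # global null: no true signal
    corr = np.sort(np.abs(X.T @ y))[::-1]
    lam1, lam2 = corr[0], corr[1]
    stats_T1[r] = lam1 * (lam1 - lam2) / sigma**2

# An Exp(1) variable has mean 1 and its 95th percentile is about 3.0.
print("mean of T1:", round(stats_T1.mean(), 3))
print("95th percentile:", round(np.quantile(stats_T1, 0.95), 3), "(Exp(1): ~3.0)")
```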
Deilkås, Ellen T; Hofoss, Dag
2008-09-22
How to protect patients from harm is a question of universal interest. Measuring and improving safety culture in care-giving units is an important strategy for promoting a safe environment for patients. The Safety Attitudes Questionnaire (SAQ) is the only instrument that measures safety culture in a way that correlates with patient outcome. We have translated the SAQ into Norwegian and validated the translated version. The psychometric properties of the translated questionnaire are presented in this article. The questionnaire was translated with the back-translation technique and tested in 47 clinical units in a Norwegian university hospital. SAQs (the Generic version, Short Form 2006, with two sets of questions on perceptions of management: unit management and hospital management) were distributed to 1911 frontline staff; 762 were distributed during unit meetings and 1149 through the postal system. Cronbach alphas, item-to-own correlations, and test-retest correlations were calculated, and response distribution analysis and confirmatory factor analysis were performed, as well as early validity tests. 1306 staff members completed and returned the questionnaire: a response rate of 68%. Questionnaire acceptability was good. The reliability measures were acceptable. The factor structure of the responses was tested by confirmatory factor analysis. 36 items were ascribed to seven underlying factors: Teamwork Climate, Safety Climate, Stress Recognition, Perceptions of Hospital Management, Perceptions of Unit Management, Working Conditions, and Job Satisfaction. Goodness-of-fit indices showed reasonable, but not indisputable, model fit. External validity indicators, including recognizability of results, correlations with "trigger tool"-identified adverse events, with patient satisfaction with hospitalization, patient reports of possible maltreatment, and patient evaluation of the organization of hospital work, provided preliminary validation. Based on the data from Akershus University Hospital, we conclude that the Norwegian translation of the SAQ showed satisfactory internal psychometric properties. With data from one hospital only, we cannot draw strong conclusions on its external validity. Further validation studies linking SAQ scores to patient outcome data should be performed.
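As a small illustration of one of the reliability measures reported, the sketch below computes Cronbach's alpha for a single questionnaire factor from a made-up item-response matrix.

```python
# Hedged sketch: Cronbach's alpha for one questionnaire factor, computed from a
# small, made-up item-response matrix (respondents x items on a 1-5 scale). This
# only illustrates the reliability coefficient mentioned in the abstract.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = items of one scale."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

responses = np.array([[4, 5, 4, 4],
                      [3, 3, 4, 3],
                      [5, 5, 5, 4],
                      [2, 3, 2, 3],
                      [4, 4, 5, 4],
                      [3, 2, 3, 3]])
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```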
NASA Technical Reports Server (NTRS)
Nicks, C. O.; Childs, D. W.
1984-01-01
The importance of seal behavior in rotordynamics is discussed and current annular seal theory is reviewed. Nelson's analytical-computational method for determining rotordynamic coefficients for this type of compressible-flow seal is outlined. Various means for the experimental identification of the dynamic coefficients are given, and the method employed at the Texas A and M University (TAMU) test facility is explained. The TAMU test apparatus is described, and the test procedures are discussed. Experimental results, including leakage, entrance-loss coefficients, pressure distributions, and rotordynamic coefficients for a smooth and a honeycomb constant-clearance seal, are presented and compared to theoretical results from Nelson's analysis. The results for both seals show little sensitivity to running speed over the test range. Agreement between test results and theory for leakage through the seal is satisfactory. Test results for direct stiffness show a greater sensitivity to fluid pre-rotation than predicted. Results also indicate that the deliberately roughened surface of the honeycomb seal provides improved stability versus the smooth seal.
de Mendoza, Guillermo; Traunspurger, Walter; Palomo, Alejandro; Catalan, Jordi
2017-05-01
Nematode species are widely tolerant of environmental conditions and disperse passively. Therefore, the species richness distribution in this group might largely depend on the topological distribution of the habitats and the main aerial and aquatic dispersal pathways connecting them. If so, nematode species richness distributions may serve as null models for evaluating those of other groups more affected by environmental gradients. We investigated this hypothesis in lakes across an altitudinal gradient in the Pyrenees. We compared the altitudinal distribution, environmental tolerance, and species richness of nematodes with those of three other invertebrate groups collected during the same sampling: oligochaetes, chironomids, and nonchironomid insects. We tested the altitudinal bias in distributions with t-tests and the significance of narrow-ranging altitudinal distributions with randomizations. We compared results between groups with Fisher's exact tests. We then explored the influence of environmental factors on species assemblages in all groups with redundancy analysis (RDA), using 28 environmental variables. Finally, we analyzed species richness patterns across altitude with simple linear and quadratic regressions. Nematode species were rarely biased from random distributions (5% of species), in contrast with other groups (35%, 47%, and 50%, respectively). The altitudinal bias most often shifted toward low altitudes (85% of biased species). Nematodes showed a lower portion of narrow-ranging species than any other group, and differed significantly from nonchironomid insects (10% and 43%, respectively). Environmental variables barely explained nematode assemblages (RDA adjusted R² = 0.02), in contrast with other groups (0.13, 0.19 and 0.24). Despite these substantial differences in the response to environmental factors, species richness across altitude was unimodal, peaking at mid elevations, in all groups. This similarity indicates that the spatial distribution of lakes across altitude is a primary driver of invertebrate richness. Provided that nematodes are ubiquitous, their distribution offers potential null models to investigate species richness across environmental gradients in other ecosystem types and biogeographic regions.
It's Time to Develop a New "Draft Test Protocol" for a Mars Sample Return Mission (or Two…).
Rummel, John D; Kminek, Gerhard
2018-04-01
The last time NASA envisioned a sample return mission from Mars, the development of a protocol to support the analysis of the samples in a containment facility resulted in a "Draft Test Protocol" that outlined required preparations "for the safe receiving, handling, testing, distributing, and archiving of martian materials here on Earth" (Rummel et al., 2002). This document comprised a specific protocol to be used to conduct a biohazard test for a returned martian sample, following the recommendations of the Space Studies Board of the US National Academy of Sciences. Given the planned launch of a sample-collecting and sample-caching rover (Mars 2020) in 2 years' time, and with a sample return planned for the end of the next decade, it is time to revisit the Draft Test Protocol to develop a sample analysis and biohazard test plan to meet the needs of these future missions. Key Words: Biohazard detection-Mars sample analysis-Sample receiving facility-Protocol-New analytical techniques-Robotic sample handling. Astrobiology 18, 377-380.
Enabling analytical and Modeling Tools for Enhanced Disease Surveillance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dawn K. Manley
2003-04-01
Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both the integration of multiple data sources and the development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome, as well as integrated security and role-based access control for communicating between distributed data and analytical tools. This work included construction of interfaces to various commercial database products and to one of the data analysis algorithms developed through this LDRD.
Effect of extreme data loss on heart rate signals quantified by entropy analysis
NASA Astrophysics Data System (ADS)
Li, Yu; Wang, Jun; Li, Jin; Liu, Dazhao
2015-02-01
The phenomenon of data loss always occurs in the analysis of large databases. Maintaining the stability of analysis results in the event of data loss is very important. In this paper, we used a segmentation approach to generate a synthetic signal in which data segments are randomly removed from the original signal according to a Gaussian distribution or an exponential distribution. The logistic map is then used for verification. Finally, two methods of measuring entropy, base-scale entropy and approximate entropy, are comparatively analyzed. Our results show the following: (1) Two key parameters, the percentage and the average length of removed data segments, can change the sequence complexity according to the logistic map tests. (2) The calculation results are notably stable for base-scale entropy analysis, which is not sensitive to data loss. (3) The loss percentage of HRV signals should be controlled below p = 30%, which can provide useful information in clinical applications.
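The sketch below illustrates the kind of experiment described: generate a logistic-map series, randomly delete segments of data, and compare the approximate entropy of the full and degraded series. The map parameter, loss percentage, and ApEn settings (m = 2, r = 0.2·SD) are illustrative assumptions; base-scale entropy is not implemented here.

```python
# Hedged sketch: logistic-map series with random segment deletion, compared via
# approximate entropy. Parameters are illustrative assumptions only.
import numpy as np

def logistic_map(n, r=4.0, x0=0.4):
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x

def approx_entropy(u, m=2, r_frac=0.2):
    """Approximate entropy ApEn(m, r) with r = r_frac * std(u)."""
    u = np.asarray(u, dtype=float)
    r = r_frac * u.std()
    def phi(m):
        n = len(u) - m + 1
        emb = np.array([u[i:i + m] for i in range(n)])
        # Chebyshev distance between all pairs of embedded vectors
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = (d <= r).sum(axis=1) / n
        return np.mean(np.log(c))
    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
x = logistic_map(800)

# Remove ~20% of the points in segments with exponentially distributed lengths.
keep = np.ones(len(x), dtype=bool)
while (~keep).mean() < 0.20:
    start = rng.integers(0, len(x))
    length = max(1, int(rng.exponential(scale=10)))
    keep[start:start + length] = False

print(f"ApEn full series:     {approx_entropy(x):.3f}")
print(f"ApEn after 20% loss:  {approx_entropy(x[keep]):.3f}")
```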
Investigation of water droplet trajectories within the NASA icing research tunnel
NASA Technical Reports Server (NTRS)
Reehorst, Andrew; Ibrahim, Mounir
1995-01-01
Water droplet trajectories within the NASA Lewis Research Center's Icing Research Tunnel (IRT) were studied through computer analysis. Of interest was the influence of the wind tunnel contraction and wind tunnel model blockage on the water droplet trajectories. The computer analysis was carried out with a program package consisting of a three-dimensional potential panel code and a three-dimensional droplet trajectory code. The wind tunnel contraction was found to influence the droplet size distribution and liquid water content distribution across the test section from that at the inlet. The wind tunnel walls were found to have negligible influence upon the impingement of water droplets upon a wing model.
^{16}C elastic scattering examined using several models at different energies
NASA Astrophysics Data System (ADS)
El-hammamy, M. N.; Attia, A.
2018-05-01
In the present paper, we report the first theoretical analysis of the ^{16}C + p reaction, investigating two elastic scattering angular distributions measured at high energy and at low energy for this system. Several models for the real part of the nuclear potential are tested within the optical model formalism. The imaginary potential has a Woods-Saxon shape with three free parameters. Two types of density distribution and three different cluster structures for ^{16}C are assumed in the analysis. The results are compared with each other as well as with the experimental data to assess the importance of the items studied.
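For reference, the commonly used Woods-Saxon form, with depth W0, radius parameter r0, and diffuseness a as the three free parameters (generic notation, not taken from the paper), is:

```latex
% Generic Woods-Saxon form factor; W_0, r_0 and a are the three free parameters.
W(r) = \frac{-\,W_0}{1 + \exp\!\left(\dfrac{r - R}{a}\right)}, \qquad R = r_0 A^{1/3}
```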
Rice, Stephen B; Chan, Christopher; Brown, Scott C; Eschbach, Peter; Han, Li; Ensor, David S; Stefaniak, Aleksandr B; Bonevich, John; Vladár, András E; Hight Walker, Angela R; Zheng, Jiwen; Starnes, Catherine; Stromberg, Arnold; Ye, Jia; Grulke, Eric A
2015-01-01
This paper reports an interlaboratory comparison that evaluated a protocol for measuring and analysing the particle size distribution of discrete, metallic, spheroidal nanoparticles using transmission electron microscopy (TEM). The study was focused on automated image capture and automated particle analysis. NIST RM8012 gold nanoparticles (30 nm nominal diameter) were measured for area-equivalent diameter distributions by eight laboratories. Statistical analysis was used to (1) assess the data quality without using size distribution reference models, (2) determine reference model parameters for different size distribution reference models and non-linear regression fitting methods and (3) assess the measurement uncertainty of a size distribution parameter by using its coefficient of variation. The interlaboratory area-equivalent diameter mean, 27.6 nm ± 2.4 nm (computed based on a normal distribution), was quite similar to the area-equivalent diameter, 27.6 nm, assigned to NIST RM8012. The lognormal reference model was the preferred choice for these particle size distributions as, for all laboratories, its parameters had lower relative standard errors (RSEs) than the other size distribution reference models tested (normal, Weibull and Rosin–Rammler–Bennett). The RSEs for the fitted standard deviations were two orders of magnitude higher than those for the fitted means, suggesting that most of the parameter estimate errors were associated with estimating the breadth of the distributions. The coefficients of variation for the interlaboratory statistics also confirmed the lognormal reference model as the preferred choice. From quasi-linear plots, the typical range for good fits between the model and cumulative number-based distributions was 1.9 fitted standard deviations less than the mean to 2.3 fitted standard deviations above the mean. Automated image capture, automated particle analysis and statistical evaluation of the data and fitting coefficients provide a framework for assessing nanoparticle size distributions using TEM for image acquisition. PMID:26361398
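The following is a minimal sketch of the fitting step described: fit a lognormal reference model to area-equivalent diameters and report relative standard errors of the fitted parameters, here via a bootstrap. The synthetic diameters (nominal ~27.6 nm) and the bootstrap-based RSE are illustrative assumptions, not the interlaboratory protocol itself.

```python
# Hedged sketch: lognormal fit to TEM area-equivalent diameters with bootstrap
# relative standard errors (RSEs) of the fitted parameters. Synthetic data only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
diam = rng.lognormal(mean=np.log(27.6), sigma=0.12, size=500)   # fake measurements, nm

def fit_lognormal(x):
    """Return (median diameter, log-space sigma) of a 2-parameter lognormal fit."""
    shape, _, scale = stats.lognorm.fit(x, floc=0)
    return scale, shape            # scale = exp(mu), shape = sigma

mu_hat, sigma_hat = fit_lognormal(diam)
boot = np.array([fit_lognormal(rng.choice(diam, size=diam.size, replace=True))
                 for _ in range(300)])
rse = boot.std(axis=0) / boot.mean(axis=0) * 100.0
print(f"fitted median diameter: {mu_hat:.1f} nm, log-sigma: {sigma_hat:.3f}")
print(f"RSE of (median, log-sigma): {rse[0]:.1f}%, {rse[1]:.1f}%")
```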
NASA Astrophysics Data System (ADS)
Arneodo, M.; Arvidson, A.; Aubert, J. J.; Badelek, B.; Beaufays, J.; Bee, C. P.; Benchouk, C.; Berghoff, G.; Bird, I. G.; Blum, D.; Böhm, E.; De Bouard, X.; Brasse, F. W.; Braun, H.; Broll, C.; Brown, S. C.; Brück, H.; Calen, H.; Chima, J. S.; Ciborowski, J.; Clifft, R.; Coignet, G.; Combley, F.; Coughlan, J.; D'Agostini, G.; Dahlgren, S.; Dengler, F.; Derado, I.; Dreyer, T.; Drees, J.; Düren, M.; Eckardt, V.; Edwards, A.; Edwards, M.; Ernst, T.; Eszes, G.; Favier, J.; Ferrero, M. I.; Figiel, J.; Flauger, W.; Foster, J.; Gabathuler, E.; Gajewski, J.; Gamet, R.; Gayler, J.; Geddes, N.; Grafström, P.; Grard, F.; Haas, J.; Hagberg, E.; Hasert, F. J.; Hayman, P.; Heusse, P.; Jaffre, M.; Jacholkowska, A.; Janata, F.; Jancso, G.; Johnson, A. S.; Kabuss, E. M.; Kellner, G.; Korbel, V.; Krüger, A.; Krüger, J.; Kullander, S.; Landgraf, U.; Lanske, D.; Loken, J.; Long, K.; Maire, M.; Malecki, P.; Manz, A.; Maselli, S.; Mohr, W.; Montanet, F.; Montgomery, H. E.; Nagy, E.; Nassalski, J.; Norton, P. R.; Oakham, F. G.; Osborne, A. M.; Pascaud, C.; Pawlik, B.; Payre, P.; Peroni, C.; Peschel, H.; Pessard, H.; Pettingale, J.; Pietrzyk, B.; Poensgen, B.; Pötsch, M.; Renton, P.; Ribarics, P.; Rith, K.; Rondio, E.; Sandacz, A.; Scheer, M.; Schlagböhmer, A.; Schiemann, H.; Schmitz, N.; Schneegans, M.; Scholz, M.; Schouten, M.; Schröder, T.; Schultze, K.; Sloan, T.; Stier, H. E.; Studt, M.; Taylor, G. N.; Thenard, J. M.; Thompson, J. C.; De la Torre, A.; Toth, J.; Urban, L.; Urban, L.; Wallucks, W.; Whalley, M.; Wheeler, S.; Williams, W. S. C.; Wimpenny, S. J.; Windmolders, R.; Wolf, G.; European Muon Collaboration
1989-07-01
A new determination of the u valence quark distribution function in the proton is obtained from the analysis of identified charged pions, kaons, protons and antiprotons produced in muon-proton and muon-deuteron scattering. The comparison with results obtained in inclusive deep inelastic lepton-nucleon scattering provides a further test of the quark-parton model. The u quark fragmentation functions into positive and negative pions, kaons, protons and antiprotons are also measured.
NASA Astrophysics Data System (ADS)
Shao, Yuxiang; Chen, Qing; Wei, Zhenhua
Logistics distribution center location evaluation is a dynamic, fuzzy, open and complicated nonlinear system, which makes it difficult to evaluate distribution center locations with traditional analysis methods. The paper proposes a distribution center location evaluation system that uses a fuzzy neural network combined with a genetic algorithm. In this model, the neural network is adopted to construct the fuzzy system. By using the genetic algorithm, the parameters of the neural network are optimized and trained so as to improve the fuzzy system's abilities of self-learning and self-adaptation. Finally, the sampled data are used for training and testing in Matlab. The simulation results indicate that the proposed identification model has very small errors.
On the scaling of the distribution of daily price fluctuations in the Mexican financial market index
NASA Astrophysics Data System (ADS)
Alfonso, Léster; Mansilla, Ricardo; Terrero-Escalante, César A.
2012-05-01
In this paper, a statistical analysis of the log-return fluctuations of the IPC, the Mexican Stock Market Index, is presented. A sample of daily data covering the period 04/09/2000-04/09/2010 was analyzed and fitted to different distributions. Goodness-of-fit tests were performed in order to assess the quality of the estimation quantitatively. Special attention was paid to the impact of the sample size on the estimated decay of the distribution's tail. In this study a forceful rejection of normality was obtained. On the other hand, the null hypothesis that the log-fluctuations follow an α-stable Lévy distribution cannot be rejected at the 5% significance level.
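A minimal sketch of the first step of such an analysis, computing daily log-returns and testing normality, is shown below; the simulated heavy-tailed "price" series stands in for the IPC data, which is not reproduced here.

```python
# Hedged sketch: daily log-returns from a price series, tested for normality with
# Jarque-Bera and Kolmogorov-Smirnov. Synthetic heavy-tailed data, not the IPC.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Fake daily prices with Student-t (heavy-tailed) innovations, roughly 10 years.
returns_true = 0.01 * stats.t.rvs(df=3, size=2500, random_state=rng)
prices = 100.0 * np.exp(np.cumsum(returns_true))

log_ret = np.diff(np.log(prices))
z = (log_ret - log_ret.mean()) / log_ret.std(ddof=1)

jb_stat, jb_p = stats.jarque_bera(log_ret)
ks_stat, ks_p = stats.kstest(z, "norm")
print(f"Jarque-Bera p = {jb_p:.2e}  (small p: reject normality)")
print(f"KS vs N(0,1) p = {ks_p:.2e}")
```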
NASA Astrophysics Data System (ADS)
Endreny, Theodore A.; Pashiardis, Stelios
2007-02-01
Robust and accurate estimates of rainfall frequencies are difficult to make with short, arid-climate rainfall records; however, new regional and global methods were used to supplement such a constrained 15-34 yr record in Cyprus. The impact of supplementing the rainfall frequency analysis with the regional and global approaches was measured with relative bias and root mean square error (RMSE) values. The analysis considered 42 stations with 8 time intervals (5-360 min) in four regions delineated by proximity to the sea and by elevation. Regional statistical algorithms found that the sites passed discordancy tests of the coefficient of variation, skewness and kurtosis, while heterogeneity tests revealed the regions were homogeneous to mildly heterogeneous. Rainfall depths were simulated 500 times in the regional analysis method, and goodness-of-fit tests then identified the best candidate distribution as the generalized extreme value (GEV) Type II. In the regional analysis, the method of L-moments was used to estimate the location, shape, and scale parameters. In the global analysis, the distribution was a priori prescribed as GEV Type II, the shape parameter was a priori set to 0.15, and a time-interval term was constructed so that one set of parameters could be used for all time intervals. Relative RMSE values were approximately equal, at about 10%, for the regional and global methods when regions were compared, but when time intervals were compared the global method RMSE showed a parabolic-shaped trend with time interval. Relative bias values were also approximately equal for both methods when regions were compared, but again a parabolic-shaped time-interval trend was found for the global method. The global method relative RMSE and bias trended with time interval, which may be caused by fitting a single scale value for all time intervals.
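The sketch below shows the core fitting step in miniature: a GEV distribution fitted to annual-maximum rainfall depths and used to estimate a return-period quantile. scipy's maximum-likelihood fit is used here instead of the L-moments procedure described in the abstract, and the data are synthetic.

```python
# Hedged sketch: GEV fit to annual-maximum rainfall depths and a 20-year quantile.
# Maximum-likelihood fit (not L-moments) on synthetic data, for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Fake 30-year series of annual maximum 60-min rainfall depths (mm).
annual_max = stats.genextreme.rvs(c=-0.15, loc=25.0, scale=8.0, size=30, random_state=rng)

shape, loc, scale = stats.genextreme.fit(annual_max)
T = 20.0                                           # return period in years
q20 = stats.genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
print(f"fitted shape={shape:.3f}, loc={loc:.1f}, scale={scale:.1f}")
print(f"estimated {int(T)}-year rainfall depth: {q20:.1f} mm")
```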
Publication Bias in Meta-Analysis: Confidence Intervals for Rosenthal's Fail-Safe Number.
Fragkos, Konstantinos C; Tsagris, Michail; Frangos, Christos C
2014-01-01
The purpose of the present paper is to assess the efficacy of confidence intervals for Rosenthal's fail-safe number. Although Rosenthal's estimator is highly used by researchers, its statistical properties are largely unexplored. First of all, we developed statistical theory which allowed us to produce confidence intervals for Rosenthal's fail-safe number. This was produced by discerning whether the number of studies analysed in a meta-analysis is fixed or random. Each case produces different variance estimators. For a given number of studies and a given distribution, we provided five variance estimators. Confidence intervals are examined with a normal approximation and a nonparametric bootstrap. The accuracy of the different confidence interval estimates was then tested by methods of simulation under different distributional assumptions. The half normal distribution variance estimator has the best probability coverage. Finally, we provide a table of lower confidence intervals for Rosenthal's estimator.
Publication Bias in Meta-Analysis: Confidence Intervals for Rosenthal's Fail-Safe Number
Fragkos, Konstantinos C.; Tsagris, Michail; Frangos, Christos C.
2014-01-01
The purpose of the present paper is to assess the efficacy of confidence intervals for Rosenthal's fail-safe number. Although Rosenthal's estimator is highly used by researchers, its statistical properties are largely unexplored. First of all, we developed statistical theory which allowed us to produce confidence intervals for Rosenthal's fail-safe number. This was produced by discerning whether the number of studies analysed in a meta-analysis is fixed or random. Each case produces different variance estimators. For a given number of studies and a given distribution, we provided five variance estimators. Confidence intervals are examined with a normal approximation and a nonparametric bootstrap. The accuracy of the different confidence interval estimates was then tested by methods of simulation under different distributional assumptions. The half normal distribution variance estimator has the best probability coverage. Finally, we provide a table of lower confidence intervals for Rosenthal's estimator. PMID:27437470
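For orientation, the sketch below computes Rosenthal's fail-safe number itself from a set of study z-scores, using the usual form N_fs = (Σz)²/z_α² − k with a one-tailed α of 0.05; the z-scores are made up, and the confidence intervals that are the subject of the paper are not reproduced.

```python
# Hedged sketch: Rosenthal's fail-safe number from the z-scores of k studies,
# N_fs = (sum z)^2 / z_alpha^2 - k, one-tailed alpha = 0.05. Hypothetical inputs.
import numpy as np
from scipy import stats

z_scores = np.array([2.1, 1.7, 2.8, 0.9, 2.4, 1.5])   # hypothetical study z-values
k = len(z_scores)
z_alpha = stats.norm.ppf(0.95)                         # ~1.645, one-tailed alpha = 0.05

n_fs = (z_scores.sum() ** 2) / z_alpha ** 2 - k
print(f"fail-safe number: {n_fs:.1f} additional null studies needed")
```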
Avalanche statistics from data with low time resolution
DOE Office of Scientific and Technical Information (OSTI.GOV)
LeBlanc, Michael; Nawano, Aya; Wright, Wendelin J.
Extracting avalanche distributions from experimental microplasticity data can be hampered by limited time resolution. We compute the effects of low time resolution on avalanche size distributions and give quantitative criteria for diagnosing and circumventing problems associated with low time resolution. We show that traditional analysis of data obtained at low acquisition rates can lead to avalanche size distributions with incorrect power-law exponents or no power-law scaling at all. Furthermore, we demonstrate that it can lead to apparent data collapses with incorrect power-law and cutoff exponents. We propose new methods to analyze low-resolution stress-time series that can recover the size distribution of the underlying avalanches even when the resolution is so low that naive analysis methods give incorrect results. We test these methods on both downsampled simulation data from a simple model and downsampled bulk metallic glass compression data and find that the methods recover the correct critical exponents.
Avalanche statistics from data with low time resolution
LeBlanc, Michael; Nawano, Aya; Wright, Wendelin J.; ...
2016-11-22
Extracting avalanche distributions from experimental microplasticity data can be hampered by limited time resolution. We compute the effects of low time resolution on avalanche size distributions and give quantitative criteria for diagnosing and circumventing problems associated with low time resolution. We show that traditional analysis of data obtained at low acquisition rates can lead to avalanche size distributions with incorrect power-law exponents or no power-law scaling at all. Furthermore, we demonstrate that it can lead to apparent data collapses with incorrect power-law and cutoff exponents. We propose new methods to analyze low-resolution stress-time series that can recover the size distribution of the underlying avalanches even when the resolution is so low that naive analysis methods give incorrect results. We test these methods on both downsampled simulation data from a simple model and downsampled bulk metallic glass compression data and find that the methods recover the correct critical exponents.
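As a reference point for the exponent-estimation step, the sketch below gives the standard maximum-likelihood (Hill / Clauset-style) estimate of a power-law exponent from avalanche sizes above a threshold x_min; the synthetic sizes and the choice of x_min are assumptions, and the paper's corrections for low time resolution are not implemented here.

```python
# Hedged sketch: maximum-likelihood power-law exponent from avalanche sizes above
# x_min, alpha_hat = 1 + n / sum(ln(x_i / x_min)). Synthetic data; no correction
# for downsampling effects.
import numpy as np

rng = np.random.default_rng(0)
alpha_true, x_min = 2.0, 1.0
# Draw power-law distributed avalanche sizes via inverse-transform sampling.
u = rng.uniform(size=5000)
sizes = x_min * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

tail = sizes[sizes >= x_min]
alpha_hat = 1.0 + tail.size / np.sum(np.log(tail / x_min))
se = (alpha_hat - 1.0) / np.sqrt(tail.size)
print(f"estimated exponent: {alpha_hat:.3f} +/- {se:.3f} (true {alpha_true})")
```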
Vibrational Analysis of Engine Components Using Neural-Net Processing and Electronic Holography
NASA Technical Reports Server (NTRS)
Decker, Arthur J.; Fite, E. Brian; Mehmed, Oral; Thorp, Scott A.
1997-01-01
The use of computational-model trained artificial neural networks to acquire damage specific information from electronic holograms is discussed. A neural network is trained to transform two time-average holograms into a pattern related to the bending-induced-strain distribution of the vibrating component. The bending distribution is very sensitive to component damage unlike the characteristic fringe pattern or the displacement amplitude distribution. The neural network processor is fast for real-time visualization of damage. The two-hologram limit makes the processor more robust to speckle pattern decorrelation. Undamaged and cracked cantilever plates serve as effective objects for testing the combination of electronic holography and neural-net processing. The requirements are discussed for using finite-element-model trained neural networks for field inspections of engine components. The paper specifically discusses neural-network fringe pattern analysis in the presence of the laser speckle effect and the performances of two limiting cases of the neural-net architecture.
Vibrational Analysis of Engine Components Using Neural-Net Processing and Electronic Holography
NASA Technical Reports Server (NTRS)
Decker, Arthur J.; Fite, E. Brian; Mehmed, Oral; Thorp, Scott A.
1998-01-01
The use of computational-model trained artificial neural networks to acquire damage specific information from electronic holograms is discussed. A neural network is trained to transform two time-average holograms into a pattern related to the bending-induced-strain distribution of the vibrating component. The bending distribution is very sensitive to component damage unlike the characteristic fringe pattern or the displacement amplitude distribution. The neural network processor is fast for real-time visualization of damage. The two-hologram limit makes the processor more robust to speckle pattern decorrelation. Undamaged and cracked cantilever plates serve as effective objects for testing the combination of electronic holography and neural-net processing. The requirements are discussed for using finite-element-model trained neural networks for field inspections of engine components. The paper specifically discusses neural-network fringe pattern analysis in the presence of the laser speckle effect and the performances of two limiting cases of the neural-net architecture.
Bao, Yi; Hoehler, Matthew S; Smith, Christopher M; Bundy, Matthew; Chen, Genda
2017-10-01
In this study, distributed fiber optic sensors based on pulse pre-pump Brillouin optical time domain analysis (PPP-BOTDA) are characterized and deployed to measure spatially-distributed temperatures in reinforced concrete specimens exposed to fire. Four beams were tested to failure in a natural gas fueled compartment fire, each instrumented with one fused silica, single-mode optical fiber as a distributed sensor and four thermocouples. Prior to concrete cracking, the distributed temperature was validated at locations of the thermocouples by a relative difference of less than 9 %. The cracks in concrete can be identified as sharp peaks in the temperature distribution since the cracks are locally filled with hot air. Concrete cracking did not affect the sensitivity of the distributed sensor but concrete spalling broke the optical fiber loop required for PPP-BOTDA measurements.
NASA Astrophysics Data System (ADS)
Cianciara, Aleksander
2016-09-01
The paper presents the results of research aimed at verifying the hypothesis that the Weibull distribution is an appropriate statistical model of microseismic emission characteristics, namely the energy of phenomena and the inter-event time. The emission under consideration is understood to be induced by natural rock mass fracturing. Because the recorded emission contains noise, it is subjected to appropriate filtering. The study was conducted using statistical verification of the null hypothesis that the Weibull distribution fits the empirical cumulative distribution function. As the model describing the cumulative distribution function is given in analytical form, its verification may be performed using the Kolmogorov-Smirnov goodness-of-fit test. Interpretations by means of probabilistic methods require specifying the correct model describing the statistical distribution of the data, because in these methods the measurement data are not used directly but through their statistical distributions, e.g., in the method based on hazard analysis or in the one that uses maximum-value statistics.
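A minimal version of such a goodness-of-fit check can be run with scipy; the inter-event times below are synthetic stand-ins, and note that using fitted parameters in the Kolmogorov-Smirnov test makes the reported p-value approximate.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
inter_event = rng.weibull(0.9, size=500) * 40.0     # synthetic inter-event times, s

shape, loc, scale = stats.weibull_min.fit(inter_event, floc=0)
ks = stats.kstest(inter_event, 'weibull_min', args=(shape, loc, scale))
# The p-value is optimistic because the parameters were estimated from the same data.
print(f"shape = {shape:.2f}, scale = {scale:.1f} s, KS p = {ks.pvalue:.3f}")
```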
The implementation and use of Ada on distributed systems with high reliability requirements
NASA Technical Reports Server (NTRS)
Knight, J. C.
1984-01-01
The use and implementation of Ada in distributed environments in which reliability is the primary concern is investigated. Emphasis is placed on the possibility that a distributed system may be programmed entirely in Ada so that the individual tasks of the system are unconcerned with which processors they are executing on, and that failures may occur in the software or underlying hardware. The primary activities are: (1) Continued development and testing of our fault-tolerant Ada testbed; (2) consideration of desirable language changes to allow Ada to provide useful semantics for failure; (3) analysis of the inadequacies of existing software fault tolerance strategies.
Lee, L.; Helsel, D.
2007-01-01
Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data, perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply-censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis" where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and related confidence limits computation. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation and interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
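The flipping trick that makes right-censoring machinery work for left-censored concentrations is easy to sketch. The snippet below uses made-up concentrations and "<DL" flags and a hand-rolled Kaplan-Meier step (ties among detected values are not grouped, for brevity); it is not the cited S-language package.

```python
import numpy as np

def km_left_censored(values, censored, flip_at=None):
    """Kaplan-Meier estimate of the CDF for left-censored data.
    values: measured value, or the detection limit where censored is True."""
    flip_at = flip_at if flip_at is not None else values.max() + 1.0
    t = flip_at - values                        # flip: left-censored -> right-censored
    order = np.argsort(t)
    t, event = t[order], ~censored[order]       # detections are the "events"
    n = len(t)
    surv = 1.0
    cdf_points = []
    for i, (ti, ei) in enumerate(zip(t, event)):
        at_risk = n - i
        if ei:
            cdf_points.append((flip_at - ti, surv))   # back-transform to P(X <= x)
            surv *= 1.0 - 1.0 / at_risk
    return sorted(cdf_points)

conc = np.array([0.5, 0.5, 1.0, 1.2, 2.3, 3.1, 0.2, 4.0])               # hypothetical data
cens = np.array([True, True, True, False, False, False, True, False])   # "<DL" flags
for x, p in km_left_censored(conc, cens):
    print(f"P(X <= {x:.1f}) ~ {p:.2f}")
```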
Towards the mechanical characterization of abdominal wall by inverse analysis.
Simón-Allué, R; Calvo, B; Oberai, A A; Barbone, P E
2017-02-01
The aim of this study is to characterize the passive mechanical behaviour of the abdominal wall in vivo in an animal model using only external cameras and numerical analysis. The main objective lies in defining a methodology that provides in vivo information on a specific patient without altering mechanical properties. It is demonstrated in the mechanical study of the abdomen for hernia purposes. Mechanical tests consisted of pneumoperitoneum tests performed on New Zealand rabbits, where inner pressure was varied from 0 mmHg to 12 mmHg. Changes in the external abdominal surface were recorded and several points were tracked. Based on their coordinates we reconstructed a 3D finite element model of the abdominal wall, considering an incompressible hyperelastic material model defined by two parameters. The spatial distributions of these parameters (shear modulus and nonlinear parameter) were calculated by inverse analysis, using two different types of regularization: Total Variation Diminishing (TVD) and Tikhonov (H1). After solving the inverse problem, the distributions of the material parameters were obtained along the abdominal surface. Accuracy of the results was evaluated for the last level of pressure. Results revealed a higher value of the shear modulus in a wide stripe along the cranio-caudal direction, associated with the presence of the linea alba in conjunction with fascias and rectus abdominis. The nonlinear parameter distribution was smoother and the location of higher values varied with the regularization type. Both regularizations proved to yield an accurate predicted displacement field, but H1 obtained a smoother material parameter distribution while TVD included some discontinuities. The methodology presented here was able to characterize in vivo the passive nonlinear mechanical response of the abdominal wall. Copyright © 2016 Elsevier Ltd. All rights reserved.
Billi, Fabrizio; Benya, Paul; Kavanaugh, Aaron; Adams, John; Ebramzadeh, Edward; McKellop, Harry
2012-02-01
Numerous studies indicate highly crosslinked polyethylenes reduce the wear debris volume generated by hip arthroplasty acetabular liners. This, in turn, requires new methods to isolate and characterize them. We describe a method for extracting polyethylene wear particles from bovine serum typically used in wear tests and for characterizing their size, distribution, and morphology. Serum proteins were completely digested using an optimized enzymatic digestion method that prevented the loss of the smallest particles and minimized their clumping. Density-gradient ultracentrifugation was designed to remove contaminants and recover the particles without filtration, depositing them directly onto a silicon wafer. This provided uniform distribution of the particles and high contrast against the background, facilitating accurate, automated, morphometric image analysis. The accuracy and precision of the new protocol were assessed by recovering and characterizing particles from wear tests of three types of polyethylene acetabular cups (no crosslinking and 5 Mrads and 7.5 Mrads of gamma irradiation crosslinking). The new method demonstrated important differences in the particle size distributions and morphologic parameters among the three types of polyethylene that could not be detected using prior isolation methods. The new protocol overcomes a number of limitations, such as loss of nanometer-sized particles and artifactual clumping, among others. The analysis of polyethylene wear particles produced in joint simulator wear tests of prosthetic joints is a key tool to identify the wear mechanisms that produce the particles and predict and evaluate their effects on periprosthetic tissues.
Lukashin, A V; Fuchs, R
2001-05-01
Cluster analysis of genome-wide expression data from DNA microarray hybridization studies has proved to be a useful tool for identifying biologically relevant groupings of genes and samples. In the present paper, we focus on several important issues related to clustering algorithms that have not yet been fully studied. We describe a simple and robust algorithm for the clustering of temporal gene expression profiles that is based on the simulated annealing procedure. In general, this algorithm guarantees to eventually find the globally optimal distribution of genes over clusters. We introduce an iterative scheme that serves to evaluate quantitatively the optimal number of clusters for each specific data set. The scheme is based on standard approaches used in regular statistical tests. The basic idea is to organize the search of the optimal number of clusters simultaneously with the optimization of the distribution of genes over clusters. The efficiency of the proposed algorithm has been evaluated by means of a reverse engineering experiment, that is, a situation in which the correct distribution of genes over clusters is known a priori. The employment of this statistically rigorous test has shown that our algorithm places greater than 90% genes into correct clusters. Finally, the algorithm has been tested on real gene expression data (expression changes during yeast cell cycle) for which the fundamental patterns of gene expression and the assignment of genes to clusters are well understood from numerous previous studies.
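The core of such a procedure, assign profiles to clusters, score the assignment by within-cluster dispersion, and accept random reassignments under the Metropolis rule with a decreasing temperature, can be sketched briefly. This is a generic simulated-annealing clustering sketch on synthetic profiles, not the authors' algorithm or their cluster-number criterion.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "expression profiles": 60 genes, 10 time points, 3 true groups.
centers = rng.normal(0, 2, size=(3, 10))
X = np.vstack([c + 0.3 * rng.standard_normal((20, 10)) for c in centers])

def cost(X, labels, k):
    """Sum of squared distances of each profile to its cluster centroid."""
    return sum(((X[labels == j] - X[labels == j].mean(axis=0)) ** 2).sum()
               for j in range(k) if np.any(labels == j))

k = 3
labels = rng.integers(0, k, size=len(X))
T, cooling = 5.0, 0.995
current = cost(X, labels, k)

for step in range(20_000):
    i = rng.integers(len(X))
    proposal = labels.copy()
    proposal[i] = rng.integers(k)
    new = cost(X, proposal, k)
    # Metropolis criterion: always accept improvements, sometimes accept worse moves.
    if new < current or rng.random() < np.exp((current - new) / T):
        labels, current = proposal, new
    T *= cooling

print("final within-cluster cost:", round(current, 1))
```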
NASA Astrophysics Data System (ADS)
Kartono; Suryadi, D.; Herman, T.
2018-01-01
This study aimed to analyze the effect of non-linear learning (NLL) in online tutorial (OT) content on students' knowledge of normal distribution application (KONDA). KONDA is a competence expected to be achieved after students have studied the topic of normal distribution application in the course named Education Statistics. The analysis was performed with a quasi-experimental study design. The subjects were divided into an experimental class that was given OT content in the NLL model and a control class that was given OT content in the conventional learning (CL) model. Data used in this study were the results of online objective tests measuring students' statistical prior knowledge (SPK) and students' pre- and post-test KONDA. The statistical analysis of the KONDA gain scores of students with low and moderate SPK scores showed that the KONDA of students who learned the OT content with the NLL model was better than that of students who learned the OT content with the CL model. Meanwhile, for students with high SPK scores, the gain score of students who learned the OT content with the NLL model was similar to that of students who learned the OT content with the CL model. Based on these findings, it could be concluded that the NLL model applied to OT content can enhance the KONDA of students at low and moderate SPK levels. An extra and more challenging didactical situation is needed for students at the high SPK level to achieve a significant gain score.
NASA Technical Reports Server (NTRS)
Smith, D. R.
1982-01-01
The Purdue Regional Objective Analysis of the Mesoscale (PROAM) is a Barnes-type scheme for the analysis of surface meteorological data. Modifications are introduced to the original version in order to increase its flexibility and to permit greater ease of usage. The code was rewritten for an interactive computer environment. Furthermore, a multiple iteration technique suggested by Barnes was implemented for greater accuracy. PROAM was subjected to a series of experiments in order to evaluate its performance under a variety of analysis conditions. The tests include use of a known analytic temperature distribution in order to quantify error bounds for the scheme. Similar experiments were conducted using actual atmospheric data. Results indicate that the multiple iteration technique increases the accuracy of the analysis. Furthermore, the tests verify appropriate values for the analysis parameters in resolving meso-beta scale phenomena.
Dalmaris, Eleftheria; Ramalho, Cristina E; Poot, Pieter; Veneklaas, Erik J; Byrne, Margaret
2015-11-01
A worldwide increase in tree decline and mortality has been linked to climate change and, where these represent foundation species, this can have important implications for ecosystem functions. This study tests a combined approach of phylogeographic analysis and species distribution modelling to provide a climate change context for an observed decline in crown health and an increase in mortality in Eucalyptus wandoo, an endemic tree of south-western Australia. Phylogeographic analyses were undertaken using restriction fragment length polymorphism analysis of chloroplast DNA in 26 populations across the species distribution. Parsimony analysis of haplotype relationships was conducted, a haplotype network was prepared, and haplotype and nucleotide diversity were calculated. Species distribution modelling was undertaken using Maxent models based on extant species occurrences and projected to climate models of the last glacial maximum (LGM). A structured pattern of diversity was identified, with the presence of two groups that followed a climatic gradient from mesic to semi-arid regions. Most populations were represented by a single haplotype, but many haplotypes were shared among populations, with some having widespread distributions. A putative refugial area with high haplotype diversity was identified at the centre of the species distribution. Species distribution modelling showed high climatic suitability at the LGM and high climatic stability in the central region where higher genetic diversity was found, and low suitability elsewhere, consistent with a pattern of range contraction. Combination of phylogeography and paleo-distribution modelling can provide an evolutionary context for climate-driven tree decline, as both can be used to cross-validate evidence for refugia and contraction under harsh climatic conditions. This approach identified a central refugial area in the test species E. wandoo, with more recent expansion into peripheral areas from where it had contracted at the LGM. This signature of contraction from lower rainfall areas is consistent with current observations of decline on the semi-arid margin of the range, and indicates low capacity to tolerate forecast climatic change. Identification of a paleo-historical context for current tree decline enables conservation interventions to focus on maintaining genetic diversity, which provides the evolutionary potential for adaptation to climate change. © The Author 2015. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dionne, B.; Tzanos, C. P.
To support the safety analyses required for the conversion of the Belgian Reactor 2 (BR2) from highly-enriched uranium (HEU) to low-enriched uranium (LEU) fuel, the simulation of a number of loss-of-flow tests, with or without loss of pressure, has been undertaken. These tests were performed at BR2 in 1963 and used instrumented fuel assemblies (FAs) with thermocouples (TC) imbedded in the cladding as well as probes to measure the FAs power on the basis of their coolant temperature rise. The availability of experimental data for these tests offers an opportunity to better establish the credibility of the RELAP5-3D model and methodology used in the conversion analysis. In order to support the HEU to LEU conversion safety analyses of the BR2 reactor, RELAP simulations of a number of loss-of-flow/loss-of-pressure tests have been undertaken. Preliminary analyses showed that the conservative power distributions used historically in the BR2 RELAP model resulted in a significant overestimation of the peak cladding temperature during the transient. Therefore, it was concluded that better estimates of the steady-state and decay power distributions were needed to accurately predict the cladding temperatures measured during the tests and establish the credibility of the RELAP model and methodology. The new approach ('best estimate' methodology) uses the MCNP5, ORIGEN-2 and BERYL codes to obtain steady-state and decay power distributions for the BR2 core during the tests A/400/1, C/600/3 and F/400/1. This methodology can be easily extended to simulate any BR2 core configuration. Comparisons with measured peak cladding temperatures showed a much better agreement when power distributions obtained with the new methodology are used.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pete McGrail
This GDR submission is an interim technical report and raw data files from the first year of testing on functionalized nanoparticles for rare earth element extraction from geothermal fluids. The report contains rare earth element uptake results (percent removal, mg rare earth element/gram of sorbent, distribution coefficient) for the elements neodymium, europium, yttrium, dysprosium, and cesium. A detailed techno-economic analysis is also presented in the report for a scaled-up geothermal rare earth element extraction process. All rare earth element uptake testing was done on simulated geothermal brines with one rare earth element in each brine. The rare earth element uptake testing was conducted at room temperature.
On the Behavior of Different PCMs in a Hot Water Storage Tank against Thermal Demands.
Porteiro, Jacobo; Míguez, José Luis; Crespo, Bárbara; de Lara, José; Pousada, José María
2016-03-21
Advantages, such as thermal storage improvement, are found when using PCMs (Phase Change Materials) in storage tanks. The inclusion of three different types of materials in a 60 l test tank is studied. Two test methodologies were developed, and four tests were performed following each methodology. A thermal analysis is performed to check the thermal properties of each PCM. The distributions of the water temperatures inside the test tanks are evaluated by installing four Pt-100 sensors at different heights. A temperature recovery is observed after exposing the test tank to an energy demand. An energetic analysis that takes into account the energy due to the water temperature, the energy due to the PCM and the thermal loss to the ambient environment is also presented. The percentage of each PCM that remains in the liquid state after the energy demand is obtained.
On the Behavior of Different PCMs in a Hot Water Storage Tank against Thermal Demands
Porteiro, Jacobo; Míguez, José Luis; Crespo, Bárbara; de Lara, José; Pousada, José María
2016-01-01
Advantages, such as thermal storage improvement, are found when using PCMs (Phase Change Materials) in storage tanks. The inclusion of three different types of materials in a 60 𝓁 test tank is studied. Two test methodologies were developed, and four tests were performed following each methodology. A thermal analysis is performed to check the thermal properties of each PCM. The distributions of the water temperatures inside the test tanks are evaluated by installing four Pt-100 sensors at different heights. A temperature recovery is observed after exposing the test tank to an energy demand. An energetic analysis that takes into account the energy due to the water temperature, the energy due to the PCM and the thermal loss to the ambient environment is also presented. The percentage of each PCM that remains in the liquid state after the energy demand is obtained. PMID:28773339
Two-Dimensional Finite Element Ablative Thermal Response Analysis of an Arcjet Stagnation Test
NASA Technical Reports Server (NTRS)
Dec, John A.; Laub, Bernard; Braun, Robert D.
2011-01-01
The finite element ablation and thermal response (FEAtR, henceforth called FEAR) design and analysis program simulates the one-, two-, or three-dimensional ablation, internal heat conduction, thermal decomposition, and pyrolysis gas flow of thermal protection system materials. As part of a code validation study, two-dimensional axisymmetric results from FEAR are compared to thermal response data obtained from an arc-jet stagnation test in this paper. The results from FEAR are also compared to the two-dimensional axisymmetric computations from the two-dimensional implicit thermal response and ablation program under the same arcjet conditions. The ablating material being used in this arcjet test is phenolic impregnated carbon ablator with an LI-2200 insulator as backup material. The test is performed at the NASA Ames Research Center Interaction Heating Facility. Spatially distributed computational fluid dynamics solutions for the flow field around the test article are used for the surface boundary conditions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willitsford, Adam H.; Brown, David M.; Brown, Andrea M.
2014-08-28
Multi-wavelength laser transmittance was measured during a series of open-air propellant burn tests at Alliant Techsystems, Inc., in Elkton, MD, in May 2012. A Mie scattering model was combined with an alumina optical properties model in a simple single-scatter approach to fitting plume transmittance. Wavelength-dependent plume transmission curves were fit to the measured multi-wavelength transmittance data to infer plume particle size distributions at several heights in the plume. Tri-modal lognormal distributions described transmittance data well at all heights. Overall distributions included a mode with nanometer-scale diameter, a second mode at a diameter of ~0.5 µm, and a third, larger particle mode. Larger particles measured 2.5 µm in diameter at 34 cm (14 in.) above the burning propellant surface, but grew to 4 µm in diameter at a height of 57 cm (22 in.), indicative of particle agglomeration in progress as the plume rises. This report presents data, analysis, and results from the study.
Hybrid Network Defense Model Based on Fuzzy Evaluation
2014-01-01
With sustained and rapid developments in the field of information technology, the issue of network security has become increasingly prominent. The theme of this study is network data security, with the test subject being a classified and sensitive network laboratory that belongs to the academic network. The analysis is based on the deficiencies and potential risks of the network's existing defense technology, characteristics of cyber attacks, and network security technologies. Subsequently, a distributed network security architecture using the technology of an intrusion prevention system is designed and implemented. In this paper, first, the overall design approach is presented. This design is used as the basis to establish a network defense model, an improvement over the traditional single-technology model that addresses the latter's inadequacies. Next, a distributed network security architecture is implemented, comprising a hybrid firewall, intrusion detection, virtual honeynet projects, and connectivity and interactivity between these three components. Finally, the proposed security system is tested. A statistical analysis of the test results verifies the feasibility and reliability of the proposed architecture. The findings of this study will potentially provide new ideas and stimuli for future designs of network security architecture. PMID:24574870
Non-operative management (NOM) of blunt hepatic trauma: 80 cases.
Özoğul, Bünyami; Kısaoğlu, Abdullah; Aydınlı, Bülent; Öztürk, Gürkan; Bayramoğlu, Atıf; Sarıtemur, Murat; Aköz, Ayhan; Bulut, Özgür Hakan; Atamanalp, Sabri Selçuk
2014-03-01
Liver is the most frequently injured organ upon abdominal trauma. We present a group of patients with blunt hepatic trauma who were managed without any invasive diagnostic tools and/or surgical intervention. A total of 80 patients with blunt liver injury who were hospitalized to the general surgery clinic or other clinics due to the concomitant injuries were followed non-operatively. The normally distributed numeric variables were evaluated by Student's t-test or one way analysis of variance, while non-normally distributed variables were analyzed by Mann-Whitney U-test or Kruskal-Wallis variance analysis. Chi-square test was also employed for the comparison of categorical variables. Statistical significance was assumed for p<0.05. There was no significant relationship between patients' Hgb level and liver injury grade, outcome, and mechanism of injury. Also, there was no statistical relationship between liver injury grade, outcome, and mechanism of injury and ALT levels as well as AST level. There was no mortality in any of the patients. During the last quarter of century, changes in the diagnosis and treatment of liver injury were associated with increased survival. NOM of liver injury in patients with stable hemodynamics and hepatic trauma seems to be the gold standard.
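The test-selection logic described here (parametric tests when the variable looks normal, rank-based tests otherwise) is straightforward to express in code. The values below are invented; only the decision pattern is illustrated.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
grade_12 = rng.normal(13.5, 1.5, 40)   # hypothetical Hgb (g/dL), low injury grades
grade_34 = rng.normal(13.1, 1.6, 40)   # hypothetical Hgb (g/dL), high injury grades

# Choose the comparison test according to a normality check, as described above.
looks_normal = (stats.shapiro(grade_12).pvalue > 0.05 and
                stats.shapiro(grade_34).pvalue > 0.05)
if looks_normal:
    res = stats.ttest_ind(grade_12, grade_34)
else:
    res = stats.mannwhitneyu(grade_12, grade_34)
print(type(res).__name__, f"p = {res.pvalue:.3f}")
```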
Neural net diagnostics for VLSI test
NASA Technical Reports Server (NTRS)
Lin, T.; Tseng, H.; Wu, A.; Dogan, N.; Meador, J.
1990-01-01
This paper discusses the application of neural network pattern analysis algorithms to the IC fault diagnosis problem. A fault diagnostic is a decision rule combining what is known about an ideal circuit test response with information about how it is distorted by fabrication variations and measurement noise. The rule is used to detect fault existence in fabricated circuits using real test equipment. Traditional statistical techniques may be used to achieve this goal, but they can employ unrealistic a priori assumptions about measurement data. Our approach to this problem employs an adaptive pattern analysis technique based on feedforward neural networks. During training, a feedforward network automatically captures unknown sample distributions. This is important because distributions arising from the nonlinear effects of process variation can be more complex than is typically assumed. A feedforward network is also able to extract measurement features which contribute significantly to making a correct decision. Traditional feature extraction techniques employ matrix manipulations which can be particularly costly for large measurement vectors. In this paper we discuss a software system which we are developing that uses this approach. We also provide a simple example illustrating the use of the technique for fault detection in an operational amplifier.
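For a flavour of the approach, a small feedforward classifier can be trained on simulated measurement vectors perturbed by process variation and noise, then used to flag faulty units. This is a generic scikit-learn sketch on made-up data, not the authors' network or measurements.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Hypothetical setup: each circuit yields a 12-point measurement vector.
# Fault-free responses cluster around a nominal curve; faulty ones are shifted.
rng = np.random.default_rng(1)
nominal = np.sin(np.linspace(0, np.pi, 12))
good = nominal + 0.05 * rng.standard_normal((500, 12))        # process variation + noise
bad = nominal * 0.6 + 0.05 * rng.standard_normal((500, 12))   # gain fault, for example
X = np.vstack([good, bad])
y = np.array([0] * 500 + [1] * 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```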
NASA Technical Reports Server (NTRS)
Aretskin-Hariton, Eliot D.; Zinnecker, Alicia Mae; Culley, Dennis E.
2014-01-01
Distributed Engine Control (DEC) is an enabling technology that has the potential to advance the state-of-the-art in gas turbine engine control. To analyze the capabilities that DEC offers, a Hardware-In-the-Loop (HIL) test bed is being developed at NASA Glenn Research Center. This test bed will support a systems-level analysis of control capabilities in closed-loop engine simulations. The structure of the HIL emulates a virtual test cell by implementing the operator functions, control system, and engine on three separate computers. This implementation increases the flexibility and extensibility of the HIL. Here, a method is discussed for implementing these interfaces by connecting the three platforms over a dedicated Local Area Network (LAN). This approach is verified using the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k), which is typically implemented on one computer. There are marginal differences between the results from simulation of the typical and the three-computer implementation. Additional analysis of the LAN network, including characterization of network load, packet drop, and latency, is presented. The three-computer setup supports the incorporation of complex control models and proprietary engine models into the HIL framework.
Music therapy career aptitude test.
Lim, Hayoung A
2011-01-01
The purpose of the Music Therapy Career Aptitude Test (MTCAT) was to measure the affective domain of music therapy students including their self-awareness as it relates to the music therapy career, value in human development, interest in general therapy, and aptitude for being a professional music therapist. The MTCAT was administered to 113 music therapy students who are currently freshman or sophomores in an undergraduate music therapy program or in the first year of a music therapy master's equivalency program. The results of analysis indicated that the MTCAT is normally distributed and that all 20 questions are significantly correlated with the total test score of the MTCAT. The reliability of the MTCAT was considerably high (Cronbach's Coefficient Alpha=0.8). The criterion-related validity was examined by comparing the MTCAT scores of music therapy students with the scores of 43 professional music therapists. The correlation between the scores of students and professionals was found to be statistically significant. The results suggest that normal distribution, internal consistency, homogeneity of construct, item discrimination, correlation analysis, content validity, and criterion-related validity in the MTCAT may be helpful in predicting music therapy career aptitude and may aid in the career decision making process of college music therapy students.
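Internal-consistency figures such as the reported coefficient alpha are easy to reproduce from an item-score matrix. A minimal sketch, with random correlated item scores standing in for the 20 MTCAT questions:

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(0)
ability = rng.normal(size=(113, 1))                    # shared trait -> correlated items
scores = (ability + rng.normal(size=(113, 20)) > 0).astype(float)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```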
NASA Astrophysics Data System (ADS)
Frey, M. P.; Stamm, C.; Schneider, M. K.; Reichert, P.
2011-12-01
A distributed hydrological model was used to simulate the distribution of fast runoff formation as a proxy for critical source areas for herbicide pollution in a small agricultural catchment in Switzerland. We tested to what degree predictions based on prior knowledge without local measurements could be improved by relying on observed discharge. This learning process consisted of five steps: For the prior prediction (step 1), knowledge of the model parameters was coarse and predictions were fairly uncertain. In the second step, discharge data were used to update the prior parameter distribution. Effects of uncertainty in input data and model structure were accounted for by an autoregressive error model. This step decreased the width of the marginal distributions of parameters describing the lower boundary (percolation rates) but hardly affected soil hydraulic parameters. Residual analysis (step 3) revealed model structure deficits. We modified the model, and in the subsequent Bayesian updating (step 4) the widths of the posterior marginal distributions were reduced for most parameters compared to those of the prior. This incremental procedure led to a strong reduction in the uncertainty of the spatial prediction. Thus, despite only using spatially integrated data (discharge), the spatially distributed effect of the improved model structure can be expected to improve the spatially distributed predictions also. The fifth step consisted of a test with independent spatial data on herbicide losses and revealed ambiguous results. The comparison depended critically on the ratio of event to pre-event water that was discharged. This ratio cannot be estimated from hydrological data only. The results demonstrate that the value of local data is strongly dependent on a correct model structure. An iterative procedure of Bayesian updating, model testing, and model modification is suggested.
Cycles till failure of silver-zinc cells with competing failure modes - Preliminary data analysis
NASA Technical Reports Server (NTRS)
Sidik, S. M.; Leibecki, H. F.; Bozek, J. M.
1980-01-01
The data analysis of cycles to failure of silver-zinc electrochemical cells with competing failure modes is presented. The test ran 129 cells through charge-discharge cycles until failure; preliminary data analysis consisted of a response surface estimate of life. Cells fail through either a low-voltage condition or an internal shorting condition; a competing failure modes analysis was made using maximum likelihood estimation for the extreme value life distribution. Extensive residual plotting and probability plotting were used to verify data quality and the selection of the model.
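With competing modes, cells removed by the other failure mode are treated as right-censored when fitting the life distribution for the mode of interest. The sketch below illustrates that maximum likelihood step with a Weibull life distribution and synthetic cycle counts; it does not reproduce the study's extreme value model or data.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(0)
# Synthetic cycles to failure for the mode of interest and for the competing mode.
true_shape, true_scale = 2.0, 300.0
cycles = weibull_min.rvs(true_shape, scale=true_scale, size=129, random_state=rng)
competing = weibull_min.rvs(1.5, scale=350.0, size=129, random_state=rng)
observed = np.minimum(cycles, competing)      # what is actually seen on test
failed = cycles <= competing                  # True: mode of interest; False: censored

def neg_log_lik(params):
    shape, scale = np.exp(params)             # keep parameters positive
    ll_fail = weibull_min.logpdf(observed[failed], shape, scale=scale).sum()
    ll_cens = weibull_min.logsf(observed[~failed], shape, scale=scale).sum()
    return -(ll_fail + ll_cens)

res = minimize(neg_log_lik, x0=np.log([1.0, observed.mean()]), method="Nelder-Mead")
shape_hat, scale_hat = np.exp(res.x)
print(f"shape ~ {shape_hat:.2f}, scale ~ {scale_hat:.0f} cycles")
```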
Programmable quantum random number generator without postprocessing.
Nguyen, Lac; Rehain, Patrick; Sua, Yong Meng; Huang, Yu-Ping
2018-02-15
We demonstrate a viable source of unbiased quantum random numbers whose statistical properties can be arbitrarily programmed without the need for any postprocessing such as randomness distillation or distribution transformation. It is based on measuring the arrival time of single photons in shaped temporal modes that are tailored with an electro-optical modulator. We show that quantum random numbers can be created directly in customized probability distributions and pass all randomness tests of the NIST and Dieharder test suites without any randomness extraction. The min-entropies of such generated random numbers are measured close to the theoretical limits, indicating their near-ideal statistics and ultrahigh purity. Easy to implement and arbitrarily programmable, this technique can find versatile uses in a multitude of data analysis areas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-06-01
The bibliography contains citations concerning standards and standard tests for water quality in drinking water sources, reservoirs, and distribution systems. Standards from domestic and international sources are presented. Glossaries and vocabularies that concern water quality analysis, testing, and evaluation are included. Standard test methods for individual elements, selected chemicals, sensory properties, radioactivity, and other chemical and physical properties are described. Discussions for proposed standards on new pollutant materials are briefly considered. (Contains a minimum of 203 citations and includes a subject term index and title list.)
KC-135 winglet program overview
NASA Technical Reports Server (NTRS)
Barber, M. R.; Selegan, D.
1982-01-01
A joint NASA/USAF program was conducted to accomplish the following objectives: (1) evaluate the benefits that could be achieved from the application of winglets to KC-135 aircraft; and (2) determine the ability of wind tunnel tests and analytical analysis to predict winglet characteristics. The program included wind-tunnel development of a test winglet configuration; analytical predictions of the changes to the aircraft resulting from the application of the test winglet; and finally, flight tests of the developed configuration. Pressure distribution, loads, stability and control, buffet, fuel mileage, and flutter data were obtained to fulfill the objectives of the program.
First Accelerator Test of the Kinematic Lightweight Energy Meter (KLEM) Prototype
NASA Technical Reports Server (NTRS)
Bashindzhagyan, G.; Adams, J. H.; Bashindzhagyan, P.; Chilingarian, A.; Donnelly, J.; Drury, L.; Egorov, N.; Golubkov, S.; Grebenyuk, V.; Kalinin, A.;
2002-01-01
The essence of the KLEM (Kinematic Lightweight Energy Meter) instrument is to directly measure the elemental energy spectra of high-energy cosmic rays by determining the angular distribution of secondary particles produced in a target. The first test of the simple KLEM prototype has been performed at the CERN SPS test-beam with 180 GeV pions during 2001. The results of the first test analysis confirm that, using the KLEM method, the energy of 180 GeV pions can be measured with a relative error of about 67%, which is very close to the results of the simulation (65%).
NASA Technical Reports Server (NTRS)
Feng, C.; Sun, X.; Shen, Y. N.; Lombardi, Fabrizio
1992-01-01
This paper covers the verification and protocol validation for distributed computer and communication systems using a computer-aided testing approach. Validation and verification make up the so-called process of conformance testing. Protocol applications which pass conformance testing are then checked to see whether they can operate together. This is referred to as interoperability testing. A new comprehensive approach to protocol testing is presented which addresses: (1) modeling for inter-layer representation for compatibility between conformance and interoperability testing; (2) computational improvement to current testing methods by using the proposed model, including the formulation of new qualitative and quantitative measures and time-dependent behavior; (3) analysis and evaluation of protocol behavior for interactive testing without extensive simulation.
Spatial Distribution of Bed Particles in Natural Boulder-Bed Streams
NASA Astrophysics Data System (ADS)
Clancy, K. F.; Prestegaard, K. L.
2001-12-01
The Wolman pebble count is used to obtain the size distribution of bed particles in natural streams. Statistics such as median particle size (D50) are used in resistance calculations. Additional information such as bed particle heterogeneity may also be obtained from the particle distribution, which is used to predict sediment transport rates (Hey, 1979), (Ferguson, Prestegaard, Ashworth, 1989). Boulder-bed streams have an extreme range of particles in the particle size distribution ranging from sand size particles to particles larger than 0.5-m. A study of a natural boulder-bed reach demonstrated that the spatial distribution of the particles is a significant factor in predicting sediment transport and stream bed and bank stability. Further experiments were performed to test the limits of the spatial distribution's effect on sediment transport. Three stream reaches 40-m in length were selected with similar hydrologic characteristics and spatial distributions but varying average size particles. We used a grid 0.5 by 0.5-m and measured four particles within each grid cell. Digital photographs of the streambed were taken in each grid cell. The photographs were examined using image analysis software to obtain particle size and position of the largest particles (D84) within the reach's particle distribution. Cross section, topography and stream depth were surveyed. Velocity and velocity profiles were measured and recorded. With these data and additional surveys of bankfull floods, we tested the significance of the spatial distributions as average particle size decreases. The spatial distribution of streambed particles may provide information about stream valley formation, bank stability, sediment transport, and the growth rate of riparian vegetation.
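The summary statistics mentioned above are simply percentiles of the sampled grain-size distribution. A short sketch, with hypothetical particle diameters standing in for a Wolman count:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical boulder-bed sample: lognormal mix of gravel/cobbles and boulders, in mm.
diameters = np.concatenate([rng.lognormal(3.5, 0.8, 300),   # gravel and cobbles
                            rng.lognormal(6.0, 0.3, 20)])    # boulders > 0.4 m

d50, d84 = np.percentile(diameters, [50, 84])
print(f"D50 = {d50:.0f} mm, D84 = {d84:.0f} mm")
```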
Investigation of melamine-derived quaternary ammonium salt as a potential shale inhibitor
NASA Astrophysics Data System (ADS)
Yu, Hongjiang; Hu, Weimin; Guo, Gang; Huang, Lei; Li, Lili; Gu, Xuefan; Zhang, Zhifang; Zhang, Jie; Chen, Gang
2017-06-01
Melamine, sodium chloroacetate and sodium hydroxide were used as raw materials to synthesize a neutral quaternary ammonium salt (NQAS) as a potential clay-swelling inhibitor and water-based drilling fluid additive, and the reaction conditions were screened based on the linear expansion rate of bentonite. The inhibitive properties of NQASs were investigated by various methods, including a montmorillonite (MMT) linear expansion test, a mud ball immersion test, particle distribution measurement, thermogravimetric analysis and scanning electron microscopy. The results indicate that NQAS can effectively inhibit the expansion and dispersion of clay in water. Under the same conditions, the bentonite linear expansion rate in NQAS-6 solution is much lower than those of the others, and the hydration expansion of the mud ball in 0.5% NQAS-6 solution is appreciably weaker than in the control test. The compatibility test indicates that NQAS-6 is compatible with the conventional additives in water-based drilling fluids, and the temperature resistance of modified starch was improved effectively. Meanwhile, the inhibitive mechanism was discussed based on the particle distribution measurement.
Naval Research Laboratory Industrial Chemical Analysis and Respiratory Filter Standards Development
2017-09-29
Sutto, Thomas E.; Materials and Systems Branch
Approved for public release; distribution is unlimited. The approach, developed by NRL, is tested by examining the filter behavior against a number of chemicals to determine if the NRL approach resulted in the ...
NASA Astrophysics Data System (ADS)
Li, Hanshan
2016-04-01
To enhance the stability and reliability of a multi-screen testing system, this paper studies the properties and performance of the target optical information transmission link of a multi-screen target over long distances. A discrete multi-tone modulation transmission model is set up based on the geometric model of the laser multi-screen testing system and the principle of visible-light information communication. The electro-optic and photoelectric conversion functions of the sender and receiver in the target optical information communication system are analyzed; the target information transmission performance and the transfer function of the generalized visible-light communication channel are investigated; a light-intensity spatial distribution model and distribution function of the optical information transmission link are established; and the SNR model of the information transmission system is derived. Calculation and experimental analysis show that the transmission error rate increases with transmission rate for a given channel modulation depth, and that when an appropriate transmission rate is selected, the bit error rate can reach 0.01.
Rasch model based analysis of the Force Concept Inventory
NASA Astrophysics Data System (ADS)
Planinic, Maja; Ivanjek, Lana; Susac, Ana
2010-06-01
The Force Concept Inventory (FCI) is an important diagnostic instrument which is widely used in the field of physics education research. It is therefore very important to evaluate and monitor its functioning using different tools for statistical analysis. One of such tools is the stochastic Rasch model, which enables construction of linear measures for persons and items from raw test scores and which can provide important insight in the structure and functioning of the test (how item difficulties are distributed within the test, how well the items fit the model, and how well the items work together to define the underlying construct). The data for the Rasch analysis come from the large-scale research conducted in 2006-07, which investigated Croatian high school students’ conceptual understanding of mechanics on a representative sample of 1676 students (age 17-18 years). The instrument used in research was the FCI. The average FCI score for the whole sample was found to be (27.7±0.4)% , indicating that most of the students were still non-Newtonians at the end of high school, despite the fact that physics is a compulsory subject in Croatian schools. The large set of obtained data was analyzed with the Rasch measurement computer software WINSTEPS 3.66. Since the FCI is routinely used as pretest and post-test on two very different types of population (non-Newtonian and predominantly Newtonian), an additional predominantly Newtonian sample ( N=141 , average FCI score of 64.5%) of first year students enrolled in introductory physics course at University of Zagreb was also analyzed. The Rasch model based analysis suggests that the FCI has succeeded in defining a sufficiently unidimensional construct for each population. The analysis of fit of data to the model found no grossly misfitting items which would degrade measurement. Some items with larger misfit and items with significantly different difficulties in the two samples of students do require further examination. The analysis revealed some problems with item distribution in the FCI and suggested that the FCI may function differently in non-Newtonian and predominantly Newtonian population. Some possible improvements of the test are suggested.
Roccato, Anna; Uyttendaele, Mieke; Membré, Jeanne-Marie
2017-06-01
In the framework of food safety, when mimicking the consumer phase, the storage time and temperature used are mainly considered as single point estimates instead of probability distributions. This single-point approach does not take into account the variability within a population and could lead to an overestimation of the parameters. Therefore, the aim of this study was to analyse data on domestic refrigerator temperatures and storage times of chilled food in European countries in order to draw general rules which could be used either in shelf-life testing or risk assessment. In relation to domestic refrigerator temperatures, 15 studies provided pertinent data. Twelve studies presented normal distributions, according to the authors or from the data fitted into distributions. Analysis of temperature distributions revealed that the countries were separated into two groups: northern European countries and southern European countries. The overall variability of European domestic refrigerators is described by a normal distribution: N(7.0, 2.7)°C for southern countries, and N(6.1, 2.8)°C for the northern countries. Concerning storage times, seven papers were pertinent. Analysis indicated that the storage time was likely to end in the first days or weeks (depending on the product use-by date) after purchase. Data fitting showed the exponential distribution was the most appropriate distribution to describe the time that food spent at the consumer's place. The storage time was described by an exponential distribution corresponding to the use-by date period divided by 4. In conclusion, knowing that collecting data is time and money consuming, in the absence of data, and at least for the European market and for refrigerated products, building a domestic refrigerator temperature distribution using a Normal law and a time-to-consumption distribution using an Exponential law would be appropriate. Copyright © 2017 Elsevier Ltd. All rights reserved.
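The two recommended defaults are easy to apply, or to re-fit when survey data are available. A brief sketch with scipy, using synthetic observations in place of the surveyed temperatures and storage times:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
temps = rng.normal(6.1, 2.8, size=500)                   # synthetic fridge temperatures, degC
storage_days = rng.exponential(scale=7 / 4, size=500)    # synthetic times; 7-day use-by period / 4

mu, sigma = stats.norm.fit(temps)
loc, scale = stats.expon.fit(storage_days, floc=0)
print(f"temperature ~ N({mu:.1f}, {sigma:.1f}) degC")
print(f"storage time ~ Exp(mean = {scale:.1f} days)")
```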
Arrieta, Oscar; Anaya, Pablo; Morales-Oyarvide, Vicente; Ramírez-Tirado, Laura Alejandra; Polanco, Ana C
2016-09-01
Assess the cost-effectiveness of an EGFR-mutation testing strategy for advanced NSCLC in first-line therapy with either gefitinib or carboplatin-paclitaxel in Mexican institutions. Cost-effectiveness analysis using a discrete event simulation (DES) model to simulate two therapeutic strategies in patients with advanced NSCLC. Strategy one included patients tested for EGFR-mutation and therapy given accordingly. Strategy two included chemotherapy for all patients without testing. All results are presented in 2014 US dollars. The analysis was made with data from the Mexican frequency of EGFR-mutation. A univariate sensitivity analysis was conducted on EGFR prevalence. Progression-free survival (PFS) transition probabilities were estimated on data from the IPASS and simulated with a Weibull distribution, run with parallel trials to calculate a probabilistic sensitivity analysis. PFS of patients in the testing strategy was 6.76 months (95 % CI 6.10-7.44) vs 5.85 months (95 % CI 5.43-6.29) in the non-testing group. The one-way sensitivity analysis showed that PFS has a direct relationship with EGFR-mutation prevalence, while the ICER and testing cost have an inverse relationship with EGFR-mutation prevalence. The probabilistic sensitivity analysis showed that all iterations had incremental costs and incremental PFS for strategy 1 in comparison with strategy 2. There is a direct relationship between the ICER and the cost of EGFR testing, with an inverse relationship with the prevalence of EGFR-mutation. When prevalence is >10 % ICER remains constant. This study could impact Mexican and Latin American health policies regarding mutation detection testing and treatment for advanced NSCLC.
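At its simplest, the comparison reduces to sampling progression-free survival for each strategy and forming the incremental cost-effectiveness ratio. The sketch below uses invented Weibull parameters and costs purely to show the calculation; it is not the study's discrete event simulation or its inputs.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000  # simulated patients per strategy (illustrative only)

# Hypothetical Weibull PFS scales (months) and per-patient costs (USD); not the study's inputs.
pfs_test  = rng.weibull(1.3, n) * 7.5    # strategy 1: EGFR testing, therapy given accordingly
pfs_chemo = rng.weibull(1.3, n) * 6.3    # strategy 2: chemotherapy for all, no testing
cost_test, cost_chemo = 9_000 + 120.0, 7_500.0   # assumed therapy cost plus EGFR test

icer = (cost_test - cost_chemo) / (pfs_test.mean() - pfs_chemo.mean())
print(f"incremental PFS: {pfs_test.mean() - pfs_chemo.mean():.2f} months")
print(f"ICER: {icer:.0f} USD per PFS month gained")
```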
Jia, Lei; Li, Lin; Gui, Tao; Liu, Siyang; Li, Hanping; Han, Jingwan; Guo, Wei; Liu, Yongjian; Li, Jingyun
2016-09-21
With increasing data on HIV-1, a more relevant molecular model describing mechanism details of HIV-1 genetic recombination usually requires upgrades. Currently an incomplete structural understanding of the copy choice mechanism along with several other issues in the field that lack elucidation led us to perform an analysis of the correlation between breakpoint distributions and (1) the probability of base pairing, and (2) intersubtype genetic similarity to further explore structural mechanisms. Near full length sequences of URFs from Asia, Europe, and Africa (one sequence/patient), and representative sequences of worldwide CRFs were retrieved from the Los Alamos HIV database. Their recombination patterns were analyzed by jpHMM in detail. Then the relationships between breakpoint distributions and (1) the probability of base pairing, and (2) intersubtype genetic similarities were investigated. Pearson correlation test showed that all URF groups and the CRF group exhibit the same breakpoint distribution pattern. Additionally, the Wilcoxon two-sample test indicated a significant and inexplicable limitation of recombination in regions with high pairing probability. These regions have been found to be strongly conserved across distinct biological states (i.e., strong intersubtype similarity), and genetic similarity has been determined to be a very important factor promoting recombination. Thus, the results revealed an unexpected disagreement between intersubtype similarity and breakpoint distribution, which were further confirmed by genetic similarity analysis. Our analysis reveals a critical conflict between results from natural HIV-1 isolates and those from HIV-1-based assay vectors in which genetic similarity has been shown to be a very critical factor promoting recombination. These results indicate the region with high-pairing probabilities may be a more fundamental factor affecting HIV-1 recombination than sequence similarity in natural HIV-1 infections. Our findings will be relevant in furthering the understanding of HIV-1 recombination mechanisms.
NASA Technical Reports Server (NTRS)
Siemers, P. M., III; Henry, M. W.
1986-01-01
Pressure distribution test data obtained on a 0.10-scale model of the forward fuselage of the Space Shuttle Orbiter are presented without analysis. The tests were completed in the Ames Unitary Plan Wind Tunnel (UPWT). The UPWT tests were conducted in two different test sections operating in the continuous mode, the 8 x 7 feet and 9 x 7 feet test sections. Each test section has its own Mach number range, 1.6 to 2.5 and 2.5 to 3.5 for the 9 x 7 feet and 8 x 7 feet test sections, respectively. The test Reynolds number ranged from 1.6 to 2.5 × 10^6 per foot and 0.6 to 2.0 × 10^6 per foot, respectively. The tests were conducted in support of the development of the Shuttle Entry Air Data System (SEADS). In addition to modeling the 20 SEADS orifices, the wind-tunnel model was also instrumented with orifices to match Development Flight Instrumentation (DFI) port locations that existed on the Space Shuttle Columbia (OV-102) during the Orbiter Flight Test program. This DFI simulation has provided a means for comparisons between reentry flight pressure data and wind-tunnel and computational data.
2016-06-01
The survey design will help assess each individual's perceptions on the five primary research questions. After creating the survey, it was ... distributed to individuals that have submitted requirements packages through the ASSP process. The survey field test was designed to determine the ... will be designated for each of the service portfolio groups and collaborates to define common processes across DOD Component Level Leads (CLL).
HEMP (high-altitude electromagnetic pulse) test and analysis of selected recloser-control units
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, T.K.; Sands, S.H.; Tesche, F.M.
A simulated HEMP test was performed on power line recloser-control units in the ARES facility during the month of October 1988. Two types of recloser-control units were tested: an electronic control unit presently in wide use in electric power distribution systems and a new microprocessor based unit presently being introduced to electric utilities. It was found that the ARES fields did not cause reproducible disruptive failure of the equipment. Minor upsets, which were considered to be non-disruptive to the recloser operation, were observed. The test results were compared to the results of an analysis from a previous study and it is concluded that the probability of disruptive failure of field operating recloser-control units subjected to a nominal unclassified HEMP environment is small. 3 refs., 30 figs., 1 tab.
Quantification of sensory and food quality: the R-index analysis.
Lee, Hye-Seong; van Hout, Danielle
2009-08-01
The accurate quantification of sensory difference/similarity between foods, as well as consumer acceptance/preference and concepts, is greatly needed to optimize and maintain food quality. The R-Index is one class of measures of the degree of difference/similarity, and was originally developed for sensory difference tests for food quality control, product development, and so on. The index is based on signal detection theory and is free of the response bias that can invalidate difference testing protocols, including categorization and same-different and A-Not A tests. It is also a nonparametric analysis, making no assumptions about sensory distributions, and is simple to compute and understand. The R-Index is also flexible in its application. Methods based on R-Index analysis have been used as detection and sensory difference tests, as simple alternatives to hedonic scaling, and for the measurement of consumer concepts. This review indicates the various computational strategies for the R-Index and its practical applications to consumer and sensory measurements in food science.
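Computationally, the R-index from a rating frequency table is the probability that a randomly chosen "signal" response is placed in a more signal-sure category than a randomly chosen "noise" response, with ties counted as half. A small sketch with hypothetical counts on a 4-point sureness scale:

```python
import numpy as np

def r_index(signal_counts, noise_counts):
    """R-index from rating frequencies; categories ordered from most to least
    'signal'-like. Equivalent to the Mann-Whitney estimate of P(signal > noise)."""
    s = np.asarray(signal_counts, float)
    n = np.asarray(noise_counts, float)
    wins = sum(s[i] * n[i + 1:].sum() for i in range(len(s)))   # signal rated more signal-sure
    ties = (s * n).sum()                                        # same category
    return (wins + 0.5 * ties) / (s.sum() * n.sum())

# Hypothetical frequencies: "signal sure", "signal unsure", "noise unsure", "noise sure"
print(f"R = {r_index([12, 6, 2, 0], [1, 3, 7, 9]):.2f}")
```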
Quantifying Low Energy Proton Damage in Multijunction Solar Cells
NASA Technical Reports Server (NTRS)
Messenger, Scott R.; Burke, Edward A.; Walters, Robert J.; Warner, Jeffrey H.; Summers, Geoffrey P.; Lorentzen, Justin R.; Morton, Thomas L.; Taylor, Steven J.
2007-01-01
An analysis of the effects of low energy proton irradiation on the electrical performance of triple junction (3J) InGaP2/GaAs/Ge solar cells is presented. The Monte Carlo ion transport code (SRIM) is used to simulate the damage profile induced in a 3J solar cell under the conditions of typical ground testing and that of the space environment. The results are used to present a quantitative analysis of the defect, and hence damage, distribution induced in the cell active region by the different radiation conditions. The modelling results show that, in the space environment, the solar cell will experience a uniform damage distribution through the active region of the cell. Through an application of the displacement damage dose analysis methodology, the implications of this result on mission performance predictions are investigated.
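As a rough illustration of the displacement damage dose bookkeeping referred to above (not the authors' code or data), the dose is obtained by weighting the proton fluence spectrum by the nonionizing energy loss (NIEL) at each energy and summing; the energy grid, fluence spectrum, and NIEL values below are hypothetical placeholders.

```python
# Minimal sketch of a displacement damage dose (Dd) calculation:
# Dd = sum over energy bins of (fluence per bin) x NIEL(E).
# All numbers are illustrative placeholders, not SRIM output or the paper's data.
import numpy as np

energy_MeV = np.array([0.1, 0.5, 1.0, 5.0, 10.0])      # proton energies per bin
fluence    = np.array([5e10, 2e10, 1e10, 4e9, 2e9])    # protons/cm^2 in each bin
niel       = np.array([8e-2, 2e-2, 1e-2, 3e-3, 2e-3])  # MeV*cm^2/g (illustrative values)

dd = np.sum(fluence * niel)                             # displacement damage dose, MeV/g
print(f"Dd = {dd:.3e} MeV/g")
```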
Vo, T D; Dwyer, G; Szeto, H H
1986-04-01
A relatively powerful and inexpensive microcomputer-based system for the spectral analysis of the EEG is presented. High resolution and speed are achieved by using recently available large-scale integrated circuit technology with enhanced functionality (Intel 8087 math co-processor), which can perform transcendental functions rapidly. The versatility of the system is achieved with a hardware organization that provides distributed data acquisition capability through a microprocessor-based analog-to-digital converter with large resident memory (Cyborg ISAAC-2000). Compiled BASIC programs and assembly-language subroutines perform the fast Fourier transform and spectral analysis of the EEG on-line or off-line, with results stored as soft as well as hard copy. Some results obtained from test application of the entire system in animal studies are presented.
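A minimal sketch of the kind of FFT-based spectral analysis such a system performs, using modern tools in place of the compiled BASIC/8087 implementation; the sampling rate and the synthetic test signal are assumptions for illustration only.

```python
# Minimal sketch: power spectral density of an EEG-like signal via Welch's method.
# Sampling rate and synthetic signal are hypothetical, not from the described system.
import numpy as np
from scipy.signal import welch

fs = 256.0                                   # Hz, assumed sampling rate
t = np.arange(0, 30.0, 1.0 / fs)             # 30 s of data
# synthetic "EEG": 10 Hz alpha-like rhythm plus broadband noise
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=1024)
alpha = (freqs >= 8) & (freqs <= 13)
print(f"Alpha-band (8-13 Hz) power: {np.trapz(psd[alpha], freqs[alpha]):.3f}")
```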
Ahlborn, W; Tuz, H J; Uberla, K
1990-03-01
In cohort studies the Mantel-Haenszel estimator OR_MH is computed from sample data and is used as a point estimator of relative risk. Test-based confidence intervals are estimated with the help of the asymptotically chi-squared-distributed MH statistic chi^2_MHS. The Mantel-extension chi-squared is used as a test statistic for a dose-response relationship. Both test statistics -- the Mantel-Haenszel chi as well as the Mantel-extension chi -- assume homogeneity of risk across strata, which is rarely present. An extended nonparametric statistic proposed by Terpstra, which is based on the Mann-Whitney statistics, also assumes homogeneity of risk across strata. We have earlier defined four risk measures RR_kj (k = 1, 2, ..., 4) in the population and considered their estimates and the corresponding asymptotic distributions. In order to overcome the homogeneity assumption we use the delta method to obtain "test-based" confidence intervals. Because the four risk measures RR_kj are presented as functions of four weights g_ik, we also give the asymptotic variances of these risk estimators as functions of the weights g_ik in closed form. Approximations to these variances are given. For testing a dose-response relationship we propose a new class of chi^2(1)-distributed global measures G_k and the corresponding global chi^2 test. In contrast to the Mantel-extension chi, homogeneity of risk across strata need not be assumed. These global test statistics are of the Wald type for composite hypotheses. (ABSTRACT TRUNCATED AT 250 WORDS)
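For orientation, a minimal sketch of the classical quantities the abstract starts from (the stratified MH estimator, the MH chi-square, and Miettinen's test-based confidence interval), not the authors' RR_kj measures or their delta-method variances; the 2x2 stratum counts are hypothetical.

```python
# Minimal sketch of the Mantel-Haenszel estimator and a test-based CI across strata.
# Each stratum is (a, b, c, d) = (exposed cases, exposed non-cases,
# unexposed cases, unexposed non-cases); the counts are hypothetical.
import numpy as np

strata = np.array([[12, 40,  6, 60],
                   [20, 35, 10, 55],
                   [ 8, 25,  7, 45]], dtype=float)
a, b, c, d = strata.T
n = a + b + c + d

or_mh = np.sum(a * d / n) / np.sum(b * c / n)

# MH chi-square: mean and variance of a under the null, given the margins
expect = (a + b) * (a + c) / n
var = (a + b) * (c + d) * (a + c) * (b + d) / (n**2 * (n - 1))
chi2_mh = (a.sum() - expect.sum())**2 / var.sum()

# test-based 95% CI (Miettinen): OR^(1 +/- 1.96/chi)
chi = np.sqrt(chi2_mh)
lo, hi = or_mh**(1 - 1.96 / chi), or_mh**(1 + 1.96 / chi)
print(f"OR_MH = {or_mh:.2f}, MH chi-square = {chi2_mh:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```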
Distribution of new graphic warning labels: Are tobacco companies following regulations?
Wilson, Nick; Peace, Jo; Li, Judy; Edwards, Richard; Hoek, Janet; Stanley, James; Thomson, George
2009-08-25
To test the hypothesis that tobacco companies would not follow a regulation that required seven new graphic health warnings (GHWs) to be evenly distributed on cigarette packs and that they would distribute fewer packs featuring warnings regarded by smokers as being more disturbing. Cross-sectional survey of purchased packs (n = 168) and street-collected discarded packs (convenience sample of New Zealand cities and towns, n = 1208 packs) with statistical analysis of seven types of new GHWs. A priori warning impact was judged using three criteria, which were tested against data from depth interviews with retailers. The GHWs on the purchased packs and street-collected packs both showed a distribution pattern that was generally consistent with the hypothesis, i.e., there were disproportionately more packs featuring images judged as "least disturbing" and disproportionately fewer of those with warnings judged "more disturbing". The overall patterns were statistically significant, suggesting an unequal frequency of the different warnings for both purchased (p < 0.0001) and street-collected packs (p = 0.035). One of the least disturbing images (of a "corpse with toe-tag") dominated the distribution in both samples. Further analysis of the street-collected packs revealed that this image appeared disproportionately more frequently on manufactured cigarettes made by each of the three largest New Zealand tobacco companies. Although stock clustering could explain the purchased pack result, there were no obvious reasons why the same uneven warning distribution was also evident among the street-collected packs. These results suggest that tobacco companies are not following the regulations, which require even distribution of the seven different GHWs on cigarette packs; further monitoring is required to estimate the extent of this non-compliance. As an immediate measure, governments should strictly enforce all regulations applying to health warnings, particularly given that these are an effective tobacco control intervention that costs taxpayers nothing.
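The statistical comparison described is, in essence, a goodness-of-fit test of the observed warning counts against a uniform distribution over the seven warnings; a minimal sketch with hypothetical counts (not the study's data) follows.

```python
# Minimal sketch: chi-square goodness-of-fit test of whether seven graphic health
# warnings appear with equal frequency. Counts are hypothetical, not the study's data.
from scipy.stats import chisquare

observed = [300, 240, 160, 150, 130, 120, 100]   # packs per warning (hypothetical)
stat, p = chisquare(observed)                    # H0: all seven warnings equally frequent
print(f"chi-square = {stat:.1f}, p = {p:.2g}")
```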
Shirazi, Mohammadali; Dhavala, Soma Sekhar; Lord, Dominique; Geedipally, Srinivas Reddy
2017-10-01
Safety analysts usually use post-modeling methods, such as Goodness-of-Fit statistics or the Likelihood Ratio Test, to decide between two or more competing distributions or models. Such metrics require all competing distributions to be fitted to the data before any comparisons can be made. Given the continuous growth in newly introduced statistical distributions, choosing the best one using such post-modeling methods is not a trivial task, in addition to all the theoretical or numerical issues the analyst may face during the analysis. Furthermore, and most importantly, these measures or tests do not provide any intuition about why a specific distribution (or model) is preferred over another (Goodness-of-Logic). This paper addresses these issues by proposing a methodology to design heuristics for model selection based on the characteristics of data, in terms of descriptive summary statistics, before fitting the models. The proposed methodology employs two analytic tools: (1) Monte Carlo simulations and (2) machine learning classifiers, to design easy heuristics to predict the label of the 'most-likely-true' distribution for analyzing data. The proposed methodology was applied to investigate when the recently introduced Negative Binomial Lindley (NB-L) distribution is preferred over the Negative Binomial (NB) distribution. Heuristics were designed to select the 'most-likely-true' distribution between these two distributions, given a set of prescribed summary statistics of the data. The proposed heuristics were successfully compared against classical tests for several real or observed datasets. Not only are they easy to use and free of any post-modeling inputs, but, using these heuristics, the analyst can also attain useful information about why the NB-L is preferred over the NB - or vice versa - when modeling data. Copyright © 2017 Elsevier Ltd. All rights reserved.
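A minimal sketch of the general recipe described above (Monte Carlo simulation plus a machine learning classifier that turns summary statistics into selection heuristics). Because simulating the NB-Lindley is involved, a Poisson-lognormal is used here as a stand-in second candidate distribution; all settings are illustrative and none of this is the authors' code.

```python
# Minimal sketch: simulate count data from two candidate distributions, compute
# summary statistics, and train a shallow decision tree whose printed rules serve
# as selection heuristics. Poisson-lognormal is a stand-in for NB-L; settings are illustrative.
import numpy as np
from scipy.stats import skew
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)

def summary(y):
    return [y.mean(), y.var() / max(y.mean(), 1e-9), skew(y), np.mean(y == 0)]

X, labels = [], []
for _ in range(2000):
    if rng.random() < 0.5:                       # candidate 1: negative binomial
        y = rng.negative_binomial(n=rng.uniform(0.5, 3.0), p=rng.uniform(0.2, 0.8), size=200)
        labels.append("NB")
    else:                                        # candidate 2: heavier-tailed stand-in for NB-L
        lam = np.exp(rng.normal(rng.uniform(-0.5, 1.0), rng.uniform(0.8, 1.5), size=200))
        y = rng.poisson(lam)
        labels.append("heavier-tailed")
    X.append(summary(y))

tree = DecisionTreeClassifier(max_depth=2).fit(X, labels)
print(export_text(tree, feature_names=["mean", "var/mean", "skewness", "prop_zeros"]))
```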
Aerobots as a Ubiquitous Part of Society
NASA Technical Reports Server (NTRS)
Young, Larry A.
2006-01-01
Small autonomous aerial robots (aerobots) have the potential to make significant positive contributions to modern society. Aerobots of various vehicle-types - CTOL, STOL, VTOL, and even possibly LTA - will be a part of a new paradigm for the distribution of goods and services. Aerobots as a class of vehicles may test the boundaries of aircraft design. New system analysis and design tools will be required in order to account for the new technologies and design parameters/constraints for such vehicles. The analysis tools also provide new approaches to defining/assessing technology goals and objectives and the technology portfolio necessary to accomplish those goals and objectives. Using the aerobot concept as an illustrative test case, key attributes of these analysis tools are discussed.
On computation of p-values in parametric linkage analysis.
Kurbasic, Azra; Hössjer, Ola
2004-01-01
Parametric linkage analysis is usually used to find chromosomal regions linked to a disease (phenotype) that is described with a specific genetic model. This is done by investigating the relations between the disease and genetic markers, that is, well-characterized loci of known position with a clear Mendelian mode of inheritance. Assume we have found an interesting region on a chromosome that we suspect is linked to the disease. Then we want to test the hypothesis of no linkage versus the alternative one of linkage. As a measure we use the maximal lod score Z_max. It is well known that the maximal lod score asymptotically has a (2 ln 10)^(-1) x (1/2 chi^2(0) + 1/2 chi^2(1)) distribution under the null hypothesis of no linkage when only one point (one marker) on the chromosome is studied. In this paper, we show, both by simulations and theoretical arguments, that the null hypothesis distribution of Z_max has no simple form when more than one marker is used (multipoint analysis). In fact, the distribution of Z_max depends on the number of families, their structure, the assumed genetic model, marker denseness, and marker informativity. This means that a constant critical limit of Z_max leads to tests associated with different significance levels. Because of the above-mentioned problems, from the statistical point of view the maximal lod score should be supplemented by a p-value when results are reported. Copyright (c) 2004 S. Karger AG, Basel.
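For the single-marker case quoted above, the mixture null distribution translates into a simple p-value formula; a minimal sketch follows (illustrative only, and explicitly not valid for multipoint analysis, which is the paper's point).

```python
# Minimal sketch: asymptotic p-value for a single-marker maximal lod score Z_max
# under the 0.5*chi2(0) + 0.5*chi2(1) mixture null. Not valid for multipoint analysis.
import numpy as np
from scipy.stats import chi2

def lod_pvalue_single_marker(z_max):
    if z_max <= 0:
        return 1.0
    lrt = 2 * np.log(10) * z_max          # convert lod score to a likelihood-ratio statistic
    return 0.5 * chi2.sf(lrt, df=1)       # the point mass at zero contributes nothing for lrt > 0

print(f"p-value for Z_max = 3.0: {lod_pvalue_single_marker(3.0):.2e}")   # ~1e-4
```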
NASA Astrophysics Data System (ADS)
Ma, Yong; Qin, Jianfeng; Zhang, Xiangyu; Lin, Naiming; Huang, Xiaobo; Tang, Bin
2015-07-01
Using the impact test and finite element simulation, the failure behavior of the Mo-modified layer on pure Ti was investigated. In the impact test, four loads of 100, 300, 500, and 700 N and 10^4 impacts were adopted. The three-dimensional residual impact dents were examined using an optical microscope (Olympus DSX500i), indicating that the impact resistance of the Ti surface was improved. Two failure modes, cohesive failure and wearing, were elucidated by electron backscatter diffraction and energy-dispersive spectrometry performed in a field-emission scanning electron microscope. Through finite element forward analysis performed at a typical impact load of 300 N, stress-strain distributions in the Mo-modified Ti were quantitatively determined. In addition, the failure behavior of the Mo-modified layer was determined and an ideal failure model was proposed for high-load impact, based on the experimental and finite element forward analysis results.
Statistical inference methods for two crossing survival curves: a comparison of methods.
Li, Huimin; Han, Dong; Hou, Yawen; Chen, Huilin; Chen, Zheng
2015-01-01
A common problem that is encountered in medical applications is the overall homogeneity of survival distributions when two survival curves cross each other. A survey demonstrated that under this condition, which was an obvious violation of the assumption of proportional hazard rates, the log-rank test was still used in 70% of studies. Several statistical methods have been proposed to solve this problem. However, in many applications, it is difficult to specify the types of survival differences and choose an appropriate method prior to analysis. Thus, we conducted an extensive series of Monte Carlo simulations to investigate the power and type I error rate of these procedures under various patterns of crossing survival curves with different censoring rates and distribution parameters. Our objective was to evaluate the strengths and weaknesses of tests in different situations and for various censoring rates and to recommend an appropriate test that will not fail for a wide range of applications. Simulation studies demonstrated that adaptive Neyman's smooth tests and the two-stage procedure offer higher power and greater stability than other methods when the survival distributions cross at early, middle or late times. Even for proportional hazards, both methods maintain acceptable power compared with the log-rank test. In terms of the type I error rate, Renyi and Cramér-von Mises tests are relatively conservative, whereas the statistics of the Lin-Xu test exhibit apparent inflation as the censoring rate increases. Other tests produce results close to the nominal 0.05 level. In conclusion, adaptive Neyman's smooth tests and the two-stage procedure are found to be the most stable and feasible approaches for a variety of situations and censoring rates. Therefore, they are applicable to a wider spectrum of alternatives compared with other tests.
Statistical Inference Methods for Two Crossing Survival Curves: A Comparison of Methods
Li, Huimin; Han, Dong; Hou, Yawen; Chen, Huilin; Chen, Zheng
2015-01-01
A common problem that is encountered in medical applications is the overall homogeneity of survival distributions when two survival curves cross each other. A survey demonstrated that under this condition, which was an obvious violation of the assumption of proportional hazard rates, the log-rank test was still used in 70% of studies. Several statistical methods have been proposed to solve this problem. However, in many applications, it is difficult to specify the types of survival differences and choose an appropriate method prior to analysis. Thus, we conducted an extensive series of Monte Carlo simulations to investigate the power and type I error rate of these procedures under various patterns of crossing survival curves with different censoring rates and distribution parameters. Our objective was to evaluate the strengths and weaknesses of tests in different situations and for various censoring rates and to recommend an appropriate test that will not fail for a wide range of applications. Simulation studies demonstrated that adaptive Neyman’s smooth tests and the two-stage procedure offer higher power and greater stability than other methods when the survival distributions cross at early, middle or late times. Even for proportional hazards, both methods maintain acceptable power compared with the log-rank test. In terms of the type I error rate, Renyi and Cramér—von Mises tests are relatively conservative, whereas the statistics of the Lin-Xu test exhibit apparent inflation as the censoring rate increases. Other tests produce results close to the nominal 0.05 level. In conclusion, adaptive Neyman’s smooth tests and the two-stage procedure are found to be the most stable and feasible approaches for a variety of situations and censoring rates. Therefore, they are applicable to a wider spectrum of alternatives compared with other tests. PMID:25615624
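To illustrate the motivating problem in the abstract above, the sketch below simulates two Weibull survival distributions whose curves cross and computes a plain log-rank test by hand (uncensored data only, for brevity); the simulation settings are illustrative and this is not the authors' study design or any of the recommended tests.

```python
# Minimal sketch: hand-rolled log-rank test for two uncensored samples, applied to
# simulated Weibull survival times chosen so that the survival curves cross.
# Settings are illustrative only.
import numpy as np
from scipy.stats import chi2

def logrank_uncensored(t1, t2):
    times = np.unique(np.concatenate([t1, t2]))
    o1 = e1 = v = 0.0
    for t in times:
        n1, n2 = np.sum(t1 >= t), np.sum(t2 >= t)      # at risk just before t
        d1, d2 = np.sum(t1 == t), np.sum(t2 == t)      # events at t
        n, d = n1 + n2, d1 + d2
        o1 += d1
        e1 += d * n1 / n
        if n > 1:
            v += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    stat = (o1 - e1) ** 2 / v
    return stat, chi2.sf(stat, df=1)

rng = np.random.default_rng(0)
a = rng.weibull(0.8, 200) * 12.0    # decreasing hazard: many early events
b = rng.weibull(2.0, 200) * 10.0    # increasing hazard: few early, more late events
stat, p = logrank_uncensored(a, b)
print(f"log-rank chi-square = {stat:.2f}, p = {p:.3f}")
```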
NASA Technical Reports Server (NTRS)
Tessarzik, J. M.; Chiang, T.; Badgley, R. H.
1973-01-01
The random vibration response of a gas bearing rotor support system has been experimentally and analytically investigated in the amplitude and frequency domains. The NASA Brayton Rotating Unit (BRU), a 36,000 rpm, 10 kWe turbogenerator, had previously been subjected in the laboratory to external random vibrations, and the response data recorded on magnetic tape. This data has now been experimentally analyzed for amplitude distribution and frequency content. The results of the power spectral density analysis indicate strong vibration responses for the major rotor-bearing system components at frequencies which correspond closely to their resonant frequencies obtained under periodic vibration testing. The results of amplitude analysis indicate an increasing shift towards non-Gaussian distributions as the input level of external vibrations is raised. Analysis of the axial random vibration response of the BRU was performed by using a linear three-mass model. Power spectral densities and the root-mean-square value of thrust bearing surface contact were calculated for specified input random excitation.
NASA Astrophysics Data System (ADS)
Lian, Enyang; Ren, Yingyu; Han, Yunfeng; Liu, Weixin; Jin, Ningde; Zhao, Junying
2016-11-01
The multi-scale analysis is an important method for detecting nonlinear systems. In this study, we carry out experiments and measure the fluctuation signals from a rotating electric field conductance sensor with eight electrodes. We first use a recurrence plot to recognise flow patterns in vertical upward gas-liquid two-phase pipe flow from measured signals. Then we apply a multi-scale morphological analysis based on the first-order difference scatter plot to investigate the signals captured from the vertical upward gas-liquid two-phase flow loop test. We find that the invariant scaling exponent extracted from the multi-scale first-order difference scatter plot with the bisector of the second-fourth quadrant as the reference line is sensitive to the inhomogeneous distribution characteristics of the flow structure, and the variation trend of the exponent is helpful to understand the process of breakup and coalescence of the gas phase. In addition, we explore the dynamic mechanism influencing the inhomogeneous distribution of the gas phase in terms of adaptive optimal kernel time-frequency representation. The research indicates that the system energy is a factor influencing the distribution of the gas phase and the multi-scale morphological analysis based on the first-order difference scatter plot is an effective method for indicating the inhomogeneous distribution of the gas phase in gas-liquid two-phase flow.
Belle, Elise M S; Barbujani, Guido
2007-08-01
Previous studies of the correlations between the languages spoken by human populations and the genes carried by the members of those populations have been limited by the small amount of genetic markers available and by approximations in the treatment of linguistic data. In this study we analyzed a large collection of polymorphic microsatellite loci (377), distributed on all autosomes, and used Ruhlen's linguistic classification, to investigate the relative roles of geography and language in shaping the distribution of human DNA diversity at a worldwide scale. For this purpose, we performed three different kinds of analysis: (i) we partitioned genetic variances at three hierarchical levels of population subdivision according to language group by means of a molecular analysis of variance (AMOVA); (ii) we quantified by a series of Mantel's tests the correlation between measures of genetic and linguistic differentiation; and (iii) we tested whether linguistic differences are increased across known zones of increased genetic change between populations. Genetic differences appear to more closely reflect geographic than linguistic differentiation. However, our analyses show that language differences also have a detectable effect on DNA diversity at the genomic level, above and beyond the effects of geographic distance. (c) 2007 Wiley-Liss, Inc.
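One of the analyses named above, the Mantel test, is simple to sketch: correlate the off-diagonal entries of two pairwise distance matrices and assess significance by permuting population labels. The matrices below are random placeholders, not the study's genetic, geographic, or linguistic data.

```python
# Minimal sketch: Mantel test of correlation between two distance matrices,
# with a permutation p-value. Input matrices are random placeholders.
import numpy as np

def mantel(d1, d2, n_perm=999, seed=0):
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(d1, k=1)                  # use the upper triangle only
    r_obs = np.corrcoef(d1[iu], d2[iu])[0, 1]
    count = 0
    for _ in range(n_perm):
        p = rng.permutation(d1.shape[0])                # permute population labels
        count += np.corrcoef(d1[p][:, p][iu], d2[iu])[0, 1] >= r_obs
    return r_obs, (count + 1) / (n_perm + 1)

rng = np.random.default_rng(1)
x = rng.random((15, 2))                                     # 15 hypothetical populations
geo = np.linalg.norm(x[:, None] - x[None, :], axis=-1)      # "geographic" distances
gen = geo + rng.normal(0, 0.1, geo.shape)                   # correlated "genetic" distances
gen = (gen + gen.T) / 2
np.fill_diagonal(gen, 0)

r, p = mantel(gen, geo)
print(f"Mantel r = {r:.2f}, one-sided p = {p:.3f}")
```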
Combined tension and bending testing of tapered composite laminates
NASA Astrophysics Data System (ADS)
O'Brien, T. Kevin; Murri, Gretchen B.; Hagemeier, Rick; Rogers, Charles
1994-11-01
A simple beam element used at Bell Helicopter was incorporated in the Computational Mechanics Testbed (COMET) finite element code at the Langley Research Center (LaRC) to analyze the response of tapered laminates typical of flexbeams in composite rotor hubs. This beam element incorporated the influence of membrane loads on the flexural response of the tapered laminate configurations modeled and tested in a combined axial tension and bending (ATB) hydraulic load frame designed and built at LaRC. The moments generated from the finite element model were used in a tapered laminated plate theory analysis to estimate axial stresses on the surface of the tapered laminates due to combined bending and tension loads. Surface strains were calculated and compared to surface strains measured using strain gages mounted along the laminate length. The strain distributions correlated reasonably well with the analysis. The analysis was then used to examine the surface strain distribution in a non-linear tapered laminate, where a similarly good correlation was obtained. Results indicate that simple finite element beam models may be used to identify tapered laminate configurations best suited for simulating the response of a composite flexbeam in a full-scale rotor hub.
NASA Astrophysics Data System (ADS)
Waghorn, Ben J.; Shah, Amish P.; Ngwa, Wilfred; Meeks, Sanford L.; Moore, Joseph A.; Siebers, Jeffrey V.; Langen, Katja M.
2010-07-01
Intra-fraction organ motion during intensity-modulated radiation therapy (IMRT) treatment can cause differences between the planned and the delivered dose distribution. To investigate the extent of these dosimetric changes, a computational model was developed and validated. The computational method allows for calculation of the rigid motion perturbed three-dimensional dose distribution in the CT volume and therefore a dose volume histogram-based assessment of the dosimetric impact of intra-fraction motion on a rigidly moving body. The method was developed and validated for both step-and-shoot IMRT and solid compensator IMRT treatment plans. For each segment (or beam), fluence maps were exported from the treatment planning system. Fluence maps were shifted according to the target position deduced from a motion track. These shifted, motion-encoded fluence maps were then re-imported into the treatment planning system and were used to calculate the motion-encoded dose distribution. To validate the accuracy of the motion-encoded dose distribution the treatment plan was delivered to a moving cylindrical phantom using a programmed four-dimensional motion phantom. Extended dose response (EDR-2) film was used to measure a planar dose distribution for comparison with the calculated motion-encoded distribution using a gamma index analysis (3% dose difference, 3 mm distance-to-agreement). A series of motion tracks incorporating both inter-beam step-function shifts and continuous sinusoidal motion were tested. The method was shown to accurately predict the film's dose distribution for all of the tested motion tracks, both for the step-and-shoot IMRT and compensator plans. The average gamma analysis pass rate for the measured dose distribution with respect to the calculated motion-encoded distribution was 98.3 ± 0.7%. For static delivery the average film-to-calculation pass rate was 98.7 ± 0.2%. In summary, a computational technique has been developed to calculate the dosimetric effect of intra-fraction motion. This technique has the potential to evaluate a given plan's sensitivity to anticipated organ motion. With knowledge of the organ's motion it can also be used as a tool to assess the impact of measured intra-fraction motion after dose delivery.
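A minimal one-dimensional sketch of the global gamma-index comparison (3% dose difference, 3 mm distance-to-agreement) used above to compare the film and calculated dose distributions; the dose profiles are synthetic placeholders, not measured or planned data.

```python
# Minimal sketch: 1D global gamma-index analysis (3% / 3 mm) between a reference
# and an evaluated dose profile. Profiles are synthetic placeholders.
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_crit=0.03, dta_mm=3.0):
    """Gamma value at each evaluated point; the pass criterion is gamma <= 1."""
    dd = dose_crit * d_ref.max()                        # global dose-difference criterion
    dist = (x_eval[:, None] - x_ref[None, :]) / dta_mm
    diff = (d_eval[:, None] - d_ref[None, :]) / dd
    return np.sqrt(dist**2 + diff**2).min(axis=1)

x = np.linspace(-50, 50, 201)                           # position in mm
planned  = np.exp(-(x / 20.0) ** 2)                     # synthetic planned profile
measured = np.exp(-((x - 2.0) / 20.0) ** 2) * 1.01      # shifted, rescaled "measured" profile

g = gamma_1d(x, planned, x, measured)
print(f"gamma pass rate: {100 * np.mean(g <= 1):.1f}%")
```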
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tougaard, Sven
The author reports a systematic study of the range of validity of a previously developed algorithm for automated x-ray photoelectron spectroscopy analysis, which takes into account the variation in both peak intensity and the intensity in the background of inelastically scattered electrons. This test was done by first simulating spectra for the Au4d peak with gold atoms distributed in the form of a wide range of nanostructures, which includes overlayers with varying thickness, a 5 Å layer of atoms buried at varying depths, and a substrate covered with an overlayer of varying thickness. Next, the algorithm was applied to analyze these spectra. The algorithm determines the number of atoms within the outermost 3λ of the surface. This amount of substance is denoted AOS_3λ (where λ is the electron inelastic mean free path). In general the determined AOS_3λ is found to be accurate to within approximately 10-20% depending on the depth distribution of the atoms. The algorithm also determines a characteristic length L, which was found to give unambiguous information on the depth distribution of the atoms for practically all studied cases. A set of rules for this parameter, which relates the value of L to the depths where the atoms are distributed, was tested, and these rules were found to be generally valid with only a few exceptions. The results were found to be rather independent of the spectral energy range (from 20 to 40 eV below the peak energy) used in the analysis.
Haworth, Claire M A; Kovas, Yulia; Harlaar, Nicole; Hayiou-Thomas, Marianna E; Petrill, Stephen A; Dale, Philip S; Plomin, Robert
2009-10-01
Our previous investigation found that the same genes influence poor reading and mathematics performance in 10-year-olds. Here we assess whether this finding extends to language and general cognitive disabilities, as well as replicating the earlier finding for reading and mathematics in an older and larger sample. Using a representative sample of 4000 pairs of 12-year-old twins from the UK Twins Early Development Study, we investigated the genetic and environmental overlap between internet-based batteries of language and general cognitive ability tests, in addition to tests of reading and mathematics, for the bottom 15% of the distribution using DeFries-Fulker extremes analysis. We compared these results to those for the entire distribution. All four traits were highly correlated at the low extreme (average group phenotypic correlation = .58) and in the entire distribution (average phenotypic correlation = .59). Genetic correlations for the low extreme were consistently high (average = .67), and non-shared environmental correlations were modest (average = .23). These results are similar to those seen across the entire distribution (.68 and .23, respectively). The 'Generalist Genes Hypothesis' holds for language and general cognitive disabilities, as well as reading and mathematics disabilities. Genetic correlations were high, indicating a strong degree of overlap in genetic influences on these diverse traits. In contrast, non-shared environmental influences were largely specific to each trait, causing phenotypic differentiation of traits.
Analysis of the stress state in an Iosipescu shear test specimen
NASA Technical Reports Server (NTRS)
Walrath, D. E.; Adams, D. F.
1983-01-01
The state of stress in an Iosipescu shear test specimen is analyzed, utilizing a finite element computer program. The influence of test fixture configuration on this stress state is included. Variations of the standard specimen configuration, including notch depth, notch angle, and notch root radius are modeled. The purpose is to establish guidelines for a specimen geometry which will accommodate highly orthotropic materials while minimizing stress distribution nonuniformities. Materials ranging from isotropic to highly orthotropic are considered. An optimum specimen configuration is suggested, along with changes in the test fixture.
Measuring continuous baseline covariate imbalances in clinical trial data
Ciolino, Jody D.; Martin, Renee’ H.; Zhao, Wenle; Hill, Michael D.; Jauch, Edward C.; Palesch, Yuko Y.
2014-01-01
This paper presents and compares several methods of measuring continuous baseline covariate imbalance in clinical trial data. Simulations illustrate that though the t-test is an inappropriate method of assessing continuous baseline covariate imbalance, the test statistic itself is a robust measure for capturing imbalance in continuous covariate distributions. Guidelines to assess the effects of imbalance on bias, type I error rate, and power for the hypothesis test of treatment effect on continuous outcomes are presented, and the benefit of covariate-adjusted analysis (ANCOVA) is also illustrated. PMID:21865270
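A minimal sketch of the measure discussed: the two-sample t statistic (and the closely related standardized mean difference) used descriptively to quantify baseline imbalance rather than as a hypothesis test; the arm data below are simulated placeholders.

```python
# Minimal sketch: descriptive imbalance measures for a continuous baseline covariate
# between two randomized arms. Data are simulated placeholders.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
arm_a = rng.normal(70.0, 12.0, 150)      # e.g., baseline weight in arm A
arm_b = rng.normal(72.5, 12.0, 150)      # slight imbalance in arm B

t_stat, _ = ttest_ind(arm_a, arm_b)      # use the statistic itself, not its p-value
std_diff = (arm_a.mean() - arm_b.mean()) / np.sqrt((arm_a.var(ddof=1) + arm_b.var(ddof=1)) / 2)
print(f"t statistic (imbalance measure): {t_stat:.2f}")
print(f"standardized mean difference:    {std_diff:.2f}")
```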
Wiedermann, Wolfgang; Li, Xintong
2018-04-16
In nonexperimental data, at least three possible explanations exist for the association of two variables x and y: (1) x is the cause of y, (2) y is the cause of x, or (3) an unmeasured confounder is present. Statistical tests that identify which of the three explanatory models fits best would be a useful adjunct to the use of theory alone. The present article introduces one such statistical method, direction dependence analysis (DDA), which assesses the relative plausibility of the three explanatory models on the basis of higher-moment information about the variables (i.e., skewness and kurtosis). DDA involves the evaluation of three properties of the data: (1) the observed distributions of the variables, (2) the residual distributions of the competing models, and (3) the independence properties of the predictors and residuals of the competing models. When the observed variables are nonnormally distributed, we show that DDA components can be used to uniquely identify each explanatory model. Statistical inference methods for model selection are presented, and macros to implement DDA in SPSS are provided. An empirical example is given to illustrate the approach. Conceptual and empirical considerations are discussed for best-practice applications in psychological data, and sample size recommendations based on previous simulation studies are provided.
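A minimal sketch of one DDA component (comparing the residual distributions of the two competing regression models) under the assumption of a nonnormal cause and normal error; this is an illustration with simulated data, not the SPSS macros described in the article.

```python
# Minimal sketch: in the correctly specified direction (y ~ x) the residuals should
# look closer to normal than in the reversed model (x ~ y) when x is nonnormal.
# Simulated data; illustration only.
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(2)
x = rng.exponential(1.0, 1000)              # nonnormal "cause"
y = 0.6 * x + rng.normal(0.0, 1.0, 1000)    # linear effect plus normal error

def residuals(pred, out):
    b, a = np.polyfit(pred, out, 1)         # simple linear regression
    return out - (a + b * pred)

r_xy = residuals(x, y)                      # model y ~ x (the true direction)
r_yx = residuals(y, x)                      # model x ~ y (the reversed direction)
print(f"y ~ x residuals: skew = {skew(r_xy):.2f}, excess kurtosis = {kurtosis(r_xy):.2f}")
print(f"x ~ y residuals: skew = {skew(r_yx):.2f}, excess kurtosis = {kurtosis(r_yx):.2f}")
```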
10 CFR 431.198 - Enforcement testing for distribution transformers.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 3 2010-01-01 2010-01-01 false Enforcement testing for distribution transformers. 431.198... COMMERCIAL AND INDUSTRIAL EQUIPMENT Distribution Transformers Compliance and Enforcement § 431.198 Enforcement testing for distribution transformers. (a) Test notice. Upon receiving information in writing...
10 CFR 431.198 - Enforcement testing for distribution transformers.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 3 2011-01-01 2011-01-01 false Enforcement testing for distribution transformers. 431.198... COMMERCIAL AND INDUSTRIAL EQUIPMENT Distribution Transformers Compliance and Enforcement § 431.198 Enforcement testing for distribution transformers. (a) Test notice. Upon receiving information in writing...
Mechanical Network in Titin Immunoglobulin from Force Distribution Analysis
Wilmanns, Matthias; Gräter, Frauke
2009-01-01
The role of mechanical force in cellular processes is increasingly revealed by single molecule experiments and simulations of force-induced transitions in proteins. How the applied force propagates within proteins determines their mechanical behavior yet remains largely unknown. We present a new method based on molecular dynamics simulations to disclose the distribution of strain in protein structures, here for the newly determined high-resolution crystal structure of I27, a titin immunoglobulin (IG) domain. We obtain a sparse, spatially connected, and highly anisotropic mechanical network. This allows us to detect load-bearing motifs composed of interstrand hydrogen bonds and hydrophobic core interactions, including parts distal to the site to which force was applied. The role of the force distribution pattern for mechanical stability is tested by in silico unfolding of I27 mutants. We then compare the observed force pattern to the sparse network of coevolved residues found in this family. We find a remarkable overlap, suggesting the force distribution to reflect constraints for the evolutionary design of mechanical resistance in the IG family. The force distribution analysis provides a molecular interpretation of coevolution and opens the road to the study of the mechanism of signal propagation in proteins in general. PMID:19282960