Correlating the EMC analysis and testing methods for space systems in MIL-STD-1541A
NASA Technical Reports Server (NTRS)
Perez, Reinaldo J.
1990-01-01
A study was conducted to improve the correlation between the electromagnetic compatibility (EMC) analysis models stated in MIL-STD-1541A and the suggested testing methods used for space systems. The test and analysis methods outlined in MIL-STD-1541A are described, and a comparative assessment of testing and analysis techniques as they relate to several EMC areas is presented. Suggestions on present analysis and test methods are introduced to harmonize the analysis and testing tools in MIL-STD-1541A and bring them into closer agreement. It is suggested that the test procedures in MIL-STD-1541A be improved by providing alternatives to the present use of shielded enclosures as the primary test site; in particular, the alternative use of anechoic chambers and open-field test sites should be considered.
Emiralioğlu, Nagehan; Özçelik, Uğur; Yalçın, Ebru; Doğru, Deniz; Kiper, Nural
2016-01-01
Sweat testing with the Gibson-Cooke (GC) method is the diagnostic gold standard for cystic fibrosis (CF). Recently, alternative methods have been introduced to simplify both the collection and the analysis of sweat samples. Our aim was to compare sweat chloride values obtained by the GC method with those from other sweat test methods in patients diagnosed with CF and in patients in whom CF had been ruled out. We wanted to determine whether the other sweat test methods could reliably identify patients with CF and differentiate them from healthy subjects. Chloride concentration was measured with the GC method, a chloride meter, and a sweat test analysis system; conductivity was also determined with the sweat test analysis system. Forty-eight patients with CF and 82 patients without CF underwent the sweat test, with median sweat chloride values of 98.9 mEq/L by the GC method, 101 mmol/L by chloride meter, and 87.8 mmol/L by the sweat test analysis system. In the non-CF group, median sweat chloride values were 16.8 mEq/L with the GC method, 10.5 mmol/L with the chloride meter, and 15.6 mmol/L with the sweat test analysis system. The median conductivity value was 107.3 mmol/L in the CF group and 32.1 mmol/L in the non-CF group. There was a statistically significant, strong positive correlation between the GC method and the other sweat test methods (r=0.85) in all subjects. Sweat chloride concentration and conductivity measured by the other sweat test methods correlate highly with the GC method. We think the other sweat test instruments can be used as reliably as the classic GC method to diagnose or exclude CF.
Further evidence for the increased power of LOD scores compared with nonparametric methods.
Durner, M; Vieland, V J; Greenberg, D A
1999-01-01
In genetic analysis of diseases in which the underlying model is unknown, "model-free" methods, such as affected sib pair (ASP) tests, are often preferred over LOD-score methods, although LOD-score methods under the correct or even an approximately correct model are more powerful than ASP tests. However, there might be circumstances in which nonparametric methods outperform LOD-score methods. Recently, Dizier et al. reported that, in some complex two-locus (2L) models, LOD-score methods with segregation-analysis-derived parameters had less power to detect linkage than ASP tests. We investigated whether these particular models do, in fact, represent a situation in which ASP tests are more powerful than LOD scores. We simulated data according to the parameters specified by Dizier et al. and analyzed the data using (a) single-locus (SL) LOD-score analysis performed twice, under a simple dominant and a recessive mode of inheritance (MOI), (b) ASP methods, and (c) nonparametric linkage (NPL) analysis. We show that SL analysis performed twice and corrected for the type I error increase due to multiple testing yields almost as much linkage information as an analysis under the correct 2L model and is more powerful than either the ASP method or the NPL method. We demonstrate that, even for complex genetic models, the most important condition for linkage analysis is that the assumed MOI at the disease locus being tested is approximately correct, not that the inheritance of the disease per se is correctly specified. In the analysis by Dizier et al., segregation analysis led to estimates of dominance parameters that were grossly misspecified for the locus tested in those models in which ASP tests appeared to be more powerful than LOD-score analyses.
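For illustration only, a minimal Python sketch of the parametric LOD computation that the abstract contrasts with ASP tests, for the textbook case of phase-known meioses; the function name and the example counts are hypothetical:

```python
import numpy as np

def lod_score(recombinants: int, meioses: int, theta: float) -> float:
    """LOD score for phase-known meioses: log10 of the likelihood ratio
    L(theta) / L(theta = 0.5) for a candidate recombination fraction."""
    r, n = recombinants, meioses
    log_l_theta = r * np.log10(theta) + (n - r) * np.log10(1.0 - theta)
    log_l_null = n * np.log10(0.5)
    return log_l_theta - log_l_null

# Example: 2 recombinants in 10 informative meioses, evaluated at theta = 0.1
print(lod_score(2, 10, 0.1))  # positive values favor linkage over theta = 0.5
```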
An Integrated Analysis-Test Approach
NASA Technical Reports Server (NTRS)
Kaufman, Daniel
2003-01-01
This viewgraph presentation provides an overview of a project to develop a computer program that integrates data analysis and test procedures. The software application aims to offer a new perspective on traditional mechanical analysis and test procedures and to integrate pre-test and test analysis calculation methods. The program should also be usable on portable devices and allow 'quasi-real time' analysis of data sent by electronic means. Test methods reviewed during this presentation include: shaker swept-sine and random tests, shaker shock mode tests, shaker base-driven modal survey tests, and acoustic tests.
Lee, Juneyoung; Kim, Kyung Won; Choi, Sang Hyun; Huh, Jimi
2015-01-01
Meta-analysis of diagnostic test accuracy studies differs from the usual meta-analysis of therapeutic/interventional studies in that it requires the simultaneous analysis of a pair of outcome measures, such as sensitivity and specificity, instead of a single outcome. Since sensitivity and specificity are generally inversely correlated and can be affected by a threshold effect, more sophisticated statistical methods are required for the meta-analysis of diagnostic test accuracy. Hierarchical models, including the bivariate model and the hierarchical summary receiver operating characteristic model, are increasingly being accepted as standard methods for meta-analysis of diagnostic test accuracy studies. We provide a conceptual review of statistical methods currently used and recommended for meta-analysis of diagnostic test accuracy studies. This article can serve as a methodological reference for those who perform systematic review and meta-analysis of diagnostic test accuracy studies. PMID:26576107
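As a rough illustration of the pooling problem this review addresses, here is a hedged Python sketch that pools logit-transformed sensitivities with a DerSimonian-Laird random-effects step. This is a univariate simplification only; the bivariate and hierarchical models discussed in the article model sensitivity and specificity jointly and require specialized routines. All study counts below are hypothetical:

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate of per-study effects."""
    w = 1.0 / np.asarray(variances, dtype=float)
    theta = np.asarray(effects, dtype=float)
    theta_fixed = np.sum(w * theta) / np.sum(w)
    q = np.sum(w * (theta - theta_fixed) ** 2)          # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(theta) - 1)) / c)         # between-study variance
    w_star = 1.0 / (np.asarray(variances) + tau2)
    return np.sum(w_star * theta) / np.sum(w_star)

# Hypothetical per-study true-positive / false-negative counts
tp = np.array([45, 30, 60]); fn = np.array([5, 8, 10])
logit_sens = np.log((tp + 0.5) / (fn + 0.5))            # 0.5 continuity correction
var_logit = 1.0 / (tp + 0.5) + 1.0 / (fn + 0.5)
pooled = dersimonian_laird(logit_sens, var_logit)
print(1.0 / (1.0 + np.exp(-pooled)))                    # pooled sensitivity
```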
Buckling analysis and test correlation of hat stiffened panels for hypersonic vehicles
NASA Technical Reports Server (NTRS)
Percy, Wendy C.; Fields, Roger A.
1990-01-01
The paper discusses the design, analysis, and test of hat stiffened panels subjected to a variety of thermal and mechanical load conditions. The panels were designed using data from structural optimization computer codes and finite element analysis. Test methods included the grid shadow moire method and a single gage force stiffness method. The agreement between the test data and analysis provides confidence in the methods that are currently being used to design structures for hypersonic vehicles. The agreement also indicates that post buckled strength may potentially be used to reduce the vehicle weight.
Methods for the Joint Meta-Analysis of Multiple Tests
ERIC Educational Resources Information Center
Trikalinos, Thomas A.; Hoaglin, David C.; Small, Kevin M.; Terrin, Norma; Schmid, Christopher H.
2014-01-01
Existing methods for meta-analysis of diagnostic test accuracy focus primarily on a single index test. We propose models for the joint meta-analysis of studies comparing multiple index tests on the same participants in paired designs. These models respect the grouping of data by studies, account for the within-study correlation between the tests'…
NASA Technical Reports Server (NTRS)
Anstead, R. J. (Editor); Goldberg, E. (Editor)
1975-01-01
Failure analysis test methods are presented for use in analyzing candidate electronic parts and in improving future design reliability. Each test is classified as nondestructive, semidestructive, or destructive. The effects upon applicable part types (e.g., integrated circuit, transistor) are discussed. Methodology is given for performing the following: immersion tests, radiographic tests, dewpoint tests, gas ambient analysis, cross sectioning, and ultraviolet examination.
A Review of Classical Methods of Item Analysis.
ERIC Educational Resources Information Center
French, Christine L.
Item analysis is a very important consideration in the test development process. It is a statistical procedure to analyze test items that combines methods used to evaluate the important characteristics of test items, such as difficulty, discrimination, and distractibility of the items in a test. This paper reviews some of the classical methods for…
NASA Technical Reports Server (NTRS)
Ray, Ronald J.
1994-01-01
New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.
Kruschke, John K; Liddell, Torrin M
2018-02-01
In the practice of data analysis, there is a conceptual distinction between hypothesis testing, on the one hand, and estimation with quantified uncertainty on the other. Among frequentists in psychology, a shift of emphasis from hypothesis testing to estimation has been dubbed "the New Statistics" (Cumming 2014). A second conceptual distinction is between frequentist methods and Bayesian methods. Our main goal in this article is to explain how Bayesian methods achieve the goals of the New Statistics better than frequentist methods. The article reviews frequentist and Bayesian approaches to hypothesis testing and to estimation with confidence or credible intervals. The article also describes Bayesian approaches to meta-analysis, randomized controlled trials, and power analysis.
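To make the frequentist/Bayesian contrast in the abstract concrete, a minimal sketch comparing a normal-approximation confidence interval with an equal-tailed credible interval from a conjugate Beta posterior; the data and the flat Beta(1, 1) prior are illustrative choices, not the authors':

```python
from scipy import stats

successes, n = 27, 40                      # hypothetical binomial data
# Frequentist: normal-approximation 95% confidence interval for a proportion
p_hat = successes / n
se = (p_hat * (1 - p_hat) / n) ** 0.5
ci = (p_hat - 1.96 * se, p_hat + 1.96 * se)

# Bayesian: Beta(1, 1) prior gives a Beta posterior; the 95% equal-tailed
# credible interval summarizes estimation with quantified uncertainty
posterior = stats.beta(1 + successes, 1 + n - successes)
cri = (posterior.ppf(0.025), posterior.ppf(0.975))
print(ci, cri)
```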
40 CFR 98.7 - What standardized methods are incorporated by reference into this part?
Code of Federal Regulations, 2012 CFR
2012-07-01
…astm.org. (1) ASTM C25-06 Standard Test Method for Chemical Analysis of Limestone, Quicklime, and… § 98.194(c), and § 98.334(b). (2) ASTM C114-09 Standard Test Methods for Chemical Analysis of Hydraulic… approved for § 98.6. (4) ASTM D240-02 (Reapproved 2007) Standard Test Method for Heat of Combustion of…
Documentation of spreadsheets for the analysis of aquifer-test and slug-test data
Halford, Keith J.; Kuniansky, Eve L.
2002-01-01
Several spreadsheets have been developed for the analysis of aquifer-test and slug-test data. Each spreadsheet incorporates analytical solution(s) of the partial differential equation for ground-water flow to a well for a specific type of condition or aquifer. The derivations of the analytical solutions were previously published. Thus, this report abbreviates the theoretical discussion but includes practical information about each method and the important assumptions for the application of each method. These spreadsheets were written in Microsoft Excel 9.0 (use of trade names does not constitute endorsement by the USGS). Storage properties should not be estimated with many of the spreadsheets because most are for analyzing single-well tests. Estimation of storage properties from single-well tests is generally discouraged because single-well tests are affected by wellbore storage and by well construction. These non-ideal effects frequently cause estimates of storage to be erroneous by orders of magnitude. Additionally, single-well tests are not sensitive to aquifer-storage properties. Single-well tests include all slug tests (the Bouwer and Rice method; the Cooper, Bredehoeft, and Papadopulos method; and the van der Kamp method), the Cooper-Jacob straight-line method, Theis recovery-data analysis, the Jacob-Lohman method for flowing wells in a confined aquifer, and the step-drawdown test. Multi-well test spreadsheets included in this report are the Hantush-Jacob leaky aquifer method and distance-drawdown methods. The distance-drawdown method is an equilibrium or steady-state method; thus, storage cannot be estimated.
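As an example of one of the single-well analyses listed above, a brief Python sketch of the Cooper-Jacob straight-line method, which estimates transmissivity from the drawdown change per log cycle of time; the data values are hypothetical, and the spreadsheet implementations in the report are the authoritative versions:

```python
import numpy as np

def cooper_jacob_transmissivity(time_min, drawdown_m, q_m3_per_day):
    """Cooper-Jacob straight-line method: fit drawdown against log10(time)
    and estimate transmissivity T = 2.3 Q / (4 pi (drawdown per log cycle))."""
    slope, _ = np.polyfit(np.log10(time_min), drawdown_m, 1)   # m per log cycle
    return 2.3 * q_m3_per_day / (4.0 * np.pi * slope)          # m^2/day

# Hypothetical late-time (straight-line) portion of a pumping test
t = np.array([10.0, 20.0, 40.0, 80.0, 160.0])   # minutes
s = np.array([0.52, 0.61, 0.70, 0.79, 0.88])    # meters of drawdown
print(cooper_jacob_transmissivity(t, s, q_m3_per_day=500.0))
```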
Analysis of beryllium and depleted uranium: An overview of detection methods in aerosols and soils
DOE Office of Scientific and Technical Information (OSTI.GOV)
Camins, I.; Shinn, J.H.
We conducted a survey of commercially available methods for analysis of beryllium and depleted uranium in aerosols and soils to find a reliable, cost-effective, and sufficiently precise method for researchers involved in environmental testing at the Yuma Proving Ground, Yuma, Arizona. Criteria used for evaluation include cost, method of analysis, specificity, sensitivity, reproducibility, applicability, and commercial availability. We found that atomic absorption spectrometry with graphite furnace meets these criteria for testing samples for beryllium. We found that this method can also be used to test samples for depleted uranium. However, atomic absorption with graphite furnace is not as sensitive a measurement method for depleted uranium as it is for beryllium, so we recommend that quality control of depleted uranium analysis be maintained by testing 10 of every 1000 samples by neutron activation analysis. We also evaluated 45 companies and institutions that provide analyses of beryllium and depleted uranium. 5 refs., 1 tab.
Dynamic test/analysis correlation using reduced analytical models
NASA Technical Reports Server (NTRS)
Mcgowan, Paul E.; Angelucci, A. Filippo; Javeed, Mehzad
1992-01-01
Test/analysis correlation is an important aspect of the verification of analysis models which are used to predict on-orbit response characteristics of large space structures. This paper presents results of a study using reduced analysis models for performing dynamic test/analysis correlation. The reduced test-analysis model (TAM) has the same number and orientation of DOF as the test measurements. Two reduction methods, static (Guyan) reduction and the Improved Reduced System (IRS) reduction, are applied to the test/analysis correlation of a laboratory truss structure. Simulated test results and modal test data are used to examine the performance of each method. It is shown that selection of DOF to be retained in the TAM is critical when large structural masses are involved. In addition, the use of modal test results may provide difficulties in TAM accuracy even if a large number of DOF are retained in the TAM.
General Framework for Meta-analysis of Rare Variants in Sequencing Association Studies
Lee, Seunggeun; Teslovich, Tanya M.; Boehnke, Michael; Lin, Xihong
2013-01-01
We propose a general statistical framework for meta-analysis of gene- or region-based multimarker rare variant association tests in sequencing association studies. In genome-wide association studies, single-marker meta-analysis has been widely used to increase statistical power by combining results via regression coefficients and standard errors from different studies. In analysis of rare variants in sequencing studies, region-based multimarker tests are often used to increase power. We propose meta-analysis methods for commonly used gene- or region-based rare variants tests, such as burden tests and variance component tests. Because estimation of regression coefficients of individual rare variants is often unstable or not feasible, the proposed method avoids this difficulty by calculating score statistics instead that only require fitting the null model for each study and then aggregating these score statistics across studies. Our proposed meta-analysis rare variant association tests are conducted based on study-specific summary statistics, specifically score statistics for each variant and between-variant covariance-type (linkage disequilibrium) relationship statistics for each gene or region. The proposed methods are able to incorporate different levels of heterogeneity of genetic effects across studies and are applicable to meta-analysis of multiple ancestry groups. We show that the proposed methods are essentially as powerful as joint analysis by directly pooling individual level genotype data. We conduct extensive simulations to evaluate the performance of our methods by varying levels of heterogeneity across studies, and we apply the proposed methods to meta-analysis of rare variant effects in a multicohort study of the genetics of blood lipid levels. PMID:23768515
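A hedged sketch of the central idea for the burden-test case: per-study score statistics and their variances are summed, so no individual-level genotypes or per-variant coefficient estimates are needed. The numbers and the equal-weight default are hypothetical, and the paper's full framework also covers variance-component tests and between-variant covariance matrices:

```python
import numpy as np
from scipy import stats

def meta_burden_test(scores, variances, weights=None):
    """Combine per-study burden score statistics: under the null hypothesis,
    the weighted score sum over its standard deviation is ~ N(0, 1)."""
    s = np.asarray(scores, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = np.ones_like(s) if weights is None else np.asarray(weights, dtype=float)
    z = np.sum(w * s) / np.sqrt(np.sum(w ** 2 * v))
    return z, 2.0 * stats.norm.sf(abs(z))       # z statistic, two-sided p-value

# Hypothetical score statistics for one gene from three studies
print(meta_burden_test(scores=[3.1, 1.8, 2.4], variances=[2.0, 1.5, 1.7]))
```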
[Application of Stata software to test heterogeneity in meta-analysis method].
Wang, Dan; Mou, Zhen-yun; Zhai, Jun-xia; Zong, Hong-xia; Zhao, Xiao-dong
2008-07-01
To introduce the application of Stata software to heterogeneity testing in meta-analysis. A data set was set up according to the example in the study, and the corresponding commands of the methods in Stata 9 were applied to test the example. The methods used were the Q-test and I² statistic attached to the fixed-effect model forest plot, the H statistic, and the Galbraith plot. The existence of heterogeneity among studies could be detected by the Q-test and H statistic, and the degree of heterogeneity could be detected by the I² statistic. The outliers that were the sources of the heterogeneity could be spotted in the Galbraith plot. Heterogeneity testing in meta-analysis can be completed simply and quickly by the four methods in Stata. The H and I² statistics are more robust, and, among the four methods, the outliers responsible for the heterogeneity are seen most clearly in the Galbraith plot.
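The same quantities can be computed outside Stata; a short Python sketch of Cochran's Q, the H statistic, and I² for a fixed-effect meta-analysis, with hypothetical study effects:

```python
import numpy as np
from scipy import stats

def heterogeneity(effects, variances):
    """Cochran's Q (with chi-square p-value), H statistic, and I^2."""
    w = 1.0 / np.asarray(variances, dtype=float)
    theta = np.asarray(effects, dtype=float)
    pooled = np.sum(w * theta) / np.sum(w)       # fixed-effect pooled estimate
    q = np.sum(w * (theta - pooled) ** 2)
    df = len(theta) - 1
    h = np.sqrt(q / df)
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0
    return q, stats.chi2.sf(q, df), h, i2

# Hypothetical log odds ratios and within-study variances
print(heterogeneity([0.42, 0.18, 0.73, 0.05], [0.04, 0.06, 0.05, 0.09]))
```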
NASA Astrophysics Data System (ADS)
Ishizawa, Y.; Abe, K.; Shirako, G.; Takai, T.; Kato, H.
The electromagnetic compatibility (EMC) control method, system EMC analysis method, and system test method that have been applied to test the components of the MOS-1 satellite are described. The merits and demerits of the problem-solving, specification, and system approaches to EMC control are summarized, as are the data requirements of the SEMCAP (specification and electromagnetic compatibility analysis program) computer program for verifying the EMI safety margin of the components. Examples of EMC design are mentioned, and the EMC design process and selection method for EMC critical points are shown along with sample EMC test results.
[Research progress on mechanical performance evaluation of artificial intervertebral disc].
Li, Rui; Wang, Song; Liao, Zhenhua; Liu, Weiqiang
2018-03-01
The mechanical properties of an artificial intervertebral disc (AID) are related to the long-term reliability of the prosthesis. Three testing approaches, based on different tools, are involved in the mechanical performance evaluation of AIDs: testing with a mechanical simulator, in vitro specimen testing, and finite element analysis. In this study, the testing standards, testing equipment, and materials for AIDs are first introduced. The review then covers the present status of AID static mechanical property tests (static axial compression, static axial compression-shear), dynamic mechanical property tests (dynamic axial compression, dynamic axial compression-shear), creep and stress relaxation tests, device push-out tests, core push-out tests, subsidence tests, etc. The experimental techniques used for in vitro specimen testing and the test results for available artificial discs are summarized, as are the experimental methods and research status of finite element analysis. Finally, research trends in AID mechanical performance evaluation are forecast: the simulator, load, dynamic cycle, motion mode, specimen, and test standard will be important research fields in the future.
Cappella, Annalisa; Gibelli, Daniele; Muccino, Enrico; Scarpulla, Valentina; Cerutti, Elisa; Caruso, Valentina; Sguazza, Emanuela; Mazzarelli, Debora; Cattaneo, Cristina
2015-01-27
When estimating the post-mortem interval (PMI) in forensic anthropology, the only method able to give an unambiguous result is C-14 analysis, although the procedure is expensive. Other methods, such as the luminol test and histological analysis, can be performed as preliminary investigations and may give the operators a preliminary indication of PMI, but they lack scientific verification, although luminol testing has become somewhat more accredited in the past few years. Such methods may nonetheless provide some help, as they are inexpensive and give a fast response, especially in the phase of preliminary investigations. In this study, 20 court cases of human skeletonized remains were dated by the C-14 method. For two cases, the results fell chronologically after the 1950s; for one case, the analysis was not technically possible. The remaining 17 cases had an archaeological or historical collocation. The same bone samples were also screened by histological examination and the luminol test. Results showed that only four cases were positive on the luminol test and had a high Oxford Histology Index (OHI) score at the same time; among these, two cases were dated as recent by the radiocarbon analysis. Thus, the combination of these methods gave only two false-positive results and no false negatives. The combination of two qualitative methods (luminol test and microscopic analysis) may therefore represent a promising solution for cases in which many fragments need to be tested quickly.
Study on color difference estimation method of medicine biochemical analysis
NASA Astrophysics Data System (ADS)
Wang, Chunhong; Zhou, Yue; Zhao, Hongxia; Sun, Jiashi; Zhou, Fengkun
2006-01-01
Biochemical analysis is an important inspection and diagnosis method in hospital clinics, and the biochemical analysis of urine is one important item. Urine test paper shows a corresponding color depending on the detection target and the degree of illness. The color difference between the standard threshold and the color of the urine test paper can be used to judge the degree of illness, allowing further analysis and diagnosis of the urine. Color is a three-dimensional physical variable with a psychological component, whereas reflectance is one-dimensional; therefore, a color difference estimation method for urine testing can offer better precision and convenience than the conventional test method based on one-dimensional reflectance, and can support an accurate diagnosis. A digital camera can easily capture an image of the urine test paper and thus conveniently carry out the urine biochemical analysis. In the experiment, color images of urine test paper were taken with a popular color digital camera and saved on a computer running simple color space conversion (RGB -> XYZ -> L*a*b*) and calculation software. Test samples were graded according to intelligent detection of quantitative color. The images taken at each time point were saved on the computer so that the whole course of illness could be monitored. This method can also be used in other medical biochemical analyses that involve color. Experimental results show that this test method is quick and accurate; it can be used in hospitals, calibration organizations, and homes, so its application prospects are extensive.
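A minimal sketch of the color pipeline the abstract describes (sRGB -> XYZ -> L*a*b*, then a CIE76 color difference), assuming the sRGB primaries and D65 white point; the sample pad colors are hypothetical:

```python
import numpy as np

def srgb_to_lab(rgb):
    """Convert an 8-bit sRGB triple to CIE L*a*b* (D65 white point)."""
    c = np.asarray(rgb, dtype=float) / 255.0
    c = np.where(c > 0.04045, ((c + 0.055) / 1.055) ** 2.4, c / 12.92)  # linearize
    m = np.array([[0.4124, 0.3576, 0.1805],       # sRGB -> XYZ matrix
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
    xyz = m @ c / np.array([0.95047, 1.0, 1.08883])   # scale by D65 white
    f = np.where(xyz > (6/29) ** 3, np.cbrt(xyz), xyz / (3 * (6/29) ** 2) + 4/29)
    return np.array([116 * f[1] - 16,             # L*
                     500 * (f[0] - f[1]),         # a*
                     200 * (f[1] - f[2])])        # b*

def delta_e76(rgb1, rgb2):
    """CIE76 color difference between two sRGB colors."""
    return float(np.linalg.norm(srgb_to_lab(rgb1) - srgb_to_lab(rgb2)))

# Hypothetical: test-strip pad color vs. the nearest reference chart color
print(delta_e76((200, 160, 60), (185, 150, 70)))
```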
NASA Technical Reports Server (NTRS)
Jordan, F. L., Jr.
1980-01-01
As part of basic research to improve aerial applications technology, methods were developed at the Langley Vortex Research Facility to simulate and measure deposition patterns of aerially-applied sprays and granular materials by means of tests with small-scale models of agricultural aircraft and dynamically-scaled test particles. Interactions between the aircraft wake and the dispersed particles are being studied with the objective of modifying wake characteristics and dispersal techniques to increase swath width, improve deposition pattern uniformity, and minimize drift. The particle scaling analysis, test methods for particle dispersal from the model aircraft, visualization of particle trajectories, and measurement and computer analysis of test deposition patterns are described. An experimental validation of the scaling analysis and test results that indicate improved control of chemical drift by use of winglets are presented to demonstrate test methods.
NASA Astrophysics Data System (ADS)
Derajat; Hariowibowo, Hindawan
2018-04-01
The newly proposed in-flight pitot-static calibration method was applied during the development and qualification of the CN235-100 MPA (Military Patrol Aircraft). This method is expected to reduce flight hours, require fewer human resources and no additional special equipment, and involve simple analysis calculations; consequently, it is expected to minimize operational cost. At the Indonesian Aerospace (IAe) Flight Test Center Division, new flight test techniques and data analysis methods, especially for flight-physics test subjects, continue to be developed as long as they are safe for flight and add value for the industry. For more than 30 years, flight test data engineers at the Flight Test Center Division have worked together with the air crew (test pilots, co-pilots, and flight test engineers) to execute flight test activities under standard procedures for both existing and newly developed test techniques and test data analysis. In this paper, the approximate mathematical model, data reduction, and flight test technique of the in-flight pitot-static calibration using a radio altimeter as reference are described, and the test results are compared with other methods, i.e., the Global Positioning System (GPS) method and the traditional tower fly-by method, which were used previously during this flight test program (Ref. [10]). The flight test data are from the CN235-100 MPA development and qualification flight test program at Cazaux Airport, France, in June-November 2009 (Ref. [2]).
De Hertogh, Benoît; De Meulder, Bertrand; Berger, Fabrice; Pierre, Michael; Bareke, Eric; Gaigneaux, Anthoula; Depiereux, Eric
2010-01-11
Recent reanalysis of spike-in datasets underscored the need for new and more accurate benchmark datasets for statistical microarray analysis. We present here a fresh method using biologically relevant data to evaluate the performance of statistical methods. Our novel method ranks the probesets from a dataset composed of publicly available biological microarray data and extracts subset matrices with precise information/noise ratios. Our method can be used to determine the capability of different methods to better estimate variance for a given number of replicates. The mean-variance and mean-fold change relationships of the matrices revealed a closer approximation of biological reality. Performance analysis refined the results from previously published benchmarks. We show that the Shrinkage t test (close to Limma) was the best of the methods tested, except when two replicates were examined, where the Regularized t test and the Window t test performed slightly better. The R scripts used for the analysis are available at http://urbm-cluster.urbm.fundp.ac.be/~bdemeulder/.
Fatigue analysis and testing of wind turbine blades
NASA Astrophysics Data System (ADS)
Greaves, Peter Robert
This thesis focuses on fatigue analysis and testing of large, multi-MW wind turbine blades. The blades are one of the most expensive components of a wind turbine, and their mass has cost implications for the hub, nacelle, tower, and foundations of the turbine, so it is important that they are not unnecessarily strong. Fatigue is often an important design driver, but fatigue of composites is poorly understood, and so large safety factors are often applied to the loads. This has implications for the weight of the blade. Full-scale fatigue testing of blades is required by the design standards and provides manufacturers with confidence that the blade will survive its service life. This testing is usually performed by resonating the blade in the flapwise and edgewise directions separately, but in service these two loads occur at the same time. A fatigue testing method developed at Narec (the National Renewable Energy Centre) in the UK, in which the flapwise and edgewise directions are excited simultaneously, has been evaluated by comparing the Palmgren-Miner damage sum around the blade cross section after testing with the damage distribution caused by the service life. A method to obtain the resonant test configuration that results in the optimum mode shapes for the flapwise and edgewise directions was then developed, and simulation software was designed to allow the blade test to be simulated so that realistic comparisons between the damage distributions after different test types could be obtained. During the course of this work the shortcomings of conventional fatigue analysis methods became apparent, and a novel method of fatigue analysis based on multi-continuum theory and the kinetic theory of fracture was developed. This method was benchmarked using physical test data from the OPTIDAT database and was applied to the analysis of a complete blade. A full-scale fatigue test method based on this new analysis approach is also discussed.
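Since the test comparison hinges on the Palmgren-Miner damage sum, here is a brief hedged sketch of that calculation with a power-law S-N curve; the stress spectrum and the S-N constants C and m are invented for illustration and are not from the thesis:

```python
import numpy as np

def miner_damage(stress_ranges, cycle_counts, c: float, m: float) -> float:
    """Palmgren-Miner linear damage sum D = sum(n_i / N_i), with allowable
    cycles from a power-law S-N curve N(S) = C * S**(-m); D >= 1 implies
    the fatigue life is exhausted."""
    s = np.asarray(stress_ranges, dtype=float)
    n = np.asarray(cycle_counts, dtype=float)
    return float(np.sum(n / (c * s ** (-m))))

# Hypothetical blade-root stress-range spectrum (MPa) and applied cycles
print(miner_damage([40.0, 60.0, 80.0], [1e6, 2e5, 1e4], c=1e15, m=5.0))
```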
ERIC Educational Resources Information Center
Kapanadze, Dilek Ünveren
2018-01-01
The aim of this study is to identify the effect of using the discourse analysis method on skills of reading comprehension, textual analysis, discourse creation, and language use. In this study, an experimental design with pre-test and post-test control groups was used in order to determine the difference in academic achievement between…
NEAT: an efficient network enrichment analysis test.
Signorelli, Mirko; Vinciotti, Veronica; Wit, Ernst C
2016-09-05
Network enrichment analysis is a powerful method that integrates gene enrichment analysis with the information on relationships between genes provided by gene networks. Existing tests for network enrichment analysis deal only with undirected networks, can be computationally slow, and are based on normality assumptions. We propose NEAT, a test for network enrichment analysis. The test is based on the hypergeometric distribution, which naturally arises as the null distribution in this context. NEAT can be applied not only to undirected, but also to directed and partially directed networks. Our simulations indicate that NEAT is considerably faster than alternative resampling-based methods and that its capacity to detect enrichment is at least as good as that of alternative tests. We discuss applications of NEAT to network analyses in yeast by testing for enrichment of the Environmental Stress Response target gene set with GO Slim and KEGG functional gene sets, and also by inspecting associations between functional sets themselves. NEAT is a flexible and efficient test for network enrichment analysis that aims to overcome some limitations of existing resampling-based tests. The method is implemented in the R package neat, which can be freely downloaded from CRAN ( https://cran.r-project.org/package=neat ).
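A hedged sketch of the hypergeometric idea behind NEAT: the observed number of links between two gene sets is compared with a hypergeometric null. The parameterization below is a simplification for illustration (the package documentation defines the exact quantities), and the counts are hypothetical:

```python
from scipy.stats import hypergeom

def network_enrichment_pvalue(n_ab: int, d_a: int, d_b: int, d_total: int) -> float:
    """One-sided test for over-enrichment of links between gene sets A and B:
    n_ab observed A-B links, d_a links leaving A, d_b links reaching B,
    d_total links in the whole network."""
    # P(X >= n_ab) with X ~ Hypergeometric(d_total, d_b, d_a)
    return hypergeom.sf(n_ab - 1, d_total, d_b, d_a)

# Hypothetical counts: 30 A-B links; expectation is 200 * 500 / 10000 = 10
print(network_enrichment_pvalue(30, 200, 500, 10000))
```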
Repeatability study of replicate crash tests: A signal analysis approach.
Seppi, Jeremy; Toczyski, Jacek; Crandall, Jeff R; Kerrigan, Jason
2017-10-03
To provide an objective basis on which to evaluate the repeatability of vehicle crash test methods, a recently developed signal analysis method was used to evaluate correlation of sensor time history data between replicate vehicle crash tests. The goal of this study was to evaluate the repeatability of rollover crash tests performed with the Dynamic Rollover Test System (DRoTS) relative to other vehicle crash test methods. Test data from DRoTS tests, deceleration rollover sled (DRS) tests, frontal crash tests, frontal offset crash tests, small overlap crash tests, small overlap impact (SOI) crash tests, and oblique crash tests were obtained from the literature and publicly available databases (the NHTSA vehicle database and the Insurance Institute for Highway Safety TechData) to examine crash test repeatability. Signal analysis of the DRoTS tests showed that force and deformation time histories had good to excellent repeatability, whereas vehicle kinematics showed only fair repeatability due to the vehicle mounting method for one pair of tests and slightly dissimilar mass properties (2.2%) in a second pair of tests. Relative to the DRS, the DRoTS tests showed very similar or higher levels of repeatability in nearly all vehicle kinematic data signals with the exception of global X' (road direction of travel) velocity and displacement due to the functionality of the DRoTS fixture. Based on the average overall scoring metric of the dominant acceleration, DRoTS was found to be as repeatable as all other crash tests analyzed. Vertical force measures showed good repeatability and were on par with frontal crash barrier forces. Dynamic deformation measures showed good to excellent repeatability as opposed to poor repeatability seen in SOI and oblique deformation measures. Using the signal analysis method as outlined in this article, the DRoTS was shown to have the same or better repeatability of crash test methods used in government regulatory and consumer evaluation test protocols.
Comparison of Modal Analysis Methods Applied to a Vibro-Acoustic Test Article
NASA Technical Reports Server (NTRS)
Pritchard, Jocelyn; Pappa, Richard; Buehrle, Ralph; Grosveld, Ferdinand
2001-01-01
Modal testing of a vibro-acoustic test article referred to as the Aluminum Testbed Cylinder (ATC) has provided frequency response data for the development of validated numerical models of complex structures for interior noise prediction and control. The ATC is an all aluminum, ring and stringer stiffened cylinder, 12 feet in length and 4 feet in diameter. The cylinder was designed to represent typical aircraft construction. Modal tests were conducted for several different configurations of the cylinder assembly under ambient and pressurized conditions. The purpose of this paper is to present results from dynamic testing of different ATC configurations using two modal analysis software methods: Eigensystem Realization Algorithm (ERA) and MTS IDEAS Polyreference method. The paper compares results from the two analysis methods as well as the results from various test configurations. The effects of pressurization on the modal characteristics are discussed.
2012-03-01
Appendix C reports factor analyses of the measurement items (interrole conflict, POS, and tempo scales), each with KMO and Bartlett's test tables (e.g., a Kaiser-Meyer-Olkin measure of sampling adequacy of .733 and Bartlett's test of sphericity for the tempo scale); extraction method: principal component analysis; rotation method: varimax with Kaiser normalization.
Rath, S; Panda, M; Sahu, M C; Padhy, R N
2015-09-01
The conventional methods for diagnosing tinea capitis (paediatric ringworm), the microscopic and culture tests, were evaluated quantitatively with Bayes' rule. This analysis helps quantify the pervasive errors in each diagnostic method, particularly the microscopic method, since long-term treatment with a particular antifungal chemotherapy is involved in eradicating the infection. Secondly, the analysis of clinical data yields, in digital form, the fallibility of the microscopic test method, with the culture test taken as the gold standard. Test results of 51 paediatric patients fell into four categories: 21 samples were true positive (both tests positive) and 13 were true negative; the remaining samples comprised 14 false positives (microscopic test positive with culture test negative) and 3 false negatives (microscopic test negative with culture test positive). The prevalence of tinea infection was 47.01% in the population of 51 children. The microscopic test was 87.5% efficient in arriving at a positive result when the culture test was positive, and 76.4% efficient in arriving at a negative result when the culture test was negative. The post-test probability indicates that a sample with both microscopic and culture tests would be correctly distinguished as coming from a sick or a healthy child with a chance of 71.5%. However, since the sensitivity of the analysis is 87.5%, microscopic test positivity is easier to detect in the presence of infection. In conclusion, Trichophyton rubrum was the most prevalent species, and the sensitivity and specificity of treating the infection by antifungal therapy before confirmation by the culture method were 0.8751 and 0.7642, respectively. A correct diagnosis of fungal infection could be achieved by modern molecular methods (matrix-assisted laser desorption ionisation-time of flight mass spectrometry, fluorescence in situ hybridization, enzyme-linked immunosorbent assay [ELISA], restriction fragment length polymorphism, or DNA/RNA probes of known fungal taxa) in advanced laboratories. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
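The Bayes-rule quantities in the abstract follow from the 2x2 table; a minimal sketch using the reported counts (conventions in the paper may differ slightly, so the printed values need not reproduce every reported percentage):

```python
def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Standard screening metrics computed directly from a 2x2 table,
    with the culture test treated as the gold standard."""
    total = tp + fp + tn + fn
    return dict(
        prevalence=(tp + fn) / total,
        sensitivity=tp / (tp + fn),     # P(test positive | disease)
        specificity=tn / (tn + fp),     # P(test negative | no disease)
        ppv=tp / (tp + fp),             # post-test probability given a positive
        npv=tn / (tn + fn),
    )

# Counts from the abstract: 21 TP, 14 FP, 13 TN, 3 FN
print(diagnostic_metrics(tp=21, fp=14, tn=13, fn=3))
```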
NASA Astrophysics Data System (ADS)
Nepomuceno, Miguel C. S.; Lopes, Sérgio M. R.
2017-10-01
Non-destructive tests (NDT) have been used in recent decades for the assessment of the in-situ quality and integrity of concrete elements. An important step in the application of NDT methods concerns the interpretation and validation of the test results. In general, interpretation of NDT results should involve three distinct phases leading to the development of conclusions: processing of the collected data, analysis of within-test variability, and quantitative evaluation of the property under investigation. The analysis of within-test variability can provide valuable information, since it can be compared with the within-test variability associated with the NDT method in use, either to provide a measure of quality control or to detect the presence of abnormal circumstances during the in-situ application. This paper reports the analysis of experimental results on the within-test variability of NDT obtained for normal vibrated concrete and self-compacting concrete. The NDT reported include the surface hardness test, ultrasonic pulse velocity test, penetration resistance test, pull-off test, pull-out test, and maturity test. The results obtained are discussed and conclusions are presented.
Non-destructive testing of full-length bonded rock bolts based on HHT signal analysis
NASA Astrophysics Data System (ADS)
Shi, Z. M.; Liu, L.; Peng, M.; Liu, C. C.; Tao, F. J.; Liu, C. S.
2018-04-01
Full-length bonded rock bolts are commonly used in mining, tunneling, and slope engineering because of their simple design and resistance to corrosion. However, the length of a rock bolt and the grouting quality often do not meet the required design standards in practice because of the concealment and complexity of bolt construction. Non-destructive testing is preferred when testing a rock bolt's quality because of its convenience, low cost, and wide detection range. In this paper, a signal analysis method for the non-destructive sound wave testing of full-length bonded rock bolts is presented, based on the Hilbert-Huang transform (HHT). First, we introduce the HHT analysis method for calculating the bolt length and identifying defect locations from sound wave reflection test signals, which includes decomposing the test signal via empirical mode decomposition (EMD), selecting the intrinsic mode functions (IMFs) using the Pearson correlation index (PCI), and calculating the instantaneous phase and frequency via the Hilbert transform (HT). Second, six model tests are conducted using different grouting defects and bolt protruding lengths to verify the effectiveness of the HHT analysis method. Lastly, the influence of the bolt protruding length on the test signal, the identification of multiple reflections from defects, the bolt end, and the protruding end, and mode mixing from EMD are discussed. The HHT analysis method can identify the bolt length and grouting defect locations from signals that contain noise at multiple reflecting interfaces. The reflection from a long protruding end creates an irregular test signal with many frequency peaks in the spectrum. The reflections from defects barely change the original signal because of their low energy and cannot be adequately resolved using existing methods. The HHT analysis method can identify reflections from the long protruding end of the bolt and multiple reflections from grouting defects based on mutations in the instantaneous frequency, which makes weak reflections more noticeable. The mode mixing phenomenon is observed in several tests, but it does not markedly affect the identification results because of the simple medium in bolt tests. Mode mixing can be reduced by ensemble EMD (EEMD) or complete ensemble EMD with adaptive noise (CEEMDAN), which are powerful tools for analyzing test signals in complex media and may play an important role in future studies. The HHT bolt signal analysis method is a self-adaptive and automatic process that can be programmed as analysis software, making bolt tests more convenient in practice.
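A rough sketch of the EMD-plus-Hilbert step, assuming the third-party PyEMD (EMD-signal) package for the decomposition; the synthetic burst-and-echo signal and all parameter values are invented for illustration and are not the authors' test data:

```python
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD   # assumes the PyEMD (EMD-signal) package is installed

def instantaneous_frequency(signal: np.ndarray, fs: float):
    """Decompose the signal with EMD, then take the Hilbert transform of the
    first IMF; abrupt mutations in the instantaneous frequency are the kind
    of feature used to flag reflections from defects or the bolt ends."""
    imfs = EMD().emd(signal)
    phase = np.unwrap(np.angle(hilbert(imfs[0])))
    freq = np.diff(phase) * fs / (2.0 * np.pi)    # instantaneous frequency, Hz
    return imfs, freq

# Hypothetical 20 kHz burst with a delayed, attenuated echo
fs = 500e3
t = np.arange(0, 2e-3, 1.0 / fs)
sig = np.sin(2 * np.pi * 20e3 * t) * np.exp(-t / 5e-4)
sig[400:] += 0.3 * np.sin(2 * np.pi * 20e3 * t[:-400]) * np.exp(-t[:-400] / 5e-4)
imfs, freq = instantaneous_frequency(sig, fs)
```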
[Optimization of cluster analysis based on drug resistance profiles of MRSA isolates].
Tani, Hiroya; Kishi, Takahiko; Gotoh, Minehiro; Yamagishi, Yuka; Mikamo, Hiroshige
2015-12-01
We examined 402 methicillin-resistant Staphylococcus aureus (MRSA) strains isolated from clinical specimens in our hospital between November 19, 2010 and December 27, 2011 to evaluate the similarity between cluster analysis of drug susceptibility tests and pulsed-field gel electrophoresis (PFGE). The results showed that the 402 strains tested were classified into 27 PFGE patterns (151 subtypes of patterns). Cluster analyses of drug susceptibility tests, with the cut-off distance chosen to yield a similar classification capability, showed favorable results. When the MIC method was used, with minimum inhibitory concentration (MIC) values entered directly, the level of agreement with PFGE was 74.2% with 15 drugs tested; the unweighted pair group method with arithmetic mean (UPGMA) was effective with a cut-off distance of 16. Using the SIR method, in which susceptible (S), intermediate (I), and resistant (R) were coded as 0, 2, and 3, respectively, according to the Clinical and Laboratory Standards Institute (CLSI) criteria, the level of agreement with PFGE was 75.9% with 17 drugs tested, UPGMA clustering, and a cut-off distance of 3.6. In addition, to assess the reproducibility of the results, 10 strains were randomly sampled from the overall test set and subjected to cluster analysis; this was repeated 100 times under the same conditions. The results indicated good reproducibility, with the level of agreement with PFGE showing a mean of 82.0%, standard deviation of 12.1%, and mode of 90.0% for the MIC method, and a mean of 80.0%, standard deviation of 13.4%, and mode of 90.0% for the SIR method. In summary, cluster analysis of drug susceptibility tests is useful for the epidemiological analysis of MRSA.
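For orientation, a small Python sketch of UPGMA (average-linkage) clustering of susceptibility profiles with a distance cut-off, analogous to the workflow above; the log2 MIC matrix and the cut-off value are hypothetical:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical log2(MIC) profiles: rows = MRSA isolates, columns = drugs
profiles = np.array([[1, 3, 0, 5, 2],
                     [1, 3, 1, 5, 2],
                     [4, 0, 2, 1, 6],
                     [4, 1, 2, 1, 6],
                     [0, 2, 0, 4, 2]], dtype=float)

# UPGMA = average linkage on Euclidean distances between profiles
z = linkage(profiles, method="average", metric="euclidean")

# Cut the dendrogram at a chosen distance to assign cluster labels
labels = fcluster(z, t=3.0, criterion="distance")
print(labels)   # isolates sharing a label form one susceptibility cluster
```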
Preliminary Tests For Development Of A Non-Pertechnetate Analysis Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diprete, D.; McCabe, D.
2016-09-28
The objective of this task was to develop a non-pertechnetate analysis method that the 222-S laboratory could easily implement. The initial scope involved working with 222-S laboratory personnel to adapt the existing Tc analytical method to fractionate the non-pertechnetate and pertechnetate. SRNL then developed and tested a method using commercial sorbents containing Aliquat® 336 to extract the pertechnetate (thereby separating it from non-pertechnetate), followed by oxidation, extraction, and stripping steps, and finally analysis by beta counting and mass spectrometry. Several additional items were partially investigated, including the impacts of a 137Cs removal step. The method was initially tested on SRS tank waste samples to determine its viability. Although SRS tank waste does not contain non-pertechnetate, testing with it was useful for investigating the compatibility, separation efficiency, interference removal efficacy, and sensitivity of the method.
Test/semi-empirical analysis of a carbon/epoxy fabric stiffened panel
NASA Technical Reports Server (NTRS)
Spier, E. E.; Anderson, J. A.
1990-01-01
The purpose of this work-in-progress is to present a semi-empirical analysis method developed to predict the buckling and crippling loads of carbon/epoxy fabric blade-stiffened panels in compression. This is a hand analysis method comprising well-known, accepted techniques, logical engineering judgments, and experimental data, resulting in conservative solutions. In order to verify this method, a stiffened panel was fabricated and tested. Both the test and analysis results are presented.
NASA Technical Reports Server (NTRS)
Martos, Borja; Kiszely, Paul; Foster, John V.
2011-01-01
As part of the NASA Aviation Safety Program (AvSP), a novel pitot-static calibration method was developed to allow rapid in-flight calibration for subscale aircraft while flying within confined test areas. This approach uses Global Positioning System (GPS) technology coupled with modern system identification methods that rapidly compute optimal pressure error models over a range of airspeed with defined confidence bounds. This method has been demonstrated in subscale flight tests and has shown small 2-sigma error bounds with significant reduction in test time compared to other methods. The current research was motivated by the desire to further evaluate and develop this method for full-scale aircraft. A goal of this research was to develop an accurate calibration method that enables reductions in test equipment and flight time, thus reducing costs. The approach involved analysis of data acquisition requirements, development of efficient flight patterns, and analysis of pressure error models based on system identification methods. Flight tests were conducted at The University of Tennessee Space Institute (UTSI) utilizing an instrumented Piper Navajo research aircraft. In addition, the UTSI engineering flight simulator was used to investigate test maneuver requirements and handling qualities issues associated with this technique. This paper provides a summary of piloted simulation and flight test results that illustrates the performance and capabilities of the NASA calibration method. Discussion of maneuver requirements and data analysis methods is included, as well as recommendations for piloting technique.
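As a generic illustration of fitting a pressure-error model with confidence bounds (not the NASA algorithm itself), a least-squares sketch with hypothetical calibration points; the quadratic model form and all numbers are assumptions:

```python
import numpy as np

# Hypothetical calibration points: indicated airspeed (kt) and the airspeed
# error inferred from GPS maneuvers (kt)
v_ind = np.array([80.0, 100.0, 120.0, 140.0, 160.0, 180.0])
dv = np.array([3.1, 2.4, 1.9, 1.6, 1.5, 1.6])

# Quadratic error model dv = c2*v^2 + c1*v + c0, with coefficient covariance
coeffs, cov = np.polyfit(v_ind, dv, deg=2, cov=True)
model = np.poly1d(coeffs)

# Approximate 2-sigma bounds on the fitted curve via the delta method
v = np.linspace(80.0, 180.0, 5)
basis = np.vander(v, 3)                        # columns: v^2, v, 1
var = np.einsum("ij,jk,ik->i", basis, cov, basis)
print(model(v), 2.0 * np.sqrt(var))
```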
Integrated Data Collection Analysis (IDCA) Program - SSST Testing Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandstrom, Mary M.; Brown, Geoffrey W.; Preston, Daniel N.
The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the methods used for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis during the IDCA program. These methods changed throughout the Proficiency Test, and the reasons for these changes are documented in this report. The most significant modifications in standard testing methods are: 1) including one specified sandpaper in impact testing among all the participants, 2) diversifying liquid test methods for selected participants, and 3) including sealed sample holders for thermal testing by at least one participant. This effort, funded by the Department of Homeland Security (DHS), is putting the issues of safe handling of these materials in perspective with standard military explosives. The study is adding SSST testing results for a broad suite of different HMEs to the literature. Ultimately the study will suggest new guidelines and methods and possibly establish the SSST testing accuracies needed to develop safe handling practices for HMEs. Each participating testing laboratory uses identical test materials and preparation methods wherever possible. The testing performers involved are Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Indian Head Division, Naval Surface Warfare Center (NSWC IHD), Sandia National Laboratories (SNL), and Air Force Research Laboratory (AFRL/RXQL). These tests are conducted as a proficiency study in order to establish some consistency in test protocols, procedures, and experiments and to compare results when these testing variables cannot be made consistent.
Wang, D Z; Wang, C; Shen, C F; Zhang, Y; Zhang, H; Song, G D; Xue, X D; Xu, Z L; Zhang, S; Jiang, G H
2017-05-10
We described the time trend of acute myocardial infarction (AMI) incidence in Tianjin from 1999 to 2013 using the Cochran-Armitage trend (CAT) test and linear regression analysis, and compared the results. Based on the actual population, the CAT test had much stronger statistical power than linear regression analysis for both the overall incidence trend and the age-specific incidence trends (Cochran-Armitage trend P value…
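A compact sketch of the Cochran-Armitage trend statistic for ordered yearly incidence data, in Python; the case counts and person-year totals are hypothetical:

```python
import numpy as np
from scipy import stats

def cochran_armitage(cases, totals, scores=None):
    """Cochran-Armitage test for a linear trend in proportions across
    ordered groups (e.g., AMI cases per year over a population at risk)."""
    r = np.asarray(cases, dtype=float)
    n = np.asarray(totals, dtype=float)
    s = np.arange(len(r), dtype=float) if scores is None else np.asarray(scores)
    p = r.sum() / n.sum()                                  # overall proportion
    t = np.sum(s * (r - n * p))                            # trend statistic
    var = p * (1 - p) * (np.sum(n * s**2) - np.sum(n * s) ** 2 / n.sum())
    z = t / np.sqrt(var)
    return z, 2.0 * stats.norm.sf(abs(z))                  # two-sided p-value

# Hypothetical yearly case counts over a fixed population at risk
print(cochran_armitage(cases=[120, 135, 150, 170, 185], totals=[1e6] * 5))
```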
A power analysis for multivariate tests of temporal trend in species composition.
Irvine, Kathryn M; Dinger, Eric C; Sarr, Daniel
2011-10-01
Long-term monitoring programs emphasize power analysis as a tool to determine the sampling effort necessary to effectively document ecologically significant changes in ecosystems. Programs that monitor entire multispecies assemblages require a method for determining the power of multivariate statistical models to detect trend. We provide a method to simulate presence-absence species assemblage data that are consistent with increasing or decreasing directional change in species composition within multiple sites. This step is the foundation for using Monte Carlo methods to approximate the power of any multivariate method for detecting temporal trends. We focus on comparing the power of the Mantel test, permutational multivariate analysis of variance, and constrained analysis of principal coordinates. We find that the power of the various methods we investigate is sensitive to the number of species in the community, univariate species patterns, and the number of sites sampled over time. For increasing directional change scenarios, constrained analysis of principal coordinates was as or more powerful than permutational multivariate analysis of variance, the Mantel test was the least powerful. However, in our investigation of decreasing directional change, the Mantel test was typically as or more powerful than the other models.
A Statistical Discrimination Experiment for Eurasian Events Using a Twenty-Seven-Station Network
1980-07-08
to test the effectiveness of a multivariate method of analysis for distinguishing earthquakes from explosions. The data base for the experiment…
Seminar on Understanding Digital Control and Analysis in Vibration Test Systems
NASA Technical Reports Server (NTRS)
1975-01-01
The advantages of digital methods over analog vibration methods are demonstrated. The following topics are covered: (1) methods of computer-controlled random vibration and reverberation acoustic testing, (2) methods of computer-controlled sine-wave vibration testing, and (3) methods of computer-controlled shock testing. General algorithms are described in the form of block diagrams and flow diagrams.
Ice Growth Measurements from Image Data to Support Ice Crystal and Mixed-Phase Accretion Testing
NASA Technical Reports Server (NTRS)
Struk, Peter M.; Lynch, Christopher J.
2012-01-01
This paper describes the imaging techniques as well as the analysis methods used to measure the ice thickness and growth rate in support of ice-crystal icing tests performed at the National Research Council of Canada (NRC) Research Altitude Test Facility (RATFac). A detailed description of the camera setup, which involves both still and video cameras, as well as the analysis methods using the NASA Spotlight software, are presented. Two cases, one from two different test entries, showing significant ice growth are analyzed in detail describing the ice thickness and growth rate which is generally linear. Estimates of the bias uncertainty are presented for all measurements. Finally some of the challenges related to the imaging and analysis methods are discussed as well as methods used to overcome them.
Temporary Thermocouple Attachment for Thermal/Vacuum Testing at Non-Extreme Temperatures
NASA Technical Reports Server (NTRS)
Ungar, Eugene K.; Wright, Sarah E.
2016-01-01
Post-test examination and data analysis following a two-week-long vacuum test showed that numerous self-stick thermocouples had become detached from the test article. The thermocouples were reattached with thermally conductive epoxy and the test was repeated to obtain the required data. Because the thermocouple detachment resulted in significant expense and rework, it was decided to investigate the temporary attachment methods used around NASA and to perform a test to assess their efficacy. The present work describes the original test and the analysis that showed the thermocouples had become detached, the temporary thermocouple attachment methods assessed in the retest and in the thermocouple attachment test, and a recommendation for attachment methods for future tests.
Testing for intracycle determinism in pseudoperiodic time series.
Coelho, Mara C S; Mendes, Eduardo M A M; Aguirre, Luis A
2008-06-01
A determinism test is proposed based on the well-known method of surrogate data. Assuming predictability to be a signature of determinism, the proposed method checks for intracycle (e.g., short-term) determinism in pseudoperiodic time series for which standard methods of surrogate analysis do not apply. The approach presented is composed of two steps. First, the data are preprocessed to reduce the effects of seasonal and trend components. Second, standard tests of surrogate analysis can then be used. The determinism test is applied to simulated and experimental pseudoperiodic time series, and the results show the applicability of the proposed test.
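For context on the surrogate step, a minimal sketch of phase-randomized surrogate generation, which preserves the power spectrum while destroying deterministic (predictable) structure; the synthetic series is illustrative, and the paper's preprocessing for pseudoperiodic data is a separate prior step:

```python
import numpy as np

def phase_randomized_surrogate(x: np.ndarray, seed=None) -> np.ndarray:
    """Surrogate with the same amplitude spectrum as x but random Fourier
    phases, i.e., a realization consistent with a linear stochastic null."""
    rng = np.random.default_rng(seed)
    spectrum = np.fft.rfft(x - x.mean())
    phases = rng.uniform(0.0, 2.0 * np.pi, len(spectrum))
    phases[0] = 0.0                        # keep the DC component real
    if len(x) % 2 == 0:
        phases[-1] = 0.0                   # keep the Nyquist component real
    surrogate = np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=len(x))
    return surrogate + x.mean()

# A prediction-error statistic on the data vs. its surrogates would then
# indicate determinism when the data are significantly more predictable
x = np.sin(np.linspace(0.0, 40.0 * np.pi, 2000)) + 0.1 * np.random.randn(2000)
surrogate = phase_randomized_surrogate(x, seed=1)
```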
Scavuzzo-Duggan, Tess R.; Chaves, Arielle M.; Roberts, Alison W.
2015-07-14
Here, a method for rapid in vivo functional analysis of engineered proteins was developed using Physcomitrella patens. A complementation assay was designed for testing structure/function relationships in cellulose synthase (CESA) proteins. The components of the assay include (1) construction of test vectors that drive expression of epitope-tagged PpCESA5 carrying engineered mutations, (2) transformation of a ppcesa5 knockout line that fails to produce gametophores with test and control vectors, (3) scoring the stable transformants for gametophore production, (4) statistical analysis comparing complementation rates for test vectors to positive and negative control vectors, and (5) analysis of transgenic protein expression by Western blotting. The assay distinguished mutations that generate fully functional, nonfunctional, and partially functional proteins. In conclusion, compared with existing methods for in vivo testing of protein function, this complementation assay provides a rapid method for investigating protein structure/function relationships in plants.
Battery Test Manual For 48 Volt Mild Hybrid Electric Vehicles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Lee Kenneth
2017-03-01
This manual details the U.S. Advanced Battery Consortium and U.S. Department of Energy Vehicle Technologies Program goals, test methods, and analysis techniques for a 48 Volt Mild Hybrid Electric Vehicle system. The test methods are outlined starting with characterization tests, followed by life tests. The final section details standardized analysis techniques for 48 V systems that allow for the comparison of different programs that use this manual. An example test plan is included, along with guidance for filling in gap table numbers.
Application of software technology to automatic test data analysis
NASA Technical Reports Server (NTRS)
Stagner, J. R.
1991-01-01
The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.
Benefits of Using Planned Comparisons Rather Than Post Hoc Tests: A Brief Review with Examples.
ERIC Educational Resources Information Center
DuRapau, Theresa M.
The rationale behind analysis of variance (including analysis of covariance and multiple analyses of variance and covariance) methods is reviewed, and unplanned and planned methods of evaluating differences between means are briefly described. Two advantages of using planned or a priori tests over unplanned or post hoc tests are presented. In…
The Use of Time Series Analysis and t Tests with Serially Correlated Data Tests.
ERIC Educational Resources Information Center
Nicolich, Mark J.; Weinstein, Carol S.
1981-01-01
Results of three methods of analysis applied to simulated autocorrelated data sets with an intervention point (varying in autocorrelation degree, variance of error term, and magnitude of intervention effect) are compared and presented. The three methods are: t tests; maximum likelihood Box-Jenkins (ARIMA); and Bayesian Box Jenkins. (Author/AEF)
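A short simulation makes the motivation concrete. The sketch below (illustrative only, not the study's design) generates AR(1) series with no intervention effect at all and shows how a plain two-sample t test rejects far more often than its nominal 5% level.

```python
# Hedged illustration (not the study's design): AR(1) series with NO
# intervention effect are split at a nominal intervention point and
# analyzed with a plain two-sample t test. Positive autocorrelation
# inflates the empirical type I error well above the nominal 5%.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def ar1(n, phi):
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

n, phi, reps = 100, 0.7, 2000
false_pos = 0
for _ in range(reps):
    x = ar1(n, phi)
    _, p = stats.ttest_ind(x[:n // 2], x[n // 2:])
    false_pos += p < 0.05

print(f"empirical type I error: {false_pos / reps:.2f} (nominal 0.05)")
```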
ANALYSIS OF ALDEHYDES AND KETONES IN THE GAS PHASE
The development and testing of a 2,4-dinitrophenylhydrazine-acetonitrile (DNPH-ACN) method for the analysis of aldehydes and ketones in ambient air are described. A discussion of interferences, preparation of calibration standards, analytical testing, fluorescence methods and car...
[Isolation and identification methods of enterobacteria group and its technological advancement].
Furuta, Itaru
2007-08-01
In the last half-century, isolation and identification methods for enterobacteria groups have markedly improved through technological advancement. Clinical microbiology tests have changed over time from tube methods to commercial identification kits and automated identification. Tube methods are the original method for the identification of enterobacteria groups, that is, the essential baseline for recognizing bacterial fermentation and biochemical principles. In this paper, traditional tube tests are discussed, such as the utilization of carbohydrates and the indole, methyl red, citrate, and urease tests. Commercial identification kits and automated instruments with computer-based analysis are also discussed as current methods, as they provide rapidity and accuracy. Nonculture techniques, such as nucleic acid typing by PCR analysis and immunochemical methods using monoclonal antibodies, can be developed further.
The SNPforID Assay as a Supplementary Method in Kinship and Trace Analysis
Schwark, Thorsten; Meyer, Patrick; Harder, Melanie; Modrow, Jan-Hendrick; von Wurmb-Schwark, Nicole
2012-01-01
Objective: Short tandem repeat (STR) analysis using commercial multiplex PCR kits is the method of choice for kinship testing and trace analysis. However, under certain circumstances (deficiency testing, mutations, minute DNA amounts), STRs alone may not suffice. Methods: We present a 50-plex single nucleotide polymorphism (SNP) assay based on the SNPs chosen by the SNPforID consortium as an additional method for paternity and for trace analysis. The new assay was applied to selected routine paternity and trace cases from our laboratory. Results and Conclusions: Our investigation shows that the new SNP multiplex assay is a valuable method to supplement STR analysis, and is a powerful means to solve complicated genetic analyses. PMID:22851934
Topics on Test Methods for Space Systems and Operations Safety: Applicability of Experimental Data
NASA Technical Reports Server (NTRS)
Hirsch, David B.
2009-01-01
This viewgraph presentation reviews topics on test methods for space systems and operations safety through experimentation and analysis. The contents include: 1) Perception of reality through experimentation and analysis; 2) Measurements, methods, and correlations with real life; and 3) Correlating laboratory aerospace materials flammability data with data in spacecraft environments.
How to test validity in orthodontic research: a mixed dentition analysis example.
Donatelli, Richard E; Lee, Shin-Jae
2015-02-01
The data used to test the validity of a prediction method should be different from the data used to generate the prediction model. In this study, we explored whether an independent data set is mandatory for testing the validity of a new prediction method and how validity can be tested without independent new data. Several validation methods were compared in an example using the data from a mixed dentition analysis with a regression model. The validation errors of real mixed dentition analysis data and simulation data were analyzed for increasingly large data sets. The validation results of both the real and the simulation studies demonstrated that the leave-1-out cross-validation method had the smallest errors. The largest errors occurred in the traditional simple validation method. The differences between the validation methods diminished as the sample size increased. The leave-1-out cross-validation method seems to be an optimal validation method for improving the prediction accuracy in a data set with limited sample sizes. Copyright © 2015 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
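The sketch below illustrates the comparison on toy data (the variables stand in for mixed-dentition measurements and are not the study's data): leave-one-out cross-validation reuses every observation, whereas a single random holdout wastes scarce samples.

```python
# Hedged sketch of leave-one-out cross-validation (LOOCV) for a regression
# prediction model, contrasted with a single train/test split. The toy
# data stand in for mixed-dentition measurements; they are not the study's.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, train_test_split, cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(40, 2))                    # e.g., measured tooth widths
y = X @ [1.5, -0.8] + rng.normal(0, 0.5, 40)    # e.g., widths to be predicted

# LOOCV: every observation is held out exactly once, so all data are reused
loo_mse = -cross_val_score(LinearRegression(), X, y,
                           cv=LeaveOneOut(),
                           scoring="neg_mean_squared_error").mean()

# Simple validation: a single random holdout, wasteful at small n
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
split_mse = np.mean((LinearRegression().fit(X_tr, y_tr).predict(X_te) - y_te) ** 2)

print(f"LOOCV MSE: {loo_mse:.3f}   single-split MSE: {split_mse:.3f}")
```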
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
Larson, Nicholas B; McDonnell, Shannon; Cannon Albright, Lisa; Teerlink, Craig; Stanford, Janet; Ostrander, Elaine A; Isaacs, William B; Xu, Jianfeng; Cooney, Kathleen A; Lange, Ethan; Schleutker, Johanna; Carpten, John D; Powell, Isaac; Bailey-Wilson, Joan E; Cussenot, Olivier; Cancel-Tassin, Geraldine; Giles, Graham G; MacInnis, Robert J; Maier, Christiane; Whittemore, Alice S; Hsieh, Chih-Lin; Wiklund, Fredrik; Catalona, William J; Foulkes, William; Mandal, Diptasri; Eeles, Rosalind; Kote-Jarai, Zsofia; Ackerman, Michael J; Olson, Timothy M; Klein, Christopher J; Thibodeau, Stephen N; Schaid, Daniel J
2017-05-01
Next-generation sequencing technologies have afforded unprecedented characterization of low-frequency and rare genetic variation. Due to low power for single-variant testing, aggregative methods are commonly used to combine observed rare variation within a single gene. Causal variation may also aggregate across multiple genes within relevant biomolecular pathways. Kernel-machine regression and adaptive testing methods for aggregative rare-variant association testing have been demonstrated to be powerful approaches for pathway-level analysis, although these methods tend to be computationally intensive at high-variant dimensionality and require access to complete data. An additional analytical issue in scans of large pathway definition sets is multiple testing correction. Gene set definitions may exhibit substantial genic overlap, and the impact of the resultant correlation in test statistics on Type I error rate control for large agnostic gene set scans has not been fully explored. Herein, we first outline a statistical strategy for aggregative rare-variant analysis using component gene-level linear kernel score test summary statistics as well as derive simple estimators of the effective number of tests for family-wise error rate control. We then conduct extensive simulation studies to characterize the behavior of our approach relative to direct application of kernel and adaptive methods under a variety of conditions. We also apply our method to two case-control studies, respectively, evaluating rare variation in hereditary prostate cancer and schizophrenia. Finally, we provide open-source R code for public use to facilitate easy application of our methods to existing rare-variant analysis results. © 2017 WILEY PERIODICALS, INC.
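As a hedged illustration of the effective-number-of-tests idea (in the spirit of the Li and Ji eigenvalue estimator; the authors' own estimators may differ), the sketch below derives M_eff from a toy correlation matrix of gene-set test statistics and uses it for a Šidák-style adjustment.

```python
# Hedged sketch: an eigenvalue-based estimate of the effective number of
# independent tests (in the spirit of Li & Ji, 2005), used for family-wise
# error control when gene-set test statistics are correlated. The paper's
# own estimators may differ; this only illustrates the idea.
import numpy as np

def effective_tests(stat_corr):
    """Li & Ji-style M_eff from a correlation matrix of test statistics."""
    lam = np.abs(np.linalg.eigvalsh(stat_corr))
    # each eigenvalue contributes an indicator for its integer part
    # plus its fractional part
    return float(np.sum((lam >= 1).astype(float) + (lam - np.floor(lam))))

# Toy correlation matrix for 5 overlapping gene-set tests
R = np.array([[1.0, 0.8, 0.6, 0.1, 0.0],
              [0.8, 1.0, 0.7, 0.1, 0.0],
              [0.6, 0.7, 1.0, 0.2, 0.1],
              [0.1, 0.1, 0.2, 1.0, 0.3],
              [0.0, 0.0, 0.1, 0.3, 1.0]])

m_eff = effective_tests(R)
alpha_sidak = 1 - (1 - 0.05) ** (1 / m_eff)   # Šidák-adjusted per-test alpha
print(f"M_eff = {m_eff:.2f}, per-test alpha = {alpha_sidak:.4f}")
```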
Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS).
Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M; Khan, Ajmal
2016-01-01
This article compares the study design and statistical methods used in 2005, 2010 and 2015 of Pakistan Journal of Medical Sciences (PJMS). Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015) in which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods were found in the analysis. The most frequent methods include: descriptive statistics (n=315, 73.4%), chi-square/Fisher's exact tests (n=205, 47.8%) and student t-test (n=186, 43.4%). There was a significant increase in the use of statistical methods over the time period: t-test, chi-square/Fisher's exact test, logistic regression, epidemiological statistics, and non-parametric tests. This study shows that a diverse variety of statistical methods have been used in the research articles of PJMS and their frequency improved from 2005 to 2015. However, descriptive statistics was the most frequent method of statistical analysis in the published articles, while the cross-sectional design was the most common study design.
NASA Astrophysics Data System (ADS)
Chen, Xinjia; Lacy, Fred; Carriere, Patrick
2015-05-01
Sequential test algorithms are playing increasingly important roles for quickly detecting network intrusions such as port scanners. In view of the fact that such algorithms are usually analyzed based on intuitive approximation or asymptotic analysis, we develop an exact computational method for the performance analysis of such algorithms. Our method can be used to calculate the probability of false alarm and the average detection time up to arbitrarily pre-specified accuracy.
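The abstract's method computes these quantities exactly; the hedged sketch below merely illustrates what is being computed, using a textbook sequential probability ratio test for a portscan-style detector and Monte Carlo estimation of its false-alarm probability.

```python
# Hedged sketch: Monte Carlo estimate of the false-alarm probability of a
# textbook sequential probability ratio test (SPRT), the kind of detector
# the abstract analyzes. The paper computes this quantity exactly; the
# probabilities and thresholds below are illustrative choices only.
import numpy as np

rng = np.random.default_rng(3)

# H0: benign host, P(failed connection) = p0; H1: scanner, p1
p0, p1 = 0.2, 0.8
A, B = np.log(99), np.log(1 / 99)   # thresholds for ~1% error rates

def sprt_declares_scanner(p_true, max_steps=500):
    llr = 0.0                        # running log-likelihood ratio
    for _ in range(max_steps):
        x = rng.random() < p_true    # one connection-attempt outcome
        llr += np.log(p1 / p0) if x else np.log((1 - p1) / (1 - p0))
        if llr >= A:
            return True              # declare scanner
        if llr <= B:
            return False             # declare benign
    return False

reps = 5000
false_alarms = sum(sprt_declares_scanner(p0) for _ in range(reps))
print(f"estimated P(false alarm) = {false_alarms / reps:.4f}")
```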
Testing an automated method to estimate ground-water recharge from streamflow records
Rutledge, A.T.; Daniel, C.C.
1994-01-01
The computer program, RORA, allows automated analysis of streamflow hydrographs to estimate ground-water recharge. Output from the program, which is based on the recession-curve-displacement method (often referred to as the Rorabaugh method, for whom the program is named), was compared to estimates of recharge obtained from a manual analysis of 156 years of streamflow record from 15 streamflow-gaging stations in the eastern United States. Statistical tests showed that there was no significant difference between paired estimates of annual recharge by the two methods. Tests of results produced by the four workers who performed the manual method showed that results can differ significantly between workers. Twenty-two percent of the variation between manual and automated estimates could be attributed to having different workers perform the manual method. The program RORA will produce estimates of recharge equivalent to estimates produced manually, greatly increase the speed of analysis, and reduce the subjectivity inherent in manual analysis.
7 CFR 58.812 - Methods of sample analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 3 2010-01-01 2010-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL, as issued by the USDA, Agricultural...
7 CFR 58.245 - Method of sample analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 3 2010-01-01 2010-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL as issued by the USDA, Agricultural Marketing...
Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.
Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing
2017-01-01
Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.
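A skeletal version of such a coverage-guided genetic algorithm is sketched below. The `run_and_measure_coverage` hook is hypothetical; in the described system it would execute the instrumented protocol implementation and return a code-coverage score.

```python
# Hedged skeleton of a coverage-guided genetic algorithm like the one
# described above. `run_and_measure_coverage` is a hypothetical hook that
# would send a test input to the instrumented protocol implementation and
# return the fraction of code executed; the placeholder below only rewards
# byte diversity so the loop runs standalone.
import random

random.seed(4)

def run_and_measure_coverage(packet: bytes) -> float:
    # Placeholder fitness: stands in for executing the target binary
    # under instrumentation and reading back its coverage counters.
    return len(set(packet)) / 256.0

def mutate(packet: bytes) -> bytes:
    b = bytearray(packet)
    b[random.randrange(len(b))] = random.randrange(256)
    return bytes(b)

def crossover(a: bytes, b: bytes) -> bytes:
    cut = random.randrange(1, min(len(a), len(b)))
    return a[:cut] + b[cut:]

population = [bytes(random.randrange(256) for _ in range(64)) for _ in range(20)]
for generation in range(50):
    scored = sorted(population, key=run_and_measure_coverage, reverse=True)
    parents = scored[:10]            # selection: keep the fittest half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(10)]
    population = parents + children

best = max(population, key=run_and_measure_coverage)
print(f"best coverage proxy: {run_and_measure_coverage(best):.3f}")
```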
Ryu, Ehri; Cheong, Jeewon
2017-01-01
In this article, we evaluated the performance of statistical methods in single-group and multi-group analysis approaches for testing group difference in indirect effects and for testing simple indirect effects in each group. We also investigated whether the performance of the methods in the single-group approach was affected when the assumption of equal variance was not satisfied. The assumption was critical for the performance of the two methods in the single-group analysis: the method using a product term for testing the group difference in a single path coefficient, and the Wald test for testing the group difference in the indirect effect. Bootstrap confidence intervals in the single-group approach and all methods in the multi-group approach were not affected by the violation of the assumption. We compared the performance of the methods and provided recommendations. PMID:28553248
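The bootstrap confidence interval singled out as robust above can be sketched compactly. The code below (a minimal illustration on synthetic data, not the authors' simulation design) builds a percentile bootstrap interval for the indirect effect a*b in a single-group mediation model.

```python
# Hedged sketch: percentile bootstrap confidence interval for an indirect
# effect a*b in a simple mediation model (X -> M -> Y), the kind of
# interval the abstract found robust to violated variance assumptions.
import numpy as np

rng = np.random.default_rng(5)
n = 200
X = rng.normal(size=n)
M = 0.5 * X + rng.normal(size=n)          # true a = 0.5
Y = 0.4 * M + rng.normal(size=n)          # true b = 0.4

def indirect(X, M, Y):
    a = np.polyfit(X, M, 1)[0]            # slope of M on X
    # slope of Y on M controlling for X (ordinary least squares)
    Z = np.column_stack([M, X, np.ones_like(X)])
    b = np.linalg.lstsq(Z, Y, rcond=None)[0][0]
    return a * b

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)           # resample cases with replacement
    boot.append(indirect(X[idx], M[idx], Y[idx]))

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect(X, M, Y):.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```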
A scoping review of spatial cluster analysis techniques for point-event data.
Fritz, Charles E; Schuurman, Nadine; Robertson, Colin; Lear, Scott
2013-05-01
Spatial cluster analysis is a uniquely interdisciplinary endeavour, and so it is important to communicate and disseminate ideas, innovations, best practices and challenges across practitioners, applied epidemiology researchers and spatial statisticians. In this research we conducted a scoping review to systematically search peer-reviewed journal databases for research that has employed spatial cluster analysis methods on individual-level, address location, or x and y coordinate derived data. To illustrate the thematic issues raised by our results, methods were tested using a dataset where known clusters existed. Point pattern methods, spatial clustering and cluster detection tests, and a locally weighted spatial regression model were most commonly used for individual-level, address location data (n = 29). The spatial scan statistic was the most popular method for address location data (n = 19). Six themes were identified relating to the application of spatial cluster analysis methods and subsequent analyses, which we recommend researchers consider: exploratory analysis, visualization, spatial resolution, aetiology, scale and spatial weights. It is our intention that researchers seeking direction for using spatial cluster analysis methods consider the caveats and strengths of each approach, but also explore the numerous other methods available for this type of analysis. Applied spatial epidemiology researchers and practitioners should give special consideration to applying multiple tests to a dataset. Future research should focus on developing frameworks for selecting appropriate methods and the corresponding spatial weighting schemes.
Effects of Barometric Fluctuations on Well Water-Level Measurements and Aquifer Test Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spane, Frank A.
1999-12-16
This report examines the effects of barometric fluctuations on well water-level measurements and evaluates adjustment and removal methods for determining areal aquifer head conditions and aquifer test analysis. Two examples of Hanford Site unconfined aquifer tests are examined that demonstrate barometric response analysis and illustrate the predictive/removal capabilities of various methods for well water-level and aquifer total head values. Good predictive/removal characteristics were demonstrated, with the best corrective results provided by multiple-regression deconvolution methods.
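A hedged sketch of the multiple-regression deconvolution idea follows: water-level changes are regressed on current and lagged barometric-pressure changes, and the fitted barometric component is removed. The synthetic series and lag structure below are illustrative assumptions only.

```python
# Hedged sketch of multiple-regression deconvolution on synthetic data:
# regress water-level changes on current and lagged barometric-pressure
# changes, then subtract the fitted barometric component. The lag count
# and synthetic well response below are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(6)
n, n_lags = 500, 12
baro = np.cumsum(rng.normal(0.0, 0.1, n))        # synthetic pressure record
dB = np.diff(baro, prepend=baro[0])              # barometric changes

true_head = 0.002 * np.arange(n)                 # slow aquifer trend
weights = -0.4 * np.exp(-np.arange(n_lags) / 4)  # delayed well response
baro_part = np.convolve(dB, weights)[:n]         # per-step barometric effect
level = true_head + np.cumsum(baro_part) + rng.normal(0, 0.002, n)

# Least-squares fit of water-level changes on lagged barometric changes
dW = np.diff(level, prepend=level[0])
X = np.column_stack([np.roll(dB, k) for k in range(n_lags)])[n_lags:]
coef, *_ = np.linalg.lstsq(X, dW[n_lags:], rcond=None)

corrected = level.copy()
corrected[n_lags:] -= np.cumsum(X @ coef)        # remove barometric component
print("recovered lag weights:", np.round(coef[:4], 3))
print("true lag weights:     ", np.round(weights[:4], 3))
```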
Contribution to interplay between a delamination test and a sensory analysis of mid-range lipsticks.
Richard, C; Tillé-Salmon, B; Mofid, Y
2016-02-01
Lipstick is currently one of the most sold products of the cosmetics industry, and the competition between the various manufacturers is significant. Customers mainly seek products with high spreadability, especially long-lasting or long wear on the lips. Evaluation tests of cosmetics are usually performed by sensory analysis, which can represent a considerable cost. The objective of this study was to develop a fast and simple delamination test (an objective method with calibrated instruments) and to compare the obtained results with those of a discriminative sensory analysis (a subjective method) in order to show the relevance of the instrumental test. Three mid-range lipsticks were randomly chosen and were tested. They were made of compositions as described by the International Nomenclature of Cosmetic Ingredients (INCI). Instrumental characterization was performed by texture profile analysis and by a special delamination test. The sensory analysis was voluntarily conducted with an untrained panel as a blind test to confirm or reverse the possible correspondence. The two approaches or methods gave the same type of classification. The high-fat lipstick had the worst behaviour in the delamination test and the worst notation of the intensity of descriptors in the sensory analysis. There is a high correlation between the sensory analysis and the instrumental measurements in this study. The delamination test carried out should permit quick determination of lasting wear (screening test) and, in consequence, optimization of the basic formula of lipsticks. © 2015 Society of Cosmetic Scientists and the Société Française de Cosmétologie.
Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A
2017-06-30
Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists have questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has small-sample-size limitations. We used a pooled method in the nonparametric bootstrap test that may overcome the problem related to small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling to the corresponding parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means as compared with the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test, while maintaining the type I error probability under all conditions except for Cauchy and extremely variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than the other alternatives. The nonparametric bootstrap test provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling for comparing paired or unpaired means and for validating one-way analysis of variance test results for non-normal data in small-sample-size studies. Copyright © 2017 John Wiley & Sons, Ltd.
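The pooled-resampling idea can be sketched briefly. In the illustration below (details may differ from the authors' procedure), both groups are pooled to form a common null population, bootstrap groups are drawn from the pool, and the observed t statistic is referred to the resampled null distribution.

```python
# Hedged sketch of a bootstrap test for two means using pooled resampling:
# the groups are pooled to form a common null population, bootstrap groups
# are drawn from the pool, and the observed t statistic is compared with
# the resampled null distribution. Details may differ from the authors'
# exact method; the data here are synthetic small skewed samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
x = rng.lognormal(0.0, 1.0, 12)
y = rng.lognormal(0.5, 1.0, 10)

t_obs, _ = stats.ttest_ind(x, y, equal_var=False)

pool = np.concatenate([x, y])        # pooled null population
B, count = 10000, 0
for _ in range(B):
    xb = rng.choice(pool, size=len(x), replace=True)
    yb = rng.choice(pool, size=len(y), replace=True)
    tb, _ = stats.ttest_ind(xb, yb, equal_var=False)
    count += abs(tb) >= abs(t_obs)

print(f"pooled bootstrap p-value: {count / B:.4f}")
```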
Development of the mathematical model for design and verification of acoustic modal analysis methods
NASA Astrophysics Data System (ADS)
Siner, Alexander; Startseva, Maria
2016-10-01
To reduce turbofan noise it is necessary to develop methods, collectively called modal analysis, for analyzing the sound field generated by the blade machinery. Because modal analysis methods are very complex, and testing them against full-scale measurements is expensive and tedious, it is necessary to construct mathematical models that allow modal analysis algorithms to be tested quickly and cheaply. This work presents a model that allows single modes to be set in the channel and the generated sound field to be analyzed. A modal analysis of the sound generated by a ring array of point sound sources is performed. A comparison of experimental and numerical modal analysis results is also presented.
Results of the first provisional technical secretariat interlaboratory comparison test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stuff, J.R.; Hoffland, L.
1995-06-01
The principal task of this laboratory in the first Provisional Technical Secretariat (PTS) Interlaboratory Comparison Test was to verify and test the extraction and preparation procedures outlined in the Recommended Operating Procedures for Sampling and Analysis in the Verification of Chemical Disarmament, in addition to our laboratory extraction methods and our laboratory analysis methods. Sample preparation began on 16 May 1994 and analysis was completed on 12 June 1994. The analytical methods used included NMR (¹H and ³¹P), GC/AED, GC/MS (EI and methane CI), GC/IRD, HPLC/IC, HPLC/TSP/MS, MS/MS (electrospray), and CZE.
Blade loss transient dynamic analysis of turbomachinery
NASA Technical Reports Server (NTRS)
Stallone, M. J.; Gallardo, V.; Storace, A. F.; Bach, L. J.; Black, G.; Gaffney, E. F.
1982-01-01
This paper reports on work completed to develop an analytical method for predicting the transient non-linear response of a complete aircraft engine system due to the loss of a fan blade, and to validate the analysis by comparing the results against actual blade loss test data. The solution, which is based on the component element method, accounts for rotor-to-casing rubs, high damping and rapid deceleration rates associated with the blade loss event. A comparison of test results and predicted response shows good agreement except for an initial overshoot spike not observed in test. The method is effective for analysis of large systems.
Improving the quality of parameter estimates obtained from slug tests
Butler, J.J.; McElwee, C.D.; Liu, W.
1996-01-01
The slug test is one of the most commonly used field methods for obtaining in situ estimates of hydraulic conductivity. Despite its prevalence, this method has received criticism from many quarters in the ground-water community. This criticism emphasizes the poor quality of the estimated parameters, a condition that is primarily a product of the somewhat casual approach that is often employed in slug tests. Recently, the Kansas Geological Survey (KGS) has pursued research directed at improving methods for the performance and analysis of slug tests. Based on extensive theoretical and field research, a series of guidelines have been proposed that should enable the quality of parameter estimates to be improved. The most significant of these guidelines are: (1) three or more slug tests should be performed at each well during a given test period; (2) two or more different initial displacements (Ho) should be used at each well during a test period; (3) the method used to initiate a test should enable the slug to be introduced in a near-instantaneous manner and should allow a good estimate of Ho to be obtained; (4) data-acquisition equipment that enables a large quantity of high-quality data to be collected should be employed; (5) if an estimate of the storage parameter is needed, an observation well other than the test well should be employed; (6) the method chosen for analysis of the slug-test data should be appropriate for site conditions; (7) use of pre- and post-analysis plots should be an integral component of the analysis procedure; and (8) appropriate well construction parameters should be employed. Data from slug tests performed at a number of KGS field sites demonstrate the importance of these guidelines.
Tremblay, Marie-Claude; Brousselle, Astrid; Richard, Lucie; Beaudet, Nicole
2013-10-01
Program designers and evaluators should make a point of testing the validity of a program's intervention theory before investing either in implementation or in any type of evaluation. In this context, logic analysis can be a particularly useful option, since it can be used to test the plausibility of a program's intervention theory using scientific knowledge. Professional development in public health is one field among several that would truly benefit from logic analysis, as it appears to be generally lacking in theorization and evaluation. This article presents the application of this analysis method to an innovative public health professional development program, the Health Promotion Laboratory. More specifically, this paper aims to (1) define the logic analysis approach and differentiate it from similar evaluative methods; (2) illustrate the application of this method by a concrete example (logic analysis of a professional development program); and (3) reflect on the requirements of each phase of logic analysis, as well as on the advantages and disadvantages of such an evaluation method. Using logic analysis to evaluate the Health Promotion Laboratory showed that, generally speaking, the program's intervention theory appeared to have been well designed. By testing and critically discussing logic analysis, this article also contributes to further improving and clarifying the method. Copyright © 2013 Elsevier Ltd. All rights reserved.
Willis, Brian H; Hyde, Christopher J
2014-05-01
To determine a plausible estimate for a test's performance in a specific setting using a new method for selecting studies. It is shown how routine data from practice may be used to define an "applicable region" for studies in receiver operating characteristic space. After qualitative appraisal, studies are selected based on the probability that their study accuracy estimates arose from parameters lying in this applicable region. Three methods for calculating these probabilities are developed and used to tailor the selection of studies for meta-analysis. The Pap test applied to the UK National Health Service (NHS) Cervical Screening Programme provides a case example. The meta-analysis for the Pap test included 68 studies, but at most 17 studies were considered applicable to the NHS. For conventional meta-analysis, the sensitivity and specificity (with 95% confidence intervals) were estimated to be 72.8% (65.8, 78.8) and 75.4% (68.1, 81.5) compared with 50.9% (35.8, 66.0) and 98.0% (95.4, 99.1) from tailored meta-analysis using a binomial method for selection. Thus, for a cervical intraepithelial neoplasia (CIN) 1 prevalence of 2.2%, the post-test probability for CIN 1 would increase from 6.2% to 36.6% between the two methods of meta-analysis. Tailored meta-analysis provides a method for augmenting study selection based on the study's applicability to a setting. As such, the summary estimate is more likely to be plausible for a setting and could improve diagnostic prediction in practice. Copyright © 2014 Elsevier Inc. All rights reserved.
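The post-test probabilities quoted above follow directly from Bayes' theorem; the snippet below approximately reproduces them from the reported sensitivities, specificities, and the 2.2% CIN 1 prevalence (small discrepancies reflect rounding of the published estimates).

```python
# Bayes'-theorem check of the post-test probabilities quoted above. Small
# discrepancies versus the abstract reflect rounding of the published
# sensitivities and specificities.
def post_test_probability(sens, spec, prevalence):
    true_pos = sens * prevalence
    false_pos = (1 - spec) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

prev = 0.022  # CIN 1 prevalence of 2.2%
print(f"conventional meta-analysis: {post_test_probability(0.728, 0.754, prev):.1%}")
print(f"tailored meta-analysis:     {post_test_probability(0.509, 0.980, prev):.1%}")
```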
On sample size of the Kruskal-Wallis test with application to a mouse peritoneal cavity study.
Fan, Chunpeng; Zhang, Donghui; Zhang, Cun-Hui
2011-03-01
As the nonparametric generalization of the one-way analysis of variance model, the Kruskal-Wallis test applies when the goal is to test the difference between multiple samples and the underlying population distributions are nonnormal or unknown. Although the Kruskal-Wallis test has been widely used for data analysis, power and sample size methods for this test have been investigated to a much lesser extent. This article proposes new power and sample size calculation methods for the Kruskal-Wallis test based on the pilot study in either a completely nonparametric model or a semiparametric location model. No assumption is made on the shape of the underlying population distributions. Simulation results show that, in terms of sample size calculation for the Kruskal-Wallis test, the proposed methods are more reliable and preferable to some more traditional methods. A mouse peritoneal cavity study is used to demonstrate the application of the methods. © 2010, The International Biometric Society.
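The article derives analytical power and sample-size methods; as a hedged point of reference, the sketch below estimates the same target quantity by brute-force simulation under a pilot-informed location-shift alternative.

```python
# Hedged sketch: simulation-based power of the Kruskal-Wallis test under a
# pilot-informed location-shift alternative. The article derives analytical
# power/sample-size methods; simulation here only shows the target quantity.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

def kw_power(n_per_group, shifts, reps=2000, alpha=0.05):
    hits = 0
    for _ in range(reps):
        groups = [rng.standard_normal(n_per_group) + d for d in shifts]
        _, p = stats.kruskal(*groups)
        hits += p < alpha
    return hits / reps

# e.g., three groups whose medians differ by 0, 0.5, and 1.0 pilot SDs
for n in (10, 20, 40):
    print(f"n = {n:3d} per group -> power ~ {kw_power(n, [0.0, 0.5, 1.0]):.2f}")
```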
Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS)
Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M.; Khan, Ajmal
2016-01-01
Objective: This article compares the study design and statistical methods used in 2005, 2010 and 2015 of Pakistan Journal of Medical Sciences (PJMS). Methods: Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. Results: A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015) in which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods were found in the analysis. The most frequent methods include: descriptive statistics (n=315, 73.4%), chi-square/Fisher’s exact tests (n=205, 47.8%) and student t-test (n=186, 43.4%). There was a significant increase in the use of statistical methods over the time period: t-test, chi-square/Fisher’s exact test, logistic regression, epidemiological statistics, and non-parametric tests. Conclusion: This study shows that a diverse variety of statistical methods have been used in the research articles of PJMS and their frequency improved from 2005 to 2015. However, descriptive statistics was the most frequent method of statistical analysis in the published articles, while the cross-sectional design was the most common study design. PMID:27022365
Salmonella testing of pooled pre-enrichment broth cultures for screening multiple food samples.
Price, W R; Olsen, R A; Hunter, J E
1972-04-01
A method has been described for testing multiple food samples for Salmonella without loss in sensitivity. The method pools multiple pre-enrichment broth cultures into single enrichment broths. The subsequent stages of the Salmonella analysis are not altered. The method was found applicable to several dry food materials including nonfat dry milk, dried egg albumin, cocoa, cottonseed flour, wheat flour, and shredded coconut. As many as 25 pre-enrichment broth cultures were pooled without apparent loss in the sensitivity of Salmonella detection as compared to individual sample analysis. The procedure offers a simple, yet effective, way to increase sample capacity in the Salmonella testing of foods, particularly where a large proportion of samples ordinarily is negative. It also permits small portions of pre-enrichment broth cultures to be retained for subsequent individual analysis if positive tests are found. Salmonella testing of pooled pre-enrichment broths provides increased consumer protection for a given amount of analytical effort as compared to individual sample analysis.
Annual Book of ASTM Standards, Part 23: Water; Atmospheric Analysis.
ERIC Educational Resources Information Center
American Society for Testing and Materials, Philadelphia, PA.
Standards for water and atmospheric analysis are compiled in this segment, Part 23, of the American Society for Testing and Materials (ASTM) annual book of standards. It contains all current formally approved ASTM standard and tentative test methods, definitions, recommended practices, proposed methods, classifications, and specifications. One…
An Analysis of Effects of Variable Factors on Weapon Performance
1993-03-01
Statistical methodology for categorical data analysis traces its roots to the work of Francis Galton in the ... choice of statistical tests. This thesis examines an analysis performed by the Surface Warfare Development Group (SWDG). The SWDG analysis is shown to be incorrect due to the misapplication of testing methods. A corrected analysis is presented and recommendations suggested for changes to the testing…
Hammack, Thomas S; Johnson, Mildred L; Jacobson, Andrew P; Andrews, Wallace H
2006-01-01
Studies were conducted to determine the relative effectiveness of buffered peptone water (BPW), lactose (LAC) broth, and Universal Preenrichment (UP) broth for the recovery of Salmonella organisms from fruit rinses, whole fruit, and comminuted fruit. In the first phase, the relative effectiveness of the rinse and soak methods for the recovery of Salmonella from surface-contaminated mangoes and tomatoes was examined. Fruits were spot inoculated with single Salmonella serovars and held for 4 days at 2-6 degrees C before analysis was initiated. The contaminated fruit was rinsed in portions of BPW, LAC broth, or UP broth. Portions from each rinse were added to its respective broth (e.g., BPW to BPW). Individual whole fruit, in their remaining broth rinses (soak method), and the fruit rinse/broths (rinse method) were incubated for 24 h at 35 degrees C. The Bacteriological Analytical Manual (BAM) Salmonella culture method was followed thereafter. The soak method produced significantly greater numbers (P < 0.05) of positive test portions than did the rinse method for the analysis of mangoes (93 versus 12) and tomatoes (85 versus 34). The 3 broths were comparable for the recovery of Salmonella for both the soak and the rinse methods for mangoes. For tomatoes, there were no significant differences among the broths for the soak method, but BPW and UP broth were significantly more productive (P < 0.05) than LAC broth by the rinse method. In the second phase, the relative effectiveness of LAC broth, BPW, and UP broth for the recovery of Salmonella from comminuted fruit was examined. Fruits were contaminated with single Salmonella serovars and aged for 4 days at 2-6 degrees C. Twenty 25 g test portions were preenriched in each of the following broths: BPW, LAC broth, and UP broth. The BAM Salmonella culture method was followed thereafter. For cantaloupes, significantly more (P < 0.05) Salmonella-positive test portions were recovered with UP broth (96 Salmonella-positive test portions) and BPW (87 Salmonella-positive test portions) than with LAC broth (57 Salmonella-positive test portions). For mangoes, BPW recovered an arithmetically larger number of Salmonella-positive test portions (27 Salmonella-positive test portions) than did either LAC broth (14 Salmonella-positive test portions) or UP broth (18 Salmonella-positive test portions). For tomatoes, there were no significant differences among the broths: BPW recovered 65 Salmonella-positive test portions, UP broth recovered 62 Salmonella-positive test portions, and LAC broth recovered 60 Salmonella-positive test portions. For the analysis of whole fruit, it is recommended that the soak method be used. For whole fruit analyzed with the soak method, UP broth should be used for tomatoes and BPW should be used for mangoes. It is further recommended that UP broth be used for the analysis of comminuted cantaloupes and that BPW be used for the analysis of comminuted mangoes and tomatoes.
Development of Test-Analysis Models (TAM) for correlation of dynamic test and analysis results
NASA Technical Reports Server (NTRS)
Angelucci, Filippo; Javeed, Mehzad; Mcgowan, Paul
1992-01-01
The primary objective of structural analysis of aerospace applications is to obtain a verified finite element model (FEM). The verified FEM can be used for loads analysis, evaluate structural modifications, or design control systems. Verification of the FEM is generally obtained as the result of correlating test and FEM models. A test analysis model (TAM) is very useful in the correlation process. A TAM is essentially a FEM reduced to the size of the test model, which attempts to preserve the dynamic characteristics of the original FEM in the analysis range of interest. Numerous methods for generating TAMs have been developed in the literature. The major emphasis of this paper is a description of the procedures necessary for creation of the TAM and the correlation of the reduced models with the FEM or the test results. Herein, three methods are discussed, namely Guyan, Improved Reduced System (IRS), and Hybrid. Also included are the procedures for performing these analyses using MSC/NASTRAN. Finally, application of the TAM process is demonstrated with an experimental test configuration of a ten bay cantilevered truss structure.
Experimental and simulation flow rate analysis of the 3/2 directional pneumatic valve
NASA Astrophysics Data System (ADS)
Blasiak, Slawomir; Takosoglu, Jakub E.; Laski, Pawel A.; Pietrala, Dawid S.; Zwierzchowski, Jaroslaw; Bracha, Gabriel; Nowakowski, Lukasz; Blasiak, Malgorzata
This work presents a comparative analysis of two test methods. The first, a numerical method, consists in determining the flow characteristics with the use of ANSYS CFX. For this purpose, a 3/2 poppet directional valve was modeled in the 3D CAD software SolidWorks. Based on the solid model that was developed, simulation studies of the air flow through the valve were conducted in the computational fluid dynamics software ANSYS CFX. The second method, experimental, entailed conducting tests on a specially constructed test stand. The comparison of the test results obtained with both methods made it possible to determine the cross-correlation. The high compatibility of the results confirms the usefulness of the numerical procedures, which might thus serve to determine the flow characteristics of directional valves as an alternative to costly and time-consuming test-stand measurements.
Unidirectional Fabric Drape Testing Method
Mei, Zaihuan; Yang, Jingzhi; Zhou, Ting; Zhou, Hua
2015-01-01
In most cases, fabrics such as curtains, skirts, and suit pants drape under their own weight acting parallel to the fabric plane, whereas in the traditional drape testing method gravity is perpendicular to the fabric plane. As a result, the traditional method does not conform to the actual situation and its test data are not convincing enough. To overcome this problem, this paper presents a novel method that simulates the real mechanical conditions and ensures that gravity is parallel to the fabric plane. This method applied a low-cost Kinect sensor to capture the 3-dimensional (3D) drape profile, from which the drape degree parameters and aesthetic parameters were obtained by 3D reconstruction and image processing and analysis techniques. The experiment was conducted on our self-devised drape-testing instrument using different kinds of weave-structure fabrics as testing samples, and the results were compared with those of the traditional method and subjective evaluation. Through regression and correlation analysis we found that this novel testing method was significantly correlated with the traditional and subjective evaluation methods. We achieved a new, non-contact 3D measurement method for drape testing, namely the unidirectional fabric drape testing method. This method is more suitable for evaluating drape behavior because it is more in line with the actual mechanical conditions of draped fabrics and agrees well with the requirements of the visual and aesthetic style of fabrics. PMID:26600387
Characterization of Triaxial Braided Composite Material Properties for Impact Simulation
NASA Technical Reports Server (NTRS)
Roberts, Gary D.; Goldberg, Robert K.; Binienda, Wieslaw K.; Arnold, William A.; Littell, Justin D.; Kohlman, Lee W.
2009-01-01
The reliability of impact simulations for aircraft components made with triaxial braided carbon fiber composites is currently limited by inadequate material property data and the lack of validated material models for analysis. Improvements to standard quasi-static test methods are needed to account for the large unit cell size and localized damage within the unit cell. The deformation and damage of a triaxial braided composite material was examined using standard quasi-static in-plane tension, compression, and shear tests. Some modifications to standard test specimen geometries are suggested, and methods for measuring the local strain at the onset of failure within the braid unit cell are presented. Deformation and damage at higher strain rates is examined using ballistic impact tests on 61- by 61- by 0.32-cm (24- by 24- by 0.125-in.) composite panels. Digital image correlation techniques were used to examine full-field deformation and damage during both quasi-static and impact tests. An impact analysis method is presented that utilizes both local and global deformation and failure information from the quasi-static tests as input for impact simulations. Improvements that are needed in test and analysis methods for better predictive capability are examined.
A novel tensile test method to assess texture and gaping in salmon fillets.
Ashton, Thomas J; Michie, Ian; Johnston, Ian A
2010-05-01
A new tensile strength method was developed to quantify the force required to tear a standardized block of Atlantic salmon muscle, with the aim of identifying those samples more prone to factory downgrading as a result of softness and fillet gaping. The new method effectively overcomes problems of sample attachment encountered with previous tensile strength tests. The repeatability, sensitivity, and predictability of the new technique were evaluated against other common instrumental texture measurement methods. The relationship between sensory assessments of firmness and parameters from the instrumental texture methods was also determined. Data from the new method were shown to have the strongest correlations with gaping severity (r = -0.514, P < 0.001) and the highest level of repeatability when analyzing cold-smoked samples. The Warner Bratzler shear method gave the most repeatable data from fresh samples and had the highest correlations between fresh and smoked product from the same fish (r = 0.811, P < 0.001). A hierarchical cluster analysis placed the tensile test in the top cluster, alongside the Warner Bratzler method, demonstrating that it also yields adequate data with respect to these tests. None of the tested sensory analysis attributes showed significant relationships to the mechanical tests except fillet firmness, with correlations (r) of 0.42 for cylinder probe maximum force (P = 0.005) and 0.31 for tensile work (P = 0.04). It was concluded that the tensile test method developed provides an important addition to the available tools for mechanical analysis of salmon quality, particularly with respect to the prediction of gaping during factory processing, which is a serious commercial problem. A novel, reliable method of measuring flesh tensile strength in salmon provides data of relevance to gaping.
Yang, James J; Li, Jia; Williams, L Keoki; Buu, Anne
2016-01-05
In genome-wide association studies (GWAS) for complex diseases, the association between a SNP and each phenotype is usually weak. Combining multiple related phenotypic traits can increase the power of gene search and thus is a practically important area that requires methodology work. This study provides a comprehensive review of existing methods for conducting GWAS on complex diseases with multiple phenotypes, including the multivariate analysis of variance (MANOVA), principal component analysis (PCA), generalized estimating equations (GEE), the trait-based association test involving the extended Simes procedure (TATES), and the classical Fisher combination test. We propose a new method that relaxes the unrealistic independence assumption of the classical Fisher combination test and is computationally efficient. To demonstrate applications of the proposed method, we also present the results of statistical analysis on the Study of Addiction: Genetics and Environment (SAGE) data. Our simulation study shows that the proposed method has higher power than existing methods while controlling the type I error rate. The GEE and the classical Fisher combination test, on the other hand, do not control the type I error rate and thus are not recommended. In general, the power of the competing methods decreases as the correlation between phenotypes increases. All the methods tend to have lower power when the multivariate phenotypes come from long-tailed distributions. The real data analysis also demonstrates that the proposed method allows us to compare the marginal results with the multivariate results and to specify which SNPs are specific to a particular phenotype or contribute to the common construct. The proposed method outperforms existing methods in most settings and also has great applications in GWAS on complex diseases with multiple phenotypes such as the substance abuse disorders.
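The proposed relaxation is the authors' own; as a hedged illustration of the general idea of correcting Fisher's test for correlated p-values, the sketch below implements the classical Brown/Kost-McDermott scaled chi-square approximation alongside the naive Fisher combination.

```python
# Hedged sketch: Brown's scaled chi-square approximation, one classical way
# to relax the independence assumption of Fisher's combination test when
# the combined p-values are correlated. The article proposes its own
# relaxation; this only illustrates the kind of correction involved.
# Pairwise covariances use the Kost & McDermott polynomial approximation.
import numpy as np
from scipy import stats

def browns_method(pvals, R):
    """Combine correlated p-values; R is the correlation matrix of the
    underlying test statistics (e.g., between phenotypes)."""
    pvals = np.asarray(pvals)
    k = len(pvals)
    T = -2.0 * np.log(pvals).sum()
    mean = 2.0 * k
    var = 4.0 * k
    for i in range(k):
        for j in range(i + 1, k):
            r = R[i, j]
            var += 2.0 * (3.263 * r + 0.710 * r ** 2 + 0.027 * r ** 3)
    c = var / (2.0 * mean)            # scale factor
    f = 2.0 * mean ** 2 / var         # effective degrees of freedom
    return stats.chi2.sf(T / c, f)

R = np.array([[1.0, 0.6, 0.4],
              [0.6, 1.0, 0.5],
              [0.4, 0.5, 1.0]])
p = [0.01, 0.04, 0.20]
print(f"Fisher (independence): {stats.combine_pvalues(p, method='fisher')[1]:.4f}")
print(f"Brown  (correlated):   {browns_method(p, R):.4f}")
```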
A quality quantitative method of silicon direct bonding based on wavelet image analysis
NASA Astrophysics Data System (ADS)
Tan, Xiao; Tao, Zhi; Li, Haiwang; Xu, Tiantong; Yu, Mingxing
2018-04-01
The rapid development of MEMS (micro-electro-mechanical systems) has received significant attention from researchers in various fields and subjects. In particular, the MEMS fabrication process is elaborate and, as such, has been the focus of extensive research inquiries. However, in MEMS fabrication, component bonding is difficult to achieve and requires a complex approach. Thus, improvements in bonding quality are relatively important objectives. A higher quality bond can only be achieved with improved measurement and testing capabilities. In particular, the traditional testing methods mainly include infrared testing, tensile testing, and strength testing, despite the fact that using these methods to measure bond quality often results in low efficiency or destructive analysis. Therefore, this paper focuses on the development of a precise, nondestructive visual testing method based on wavelet image analysis that is shown to be highly effective in practice. The process of wavelet image analysis includes wavelet image denoising, wavelet image enhancement, and contrast enhancement, and as an end result, can display an image with low background noise. In addition, because the wavelet analysis software was developed with MATLAB, it can reveal the bonding boundaries and bonding rates to precisely indicate the bond quality at all locations on the wafer. This work also presents a set of orthogonal experiments that consist of three prebonding factors, the prebonding temperature, the positive pressure value and the prebonding time, which are used to analyze the prebonding quality. This method was used to quantify the quality of silicon-to-silicon wafer bonding, yielding standard treatment quantities that could be practical for large-scale use.
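A hedged sketch of the denoise-and-threshold step is given below using PyWavelets; the wavelet family, decomposition level, and universal soft threshold are illustrative assumptions rather than the authors' exact pipeline.

```python
# Hedged sketch of wavelet image denoising with PyWavelets. The wavelet
# family, level, and soft-threshold rule are illustrative assumptions,
# not the authors' exact MATLAB pipeline.
import numpy as np
import pywt

rng = np.random.default_rng(9)
# Synthetic "bond interface" image: bright bonded region plus sensor noise
img = np.zeros((128, 128))
img[32:96, 32:96] = 1.0
noisy = img + 0.3 * rng.standard_normal(img.shape)

coeffs = pywt.wavedec2(noisy, "db4", level=3)
# Universal threshold estimated from the finest diagonal detail band
sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
thresh = sigma * np.sqrt(2 * np.log(noisy.size))

denoised_coeffs = [coeffs[0]] + [
    tuple(pywt.threshold(band, thresh, mode="soft") for band in level)
    for level in coeffs[1:]
]
denoised = pywt.waverec2(denoised_coeffs, "db4")
print(f"residual std before: {np.std(noisy - img):.3f}, after: {np.std(denoised - img):.3f}")
```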
Association analysis of multiple traits by an approach of combining P values.
Chen, Lili; Wang, Yong; Zhou, Yajing
2018-03-01
Increasing evidence shows that one variant can affect multiple traits, a widespread phenomenon in complex diseases. Joint analysis of multiple traits can increase the statistical power of association analysis and uncover the underlying genetic mechanism. Although there are many statistical methods to analyse multiple traits, most are suitable for detecting common variants associated with multiple traits. However, because of the low minor allele frequencies of rare variants, these methods are not optimal for rare variant association analysis. In this paper, we extend an adaptive combination of P values method (termed ADA), originally developed for a single trait, to test association between multiple traits and the rare variants in a given region. For a given region, we use a reverse regression model to test each rare variant for association with multiple traits and obtain the P value of the single-variant test. Further, we take the weighted combination of these P values as the test statistic. Extensive simulation studies show that our approach is more powerful than several other comparison methods in most cases and is robust to the inclusion of a high proportion of neutral variants and to different directions of effects of causal variants.
Williams, Mary R; Sigman, Michael E; Lewis, Jennifer; Pitan, Kelly McHugh
2012-10-10
A Bayesian soft classification method combined with target factor analysis (TFA) is described and tested for the analysis of fire debris data. The method relies on analysis of the average mass spectrum across the chromatographic profile (i.e., the total ion spectrum, TIS) from multiple samples taken from a single fire scene. A library of TIS from reference ignitable liquids with assigned ASTM classification is used as the target factors in TFA. The class-conditional distributions of correlations between the target and predicted factors for each ASTM class are represented by kernel functions and analyzed by Bayesian decision theory. The soft classification approach assists in assessing the probability that ignitable liquid residue from a specific ASTM E1618 class is present in a set of samples from a single fire scene, even in the presence of unspecified background contributions from pyrolysis products. The method is demonstrated with sample data sets and then tested on laboratory-scale burn data and large-scale field test burns. The overall performance achieved in laboratory and field tests of the method is approximately 80% correct classification of fire debris samples. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Bull, William B. (Compiler); Pinoli, Pat C. (Compiler); Upton, Cindy G. (Compiler); Day, Tony (Compiler); Hill, Keith (Compiler); Stone, Frank (Compiler); Hall, William B.
1994-01-01
This report is a compendium of the presentations of the 12th biannual meeting of the Industry Advisory Committee under the Solid Propulsion Integrity Program. A complete transcript of the welcoming talks is provided. Presentation outlines and overheads are included for the other sessions: SPIP Overview, Past, Current and Future Activity; Test Methods Manual and Video Tape Library; Air Force Developed Computer Aided Cure Program and SPC/TQM Experience; Magneto-Optical mapper (MOM), Joint Army/NASA program to assess composite integrity; Permeability Testing; Moisture Effusion Testing by Karl Fischer Analysis; Statistical Analysis of Acceptance Test Data; NMR Phenolic Resin Advancement; Constituent Testing Highlights on the LDC Optimization Program; Carbon Sulfur Study, Performance Related Testing; Current Rayon Specifications and Future Availability; RSRM/SPC Implementation; SRM Test Methods, Delta/Titan/FBM/RSRM; and Open Forum on Performance Based Acceptance Testing -- Industry Experience.
40 CFR 63.805 - Performance test methods.
Code of Federal Regulations, 2011 CFR
2011-07-01
... alternative method for determining the VHAP content of the coating. In the event of any inconsistency between... Collection of Coating and Ink Samples for VOC Content Analysis by Reference Method 24 and Reference Method... (see § 63.801); (iii) Use any alternative protocol and test method provided they meet either the...
Review of the Air-Coupled Impact-Echo Method for Non-Destructive Testing
NASA Astrophysics Data System (ADS)
Nowotarski, Piotr; Dubas, Sebastian; Milwicz, Roman
2017-10-01
The article presents the general idea of the Air-Coupled Impact-Echo (ACIE) method, one of the non-destructive testing (NDT) techniques used in the construction industry. One of the main advantages of the general Impact-Echo (IE) method is that access to only one side of the structure is sufficient, which greatly facilitates testing of road facilities or places that are difficult to access and diagnose. The main purpose of the article is to present the state of the art related to the ACIE method based on the publications available in the Thomson Reuters Web of Science Core Collection database (WOS), with further analysis of the mentioned methods. A deeper analysis was also performed for the publications of the last 3 years related to ACIE, to identify the main focus of researchers and to define possible regions where additional examination and work are necessary. One of the main conclusions from the analysis is that ACIE methods can be widely used for the NDT of concrete structures and can be performed faster than the standard IE method thanks to air-coupled sensors. What is more, 92.3% of the recent research described in publications connected with ACIE was performed in laboratories, and only 23.1% in situ on real structures. This indicates that the method requires further research to prepare a test stand ready to perform analysis on real structures outside laboratory conditions. Moreover, the algorithms used for data processing and presentation in the ACIE method are still being developed, and there is no universal solution available for all kinds of existing and detectable defects, which indicates a possible research area for further work. The authors are of the opinion that the emerging ACIE method could be a good opportunity for NDT, especially of concrete structures. The development and refinement of test stands that allow in-situ tests would shorten the overall time of the research, and with the implementation of higher-accuracy algorithms for data analysis, better precision of defect localization can be achieved.
16 CFR 1500.41 - Method of testing primary irritant substances.
Code of Federal Regulations, 2014 CFR
2014-01-01
... corrosivity properties of substances, including testing that does not require animals, are presented in the CPSC's animal testing policy set forth in 16 CFR 1500.232. A weight-of-evidence analysis or a validated... conducted, a sequential testing strategy is recommended to reduce the number of test animals. The method of...
Shear Lag in Box Beams Methods of Analysis and Experimental Investigations
NASA Technical Reports Server (NTRS)
Kuhn, Paul; Chiarito, Patrick T
1942-01-01
The bending stresses in the covers of box beams or wide-flange beams differ appreciably from the stresses predicted by the ordinary bending theory on account of shear deformation of the flanges. The problem of predicting these differences has become known as the shear-lag problem. The first part of this paper deals with methods of shear-lag analysis suitable for practical use. The second part of the paper describes strain-gage tests made by the NACA to verify the theory. Three tests published by other investigators are also analyzed by the proposed method. The third part of the paper gives numerical examples illustrating the methods of analysis. An appendix gives comparisons with other methods, particularly with the method of Ebner and Koller.
ERIC Educational Resources Information Center
Patalino, Marianne
Problems in current course evaluation methods are discussed and an alternative method is described for the construction, analysis, and interpretation of a test to evaluate instructional programs. The method presented represents a different approach to the traditional overreliance on standardized achievement tests and the total scores they provide.…
ERIC Educational Resources Information Center
Lonchamp, F.
This is a presentation of the results of a factor analysis of a battery of tests intended to measure listening and reading comprehension in English as a second language. The analysis sought to answer the following questions: (1) whether the factor analysis method yields results when applied to tests which are not specifically designed for this…
Methodology for the Assessment of 3D Conduction Effects in an Aerothermal Wind Tunnel Test
NASA Technical Reports Server (NTRS)
Oliver, Anthony Brandon
2010-01-01
This slide presentation reviews a method for the assessment of three-dimensional conduction effects during testing in an aerothermal wind tunnel. The test objectives were to duplicate and extend tests that were performed during the 1960s on thermal conduction in protuberances on a flat plate. Slides review the 1D versus 3D conduction data reduction error, the analysis process, the CFD-based analysis, a loose coupling method that simulates a wind tunnel test run, verification of the CFD solution, grid convergence, Mach number trends, size trends, and a summary of the CFD conduction analysis. Other slides show comparisons to pretest CFD at Mach 1.5 and 2.16 and the geometries of the models and grids.
Salim, Agus; Mackinnon, Andrew; Christensen, Helen; Griffiths, Kathleen
2008-09-30
The pre-test-post-test design (PPD) is predominant in trials of psychotherapeutic treatments. Missing data due to withdrawals present an even bigger challenge in assessing treatment effectiveness under the PPD than under designs with more observations since dropout implies an absence of information about response to treatment. When confronted with missing data, often it is reasonable to assume that the mechanism underlying missingness is related to observed but not to unobserved outcomes (missing at random, MAR). Previous simulation and theoretical studies have shown that, under MAR, modern techniques such as maximum-likelihood (ML) based methods and multiple imputation (MI) can be used to produce unbiased estimates of treatment effects. In practice, however, ad hoc methods such as last observation carried forward (LOCF) imputation and complete-case (CC) analysis continue to be used. In order to better understand the behaviour of these methods in the PPD, we compare the performance of traditional approaches (LOCF, CC) and theoretically sound techniques (MI, ML), under various MAR mechanisms. We show that the LOCF method is seriously biased and conclude that its use should be abandoned. Complete-case analysis produces unbiased estimates only when the dropout mechanism does not depend on pre-test values even when dropout is related to fixed covariates including treatment group (covariate-dependent: CD). However, CC analysis is generally biased under MAR. The magnitude of the bias is largest when the correlation of post- and pre-test is relatively low.
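As a rough illustration of the comparison reported above, the following sketch (not the authors' code; all parameter values are invented) simulates a pre-test-post-test trial in which dropout depends on the observed pre-test score and treatment arm, one possible MAR mechanism, and contrasts the complete-case and LOCF estimates of the treatment effect with the truth.

```python
# Hedged sketch: bias of complete-case (CC) and LOCF analyses in a
# pre-test--post-test design when dropout depends on the observed pre-test
# score and treatment arm (MAR). All parameter values are assumptions.
import numpy as np

rng = np.random.default_rng(1)
n, effect, n_sim = 200, 0.5, 2000
cc_est, locf_est = [], []
for _ in range(n_sim):
    group = rng.integers(0, 2, n)                # 0 = control, 1 = treatment
    pre = rng.normal(0, 1, n)
    post = 0.6 * pre + effect * group + rng.normal(0, 0.8, n)
    # MAR dropout: in the treatment arm, high pre-test scorers drop out more
    p_drop = 1 / (1 + np.exp(2.0 - 1.5 * pre * group))
    observed = rng.random(n) > p_drop
    cc = (post[observed & (group == 1)].mean()
          - post[observed & (group == 0)].mean())
    post_locf = np.where(observed, post, pre)    # carry the pre-test forward
    locf = post_locf[group == 1].mean() - post_locf[group == 0].mean()
    cc_est.append(cc); locf_est.append(locf)

print(f"true effect {effect:.2f} | complete-case {np.mean(cc_est):.3f} "
      f"| LOCF {np.mean(locf_est):.3f}")
```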
method for testing home energy audit software and associated calibration methods. BESTEST-EX is one of Energy Analysis Model Calibration Methods. When completed, the ANSI/RESNET SMOT will specify test procedures for evaluating calibration methods used in conjunction with predicting building energy use and
An analytic data analysis method for oscillatory slug tests.
Chen, Chia-Shyun
2006-01-01
An analytical data analysis method is developed for slug tests in partially penetrating wells in confined or unconfined aquifers of high hydraulic conductivity. As adapted from the van der Kamp method, the determination of the hydraulic conductivity is based on the occurrence times and the displacements of the extreme points measured from the oscillatory data and their theoretical counterparts available in the literature. This method is applied to two sets of slug test response data presented by Butler et al.: one set shows slow damping with seven discernable extremities, and the other shows rapid damping with three extreme points. The estimates of the hydraulic conductivity obtained by the analytic method are in good agreement with those determined by an available curve-matching technique.
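To make the extremum-based idea concrete, here is a minimal sketch of the two quantities such an analysis extracts from underdamped slug-test data: the damped angular frequency from the spacing of successive extreme points, and the decay rate from their amplitude ratios. The numbers are invented, and the final step from these quantities to hydraulic conductivity (the van der Kamp-type formula, which depends on well geometry) is not reproduced here.

```python
# Hedged sketch: frequency and damping from the extreme points of an
# oscillatory slug-test record. Data values are made up for illustration.
import numpy as np

t_ext = np.array([2.1, 5.3, 8.5, 11.7])       # times of extrema (s), assumed
x_ext = np.array([0.42, -0.25, 0.15, -0.09])  # displacements (m), assumed

omega_d = np.pi / np.mean(np.diff(t_ext))     # extrema are half a period apart
ratios = np.abs(x_ext[:-1] / x_ext[1:])
gamma = omega_d / np.pi * np.mean(np.log(ratios))   # decay rate (1/s)
omega_n = np.hypot(omega_d, gamma)            # undamped angular frequency

print(f"omega_d={omega_d:.3f} rad/s, gamma={gamma:.3f} 1/s, "
      f"omega_n={omega_n:.3f} rad/s")
```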
NASA Astrophysics Data System (ADS)
Rakkapao, Suttida; Prasitpong, Singha; Arayathanitkul, Kwan
2016-12-01
This study investigated the multiple-choice test of understanding of vectors (TUV) by applying item response theory (IRT). The difficulty, discrimination, and guessing parameters of the TUV items were fit with the three-parameter logistic model of IRT, using the PARSCALE program. The TUV ability is an ability parameter, here estimated assuming unidimensionality and local independence. Moreover, all distractors of the TUV were analyzed from item response curves (IRC), which represent a simplified IRT. Data were gathered on 2392 science and engineering freshmen from three universities in Thailand. The results revealed IRT analysis to be useful in assessing the test, since its item parameters are independent of the ability parameters. The IRT framework reveals item-level information and indicates appropriate ability ranges for the test. Moreover, the IRC analysis can be used to assess the effectiveness of the test's distractors. Both IRT and IRC approaches reveal test characteristics beyond those revealed by the classical analysis methods of tests. Test developers can apply these methods to diagnose and evaluate the features of items at various ability levels of test takers.
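For readers unfamiliar with the model, the sketch below evaluates the three-parameter logistic item response function that the study fits to each TUV item; the item parameter values are invented, and D = 1.7 is the conventional scaling constant.

```python
# Minimal sketch of the three-parameter logistic (3PL) item response function.
# a = discrimination, b = difficulty, c = guessing; theta = latent ability.
import numpy as np

def p_correct(theta, a, b, c, D=1.7):
    """Probability of a correct response under the 3PL model."""
    return c + (1.0 - c) / (1.0 + np.exp(-D * a * (theta - b)))

theta = np.linspace(-3, 3, 7)
print(p_correct(theta, a=1.2, b=0.3, c=0.2))   # one hypothetical TUV item
```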
PyHLA: tests for the association between HLA alleles and diseases.
Fan, Yanhui; Song, You-Qiang
2017-02-06
Recently, several tools have been designed for human leukocyte antigen (HLA) typing using single nucleotide polymorphism (SNP) array and next-generation sequencing (NGS) data. These tools provide high-throughput and cost-effective approaches for identifying HLA types. Therefore, tools for downstream association analysis are highly desirable. Although several tools have been designed for multi-allelic marker association analysis, they were designed only for microsatellite markers and do not scale well with increasing data volumes, or they were designed for large-scale data but provided a limited number of tests. We have developed a Python package called PyHLA, which implements several methods for HLA association analysis, to fill the gap. PyHLA is a tailor-made, easy to use, and flexible tool designed specifically for the association analysis of the HLA types imputed from genome-wide genotyping and NGS data. PyHLA provides functions for association analysis, zygosity tests, and interaction tests between HLA alleles and diseases. Monte Carlo permutation and several methods for multiple testing corrections have also been implemented. PyHLA provides a convenient and powerful tool for HLA analysis. Existing methods have been integrated and desired methods have been added in PyHLA. Furthermore, PyHLA is applicable to small and large sample sizes and can finish the analysis in a timely manner on a personal computer with different platforms. PyHLA is implemented in Python. PyHLA is a free, open source software distributed under the GPLv2 license. The source code, tutorial, and examples are available at https://github.com/felixfan/PyHLA.
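As a generic illustration of the kind of allele-disease association test such a package performs (this is not PyHLA's actual API), a 2 x 2 allele-count table can be tested with Fisher's exact test; the counts below are hypothetical.

```python
# Illustration only: a generic allele-vs-disease association test on invented
# counts of HLA allele copies in cases and controls.
from scipy.stats import fisher_exact

table = [[40, 160],   # cases:    carrier copies, non-carrier copies
         [15, 385]]   # controls: carrier copies, non-carrier copies
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR={odds_ratio:.2f}, p={p_value:.2e}")
```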
Significance testing of rules in rule-based models of human problem solving
NASA Technical Reports Server (NTRS)
Lewis, C. M.; Hammer, J. M.
1986-01-01
Rule-based models of human problem solving have typically not been tested for statistical significance. Three methods of testing rules - analysis of variance, randomization, and contingency tables - are presented. Advantages and disadvantages of the methods are also described.
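A minimal sketch of the randomization approach, under assumed data, follows: the agreement between a rule's predicted actions and the observed actions is compared against the distribution obtained by repeatedly shuffling one sequence. The choice of agreement rate as the test statistic is illustrative only.

```python
# Hedged sketch of a randomization (permutation) test for one rule. Data are
# invented: action codes predicted by the rule versus actions actually taken.
import numpy as np

rng = np.random.default_rng(0)
predicted = rng.integers(0, 3, 50)                  # rule-predicted actions
actual = np.where(rng.random(50) < 0.6, predicted,  # ~60% agreement built in
                  rng.integers(0, 3, 50))

obs = np.mean(predicted == actual)
null = [np.mean(rng.permutation(predicted) == actual) for _ in range(5000)]
p = (1 + np.sum(np.array(null) >= obs)) / (1 + len(null))
print(f"agreement={obs:.2f}, randomization p={p:.4f}")
```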
Propfan experimental data analysis
NASA Technical Reports Server (NTRS)
Vernon, David F.; Page, Gregory S.; Welge, H. Robert
1984-01-01
A data reduction method, consistent with the performance prediction methods used for analysis of new aircraft designs, is defined and compared to the method currently used by NASA, using data obtained from an Ames Research Center 11-foot transonic wind tunnel test. Pressure and flow visualization data from the Ames test for both the powered straight underwing nacelle and an unpowered contoured overwing nacelle installation are used to determine the flow phenomena present for a wing-mounted turboprop installation. The test data are compared to analytic methods, showing the analytic methods to be suitable for design and analysis of new configurations. The data analysis indicated that designs with zero interference drag levels are achievable with proper wing and nacelle tailoring. A new overwing contoured nacelle design and a modification to the wing leading edge extension for the current wind tunnel model design are evaluated. Hardware constraints of the current model parts prevent obtaining any significant performance improvement from modified nacelle contouring. A new aspect ratio wing design for an up-outboard rotation turboprop installation is defined, and an advanced contoured nacelle is provided.
[A study of biochemical method for urine test based on color difference estimation].
Wang, Chunhong; Zhou, Yue; Zhao, Hongxia; Zhou, Fengkun
2008-02-01
The biochemical analysis of urine is an important inspection and diagnosis method in hospitals. The conventional methods of urine analysis are colorimetric visual assessment and automated detection; the visual technique has largely been superseded, and automated detection is the method adopted in hospitals. However, a urine biochemical analyzer on the market costs around twenty thousand RMB yuan, which puts it beyond the reach of ordinary families. A computer vision system is not subject to the physiological and psychological variability of a human observer; its assessment standard is objective and stable. Therefore, based on color theory, we have established a computer vision system that performs the collection, management, display, and assessment of the color difference between the standard threshold color and the color of the urine test paper after its reaction with the urine sample, so that the severity of an illness can be judged accurately. In this paper, we introduce this urine test biochemical analysis method, which is new and suitable for home use. Experimental results show that the test method is easy to use and cost-effective. It enables whole-course monitoring and can find extensive applications.
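The color-difference step might look like the following sketch, which classifies a reacted test-strip pad by its nearest reference color. The calibration colors are invented, and plain RGB distance stands in for the perceptually more uniform CIE Lab Delta-E metric that a production system would likely use.

```python
# Hedged sketch of color-difference estimation for a urine test strip: the
# measured pad colour is matched to the closest reference colour. All colour
# values are hypothetical calibration data.
import numpy as np

reference = {                       # hypothetical calibration colours (R, G, B)
    "negative": (250, 240, 150),
    "trace":    (230, 220, 120),
    "1+":       (180, 200, 100),
    "2+":       (120, 170,  90),
}
strip_rgb = np.array([185, 198, 104])   # measured pad colour (made up)

level = min(reference,
            key=lambda k: np.linalg.norm(strip_rgb - np.array(reference[k])))
print("estimated level:", level)
```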
METHOD OF CHEMICAL ANALYSIS FOR OIL SHALE WASTES
Several methods of chemical analysis are described for oil shale wastewaters and retort gases. These methods are designed to support the field testing of various pollution control systems. As such, emphasis has been placed on methods which are rapid and sufficiently rugged to per...
Advanced superposition methods for high speed turbopump vibration analysis
NASA Technical Reports Server (NTRS)
Nielson, C. E.; Campany, A. D.
1981-01-01
The small, high pressure Mark 48 liquid hydrogen turbopump was analyzed and dynamically tested to determine the cause of high speed vibration at an operating speed of 92,400 rpm. This approaches the design point operating speed of 95,000 rpm. The initial dynamic analysis in the design stage and subsequent further analysis of the rotor only dynamics failed to predict the vibration characteristics found during testing. An advanced procedure for dynamics analysis was used in this investigation. The procedure involves developing accurate dynamic models of the rotor assembly and casing assembly by finite element analysis. The dynamically instrumented assemblies are independently rap tested to verify the analytical models. The verified models are then combined by modal superposition techniques to develop a completed turbopump model where dynamic characteristics are determined. The results of the dynamic testing and analysis obtained are presented and methods of moving the high speed vibration characteristics to speeds above the operating range are recommended. Recommendations for use of these advanced dynamic analysis procedures during initial design phases are given.
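The modal superposition idea can be sketched as follows: each verified mode becomes a decoupled single-degree-of-freedom oscillator, the modal equations are marched in time, and the physical response is recovered as a mode-shape-weighted sum. The two modes, damping values, and load below are invented, not the Mark 48 model data.

```python
# Hedged sketch of transient response recovery by modal superposition, with a
# simple central-difference time march. All modal data are invented.
import numpy as np

omegas = np.array([2 * np.pi * 120.0, 2 * np.pi * 480.0])  # modal freqs (rad/s)
zetas  = np.array([0.02, 0.01])                            # modal damping
phi    = np.array([[0.8, 0.3],                             # mode shapes at two
                   [0.4, -0.6]])                           # output DOFs

dt, n = 2e-5, 5000
t = np.arange(n) * dt
f_modal = np.array([1.0, 0.5])[:, None] * (t < 1e-3)       # short impulse load

q = np.zeros((2, n))                                       # modal coordinates
for i in range(1, n - 1):
    acc = f_modal[:, i] - 2 * zetas * omegas * \
          (q[:, i] - q[:, i - 1]) / dt - omegas**2 * q[:, i]
    q[:, i + 1] = 2 * q[:, i] - q[:, i - 1] + dt**2 * acc

x = phi @ q                                                # physical response
print("peak responses:", np.abs(x).max(axis=1))
```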
NASA Technical Reports Server (NTRS)
Phillips, E. P.
1993-01-01
A second experimental Round Robin on the measurement of the crack opening load in fatigue crack growth tests has been completed by the ASTM Task Group E24.04.04 on Crack Closure Measurement and Analysis. Fourteen laboratories participated in the testing of aluminum alloy compact tension specimens. Opening-load measurements were made at three crack lengths during constant Delta K, constant stress ratio tests by most of the participants. Four participants made opening-load measurements during threshold tests. All opening-load measurements were based on the analysis of specimens compliance behavior, where the displacement/strain was measured either at the crack mouth or the mid-height back face location. The Round Robin data were analyzed for opening load using two non-subjective analysis methods: the compliance offset and the correlation coefficient methods. The scatter in the opening load results was significantly reduced when some of the results were excluded from the analysis population based on an accept/reject criterion for raw data quality. The compliance offset and correlation coefficient opening load analysis methods produced similar results for data populations that had been screened to eliminate poor quality data.
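A rough sketch of the compliance-offset idea on synthetic load-displacement data follows: the open-crack compliance is fitted near maximum load, segment compliances are computed while sweeping down the unloading curve, and the opening load is taken where the offset first exceeds a chosen criterion (2% here). The data, segment size, and criterion are all assumptions, not the Round Robin procedure verbatim.

```python
# Hedged sketch of a compliance-offset opening-load estimate on synthetic data.
import numpy as np

P = np.linspace(1.0, 10.0, 91)                 # load (kN)
v = np.where(P > 4.0, P * 1.0,                 # compliance 1.0 when fully open,
             4.0 + (P - 4.0) * 0.8)            # stiffer (partly closed) below

top = P > 0.75 * P.max()                       # fit open-crack compliance
c_open = np.polyfit(P[top], v[top], 1)[0]

seg = 10                                       # points per overlapping segment
for i in range(len(P) - seg, -1, -seg // 2):   # sweep downward from max load
    c_seg = np.polyfit(P[i:i + seg], v[i:i + seg], 1)[0]
    offset = 100 * (c_open - c_seg) / c_open
    if offset > 2.0:                           # 2% offset criterion (assumed)
        print(f"opening load ~ {P[i + seg // 2]:.2f} kN "
              f"(offset {offset:.1f}%)")
        break
```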
NASA Astrophysics Data System (ADS)
Qin, Le; Xie, HuiMin; Zhu, RongHua; Wu, Dan; Che, ZhiGang; Zou, ShiKun
2014-04-01
This paper investigates the effect of the location of the testing area in residual stress measurement by Moiré interferometry combined with the hole-drilling method. The selection of the location of the testing area is analyzed from theory and experiment. In the theoretical study, the factors which affect the surface released radial strain ε_r were analyzed on the basis of the formulae of the hole-drilling method, and the relations between those factors and ε_r were established. By combining Moiré interferometry with the hole-drilling method, the residual stress of an interference-fit specimen was measured to verify the theoretical analysis. According to the analysis results, the testing area that minimizes the error of strain measurement is determined. Moreover, if the orientation of the maximum principal stress is known, the strain can be measured with higher precision by the Moiré interferometry method.
Simulation Analysis of Helicopter Ground Resonance Nonlinear Dynamics
NASA Astrophysics Data System (ADS)
Zhu, Yan; Lu, Yu-hui; Ling, Ai-min
2017-07-01
In order to accurately predict the dynamic instability of helicopter ground resonance, a modeling and simulation method for helicopter ground resonance that considers the nonlinear dynamic characteristics of components (rotor lead-lag damper, landing gear wheel and absorber) is presented. A numerical integration method is used to calculate the transient responses of the body and rotor following a simulated disturbance. To obtain quantitative instabilities, the Fast Fourier Transform (FFT) is used to estimate the modal frequencies, and a moving rectangular window method is employed to estimate the modal damping from the response time history. Simulation results show that the ground resonance simulation test accurately captures the blade lead-lag regressive mode frequency, and the modal damping obtained from the attenuation curves is close to the test results. The simulation test results are consistent with the actual accident situation, which supports the correctness of the simulation method. This analysis method for ground resonance simulation testing gives results that agree with real helicopter engineering tests.
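The frequency and damping extraction described above can be sketched on a synthetic transient (a 5 Hz mode with 3% damping is assumed): an FFT locates the modal frequency, and the amplitude decay between two positions of a rectangular window gives the damping.

```python
# Hedged sketch: FFT for the modal frequency, moving-window amplitude decay
# for the modal damping. The transient is synthetic, not helicopter data.
import numpy as np

fs, T = 200.0, 20.0
t = np.arange(0, T, 1 / fs)
zeta, f0 = 0.03, 5.0
x = np.exp(-zeta * 2 * np.pi * f0 * t) * np.sin(2 * np.pi * f0 * t)

spec = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
print(f"modal frequency ~ {freqs[spec.argmax()]:.2f} Hz")

w = int(2 * fs)                                  # 2 s rectangular window
a1 = np.abs(x[:w]).max()                         # window at the start
a2 = np.abs(x[5 * w:6 * w]).max()                # window 10 s later
sigma = np.log(a1 / a2) / (5 * w / fs)           # decay rate (1/s)
print(f"damping ratio ~ {sigma / (2 * np.pi * f0):.3f}")
```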
Waste Analysis Plan and Waste Characterization Survey, Barksdale AFB, Louisiana
1991-03-01
The plan specifies a review to assess whether analysis is needed, any analyses that are to be provided by generators, and the methods to be used to meet specific waste analysis requirements, including sampling method, sampling frequency, parameters of analysis, SW-846 test methods, and Department of Transportation (DOT) shipping name and hazard class. Appendix B presents the waste analysis plan rationale, beginning with the sampling method rationale for composite liquid samples.
Trivedi, Prinal; Edwards, Jode W; Wang, Jelai; Gadbury, Gary L; Srinivasasainagendra, Vinodh; Zakharkin, Stanislav O; Kim, Kyoungmi; Mehta, Tapan; Brand, Jacob P L; Patki, Amit; Page, Grier P; Allison, David B
2005-04-06
Many efforts in microarray data analysis are focused on providing tools and methods for the qualitative analysis of microarray data. HDBStat! (High-Dimensional Biology-Statistics) is a software package designed for analysis of high dimensional biology data such as microarray data. It was initially developed for the analysis of microarray gene expression data, but it can also be used for some applications in proteomics and other aspects of genomics. HDBStat! provides statisticians and biologists a flexible and easy-to-use interface to analyze complex microarray data using a variety of methods for data preprocessing, quality control analysis and hypothesis testing. Results generated from data preprocessing methods, quality control analysis and hypothesis testing methods are output in the form of Excel CSV tables, graphs and an Html report summarizing data analysis. HDBStat! is a platform-independent software that is freely available to academic institutions and non-profit organizations. It can be downloaded from our website http://www.soph.uab.edu/ssg_content.asp?id=1164.
NASA Technical Reports Server (NTRS)
Gallardo, V. C.; Storace, A. S.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.
1981-01-01
The component element method was used to develop a transient dynamic analysis computer program, which is essentially based on modal synthesis combined with a central finite difference numerical integration scheme. The methodology leads to a modular or building-block technique that is amenable to computer programming. To verify the analytical method, the turbine engine transient response analysis (TETRA) program was applied to two blade-out test vehicles that had been previously instrumented and tested. Comparison of the time-dependent test data with those predicted by TETRA led to recommendations for refinement or extension of the analytical method to improve its accuracy and overcome its shortcomings. The development of the working equations, their discretization, the numerical solution scheme, the modular concept of engine modelling, the program's logical structure, and some illustrative results are discussed. The blade-loss test vehicles (rig and full engine), the type of measured data, and the engine structural model are described.
Modeling and dynamic environment analysis technology for spacecraft
NASA Astrophysics Data System (ADS)
Fang, Ren; Zhaohong, Qin; Zhong, Zhang; Zhenhao, Liu; Kai, Yuan; Long, Wei
Spacecraft sustain complex and severe vibration and acoustic environments during flight. Predicting the resulting structural responses, including numerical prediction of fluctuating pressure, model updating, and random vibration and acoustic analysis, plays an important role during the design, manufacture and ground testing of spacecraft. In this paper, Monotonically Integrated Large Eddy Simulation (MILES) is introduced to predict the fluctuating pressure on the fairing. The exact flow structures of the fairing wall surface under different Mach numbers are obtained, and then a spacecraft model is constructed using the finite element method (FEM). According to the modal test data, the model is updated by the penalty method. On this basis, the random vibration and acoustic responses of the fairing and satellite are analyzed by different methods. The simulated results agree well with the experimental ones, which shows the validity of the modeling and dynamic environment analysis technology. This information can better support test planning, defining test conditions and designing optimal structures.
2012-01-01
Background Gene Set Analysis (GSA) has proven to be a useful approach to microarray analysis. However, most of the method development for GSA has focused on the statistical tests to be used rather than on the generation of sets that will be tested. Existing methods of set generation are often overly simplistic. The creation of sets from individual pathways (in isolation) is a poor reflection of the complexity of the underlying metabolic network. We have developed a novel approach to set generation via the use of Principal Component Analysis of the Laplacian matrix of a metabolic network. We have analysed a relatively simple data set to show the difference in results between our method and the current state-of-the-art pathway-based sets. Results The sets generated with this method are semi-exhaustive and capture much of the topological complexity of the metabolic network. The semi-exhaustive nature of this method has also allowed us to design a hypergeometric enrichment test to determine which genes are likely responsible for set significance. We show that our method finds significant aspects of biology that would be missed (i.e. false negatives) and addresses the false positive rates found with the use of simple pathway-based sets. Conclusions The set generation step for GSA is often neglected but is a crucial part of the analysis as it defines the full context for the analysis. As such, set generation methods should be robust and yield as complete a representation of the extant biological knowledge as possible. The method reported here achieves this goal and is demonstrably superior to previous set analysis methods. PMID:22876834
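The two ingredients described above can be sketched on a toy network (all numbers invented): an eigen-analysis of the graph Laplacian, whose low-order eigenvectors encode the network topology and can group nodes into candidate sets, followed by a hypergeometric enrichment p-value for one set.

```python
# Hedged sketch: Laplacian eigen-analysis of a toy metabolic network plus a
# hypergeometric enrichment test. The 6-node network and counts are invented.
import numpy as np
from scipy.stats import hypergeom

A = np.array([[0, 1, 1, 0, 0, 0],    # adjacency of a toy metabolic network
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]])
L = np.diag(A.sum(axis=1)) - A       # graph Laplacian
eigvals, eigvecs = np.linalg.eigh(L)
fiedler = eigvecs[:, 1]              # sign pattern splits the network
set_a = np.where(fiedler < 0)[0]
print("candidate set:", set_a, "| eigenvalues:", np.round(eigvals, 2))

# Enrichment: 5 "significant" genes observed in a 20-gene set, with 300
# significant genes among 6000 total (all counts hypothetical)
p = hypergeom.sf(5 - 1, 6000, 300, 20)
print(f"hypergeometric enrichment p = {p:.4f}")
```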
Sudo, Hirotaka; O'driscoll, Michael; Nishiwaki, Kenji; Kawamoto, Yuji; Gammell, Philip; Schramm, Gerhard; Wertli, Toni; Prinz, Heino; Mori, Atsuhide; Sako, Kazuhiro
2012-01-01
The application of a head space analyzer for oxygen concentration was examined to develop a novel ampoule leak test method. Studies using ampoules filled with ethanol-based solution and with nitrogen in the headspace demonstrated that the head space analysis (HSA) method showed sufficient sensitivity in detecting an ampoule crack. The proposed method is the use of HSA in conjunction with the pretreatment of an overpressurising process known as bombing to facilitate the oxygen flow through the crack in the ampoule. The method was examined in comparative studies with a conventional dye ingress method, and the results showed that the HSA method exhibits sensitivity superior to the dye method. The results indicate that the HSA method in combination with the bombing treatment provides potential application as a leak test for the detection of container defects not only for ampoule products with ethanol-based solutions, but also for testing lyophilized products in vials with nitrogen in the head space.
Testing deformation hypotheses by constraints on a time series of geodetic observations
NASA Astrophysics Data System (ADS)
Velsink, Hiddo
2018-01-01
In geodetic deformation analysis observations are used to identify form and size changes of a geodetic network, representing objects on the earth's surface. The network points are monitored, often continuously, because of suspected deformations. A deformation may affect many points during many epochs. The problem is that the best description of the deformation is, in general, unknown. To find it, different hypothesised deformation models have to be tested systematically for agreement with the observations. The tests have to be capable of stating with a certain probability the size of detectable deformations, and to be datum invariant. A statistical criterion is needed to find the best deformation model. Existing methods do not fulfil these requirements. Here we propose a method that formulates the different hypotheses as sets of constraints on the parameters of a least-squares adjustment model. The constraints can relate to subsets of epochs and to subsets of points, thus combining time series analysis and congruence model analysis. The constraints are formulated as nonstochastic observations in an adjustment model of observation equations. This gives an easy way to test the constraints and to get a quality description. The proposed method aims at providing a good discriminating method to find the best description of a deformation. The method is expected to improve the quality of geodetic deformation analysis. We demonstrate the method with an elaborate example.
Meta-analysis of test accuracy studies using imputation for partial reporting of multiple thresholds
Deeks, J.J.; Martin, E.C.; Riley, R.D.
2017-01-01
Introduction: For tests reporting continuous results, primary studies usually provide test performance at multiple but often different thresholds. This creates missing data when performing a meta-analysis at each threshold. A standard meta-analysis (no imputation, NI) ignores such missing data. A single imputation (SI) approach was recently proposed to recover missing threshold results. Here, we propose a new method that performs multiple imputation of the missing threshold results using discrete combinations (MIDC). Methods: The new MIDC method imputes missing threshold results by randomly selecting from the set of all possible discrete combinations which lie between the results for 2 known bounding thresholds. Imputed and observed results are then synthesised at each threshold. This is repeated multiple times, and the multiple pooled results at each threshold are combined using Rubin's rules to give final estimates. We compared the NI, SI, and MIDC approaches via simulation. Results: Both imputation methods outperform the NI method in simulations. There was generally little difference in the SI and MIDC methods, but the latter was noticeably better in terms of estimating the between-study variances and generally gave better coverage, due to slightly larger standard errors of pooled estimates. Given selective reporting of thresholds, the imputation methods also reduced bias in the summary receiver operating characteristic curve. Simulations demonstrate the imputation methods rely on an equal threshold spacing assumption. A real example is presented. Conclusions: The SI and, in particular, MIDC methods can be used to examine the impact of missing threshold results in meta-analysis of test accuracy studies. PMID:29052347
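The final pooling step uses Rubin's rules, which can be stated in a few lines; the per-imputation estimates below are invented.

```python
# Minimal sketch of Rubin's-rules pooling over m imputed analyses: the pooled
# variance adds the within- and between-imputation parts. Numbers are invented.
import numpy as np

est = np.array([0.82, 0.78, 0.85, 0.80, 0.83])   # per-imputation estimates
var = np.array([0.010, 0.012, 0.011, 0.010, 0.011])

m = len(est)
pooled = est.mean()
W = var.mean()                       # within-imputation variance
B = est.var(ddof=1)                  # between-imputation variance
T = W + (1 + 1 / m) * B              # total variance (Rubin's rules)
print(f"pooled={pooled:.3f}, se={np.sqrt(T):.4f}")
```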
Li, Huanjie; Nickerson, Lisa D; Nichols, Thomas E; Gao, Jia-Hong
2017-03-01
Two powerful methods for statistical inference on MRI brain images have been proposed recently, a non-stationary voxelation-corrected cluster-size test (CST) based on random field theory and threshold-free cluster enhancement (TFCE) based on calculating the level of local support for a cluster, then using permutation testing for inference. Unlike other statistical approaches, these two methods do not rest on the assumptions of a uniform and high degree of spatial smoothness of the statistic image. Thus, they are strongly recommended for group-level fMRI analysis compared to other statistical methods. In this work, the non-stationary voxelation-corrected CST and TFCE methods for group-level analysis were evaluated for both stationary and non-stationary images under varying smoothness levels, degrees of freedom and signal to noise ratios. Our results suggest that both methods provide adequate control for the number of voxel-wise statistical tests being performed during inference on fMRI data and they are both superior to current CSTs implemented in popular MRI data analysis software packages. However, TFCE is more sensitive and stable for group-level analysis of VBM data. Thus, the voxelation-corrected CST approach may confer some advantages by being computationally less demanding for fMRI data analysis than TFCE with permutation testing and by also being applicable for single-subject fMRI analyses, while the TFCE approach is advantageous for VBM data. Hum Brain Mapp 38:1269-1280, 2017. © 2016 Wiley Periodicals, Inc.
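For intuition, here is a sketch of the TFCE computation on a one-dimensional statistic "image" (real implementations operate on 3-D volumes): each position accumulates cluster extent^E times height^H over thresholds, with the conventional defaults E = 0.5 and H = 2. The data are invented.

```python
# Hedged 1-D sketch of threshold-free cluster enhancement (TFCE).
import numpy as np

stat = np.array([0.2, 1.1, 2.5, 2.8, 2.4, 0.3, 0.1, 1.9, 2.0, 0.2])
E, H, dh = 0.5, 2.0, 0.1

tfce = np.zeros_like(stat)
for h in np.arange(dh, stat.max() + dh, dh):
    idx = np.where(stat >= h)[0]            # supra-threshold positions
    if idx.size == 0:
        break
    # label contiguous runs as clusters and accumulate their contributions
    clusters = np.split(idx, np.where(np.diff(idx) > 1)[0] + 1)
    for cluster in clusters:
        tfce[cluster] += (len(cluster) ** E) * (h ** H) * dh

print(np.round(tfce, 2))
```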
Process fault detection and nonlinear time series analysis for anomaly detection in safeguards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burr, T.L.; Mullen, M.F.; Wangen, L.E.
In this paper we discuss two advanced techniques, process fault detection and nonlinear time series analysis, and apply them to the analysis of vector-valued and single-valued time-series data. We investigate model-based process fault detection methods for analyzing simulated, multivariate, time-series data from a three-tank system. The model predictions are compared with simulated measurements of the same variables to form residual vectors that are tested for the presence of faults (possible diversions in safeguards terminology). We evaluate two methods, testing all individual residuals with a univariate z-score and testing all variables simultaneously with the Mahalanobis distance, for their ability to detect loss of material from two different leak scenarios from the three-tank system: a leak without and with replacement of the lost volume. Nonlinear time-series analysis tools were compared with the linear methods popularized by Box and Jenkins. We compare prediction results using three nonlinear and two linear modeling methods on each of six simulated time series: two nonlinear and four linear. The nonlinear methods performed better at predicting the nonlinear time series and did as well as the linear methods at predicting the linear values.
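The Mahalanobis-distance test on residual vectors can be sketched as follows, with simulated residuals and an invented leak bias; an alarm is raised when the squared distance exceeds a chi-square quantile.

```python
# Hedged sketch: multivariate fault detection by scoring residual vectors with
# the Mahalanobis distance. Residuals are simulated; the covariance and the
# late-onset "leak" bias are assumptions.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(7)
cov = np.array([[1.0, 0.3, 0.1],
                [0.3, 1.0, 0.2],
                [0.1, 0.2, 1.0]])        # residual covariance (assumed known)
residuals = rng.multivariate_normal(np.zeros(3), cov, size=100)
residuals[80:] += np.array([0.0, -2.5, -2.0])   # simulated diversion/leak

cov_inv = np.linalg.inv(cov)
d2 = np.einsum('ij,jk,ik->i', residuals, cov_inv, residuals)
alarm = d2 > chi2.ppf(0.999, df=3)       # per-sample false-alarm rate 0.1%
print("first alarm at sample", np.argmax(alarm), "| alarms:", alarm.sum())
```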
Gerwin, Philip M; Arbona, Rodolfo J Ricart; Riedel, Elyn R; Lepherd, Michelle L; Henderson, Ken S; Lipman, Neil S
2017-01-01
There is no consensus regarding the best practice for detecting murine pinworm infections. Initially, we evaluated 7 fecal concentration methods by using feces containing Aspiculuris tetraptera (AT) eggs (n = 20 samples per method). Sodium nitrate flotation, sodium nitrate centrifugation, Sheather sugar centrifugation, and zinc sulfate centrifugation detected eggs in 100% of samples; zinc sulfate flotation and water sedimentation detected eggs in 90%. All had better detection rates than Sheather sugar flotation (50%). To determine optimal detection methods, Swiss Webster mice were exposed to Syphacia obvelata (SO; n = 60) or AT (n = 60). We compared the following methods at days 0, 30, and 90, beginning 21 or 28 d after SO and AT exposure, respectively: fecal concentration (AT only), anal tape test (SO only), direct examination of intestinal contents (cecum and colon), Swiss roll histology (cecum and colon), and PCR analysis (pooled fur swab and feces). Detection rates for SO-exposed mice were: PCR analysis, 45%; Swiss roll histology, 30%; intestinal content exam, 27%; and tape test, 27%. The SO detection rate for PCR analysis was significantly greater than that for the tape test. Detection rates for AT-exposed mice were: intestinal content exam, 53%; PCR analysis, 33%; fecal flotation, 22%; and Swiss roll histology, 17%. The AT detection rate of PCR analysis combined with intestinal content examination was greater than for PCR analysis only and the AT detection rate of intestinal content examination was greater than for Swiss roll histology. Combining PCR analysis with intestinal content examination detected 100% of infected animals. No single test detected all positive animals. We recommend combining PCR analysis with intestinal content examination for optimal pinworm detection. PMID:28905712
Random left censoring: a second look at bone lead concentration measurements
NASA Astrophysics Data System (ADS)
Popovic, M.; Nie, H.; Chettle, D. R.; McNeill, F. E.
2007-09-01
Bone lead concentrations measured in vivo by x-ray fluorescence (XRF) are subjected to left censoring due to limited precision of the technique at very low concentrations. In the analysis of bone lead measurements, inverse variance weighting (IVW) of measurements is commonly used to estimate the mean of a data set and its standard error. Student's t-test is used to compare the IVW means of two sets, testing the hypothesis that the two sets are from the same population. This analysis was undertaken to assess the adequacy of IVW in the analysis of bone lead measurements or to confirm the results of IVW using an independent approach. The rationale is provided for the use of methods of survival data analysis in the study of XRF bone lead measurements. The procedure is provided for bone lead data analysis using the Kaplan-Meier and Nelson-Aalen estimators. The methodology is also outlined for the rank tests that are used to determine whether two censored sets are from the same population. The methods are applied on six data sets acquired in epidemiological studies. The estimated parameters and test statistics were compared with the results of the IVW approach. It is concluded that the proposed methods of statistical analysis can provide valid inference about bone lead concentrations, but the computed parameters do not differ substantially from those derived by the more widely used method of IVW.
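For reference, the IVW estimate against which the survival-analysis estimators are compared is just a precision-weighted mean; the values below are invented bone-lead measurements, including a physically implausible negative point of the kind that motivates the censoring discussion.

```python
# Minimal sketch of the inverse-variance-weighted (IVW) mean and its standard
# error. Measurements and uncertainties are invented (ug Pb / g bone mineral).
import numpy as np

x  = np.array([12.0, 5.0, -3.0, 8.0, 20.0])   # XRF point estimates
sd = np.array([6.0, 5.0, 7.0, 4.0, 9.0])      # per-measurement uncertainties

w = 1.0 / sd**2
ivw_mean = np.sum(w * x) / np.sum(w)
ivw_se = np.sqrt(1.0 / np.sum(w))
print(f"IVW mean = {ivw_mean:.2f} +/- {ivw_se:.2f}")
```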
ERIC Educational Resources Information Center
Wei, Youhua; Qu, Yanxuan
2014-01-01
For a testing program with frequent administrations, it is important to understand and monitor the stability and fluctuation of test performance across administrations. Different methods have been proposed for this purpose. This study explored the potential of using multilevel analysis to understand and monitor examinees' test performance across…
Turbulence excited frequency domain damping measurement and truncation effects
NASA Technical Reports Server (NTRS)
Soovere, J.
1976-01-01
Existing frequency domain modal frequency and damping analysis methods are discussed. The effects of truncation in the Laplace and Fourier transform data analysis methods are described. Methods for eliminating truncation errors from measured damping are presented. Implications of truncation effects in fast Fourier transform analysis are discussed. Limited comparison with test data is presented.
40 CFR 86.1 - Reference materials.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) ASTM D1945-91, Standard Test Method for Analysis of Natural Gas by Gas Chromatography, IBR approved for §§ 86.113-94, 86.513-94, 86.1213-94, 86.1313-94. (iii) ASTM D2163-91, Standard Test Method for Analysis... §§ 86.113-94, 86.1213-94, 86.1313-94. (iv) ASTM D2986-95a, Reapproved 1999, Standard Practice for...
40 CFR 86.1 - Reference materials.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) ASTM D1945-91, Standard Test Method for Analysis of Natural Gas by Gas Chromatography, IBR approved for §§ 86.113-94, 86.513-94, 86.1213-94, 86.1313-94. (iii) ASTM D2163-91, Standard Test Method for Analysis... §§ 86.113-94, 86.1213-94, 86.1313-94. (iv) ASTM D2986-95a, Reapproved 1999, Standard Practice for...
E-learning platform for automated testing of electronic circuits using signature analysis method
NASA Astrophysics Data System (ADS)
Gherghina, Cǎtǎlina; Bacivarov, Angelica; Bacivarov, Ioan C.; Petricǎ, Gabriel
2016-12-01
Dependability of electronic circuits can be ensured only through testing of circuit modules, which is done by generating test vectors and applying them to the circuit. Testability should be viewed as a concerted effort to ensure maximum efficiency throughout the product life cycle, from the conception and design stage, through production, to repairs during the product's operation. This paper presents the platform developed by the authors for training in testability in electronics in general, and in using the signature analysis method in particular. The platform highlights the two approaches in the field, namely the analog and digital signatures of circuits. As part of this e-learning platform, a database of signatures of different electronic components has been developed to put the spotlight on different fault detection techniques, and on self-repairing techniques for systems built from such components. An approach to realizing self-testing circuits based on the MATLAB environment and using the signature analysis method is proposed. The paper also analyses the benefits of the signature analysis method and simulates signature analyzer performance based on the use of pseudo-random sequences.
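A minimal sketch of the digital signature idea follows: the circuit's serial response stream is compressed by a linear-feedback shift register (LFSR), and a faulty stream almost certainly yields a different signature. The tap positions and bit streams are arbitrary choices, not those used on the platform.

```python
# Hedged sketch of signature analysis with a serial LFSR signature register.
def lfsr_signature(bits, taps=(15, 13, 12, 10), width=16):
    """Compress a bit sequence into a 16-bit LFSR signature."""
    state = 0
    for b in bits:
        fb = b
        for t in taps:                   # XOR the feedback taps with the input
            fb ^= (state >> t) & 1
        state = ((state << 1) | fb) & ((1 << width) - 1)
    return state

good = [1, 0, 1, 1, 0, 0, 1, 0] * 8      # response of a fault-free circuit
bad = list(good); bad[19] ^= 1           # single-bit fault in the stream

print(f"good signature: {lfsr_signature(good):04x}")
print(f"bad  signature: {lfsr_signature(bad):04x}")
```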
Monte Carlo based statistical power analysis for mediation models: methods and software.
Zhang, Zhiyong
2014-12-01
The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
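A generic version of the proposed Monte Carlo procedure is sketched below (the bmem package itself is in R; this Python sketch only mirrors the idea): simulate data from a simple mediation model, bootstrap the indirect effect in each replication, and report the fraction of replications whose confidence interval excludes zero. Effect sizes, sample size, and the (deliberately small) replication counts are all assumptions.

```python
# Hedged sketch: Monte Carlo power estimation for the indirect effect a*b in a
# simple mediation model X -> M -> Y, tested with a percentile bootstrap.
import numpy as np

rng = np.random.default_rng(3)
n, a, b, n_sim, n_boot = 100, 0.3, 0.3, 100, 200   # kept small for speed

def ab_hat(x, m, y):
    a_ = np.polyfit(x, m, 1)[0]                         # slope of M on X
    b_ = np.linalg.lstsq(np.c_[x, m, np.ones(len(x))],  # slope of Y on M,
                         y, rcond=None)[0][1]           # adjusting for X
    return a_ * b_

hits = 0
for _ in range(n_sim):
    x = rng.normal(size=n)
    m = a * x + rng.normal(size=n)
    y = b * m + rng.normal(size=n)
    boots = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)
        boots.append(ab_hat(x[idx], m[idx], y[idx]))
    lo, hi = np.percentile(boots, [2.5, 97.5])
    hits += (lo > 0) or (hi < 0)

print(f"estimated power ~ {hits / n_sim:.2f}")
```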
Freeman, Karoline; Tsertsvadze, Alexander; Taylor-Phillips, Sian; McCarthy, Noel; Mistry, Hema; Manuel, Rohini; Mason, James
2017-01-01
Multiplex gastrointestinal pathogen panel (GPP) tests simultaneously identify bacterial, viral and parasitic pathogens from the stool samples of patients with suspected infectious gastroenteritis presenting in hospital or the community. We undertook a systematic review to compare the accuracy of GPP tests with standard microbiology techniques. Searches in Medline, Embase, Web of Science and the Cochrane library were undertaken from inception to January 2016. Eligible studies compared GPP tests with standard microbiology techniques in patients with suspected gastroenteritis. Quality assessment of included studies used tailored QUADAS-2. In the absence of a reference standard we analysed test performance taking GPP tests and standard microbiology techniques in turn as the benchmark test, using random effects meta-analysis of proportions. No study provided an adequate reference standard with which to compare the test accuracy of GPP and conventional tests. Ten studies informed a meta-analysis of positive and negative agreement. Positive agreement across all pathogens was 0.93 (95% CI 0.90 to 0.96) when conventional methods were the benchmark and 0.68 (95% CI: 0.58 to 0.77) when GPP provided the benchmark. Negative agreement was high in both instances due to the high proportion of negative cases. GPP testing produced a greater number of pathogen-positive findings than conventional testing. It is unclear whether these additional 'positives' are clinically important. GPP testing has the potential to simplify testing and accelerate reporting when compared to conventional microbiology methods. However the impact of GPP testing upon the management, treatment and outcome of patients is poorly understood and further studies are needed to evaluate the health economic impact of GPP testing compared with standard methods. The review protocol is registered with PROSPERO as CRD42016033320.
NASA Astrophysics Data System (ADS)
Chen, Zhe; Qiu, Zurong; Huo, Xinming; Fan, Yuming; Li, Xinghua
2017-03-01
A fiber-capacitive drop analyzer is an instrument which monitors a growing droplet to produce a capacitive opto-tensiotrace (COT). Each COT is an integration of fiber light intensity signals and capacitance signals and can reflect the unique physicochemical properties of a liquid. In this study, we propose a method for solution identification and concentration quantitation based on multivariate statistical methods. Eight characteristic values are extracted from each COT. A series of COT characteristic values of training solutions at different concentrations composes a data library for that kind of solution. A two-stage linear discriminant analysis is applied to analyze the different solution libraries and establish discriminant functions, with which test solutions can be discriminated. After determining the variety of a test solution, the Spearman correlation test and principal components analysis are used to filter and reduce the dimensions of the eight characteristic values, producing a new representative parameter. A cubic spline interpolation function is built between the parameters and concentrations, from which the concentration of the test solution can be calculated. Methanol, ethanol, n-propanol, and saline solutions are taken as experimental subjects in this paper. For each solution, nine or ten different concentrations are chosen as the standard library, and two other concentrations compose the test group. Using the methods described above, all eight test solutions are correctly identified, and the average relative error of the quantitative analysis is 1.11%. The proposed method is feasible: it enlarges the applicable scope of recognizing liquids based on the COT and improves the precision of quantitative concentration estimation.
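The final quantitation step can be sketched as a cubic-spline inversion from the representative COT parameter to concentration; the calibration pairs below are invented.

```python
# Hedged sketch of the cubic-spline calibration step. All values are invented.
import numpy as np
from scipy.interpolate import CubicSpline

conc = np.array([5, 10, 15, 20, 25, 30, 40, 50, 60, 70])   # % v/v, training
param = np.array([0.21, 0.34, 0.45, 0.55, 0.63, 0.70,
                  0.81, 0.89, 0.95, 0.99])                  # representative value

spline = CubicSpline(param, conc)          # invert: parameter -> concentration
print(f"estimated concentration: {spline(0.58):.1f} %")
```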
Instruments for Water Quality Monitoring
ERIC Educational Resources Information Center
Ballinger, Dwight G.
1972-01-01
Presents information regarding available instruments for industries and agencies who must monitor numerous aquatic parameters. Charts denote examples of parameters sampled, testing methods, range and accuracy of test methods, cost analysis, and reliability of instruments. (BL)
Meta-analysis of diagnostic accuracy studies in mental health
Takwoingi, Yemisi; Riley, Richard D; Deeks, Jonathan J
2015-01-01
Objectives: To explain methods for data synthesis of evidence from diagnostic test accuracy (DTA) studies, and to illustrate different types of analyses that may be performed in a DTA systematic review. Methods: We described properties of meta-analytic methods for quantitative synthesis of evidence. We used a DTA review comparing the accuracy of three screening questionnaires for bipolar disorder to illustrate application of the methods for each type of analysis. Results: The discriminatory ability of a test is commonly expressed in terms of sensitivity (proportion of those with the condition who test positive) and specificity (proportion of those without the condition who test negative). There is a trade-off between sensitivity and specificity, as an increasing threshold for defining test positivity will decrease sensitivity and increase specificity. Methods recommended for meta-analysis of DTA studies, such as the bivariate or hierarchical summary receiver operating characteristic (HSROC) model, jointly summarise sensitivity and specificity while taking into account this threshold effect, as well as allowing for between-study differences in test performance beyond what would be expected by chance. The bivariate model focuses on estimation of a summary sensitivity and specificity at a common threshold, while the HSROC model focuses on the estimation of a summary curve from studies that have used different thresholds. Conclusions: Meta-analyses of diagnostic accuracy studies can provide answers to important clinical questions. We hope this article will provide clinicians with sufficient understanding of the terminology and methods to aid interpretation of systematic reviews and facilitate better patient care. PMID:26446042
Validated Test Method 5030C: Purge-and-Trap for Aqueous Samples
This method describes a purge-and-trap procedure for the analysis of volatile organic compounds in aqueous samples and water-miscible liquid samples. It also describes the analysis of high-concentration soil and waste sample extracts prepared in Method 5035.
Conducting meta-analyses of HIV prevention literatures from a theory-testing perspective.
Marsh, K L; Johnson, B T; Carey, M P
2001-09-01
Using illustrations from HIV prevention research, the current article advocates approaching meta-analysis as a theory-testing scientific method rather than as merely a set of rules for quantitative analysis. Like other scientific methods, meta-analysis has central concerns with internal, external, and construct validity. The focus of a meta-analysis should only rarely be merely describing the effects of health promotion, but rather should be on understanding and explaining phenomena and the processes underlying them. The methodological decisions meta-analysts make in conducting reviews should be guided by a consideration of the underlying goals of the review (e.g., simply effect size estimation or, preferably theory testing). From the advocated perspective that a health behavior meta-analyst should test theory, the authors present a number of issues to be considered during the conduct of meta-analyses.
Considering Horn's Parallel Analysis from a Random Matrix Theory Point of View.
Saccenti, Edoardo; Timmerman, Marieke E
2017-03-01
Horn's parallel analysis is a widely used method for assessing the number of principal components and common factors. We discuss the theoretical foundations of parallel analysis for principal components based on a covariance matrix by making use of arguments from random matrix theory. In particular, we show that (i) for the first component, parallel analysis is an inferential method equivalent to the Tracy-Widom test, (ii) its use to test high-order eigenvalues is equivalent to the use of the joint distribution of the eigenvalues, and thus should be discouraged, and (iii) a formal test for higher-order components can be obtained based on a Tracy-Widom approximation. We illustrate the performance of the two testing procedures using simulated data generated under both a principal component model and a common factors model. For the principal component model, the Tracy-Widom test performs consistently in all conditions, while parallel analysis shows unpredictable behavior for higher-order components. For the common factor model, including major and minor factors, both procedures are heuristic approaches, with variable performance. We conclude that the Tracy-Widom procedure is preferred over parallel analysis for statistically testing the number of principal components based on a covariance matrix.
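A minimal sketch of Horn's parallel analysis for principal components follows, on simulated data with two real components: observed eigenvalues of the correlation matrix are retained while they exceed the 95th percentile of eigenvalues from random data of the same dimensions.

```python
# Hedged sketch of Horn's parallel analysis on simulated data (2 real
# components buried in noise; all sizes and counts are arbitrary choices).
import numpy as np

rng = np.random.default_rng(5)
n, p = 200, 8
latent = rng.normal(size=(n, 2))
X = latent @ rng.normal(size=(2, p)) + rng.normal(size=(n, p))

obs_eig = np.linalg.eigvalsh(np.corrcoef(X.T))[::-1]
rand_eig = np.array([
    np.linalg.eigvalsh(np.corrcoef(rng.normal(size=(n, p)).T))[::-1]
    for _ in range(500)])
threshold = np.percentile(rand_eig, 95, axis=0)

keep = obs_eig > threshold                 # retain while observed > random
n_keep = int(keep.argmin()) if not keep.all() else p
print("retain", n_keep, "components")
```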
Leontaridou, Maria; Gabbert, Silke; Van Ierland, Ekko C; Worth, Andrew P; Landsiedel, Robert
2016-07-01
This paper offers a Bayesian Value-of-Information (VOI) analysis for guiding the development of non-animal testing strategies, balancing information gains from testing with the expected social gains and costs from the adoption of regulatory decisions. Testing is assumed to have value, if, and only if, the information revealed from testing triggers a welfare-improving decision on the use (or non-use) of a substance. As an illustration, our VOI model is applied to a set of five individual non-animal prediction methods used for skin sensitisation hazard assessment, seven battery combinations of these methods, and 236 sequential 2-test and 3-test strategies. Their expected values are quantified and compared to the expected value of the local lymph node assay (LLNA) as the animal method. We find that battery and sequential combinations of non-animal prediction methods reveal a significantly higher expected value than the LLNA. This holds for the entire range of prior beliefs. Furthermore, our results illustrate that the testing strategy with the highest expected value does not necessarily have to follow the order of key events in the sensitisation adverse outcome pathway (AOP). 2016 FRAME.
Robust inference for group sequential trials.
Ganju, Jitendra; Lin, Yunzhi; Zhou, Kefei
2017-03-01
For ethical reasons, group sequential trials were introduced to allow trials to stop early in the event of extreme results. Endpoints in such trials are usually mortality or irreversible morbidity. For a given endpoint, the norm is to use a single test statistic and to use that same statistic for each analysis. This approach is risky because the test statistic has to be specified before the study is unblinded, and there is loss in power if the assumptions that ensure optimality for each analysis are not met. To minimize the risk of moderate to substantial loss in power due to a suboptimal choice of a statistic, a robust method was developed for nonsequential trials. The concept is analogous to diversification of financial investments to minimize risk. The method is based on combining P values from multiple test statistics for formal inference while controlling the type I error rate at its designated value. This article evaluates the performance of 2 P value combining methods for group sequential trials. The emphasis is on time to event trials, although results from less complex trials are also included. The gain or loss in power with the combination method relative to a single statistic is asymmetric in its favor. Depending on the power of each individual test, the combination method can give more power than any single test or give power that is closer to the test with the most power. The versatility of the method is that it can combine P values from different test statistics for analysis at different times. The robustness of results suggests that inference from group sequential trials can be strengthened with the use of combined tests. Copyright © 2017 John Wiley & Sons, Ltd.
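As one classical member of the P-value-combining family, Fisher's method is sketched below. It is not necessarily the specific rule evaluated in the paper, and because statistics computed on the same data are correlated, a practical procedure must calibrate the combination's null distribution to preserve the type I error rate; Fisher's chi-square reference distribution assumes independent tests.

```python
# Illustrative sketch: Fisher's method for combining P values from two
# different test statistics on the same endpoint. P values are invented.
import numpy as np
from scipy.stats import chi2

p_values = [0.04, 0.11]                       # e.g. log-rank and Wilcoxon
stat = -2 * np.sum(np.log(p_values))
p_combined = chi2.sf(stat, df=2 * len(p_values))
print(f"combined p = {p_combined:.4f}")
```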
A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment
ERIC Educational Resources Information Center
Finch, Holmes; Monahan, Patrick
2008-01-01
This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…
Appropriate Statistical Analysis for Two Independent Groups of Likert-Type Data
ERIC Educational Resources Information Center
Warachan, Boonyasit
2011-01-01
The objective of this research was to determine the robustness and statistical power of three different methods for testing the hypothesis that ordinal samples of five and seven Likert categories come from equal populations. The three methods are the two sample t-test with equal variances, the Mann-Whitney test, and the Kolmogorov-Smirnov test. In…
ERIC Educational Resources Information Center
Luh, Wei-Ming; Guo, Jiin-Huarng
2005-01-01
To deal with nonnormal and heterogeneous data for the one-way fixed effect analysis of variance model, the authors adopted a trimmed means method in conjunction with Hall's invertible transformation into a heteroscedastic test statistic (Alexander-Govern test or Welch test). The results of simulation experiments showed that the proposed technique…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rowe, M.D.; Pierce, B.L.
This report presents results of tests of different final site selection methods used for siting large-scale facilities such as nuclear power plants. Test data are adapted from a nuclear power plant siting study conducted on Long Island, New York. The purpose of the tests is to determine whether or not different final site selection methods produce different results, and to obtain some understanding of the nature of any differences found. Decision rules and weighting methods are included. Decision rules tested are Weighting Summation, Power Law, Decision Analysis, Goal Programming, and Goal Attainment; weighting methods tested are Categorization, Ranking, Rating, Ratio Estimation, Metfessel Allocation, Indifferent Tradeoff, Decision Analysis lottery, and Global Evaluation. Results show that different methods can, indeed, produce different results, but that the probability that they will do so is controlled by the structure of differences among the sites being evaluated. Differences in weights and suitability scores attributable to methods have reduced significance if the alternatives include one or two sites that are superior to all others in many attributes. The more tradeoffs there are among good and bad levels of different attributes at different sites, the more important are the specifics of methods to the final decision. 5 refs., 14 figs., 19 tabs.
Propfan test assessment testbed aircraft flutter model test report
NASA Technical Reports Server (NTRS)
Jenness, C. M. J.
1987-01-01
The PropFan Test Assessment (PTA) program includes flight tests of a propfan power plant mounted on the left wing of a modified Gulfstream II testbed aircraft. A static balance boom is mounted on the right wing tip for lateral balance. Flutter analyses indicate that these installations reduce the wing flutter stabilizing speed and that torsional stiffening and the installation of a flutter stabilizing tip boom are required on the left wing for adequate flutter safety margins. Wind tunnel tests of a 1/9th scale high speed flutter model of the testbed aircraft were conducted. The test program included the design, fabrication, and testing of the flutter model and the correlation of the flutter test data with analysis results. Excellent correlations with the test data were achieved in posttest flutter analysis using actual model properties. It was concluded that the flutter analysis method used was capable of accurate flutter predictions for both the (symmetric) twin propfan configuration and the (unsymmetric) single propfan configuration. The flutter analysis also revealed that the differences between the tested model configurations and the current aircraft design caused the (scaled) model flutter speed to be significantly higher than that of the aircraft, at least for the single propfan configuration without a flutter boom. Verification of the aircraft final design should, therefore, be based on flutter predictions made with the test-validated analysis methods.
Investigating the detection of multi-homed devices independent of operating systems
2017-09-01
timestamp data was used to estimate clock skews using linear regression and linear optimization methods. Analysis revealed that detection depends on the consistency of the estimated clock skew. Through vertical testing, it was also shown that clock skew consistency depends on the installed...
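A sketch of the regression variant of the skew estimate follows (the study also used linear optimization, which fits the lower envelope of the offsets and is more robust to one-sided queueing delay): the slope of the timestamp offset against measurement time is the skew. The data are synthetic, with an assumed 50 ppm skew.

```python
# Hedged sketch: clock-skew estimation from timestamp offsets by ordinary
# least squares. Synthetic data: 50 ppm skew plus one-sided delay noise.
import numpy as np

rng = np.random.default_rng(11)
t = np.sort(rng.uniform(0, 600, 300))               # packet arrival times (s)
offset = 50e-6 * t + rng.exponential(2e-3, t.size)  # skew + queueing delay

skew_ppm = np.polyfit(t, offset, 1)[0] * 1e6        # slope in ppm
print(f"estimated clock skew ~ {skew_ppm:.1f} ppm")
```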
Advanced MRI in Acute Military TBI
2015-11-01
The objective of the project was to test two advanced MRI methods, DTI and resting-state fMRI correlation analysis, in military TBI patients acutely after injury and correlate findings with TBI... Results of the Military Acute Concussion Exam (MACE) were reviewed. This brief cognitive test assesses orientation, immediate verbal memory, concentration, and short...
Development of synthetic nuclear melt glass for forensic analysis.
Molgaard, Joshua J; Auxier, John D; Giminaro, Andrew V; Oldham, C J; Cook, Matthew T; Young, Stephen A; Hall, Howard L
A method for producing synthetic debris similar to the melt glass produced by nuclear surface testing is demonstrated. Melt glass from the first nuclear weapon test (commonly referred to as trinitite) is used as the benchmark for this study. These surrogates can be used to simulate a variety of scenarios and will serve as a tool for developing and validating forensic analysis methods.
Extended time-interval analysis
NASA Astrophysics Data System (ADS)
Fynbo, H. O. U.; Riisager, K.
2014-01-01
Several extensions of the half-life analysis method recently suggested by Horvat and Hardy are put forward. Goodness-of-fit testing is included, and the method is extended to cases where more information is available for each decay event, which allows applications also to, e.g., γ-decay data. The results are tested with Monte Carlo simulations and are applied to the decays of 64Cu and 56Mn.
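The Horvat-Hardy method itself is not reproduced here, but the flavor of half-life estimation from individual decay events, with a goodness-of-fit check of the kind this extension adds, can be sketched with a generic maximum-likelihood estimate on simulated data (the isotope and counts are illustrative only):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simulated decay event times (s) for a 64Cu-like source (half-life 12.7 h),
# assuming an observation window long enough that truncation is negligible.
half_life = 12.7 * 3600.0
lam = np.log(2) / half_life
events = rng.exponential(1.0 / lam, 5000)

# Maximum-likelihood estimate for exponential decay: lambda_hat = N / sum(t).
lam_hat = events.size / events.sum()
print(f"estimated half-life: {np.log(2) / lam_hat / 3600:.2f} h")

# Goodness-of-fit check: KS test of the events against the fitted exponential.
# (The p-value is only approximate because the rate was fitted from the data.)
d, p = stats.kstest(events, "expon", args=(0, 1.0 / lam_hat))
print(f"KS statistic: {d:.4f}, approximate p-value: {p:.3f}")
```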
ERIC Educational Resources Information Center
Eckes, Thomas
2017-01-01
This paper presents an approach to standard setting that combines the prototype group method (PGM; Eckes, 2012) with a receiver operating characteristic (ROC) analysis. The combined PGM-ROC approach is applied to setting cut scores on a placement test of English as a foreign language (EFL). To implement the PGM, experts first named learners whom…
Computational Design and Analysis of a Transonic Natural Laminar Flow Wing for a Wind Tunnel Model
NASA Technical Reports Server (NTRS)
Lynde, Michelle N.; Campbell, Richard L.
2017-01-01
A natural laminar flow (NLF) wind tunnel model has been designed and analyzed for a wind tunnel test in the National Transonic Facility (NTF) at the NASA Langley Research Center. The NLF design method is built into the CDISC design module and uses a Navier-Stokes flow solver, a boundary layer profile solver, and stability analysis and transition prediction software. The NLF design method alters the pressure distribution to support laminar flow on the upper surface of wings with high sweep and flight Reynolds numbers. The method addresses transition due to attachment line contamination/transition, Gortler vortices, and crossflow and Tollmien-Schlichting modal instabilities. The design method is applied to the wing of the Common Research Model (CRM) at transonic flight conditions. Computational analysis predicts significant extents of laminar flow on the wing upper surface, which results in drag savings. A 5.2 percent scale semispan model of the CRM NLF wing will be built and tested in the NTF. This test will aim to validate the NLF design method, as well as characterize the laminar flow testing capabilities in the wind tunnel facility.
Implementation of a computer database testing and analysis program.
Rouse, Deborah P
2007-01-01
The author is the coordinator of a computer software database testing and analysis program implemented in an associate degree nursing program. Computer software database programs help support the testing development and analysis process. Critical thinking is measurable and promoted with their use. The reader of this article will learn what is involved in procuring and implementing a computer database testing and analysis program in an academic nursing program. The use of the computerized database for testing and analysis will be approached as a method to promote and evaluate the nursing student's critical thinking skills and to prepare the nursing student for the National Council Licensure Examination.
Characterization of Low-Molecular-Weight Heparins by Strong Anion-Exchange Chromatography.
Sadowski, Radosław; Gadzała-Kopciuch, Renata; Kowalkowski, Tomasz; Widomski, Paweł; Jujeczka, Ludwik; Buszewski, Bogusław
2017-11-01
Currently, detailed structural characterization of low-molecular-weight heparin (LMWH) products is an analytical subject of great interest. In this work, we carried out a comprehensive structural analysis of LMWHs and applied a modified pharmacopeial method, as well as methods developed by other researchers, to the analysis of novel biosimilar LMWH products; and, for the first time, compared the qualitative and quantitative composition of commercially available drugs (enoxaparin, nadroparin, and dalteparin). For this purpose, we used strong anion-exchange (SAX) chromatography with spectrophotometric detection because this method is more helpful, easier, and faster than other separation techniques for the detailed disaccharide analysis of new LMWH drugs. In addition, we subjected the obtained results to statistical analysis (factor analysis, t-test, and Newman-Keuls post hoc test).
Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan
2016-01-01
The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
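The following toy simulation, in the spirit of the comparison described above though far simpler than the paper's D-optimal studies, shows one reason distributed points help: when the assumed response model is slightly wrong (here, a quadratic fit to a cubic truth), spreading single points across the factor range yields a better global model than repeating points at a few conditions. All functions and values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

def true_response(x):
    # Hypothetical truth with a cubic term the quadratic model cannot capture.
    return 1.2 + 0.8 * x - 0.5 * x**2 + 0.4 * x**3

def fit_quadratic(x, y):
    # Response-surface step: least-squares fit of y = b0 + b1*x + b2*x^2.
    return np.linalg.lstsq(np.vander(x, 3, increasing=True), y, rcond=None)[0]

x_grid = np.linspace(-1, 1, 201)
designs = {
    "clustered":   np.repeat([-1.0, 0.0, 1.0], 10),  # many repeats, 3 conditions
    "distributed": np.linspace(-1, 1, 30),           # single points, 30 conditions
}
for name, x in designs.items():
    y = true_response(x) + rng.normal(0, 0.1, x.size)
    b = fit_quadratic(x, y)
    pred = np.vander(x_grid, 3, increasing=True) @ b
    rmse = np.sqrt(np.mean((pred - true_response(x_grid)) ** 2))
    print(f"{name:11s}: model RMSE over the factor range = {rmse:.4f}")
```

The distributed design typically produces the smaller RMSE here because its residuals constrain the fit across the whole range, which mirrors the paper's conclusion in miniature.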
40 CFR Table 8 to Subpart Zzzz of... - Applicability of General Provisions to Subpart ZZZZ.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) Conduct of performance tests and reduction of data Yes Subpart ZZZZ specifies test methods at § 63.6620... 114 of the CAA Yes. § 63.7(f) Alternative test method provisions Yes. § 63.7(g) Performance test data analysis, recordkeeping, and reporting Yes. § 63.7(h) Waiver of tests Yes. § 63.8(a)(1) Applicability of...
Suggestions for presenting the results of data analyses
Anderson, David R.; Link, William A.; Johnson, Douglas H.; Burnham, Kenneth P.
2001-01-01
We give suggestions for the presentation of research results from frequentist, information-theoretic, and Bayesian analysis paradigms, followed by several general suggestions. The information-theoretic and Bayesian methods offer alternative approaches to data analysis and inference compared to traditionally used methods. Guidance is lacking on the presentation of results under these alternative procedures and on nontesting aspects of classical frequentist methods of statistical analysis. Null hypothesis testing has come under intense criticism. We recommend less reporting of the results of statistical tests of null hypotheses in cases where the null is surely false anyway, or where the null hypothesis is of little interest to science or management.
Open-Source Photometric System for Enzymatic Nitrate Quantification
Wittbrodt, B. T.; Squires, D. A.; Walbeck, J.; Campbell, E.; Campbell, W. H.; Pearce, J. M.
2015-01-01
Nitrate, the most oxidized form of nitrogen, is regulated to protect people and animals from harmful levels, as there is a large overabundance due to anthropogenic factors. Widespread field testing for nitrate could begin to address the nitrate pollution problem; however, the Cadmium Reduction Method, the leading certified method to detect and quantify nitrate, demands the use of a toxic heavy metal. An alternative, the recently proposed Environmental Protection Agency Nitrate Reductase Nitrate-Nitrogen Analysis Method, eliminates this problem but requires an expensive proprietary spectrophotometer. The development of an inexpensive, portable, handheld photometer will greatly expedite field nitrate analysis to combat pollution. To accomplish this goal, a methodology is presented for the design, development, and technical validation of an improved open-source water testing platform capable of performing the Nitrate Reductase Nitrate-Nitrogen Analysis Method. This approach is evaluated for its potential to i) eliminate the need for toxic chemicals in water testing for nitrate and nitrite, ii) reduce the cost of equipment to perform this method for measurement of water quality, and iii) make the method easier to carry out in the field. The device is able to perform as well as commercial proprietary systems for less than 15% of the cost for materials. This allows for greater access to the technology and the new, safer nitrate testing technique. PMID:26244342
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kiefel, Denis, E-mail: Denis.Kiefel@airbus.com, E-mail: Rainer.Stoessel@airbus.com; Stoessel, Rainer, E-mail: Denis.Kiefel@airbus.com, E-mail: Rainer.Stoessel@airbus.com; Grosse, Christian, E-mail: Grosse@tum.de
2015-03-31
In recent years, an increasing number of safety-relevant structures are designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages are liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.
Trébucq, A; Guérin, N; Ali Ismael, H; Bernatas, J J; Sèvre, J P; Rieder, H L
2005-10-01
Djibouti, 1994 and 2001. To estimate the prevalence of tuberculosis (TB) and average annual risk of TB infection (ARTI) and trends, and to test a new method for calculations. Tuberculin surveys among schoolchildren and sputum smear-positive TB patients. Prevalence of infection was calculated using cut-off points, the mirror image technique, mixture analysis, and a new method based on the operating characteristics of the tuberculin test. Test sensitivity was derived from tuberculin reactions among TB patients and test specificity from a comparison of reaction size distributions among children with and without a BCG scar. The ARTI was estimated to lie between 2.6% and 3.1%, with no significant changes between 1994 and 2001. The close match of the distributions between children tested in 1994 and patients justifies the utilisation of the latter to determine test sensitivity. This new method gave very consistent estimates of prevalence of infection for any induration for values between 15 and 20 mm. Specificity was successfully determined for 1994, but not for 2001. Mixture analysis confirmed the estimates obtained with the new method. Djibouti has a high ARTI, and no apparent change over the observation time was found. Using operating test characteristics to estimate prevalence of infection looks promising.
[Current status of gene test market].
Ohtani, Shinichi
2002-12-01
Technological innovation in gene analysis is expanding the range of applications of gene testing in clinical diagnosis, and gene testing has become increasingly widespread in clinical laboratories, particularly for infectious diseases. In clinical laboratory practice, the clinical importance of gene testing and the number of test items have grown year by year alongside the functional analysis of genes. Moreover, new test methods and automated analysis equipment are being developed and introduced as gene-analysis technology progresses. The spread of gene testing and the development of a gene test market have the potential to invigorate the current clinical laboratory field.
Comparisons of Exploratory and Confirmatory Factor Analysis.
ERIC Educational Resources Information Center
Daniel, Larry G.
Historically, most researchers conducting factor analysis have used exploratory methods. However, more recently, confirmatory factor analytic methods have been developed that can directly test theory either during factor rotation using "best fit" rotation methods or during factor extraction, as with the LISREL computer programs developed…
Görgen, Kai; Hebart, Martin N; Allefeld, Carsten; Haynes, John-Dylan
2017-12-27
Standard neuroimaging data analysis based on traditional principles of experimental design, modelling, and statistical inference is increasingly complemented by novel analysis methods, driven, for example, by machine learning. While these novel approaches provide new insights into neuroimaging data, they often have unexpected properties, generating a growing literature on possible pitfalls. We propose to meet this challenge by adopting a habit of systematic testing of experimental design, analysis procedures, and statistical inference. Specifically, we suggest applying the analysis method used for experimental data also to aspects of the experimental design, simulated confounds, simulated null data, and control data. We stress the importance of keeping the analysis method the same in the main and test analyses, because only in this way can possible confounds and unexpected properties be reliably detected and avoided. We describe and discuss this Same Analysis Approach in detail, and demonstrate it in two worked examples using multivariate decoding. With these examples, we reveal two sources of error: a mismatch between counterbalancing (crossover designs) and cross-validation, which leads to systematic below-chance accuracies, and linear decoding of a nonlinear effect, a difference in variance. Copyright © 2017 Elsevier Inc. All rights reserved.
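A minimal sketch of one such test, running an unchanged decoding pipeline on simulated null data to confirm that accuracy sits at chance, might look as follows (scikit-learn is assumed; the dimensions and pipeline are illustrative, not the authors' code):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(3)

# Simulated null data: 40 "trials" x 50 "voxels"; labels carry no signal.
X = rng.normal(size=(40, 50))
y = np.repeat([0, 1], 20)

# Same Analysis Approach: run the *identical* pipeline intended for real data
# on the null data; accuracy should be at chance (~0.5) if nothing is amiss.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=cv)
print(f"null-data decoding accuracy: {acc.mean():.3f} (chance = 0.5)")
```

Systematic departures from chance in such a check are precisely the kind of design-analysis mismatch the two worked examples expose.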
Test-retest stability of the Task and Ego Orientation Questionnaire.
Lane, Andrew M; Nevill, Alan M; Bowes, Neal; Fox, Kenneth R
2005-09-01
Establishing stability, defined as observing minimal measurement error in a test-retest assessment, is vital to validating psychometric tools. Correlational methods, such as Pearson product-moment, intraclass, and kappa are tests of association or consistency, whereas stability or reproducibility (regarded here as synonymous) assesses the agreement between test-retest scores. Indexes of reproducibility using the Task and Ego Orientation in Sport Questionnaire (TEOSQ; Duda & Nicholls, 1992) were investigated using correlational (Pearson product-moment, intraclass, and kappa) methods, repeated measures multivariate analysis of variance, and calculating the proportion of agreement within a referent value of +/-1 as suggested by Nevill, Lane, Kilgour, Bowes, and Whyte (2001). Two hundred thirteen soccer players completed the TEOSQ on two occasions, 1 week apart. Correlation analyses indicated a stronger test-retest correlation for the Ego subscale than the Task subscale. Multivariate analysis of variance indicated stability for ego items but with significant increases in four task items. The proportion of test-retest agreement scores indicated that all ego items reported relatively poor stability statistics with test-retest scores within a range of +/-1, ranging from 82.7-86.9%. By contrast, all task items showed test-retest difference scores ranging from 92.5-99%, although further analysis indicated that four task subscale items increased significantly. Findings illustrated that correlational methods (Pearson product-moment, intraclass, and kappa) are influenced by the range in scores, and calculating the proportion of agreement of test-retest differences with a referent value of +/-1 could provide additional insight into the stability of the questionnaire. It is suggested that the item-by-item proportion of agreement method proposed by Nevill et al. (2001) should be used to supplement existing methods and could be especially helpful in identifying rogue items in the initial stages of psychometric questionnaire validation.
NASA Technical Reports Server (NTRS)
Gates, Thomas S.; Johnson, Theodore F.; Whitley, Karen S.
2005-01-01
The objective of this report is to contribute to the independent assessment of the Space Shuttle External Tank Foam Material. This report specifically addresses material modeling, characterization testing, data reduction methods, and data pedigree. A brief description of the External Tank foam materials, locations, and standard failure modes is provided to develop suitable background information. A review of mechanics based analysis methods from the open literature is used to provide an assessment of the state-of-the-art in material modeling of closed cell foams. Further, this report assesses the existing material property database and investigates sources of material property variability. The report presents identified deficiencies in testing methods and procedures, recommendations for additional testing as required, identification of near-term improvements that should be pursued, and long-term capabilities or enhancements that should be developed.
Nanita, Sergio C; Padivitage, Nilusha L T
2013-03-20
A sample extraction and purification procedure that uses ammonium-salt-induced acetonitrile/water phase separation was developed and demonstrated to be compatible with the recently reported method for pesticide residue analysis based on fast extraction and dilution flow injection mass spectrometry (FED-FI-MS). The ammonium salts evaluated were chloride, acetate, formate, carbonate, and sulfate. A mixture of NaCl and MgSO4, salts used in the well-known QuEChERS method, was also tested for comparison. With thermal decomposition/evaporation temperatures of <350°C, ammonium salts resulted in negligible ion source residue under typical electrospray conditions, leading to consistent method performance and less instrument cleaning. Although all ammonium salts tested induced acetonitrile/water phase separation, NH4Cl yielded the best performance, thus it was the preferred salting-out agent. The NH4Cl salting-out method was successfully coupled with FI/MS/MS and tested for fourteen pesticide active ingredients: chlorantraniliprole, cyantraniliprole, chlorimuron ethyl, oxamyl, methomyl, sulfometuron methyl, chlorsulfuron, triflusulfuron methyl, azimsulfuron, flupyrsulfuron methyl, aminocyclopyrachlor, aminocyclopyrachlor methyl, diuron and hexazinone. A validation study was conducted with nine complex matrices: sorghum, rice, grapefruit, canola, milk, eggs, beef, urine and blood plasma. The method is applicable to all analytes, except aminocyclopyrachlor. The method was deemed appropriate for quantitative analysis in 114 out of 126 analyte/matrix cases tested (applicability rate = 0.90). The NH4Cl salting-out extraction/cleanup allowed expansion of FI/MS/MS for analysis in food of plant and animal origin, and body fluids, with increased ruggedness and sensitivity, while maintaining high throughput (run time = 30 s/sample). Limits of quantitation (LOQs) of 0.01 mg kg(-1) (ppm), the 'well-accepted standard' in pesticide residue analysis, were achieved in >80% of cases tested, while limits of detection (LODs) were typically in the range of 0.001-0.01 mg kg(-1) (ppm). A comparison to a well-established HPLC/MS/MS method was also conducted, yielding comparable results, thus confirming the suitability of NH4Cl salting-out FI/MS/MS for pesticide residue analysis. Copyright © 2013 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Golovachyova, Viktoriya N.; Menlibekova, Gulbakhyt Zh.; Abayeva, Nella F.; Ten, Tatyana L.; Kogaya, Galina D.
2016-01-01
Using computer-based monitoring systems that rely on tests could be the most effective way of knowledge evaluation. The problem of objective knowledge assessment by means of testing takes on a new dimension in the context of new paradigms in education. The analysis of the existing test methods enabled us to conclude that tests with selected…
A study on quantifying COPD severity by combining pulmonary function tests and CT image analysis
NASA Astrophysics Data System (ADS)
Nimura, Yukitaka; Kitasaka, Takayuki; Honma, Hirotoshi; Takabatake, Hirotsugu; Mori, Masaki; Natori, Hiroshi; Mori, Kensaku
2011-03-01
This paper describes a novel method that can evaluate chronic obstructive pulmonary disease (COPD) severity by combining measurements of pulmonary function tests with measurements obtained from CT image analysis. There is no cure for COPD. However, with regular medical care and consistent patient compliance with treatments and lifestyle changes, the symptoms of COPD can be minimized and progression of the disease can be slowed. Therefore, many diagnosis methods based on CT image analysis have been proposed for quantifying COPD. Most diagnosis methods for COPD extract the lesions as low-attenuation areas (LAA) by thresholding and evaluate the COPD severity by calculating the proportion of LAA in the lung (LAA%). However, COPD is usually the result of a combination of two conditions, emphysema and chronic obstructive bronchitis. Therefore, previous methods based only on LAA% do not work well. The proposed method utilizes both kinds of information, the measurements of pulmonary function tests and the results of the chest CT image analysis, to evaluate COPD severity. In this paper, we utilize a multi-class AdaBoost classifier to combine both kinds of information and classify the COPD severity into five stages automatically. The experimental results revealed that the accuracy rate of the proposed method was 88.9% (resubstitution scheme) and 64.4% (leave-one-out scheme).
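A minimal sketch of this kind of classifier, combining pulmonary function measurements with a CT-derived LAA% feature in a multi-class AdaBoost, is shown below using scikit-learn. The features, their relationships to severity, and the sample size are simulated stand-ins, not the authors' data or implementation.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(4)

# Hypothetical patients: severity stage 0-4 drives three noisy features,
# [FEV1 %predicted, FEV1/FVC, LAA%] (two PFT measurements plus one CT measure).
n = 90
stage = rng.integers(0, 5, n)
fev1 = 100 - 18 * stage + rng.normal(0, 8, n)        # declines with severity
ratio = 0.78 - 0.09 * stage + rng.normal(0, 0.05, n)
laa = 3 + 9 * stage + rng.normal(0, 4, n)            # LAA% grows with severity
X = np.column_stack([fev1, ratio, laa])

# Multi-class AdaBoost over the combined feature set, evaluated leave-one-out
# as in the paper's second evaluation scheme.
clf = AdaBoostClassifier(n_estimators=200, random_state=0)
acc = cross_val_score(clf, X, stage, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {acc.mean():.3f}")
```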
Silbernagel, Karen M; Jechorek, Robert P; Kaufer, Amanda L; Johnson, Ronald L; Aleo, V; Brown, B; Buen, M; Buresh, J; Carson, M; Franklin, J; Ham, P; Humes, L; Husby, G; Hutchins, J; Jechorek, R; Jenkins, J; Kaufer, A; Kexel, N; Kora, L; Lam, L; Lau, D; Leighton, S; Loftis, M; Luc, S; Martin, J; Nacar, I; Nogle, J; Park, J; Schultz, A; Seymore, D; Smith, C; Smith, J; Thou, P; Ulmer, M; Voss, R; Weaver, V
2005-01-01
A multilaboratory study was conducted to compare the VIDAS LIS immunoassay with the standard cultural methods for the detection of Listeria in foods, using an enrichment modification of AOAC Official Method 999.06. The modified enrichment protocol was implemented to harmonize the VIDAS LIS assay with the VIDAS LMO2 assay. Five food types--brie cheese, vanilla ice cream, frozen green beans, frozen raw tilapia fish, and cooked roast beef--at 3 inoculation levels, were analyzed by each method. A total of 15 laboratories representing government and industry participated. In this study, 1206 test portions were tested, of which 1170 were used in the statistical analysis; 433 were positive by the VIDAS LIS assay and 396 were positive by the standard culture methods. A Chi-square analysis of each of the 5 food types, at the 3 inoculation levels tested, was performed. The resulting average Chi-square value, 0.42, indicated that, overall, there were no statistical differences between the VIDAS LIS assay and the standard methods at the 5% level of significance.
Comparison of analytical methods for the determination of histamine in reference canned fish samples
NASA Astrophysics Data System (ADS)
Jakšić, S.; Baloš, M. Ž.; Mihaljev, Ž.; Prodanov Radulović, J.; Nešić, K.
2017-09-01
Two screening methods for histamine in canned fish, an enzymatic test and a competitive direct enzyme-linked immunosorbent assay (CD-ELISA), were compared with the reversed-phase liquid chromatography (RP-HPLC) standard method. For the enzymatic and CD-ELISA methods, determination was conducted according to the manufacturers' manuals. For RP-HPLC, histamine was derivatized with dansyl chloride, followed by RP-HPLC and diode array detection. Results of analysis of canned fish, supplied as reference samples for proficiency testing, showed good agreement when histamine was present at higher concentrations (above 100 mg kg-1). At a lower level (16.95 mg kg-1), the enzymatic test produced somewhat higher results. Generally, analysis of four reference samples by CD-ELISA and RP-HPLC showed good agreement for histamine determination (r=0.977 in the concentration range 16.95-216 mg kg-1). The results show that the applied enzymatic test and CD-ELISA appear to be suitable screening methods for the determination of histamine in canned fish.
NASA Astrophysics Data System (ADS)
Obuchowski, Nancy A.; Bullen, Jennifer A.
2018-04-01
Receiver operating characteristic (ROC) analysis is a tool used to describe the discrimination accuracy of a diagnostic test or prediction model. While sensitivity and specificity are the basic metrics of accuracy, they have many limitations when characterizing test accuracy, particularly when comparing the accuracies of competing tests. In this article we review the basic study design features of ROC studies, illustrate sample size calculations, present statistical methods for measuring and comparing accuracy, and highlight commonly used ROC software. We include descriptions of multi-reader ROC study design and analysis, address frequently seen problems of verification and location bias, discuss clustered data, and provide strategies for testing endpoints in ROC studies. The methods are illustrated with a study of transmission ultrasound for diagnosing breast lesions.
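As a small worked complement to the accuracy metrics discussed here, the sketch below computes an empirical AUC and its Hanley-McNeil (1982) standard error from hypothetical test scores; it is a minimal single-reader, single-test illustration, not the multi-reader machinery the article covers.

```python
import numpy as np

def empirical_auc(neg, pos):
    # Empirical AUC: probability that a diseased case scores above a
    # non-diseased one, counting ties as 1/2 (Mann-Whitney form).
    diff = np.asarray(pos, float)[:, None] - np.asarray(neg, float)[None, :]
    return ((diff > 0) + 0.5 * (diff == 0)).mean()

def hanley_mcneil_se(auc, n_neg, n_pos):
    # Hanley-McNeil (1982) standard error of the empirical AUC.
    q1 = auc / (2 - auc)
    q2 = 2 * auc**2 / (1 + auc)
    var = (auc * (1 - auc) + (n_pos - 1) * (q1 - auc**2)
           + (n_neg - 1) * (q2 - auc**2)) / (n_neg * n_pos)
    return np.sqrt(var)

rng = np.random.default_rng(5)
neg = rng.normal(0.0, 1.0, 60)   # hypothetical scores, non-diseased cases
pos = rng.normal(1.2, 1.0, 40)   # hypothetical scores, diseased cases
auc = empirical_auc(neg, pos)
se = hanley_mcneil_se(auc, neg.size, pos.size)
print(f"AUC = {auc:.3f}, 95% CI ~ {auc - 1.96*se:.3f} to {auc + 1.96*se:.3f}")
```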
Murphy, Christine M; Devlin, John J; Beuhler, Michael C; Cheifetz, Paul; Maynard, Susan; Schwartz, Michael D; Kacinko, Sherri
2018-04-01
Nitromethane, found in fuels used for short distance racing, model cars, and model airplanes, produces a falsely elevated serum creatinine with standard creatinine analysis via the Jaffé method. Erroneous creatinine elevation often triggers extensive testing, leads to inaccurate diagnoses, and results in delayed or inappropriate medical interventions. Multiple reports in the literature identify "enzymatic assays" as an alternative method to detect the true value of creatinine, but this ambiguity does not help providers translate what type of enzymatic assay testing can be done in real time to determine if there is indeed false elevation. We report seven cases of ingested nitromethane where creatinine was determined via a Beckman Coulter® analyser using the Jaffé method, a Vitros® analyser, or i-Stat® point-of-care testing. Nitromethane was detected and semi-quantified using a common clinical toxic alcohol analysis method, and quantified by headspace-gas chromatography-mass spectrometry. When creatinine was determined using i-Stat® point-of-care testing or a Vitros® analyser, levels were within the normal range. Comparatively, all initial creatinine levels obtained via the Jaffé method were elevated. Nitromethane concentrations ranged from 42 to 310 μg/mL. These cases demonstrate reliable assessment of creatinine through other enzymatic methods using a Vitros® analyser or i-STAT®. Additionally, nitromethane is detectable and quantifiable using routine alcohols gas chromatography analysis and by headspace-gas chromatography-mass spectrometry.
NASA Technical Reports Server (NTRS)
Macwilkinson, D. G.; Blackerby, W. T.; Paterson, J. H.
1974-01-01
The degree of cruise drag correlation on the C-141A aircraft is determined between predictions based on wind tunnel test data, and flight test results. An analysis of wind tunnel tests on a 0.0275 scale model at Reynolds numbers up to 3.05 million, based on the mean aerodynamic chord (MAC), is reported. Model support interference corrections are evaluated through a series of tests, and fully corrected model data are analyzed to provide details on model component interference factors. It is shown that predicted minimum profile drag for the complete configuration agrees within 0.75% of flight test data, using a wind tunnel extrapolation method based on flat plate skin friction and component shape factors. An alternative method of extrapolation, based on computed profile drag from a subsonic viscous theory, results in a prediction four percent lower than flight test data.
Microfluidic systems and methods of transport and lysis of cells and analysis of cell lysate
Culbertson, Christopher T.; Jacobson, Stephen C.; McClain, Maxine A.; Ramsey, J. Michael
2004-08-31
Microfluidic systems and methods are disclosed which are adapted to transport and lyse cellular components of a test sample for analysis. The disclosed microfluidic systems and methods, which employ an electric field to rupture the cell membrane, cause unusually rapid lysis, thereby minimizing continued cellular activity and resulting in greater accuracy of analysis of cell processes.
Microfluidic systems and methods for transport and lysis of cells and analysis of cell lysate
Culbertson, Christopher T [Oak Ridge, TN; Jacobson, Stephen C [Knoxville, TN; McClain, Maxine A [Knoxville, TN; Ramsey, J Michael [Knoxville, TN
2008-09-02
Microfluidic systems and methods are disclosed which are adapted to transport and lyse cellular components of a test sample for analysis. The disclosed microfluidic systems and methods, which employ an electric field to rupture the cell membrane, cause unusually rapid lysis, thereby minimizing continued cellular activity and resulting in greater accuracy of analysis of cell processes.
A kernel regression approach to gene-gene interaction detection for case-control studies.
Larson, Nicholas B; Schaid, Daniel J
2013-11-01
Gene-gene interactions are increasingly being addressed as a potentially important contributor to the variability of complex traits. Consequently, attention has moved beyond single-locus analysis of association to more complex genetic models. Although several single-marker approaches toward interaction analysis have been developed, such methods suffer from very high testing dimensionality and do not take advantage of existing information, notably the definition of genes as functional units. Here, we propose a comprehensive family of gene-level score tests for identifying genetic elements of disease risk, in particular pairwise gene-gene interactions. Using kernel machine methods, we devise score-based variance component tests under a generalized linear mixed model framework. We conducted simulations based upon coalescent genetic models to evaluate the performance of our approach under a variety of disease models. These simulations indicate that our methods are generally higher powered than alternative gene-level approaches and at worst competitive with exhaustive single-nucleotide polymorphism (SNP)-level analyses. Furthermore, we observe that simulated epistatic effects resulted in significant marginal testing results for the involved genes regardless of whether or not true main effects were present. We detail the benefits of our methods and discuss potential genome-wide analysis strategies for gene-gene interaction analysis in a case-control study design. © 2013 WILEY PERIODICALS, INC.
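The flavor of such a score-based variance component test can be sketched as follows. This is a deliberately simplified stand-in: it uses an intercept-only null model, a product-of-linear-kernels interaction similarity, and a permutation null in place of the paper's GLMM machinery and asymptotic distribution.

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulated case-control data: minor-allele counts (0/1/2) for two genes.
n, p1, p2 = 300, 5, 4
g1 = rng.integers(0, 3, (n, p1)).astype(float)
g2 = rng.integers(0, 3, (n, p2)).astype(float)
y = rng.binomial(1, 0.5, n).astype(float)   # null phenotype, for illustration

# Interaction kernel: similarity in the space of all between-gene SNP x SNP
# products, i.e. the element-wise product of the two genes' linear kernels.
K = (g1 @ g1.T) * (g2 @ g2.T)

def score_stat(y, K):
    # Variance-component score statistic Q = r' K r, with residuals r from an
    # intercept-only null model (a kernel machine, SKAT-style form).
    r = y - y.mean()
    return r @ K @ r

q_obs = score_stat(y, K)
# Permutation p-value in place of the asymptotic mixture-of-chi-squares null.
q_perm = np.array([score_stat(rng.permutation(y), K) for _ in range(2000)])
print(f"Q = {q_obs:.1f}, permutation p = {(q_perm >= q_obs).mean():.3f}")
```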
Assessment of current AASHTO LRFD methods for static pile capacity analysis in Rhode Island soils.
DOT National Transportation Integrated Search
2013-07-01
This report presents an assessment of current AASHTO LRFD methods for static pile capacity analysis in Rhode Island soils. Current static capacity methods and associated resistance factors are based on pile load test data in sands and clays. Some...
A meta-analysis of in vitro antibiotic synergy against Acinetobacter baumannii.
March, Gabriel A; Bratos, Miguel A
2015-12-01
The aim of the work was to describe the different in vitro models for testing synergism of antibiotics and to gather the results of antibiotic synergy against multidrug-resistant Acinetobacter baumannii (MDR-Ab). The original articles were obtained from different web sources. In order to compare the results obtained by the different methods for synergy testing, the Pearson chi-square and Fisher tests were used. Moreover, a non-parametric chi-square test was used to compare the frequency distribution in each analysed manuscript. In the current meta-analysis, 24 manuscripts, which encompassed 2016 tests of in vitro synergism of different antimicrobials against MDR-Ab, were reviewed. Checkerboard synergy testing was used in 11 studies, encompassing 1086 tests (53.9%); time-kill assays were applied in 12 studies, encompassing 359 tests (17.8%); gradient diffusion methods were used in seven studies, encompassing 293 tests (14.5%); and, finally, time-kill plus checkerboard were applied in two studies, encompassing 278 tests (13.8%). By comparing these data, checkerboard and time-kill methods were significantly more used than gradient diffusion methods (p<0.005). Regarding synergy rates obtained on the basis of the applied method, checkerboard provided 227 tests (20.9%) with a synergistic effect; time-kill assays yielded 222 tests (61.8%) with a synergistic effect; gradient diffusion methods provided only 29 tests (9.9%) with a synergistic effect; and, finally, time-kill plus checkerboard yielded just 15 tests (5.4%) with a synergistic effect. When comparing these percentages, synergy rates reported by time-kill methods were significantly higher than those obtained by checkerboard and gradient diffusion methods (p<0.005). On the basis of the reviewed data, combinations of a bactericidal antibiotic plus tigecycline, vancomycin, or teicoplanin are not recommended. The best combinations of antibiotics are those which include bactericidal antibiotics such as carbapenems, fosfomycin, amikacin, polymyxins, rifampicin, and ampicillin/sulbactam. Copyright © 2015. Published by Elsevier B.V.
Lee, Sang Hun; Yoo, Myung Hoon; Park, Jun Woo; Kang, Byung Chul; Yang, Chan Joo; Kang, Woo Suk; Ahn, Joong Ho; Chung, Jong Woo; Park, Hong Ju
2018-06-01
To evaluate whether video head impulse test (vHIT) gains are dependent on the measuring device and method of analysis. Prospective study. vHIT was performed in 25 healthy subjects using two devices simultaneously. vHIT gains were compared between these instruments and using five different methods of comparing position and velocity gains during head movement intervals. The two devices produced different vHIT gain results with the same method of analysis. There were also significant differences in the vHIT gains measured using different analytical methods. The gain analytic method that compares the areas under the velocity curve (AUC) of the head and eye movements during head movements showed lower vHIT gains than a method that compared the peak velocities of the head and eye movements. The former method produced the vHIT gain with the smallest standard deviation among the five procedures tested in this study. vHIT gains differ in normal subjects depending on the device and method of analysis used, suggesting that it is advisable for each device to have its own normal values. Gain calculations that compare the AUC of the head and eye movements during the head movements show the smallest variance.
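To make the distinction between the two gain definitions concrete, the sketch below computes an AUC-based gain and a peak-velocity gain from an idealized head impulse; the waveform, sampling rate, and gain value are hypothetical.

```python
import numpy as np

# Idealized head and eye angular-velocity traces (deg/s) over one ~160 ms
# head impulse, sampled at 250 Hz; the eye trace undershoots slightly.
t = np.arange(0, 0.16, 1 / 250.0)
head = 150 * np.sin(np.pi * t / 0.16)   # bell-shaped head velocity profile
eye = 0.92 * head                        # idealized VOR response (gain 0.92)

# AUC method: ratio of the areas under the eye and head velocity curves over
# the head-movement interval (trapezoidal integration).
gain_auc = np.trapz(np.abs(eye), t) / np.trapz(np.abs(head), t)
# Peak-velocity method: ratio of maximum eye to maximum head velocity.
gain_peak = np.abs(eye).max() / np.abs(head).max()
print(f"AUC gain: {gain_auc:.3f}, peak-velocity gain: {gain_peak:.3f}")
```

On real, noisy traces the two definitions diverge, which is consistent with the AUC variant showing the smaller standard deviation in this study.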
NASA Astrophysics Data System (ADS)
Endreny, Theodore A.; Pashiardis, Stelios
2007-02-01
Robust and accurate estimates of rainfall frequencies are difficult to make with short, arid-climate rainfall records; however, new regional and global methods were used to supplement such a constrained 15-34 yr record in Cyprus. The impact of supplementing rainfall frequency analysis with the regional and global approaches was measured with relative bias and root mean square error (RMSE) values. Analysis considered 42 stations with 8 time intervals (5-360 min) in four regions delineated by proximity to sea and elevation. Regional statistical algorithms found the sites passed discordancy tests of coefficient of variation, skewness and kurtosis, while heterogeneity tests revealed the regions were homogeneous to mildly heterogeneous. Rainfall depths were simulated in the regional analysis method 500 times, and then goodness-of-fit tests identified the best candidate distribution as the generalized extreme value (GEV) Type II. In the regional analysis, the method of L-moments was used to estimate location, shape, and scale parameters. In the global based analysis, the distribution was a priori prescribed as GEV Type II, a shape parameter was a priori set to 0.15, and a time interval term was constructed to use one set of parameters for all time intervals. Relative RMSE values were approximately equal at 10% for the regional and global methods when regions were compared, but when time intervals were compared the global method RMSE had a parabolic-shaped time interval trend. Relative bias values were also approximately equal for both methods when regions were compared, but again a parabolic-shaped time interval trend was found for the global method. The global method relative RMSE and bias trended with time interval, which may be caused by fitting a single scale value for all time intervals.
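As a compact illustration of the frequency-analysis step, the sketch below fits a GEV distribution to simulated annual-maximum depths and reads off return levels. Note that the study estimated parameters by L-moments; scipy's maximum-likelihood fit is used here only as a readily available substitute, and all values are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Simulated annual-maximum 60-min rainfall depths (mm) for one station; in
# scipy's parameterization, c < 0 gives the heavy-tailed (Type II) GEV.
annual_max = stats.genextreme.rvs(c=-0.15, loc=20, scale=6,
                                  size=30, random_state=rng)

# Fit the GEV by maximum likelihood (the study used L-moments instead).
c, loc, scale = stats.genextreme.fit(annual_max)

# T-year return level: the (1 - 1/T) quantile of the fitted distribution.
for T in (10, 50, 100):
    depth = stats.genextreme.isf(1.0 / T, c, loc, scale)
    print(f"{T:4d}-yr 60-min depth: {depth:6.1f} mm")
```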
[Alternatives to animal testing].
Fabre, Isabelle
2009-11-01
The use of alternative methods to animal testing is an integral part of the 3Rs concept (refine, reduce, replace) defined by Russell & Burch in 1959. These approaches include in silico methods (databases and computer models), in vitro physicochemical analysis, biological methods using bacteria or isolated cells, reconstructed enzyme systems, and reconstructed tissues. Emerging "omic" methods used in integrated approaches further help to reduce animal use, while stem cells offer promising approaches to toxicologic and pathophysiologic studies, along with organotypic cultures and bio-artificial organs. Only a few alternative methods can so far be used in stand-alone tests as substitutes for animal testing. The best way to use these methods is to integrate them in integrated testing strategies (ITS), in which animals are only used as a last resort.
On Bayesian Testing of Additive Conjoint Measurement Axioms Using Synthetic Likelihood.
Karabatsos, George
2018-06-01
This article introduces a Bayesian method for testing the axioms of additive conjoint measurement. The method is based on an importance sampling algorithm that performs likelihood-free, approximate Bayesian inference using a synthetic likelihood to overcome the analytical intractability of this testing problem. This new method improves upon previous methods because it provides an omnibus test of the entire hierarchy of cancellation axioms, beyond double cancellation. It does so while accounting for the posterior uncertainty that is inherent in the empirical orderings that are implied by these axioms, together. The new method is illustrated through a test of the cancellation axioms on a classic survey data set, and through the analysis of simulated data.
Negeri, Zelalem F; Shaikh, Mateen; Beyene, Joseph
2018-05-11
Diagnostic or screening tests are widely used in medical fields to classify patients according to their disease status. Several statistical models for meta-analysis of diagnostic test accuracy studies have been developed to synthesize test sensitivity and specificity of a diagnostic test of interest. Because of the correlation between test sensitivity and specificity, modeling the two measures using a bivariate model is recommended. In this paper, we extend the current standard bivariate linear mixed model (LMM) by proposing two variance-stabilizing transformations: the arcsine square root and the Freeman-Tukey double arcsine transformation. We compared the performance of the proposed methods with the standard method through simulations using several performance measures. The simulation results showed that our proposed methods performed better than the standard LMM in terms of bias, root mean square error, and coverage probability in most of the scenarios, even when data were generated assuming the standard LMM. We also illustrated the methods using two real data sets. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
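For reference, the two proposed transformations are easy to state in code. The sketch below applies both to hypothetical per-study sensitivity counts; in the paper these transformed values (rather than raw proportions) enter the bivariate LMM.

```python
import numpy as np

def arcsine_sqrt(x, n):
    # Arcsine square-root transform of a proportion x/n.
    return np.arcsin(np.sqrt(x / n))

def freeman_tukey(x, n):
    # Freeman-Tukey double-arcsine transform of x events out of n.
    return 0.5 * (np.arcsin(np.sqrt(x / (n + 1)))
                  + np.arcsin(np.sqrt((x + 1) / (n + 1))))

# Hypothetical primary studies: true positives among diseased subjects.
tp = np.array([18, 45, 9, 30, 22])
n = np.array([20, 50, 10, 35, 25])
print("raw sensitivities:", np.round(tp / n, 3))
print("arcsine sqrt:     ", np.round(arcsine_sqrt(tp, n), 3))
print("Freeman-Tukey:    ", np.round(freeman_tukey(tp, n), 3))
# Approximate variances are 1/(4n) and 1/(4n + 2) respectively, which
# stabilizes the within-study variance across studies of different size.
```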
NASA Astrophysics Data System (ADS)
Noor, M. J. Md; Ibrahim, A.; Rahman, A. S. A.
2018-04-01
Small-strain triaxial test measurement is considered to be significantly more accurate than external strain measurement using the conventional method, owing to the systematic errors normally associated with the test. Three submersible miniature linear variable differential transducers (LVDTs) were mounted on yokes clamped directly onto the soil sample, spaced equally at 120° from one another. The setup, using a 0.4 N resolution load cell and a 16-bit AD converter, was capable of consistently resolving displacements of less than 1 µm and measuring axial strains ranging from less than 0.001% to 2.5%. Further analysis of the small-strain local measurement data was performed using the new Normalized Rotational Multiple Yield Surface Framework (NRMYSF) method and compared with the existing Rotational Multiple Yield Surface Framework (RMYSF) prediction method. The prediction of shear strength based on the combined intrinsic curvilinear shear strength envelope using small-strain triaxial test data confirmed the significant improvement and reliability of the measurement and analysis methods. Moreover, the NRMYSF method shows excellent data prediction and a significant improvement toward more reliable prediction of soil strength that can reduce the cost and time of experimental laboratory tests.
Field demonstration of on-site analytical methods for TNT and RDX in ground water
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craig, H.; Ferguson, G.; Markos, A.
1996-12-31
A field demonstration was conducted to assess the performance of eight commercially-available and emerging colorimetric, immunoassay, and biosensor on-site analytical methods for the explosives 2,4,6-trinitrotoluene (TNT) and hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX) in ground water and leachate at the Umatilla Army Depot Activity, Hermiston, Oregon and US Naval Submarine Base, Bangor, Washington, Superfund sites. Ground water samples were analyzed by each of the on-site methods and results compared to laboratory analysis using high performance liquid chromatography (HPLC) with EPA SW-846 Method 8330. The commercial methods evaluated include the EnSys, Inc., TNT and RDX colorimetric test kits (EPA SW-846 Methods 8515 and 8510) with a solid phase extraction (SPE) step, the DTECH/EM Science TNT and RDX immunoassay test kits (EPA SW-846 Methods 4050 and 4051), and the Ohmicron TNT immunoassay test kit. The emerging methods tested include the antibody-based Naval Research Laboratory (NRL) Continuous Flow Immunosensor (CFI) for TNT and RDX, and the Fiber Optic Biosensor (FOB) for TNT. Accuracy of the on-site methods was evaluated using linear regression analysis and relative percent difference (RPD) comparison criteria. Over the range of conditions tested, the colorimetric methods showed the highest accuracy of the methods evaluated for TNT and RDX. The colorimetric method was selected for routine ground water monitoring at the Umatilla site, and further field testing of the NRL CFI and FOB biosensors will continue at both Superfund sites.
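The RPD criterion mentioned here is simple enough to state exactly; a short sketch with hypothetical paired results follows (pairing a kit result against laboratory HPLC is the comparison the demonstration used, but the numbers are invented):

```python
def relative_percent_difference(onsite, lab):
    # RPD between an on-site result and the laboratory reference value.
    return abs(onsite - lab) / ((onsite + lab) / 2.0) * 100.0

# Hypothetical paired TNT results (ug/L): on-site kit vs. laboratory HPLC.
pairs = [(12.0, 11.2), (55.0, 60.1), (3.1, 2.8)]
for onsite, lab in pairs:
    rpd = relative_percent_difference(onsite, lab)
    print(f"on-site {onsite:5.1f} vs lab {lab:5.1f} -> RPD = {rpd:5.1f}%")
```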
40 CFR 98.7 - What standardized methods are incorporated by reference into this part?
Code of Federal Regulations, 2014 CFR
2014-07-01
...-B2959, (800) 262-1373, http://www.astm.org. (1) ASTM C25-06 Standard Test Method for Chemical Analysis...), § 98.174(b), § 98.184(b), § 98.194(c), and § 98.334(b). (2) ASTM C114-09 Standard Test Methods for... Dry Cleaning Solvent), IBR approved for § 98.6. (4) ASTM D240-02 (Reapproved 2007) Standard Test...
NASA Technical Reports Server (NTRS)
Marley, Mike
2008-01-01
The focus of this paper will be on the thermal balance testing for the Operationally Responsive Space Standard Bus Battery. The Standard Bus thermal design required that the battery be isolated from the bus itself. This required the battery to have its own thermal control, including heaters and a radiator surface. Since the battery was not ready for testing during the overall bus thermal balance testing, a separate test was conducted to verify the thermal design for the battery. This paper will discuss in detail the test setup, test procedure, and results from this test. Additionally, this paper will consider the methods used to determine the heat dissipation of the battery during charge and discharge. The heat dissipation of lithium-ion batteries is relatively poorly characterized and hard to quantify. The methods used during the test and the post-test analysis to estimate the heat dissipation of the battery will be discussed.
Ranking metrics in gene set enrichment analysis: do they matter?
Zyla, Joanna; Marczyk, Michal; Weiner, January; Polanska, Joanna
2017-05-12
There exist many methods for describing the complex relation between changes of gene expression in molecular pathways or gene ontologies under different experimental conditions. Among them, Gene Set Enrichment Analysis seems to be one of the most commonly used (over 10,000 citations). An important parameter, which can affect the final result, is the choice of a metric for the ranking of genes. Applying a default ranking metric may lead to poor results. In this work, 28 benchmark data sets were used to evaluate the sensitivity and false positive rate of gene set analysis for 16 different ranking metrics, including new proposals. Furthermore, the robustness of the chosen methods to sample size was tested. Using the k-means clustering algorithm, a group of four metrics with the highest performance in terms of overall sensitivity, overall false positive rate, and computational load was established, i.e. the absolute value of the Moderated Welch Test statistic, the Minimum Significant Difference, the absolute value of the Signal-to-Noise ratio, and the Baumgartner-Weiss-Schindler test statistic. For false positive rate estimation, all selected ranking metrics were robust with respect to sample size. For sensitivity, the absolute value of the Moderated Welch Test statistic and the absolute value of the Signal-to-Noise ratio gave stable results, while the Baumgartner-Weiss-Schindler and Minimum Significant Difference metrics showed better results for larger sample sizes. Finally, the Gene Set Enrichment Analysis method with all tested ranking metrics was parallelised and implemented in MATLAB, and is available at https://github.com/ZAEDPolSl/MrGSEA . Choosing a ranking metric in Gene Set Enrichment Analysis has a critical impact on the results of pathway enrichment analysis. The absolute value of the Moderated Welch Test has the best overall sensitivity and the Minimum Significant Difference has the best overall specificity of gene set analysis. When the number of non-normally distributed genes is high, using the Baumgartner-Weiss-Schindler test statistic gives better outcomes. Also, it finds more enriched pathways than the other tested metrics, which may induce new biological discoveries.
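Two of the simpler ranking metrics discussed above can be written out directly. The sketch below (in Python, though the authors' implementation is in MATLAB) computes the Signal-to-Noise ratio and the plain Welch t statistic for one hypothetical gene; the moderated variant additionally shrinks the variance estimates toward a prior, which is not reproduced here.

```python
import numpy as np

def signal_to_noise(x1, x0):
    # Signal-to-Noise ratio ranking metric: (m1 - m0) / (s1 + s0).
    return (x1.mean() - x0.mean()) / (x1.std(ddof=1) + x0.std(ddof=1))

def welch_t(x1, x0):
    # Welch t statistic with unpooled variances.
    v1, v0 = x1.var(ddof=1) / x1.size, x0.var(ddof=1) / x0.size
    return (x1.mean() - x0.mean()) / np.sqrt(v1 + v0)

rng = np.random.default_rng(8)
# One hypothetical gene: expression in 10 treated vs. 10 control samples.
treated = rng.normal(1.0, 1.0, 10)
control = rng.normal(0.0, 1.0, 10)
# Genes are ranked by the absolute value of the chosen metric before GSEA.
print(f"|SNR| = {abs(signal_to_noise(treated, control)):.3f}, "
      f"|Welch t| = {abs(welch_t(treated, control)):.3f}")
```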
Feasibility analysis of EDXRF method to detect heavy metal pollution in ecological environment
NASA Astrophysics Data System (ADS)
Hao, Zhixu; Qin, Xulei
2018-02-01
Changes in heavy metal content in the water environment, soil, and plants can reflect changes in heavy metal pollution in the ecological environment, so monitoring heavy metal content in water, soil, and plants is an important means of tracking pollution trends. However, the content of heavy metals in nature is very low, the background compositions of water, soil, and plant samples are complex, and many interfering factors in an EDXRF system can affect the spectral analysis results and reduce detection accuracy. A comparative analysis of several heavy metal detection methods leads to the conclusion that, for testing heavy metal pollution of soil in the ecological environment, the EDXRF method is superior to other chemical methods in testing accuracy and feasibility.
Test versus analysis: A discussion of methods
NASA Technical Reports Server (NTRS)
Butler, T. G.
1986-01-01
Some techniques for comparing structural vibration data determined from test and analysis are discussed. Orthogonality is a general category of one group, correlation is a second, synthesis is a third, and matrix improvement is a fourth. Advantages and shortcomings of the methods are explored with suggestions as to how they can complement one another. The purpose of comparing vibration data from test and analysis for a given structure is to find out whether each is representing the dynamic properties of the structure in the same way. Specifically: whether mode shapes are alike; whether the frequencies of the modes are alike; whether modes appear in the same frequency sequence; and, if they are not alike, how to judge which to believe.
Bevilacqua, Elisa; Jani, Jacques C; Letourneau, Alexandra; Duiella, Silvia F; Kleinfinger, Pascale; Lohmann, Laurence; Resta, Serena; Cos Sanchez, Teresa; Fils, Jean-François; Mirra, Marilyn; Benachi, Alexandra; Costa, Jean-Marc
2018-06-13
To evaluate the failure rate and performance of cell-free DNA (cfDNA) testing, mainly in terms of detection rates for trisomy 21, performed by 2 laboratories using different analytical methods. cfDNA testing was performed on 2,870 pregnancies with the Harmony™ Prenatal Test using the targeted digital analysis of selected regions (DANSR) method, and on 2,635 pregnancies with the "Cerba test" using the genome-wide massively parallel sequencing (GW-MPS) method, with available outcomes. Propensity score analysis was used to match patients between the 2 groups. A comparison of the detection rates for trisomy 21 between the 2 laboratories was made. In all, 2,811 patients in the Harmony group and 2,530 patients in the Cerba group had no trisomy 21, 18, or 13. Postmatched comparisons of the patient characteristics indicated a higher no-result rate in the Harmony group (1.30%) than in the Cerba group (0.75%; p = 0.039). All 41 cases of trisomy 21 in the Harmony group and 93 cases in the Cerba group were detected. Both methods of cfDNA testing showed low no-result rates and a comparable performance in detecting trisomy 21; yet GW-MPS had a slightly lower no-result rate than the DANSR method. © 2018 S. Karger AG, Basel.
In-Flight System Identification
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
1998-01-01
A method is proposed and studied whereby the system identification cycle consisting of experiment design and data analysis can be repeatedly implemented aboard a test aircraft in real time. This adaptive in-flight system identification scheme has many advantages, including increased flight test efficiency, adaptability to dynamic characteristics that are imperfectly known a priori, in-flight improvement of data quality through iterative input design, and immediate feedback of the quality of flight test results. The technique uses equation error in the frequency domain with a recursive Fourier transform for the real time data analysis, and simple design methods employing square wave input forms to design the test inputs in flight. Simulation examples are used to demonstrate that the technique produces increasingly accurate model parameter estimates resulting from sequentially designed and implemented flight test maneuvers. The method has reasonable computational requirements, and could be implemented aboard an aircraft in real time.
Liu, Boquan; Polce, Evan; Sprott, Julien C; Jiang, Jack J
2018-05-17
The purpose of this study is to introduce a chaos level test to evaluate linear and nonlinear voice type classification method performances under varying signal chaos conditions without subjective impression. Voice signals were constructed with differing degrees of noise to model signal chaos. Within each noise power, 100 Monte Carlo experiments were applied to analyze the output of jitter, shimmer, correlation dimension, and spectrum convergence ratio. The computational output of the 4 classifiers was then plotted against signal chaos level to investigate the performance of these acoustic analysis methods under varying degrees of signal chaos. A diffusive behavior detection-based chaos level test was used to investigate the performances of different voice classification methods. Voice signals were constructed by varying the signal-to-noise ratio to establish differing signal chaos conditions. Chaos level increased sigmoidally with increasing noise power. Jitter and shimmer performed optimally when the chaos level was less than or equal to 0.01, whereas correlation dimension was capable of analyzing signals with chaos levels of less than or equal to 0.0179. Spectrum convergence ratio demonstrated proficiency in analyzing voice signals with all chaos levels investigated in this study. The results of this study corroborate the performance relationships observed in previous studies and, therefore, demonstrate the validity of the validation test method. The presented chaos level validation test could be broadly utilized to evaluate acoustic analysis methods and establish the most appropriate methodology for objective voice analysis in clinical practice.
Epistasis analysis for quantitative traits by functional regression model.
Zhang, Futao; Boerwinkle, Eric; Xiong, Momiao
2014-06-01
The critical barrier in interaction analysis for rare variants is that most traditional statistical methods for testing interactions were originally designed for testing the interaction between common variants and are difficult to apply to rare variants because of their prohibitive computational time and poor ability. The great challenges for successful detection of interactions with next-generation sequencing (NGS) data are (1) lack of methods for interaction analysis with rare variants, (2) severe multiple testing, and (3) time-consuming computations. To meet these challenges, we shift the paradigm of interaction analysis between two loci to interaction analysis between two sets of loci or genomic regions and collectively test interactions between all possible pairs of SNPs within two genomic regions. In other words, we take a genome region as a basic unit of interaction analysis and use high-dimensional data reduction and functional data analysis techniques to develop a novel functional regression model to collectively test interactions between all possible pairs of single nucleotide polymorphisms (SNPs) within two genome regions. By intensive simulations, we demonstrate that the functional regression models for interaction analysis of the quantitative trait have the correct type 1 error rates and a much better ability to detect interactions than the current pairwise interaction analysis. The proposed method was applied to exome sequence data from the NHLBI's Exome Sequencing Project (ESP) and CHARGE-S study. We discovered 27 pairs of genes showing significant interactions after applying the Bonferroni correction (P-values < 4.58 × 10(-10)) in the ESP, and 11 were replicated in the CHARGE-S study. © 2014 Zhang et al.; Published by Cold Spring Harbor Laboratory Press.
Smith, Douglas D.; Hiller, John M.
1998-01-01
The present invention is an improved method and related apparatus for quantitatively analyzing machine working fluids and other aqueous compositions such as wastewater which contain various mixtures of cationic, neutral, and/or anionic surfactants, soluble soaps, and the like. The method utilizes a single-phase, non-aqueous, reactive titration composition containing water insoluble bismuth nitrate dissolved in glycerol for the titration reactant. The chemical reaction of the bismuth ion and glycerol with the surfactant in the test solutions results in formation of micelles, changes in micelle size, and the formation of insoluble bismuth soaps. These soaps are quantified by physical and chemical changes in the aqueous test solution. Both classical potentiometric analysis and turbidity measurements have been used as sensing techniques to determine the quantity of surfactant present in test solutions. This method is amenable to the analysis of various types of new, in-use, dirty or decomposed surfactants and detergents. It is a quick and efficient method utilizing a single-phase reaction without needing a separate extraction from the aqueous solution. It is adaptable to automated control with simple and reliable sensing methods. The method is applicable to a variety of compositions with concentrations from about 1% to about 10% weight. It is also applicable to the analysis of waste water containing surfactants with appropriate pre-treatments for concentration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, D.D.; Hiller, J.M.
1998-02-24
The present invention is an improved method and related apparatus for quantitatively analyzing machine working fluids and other aqueous compositions such as wastewater which contain various mixtures of cationic, neutral, and/or anionic surfactants, soluble soaps, and the like. The method utilizes a single-phase, non-aqueous, reactive titration composition containing water insoluble bismuth nitrate dissolved in glycerol for the titration reactant. The chemical reaction of the bismuth ion and glycerol with the surfactant in the test solutions results in formation of micelles, changes in micelle size, and the formation of insoluble bismuth soaps. These soaps are quantified by physical and chemical changes in the aqueous test solution. Both classical potentiometric analysis and turbidity measurements have been used as sensing techniques to determine the quantity of surfactant present in test solutions. This method is amenable to the analysis of various types of new, in-use, dirty or decomposed surfactants and detergents. It is a quick and efficient method utilizing a single-phase reaction without needing a separate extraction from the aqueous solution. It is adaptable to automated control with simple and reliable sensing methods. The method is applicable to a variety of compositions with concentrations from about 1% to about 10% weight. It is also applicable to the analysis of waste water containing surfactants with appropriate pre-treatments for concentration. 1 fig.
2011-01-01
Background Dementia and cognitive impairment associated with aging are a major medical and social concern. Neuropsychological testing is a key element in the diagnostic procedures of Mild Cognitive Impairment (MCI), but it currently has limited value in the prediction of progression to dementia. We advance the hypothesis that newer statistical classification methods derived from data mining and machine learning, such as Neural Networks, Support Vector Machines and Random Forests, can improve the accuracy, sensitivity and specificity of predictions obtained from neuropsychological testing. Seven nonparametric classifiers derived from data mining methods (Multilayer Perceptron Neural Networks, Radial Basis Function Neural Networks, Support Vector Machines, CART, CHAID and QUEST Classification Trees, and Random Forests) were compared to three traditional classifiers (Linear Discriminant Analysis, Quadratic Discriminant Analysis and Logistic Regression) in terms of overall classification accuracy, specificity, sensitivity, area under the ROC curve and Press' Q. Model predictors were 10 neuropsychological tests currently used in the diagnosis of dementia. Statistical distributions of classification parameters obtained from a 5-fold cross-validation were compared using Friedman's nonparametric test. Results Press' Q test showed that all classifiers performed better than chance alone (p < 0.05). Support Vector Machines showed the largest overall classification accuracy (median (Me) = 0.76) and area under the ROC curve (Me = 0.90). However, this method showed high specificity (Me = 1.0) but low sensitivity (Me = 0.3). Random Forests ranked second in overall accuracy (Me = 0.73), with high area under the ROC curve (Me = 0.73), specificity (Me = 0.73) and sensitivity (Me = 0.64). Linear Discriminant Analysis also showed acceptable overall accuracy (Me = 0.66), with acceptable area under the ROC curve (Me = 0.72), specificity (Me = 0.66) and sensitivity (Me = 0.64). The remaining classifiers showed overall classification accuracy above a median value of 0.63, but for most, sensitivity was around or even lower than a median value of 0.5. Conclusions When sensitivity, specificity and overall classification accuracy are taken into account, Random Forests and Linear Discriminant Analysis rank first among all the classifiers tested in prediction of dementia using several neuropsychological tests. These methods may be used to improve the accuracy, sensitivity and specificity of predictions of dementia from neuropsychological testing. PMID:21849043
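The comparison design (a 5-fold cross-validation of traditional and data mining classifiers on neuropsychological predictors) can be sketched with scikit-learn. Everything here is illustrative: the data are random placeholders and the classifier settings are not those of the study.

```python
# Illustrative re-creation of the cross-validated classifier comparison;
# random placeholder data, settings not those of the study.
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))     # placeholder: 10 neuropsychological test scores
y = rng.integers(0, 2, 200)        # placeholder: progression to dementia (0/1)

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=1)
models = {
    "LDA": LinearDiscriminantAnalysis(),
    "SVM": SVC(),
    "RF": RandomForestClassifier(n_estimators=500, random_state=1),
}
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc").mean()
    sens = cross_val_score(model, X, y, cv=cv, scoring="recall").mean()
    print(f"{name}: AUC={auc:.2f} sensitivity={sens:.2f}")
```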
Method comparison for the determination of hydraulic conductivity
NASA Astrophysics Data System (ADS)
Storz, Katharina; Steger, Hagen; Wagner, Valentin; Bayer, Peter; Blum, Philipp
2017-06-01
Knowing the hydraulic conductivity (K) is a precondition for understanding groundwater flow processes in the subsurface. Numerous laboratory and field methods for the determination of hydraulic conductivity exist, which can lead to significantly different results. In order to quantify the variability of these various methods, the hydraulic conductivity was examined for an industrial silica sand (Dorsilit) using four different methods: (1) grain-size analysis, (2) the Kozeny-Carman approach, (3) permeameter tests and (4) flow rate experiments in large-scale tank experiments. Due to the large volume of the artificially built aquifer, the tank experiment results are assumed to be the most representative. Hydraulic conductivity values derived from permeameter tests show only minor deviation, while results of the empirically evaluated grain-size analysis are about one order of magnitude higher and show large variance. The latter was confirmed by analysis of several methods for the determination of K-values found in the literature; we therefore generally question the suitability of grain-size analyses and strongly recommend the use of permeameter tests.
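For reference, a common form of the Kozeny-Carman estimate (approach 2 above) is sketched below; the exact formulation and coefficients used in the study are not given in the abstract, so this form is an assumption.

```python
# Kozeny-Carman sketch; rho, g, mu for water at ~20 degC, 180 as the classical
# constant, d10 as the effective grain diameter -- all assumptions.
def kozeny_carman(d10_m, porosity, rho=998.0, g=9.81, mu=1.0e-3):
    """Hydraulic conductivity K in m/s from effective grain diameter in m."""
    return (rho * g / mu) * (porosity**3 / (1.0 - porosity)**2) * d10_m**2 / 180.0

print(kozeny_carman(d10_m=2.0e-4, porosity=0.38))  # ~3e-4 m/s, a medium sand
```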
Non-Volatile Residue (NVR) Contamination from Dry Handling and Solvent Cleaning
NASA Technical Reports Server (NTRS)
Sovinski, Marjorie F.
2009-01-01
This slide presentation reviews testing for Non-Volatile Residue (NVR) contamination transferred to surfaces from handling and solvent cleaning. Included in the presentation are a list of the items tested and formal work instructions dealing with NVR. There is an explanation of the gravimetric determination method used to test for NVR in a variety of items, i.e., gloves, swabs, garments, bagging material, film, and wipes. Another method to test for contamination from NVR is the contact transfer method. The use of this method for testing gloves, garments, bagging material and film is explained. Certain equations used in NVR analysis and the use of a database for testing of NVR in consumables are reviewed.
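As a hedged illustration of the gravimetric idea (the specific equations in the presentation are not reproduced here), a blank-corrected NVR per unit sampled area might be computed as:

```python
# Hypothetical gravimetric NVR calculation; variable names and the blank
# correction are illustrative assumptions, not the presenter's equations.
def nvr_ug_per_cm2(residue_mg, blank_mg, sampled_area_cm2):
    """Blank-corrected non-volatile residue per unit sampled area (ug/cm^2)."""
    return (residue_mg - blank_mg) * 1000.0 / sampled_area_cm2

print(nvr_ug_per_cm2(residue_mg=0.21, blank_mg=0.05, sampled_area_cm2=100.0))
```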
Foreign Object Damage to Tires Operating in a Wartime Environment
1991-11-01
barriers were successfully overcome and the method of testing employed can now be confidently used for future test needs of this type. Data analysis ... combined variable effects. Analysis considerations involved cut types, cut depths, number of cuts, cut/hit probabilities, tire failures, and aircraft ... November 1988, with data reduction and analysis continuing into October 1989. All of the cutting tests reported in this report were conducted at the
Lísa, Miroslav; Cífková, Eva; Khalikova, Maria; Ovčačíková, Magdaléna; Holčapek, Michal
2017-11-24
Lipidomic analysis of biological samples in clinical research represents a challenging task for analytical methods, given the large number of samples and their extreme complexity. In this work, we compare direct infusion (DI) and chromatography-mass spectrometry (MS) lipidomic approaches, represented by three analytical methods, in terms of comprehensiveness, sample throughput, and validation results for the lipidomic analysis of biological samples represented by tumor tissue, surrounding normal tissue, plasma, and erythrocytes of kidney cancer patients. Methods are compared in one laboratory using an identical analytical protocol to ensure comparable conditions. An ultrahigh-performance liquid chromatography/MS (UHPLC/MS) method in hydrophilic interaction liquid chromatography mode and a DI-MS method are used for this comparison as the most widely used methods for lipidomic analysis, together with an ultrahigh-performance supercritical fluid chromatography/MS (UHPSFC/MS) method showing promising results in metabolomic analyses. The nontargeted analysis of pooled samples is performed using all tested methods, and 610 lipid species within 23 lipid classes are identified. The DI method provides the most comprehensive results owing to the identification of some polar lipid classes that are not identified by the UHPLC and UHPSFC methods. On the other hand, the UHPSFC method provides excellent sensitivity for less polar lipid classes and the highest sample throughput, with a 10 min method time. The sample consumption of the DI method is 125 times higher than that of the other methods, while only 40 μL of organic solvent is used for one sample analysis, compared with 3.5 mL and 4.9 mL in the case of the UHPLC and UHPSFC methods, respectively. Methods are validated for the quantitative lipidomic analysis of plasma samples with one internal standard for each lipid class. Results show the applicability of all tested methods for the lipidomic analysis of biological samples, depending on the analysis requirements. Copyright © 2017 Elsevier B.V. All rights reserved.
Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.
Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao
2015-08-01
Meta-analysis of genetic data must account for differences among studies including study designs, markers genotyped, and covariates. The effects of genetic variants may differ from population to population, i.e., heterogeneity. Thus, meta-analysis of combining data of multiple studies is difficult. Novel statistical methods for meta-analysis are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing methods of the meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies. Copyright © 2015 by the Genetics Society of America.
Wang, Ling; Xia, Jie-lai; Yu, Li-li; Li, Chan-juan; Wang, Su-zhen
2008-06-01
To explore several numerical methods for ordinal variables in one-way ordinal contingency tables and their interrelationships, and to compare the corresponding statistical analysis methods such as Ridit analysis and the rank sum test. Formula deduction was based on five simplified grading approaches, including rank_r(i), ridit_r(i), ridit_r(ci), ridit_r(mi), and table scores. A practical data set was verified with SAS 8.2 in clinical practice (to test the effect of Shiwei solution in the treatment of chronic tracheitis). Because of the linear relationship rank_r(i) = N ridit_r(i) + 1/2 = N ridit_r(ci) = (N + 1) ridit_r(mi), the exact χ² values in Ridit analysis based on ridit_r(i), ridit_r(ci), and ridit_r(mi) were completely the same, and they were equivalent to the Kruskal-Wallis H test. Traditional Ridit analysis is based on ridit_r(i), and its corresponding χ² value calculated with an approximate variance (1/12) is conservative. The exact χ² test of Ridit analysis should be used when comparing multiple groups in clinical research because of merits such as the distribution of mean ridit values on (0,1) and clear graphical presentation. The exact χ² test of Ridit analysis can be output directly by PROC FREQ of SAS 8.2 with the ridit and modridit options (SCORES=). The exact χ² test of Ridit analysis is equivalent to the Kruskal-Wallis H test, and should be used when comparing multiple groups in clinical research.
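The quoted identity can be checked numerically. The sketch below assumes the usual mid-cumulative definition of ridit_r(i) and verifies that N·ridit_r(i) + 1/2 reproduces the mid-ranks used by the Kruskal-Wallis H test; it is not the SAS code from the paper.

```python
# Assumes the usual mid-cumulative ridit definition; checks the identity
# rank_r(i) = N * ridit_r(i) + 1/2 against Kruskal-Wallis mid-ranks.
import numpy as np
from scipy.stats import rankdata

counts = np.array([12, 30, 25, 8])             # hypothetical ordinal category totals
N = counts.sum()
ridit = (np.cumsum(counts) - counts / 2) / N   # mid-cumulative proportions on (0, 1)
midrank = N * ridit + 0.5                      # the quoted linear relationship

data = np.repeat(np.arange(counts.size), counts)
kw_ranks = np.array([rankdata(data)[data == i].mean() for i in range(counts.size)])
print(np.allclose(midrank, kw_ranks))          # True: same scores as the H test
```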
Detection of incipient defects in cables by partial discharge signal analysis
NASA Astrophysics Data System (ADS)
Martzloff, F. D.; Simmon, E.; Steiner, J. P.; Vanbrunt, R. J.
1992-07-01
As one of the objectives of a program aimed at assessing test methods for in-situ detection of incipient defects in cables due to aging, a laboratory test system was implemented to demonstrate that the partial discharge analysis method can be successfully applied to low-voltage cables. Previous investigations generally involved cables rated 5 kV or higher, while the objective of this program focused on the lower voltages associated with the safety systems of nuclear power plants. The defect detection system implemented for the project was based on commercially available signal analysis hardware and software packages, customized for the specific purposes of the project. The test specimens included several cables of the type found in nuclear power plants, with artificial defects introduced at various points along the cable. The results indicate that partial discharge analysis is indeed capable of detecting incipient defects in low-voltage cables. There are, however, some limitations of a technical and non-technical nature that need further exploration before this method can be accepted by the industry.
Analysis of 3D printing parameters of gears for hybrid manufacturing
NASA Astrophysics Data System (ADS)
Budzik, Grzegorz; Przeszlowski, Łukasz; Wieczorowski, Michal; Rzucidlo, Arkadiusz; Gapinski, Bartosz; Krolczyk, Grzegorz
2018-05-01
The paper deals with the analysis and selection of parameters for rapid prototyping of gears by selective sintering of metal powders. The presented results show the wide spectrum of applications of RP systems in manufacturing processes for machine elements, based on an analysis of the market in terms of the application of additive manufacturing technology in different sectors of industry. Considerable growth of these methods over the past years can be observed. The characteristic errors of the printed model with respect to the ideal one were pointed out for each technique. Special attention was paid to the method of preparation of numerical data (CAD/STL/RP). Moreover, an analysis of the manufacturing processes of gear-type elements is presented. The tested gears were modeled with different allowances for final machining and made by DMLS. Metallographic analysis and strength tests on prepared specimens were performed. The above-mentioned analysis and tests were used to compare the real properties of the material with the nominal ones. To improve the surface quality after sintering, the gears were subjected to final machining. An analysis of the geometry of the gears after the hybrid manufacturing method was performed (Fig. 1). The manufacturing process was defined in a traditional way as well as with the aid of modern manufacturing techniques. The methodology and obtained results can be used for machine elements other than gears, and constitute a general description of production processes in rapid prototyping methods as well as of the design and implementation of production.
Testing for genetic association taking into account phenotypic information of relatives.
Uh, Hae-Won; Wijk, Henk Jan van der; Houwing-Duistermaat, Jeanine J
2009-12-15
We investigated efficient case-control association analysis using family data. The outcome of interest was coronary heart disease. We employed existing and new methods that take into account the correlations among related individuals to obtain the proper type I error rates. The methods considered for autosomal single-nucleotide polymorphisms were: 1) generalized estimating equations-based methods, 2) variance-modified Cochran-Armitage (MCA) trend test incorporating kinship coefficients, and 3) genotypic modified quasi-likelihood score test. Additionally, for X-linked single-nucleotide polymorphisms we proposed a two-degrees-of-freedom test. Performance of these methods was tested using Framingham Heart Study 500 k array data.
Nondestructive equipment study
NASA Technical Reports Server (NTRS)
1985-01-01
Identification of existing nondestructive Evaluation (NDE) methods that could be used in a low Earth orbit environment; evaluation of each method with respect to the set of criteria called out in the statement of work; selection of the most promising NDE methods for further evaluation; use of selected NDE methods to test samples of pressure vessel materials in a vacuum; pressure testing of a complex monolythic pressure vessel with known flaws using acoustic emissions in a vacuum; and recommendations for further studies based on analysis and testing are covered.
Tyrer, Jonathan P; Guo, Qi; Easton, Douglas F; Pharoah, Paul D P
2013-06-06
The development of genotyping arrays containing hundreds of thousands of rare variants across the genome and advances in high-throughput sequencing technologies have made feasible empirical genetic association studies to search for rare disease susceptibility alleles. As single variant testing is underpowered to detect associations, the development of statistical methods to combine analysis across variants - so-called "burden tests" - is an area of active research interest. We previously developed a method, the admixture maximum likelihood test, to test multiple, common variants for association with a trait of interest. We have extended this method, called the rare admixture maximum likelihood test (RAML), for the analysis of rare variants. In this paper we compare the performance of RAML with six other burden tests designed to test for association of rare variants. We used simulation testing over a range of scenarios to compare the power of RAML with that of the other rare variant association testing methods. These scenarios modelled differences in effect variability, the average direction of effect and the proportion of associated variants. We evaluated the power for all the different scenarios. RAML tended to have the greatest power for most scenarios where the proportion of associated variants was small, whereas SKAT-O performed a little better for the scenarios with a higher proportion of associated variants. The RAML method makes no assumptions about the proportion of variants that are associated with the phenotype of interest or the magnitude and direction of their effect. The method is flexible and can be applied to both dichotomous and quantitative traits and allows for the inclusion of covariates in the underlying regression model. The RAML method performed well compared to the other methods over a wide range of scenarios. Generally, power was moderate in most scenarios, underlining the need for large sample sizes in any form of association testing.
[Interlaboratory Study on Evaporation Residue Test for Food Contact Products (Report 1)].
Ohno, Hiroyuki; Mutsuga, Motoh; Abe, Tomoyuki; Abe, Yutaka; Amano, Homare; Ishihara, Kinuyo; Ohsaka, Ikue; Ohno, Haruka; Ohno, Yuichiro; Ozaki, Asako; Kakihara, Yoshiteru; Kobayashi, Hisashi; Sakuragi, Hiroshi; Shibata, Hiroshi; Shirono, Katsuhiro; Sekido, Haruko; Takasaka, Noriko; Takenaka, Yu; Tajima, Yoshiyasu; Tanaka, Aoi; Tanaka, Hideyuki; Tonooka, Hiroyuki; Nakanishi, Toru; Nomura, Chie; Haneishi, Nahoko; Hayakawa, Masato; Miura, Toshihiko; Yamaguchi, Miku; Watanabe, Kazunari; Sato, Kyoko
2018-01-01
An interlaboratory study was performed to evaluate the equivalence between an official method and a modified method of evaporation residue test using three food-simulating solvents (water, 4% acetic acid and 20% ethanol), based on the Japanese Food Sanitation Law for food contact products. Twenty-three laboratories participated, and tested the evaporation residues of nine test solutions as blind duplicates. For evaporation, a water bath was used in the official method, and a hot plate in the modified method. In most laboratories, the test solutions were heated until just prior to evaporation to dryness, and then allowed to dry under residual heat. Statistical analysis revealed that there was no significant difference between the two methods, regardless of the heating equipment used. Accordingly, the modified method provides performance equal to the official method, and is available as an alternative method.
Ding, Kuan-Fu; Petricoin, Emanuel F; Finlay, Darren; Yin, Hongwei; Hendricks, William P D; Sereduk, Chris; Kiefer, Jeffrey; Sekulic, Aleksandar; LoRusso, Patricia M; Vuori, Kristiina; Trent, Jeffrey M; Schork, Nicholas J
2018-01-12
Cancer cell lines are often used in high throughput drug screens (HTS) to explore the relationship between cell line characteristics and responsiveness to different therapies. Many current analysis methods infer relationships by focusing on one aspect of cell line drug-specific dose-response curves (DRCs), the concentration causing 50% inhibition of a phenotypic endpoint (IC50). Such methods may overlook DRC features and do not simultaneously leverage information about drug response patterns across cell lines, potentially increasing false positive and negative rates in drug response associations. We consider the application of two methods, each rooted in nonlinear mixed effects (NLME) models, that test the relationships between estimated cell line DRCs and factors that might mitigate response. Both methods leverage estimation and testing techniques that consider the simultaneous analysis of different cell lines to draw inferences about any one cell line. One of the methods is designed to provide an omnibus test of the differences between cell line DRCs that is not focused on any one aspect of the DRC (such as the IC50 value). We simulated different settings and compared the different methods on the simulated data. We also compared the proposed methods against traditional IC50-based methods using 40 melanoma cell lines whose transcriptomes, proteomes, and, importantly, BRAF and related mutation profiles were available. Ultimately, we find that the NLME-based methods are more robust, powerful and, for the omnibus test, more flexible than traditional methods. Their application to the melanoma cell lines reveals insights into factors that may be clinically useful.
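For orientation, a single cell line's DRC is often summarized by a four-parameter logistic fit, from which IC50 is one parameter; the sketch below shows such a fit with made-up data. The paper's NLME machinery pools many such curves across cell lines, which this single-curve fit does not attempt.

```python
# Four-parameter logistic fit for one cell line's DRC (made-up data); the
# paper's NLME models pool such curves across cell lines, not shown here.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, log_ic50, hill):
    return bottom + (top - bottom) / (1 + 10 ** ((np.log10(conc) - log_ic50) * hill))

conc = np.array([1e-3, 1e-2, 1e-1, 1.0, 10.0])      # uM, hypothetical
resp = np.array([0.98, 0.90, 0.62, 0.25, 0.08])     # relative viability, hypothetical
params, _ = curve_fit(four_pl, conc, resp, p0=[0.0, 1.0, -1.0, 1.0])
print("log10(IC50) =", round(params[2], 2))
```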
Parallel processing methods for space based power systems
NASA Technical Reports Server (NTRS)
Berry, F. C.
1993-01-01
This report presents a method for performing load-flow analysis of a power system using a decomposition approach. The power system for the Space Shuttle is used as a basis to build a model for the load-flow analysis. To test the decomposition method, simulations were performed on power systems of 16, 25, 34, 43, 52, 61, 70, and 79 nodes. Each of the power systems was divided into subsystems and simulated under steady-state conditions. The results from these tests have been found to be as accurate as tests performed using a standard serial simulator. The division of the power systems into different subsystems was done by assigning a processor to each area. There were 13 transputers available; therefore, up to 13 different subsystems could be simulated at the same time. This report presents preliminary results for load-flow analysis using a decomposition principle. The report shows that the decomposition algorithm for load-flow analysis is well suited for parallel processing and provides increases in the speed of execution.
Strain Modal Analysis of Small and Light Pipes Using Distributed Fibre Bragg Grating Sensors
Huang, Jun; Zhou, Zude; Zhang, Lin; Chen, Juntao; Ji, Chunqian; Pham, Duc Truong
2016-01-01
Vibration fatigue failure is a critical problem of hydraulic pipes under severe working conditions. Strain modal testing of small and light pipes is a good option for dynamic characteristic evaluation, structural health monitoring and damage identification. Unique features such as small size, light weight, and high multiplexing capability enable Fibre Bragg Grating (FBG) sensors to measure structural dynamic responses where sensor size and placement are critical. In this paper, experimental strain modal analysis of pipes using distributed FBG sensors is presented. Strain modal analysis and parameter identification methods are introduced. Experimental strain modal testing and finite element analysis for a cantilever pipe have been carried out. The analysis results indicate that the natural frequencies and strain mode shapes of the tested pipe acquired by FBG sensors are in good agreement with the results obtained by a reference accelerometer and simulation outputs. The strain modal parameters of a hydraulic pipe were obtained by the proposed strain modal testing method. FBG sensors have been shown to be useful in the experimental strain modal analysis of small and light pipes in mechanical, aeronautic and aerospace applications. PMID:27681728
Single-strain-gage force/stiffness buckling prediction techniques on a hat-stiffened panel
NASA Technical Reports Server (NTRS)
Hudson, Larry D.; Thompson, Randolph C.
1991-01-01
Predicting the buckling characteristics of a test panel is necessary to ensure panel integrity during a test program. A single-strain-gage buckling prediction method was developed on a hat-stiffened, monolithic titanium buckling panel. The method is an adaptation of the original force/stiffness method which requires back-to-back gages. The single-gage method was developed because the test panel did not have back-to-back gages. The method was used to predict buckling loads and temperatures under various heating and loading conditions. The results correlated well with a finite element buckling analysis. The single-gage force/stiffness method was a valid real-time and post-test buckling prediction technique.
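A minimal sketch of the force/stiffness extrapolation idea follows, with hypothetical data; the single-gage variant's exact stiffness measure is not detailed in the abstract. In the classical method, the F/S quantity trends linearly toward zero as buckling is approached, so the buckling load is found by linear extrapolation.

```python
# Force/stiffness extrapolation sketch with hypothetical data: fit the linear
# trend of F/S versus load and extrapolate to F/S = 0.
import numpy as np

load = np.array([10.0, 20.0, 30.0, 40.0, 50.0])        # kN, hypothetical
f_over_s = np.array([1.00, 0.82, 0.61, 0.42, 0.20])    # normalized, hypothetical
slope, intercept = np.polyfit(load, f_over_s, 1)
print("predicted buckling load:", round(-intercept / slope, 1), "kN")
```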
Samide, Michael J; Smith, Gregory D
2015-12-24
Construction materials used in museums for the display, storage, and transportation of artwork must be assessed for their tendency to emit harmful pollution that could potentially damage cultural treasures. Traditionally, a subjective metals corrosion test known as the Oddy test has been widely utilized in museums for this purpose. To augment the Oddy test, an instrumental sampling approach based on evolved gas analysis (EGA) coupled to gas chromatography (GC) with mass spectral (MS) detection has been implemented for the first time to qualitatively identify off-gassed pollutants under specific conditions. This approach is compared to other instrumental methods reported in the literature. This novel application of the EGA sampling technique yields several benefits over traditional testing, including rapidity, high sensitivity, and broad detectability of volatile organic compounds (VOCs). Furthermore, unlike other reported instrumental approaches, the EGA method was used to determine quantitatively the amount of VOCs emitted by acetate resins and polyurethane foams under specific conditions using both an external calibration method as well as surrogate response factors. EGA was successfully employed to rapidly characterize emissions from 12 types of common plastics. This analysis is advocated as a rapid pre-screening method to rule out poorly performing materials prior to investing time and energy in Oddy testing. The approach is also useful for rapid, routine testing of construction materials previously vetted by traditional testing, but which may experience detrimental formulation changes over time. As an example, a case study on batch re-orders of rigid expanded poly(vinyl chloride) board stock is presented. Copyright © 2015 Elsevier B.V. All rights reserved.
Integrated Data Collection Analysis (IDCA) Program - Statistical Analysis of RDX Standard Data Sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandstrom, Mary M.; Brown, Geoffrey W.; Preston, Daniel N.
2015-10-30
The Integrated Data Collection Analysis (IDCA) program is conducting a Proficiency Test for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard. The material was tested as a well-characterized standard several times during the proficiency study to assess differences among participants and the range of results that may arise for well-behaved explosive materials. The analyses show that there are detectable differences among the results from IDCA participants. While these differences are statistically significant, most of them can be disregarded for comparison purposes to assess potential variability when laboratories attempt to measure identical samples using methods assumed to be nominally the same. The results presented in this report include the average sensitivity results for the IDCA participants and the ranges of values obtained. The ranges represent variation about the mean values of the tests of between 26% and 42%. The magnitude of this variation is attributed to differences in operator, method, and environment as well as the use of different instruments that are also of varying age. The results appear to be a good representation of the broader safety testing community based on the range of methods, instruments, and environments included in the IDCA Proficiency Test.
Analysis of messy data with heteroscedastic in mean models
NASA Astrophysics Data System (ADS)
Trianasari, Nurvita; Sumarni, Cucu
2016-02-01
In data analysis we are often faced with data that do not meet some of the usual assumptions; such data are often called messy data. This problem is a consequence of outliers that bias estimation. To analyze messy data there are three approaches, namely standard analysis, data transformation, and non-standard analysis methods. Simulations were conducted to determine the performance of three procedures for comparing means when the variances are not homogeneous. Data for each scenario were simulated 500 times. Next, we performed the mean comparison tests using three methods: the Welch test, mixed models, and the Welch-r test. Data generation was done in R version 3.1.2. Based on the simulation results, all three methods can be used for both the normal (homoscedastic) and the heteroscedastic case. The three methods work very well on balanced or unbalanced data when there is no violation of the homogeneity-of-variance assumption. For balanced data, the three methods still showed excellent performance despite violation of the assumption of homogeneity of variance, even when the degree of heterogeneity is high; this can be seen from power above 90 percent, best for the Welch method (98.4%) and the Welch-r method (97.8%). For unbalanced data, the Welch method is very good in the case of positively paired heterogeneity, with 98.2% power; the mixed-models method is very good in the case of highly heterogeneous negatively paired variances; and the Welch-r method works very well in both cases. However, if the level of heterogeneity of variance is very high, the power of all methods decreases, especially for the mixed-models method. The methods that still work well enough (power above 50%) are the Welch-r method (62.6%) and the Welch method (58.6%) in the case of balanced data. For unbalanced data, the Welch-r method works well enough in the cases of highly heterogeneous positive-positive or negative-negative pairs, with power of 68.8% and 51%, respectively. The Welch method performs well enough only in the case of highly heterogeneous positive-positive pairs, with power of 64.8%, while the mixed-models method is good in the case of highly heterogeneous negative pairs, with 54.6% power. In general, when variances are not homogeneous, the Welch method applied to ranked data (Welch-r) performs better than the other methods.
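A sketch of the Welch-r idea (Welch's heteroscedastic ANOVA applied to rank-transformed data) is given below, assuming the standard Welch statistic; it is not the simulation code from the study.

```python
# Welch-r sketch: Welch's ANOVA computed on pooled ranks; standard formulas
# assumed, not the authors' R code.
import numpy as np
from scipy.stats import rankdata, f as f_dist

def welch_anova(groups):
    k = len(groups)
    n = np.array([len(g) for g in groups], dtype=float)
    m = np.array([np.mean(g) for g in groups])
    v = np.array([np.var(g, ddof=1) for g in groups])
    w = n / v                                   # precision weights
    grand = np.sum(w * m) / np.sum(w)
    num = np.sum(w * (m - grand) ** 2) / (k - 1)
    tmp = np.sum((1 - w / np.sum(w)) ** 2 / (n - 1))
    F = num / (1 + 2 * (k - 2) / (k ** 2 - 1) * tmp)
    df2 = (k ** 2 - 1) / (3 * tmp)
    return F, f_dist.sf(F, k - 1, df2)

def welch_r(groups):
    # Rank all observations jointly, then run Welch's ANOVA on the ranks.
    ranks = rankdata(np.concatenate(groups))
    cuts = np.cumsum([len(g) for g in groups])[:-1]
    return welch_anova(np.split(ranks, cuts))

rng = np.random.default_rng(1)
groups = [rng.normal(0, s, n) for s, n in [(1, 20), (3, 12), (5, 8)]]
print(welch_r(groups))   # (F statistic, p-value)
```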
Helicopter external noise prediction and correlation with flight test
NASA Technical Reports Server (NTRS)
Gupta, B. P.
1978-01-01
Mathematical analysis procedures for predicting the main and tail rotor rotational and broadband noise are presented. The aerodynamic and acoustical data from Operational Loads Survey (OLS) flight program are used for validating the analysis and noise prediction methodology. For the long method of rotational noise prediction, the spanwise, chordwise, and azimuthwise airloading is used. In the short method, the airloads are assumed to be concentrated at a single spanwise station and for higher harmonics an airloading harmonic exponent of 2.0 is assumed. For the same flight condition, the predictions from long and short methods of rotational noise prediction are compared with the flight test results. The short method correlates as well or better than the long method.
Marzulli, F; Maguire, H C
1982-02-01
Several guinea-pig predictive test methods were evaluated by comparison of results with those obtained with human predictive tests, using ten compounds that have been used in cosmetics. The method involves the statistical analysis of the frequency with which guinea-pig tests agree with the findings of tests in humans. In addition, the frequencies of false positive and false negative predictive findings are considered and statistically analysed. The results clearly demonstrate the superiority of adjuvant tests (complete Freund's adjuvant) in determining skin sensitizers and the overall superiority of the guinea-pig maximization test in providing results similar to those obtained by human testing. A procedure is suggested for utilizing adjuvant and non-adjuvant test methods for characterizing compounds as of weak, moderate or strong sensitizing potential.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butler, J.J. Jr.; Hyder, Z.
The Nguyen and Pinder method is one of four techniques commonly used for analysis of response data from slug tests. Limited field research has raised questions about the reliability of the parameter estimates obtained with this method. A theoretical evaluation of this technique reveals that errors were made in the derivation of the analytical solution upon which the technique is based. Simulation and field examples show that the errors result in parameter estimates that can differ from actual values by orders of magnitude. These findings indicate that the Nguyen and Pinder method should no longer be a tool in the repertoire of the field hydrogeologist. If data from a slug test performed in a partially penetrating well in a confined aquifer need to be analyzed, recent work has shown that the Hvorslev method is the best alternative among the commonly used techniques.
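For readers needing the recommended alternative, a common form of the Hvorslev estimate is sketched below; the appropriate shape factor depends on well geometry, so this particular form (valid roughly for L/R > 8) is an assumption.

```python
# Hvorslev sketch (assumed form for a partially penetrating well, L/R > 8);
# shape-factor choices vary with geometry, so treat this as illustrative only.
import numpy as np

def hvorslev_K(r_c, R, L_e, t37):
    """K (m/s) from casing radius r_c, screen radius R, screen length L_e (m),
    and t37, the time (s) for the head to fall to 37% of its initial value."""
    return r_c**2 * np.log(L_e / R) / (2.0 * L_e * t37)

print(hvorslev_K(r_c=0.05, R=0.05, L_e=1.5, t37=60.0))  # ~5e-5 m/s
```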
Cost analysis in the toxicology laboratory.
Travers, E M
1990-09-01
The process of determining laboratory sectional and departmental costs and test costs for instrument-generated and manually generated reportable results for toxicology laboratories has been outlined in this article. It is hoped that the basic principles outlined in the preceding text will clarify and elucidate one of the most important areas needed for laboratory fiscal integrity and its survival in these difficult times for health care providers. The following general principles derived from this article are helpful aids for managers of toxicology laboratories. 1. To manage a cost-effective, efficient toxicology laboratory, several factors must be considered: the laboratory's instrument configuration, test turnaround time needs, the test menu offered, the analytic methods used, the cost of labor based on time expended and the experience and educational level of the staff, and logistics that determine specimen delivery time and costs. 2. There is a wide variation in costs for toxicologic methods, which requires that an analysis of capital (equipment) purchase and operational (test performance) costs be performed to avoid waste, purchase wisely, and determine which tests consume the majority of the laboratory's resources. 3. Toxicologic analysis is composed of many complex steps. Each step must be individually cost-accounted. Screening test results must be confirmed, and the cost of both steps must be included in the cost per reportable result. 4. Total costs will vary within the same laboratory and between laboratories based on differences in salaries paid to technical staff, differences in reagent/supply costs, the number of technical staff needed to operate the analyzer or perform the method, and the inefficient use of highly paid staff to operate the analyzer or perform the method. 5. Since direct test costs vary directly with the type and number of analyzers or methods and are dependent on the operational mode designed by the manufacturer, laboratory managers should construct an actual test-cost data base for each instrument or method in use to accurately compare costs using the "bottom-up" approach. 6. Laboratory expenses can be examined from three perspectives: total laboratory, laboratory section, and subsection workstation. The objective is to track all laboratory expenses through each of these levels. 7. In the final analysis, a portion of total laboratory expenses must be allocated to each unit of laboratory output--the billable procedure or, in laboratories where tests are not billed, the tests produced. (ABSTRACT TRUNCATED AT 400 WORDS)
Doorn, J; Storteboom, T T R; Mulder, A M; de Jong, W H A; Rottier, B L; Kema, I P
2015-07-01
Measurement of chloride in sweat is an essential part of the diagnostic algorithm for cystic fibrosis. The lack of sensitivity and reproducibility of current methods led us to develop an ion chromatography/high-performance liquid chromatography (IC/HPLC) method suitable for the analysis of both chloride and sodium in small volumes of sweat. Precision, linearity and limit of detection of the in-house developed IC/HPLC method were established. A method comparison between the newly developed IC/HPLC method and the traditional Chlorocounter was performed, and trueness was determined using Passing-Bablok method comparison with external quality assurance material (Royal College of Pathologists of Australasia). Precision and linearity fulfill the criteria established by UK guidelines and are comparable with inductively coupled plasma-mass spectrometry methods. Passing-Bablok analysis demonstrated excellent correlation between IC/HPLC measurements and external quality assessment target values, for both chloride and sodium. With a limit of quantitation of 0.95 mmol/L, our method is suitable for the analysis of small amounts of sweat and can thus be used in combination with the Macroduct collection system. Although a chromatographic application results in a somewhat more expensive test compared with a Chlorocounter test, more accurate measurements are achieved. In addition, simultaneous measurement of sodium concentrations will result in better detection of false positives, less test repetition, and thus faster, more accurate and more effective diagnosis. The described IC/HPLC method therefore provides a precise, relatively cheap and easy-to-handle application for the analysis of both chloride and sodium in sweat. © The Author(s) 2014.
Hoyer, Annika; Kuss, Oliver
2018-05-01
Meta-analysis of diagnostic studies is still a rapidly developing area of biostatistical research. Especially, there is an increasing interest in methods to compare different diagnostic tests to a common gold standard. Restricting to the case of two diagnostic tests, in these meta-analyses the parameters of interest are the differences of sensitivities and specificities (with their corresponding confidence intervals) between the two diagnostic tests while accounting for the various associations across single studies and between the two tests. We propose statistical models with a quadrivariate response (where sensitivity of test 1, specificity of test 1, sensitivity of test 2, and specificity of test 2 are the four responses) as a sensible approach to this task. Using a quadrivariate generalized linear mixed model naturally generalizes the common standard bivariate model of meta-analysis for a single diagnostic test. If information on several thresholds of the tests is available, the quadrivariate model can be further generalized to yield a comparison of full receiver operating characteristic (ROC) curves. We illustrate our model by an example where two screening methods for the diagnosis of type 2 diabetes are compared.
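Schematically, and with notation assumed rather than taken from the paper, the quadrivariate random-effects structure can be written as:

```latex
% Assumed notation: se/sp are per-study sensitivities and specificities of
% tests 1 and 2; mu holds the pooled logits; Sigma captures all correlations.
\operatorname{logit}
\begin{pmatrix} se_{1i} \\ sp_{1i} \\ se_{2i} \\ sp_{2i} \end{pmatrix}
= \boldsymbol{\mu} + \boldsymbol{b}_i ,
\qquad
\boldsymbol{b}_i \sim N_4(\boldsymbol{0}, \boldsymbol{\Sigma}),
\qquad
\Delta_{se} = \operatorname{logit}^{-1}(\mu_3) - \operatorname{logit}^{-1}(\mu_1).
```

The off-diagonal blocks of Σ carry the within-study dependence between the two tests that separate bivariate models would ignore.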
NASA Technical Reports Server (NTRS)
Littell, Justin D.; Binienda, Wieslaw K.; Arnold, William A.; Roberts, Gary D.; Goldberg, Robert K.
2010-01-01
The reliability of impact simulations for aircraft components made with triaxial-braided carbon-fiber composites is currently limited by inadequate material property data and lack of validated material models for analysis. Methods to characterize the material properties used in the analytical models from a systematically obtained set of test data are also lacking. A macroscopic finite element based analytical model to analyze the impact response of these materials has been developed. The stiffness and strength properties utilized in the material model are obtained from a set of quasi-static in-plane tension, compression and shear coupon level tests. Full-field optical strain measurement techniques are applied in the testing, and the results are used to help in characterizing the model. The unit cell of the braided composite is modeled as a series of shell elements, where each element is modeled as a laminated composite. The braided architecture can thus be approximated within the analytical model. The transient dynamic finite element code LS-DYNA is utilized to conduct the finite element simulations, and an internal LS-DYNA constitutive model is utilized in the analysis. Methods to obtain the stiffness and strength properties required by the constitutive model from the available test data are developed. Simulations of quasi-static coupon tests and impact tests of a represented braided composite are conducted. Overall, the developed method shows promise, but improvements that are needed in test and analysis methods for better predictive capability are examined.
NASA Astrophysics Data System (ADS)
Yusa, Yasunori; Okada, Hiroshi; Yamada, Tomonori; Yoshimura, Shinobu
2018-04-01
A domain decomposition method for large-scale elastic-plastic problems is proposed. The proposed method is based on a quasi-Newton method in conjunction with a balancing domain decomposition preconditioner. The use of a quasi-Newton method overcomes two problems associated with the conventional domain decomposition method based on the Newton-Raphson method: (1) avoidance of a double-loop iteration algorithm, which generally has large computational complexity, and (2) consideration of the local concentration of nonlinear deformation, which is observed in elastic-plastic problems with stress concentration. Moreover, the application of a balancing domain decomposition preconditioner ensures scalability. Using the conventional and proposed domain decomposition methods, several numerical tests, including weak scaling tests, were performed. The convergence performance of the proposed method is comparable to that of the conventional method. In particular, in elastic-plastic analysis, the proposed method exhibits better convergence performance than the conventional method.
NASA Astrophysics Data System (ADS)
García-Moreno, Angel-Iván; González-Barbosa, José-Joel; Ramírez-Pedraza, Alfonso; Hurtado-Ramos, Juan B.; Ornelas-Rodriguez, Francisco-Javier
2016-04-01
Computer-based reconstruction models can be used to approximate urban environments. These models are usually based on several mathematical approximations and the use of different sensors, which implies dependency on many variables. The sensitivity analysis presented in this paper is used to weigh the relative importance of each uncertainty contributor to the calibration of a panoramic camera-LiDAR system. Both sensors are used for three-dimensional urban reconstruction. Simulated and experimental tests were conducted. For the simulated tests, we analyze and compare the calibration parameters using the Monte Carlo and Latin hypercube sampling techniques. Sensitivity to each variable involved in the calibration was computed by the Sobol method, which is based on analysis of the variance breakdown, and by the Fourier amplitude sensitivity test method, which is based on Fourier analysis. Sensitivity analysis is an essential tool in simulation modeling and for performing error propagation assessments.
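Variance-based indices of this kind can be computed with the open-source SALib package; the sketch below uses placeholder parameters and a stand-in model, not the camera-LiDAR calibration from the paper.

```python
# Illustrative Sobol sensitivity run with SALib; parameter names, bounds, and
# the model are placeholders, not the paper's calibration pipeline.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["focal_length", "lidar_offset", "tilt"],
    "bounds": [[0.9, 1.1], [-0.05, 0.05], [-2.0, 2.0]],
}

def model(x):
    # Stand-in for a reprojection-error model of the calibration.
    return x[:, 0] ** 2 + 10.0 * x[:, 1] + 0.1 * np.sin(x[:, 2])

X = saltelli.sample(problem, 1024)     # Saltelli sampling for Sobol indices
Si = sobol.analyze(problem, model(X))
print(Si["S1"], Si["ST"])              # first-order and total-order indices
```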
Prioritizing individual genetic variants after kernel machine testing using variable selection.
He, Qianchuan; Cai, Tianxi; Liu, Yang; Zhao, Ni; Harmon, Quaker E; Almli, Lynn M; Binder, Elisabeth B; Engel, Stephanie M; Ressler, Kerry J; Conneely, Karen N; Lin, Xihong; Wu, Michael C
2016-12-01
Kernel machine learning methods, such as the SNP-set kernel association test (SKAT), have been widely used to test associations between traits and genetic polymorphisms. In contrast to traditional single-SNP analysis methods, these methods are designed to examine the joint effect of a set of related SNPs (such as a group of SNPs within a gene or a pathway) and are able to identify sets of SNPs that are associated with the trait of interest. However, as with many multi-SNP testing approaches, kernel machine testing can draw conclusions only at the SNP-set level and does not directly indicate which SNP(s) in an identified set actually drive the associations. A recently proposed procedure, KerNel Iterative Feature Extraction (KNIFE), provides a general framework for incorporating variable selection into kernel machine methods. In this article, we focus on quantitative traits and relatively common SNPs, adapt the KNIFE procedure to genetic association studies, and propose an approach to identify driver SNPs after the application of SKAT to gene set analysis. Our approach accommodates several kernels that are widely used in SNP analysis, such as the linear kernel and the Identity by State (IBS) kernel. The proposed approach provides practically useful utilities to prioritize SNPs, and fills the gap between SNP set analysis and biological functional studies. Both simulation studies and a real data application are used to demonstrate the proposed approach. © 2016 WILEY PERIODICALS, INC.
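The two kernels named above have simple forms for genotype dosages coded 0/1/2; the sketch below is a generic illustration, not the KNIFE or SKAT implementation.

```python
# Generic linear and IBS kernels for genotype dosages in {0, 1, 2}; an
# illustration of the kernels named above, not the SKAT/KNIFE code.
import numpy as np

def linear_kernel(G):
    return G @ G.T

def ibs_kernel(G):
    # Identity-by-state: per-SNP shared alleles 2 - |g1 - g2|, averaged
    # across SNPs and scaled to [0, 1].
    diff = np.abs(G[:, None, :] - G[None, :, :])
    return np.mean(2 - diff, axis=2) / 2

G = np.random.randint(0, 3, size=(5, 8))  # 5 subjects x 8 SNPs, placeholder
print(ibs_kernel(G).round(2))
```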
NASA Technical Reports Server (NTRS)
Nguyen, Truong X.; Ely, Jay J.; Koppen, Sandra V.
2001-01-01
This paper describes the implementation of the mode-stirred method for susceptibility testing according to the current DO-160D standard. Test results on an Engine Data Processor using the implemented procedure, and comparisons with standard anechoic test results, are presented. The comparison shows experimentally that the susceptibility thresholds found with the mode-stirred method are consistently higher than the anechoic ones. This is consistent with a recent statistical analysis by NIST finding that the current calibration procedure overstates field strength by a fixed amount. Once the test results are adjusted for this value, the comparisons with the anechoic results are excellent. The results also show that the test method has excellent chamber-to-chamber repeatability. Several areas for improvement to the current procedure are also identified and implemented.
Design and Test of an Improved Crashworthiness Small Composite Airframe
NASA Technical Reports Server (NTRS)
Terry, James E.; Hooper, Steven J.; Nicholson, Mark
2002-01-01
The purpose of this small business innovative research (SBIR) program was to evaluate the feasibility of developing small composite airplanes with improved crashworthiness. A combination of analysis and half scale component tests were used to develop an energy absorbing airframe. Four full scale crash tests were conducted at the NASA Impact Dynamics Research Facility, two on a hard surface and two onto soft soil, replicating earlier NASA tests of production general aviation airplanes. Several seat designs and restraint systems including both an air bag and load limiting shoulder harnesses were tested. Tests showed that occupant loads were within survivable limits with the improved structural design and the proper combination of seats and restraint systems. There was no loss of cabin volume during the events. The analysis method developed provided design guidance but time did not allow extending the analysis to soft soil impact. This project demonstrated that survivability improvements are possible with modest weight penalties. The design methods can be readily applied by airplane designers using the examples in this report.
Analysis of Added Value of Subscores with Respect to Classification
ERIC Educational Resources Information Center
Sinharay, Sandip
2014-01-01
Brennan noted that users of test scores often want (indeed, demand) that subscores be reported, along with total test scores, for diagnostic purposes. Haberman suggested a method based on classical test theory (CTT) to determine if subscores have added value over the total score. One way to interpret the method is that a subscore has added value…
Check-Standard Testing Across Multiple Transonic Wind Tunnels with the Modern Design of Experiments
NASA Technical Reports Server (NTRS)
Deloach, Richard
2012-01-01
This paper reports the results of an analysis of wind tunnel data acquired in support of the Facility Analysis Verification & Operational Reliability (FAVOR) project. The analysis uses methods referred to collectively at Langley Research Center as the Modern Design of Experiments (MDOE). These methods quantify the total variance in a sample of wind tunnel data and partition it into explained and unexplained components. The unexplained component is further partitioned into random and systematic components. This analysis was performed on data acquired in similar wind tunnel tests executed in four different U.S. transonic facilities. The measurement environment of each facility was quantified and compared.
STATISTICAL ANALYSIS OF SNAP 10A THERMOELECTRIC CONVERTER ELEMENT PROCESS DEVELOPMENT VARIABLES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitch, S.H.; Morris, J.W.
1962-12-15
Statistical analysis, primarily analysis of variance, was applied to evaluate several factors involved in the development of suitable fabrication and processing techniques for the production of lead telluride thermoelectric elements for the SNAP 10A energy conversion system. The analysis methods are described with respect to their application for determining the effects of various processing steps, establishing the value of individual operations, and evaluating the significance of test results. The elimination of unnecessary or detrimental processing steps was accomplished, and the number of required tests was substantially reduced, by application of these statistical methods to the SNAP 10A production development effort.
DOT National Transportation Integrated Search
2014-04-01
A correlation between Wavelength Dispersive X-ray Fluorescence (WDXRF) analysis of hardened concrete for chlorides and Atomic Absorption (AA) analysis (current method AASHTO T-260, procedure B) has been found, and a new method of analysis has been ...
Survival analysis: Part I — analysis of time-to-event
2018-01-01
Length of time is a variable often encountered during data analysis. Survival analysis provides simple, intuitive results concerning time-to-event for events of interest, which are not confined to death. This review introduces methods of analyzing time-to-event. The Kaplan-Meier survival analysis, log-rank test, and Cox proportional hazards regression modeling method are described with examples of hypothetical data. PMID:29768911
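The three tools named in the review can be illustrated with the open-source lifelines package; the data below are made up.

```python
# Kaplan-Meier curve, log-rank test, and Cox regression via lifelines;
# made-up time-to-event data for illustration only.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "time": [5, 8, 12, 3, 9, 15, 7, 11],
    "event": [1, 1, 0, 1, 0, 1, 1, 0],   # 1 = event observed, 0 = censored
    "group": [0, 0, 0, 0, 1, 1, 1, 1],
})

kmf = KaplanMeierFitter().fit(df["time"], df["event"])      # survival curve
a, b = df[df.group == 0], df[df.group == 1]
lr = logrank_test(a["time"], b["time"],
                  event_observed_A=a["event"], event_observed_B=b["event"])
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print(kmf.median_survival_time_, lr.p_value, cph.params_)
```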
Webster, Gregory K; Marsden, Ian; Pommerening, Cynthia A; Tyrakowski, Christina M
2010-05-01
With the changing development paradigms in the pharmaceutical industry, laboratories are challenged to release materials for clinical studies with rapid turnaround times. To minimize cost demands, many businesses are looking to develop ways of using early Good Manufacturing Practice (GMP) materials of active pharmaceutical ingredients (API) for Good Laboratory Practice (GLP) toxicology studies. To make this happen, the analytical laboratory releases the material by one of three scenarios: (1) holding the GLP release until full GMP testing is ready, (2) issuing a separate lot number for a portion of the GMP material and releasing the material for GLP use, or (3) releasing the lot of material for GLP using alternate (equivalent) method(s) not specified for GMP release testing. Many companies are finding the third scenario to be advantageous in terms of cost and efficiency through the use of quantitative nuclear magnetic resonance (q-NMR). The use of q-NMR has proved to be a single-point replacement for routine early development testing that previously combined elements of identity testing, chromatographic assay, moisture analysis, residual solvent analysis, and elemental analysis. This study highlights that q-NMR can be validated to meet current regulatory analytical method guidelines for routine pharmaceutical analysis.
Non-Destructive Thermography Analysis of Impact Damage on Large-Scale CFRP Automotive Parts.
Maier, Alexander; Schmidt, Roland; Oswald-Tranta, Beate; Schledjewski, Ralf
2014-01-14
Laminated composites are increasingly used in aeronautics and the wind energy industry, as well as in the automotive industry. In these applications, the construction and processing need to fulfill the highest requirements regarding weight and mechanical properties. Environmental issues, like fuel consumption and CO₂ footprint, set new challenges in producing lightweight parts that meet the closely monitored standards of these sectors. In the automotive industry, one main aspect of construction is the impact behavior of structural parts. To verify the quality of parts made from composite materials with little effort, cost and time, non-destructive test methods are increasingly used. A highly recommended non-destructive testing method is thermography analysis. In this work, a prototype for a car's base plate was produced using vacuum infusion. For the research, test specimens were produced with the same multi-layer build-up as the prototypes. These specimens were subjected to defined impact loads to simulate the effect of stone chips. Afterwards, the impacted specimens were investigated with thermography analysis. The results of this work will help in understanding the possible fields of application and the use of thermography analysis as a first, quick and economical failure detection method for automotive parts.
7 CFR 58.930 - Official test methods.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., GENERAL SPECIFICATIONS FOR APPROVED PLANTS AND STANDARDS FOR GRADES OF DAIRY PRODUCTS 1 General Specifications for Dairy Plants Approved for USDA Inspection and Grading Service 1 Operations and Operating Procedures § 58.930 Official test methods. (a) Chemical. Chemical analysis, except where otherwise prescribed...
7 CFR 58.930 - Official test methods.
Code of Federal Regulations, 2012 CFR
2012-01-01
..., GENERAL SPECIFICATIONS FOR APPROVED PLANTS AND STANDARDS FOR GRADES OF DAIRY PRODUCTS 1 General Specifications for Dairy Plants Approved for USDA Inspection and Grading Service 1 Operations and Operating Procedures § 58.930 Official test methods. (a) Chemical. Chemical analysis, except where otherwise prescribed...
7 CFR 58.930 - Official test methods.
Code of Federal Regulations, 2013 CFR
2013-01-01
..., GENERAL SPECIFICATIONS FOR APPROVED PLANTS AND STANDARDS FOR GRADES OF DAIRY PRODUCTS 1 General Specifications for Dairy Plants Approved for USDA Inspection and Grading Service 1 Operations and Operating Procedures § 58.930 Official test methods. (a) Chemical. Chemical analysis, except where otherwise prescribed...
7 CFR 58.930 - Official test methods.
Code of Federal Regulations, 2011 CFR
2011-01-01
..., GENERAL SPECIFICATIONS FOR APPROVED PLANTS AND STANDARDS FOR GRADES OF DAIRY PRODUCTS 1 General Specifications for Dairy Plants Approved for USDA Inspection and Grading Service 1 Operations and Operating Procedures § 58.930 Official test methods. (a) Chemical. Chemical analysis, except where otherwise prescribed...
7 CFR 58.930 - Official test methods.
Code of Federal Regulations, 2014 CFR
2014-01-01
..., GENERAL SPECIFICATIONS FOR APPROVED PLANTS AND STANDARDS FOR GRADES OF DAIRY PRODUCTS 1 General Specifications for Dairy Plants Approved for USDA Inspection and Grading Service 1 Operations and Operating Procedures § 58.930 Official test methods. (a) Chemical. Chemical analysis, except where otherwise prescribed...
ENVIRONMENTAL METHODS TESTING SITE PROJECT: DATA MANAGEMENT PROCEDURES PLAN
The Environmental Methods Testing Site (EMTS) Data Management Procedures Plan identifies the computer hardware and software resources used in the EMTS project. It identifies the major software packages that are available for use by principal investigators for the analysis of data...
Novel methods of imaging and analysis for the thermoregulatory sweat test.
Carroll, Michael Sean; Reed, David W; Kuntz, Nancy L; Weese-Mayer, Debra Ellyn
2018-06-07
The thermoregulatory sweat test (TST) can be central to the identification and management of disorders affecting sudomotor function and small sensory and autonomic nerve fibers, but the cumbersome nature of the standard testing protocol has prevented its widespread adoption. A high resolution, quantitative, clean and simple assay of sweating could significantly improve identification and management of these disorders. Images from 89 clinical TSTs were analyzed retrospectively using two novel techniques. First, using the standard indicator powder, skin surface sweat distributions were determined algorithmically for each patient. Second, a fundamentally novel method using thermal imaging of forced evaporative cooling was evaluated through comparison with the standard technique. Correlation and receiver operating characteristic analyses were used to determine the degree of match between these methods, and the potential limits of thermal imaging were examined through cumulative analysis of all studied patients. Algorithmic encoding of sweating and non-sweating regions produces a more objective analysis for clinical decision making. Additionally, results from the forced cooling method correspond well with those from indicator powder imaging, with a correlation across spatial regions of -0.78 (CI: -0.84 to -0.71). The method works similarly across body regions, and frame-by-frame analysis suggests the ability to identify sweating regions within about 1 second of imaging. While algorithmic encoding can enhance the standard sweat testing protocol, thermal imaging with forced evaporative cooling can dramatically improve the TST by making it less time-consuming and more patient-friendly than the current approach.
Walter, Stephen D.; Riddell, Corinne A.; Rabachini, Tatiana; Villa, Luisa L.; Franco, Eduardo L.
2013-01-01
Introduction: Studies on the association of a polymorphism in codon 72 of the p53 tumour suppressor gene (rs1042522) with cervical neoplasia have inconsistent results. While several methods for genotyping p53 exist, they vary in accuracy and are often discrepant. Methods: We used latent class models (LCM) to examine the accuracy of six methods for p53 determination, all conducted by the same laboratory. We also examined the association of p53 with cytological cervical abnormalities, recognising potential test inaccuracy. Results: Pairwise disagreement between laboratory methods occurred approximately 10% of the time. Given the estimated true p53 status of each woman, we found that each laboratory method is most likely to classify a woman to her correct status. Arg/Arg women had the highest risk of squamous intraepithelial lesions (SIL). Test accuracy was independent of cytology. There was no strong evidence for correlations of test errors. Discussion: Empirical analyses ignore possible laboratory errors, and so are inherently biased, but test accuracy estimated by the LCM approach is unbiased when model assumptions are met. LCM analysis avoids ambiguities arising from empirical test discrepancies, obviating the need to regard any of the methods as a “gold” standard measurement. The methods we presented here to analyse the p53 data can be applied in many other situations where multiple tests exist, but where none of them is a gold standard. PMID:23441193
NASA Technical Reports Server (NTRS)
Wanthal, Steven; Schaefer, Joseph; Justusson, Brian; Hyder, Imran; Engelstad, Stephen; Rose, Cheryl
2017-01-01
The Advanced Composites Consortium is a US Government/Industry partnership supporting technologies to enable timeline and cost reduction in the development of certified composite aerospace structures. A key component of the consortium's approach is the development and validation of improved progressive damage and failure analysis methods for composite structures. These methods will enable increased use of simulations in design trade studies and detailed design development, and thereby enable more targeted physical test programs to validate designs. To accomplish this goal with confidence, a rigorous verification and validation process was developed. The process was used to evaluate analysis methods and associated implementation requirements to ensure calculation accuracy and to gage predictability for composite failure modes of interest. This paper introduces the verification and validation process developed by the consortium during the Phase I effort of the Advanced Composites Project. Specific structural failure modes of interest are first identified, and a subset of standard composite test articles are proposed to interrogate a progressive damage analysis method's ability to predict each failure mode of interest. Test articles are designed to capture the underlying composite material constitutive response as well as the interaction of failure modes representing typical failure patterns observed in aerospace structures.
Rivoirard, Romain; Duplay, Vianney; Oriol, Mathieu; Tinquaut, Fabien; Chauvin, Franck; Magne, Nicolas; Bourmaud, Aurelie
2016-01-01
Background: Quality of reporting for Randomized Clinical Trials (RCTs) in oncology has been analyzed in several systematic reviews, but in this setting there is a paucity of data on outcome definitions and on the consistency of reporting of statistical tests in RCTs and Observational Studies (OBS). The objective of this review was to describe those two reporting aspects for OBS and RCTs in oncology. Methods: From a list of 19 medical journals, three were retained for analysis after a random selection: British Medical Journal (BMJ), Annals of Oncology (AoO) and British Journal of Cancer (BJC). All original articles published between March 2009 and March 2014 were screened. Only studies whose main outcome was accompanied by a corresponding statistical test were included in the analysis. Studies based on censored data were excluded. The primary outcome was to assess quality of reporting of the primary outcome measure in RCTs and of the variables of interest in OBS. A logistic regression was performed to identify study covariates potentially associated with concordance of tests between the Methods and Results sections. Results: 826 studies were included in the review, of which 698 were OBS. Variables were described in the Methods section for all OBS, and the primary endpoint was clearly detailed in the Methods section for 109 RCTs (85.2%). 295 OBS (42.2%) and 43 RCTs (33.6%) had perfect agreement for the reported statistical test between the Methods and Results sections. In multivariable analysis, the variable "number of included patients in study" was associated with test consistency: the adjusted odds ratio (aOR) for the third group compared with the first was 0.52 [0.31–0.89] (P = 0.009). Conclusion: Variables in OBS and primary endpoints in RCTs are reported and described with high frequency. However, consistency of statistical tests between the Methods and Results sections of OBS is not always achieved. We therefore encourage authors and peer reviewers to verify the consistency of statistical tests in oncology studies. PMID:27716793
Lawless, Martin S; Vigeant, Michelle C
2017-10-01
Selecting an appropriate listening test design for concert hall research depends on several factors, including listening test method and participant critical-listening experience. Although expert listeners afford more reliable data, their perceptions may not be broadly representative. The present paper contains two studies that examined the validity and reliability of the data obtained from two listening test methods, a successive and a comparative method, and two types of participants, musicians and non-musicians. Participants rated their overall preference of auralizations generated from eight concert hall conditions with a range of reverberation times (0.0-7.2 s). Study 1, with 34 participants, assessed the two methods. The comparative method yielded similar results and reliability as the successive method. Additionally, the comparative method was rated as less difficult and more preferable. For study 2, an additional 37 participants rated the stimuli using the comparative method only. An analysis of variance of the responses from both studies revealed that musicians are better than non-musicians at discerning their preferences across stimuli. This result was confirmed with a k-means clustering analysis on the entire dataset that revealed five preference groups. Four groups exhibited clear preferences to the stimuli, while the fifth group, predominantly comprising non-musicians, demonstrated no clear preference.
The Performance of Methods to Test Upper-Level Mediation in the Presence of Nonnormal Data
ERIC Educational Resources Information Center
Pituch, Keenan A.; Stapleton, Laura M.
2008-01-01
A Monte Carlo study compared the statistical performance of standard and robust multilevel mediation analysis methods to test indirect effects for a cluster randomized experimental design under various departures from normality. The performance of these methods was examined for an upper-level mediation process, where the indirect effect is a fixed…
NASA Technical Reports Server (NTRS)
Bentley, Nicole L.; Thomas, Evan A.; VanWie, Michael; Morrison, Chad; Stinson, Richard G.
2010-01-01
The Total Organic Carbon Analyzer (TOCA) is designed to autonomously determine recovered water quality as a function of TOC. The current TOCA has been on the International Space Station since November 2008. Functional checkout and operations revealed complex operating considerations. Specifically, failure of the hydrogen catalyst resulted in the development of an innovative oxidation analysis method. This method reduces the activation time and limits the hydrogen produced during analysis, while retaining the ability to indicate TOC concentrations within 25% accuracy. Subsequent testing and comparison to archived samples returned from the Station and tested on the ground yield high confidence in this method, and in the quality of the recovered water.
Garrido-Acosta, Osvaldo; Meza-Toledo, Sergio Enrique; Anguiano-Robledo, Liliana; Valencia-Hernández, Ignacio; Chamorro-Cevallos, Germán
2014-01-01
We determined the median effective dose (ED50) values for the anticonvulsants phenobarbital and sodium valproate using a modification of Lorke's method. This modification allowed appropriate statistical analysis and the use of a smaller number of mice per compound tested. The anticonvulsant activities of phenobarbital and sodium valproate were evaluated in male CD1 mice by maximal electroshock (MES) and intraperitoneal administration of pentylenetetrazole (PTZ). The anticonvulsant ED50 values were obtained through modifications of Lorke's method that involved changes in the selection of the first three doses in the initial test and the fourth dose in the second test. Furthermore, a test was added to evaluate the ED50 calculated by the modified Lorke's method, allowing statistical analysis of the data and determination of the confidence limits for ED50. The ED50 for phenobarbital against MES- and PTZ-induced seizures was 16.3 mg/kg and 12.7 mg/kg, respectively. The sodium valproate values were 261.2 mg/kg and 159.7 mg/kg, respectively. These results are similar to those found using the traditional methods of finding ED50, suggesting that the modifications made to Lorke's method generate equal results using fewer mice while increasing confidence in the statistical analysis. This adaptation of Lorke's method can be used to determine the median lethal dose (LD50) or ED50 for compounds with other pharmacological activities. Copyright © 2014 Elsevier Inc. All rights reserved.
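The abstract does not give the computational details of the modified Lorke procedure, but the final ED50 estimate in designs of this kind is commonly obtained from a log-dose response fit. The following sketch, with entirely hypothetical quantal data, shows one standard way to do this; it is not the authors' exact method.

    import numpy as np
    from scipy.optimize import curve_fit

    # Two-parameter logistic dose-response on log10(dose); ED50 is the dose
    # at which the predicted protected fraction is 0.5.
    def logistic(logd, logED50, slope):
        return 1.0 / (1.0 + np.exp(-slope * (logd - logED50)))

    doses = np.array([5.0, 10.0, 20.0, 40.0])       # mg/kg, hypothetical
    protected = np.array([1, 3, 7, 9]) / 10.0       # fraction protected (n=10/dose)
    popt, pcov = curve_fit(logistic, np.log10(doses), protected, p0=[1.0, 2.0])
    ed50 = 10 ** popt[0]
    se = np.sqrt(pcov[0, 0])                        # SE of log10(ED50)
    ci = (10 ** (popt[0] - 1.96 * se), 10 ** (popt[0] + 1.96 * se))
    print(f"ED50 ~= {ed50:.1f} mg/kg, 95% CI ~ {ci[0]:.1f}-{ci[1]:.1f} mg/kg")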
Osathanunkul, Maslin; Ounjai, Sarawut; Osathanunkul, Rossarin; Madesis, Panagiotis
2017-01-01
It is long believed that some spices may help protect against certain chronic conditions. Spices are usually parts of plants that have been powdered into small pieces. Have you ever wondered what the curry powder in your dish is made of? The aim of this work was to develop an appropriate DNA-based method for assessing spice identity, beginning with the selection of the best marker for species recognition in the Zingiberaceae family. Six DNA regions were investigated in silico: ITS, matK, rbcL, rpoC, trnH-psbA and trnL. Only four regions (ITS, matK, rbcL and trnH-psbA) were then included in the simulated HRM (High-Resolution Melting) analysis, as the preceding analysis showed that rpoC and trnL may not be suitable for identifying Zingiberaceae species in HRM analysis, based on both the percentage of nucleotide variation and the GC content. Simulated HRM analysis was performed to test the feasibility of Bar-HRM. We found that ITS2 is the most effective region for identification of the studied species, and it was therefore used in laboratory HRM analysis. All seven tested Zingiberaceae plants could then be distinguished using the ITS2 primers in laboratory HRM. Most importantly, the melting curves obtained from fresh and dried tissue overlapped, which is a crucial outcome for the applicability of the analysis. The method could be used in an authentication test for dried products. In that test, only one of the seven store-sold Zingiberaceae products examined contained the species listed on its label, while the rest showed substitution or contamination. PMID:29020084
FaceSheet Push-off Tests to Determine Composite Sandwich Toughness at Cryogenic Temperatures
NASA Technical Reports Server (NTRS)
Gates, Thomas S.; Herring, Helen M.
2001-01-01
A novel test method, with associated analysis and experimental procedures, was developed to investigate the toughness of the facesheet-to-core interface of a sandwich material at cryogenic temperatures. The test method is designed to simulate the failure mode associated with facesheet debonding from high levels of gas pressure in the sandwich core. The effects of specimen orientation are considered, and the results of toughness measurements are presented. Comparisons are made between room and liquid nitrogen (-196 C) test temperatures. It was determined that the test method is insensitive to specimen facesheet orientation and that the strain energy release rate increases as the test temperature decreases.
Moazami-Goudarzi, K; Laloë, D
2002-01-01
To determine the relationships among closely related populations or species, two methods are commonly used in the literature: phylogenetic reconstruction or multivariate analysis. The aim of this article is to assess the reliability of multivariate analysis. We describe a method that is based on principal component analysis and Mantel correlations, using a two-step process: The first step consists of a single-marker analysis and the second step tests if each marker reveals the same typology concerning population differentiation. We conclude that if single markers are not congruent, the compromise structure is not meaningful. Our model is not based on any particular mutation process and it can be applied to most of the commonly used genetic markers. This method is also useful to determine the contribution of each marker to the typology of populations. We test whether our method is efficient with two real data sets based on microsatellite markers. Our analysis suggests that for closely related populations, it is not always possible to accept the hypothesis that an increase in the number of markers will increase the reliability of the typology analysis. PMID:12242255
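A minimal permutation Mantel test, of the kind used in the second step described above, can be sketched as follows; the toy distance matrices are hypothetical stand-ins for single-marker typologies, not the authors' data.

    import numpy as np

    # Permutation Mantel test: correlation between two distance matrices,
    # with significance judged by permuting rows/columns of one matrix.
    def mantel(D1, D2, n_perm=999, seed=0):
        rng = np.random.default_rng(seed)
        iu = np.triu_indices_from(D1, k=1)              # upper off-diagonal entries
        r_obs = np.corrcoef(D1[iu], D2[iu])[0, 1]
        count = 0
        for _ in range(n_perm):
            p = rng.permutation(D1.shape[0])
            r = np.corrcoef(D1[np.ix_(p, p)][iu], D2[iu])[0, 1]
            if r >= r_obs:
                count += 1
        return r_obs, (count + 1) / (n_perm + 1)        # one-sided p-value

    # Toy example: two 5x5 symmetric distance matrices (hypothetical markers).
    X = np.random.default_rng(1).random((5, 5))
    D1 = X + X.T; np.fill_diagonal(D1, 0.0)
    Y = D1 + 0.1 * np.random.default_rng(2).random((5, 5))
    D2 = (Y + Y.T) / 2.0; np.fill_diagonal(D2, 0.0)
    print(mantel(D1, D2))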
NASA Technical Reports Server (NTRS)
Keegan, W. B.
1974-01-01
In order to produce cost-effective environmental test programs, test specifications must be realistic and, to be useful, they must be available early in the life of a program. This paper describes a method for achieving such specifications for subsystems by utilizing the results of a statistical analysis of data acquired at subsystem mounting locations during system-level environmental tests. The paper describes the details of this statistical analysis. The resultant recommended levels are a function of the subsystems' mounting location in the spacecraft. Methods of determining this mounting 'zone' are described. Recommendations are then made as to which of the various problem areas encountered should be pursued further.
Lee, Sung-Jae; Brooks, Ronald; Bolan, Robert K.; Flynn, Risa
2013-01-01
Men who have sex with men (MSM) in the United States represent a vulnerable population with lower rates of HIV testing. There are various specific attributes of HIV testing that may impact willingness to test (WTT) for HIV. Identifying specific attributes influencing patients’ decisions around WTT for HIV is critical to ensure improved HIV testing uptake. This study examined WTT for HIV by using conjoint analysis, an innovative method for systematically estimating consumer preferences across discrete attributes. WTT for HIV was assessed across eight hypothetical HIV testing scenarios varying across seven dichotomous attributes: location (home vs. clinic), price (free vs. $50), sample collection (finger prick vs. blood), timeliness of results (immediate vs. 1–2 weeks), privacy (anonymous vs. confidential), results given (by phone vs. in-person), and type of counseling (brochure vs. in-person). Seventy-five MSM were recruited from a community-based organization providing HIV testing services in Los Angeles to participate in the conjoint analysis. The WTT for HIV score was based on a 100-point scale. Scores ranged from 32.2 to 80.3 for the eight hypothetical HIV testing scenarios. Price of HIV testing (free vs. $50) had the highest impact on WTT (impact score=31.4, SD=29.2, p<.0001), followed by timeliness of results (immediate vs. 1–2 weeks) (impact score=13.9, SD=19.9, p<.0001) and testing location (home vs. clinic) (impact score=10.3, SD=22.8, p=.0002). Impacts of other HIV testing attributes were not significant. The conjoint analysis method enabled direct assessment of HIV testing preferences and identified specific attributes that significantly impact WTT for HIV among MSM. This method provided empirical evidence to support the potential uptake of the newly FDA-approved over-the-counter HIV home-test kit with immediate results, with a cautionary note on the cost of the kit. PMID:23651439
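One common way to derive conjoint impact scores of this general sort is to regress scenario ratings on dummy-coded attribute levels; the sketch below uses only three attributes and invented ratings, not the study's seven-attribute design or its exact scoring procedure.

    import numpy as np

    # Full-profile conjoint sketch: regress 0-100 ratings of hypothetical
    # scenarios on dummy-coded attribute levels; each coefficient estimates
    # that attribute's impact on willingness to test.
    # Columns: price (0=free, 1=$50), results (0=immediate, 1=1-2 wk),
    #          site (0=home, 1=clinic)
    X = np.array([[0, 0, 0], [0, 0, 1], [0, 1, 0], [0, 1, 1],
                  [1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]], dtype=float)
    y = np.array([80, 70, 66, 55, 49, 40, 36, 32], dtype=float)  # hypothetical
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    for name, b in zip(["intercept", "price $50", "delayed results", "clinic"], beta):
        print(f"{name}: {b:+.1f} points")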
Tatone, Elise H; Gordon, Jessica L; Hubbs, Jessie; LeBlanc, Stephen J; DeVries, Trevor J; Duffield, Todd F
2016-08-01
Several rapid tests for use on farm have been validated for the detection of hyperketonemia (HK) in dairy cattle; however, the reported sensitivity and specificity of each method vary, and no single study has compared them all. Meta-analysis of diagnostic test accuracy is becoming more common in the human medical literature, but there are few veterinary examples. The objective of this work was to perform a systematic review and meta-analysis to determine the point-of-care testing method with the highest combined sensitivity and specificity, the optimal threshold for each method, and to identify gaps in the literature. A comprehensive literature search resulted in 5196 references. After removing duplicates and performing relevance screening, 23 studies were included for the qualitative synthesis and 18 for the meta-analysis. The three index tests evaluated in the meta-analysis were: the Precision Xtra(®) handheld device measuring beta-hydroxybutyrate (BHB) concentration in whole blood, and Ketostix(®) and KetoTest(®) semi-quantitative strips measuring the concentration of acetoacetate in urine and BHB in milk, respectively. The diagnostic accuracy of the 3 index tests relative to the reference standard measurement of BHB in serum or whole blood between 1.0 and 1.4 mmol/L was compared using the hierarchical summary receiver operating characteristic (HSROC) method. Subgroup analysis was conducted for each index test to examine the accuracy at different thresholds. The impact of the reference standard threshold, the reference standard method, the prevalence of HK in the population, the primary study source and the risk of bias of the primary study was explored using meta-regression. The Precision Xtra(®) device had the highest summary sensitivity for whole blood BHB at 1.2 mmol/L, 94.8% (CI95%: 92.6-97.0), and specificity, 97.5% (CI95%: 96.9-98.1). The threshold employed (1.2-1.4 mmol/L) did not impact the diagnostic accuracy of the test. The Ketostix(®) and KetoTest(®) strips had the highest summary sensitivity and specificity when the trace and weak positive thresholds were used, respectively. Controlling for the source of publication, HK prevalence and the reference standard employed did not impact the estimated sensitivity and specificity of the tests. Including only peer-reviewed studies reduced the number of primary studies evaluating the Precision Xtra(®) by 43% and Ketostix(®) by 33%. Blood, urine and milk are all valid options for diagnosing HK; however, the diagnostic inaccuracy of urine and milk should be considered when making economic and treatment decisions. Copyright © 2016 Elsevier B.V. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-06
... Establishing Test Procedures for the Analysis of Pollutants Under the Clean Water Act; Analysis and Sampling... for use as an alternative oil and grease method. Some comments were specific to the sampling...-side comparison using the specific procedures (e.g. sampling frequency, number of samples, QA/QC, and...
The Shock and Vibration Bulletin. Part 1. Summaries of Presented Papers
1974-10-01
Table of contents excerpt (recoverable titles and authors): MODALAB - A New System for Structural Dynamic Testing, II: Analysis (S. Smith, R. C. Stroud, G. A. Hamma, W. L. Hallaver, R. C. Yee); Analysis and Flight Test Correlation of Vibroacoustic ... (A. Burkhard, R. Scott); Methods for the Analysis of Elastically Supported Isolation Systems; Impact on Complex ... (G. L. Fox)
Joe, Jonathan; Chaudhuri, Shomir; Le, Thai; Thompson, Hilaire; Demiris, George
2015-08-01
While health information technologies have become increasingly popular, many have not been formally tested to ascertain their usability. Traditional rigorous methods take significant amounts of time and manpower to evaluate the usability of a system. In this paper, we evaluate the use of instant data analysis (IDA) as developed by Kjeldskov et al. to perform usability testing on a tool designed for older adults and caregivers. The IDA method is attractive because it takes significantly less time and manpower than the traditional usability testing methods. In this paper we demonstrate how IDA was used to evaluate usability of a multifunctional wellness tool, discuss study results and lessons learned while using this method. We also present findings from an extension of the method which allows the grouping of similar usability problems in an efficient manner. We found that the IDA method is a quick, relatively easy approach to identifying and ranking usability issues among health information technologies. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Howland, G. R.; Durno, J. A.; Twomey, W. J.
1990-01-01
Sikorsky Aircraft, together with the other major helicopter airframe manufacturers, is engaged in a study to improve the use of finite element analysis to predict the dynamic behavior of helicopter airframes, under a rotorcraft structural dynamics program called DAMVIBS (Design Analysis Methods for VIBrationS), sponsored by the NASA-Langley. The test plan and test results are presented for a shake test of the UH-60A BLACK HAWK helicopter. A comparison is also presented of test results with results obtained from analysis using a NASTRAN finite element model.
Gan, Yanjun; Duan, Qingyun; Gong, Wei; ...
2014-01-01
Sensitivity analysis (SA) is a commonly used approach for identifying important parameters that dominate model behaviors. We use a newly developed software package, a Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), to evaluate the effectiveness and efficiency of ten widely used SA methods, including seven qualitative and three quantitative ones. All SA methods are tested using a variety of sampling techniques to screen out the most sensitive (i.e., important) parameters from the insensitive ones. The Sacramento Soil Moisture Accounting (SAC-SMA) model, which has thirteen tunable parameters, is used for illustration. The South Branch Potomac River basin near Springfield, West Virginia in the U.S. is chosen as the study area. The key findings from this study are: (1) For qualitative SA methods, Correlation Analysis (CA), Regression Analysis (RA), and Gaussian Process (GP) screening methods are shown to be not effective in this example. Morris One-At-a-Time (MOAT) screening is the most efficient, needing only 280 samples to identify the most important parameters, but it is the least robust method. Multivariate Adaptive Regression Splines (MARS), Delta Test (DT) and Sum-Of-Trees (SOT) screening methods need about 400–600 samples for the same purpose. Monte Carlo (MC), Orthogonal Array (OA) and Orthogonal Array based Latin Hypercube (OALH) are appropriate sampling techniques for them; (2) For quantitative SA methods, at least 2777 samples are needed for the Fourier Amplitude Sensitivity Test (FAST) to identify parameter main effects. The McKay method needs about 360 samples to evaluate the main effect and more than 1000 samples to assess the two-way interaction effect. OALH and LPτ (LPTAU) sampling techniques are more appropriate for the McKay method. For the Sobol' method, the minimum number of samples needed is 1050 to compute the first-order and total sensitivity indices correctly. These comparisons show that qualitative SA methods are more efficient but less accurate and robust than quantitative ones.
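Of the screening methods compared above, Morris One-At-a-Time is the simplest to sketch. The Python outline below shows the elementary-effect idea on a toy model; the trajectory count, step size, and test function are all hypothetical, and this is not the PSUADE implementation.

    import numpy as np

    # Morris OAT screening: along random trajectories, change one parameter
    # at a time by a step delta and record the elementary effect
    # EE = (f(x + delta*e_i) - f(x)) / delta. The mean of |EE| ranks parameter
    # importance; the std of EE flags nonlinearity or interactions.
    def morris_screen(f, k, r=10, delta=0.25, seed=0):
        rng = np.random.default_rng(seed)
        ee = [[] for _ in range(k)]
        for _ in range(r):
            x = rng.uniform(0, 1 - delta, size=k)   # keep x + delta inside [0, 1]
            fx = f(x)
            for i in rng.permutation(k):
                x2 = x.copy(); x2[i] += delta
                f2 = f(x2)
                ee[i].append((f2 - fx) / delta)
                x, fx = x2, f2
        return [(np.mean(np.abs(e)), np.std(e)) for e in ee]

    # Toy model with one dominant parameter and one interaction.
    f = lambda x: 5 * x[0] + x[1] * x[2] + 0.1 * x[3]
    for i, (mu, sd) in enumerate(morris_screen(f, k=4)):
        print(f"x{i}: mu*={mu:.2f}, sigma={sd:.2f}")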
Applications of Automation Methods for Nonlinear Fracture Test Analysis
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wells, Douglas N.
2013-01-01
Using automated and standardized computer tools to calculate the pertinent test result values has several advantages, such as: (1) allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; (2) eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; (3) lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; and (4) providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: (1) ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; (2) ASTM F 2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; (3) ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.
NASA Technical Reports Server (NTRS)
Gillis, Clarence L; Mitchell, Jesse L
1957-01-01
A test technique and data analysis method have been developed for determining the longitudinal aerodynamic characteristics from free-flight tests of rocket-propelled models. The technique makes use of accelerometers and an angle-of-attack indicator to permit instantaneous measurements of lift, drag, and pitching moments. The data, obtained during transient oscillations resulting from control-surface disturbances, are analyzed by essentially nonlinear direct methods (such as cross plots of the variation of lift coefficient with angle of attack) and by linear indirect methods using the equations of motion for a transient oscillation. The analysis procedure has been set forth in some detail, and the feasibility of the method has been demonstrated by data measured through the transonic speed range on several airplane configurations. It was shown that the flight conditions and dynamic similitude factors for the tests described were reasonably close to typical full-scale airplane conditions.
Hamada, Tsuyoshi; Nakai, Yousuke; Isayama, Hiroyuki; Togawa, Osamu; Kogure, Hirofumi; Kawakubo, Kazumichi; Tsujino, Takeshi; Sasahira, Naoki; Hirano, Kenji; Yamamoto, Natsuyo; Ito, Yukiko; Sasaki, Takashi; Mizuno, Suguru; Toda, Nobuo; Tada, Minoru; Koike, Kazuhiko
2014-03-01
Self-expandable metallic stent (SEMS) placement is widely carried out for distal malignant biliary obstruction, and survival analysis is used to evaluate the cumulative incidences of SEMS dysfunction (e.g. the Kaplan-Meier [KM] method and the log-rank test). However, these statistical methods might be inappropriate in the presence of 'competing risks' (here, death without SEMS dysfunction), which affect the probability of experiencing the event of interest (SEMS dysfunction); that is, SEMS dysfunction can no longer be observed after death. A competing risk analysis has rarely been done in studies on SEMS. We introduced the concept of a competing risk analysis and illustrated its impact on the evaluation of SEMS outcomes using hypothetical and actual data. Our illustrative study included 476 consecutive patients who underwent SEMS placement for unresectable distal malignant biliary obstruction. A significant difference between cumulative incidences of SEMS dysfunction in male and female patients via the KM method (P = 0.044 by the log-rank test) disappeared after applying a competing risk analysis (P = 0.115 by Gray's test). In contrast, although cumulative incidences of SEMS dysfunction via the KM method were similar with and without chemotherapy (P = 0.647 by the log-rank test), the cumulative incidence of SEMS dysfunction in the non-chemotherapy group was shown to be significantly lower (P = 0.031 by Gray's test) in a competing risk analysis. Death as a competing risk event needs to be appropriately considered in estimating the cumulative incidence of SEMS dysfunction, otherwise analytical results may be biased. © 2013 The Authors. Digestive Endoscopy © 2013 Japan Gastroenterological Endoscopy Society.
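The competing-risk quantity at issue here is the cumulative incidence function rather than 1 minus the Kaplan-Meier estimate. A minimal sketch of the nonparametric estimator follows, with hypothetical follow-up data rather than the study's patients.

    import numpy as np

    # Nonparametric cumulative incidence (Aalen-Johansen form) for event type 1
    # (e.g., SEMS dysfunction) with a competing event type 2 (death first):
    # CIF1(t) = sum over event times t_j <= t of S(t_j-) * d1_j / n_j,
    # where S is the overall (any-event) Kaplan-Meier survival.
    def cuminc(times, codes):
        """codes: 1 = event of interest, 2 = competing event, 0 = censored."""
        order = np.argsort(times)
        times, codes = np.asarray(times)[order], np.asarray(codes)[order]
        s, cif, out = 1.0, 0.0, []
        for t in np.unique(times[codes > 0]):
            at_risk = np.sum(times >= t)
            d1 = np.sum((times == t) & (codes == 1))
            d_all = np.sum((times == t) & (codes > 0))
            cif += s * d1 / at_risk          # uses S just before t
            s *= 1.0 - d_all / at_risk
            out.append((t, cif))
        return out

    # Hypothetical follow-up (months): dysfunction, death, and censoring mixed.
    print(cuminc([3, 4, 4, 6, 7, 9, 12], [1, 2, 1, 0, 2, 1, 0]))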
NASA Technical Reports Server (NTRS)
Tripp, John S.; Tcheng, Ping
1999-01-01
Statistical tools, previously developed for nonlinear least-squares estimation of multivariate sensor calibration parameters and the associated calibration uncertainty analysis, have been applied to single- and multiple-axis inertial model attitude sensors used in wind tunnel testing to measure angle of attack and roll angle. The analysis provides confidence and prediction intervals of calibrated sensor measurement uncertainty as functions of applied input pitch and roll angles. A comparative performance study of various experimental designs for inertial sensor calibration is presented along with corroborating experimental data. The importance of replicated calibrations over extended time periods has been emphasized; replication provides independent estimates of calibration precision and bias uncertainties, statistical tests for calibration or modeling bias uncertainty, and statistical tests for sensor parameter drift over time. A set of recommendations for a new standardized model attitude sensor calibration method and usage procedures is included. The statistical information provided by these procedures is necessary for the uncertainty analysis of aerospace test results now required by users of industrial wind tunnel test facilities.
Vibration test of 1/5 scale H-II launch vehicle
NASA Astrophysics Data System (ADS)
Morino, Yoshiki; Komatsu, Keiji; Sano, Masaaki; Minegishi, Masakatsu; Morita, Toshiyuki; Kohsetsu, Y.
In order to predict dynamic loads on the newly designed Japanese H-II launch vehicle, the adequacy of prediction methods has been assessed by dynamic scale model testing. A three-dimensional dynamic model was used in the analysis to express coupling effects among axial, lateral (pitch and yaw) and torsional vibrations. The liquid/tank interaction was considered by use of a boundary element method. The 1/5 scale model of the H-II launch vehicle was designed to simulate stiffness and mass properties of important structural parts, such as core/SRB junctions, first and second stage Lox tanks and engine mount structures. Modal excitation of the test vehicle was accomplished with 100-1000 N shakers, which produced random or sinusoidal vibrational forces. The vibrational response of the test vehicle was measured at various locations with accelerometers and pressure sensors. In the lower frequency range, correspondence between analysis and experiment was generally good. The basic procedures in the analysis seem to be adequate so far, but some improvements in mathematical modeling are suggested by the comparison of test and analysis.
Chiu, Chi-yang; Jung, Jeesun; Chen, Wei; Weeks, Daniel E; Ren, Haobo; Boehnke, Michael; Amos, Christopher I; Liu, Aiyi; Mills, James L; Ting Lee, Mei-ling; Xiong, Momiao; Fan, Ruzong
2017-01-01
To analyze next-generation sequencing data, multivariate functional linear models are developed for a meta-analysis of multiple studies to connect genetic variant data to multiple quantitative traits adjusting for covariates. The goal is to take the advantage of both meta-analysis and pleiotropic analysis in order to improve power and to carry out a unified association analysis of multiple studies and multiple traits of complex disorders. Three types of approximate F -distributions based on Pillai–Bartlett trace, Hotelling–Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants. Simulation analysis is performed to evaluate false-positive rates and power of the proposed tests. The proposed methods are applied to analyze lipid traits in eight European cohorts. It is shown that it is more advantageous to perform multivariate analysis than univariate analysis in general, and it is more advantageous to perform meta-analysis of multiple studies instead of analyzing the individual studies separately. The proposed models require individual observations. The value of the current paper can be seen at least for two reasons: (a) the proposed methods can be applied to studies that have individual genotype data; (b) the proposed methods can be used as a criterion for future work that uses summary statistics to build test statistics to meta-analyze the data. PMID:28000696
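As a small illustration of one of the test statistics named above, the Pillai-Bartlett trace can be computed directly from hypothesis and error sum-of-squares-and-cross-products (SSCP) matrices; the matrices below are invented, and this sketch is not the authors' functional linear model.

    import numpy as np

    # Pillai-Bartlett trace from the hypothesis (H) and error (E) SSCP
    # matrices: V = trace(H @ inv(H + E)). An approximate F statistic then
    # follows from V via the usual MANOVA degrees-of-freedom correction.
    def pillai_trace(H, E):
        return np.trace(H @ np.linalg.inv(H + E))

    # Toy 2-trait example with hypothetical SSCP matrices.
    H = np.array([[8.0, 3.0], [3.0, 5.0]])
    E = np.array([[40.0, 5.0], [5.0, 30.0]])
    print(round(pillai_trace(H, E), 3))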
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-09-01
This report describes the test site, equipment, and procedures and presents the data obtained during field testing at G.P.U. Genco Homer City Station, August 19--24, 1997. This was the third of three field tests that the US Environmental Protection Agency (EPA) conducted in 1997 as part of a major study to evaluate potential improvements to Method 3, EPA's test method for measuring flue gas volumetric flow in stacks. The report also includes a Data Distribution Package, the official, complete repository of the results obtained at the test site.
NASA Astrophysics Data System (ADS)
Okajima, Kenji; Imai, Junichi; Tanaka, Tadatsugu; Iida, Toshiaki
Damage to piles in liquefied ground is frequently reported. Buckling under excess vertical load could be one of the causes of pile damage, as well as lateral flow of the ground and lateral load at the pile head. The buckling mechanism is a complicated interaction between the pile deformation under vertical load and the earth pressure change caused by that deformation. In this study, a series of static buckling model tests of a pile was carried out in dry sand ground with various layer thicknesses. Finite element analysis was applied to the test results to verify the effectiveness of applying an elasto-plastic finite element analysis, combining the implicit-explicit mixed type dynamic relaxation method with the return mapping method, to pile buckling problems. The test results and the analysis indicated the possibility that the buckling load of a pile decreases greatly as the thickness of the layer increases.
A new method for flight test determination of propulsive efficiency and drag coefficient
NASA Technical Reports Server (NTRS)
Bull, G.; Bridges, P. D.
1983-01-01
A flight test method is described from which propulsive efficiency as well as parasite and induced drag coefficients can be directly determined using relatively simple instrumentation and analysis techniques. The method uses information contained in the transient response in airspeed for a small power change in level flight in addition to the usual measurement of power required for level flight. Measurements of pitch angle and longitudinal and normal acceleration are eliminated. The theoretical basis for the method, the analytical techniques used, and the results of application of the method to flight test data are presented.
NASA Astrophysics Data System (ADS)
Chen, Pengfei; Jing, Qi
2017-02-01
An assumption that a non-linear method is more reasonable than a linear method when canopy reflectance is used to establish a yield prediction model was proposed and tested in this study. For this purpose, partial least squares regression (PLSR) and artificial neural networks (ANN), representing linear and non-linear analysis methods respectively, were applied and compared for wheat yield prediction. Multi-period Landsat-8 OLI images were collected at two different wheat growth stages, and a field campaign was conducted to obtain grain yields at selected sampling sites in 2014. The field data were divided into a calibration database and a testing database. Using the calibration data, a cross-validation concept was introduced for the PLSR and ANN model construction to prevent over-fitting. All models were tested using the test data. The ANN yield-prediction model produced R2, RMSE and RMSE% values of 0.61, 979 kg ha-1, and 10.38%, respectively, in the testing phase, performing better than the PLSR yield-prediction model, which produced R2, RMSE, and RMSE% values of 0.39, 1211 kg ha-1, and 12.84%, respectively. The non-linear method is therefore suggested as the better choice for yield prediction.
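A minimal version of this linear-versus-non-linear comparison can be mocked up with scikit-learn; the synthetic reflectance-to-yield data below are purely illustrative (they embed a mild non-linearity so the ANN has something to exploit) and do not reproduce the study's Landsat-8 dataset.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score, mean_squared_error

    # Synthetic "6 spectral bands -> yield" data with a squared term.
    rng = np.random.default_rng(0)
    X = rng.random((120, 6))
    y = 3000 + 2000 * X[:, 0] ** 2 + 800 * X[:, 1] + 100 * rng.standard_normal(120)
    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

    pls = PLSRegression(n_components=3).fit(Xtr, ytr)          # linear model
    ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                       random_state=0).fit(Xtr, ytr)           # non-linear model
    for name, model in [("PLSR", pls), ("ANN", ann)]:
        pred = np.ravel(model.predict(Xte))
        rmse = mean_squared_error(yte, pred) ** 0.5
        print(f"{name}: R2={r2_score(yte, pred):.2f}, RMSE={rmse:.0f} kg/ha")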
Poisson Approximation-Based Score Test for Detecting Association of Rare Variants.
Fang, Hongyan; Zhang, Hong; Yang, Yaning
2016-07-01
Genome-wide association study (GWAS) has achieved great success in identifying genetic variants, but the nature of GWAS has determined its inherent limitations. Under the common disease rare variants (CDRV) hypothesis, the traditional association analysis methods commonly used in GWAS for common variants do not have enough power for detecting rare variants with a limited sample size. As a solution to this problem, pooling rare variants by their functions provides an efficient way of identifying susceptible genes. Rare variants typically have low minor-allele frequencies, and the distribution of the total number of minor alleles of the rare variants can be approximated by a Poisson distribution. Based on this fact, we propose a new test method, the Poisson Approximation-based Score Test (PAST), for association analysis of rare variants. Two testing methods, namely ePAST and mPAST, are proposed based on different strategies of pooling rare variants. Simulation results and application to the CRESCENDO cohort data show that our methods are more powerful than the existing methods. © 2016 John Wiley & Sons Ltd/University College London.
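The published PAST statistics are more elaborate, but the core idea, treating pooled minor-allele counts as approximately Poisson and comparing rates between groups, can be sketched as a simple score-type z test; the counts below are hypothetical.

    import math

    # Score-type test for a difference in pooled rare-variant minor-allele
    # rates between cases and controls, treating the total count in each
    # group as Poisson (reasonable when allele frequencies are small).
    def poisson_score_test(x1, n1, x2, n2):
        """x: total minor-allele count, n: subjects; returns (z, two-sided p)."""
        rate0 = (x1 + x2) / (n1 + n2)                 # pooled rate under H0
        z = (x1 / n1 - x2 / n2) / math.sqrt(rate0 * (1 / n1 + 1 / n2))
        p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p

    # Hypothetical: 90 minor alleles in 500 cases vs 55 in 500 controls.
    print(poisson_score_test(90, 500, 55, 500))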
Alvin H. Yu; Garry Chick
2010-01-01
This study compared the utility of two different post-hoc tests after detecting significant differences within factors on multiple dependent variables using multivariate analysis of variance (MANOVA). We compared the univariate F test (the Scheffé method) to descriptive discriminant analysis (DDA) using an educational-tour survey of university study-...
ERIC Educational Resources Information Center
Ojerinde, Dibu; Popoola, Omokunmi; Onyeneho, Patrick; Egberongbe, Aminat
2016-01-01
The statistical procedure used to adjust for differences in test score difficulty across test forms is known as "equating". Equating makes it possible for various test forms to be used interchangeably. In terms of where the equating method fits in the assessment cycle, there are pre-equating and post-equating methods. The major benefits of pre-equating, when…
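As a one-line example of what equating does, linear equating maps a form-X score onto the form-Y scale by matching means and standard deviations; the form statistics below are invented.

    # Linear equating: y* = mu_Y + (sd_Y / sd_X) * (x - mu_X), so that the
    # equated scores have the mean and spread of form Y.
    def linear_equate(x, mu_x, sd_x, mu_y, sd_y):
        return mu_y + (sd_y / sd_x) * (x - mu_x)

    # Hypothetical forms: X (mean 48, sd 10) and Y (mean 52, sd 12).
    for score in (40, 48, 60):
        print(score, "->", round(linear_equate(score, 48, 10, 52, 12), 1))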
Noar, Seth M; Mehrotra, Purnima
2011-03-01
Traditional theory testing commonly applies cross-sectional (and occasionally longitudinal) survey research to test health behavior theory. Since such correlational research cannot demonstrate causality, a number of researchers have called for the increased use of experimental methods for theory testing. We introduce the multi-methodological theory-testing (MMTT) framework for testing health behavior theory. The MMTT framework introduces a set of principles that broaden the perspective of how we view evidence for health behavior theory. It suggests that while correlational survey research designs represent one method of testing theory, the weaknesses of this approach demand that complementary approaches be applied. Such approaches include randomized lab and field experiments, mediation analysis of theory-based interventions, and meta-analysis. These alternative approaches to theory testing can demonstrate causality in a much more robust way than is possible with correlational survey research methods. Such approaches should thus be increasingly applied in order to more completely and rigorously test health behavior theory. Greater application of research derived from the MMTT may lead researchers to refine and modify theory and ultimately make theory more valuable to practitioners. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Leak Rate Quantification Method for Gas Pressure Seals with Controlled Pressure Differential
NASA Technical Reports Server (NTRS)
Daniels, Christopher C.; Braun, Minel J.; Oravec, Heather A.; Mather, Janice L.; Taylor, Shawn C.
2015-01-01
An enhancement to the pressure decay leak rate method with mass point analysis solved deficiencies in the standard method. By adding a control system, a constant gas pressure differential across the test article was maintained. As a result, the desired pressure condition was met at the onset of the test, and the mass leak rate and measurement uncertainty were computed in real-time. The data acquisition and control system were programmed to automatically stop when specified criteria were met. Typically, the test was stopped when a specified level of measurement uncertainty was attained. Using silicone O-ring test articles, the new method was compared with the standard method that permitted the downstream pressure to be non-constant atmospheric pressure. The two methods recorded comparable leak rates, but the new method recorded leak rates with significantly lower measurement uncertainty, statistical variance, and test duration. Utilizing this new method in leak rate quantification, projects will reduce cost and schedule, improve test results, and ease interpretation between data sets.
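The essence of the mass point analysis mentioned above is converting each sampled pressure and temperature of the known test volume to a gas mass via the ideal gas law, then taking the slope of mass versus time as the leak rate. The sketch below uses hypothetical test values, not the paper's O-ring data.

    import numpy as np

    # Mass point analysis for a pressure decay leak test:
    # m = P * V * M / (R * T); leak rate = -d(m)/dt from a linear fit.
    R = 8.314462          # J/(mol*K)
    M_AIR = 0.0289647     # kg/mol, dry air

    def leak_rate(t_s, P_pa, T_k, V_m3):
        m = np.asarray(P_pa) * V_m3 * M_AIR / (R * np.asarray(T_k))
        slope, _ = np.polyfit(t_s, m, 1)      # kg/s; negative = mass leaving
        return -slope

    # Hypothetical 5-minute test of a 2-litre volume near 300 kPa.
    t = np.arange(0, 300, 30.0)               # s
    P = 300e3 - 0.4 * t                       # Pa, slow decay
    T = np.full_like(t, 296.15)               # K, held constant
    print(f"leak rate ~ {leak_rate(t, P, T, 0.002):.3e} kg/s")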
Substantiation Data for Advanced Beaded and Tubular Structural Panels. Volume 3: Testing
NASA Technical Reports Server (NTRS)
Hedges, P. C.; Greene, B. E.
1974-01-01
The test program conducted to provide the necessary experimental data to verify the design and analysis methods developed for beaded and tubular panels is described. Test results are summarized and presented for all local buckling and full-size panel tests. Selected representative test data from each of these tests are presented in detail. The results of this program established a valid analysis and design procedure for circular tube panels. Test results from three other configurations show deformational modes which are not adequately accounted for in the present analyses.
NASA Astrophysics Data System (ADS)
Oleksik, Mihaela; Oleksik, Valentin
2013-05-01
The current paper presents a fast method for determining the material characteristics of composite materials used in airbag manufacturing. The inverse analysis method was used to determine the material data needed for other complex numerical simulations at the macroscopic level. Tensile tests were carried out on the composite material extracted along two directions, the direction of the weft and the direction of the warp, and numerical simulations were then performed (using the Ls-Dyna software). A second stage consisted of numerical simulation through the finite element method and experimental testing for the Bias test. The material characteristics of the composite fabric were then obtained by applying a multicriterial analysis using the Ls-Opt software, which minimised the mismatch between the numerically and experimentally obtained force-displacement curves for both directions (weft and warp), as well as the mismatch between the strain-extension curves for two points in the Bias test.
Nonclinical dose formulation analysis method validation and sample analysis.
Whitmire, Monica Lee; Bryan, Peter; Henry, Teresa R; Holbrook, John; Lehmann, Paul; Mollitor, Thomas; Ohorodnik, Susan; Reed, David; Wietgrefe, Holly D
2010-12-01
Nonclinical dose formulation analysis methods are used to confirm test article concentration and homogeneity in formulations and determine formulation stability in support of regulated nonclinical studies. There is currently no regulatory guidance for nonclinical dose formulation analysis method validation or sample analysis. Regulatory guidance for the validation of analytical procedures has been developed for drug product/formulation testing; however, verification of the formulation concentrations falls under the framework of GLP regulations (not GMP). The only current related regulatory guidance is the bioanalytical guidance for method validation. The fundamental parameters for bioanalysis and formulation analysis validations that overlap include: recovery, accuracy, precision, specificity, selectivity, carryover, sensitivity, and stability. Divergence in bioanalytical and drug product validations typically center around the acceptance criteria used. As the dose formulation samples are not true "unknowns", the concept of quality control samples that cover the entire range of the standard curve serving as the indication for the confidence in the data generated from the "unknown" study samples may not always be necessary. Also, the standard bioanalytical acceptance criteria may not be directly applicable, especially when the determined concentration does not match the target concentration. This paper attempts to reconcile the different practices being performed in the community and to provide recommendations of best practices and proposed acceptance criteria for nonclinical dose formulation method validation and sample analysis.
20 Meter Solar Sail Analysis and Correlation
NASA Technical Reports Server (NTRS)
Taleghani, B.; Lively, P.; Banik, J.; Murphy, D.; Trautt, T.
2005-01-01
This presentation discusses studies conducted to determine the element type and size that best represent a 20-meter solar sail under ground-test load conditions, test/analysis correlation using a static shape optimization method for the Q4 sail, and system dynamics. TRIA3 elements represent wrinkle patterns better than QUAD3 elements do; baseline ten-inch elements are small enough to accurately represent sail shape, and the baseline TRIA3 mesh requires a reasonable computation time of 8 min 21 s. In the test/analysis correlation using the static shape optimization method for the Q4 sail, ten parameters were chosen and varied during optimization, and 300 sail models were created with random parameters. A response surface for each target was created based on the varied parameters, and the parameters were then optimized based on the response surfaces. Deflection shape comparisons for 0 and 22.5 degrees yielded errors of 4.3% and 2.1%, respectively. For the system dynamics study, testing was done on the booms without the sails attached. The nominal boom properties produced a good correlation to the test data; the frequencies were within 10%. Boom-dominated analysis frequencies and modes compared well with the test results.
Airside HVAC BESTEST: HVAC Air-Distribution System Model Test Cases for ASHRAE Standard 140
DOE Office of Scientific and Technical Information (OSTI.GOV)
Judkoff, Ronald; Neymark, Joel; Kennedy, Mike D.
This paper summarizes recent work to develop new airside HVAC equipment model analytical verification test cases for ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs. The analytical verification test method allows comparison of simulation results from a wide variety of building energy simulation programs with quasi-analytical solutions, further described below. Standard 140 is widely cited for evaluating software for use with performance-path energy efficiency analysis, in conjunction with well-known energy-efficiency standards including ASHRAE Standard 90.1, the International Energy Conservation Code, and other international standards. Airside HVAC equipment is a common area of modelling not previously explicitly tested by Standard 140. Integration of the completed test suite into Standard 140 is in progress.
Second ventilatory threshold from heart-rate variability: valid when the upper body is involved?
Mourot, Laurent; Fabre, Nicolas; Savoldelli, Aldo; Schena, Federico
2014-07-01
To determine the most accurate method based on spectral analysis of heart-rate variability (SA-HRV) during an incremental and continuous maximal test involving the upper body, the authors tested 4 different methods to obtain the heart rate (HR) at the second ventilatory threshold (VT2). Sixteen ski mountaineers (mean ± SD; age 25 ± 3 y, height 177 ± 8 cm, mass 69 ± 10 kg) performed a roller-ski test on a treadmill. Respiratory variables and HR were continuously recorded, and the 4 SA-HRV methods were compared with the gas-exchange method through Bland and Altman analyses. The best method was the one based on a time-varying spectral analysis with high frequency ranging from 0.15 Hz to a cutoff point relative to the individual's respiratory sinus arrhythmia. The HR values were significantly correlated (r2 = .903), with a mean HR difference with the respiratory method of 0.1 ± 3.0 beats/min and low limits of agreement (around -6/+6 beats/min). The 3 other methods led to larger errors and lower agreement (up to 5 beats/min and around -23/+20 beats/min). It is possible to accurately determine VT2 with an HR monitor during an incremental test involving the upper body if the appropriate HRV method is used.
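The Bland and Altman comparison used above reduces to a bias and limits-of-agreement calculation on paired HR values; the sketch below uses invented data for eight athletes, not the study's measurements.

    import numpy as np

    # Bland-Altman agreement between HR at VT2 from gas exchange and from an
    # SA-HRV method: bias = mean difference; limits of agreement = bias +/- 1.96 SD.
    def bland_altman(hr_ref, hr_test):
        d = np.asarray(hr_test, float) - np.asarray(hr_ref, float)
        bias, sd = d.mean(), d.std(ddof=1)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    # Hypothetical paired HR values (beats/min).
    ref  = [168, 172, 165, 181, 176, 170, 174, 169]
    test = [167, 175, 164, 182, 174, 171, 176, 168]
    bias, loa = bland_altman(ref, test)
    print(f"bias={bias:+.1f} bpm, LoA={loa[0]:.1f} to {loa[1]:.1f} bpm")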
Testing all six person-oriented principles in dynamic factor analysis.
Molenaar, Peter C M
2010-05-01
All six person-oriented principles identified by Sterba and Bauer's Keynote Article can be tested by means of dynamic factor analysis in its current form. In particular, it is shown how complex interactions and interindividual differences/intraindividual change can be tested in this way. In addition, the necessity of using single-subject methods in the analysis of developmental processes is emphasized, and attention is drawn to the possibility of optimally treating developmental psychopathology by means of new computational techniques that can be integrated with dynamic factor analysis.
Owen, Rhiannon K; Cooper, Nicola J; Quinn, Terence J; Lees, Rosalind; Sutton, Alex J
2018-07-01
Network meta-analyses (NMA) have extensively been used to compare the effectiveness of multiple interventions for health care policy and decision-making. However, methods for evaluating the performance of multiple diagnostic tests are less established. In a decision-making context, we are often interested in comparing and ranking the performance of multiple diagnostic tests, at varying levels of test thresholds, in one simultaneous analysis. Motivated by an example of cognitive impairment diagnosis following stroke, we synthesized data from 13 studies assessing the efficiency of two diagnostic tests: Mini-Mental State Examination (MMSE) and Montreal Cognitive Assessment (MoCA), at two test thresholds: MMSE <25/30 and <27/30, and MoCA <22/30 and <26/30. Using Markov chain Monte Carlo (MCMC) methods, we fitted a bivariate network meta-analysis model incorporating constraints on increasing test threshold, and accounting for the correlations between multiple test accuracy measures from the same study. We developed and successfully fitted a model comparing multiple test/threshold combinations while imposing threshold constraints. Using this model, we found that MoCA at threshold <26/30 appeared to have the best true positive rate, whereas MMSE at threshold <25/30 appeared to have the best true negative rate. The combined analysis of multiple tests at multiple thresholds allowed for more rigorous comparisons between competing diagnostic tests for decision making. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
A Meta-Analysis of Gifted and Talented Identification Practices
ERIC Educational Resources Information Center
Hodges, Jaret; Tay, Juliana; Maeda, Yukiko; Gentry, Marcia
2018-01-01
Researchers consider the underrepresentation of Black, Hispanic, and Native American students to be largely due to the use of traditional methods of identification (i.e., IQ and standardized achievement tests). To address this concern, researchers created novel nontraditional identification methods (e.g., nonverbal tests, student portfolios,…
Mechergui, Arij; Achour, Wafa; Ben Hassen, Assia
2014-08-01
We aimed to compare the accuracy of genus- and species-level identification of Neisseria spp. using biochemical testing and 16S rRNA sequence analysis. These methods were evaluated using 85 Neisseria spp. clinical isolates initially identified to the genus level by conventional biochemical tests and the API NH system (bioMérieux®). In 34% (29/85), more than one possibility was given by 16S rRNA sequence analysis. In 6% (5/85), one of the possibilities offered by 16S rRNA gene sequencing agreed with the result given by biochemical testing. In 4% (3/85), the same species was given by both methods. 16S rRNA gene sequencing results did not correlate well with biochemical tests.
NASA Astrophysics Data System (ADS)
Vainshtein, Igor; Baruch, Shlomi; Regev, Itai; Segal, Victor; Filis, Avishai; Riabzev, Sergey
2018-05-01
The growing demand for EO applications that work around the clock, 24 hours a day and 7 days a week, such as border surveillance systems, emphasizes the need for a highly reliable cryocooler with increased operational availability and optimized Integrated Logistic Support (ILS). To meet this need, RICOR developed linear and rotary cryocoolers that successfully achieved this goal. Cryocooler MTTF was analyzed by theoretical reliability evaluation methods, demonstrated by normal and accelerated life tests at the cryocooler level, and finally verified by field data analysis derived from cryocoolers operating at the system level. The following paper reviews theoretical reliability analysis methods together with reliability test results derived from standard and accelerated life demonstration tests performed at RICOR's advanced reliability laboratory. As a summary of the work process, reliability verification data are presented as feedback from fielded systems.
A 100-Year Review: Sensory analysis of milk.
Schiano, A N; Harwood, W S; Drake, M A
2017-12-01
Evaluation of the sensory characteristics of food products has been, and will continue to be, the ultimate method for evaluating product quality. Sensory quality is a parameter that can be evaluated only by humans and consists of a series of tests or tools that can be applied objectively or subjectively within the constructs of carefully selected testing procedures and parameters. Depending on the chosen test, evaluators are able to probe areas of interest that are intrinsic product attributes (e.g., flavor profiles and off-flavors) as well as extrinsic measures (e.g., market penetration and consumer perception). This review outlines the literature pertaining to relevant testing procedures and studies of the history of sensory analysis of fluid milk. In addition, evaluation methods outside of traditional sensory techniques and future outlooks on the subject of sensory analysis of fluid milk are explored and presented. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Program to analyze aquifer test data and check for validity with the jacob method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Field, M.S.
1993-01-01
The Jacob straight-line method of aquifer analysis deals with the late-time data and small radius of the Theis type curve, which plot as a straight line if the drawdown data are plotted on an arithmetic scale and the time data on a logarithmic (base 10) scale. Correct analysis with the Jacob method normally assumes that (1) the data lie on a straight line, (2) the value of the dimensionless time factor is less than 0.01, and (3) the site's hydrogeology conforms to the method's assumptions and limiting conditions. Items 1 and 2 are usually considered for the Jacob method, but item 3 is often ignored, which can lead to incorrect calculations of aquifer parameters. A BASIC computer program was developed to analyze aquifer test data with the Jacob method and to test the validity of its use. Aquifer test data are entered into the program and manipulated so that the slope and time intercept of the straight line drawn through the data (excluding early-time and late-time data) can be used to calculate transmissivity and storage coefficient. Late-time data are excluded to eliminate the effects of positive and negative boundaries. The time-drawdown data then are converted into dimensionless units to determine if the Jacob method's assumptions are valid for the hydrogeologic conditions under which the test was conducted.
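A minimal sketch of the arithmetic such a program automates, assuming SI-style units and illustrative variable names (the original BASIC source is not reproduced here):

import numpy as np

def jacob_fit(t_min, s_m, Q_m3d, r_m):
    """t in minutes, drawdown s in metres, pumping rate Q in m^3/day, radius r in m."""
    t_d = np.asarray(t_min) / 1440.0                  # minutes -> days
    slope, intercept = np.polyfit(np.log10(t_d), np.asarray(s_m), 1)
    delta_s = slope                                   # drawdown per log cycle
    t0 = 10 ** (-intercept / slope)                   # zero-drawdown time intercept, days
    T = 2.303 * Q_m3d / (4.0 * np.pi * delta_s)       # transmissivity, m^2/day
    S = 2.25 * T * t0 / r_m**2                        # storage coefficient
    u = r_m**2 * S / (4.0 * T * t_d)                  # dimensionless time factor
    return T, S, u

The returned u values implement the validity check in item 2 above: the straight-line approximation is conventionally trusted only where u < 0.01.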
Relativity Concept Inventory: Development, Analysis, and Results
ERIC Educational Resources Information Center
Aslanides, J. S.; Savage, C. M.
2013-01-01
We report on a concept inventory for special relativity: the development process, data analysis methods, and results from an introductory relativity class. The Relativity Concept Inventory tests understanding of relativistic concepts. An unusual feature is confidence testing for each question. This can provide additional information; for example,…
Experimental and Analytical Determinations of Spiral Bevel Gear-Tooth Bending Stress Compared
NASA Technical Reports Server (NTRS)
Handschuh, Robert F.
2000-01-01
Spiral bevel gears are currently used in all main-rotor drive systems for rotorcraft produced in the United States. Applications such as these need spiral bevel gears to turn the corner from the horizontal gas turbine engine to the vertical rotor shaft. These gears must typically operate at extremely high rotational speeds and carry high power levels. With these difficult operating conditions, an improved analytical capability is paramount to increasing aircraft safety and reliability. Also, literature on the analysis and testing of spiral bevel gears has been very sparse in comparison to that for parallel axis gears. This is due to the complex geometry of this type of gear and to the specialized test equipment necessary to test these components. To develop an analytical model of spiral bevel gears, researchers use differential geometry methods to model the manufacturing kinematics. A three-dimensional spiral bevel gear modeling method was developed that uses finite elements for the structural analysis. This method was used to analyze the three-dimensional contact pattern between the test pinion and gear used in the Spiral Bevel Gear Test Facility at the NASA Glenn Research Center at Lewis Field. Results of this analysis are illustrated in the preceding figure. The development of the analytical method was a joint endeavor between NASA Glenn, the U.S. Army Research Laboratory, and the University of North Dakota.
Multivariate Welch t-test on distances.
Alekseyenko, Alexander V
2016-12-01
Permutational non-Euclidean analysis of variance, PERMANOVA, is routinely used in exploratory analysis of multivariate datasets to draw conclusions about the significance of patterns visualized through dimension reduction. This method recognizes that the pairwise distance matrix between observations is sufficient to compute the within- and between-group sums of squares necessary to form the (pseudo) F statistic. Moreover, not only Euclidean but arbitrary distances can be used. This method, however, suffers from loss of power and type I error inflation in the presence of heteroscedasticity and sample size imbalances. We develop a solution in the form of a distance-based Welch t-test, TW2, for two-sample, potentially unbalanced and heteroscedastic data. We demonstrate empirically the desirable type I error and power characteristics of the new test. We compare the performance of PERMANOVA and TW2 in reanalysis of two existing microbiome datasets, where the methodology has originated. The source code for methods and analysis of this article is available at https://github.com/alekseyenko/Tw2. Further guidance on application of these methods can be obtained from the author (alekseye@musc.edu). © The Author 2016. Published by Oxford University Press.
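For readers unfamiliar with the distance-based (pseudo) F referred to above, here is a hedged sketch of its computation directly from a pairwise distance matrix, following the standard PERMANOVA construction; the TW2 statistic itself uses a different, heteroscedasticity-robust form and is not reproduced here.

import numpy as np

def pseudo_f(D, labels):
    """D: (N, N) pairwise distance matrix; labels: group label per observation."""
    D2 = np.asarray(D) ** 2
    labels = np.asarray(labels)
    N, groups = len(labels), np.unique(labels)
    a = len(groups)
    iu = np.triu_indices(N, k=1)
    ss_total = D2[iu].sum() / N                      # total sum of squares
    ss_within = 0.0
    for g in groups:                                  # within-group sums of squares
        idx = np.where(labels == g)[0]
        sub = D2[np.ix_(idx, idx)]
        ss_within += np.triu(sub, k=1).sum() / len(idx)
    ss_between = ss_total - ss_within
    return (ss_between / (a - 1)) / (ss_within / (N - a))

In PERMANOVA, significance is then assessed by recomputing this statistic under random permutations of the labels.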
Shear Recovery Accuracy in Weak-Lensing Analysis with the Elliptical Gauss-Laguerre Method
NASA Astrophysics Data System (ADS)
Nakajima, Reiko; Bernstein, Gary
2007-04-01
We implement the elliptical Gauss-Laguerre (EGL) galaxy-shape measurement method proposed by Bernstein & Jarvis and quantify the shear recovery accuracy in weak-lensing analysis. This method uses a deconvolution fitting scheme to remove the effects of the point-spread function (PSF). The test simulates >10^7 noisy galaxy images convolved with anisotropic PSFs and attempts to recover an input shear. The tests are designed to be immune to statistical (random) distributions of shapes, selection biases, and crowding, in order to test more rigorously the effects of detection significance (signal-to-noise ratio [S/N]), PSF, and galaxy resolution. The systematic error in shear recovery is divided into two classes, calibration (multiplicative) and additive, with the latter arising from PSF anisotropy. At S/N > 50, the deconvolution method measures the galaxy shape and input shear to ~1% multiplicative accuracy and suppresses >99% of the PSF anisotropy. These systematic errors increase to ~4% for the worst conditions, with poorly resolved galaxies at S/N ≈ 20. The EGL weak-lensing analysis has the best demonstrated accuracy to date, sufficient for the next generation of weak-lensing surveys.
ERIC Educational Resources Information Center
Karadag, Engin
2010-01-01
To assess research methods and analysis of statistical techniques employed by educational researchers, this study surveyed unpublished doctoral dissertation from 2003 to 2007. Frequently used research methods consisted of experimental research; a survey; a correlational study; and a case study. Descriptive statistics, t-test, ANOVA, factor…
The "Promise" of Three Methods of Word Association Analysis to L2 Lexical Research
ERIC Educational Resources Information Center
Zareva, Alla; Wolter, Brent
2012-01-01
The present study is an attempt to empirically test and compare the results of three methods of word association (WA) analysis. Two of the methods--namely, associative commonality and nativelikeness, and lexico-syntactic patterns of associative organization--have been traditionally used in both first language (L1) and second language (L2)…
NASA Astrophysics Data System (ADS)
Wojcieszak, D.; Przybył, J.; Lewicki, A.; Ludwiczak, A.; Przybylak, A.; Boniecki, P.; Koszela, K.; Zaborowicz, M.; Przybył, K.; Witaszek, K.
2015-07-01
The aim of this research was to investigate the possibility of using computer image analysis and artificial neural networks to assess the amount of dry matter in the tested compost samples. The research led to the conclusion that neural image analysis may be a useful tool for determining the quantity of dry matter in compost. The generated neural model may be the beginning of research into the use of neural image analysis to assess the content of dry matter and other constituents of compost. The presented model, RBF 19:19-2-1:1, characterized by a test error of 0.092189, may be further improved.
Stress Analysis of Columns and Beam Columns by the Photoelastic Method
NASA Technical Reports Server (NTRS)
Ruffner, B F
1946-01-01
Principles of similarity and other factors in the design of models for photoelastic testing are discussed. Some approximate theoretical equations, useful in the analysis of results obtained from photoelastic tests, are derived. Examples of the use of photoelastic techniques and the analysis of results as applied to uniform and tapered beam columns, circular rings, and statically indeterminate frames are given. It is concluded that this method is an effective tool for the analysis of structures in which column action is present, particularly in tapered beam columns and in statically indeterminate structures in which the distribution of loads is influenced by bending moments due to axial loads in one or more members.
Literature review : an analysis of laboratory fatigue tests.
DOT National Transportation Integrated Search
1975-01-01
This report discusses the various types of fatigue tests, grouped by the type of specimen (beam, plate, Marshall, etc.) used. The discussion under each type of specimen covers the test and the analytical methods used in evaluating the data. The test...
Kim, Eun Sook; Cao, Chunhua
2015-01-01
Considering that group comparisons are common in social science, we examined two latent group mean testing methods when groups of interest were at either the between or the within level of multilevel data: multiple-group multilevel confirmatory factor analysis (MG ML CFA) and multilevel multiple-indicators multiple-causes modeling (ML MIMIC). The performance of these methods was investigated through three Monte Carlo studies. In Studies 1 and 2, either factor variances or residual variances were manipulated to be heterogeneous between groups. In Study 3, which focused on within-level multiple-group analysis, six different model specifications were considered depending on how the intra-class group correlation (i.e., correlation between random effect factors for groups within cluster) was modeled. The results of the simulations generally supported the adequacy of MG ML CFA and ML MIMIC for multiple-group analysis with multilevel data. The two methods did not show any notable difference in latent group mean testing across the three studies. Finally, a demonstration with real data and guidelines for selecting an appropriate approach to multilevel multiple-group analysis are provided.
NASA Technical Reports Server (NTRS)
Chin, Alexander W.; Herrera, Claudia Y.; Spivey, Natalie D.; Fladung, William A.; Cloutier, David
2015-01-01
The mass properties of an aerospace vehicle are required by multiple disciplines in the analysis and prediction of flight behavior. Pendulum oscillation methods have been developed and employed for almost a century as a means to measure mass properties. However, these oscillation methods are costly, time consuming, and risky. The NASA Armstrong Flight Research Center has been investigating the Dynamic Inertia Measurement (DIM) method as a possible alternative to oscillation methods. The DIM method uses ground-test techniques that are already applied to aerospace vehicles when conducting modal surveys. Ground vibration tests would require minimal additional instrumentation and time to apply the DIM method. The DIM method has been validated on smaller test articles but has not yet been fully proven on large aerospace vehicles.
USDA-ARS?s Scientific Manuscript database
Visible/near-infrared (Vis/NIR) spectroscopy with wavelength range between 400 and 2500 nm combined with factor analysis method was tested to predict quality attributes of chicken breast fillets. Quality attributes, including color (L*, a*, b*), pH, and drip loss were analyzed using factor analysis ...
An instructional guide for leaf color analysis using digital imaging software
Paula F. Murakami; Michelle R. Turner; Abby K. van den Berg; Paul G. Schaberg
2005-01-01
Digital color analysis has become an increasingly popular and cost-effective method utilized by resource managers and scientists for evaluating foliar nutrition and health in response to environmental stresses. We developed and tested a new method of digital image analysis that uses Scion Image or NIH Image public domain software to quantify leaf color. This...
One-year test-retest reliability of intrinsic connectivity network fMRI in older adults
Guo, Cong C.; Kurth, Florian; Zhou, Juan; Mayer, Emeran A.; Eickhoff, Simon B; Kramer, Joel H.; Seeley, William W.
2014-01-01
“Resting-state” or task-free fMRI can assess intrinsic connectivity network (ICN) integrity in health and disease, suggesting a potential for use of these methods as disease-monitoring biomarkers. Numerous analytical options are available, including model-driven ROI-based correlation analysis and model-free, independent component analysis (ICA). High test-retest reliability will be a necessary feature of a successful ICN biomarker, yet available reliability data remains limited. Here, we examined ICN fMRI test-retest reliability in 24 healthy older subjects scanned roughly one year apart. We focused on the salience network, a disease-relevant ICN not previously subjected to reliability analysis. Most ICN analytical methods proved reliable (intraclass coefficients > 0.4) and could be further improved by wavelet analysis. Seed-based ROI correlation analysis showed high map-wise reliability, whereas graph theoretical measures and temporal concatenation group ICA produced the most reliable individual unit-wise outcomes. Including global signal regression in ROI-based correlation analyses reduced reliability. Our study provides a direct comparison between the most commonly used ICN fMRI methods and potential guidelines for measuring intrinsic connectivity in aging control and patient populations over time. PMID:22446491
Results and Analysis from Space Suit Joint Torque Testing
NASA Technical Reports Server (NTRS)
Matty, Jennifer
2010-01-01
This joint mobility KC lecture included information from two papers, "A Method for and Issues Associated with the Determination of Space Suit Joint Requirements" and "Results and Analysis from Space Suit Joint Torque Testing," as presented for the International Conference on Environmental Systems in 2009 and 2010, respectively. The first paper discusses historical joint torque testing methodologies and approaches that were tested in 2008 and 2009. The second paper discusses the testing that was completed in 2009 and 2010.
A functional U-statistic method for association analysis of sequencing data.
Jadhav, Sneha; Tong, Xiaoran; Lu, Qing
2017-11-01
Although sequencing studies hold great promise for uncovering novel variants predisposing to human diseases, the high dimensionality of the sequencing data brings tremendous challenges to data analysis. Moreover, for many complex diseases (e.g., psychiatric disorders), multiple related phenotypes are collected. These phenotypes can be different measurements of an underlying disease, or measurements characterizing multiple related diseases for studying common genetic mechanisms. Although jointly analyzing these phenotypes could potentially increase the power of identifying disease-associated genes, the different types of phenotypes pose challenges for association analysis. To address these challenges, we propose a nonparametric method, the functional U-statistic method (FU), for multivariate analysis of sequencing data. It first constructs smooth functions from individuals' sequencing data, and then tests the association of these functions with multiple phenotypes by using a U-statistic. The method provides a general framework for analyzing various types of phenotypes (e.g., binary and continuous phenotypes) with unknown distributions. Fitting the genetic variants within a gene using a smoothing function also allows us to capture complexities of gene structure (e.g., linkage disequilibrium, LD), which could potentially increase the power of association analysis. Through simulations, we compared our method to the multivariate outcome score test (MOST), and found that our test attained better performance than MOST. In a real data application, we applied our method to the sequencing data from the Minnesota Twin Study (MTS) and found potential associations of several nicotine receptor subunit (CHRN) genes, including CHRNB3, with nicotine dependence and/or alcohol dependence. © 2017 WILEY PERIODICALS, INC.
Application of econometric and ecology analysis methods in physics software
NASA Astrophysics Data System (ADS)
Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Ronchieri, Elisabetta; Saracco, Paolo
2017-10-01
Some data analysis methods typically used in econometric studies and in ecology have been evaluated and applied in physics software environments. They concern the evolution of observables through objective identification of change points and trends, and measurements of inequality, diversity, and evenness across a data set. Within each analysis area, various statistical tests and measures have been examined. This conference paper provides a brief overview of some of these methods.
Transonic propulsion system integration analysis at McDonnell Aircraft Company
NASA Technical Reports Server (NTRS)
Cosner, Raymond R.
1989-01-01
The technology of Computational Fluid Dynamics (CFD) is becoming an important tool in the development of aircraft propulsion systems. Two of the most valuable features of CFD are: (1) quick acquisition of flow field data; and (2) complete description of flow fields, allowing detailed investigation of interactions. Current analysis methods complement wind tunnel testing in several ways. Herein, the discussion is focused on CFD methods. However, aircraft design studies need data from both CFD and wind tunnel testing. Each approach complements the other.
Recent Applications of Higher-Order Spectral Analysis to Nonlinear Aeroelastic Phenomena
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Hajj, Muhammad R.; Dunn, Shane; Strganac, Thomas W.; Powers, Edward J.; Stearman, Ronald
2005-01-01
Recent applications of higher-order spectral (HOS) methods to nonlinear aeroelastic phenomena are presented. Applications include the analysis of data from a simulated nonlinear pitch and plunge apparatus and from F-18 flight flutter tests. A MATLAB model of Texas A&M University's Nonlinear Aeroelastic Testbed Apparatus (NATA) is used to generate aeroelastic transients at various conditions including limit cycle oscillations (LCO). The Gaussian or non-Gaussian nature of the transients is investigated, related to HOS methods, and used to identify levels of increasing nonlinear aeroelastic response. Royal Australian Air Force (RAAF) F/A-18 flight flutter test data is presented and analyzed. The data includes high-quality measurements of forced responses and LCO phenomena. Standard power spectral density (PSD) techniques and HOS methods are applied to the data and presented. The goal of this research is to develop methods that can identify the onset of nonlinear aeroelastic phenomena, such as LCO, during flutter testing.
Wang, Lu-Yong; Fasulo, D
2006-01-01
Genome-wide association studies for complex diseases generate massive amounts of single nucleotide polymorphism (SNP) data. Univariate statistical tests (e.g., Fisher's exact test) are used to single out non-associated SNPs. However, disease-susceptible SNPs may have little marginal effect in the population and are unlikely to be retained after univariate tests. Also, model-based methods are impractical for large-scale datasets. Moreover, genetic heterogeneity makes it harder for traditional methods to identify the genetic causes of diseases. A more recent random forest method provides a more robust way of screening SNPs at the thousands scale. However, for larger-scale data, such as Affymetrix Human Mapping 100K GeneChip data, a faster screening method is required for whole-genome, large-scale association analysis with genetic heterogeneity. We propose a boosting-based method for rapid screening in large-scale analysis of complex traits in the presence of genetic heterogeneity. It provides a relatively fast and fairly good tool for screening and limiting the candidate SNPs for further, more complex computational modeling.
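The abstract does not spell out the boosting algorithm, so the following is only an illustrative sketch of boosting-based SNP screening using scikit-learn's gradient boosting; the model settings, data shapes, and cutoff are assumptions.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def screen_snps(genotypes, phenotype, n_keep=500):
    """genotypes: (n_subjects, n_snps) 0/1/2 allele counts; phenotype: 0/1 case status."""
    clf = GradientBoostingClassifier(n_estimators=200, max_depth=2,
                                     learning_rate=0.1, random_state=0)
    clf.fit(genotypes, phenotype)
    ranked = np.argsort(clf.feature_importances_)[::-1]  # most important first
    return ranked[:n_keep]                               # candidate SNP indices

The shallow trees (max_depth=2) let the booster pick up weak marginal effects and simple interactions while keeping the screening pass fast.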
The Shock and Vibration Digest. Volume 13. Number 7
1981-07-01
Presidential Address: "A Structural Dynamicist Looks at Statistical Energy Analysis" (Richards, ISVR, University of Southampton). Test data were obtained for acoustic excitation and for random and sine-sweep mechanical excitation and were used to assess prediction methods, in particular a statistical energy analysis method.
Computer-aided target tracking in motion analysis studies
NASA Astrophysics Data System (ADS)
Burdick, Dominic C.; Marcuse, M. L.; Mislan, J. D.
1990-08-01
Motion analysis studies require the precise tracking of reference objects in sequential scenes. In a typical situation, events of interest are captured at high frame rates using special cameras, and selected objects or targets are tracked on a frame-by-frame basis to provide the necessary data for motion reconstruction. Tracking is usually done using manual methods, which are slow and prone to error. A computer-based image analysis system has been developed that performs tracking automatically. The objective of this work was to eliminate the bottleneck due to manual methods in high-volume tracking applications such as the analysis of crash test films for the automotive industry. The system has proven successful in tracking standard fiducial targets and other objects in crash test scenes. Over 95 percent of target positions that can be located using manual methods can be tracked by the system, with a significant improvement in throughput over manual methods. Future work will focus on the tracking of clusters of targets and on tracking deformable objects such as airbags.
Brooks, M.H.; Schroder, L.J.; Malo, B.A.
1985-01-01
Four laboratories were evaluated in their analysis of identical natural and simulated precipitation water samples. Interlaboratory comparability was evaluated using analysis of variance coupled with Duncan's multiple range test, and linear-regression models describing the relations between individual laboratory analytical results for natural precipitation samples. Results of the statistical analyses indicate that certain pairs of laboratories produce different results when analyzing identical samples. Analyte bias for each laboratory was examined using analysis of variance coupled with Duncan's multiple range test on data produced by the laboratories from the analysis of identical simulated precipitation samples. Bias for a given analyte produced by a single laboratory is indicated when the laboratory mean for that analyte is significantly different from the mean for the most probable analyte concentrations in the simulated precipitation samples. Ion-chromatographic methods for the determination of chloride, nitrate, and sulfate were compared with the colorimetric methods that were also in use during the study period. Comparisons were made using analysis of variance coupled with Duncan's multiple range test for means produced by the two methods. Analyte precision for each laboratory was estimated by calculating a pooled variance for each analyte. Estimated analyte precisions were compared using F-tests, and differences in analyte precisions for laboratory pairs are reported. (USGS)
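A minimal sketch of the first stage of such an interlaboratory comparison, with illustrative numbers; Duncan's multiple range test is not available in SciPy, so Tukey's HSD is shown as a stand-in for the pairwise follow-up.

import numpy as np
from scipy.stats import f_oneway, tukey_hsd

lab_a = np.array([4.1, 4.3, 4.0, 4.2])   # illustrative sulfate results, mg/L
lab_b = np.array([4.6, 4.5, 4.8, 4.7])
lab_c = np.array([4.2, 4.1, 4.3, 4.2])

f_stat, p_val = f_oneway(lab_a, lab_b, lab_c)   # do any laboratory means differ?
pairwise = tukey_hsd(lab_a, lab_b, lab_c)       # which laboratory pairs differ?
print(f"F = {f_stat:.2f}, p = {p_val:.4f}")
print(pairwise)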
Nondestructive test method accurately sorts mixed bolts
NASA Technical Reports Server (NTRS)
Dezeih, C. J.
1966-01-01
Neutron activation analysis method sorts copper plated steel bolts from nickel plated steel bolts. Copper and nickel plated steel bolt specimens of the same configuration are irradiated with thermal neutrons in a test reactor for a short time. After thermal neutron irradiation, the bolts are analyzed using scintillation energy readout equipment.
Live load test and failure analysis for the steel deck truss bridge over the New River in Virginia.
DOT National Transportation Integrated Search
2009-01-01
This report presents the methods used to model a steel deck truss bridge over the New River in Hillsville, Virginia. These methods were evaluated by comparing analytical results with data recorded from 14 members during live load testing. The researc...
DOT National Transportation Integrated Search
2012-12-01
Current AASHTO provisions for load rating flat-slab concrete bridges use the equivalent strip : width method, which is regarded as overly conservative compared to more advanced analysis : methods and field live load testing. It has been shown that li...
Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana
2014-01-01
Objective: To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods: TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results: Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions: The TLC-densitometric and TLC-image analysis methods provided similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between the TLC-densitometric and TLC-image analysis methods. As the two methods were found to be equivalent, either can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282
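The paired t-test comparison reported above amounts to the following, shown with illustrative values (not the study's data):

from scipy.stats import ttest_rel

densitometric = [1.62, 1.58, 1.71, 1.66, 1.60]   # % w/w, TLC-densitometry
image_based   = [1.60, 1.59, 1.69, 1.68, 1.61]   # % w/w, TLC-image analysis

t_stat, p_val = ttest_rel(densitometric, image_based)
print(f"t = {t_stat:.3f}, p = {p_val:.3f}")      # p > 0.05 -> no significant difference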
AGR-1 Thermocouple Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeff Einerson
2012-05-01
This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in the future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, AGR-1 test configuration and test procedure, overview of AGR-1 measured data, and overview of physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties resulting from the combined analysis are identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in presentation of AGR-1 measured data (Chapter 2) and interpretation of simulation results (Chapter 3). The statistics-based, simulation-aided experimental control procedure described for the future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters, accounting for possible changes in both physical and thermal conditions and in instrument performance.
Li, Dongmei; Le Pape, Marc A; Parikh, Nisha I; Chen, Will X; Dye, Timothy D
2013-01-01
Microarrays are widely used for examining differential gene expression, identifying single nucleotide polymorphisms, and detecting methylation loci. Multiple testing methods in microarray data analysis aim at controlling both Type I and Type II error rates; however, real microarray data do not always fit their distribution assumptions. Smyth's ubiquitous parametric method, for example, inadequately accommodates violations of normality assumptions, resulting in inflated Type I error rates. The Significance Analysis of Microarrays, another widely used microarray data analysis method, is based on a permutation test and is robust to non-normally distributed data; however, its fold change criteria are problematic and can critically alter the conclusion of a study as a result of compositional changes of the control data set in the analysis. We propose a novel approach combining resampling with empirical Bayes methods: the Resampling-based empirical Bayes Methods. This approach not only reduces false discovery rates for non-normally distributed microarray data, but is also impervious to the fold change threshold, since no control data set selection is needed. Through simulation studies, sensitivities, specificities, total rejections, and false discovery rates are compared across Smyth's parametric method, the Significance Analysis of Microarrays, and the Resampling-based empirical Bayes Methods. Differences in false discovery rate control between the approaches are illustrated through a preterm delivery methylation study. The results show that the Resampling-based empirical Bayes Methods offer significantly higher specificity and lower false discovery rates than Smyth's parametric method when data are not normally distributed. The Resampling-based empirical Bayes Methods also offer higher statistical power than the Significance Analysis of Microarrays method when the proportion of significantly differentially expressed genes is large, for both normally and non-normally distributed data. Finally, the Resampling-based empirical Bayes Methods are generalizable to next-generation sequencing RNA-seq data analysis.
ERIC Educational Resources Information Center
Ling, Guangming
2012-01-01
To assess the value of individual students' subscores on the Major Field Test in Business (MFT Business), I examined the test's internal structure with factor analysis and structural equation model methods, and analyzed the subscore reliabilities using the augmented scores method. Analyses of the internal structure suggested that the MFT Business…
Non-parametric characterization of long-term rainfall time series
NASA Astrophysics Data System (ADS)
Tiwari, Harinarayan; Pandey, Brij Kishor
2018-03-01
The statistical study of rainfall time series is one of the approaches to efficient hydrological system design. Identifying and characterizing long-term rainfall time series could aid in improving hydrological forecasting. In the present study, several statistical tests were applied to the long-term (1851-2006) rainfall time series of seven meteorological regions of India. Linear trend analysis was carried out using the Mann-Kendall test for the observed rainfall series. The trend observed with this approach was then ascertained using the innovative trend analysis method. Innovative trend analysis has been found to be a strong tool to detect the general trend of rainfall time series. The sequential Mann-Kendall test has also been carried out to examine nonlinear trends of the series. The partial sum of cumulative deviation test is also found to be suitable to detect nonlinear trends. Innovative trend analysis, the sequential Mann-Kendall test, and the partial cumulative deviation test have the potential to detect the general as well as nonlinear trends of the rainfall time series. Annual rainfall analysis suggests that the maximum rise in mean rainfall is 11.53% for West Peninsular India, whereas the maximum fall in mean rainfall is 7.8% for the North Mountainous Indian region. The innovative trend analysis method is also capable of finding the number of change points in the time series. Additionally, we performed the von Neumann ratio test and the cumulative deviation test to estimate the departure from homogeneity. Singular spectrum analysis has been applied in this study to evaluate the order of departure from homogeneity in the rainfall time series. The monsoon season (JS) of the North Mountainous India and West Peninsular India zones shows higher departure from homogeneity, and the singular spectrum analysis results are in coherence with this finding.
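A hedged sketch of the classical Mann-Kendall test applied above, assuming no tied values and a series long enough (n > 10) for the normal approximation; the innovative and sequential trend methods are separate procedures not reproduced here.

import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Return (S, Z, two-sided p) for a 1-D series x."""
    x = np.asarray(x, float)
    n = len(x)
    # S counts concordant minus discordant pairs
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0      # variance assuming no ties
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)              # continuity correction
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return s, z, p

A positive Z indicates a rising trend, a negative Z a falling one; significance follows from the two-sided p-value.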
Development and evaluation of the impulse transfer function technique
NASA Technical Reports Server (NTRS)
Mantus, M.
1972-01-01
The development of the test/analysis technique known as the impulse transfer function (ITF) method is discussed. This technique, when implemented with proper data processing systems, should become a valuable supplement to conventional dynamic testing and analysis procedures that will be used in the space shuttle development program. The method can relieve many of the problems associated with extensive and costly testing of the shuttle for transient loading conditions. In addition, the time history information derived from impulse testing has the potential for being used to determine modal data for the structure under investigation. The technique could be very useful in determining the time-varying modal characteristics of structures subjected to thermal transients, where conventional mode surveys are difficult to perform.
Can Percentiles Replace Raw Scores in the Statistical Analysis of Test Data?
ERIC Educational Resources Information Center
Zimmerman, Donald W.; Zumbo, Bruno D.
2005-01-01
Educational and psychological testing textbooks typically warn of the inappropriateness of performing arithmetic operations and statistical analysis on percentiles instead of raw scores. This seems inconsistent with the well-established finding that transforming scores to ranks and using nonparametric methods often improves the validity and power…
NASA Technical Reports Server (NTRS)
Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.;
2016-01-01
NASA's James Webb Space Telescope (JWST) is a 6.5 m diameter, segmented, deployable telescope for cryogenic IR space astronomy (40 K). The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM), which contains four science instruments (SI) and the fine guidance sensor. The SIs are mounted to a composite metering structure. The SI and guider units were integrated to the ISIM structure and optically tested at the NASA Goddard Space Flight Center as a suite using the Optical Telescope Element SIMulator (OSIM). OSIM is a full-field, cryogenic JWST telescope simulator. SI performance, including alignment and wavefront error, was evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated in alignment. We describe how these innovative methods for test planning and execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned from this successful effort in their future testing for other programs.
NASA Astrophysics Data System (ADS)
Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.; Eichhorn, William L.; Glasse, Alistair C.; Gracey, Renee; Hartig, George F.; Howard, Joseph M.; Kelly, Douglas M.; Kimble, Randy A.; Kirk, Jeffrey R.; Kubalak, David A.; Landsman, Wayne B.; Lindler, Don J.; Malumuth, Eliot M.; Maszkiewicz, Michael; Rieke, Marcia J.; Rowlands, Neil; Sabatke, Derek S.; Smith, Corbett T.; Smith, J. Scott; Sullivan, Joseph F.; Telfer, Randal C.; Te Plate, Maurice; Vila, M. Begoña.; Warner, Gerry D.; Wright, David; Wright, Raymond H.; Zhou, Julia; Zielinski, Thomas P.
2016-09-01
NASA's James Webb Space Telescope (JWST) is a 6.5 m diameter, segmented, deployable telescope for cryogenic IR space astronomy. The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM), which contains four science instruments (SI) and the Fine Guidance Sensor (FGS). The SIs are mounted to a composite metering structure. The SIs and FGS were integrated to the ISIM structure and optically tested at NASA's Goddard Space Flight Center using the Optical Telescope Element SIMulator (OSIM). OSIM is a full-field, cryogenic JWST telescope simulator. SI performance, including alignment and wavefront error, was evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, implementation of associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated in alignment. We describe how these innovative methods for test planning and execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned from this successful effort in their future testing for other programs.
Sensitivity analysis of a sound absorption model with correlated inputs
NASA Astrophysics Data System (ADS)
Chai, W.; Christen, J.-L.; Zine, A.-M.; Ichchou, M.
2017-04-01
Sound absorption in porous media is a complex phenomenon, which is usually addressed with homogenized models depending on macroscopic parameters. Since these parameters emerge from the structure at the microscopic scale, they may be correlated. This paper deals with sensitivity analysis methods for a sound absorption model with correlated inputs. Specifically, the Johnson-Champoux-Allard (JCA) model is chosen as the objective model, with correlation effects generated by a secondary micro-macro semi-empirical model. To deal with this case, a relatively new sensitivity analysis method, the Fourier Amplitude Sensitivity Test with Correlation design (FASTC), based on Iman's transform, is applied. This method requires a priori information such as the variables' marginal distribution functions and their correlation matrix. The results are compared to the Correlation Ratio Method (CRM) for reference and validation. The distribution of the macroscopic variables arising from the microstructure, as well as their correlation matrix, are studied. Finally, the test results show that correlation has a very important impact on the results of the sensitivity analysis. The influence of the correlation strength among input variables on the sensitivity analysis is also assessed.
NASA Astrophysics Data System (ADS)
Weng, Hanli; Li, Youping
2017-04-01
The working principle, process device, and test procedure of the runner static balancing test method, in which weighting is performed with three-pivot pressure transducers, are introduced in this paper. Based on an actual V-type hydraulic turbine runner, the error and sensitivity of the three-pivot pressure transducer static balancing method are analysed. Suggestions for improving the accuracy and the application of the method are also proposed.
Nichols, Jessica E; Harries, Megan E; Lovestead, Tara M; Bruno, Thomas J
2014-03-21
In this paper we present results of the application of PLOT-cryoadsorption (PLOT-cryo) to the analysis of ignitable liquids in fire debris. We tested ignitable liquids, broadly divided into fuels and solvents (although the majority of the results presented here were obtained with gasoline and diesel fuel), on three substrates: Douglas fir, oak plywood, and Nylon carpet. We determined that PLOT-cryo allows the analyst to distinguish all of the ignitable liquids tested by use of a very rapid sampling protocol, and performs better (more recovered components, higher efficiency, lower elution solvent volumes) than a conventional purge and trap method. We also tested the effect of latency (the time period between applying the ignitable liquid and ignition), and we tested a variety of sampling times and a variety of PLOT capillary lengths. Reliable results can be obtained with sampling time periods as short as 3 min and on PLOT capillaries as short as 20 cm. The variability of separate samples was also assessed, a study made possible by the high-throughput nature of the PLOT-cryo method. We also determined that the method performs better than the conventional carbon strip method that is commonly used in fire debris analysis. Published by Elsevier B.V.
NASA Technical Reports Server (NTRS)
Sanchez, Christopher M.
2011-01-01
NASA White Sands Test Facility (WSTF) is leading an evaluation effort in advanced destructive and nondestructive testing of composite pressure vessels and structures. WSTF is using progressive finite element analysis methods for test design and for confirmation of composite pressure vessel performance. Using composite finite element analysis models and failure theories tested in the World-Wide Failure Exercise, WSTF is able to estimate the static strength of composite pressure vessels. Additionally, test and evaluation on composites that have been impact damaged is in progress so that models can be developed to estimate damage tolerance and the degradation in static strength.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-15
Fee schedule excerpt (table garbled in extraction): per-test fees for mycotoxin analyses by rapid test kit, NIR or NMR analysis (protein, oil, starch, etc.), waxy corn, and sunflower oil tests.
Geiser, Christian; Burns, G. Leonard; Servera, Mateu
2014-01-01
Models of confirmatory factor analysis (CFA) are frequently applied to examine the convergent validity of scores obtained from multiple raters or methods in so-called multitrait-multimethod (MTMM) investigations. We show that interesting incremental information about method effects can be gained from including mean structures and tests of measurement invariance (MI) across methods in MTMM models. We present a modeling framework for testing MI in the first step of a CFA-MTMM analysis. We also discuss the relevance of MI in the context of four more complex CFA-MTMM models with method factors. We focus on three recently developed multiple-indicator CFA-MTMM models for structurally different methods [the correlated traits-correlated (methods – 1), latent difference, and latent means models; Geiser et al., 2014a; Pohl and Steyer, 2010; Pohl et al., 2008] and one model for interchangeable methods (Eid et al., 2008). We demonstrate that some of these models require or imply MI by definition for a proper interpretation of trait or method factors, whereas others do not, and explain why MI may or may not be required in each model. We show that in the model for interchangeable methods, testing for MI is critical for determining whether methods can truly be seen as interchangeable. We illustrate the theoretical issues in an empirical application to an MTMM study of attention deficit and hyperactivity disorder (ADHD) with mother, father, and teacher ratings as methods. PMID:25400603
Sorio, Daniela; De Palo, Elio Franco; Bertaso, Anna; Bortolotti, Federica; Tagliaro, Franco
2017-02-01
This paper puts forward a new method for transferrin (Tf) glycoform analysis in body fluids that involves the formation of a transferrin-terbium fluorescent adduct (TfFluo). The key idea is to validate the analytical procedure for carbohydrate-deficient transferrin (CDT), a traditional biochemical serum marker used to identify chronic alcohol abuse. Terbium added to a human body-fluid sample produced TfFluo. An anion exchange HPLC technique with fluorescence detection (λexc 298 nm, λem 550 nm) permitted clear separation and identification of Tf glycoform peaks without any interfering signals, allowing selective Tf sialoform analysis in human serum and in body fluids (cadaveric blood, cerebrospinal fluid, and dried blood spots) that are unsuitable for routine testing. Serum samples (n = 78) were analyzed by both the traditional absorbance (Abs) and the fluorescence (Fl) HPLC methods, and CDT% levels demonstrated a significant correlation (p < 0.001, Pearson). Intra- and inter-run CV% were 3.1% and 4.6%, respectively. The cut-off of 1.9 CDT% for the HPLC-Abs reference method corresponded, by interpolation on the correlation curve, to a 1.3 CDT% cut-off for the present method. Method comparison by Passing-Bablok and Bland-Altman tests demonstrated Fl versus Abs agreement. In conclusion, the novel method is a reliable test for CDT% analysis and provides a substantial analytical improvement, offering important advantages in terms of the types of body fluid that can be analyzed. Its sensitivity and freedom from interferences extend clinical applications, making CDT assay reliable on body fluids usually not suitable for routine testing. Graphical Abstract: The formation of a transferrin-terbium fluorescent adduct can be used to analyze the transferrin glycoforms.
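The Bland-Altman agreement check used above reduces to the following computation, shown with illustrative CDT% values (not the study's data):

import numpy as np

def bland_altman(method_a, method_b):
    """Return (mean bias, lower and upper 95% limits of agreement)."""
    a, b = np.asarray(method_a, float), np.asarray(method_b, float)
    diff = a - b                      # paired differences, same sera
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)     # half-width of the limits of agreement
    return bias, bias - loa, bias + loa

bias, lo, hi = bland_altman([1.2, 1.8, 2.5, 3.1], [1.3, 1.7, 2.6, 3.0])
print(f"bias = {bias:+.2f} CDT%, limits of agreement [{lo:+.2f}, {hi:+.2f}]")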
Non-Destructive Thermography Analysis of Impact Damage on Large-Scale CFRP Automotive Parts
Maier, Alexander; Schmidt, Roland; Oswald-Tranta, Beate; Schledjewski, Ralf
2014-01-01
Laminated composites are increasingly used in aeronautics and the wind energy industry, as well as in the automotive industry. In these applications, the construction and processing need to fulfill the highest requirements regarding weight and mechanical properties. Environmental issues, like fuel consumption and CO2 footprint, set new challenges in producing lightweight parts that meet the highly monitored standards for these branches. In the automotive industry, one main aspect of construction is the impact behavior of structural parts. To verify the quality of parts made from composite materials with little effort, cost, and time, non-destructive test methods are increasingly used. A highly recommended non-destructive testing method is thermography analysis. In this work, a prototype for a car's base plate was produced by vacuum infusion. For the research, test specimens were produced with the same multi-layer build-up as the prototypes. These specimens were subjected to defined impact loads to simulate the effect of stone chips. Afterwards, the impacted specimens were investigated with thermography analysis. The results of this work will help clarify the possible fields of application and the use of thermography analysis as a first quick and economical failure-detection method for automotive parts. PMID:28788464
Verification of a ground-based method for simulating high-altitude, supersonic flight conditions
NASA Astrophysics Data System (ADS)
Zhou, Xuewen; Xu, Jian; Lv, Shuiyan
Ground-based methods for accurately reproducing high-altitude, high-speed flight conditions have been an important research topic in the aerospace field. Based on an analysis of the requirements for high-altitude supersonic flight tests, a ground-based test bed was designed that combines a Laval nozzle, as often found in wind tunnels, with a rocket sled system. Sled tests were used to verify the performance of the test bed. The test results indicated that the test bed produced a uniform flow field with a static pressure and density equivalent to atmospheric conditions at an altitude of 13-15 km and a flow velocity of approximately Mach 2.4. This test method has the advantages of accuracy, fewer experimental limitations, and reusability.
The effect of uncertainties in distance-based ranking methods for multi-criteria decision making
NASA Astrophysics Data System (ADS)
Jaini, Nor I.; Utyuzhnikov, Sergei V.
2017-08-01
Data in multi-criteria decision making are often imprecise and changeable, so it is important to carry out a sensitivity analysis for the multi-criteria decision making problem. This paper presents a sensitivity analysis for some ranking techniques based on distance measures in multi-criteria decision making. Two types of uncertainty are considered. The first concerns the input data, while the second concerns the Decision Maker's preferences (weights). The ranking techniques considered in this study are TOPSIS, the relative distance method and the trade-off ranking method. TOPSIS and the relative distance method measure the distance from an alternative to the ideal and anti-ideal solutions. In turn, the trade-off ranking calculates the distance of an alternative to the extreme solutions and the other alternatives. Several test cases are considered to study the performance of each ranking technique under both types of uncertainty.
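To make the distance-based ranking concrete, here is a minimal illustrative TOPSIS sketch (not code from the paper); the decision matrix, weights, and benefit flags are hypothetical examples.

```python
import numpy as np

def topsis(X, w, benefit):
    """Rank alternatives (rows of X) against criteria (columns).

    X       : decision matrix, shape (n_alternatives, n_criteria)
    w       : criterion weights summing to 1
    benefit : True where larger values are better, False where smaller are better
    """
    R = X / np.linalg.norm(X, axis=0)          # vector-normalize each criterion
    V = R * w                                  # apply weights
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)  # distance to the ideal solution
    d_neg = np.linalg.norm(V - anti, axis=1)   # distance to the anti-ideal solution
    return d_neg / (d_pos + d_neg)             # relative closeness: higher is better

# Hypothetical example: 4 alternatives, 3 criteria (the first is a cost)
X = np.array([[250., 16., 12.], [200., 16., 8.], [300., 32., 16.], [275., 32., 8.]])
scores = topsis(X, w=np.array([0.3, 0.4, 0.3]), benefit=np.array([False, True, True]))
print(scores.argsort()[::-1])  # ranking, best alternative first
```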
Study and Application of Acoustic Emission Testing in Fault Diagnosis of Low-Speed Heavy-Duty Gears
Gao, Lixin; Zai, Fenlou; Su, Shanbin; Wang, Huaqing; Chen, Peng; Liu, Limei
2011-01-01
Most existing studies on the acoustic emission signals of rotating machinery are experiment-oriented, and few involve on-site applications. In this study, a redundant second generation wavelet transform based on the principle of interpolated subdivision was developed. With this method, no subdivision is needed during the decomposition. The approximation and detail signals have the same length as the original ones, so the data volume is twice that of the original signals; this redundancy also underpins the good analysis performance of the method. The analysis of acoustic emission data from faults of on-site low-speed heavy-duty gears validated the redundant second generation wavelet transform for the processing and denoising of acoustic emission signals. Furthermore, the analysis showed that acoustic emission testing can be used in the fault diagnosis of on-site low-speed heavy-duty gears and can be a significant supplement to vibration-based diagnosis. PMID:22346592
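The core idea, prediction from interpolated neighbours at growing spacing with no downsampling, can be sketched as follows. This is a simplified illustration of a redundant interpolating lifting scheme under assumed circular boundary handling, not the authors' algorithm:

```python
import numpy as np

def redundant_interp_wavelet(x, levels=3):
    """Simplified redundant (undecimated) decomposition in the spirit of
    interpolated subdivision: each sample is predicted by linear interpolation
    of its neighbours at spacing 2**j, and the detail is the prediction error.
    No subsampling, so every level keeps the input length. Circular boundary
    handling via np.roll is an assumption of this sketch."""
    approx = np.asarray(x, dtype=float)
    details = []
    for j in range(levels):
        step = 2 ** j
        prediction = 0.5 * (np.roll(approx, step) + np.roll(approx, -step))
        detail = approx - prediction        # detail signal at level j
        approx = approx - 0.5 * detail      # smoothed approximation (mean-preserving)
        details.append(detail)
    return approx, details

# Hypothetical burst-like acoustic emission trace: noise plus a short transient
rng = np.random.default_rng(0)
sig = rng.normal(0, 0.1, 1024)
sig[500:520] += np.hanning(20)
approx, details = redundant_interp_wavelet(sig)
print(approx.shape, [d.shape for d in details])   # all the same length as the input
```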
NASA Technical Reports Server (NTRS)
Nguyen, D. T.; Al-Nasra, M.; Zhang, Y.; Baddourah, M. A.; Agarwal, T. K.; Storaasli, O. O.; Carmona, E. A.
1991-01-01
Several parallel-vector computational improvements to the unconstrained optimization procedure are described which speed up the structural analysis-synthesis process. A fast parallel-vector Choleski-based equation solver, pvsolve, is incorporated into the well-known SAP-4 general-purpose finite-element code. The new code, denoted PV-SAP, is tested for static structural analysis. Initial results on a four processor CRAY 2 show that using pvsolve reduces the equation solution time by a factor of 14-16 over the original SAP-4 code. In addition, parallel-vector procedures for the Golden Block Search technique and the BFGS method are developed and tested for nonlinear unconstrained optimization. A parallel version of an iterative solver and the pvsolve direct solver are incorporated into the BFGS method. Preliminary results on nonlinear unconstrained optimization test problems, using pvsolve in the analysis, show excellent parallel-vector performance indicating that these parallel-vector algorithms can be used in a new generation of finite-element based structural design/analysis-synthesis codes.
CADDIS Volume 4. Data Analysis: Selecting an Analysis Approach
An approach for selecting statistical analyses to inform causal analysis. Describes methods for determining whether test site conditions differ from reference expectations. Describes an approach for estimating stressor-response relationships.
A Simple Method for Causal Analysis of Return on IT Investment
Alemi, Farrokh; Zargoush, Manaf; Oakes, James L.; Edrees, Hanan
2011-01-01
This paper proposes a method for examining the causal relationship between investment in information technology (IT) and the organization's productivity. In this method, first a strong relationship among (1) investment in IT, (2) use of IT and (3) the organization's productivity is verified using correlations. Second, the assumption that IT investment preceded improved productivity is tested using partial correlation. Finally, the assumption of what may have happened in the absence of IT investment, the so-called counterfactual, is tested by forecasting productivity at different levels of investment. The paper applies the proposed method to investment in the Veterans Health Information Systems and Technology Architecture (VISTA) system. Results show that the causal analysis can be done, even with limited data. Furthermore, because the procedure relies on the organization's overall productivity, it might be more objective than when the analyst picks and chooses which costs and benefits should be included in the analysis. PMID:23019515
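The first two steps, a raw correlation followed by a partial correlation controlling for IT use, can be reproduced with a few lines of NumPy. This is an illustrative sketch on synthetic data; all variable names and values are hypothetical:

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation of x and y controlling for z, via regression residuals."""
    def residuals(v, z):
        Z = np.column_stack([np.ones_like(z), z])
        beta, *_ = np.linalg.lstsq(Z, v, rcond=None)
        return v - Z @ beta
    rx, ry = residuals(x, z), residuals(y, z)
    return np.corrcoef(rx, ry)[0, 1]

# Hypothetical yearly data: IT investment, IT use, productivity
rng = np.random.default_rng(0)
invest = rng.normal(10, 2, 30)
use = invest + rng.normal(0, 1, 30)            # use driven by investment
productivity = 2 * use + rng.normal(0, 1, 30)  # productivity driven by use

print(np.corrcoef(invest, productivity)[0, 1])  # step 1: strong raw correlation
print(partial_corr(invest, productivity, use))  # step 2: attenuates once use is controlled
```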
Osathanunkul, Maslin; Suwannapoom, Chatmongkon; Khamyong, Nuttaluck; Pintakum, Danupol; Lamphun, Santisuk Na; Triwitayakorn, Kanokporn; Osathanunkul, Kitisak; Madesis, Panagiotis
2016-01-01
Andrographis paniculata Nees is a medicinal plant with multiple pharmacological properties. It has been used over many centuries as a household remedy. A. paniculata products sold on the markets are in processed forms, so they are difficult to authenticate. Buying such herbal products therefore carries a high risk of acquiring counterfeited, substituted and/or adulterated products. Because of these issues, a reliable method to authenticate products is needed. High resolution melting analysis coupled with DNA barcoding (Bar-HRM) was applied to detect adulteration in commercial herbal products. The rbcL barcode was selected for primer design for HRM analysis to produce a standard melting profile of the A. paniculata species. DNA of the tested commercial products was isolated, and their melting profiles were then generated and compared with the A. paniculata standard. The melting profiles of the rbcL amplicons of the three closely related herbal species (A. paniculata, Acanthus ebracteatus and Rhinacanthus nasutus) are clearly separated, so they can be distinguished by the developed method. The method was then used to authenticate commercial herbal products. The HRM curves of all 10 samples tested were similar to A. paniculata, which indicated that all tested products contained the correct species as labeled. The method described in this study has proved useful in identifying and/or authenticating A. paniculata, allowing us to determine easily whether herbal products on the markets contain A. paniculata even when they are in processed forms. We propose the use of DNA barcoding combined with high resolution melting analysis for authenticating Andrographis paniculata products. The developed method can be used regardless of the type of DNA template (fresh or dried tissue, leaf, and stem). The rbcL region was chosen for the analysis and worked well with our samples. Abbreviations used: bp: Base pair, Tm: Melting temperature.
Zailer, Elina; Holzgrabe, Ulrike; Diehl, Bernd W K
2017-11-01
A proton (1H) NMR spectroscopic method was established for the quality assessment of vegetable oils. To date, several research studies have been published demonstrating the high potential of the NMR technique in lipid analysis. An interlaboratory comparison was organized with the following main objectives: (1) to evaluate an alternative analysis of edible oils by using 1H NMR spectroscopy; and (2) to determine the robustness and reproducibility of the method. Five different edible oil samples were analyzed by evaluating 15 signals (free fatty acids, peroxides, aldehydes, double bonds, and linoleic and linolenic acids) in each spectrum. A total of 21 NMR data sets were obtained from 17 international participant laboratories. The performance of each laboratory was assessed by their z-scores. The test was successfully passed by 90.5% of the participants. Results showed that NMR spectroscopy is a robust alternative method for edible oil analysis.
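Lab performance in such an interlaboratory test is conventionally summarized by z-scores against the assigned value. A minimal sketch follows; the numbers and the |z| <= 2 satisfactory limit are the usual proficiency-testing convention, not values from this study:

```python
import numpy as np

def z_scores(reported, assigned, sigma_p):
    """Classical proficiency-test z-score: z = (x - X) / sigma_p.
    Conventionally, |z| <= 2 is 'satisfactory' and |z| >= 3 'unsatisfactory'."""
    return (np.asarray(reported) - assigned) / sigma_p

# Hypothetical lab results for one NMR signal (e.g., a free fatty acid level)
labs = np.array([0.95, 1.02, 1.10, 0.99, 1.31])
z = z_scores(labs, assigned=1.00, sigma_p=0.05)
print(z)
print(np.abs(z) <= 2)   # which labs pass for this signal
```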
Vibration Signature Analysis of a Faulted Gear Transmission System
NASA Technical Reports Server (NTRS)
Choy, F. K.; Huang, S.; Zakrajsek, J. J.; Handschuh, R. F.; Townsend, D. P.
1994-01-01
A comprehensive procedure for predicting faults in gear transmission systems under normal operating conditions is presented. Experimental data were obtained from a spiral bevel gear fatigue test rig at NASA Lewis Research Center. Time synchronous averaged vibration data were recorded throughout the test as the fault progressed from a small single pit to severe pitting over several teeth, and finally tooth fracture. A numerical procedure based on the Wigner-Ville distribution was used to examine the time averaged vibration data. Results from the Wigner-Ville procedure are compared to results from a variety of signal analysis techniques, including time domain and frequency domain methods. Using photographs of the gear tooth at various stages of damage, the limitations and accuracy of the various techniques are compared and discussed. Conclusions are drawn from the comparison of the different approaches as well as the applicability of the Wigner-Ville method in predicting gear faults.
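For reference, a direct discrete implementation of the Wigner-Ville distribution is short enough to sketch. This is a generic textbook formulation applied to a made-up chirp signal, not the NASA code:

```python
import numpy as np
from scipy.signal import hilbert

def wigner_ville(x):
    """Discrete Wigner-Ville distribution of a real signal. The analytic
    signal is used to suppress interference from negative frequencies.
    Row n holds the spectrum at time sample n."""
    z = hilbert(np.asarray(x, dtype=float))
    N = len(z)
    W = np.zeros((N, N))
    for n in range(N):
        L = min(n, N - 1 - n)                 # largest symmetric lag at time n
        kernel = np.zeros(N, dtype=complex)
        m = np.arange(-L, L + 1)
        kernel[m % N] = z[n + m] * np.conj(z[n - m])
        W[n] = np.fft.fft(kernel).real        # frequency bin k ~ k/(2N) cycles/sample
    return W

# Hypothetical fault signature: a chirp with a short transient burst
t = np.arange(512) / 512.0
sig = np.sin(2 * np.pi * (50 + 40 * t) * t)
sig[250:260] += 2.0
W = wigner_ville(sig)
print(W.shape)
```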
2010-01-01
Background Recent developments in high-throughput methods of analyzing transcriptomic profiles are promising for many areas of biology, including ecophysiology. However, although commercial microarrays are available for most common laboratory models, transcriptome analysis in non-traditional model species still remains a challenge. Indeed, the signal resulting from heterologous hybridization is low and difficult to interpret because of the weak complementarity between probe and target sequences, especially when no microarray dedicated to a genetically close species is available. Results We show here that transcriptome analysis in a species genetically distant from laboratory models is made possible by using MAXRS, a new method of analyzing heterologous hybridization on microarrays. This method takes advantage of the design of several commercial microarrays, with different probes targeting the same transcript. To illustrate and test this method, we analyzed the transcriptome of king penguin pectoralis muscle hybridized to Affymetrix chicken microarrays, two organisms separated by an evolutionary distance of approximately 100 million years. The differential gene expression observed between different physiological situations computed by MAXRS was confirmed by real-time PCR on 10 genes out of 11 tested. Conclusions MAXRS appears to be an appropriate method for gene expression analysis under heterologous hybridization conditions. PMID:20509979
Rivoirard, Romain; Duplay, Vianney; Oriol, Mathieu; Tinquaut, Fabien; Chauvin, Franck; Magne, Nicolas; Bourmaud, Aurelie
2016-01-01
Quality of reporting for Randomized Clinical Trials (RCTs) in oncology has been analyzed in several systematic reviews, but there is a paucity of data on outcome definitions and on the consistency of reporting of statistical tests in RCTs and Observational Studies (OBS). The objective of this review was to describe these two reporting aspects for OBS and RCTs in oncology. From a list of 19 medical journals, three were retained for analysis after a random selection: British Medical Journal (BMJ), Annals of Oncology (AoO) and British Journal of Cancer (BJC). All original articles published between March 2009 and March 2014 were screened. Only studies whose main outcome was accompanied by a corresponding statistical test were included in the analysis. Studies based on censored data were excluded. The primary outcome was the quality of reporting of the primary outcome measure in RCTs and of the variables of interest in OBS. A logistic regression was performed to identify study covariates potentially associated with concordance of tests between the Methods and Results sections. 826 studies were included in the review, of which 698 were OBS. Variables were described in the Methods section for all OBS, and the primary endpoint was clearly detailed in the Methods section for 109 RCTs (85.2%). 295 OBS (42.2%) and 43 RCTs (33.6%) had perfect agreement of the reported statistical test between the Methods and Results sections. In multivariable analysis, the variable "number of included patients in study" was associated with test consistency: the aOR (adjusted Odds Ratio) for the third group compared to the first was aOR Grp3 = 0.52 [0.31-0.89] (P value = 0.009). Variables in OBS and primary endpoints in RCTs are reported and described with high frequency. However, consistency of statistical tests between the Methods and Results sections of OBS is not always achieved. We therefore encourage authors and peer reviewers to verify the consistency of statistical tests in oncology studies.
Predicting juvenile recidivism: new method, old problems.
Benda, B B
1987-01-01
This prediction study compared three statistical procedures for accuracy using two assessment methods. The criterion is return to a juvenile prison after first release, and the models tested are logit analysis, predictive attribute analysis, and a Burgess procedure. No significant differences in predictive accuracy are found among the three procedures.
40 CFR 766.16 - Developing the analytical test method.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Resolution Gas Chromatography (HRGC) with High Resolution Mass Spectrometry (HRMS) is the method of choice... meet the requirements of the chemical matrix. (d) Analysis. The method of choice is High Resolution Gas...
40 CFR 766.16 - Developing the analytical test method.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Resolution Gas Chromatography (HRGC) with High Resolution Mass Spectrometry (HRMS) is the method of choice... meet the requirements of the chemical matrix. (d) Analysis. The method of choice is High Resolution Gas...
Other Clean Water Act Test Methods: Biosolids
Methods for analysis of chemical pollutants in biosolids (municipal sewage sludge). These methods are not approved under 40 CFR Part 136, but may be of interest to regulated entities, permitting authorities and the public.
Occupied Volume Integrity Testing : Elastic Test Results and Analyses
DOT National Transportation Integrated Search
2011-09-21
Federal Railroad Administration (FRA) and the Volpe Center have been conducting research into developing an alternative method of demonstrating occupied volume integrity (OVI) through a combination of testing and analysis. Previous works have been pu...
Spacelab scrubber analysis and test support
NASA Technical Reports Server (NTRS)
1979-01-01
Contaminants to be used in qualification and development tests of the add-on charcoal bed scrubber were established, along with rates and methods for their introduction. The contaminant levels to be achieved were predicted and test results were analyzed.
40 CFR Table 6 of Subpart Bbbbbbb... - General Provisions
Code of Federal Regulations, 2010 CFR
2010-07-01
... under which performance tests must be conducted. § 63.7(e)(2)-(4) Conduct of Performance Tests and Data Reduction Yes. § 63.7(f)-(h) Use of Alternative Test Method; Data Analysis, Recordkeeping, and Reporting...
ERIC Educational Resources Information Center
Li, Feifei
2017-01-01
An information-correction method for testlet-based tests is introduced. This method takes advantage of both generalizability theory (GT) and item response theory (IRT). The measurement error for the examinee proficiency parameter is often underestimated when a unidimensional conditional-independence IRT model is specified for a testlet dataset. By…
Analysis of aquifer tests in the Punjab region of West Pakistan
Bennett, Gordon D.; Sheikh, Ijaz Ahmed; Alr, Sabire
1967-01-01
The results of 141 pumping tests in the Punjab Plain of West Pakistan are reported. Methods of test analysis are described in detail, and an outline of the theory underlying these methods is given. The lateral permeability of the screened interval is given for all tests; the specific yield of the material at water-table depth is given for 16 tests; and the vertical permeability of the material between the water table and the top of the screen is given for 14 tests. The lateral permeabilities are predominantly in the range 0.001 to 0.006 cfs per sq ft; the average value is 0.0032 cfs per sq ft. Specific yields generally range from 0.02 to 0.26; the average value is 0.14. All vertical permeability results fall in the range 10^-5 to 10^-3 cfs per sq ft.
Application of the Semi-Empirical Force-Limiting Approach for the CoNNeCT SCAN Testbed
NASA Technical Reports Server (NTRS)
Staab, Lucas D.; McNelis, Mark E.; Akers, James C.; Suarez, Vicente J.; Jones, Trevor M.
2012-01-01
The semi-empirical force-limiting vibration method was developed and implemented for payload testing to limit the structural impedance mismatch (high force) that occurs during shaker vibration testing. The method has since been extended for use in analytical models. The Space Communications and Navigation Testbed (SCAN Testbed), known at NASA as the Communications, Navigation, and Networking re-Configurable Testbed (CoNNeCT) project, utilized force-limiting testing and analysis following the semi-empirical approach. This paper presents the steps in performing a force-limiting analysis and then compares the results to test data recovered during the CoNNeCT force-limiting random vibration qualification test that took place at NASA Glenn Research Center (GRC) in the Structural Dynamics Laboratory (SDL) December 19, 2010 to January 7, 2011. A compilation of lessons learned and considerations for future force-limiting tests is also included.
NASA Technical Reports Server (NTRS)
Mohn, L. W.
1975-01-01
The use of the Boeing TEA-230 Subsonic Flow Analysis method as a primary design tool in the development of cruise overwing nacelle configurations is presented. Surface pressure characteristics at 0.7 Mach number were determined by the TEA-230 method for a selected overwing flow-through nacelle configuration. Results of this analysis show excellent overall agreement with corresponding wind tunnel data. Effects of the presence of the nacelle on the wing pressure field were predicted accurately by the theoretical method. Evidence is provided that differences between theoretical and experimental pressure distributions in the present study would not result in significant discrepancies in the nacelle lines or nacelle drag estimates.
Analysis of longitudinal data from animals with missing values using SPSS.
Duricki, Denise A; Soleman, Sara; Moon, Lawrence D F
2016-06-01
Testing of therapies for disease or injury often involves the analysis of longitudinal data from animals. Modern analytical methods have advantages over conventional methods (particularly when some data are missing), yet they are not used widely by preclinical researchers. Here we provide an easy-to-use protocol for the analysis of longitudinal data from animals, and we present a click-by-click guide for performing suitable analyses using the statistical package IBM SPSS Statistics software (SPSS). We guide readers through the analysis of a real-life data set obtained when testing a therapy for brain injury (stroke) in elderly rats. If a few data points are missing, as in this example data set (for example, because of animal dropout), repeated-measures analysis of covariance may fail to detect a treatment effect. An alternative analysis method, such as the use of linear models (with various covariance structures), and analysis using restricted maximum likelihood estimation (to include all available data) can be used to better detect treatment effects. This protocol takes 2 h to carry out.
NASA Astrophysics Data System (ADS)
Pardimin, H.; Arcana, N.
2018-01-01
Much research in mathematics education applies the quasi-experimental method with statistical analysis by t-test. The quasi-experiment has the weakness that it is difficult to fulfil "the law of a single independent variable". The t-test has the weakness that the generalization of the conclusions obtained is less powerful. This research aimed to find ways to reduce the weaknesses of the quasi-experimental method and to improve the generalizability of research results. The method applied was a non-interactive qualitative method of the concept-analysis type. The concepts analysed were statistics, educational research methods, and research reports. The result is a way to overcome the weaknesses of quasi-experiments and the t-test: applying a combination of Factorial Design and Balanced Design, which the authors refer to as Factorial-Balanced Design. The advantages of this design are: (1) it almost fulfils "the law of a single independent variable", so there is no need to test for similarity of academic ability, and (2) the sample sizes of the experimental and control groups become larger and equal, so the design is robust to violations of the assumptions of the ANOVA test.
Sociometric Indicators of Leadership: An Exploratory Analysis
2018-01-01
streamline existing observational protocols and assessment methods. This research provides an initial test of sociometric badges in the context of the U.S...understand, the requirements of the mission. Traditional research and assessment methods focusing on leader and follower interactions require direct...based methods of social network analysis. Novel Measures of Leadership: Building on these findings and earlier research, it is apparent that
Recent statistical methods for orientation data
NASA Technical Reports Server (NTRS)
Batschelet, E.
1972-01-01
The application of statistical methods in the areas of animal orientation and navigation is discussed. The methods employed are limited to the two-dimensional case. Various tests for determining the validity of the statistical analysis are presented. Mathematical models are included to support the theoretical considerations, and tables of data are developed to show the value of the information obtained by statistical analysis.
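A standard tool in this two-dimensional setting is the Rayleigh test for a preferred direction. Here is a brief sketch using the usual small-sample p-value approximation; the bearings are invented example data:

```python
import numpy as np

def rayleigh_test(angles_rad):
    """Rayleigh test for a preferred direction in 2-D orientation data.
    Returns the mean resultant length R and an approximate p-value for
    the null hypothesis of a uniform circular distribution."""
    n = len(angles_rad)
    C = np.cos(angles_rad).sum()
    S = np.sin(angles_rad).sum()
    R = np.hypot(C, S) / n                       # mean resultant length
    z = n * R**2
    p = np.exp(-z) * (1 + (2*z - z**2) / (4*n))  # standard small-sample correction
    return R, max(min(p, 1.0), 0.0)

# Hypothetical vanishing bearings of homing pigeons (degrees)
bearings = np.radians([350, 10, 20, 5, 340, 15, 30, 355])
R, p = rayleigh_test(bearings)
print(R, p)   # large R and small p indicate a preferred direction
```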
NASA Technical Reports Server (NTRS)
Phillips, Edward P.
1989-01-01
An experimental Round Robin on Crack Closure Measurement and Analysis was conducted to measure the opening load in fatigue crack growth tests. The Round Robin evaluated the current level of consistency of opening load measurements among laboratories and sought to identify causes of the observed inconsistency. Eleven laboratories participated in the testing of compact and middle-crack specimens. Opening-load measurements were made for crack growth at two stress-intensity factor levels, three crack lengths, and following an overload. All opening-load measurements were based on the analysis of specimen compliance data. When all of the results reported (from all participants, all measurement methods, and all data analysis methods) for a given test condition were pooled, the range of opening loads was very large--typically spanning the lower half of the fatigue loading cycle. Part of the large scatter in the reported opening-load results was ascribed to consistent differences in results produced by the various methods used to measure specimen compliance and to evaluate the opening load from the compliance data. Another significant portion of the scatter was ascribed to lab-to-lab differences in producing the compliance data when using nominally the same method of measurement.
NASA Technical Reports Server (NTRS)
Hilburger, Mark W.; Lovejoy, Andrew E.; Thornburgh, Robert P.; Rankin, Charles
2012-01-01
NASA's Shell Buckling Knockdown Factor (SBKF) project has the goal of developing new analysis-based shell buckling design factors (knockdown factors) and design and analysis technologies for launch vehicle structures. Preliminary design studies indicate that implementation of these new knockdown factors can enable significant reductions in mass and mass-growth in these vehicles. However, in order to validate any new analysis-based design data or methods, a series of carefully designed and executed structural tests are required at both the subscale and full-scale levels. This paper describes the design and analysis of three different orthogrid-stiffened metallic cylindrical-shell test articles. Two of the test articles are 8-ft-diameter, 6-ft-long test articles, and one test article is a 27.5-ft-diameter, 20-ft-long Space Shuttle External Tank-derived test article.
Comparison of chemiluminescence methods for analysis of hydrogen peroxide and hydroxyl radicals
NASA Astrophysics Data System (ADS)
Pehrman, R.; Amme, M.; Cachoir, C.
2006-01-01
Assessment of the influence of alpha radiolysis on the chemistry of geologically disposed spent fuel demands analytical methods for determining radiolytic products at trace levels. Several chemiluminescence methods for the detection of the radiolytic oxidants hydrogen peroxide and hydroxyl radical are tested. Two of the hydrogen peroxide methods use luminol, catalyzed by either μ-peroxidase or hemin; one uses 10-methyl-9-(p-formylphenyl)-acridinium carboxylate trifluoromethanesulfonate; and one uses potassium periodate. All recipes are tested as batch systems under basic conditions. For hydroxyl radical detection, the selected luminophores are 3-hydroxyphthalic hydrazide and rutin. Both methods are tested as batch systems. The results are compared, and the applicability of the methods for near-field dissolution studies is discussed.
Study on magnetic circuit of moving magnet linear compressor
NASA Astrophysics Data System (ADS)
Xia, Ming; Chen, Xiaoping; Chen, Jun
2015-05-01
Moving magnet linear compressors are very popular in tactical miniature Stirling cryocoolers. The magnetic circuit of the LFC3600 moving magnet linear compressor, manufactured by the Kunming Institute of Physics, was investigated here using three approaches: theoretical analysis, numerical calculation, and experimental study. Formulas for the magnetic reluctance and magnetomotive force were given in the theoretical model. The magnetic flux density and flux lines were analyzed in the numerical model. A test method was designed to measure the magnetic flux density of the linear compressor. When the piston of the motor was in the equilibrium position, the magnetic flux density reached its maximum of 0.27 T. The measured results agreed closely with the numerical analysis.
Material nonlinear analysis via mixed-iterative finite element method
NASA Technical Reports Server (NTRS)
Sutjahjo, Edhi; Chamis, Christos C.
1992-01-01
The performance of elastic-plastic mixed-iterative analysis is examined through a set of convergence studies. Membrane and bending behaviors are tested using 4-node quadrilateral finite elements. The membrane result is excellent, which indicates the implementation of elastic-plastic mixed-iterative analysis is appropriate. On the other hand, further research to improve bending performance of the method seems to be warranted.
Reliability Validation and Improvement Framework
2012-11-01
systems. Steps in that direction include the use of the Architecture Tradeoff Analysis Method® (ATAM®) developed at the Carnegie Mellon...embedded software • cyber-physical systems (CPSs) to indicate that the embedded software interacts with, manages, and controls a physical system [Lee...the use of formal static analysis methods to increase our confidence in system operation beyond testing. However, analysis results
ERIC Educational Resources Information Center
Gallinat, Erica; Spaulding, Tammie J.
2014-01-01
Purpose: This study used meta-analysis to investigate the difference in nonverbal cognitive test performance of children with specific language impairment (SLI) and their typically developing (TD) peers. Method: The meta-analysis included studies (a) that were published between 1995 and 2012 of children with SLI who were age matched (and not…
Testing alternative ground water models using cross-validation and other methods
Foglia, L.; Mehl, S.W.; Hill, M.C.; Perona, P.; Burlando, P.
2007-01-01
Many methods can be used to test alternative ground water models. Of concern in this work are methods able to (1) rank alternative models (also called model discrimination) and (2) identify observations important to parameter estimates and predictions (equivalent to the purpose served by some types of sensitivity analysis). Some of the measures investigated are computationally efficient; others are computationally demanding. The latter are generally needed to account for model nonlinearity. The efficient model discrimination methods investigated include the information criteria: the corrected Akaike information criterion, Bayesian information criterion, and generalized cross-validation. The efficient sensitivity analysis measures used are dimensionless scaled sensitivity (DSS), composite scaled sensitivity, and parameter correlation coefficient (PCC); the other statistics are DFBETAS, Cook's D, and observation-prediction statistic. Acronyms are explained in the introduction. Cross-validation (CV) is a computationally intensive nonlinear method that is used for both model discrimination and sensitivity analysis. The methods are tested using up to five alternative parsimoniously constructed models of the ground water system of the Maggia Valley in southern Switzerland. The alternative models differ in their representation of hydraulic conductivity. A new method for graphically representing CV and sensitivity analysis results for complex models is presented and used to evaluate the utility of the efficient statistics. The results indicate that for model selection, the information criteria produce similar results at much smaller computational cost than CV. For identifying important observations, the only obviously inferior linear measure is DSS; the poor performance was expected because DSS does not include the effects of parameter correlation and PCC reveals large parameter correlations. © 2007 National Ground Water Association.
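The efficient information criteria mentioned above are one-liners once a model's residual sum of squares is known. A hedged sketch using the common least-squares forms of AICc and BIC; the model names, RSS values, and parameter counts are invented for illustration:

```python
import numpy as np

def information_criteria(rss, n, k):
    """AICc and BIC for a least-squares model fitted to n observations,
    where k counts all estimated parameters (including the error variance)."""
    aic = n * np.log(rss / n) + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)   # small-sample correction
    bic = n * np.log(rss / n) + k * np.log(n)
    return aicc, bic

# Hypothetical alternative models of the same head observations
models = {"homogeneous K": (4.8, 2), "two-zone K": (3.1, 3), "five-zone K": (2.9, 6)}
n = 35
for name, (rss, k) in models.items():
    print(name, information_criteria(rss, n, k))  # smaller values are preferred
```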
Analysis and testing of a new method for drop size measurement using laser scatter interferometry
NASA Technical Reports Server (NTRS)
Bachalo, W. D.; Houser, M. J.
1984-01-01
Research was conducted on a laser light scatter detection method for measuring the size and velocity of spherical particles. The method is based upon the measurement of the interference fringe pattern produced by spheres passing through the intersection of two laser beams. A theoretical analysis of the method was carried out using the geometrical optics theory. Experimental verification of the theory was obtained by using monodisperse droplet streams. Several optical configurations were tested to identify all of the parametric effects upon the size measurements. Both off-axis forward and backscatter light detection were utilized. Simulated spray environments and fuel spray nozzles were used in the evaluation of the method. The measurements of the monodisperse drops showed complete agreement with the theoretical predictions. The method was demonstrated to be independent of the beam intensity and extinction resulting from the surrounding drops. Signal processing concepts were considered and a method was selected for development.
Batsoulis, A N; Nacos, M K; Pappas, C S; Tarantilis, P A; Mavromoustakos, T; Polissiou, M G
2004-02-01
Hemicellulose samples were isolated from kenaf (Hibiscus cannabinus L.). Hemicellulosic fractions usually contain a variable percentage of uronic acids. The uronic acid content (expressed as polygalacturonic acid) of the isolated hemicelluloses was determined by diffuse reflectance infrared Fourier transform spectroscopy (DRIFTS) and a curve-fitting deconvolution method. A linear relationship between uronic acid content and the sum of the peak areas at 1745, 1715, and 1600 cm(-1) was established with a high correlation coefficient (0.98). The deconvolution analysis using the curve-fitting method allowed the elimination of spectral interferences from other cell wall components. The method was compared with an established spectrophotometric method and found equivalent in accuracy and repeatability (t-test, F-test). It is applicable to the analysis of natural or synthetic mixtures and/or crude substances. The proposed method is simple, rapid, and nondestructive for the samples.
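Curve-fitting deconvolution of this kind amounts to fitting overlapping bands at fixed centers and summing their areas. An illustrative sketch with SciPy on synthetic data; the Gaussian band shapes and all numbers are assumptions, not the paper's fitted parameters:

```python
import numpy as np
from scipy.optimize import curve_fit

def three_bands(x, a1, a2, a3, w1, w2, w3):
    """Sum of three Gaussian bands with fixed carbonyl/carboxylate centers."""
    centers = (1745.0, 1715.0, 1600.0)
    g = lambda a, c, w: a * np.exp(-0.5 * ((x - c) / w) ** 2)
    return g(a1, centers[0], w1) + g(a2, centers[1], w2) + g(a3, centers[2], w3)

# Hypothetical baseline-corrected DRIFTS segment (wavenumber vs. absorbance)
x = np.linspace(1550, 1800, 300)
y = three_bands(x, 0.8, 0.5, 0.6, 15, 12, 20)
y += np.random.default_rng(1).normal(0, 0.01, x.size)

popt, _ = curve_fit(three_bands, x, y, p0=[1, 1, 1, 10, 10, 10])
areas = [popt[i] * popt[i + 3] * np.sqrt(2 * np.pi) for i in range(3)]  # Gaussian areas
print(sum(areas))   # the quantity regressed against uronic acid content
```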
Good Laboratory Practices of Materials Testing at NASA White Sands Test Facility
NASA Technical Reports Server (NTRS)
Hirsch, David; Williams, James H.
2005-01-01
An approach to good laboratory practices of materials testing at NASA White Sands Test Facility is presented. The contents include: 1) Current approach; 2) Data analysis; and 3) Improvements sought by WSTF to enhance the diagnostic capability of existing methods.
Testa, Maria; Livingston, Jennifer A; VanZile-Tamsen, Carol
2011-02-01
A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women's sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided.
Probabilistic methods for rotordynamics analysis
NASA Technical Reports Server (NTRS)
Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.
1991-01-01
This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
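As a complement to fast probability integration, the probability of instability can also be estimated by brute-force Monte Carlo over the eigenvalue criterion. A minimal sketch for a single-degree-of-freedom system with uncertain damping; the distribution and parameter values are hypothetical:

```python
import numpy as np

def instability_probability(m, k, c_mean, c_std, n_samples=20_000, seed=0):
    """Monte Carlo estimate of P(instability) for m x'' + c x' + k x = 0
    with uncertain damping c. The system is unstable when any eigenvalue
    of the companion (state-space) matrix has a positive real part."""
    rng = np.random.default_rng(seed)
    c = rng.normal(c_mean, c_std, n_samples)
    unstable = 0
    for ci in c:
        A = np.array([[0.0, 1.0], [-k / m, -ci / m]])   # state-space form
        if np.real(np.linalg.eigvals(A)).max() > 0:
            unstable += 1
    return unstable / n_samples

# Hypothetical rotor mode: cross-coupling can drive effective damping negative
print(instability_probability(m=1.0, k=100.0, c_mean=0.2, c_std=0.15))
```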
QCL spectroscopy combined with the least squares method for substance analysis
NASA Astrophysics Data System (ADS)
Samsonov, D. A.; Tabalina, A. S.; Fufurin, I. L.
2017-11-01
The article briefly describes the distinctive features of quantum cascade lasers (QCLs). It also describes an experimental set-up for acquiring mid-infrared absorption spectra using a QCL. The paper presents experimental results in the form of normalized spectra. We tested the application of the least squares method for spectrum analysis, using it for substance identification and for extracting concentration data. We compare the results with more common methods of absorption spectroscopy. Finally, we demonstrate the feasibility of using this simple method for quantitative and qualitative analysis of experimental data acquired with a QCL.
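In the Beer-Lambert regime, the least-squares step reduces to solving an overdetermined linear system built from reference spectra. A minimal sketch on synthetic spectra; the band positions and mixing ratios are invented:

```python
import numpy as np

def fit_concentrations(A, y):
    """Least-squares estimate of concentrations c from a measured spectrum y,
    given reference absorption spectra as columns of A (Beer-Lambert: y = A c)."""
    c, residual, *_ = np.linalg.lstsq(A, y, rcond=None)
    return c, residual

# Hypothetical reference spectra of two substances on a common wavenumber grid
grid = np.linspace(900, 1200, 400)
ref1 = np.exp(-0.5 * ((grid - 1000) / 20) ** 2)
ref2 = np.exp(-0.5 * ((grid - 1100) / 30) ** 2)
A = np.column_stack([ref1, ref2])

y = 0.7 * ref1 + 0.3 * ref2 + np.random.default_rng(2).normal(0, 0.005, grid.size)
c, _ = fit_concentrations(A, y)
print(c)   # close to the true mixing ratios 0.7 and 0.3
```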
NASA Technical Reports Server (NTRS)
Clothiaux, John D.; Dowling, Norman E.
1992-01-01
The suitability of using rain-flow reconstructions as an alternative to an original loading spectrum for component fatigue life testing is investigated. A modified helicopter maneuver history is used for the rain-flow cycle counting and history regenerations. Experimental testing on a notched test specimen over a wide range of loads produces similar lives for the original history and the reconstructions. The test lives also agree with a simplified local strain analysis performed on the specimen utilizing the rain-flow cycle count. The rain-flow reconstruction technique is shown to be a viable test spectrum alternative to storing the complete original load history, especially in saving computer storage space and processing time. A description of the regeneration method, the simplified life prediction analysis, and the experimental methods are included in the investigation.
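Rain-flow cycle counting itself is compact enough to sketch. Below is a simplified four-point rainflow counter that extracts full cycles and leaves the unclosed residue; it is a generic textbook formulation, not the specific counting and reconstruction code used in the study:

```python
def turning_points(x):
    """Reduce a load history to its sequence of peaks and valleys."""
    tp = [x[0]]
    for v in x[1:]:
        if len(tp) >= 2 and (tp[-1] - tp[-2]) * (v - tp[-1]) > 0:
            tp[-1] = v            # still moving in the same direction: extend
        elif v != tp[-1]:
            tp.append(v)          # direction reversal: new turning point
    return tp

def rainflow_ranges(x):
    """Four-point rainflow: a full cycle is counted whenever the inner range
    of the last four turning points is no larger than its two neighbours."""
    stack, cycles = [], []
    for p in turning_points(x):
        stack.append(p)
        while len(stack) >= 4:
            r_prev = abs(stack[-3] - stack[-4])
            r_inner = abs(stack[-2] - stack[-3])
            r_next = abs(stack[-1] - stack[-2])
            if r_inner <= r_prev and r_inner <= r_next:
                cycles.append(r_inner)   # count one full cycle of this range
                del stack[-3:-1]         # remove the closed inner pair
            else:
                break
    return cycles, stack                 # stack holds the uncounted residue

# Hypothetical short maneuver history
history = [0, 5, -3, 4, -2, 6, -4, 2, 0]
cycles, residue = rainflow_ranges(history)
print(cycles, residue)
```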
Concerns regarding a call for pluralism of information theory and hypothesis testing
Lukacs, P.M.; Thompson, W.L.; Kendall, W.L.; Gould, W.R.; Doherty, P.F.; Burnham, K.P.; Anderson, D.R.
2007-01-01
1. Stephens et al. (2005) argue for 'pluralism' in statistical analysis, combining null hypothesis testing and information-theoretic (I-T) methods. We show that I-T methods are more informative even in single variable problems and we provide an ecological example. 2. I-T methods allow inferences to be made from multiple models simultaneously. We believe multimodel inference is the future of data analysis, which cannot be achieved with null hypothesis-testing approaches. 3. We argue for a stronger emphasis on critical thinking in science in general and less reliance on exploratory data analysis and data dredging. Deriving alternative hypotheses is central to science; deriving a single interesting science hypothesis and then comparing it to a default null hypothesis (e.g. 'no difference') is not an efficient strategy for gaining knowledge. We think this single-hypothesis strategy has been relied upon too often in the past. 4. We clarify misconceptions presented by Stephens et al. (2005). 5. We think inference should be made about models, directly linked to scientific hypotheses, and their parameters conditioned on data, Prob(Hj | data). I-T methods provide a basis for this inference. Null hypothesis testing merely provides a probability statement about the data conditioned on a null model, Prob(data | H0). 6. Synthesis and applications. I-T methods provide a more informative approach to inference. I-T methods provide a direct measure of evidence for or against hypotheses and a means to consider simultaneously multiple hypotheses as a basis for rigorous inference. Progress in our science can be accelerated if modern methods can be used intelligently; this includes various I-T and Bayesian methods.
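The evidence weights at the heart of I-T multimodel inference are straightforward to compute from AIC values. A minimal sketch; the AIC values are invented:

```python
import numpy as np

def akaike_weights(aic_values):
    """Evidence weights for a set of candidate models from their AIC values:
    w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2), delta_i = AIC_i - min AIC."""
    aic = np.asarray(aic_values, dtype=float)
    delta = aic - aic.min()
    w = np.exp(-0.5 * delta)
    return w / w.sum()

# Hypothetical AIC values for three competing ecological hypotheses
print(akaike_weights([210.4, 212.1, 218.9]))  # relative support for each model
```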
NASA Astrophysics Data System (ADS)
Bongale, Arunkumar M.; Kumar, Satish; Sachit, T. S.; Jadhav, Priya
2018-03-01
The wear properties of aluminium-based hybrid nanocomposite materials processed by the powder metallurgy technique are reported in the present study. Silicon carbide nanoparticles and E-glass fibre are reinforced in a pure aluminium matrix to fabricate hybrid nanocomposite samples. Pin-on-disc wear testing equipment is used to evaluate the dry sliding wear properties of the composite samples. The tests were conducted following Taguchi's Design of Experiments method. Signal-to-noise ratio analysis and Analysis of Variance are carried out on the test data to determine the influence of the test parameters on the wear rate. Scanning electron microscopic analysis and energy dispersive X-ray analysis are conducted on the worn surfaces to identify the wear mechanisms responsible for wear of the composites. Multiple linear regression analysis and Genetic Algorithm techniques are employed to optimize the wear test parameters for minimum wear of the composite samples. Finally, a wear model is built by the application of Artificial Neural Networks to predict the wear rate of the composite material under different testing conditions. The predicted wear rates are found to be very close to the experimental values, with deviations in the range of 0.15% to 8.09%.
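In a Taguchi analysis of wear rate, each run's replicates are condensed into a 'smaller-the-better' signal-to-noise ratio before the ANOVA. A minimal sketch; the replicate values are invented:

```python
import numpy as np

def sn_smaller_is_better(replicates):
    """Taguchi signal-to-noise ratio for a 'smaller-the-better' response
    such as wear rate: S/N = -10 log10(mean(y^2)) per experimental run."""
    y = np.asarray(replicates, dtype=float)
    return -10.0 * np.log10(np.mean(y**2, axis=1))

# Hypothetical wear-rate replicates (mm^3/m) for three runs of an L9 array
runs = np.array([[0.012, 0.014], [0.020, 0.022], [0.009, 0.011]])
print(sn_smaller_is_better(runs))   # the run with the highest S/N wears least
```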
Determination of UAV pre-flight Checklist for flight test purpose using qualitative failure analysis
NASA Astrophysics Data System (ADS)
Hendarko; Indriyanto, T.; Syardianto; Maulana, F. A.
2018-05-01
Safety aspects are of paramount importance in flight, especially in the flight test phase. Before performing any flight tests of either manned or unmanned aircraft, one should include pre-flight checklists as a required safety document in the flight test plan. This paper reports on the development of a new approach for determining pre-flight checklists for UAV flight tests based on failure analysis of the aircraft. Lapan's LSA (Light Surveillance Aircraft) is used as a case study, assuming this aircraft has been transformed into an unmanned version. Failure analysis is performed on the LSA using the fault tree analysis (FTA) method. The analysis focuses on the propulsion system and the flight control system, since failure of either system would lead to catastrophic events. The pre-flight checklist of the UAV is then constructed from the basic causes obtained in the failure analysis.
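Once the fault tree is drawn, the top-event probability follows from the gate algebra for independent basic events. An illustrative sketch; the event structure and per-flight-hour probabilities are hypothetical, not the LSA's actual figures:

```python
def or_gate(probs):
    """Probability that an OR gate fires, given independent basic-event probabilities."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(probs):
    """Probability that an AND gate fires, given independent basic-event probabilities."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Hypothetical basic-event probabilities per flight hour
engine_stop = or_gate([1e-4, 5e-5, 2e-5])                 # e.g., fuel starvation, ignition loss
control_loss = or_gate([3e-5, and_gate([1e-3, 1e-3])])    # servo failure, or dual-link loss
top_event = or_gate([engine_stop, control_loss])          # loss of aircraft
print(top_event)
```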
Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana
2014-02-01
The aim was to develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil, and the results obtained by the two quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for the determination of γ-oryzanol, and the two methods were similar in reproducibility, accuracy and selectivity. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between the TLC-densitometric and TLC-image analysis methods. As both methods were found to be equivalent, either can be used for the determination of γ-oryzanol in cold pressed rice bran oil.
A Statistical Approach for Testing Cross-Phenotype Effects of Rare Variants
Broadaway, K. Alaine; Cutler, David J.; Duncan, Richard; Moore, Jacob L.; Ware, Erin B.; Jhun, Min A.; Bielak, Lawrence F.; Zhao, Wei; Smith, Jennifer A.; Peyser, Patricia A.; Kardia, Sharon L.R.; Ghosh, Debashis; Epstein, Michael P.
2016-01-01
Increasing empirical evidence suggests that many genetic variants influence multiple distinct phenotypes. When cross-phenotype effects exist, multivariate association methods that consider pleiotropy are often more powerful than univariate methods that model each phenotype separately. Although several statistical approaches exist for testing cross-phenotype effects for common variants, there is a lack of similar tests for gene-based analysis of rare variants. In order to fill this important gap, we introduce a statistical method for cross-phenotype analysis of rare variants using a nonparametric distance-covariance approach that compares similarity in multivariate phenotypes to similarity in rare-variant genotypes across a gene. The approach can accommodate both binary and continuous phenotypes and further can adjust for covariates. Our approach yields a closed-form test whose significance can be evaluated analytically, thereby improving computational efficiency and permitting application on a genome-wide scale. We use simulated data to demonstrate that our method, which we refer to as the Gene Association with Multiple Traits (GAMuT) test, provides increased power over competing approaches. We also illustrate our approach using exome-chip data from the Genetic Epidemiology Network of Arteriopathy. PMID:26942286
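The distance-covariance idea underlying the GAMuT test compares pairwise similarity in phenotypes with pairwise similarity in genotypes. The sketch below computes a plain sample distance covariance on synthetic data; it illustrates the ingredient, not the GAMuT statistic or its analytic p-value:

```python
import numpy as np

def distance_covariance(X, Y):
    """Sample distance covariance between multivariate samples X and Y
    (rows are observations); it vanishes asymptotically iff X and Y are independent."""
    def centered(D):
        return D - D.mean(axis=0) - D.mean(axis=1)[:, None] + D.mean()
    X = np.atleast_2d(X).reshape(len(X), -1)
    Y = np.atleast_2d(Y).reshape(len(Y), -1)
    A = centered(np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1))
    B = centered(np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1))
    return np.sqrt(max((A * B).mean(), 0.0))

# Hypothetical data: 100 subjects, 3 phenotypes, 5 rare-variant genotypes
rng = np.random.default_rng(3)
G = rng.binomial(2, 0.02, size=(100, 5)).astype(float)
P = 0.8 * G[:, :3] + rng.normal(0, 1, size=(100, 3))   # phenotypes partly driven by variants
print(distance_covariance(P, G))
```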
Breast-feeding and postpartum weight retention: a systematic review and meta-analysis.
He, Xiujie; Zhu, Meng; Hu, Chuanlai; Tao, Xingyong; Li, Yingchun; Wang, Qiuwei; Liu, Yue
2015-12-01
Weight gained during pregnancy and postpartum weight retention might contribute to obesity in women of childbearing age. Whether breast-feeding (BF) may decrease postpartum weight retention (PPWR) is still controversial. The purpose of our systematic review and meta-analysis was to investigate the relationship between BF and PPWR. Three databases were systematically reviewed and the reference lists of relevant articles were checked. Meta-analysis was performed to quantify the pooled standardized mean differences (SMD) of BF on PPWR by using a random-effect model. Heterogeneity was tested using the χ² test and the I² statistic. Publication bias was estimated by Egger's test (linear regression method) or Begg's test (rank correlation method). Among 349 search hits, eleven studies met the inclusion criteria for the meta-analysis. Seven studies were conducted in the USA, one in Brazil, one in France, one in Georgia and one in Croatia. Compared with formula-feeding, BF for 3 to ≤6 months seemed to have a negative influence on PPWR, and BF continued for >6 months had little or no influence on PPWR. In a subgroup meta-analysis, the results did not change substantially after the analysis had been classified by available confounding factors. There was no indication of publication bias from the result of either Egger's test or Begg's test. Although the available evidence supports the belief that BF decreases PPWR, more robust studies are needed to reliably assess the impact of patterns and duration of BF on PPWR.
Škrbić, Biljana; Héberger, Károly; Durišić-Mladenović, Nataša
2013-10-01
Sum of ranking differences (SRD) was applied for comparing multianalyte results obtained by several analytical methods used in one or in different laboratories, i.e., for ranking the overall performances of the methods (or laboratories) in simultaneous determination of the same set of analytes. The data sets for testing the applicability of SRD contained the results reported during one of the proficiency tests (PTs) organized by the EU Reference Laboratory for Polycyclic Aromatic Hydrocarbons (EU-RL-PAH). In this way, SRD was also tested as a discriminant method alternative to the existing average performance scores used to compare multianalyte PT results. SRD should be used along with the z scores--the most commonly used PT performance statistic. SRD was further developed to handle identical rankings (ties) among laboratories. Two benchmark concentration series were selected as reference: (a) the assigned PAH concentrations (determined precisely beforehand by the EU-RL-PAH) and (b) the averages of all individual PAH concentrations determined by each laboratory. Ranking relative to the assigned values and also to the average (or median) values pointed to the laboratories with the most extreme results, as well as revealed groups of laboratories with similar overall performances. SRD reveals differences between methods or laboratories even when classical tests cannot. The ranking was validated using comparison of ranks by random numbers (a randomization test) and using sevenfold cross-validation, which highlighted the similarities among the (methods used in) laboratories. Principal component analysis and hierarchical cluster analysis justified the findings based on SRD ranking/grouping. If the PAH concentrations are row-scaled (i.e., z scores are analyzed as input for ranking), SRD can still be used for checking the normality of errors. Moreover, cross-validation of SRD on z scores groups the laboratories similarly. The SRD technique is general in nature, i.e., it can be applied to any experimental problem in which multianalyte results obtained either by several analytical procedures, analysts, instruments, or laboratories need to be compared.
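The SRD statistic itself is simple: rank the analytes within each laboratory and within the reference, then sum the absolute rank differences. A minimal sketch; all concentrations are invented:

```python
import numpy as np
from scipy.stats import rankdata

def sum_of_ranking_differences(results, reference):
    """SRD of each laboratory's analyte values against a reference series.
    Analytes are ranked within each lab and within the reference (ties get
    average ranks); a lab's SRD is the sum of absolute rank differences."""
    ref_ranks = rankdata(reference)
    return {lab: np.abs(rankdata(vals) - ref_ranks).sum()
            for lab, vals in results.items()}

# Hypothetical PAH concentrations (ng/g) reported by three labs for four analytes
reference = [12.0, 30.5, 7.2, 19.9]                      # assigned values
results = {"lab A": [11.5, 31.0, 7.0, 20.3],
           "lab B": [13.0, 28.0, 9.5, 18.0],
           "lab C": [30.0, 12.0, 20.0, 7.0]}             # badly scrambled ordering
print(sum_of_ranking_differences(results, reference))    # smaller SRD = closer to reference
```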
Analysis of longitudinal data from animals where some data are missing in SPSS
Duricki, DA; Soleman, S; Moon, LDF
2017-01-01
Testing of therapies for disease or injury often involves analysis of longitudinal data from animals. Modern analytical methods have advantages over conventional methods (particularly where some data are missing) yet are not used widely by pre-clinical researchers. We provide here an easy to use protocol for analysing longitudinal data from animals and present a click-by-click guide for performing suitable analyses using the statistical package SPSS. We guide readers through analysis of a real-life data set obtained when testing a therapy for brain injury (stroke) in elderly rats. We show that repeated measures analysis of covariance failed to detect a treatment effect when a few data points were missing (due to animal drop-out) whereas analysis using an alternative method detected a beneficial effect of treatment; specifically, we demonstrate the superiority of linear models (with various covariance structures) analysed using Restricted Maximum Likelihood estimation (to include all available data). This protocol takes two hours to follow. PMID:27196723
A numerical study of some potential sources of error in side-by-side seismometer evaluations
Holcomb, L. Gary
1990-01-01
This report presents the results of a series of computer simulations of potential errors in test data which might be obtained when conducting side-by-side comparisons of seismometers. These results can be used as guides in estimating the potential sources and magnitudes of errors one might expect when analyzing real test data. First, the derivation of a direct method for calculating the noise levels of two sensors in a side-by-side evaluation is repeated and extended slightly herein. The bulk of this derivation was presented previously (see Holcomb 1989); it is repeated here for easy reference. This method is applied to the analysis of a simulated test of two sensors in a side-by-side test in which the outputs of both sensors consist of white noise spectra with known signal-to-noise ratios (SNRs). This report extends this analysis to high SNRs to determine the limitations of the direct method for calculating the noise levels at signal-to-noise levels much higher than presented previously (see Holcomb 1989). Next, the method is used to analyze a simulated test of two sensors in a side-by-side test in which the outputs of both sensors consist of bandshaped noise spectra with known signal-to-noise ratios. This is a much more realistic representation of real world data, because the earth's background spectrum is certainly not flat. Finally, the results of the analysis of simulated white and bandshaped side-by-side test data are used to assist in interpreting the analysis of the effects of simulated azimuthal misalignment in side-by-side sensor evaluations. A thorough understanding of azimuthal misalignment errors is important because of the physical impossibility of perfectly aligning two sensors in a real world situation. The analysis herein indicates that alignment errors place lower limits on the levels of system noise which can be resolved in a side-by-side measurement. It also indicates that alignment errors explain why real-data noise spectra tend to follow the earth's background spectra in shape.
Yang, Chan; Xu, Bing; Zhang, Zhi-Qiang; Wang, Xin; Shi, Xin-Yuan; Fu, Jing; Qiao, Yan-Jiang
2016-10-01
Blending uniformity is essential to ensure the homogeneity of Chinese medicine formula particles within each batch. This study was based on the blending process of ebony spray-dried powder and dextrin (the proportion of dextrin was 10%), in which near infrared (NIR) diffuse reflectance spectra collected from six different sampling points were analyzed in combination with a moving window F test method in order to assess the uniformity of the blending process. The method was validated against changes in citric acid content determined by HPLC. The results of the moving window F test method showed that the ebony spray-dried powder and dextrin mixture was homogeneous during 200-300 r and segregated during 300-400 r. An advantage of this method is that the threshold value is defined statistically, not empirically, so it does not suffer from the threshold ambiguities common to the moving block standard deviation (MBSD). This method could be employed to monitor other blending processes of Chinese medicine powders on line. Copyright© by the Chinese Pharmaceutical Association.
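The statistical thresholding can be sketched as follows: compute the variance of a per-revolution homogeneity score in each moving window and F-test it against a reference window, using the F distribution's critical value instead of an empirical limit. This is an illustrative reconstruction on synthetic data, not the authors' exact statistic:

```python
import numpy as np
from scipy.stats import f as f_dist

def moving_window_f_test(scores, window=10, alpha=0.05):
    """Slide a window over a homogeneity score (e.g., mean spectral distance
    between sampling points per revolution) and F-test each window's variance
    against the final window, taken here as the homogeneous reference."""
    scores = np.asarray(scores, dtype=float)
    s2_ref = scores[-window:].var(ddof=1)
    crit = f_dist.ppf(1 - alpha, window - 1, window - 1)   # statistical threshold
    flags = []
    for start in range(0, len(scores) - window + 1):
        s2 = scores[start:start + window].var(ddof=1)
        flags.append((s2 / s2_ref) <= crit)   # True: consistent with homogeneity
    return np.array(flags)

# Hypothetical score trace: variability shrinks as blending proceeds
rng = np.random.default_rng(4)
trace = np.concatenate([rng.normal(1, 0.5, 40), rng.normal(1, 0.1, 40)])
print(moving_window_f_test(trace).astype(int))
```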
On determining the most appropriate test cut-off value: the case of tests with continuous results
Habibzadeh, Parham; Yadollahie, Mahboobeh
2016-01-01
There are several criteria for determining the most appropriate cut-off value in a diagnostic test with continuous results. Mostly based on receiver operating characteristic (ROC) analysis, there are various methods to determine the test cut-off value. The most common criteria are the point on the ROC curve where the sensitivity and specificity of the test are equal; the point on the curve with minimum distance from the upper-left corner of the unit square; and the point where Youden's index is maximum. There are also methods mainly based on Bayesian decision analysis. Herein, we show that a proposed method that maximizes the weighted number needed to misdiagnose, an index of diagnostic test effectiveness we previously proposed, is the most appropriate technique compared with the aforementioned ones. To determine the cut-off value, we need to know the pretest probability of the disease of interest as well as the costs incurred by misdiagnosis. This means that even for a given diagnostic test, the cut-off value is not universal and should be determined for each region and for each disease condition. PMID:27812299
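For comparison, the common Youden criterion is easy to state in code: sweep the thresholds and keep the one maximizing J = sensitivity + specificity - 1. A minimal sketch on synthetic data; the distributions are invented:

```python
import numpy as np

def youden_cutoff(values, is_diseased):
    """Cut-off maximizing Youden's index J = sensitivity + specificity - 1,
    for a test where larger values indicate disease."""
    best_j, best_t = -1.0, None
    for t in np.unique(values):
        positive = values >= t
        sens = np.mean(positive[is_diseased])
        spec = np.mean(~positive[~is_diseased])
        j = sens + spec - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j

# Hypothetical continuous test results for diseased and healthy subjects
rng = np.random.default_rng(5)
values = np.concatenate([rng.normal(6.0, 1.0, 200), rng.normal(4.0, 1.0, 400)])
labels = np.concatenate([np.ones(200, bool), np.zeros(400, bool)])
print(youden_cutoff(values, labels))
```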
Determination of orthotropic material properties by modal analysis
NASA Astrophysics Data System (ADS)
Lai, Junpeng
A methodology for the determination of orthotropic material properties in the plane stress condition is presented and applied to orthotropic laminated plates such as printed wiring boards. The first part of the thesis focuses on theories and methodologies. The static beam model and the vibratory plate model are presented. The methods are validated by a series of tests on aluminum. In the static tests, deflection and strains in two directions are measured, from which four of the properties are identified: Ex, Ey, nuxy, nuyx. In the dynamic tests, the resonance frequencies of the first ten modes are obtained using the technique of modal analysis. The measured data are processed by FFT and analyzed by curve fitting to extract natural frequencies and mode shapes. To determine the last material property, a finite element method using ANSYS is applied. Starting from the material properties identified in the static tests and a proper initial guess of the unknown shear modulus, an iterative process creates the finite element model and conducts modal analysis with the updated model. When the modal analysis results produced by ANSYS match the natural frequencies acquired in the dynamic tests, the process halts, and the last material property in the plane stress condition is obtained.
FECAL SOURCE TRACKING BY ANTIBIOTIC RESISTANCE ANALYSIS ON A WATERSHED EXHIBITING LOW RESISTANCE
The ongoing development of microbial source tracking has made it possible to identify contamination sources with varying accuracy, depending on the method used. The purpose of this study was to test the efficiency of the antibiotic resistance analysis (ARA) method under low ...
Boison, Joe O
2016-05-01
The Joint Food and Agriculture Organization and World Health Organization (FAO/WHO) Expert Committee on Food Additives (JECFA) is one of three Codex committees tasked with applying risk analysis and relying on independent scientific advice provided by expert bodies organized by FAO/WHO when developing standards. While not officially part of the Codex Alimentarius Commission structure, JECFA provides independent scientific advice to the Commission and its specialist committees such as the Codex Committee on Residues of Veterinary Drugs in Foods (CCRVDF) in setting maximum residue limits (MRLs) for veterinary drugs. Codex methods of analysis (Types I, II, III, and IV) are defined in the Codex Procedural Manual as are criteria to be used for selecting methods of analysis. However, if a method is to be used under a single laboratory condition to support regulatory work, it must be validated according to an internationally recognized protocol and the use of the method must be embedded in a quality assurance system in compliance with ISO/IEC 17025:2005. This paper examines the attributes of the methods used to generate residue depletion data for drug registration and/or licensing and for supporting regulatory enforcement initiatives that experts consider to be useful and appropriate in their assessment of methods of analysis. Copyright © 2016 Her Majesty the Queen in Right of Canada. Drug Testing and Analysis © 2016 John Wiley & Sons, Ltd.
Agin, Patricia Poh; Edmonds, Susan H
2002-08-01
The goals of this study were (i) to demonstrate that existing and widely used sun protection factor (SPF) test methodologies can produce accurate and reproducible results for high SPF formulations and (ii) to provide data on the number of test subjects needed, the variability of the data, and the appropriate exposure increments needed for testing high SPF formulations. Three high SPF formulations were tested, according to the Food and Drug Administration's (FDA) 1993 tentative final monograph (TFM) 'very water resistant' test method and/or the 1978 proposed monograph 'waterproof' test method, within one laboratory. A fourth high SPF formulation was tested at four independent SPF testing laboratories, using the 1978 waterproof SPF test method. All laboratories utilized xenon arc solar simulators. The data illustrate that the testing conducted within one laboratory, following either the 1978 proposed or the 1993 TFM SPF test method, was able to reproducibly determine the SPFs of the formulations tested, using either the statistical analysis method in the proposed monograph or the statistical method described in the TFM. When one formulation was tested at four different laboratories, the anticipated variation in the data owing to the equipment and other operational differences was minimized through the use of the statistical method described in the 1993 monograph. The data illustrate that either the 1978 proposed monograph SPF test method or the 1993 TFM SPF test method can provide accurate and reproducible results for high SPF formulations. Further, these results can be achieved with panels of 20-25 subjects with an acceptable level of variability. Utilization of the statistical controls from the 1993 sunscreen monograph can help to minimize lab-to-lab variability for well-formulated products.
40 CFR 98.324 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Procedures and Inspection Tracking System Handbook Number: PH-08-V-1, January 1, 2008 (incorporated by... paragraphs (d)(1) through (d)(2) of this section. (1) ASTM D1945-03, Standard Test Method for Analysis of... Reformed Gas by Gas Chromatography; ASTM D4891-89 (Reapproved 2006), Standard Test Method for Heating Value...
40 CFR 98.324 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Procedures and Inspection Tracking System Handbook Number: PH-08-V-1, January 1, 2008 (incorporated by... paragraphs (d)(1) through (d)(2) of this section. (1) ASTM D1945-03, Standard Test Method for Analysis of... Reformed Gas by Gas Chromatography; ASTM D4891-89 (Reapproved 2006), Standard Test Method for Heating Value...
40 CFR 98.324 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Procedures and Inspection Tracking System Handbook Number: PH-08-V-1, January 1, 2008 (incorporated by... ASTM D1945-03, Standard Test Method for Analysis of Natural Gas by Gas Chromatography; ASTM D1946-90... (Reapproved 2006), Standard Test Method for Heating Value of Gases in Natural Gas Range by Stoichiometric...
Best Technical Approach for the Petroleum Quality Analysis (PQA) System
1994-08-01
two test methods for determination of water content in a fuel. The Karl Fischer titration method (ASTM D 1744) measures total water, both... difficult to automate. ASTM D 664, "Standard Test Method for Acid Number of Petroleum Products by Potentiometric Titration," is simple to automate... Recent U.S. military operations have identified a need for improved methods of fuel and...
Major advances in testing of dairy products: milk component and dairy product attribute testing.
Barbano, D M; Lynch, J M
2006-04-01
Milk component analysis is relatively unusual in the field of quantitative analytical chemistry because an analytical test result determines the allocation of very large amounts of money between buyers and sellers of milk. Therefore, there is high incentive to develop and refine these methods to achieve a level of analytical performance rarely demanded of most methods or laboratory staff working in analytical chemistry. In the last 25 yr, well-defined statistical methods to characterize and validate analytical method performance combined with significant improvements in both the chemical and instrumental methods have allowed achievement of improved analytical performance for payment testing. A shift from marketing commodity dairy products to the development, manufacture, and marketing of value added dairy foods for specific market segments has created a need for instrumental and sensory approaches and quantitative data to support product development and marketing. Bringing together sensory data from quantitative descriptive analysis and analytical data from gas chromatography olfactometry for identification of odor-active compounds in complex natural dairy foods has enabled the sensory scientist and analytical chemist to work together to improve the consistency and quality of dairy food flavors.
Fully Bayesian tests of neutrality using genealogical summary statistics.
Drummond, Alexei J; Suchard, Marc A
2008-10-31
Many data summary statistics have been developed to detect departures from neutral expectations of evolutionary models. However, questions about the neutrality of the evolution of genetic loci within natural populations remain difficult to assess. One critical cause of this difficulty is that most methods for testing neutrality make simplifying assumptions simultaneously about the mutational model and the population size model. Consequently, rejecting the null hypothesis of neutrality under these methods could result from violations of either or both assumptions, making interpretation troublesome. Here we harness posterior predictive simulation to exploit summary statistics of both the data and model parameters to test the goodness-of-fit of standard models of evolution. We apply the method to test the selective neutrality of molecular evolution in non-recombining gene genealogies, and we demonstrate the utility of our method on four real data sets, identifying significant departures from neutrality in human influenza A virus, even after controlling for variation in population size. Importantly, by employing a full model-based Bayesian analysis, our method separates the effects of demography from the effects of selection. The method also allows multiple summary statistics to be used in concert, thus potentially increasing sensitivity. Furthermore, our method remains useful in situations where analytical expectations and variances of summary statistics are not available. This aspect has great potential for the analysis of temporally spaced data, an expanding area previously neglected owing to the limited availability of theory and methods.
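The core of posterior predictive simulation is simple: draw parameters from the posterior, simulate replicate data under the fitted model, and ask whether the observed summary statistic is extreme relative to the replicates. A toy sketch with an assumed stand-in model (Poisson-distributed segregating sites given a posterior draw of theta; not the paper's coalescent machinery):

```python
# Illustrative posterior predictive check on one summary statistic.
import numpy as np

rng = np.random.default_rng(1)
observed_stat = 47                             # e.g., segregating sites in the data
theta_posterior = rng.gamma(30, 1.0, 5000)     # stand-in posterior draws

# For each posterior draw, simulate a replicate dataset's statistic.
replicated = rng.poisson(theta_posterior)

# Two-sided posterior predictive p-value: small values flag poor fit,
# read here as a departure from the assumed (neutral) model.
p = 2 * min((replicated >= observed_stat).mean(),
            (replicated <= observed_stat).mean())
print(f"posterior predictive p-value: {p:.3f}")
```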
Analysis of International Space Station Materials on MISSE-3 and MISSE-4
NASA Technical Reports Server (NTRS)
Finckenor, Miria M.; Golden, Johnny L.; O'Rourke, Mary Jane
2008-01-01
For high-temperature applications (> 2,000 °C) such as solid rocket motors, hypersonic aircraft, nuclear electric/thermal propulsion for spacecraft, and more efficient jet engines, creep becomes one of the most important design factors to be considered. Conventional creep-testing methods, in which the specimen and test apparatus are in contact with each other, are limited to temperatures below about 1,700 °C. Development of alloys for higher-temperature applications is limited by the availability of testing methods at temperatures above 2,000 °C. Development of alloys for applications requiring a long service life at temperatures as low as 1,500 °C, such as the next generation of jet turbine superalloys, is limited by the difficulty of accelerated testing at temperatures above 1,700 °C. For these reasons, a new, non-contact creep-measurement technique is needed for higher-temperature applications. A new non-contact method for creep measurements of ultra-high-temperature metals and ceramics has been developed and validated. Using the electrostatic levitation (ESL) facility at NASA Marshall Space Flight Center, a spherical sample is rotated quickly enough to cause creep deformation due to centrifugal acceleration. Very accurate measurement of the deformed shape through digital image analysis allows the stress exponent n to be determined very precisely from a single test, rather than from numerous conventional tests. Validation tests on single-crystal niobium spheres showed excellent agreement with conventional tests at 1,985 °C; however, the non-contact method provides much greater precision while using only about 40 milligrams of material. This method is being applied to materials including metals and ceramics for non-eroding throats in solid rockets and next-generation superalloys for turbine engines. Recent advances in the method and the current state of these new measurements will be presented.
Inductive interference in rapid transit signaling systems. volume 2. suggested test procedures.
DOT National Transportation Integrated Search
1987-03-31
These suggested test procedures have been prepared in order to develop standard methods of analysis and testing to quantify and resolve issues of electromagnetic compatibility in rail transit operations. Electromagnetic interference, generated by rai...
NASA Technical Reports Server (NTRS)
Ferrenberg, A.; Hunt, K.; Duesberg, J.
1985-01-01
The primary objective was to obtain atomization and mixing performance data for a variety of typical liquid oxygen/hydrocarbon injector element designs. Such data are required to establish injector design criteria and to provide critical inputs to liquid rocket engine combustor performance and stability analyses and to computational codes and methods. Deficiencies and problems with the atomization test equipment were identified, and action was initiated to resolve them. Results of the gas/liquid mixing tests indicated that an assessment of the test methods was required. A series of 71 liquid/liquid tests was performed.
ERIC Educational Resources Information Center
Li, Spencer D.
2011-01-01
Mediation analysis in child and adolescent development research is possible using large secondary data sets. This article provides an overview of two statistical methods commonly used to test mediated effects in secondary analysis: multiple regression and structural equation modeling (SEM). Two empirical studies are presented to illustrate the…
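Of the two approaches the article reviews, the regression route is the simpler to illustrate. A minimal sketch (simulated data, Baron-Kenny-style regressions with a Sobel test; the article also covers SEM, which needs a dedicated package):

```python
# Regression-based mediation sketch: X -> M -> Y, with a Sobel z-test
# for the indirect effect a*b. All data are simulated for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 500
x = rng.normal(size=n)                   # predictor
m = 0.5 * x + rng.normal(size=n)         # mediator
y = 0.4 * m + 0.2 * x + rng.normal(size=n)

fit_a = sm.OLS(m, sm.add_constant(x)).fit()                        # X -> M
fit_b = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit()  # M, X -> Y

a, sa = fit_a.params[1], fit_a.bse[1]
b, sb = fit_b.params[1], fit_b.bse[1]
sobel_z = (a * b) / np.sqrt(b**2 * sa**2 + a**2 * sb**2)
print(f"indirect effect a*b = {a * b:.3f}, Sobel z = {sobel_z:.2f}")
```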
Statistical analysis of 59 inspected SSME HPFTP turbine blades (uncracked and cracked)
NASA Technical Reports Server (NTRS)
Wheeler, John T.
1987-01-01
The numerical results of a statistical analysis of test data from Space Shuttle Main Engine high-pressure fuel turbopump second-stage turbine blades, including some with cracks, are presented. Several statistical methods are applied to the test data to determine whether frequency variations differ between the uncracked and cracked blades.
Pyrolysis kinetics and combustion of thin wood using advanced cone calorimetry test method
Mark A. Dietenberger
2011-01-01
Mechanistic pyrolysis kinetics analysis of extractives, holocellulose, and lignin in solid wood over the entire heating regime was possible using a specialized cone calorimeter test and new mathematical analysis tools. Added hardware components include: a modified sample holder for thin specimens with tiny thermocouples, a methane ring burner with stainless steel mesh above the cone...
Medical microbiological analysis of Apollo-Soyuz test project crewmembers
NASA Technical Reports Server (NTRS)
Taylor, G. R.; Zaloguev, S. N.
1976-01-01
The procedures and results of the Microbial Exchange Experiment (AR-002) of the Apollo-Soyuz Test Project are described. Included in the discussion of procedural aspects are methods and materials, in-flight microbial specimen collection, and preliminary analysis of microbial specimens. Medically important microorganisms recovered from both Apollo and Soyuz crewmen are evaluated.
NASA Technical Reports Server (NTRS)
Mason, P. W.; Harris, H. G.; Zalesak, J.; Bernstein, M.
1974-01-01
The methods and procedures used in the analysis and testing of the scale model are reported together with the correlation of the analytical and experimental results. The model, the NASTRAN finite element analysis, and results are discussed. Tests and analytical investigations are also reported.
Using Meta-Analysis to Inform the Design of Subsequent Studies of Diagnostic Test Accuracy
ERIC Educational Resources Information Center
Hinchliffe, Sally R.; Crowther, Michael J.; Phillips, Robert S.; Sutton, Alex J.
2013-01-01
An individual diagnostic accuracy study rarely provides enough information to make conclusive recommendations about the accuracy of a diagnostic test; particularly when the study is small. Meta-analysis methods provide a way of combining information from multiple studies, reducing uncertainty in the result and hopefully providing substantial…
PCR Testing of IVC Filter Tops as a Method for Detecting Murine Pinworms and Fur Mites.
Gerwin, Philip M; Ricart Arbona, Rodolfo J; Riedel, Elyn R; Henderson, Kenneth S; Lipman, Neil S
2017-11-01
We evaluated PCR testing of filter tops from cages maintained on an IVC system through which exhaust air is filtered at the cage level as a method for detecting parasite-infected and -infested cages. Cages containing 4 naïve Swiss Webster mice received 360 mL of uncontaminated aspen chip or α-cellulose bedding (n = 18 cages each) and 60 mL of the same type of bedding weekly from each of the following 4 groups of cages housing mice infected or infested with Syphacia obvelata (SO), Aspiculuris tetraptera (AT), Myocoptes musculinus (MC), or Myobia musculi (MB) and Radfordia affinis (RA; 240 mL bedding total). Detection rates were compared at 30, 60, and 90 d after initiating bedding exposure, by using PCR analysis of filter tops (media extract and swabs) and testing of mouse samples (fur swab [direct] PCR testing, fecal flotation, anal tape test, direct examination of intestinal contents, and skin scrape). PCR testing of filter media extract detected 100% of all parasites at 30 d (both bedding types) except for AT (α-cellulose bedding, 67% detection rate); identified more cages with fur mites (MB and MC) than direct PCR when cellulose bedding was used; and was better at detecting parasites than all nonmolecular methods evaluated. PCR analysis of filter media extract was superior to swab and direct PCR for all parasites cumulatively for each bedding type. Direct PCR more effectively detected MC and all parasites combined for aspen chip compared with cellulose bedding. PCR analysis of filter media extract for IVC systems in which exhaust air is filtered at the cage level was shown to be a highly effective environmental testing method.
Robot-operated quality control station based on the UTT method
NASA Astrophysics Data System (ADS)
Burghardt, Andrzej; Kurc, Krzysztof; Szybicki, Dariusz; Muszyńska, Magdalena; Nawrocki, Jacek
2017-03-01
This paper presents a robotic test stand for the ultrasonic transmission tomography (UTT) inspection of stator vane thickness. The test stand was designed in the Autodesk Robot Structural Analysis Professional 2013 software suite, and the performance of the designed solution was simulated in the RobotStudio software suite. The operating principle of the test stand measurement system is presented, with a specific focus on the measurement strategy. The results of actual wall thickness measurements performed on stator vanes are presented.
A Method for and Issues Associated with the Determination of Space Suit Joint Requirements
NASA Technical Reports Server (NTRS)
Matty, Jennifer E.; Aitchison, Lindsay
2010-01-01
This joint mobility KC lecture included information from two papers, "A Method for and Issues Associated with the Determination of Space Suit Joint Requirements" and "Results and Analysis from Space Suit Joint Torque Testing," as presented for the International Conference on Environmental Systems in 2009 and 2010, respectively. The first paper discusses historical joint torque testing methodologies and approaches that were tested in 2008 and 2009. The second paper discusses the testing that was completed in 2009 and 2010.
Osathanunkul, Maslin; Suwannapoom, Chatmongkon; Khamyong, Nuttaluck; Pintakum, Danupol; Lamphun, Santisuk Na; Triwitayakorn, Kanokporn; Osathanunkul, Kitisak; Madesis, Panagiotis
2016-01-01
Background: Andrographis paniculata Nees is a medicinal plant with multiple pharmacological properties. It has been used over many centuries as a household remedy. A. paniculata products sold on the markets are in processed forms, so they are difficult to authenticate. Buying such herbal products therefore poses a high risk of acquiring counterfeited, substituted and/or adulterated products, and a reliable method to authenticate them is needed. Materials and Methods: High resolution melting analysis coupled with DNA barcoding (Bar-HRM) was applied to detect adulteration in commercial herbal products. The rbcL barcode was selected for primer design for HRM analysis, to produce a standard melting profile of the A. paniculata species. DNA of the tested commercial products was isolated, and their melting profiles were then generated and compared with the A. paniculata standard. Results: The melting profiles of the rbcL amplicons of the three closely related herbal species (A. paniculata, Acanthus ebracteatus and Rhinacanthus nasutus) are clearly separated, so they can be distinguished by the developed method. The method was then used to authenticate commercial herbal products. HRM curves of all 10 tested samples are similar to A. paniculata, which indicates that all tested products contained the correct species as labeled. Conclusion: The method described in this study has proved useful in aiding identification and/or authentication of A. paniculata. This Bar-HRM analysis allowed us to easily determine the presence of the A. paniculata species in herbal products on the markets, even in processed forms. SUMMARY: We propose the use of DNA barcoding combined with high resolution melting analysis for authentication of Andrographis paniculata products. The developed method can be used regardless of the type of DNA template (fresh or dried tissue, leaf, and stem). The rbcL region was chosen for the analysis and worked well with our samples. We could easily determine the A. paniculata species in the herbal products tested. Abbreviations used: bp: base pair; Tm: melting temperature. PMID:27041863
Analysis of Duplicated Multiple-Samples Rank Data Using the Mack-Skillings Test.
Carabante, Kennet Mariano; Alonso-Marenco, Jose Ramon; Chokumnoyporn, Napapan; Sriwattana, Sujinda; Prinyawiwatkul, Witoon
2016-07-01
Appropriate analysis for duplicated multiple-samples rank data is needed. This study compared analysis of duplicated rank preference data using the Friedman versus Mack-Skillings tests. Panelists (n = 125) ranked 2 orange juice sets twice: a different-samples set (100%, 70%, vs. 40% juice) and a similar-samples set (100%, 95%, vs. 90%). These 2 sample sets were designed to produce contrasting differences in preference. For each sample set, rank sum data were obtained from (1) averaged rank data of each panelist from the 2 replications (n = 125), (2) rank data of all panelists from each of the 2 separate replications (n = 125 each), (3) joint rank data of all panelists from the 2 replications (n = 125), and (4) rank data of all panelists pooled from the 2 replications (n = 250); rank data (1), (2), and (4) were analyzed separately by the Friedman test, while those from (3) were analyzed by the Mack-Skillings test. The effect of sample size (n = 10 to 125) was evaluated. For the similar-samples set, higher variations in rank data from the 2 replications were observed; therefore, results for the main effects were more inconsistent among methods and sample sizes. Regardless of analysis method, the larger the sample size, the higher the χ(2) value and the lower the P-value (testing H0: all samples are not different). Analyzing rank data (2) separately by replication yielded inconsistent conclusions across sample sizes, hence this method is not recommended. The Mack-Skillings test was more sensitive than the Friedman test. Furthermore, it takes into account within-panelist variations and is more appropriate for analyzing duplicated rank data. © 2016 Institute of Food Technologists®
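For comparison, the Friedman side of the analysis is a one-liner in SciPy; the Mack-Skillings statistic has no stock SciPy routine. A hedged sketch on simulated pooled rankings (method 4 above; all data invented):

```python
# Friedman test on rank data pooled across two replications.
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(3)
# 250 pooled ranking sessions (125 panelists x 2 replications), 3 samples.
# Simulate a mild preference ordering, then rank within each session.
scores = rng.normal(loc=[3.0, 2.8, 2.5], size=(250, 3))
ranks = scores.argsort(axis=1).argsort(axis=1) + 1

stat, p = friedmanchisquare(ranks[:, 0], ranks[:, 1], ranks[:, 2])
print(f"Friedman chi2 = {stat:.2f}, p = {p:.4f}")
```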
Review and analysis of Hamburg Wheel Tracking device test data.
DOT National Transportation Integrated Search
2014-02-01
The Hamburg Wheel Tracking Device (HWTD) test (TEX-242-F) and the Kansas Test Method KT-56 (KT-56), or modified Lottman test, have been used in Kansas for the last 10 years or so to predict rutting and moisture damage potential of Superpave mixes...
Bioelectrical Impedance and Body Composition Assessment
ERIC Educational Resources Information Center
Martino, Mike
2006-01-01
This article discusses field tests that can be used in physical education programs. The most common field tests are anthropometric measurements, which include body mass index (BMI), girth measurements, and skinfold testing. Another field test that is gaining popularity is bioelectrical impedance analysis (BIA). Each method has particular strengths…
Fatigue crack identification method based on strain amplitude changing
NASA Astrophysics Data System (ADS)
Guo, Tiancai; Gao, Jun; Wang, Yonghong; Xu, Youliang
2017-09-01
To address the difficulty of identifying the location and time of crack initiation in castings of the helicopter transmission system during fatigue tests, an engineering method and a quantitative criterion for detecting fatigue cracks based on strain amplitude change are proposed, using classification diagnostic criteria for similar failure modes to find the commonalities of fatigue crack initiation among castings. The method was applied in the fatigue test of a gearbox housing: during the test, the system alarmed when the SC strain meter reading reached the quantitative criterion, and a subsequent check found a fatigue crack less than 5 mm long at the location corresponding to the SC strain meter. The test result shows that the method can provide accurate test data for strength and life analysis.
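The alarm logic implied above is a threshold on the change of per-block strain amplitude. An illustrative sketch (the 10% criterion, baseline scheme, and signals are assumptions, not the paper's values):

```python
# Track the cyclic strain amplitude per load block and flag when its
# relative change from a running baseline exceeds a threshold.
import numpy as np

def block_amplitude(strain_block):
    """Half peak-to-peak strain amplitude of one load block."""
    return (strain_block.max() - strain_block.min()) / 2.0

def crack_alarm(blocks, rel_change=0.10):
    """Index of the first block whose amplitude deviates from the
    baseline by more than rel_change, or None."""
    baseline = block_amplitude(blocks[0])
    for i, blk in enumerate(blocks[1:], start=1):
        amp = block_amplitude(blk)
        if abs(amp - baseline) / baseline > rel_change:
            return i
        baseline = 0.9 * baseline + 0.1 * amp   # slow-adapting baseline
    return None

t = np.linspace(0, 1, 1000)
healthy = [np.sin(2 * np.pi * 50 * t) * 1.0 for _ in range(8)]
cracked = [np.sin(2 * np.pi * 50 * t) * 1.3]   # stiffness change shifts amplitude
print(crack_alarm(healthy + cracked))          # -> 8
```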
Sierad, Leslie Neil; Shaw, Eliza Laine; Bina, Alexander; Brazile, Bryn; Rierson, Nicholas; Patnaik, Sourav S.; Kennamer, Allison; Odum, Rebekah; Cotoi, Ovidiu; Terezia, Preda; Branzaniuc, Klara; Smallwood, Harrison; Deac, Radu; Egyed, Imre; Pavai, Zoltan; Szanto, Annamaria; Harceaga, Lucian; Suciu, Horatiu; Raicea, Victor; Olah, Peter; Simionescu, Agneta; Liao, Jun; Movileanu, Ionela
2015-01-01
There is a great need for living valve replacements for patients of all ages. Such constructs could be built by tissue engineering, informed by the unique structure and biology of the aortic root. The aortic valve root is composed of several different tissues, and careful structural and functional consideration has to be given to each segment and component. Previous work has shown that immersion techniques are inadequate for whole-root decellularization, with the aortic wall segment being particularly resistant to decellularization. The aim of this study was to develop a differential pressure gradient perfusion system capable of being rigorous enough to decellularize the aortic root wall while gentle enough to preserve the integrity of the cusps. Fresh porcine aortic roots have been subjected to various regimens of perfusion decellularization using detergents and enzymes and results compared to immersion decellularized roots. Success criteria for evaluation of each root segment (cusp, muscle, sinus, wall) for decellularization completeness, tissue integrity, and valve functionality were defined using complementary methods of cell analysis (histology with nuclear and matrix stains and DNA analysis), biomechanics (biaxial and bending tests), and physiologic heart valve bioreactor testing (with advanced image analysis of open–close cycles and geometric orifice area measurement). Fully acellular porcine roots treated with the optimized method exhibited preserved macroscopic structures and microscopic matrix components, which translated into conserved anisotropic mechanical properties, including bending and excellent valve functionality when tested in aortic flow and pressure conditions. This study highlighted the importance of (1) adapting decellularization methods to specific target tissues, (2) combining several methods of cell analysis compared to relying solely on histology, (3) developing relevant valve-specific mechanical tests, and (4) in vitro testing of valve functionality. PMID:26467108
NASA Technical Reports Server (NTRS)
Colvin, E. L.; Emptage, M. R.
1992-01-01
The breaking load test provides quantitative stress corrosion cracking data by determining the residual strength of tension specimens that have been exposed to corrosive environments. Eight laboratories have participated in a cooperative test program under the auspices of ASTM Committee G-1 to evaluate the new test method. All eight laboratories were able to distinguish between three tempers of aluminum alloy 7075. The statistical analysis procedures that were used in the test program do not work well in all situations. An alternative procedure using Box-Cox transformations shows a great deal of promise. An ASTM standard method has been drafted which incorporates the Box-Cox procedure.
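The Box-Cox step referenced above is available directly in SciPy. A minimal sketch, assuming hypothetical residual-strength data (not the interlaboratory 7075 results):

```python
# Box-Cox transformation of skewed breaking-load data toward normality
# before group comparisons.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Hypothetical residual breaking loads (kN) for one temper, right-skewed.
loads = rng.lognormal(mean=3.0, sigma=0.4, size=40)

transformed, lam = stats.boxcox(loads)   # maximum-likelihood lambda
print(f"estimated lambda = {lam:.2f}")
# Normality of the transformed data can then be checked, e.g. Shapiro-Wilk:
print(stats.shapiro(transformed))
```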
Development of Flight-Test Performance Estimation Techniques for Small Unmanned Aerial Systems
NASA Astrophysics Data System (ADS)
McCrink, Matthew Henry
This dissertation provides a flight-testing framework for assessing the performance of fixed-wing, small-scale unmanned aerial systems (sUAS) by leveraging sub-system models of components unique to these vehicles. The development of the sub-system models, and their links to broader impacts on sUAS performance, is the key contribution of this work. The sub-system modeling and analysis focuses on the vehicle's propulsion, navigation and guidance, and airframe components. Quantification of the uncertainty in the vehicle's power available and control states is essential for assessing the validity of both the methods and results obtained from flight-tests. Therefore, detailed propulsion and navigation system analyses are presented to validate the flight testing methodology. Propulsion system analysis required the development of an analytic model of the propeller in order to predict the power available over a range of flight conditions. The model is based on the blade element momentum (BEM) method. Additional corrections are added to the basic model in order to capture the Reynolds-dependent scale effects unique to sUAS. The model was experimentally validated using a ground based testing apparatus. The BEM predictions and experimental analysis allow for a parameterized model relating the electrical power, measurable during flight, to the power available required for vehicle performance analysis. Navigation system details are presented with a specific focus on the sensors used for state estimation, and the resulting uncertainty in vehicle state. Uncertainty quantification is provided by detailed calibration techniques validated using quasi-static and hardware-in-the-loop (HIL) ground based testing. The HIL methods introduced use a soft real-time flight simulator to provide inertial quality data for assessing overall system performance. Using this tool, the uncertainty in vehicle state estimation based on a range of sensors, and vehicle operational environments is presented. The propulsion and navigation system models are used to evaluate flight-testing methods for evaluating fixed-wing sUAS performance. A brief airframe analysis is presented to provide a foundation for assessing the efficacy of the flight-test methods. The flight-testing presented in this work is focused on validating the aircraft drag polar, zero-lift drag coefficient, and span efficiency factor. Three methods are detailed and evaluated for estimating these design parameters. Specific focus is placed on the influence of propulsion and navigation system uncertainty on the resulting performance data. Performance estimates are used in conjunction with the propulsion model to estimate the impact sensor and measurement uncertainty on the endurance and range of a fixed-wing sUAS. Endurance and range results for a simplistic power available model are compared to the Reynolds-dependent model presented in this work. Additional parameter sensitivity analysis related to state estimation uncertainties encountered in flight-testing are presented. Results from these analyses indicate that the sub-system models introduced in this work are of first-order importance, on the order of 5-10% change in range and endurance, in assessing the performance of a fixed-wing sUAS.
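The drag-polar flight-test reduction the dissertation describes amounts to a least-squares fit of CD = CD0 + CL^2/(pi*e*AR). A hedged sketch with invented flight points and an assumed aspect ratio:

```python
# Fit zero-lift drag coefficient and span efficiency from trimmed
# flight-test points; the polar is linear in CL^2.
import numpy as np

AR = 9.0                                     # assumed wing aspect ratio
CL = np.array([0.3, 0.5, 0.7, 0.9, 1.1])     # hypothetical flight points
CD = np.array([0.032, 0.041, 0.055, 0.074, 0.098])

# CD = CD0 + k * CL^2, with k = 1 / (pi * e * AR).
k, CD0 = np.polyfit(CL**2, CD, 1)
e = 1.0 / (np.pi * AR * k)
print(f"CD0 = {CD0:.4f}, span efficiency e = {e:.2f}")
```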
[Quantitative spectrum analysis of characteristic gases of spontaneous combustion coal].
Liang, Yun-Tao; Tang, Xiao-Jun; Luo, Hai-Zhu; Sun, Yong
2011-09-01
Given the characteristics of spontaneous combustion gases, such as the variety of gases involved, the low limits of detection required, and the critical safety requirements, Fourier transform infrared (FTIR) spectral analysis is presented for analyzing the characteristic gases of spontaneous combustion. The analysis method is introduced first by combining the characteristics of the absorption spectra of the analytes with the analysis requirements. Parameter setting, sample preparation, feature variable abstraction, and analysis model building are taken into consideration, and the methods of sample preparation, feature abstraction, and analysis modeling are introduced in detail. Eleven kinds of gases were then tested with a Tensor 27 spectrometer: CH4, C2H6, C3H8, iC4H10, nC4H10, C2H4, C3H6, C2H2, SF6, CO and CO2. The optical path length was 10 cm, and the spectral resolution was set to 1 cm(-1). The testing results show that the detection limit for all analytes is less than 2 x 10(-6). All the detection limits fit the measurement requirements for spontaneous combustion gases, which means that FTIR may be an ideal instrument, and the analysis method used in this paper is competent, for on-line measurement of spontaneous combustion gases.
Automated Cross-Sectional Measurement Method of Intracranial Dural Venous Sinuses.
Lublinsky, S; Friedman, A; Kesler, A; Zur, D; Anconina, R; Shelef, I
2016-03-01
MRV is an important blood vessel imaging and diagnostic tool for the evaluation of stenosis, occlusions, or aneurysms. However, an accurate image-processing tool for vessel comparison is unavailable. The purpose of this study was to develop and test an automated technique for vessel cross-sectional analysis. An algorithm for vessel cross-sectional analysis was developed that included 7 main steps: 1) image registration, 2) masking, 3) segmentation, 4) skeletonization, 5) cross-sectional planes, 6) clustering, and 7) cross-sectional analysis. Phantom models were used to validate the technique. The method was also tested on a control subject and a patient with idiopathic intracranial hypertension (4 large sinuses tested: right and left transverse sinuses, superior sagittal sinus, and straight sinus). The cross-sectional area and shape measurements were evaluated before and after lumbar puncture in patients with idiopathic intracranial hypertension. The vessel-analysis algorithm had a high degree of stability with <3% of cross-sections manually corrected. All investigated principal cranial blood sinuses had a significant cross-sectional area increase after lumbar puncture (P ≤ .05). The average triangularity of the transverse sinuses was increased, and the mean circularity of the sinuses was decreased by 6% ± 12% after lumbar puncture. Comparison of phantom and real data showed that all computed errors were <1 voxel unit, which confirmed that the method provided a very accurate solution. In this article, we present a novel automated imaging method for cross-sectional vessels analysis. The method can provide an efficient quantitative detection of abnormalities in the dural sinuses. © 2016 by American Journal of Neuroradiology.
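Steps 3 and 4 of the pipeline map directly onto standard scikit-image calls; the sketch below runs them on a synthetic 2D vessel for illustration (the published method is 3D and also includes registration, clustering, and cross-sectional plane fitting):

```python
# Segmentation and centerline extraction on a synthetic 2D "vessel".
import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

rng = np.random.default_rng(5)
img = rng.normal(0.1, 0.05, (64, 128))
img[28:36, 10:118] += 1.0                 # bright horizontal tube

mask = img > threshold_otsu(img)          # step 3: segmentation
skel = skeletonize(mask)                  # step 4: centerline
# Step 5 (sketch): cross-sectional width at each centerline column.
widths = mask[:, 10:118].sum(axis=0)
print(f"centerline pixels: {skel.sum()}, mean width: {widths.mean():.1f} px")
```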
Methods Used in Economic Evaluations of Chronic Kidney Disease Testing — A Systematic Review
Sutton, Andrew J.; Breheny, Katie; Deeks, Jon; Khunti, Kamlesh; Sharpe, Claire; Ottridge, Ryan S.; Stevens, Paul E.; Cockwell, Paul; Kalra, Philp A.; Lamb, Edmund J.
2015-01-01
Background The prevalence of chronic kidney disease (CKD) is high in general populations around the world. Targeted testing and screening for CKD are often conducted to help identify individuals that may benefit from treatment to ameliorate or prevent their disease progression. Aims This systematic review examines the methods used in economic evaluations of testing and screening in CKD, with a particular focus on whether test accuracy has been considered, and how analysis has incorporated issues that may be important to the patient, such as the impact of testing on quality of life and the costs they incur. Methods Articles that described model-based economic evaluations of patient testing interventions focused on CKD were identified through the searching of electronic databases and the hand searching of the bibliographies of the included studies. Results The initial electronic searches identified 2,671 papers of which 21 were included in the final review. Eighteen studies focused on proteinuria, three evaluated glomerular filtration rate testing and one included both tests. The full impact of inaccurate test results was frequently not considered in economic evaluations in this setting as a societal perspective was rarely adopted. The impact of false positive tests on patients in terms of the costs incurred in re-attending for repeat testing, and the anxiety associated with a positive test was almost always overlooked. In one study where the impact of a false positive test on patient quality of life was examined in sensitivity analysis, it had a significant impact on the conclusions drawn from the model. Conclusion Future economic evaluations of kidney function testing should examine testing and monitoring pathways from the perspective of patients, to ensure that issues that are important to patients, such as the possibility of inaccurate test results, are properly considered in the analysis. PMID:26465773
Buckling Testing and Analysis of Space Shuttle Solid Rocket Motor Cylinders
NASA Technical Reports Server (NTRS)
Weidner, Thomas J.; Larsen, David V.; McCool, Alex (Technical Monitor)
2002-01-01
A series of full-scale buckling tests were performed on the space shuttle Reusable Solid Rocket Motor (RSRM) cylinders. The tests were performed to determine the buckling capability of the cylinders and to provide data for analytical comparison. A nonlinear ANSYS Finite Element Analysis (FEA) model was used to represent and evaluate the testing. Analytical results demonstrated excellent correlation to test results, predicting the failure load within 5%. The analytical value was on the conservative side, predicting a lower failure load than was applied to the test. The resulting study and analysis indicated the important parameters for FEA to accurately predict buckling failure. The resulting method was subsequently used to establish the pre-launch buckling capability of the space shuttle system.
HEATED PURGE AND TRAP METHOD DEVELOPMENT AND TESTING
The goal of the research was to develop a heated purge and trap method that could be used in conjunction with SW-846 method 8240 for the analysis of volatile, water soluble Appendix VIII analytes. The developed method was validated according to a partial single laboratory method ...
Yuan, Naiming; Fu, Zuntao; Zhang, Huan; Piao, Lin; Xoplaki, Elena; Luterbacher, Juerg
2015-01-01
In this paper, a new method, detrended partial-cross-correlation analysis (DPCCA), is proposed. Based on detrended cross-correlation analysis (DCCA), the method is improved by including the partial-correlation technique, so that it can quantify the relations of two non-stationary signals (with the influences of other signals removed) on different time scales. We illustrate the advantages of this method by performing two numerical tests. Test I shows the advantages of DPCCA in handling non-stationary signals, while Test II reveals the “intrinsic” relations between two considered time series with the potential influences of other, unconsidered signals removed. To further show the utility of DPCCA in natural complex systems, we provide new evidence on the winter-time Pacific Decadal Oscillation (PDO) and the winter-time Nino3 Sea Surface Temperature Anomaly (Nino3-SSTA) affecting the summer rainfall over the middle-lower reaches of the Yangtze River (SRYR). By applying DPCCA, better significant correlations between SRYR and Nino3-SSTA are found on time scales of 6 ~ 8 years over the period 1951 ~ 2012, while significant correlations between SRYR and PDO arise on time scales of 35 years. With these physically explainable results, we have confidence that DPCCA is a useful method for addressing complex systems. PMID:25634341
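A compact sketch of the DPCCA recipe as described: DCCA coefficients for every signal pair on a given scale, then partial correlations from the inverse of that coefficient matrix. Box handling and trend order are simplifying assumptions, not necessarily the authors' exact choices:

```python
import numpy as np

def detrended_cov(xp, yp, s):
    """Mean detrended covariance of two profiles over boxes of size s."""
    n_boxes = xp.size // s
    total = 0.0
    t = np.arange(s)
    for b in range(n_boxes):
        seg = slice(b * s, (b + 1) * s)
        xd = xp[seg] - np.polyval(np.polyfit(t, xp[seg], 1), t)  # local detrend
        yd = yp[seg] - np.polyval(np.polyfit(t, yp[seg], 1), t)
        total += np.mean(xd * yd)
    return total / n_boxes

def dpcca(signals, s):
    """Partial cross-correlation matrix on scale s (signals: k x N array)."""
    profiles = np.cumsum(signals - signals.mean(axis=1, keepdims=True), axis=1)
    k = signals.shape[0]
    F = np.array([[detrended_cov(profiles[i], profiles[j], s)
                   for j in range(k)] for i in range(k)])
    rho = F / np.sqrt(np.outer(np.diag(F), np.diag(F)))    # DCCA coefficients
    C = np.linalg.inv(rho)
    P = -C / np.sqrt(np.outer(np.diag(C), np.diag(C)))     # partial correlations
    np.fill_diagonal(P, 1.0)
    return P

rng = np.random.default_rng(8)
z = rng.normal(size=2000)                                  # common driver
x, y = z + rng.normal(size=2000), z + rng.normal(size=2000)
print(dpcca(np.vstack([x, y, z]), s=50).round(2))          # x-y link vanishes given z
```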
Safety diagnosis: are we doing a good job?
Park, Peter Y; Sahaji, Rajib
2013-03-01
Collision diagnosis is the second step in the six-step road safety management process described in the AASHTO Highway Safety Manual (HSM). Diagnosis is designed to identify a dominant or abnormally high proportion of particular collision configurations (e.g., rear end, right angle, etc.) at a target location. The primary diagnosis method suggested in the HSM is descriptive data analysis. This type of analysis relies on, for example, pie charts, histograms, and/or collision diagrams. Using location specific collision data (e.g., collision frequency per collision configuration for a target location), safety engineers identify (the most) frequent collision configurations. Safety countermeasures are then likely to concentrate on preventing the selected collision configurations. Although its real-world application in engineering practice is limited, an additional collision diagnosis method, known as the beta-binomial (BB) test, is also presented as the secondary diagnosis tool in the HSM. The BB test compares the proportion of a particular collision configuration observed at one location with the proportion of the same collision configuration found at other reference locations which are similar to the target location in terms of selected traffic and roadway characteristics (e.g., traffic volume, traffic control, and number of lanes). This study compared the outcomes obtained from descriptive data analysis and the BB test, and investigates two questions: (1) Do descriptive data analysis and the BB tests produce the same results (i.e., do they select the same collision configurations at the same locations)? and (2) If the tests produce different results, which result should be adopted in engineering practice? This study's analysis was based on a sample of the most recent five years (2005-2009) of collision and roadway configuration data for 143 signalized intersections in the City of Saskatoon, Saskatchewan. The study results show that the BB test's role in diagnosing safety concerns in road safety engineering projects such as safety review projects for existing roadways may be just as important as the descriptive data analysis method. Copyright © 2012 Elsevier Ltd. All rights reserved.
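The BB test's core idea, comparing a target site's proportion of one collision configuration against a beta-binomial fit to reference sites, can be sketched in a few lines. All counts and the moment-based fitting below are assumptions for illustration:

```python
# Beta-binomial screening of one collision configuration at a target site.
import numpy as np
from scipy import stats

# Reference sites: (rear-end collisions, total collisions), hypothetical.
ref = np.array([(12, 60), (8, 55), (15, 70), (9, 48), (11, 63)], float)
p = ref[:, 0] / ref[:, 1]
mu, var = p.mean(), p.var(ddof=1)

# Method-of-moments estimates of the beta parameters.
common = mu * (1 - mu) / var - 1
a, b = mu * common, (1 - mu) * common

target_k, target_n = 25, 58            # target intersection, hypothetical
p_tail = stats.betabinom.sf(target_k - 1, target_n, a, b)
print(f"P(X >= {target_k}) = {p_tail:.4f}")   # small -> abnormally high proportion
```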
Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol
2011-01-01
A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032
A survey of variable selection methods in two Chinese epidemiology journals
2010-01-01
Background Although much has been written on developing better procedures for variable selection, there is little research on how it is practiced in actual studies. This review surveys the variable selection methods reported in two high-ranking Chinese epidemiology journals. Methods Articles published in 2004, 2006, and 2008 in the Chinese Journal of Epidemiology and the Chinese Journal of Preventive Medicine were reviewed. Five categories of methods were identified whereby variables were selected using: A - bivariate analyses; B - multivariable analysis; e.g. stepwise or individual significance testing of model coefficients; C - first bivariate analyses, followed by multivariable analysis; D - bivariate analyses or multivariable analysis; and E - other criteria like prior knowledge or personal judgment. Results Among the 287 articles that reported using variable selection methods, 6%, 26%, 30%, 21%, and 17% were in categories A through E, respectively. One hundred sixty-three studies selected variables using bivariate analyses, 80% (130/163) via multiple significance testing at the 5% alpha-level. Of the 219 multivariable analyses, 97 (44%) used stepwise procedures, 89 (41%) tested individual regression coefficients, but 33 (15%) did not mention how variables were selected. Sixty percent (58/97) of the stepwise routines also did not specify the algorithm and/or significance levels. Conclusions The variable selection methods reported in the two journals were limited in variety, and details were often missing. Many studies still relied on problematic techniques like stepwise procedures and/or multiple testing of bivariate associations at the 0.05 alpha-level. These deficiencies should be rectified to safeguard the scientific validity of articles published in Chinese epidemiology journals. PMID:20920252
Non-contact method of search and analysis of pulsating vessels
NASA Astrophysics Data System (ADS)
Avtomonov, Yuri N.; Tsoy, Maria O.; Postnov, Dmitry E.
2018-04-01
Despite the variety of existing methods for recording the human pulse and the solid history of their development, there is still considerable interest in this topic. The development of new non-contact methods based on advanced image processing has caused a new wave of interest in this issue. We present a simple but quite effective method for analyzing the mechanical pulsations of blood vessels lying close to the surface of the skin. Our technique is a modification of imaging (or remote) photoplethysmography (i-PPG). We supplemented this method with a laser light source, which made it possible to use other methods of searching for the proposed pulsation zone. During testing of the method, several series of experiments were carried out, both with artificial oscillating objects and with the target signal source (a human wrist). The obtained results show that our method allows correct interpretation of complex data. To summarize, we proposed and tested an alternative method for the search and analysis of pulsating vessels.
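The i-PPG signal path (spatially average a skin region per frame, detrend, read the dominant spectral peak in the physiological band) can be sketched on synthetic data; real input would be camera frames:

```python
# Toy i-PPG pipeline: per-frame ROI mean -> detrend -> FFT peak -> pulse rate.
import numpy as np

fs = 30.0                                  # camera frame rate, Hz
t = np.arange(0, 20, 1 / fs)
# Synthetic per-frame ROI mean intensity: slow drift + weak 1.2 Hz pulse.
roi_mean = (0.02 * t + 0.005 * np.sin(2 * np.pi * 1.2 * t)
            + 0.002 * np.random.default_rng(6).normal(size=t.size))

sig = roi_mean - np.polyval(np.polyfit(t, roi_mean, 1), t)  # linear detrend
spec = np.abs(np.fft.rfft(sig))
freqs = np.fft.rfftfreq(sig.size, 1 / fs)

band = (freqs > 0.7) & (freqs < 3.0)       # 42-180 beats per minute
pulse_hz = freqs[band][np.argmax(spec[band])]
print(f"estimated pulse: {60 * pulse_hz:.0f} bpm")
```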
Three-beam interferogram analysis method for surface flatness testing of glass plates and wedges
NASA Astrophysics Data System (ADS)
Sunderland, Zofia; Patorski, Krzysztof
2015-09-01
When testing transparent plates with high-quality flat surfaces and a small angle between them, the three-beam interference phenomenon is observed. Since the reference beam and the object beams reflected from both the front and back surfaces of a sample are detected, the recorded intensity distribution may be regarded as a sum of three fringe patterns. Images of that type cannot be successfully analyzed with standard interferogram analysis methods. They contain, however, useful information on the tested plate surface flatness and its optical thickness variations. Several methods were elaborated to decode the plate parameters. Our technique represents a competitive solution which allows for retrieval of the phase components of the three-beam interferogram. It requires recording two images: a three-beam interferogram and the two-beam one with the reference beam blocked. Mutually subtracting these images leads to the intensity distribution which, under some assumptions, provides access to the two component fringe sets which encode surface flatness. At various stages of processing we take advantage of nonlinear operations as well as single-frame interferogram analysis methods. The two-dimensional continuous wavelet transform (2D CWT) is used to separate a particular fringe family from the overall interferogram intensity distribution as well as to estimate the phase distribution of a pattern. We distinguish two processing paths depending on the relative density of the fringe sets, which is connected with the geometry of the sample and the optical setup. The proposed method is tested on simulated data.
Yang, Mingxing; Li, Xiumin; Li, Zhibin; Ou, Zhimin; Liu, Ming; Liu, Suhuan; Li, Xuejun; Yang, Shuyu
2013-01-01
DNA microarray analysis is characterized by obtaining a large number of gene variables from a small number of observations. Cluster analysis is widely used to analyze DNA microarray data to make classification and diagnosis of disease. Because there are so many irrelevant and insignificant genes in a dataset, a feature selection approach must be employed in data analysis. The performance of cluster analysis of this high-throughput data depends on whether the feature selection approach chooses the most relevant genes associated with disease classes. Here we proposed a new method using multiple Orthogonal Partial Least Squares-Discriminant Analysis (mOPLS-DA) models and S-plots to select the most relevant genes to conduct three-class disease classification and prediction. We tested our method using Golub's leukemia microarray data. For three classes with subtypes, we proposed hierarchical orthogonal partial least squares-discriminant analysis (OPLS-DA) models and S-plots to select features for two main classes and their subtypes. For three classes in parallel, we employed three OPLS-DA models and S-plots to choose marker genes for each class. The power of feature selection to classify and predict three-class disease was evaluated using cluster analysis. Further, the general performance of our method was tested using four public datasets and compared with those of four other feature selection methods. The results revealed that our method effectively selected the most relevant features for disease classification and prediction, and its performance was better than that of the other methods.
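scikit-learn has no OPLS-DA, but the discriminant step can be approximated with PLSRegression against one-hot class labels, with an S-plot-style gene ranking by loading magnitude. A hedged stand-in sketch on simulated three-class data:

```python
# PLS-DA-style feature ranking as a stand-in for OPLS-DA + S-plots.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(7)
n, g = 60, 500                        # 60 samples, 500 genes
X = rng.normal(size=(n, g))
y = np.repeat([0, 1, 2], 20)          # three disease classes
X[y == 1, :5] += 1.5                  # a few informative genes per class
X[y == 2, 5:10] -= 1.5

Y = np.eye(3)[y]                      # one-hot class membership
pls = PLSRegression(n_components=2).fit(X, Y)

# Rank genes by loading magnitude on the predictive components.
importance = np.abs(pls.x_loadings_).sum(axis=1)
print("top genes:", np.argsort(importance)[::-1][:10])
```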
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jochen, J.E.; Hopkins, C.W.
1993-12-31
Contents: Naturally fractured reservoir description; Geologic considerations; Shale-specific log model; Stress profiles; Berea research; Benefits analysis; Summary of technologies; Novel well test methods; Natural fracture identification; Reverse drilling; Production data analysis; Fracture treatment quality control; Novel core analysis methods; and Shale well cleanouts.
The Shock and Vibration Digest. Volume 16, Number 3
1984-03-01
Fluid-induced excitation, wind tunnel testing - V.R. Miller and L.L. Faulkner, Flight Dynamics Lab., Air Force... Noise transmission through a fuselage wall is analyzed by the statistical energy analysis (SEA) method; the fuselage structure is represented as a series of curved... Probabilistic fracture... are demonstrated in three-dimensional form. For a heavy floor, a statistical energy analysis (SEA) model is presented, covering only structural systems (i.e., no...
Manges, Amee R; Tellis, Patricia A; Vincent, Caroline; Lifeso, Kimberley; Geneau, Geneviève; Reid-Smith, Richard J; Boerlin, Patrick
2009-11-01
Discriminatory genotyping methods for the analysis of Escherichia coli other than O157:H7 are necessary for public health-related activities. A new multi-locus variable number tandem repeat analysis protocol is presented; this method achieves an index of discrimination of 99.5% and is reproducible and valid when tested on a collection of 836 diverse E. coli.
Properties of Tuffs, Grout and Other Materials.
1982-01-01
...analysis, and tested in uniaxial strain. Table 2 presents the physical properties, ultrasonic data, and the permanent volume compaction resulting from the... methods provide an accuracy of ±2% on pressure and stress measurements. Strain Measurements - Strains are measured using cantilever arms inside the... that are used in nuclear blast effects analysis, and specifically to assist in the analysis of the grout sphere explosive tests being conducted by the...
Drop Testing Representative Multi-Canister Overpacks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snow, Spencer D.; Morton, Dana K.
The objective of the work reported herein was to determine the ability of the Multi-Canister Overpack (MCO) canister design to maintain its containment boundary after an accidental drop event. Two test MCO canisters were assembled at Hanford, prepared for testing at the Idaho National Engineering and Environmental Laboratory (INEEL), drop tested at Sandia National Laboratories, and evaluated back at the INEEL. In addition to the actual testing efforts, finite element plastic analysis techniques were used to make both pre-test and post-test predictions of the test MCOs' structural deformations. The completed effort has demonstrated that the canister design is capable of maintaining a 50 psig pressure boundary after drop testing. Based on helium leak testing methods, one test MCO was determined to have a leakage rate not greater than 1x10^-5 std cc/sec (prior internal helium presence prevented a more rigorous test) and the remaining test MCO had a measured leakage rate less than 1x10^-7 std cc/sec (i.e., a leaktight containment) after the drop test. The effort has also demonstrated the capability of finite element methods using plastic analysis techniques to accurately predict the structural deformations of canisters subjected to an accidental drop event.
Yu, Chen; Zhang, Qian; Xu, Peng-Yao; Bai, Yin; Shen, Wen-Bin; Di, Bin; Su, Meng-Xiang
2018-01-01
Quantitative nuclear magnetic resonance (qNMR) is a well-established technique in quantitative analysis. We present a validated 1H-qNMR method for the assay of octreotide acetate, a cyclic octapeptide. Deuterium oxide was used to remove the undesired exchangeable peaks (proton exchange) in order to isolate the quantitative signals in the crowded spectrum of the peptide and ensure precise quantitative analysis. Gemcitabine hydrochloride was chosen as a suitable internal standard. Experimental conditions, including relaxation delay time, number of scans, and pulse angle, were optimized first. Method validation was then carried out in terms of selectivity, stability, linearity, precision, and robustness. The assay result was compared with that obtained by high performance liquid chromatography, as provided by the Chinese Pharmacopoeia. The statistical F test, Student's t test, and a nonparametric test at the 95% confidence level indicate that there was no significant difference between these two methods. qNMR is a simple and accurate quantitative tool with no need for specific corresponding reference standards. It has potential for the quantitative analysis of other peptide drugs and for the standardization of the corresponding reference standards. Copyright © 2017 John Wiley & Sons, Ltd.
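The internal-standard qNMR assay reduces to one relation between integrals, proton counts, molar masses, and weighed masses. A sketch of that relation as a function; all numeric inputs below are placeholders, not the paper's values:

```python
# Standard internal-standard qNMR purity relation.
def qnmr_purity(I_a, I_std, N_a, N_std, M_a, M_std, m_a, m_std, P_std):
    """Purity of analyte a from integrals (I), proton counts (N),
    molar masses (M), weighed masses (m) and standard purity (P_std)."""
    return (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * (m_std / m_a) * P_std

# Hypothetical numbers for octreotide acetate vs gemcitabine hydrochloride:
print(qnmr_purity(I_a=1.00, I_std=0.98, N_a=2, N_std=2,
                  M_a=1079.3, M_std=299.66,
                  m_a=13.0e-3, m_std=2.9e-3, P_std=0.998))
```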
NASA Astrophysics Data System (ADS)
Bravo-Imaz, Inaki; Davari Ardakani, Hossein; Liu, Zongchang; García-Arribas, Alfredo; Arnaiz, Aitor; Lee, Jay
2017-09-01
This paper focuses on analyzing the motor current signature for fault diagnosis of gearboxes operating under transient speed regimes. Two different strategies for analyzing the motor current signature are evaluated, extensively tested, and compared in order to implement a condition monitoring system for gearboxes in industrial machinery. A specially designed test bench, thoroughly monitored to fully characterize the experiments, is used to test gears in different health states. The measured signals are analyzed using discrete wavelet decomposition at different decomposition levels with a range of mother wavelets. Moreover, a dual-level time synchronous averaging analysis is performed on the same signals to compare the performance of the two methods. From both analyses, the relevant features of the signals are extracted and cataloged using a self-organizing map, which allows easy detection and classification of the diverse health states of the gears. The results demonstrate the effectiveness of both methods for diagnosing gearbox faults, with slightly better performance observed for the dual-level time synchronous averaging method. Based on the obtained results, the proposed methods can be used as effective and reliable procedures for gearbox condition monitoring using only the motor current signature.
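The wavelet branch of the analysis (decompose the current, summarize each level, feed the summaries to a classifier) can be sketched with PyWavelets; the signal, wavelet choice, and energy features below are assumptions for illustration:

```python
# Discrete wavelet decomposition of a motor current signal with
# per-level energies as candidate features for a self-organizing map.
import numpy as np
import pywt

fs = 5000.0
t = np.arange(0, 2, 1 / fs)
# Synthetic stator current: 50 Hz supply plus a weak fault-related sideband.
current = np.sin(2 * np.pi * 50 * t) + 0.05 * np.sin(2 * np.pi * 35 * t)

coeffs = pywt.wavedec(current, wavelet="db4", level=5)
energies = [float(np.sum(c ** 2)) for c in coeffs]
print("per-level wavelet energies:", np.round(energies, 2))
```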
Rank Determination of Mental Functions by 1D Wavelets and Partial Correlation.
Karaca, Y; Aslan, Z; Cattani, C; Galletta, D; Zhang, Y
2017-01-01
The main aim of this paper is to classify mental functions measured by the Wechsler Adult Intelligence Scale-Revised tests with a mixed method based on wavelets and partial correlation. The Wechsler Adult Intelligence Scale-Revised is a widely used test designed and applied for the comprehensive classification of adults' cognitive skills. In this paper, many different intellectual profiles have been taken into consideration to measure the relationship between mental functioning and psychological disorder. We propose a method based on wavelets and correlation analysis for classifying mental functioning through the analysis of selected parameters measured by the Wechsler Adult Intelligence Scale-Revised tests. In particular, 1-D Continuous Wavelet Analysis, the 1-D Wavelet Coefficient Method and the Partial Correlation Method have been applied to Wechsler Adult Intelligence Scale-Revised parameters such as School Education, Gender, Age, Performance Information Verbal and Full Scale Intelligence Quotient. We show that the Gender variable has a negative but significant effect on the Age and Performance Information Verbal factors, and that the Age parameter also plays a significant role in Performance Information Verbal and Full Scale Intelligence Quotient change.
Mačkić-Đurović, Mirela; Projić, Petar; Ibrulj, Slavka; Cakar, Jasmina; Marjanović, Damir
2014-05-01
The goal of this study was to examine the effectiveness of the application of 6 STR markers (D21S1435, D21S11, D21S1270, D21S1411, D21S226 and IFNAR) in the molecular genetic diagnostics of Down syndrome (DS) and to compare it with the cytogenetic method. Testing was performed on 73 children with previously cytogenetically confirmed Down syndrome, using DNA isolated from buccal swabs. The above loci, located on chromosome 21, were simultaneously amplified using quantitative fluorescence PCR (QF-PCR). Using this method, 60 previously cytogenetically diagnosed cases of DS with the standard type of trisomy 21 were confirmed, as were six of eight children with the mosaic type of DS; two false negative results for the mosaic type were obtained. Five children with the translocation type of Down syndrome were also confirmed with this molecular test. In conclusion, molecular genetic analysis of STR loci is a fast, cheap and simple method that can be used in the detection of DS. Given the possibility of false results for a certain number of mosaic cases, cytogenetic analysis should be used as a confirmatory test.
[Seed quality test methods of Paeonia suffruticosa].
Cao, Ya-Yue; Zhu, Zai-Biao; Guo, Qiao-Sheng; Liu, Li; Wang, Chang-Lin
2014-11-01
In order to optimize the testing methods for Paeonia suffruticosa seed quality and provide a basis for establishing seed testing rules and a seed quality standard for P. suffruticosa, the seed quality of P. suffruticosa from different producing areas was measured according to the related seed testing regulations, and seed testing methods for the quality items of P. suffruticosa were established preliminarily. The sample weight of P. suffruticosa was at least 7,000 g for purity analysis and at least 700 g for testing. Phenotypic observation and size measurement were used for authenticity testing. The 1,000-seed weight was determined by the 100-seed method, and the water content was determined by the low-temperature drying method (10 hours). After soaking in distilled water for 24 h, the seeds were treated with alternating day/night temperature stratification (25 °C/20 °C, day/night) in the dark for 60 d. After soaking in a 300 mg·L(-1) GA3 solution for 24 h, the P. suffruticosa seeds were cultured in wet sand at 15 °C for 12-60 days for germination testing. Seed viability was tested by the TTC method.
Integration of optical measurement methods with flight parameter measurement systems
NASA Astrophysics Data System (ADS)
Kopecki, Grzegorz; Rzucidlo, Pawel
2016-05-01
During the AIM (Advanced In-flight Measurement Techniques) and AIM2 projects, innovative measurement techniques were developed. The purpose of the AIM project was to develop optical measurement techniques dedicated to flight testing. Such methods provide information about the deformation of aircraft elements, thermal loads, pressure distribution, etc. AIM2 continued the development of optical methods for flight testing; in particular, it aimed at methods that could easily be applied in flight tests in an industrial setting. Another, equally important task was to guarantee the synchronization of the classical measuring system with the cameras. The PW-6U glider used in the flight tests was provided by the Rzeszów University of Technology. The glider had all the equipment necessary for testing the IPCT (image pattern correlation technique) and IRT (infrared thermometry) methods. Additionally, equipment for measuring, recording and analyzing typical flight parameters was developed. This article describes the designed system and presents its application during flight tests. The results obtained in flight tests also show certain limitations of the IRT method as applied.
NASA Astrophysics Data System (ADS)
Liu, Bin; Gang, Tie; Wan, Chuhao; Wang, Changxi; Luo, Zhiwei
2015-07-01
The vibro-acoustic modulation technique is a nonlinear ultrasonic method used in nondestructive testing. It detects defects by monitoring the modulation components generated by the interaction between a low-frequency vibration and an ultrasound wave, an interaction caused by the nonlinear material behaviour associated with damage. In this work, a swept-frequency signal was used as the high-frequency excitation. Hilbert-transform-based amplitude and phase demodulation and synchronous demodulation (SD) were then used to extract the modulation information from the received signal, and the results were plotted in the time-frequency domain after a short-time Fourier transform. The demodulation results differed considerably from each other, and the reason for the difference was investigated by analysing the demodulation process of the two methods. According to this analysis and a subsequent verification test, the SD method was more appropriate for the test, and a new index called MISD was defined to evaluate structure quality in vibro-acoustic modulation tests with swept probing excitation.
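For illustration only, the following sketch shows Hilbert-transform envelope and instantaneous-frequency demodulation with SciPy; it is a generic demodulation step under assumed inputs, not the authors' exact pipeline, and their synchronous demodulation and MISD index are not reproduced here.

```python
import numpy as np
from scipy.signal import hilbert

def hilbert_demodulate(received, fs):
    """Return the amplitude envelope and instantaneous frequency of a signal."""
    analytic = hilbert(received)                    # analytic signal
    amplitude = np.abs(analytic)                    # envelope (AM content)
    phase = np.unwrap(np.angle(analytic))           # instantaneous phase
    inst_freq = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency (FM)
    return amplitude, inst_freq
```

A short-time Fourier transform of the envelope (e.g. scipy.signal.stft) would then expose the modulation sidebands in the time-frequency domain, as the paper describes.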
Sakoulas, George; Gold, Howard S.; Venkataraman, Lata; DeGirolami, Paola C.; Eliopoulos, George M.; Qian, Qinfang
2001-01-01
Methicillin-resistant Staphylococcus aureus (MRSA) is responsible for an increasing number of serious nosocomial and community-acquired infections. Phenotypic heterogeneous drug resistance (heteroresistance) to antistaphylococcal beta-lactams affects the results of susceptibility testing. The present study compared the MRSA-Screen latex agglutination test (Denka Seiken Co., Ltd., Tokyo, Japan) for detection of PBP 2a with agar dilution, the VITEK-1 and VITEK-2 systems (bioMérieux, St. Louis, Mo.), and the oxacillin agar screen test for detection of MRSA, with PCR for the mecA gene used as the “gold standard” assay. Analysis of 107 methicillin-susceptible S. aureus (MSSA) isolates and 203 MRSA isolates revealed that the MRSA-Screen latex agglutination test is superior to any single phenotype-based susceptibility testing method, with a sensitivity of 100% and a specificity of 99.1%. Only one isolate that lacked mecA was weakly positive by the MRSA-Screen latex agglutination test. This isolate was phenotypically susceptible to oxacillin and did not contain the mecA gene by Southern blot hybridization. The oxacillin agar screen test, the VITEK-1 system, the VITEK-2 system, and agar dilution showed sensitivities of 99.0, 99.0, 99.5, and 99.0%, respectively, and specificities of 98.1, 100, 97.2, and 100%, respectively. The differences in sensitivity or specificity were not statistically significant. Oxacillin bactericidal assays showed that mecA- and PBP 2a-positive S. aureus isolates that are susceptible to antistaphylococcal beta-lactams by conventional methods are functionally resistant to oxacillin. We conclude that the accuracy of the MRSA-Screen latex agglutination method for detection of PBP 2a approaches the accuracy of PCR and is more accurate than any susceptibility testing method used alone for the detection of MRSA. PMID:11682512
DC/DC Converter Stability Testing Study
NASA Technical Reports Server (NTRS)
Wang, Bright L.
2008-01-01
This report presents the results of a study on stability testing methods for hybrid DC/DC converters. An input impedance measurement method and a gain/phase margin measurement method were evaluated and found effective for detecting front-end oscillation and feedback-loop oscillation, respectively. In particular, the power levels in certain channels of the converter input noise were found to correlate strongly with the gain/phase margins. Spectral analysis of converter input noise thus becomes a potential new method for evaluating the stability of all types of DC/DC converters.
Analysis of subsonic wind tunnel with variation shape rectangular and octagonal on test section
NASA Astrophysics Data System (ADS)
Rhakasywi, D.; Ismail; Suwandi, A.; Fadhli, A.
2018-02-01
Good aerodynamic design work requires a well-designed wind tunnel, in this case one capable of generating laminar flow. This research examined wind tunnel models with rectangular and octagonal test-section variants, with the objective of generating laminar flow in the test section. The research method used a numerical CFD (Computational Fluid Dynamics) approach together with manual analysis to analyze the internal flow in the test section. According to the CFD simulation results and the manual analysis, the optimal design for generating laminar flow in the test section is an octagonal shape without fillets.
NASA Technical Reports Server (NTRS)
Johnson, W. S. (Editor)
1989-01-01
The present conference discusses the tension and compression testing of MMCs, the measurement of advanced composites' thermal expansion, plasticity theory for fiber-reinforced composites, a deformation analysis of boron/aluminum specimens by moire interferometry, strength prediction methods for MMCs, and the analysis of notched MMCs under tensile loading. Also discussed are techniques for the mechanical and thermal testing of Ti3Al/SCS-6 MMCs, damage initiation and growth in fiber-reinforced MMCs, the shear testing of MMCs, the crack growth and fracture of continuous fiber-reinforced MMCs in view of analytical and experimental results, and MMC fiber-matrix interface failures.
Why Students Fail at Volumetric Analysis.
ERIC Educational Resources Information Center
Pickering, Miles
1979-01-01
Investigates the reasons for students' failure in an introductory volumetric analysis course by analyzing test papers and judging them against a hypothetical ideal method of grading laboratory techniques. (GA)
Atmospheric turbulence profiling with SLODAR using multiple adaptive optics wavefront sensors.
Wang, Lianqi; Schöck, Matthias; Chanan, Gary
2008-04-10
The slope detection and ranging (SLODAR) method recovers atmospheric turbulence profiles from time averaged spatial cross correlations of wavefront slopes measured by Shack-Hartmann wavefront sensors. The Palomar multiple guide star unit (MGSU) was set up to test tomographic multiple guide star adaptive optics and provided an ideal test bed for SLODAR turbulence altitude profiling. We present the data reduction methods and SLODAR results from MGSU observations made in 2006. Wind profiling is also performed using delayed wavefront cross correlations along with SLODAR analysis. The wind profiling analysis is shown to improve the height resolution of the SLODAR method and in addition gives the wind velocities of the turbulent layers.
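A much-simplified sketch of the SLODAR core computation, the time-averaged spatial cross-correlation of slope maps from two Shack-Hartmann sensors, is given below; real pipelines also normalise by the subaperture overlap and fit theoretical response functions, which is omitted here.

```python
import numpy as np

def slodar_xcorr(slopes_a, slopes_b):
    """Time-averaged 2-D cross-correlation of two slope-map sequences.

    slopes_a, slopes_b: arrays of shape (n_frames, ny, nx).
    Lags are returned in FFT order (zero lag at index [0, 0]). A correlation
    peak at subaperture offset s corresponds to a turbulent layer at altitude
    h ~ s * d / theta, with d the subaperture size and theta the star separation.
    """
    n_frames, ny, nx = slopes_a.shape
    shape = (2 * ny - 1, 2 * nx - 1)        # zero-pad for linear correlation
    acc = np.zeros(shape)
    for a, b in zip(slopes_a, slopes_b):
        fa = np.fft.fft2(a - a.mean(), s=shape)
        fb = np.fft.fft2(b - b.mean(), s=shape)
        acc += np.real(np.fft.ifft2(fa * np.conj(fb)))
    return acc / n_frames
```

Delaying one slope sequence relative to the other before correlating, as in the wind-profiling analysis above, simply means pairing frame t of one sensor with frame t + dt of the other.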
Chen, J D; Sun, H L
1999-04-01
Objective. To dynamically assess and predict the reliability of equipment by making full use of the various test information generated during product development. Method. A new reliability growth assessment method based on the Army Material System Analysis Activity (AMSAA) model was developed; the method combines the AMSAA model with test data conversion technology. Result. The assessment and prediction results for a space-borne equipment item conformed to expectations. Conclusion. It is suggested that this method be further researched and popularized.
Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.
Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R
2012-08-01
Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms acting on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, as well as parametric survival models and accelerated failure time (AFT) models with log-normal, log-logistic and Weibull distributions, were used to detect differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing data increases. The testing procedures discussed in this article can all be performed using readily available software such as R, and the R code is provided as supplemental material.
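As a sketch of the AFT idea for left-censored peak intensities (in Python rather than the R code the paper supplies), the snippet below fits a log-normal model by maximum likelihood: detected peaks contribute the log-density, while censored peaks contribute the probability mass below the detection limit. The data, detection limit, and single group covariate are invented for illustration.

```python
import numpy as np
from scipy import stats, optimize

def lognormal_aft_negloglik(params, y, censored, x):
    """Negative log-likelihood of a log-normal AFT model with left-censoring.

    y: intensities, with censored entries set to the detection limit;
    censored: boolean mask; x: 0/1 group indicator.
    """
    b0, b1, log_sigma = params
    sigma = np.exp(log_sigma)
    z = (np.log(y) - (b0 + b1 * x)) / sigma
    ll_obs = stats.norm.logpdf(z) - np.log(sigma * y)  # density of detected peaks
    ll_cens = stats.norm.logcdf(z)                     # mass below detection limit
    return -np.sum(np.where(censored, ll_cens, ll_obs))

y = np.array([3.1, 1.0, 5.7, 1.0, 4.2, 8.9])           # 1.0 = detection limit
censored = np.array([False, True, False, True, False, False])
x = np.array([0, 0, 0, 1, 1, 1])
fit = optimize.minimize(lognormal_aft_negloglik, x0=[1.0, 0.0, 0.0],
                        args=(y, censored, x), method="Nelder-Mead")
print(fit.x)  # b1 estimates the group difference on the log-intensity scale
```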
16 CFR 309.10 - Alternative vehicle fuel rating.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Analysis of Natural Gas by Gas Chromatography.” For the purposes of this section, fuel ratings for the... methods set forth in ASTM D 1946-90, “Standard Practice for Analysis of Reformed Gas by Gas Chromatography... the principal component of compressed natural gas are to be determined in accordance with test methods...
16 CFR 309.10 - Alternative vehicle fuel rating.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Analysis of Natural Gas by Gas Chromatography.” For the purposes of this section, fuel ratings for the... methods set forth in ASTM D 1946-90, “Standard Practice for Analysis of Reformed Gas by Gas Chromatography... the principal component of compressed natural gas are to be determined in accordance with test methods...
40 CFR 60.435 - Test methods and procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... of any affected facility using solvent-borne ink systems shall determine the VOC content of the raw inks and related coatings used at the affected facility by: (1) Analysis using Method 24A of routine weekly samples of raw ink and related coatings in each respective storage tank; or (2) Analysis using...
40 CFR 60.435 - Test methods and procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... of any affected facility using solvent-borne ink systems shall determine the VOC content of the raw inks and related coatings used at the affected facility by: (1) Analysis using Method 24A of routine weekly samples of raw ink and related coatings in each respective storage tank; or (2) Analysis using...
40 CFR 60.435 - Test methods and procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
... of any affected facility using solvent-borne ink systems shall determine the VOC content of the raw inks and related coatings used at the affected facility by: (1) Analysis using Method 24A of routine weekly samples of raw ink and related coatings in each respective storage tank; or (2) Analysis using...
40 CFR 98.324 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Tracking System Handbook Number: PH-08-V-1, January 1, 2008 (incorporated by reference, see § 98.7). You... paragraphs (d)(1) through (d)(2) of this section. (1) ASTM D1945-03, Standard Test Method for Analysis of... Reformed Gas by Gas Chromatography; ASTM D4891-89 (Reapproved 2006), Standard Test Method for Heating Value...
Wort free amino nitrogen analysis adapted to a microplate format
USDA-ARS?s Scientific Manuscript database
The standard method for determining wort free amino nitrogen content calls for the use of test tubes and glass marbles, as well as boiling and 20°C water baths. In this paper we describe how the standard method can be updated and streamlined by replacing water baths, test tubes and marbles with a th...
Exploring and Testing the Predictors of New Faculty Success: A Mixed Methods Study
ERIC Educational Resources Information Center
Stupnisky, R. H.; Weaver-Hightower, M. B.; Kartoshkina, Y.
2015-01-01
The purpose of this mixed methods study was to investigate and test the factors contributing to new faculty members' success. In the first phase, qualitative analysis of focus groups revealed four prominent themes affecting new faculty members: expectations, collegiality, balance, and location. In the second phase, new faculty members completed an…
Testing the Self-Efficacy Questionnaire with Korean Children in Institutionalized Care
ERIC Educational Resources Information Center
Kim, Youngmi; Kim, Kyeongmo; Lee, Shinhye
2017-01-01
Purpose: We tested the reliability and validity of the Self-Efficacy Questionnaire for Children (SEQ-C) in a sample of children living in orphanages in South Korea. Methods: Our study sample consisted of 334 children aged 13-18 obtained using a convenience sampling method. We conducted a confirmatory factor analysis to identify the factor…
Pikkemaat, M G; Rapallini, M L B A; Karp, M T; Elferink, J W A
2010-08-01
Tetracyclines are extensively used in veterinary medicine. For the detection of tetracycline residues in animal products, a broad array of methods is available. Luminescent bacterial biosensors represent an attractive, inexpensive, simple and fast method for screening large numbers of samples. A previously developed cell-biosensor method was subjected to an evaluation study using over 300 routine poultry samples, and the results were compared with a microbial inhibition test. The cell-biosensor assay yielded many more suspect samples, 10.2% versus 2% with the inhibition test, all of which could be confirmed by liquid chromatography-tandem mass spectrometry (LC-MS/MS). Only one sample contained a concentration above the maximum residue limit (MRL) of 100 microg kg(-1), while residue levels in most of the suspect samples were very low (<10 microg kg(-1)). The method appeared to be specific and robust. An experimental set-up comprising the analysis of a series of three sample dilutions allowed an appropriate cut-off for confirmatory analysis to be set, limiting the number of samples requiring further analysis to a minimum.
NASA Astrophysics Data System (ADS)
Buri, N.; Mantau, Z.
2018-05-01
The share of food expenditure is one indicator of food security in communities; it can also be used as an indicator of the success of rural development. The aim of this research was to determine the share of food expenditure of farm households before and after the Food Reserved Garden Area (KRPL/FRGA) program in the Suwawa and Tilongkabila districts of Bone Bolango Regency, Gorontalo Province. The analysis used the share-of-food-expenditure method, which measures the ratio of household food expenditure to total household expenditure over a month. Statistical testing used a non-parametric method, the Wilcoxon test (two paired samples test). The results show that the KRPL program in Ulanta Village of Suwawa district did not significantly affect the share of food expenditure of farm households, while in South Tunggulo village of Tilongkabila district the FRGA program significantly affected the share of food expenditure.
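A minimal sketch of the paired Wilcoxon comparison used in the study follows; the household shares below are made-up numbers, not the study's data.

```python
from scipy.stats import wilcoxon

# Hypothetical share-of-food-expenditure values (fraction of total monthly
# household expenditure) for the same farm households before and after FRGA.
before = [0.62, 0.58, 0.71, 0.55, 0.66, 0.60, 0.69]
after = [0.57, 0.55, 0.70, 0.49, 0.60, 0.61, 0.65]

stat, p = wilcoxon(before, after)
print(f"W = {stat}, p = {p:.3f}")  # p < 0.05 would indicate a significant change
```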
Wright, Stuart J; Vass, Caroline M; Sim, Gene; Burton, Michael; Fiebig, Denzil G; Payne, Katherine
2018-02-28
Scale heterogeneity, or differences in the error variance of choices, may account for a significant amount of the observed variation in the results of discrete choice experiments (DCEs) when comparing preferences between different groups of respondents. The aim of this study was to identify if, and how, scale heterogeneity has been addressed in healthcare DCEs that compare the preferences of different groups. A systematic review identified all healthcare DCEs published between 1990 and February 2016. The full text of each DCE was then screened to identify studies that compared preferences using data generated from multiple groups. Data were extracted and tabulated on year of publication, samples compared, tests for scale heterogeneity, and analytical methods to account for scale heterogeneity. Narrative analysis was used to describe if, and how, scale heterogeneity was accounted for when preferences were compared. A total of 626 healthcare DCEs were identified. Of these, 199 (32%) aimed to compare the preferences of different groups specified at the design stage, while 79 (13%) compared the preferences of groups identified at the analysis stage. Of the 278 included papers, 49 (18%) discussed potential scale issues, 18 (7%) used a formal method of analysis to account for scale between groups, and 2 (1%) accounted for scale differences between preference groups at the analysis stage. Scale heterogeneity was present in 65% (n = 13) of studies that tested for it. Analytical methods to test for scale heterogeneity included coefficient plots (n = 5, 2%), heteroscedastic conditional logit models (n = 6, 2%), Swait and Louviere tests (n = 4, 1%), generalised multinomial logit models (n = 5, 2%), and scale-adjusted latent class analysis (n = 2, 1%). Scale heterogeneity is a prevalent issue in healthcare DCEs. Despite this, few published DCEs have discussed such issues, and fewer still have used formal methods to identify and account for the impact of scale heterogeneity. Formal methods to test for scale heterogeneity should be used; otherwise DCE results risk producing biased and potentially misleading conclusions regarding preferences for aspects of healthcare.
Computational Fluid Dynamics Analysis Success Stories of X-Plane Design to Flight Test
NASA Technical Reports Server (NTRS)
Cosentino, Gary B.
2008-01-01
Examples of the design and flight test of three true X-planes are described, particularly X-plane design techniques that relied heavily on computational fluid dynamics(CFD) analysis. Three examples are presented: the X-36 Tailless Fighter Agility Research Aircraft, the X-45A Unmanned Combat Air Vehicle, and the X-48B Blended Wing Body Demonstrator Aircraft. An overview is presented of the uses of CFD analysis, comparison and contrast with wind tunnel testing, and information derived from CFD analysis that directly related to successful flight test. Lessons learned on the proper and improper application of CFD analysis are presented. Highlights of the flight-test results of the three example X-planes are presented. This report discusses developing an aircraft shape from early concept and three-dimensional modeling through CFD analysis, wind tunnel testing, further refined CFD analysis, and, finally, flight. An overview of the areas in which CFD analysis does and does not perform well during this process is presented. How wind tunnel testing complements, calibrates, and verifies CFD analysis is discussed. Lessons learned revealing circumstances under which CFD analysis results can be misleading are given. Strengths and weaknesses of the various flow solvers, including panel methods, Euler, and Navier-Stokes techniques, are discussed.
Parametric Methods for Dynamic 11C-Phenytoin PET Studies.
Mansor, Syahir; Yaqub, Maqsood; Boellaard, Ronald; Froklage, Femke E; de Vries, Anke; Bakker, Esther D M; Voskuyl, Rob A; Eriksson, Jonas; Schwarte, Lothar A; Verbeek, Joost; Windhorst, Albert D; Lammertsma, Adriaan A
2017-03-01
In this study, the performance of various methods for generating quantitative parametric images of dynamic 11C-phenytoin PET studies was evaluated. Methods: Double-baseline 60-min dynamic 11C-phenytoin PET studies, including online arterial sampling, were acquired for 6 healthy subjects. Parametric images were generated using Logan plot analysis, a basis function method, and spectral analysis. Parametric distribution volume (VT) and influx rate (K1) were compared with those obtained from nonlinear regression analysis of time-activity curves. In addition, global and regional test-retest (TRT) variability was determined for parametric K1 and VT values. Results: Biases in VT observed with all parametric methods were less than 5%. For K1, spectral analysis showed a negative bias of 16%. The mean TRT variabilities of VT and K1 were less than 10% for all methods. Shortening the scan duration to 45 min provided similar VT and K1 with comparable TRT performance compared with 60-min data. Conclusion: Among the various parametric methods tested, the basis function method provided parametric VT and K1 values with the least bias compared with nonlinear regression data and showed TRT variabilities lower than 5%, also for smaller volume-of-interest sizes (i.e., higher noise levels) and shorter scan duration. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
Markby, Jessica; Boeke, Caroline; Penazzato, Martina; Urick, Brittany; Ghadrshenas, Anisa; Harris, Lindsay; Ford, Nathan; Peter, Trevor
2017-01-01
Background: Despite significant gains made toward improving access, early infant diagnosis (EID) testing programs suffer from long test turnaround times that result in substantial loss to follow-up and mortality associated with delays in antiretroviral therapy initiation. These delays in treatment initiation are particularly impactful because of significant HIV-related infant mortality observed by 2–3 months of age. Short message service (SMS) and general packet radio service (GPRS) printers allow test results to be transmitted immediately to health care facilities on completion of testing in the laboratory. Methods: We conducted a systematic review and meta-analysis to assess the benefit of using SMS/GPRS printers to increase the efficiency of EID test result delivery compared with traditional courier paper–based results delivery methods. Results: We identified 11 studies contributing data for over 16,000 patients from East and Southern Africa. The test turnaround time from specimen collection to result received at the health care facility with courier paper–based methods was 68.0 days (n = 6835), whereas the test turnaround time with SMS/GPRS printers was 51.1 days (n = 6711), resulting in a 2.5-week (25%) reduction in the turnaround time. Conclusions: Courier paper–based EID test result delivery methods are estimated to add 2.5 weeks to EID test turnaround times in low resource settings and increase the risk that infants receive test results during or after the early peak of infant mortality. SMS/GPRS result delivery to health care facility printers significantly reduced test turnaround time and may reduce this risk. SMS/GPRS printers should be considered for expedited delivery of EID and other centralized laboratory test results. PMID:28825941
Sarrigiannis, Ptolemaios G; Zhao, Yifan; Wei, Hua-Liang; Billings, Stephen A; Fotheringham, Jayne; Hadjivassiliou, Marios
2014-01-01
To introduce a new method of quantitative EEG analysis in the time domain, the error reduction ratio (ERR)-causality test. To compare performance against cross-correlation and coherence with phase measures. A simulation example was used as a gold standard to assess the performance of ERR-causality, against cross-correlation and coherence. The methods were then applied to real EEG data. Analysis of both simulated and real EEG data demonstrates that ERR-causality successfully detects dynamically evolving changes between two signals, with very high time resolution, dependent on the sampling rate of the data. Our method can properly detect both linear and non-linear effects, encountered during analysis of focal and generalised seizures. We introduce a new quantitative EEG method of analysis. It detects real time levels of synchronisation in the linear and non-linear domains. It computes directionality of information flow with corresponding time lags. This novel dynamic real time EEG signal analysis unveils hidden neural network interactions with a very high time resolution. These interactions cannot be adequately resolved by the traditional methods of coherence and cross-correlation, which provide limited results in the presence of non-linear effects and lack fidelity for changes appearing over small periods of time. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
Pyrolysis kinetics and combustion of thin wood by an advanced cone calorimetry test method
Mark Dietenberger
2012-01-01
Pyrolysis kinetics analysis of extractives, holocellulose, and lignin in solid redwood over the entire heating regime was made possible by a specialized cone calorimeter test and new mathematical analysis tools. Added hardware components include: a modified sample holder for the thin specimen with tiny thermocouples, and a methane ring burner with stainless-steel mesh above...
An Analysis of Variance Approach for the Estimation of Response Time Distributions in Tests
ERIC Educational Resources Information Center
Attali, Yigal
2010-01-01
Generalizability theory and analysis of variance methods are employed, together with the concept of objective time pressure, to estimate response time distributions and the degree of time pressure in timed tests. By estimating response time variance components due to person, item, and their interaction, and fixed effects due to item types and…
Laboratory Analytical Procedures | Bioenergy | NREL
analytical procedures (LAPs) to provide validated methods for biofuels and pyrolysis bio-oils research . Biomass Compositional Analysis These lab procedures provide tested and accepted methods for performing
A comparison of simple shear characterization methods for composite laminates
NASA Technical Reports Server (NTRS)
Yeow, Y. T.; Brinson, H. F.
1978-01-01
Various methods for the shear stress/strain characterization of composite laminates are examined and their advantages and limitations are briefly discussed. Experimental results and the necessary accompanying analysis are then presented and compared for three simple shear characterization procedures. These are the off-axis tensile test method, the (+/- 45 deg)s tensile test method and the (0/90 deg)s symmetric rail shear test method. It is shown that the first technique indicates the shear properties of the graphite/epoxy laminates investigated are fundamentally brittle in nature while the latter two methods tend to indicate that these laminates are fundamentally ductile in nature. Finally, predictions of incrementally determined tensile stress/strain curves utilizing the various different shear behaviour methods as input information are presented and discussed.
A comparison of simple shear characterization methods for composite laminates
NASA Technical Reports Server (NTRS)
Yeow, Y. T.; Brinson, H. F.
1977-01-01
Various methods for the shear stress-strain characterization of composite laminates are examined, and their advantages and limitations are briefly discussed. Experimental results and the necessary accompanying analysis are then presented and compared for three simple shear characterization procedures. These are the off-axis tensile test method, the + or - 45 degs tensile test method and the 0 deg/90 degs symmetric rail shear test method. It is shown that the first technique indicates that the shear properties of the G/E laminates investigated are fundamentally brittle in nature while the latter two methods tend to indicate that the G/E laminates are fundamentally ductile in nature. Finally, predictions of incrementally determined tensile stress-strain curves utilizing the various different shear behavior methods as input information are presented and discussed.
Position Accuracy Analysis of a Robust Vision-Based Navigation
NASA Astrophysics Data System (ADS)
Gaglione, S.; Del Pizzo, S.; Troisi, S.; Angrisano, A.
2018-05-01
Using images to determine camera position and attitude is a consolidated method, widespread in applications such as UAV navigation. In harsh environments, where GNSS can be degraded or denied, image-based positioning is a possible candidate for an integrated or alternative system. In this paper, such a method is investigated using a system based on a single camera and 3D maps. A robust estimation method is proposed in order to limit the effect of blunders or noisy measurements on the position solution. The proposed approach is tested using images collected in an urban canyon, where GNSS positioning is very inaccurate. A photogrammetric survey was previously performed to build the 3D model of the tested area. A position accuracy analysis is performed and the effect of the proposed robust method is validated.
Data handling and analysis for the 1971 corn blight watch experiment.
NASA Technical Reports Server (NTRS)
Anuta, P. E.; Phillips, T. L.; Landgrebe, D. A.
1972-01-01
Review of the data handling and analysis methods used in the near-operational test of remote sensing systems provided by the 1971 corn blight watch experiment. The general data analysis techniques and, particularly, the statistical multispectral pattern recognition methods for automatic computer analysis of aircraft scanner data are described. Some of the results obtained are examined, and the implications of the experiment for future data communication requirements of earth resource survey systems are discussed.
NASA Astrophysics Data System (ADS)
Chatzistergos, Theodosios; Ermolli, Ilaria; Solanki, Sami K.; Krivova, Natalie A.
2018-01-01
Context. Historical Ca II K spectroheliograms (SHG) are unique in representing long-term variations of the solar chromospheric magnetic field. They usually suffer from numerous problems and lack photometric calibration. Thus accurate processing of these data is required to get meaningful results from their analysis. Aims: In this paper we aim at developing an automatic processing and photometric calibration method that provides precise and consistent results when applied to historical SHG. Methods: The proposed method is based on the assumption that the centre-to-limb variation of the intensity in quiet Sun regions does not vary with time. We tested the accuracy of the proposed method on various sets of synthetic images that mimic problems encountered in historical observations. We also tested our approach on a large sample of images randomly extracted from seven different SHG archives. Results: The tests carried out on the synthetic data show that the maximum relative errors of the method are generally <6.5%, while the average error is <1%, even if rather poor quality observations are considered. In the absence of strong artefacts the method returns images that differ from the ideal ones by <2% in any pixel. The method gives consistent values for both plage and network areas. We also show that our method returns consistent results for images from different SHG archives. Conclusions: Our tests show that the proposed method is more accurate than other methods presented in the literature. Our method can also be applied to process images from photographic archives of solar observations at other wavelengths than Ca II K.
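A much-simplified sketch of the quiet-Sun centre-to-limb step is shown below: estimate the radial intensity profile with a median (which suppresses plage), then divide it out so quiet-Sun pixels have contrast near unity. The disc centre, radius, and binning are assumed inputs; the paper's actual pipeline is considerably more elaborate.

```python
import numpy as np

def clv_profile(img, cx, cy, r_sun, n_bins=50):
    """Median radial intensity profile, used as the quiet-Sun CLV estimate."""
    yy, xx = np.indices(img.shape)
    r = np.hypot(xx - cx, yy - cy) / r_sun          # radius in disc radii
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    idx = np.digitize(r.ravel(), bins) - 1
    vals = img.ravel()
    prof = np.array([np.median(vals[idx == i]) for i in range(n_bins)])
    return bins[:-1] + 0.5 / n_bins, prof

def flatten(img, cx, cy, r_sun):
    """Divide out the CLV so quiet-Sun pixels have contrast ~ 1."""
    centres, prof = clv_profile(img, cx, cy, r_sun)
    yy, xx = np.indices(img.shape)
    r = np.hypot(xx - cx, yy - cy) / r_sun
    out = np.full(img.shape, np.nan)
    disc = r <= 1.0
    out[disc] = img[disc] / np.interp(r[disc], centres, prof)
    return out
```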
Cecilio-Fernandes, Dario; Medema, Harro; Collares, Carlos Fernando; Schuwirth, Lambert; Cohen-Schotanus, Janke; Tio, René A
2017-11-09
Progress testing is an assessment tool used to periodically assess all students at the end-of-curriculum level. Because students cannot know everything, it is important that they recognize their lack of knowledge. For that reason, the formula-scoring method has usually been used. However, where partial knowledge needs to be taken into account, the number-right scoring method is used. Research comparing both methods has yielded conflicting results. As far as we know, in all these studies, Classical Test Theory or Generalizability Theory was used to analyze the data. In contrast to these studies, we will explore the use of the Rasch model to compare both methods. A 2 × 2 crossover design was used in a study where 298 students from four medical schools participated. A sample of 200 previously used questions from the progress tests was selected. The data were analyzed using the Rasch model, which provides fit parameters, reliability coefficients, and response option analysis. The fit parameters were in the optimal interval ranging from 0.50 to 1.50, and the means were around 1.00. The person and item reliability coefficients were higher in the number-right condition than in the formula-scoring condition. The response option analysis showed that the majority of dysfunctional items emerged in the formula-scoring condition. The findings of this study support the use of number-right scoring over formula scoring. Rasch model analyses showed that tests with number-right scoring have better psychometric properties than formula scoring. However, choosing the appropriate scoring method should depend not only on psychometric properties but also on self-directed test-taking strategies and metacognitive skills.
Vickers, Andrew J; Cronin, Angel M; Elkin, Elena B; Gonen, Mithat
2008-01-01
Background Decision curve analysis is a novel method for evaluating diagnostic tests, prediction models and molecular markers. It combines the mathematical simplicity of accuracy measures, such as sensitivity and specificity, with the clinical applicability of decision analytic approaches. Most critically, decision curve analysis can be applied directly to a data set, and does not require the sort of external data on costs, benefits and preferences typically required by traditional decision analytic techniques. Methods In this paper we present several extensions to decision curve analysis including correction for overfit, confidence intervals, application to censored data (including competing risk) and calculation of decision curves directly from predicted probabilities. All of these extensions are based on straightforward methods that have previously been described in the literature for application to analogous statistical techniques. Results Simulation studies showed that repeated 10-fold crossvalidation provided the best method for correcting a decision curve for overfit. The method for applying decision curves to censored data had little bias and coverage was excellent; for competing risk, decision curves were appropriately affected by the incidence of the competing risk and the association between the competing risk and the predictor of interest. Calculation of decision curves directly from predicted probabilities led to a smoothing of the decision curve. Conclusion Decision curve analysis can be easily extended to many of the applications common to performance measures for prediction models. Software to implement decision curve analysis is provided. PMID:19036144
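To make the central quantity concrete, here is a minimal net-benefit computation, the curve that decision curve analysis plots against threshold probability; the data and threshold grid are illustrative.

```python
import numpy as np

def net_benefit(y_true, p_hat, thresholds):
    """Net benefit of a prediction model across threshold probabilities pt:
    NB(pt) = TP/n - FP/n * pt / (1 - pt)."""
    y_true, p_hat = np.asarray(y_true), np.asarray(p_hat)
    n = len(y_true)
    nb = []
    for pt in thresholds:
        treat = p_hat >= pt                      # patients the model would treat
        tp = np.sum(treat & (y_true == 1))
        fp = np.sum(treat & (y_true == 0))
        nb.append(tp / n - fp / n * pt / (1 - pt))
    return np.array(nb)

thresholds = np.linspace(0.05, 0.50, 10)
# The model curve is compared against "treat all" (p_hat = 1 for everyone)
# and "treat none" (net benefit identically 0).
```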
A Method of DTM Construction Based on Quadrangular Irregular Networks and Related Error Analysis
Kang, Mengjun
2015-01-01
A new method of DTM construction based on quadrangular irregular networks (QINs) that considers all the original data points and has a topological matrix is presented. A numerical test and a real-world example are used to comparatively analyse the accuracy of QINs against classical interpolation methods and other DTM representation methods, including SPLINE, KRIGING and triangulated irregular networks (TINs). The numerical test finds that the QIN method is the second-most accurate of the four methods. In the real-world example, DTMs are constructed using QINs and the three classical interpolation methods. The results indicate that the QIN method is the most accurate method tested. The difference in accuracy rank seems to be caused by the locations of the data points sampled. Although the QIN method has drawbacks, it is an alternative method for DTM construction. PMID:25996691
Methods for the design and analysis of power optimized finite-state machines using clock gating
NASA Astrophysics Data System (ADS)
Chodorowski, Piotr
2017-11-01
The paper discusses two methods for the design of power-optimized FSMs, both using clock gating techniques. The main objective of the research was to write a program capable of automatically generating hardware descriptions of finite-state machines in VHDL, together with testbenches to support power analysis. The creation of the relevant output files is detailed step by step. The program was tested using the LGSynth91 FSM benchmark package. An analysis of the generated circuits shows that the second method presented in this paper leads to a significant reduction in power consumption.
40 CFR Table 2 to Subpart Cccc of... - Requirements for Performance Tests
Code of Federal Regulations, 2010 CFR
2010-07-01
... port's location and the number of traverse points Method 1* 3. Measure volumetric flow rate. Method 2* 4. Perform gas analysis to determine the dry molecular weight of the stack gas Method 3* 5...
Hydrogen recombiner catalyst test supporting data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Britton, M.D.
1995-01-19
This is a data package supporting the Hydrogen Recombiner Catalyst Performance and Carbon Monoxide Sorption Capacity Test Report, WHC-SD-WM-TRP-211, Rev 0. This report contains 10 appendices which consist of the following: Mass spectrometer analysis reports: HRC samples 93-001 through 93-157; Gas spectrometry analysis reports: HRC samples 93-141 through 93-658; Mass spectrometer procedure PNL-MA-299 ALO-284; Alternate analytical method for ammonia and water vapor; Sample log sheets; Job Safety analysis; Certificate of mixture analysis for feed gases; Flow controller calibration check; Westinghouse Standards Laboratory report on Bois flow calibrator; and Sorption capacity test data, tables, and graphs.
7 CFR 160.17 - Laboratory analysis.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 3 2011-01-01 2011-01-01 false Laboratory analysis. 160.17 Section 160.17 Agriculture... STANDARDS FOR NAVAL STORES Methods of Analysis, Inspection, Sampling and Grading § 160.17 Laboratory analysis. The analysis and laboratory testing of naval stores shall be conducted, so far as is practicable...
7 CFR 160.17 - Laboratory analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 3 2010-01-01 2010-01-01 false Laboratory analysis. 160.17 Section 160.17 Agriculture... STANDARDS FOR NAVAL STORES Methods of Analysis, Inspection, Sampling and Grading § 160.17 Laboratory analysis. The analysis and laboratory testing of naval stores shall be conducted, so far as is practicable...
Uncertainty Analysis of NASA Glenn's 8- by 6-Foot Supersonic Wind Tunnel
NASA Technical Reports Server (NTRS)
Stephens, Julia E.; Hubbard, Erin P.; Walter, Joel A.; McElroy, Tyler
2016-01-01
An analysis was performed to determine the measurement uncertainty of the Mach number in the 8- by 6-Foot Supersonic Wind Tunnel at the NASA Glenn Research Center. This paper details the analysis process used, including methods for handling limited data and complicated data correlations. Due to the complexity of the equations used, a Monte Carlo method was utilized for this uncertainty analysis. A summary of the findings is presented with regard to understanding what the uncertainties are, how they impact various research tests in the facility, and methods of reducing the uncertainties in the future.
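A toy Monte Carlo propagation in the same spirit, assuming the Mach number is derived from an isentropic total-to-static pressure ratio in air; the pressures, uncertainties, and distributions below are invented for illustration and are not the facility's values.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical total and static pressures (kPa) with assumed 1-sigma
# measurement uncertainties.
p_total = rng.normal(101.4, 0.15, n)
p_static = rng.normal(54.0, 0.10, n)

# Isentropic relation for air (gamma = 1.4):
# M = sqrt(5 * ((p0/p)**(2/7) - 1))
mach = np.sqrt(5.0 * ((p_total / p_static) ** (2.0 / 7.0) - 1.0))

print(f"M = {mach.mean():.4f} +/- {mach.std(ddof=1):.4f} (1-sigma)")
```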
OPATs: Omnibus P-value association tests.
Chen, Chia-Wei; Yang, Hsin-Chou
2017-07-10
Combining statistical significances (P-values) from a set of single-locus association tests in genome-wide association studies is a proof-of-principle method for identifying disease-associated genomic segments, functional genes and biological pathways. We review P-value combinations for genome-wide association studies and introduce an integrated analysis tool, Omnibus P-value Association Tests (OPATs), which provides popular P-value combination methods. The software OPATs, programmed in R with an R graphical user interface, features a user-friendly interface. In addition to analysis modules for data quality control and single-locus association tests, OPATs provides three types of set-based association test: window-, gene- and biopathway-based. P-value combinations with or without threshold and rank truncation are provided. The significance of a set-based association test is evaluated using resampling procedures. The performance of the set-based association tests in OPATs has been evaluated through simulation studies and real data analyses. These set-based tests help boost statistical power, alleviate the multiple-testing problem, reduce the impact of genetic heterogeneity, increase the replication efficiency of association tests and facilitate the interpretation of association signals by streamlining the testing procedures and integrating the genetic effects of multiple variants in genomic regions of biological relevance. In summary, P-value combinations facilitate the identification of marker sets associated with disease susceptibility and uncover missing heritability in association studies, thereby establishing a foundation for the genetic dissection of complex diseases and traits. OPATs provides an easy-to-use and statistically powerful analysis tool for P-value combinations. OPATs, examples and a user guide can be downloaded from http://www.stat.sinica.edu.tw/hsinchou/genetics/association/OPATs.htm. © The Author 2017. Published by Oxford University Press.
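Although OPATs itself is an R tool, the core idea of P-value combination is easy to sketch. Below are Fisher's method and a truncated-product variant in Python; the truncation threshold and the Monte Carlo calibration are illustrative choices, not OPATs' exact procedures.

```python
import numpy as np
from scipy import stats

def fisher_combine(pvals):
    """Fisher's method: -2 * sum(log p) ~ chi-square with 2k df under the
    global null, assuming independent tests."""
    pvals = np.asarray(pvals, dtype=float)
    return stats.chi2.sf(-2.0 * np.log(pvals).sum(), df=2 * len(pvals))

def truncated_product(pvals, tau=0.05, n_sim=10_000, seed=1):
    """Truncated product method: combine only p-values below tau and
    calibrate the statistic by simulating under the null."""
    pvals = np.asarray(pvals, dtype=float)
    logprod = lambda p: np.log(p[p < tau]).sum() if np.any(p < tau) else 0.0
    obs = logprod(pvals)
    rng = np.random.default_rng(seed)
    null = np.array([logprod(rng.uniform(size=len(pvals))) for _ in range(n_sim)])
    return np.mean(null <= obs)  # smaller log-product = stronger evidence

print(fisher_combine([0.01, 0.20, 0.03, 0.40]))
print(truncated_product([0.01, 0.20, 0.03, 0.40]))
```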
Bird, Susan M.; Fram, Miranda S.; Crepeau, Kathryn L.
2003-01-01
An analytical method has been developed for the determination of dissolved organic carbon concentration in water samples. This report includes the results of the tests used to validate the method and the quality-control practices used for dissolved organic carbon analysis. Prior to analysis, water samples are filtered to remove suspended particulate matter. A Shimadzu TOC-5000A Total Organic Carbon Analyzer in the nonpurgeable organic carbon mode is used to analyze the samples by high-temperature catalytic oxidation. The analysis usually is completed within 48 hours of sample collection. The laboratory reporting level is 0.22 milligrams per liter.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Okura, Yuki; Futamase, Toshifumi
We improve the ellipticity of re-smeared artificial image (ERA) method of point-spread function (PSF) correction in a weak lensing shear analysis in order to treat the realistic shape of galaxies and the PSF. This is done by re-smearing the PSF and the observed galaxy image using a re-smearing function (RSF) and allows us to use a new PSF with a simple shape and to correct the PSF effect without any approximations or assumptions. We perform a numerical test to show that the method applied for galaxies and PSF with some complicated shapes can correct the PSF effect with a systematic error of less than 0.1%. We also apply the ERA method to real data of the Abell 1689 cluster to confirm that it is able to detect the systematic weak lensing shear pattern. The ERA method requires less than 0.1 or 1 s to correct the PSF for each object in the numerical test and the real data analysis, respectively.
Meta-Analysis of Rare Binary Adverse Event Data
Bhaumik, Dulal K.; Amatya, Anup; Normand, Sharon-Lise; Greenhouse, Joel; Kaizar, Eloise; Neelon, Brian; Gibbons, Robert D.
2013-01-01
We examine the use of fixed-effects and random-effects moment-based meta-analytic methods for analysis of binary adverse event data. Special attention is paid to the case of rare adverse events which are commonly encountered in routine practice. We study estimation of model parameters and between-study heterogeneity. In addition, we examine traditional approaches to hypothesis testing of the average treatment effect and detection of the heterogeneity of treatment effect across studies. We derive three new methods, simple (unweighted) average treatment effect estimator, a new heterogeneity estimator, and a parametric bootstrapping test for heterogeneity. We then study the statistical properties of both the traditional and new methods via simulation. We find that in general, moment-based estimators of combined treatment effects and heterogeneity are biased and the degree of bias is proportional to the rarity of the event under study. The new methods eliminate much, but not all of this bias. The various estimators and hypothesis testing methods are then compared and contrasted using an example dataset on treatment of stable coronary artery disease. PMID:23734068
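A hedged sketch of one of the new ideas, a parametric bootstrap test for heterogeneity with rare binary events: studies are re-simulated under a common-effect null and the observed spread of risk differences is compared against the simulated spread. The statistic, the pooling scheme, and the data are illustrative stand-ins, not the authors' exact formulation.

```python
import numpy as np

def bootstrap_heterogeneity_test(events_t, n_t, events_c, n_c,
                                 n_boot=2000, seed=42):
    """P-value for heterogeneity of per-study risk differences."""
    rng = np.random.default_rng(seed)
    rd = events_t / n_t - events_c / n_c      # per-study risk differences
    obs = np.var(rd, ddof=1)                  # heterogeneity statistic
    p_c = events_c.sum() / n_c.sum()          # pooled control event rate
    p_t = np.clip(p_c + rd.mean(), 0.0, 1.0)  # common effect under H0
    null = np.empty(n_boot)
    for b in range(n_boot):
        sim_rd = rng.binomial(n_t, p_t) / n_t - rng.binomial(n_c, p_c) / n_c
        null[b] = np.var(sim_rd, ddof=1)
    return np.mean(null >= obs)

events_t = np.array([1, 0, 2, 1]); n_t = np.array([200, 150, 300, 250])
events_c = np.array([0, 1, 0, 0]); n_c = np.array([200, 150, 300, 250])
print(bootstrap_heterogeneity_test(events_t, n_t, events_c, n_c))
```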
Adjusting for partial verification or workup bias in meta-analyses of diagnostic accuracy studies.
de Groot, Joris A H; Dendukuri, Nandini; Janssen, Kristel J M; Reitsma, Johannes B; Brophy, James; Joseph, Lawrence; Bossuyt, Patrick M M; Moons, Karel G M
2012-04-15
A key requirement in the design of diagnostic accuracy studies is that all study participants receive both the test under evaluation and the reference standard test. For a variety of practical and ethical reasons, sometimes only a proportion of patients receive the reference standard, which can bias the accuracy estimates. Numerous methods have been described for correcting this partial verification bias or workup bias in individual studies. In this article, the authors describe a Bayesian method for obtaining adjusted results from a diagnostic meta-analysis when partial verification or workup bias is present in a subset of the primary studies. The method corrects for verification bias without having to exclude primary studies with verification bias, thus preserving the main advantages of a meta-analysis: increased precision and better generalizability. The results of this method are compared with the existing methods for dealing with verification bias in diagnostic meta-analyses. For illustration, the authors use empirical data from a systematic review of studies of the accuracy of the immunohistochemistry test for diagnosis of human epidermal growth factor receptor 2 status in breast cancer patients.
NASA Astrophysics Data System (ADS)
Konkol, Jakub; Bałachowski, Lech
2017-03-01
In this paper, the whole process of pile construction and performance during loading is modelled via large-deformation finite element methods, namely the Coupled Eulerian-Lagrangian (CEL) and Updated Lagrangian (UL) formulations. The numerical study consists of the installation process, a consolidation phase and the subsequent pile static load test (SLT). The Poznań site is chosen as the reference location for the numerical analysis, where a series of pile SLTs has been performed in highly overconsolidated clay (OCR ≈ 12). The results of the numerical analysis are compared with the corresponding field tests and with a so-called "wish-in-place" numerical model of the pile, in which no installation effects are taken into account. The advantages of using large-deformation numerical analysis are presented and its application to pile design is shown.
NASA Technical Reports Server (NTRS)
Kerley, James J.
1987-01-01
The method of retroduction, adapted from the doctoral thesis of Dr. A. Croce, relies on a process of dialectic questioning that begins with the information sought, proceeds to Given items (either in the form of dimensions or limits of research) and to Known mathematical forms of analysis in design or to principles of study in research. Finally, analysis and synthesis are used to abstract the dialectic questions and to arrive at the information desired. This method is used to solve the engineering design problem of a beam and to determine why bolts and nuts vibrate apart. Both mathematical analysis and dialectic logical analysis are utilized. Results are provided of tests conducted to check the retroductive study of why and how nuts back off.
Juck, Gregory; Gonzalez, Verapaz; Allen, Ann-Christine Olsson; Sutzko, Meredith; Seward, Kody; Muldoon, Mark T
2018-04-27
The Romer Labs RapidChek® Listeria monocytogenes test system (Performance Tested Method℠ 011805) was validated against the U.S. Department of Agriculture-Food Safety and Inspection Service Microbiology Laboratory Guidebook (USDA-FSIS/MLG), U.S. Food and Drug Administration Bacteriological Analytical Manual (FDA/BAM), and AOAC Official Methods of Analysis℠ (AOAC/OMA) cultural reference methods for the detection of L. monocytogenes on selected foods, including hot dogs, frozen cooked breaded chicken, frozen cooked shrimp, cured ham, and ice cream, and on environmental surfaces, including stainless steel and plastic, in an unpaired study design. The RapidChek method uses a proprietary enrichment media system with a 44-48 h enrichment at 30 ± 1°C and detects L. monocytogenes on an immunochromatographic lateral flow device within 10 min. Different L. monocytogenes strains were used to spike each of the matrices. Samples were confirmed based on the reference method confirmations and an alternate confirmation method. A total of 140 low-level spiked samples were tested by the RapidChek method after enrichment for 44-48 h, in parallel with the cultural reference methods. There were 88 RapidChek presumptive positives; one was not confirmed culturally, and one culturally confirmed sample did not exhibit a presumptive positive. No difference between the alternate confirmation method and the reference confirmation method was observed. The respective cultural reference methods (USDA-FSIS/MLG, FDA/BAM, and AOAC/OMA) produced a total of 63 confirmed positive results. Nonspiked samples from all foods were reported as negative for L. monocytogenes by all methods. Probability of detection analysis demonstrated no significant differences in the number of positive samples detected by the RapidChek method and the respective cultural reference methods.
Thompson, Angus H; Waye, Arianna
2018-06-01
Presenteeism (reduced productivity at work) is thought to be responsible for large economic costs. Nevertheless, much of the research supporting this is based on self-report questionnaires that have not been adequately evaluated. To examine the level of agreement among leading tests of presenteeism and to determine the inter-relationship of the two productivity subcategories, amount and quality, within the context of construct validity and method variance. Just under 500 health care workers from an urban health area were asked to complete a questionnaire containing the productivity items from eight presenteeism instruments. The analysis included an examination of test intercorrelations, separately for amount and quality, supplemented by principal-component analyses to determine whether either construct could be described by a single factor. A multitest, multiconstruct analysis was performed on the four tests that assessed both amount and quality to test for the relative contributions of construct and method variance. A total of 137 questionnaires were completed. Agreement among tests was positive, but modest. Pearson r ranges were 0 to 0.64 (mean = 0.32) for Amount and 0.03 to 0.38 (mean = 0.25) for Quality. Further analysis suggested that agreement was influenced more by method variance than by the productivity constructs the tests were designed to measure. The results suggest that presenteeism tests do not accurately assess work performance. Given their importance in the determination of policy-relevant conclusions, attention needs to be given to test improvement in the context of criterion validity assessment. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Status and analysis of test standard for on-board charger
NASA Astrophysics Data System (ADS)
Hou, Shuai; Liu, Haiming; Jiang, Li; Chen, Xichen; Ma, Junjie; Zhao, Bing; Wu, Zaiyuan
2018-05-01
This paper analyzes the test standards for on-board chargers (OBC). In the testing process, we found several problems with the test methods and functional status, such as failure to follow the latest test standards, loose estimation, and uncertainty and inconsistency in rectification. Finally, we put forward our own viewpoints on these problems.
Dual-domain point diffraction interferometer
Naulleau, Patrick P.; Goldberg, Kenneth Alan
2000-01-01
A hybrid spatial/temporal-domain point diffraction interferometer (referred to as the dual-domain PS/PDI) that is capable of suppressing the scattered-reference-light noise that hinders the conventional PS/PDI is provided. The dual-domain PS/PDI combines the separate noise-suppression capabilities of the widely-used phase-shifting and Fourier-transform fringe pattern analysis methods. The dual-domain PS/PDI relies on both a more restrictive implementation of the image plane PS/PDI mask and a new analysis method to be applied to the interferograms generated and recorded by the modified PS/PDI. The more restrictive PS/PDI mask guarantees the elimination of spatial-frequency crosstalk between the signal and the scattered-light noise arising from scattered-reference-light interfering with the test beam. The new dual-domain analysis method is then used to eliminate scattered-light noise arising from both the scattered-reference-light interfering with the test beam and the scattered-reference-light interfering with the "true" pinhole-diffracted reference light. The dual-domain analysis method has also been demonstrated to provide performance enhancement when using the non-optimized standard PS/PDI design. The dual-domain PS/PDI is essentially a three-tiered filtering system composed of lowpass spatial-filtering the test-beam electric field using the more restrictive PS/PDI mask, bandpass spatial-filtering the individual interferogram irradiance frames making up the phase-shifting series, and bandpass temporal-filtering the phase-shifting series as a whole.
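As background for the Fourier half of the dual-domain analysis, the sketch below shows textbook Fourier-transform fringe analysis: isolate one carrier sideband of the interferogram, recentre it, and take the argument of the inverse transform. The carrier frequency and window size are assumed inputs, and real PS/PDI processing adds phase unwrapping plus the phase-shifting combination described above.

```python
import numpy as np

def ft_phase(fringes, carrier_px):
    """Wrapped phase map from a single fringe pattern with an x-carrier.

    fringes: 2-D interferogram; carrier_px: carrier frequency in FFT bins.
    """
    F = np.fft.fftshift(np.fft.fft2(fringes))
    ny, nx = fringes.shape
    cy, cx = ny // 2, nx // 2 + carrier_px          # sideband centre
    half = max(1, carrier_px // 2)                  # bandpass half-width
    win = np.zeros_like(F)
    win[cy - half:cy + half, cx - half:cx + half] = 1.0
    side = np.roll(F * win, -carrier_px, axis=1)    # shift sideband to DC
    return np.angle(np.fft.ifft2(np.fft.ifftshift(side)))
```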
Azari, Nadia; Soleimani, Farin; Vameghi, Roshanak; Sajedi, Firoozeh; Shahshahani, Soheila; Karimi, Hossein; Kraskian, Adis; Shahrokhi, Amin; Teymouri, Robab; Gharib, Masoud
2017-01-01
The Bayley Scales of Infant and Toddler Development is a well-known diagnostic developmental assessment tool for children aged 1-42 months. Our aim was to investigate the validity and reliability of this scale in Persian-speaking children. The method was descriptive-analytic. Translation, back-translation and cultural adaptation were carried out. The content and face validity of the translated scale was determined by experts' opinions. Overall, 403 children aged 1 to 42 months were recruited from health centers in Tehran during 2013-2014 for developmental assessment in the cognitive, communicative (receptive and expressive) and motor (fine and gross) domains. The reliability of the scale was calculated using three methods: internal consistency using Cronbach's alpha coefficient, test-retest, and inter-rater methods. Construct validity was calculated using factor analysis and comparison of mean scores. Cultural and linguistic changes were made to items in all domains, especially the communication subscale. The content and face validity of the test were approved by experts' opinions. Cronbach's alpha coefficient was above 0.74 in all domains. Pearson correlation coefficients in the various domains were ≥0.982 for the test-retest method and ≥0.993 for the inter-rater method. The construct validity of the test was approved by factor analysis; moreover, the mean scores for the different age groups were compared and statistically significant differences were observed between them, which confirms the validity of the test. The Bayley Scales of Infant and Toddler Development is a valid and reliable tool for child developmental assessment in Persian-language children.
Validation of structural analysis methods using the in-house liner cyclic rigs
NASA Technical Reports Server (NTRS)
Thompson, R. L.
1982-01-01
Test conditions and variables to be considered in each of the test rigs and test configurations, and also used in the validation of the structural predictive theories and tools, include: thermal and mechanical load histories (simulating an engine mission cycle); different boundary conditions; specimens and components of different dimensions and geometries; different materials; various cooling schemes and cooling-hole configurations; several advanced burner liner structural design concepts; and the simulation of hot streaks. Based on these test conditions and test variables, test matrices for each rig and configuration can be established to verify the predictive tools over as wide a range of test conditions as possible using the simplest possible tests. A flow chart for the thermal/structural analysis of a burner liner, showing how the analysis relates to the tests, is presented schematically. The chart shows that several nonlinear constitutive theories are to be evaluated.