Sample records for tests analysis analyses

  1. Test Analysis Guidelines

    NASA Technical Reports Server (NTRS)

    Jeng, Frank F.

    2007-01-01

    Development of analysis guidelines for Exploration Life Support (ELS) technology tests was completed. The guidelines were developed based on analysis experience gained from supporting Environmental Control and Life Support System (ECLSS) technology development in air revitalization systems and water recovery systems. Analyses are vital during all three phases of an ELS technology test: pre-test, during test, and post-test. Pre-test analyses of a test system help define hardware components and predict system and component performance, required test duration, sampling frequencies of operating parameters, etc. Analyses conducted during tests can verify the consistency of all the measurements and the performance of the test system. Post-test analyses are an essential part of the test task; their results are an important factor in judging whether the technology development is successful. In addition, development of a rigorous model for a test system is an important objective of any new technology development. Test data analyses, especially post-test data analyses, serve to verify the model. Test analyses have supported development of many ECLSS technologies. Some test analysis tasks in ECLSS technology development are listed in the Appendix. To provide effective analysis support for ECLSS technology tests, analysis guidelines are a useful tool. These guidelines were developed based on experience gained through previous analysis support of various ECLSS technology tests. A comment on analysis from an experienced NASA ECLSS manager (1) follows: "Bad analysis was one that bent the test to prove that the analysis was right to begin with. Good analysis was one that directed where the testing should go and also bridged the gap between the reality of the test facility and what was expected on orbit."

  2. 14 CFR Appendix E to Part 417 - Flight Termination System Testing and Analysis

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... contains requirements for tests and analyses that apply to all flight termination systems and the... termination system components that satisfy the requirements of this appendix. (b) Component tests and analyses. A component must satisfy each test or analysis required by any table of this appendix to demonstrate...

  3. 14 CFR Appendix E to Part 417 - Flight Termination System Testing and Analysis

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... contains requirements for tests and analyses that apply to all flight termination systems and the... termination system components that satisfy the requirements of this appendix. (b) Component tests and analyses. A component must satisfy each test or analysis required by any table of this appendix to demonstrate...

  4. 14 CFR Appendix E to Part 417 - Flight Termination System Testing and Analysis

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... contains requirements for tests and analyses that apply to all flight termination systems and the... termination system components that satisfy the requirements of this appendix. (b) Component tests and analyses. A component must satisfy each test or analysis required by any table of this appendix to demonstrate...

  5. 14 CFR Appendix E to Part 417 - Flight Termination System Testing and Analysis

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... contains requirements for tests and analyses that apply to all flight termination systems and the... termination system components that satisfy the requirements of this appendix. (b) Component tests and analyses. A component must satisfy each test or analysis required by any table of this appendix to demonstrate...

  6. 14 CFR Appendix E to Part 417 - Flight Termination System Testing and Analysis

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... contains requirements for tests and analyses that apply to all flight termination systems and the... termination system components that satisfy the requirements of this appendix. (b) Component tests and analyses. A component must satisfy each test or analysis required by any table of this appendix to demonstrate...

  7. Analyses of the dynamic docking test system for advanced mission docking system test programs [Apollo Soyuz Test Project]

    NASA Technical Reports Server (NTRS)

    Gates, R. M.; Williams, J. E.

    1974-01-01

    Results are given of analytical studies performed in support of the design, implementation, checkout and use of NASA's dynamic docking test system (DDTS). Included are analyses of simulator components, a list of detailed operational test procedures, a summary of simulator performance, and an analysis and comparison of docking dynamics and loads obtained by test and analysis.

  8. Design and Analysis of Tooth Impact Test Rig for Spur Gear

    NASA Astrophysics Data System (ADS)

    Ghazali, Wafiuddin Bin Md; Aziz, Ismail Ali Bin Abdul; Daing Idris, Daing Mohamad Nafiz Bin; Ismail, Nurazima Binti; Sofian, Azizul Helmi Bin

    2016-02-01

    This paper describes the design and analysis of a prototype tooth impact test rig for spur gears. The test rig was fabricated, and analysis was conducted to study its limitations and capabilities. The design of the rig was analysed to ensure that no problems would occur during the test and that reliable data could be obtained. From the results of the analysis, the maximum load that can be applied, the factor of safety of the machine, and the stresses on the test rig parts were determined. These are important considerations in the design of the test rig. The materials used for the fabrication of the test rig were also discussed and analysed. MSC Nastran and Patran software was used to analyse the model, which was designed using SolidWorks 2014 software. Based on the results, limitations were found in the initial design, and the test rig design needs to be improved for the test rig to operate properly.

  9. SRM Internal Flow Tests and Computational Fluid Dynamic Analysis. Volume 2; CFD RSRM Full-Scale Analyses

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This document presents the full-scale analyses of the CFD RSRM. The RSRM model was developed with a 20 second burn time. The following are presented as part of the full-scale analyses: (1) RSRM embedded inclusion analysis; (2) RSRM igniter nozzle design analysis; (3) Nozzle Joint 4 erosion anomaly; (4) RSRM full motor port slag accumulation analysis; (5) RSRM motor analysis of two-phase flow in the aft segment/submerged nozzle region; (6) Completion of 3-D Analysis of the hot air nozzle manifold; (7) Bates Motor distributed combustion test case; and (8) Three Dimensional Polysulfide Bump Analysis.

  10. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior.

    PubMed

    Hagger, Martin S; Chan, Derwin K C; Protogerou, Cleo; Chatzisarantis, Nikos L D

    2016-08-01

    Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs from social cognitive theories is important to test nomological validity, account for mediation effects, and evaluate unique effects of theory constructs independent of past behavior. We illustrate our points by conducting new analyses of two meta-analyses of a popular theory applied to health behaviors, the theory of planned behavior. We conducted meta-analytic path analyses of the theory in two behavioral contexts (alcohol and dietary behaviors) using data from the primary studies included in the original meta-analyses augmented to include intercorrelations among constructs and relations with past behavior missing from the original analysis. Findings supported the nomological validity of the theory and its hypotheses for both behaviors, confirmed important model processes through mediation analysis, demonstrated the attenuating effect of past behavior on theory relations, and provided estimates of the unique effects of theory constructs independent of past behavior. Our analysis illustrates the importance of conducting a simultaneous test of theory-stipulated effects in meta-analyses of social cognitive theories applied to health behavior. We recommend researchers adopt this analytic procedure when synthesizing evidence across primary tests of social cognitive theories in health. Copyright © 2016 Elsevier Inc. All rights reserved.
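
    As a rough illustration of the meta-analytic path-analysis approach described above (not the authors' data or code; the construct ordering and correlation values below are hypothetical), a minimal Python sketch pools per-study correlation matrices by sample size and then derives standardised path coefficients from the pooled matrix:

```python
import numpy as np

# Hypothetical per-study correlation matrices among theory of planned behavior
# constructs, ordered: attitude, subjective norm, perceived behavioral control,
# intention, behavior. All values are illustrative, not taken from the paper.
R1 = np.array([
    [1.00, 0.35, 0.30, 0.50, 0.30],
    [0.35, 1.00, 0.30, 0.40, 0.25],
    [0.30, 0.30, 1.00, 0.45, 0.35],
    [0.50, 0.40, 0.45, 1.00, 0.45],
    [0.30, 0.25, 0.35, 0.45, 1.00],
])
R2 = np.array([
    [1.00, 0.40, 0.25, 0.55, 0.35],
    [0.40, 1.00, 0.35, 0.35, 0.20],
    [0.25, 0.35, 1.00, 0.40, 0.30],
    [0.55, 0.35, 0.40, 1.00, 0.50],
    [0.35, 0.20, 0.30, 0.50, 1.00],
])
studies = [(R1, 200), (R2, 300)]  # (correlation matrix, sample size)

# Stage 1: pool the correlations across studies, weighting by sample size.
R = sum(n * Ri for Ri, n in studies) / sum(n for _, n in studies)

def path_coefs(R, predictors, outcome):
    """Standardised regression weights implied by a correlation matrix."""
    Rxx = R[np.ix_(predictors, predictors)]
    rxy = R[predictors, outcome]
    return np.linalg.solve(Rxx, rxy)

# Stage 2: fit the theory's structural paths on the pooled matrix
# (intention ~ attitude + norm + control; behavior ~ intention + control).
print("intention paths:", path_coefs(R, [0, 1, 2], 3))
print("behavior paths: ", path_coefs(R, [3, 2], 4))
```

    A full meta-analytic structural equation model would also carry the sampling uncertainty of the pooled correlations into the path estimates; the sketch above shows only the point estimates.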

  11. 40 CFR 63.7515 - When must I conduct subsequent performance tests or fuel analyses?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... performance tests or fuel analyses? 63.7515 Section 63.7515 Protection of Environment ENVIRONMENTAL PROTECTION... Requirements § 63.7515 When must I conduct subsequent performance tests or fuel analyses? (a) You must conduct... previous performance test. (f) You must conduct a fuel analysis according to § 63.7521 for each type of...

  12. 40 CFR 63.7515 - When must I conduct subsequent performance tests or fuel analyses?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... performance tests or fuel analyses? 63.7515 Section 63.7515 Protection of Environment ENVIRONMENTAL PROTECTION... Requirements § 63.7515 When must I conduct subsequent performance tests or fuel analyses? (a) You must conduct... previous performance test. (f) You must conduct a fuel analysis according to § 63.7521 for each type of...

  13. 40 CFR 63.7515 - When must I conduct subsequent performance tests or fuel analyses?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... performance tests or fuel analyses? 63.7515 Section 63.7515 Protection of Environment ENVIRONMENTAL PROTECTION... Requirements § 63.7515 When must I conduct subsequent performance tests or fuel analyses? (a) You must conduct... previous performance test. (f) You must conduct a fuel analysis according to § 63.7521 for each type of...

  14. Finite Element Analysis and Test Correlation of a 10-Meter Inflation-Deployed Solar Sail

    NASA Technical Reports Server (NTRS)

    Sleight, David W.; Michii, Yuki; Lichodziejewski, David; Derbes, Billy; Mann, Troy O.; Slade, Kara N.; Wang, John T.

    2005-01-01

    Under the direction of the NASA In-Space Propulsion Technology Office, the team of L Garde, NASA Jet Propulsion Laboratory, Ball Aerospace, and NASA Langley Research Center has been developing a scalable solar sail configuration to address NASA's future space propulsion needs. Prior to a flight experiment of a full-scale solar sail, a comprehensive phased test plan is currently being implemented to advance the technology readiness level of the solar sail design. These tests consist of solar sail component, subsystem, and sub-scale system ground tests that simulate the vacuum and thermal conditions of the space environment. Recently, two solar sail test articles, a 7.4-m beam assembly subsystem test article and a 10-m four-quadrant solar sail system test article, were tested in vacuum conditions with a gravity-offload system to mitigate the effects of gravity. This paper presents the structural analyses simulating the ground tests and the correlation of the analyses with the test results. For programmatic risk reduction, a two-prong analysis approach was undertaken in which two separate teams independently developed computational models of the solar sail test articles using the finite element analysis software packages: NEiNastran and ABAQUS. This paper compares the pre-test and post-test analysis predictions from both software packages with the test data including load-deflection curves from static load tests, and vibration frequencies and mode shapes from vibration tests. The analysis predictions were in reasonable agreement with the test data. Factors that precluded better correlation of the analyses and the tests were uncertainties in the material properties, test conditions, and modeling assumptions used in the analyses.

  15. 45 CFR 13.7 - Studies, exhibits, analyses, engineering reports, tests and projects.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Studies, exhibits, analyses, engineering reports... Studies, exhibits, analyses, engineering reports, tests and projects. The reasonable cost (or the reasonable portion of the cost) for any study, exhibit, analysis, engineering report, test, project or...

  16. 45 CFR 13.7 - Studies, exhibits, analyses, engineering reports, tests and projects.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Studies, exhibits, analyses, engineering reports... Studies, exhibits, analyses, engineering reports, tests and projects. The reasonable cost (or the reasonable portion of the cost) for any study, exhibit, analysis, engineering report, test, project or...

  17. 45 CFR 13.7 - Studies, exhibits, analyses, engineering reports, tests and projects.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Studies, exhibits, analyses, engineering reports... Studies, exhibits, analyses, engineering reports, tests and projects. The reasonable cost (or the reasonable portion of the cost) for any study, exhibit, analysis, engineering report, test, project or...

  18. 45 CFR 13.7 - Studies, exhibits, analyses, engineering reports, tests and projects.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Studies, exhibits, analyses, engineering reports... Studies, exhibits, analyses, engineering reports, tests and projects. The reasonable cost (or the reasonable portion of the cost) for any study, exhibit, analysis, engineering report, test, project or...

  19. 45 CFR 13.7 - Studies, exhibits, analyses, engineering reports, tests and projects.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Studies, exhibits, analyses, engineering reports... Studies, exhibits, analyses, engineering reports, tests and projects. The reasonable cost (or the reasonable portion of the cost) for any study, exhibit, analysis, engineering report, test, project or...

  20. Structural Analysis of an Inflation-Deployed Solar Sail With Experimental Validation

    NASA Technical Reports Server (NTRS)

    Sleight, David W.; Michii, Yuki; Lichodziejewski, David; Derbes, Billy; Mann, Troy O.

    2005-01-01

    Under the direction of the NASA In-Space Propulsion Technology Office, the team of L Garde, NASA Jet Propulsion Laboratory, Ball Aerospace, and NASA Langley Research Center has been developing a scalable solar sail configuration to address NASA's future space propulsion needs. Prior to a flight experiment of a full-scale solar sail, a comprehensive phased test plan is currently being implemented to advance the technology readiness level of the solar sail design. These tests consist of solar sail component, subsystem, and sub-scale system ground tests that simulate the vacuum and thermal conditions of the space environment. Recently, two solar sail test articles, a 7.4-m beam assembly subsystem test article and a 10-m four-quadrant solar sail system test article, were tested in vacuum conditions with a gravity-offload system to mitigate the effects of gravity. This paper presents the structural analyses simulating the ground tests and the correlation of the analyses with the test results. For programmatic risk reduction, a two-prong analysis approach was undertaken in which two separate teams independently developed computational models of the solar sail test articles using the finite element analysis software packages: NEiNastran and ABAQUS. This paper compares the pre-test and post-test analysis predictions from both software packages with the test data including load-deflection curves from static load tests, and vibration frequencies and mode shapes from structural dynamics tests. The analysis predictions were in reasonable agreement with the test data. Factors that precluded better correlation of the analyses and the tests were uncertainties in the material properties, test conditions, and modeling assumptions used in the analyses.

  1. Test 6, Test 7, and Gas Standard Analysis Results

    NASA Technical Reports Server (NTRS)

    Perez, Horacio, III

    2007-01-01

    This viewgraph presentation shows results of analyses on odor, toxic off-gassing, and gas standards. The topics include: 1) Statistical Analysis Definitions; 2) Odor Analysis Results, NASA Standard 6001 Test 6; 3) Toxic Off-gassing Analysis Results, NASA Standard 6001 Test 7; and 4) Gas Standard Results, NASA Standard 6001 Test 7.

  2. Design, analysis and test verification of advanced encapsulation systems

    NASA Technical Reports Server (NTRS)

    Garcia, A., III; Kallis, J. M.; Trucker, D. C.

    1983-01-01

    Analytical models were developed to perform optical, thermal, electrical and structural analyses on candidate encapsulation systems. From these analyses several candidate encapsulation systems were selected for qualification testing.

  3. Workplace drug testing in Italy: findings about second-stage testing.

    PubMed

    Vignali, Claudia; Stramesi, Cristiana; Morini, Luca; San Bartolomeo, Paolo; Groppi, Angelo

    2015-03-01

    Workplace Drug Testing (WDT) in Italy includes two levels of monitoring: a first stage concerning drug testing on urine samples and a second involving both urine and hair analysis. The second stage is performed only on workers who tested positive at the first level. We analyzed urine and hair specimens from 120 workers undergoing second-level testing between 2009 and 2012. Eighty percent of them had tested positive for cannabinoids during the first level analysis, and 15.8% for cocaine. Both urine and hair samples were analyzed in order to find the following drugs of abuse: amphetamines, buprenorphine, cannabinoids, cocaine, ecstasy, methadone, and opiates. Urine analyses were performed by immunological screening (EMIT); urine confirmatory tests and hair analyses were performed by gas chromatography-mass spectrometry (GC-MS). As regards second-stage testing on urine samples, 71.2% of workers were always negative, whereas 23.9% tested positive at least once for cannabinoids and 2.5% for cocaine. Hair analyses produced surprising results: 61.9% of hair samples tested negative, only 6.2% tested positive for cannabinoids, whereas 28.8% tested positive for cocaine. These findings confirm that second-level surveillance of WDT, which includes hair analysis, is very effective because it highlights drug intake - sometimes heavy - that cannot be revealed only through urine analyses. The employees for whom drug addiction is proved can begin rehabilitation, while keeping their job. Eventually, our results confirmed the widespread and undeclared use of cocaine in Italy. Copyright © 2014 John Wiley & Sons, Ltd.

  4. LOD score exclusion analyses for candidate genes using random population samples.

    PubMed

    Deng, H W; Li, J; Recker, R R

    2001-05-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes with random population samples. We develop a LOD score approach for exclusion analyses of candidate genes with random population samples. Under this approach, specific genetic effects and inheritance models at candidate genes can be analysed and, if a LOD score is ≤ -2.0, the locus can be excluded from having an effect larger than that specified. Computer simulations show that, with sample sizes often employed in association studies, this approach has high power to exclude a gene from having moderate genetic effects. In contrast to regular association analyses, population admixture will not affect the robustness of our analyses; in fact, it renders our analyses more conservative and thus any significant exclusion result is robust. Our exclusion analysis complements association analysis for candidate genes in random population samples and is parallel to the exclusion mapping analyses that may be conducted in linkage analyses with pedigrees or relative pairs. The usefulness of the approach is demonstrated by an application to test the importance of vitamin D receptor and estrogen receptor genes underlying the differential risk to osteoporotic fractures.
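
    For readers unfamiliar with the notation, the LOD score used here is the base-10 logarithm of a likelihood ratio; written in its standard form (assumed here, not quoted from the paper), the exclusion criterion reads:

```latex
\mathrm{LOD}(\theta) \;=\; \log_{10}
\frac{L\bigl(\text{data} \mid \text{candidate-gene effect of size } \theta\bigr)}
     {L\bigl(\text{data} \mid \text{no effect at the candidate gene}\bigr)},
\qquad
\text{exclude effects} \ge \theta \ \text{whenever } \mathrm{LOD}(\theta) \le -2.0 .
```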

  5. Posttest analysis of the FFTF inherent safety tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Padilla, A. Jr.; Claybrook, S.W.

    Inherent safety tests were performed during 1986 in the 400-MW (thermal) Fast Flux Test Facility (FFTF) reactor to demonstrate the effectiveness of an inherent shutdown device called the gas expansion module (GEM). The GEM device provided a strong negative reactivity feedback during loss-of-flow conditions by increasing the neutron leakage as a result of an expanding gas bubble. The best-estimate pretest calculations for these tests were performed using the IANUS plant analysis code (Westinghouse Electric Corporation proprietary code) and the MELT/SIEX3 core analysis code. These two codes were also used to perform the required operational safety analyses for the FFTF reactor and plant. Although it was intended to also use the SASSYS systems (core and plant) analysis code, the calibration of the SASSYS code for FFTF core and plant analysis was not completed in time to perform pretest analyses. The purpose of this paper is to present the results of the posttest analysis of the 1986 FFTF inherent safety tests using the SASSYS code.

  6. Guidelines for Proof Test Analysis

    NASA Technical Reports Server (NTRS)

    Chell, G. G.; McClung, R. C.; Kuhlman, C. J.; Russell, D. A.; Garr, K.; Donnelly, B.

    1999-01-01

    These guidelines integrate state-of-the-art elastic-plastic fracture mechanics (EPFM) and proof test implementation issues into a comprehensive proof test analysis procedure in the form of a road map which identifies the types of data, fracture mechanics based parameters, and calculations needed to perform flaw screening and minimum proof load analyses of fracture critical components. Worked examples are presented to illustrate the application of the road map to proof test analysis. The state-of-the-art fracture technology employed in these guidelines is based on the EPFM parameter, J, and a pictorial representation of a J fracture analysis, called the failure assessment diagram (FAD) approach. The recommended fracture technology is validated using finite element J results, and laboratory and hardware fracture test results on the nickel-based superalloy Inconel 718, the aluminum alloy 2024-T3511, and ferritic pressure vessel steels. In all cases the laboratory specimens and hardware failed by ductile mechanisms. Advanced proof test analyses involving probability analysis and multiple-cycle proof testing (MCPT) are addressed. Finally, recommendations are provided on how to account for the effects of the proof test overload on subsequent service fatigue and fracture behaviors.
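
    For context, the failure assessment diagram (FAD) mentioned above plots an assessment point against a failure boundary; one common J-based formulation (a textbook form, assumed here rather than quoted from the guidelines) is:

```latex
K_r \;=\; \frac{K_I}{K_{\mathrm{mat}}}, \qquad
L_r \;=\; \frac{P}{P_L}, \qquad
\text{failure boundary: } K_r \;=\; \sqrt{J_e / J},
```

    where K_I is the applied stress intensity factor, K_mat the material toughness, P the applied load, P_L the limit load, and J_e and J the elastic and total elastic-plastic values of the J-integral; the flaw is assessed as acceptable while the point (L_r, K_r) lies inside the boundary.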

  7. Combustion Stability Analyses of Coaxial Element Injectors with Liquid Oxygen/Liquid Methane Propellants

    NASA Technical Reports Server (NTRS)

    Hulka, J. R.

    2010-01-01

    Liquid rocket engines using oxygen and methane propellants are being considered by the National Aeronautics and Space Administration (NASA) for in-space vehicles. This propellant combination has not been previously used in a flight-qualified engine system, so limited test data and analysis results are available at this stage of early development. NASA has funded several hardware-oriented activities with oxygen and methane propellants over the past several years with the Propulsion and Cryogenic Advanced Development (PCAD) project, under the Exploration Technology Development Program. As part of this effort, the NASA Marshall Space Flight Center has conducted combustion stability analyses of several of the configurations. This paper presents test data and analyses of combustion stability from the recent PCAD-funded test programs at the NASA MSFC. These test programs used swirl coaxial element injectors with liquid oxygen and liquid methane propellants. Oxygen was injected conventionally in the center of the coaxial element, and swirl was provided by tangential entry slots. Injectors with 28-element and 40-element patterns were tested with several configurations of combustion chambers, including ablative and calorimeter spool sections, and several configurations of fuel injection design. Low frequency combustion instability (chug) occurred with both injectors, and high-frequency combustion instability occurred at the first tangential (1T) transverse mode with the 40-element injector. In most tests, a transition between high-amplitude chug with gaseous methane flow and low-amplitude chug with liquid methane flow was readily observed. Chug analyses of both conditions were conducted using techniques from Wenzel and Szuch and from the Rocket Combustor Interactive Design and Analysis (ROCCID) code. The 1T mode instability occurred in several tests and was apparent by high-frequency pressure measurements as well as dramatic increases in calorimeter-measured heat flux throughout the chamber. Analyses of the transverse mode were conducted with ROCCID and empirical methods such as Hewitt d/V. This paper describes the test hardware configurations, test data, analysis methods, and presents results of the various analyses.

  8. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    PubMed

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most updated information and newly added models.
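
    SOCR Analyses itself is a Java web toolkit; purely to illustrate the kinds of tests listed above in runnable form (SciPy equivalents on made-up data, not SOCR code), a short Python sketch is:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a = rng.normal(10.0, 2.0, size=30)   # hypothetical sample A
b = rng.normal(11.0, 2.0, size=30)   # hypothetical sample B
c = rng.normal(10.5, 2.0, size=30)   # hypothetical sample C

print(stats.ttest_ind(a, b))             # parametric two-sample t-test
print(stats.ranksums(a, b))              # Wilcoxon rank-sum test
print(stats.kruskal(a, b, c))            # Kruskal-Wallis test
print(stats.friedmanchisquare(a, b, c))  # Friedman's test (treats a, b, c as repeated measures)
print(stats.fisher_exact([[8, 2], [1, 9]]))  # Fisher's exact test on a 2x2 contingency table
```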

  9. Conducting meta-analyses of HIV prevention literatures from a theory-testing perspective.

    PubMed

    Marsh, K L; Johnson, B T; Carey, M P

    2001-09-01

    Using illustrations from HIV prevention research, the current article advocates approaching meta-analysis as a theory-testing scientific method rather than as merely a set of rules for quantitative analysis. Like other scientific methods, meta-analysis has central concerns with internal, external, and construct validity. The focus of a meta-analysis should only rarely be merely describing the effects of health promotion, but rather should be on understanding and explaining phenomena and the processes underlying them. The methodological decisions meta-analysts make in conducting reviews should be guided by a consideration of the underlying goals of the review (e.g., simply effect size estimation or, preferably theory testing). From the advocated perspective that a health behavior meta-analyst should test theory, the authors present a number of issues to be considered during the conduct of meta-analyses.

  10. Analyse Factorielle d'une Batterie de Tests de Comprehension Orale et Ecrite (Factor Analysis of a Battery of Tests of Listening and Reading Comprehension). Melanges Pedagogiques, 1971.

    ERIC Educational Resources Information Center

    Lonchamp, F.

    This is a presentation of the results of a factor analysis of a battery of tests intended to measure listening and reading comprehension in English as a second language. The analysis sought to answer the following questions: (1) whether the factor analysis method yields results when applied to tests which are not specifically designed for this…

  11. Combustion and Performance Analyses of Coaxial Element Injectors with Liquid Oxygen/Liquid Methane Propellants

    NASA Technical Reports Server (NTRS)

    Hulka, J. R.; Jones, G. W.

    2010-01-01

    Liquid rocket engines using oxygen and methane propellants are being considered by the National Aeronautics and Space Administration (NASA) for in-space vehicles. This propellant combination has not been previously used in a flight-qualified engine system, so limited test data and analysis results are available at this stage of early development. NASA has funded several hardware-oriented activities with oxygen and methane propellants over the past several years with the Propulsion and Cryogenic Advanced Development (PCAD) project, under the Exploration Technology Development Program. As part of this effort, the NASA Marshall Space Flight Center has conducted combustion, performance, and combustion stability analyses of several of the configurations. This paper summarizes the analyses of combustion and performance as a follow-up to a paper published in the 2008 JANNAF/LPS meeting. Combustion stability analyses are presented in a separate paper. The current paper includes test and analysis results of coaxial element injectors using liquid oxygen and liquid methane or gaseous methane propellants. Several thrust chamber configurations have been modeled, including thrust chambers with multi-element swirl coax element injectors tested at the NASA MSFC, and a uni-element chamber with shear and swirl coax injectors tested at The Pennsylvania State University. Configurations were modeled with two one-dimensional liquid rocket combustion analysis codes, the Rocket Combustor Interaction Design and Analysis (ROCCID), and the Coaxial Injector Combustion Model (CICM). Significant effort was applied to show how these codes can be used to model combustion and performance with oxygen/methane propellants a priori, and what anchoring or calibrating features need to be applied or developed in the future. This paper describes the test hardware configurations, presents the results of all the analyses, and compares the results from the two analytical methods.

  12. Structural Analysis and Test Comparison of a 20-Meter Inflation-Deployed Solar Sail

    NASA Technical Reports Server (NTRS)

    Sleight, David W.; Mann, Troy; Lichodziejewski, David; Derbes, Billy

    2006-01-01

    Under the direction of the NASA In-Space Propulsion Technology Office, the team of L Garde, NASA Jet Propulsion Laboratory, Ball Aerospace, and NASA Langley Research Center has been developing a scalable solar sail configuration to address NASA's future space propulsion needs. Prior to a flight experiment of a full-scale solar sail, a comprehensive test program was implemented to advance the technology readiness level of the solar sail design. These tests consisted of solar sail component, subsystem, and sub-scale system ground tests that simulated the aspects of the space environment such as vacuum and thermal conditions. In July 2005, a 20-m four-quadrant solar sail system test article was tested in the NASA Glenn Research Center's Space Power Facility to measure its static and dynamic structural responses. Key to the maturation of solar sail technology is the development of validated finite element analysis (FEA) models that can be used for design and analysis of solar sails. A major objective of the program was to utilize the test data to validate the FEA models simulating the solar sail ground tests. The FEA software, ABAQUS, was used to perform the structural analyses to simulate the ground tests performed on the 20-m solar sail test article. This paper presents the details of the FEA modeling, the structural analyses simulating the ground tests, and a comparison of the pretest and post-test analysis predictions with the ground test results for the 20-m solar sail system test article. The structural responses that are compared in the paper include load-deflection curves and natural frequencies for the beam structural assembly and static shape, natural frequencies, and mode shapes for the solar sail membrane. The analysis predictions were in reasonable agreement with the test data. Factors that precluded better correlation of the analyses and the tests were unmeasured initial conditions in the test set-up.

  13. Thermal-Structural Analysis of PICA Tiles for Solar Tower Test

    NASA Technical Reports Server (NTRS)

    Agrawal, Parul; Empey, Daniel M.; Squire, Thomas H.

    2009-01-01

    Thermal protection materials used in spacecraft heatshields are subjected to severe thermal and mechanical loading environments during re-entry into Earth's atmosphere. In order to investigate the reliability of PICA tiles in the presence of high thermal gradients as well as mechanical loads, the authors designed and conducted solar-tower tests. This paper presents the design and analysis work for this test series. Coupled non-linear thermal-mechanical finite element analyses were conducted to estimate in-depth temperature distribution and stress contours for various cases. The first set of analyses, performed on an isolated PICA tile, showed that stresses generated during the tests were below the PICA allowable limit and should not lead to any catastrophic failure during the test. The test results were consistent with analytical predictions. The temperature distribution and magnitude of the measured strains were also consistent with predicted values. The second test series is designed to test the arrayed PICA tiles with various gap-filler materials. A nonlinear contact method is used to model the complex geometry with various tiles. The analyses for these coupons predict the stress contours in PICA and inside gap fillers. Suitable mechanical loads for this architecture will be predicted, which can be applied during the test to exceed the allowable limits and demonstrate failure modes. Thermocouple and strain-gauge data obtained from the solar tower tests will be used for subsequent analyses and validation of FEM models.

  14. NASA Contractor Report: Guidelines for Proof Test Analysis

    NASA Technical Reports Server (NTRS)

    Chell, G. G.; McClung, R. C.; Kuhlman, C. J.; Russell, D. A.; Garr, K.; Donnelly, B.

    1997-01-01

    These Guidelines integrate state-of-the-art Elastic-Plastic Fracture Mechanics (EPFM) and proof test implementation issues into a comprehensive proof test analysis procedure in the form of a Road Map which identifies the types of data, fracture mechanics based parameters, and calculations needed to perform flaw screening and minimum proof load analyses of fracture critical components. Worked examples are presented to illustrate the application of the Road Map to proof test analysis. The state-of-the-art fracture technology employed in these Guidelines is based on the EPFM parameter, J, and a pictorial representation of a J fracture analysis, called the Failure Assessment Diagram (FAD) approach. The recommended fracture technology is validated using finite element J results, and laboratory and hardware fracture test results on the nickel-based superalloy IN-718, the aluminum alloy 2024-T3511, and ferritic pressure vessel steels. In all cases the laboratory specimens and hardware failed by ductile mechanisms. Advanced proof test analyses involving probability analysis and Multiple Cycle Proof Testing (MCPT) are addressed. Finally, recommendations are provided on how to account for the effects of the proof test overload on subsequent service fatigue and fracture behaviors.

  15. Integrated analyses in plastics forming

    NASA Astrophysics Data System (ADS)

    Bo, Wang

    This thesis describes progress made in the analysis, simulation and testing of plastics forming, which can be applied to injection and compression mould design. Three stages of plastics forming have been investigated, namely filling analysis, cooling analysis and ejection analysis. The filling stage was analysed and calculated using MOLDFLOW and FILLCALC V software, and a comparison of high-speed compression moulding and injection moulding was made. The cooling stage was analysed using MOLDFLOW software and a finite difference computer program. The latter program can be used as a sample program to calculate the feasibility of cooling different materials to required target temperatures under controlled cooling conditions. Thermal imaging was also introduced to determine the actual process temperatures; it can be used as a powerful tool to analyse mould surface temperatures and to verify the mathematical model. A buckling problem in the ejection stage was modelled, calculated with PATRAN/ABAQUS finite element analysis software, and tested. These calculations and analyses are applied to a specific case but can serve as an example for general analysis and calculation in the ejection stage of plastics forming.
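
    The cooling analysis above relies on a finite difference program; as a generic illustration only (the material properties, dimensions and temperatures below are assumed, not taken from the thesis), an explicit one-dimensional conduction scheme for estimating the time to cool a moulding to its ejection temperature might look like:

```python
import numpy as np

# Explicit 1D finite-difference cooling of a polymer wall against a cooled mould face.
# All property values are illustrative placeholders, not the thesis data.
L = 2e-3                                       # half wall thickness (m)
alpha = 8e-8                                   # thermal diffusivity of the polymer (m^2/s)
T_melt, T_mould, T_eject = 230.0, 40.0, 90.0   # temperatures (degrees C)

n = 41
dx = L / (n - 1)
dt = 0.4 * dx**2 / alpha          # explicit stability: Fourier number 0.4 < 0.5
T = np.full(n, T_melt)
T[0] = T_mould                    # cooled mould surface (held constant)
t = 0.0
while T.max() > T_eject:          # cool until the hottest point reaches ejection temperature
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    T[-1] = T[-2]                 # symmetry (insulated) boundary at the mid-plane
    t += dt
print(f"estimated cooling time: {t:.1f} s")
```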

  16. A general framework for the use of logistic regression models in meta-analysis.

    PubMed

    Simmonds, Mark C; Higgins, Julian Pt

    2016-12-01

    Where individual participant data are available for every randomised trial in a meta-analysis of dichotomous event outcomes, "one-stage" random-effects logistic regression models have been proposed as a way to analyse these data. Such models can also be used even when individual participant data are not available and we have only summary contingency table data. One benefit of this one-stage regression model over conventional meta-analysis methods is that it maximises the correct binomial likelihood for the data and so does not require the common assumption that effect estimates are normally distributed. A second benefit of using this model is that it may be applied, with only minor modification, in a range of meta-analytic scenarios, including meta-regression, network meta-analyses and meta-analyses of diagnostic test accuracy. This single model can potentially replace the variety of often complex methods used in these areas. This paper considers, with a range of meta-analysis examples, how random-effects logistic regression models may be used in a number of different types of meta-analyses. This one-stage approach is compared with widely used meta-analysis methods including Bayesian network meta-analysis and the bivariate and hierarchical summary receiver operating characteristic (ROC) models for meta-analyses of diagnostic test accuracy. © The Author(s) 2014.
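
    As a simplified illustration of the one-stage idea (not the authors' exact model: a full one-stage analysis would add a random treatment-by-trial effect rather than the fixed trial effects used here, and all trial names and counts below are hypothetical), the sketch expands summary 2x2 tables into participant-level rows and maximises the binomial likelihood directly in a single logistic regression:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical contingency-table summaries: events and totals per arm per trial.
trials = [
    # (trial id, treated events, treated n, control events, control n)
    ("A", 12, 100, 20, 100),
    ("B", 30, 250, 45, 240),
    ("C", 8, 80, 15, 85),
]

rows = []
for trial, ev_t, n_t, ev_c, n_c in trials:
    for treat, ev, n in ((1, ev_t, n_t), (0, ev_c, n_c)):
        rows += [{"trial": trial, "treat": treat, "event": 1}] * ev
        rows += [{"trial": trial, "treat": treat, "event": 0}] * (n - ev)
data = pd.DataFrame(rows)

# One-stage logistic model with trial effects; the exact binomial likelihood is
# maximised, so no normality assumption is needed for per-trial effect estimates.
fit = smf.logit("event ~ treat + C(trial)", data=data).fit(disp=False)
print(fit.summary())
```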

  17. Neutron Physics Division progress report for period ending February 28, 1977

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maienschein, F.C.

    1977-05-01

    Summaries are given of research progress in the following areas: (1) measurements of cross sections and related quantities, (2) cross section evaluations and theory, (3) cross section processing, testing, and sensitivity analysis, (4) integral experiments and their analyses, (5) development of methods for shield and reactor analyses, (6) analyses for specific systems or applications, and (7) information analysis and distribution. (SDF)

  18. Integrated Vehicle Ground Vibration Testing in Support of Launch Vehicle Loads and Controls Analysis

    NASA Technical Reports Server (NTRS)

    Askins, Bruce R.; Davis, Susan R.; Salyer, Blaine H.; Tuma, Margaret L.

    2008-01-01

    All structural systems possess a basic set of physical characteristics unique to that system. These unique physical characteristics include items such as mass distribution and damping. When specified, they allow engineers to understand and predict how a structural system behaves under given loading conditions and different methods of control. These physical properties of launch vehicles may be predicted by analysis or measured by certain types of tests. Generally, these properties are predicted by analysis during the design phase of a launch vehicle and then verified by testing before the vehicle becomes operational. A ground vibration test (GVT) is intended to measure by test the fundamental dynamic characteristics of launch vehicles during various phases of flight. During the series of tests, properties such as natural frequencies, mode shapes, and transfer functions are measured directly. These data will then be used to calibrate loads and control systems analysis models for verifying analyses of the launch vehicle. NASA manned launch vehicles have undergone ground vibration testing leading to the development of successful launch vehicles. A GVT was not performed on the inaugural launch of the unmanned Delta III, which was lost during launch. Subsequent analyses indicated that, had a GVT been performed, it would have identified the instability issues, avoiding loss of the vehicle. This discussion will address GVT planning, set-up, execution, and analyses for the Saturn and Shuttle programs, and will also focus on the current and on-going planning for the Ares I and V Integrated Vehicle Ground Vibration Test (IVGVT).

  19. Destructive physical analysis of hollow cathodes from the Deep Space 1 Flight spare ion engine 30,000 hr life test

    NASA Technical Reports Server (NTRS)

    Sengupta, Anita

    2005-01-01

    Destructive physical analysis of the discharge and neutralizer hollow cathode assemblies from the Deep Space 1 Flight Spare 30,000 Hr life test was performed to characterize physical and chemical evidence of operationally induced effects after 30,372 hours of operation with beam extraction. Post-test inspection of the discharge-cathode assembly was subdivided into detailed analyses at the subcomponent level. Detailed materials analysis and optical inspection of the insert, orifice plate, cathode tube, heater, keeper assembly, insulator, and low-voltage propellant isolator were performed. Energy dispersive X-ray (EDX) and scanning electron microscopy (SEM) analyses were used to determine the extent and composition of regions of net deposition and erosion of both the discharge and neutralizer inserts. A comparative approach with an un-operated 4:1:1 insert was used to determine the extent of impregnate material depletion as a function of depth from the ID surface and axial position from the orifice plate. Analysis results are compared and contrasted with those obtained from similar analyses on components from shorter term tests, and provide insight regarding the prospect for successful longer-term operation consistent with SOA ion engine program life objectives at NASA.

  20. Test Basher Benefit-Cost Analysis.

    ERIC Educational Resources Information Center

    Phelps, Richard P.

    1996-01-01

    Starting in the late 1980s, two teams of researchers, well known for their criticism of standardized tests on equity and validity grounds, began attacking standardized testing on efficiency grounds as well, using cost-benefit analysis to do it. Their analyses are reviewed, and their conclusions discussed. The first team, Lorrie A. Shepard, Amelia…

  1. Autonomous Precision Landing and Hazard Avoidance Technology (ALHAT) Project Status as of May 2010

    NASA Technical Reports Server (NTRS)

    Striepe, Scott A.; Epp, Chirold D.; Robertson, Edward A.

    2010-01-01

    This paper presents the current status of NASA's Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) Project. The ALHAT team has completed several flight tests and two major design analysis cycles. These tests and analyses examine terrain relative navigation sensors, hazard detection and avoidance sensors and algorithms, hazard relative navigation algorithms, and the guidance and navigation system using these ALHAT functions. The next flight test is scheduled for July 2010. The paper contains results from completed flight tests and analysis cycles. ALHAT system status and upcoming tests and analyses are also addressed. The current ALHAT plans as of May 2010 are discussed. Application of the ALHAT system to landing on bodies other than the Moon is included.

  2. The Data from Aeromechanics Test and Analytics -- Management and Analysis Package (DATAMAP). Volume I. User’s Manual.

    DTIC Science & Technology

    1980-12-01

    [Garbled excerpt from the report's table of contents and abstract. Recoverable content: perceived noise level analysis (perceived noisiness derived from sound pressure level in decibels at an assumed frequency of 1000 Hz), acoustic weighting networks, octave and third-octave band analyses, and basic statistical analyses (mean, variance, standard deviation).]

  3. Progressing from initially ambiguous functional analyses: three case examples.

    PubMed

    Tiger, Jeffrey H; Fisher, Wayne W; Toussaint, Karen A; Kodak, Tiffany

    2009-01-01

    Most often functional analyses are initiated using a standard set of test conditions, similar to those described by Iwata, Dorsey, Slifer, Bauman, and Richman [Iwata, B. A., Dorsey, M. F., Slifer, K. J., Bauman, K. E., & Richman, G. S. (1994). Toward a functional analysis of self-injury. Journal of Applied Behavior Analysis, 27, 197-209 (Reprinted from Analysis and Intervention in Developmental Disabilities, 2, 3-20, 1982)]. These test conditions involve the careful manipulation of motivating operations, discriminative stimuli, and reinforcement contingencies to determine the events related to the occurrence and maintenance of problem behavior. Some individuals display problem behavior that is occasioned and reinforced by idiosyncratic or otherwise unique combinations of environmental antecedents and consequences of behavior, which are unlikely to be detected using these standard assessment conditions. For these individuals, modifications to the standard test conditions or the inclusion of novel test conditions may result in clearer assessment outcomes. The current study provides three case examples of individuals whose functional analyses were initially undifferentiated; however, modifications to the standard conditions resulted in the identification of behavioral functions and the implementation of effective function-based treatments.

  4. Design, analysis, and test verification of advanced encapsulation systems

    NASA Technical Reports Server (NTRS)

    Mardesich, N.; Minning, C.

    1982-01-01

    Design sensitivities are established for the development of photovoltaic module criteria and the definition of needed research tasks. The program consists of three phases. In Phase I, analytical models were developed to perform optical, thermal, electrical, and structural analyses on candidate encapsulation systems. From these analyses several candidate systems will be selected for qualification testing during Phase II. Additionally, during Phase II, test specimens of various types will be constructed and tested to determine the validity of the analysis methodology developed in Phase I. In Phase III, a finalized optimum design based on knowledge gained in Phases I and II will be developed. All verification testing was completed during this period. Preliminary results and observations are discussed. Descriptions of the thermal, thermal structural, and structural deflection test setups are included.

  5. 75 FR 17281 - Changes in Hourly Fee Rates for Science and Technology Laboratory Services-Fiscal Years 2010-2012

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-06

    ..., residue chemistry, proximate analysis for composition, and biomolecular (DNA-based) testing. A user fee... provide greater clarity of reported test analyses and laboratory determinations. DATES: Effective April 7... analyses and laboratory determinations provided by AMS laboratory services apply only to the submitted...

  6. The Use of Web 2.0 Tools by Students in Learning and Leisure Contexts: A Study in a Portuguese Institution of Higher Education

    ERIC Educational Resources Information Center

    Costa, Carolina; Alvelos, Helena; Teixeira, Leonor

    2016-01-01

    This study analyses and compares the use of Web 2.0 tools by students in both learning and leisure contexts. Data were collected based on a questionnaire applied to 234 students from the University of Aveiro (Portugal) and the results were analysed by using descriptive analysis, paired samples t-tests, cluster analyses and Kruskal-Wallis tests.…

  7. Shielding requirements for the Space Station habitability modules

    NASA Technical Reports Server (NTRS)

    Avans, Sherman L.; Horn, Jennifer R.; Williamsen, Joel E.

    1990-01-01

    The design, analysis, development, and tests of the total meteoroid/debris protection system for the Space Station Freedom habitability modules, such as the habitation module, the laboratory module, and the node structures, are described. Design requirements are discussed along with development efforts, including a combination of hypervelocity testing and analyses. Computer hydrocode analysis of hypervelocity impact phenomena associated with Space Station habitability structures is covered and the use of optimization techniques, engineering models, and parametric analyses is assessed. Explosive rail gun development efforts and protective capability and damage tolerance of multilayer insulation due to meteoroid/debris impact are considered. It is concluded that anticipated changes in the debris environment definition and requirements will require rescoping the tests and analysis required to develop a protection system.

  8. Biomechanical Analysis of Military Boots. Phase 1. Materials Testing of Military and Commercial Footwear

    DTIC Science & Technology

    1992-10-01

    [Garbled excerpt from the report's list of tables and abstract. Recoverable content: summary statistics and results of statistical analyses for impact tests performed on the forefoot of unworn and worn footwear; early footwear evaluations used tests of heel and forefoot shock absorption, upper and sole durability, and flexibility (Cavanagh, 1978).]

  9. Ongoing Analyses of Rocket Based Combined Cycle Engines by the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Ruf, Joseph H.; Holt, James B.; Canabal, Francisco

    2001-01-01

    This paper presents the status of analyses on three Rocket Based Combined Cycle (RBCC) configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics (CFD) analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes (FDNS) code for ejector mode fluid dynamics. The Draco analysis was a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.

  10. SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit

    PubMed Central

    Chu, Annie; Cui, Jenny; Dinov, Ivo D.

    2011-01-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most updated information and newly added models. PMID:21546994

  11. Seismic analysis of the Mirror Fusion Test Facility: soil structure interaction analyses of the Axicell vacuum vessel. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maslenikov, O.R.; Mraz, M.J.; Johnson, J.J.

    1986-03-01

    This report documents the seismic analyses performed by SMA for the MFTF-B Axicell vacuum vessel. In the course of this study we performed response spectrum analyses, CLASSI fixed-base analyses, and SSI analyses that included interaction effects between the vessel and vault. The response spectrum analysis served to benchmark certain modeling differences between the LLNL and SMA versions of the vessel model. The fixed-base analysis benchmarked the differences between analysis techniques. The SSI analyses provided our best estimate of vessel response to the postulated seismic excitation for the MFTF-B facility, and included consideration of uncertainties in soil properties by calculating response for a range of soil shear moduli. Our results are presented in this report as tables of comparisons of specific member forces from our analyses and the analyses performed by LLNL. Also presented are tables of maximum accelerations and relative displacements and plots of response spectra at various selected locations.

  12. Detection of Functional Change Using Cluster Trend Analysis in Glaucoma.

    PubMed

    Gardiner, Stuart K; Mansberger, Steven L; Demirel, Shaban

    2017-05-01

    Global analyses using mean deviation (MD) assess visual field progression, but can miss localized changes. Pointwise analyses are more sensitive to localized progression but are more variable, so they require confirmation. This study assessed whether cluster trend analysis, averaging information across subsets of locations, could improve progression detection. A total of 133 test-retest eyes were tested 7 to 10 times. Rates of change and P values were calculated for possible re-orderings of these series to generate global analysis criteria ("MD worsening faster than x dB/year with P < p"), and pointwise and cluster analysis criteria ("n locations [or clusters] worsening faster than x dB/year with P < p"), each with specificity of exactly 95%. These criteria were applied to 505 eyes tested over a mean of 10.5 years, to find how soon each detected "deterioration," and were compared using survival models. This was repeated including two subsequent visual fields to determine whether "deterioration" was confirmed. The best global criterion detected deterioration in 25% of eyes in 5.0 years (95% confidence interval [CI], 4.7-5.3 years), compared with 4.8 years (95% CI, 4.2-5.1) for the best cluster analysis criterion, and 4.1 years (95% CI, 4.0-4.5) for the best pointwise criterion. However, for pointwise analysis, only 38% of these changes were confirmed, compared with 61% for clusters and 76% for MD. The time until 25% of eyes showed subsequently confirmed deterioration was 6.3 years (95% CI, 6.0-7.2) for global, 6.3 years (95% CI, 6.0-7.0) for pointwise, and 6.0 years (95% CI, 5.3-6.6) for cluster analyses. Although the specificity is still suboptimal, cluster trend analysis detects subsequently confirmed deterioration sooner than either global or pointwise analyses.
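
    The pointwise and cluster trend idea described above can be sketched as follows: fit a per-location linear rate of change over the test series, then count how many locations (or cluster averages) worsen faster than a rate cut-off with a small p-value. The Python sketch below uses invented data, thresholds, and cluster assignments; it is only a schematic of the logic, not the study's criteria.

      import numpy as np
      from scipy import stats

      # Hypothetical series: 8 annual visits x 52 visual-field locations (dB sensitivities).
      rng = np.random.default_rng(1)
      years = np.arange(8, dtype=float)
      fields = 30 + rng.normal(scale=1.5, size=(8, 52))
      fields[:, :5] -= 0.8 * years[:, None]          # a few locations truly worsening

      clusters = np.repeat(np.arange(13), 4)          # hypothetical grouping of 52 points into 13 clusters

      def worsening(series, rate_cut=-0.5, p_cut=0.05):
          """Flag a series as worsening if its slope (dB/year) is below rate_cut with a small one-sided p-value."""
          res = stats.linregress(years, series)
          p_one_sided = res.pvalue / 2 if res.slope < 0 else 1.0
          return res.slope < rate_cut and p_one_sided < p_cut

      pointwise_flags = sum(worsening(fields[:, i]) for i in range(fields.shape[1]))
      cluster_flags = sum(worsening(fields[:, clusters == c].mean(axis=1)) for c in range(13))
      print(pointwise_flags, "locations and", cluster_flags, "clusters flagged as worsening")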

  13. SPSS and SAS programs for determining the number of components using parallel analysis and velicer's MAP test.

    PubMed

    O'Connor, B P

    2000-08-01

    Popular statistical software packages do not have the proper procedures for determining the number of components in factor and principal components analyses. Parallel analysis and Velicer's minimum average partial (MAP) test are validated procedures, recommended widely by statisticians. However, many researchers continue to use alternative, simpler, but flawed procedures, such as the eigenvalues-greater-than-one rule. Use of the proper procedures might be increased if these procedures could be conducted within familiar software environments. This paper describes brief and efficient programs for using SPSS and SAS to conduct parallel analyses and the MAP test.
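
    The logic of parallel analysis, comparing observed eigenvalues against eigenvalues obtained from random data, can be illustrated with a brief sketch. The Python version below is only an illustration of that logic with invented data; it is not O'Connor's SPSS/SAS code, and the MAP test is omitted.

      import numpy as np

      def parallel_analysis(data, n_sims=500, percentile=95, seed=0):
          """Horn's parallel analysis: retain components whose observed eigenvalues
          exceed the chosen percentile of eigenvalues from random data."""
          rng = np.random.default_rng(seed)
          n, p = data.shape
          obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
          rand_eig = np.empty((n_sims, p))
          for i in range(n_sims):
              sim = rng.normal(size=(n, p))
              rand_eig[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False)))[::-1]
          threshold = np.percentile(rand_eig, percentile, axis=0)
          return int(np.sum(obs_eig > threshold))

      # Hypothetical example: 300 respondents, 10 items generated from 2 latent factors.
      rng = np.random.default_rng(1)
      factors = rng.normal(size=(300, 2))
      loadings = rng.uniform(0.5, 0.9, size=(2, 10))
      items = factors @ loadings + rng.normal(scale=0.7, size=(300, 10))
      print("components to retain:", parallel_analysis(items))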

  14. Analysis of longitudinal data from animals where some data are missing in SPSS

    PubMed Central

    Duricki, DA; Soleman, S; Moon, LDF

    2017-01-01

    Testing of therapies for disease or injury often involves analysis of longitudinal data from animals. Modern analytical methods have advantages over conventional methods (particularly where some data are missing), yet they are not used widely by pre-clinical researchers. We provide here an easy-to-use protocol for analysing longitudinal data from animals and present a click-by-click guide for performing suitable analyses using the statistical package SPSS. We guide readers through analysis of a real-life data set obtained when testing a therapy for brain injury (stroke) in elderly rats. We show that repeated measures analysis of covariance failed to detect a treatment effect when a few data points were missing (due to animal drop-out), whereas analysis using an alternative method detected a beneficial effect of treatment; specifically, we demonstrate the superiority of linear models (with various covariance structures) analysed using Restricted Maximum Likelihood estimation (to include all available data). This protocol takes two hours to follow. PMID:27196723
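
    The protocol itself is a click-by-click SPSS guide; the same idea (a linear mixed model fitted by restricted maximum likelihood so that all available observations are used despite drop-out) can be sketched in Python with statsmodels. The data frame layout, group sizes, and effect sizes below are invented for illustration only.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical long-format data: 20 animals x 5 weekly scores, two treatment groups,
      # with a few scores missing (as happens with animal drop-out).
      rng = np.random.default_rng(2)
      animals, weeks = 20, 5
      df = pd.DataFrame({
          "animal": np.repeat(np.arange(animals), weeks),
          "week": np.tile(np.arange(weeks), animals),
          "group": np.repeat(["control", "treated"], animals // 2 * weeks),
      })
      df["score"] = (50 - 2 * df["week"]
                     + np.where(df["group"] == "treated", 1.5 * df["week"], 0)
                     + rng.normal(scale=3, size=len(df)))
      df.loc[rng.choice(len(df), size=8, replace=False), "score"] = np.nan  # simulate missing data

      # Linear mixed model with a random intercept per animal, fitted by REML;
      # rows with missing scores are dropped, all remaining observations are used.
      model = smf.mixedlm("score ~ week * group", data=df.dropna(), groups="animal")
      result = model.fit(reml=True)
      print(result.summary())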

  15. Evaluation of a weighted test in the analysis of ordinal gait scores in an additivity model for five OP pesticides.

    EPA Science Inventory

    Appropriate statistical analyses are critical for evaluating interactions of mixtures with a common mode of action, as is often the case for cumulative risk assessments. Our objective is to develop analyses for use when a response variable is ordinal, and to test for interaction...

  16. Separate-channel analysis of two-channel microarrays: recovering inter-spot information.

    PubMed

    Smyth, Gordon K; Altman, Naomi S

    2013-05-26

    Two-channel (or two-color) microarrays are cost-effective platforms for comparative analysis of gene expression. They are traditionally analysed in terms of the log-ratios (M-values) of the two channel intensities at each spot, but this analysis does not use all the information available in the separate channel observations. Mixed models have been proposed to analyse intensities from the two channels as separate observations, but such models can be complex to use and the gain in efficiency over the log-ratio analysis is difficult to quantify. Mixed models yield test statistics for which the null distributions can be specified only approximately, and some approaches do not borrow strength between genes. This article reformulates the mixed model to clarify the relationship with the traditional log-ratio analysis, to facilitate information borrowing between genes, and to obtain an exact distributional theory for the resulting test statistics. The mixed model is transformed to operate on the M-values and A-values (average log-expression for each spot) instead of on the log-expression values. The log-ratio analysis is shown to ignore information contained in the A-values. The relative efficiency of the log-ratio analysis is shown to depend on the size of the intra-spot correlation. A new separate channel analysis method is proposed that assumes a constant intra-spot correlation coefficient across all genes. This approach permits the mixed model to be transformed into an ordinary linear model, allowing the data analysis to use a well-understood empirical Bayes analysis pipeline for linear modeling of microarray data. This yields statistically powerful test statistics that have an exact distributional theory. The log-ratio, mixed model and common correlation methods are compared using three case studies. The results show that separate channel analyses that borrow strength between genes are more powerful than log-ratio analyses. The common correlation analysis is the most powerful of all. The common correlation method proposed in this article for separate-channel analysis of two-channel microarray data is no more difficult to apply in practice than the traditional log-ratio analysis. It provides an intuitive and powerful means to conduct analyses and make comparisons that might otherwise not be possible.
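
    The M-values and A-values referred to above are simple transformations of the two channel intensities. The Python sketch below shows that transformation and a crude per-array estimate of the intra-spot correlation; the intensities are simulated, the names are invented, and this is not the authors' (limma) implementation.

      import numpy as np

      # Hypothetical background-corrected intensities for 1000 spots on one two-colour array.
      rng = np.random.default_rng(3)
      red = rng.lognormal(mean=8, sigma=1, size=1000)    # channel 1 (e.g. Cy5)
      green = rng.lognormal(mean=8, sigma=1, size=1000)  # channel 2 (e.g. Cy3)

      # M-value: within-spot log-ratio; A-value: within-spot average log-expression.
      M = np.log2(red) - np.log2(green)
      A = 0.5 * (np.log2(red) + np.log2(green))

      # Rough per-array estimate of how strongly the two channel log-intensities co-vary.
      corr = np.corrcoef(np.log2(red), np.log2(green))[0, 1]
      print(f"median M = {np.median(M):.2f}, median A = {np.median(A):.2f}, channel correlation = {corr:.2f}")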

  17. Accelerated test plan for nickel cadmium spacecraft batteries

    NASA Technical Reports Server (NTRS)

    Hennigan, T. J.

    1973-01-01

    An accelerated test matrix is outlined that includes acceptance, baseline and post-cycling tests, chemical and physical analyses, and the data analysis procedures to be used in determining the feasibility of an accelerated test for sealed, nickel cadmium cells.

  18. Statistical analysis of Thematic Mapper Simulator data for the geobotanical discrimination of rock types in southwest Oregon

    NASA Technical Reports Server (NTRS)

    Morrissey, L. A.; Weinstock, K. J.; Mouat, D. A.; Card, D. H.

    1984-01-01

    An evaluation of Thematic Mapper Simulator (TMS) data for the geobotanical discrimination of rock types based on vegetative cover characteristics is addressed in this research. A methodology for accomplishing this evaluation utilizing univariate and multivariate techniques is presented. TMS data acquired with a Daedalus DEI-1260 multispectral scanner were integrated with vegetation and geologic information for subsequent statistical analyses, which included a chi-square test, an analysis of variance, stepwise discriminant analysis, and Duncan's multiple range test. Results indicate that ultramafic rock types are spectrally separable from nonultramafics based on vegetative cover through the use of statistical analyses.

  19. Thermal Structure Analysis of SIRCA Tile for X-34 Wing Leading Edge TPS

    NASA Technical Reports Server (NTRS)

    Milos, Frank S.; Squire, Thomas H.; Rasky, Daniel J. (Technical Monitor)

    1997-01-01

    This paper describes in detail thermal/structural analyses of SIRCA tiles which were performed at NASA Ames under the Tile Analysis Task of the X-34 Program. The analyses used the COSMOS/M finite element software to simulate the material response in arc-jet tests, mechanical deflection tests, and the performance of candidate designs for the TPS system. The purposes of the analysis were to verify thermal and structural models for the SIRCA tiles, to establish failure criteria for stressed tiles, to simulate the TPS response under flight aerothermal and mechanical loads, and to confirm that adequate safety margins exist for the actual TPS design.

  20. Laser geodynamic satellite thermal/optical/ vibrational analyses and testing. Volume 2: Technical report, book 1. [retroreflector design

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The results of the LAGEOS thermal/optical/vibrational analysis and test program are reported. Through analyses and tests it is verified that the MSFC LAGEOS design provides a retroreflector thermal environment which maintains acceptable retroreflector internal thermal gradients. The technical results of the study, organized by the major task areas, are presented. The interrelationships of the major tasks are described and the major decisions are identified.

  1. CADDIS Volume 4. Data Analysis: Basic Analyses

    EPA Pesticide Factsheets

    Use of statistical tests to determine if an observation is outside the normal range of expected values. Details of CART, regression analysis, use of quantile regression analysis, CART in causal analysis, simplifying or pruning resulting trees.

  2. Benefits of Using Planned Comparisons Rather Than Post Hoc Tests: A Brief Review with Examples.

    ERIC Educational Resources Information Center

    DuRapau, Theresa M.

    The rationale behind analysis of variance (including analysis of covariance and multiple analyses of variance and covariance) methods is reviewed, and unplanned and planned methods of evaluating differences between means are briefly described. Two advantages of using planned or a priori tests over unplanned or post hoc tests are presented. In…

  3. Statistical Analysis Techniques for Small Sample Sizes

    NASA Technical Reports Server (NTRS)

    Navard, S. E.

    1984-01-01

    The small-sample-size problem encountered when dealing with analysis of space-flight data is examined. Because of the small amount of data available, careful analyses are essential to extract the maximum amount of information with acceptable accuracy. Statistical analysis of small samples is described. The background material necessary for understanding statistical hypothesis testing is outlined, and the various tests which can be done on small samples are explained. Emphasis is on the underlying assumptions of each test and on the considerations needed to choose the most appropriate test for a given type of analysis.

  4. Detection of ingested nitromethane and reliable creatinine assessment using multiple common analytical methods.

    PubMed

    Murphy, Christine M; Devlin, John J; Beuhler, Michael C; Cheifetz, Paul; Maynard, Susan; Schwartz, Michael D; Kacinko, Sherri

    2018-04-01

    Nitromethane, found in fuels used for short distance racing, model cars, and model airplanes, produces a falsely elevated serum creatinine with standard creatinine analysis via the Jaffé method. Erroneous creatinine elevation often triggers extensive testing and leads to inaccurate diagnoses and delayed or inappropriate medical interventions. Multiple reports in the literature identify "enzymatic assays" as an alternative method to detect the true value of creatinine, but this ambiguity does not help providers determine what type of enzymatic assay testing can be done in real time to establish whether there is indeed false elevation. We report seven cases of ingested nitromethane where creatinine was determined via a Beckman Coulter® analyser using the Jaffé method, a Vitros® analyser, or i-Stat® point-of-care testing. Nitromethane was detected and semi-quantified using a common clinical toxic alcohol analysis method, and quantified by headspace-gas chromatography-mass spectrometry. When creatinine was determined using i-Stat® point-of-care testing or a Vitros® analyser, levels were within the normal range. Comparatively, all initial creatinine levels obtained via the Jaffé method were elevated. Nitromethane concentrations ranged from 42 to 310 μg/mL. These cases demonstrate reliable assessment of creatinine through other enzymatic methods using a Vitros® analyser or i-STAT®. Additionally, nitromethane is detectable and quantifiable using routine alcohols gas chromatography analysis and by headspace-gas chromatography-mass spectrometry.

  5. Performance and Stability Analyses of Rocket Combustion Devices Using Liquid Oxygen/Liquid Methane Propellants

    NASA Technical Reports Server (NTRS)

    Hulka, James R.; Jones, G. W.

    2010-01-01

    Liquid rocket engines using oxygen and methane propellants are being considered by the National Aeronautics and Space Administration (NASA) for in-space vehicles. This propellant combination has not been previously used in flight-qualified engine systems, so limited test data and analysis results are available at this stage of early development. NASA has funded several hardware-oriented programs with oxygen and methane propellants over the past several years with the Propulsion and Cryogenic Advanced Development (PCAD) project, under the Exploration Technology Development Program. As part of this effort, NASA Marshall Space Flight Center has conducted combustion, performance, and combustion stability analyses of several of the configurations on these programs. This paper summarizes these analyses. Test and analysis results of impinging and coaxial element injectors using liquid oxygen and liquid methane propellants are included. Several cases with gaseous methane are included for reference. Several different thrust chamber configurations have been modeled, including thrust chambers with multi-element like-on-like and swirl coax element injectors tested at NASA MSFC, and a unielement chamber with shear and swirl coax injectors tested at The Pennsylvania State University. Configurations were modeled with two one-dimensional liquid rocket combustion analysis codes, the Rocket Combustor Interaction Design and Analysis (ROCCID), and the Coaxial Injector Combustion Model (CICM). Significant effort was applied to show how these codes can be used to model combustion and performance with oxygen/methane propellants a priori, and what anchoring or calibrating features need to be applied or developed in the future. This paper describes the test hardware configurations, presents the results of all the analyses, and compares the results from the two analytical methods.

  6. 20 Meter Solar Sail Analysis and Correlation

    NASA Technical Reports Server (NTRS)

    Taleghani, B. K.; Lively, P. S.; Banik, J.; Murphy, D. M.; Trautt, T. A.

    2005-01-01

    This paper describes finite element analyses and correlation studies to predict deformations and vibration modes/frequencies of a 20-meter solar sail system developed by ATK Space Systems. Under the programmatic leadership of NASA Marshall Space Flight Center's In-Space Propulsion activity, the 20-meter solar sail program objectives were to verify the design, to assess structural responses of the sail system, to implement lessons learned from a previous 10-meter quadrant system analysis and test program, and to mature solar sail technology to a technology readiness level (TRL) of 5. For this 20 meter sail system, static and ground vibration tests were conducted in NASA Glenn Research Center's 100 meter diameter vacuum chamber at Plum Brook station. Prior to testing, a preliminary analysis was performed to evaluate test conditions and to determine sensor and actuator locations. After testing was completed, an analysis of each test configuration was performed. Post-test model refinements included updated properties to account for the mass of sensors, wiring, and other components used for testing. This paper describes the development of finite element models (FEM) for sail membranes and masts in each of four quadrants at both the component and system levels, as well as an optimization procedure for the static test/analyses correlation.

  7. Modal Survey of ETM-3, A 5-Segment Derivative of the Space Shuttle Solid Rocket Booster

    NASA Technical Reports Server (NTRS)

    Nielsen, D.; Townsend, J.; Kappus, K.; Driskill, T.; Torres, I.; Parks, R.

    2005-01-01

    The complex interactions between internal motor generated pressure oscillations and motor structural vibration modes associated with the static test configuration of a Reusable Solid Rocket Motor have potential to generate significant dynamic thrust loads in the 5-segment configuration (Engineering Test Motor 3). Finite element model load predictions for worst-case conditions were generated based on extrapolation of a previously correlated 4-segment motor model. A modal survey was performed on the largest rocket motor to date, Engineering Test Motor #3 (ETM-3), to provide data for finite element model correlation and validation of model generated design loads. The modal survey preparation included pretest analyses to determine an efficient analysis set selection using the Effective Independence Method and test simulations to assure critical test stand component loads did not exceed design limits. Historical Reusable Solid Rocket Motor modal testing, ETM-3 test analysis model development and pre-test loads analyses, as well as test execution, and a comparison of results to pre-test predictions are discussed.

  8. Development of the Spatial Ability Test for Middle School Students

    ERIC Educational Resources Information Center

    Yildiz, Sevda Göktepe; Özdemir, Ahmet Sükrü

    2017-01-01

    The purpose of this study was to develop a test to determine spatial ability of middle school students. The participants were 704 middle school students (6th, 7th and 8th grade) who were studying at different schools from Istanbul. Item analysis, exploratory and confirmatory factor analysis, reliability analysis were used to analyse the data.…

  9. A practical and systematic review of Weibull statistics for reporting strengths of dental materials

    PubMed Central

    Quinn, George D.; Quinn, Janet B.

    2011-01-01

    Objectives: To review the history, theory and current applications of Weibull analyses sufficient to make informed decisions regarding practical use of the analysis in dental material strength testing. Data: References are made to examples in the engineering and dental literature, but this paper also includes illustrative analyses of Weibull plots, fractographic interpretations, and Weibull distribution parameters obtained for a dense alumina, two feldspathic porcelains, and a zirconia. Sources: Informational sources include Weibull's original articles, later articles specific to applications and theoretical foundations of Weibull analysis, texts on statistics and fracture mechanics and the international standards literature. Study Selection: The chosen Weibull analyses are used to illustrate technique, the importance of flaw size distributions, physical meaning of Weibull parameters and concepts of “equivalent volumes” to compare measured strengths obtained from different test configurations. Conclusions: Weibull analysis has a strong theoretical basis and can be of particular value in dental applications, primarily because of test specimen size limitations and the use of different test configurations. Also endemic to dental materials, however, is increased difficulty in satisfying application requirements, such as confirming fracture origin type and diligence in obtaining quality strength data. PMID:19945745
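
    A two-parameter Weibull fit of the kind discussed above can be sketched in a few lines. The Python example below uses invented strength values and the common median-rank regression approach; it is one standard way to estimate the Weibull modulus and characteristic strength, not necessarily the exact procedure of the cited paper.

      import numpy as np

      # Hypothetical flexural strengths (MPa) for one dental ceramic, sorted ascending.
      strengths = np.sort(np.array([612., 655., 678., 701., 723., 741., 760., 788., 815., 842.]))
      n = len(strengths)

      # Median-rank estimate of failure probability for each ranked specimen.
      F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

      # Two-parameter Weibull: ln(-ln(1-F)) = m*ln(sigma) - m*ln(sigma0).
      x = np.log(strengths)
      y = np.log(-np.log(1.0 - F))
      m, intercept = np.polyfit(x, y, 1)          # slope = Weibull modulus m
      sigma0 = np.exp(-intercept / m)             # characteristic strength (63.2% failure probability)
      print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.0f} MPa")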

  10. Computational Analysis of Arc-Jet Wedge Tests Including Ablation and Shape Change

    NASA Technical Reports Server (NTRS)

    Goekcen, Tahir; Chen, Yih-Kanq; Skokova, Kristina A.; Milos, Frank S.

    2010-01-01

    Coupled fluid-material response analyses of arc-jet wedge ablation tests conducted in a NASA Ames arc-jet facility are considered. These tests were conducted using blunt wedge models placed in a free jet downstream of the 6-inch diameter conical nozzle in the Ames 60-MW Interaction Heating Facility. The fluid analysis includes computational Navier-Stokes simulations of the nonequilibrium flowfield in the facility nozzle and test box as well as the flowfield over the models. The material response analysis includes simulation of two-dimensional surface ablation and internal heat conduction, thermal decomposition, and pyrolysis gas flow. For ablating test articles undergoing shape change, the material response and fluid analyses are coupled in order to calculate the time dependent surface heating and pressure distributions that result from shape change. The ablating material used in these arc-jet tests was Phenolic Impregnated Carbon Ablator. Effects of the test article shape change on fluid and material response simulations are demonstrated, and computational predictions of surface recession, shape change, and in-depth temperatures are compared with the experimental measurements.

  11. CADDIS Volume 4. Data Analysis: Selecting an Analysis Approach

    EPA Pesticide Factsheets

    An approach for selecting statistical analyses to inform causal analysis. Describes methods for determining whether test site conditions differ from reference expectations. Describes an approach for estimating stressor-response relationships.

  12. Hazard Analysis for the Mark III Space Suit Assembly (SSA) Used in One-g Operations

    NASA Technical Reports Server (NTRS)

    Mitchell, Kate; Ross, Amy; Blanco, Raul; Wood, Art

    2012-01-01

    This Hazard Analysis document encompasses the Mark III Space Suit Assembly (SSA) and associated ancillary equipment. It has been prepared using JSC17773, "Preparing Hazard Analyses for JSC Ground Operation", as a guide. The purpose of this document is to present the potential hazards involved in ground (23 % maximum O2, One-g) operations of the Mark III and associated ancillary support equipment system. The hazards listed in this document are specific to suit operations only; each supporting facility (Bldg. 9, etc.) is responsible for test specific Hazard Analyses. A "hazard" is defined as any condition that has the potential for harming personnel or equipment. This analysis was performed to document the safety aspects associated with manned use of the Mark III for pressurized and unpressurized ambient, ground-based, One-g human testing. The hazards identified herein represent generic hazards inherent to all standard JSC test venues for nominal ground test configurations. Non-standard test venues or test specific configurations may warrant consideration of additional hazards analysis prior to test. The cognizant suit engineer is responsible for the safety of the astronaut/test subject, space suit, and suit support personnel. The test requester, for the test supported by the suit test engineer and suited subject, is responsible for overall safety and any necessary Test Readiness Reviews (TRR).

  13. Patterns of medicinal plant use: an examination of the Ecuadorian Shuar medicinal flora using contingency table and binomial analyses.

    PubMed

    Bennett, Bradley C; Husby, Chad E

    2008-03-28

    Botanical pharmacopoeias are non-random subsets of floras, with some taxonomic groups over- or under-represented. Moerman [Moerman, D.E., 1979. Symbols and selectivity: a statistical analysis of Native American medical ethnobotany, Journal of Ethnopharmacology 1, 111-119] introduced linear regression/residual analysis to examine these patterns. However, regression, the commonly-employed analysis, suffers from several statistical flaws. We use contingency table and binomial analyses to examine patterns of Shuar medicinal plant use (from Amazonian Ecuador). We first analyzed the Shuar data using Moerman's approach, modified to better meet requirements of linear regression analysis. Second, we assessed the exact randomization contingency table test for goodness of fit. Third, we developed a binomial model to test for non-random selection of plants in individual families. Modified regression models (which accommodated assumptions of linear regression) reduced R(2) from 0.59 to 0.38, but did not eliminate all problems associated with regression analyses. Contingency table analyses revealed that the entire flora departs from the null model of equal proportions of medicinal plants in all families. In the binomial analysis, only 10 angiosperm families (of 115) differed significantly from the null model. These 10 families are largely responsible for patterns seen at higher taxonomic levels. Contingency table and binomial analyses offer an easy and statistically valid alternative to the regression approach.
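
    The binomial logic used here, testing whether the number of medicinal species in one family departs from what the flora-wide proportion predicts, can be sketched briefly. The Python example below uses invented counts and SciPy's exact binomial test (scipy >= 1.7); it illustrates the idea rather than the authors' implementation.

      from scipy.stats import binomtest

      # Hypothetical counts: the whole flora and one plant family.
      flora_species, flora_medicinal = 3500, 700        # flora-wide proportion = 0.2
      family_species, family_medicinal = 120, 40        # one family's counts

      p_flora = flora_medicinal / flora_species
      result = binomtest(family_medicinal, n=family_species, p=p_flora, alternative="two-sided")
      print(f"observed proportion = {family_medicinal / family_species:.2f}, "
            f"expected = {p_flora:.2f}, p-value = {result.pvalue:.4f}")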

  14. A survey of variable selection methods in two Chinese epidemiology journals

    PubMed Central

    2010-01-01

    Background Although much has been written on developing better procedures for variable selection, there is little research on how it is practiced in actual studies. This review surveys the variable selection methods reported in two high-ranking Chinese epidemiology journals. Methods Articles published in 2004, 2006, and 2008 in the Chinese Journal of Epidemiology and the Chinese Journal of Preventive Medicine were reviewed. Five categories of methods were identified whereby variables were selected using: A - bivariate analyses; B - multivariable analysis; e.g. stepwise or individual significance testing of model coefficients; C - first bivariate analyses, followed by multivariable analysis; D - bivariate analyses or multivariable analysis; and E - other criteria like prior knowledge or personal judgment. Results Among the 287 articles that reported using variable selection methods, 6%, 26%, 30%, 21%, and 17% were in categories A through E, respectively. One hundred sixty-three studies selected variables using bivariate analyses, 80% (130/163) via multiple significance testing at the 5% alpha-level. Of the 219 multivariable analyses, 97 (44%) used stepwise procedures, 89 (41%) tested individual regression coefficients, but 33 (15%) did not mention how variables were selected. Sixty percent (58/97) of the stepwise routines also did not specify the algorithm and/or significance levels. Conclusions The variable selection methods reported in the two journals were limited in variety, and details were often missing. Many studies still relied on problematic techniques like stepwise procedures and/or multiple testing of bivariate associations at the 0.05 alpha-level. These deficiencies should be rectified to safeguard the scientific validity of articles published in Chinese epidemiology journals. PMID:20920252

  15. Digital data processing system dynamic loading analysis

    NASA Technical Reports Server (NTRS)

    Lagas, J. J.; Peterka, J. J.; Tucker, A. E.

    1976-01-01

    Simulation and analysis of the Space Shuttle Orbiter Digital Data Processing System (DDPS) are reported. The mated flight and postseparation flight phases of the space shuttle's approach and landing test configuration were modeled utilizing the Information Management System Interpretative Model (IMSIM) in a computerized simulation modeling of the ALT hardware, software, and workload. System requirements simulated for the ALT configuration were defined. Sensitivity analyses determined areas of potential data flow problems in DDPS operation. Based on the defined system requirements and the sensitivity analyses, a test design is described for adapting, parameterizing, and executing the IMSIM. Varying load and stress conditions for the model execution are given. The analyses of the computer simulation runs were documented as results, conclusions, and recommendations for DDPS improvements.

  16. Space shuttle orbiter digital data processing system timing sensitivity analysis OFT ascent phase

    NASA Technical Reports Server (NTRS)

    Lagas, J. J.; Peterka, J. J.; Becker, D. A.

    1977-01-01

    Dynamic loads were investigated to provide simulation and analysis of the space shuttle orbiter digital data processing system (DDPS). Segments of the ascent test (OFT) configuration were modeled utilizing the information management system interpretive model (IMSIM) in a computerized simulation modeling of the OFT hardware and software workload. System requirements for simulation of the OFT configuration were defined, and sensitivity analyses determined areas of potential data flow problems in DDPS operation. Based on the defined system requirements and these sensitivity analyses, a test design was developed for adapting, parameterizing, and executing IMSIM, using varying load and stress conditions for model execution. Analyses of the computer simulation runs are documented, including results, conclusions, and recommendations for DDPS improvements.

  17. The Use of the Position Analysis Questionnaire (PAQ) for Establishing the Job Component Validity of Tests. Report No. 5. Final Report.

    ERIC Educational Resources Information Center

    McCormick, Ernest J.; And Others

    The Position Analysis Questionnaire (PAQ), a structured job analysis questionnaire that provides for the analysis of individual jobs in terms of each of 187 job elements, was used to establish the job component validity of certain commercially-available vocational aptitude tests. Prior to the general analyses reported here, a statistical analysis…

  18. Analytical performance, agreement and user-friendliness of six point-of-care testing urine analysers for urinary tract infection in general practice

    PubMed Central

    Schot, Marjolein J C; van Delft, Sanne; Kooijman-Buiting, Antoinette M J; de Wit, Niek J; Hopstaken, Rogier M

    2015-01-01

    Objective: Various point-of-care testing (POCT) urine analysers are commercially available for routine urine analysis in general practice. The present study compares analytical performance, agreement and user-friendliness of six different POCT urine analysers for diagnosing urinary tract infection in general practice. Setting: All testing procedures were performed at a diagnostic centre for primary care in the Netherlands. Urine samples were collected at four general practices. Primary and secondary outcome measures: Analytical performance and agreement of the POCT analysers regarding nitrite, leucocytes and erythrocytes, with the laboratory reference standard, was the primary outcome measure, and analysed by calculating sensitivity, specificity, positive and negative predictive value, and Cohen's κ coefficient for agreement. Secondary outcome measures were the user-friendliness of the POCT analysers, in addition to other characteristics of the analysers. Results: The following six POCT analysers were evaluated: Uryxxon Relax (Macherey Nagel), Urisys 1100 (Roche), Clinitek Status (Siemens), Aution 11 (Menarini), Aution Micro (Menarini) and Urilyzer (Analyticon). Analytical performance was good for all analysers. Compared with laboratory reference standards, overall agreement was good, but differed per parameter and per analyser. Concerning the nitrite test, the most important test for clinical practice, all but one showed perfect agreement with the laboratory standard. For leucocytes and erythrocytes specificity was high, but sensitivity was considerably lower. Agreement for leucocytes varied between good to very good, and for the erythrocyte test between fair and good. First-time users indicated that the analysers were easy to use. They expected higher productivity and accuracy when using these analysers in daily practice. Conclusions: The overall performance and user-friendliness of all six commercially available POCT urine analysers was sufficient to justify routine use in suspected urinary tract infections in general practice. PMID:25986635
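
    The agreement statistics named above (sensitivity, specificity, predictive values, Cohen's κ) all follow from a 2x2 table of POCT result versus laboratory reference. The short Python sketch below uses invented counts purely to show the arithmetic; it is not the study's data.

      # Hypothetical 2x2 table for one parameter (e.g. nitrite): rows = POCT result, columns = reference.
      tp, fp = 45, 6        # POCT positive: reference positive / reference negative
      fn, tn = 5, 144       # POCT negative: reference positive / reference negative

      sensitivity = tp / (tp + fn)
      specificity = tn / (tn + fp)
      ppv = tp / (tp + fp)
      npv = tn / (tn + fn)

      # Cohen's kappa: observed agreement corrected for agreement expected by chance.
      n = tp + fp + fn + tn
      p_obs = (tp + tn) / n
      p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
      kappa = (p_obs - p_chance) / (1 - p_chance)

      print(f"sens={sensitivity:.2f} spec={specificity:.2f} PPV={ppv:.2f} NPV={npv:.2f} kappa={kappa:.2f}")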

  19. Occupied Volume Integrity Testing : Elastic Test Results and Analyses

    DOT National Transportation Integrated Search

    2011-09-21

    Federal Railroad Administration (FRA) and the Volpe Center have been conducting research into developing an alternative method of demonstrating occupied volume integrity (OVI) through a combination of testing and analysis. Previous works have been pu...

  20. NASA Structural Analysis Report on the American Airlines Flight 587 Accident - Local Analysis of the Right Rear Lug

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S; Glaessgen, Edward H.; Mason, Brian H; Krishnamurthy, Thiagarajan; Davila, Carlos G

    2005-01-01

    A detailed finite element analysis of the right rear lug of the American Airlines Flight 587 - Airbus A300-600R was performed as part of the National Transportation Safety Board's failure investigation of the accident that occurred on November 12, 2001. The loads experienced by the right rear lug are evaluated using global models of the vertical tail, local models near the right rear lug, and a global-local analysis procedure. The right rear lug was analyzed using two modeling approaches. In the first approach, solid-shell type modeling is used, and in the second approach, layered-shell type modeling is used. The solid-shell and the layered-shell modeling approaches were used in progressive failure analyses (PFA) to determine the load, mode, and location of failure in the right rear lug under loading representative of an Airbus certification test conducted in 1985 (the 1985-certification test). Both analyses were in excellent agreement with each other on the predicted failure loads, failure mode, and location of failure. The solid-shell type modeling was then used to analyze both a subcomponent test conducted by Airbus in 2003 (the 2003-subcomponent test) and the accident condition. Excellent agreement was observed between the analyses and the observed failures in both cases. From the analyses conducted and presented in this paper, the following conclusions were drawn. The moment, Mx (moment about the fuselage longitudinal axis), has a significant effect on the failure load of the lugs. Higher absolute values of Mx give lower failure loads. The predicted load, mode, and location of the failure of the 1985-certification test, 2003-subcomponent test, and the accident condition are in very good agreement. This agreement suggests that the 1985-certification and 2003-subcomponent tests represent the accident condition accurately. The failure mode of the right rear lug for the 1985-certification test, 2003-subcomponent test, and the accident load case is identified as a cleavage-type failure. For the accident case, the predicted failure load for the right rear lug from the PFA is greater than 1.98 times the limit load of the lugs.

  1. Difficult Decisions Made Easier

    NASA Technical Reports Server (NTRS)

    2006-01-01

    NASA missions are extremely complex and prone to sudden, catastrophic failure if equipment falters or if an unforeseen event occurs. For these reasons, NASA trains to expect the unexpected. It tests its equipment and systems in extreme conditions, and it develops risk-analysis tests to foresee any possible problems. The Space Agency recently worked with an industry partner to develop reliability analysis software capable of modeling complex, highly dynamic systems, taking into account variations in input parameters and the evolution of the system over the course of a mission. The goal of this research was multifold. It included performance and risk analyses of complex, multiphase missions, like the insertion of the Mars Reconnaissance Orbiter; reliability analyses of systems with redundant and/or repairable components; optimization analyses of system configurations with respect to cost and reliability; and sensitivity analyses to identify optimal areas for uncertainty reduction or performance enhancement.

  2. Analysis and correlation of the test data from an advanced technology rotor system

    NASA Technical Reports Server (NTRS)

    Jepson, D.; Moffitt, R.; Hilzinger, K.; Bissell, J.

    1983-01-01

    Comparisons were made of the performance and blade vibratory load characteristics for an advanced rotor system as predicted by analysis and as measured in a 1/5 scale model wind tunnel test, a full scale model wind tunnel test and flight test. The accuracy with which the tools available at various stages of the design/development process (analysis, model tests, etc.) could predict the final characteristics measured on the aircraft was determined. The accuracy of the analyses in predicting the effects of systematic tip planform variations investigated in the full scale wind tunnel test was also evaluated.

  3. Statistical analysis of an inter-laboratory comparison of small-scale safety and thermal testing of RDX

    DOE PAGES

    Brown, Geoffrey W.; Sandstrom, Mary M.; Preston, Daniel N.; ...

    2014-11-17

    In this study, the Integrated Data Collection Analysis (IDCA) program has conducted a proficiency test for small-scale safety and thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results from this test for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Class 5 Type II standard. The material was tested as a well-characterized standard several times during the proficiency test to assess differences among participants and the range of results that may arise for well-behaved explosive materials.

  4. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    PubMed

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
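
    As an illustration of the kind of computation such a program performs, the Python sketch below approximates the power of a two-sided test of a single correlation via Fisher's z transformation. This is a textbook approximation with invented planning numbers, not G*Power's exact algorithm.

      import numpy as np
      from scipy.stats import norm

      def correlation_power(rho, n, alpha=0.05):
          """Approximate power for a two-sided test of H0: rho = 0 using Fisher's z transformation."""
          z_rho = np.arctanh(rho)                 # Fisher z of the assumed population correlation
          se = 1.0 / np.sqrt(n - 3)
          z_crit = norm.ppf(1 - alpha / 2)
          # Probability of exceeding the critical value in either tail under the alternative.
          return norm.sf(z_crit - z_rho / se) + norm.cdf(-z_crit - z_rho / se)

      # Hypothetical planning question: power to detect rho = 0.3 with n = 84 at alpha = 0.05.
      print(f"approximate power = {correlation_power(0.3, 84):.2f}")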

  5. Analysis of a Hybrid Wing Body Center Section Test Article

    NASA Technical Reports Server (NTRS)

    Wu, Hsi-Yung T.; Shaw, Peter; Przekop, Adam

    2013-01-01

    The hybrid wing body center section test article is an all-composite structure made of crown, floor, keel, bulkhead, and rib panels utilizing the Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) design concept. The primary goal of this test article is to prove that PRSEUS components are capable of carrying combined loads that are representative of a hybrid wing body pressure cabin design regime. This paper summarizes the analytical approach, analysis results, and failure predictions of the test article. A global finite element model of composite panels, metallic fittings, mechanical fasteners, and the Combined Loads Test System (COLTS) test fixture was used to conduct linear structural strength and stability analyses to validate the specimen under the most critical combination of bending and pressure loading conditions found in the hybrid wing body pressure cabin. Local detail analyses were also performed at locations with high stress concentrations, at Tee-cap noodle interfaces with surrounding laminates, and at fastener locations with high bearing/bypass loads. Failure predictions for different composite and metallic failure modes were made, and nonlinear analyses were also performed to study the structural response of the test article under combined bending and pressure loading. This large-scale specimen test will be conducted at the COLTS facility at the NASA Langley Research Center.

  6. Rasch analysis for psychometric improvement of science attitude rating scales

    NASA Astrophysics Data System (ADS)

    Oon, Pey-Tee; Fan, Xitao

    2017-04-01

    Students' attitude towards science (SAS) is often a subject of investigation in science education research. Rating-scale surveys are commonly used in the study of SAS. The present study illustrates how Rasch analysis can be used to provide psychometric information on SAS rating scales. The analyses were conducted on a 20-item SAS scale from an existing dataset of the Trends in International Mathematics and Science Study (TIMSS 2011). Data for all the eighth-grade participants from Hong Kong and Singapore (N = 9942) were retrieved for analyses. Additional insights from Rasch analysis that are not commonly available from conventional test and item analyses were discussed, such as measurement invariance of SAS, unidimensionality of the SAS construct, optimum utilization of the SAS rating categories, and the item difficulty hierarchy in the SAS scale. Recommendations on how the TIMSS items measuring SAS can be better designed were discussed. The study also highlights the importance of using Rasch estimates for statistical parametric tests (e.g. ANOVA, t-test) that are common in science education research for group comparisons.
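
    At the core of a Rasch analysis is the item response function relating person ability and item difficulty on a shared logit scale. The Python sketch below shows only the dichotomous Rasch probability with invented difficulties; the TIMSS attitude items are actually polytomous and would use rating-scale extensions, and this is not the estimation software used in the study.

      import numpy as np

      def rasch_prob(theta, b):
          """Dichotomous Rasch model: probability of a correct/endorsed response for a
          person of ability theta on an item of difficulty b (both in logits)."""
          return 1.0 / (1.0 + np.exp(-(theta - b)))

      # Hypothetical item difficulties (logits) forming an easy-to-hard hierarchy,
      # evaluated for persons at low, average and high attitude levels.
      difficulties = np.array([-1.5, -0.5, 0.0, 0.8, 1.6])
      for theta in (-1.0, 0.0, 1.0):
          print(f"theta={theta:+.1f}:", np.round(rasch_prob(theta, difficulties), 2))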

  7. Comparative multivariate analyses of transient otoacoustic emissions and distorsion products in normal and impaired hearing.

    PubMed

    Stamate, Mirela Cristina; Todor, Nicolae; Cosgarea, Marcel

    2015-01-01

    The clinical utility of otoacoustic emissions as a noninvasive objective test of cochlear function has been long studied. Both transient otoacoustic emissions and distorsion products can be used to identify hearing loss, but to what extent they can be used as predictors for hearing loss is still debated. Most studies agree that multivariate analyses have better test performances than univariate analyses. The aim of the study was to determine transient otoacoustic emissions and distorsion products performance in identifying normal and impaired hearing loss, using the pure tone audiogram as a gold standard procedure and different multivariate statistical approaches. The study included 105 adult subjects with normal hearing and hearing loss who underwent the same test battery: pure-tone audiometry, tympanometry, otoacoustic emission tests. We chose to use the logistic regression as a multivariate statistical technique. Three logistic regression models were developed to characterize the relations between different risk factors (age, sex, tinnitus, demographic features, cochlear status defined by otoacoustic emissions) and hearing status defined by pure-tone audiometry. The multivariate analyses allow the calculation of the logistic score, which is a combination of the inputs, weighted by coefficients, calculated within the analyses. The accuracy of each model was assessed using receiver operating characteristics curve analysis. We used the logistic score to generate receivers operating curves and to estimate the areas under the curves in order to compare different multivariate analyses. We compared the performance of each otoacoustic emission (transient, distorsion product) using three different multivariate analyses for each ear, when multi-frequency gold standards were used. We demonstrated that all multivariate analyses provided high values of the area under the curve proving the performance of the otoacoustic emissions. Each otoacoustic emission test presented high values of area under the curve, suggesting that implementing a multivariate approach to evaluate the performances of each otoacoustic emission test would serve to increase the accuracy in identifying the normal and impaired ears. We encountered the highest area under the curve value for the combined multivariate analysis suggesting that both otoacoustic emission tests should be used in assessing hearing status. Our multivariate analyses revealed that age is a constant predictor factor of the auditory status for both ears, but the presence of tinnitus was the most important predictor for the hearing level, only for the left ear. Age presented similar coefficients, but tinnitus coefficients, by their high value, produced the highest variations of the logistic scores, only for the left ear group, thus increasing the risk of hearing loss. We did not find gender differences between ears for any otoacoustic emission tests, but studies still debate this question as the results are contradictory. Neither gender, nor environment origin had any predictive value for the hearing status, according to the results of our study. Like any other audiological test, using otoacoustic emissions to identify hearing loss is not without error. Even when applying multivariate analysis, perfect test performance is never achieved. 
    Although most studies have demonstrated the benefit of using multivariate analysis, it has not been incorporated into clinical decisions, perhaps because of the idiosyncratic nature of multivariate solutions or because of the lack of validation studies.
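
    The workflow described above, fitting a logistic model, forming a weighted score from its coefficients, and summarising discrimination with the area under the ROC curve, can be sketched briefly. The Python example below uses scikit-learn with simulated subjects and invented predictor names; it is not the study's models or data.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      # Hypothetical subjects: age, tinnitus (0/1) and an otoacoustic-emission summary as predictors
      # of impaired hearing (0 = normal, 1 = impaired).
      rng = np.random.default_rng(4)
      n = 200
      age = rng.uniform(20, 80, n)
      tinnitus = rng.integers(0, 2, n)
      oae = rng.normal(0, 1, n)
      logit = -4 + 0.05 * age + 1.2 * tinnitus - 1.0 * oae
      impaired = rng.random(n) < 1 / (1 + np.exp(-logit))

      X = np.column_stack([age, tinnitus, oae])
      model = LogisticRegression(max_iter=1000).fit(X, impaired)

      # The "logistic score" is the linear combination of inputs weighted by the fitted coefficients;
      # the area under the ROC curve summarises how well it separates normal from impaired ears.
      score = model.decision_function(X)
      print("AUC =", round(roc_auc_score(impaired, score), 2))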

  8. Comparative multivariate analyses of transient otoacoustic emissions and distorsion products in normal and impaired hearing

    PubMed Central

    STAMATE, MIRELA CRISTINA; TODOR, NICOLAE; COSGAREA, MARCEL

    2015-01-01

    Background and aim The clinical utility of otoacoustic emissions as a noninvasive objective test of cochlear function has been long studied. Both transient otoacoustic emissions and distorsion products can be used to identify hearing loss, but to what extent they can be used as predictors for hearing loss is still debated. Most studies agree that multivariate analyses have better test performances than univariate analyses. The aim of the study was to determine transient otoacoustic emissions and distorsion products performance in identifying normal and impaired hearing loss, using the pure tone audiogram as a gold standard procedure and different multivariate statistical approaches. Methods The study included 105 adult subjects with normal hearing and hearing loss who underwent the same test battery: pure-tone audiometry, tympanometry, otoacoustic emission tests. We chose to use the logistic regression as a multivariate statistical technique. Three logistic regression models were developed to characterize the relations between different risk factors (age, sex, tinnitus, demographic features, cochlear status defined by otoacoustic emissions) and hearing status defined by pure-tone audiometry. The multivariate analyses allow the calculation of the logistic score, which is a combination of the inputs, weighted by coefficients, calculated within the analyses. The accuracy of each model was assessed using receiver operating characteristics curve analysis. We used the logistic score to generate receivers operating curves and to estimate the areas under the curves in order to compare different multivariate analyses. Results We compared the performance of each otoacoustic emission (transient, distorsion product) using three different multivariate analyses for each ear, when multi-frequency gold standards were used. We demonstrated that all multivariate analyses provided high values of the area under the curve proving the performance of the otoacoustic emissions. Each otoacoustic emission test presented high values of area under the curve, suggesting that implementing a multivariate approach to evaluate the performances of each otoacoustic emission test would serve to increase the accuracy in identifying the normal and impaired ears. We encountered the highest area under the curve value for the combined multivariate analysis suggesting that both otoacoustic emission tests should be used in assessing hearing status. Our multivariate analyses revealed that age is a constant predictor factor of the auditory status for both ears, but the presence of tinnitus was the most important predictor for the hearing level, only for the left ear. Age presented similar coefficients, but tinnitus coefficients, by their high value, produced the highest variations of the logistic scores, only for the left ear group, thus increasing the risk of hearing loss. We did not find gender differences between ears for any otoacoustic emission tests, but studies still debate this question as the results are contradictory. Neither gender, nor environment origin had any predictive value for the hearing status, according to the results of our study. Conclusion Like any other audiological test, using otoacoustic emissions to identify hearing loss is not without error. Even when applying multivariate analysis, perfect test performance is never achieved. 
    Although most studies have demonstrated the benefit of using multivariate analysis, it has not been incorporated into clinical decisions, perhaps because of the idiosyncratic nature of multivariate solutions or because of the lack of validation studies. PMID:26733749

  9. Analysis and test of a 16-foot radial rib reflector developmental model

    NASA Technical Reports Server (NTRS)

    Birchenough, Shawn A.

    1989-01-01

    Analytical and experimental modal tests were performed to determine the vibrational characteristics of a 16-foot diameter radial rib reflector model. Single rib analyses and experimental tests provided preliminary information relating to the reflector. A finite element model predicted mode shapes and frequencies of the reflector. The analyses correlated well with the experimental tests, verifying the modeling method used. The results indicate that five related, characteristic mode shapes form a group. The frequencies of the modes are determined by the relative phase of the radial ribs.

  10. Pretest analysis document for Test S-FS-6

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaw, R.A.; Hall, D.G.

    This report documents the pretest analyses completed for Semiscale Test S-FS-6. This test will simulate a transient initiated by a 100% break in a steam generator bottom feedwater line downstream of the check valve. The initial conditions represent normal operating conditions for a C-E System 80 nuclear power plant. Predictions of transients resulting from feedwater line breaks in these plants have indicated that significant primary system overpressurization may occur. The enclosed analyses include a RELAP5/MOD2/CY21 code calculation and preliminary results from a facility hot, integrated test which was conducted to near S-FS-6 specifications. The results of these analyses indicate that the test objectives for Test S-FS-6 can be achieved. The primary system overpressurization will pose no threat to personnel or plant integrity.

  11. LOD score exclusion analyses for candidate QTLs using random population samples.

    PubMed

    Deng, Hong-Wen

    2003-11-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes as putative QTLs using random population samples. Previously, we developed an LOD score exclusion mapping approach for candidate genes for complex diseases. Here, we extend this LOD score approach for exclusion analyses of candidate genes for quantitative traits. Under this approach, specific genetic effects (as reflected by heritability) and inheritance models at candidate QTLs can be analyzed and if an LOD score is < or = -2.0, the locus can be excluded from having a heritability larger than that specified. Simulations show that this approach has high power to exclude a candidate gene from having moderate genetic effects if it is not a QTL and is robust to population admixture. Our exclusion analysis complements association analysis for candidate genes as putative QTLs in random population samples. The approach is applied to test the importance of Vitamin D receptor (VDR) gene as a potential QTL underlying the variation of bone mass, an important determinant of osteoporosis.

  12. It's DE-licious: A Recipe for Differential Expression Analyses of RNA-seq Experiments Using Quasi-Likelihood Methods in edgeR.

    PubMed

    Lun, Aaron T L; Chen, Yunshun; Smyth, Gordon K

    2016-01-01

    RNA sequencing (RNA-seq) is widely used to profile transcriptional activity in biological systems. Here we present an analysis pipeline for differential expression analysis of RNA-seq experiments using the Rsubread and edgeR software packages. The basic pipeline includes read alignment and counting, filtering and normalization, modelling of biological variability and hypothesis testing. For hypothesis testing, we describe particularly the quasi-likelihood features of edgeR. Some more advanced downstream analysis steps are also covered, including complex comparisons, gene ontology enrichment analyses and gene set testing. The code required to run each step is described, along with an outline of the underlying theory. The chapter includes a case study in which the pipeline is used to study the expression profiles of mammary gland cells in virgin, pregnant and lactating mice.

  13. LSAT Dimensionality Analysis for the December 1991, June 1992, and October 1992 Administrations. Statistical Report. LSAC Research Report Series.

    ERIC Educational Resources Information Center

    Douglas, Jeff; Kim, Hae-Rim; Roussos, Louis; Stout, William; Zhang, Jinming

    An extensive nonparametric dimensionality analysis of latent structure was conducted on three forms of the Law School Admission Test (LSAT) (December 1991, June 1992, and October 1992) using the DIMTEST model in confirmatory analyses and using DIMTEST, FAC, DETECT, HCA, PROX, and a genetic algorithm in exploratory analyses. Results indicate that…

  14. A comparison of two follow-up analyses after multiple analysis of variance, analysis of variance, and descriptive discriminant analysis: A case study of the program effects on education-abroad programs

    Treesearch

    Alvin H. Yu; Garry Chick

    2010-01-01

    This study compared the utility of two different post-hoc tests after detecting significant differences within factors on multiple dependent variables using multivariate analysis of variance (MANOVA). We compared the univariate F test (the Scheffé method) to descriptive discriminant analysis (DDA) using an educational-tour survey of university study-...

  15. Methods to increase reproducibility in differential gene expression via meta-analysis

    PubMed Central

    Sweeney, Timothy E.; Haynes, Winston A.; Vallania, Francesco; Ioannidis, John P.; Khatri, Purvesh

    2017-01-01

    Findings from clinical and biological studies are often not reproducible when tested in independent cohorts. Due to the testing of a large number of hypotheses and relatively small sample sizes, results from whole-genome expression studies in particular are often not reproducible. Compared to single-study analysis, gene expression meta-analysis can improve reproducibility by integrating data from multiple studies. However, there are multiple choices in designing and carrying out a meta-analysis. Yet, clear guidelines on best practices are scarce. Here, we hypothesized that studying subsets of very large meta-analyses would allow for systematic identification of best practices to improve reproducibility. We therefore constructed three very large gene expression meta-analyses from clinical samples, and then examined meta-analyses of subsets of the datasets (all combinations of datasets with up to N/2 samples and K/2 datasets) compared to a ‘silver standard’ of differentially expressed genes found in the entire cohort. We tested three random-effects meta-analysis models using this procedure. We showed relatively greater reproducibility with more-stringent effect size thresholds combined with relaxed significance thresholds; relatively lower reproducibility when imposing extraneous constraints on residual heterogeneity; and an underestimation of the actual false positive rate by Benjamini–Hochberg correction. In addition, multivariate regression showed that the accuracy of a meta-analysis increased significantly with more included datasets, even when controlling for sample size. PMID:27634930
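
    One widely used random-effects pooling method of the general kind tested above is the DerSimonian-Laird estimator. The Python sketch below implements that estimator with invented per-study effect sizes and variances; it illustrates the computation only and is not the authors' pipeline or one of their specific models.

      import numpy as np

      def dersimonian_laird(effects, variances):
          """Pool per-study effect sizes with the DerSimonian-Laird random-effects estimator."""
          effects, variances = np.asarray(effects, float), np.asarray(variances, float)
          w_fixed = 1.0 / variances
          fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)
          q = np.sum(w_fixed * (effects - fixed) ** 2)             # Cochran's Q
          df = len(effects) - 1
          c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
          tau2 = max(0.0, (q - df) / c)                            # between-study variance estimate
          w_rand = 1.0 / (variances + tau2)
          pooled = np.sum(w_rand * effects) / np.sum(w_rand)
          se = np.sqrt(1.0 / np.sum(w_rand))
          return pooled, se, tau2

      # Hypothetical log fold-changes for one gene across five expression datasets.
      pooled, se, tau2 = dersimonian_laird([0.8, 0.5, 1.1, 0.2, 0.9], [0.04, 0.09, 0.05, 0.12, 0.06])
      print(f"pooled effect = {pooled:.2f} +/- {1.96 * se:.2f}, tau^2 = {tau2:.3f}")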

  16. Full in-vitro analyses of new-generation bulk fill dental composites cured by halogen light.

    PubMed

    Tekin, Tuçe Hazal; Kantürk Figen, Aysel; Yılmaz Atalı, Pınar; Coşkuner Filiz, Bilge; Pişkin, Mehmet Burçin

    2017-08-01

    The objective of this study was to investigate the full in-vitro analyses of new-generation bulk-fill dental composites cured by halogen light (HLG). Four composites of two types were studied: Surefill SDR (SDR) and Xtra Base (XB) as bulk-fill flowable materials; QuixFill (QF) and XtraFill (XF) as packable bulk-fill materials. Samples were prepared for each analysis and test by applying the same procedure, but with different diameters and thicknesses appropriate to the analysis and test requirements. Thermal properties were determined by thermogravimetric analysis (TG/DTG) and differential scanning calorimetry (DSC) analysis; the Vickers microhardness (VHN) was measured after 1, 7, 15 and 30 days of storage in water. The degree of conversion values for the materials (DC, %) were immediately measured using near-infrared spectroscopy (FT-IR). The surface morphology of the composites was investigated by scanning electron microscopy (SEM) and atomic-force microscopy (AFM) analyses. The sorption and solubility measurements were also performed after 1, 7, 15 and 30 days of storage in water. In addition to this, the data were statistically analyzed using one-way analysis of variance, and both the Newman-Keuls and Tukey multiple comparison tests. The statistical significance level was established at p<0.05. According to the ISO 4049 standards, all the tested materials showed acceptable water sorption and solubility, and a halogen light source was an option to polymerize bulk-fill, resin-based dental composites. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Design, analysis and test verification of advanced encapsulation systems

    NASA Technical Reports Server (NTRS)

    Garcia, A.; Minning, C.

    1982-01-01

    Analytical models were developed to perform optical, thermal, electrical and structural analyses on candidate encapsulation systems. Qualification testing of specimens of various types and a finalized optimum design are projected.

  18. How to Make Nothing Out of Something: Analyses of the Impact of Study Sampling and Statistical Interpretation in Misleading Meta-Analytic Conclusions

    PubMed Central

    Cunningham, Michael R.; Baumeister, Roy F.

    2016-01-01

    The limited resource model states that self-control is governed by a relatively finite set of inner resources on which people draw when exerting willpower. Once self-control resources have been used up or depleted, they are less available for other self-control tasks, leading to a decrement in subsequent self-control success. The depletion effect has been studied for over 20 years, tested or extended in more than 600 studies, and supported in an independent meta-analysis (Hagger et al., 2010). Meta-analyses are supposed to reduce bias in literature reviews. Carter et al.’s (2015) meta-analysis, by contrast, included a series of questionable decisions involving sampling, methods, and data analysis. We provide quantitative analyses of key sampling issues: exclusion of many of the best depletion studies based on idiosyncratic criteria and the emphasis on mini meta-analyses with low statistical power as opposed to the overall depletion effect. We discuss two key methodological issues: failure to code for research quality, and the quantitative impact of weak studies by novice researchers. We discuss two key data analysis issues: questionable interpretation of the results of trim and fill and Funnel Plot Asymmetry test procedures, and the use and misinterpretation of the untested Precision Effect Test and Precision Effect Estimate with Standard Error (PEESE) procedures. Despite these serious problems, the Carter et al. (2015) meta-analysis results actually indicate that there is a real depletion effect – contrary to their title. PMID:27826272

  19. Considerations in STS payload environmental verification

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1978-01-01

    Considerations regarding the Space Transportation System (STS) payload environmental verification are reviewed. It is noted that emphasis is placed on testing at the subassembly level and that the basic objective of structural dynamic payload verification is to ensure reliability in a cost-effective manner. Structural analyses consist of: (1) stress analysis for critical loading conditions, (2) modal analysis for launch and orbital configurations, (3) flight loads analysis, (4) test simulation analysis to verify models, (5) kinematic analysis of deployment/retraction sequences, and (6) structural-thermal-optical program analysis. In addition to these approaches, payload verification programs are being developed in the thermal-vacuum area. These include exposure to extreme temperatures, temperature cycling, thermal-balance testing and thermal-vacuum testing.

  20. Substantiation Data for Advanced Beaded and Tubular Structural Panels. Volume 3: Testing

    NASA Technical Reports Server (NTRS)

    Hedges, P. C.; Greene, B. E.

    1974-01-01

    The test program conducted to provide the necessary experimental data to verify the design and analysis methods developed for beaded and tubular panels is described. Test results are summarized and presented for all local buckling and full size panel tests. Selected representative test data from each of these tests are presented in detail. The results of this program established a valid analysis and design procedure for circular tube panels. Test results from three other configurations show deformational modes which are not adequately accounted for in the present analyses.

  1. Testing Mediation Using Multiple Regression and Structural Equation Modeling Analyses in Secondary Data

    ERIC Educational Resources Information Center

    Li, Spencer D.

    2011-01-01

    Mediation analysis in child and adolescent development research is possible using large secondary data sets. This article provides an overview of two statistical methods commonly used to test mediated effects in secondary analysis: multiple regression and structural equation modeling (SEM). Two empirical studies are presented to illustrate the…

  2. Agreement between Descriptive and Experimental Analyses of Behavior under Naturalistic Test Conditions

    ERIC Educational Resources Information Center

    Martens, Brian K.; Gertz, Lynne E.; Werder, Candace Susan de Lacy; Rymanowski, Jennifer L.

    2010-01-01

    We compared the results of a contingency space analysis (CSA) of behavior-consequence recordings to the results of functional analysis (FA) test conditions involving antecedent stimuli and verbal statements that both differed from and mimicked those in the natural environment. Three preschool children with autism spectrum disorder participated.…

  3. Effect of Purification Procedures on DIF Analysis in IRTPRO

    ERIC Educational Resources Information Center

    Fikis, David R. J.; Oshima, T. C.

    2017-01-01

    Purification of the test has been a well-accepted procedure in enhancing the performance of tests for differential item functioning (DIF). As defined by Lord, purification requires reestimation of ability parameters after removing DIF items before conducting the final DIF analysis. IRTPRO 3 is a recently updated program for analyses in item…

  4. Biaxial Testing of 2219-T87 Aluminum Alloy Using Cruciform Specimens

    NASA Technical Reports Server (NTRS)

    Dawicke, D. S.; Pollock, W. D.

    1997-01-01

    A cruciform biaxial test specimen was designed and seven biaxial tensile tests were conducted on 2219-T87 aluminum alloy. An elastic-plastic finite element analysis was used to simulate each test and predict the yield stresses. The elastic-plastic finite element analysis accurately simulated the measured load-strain behavior for each test. The yield stresses predicted by the finite element analyses indicated that the yield behavior of the 2219-T87 aluminum alloy agrees with the von Mises yield criterion.
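
    For reference, the von Mises criterion referred to above reduces, for a plane-stress biaxial state with principal stresses s1 and s2, to an equivalent stress sqrt(s1^2 - s1*s2 + s2^2). The short sketch below evaluates it for a few illustrative stress states against an assumed uniaxial yield stress; the numbers are placeholders, not the reported test data.

```python
# Von Mises equivalent stress for a plane-stress biaxial state (sigma_3 = 0),
# compared against an assumed uniaxial yield stress; numbers are illustrative, not test data.
import math

def von_mises_plane_stress(s1: float, s2: float) -> float:
    """Equivalent stress sqrt(s1^2 - s1*s2 + s2^2) for principal stresses s1, s2 (ksi)."""
    return math.sqrt(s1**2 - s1 * s2 + s2**2)

sigma_yield = 57.0                    # assumed uniaxial yield stress, ksi (placeholder)
for s1, s2 in [(50.0, 0.0), (45.0, 45.0), (40.0, -20.0)]:
    sv = von_mises_plane_stress(s1, s2)
    print(f"s1={s1:5.1f}, s2={s2:5.1f} ksi -> von Mises {sv:5.1f} ksi, "
          f"{'yields' if sv >= sigma_yield else 'elastic'}")
```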

  5. Buckling Design and Analysis of a Payload Fairing One-Sixth Cylindrical Arc-Segment Panel

    NASA Technical Reports Server (NTRS)

    Kosareo, Daniel N.; Oliver, Stanley T.; Bednarcyk, Brett A.

    2013-01-01

    Design and analysis results are reported for a panel that is a one-sixth arc-segment of a full 33-ft diameter cylindrical barrel section of a payload fairing structure. Six such panels could be used to construct the fairing barrel, and, as such, compression buckling testing of a one-sixth arc-segment panel would serve as a validation test of the buckling analyses used to design the fairing panels. In this report, linear and nonlinear buckling analyses have been performed using finite element software for one-sixth arc-segment panels composed of aluminum honeycomb core with graphite/epoxy composite facesheets and an alternative fiber reinforced foam (FRF) composite sandwich design. The cross sections of both concepts were sized to represent realistic Space Launch System (SLS) Payload Fairing panels. Based on shell-based linear buckling analyses, smaller, more manageable buckling test panel dimensions were determined such that the panel would still be expected to buckle with a circumferential (as opposed to column-like) mode with significant separation between the first and second buckling modes. More detailed nonlinear buckling analyses were then conducted for honeycomb panels of various sizes using both Abaqus and ANSYS finite element codes, and for the smaller size panel, a solid-based finite element analysis was conducted. Finally, for the smaller size FRF panel, nonlinear buckling analysis was performed wherein geometric imperfections measured from an actual manufactured FRF panel were included. It was found that the measured imperfection did not significantly affect the panel's predicted buckling response.

  6. Characterization and Analyses of Valves, Feed Lines and Tanks used in Propellant Delivery Systems at NASA SSC

    NASA Technical Reports Server (NTRS)

    Ryan, Harry M.; Coote, David J.; Ahuja, Vineet; Hosangadi, Ashvin

    2006-01-01

    Accurate modeling of liquid rocket engine test processes involves assessing critical fluid mechanic and heat and mass transfer mechanisms within a cryogenic environment, and accurately modeling fluid properties such as vapor pressure and liquid and gas densities as a function of pressure and temperature. The Engineering and Science Directorate at the NASA John C. Stennis Space Center has developed and implemented such analytic models and analysis processes that have been used over a broad range of thermodynamic systems and resulted in substantial improvements in rocket propulsion testing services. In this paper, we offer an overview of the analyses techniques used to simulate pressurization and propellant fluid systems associated with the test stands at the NASA John C. Stennis Space Center. More specifically, examples of the global performance (one-dimensional) of a propellant system are provided as predicted using the Rocket Propulsion Test Analysis (RPTA) model. Computational fluid dynamic (CFD) analyses utilizing multi-element, unstructured, moving grid capability of complex cryogenic feed ducts, transient valve operation, and pressurization and mixing in propellant tanks are provided as well.

  7. Rocket-Based Combined Cycle Engine Technology Development: Inlet CFD Validation and Application

    NASA Technical Reports Server (NTRS)

    DeBonis, J. R.; Yungster, S.

    1996-01-01

    A CFD methodology has been developed for inlet analyses of Rocket-Based Combined Cycle (RBCC) Engines. A full Navier-Stokes analysis code, NPARC, was used in conjunction with pre- and post-processing tools to obtain a complete description of the flow field and integrated inlet performance. This methodology was developed and validated using results from a subscale test of the inlet to a RBCC 'Strut-Jet' engine performed in the NASA Lewis 1 x 1 ft. supersonic wind tunnel. Results obtained from this study include analyses at flight Mach numbers of 5 and 6 for super-critical operating conditions. These results showed excellent agreement with experimental data. The analysis tools were also used to obtain pre-test performance and operability predictions for the RBCC demonstrator engine planned for testing in the NASA Lewis Hypersonic Test Facility. This analysis calculated the baseline fuel-off internal force of the engine which is needed to determine the net thrust with fuel on.

  8. Variable Stiffness Panel Structural Analyses With Material Nonlinearity and Correlation With Tests

    NASA Technical Reports Server (NTRS)

    Wu, K. Chauncey; Gurdal, Zafer

    2006-01-01

    Results from structural analyses of three tow-placed AS4/977-3 composite panels with both geometric and material nonlinearities are presented. Two of the panels have variable stiffness layups where the fiber orientation angle varies as a continuous function of location on the panel planform. One variable stiffness panel has overlapping tow bands of varying thickness, while the other has a theoretically uniform thickness. The third panel has a conventional uniform-thickness [±45]5s layup with straight fibers, providing a baseline for comparing the performance of the variable stiffness panels. Parametric finite element analyses including nonlinear material shear are first compared with material characterization test results for two orthotropic layups. This nonlinear material model is incorporated into structural analysis models of the variable stiffness and baseline panels with applied end shortenings. Measured geometric imperfections and mechanical prestresses, generated by forcing the variable stiffness panels from their cured anticlastic shapes into their flatter test configurations, are also modeled. Results of these structural analyses are then compared to the measured panel structural response. Good correlation is observed between the analysis results and displacement test data throughout deep postbuckling up to global failure, suggesting that nonlinear material behavior is an important component of the actual panel structural response.

  9. Modeling and Analysis of Structural Dynamics for a One-Tenth Scale Model NGST Sunshield

    NASA Technical Reports Server (NTRS)

    Johnston, John; Lienard, Sebastien; Brodeur, Steve (Technical Monitor)

    2001-01-01

    New modeling and analysis techniques have been developed for predicting the dynamic behavior of the Next Generation Space Telescope (NGST) sunshield. The sunshield consists of multiple layers of pretensioned, thin-film membranes supported by deployable booms. Modeling the structural dynamic behavior of the sunshield is a challenging aspect of the problem due to the effects of membrane wrinkling. A finite element model of the sunshield was developed using an approximate engineering approach, the cable network method, to account for membrane wrinkling effects. Ground testing of a one-tenth scale model of the NGST sunshield was carried out to provide data for validating the analytical model. A series of analyses were performed to predict the behavior of the sunshield under the ground test conditions. Modal analyses were performed to predict the frequencies and mode shapes of the test article, and transient response analyses were completed to simulate impulse excitation tests. Comparison was made between analytical predictions and test measurements for the dynamic behavior of the sunshield. In general, the results show good agreement, with the analytical model correctly predicting the approximate frequency and mode shapes for the significant structural modes.

  10. Alterations in rotation thromboelastometry (ROTEM®) parameters: point-of-care testing vs analysis after pneumatic tube system transport.

    PubMed

    Martin, J; Schuster, T; Moessmer, G; Kochs, E F; Wagner, K J

    2012-10-01

    Thromboelastometry as point-of-care (POC) testing enables the analysis of the clotting process at the bedside, providing rapid results to guide haemostatic therapy. However, POC testing utilizes medical staff who are managing critically ill patients, as non-laboratory personnel may not be sufficiently trained to run the devices. To resolve these problems, thromboelastometry can be performed in the central laboratory and rapid transport of samples can be accomplished via a pneumatic tube system (PTS). This study compares thromboelastometry parameters of blood samples analysed immediately with those analysed after PTS transport. In patients with normal haemostasis, two arterial blood samples were collected from each patient (n=92) in citrated plastic tubes to investigate the assays INTEM (n=35), EXTEM (n=27), and FIBTEM (n=30). One blood sample was analysed immediately, the other sample after PTS transport. Thromboelastometry was performed using a single ROTEM(®) device. The mean clot firmness values were significantly lower for PTS samples in both the INTEM (-0.7 mm cf. -1.1 mm) and EXTEM (-1.4 cf. -1.7 mm) assays. INTEM coagulation time (CT) was significantly lower in PTS samples with a mean difference of -13 s. EXTEM CT was significantly higher in PTS samples with a mean difference of +3.9 s. Thromboelastometry parameters of blood samples analysed after PTS transport are significantly altered compared with those analysed immediately. However, in patients with normal haemostasis, the alterations were small and without clinical consequence, implying that analysis after PTS transport is an acceptable alternative to prompt analysis at the bedside. Further studies should focus on patients with impaired haemostasis.

  11. Dual nozzle aerodynamic and cooling analysis study. [dual throat and dual expander nozzles

    NASA Technical Reports Server (NTRS)

    Meagher, G. M.

    1980-01-01

    Geometric, aerodynamic flow field, performance prediction, and heat transfer analyses are considered for two advanced chamber nozzle concepts applicable to Earth-to-orbit engine systems. Topics covered include improvements to the dual throat aerodynamic and performance prediction program; geometric and flow field analyses of the dual expander concept; heat transfer analysis of both concepts, and engineering analysis of data from the NASA/MSFC hot-fire testing of a dual throat thruster model thrust chamber assembly. Preliminary results obtained are presented in graphs.

  12. Reporting of analyses from randomized controlled trials with multiple arms: a systematic review.

    PubMed

    Baron, Gabriel; Perrodeau, Elodie; Boutron, Isabelle; Ravaud, Philippe

    2013-03-27

    Multiple-arm randomized trials can be more complex in their design, data analysis, and result reporting than two-arm trials. We conducted a systematic review to assess the reporting of analyses in reports of randomized controlled trials (RCTs) with multiple arms. The literature in the MEDLINE database was searched for reports of RCTs with multiple arms published in 2009 in the core clinical journals. Two reviewers extracted data using a standardized extraction form. In total, 298 reports were identified. Descriptions of the baseline characteristics and outcomes per group were missing in 45 reports (15.1%) and 48 reports (16.1%), respectively. More than half of the articles (n = 171, 57.4%) reported that a planned global test comparison was used (that is, assessment of the global differences between all groups), but 67 (39.2%) of these 171 articles did not report details of the planned analysis. Of the 116 articles reporting a global comparison test, 12 (10.3%) did not report the analysis as planned. In all, 60% of publications (n = 180) described planned pairwise test comparisons (that is, assessment of the difference between two groups), but 20 of these 180 articles (11.1%) did not report the pairwise test comparisons. Of the 204 articles reporting pairwise test comparisons, the comparisons were not planned for 44 (21.6%) of them. Less than half the reports (n = 137; 46%) provided baseline and outcome data per arm and reported the analysis as planned. Our findings highlight discrepancies between the planning and reporting of analyses in reports of multiple-arm trials.

  13. Analytical performance, agreement and user-friendliness of six point-of-care testing urine analysers for urinary tract infection in general practice.

    PubMed

    Schot, Marjolein J C; van Delft, Sanne; Kooijman-Buiting, Antoinette M J; de Wit, Niek J; Hopstaken, Rogier M

    2015-05-18

    Various point-of-care testing (POCT) urine analysers are commercially available for routine urine analysis in general practice. The present study compares analytical performance, agreement and user-friendliness of six different POCT urine analysers for diagnosing urinary tract infection in general practice. All testing procedures were performed at a diagnostic centre for primary care in the Netherlands. Urine samples were collected at four general practices. Analytical performance and agreement of the POCT analysers regarding nitrite, leucocytes and erythrocytes, with the laboratory reference standard, was the primary outcome measure, and analysed by calculating sensitivity, specificity, positive and negative predictive value, and Cohen's κ coefficient for agreement. Secondary outcome measures were the user-friendliness of the POCT analysers, in addition to other characteristics of the analysers. The following six POCT analysers were evaluated: Uryxxon Relax (Macherey Nagel), Urisys 1100 (Roche), Clinitek Status (Siemens), Aution 11 (Menarini), Aution Micro (Menarini) and Urilyzer (Analyticon). Analytical performance was good for all analysers. Compared with laboratory reference standards, overall agreement was good, but differed per parameter and per analyser. Concerning the nitrite test, the most important test for clinical practice, all but one showed perfect agreement with the laboratory standard. For leucocytes and erythrocytes specificity was high, but sensitivity was considerably lower. Agreement for leucocytes varied between good to very good, and for the erythrocyte test between fair and good. First-time users indicated that the analysers were easy to use. They expected higher productivity and accuracy when using these analysers in daily practice. The overall performance and user-friendliness of all six commercially available POCT urine analysers was sufficient to justify routine use in suspected urinary tract infections in general practice. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
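
    The agreement statistics named in this record (sensitivity, specificity, predictive values and Cohen's kappa) can be computed directly from a 2x2 table of analyser results against the laboratory reference. The sketch below does so for a hypothetical nitrite table; the counts are illustrative and not taken from the study.

```python
# Agreement metrics for one dipstick parameter against the laboratory reference,
# computed from a hypothetical 2x2 table (counts are illustrative, not study data).
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    n = tp + fp + fn + tn
    po = (tp + tn) / n                                            # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2   # chance agreement
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "kappa": (po - pe) / (1 - pe),                            # Cohen's kappa
    }

print(diagnostic_metrics(tp=38, fp=4, fn=6, tn=152))
```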

  14. Ecological Momentary Assessments and Automated Time Series Analysis to Promote Tailored Health Care: A Proof-of-Principle Study.

    PubMed

    van der Krieke, Lian; Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith Gm; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter

    2015-08-07

    Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher's tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Results suggest that automated analysis and interpretation of times series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis. Analysis of additional datasets is needed in order to validate and refine the application for general use.
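
    As a sketch of the workflow this record describes (fit candidate VAR models to EMA series, choose a lag order by information criteria, then inspect Granger causality), the example below uses statsmodels on simulated mood and activity series. It is not the AutoVAR application itself, and the data-generating process is invented for the illustration.

```python
# Automated-VAR-style sketch: fit vector autoregressions to simulated EMA series,
# pick the lag order by information criteria and run a Granger causality test.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)
n = 90                                            # e.g. 90 daily EMA measurements
mood = np.zeros(n)
activity = np.zeros(n)
for t in range(1, n):                             # simple data-generating process
    activity[t] = 0.5 * activity[t - 1] + rng.normal(scale=1.0)
    mood[t] = 0.4 * mood[t - 1] + 0.3 * activity[t - 1] + rng.normal(scale=1.0)

data = pd.DataFrame({"mood": mood, "activity": activity})
model = VAR(data)
order = model.select_order(maxlags=5)             # AIC/BIC comparison over candidate lags
results = model.fit(order.aic)
print(results.summary())
print(results.test_causality("mood", ["activity"], kind="f").summary())
```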

  15. Ecological Momentary Assessments and Automated Time Series Analysis to Promote Tailored Health Care: A Proof-of-Principle Study

    PubMed Central

    Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith GM; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter

    2015-01-01

    Background Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. Objective This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. Methods We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher’s tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). Results An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Conclusions Results suggest that automated analysis and interpretation of times series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis. Analysis of additional datasets is needed in order to validate and refine the application for general use. PMID:26254160

  16. Real Cost-Benefit Analysis Is Needed in American Public Education

    ERIC Educational Resources Information Center

    Stoneberg, Bert D.

    2015-01-01

    Public school critics often point to rising expenditures and relatively flat test scores to justify their school reform agendas. The claims are flawed because their analyses fail to account for the difference in data types between dollars (ratio) and test scores (interval). A cost-benefit analysis using dollars as a common metric for both costs…

  17. Analysis of NSWC Ocean EM Observatory Test Data

    DTIC Science & Technology

    2016-09-01

    Subject terms: magnetic anomaly detection (MAD), oceanographic magnetic fields, coherence, magnetic noise reduction. Only report-documentation fragments of this record are recoverable; the contents cover deployment locations, analyses of the NSWC Ocean EM Observatory test data, analysis of magnetic data, and an appendix of Feb 11 underwater magnetic data.

  18. Longitudinal and Cross-Sectional Analyses of Visual Field Progression in Participants of the Ocular Hypertension Treatment Study (OHTS)

    PubMed Central

    Chauhan, Balwantray C; Keltner, John L; Cello, Kim E; Johnson, Chris A; Anderson, Douglas R; Gordon, Mae O; Kass, Michael A

    2014-01-01

    Purpose Visual field progression can be determined by evaluating the visual field by serial examinations (longitudinal analysis), or by a change in classification derived from comparison to age-matched normal data in single examinations (cross-sectional analysis). We determined the agreement between these two approaches in data from the Ocular Hypertension Treatment Study (OHTS). Methods Visual field data from 3088 eyes of 1570 OHTS participants (median follow-up 7 yrs, 15 tests with static automated perimetry) were analysed. Longitudinal analyses were performed with change probability with total and pattern deviation, and cross-sectional analysis with Glaucoma Hemifield Test, Corrected Pattern Standard Deviation, and Mean Deviation. The rates of Mean Deviation and General Height change were compared to estimate the degree of diffuse loss in emerging glaucoma. Results The agreement on progression in longitudinal and cross-sectional analyses ranged from 50% to 61% and remained nearly constant across a wide range of criteria. In contrast, the agreement on absence of progression ranged from 97% to 99.7%, being highest for the stricter criteria. Analyses of pattern deviation were more conservative than total deviation, with a 3 to 5 times lesser incidence of progression. Most participants developing field loss had both diffuse and focal change. Conclusions Despite considerable overall agreement, between 40 to 50% of eyes identified as having progressed with either longitudinal or cross-sectional analyses were identified with only one of the analyses. Because diffuse change is part of early glaucomatous damage, pattern deviation analyses may underestimate progression in patients with ocular hypertension. PMID:21149774

  19. Analysis and testing of aeroelastic model stability augmentation systems. [for supersonic transport aircraft wing and B-52 aircraft control system

    NASA Technical Reports Server (NTRS)

    Sevart, F. D.; Patel, S. M.

    1973-01-01

    Testing and evaluation of a stability augmentation system for aircraft flight control were performed. The flutter suppression system and synthesis conducted on a scale model of a supersonic wing for a transport aircraft are discussed. Mechanization and testing of the leading and trailing edge surface actuation systems are described. The ride control system analyses for a 375,000 pound gross weight B-52E aircraft are presented. Analyses of the B-52E aircraft maneuver load control system are included.

  20. MGAS: a powerful tool for multivariate gene-based genome-wide association analysis.

    PubMed

    Van der Sluis, Sophie; Dolan, Conor V; Li, Jiang; Song, Youqiang; Sham, Pak; Posthuma, Danielle; Li, Miao-Xin

    2015-04-01

    Standard genome-wide association studies, testing the association between one phenotype and a large number of single nucleotide polymorphisms (SNPs), are limited in two ways: (i) traits are often multivariate, and analysis of composite scores entails loss in statistical power and (ii) gene-based analyses may be preferred, e.g. to decrease the multiple testing problem. Here we present a new method, multivariate gene-based association test by extended Simes procedure (MGAS), that allows gene-based testing of multivariate phenotypes in unrelated individuals. Through extensive simulation, we show that under most trait-generating genotype-phenotype models MGAS has superior statistical power to detect associated genes compared with gene-based analyses of univariate phenotypic composite scores (i.e. GATES, multiple regression), and multivariate analysis of variance (MANOVA). Re-analysis of metabolic data revealed 32 False Discovery Rate controlled genome-wide significant genes, and 12 regions harboring multiple genes; of these 44 regions, 30 were not reported in the original analysis. MGAS allows researchers to conduct their multivariate gene-based analyses efficiently, and without the loss of power that is often associated with an incorrectly specified genotype-phenotype models. MGAS is freely available in KGG v3.0 (http://statgenpro.psychiatry.hku.hk/limx/kgg/download.php). Access to the metabolic dataset can be requested at dbGaP (https://dbgap.ncbi.nlm.nih.gov/). The R-simulation code is available from http://ctglab.nl/people/sophie_van_der_sluis. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
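
    The gene-based test in this record builds on the Simes procedure for combining per-SNP p-values within a gene; GATES and MGAS refine it by replacing the raw SNP count with an effective number of independent tests estimated from linkage disequilibrium. The sketch below shows only the plain Simes combination step with hypothetical p-values.

```python
# Gene-based p-value by the Simes procedure: combine per-SNP p-values within a gene.
# MGAS/GATES extend this idea with an LD-based effective number of tests; this sketch
# shows only the plain Simes step on hypothetical per-SNP p-values.
import numpy as np

def simes_gene_p(snp_pvalues) -> float:
    p = np.sort(np.asarray(snp_pvalues, dtype=float))
    m = len(p)
    ranks = np.arange(1, m + 1)
    return float(np.min(m * p / ranks))

print(simes_gene_p([0.002, 0.04, 0.11, 0.30, 0.76]))   # gene-level p-value
```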

  1. Data Analysis for the LISA Pathfinder Mission

    NASA Technical Reports Server (NTRS)

    Thorpe, James Ira

    2009-01-01

    The LTP (LISA Technology Package) is the core part of the Laser Interferometer Space Antenna (LISA) Pathfinder mission. The main goal of the mission is to study the sources of any disturbances that perturb the motion of the freely-falling test masses from their geodesic trajectories, as well as to test various technologies needed for LISA. The LTP experiment is designed as a sequence of experimental runs in which the performance of the instrument is studied and characterized under different operating conditions. In order to best optimize subsequent experimental runs, each run must be promptly analysed to ensure that the following ones make best use of the available knowledge of the instrument. In order to do this, all analyses must be designed and tested in advance of the mission and have sufficient built-in flexibility to account for unexpected results or behaviour. To support this activity, a robust and flexible data analysis software package is also required. This poster presents two of the main components that make up the data analysis effort: the data analysis software and the mock-data challenges used to validate analysis procedures and experiment designs.

  2. Tissue microarrays and quantitative tissue-based image analysis as a tool for oncology biomarker and diagnostic development.

    PubMed

    Dolled-Filhart, Marisa P; Gustavson, Mark D

    2012-11-01

    Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.

  3. Driving performance analysis of the ACAS FOT data and recommendations for a driving workload manager.

    DOT National Transportation Integrated Search

    2006-12-01

    This report contains analyses of driving performance data from the Advanced Collision Avoidance System (ACAS) Field Operational Test (FOT), with data from nearly 100 drivers and over 100,000 miles of driving. The analyses compared normal and distract...

  4. [Confrontation of knowledge on alcohol concentration in blood and in exhaled air].

    PubMed

    Bauer, Miroslav; Bauerová, Jiřina; Šikuta, Ján; Šidlo, Jozef

    2015-01-01

    The authors give a brief historical overview of the development of experimental alcohology in the former Czechoslovakia. Particular attention is paid to tests of work quality control in toxicological laboratories. Information is presented on the results of control tests of blood samples using gas chromatography in Slovakia and within the world-wide study "Eurotox 1990". The pitfalls of objectively evaluating analysis results that interpret alcohol concentration in biological materials are pointed out, together with the associated need to eliminate the negative influence of the human factor. The authors recommend performing analyses of alcohol in biological materials only at accredited workplaces and, when samples are stored, ensuring mandatory inhibition of the phosphorylation process. The reasons for numerical differences between analyses when taking evidence of alcohol in blood and in exhaled air are analysed. The authors confirm the accuracy of analysis using gas chromatography along with breath analysers of exhaled air. They highlight the need to make analysis results more objective through confrontation with the results of clinical examination and with the examined circumstances. The authors suggest a method for reducing the influence of the human factor, the most frequent source of inaccuracy, to a tolerable level (safety factor), together with the need to analyse samples by two mutually independent methods or to analyse two different biological materials.

  5. COMPADRE: an R and web resource for pathway activity analysis by component decompositions.

    PubMed

    Ramos-Rodriguez, Roberto-Rafael; Cuevas-Diaz-Duran, Raquel; Falciani, Francesco; Tamez-Peña, Jose-Gerardo; Trevino, Victor

    2012-10-15

    The analysis of biological networks has become essential to study functional genomic data. Compadre is a tool to estimate pathway/gene sets activity indexes using sub-matrix decompositions for biological networks analyses. The Compadre pipeline also includes one of the direct uses of activity indexes to detect altered gene sets. For this, the gene expression sub-matrix of a gene set is decomposed into components, which are used to test differences between groups of samples. This procedure is performed with and without differentially expressed genes to decrease false calls. During this process, Compadre also performs an over-representation test. Compadre already implements four decomposition methods [principal component analysis (PCA), Isomaps, independent component analysis (ICA) and non-negative matrix factorization (NMF)], six statistical tests (t- and f-test, SAM, Kruskal-Wallis, Welch and Brown-Forsythe), several gene sets (KEGG, BioCarta, Reactome, GO and MsigDB) and can be easily expanded. Our simulation results shown in Supplementary Information suggest that Compadre detects more pathways than over-representation tools like David, Babelomics and Webgestalt and less false positives than PLAGE. The output is composed of results from decomposition and over-representation analyses providing a more complete biological picture. Examples provided in Supplementary Information show the utility, versatility and simplicity of Compadre for analyses of biological networks. Compadre is freely available at http://bioinformatica.mty.itesm.mx:8080/compadre. The R package is also available at https://sourceforge.net/p/compadre.
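
    The core idea described here, deriving a pathway activity index from a decomposition of the gene-set expression sub-matrix and testing it between groups, can be sketched with a single principal component as the index. The example below uses simulated data and scikit-learn's PCA; it is a conceptual illustration, not the Compadre implementation.

```python
# Sketch of a pathway activity index by matrix decomposition: take the expression
# sub-matrix of one gene set, extract a principal component as the activity index,
# then test it between sample groups. Data and gene-set membership are simulated.
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
expr = rng.normal(size=(40, 500))          # samples x genes
expr[20:, :25] += 1.0                      # genes 0-24 shifted in the second group
groups = np.array([0] * 20 + [1] * 20)
gene_set = np.arange(25)                   # indices of the pathway's genes

activity = PCA(n_components=1).fit_transform(expr[:, gene_set]).ravel()
t, p = stats.ttest_ind(activity[groups == 0], activity[groups == 1])
print(f"pathway activity difference: t = {t:.2f}, p = {p:.3g}")
```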

  6. Finite Elements Analysis of a Composite Semi-Span Test Article With and Without Discrete Damage

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Jegley, Dawn C. (Technical Monitor)

    2000-01-01

    AS&M Inc. performed finite element analysis, with and without discrete damage, of a composite semi-span test article that represents the Boeing 220-passenger transport aircraft composite semi-span test article. A NASTRAN bulk data file and drawings of the test mount fixtures and semi-span components were utilized to generate the baseline finite element model. In this model, the stringer blades are represented by shell elements, and the stringer flanges are combined with the skin. Numerous modeling modifications and discrete source damage scenarios were applied to the test article model throughout the course of the study. This report details the analysis method and results obtained from the composite semi-span study. Analyses were carried out for three load cases: Braked Roll, 1.0G Down-Bending and 2.5G Up-Bending. These analyses included linear and nonlinear static response, as well as linear and nonlinear buckling response. Results are presented in the form of stress and strain plots, factors of safety for failed elements, buckling loads and modes, deflection prediction tables and plots, and strain gage prediction tables and plots. The collected results are presented within this report for comparison to test results.

  7. Kruskal-Wallis test: BASIC computer program to perform nonparametric one-way analysis of variance and multiple comparisons on ranks of several independent samples.

    PubMed

    Theodorsson-Norheim, E

    1986-08-01

    Multiple t tests at a fixed p level are frequently used to analyse biomedical data where analysis of variance followed by multiple comparisons, or the adjustment of the p values according to Bonferroni, would be more appropriate. The Kruskal-Wallis test is a nonparametric 'analysis of variance' which may be used to compare several independent samples. The present program is written in an elementary subset of BASIC and will perform the Kruskal-Wallis test followed by multiple comparisons between the groups on practically any computer programmable in BASIC.
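
    A present-day equivalent of the workflow the BASIC program performs can be written in a few lines of Python: a Kruskal-Wallis test across the independent samples followed by pairwise comparisons with a Bonferroni adjustment. The sketch below substitutes pairwise Mann-Whitney tests for the program's rank-based multiple comparisons, and the data values are illustrative.

```python
# Kruskal-Wallis test over several independent samples followed by Bonferroni-adjusted
# pairwise comparisons (a modern sketch of the workflow, not the original BASIC code).
from itertools import combinations
from scipy import stats

groups = {
    "control": [4.1, 5.0, 4.7, 5.3, 4.9],
    "dose_a":  [6.2, 5.8, 6.5, 6.0, 6.9],
    "dose_b":  [7.1, 7.8, 6.9, 7.4, 8.0],
}

h, p = stats.kruskal(*groups.values())
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.4f}")

pairs = list(combinations(groups, 2))
for a, b in pairs:                           # post-hoc pairwise comparisons
    u, p_pair = stats.mannwhitneyu(groups[a], groups[b], alternative="two-sided")
    print(f"{a} vs {b}: p = {min(1.0, p_pair * len(pairs)):.4f} (Bonferroni-adjusted)")
```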

  8. In-Space Transportation with Tethers

    NASA Technical Reports Server (NTRS)

    Lorenzini, Enrico; Estes, Robert D.; Cosmo, Mario L.

    1998-01-01

    The annual report covers the research conducted on the following topics related to the use of spaceborne tethers for in-space transportation: ProSEDS tether modeling (current collection analyses, influence of a varying tether temperature); ProSEDS mission analysis and system dynamics (tether thermal model, thermo-electro-dynamics integrated simulations); ProSEDS tether development and testing (tether requirements, deployment test plan, tether properties testing, deployment tests); and tethers for reboosting the space-based laser (mission analysis, tether system preliminary design, evaluation of attitude constraints).

  9. DKIST enclosure modeling and verification during factory assembly and testing

    NASA Astrophysics Data System (ADS)

    Larrakoetxea, Ibon; McBride, William; Marshall, Heather K.; Murga, Gaizka

    2014-08-01

    The Daniel K. Inouye Solar Telescope (DKIST, formerly the Advanced Technology Solar Telescope, ATST) is unique as, apart from protecting the telescope and its instrumentation from the weather, it holds the entrance aperture stop and is required to position it with millimeter-level accuracy. The compliance of the Enclosure design with the requirements, as of Final Design Review in January 2012, was supported by mathematical models and other analyses which included structural and mechanical analyses (FEA), control models, ventilation analysis (CFD), thermal models, reliability analysis, etc. During the Enclosure Factory Assembly and Testing the compliance with the requirements has been verified using the real hardware and the models created during the design phase have been revisited. The tests performed during shutter mechanism subsystem (crawler test stand) functional and endurance testing (completed summer 2013) and two comprehensive system-level factory acceptance testing campaigns (FAT#1 in December 2013 and FAT#2 in March 2014) included functional and performance tests on all mechanisms, off-normal mode tests, mechanism wobble tests, creation of the Enclosure pointing map, control system tests, and vibration tests. The comparison of the assumptions used during the design phase with the properties measured during the test campaign provides an interesting reference for future projects.

  10. Cost-effectiveness of point-of-care testing for dehydration in the pediatric ED.

    PubMed

    Whitney, Rachel E; Santucci, Karen; Hsiao, Allen; Chen, Lei

    2016-08-01

    Acute gastroenteritis (AGE) and subsequent dehydration account for a large proportion of pediatric emergency department (PED) visits. Point-of-care (POC) testing has been used in conjunction with clinical assessment to determine the degree of dehydration. Despite the wide acceptance of POC testing, little formal cost-effective analysis of POC testing in the PED exists. We aim to examine the cost-effectiveness of using POC electrolyte testing vs traditional serum chemistry testing in the PED for children with AGE. This was a cost-effective analysis using data from a randomized control trial of children with AGE. A decision analysis model was constructed to calculate cost-savings from the point of view of the payer and the provider. We used parameters obtained from the trial, including cost of testing, admission rates, cost of admission, and length of stay. Sensitivity analyses were performed to evaluate the stability of our model. Using the data set of 225 subjects, POC testing results in a cost savings of $303.30 per patient compared with traditional serum testing from the point of the view of the payer. From the point-of-view of the provider, POC testing results in consistent mean savings of $36.32 ($8.29-$64.35) per patient. Sensitivity analyses demonstrated the stability of the model and consistent savings. This decision analysis provides evidence that POC testing in children with gastroenteritis-related moderate dehydration results in significant cost savings from the points of view of payers and providers compared to traditional serum chemistry testing. Copyright © 2016 Elsevier Inc. All rights reserved.
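
    The decision-analysis structure described in this record reduces, in its simplest form, to comparing expected costs per strategy: test cost plus the probability of admission times the cost of admission. The toy sketch below makes that arithmetic explicit; every probability and cost is a placeholder, not a value from the trial.

```python
# Toy decision-tree sketch of the payer-perspective comparison described above:
# expected cost per strategy = test cost + P(admission) * admission cost.
# All probabilities and costs are made-up placeholders, not the trial's data.
def expected_cost(test_cost: float, p_admit: float, admit_cost: float) -> float:
    return test_cost + p_admit * admit_cost

poc = expected_cost(test_cost=10.0, p_admit=0.18, admit_cost=3000.0)
lab = expected_cost(test_cost=25.0, p_admit=0.27, admit_cost=3000.0)
print(f"expected cost: POC ${poc:.2f}, lab ${lab:.2f}, savings ${lab - poc:.2f} per patient")
```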

  11. Posttest Analyses of the Steel Containment Vessel Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Costello, J.F.; Hessheimer, M.F.; Ludwigsen, J.S.

    A high pressure test of a scale model of a steel containment vessel (SCV) was conducted on December 11-12, 1996 at Sandia National Laboratories, Albuquerque, NM, USA. The test model is a mixed-scaled model (1:10 in geometry and 1:4 in shell thickness) of an improved Mark II boiling water reactor (BWR) containment. This test is part of a program to investigate the response of representative models of nuclear containment structures to pressure loads beyond the design basis accident. The posttest analyses of this test focused on three areas where the pretest analysis effort did not adequately predict the model behavior during the test. These areas are the onset of global yielding, the strain concentrations around the equipment hatch and the strain concentrations that led to a small tear near a weld relief opening that was not modeled in the pretest analysis.

  12. Special environmental control and life support equipment test analyses and hardware

    NASA Technical Reports Server (NTRS)

    Callahan, David M.

    1995-01-01

    This final report summarizes events under contract NAS8-38250, 'Special Environmental Control and Life Support Systems Test Analysis and Hardware'. The report covers both technical and programmatic developments. Key to the success of this contract was the evaluation of Environmental Control and Life Support Systems (ECLSS) test results via sophisticated laboratory analysis capabilities. The history of the contract, including all subcontracts, is presented first, followed by the support and development work performed under each task.

  13. IN SITU FIELD TESTING OF PROCESSES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.S.Y. YANG

    2004-11-08

    The purpose of this scientific analysis report is to update and document the data and subsequent analyses from ambient field-testing activities performed in underground drifts and surface-based boreholes through unsaturated zone (UZ) tuff rock units. In situ testing, monitoring, and associated laboratory studies are conducted to directly assess and evaluate the waste emplacement environment and the natural barriers to radionuclide transport at Yucca Mountain. This scientific analysis report supports and provides data to UZ flow and transport model reports, which in turn contribute to the Total System Performance Assessment (TSPA) of Yucca Mountain, an important document for the license application (LA). The objectives of ambient field-testing activities are described in Section 1.1. This report is the third revision (REV 03), which supersedes REV 02. The scientific analysis of data for inputs to model calibration and validation as documented in REV 02 was developed in accordance with the Technical Work Plan (TWP) "Technical Work Plan for: Performance Assessment Unsaturated Zone" (BSC 2004 [DIRS 167969]). This revision was developed in accordance with the "Technical Work Plan for: Unsaturated Zone Flow Analysis and Model Report Integration" (BSC 2004 [DIRS 169654], Section 1.2.4) for better integrated, consistent, transparent, traceable, and more complete documentation in this scientific analysis report and associated UZ flow and transport model reports. No additional testing or analyses were performed as part of this revision. The list of relevant acceptance criteria is provided by "Technical Work Plan for: Unsaturated Zone Flow Analysis and Model Report Integration" (BSC 2004 [DIRS 169654]), Table 3-1. Additional deviations from the TWP regarding the features, events, and processes (FEPs) list are discussed in Section 1.3. Documentation in this report includes descriptions of how, and under what conditions, the tests were conducted. The descriptions and analyses provide data useful for refining and confirming the understanding of flow, drift seepage, and transport processes in the UZ. The UZ testing activities included measurement of permeability distribution, quantification of the seepage of water into the drifts, evaluation of fracture-matrix interaction, study of flow along faults, testing of flow and transport between drifts, characterization of hydrologic heterogeneity along drifts, estimation of drying effects on the rock surrounding the drifts due to ventilation, monitoring of moisture conditions in open and sealed drifts, and determination of the degree of minimum construction water migration below drift. These field tests were conducted in two underground drifts at Yucca Mountain, the Exploratory Studies Facility (ESF) drift, and the cross-drift for Enhanced Characterization of the Repository Block (ECRB), as described in Section 1.2. Samples collected in boreholes and underground drifts have been used for additional hydrochemical and isotopic analyses for additional understanding of the UZ setting. The UZ transport tests conducted at the nearby Busted Butte site (see Figure 1-4) are also described in this scientific analysis report.

  14. Impact and fracture analysis of fish scales from Arapaima gigas.

    PubMed

    Torres, F G; Malásquez, M; Troncoso, O P

    2015-06-01

    Fish scales from the Amazonian fish Arapaima gigas have been characterised to study their impact and fracture behaviour at three different environmental conditions. Scales were cut in two different directions to analyse the influence of the orientation of collagen layers. The energy absorbed during impact tests was measured for each sample and SEM images were taken after each test in order to analyse the failure mechanisms. The results showed that scales tested at cryogenic temperatures display fragile behaviour, while scales tested at room temperature did not fracture. Different failure mechanisms have been identified, analysed and compared with the failure modes that occur in bone. The impact energy obtained for fish scales was two to three times higher than the values reported for bone in the literature. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Meta-analysis of gene-level tests for rare variant association.

    PubMed

    Liu, Dajiang J; Peloso, Gina M; Zhan, Xiaowei; Holmen, Oddgeir L; Zawistowski, Matthew; Feng, Shuang; Nikpay, Majid; Auer, Paul L; Goel, Anuj; Zhang, He; Peters, Ulrike; Farrall, Martin; Orho-Melander, Marju; Kooperberg, Charles; McPherson, Ruth; Watkins, Hugh; Willer, Cristen J; Hveem, Kristian; Melander, Olle; Kathiresan, Sekar; Abecasis, Gonçalo R

    2014-02-01

    The majority of reported complex disease associations for common genetic variants have been identified through meta-analysis, a powerful approach that enables the use of large sample sizes while protecting against common artifacts due to population structure and repeated small-sample analyses sharing individual-level data. As the focus of genetic association studies shifts to rare variants, genes and other functional units are becoming the focus of analysis. Here we propose and evaluate new approaches for performing meta-analysis of rare variant association tests, including burden tests, weighted burden tests, variable-threshold tests and tests that allow variants with opposite effects to be grouped together. We show that our approach retains useful features from single-variant meta-analysis approaches and demonstrate its use in a study of blood lipid levels in ∼18,500 individuals genotyped with exome arrays.
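
    The per-study statistics that such meta-analyses combine typically come from burden-style tests, in which a gene's rare variants are collapsed into a single score per subject and regressed against the phenotype. The sketch below shows one such single-study burden test on simulated data; it is a generic illustration, not the authors' implementation.

```python
# Sketch of a simple per-study burden test: collapse a gene's rare variants into one
# carrier count per subject and regress the phenotype on it. Simulated data only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n, m = 2000, 12                                  # subjects, rare variants in the gene
geno = rng.binomial(1, 0.005, size=(n, m))       # rare allele carrier indicators
burden = geno.sum(axis=1)                        # burden score: rare alleles per subject
lipid = 0.4 * burden + rng.normal(size=n)        # quantitative trait with a true effect

model = sm.OLS(lipid, sm.add_constant(burden)).fit()
print(model.summary().tables[1])                 # effect estimate and p-value for burden
```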

  16. Geographic analysis of multiple sensor data from the NASA/USGS earth resources program

    NASA Technical Reports Server (NTRS)

    Pascucci, R. F.; North, G. W.; Albrizio, R. A.; Shelkin, B. D.

    1969-01-01

    Qualitative and quantitative analyses were made of multi-sensor data acquired during aircraft missions. While the principal analysis effort was concentrated on imagery taken over test sites in Southern California, data were also studied from records acquired on missions over test sites at Phoenix, Chicago, Asheville, and New Orleans. The objectives of the analyses were: (1) to determine the capabilities of ten remote sensors in identifying the elements of information necessary in conducting geographic investigations in land use analysis, urban problems, surface energy budget, and soil moisture; (2) to determine the feasibility of using these sensors for these purposes at orbital altitudes; and (3) to collate and analyze ground and air data previously collected and assemble it in a format useful in the accomplishment of cost effectiveness studies.

  17. A Model for Simulating the Response of Aluminum Honeycomb Structure to Transverse Loading

    NASA Technical Reports Server (NTRS)

    Ratcliffe, James G.; Czabaj, Michael W.; Jackson, Wade C.

    2012-01-01

    A 1-dimensional material model was developed for simulating the transverse (thickness-direction) loading and unloading response of aluminum honeycomb structure. The model was implemented as a user-defined material subroutine (UMAT) in the commercial finite element analysis code, ABAQUS(Registered TradeMark)/Standard. The UMAT has been applied to analyses for simulating quasi-static indentation tests on aluminum honeycomb-based sandwich plates. Comparison of analysis results with data from these experiments shows overall good agreement. Specifically, analyses of quasi-static indentation tests yielded accurate global specimen responses. Predicted residual indentation was also in reasonable agreement with measured values. Overall, this simple model does not involve a significant computational burden, which makes it more tractable to simulate other damage mechanisms in the same analysis.
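
    In the same spirit as the 1-dimensional model described above, a toy transverse crush law can be written as linear elastic loading up to a crush stress, a flat crush plateau, and elastic unloading from the peak strain reached. The sketch below implements that toy law; the modulus and crush stress are placeholders, not the calibrated UMAT parameters.

```python
# Toy 1-D transverse crush model: elastic loading, crush plateau, elastic unloading
# from the maximum strain seen so far. Constants are illustrative placeholders.
def transverse_stress(strain: float, max_strain: float,
                      e_mod: float = 500.0, crush_stress: float = 2.0) -> float:
    """Compressive stress (MPa) for a given strain and the peak strain seen so far."""
    crush_strain = crush_stress / e_mod
    if strain >= max_strain:                       # loading branch
        return min(e_mod * strain, crush_stress)
    # unloading branch: elastic recovery from the residual (crushed) strain
    residual = max(0.0, max_strain - crush_strain)
    return max(0.0, e_mod * (strain - residual))

history = [0.0, 0.002, 0.01, 0.05, 0.03, 0.046]    # imposed strain history
peak = 0.0
for eps in history:
    print(f"strain {eps:.3f} -> stress {transverse_stress(eps, peak):.2f} MPa")
    peak = max(peak, eps)
```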

  18. To 3D or Not to 3D, That Is the Question: Do 3D Surface Analyses Improve the Ecomorphological Power of the Distal Femur in Placental Mammals?

    PubMed Central

    Gould, Francois D. H.

    2014-01-01

    Improvements in three-dimensional imaging technologies have renewed interest in the study of functional and ecological morphology. Quantitative approaches to shape analysis are used increasingly to study form-function relationships. These methods are computationally intensive, technically demanding, and time-consuming, which may limit sampling potential. There have been few side-by-side comparisons of the effectiveness of such approaches relative to more traditional analyses using linear measurements and ratios. Morphological variation in the distal femur of mammals has been shown to reflect differences in locomotor modes across clades. Thus I tested whether a geometric morphometric analysis of surface shape was superior to a multivariate analysis of ratios for describing ecomorphological patterns in distal femoral variation. A sample of 164 mammalian specimens from 44 genera was assembled. Each genus was assigned to one of six locomotor categories. The same hypotheses were tested using two methods. Six linear measurements of the distal femur were taken with calipers, from which four ratios were calculated. A 3D model was generated with a laser scanner, and analyzed using three dimensional geometric morphometrics. Locomotor category significantly predicted variation in distal femoral morphology in both analyses. Effect size was larger in the geometric morphometric analysis than in the analysis of ratios. Ordination reveals a similar pattern with arboreal and cursorial taxa as extremes on a continuum of morphologies in both analyses. Discriminant functions calculated from the geometric morphometric analysis were more accurate than those calculated from ratios. Both analysis of ratios and geometric morphometric surface analysis reveal similar, biologically meaningful relationships between distal femoral shape and locomotor mode. The functional signal from the morphology is slightly higher in the geometric morphometric analysis. The practical costs of conducting these sorts of analyses should be weighed against potentially slight increases in power when designing protocols for ecomorphological studies. PMID:24633081

  19. Mass univariate analysis of event-related brain potentials/fields I: a critical tutorial review.

    PubMed

    Groppe, David M; Urbach, Thomas P; Kutas, Marta

    2011-12-01

    Event-related potentials (ERPs) and magnetic fields (ERFs) are typically analyzed via ANOVAs on mean activity in a priori windows. Advances in computing power and statistics have produced an alternative, mass univariate analyses consisting of thousands of statistical tests and powerful corrections for multiple comparisons. Such analyses are most useful when one has little a priori knowledge of effect locations or latencies, and for delineating effect boundaries. Mass univariate analyses complement and, at times, obviate traditional analyses. Here we review this approach as applied to ERP/ERF data and four methods for multiple comparison correction: strong control of the familywise error rate (FWER) via permutation tests, weak control of FWER via cluster-based permutation tests, false discovery rate control, and control of the generalized FWER. We end with recommendations for their use and introduce free MATLAB software for their implementation. Copyright © 2011 Society for Psychophysiological Research.
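
    A minimal sketch of the mass univariate idea described above: run a t-test at every channel and time point, then control the false discovery rate with the Benjamini-Hochberg procedure. The array shapes and injected effect are hypothetical, and this is not the MATLAB toolbox mentioned in the abstract.

    ```python
    import numpy as np
    from scipy.stats import ttest_1samp
    from statsmodels.stats.multitest import multipletests

    rng = np.random.default_rng(1)
    # Hypothetical ERP difference waves: 20 subjects x 32 channels x 200 time points.
    data = rng.normal(size=(20, 32, 200))
    data[:, 10:14, 80:120] += 0.8            # injected "effect" for illustration

    t, p = ttest_1samp(data, popmean=0.0, axis=0)          # one test per channel/time point
    reject, p_fdr, _, _ = multipletests(p.ravel(), alpha=0.05, method="fdr_bh")
    sig = reject.reshape(p.shape)
    print(f"{sig.sum()} of {sig.size} channel/time tests significant after FDR correction")
    ```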

  20. Employee Attitudes of the Organizational Culture: Assessment of a TQM implementation Process

    DTIC Science & Technology

    1990-09-01

    [Table-of-contents fragment] Analysis of Hypotheses; Null Hypotheses Ho1a-Ho1i: t-test, summary of analyses for Ho1a-Ho1i; Null Hypotheses Ho2a-Ho2i: one-way analysis of variance, summary of analyses for Ho2a-Ho2i. [Abstract fragment] ...supervisory and non-supervisory organizational members with regard to their attitudes about the organization's quality culture. Ho1a: There will be no...

  1. Impacts Analyses Supporting the National Environmental Policy Act Environmental Assessment for the Resumption of Transient Testing Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schafer, Annette L.; Brown, LLoyd C.; Carathers, David C.

    2014-02-01

    This document contains the analysis details and summary of analyses conducted to evaluate the environmental impacts for the Resumption of Transient Fuel and Materials Testing Program. It provides an assessment of the impacts for the two action alternatives being evaluated in the environmental assessment. These alternatives are (1) resumption of transient testing using the Transient Reactor Test Facility (TREAT) at Idaho National Laboratory (INL) and (2) conducting transient testing using the Annular Core Research Reactor (ACRR) at Sandia National Laboratory in New Mexico (SNL/NM). Analyses are provided for radiologic emissions, other air emissions, soil contamination, and groundwater contamination that could occur (1) during normal operations, (2) as a result of accidents in one of the facilities, and (3) during transport. It does not include an assessment of the biotic, cultural resources, waste generation, or other impacts that could result from the resumption of transient testing. Analyses were conducted by technical professionals at INL and SNL/NM as noted throughout this report. The analyses are based on bounding radionuclide inventories, with the same inventories used for test materials by both alternatives and different inventories for the TREAT Reactor and ACRR. An upper value on the number of tests was assumed, with a test frequency determined by the realistic turn-around times required between experiments. The estimates provided for impacts during normal operations are based on historical emission rates and projected usage rates; therefore, they are bounding. Estimated doses for members of the public, collocated workers, and facility workers that could be incurred as a result of an accident are very conservative. They do not credit safety systems or administrative procedures (such as evacuation plans or use of personal protective equipment) that could be used to limit worker doses. Doses estimated for transportation are conservative and are based on transport of the bounding radiologic inventory that will be contained in any given test. The transportation analysis assumes all transports will contain the bounding inventory.

  2. Thermal finite-element analysis of space shuttle main engine turbine blade

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Tong, Michael T.; Kaufman, Albert

    1987-01-01

    Finite-element, transient heat transfer analyses were performed for the first-stage blades of the space shuttle main engine (SSME) high-pressure fuel turbopump. The analyses were based on test engine data provided by Rocketdyne. Heat transfer coefficients were predicted by performing a boundary-layer analysis at steady-state conditions with the STAN5 boundary-layer code. Two different peak-temperature overshoots were evaluated for the startup transient. Cutoff transient conditions were also analyzed. A reduced gas temperature profile based on actual thermocouple data was also considered. Transient heat transfer analyses were conducted with the MARC finite-element computer code.

  3. RAS testing and cetuximab treatment for metastatic colorectal cancer: a cost-effectiveness analysis in a setting with limited health resources.

    PubMed

    Wu, Bin; Yao, Yuan; Zhang, Ke; Ma, Xuezhen

    2017-09-19

    To test the cost-effectiveness of cetuximab plus irinotecan, fluorouracil, and leucovorin (FOLFIRI) as first-line treatment in patients with metastatic colorectal cancer (mCRC) from a Chinese medical insurance perspective. Baseline analysis showed that the addition of cetuximab increased quality-adjusted life-years (QALYs) by 0.63 at an added cost of $17,086 relative to FOLFIRI chemotherapy, resulting in an incremental cost-effectiveness ratio (ICER) of $27,145/QALY. When the patient assistance program (PAP) was available, the ICER decreased to $14,049/QALY, which indicated that cetuximab is cost-effective at the Chinese willingness-to-pay threshold of $22,200/QALY. One-way sensitivity analyses showed that the median overall survival time for cetuximab was the most influential parameter. A Markov model incorporating clinical, utility, and cost data was developed to evaluate the economic outcome of cetuximab in mCRC. A lifetime horizon was used, and sensitivity analyses were carried out to test the robustness of the model results. The impact of PAP was also evaluated in scenario analyses. RAS testing with cetuximab treatment is likely to be cost-effective for patients with mCRC when PAP is available in China.
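
    The incremental cost-effectiveness ratio quoted above is simply incremental cost divided by incremental QALYs. The snippet below reproduces the reported base-case figure (to rounding, using the figures given in the abstract) and the comparison against the stated willingness-to-pay threshold; the function name is illustrative.

    ```python
    def icer(delta_cost, delta_qaly):
        """Incremental cost-effectiveness ratio (cost per QALY gained)."""
        return delta_cost / delta_qaly

    delta_cost, delta_qaly = 17_086.0, 0.63   # cetuximab + FOLFIRI vs FOLFIRI (from abstract)
    wtp_china = 22_200.0                      # willingness-to-pay threshold, $/QALY

    base_case = icer(delta_cost, delta_qaly)  # ~ $27,100/QALY; abstract reports $27,145/QALY
    print(f"base-case ICER ~ ${base_case:,.0f}/QALY; "
          f"below WTP threshold of ${wtp_china:,.0f}? {base_case <= wtp_china}")
    # With the patient assistance program the abstract reports an ICER of $14,049/QALY,
    # which falls below the threshold and is therefore judged cost-effective.
    ```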

  4. Comparative study of smile analysis by subjective and computerized methods.

    PubMed

    Basting, Roberta Tarkany; da Trindade, Rita de Cássia Silva; Flório, Flávia Martão

    2006-01-01

    This study compared: 1) the subjective analyses of a smile performed by specialists with advanced training and by general dentists; 2) the subjective analysis of the smile alone, or of the smile together with the face, by specialists with advanced training and general dentists; and 3) the subjective analyses against a computerized analysis of the smile by specialists with advanced training, verifying the midline, labial line, smile line, the line between commissures and the golden proportion. The sample consisted of 100 adults with natural dentition; 200 photographs were taken (100 of the smile and 100 of the entire face). Computerized analysis using AutoCAD software was performed, together with the subjective analyses of 2 groups of professionals (3 general dentists and 3 specialists with advanced training), using the following assessment factors: the midline, labial line, smile line, line between the commissures and the golden proportion. The smile itself and the smile associated with the entire face were recorded as being agreeable or not agreeable by the professionals. The McNemar test showed a highly significant difference (p=0.0000) between the subjective analyses performed by specialists and those performed by general dentists. Between the 2 groups of dental professionals, there were also highly significant differences (p=0.0000) between the subjective analyses of the smile and those of the face. The McNemar test showed statistical differences in all factors assessed, with the exception of the midline (p=0.1951), when the computerized analysis and the subjective analysis of the specialists were compared. It was not possible to rank the factors analyzed as more or less relevant for establishing the harmony of the smile.
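
    The McNemar test used above compares paired ratings (for example, agreeable versus not agreeable for the same photograph judged by two rater groups) by looking only at the discordant pairs. A minimal sketch with a hypothetical 2 x 2 table of paired judgments:

    ```python
    import numpy as np
    from statsmodels.stats.contingency_tables import mcnemar

    # Rows: specialists (agreeable / not agreeable); columns: general dentists (same).
    # Counts are hypothetical, for illustration only.
    table = np.array([[55, 20],
                      [ 5, 20]])
    result = mcnemar(table, exact=True)   # exact binomial test on the 20 vs 5 discordant pairs
    print(f"McNemar statistic = {result.statistic:.0f}, p = {result.pvalue:.4f}")
    ```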

  5. Biochemical phenotypes to discriminate microbial subpopulations and improve outbreak detection.

    PubMed

    Galar, Alicia; Kulldorff, Martin; Rudnick, Wallis; O'Brien, Thomas F; Stelling, John

    2013-01-01

    Clinical microbiology laboratories worldwide constitute an invaluable resource for monitoring emerging threats and the spread of antimicrobial resistance. We studied the growing number of biochemical tests routinely performed on clinical isolates to explore their value as epidemiological markers. Microbiology laboratory results from January 2009 through December 2011 from a 793-bed hospital, stored in WHONET, were examined. Variables included patient location, collection date, organism, and 47 biochemical and 17 antimicrobial susceptibility test results reported by Vitek 2. To identify biochemical tests that were particularly valuable (stable with repeat testing, but good variability across the species) or problematic (inconsistent results with repeat testing), three types of variance analyses were performed on isolates of K. pneumoniae: descriptive analysis of discordant biochemical results in same-day isolates, an average within-patient variance index, and generalized linear mixed model variance component analysis. 4,200 isolates of K. pneumoniae were identified from 2,485 patients, 32% of whom had multiple isolates. The first two variance analyses highlighted SUCT, TyrA, GlyA, and GGT as "nuisance" biochemicals for which discordant within-patient test results impacted a high proportion of patient results, while dTAG had relatively good within-patient stability with good heterogeneity across the species. Variance component analyses confirmed the relative stability of dTAG and identified additional biochemicals, such as PHOS, with a large between-patient to within-patient variance ratio. A reduced subset of biochemicals improved the robustness of strain definition for carbapenem-resistant K. pneumoniae. Surveillance analyses suggest that the reduced biochemical profile could improve the timeliness and specificity of outbreak detection algorithms. The statistical approaches explored can improve the robust recognition of microbial subpopulations with routinely available biochemical test results, of value in the timely detection of outbreak clones and evolutionarily important genetic events.

  6. Multidimensional Assessment of Phonological Similarity within and between Children

    ERIC Educational Resources Information Center

    Ingram, David; Dubasik, Virginia L.

    2011-01-01

    Multidimensional analysis involves moving away from one-dimensional analyses such as most articulation tests to comprehensive analyses involving levels of phonological information from the word level down to segments. This article outlines one such approach that looks at four levels from words to segments, using nine phonological measures. It also…

  7. Aeromechanics Analysis of a Distortion-Tolerant Fan with Boundary Layer Ingestion

    NASA Technical Reports Server (NTRS)

    Bakhle, Milind A.; Reddy, T. S. R.; Coroneos, Rula M.; Min, James B.; Provenza, Andrew J.; Duffy, Kirsten P.; Stefko, George L.; Heinlein, Gregory S.

    2018-01-01

    A propulsion system with Boundary Layer Ingestion (BLI) has the potential to significantly reduce aircraft engine fuel burn. But a critical challenge is to design a fan that can operate continuously with a persistent BLI distortion without aeromechanical failure -- flutter or high cycle fatigue due to forced response. High-fidelity computational aeromechanics analysis can be very valuable to support the design of a fan that has satisfactory aeromechanic characteristics and good aerodynamic performance and operability. Detailed aeromechanics analyses together with careful monitoring of the test article is necessary to avoid unexpected problems or failures during testing. In the present work, an aeromechanics analysis based on a three-dimensional, time-accurate, Reynolds-averaged Navier-Stokes computational fluid dynamics code is used to study the performance and aeromechanical characteristics of the fan in both circumferentially-uniform and circumferentially-varying distorted flows. Pre-test aeromechanics analyses are used to prepare for the wind tunnel test and comparisons are made with measured blade vibration data after the test. The analysis shows that the fan has low levels of aerodynamic damping at various operating conditions examined. In the test, the fan remained free of flutter except at one near-stall operating condition. Analysis could not be performed at this low mass flow rate operating condition since it fell beyond the limit of numerical stability of the analysis code. The measured resonant forced response at a specific low-response crossing indicated that the analysis under-predicted this response and work is in progress to understand possible sources of differences and to analyze other larger resonant responses. Follow-on work is also planned with a coupled inlet-fan aeromechanics analysis that will more accurately represent the interactions between the fan and BLI distortion.

  8. Gene set analysis using variance component tests.

    PubMed

    Huang, Yen-Tsung; Lin, Xihong

    2013-06-28

    Gene set analyses have become increasingly important in genomic research, as many complex diseases are contributed jointly by alterations of numerous genes. Genes often coordinate together as a functional repertoire, e.g., a biological pathway/network and are highly correlated. However, most of the existing gene set analysis methods do not fully account for the correlation among the genes. Here we propose to tackle this important feature of a gene set to improve statistical power in gene set analyses. We propose to model the effects of an independent variable, e.g., exposure/biological status (yes/no), on multiple gene expression values in a gene set using a multivariate linear regression model, where the correlation among the genes is explicitly modeled using a working covariance matrix. We develop TEGS (Test for the Effect of a Gene Set), a variance component test for the gene set effects by assuming a common distribution for regression coefficients in multivariate linear regression models, and calculate the p-values using permutation and a scaled chi-square approximation. We show using simulations that type I error is protected under different choices of working covariance matrices and power is improved as the working covariance approaches the true covariance. The global test is a special case of TEGS when correlation among genes in a gene set is ignored. Using both simulation data and a published diabetes dataset, we show that our test outperforms the commonly used approaches, the global test and gene set enrichment analysis (GSEA). We develop a gene set analyses method (TEGS) under the multivariate regression framework, which directly models the interdependence of the expression values in a gene set using a working covariance. TEGS outperforms two widely used methods, GSEA and global test in both simulation and a diabetes microarray data.
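
    As a rough illustration of a variance-component-style gene-set test, the sketch below forms a quadratic-form score statistic from the covariances between a binary exposure and each gene's expression, and assesses it by permuting the exposure labels. This is a simplified stand-in with an identity working covariance and simulated data, not the TEGS implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n, g = 60, 15                                   # samples, genes in the set
    x = rng.integers(0, 2, n)                       # binary exposure / biological status
    E = rng.normal(size=(n, g))
    E[x == 1, :5] += 0.7                            # a few genes respond to exposure

    def score_stat(x, E):
        xc = x - x.mean()
        u = xc @ (E - E.mean(axis=0))               # per-gene score contributions
        return float(u @ u)                         # quadratic form (identity working covariance)

    obs = score_stat(x, E)
    perms = np.array([score_stat(rng.permutation(x), E) for _ in range(2000)])
    p_value = (1 + np.sum(perms >= obs)) / (1 + len(perms))
    print(f"gene-set permutation p = {p_value:.4f}")
    ```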

  9. Motivations for genetic testing for lung cancer risk among young smokers.

    PubMed

    O'Neill, Suzanne C; Lipkus, Isaac M; Sanderson, Saskia C; Shepperd, James; Docherty, Sharron; McBride, Colleen M

    2013-11-01

    To examine why young people might want to undergo genetic susceptibility testing for lung cancer despite knowing that tested gene variants are associated with small increases in disease risk. The authors used a mixed-method approach to evaluate motives for and against genetic testing and the association between these motivations and testing intentions in 128 college students who smoke. Exploratory factor analysis yielded four reliable factors: Test Scepticism, Test Optimism, Knowledge Enhancement and Smoking Optimism. Test Optimism and Knowledge Enhancement correlated positively with intentions to test in bivariate and multivariate analyses (ps<0.001). Test Scepticism correlated negatively with testing intentions in multivariate analyses (p<0.05). Open-ended questions assessing testing motivations generally replicated themes of the quantitative survey. In addition to learning about health risks, young people may be motivated to seek genetic testing for reasons, such as gaining knowledge about new genetic technologies more broadly.

  10. Fallout Deposition in the Marshall Islands from Bikini and Enewetak Nuclear Weapons Tests

    PubMed Central

    Beck, Harold L.; Bouville, André; Moroz, Brian E.; Simon, Steven L.

    2009-01-01

    Deposition densities (Bq m-2) of all important dose-contributing radionuclides occurring in nuclear weapons testing fallout from tests conducted at Bikini and Enewetak Atolls (1946-1958) have been estimated on a test-specific basis for all the 31 atolls and separate reef islands of the Marshall Islands. A complete review of various historical and contemporary data, as well as meteorological analysis, was used to make judgments regarding which tests deposited fallout in the Marshall Islands and to estimate fallout deposition density. Our analysis suggested that only 20 of the 66 nuclear tests conducted in or near the Marshall Islands resulted in substantial fallout deposition on any of the 25 inhabited atolls. This analysis was confirmed by the fact that the sum of our estimates of 137Cs deposition from these 20 tests at each atoll is in good agreement with the total 137Cs deposited as estimated from contemporary soil sample analyses. The monitoring data and meteorological analyses were used to quantitatively estimate the deposition density of 63 activation and fission products for each nuclear test, plus the cumulative deposition of 239+240Pu at each atoll. Estimates of the degree of fractionation of fallout from each test at each atoll, as well as of the fallout transit times from the test sites to the atolls were used in this analysis. The estimates of radionuclide deposition density, fractionation, and transit times reported here are the most complete available anywhere and are suitable for estimations of both external and internal dose to representative persons as described in companion papers. PMID:20622548

  11. Fallout deposition in the Marshall Islands from Bikini and Enewetak nuclear weapons tests.

    PubMed

    Beck, Harold L; Bouville, André; Moroz, Brian E; Simon, Steven L

    2010-08-01

    Deposition densities (Bq m(-2)) of all important dose-contributing radionuclides occurring in nuclear weapons testing fallout from tests conducted at Bikini and Enewetak Atolls (1946-1958) have been estimated on a test-specific basis for 32 atolls and separate reef islands of the Marshall Islands. A complete review of various historical and contemporary data, as well as meteorological analysis, was used to make judgments regarding which tests deposited fallout in the Marshall Islands and to estimate fallout deposition density. Our analysis suggested that only 20 of the 66 nuclear tests conducted in or near the Marshall Islands resulted in substantial fallout deposition on any of the 23 inhabited atolls. This analysis was confirmed by the fact that the sum of our estimates of 137Cs deposition from these 20 tests at each atoll is in good agreement with the total 137Cs deposited as estimated from contemporary soil sample analyses. The monitoring data and meteorological analyses were used to quantitatively estimate the deposition density of 63 activation and fission products for each nuclear test, plus the cumulative deposition of 239+240Pu at each atoll. Estimates of the degree of fractionation of fallout from each test at each atoll, as well as of the fallout transit times from the test sites to the atolls were used in this analysis. The estimates of radionuclide deposition density, fractionation, and transit times reported here are the most complete available anywhere and are suitable for estimations of both external and internal dose to representative persons as described in companion papers.

  12. Integrated Data Collection Analysis (IDCA) Program - Statistical Analysis of RDX Standard Data Sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandstrom, Mary M.; Brown, Geoffrey W.; Preston, Daniel N.

    2015-10-30

    The Integrated Data Collection Analysis (IDCA) program is conducting a Proficiency Test for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard. The material was tested as a well-characterized standard several times during the proficiency study to assess differences among participants and the range of results that may arise for well-behaved explosive materials. The analyses show that there are detectable differences among the results from IDCA participants. While these differences are statistically significant, most of them can be disregarded for comparison purposes to assess potential variability when laboratories attempt to measure identical samples using methods assumed to be nominally the same. The results presented in this report include the average sensitivity results for the IDCA participants and the ranges of values obtained. The ranges represent variation about the mean values of the tests of between 26% and 42%. The magnitude of this variation is attributed to differences in operator, method, and environment as well as the use of different instruments that are also of varying age. The results appear to be a good representation of the broader safety testing community based on the range of methods, instruments, and environments included in the IDCA Proficiency Test.
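
    The variation quoted above (roughly 26% to 42% about the mean) corresponds to a coefficient of variation across participants. A minimal sketch of that computation, together with a one-way ANOVA for detecting between-laboratory differences, is shown below using simulated sensitivity results; the IDCA data themselves are not reproduced here and the values are hypothetical.

    ```python
    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(3)
    # Hypothetical impact-sensitivity results (e.g., drop height in cm) from four labs.
    labs = [rng.normal(loc=mu, scale=4.0, size=8) for mu in (20.0, 24.0, 22.0, 27.0)]

    all_results = np.concatenate(labs)
    cv = all_results.std(ddof=1) / all_results.mean() * 100   # percent variation about the mean
    F, p = f_oneway(*labs)                                    # are the lab means distinguishable?
    print(f"coefficient of variation = {cv:.0f}%   ANOVA: F = {F:.2f}, p = {p:.3g}")
    ```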

  13. 7 CFR 94.5 - Charges for laboratory service.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) COMMODITY LABORATORY TESTING PROGRAMS... costs for analysis of mandatory egg product samples at Science and Technology Division laboratories... program. The costs for any other mandatory laboratory analyses and testing of an egg product's identity...

  14. 7 CFR 94.5 - Charges for laboratory service.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) COMMODITY LABORATORY TESTING PROGRAMS... costs for analysis of mandatory egg product samples at Science and Technology Division laboratories... program. The costs for any other mandatory laboratory analyses and testing of an egg product's identity...

  15. 7 CFR 94.5 - Charges for laboratory service.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) COMMODITY LABORATORY TESTING PROGRAMS... costs for analysis of mandatory egg product samples at Science and Technology Division laboratories... program. The costs for any other mandatory laboratory analyses and testing of an egg product's identity...

  16. 7 CFR 94.5 - Charges for laboratory service.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) COMMODITY LABORATORY TESTING PROGRAMS... costs for analysis of mandatory egg product samples at Science and Technology Division laboratories... program. The costs for any other mandatory laboratory analyses and testing of an egg product's identity...

  17. Chemical composition analysis and product consistency tests supporting refinement of the Nepheline model for the high aluminum Hanford Glass composition region

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fox, K. M.; Edwards, T. B.; Mcclane, D. L.

    2016-02-17

    In this report, SRNL provides chemical analyses and Product Consistency Test (PCT) results for a series of simulated HLW glasses fabricated by Pacific Northwest National Laboratory (PNNL) as part of an ongoing nepheline crystallization study. The results of these analyses will be used to improve the ability to predict crystallization of nepheline as a function of composition and heat treatment for glasses formulated at high alumina concentrations.

  18. Chemical composition analysis and product consistency tests supporting refinement of the Nepheline Model for the high aluminum Hanford glass composition region

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fox, K. M.; Edwards, T. B.; Mcclane, D. L.

    2016-03-01

    In this report, Savannah River National Laboratory provides chemical analyses and Product Consistency Test (PCT) results for a series of simulated high level waste (HLW) glasses fabricated by Pacific Northwest National Laboratory (PNNL) as part of an ongoing nepheline crystallization study. The results of these analyses will be used to improve the ability to predict crystallization of nepheline as a function of composition and heat treatment for glasses formulated at high alumina concentrations.

  19. ORNL Pre-test Analyses of A Large-scale Experiment in STYLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Paul T; Yin, Shengjun; Klasky, Hilda B

    Oak Ridge National Laboratory (ORNL) is conducting a series of numerical analyses to simulate a large scale mock-up experiment planned within the European Network for Structural Integrity for Lifetime Management non-RPV Components (STYLE). STYLE is a European cooperative effort to assess the structural integrity of (non-reactor pressure vessel) reactor coolant pressure boundary components relevant to ageing and life-time management and to integrate the knowledge created in the project into mainstream nuclear industry assessment codes. ORNL contributes work-in-kind support to STYLE Work Package 2 (Numerical Analysis/Advanced Tools) and Work Package 3 (Engineering Assessment Methods/LBB Analyses). This paper summarizes the current status of ORNL analyses of the STYLE Mock-Up3 large-scale experiment to simulate and evaluate crack growth in a cladded ferritic pipe. The analyses are being performed in two parts. In the first part, advanced fracture mechanics models are being developed and performed to evaluate several experiment designs taking into account the capabilities of the test facility while satisfying the test objectives. Then these advanced fracture mechanics models will be utilized to simulate the crack growth in the large scale mock-up test. For the second part, the recently developed ORNL SIAM-PFM open-source, cross-platform, probabilistic computational tool will be used to generate an alternative assessment for comparison with the advanced fracture mechanics model results. The SIAM-PFM probabilistic analysis of the Mock-Up3 experiment will utilize fracture modules that are installed into a general probabilistic framework. The probabilistic results of the Mock-Up3 experiment obtained from SIAM-PFM will be compared to those results generated using the deterministic 3D nonlinear finite-element modeling approach. The objective of the probabilistic analysis is to provide uncertainty bounds that will assist in assessing the more detailed 3D finite-element solutions and to also assess the level of confidence that can be placed in the best-estimate finite-element solutions.

  20. Interlaboratory comparability, bias, and precision for four laboratories measuring analytes in wet deposition, October 1983-December 1984

    USGS Publications Warehouse

    Brooks, Myron H.; Schroder, LeRoy J.; Willoughby, Timothy C.

    1987-01-01

    Four laboratories involved in the routine analysis of wet-deposition samples participated in an interlaboratory comparison program managed by the U.S. Geological Survey. The four participants were: Illinois State Water Survey central analytical laboratory in Champaign, Illinois; U.S. Geological Survey national water-quality laboratories in Atlanta, Georgia, and Denver, Colorado; and Inland Waters Directorate national water-quality laboratory in Burlington, Ontario, Canada. Analyses of interlaboratory samples performed by the four laboratories from October 1983 through December 1984 were compared. Participating laboratories analyzed three types of interlaboratory samples--natural wet deposition, simulated wet deposition, and deionized water--for pH and specific conductance, and for dissolved calcium, magnesium, sodium, potassium, chloride, sulfate, nitrate, ammonium, and orthophosphate. Natural wet-deposition samples were aliquots of actual wet-deposition samples. Analyses of these samples by the four laboratories were compared using analysis of variance. Test results indicated that pH, calcium, nitrate, and ammonium results were not directly comparable among the four laboratories. Statistically significant differences between laboratory results probably were meaningful only for analyses of dissolved calcium. Simulated wet-deposition samples with known analyte concentrations were used to test each laboratory for analyte bias. Laboratory analyses of calcium, magnesium, sodium, potassium, chloride, sulfate, and nitrate were not significantly different from the known concentrations of these analytes when tested using analysis of variance. Deionized-water samples were used to test each laboratory for reporting of false positive values. The Illinois State Water Survey laboratory reported the smallest percentage of false positive values for most analytes. Analyte precision was estimated for each laboratory from results of replicate measurements. In general, the Illinois State Water Survey laboratory achieved the greatest precision, whereas the U.S. Geological Survey laboratories achieved the least precision.

  1. Pre-clinical cognitive phenotypes for Alzheimer disease: a latent profile approach.

    PubMed

    Hayden, Kathleen M; Kuchibhatla, Maragatha; Romero, Heather R; Plassman, Brenda L; Burke, James R; Browndyke, Jeffrey N; Welsh-Bohmer, Kathleen A

    2014-11-01

    Cognitive profiles for pre-clinical Alzheimer disease (AD) can be used to identify groups of individuals at risk for disease and better characterize pre-clinical disease. Profiles or patterns of performance as pre-clinical phenotypes may be more useful than individual test scores or measures of global decline. To evaluate patterns of cognitive performance in cognitively normal individuals to derive latent profiles associated with later onset of disease using a combination of factor analysis and latent profile analysis. The National Alzheimer Coordinating Centers collect data, including a battery of neuropsychological tests, from participants at 29 National Institute on Aging-funded Alzheimer Disease Centers across the United States. Prior factor analyses of this battery demonstrated a four-factor structure comprising memory, attention, language, and executive function. Factor scores from these analyses were used in a latent profile approach to characterize cognition among a group of cognitively normal participants (N = 3,911). Associations between latent profiles and disease outcomes an average of 3 years later were evaluated with multinomial regression models. Similar analyses were used to determine predictors of profile membership. Four groups were identified; each with distinct characteristics and significantly associated with later disease outcomes. Two groups were significantly associated with development of cognitive impairment. In post hoc analyses, both the Trail Making Test Part B, and a contrast score (Delayed Recall - Trails B), significantly predicted group membership and later cognitive impairment. Latent profile analysis is a useful method to evaluate patterns of cognition in large samples for the identification of preclinical AD phenotypes; comparable results, however, can be achieved with very sensitive tests and contrast scores. Copyright © 2014 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.
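
    Latent profile analysis on continuous factor scores is closely related to fitting a Gaussian mixture model and choosing the number of profiles by an information criterion. The sketch below illustrates that workflow on simulated four-factor scores; it is not the modeling software or the data used in the study, and the group structure is invented.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(4)
    # Simulated factor scores: memory, attention, language, executive function.
    scores = np.vstack([rng.normal(loc=m, scale=1.0, size=(500, 4))
                        for m in (0.5, 0.0, -0.7)])          # three underlying profiles

    bics = {k: GaussianMixture(n_components=k, covariance_type="full",
                               random_state=0).fit(scores).bic(scores)
            for k in range(1, 7)}
    best_k = min(bics, key=bics.get)                          # profile count with lowest BIC
    profiles = GaussianMixture(n_components=best_k, random_state=0).fit_predict(scores)
    print(f"best number of profiles by BIC: {best_k}; sizes: {np.bincount(profiles)}")
    ```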

  2. General aviation crash safety program at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Thomson, R. G.

    1976-01-01

    The purpose of the crash safety program is to support development of the technology to define and demonstrate new structural concepts for improved crash safety and occupant survivability in general aviation aircraft. The program involves three basic areas of research: full-scale crash simulation testing, nonlinear structural analyses necessary to predict failure modes and collapse mechanisms of the vehicle, and evaluation of energy absorption concepts for specific component design. Both analytical and experimental methods are being used to develop expertise in these areas. Analyses include both simplified procedures for estimating energy absorption capabilities and more complex computer programs for analysis of general airframe response. Full-scale tests of typical structures as well as tests on structural components are being used to verify the analyses and to demonstrate improved design concepts.

  3. Star 48 solid rocket motor nozzle analyses and instrumented firings

    NASA Technical Reports Server (NTRS)

    Porter, R. L.

    1986-01-01

    The analyses and testing performed by NASA in support of an expanded and improved nozzle design data base for use by the U.S. solid rocket motor industry are presented. A production nozzle with a history of one ground failure and two flight failures was selected for analyses and testing. The stress analysis was performed with the Champion computer code developed by the U.S. Navy. Several improvements were made to the code. Strain predictions were made and compared to test data. Two short duration motor firings were conducted with highly instrumented nozzles. The first nozzle had 58 thermocouples, 66 strain gages, and 8 bondline pressure measurements. The second nozzle had 59 thermocouples, 68 strain measurements, and 8 bondline pressure measurements. Most of this instrumentation was on the nonmetallic parts, and provided significantly more thermal and strain data on the nonmetallic components of a nozzle than had been accumulated in any solid rocket motor test to date.

  4. 2007 international meeting on Reduced Enrichment for Research and Test Reactors (RERTR). Abstracts and available papers presented at the meeting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2008-07-15

    The Meeting papers discuss research and test reactor fuel performance, manufacturing and testing. Some of the main topics are: conversion from HEU to LEU in different reactors and corresponding problems and activities; flux performance and core lifetime analysis with HEU and LEU fuels; physics and safety characteristics; measurement of gamma field parameters in core with LEU fuel; nondestructive analysis of RERTR fuel; thermal hydraulic analysis; fuel interactions; transient analyses and thermal hydraulics for HEU and LEU cores; microstructure of research reactor fuels; post irradiation analysis and performance; computer codes and other related problems.

  5. Herbicide Orange Site Characterization Study Naval Construction Battalion Center

    DTIC Science & Technology

    1987-01-01

    ...U.S. Testing Laboratories for analysis. Over 200 additional analyses were performed for a variety of quality assurance criteria. [Table fragment: Table 9, NCBC Performance Audit Sample Analysis Summary (Series 1), listing TCDD concentrations (ppb) by sample number with reported concentration, detection limit, and relative values.] ...detection limit rather than estimating the variance of the results. The sample results were transformed using the natural logarithm. The Shapiro-Wilk W test...

  6. SPSS and SAS programs for addressing interdependence and basic levels-of-analysis issues in psychological data.

    PubMed

    O'Connor, Brian P

    2004-02-01

    Levels-of-analysis issues arise whenever individual-level data are collected from more than one person from the same dyad, family, classroom, work group, or other interaction unit. Interdependence in data from individuals in the same interaction units also violates the independence-of-observations assumption that underlies commonly used statistical tests. This article describes the data analysis challenges that are presented by these issues and presents SPSS and SAS programs for conducting appropriate analyses. The programs conduct the within-and-between-analyses described by Dansereau, Alutto, and Yammarino (1984) and the dyad-level analyses described by Gonzalez and Griffin (1999) and Griffin and Gonzalez (1995). Contrasts with general multilevel modeling procedures are then discussed.

  7. Design and analysis of randomized clinical trials requiring prolonged observation of each patient. II. analysis and examples.

    PubMed Central

    Peto, R.; Pike, M. C.; Armitage, P.; Breslow, N. E.; Cox, D. R.; Howard, S. V.; Mantel, N.; McPherson, K.; Peto, J.; Smith, P. G.

    1977-01-01

    Part I of this report appeared in the previous issue (Br. J. Cancer (1976) 34,585), and discussed the design of randomized clinical trials. Part II now describes efficient methods of analysis of randomized clinical trials in which we wish to compare the duration of survival (or the time until some other untoward event first occurs) among different groups of patients. It is intended to enable physicians without statistical training either to analyse such data themselves using life tables, the logrank test and retrospective stratification, or, when such analyses are presented, to appreciate them more critically, but the discussion may also be of interest to statisticians who have not yet specialized in clinical trial analyses. PMID:831755
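
    For readers who want to reproduce the kind of comparison described above in code rather than by hand with life tables, the lifelines package provides a logrank test. The sketch below uses simulated survival times and censoring, not trial data; arm sizes and scales are hypothetical.

    ```python
    import numpy as np
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(5)
    # Simulated survival times (months) and event indicators for two treatment arms.
    t_control = rng.exponential(scale=12.0, size=80)
    t_treated = rng.exponential(scale=18.0, size=80)
    observed_c = rng.random(80) < 0.8      # roughly 20% of control patients censored
    observed_t = rng.random(80) < 0.8

    result = logrank_test(t_control, t_treated,
                          event_observed_A=observed_c, event_observed_B=observed_t)
    print(f"logrank chi-square = {result.test_statistic:.2f}, p = {result.p_value:.4f}")
    ```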

  8. Planned Comparisons as Better Alternatives to ANOVA Omnibus Tests.

    ERIC Educational Resources Information Center

    Benton, Roberta L.

    Analyses of data are presented to illustrate the advantages of using a priori or planned comparisons rather than omnibus analysis of variance (ANOVA) tests followed by post hoc or posteriori testing. The two types of planned comparisons considered are planned orthogonal non-trend coding contrasts and orthogonal polynomial or trend contrast coding.…
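
    A minimal sketch of the planned-comparison idea: instead of an omnibus F test across four groups, test two orthogonal contrasts chosen in advance. The group labels, contrast weights, and data here are hypothetical and assume equal group sizes.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    groups = [rng.normal(loc=m, scale=1.0, size=30) for m in (0.0, 0.2, 0.9, 1.0)]

    # Planned orthogonal contrasts (weights sum to zero and are mutually orthogonal).
    contrasts = {"first two vs last two": np.array([1, 1, -1, -1]) / 2,
                 "group 1 vs group 2":    np.array([1, -1, 0, 0])}

    means = np.array([g.mean() for g in groups])
    ns = np.array([len(g) for g in groups])
    mse = np.mean([g.var(ddof=1) for g in groups])   # pooled error term (equal n)
    df_error = ns.sum() - len(groups)

    for name, c in contrasts.items():
        est = c @ means
        se = np.sqrt(mse * np.sum(c**2 / ns))
        t = est / se
        p = 2 * stats.t.sf(abs(t), df_error)
        print(f"{name}: estimate = {est:.2f}, t({df_error}) = {t:.2f}, p = {p:.4f}")
    ```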

  9. Use of "t"-Test and ANOVA in Career-Technical Education Research

    ERIC Educational Resources Information Center

    Rojewski, Jay W.; Lee, In Heok; Gemici, Sinan

    2012-01-01

    Use of t-tests and analysis of variance (ANOVA) procedures in published research from three scholarly journals in career and technical education (CTE) during a recent 5-year period was examined. Information on post hoc analyses, reporting of effect size, alpha adjustments to account for multiple tests, power, and examination of assumptions…

  10. Impact Testing of Composites for Aircraft Engine Fan Cases

    NASA Technical Reports Server (NTRS)

    Roberts, Gary D.; Revilock, Duane M.; Binienda, Wieslaw K.; Nie, Walter Z.; Mackenzie, S. Ben; Todd, Kevin B.

    2001-01-01

    Before composite materials can be considered for use in the fan case of a commercial jet engine, the performance of a composite structure under blade-out loads needs to be demonstrated. The objective of this program is to develop an efficient test and analysis method for evaluating potential composite case concepts. Ballistic impact tests were performed on laminated glass/epoxy composites in order to identify potential failure modes and to provide data for analysis. Flat 7x7 in. panels were impacted with cylindrical titanium projectiles, and 15 in. diameter half-rings were impacted with wedge-shaped titanium projectiles. Composite failure involved local fiber fracture as well as tearing and delamination on a larger scale. A 36 in. diameter full-ring subcomponent was proposed for larger scale testing. Explicit, transient, finite element analyses were used to evaluate impact dynamics and subsequent global deformation for the proposed full-ring subcomponent test. Analyses on half-ring and quarter ring configurations indicated that less expensive smaller scale tests could be used to screen potential composite concepts when evaluation of local impact damage is the primary concern.

  11. Finite element analyses for seismic shear wall international standard problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Y.J.; Hofmayer, C.H.

    Two identical reinforced concrete (RC) shear walls, which consist of web, flanges and massive top and bottom slabs, were tested up to ultimate failure under earthquake motions at the Nuclear Power Engineering Corporation's (NUPEC) Tadotsu Engineering Laboratory, Japan. NUPEC provided the dynamic test results to the OECD (Organization for Economic Cooperation and Development), Nuclear Energy Agency (NEA) for use as an International Standard Problem (ISP). The shear walls were intended to be part of a typical reactor building. One of the major objectives of the Seismic Shear Wall ISP (SSWISP) was to evaluate various seismic analysis methods for concrete structures used for design and seismic margin assessment. It also offered a unique opportunity to assess the state-of-the-art in nonlinear dynamic analysis of reinforced concrete shear wall structures under severe earthquake loadings. As a participant of the SSWISP workshops, Brookhaven National Laboratory (BNL) performed finite element analyses under the sponsorship of the U.S. Nuclear Regulatory Commission (USNRC). Three types of analysis were performed, i.e., monotonic static (push-over), cyclic static and dynamic analyses. Additional monotonic static analyses were performed by two consultants, F. Vecchio of the University of Toronto (UT) and F. Filippou of the University of California at Berkeley (UCB). The analysis results by BNL and the consultants were presented during the second workshop in Yokohama, Japan in 1996. A total of 55 analyses were presented during the workshop by 30 participants from 11 different countries. The major findings on the presented analysis methods, as well as engineering insights regarding the applicability and reliability of the FEM codes are described in detail in this report. 16 refs., 60 figs., 16 tabs.

  12. Detailed analysis and test correlation of a stiffened composite wing panel

    NASA Technical Reports Server (NTRS)

    Davis, D. Dale, Jr.

    1991-01-01

    Nonlinear finite element analysis techniques are evaluated by applying them to a realistic aircraft structural component. A wing panel from the V-22 tiltrotor aircraft is chosen because it is a typical modern aircraft structural component for which there is experimental data for comparison of results. From blueprints and drawings supplied by the Bell Helicopter Textron Corporation, a very detailed finite element model containing 2284 9-node Assumed Natural-Coordinate Strain (ANS) elements was generated. A novel solution strategy which accounts for geometric nonlinearity through the use of corotating element reference frames and nonlinear strain-displacement relations is used to analyze this detailed model. Results from linear analyses using the same finite element model are presented in order to illustrate the advantages and costs of the nonlinear analysis as compared with the more traditional linear analysis. Strain predictions from both the linear and nonlinear stress analyses are shown to compare well with experimental data up through the Design Ultimate Load (DUL) of the panel. However, due to the extreme nonlinear response of the panel, the linear analysis was not accurate at loads above the DUL. The nonlinear analysis more accurately predicted the strain at high values of applied load, and even predicted complicated nonlinear response characteristics, such as load reversals, at the observed failure load of the test panel. In order to understand the failure mechanism of the panel, buckling and first ply failure analyses were performed. The buckling load was 17 percent above the observed failure load while first ply failure analyses indicated significant material damage at and below the observed failure load.

  13. Passenger car crippling end-load test and analyses

    DOT National Transportation Integrated Search

    2017-09-01

    The Transportation Technology Center, Inc. (TTCI) performed a series of full-scale tests and a finite element analysis (FEA) in a case study that may become a model for manufacturers seeking to use the waiver process of Tier I crashworthiness and occ...

  14. An Analysis of the Quality of English Testing for Aviation Purposes in Finland

    ERIC Educational Resources Information Center

    Huhta, Ari

    2009-01-01

    This article describes and analyses the development of a new test of aviation English by the Finnish Civil Aviation Authority (FCAA), as well as the overall situation in Finland as regards the testing of aviation English. The article describes the FCAA development project and evaluates the strengths and weaknesses of the new test and the whole…

  15. 76 FR 21700 - Notice of Request for Extension and Revision of a Currently Approved Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-18

    ... analyses has its own methodology and time necessary to perform the analyses. (3) Aflatoxin in Pistachios Program (A High Performance Liquid Chromatography (HPLC) method for exporting pistachios to European Union requested by the California Pistachio Committee) and the domestic program using HPLC or a test kit analysis...

  16. Design, analysis, and testing of a metal matrix composite web/flange intersection

    NASA Technical Reports Server (NTRS)

    Biggers, S. B.; Knight, N. F., Jr.; Moran, S. G.; Olliffe, R.

    1992-01-01

    An experimental and analytical program to study the local design details of a typical T-shaped web/flange intersection made from a metal matrix composite is described. Loads creating flange bending were applied to specimens having different designs and boundary conditions. Finite element analyses were conducted on models of the test specimens to predict the structural response. The analyses correctly predict failure load, mode, and location in the fillet material in the intersection region of the web and the flange when specimen quality is good. The test program shows the importance of fabrication quality in the intersection region. The full-scale test program that led to the investigation of this local detail is also described.

  17. SSME Post Test Diagnostic System: Systems Section

    NASA Technical Reports Server (NTRS)

    Bickmore, Timothy

    1995-01-01

    An assessment of engine and component health is routinely made after each test firing or flight firing of a Space Shuttle Main Engine (SSME). Currently, this health assessment is done by teams of engineers who manually review sensor data, performance data, and engine and component operating histories. Based on review of information from these various sources, an evaluation is made as to the health of each component of the SSME and the preparedness of the engine for another test or flight. The objective of this project - the SSME Post Test Diagnostic System (PTDS) - is to develop a computer program which automates the analysis of test data from the SSME in order to detect and diagnose anomalies. This report primarily covers work on the Systems Section of the PTDS, which automates the analyses performed by the systems/performance group at the Propulsion Branch of NASA Marshall Space Flight Center (MSFC). This group is responsible for assessing the overall health and performance of the engine, and detecting and diagnosing anomalies which involve multiple components (other groups are responsible for analyzing the behavior of specific components). The PTDS utilizes several advanced software technologies to perform its analyses. Raw test data is analyzed using signal processing routines which detect features in the data, such as spikes, shifts, peaks, and drifts. Component analyses are performed by expert systems, which use 'rules-of-thumb' obtained from interviews with the MSFC data analysts to detect and diagnose anomalies. The systems analysis is performed using case-based reasoning. Results of all analyses are stored in a relational database and displayed via an X-window-based graphical user interface which provides ranked lists of anomalies and observations by engine component, along with supporting data plots for each.
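
    As a rough illustration of the kind of feature detection described above, the sketch below flags spikes (large deviations from a local median) and level shifts (a jump in the running mean) in a sensor trace. It is a generic toy detector with made-up thresholds, not the PTDS signal-processing routines.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    signal = rng.normal(0.0, 0.5, 600)
    signal[300:] += 4.0                      # injected level shift
    signal[150] += 8.0                       # injected spike

    def detect_spikes(x, window=11, k=5.0):
        """Indices where a sample deviates from the local median by more than k robust sigmas."""
        pad = window // 2
        med = np.array([np.median(x[max(0, i - pad):i + pad + 1]) for i in range(len(x))])
        mad = np.median(np.abs(x - med)) * 1.4826
        return np.flatnonzero(np.abs(x - med) > k * mad)

    def detect_shift(x, half=50, threshold=2.0):
        """First index where the means of adjacent windows differ by more than threshold."""
        for i in range(half, len(x) - half):
            if abs(x[i:i + half].mean() - x[i - half:i].mean()) > threshold:
                return i
        return None

    print("spike indices:", detect_spikes(signal))
    print("shift detected near index:", detect_shift(signal))
    ```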

  18. Finite element elastic-plastic-creep and cyclic life analysis of a cowl lip

    NASA Technical Reports Server (NTRS)

    Arya, Vinod K.; Melis, Matthew E.; Halford, Gary R.

    1990-01-01

    Results are presented of elastic, elastic-plastic, and elastic-plastic-creep analyses of a test-rig component of an actively cooled cowl lip. A cowl lip is part of the leading edge of an engine inlet of proposed hypersonic aircraft and is subject to severe thermal loadings and gradients during flight. Values of stresses calculated by elastic analysis are well above the yield strength of the cowl lip material. Such values are highly unrealistic, and thus elastic stress analyses are inappropriate. The inelastic (elastic-plastic and elastic-plastic-creep) analyses produce more reasonable and acceptable stress and strain distributions in the component. Finally, using the results from these analyses, predictions are made for the cyclic crack initiation life of a cowl lip. A comparison of predicted cyclic lives shows the cyclic life prediction from the elastic-plastic-creep analysis to be the lowest and, hence, most realistic.

  19. Development and psychometric evaluation of a cardiovascular risk and disease management knowledge assessment tool.

    PubMed

    Rosneck, James S; Hughes, Joel; Gunstad, John; Josephson, Richard; Noe, Donald A; Waechter, Donna

    2014-01-01

    This article describes the systematic construction and psychometric analysis of a knowledge assessment instrument for phase II cardiac rehabilitation (CR) patients measuring risk modification and disease management knowledge and behavioral outcomes derived from national standards relevant to secondary prevention and management of cardiovascular disease. First, using an adult curriculum based on disease-specific learning outcomes and competencies, a systematic test item development process was completed by clinical staff. Second, a panel of educational and clinical experts used an iterative process to identify test content domain and arrive at consensus in selecting items meeting criteria. Third, the resulting 31-question instrument, the Cardiac Knowledge Assessment Tool (CKAT), was piloted in CR patients to ensure ease of application. Validity and reliability analyses were performed on 3638 adult pretest administrations, with additional focused analyses of 1999 individuals completing both pretreatment and posttreatment administrations within 6 months. Evidence of CKAT content validity was substantiated, with 85% agreement among content experts. Evidence of construct validity was demonstrated via factor analysis identifying key underlying factors. Estimates of internal consistency, for example, Cronbach's α = .852 and Spearman-Brown split-half reliability = 0.817 on pretesting, support test reliability. Item analysis, using point biserial correlation, measured relationships between performance on single items and total score (P < .01). Analyses using item difficulty and item discrimination indices further verified item stability and validity of the CKAT.
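
    Two of the psychometric quantities named above are straightforward to compute from a scored item matrix: Cronbach's alpha for internal consistency and the point-biserial correlation between each item and the total score. The sketch below uses simulated 0/1 item responses, not the CKAT data; the sample size and item count are illustrative.

    ```python
    import numpy as np
    from scipy.stats import pointbiserialr

    rng = np.random.default_rng(8)
    ability = rng.normal(size=400)
    # Simulated 0/1 responses to 31 items of varying difficulty.
    difficulty = rng.normal(size=31)
    items = (ability[:, None] + rng.normal(size=(400, 31)) > difficulty).astype(int)

    def cronbach_alpha(x):
        k = x.shape[1]
        item_var = x.var(axis=0, ddof=1).sum()
        total_var = x.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    total = items.sum(axis=1)
    # Corrected item-total correlations (item removed from the total before correlating).
    r_pb = [pointbiserialr(items[:, j], total - items[:, j])[0] for j in range(items.shape[1])]
    print(f"Cronbach's alpha = {cronbach_alpha(items):.3f}")
    print(f"median corrected item-total (point-biserial) r = {np.median(r_pb):.2f}")
    ```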

  20. Analysis of full-scale tank car shell impact tests

    DOT National Transportation Integrated Search

    2007-09-11

    This paper describes analyses of a railroad tank car impacted at its side by a ram car with a rigid punch. This generalized collision, referred to as a shell impact, is examined using nonlinear finite element analysis (FEA) and three-dimensional...

  1. Analysing task design and students' responses to context-based problems through different analytical frameworks

    NASA Astrophysics Data System (ADS)

    Broman, Karolina; Bernholt, Sascha; Parchmann, Ilka

    2015-05-01

    Background: Context-based learning approaches are used to enhance students' interest in, and knowledge about, science. According to different empirical studies, students' interest is improved by applying these more non-conventional approaches, while effects on learning outcomes are less coherent. Hence, further insights are needed into the structure of context-based problems in comparison to traditional problems, and into students' problem-solving strategies. Therefore, a suitable framework is necessary, both for the analysis of tasks and strategies. Purpose: The aim of this paper is to explore traditional and context-based tasks as well as students' responses to exemplary tasks to identify a suitable framework for future design and analyses of context-based problems. The paper discusses different established frameworks and applies the Higher-Order Cognitive Skills/Lower-Order Cognitive Skills (HOCS/LOCS) taxonomy and the Model of Hierarchical Complexity in Chemistry (MHC-C) to analyse traditional tasks and students' responses. Sample: Upper secondary students (n=236) at the Natural Science Programme, i.e. possible future scientists, are investigated to explore learning outcomes when they solve chemistry tasks, both more conventional as well as context-based chemistry problems. Design and methods: A typical chemistry examination test has been analysed, first the test items in themselves (n=36), and thereafter 236 students' responses to one representative context-based problem. Content analysis using HOCS/LOCS and MHC-C frameworks has been applied to analyse both quantitative and qualitative data, allowing us to describe different problem-solving strategies. Results: The empirical results show that both frameworks are suitable to identify students' strategies, mainly focusing on recall of memorized facts when solving chemistry test items. Almost all test items were also assessing lower order thinking. The combination of frameworks with the chemistry syllabus has been found successful to analyse both the test items as well as students' responses in a systematic way. The framework can therefore be applied in the design of new tasks, the analysis and assessment of students' responses, and as a tool for teachers to scaffold students in their problem-solving process. Conclusions: This paper gives implications for practice and for future research to both develop new context-based problems in a structured way, as well as providing analytical tools for investigating students' higher order thinking in their responses to these tasks.

  2. Identification and detection of anomalies through SSME data analysis

    NASA Technical Reports Server (NTRS)

    Pereira, Lisa; Ali, Moonis

    1990-01-01

    The goal of the ongoing research described in this paper is to analyze real-time ground test data in order to identify patterns associated with the anomalous engine behavior, and on the basis of this analysis to develop an expert system which detects anomalous engine behavior in the early stages of fault development. A prototype of the expert system has been developed and tested on the high frequency data of two SSME tests, namely Test #901-0516 and Test #904-044. The comparison of our results with the post-test analyses indicates that the expert system detected the presence of the anomalies in a significantly early stage of fault development.

  3. Imperfection Insensitivity Analyses of Advanced Composite Tow-Steered Shells

    NASA Technical Reports Server (NTRS)

    Wu, K. Chauncey; Farrokh, Babak; Stanford, Bret K.; Weaver, Paul M.

    2016-01-01

    Two advanced composite tow-steered shells, one with tow overlaps and another without overlaps, were previously designed, fabricated and tested in end compression, both without cutouts, and with small and large cutouts. In each case, good agreement was observed between experimental buckling loads and supporting linear bifurcation buckling analyses. However, previous buckling tests and analyses have historically shown poor correlation, perhaps due to the presence of geometric imperfections that serve as failure initiators. For the tow-steered shells, their circumferential variation in axial stiffness may have suppressed this sensitivity to imperfections, leading to the agreement noted between tests and analyses. To investigate this further, a numerical investigation was performed in this study using geometric imperfections measured from both shells. Finite element models of both shells were analyzed first without, and then with, the measured imperfections, which were superposed in different orientations around the shell longitudinal axis. Small variations in both the axial prebuckling stiffness and global buckling load were observed for the range of imperfections studied here, which suggests that the tow steering, and the resulting circumferentially varying axial stiffness, may account for the test-analysis correlation observed for these shells.

  4. How often do sensitivity analyses for economic parameters change cost-utility analysis conclusions?

    PubMed

    Schackman, Bruce R; Gold, Heather Taffet; Stone, Patricia W; Neumann, Peter J

    2004-01-01

    There is limited evidence about the extent to which sensitivity analysis has been used in the cost-effectiveness literature. Sensitivity analyses for health-related QOL (HR-QOL), cost and discount rate economic parameters are of particular interest because they measure the effects of methodological and estimation uncertainties. To investigate the use of sensitivity analyses in the pharmaceutical cost-utility literature in order to test whether a change in economic parameters could result in a different conclusion regarding the cost effectiveness of the intervention analysed. Cost-utility analyses of pharmaceuticals identified in a prior comprehensive audit (70 articles) were reviewed and further audited. For each base case for which sensitivity analyses were reported (n = 122), up to two sensitivity analyses for HR-QOL (n = 133), cost (n = 99), and discount rate (n = 128) were examined. Article mentions of thresholds for acceptable cost-utility ratios were recorded (total 36). Cost-utility ratios were denominated in US dollars for the year reported in each of the original articles in order to determine whether a different conclusion would have been indicated at the time the article was published. Quality ratings from the original audit for articles where sensitivity analysis results crossed the cost-utility ratio threshold above the base-case result were compared with those that did not. The most frequently mentioned cost-utility thresholds were $US20,000/QALY, $US50,000/QALY, and $US100,000/QALY. The proportions of sensitivity analyses reporting quantitative results that crossed the threshold above the base-case results (or where the sensitivity analysis result was dominated) were 31% for HR-QOL sensitivity analyses, 20% for cost-sensitivity analyses, and 15% for discount-rate sensitivity analyses. Almost half of the discount-rate sensitivity analyses did not report quantitative results. Articles that reported sensitivity analyses where results crossed the cost-utility threshold above the base-case results (n = 25) were of somewhat higher quality, and were more likely to justify their sensitivity analysis parameters, than those that did not (n = 45), but the overall quality rating was only moderate. Sensitivity analyses for economic parameters are widely reported and often identify whether choosing different assumptions leads to a different conclusion regarding cost effectiveness. Changes in HR-QOL and cost parameters should be used to test alternative guideline recommendations when there is uncertainty regarding these parameters. Changes in discount rates less frequently produce results that would change the conclusion about cost effectiveness. Improving the overall quality of published studies and describing the justifications for parameter ranges would allow more meaningful conclusions to be drawn from sensitivity analyses.
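
    As a rough illustration of the threshold logic described above, the sketch below runs a one-way sensitivity check on an incremental cost-utility ratio and reports whether varying a single parameter pushes it across a willingness-to-pay threshold. All figures are invented placeholders, not values from the audited studies.

      def icur(delta_cost, delta_qaly):
          # Incremental cost-utility ratio (cost per QALY gained)
          return float("inf") if delta_qaly <= 0 else delta_cost / delta_qaly

      def crosses_threshold(base, low, high, threshold):
          # One-way sensitivity check: base case is acceptable, but does the ratio
          # exceed the threshold anywhere over the parameter's plausible range?
          return icur(*base) <= threshold and max(icur(*low), icur(*high)) > threshold

      base = (30_000, 1.0)        # invented: $30,000 extra cost, 1.0 QALY gained
      print(crosses_threshold(base,
                              low=(30_000, 1.5),   # optimistic HR-QOL estimate
                              high=(30_000, 0.5),  # pessimistic HR-QOL estimate
                              threshold=50_000))   # $50,000/QALY threshold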

  5. Kolmogorov-Smirnov statistical test for analysis of ZAP-70 expression in B-CLL, compared with quantitative PCR and IgV(H) mutation status.

    PubMed

    Van Bockstaele, Femke; Janssens, Ann; Piette, Anne; Callewaert, Filip; Pede, Valerie; Offner, Fritz; Verhasselt, Bruno; Philippé, Jan

    2006-07-15

    ZAP-70 has been proposed as a surrogate marker for immunoglobulin heavy-chain variable region (IgV(H)) mutation status, which is known as a prognostic marker in B-cell chronic lymphocytic leukemia (CLL). The flow cytometric analysis of ZAP-70 suffers from difficulties in standardization and interpretation. We applied the Kolmogorov-Smirnov (KS) statistical test to make analysis more straightforward. We examined ZAP-70 expression by flow cytometry in 53 patients with CLL. Analysis was performed as initially described by Crespo et al. (New England J Med 2003; 348:1764-1775) and alternatively by application of the KS statistical test comparing T cells with B cells. Receiver-operating-characteristics (ROC)-curve analyses were performed to determine the optimal cut-off values for ZAP-70 measured by the two approaches. ZAP-70 protein expression was compared with ZAP-70 mRNA expression measured by a quantitative PCR (qPCR) and with the IgV(H) mutation status. Both flow cytometric analyses correlated well with the molecular technique and proved to be of equal value in predicting the IgV(H) mutation status. Applying the KS test is reproducible, simple, straightforward, and overcomes a number of difficulties encountered in the Crespo-method. The KS statistical test is an essential part of the software delivered with modern routine analytical flow cytometers and is well suited for analysis of ZAP-70 expression in CLL. (c) 2006 International Society for Analytical Cytology.
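
    The comparison described above, T cells as an internal reference population versus the CLL B cells, is a two-sample Kolmogorov-Smirnov problem. A minimal sketch with simulated fluorescence intensities (not patient data, and not the authors' analysis pipeline) is:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      # Simulated log-scale ZAP-70 fluorescence intensities
      t_cells = rng.normal(loc=3.0, scale=0.5, size=5000)   # internal reference population
      b_cells = rng.normal(loc=2.4, scale=0.6, size=8000)   # CLL B cells under evaluation

      # Two-sample KS test: D is the maximum distance between the two empirical CDFs
      d_stat, p_value = stats.ks_2samp(t_cells, b_cells)
      print(f"D = {d_stat:.3f}, p = {p_value:.2e}")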

  6. GAT: a graph-theoretical analysis toolbox for analyzing between-group differences in large-scale structural and functional brain networks.

    PubMed

    Hosseini, S M Hadi; Hoeft, Fumiko; Kesler, Shelli R

    2012-01-01

    In recent years, graph theoretical analyses of neuroimaging data have increased our understanding of the organization of large-scale structural and functional brain networks. However, tools for pipeline application of graph theory for analyzing the topology of brain networks are still lacking. In this report, we describe the development of a graph-analysis toolbox (GAT) that facilitates analysis and comparison of structural and functional brain networks. GAT provides a graphical user interface (GUI) that facilitates construction and analysis of brain networks, comparison of regional and global topological properties between networks, analysis of network hubs and modules, and analysis of the resilience of the networks to random failure and targeted attacks. Area under a curve (AUC) and functional data analyses (FDA), in conjunction with permutation testing, are employed for testing the differences in network topologies; these analyses are less sensitive to the thresholding process. We demonstrated the capabilities of GAT by investigating the differences in the organization of regional gray-matter correlation networks in survivors of acute lymphoblastic leukemia (ALL) and healthy matched controls (CON). The results revealed an alteration in small-world characteristics of the brain networks in the ALL survivors, an observation that confirms our hypothesis of widespread neurobiological injury in ALL survivors. Along with demonstrating the capabilities of GAT, this is the first report of altered large-scale structural brain networks in ALL survivors.
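
    GAT itself is a GUI toolbox, but the kind of network metrics it compares can be illustrated with generic graph software. The sketch below, assuming random numbers in place of gray-matter measurements and an arbitrary correlation threshold, computes clustering, characteristic path length, and a small-worldness ratio against a size-matched random graph using networkx; it is not the GAT implementation.

      import numpy as np
      import networkx as nx

      def largest_component(G):
          return G.subgraph(max(nx.connected_components(G), key=len)).copy()

      rng = np.random.default_rng(2)
      data = rng.normal(size=(60, 90))                       # hypothetical subjects x regions
      corr = np.corrcoef(data.T)                             # region-by-region correlations
      adj = (np.abs(corr) > 0.2) & ~np.eye(len(corr), dtype=bool)   # arbitrary threshold

      G = largest_component(nx.from_numpy_array(adj.astype(int)))
      R = largest_component(nx.gnm_random_graph(len(G), G.number_of_edges(), seed=0))

      C, L = nx.average_clustering(G), nx.average_shortest_path_length(G)
      Cr, Lr = nx.average_clustering(R), nx.average_shortest_path_length(R)
      print("small-worldness sigma =", (C / Cr) / (L / Lr))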

  7. Residual Strength Analyses of Riveted Lap-Splice Joints

    NASA Technical Reports Server (NTRS)

    Seshadri, B. R.; Newman, J. C., Jr.

    2000-01-01

    The objective of this paper was to analyze the crack-linkup behavior in riveted-stiffened lap-splice joint panels with small multiple-site damage (MSD) cracks at several adjacent rivet holes. Analyses are based on the STAGS (STructural Analysis of General Shells) code with the critical crack-tip-opening angle (CTOA) fracture criterion. To account for high constraint around a crack front, the "plane strain core" option in STAGS was used. The importance of modeling rivet flexibility with fastener elements that accurately model load transfer across the joint is discussed. Fastener holes are not modeled, but rivet connectivity is accounted for by attaching rivets to the sheet on one side of the cracks that simulated both the rivet diameter and the MSD cracks. Residual strength analyses of 2024-T3 alloy (1.6-mm thick) riveted lap-splice joints with a lead crack and various sizes of MSD cracks were compared with test data from the Boeing Airplane Company. Analyses were conducted for both restrained and unrestrained buckling conditions. Comparison of results from these analyses with results from lap-splice-joint test panels, which were partially restrained against buckling, indicates that the test results were bounded by the failure loads predicted by the analyses with restrained and unrestrained conditions.

  8. Testing sequential extraction methods for the analysis of multiple stable isotope systems from a bone sample

    NASA Astrophysics Data System (ADS)

    Sahlstedt, Elina; Arppe, Laura

    2017-04-01

    Stable isotope compositions of bones, analysed either from the mineral phase (hydroxyapatite) or from the organic phase (mainly collagen), carry important climatological and ecological information and are therefore widely used in paleontological and archaeological research. For the analysis of the stable isotope compositions, both phases, hydroxyapatite and collagen, have their more or less well-established separation and analytical techniques. Recent developments in IRMS and wet chemical extraction methods have facilitated the analysis of very small bone fractions (500 μg or less of starting material) for PO4(3-) O-isotope composition. However, the uniqueness and (pre-)historical value of each archaeological and paleontological find leaves precious little material available for stable isotope analyses, encouraging further development of microanalytical methods for stable isotope work. Here we present the first results in developing extraction methods that combine collagen C- and N-isotope analyses with PO4(3-) O-isotope analyses from a single bone sample fraction. We tested sequential extraction starting with dilute acid demineralization and collection of both the collagen and PO4(3-) fractions, followed by a further purification step with H2O2 (PO4(3-) fraction). First results show that bone sample separates as small as 2 mg may be analysed for their δ15N, δ13C and δ18O(PO4) values. The method may be incorporated in detailed investigations of sequentially developing skeletal material such as teeth, potentially allowing investigation of interannual variability in climatological/environmental signals or of the early life history of an individual.

  9. Some applications of categorical data analysis to epidemiological studies.

    PubMed Central

    Grizzle, J E; Koch, G G

    1979-01-01

    Several examples of categorized data from epidemiological studies are analyzed to illustrate that analyses more informative than tests of independence can be performed by fitting models. All of the analyses fit into a unified conceptual framework and can be performed by weighted least squares. The methods presented show how to calculate point estimates of parameters, asymptotic variances, and asymptotically valid chi-square tests. The examples presented are analysis of relative risks estimated from several 2 x 2 tables, analysis of selected features of life tables, construction of synthetic life tables from cross-sectional studies, and analysis of dose-response curves. PMID:540590
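
    A minimal sketch of the simplest case mentioned above, a relative risk estimated from a single 2 x 2 table with an approximate confidence interval and a chi-square test, is shown below; the counts are invented for illustration and the code does not reproduce the paper's weighted least squares framework.

      import numpy as np
      from scipy import stats

      # Invented 2 x 2 table: rows = exposed/unexposed, columns = disease/no disease
      table = np.array([[30, 70],
                        [15, 85]])

      rr = (table[0, 0] / table[0].sum()) / (table[1, 0] / table[1].sum())
      # Approximate 95% CI for the relative risk on the log scale
      se = np.sqrt(1/table[0, 0] - 1/table[0].sum() + 1/table[1, 0] - 1/table[1].sum())
      ci = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se)

      chi2, p, dof, _ = stats.chi2_contingency(table, correction=False)
      print(f"RR = {rr:.2f}, 95% CI = {ci.round(2)}, chi-square({dof}) = {chi2:.2f}, p = {p:.3f}")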

  10. Analysis for the Progressive Failure Response of Textile Composite Fuselage Frames

    NASA Technical Reports Server (NTRS)

    Johnson, Eric R.; Boitnott, Richard L. (Technical Monitor)

    2002-01-01

    A part of aviation accident mitigation is a crashworthy airframe structure, and an important measure of merit for a crashworthy structure is the amount of kinetic energy that can be absorbed in the crush of the structure. Prediction of the energy absorbed from finite element analyses requires modeling the progressive failure sequence. Progressive failure modes may include material degradation, fracture and crack growth, and buckling and collapse. The design of crashworthy airframe components will benefit from progressive failure analyses that have been validated by tests. The subject of this research is the development of a progressive failure analysis for textile composite circumferential fuselage frames subjected to a quasi-static, crash-type load. The test data for these frames are reported, and these data, along with stub column test data, are to be used to develop and to validate methods for the progressive failure response.

  11. Numerical prediction of 3-D ejector flows

    NASA Technical Reports Server (NTRS)

    Roberts, D. W.; Paynter, G. C.

    1979-01-01

    The use of parametric flow analysis, rather than parametric scale testing, to support the design of an ejector system offers a number of potential advantages. The application of available 3-D flow analyses to the design of ejectors can be subdivided into several key elements. These are numerics, turbulence modeling, data handling and display, and testing in support of analysis development. Experimental and predicted jet exhaust flows for the Boeing 727 aircraft are examined.

  12. What Do Test Scores Really Mean? A Latent Class Analysis of Danish Test Score Performance

    ERIC Educational Resources Information Center

    McIntosh, James; Munk, Martin D.

    2014-01-01

    Latent class Poisson count models are used to analyse a sample of Danish test score results from a cohort of individuals born in 1954-1955, tested in 1968, and followed until 2011. The procedure takes account of unobservable effects as well as excessive zeros in the data. We show that the test scores measure manifest or measured ability as it has…

  13. Ares I-X Upper Stage Simulator Structural Analyses Supporting the NESC Critical Initial Flaw Size Assessment

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Phillips, Dawn R.; Raju, Ivatury S.

    2008-01-01

    The structural analyses described in the present report were performed in support of the NASA Engineering and Safety Center (NESC) Critical Initial Flaw Size (CIFS) assessment for the ARES I-X Upper Stage Simulator (USS) common shell segment. The structural analysis effort for the NESC assessment had three thrusts: shell buckling analyses; detailed stress analyses of the single-bolt joint test; and stress analyses of two-segment, 10-degree-wedge models for the peak axial tensile running load. Elasto-plastic, large-deformation simulations were performed. Stress analysis results indicated that the stress levels were well below the material yield stress for the bounding axial tensile design load. This report also summarizes the analyses and results from parametric studies on modeling the shell-to-gusset weld, flange-surface mismatch, bolt preload, and washer-bearing-surface modeling. These analysis models were used to generate the stress levels specified for the fatigue crack growth assessment using the design load with a factor of safety.

  14. The Fallibility of High Stakes "11-Plus" Testing in Northern Ireland

    ERIC Educational Resources Information Center

    Gardner, John; Cowan, Pamela

    2005-01-01

    This paper sets out the findings from a large-scale analysis of the Northern Ireland Transfer Procedure Tests, used to select pupils for grammar schools. As it was not possible to get completed test scripts from government agencies, over 3000 practice scripts were completed in simulated conditions and were analysed to establish whether the tests…

  15. Teacher Gender and Student Performance in Mathematics. Evidence from Catalonia (Spain)

    ERIC Educational Resources Information Center

    Escardíbul, Josep-Oriol; Mora, Toni

    2013-01-01

    This paper analyses the impact of teacher gender on students' results in a blinded Math test administered to students in Catalonia (Spain). The data for this analysis are drawn from a sample of secondary school students who participated in an international blind-test known as the "Mathematical Kangaroo" in 2008. The estimation…

  16. Item response theory analysis of the life orientation test-revised: age and gender differential item functioning analyses.

    PubMed

    Steca, Patrizia; Monzani, Dario; Greco, Andrea; Chiesi, Francesca; Primi, Caterina

    2015-06-01

    This study is aimed at testing the measurement properties of the Life Orientation Test-Revised (LOT-R) for the assessment of dispositional optimism by employing item response theory (IRT) analyses. The LOT-R was administered to a large sample of 2,862 Italian adults. First, confirmatory factor analyses demonstrated the theoretical conceptualization of the construct measured by the LOT-R as a single bipolar dimension. Subsequently, IRT analyses for polytomous, ordered response category data were applied to investigate the items' properties. The equivalence of the items across gender and age was assessed by analyzing differential item functioning. Discrimination and severity parameters indicated that all items were able to distinguish people with different levels of optimism and adequately covered the spectrum of the latent trait. Additionally, the LOT-R appears to be gender invariant and, with minor exceptions, age invariant. Results provided evidence that the LOT-R is a reliable and valid measure of dispositional optimism. © The Author(s) 2014.
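
    The polytomous IRT machinery referred to above can be illustrated with Samejima's graded response model, a common choice for ordered response categories. The sketch below computes category probabilities for one hypothetical five-category item; the discrimination and threshold values are assumptions for illustration, not the published LOT-R estimates.

      import numpy as np

      def grm_category_probs(theta, a, thresholds):
          # Samejima's graded response model: P(X = k | theta) for ordered categories,
          # given discrimination a and increasing boundary parameters b_1..b_{K-1}
          cum = [1.0] + [1 / (1 + np.exp(-a * (theta - b))) for b in thresholds] + [0.0]
          return np.diff(-np.array(cum))   # adjacent differences of P(X >= k)

      # Assumed (not published) parameters for one five-category optimism item
      probs = grm_category_probs(theta=0.5, a=1.8, thresholds=[-2.0, -0.7, 0.4, 1.6])
      print(probs, probs.sum())            # five category probabilities summing to 1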

  17. Placental growth factor (alone or in combination with soluble fms-like tyrosine kinase 1) as an aid to the assessment of women with suspected pre-eclampsia: systematic review and economic analysis.

    PubMed

    Frampton, Geoff K; Jones, Jeremy; Rose, Micah; Payne, Liz

    2016-11-01

    Pre-eclampsia (PE) prediction based on blood pressure, presence of protein in the urine, symptoms and laboratory test abnormalities can result in false-positive diagnoses. This may lead to unnecessary antenatal admissions and preterm delivery. Blood tests that measure placental growth factor (PlGF) or the ratio of soluble fms-like tyrosine kinase 1 (sFlt-1) to PlGF could aid prediction of PE if either were added to routine clinical assessment or used as a replacement for proteinuria testing. To evaluate the diagnostic accuracy and cost-effectiveness of PlGF-based tests for patients referred to secondary care with suspected PE in weeks 20-37 of pregnancy. Systematic reviews and an economic analysis. Bibliographic databases including MEDLINE, EMBASE, Web of Science and The Cochrane Library and Database of Abstracts of Reviews of Effects were searched up to July 2015 for English-language references. Conferences, websites, systematic reviews and confidential company submissions were also accessed. Systematic reviews of test accuracy and economic studies were conducted to inform an economic analysis. Test accuracy studies were required to include women with suspected PE and report quantitatively the accuracy of PlGF-based tests; their risk of bias was assessed using the Quality Assessment of Diagnostic Accuracy Studies (QUADAS) criteria. The economic studies review had broad eligibility criteria to capture any types of economic analysis; critical appraisal employed standard checklists consistent with National Institute for Health and Care Excellence criteria. Study selection, critical appraisal and data extraction in both reviews were performed by two reviewers. An independent economic analysis was conducted based on a decision tree model, using the best evidence available. The model evaluates costs (2014, GBP) from a NHS and Personal Social Services perspective. Given the short analysis time horizon, no discounting was undertaken. Four studies were included in the systematic review of test accuracy: two on Alere's Triage ® PlGF test (Alere, Inc., San Diego, CA, USA) for predicting PE requiring delivery within a specified time and two on Roche Diagnostics' Elecsys ® sFlt-1 to PlGF ratio test (Roche Diagnostics GmbH, Mannheim, Germany) for predicting PE within a specified time. Three studies were included in the systematic review of economic studies, and two confidential company economic analyses were assessed separately. Study heterogeneity precluded meta-analyses of test accuracy or cost-analysis outcomes, so narrative syntheses were conducted to inform the independent economic model. The model predicts that, when supplementing routine clinical assessment for rule-out and rule-in of PE, the two tests would be cost-saving in weeks 20-35 of gestation, and marginally cost-saving in weeks 35-37, but with minuscule impact on quality of life. Length of neonatal intensive care unit stay was the most influential parameter in sensitivity analyses. All other sensitivity analyses had negligible effects on results. No head-to-head comparisons of the tests were identified. No studies investigated accuracy of PlGF-based tests when used as a replacement for proteinuria testing. Test accuracy studies were found to be at high risk of clinical review bias. The Triage and Elecsys tests would save money if added to routine clinical assessment for PE. The magnitude of savings is uncertain, but the tests remain cost-saving under worst-case assumptions. 
Further research is required to clarify how the test results would be interpreted and applied in clinical practice. This study is registered as PROSPERO CRD42015017670. The National Institute for Health Research Health Technology Assessment programme.
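
    The report's decision tree is not reproduced here, but the general mechanics of such a model, rolling test accuracy and downstream costs back to an expected cost per patient, can be sketched as follows; every number (prevalence, sensitivity, specificity, costs) is an invented placeholder rather than a value from the economic analysis.

      def expected_cost(prevalence, sensitivity, specificity,
                        cost_test, cost_tp, cost_fn, cost_fp, cost_tn):
          # Roll a one-stage diagnostic decision tree back to an expected cost per patient
          p_tp = prevalence * sensitivity
          p_fn = prevalence * (1 - sensitivity)
          p_fp = (1 - prevalence) * (1 - specificity)
          p_tn = (1 - prevalence) * specificity
          return cost_test + p_tp*cost_tp + p_fn*cost_fn + p_fp*cost_fp + p_tn*cost_tn

      # Placeholder figures only: add-on test versus clinical assessment alone
      with_test = expected_cost(0.2, 0.95, 0.90, 50, 3000, 9000, 1500, 200)
      without   = expected_cost(0.2, 0.75, 0.70,  0, 3000, 9000, 1500, 200)
      print(with_test, without, with_test - without)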

  18. GenAlEx 6.5: genetic analysis in Excel. Population genetic software for teaching and research--an update.

    PubMed

    Peakall, Rod; Smouse, Peter E

    2012-10-01

    GenAlEx: Genetic Analysis in Excel is a cross-platform package for population genetic analyses that runs within Microsoft Excel. GenAlEx offers analysis of diploid codominant, haploid and binary genetic loci and DNA sequences. Both frequency-based (F-statistics, heterozygosity, HWE, population assignment, relatedness) and distance-based (AMOVA, PCoA, Mantel tests, multivariate spatial autocorrelation) analyses are provided. New features include calculation of new estimators of population structure: G'(ST), G''(ST), Jost's D(est) and F'(ST) through AMOVA, Shannon Information analysis, linkage disequilibrium analysis for biallelic data and novel heterogeneity tests for spatial autocorrelation analysis. Export to more than 30 other data formats is provided. Teaching tutorials and expanded step-by-step output options are included. The comprehensive guide has been fully revised. GenAlEx is written in VBA and provided as a Microsoft Excel Add-in (compatible with Excel 2003, 2007, 2010 on PC; Excel 2004, 2011 on Macintosh). GenAlEx, and supporting documentation and tutorials are freely available at: http://biology.anu.edu.au/GenAlEx. rod.peakall@anu.edu.au.

  19. Full-scale testing, production and cost analysis data for the advanced composite stabilizer for Boeing 737 aircraft. Volume 1: Technical summary

    NASA Technical Reports Server (NTRS)

    Aniversario, R. B.; Harvey, S. T.; Mccarty, J. E.; Parsons, J. T.; Peterson, D. C.; Pritchett, L. D.; Wilson, D. R.; Wogulis, E. R.

    1983-01-01

    The full scale ground test, ground vibration test, and flight tests conducted to demonstrate a composite structure stabilizer for the Boeing 737 aircraft and obtain FAA certification are described. Detail tools, assembly tools, and overall production are discussed. Cost analyses aspects covered include production costs, composite material usage factors, and cost comparisons.

  20. Refinement of Reading and Mathematics Test Through an Analysis of Reactivity. Beginning Teacher Evaluation Study. Technical Report Series. Technical Report III-6.

    ERIC Educational Resources Information Center

    Filby, Nikola N.; Dishaw, Marilyn

    Major analyses of the achievement tests used in the Beginning Teacher Evaluation Study were conducted to determine test reactivity to instruction. Reading and mathematics tests were administered to second and fifth grade children. Classroom teachers' records were examined to determine the amount of opportunity students had to learn the content…

  1. Statistical power analysis in wildlife research

    USGS Publications Warehouse

    Steidl, R.J.; Hayes, J.P.

    1997-01-01

    Statistical power analysis can be used to increase the efficiency of research efforts and to clarify research results. Power analysis is most valuable in the design or planning phases of research efforts. Such prospective (a priori) power analyses can be used to guide research design and to estimate the number of samples necessary to achieve a high probability of detecting biologically significant effects. Retrospective (a posteriori) power analysis has been advocated as a method to increase information about hypothesis tests that were not rejected. However, estimating power for tests of null hypotheses that were not rejected with the effect size observed in the study is incorrect; these power estimates will always be ≤0.50 when bias adjusted and have no relation to true power. Therefore, retrospective power estimates based on the observed effect size for hypothesis tests that were not rejected are misleading; retrospective power estimates are only meaningful when based on effect sizes other than the observed effect size, such as those effect sizes hypothesized to be biologically significant. Retrospective power analysis can be used effectively to estimate the number of samples or effect size that would have been necessary for a completed study to have rejected a specific null hypothesis. Simply presenting confidence intervals can provide additional information about null hypotheses that were not rejected, including information about the size of the true effect and whether or not there is adequate evidence to 'accept' a null hypothesis as true. We suggest that (1) statistical power analyses be routinely incorporated into research planning efforts to increase their efficiency, (2) confidence intervals be used in lieu of retrospective power analyses for null hypotheses that were not rejected to assess the likely size of the true effect, (3) minimum biologically significant effect sizes be used for all power analyses, and (4) if retrospective power estimates are to be reported, then the α-level, effect sizes, and sample sizes used in calculations must also be reported.
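
    A prospective power analysis of the kind recommended in point (1) can be run with standard software; the sketch below uses statsmodels to solve for the per-group sample size needed to detect an assumed minimum biologically significant effect, and then for the effect size a completed study of a given size could have detected. The effect size, power, and sample figures are illustrative assumptions.

      from statsmodels.stats.power import TTestIndPower

      power = TTestIndPower()

      # Prospective (a priori) question: sample size per group needed to detect an
      # assumed minimum biologically significant standardized effect of d = 0.5
      n_per_group = power.solve_power(effect_size=0.5, power=0.8, alpha=0.05)

      # Legitimate retrospective question: what effect size could a completed study
      # with 30 samples per group have detected with power 0.8?
      detectable = power.solve_power(nobs1=30, power=0.8, alpha=0.05)
      print(round(n_per_group, 1), round(detectable, 2))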

  2. 27 CFR 22.101 - Authorized uses.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... use exclusively in scientific research; (c) For use at any hospital, blood bank, or sanitarium (including use in making any analysis or test at a hospital, blood bank, or sanitarium), or at any pathological laboratory exclusively engage in making analyses, or test, for hospitals or sanitariums; or (d...

  3. The self-description inventory+, part 1 : factor structure and convergent validity analyses.

    DOT National Transportation Integrated Search

    2013-07-01

    Each year the FAA hires approximately 900 new air traffic controller candidates, the majority of whom take the Air Traffic Selection and Training test battery, better known as AT-SAT. This test, developed in 1997, is based on a job/task analysis cond...

  4. Factor Analysis of the WAIS and Twenty French-Kit Reference Tests.

    ERIC Educational Resources Information Center

    Ramsey, Philip H.

    1979-01-01

    The Wechsler Adult Intelligence Scale (WAIS) and 20 tests from the French Kit were administered to over 100 undergraduates. Analyses revealed ten factors: verbal comprehension, visualization, memory span, syllogistic reasoning, general reasoning, induction, mechanical knowledge, number facility, spatial orientation, and associative memory.…

  5. 40 CFR 85.2120 - Maintenance and submittal of records.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... testing program, including all production part sampling techniques used to verify compliance of the... subsequent analyses of that data; (7) A description of all the methodology, analysis, testing and/or sampling techniques used to ascertain the emission critical parameter specifications of the original equipment part...

  6. 7 CFR 91.19 - General requirements of suitable samples.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... the analyses requested. (b) Each sample must be identified with the following information: (1) Product... other information which is required by the specific program under which analysis or test is performed. ... LABORATORY TESTING PROGRAMS SERVICES AND GENERAL INFORMATION Samples § 91.19 General requirements of suitable...

  7. 7 CFR 91.19 - General requirements of suitable samples.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... the analyses requested. (b) Each sample must be identified with the following information: (1) Product... other information which is required by the specific program under which analysis or test is performed. ... LABORATORY TESTING PROGRAMS SERVICES AND GENERAL INFORMATION Samples § 91.19 General requirements of suitable...

  8. 7 CFR 91.19 - General requirements of suitable samples.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... the analyses requested. (b) Each sample must be identified with the following information: (1) Product... other information which is required by the specific program under which analysis or test is performed. ... LABORATORY TESTING PROGRAMS SERVICES AND GENERAL INFORMATION Samples § 91.19 General requirements of suitable...

  9. 7 CFR 91.19 - General requirements of suitable samples.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... the analyses requested. (b) Each sample must be identified with the following information: (1) Product... other information which is required by the specific program under which analysis or test is performed. ... LABORATORY TESTING PROGRAMS SERVICES AND GENERAL INFORMATION Samples § 91.19 General requirements of suitable...

  10. 7 CFR 91.19 - General requirements of suitable samples.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... the analyses requested. (b) Each sample must be identified with the following information: (1) Product... other information which is required by the specific program under which analysis or test is performed. ... LABORATORY TESTING PROGRAMS SERVICES AND GENERAL INFORMATION Samples § 91.19 General requirements of suitable...

  11. Validation of contractor HMA testing data in the materials acceptance process - phase II : final report.

    DOT National Transportation Integrated Search

    2016-08-01

    This study conducted an analysis of the SCDOT HMA specification. A Research Steering Committee provided oversight of the process. The research process included extensive statistical analyses of test data supplied by SCDOT. A total of 2,789 AC tes...

  12. HIV testing of pregnant women: an ethical analysis.

    PubMed

    Johansson, Kjell Arne; Pedersen, Kirsten Bjerkreim; Andersson, Anna-Karin

    2011-12-01

    Recent global advances in available technology to prevent mother-to-child HIV transmission necessitate a rethinking of contemporary and previous ethical debates on HIV testing as a means to preventing vertical transmission. In this paper, we will provide an ethical analysis of HIV-testing strategies of pregnant women. First, we argue that provider-initiated opt-out HIV testing seems to be the most effective HIV test strategy. The flip-side of an opt-out strategy is that it may end up as involuntary testing in a clinical setting. We analyse this ethical puzzle from a novel perspective, taking into account the moral importance of certain hypothetical preferences of the child, as well as the moral importance of certain actual preferences of the mother. Finally, we balance the conflicting concerns and try to arrive at an ethically sound solution to this dilemma. Our aim is to introduce a novel perspective from which to analyse testing strategies, and to explore the implications and possible benefits of our proposal. The conclusion from our analysis is that policies that recommend provider-initiated opt-out HIV testing of pregnant mothers, with a risk of becoming involuntary testing in a clinical setting, are acceptable. The rationale behind this is that the increased availability of very effective and inexpensive life-saving drugs makes the ethical problems raised by the possible intrusiveness of HIV testing less important than the child's hypothetical preferences to be born healthy. Health care providers, therefore, have a duty to offer both opt-out HIV testing and available PMTCT (preventing mother-to-child transmission) interventions. © 2011 Blackwell Publishing Ltd.

  13. Studies on geotechnical properties of subsoil in south east coastal region of India

    NASA Astrophysics Data System (ADS)

    Dutta, Susom; Barik, D. K.

    2017-11-01

    Soil testing and analysis has become essential before the commencement of any activity or process on soil, e.g. residential construction, road construction, etc. It is especially important in coastal areas, as these areas are more vulnerable to natural disasters such as tsunamis and cyclones. In India, there is a lack of facilities to collect and analyse soil from the field. Hence, to study the various characteristics of coastal-region subsoil, the Old Mahabalipuram area, in the south-east region of India, was chosen for this study. The aim of this study is to collect and analyse soil samples from various localities of the Old Mahabalipuram area. The analysed soil data will be helpful for practitioners working in geotechnical engineering in the coastal regions of India when making decisions. The soil samples collected from different boreholes underwent various field and laboratory tests, such as the Pressuremeter Test, Field Permeability Test, Electrical Resistivity Test, Standard Penetration Test, Shear Test, and Atterberg Limits, as well as rock tests, to determine the geotechnical properties of the soil for each stratum.

  14. Pretest predictions of the Fast Flux Test Facility Passive Safety Test Phase IIB transients using United States derived computer codes and methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heard, F.J.; Harris, R.A.; Padilla, A.

    The SASSYS/SAS4A systems analysis code was used to simulate a series of unprotected loss of flow (ULOF) tests planned at the Fast Flux Test Facility (FFTF). The subject tests were designed to investigate the transient performance of the FFTF during various ULOF scenarios for two different loading patterns designed to produce extremes in the assembly load pad clearance and the direction of the initial assembly bows. The tests are part of an international program designed to extend the existing data base on the performance of liquid metal reactors (LMR). The analyses demonstrate that a wide range of power-to-flow ratios can be reached during the transients and, therefore, will yield valuable data on the dynamic character of the structural feedbacks in LMRs. These analyses will be repeated once the actual FFTF core loadings for the tests are available. These predictions, similar ones obtained by other international participants in the FFTF program, and post-test analyses will be used to upgrade and further verify the computer codes used to predict the behavior of LMRs.

  15. Age differences in the effect of framing on risky choice: A meta-analysis

    PubMed Central

    Best, Ryan; Charness, Neil

    2015-01-01

    The framing of decision scenarios in terms of potential gains versus losses has been shown to influence choice preferences between sure and risky options. Normative cognitive changes associated with aging have been known to affect decision-making, which has led to a number of studies investigating the influence of aging on the effect of framing. Mata, Josef, Samanez-Larkin, and Hertwig (2011) systematically reviewed the available literature using a meta-analytic approach, but did not include tests of homogeneity nor subsequent moderator variable analyses. The current review serves to extend the previous analysis to include such tests as well as update the pool of studies available for analysis. Results for both positively and negatively framed conditions were reviewed using two meta-analyses encompassing data collected from 3,232 subjects across 18 studies. Deviating from the previous results, the current analysis finds a tendency for younger adults to choose the risky option more often than older adults for positively framed items. Moderator variable analyses find this effect to likely be driven by the specific decision scenario, showing a significant effect with younger adults choosing the risky option more often in small-amount financial and large-amount mortality-based scenarios. For negatively framed items, the current review found no overall age difference in risky decision making, confirming the results from the prior meta-analysis. Moderator variable analyses conducted to address heterogeneity found younger adults to be more likely than older adults to choose the risky option for negatively framed high-amount mortality-based decision scenarios. Practical implications for older adults are discussed. PMID:26098168

  16. Age differences in the effect of framing on risky choice: A meta-analysis.

    PubMed

    Best, Ryan; Charness, Neil

    2015-09-01

    The framing of decision scenarios in terms of potential gains versus losses has been shown to influence choice preferences between sure and risky options. Normative cognitive changes associated with aging have been known to affect decision making, which has led to a number of studies investigating the influence of aging on the effect of framing. Mata, Josef, Samanez-Larkin, and Hertwig (2011) systematically reviewed the available literature using a meta-analytic approach, but did not include tests of homogeneity or subsequent moderator variable analyses. The current review serves to extend the previous analysis to include such tests as well as update the pool of studies available for analysis. Results for both positively and negatively framed conditions were reviewed using 2 meta-analyses encompassing data collected from 3,232 subjects across 18 studies. Deviating from the previous results, the current analysis found a tendency for younger adults to choose the risky option more often than older adults for positively framed items. Moderator variable analyses found this effect likely to be driven by the specific decision scenario, showing a significant effect, with younger adults choosing the risky option more often in small-amount financial and large-amount mortality-based scenarios. For negatively framed items, the current review found no overall age difference in risky decision making, confirming the results from the prior meta-analysis. Moderator variable analyses conducted to address heterogeneity found younger adults to be more likely than older adults to choose the risky option for negatively framed high-amount mortality-based decision scenarios. Practical implications for older adults are discussed. (c) 2015 APA, all rights reserved).
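
    The pooling step behind such a meta-analysis can be sketched with the DerSimonian-Laird random-effects estimator, which also yields Cochran's Q for the heterogeneity test discussed above. The study-level effect sizes and variances below are invented placeholders, not the 18 framing studies.

      import numpy as np

      def dersimonian_laird(effects, variances):
          # Random-effects pooled estimate with DerSimonian-Laird between-study variance
          effects, variances = np.asarray(effects, float), np.asarray(variances, float)
          w = 1 / variances                                  # fixed-effect weights
          fixed = np.sum(w * effects) / np.sum(w)
          q = np.sum(w * (effects - fixed) ** 2)             # Cochran's Q (homogeneity test)
          c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
          tau2 = max(0.0, (q - (len(effects) - 1)) / c)      # between-study variance
          w_re = 1 / (variances + tau2)
          pooled = np.sum(w_re * effects) / np.sum(w_re)
          return pooled, np.sqrt(1 / np.sum(w_re)), tau2, q

      # Invented study-level effect sizes (risky-choice differences) and variances
      print(dersimonian_laird([0.30, 0.12, 0.45, -0.05], [0.02, 0.05, 0.03, 0.04]))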

  17. Application of conditional moment tests to model checking for generalized linear models.

    PubMed

    Pan, Wei

    2002-06-01

    Generalized linear models (GLMs) are increasingly being used in daily data analysis. However, model checking for GLMs with correlated discrete response data remains difficult. In this paper, through a case study on marginal logistic regression using a real data set, we illustrate the flexibility and effectiveness of using conditional moment tests (CMTs), along with other graphical methods, to do model checking for generalized estimating equation (GEE) analyses. Although CMTs provide an array of powerful diagnostic tests for model checking, they were originally proposed in the econometrics literature and, to our knowledge, have never been applied to GEE analyses. CMTs cover many existing tests, including the (generalized) score test for an omitted covariate, as special cases. In summary, we believe that CMTs provide a class of useful model-checking tools.
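
    One of the special cases named above, the score test for an omitted covariate, can be written out directly for an ordinary (independent-observations) logistic regression. The sketch below uses simulated data and does not attempt the GEE version from the case study; the variable names and simulated coefficients are assumptions.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      n = 500
      x1, x2 = rng.normal(size=n), rng.normal(size=n)
      y = rng.binomial(1, 1 / (1 + np.exp(-(0.3 + 0.8 * x1 + 0.5 * x2))))

      # Fit the reduced logistic model that omits x2
      X_r = sm.add_constant(x1)
      fit = sm.Logit(y, X_r).fit(disp=0)
      mu = fit.predict(X_r)                     # fitted probabilities under the reduced model
      W = mu * (1 - mu)

      # Rao score statistic for adding x2, evaluated at the reduced-model estimates
      u = x2 @ (y - mu)
      v = (x2 * W) @ x2 - ((x2 * W) @ X_r) @ np.linalg.solve(
          (X_r * W[:, None]).T @ X_r, (X_r * W[:, None]).T @ x2)
      print("score statistic =", u ** 2 / v)    # compare against chi-square(1)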

  18. Making the Hubble Space Telescope servicing mission safe

    NASA Technical Reports Server (NTRS)

    Bahr, N. J.; Depalo, S. V.

    1992-01-01

    The implementation of the HST system safety program is detailed. Numerous safety analyses are conducted through various phases of design, test, and fabrication, and results are presented to NASA management for discussion during dedicated safety reviews. Attention is given to the system safety assessment and risk analysis methodologies used, i.e., hazard analysis, fault tree analysis, and failure modes and effects analysis, and to how they are coupled with engineering and test analysis for a 'synergistic picture' of the system. Some preliminary safety analysis results, showing the relationship between hazard identification, control or abatement, and finally control verification, are presented as examples of this safety process.

  19. Chemical composition analysis and product consistency tests to support Enhanced Hanford Waste Glass Models. Results for the August and October 2014 LAW Glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fox, K. M.; Edwards, T. B.; Best, D. R.

    2015-07-07

    In this report, the Savannah River National Laboratory provides chemical analyses and Product Consistency Test (PCT) results for several simulated low activity waste (LAW) glasses (designated as the August and October 2014 LAW glasses) fabricated by the Pacific Northwest National Laboratory. The results of these analyses will be used as part of efforts to revise or extend the validation regions of the current Hanford Waste Treatment and Immobilization Plant glass property models to cover a broader span of waste compositions.

  20. Maps showing mines, quarries, and prospects, with analyses of samples, Gee Creek Wilderness, Polk and Monroe counties, Tennessee

    USGS Publications Warehouse

    Gazdik, Gertrude C.; Behum, Paul T.

    1983-01-01

    During the recent U.S. Bureau of Mines field investigation, 21 samples were collected (fig. 2) and were submitted to the Bureau's Reno Metallurgy Research Center, Reno, Nev., for analysis. All samples were tested for 40 elements by semiquantitative spectrographic analyses. Additional testing by atomic absorption, neutron activation, and wet chemical techniques was performed for selected elements on some samples. Two shale samples were submitted to the Bureau of Mines, Tuscaloosa Metallurgy Research Center, Tuscaloosa, Ala., for the evaluation of ceramic properties. 

  1. 77 FR 2340 - Occupational Information Development Advisory Panel

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-17

    ... testing of an OIS content model and taxonomy, work analysis instrumentation, sampling, and data collection... economics, sampling, data collection and analyses; (b) disability evaluation, vocational rehabilitation...

  2. Airbags to Martian Landers: Analyses at Sandia National Laboratories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gwinn, K.W.

    1994-03-01

    A new direction for the national laboratories is to assist US business with research and development, primarily through cooperative research and development agreements (CRADAs). Technology transfer to the private sector has been very successful, as over 200 CRADAs are in place at Sandia. Because of these cooperative efforts, technology has evolved into some new areas not commonly associated with the former mission of the national laboratories. An example of this is the analysis of fabric structures. Explicit analyses and expertise in constructing parachutes led to the development of a next-generation automobile airbag; which led to the construction, testing, and analysis of the Jet Propulsion Laboratory Mars Environmental Survey Lander; and finally led to the development of CAD-based custom garment designs using 3D scanned images of the human body. The structural analysis of these fabric structures is described, as well as a more traditional Sandia example, the test/analysis correlation of the impact of a weapon container.

  3. Recent developments in deployment analysis simulation using a multi-body computer code

    NASA Technical Reports Server (NTRS)

    Housner, Jerrold M.

    1989-01-01

    Deployment is a candidate mode for construction of structural space systems components. By its very nature, deployment is a dynamic event, often involving large-angle unfolding of flexible beam members. Validation of proposed designs and conceptual deployment mechanisms is enhanced through analysis. Analysis may be used to determine member loads, thus helping to establish deployment rates and deployment control requirements for a given concept. Furthermore, member flexibility, joint free-play, manufacturing tolerances, and imperfections can affect the reliability of deployment. Analyses which include these effects can aid in reducing the risks associated with a particular concept. Ground tests, which can play a similar role to analyses, are difficult and expensive to perform. Suspension systems just for vibration ground tests of large space structures in a 1 g environment present many challenges. Suspension of a structure which spatially expands is even more challenging. Analysis validation through experimental confirmation on relatively small, simple models would permit analytical extrapolation to larger, more complex space structures.

  4. Integration of Engine, Plume, and CFD Analyses in Conceptual Design of Low-Boom Supersonic Aircraft

    NASA Technical Reports Server (NTRS)

    Li, Wu; Campbell, Richard; Geiselhart, Karl; Shields, Elwood; Nayani, Sudheer; Shenoy, Rajiv

    2009-01-01

    This paper documents an integration of engine, plume, and computational fluid dynamics (CFD) analyses in the conceptual design of low-boom supersonic aircraft, using a variable fidelity approach. In particular, the Numerical Propulsion Simulation System (NPSS) is used for propulsion system cycle analysis and nacelle outer mold line definition, and a low-fidelity plume model is developed for plume shape prediction based on NPSS engine data and nacelle geometry. This model provides a capability for the conceptual design of low-boom supersonic aircraft that accounts for plume effects. Then a newly developed process for automated CFD analysis is presented for CFD-based plume and boom analyses of the conceptual geometry. Five test cases are used to demonstrate the integrated engine, plume, and CFD analysis process based on a variable fidelity approach, as well as the feasibility of the automated CFD plume and boom analysis capability.

  5. Bioconductor Workflow for Microbiome Data Analysis: from raw reads to community analyses

    PubMed Central

    Callahan, Ben J.; Sankaran, Kris; Fukuyama, Julia A.; McMurdie, Paul J.; Holmes, Susan P.

    2016-01-01

    High-throughput sequencing of PCR-amplified taxonomic markers (like the 16S rRNA gene) has enabled a new level of analysis of complex bacterial communities known as microbiomes. Many tools exist to quantify and compare abundance levels or OTU composition of communities in different conditions. The sequencing reads have to be denoised and assigned to the closest taxa from a reference database. Common approaches use a notion of 97% similarity and normalize the data by subsampling to equalize library sizes. In this paper, we show that statistical models allow more accurate abundance estimates. By providing a complete workflow in R, we enable the user to do sophisticated downstream statistical analyses, whether parametric or nonparametric. We provide examples of using the R packages dada2, phyloseq, DESeq2, ggplot2 and vegan to filter, visualize and test microbiome data. We also provide examples of supervised analyses using random forests and nonparametric testing using community networks and the ggnetwork package. PMID:27508062

  6. A Standard Platform for Testing and Comparison of MDAO Architectures

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Moore, Kenneth T.; Hearn, Tristan A.; Naylor, Bret A.

    2012-01-01

    The Multidisciplinary Design Analysis and Optimization (MDAO) community has developed a multitude of algorithms and techniques, called architectures, for performing optimizations on complex engineering systems which involve coupling between multiple discipline analyses. These architectures seek to efficiently handle optimizations with computationally expensive analyses including multiple disciplines. We propose a new testing procedure that can provide a quantitative and qualitative means of comparison among architectures. The proposed test procedure is implemented within the open source framework, OpenMDAO, and comparative results are presented for five well-known architectures: MDF, IDF, CO, BLISS, and BLISS-2000. We also demonstrate how using open source software development methods can allow the MDAO community to submit new problems and architectures to keep the test suite relevant.

  7. TRAC analyses for CCTF and SCTF tests and UPTF design/operation. [Cylindrical Core Test Facility; Slab Core Test Facility; Upper Plenum Test Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spore, J.W.; Cappiello, M.W.; Dotson, P.J.

    The analytical support in 1985 for Cylindrical Core Test Facility (CCTF), Slab Core Test Facility (SCTF), and Upper Plenum Test Facility (UPTF) tests involves the posttest analysis of 16 tests that have already been run in the CCTF and the SCTF and the pretest analysis of 3 tests to be performed in the UPTF. Posttest analysis is used to provide insight into the detailed thermal-hydraulic phenomena occurring during the refill and reflood tests performed in CCTF and SCTF. Pretest analysis is used to ensure that the test facility is operated in a manner consistent with the expected behavior of an operating full-scale plant during an accident. To obtain expected behavior of a plant during an accident, two plant loss-of-coolant-accident (LOCA) calculations were performed: a 200% cold-leg-break LOCA calculation for a 2772 MW(t) Babcock and Wilcox plant and a 200% cold-leg-break LOCA calculation for a 3315 MW(t) Westinghouse plant. Detailed results are presented for several CCTF UPI tests and the Westinghouse plant analysis.

  8. Development of Test-Analysis Models (TAM) for correlation of dynamic test and analysis results

    NASA Technical Reports Server (NTRS)

    Angelucci, Filippo; Javeed, Mehzad; Mcgowan, Paul

    1992-01-01

    The primary objective of structural analysis of aerospace applications is to obtain a verified finite element model (FEM). The verified FEM can be used for loads analysis, evaluate structural modifications, or design control systems. Verification of the FEM is generally obtained as the result of correlating test and FEM models. A test analysis model (TAM) is very useful in the correlation process. A TAM is essentially a FEM reduced to the size of the test model, which attempts to preserve the dynamic characteristics of the original FEM in the analysis range of interest. Numerous methods for generating TAMs have been developed in the literature. The major emphasis of this paper is a description of the procedures necessary for creation of the TAM and the correlation of the reduced models with the FEM or the test results. Herein, three methods are discussed, namely Guyan, Improved Reduced System (IRS), and Hybrid. Also included are the procedures for performing these analyses using MSC/NASTRAN. Finally, application of the TAM process is demonstrated with an experimental test configuration of a ten bay cantilevered truss structure.
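
    The Guyan reduction mentioned above is a static condensation of the stiffness and mass matrices onto a set of retained (test) degrees of freedom. A minimal numpy sketch on a toy spring-mass chain, not a real finite element model or the MSC/NASTRAN procedure, is:

      import numpy as np

      def guyan_reduce(K, M, master):
          # Static (Guyan) condensation of stiffness K and mass M onto the master DOF
          master = np.asarray(master)
          omitted = np.setdiff1d(np.arange(K.shape[0]), master)
          Koo = K[np.ix_(omitted, omitted)]
          Kom = K[np.ix_(omitted, master)]
          # Omitted DOF follow the masters statically: x_o = -Koo^-1 Kom x_m
          T = np.vstack([np.eye(len(master)), -np.linalg.solve(Koo, Kom)])
          order = np.concatenate([master, omitted])
          Kp, Mp = K[np.ix_(order, order)], M[np.ix_(order, order)]
          return T.T @ Kp @ T, T.T @ Mp @ T     # TAM-sized stiffness and mass

      # Toy 4-DOF spring-mass chain; keep DOF 0 and 2 as the "test" (master) DOF
      k = 1000.0
      K = k * np.array([[ 2., -1.,  0.,  0.],
                        [-1.,  2., -1.,  0.],
                        [ 0., -1.,  2., -1.],
                        [ 0.,  0., -1.,  1.]])
      M = np.eye(4)
      Kr, Mr = guyan_reduce(K, M, master=[0, 2])
      print(Kr, Mr, sep="\n")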

  9. Geoengineering properties of potential repository units at Yucca Mountain, southern Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tillerson, J.R.; Nimick, F.B.

    1984-12-01

    The Nevada Nuclear Waste Storage Investigations (NNWSI) Project is currently evaluating volcanic tuffs at the Yucca Mountain site, located on and adjacent to the Nevada Test Site, for possible use as a host rock for a radioactive waste repository. The behavior of tuff as an engineering material must be understood to design, license, construct, and operate a repository. Geoengineering evaluations and measurements are being made to develop confidence in both the analysis techniques for thermal, mechanical, and hydrothermal effects and the supporting data base of rock properties. The analysis techniques and the data base are currently used for repository design, waste package design, and performance assessment analyses. This report documents the data base of geoengineering properties used in the analyses that aided the selection of the waste emplacement horizon and in analyses synopsized in the Environmental Assessment Report prepared for the Yucca Mountain site. The strategy used for the development of the data base relies primarily on data obtained in laboratory tests that are then confirmed in field tests. Average thermal and mechanical properties (and their anticipated variations) are presented. Based upon these data, analyses completed to date, and previous excavation experience in tuff, it is anticipated that existing mining technology can be used to develop stable underground openings and that repository operations can be carried out safely.

  10. Analysis and testing of stability augmentation systems. [for supersonic transport aircraft wing and B-52 aircraft control system

    NASA Technical Reports Server (NTRS)

    Sevart, F. D.; Patel, S. M.; Wattman, W. J.

    1972-01-01

    Testing and evaluation of stability augmentation systems for aircraft flight control were conducted. The flutter suppression system analysis of a scale supersonic transport wing model is described. Mechanization of the flutter suppression system is reported. The ride control synthesis for the B-52 aeroelastic model is discussed. Model analyses were conducted using equations of motion generated from generalized mass and stiffness data.

  11. What Does the Cognitive Assessment System (CAS) Measure? Joint Confirmatory Factor Analysis of the CAS and the Woodcock-Johnson Tests of Cognitive Ability (3rd Edition).

    ERIC Educational Resources Information Center

    Keith, Timothy Z.; Kranzler, John H.; Flanagan, Dawn P.

    2001-01-01

    Reports the results of the first joint confirmatory factor analysis (CFA) of the Cognitive Assessment System (CAS) and the Woodcock-Johnson Tests of Cognitive Abilities-3rd Edition (WJ III). Results of these analyses do not support the construct validity of the CAS as a measure of the PASS (planning, attention, simultaneous, and sequential)…

  12. Quality assurance and quality control for thermal/optical analysis of aerosol samples for organic and elemental carbon.

    PubMed

    Chow, Judith C; Watson, John G; Robles, Jerome; Wang, Xiaoliang; Chen, L-W Antony; Trimble, Dana L; Kohl, Steven D; Tropp, Richard J; Fung, Kochy K

    2011-12-01

    Accurate, precise, and valid organic and elemental carbon (OC and EC, respectively) measurements require more effort than the routine analysis of ambient aerosol and source samples. This paper documents the quality assurance (QA) and quality control (QC) procedures that should be implemented to ensure consistency of OC and EC measurements. Prior to field sampling, the appropriate filter substrate must be selected and tested for sampling effectiveness. Unexposed filters are pre-fired to remove contaminants and acceptance tested. After sampling, filters must be stored in the laboratory in clean, labeled containers under refrigeration (<4 °C) to minimize loss of semi-volatile OC. QA activities include participation in laboratory accreditation programs, external system audits, and interlaboratory comparisons. For thermal/optical carbon analyses, periodic QC tests include calibration of the flame ionization detector with different types of carbon standards, thermogram inspection, replicate analyses, quantification of trace oxygen concentrations (<100 ppmv) in the helium atmosphere, and calibration of the sample temperature sensor. These established QA/QC procedures are applicable to aerosol sampling and analysis for carbon and other chemical components.

  13. Relationship of Selected Abilities to Problem Solving Performance.

    ERIC Educational Resources Information Center

    Harmel, Sarah Jane

    This study investigated five ability tests related to the water-jug problem. Previous analyses identified two processes used during solution: means-ends analysis and memory of visited states. Subjects were 240 undergraduate psychology students. A real-time computer system presented the problem and recorded responses. Ability tests were paper and…

  14. 27 CFR 1.60 - Use of distilled spirits.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... laboratory for use exclusively in scientific research; (3) For use at any hospital, blood bank, or sanitarium (including use in making analysis or test at such hospital, blood bank, or sanitarium), or at any pathological laboratory exclusively engaged in making analyses, or tests, for hospitals or sanitariums; or (4...

  15. Commognitive Analysis of Undergraduate Mathematics Students' First Encounter with the Subgroup Test

    ERIC Educational Resources Information Center

    Ioannou, Marios

    2018-01-01

    This study analyses learning aspects of undergraduate mathematics students' first encounter with the subgroup test, using the commognitive theoretical framework. It focuses on students' difficulties as these are related to the object-level and metalevel mathematical learning in group theory, and, when possible, highlights any commognitive…

  16. Toward the Replacement of Animal Experiments through the Bioinformatics-driven Analysis of 'Omics' Data from Human Cell Cultures.

    PubMed

    Grafström, Roland C; Nymark, Penny; Hongisto, Vesa; Spjuth, Ola; Ceder, Rebecca; Willighagen, Egon; Hardy, Barry; Kaski, Samuel; Kohonen, Pekka

    2015-11-01

    This paper outlines the work for which Roland Grafström and Pekka Kohonen were awarded the 2014 Lush Science Prize. The research activities of the Grafström laboratory have, for many years, covered cancer biology studies, as well as the development and application of toxicity-predictive in vitro models to determine chemical safety. Through the integration of in silico analyses of diverse types of genomics data (transcriptomic and proteomic), their efforts have proved to fit well into the recently-developed Adverse Outcome Pathway paradigm. Genomics analysis within state-of-the-art cancer biology research and Toxicology in the 21st Century concepts share many technological tools. A key category within the Three Rs paradigm is the Replacement of animals in toxicity testing with alternative methods, such as bioinformatics-driven analyses of data obtained from human cell cultures exposed to diverse toxicants. This work was recently expanded within the pan-European SEURAT-1 project (Safety Evaluation Ultimately Replacing Animal Testing), to replace repeat-dose toxicity testing with data-rich analyses of sophisticated cell culture models. The aims and objectives of the SEURAT project have been to guide the application, analysis, interpretation and storage of 'omics' technology-derived data within the service-oriented sub-project, ToxBank. Particularly addressing the Lush Science Prize focus on the relevance of toxicity pathways, a 'data warehouse' that is under continuous expansion, coupled with the development of novel data storage and management methods for toxicology, serve to address data integration across multiple 'omics' technologies. The prize winners' guiding principles and concepts for modern knowledge management of toxicological data are summarised. The translation of basic discovery results ranged from chemical-testing and material-testing data, to information relevant to human health and environmental safety. 2015 FRAME.

  17. Biochemical Phenotypes to Discriminate Microbial Subpopulations and Improve Outbreak Detection

    PubMed Central

    Galar, Alicia; Kulldorff, Martin; Rudnick, Wallis; O'Brien, Thomas F.; Stelling, John

    2013-01-01

    Background: Clinical microbiology laboratories worldwide constitute an invaluable resource for monitoring emerging threats and the spread of antimicrobial resistance. We studied the growing number of biochemical tests routinely performed on clinical isolates to explore their value as epidemiological markers. Methodology/Principal Findings: Microbiology laboratory results from January 2009 through December 2011 from a 793-bed hospital, stored in WHONET, were examined. Variables included patient location, collection date, organism, and 47 biochemical and 17 antimicrobial susceptibility test results reported by Vitek 2. To identify biochemical tests that were particularly valuable (stable with repeat testing, but good variability across the species) or problematic (inconsistent results with repeat testing), three types of variance analyses were performed on isolates of K. pneumoniae: descriptive analysis of discordant biochemical results in same-day isolates, an average within-patient variance index, and generalized linear mixed model variance component analysis. Results: 4,200 isolates of K. pneumoniae were identified from 2,485 patients, 32% of whom had multiple isolates. The first two variance analyses highlighted SUCT, TyrA, GlyA, and GGT as “nuisance” biochemicals for which discordant within-patient test results impacted a high proportion of patient results, while dTAG had relatively good within-patient stability with good heterogeneity across the species. Variance component analyses confirmed the relative stability of dTAG, and identified additional biochemicals such as PHOS with a large between-patient to within-patient variance ratio. A reduced subset of biochemicals improved the robustness of strain definition for carbapenem-resistant K. pneumoniae. Surveillance analyses suggest that the reduced biochemical profile could improve the timeliness and specificity of outbreak detection algorithms. Conclusions: The statistical approaches explored can improve the robust recognition of microbial subpopulations with routinely available biochemical test results, of value in the timely detection of outbreak clones and evolutionarily important genetic events. PMID:24391936
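
    The between-patient to within-patient variance ratio used above to flag stable, discriminating biochemicals can be approximated, for a single quantitative result, by comparing the variance of patient means with the average variance among repeat isolates within patients. The sketch below is a simplified illustration of that idea on made-up data; the study's formal analysis used a generalized linear mixed model.

```python
from collections import defaultdict
import statistics

def variance_ratio(results):
    """results: list of (patient_id, value) pairs for one biochemical test.

    Returns (between_patient_variance, within_patient_variance, ratio).
    A large ratio suggests the test is stable within patients but still
    discriminates between patients, i.e. a useful epidemiological marker.
    """
    by_patient = defaultdict(list)
    for patient, value in results:
        by_patient[patient].append(value)

    # Only patients with repeat isolates contribute to the within-patient term
    repeated = [v for v in by_patient.values() if len(v) > 1]
    within = statistics.mean(statistics.variance(v) for v in repeated)
    between = statistics.variance([statistics.mean(v) for v in by_patient.values()])
    return between, within, (between / within if within > 0 else float("inf"))

# Hypothetical reaction-strength readings for three patients with repeat isolates
data = [("A", 0.9), ("A", 1.0), ("A", 0.95),
        ("B", 0.2), ("B", 0.25),
        ("C", 0.6), ("C", 0.55)]
print(variance_ratio(data))
```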

  18. Cumulative meta-analysis: a new tool for detection of temporal trends and publication bias in ecology.

    PubMed Central

    Leimu, Roosa; Koricheva, Julia

    2004-01-01

    Temporal changes in the magnitude of research findings have recently been recognized as a general phenomenon in ecology, and have been attributed to the delayed publication of non-significant results and disconfirming evidence. Here we introduce a method of cumulative meta-analysis which allows detection of both temporal trends and publication bias in the ecological literature. To illustrate the application of the method, we used two datasets from recently conducted meta-analyses of studies testing two plant defence theories. Our results revealed three phases in the evolution of the treatment effects. Early studies strongly supported the hypothesis tested, but the magnitude of the effect decreased considerably in later studies. In the latest studies, a trend towards an increase in effect size was observed. In one of the datasets, a cumulative meta-analysis revealed publication bias against studies reporting disconfirming evidence; such studies were published in journals with a lower impact factor compared to studies with results supporting the hypothesis tested. Correlation analysis revealed neither temporal trends nor evidence of publication bias in the datasets analysed. We thus suggest that cumulative meta-analysis should be used as a visual aid to detect temporal trends and publication bias in research findings in ecology in addition to the correlative approach. PMID:15347521
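
    A cumulative meta-analysis simply re-estimates the pooled effect each time a study is added in chronological order, so temporal drift in the effect size becomes visible. The sketch below is a fixed-effect (inverse-variance) illustration with invented effect sizes and variances, not the authors' datasets or their exact weighting scheme.

```python
import numpy as np

def cumulative_meta_analysis(years, effects, variances):
    """Return the running inverse-variance pooled effect, ordered by publication year."""
    order = np.argsort(years)
    eff = np.asarray(effects, dtype=float)[order]
    var = np.asarray(variances, dtype=float)[order]
    w = 1.0 / var
    pooled = np.cumsum(w * eff) / np.cumsum(w)   # running weighted mean effect
    pooled_se = np.sqrt(1.0 / np.cumsum(w))      # its standard error
    return np.asarray(years)[order], pooled, pooled_se

# Hypothetical studies: strong early effects that shrink in later studies
years = [1985, 1988, 1992, 1996, 2001]
effects = [0.9, 0.7, 0.4, 0.2, 0.25]          # e.g. Hedges' g
variances = [0.10, 0.08, 0.05, 0.04, 0.03]
yr, est, se = cumulative_meta_analysis(years, effects, variances)
for y, e, s in zip(yr, est, se):
    print(f"{y}: pooled effect = {e:.2f} +/- {1.96 * s:.2f}")
```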

  19. A Confirmatory Factor Analysis of the California Verbal Learning Test-Second Edition (CVLT-II) in the Standardization Sample

    ERIC Educational Resources Information Center

    Donders, Jacobus

    2008-01-01

    The purpose of this study is to determine the latent structure of the California Verbal Learning Test-Second Edition (CVLT-II; Delis, Kramer, Kaplan, & Ober, 2000) at three different age levels, using the standardization sample. Maximum likelihood confirmatory factor analyses are performed to test four competing hypothetical models for fit and…

  20. Preflight transient dynamic analyses of B-52 aircraft carrying Space Shuttle solid rocket booster drop-test vehicle

    NASA Technical Reports Server (NTRS)

    Ko, W. L.; Schuster, L. S.

    1984-01-01

    This paper concerns the transient dynamic analysis of the B-52 aircraft carrying the Space Shuttle solid rocket booster drop test vehicle (SRB/DTV). The NASA structural analysis (NASTRAN) finite element computer program was used in the analysis. The B-52 operating conditions considered for analysis were (1) landing and (2) braking on aborted takeoff runs. The transient loads for the B-52 pylon front and rear hooks were calculated. The results can be used to establish the safe maneuver envelopes for the B-52 carrying the SRB/DTV in landings and brakings.

  1. Airborne Laser Systems Testing and Analysis (essals et analyse des systemes laser embarques)

    DTIC Science & Technology

    2010-04-01

    ... testing of Surface/Paints Reflection Properties (PILASTER targets); PILASTER Sensors Testing and Calibration; LOAS Laser System Testing; and Test ... For measuring the reflection properties of PILASTER target candidate paints and materials, a Laser Scatter-meter (LSM) was built. ... Laser Beam Misalignment with Respect to the Beam-Expander Support: for measuring the beam misalignment, the beam expander ...

  2. Invariance levels across language versions of the PISA 2009 reading comprehension tests in Spain.

    PubMed

    Elosua Oliden, Paula; Mujika Lizaso, Josu

    2013-01-01

    The PISA project provides the basis for studying curriculum design and for comparing factors associated with school effectiveness. These studies are only valid if the different language versions are equivalent to each other. In Spain, the application of PISA in autonomous regions with their own languages means that equivalency must also be extended to the Spanish, Galician, Catalan and Basque versions of the test. The aim of this work was to analyse the equivalence among the four language versions of the Reading Comprehension Test (PISA 2009). After defining the testlet as the unit of analysis, equivalence among the language versions was analysed using two invariance testing procedures: multiple-group mean and covariance structure analyses for ordinal data and ordinal logistic regression. The procedures yielded concordant results supporting metric equivalence across all four language versions: Spanish, Basque, Galician and Catalan. The equivalence supports the estimated reading literacy score comparability among the language versions used in Spain.

  3. An Analysis of Methods Used to Examine Gender Differences in Computer-Related Behavior.

    ERIC Educational Resources Information Center

    Kay, Robin

    1992-01-01

    Review of research investigating gender differences in computer-related behavior examines statistical and methodological flaws. Issues addressed include sample selection, sample size, scale development, scale quality, the use of univariate and multivariate analyses, regressional analysis, construct definition, construct testing, and the…

  4. Reliability and Factorial Validity of the Artes de Lenguaje.

    ERIC Educational Resources Information Center

    Powers, Stephen; And Others

    1984-01-01

    Spanish-speaking first graders were administered the Artes de Lenguaje (ADL), a Spanish, criterion-referenced, language arts test. Reliability analyses indicated the adequacy of three of the four subscales (Phonetic Analysis, Vocabulary Development, Comprehension Skills, and General Skills). A principal factors analysis of the intercorrelation…

  5. Geotechnical Parameters of Alluvial Soils from in-situ Tests

    NASA Astrophysics Data System (ADS)

    Młynarek, Zbigniew; Stefaniak, Katarzyna; Wierzbicki, Jędrzej

    2012-10-01

    The article concentrates on the identification of geotechnical parameters of alluvial soil represented by silts found near Poznan and Elblag. Strength and deformation parameters of the subsoil tested were identified by the CPTU (static penetration) and SDMT (dilatometric) methods, as well as by the vane test (VT). Geotechnical parameters of the subsoil were analysed with a view to using the soil as an earth construction material and as a foundation for buildings constructed on the grounds tested. The article includes an analysis of the overconsolidation process of the soil tested and a formula for the identification of the overconsolidation ratio OCR. Equation 9 reflects the relation between the undrained shear strength and plasticity of the silts analyzed and the OCR value. The analysis resulted in the determination of the Nkt coefficient, which might be used to identify the undrained shear strength of both sediments tested. On the basis of a detailed analysis of changes in terms of the constrained oedometric modulus M0, the relations between the said modulus, the liquidity index and the OCR value were identified. Mayne's formula (1995) was used to determine the M0 modulus from the CPTU test. The usefulness of the sediments found near Poznan as an earth construction material was analysed after their structure had been destroyed and compacted with a Proctor apparatus. In cases of samples characterised by different water content and soil particle density, the analysis of changes in terms of cohesion and the internal friction angle proved that these parameters are influenced by the soil phase composition (Figs. 18 and 19). On the basis of the tests, it was concluded that the most desirable shear strength parameters are achieved when the silt is compacted below the optimum water content.
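
    For readers unfamiliar with the Nkt approach mentioned above, undrained shear strength is commonly back-calculated from CPTU data as su = (qt - sigma_v0) / Nkt, i.e. net cone resistance divided by an empirical cone factor. The sketch below illustrates only that generic relation; the cone factor and the input numbers are hypothetical, not the site-specific correlations fitted in this study.

```python
def undrained_shear_strength(qt_kpa, sigma_v0_kpa, n_kt=15.0):
    """Undrained shear strength from CPTU: su = (qt - sigma_v0) / Nkt.

    qt_kpa       : corrected cone resistance [kPa]
    sigma_v0_kpa : total vertical (overburden) stress [kPa]
    n_kt         : empirical cone factor (site-specific; often in the 10-20 range)
    """
    return (qt_kpa - sigma_v0_kpa) / n_kt

# Hypothetical reading in silt: qt = 1.2 MPa, sigma_v0 = 150 kPa, Nkt = 15
print(undrained_shear_strength(qt_kpa=1200.0, sigma_v0_kpa=150.0))  # -> 70.0 kPa
```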

  7. On the Extraction of Components and the Applicability of the Factor Model.

    ERIC Educational Resources Information Center

    Dziuban, Charles D.; Harris, Chester W.

    A reanalysis of Shaycroft's matrix of intercorrelations of 10 test variables plus 4 random variables is discussed. Three different procedures were used in the reanalysis: (1) Image Component Analysis, (2) Uniqueness Rescaling Factor Analysis, and (3) Alpha Factor Analysis. The results of these analyses are presented in tables. It is concluded from…

  8. Analysis of Ninety Degree Flexure Tests for Characterization of Composite Transverse Tensile Strength

    NASA Technical Reports Server (NTRS)

    O'Brien, T. Kevin; Krueger, Ronald

    2001-01-01

    Finite element (FE) analysis was performed on 3-point and 4-point bending test configurations of ninety degree oriented glass-epoxy and graphite-epoxy composite beams to identify deviations from beam theory predictions. Both linear and geometric non-linear analyses were performed using the ABAQUS finite element code. The 3-point and 4-point bending specimens were first modeled with two-dimensional elements. Three-dimensional finite element models were then performed for selected 4-point bending configurations to study the stress distribution across the width of the specimens and compare the results to the stresses computed from two-dimensional plane strain and plane stress analyses and the stresses from beam theory. Stresses for all configurations were analyzed at load levels corresponding to the measured transverse tensile strength of the material.

  9. TU-FG-201-05: Varian MPC as a Statistical Process Control Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carver, A; Rowbottom, C

    Purpose: Quality assurance in radiotherapy requires the measurement of various machine parameters to ensure they remain within permitted values over time. In Truebeam release 2.0 the Machine Performance Check (MPC) was released allowing beam output and machine axis movements to be assessed in a single test. We aim to evaluate the Varian Machine Performance Check (MPC) as a tool for Statistical Process Control (SPC). Methods: Varian’s MPC tool was used on three Truebeam and one EDGE linac for a period of approximately one year. MPC was commissioned against independent systems. After this period the data were reviewed to determine whether or not the MPC was useful as a process control tool. Individual tests were analysed using Shewhart control charts, with Matlab used for the analysis. Principal component analysis was used to determine if a multivariate model was of any benefit in analysing the data. Results: Control charts were found to be useful to detect beam output changes, worn T-nuts and jaw calibration issues. Upper and lower control limits were defined at the 95% level. Multivariate SPC was performed using Principal Component Analysis. We found little evidence of clustering beyond that which might be naively expected such as beam uniformity and beam output. Whilst this makes multivariate analysis of little use, it suggests that each test is giving independent information. Conclusion: The variety of independent parameters tested in MPC makes it a sensitive tool for routine machine QA. We have determined that using control charts in our QA programme would rapidly detect changes in machine performance. The use of control charts allows large quantities of tests to be performed on all linacs without visual inspection of all results. The use of control limits alerts users when data are inconsistent with previous measurements before they become out of specification. A. Carver has received a speaker’s honorarium from Varian.
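
    A minimal sketch of the individuals-type Shewhart chart described above: the centre line is the mean of an in-control baseline period and the control limits sit a fixed number of standard deviations away (the abstract's 95% level corresponds to roughly 1.96 sigma). The data and the interpretation of the readings below are hypothetical, not MPC output.

```python
import numpy as np

def shewhart_limits(baseline, k=1.96):
    """Centre line and control limits estimated from an in-control baseline series."""
    centre = np.mean(baseline)
    sigma = np.std(baseline, ddof=1)
    return centre, centre - k * sigma, centre + k * sigma

def out_of_control(values, lcl, ucl):
    """Indices of points falling outside the control limits."""
    values = np.asarray(values, dtype=float)
    return np.where((values < lcl) | (values > ucl))[0]

# Hypothetical daily beam-output readings (% of reference)
baseline = [100.1, 99.8, 100.0, 100.2, 99.9, 100.1, 99.7, 100.0]
new_data = [100.0, 100.1, 99.9, 101.2, 100.1]

centre, lcl, ucl = shewhart_limits(baseline)
print("control limits:", lcl, ucl)
print("out-of-control indices:", out_of_control(new_data, lcl, ucl))  # flags the 101.2 reading
```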

  10. Shoreside Boiler Demonstration of Fuel-Water Emulsions.

    DTIC Science & Technology

    1982-08-01

    ... (parts-list fragment: metal-seat valve plug, diaphragm, and gasket part numbers) ... each long term test. The analyses performed were semi-quantitative spectrographic analysis for metallic elements, quantitative analysis for carbon ... analyses were to be compared with samples of the parent metal to determine the extent of any corrosion and/or erosion during the Long Term Tests ...

  11. Integrating the Analysis of Mental Operations into Multilevel Models to Validate an Assessment of Higher Education Students' Competency in Business and Economics

    ERIC Educational Resources Information Center

    Brückner, Sebastian; Pellegrino, James W.

    2016-01-01

    The Standards for Educational and Psychological Testing indicate that validation of assessments should include analyses of participants' response processes. However, such analyses typically are conducted only to supplement quantitative field studies with qualitative data, and seldom are such data connected to quantitative data on student or item…

  12. Global and Local Stress Analyses of McDonnell Douglas Stitched/RFI Composite Wing Stub Box

    NASA Technical Reports Server (NTRS)

    Wang, John T.

    1996-01-01

    This report contains results of structural analyses performed in support of the NASA structural testing of an all-composite stitched/RFI (resin film infusion) wing stub box. McDonnell Douglas Aerospace Company designed and fabricated the wing stub box. The analyses used a global/local approach. The global model contains the entire test article. It includes the all-composite stub box, a metallic load-transition box and a metallic wing-tip extension box. The two metallic boxes are connected to the inboard and outboard ends of the composite wing stub box, respectively. The load-transition box was attached to a steel and concrete vertical reaction structure and a load was applied at the tip of the extension box to bend the wing stub box upward. The local model contains an upper cover region surrounding three stringer runouts. In that region, a large nonlinear deformation was identified by the global analyses. A more detailed mesh was used for the local model to obtain more accurate analysis results near stringer runouts. Numerous analysis results such as deformed shapes, displacements at selected locations, and strains at critical locations are included in this report.

  13. A default Bayesian hypothesis test for mediation.

    PubMed

    Nuijten, Michèle B; Wetzels, Ruud; Matzke, Dora; Dolan, Conor V; Wagenmakers, Eric-Jan

    2015-03-01

    In order to quantify the relationship between multiple variables, researchers often carry out a mediation analysis. In such an analysis, a mediator (e.g., knowledge of a healthy diet) transmits the effect from an independent variable (e.g., classroom instruction on a healthy diet) to a dependent variable (e.g., consumption of fruits and vegetables). Almost all mediation analyses in psychology use frequentist estimation and hypothesis-testing techniques. A recent exception is Yuan and MacKinnon (Psychological Methods, 14, 301-322, 2009), who outlined a Bayesian parameter estimation procedure for mediation analysis. Here we complete the Bayesian alternative to frequentist mediation analysis by specifying a default Bayesian hypothesis test based on the Jeffreys-Zellner-Siow approach. We further extend this default Bayesian test by allowing a comparison to directional or one-sided alternatives, using Markov chain Monte Carlo techniques implemented in JAGS. All Bayesian tests are implemented in the R package BayesMed (Nuijten, Wetzels, Matzke, Dolan, & Wagenmakers, 2014).
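
    For orientation, the quantity at stake in a mediation analysis is the indirect effect: the product of the path from the independent variable to the mediator (a) and the path from the mediator to the dependent variable controlling for the independent variable (b). The sketch below estimates a*b with two ordinary least-squares fits on simulated data; the paper itself specifies a default Bayes factor for this effect (Jeffreys-Zellner-Siow prior, R package BayesMed), which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated data: X -> M -> Y with a small direct X -> Y path
x = rng.normal(size=n)                       # e.g. classroom instruction
m = 0.5 * x + rng.normal(size=n)             # e.g. knowledge of a healthy diet
y = 0.4 * m + 0.1 * x + rng.normal(size=n)   # e.g. consumption of fruits and vegetables

def ols(columns, target):
    """Least-squares coefficients for target ~ intercept + columns."""
    X = np.column_stack([np.ones(len(target))] + list(columns))
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return beta

a = ols([x], m)[1]        # path X -> M
b = ols([x, m], y)[2]     # path M -> Y, controlling for X
print("estimated indirect (mediated) effect a*b =", a * b)
```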

  14. Test-specific control conditions for functional analyses.

    PubMed

    Fahmie, Tara A; Iwata, Brian A; Querim, Angie C; Harper, Jill M

    2013-01-01

    Most functional analyses of problem behavior include a common condition (play or noncontingent reinforcement) as a control for both positive and negative reinforcement. However, test-specific conditions that control for each potential source of reinforcement may be beneficial occasionally. We compared responding during alone, ignore, play, and differential reinforcement of other behavior (DRO) control conditions for individuals whose problem behavior was maintained by positive or negative reinforcement. Results showed that all of the conditions were effective controls for problem behavior maintained by positive reinforcement; however, the DRO condition was consistently ineffective as a control for problem behavior maintained by negative reinforcement. Implications for the design of functional analyses and future research are discussed. © Society for the Experimental Analysis of Behavior.

  15. Thermal measurement of brake pad lining surfaces during the braking process

    NASA Astrophysics Data System (ADS)

    Piątkowski, Tadeusz; Polakowski, Henryk; Kastek, Mariusz; Baranowski, Pawel; Damaziak, Krzysztof; Małachowski, Jerzy; Mazurkiewicz, Łukasz

    2012-06-01

    This paper presents the test campaign concept and definition and the analysis of the recorded measurements. Brakes are among the most important systems in cars and trucks. The temperature on a brake pad lining surface can rise above 500°C during braking, which is why the requirements placed on linings are so strict and continuously rising. Besides experimental tests, numerical analyses are a very supportive method for investigating the processes that occur on brake pad linings. Experimental tests were conducted on the IL-68 test machine. The main component of the IL-68 is the so-called frictional unit, which consists of a rotational head, which conveys the shaft torque and holds the counter-samples, and a translational head, where the lining samples are placed and pressed against the counter-samples. Because of the high rotational speeds, and thus the rapid changes in the temperature field, an infrared camera was used for the measurements. The paper presents results of the analysis of thermograms recorded during tests under different conditions. Furthermore, a numerical model of this testing machine was developed. To avoid resource-demanding analyses, only the frictional unit (described above) was taken into consideration. First, the geometrical model was created using CAD techniques; this then served as the basis for developing the finite element model. Material properties and boundary conditions correspond exactly to the experimental tests. Computations were performed using the dynamic LS-Dyna code, where heat generation was estimated assuming full (100%) conversion of the mechanical work done by friction forces. The paper also presents the results of the dynamic thermomechanical analysis, which were compared with the laboratory tests.
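
    The heat input for such a model follows directly from the stated assumption that all mechanical friction work converts to heat: the flux at the pad/counter-sample interface is q = mu * p * v (friction coefficient times contact pressure times sliding velocity). A minimal sketch with hypothetical braking values, not the paper's actual loading:

```python
def friction_heat_flux(mu, contact_pressure_pa, sliding_velocity_m_s):
    """Heat flux [W/m^2] assuming 100% of the friction work converts to heat."""
    return mu * contact_pressure_pa * sliding_velocity_m_s

# Hypothetical braking condition: mu = 0.4, p = 1.5 MPa, v = 10 m/s
q = friction_heat_flux(0.4, 1.5e6, 10.0)
print(f"interface heat flux: {q / 1e6:.1f} MW/m^2")   # -> 6.0 MW/m^2
```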

  16. Placental alpha-microglobulin-1 and combined traditional diagnostic test: a cost-benefit analysis.

    PubMed

    Echebiri, Nelson C; McDoom, M Maya; Pullen, Jessica A; Aalto, Meaghan M; Patel, Natasha N; Doyle, Nora M

    2015-01-01

    We sought to evaluate if the placental alpha-microglobulin (PAMG)-1 test vs the combined traditional diagnostic test (CTDT) of pooling, nitrazine, and ferning would be a cost-beneficial screening strategy in the setting of potential preterm premature rupture of membranes. A decision analysis model was used to estimate the economic impact of PAMG-1 test vs the CTDT on preterm delivery costs from a societal perspective. Our primary outcome was the annual net cost-benefit per person tested. Baseline probabilities and costs assumptions were derived from published literature. We conducted sensitivity analyses using both deterministic and probabilistic models. Cost estimates reflect 2013 US dollars. Annual net benefit from PAMG-1 was $20,014 per person tested, while CTDT had a net benefit of $15,757 per person tested. If the probability of rupture is <38%, PAMG-1 will be cost-beneficial with an annual net benefit of $16,000-37,000 per person tested, while CTDT will have an annual net benefit of $16,000-19,500 per person tested. If the probability of rupture is >38%, CTDT is more cost-beneficial. Monte Carlo simulations of 1 million trials selected PAMG-1 as the optimal strategy with a frequency of 89%, while CTDT was only selected as the optimal strategy with a frequency of 11%. Sensitivity analyses were robust. Our cost-benefit analysis provides the economic evidence for the adoption of PAMG-1 in diagnosing preterm premature rupture of membranes in uncertain presentations and when CTDT is equivocal at 34 to <37 weeks' gestation. Copyright © 2015 Elsevier Inc. All rights reserved.
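
    The comparison above is, in simplified form, a comparison of expected net benefits with the uncertain inputs sampled by Monte Carlo simulation (the probabilistic sensitivity analysis mentioned). The sketch below shows only the general mechanics; every probability and dollar value in it is a placeholder, not one of the study's published assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000   # Monte Carlo trials

def expected_net_benefit(sensitivity, specificity, p_rupture,
                         benefit_correct, cost_incorrect, cost_test):
    """Net benefit per person tested under a simple two-outcome decision model."""
    p_correct = p_rupture * sensitivity + (1 - p_rupture) * specificity
    return p_correct * benefit_correct - (1 - p_correct) * cost_incorrect - cost_test

# Placeholder distributions for the uncertain inputs of two competing tests
p_rupture = rng.beta(4, 8, N)   # probability of membrane rupture
nb_test_a = expected_net_benefit(rng.beta(95, 5, N), rng.beta(90, 10, N),
                                 p_rupture, 25_000.0, 30_000.0, 100.0)
nb_test_b = expected_net_benefit(rng.beta(85, 15, N), rng.beta(80, 20, N),
                                 p_rupture, 25_000.0, 30_000.0, 40.0)

print("mean net benefit, test A:", nb_test_a.mean())
print("mean net benefit, test B:", nb_test_b.mean())
print("A optimal in", (nb_test_a > nb_test_b).mean() * 100, "% of trials")
```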

  17. Reinventing the ames test as a quantitative lab that connects classical and molecular genetics.

    PubMed

    Goodson-Gregg, Nathan; De Stasio, Elizabeth A

    2009-01-01

    While many institutions use a version of the Ames test in the undergraduate genetics laboratory, students typically are not exposed to techniques or procedures beyond qualitative analysis of phenotypic reversion, thereby seriously limiting the scope of learning. We have extended the Ames test to include both quantitative analysis of reversion frequency and molecular analysis of revertant gene sequences. By giving students a role in designing their quantitative methods and analyses, students practice and apply quantitative skills. To help students connect classical and molecular genetic concepts and techniques, we report here procedures for characterizing the molecular lesions that confer a revertant phenotype. We suggest undertaking reversion of both missense and frameshift mutants to allow a more sophisticated molecular genetic analysis. These modifications and additions broaden the educational content of the traditional Ames test teaching laboratory, while simultaneously enhancing students' skills in experimental design, quantitative analysis, and data interpretation.

  18. LANDSAT-4 image data quality analysis for energy related applications. [nuclear power plant sites

    NASA Technical Reports Server (NTRS)

    Wukelic, G. E. (Principal Investigator)

    1983-01-01

    No usable LANDSAT 4 TM data were obtained for the Hanford site in the Columbia Plateau region, but TM simulator data for a Virginia Electric Company nuclear power plant were used to test image processing algorithms. Principal component analyses of this data set clearly indicated that thermal plumes in surface waters used for reactor cooling would be discernible. Image processing and analysis programs were successfully tested using the 7-band Arkansas test scene, and preliminary analysis of TM data for the Savannah River Plant shows that current interactive image enhancement, analysis, and integration techniques can be used effectively for LANDSAT 4 data. Thermal band data appear adequate for gross estimates of thermal changes occurring near operating nuclear facilities, especially in surface water bodies used for reactor cooling. Additional image processing software was written and tested which provides for more rapid and effective analysis of the 7-band TM data.
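
    Principal component analysis of multiband imagery, as used above to bring out thermal plumes, amounts to an eigendecomposition of the band-to-band covariance matrix followed by projection of each pixel onto the leading eigenvectors. A minimal NumPy sketch on a hypothetical 7-band scene (random placeholder data, not TM imagery):

```python
import numpy as np

def image_pca(cube, n_components=3):
    """cube: (rows, cols, bands) array. Returns (rows, cols, n_components) PC images."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    X -= X.mean(axis=0)                         # centre each band
    cov = np.cov(X, rowvar=False)               # band-to-band covariance
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    leading = eigvecs[:, ::-1][:, :n_components]
    return (X @ leading).reshape(rows, cols, n_components)

# Hypothetical 100 x 100 pixel, 7-band scene
scene = np.random.default_rng(1).normal(size=(100, 100, 7))
pcs = image_pca(scene, n_components=3)
print(pcs.shape)   # (100, 100, 3)
```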

  19. [Clinical research XXIII. From clinical judgment to meta-analyses].

    PubMed

    Rivas-Ruiz, Rodolfo; Castelán-Martínez, Osvaldo D; Pérez-Rodríguez, Marcela; Palacios-Cruz, Lino; Noyola-Castillo, Maura E; Talavera, Juan O

    2014-01-01

    Systematic reviews (SR) are studies designed to answer clinical questions on the basis of original articles. Meta-analysis (MTA) is the mathematical analysis of an SR. These analyses are divided into two groups: those that evaluate quantitative variables (for example, the body mass index, BMI) and those that evaluate qualitative variables (for example, whether a patient is alive or dead, or cured or not). Quantitative variables are generally analysed with mean differences, while qualitative variables can be analysed with several measures: odds ratio (OR), relative risk (RR), absolute risk reduction (ARR) and hazard ratio (HR). These analyses are represented through forest plots, which allow the evaluation of each individual study as well as the heterogeneity between studies and the overall effect of the intervention. These analyses are mainly based on Student's t test and chi-squared. To make appropriate decisions based on an MTA, it is important to understand the characteristics of the statistical methods in order to avoid misinterpretations.
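
    As a concrete reminder of the qualitative-outcome measures listed above, the sketch below computes the odds ratio, relative risk and absolute risk reduction from a single 2 x 2 table; the counts are invented for illustration.

```python
def two_by_two_measures(a, b, c, d):
    """a, b: events / non-events in the treated group; c, d: events / non-events in controls."""
    risk_treated = a / (a + b)
    risk_control = c / (c + d)
    return {
        "odds ratio (OR)": (a / b) / (c / d),
        "relative risk (RR)": risk_treated / risk_control,
        "absolute risk reduction (ARR)": risk_control - risk_treated,
    }

# Hypothetical trial: 20/100 events with treatment vs 40/100 with control
print(two_by_two_measures(a=20, b=80, c=40, d=60))
# -> OR = 0.375, RR = 0.5, ARR = 0.2
```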

  20. Integrated Vehicle Ground Vibration Testing in Support of NASA Launch Vehicle Loads and Controls Analysis

    NASA Technical Reports Server (NTRS)

    Tuma, Margaret L.; Davis, Susan R.; Askins, Bruce R.; Salyer, Blaine H.

    2008-01-01

    The National Aeronautics and Space Administration (NASA) Ares Projects Office (APO) is continuing to make progress toward the final design of the Ares I crew launch vehicle and Ares V cargo launch vehicle. Ares I and V will form the space launch capabilities necessary to fulfill NASA's exploration strategy of sending human beings to the Moon, Mars, and beyond. As with all new space vehicles there will be a number of tests to ensure the design can be Human Rated. One of these is the Integrated Vehicle Ground Vibration Test (IVGVT) that will be measuring responses of the Ares I as a system. All structural systems possess a basic set of physical characteristics unique to that system. These unique characteristics include items such as mass distribution, frequency and damping. When specified, they allow engineers to understand and predict how a structural system like the Ares I launch vehicle behaves under given loading conditions. These physical properties of launch vehicles may be predicted by analysis or measured through certain types of tests. Generally, these properties are predicted by analysis during the design phase of a launch vehicle and then verified through testing before the vehicle is Human Rated. The IVGVT is intended to measure by test the fundamental dynamic characteristics of Ares I during various operational/flight phases. This testing includes excitations of the vehicle in lateral, longitudinal, and torsional directions at vehicle configurations representing different trajectory points. During the series of tests, properties such as natural frequencies, mode shapes, and transfer functions are measured directly. These data will then be used to calibrate loads and Guidance, Navigation, and Controls (GN&C) analysis models for verifying analyses of Ares I. NASA launch vehicles from Saturn to Shuttle have undergone Ground Vibration Tests (GVTs) leading to successful launch vehicles. A GVT was not performed on the unmanned Delta III. This vehicle was lost during launch. Subsequent analyses indicated that had a GVT been conducted on the vehicle, problems with vehicle modes and control might have been discovered and corrected, avoiding loss of the vehicle/mission. This paper will address GVT planning, set-up, conduct, and analyses for the Saturn and Shuttle programs, and also focus on the current and on-going planning for the Ares I and V IVGVT.
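
    The transfer functions measured in a ground vibration test are typically estimated from excitation and response time histories as the cross-spectrum of input and output divided by the input auto-spectrum (the standard H1 estimator); natural frequencies then appear as peaks of that estimate. The sketch below demonstrates the idea on a synthetic single-degree-of-freedom system; the sampling rate, damping and 10 Hz resonance are made-up values, and this is not the Ares IVGVT data processing chain.

```python
import numpy as np
from scipy import signal

fs = 1024.0                               # sampling rate [Hz] (hypothetical)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(7)

# Synthetic "test": broadband random force driving a lightly damped 10 Hz oscillator
force = rng.normal(size=t.size)
wn, zeta = 2 * np.pi * 10.0, 0.02
plant = signal.TransferFunction([wn**2], [1.0, 2 * zeta * wn, wn**2])
_, response, _ = signal.lsim(plant, U=force, T=t)

# H1 frequency-response estimate: cross-spectrum / input auto-spectrum
f, Pxx = signal.welch(force, fs, nperseg=4096)
_, Pxy = signal.csd(force, response, fs, nperseg=4096)
H1 = Pxy / Pxx

peak_hz = f[np.argmax(np.abs(H1))]
print(f"estimated natural frequency: {peak_hz:.2f} Hz")   # close to 10 Hz
```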

  1. Psychometric Properties of the Revised Mathematics Anxiety Rating Scale

    ERIC Educational Resources Information Center

    Baloglu, Mustafa; Zelhart, Paul F.

    2007-01-01

    An exploratory factor analysis and several confirmatory analyses were performed to evaluate the factorial structure of the Revised Mathematics Anxiety Rating Scale (RMARS) through the responses of 805 college students. On 559 students' scores, the instrument's construct validity was tested through a confirmatory factor analysis (CFA) and was found…

  2. Automated spectrophotometric bicarbonate analysis in duodenal juice compared to the back titration method.

    PubMed

    Erchinger, Friedemann; Engjom, Trond; Gudbrandsen, Oddrun Anita; Tjora, Erling; Gilja, Odd H; Dimcevski, Georg

    2016-01-01

    We have recently evaluated a short endoscopic secretin test for exocrine pancreatic function. Bicarbonate concentration in duodenal juice is an important parameter in this test. Measurement of bicarbonate by back titration, the gold standard method, is time consuming, expensive and technically difficult; thus a simplified method is warranted. We aimed to evaluate an automated spectrophotometric method in samples spanning the effective range of bicarbonate concentrations in duodenal juice. We also evaluated whether freezing samples before analysis would affect the results. Patients routinely examined with the short endoscopic secretin test who were suspected of having decreased pancreatic function for various reasons were included. Bicarbonate in duodenal juice was quantified by back titration and automatic spectrophotometry. Both fresh and thawed samples were analysed spectrophotometrically. 177 samples from 71 patients were analysed. The correlation coefficient for all measurements was r = 0.98 (p < 0.001), and the correlation coefficient for fresh versus frozen samples analysed with automatic spectrophotometry (n = 25) was r = 0.96 (p < 0.001). CONCLUSIONS: The measurement of bicarbonate in fresh and thawed samples by automatic spectrophotometric analysis correlates excellently with the back titration gold standard. This is a major simplification of direct pancreas function testing, and allows a wider distribution of bicarbonate testing in duodenal juice. Extreme bicarbonate concentration values obtained with the autoanalyser method have to be interpreted with caution. Copyright © 2016 IAP and EPC. Published by Elsevier India Pvt Ltd. All rights reserved.

  3. Overview of MSFC AMSD Integrated Modeling and Analysis

    NASA Technical Reports Server (NTRS)

    Cummings, Ramona; Russell, Kevin (Technical Monitor)

    2002-01-01

    Structural, thermal, dynamic, and optical models of the NGST AMSD mirror assemblies are being finalized and integrated for predicting cryogenic vacuum test performance of the developing designs. The analyzers in use by the MSFC Modeling and Analysis Team are identified, with an overview of the approach used to integrate the simulated effects. Guidelines to verify the individual models, and calibration cases for comparison with the vendors' analyses, are presented. In addition, baseline and proposed additional scenarios for the cryogenic vacuum testing are briefly described.

  4. Dynamic and Static Shape Test/Analysis Correlation of a 10 Meter Quadrant Solar Sail

    NASA Technical Reports Server (NTRS)

    Taleghani, Barmac K.; Lively, Peter S.; Gaspar, James L.; Murphy, David M.; Trautt, Thomas A.

    2005-01-01

    This paper describes finite element analyses and correlation studies to predict deformations and vibration modes/frequencies of a 10-meter quadrant solar sail system. Thin film membranes and booms were analyzed at the component and system-level. The objective was to verify the design and structural responses of the sail system and to mature solar sail technology to a TRL 5. The focus of this paper is in test/analysis correlation.

  5. Thermal Analysis of Small Re-Entry Probe

    NASA Technical Reports Server (NTRS)

    Agrawal, Parul; Prabhu, Dinesh K.; Chen, Y. K.

    2012-01-01

    The Small Probe Reentry Investigation for TPS Engineering (SPRITE) concept was developed at NASA Ames Research Center to facilitate arc-jet testing of a fully instrumented prototype probe at flight scale. Besides demonstrating the feasibility of testing a flight-scale model and the capability of an on-board data acquisition system, another objective for this project was to investigate the capability of simulation tools to predict thermal environments of the probe/test article and its interior. This paper focuses on finite-element thermal analyses of the SPRITE probe during the arcjet tests. Several iterations were performed during the early design phase to provide critical design parameters and guidelines for testing. The thermal effects of ablation and pyrolysis were incorporated into the final higher-fidelity modeling approach by coupling the finite-element analyses with a two-dimensional thermal protection materials response code. Model predictions show good agreement with thermocouple data obtained during the arcjet test.

  6. NEAT: an efficient network enrichment analysis test.

    PubMed

    Signorelli, Mirko; Vinciotti, Veronica; Wit, Ernst C

    2016-09-05

    Network enrichment analysis is a powerful method that allows gene enrichment analysis to be integrated with the information on relationships between genes provided by gene networks. Existing tests for network enrichment analysis deal only with undirected networks; they can be computationally slow and are based on normality assumptions. We propose NEAT, a test for network enrichment analysis. The test is based on the hypergeometric distribution, which naturally arises as the null distribution in this context. NEAT can be applied not only to undirected networks, but to directed and partially directed networks as well. Our simulations indicate that NEAT is considerably faster than alternative resampling-based methods, and that its capacity to detect enrichments is at least as good as that of alternative tests. We discuss applications of NEAT to network analyses in yeast by testing for enrichment of the Environmental Stress Response target gene set with GO Slim and KEGG functional gene sets, and also by inspecting associations between functional sets themselves. NEAT is a flexible and efficient test for network enrichment analysis that aims to overcome some limitations of existing resampling-based tests. The method is implemented in the R package neat, which can be freely downloaded from CRAN (https://cran.r-project.org/package=neat).
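
    The hypergeometric null distribution mentioned above is the same one that underlies a classical gene-set over-representation test: with N genes in total, K of them in a functional set and n in a query set, the p-value is the probability of an overlap of at least k arising by chance. The sketch below shows that simpler over-representation form, not NEAT's network statistic (which counts network links rather than genes); all numbers are hypothetical.

```python
from scipy.stats import hypergeom

def enrichment_pvalue(N, K, n, k):
    """P(overlap >= k) when drawing n genes from N of which K belong to the set."""
    return hypergeom.sf(k - 1, N, K, n)

# Hypothetical: 6000 genes, 150 in the functional set, 300 in the query set, overlap of 20
print(enrichment_pvalue(N=6000, K=150, n=300, k=20))
```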

  7. Univariate and bivariate likelihood-based meta-analysis methods performed comparably when marginal sensitivity and specificity were the targets of inference.

    PubMed

    Dahabreh, Issa J; Trikalinos, Thomas A; Lau, Joseph; Schmid, Christopher H

    2017-03-01

    To compare statistical methods for meta-analysis of sensitivity and specificity of medical tests (e.g., diagnostic or screening tests). We constructed a database of PubMed-indexed meta-analyses of test performance from which 2 × 2 tables for each included study could be extracted. We reanalyzed the data using univariate and bivariate random effects models fit with inverse variance and maximum likelihood methods. Analyses were performed using both normal and binomial likelihoods to describe within-study variability. The bivariate model using the binomial likelihood was also fit using a fully Bayesian approach. We use two worked examples (thoracic computerized tomography to detect aortic injury and rapid prescreening of Papanicolaou smears to detect cytological abnormalities) to highlight that different meta-analysis approaches can produce different results. We also present results from reanalysis of 308 meta-analyses of sensitivity and specificity. Models using the normal approximation produced sensitivity and specificity estimates closer to 50% and smaller standard errors compared to models using the binomial likelihood; absolute differences of 5% or greater were observed in 12% and 5% of meta-analyses for sensitivity and specificity, respectively. Results from univariate and bivariate random effects models were similar, regardless of estimation method. Maximum likelihood and Bayesian methods produced almost identical summary estimates under the bivariate model; however, Bayesian analyses indicated greater uncertainty around those estimates. Bivariate models produced imprecise estimates of the between-study correlation of sensitivity and specificity. Differences between methods were larger with increasing proportion of studies that were small or required a continuity correction. The binomial likelihood should be used to model within-study variability. Univariate and bivariate models give similar estimates of the marginal distributions for sensitivity and specificity. Bayesian methods fully quantify uncertainty and their ability to incorporate external evidence may be useful for imprecisely estimated parameters. Copyright © 2017 Elsevier Inc. All rights reserved.
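
    One of the simpler approaches being compared above can be written down compactly: each study's sensitivity is logit-transformed, given the approximate variance 1/TP + 1/FN, and pooled by inverse-variance weighting. The sketch below is a univariate, fixed-effect version of that normal-approximation approach with hypothetical counts; the bivariate and binomial-likelihood models used in the paper require an iterative fit and are not reproduced here.

```python
import numpy as np

def pooled_sensitivity(tp, fn, correction=0.5):
    """Inverse-variance pooling of study sensitivities on the logit scale."""
    tp = np.asarray(tp, dtype=float) + correction   # continuity correction for zero cells
    fn = np.asarray(fn, dtype=float) + correction
    logit = np.log(tp / fn)
    var = 1.0 / tp + 1.0 / fn                       # approximate variance of the logit
    w = 1.0 / var
    pooled_logit = np.sum(w * logit) / np.sum(w)
    return 1.0 / (1.0 + np.exp(-pooled_logit))      # back-transform to a proportion

# Hypothetical true-positive / false-negative counts from four studies
print(pooled_sensitivity(tp=[45, 30, 80, 12], fn=[5, 10, 15, 3]))
```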

  8. Verification of the FBR fuel bundle-duct interaction analysis code BAMBOO by the out-of-pile bundle compression test with large diameter pins

    NASA Astrophysics Data System (ADS)

    Uwaba, Tomoyuki; Ito, Masahiro; Nemoto, Junichi; Ichikawa, Shoichi; Katsuyama, Kozo

    2014-09-01

    The BAMBOO computer code was verified against results from the out-of-pile bundle compression test, in which large diameter pin bundles were deformed under the bundle-duct interaction (BDI) condition. The pin diameters of the examined test bundles were 8.5 mm and 10.4 mm, which are targeted as preliminary fuel pin diameters for the upgraded core of the prototype fast breeder reactor (FBR) and for demonstration and commercial FBRs studied in the FaCT project. In the bundle compression test, bundle cross-sectional views were obtained from X-ray computer tomography (CT) images, and local parameters of bundle deformation such as pin-to-duct and pin-to-pin clearances were measured by CT image analyses. In the verification, calculation results of bundle deformation obtained by the BAMBOO code analyses were compared with the experimental results from the CT image analyses. The comparison showed that the BAMBOO code reasonably predicts deformation of large diameter pin bundles under the BDI condition by assuming that pin bowing and cladding oval distortion are the major deformation mechanisms, the same as in the case of small diameter pin bundles. In addition, the BAMBOO analysis results confirmed that cladding oval distortion effectively suppresses BDI in large diameter pin bundles as well as in small diameter pin bundles.

  9. Religious Priming: A Meta-Analysis With a Focus on Prosociality.

    PubMed

    Shariff, Azim F; Willard, Aiyana K; Andersen, Teresa; Norenzayan, Ara

    2016-02-01

    Priming has emerged as a valuable tool within the psychological study of religion, allowing for tests of religion's causal effect on a number of psychological outcomes, such as prosocial behavior. As the literature has grown, questions about the reliability and boundary conditions of religious priming have arisen. We use a combination of traditional effect-size analyses, p-curve analyses, and adjustments for publication bias to evaluate the robustness of four types of religious priming (Analyses 1-3), review the empirical evidence for religion's effect specifically on prosocial behavior (Analyses 4-5), and test whether religious-priming effects generalize to individuals who report little or no religiosity (Analyses 6-7). Results across 93 studies and 11,653 participants show that religious priming has robust effects across a variety of outcome measures, prosocial measures included. Religious priming does not, however, reliably affect non-religious participants, suggesting that priming depends on the cognitive activation of culturally transmitted religious beliefs. © 2015 by the Society for Personality and Social Psychology, Inc.

  10. Tafamidis delays disease progression in patients with early stage transthyretin familial amyloid polyneuropathy: additional supportive analyses from the pivotal trial.

    PubMed

    Keohane, Denis; Schwartz, Jeffrey; Gundapaneni, Balarama; Stewart, Michelle; Amass, Leslie

    2017-03-01

    Tafamidis, a non-NSAID highly specific transthyretin stabilizer, delayed neurologic disease progression as measured by Neuropathy Impairment Score-Lower Limbs (NIS-LL) in an 18-month, double-blind, placebo-controlled randomized trial in 128 patients with early-stage transthyretin V30M familial amyloid polyneuropathy (ATTRV30M-FAP). The current post hoc analyses aimed to further evaluate the effects of tafamidis in delaying ATTRV30M-FAP progression in this trial. Pre-specified, repeated-measures analysis of change from baseline in NIS-LL in this trial (ClinicalTrials.gov NCT00409175) was repeated with addition of baseline as covariate and multiple imputation analysis for missing data by treatment group. Change in NIS-LL plus three small-fiber nerve tests (NIS-LL + Σ3) and NIS-LL plus seven nerve tests (NIS-LL + Σ7) were assessed without baseline as covariate. Treatment outcomes over the NIS-LL, Σ3, Σ7, modified body mass index and Norfolk Quality of Life-Diabetic Neuropathy Total Quality of Life Score were also examined using multivariate analysis techniques. Neuropathy progression based on NIS-LL change from baseline to Month 18 remained significantly reduced for tafamidis versus placebo in the baseline-adjusted and multiple imputation analyses. NIS-LL + Σ3 and NIS-LL + Σ7 captured significant treatment group differences. Multivariate analyses provided strong statistical evidence for a superior tafamidis treatment effect. These supportive analyses confirm that tafamidis delays neurologic progression in early-stage ATTRV30M-FAP. NCT00409175.

  11. Why the Major Field Test in Business Does Not Report Subscores: Reliability and Construct Validity Evidence. Research Report. ETS RR-12-11

    ERIC Educational Resources Information Center

    Ling, Guangming

    2012-01-01

    To assess the value of individual students' subscores on the Major Field Test in Business (MFT Business), I examined the test's internal structure with factor analysis and structural equation model methods, and analyzed the subscore reliabilities using the augmented scores method. Analyses of the internal structure suggested that the MFT Business…

  12. Inductively Coupled Plasma Mass Spectrometry: Sample Analysis of Zirconium and Ruthenium in Metal Organic Frameworks

    DTIC Science & Technology

    2018-02-01

    ... international proficiency testing sponsored by the Organisation for the Prohibition of Chemical Weapons (The Hague, Netherlands). Traditionally ... a separate batch of standards at each level, for a total of six analyses at each calibration level. Concentrations of the tested calibration levels are ... and ruthenium at each calibration level.

  13. Repeatability and uncertainty analyses of NASA/MSFC light gas gun test data

    NASA Technical Reports Server (NTRS)

    Schonberg, William P.; Cooper, David

    1993-01-01

    This Final Report presents an overview of the impact tests performed at NASA/MSFC in the time period 1985 to 1991 and the results of phenomena repeatability and data uncertainty studies performed using the information obtained from those tests. An analysis of the data from over 400 tests conducted between 1989 and 1991 was performed to generate a database to supplement the Hypervelocity Impact Damage Database developed under a previous effort.

  14. Preliminary data from Arbuckle test wells, Miami, Douglas, Saline, and Labette counties, Kansas

    USGS Publications Warehouse

    Gogel, Tony

    1981-01-01

    Formation data from drill-stem tests are presented for use in calculating transmissivity, hydraulic conductivity, and hydraulic head. Complete analyses of water samples from wells at sites 2, 3, and 4, and a partial analysis at site 1, are presented to indicate water quality in the aquifers.

  15. Measurement Invariance of the "Servant Leadership Questionnaire" across K-12 Principal Gender

    ERIC Educational Resources Information Center

    Xu, Lihua; Stewart, Trae; Haber-Curran, Paige

    2015-01-01

    Measurement invariance of the five-factor "Servant Leadership Questionnaire" between female and male K-12 principals was tested using multi-group confirmatory factor analysis. A sample of 956 principals (56.9% were females and 43.1% were males) was analysed in this study. The hierarchical multi-step measurement invariance test supported…

  16. Assessment of Adolescent Perceptions on Parental Attitudes on Different Variables

    ERIC Educational Resources Information Center

    Ersoy, Evren

    2015-01-01

    The purpose of this study is to examine secondary school students' perceptions of parental attitudes with regard to specific variables. The independent samples t test for parametric distributions and one-way analysis of variance (ANOVA) were used for analyzing the data; when the ANOVA analyses were significant, the Scheffe test was conducted on homogeneous…

  17. Structural Analysis of the Right Rear Lug of American Airlines Flight 587

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Glaessgen, Edward H.; Mason, Brian H.; Krishnamurthy, Thiagarajan; Davila, Carlos G.

    2006-01-01

    A detailed finite element analysis of the right rear lug of the American Airlines Flight 587 aircraft, an Airbus A300-600R, was performed as part of the National Transportation Safety Board's failure investigation of the accident that occurred on November 12, 2001. The loads experienced by the right rear lug are evaluated using global models of the vertical tail, local models near the right rear lug, and a global-local analysis procedure. The right rear lug was analyzed using two modeling approaches. In the first approach, solid-shell type modeling is used, and in the second approach, layered-shell type modeling is used. The solid-shell and the layered-shell modeling approaches were used in progressive failure analyses (PFA) to determine the load, mode, and location of failure in the right rear lug under loading representative of an Airbus certification test conducted in 1985 (the 1985-certification test). Both analyses were in excellent agreement with each other on the predicted failure loads, failure mode, and location of failure. The solid-shell type modeling was then used to analyze both a subcomponent test conducted by Airbus in 2003 (the 2003-subcomponent test) and the accident condition. Excellent agreement was observed between the analyses and the observed failures in both cases. The moment, Mx (moment about the fuselage longitudinal axis), has a significant effect on the failure load of the lugs. Higher absolute values of Mx give lower failure loads. The predicted load, mode, and location of the failure for the 1985-certification test, 2003-subcomponent test, and the accident condition are in very good agreement. This agreement suggests that the 1985-certification and 2003-subcomponent tests represent the accident condition accurately. The failure mode of the right rear lug for the 1985-certification test, 2003-subcomponent test, and the accident load case is identified as a cleavage-type failure. For the accident case, the predicted failure load for the right rear lug from the PFA is greater than 1.98 times the limit load of the lugs.

  18. Use of the Analysis of the Volatile Faecal Metabolome in Screening for Colorectal Cancer

    PubMed Central

    2015-01-01

    Colorectal cancer is diagnosed by colonoscopy, an invasive and expensive procedure that is usually carried out after a positive screening test. Unfortunately, existing screening tests lack specificity and sensitivity, hence many unnecessary colonoscopies are performed. Here we report on a potential new screening test for colorectal cancer based on the analysis of volatile organic compounds (VOCs) in the headspace of faecal samples. Faecal samples were obtained from subjects who had a positive faecal occult blood test (FOBT). Subjects subsequently had colonoscopies performed to classify them into low risk (non-cancer) and high risk (colorectal cancer) groups. Volatile organic compounds were analysed by selected ion flow tube mass spectrometry (SIFT-MS) and the data were then analysed using both univariate and multivariate statistical methods. Ions most likely from hydrogen sulphide, dimethyl sulphide and dimethyl disulphide are statistically significantly higher in samples from high risk than from low risk subjects. Results using multivariate methods show that the test gives a correct classification of 75% with 78% specificity and 72% sensitivity on FOBT positive samples, offering a potentially effective alternative to FOBT. PMID:26086914

  19. Bayesian multivariate hierarchical transformation models for ROC analysis.

    PubMed

    O'Malley, A James; Zou, Kelly H

    2006-02-15

    A Bayesian multivariate hierarchical transformation model (BMHTM) is developed for receiver operating characteristic (ROC) curve analysis based on clustered continuous diagnostic outcome data with covariates. Two special features of this model are that it incorporates non-linear monotone transformations of the outcomes and that multiple correlated outcomes may be analysed. The mean, variance, and transformation components are all modelled parametrically, enabling a wide range of inferences. The general framework is illustrated by focusing on two problems: (1) analysis of the diagnostic accuracy of a covariate-dependent univariate test outcome requiring a Box-Cox transformation within each cluster to map the test outcomes to a common family of distributions; (2) development of an optimal composite diagnostic test using multivariate clustered outcome data. In the second problem, the composite test is estimated using discriminant function analysis and compared to the test derived from logistic regression analysis where the gold standard is a binary outcome. The proposed methodology is illustrated on prostate cancer biopsy data from a multi-centre clinical trial.
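
    As a rough, non-Bayesian illustration of the two ingredients named above, the sketch below applies a Box-Cox transform (the log transform when lambda = 0) to skewed outcomes and then summarises diagnostic accuracy with a binormal AUC; the data and parameter choices are invented and the code is not the BMHTM itself.

      import math
      import numpy as np

      def box_cox(y, lam):
          """Box-Cox transform; y must be positive. lam = 0 gives the log transform."""
          y = np.asarray(y, dtype=float)
          return np.log(y) if lam == 0 else (y**lam - 1.0) / lam

      def binormal_auc(x_diseased, x_healthy):
          """AUC under a binormal model fitted to (already transformed) outcomes."""
          m1, m0 = np.mean(x_diseased), np.mean(x_healthy)
          s1, s0 = np.std(x_diseased, ddof=1), np.std(x_healthy, ddof=1)
          z = (m1 - m0) / math.sqrt(s1**2 + s0**2)
          return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

      rng = np.random.default_rng(1)
      diseased = rng.lognormal(1.0, 0.5, 300)   # invented right-skewed test outcomes
      healthy = rng.lognormal(0.5, 0.5, 300)
      print(round(binormal_auc(box_cox(diseased, 0), box_cox(healthy, 0)), 3))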

  20. Bayesian multivariate hierarchical transformation models for ROC analysis

    PubMed Central

    O'Malley, A. James; Zou, Kelly H.

    2006-01-01

    SUMMARY A Bayesian multivariate hierarchical transformation model (BMHTM) is developed for receiver operating characteristic (ROC) curve analysis based on clustered continuous diagnostic outcome data with covariates. Two special features of this model are that it incorporates non-linear monotone transformations of the outcomes and that multiple correlated outcomes may be analysed. The mean, variance, and transformation components are all modelled parametrically, enabling a wide range of inferences. The general framework is illustrated by focusing on two problems: (1) analysis of the diagnostic accuracy of a covariate-dependent univariate test outcome requiring a Box–Cox transformation within each cluster to map the test outcomes to a common family of distributions; (2) development of an optimal composite diagnostic test using multivariate clustered outcome data. In the second problem, the composite test is estimated using discriminant function analysis and compared to the test derived from logistic regression analysis where the gold standard is a binary outcome. The proposed methodology is illustrated on prostate cancer biopsy data from a multi-centre clinical trial. PMID:16217836

  1. Environmental Testing Philosophy for a Sandia National Laboratories' Small Satellite Project - A Retrospective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CAP,JEROME S.

    2000-08-24

    Sandia has recently completed the flight certification test series for the Multi-Spectral Thermal Imaging satellite (MTI), which is a small satellite for which Sandia was the system integrator. A paper was presented at the 16th Aerospace Testing Seminar discussing plans for performing the structural dynamics certification program for that satellite. The testing philosophy was originally based on a combination of system level vibroacoustic tests and component level shock and vibration tests. However, the plans evolved to include computational analyses using both Finite Element Analysis and Statistical Energy Analysis techniques. This paper outlines the final certification process and discusses lessons learned, including both things that went well and things that should or could have been done differently.

  2. Engineering analysis activities in support of susquehanna unit 1 startup testing and cycle 1 operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, G.D.; Kukielka, C.A.; Olson, L.M.

    The engineering analysis group is responsible for all nuclear plant systems analysis and reactor analysis activities, excluding fuel management analysis, at Pennsylvania Power and Light Company. These activities include making pretest and posttest predictions of startup tests; analyzing unplanned or unexpected transient events; providing technical training to plant personnel; assisting in the development of emergency drill scenarios; providing engineering evaluations to support design and technical specification changes; and evaluating, assessing, and resolving a number of license conditions. Many of these activities have required the direct use of RETRAN models. Two RETRAN analyses that were completed to support plant operations are investigated: a pretest analysis of the turbine trip startup test, and a posttest analysis of the loss of startup transformer event. For each case, RETRAN results are compared with available plant data and comparisons are drawn on the acceptability of the performance of the plant systems.

  3. GenAlEx 6.5: genetic analysis in Excel. Population genetic software for teaching and research—an update

    PubMed Central

    Peakall, Rod; Smouse, Peter E.

    2012-01-01

    Summary: GenAlEx: Genetic Analysis in Excel is a cross-platform package for population genetic analyses that runs within Microsoft Excel. GenAlEx offers analysis of diploid codominant, haploid and binary genetic loci and DNA sequences. Both frequency-based (F-statistics, heterozygosity, HWE, population assignment, relatedness) and distance-based (AMOVA, PCoA, Mantel tests, multivariate spatial autocorrelation) analyses are provided. New features include calculation of new estimators of population structure: G′ST, G′′ST, Jost’s Dest and F′ST through AMOVA, Shannon Information analysis, linkage disequilibrium analysis for biallelic data and novel heterogeneity tests for spatial autocorrelation analysis. Export to more than 30 other data formats is provided. Teaching tutorials and expanded step-by-step output options are included. The comprehensive guide has been fully revised. Availability and implementation: GenAlEx is written in VBA and provided as a Microsoft Excel Add-in (compatible with Excel 2003, 2007, 2010 on PC; Excel 2004, 2011 on Macintosh). GenAlEx, and supporting documentation and tutorials are freely available at: http://biology.anu.edu.au/GenAlEx. Contact: rod.peakall@anu.edu.au PMID:22820204
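
    As a pointer to what the frequency-based statistics involve, observed heterozygosity (Ho) and Hardy-Weinberg expected heterozygosity (He) for one codominant locus can be computed as below; the genotypes are invented and the snippet is independent of GenAlEx itself.

      from collections import Counter

      # Invented diploid genotypes at one codominant locus
      genotypes = [("A", "A"), ("A", "B"), ("B", "B"), ("A", "B"),
                   ("A", "C"), ("B", "C"), ("A", "A"), ("A", "B")]

      n = len(genotypes)
      observed_het = sum(a != b for a, b in genotypes) / n     # Ho: fraction of heterozygotes

      allele_counts = Counter(allele for pair in genotypes for allele in pair)
      freqs = {allele: c / (2 * n) for allele, c in allele_counts.items()}
      expected_het = 1.0 - sum(p**2 for p in freqs.values())   # He = 1 - sum(p_i^2)

      print(f"Ho = {observed_het:.3f}, He = {expected_het:.3f}")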

  4. Residual Strength Pressure Tests and Nonlinear Analyses of Stringer- and Frame-Stiffened Aluminum Fuselage Panels with Longitudinal Cracks

    NASA Technical Reports Server (NTRS)

    Young, Richard D.; Rouse, Marshall; Ambur, Damodar R.; Starnes, James H., Jr.

    1999-01-01

    The results of residual strength pressure tests and nonlinear analyses of stringer- and frame-stiffened aluminum fuselage panels with longitudinal cracks are presented. Two types of damage are considered: a longitudinal crack located midway between stringers, and a longitudinal crack adjacent to a stringer and along a row of fasteners in a lap joint that has multiple-site damage (MSD). In both cases, the longitudinal crack is centered on a severed frame. The panels are subjected to internal pressure plus axial tension loads. The axial tension loads are equivalent to a bulkhead pressure load. Nonlinear elastic-plastic residual strength analyses of the fuselage panels are conducted using a finite element program and the crack-tip-opening-angle (CTOA) fracture criterion. Predicted crack growth and residual strength results from nonlinear analyses of the stiffened fuselage panels are compared with experimental measurements and observations. Both the test and analysis results indicate that the presence of MSD affects crack growth stability and reduces the residual strength of stiffened fuselage shells with long cracks.

  5. Residual Strength Pressure Tests and Nonlinear Analyses of Stringer-and Frame-Stiffened Aluminum Fuselage Panels with Longitudinal Cracks

    NASA Technical Reports Server (NTRS)

    Young, Richard D.; Rouse, Marshall; Ambur, Damodar R.; Starnes, James H., Jr.

    1998-01-01

    The results of residual strength pressure tests and nonlinear analyses of stringer- and frame-stiffened aluminum fuselage panels with longitudinal cracks are presented. Two types of damage are considered: a longitudinal crack located midway between stringers, and a longitudinal crack adjacent to a stringer and along a row of fasteners in a lap joint that has multiple-site damage (MSD). In both cases, the longitudinal crack is centered on a severed frame. The panels are subjected to internal pressure plus axial tension loads. The axial tension loads are equivalent to a bulkhead pressure load. Nonlinear elastic-plastic residual strength analyses of the fuselage panels are conducted using a finite element program and the crack-tip-opening-angle (CTOA) fracture criterion. Predicted crack growth and residual strength results from nonlinear analyses of the stiffened fuselage panels are compared with experimental measurements and observations. Both the test and analysis results indicate that the presence of MSD affects crack growth stability and reduces the residual strength of stiffened fuselage shells with long cracks.

  6. A review of 241 subjects who were patch tested twice: could fragrance mix I cause active sensitization?

    PubMed

    White, J M L; McFadden, J P; White, I R

    2008-03-01

    Active patch test sensitization is an uncommon phenomenon which may have undesirable consequences for those undergoing this gold-standard investigation for contact allergy. Our objective was to perform a retrospective analysis of the results of 241 subjects who were patch tested twice in a single centre evaluating approximately 1500 subjects per year. Positivity to 11 common allergens in the recommended Baseline Series of contact allergens (European) was analysed: nickel sulphate; Myroxylon pereirae; fragrance mix I; para-phenylenediamine; colophonium; epoxy resin; neomycin; quaternium-15; thiuram mix; sesquiterpene lactone mix; and para-tert-butylphenol resin. Only fragrance mix I gave a statistically significant, increased rate of positivity on the second reading compared with the first (P=0.011). This trend was maintained when separately analysing a subgroup of 42 subjects who had been repeat patch tested within 1 year; this analysis was done to minimize the potential confounding factor of increased usage of fragrances with a wide interval between both tests. To reduce the confounding effect of age on our data, we calculated expected frequencies of positivity to fragrance mix I based on previously published data from our centre. This showed a marked excess of observed cases over predicted ones, particularly in women in the age range 40-60 years. We suspect that active sensitization to fragrance mix I may occur. Similar published analysis from another large group using standard methodology supports our data.

  7. Bayesian meta-analysis of Cronbach's coefficient alpha to evaluate informative hypotheses.

    PubMed

    Okada, Kensuke

    2015-12-01

    This paper proposes a new method to evaluate informative hypotheses for meta-analysis of Cronbach's coefficient alpha using a Bayesian approach. The coefficient alpha is one of the most widely used reliability indices. In meta-analyses of reliability, researchers typically form specific informative hypotheses beforehand, such as 'alpha of this test is greater than 0.8' or 'alpha of one form of a test is greater than the others.' The proposed method enables direct evaluation of these informative hypotheses. To this end, a Bayes factor is calculated to evaluate the informative hypothesis against its complement. It allows researchers to summarize the evidence provided by previous studies in favor of their informative hypothesis. The proposed approach can be seen as a natural extension of the Bayesian meta-analysis of coefficient alpha recently proposed in this journal (Brannick and Zhang, 2013). The proposed method is illustrated through two meta-analyses of real data that evaluate different kinds of informative hypotheses on superpopulation: one is that alpha of a particular test is above the criterion value, and the other is that alphas among different test versions have ordered relationships. Informative hypotheses are supported from the data in both cases, suggesting that the proposed approach is promising for application. Copyright © 2015 John Wiley & Sons, Ltd.
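
    For readers unfamiliar with the quantity being meta-analysed, Cronbach's alpha for a k-item test is k/(k-1) * (1 - sum of item variances / variance of the total score). The sketch below computes it from raw item scores; the data are invented and this is only the classical point estimate, not the Bayesian meta-analytic machinery of the paper.

      import numpy as np

      def cronbach_alpha(items):
          """items: 2-D array, rows = respondents, columns = test items."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_variances = items.var(axis=0, ddof=1).sum()
          total_variance = items.sum(axis=1).var(ddof=1)
          return (k / (k - 1)) * (1.0 - item_variances / total_variance)

      # Invented 5-item responses from 6 respondents
      scores = [[4, 5, 4, 4, 5],
                [2, 3, 2, 3, 2],
                [5, 5, 4, 5, 5],
                [3, 3, 3, 2, 3],
                [4, 4, 5, 4, 4],
                [1, 2, 1, 2, 2]]
      print(round(cronbach_alpha(scores), 3))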

  8. Airfoil Vibration Dampers program

    NASA Technical Reports Server (NTRS)

    Cook, Robert M.

    1991-01-01

    The Airfoil Vibration Damper program has consisted of an analysis phase and a testing phase. During the analysis phase, a state-of-the-art computer code was developed, which can be used to guide designers in the placement and sizing of friction dampers. The use of this computer code was demonstrated by performing representative analyses on turbine blades from the High Pressure Oxidizer Turbopump (HPOTP) and High Pressure Fuel Turbopump (HPFTP) of the Space Shuttle Main Engine (SSME). The testing phase of the program consisted of performing friction damping tests on two different cantilever beams. Data from these tests provided an empirical check on the accuracy of the computer code developed in the analysis phase. Results of the analysis and testing showed that the computer code can accurately predict the performance of friction dampers. In addition, a valuable set of friction damping data was generated, which can be used to aid in the design of friction dampers, as well as provide benchmark test cases for future code developers.

  9. Assessment accommodations on tests of academic achievement for students who are deaf or hard of hearing: a qualitative meta-analysis of the research literature.

    PubMed

    Cawthon, Stephanie; Leppo, Rachel

    2013-01-01

    The authors conducted a qualitative meta-analysis of the research on assessment accommodations for students who are deaf or hard of hearing. There were 16 identified studies that analyzed the impact of factors related to student performance on academic assessments across different educational settings, content areas, and types of assessment accommodations. The meta-analysis found that the results of analyses of group effects of accommodated versus unaccommodated test formats are often not significant, test-level factors exist that can affect how students perceive the assessments, and differences exist in how test items function across different conditions. Student-level factors, including educational context and academic proficiency, influence accommodations' role in assessment processes. The results of this analysis highlight the complexity of and intersections between student-level factors, test-level factors, and larger policy contexts. Findings are discussed within the context of larger changes in academic assessment, including computer-based administration and high-stakes testing.

  10. Laboratory longitudinal diffusion tests: 1. Dimensionless formulations and validity of simplified solutions

    NASA Astrophysics Data System (ADS)

    Takeda, M.; Nakajima, H.; Zhang, M.; Hiratsuka, T.

    2008-04-01

    To obtain reliable diffusion parameters for diffusion testing, multiple experiments should not only be cross-checked but the internal consistency of each experiment should also be verified. In the through- and in-diffusion tests with solution reservoirs, test interpretation of different phases often makes use of simplified analytical solutions. This study explores the feasibility of steady, quasi-steady, equilibrium and transient-state analyses using simplified analytical solutions with respect to (i) valid conditions for each analytical solution, (ii) potential error, and (iii) experimental time. For increased generality, a series of numerical analyses are performed using unified dimensionless parameters and the results are all related to dimensionless reservoir volume (DRV) which includes only the sorptive parameter as an unknown. This means the above factors can be investigated on the basis of the sorption properties of the testing material and/or tracer. The main findings are that steady, quasi-steady and equilibrium-state analyses are applicable when the tracer is not highly sorptive. However, quasi-steady and equilibrium-state analyses become inefficient or impractical compared to steady state analysis when the tracer is non-sorbing and material porosity is significantly low. Systematic and comprehensive reformulation of analytical models enables the comparison of experimental times between different test methods. The applicability and potential error of each test interpretation can also be studied. These can be applied in designing, performing, and interpreting diffusion experiments by deducing DRV from the available information for the target material and tracer, combined with the results of this study.

  11. Analysis/test correlation using VAWT-SDS on a step-relaxation test for the rotating Sandia 34 m test bed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Argüello, J.G.; Dohrmann, C.R.; Carne, T.G.

    The combined analysis/test effort described in this paper compares predictions with measured data from a step-relaxation test in the absence of significant wind-driven aerodynamic loading. The process described here is intended to illustrate a method for validation of time domain codes for structural analysis of wind turbine structures. Preliminary analyses were performed to investigate the transient dynamic response that the rotating Sandia 34 m Vertical Axis Wind Turbine (VAWT) would undergo when one of the two blades was excited by step-relaxation. The calculations served two purposes. The first was for pretest planning to evaluate the relative importance of the various forces that would be acting on the structure during the test and to determine if the applied force in the step-relaxation would be sufficient to produce an excitation that was distinguishable from that produced by the aerodynamic loads. The second was to provide predictions that could subsequently be compared to the data from the test. The test was carried out specifically to help in the validation of the time-domain structural dynamics code, VAWT-SDS, which predicts the dynamic response of VAWTs subject to transient events. Post-test comparisons with the data were performed and showed a qualitative agreement between pretest predictions and measured response. However, they also showed that there was significantly more damping in the measurements than included in the predictions. Efforts to resolve this difference, including post-test analyses, were undertaken and are reported herein. The overall effort described in this paper represents a major step in the process of arriving at a validated structural dynamics code.

  12. D and 18O enrichment measurements in biological fluids in a continuous-flow elemental analyser with an isotope-ratio mass spectrometer using two configurations.

    PubMed

    Ripoche, N; Ferchaud-Roucher, V; Krempf, M; Ritz, P

    2006-09-01

    In doubly labelled water studies, biological sample enrichments are mainly measured using off-line techniques (equilibration followed by dual-inlet introduction) or high-temperature elemental analysis (HT-EA), coupled with an isotope-ratio mass spectrometer (IRMS). Here another continuous-flow method (CF-EA/IRMS), initially dedicated to water, is tested for plasma and urine analyses. The elemental analyser configuration is adapted for each stable isotope: chromium tube for deuterium reduction and glassy carbon reactor for 18O pyrolysis. Before on-line conversion of water into gas, each matrix is submitted to a short and easy treatment, which is the same for the analysis of the two isotopes. Plasma is passed through centrifugal filters. Urine is cleaned with black carbon and filtered (0.45 μm diameter). Tested between 150 and 300 ppm in these fluids, the D/H ratio response is linear with good repeatability (SD<0.2 ppm) and reproducibility (SD<0.5 ppm). For 18O/16O ratios (from 2000 to 2200 ppm), the same repeatability is obtained with a between-day precision lower than 1.4 ppm. The accuracy on biological samples is validated by comparison to classical dual-inlet methods: 18O analyses give more accurate results. The data show that enriched physiological fluids can be successfully analysed in CF-EA/IRMS. Copyright (c) 2006 John Wiley & Sons, Ltd.

  13. Cost-Effectiveness Analysis of Different Genetic Testing Strategies for Lynch Syndrome in Taiwan.

    PubMed

    Chen, Ying-Erh; Kao, Sung-Shuo; Chung, Ren-Hua

    2016-01-01

    Patients with Lynch syndrome (LS) have a significantly increased risk of developing colorectal cancer (CRC) and other cancers. Genetic screening for LS among patients with newly diagnosed CRC aims to identify mutations in the disease-causing genes (i.e., the DNA mismatch repair genes) in the patients, to offer genetic testing for relatives of the patients with the mutations, and then to provide early prevention for the relatives with the mutations. Several genetic tests are available for LS, such as DNA sequencing for MMR genes and tumor testing using microsatellite instability and immunohistochemical analyses. Cost-effectiveness analyses of different genetic testing strategies for LS have been performed in several studies from different countries such as the US and Germany. However, a cost-effectiveness analysis for the testing has not yet been performed in Taiwan. In this study, we evaluated the cost-effectiveness of four genetic testing strategies for LS described in previous studies, while population-specific parameters, such as the mutation rates of the DNA mismatch repair genes and treatment costs for CRC in Taiwan, were used. The incremental cost-effectiveness ratios based on discounted life years gained due to genetic screening were calculated for the strategies relative to no screening and to the previous strategy. Using the World Health Organization standard, which was defined based on Taiwan's Gross Domestic Product per capita, the strategy based on immunohistochemistry as a genetic test followed by BRAF mutation testing was considered to be highly cost-effective relative to no screening. Our probabilistic sensitivity analysis results also suggest that the strategy has a probability of 0.939 of being cost-effective relative to no screening based on the commonly used threshold of $50,000 to determine cost-effectiveness. To the best of our knowledge, this is the first cost-effectiveness analysis for evaluating different genetic testing strategies for LS in Taiwan. The results will be informative for the government when considering offering screening for LS in patients newly diagnosed with CRC.
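
    The incremental cost-effectiveness ratios referred to are differences in expected cost divided by differences in discounted life-years gained, judged against a willingness-to-pay threshold. A hedged sketch of that calculation with invented numbers (the strategy names and values are placeholders, not the study's inputs):

      # Invented (cost, discounted life-years gained) per index patient for each strategy.
      strategies = {
          "no screening": (0.0, 0.000),
          "IHC followed by BRAF testing": (1200.0, 0.030),
          "universal germline sequencing": (5200.0, 0.034),
      }
      willingness_to_pay = 50_000  # threshold per life-year gained

      base_cost, base_effect = strategies["no screening"]
      for name, (cost, effect) in strategies.items():
          if name == "no screening":
              continue
          icer = (cost - base_cost) / (effect - base_effect)
          verdict = "cost-effective" if icer <= willingness_to_pay else "not cost-effective"
          print(f"{name}: ICER = {icer:,.0f} per life-year gained ({verdict} vs no screening)")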

  14. gHRV: Heart rate variability analysis made easy.

    PubMed

    Rodríguez-Liñares, L; Lado, M J; Vila, X A; Méndez, A J; Cuesta, P

    2014-08-01

    In this paper, the gHRV software tool is presented. It is a simple, free and portable tool developed in python for analysing heart rate variability. It includes a graphical user interface and it can import files in multiple formats, analyse time intervals in the signal, test statistical significance and export the results. This paper also contains, as an example of use, a clinical analysis performed with the gHRV tool, namely to determine whether the heart rate variability indexes change across different stages of sleep. Results from tests completed by researchers who have tried gHRV are also explained: in general the application was positively valued and results reflect a high level of satisfaction. gHRV is in continuous development and new versions will include suggestions made by testers. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
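
    For context on the kind of indexes such a tool reports, two standard time-domain heart rate variability measures, SDNN and RMSSD, can be computed directly from a series of RR intervals. The sketch below uses invented intervals and is not gHRV code.

      import numpy as np

      # Invented RR intervals in milliseconds
      rr_ms = np.array([812, 790, 845, 860, 830, 815, 870, 855, 820, 805], dtype=float)

      sdnn = rr_ms.std(ddof=1)                      # overall variability of RR intervals
      rmssd = np.sqrt(np.mean(np.diff(rr_ms)**2))   # beat-to-beat (short-term) variability

      print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")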

  15. Development of the performance confirmation program at Yucca Mountain, Nevada

    USGS Publications Warehouse

    LeCain, G.D.; Barr, D.; Weaver, D.; Snell, R.; Goodin, S.W.; Hansen, F.D.

    2006-01-01

    The Yucca Mountain Performance Confirmation program consists of tests, monitoring activities, experiments, and analyses to evaluate the adequacy of assumptions, data, and analyses that form the basis of the conceptual and numerical models of flow and transport associated with a proposed radioactive waste repository at Yucca Mountain, Nevada. The Performance Confirmation program uses an eight-stage risk-informed, performance-based approach. Selection of the Performance Confirmation activities for inclusion in the Performance Confirmation program was done using a risk-informed, performance-based decision analysis. The result of this analysis was a Performance Confirmation base portfolio that consists of 20 activities. The 20 Performance Confirmation activities include geologic, hydrologic, and construction/engineering testing. Some of the activities began during site characterization, and others will begin during construction or post emplacement, and continue until repository closure.

  16. Analysis for the Progressive Failure Response of Textile Composite Fuselage Frames

    NASA Technical Reports Server (NTRS)

    Johnson, Eric R.; Boitnott, Richard L. (Technical Monitor)

    2002-01-01

    A part of aviation accident mitigation is a crashworthy airframe structure, and an important measure of merit for a crashworthy structure is the amount of kinetic energy that can be absorbed in the crush of the structure. Prediction of the energy absorbed from finite element analyses requires modeling the progressive failure sequence. Progressive failure modes may include material degradation, fracture and crack growth, and buckling and collapse. The design of crashworthy airframe components will benefit from progressive failure analyses that have been validated by tests. The subject of this research is the development of a progressive failure analysis for a textile composite, circumferential fuselage frame subjected to a quasi-static, crash-type load. The test data for the frame are reported, and these data are used to develop and to validate methods for the progressive failure response.

  17. The Synthetic Experiment: E. B. Titchener's Cornell Psychological Laboratory and the Test of Introspective Analysis.

    PubMed

    Evans, Rand B

    2017-01-01

    Beginning in the early 1900s, a major thread of research was added to E. B. Titchener's Cornell laboratory: the synthetic experiment. Titchener and his graduate students used introspective analysis to reduce a perception, a complex experience, into its simple sensory constituents. To test the validity of that analysis, stimulus patterns were selected to reproduce the patterns of sensations found in the introspective analyses. If the original perception could be reconstructed in this way, then the analysis was considered validated. This article reviews development of the synthetic method in E. B. Titchener's laboratory at Cornell University and examines its impact on psychological research.

  18. Residual antibiotics in decontaminated human cardiovascular tissues intended for transplantation and risk of falsely negative microbiological analyses.

    PubMed

    Buzzi, Marina; Guarino, Anna; Gatto, Claudio; Manara, Sabrina; Dainese, Luca; Polvani, Gianluca; Tóthová, Jana D'Amato

    2014-01-01

    We investigated the presence of antibiotics in cryopreserved cardiovascular tissues and cryopreservation media, after tissue decontamination with antibiotic cocktails, and the impact of antibiotic residues on standard tissue bank microbiological analyses. Sixteen cardiovascular tissues were decontaminated with bank-prepared cocktails and cryopreserved by two different tissue banks according to their standard operating procedures. Before and after decontamination, samples underwent microbiological analysis by standard tissue bank methods. Cryopreserved samples were tested again with and without the removal of antibiotic residues using a RESEP tube, after thawing. Presence of antibiotics in tissue homogenates and processing liquids was determined by a modified agar diffusion test. All cryopreserved tissue homogenates and cryopreservation media induced important inhibition zones on both Staphylococcus aureus- and Pseudomonas aeruginosa-seeded plates, immediately after thawing and at the end of the sterility test. The RESEP tube treatment markedly reduced or totally eliminated the antimicrobial activity of tested tissues and media. Based on standard tissue bank analysis, 50% of tissues were found positive for bacteria and/or fungi before decontamination, and 2 of the 16 tested samples (13%) still contained microorganisms after decontamination. After thawing, none of the 16 cryopreserved samples tested positive with the direct inoculum method. When the same samples were tested after removal of antibiotic residues, 8 out of 16 (50%) were contaminated. Antibiotic residues present in tissue allografts and processing liquids after decontamination may mask microbial contamination during microbiological analysis performed with standard tissue bank methods, thus resulting in false negatives.

  19. Residual Antibiotics in Decontaminated Human Cardiovascular Tissues Intended for Transplantation and Risk of Falsely Negative Microbiological Analyses

    PubMed Central

    Gatto, Claudio; Manara, Sabrina; Dainese, Luca; Polvani, Gianluca; Tóthová, Jana D'Amato

    2014-01-01

    We investigated the presence of antibiotics in cryopreserved cardiovascular tissues and cryopreservation media, after tissue decontamination with antibiotic cocktails, and the impact of antibiotic residues on standard tissue bank microbiological analyses. Sixteen cardiovascular tissues were decontaminated with bank-prepared cocktails and cryopreserved by two different tissue banks according to their standard operating procedures. Before and after decontamination, samples underwent microbiological analysis by standard tissue bank methods. Cryopreserved samples were tested again with and without the removal of antibiotic residues using a RESEP tube, after thawing. Presence of antibiotics in tissue homogenates and processing liquids was determined by a modified agar diffusion test. All cryopreserved tissue homogenates and cryopreservation media induced important inhibition zones on both Staphylococcus aureus- and Pseudomonas aeruginosa-seeded plates, immediately after thawing and at the end of the sterility test. The RESEP tube treatment markedly reduced or totally eliminated the antimicrobial activity of tested tissues and media. Based on standard tissue bank analysis, 50% of tissues were found positive for bacteria and/or fungi before decontamination, and 2 of the 16 tested samples (13%) still contained microorganisms after decontamination. After thawing, none of the 16 cryopreserved samples tested positive with the direct inoculum method. When the same samples were tested after removal of antibiotic residues, 8 out of 16 (50%) were contaminated. Antibiotic residues present in tissue allografts and processing liquids after decontamination may mask microbial contamination during microbiological analysis performed with standard tissue bank methods, thus resulting in false negatives. PMID:25397402

  20. Meta-analysis of diagnostic accuracy studies in mental health

    PubMed Central

    Takwoingi, Yemisi; Riley, Richard D; Deeks, Jonathan J

    2015-01-01

    Objectives: To explain methods for data synthesis of evidence from diagnostic test accuracy (DTA) studies, and to illustrate different types of analyses that may be performed in a DTA systematic review. Methods: We described properties of meta-analytic methods for quantitative synthesis of evidence. We used a DTA review comparing the accuracy of three screening questionnaires for bipolar disorder to illustrate application of the methods for each type of analysis. Results: The discriminatory ability of a test is commonly expressed in terms of sensitivity (proportion of those with the condition who test positive) and specificity (proportion of those without the condition who test negative). There is a trade-off between sensitivity and specificity, as an increasing threshold for defining test positivity will decrease sensitivity and increase specificity. Methods recommended for meta-analysis of DTA studies, such as the bivariate or hierarchical summary receiver operating characteristic (HSROC) model, jointly summarise sensitivity and specificity while taking into account this threshold effect, as well as allowing for between-study differences in test performance beyond what would be expected by chance. The bivariate model focuses on estimation of a summary sensitivity and specificity at a common threshold, while the HSROC model focuses on the estimation of a summary curve from studies that have used different thresholds. Conclusions: Meta-analyses of diagnostic accuracy studies can provide answers to important clinical questions. We hope this article will provide clinicians with sufficient understanding of the terminology and methods to aid interpretation of systematic reviews and facilitate better patient care. PMID:26446042
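
    The per-study inputs to either the bivariate or the HSROC model are simply the paired, logit-transformed sensitivities and specificities from each study's two-by-two table; a minimal sketch with invented counts (not data from the bipolar disorder review) is:

      import math

      # Invented 2x2 counts per study: (true pos, false neg, true neg, false pos)
      studies = [(45, 5, 80, 20), (30, 10, 60, 15), (55, 15, 90, 10)]

      def logit(p):
          return math.log(p / (1.0 - p))

      for i, (tp, fn, tn, fp) in enumerate(studies, 1):
          sens = tp / (tp + fn)
          spec = tn / (tn + fp)
          print(f"study {i}: sens={sens:.2f} (logit {logit(sens):+.2f}), "
                f"spec={spec:.2f} (logit {logit(spec):+.2f})")
      # The bivariate model places a joint normal distribution over these paired
      # logits across studies, which captures the threshold-driven trade-off.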

  1. Pre-Test Analysis Predictions for the Shell Buckling Knockdown Factor Checkout Tests - TA01 and TA02

    NASA Technical Reports Server (NTRS)

    Thornburgh, Robert P.; Hilburger, Mark W.

    2011-01-01

    This report summarizes the pre-test analysis predictions for the SBKF-P2-CYL-TA01 and SBKF-P2-CYL-TA02 shell buckling tests conducted at the Marshall Space Flight Center (MSFC) in support of the Shell Buckling Knockdown Factor (SBKF) Project, NASA Engineering and Safety Center (NESC) Assessment. The test article (TA) is an 8-foot-diameter aluminum-lithium (Al-Li) orthogrid cylindrical shell with similar design features as that of the proposed Ares-I and Ares-V barrel structures. In support of the testing effort, detailed structural analyses were conducted and the results were used to monitor the behavior of the TA during the testing. A summary of predicted results for each of the five load sequences is presented herein.

  2. Performance Tested Method multiple laboratory validation study of ELISA-based assays for the detection of peanuts in food.

    PubMed

    Park, Douglas L; Coates, Scott; Brewer, Vickery A; Garber, Eric A E; Abouzied, Mohamed; Johnson, Kurt; Ritter, Bruce; McKenzie, Deborah

    2005-01-01

    Performance Tested Method multiple laboratory validations for the detection of peanut protein in 4 different food matrixes were conducted under the auspices of the AOAC Research Institute. In this blind study, 3 commercially available ELISA test kits were validated: Neogen Veratox for Peanut, R-Biopharm RIDASCREEN FAST Peanut, and Tepnel BioKits for Peanut Assay. The food matrixes used were breakfast cereal, cookies, ice cream, and milk chocolate spiked at 0 and 5 ppm peanut. Analyses of the samples were conducted by laboratories representing industry and international and U.S. governmental agencies. All 3 commercial test kits successfully identified spiked and peanut-free samples. The validation study required 60 analyses on test samples at the target level of 5 microg peanut/g food and 60 analyses at a peanut-free level, which was designed to ensure that the lower 95% confidence limit for the sensitivity and specificity would not be <90%. The probability that a test sample contains an allergen, given a prevalence rate of 5% and a positive result from a single test kit analysis with 95% sensitivity and 95% specificity (as demonstrated for these test kits), would be 50%. When 2 test kits are run simultaneously on all samples, the probability becomes 95%. It is therefore recommended that all field samples be analyzed with at least 2 of the validated kits.
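
    The quoted 50% and 95% figures follow from Bayes' theorem applied to the stated prevalence, sensitivity, and specificity. A short check, assuming the two kits err independently (which the abstract implies but does not state):

      def positive_predictive_value(prevalence, sensitivity, specificity):
          """Probability the sample truly contains the allergen given a positive result."""
          true_pos = prevalence * sensitivity
          false_pos = (1.0 - prevalence) * (1.0 - specificity)
          return true_pos / (true_pos + false_pos)

      prev, sens, spec = 0.05, 0.95, 0.95
      single = positive_predictive_value(prev, sens, spec)
      # Two independent kits both positive: sensitivities multiply, as do false-positive rates.
      double = positive_predictive_value(prev, sens**2, 1.0 - (1.0 - spec)**2)
      print(f"one kit: {single:.2f}, two kits: {double:.2f}")   # 0.50 and 0.95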

  3. Round Robin Analyses of the Steel Containment Vessel Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Costello, J.F.; Hashimoto, T.; Klamerus, E.W.

    A high pressure test of the steel containment vessel (SCV) model was conducted on December 11-12, 1996 at Sandia National Laboratories, Albuquerque, NM, USA. The test model is a mixed-scaled model (1:10 in geometry and 1:4 in shell thickness) of an improved Mark II boiling water reactor (BWR) containment. Several organizations from the US, Europe, and Asia were invited to participate in a Round Robin analysis to perform independent pretest predictions and posttest evaluations of the behavior of the SCV model during the high pressure test. Both pretest and posttest analysis results from all Round Robin participants were compared to the high pressure test data. This paper summarizes the Round Robin analysis activities and discusses the lessons learned from the collective effort.

  4. SRM Internal Flow Tests and Computational Fluid Dynamic Analysis. Volume 4; Cold Flow Analyses and CFD Analysis Capability Development

    NASA Technical Reports Server (NTRS)

    1995-01-01

    An evaluation of the effect of model inlet air temperature drift during a test run was performed to aid in the decision on the need for and/or the schedule for including heaters in the SRMAFTE. The Sverdrup acceptance test data was used to determine the drift in air temperature during runs over the entire range of delivered flow rates and pressures. The effect of this temperature drift on the model Reynolds number was also calculated. It was concluded from this study that a 2% change in absolute temperature during a test run could be adequately accounted for by the data analysis program. A handout package of these results was prepared and presented to ED35 management.

  5. ATD-1 Avionics Phase 2: Post-Flight Data Analysis Report

    NASA Technical Reports Server (NTRS)

    Scharl, Julien

    2017-01-01

    This report aims to satisfy Air Traffic Management Technology Demonstration - 1 (ATD-1) Statement of Work (SOW) 3.6.19 and serves as the delivery mechanism for the analysis described in Annex C of the Flight Test Plan. The report describes the data collected and derived as well as the analysis methodology and associated results extracted from the data set collected during the ATD-1 Flight Test. All analyses described in the SOW were performed and are covered in this report except for the analysis of Final Approach Speed and its effect on performance. This analysis was de-prioritized and, at the time of this report, is not considered feasible in the schedule and costs remaining.

  6. In Vivo Assessment of Cold Tolerance through Chlorophyll-a Fluorescence in Transgenic Zoysiagrass Expressing Mutant Phytochrome A

    PubMed Central

    Gururani, Mayank Anand; Venkatesh, Jelli; Ganesan, Markkandan; Strasser, Reto Jörg; Han, Yunjeong; Kim, Jeong-Il; Lee, Hyo-Yeon; Song, Pill-Soon

    2015-01-01

    Chlorophyll-a fluorescence analysis provides relevant information about the physiology of plants growing under abiotic stress. In this study, we evaluated the influence of cold stress on the photosynthetic machinery of transgenic turfgrass, Zoysia japonica, expressing oat phytochrome A (PhyA) or a hyperactive mutant phytochrome A (S599A) with post-translational phosphorylation blocked. Biochemical analysis of zoysiagrass subjected to cold stress revealed reduced levels of hydrogen peroxide, increased proline accumulation, and enhanced specific activities of antioxidant enzymes compared to those of control plants. Detailed analyses of the chlorophyll-a fluorescence data through the so-called OJIP test exhibited a marked difference in the physiological status among transgenic and control plants. Overall, these findings suggest an enhanced level of cold tolerance in S599A zoysiagrass cultivars as reflected in the biochemical and physiological analyses. Further, we propose that chlorophyll-a fluorescence analysis using OJIP test is an efficient tool in determining the physiological status of plants under cold stress conditions. PMID:26010864

  7. Evaluation of factors that affect diesel exhaust toxicity. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Norbeck, J.M.; Smith, M.R.; Arey, J.

    1998-07-01

    The scope of this project was to obtain a preliminary assessment of the potential impact of the fuel formulation on the speciation and toxic components of diesel exhaust. The test bed was a Cummins L10 engine operating over the heavy-duty transient test cycle using three diesel fuels: a pre-1993 diesel fuel, a low aromatic diesel fuel, and an alternative formulation diesel fuel. The sampling/analysis plan included: determination of the criteria pollutant emission rates (THC, CO, NOx, and PM); determination of PM(10) and PM(2.5) emission rates; collection and analysis of particulate samples for elemental, inorganic ion and elemental/organic carbon analyses; collection of bag samples for VOC speciation analyses; collection of 2,4-dinitrophenylhydrazine (DNPH) cartridges for determination of oxygenates; collection of nitrosomorpholine with Thermosorb N cartridges; collection of semi-volatiles on PF/XAD and particulate samples for PAH, nitro-PAH, and mutagenicity studies; and collection and analysis of dioxins for the pre-1993 and alternative formulation diesel fuels.

  8. Threatening communication: a critical re-analysis and a revised meta-analytic test of fear appeal theory.

    PubMed

    Peters, Gjalt-Jorn Ygram; Ruiter, Robert A C; Kok, Gerjo

    2013-05-01

    Despite decades of research, consensus regarding the dynamics of fear appeals remains elusive. A meta-analysis was conducted that was designed to resolve this controversy. Publications that were included in previous meta-analyses were re-analysed, and a number of additional publications were located. The inclusion criteria were full factorial orthogonal manipulations of threat and efficacy, and measurement of behaviour as an outcome. Fixed and random effects models were used to compute mean effect size estimates. Meta-analysis of the six studies that satisfied the inclusion criteria clearly showed a significant interaction between threat and efficacy, such that threat only had an effect under high efficacy (d = 0.31), and efficacy only had an effect under high threat (d = 0.71). Inconsistency in results regarding the effectiveness of threatening communication can likely be attributed to flawed methodology. Proper tests of fear appeal theory yielded the theoretically hypothesised interaction effect. Threatening communication should exclusively be used when pilot studies indicate that an intervention successfully enhances efficacy.
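
    For readers unfamiliar with the machinery, a fixed-effect pooled estimate weights each study's standardized mean difference by the inverse of its variance. The sketch below shows the mechanics on invented effect sizes and variances, not the six studies analysed here.

      # Invented per-study standardized mean differences and their variances.
      effects = [(0.42, 0.05), (0.25, 0.08), (0.61, 0.04), (0.10, 0.10)]

      weights = [1.0 / var for _, var in effects]
      pooled = sum(w * d for (d, _), w in zip(effects, weights)) / sum(weights)
      pooled_se = (1.0 / sum(weights)) ** 0.5
      print(f"fixed-effect d = {pooled:.2f} (SE {pooled_se:.2f})")
      # A random-effects model would add a between-study variance component (tau^2)
      # to each study's variance before weighting.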

  9. Threatening communication: a critical re-analysis and a revised meta-analytic test of fear appeal theory

    PubMed Central

    Peters, Gjalt-Jorn Ygram; Ruiter, Robert A.C.; Kok, Gerjo

    2013-01-01

    Despite decades of research, consensus regarding the dynamics of fear appeals remains elusive. A meta-analysis was conducted that was designed to resolve this controversy. Publications that were included in previous meta-analyses were re-analysed, and a number of additional publications were located. The inclusion criteria were full factorial orthogonal manipulations of threat and efficacy, and measurement of behaviour as an outcome. Fixed and random effects models were used to compute mean effect size estimates. Meta-analysis of the six studies that satisfied the inclusion criteria clearly showed a significant interaction between threat and efficacy, such that threat only had an effect under high efficacy (d = 0.31), and efficacy only had an effect under high threat (d = 0.71). Inconsistency in results regarding the effectiveness of threatening communication can likely be attributed to flawed methodology. Proper tests of fear appeal theory yielded the theoretically hypothesised interaction effect. Threatening communication should exclusively be used when pilot studies indicate that an intervention successfully enhances efficacy. PMID:23772231

  10. Aeroelastic Analysis Of Joined Wing Of High Altitude Long Endurance (HALE) Aircraft Based On The Sensor-Craft Configuration

    NASA Astrophysics Data System (ADS)

    Marisarla, Soujanya; Ghia, Urmila; Ghia, Kirti "Karman"

    2002-11-01

    Towards a comprehensive aeroelastic analysis of a joined wing, fluid dynamics and structural analyses are initially performed separately. Steady flow calculations are currently performed using 3-D compressible Navier-Stokes equations. Flow analysis of M6-Onera wing served to validate the software for the fluid dynamics analysis. The complex flow field of the joined wing is analyzed and the prevailing fluid dynamic forces are computed using COBALT software. Currently, these forces are being transferred as fluid loads on the structure. For the structural analysis, several test cases were run considering the wing as a cantilever beam; these served as validation cases. A nonlinear structural analysis of the wing is being performed using ANSYS software to predict the deflections and stresses on the joined wing. Issues related to modeling, and selecting appropriate mesh for the structure were addressed by first performing a linear analysis. The frequencies and mode shapes of the deformed wing are obtained from modal analysis. Both static and dynamic analyses are carried out, and the results obtained are carefully analyzed. Loose coupling between the fluid and structural analyses is currently being examined.

  11. Using Rasch Analysis to Identify Uncharacteristic Responses to Undergraduate Assessments

    ERIC Educational Resources Information Center

    Edwards, Antony; Alcock, Lara

    2010-01-01

    Rasch Analysis is a statistical technique that is commonly used to analyse both test data and Likert survey data, to construct and evaluate question item banks, and to evaluate change in longitudinal studies. In this article, we introduce the dichotomous Rasch model, briefly discussing its assumptions. Then, using data collected in an…
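
    In the dichotomous Rasch model mentioned here, the probability that a person of ability theta answers an item of difficulty b correctly is exp(theta - b) / (1 + exp(theta - b)). A brief illustration with invented values:

      import math

      def rasch_probability(theta, b):
          """Dichotomous Rasch model: probability of a correct response."""
          return 1.0 / (1.0 + math.exp(-(theta - b)))

      # Invented person abilities and item difficulties on the same logit scale.
      for theta in (-1.0, 0.0, 1.5):
          for b in (-0.5, 0.5):
              print(f"theta={theta:+.1f}, difficulty={b:+.1f}: "
                    f"P(correct)={rasch_probability(theta, b):.2f}")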

  12. A Scale of Mobbing Impacts

    ERIC Educational Resources Information Center

    Yaman, Erkan

    2012-01-01

    The aim of this research was to develop the Mobbing Impacts Scale and to examine its validity and reliability analyses. The sample of study consisted of 509 teachers from Sakarya. In this study construct validity, internal consistency, test-retest reliabilities and item analysis of the scale were examined. As a result of factor analysis for…

  13. Meta-Analysis of Explicit Memory Studies in Populations with Intellectual Disability

    ERIC Educational Resources Information Center

    Lifshitz, Hefziba; Shtein, Sarit; Weiss, Izhak; Vakil, Eli

    2011-01-01

    This meta-analysis combines the effect size (ES) of 40 explicit memory experiments in populations with intellectual disability (ID). Eight meta-analyses were performed, as well as contrast tests between ES. The explicit memory of participants with ID was inferior to that of participants with typical development (TD). Relatively preserved explicit…

  14. Aeroelastic stability analyses of two counter rotating propfan designs for a cruise missile model

    NASA Technical Reports Server (NTRS)

    Mahajan, Aparajit J.; Lucero, John M.; Mehmed, Oral; Stefko, George L.

    1992-01-01

    Aeroelastic stability analyses were performed to ensure structural integrity of two counterrotating propfan blade designs for a NAVY/Air Force/NASA cruise missile model wind tunnel test. This analysis predicted whether the propfan designs would be flutter free at the operating conditions of the wind tunnel test. Calculated stability results are presented for the two blade designs with rotational speed and Mach number as the parameters. The aeroelastic analysis code ASTROP2 (Aeroelastic Stability and Response of Propulsion Systems - 2 Dimensional Analysis), developed at LeRC, was used in this project. The aeroelastic analysis is a modal method and uses the combination of a finite element structural model and two-dimensional steady and unsteady cascade aerodynamic models. This code was developed to analyze single rotation propfans but was modified and applied to counterrotating propfans for the present work. Modifications were made to transform the geometry and rotation of the aft rotor to the same reference frame as the forward rotor, to input a non-uniform inflow into the rotor being analyzed, and to automatically converge to the least stable aeroelastic mode.

  15. Comparison between Standing Broad Jump test and Wingate test for assessing lower limb anaerobic power in elite sportsmen.

    PubMed

    Krishnan, Anup; Sharma, Deep; Bhatt, Madhu; Dixit, Apoorv; Pradeep, P

    2017-04-01

    Lower limb explosive power is an important motor quality for sporting performance and reflects use of anaerobic energy systems such as stored ATP and the creatine phosphate system. Weightlifting, fencing and wrestling use it for performance monitoring and identification of potential sportsmen. The Wingate test and the Standing Broad Jump (SBJ) test are reliable and accurate tests for its assessment. This study, conducted on elite Indian sportsmen, analyses the feasibility of using the SBJ test in sports and military medicine when the Wingate test is impractical. 95 elite sportsmen (51 fencers, 17 weightlifters and 27 wrestlers) of a sports institute were administered the Wingate cycle ergometer test and the SBJ under standardised conditions. The results were analysed for overall and inter-discipline correlation. Analysis using Pearson's correlation showed a significant positive correlation between peak power (r = 0.446, p < 0.0001) and SBJ distance in all sportsmen. Inter-sport analysis showed a positive correlation between SBJ and peak power in fencers (r = 0.335, p < 0.016) and in weightlifters (r = 0.686, p < 0.002). Bland-Altman plot analysis showed that about 94% of pairs of peak power and SBJ were within the limits of agreement for each discipline as well as among all sportsmen. The results show a definite correlation, and the SBJ test can be used as a field test in performance monitoring, talent identification, military recruit screening and injury prevention.
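
    The Bland-Altman analysis mentioned above plots, for each athlete, the difference between the two measures against their mean and checks how many pairs fall within the mean difference ± 1.96 standard deviations. A minimal sketch on invented paired values that have already been standardised to a common scale (not the study data):

      import numpy as np

      # Invented standardised scores for the same athletes on the two tests.
      wingate = np.array([0.4, -0.8, 1.2, 0.1, -0.3, 0.9, -1.1, 0.5])
      sbj = np.array([0.6, -0.5, 1.0, -0.1, -0.4, 1.1, -0.9, 0.2])

      diff = wingate - sbj
      bias = diff.mean()
      half_width = 1.96 * diff.std(ddof=1)
      loa_low, loa_high = bias - half_width, bias + half_width
      within = np.mean((diff >= loa_low) & (diff <= loa_high))
      print(f"bias={bias:+.2f}, limits of agreement=({loa_low:+.2f}, {loa_high:+.2f}), "
            f"{within:.0%} of pairs within limits")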

  16. [The Freiburg monosyllable word test in postoperative cochlear implant diagnostics].

    PubMed

    Hey, M; Brademann, G; Ambrosch, P

    2016-08-01

    The Freiburg monosyllable word test represents a central tool of postoperative cochlear implant (CI) diagnostics. The objective of this study is to test the equivalence of different word lists by analysing word comprehension. For patients whose CI has been implanted for more than 5 years, the distribution of suprathreshold speech intelligibility outcomes is also analysed. In a retrospective data analysis, word-correct scores for 626 CI users were evaluated using a total of 5211 lists of 20 words each. The analysis of word comprehension within each list shows differences in mean and in the kind of distribution function. Some lists show a mean word recognition that differs significantly from the overall mean. The Freiburg monosyllable word test is easy to administer at suprathreshold speech levels for CI recipients, and typically has a saturation level above 80 %. The Freiburg monosyllable word test can be performed successfully by the majority of CI patients. The limited balance of the test lists leads to the conclusion that an adaptive test procedure with the Freiburg monosyllable test does not make sense. The Freiburg monosyllable test could be restructured by re-sorting all words across lists, or by omitting individual words from a test list, to increase the reliability of the test. The results show that speech intelligibility in quiet should also be investigated in CI recipients at levels below 70 dB.

  17. Using Meta-analyses for Comparative Effectiveness Research

    PubMed Central

    Ruppar, Todd M.; Phillips, Lorraine J.; Chase, Jo-Ana D.

    2012-01-01

    Comparative effectiveness research seeks to identify the most effective interventions for particular patient populations. Meta-analysis is an especially valuable form of comparative effectiveness research because it emphasizes the magnitude of intervention effects rather than relying on tests of statistical significance among primary studies. Overall effects can be calculated for diverse clinical and patient-centered variables to determine the outcome patterns. Moderator analyses compare intervention characteristics among primary studies by determining if effect sizes vary among studies with different intervention characteristics. Intervention effectiveness can be linked to patient characteristics to provide evidence for patient-centered care. Moderator analyses often answer questions never posed by primary studies because neither multiple intervention characteristics nor populations are compared in single primary studies. Thus meta-analyses provide unique contributions to knowledge. Although meta-analysis is a powerful comparative effectiveness strategy, methodological challenges and limitations in primary research must be acknowledged to interpret findings. PMID:22789450

  18. Measured and predicted structural behavior of the HiMAT tailored composite wing

    NASA Technical Reports Server (NTRS)

    Nelson, Lawrence H.

    1987-01-01

    A series of load tests was conducted on the HiMAT tailored composite wing. Coupon tests were also run on a series of unbalanced laminates, including the ply configuration of the wing, in order to compare the measured and predicted behavior of unbalanced laminates and, in the case of the wing, to compare the behavior of the full-scale structure with that of the coupons. Both linear and nonlinear finite element (NASTRAN) analyses were carried out on the wing. Both linear and nonlinear point-stress analyses were performed on the coupons. All test articles were instrumented with strain gages, and wing deflections were measured. The leading and trailing edges were found to have no effect on the response of the wing to applied loads. A decrease in the stiffness of the wing box was evident over the 27-test program. The measured load-strain behavior of the wing was found to be linear, in contrast to coupon tests of the same laminate, which were nonlinear. A linear NASTRAN analysis of the wing generally correlated more favorably with measurements than did a nonlinear analysis. An examination of the predicted deflections in the wing root region revealed an anomalous behavior of the structural model that cannot be explained. Both hysteresis and creep appear to be less significant in the wing tests than in the corresponding laminate coupon tests.

  19. Evaluation of dredged material proposed for ocean disposal from Shark River Project area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Antrim, L.D.; Gardiner, W.W.; Barrows, E.S.

    1996-09-01

    The objective of the Shark River Project was to evaluate proposed dredged material to determine its suitability for unconfined ocean disposal at the Mud Dump Site. Tests and analyses were conducted on the Shark River sediments. The evaluation of proposed dredged material consisted of bulk sediment chemical and physical analysis, chemical analyses of dredging site water and elutriate, water-column and benthic acute toxicity tests, and bioaccumulation tests. Individual sediment core samples collected from the Shark River were analyzed for grain size, moisture content, and total organic carbon (TOC). One sediment composite was analyzed for bulk density, specific gravity, metals, chlorinated pesticides, polychlorinated biphenyl (PCB) congeners, polynuclear aromatic hydrocarbons (PAHs), and 1,4-dichlorobenzene. Dredging site water and elutriate, prepared from the suspended-particulate phase (SPP) of the Shark River sediment composite, were analyzed for metals, pesticides, and PCBs. Benthic acute toxicity tests and bioaccumulation tests were performed.

  20. Analysis of Operation TEAPOT nuclear test BEE radiological and meteorological data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinn, V.E.

    This report describes the Weather Service Nuclear Support Office (WSNSO) analyses of the radiological and meteorological data collected for the BEE nuclear test of Operation TEAPOT. Inconsistencies in the radiological data and their resolution are discussed. The methods of normalizing the radiological data to a standard time and estimating fallout-arrival times are presented. The meteorological situations on event day and the following day are described. A comparison of the WSNSO fallout analysis with an analysis performed in the 1950s is presented. The radiological data used to derive the WSNSO fallout pattern are tabulated in an appendix.

  1. Simulation Analysis of DC and Switching Impulse Superposition Circuit

    NASA Astrophysics Data System (ADS)

    Zhang, Chenmeng; Xie, Shijun; Zhang, Yu; Mao, Yuxiang

    2018-03-01

    Surge capacitors connected between the neutral bus and ground in a converter station are subjected to superimposed DC and impulse voltages during operation. This paper analyses a simulated aging circuit for surge capacitors using the PSCAD electromagnetic transient simulation software, and also analyses the effect of the DC voltage on the waveform produced by the impulse voltage generator. The effect of the coupling capacitor on the test voltage waveform is also studied. Testing results show that the DC voltage has little effect on the output waveform of the surge voltage generator, and that the value of the coupling capacitor has little effect on the voltage waveform across the sample. Simulation results show that a superimposed DC and impulse aging test for surge capacitors is feasible.

  2. USAFSAM Review and Analysis of Radiofrequency Radiation Bioeffects Literature.

    DTIC Science & Technology

    1981-11-01

    less than 0.2 mW/cm². Clinical tests included ophthalmoscopic examination and a neurologic check-up supplemented by psychologic tests and EEG... These 12 studies are assessed individually and collectively below, followed by the analyses of each per se. Collective summary: in Lilienfeld et al... hours/workday. Electrocardiograms, heart and lung X-rays, erythrocyte sedimentation rates, urinalyses, and liver function tests were conducted as well.

  3. Effects of Coaching on Standardized Admission Examinations. Revised Statistical Analyses of Data Gathered By Boston Regional Office of the Federal Trade Commission.

    ERIC Educational Resources Information Center

    Federal Trade Commission, Washington, DC. Bureau of Consumer Protection.

    The effect of commercial coaching on Scholastic Aptitude Test (SAT) scores was analyzed, using 1974-1977 test results of 2,500 non-coached students and 1,568 enrollees in two coaching schools. (The Stanley H. Kaplan Educational Center, Inc., and the Test Preparation Center, Inc.). Multiple regression analysis was used to control for student…

  4. Exploring visuospatial abilities and their contribution to constructional abilities and nonverbal intelligence.

    PubMed

    Trojano, Luigi; Siciliano, Mattia; Cristinzio, Chiara; Grossi, Dario

    2018-01-01

    The present study aimed at exploring relationships among the visuospatial tasks included in the Battery for Visuospatial Abilities (BVA), and at assessing the relative contribution of different facets of visuospatial processing to tests tapping constructional abilities and nonverbal abstract reasoning. One hundred forty-four healthy subjects with a normal score on the Mini Mental State Examination completed the BVA plus Raven's Coloured Progressive Matrices and the Constructional Apraxia test. We used Principal Axis Factoring and Parallel Analysis to investigate relationships among the BVA visuospatial tasks, and performed regression analyses to assess the visuospatial contribution to constructional abilities and nonverbal abstract reasoning. Principal Axis Factoring and Parallel Analysis revealed two eigenvalues exceeding 1, accounting for about 60% of the variance. A 2-factor model provided the best fit. Factor 1 included sub-tests exploring "complex" visuospatial skills, whereas Factor 2 included two subtests tapping "simple" visuospatial skills. Regression analyses revealed that both Factor 1 and Factor 2 significantly affected performance on Raven's Coloured Progressive Matrices, whereas only Factor 1 affected performance on the Constructional Apraxia test. Our results supported the functional segregation proposed by De Renzi, suggested clinical caution in using a single test to assess the visuospatial domain, and qualified the visuospatial contribution to drawing and to non-verbal intelligence tests.
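
    As an illustration of the factor-retention step described above, the following minimal Python sketch implements Horn's parallel analysis on a generic scores matrix. It is not the study's actual code; the input file and variable names are hypothetical.

    ```python
    import numpy as np

    def parallel_analysis(data, n_iter=1000, seed=0):
        """Horn's parallel analysis: retain factors whose observed eigenvalues
        exceed the mean eigenvalues obtained from random data of the same shape."""
        rng = np.random.default_rng(seed)
        n_obs, n_vars = data.shape
        obs_eig = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
        rand_eig = np.zeros((n_iter, n_vars))
        for i in range(n_iter):
            rand = rng.standard_normal((n_obs, n_vars))
            rand_eig[i] = np.linalg.eigvalsh(np.corrcoef(rand, rowvar=False))[::-1]
        threshold = rand_eig.mean(axis=0)
        return int(np.sum(obs_eig > threshold)), obs_eig, threshold

    # Hypothetical usage with subtest scores arranged as (subjects x subtests):
    # scores = np.loadtxt("bva_scores.csv", delimiter=",")
    # n_retain, eig, thr = parallel_analysis(scores)
    ```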

  5. Transfusion Indication Threshold Reduction (TITRe2) randomized controlled trial in cardiac surgery: statistical analysis plan.

    PubMed

    Pike, Katie; Nash, Rachel L; Murphy, Gavin J; Reeves, Barnaby C; Rogers, Chris A

    2015-02-22

    The Transfusion Indication Threshold Reduction (TITRe2) trial is the largest randomized controlled trial to date to compare red blood cell transfusion strategies following cardiac surgery. This update presents the statistical analysis plan, detailing how the study will be analyzed and presented. The statistical analysis plan has been written following recommendations from the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use, prior to database lock and the final analysis of trial data. Outlined analyses are in line with the Consolidated Standards of Reporting Trials (CONSORT). The study aims to randomize 2000 patients from 17 UK centres. Patients are randomized to either a restrictive (transfuse if haemoglobin concentration <7.5 g/dl) or liberal (transfuse if haemoglobin concentration <9 g/dl) transfusion strategy. The primary outcome is a binary composite outcome of any serious infectious or ischaemic event in the first 3 months following randomization. The statistical analysis plan details how non-adherence with the intervention, withdrawals from the study, and the study population will be derived and dealt with in the analysis. The planned analyses of the trial primary and secondary outcome measures are described in detail, including approaches taken to deal with multiple testing, model assumptions not being met and missing data. Details of planned subgroup and sensitivity analyses and pre-specified ancillary analyses are given, along with potential issues that have been identified with such analyses and possible approaches to overcome such issues. ISRCTN70923932 .

  6. BROCCOLI: Software for fast fMRI analysis on many-core CPUs and GPUs

    PubMed Central

    Eklund, Anders; Dufort, Paul; Villani, Mattias; LaConte, Stephen

    2014-01-01

    Analysis of functional magnetic resonance imaging (fMRI) data is becoming ever more computationally demanding as temporal and spatial resolutions improve, and large, publicly available data sets proliferate. Moreover, methodological improvements in the neuroimaging pipeline, such as non-linear spatial normalization, non-parametric permutation tests and Bayesian Markov Chain Monte Carlo approaches, can dramatically increase the computational burden. Despite these challenges, there do not yet exist any fMRI software packages which leverage inexpensive and powerful graphics processing units (GPUs) to perform these analyses. Here, we therefore present BROCCOLI, a free software package written in OpenCL (Open Computing Language) that can be used for parallel analysis of fMRI data on a large variety of hardware configurations. BROCCOLI has, for example, been tested with an Intel CPU, an Nvidia GPU, and an AMD GPU. These tests show that parallel processing of fMRI data can lead to significantly faster analysis pipelines. This speedup can be achieved on relatively standard hardware, but further, dramatic speed improvements require only a modest investment in GPU hardware. BROCCOLI (running on a GPU) can perform non-linear spatial normalization to a 1 mm³ brain template in 4–6 s, and run a second level permutation test with 10,000 permutations in about a minute. These non-parametric tests are generally more robust than their parametric counterparts, and can also enable more sophisticated analyses by estimating complicated null distributions. Additionally, BROCCOLI includes support for Bayesian first-level fMRI analysis using a Gibbs sampler. The new software is freely available under GNU GPL3 and can be downloaded from github (https://github.com/wanderine/BROCCOLI/). PMID:24672471
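
    For readers unfamiliar with the second-level permutation testing that BROCCOLI accelerates on GPUs, the following CPU-only Python sketch shows a one-sample sign-flipping permutation test with a max-statistic null distribution. It is an illustrative sketch, not BROCCOLI's OpenCL implementation, and the input array is hypothetical.

    ```python
    import numpy as np

    def sign_flip_permutation_test(contrasts, n_perm=10000, seed=0):
        """One-sample second-level test on subject contrast values (n_subjects x n_voxels).
        Builds a max-statistic null distribution by random sign flipping, giving
        FWE-corrected p-values per voxel."""
        rng = np.random.default_rng(seed)
        n_sub, _ = contrasts.shape

        def t_stat(x):
            return x.mean(axis=0) / (x.std(axis=0, ddof=1) / np.sqrt(n_sub))

        observed = t_stat(contrasts)
        max_null = np.empty(n_perm)
        for i in range(n_perm):
            signs = rng.choice([-1.0, 1.0], size=(n_sub, 1))
            max_null[i] = t_stat(contrasts * signs).max()
        p_fwe = (1 + np.sum(max_null[None, :] >= observed[:, None], axis=1)) / (n_perm + 1)
        return observed, p_fwe

    # Hypothetical usage: 20 subjects, 5000 voxels of simulated contrast values.
    # t_obs, p_fwe = sign_flip_permutation_test(np.random.default_rng(1).normal(size=(20, 5000)))
    ```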

  7. The Influence of Study-Level Inference Models and Study Set Size on Coordinate-Based fMRI Meta-Analyses

    PubMed Central

    Bossier, Han; Seurinck, Ruth; Kühn, Simone; Banaschewski, Tobias; Barker, Gareth J.; Bokde, Arun L. W.; Martinot, Jean-Luc; Lemaitre, Herve; Paus, Tomáš; Millenet, Sabina; Moerkerke, Beatrijs

    2018-01-01

    Given the increasing number of neuroimaging studies, there is a growing need to summarize published results. Coordinate-based meta-analyses use the locations of statistically significant local maxima, possibly with the associated effect sizes, to aggregate studies. In this paper, we investigate the influence of key characteristics of a coordinate-based meta-analysis on (1) the balance between false and true positives and (2) the activation reliability of the outcome from a coordinate-based meta-analysis. More particularly, we consider the influence of the chosen group level model at the study level [fixed effects, ordinary least squares (OLS), or mixed effects models], the type of coordinate-based meta-analysis [Activation Likelihood Estimation (ALE) that only uses peak locations, fixed effects, and random effects meta-analysis that take into account both peak location and height] and the amount of studies included in the analysis (from 10 to 35). To do this, we apply a resampling scheme on a large dataset (N = 1,400) to create a test condition and compare this with an independent evaluation condition. The test condition corresponds to subsampling participants into studies and combining these using meta-analyses. The evaluation condition corresponds to a high-powered group analysis. We observe the best performance when using mixed effects models in individual studies combined with a random effects meta-analysis. Moreover, the performance increases with the number of studies included in the meta-analysis. When peak height is not taken into consideration, we show that the popular ALE procedure is a good alternative in terms of the balance between type I and II errors. However, it requires more studies compared to other procedures in terms of activation reliability. Finally, we discuss the differences, interpretations, and limitations of our results. PMID:29403344
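
    As a minimal illustration of the random-effects pooling referred to above (not the specific pipeline used in the paper), a DerSimonian-Laird estimator for study-level effect sizes can be sketched in Python as follows; the inputs are hypothetical per-study effects and variances.

    ```python
    import numpy as np

    def dersimonian_laird(effects, variances):
        """Random-effects meta-analysis of study-level effect sizes using the
        DerSimonian-Laird estimator of between-study variance (tau^2)."""
        effects = np.asarray(effects, dtype=float)
        variances = np.asarray(variances, dtype=float)
        w_fixed = 1.0 / variances
        mu_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)
        q = np.sum(w_fixed * (effects - mu_fixed) ** 2)        # Cochran's Q
        k = len(effects)
        c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
        tau2 = max(0.0, (q - (k - 1)) / c)                     # between-study variance
        w_rand = 1.0 / (variances + tau2)
        mu = np.sum(w_rand * effects) / np.sum(w_rand)
        se = np.sqrt(1.0 / np.sum(w_rand))
        return mu, se, tau2

    # Hypothetical effects and within-study variances from 10 studies:
    # mu, se, tau2 = dersimonian_laird([0.4, 0.2, 0.5, 0.3, 0.6, 0.1, 0.4, 0.3, 0.5, 0.2],
    #                                  [0.04, 0.05, 0.03, 0.06, 0.04, 0.07, 0.05, 0.04, 0.03, 0.06])
    ```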

  8. Relationship between Procedural Tactical Knowledge and Specific Motor Skills in Young Soccer Players

    PubMed Central

    Aquino, Rodrigo; Marques, Renato Francisco R.; Petiot, Grégory Hallé; Gonçalves, Luiz Guilherme C.; Moraes, Camila; Santiago, Paulo Roberto P.; Puggina, Enrico Fuini

    2016-01-01

    The purpose of this study was to investigate the association between offensive tactical knowledge and soccer-specific motor skills performance. Fifteen participants completed two types of evaluation, one assessing their technical (motor skills) performance and one assessing their tactical performance. The motor skills performance was measured through four tests of technical soccer skills: ball control, shooting, passing and dribbling. The tactical performance was based on a tactical assessment system called FUT-SAT (Analyses of Procedural Tactical Knowledge in Soccer). Afterwards, technical and tactical evaluation scores were ranked with and without the use of the cluster method. A positive, weak correlation was observed in both analyses (rho = 0.39, p = 0.14, not significant, with cluster analysis; rho = 0.35, p = 0.20, not significant, without cluster analysis). We can conclude that there was a weak association between the technical skills and the offensive tactical knowledge. This shows the need to reflect on the use of such tests to assess technical skills in team sports, since they do not take into account the variability and unpredictability of game actions and disregard the inherent needs of assessing such skill performance in the game. PMID:29910300
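
    The rank correlation reported above can be reproduced in form with a short Python sketch using SciPy; the scores below are hypothetical and only illustrate the calculation.

    ```python
    from scipy.stats import spearmanr

    # Hypothetical per-player scores: composite technical skill score vs. FUT-SAT tactical score
    technical = [72, 65, 80, 58, 90, 61, 77, 69, 84, 55, 73, 68, 79, 62, 88]
    tactical  = [60, 70, 75, 52, 81, 66, 59, 74, 78, 50, 71, 64, 69, 58, 83]

    rho, p_value = spearmanr(technical, tactical)
    print(f"rho = {rho:.2f}, p = {p_value:.2f}")
    ```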

  9. Elemental composition of edible nuts: fast optimization and validation procedure of an ICP-OES method.

    PubMed

    Tošić, Snežana B; Mitić, Snežana S; Velimirović, Dragan S; Stojanović, Gordana S; Pavlović, Aleksandra N; Pecev-Marinković, Emilija T

    2015-08-30

    An inductively coupled plasma-optical emission spectrometry method for the rapid simultaneous detection of 19 elements in edible nuts (walnuts: Juglans nigra; almonds: Prunus dulcis; hazelnuts: Corylus avellana; Brazil nuts: Bertholletia excelsa; cashews: Anacardium occidentale; pistachios: Pistacia vera; and peanuts: Arachis hypogaea) available on Serbian markets was optimized and validated through the selection of instrumental parameters and analytical lines free from spectral interference and with the lowest matrix effects. The analysed macro-elements were present in the following descending order: Na > Mg > Ca > K. Of all the trace elements, the tested samples showed the highest content of Fe. The micro-element Se was detected in all the samples of nuts. The toxic elements As, Cd and Pb were either not detected or the contents were below the limit of detection. One-way analysis of variance, Student's t-test, Tukey's HSD post hoc test and hierarchical agglomerative cluster analysis were applied in the statistical analysis of the results. Based on the detected content of analysed elements it can be concluded that nuts may be a good additional source of minerals as micronutrients. © 2014 Society of Chemical Industry.
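
    A minimal Python sketch of the statistical comparisons described (one-way ANOVA, Tukey's HSD post hoc test, and hierarchical agglomerative clustering) is shown below; the concentration values are hypothetical, and scipy.stats.tukey_hsd assumes SciPy 1.8 or newer.

    ```python
    import numpy as np
    from scipy import stats
    from scipy.cluster.hierarchy import linkage, fcluster

    # Hypothetical Fe concentrations (mg/kg) for three nut types, n = 5 samples each
    walnut   = np.array([28.1, 30.4, 27.5, 29.9, 31.2])
    almond   = np.array([35.6, 33.9, 36.2, 34.8, 35.1])
    hazelnut = np.array([24.3, 25.1, 23.8, 26.0, 24.7])

    # One-way ANOVA across the three groups
    f_stat, p_anova = stats.f_oneway(walnut, almond, hazelnut)

    # Tukey's HSD post hoc pairwise comparisons
    tukey = stats.tukey_hsd(walnut, almond, hazelnut)

    # Hierarchical agglomerative clustering of the individual samples
    samples = np.concatenate([walnut, almond, hazelnut]).reshape(-1, 1)
    tree = linkage(samples, method="ward")
    labels = fcluster(tree, t=3, criterion="maxclust")

    print(p_anova, tukey.pvalue, labels)
    ```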

  10. An Exploratory Data Analysis System for Support in Medical Decision-Making

    PubMed Central

    Copeland, J. A.; Hamel, B.; Bourne, J. R.

    1979-01-01

    An experimental system was developed to allow retrieval and analysis of data collected during a study of neurobehavioral correlates of renal disease. After retrieving data organized in a relational data base, simple bivariate statistics of parametric and nonparametric nature could be conducted. An “exploratory” mode in which the system provided guidance in selection of appropriate statistical analyses was also available to the user. The system traversed a decision tree using the inherent qualities of the data (e.g., the identity and number of patients, tests, and time epochs) to search for the appropriate analyses to employ.
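
    The "exploratory" mode described above amounts to rule-based selection of a statistical test from properties of the data. A minimal, hypothetical Python sketch of such a rule (not the original system's logic) follows.

    ```python
    from scipy import stats

    def suggest_bivariate_test(x, y, paired=False, alpha=0.05):
        """Small rule-based helper in the spirit of an 'exploratory' mode:
        check normality with Shapiro-Wilk, then run a parametric or
        nonparametric two-sample comparison. Thresholds are illustrative."""
        normal = (stats.shapiro(x).pvalue > alpha) and (stats.shapiro(y).pvalue > alpha)
        if paired:
            return stats.ttest_rel(x, y) if normal else stats.wilcoxon(x, y)
        return stats.ttest_ind(x, y) if normal else stats.mannwhitneyu(x, y)

    # Hypothetical usage with two groups of test scores:
    # result = suggest_bivariate_test([12, 15, 11, 18, 14], [21, 19, 24, 22, 20])
    ```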

  11. A Risk-Based Approach for Aerothermal/TPS Analysis and Testing

    NASA Technical Reports Server (NTRS)

    Wright, Michael J.; Grinstead, Jay H.; Bose, Deepak

    2007-01-01

    The current status of aerothermal and thermal protection system modeling for civilian entry missions is reviewed. For most such missions, the accuracy of our simulations is limited not by the tools and processes currently employed, but rather by reducible deficiencies in the underlying physical models. Improving the accuracy of and reducing the uncertainties in these models will enable a greater understanding of the system-level impacts of a particular thermal protection system, and of its operation and risk over its operational life. A strategic plan is laid out by which key modeling deficiencies can be identified via mission-specific gap analysis. Once these gaps have been identified, the driving component uncertainties are determined via sensitivity analyses. A Monte Carlo-based methodology is presented for physics-based probabilistic uncertainty analysis of aerothermodynamics and thermal protection system material response modeling. These data are then used to advocate for and plan focused testing aimed at reducing key uncertainties. The results of these tests are used to validate or modify existing physical models. Concurrently, a testing methodology is outlined for thermal protection materials. The proposed approach is based on using the results of the uncertainty/sensitivity analyses discussed above to tailor ground testing so as to best identify and quantify system performance and risk drivers. A key component of this testing is understanding the relationship between the test and flight environments. No existing ground test facility can simultaneously replicate all aspects of the flight environment, and therefore good models for traceability to flight are critical to ensure a low risk, high reliability thermal protection system design. Finally, the role of flight testing in the overall thermal protection system development strategy is discussed.
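
    As a generic illustration of Monte Carlo-based uncertainty propagation of the kind advocated above (the surrogate model, distributions, and values below are hypothetical, not NASA's aerothermal models), consider the following Python sketch.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Illustrative uncertain inputs (hypothetical distributions and values):
    q_conv  = rng.normal(120.0, 12.0, n)      # convective heat flux, W/cm^2
    q_rad   = rng.normal(15.0, 5.0, n)        # radiative heat flux, W/cm^2
    model_k = rng.lognormal(0.0, 0.15, n)     # multiplicative aeroheating model uncertainty
    ablate  = rng.uniform(0.8, 1.2, n)        # material response scaling factor

    # Simple surrogate for a peak-heating figure of merit
    q_total = model_k * (q_conv + q_rad) * ablate

    def rank(v):
        """Ranks of a sample, used for a Spearman-type sensitivity ranking."""
        return np.argsort(np.argsort(v))

    p99 = np.percentile(q_total, 99)
    sensitivity = {name: abs(np.corrcoef(rank(v), rank(q_total))[0, 1])
                   for name, v in [("q_conv", q_conv), ("q_rad", q_rad),
                                   ("model_k", model_k), ("ablate", ablate)]}
    print(f"99th-percentile heating: {p99:.1f} W/cm^2")
    print(sorted(sensitivity.items(), key=lambda kv: -kv[1]))
    ```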

  12. Analysis of Turkish Prospective Science Teachers' Perceptions on Technology in Education

    ERIC Educational Resources Information Center

    Koksal, Mustafa Serdar; Yaman, Suleyman; Saka, Yavuz

    2016-01-01

    Purpose of this study was to determine and analyze Turkish pre-service science teachers' perceptions on technology in terms of learning style, computer competency level, possession of a computer, and gender. The study involved 264 Turkish pre-service science teachers. Analyses were conducted through four-way ANOVA, t-tests, Mann Whitney U test and…

  13. Conceptual Abilities of Children with Mild Intellectual Disability: Analysis of Wisconsin Card Sorting Test Performance

    ERIC Educational Resources Information Center

    Gligorovic, Milica; Buha, Natasa

    2013-01-01

    Background: The ability to generate and flexibly change concepts is of great importance for the development of academic and adaptive skills. This paper analyses the conceptual reasoning ability of children with mild intellectual disability (MID) by their achievements on the Wisconsin Card Sorting Test (WCST). Method: The sample consisted of 95…

  14. Analysis and design of randomised clinical trials involving competing risks endpoints.

    PubMed

    Tai, Bee-Choo; Wee, Joseph; Machin, David

    2011-05-19

    In randomised clinical trials involving time-to-event outcomes, the failures concerned may be events of an entirely different nature and as such define a classical competing risks framework. In designing and analysing clinical trials involving such endpoints, it is important to account for the competing events, and evaluate how each contributes to the overall failure. An appropriate choice of statistical model is important for adequate determination of sample size. We describe how competing events may be summarised in such trials using cumulative incidence functions and Gray's test. The statistical modelling of competing events using proportional cause-specific and subdistribution hazard functions, and the corresponding procedures for sample size estimation are outlined. These are illustrated using data from a randomised clinical trial (SQNP01) of patients with advanced (non-metastatic) nasopharyngeal cancer. In this trial, treatment has no effect on the competing event of loco-regional recurrence. Thus the effects of treatment on the hazard of distant metastasis were similar via both the cause-specific (unadjusted csHR = 0.43, 95% CI 0.25 - 0.72) and subdistribution (unadjusted subHR 0.43; 95% CI 0.25 - 0.76) hazard analyses, in favour of concurrent chemo-radiotherapy followed by adjuvant chemotherapy. Adjusting for nodal status and tumour size did not alter the results. The results of the logrank test (p = 0.002) comparing the cause-specific hazards and the Gray's test (p = 0.003) comparing the cumulative incidences also led to the same conclusion. However, the subdistribution hazard analysis requires many more subjects than the cause-specific hazard analysis to detect the same magnitude of effect. The cause-specific hazard analysis is appropriate for analysing competing risks outcomes when treatment has no effect on the cause-specific hazard of the competing event. It requires fewer subjects than the subdistribution hazard analysis for a similar effect size. However, if the main and competing events are influenced in opposing directions by an intervention, a subdistribution hazard analysis may be warranted.
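
    For readers who want a concrete sense of the cumulative incidence functions summarized above, the following Python sketch implements a nonparametric (Aalen-Johansen type) estimator for one cause in the presence of competing risks; it is illustrative only and is not the trial's analysis code.

    ```python
    import numpy as np

    def cumulative_incidence(time, event, cause):
        """Nonparametric cumulative incidence for one cause under competing risks.
        `event` is 0 for censored observations, otherwise a cause code."""
        time = np.asarray(time, dtype=float)
        event = np.asarray(event, dtype=int)
        order = np.argsort(time)
        time, event = time[order], event[order]
        at_risk = len(time)
        surv = 1.0            # overall (all-cause) survival just before current time
        cif = 0.0
        times_out, cif_out = [], []
        for t in np.unique(time):
            mask = time == t
            d_any = np.sum(mask & (event != 0))
            d_cause = np.sum(mask & (event == cause))
            cif += surv * d_cause / at_risk
            surv *= 1.0 - d_any / at_risk
            at_risk -= np.sum(mask)
            times_out.append(t)
            cif_out.append(cif)
        return np.array(times_out), np.array(cif_out)

    # Hypothetical usage: cause 1 = distant metastasis, cause 2 = loco-regional recurrence
    # t, cif = cumulative_incidence([5, 8, 12, 12, 20, 25], [1, 2, 0, 1, 2, 0], cause=1)
    ```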

  15. Ongoing Analysis of Rocket Based Combined Cycle Engines by the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Ruf, Joseph; Holt, James B.; Canabal, Francisco

    1999-01-01

    This paper presents the status of analyses on three Rocket Based Combined Cycle configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes code for ejector mode fluid dynamics. The Draco engine analysis is a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.

  16. Summary and evaluation: fuel dynamics loss-of-flow experiments (tests L2, L3, and L4)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barts, E.W.; Deitrich, L.W.; Eberhart, J.G.

    1975-09-01

    Three similar experiments conducted to support the analyses of hypothetical LMFBR unprotected-loss-of-flow accidents are summarized and evaluated. The tests, designated L2, L3, and L4, provided experimental data against which accident-analysis codes could be compared, so as to guide further analysis and modeling of the initiating phases of the hypothetical accident. The tests were conducted using seven-pin bundles of mixed-oxide fuel pins in Mark-II flowing-sodium loops in the TREAT reactor. Test L2 used fresh fuel. Tests L3 and L4 used irradiated fuel pins having, respectively, 'intermediate-power' (no central void) and 'high-power' (fully developed central void) microstructure. 12 references. (auth)

  17. Test and Analysis of Foam Impacting a 6x6 Inch RCC Flat Panel

    NASA Technical Reports Server (NTRS)

    Lessard, Wendy B.

    2006-01-01

    This report presents the testing and analyses of a foam projectile impacting onto thirteen 6x6 inch flat panels at a 90-degree incidence angle. The panels tested in this investigation were fabricated of Reinforced Carbon-Carbon material and were used to aid in the validation of an existing material model, MAT58. The computational analyses were performed using LS-DYNA, which is a physics-based, nonlinear, transient, finite element code used for analyzing material responses subjected to high impact forces and other dynamic conditions. The test results were used to validate LS-DYNA predictions and to determine the threshold of damage generated by the MAT58 cumulative damage material model. The threshold of damage parameter represents any external or internal visible RCC damage detectable by nondestructive evaluation techniques.

  18. Testing of the Defense Waste Processing Facility Cold Chemical Dissolution Method in Sludge Batch 9 Qualification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, T.; Pareizs, J.; Coleman, C.

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) tests the applicability of the digestion methods used by the DWPF Laboratory for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) Receipt samples and SRAT Product process control samples. DWPF SRAT samples are typically dissolved using a method referred to as the DWPF Cold Chemical or Cold Chem Method (CC), (see DWPF Procedure SW4-15.201). Testing indicates that the CC method produced mixed results. The CC method did not result in complete dissolution of either the SRAT Receipt or SRAT Product with some fine, dark solids remaining. However, elemental analyses did not reveal extreme biases for the major elements in the sludge when compared with analyses obtained following dissolution by hot aqua regia (AR) or sodium peroxide fusion (PF) methods. The CC elemental analyses agreed with the AR and PF methods well enough that it should be adequate for routine process control analyses in the DWPF after much more extensive side-by-side tests of the CC method and the PF method are performed on the first 10 SRAT cycles of the Sludge Batch 9 (SB9) campaign. The DWPF Laboratory should continue with their plans for further tests of the CC method during these 10 SRAT cycles.

  19. Waste Analysis Plan and Waste Characterization Survey, Barksdale AFB, Louisiana

    DTIC Science & Technology

    1991-03-01

    review to assess if analysis is needed, any analyses that are to be provided by generators, and methods to be used to meet specific waste analysis ... sampling method, sampling frequency, parameters of analysis, SW 846 test methods, Department of Transportation (DOT) shipping name and hazard class ... Appendix B, Waste Analysis Plan Rationale: 1. Sampling method rationale: composite liquid

  20. Detection of semi-volatile organic compounds in permeable ...

    EPA Pesticide Factsheets

    Abstract The Edison Environmental Center (EEC) has a research and demonstration permeable parking lot comprising three different permeable systems: permeable asphalt, porous concrete and interlocking concrete permeable pavers. Water quality and quantity analysis has been ongoing since January 2010. This paper describes a subset of the water quality analysis, the analysis of semivolatile organic compounds (SVOCs), to determine whether hydrocarbons were present in water infiltrated through the permeable surfaces. SVOCs were analyzed in samples collected on 11 dates over a 3-year period, from 2/8/2010 to 4/1/2013. Results are broadly divided into three categories: 42 chemicals were never detected; 12 chemicals (11 chemical tests) were detected at a rate of 10% or less; and 22 chemicals were detected at a frequency of 10% or greater (ranging from 10% to 66.5% detections). Fundamental and exploratory statistical analyses were performed on these latter analysis results by grouping results by surface type. The statistical analyses were limited by the low frequency of detections and by dilutions of samples, which affected detection limits. The infiltrate data through the three permeable surfaces were analyzed as non-parametric data by the Kaplan-Meier estimation method for fundamental statistics; there were some statistically observable differences in concentration between pavement types when using the Tarone-Ware comparison hypothesis test. Additionally, Spearman rank-order non-parametric…

  1. Fracture analysis of stiffened panels under biaxial loading with widespread cracking

    NASA Technical Reports Server (NTRS)

    Newman, J. C., Jr.; Dawicke, D. S.

    1995-01-01

    An elastic-plastic finite-element analysis with a critical crack-tip-opening angle (CTOA) fracture criterion was used to model stable crack growth and fracture of 2024-T3 aluminum alloy (bare and clad) panels for several thicknesses. The panels had either single or multiple-site damage (MSD) cracks subjected to uniaxial or biaxial loading. Analyses were also conducted on cracked stiffened panels with single or MSD cracks. The critical CTOA value for each thickness was determined by matching the failure load on a middle-crack tension specimen. Comparisons were made between the critical angles determined from the finite-element analyses and those measured with photographic methods. Predicted load-against-crack extension and failure loads for panels under biaxial loading, panels with MSD cracks, and panels with various numbers of stiffeners were compared with test data whenever possible. The predicted results agreed well with the test data even for large-scale plastic deformations. The analyses were also able to predict stable tearing behavior of a large lead crack in the presence of MSD cracks. The analyses were then used to study the influence of stiffeners on residual strength in the presence of widespread fatigue cracking. Small MSD cracks were found to greatly reduce the residual strength for large lead cracks even for stiffened panels.

  2. Fracture analysis of stiffened panels under biaxial loading with widespread cracking

    NASA Technical Reports Server (NTRS)

    Newman, J. C., Jr.

    1995-01-01

    An elastic-plastic finite-element analysis with a critical crack-tip opening angle (CTOA) fracture criterion was used to model stable crack growth and fracture of 2024-T3 aluminum alloy (bare and clad) panels for several thicknesses. The panels had either single or multiple-site damage (MSD) cracks subjected to uniaxial or biaxial loading. Analyses were also conducted on cracked stiffened panels with single or MSD cracks. The critical CTOA value for each thickness was determined by matching the failure load on a middle-crack tension specimen. Comparisons were made between the critical angles determined from the finite-element analyses and those measured with photographic methods. Predicted load-against-crack extension and failure loads for panels under biaxial loading, panels with MSD cracks, and panels with various numbers of stiffeners were compared with test data whenever possible. The predicted results agreed well with the test data even for large-scale plastic deformations. The analyses were also able to predict stable tearing behavior of a large lead crack in the presence of MSD cracks. The analyses were then used to study the influence of stiffeners on residual strength in the presence of widespread fatigue cracking. Small MSD cracks were found to greatly reduce the residual strength for large lead cracks even for stiffened panels.

  3. Microsatellite analysis of medfly bioinfestations in California.

    PubMed

    Bonizzoni, M; Zheng, L; Guglielmino, C R; Haymer, D S; Gasperi, G; Gomulski, L M; Malacrida, A R

    2001-10-01

    The Mediterranean fruit fly, Ceratitis capitata, is a destructive agricultural pest with a long history of invasion success. This pest has been affecting different regions of the United States for the past 30 years, and a number of studies of medfly bioinfestations have focused on the situation in California. Although some progress has been made in terms of establishing the origin of infestations, the overall status of this pest in this area remains controversial. Specifically, do flies captured over the years represent independent infestations or the persistence of a resident population? We present an effort to answer this question based on the use of multilocus genotyping. Ten microsatellite loci were used to analyse 109 medflies captured in several infestations within California between 1992 and 1998. Using these same markers, 242 medflies from regions of the world having 'established' populations of this pest including Hawaii, Guatemala, El Salvador, Ecuador, Brazil, Argentina and Peru, were also analysed. Although phylogenetic analysis, AMOVA analysis, the IMMANC assignment test and GeneClass exclusion test analysis suggest that some of the medflies captured in California are derived from independent invasion events, analysis of specimens from the Los Angeles basin provides support for the hypothesis that an endemic population, probably derived from Guatemala, has been established.

  4. Factors associated to quality of life in active elderly.

    PubMed

    Alexandre, Tiago da Silva; Cordeiro, Renata Cereda; Ramos, Luiz Roberto

    2009-08-01

    To analyze whether quality of life in active, healthy elderly individuals is influenced by functional status and sociodemographic characteristics, as well as psychological parameters. Study conducted in a sample of 120 active elderly subjects recruited from two open universities of the third age in the cities of São Paulo and São José dos Campos (Southeastern Brazil) between May 2005 and April 2006. Quality of life was measured using the abbreviated Brazilian version of the World Health Organization Quality of Life (WHOQOL-bref) questionnaire. Sociodemographic, clinical and functional variables were measured through cross-culturally validated assessments by the Mini Mental State Examination, Geriatric Depression Scale, Functional Reach, One-Leg Balance Test, Timed Up and Go Test, Six-Minute Walk Test, Human Activity Profile and a complementary questionnaire. Simple descriptive analyses, Pearson's correlation coefficient, Student's t-test for non-related samples, analyses of variance, linear regression analyses and variance inflation factor calculations were performed. The significance level for all statistical tests was set at 0.05. Linear regression analysis showed an independent correlation, without collinearity, between depressive symptoms measured by the Geriatric Depression Scale and four domains of the WHOQOL-bref. Not having a conjugal life implied greater perception in the social domain; developing leisure activities and having an income over five minimum wages implied greater perception in the environment domain. Functional status had no influence on the quality of life variable in the analysis models for active elderly. In contrast, psychological factors, as assessed by the Geriatric Depression Scale, and sociodemographic characteristics, such as marital status, income and leisure activities, had an impact on quality of life.
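
    A minimal Python sketch of the multicollinearity check mentioned above (variance inflation factors for regression predictors) is shown below; the data frame and column names are hypothetical, not the study's variables.

    ```python
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    # Hypothetical predictors for a quality-of-life regression model
    # (values and column names are illustrative only).
    df = pd.DataFrame({
        "gds_score":   [2, 5, 1, 0, 3, 4, 2, 6, 1, 3],
        "income_mw":   [3, 1, 5, 6, 2, 1, 4, 1, 5, 2],
        "leisure_hrs": [8, 3, 10, 12, 5, 2, 7, 1, 9, 6],
    })
    X = sm.add_constant(df)

    # VIF for each column of the design matrix (large values flag collinearity)
    vif = pd.Series(
        [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
        index=X.columns,
    )
    print(vif)
    ```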

  5. Practical Thermal Evaluation Methods For HAC Fire Analysis In Type B Radioactive Material (RAM) Packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abramczyk, Glenn; Hensel, Stephen J; Gupta, Narendra K.

    Title 10 of the United States Code of Federal Regulations Part 71 for the Nuclear Regulatory Commission (10 CFR Part 71.73) requires that Type B radioactive material (RAM) packages satisfy certain Hypothetical Accident Conditions (HAC) thermal design requirements to ensure package safety during accidental fire conditions. Compliance with thermal design requirements can be met by prototype tests, analyses only or a combination of tests and analyses. Normally, it is impractical to meet all the HAC using tests only and the analytical methods are too complex due to the multi-physics non-linear nature of the fire event. Therefore, a combination of tests and thermal analysis methods using commercial heat transfer software is used to meet the necessary design requirements. The authors, along with their colleagues at Savannah River National Laboratory in Aiken, SC, USA, have successfully used this 'tests and analyses' approach in the design and certification of several United States' DOE/NNSA certified packages, e.g. 9975, 9977, 9978, 9979, H1700, and Bulk Tritium Shipping Package (BTSP). This paper will describe these methods and it is hoped that the RAM Type B package designers and analysts can use them for their applications.

  6. Knowledge and skills of the lamaze certified childbirth educator: results of a job task analysis.

    PubMed

    Budin, Wendy C; Gross, Leon; Lothian, Judith A; Mendelson, Jeanne

    2014-01-01

    Content validity of certification examinations is demonstrated over time with comprehensive job analyses conducted and analyzed by experts, with data gathered from stakeholders. In November 2011, the Lamaze International Certification Council conducted a job analysis update of the 2002 job analysis survey. This article presents the background, methodology, and findings of the job analysis. Changes in the test blueprint based on these findings are presented.

  7. Ares I-X Separation and Reentry Trajectory Analyses

    NASA Technical Reports Server (NTRS)

    Tartabini, Paul V.; Starr, Brett R.

    2011-01-01

    The Ares I-X Flight Test Vehicle was launched on October 28, 2009 and was the first and only test flight of NASA's two-stage Ares I launch vehicle design. The launch was successful and the flight test met all of its primary and secondary objectives. This paper discusses the stage separation and reentry trajectory analysis that was performed in support of the Ares I-X test flight. Pre-flight analyses were conducted to assess the risk of stage recontact during separation, to evaluate the first stage flight dynamics during reentry, and to define the range safety impact ellipses of both stages. The results of these pre-flight analyses were compared with available flight data. On-board video taken during flight showed that the flight test vehicle successfully separated without any recontact. Reconstructed trajectory data also showed that first stage flight dynamics were well characterized by pre-flight Monte Carlo results. In addition, comparisons with flight data indicated that the complex interference aerodynamic models employed in the reentry simulation were effective in capturing the flight dynamics during separation. Finally, the splash-down locations of both stages were well within predicted impact ellipses.

  8. Detection of colorectal cancer (CRC) by urinary volatile organic compound analysis.

    PubMed

    Arasaradnam, Ramesh P; McFarlane, Michael J; Ryan-Fisher, Courtenay; Westenbrink, Erik; Hodges, Phoebe; Hodges, Paula; Thomas, Matthew G; Chambers, Samantha; O'Connell, Nicola; Bailey, Catherine; Harmston, Christopher; Nwokolo, Chuka U; Bardhan, Karna D; Covington, James A

    2014-01-01

    Colorectal cancer (CRC) is a leading cause of cancer-related death in Europe and the USA. There is no universally accepted effective non-invasive screening test for CRC. Guaiac-based faecal occult blood (gFOB) testing has largely been superseded by Faecal Immunochemical Testing (FIT), but sensitivity still remains poor. The uptake of population-based FOBt testing in the UK is also low, at around 50%. The detection of volatile organic compound (VOC) signatures for many cancer subtypes is receiving increasing interest using a variety of gas phase analytical instruments. One such example is FAIMS (Field Asymmetric Ion Mobility Spectrometry). FAIMS is able to identify inflammatory bowel disease (IBD) patients by analysing shifts in VOC patterns in both urine and faeces. This study extends this concept to determine whether CRC patients can be identified through non-invasive analysis of urine, using FAIMS. A total of 133 patients were recruited: 83 CRC patients and 50 healthy controls. Urine was collected at the time of CRC diagnosis and headspace analysis undertaken using a FAIMS instrument (Owlstone, Lonestar, UK). Data were processed using Fisher Discriminant Analysis (FDA) after feature extraction from the raw data. FAIMS analyses demonstrated that the VOC profiles of CRC patients were tightly clustered and could be distinguished from healthy controls. Sensitivity and specificity for CRC detection with FAIMS were 88% and 60% respectively. This study suggests that VOC signatures emanating from urine can be detected in patients with CRC using ion mobility spectroscopy technology (FAIMS) with potential as a novel screening tool.
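
    As an illustration of the discriminant-analysis step described above, the following Python sketch applies Fisher/linear discriminant analysis with cross-validation to a hypothetical feature matrix; it is not the study's FAIMS processing pipeline, and the random features used here will of course classify at chance level.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_predict
    from sklearn.metrics import confusion_matrix

    # Hypothetical feature matrix: one row per urine sample, columns are features
    # extracted from ion-mobility spectra; labels 1 = CRC, 0 = control.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(133, 20))
    y = np.concatenate([np.ones(83, dtype=int), np.zeros(50, dtype=int)])

    lda = LinearDiscriminantAnalysis()
    y_pred = cross_val_predict(lda, X, y, cv=10)

    tn, fp, fn, tp = confusion_matrix(y, y_pred).ravel()
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
    ```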

  9. LOFT L2-3 blowdown experiment safety analyses D, E, and G; LOCA analyses H, K, K1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perryman, J.L.; Keeler, C.D.; Saukkoriipi, L.O.

    1978-12-01

    Three calculations using conservative off-nominal conditions and evaluation model options were made using RELAP4/MOD5 for blowdown-refill and RELAP4/MOD6 for reflood for Loss-of-Fluid Test Experiment L2-3 to support the experiment safety analysis effort. The three analyses are as follows: Analysis D: Loss of commercial power during Experiment L2-3; Analysis E: Hot leg quick-opening blowdown valve (QOBV) does not open during Experiment L2-3; and Analysis G: Cold leg QOBV does not open during Experiment L2-3. In addition, the results of three LOFT loss-of-coolant accident (LOCA) analyses using a power of 56.1 MW and a primary coolant system flow rate of 3.6 million lbm/hr are presented: Analysis H: Intact loop 200% hot leg break, emergency core cooling (ECC) system B unavailable; Analysis K: Pressurizer relief valve stuck in open position, ECC system B unavailable; and Analysis K1: Same as analysis K, but using a primary coolant system flow rate of 1.92 million lbm/hr (L2-4 pre-LOCE flow rate). For analysis D, the maximum cladding temperature reached was 1762°F, 22 sec into reflood. In analyses E and G, the blowdowns were slower due to one of the QOBVs not functioning. The maximum cladding temperature reached in analysis E was 1700°F, 64.7 sec into reflood; for analysis G, it was 1300°F at the start of reflood. For analysis H, the maximum cladding temperature reached was 1825°F, 0.01 sec into reflood. Analysis K was a very slow blowdown, and the cladding temperatures followed the saturation temperature of the system. The results of analysis K1 were nearly identical to those of analysis K; system depressurization was not affected by the primary coolant system flow rate.

  10. A Next Generation Digital Counting System For Low-Level Tritium Studies (Project Report)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowman, P.

    2016-10-03

    Since the early seventies, SRNL has pioneered low-level tritium analysis using various nuclear counting technologies and techniques. Since 1999, SRNL has successfully performed routine low-level tritium analyses with counting systems based on digital signal processor (DSP) modules developed in the late 1990s. Each of these counting systems is complex, unique to SRNL, and fully dedicated to performing routine tritium analyses of low-level environmental samples. It is time to modernize these systems due to a variety of issues including (1) age, (2) lack of direct replacement electronics modules and (3) advances in digital signal processing and computer technology. There has been considerable development in many areas associated with the enterprise of performing low-level tritium analyses. The objective of this LDRD project was to design, build, and demonstrate a Next Generation Tritium Counting System (NGTCS), while not disrupting the routine low-level tritium analyses underway in the facility on the legacy counting systems. The work involved (1) developing a test bed for building and testing new counting system hardware that does not interfere with our routine analyses, (2) testing a new counting system based on a modern state of the art DSP module, and (3) evolving the low-level tritium counter design to reflect the state of the science.

  11. Pharmacoeconomic evaluations of pharmacogenetic and genomic screening programmes: a systematic review on content and adherence to guidelines.

    PubMed

    Vegter, Stefan; Boersma, Cornelis; Rozenbaum, Mark; Wilffert, Bob; Navis, Gerjan; Postma, Maarten J

    2008-01-01

    The fields of pharmacogenetics and pharmacogenomics have become important practical tools to progress goals in medical and pharmaceutical research and development. As more screening tests are being developed, with some already used in clinical practice, consideration of cost-effectiveness implications is important. A systematic review was performed on the content of and adherence to pharmacoeconomic guidelines of recent pharmacoeconomic analyses performed in the field of pharmacogenetics and pharmacogenomics. Economic analyses of screening strategies for genetic variations, which were evidence-based and assumed to be associated with drug efficacy or safety, were included in the review. The 20 papers included cover a variety of healthcare issues, including screening tests on several cytochrome P450 (CYP) enzyme genes, thiopurine S-methyltransferase (TPMT) and angiotensin-converting enzyme insertion/deletion (ACE I/D) polymorphisms. Most economic analyses reported that genetic screening was cost effective and often even clearly dominated existing non-screening strategies. However, we found a lack of standardization regarding aspects such as the perspective of the analysis, factors included in the sensitivity analysis and the applied discount rates. In particular, an important limitation of several studies related to the failure to provide a sufficient evidence-based rationale for an association between genotype and phenotype. Future economic analyses should be conducted utilizing correct methods, with adherence to guidelines and including extensive sensitivity analyses. Most importantly, genetic screening strategies should be based on good evidence-based rationales. For these goals, we provide a list of recommendations for good pharmacoeconomic practice deemed useful in the fields of pharmacogenetics and pharmacogenomics, regardless of country and origin of the economic analysis.

  12. JPL Test Effectiveness Analysis

    NASA Technical Reports Server (NTRS)

    Shreck, Stephanie; Sharratt, Stephen; Smith, Joseph F.; Strong, Edward

    2008-01-01

    1) The pilot study provided meaningful conclusions that are generally consistent with the earlier Test Effectiveness work done between 1992 and 1994: a) Analysis of pre-launch problem/failure reports is consistent with earlier work. b) Analysis of post-launch early mission anomaly reports indicates that there are more software issues in newer missions, and the no-test category for identification of post-launch failures is more significant than in the earlier analysis. 2) Future work includes understanding how differences in missions affect these analyses: a) There are large variations in the number of problem reports and issues that are documented by the different Projects/Missions. b) Some missions do not have any reported environmental test anomalies, even though environmental tests were performed. 3) Each project/mission has different standards and conventions for filling out the PFR forms; the industry may wish to address this issue: a) Existing problem reporting forms are to document and track problems, failures, and issues (etc.) for the projects, to ensure high quality. b) Existing problem reporting forms are not intended for data mining.

  13. Thermal Analysis in Support of the Booster Separation Motor Crack Investigation

    NASA Technical Reports Server (NTRS)

    Davis, Darrell; Prickett, Terry; Turner, Larry D. (Technical Monitor)

    2001-01-01

    During a post-test inspection of a Booster Separation Motor (BSM) from a Lot Acceptance Test (LAT), a crack was noticed in the graphite throat. Since this was an out-of-family occurrence, an investigation team was formed to determine the cause of the crack. This paper will describe thermal analysis techniques used in support of this investigation. Models were generated to predict gradients in nominal motor conditions, as well as potentially anomalous conditions. Analysis was also performed on throats that were tested in the Laser Hardened Material Evaluation Laboratory (LHMEL). Some of these throats were pre-cracked, while others represented configurations designed to amplify effects of thermal stresses. Results from these analyses will be presented in this paper.

  14. Thermal Analysis in Support of the Booster Separation Motor Crack Investigation

    NASA Technical Reports Server (NTRS)

    Davis, Darrell; Prickett, Terry

    2002-01-01

    During a post-test inspection of a Booster Separation Motor (BSM) from a Lot Acceptance Test (LAT), a crack was noticed in the graphite throat. Since this was an out-of-family occurrence, an investigation team was formed to determine the cause of the crack. This paper will describe thermal analysis techniques used in support of this investigation. Models were generated to predict gradients in nominal motor conditions, as well as potentially anomalous conditions. Analysis was also performed on throats that were tested in the Laser Hardened Material Evaluation Laboratory (LHMEL). Some of these throats were pre-cracked, while others represented configurations designed to amplify effects of thermal stresses. Results from these analyses will be presented in this paper.

  15. Analysis of Palm Oil Production, Export, and Government Consumption to Gross Domestic Product of Five Districts in West Kalimantan by Panel Regression

    NASA Astrophysics Data System (ADS)

    Sulistianingsih, E.; Kiftiah, M.; Rosadi, D.; Wahyuni, H.

    2017-04-01

    Gross Domestic Product (GDP) is an indicator of economic growth in a region. GDP is panel data, consisting of cross-section and time-series components, and panel regression is a tool that can be used to analyse such data. There are three models in panel regression, namely the Common Effect Model (CEM), Fixed Effect Model (FEM) and Random Effect Model (REM). The model is chosen based on the results of the Chow test, Hausman test and Lagrange Multiplier test. This research analyses the effect of palm oil production, export, and government consumption on the GDP of five districts in West Kalimantan, namely Sanggau, Sintang, Sambas, Ketapang and Bengkayang, using panel regression. Based on the results of the analyses, it is concluded that the REM, whose adjusted coefficient of determination is 0.823, is the best model in this case. According to the results, only export and government consumption influence the GDP of these districts.
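
    A minimal Python sketch of the fixed-versus-random effects comparison described above, using the linearmodels package and a manually computed Hausman statistic, is shown below; the file name and column names are hypothetical, not the study's data.

    ```python
    import numpy as np
    import pandas as pd
    from scipy import stats
    from linearmodels.panel import PanelOLS, RandomEffects

    # Hypothetical panel: districts x years, with GDP and the three regressors.
    # The data frame must carry a MultiIndex of (district, year).
    df = pd.read_csv("west_kalimantan_panel.csv")            # assumed file
    df = df.set_index(["district", "year"])
    exog = df[["production", "export", "gov_consumption"]]   # assumed columns

    fe = PanelOLS(df["gdp"], exog, entity_effects=True).fit()   # fixed effects (FEM)
    re = RandomEffects(df["gdp"], exog).fit()                   # random effects (REM)

    # Hausman test: H0 = the random-effects estimator is consistent (prefer REM),
    # otherwise prefer FEM.
    b_diff = fe.params - re.params
    v_diff = fe.cov - re.cov
    h_stat = float(b_diff.T @ np.linalg.inv(v_diff) @ b_diff)
    p_value = stats.chi2.sf(h_stat, df=len(b_diff))
    print(f"Hausman chi2 = {h_stat:.2f}, p = {p_value:.3f}")
    ```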

  16. Research on Visual Analysis Methods of Terrorism Events

    NASA Astrophysics Data System (ADS)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    As terrorism events occur more and more frequently throughout the world, improving the capability to respond to social security incidents has become an important test of governments' ability to govern. Visual analysis has become an important method of event analysis because of its intuitiveness and effectiveness. To analyse events' spatio-temporal distribution characteristics, the correlations among event items and the development trend, the spatio-temporal characteristics of terrorism events are discussed. A suitable event data table structure based on the "5W" theory is designed. Then, six types of visual analysis are proposed, and how to use thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments have been carried out using the data provided by the Global Terrorism Database, and the results of the experiments prove the feasibility of the methods.

  17. Automated analysis of clonal cancer cells by intravital imaging

    PubMed Central

    Coffey, Sarah Earley; Giedt, Randy J; Weissleder, Ralph

    2013-01-01

    Longitudinal analyses of single cell lineages over prolonged periods have been challenging, particularly in processes characterized by high cell turnover such as inflammation, proliferation, or cancer. RGB marking has emerged as an elegant approach for enabling such investigations. However, methods for automated image analysis continue to be lacking. Here, to address this, we created a number of different multicolored poly- and monoclonal cancer cell lines for in vitro and in vivo use. To classify these cells in large scale data sets, we subsequently developed and tested an automated algorithm based on hue selection. Our results showed that this method allows accurate analyses at a fraction of the computational time required by more complex color classification methods. Moreover, the methodology should be broadly applicable to both in vitro and in vivo analyses. PMID:24349895
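
    A minimal Python sketch of hue-window classification in the spirit of the algorithm described above (not the authors' implementation; the background cutoff and hue windows are illustrative) follows.

    ```python
    import numpy as np
    from skimage import io, color

    def classify_by_hue(image_rgb, hue_windows):
        """Assign each above-background pixel to a clone label based on which
        hue window its hue falls into. `hue_windows` maps label -> (lo, hi) in [0, 1]."""
        hsv = color.rgb2hsv(image_rgb)
        hue, value = hsv[..., 0], hsv[..., 2]
        labels = np.zeros(hue.shape, dtype=int)
        foreground = value > 0.2                      # illustrative background cutoff
        for label, (lo, hi) in hue_windows.items():
            labels[foreground & (hue >= lo) & (hue < hi)] = label
        return labels

    # Hypothetical usage: three RGB-marked clones in red, green and blue hue windows
    # img = io.imread("field_of_view.png")[..., :3] / 255.0
    # clone_map = classify_by_hue(img, {1: (0.95, 1.0), 2: (0.25, 0.45), 3: (0.55, 0.75)})
    ```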

  18. Seal Analysis for the Ares-I Upper Stage Fuel Tank Manhole Cover

    NASA Technical Reports Server (NTRS)

    Phillips, Dawn R.; Wingate, Robert J.

    2010-01-01

    Techniques for studying the performance of Naflex pressure-assisted seals in the Ares-I Upper Stage liquid hydrogen tank manhole cover seal joint are explored. To assess the feasibility of using the identical seal design for the Upper Stage as was used for the Space Shuttle External Tank manhole covers, a preliminary seal deflection analysis using the ABAQUS commercial finite element software is employed. The ABAQUS analyses are performed using three-dimensional symmetric wedge finite element models. This analysis technique is validated by first modeling a heritage External Tank liquid hydrogen tank manhole cover joint and correlating the results to heritage test data. Once the technique is validated, the Upper Stage configuration is modeled. The Upper Stage analyses are performed at 1.4 times the expected pressure to comply with the Constellation Program factor of safety requirement on joint separation. Results from the analyses performed with the External Tank and Upper Stage models demonstrate the effects of several modeling assumptions on the seal deflection. The analyses for Upper Stage show that the integrity of the seal is successfully maintained.

  19. Space Shuttle Orbital Drag Parachute Design

    NASA Technical Reports Server (NTRS)

    Meyerson, Robert E.

    2001-01-01

    The drag parachute system was added to the Space Shuttle Orbiter's landing deceleration subsystem beginning with flight STS-49 in May 1992. The addition of this subsystem to an existing space vehicle required a detailed set of ground tests and analyses. The aerodynamic design and performance testing of the system consisted of wind tunnel tests, numerical simulations, pilot-in-the-loop simulations, and full-scale testing. This analysis and design resulted in a fully qualified system that is deployed on every flight of the Space Shuttle.

  20. Analysis of beryllium and depleted uranium: An overview of detection methods in aerosols and soils

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Camins, I.; Shinn, J.H.

    We conducted a survey of commercially available methods for analysis of beryllium and depleted uranium in aerosols and soils to find a reliable, cost-effective, and sufficiently precise method for researchers involved in environmental testing at the Yuma Proving Ground, Yuma, Arizona. Criteria used for evaluation include cost, method of analysis, specificity, sensitivity, reproducibility, applicability, and commercial availability. We found that atomic absorption spectrometry with graphite furnace meets these criteria for testing samples for beryllium. We found that this method can also be used to test samples for depleted uranium. However, atomic absorption with graphite furnace is not as sensitive a measurement method for depleted uranium as it is for beryllium, so we recommend that quality control of depleted uranium analysis be maintained by testing 10 of every 1000 samples by neutron activation analysis. We also evaluated 45 companies and institutions that provide analyses of beryllium and depleted uranium. 5 refs., 1 tab.

  1. Bridging ImmunoGenomic Data Analysis Workflow Gaps (BIGDAWG): An integrated case-control analysis pipeline.

    PubMed

    Pappas, Derek J; Marin, Wesley; Hollenbach, Jill A; Mack, Steven J

    2016-03-01

    Bridging ImmunoGenomic Data-Analysis Workflow Gaps (BIGDAWG) is an integrated data-analysis pipeline designed for the standardized analysis of highly-polymorphic genetic data, specifically for the HLA and KIR genetic systems. Most modern genetic analysis programs are designed for the analysis of single nucleotide polymorphisms, but the highly polymorphic nature of HLA and KIR data require specialized methods of data analysis. BIGDAWG performs case-control data analyses of highly polymorphic genotype data characteristic of the HLA and KIR loci. BIGDAWG performs tests for Hardy-Weinberg equilibrium, calculates allele frequencies and bins low-frequency alleles for k×2 and 2×2 chi-squared tests, and calculates odds ratios, confidence intervals and p-values for each allele. When multi-locus genotype data are available, BIGDAWG estimates user-specified haplotypes and performs the same binning and statistical calculations for each haplotype. For the HLA loci, BIGDAWG performs the same analyses at the individual amino-acid level. Finally, BIGDAWG generates figures and tables for each of these comparisons. BIGDAWG obviates the error-prone reformatting needed to traffic data between multiple programs, and streamlines and standardizes the data-analysis process for case-control studies of highly polymorphic data. BIGDAWG has been implemented as the bigdawg R package and as a free web application at bigdawg.immunogenomics.org. Copyright © 2015 American Society for Histocompatibility and Immunogenetics. Published by Elsevier Inc. All rights reserved.
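
    BIGDAWG itself is an R package and web application; purely as an illustration of two of the calculations it automates (a Hardy-Weinberg chi-square test at a biallelic locus, and a 2x2 allele-level odds ratio with a 95% confidence interval), a short Python sketch with hypothetical counts follows.

    ```python
    import numpy as np
    from scipy import stats

    def hwe_chi_square(n_AA, n_Aa, n_aa):
        """Chi-square test of Hardy-Weinberg equilibrium for a biallelic locus."""
        n = n_AA + n_Aa + n_aa
        p = (2 * n_AA + n_Aa) / (2 * n)                 # allele frequency of A
        expected = np.array([p**2, 2 * p * (1 - p), (1 - p) ** 2]) * n
        observed = np.array([n_AA, n_Aa, n_aa], dtype=float)
        chi2 = np.sum((observed - expected) ** 2 / expected)
        return chi2, stats.chi2.sf(chi2, df=1)

    def allele_odds_ratio(case_carrier, case_non, ctrl_carrier, ctrl_non):
        """Odds ratio with 95% CI for a 2x2 allele-count table (Woolf method)."""
        or_ = (case_carrier * ctrl_non) / (case_non * ctrl_carrier)
        se_log = np.sqrt(1/case_carrier + 1/case_non + 1/ctrl_carrier + 1/ctrl_non)
        lo, hi = np.exp(np.log(or_) + np.array([-1.96, 1.96]) * se_log)
        return or_, (lo, hi)

    # Hypothetical genotype and allele counts:
    print(hwe_chi_square(298, 489, 213))
    print(allele_odds_ratio(120, 880, 90, 910))
    ```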

  2. Linkage and related analyses of Barrett's esophagus and its associated adenocarcinomas.

    PubMed

    Sun, Xiangqing; Elston, Robert; Falk, Gary W; Grady, William M; Faulx, Ashley; Mittal, Sumeet K; Canto, Marcia I; Shaheen, Nicholas J; Wang, Jean S; Iyer, Prasad G; Abrams, Julian A; Willis, Joseph E; Guda, Kishore; Markowitz, Sanford; Barnholtz-Sloan, Jill S; Chandar, Apoorva; Brock, Wendy; Chak, Amitabh

    2016-07-01

    Familial aggregation and segregation analysis studies have provided evidence of a genetic basis for esophageal adenocarcinoma (EAC) and its premalignant precursor, Barrett's esophagus (BE). We aim to demonstrate the utility of linkage analysis to identify the genomic regions that might contain the genetic variants that predispose individuals to this complex trait (BE and EAC). We genotyped 144 individuals in 42 multiplex pedigrees chosen from 1000 singly ascertained BE/EAC pedigrees, and performed both model-based and model-free linkage analyses, using S.A.G.E. and other software. Segregation models were fitted, from the data on both the 42 pedigrees and the 1000 pedigrees, to determine parameters for performing model-based linkage analysis. Model-based and model-free linkage analyses were conducted in two sets of pedigrees: the 42 pedigrees and a subset of 18 pedigrees with female affected members that are expected to be more genetically homogeneous. Genome-wide associations were also tested in these families. Linkage analyses on the 42 pedigrees identified several regions consistently suggestive of linkage by different linkage analysis methods on chromosomes 2q31, 12q23, and 4p14. A linkage on 15q26 is the only consistent linkage region identified in the 18 female-affected pedigrees, in which the linkage signal is higher than in the 42 pedigrees. Other tentative linkage signals are also reported. Our linkage study of BE/EAC pedigrees identified linkage regions on chromosomes 2, 4, 12, and 15, with some reported associations located within our linkage peaks. Our linkage results can help prioritize association tests to delineate the genetic determinants underlying susceptibility to BE and EAC.

  3. Definition of the thermographic regions of interest in cycling by using a factor analysis

    NASA Astrophysics Data System (ADS)

    Priego Quesada, Jose Ignacio; Lucas-Cuevas, Angel Gabriel; Salvador Palmer, Rosario; Pérez-Soriano, Pedro; Cibrián Ortiz de Anda, Rosa M.a.

    2016-03-01

    Research in exercise physiology using infrared thermography has increased in recent years. However, the definition of the Regions of Interest (ROIs) varies strongly between studies. Therefore, the aim of this study was to use a factor analysis approach to define highly correlated groups of thermographic ROIs during a cycling test. Factor analyses were performed based on the moment of measurement and on the variation of skin temperatures as a result of the cycling exercise. Nineteen male participants cycled for 45 min at 50% of their individual peak power output with a cadence of 90 rpm. Infrared thermography was used to measure skin temperatures in sixteen ROIs of the trunk and lower limbs at three moments: before, immediately after and 10 min after the cycling test. Factor analyses were used to identify groups of ROIs based on the absolute skin temperatures at each moment of measurement as well as on skin temperature variations between moments. All the factor analyses performed for each moment and skin temperature variation explained more than 80% of the variance. Different groups of ROIs were obtained when the analysis was based on the moment of measurement or on the effect of exercise on the skin temperature. Furthermore, some ROIs were grouped in the same way in both analyses (e.g. the ROIs of the trunk), whereas other regions (legs and their joints) were grouped differently in each analysis. Differences between groups of ROIs are related to their tissue composition, muscular activity and capacity of sweating. In conclusion, the resultant groups of ROIs were coherent and could help researchers to define the ROIs in future thermal studies.
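
    The grouping step described above can be illustrated with a simple decomposition of a participant-by-ROI temperature matrix. The sketch below uses principal components as a rough stand-in for the factor analysis reported in the study; the temperature data are synthetic and the four-component choice is arbitrary.

```python
# Hedged sketch: grouping ROI skin temperatures by their dominant component.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_participants, n_rois = 19, 16
temps = rng.normal(31.0, 1.0, size=(n_participants, n_rois))  # hypothetical degC

pca = PCA(n_components=4).fit(temps)
print("variance explained:", round(float(pca.explained_variance_ratio_.sum()), 2))

# ROIs loading most strongly on the same component form one candidate group
loadings = pca.components_.T                     # shape (n_rois, n_components)
groups = np.abs(loadings).argmax(axis=1)
for comp in range(4):
    members = np.where(groups == comp)[0].tolist()
    print(f"component {comp}: ROIs {members}")
```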

  4. Posttest analysis of a 1:6-scale reinforced concrete reactor containment building

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weatherby, J.R.

    In an experiment conducted at Sandia National Laboratories, a 1:6-scale model of a reinforced concrete light water reactor containment building was pressurized with nitrogen gas to more than three times its design pressure. The pressurization produced one large tear and several smaller tears in the steel liner plate that functioned as the primary pneumatic seal for the structure. The data collected from the overpressurization test have been used to evaluate and further refine methods of structural analysis that can be used to predict the performance of containment buildings under conditions produced by a severe accident. This report describes posttest finite element analyses of the 1:6-scale model tests and compares pretest predictions of the structural response to the experimental results. Strains and displacements calculated in axisymmetric finite element analyses of the 1:6-scale model are compared to strains and displacements measured in the experiment. Detailed analyses of the liner plate are also described in the report. The region of the liner surrounding the large tear was analyzed using two different two-dimensional finite element models. The results from these analyses indicate that the primary mechanisms that initiated the tear can be captured in a two-dimensional finite element model. Furthermore, the analyses show that studs used to anchor the liner to the concrete wall played an important role in initiating the liner tear. Three-dimensional finite element analyses of liner plates loaded by studs are also presented. Results from the three-dimensional analyses are compared to results from two-dimensional analyses of the same problems. 12 refs., 56 figs., 1 tab.

  5. Methodological Standards for Meta-Analyses and Qualitative Systematic Reviews of Cardiac Prevention and Treatment Studies: A Scientific Statement From the American Heart Association.

    PubMed

    Rao, Goutham; Lopez-Jimenez, Francisco; Boyd, Jack; D'Amico, Frank; Durant, Nefertiti H; Hlatky, Mark A; Howard, George; Kirley, Katherine; Masi, Christopher; Powell-Wiley, Tiffany M; Solomonides, Anthony E; West, Colin P; Wessel, Jennifer

    2017-09-05

    Meta-analyses are becoming increasingly popular, especially in the fields of cardiovascular disease prevention and treatment. They are often considered to be a reliable source of evidence for making healthcare decisions. Unfortunately, problems among meta-analyses such as the misapplication and misinterpretation of statistical methods and tests are long-standing and widespread. The purposes of this statement are to review key steps in the development of a meta-analysis and to provide recommendations that will be useful for carrying out meta-analyses and for readers and journal editors, who must interpret the findings and gauge methodological quality. To make the statement practical and accessible, detailed descriptions of statistical methods have been omitted. Based on a survey of cardiovascular meta-analyses, published literature on methodology, expert consultation, and consensus among the writing group, key recommendations are provided. Recommendations reinforce several current practices, including protocol registration; comprehensive search strategies; methods for data extraction and abstraction; methods for identifying, measuring, and dealing with heterogeneity; and statistical methods for pooling results. Other practices should be discontinued, including the use of levels of evidence and evidence hierarchies to gauge the value and impact of different study designs (including meta-analyses) and the use of structured tools to assess the quality of studies to be included in a meta-analysis. We also recommend choosing a pooling model for conventional meta-analyses (fixed effect or random effects) on the basis of clinical and methodological similarities among studies to be included, rather than the results of a test for statistical heterogeneity. © 2017 American Heart Association, Inc.

  6. Contacts in the Office of Pesticide Programs, Biological and Economic Analysis Division

    EPA Pesticide Factsheets

    BEAD provides pesticide use-related information and economic analyses in support of pesticide regulatory activities. BEAD's laboratories validate analytical methods and test public health antimicrobials to ensure that they work as intended.

  7. CENTRIFUGAL VIBRATION TEST OF RC PILE FOUNDATION

    NASA Astrophysics Data System (ADS)

    Higuchi, Shunichi; Tsutsumiuchi, Takahiro; Otsuka, Rinna; Ito, Koji; Ejiri, Joji

    Evaluating the seismic performance of underground and foundation structures requires that the nonlinear responses of the structures be clarified through soil-structure interaction analysis. In this research, centrifuge shake table tests of a reinforced concrete pile foundation installed in liquefied ground were conducted. Finite element analyses of the tests were then conducted to confirm the applicability of the analytical method by comparing the experimental and analytical results.

  8. Seismic Discrimination

    DTIC Science & Technology

    1982-09-30

    Frequency-wave-number analyses of data from Nevada Test Site (NTS) shots recorded at LASA were computed in the frequency range from 0.01 to 0.05 Hz (Ref...from events in the Soviet Union at a known test site. In order to put further factual basis behind the SP spectral discriminants we used, comparisons...explosion. A catalogue of presumed explosions in the Soviet Union away from the regular test sites was assembled. A time-domain analysis of seismograms

  9. A New Approach to Response Sets in Analysis of a Test of Motivation to Achieve. A Section of the Final Report for 1969-70.

    ERIC Educational Resources Information Center

    Adkins, Dorothy C.; Ballif, Bonnie L.

    Gumpgookies, an objective-projective test of school achievement motivation for children 3 1/2 to 8 years, was reduced from 100 to 75 items following extensive factor analyses. This revised test attempted to dissipate the effects of response sets of the subjects and was prepared in three versions--an individual form, a group form for non-readers,…

  10. Warranties in Weapon System Procurement: An Analysis of Practice and Theory.

    DTIC Science & Technology

    1987-04-01

    AD-A190 933, IDA Paper P-2024, Warranties in Weapon System Procurement. Acronyms recoverable from the report include: EPWT, Essential Performance Warranty Test; FAR, Federal Acquisition Regulation; GCS, Guidance Control Section; GLD, Ground Laser Locator Designator; GS, General Service; IDA, Institute for Defense Analyses; I&L, Installations and Logistics; IPT, Initial Production Test; LDR, Laser Designator Rangefinder; MLDT, Mean...

  11. The Baptist Health Nurse Retention Questionnaire: A Methodological Study, Part 1.

    PubMed

    Lengerich, Alexander; Bugajski, Andrew; Marchese, Matthew; Hall, Brittany; Yackzan, Susan; Davies, Claire; Brockopp, Dorothy

    2017-05-01

    The purposes of this study were to develop and test the Baptist Health Nurse Retention Questionnaire (BHNRQ) and examine the importance of nurse retention factors. Multiple factors, including increasing patient acuity levels, have led to concerns regarding nurse retention. An understanding of current factors related to retention is limited. To establish the psychometric properties of the BHNRQ, data were collected from 279 bedside nurses at a 391-bed, Magnet® redesignated community hospital. A principal component analysis was conducted to determine the subscale structure of the BHNRQ. Additional analyses were conducted related to content validity and test-retest reliability. The results of the principal components analysis revealed 3 subscales: nursing practice, management, and staffing. Analyses demonstrate that the BHNRQ is a reliable and valid instrument for measuring nurse retention factors. The BHNRQ was found to be a clinically useful instrument for measuring important factors related to nurse retention.

  12. How is flow experienced and by whom? Testing flow among occupations.

    PubMed

    Llorens, Susana; Salanova, Marisa; Rodríguez, Alma M

    2013-04-01

    The aims of this paper are to test (1) the factorial structure of the frequency of flow experience at work; (2) the flow analysis model in work settings by differentiating the frequency of flow and the frequency of its prerequisites; and (3) whether there are significant differences in the frequency of flow experience depending on the occupation. A retrospective study among 957 employees (474 tile workers and 483 secondary school teachers) using multigroup confirmatory factorial analyses and multiple analyses of variance suggested that on the basis of the flow analysis model in work settings, (1) the frequency of flow experience has a two-factor structure (enjoyment and absorption); (2) the frequency of flow experience at work is produced when both challenge and skills are high and balanced; and (3) secondary school teachers experience flow more frequently than tile workers. Copyright © 2012 John Wiley & Sons, Ltd.

  13. Pathway-based analyses.

    PubMed

    Kent, Jack W

    2016-02-03

    New technologies for acquisition of genomic data, while offering unprecedented opportunities for genetic discovery, also impose severe burdens of interpretation and penalties for multiple testing. The Pathway-based Analyses Group of the Genetic Analysis Workshop 19 (GAW19) sought reduction of multiple-testing burden through various approaches to aggregation of high-dimensional data in pathways informed by prior biological knowledge. Experimental methods tested included the use of "synthetic pathways" (random sets of genes) to estimate power and false-positive error rate of methods applied to simulated data; data reduction via independent components analysis, single-nucleotide polymorphism (SNP)-SNP interaction, and use of gene sets to estimate genetic similarity; and general assessment of the efficacy of prior biological knowledge to reduce the dimensionality of complex genomic data. The work of this group explored several promising approaches to managing high-dimensional data, with the caveat that these methods are necessarily constrained by the quality of external bioinformatic annotation.
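
    The "synthetic pathway" idea described above can be illustrated by drawing random gene sets from null data and counting how often a pathway-level test rejects. The sketch below uses a one-sample t-test as a stand-in pathway statistic; the gene counts, set size, and the statistic itself are all invented for illustration and are not the GAW19 methods.

```python
# Hedged sketch: estimating a pathway test's false-positive rate with random gene sets.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_genes, set_size, n_sets = 2000, 25, 500

gene_scores = rng.normal(0.0, 1.0, size=n_genes)   # null gene-level scores, no true signal

false_positives = 0
for _ in range(n_sets):
    genes = rng.choice(n_genes, size=set_size, replace=False)  # one "synthetic pathway"
    _, p = stats.ttest_1samp(gene_scores[genes], 0.0)          # stand-in pathway statistic
    false_positives += p < 0.05

print(f"empirical false-positive rate: {false_positives / n_sets:.3f}")
```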

  14. Statistical analysis and interpretation of prenatal diagnostic imaging studies, Part 2: descriptive and inferential statistical methods.

    PubMed

    Tuuli, Methodius G; Odibo, Anthony O

    2011-08-01

    The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.

  15. Development and Psychometric Properties of the Math and Me Survey: Measuring Third through Sixth Graders' Attitudes toward Mathematics

    ERIC Educational Resources Information Center

    Adelson, Jill L.; McCoach, D. Betsy

    2011-01-01

    The Math and Me Survey was designed to measure elementary students' attitudes toward mathematics. The authors conducted content validation, exploratory factor analysis, confirmatory factor analysis, item response theory, reliability, and external validity analyses to improve it and to test its psychometric properties. The final Math and Me Survey…

  16. Evaluation of Thermal Protection Tile Transmissibility for Ground Vibration Test

    NASA Technical Reports Server (NTRS)

    Chung, Y. T.; Fowler, Samuel B.; Lo, Wenso; Towner, Robert

    2005-01-01

    Transmissibility analyses and tests were conducted on a composite panel with thermal protection system foams to evaluate the quality of the measured frequency response functions. Both the analysis and the test results indicate that the vehicle dynamic responses are fully transmitted to the accelerometers mounted on the thermal protection system in the normal direction below a certain frequency. In addition, the in-plane motions of the accelerometer mounted on the top surface of the thermal protection system behave more actively than those on the composite panel due to the geometric offset of the accelerometer from the panel in the test set-up. The transmissibility tests and analyses show that the frequency response functions measured from the accelerometers mounted on the TPS will provide accurate vehicle responses below 120 Hz for frequency and mode shape identification. By confirming that accurate dynamic responses below a given frequency can be obtained, this study increases the confidence needed for conducting the modal testing, model correlation, and model updating for a vehicle installed with TPS.

  17. Item Response Theory Analyses of the Cambridge Face Memory Test (CFMT)

    PubMed Central

    Cho, Sun-Joo; Wilmer, Jeremy; Herzmann, Grit; McGugin, Rankin; Fiset, Daniel; Van Gulick, Ana E.; Ryan, Katie; Gauthier, Isabel

    2014-01-01

    We evaluated the psychometric properties of the Cambridge face memory test (CFMT; Duchaine & Nakayama, 2006). First, we assessed the dimensionality of the test with a bi-factor exploratory factor analysis (EFA). This EFA analysis revealed a general factor and three specific factors clustered by targets of CFMT. However, the three specific factors appeared to be minor factors that can be ignored. Second, we fit a unidimensional item response model. This item response model showed that the CFMT items could discriminate individuals at different ability levels and covered a wide range of the ability continuum. We found the CFMT to be particularly precise for a wide range of ability levels. Third, we implemented item response theory (IRT) differential item functioning (DIF) analyses for each gender group and two age groups (Age ≤ 20 versus Age > 21). This DIF analysis suggested little evidence of consequential differential functioning on the CFMT for these groups, supporting the use of the test to compare older to younger, or male to female, individuals. Fourth, we tested for a gender difference on the latent facial recognition ability with an explanatory item response model. We found a significant but small gender difference on the latent ability for face recognition, which was higher for women than men by 0.184, at age mean 23.2, controlling for linear and quadratic age effects. Finally, we discuss the practical considerations of the use of total scores versus IRT scale scores in applications of the CFMT. PMID:25642930
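
    The unidimensional item response model mentioned above can be reduced, for illustration, to scoring one examinee under a two-parameter logistic (2PL) model with known item parameters. The sketch below estimates ability by maximum likelihood; the item parameters and responses are invented, and this is not the authors' estimation code.

```python
# Hedged sketch: maximum-likelihood ability estimate under a 2PL item response model.
import numpy as np
from scipy.optimize import minimize_scalar

discrimination = np.array([1.2, 0.8, 1.5, 1.0, 0.9])    # a_i (hypothetical)
difficulty = np.array([-1.0, -0.3, 0.2, 0.8, 1.5])      # b_i (hypothetical)
responses = np.array([1, 1, 1, 0, 0])                   # observed 0/1 answers

def neg_log_likelihood(theta):
    # 2PL probability of a correct response, then the negative Bernoulli log-likelihood
    p = 1.0 / (1.0 + np.exp(-discrimination * (theta - difficulty)))
    return -np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))

theta_hat = minimize_scalar(neg_log_likelihood, bounds=(-4, 4), method="bounded").x
print(f"estimated ability theta = {theta_hat:.2f}")
```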

  18. Using multiple group modeling to test moderators in meta-analysis.

    PubMed

    Schoemann, Alexander M

    2016-12-01

    Meta-analysis is a popular and flexible analysis that can be fit in many modeling frameworks. Two methods of fitting meta-analyses that are growing in popularity are structural equation modeling (SEM) and multilevel modeling (MLM). By using SEM or MLM to fit a meta-analysis researchers have access to powerful techniques associated with SEM and MLM. This paper details how to use one such technique, multiple group analysis, to test categorical moderators in meta-analysis. In a multiple group meta-analysis a model is fit to each level of the moderator simultaneously. By constraining parameters across groups any model parameter can be tested for equality. Using multiple groups to test for moderators is especially relevant in random-effects meta-analysis where both the mean and the between studies variance of the effect size may be compared across groups. A simulation study and the analysis of a real data set are used to illustrate multiple group modeling with both SEM and MLM. Issues related to multiple group meta-analysis and future directions for research are discussed. Copyright © 2016 John Wiley & Sons, Ltd.
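
    In the simplest fixed-effect case, comparing pooled effects across levels of a categorical moderator amounts to a Q-between test, which is the idea behind constraining parameters across groups in the SEM/MLM formulations above. The sketch below pools two invented subgroups and tests whether their pooled effects differ; it is a simplified analogue, not the multiple group SEM/MLM machinery itself.

```python
# Hedged sketch: fixed-effect subgroup pooling and a Q-between moderator test.
import numpy as np
from scipy import stats

def pool_fixed(effects, variances):
    # Inverse-variance weighted pooled effect and its variance
    w = 1.0 / variances
    mean = np.sum(w * effects) / np.sum(w)
    return mean, 1.0 / np.sum(w)

group_a = (np.array([0.30, 0.45, 0.25]), np.array([0.02, 0.03, 0.02]))  # hypothetical
group_b = (np.array([0.05, 0.10, 0.00]), np.array([0.02, 0.02, 0.03]))  # hypothetical

mean_a, var_a = pool_fixed(*group_a)
mean_b, var_b = pool_fixed(*group_b)

q_between = (mean_a - mean_b) ** 2 / (var_a + var_b)
p_value = stats.chi2.sf(q_between, df=1)
print(f"group A = {mean_a:.2f}, group B = {mean_b:.2f}, "
      f"Q_between = {q_between:.2f}, p = {p_value:.3f}")
```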

  19. The SNPforID Assay as a Supplementary Method in Kinship and Trace Analysis

    PubMed Central

    Schwark, Thorsten; Meyer, Patrick; Harder, Melanie; Modrow, Jan-Hendrick; von Wurmb-Schwark, Nicole

    2012-01-01

    Objective Short tandem repeat (STR) analysis using commercial multiplex PCR kits is the method of choice for kinship testing and trace analysis. However, under certain circumstances (deficiency testing, mutations, minute DNA amounts), STRs alone may not suffice. Methods We present a 50-plex single nucleotide polymorphism (SNP) assay based on the SNPs chosen by the SNPforID consortium as an additional method for paternity and for trace analysis. The new assay was applied to selected routine paternity and trace cases from our laboratory. Results and Conclusions Our investigation shows that the new SNP multiplex assay is a valuable method to supplement STR analysis, and is a powerful means to solve complicated genetic analyses. PMID:22851934

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    West, W.S.

    Progress during the period includes completion of the SNAP 7C system tests, completion of safety analysis for the SNAP 7A and C systems, assembly and initial testing of SNAP 7A, assembly of a modified reliability model, and assembly of a 10-W generator. Other activities include completion of thermal and safety analyses for SNAP 7B and D generators and fuel processing for these generators. (J.R.D.)

  1. A Confirmatory Factor Analysis of the California Verbal Learning Test-Second Edition (CVLT-II) in a Traumatic Brain Injury Sample

    ERIC Educational Resources Information Center

    DeJong, Joy; Donders, Jacobus

    2009-01-01

    The latent structure of the California Verbal Learning Test-Second Edition (CVLT-II) was examined in a clinical sample of 223 persons with traumatic brain injury that had been screened to remove individuals with complicating premorbid (e.g., psychiatric) or comorbid (e.g., financial compensation seeking) histories. Analyses incorporated the…

  2. Deck and Cable Dynamic Testing of a Single-span Bridge Using Radar Interferometry and Videometry Measurements

    NASA Astrophysics Data System (ADS)

    Piniotis, George; Gikas, Vassilis; Mpimis, Thanassis; Perakis, Harris

    2016-03-01

    This paper presents the dynamic testing of a roadway, single-span, cable-stayed bridge for a sequence of static load and ambient vibration monitoring scenarios. Deck movements were captured along both sides of the bridge using a Digital Image Correlation (DIC) and a Ground-based Microwave Interferometer (GBMI) system. Cable vibrations were measured at a single point location on each of the six cables using the GBMI technique. Dynamic testing involves three types of analyses; firstly, vibration analysis and modal parameter estimation (i.e., natural frequencies and modal shapes) of the deck using the combined DIC and GBMI measurements. Secondly, dynamic testing of the cables is performed through vibration analysis and experimental computation of their tension forces. Thirdly, the mechanism of cable-deck dynamic interaction is studied through their Power Spectral Density (PSD) and Short Time Fourier Transform (STFT) analyses. Thereby, the global (deck and cable) and local (either deck or cable) bridge modes are identified, serving as a concrete benchmark of the current state of the bridge for studying the evolution of its structural performance in the future. The level of synergy and complementarity between the GBMI and DIC techniques for bridge monitoring is also examined and assessed.

  3. What concept analysis in philosophy of science should be (and why competing philosophical analyses of gene concepts cannot be tested by polling scientists).

    PubMed

    Waters, C Kenneth

    2004-01-01

    What should philosophers of science accomplish when they analyze scientific concepts and interpret scientific knowledge? What is concept analysis if it is not a description of the way scientists actually think? I investigate these questions by using Hans Reichenbach's account of the descriptive, critical, and advisory tasks of philosophy of science to examine Karola Stotz and Paul Griffiths' idea that poll-based methodologies can test philosophical analyses of scientific concepts. Using Reichenbach's account as a point of departure, I argue that philosophy of science should identify and clarify epistemic virtues and describe scientific knowledge in relation to these virtues. The role of concept analysis is to articulate scientific concepts in ways that help reveal epistemic virtues and limitations of particular sciences. This means an analysis of the gene concept(s) should help clarify the explanatory power and limitations of gene-based explanations, and should help account for the investigative utility and biases of gene-centered sciences. I argue that a philosophical analysis of gene concept(s) that helps achieve these critical aims should not be rejected on the basis of poll-based studies even if such studies could show that professional biologists don't actually use gene terminology in precise ways corresponding to the philosophical analysis.

  4. Validation of the Work-Life Balance Culture Scale (WLBCS).

    PubMed

    Nitzsche, Anika; Jung, Julia; Kowalski, Christoph; Pfaff, Holger

    2014-01-01

    The purpose of this paper is to describe the theoretical development and initial validation of the newly developed Work-Life Balance Culture Scale (WLBCS), an instrument for measuring an organizational culture that promotes the work-life balance of employees. In Study 1 (N=498), the scale was developed and its factorial validity tested through exploratory factor analyses. In Study 2 (N=513), confirmatory factor analysis (CFA) was performed to examine model fit and retest the dimensional structure of the instrument. To assess construct validity, a priori hypotheses were formulated and subsequently tested using correlation analyses. Exploratory and confirmatory factor analyses revealed a one-factor model. Results of the bivariate correlation analyses may be interpreted as preliminary evidence of the scale's construct validity. The five-item WLBCS is a new and efficient instrument with good overall quality. Its conciseness makes it particularly suitable for use in employee surveys to gain initial insight into a company's perceived work-life balance culture.

  5. Validity of the estimates of oral cholera vaccine effectiveness derived from the test-negative design.

    PubMed

    Ali, Mohammad; You, Young Ae; Sur, Dipika; Kanungo, Suman; Kim, Deok Ryun; Deen, Jacqueline; Lopez, Anna Lena; Wierzba, Thomas F; Bhattacharya, Sujit K; Clemens, John D

    2016-01-20

    The test-negative design (TND) has emerged as a simple method for evaluating vaccine effectiveness (VE). Its utility for evaluating oral cholera vaccine (OCV) effectiveness is unknown. We examined this method's validity in assessing OCV effectiveness by comparing the results of TND analyses with those of conventional cohort analyses. Randomized controlled trials of OCV were conducted in Matlab (Bangladesh) and Kolkata (India), and an observational cohort design was used in Zanzibar (Tanzania). For all three studies, VE using the TND was estimated from the odds ratio (OR) relating vaccination status to fecal test status (Vibrio cholerae O1 positive or negative) among diarrheal patients enrolled during surveillance (VE = (1 − OR) × 100%). In cohort analyses of these studies, we employed the Cox proportional hazard model for estimating VE (VE = (1 − hazard ratio) × 100%). OCV effectiveness estimates obtained using the TND (Matlab: 51%, 95% CI:37-62%; Kolkata: 67%, 95% CI:57-75%) were similar to the cohort analyses of these RCTs (Matlab: 52%, 95% CI:43-60% and Kolkata: 66%, 95% CI:55-74%). The TND VE estimate for the Zanzibar data was 94% (95% CI:84-98%) compared with 82% (95% CI:58-93%) in the cohort analysis. After adjusting for residual confounding in the cohort analysis of the Zanzibar study, using a bias indicator condition, we observed almost no difference in the two estimates. Our findings suggest that the TND is a valid approach for evaluating OCV effectiveness in routine vaccination programs. Copyright © 2015 Elsevier Ltd. All rights reserved.
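
    The TND calculation above is a simple function of a 2×2 table of vaccination status by test result among enrolled diarrheal patients. The sketch below applies VE = (1 − OR) × 100% with a Woolf-type confidence interval to invented counts; it illustrates the estimator only and is not a reanalysis of the study data.

```python
# Hedged sketch: test-negative design vaccine effectiveness from hypothetical counts.
import numpy as np

#                  vaccinated  unvaccinated
# test-positive        a            b
# test-negative        c            d
a, b, c, d = 30, 70, 120, 110

odds_ratio = (a * d) / (b * c)
ve = (1 - odds_ratio) * 100
se_log_or = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ve_ci = (1 - np.exp(np.log(odds_ratio) + np.array([1.96, -1.96]) * se_log_or)) * 100

print(f"VE = {ve:.0f}% (95% CI {ve_ci[0]:.0f}% to {ve_ci[1]:.0f}%)")
```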

  6. Mind mapping in qualitative research.

    PubMed

    Tattersall, Christopher; Powell, Julia; Stroud, James; Pringle, Jan

    We tested a theory that mind mapping could be used as a tool in qualitative research to transcribe and analyse an interview. We compared results derived from mind mapping with those from interpretive phenomenological analysis by examining patients' and carers' perceptions of a new nurse-led service. Mind mapping could be used to rapidly analyse simple qualitative audio-recorded interviews. More research is needed to establish the extent to which mind mapping can assist qualitative researchers.

  7. A comparison of reliability and conventional estimation of safe fatigue life and safe inspection intervals

    NASA Technical Reports Server (NTRS)

    Hooke, F. H.

    1972-01-01

    Both the conventional and reliability analyses for determining safe fatigue life are predicated on a population having a specified (usually log normal) distribution of life to collapse under a fatigue test load. Under a random service load spectrum, random occurrences of load larger than the fatigue test load may confront and cause collapse of structures which are weakened, though not yet to the fatigue test load. These collapses are included in reliability but excluded in conventional analysis. The theory of risk determination by each method is given, and several reasonably typical examples have been worked out, in which it transpires that if one excludes collapse through exceedance of the uncracked strength, the reliability and conventional analyses gave virtually identical probabilities of failure or survival.

  8. Assessment of technical condition of concrete pavement by the example of district road

    NASA Astrophysics Data System (ADS)

    Linek, M.; Nita, P.; Żebrowski, W.; Wolka, P.

    2018-05-01

    The article presents a comprehensive assessment of the condition of a concrete pavement. The analyses covered a district road located in the swietokrzyskie province that had been in service for 11 years. Comparative analyses were conducted twice: the first after 9 years of pavement operation, in 2015, and the second, to assess the extent of pavement degradation, in 2017. Within the scope of the field research, the traffic intensity on the analysed road section was determined. Visual assessment of the pavement condition was conducted according to the guidelines of SOSN-B. Visual assessment can be extended by ground-penetrating radar measurements, which allow a comprehensive assessment of structural changes throughout the pavement's thickness and length. The assessment also included performance parameters, i.e. pavement regularity, surface roughness and texture. Extending the test results with an assessment of changes in the internal structure of the concrete composite, and with structure observations using a Scanning Electron Microscope, allows the parameters of the internal structure of the hardened concrete to be evaluated. Supplementing the observations of the internal structure with computed tomography scans provides comprehensive information on possible discontinuities and on the composite structure. Based on the analysis of the obtained results, conclusions concerning the condition of the analysed pavement were reached. The pavement was found to have high performance parameters and to be in good condition, requiring no repairs. Maintenance treatment was suggested in order to extend the period of proper operation of the analysed pavement.

  9. Contrasting effects of geographical separation on the genetic population structure of sympatric species of mites in avocado orchards.

    PubMed

    Guzman-Valencia, S; Santillán-Galicia, M T; Guzmán-Franco, A W; González-Hernández, H; Carrillo-Benítez, M G; Suárez-Espinoza, J

    2014-10-01

    Oligonychus punicae and Oligonychus perseae (Acari: Tetranychidae) are the most important mite species affecting avocado orchards in Mexico. Here we used nucleotide sequence data from segments of the nuclear ribosomal internal transcribed spacers (ITS1 and ITS2) and mitochondrial cytochrome oxidase subunit I (COI) genes to assess the phylogenetic relationships between both sympatric mite species and, using only ITS sequence data, examine genetic variation and population structure in both species, to test the hypothesis that, although both species co-occur, their genetic population structures are different in both Michoacan state (main producer) and Mexico state. Phylogenetic analysis showed a clear separation between both species using ITS and COI sequence information. Haplotype network analysis done on 24 samples of O. punicae revealed low genetic diversity with only three haplotypes found but a significant geographical population structure confirmed by analysis of molecular variance (AMOVA) and Kimura-2-parameter (K2P) analyses. In addition, a Mantel test revealed that geographical isolation was a factor responsible for the genetic differentiation. In contrast, analyses of 22 samples of O. perseae revealed high genetic diversity with 15 haplotypes found but no geographical structure confirmed by the AMOVA, K2P and Mantel test analyses. We have suggested that geographical separation is one of the most important factors driving genetic variation, but that it affected each species differently. The role of the ecology of these species on our results, and the importance of our findings in the development of monitoring and control strategies are discussed.

  10. The Success of Linear Bootstrapping Models: Decision Domain-, Expertise-, and Criterion-Specific Meta-Analysis

    PubMed Central

    Kaufmann, Esther; Wittmann, Werner W.

    2016-01-01

    The success of bootstrapping or replacing a human judge with a model (e.g., an equation) has been demonstrated in Paul Meehl’s (1954) seminal work and bolstered by the results of several meta-analyses. To date, however, analyses considering different types of meta-analyses as well as the potential dependence of bootstrapping success on the decision domain, the level of expertise of the human judge, and the criterion for what constitutes an accurate decision have been missing from the literature. In this study, we addressed these research gaps by conducting a meta-analysis of lens model studies. We compared the results of a traditional (bare-bones) meta-analysis with findings of a meta-analysis of the success of bootstrap models corrected for various methodological artifacts. In line with previous studies, we found that bootstrapping was more successful than human judgment. Furthermore, bootstrapping was more successful in studies with an objective decision criterion than in studies with subjective or test score criteria. We did not find clear evidence that the success of bootstrapping depended on the decision domain (e.g., education or medicine) or on the judge’s level of expertise (novice or expert). Correction of methodological artifacts increased the estimated success of bootstrapping, suggesting that previous analyses without artifact correction (i.e., traditional meta-analyses) may have underestimated the value of bootstrapping models. PMID:27327085
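
    A bare-bones meta-analysis of correlations, of the kind contrasted above with the artifact-corrected analysis, weights study correlations by sample size and subtracts the variance expected from sampling error alone. The sketch below follows that Hunter-Schmidt-style recipe on invented correlations; it does not reproduce the artifact corrections applied in the study.

```python
# Hedged sketch: bare-bones (sample-size weighted) meta-analysis of correlations.
import numpy as np

r = np.array([0.55, 0.62, 0.48, 0.70, 0.58])   # study correlations (hypothetical)
n = np.array([40, 60, 35, 80, 50])             # study sample sizes (hypothetical)

r_bar = np.sum(n * r) / np.sum(n)                     # sample-size weighted mean correlation
var_obs = np.sum(n * (r - r_bar) ** 2) / np.sum(n)    # observed variance of r
var_err = (1 - r_bar ** 2) ** 2 / (n.mean() - 1)      # variance expected from sampling error
var_true = max(var_obs - var_err, 0.0)                # residual ("true") variance

print(f"weighted mean r = {r_bar:.2f}, residual SD = {np.sqrt(var_true):.3f}")
```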

  11. On-line evaluation of multiloop digital controller performance

    NASA Technical Reports Server (NTRS)

    Wieseman, Carol D.

    1993-01-01

    The purpose of this presentation is to inform the Guidance and Control community of capabilities which were developed by the Aeroservoelasticity Branch to evaluate the performance of multivariable control laws, on-line, during wind-tunnel testing. The capabilities are generic enough to be useful for all kinds of on-line analyses involving multivariable control in experimental testing. Consequently, it was decided to present this material at this workshop even though it has been presented elsewhere. Topics covered include: essential on-line analysis requirements; on-line analysis capabilities; on-line analysis software; frequency domain procedures; controller performance evaluation; frequency-domain flutter suppression; and plant determination.

  12. Suggestions for presenting the results of data analyses

    USGS Publications Warehouse

    Anderson, David R.; Link, William A.; Johnson, Douglas H.; Burnham, Kenneth P.

    2001-01-01

    We give suggestions for the presentation of research results from frequentist, information-theoretic, and Bayesian analysis paradigms, followed by several general suggestions. The information-theoretic and Bayesian methods offer alternative approaches to data analysis and inference compared to traditionally used methods. Guidance is lacking on the presentation of results under these alternative procedures and on nontesting aspects of classical frequentist methods of statistical analysis. Null hypothesis testing has come under intense criticism. We recommend less reporting of the results of statistical tests of null hypotheses in cases where the null is surely false anyway, or where the null hypothesis is of little interest to science or management.

  13. Design, durability and low cost processing technology for composite fan exit guide vanes

    NASA Technical Reports Server (NTRS)

    Blecherman, S. S.

    1979-01-01

    A lightweight composite fan exit guide vane for high bypass ratio gas turbine engine application was investigated. Eight candidate material/design combinations were evaluated by NASTRAN finite element analyses. A total of four combinations were selected for further analytical evaluation, part fabrication by two vendors, and fatigue testing in dry and wet conditions. A core and shell vane design was chosen in which the unidirectional graphite core fiber was the same for all candidates. The shell material, fiber orientation, and ply configuration were varied. Material tests were performed on raw material and composite specimens to establish specification requirements. Pre-test and post-test microstructural examination and nondestructive analyses were conducted to determine the effect of material variations on fatigue durability and failure mode. Relevant data were acquired with respect to design analysis, materials properties, inspection standards, improved durability, weight benefits, and part price of the composite fan exit guide vane.

  14. A Low Cost, Self Acting, Liquid Hydrogen Boil-Off Recovery System

    NASA Technical Reports Server (NTRS)

    Pelfrey, Joy W.; Sharp, Kirk V. (Technical Monitor)

    2001-01-01

    The purpose of this research was to develop a prototype liquid hydrogen boil-off recovery system: perform analyses to finalize the recovery system cycle, design detailed components, fabricate hardware, and conduct sub-component, component, and system level tests leading to the delivery of a prototype system. The design point and off-design analyses identified cycle improvements to increase the robustness of the system by adding a by-pass heat exchanger. Based on the design, analysis, and testing conducted, the recovery system will liquefy 31% of the gaseous boil-off from a liquid hydrogen storage tank. All components, including a high speed, miniature turbocompressor, were designed and manufacturing drawings were created. All hardware was fabricated and tests were conducted in air, helium, and hydrogen. Testing validated the design, except for the turbocompressor. A rotor-to-stator clearance issue was discovered as a result of a concentricity tolerance stack-up.

  15. Investigation of Zircaloy-2 oxidation model for SFP accident analysis

    NASA Astrophysics Data System (ADS)

    Nemoto, Yoshiyuki; Kaji, Yoshiyuki; Ogawa, Chihiro; Kondo, Keietsu; Nakashima, Kazuo; Kanazawa, Toru; Tojo, Masayuki

    2017-05-01

    The authors previously conducted thermogravimetric analyses on Zircaloy-2 in air. By using the thermogravimetric data, an oxidation model was constructed in this study so that it can be applied to the modeling of cladding degradation in spent fuel pool (SFP) severe accident conditions. For its validation, oxidation tests of long cladding tubes were conducted, and computational fluid dynamics analyses using the constructed oxidation model were performed to simulate the experiments. In the oxidation tests, a high-temperature gradient along the cladding axis was applied and the air flow rates in the testing chamber were controlled to simulate hypothetical SFP accidents. The analytical outputs successfully reproduced the growth of the oxide film and porous oxide layer on the claddings in the oxidation tests, and the validity of the oxidation model was demonstrated. The influence of air flow rate on the oxidation behavior was considered negligible in the conditions investigated in this study.

  16. Alzheimer disease and cancer risk: a meta-analysis.

    PubMed

    Shi, Hai-bin; Tang, Bo; Liu, Yao-Wen; Wang, Xue-Feng; Chen, Guo-Jun

    2015-03-01

    Alzheimer disease (AD) and cancer are seemingly two opposite ends of one spectrum. Studies have suggested that patients with AD showed a reduced risk of cancer and vice versa. However, the available evidence is not conclusive. We therefore conducted a meta-analysis of the published literature to systematically examine cancer risk in AD patients. A search of PubMed, EMBASE, and Web of Science was conducted in May 2014. Pooled risk ratios (RRs) with their corresponding 95 % confidence intervals (CIs) were obtained using random-effects meta-analysis. We tested for publication bias and heterogeneity, and stratified for study characteristics, smoking-related cancers versus nonsmoking-related cancers, and site-specific cancers. Nine studies were included in this meta-analysis. Compared with controls, the pooled RR of cancer in AD patients was 0.55 (95 % CI 0.41-0.75), with significant heterogeneity among these studies (P < 0.001, I(2) = 83.5 %). The reduced cancer risk was more substantial when we restricted analyses to cohort studies, studies with adjusted estimates, studies defining AD by generally accepted criteria, and studies with longer length of follow-up. In sub-analyses for site-specific cancers, only lung cancer showed a significantly decreased risk (RR 0.72; 95 % CI 0.56-0.91). We did not find significant publication bias (P = 0.251 for Begg and Mazumdar's test and P = 0.143 for Egger's regression asymmetry test). These results support an association between AD and decreased cancer risk.
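
    The pooled risk ratio reported above comes from a random-effects model. The sketch below shows the standard DerSimonian-Laird calculation on invented study-level risk ratios and standard errors, together with Cochran's Q and I²; it is a generic illustration, not a reproduction of the nine included studies.

```python
# Hedged sketch: DerSimonian-Laird random-effects pooling of log risk ratios.
import numpy as np

log_rr = np.log(np.array([0.45, 0.60, 0.52, 0.80, 0.40]))  # hypothetical study RRs
se = np.array([0.15, 0.20, 0.18, 0.25, 0.22])              # hypothetical standard errors

w = 1.0 / se**2                                  # fixed-effect (inverse-variance) weights
fixed = np.sum(w * log_rr) / np.sum(w)
q = np.sum(w * (log_rr - fixed) ** 2)            # Cochran's Q
df = len(log_rr) - 1
tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))  # DL between-study variance

w_re = 1.0 / (se**2 + tau2)                      # random-effects weights
pooled = np.sum(w_re * log_rr) / np.sum(w_re)
se_pooled = np.sqrt(1.0 / np.sum(w_re))
ci = np.exp(pooled + np.array([-1.96, 1.96]) * se_pooled)
i2 = max(0.0, (q - df) / q) * 100

print(f"pooled RR = {np.exp(pooled):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f}), "
      f"Q = {q:.1f}, I2 = {i2:.0f}%")
```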

  17. Propfan test assessment testbed aircraft flutter model test report

    NASA Technical Reports Server (NTRS)

    Jenness, C. M. J.

    1987-01-01

    The PropFan Test Assessment (PTA) program includes flight tests of a propfan power plant mounted on the left wing of a modified Gulfstream II testbed aircraft. A static balance boom is mounted on the right wing tip for lateral balance. Flutter analyses indicate that these installations reduce the wing flutter stabilizing speed and that torsional stiffening and the installation of a flutter stabilizing tip boom are required on the left wing for adequate flutter safety margins. Wind tunnel tests of a 1/9th scale high speed flutter model of the testbed aircraft were conducted. The test program included the design, fabrication, and testing of the flutter model and the correlation of the flutter test data with analysis results. Excellent correlations with the test data were achieved in posttest flutter analysis using actual model properties. It was concluded that the flutter analysis method used was capable of accurate flutter predictions for both the (symmetric) twin propfan configuration and the (unsymmetric) single propfan configuration. The flutter analysis also revealed that the differences between the tested model configurations and the current aircraft design caused the (scaled) model flutter speed to be significantly higher than that of the aircraft, at least for the single propfan configuration without a flutter boom. Verification of the aircraft final design should, therefore, be based on flutter predictions made with the test validated analysis methods.

  18. Plug cluster module demonstration

    NASA Technical Reports Server (NTRS)

    Rousar, D. C.

    1978-01-01

    The low pressure, film cooled rocket engine design concept developed during two previous ALRC programs was re-evaluated for application as a module for a plug cluster engine capable of performing space shuttle OTV missions. The nominal engine mixture ratio was 5.5 and the engine life requirements were 1200 thermal cycles and 10 hours total operating life. The program consisted of pretest analysis; engine tests, performed using residual components; and posttest analysis. The pretest analysis indicated that operation of the film cooled engine at O/F = 5.5 was feasible. During the engine tests, steady state wall temperature and performance measurements were obtained over a range of film cooling flow rates, and the durability of the engine was demonstrated by firing the test engine 1220 times at a nominal performance ranging from 430 to 432 seconds. The performance of the test engine was limited by film coolant sleeve damage which had occurred during previous testing. The post-test analyses indicated that the nominal performance level can be increased to 436 seconds.

  19. Adjusting for partial verification or workup bias in meta-analyses of diagnostic accuracy studies.

    PubMed

    de Groot, Joris A H; Dendukuri, Nandini; Janssen, Kristel J M; Reitsma, Johannes B; Brophy, James; Joseph, Lawrence; Bossuyt, Patrick M M; Moons, Karel G M

    2012-04-15

    A key requirement in the design of diagnostic accuracy studies is that all study participants receive both the test under evaluation and the reference standard test. For a variety of practical and ethical reasons, sometimes only a proportion of patients receive the reference standard, which can bias the accuracy estimates. Numerous methods have been described for correcting this partial verification bias or workup bias in individual studies. In this article, the authors describe a Bayesian method for obtaining adjusted results from a diagnostic meta-analysis when partial verification or workup bias is present in a subset of the primary studies. The method corrects for verification bias without having to exclude primary studies with verification bias, thus preserving the main advantages of a meta-analysis: increased precision and better generalizability. The results of this method are compared with the existing methods for dealing with verification bias in diagnostic meta-analyses. For illustration, the authors use empirical data from a systematic review of studies of the accuracy of the immunohistochemistry test for diagnosis of human epidermal growth factor receptor 2 status in breast cancer patients.

  20. Effects of winglet on transonic flutter characteristics of a cantilevered twin-engine-transport wing model

    NASA Technical Reports Server (NTRS)

    Ruhlin, C. L.; Bhatia, K. G.; Nagaraja, K. S.

    1986-01-01

    A transonic model and a low-speed model were flutter tested in the Langley Transonic Dynamics Tunnel at Mach numbers up to 0.90. Transonic flutter boundaries were measured for 10 different model configurations, which included variations in wing fuel, nacelle pylon stiffness, and wingtip configuration. The winglet effects were evaluated by testing the transonic model, having a specific wing fuel and nacelle pylon stiffness, with each of three wingtips, a nominal tip, a winglet, and a nominal tip ballasted to simulate the winglet mass. The addition of the winglet substantially reduced the flutter speed of the wing at transonic Mach numbers. The winglet effect was configuration-dependent and was primarily due to winglet aerodynamics rather than mass. Flutter analyses using modified strip-theory aerodynamics (experimentally weighted) correlated reasonably well with test results. The four transonic flutter mechanisms predicted by analysis were obtained experimentally. The analysis satisfactorily predicted the mass-density-ratio effects on subsonic flutter obtained using the low-speed model. Additional analyses were made to determine the flutter sensitivity to several parameters at transonic speeds.

  1. Exploring the general motor ability construct.

    PubMed

    Ibrahim, Halijah; Heard, N Paul; Blanksby, Brian

    2011-10-01

    Malaysian students ages 12 to 15 years (N = 330; 165 girls, 165 boys) took the Australian Institute of Sport Talent Identification Test (AIST) and the Balance and Movement Coordination Test (BMC), developed specifically to identify sport talent in Malaysian adolescents. To investigate evidence for general aptitude ("g") in motor ability, a higher-order factor analysis was applied to the motor skills subtests from the AIST and BMC. First-order principal components analysis indicated that scores for the adolescent boys and girls could be described by similar sets of specific motor abilities. In particular, sets of skills identified as Movement Coordination and Postural Control were found, with Balancing Ability also emerging. For the girls, a factor labeled Static Balance was indicated. However, for the boys a more general balance ability labeled Kinesthetic Integration was found, along with an ability labeled Explosive Power. These first-order analyses accounted for 45% to 60% of the variance in the scores on the motor skills tests for the boys and girls, respectively. Separate second-order factor analyses for the boys and girls extracted a single higher-order factor, which was consistent with the existence of a motoric "g".

  2. Analysis and Test Correlation of Proof of Concept Box for Blended Wing Body-Low Speed Vehicle

    NASA Technical Reports Server (NTRS)

    Spellman, Regina L.

    2003-01-01

    The Low Speed Vehicle (LSV) is a 14.2% scale remotely piloted vehicle of the revolutionary Blended Wing Body concept. The design of the LSV includes an all composite airframe. Due to internal manufacturing capability restrictions, room temperature layups were necessary. An extensive materials testing and manufacturing process development effort was undertaken to establish a process that would achieve the high modulus/low weight properties required to meet the design requirements. The analysis process involved a loads development effort that incorporated aero loads to determine internal forces that could be applied to a traditional FEM of the vehicle and to conduct detailed component analyses. A new tool, Hypersizer, was added to the design process to address various composite failure modes and to optimize the skin panel thickness of the upper and lower skins for the vehicle. The analysis required an iterative approach as material properties were continually changing. As a part of the material characterization effort, test articles, including a proof of concept wing box and a full-scale wing, were fabricated. The proof of concept box was fabricated based on very preliminary material studies and tested in bending, torsion, and shear. The box was then tested to failure under shear. The proof of concept box was also analyzed using Nastran and Hypersizer. The results of both analyses were scaled to determine the predicted failure load. The test results were compared to both the Nastran and Hypersizer analytical predictions. The actual failure occurred at 899 lbs. The failure was predicted at 1167 lbs based on the Nastran analysis. The Hypersizer analysis predicted a lower failure load of 960 lbs. The Nastran analysis alone was not sufficient to predict the failure load because it does not identify local composite failure modes. This analysis has traditionally been done using closed form solutions. Although Hypersizer is typically used as an optimizer for the design process, the failure prediction was used to help gain acceptance and confidence in this new tool. The correlated models and process were to be used to analyze the full BWB-LSV airframe design. The analysis and correlation with test results of the proof of concept box is presented here, including the comparison of the Nastran and Hypersizer results.

  3. Analysis of Operation Dominic II SMALL BOY radiological and meteorological data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinn, V.E.; Kennedy, N.C.; Steadman, C.R.

    1984-08-01

    This report describes the Weather Service Nuclear Support Office (WSNSO) analyses of the radiological and meteorological data collected for the Operation Dominic II nuclear test SMALL BOY. Inconsistencies in the radiological data and their resolution are discussed. The methods of estimating fallout-arrival times are discussed. The meteorological situation on D-day and a few days following are described. A comparison of the fallout patterns resulting from these analyses and earlier (1966) analyses is presented. The radiological data used to derive the fallout pattern in this report are tabulated in an appendix. 11 references, 20 figures.

  4. Scramjet test flow reconstruction for a large-scale expansion tube, Part 2: axisymmetric CFD analysis

    NASA Astrophysics Data System (ADS)

    Gildfind, D. E.; Jacobs, P. A.; Morgan, R. G.; Chan, W. Y. K.; Gollan, R. J.

    2018-07-01

    This paper presents the second part of a study aiming to accurately characterise a Mach 10 scramjet test flow generated using a large free-piston-driven expansion tube. Part 1 described the experimental set-up, the quasi-one-dimensional simulation of the full facility, and the hybrid analysis technique used to compute the nozzle exit test flow properties. The second stage of the hybrid analysis applies the computed 1-D shock tube flow history as an inflow to a high-fidelity two-dimensional-axisymmetric analysis of the acceleration tube. The acceleration tube exit flow history is then applied as an inflow to a further refined axisymmetric nozzle model, providing the final nozzle exit test flow properties and thereby completing the analysis. This paper presents the results of the axisymmetric analyses. These simulations are shown to closely reproduce experimentally measured shock speeds and acceleration tube static pressure histories, as well as nozzle centreline static and impact pressure histories. The hybrid scheme less successfully predicts the diameter of the core test flow; however, this property is readily measured through experimental pitot surveys. In combination, the full test flow history can be accurately determined.

  6. GC-MS and LC-MS analysis of nerve agents in body fluids: intra-laboratory verification test using spiked plasma and urine samples.

    PubMed

    Koller, Marianne; Becker, Christian; Thiermann, Horst; Worek, Franz

    2010-05-15

    The purpose of this study was to check the applicability of different analytical methods for the identification of unknown nerve agents in human body fluids. Plasma and urine samples were spiked with nerve agents (plasma) or with their metabolites (urine) or were left blank. Seven random samples (35% of all samples) were selected for the verification test. Plasma was worked up for unchanged nerve agents and for regenerated nerve agents after fluoride-induced reactivation of nerve agent-inhibited butyrylcholinesterase. Both extracts were analysed by GC-MS. Metabolites were extracted from plasma and urine, respectively, and were analysed by LC-MS. The urinary metabolites and two blank samples could be identified without further measurements, plasma metabolites and blanks were identified in six of seven samples. The analysis of unchanged nerve agent provided five agents/blanks and the sixth agent after further investigation. The determination of the regenerated agents also provided only five clear findings during the first screening because of a rather noisy baseline. Therefore, the sample preparation was extended by a size exclusion step performed before addition of fluoride which visibly reduced baseline noise and thus improved identification of the two missing agents. The test clearly showed that verification should be performed by analysing more than one biomarker to ensure identification of the agent(s). Copyright (c) 2010 Elsevier B.V. All rights reserved.

  7. Correlation of human papilloma virus with oral squamous cell carcinoma in Chinese population.

    PubMed

    Zhou, Jingping; Tao, Detao; Tang, Daofang; Gao, Zhenlin

    2015-01-01

    Previous studies indicated that oral squamous cell carcinomas (OSCC) might be related to human papilloma virus (HPV) infection. However, the relationship between OSCC in a Chinese population and oral HPV infection is still unclear. In this study, we evaluated the relationship of OSCC with HPV infection in a Chinese population via a meta-analysis. Reports on HPV and OSCC in a Chinese population published between January 1994 and October 2015 were retrieved from the CNKI/WANFANG/pubmed databases. According to the inclusion criteria, we selected 26 eligible case-control studies. After testing the heterogeneity of the studies by the Cochran Q test, the meta-analyses for HPV and HPV16 were performed using the random effects model. Quantitative meta-analyses showed that, compared with normal oral mucosa, the combined odds ratio of OSCC with HPV infection was 1.98 (95% CI: 1.34-2.92). The test for overall effect showed that the P value was less than 0.05 (Z = 3.46). Forest plots of these analyses are presented in Figures 2 and 3. Publication bias and risk of bias were assessed using RevMan 5.3 software, and the resulting plots were essentially symmetrical. High incidences of HPV infection were found in the samples of Chinese OSCC. For the Chinese population, HPV infection elevates the risk of OSCC tumorigenesis.

  8. The reliability and validity of the SF-8 with a conflict-affected population in northern Uganda.

    PubMed

    Roberts, Bayard; Browne, John; Ocaka, Kaducu Felix; Oyok, Thomas; Sondorp, Egbert

    2008-12-02

    The SF-8 is a health-related quality of life instrument that could provide a useful means of assessing general physical and mental health amongst populations affected by conflict. The purpose of this study was to test the validity and reliability of the SF-8 with a conflict-affected population in northern Uganda. A cross-sectional multi-staged, random cluster survey was conducted with 1206 adults in camps for internally displaced persons in Gulu and Amuru districts of northern Uganda. Data quality was assessed by analysing the number of incomplete responses to SF-8 items. Response distribution was analysed using aggregate endorsement frequency. Test-retest reliability was assessed in a separate smaller survey using the intraclass correlation test. Construct validity was measured using principal component analysis, and the Pearson Correlation test for item-summary score correlation and inter-instrument correlations. Known groups validity was assessed using a two-sample t-test to evaluate the ability of the SF-8 to discriminate between groups known to have, and not have, physical and mental health problems. The SF-8 showed excellent data quality. It showed acceptable item response distribution based upon analysis of aggregate endorsement frequencies. Test-retest showed a good intraclass correlation of 0.61 for PCS and 0.68 for MCS. The principal component analysis indicated strong construct validity and concurred with the results of the validity tests by the SF-8 developers. The SF-8 also showed strong construct validity between the 8 items and PCS and MCS summary score, moderate inter-instrument validity, and strong known groups validity. This study provides evidence on the reliability and validity of the SF-8 amongst IDPs in northern Uganda.
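
    Two of the checks described (principal component analysis of the items and a known-groups t-test on a summary score) can be sketched as follows. The data are simulated, not the Uganda survey responses, and the official SF-8 scoring algorithm is not reproduced.

      # Hedged sketch of a PCA and a known-groups comparison on simulated
      # item responses (not real SF-8 data or official SF-8 scoring).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      n = 300
      items = rng.normal(size=(n, 8)) + rng.normal(size=(n, 1))   # 8 correlated items

      # Principal components from the item correlation matrix
      eigvals = np.sort(np.linalg.eigvalsh(np.corrcoef(items, rowvar=False)))[::-1]
      print("variance explained by first component: %.1f%%" % (100 * eigvals[0] / 8))

      # Known-groups validity: compare a summary score between respondents
      # with and without a (simulated) reported health problem.
      summary = items.mean(axis=1)
      has_problem = rng.integers(0, 2, size=n).astype(bool)
      summary[has_problem] -= 0.5                                 # induce a group difference
      t, p = stats.ttest_ind(summary[has_problem], summary[~has_problem])
      print("known-groups t = %.2f, p = %.4f" % (t, p))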

  9. The reliability and validity of the SF-8 with a conflict-affected population in northern Uganda

    PubMed Central

    Roberts, Bayard; Browne, John; Ocaka, Kaducu Felix; Oyok, Thomas; Sondorp, Egbert

    2008-01-01

    Background The SF-8 is a health-related quality of life instrument that could provide a useful means of assessing general physical and mental health amongst populations affected by conflict. The purpose of this study was to test the validity and reliability of the SF-8 with a conflict-affected population in northern Uganda. Methods A cross-sectional multi-staged, random cluster survey was conducted with 1206 adults in camps for internally displaced persons in Gulu and Amuru districts of northern Uganda. Data quality was assessed by analysing the number of incomplete responses to SF-8 items. Response distribution was analysed using aggregate endorsement frequency. Test-retest reliability was assessed in a separate smaller survey using the intraclass correlation test. Construct validity was measured using principal component analysis, and the Pearson Correlation test for item-summary score correlation and inter-instrument correlations. Known groups validity was assessed using a two-sample t-test to evaluate the ability of the SF-8 to discriminate between groups known to have, and not have, physical and mental health problems. Results The SF-8 showed excellent data quality. It showed acceptable item response distribution based upon analysis of aggregate endorsement frequencies. Test-retest showed a good intraclass correlation of 0.61 for PCS and 0.68 for MCS. The principal component analysis indicated strong construct validity and concurred with the results of the validity tests by the SF-8 developers. The SF-8 also showed strong construct validity between the 8 items and PCS and MCS summary score, moderate inter-instrument validity, and strong known groups validity. Conclusion This study provides evidence on the reliability and validity of the SF-8 amongst IDPs in northern Uganda. PMID:19055716

  10. Aeroelastic Airworthiness Assessment of the Adaptive Compliant Trailing Edge Flaps

    NASA Technical Reports Server (NTRS)

    Herrera, Claudia Y.; Spivey, Natalie D.; Lung, Shun-fat; Ervin, Gregory; Flick, Peter

    2015-01-01

    The Adaptive Compliant Trailing Edge (ACTE) demonstrator is a joint task under the National Aeronautics and Space Administration Environmentally Responsible Aviation Project in partnership with the Air Force Research Laboratory and FlexSys, Inc. (Ann Arbor, Michigan). The project goal is to develop advanced technologies that enable environmentally friendly aircraft, such as adaptive compliant technologies. The ACTE demonstrator flight-test program encompassed replacing the Fowler flaps on the SubsoniC Aircraft Testbed, a modified Gulfstream III (Gulfstream Aerospace, Savannah, Georgia) aircraft, with control surfaces developed by FlexSys: a pair of uniquely designed unconventional flaps to be used as lifting surfaces during flight-testing to validate their structural effectiveness. The unconventional flaps required a multidisciplinary airworthiness assessment to prove they could withstand the prescribed flight envelope. Several challenges were posed by the large deflections experienced by the structure, which required nonlinear analysis methods. The aeroelastic assessment necessitated both conventional and extensive testing and analysis methods. A series of ground vibration tests (GVTs) was conducted to provide modal characteristics to validate and update the finite element models (FEMs) used for the flutter analyses for a subset of the various flight configurations. Numerous FEMs were developed using data from FlexSys and the ground tests. The flap FEMs were then attached to the aircraft model to generate a combined FEM that could be analyzed for aeroelastic instabilities. The aeroelastic analysis results showed that the combined system of aircraft and flaps was predicted to have the required flutter margin to successfully demonstrate the adaptive compliant technology. This paper documents the details of the aeroelastic airworthiness assessment, including the ground testing and analyses, and the subsequent flight-testing performed on the unconventional ACTE flaps.

  11. [Comparison of simple pooling and bivariate model used in meta-analyses of diagnostic test accuracy published in Chinese journals].

    PubMed

    Huang, Yuan-sheng; Yang, Zhi-rong; Zhan, Si-yan

    2015-06-18

    To investigate the use of simple pooling and the bivariate model in meta-analyses of diagnostic test accuracy (DTA) published in Chinese journals (January to November, 2014), compare the differences in results from these two models, and explore the impact of between-study variability of sensitivity and specificity on the differences. DTA meta-analyses were searched through the Chinese Biomedical Literature Database (January to November, 2014). Details of the models and the fourfold table data were extracted. Descriptive analysis was conducted to investigate the prevalence of the simple pooling method and the bivariate model in the included literature. Data were re-analyzed with the two models respectively. Differences in the results were examined by the Wilcoxon signed rank test. How the differences in results were affected by between-study variability of sensitivity and specificity, expressed by I², was explored. A total of 55 systematic reviews, containing 58 DTA meta-analyses, were included, and 25 DTA meta-analyses were eligible for re-analysis. Simple pooling was used in 50 (90.9%) systematic reviews and the bivariate model in 1 (1.8%). The remaining 4 (7.3%) articles used other models for pooling sensitivity and specificity or pooled neither of them. Of the reviews simply pooling sensitivity and specificity, 41 (82.0%) were at risk of wrongly using the Meta-disc software. The differences in medians of sensitivity and specificity between the two models were both 0.011 (P<0.001, P=0.031 respectively). Greater differences could be found as I² of sensitivity or specificity became larger, especially when I²>75%. Most DTA meta-analyses published in Chinese journals (January to November, 2014) combine the sensitivity and specificity by simple pooling. Meta-disc software can pool the sensitivity and specificity only through a fixed-effect model, but a high proportion of authors think it can implement a random-effect model. Simple pooling tends to underestimate the results compared with the bivariate model. The greater the between-study variance is, the larger the deviation of simple pooling is likely to be. It is necessary to increase the knowledge level of statistical methods and software for meta-analyses of DTA data.
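
    The contrast at issue can be made concrete with a toy calculation: "simple pooling" averages study-level sensitivities and specificities directly, whereas even univariate inverse-variance pooling on the logit scale weights studies by precision, and a full bivariate model would additionally estimate the between-study correlation of logit(Se) and logit(Sp). The 2x2 counts below are invented, and the bivariate model itself (a generalized linear mixed model) is not fitted here.

      # Toy comparison of simple pooling with univariate logit-scale pooling
      # of sensitivity and specificity (made-up counts, not the reviewed data).
      import numpy as np

      # Per study: true positives, false negatives, true negatives, false positives
      studies = [(45, 5, 80, 20), (30, 15, 60, 10), (55, 20, 90, 30)]
      tp, fn, tn, fp = (np.array(x, float) for x in zip(*studies))
      sens, spec = tp / (tp + fn), tn / (tn + fp)

      print("simple pooled Se = %.3f, Sp = %.3f" % (sens.mean(), spec.mean()))

      def pool_logit(p_hat, successes, failures):
          """Inverse-variance pooling of proportions on the logit scale."""
          logit = np.log(p_hat / (1 - p_hat))
          w = 1.0 / (1.0 / successes + 1.0 / failures)
          pooled = np.average(logit, weights=w)
          return 1.0 / (1.0 + np.exp(-pooled))

      print("logit-pooled Se = %.3f" % pool_logit(sens, tp, fn))
      print("logit-pooled Sp = %.3f" % pool_logit(spec, tn, fp))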

  12. Design, fracture control, fabrication, and testing of pressurized space-vehicle structures

    NASA Technical Reports Server (NTRS)

    Babel, H. W.; Christensen, R. H.; Dixon, H. H.

    1974-01-01

    The relationship between analysis, design, fabrication, and testing of thin shells is illustrated by Saturn S-IVB, Thor, Delta, and other single-use and reusable large-size cryogenic aluminum tankage. The analyses and design to meet the design requirements are reviewed and include consideration of fracture control, general instability, and other failure modes. The effect of research and development testing on the structure is indicated. It is shown how fabrication and nondestructive and acceptance testing constrain the design. Finally, qualification testing is reviewed to illustrate the extent of testing used to develop the Saturn S-IVB.

  13. Economics of Pharmacogenetic-Guided Treatments: Underwhelming or Overstated?

    PubMed

    Hughes, Dyfrig A

    2018-05-01

    Economic evaluations have dispelled a perception that precision medicine, achieved through pharmacogenetic testing, reduces healthcare costs. For many tests aimed at preventing adverse drug reactions, cost-effectiveness analyses predict modest improvements in health benefits and increases in total costs. While there are many uncertainties in estimating the value of testing, factors that influence cost-effectiveness include the rarity of the outcome, the effectiveness of alternative treatments, and the scope and perspective of analysis. © 2018 ASCPT.
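
    The kind of result summarised above (modest health gain, higher cost) is usually expressed as an incremental cost-effectiveness ratio. The figures in the sketch below are invented for illustration and are not taken from any cited evaluation.

      # Toy incremental cost-effectiveness ratio (ICER) calculation;
      # all numbers are placeholders, not results from the review.
      cost_guided = 1250.0     # mean cost per patient, genotype-guided prescribing
      cost_usual  = 1050.0     # mean cost per patient, usual care
      qaly_guided = 8.27       # mean quality-adjusted life-years
      qaly_usual  = 8.25

      icer = (cost_guided - cost_usual) / (qaly_guided - qaly_usual)
      print(f"ICER = {icer:,.0f} per QALY gained")   # compare with a willingness-to-pay threshold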

  14. Models and Estimation Procedures for the Analysis of Subjects-by-Items Data Arrays.

    DTIC Science & Technology

    1982-06-30

    Conclusions and recommendations: The usefulness of Tukey’s model for model-based psychological testing is probably greatest for analyses of responses which are…

  15. Transforming Parent-Child Interaction in Family Routines: Longitudinal Analysis with Families of Children with Developmental Disabilities.

    PubMed

    Lucyshyn, Joseph M; Fossett, Brenda; Bakeman, Roger; Cheremshynski, Christy; Miller, Lynn; Lohrmann, Sharon; Binnendyk, Lauren; Khan, Sophia; Chinn, Stephen; Kwon, Samantha; Irvin, Larry K

    2015-12-01

    The efficacy and consequential validity of an ecological approach to behavioral intervention with families of children with developmental disabilities was examined. The approach aimed to transform coercive into constructive parent-child interaction in family routines. Ten families participated, including 10 mothers and fathers and 10 children 3-8 years old with developmental disabilities. Thirty-six family routines were selected (2 to 4 per family). Dependent measures included child problem behavior, routine steps completed, and coercive and constructive parent-child interaction. For each family, a single case, multiple baseline design was employed with three phases: baseline, intervention, and follow-up. Visual analysis evaluated the functional relation between intervention and improvements in child behavior and routine participation. Nonparametric tests across families evaluated the statistical significance of these improvements. Sequential analyses within families and univariate analyses across families examined changes from baseline to intervention in the percentage and odds ratio of coercive and constructive parent-child interaction. Multiple baseline results documented functional or basic effects for 8 of 10 families. Nonparametric tests showed these changes to be significant. Follow-up showed durability at 11 to 24 months postintervention. Sequential analyses documented the transformation of coercive into constructive processes for 9 of 10 families. Univariate analyses across families showed significant improvements in 2- and 4-step coercive and constructive processes but not in odds ratio. Results offer evidence of the efficacy of the approach and consequential validity of the ecological unit of analysis, parent-child interaction in family routines. Future studies should improve efficiency, and outcomes for families experiencing family systems challenges.

  16. An approach to trial design and analysis in the era of non-proportional hazards of the treatment effect.

    PubMed

    Royston, Patrick; Parmar, Mahesh K B

    2014-08-07

    Most randomized controlled trials with a time-to-event outcome are designed and analysed under the proportional hazards assumption, with a target hazard ratio for the treatment effect in mind. However, the hazards may be non-proportional. We address how to design a trial under such conditions, and how to analyse the results. We propose to extend the usual approach, a logrank test, to also include the Grambsch-Therneau test of proportional hazards. We test the resulting composite null hypothesis using a joint test for the hazard ratio and for time-dependent behaviour of the hazard ratio. We compute the power and sample size for the logrank test under proportional hazards, and from that we compute the power of the joint test. For the estimation of relevant quantities from the trial data, various models could be used; we advocate adopting a pre-specified flexible parametric survival model that supports time-dependent behaviour of the hazard ratio. We present the mathematics for calculating the power and sample size for the joint test. We illustrate the methodology in real data from two randomized trials, one in ovarian cancer and the other in treating cellulitis. We show selected estimates and their uncertainty derived from the advocated flexible parametric model. We demonstrate in a small simulation study that when a treatment effect either increases or decreases over time, the joint test can outperform the logrank test in the presence of both patterns of non-proportional hazards. Those designing and analysing trials in the era of non-proportional hazards need to acknowledge that a more complex type of treatment effect is becoming more common. Our method for the design of the trial retains the tools familiar in the standard methodology based on the logrank test, and extends it to incorporate a joint test of the null hypothesis with power against non-proportional hazards. For the analysis of trial data, we propose the use of a pre-specified flexible parametric model that can represent a time-dependent hazard ratio if one is present.
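
    The essence of the proposed joint test is to supplement the usual logrank statistic with a statistic for time-dependent behaviour of the hazard ratio and refer their sum to a chi-square distribution with 2 degrees of freedom. The sketch below shows only that combination step with placeholder component statistics; the paper's construction of the components (logrank plus the Grambsch-Therneau test) and its power calculations are more involved.

      # Hedged sketch of combining two 1-df statistics into a joint 2-df test.
      from scipy import stats

      chi2_logrank = 3.1   # placeholder: statistic for the average treatment effect
      chi2_ph      = 4.6   # placeholder: statistic for non-proportional hazards

      chi2_joint = chi2_logrank + chi2_ph
      p_joint = stats.chi2.sf(chi2_joint, df=2)
      print("joint chi-square = %.2f on 2 df, p = %.4f" % (chi2_joint, p_joint))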

  17. TESTING AND PERFORMANCE ANALYSIS OF NASA 5 CM BY 5 CM BI-SUPPORTED SOLID OXIDE ELECTROLYSIS CELLS OPERATED IN BOTH FUEL CELL AND STEAM ELECTROLYSIS MODES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. C. O'Brien; J. E. O'Brien; C. M. Stoots

    A series of 5 cm by 5 cm bi-supported Solid Oxide Electrolysis Cells (SOEC) were produced by NASA for the Idaho National Laboratory (INL) and tested under the INL High Temperature Steam Electrolysis program. The results from the experimental demonstration of cell operation for both hydrogen production and operation as fuel cells are presented. An overview of the cell technology, test apparatus and performance analysis is also provided. The INL High Temperature Steam Electrolysis laboratory has developed significant test infrastructure in support of single cell and stack performance analyses. An overview of the single cell test apparatus is presented. The test data presented in this paper are representative of a first batch of NASA's prototypic 5 cm by 5 cm SOEC single cells. A significant relationship between the operational current density and cell degradation rate is clearly evident. While the performance of these cells was lower than anticipated, in-house testing at NASA Glenn has yielded significantly higher performance and lower degradation rates with subsequent production batches of cells. Post-test microstructure analyses of the cells tested at INL are currently under way and will be published in a future paper. Cell compositions and cell reduction techniques will be modified in the next series of cells to be delivered to INL, with the aim of decreasing the cell degradation rate while allowing higher operational current densities to be sustained. Results from the testing of new batches of single cells will be presented in a future paper.

  18. Factors affecting behavioral changes in response to road fees : some analyses of the effect of attitudes, transit access and fuel efficiency on changes in miles driven, final report, February 2009.

    DOT National Transportation Integrated Search

    2009-02-01

    The Oregon Department of Transportation tested a system to collect a vehicle-based mileage fee as a replacement for the Oregon gas tax. This project reports on additional analysis of the data from that experiment. Subjects include analysis of the c...

  19. Exponential approximations in optimal design

    NASA Technical Reports Server (NTRS)

    Belegundu, A. D.; Rajan, S. D.; Rajgopal, J.

    1990-01-01

    One-point and two-point exponential functions have been developed and proved to be very effective approximations of structural response. The exponential approximation has been compared to the linear, reciprocal, and quadratic fit methods. Four test problems in structural analysis have been selected. The use of such approximations is attractive in structural optimization to reduce the number of exact analyses, which involve computationally expensive finite element analysis.
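
    A single-variable illustration of an exponential (power-law type) approximation, written in a common textbook form rather than necessarily the exact scheme of the cited paper, is sketched below: the exponent is fitted from gradients at two points, and the approximation then reproduces a typical stress-like response much better than a linear expansion.

      # Hedged one-variable sketch of a two-point exponential approximation
      # of a structural response g(x) = 1/x**3 (a generic stress-like example).
      import numpy as np

      g  = lambda x: x ** -3
      dg = lambda x: -3.0 * x ** -4

      x0, x1 = 1.0, 1.2                                          # two design points
      p = 1.0 + np.log(dg(x1) / dg(x0)) / np.log(x1 / x0)        # fitted exponent

      def g_exp(x):
          """Exponential approximation matching g(x0), dg(x0) and dg(x1)."""
          return g(x0) + dg(x0) * (x0 / p) * ((x / x0) ** p - 1.0)

      def g_lin(x):
          return g(x0) + dg(x0) * (x - x0)

      for x in (1.1, 1.4, 1.8):
          print(f"x={x:.1f}  exact={g(x):.4f}  exponential={g_exp(x):.4f}  linear={g_lin(x):.4f}")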

  20. 40 CFR 63.7550 - What reports must I submit and when?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... tests for affected sources subject to an emission limit, a summary of any fuel analyses associated with... analysis). If you burned a new type of fuel and are subject to a mercury emission limit, you must submit... through fuel analysis). (7) If you wish to burn a new type of fuel in an affected source subject to an...

  1. Accuracy of Revised and Traditional Parallel Analyses for Assessing Dimensionality with Binary Data

    ERIC Educational Resources Information Center

    Green, Samuel B.; Redell, Nickalus; Thompson, Marilyn S.; Levy, Roy

    2016-01-01

    Parallel analysis (PA) is a useful empirical tool for assessing the number of factors in exploratory factor analysis. On conceptual and empirical grounds, we argue for a revision to PA that makes it more consistent with hypothesis testing. Using Monte Carlo methods, we evaluated the relative accuracy of the revised PA (R-PA) and traditional PA…
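
    The traditional procedure that R-PA revises can be sketched in a few lines: compare the observed eigenvalues of the item correlation matrix with eigenvalues obtained from random data of the same size, and retain factors whose observed eigenvalues exceed the random-data threshold. The simulation below uses Pearson correlations on simulated binary items purely for illustration and does not implement the revised PA itself.

      # Sketch of traditional (Horn-type) parallel analysis on simulated data.
      import numpy as np

      rng = np.random.default_rng(1)
      n_obs, n_items, n_reps = 500, 10, 200

      theta = rng.normal(size=(n_obs, 1))                         # one latent factor
      data = (rng.normal(size=(n_obs, n_items)) + theta < 0.5).astype(float)  # binary items

      def eigenvalues(x):
          return np.sort(np.linalg.eigvalsh(np.corrcoef(x, rowvar=False)))[::-1]

      observed = eigenvalues(data)
      random_eigs = np.array([eigenvalues(rng.normal(size=(n_obs, n_items)))
                              for _ in range(n_reps)])
      threshold = np.percentile(random_eigs, 95, axis=0)          # 95th-percentile criterion

      print("retained factors:", int(np.sum(observed > threshold)))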

  2. A Theoretical Analysis of the Performance of Learning Disabled Students on the Woodcock-Johnson Psycho-Educational Battery.

    ERIC Educational Resources Information Center

    Shinn, Mark; And Others

    Two studies were conducted to (1) analyze the subtest characteristics of the Woodcock-Johnson Psycho-Educational Battery, and (2) apply those results to an analysis of 50 fourth grade learning disabled (LD) students' performance on the Battery. Analyses indicated that the poorer performance of LD students on the Woodcock-Johnson Tests of Cognitive…

  3. The Bicycle Drawing Test: What Does It Measure in Developmentally Typical Children?

    PubMed

    Cannoni, Eleonora; Di Norcia, Anna; Bombi, Anna Silvia; Di Giunta, Laura

    2015-10-01

    To verify the dimensionality of Bicycle Drawing Test (BDT), we applied the coding system of Greenberg, Rodriguez, and Sesta to bicycle drawings made by 295 boys and 320 girls (6-10 years old) with typical development, and submitted the data to item analysis, exploratory factor analysis, and confirmatory factor analysis. These analyses confirmed only two of the original four dimensions of the BDT: spatial reasoning and visual-motor control. The scores in these two factors were correlated with the Colored Progressive Matrices, the Rey Complex Figure (Copy and Memory) and with the teachers' ratings in mathematics, language, and drawing. The correlations, albeit moderate in magnitude, were consistent with the hypothesized convergent and discriminant validity. After checking for measurement invariance across gender and age, we conducted two analyses of variance, the first of which showed a significant difference between younger children (6-8 years old) and older children (9-10 years old); the analysis of variance by gender did not yield significant differences. These data enhance the usefulness of the BDT as a measure of separate cognitive components, but do not support its use as a measure of mechanical reasoning. © The Author(s) 2014.

  4. Emotional and cognitive effects of peer tutoring among secondary school mathematics students

    NASA Astrophysics Data System (ADS)

    Alegre Ansuategui, Francisco José; Moliner Miravet, Lidón

    2017-11-01

    This paper describes an experience of same-age peer tutoring conducted with 19 eighth-grade mathematics students in a secondary school in Castellon de la Plana (Spain). Three constructs were analysed before and after launching the program: academic performance, mathematics self-concept and attitude of solidarity. Students' perceptions of the method were also analysed. The quantitative data was gathered by means of a mathematics self-concept questionnaire, an attitude of solidarity questionnaire and the students' numerical ratings. A statistical analysis was performed using Student's t-test. The qualitative information was gathered by means of discussion groups and a field diary. This information was analysed using descriptive analysis and by categorizing the information. Results show statistically significant improvements in all the variables and the positive assessment of the experience and the interactions that took place between the students.
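
    The pre/post comparison described can be reproduced in outline with a paired Student's t-test; the scores below are invented and are not the Castellon data.

      # Minimal sketch of a paired (pre/post) t-test for 19 students.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      pre = np.array([5.1, 6.0, 4.8, 7.2, 5.5, 6.3, 4.9, 6.8, 5.7, 6.1,
                      5.0, 6.4, 4.7, 7.0, 5.9, 6.2, 5.3, 6.6, 5.8])
      post = pre + rng.normal(0.6, 0.4, size=pre.size)            # simulated gains

      t, p = stats.ttest_rel(post, pre)
      print(f"paired t = {t:.2f}, p = {p:.4f}, mean gain = {np.mean(post - pre):.2f}")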

  5. Luminol testing in detecting modern human skeletal remains: a test on different types of bone tissue and a caveat for PMI interpretation.

    PubMed

    Caudullo, Giorgio; Caruso, Valentina; Cappella, Annalisa; Sguazza, Emanuela; Mazzarelli, Debora; Amadasi, Alberto; Cattaneo, Cristina

    2017-01-01

    When forensic pathologists and anthropologists have to deal with the evaluation of the post-mortem interval (PMI) in skeletal remains, luminol testing is frequently performed as a preliminary screening method. However, the repeatability of this test on the same bone, as well as comparative studies on different bones of the same individual, has never been assessed. Therefore, with the aim of investigating the influence that different types of bones may exert on the response to the luminol test, the present study analysed three different skeletal elements (femoral diaphysis, vertebra and cranial vault) gathered from ten recently exhumed skeletons (all with a 20-year PMI). The analysis was performed twice on the same bone, 2 months apart: the analysis at time 0 concerned the whole bone, whereas the second concerned only a part of the same bone taken during the first test (which had already been broken). The overall results showed different responses, depending on the type of bone and on the integrity of the samples. Negative results at the first analysis (6.6% of all samples) are consistent with what is reported in the literature, whilst the increase of about 20% in false-negative results at the second analysis highlights that the luminol test ought to be performed with caution in the case of broken bones or elements that are taphonomically altered. Results have thus shown that exposure to environmental agents might result in haemoglobin (Hb) loss, as detected even after only 2 months. The study also focused on the crucial issue of the type of bone subjected to testing, highlighting the suitability of the femoral diaphysis (100% positive responses at the first analysis vs only 18% false-negative results at the second test, corresponding to 5% of total false-negative results) as opposed to other bone elements that showed a low yield. In particular, the cranial vault gave poor results, with a 40% discrepancy between the results of the two analyses, which suggests caution in choosing the type of bone sample to test. In conclusion, luminol testing should be used with caution on bones other than long bones or on non-intact bones.

  6. Experimental validation of solid rocket motor damping models

    NASA Astrophysics Data System (ADS)

    Riso, Cristina; Fransen, Sebastiaan; Mastroddi, Franco; Coppotelli, Giuliano; Trequattrini, Francesco; De Vivo, Alessio

    2017-12-01

    In design and certification of spacecraft, payload/launcher coupled load analyses are performed to simulate the satellite dynamic environment. To obtain accurate predictions, the system damping properties must be properly taken into account in the finite element model used for coupled load analysis. This is typically done using a structural damping characterization in the frequency domain, which is not applicable in the time domain. Therefore, the structural damping matrix of the system must be converted into an equivalent viscous damping matrix when a transient coupled load analysis is performed. This paper focuses on the validation of equivalent viscous damping methods for dynamically condensed finite element models via correlation with experimental data for a realistic structure representative of a slender launch vehicle with solid rocket motors. A second scope of the paper is to investigate how to conveniently choose a single combination of Young's modulus and structural damping coefficient—complex Young's modulus—to approximate the viscoelastic behavior of a solid propellant material in the frequency band of interest for coupled load analysis. A scaled-down test article inspired to the Z9-ignition Vega launcher configuration is designed, manufactured, and experimentally tested to obtain data for validation of the equivalent viscous damping methods. The Z9-like component of the test article is filled with a viscoelastic material representative of the Z9 solid propellant that is also preliminarily tested to investigate the dependency of the complex Young's modulus on the excitation frequency and provide data for the test article finite element model. Experimental results from seismic and shock tests performed on the test configuration are correlated with numerical results from frequency and time domain analyses carried out on its dynamically condensed finite element model to assess the applicability of different equivalent viscous damping methods to describe damping properties of slender launch vehicles in payload/launcher coupled load analysis.
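
    Two of the ideas in the abstract can be written down compactly: a complex Young's modulus E* = E(1 + i*eta) for the viscoelastic propellant, and the usual resonance equivalence zeta = eta/2 used when a structural damping coefficient must be converted to viscous damping for transient analysis. The numbers below are placeholders, not Vega/Z9 propellant properties, and practical equivalent-viscous-damping methods for condensed models involve more than this single-mode conversion.

      # Back-of-the-envelope sketch: complex modulus and an equivalent viscous
      # damping ratio for one mode (placeholder values, not the test article).
      import numpy as np

      E, eta = 5.0e6, 0.30                 # storage modulus [Pa], loss factor [-]
      E_complex = E * (1.0 + 1j * eta)
      print("complex modulus:", E_complex, "Pa, loss angle = %.1f deg"
            % np.degrees(np.angle(E_complex)))

      f_n = 45.0                           # modal natural frequency [Hz]
      zeta_eq = eta / 2.0                  # equal energy dissipation at resonance
      c_over_m = 2.0 * zeta_eq * (2.0 * np.pi * f_n)   # modal damping per unit mass [1/s]
      print(f"zeta_eq = {zeta_eq:.3f}, c/m = {c_over_m:.1f} 1/s")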

  7. Experimental validation of solid rocket motor damping models

    NASA Astrophysics Data System (ADS)

    Riso, Cristina; Fransen, Sebastiaan; Mastroddi, Franco; Coppotelli, Giuliano; Trequattrini, Francesco; De Vivo, Alessio

    2018-06-01

    In design and certification of spacecraft, payload/launcher coupled load analyses are performed to simulate the satellite dynamic environment. To obtain accurate predictions, the system damping properties must be properly taken into account in the finite element model used for coupled load analysis. This is typically done using a structural damping characterization in the frequency domain, which is not applicable in the time domain. Therefore, the structural damping matrix of the system must be converted into an equivalent viscous damping matrix when a transient coupled load analysis is performed. This paper focuses on the validation of equivalent viscous damping methods for dynamically condensed finite element models via correlation with experimental data for a realistic structure representative of a slender launch vehicle with solid rocket motors. A second scope of the paper is to investigate how to conveniently choose a single combination of Young's modulus and structural damping coefficient—complex Young's modulus—to approximate the viscoelastic behavior of a solid propellant material in the frequency band of interest for coupled load analysis. A scaled-down test article inspired to the Z9-ignition Vega launcher configuration is designed, manufactured, and experimentally tested to obtain data for validation of the equivalent viscous damping methods. The Z9-like component of the test article is filled with a viscoelastic material representative of the Z9 solid propellant that is also preliminarily tested to investigate the dependency of the complex Young's modulus on the excitation frequency and provide data for the test article finite element model. Experimental results from seismic and shock tests performed on the test configuration are correlated with numerical results from frequency and time domain analyses carried out on its dynamically condensed finite element model to assess the applicability of different equivalent viscous damping methods to describe damping properties of slender launch vehicles in payload/launcher coupled load analysis.

  8. Nickel-Hydrogen Cell Testing Experience, NASA/Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Rao, Gopalakrishna M.

    1999-01-01

    The objectives of the project were to test the nickel-hydrogen cell in order to: (1) verify aerospace cell flight worthiness, (2) elucidate aerospace cell thermal behavior, (3) develop aerospace battery assembly design(s) and in-orbit battery management plan(s), and (4) understand aerospace cell failure mechanisms. The tests included LEO and GEO life-cycle tests, calorimetric analysis, destructive physical analysis, and special tests. Charts show the mission profile cycling data and stress cycling data. The test data comply with the mission requirements, validating the flight worthiness of the batteries. The nominal stress and mission profile cycling performance tests show charge voltages as high as 1.60 V and a recharge ratio greater than 1.05. It is apparent that the electrochemical signatures alone do not provide conclusive proof of nickel precharge. The researchers recommend gas and positive-plate analyses for further confirmation.

  9. Tank 241-AY-101 Privatization Push Mode Core Sampling and Analysis Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEMPLETON, A.M.

    2000-01-12

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AY-101. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AY-101 required to satisfy Data Quality Objectives For RPP Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For High-Level Waste Feed Batch X (HLW DQO) (Nguyen 1999a), Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO) (Nguyen 1999b), Low Activity Waste and High-Level Waste Feed Data Quality Objectives (L and H DQO) (Patello et al. 1999), and Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO) (Bloom 1996). Special instructions regarding support to the LAW and HLW DQOs are provided by Baldwin (1999). Push mode core samples will be obtained from risers 15G and 150 to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples; composite the liquids and solids; perform chemical analyses on composite and segment samples; archive half-segment samples; and provide subsamples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AY-101 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plans and are not within the scope of this SAP.

  10. Tank 241-AY-101 Privatization Push Mode Core Sampling and Analysis Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEMPLETON, A.M.

    2000-05-19

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AY-101. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AY-101 required to satisfy ''Data Quality Objectives For RPP Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For High-Level Waste Feed Batch X (HLW DQO)'' (Nguyen 1999a), ''Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO)'' (Nguyen 1999b), ''Low Activity Waste and High-Level Waste Feed Data Quality Objectives (L&H DQO)'' (Patello et al. 1999), and ''Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO)'' (Bloom 1996). Special instructions regarding support to the LAW and HLW DQOs are provided by Baldwin (1999). Push mode core samples will be obtained from risers 15G and 150 to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples; composite the liquids and solids; perform chemical analyses on composite and segment samples; archive half-segment samples; and provide sub-samples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AY-101 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plans and are not within the scope of this SAP.

  11. Behavioural changes, sharing behaviour and psychological responses after receiving direct-to-consumer genetic test results: a systematic review and meta-analysis.

    PubMed

    Stewart, Kelly F J; Wesselius, Anke; Schreurs, Maartje A C; Schols, Annemie M W J; Zeegers, Maurice P

    2018-01-01

    It has been hypothesised that direct-to-consumer genetic tests (DTC-GTs) could stimulate health behaviour change. However, genetic testing may also lead to anxiety and distress or unnecessarily burden the health care system. The aim is to review and meta-analyse the effects of DTC-GT on (1) behaviour change, (2) psychological response and (3) medical consumption. A systematic literature search was performed in three databases, using "direct-to-consumer genetic testing" as a key search term. Random effects meta-analyses were performed when at least two comparable outcomes were available. After selection, 19 articles were included involving 11 unique studies. Seven studies involved actual consumers who paid the retail price, whereas four included participants who received free genetic testing as part of a research trial (non-actual consumers). In meta-analysis, 23% had a positive lifestyle change. More specifically, improved dietary and exercise practices were both reported by 12%, whereas 19% quit smoking. Seven percent of participants had subsequent preventive checks. Thirty-three percent shared their results with any health care professional and 50% with family and/or friends. Sub-analyses show that behaviour change was more prevalent among non-actual consumers, whereas sharing was more prevalent among actual consumers. Results on psychological responses showed that anxiety, distress and worry were low or absent and that the effect faded with time. DTC-GT has potential to be effective as a health intervention, but the right audience needs to be addressed with tailored follow-up. Research is needed to identify consumers who do and do not change behaviour or experience adverse psychological responses.

  12. Power of mental health nursing research: a statistical analysis of studies in the International Journal of Mental Health Nursing.

    PubMed

    Gaskin, Cadeyrn J; Happell, Brenda

    2013-02-01

    Having sufficient power to detect effect sizes of an expected magnitude is a core consideration when designing studies in which inferential statistics will be used. The main aim of this study was to investigate the statistical power in studies published in the International Journal of Mental Health Nursing. From volumes 19 (2010) and 20 (2011) of the journal, studies were analysed for their power to detect small, medium, and large effect sizes, according to Cohen's guidelines. The power of the 23 studies included in this review to detect small, medium, and large effects was 0.34, 0.79, and 0.94, respectively. In 90% of papers, no adjustments for experiment-wise error were reported. With a median of nine inferential tests per paper, the mean experiment-wise error rate was 0.51. A priori power analyses were only reported in 17% of studies. Although effect sizes for correlations and regressions were routinely reported, effect sizes for other tests (χ²-tests, t-tests, ANOVA/MANOVA) were largely absent from the papers. All types of effect sizes were infrequently interpreted. Researchers are strongly encouraged to conduct power analyses when designing studies, and to avoid scattergun approaches to data analysis (i.e. undertaking large numbers of tests in the hope of finding 'significant' results). Because reviewing effect sizes is essential for determining the clinical significance of study findings, researchers would better serve the field of mental health nursing if they reported and interpreted effect sizes. © 2012 The Authors. International Journal of Mental Health Nursing © 2012 Australian College of Mental Health Nurses Inc.
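
    The two quantities at the centre of the paper, statistical power for a given effect size and the experiment-wise error rate implied by multiple tests, can be illustrated as follows (using statsmodels and assuming its TTestIndPower interface; the group size of 50 is arbitrary).

      # Power for Cohen's small/medium/large effects, and the probability of
      # at least one false positive across several independent tests.
      from statsmodels.stats.power import TTestIndPower

      analysis = TTestIndPower()
      for label, d in [("small", 0.2), ("medium", 0.5), ("large", 0.8)]:
          power = analysis.power(effect_size=d, nobs1=50, alpha=0.05)
          print(f"{label} effect (d = {d}): power with n = 50 per group = {power:.2f}")

      m, alpha = 9, 0.05                   # e.g. nine tests in one paper
      print("experiment-wise error rate:", round(1 - (1 - alpha) ** m, 2))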

  13. Effect of Measured Welding Residual Stresses on Crack Growth

    NASA Technical Reports Server (NTRS)

    Hampton, Roy W.; Nelson, Drew; Doty, Laura W. (Technical Monitor)

    1998-01-01

    Welding residual stresses in thin plate A516-70 steel and 2219-T87 aluminum butt weldments were measured by the strain-gage hole drilling and X-ray diffraction methods. The residual stress data were used to construct 3D strain fields which were modeled as thermally induced strains. These 3D strain fields were then analyzed with the WARP3D FEM fracture analysis code in order to predict their effect on fatigue and on fracture. For the analyses of fatigue crack advance and the subsequent verification testing, fatigue crack growth increments were simulated by successive saw-cuts and incremental loading to generate, as a function of crack length, the effects of the interaction between residual stresses and load-induced stresses on crack growth. The experimental response of the specimens was characterized and compared to the WARP3D linear elastic and elastic-plastic fracture mechanics analysis predictions. To perform the fracture analysis, the plate material's crack tearing resistance was determined by tests of thin plate M(T) specimens. Fracture analyses of these specimens were performed using WARP3D to determine the critical Crack Tip Opening Angle [CTOA] of each material. These critical CTOA values were used to predict crack tearing and fracture in the weldments. To verify the fracture predictions, weldment M(T) specimens were tested in monotonic loading to fracture while characterizing the fracture process.

  14. Expanding the enablement framework and testing an evaluative instrument for diabetes patient education.

    PubMed

    Leeseberg Stamler, L; Cole, M M; Patrick, L J

    2001-08-01

    Strategies to delay or prevent complications from diabetes include diabetes patient education. Diabetes educators seek to provide education that meets the needs of clients and influences positive health outcomes. (1) To expand prior research exploring an enablement framework for patient education by examining perceptions of patient education by persons with diabetes and (2) to test the mastery of stress instrument (MSI) as a potential evaluative instrument for patient education. Triangulated data collection with a convenience sample of adults taking diabetes education classes. Half the sample completed audio-taped semi-structured interviews pre-, during and post-education, and all completed the MSI post-education. Qualitative data were analysed using latent content analysis, and descriptive statistics were computed. Qualitative analysis revealed content categories similar to previous work with prenatal participants, supporting the enablement framework. Statistical analyses noted congruence with psychometric findings from the development of the MSI; secondary qualitative analyses revealed congruency between MSI scores and patient perceptions. Mastery is an outcome congruent with the enablement framework for patient education across content areas. The mastery of stress instrument may be an instrument for identifying patients who are coping well with diabetes self-management, as well as those who are not and who require further nursing interventions.

  15. Gait patterns for crime fighting: statistical evaluation

    NASA Astrophysics Data System (ADS)

    Sulovská, Kateřina; Bělašková, Silvie; Adámek, Milan

    2013-10-01

    Criminality has been omnipresent throughout human history. Modern technology brings novel opportunities for the identification of a perpetrator. One of these opportunities is the analysis of video recordings, which may be taken during the crime itself or before/after the crime. Video analysis can be classed as an identification analysis, that is, identification of a person via external characteristics. Analysis of bipedal locomotion focuses on human movement on the basis of anatomical and physiological features. Nowadays, human gait is tested by many laboratories to learn whether identification via bipedal locomotion is possible or not. The aim of our study is to use 2D components out of 3D data from the VICON Mocap system for deep statistical analyses. This paper introduces recent results of a fundamental study focused on various gait patterns under different conditions. The study contains data from 12 participants. Curves obtained from these measurements were sorted, averaged and statistically tested to estimate the stability and distinctiveness of this biometric. Results show satisfactory distinctness of some chosen points, while others do not show significant differences. However, the results presented in this paper are from the initial phase of deeper and more exacting analyses of gait patterns under different conditions.

  16. Analysis of categorical moderators in mixed-effects meta-analysis: Consequences of using pooled versus separate estimates of the residual between-studies variances.

    PubMed

    Rubio-Aparicio, María; Sánchez-Meca, Julio; López-López, José Antonio; Botella, Juan; Marín-Martínez, Fulgencio

    2017-11-01

    Subgroup analyses allow us to examine the influence of a categorical moderator on the effect size in meta-analysis. We conducted a simulation study using a dichotomous moderator, and compared the impact of pooled versus separate estimates of the residual between-studies variance on the statistical performance of the Q_B(P) and Q_B(S) tests for subgroup analyses assuming a mixed-effects model. Our results suggested that similar performance can be expected as long as there are at least 20 studies and these are approximately balanced across categories. Conversely, when subgroups were unbalanced, the practical consequences of having heterogeneous residual between-studies variances were more evident, with both tests leading to the wrong statistical conclusion more often than in the conditions with balanced subgroups. A pooled estimate should be preferred for most scenarios, unless the residual between-studies variances are clearly different and there are enough studies in each category to obtain precise separate estimates. © 2017 The British Psychological Society.
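
    The comparison studied above can be sketched numerically: estimate the residual between-studies variance either separately per subgroup or pooled across subgroups, and compute the between-subgroups statistic Q_B under random-effects weights. The effect sizes below are simulated, and the sketch uses DerSimonian-Laird estimation for brevity.

      # Rough sketch: Q_B with pooled versus separate residual tau^2 estimates.
      import numpy as np
      from scipy import stats

      def dl_components(y, v):
          """Q, C and df used in the DerSimonian-Laird tau^2 estimator."""
          w = 1.0 / v
          q = np.sum(w * (y - np.average(y, weights=w)) ** 2)
          c = w.sum() - (w ** 2).sum() / w.sum()
          return q, c, len(y) - 1

      def q_between(groups, tau2_by_group):
          """Between-subgroups statistic under random-effects weights."""
          means, variances = [], []
          for g, (y, v) in groups.items():
              w = 1.0 / (v + tau2_by_group[g])
              means.append(np.average(y, weights=w))
              variances.append(1.0 / w.sum())
          w_g = 1.0 / np.array(variances)
          grand = np.average(means, weights=w_g)
          return float(np.sum(w_g * (np.array(means) - grand) ** 2))

      rng = np.random.default_rng(3)
      groups = {"A": (rng.normal(0.20, 0.25, 12), np.full(12, 0.04)),
                "B": (rng.normal(0.50, 0.10, 12), np.full(12, 0.04))}

      tau2_sep = {}
      for g, (y, v) in groups.items():
          q, c, df = dl_components(y, v)
          tau2_sep[g] = max(0.0, (q - df) / c)

      qs, cs, dfs = zip(*(dl_components(y, v) for y, v in groups.values()))
      tau2_pool = max(0.0, (sum(qs) - sum(dfs)) / sum(cs))

      for label, tau2s in [("separate", tau2_sep),
                           ("pooled", {g: tau2_pool for g in groups})]:
          qb = q_between(groups, tau2s)
          print(f"{label} tau^2: Q_B = {qb:.2f}, p = {stats.chi2.sf(qb, df=1):.4f}")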

  17. Test and Analysis of a Buckling-Critical Large-Scale Sandwich Composite Cylinder

    NASA Technical Reports Server (NTRS)

    Schultz, Marc R.; Sleight, David W.; Gardner, Nathaniel W.; Rudd, Michelle T.; Hilburger, Mark W.; Palm, Tod E.; Oldfield, Nathan J.

    2018-01-01

    Structural stability is an important design consideration for launch-vehicle shell structures and it is well known that the buckling response of such shell structures can be very sensitive to small geometric imperfections. As part of an effort to develop new buckling design guidelines for sandwich composite cylindrical shells, an 8-ft-diameter honeycomb-core sandwich composite cylinder was tested under pure axial compression to failure. The results from this test are compared with finite-element-analysis predictions and overall agreement was very good. In particular, the predicted buckling load was within 1% of the test and the character of the response matched well. However, it was found that the agreement could be improved by including composite material nonlinearity in the analysis, and that the predicted buckling initiation site was sensitive to the addition of small bending loads to the primary axial load in analyses.

  18. QTest: Quantitative Testing of Theories of Binary Choice.

    PubMed

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.

  19. SOX9 Is a Progressive Factor in Prostate Cancer

    DTIC Science & Technology

    2013-09-01

    Immunohistochemical analyses of BCa xenografts or mouse mammary tissues are detailed in the supplemental data. Meta-analysis of Gene Expression—SOX9 mRNA...accession no. GSE5460) using dChip software (29). The analysis of variance function in dChip identified gene probes with significant correlation to...tumor grade groups (right panel). The p values of the difference analysis (Fisher's Exact Test) are indicated. SOX9 Controlled Wnt/β-catenin Activity

  20. Failure analysis on optical fiber on swarm flight payload

    NASA Astrophysics Data System (ADS)

    Bourcier, Frédéric; Fratter, Isabelle; Teyssandier, Florent; Barenes, Magali; Dhenin, Jérémie; Peyriguer, Marie; Petre-Bordenave, Romain

    2017-11-01

    Failure analysis on optical components is usually carried out on standard testing devices, such as optical/electron microscopes and spectrometers, using isolated but representative samples. Such analyses are not contactless and not totally non-invasive, so they cannot be used easily on flight models. Furthermore, for late payload or satellite integration/validation phases with tight schedule issues, it may be necessary to carry out a failure analysis directly on the flight hardware, in a cleanroom.

  1. Results of Small-scale Solid Rocket Combustion Simulator testing at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Goldberg, Benjamin E.; Cook, Jerry

    1993-01-01

    The Small-scale Solid Rocket Combustion Simulator (SSRCS) program was established at the Marshall Space Flight Center (MSFC), and used a government/industry team consisting of Hercules Aerospace Corporation, Aerotherm Corporation, United Technology Chemical Systems Division, Thiokol Corporation and MSFC personnel to study the feasibility of simulating the combustion species, temperatures and flow fields of a conventional solid rocket motor (SRM) with a versatile simulator system. The SSRCS design is based on hybrid rocket motor principles. The simulator uses a solid fuel and a gaseous oxidizer. Verification of the feasibility of a SSRCS system as a test bed was completed using flow field and system analyses, as well as empirical test data. A total of 27 hot firings of a subscale SSRCS motor were conducted at MSFC. Testing of the Small-scale SSRCS program was completed in October 1992. This paper, a compilation of reports from the above team members and additional analysis of the instrumentation results, will discuss the final results of the analyses and test programs.

  2. Descriptive Statistics for Modern Test Score Distributions: Skewness, Kurtosis, Discreteness, and Ceiling Effects.

    PubMed

    Ho, Andrew D; Yu, Carol C

    2015-06-01

    Many statistical analyses benefit from the assumption that unconditional or conditional distributions are continuous and normal. More than 50 years ago in this journal, Lord and Cook chronicled departures from normality in educational tests, and Micceri similarly showed that the normality assumption is met rarely in educational and psychological practice. In this article, the authors extend these previous analyses to state-level educational test score distributions that are an increasingly common target of high-stakes analysis and interpretation. Among 504 scale-score and raw-score distributions from state testing programs from recent years, nonnormal distributions are common and are often associated with particular state programs. The authors explain how scaling procedures from item response theory lead to nonnormal distributions as well as unusual patterns of discreteness. The authors recommend that distributional descriptive statistics be calculated routinely to inform model selection for large-scale test score data, and they illustrate consequences of nonnormality using sensitivity studies that compare baseline results to those from normalized score scales.
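
    The routine descriptive statistics the authors recommend are easy to compute; the sketch below applies them to a simulated, discretized score distribution with a ceiling, not to real state testing data.

      # Skewness, excess kurtosis and a ceiling-effect check on simulated scores.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      scores = np.clip(np.round(rng.normal(480, 60, size=5000)), 200, 560)   # ceiling at 560

      print("skewness:", round(stats.skew(scores), 2))
      print("excess kurtosis:", round(stats.kurtosis(scores), 2))
      print("share of examinees at the ceiling:", round(float(np.mean(scores == 560)), 3))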

  3. Analysis capabilities for plutonium-238 programs

    NASA Astrophysics Data System (ADS)

    Wong, A. S.; Rinehart, G. H.; Reimus, M. H.; Pansoy-Hjelvik, M. E.; Moniz, P. F.; Brock, J. C.; Ferrara, S. E.; Ramsey, S. S.

    2000-07-01

    In this presentation, an overview of analysis capabilities that support 238Pu programs will be discussed. These capabilities include neutron emission rate and calorimetric measurements, metallography/ceramography, ultrasonic examination, particle size determination, and chemical analyses. The data obtained from these measurements provide baseline parameters for fuel clad impact testing, fuel processing, product certifications, and waste disposal. Several in-line analysis capabilities will also be utilized for process control in the full-scale 238Pu Aqueous Scrap Recovery line in FY01.

  4. Stress analysis and damage evaluation of flawed composite laminates by hybrid-numerical methods

    NASA Technical Reports Server (NTRS)

    Yang, Yii-Ching

    1992-01-01

    Structural components in flight vehicles often contain inherent flaws, such as microcracks, voids, holes, and delaminations. These defects degrade structures in the same way as damage incurred in service, such as impact, corrosion, and erosion. It is very important to know whether a structural component remains useful and can survive with these flaws and damages. To understand the behavior and limitations of such structural components, researchers usually perform experimental tests or theoretical analyses on structures with simulated flaws. However, neither approach has been completely successful. As Durelli states, 'Seldom does one method give a complete solution, with the most efficiency.' An example of this principle is seen in photomechanics, in which additional strain-gage testing can only average stresses at locations of high concentration. On the other hand, theoretical analyses, including numerical analyses, are implemented with simplified assumptions which may not reflect actual boundary conditions. Hybrid-numerical methods, which combine photomechanics and numerical analysis, have been used to correct this inefficiency since the 1950's. But their application was limited until the 1970's, when modern computer codes became available. In recent years, researchers have enhanced the data obtained from photoelasticity, laser speckle, holography and moire' interferometry for input to finite element analysis of metals. Nevertheless, there is little literature on composite laminates. Therefore, this research is dedicated to this highly anisotropic material.

  5. Design/cost tradeoff studies. Appendix A. Supporting analyses and tradeoffs, book 2. Earth Observatory Satellite system definition study (EOS)

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Attitude reference systems for use with the Earth Observatory Satellite (EOS) are described. The systems considered are fixed and gimbaled star trackers, star mappers, and digital sun sensors. Covariance analyses were performed to determine performance for the most promising candidate in low altitude and synchronous orbits. The performance of attitude estimators that employ gyroscopes which are periodically updated by a star sensor is established by a single axis covariance analysis. The other systems considered are: (1) the propulsion system design, (2) electric power and electrical integration, (3) thermal control, (4) ground data processing, and (5) the test plan and cost reduction aspects of observatory integration and test.

  6. Human immunodeficiency viruses appear compartmentalized to the female genital tract in cross-sectional analyses but genital lineages do not persist over time.

    PubMed

    Bull, Marta E; Heath, Laura M; McKernan-Mullin, Jennifer L; Kraft, Kelli M; Acevedo, Luis; Hitti, Jane E; Cohn, Susan E; Tapia, Kenneth A; Holte, Sarah E; Dragavon, Joan A; Coombs, Robert W; Mullins, James I; Frenkel, Lisa M

    2013-04-15

    Whether unique human immunodeficiency virus type 1 (HIV) genotypes occur in the genital tract is important for vaccine development and the management of drug-resistant viruses. Multiple cross-sectional studies suggest HIV is compartmentalized within the female genital tract. We hypothesize that bursts of HIV replication and/or proliferation of infected cells captured in cross-sectional analyses drive compartmentalization, but that over time genital-specific viral lineages do not form; rather, viruses mix between the genital tract and blood. Eight women with ongoing HIV replication were studied during a period of 1.5 to 4.5 years. Multiple viral sequences were derived by single-genome amplification of the HIV C2-V5 region of env from genital secretions and blood plasma. Maximum likelihood phylogenies were evaluated for compartmentalization using 4 statistical tests. In cross-sectional analyses, compartmentalization of genital from blood viruses was detected in three of eight women by all tests; this was associated with tissue-specific clades containing multiple monotypic sequences. In longitudinal analysis, the tissue-specific clades did not persist to form viral lineages. Rather, across women, HIV lineages comprised both genital tract and blood sequences. The observation of genital-specific HIV clades only in cross-sectional analysis and an absence of genital-specific lineages in longitudinal analyses suggest a dynamic interchange of HIV variants between the female genital tract and blood.

  7. Contour plot assessment of existing meta-analyses confirms robust association of statin use and acute kidney injury risk.

    PubMed

    Chevance, Aurélie; Schuster, Tibor; Steele, Russell; Ternès, Nils; Platt, Robert W

    2015-10-01

    Robustness of an existing meta-analysis can justify decisions on whether to conduct an additional study addressing the same research question. We illustrate the graphical assessment of the potential impact of an additional study on an existing meta-analysis using published data on statin use and the risk of acute kidney injury. A previously proposed graphical augmentation approach is used to assess the sensitivity of the current test and heterogeneity statistics extracted from existing meta-analysis data. In addition, we extended the graphical augmentation approach to assess potential changes in the pooled effect estimate after updating a current meta-analysis and applied the three graphical contour definitions to data from meta-analyses on statin use and acute kidney injury risk. In the example data considered, the pooled effect estimates and heterogeneity indices proved to be considerably robust to the addition of a future study. Moreover, for some previously inconclusive meta-analyses, a study update might yield a statistically significant increase in kidney injury risk associated with higher statin exposure. The illustrated contour approach should become a standard tool for assessing the robustness of meta-analyses. It can guide decisions on whether to conduct additional studies addressing a relevant research question. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. [A Review on the Use of Effect Size in Nursing Research].

    PubMed

    Kang, Hyuncheol; Yeon, Kyupil; Han, Sang Tae

    2015-10-01

    The purpose of this study was to introduce the main concepts of statistical testing and effect size and to provide researchers in nursing science with guidance on how to calculate the effect size for the statistical analysis methods mainly used in nursing. For the t-test, analysis of variance, correlation analysis, and regression analysis, which are used frequently in nursing research, the generally accepted definitions of the effect size were explained. Some formulae for calculating the effect size are described with several examples in nursing research. Furthermore, the authors present the required minimum sample size for each example utilizing G*Power 3 software, the most widely used program for calculating sample size. It is noted that statistical significance testing and effect size measurement serve different purposes, and reliance on only one of them may be misleading. Some practical guidelines are recommended for combining statistical significance testing and effect size measures in order to make more balanced decisions in quantitative analyses.
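    As a rough illustration of the kind of calculation described above, the sketch below computes Cohen's d for a two-group comparison and then the minimum sample size needed to detect that effect at 80% power, using Python's statsmodels in place of G*Power; the group means, standard deviations and sizes are invented for demonstration only.

```python
# Minimal sketch: Cohen's d for an independent-samples t-test and the
# sample size required to detect it (alpha = 0.05, power = 0.80).
# The summary statistics below are hypothetical, for illustration only.
from statsmodels.stats.power import TTestIndPower

mean_a, mean_b = 72.0, 68.0          # hypothetical group means
sd_a, sd_b = 9.0, 11.0               # hypothetical group standard deviations
n_a, n_b = 30, 30                    # hypothetical group sizes

# Pooled standard deviation and Cohen's d
sd_pooled = (((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2) / (n_a + n_b - 2)) ** 0.5
cohens_d = (mean_a - mean_b) / sd_pooled

# Minimum n per group to detect this effect with 80% power
n_required = TTestIndPower().solve_power(effect_size=cohens_d,
                                         alpha=0.05, power=0.80,
                                         alternative='two-sided')
print(f"Cohen's d = {cohens_d:.2f}, required n per group = {n_required:.0f}")
```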

  9. Meta-analysis for the comparison of two diagnostic tests to a common gold standard: A generalized linear mixed model approach.

    PubMed

    Hoyer, Annika; Kuss, Oliver

    2018-05-01

    Meta-analysis of diagnostic studies is still a rapidly developing area of biostatistical research. Especially, there is an increasing interest in methods to compare different diagnostic tests to a common gold standard. Restricting to the case of two diagnostic tests, in these meta-analyses the parameters of interest are the differences of sensitivities and specificities (with their corresponding confidence intervals) between the two diagnostic tests while accounting for the various associations across single studies and between the two tests. We propose statistical models with a quadrivariate response (where sensitivity of test 1, specificity of test 1, sensitivity of test 2, and specificity of test 2 are the four responses) as a sensible approach to this task. Using a quadrivariate generalized linear mixed model naturally generalizes the common standard bivariate model of meta-analysis for a single diagnostic test. If information on several thresholds of the tests is available, the quadrivariate model can be further generalized to yield a comparison of full receiver operating characteristic (ROC) curves. We illustrate our model by an example where two screening methods for the diagnosis of type 2 diabetes are compared.
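    The quadrivariate model itself is normally fitted in specialized mixed-model software; the sketch below only illustrates, under assumed study counts, how per-study 2x2 results for the two tests can be arranged into the long binomial layout (one row per study, test and accuracy measure) that such a model takes as input. The study names and counts are hypothetical.

```python
# Minimal sketch of the data layout for a quadrivariate meta-analysis of
# two diagnostic tests against a common gold standard. Counts are invented.
import pandas as pd

studies = pd.DataFrame({
    "study": ["S1", "S1", "S2", "S2"],
    "test":  ["test1", "test2", "test1", "test2"],
    "tp":    [45, 40, 30, 28],   # true positives
    "fn":    [ 5, 10,  6,  8],   # false negatives
    "tn":    [80, 85, 60, 62],   # true negatives
    "fp":    [20, 15, 12, 10],   # false positives
})

# Long format: one row per study, test, and measure (sensitivity/specificity),
# with event and total counts ready for a binomial mixed model.
long = pd.concat([
    studies.assign(measure="sensitivity", events=studies.tp,
                   total=studies.tp + studies.fn),
    studies.assign(measure="specificity", events=studies.tn,
                   total=studies.tn + studies.fp),
])[["study", "test", "measure", "events", "total"]]
print(long)
```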

  10. Subgroup analyses in randomised controlled trials: cohort study on trial protocols and journal publications.

    PubMed

    Kasenda, Benjamin; Schandelmaier, Stefan; Sun, Xin; von Elm, Erik; You, John; Blümle, Anette; Tomonaga, Yuki; Saccilotto, Ramon; Amstutz, Alain; Bengough, Theresa; Meerpohl, Joerg J; Stegert, Mihaela; Olu, Kelechi K; Tikkinen, Kari A O; Neumann, Ignacio; Carrasco-Labra, Alonso; Faulhaber, Markus; Mulla, Sohail M; Mertz, Dominik; Akl, Elie A; Bassler, Dirk; Busse, Jason W; Ferreira-González, Ignacio; Lamontagne, Francois; Nordmann, Alain; Gloy, Viktoria; Raatz, Heike; Moja, Lorenzo; Rosenthal, Rachel; Ebrahim, Shanil; Vandvik, Per O; Johnston, Bradley C; Walter, Martin A; Burnand, Bernard; Schwenkglenks, Matthias; Hemkens, Lars G; Bucher, Heiner C; Guyatt, Gordon H; Briel, Matthias

    2014-07-16

    To investigate the planning of subgroup analyses in protocols of randomised controlled trials and the agreement with corresponding full journal publications. Cohort of protocols of randomised controlled trials and subsequent full journal publications. Six research ethics committees in Switzerland, Germany, and Canada. 894 protocols of randomised controlled trials involving patients approved by participating research ethics committees between 2000 and 2003 and 515 subsequent full journal publications. Of 894 protocols of randomised controlled trials, 252 (28.2%) included one or more planned subgroup analyses. Of those, 17 (6.7%) provided a clear hypothesis for at least one subgroup analysis, 10 (4.0%) anticipated the direction of a subgroup effect, and 87 (34.5%) planned a statistical test for interaction. Industry sponsored trials more often planned subgroup analyses compared with investigator sponsored trials (195/551 (35.4%) v 57/343 (16.6%), P<0.001). Of 515 identified journal publications, 246 (47.8%) reported at least one subgroup analysis. In 81 (32.9%) of the 246 publications reporting subgroup analyses, authors stated that subgroup analyses were prespecified, but this was not supported by 28 (34.6%) corresponding protocols. In 86 publications, authors claimed a subgroup effect, but only 36 (41.9%) corresponding protocols reported a planned subgroup analysis. Subgroup analyses are insufficiently described in the protocols of randomised controlled trials submitted to research ethics committees, and investigators rarely specify the anticipated direction of subgroup effects. More than one third of statements in publications of randomised controlled trials about subgroup prespecification had no documentation in the corresponding protocols. Definitive judgments regarding credibility of claimed subgroup effects are not possible without access to protocols and analysis plans of randomised controlled trials. © The DISCO study group 2014.

  11. Multigene analysis of lophophorate and chaetognath phylogenetic relationships.

    PubMed

    Helmkampf, Martin; Bruchhaus, Iris; Hausdorf, Bernhard

    2008-01-01

    Maximum likelihood and Bayesian inference analyses of seven concatenated fragments of nuclear-encoded housekeeping genes indicate that Lophotrochozoa is monophyletic, i.e., the lophophorate groups Bryozoa, Brachiopoda and Phoronida are more closely related to molluscs and annelids than to Deuterostomia or Ecdysozoa. Lophophorates themselves, however, form a polyphyletic assemblage. The hypotheses that they are monophyletic and more closely allied to Deuterostomia than to Protostomia can be ruled out with both the approximately unbiased test and the expected likelihood weights test. The existence of Phoronozoa, a putative clade including Brachiopoda and Phoronida, has also been rejected. According to our analyses, phoronids instead share a more recent common ancestor with bryozoans than with brachiopods. Platyhelminthes is the sister group of Lophotrochozoa. Together these two constitute Spiralia. Although Chaetognatha appears as the sister group of Priapulida within Ecdysozoa in our analyses, alternative hypotheses concerning chaetognath relationships could not be rejected.

  12. Immunophenotyping of posttraumatic neutrophils on a routine haematology analyser.

    PubMed

    Groeneveld, Kathelijne Maaike; Heeres, Marjolein; Leenen, Loek Petrus Hendrikus; Huisman, Albert; Koenderman, Leo

    2012-01-01

    Flow cytometry markers have been proposed as useful predictors for the occurrence of posttraumatic inflammatory complications. However, currently the need for a dedicated laboratory and the labour-intensive analytical procedures make these markers less suitable for clinical practice. We tested an approach to overcome these limitations. Neutrophils of healthy donors were incubated with antibodies commonly used in trauma research: CD11b (MAC-1), L-selectin (CD62L), FcγRIII (CD16), and FcγRII (CD32) in active form (MoPhab A27). Flow cytometric analysis was performed both on a FACSCalibur, a standard flow cytometer, and on a Cell-Dyn Sapphire, a routine haematology analyser. There was a high level of agreement between the two types of analysers, with 41% for FcγRIII, 80% for L-selectin, 98% for CD11b, and even a 100% agreement for active FcγRII. Moreover, analysis on the routine haematology analyser was possible in less than a quarter of the time in comparison to the flow cytometer. Analysis of neutrophil phenotype on the Cell-Dyn Sapphire leads to the same conclusion compared to a standard flow cytometer. The markedly reduced time necessary for analysis and reduced labour intensity constitutes a step forward in implementation of this type of analysis in clinical diagnostics in trauma research. Copyright © 2012 Kathelijne Maaike Groeneveld et al.

  13. Rasch analysis suggested three unidimensional domains for Affiliate Stigma Scale: additional psychometric evaluation.

    PubMed

    Chang, Chih-Cheng; Su, Jian-An; Tsai, Ching-Shu; Yen, Cheng-Fang; Liu, Jiun-Horng; Lin, Chung-Ying

    2015-06-01

    To examine the psychometrics of the Affiliate Stigma Scale using rigorous psychometric analysis: classical test theory (CTT) (traditional) and Rasch analysis (modern). Differential item functioning (DIF) items were also tested using Rasch analysis. Caregivers of relatives with mental illness (n = 453; mean age: 53.29 ± 13.50 years) were recruited from southern Taiwan. Each participant filled out four questionnaires: Affiliate Stigma Scale, Rosenberg Self-Esteem Scale, Beck Anxiety Inventory, and one background information sheet. CTT analyses showed that the Affiliate Stigma Scale had satisfactory internal consistency (α = 0.85-0.94) and concurrent validity (Rosenberg Self-Esteem Scale: r = -0.52 to -0.46; Beck Anxiety Inventory: r = 0.27-0.34). Rasch analyses supported the unidimensionality of three domains in the Affiliate Stigma Scale and indicated four DIF items (affect domain: 1; cognitive domain: 3) across gender. Our findings, based on rigorous statistical analysis, verified the psychometrics of the Affiliate Stigma Scale and reported its DIF items. We conclude that the three domains of the Affiliate Stigma Scale can be separately used and are suitable for measuring the affiliate stigma of caregivers of relatives with mental illness. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Aeroelastic stability analyses of two counter rotating propfan designs for a cruise missile model

    NASA Technical Reports Server (NTRS)

    Mahajan, Aparajit J.; Lucero, John M.; Mehmed, Oral; Stefko, George L.

    1992-01-01

    A modal aeroelastic analysis combining structural and aerodynamic models is applied to counterrotating propfans to evaluate their structural integrity for wind-tunnel testing. The aeroelastic analysis code is an extension of the 2D analysis code called the Aeroelastic Stability and Response of Propulsion Systems. Rotational speed and freestream Mach number are the parameters for calculating the stability of the two blade designs with a modal method combining a finite-element structural model with 2D steady and unsteady cascade aerodynamic models. The model demonstrates convergence to the least stable aeroelastic mode, describes the effects of a nonuniform inflow, and permits the modification of geometry and rotation. The analysis shows that the propfan designs are suitable for the wind-tunnel test and confirms that the propfans should be flutter-free under the range of conditions of the testing.

  15. NASA. Marshall Space Flight Center Hydrostatic Bearing Activities

    NASA Technical Reports Server (NTRS)

    Benjamin, Theodore G.

    1991-01-01

    The basic approach for analyzing hydrostatic bearing flows at the Marshall Space Flight Center (MSFC) is briefly discussed. The Hydrostatic Bearing Team has responsibility for assessing and evaluating flow codes; evaluating friction, ignition, and galling effects; evaluating wear; and performing tests. The Office of Aerospace and Exploration Technology Turbomachinery Seals Tasks consist of tests and analysis. The MSFC in-house analyses utilize one-dimensional bulk-flow codes. Computational fluid dynamics (CFD) analysis is used to enhance understanding of bearing flow physics or to perform parametric analyses that are outside the bulk-flow database. As long as the bulk-flow codes are accurate enough for most needs, they will be utilized accordingly and will be supported by CFD analysis on an as-needed basis.

  16. Pretest analysis document for Semiscale Test S-LH-1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaw, R.A.

    Results from various pretest calculations which were performed for Test S-LH-1 are included in this report. Test S-LH-1 has been designed to produce primary liquid holdup in the steam generator U-tubes similar to Tests S-UT-8. The analyses included in this report indicate liquid will be held in the tubes, the core liquid level will be appropriately depressed, and a core heater rod temperature excursion should occur. Several sensitivity studies are also included which identify parameters which could affect the response.

  17. Evaluation of Evidence of Statistical Support and Corroboration of Subgroup Claims in Randomized Clinical Trials.

    PubMed

    Wallach, Joshua D; Sullivan, Patrick G; Trepanowski, John F; Sainani, Kristin L; Steyerberg, Ewout W; Ioannidis, John P A

    2017-04-01

    Many published randomized clinical trials (RCTs) make claims for subgroup differences. To evaluate how often subgroup claims reported in the abstracts of RCTs are actually supported by statistical evidence (P < .05 from an interaction test) and corroborated by subsequent RCTs and meta-analyses. This meta-epidemiological survey examines data sets of trials with at least 1 subgroup claim, including Subgroup Analysis of Trials Is Rarely Easy (SATIRE) articles and Discontinuation of Randomized Trials (DISCO) articles. We used Scopus (updated July 2016) to search for English-language articles citing each of the eligible index articles with at least 1 subgroup finding in the abstract. Articles with a subgroup claim in the abstract with or without evidence of statistical heterogeneity (P < .05 from an interaction test) in the text and articles attempting to corroborate the subgroup findings. Study characteristics of trials with at least 1 subgroup claim in the abstract were recorded. Two reviewers extracted the data necessary to calculate subgroup-level effect sizes, standard errors, and the P values for interaction. For individual RCTs and meta-analyses that attempted to corroborate the subgroup findings from the index articles, trial characteristics were extracted. Cochran Q test was used to reevaluate heterogeneity with the data from all available trials. The number of subgroup claims in the abstracts of RCTs, the number of subgroup claims in the abstracts of RCTs with statistical support (subgroup findings), and the number of subgroup findings corroborated by subsequent RCTs and meta-analyses. Sixty-four eligible RCTs made a total of 117 subgroup claims in their abstracts. Of these 117 claims, only 46 (39.3%) in 33 articles had evidence of statistically significant heterogeneity from a test for interaction. In addition, out of these 46 subgroup findings, only 16 (34.8%) ensured balance between randomization groups within the subgroups (eg, through stratified randomization), 13 (28.3%) entailed a prespecified subgroup analysis, and 1 (2.2%) was adjusted for multiple testing. Only 5 (10.9%) of the 46 subgroup findings had at least 1 subsequent pure corroboration attempt by a meta-analysis or an RCT. In all 5 cases, the corroboration attempts found no evidence of a statistically significant subgroup effect. In addition, all effect sizes from meta-analyses were attenuated toward the null. A minority of subgroup claims made in the abstracts of RCTs are supported by their own data (ie, a significant interaction effect). For those that have statistical support (P < .05 from an interaction test), most fail to meet other best practices for subgroup tests, including prespecification, stratified randomization, and adjustment for multiple testing. Attempts to corroborate statistically significant subgroup differences are rare; when done, the initially observed subgroup differences are not reproduced.
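    For readers unfamiliar with the heterogeneity re-evaluation mentioned above, the following minimal sketch computes Cochran's Q and a fixed-effect pooled estimate from per-trial effect sizes and standard errors; the numbers are invented and the snippet is not the study's analysis code.

```python
# Minimal sketch: Cochran's Q heterogeneity statistic for a set of trials.
# Effect sizes (e.g. log odds ratios) and standard errors are hypothetical.
import numpy as np
from scipy import stats

effects = np.array([0.40, 0.15, 0.55, -0.05])   # hypothetical per-trial effects
se      = np.array([0.20, 0.18, 0.25,  0.22])   # hypothetical standard errors

weights = 1.0 / se**2                            # inverse-variance weights
pooled  = np.sum(weights * effects) / np.sum(weights)
q_stat  = np.sum(weights * (effects - pooled)**2)
df      = len(effects) - 1
p_value = stats.chi2.sf(q_stat, df)

print(f"pooled effect = {pooled:.3f}, Q = {q_stat:.2f}, df = {df}, p = {p_value:.3f}")
```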

  18. Metric properties of the "timed get up and go- modified version" test, in risk assessment of falls in active women

    PubMed Central

    2017-01-01

    Objective: To analyse the metric properties of the Timed Get Up and Go-Modified Version Test (TGUGM) in the risk assessment of falls in a group of physically active women. Methods: A sample of 202 women over 55 years of age was assessed through a cross-sectional study. The TGUGM was applied to assess their fall risk. The test was analysed by comparison of the qualitative and quantitative information and by factor analysis. A logistic regression model was developed to explain the risk of falls according to the test components. Results: The TGUGM was useful for assessing the risk of falls in the studied group. The test revealed two factors: the Get Up and the Gait with dual task. Fewer than twelve points in the evaluation or run times longer than 35 seconds were associated with high risk of falling. More than 35 seconds in the test indicated a fall risk probability greater than 0.50. Also, scores of less than 12 points were associated with a delay of 7 seconds more in the execution of the test (p = 0.0016). Conclusions: Factor analysis of the TGUGM revealed two dimensions that can be independent predictors of the risk of falling: the Get Up, which explains between 64% and 87% of the risk of falling, and the Gait with dual task, which explains between 77% and 95% of the risk of falling. PMID:28559642
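    A minimal sketch of the type of logistic model described above, with fall status regressed on the two TGUGM components; the data are simulated and the variable names and coefficients are hypothetical, intended only to show the mechanics.

```python
# Minimal sketch: logistic regression of fall risk on the two TGUGM
# components (Get Up score and dual-task gait time). Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "get_up_score": rng.integers(5, 16, n),   # hypothetical component score
    "gait_time_s":  rng.normal(30, 8, n),     # hypothetical dual-task gait time
})
# Simulated outcome: longer times and lower scores raise fall probability
logit = -2.0 - 0.3 * (df.get_up_score - 12) + 0.1 * (df.gait_time_s - 30)
df["fell"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.logit("fell ~ get_up_score + gait_time_s", data=df).fit(disp=False)
print(model.summary())
```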

  19. Space shuttle maneuvering engine reusable thrust chamber program. Task 11: Stability analyses and acoustic model testing data dump

    NASA Technical Reports Server (NTRS)

    Oberg, C. L.

    1974-01-01

    The combustion stability characteristics of engines applicable to the Space Shuttle Orbit Maneuvering System and the adequacy of acoustic cavities as a means of assuring stability in these engines were investigated. The study comprised full-scale stability rating tests, bench-scale acoustic model tests and analysis. Two series of stability rating tests were made. Acoustic model tests were made to determine the resonance characteristics and effects of acoustic cavities. Analytical studies were done to aid design of the cavity configurations to be tested and, also, to aid evaluation of the effectiveness of acoustic cavities from available test results.

  20. Birth order, family configuration, and verbal achievement.

    PubMed

    Breland, H M

    1974-12-01

    Two samples of National Merit Scholarship participants tested in 1962 and the entire population of almost 800,000 participants tested in 1965 were examined. Consistent effects in all 3 groups were observed with respect to both birth order and family size (1st born and those from smaller families scored higher). Control of both socioeconomic variables and maternal age, by analysis of variance as well as by analysis of covariance, failed to alter the relationships. Stepdown analyses suggested that the effects were due to a verbal component and that no differences were attributable to nonverbal factors. Mean test scores were computed for detailed sibship configurations based on birth order, family size, sibling spacing, and sibling sex.

  1. Post-test analysis of dryout test 7B' of the W-1 Sodium Loop Safety Facility Experiment with the SABRE-2P code. [LMFBR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rose, S.D.; Dearing, J.F.

    An understanding of conditions that may cause sodium boiling and boiling propagation that may lead to dryout and fuel failure is crucial in liquid-metal fast-breeder reactor safety. In this study, the SABRE-2P subchannel analysis code has been used to analyze the ultimate transient of the in-core W-1 Sodium Loop Safety Facility experiment. This code has a 3-D simple nondynamic boiling model which is able to predict the flow instability which caused dryout. In other analyses dryout has been predicted for out-of-core test bundles and so this study provides additional confirmation of the model.

  2. Space Launch System Base Heating Test: Sub-Scale Rocket Engine/Motor Design, Development and Performance Analysis

    NASA Technical Reports Server (NTRS)

    Mehta, Manish; Seaford, Mark; Kovarik, Brian; Dufrene, Aaron; Solly, Nathan; Kirchner, Robert; Engel, Carl D.

    2014-01-01

    The Space Launch System (SLS) base heating test is broken down into two test programs: (1) Pathfinder and (2) Main Test. The Pathfinder Test Program focuses on the design, development, hot-fire test and performance analyses of the 2% sub-scale SLS core-stage and booster element propulsion systems. The core-stage propulsion system is composed of four gaseous oxygen/hydrogen RS-25D model engines and the booster element is composed of two aluminum-based model solid rocket motors (SRMs). The first section of the paper discusses the motivation and test facility specifications for the test program. The second section briefly investigates the internal flow path of the design. The third section briefly shows the performance of the model RS-25D engines and SRMs for the conducted short duration hot-fire tests. Good agreement is observed based on design prediction analysis and test data. This program is a challenging research and development effort that has not been attempted in 40+ years for a NASA vehicle.

  3. Dry spell trend analysis in Kenya and the Murray Darling Basin using daily rainfall

    NASA Astrophysics Data System (ADS)

    Muita, R. R.; van Ogtrop, F. F.; Vervoort, R. W.

    2012-04-01

    Important agricultural areas in Kenya and the Murray Darling Basin (MDB) in Australia are largely semi-arid to arid. Persistent dry periods and the timing of dry spells directly impact the availability of soil moisture and hence crop production in these regions. Most studies focus on the analysis of dry spell lengths at an annual scale. However, the timing and length of dry spells at finer temporal scales is more beneficial for cropping when considering a trade-off between the time scale and the ability to analyse dry spell length. The aim of this study was to analyse the interannual and intra-annual variations in dry spell lengths in the regions to inform crop management. This study analysed monthly dry spells based on daily rainfall for 1961-2010 on a limited dataset of 13 locations in Kenya and 17 locations in the MDB. This dataset was the most consistent across both regions, and future analysis will incorporate more stations and longer time periods where available. Dry spell lengths were analysed by month and year, and trends in monthly and annual dry spell lengths were analysed using Generalised Linear Models (GLM) and the Mann-Kendall test (MK). Overall, monthly dry spell lengths are right skewed, with a higher frequency of shorter dry spells (3-25 days). In Kenya, significant increases in mean dry spell lengths (p≤0.02) are observed in inland arid to semi-humid locations, but this temporal trend appears to decrease in the highland and coastal regions. Analysis of the MDB stations suggests changes in seasonality. For example, spatial trends suggest a North-South increase in dry spell length in summer (December - February), but a shortening after February. Generally, the GLM and MK results are similar in the two regions, but the MK test tends to give higher values of positive slope coefficients and lower values for negative coefficients compared to the GLM. This may limit the ability to find the best estimates for model coefficients. Previous studies in Australia and Kenya have relied on continuous climatic indices based on global climate models and stochastic processes, resulting in limited and mixed results. For agronomical purposes, our results show that direct assessment of dry spell lengths from daily rainfall also indicates changes in dry spell trends in Kenya and the MDB, and that such an analysis is easy to use and requires limited assumptions. This initial analysis identifies significant increasing trends in the dry spell lengths in some areas and periods in Kenya and the MDB. This has major implications for crop production in these regions, and it is recommended that this information be incorporated in the regions' management decisions. KEY WORDS: monthly dry spell length; Generalized Linear Models; Mann-Kendall test; month; Kenya, Murray Darling Basin (MDB).
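    As a sketch of the trend test named above, the following implements the basic Mann-Kendall statistic for an annual series of dry spell lengths; the series is synthetic and the implementation is the standard textbook form without tie or serial-correlation corrections, not the authors' exact procedure.

```python
# Minimal sketch: Mann-Kendall trend test (no tie/serial-correlation
# corrections) applied to a synthetic annual series of dry spell lengths.
import numpy as np
from scipy import stats

def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S statistic: sum of signs of all pairwise later-minus-earlier differences
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0      # variance without tie correction
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * stats.norm.sf(abs(z))                 # two-sided p-value
    return s, z, p

dry_spells = [12, 14, 13, 16, 15, 18, 17, 19, 21, 20]  # synthetic yearly lengths (days)
print(mann_kendall(dry_spells))
```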

  4. Analysis of Rare, Exonic Variation amongst Subjects with Autism Spectrum Disorders and Population Controls

    PubMed Central

    Liu, Li; Sabo, Aniko; Neale, Benjamin M.; Nagaswamy, Uma; Stevens, Christine; Lim, Elaine; Bodea, Corneliu A.; Muzny, Donna; Reid, Jeffrey G.; Banks, Eric; Coon, Hillary; DePristo, Mark; Dinh, Huyen; Fennel, Tim; Flannick, Jason; Gabriel, Stacey; Garimella, Kiran; Gross, Shannon; Hawes, Alicia; Lewis, Lora; Makarov, Vladimir; Maguire, Jared; Newsham, Irene; Poplin, Ryan; Ripke, Stephan; Shakir, Khalid; Samocha, Kaitlin E.; Wu, Yuanqing; Boerwinkle, Eric; Buxbaum, Joseph D.; Cook, Edwin H.; Devlin, Bernie; Schellenberg, Gerard D.; Sutcliffe, James S.; Daly, Mark J.; Gibbs, Richard A.; Roeder, Kathryn

    2013-01-01

    We report on results from whole-exome sequencing (WES) of 1,039 subjects diagnosed with autism spectrum disorders (ASD) and 870 controls selected from the NIMH repository to be of similar ancestry to cases. The WES data came from two centers using different methods to produce sequence and to call variants from it. Therefore, an initial goal was to ensure the distribution of rare variation was similar for data from different centers. This proved straightforward by filtering called variants by fraction of missing data, read depth, and balance of alternative to reference reads. Results were evaluated using seven samples sequenced at both centers and by results from the association study. Next we addressed how the data and/or results from the centers should be combined. Gene-based analyses of association were an obvious choice, but should statistics for association be combined across centers (meta-analysis) or should data be combined and then analyzed (mega-analysis)? Because of the nature of many gene-based tests, we showed by theory and simulations that mega-analysis has better power than meta-analysis. Finally, before analyzing the data for association, we explored the impact of population structure on rare variant analysis in these data. Like other recent studies, we found evidence that population structure can confound case-control studies by the clustering of rare variants in ancestry space; yet, unlike some recent studies, for these data we found that principal component-based analyses were sufficient to control for ancestry and produce test statistics with appropriate distributions. After using a variety of gene-based tests and both meta- and mega-analysis, we found no new risk genes for ASD in this sample. Our results suggest that standard gene-based tests will require much larger samples of cases and controls before being effective for gene discovery, even for a disorder like ASD. PMID:23593035
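    To make the meta- versus mega-analysis distinction concrete, the sketch below contrasts combining per-center p-values (here with Fisher's method) against pooling the raw counts into a single test; the carrier counts are invented and simple Fisher exact tests stand in for the gene-based tests actually used in the study.

```python
# Minimal sketch contrasting meta-analysis (combine per-center p-values)
# with mega-analysis (pool raw counts, one test). Counts are hypothetical
# rare-variant carrier counts in cases vs controls at two centers.
import numpy as np
from scipy import stats

centers = [  # (case carriers, case non-carriers, control carriers, control non-carriers)
    (12, 488, 5, 495),
    ( 9, 391, 4, 396),
]

# Meta-analysis: test within each center, then combine p-values (Fisher's method)
p_values = [stats.fisher_exact([[a, b], [c, d]])[1] for a, b, c, d in centers]
chi2_comb = -2 * np.sum(np.log(p_values))
p_meta = stats.chi2.sf(chi2_comb, df=2 * len(p_values))

# Mega-analysis: pool the raw counts and run a single test
pooled = np.sum(np.array(centers), axis=0)
p_mega = stats.fisher_exact([[pooled[0], pooled[1]], [pooled[2], pooled[3]]])[1]

print(f"meta-analysis p = {p_meta:.4f}, mega-analysis p = {p_mega:.4f}")
```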

  5. Scaling and design analyses of a scaled-down, high-temperature test facility for experimental investigation of the initial stages of a VHTR air-ingress accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arcilesi, David J.; Ham, Tae Kyu; Kim, In Hun

    2015-07-01

    A critical event in the safety analysis of the very high-temperature gas-cooled reactor (VHTR) is an air-ingress accident. This accident is initiated, in its worst case scenario, by a double-ended guillotine break of the coaxial cross vessel, which leads to a rapid reactor vessel depressurization. In a VHTR, the reactor vessel is located within a reactor cavity that is filled with air during normal operating conditions. Following the vessel depressurization, the dominant mode of ingress of an air–helium mixture into the reactor vessel will either be molecular diffusion or density-driven stratified flow. The mode of ingress is hypothesized to depend largely on the break conditions of the cross vessel. Since the time scales of these two ingress phenomena differ by orders of magnitude, it is imperative to understand under which conditions each of these mechanisms will dominate in the air ingress process. Computer models have been developed to analyze this type of accident scenario. There are, however, limited experimental data available to understand the phenomenology of the air-ingress accident and to validate these models. Therefore, there is a need to design and construct a scaled-down experimental test facility to simulate the air-ingress accident scenarios and to collect experimental data. The current paper focuses on the analyses performed for the design and operation of a 1/8th geometric scale (by height and diameter), high-temperature test facility. A geometric scaling analysis for the VHTR, a time scale analysis of the air-ingress phenomenon, a transient depressurization analysis of the reactor vessel, a hydraulic similarity analysis of the test facility, a heat transfer characterization of the hot plenum, a power scaling analysis for the reactor system, and a design analysis of the containment vessel are discussed.

  6. Space Shuttle program communication and tracking systems interface analysis

    NASA Technical Reports Server (NTRS)

    Dodds, J. G.; Holmes, J. K.; Huth, G. K.; Iwasaki, R. S.; Nilsen, P. W.; Polydoros, A.; Sampaio, D. R.; Udalov, S.

    1984-01-01

    The Space Shuttle Program Communications and Tracking Systems Interface Analysis began April 18, 1983. During this time, the shuttle communication and tracking systems began flight testing. Two areas of analysis documented were a result of observations made during flight tests. These analyses involved the Ku-band communication system. First, there was a detailed analysis of the interface between the solar max data format and the Ku-band communication system including the TDRSS ground station. The second analysis involving the Ku-band communication system was an analysis of the frequency lock loop of the Gunn oscillator used to generate the transmit frequency. The stability of the frequency lock loop was investigated and changes to the design were reviewed to alleviate the potential loss of data due to the loop losing lock and entering the reacquisition mode. Other areas of investigation were the S-band antenna analysis and RF coverage analysis.

  7. Automatic Detection of Previously-Unseen Application States for Deployment Environment Testing and Analysis

    PubMed Central

    Murphy, Christian; Vaughan, Moses; Ilahi, Waseem; Kaiser, Gail

    2010-01-01

    For large, complex software systems, it is typically impossible in terms of time and cost to reliably test the application in all possible execution states and configurations before releasing it into production. One proposed way of addressing this problem has been to continue testing and analysis of the application in the field, after it has been deployed. A practical limitation of many such automated approaches is the potentially high performance overhead incurred by the necessary instrumentation. However, it may be possible to reduce this overhead by selecting test cases and performing analysis only in previously-unseen application states, thus reducing the number of redundant tests and analyses that are run. Solutions for fault detection, model checking, security testing, and fault localization in deployed software may all benefit from a technique that ignores application states that have already been tested or explored. In this paper, we present a solution that ensures that deployment environment tests are only executed in states that the application has not previously encountered. In addition to discussing our implementation, we present the results of an empirical study that demonstrates its effectiveness, and explain how the new approach can be generalized to assist other automated testing and analysis techniques intended for the deployment environment. PMID:21197140
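    One simple way to realize the 'previously-unseen state' idea described above is to hash an abstraction of the application state and skip the deployment-environment tests when that hash has been seen before; the sketch below is a generic illustration under that assumption, not the authors' implementation.

```python
# Minimal sketch: run deployment-environment tests only in application
# states that have not been observed before. The state abstraction here
# (a dict of selected fields) is a stand-in for whatever the real system uses.
import hashlib
import json

_seen_states = set()

def state_fingerprint(state: dict) -> str:
    """Stable hash of an abstracted application state."""
    canonical = json.dumps(state, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()

def maybe_run_tests(state: dict, run_tests) -> bool:
    """Run the test suite only if this state has not been seen before."""
    fp = state_fingerprint(state)
    if fp in _seen_states:
        return False            # redundant state: skip instrumentation/tests
    _seen_states.add(fp)
    run_tests(state)
    return True

# Example usage with a trivial "test suite"
maybe_run_tests({"config": "A", "cache_enabled": True}, lambda s: print("testing", s))
maybe_run_tests({"config": "A", "cache_enabled": True}, lambda s: print("testing", s))  # skipped
```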

  8. Post-examination interpretation of objective test data: monitoring and improving the quality of high-stakes examinations--a commentary on two AMEE Guides.

    PubMed

    Tavakol, Mohsen; Dennick, Reg

    2012-01-01

    As great emphasis is rightly placed upon the importance of assessment to judge the quality of our future healthcare professionals, it is appropriate not only to choose the most appropriate assessment method, but to continually monitor the quality of the tests themselves, in a hope that we may continually improve the process. This article stresses the importance of quality control mechanisms in the exam cycle and briefly outlines some of the key psychometric concepts including reliability measures, factor analysis, generalisability theory and item response theory. The importance of such analyses for the standard setting procedures is emphasised. This article also accompanies two new AMEE Guides in Medical Education (Tavakol M, Dennick R. Post-examination Analysis of Objective Tests: AMEE Guide No. 54 and Tavakol M, Dennick R. 2012. Post examination analysis of objective test data: Monitoring and improving the quality of high stakes examinations: AMEE Guide No. 66) which provide the reader with practical examples of analysis and interpretation, in order to help develop valid and reliable tests.

  9. The use of exploratory analyses within the National Institute for Health and Care Excellence single technology appraisal process: an evaluation and qualitative analysis.

    PubMed

    Kaltenthaler, Eva; Carroll, Christopher; Hill-McManus, Daniel; Scope, Alison; Holmes, Michael; Rice, Stephen; Rose, Micah; Tappenden, Paul; Woolacott, Nerys

    2016-04-01

    As part of the National Institute for Health and Care Excellence (NICE) single technology appraisal (STA) process, independent Evidence Review Groups (ERGs) critically appraise the company submission. During the critical appraisal process the ERG may undertake analyses to explore uncertainties around the company's model and their implications for decision-making. The ERG reports are a central component of the evidence considered by the NICE Technology Appraisal Committees (ACs) in their deliberations. The aim of this research was to develop an understanding of the number and type of exploratory analyses undertaken by the ERGs within the STA process and to understand how these analyses are used by the NICE ACs in their decision-making. The 100 most recently completed STAs with published guidance were selected for inclusion in the analysis. The documents considered were ERG reports, clarification letters, the first appraisal consultation document and the final appraisal determination. Over 400 documents were assessed in this study. The categories of types of exploratory analyses included fixing errors, fixing violations, addressing matters of judgement and the ERG-preferred base case. A content analysis of documents (documentary analysis) was undertaken to identify and extract relevant data, and narrative synthesis was then used to rationalise and present these data. The level and type of detail in ERG reports and clarification letters varied considerably. The vast majority (93%) of ERG reports reported one or more exploratory analyses. The most frequently reported type of analysis in these 93 ERG reports related to the category 'matters of judgement', which was reported in 83 (89%) reports. The category 'ERG base-case/preferred analysis' was reported in 45 (48%) reports, the category 'fixing errors' was reported in 33 (35%) reports and the category 'fixing violations' was reported in 17 (18%) reports. The exploratory analyses performed were the result of issues raised by an ERG in its critique of the submitted economic evidence. These analyses had more influence on recommendations earlier in the STA process than later on in the process. The descriptions of analyses undertaken were often highly specific to a particular STA and could be inconsistent across ERG reports and thus difficult to interpret. Evidence Review Groups frequently conduct exploratory analyses to test or improve the economic evaluations submitted by companies as part of the STA process. ERG exploratory analyses often have an influence on the recommendations produced by the ACs. More in-depth analysis is needed to understand how ERGs make decisions regarding which exploratory analyses should be undertaken. More research is also needed to fully understand which types of exploratory analyses are most useful to ACs in their decision-making. The National Institute for Health Research Health Technology Assessment programme.

  10. Development of a flash, bang, and smoke simulation of a shell burst

    NASA Technical Reports Server (NTRS)

    Williamson, F. R.; Kinney, J. F.; Wallace, T. V.

    1982-01-01

    A large number of experiments (cue test firings) were performed in the definition of the cue concepts and packaging configurations. A total of 344 of these experiments were recorded with instrumentation photography to allow a quantitative analysis of the smoke cloud to be made as a function of time. These analyses were predominantly made using a short test site. Supplementary long range visibility tests were conducted to insure the required 3 kilometer visibility of the smoke signature.

  11. Preparation Torque Limit for Composites Joined with Mechanical Fasteners

    NASA Technical Reports Server (NTRS)

    Thomas, Frank P.; Yi, Zhao

    2005-01-01

    Current design guidelines for determining torque ranges for composites are based on tests and analyses of isotropic materials. Properties of composites are not taken into account, and no design criteria based upon systematic analytical and test studies are available. This paper studies the maximum torque load a composite component can carry prior to any failure. Specifically, torque-tension tests are conducted. NDT techniques, including acoustic emission, thermography and photomicroscopy, are also utilized to characterize the damage modes.

  12. Analysis, design, fabrication and testing of an optical tip clearance sensor. [turbocompressor blade tips

    NASA Technical Reports Server (NTRS)

    Poppel, G. L.; Marple, D. T. F.; Kingsley, J. D.

    1981-01-01

    Analyses and the design, fabrication, and testing of an optical tip clearance sensor with intended application in aircraft propulsion control systems are reported. The design of a sensor test rig, evaluation of optical sensor components at elevated temperatures, sensor design principles, sensor test results at room temperature, and estimations of sensor accuracy at temperatures of an aircraft engine environment are discussed. Room temperature testing indicated possible measurement accuracies of less than 12.7 microns (0.5 mils). Ways to improve performance at engine operating temperatures are recommended. The potential of this tip clearance sensor is assessed.

  13. Analysis of responses of cold pressor tests on pilots and executives

    NASA Technical Reports Server (NTRS)

    Swaroop, R.

    1977-01-01

    Statistical analyses were performed to study the relationship between cold pressor test responses and certain medical attributes of a group of 81 pilots and a group of 466 executives. The important results of this study were as follows: There was a significant relationship between a subject's cold pressor test response and his profession (that is, pilot or executive). The executives' diastolic cold pressor test responses were significantly related to their medical conditions, and their families' medical conditions. Significant relationships were observed between executives' diastolic and systolic cold pressor test responses and their history of tranquilizer and cardiac drug use.

  14. Cremated human and animal remains of the Roman period--microscopic method of analysis (Sepkovcica, Croatia).

    PubMed

    Hincak, Zdravka; Mihelić, Damir; Bugar, Aleksandra

    2007-12-01

    Human and animal cremated osteological remains from twelve graves of the Roman Period from the archaeological site Sepkovcica near Velika Gorica (Turopolje region, NW Croatia) were analysed. Besides the content of urns and grave pits, fillings of grave vessels such as bowls, pots and amphoras from twenty-two grave samples were included in this study. The preservation of osteological and dental remains of human and animal origin was very poor; the majority of fragments hardly reach 10 mm in length, and the weight of each specimen barely exceeds 100 g per person. Apart from traditional macroscopic methods of analysing cremated remains, a microscopic method for determination of age at death was also tested. Fragments of femoral bone diaphyses of eighteen persons whose remains had been found on the site were analysed. Each person's age at death was presented as a range of five or ten years, and long bone fragments of an infant were detected. The taxonomic position of each analysed specimen was determined by microscopic analysis of the cremated animal bones. The results confirm the validity of the microscopic method for determining age at death of human remains and for taxonomic qualification of cremated animal remains from archaeological sites.

  15. Item validity vs. item discrimination index: a redundancy?

    NASA Astrophysics Data System (ADS)

    Panjaitan, R. L.; Irawati, R.; Sujana, A.; Hanifah, N.; Djuanda, D.

    2018-03-01

    In the literature on evaluation and test analysis, it is common to find calculations of item validity as well as the item discrimination index (D), with a different formula for each. Meanwhile, other resources state that the item discrimination index can be obtained by calculating the correlation between the testee's score on a particular item and the testee's score on the overall test, which is actually the same concept as item validity. Some research reports, especially undergraduate theses, tend to include both item validity and the item discrimination index in the instrument analysis. These concepts might overlap, for both reflect how well the test measures the examinees' ability. In this paper, examples of data processing results for item validity and the item discrimination index were compared. It is discussed whether item validity and the item discrimination index can be represented by only one of them, or whether it is better to present both calculations for simple test analysis, especially in undergraduate theses where test analyses are included.
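    The overlap discussed above can be made concrete in code: computing the item discrimination index as the correlation between an item score and the total test score is the same calculation that is often reported as item validity. The 0/1 score matrix below is invented for illustration.

```python
# Minimal sketch: item-total correlation (point-biserial for 0/1 items),
# the quantity reported both as "item validity" and as a discrimination index.
import numpy as np

# Hypothetical 0/1 score matrix: rows = examinees, columns = items
scores = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 0, 1, 0],
])
total = scores.sum(axis=1)

for item in range(scores.shape[1]):
    # Correlate the item with the total of the remaining items so the item
    # does not inflate its own correlation (corrected item-total correlation).
    rest = total - scores[:, item]
    r = np.corrcoef(scores[:, item], rest)[0, 1]
    print(f"item {item + 1}: corrected item-total correlation = {r:.2f}")
```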

  16. Differential Scanning Calorimetry and Evolved Gas Analysis at Mars Ambient Conditions Using the Thermal Evolved Gas Analyser (TEGA)

    NASA Technical Reports Server (NTRS)

    Musselwhite, D. S.; Boynton, W. V.; Ming, D. W.; Quadlander, G. A.; Kerry, K. E.; Bode, R. C.; Bailey, S. H.; Ward, M. G.; Pathare, A. V.; Lorenz, R. D.

    2000-01-01

    We are conducting DSC/EGA experiments at Mars ambient temperature and pressure using the TEGA engineering model. These tests illustrate the outstanding capabilities of a TEGA-like instrument on the surface of Mars.

  17. ANALYSIS OF REAL-TIME VEHICLE HYDROCARBON EMISSIONS DATA

    EPA Science Inventory

    The report gives results of analyses using real-time dynamometer test emissions data from 13 passenger cars to examine variations in emissions during different speeds or modes of travel. The resulting data provided a way to separately identify idle, cruise, acceleration, and dece...

  18. Development of Reasoning Test Instruments Based on TIMSS Framework for Measuring Reasoning Ability of Senior High School Student on the Physics Concept

    NASA Astrophysics Data System (ADS)

    Muslim; Suhandi, A.; Nugraha, M. G.

    2017-02-01

    The purposes of this study are to determine the quality of the reasoning test instruments developed following the framework of the Trends in International Mathematics and Science Study (TIMSS) and to analyse the reasoning skill profile of senior high school students on physics material. This research used the research and development (R&D) method; the subjects were 104 students at three senior high schools in Bandung selected by a random sampling technique. The reasoning test instruments were constructed following the TIMSS framework in multiple-choice form, with 30 questions covering five subject matters: parabolic and circular motion, Newton's law of gravity, work and energy, harmonic oscillation, and momentum and impulse. The quality of the reasoning tests was analysed using the Content Validity Ratio (CVR) and classic test analysis, including item validity, level of difficulty, discriminating power, reliability and Ferguson's delta. The students' reasoning skill profiles were analysed using the average achievement scores on the eight reasoning aspects of the TIMSS framework. The results showed that the developed reasoning tests have good quality as instruments to measure the reasoning skills of senior high school students on the five physics topics and are able to explore student reasoning on all aspects of the TIMSS framework.
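    A minimal sketch of the classic test statistics named above (item difficulty, discriminating power as an upper-lower group difference, and Ferguson's delta), computed on an invented 0/1 response matrix; it is illustrative only and does not reproduce the study's analysis.

```python
# Minimal sketch of classic test statistics: item difficulty, discriminating
# power (upper-lower 27% group difference), and Ferguson's delta.
# The 0/1 response matrix is randomly generated for illustration.
import numpy as np

responses = np.random.default_rng(1).integers(0, 2, size=(40, 30))  # 40 students, 30 items
totals = responses.sum(axis=1)

# Item difficulty: proportion of students answering each item correctly
difficulty = responses.mean(axis=0)

# Discriminating power: difficulty in top 27% minus difficulty in bottom 27%
order = np.argsort(totals)
k = max(1, int(round(0.27 * len(totals))))
disc = responses[order[-k:]].mean(axis=0) - responses[order[:k]].mean(axis=0)

# Ferguson's delta: discrimination of the whole test across total-score values
n, n_items = responses.shape
counts = np.bincount(totals, minlength=n_items + 1)     # frequency of each total score
delta = (n**2 - np.sum(counts**2)) * (n_items + 1) / (n**2 * n_items)

print("item difficulty:", difficulty.round(2))
print("discriminating power:", disc.round(2))
print(f"Ferguson's delta = {delta:.3f}")
```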

  19. Does whole blood coagulation analysis reflect developmental haemostasis?

    PubMed

    Ravn, Hanne Berg; Andreasen, Jo Bønding; Hvas, Anne-Mette

    2017-04-01

    Developmental haemostasis has been well documented over the last 3 decades and age-dependent reference ranges have been reported for a number of plasmatic coagulation parameters. With the increasing use of whole blood point-of-care tests like rotational thromboelastometry (ROTEM) and platelet function tests, an evaluation of age-dependent changes is warranted for these tests as well. We obtained blood samples from 149 children, aged 1 day to 5.9 years, and analysed conventional plasmatic coagulation tests, including activated partial thromboplastin time, prothrombin time, and fibrinogen (functional). Whole blood samples were analysed using ROTEM to assess overall coagulation capacity and the Multiplate analyzer to evaluate platelet aggregation. Age-dependent changes were analysed for all variables. We found age-dependent differences in all conventional coagulation tests (all P values < 0.05), but there was no sign of developmental changes in whole blood coagulation assessment when applying ROTEM, apart from clotting time in the EXTEM assay (P < 0.03). Despite marked differences in mean platelet aggregation between age groups, data did not reach statistical significance. Citrate-anticoagulated blood showed significantly reduced platelet aggregation compared with blood anticoagulated with heparin or hirudin (all P values < 0.003). We confirmed previous developmental changes in conventional plasmatic coagulation tests. However, these age-dependent changes were not displayed in whole blood monitoring using ROTEM or the Multiplate analyzer. The type of anticoagulant had a significant influence on platelet aggregation across all age groups.

  20. Test and analysis of Celion 3000/PMR-15, graphite/polyimide bonded composite joints: Data report

    NASA Technical Reports Server (NTRS)

    Cushman, J. B.; Mccleskey, S. F.; Ward, S. H.

    1982-01-01

    Standard single lap, double lap and symmetric step lap bonded joints of Celion 3000/PMR-15 graphite/polyimide composite were evaluated. Composite to composite and composite to titanium joints were tested at 116 K (-250 F), 294 K (70 F) and 561 K (550 F). Joint parameters evaluated are lap length, adherend thickness, adherend axial stiffness, lamina stacking sequence and adherend tapering. Advanced joint concepts were examined to establish the change in performance of preformed adherends, scalloped adherends and hybrid systems. The material properties of the high temperature adhesive, designated A7F, used for bonding were established. The bonded joint tests resulted in interlaminar shear or peel failures of the composite and there were very few adhesive failures. Average test results agree with expected performance trends for the various test parameters. Results of finite element analyses and of test/analysis correlations are also presented.

  1. Reliability of Computerized Neurocognitive Tests for Concussion Assessment: A Meta-Analysis.

    PubMed

    Farnsworth, James L; Dargo, Lucas; Ragan, Brian G; Kang, Minsoo

    2017-09-01

      Although widely used, computerized neurocognitive tests (CNTs) have been criticized because of low reliability and poor sensitivity. A systematic review was published summarizing the reliability of Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) scores; however, this was limited to a single CNT. Expansion of the previous review to include additional CNTs and a meta-analysis is needed. Therefore, our purpose was to analyze reliability data for CNTs using meta-analysis and examine moderating factors that may influence reliability.   A systematic literature search (key terms: reliability, computerized neurocognitive test, concussion) of electronic databases (MEDLINE, PubMed, Google Scholar, and SPORTDiscus) was conducted to identify relevant studies.   Studies were included if they met all of the following criteria: used a test-retest design, involved at least 1 CNT, provided sufficient statistical data to allow for effect-size calculation, and were published in English.   Two independent reviewers investigated each article to assess inclusion criteria. Eighteen studies involving 2674 participants were retained. Intraclass correlation coefficients were extracted to calculate effect sizes and determine overall reliability. The Fisher Z transformation adjusted for sampling error associated with averaging correlations. Moderator analyses were conducted to evaluate the effects of the length of the test-retest interval, intraclass correlation coefficient model selection, participant demographics, and study design on reliability. Heterogeneity was evaluated using the Cochran Q statistic.   The proportion of acceptable outcomes was greatest for the Axon Sports CogState Test (75%) and lowest for the ImPACT (25%). Moderator analyses indicated that the type of intraclass correlation coefficient model used significantly influenced effect-size estimates, accounting for 17% of the variation in reliability.   The Axon Sports CogState Test, which has a higher proportion of acceptable outcomes and shorter test duration relative to other CNTs, may be a reliable option; however, future studies are needed to compare the diagnostic accuracy of these instruments.
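    The Fisher Z step mentioned above can be illustrated briefly: test-retest coefficients are transformed to Z, averaged (here weighted by n - 3), and back-transformed to the correlation metric. The coefficients and sample sizes below are hypothetical.

```python
# Minimal sketch: averaging test-retest reliability coefficients via the
# Fisher Z transformation, weighting by (n - 3). Values are hypothetical.
import numpy as np

iccs = np.array([0.62, 0.75, 0.58, 0.80])   # hypothetical reliability coefficients
ns   = np.array([ 40,   85,   55,  120])    # hypothetical study sample sizes

z = np.arctanh(iccs)                        # Fisher r-to-Z transformation
z_mean = np.average(z, weights=ns - 3)      # weight by inverse sampling variance ~ (n - 3)
pooled_r = np.tanh(z_mean)                  # back-transform to the correlation metric

print(f"pooled reliability = {pooled_r:.2f}")
```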

  2. A qualitative exploration of the human resource policy implications of voluntary counselling and testing scale-up in Kenya: applying a model for policy analysis

    PubMed Central

    2011-01-01

    Background Kenya experienced rapid scale up of HIV testing and counselling services in government health services from 2001. We set out to examine the human resource policy implications of scaling up HIV testing and counselling in Kenya and to analyse the resultant policy against a recognised theoretical framework of health policy reform (policy analysis triangle). Methods Qualitative methods were used to gain in-depth insights from policy makers who shaped scale up. This included 22 in-depth interviews with Voluntary Counselling and Testing (VCT) task force members, critical analysis of 53 sets of minutes and diary notes. We explore points of consensus and conflict amongst policymakers in Kenya and analyse this content to assess who favoured and resisted new policies, how scale up was achieved and the importance of the local context in which scale up occurred. Results The scale up of VCT in Kenya had a number of human resource policy implications resulting from the introduction of lay counsellors and their authorisation to conduct rapid HIV testing using newly introduced rapid testing technologies. Our findings indicate that three key groups of actors were critical: laboratory professionals, counselling associations and the Ministry of Health. Strategic alliances between donors, NGOs and these three key groups underpinned the process. The process of reaching consensus required compromise and time commitment but was critical to a unified nationwide approach. Policies around quality assurance were integral in ensuring standardisation of content and approach. Conclusion The introduction and scale up of new health service initiatives such as HIV voluntary counselling and testing necessitates changes to existing health systems and modification of entrenched interests around professional counselling and laboratory testing. Our methodological approach enabled exploration of complexities of scale up of HIV testing and counselling in Kenya. We argue that a better understanding of the diverse actors, the context and the process, is required to mitigate risks and maximise impact. PMID:22008721

  3. FLOW TESTING AND ANALYSIS OF THE FSP-1 EXPERIMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hawkes, Grant L.; Jones, Warren F.; Marcum, Wade

    The U.S. High Performance Research Reactor Conversions fuel development team is focused on developing and qualifying the uranium-molybdenum (U-Mo) alloy monolithic fuel to support conversion of domestic research reactors to low enriched uranium. Several previous irradiations have demonstrated the favorable behavior of the monolithic fuel. The Full Scale Plate 1 (FSP-1) fuel plate experiment will be irradiated in the northeast (NE) flux trap of the Advanced Test Reactor (ATR). This fueled experiment contains six aluminum-clad fuel plates consisting of monolithic U-Mo fuel meat. Flow testing experimentation and hydraulic analysis have been performed on the FSP-1 experiment to be irradiated in the ATR at the Idaho National Laboratory (INL). A flow test experiment mockup of the FSP-1 experiment was completed at Oregon State University. Results of several flow test experiments are compared with analyses. This paper reports the hydraulic analyses and shows that they are nearly identical to the flow test results. A water velocity of 14.0 meters per second is targeted between the fuel plates. Comparisons between FSP-1 measurements and this target will be discussed. This flow rate dominates the flow characteristics of the experiment and model. Separate branch flows have minimal effect on the overall experiment. A square flow orifice was placed to control the flowrate through the experiment. Four different orifices were tested. A flow versus delta P curve for each orifice is reported herein. Fuel plates with depleted uranium in the fuel meat zone were used in one of the flow tests. This test was performed to evaluate flow test vibration with actual fuel meat densities and is reported herein. Fuel plate deformation tests were also performed and reported.
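    The flow-versus-delta-P curves reported for the orifices follow the usual square-root relationship for orifice flow; the sketch below evaluates that relationship for a hypothetical orifice area and discharge coefficient, purely to illustrate the shape of such a curve, and does not use the FSP-1 geometry.

```python
# Minimal sketch: volumetric flow through a square-edged orifice as a
# function of pressure drop, Q = Cd * A * sqrt(2 * dP / rho).
# The orifice size and discharge coefficient are hypothetical.
import numpy as np

cd = 0.62                      # hypothetical discharge coefficient
side = 0.010                   # hypothetical square orifice side length, m
area = side**2                 # orifice area, m^2
rho = 998.0                    # water density, kg/m^3

dp = np.linspace(10e3, 400e3, 5)           # pressure drops, Pa
q = cd * area * np.sqrt(2.0 * dp / rho)    # volumetric flow, m^3/s

for dpi, qi in zip(dp, q):
    print(f"dP = {dpi/1e3:6.0f} kPa  ->  Q = {qi*1000:6.3f} L/s")
```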

  4. Experiment Design and Analysis Guide - Neutronics & Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Misti A Lillo

    2014-06-01

    The purpose of this guide is to provide a consistent, standardized approach to performing neutronics/physics analysis for experiments inserted into the Advanced Test Reactor (ATR). This document provides neutronics/physics analysis guidance to support experiment design and analysis needs for experiments irradiated in the ATR. This guide addresses neutronics/physics analysis in support of experiment design, experiment safety, and experiment program objectives and goals. The intent of this guide is to provide a standardized approach for performing typical neutronics/physics analyses. Deviation from this guide is allowed provided that neutronics/physics analysis details are properly documented in an analysis report.

  5. Analysis of operation UPSHOT-KNOTHOLE nuclear test BADGER radiological and meteorological data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinn, V.E.

    1986-04-01

    This report describes the Weather Service Nuclear Support Office (WSNSO) analyses of the radiological and meteorological data collected for the BADGER nuclear test of Operation UPSHOT-KNOTHOLE. Inconsistencies in the radiological data and their resolution are discussed. The methods of normalizing the radiological data to a standard time, of converting the aerial data to equivalent ground-level values, and of estimating fallout-arrival times are presented. The meteorological situations on event day and the following day are described. A comparison of the WSNSO fallout analysis with an analysis performed during the 1950s is presented. The radiological data used to derive the WSNSO fallout pattern are tabulated in an appendix.

  6. To analyse a trace or not? Evaluating the decision-making process in the criminal investigation.

    PubMed

    Bitzer, Sonja; Ribaux, Olivier; Albertini, Nicola; Delémont, Olivier

    2016-05-01

    In order to broaden our knowledge and understanding of the decision steps in the criminal investigation process, we started by evaluating the decision to analyse a trace and the factors involved in this decision step. This decision step is embedded in the complete criminal investigation process, involving multiple decision and triaging steps. Considering robbery cases occurring in a geographic region during a 2-year period, we studied the factors influencing the decision to submit biological traces, directly sampled on the scene of the robbery or on collected objects, for analysis. The factors were categorised into five knowledge dimensions (strategic, immediate, physical, criminal and utility), and decision tree analysis was carried out. Factors in each category played a role in the decision to analyse a biological trace. Interestingly, factors involving information available prior to the analysis are of importance, such as the fact that a positive result (a profile suitable for comparison) is already available in the case, or that a suspect has been identified through traditional police work before analysis. One factor that was taken into account, but was not significant, is the matrix of the trace. Hence, the decision to analyse a trace is not influenced by this variable. The decision to analyse a trace first is very complex and many of the tested variables were taken into account. The decisions are often made on a case-by-case basis. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
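
    A minimal sketch of the kind of decision-tree analysis described above is given below, using scikit-learn on synthetic data; the feature names (profile_in_case, suspect_identified, object_collected), the data, and the outcome coding are illustrative stand-ins, not the study's actual variables.

        # Hedged sketch: fit and print a small decision tree for a binary
        # "submit trace for analysis" outcome from a few binary case factors.
        # All variables and data here are synthetic placeholders.
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(0)
        n = 200
        X = rng.integers(0, 2, size=(n, 3))                # three binary case factors
        # Synthetic rule: submit when no profile is yet in the case and no suspect
        # has been identified, with 10% label noise.
        y = ((X[:, 0] == 0) & (X[:, 1] == 0)) ^ (rng.random(n) < 0.1)

        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y.astype(int))
        print(export_text(tree, feature_names=["profile_in_case",
                                               "suspect_identified",
                                               "object_collected"]))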

  7. Analysis of Errors and Misconceptions in the Learning of Calculus by Undergraduate Students

    ERIC Educational Resources Information Center

    Muzangwa, Jonatan; Chifamba, Peter

    2012-01-01

    This paper analyses errors and misconceptions in an undergraduate course in Calculus. The study is based on a group of 10 BEd Mathematics students at Great Zimbabwe University. Data were gathered through the use of two exercises on Calculus 1 & 2. The analysis of the results from the tests showed that a majority of the errors were due…

  8. Cross-National Analysis of Islamic Fundamentalism

    DTIC Science & Technology

    2016-01-20

    attitudes, and was fully involved in activities concerning questionnaire design including a new experimental design in the survey, pilot testing, and...possible collaboration with the research design of the panel survey in Tunisia. • Data analysis: Analyses of religious fundamentalism, women’s dress, trust...the Event History Calendar and the best methods to ask about knowledge and experience of past events. The group designed a series of cognitive

  9. Quantified Choice of Root-Mean-Square Errors of Approximation for Evaluation and Power Analysis of Small Differences between Structural Equation Models

    ERIC Educational Resources Information Center

    Li, Libo; Bentler, Peter M.

    2011-01-01

    MacCallum, Browne, and Cai (2006) proposed a new framework for evaluation and power analysis of small differences between nested structural equation models (SEMs). In their framework, the null and alternative hypotheses for testing a small difference in fit and its related power analyses were defined by some chosen root-mean-square error of…
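
    For reference, the standard maximum-likelihood point estimate of the RMSEA for a single fitted model (the quantity this framework builds on; the formula below is the conventional definition, not something stated in the truncated abstract above) is

        \hat{\varepsilon} = \sqrt{\max\!\left(\frac{\chi^2 - df}{df\,(N-1)},\; 0\right)}

    where chi-square is the model test statistic, df its degrees of freedom, and N the sample size.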

  10. Quantitative analysis of tympanic membrane perforation: a simple and reliable method.

    PubMed

    Ibekwe, T S; Adeosun, A A; Nwaorgu, O G

    2009-01-01

    Accurate assessment of the features of tympanic membrane perforation, especially size, site, duration and aetiology, is important, as it enables optimum management. To describe a simple, cheap and effective method of quantitatively analysing tympanic membrane perforations. The system described comprises a video-otoscope (capable of generating still and video images of the tympanic membrane), adapted via a universal serial bus box to a computer screen, with images analysed using the Image J geometrical analysis software package. The reproducibility of results and their correlation with conventional otoscopic methods of estimation were tested statistically with the paired t-test and correlational tests, using the Statistical Package for the Social Sciences version 11 software. The following equation was generated: P/T × 100 per cent = percentage perforation, where P is the area (in pixels²) of the tympanic membrane perforation and T is the total area (in pixels²) for the entire tympanic membrane (including the perforation). Illustrations are shown. Comparison of blinded data on tympanic membrane perforation area obtained independently from assessments by two trained otologists, of comparative years of experience, using the video-otoscopy system described, showed similar findings, with strong correlations devoid of inter-observer error (p = 0.000, r = 1). Comparison with conventional otoscopic assessment also indicated significant correlation, comparing results for two trained otologists, but some inter-observer variation was present (p = 0.000, r = 0.896). Correlation between the two methods for each of the otologists was also highly significant (p = 0.000). A computer-adapted video-otoscope, with images analysed by Image J software, represents a cheap, reliable, technology-driven, clinical method of quantitative analysis of tympanic membrane perforations and injuries.
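
    The percentage calculation above reduces to a ratio of pixel counts; a minimal sketch follows, with synthetic masks standing in for the regions that would in practice be outlined on the video-otoscope image in Image J.

        # Sketch of the percentage-perforation calculation: P / T * 100, where
        # P is the perforation area and T the total membrane area, both in pixels^2.
        # The masks below are synthetic stand-ins for traced image regions.
        import numpy as np

        def percentage_perforation(perforation_mask, membrane_mask):
            """Both arguments are boolean arrays marking the respective regions."""
            p = perforation_mask.sum()   # perforation area in pixels^2
            t = membrane_mask.sum()      # total membrane area, including the perforation
            return 100.0 * p / t

        # Toy example: a 100x100-pixel membrane with a 20x20-pixel perforation
        membrane = np.zeros((200, 200), dtype=bool)
        membrane[50:150, 50:150] = True
        perforation = np.zeros_like(membrane)
        perforation[80:100, 80:100] = True
        print(f"{percentage_perforation(perforation, membrane):.1f}% perforation")  # 4.0%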

  11. Does the growth response of woody plants to elevated CO2 increase with temperature? A model-oriented meta-analysis.

    PubMed

    Baig, Sofia; Medlyn, Belinda E; Mercado, Lina M; Zaehle, Sönke

    2015-12-01

    The temperature dependence of the reaction kinetics of the Rubisco enzyme implies that, at the level of a chloroplast, the response of photosynthesis to rising atmospheric CO2 concentration (Ca) will increase with increasing air temperature. Vegetation models incorporating this interaction predict that the response of net primary productivity (NPP) to elevated CO2 (eCa) will increase with rising temperature and will be substantially larger in warm tropical forests than in cold boreal forests. We tested these model predictions against evidence from eCa experiments by carrying out two meta-analyses. Firstly, we tested for an interaction effect on growth responses in factorial eCa × temperature experiments. This analysis showed a positive, but nonsignificant interaction effect (95% CI for above-ground biomass response = -0.8, 18.0%) between eCa and temperature. Secondly, we tested field-based eCa experiments on woody plants across the globe for a relationship between the eCa effect on plant biomass and mean annual temperature (MAT). This second analysis showed a positive but nonsignificant correlation between the eCa response and MAT. The magnitudes of the interactions between CO2 and temperature found in both meta-analyses were consistent with model predictions, even though both analyses gave nonsignificant results. Thus, we conclude that it is not possible to distinguish between the competing hypotheses of no interaction vs. an interaction based on Rubisco kinetics from the available experimental database. Experiments in a wider range of temperature zones are required. Until such experimental data are available, model predictions should aim to incorporate uncertainty about this interaction. © 2015 John Wiley & Sons Ltd.
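
    As an illustration of the pooling step behind such meta-analyses, the sketch below performs a simple DerSimonian-Laird random-effects pooling of log response ratios; the effect sizes and variances are invented for illustration and are not the data analysed in the study.

        # Hedged sketch: DerSimonian-Laird random-effects meta-analysis of log
        # response ratios with a 95% confidence interval. Inputs are invented.
        import math

        def dersimonian_laird(effects, variances):
            k = len(effects)
            w = [1.0 / v for v in variances]
            mean_fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
            q = sum(wi * (yi - mean_fixed) ** 2 for wi, yi in zip(w, effects))
            c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
            tau2 = max(0.0, (q - (k - 1)) / c)          # between-study variance
            w_re = [1.0 / (v + tau2) for v in variances]
            mean_re = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
            se = math.sqrt(1.0 / sum(w_re))
            return mean_re, (mean_re - 1.96 * se, mean_re + 1.96 * se)

        effects = [0.12, 0.05, 0.20, -0.02]     # illustrative per-study log response ratios
        variances = [0.01, 0.02, 0.015, 0.03]   # illustrative within-study variances
        mean, ci = dersimonian_laird(effects, variances)
        print(f"pooled effect = {mean:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")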

  12. Beware of Kinked Frontiers: A Systematic Review of the Choice of Comparator Strategies in Cost-Effectiveness Analyses of Human Papillomavirus Testing in Cervical Screening.

    PubMed

    O'Mahony, James F; Naber, Steffie K; Normand, Charles; Sharp, Linda; O'Leary, John J; de Kok, Inge M C M

    2015-12-01

    To systematically review the choice of comparator strategies in cost-effectiveness analyses (CEAs) of human papillomavirus testing in cervical screening. The PubMed, Web of Knowledge, and Scopus databases were searched to identify eligible model-based CEAs of cervical screening programs using human papillomavirus testing. The eligible CEAs were reviewed to investigate what screening strategies were chosen for analysis and how this choice might have influenced estimates of the incremental cost-effectiveness ratio (ICER). Selected examples from the reviewed studies are presented to illustrate how the omission of relevant comparators might influence estimates of screening cost-effectiveness. The search identified 30 eligible CEAs. The omission of relevant comparator strategies appears likely in 18 studies. The ICER estimates in these cases are probably lower than would be estimated had more comparators been included. Five of the 30 studies restricted relevant comparator strategies to sensitivity analyses or other subanalyses not part of the principal base-case analysis. Such exclusion of relevant strategies from the base-case analysis can result in cost-ineffective strategies being identified as cost-effective. Many of the CEAs reviewed appear to include insufficient comparator strategies. In particular, they omit strategies with relatively long screening intervals. Omitting relevant comparators matters particularly if it leads to the underestimation of ICERs for strategies around the cost-effectiveness threshold because these strategies are the most policy relevant from the CEA perspective. Consequently, such CEAs may not provide the best possible policy guidance and may lead to the mistaken adoption of cost-ineffective screening strategies. Copyright © 2015 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
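
    The comparator problem described above can be made concrete with a small sketch: ICERs are computed along the cost-effectiveness frontier after removing dominated and extendedly dominated strategies, so dropping an intermediate comparator lowers the apparent ICER of the next strategy. The strategy names, costs and effects below are invented for illustration.

        # Hedged sketch: build a cost-effectiveness frontier and report the ICER
        # of each non-dominated strategy relative to the next cheaper one.
        def frontier_icers(strategies):
            """strategies: list of (name, cost, effect). Returns (name, ICER) pairs."""
            s = sorted(strategies, key=lambda x: (x[2], x[1]))   # by effect, then cost
            frontier = []
            for name, cost, effect in s:
                # drop strongly dominated points (more costly, no more effective)
                while frontier and cost <= frontier[-1][1]:
                    frontier.pop()
                # drop extendedly dominated points (ICER not increasing along frontier)
                while len(frontier) >= 2:
                    (n1, c1, e1), (n2, c2, e2) = frontier[-2], frontier[-1]
                    if (c2 - c1) / (e2 - e1) >= (cost - c2) / (effect - e2):
                        frontier.pop()
                    else:
                        break
                frontier.append((name, cost, effect))
            return [(n2, (c2 - c1) / (e2 - e1))
                    for (n1, c1, e1), (n2, c2, e2) in zip(frontier, frontier[1:])]

        strategies = [("no screening", 0.0, 20.00),
                      ("HPV 5-yearly", 300.0, 20.05),
                      ("HPV 3-yearly", 500.0, 20.06)]
        print(frontier_icers(strategies))
        # Omitting "HPV 5-yearly" would make the 3-yearly ICER appear to be
        # 500 / 0.06 ~ 8,333 instead of 20,000 per unit of effect.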

  13. Influence of Structural Features and Fracture Processes on Surface Roughness: A Case Study from the Krosno Sandstones of the Górka-Mucharz Quarry (Little Beskids, Southern Poland)

    NASA Astrophysics Data System (ADS)

    Pieczara, Łukasz

    2015-09-01

    The paper presents the results of analysis of surface roughness parameters in the Krosno Sandstones of Mucharz, southern Poland. It was aimed at determining whether these parameters are influenced by structural features (mainly the laminar distribution of mineral components and directional distribution of non-isometric grains) and fracture processes. The tests applied in the analysis enabled us to determine and describe the primary statistical parameters used in the quantitative description of surface roughness, as well as specify the usefulness of contact profilometry as a method of visualizing spatial differentiation of fracture processes in rocks. These aims were achieved by selecting a model material (Krosno Sandstones from the Górka-Mucharz Quarry) and an appropriate research methodology. The schedule of laboratory analyses included: identification analyses connected with non-destructive ultrasonic tests, aimed at the preliminary determination of rock anisotropy, strength point load tests (cleaved surfaces were obtained due to destruction of rock samples), microscopic analysis (observation of thin sections in order to determine the mechanism of inducing fracture processes) and a test method of measuring surface roughness (two- and three-dimensional diagrams, topographic and contour maps, and statistical parameters of surface roughness). The highest values of roughness indicators were achieved for surfaces formed under the influence of intragranular fracture processes (cracks propagating directly through grains). This is related to the structural features of the Krosno Sandstones (distribution of lamination and bedding).
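
    A minimal sketch of basic statistical roughness parameters of the kind referred to above (arithmetic mean deviation Ra, root-mean-square deviation Rq and height-distribution skewness Rsk) is given below; the synthetic profile stands in for a measured profilometer trace, and this parameter set is a common choice rather than necessarily the exact set used in the paper.

        # Hedged sketch: compute Ra, Rq and Rsk from a profile referenced to its mean line.
        import numpy as np

        def roughness_parameters(profile_heights):
            z = np.asarray(profile_heights, dtype=float)
            z = z - z.mean()                       # reference heights to the mean line
            ra = np.abs(z).mean()                  # arithmetic mean deviation
            rq = np.sqrt((z ** 2).mean())          # root-mean-square deviation
            rsk = (z ** 3).mean() / rq ** 3        # skewness of the height distribution
            return ra, rq, rsk

        # Synthetic profile standing in for a profilometer trace
        x = np.linspace(0, 10, 1000)
        profile = 0.5 * np.sin(2 * np.pi * x) + \
                  0.05 * np.random.default_rng(1).standard_normal(x.size)
        print("Ra = %.3f, Rq = %.3f, Rsk = %.3f" % roughness_parameters(profile))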

  14. Microcracking, microcrack-induced delamination, and longitudinal splitting of advanced composite structures

    NASA Technical Reports Server (NTRS)

    Nairn, John A.

    1992-01-01

    A combined analytical and experimental study was conducted to analyze microcracking, microcrack-induced delamination, and longitudinal splitting in polymer matrix composites. Strain energy release rates, calculated by a variational analysis, were used in a failure criterion to predict microcracking. Predictions and test results were compared for static, fatigue, and cyclic thermal loading. The longitudinal splitting analysis accounted for the effects of fiber bridging. Test data are analyzed and compared for longitudinal splitting and delamination under mixed-mode loading. This study emphasizes the importance of using fracture mechanics analyses to understand the complex failure processes that govern composite strength and life.

  15. Analysis of rolling contact spall life in 440 C steel bearing rims

    NASA Technical Reports Server (NTRS)

    Bastias, P. C.; Bhargava, V.; Bower, A. P.; Du, J.; Gupta, V.; Hahn, G. T.; Kulkarni, S. M.; Kumar, A. M.; Leng, X.; Rubin, C. A.

    1991-01-01

    The results of a two-year study of the mechanisms of spall failure in the HPOTP bearings are described. The objective was to build a foundation for detailed analyses of the contact life in terms of cyclic plasticity, contact mechanics, spall nucleation, and spall growth. Since laboratory rolling contact testing is carried out in a 3-ball/rod contact fatigue testing machine, the analysis of the contacts and contact lives produced in this machine received particular attention. The results from the experimentally observed growth lives are compared with predictions derived from the fracture mechanics calculations.

  16. Statistical analysis of fNIRS data: a comprehensive review.

    PubMed

    Tak, Sungho; Ye, Jong Chul

    2014-01-15

    Functional near-infrared spectroscopy (fNIRS) is a non-invasive method to measure brain activities using the changes of optical absorption in the brain through the intact skull. fNIRS has many advantages over other neuroimaging modalities such as positron emission tomography (PET), functional magnetic resonance imaging (fMRI), or magnetoencephalography (MEG), since it can directly measure blood oxygenation level changes related to neural activation with high temporal resolution. However, fNIRS signals are highly corrupted by measurement noises and physiology-based systemic interference. Careful statistical analyses are therefore required to extract neuronal activity-related signals from fNIRS data. In this paper, we provide an extensive review of historical developments of statistical analyses of fNIRS signal, which include motion artifact correction, short source-detector separation correction, principal component analysis (PCA)/independent component analysis (ICA), false discovery rate (FDR), serially-correlated errors, as well as inference techniques such as the standard t-test, F-test, analysis of variance (ANOVA), and statistical parameter mapping (SPM) framework. In addition, to provide a unified view of various existing inference techniques, we explain a linear mixed effect model with restricted maximum likelihood (ReML) variance estimation, and show that most of the existing inference methods for fNIRS analysis can be derived as special cases. Some of the open issues in statistical analysis are also described. Copyright © 2013 Elsevier Inc. All rights reserved.
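
    To make the inference step concrete, the sketch below fits an ordinary least-squares GLM to a synthetic channel and tests a single contrast on the task regressor with a t-test; real fNIRS pipelines would add HRF convolution, drift regressors and serial-correlation correction (for example pre-whitening), none of which is shown here, and all data are synthetic.

        # Hedged sketch: GLM fit by least squares plus a one-sided t-test on the
        # task regressor, the basic inference step common to SPM-style analyses.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n = 300
        task = (np.arange(n) % 40 < 20).astype(float)     # boxcar task regressor
        X = np.column_stack([task, np.ones(n)])           # design matrix: [task, baseline]
        y = X @ np.array([0.8, 2.0]) + rng.standard_normal(n)   # synthetic HbO signal

        beta, res, rank, _ = np.linalg.lstsq(X, y, rcond=None)
        dof = n - rank
        sigma2 = res[0] / dof                             # residual variance estimate
        c = np.array([1.0, 0.0])                          # contrast: task effect > 0
        var_c = sigma2 * c @ np.linalg.inv(X.T @ X) @ c
        t_stat = (c @ beta) / np.sqrt(var_c)
        p_one_sided = stats.t.sf(t_stat, dof)
        print(f"t({dof}) = {t_stat:.2f}, one-sided p = {p_one_sided:.3g}")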

  17. Determination of Shear Wave Velocity in Offshore Terengganu for Ground Response Analysis

    NASA Astrophysics Data System (ADS)

    Mazlina, M.; Liew, M. S.; Adnan, A.; Harahap, I. S. H.; Hamid, N. A.

    2018-04-01

    The amount of vibration received at any location can be analysed by conducting a ground response analysis. Although three different methods are available for this analysis, the one-dimensional ground response analysis method has been the most widely used. Shear wave velocity is one of the key parameters in this analysis. Many correlations have been formulated to determine shear wave velocity from cone penetration test data. In this study, correlations developed for the Quaternary geological age were selected. Six equations were adopted, comprising both 'all soil' and soil-type-dependent correlations. Two platform sites, one consisting of clay and the other of a combination of clay and sand, were analysed. The shear wave velocity to be used in the ground response analysis was obtained. Results are illustrated in graphs in which the shear wave velocity for each case is plotted. To avoid under- or over-predicting the shear wave velocity, the average of the 'all soil' and soil-type-dependent results is used as the final Vs value.
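
    The averaging step described above can be sketched as follows; the power-law correlations in the code are placeholders with made-up coefficients, not the six published Quaternary-age correlations used in the study, and the cone resistance and sleeve friction values are arbitrary.

        # Hedged sketch: evaluate several Vs-CPT correlations at one depth and
        # average them to obtain the final Vs. Coefficients are placeholders.
        def vs_estimates(qc_kpa, fs_kpa):
            """Return a list of Vs (m/s) estimates from illustrative correlations."""
            return [
                50.0 * (qc_kpa / 100.0) ** 0.40,                              # placeholder "all soil" form
                40.0 * (qc_kpa / 100.0) ** 0.35 * (fs_kpa / 10.0) ** 0.10,    # placeholder clay form
                60.0 * (qc_kpa / 100.0) ** 0.30,                              # placeholder sand form
            ]

        def mean_vs(qc_kpa, fs_kpa):
            est = vs_estimates(qc_kpa, fs_kpa)
            return sum(est) / len(est)

        print(f"Vs ~ {mean_vs(qc_kpa=2500.0, fs_kpa=60.0):.0f} m/s")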

  18. A method for identifying EMI critical circuits during development of a large C3

    NASA Astrophysics Data System (ADS)

    Barr, Douglas H.

    The circuit analysis methods and process Boeing Aerospace used on a large, ground-based military command, control, and communications (C3) system are described. This analysis was designed to help identify electromagnetic interference (EMI) critical circuits. The methodology used the MIL-E-6051 equipment criticality categories as the basis for defining critical circuits, relational database technology to help sort through and account for all of the approximately 5000 system signal cables, and Macintosh Plus personal computers to predict critical circuits based on safety margin analysis. The EMI circuit analysis process systematically examined all system circuits to identify which ones were likely to be EMI critical. The process used two separate, sequential safety margin analyses to identify critical circuits (conservative safety margin analysis, and detailed safety margin analysis). These analyses used field-to-wire and wire-to-wire coupling models using both worst-case and detailed circuit parameters (physical and electrical) to predict circuit safety margins. This process identified the predicted critical circuits that could then be verified by test.
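
    A hedged sketch of the safety-margin screen described above follows: each circuit's margin is the susceptibility threshold minus the predicted coupled interference level, and circuits whose worst-case margin falls below the required margin are flagged as EMI-critical. The required margin, circuit names and levels are assumed illustrative values, not those of the Boeing analysis or of MIL-E-6051.

        # Hedged sketch: flag circuits whose worst-case safety margin (dB) falls
        # below an assumed required margin. All numbers here are illustrative.
        REQUIRED_MARGIN_DB = 6.0   # assumed requirement for critical circuits

        def safety_margin_db(threshold_dbuv, predicted_dbuv):
            return threshold_dbuv - predicted_dbuv

        circuits = [
            # (name, susceptibility threshold dBuV, worst-case predicted coupling dBuV)
            ("launch_enable_discrete", 60.0, 58.0),
            ("status_telemetry_line", 80.0, 50.0),
        ]
        for name, thr, pred in circuits:
            margin = safety_margin_db(thr, pred)
            flag = "CRITICAL" if margin < REQUIRED_MARGIN_DB else "ok"
            print(f"{name:26s} margin = {margin:5.1f} dB -> {flag}")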

  19. Extreme between-study homogeneity in meta-analyses could offer useful insights.

    PubMed

    Ioannidis, John P A; Trikalinos, Thomas A; Zintzaras, Elias

    2006-10-01

    Meta-analyses are routinely evaluated for the presence of large between-study heterogeneity. We examined whether it is also important to probe whether there is extreme between-study homogeneity. We used heterogeneity tests with left-sided statistical significance for inference and developed a Monte Carlo simulation test for testing extreme homogeneity in risk ratios across studies, using the empiric distribution of the summary risk ratio and heterogeneity statistic. A left-sided P=0.01 threshold was set for claiming extreme homogeneity to minimize type I error. Among 11,803 meta-analyses with binary contrasts from the Cochrane Library, 143 (1.21%) had left-sided P-value <0.01 for the asymptotic Q statistic and 1,004 (8.50%) had left-sided P-value <0.10. The frequency of extreme between-study homogeneity did not depend on the number of studies in the meta-analyses. We identified examples where extreme between-study homogeneity (left-sided P-value <0.01) could result from various possibilities beyond chance. These included inappropriate statistical inference (asymptotic vs. Monte Carlo), use of a specific effect metric, correlated data or stratification using strong predictors of outcome, and biases and potential fraud. Extreme between-study homogeneity may provide useful insights about a meta-analysis and its constituent studies.
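
    The asymptotic left-sided test described above can be sketched directly: compute Cochran's Q for the log risk ratios and take the probability that a chi-square variable with k-1 degrees of freedom falls at or below Q as the left-sided p-value, flagging extreme homogeneity below 0.01. The per-study effects and variances below are invented.

        # Hedged sketch: Cochran's Q for log risk ratios and its asymptotic
        # left-sided p-value (unusually small Q indicates extreme homogeneity).
        import numpy as np
        from scipy import stats

        log_rr = np.array([0.10, 0.11, 0.09, 0.10, 0.105])      # invented per-study log risk ratios
        var_log_rr = np.array([0.04, 0.05, 0.03, 0.06, 0.04])   # invented within-study variances

        w = 1.0 / var_log_rr
        pooled = np.sum(w * log_rr) / np.sum(w)
        q = np.sum(w * (log_rr - pooled) ** 2)
        k = log_rr.size
        p_left = stats.chi2.cdf(q, df=k - 1)      # left-sided p-value
        print(f"Q = {q:.3f}, left-sided p = {p_left:.3g}"
              + ("  -> extreme homogeneity" if p_left < 0.01 else ""))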

  20. Real-time analysis system for gas turbine ground test acoustic measurements.

    PubMed

    Johnston, Robert T

    2003-10-01

    This paper provides an overview of a data system upgrade to the Pratt and Whitney facility designed for making acoustic measurements on aircraft gas turbine engines. A data system upgrade was undertaken because the return-on-investment was determined to be extremely high. That is, the savings on the first test series recovered the cost of the hardware. The commercial system selected for this application utilizes 48 input channels, which allows 1/3 octave and/or narrow-band analyses to be performed in real time. A high-speed disk drive allows raw data from all 48 channels to be stored simultaneously while the analyses are being performed. Results of tests to ensure compliance of the new system with regulations and with existing systems are presented. Test times were reduced from 5 h to 1 h of engine run time per engine configuration by the introduction of this new system. Conservative cost reduction estimates for future acoustic testing are 75% on items related to engine run time and 50% on items related to the overall length of the test.
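
    For orientation, the sketch below approximates a 1/3-octave band analysis by summing periodogram power within bands whose centre frequencies are spaced a factor of 2^(1/3) apart; certification-grade systems use standardised fractional-octave filters, so this FFT-binning approach is only an approximation, and the test tone and band range are arbitrary.

        # Hedged sketch: approximate 1/3-octave band levels by FFT binning of the
        # one-sided periodogram; band edges are fc * 2^(-1/6) .. fc * 2^(1/6).
        import numpy as np

        def third_octave_levels(x, fs, centres):
            spec = np.fft.rfft(x)
            psd = (np.abs(spec) ** 2) / (len(x) * fs)   # one-sided periodogram scaling
            psd[1:-1] *= 2.0
            freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
            df = freqs[1] - freqs[0]
            levels = []
            for fc in centres:
                lo, hi = fc * 2 ** (-1 / 6), fc * 2 ** (1 / 6)
                band_power = psd[(freqs >= lo) & (freqs < hi)].sum() * df
                levels.append(10 * np.log10(band_power + 1e-30))
            return levels

        fs = 48000
        t = np.arange(fs) / fs
        signal = np.sin(2 * np.pi * 1000 * t)              # 1 kHz test tone
        centres = [500 * 2 ** (n / 3) for n in range(7)]   # 500 Hz .. ~2 kHz
        for fc, level in zip(centres, third_octave_levels(signal, fs, centres)):
            print(f"fc = {fc:7.1f} Hz : {level:6.1f} dB")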
