Modeling Local Item Dependence Due to Common Test Format with a Multidimensional Rasch Model
ERIC Educational Resources Information Center
Baghaei, Purya; Aryadoust, Vahid
2015-01-01
Research shows that test method can exert a significant impact on test takers' performance and thereby contaminate test scores. We argue that common test method can exert the same effect as common stimuli and violate the conditional independence assumption of item response theory models because, in general, subsets of items which have a shared…
ERIC Educational Resources Information Center
Öztürk-Gübes, Nese; Kelecioglu, Hülya
2016-01-01
The purpose of this study was to examine the impact of dimensionality, common-item set format, and different scale linking methods on preserving equity property with mixed-format test equating. Item response theory (IRT) true-score equating (TSE) and IRT observed-score equating (OSE) methods were used under common-item nonequivalent groups design.…
ERIC Educational Resources Information Center
He, Yong
2013-01-01
Common test items play an important role in equating multiple test forms under the common-item nonequivalent groups design. Inconsistent item parameter estimates among common items can lead to large bias in equated scores for IRT true score equating. Current methods extensively focus on detection and elimination of outlying common items, which…
Using a Linear Regression Method to Detect Outliers in IRT Common Item Equating
ERIC Educational Resources Information Center
He, Yong; Cui, Zhongmin; Fang, Yu; Chen, Hanwei
2013-01-01
Common test items play an important role in equating alternate test forms under the common item nonequivalent groups design. When the item response theory (IRT) method is applied in equating, inconsistent item parameter estimates among common items can lead to large bias in equated scores. It is prudent to evaluate inconsistency in parameter…
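The regression approach named in the title can be sketched generically: fit a line relating the common items' old-form and new-form difficulty estimates, then flag items with large standardized residuals. The snippet below is a hedged illustration (ordinary least squares with a hypothetical cutoff of 2.0 and made-up difficulty values), not the authors' exact procedure.

```python
import statistics

def flag_outlier_items(b_old, b_new, z_cut=2.0):
    """Flag common items whose new-form difficulty estimate drifts from the
    line predicted by the old-form estimates (|standardized residual| > z_cut)."""
    mx, my = statistics.fmean(b_old), statistics.fmean(b_new)
    sxx = sum((x - mx) ** 2 for x in b_old)
    sxy = sum((x - mx) * (y - my) for x, y in zip(b_old, b_new))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid = [y - (intercept + slope * x) for x, y in zip(b_old, b_new)]
    scale = statistics.stdev(resid)          # residual scatter
    return [i for i, r in enumerate(resid) if abs(r) / scale > z_cut]

# Ten hypothetical common items whose difficulties shifted uniformly by
# 0.3 logits, except the last, which drifted an extra 2.0 logits.
b_old = [-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
b_new = [x + 0.3 for x in b_old]
b_new[9] += 2.0
flagged = flag_outlier_items(b_old, b_new)
```

In practice the cutoff, and whether to use studentized residuals that account for leverage, are tuning decisions a study like this one evaluates.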
Development of wheelchair caster testing equipment and preliminary testing of caster models
Mhatre, Anand; Ott, Joseph
2017-01-01
Background Because of the adverse environmental conditions present in less-resourced environments (LREs), the World Health Organization (WHO) has recommended that specialised wheelchair test methods may need to be developed to support product quality standards in these environments. A group of experts identified caster test methods as a high priority because of their common failure in LREs, and the insufficiency of existing test methods described in the International Organization for Standardization (ISO) Wheelchair Testing Standards (ISO 7176). Objectives To develop and demonstrate the feasibility of a caster system test method. Method Background literature and expert opinions were collected to identify existing caster test methods, caster failures common in LREs and environmental conditions present in LREs. Several conceptual designs for the caster testing method were developed, and through an iterative process using expert feedback, a final concept and a design were developed and a prototype was fabricated. Feasibility tests were conducted by testing a series of caster systems from wheelchairs used in LREs, and failure modes were recorded and compared to anecdotal reports about field failures. Results The new caster testing system was developed and it provides the flexibility to expose caster systems to typical conditions in LREs. Caster failures such as stem bolt fractures, fork fractures, bearing failures and tire cracking occurred during testing trials and are consistent with field failures. Conclusion The new caster test system has the capability to incorporate necessary test factors that degrade caster quality in LREs. Future work includes developing and validating a testing protocol that results in failure modes common during wheelchair use in LREs. PMID:29062762
NASA Technical Reports Server (NTRS)
Szatkowski, George N.; Dudley, Kenneth L.; Koppen, Sandra V.; Ely, Jay J.; Nguyen, Truong X.; Ticatch, Larry A.; Mielnik, John J.; Mcneill, Patrick A.
2013-01-01
To support FAA certification airworthiness standards, composite substrates are subjected to lightning direct-effect electrical waveforms to determine performance characteristics of the lightning strike protection (LSP) conductive layers used to protect composite substrates. Test results collected from independent LSP studies are often incomparable due to variability in test procedures & applied practices at different organizations, which impairs performance correlations between different LSP data sets. Under a NASA supported contract, The Boeing Company developed technical procedures and documentation as guidance in order to facilitate a test method for conducting universal common practice lightning strike protection test procedures. The procedures obtain conformity in future lightning strike protection evaluations to allow meaningful performance correlations across data sets. This universal common practice guidance provides the manufacturing specifications to fabricate carbon fiber reinforced plastic (CFRP) test panels, including finish, grounding configuration, and acceptable methods for pretest nondestructive inspection (NDI) and posttest destructive inspection. The test operations guidance elaborates on the provisions contained in SAE ARP5416 to address inconsistencies in the generation of damage protection performance data, so as to provide for maximum achievable correlation across capable lab facilities. In addition, the guidance details a direct effects test bed design to aid in quantification of the multi-physical phenomena surrounding a lightning direct attachment supporting validation data requirements for the development of predictive computational modeling. The lightning test bed is designed to accommodate a repeatable installation procedure to secure the test panel and eliminate test installation uncertainty. 
It also facilitates a means to capture the electrical waveform parameters in two dimensions, along with the mechanical displacement and thermal heating parameters which occur during lightning attachment. Following guidance defined in the universal common practice LSP test documents, protected and unprotected CFRP panels were evaluated at 20, 40, and 100 kA. This report presents analyzed data demonstrating the scientific usefulness of the common practice approach. Descriptions of the common practice CFRP test articles, LSP test bed fixture, and monitoring techniques to capture the electrical, mechanical and thermal parameters during lightning attachment are presented here. Two methods of measuring the electrical currents were evaluated, inductive current probes and a newly developed fiber-optic sensor. Two mechanical displacement methods were also examined, optical laser measurement sensors and a digital imaging correlation camera system. Recommendations are provided to help users implement the common practice test approach and obtain LSP test characterizations comparable across data sets.
Alternative Test Methods for Electronic Parts
NASA Technical Reports Server (NTRS)
Plante, Jeannette
2004-01-01
It is common practice within NASA to test electronic parts at the manufacturing lot level to demonstrate, statistically, that parts from the lot tested will not fail in service using generic application conditions. The test methods and the generic application conditions used have been developed over the years through cooperation between NASA, DoD, and industry in order to establish a common set of standard practices. These common practices, found in MIL-STD-883, MIL-STD-750, military part specifications, EEE-INST-002, and other guidelines are preferred because they are considered to be effective and repeatable and their results are usually straightforward to interpret. These practices can sometimes be unavailable to some NASA projects due to special application conditions that must be addressed, such as schedule constraints, cost constraints, logistical constraints, or advances in the technology that make the historical standards an inappropriate choice for establishing part performance and reliability. Alternate methods have begun to emerge and to be used by NASA programs to test parts individually or as part of a system, especially when standard lot tests cannot be applied. Four alternate screening methods will be discussed in this paper: Highly accelerated life test (HALT), forward voltage drop tests for evaluating wire-bond integrity, burn-in options during or after highly accelerated stress test (HAST), and board-level qualification.
The Effect of Error in Item Parameter Estimates on the Test Response Function Method of Linking.
ERIC Educational Resources Information Center
Kaskowitz, Gary S.; De Ayala, R. J.
2001-01-01
Studied the effect of item parameter estimation for computation of linking coefficients for the test response function (TRF) linking/equating method. Simulation results showed that linking was more accurate when there was less error in the parameter estimates, and that 15 or 25 common items provided better results than 5 common items under both…
DOE Office of Scientific and Technical Information (OSTI.GOV)
The most common method of measuring air leakage is to perform a single (or solo) blower door pressurization and/or depressurization test. In detached housing, the single blower door test measures leakage to the outside. In attached housing, however, this "solo" test method measures both air leakage to the outside and air leakage between adjacent units through common surfaces. Although minimizing leakage to neighboring units is highly recommended to avoid indoor air quality issues between units, reduce pressure differentials between units, and control stack effect, the energy benefits of air sealing can be significantly overpredicted if the solo air leakage number is used in the energy analysis. Guarded blower door testing is more appropriate for isolating and measuring leakage to the outside in attached housing. This method uses multiple blower doors to depressurize adjacent spaces to the same level as the unit being tested. Maintaining a neutral pressure across common walls, ceilings, and floors acts as a "guard" against air leakage between units. The resulting measured air leakage in the test unit is only air leakage to the outside. Although preferred for assessing energy impacts, the challenges of performing guarded testing can be daunting.
Poisson Approximation-Based Score Test for Detecting Association of Rare Variants.
Fang, Hongyan; Zhang, Hong; Yang, Yaning
2016-07-01
Genome-wide association study (GWAS) has achieved great success in identifying genetic variants, but the nature of GWAS imposes inherent limitations. Under the common disease rare variants (CDRV) hypothesis, the traditional association analysis methods commonly used in GWAS for common variants do not have enough power for detecting rare variants with a limited sample size. As a solution to this problem, pooling rare variants by their functions provides an efficient way for identifying susceptibility genes. Rare variants typically have low frequencies of minor alleles, and the distribution of the total number of minor alleles of the rare variants can be approximated by a Poisson distribution. Based on this fact, we propose a new test method, the Poisson Approximation-based Score Test (PAST), for association analysis of rare variants. Two testing methods, namely, ePAST and mPAST, are proposed based on different strategies of pooling rare variants. Simulation results and application to the CRESCENDO cohort data show that our methods are more powerful than the existing methods. © 2016 John Wiley & Sons Ltd/University College London.
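The Poisson approximation above can be illustrated with a minimal two-group score test on pooled minor-allele counts. This is a generic sketch of the idea, not the ePAST or mPAST statistics themselves, and the counts below are hypothetical.

```python
import math

def poisson_rate_score_test(x1, n1, x2, n2):
    """Score test of H0: equal minor-allele rates, treating the pooled
    rare-variant counts x1, x2 (from n1, n2 chromosomes) as Poisson totals."""
    lam = (x1 + x2) / (n1 + n2)                     # common rate under H0
    z = (x1 - n1 * lam) / math.sqrt(lam * n1 * n2 / (n1 + n2))
    p = math.erfc(abs(z) / math.sqrt(2))            # two-sided normal p-value
    return z, p

# Hypothetical data: 30 pooled minor alleles in 1000 case chromosomes
# versus 10 in 1000 control chromosomes.
z, p = poisson_rate_score_test(30, 1000, 10, 1000)
```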
Summary of nondestructive testing theory and practice
NASA Technical Reports Server (NTRS)
Meister, R. P.; Randall, M. D.; Mitchell, D. K.; Williams, L. P.; Pattee, H. E.
1972-01-01
The ability to fabricate design critical and man-rated aerospace structures using materials near the limits of their capabilities requires a comprehensive and dependable assurance program. The quality assurance program must rely heavily on nondestructive testing methods for thorough inspection to assess properties and quality of hardware items. A survey of nondestructive testing methods is presented to provide space program managers, supervisors and engineers who are unfamiliar with this technical area with appropriate insight into the commonly accepted nondestructive testing methods available, their interrelationships, uses, advantages, and limitations. Primary emphasis is placed on the most common methods: liquid penetrant, magnetic particle, radiography, ultrasonics and eddy current. A number of the newer test techniques including thermal, acoustic emission, holography, microwaves, eddy-sonic and exo-electron emission, which are beginning to be used in applications of interest to NASA, are also discussed briefly.
A blind search for a common signal in gravitational wave detectors
NASA Astrophysics Data System (ADS)
Liu, Hao; Creswell, James; von Hausegger, Sebastian; Jackson, Andrew D.; Naselsky, Pavel
2018-02-01
We propose a blind, template-free method for the extraction of a common signal between the Hanford and Livingston detectors and apply it especially to the GW150914 event. We construct a log-likelihood method that maximizes the cross-correlation between each detector and the common signal and minimizes the cross-correlation between the residuals. The reliability of this method is tested using simulations with an injected common signal. Finally, our method is used to assess the quality of theoretical gravitational wave templates for GW150914.
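The core ingredient of such template-free methods is the cross-correlation between the two detectors' time series. As a toy illustration (not the authors' log-likelihood construction), the sketch below recovers the relative delay of a common pulse between two idealized, noise-free series:

```python
def best_lag(a, b, max_lag):
    """Lag (in samples) by which signal b is delayed relative to a,
    found by maximizing the discrete cross-correlation."""
    def xcorr(lag):
        # Correlate a[i] with b[i + lag]; clip the overlap at the ends.
        if lag >= 0:
            pairs = zip(a, b[lag:])
        else:
            pairs = zip(a[-lag:], b)
        return sum(x * y for x, y in pairs)
    return max(range(-max_lag, max_lag + 1), key=xcorr)

# A common pulse appearing 2 samples later in the second "detector".
pulse = [0, 0, 0, 1, 2, 3, 2, 1, 0, 0, 0, 0]
delayed = [0, 0] + pulse[:-2]
```

With real strain data the series are noisy and the correlation is computed over whitened residuals, but the alignment step is the same.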
Tests of Measurement Invariance without Subgroups: A Generalization of Classical Methods
ERIC Educational Resources Information Center
Merkle, Edgar C.; Zeileis, Achim
2013-01-01
The issue of measurement invariance commonly arises in factor-analytic contexts, with methods for assessment including likelihood ratio tests, Lagrange multiplier tests, and Wald tests. These tests all require advance definition of the number of groups, group membership, and offending model parameters. In this paper, we study tests of measurement…
Developing a laboratory protocol for asphalt binder recovery.
DOT National Transportation Integrated Search
2014-10-01
Asphalt binder extraction and recovery are common laboratory procedures used to provide material for research and quality assurance testing. The most common methods of recovery performed today include the Abson method and the rotary evaporator (o...
Hannan, Mary; Steffen, Alana; Quinn, Lauretta; Collins, Eileen G; Phillips, Shane A; Bronas, Ulf G
2018-05-25
Chronic kidney disease (CKD) is a common chronic condition in older adults that is associated with cognitive decline. However, the exact prevalence of cognitive impairment in older adults with CKD is unclear, likely due to the variety of methods utilized to assess cognitive function. The purpose of this integrative review is to determine how cognitive function is most frequently assessed in older adult patients with CKD. Five electronic databases were searched to explore relevant literature related to cognitive function assessment in older adult patients with CKD. Inclusion and exclusion criteria were created to focus the search to the assessment of cognitive function with standardized cognitive tests in older adults with CKD, not on renal replacement therapy. Through the search methods, 36 articles were found that fulfilled the purpose of the review. There were 36 different types of cognitive tests utilized in the included articles, with each study utilizing between one and 12 tests. The most commonly utilized cognitive test was the Mini Mental State Exam (MMSE), followed by tests of digit symbol substitution and verbal fluency. The most commonly assessed aspect of cognitive function was global cognition. The assessment of cognitive function in older adults with CKD with standardized tests is completed in various ways. Unfortunately, the common methods of assessment of cognitive function may not be fully examining the domains of impairment commonly found in older adults with CKD. Further research is needed to identify the ideal cognitive test to best assess older adults with CKD for cognitive impairment.
The Effect of Schooling and Ability on Achievement Test Scores. NBER Working Paper Series.
ERIC Educational Resources Information Center
Hansen, Karsten; Heckman, James J.; Mullen, Kathleen J.
This study developed two methods for estimating the effect of schooling on achievement test scores that control for the endogeneity of schooling by postulating that both schooling and test scores are generated by a common unobserved latent ability. The methods were applied to data on schooling and test scores. Estimates from the two methods are in…
NASA Astrophysics Data System (ADS)
Macias, F. J.; Dahl, F.; Bruland, A.
2016-05-01
The tunnel boring machine (TBM) method has become widely used and is currently an important presence within the tunnelling industry. Large investments and high geological risk are involved in using TBMs, and disc cutter consumption has a great influence on performance and cost, especially in hard rock conditions. Furthermore, reliable cutter life assessments facilitate the control of risk as well as avoiding delays and budget overruns. Since abrasive wear is the most common process affecting cutter consumption, good laboratory tests for rock abrasivity assessments are needed. A new abrasivity test method by rolling disc named Rolling Indentation Abrasion Test (RIAT) has been developed. The goal of the new test design and procedure is to reproduce wear behaviour on hard rock tunnel boring in a more realistic way than the traditionally used methods. Wear by rolling contact on intact rock samples is introduced and several rock types, covering a wide rock abrasiveness range, have been tested by RIAT. The RIAT procedure shows a great ability to assess abrasive wear on rolling discs. In addition, to evaluate the newly developed RIAT test method, a comprehensive laboratory testing programme, including the most commonly used abrasivity test methods and mineral composition analysis, was carried out. Relationships between the achieved results from conventional testing and RIAT results have been analysed.
Testing variance components by two jackknife methods
USDA-ARS?s Scientific Manuscript database
The jackknife method, a resampling technique, has been widely used for statistical tests for years. The pseudo-value based jackknife method (defined as the pseudo jackknife method) is commonly used to reduce the bias for an estimate; however, sometimes it could result in large variation for an estimate a...
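The pseudo-value jackknife described above can be written as a generic wrapper around any estimator. The sketch below is illustrative (using the sample mean as the estimator); it returns the bias-corrected estimate and the jackknife standard error that such tests are built on.

```python
import statistics

def jackknife(estimator, data):
    """Pseudo-value jackknife: bias-corrected estimate and standard error."""
    n = len(data)
    theta = estimator(data)
    # Pseudo value i = n*theta_all - (n-1)*theta_without_i
    pseudo = [n * theta - (n - 1) * estimator(data[:i] + data[i + 1:])
              for i in range(n)]
    est = statistics.fmean(pseudo)                 # bias-corrected estimate
    se = statistics.stdev(pseudo) / n ** 0.5       # jackknife standard error
    return est, se

est, se = jackknife(statistics.fmean, [1.0, 2.0, 3.0, 4.0, 5.0])
```

For a variance component, `estimator` would instead compute the component of interest from the data, and the pseudo values would feed a t-type test.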
Resilient moduli of typical Missouri soils and unbound granular base materials
DOT National Transportation Integrated Search
2008-03-01
The objective of this project is to accurately determine the resilient moduli for common Missouri subgrade soils and unbound granular base materials in accordance with the AASHTO T 307 test method. The test results included moduli data from 27 common...
Hands beat machines for collecting native seed
Mary Ann Davies; Scott Jensen
2008-01-01
A hedge trimmer (Garden Groom Pro) and a hand-held vacuum (Euro-Pro Shark) were tested to determine whether they might be more effective for collecting the seed of native plants than common hand methods. The common hand methods worked best.
Duct Leakage Repeatability Testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Iain; Sherman, Max
2014-08-01
The purpose of this report is to evaluate the repeatability of the three most significant measurement techniques for duct leakage using data from the literature and recently obtained field data. We will also briefly discuss the first two factors. The main question to be answered by this study is whether differences in the repeatability of these test methods are sufficient to indicate that any of these methods is so poor that it should be excluded from consideration as an allowed procedure in codes and standards. The three duct leak measurement methods assessed in this report are the two duct pressurization methods that are commonly used by many practitioners and the DeltaQ technique. These are methods B, C and A, respectively, of the ASTM E1554 standard. Although it would be useful to evaluate other duct leak test methods, this study focused on those test methods that are commonly used and are required in various test standards, such as BPI (2010), RESNET (2014), ASHRAE 62.2 (2013), California Title 24 (CEC 2012), DOE Weatherization and many other energy efficiency programs.
Mello, Enrica; Posteraro, Brunella; Vella, Antonietta; De Carolis, Elena; Torelli, Riccardo; D'Inzeo, Tiziana; Verweij, Paul E; Sanguinetti, Maurizio
2017-06-01
We tested 59 common and 27 uncommon Aspergillus species isolates for susceptibility to the mold-active azole antifungal agents itraconazole, voriconazole, and posaconazole using the Sensititre method. The overall essential agreement with the CLSI reference method was 96.5% for itraconazole and posaconazole and was 100% for voriconazole. By the Sensititre method as well as the CLSI reference method, all of 10 A. fumigatus isolates with a cyp51 mutant genotype were classified as being non-wild-type isolates (MIC > epidemiological cutoff value [ECV]) with respect to triazole susceptibility. Copyright © 2017 American Society for Microbiology.
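Essential agreement is commonly computed as the percentage of isolates whose MIC by the test method falls within one two-fold dilution of the reference MIC. The sketch below assumes that common definition rather than restating the study's exact criterion, and the MIC values are hypothetical.

```python
import math

def essential_agreement(test_mics, ref_mics, dilutions=1):
    """Percent of isolates whose test-method MIC is within +/- `dilutions`
    two-fold dilution steps of the reference-method MIC."""
    within = sum(1 for t, r in zip(test_mics, ref_mics)
                 if abs(math.log2(t) - math.log2(r)) <= dilutions + 1e-9)
    return 100.0 * within / len(test_mics)

# Hypothetical MICs (mg/L): three isolates within one dilution, one not.
ea = essential_agreement([0.5, 1.0, 2.0, 8.0], [0.5, 0.5, 1.0, 1.0])
```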
Diagnostic methods to assess inspiratory and expiratory muscle strength*
Caruso, Pedro; de Albuquerque, André Luis Pereira; Santana, Pauliane Vieira; Cardenas, Leticia Zumpano; Ferreira, Jeferson George; Prina, Elena; Trevizan, Patrícia Fernandes; Pereira, Mayra Caleffi; Iamonti, Vinicius; Pletsch, Renata; Macchione, Marcelo Ceneviva; Carvalho, Carlos Roberto Ribeiro
2015-01-01
Impairment of (inspiratory and expiratory) respiratory muscles is a common clinical finding, not only in patients with neuromuscular disease but also in patients with primary disease of the lung parenchyma or airways. Although such impairment is common, its recognition is usually delayed because its signs and symptoms are nonspecific and late. This delayed recognition, or even the lack thereof, occurs because the diagnostic tests used in the assessment of respiratory muscle strength are not widely known and available. There are various methods of assessing respiratory muscle strength during the inspiratory and expiratory phases. These methods are divided into two categories: volitional tests (which require patient understanding and cooperation) and non-volitional tests. Volitional tests, such as those that measure maximal inspiratory and expiratory pressures, are the most commonly used because they are readily available. Non-volitional tests depend on magnetic stimulation of the phrenic nerve accompanied by the measurement of inspiratory mouth pressure, inspiratory esophageal pressure, or inspiratory transdiaphragmatic pressure. Another method that has come to be widely used is ultrasound imaging of the diaphragm. We believe that pulmonologists involved in the care of patients with respiratory diseases should be familiar with the tests used in order to assess respiratory muscle function. Therefore, the aim of the present article is to describe the advantages, disadvantages, procedures, and clinical applicability of the main tests used in the assessment of respiratory muscle strength. PMID:25972965
A Comparison of Methods of Vertical Equating.
ERIC Educational Resources Information Center
Loyd, Brenda H.; Hoover, H. D.
Rasch model vertical equating procedures were applied to three mathematics computation tests for grades six, seven, and eight. Each level of the test was composed of 45 items in three sets of 15 items, arranged in such a way that tests for adjacent grades had two sets (30 items) in common, and the sixth and eighth grades had 15 items in common. In…
Using Caspar Creek flow records to test peak flow estimation methods applicable to crossing design
Peter H. Cafferata; Leslie M. Reid
2017-01-01
Long-term flow records from sub-watersheds in the Caspar Creek Experimental Watersheds were used to test the accuracy of four methods commonly used to estimate peak flows in small forested watersheds: the Rational Method, the updated USGS Magnitude and Frequency Method, flow transference methods, and the NRCS curve number method. Comparison of measured and calculated...
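Of the four estimation methods compared, the Rational Method is the simplest to state: Q = CiA. The sketch below uses hypothetical values for a small forested catchment; the runoff coefficient C in particular is a judgment input, which is part of what such comparisons against measured flows test.

```python
def rational_peak_flow(C, i, A):
    """Rational Method: peak flow Q (cfs) from runoff coefficient C,
    rainfall intensity i (in/hr), and drainage area A (acres).
    The 1.008 acre-in/hr-to-cfs conversion is conventionally taken as 1."""
    return C * i * A

# Hypothetical inputs: C = 0.2 (forested), i = 2.0 in/hr, A = 10 acres.
q = rational_peak_flow(0.2, 2.0, 10.0)
```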
Development of a Rolling Dynamic Deflectometer for Continuous Deflection Testing of Pavements
DOT National Transportation Integrated Search
1998-05-01
A rolling dynamic deflectometer (RDD) was developed as a nondestructive method for determining continuous deflection profiles of pavements. Unlike other commonly used pavement testing methods, the RDD performs continuous rather than discrete measurem...
Hoyer, Annika; Kuss, Oliver
2018-05-01
Meta-analysis of diagnostic studies is still a rapidly developing area of biostatistical research. In particular, there is increasing interest in methods to compare different diagnostic tests to a common gold standard. Restricting to the case of two diagnostic tests, in these meta-analyses the parameters of interest are the differences in sensitivities and specificities (with their corresponding confidence intervals) between the two diagnostic tests, while accounting for the various associations across single studies and between the two tests. We propose statistical models with a quadrivariate response (where sensitivity of test 1, specificity of test 1, sensitivity of test 2, and specificity of test 2 are the four responses) as a sensible approach to this task. Using a quadrivariate generalized linear mixed model naturally generalizes the common standard bivariate model of meta-analysis for a single diagnostic test. If information on several thresholds of the tests is available, the quadrivariate model can be further generalized to yield a comparison of full receiver operating characteristic (ROC) curves. We illustrate our model by an example where two screening methods for the diagnosis of type 2 diabetes are compared.
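As a simplified single-study point of comparison (ignoring the within-study pairing and between-study heterogeneity that the quadrivariate mixed model is designed to capture), a Wald interval for the difference in sensitivities looks like the sketch below; the counts are hypothetical.

```python
import math

def sens_diff_ci(tp1, fn1, tp2, fn2, z=1.96):
    """Naive Wald CI for the difference in sensitivities of two tests
    evaluated on diseased subjects of one study. Treats the two tests as
    independent, which the quadrivariate GLMM deliberately does not."""
    se1 = tp1 / (tp1 + fn1)
    se2 = tp2 / (tp2 + fn2)
    d = se1 - se2
    var = se1 * (1 - se1) / (tp1 + fn1) + se2 * (1 - se2) / (tp2 + fn2)
    half = z * math.sqrt(var)
    return d, (d - half, d + half)

# Hypothetical study: test 1 finds 80/100 diseased, test 2 finds 70/100.
d, ci = sens_diff_ci(80, 20, 70, 30)
```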
Quality of reporting statistics in two Indian pharmacology journals
Jaykaran; Yadav, Preeti
2011-01-01
Objective: To evaluate the reporting of the statistical methods in articles published in two Indian pharmacology journals. Materials and Methods: All original articles published since 2002 were downloaded from the journals’ (Indian Journal of Pharmacology (IJP) and Indian Journal of Physiology and Pharmacology (IJPP)) website. These articles were evaluated on the basis of appropriateness of descriptive statistics and inferential statistics. Descriptive statistics was evaluated on the basis of reporting of method of description and central tendencies. Inferential statistics was evaluated on the basis of fulfilling of assumption of statistical methods and appropriateness of statistical tests. Values are described as frequencies, percentage, and 95% confidence interval (CI) around the percentages. Results: Inappropriate descriptive statistics was observed in 150 (78.1%, 95% CI 71.7–83.3%) articles. Most common reason for this inappropriate descriptive statistics was use of mean ± SEM at the place of “mean (SD)” or “mean ± SD.” Most common statistical method used was one-way ANOVA (58.4%). Information regarding checking of assumption of statistical test was mentioned in only two articles. Inappropriate statistical test was observed in 61 (31.7%, 95% CI 25.6–38.6%) articles. Most common reason for inappropriate statistical test was the use of two group test for three or more groups. Conclusion: Articles published in two Indian pharmacology journals are not devoid of statistical errors. PMID:21772766
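The most common error reported above, presenting mean ± SEM where the spread of the data (mean ± SD) is intended, is easy to demonstrate with hypothetical numbers:

```python
import math
import statistics

values = [4.8, 5.1, 5.6, 4.9, 5.3, 5.2, 4.7, 5.4]   # hypothetical measurements
sd = statistics.stdev(values)           # describes the spread of the data
sem = sd / math.sqrt(len(values))       # describes the precision of the mean
# SEM is smaller than SD by a factor of sqrt(n) and shrinks as n grows,
# so reporting "mean +/- SEM" understates variability when the spread
# of the observations is what the reader needs.
```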
Kidney function tests are common lab tests used to evaluate how well the kidneys are working. Such tests include: ... Oh MS, Briefel G. Evaluation of renal function, water, electrolytes ... and Management by Laboratory Methods. 23rd ed. Philadelphia, ...
Lemna minor (Duckweed) is commonly used in aquatic toxicity investigations. Methods for culturing and testing with reference toxicants, such as atrazine, are somewhat variable among researchers. Our goal was to develop standardized methods of culturing and testing for use with L....
Evaluation of test procedures for hydrogen environment embrittlement
NASA Technical Reports Server (NTRS)
Nelson, H. G.
1974-01-01
Report presents discussion of three common and primary influences on embrittlement process. Application of theoretical considerations to design of test coupons and methods is illustrated for both internal and external hydrogen embrittlement. Acceptable designs and methods are indicated.
1992-07-01
be used effectively in new construction or retrofit applications. These systems usually contain: 1. Molded expanded polystyrene insulation board (MEPS...commonly referred to as "bead board," or extruded expanded polystyrene insulation board (XEPS), commonly referred to as "blue board." 2. An...Walls ( Expanded Polystyrene Insulation Faced with a Thin Rendering), M.O.A.T. n 22, June 1988. 7 ASTM D3029-90. "Standard Test Methods for Impact
ERIC Educational Resources Information Center
Stuive, Ilse; Kiers, Henk A. L.; Timmerman, Marieke E.
2009-01-01
A common question in test evaluation is whether an a priori assignment of items to subtests is supported by empirical data. If the analysis results indicate the assignment of items to subtests under study is not supported by data, the assignment is often adjusted. In this study the authors compare two methods on the quality of their suggestions to…
A test of inflated zeros for Poisson regression models.
He, Hua; Zhang, Hui; Ye, Peng; Tang, Wan
2017-01-01
Excessive zeros are common in practice and may cause overdispersion and invalidate inference when fitting Poisson regression models. There is a large body of literature on zero-inflated Poisson models. However, methods for testing whether there are excessive zeros are less well developed. The Vuong test comparing a Poisson and a zero-inflated Poisson model is commonly applied in practice. However, the type I error of the test often deviates seriously from the nominal level, casting serious doubt on the validity of the test in such applications. In this paper, we develop a new approach for testing inflated zeros under the Poisson model. Unlike the Vuong test for inflated zeros, our method does not require fitting a zero-inflated Poisson model to perform the test. Simulation studies show that, compared with the Vuong test, our approach is not only better at controlling the type I error rate but also yields more power.
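The intuition behind a test for inflated zeros can be sketched without fitting a zero-inflated model: under a Poisson fit, the expected proportion of zeros is exp(-λ̂), so observed zeros far in excess of n·exp(-λ̂) suggest inflation. The sketch below is only this informal diagnostic, not the test developed in the paper:

```python
import math
import random

def zero_excess(counts):
    """Observed zeros vs. zeros expected under a Poisson fit, where
    lambda is estimated by the sample mean."""
    n = len(counts)
    lam = sum(counts) / n
    expected = n * math.exp(-lam)
    observed = sum(1 for c in counts if c == 0)
    return observed, expected

def poisson(lam, rng):
    """Poisson draw via Knuth's multiplication method (fine for small lambda)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(1)
# Zero-inflated sample: 30% structural zeros, otherwise Poisson(2)
sample = [0 if rng.random() < 0.3 else poisson(2.0, rng) for _ in range(1000)]
observed, expected = zero_excess(sample)
print(observed, round(expected, 1))  # observed zeros far exceed the Poisson prediction
```

A formal procedure would attach a reference distribution to this discrepancy; the paper's score-type approach does exactly that without ever fitting the zero-inflated alternative.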
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butler, J.J. Jr.; Hyder, Z.
The Nguyen and Pinder method is one of four techniques commonly used for analysis of response data from slug tests. Limited field research has raised questions about the reliability of the parameter estimates obtained with this method. A theoretical evaluation of this technique reveals that errors were made in the derivation of the analytical solution upon which the technique is based. Simulation and field examples show that the errors result in parameter estimates that can differ from actual values by orders of magnitude. These findings indicate that the Nguyen and Pinder method should no longer be a tool in the repertoire of the field hydrogeologist. If data from a slug test performed in a partially penetrating well in a confined aquifer need to be analyzed, recent work has shown that the Hvorslev method is the best alternative among the commonly used techniques.
Examinations of electron temperature calculation methods in Thomson scattering diagnostics.
Oh, Seungtae; Lee, Jong Ha; Wi, Hanmin
2012-10-01
Electron temperature from the Thomson scattering diagnostic is derived through an indirect calculation based on a theoretical model. The chi-square (χ²) test is commonly used in this calculation, and the reliability of the calculation method depends strongly on the noise level of the input signals. In the simulations, noise effects on the χ² test are examined, and a scale-factor test is proposed as an alternative method.
Antimicrobial activity of fresh garlic juice: An in vitro study
Yadav, Seema; Trivedi, Niyati A.; Bhatt, Jagat D.
2015-01-01
Introduction: Antimicrobial resistance has been a global concern. Currently, interest has focused on exploring the antimicrobial properties of plants and herbs. One such botanical is Allium sativum (garlic). Aim: To evaluate the antimicrobial activity of fresh juice of garlic. Materials and Methods: Varying concentrations of fresh garlic juice (FGJ) were tested for their antimicrobial activity against common pathogenic organisms isolated at SSG Hospital, Vadodara, using the well diffusion method. Moreover, the minimum inhibitory concentration (MIC) and minimum lethal concentration (MLC) of FGJ were determined using the broth dilution method. The sensitivity pattern of conventional antimicrobials against common pathogenic bacteria was tested using the disc diffusion method. Results: FGJ produced a dose-dependent increase in the zone of inhibition at concentrations of 10% and higher. The MIC of FGJ against the pathogens ranged from 4% to 16% v/v, whereas MLC values ranged from 4% to 32% v/v, with Escherichia coli and Staphylococcus aureus showing the highest sensitivity. Conclusion: FGJ has definite antimicrobial activity against common pathogenic organisms isolated at SSG Hospital, Vadodara. Further studies are needed to establish the efficacy, safety, and kinetic data of its active ingredients. PMID:27011724
Instructional Practices: A Qualitative Study on the Response to Common Core Standardized Testing
ERIC Educational Resources Information Center
Hightower, Gabrielle
2017-01-01
The purpose of this qualitative study was to examine the instructional practices implemented by Tennessee elementary teachers in response to Common Core Standardized Testing. This research study utilized a basic qualitative method that included a purposive and convenient sampling. This qualitative study focused on face-to-face interviews, phone…
A Comparison of Linking and Concurrent Calibration under the Graded Response Model.
ERIC Educational Resources Information Center
Kim, Seock-Ho; Cohen, Allan S.
Applications of item response theory to practical testing problems including equating, differential item functioning, and computerized adaptive testing, require that item parameter estimates be placed onto a common metric. In this study, two methods for developing a common metric for the graded response model under item response theory were…
Evaluation of Low-Tech Indoor Remediation Methods ...
This study identified, collected, evaluated, and summarized available articles, reports, guidance documents, and other pertinent information related to common housekeeping activities within the United States. This resulted in a summary compendium of relevant information about multiple low-tech cleaning methods drawn from the literature search results. Through discussion and prioritization, an EPA project team, made up of several EPA scientists and emergency responders, focused the information into a list of 14 housekeeping activities for decontamination evaluation testing. These types of activities are collectively referred to as “low-tech” remediation methods because of the comparatively simple tools, equipment, and operations involved. Similarly, eight common household surfaces were chosen and contaminated under three different contamination conditions. Thirty-three combinations of methods and surfaces were chosen for testing under the three contamination conditions, for a total of 99 tests.
Working with Sparse Data in Rated Language Tests: Generalizability Theory Applications
ERIC Educational Resources Information Center
Lin, Chih-Kai
2017-01-01
Sparse-rated data are common in operational performance-based language tests, as an inevitable result of assigning examinee responses to a fraction of available raters. The current study investigates the precision of two generalizability-theory methods (i.e., the rating method and the subdividing method) specifically designed to accommodate the…
Use of the Digital Surface Roughness Meter in Virginia.
DOT National Transportation Integrated Search
2006-01-01
Pavement surface texture is measured in a variety of ways in Virginia. Two methods commonly used are ASTM E 965, Standard Test Method for Measuring Pavement Macrotexture Depth Using a Volumetric Technique, known as the "sand patch" test, and ASTM E 2...
Liu, Xikai; Ma, Dong; Chen, Liang; Liu, Xiangdong
2018-02-08
Tuning the stiffness balance is crucial to full-band common-mode rejection in a superconducting gravity gradiometer (SGG). A reliable method for doing so has been proposed and experimentally tested. In the tuning scheme, the frequency response functions of the displacement of each test mass under common-mode accelerations were measured, and a characteristic frequency was determined for each test mass. A reduced difference between the characteristic frequencies of the two test masses was used as the criterion for effective tuning. Since measurement of the characteristic frequencies does not depend on the scale factors of the displacement detection, stiffness tuning can be done independently. We tested this new method on a single-component SGG and obtained a two-order-of-magnitude reduction in stiffness mismatch.
Developments in Screening Tests and Strategies for Colorectal Cancer
Sovich, Justin L.; Sartor, Zachary
2015-01-01
Background. Worldwide, colorectal cancer (CRC) is the third most common cancer in men and second most common in women. It is the fourth most common cause of cancer mortality. In the United States, CRC is the third most common cause of cancer and second most common cause of cancer mortality. Incidence and mortality rates have steadily fallen, primarily due to widespread screening. Methods. We conducted keyword searches on PubMed in four categories of CRC screening: stool, endoscopic, radiologic, and serum, as well as news searches in Medscape and Google News. Results. Colonoscopy is the gold standard for CRC screening and the most common method in the United States. Technological improvements continue to be made, including the promising “third-eye retroscope.” Fecal occult blood remains widely used, particularly outside the United States. The first at-home screen, a fecal DNA screen, has also recently been approved. Radiological methods are effective but seldom used due to cost and other factors. Serum tests are largely experimental, although at least one is moving closer to market. Conclusions. Colonoscopy is likely to remain the most popular screening modality for the immediate future, although its shortcomings will continue to spur innovation in a variety of modalities. PMID:26504799
ERIC Educational Resources Information Center
Hazari, Zahra; Potvin, Geoff; Lock, Robynne M.; Lung, Florin; Sonnert, Gerhard; Sadler, Philip M.
2013-01-01
There are many hypotheses regarding factors that may encourage female students to pursue careers in the physical sciences. Using multivariate matching methods on national data drawn from the Persistence Research in Science and Engineering (PRiSE) project ("n" = 7505), we test the following five commonly held beliefs regarding what…
Nondestructive Evaluation Methods for the Ares I Common Bulkhead
NASA Technical Reports Server (NTRS)
Walker, James
2010-01-01
A large scale bonding demonstration test article was fabricated to prove out manufacturing techniques for the current design of the NASA Ares I Upper Stage common bulkhead. The common bulkhead serves as the single interface between the liquid hydrogen and liquid oxygen portions of the Upper Stage propellant tank. The bulkhead consists of spin-formed aluminum domes friction stir welded to Y-rings and bonded to a perforated phenolic honeycomb core. Nondestructive evaluation methods are being developed for assessing core integrity and the core-to-dome bond line of the common bulkhead. Detection of manufacturing defects such as delaminations between the core and face sheets as well as service life defects such as crushed or sheared core resulting from impact loading are all of interest. The focus of this work will be on the application of thermographic, shearographic, and phased array ultrasonic methods to the bonding demonstration article as well as various smaller test panels featuring design specific defect types and geometric features.
A Conditional Exposure Control Method for Multidimensional Adaptive Testing
ERIC Educational Resources Information Center
Finkelman, Matthew; Nering, Michael L.; Roussos, Louis A.
2009-01-01
In computerized adaptive testing (CAT), ensuring the security of test items is a crucial practical consideration. A common approach to reducing item theft is to define maximum item exposure rates, i.e., to limit the proportion of examinees to whom a given item can be administered. Numerous methods for controlling exposure rates have been proposed…
Simultaneous optimization method for absorption spectroscopy postprocessing.
Simms, Jean M; An, Xinliang; Brittelle, Mack S; Ramesh, Varun; Ghandhi, Jaal B; Sanders, Scott T
2015-05-10
A simultaneous optimization method is proposed for absorption spectroscopy postprocessing. This method is particularly useful for thermometry measurements based on congested spectra, as commonly encountered in combustion applications of H2O absorption spectroscopy. A comparison test demonstrated that the simultaneous optimization method had greater accuracy, greater precision, and was more user-independent than the common step-wise postprocessing method previously used by the authors. The simultaneous optimization method was also used to process experimental data from an environmental chamber and a constant volume combustion chamber, producing results with errors on the order of only 1%.
The freshwater amphipod Hyalella azteca is a common organism used for sediment toxicity testing in the United States and elsewhere. Standard methods for 10-d and 42-d toxicity tests with H. azteca were last revised and published by USEPA/ASTM in 2000. Under the methods in the man...
The freshwater amphipod, Hyalella azteca, is a common organism used for sediment toxicity testing. Standard methods for 10-d and 42-d sediment toxicity tests with H. azteca were last revised and published by USEPA/ASTM in 2000. While Hyalella azteca methods exist for sediment tox...
Dynamic Bayesian Networks as a Probabilistic Metamodel for Combat Simulations
2014-09-18
test is commonly used for large data sets and is the method of comparison presented in Section 5.5. 4.3.3 Kullback-Leibler Divergence Goodness of Fit ... methods exist that might improve the results. A goodness-of-fit test using the Kullback-Leibler divergence was proposed in the first paper, but still ... Kullback-Leibler Divergence Goodness of Fit Test ...
SPSS and SAS programs for comparing Pearson correlations and OLS regression coefficients.
Weaver, Bruce; Wuensch, Karl L
2013-09-01
Several procedures that use summary data to test hypotheses about Pearson correlations and ordinary least squares regression coefficients have been described in various books and articles. To our knowledge, however, no single resource describes all of the most common tests. Furthermore, many of these tests have not yet been implemented in popular statistical software packages such as SPSS and SAS. In this article, we describe all of the most common tests and provide SPSS and SAS programs to perform them. When they are applicable, our code also computes 100 × (1 - α)% confidence intervals corresponding to the tests. For testing hypotheses about independent regression coefficients, we demonstrate one method that uses summary data and another that uses raw data (i.e., Potthoff analysis). When the raw data are available, the latter method is preferred, because use of summary data entails some loss of precision due to rounding.
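The most familiar of these tests, comparing two Pearson correlations from independent samples, rests on Fisher's r-to-z transformation. A minimal Python sketch of that one case (the article's SPSS and SAS programs cover many more, including dependent correlations and regression coefficients):

```python
import math

def fisher_z(r):
    """Fisher r-to-z transformation: z = atanh(r)."""
    return 0.5 * math.log((1 + r) / (1 - r))

def compare_independent_r(r1, n1, r2, n2):
    """z statistic for H0: rho1 == rho2, independent samples."""
    se = math.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))
    return (fisher_z(r1) - fisher_z(r2)) / se

# Hypothetical correlations from two independent groups of 103 cases each
z = compare_independent_r(0.63, 103, 0.41, 103)
print(round(z, 3))  # |z| > 1.96 would reject equality at alpha = .05
```

The transformation matters because the sampling distribution of r itself is skewed for nonzero rho; on the z scale it is approximately normal with variance 1/(n - 3).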
Krishnamoorthy, K; Oral, Evrim
2017-12-01
Standardized likelihood ratio test (SLRT) for testing the equality of means of several log-normal distributions is proposed. The properties of the SLRT and an available modified likelihood ratio test (MLRT) and a generalized variable (GV) test are evaluated by Monte Carlo simulation and compared. Evaluation studies indicate that the SLRT is accurate even for small samples, whereas the MLRT could be quite liberal for some parameter values, and the GV test is in general conservative and less powerful than the SLRT. Furthermore, a closed-form approximate confidence interval for the common mean of several log-normal distributions is developed using the method of variance estimate recovery, and compared with the generalized confidence interval with respect to coverage probabilities and precision. Simulation studies indicate that the proposed confidence interval is accurate and better than the generalized confidence interval in terms of coverage probabilities. The methods are illustrated using two examples.
Considering Horn's Parallel Analysis from a Random Matrix Theory Point of View.
Saccenti, Edoardo; Timmerman, Marieke E
2017-03-01
Horn's parallel analysis is a widely used method for assessing the number of principal components and common factors. We discuss the theoretical foundations of parallel analysis for principal components based on a covariance matrix by making use of arguments from random matrix theory. In particular, we show that (i) for the first component, parallel analysis is an inferential method equivalent to the Tracy-Widom test, (ii) its use to test high-order eigenvalues is equivalent to the use of the joint distribution of the eigenvalues, and thus should be discouraged, and (iii) a formal test for higher-order components can be obtained based on a Tracy-Widom approximation. We illustrate the performance of the two testing procedures using simulated data generated under both a principal component model and a common factors model. For the principal component model, the Tracy-Widom test performs consistently in all conditions, while parallel analysis shows unpredictable behavior for higher-order components. For the common factor model, including major and minor factors, both procedures are heuristic approaches, with variable performance. We conclude that the Tracy-Widom procedure is preferred over parallel analysis for statistically testing the number of principal components based on a covariance matrix.
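For readers unfamiliar with the procedure under discussion, Horn's parallel analysis retains components whose observed eigenvalues exceed those obtained from random data of the same dimensions. An illustrative Python sketch (the simulation count and 95th-percentile threshold are common choices, not the paper's exact setup):

```python
import numpy as np

def parallel_analysis(X, n_sims=200, q=95, seed=0):
    """Horn's parallel analysis on the correlation matrix: count the
    components whose eigenvalues exceed the q-th percentile of
    eigenvalues from random normal data of the same n x p shape."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    sims = np.empty((n_sims, p))
    for i in range(n_sims):
        R = np.corrcoef(rng.standard_normal((n, p)), rowvar=False)
        sims[i] = np.sort(np.linalg.eigvalsh(R))[::-1]
    return int(np.sum(obs > np.percentile(sims, q, axis=0)))

# Synthetic data with two underlying factors, three indicators each
rng = np.random.default_rng(42)
f = rng.standard_normal((500, 2))
X = np.hstack([
    f[:, [0]] @ np.ones((1, 3)) + 0.5 * rng.standard_normal((500, 3)),
    f[:, [1]] @ np.ones((1, 3)) + 0.5 * rng.standard_normal((500, 3)),
])
print(parallel_analysis(X))
```

The random-matrix argument in the paper concerns exactly the simulated eigenvalue thresholds computed here: for the largest eigenvalue they approximate a Tracy-Widom quantile, while for higher-order eigenvalues the marginal comparison is no longer justified.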
Menezes, Everardo Albuquerque; Vasconcelos Júnior, Antônio Alexandre de; Ângelo, Maria Rozzelê Ferreira; Cunha, Maria da Conceição dos Santos Oliveira; Cunha, Francisco Afrânio
2013-01-01
Antifungal susceptibility testing assists in finding the appropriate treatment for fungal infections, which are increasingly common. However, such testing is not very widespread. There are several existing methods, and the correlation between such methods was evaluated in this study. The susceptibility to fluconazole of 35 strains of Candida sp. isolated from blood cultures was evaluated by the following methods: microdilution, Etest, and disk diffusion. The correlation between the methods was around 90%. The disk diffusion test exhibited a good correlation and can be used in laboratory routines to detect strains of Candida sp. that are resistant to fluconazole.
Evaluation of clinical methods for peroneal muscle testing.
Sarig-Bahat, Hilla; Krasovsky, Andrei; Sprecher, Elliot
2013-03-01
Manual muscle testing of the peroneal muscles is well accepted as a testing method in musculoskeletal physiotherapy for the assessment of the foot and ankle. The peroneus longus and brevis are primary evertors and secondary plantar flexors of the ankle joint. However, some international textbooks describe them as dorsiflexors when instructing peroneal muscle testing. The identified variability raised the question of whether these educational texts are reflected in the clinical field. The purposes of this study were to investigate which methods are commonly used in the clinical field for peroneal muscle testing and to evaluate their compatibility with functional anatomy. A cross-sectional study was conducted, using an electronic questionnaire sent to 143 Israeli physiotherapists in the musculoskeletal field. The survey asked about the anatomical location of manual resistance and the combination of motions resisted. Ninety-seven responses were received. The majority (69%) of respondents correctly identified the peronei as evertors, but asserted that resistance should be located over the dorsal aspect of the fifth metatarsus, thereby disregarding the peroneus longus. Moreover, 38% of the respondents described the peronei as dorsiflexors rather than plantar flexors. Only 2% selected the correct method of resisting plantarflexion and eversion at the base of the first metatarsus. We consider this technique to be the most compatible with the anatomy of the peroneus longus and brevis. The Fisher-Freeman-Halton test indicated that there was a significant relationship between responses on the questions (P = 0.0253, 95% CI 0.0249-0.0257), thus justifying further correspondence analysis.
The correspondence analysis found no clustering of answers that were compatible with anatomical evidence and applied the correct technique, but it did demonstrate a common error, resisting dorsiflexion rather than plantarflexion, in agreement with the described frequencies. Inconsistencies were identified between the instructional method commonly provided for peroneal muscle testing in textbooks and the functional anatomy of these muscles. The results reflect a lack of accuracy in applying functional anatomy to peroneal testing. This may be due to limited use of peroneal muscle testing or to inadequate investigation of the existing evaluation methods and their validity. Accordingly, teaching materials and clinical methods used for this test should be re-evaluated. Further research should investigate the value of peroneal muscle testing in clinical ankle evaluation. Copyright © 2012 John Wiley & Sons, Ltd.
Teaching beyond the Test: A Method for Designing Test-Preparation Classes
ERIC Educational Resources Information Center
Derrick, Deirdre
2013-01-01
Test-preparation classes that focus on skills will benefit students beyond the test by developing skills they can use at university. This article discusses the purposes of various tests and outlines how to design effective test-prep classes. Several practical activities are included, and an appendix provides information on common standardized…
Rooting common and cat greenbrier
Franz L. Pogge; John D. Gill; Bradford C. Bearce
1974-01-01
Because reliable methods for propagating greenbriers are needed for wildlife-habitat purposes, we tested stem and rhizome cuttings of common and cat greenbrier and tubers of the latter species. Common greenbrier is the better species for most wildlife habitat uses. It proved fairly easy to propagate from either stem or rhizome cuttings. Similar cuttings from cat...
Bioelectrical Impedance and Body Composition Assessment
ERIC Educational Resources Information Center
Martino, Mike
2006-01-01
This article discusses field tests that can be used in physical education programs. The most common field tests are anthropometric measurements, which include body mass index (BMI), girth measurements, and skinfold testing. Another field test that is gaining popularity is bioelectrical impedance analysis (BIA). Each method has particular strengths…
Chaurasia, Ashok; Harel, Ofer
2015-02-10
Tests for regression coefficients such as global, local, and partial F-tests are common in applied research. In the framework of multiple imputation, there are several papers addressing tests for regression coefficients. However, for simultaneous hypothesis testing, the existing methods are computationally intensive because they involve calculation with vectors and (inversion of) matrices. In this paper, we propose a simple method based on the scalar entity, coefficient of determination, to perform (global, local, and partial) F-tests with multiply imputed data. The proposed method is evaluated using simulated data and applied to suicide prevention data. Copyright © 2014 John Wiley & Sons, Ltd.
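In the complete-data case, the scalar route the authors exploit is the standard R²-based F statistic for comparing nested OLS models; the extension to multiply imputed data is the paper's contribution and is not reproduced here. A sketch of the complete-data version:

```python
def r2_f_test(r2_full, r2_reduced, n, p_full, p_reduced):
    """Partial F statistic for nested OLS models computed from
    coefficients of determination:
    F = ((R2_full - R2_reduced) / q) / ((1 - R2_full) / (n - p_full - 1)),
    where q = p_full - p_reduced is the number of tested coefficients."""
    q = p_full - p_reduced
    return ((r2_full - r2_reduced) / q) / ((1 - r2_full) / (n - p_full - 1))

# Hypothetical fit: does adding 2 predictors to a 3-predictor model help?
F = r2_f_test(0.40, 0.30, n=100, p_full=5, p_reduced=3)
print(round(F, 3))
```

Under H0 this statistic follows an F distribution with q and n - p_full - 1 degrees of freedom, so the hypothetical value here would be referred to F(2, 94); no vector or matrix computation is needed, which is the appeal of the scalar approach.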
Kline, Joshua C.
2014-01-01
Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles—a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to use sufficient statistical tests necessary to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization from 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization. PMID:25210152
DOE Office of Scientific and Technical Information (OSTI.GOV)
Comandi, G.L.; Toncelli, R.; Chiofalo, M.L.
'Galileo Galilei on the ground' (GGG) is a fast rotating differential accelerometer designed to test the equivalence principle (EP). Its sensitivity to differential effects, such as the effect of an EP violation, depends crucially on the capability of the accelerometer to reject all effects acting in common mode. By applying the theoretical and simulation methods reported in Part I of this work, and tested therein against experimental data, we predict the occurrence of an enhanced common mode rejection of the GGG accelerometer. We demonstrate that the best rejection of common mode disturbances can be tuned in a controlled way by varying the spin frequency of the GGG rotor.
Risk-Based Object Oriented Testing
NASA Technical Reports Server (NTRS)
Rosenberg, Linda H.; Stapko, Ruth; Gallo, Albert
2000-01-01
Software testing is a well-defined phase of the software development life cycle. Functional ("black box") testing and structural ("white box") testing are two methods of test case design commonly used by software developers. A lesser known testing method is risk-based testing, which takes into account the probability of failure of a portion of code as determined by its complexity. For object oriented programs, a methodology is proposed for identification of risk-prone classes. Risk-based testing is a highly effective testing technique that can be used to find and fix the most important problems as quickly as possible.
Estimating Achievement Gaps from Test Scores Reported in Ordinal "Proficiency" Categories
ERIC Educational Resources Information Center
Ho, Andrew D.; Reardon, Sean F.
2012-01-01
Test scores are commonly reported in a small number of ordered categories. Examples of such reporting include state accountability testing, Advanced Placement tests, and English proficiency tests. This paper introduces and evaluates methods for estimating achievement gaps on a familiar standard-deviation-unit metric using data from these ordered…
Nondestructive spot test method for magnesium and magnesium alloys
NASA Technical Reports Server (NTRS)
Wilson, M. L. (Inventor)
1973-01-01
A method for spot test identification of magnesium and various magnesium alloys commonly used in aerospace applications is described. The spot test identification involves color codes obtained when several drops of 3 M hydrochloric acid are placed on the surface to be tested. After approximately thirty seconds, two drops of the reacted acid are transferred to each of two depressions in a spot plate for the addition of other chemicals, with subsequent color changes indicating magnesium or its alloy.
Study on AC loss measurements of HTS power cable for standardizing
NASA Astrophysics Data System (ADS)
Mukoyama, Shinichi; Amemiya, Naoyuki; Watanabe, Kazuo; Iijima, Yasuhiro; Mido, Nobuhiro; Masuda, Takao; Morimura, Toshiya; Oya, Masayoshi; Nakano, Tetsutaro; Yamamoto, Kiyoshi
2017-09-01
High-temperature superconducting power cables (HTS cables) have been developed for more than 20 years. Alongside cable development, test methods for HTS cables have been discussed and proposed in many laboratories and companies. Recently, there has been a need to standardize these test methods and make them common worldwide. CIGRE formed a working group (B1-31) to discuss test methods for HTS cables as power cables and published a recommendation on the test method. Based on that recommendation, IEC TC20 submitted a New Work Item Proposal (NP) this year, and IEC TC20 and IEC TC90 started standardization work on the testing of HTS AC cables. However, the individual test methods used to measure the performance of HTS cables have not yet been established as common methods worldwide. AC loss is one of the most important properties for disseminating low-loss, economically efficient HTS cables. We aim to establish a rational and highly accurate method for AC loss measurement. Japan is in a leading position in AC loss studies, because Japanese researchers have studied AC loss both technically and scientifically and have developed effective technologies for AC loss reduction. The JP domestic commission of TC90 formed a working team to discuss methods of AC loss measurement, aiming ultimately at an international standard. This paper reports on AC loss measurements of two types of HTS conductors: an HTS conductor without an HTS shield and an HTS conductor with an HTS shield. The electrical method is suggested for the AC loss measurement.
Measurement environments and testing
NASA Astrophysics Data System (ADS)
Marvin, A. C.
1991-06-01
The various methods used to assess both the emission (interference generation) performance of electronic equipment and the immunity of electronic equipment to external electromagnetic interference are described. The measurement methods attempt to simulate realistic operating conditions for the equipment being tested, yet at the same time they must be repeatable and practical to operate. This has led to the development of a variety of test methods, each of which has its limitations. The focus is on the most common measurement methods, such as open-field test sites, screened enclosures, and transverse electromagnetic (TEM) cells. The physical justification for the methods, their limitations, and their measurement precision are described. Ways of relating similar measurements made by different methods are discussed, and some thoughts on future measurement improvements are presented.
Generalized functional linear models for gene-based case-control association studies.
Fan, Ruzong; Wang, Yifan; Mills, James L; Carter, Tonia C; Lobach, Iryna; Wilson, Alexander F; Bailey-Wilson, Joan E; Weeks, Daniel E; Xiong, Momiao
2014-11-01
By using functional data analysis techniques, we developed generalized functional linear models for testing association between a dichotomous trait and multiple genetic variants in a genetic region while adjusting for covariates. Both fixed and mixed effect models are developed and compared. Extensive simulations show that Rao's efficient score tests of the fixed effect models are very conservative since they generate lower type I errors than nominal levels, and global tests of the mixed effect models generate accurate type I errors. Furthermore, we found that the Rao's efficient score test statistics of the fixed effect models have higher power than the sequence kernel association test (SKAT) and its optimal unified version (SKAT-O) in most cases when the causal variants are both rare and common. When the causal variants are all rare (i.e., minor allele frequencies less than 0.03), the Rao's efficient score test statistics and the global tests have similar or slightly lower power than SKAT and SKAT-O. In practice, it is not known whether rare variants or common variants in a gene region are disease related. All we can assume is that a combination of rare and common variants influences disease susceptibility. Thus, the improved performance of our models when the causal variants are both rare and common shows that the proposed models can be very useful in dissecting complex traits. We compare the performance of our methods with SKAT and SKAT-O on real neural tube defects and Hirschsprung's disease datasets. The Rao's efficient score test statistics and the global tests are more sensitive than SKAT and SKAT-O in the real data analysis. Our methods can be used in either gene-disease genome-wide/exome-wide association studies or candidate gene analyses. © 2014 WILEY PERIODICALS, INC.
EGFR Mutation Testing Practices within the Asia Pacific Region
Kerr, Keith M.; Utomo, Ahmad; Rajadurai, Pathmanathan; Tran, Van Khanh; Du, Xiang; Chou, Teh-Ying; Enriquez, Ma. Luisa D.; Lee, Geon Kook; Iqbal, Jabed; Shuangshoti, Shanop; Chung, Jin-Haeng; Hagiwara, Koichi; Liang, Zhiyong; Normanno, Nicola; Park, Keunchil; Toyooka, Shinichi; Tsai, Chun-Ming; Waring, Paul; Zhang, Li; McCormack, Rose; Ratcliffe, Marianne; Itoh, Yohji; Sugeno, Masatoshi; Mok, Tony
2015-01-01
Introduction: The efficacy of epidermal growth factor receptor (EGFR) tyrosine kinase inhibitors in EGFR mutation-positive non–small-cell lung cancer (NSCLC) patients necessitates accurate, timely testing. Although EGFR mutation testing has been adopted by many laboratories in Asia, data are lacking on the proportion of NSCLC patients tested in each country, and the most commonly used testing methods. Methods: A retrospective survey of records from NSCLC patients tested for EGFR mutations during 2011 was conducted in 11 Asian Pacific countries at 40 sites that routinely performed EGFR mutation testing during that period. Patient records were used to complete an online questionnaire at each site. Results: Of the 22,193 NSCLC patient records surveyed, 31.8% (95% confidence interval: 31.2%–32.5%) were tested for EGFR mutations. The rate of EGFR mutation positivity was 39.6% among the 10,687 cases tested. The majority of samples were biopsy and/or cytology samples (71.4%). DNA sequencing was the most commonly used testing method, accounting for 40% and 32.5% of tissue and cytology samples, respectively. A pathology report was available to only 60.0% of the sites, and 47.5% were not members of a Quality Assurance Scheme. Conclusions: In 2011, EGFR mutation testing practices varied widely across Asia. These data provide a reference platform from which to improve the molecular diagnosis of NSCLC, and EGFR mutation testing in particular, in Asia. PMID:25376513
Jiang, Xiao-Dan; Li, Guang-Yu; Dong, Zhen; Zhu, Dong-Dong
2011-01-01
Skin-prick testing (SPT) is the most common screening method for allergy evaluation. The detection of serum-specific immunoglobulin E (sIgE) is also commonly used. The sensitivity and specificity of these testing methods may vary with the type of causative allergen and the type of allergic manifestation. The purpose of this study was to evaluate the correlation between two methods of measuring sIgE (AllergyScreen [Mediwiss Analytic GmbH, Moers, Germany] and ImmunoCAP [Pharmacia, Uppsala, Sweden]) and SPT for the diagnosis of allergic rhinitis (AR). All 216 patients who were referred to the allergist for suspected AR from June to October 2009 had SPT and the two serological tests. One hundred fifty-eight patients had a positive clinical history and a related positive SPT. The SPT was used as the reference standard, and we selected three allergens (Dermatophagoides pteronyssinus, mugwort, and ragweed), which were common in fall in northeast China, to analyze the correlation of the two serum tests and SPT. Compared with the SPT, the diagnostic indexes (accuracy, sensitivity, and specificity) of the AllergyScreen system and the ImmunoCAP system were 0.819 versus 0.810, 0.780 versus 0.872, and 0.862 versus 0.741, respectively. The accuracy was similar between the two systems (p > 0.05). The ImmunoCAP system had higher sensitivity (p < 0.01). The AllergyScreen system had higher specificity (p < 0.01). These data indicate that the AllergyScreen and ImmunoCAP systems can identify potentially significant allergens in the diagnosis of AR in patients from northeastern China.
Proposal of a method for the evaluation of inaccuracy of home sphygmomanometers.
Akpolat, Tekin
2009-10-01
There is no formal protocol for evaluating the individual accuracy of home sphygmomanometers. The aims of this study were to propose a method for evaluating the accuracy of automated home sphygmomanometers and to test the applicability of the defined method. The purposes of this method were to avoid major inaccuracies and to estimate the optimal circumstances for individual accuracy. The method has three stages, and sequential measurement of blood pressure is used. The tested devices were categorized into four groups: accurate, acceptable, inaccurate, and very inaccurate (major inaccuracy). The defined method takes approximately 10 min (excluding relaxation time) and was tested on three different occasions. The application of the method has shown that inaccuracy is a common problem among non-tested devices; that validated devices are superior to those that are non-validated or whose validation status is unknown; that major inaccuracy is common, especially in non-tested devices; and that validation does not guarantee individual accuracy. A protocol addressing the accuracy of a particular sphygmomanometer in an individual patient is required, and a practical method has been suggested to achieve this. This method can be modified, but the main idea and approach should be preserved unless a better method is proposed. The purchase of validated devices and evaluation of the accuracy of the purchased device in an individual patient will improve the monitoring of self-measurement of blood pressure at home. This study addresses device inaccuracy, but errors related to the patient, observer, or blood pressure measurement technique should not be underestimated, and strict adherence to the manufacturer's instructions is essential.
Meta-analysis of diagnostic accuracy studies in mental health
Takwoingi, Yemisi; Riley, Richard D; Deeks, Jonathan J
2015-01-01
Objectives To explain methods for data synthesis of evidence from diagnostic test accuracy (DTA) studies, and to illustrate different types of analyses that may be performed in a DTA systematic review. Methods We described properties of meta-analytic methods for quantitative synthesis of evidence. We used a DTA review comparing the accuracy of three screening questionnaires for bipolar disorder to illustrate application of the methods for each type of analysis. Results The discriminatory ability of a test is commonly expressed in terms of sensitivity (proportion of those with the condition who test positive) and specificity (proportion of those without the condition who test negative). There is a trade-off between sensitivity and specificity, as an increasing threshold for defining test positivity will decrease sensitivity and increase specificity. Methods recommended for meta-analysis of DTA studies, such as the bivariate or hierarchical summary receiver operating characteristic (HSROC) model, jointly summarise sensitivity and specificity while taking into account this threshold effect, as well as allowing for between-study differences in test performance beyond what would be expected by chance. The bivariate model focuses on estimation of a summary sensitivity and specificity at a common threshold, while the HSROC model focuses on the estimation of a summary curve from studies that have used different thresholds. Conclusions Meta-analyses of diagnostic accuracy studies can provide answers to important clinical questions. We hope this article will provide clinicians with sufficient understanding of the terminology and methods to aid interpretation of systematic reviews and facilitate better patient care. PMID:26446042
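The sensitivity and specificity definitions quoted in this abstract can be made concrete with a minimal sketch; the 2x2 counts below are hypothetical and not data from the review itself.

```python
# Sensitivity and specificity from a 2x2 diagnostic table
# (tp/fn/tn/fp counts below are illustrative, not study data).
def sensitivity(tp, fn):
    """Proportion of those with the condition who test positive."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Proportion of those without the condition who test negative."""
    return tn / (tn + fp)

# Hypothetical screening-study counts: 80 true positives, 20 false
# negatives, 90 true negatives, 10 false positives.
se = sensitivity(80, 20)   # 0.8
sp = specificity(90, 10)   # 0.9
```

Raising the positivity threshold moves cases from the "test positive" to the "test negative" cells, which is the trade-off (lower sensitivity, higher specificity) that the bivariate and HSROC models account for.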
Correlation of geotextile puncture test methods : research brief.
DOT National Transportation Integrated Search
2017-05-01
Geotextiles are commonly used in pavements, earth retaining structures, landfills and other geotechnical contexts. Various tests are conducted to evaluate and classify geotextiles to determine their suitability for each of these applications. The AST...
Method refinements for the midge life-cycle, Chironomus dilutus test
Larval stages of non-biting midges can be found in almost any freshwater ecosystem, and one of the commonly tested midges is Chironomus dilutus (Chironomidae, Diptera) which is used for toxicity testing and ecological risk assessment of freshwater contaminants. USEPA, ASTM, Envir...
Collaborative derivation of reference intervals for major clinical laboratory tests in Japan.
Ichihara, Kiyoshi; Yomamoto, Yoshikazu; Hotta, Taeko; Hosogaya, Shigemi; Miyachi, Hayato; Itoh, Yoshihisa; Ishibashi, Midori; Kang, Dongchon
2016-05-01
Three multicentre studies of reference intervals were conducted recently in Japan. The Committee on Common Reference Intervals of the Japan Society of Clinical Chemistry sought to establish common reference intervals for 40 laboratory tests which were measured in common in the three studies and regarded as well harmonized in Japan. The study protocols were comparable, with recruitment mostly from hospital workers with body mass index ≤28 and no medications. Age and sex distributions were made equal to obtain a final data size of 6345 individuals. Between-subgroup differences were expressed as the SD ratio (between-subgroup SD divided by SD representing the reference interval). Between-study differences were all within acceptable levels, and thus the three datasets were merged. By adopting SD ratio ≥0.50 as a guide, sex-specific reference intervals were necessary for 12 assays. Age-specific reference intervals for females partitioned at age 45 were required for five analytes. The reference intervals derived by the parametric method resulted in appreciable narrowing of the ranges by applying the latent abnormal values exclusion method in 10 items which were closely associated with prevalent disorders among healthy individuals. Sex- and age-related profiles of reference values, derived from individuals with no abnormal results in major tests, showed peculiar patterns specific to each analyte. Common reference intervals for nationwide use were developed for 40 major tests, based on three multicentre studies by advanced statistical methods. Sex- and age-related profiles of reference values are of great relevance not only for interpreting test results, but for applying clinical decision limits specified in various clinical guidelines. © The Author(s) 2015.
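The SD-ratio partitioning rule described above can be sketched as follows. This is an illustration only: the subgroup means and reference SD are invented, and the central 95% parametric interval (mean ± 1.96 SD under a normality assumption) is one common convention, not necessarily the exact procedure used in the study.

```python
import statistics

def sd_ratio(subgroup_means, reference_sd):
    """Between-subgroup SD (population SD of the subgroup means)
    divided by the SD representing the reference interval."""
    return statistics.pstdev(subgroup_means) / reference_sd

def parametric_interval(mean, sd):
    """Central 95% parametric reference interval, assuming normality."""
    return (mean - 1.96 * sd, mean + 1.96 * sd)

# Hypothetical sex-specific means for one analyte, reference SD = 10.0.
ratio = sd_ratio([48.0, 58.0], 10.0)   # between-subgroup SD 5.0 -> ratio 0.5
needs_partition = ratio >= 0.50        # the guide adopted in the study
lo, hi = parametric_interval(53.0, 5.0)  # approximately (43.2, 62.8)
```

With the guide SD ratio ≥ 0.50, this hypothetical analyte would get sex-specific reference intervals.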
A novel method of testing the shear strength of thick honeycomb composites
NASA Technical Reports Server (NTRS)
Hodge, A. J.; Nettles, A. T.
1991-01-01
Sandwich composites of aluminum and glass/phenolic honeycomb core were tested for shear strength before and after impact damage. The assessment of shear strength was performed in two ways: by four-point bend testing of sandwich beams and by a novel double lap shear (DLS) test. This testing technique was developed so that smaller specimens could be used, making common lab-scale fabrication and testing possible. The two techniques yielded similar data. Of the two methods, the DLS test gave slightly lower shear strength values, but these were closer to the supplier's values for shear strength.
ERIC Educational Resources Information Center
Yoder, Zachariah
2017-01-01
The recorded text test (RTT) is commonly used to test dialect intelligibility, often to inform language development decisions. More than 25 papers using the RTT method were published on www.sil.org/silesr from January 2009 to March 2013. As introduced by Casad [1974. "Dialect Intelligibility Testing." Summer Institute of Linguistics…
The Sequential Probability Ratio Test and Binary Item Response Models
ERIC Educational Resources Information Center
Nydick, Steven W.
2014-01-01
The sequential probability ratio test (SPRT) is a common method for terminating item response theory (IRT)-based adaptive classification tests. To decide whether a classification test should stop, the SPRT compares a simple log-likelihood ratio, based on the classification bound separating two categories, to prespecified critical values. As has…
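The SPRT stopping rule described in this abstract can be sketched in a few lines. This is a sketch under stated assumptions, not the paper's implementation: it assumes a two-parameter logistic (2PL) item model, Wald's critical values for the error rates, and an indifference region of half-width `delta` around the classification bound; all item parameters below are hypothetical.

```python
import math

def p_correct(theta, a, b):
    """2PL probability of a correct response at ability theta."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def sprt_decision(responses, items, cut, delta=0.5, alpha=0.05, beta=0.05):
    """Compare the log-likelihood ratio evaluated at cut +/- delta
    with Wald's critical values; return 'above', 'below', or 'continue'."""
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    for u, (a, b) in zip(responses, items):
        p_hi = p_correct(cut + delta, a, b)  # likelihood if just above the cut
        p_lo = p_correct(cut - delta, a, b)  # likelihood if just below the cut
        llr += math.log(p_hi / p_lo) if u == 1 else math.log((1 - p_hi) / (1 - p_lo))
    if llr >= upper:
        return "above"
    if llr <= lower:
        return "below"
    return "continue"

# Ten correct answers on items with a=1.0, b=0.0 push the LLR past the
# upper critical value, so the adaptive test can stop and classify.
decision = sprt_decision([1] * 10, [(1.0, 0.0)] * 10, cut=0.0)
```

After each administered item the test re-checks the ratio, so examinees far from the cut score are classified quickly while borderline examinees keep receiving items.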
De Luca, Carlo J; Kline, Joshua C
2014-12-01
Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles, a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to apply the statistical tests necessary to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization in 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization. Copyright © 2014 the American Physiological Society.
Adhesion testing procedure for hot-poured crack sealants.
DOT National Transportation Integrated Search
2008-11-01
Crack sealing is a common pavement maintenance treatment because it extends pavement service life significantly. : However, crack sealant often fails prematurely due to a loss of adhesion. Because current test methods are mostly : empirical and only ...
Technology Solutions Case Study: Predicting Envelope Leakage in Attached Dwellings
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2013-11-01
The most common method of measuring air leakage is to perform a single (or "solo") blower door pressurization and/or depressurization test. In detached housing, the single blower door test measures leakage to the outside. In attached housing, however, this "solo" test method measures both air leakage to the outside and air leakage between adjacent units through common surfaces. In an attempt to create a simplified tool for predicting leakage to the outside, Building America team Consortium for Advanced Residential Buildings (CARB) performed a preliminary statistical analysis on blower door test results from 112 attached dwelling units in four apartment complexes. Although the subject data set is limited in size and variety, the preliminary analyses suggest significant predictors are present and support the development of a predictive model. Further data collection is underway to create a more robust prediction tool for use across different construction types, climate zones, and unit configurations.
Murphy, Christine M; Devlin, John J; Beuhler, Michael C; Cheifetz, Paul; Maynard, Susan; Schwartz, Michael D; Kacinko, Sherri
2018-04-01
Nitromethane, found in fuels used for short distance racing, model cars, and model airplanes, produces a falsely elevated serum creatinine with standard creatinine analysis via the Jaffé method. Erroneous creatinine elevation often triggers extensive testing and leads to inaccurate diagnoses and delayed or inappropriate medical interventions. Multiple reports in the literature identify "enzymatic assays" as an alternative method to detect the true value of creatinine, but this ambiguity does not help providers determine in real time what type of enzymatic assay testing can be done to establish whether there is indeed false elevation. We report seven cases of ingested nitromethane where creatinine was determined via a Beckman Coulter® analyser using the Jaffé method, a Vitros® analyser, or i-STAT® point-of-care testing. Nitromethane was detected and semi-quantified using a common clinical toxic alcohol analysis method, and quantified by headspace gas chromatography-mass spectrometry. When creatinine was determined using i-STAT® point-of-care testing or a Vitros® analyser, levels were within the normal range. Comparatively, all initial creatinine levels obtained via the Jaffé method were elevated. Nitromethane concentrations ranged from 42 to 310 μg/mL. These cases demonstrate reliable assessment of creatinine through other enzymatic methods using a Vitros® analyser or i-STAT®. Additionally, nitromethane is detectable and quantifiable using routine alcohols gas chromatography analysis and by headspace gas chromatography-mass spectrometry.
Efficient IDUA Gene Mutation Detection with Combined Use of dHPLC and Dried Blood Samples
Duarte, Ana Joana; Vieira, Luis
2013-01-01
Objectives. Development of a simple mutation-directed method to lower the cost of mutation testing using an easily obtainable biological material. The feasibility of such a method was assessed using a GC-rich amplicon. Design and Methods. A method of denaturing high-performance liquid chromatography (dHPLC) was improved and implemented as a technique for the detection of variants in exon 9 of the IDUA gene. The optimized method was tested in 500 genomic DNA samples obtained from dried blood spots (DBS). Results. With this dHPLC approach it was possible to detect different variants, including the common p.Trp402Ter mutation in the IDUA gene. The high GC content did not interfere with the resolution and reliability of this technique, and discrimination of G-C transversions was also achieved. Conclusion. This PCR-based dHPLC method proved to be a rapid, sensitive, and excellent option for screening numerous samples obtained from DBS. Furthermore, it resulted in the consistent detection of clearly distinguishable profiles of the common p.Trp402Ter IDUA mutation with an advantageous balance of cost and technical requirements. PMID:27335677
New rapid method for determining edgewise compressive strength of corrugated fiberboard
John W. Koning
1986-01-01
The objective of this study was to determine if corrugated fiberboard specimens that had been necked down with a common router would yield acceptable edgewise compressive strength values. Tests were conducted on specimens prepared using a circular saw and router, and the results were compared with those obtained on specimens prepared according to TAPPI Test Method T...
ERIC Educational Resources Information Center
Hsiao, Yu-Yu; Kwok, Oi-Man; Lai, Mark H. C.
2018-01-01
Path models with observed composites based on multiple items (e.g., mean or sum score of the items) are commonly used to test interaction effects. Under this practice, researchers generally assume that the observed composites are measured without errors. In this study, we reviewed and evaluated two alternative methods within the structural…
A Unified Approach to IRT Scale Linking and Scale Transformations. Research Report. RR-04-09
ERIC Educational Resources Information Center
von Davier, Matthias; von Davier, Alina A.
2004-01-01
This paper examines item response theory (IRT) scale transformations and IRT scale linking methods used in the Non-Equivalent Groups with Anchor Test (NEAT) design to equate two tests, X and Y. It proposes a unifying approach to the commonly used IRT linking methods: mean-mean, mean-var linking, concurrent calibration, Stocking and Lord and…
Feasibility of Combining Common Data Elements Across Studies to Test a Hypothesis.
Corwin, Elizabeth J; Moore, Shirley M; Plotsky, Andrea; Heitkemper, Margaret M; Dorsey, Susan G; Waldrop-Valverde, Drenna; Bailey, Donald E; Docherty, Sharron L; Whitney, Joanne D; Musil, Carol M; Dougherty, Cynthia M; McCloskey, Donna J; Austin, Joan K; Grady, Patricia A
2017-05-01
The purpose of this article is to describe the outcomes of a collaborative initiative to share data across five schools of nursing in order to evaluate the feasibility of collecting common data elements (CDEs) and developing a common data repository to test hypotheses of interest to nursing scientists. This initiative extended work already completed by the National Institute of Nursing Research CDE Working Group that successfully identified CDEs related to symptoms and self-management, with the goal of supporting more complex, reproducible, and patient-focused research. Two exemplars describing the group's efforts are presented. The first highlights a pilot study wherein data sets from various studies by the represented schools were collected retrospectively, and merging of the CDEs was attempted. The second exemplar describes the methods and results of an initiative at one school that utilized a prospective design for the collection and merging of CDEs. Methods for identifying a common symptom to be studied across schools and for collecting the data dictionaries for the related data elements are presented for the first exemplar. The processes for defining and comparing the concepts and acceptable values, and for evaluating the potential to combine and compare the data elements are also described. Presented next are the steps undertaken in the second exemplar to prospectively identify CDEs and establish the data dictionaries. Methods for common measurement and analysis strategies are included. Findings from the first exemplar indicated that without plans in place a priori to ensure the ability to combine and compare data from disparate sources, doing so retrospectively may not be possible, and as a result hypothesis testing across studies may be prohibited. Findings from the second exemplar, however, indicated that a plan developed prospectively to combine and compare data sets is feasible and conducive to merged hypothesis testing. 
Although challenges exist in combining CDEs across studies into a common data repository, a prospective, well-designed protocol for identifying, coding, and comparing CDEs is feasible and supports the development of a common data repository and the testing of important hypotheses to advance nursing science. Incorporating CDEs across studies will increase sample size and improve data validity, reliability, transparency, and reproducibility, all of which will increase the scientific rigor of the study and the likelihood of impacting clinical practice and patient care. © 2017 Sigma Theta Tau International.
Online Testing Suffers Setbacks in Multiple States
ERIC Educational Resources Information Center
Davis, Michelle R.
2013-01-01
Widespread technical failures and interruptions of recent online testing in a number of states have shaken the confidence of educators and policymakers in high-tech assessment methods and raised serious concerns about schools' technological readiness for the coming common-core online tests. The glitches arose as many districts in the 46 states…
Some consequences of using the Horsfall-Barratt scale for hypothesis testing
USDA-ARS?s Scientific Manuscript database
Comparing treatment effects by hypothesis testing is a common practice in plant pathology. Nearest percent estimates (NPEs) of disease severity were compared to Horsfall-Barratt (H-B) scale data to explore whether there was an effect of assessment method on hypothesis testing. A simulation model ba...
Multilevel Multidimensional Item Response Model with a Multilevel Latent Covariate
ERIC Educational Resources Information Center
Cho, Sun-Joo; Bottge, Brian A.
2015-01-01
In a pretest-posttest cluster-randomized trial, one of the methods commonly used to detect an intervention effect involves controlling pre-test scores and other related covariates while estimating an intervention effect at post-test. In many applications in education, the total post-test and pre-test scores that ignores measurement error in the…
Luo, Li; Zhu, Yun; Xiong, Momiao
2012-01-01
The genome-wide association studies (GWAS) designed for next-generation sequencing data involve testing association of genomic variants, including common, low frequency, and rare variants. The current strategies for association studies are well developed for identifying association of common variants with the common diseases, but may be ill-suited when large amounts of allelic heterogeneity are present in sequence data. Recently, group tests that analyze their collective frequency differences between cases and controls shift the current variant-by-variant analysis paradigm for GWAS of common variants to the collective test of multiple variants in the association analysis of rare variants. However, group tests ignore differences in genetic effects among SNPs at different genomic locations. As an alternative to group tests, we developed a novel genome-information content-based statistics for testing association of the entire allele frequency spectrum of genomic variation with the diseases. To evaluate the performance of the proposed statistics, we use large-scale simulations based on whole genome low coverage pilot data in the 1000 Genomes Project to calculate the type 1 error rates and power of seven alternative statistics: a genome-information content-based statistic, the generalized T2, collapsing method, multivariate and collapsing (CMC) method, individual χ2 test, weighted-sum statistic, and variable threshold statistic. Finally, we apply the seven statistics to published resequencing dataset from ANGPTL3, ANGPTL4, ANGPTL5, and ANGPTL6 genes in the Dallas Heart Study. We report that the genome-information content-based statistic has significantly improved type 1 error rates and higher power than the other six statistics in both simulated and empirical datasets. PMID:22651812
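The collapsing step behind group tests such as the CMC method mentioned in this abstract can be sketched as follows. The genotype matrices are invented for illustration (rows are individuals, columns are rare-variant sites, entries are minor-allele counts 0/1/2); a real CMC analysis would follow collapsing with a multivariate test, which is omitted here.

```python
def collapse(genotypes):
    """Code each individual 1 if they carry any rare minor allele
    across the group of sites, else 0 (the 'collapsing' step)."""
    return [1 if any(g > 0 for g in row) else 0 for row in genotypes]

# Hypothetical rare-variant genotypes for 4 cases and 4 controls.
cases    = [[0, 1, 0], [0, 0, 0], [2, 0, 0], [0, 0, 1]]
controls = [[0, 0, 0], [0, 0, 0], [1, 0, 0], [0, 0, 0]]

# Group tests then compare collective carrier frequencies between groups.
carrier_rate_cases = sum(collapse(cases)) / len(cases)          # 0.75
carrier_rate_controls = sum(collapse(controls)) / len(controls)  # 0.25
```

Because every carrier is coded identically, this step discards per-site effect differences, which is exactly the limitation of group tests that the abstract's genome-information content-based statistic is designed to address.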
ERIC Educational Resources Information Center
Garcia-Quintana, Roan A.; Johnson, Lynne M.
Three different computational procedures for equating two forms of a test were applied to a pair of mathematics tests to compare the results of the three procedures. The tests that were being equated were two forms of the SRA Mastery Mathematics Tests. The common, linking test used for equating was the Comprehensive Tests of Basic Skills, Form S,…
Urinary tract infections associated with ureteral stents: A Review.
Liaw, A; Knudsen, B
2016-10-01
We review the literature on infections associated with ureteral stents and new technologies aimed at preventing them. Ureteral stent placement is one of the most common urologic procedures, but carries a comparatively high morbidity. Infection is one of the most common stent-associated morbidities. Several new stent materials and coatings have been proposed and tested to reduce stent-associated infections. We review the current methods of preventing bacterial infection, including antibiotic prophylaxis and minimising dwell time. We look at the science underlying infection and biofilm formation on stents. Several new stent materials and coatings are described, as well as the studies underlying their mechanism of action. While many promising ideas for new stent coatings and materials have been tested, no significant improvement to current polyurethane stent technology is commonly available or used. The basic principles of antibiotic prophylaxis at time of insertion, avoiding contamination, and minimising dwell times remain the best methods to prevent stent-associated infections.
Establishing a Cell-based Assay for Assessment of Cellular Metabolism on Chemical Toxicity
A major drawback of current in vitro chemical testing is that many commonly used cell lines lack chemical metabolism. To help address this challenge, we established a method for assessing the impact of cellular metabolism on chemical-based cellular toxicity. A commonly used h...
Mano, Junichi; Hatano, Shuko; Nagatomi, Yasuaki; Futo, Satoshi; Takabatake, Reona; Kitta, Kazumi
2018-03-01
Current genetically modified organism (GMO) detection methods allow for sensitive detection. However, a further increase in sensitivity will enable more efficient testing for large grain samples and reliable testing for processed foods. In this study, we investigated real-time PCR-based GMO detection methods using a large amount of DNA template. We selected target sequences that are commonly introduced into many kinds of GM crops, i.e., 35S promoter and nopaline synthase (NOS) terminator. This makes the newly developed method applicable to a wide range of GMOs, including some unauthorized ones. The estimated LOD of the new method was 0.005% of GM maize events; to the best of our knowledge, this method is the most sensitive among the GM maize detection methods for which the LOD was evaluated in terms of GMO content. A 10-fold increase in the DNA amount as compared with the amount used under common testing conditions gave an approximately 10-fold reduction in the LOD without PCR inhibition. Our method is applicable to various analytical samples, including processed foods. The use of other primers and fluorescence probes would permit highly sensitive detection of various recombinant DNA sequences besides the 35S promoter and NOS terminator.
Development of synthetic nuclear melt glass for forensic analysis.
Molgaard, Joshua J; Auxier, John D; Giminaro, Andrew V; Oldham, C J; Cook, Matthew T; Young, Stephen A; Hall, Howard L
A method for producing synthetic debris similar to the melt glass produced by nuclear surface testing is demonstrated. Melt glass from the first nuclear weapon test (commonly referred to as trinitite) is used as the benchmark for this study. These surrogates can be used to simulate a variety of scenarios and will serve as a tool for developing and validating forensic analysis methods.
ERIC Educational Resources Information Center
Haebara, Tomokazu
When several ability scales in item response models are separately derived from different test forms administered to different samples of examinees, these scales must be equated to a common scale because their units and origins are arbitrarily determined and generally different from scale to scale. A general method for equating logistic ability…
Application of the Booth-Kautzmann method for the determination of N-2 packing leakage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burkhart, D.M.; Milton, J.W.; Fawcett, S.T.
1995-06-01
To accurately determine turbine cycle heat rate, leakage past the N-2 steam seal packing must be determined on turbines with both HP and IP turbines contained within a common high pressure casing. N-2 packing leakage can be determined by the Booth-Kautzmann Method with instrumentation commonly used to determine the HP and IP turbine efficiency. The only additional requirements are changes to the main steam and/or hot reheat steam conditions. This paper discusses the actual test results using the Booth-Kautzmann test procedure on three natural gas fired units. The test results demonstrate the added advantage of having at least three N-2 test runs, the stability requirements for repeatable test runs, and the test procedures used to determine leakage results. The sensitivity of the assumed N-2 enthalpy is also addressed. Utilizing Martin's Formula with a series of N-2 leakage test runs is shown to serve as both a leakage prediction tool and a packing clearance approximation tool. It is concluded that the Booth-Kautzmann Method for determination of N-2 packing leakage should be utilized whenever HP and IP turbine efficiency is determined. The two or three additional hours invested in the test runs are well worth the information gained on the performance of the N-2 packing.
Altomare, Christopher; Kinzler, Eric R; Buchhalter, August R; Cone, Edward J; Costantino, Anthony
The US Food and Drug Administration (FDA) considers the development of abuse-deterrent formulations of solid oral dosage forms a public health priority and has outlined a series of premarket studies that should be performed prior to submitting an application to the Agency. Category 1 studies are performed to characterize whether the abuse-deterrent properties of a new formulation can be easily defeated. Study protocols are designed to evaluate common abuse patterns of prescription medications as well as more advanced methods that have been reported on drug abuse websites and forums. Because FDA believes Category 1 testing should fully characterize the abuse-deterrent characteristics of an investigational formulation, Category 1 testing is time consuming and requires specialized laboratory resources as well as advanced knowledge of prescription medication abuse. Recent Advisory Committee meetings at FDA have shown that Category 1 tests play a critical role in FDA's evaluation of an investigational formulation. In this article, we will provide a general overview of the methods of manipulation and routes of administration commonly utilized by prescription drug abusers, how those methods and routes are evaluated in a laboratory setting, and discuss data intake, analysis, and reporting to satisfy FDA's Category 1 testing requirements.
Systematic evaluation of non-animal test methods for skin sensitisation safety assessment.
Reisinger, Kerstin; Hoffmann, Sebastian; Alépée, Nathalie; Ashikaga, Takao; Barroso, Joao; Elcombe, Cliff; Gellatly, Nicola; Galbiati, Valentina; Gibbs, Susan; Groux, Hervé; Hibatallah, Jalila; Keller, Donald; Kern, Petra; Klaric, Martina; Kolle, Susanne; Kuehnl, Jochen; Lambrechts, Nathalie; Lindstedt, Malin; Millet, Marion; Martinozzi-Teissier, Silvia; Natsch, Andreas; Petersohn, Dirk; Pike, Ian; Sakaguchi, Hitoshi; Schepky, Andreas; Tailhardat, Magalie; Templier, Marie; van Vliet, Erwin; Maxwell, Gavin
2015-02-01
The need for non-animal data to assess skin sensitisation properties of substances, especially cosmetics ingredients, has spawned the development of many in vitro methods. As it is widely believed that no single method can provide a solution, the Cosmetics Europe Skin Tolerance Task Force has defined a three-phase framework for the development of a non-animal testing strategy for skin sensitisation potency prediction. The results of the first phase, a systematic evaluation of 16 test methods, are presented here. This evaluation involved generation of data on a common set of ten substances in all methods and systematic collation of information, including the level of standardisation, existing test data, potential for throughput, transferability and accessibility, in cooperation with the test method developers. A workshop was held with the test method developers to review the outcome of this evaluation and to discuss the results. The evaluation informed the prioritisation of test methods for the next phase of the non-animal testing strategy development framework. Ultimately, the testing strategy, combined with bioavailability and skin metabolism data and exposure consideration, is envisaged to allow establishment of a data integration approach for skin sensitisation safety assessment of cosmetic ingredients.
Kanamori, Hajime; Rutala, William A; Gergen, Maria F; Sickbert-Bennett, Emily E; Weber, David J
2018-05-07
Susceptibility to germicides for carbapenem/colistin-resistant Enterobacteriaceae is poorly described. We investigated the efficacy of multiple germicides against these emerging antibiotic-resistant pathogens using the disc-based quantitative carrier test method that can produce results more similar to those encountered in healthcare settings than a suspension test. Our study results demonstrated that germicides commonly used in healthcare facilities likely will be effective against carbapenem/colistin-resistant Enterobacteriaceae when used appropriately in healthcare facilities. Copyright © 2018 American Society for Microbiology.
Chan, Cheng Leng; Rudrappa, Sowmya; Ang, Pei San; Li, Shu Chuen; Evans, Stephen J W
2017-08-01
The ability to detect safety concerns from spontaneous adverse drug reaction reports in a timely and efficient manner remains important in public health. This paper explores the behaviour of the Sequential Probability Ratio Test (SPRT) and its ability to detect signals of disproportionate reporting (SDRs) in the Singapore context. We used SPRT with a combination of two hypothesised relative risks (hRRs) of 2 and 4.1 to detect signals of both common and rare adverse events in our small database. We compared SPRT with other methods, namely the reporting odds ratio (ROR), Bayesian Confidence Propagation Neural Network (BCPNN) and Gamma Poisson Shrinker (GPS), in terms of the number of signals detected and whether labelled adverse drug reactions were detected or the reaction terms were considered serious. The SPRT produced 2187 signals in common with all methods, 268 unique signals, and 70 signals in common with at least one other method; it did not produce signals in 178 cases where two other methods detected them, and 403 signals were unique to one of the other methods. In terms of sensitivity, ROR performed better than the other methods, but the SPRT method found more new signals. The performances of the methods were similar for negative predictive value and specificity. Using a combination of hRRs for SPRT could be a useful screening tool for regulatory agencies, and more detailed investigation of the medical utility of the system is merited.
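The SPRT screening described above can be illustrated with a Poisson-based log-likelihood ratio compared against Wald's classical decision boundaries. This is a hedged sketch only, not the authors' implementation: the function names, default error rates, and the observed/expected-count inputs are assumptions for illustration.

```python
import math

def sprt_llr(observed, expected, h_rr):
    """Poisson-based SPRT log-likelihood ratio for one drug-event pair.

    Compares H1 (relative risk = h_rr) against H0 (relative risk = 1),
    given the observed report count and the count expected under H0.
    """
    return observed * math.log(h_rr) - expected * (h_rr - 1)

def sprt_decision(observed, expected, h_rr, alpha=0.05, beta=0.20):
    """Wald boundaries: flag a signal, keep monitoring, or clear the pair."""
    upper = math.log((1 - beta) / alpha)   # cross above: accept H1 (signal)
    lower = math.log(beta / (1 - alpha))   # cross below: accept H0
    llr = sprt_llr(observed, expected, h_rr)
    if llr >= upper:
        return "signal"
    if llr <= lower:
        return "no signal"
    return "continue"
```

With a hypothesised relative risk of 2, twenty observed reports against five expected would cross the upper boundary, while a single report would cross the lower one; counts in between leave the test running as more reports accumulate.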
Common pitfalls in statistical analysis: The perils of multiple testing
Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc
2016-01-01
Multiple testing refers to situations where a dataset is subjected to statistical testing multiple times - either at multiple time-points or through multiple subgroups or for multiple end-points. This amplifies the probability of a false-positive finding. In this article, we look at the consequences of multiple testing and explore various methods to deal with this issue. PMID:27141478
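The inflation described above is easy to quantify, and Holm's step-down procedure is one common remedy the article's theme points to. A minimal sketch, assuming independent tests; the function names and example p-values are illustrative, not from the article:

```python
def familywise_error(alpha, n_tests):
    """Probability of at least one false positive across n independent tests,
    each run at significance level alpha."""
    return 1 - (1 - alpha) ** n_tests

def holm_bonferroni(p_values, alpha=0.05):
    """Holm's step-down correction: returns a reject/keep flag per hypothesis.

    The smallest p-value is compared against alpha/n, the next against
    alpha/(n-1), and so on; testing stops at the first failure.
    """
    order = sorted(range(len(p_values)), key=lambda i: p_values[i])
    rejected = [False] * len(p_values)
    for rank, i in enumerate(order):
        if p_values[i] <= alpha / (len(p_values) - rank):
            rejected[i] = True
        else:
            break
    return rejected
```

For instance, running 14 independent tests at the conventional 0.05 level already gives roughly a 51% chance of at least one spurious "significant" finding.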
Evaluation of Asphalt Mixture Low-Temperature Performance in Bending Beam Creep Test.
Pszczola, Marek; Jaczewski, Mariusz; Rys, Dawid; Jaskula, Piotr; Szydlowski, Cezary
2018-01-10
Low-temperature cracking is one of the most common road pavement distress types in Poland. While bitumen performance can be evaluated in detail using bending beam rheometer (BBR) or dynamic shear rheometer (DSR) tests, none of the normalized test methods gives a comprehensive representation of low-temperature performance of the asphalt mixtures. This article presents the Bending Beam Creep test performed at temperatures from -20 °C to +10 °C in order to evaluate the low-temperature performance of asphalt mixtures. Both validation of the method and its utilization for the assessment of eight types of wearing courses commonly used in Poland were described. The performed test indicated that the source of bitumen and its production process (and not necessarily only bitumen penetration) had a significant impact on the low-temperature performance of the asphalt mixtures, comparable to the impact of binder modification (neat, polymer-modified, highly modified) and the aggregate skeleton used in the mixture (Stone Mastic Asphalt (SMA) vs. Asphalt Concrete (AC)). Obtained Bending Beam Creep test results were compared with the BBR bitumen test. Regression analysis confirmed that performing solely bitumen tests is insufficient for comprehensive low-temperature performance analysis.
Evaluation of Asphalt Mixture Low-Temperature Performance in Bending Beam Creep Test
Rys, Dawid; Jaskula, Piotr; Szydlowski, Cezary
2018-01-01
Low-temperature cracking is one of the most common road pavement distress types in Poland. While bitumen performance can be evaluated in detail using bending beam rheometer (BBR) or dynamic shear rheometer (DSR) tests, none of the normalized test methods gives a comprehensive representation of low-temperature performance of the asphalt mixtures. This article presents the Bending Beam Creep test performed at temperatures from −20 °C to +10 °C in order to evaluate the low-temperature performance of asphalt mixtures. Both validation of the method and its utilization for the assessment of eight types of wearing courses commonly used in Poland were described. The performed test indicated that the source of bitumen and its production process (and not necessarily only bitumen penetration) had a significant impact on the low-temperature performance of the asphalt mixtures, comparable to the impact of binder modification (neat, polymer-modified, highly modified) and the aggregate skeleton used in the mixture (Stone Mastic Asphalt (SMA) vs. Asphalt Concrete (AC)). Obtained Bending Beam Creep test results were compared with the BBR bitumen test. Regression analysis confirmed that performing solely bitumen tests is insufficient for comprehensive low-temperature performance analysis. PMID:29320443
Evaluation of Testing Methods to Develop Test Requirements for a Workstation Table Safety Standard
DOT National Transportation Integrated Search
2010-10-12
Investigations of passenger train accidents have revealed serious safety hazards associated with the thin, rigid tops of workstation tables, which are common fixtures aboard rail cars. Thoracic and abdominal injuries caused by occupant impact with wo...
Toxicity tests are a common method for determining whether sediment contaminants represent an environmental risk. Toxicity tests indicate if contaminants in sediments are bioavailable and capable of causing adverse biological effects to whole aquatic organisms. Several environmen...
Common Methods for Security Risk Analysis
2005-01-12
recognized in the others. In Canada, three firms have been accredited as IT Security Evaluation and Testing (ITSET) Facilities under ISO/IEC 17025-1999...harmonized security standards such as the Common Criteria and ISO 17799 may further increase the applicability of the TRA approach. 3.4.8 MOST AUTOMATION...create something more suitable, the Common Criteria with Mutual Recognition Agreement (MRA) signed in October 1998. The CC became an ISO standard
Method Analysis of Microbial Resistant Gypsum Products
Abstract: Several commercially available gypsum products are marketed as microbial-resistant. During previous test method research on a microbial resistant gypsum wallboard study, a common theme from both stakeholders and product vendors was the need for a unified and accepted m...
20170824 - Enhancing the Application of Alternative Methods Through Global Cooperation (WC10)
Progress towards the development and translation of alternative testing methods to safety-related decision making is a common goal that crosses organizational, stakeholder, and international boundaries. The challenge is that different organizations have different missions, differ...
ERIC Educational Resources Information Center
Reinisch, Bianca; Krell, Moritz; Hergert, Susann; Gogolin, Sarah; Krüger, Dirk
2017-01-01
Students' and pre-service teachers' conceptions of scientists have been assessed in a variety of studies. One of the most commonly used instruments is the Draw-A-Scientist Test (DAST) which offers the advantage that no verbal skills are needed by the participants. In some studies, methodical challenges related to the DAST have been discussed; for…
Test methods for textile composites
NASA Technical Reports Server (NTRS)
Minguet, Pierre J.; Fedro, Mark J.; Gunther, Christian K.
1994-01-01
Various test methods commonly used for measuring properties of tape laminate composites were evaluated to determine their suitability for the testing of textile composites. Three different types of textile composites were utilized in this investigation: two-dimensional (2-D) triaxial braids, stitched uniweave fabric, and three-dimensional (3-D) interlock woven fabric. Four 2-D braid architectures, five stitched laminates, and six 3-D woven architectures were tested. All preforms used AS4 fibers and were resin-transfer-molded with Shell RSL-1895 epoxy resin. Ten categories of material properties were investigated: tension, open-hole tension, compression, open-hole compression, in-plane shear, filled-hole tension, bolt bearing, interlaminar tension, interlaminar shear, and interlaminar fracture toughness. Different test methods and specimen sizes were considered for each category of test. Strength and stiffness properties obtained with each of these methods are documented in this report for all the material systems mentioned above.
DOT National Transportation Integrated Search
2017-04-01
Geotextiles are commonly used in pavements, earth retaining structures, and landfills, as well as other geotechnical applications. Various : tests are conducted to evaluate and classify geotextiles to determine their suitability for different applica...
DOT National Transportation Integrated Search
2014-01-01
The Maine Department of Transportation (MaineDOT) has noted poor correlation between predicted pile resistances : calculated using commonly accepted design methods and measured pile resistance from dynamic pile load tests (also : referred to as high ...
Mathur, Sunil; Sadana, Ajit
2015-12-01
We present a rank-based test statistic for the identification of differentially expressed genes using a distance measure. The proposed test statistic is highly robust against extreme values and does not assume the distribution of parent population. Simulation studies show that the proposed test is more powerful than some of the commonly used methods, such as paired t-test, Wilcoxon signed rank test, and significance analysis of microarray (SAM) under certain non-normal distributions. The asymptotic distribution of the test statistic, and the p-value function are discussed. The application of proposed method is shown using a real-life data set. © The Author(s) 2011.
Fabrication of liquid-rocket thrust chambers by electroforming
NASA Technical Reports Server (NTRS)
Duscha, R. A.; Kazaroff, J. M.
1974-01-01
Electroforming has proven to be an excellent fabrication method for building liquid rocket regeneratively cooled thrust chambers. NASA sponsored technology programs have investigated both common and advanced methods. Using common procedures, several cooled spool pieces and thrust chambers have been made and successfully tested. The designs were made possible through the versatility of the electroforming procedure, which is not limited to simple geometric shapes. An advanced method of electroforming was used to produce a wire-wrapped, composite, pressure-loaded electroformed structure, which greatly increased the strength of the structure while still retaining the advantages of electroforming.
Chen, Qi; Chen, Quan; Luo, Xiaobing
2014-09-01
In recent years, due to the fast development of high power light-emitting diode (LED), its lifetime prediction and assessment have become a crucial issue. Although the in situ measurement has been widely used for reliability testing in laser diode community, it has not been applied commonly in LED community. In this paper, an online testing method for LED life projection under accelerated reliability test was proposed and the prototype was built. The optical parametric data were collected. The systematic error and the measuring uncertainty were calculated to be within 0.2% and within 2%, respectively. With this online testing method, experimental data can be acquired continuously and sufficient amount of data can be gathered. Thus, the projection fitting accuracy can be improved (r(2) = 0.954) and testing duration can be shortened.
Proficiency Standards and Cut-Scores for Language Proficiency Tests.
ERIC Educational Resources Information Center
Moy, Raymond H.
The problem of standard setting on language proficiency tests is often approached by the use of norms derived from the group being tested, a process commonly known as "grading on the curve." One particular problem with this ad hoc method of standard setting is that it will usually result in a fluctuating standard dependent on the particular group…
A Graphical Approach to Evaluating Equating Using Test Characteristic Curves
ERIC Educational Resources Information Center
Wyse, Adam E.; Reckase, Mark D.
2011-01-01
An essential concern in the application of any equating procedure is determining whether tests can be considered equated after the tests have been placed onto a common scale. This article clarifies one equating criterion, the first-order equity property of equating, and develops a new method for evaluating equating that is linked to this…
Criteria for establishing water quality standards that are protective of all native biota are generally based upon laboratory toxicity tests. These test utilize common model organisms that have established test methods. However, only a small portion of species have established ...
The Reading Span Test and Its Predictive Power for Reading Comprehension Ability
ERIC Educational Resources Information Center
Friedman, Naomi P.; Miyake, Akira
2004-01-01
This study had two major goals: to test the effect of administration method on the criterion validity of a commonly used working memory span test, the reading span task, and to examine the relationship between processing and storage in this task. With respect to the first goal, although experimenter- and participant-administered reading span tasks…
Clinical experimental stress studies: methods and assessment.
Bali, Anjana; Jaggi, Amteshwar Singh
2015-01-01
Stress is a state of threatened homeostasis during which a variety of adaptive processes are activated to produce physiological and behavioral changes. Stress induction methods are pivotal for understanding these physiological or pathophysiological changes in the body in response to stress. Furthermore, these methods are also important for the development of novel pharmacological agents for stress management. The well-described methods to induce stress in humans include the cold pressor test, Trier Social Stress Test, Montreal Imaging Stress Task, Maastricht Acute Stress Test, CO2 challenge test, Stroop test, Paced Auditory Serial Addition Task, noise stress, and Mannheim Multicomponent Stress Test. Stress assessment in humans is done by measuring biochemical markers such as cortisol, cortisol awakening response, dexamethasone suppression test, salivary α-amylase, plasma/urinary norepinephrine, norepinephrine spillover rate, and interleukins. Physiological and behavioral changes such as galvanic skin response, heart rate variability, pupil size, and muscle and/or skin sympathetic nerve activity (microneurography) and cardiovascular parameters such as heart rate, blood pressure, and self-reported anxiety are also monitored to assess stress response. This present review describes these commonly employed methods to induce stress in humans along with stress assessment methods.
Cytomegalovirus infection in the bone marrow transplant patient.
Bhat, Vivek; Joshi, Amit; Sarode, Rahul; Chavan, Preeti
2015-12-24
Cytomegalovirus (CMV) infection is an important contributor to the morbidity and mortality associated with bone marrow transplantation (BMT). Infection may lead to CMV disease involving multiple organs, such as pneumonia, gastroenteritis, retinitis, and central nervous system involvement. CMV seropositivity is an important risk factor, and approximately half of BMT recipients will develop clinically significant infection, most commonly in the first 100 days post-transplant. The commonly used tests to diagnose CMV infection in these patients include the pp65 antigenemia test and the CMV DNA polymerase chain reaction (PCR) assay. Because of its greater sensitivity and shorter turnaround time, the CMV PCR is nowadays the preferred test and serves as a main guide for pre-emptive therapy. Methods of CMV prevention include the use of blood products from seronegative donors or leukodepleted products. Prophylaxis or pre-emptive therapy strategies for CMV prevention may be used post-transplant, with the latter becoming more common. The commonly used antivirals for pre-emptive therapy and CMV disease management include intravenous ganciclovir and foscarnet. The role of intravenous immunoglobulin, although it is used commonly in CMV pneumonia, is not clear.
Shallow Reflection Method for Water-Filled Void Detection and Characterization
NASA Astrophysics Data System (ADS)
Zahari, M. N. H.; Madun, A.; Dahlan, S. H.; Joret, A.; Hazreek, Z. A. M.; Mohammad, A. H.; Izzaty, R. A.
2018-04-01
Shallow investigation is crucial for characterizing the subsurface voids commonly encountered in civil engineering, and one commonly used technique is the seismic-reflection method. An assessment of the effectiveness of such an approach is critical to determine whether the quality of the works meets the prescribed requirements. Conventional quality testing suffers limitations, including limited coverage (both area and depth) and problems with resolution quality. Traditionally, quality assurance measurements use laboratory and in-situ invasive and destructive tests. However, geophysical approaches, which are typically non-invasive and non-destructive, offer a method by which improvement of detection can be measured in a cost-effective way. As seismic reflection has proved useful for assessing void characteristics, this paper evaluates the application of the shallow seismic-reflection method in characterizing the properties of a water-filled void at 0.34 m depth, specifically its detection and characterization using 2-dimensional tomography.
Korean Guidelines for Colorectal Cancer Screening and Polyp Detection
Lee, Bo-In; Hong, Sung Pil; Kim, Seong-Eun; Kim, Se Hyung; Hong, Sung Noh; Yang, Dong-Hoon; Shin, Sung Jae; Lee, Suck-Ho; Park, Dong Il; Kim, Young-Ho; Kim, Hyun Jung; Yang, Suk-Kyun; Kim, Hyo Jong; Jeon, Hae Jeong
2012-01-01
Colorectal cancer is now the second most common cancer in males and the fourth most common cancer in females in Korea. Since most colorectal cancers occur after the prolonged transformation of adenomas into carcinomas, early detection and removal of colorectal adenomas is one of the most effective methods to prevent colorectal cancer. Considering the increasing incidence of colorectal cancer and polyps in Korea, it is very important to establish a Korean guideline for colorectal cancer screening and polyp detection. The guideline was developed by the Korean Multi-Society Task Force, and we sought to establish it by evidence-based methods. Parts of the statements were drawn from systematic reviews and meta-analyses. Herein we discuss the epidemiology of colorectal cancers and adenomas in Korea and optimal methods for screening of colorectal cancer and detection of adenomas, including fecal occult blood tests, radiologic tests, and endoscopic examinations. PMID:22741131
Identification of polar volatile organic compounds in consumer products and common microenvironments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wallace, L.A.; Nelson, W.C.; Pellizzari, E.
1991-03-01
Polar volatile organic compounds were identified in the headspace of 31 fragrance products such as perfumes, colognes and soaps. About 150 different chemicals were identified in a semiquantitative fashion, using two methods to analyze the headspace: direct injection into a gas chromatograph and collection by an evacuated canister, each followed by GC-MS analysis. The canister method displayed low recoveries for most of the 25 polar chemical standards tested. However, reconstructed ion chromatograms (RICs) from the canister showed good agreement with RICs from the direct injection method except for some high boiling point compounds. Canister samples collected in 15 microenvironments expected to contain the fragrance products tested (potpourri stores, fragrance sections of department stores, etc.) showed relatively low concentrations of most of these polar chemicals compared with certain common nonpolar chemicals. The results presented will be useful for models of personal exposure and indoor air quality.
Vital-Durand, F
1996-01-01
Acuity cards are being more commonly used in clinical and screening practice. The author describes his experience from over 6000 infants tested with the method, using two commercially available sets of cards to provide users with comprehensive guidelines to allow them to get the most out of this useful test.
What Food and Feeding Rates are Optimum for the Chironomus dilutus Sediment Toxicity Test Method?
Laboratory tests with benthic macroinvertebrates are commonly used to assess the toxicity of both contaminated sediments and individual chemicals. Among the standard procedures for benthic macroinvertebrates are 10-d, 20-d, and life cycle exposures using the midge, Chironomus ...
Determining irrigation distribution uniformity and efficiency for nurseries
R. Thomas Fernandez
2010-01-01
A simple method for testing the distribution uniformity of overhead irrigation systems is described. The procedure is described step-by-step along with an example. Other uses of distribution uniformity testing are presented, as well as common situations that affect distribution uniformity and how to alleviate them.
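A common way to express overhead-irrigation uniformity from a catch-can test is low-quarter distribution uniformity: the mean of the lowest quarter of catch volumes divided by the overall mean. A minimal sketch, assuming this standard metric is the one intended; the function name and sample volumes are illustrative, not from the report:

```python
def low_quarter_du(catch_volumes):
    """Low-quarter distribution uniformity from catch-can volumes.

    DU = (mean of the lowest 25% of catches) / (mean of all catches).
    Values near 1.0 indicate uniform coverage; low values flag dry spots.
    """
    ordered = sorted(catch_volumes)
    n_low = max(1, len(ordered) // 4)          # size of the low quarter
    low_quarter_mean = sum(ordered[:n_low]) / n_low
    overall_mean = sum(ordered) / len(ordered)
    return low_quarter_mean / overall_mean
```

For example, eight cans catching 10, 12, 8, 10, 14, 12, 10 and 4 volume units give a DU of 0.6, signalling that the driest areas receive well under the average application.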
Masiri, Jongkit; Barrios-Lopez, Brianda; Benoit, Lora; Tamayo, Joshua; Day, Jeffrey; Nadala, Cesar; Sung, Shao-Lei; Samadpour, Mansour
2016-03-01
Allergies to cow's milk are very common and can present as life-threatening anaphylaxis. Consequently, food labeling legislation mandates that foods containing milk residues, including casein and/or β-lactoglobulin, provide an indication of such on the product label. Because contamination with either component independent of the other can occur during food manufacturing, effective allergen management measures for containment of milk residues necessitate the use of dual screening methods. To assist the food industry in improving food safety practices, we have developed a rapid lateral flow immunoassay test kit that reliably reports both residues down to 0.01 μg per swab and 0.1 ppm of protein for foods. The assay utilizes both sandwich and competitive format test lines and is specific for bovine milk residues. Selectivity testing using a panel of matrices with potentially interfering substances, including commonly used sanitizing agents, indicated a reduction in the limit of detection of one- to fourfold. With food, residues were easily detected in all cow's milk-based foods tested, but goat and sheep milk residues were not detected. Specificity analysis revealed no cross-reactivity with common commodities, with the exception of kidney beans when present at high concentrations (> 1%). The development of a highly sensitive and rapid test method capable of detecting trace amounts of casein and/or β-lactoglobulin should aid food manufacturers and regulatory agencies in monitoring for milk allergens in environmental and food samples.
Kernel Equating Under the Non-Equivalent Groups With Covariates Design
Bränberg, Kenny
2015-01-01
When equating two tests, the traditional approach is to use common test takers and/or common items. Here, the idea is to use variables correlated with the test scores (e.g., school grades and other test scores) as a substitute for common items in a non-equivalent groups with covariates (NEC) design. This is performed in the framework of kernel equating and with an extension of the method developed for post-stratification equating in the non-equivalent groups with anchor test design. Real data from a college admissions test were used to illustrate the use of the design. The equated scores from the NEC design were compared with equated scores from the equivalent group (EG) design, that is, equating with no covariates as well as with equated scores when a constructed anchor test was used. The results indicate that the NEC design can produce lower standard errors compared with an EG design. When covariates were used together with an anchor test, the smallest standard errors were obtained over a large range of test scores. The results obtained, that an EG design equating can be improved by adjusting for differences in test score distributions caused by differences in the distribution of covariates, are useful in practice because not all standardized tests have anchor tests. PMID:29881012
Kernel Equating Under the Non-Equivalent Groups With Covariates Design.
Wiberg, Marie; Bränberg, Kenny
2015-07-01
When equating two tests, the traditional approach is to use common test takers and/or common items. Here, the idea is to use variables correlated with the test scores (e.g., school grades and other test scores) as a substitute for common items in a non-equivalent groups with covariates (NEC) design. This is performed in the framework of kernel equating and with an extension of the method developed for post-stratification equating in the non-equivalent groups with anchor test design. Real data from a college admissions test were used to illustrate the use of the design. The equated scores from the NEC design were compared with equated scores from the equivalent group (EG) design, that is, equating with no covariates as well as with equated scores when a constructed anchor test was used. The results indicate that the NEC design can produce lower standard errors compared with an EG design. When covariates were used together with an anchor test, the smallest standard errors were obtained over a large range of test scores. The results obtained, that an EG design equating can be improved by adjusting for differences in test score distributions caused by differences in the distribution of covariates, are useful in practice because not all standardized tests have anchor tests.
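Kernel equating smooths the score distributions before matching percentiles; as rough intuition for the percentile-matching step that equating methods share, here is a deliberately simplified equipercentile sketch. It is not the kernel method itself, and the function names and score lists are illustrative assumptions:

```python
def percentile_rank(scores, x):
    """Mid-percentile rank of score x within a list of observed scores."""
    below = sum(1 for s in scores if s < x)
    at = sum(1 for s in scores if s == x)
    return (below + 0.5 * at) / len(scores)

def equipercentile_equate(x, form_x_scores, form_y_scores):
    """Map a Form X score to the Form Y score with the closest percentile rank.

    Real equating methods smooth the two score distributions first (kernel
    equating uses Gaussian kernels); this discrete lookup skips that step.
    """
    target = percentile_rank(form_x_scores, x)
    return min(sorted(set(form_y_scores)),
               key=lambda y: abs(percentile_rank(form_y_scores, y) - target))
```

For example, a score of 3 sitting at the median of one form maps to whichever score sits at the median of the other form, regardless of the raw score scales.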
MEASUREMENT OF VOCS DESORBED FROM BUILDING MATERIALS--A HIGH TEMPERATURE DYNAMIC CHAMBER METHOD
Mass balance is a commonly used approach for characterizing the source and sink behavior of building materials. Because the traditional sink test methods evaluate the adsorption and desorption of volatile organic compounds (VOC) at ambient temperatures, the desorption process is...
Statistical evidence for common ancestry: Application to primates.
Baum, David A; Ané, Cécile; Larget, Bret; Solís-Lemus, Claudia; Ho, Lam Si Tung; Boone, Peggy; Drummond, Chloe P; Bontrager, Martin; Hunter, Steven J; Saucier, William
2016-06-01
Since Darwin, biologists have come to recognize that the theory of descent from common ancestry (CA) is very well supported by diverse lines of evidence. However, while the qualitative evidence is overwhelming, we also need formal methods for quantifying the evidential support for CA over the alternative hypothesis of separate ancestry (SA). In this article, we explore a diversity of statistical methods using data from the primates. We focus on two alternatives to CA, species SA (the separate origin of each named species) and family SA (the separate origin of each family). We implemented statistical tests based on morphological, molecular, and biogeographic data and developed two new methods: one that tests for phylogenetic autocorrelation while correcting for variation due to confounding ecological traits and a method for examining whether fossil taxa have fewer derived differences than living taxa. We overwhelmingly rejected both species and family SA with infinitesimal P values. We compare these results with those from two companion papers, which also found tremendously strong support for the CA of all primates, and discuss future directions and general philosophical issues that pertain to statistical testing of historical hypotheses such as CA. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.
The clinical evaluation of infantile nystagmus: What to do first and why
Bertsch, Morgan; Floyd, Michael; Kehoe, Taylor; Pfeifer, Wanda; Drack, Arlene V.
2017-01-01
Introduction: Infantile nystagmus has many causes, some life threatening. We determined the most common diagnoses in order to develop a testing algorithm. Methods: Retrospective chart review. Exclusion criteria were no nystagmus, nystagmus acquired after 6 months, or lack of examination. Data collected: pediatric eye examination findings, ancillary testing, order of testing, referral, and final diagnoses. Final diagnosis was defined as meeting published clinical criteria and/or confirmed by diagnostic testing. Patients with a diagnosis not meeting the definition were "unknown." Patients with incomplete testing were "incomplete." Patients with multiple plausible etiologies were "multifactorial." Patients with a negative complete workup were "motor." Results: 284 charts were identified; 202 met inclusion criteria. The 3 most common causes were albinism (19%), Leber congenital amaurosis (LCA) (14%), and non-LCA retinal dystrophy (13%). Anatomic retinal disorders comprised 10%, motor another 10%. The most common first test was MRI (74/202), with a diagnostic yield of 16%. For 28 MRI-first patients, nystagmus alone was the indication; for 46 MRI-first patients, other neurologic signs were present. 0/28 nystagmus-only patients had a diagnostic MRI, while 14/46 (30%) with neurologic signs did. Yield of ERG as first test was 56%, OCT 55%, and molecular genetic testing 47%. 90% of patients had an etiology identified. Conclusion: The most common causes of infantile nystagmus were retinal disorders (56%); however, the most common first test was brain MRI. For patients without other neurologic stigmata, complete pediatric eye examination, ERG, OCT, and molecular genetic testing had a higher yield than MRI scan. If MRI is not diagnostic, a complete ophthalmologic workup should be pursued. PMID:28177849
ERIC Educational Resources Information Center
Packer, Jaclyn; Reuschel, William
2018-01-01
Introduction: Accessibility of Voice over Internet Protocol (VoIP) systems was tested with a hands-on usability study and an online survey of VoIP users who are visually impaired. The survey examined the importance of common VoIP features, and both methods assessed difficulty in using those features. Methods: The usability test included four paid…
ERIC Educational Resources Information Center
Ades, A. E.; Lu, Guobing; Dias, Sofia; Mayo-Wilson, Evan; Kounali, Daphne
2015-01-01
Objective: Trials often report several similar outcomes measured on different test instruments. We explored a method for synthesising treatment effect information both within and between trials and for reporting treatment effects on a common scale as an alternative to standardisation. Study Design: We applied a procedure that simultaneously…
Sensitivity of Equated Aggregate Scores to the Treatment of Misbehaving Common Items
ERIC Educational Resources Information Center
Michaelides, Michalis P.
2010-01-01
The delta-plot method (Angoff, 1972) is a graphical technique used in the context of test equating for identifying common items with aberrant changes in their item difficulties across administrations or alternate forms. This brief research report explores the effects on equated aggregate scores when delta-plot outliers are either retained in or…
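The delta-plot idea is concrete enough to sketch. In a minimal Python version (the function names, the example data, and the distance cutoff are illustrative assumptions, not details from Angoff's paper), each item's proportion-correct is mapped to the delta scale, a principal-axis line is fit to the paired deltas from the two administrations, and items lying far from that line are flagged as outliers:

```python
import math
from statistics import NormalDist

def to_delta(p):
    # Angoff's delta scale: delta = 4 * z(1 - p) + 13, so harder items get larger deltas
    return 4.0 * NormalDist().inv_cdf(1.0 - p) + 13.0

def delta_plot_outliers(p_form_x, p_form_y, cutoff=1.5):
    """Flag common items whose paired deltas sit far from the principal axis.

    p_form_x, p_form_y: proportion-correct for the same items on two
    administrations. cutoff: perpendicular distance (in delta units) beyond
    which an item is flagged. Assumes the deltas are positively correlated.
    """
    x = [to_delta(p) for p in p_form_x]
    y = [to_delta(p) for p in p_form_y]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x) / n
    syy = sum((yi - my) ** 2 for yi in y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / n
    # slope/intercept of the principal (major) axis of the point cloud
    b = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4.0 * sxy ** 2)) / (2.0 * sxy)
    a = my - b * mx
    # perpendicular distance of each point from the line y = a + b*x
    dist = [abs(yi - a - b * xi) / math.sqrt(1.0 + b ** 2) for xi, yi in zip(x, y)]
    return [i for i, d in enumerate(dist) if d > cutoff]
```

Items whose difficulty changed consistently with the group fall near the line; a flagged index marks a common item whose relative difficulty drifted between administrations, the kind of item the report considers retaining or removing before equating.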
Yatabe, Yasushi; Kerr, Keith M; Utomo, Ahmad; Rajadurai, Pathmanathan; Tran, Van Khanh; Du, Xiang; Chou, Teh-Ying; Enriquez, Ma Luisa D; Lee, Geon Kook; Iqbal, Jabed; Shuangshoti, Shanop; Chung, Jin-Haeng; Hagiwara, Koichi; Liang, Zhiyong; Normanno, Nicola; Park, Keunchil; Toyooka, Shinichi; Tsai, Chun-Ming; Waring, Paul; Zhang, Li; McCormack, Rose; Ratcliffe, Marianne; Itoh, Yohji; Sugeno, Masatoshi; Mok, Tony
2015-03-01
The efficacy of epidermal growth factor receptor (EGFR) tyrosine kinase inhibitors in EGFR mutation-positive non-small-cell lung cancer (NSCLC) patients necessitates accurate, timely testing. Although EGFR mutation testing has been adopted by many laboratories in Asia, data are lacking on the proportion of NSCLC patients tested in each country, and the most commonly used testing methods. A retrospective survey of records from NSCLC patients tested for EGFR mutations during 2011 was conducted in 11 Asian Pacific countries at 40 sites that routinely performed EGFR mutation testing during that period. Patient records were used to complete an online questionnaire at each site. Of the 22,193 NSCLC patient records surveyed, 31.8% (95% confidence interval: 31.2%-32.5%) were tested for EGFR mutations. The rate of EGFR mutation positivity was 39.6% among the 10,687 cases tested. The majority of samples were biopsy and/or cytology samples (71.4%). DNA sequencing was the most commonly used testing method accounting for 40% and 32.5% of tissue and cytology samples, respectively. A pathology report was available only to 60.0% of the sites, and 47.5% were not members of a Quality Assurance Scheme. In 2011, EGFR mutation testing practices varied widely across Asia. These data provide a reference platform from which to improve the molecular diagnosis of NSCLC, and EGFR mutation testing in particular, in Asia.
A comparison of diagnostic tests for lactose malabsorption - which one is the best?
2009-01-01
Background: Perceived milk intolerance is a common complaint, and tests for lactose malabsorption (LM) are unreliable. This study assesses the agreement between diagnostic tests for LM and describes the diagnostic properties of the tests. Methods: Patients above 18 years of age with suspected LM were included. After oral intake of 25 g lactose, a combined test with measurement of serum glucose (s-glucose) and hydrogen (H2) and methane (CH4) in expired air was performed, and symptoms were recorded. In patients with discrepancies between the results, the combined test was repeated and a gene test for lactose non-persistence was added. The diagnosis of LM was based on an evaluation of all tests. The following tests were compared: increase in H2, CH4, H2+CH4, and H2+CH4x2 in expired air; increase in s-glucose; and symptoms. The agreement was calculated and the diagnostic properties described. Results: Sixty patients were included; seven (12%) had LM. The agreement (kappa-values) between the methods varied from 0.25 to 0.91. The best test was the lactose breath test with measurement of the increase in H2+CH4x2 in expired air. With a cut-off level of 18 ppm, the area under the ROC curve was 0.967 and sensitivity was 100%. This shows that measurement of CH4 in addition to H2 improves the diagnostic properties of the breath test. Conclusion: The agreement between commonly used methods for the diagnosis of LM was unsatisfactory. A lactose breath test with measurement of H2+CH4x2 in expired air had the best diagnostic properties. PMID:19878587
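The combined H2+CH4x2 score reduces to a one-line decision rule. A toy sketch (the function name is mine, and the "at or above the cut-off means positive" direction is an assumption; the abstract's wording on the cut-off is ambiguous):

```python
def lm_breath_test_positive(h2_rise_ppm, ch4_rise_ppm, cutoff_ppm=18.0):
    """Combined lactose breath test score: methane counts double (H2 + 2*CH4).
    Assumption: a combined rise at or above the cut-off is a positive test."""
    return h2_rise_ppm + 2.0 * ch4_rise_ppm >= cutoff_ppm
```

Doubling the weight on CH4 mirrors the abstract's finding that adding methane measurement improves the breath test's diagnostic properties.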
NASA Technical Reports Server (NTRS)
1972-01-01
The development of nondestructive testing procedures by NASA and the transfer of nondestructive testing to technology to civilian industry are discussed. The subjects presented are: (1) an overview of the nondestructive testing field, (2) NASA contributions to the field of nondestructive testing, (3) dissemination of NASA contributions, and (4) a transfer profile. Attachments are included which provide a brief description of common nondestructive testing methods and summarize the technology transfer reports involving NASA generated nondestructive testing technology.
NASA Technical Reports Server (NTRS)
Leppert, E. L.; Lee, S. H.; Day, F. D.; Chapman, P. C.; Wada, B. K.
1976-01-01
The Mariner Jupiter/Saturn (MJS) spacecraft was subjected to the traditional multipoint sine dwell (MPSD) modal test using 111 accelerometer channels, and also to single-point random (SPR) testing using 26 accelerometer channels, and the two methods are compared according to cost, schedule, and technical criteria. A measure of comparison between the systems was devised in terms of the cumulative difference in the kinetic energy distribution of the common accelerometers. The SPR and MPSD methods show acceptable agreement with respect to frequencies and modal damping. The merit of the SPR method is that the excitation points are minimized and the test article can be committed to other uses while data analysis is performed. The MPSD approach allows the validity of the data to be determined as the test progresses. Costs are about the same for the two methods.
Pitt, T; Sparrow, M; Warner, M; Stefanidou, M
2003-01-01
Methods: The susceptibility of 417 CF patient isolates of P aeruginosa from 17 hospitals to six commonly prescribed antibiotics was examined. Isolates were tested by an agar break point dilution method and E-tests according to British Society of Antimicrobial Chemotherapy guidelines. Genotyping of isolates was performed by XbaI DNA macrorestriction and pulsed field gel electrophoresis. Results: 38% of isolates were susceptible to all of the agents tested; almost half were resistant to gentamicin, compared with ceftazidime (39%), piperacillin (32%), ciprofloxacin (30%), tobramycin (10%), and colistin (3%). Approximately 40% were resistant to two or more compounds, with ceftazidime in combination with gentamicin, piperacillin, or ciprofloxacin being the most common cross resistances. Resistance rates were generally similar to those reported recently from the USA and Germany. A selection of resistant isolates proved to be predominantly genotypically distinct by XbaI DNA macrorestriction, but six pairs from three centres had similar genotypes. Conclusions: The level of resistance to front line antipseudomonal agents, with the exception of colistin, is disturbingly high. The prudent use of antimicrobial drugs and closer monitoring of the accumulation of resistant strain populations should be actively considered. PMID:12947141
ERIC Educational Resources Information Center
Venetsanou, Fotini; Kambas, Antonis; Ellinoudis, Theodoros; Fatouros, Ioannis; Giannakidou, Dimitra; Kourtessis, Thomas
2011-01-01
Developmental Coordination Disorder (DCD) is an important risk factor in the development of children that can have a significant academic and social impact. This reinforces the need for its timely identification using appropriate assessment methods and accurate screening tests. The commonly used standardized motor test for the DCD identification…
ERIC Educational Resources Information Center
Reardon, Sean F.; Kalogrides, Demetra; Ho, Andrew D.
2017-01-01
There is no comprehensive database of U.S. district-level test scores that is comparable across states. We describe and evaluate a method for constructing such a database. First, we estimate linear, reliability-adjusted linking transformations from state test score scales to the scale of the National Assessment of Educational Progress (NAEP). We…
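The first step described here, a linear linking from a state scale to the NAEP scale, can be sketched as a mean/SD match with a simple reliability adjustment. This is a hypothetical illustration under my own assumptions (raw score lists, a single reliability coefficient); the published method estimates these quantities from aggregate data:

```python
def linear_link_to_naep(state_scores, state_reliability, naep_mean, naep_sd):
    """Map state-scale scores onto the NAEP scale by matching the state mean
    to the NAEP mean and the reliability-adjusted (true-score) SD to the
    NAEP SD. Illustrative sketch only."""
    n = len(state_scores)
    m = sum(state_scores) / n
    var = sum((s - m) ** 2 for s in state_scores) / n
    true_sd = (var * state_reliability) ** 0.5  # disattenuate the observed SD
    slope = naep_sd / true_sd
    return [naep_mean + slope * (s - m) for s in state_scores]
```

Shrinking the observed SD toward the true-score SD before matching moments keeps measurement error in the state test from inflating the spread of the linked scores.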
ERIC Educational Resources Information Center
Cascallar, Alicia S.; Dorans, Neil J.
2005-01-01
This study compares two methods commonly used (concordance and prediction) to establish linkages between scores from tests of similar content given in different languages. Score linkages between the Verbal and Math sections of the SAT I and the corresponding sections of the Spanish-language admissions test, the Prueba de Aptitud Academica (PAA),…
The "Promise" of Three Methods of Word Association Analysis to L2 Lexical Research
ERIC Educational Resources Information Center
Zareva, Alla; Wolter, Brent
2012-01-01
The present study is an attempt to empirically test and compare the results of three methods of word association (WA) analysis. Two of the methods--namely, associative commonality and nativelikeness, and lexico-syntactic patterns of associative organization--have been traditionally used in both first language (L1) and second language (L2)…
New methods for new questions: obstacles and opportunities.
Foster, E Michael; Kalil, Ariel
2008-03-01
Two forces motivate this special section, "New Methods for New Questions in Developmental Psychology." First are recent developments in social science methodology and the increasing availability of those methods in common software packages. Second, at the same time psychologists' understanding of developmental phenomena has continued to grow. At their best, these developments in theory and methods work in tandem, fueling each other. Newer methods make it possible for scientists to better test their ideas; better ideas lead methodologists to techniques that better reflect, capture, and quantify the underlying processes. The articles in this special section represent a sampling of these new methods and new questions. The authors describe common themes in these articles and identify barriers to future progress, such as the lack of data sharing by and analytical training for developmentalists.
Ryan, Matthew W; Marple, Bradley F; Leatherman, Bryan; Mims, J Whit; Fornadley, John; Veling, Maria; Lin, Sandra Y
2014-10-01
Clinical practices for the diagnosis and treatment of allergic disease evolve over time in response to a variety of forces. The techniques used by various physician specialties are not clearly defined and may vary from published descriptions or recommendations in the literature. This work is a Web-based survey enrolling 250 U.S. physicians in the following specialties: otolaryngology (ENT), allergy-immunology (A/I), and primary care (PCP). Respondents reported that skin-prick testing is the most common diagnostic testing method, followed by in vitro specific immunoglobulin E (IgE) testing. ENTs were more likely to use intradermal testing compared to other specialties (p = 0.0003 vs A/I; p < 0.0001 vs PCP). Respondents reported a wide distribution in the number of allergens tested, regardless of testing method (range, 11 to >60). Significant use of home immunotherapy injections (defined as >10% of immunotherapy patients) ranged from 27% to 36% of physicians, with no statistically significant difference noted based upon specialty. PCPs reported greater use of sublingual immunotherapy (PCP, 68%; A/I, 45%; ENT, 35%; A/I vs PCP, p = 0.005; ENT vs PCP, p < 0.001). A variety of allergy testing and treatment methods are employed by U.S. physicians, with some differences noted based upon specialty. Home immunotherapy continues to be employed in allergy practices, and sublingual immunotherapy is a common form of delivery, especially in primary care practices. © 2014 ARS-AAOA, LLC.
Applying high-resolution melting (HRM) technology to identify five commonly used Artemisia species.
Song, Ming; Li, Jingjian; Xiong, Chao; Liu, Hexia; Liang, Junsong
2016-10-04
Many members of the genus Artemisia are important for medicinal purposes, with multiple pharmacological properties. Often, the herbal materials sold on the market are in processed forms, so they are difficult to authenticate. Routine testing and identification of these herbal materials should be performed to ensure that the raw materials used in pharmaceutical products are suitable for their intended use. In this study, five commonly used Artemisia species (Artemisia argyi, Artemisia annua, Artemisia lavandulaefolia, Artemisia indica, and Artemisia atrovirens) were analyzed using high resolution melting (HRM) analysis based on the internal transcribed spacer 2 (ITS2) sequences. The melting profiles of the ITS2 amplicons of the five closely related herbal species are clearly separated, so they can be differentiated by the HRM method. The method was further applied to authenticate commercial products in powdered form. HRM curves of all the commercial samples tested are similar to those of the botanical species as labeled. These congeneric medicinal products were also clearly separated using a neighbor-joining (NJ) tree. Therefore, the HRM method could provide an efficient and reliable authentication system to distinguish these commonly used Artemisia herbal products on the market and offer a technical reference for quality control of medicines in the drug supply chain.
Cummings, Kevin J.; Warnick, Lorin D.; Schukken, Ynte H.; Siler, Julie D.; Gröhn, Yrjo T.; Davis, Margaret A.; Besser, Tom E.; Wiedmann, Martin
2011-01-01
Abstract Data generated using different antimicrobial testing methods often have to be combined, but the equivalence of such results is difficult to assess. Here we compared two commonly used antimicrobial susceptibility testing methods, automated microbroth dilution and agar disk diffusion, for 8 common drugs, using 222 Salmonella isolates of serotypes Newport, Typhimurium, and 4,5,12:i-, which had been isolated from clinical salmonellosis cases among cattle and humans. Isolate classification corresponded well between tests, with 95% overall category agreement. Test results were significantly negatively correlated, and Spearman's correlation coefficients ranged from −0.98 to −0.38. Using Cox's proportional hazards model we determined that for most drugs, a 1 mm increase in zone diameter resulted in an estimated 20%–40% increase in the hazard of growth inhibition. However, additional parameters such as isolation year or serotype often impacted the hazard of growth inhibition as well. Comparison of economical feasibility showed that agar disk diffusion is clearly more cost-effective if the average sample throughput is small but that both methods are comparable at high sample throughput. In conclusion, for the Salmonella serotypes and antimicrobial drugs analyzed here, antimicrobial susceptibility data generated based on either test are qualitatively very comparable, and the current published break points for both methods are in excellent agreement. Economic feasibility clearly depends on the specific laboratory settings, and disk diffusion might be an attractive alternative for certain applications such as surveillance studies. PMID:21877930
Empirical methods for assessing meaningful neuropsychological change following epilepsy surgery.
Sawrie, S M; Chelune, G J; Naugle, R I; Lüders, H O
1996-11-01
Traditional methods for assessing the neurocognitive effects of epilepsy surgery are confounded by practice effects, test-retest reliability issues, and regression to the mean. This study employs 2 methods for assessing individual change that allow direct comparison of changes across both individuals and test measures. Fifty-one medically intractable epilepsy patients completed a comprehensive neuropsychological battery twice, approximately 8 months apart, prior to any invasive monitoring or surgical intervention. First, a Reliable Change (RC) index score was computed for each test score to take into account the reliability of that measure, and a cutoff score was empirically derived to establish the limits of statistically reliable change. These indices were subsequently adjusted for expected practice effects. The second approach used a regression technique to establish "change norms" along a common metric that models both expected practice effects and regression to the mean. The RC index scores provide the clinician with a statistical means of determining whether a patient's retest performance is "significantly" changed from baseline. The regression norms for change allow the clinician to evaluate the magnitude of a given patient's change on 1 or more variables along a common metric that takes into account the reliability and stability of each test measure. Case data illustrate how these methods provide an empirically grounded means for evaluating neurocognitive outcomes following medical interventions such as epilepsy surgery.
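The Reliable Change index described in this abstract follows a standard recipe: the Jacobson-Truax standard error of the difference plus a practice-effect adjustment. A small sketch, with illustrative numbers and a conventional z cutoff (the specific parameter values are my assumptions, not the study's):

```python
import math

def reliable_change(baseline, retest, sd_baseline, test_retest_r,
                    practice_effect=0.0, z_cutoff=1.645):
    """Practice-adjusted Reliable Change index.

    Uses SEdiff = sqrt(2) * SD1 * sqrt(1 - r), and subtracts the mean
    practice effect from the observed change before standardizing.
    """
    sem = sd_baseline * math.sqrt(1.0 - test_retest_r)  # SE of measurement
    se_diff = math.sqrt(2.0) * sem                      # SE of a difference score
    rc = (retest - baseline - practice_effect) / se_diff
    if rc <= -z_cutoff:
        label = "declined"
    elif rc >= z_cutoff:
        label = "improved"
    else:
        label = "no reliable change"
    return rc, label
```

Because RC is expressed in standard-error units of the specific measure, changes on tests with different reliabilities can be compared on the same metric, which is the point the abstract makes.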
HPV Testing of Head and Neck Cancer in Clinical Practice.
Robinson, Max
The pathology laboratory has a central role in providing human papillomavirus (HPV) tests for patients with head and neck cancer. There is an extensive literature around HPV testing and a large number of proprietary HPV tests, which makes the field difficult to navigate. This review provides a concise contemporary overview of the evidence around HPV testing in head and neck cancer and signposts key publications, guideline documents and the most commonly used methods in clinical practice.
Role of failure-mechanism identification in accelerated testing
NASA Technical Reports Server (NTRS)
Hu, J. M.; Barker, D.; Dasgupta, A.; Arora, A.
1993-01-01
Accelerated life testing techniques provide a short-cut method to investigate the reliability of electronic devices with respect to certain dominant failure mechanisms that occur under normal operating conditions. However, accelerated tests have often been conducted without knowledge of the failure mechanisms and without ensuring that the test accelerated the same mechanism as that observed under normal operating conditions. This paper summarizes common failure mechanisms in electronic devices and packages and investigates possible failure mechanism shifting during accelerated testing.
Large-mirror testing facility at the National Optical Astronomy Observatories.
NASA Astrophysics Data System (ADS)
Barr, L. D.; Coudé du Foresto, V.; Fox, J.; Poczulp, G. A.; Richardson, J.; Roddier, C.; Roddier, F.
1991-09-01
A method for testing the surfaces of large mirrors has been developed to be used even when conditions of vibration and thermal turbulence in the light path cannot be eliminated. The full aperture of the mirror under test is examined by means of a scatterplate interferometer that has the property of being a quasi-common-path method, although any means for obtaining interference fringes will do. The method uses a remotely operated CCD camera system to record the fringe pattern from the workpiece. The typical test is done with a camera exposure of about a millisecond to "freeze" the fringe pattern on the detector. Averaging up to 10 separate exposures effectively eliminates the turbulence effects. The method described provides the optician with complete numerical information and visual plots for the surface under test and the diffracted image the method will produce, all within a few minutes, to an accuracy of 0.01 μm measured peak-to-valley.
Beyond Bigotry: Teaching about Unconscious Prejudice
ERIC Educational Resources Information Center
Ghoshal, Raj Andrew; Lippard, Cameron; Ribas, Vanesa; Muir, Ken
2013-01-01
Researchers have demonstrated that unconscious prejudices around characteristics such as race, gender, and class are common, even among people who avow themselves unbiased. The authors present a method for teaching about implicit racial bias using online Implicit Association Tests. The authors do not claim that their method rids students of…
DOT National Transportation Integrated Search
2015-04-01
A laboratory study was conducted to develop guidelines for the Multiple Stress Creep Recovery (MSCR) test method for local conditions prevailing in Oklahoma. The study consisted of commonly used binders in Oklahoma, namely PG 64-22, PG 70-28, and...
Method for adjusting warp measurements to a different board dimension
William T. Simpson; John R. Shelly
2000-01-01
Warp in lumber is a common problem that occurs while lumber is being dried. In research or other testing programs, it is sometimes necessary to compare warp of different species or warp caused by different process variables. If lumber dimensions are not the same, then direct comparisons are not possible, and adjusting warp to a common dimension would be desirable so...
Pathway analysis with next-generation sequencing data.
Zhao, Jinying; Zhu, Yun; Boerwinkle, Eric; Xiong, Momiao
2015-04-01
Although pathway analysis methods have been developed and successfully applied to association studies of common variants, the statistical methods for pathway-based association analysis of rare variants have not been well developed. Many investigators observed highly inflated false-positive rates and low power in pathway-based tests of association of rare variants. The inflated false-positive rates and low true-positive rates of the current methods are mainly due to their lack of ability to account for gametic phase disequilibrium. To overcome these serious limitations, we develop a novel statistic that is based on the smoothed functional principal component analysis (SFPCA) for pathway association tests with next-generation sequencing data. The developed statistic has the ability to capture position-level variant information and account for gametic phase disequilibrium. By intensive simulations, we demonstrate that the SFPCA-based statistic for testing pathway association with either rare or common or both rare and common variants has the correct type 1 error rates. Also the power of the SFPCA-based statistic and 22 additional existing statistics are evaluated. We found that the SFPCA-based statistic has a much higher power than other existing statistics in all the scenarios considered. To further evaluate its performance, the SFPCA-based statistic is applied to pathway analysis of exome sequencing data in the early-onset myocardial infarction (EOMI) project. We identify three pathways significantly associated with EOMI after the Bonferroni correction. In addition, our preliminary results show that the SFPCA-based statistic has much smaller P-values to identify pathway association than other existing methods.
Common Cause Failure Modeling: Aerospace Versus Nuclear
NASA Technical Reports Server (NTRS)
Stott, James E.; Britton, Paul; Ring, Robert W.; Hark, Frank; Hatfield, G. Spencer
2010-01-01
Aggregate nuclear plant failure data is used to produce generic common-cause factors that are specifically for use in the common-cause failure models of NUREG/CR-5485. Furthermore, the models presented in NUREG/CR-5485 are specifically designed to incorporate two significantly distinct assumptions about the methods of surveillance testing from whence this aggregate failure data came. What are the implications of using these NUREG generic factors to model the common-cause failures of aerospace systems? Herein, the implications of using the NUREG generic factors in the modeling of aerospace systems are investigated in detail and strong recommendations for modeling the common-cause failures of aerospace systems are given.
Time and temperature dependent modulus of pyrrone and polyimide moldings
NASA Technical Reports Server (NTRS)
Lander, L. L.
1972-01-01
A method is presented by which the modulus obtained from a stress relaxation test can be used to estimate the modulus which would be obtained from a sonic vibration test. The method was applied to stress relaxation, sonic vibration, and high speed stress-strain data which was obtained on a flexible epoxy. The modulus as measured by the three test methods was identical for identical test times, and a change of test temperature was equivalent to a shift in the logarithmic time scale. An estimate was then made of the dynamic modulus of moldings of two Pyrrones and two polyimides, using stress relaxation data and the method of analysis which was developed for the epoxy. Over the common temperature range (350 to 500 K) in which data from both types of tests were available, the estimated dynamic modulus value differed by only a few percent from the measured value. As a result, it is concluded that, over the 500 to 700 K temperature range, the estimated dynamic modulus values are accurate.
Collection of quantitative chemical release field data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demirgian, J.; Macha, S.; Loyola Univ.
1999-01-01
Detection and quantitation of chemicals in the environment requires Fourier-transform infrared (FTIR) instruments that are properly calibrated and tested. This calibration and testing requires field testing using matrices that are representative of actual instrument use conditions. Three methods commonly used for developing calibration files and training sets in the field are a closed optical cell or chamber, a large-scale chemical release, and a small-scale chemical release. There is no best method. The advantages and limitations of each method should be considered in evaluating field results. Proper calibration characterizes the sensitivity of an instrument, its ability to detect a component in different matrices, and the quantitative accuracy and precision of the results.
Enabling multiplexed testing of pooled donor cells through whole-genome sequencing.
Chan, Yingleong; Chan, Ying Kai; Goodman, Daniel B; Guo, Xiaoge; Chavez, Alejandro; Lim, Elaine T; Church, George M
2018-04-19
We describe a method that enables the multiplex screening of a pool of many different donor cell lines. Our method accurately predicts each donor proportion from the pool without requiring the use of unique DNA barcodes as markers of donor identity. Instead, we take advantage of common single nucleotide polymorphisms, whole-genome sequencing, and an algorithm to calculate the proportions from the sequencing data. By testing using simulated and real data, we showed that our method robustly predicts the individual proportions from a mixed pool of numerous donors, thus enabling the multiplexed testing of diverse donor cells en masse. More information is available at https://pgpresearch.med.harvard.edu/poolseq/.
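The core estimation step can be sketched as a least-squares problem: each SNP's alternate-allele read fraction in the pool is modeled as a genotype-weighted mixture of donor proportions. This is a minimal sketch under my own assumptions (unconstrained least squares followed by clipping and renormalization); the authors' published algorithm may differ in its details:

```python
import numpy as np

def estimate_donor_proportions(genotype_dosages, pooled_alt_fraction):
    """Estimate donor mixing proportions from pooled sequencing.

    genotype_dosages: (n_snps, n_donors) alt-allele dosages (0, 1, or 2)
    from each donor's known genotype at common SNPs.
    pooled_alt_fraction: per-SNP alt-allele read fraction observed in the pool.
    Model: observed fraction ~= (dosage / 2) @ proportions.
    """
    A = np.asarray(genotype_dosages, dtype=float) / 2.0
    f = np.asarray(pooled_alt_fraction, dtype=float)
    w, *_ = np.linalg.lstsq(A, f, rcond=None)
    w = np.clip(w, 0.0, None)  # proportions cannot be negative
    return w / w.sum()         # ...and must sum to one
```

With enough common SNPs, the donors' genotype vectors are effectively unique signatures, which is why no engineered DNA barcodes are needed.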
A low cost method of testing compression-after-impact strength of composite laminates
NASA Technical Reports Server (NTRS)
Nettles, Alan T.
1991-01-01
A method was devised to test the compression strength of composite laminate specimens that are much thinner and wider than other tests require. The specimen can be up to 7.62 cm (3 in) wide and as thin as 1.02 mm (0.04 in). The best features of the Illinois Institute of Technology Research Institute (IITRI) fixture are combined with an antibuckling jig developed and used at the University of Dayton Research Institute to obtain a method of compression testing thin, wide test coupons on any 20 kip (or larger) loading frame. Up to 83 percent less composite material is needed for the test coupons compared with the most commonly used compression-after-impact (CAI) tests, which call for 48 ply thick (approx. 6.12 mm) test coupons. Another advantage of the new method is that composite coupons of the exact lay-up and thickness of production parts can be tested for CAI strength, thus yielding more meaningful results. This new method was used to compression test 8 and 16 ply laminates of T300/934 carbon/epoxy. These results were compared to those obtained using ASTM standard D 3410-87 (Celanese compression test). CAI testing was performed on IM6/3501-6, IM7/SP500, and IM7/F3900. The new test method and associated fixture work well and are a valuable asset to MSFC's damage tolerance program.
A Method for Generating Educational Test Items That Are Aligned to the Common Core State Standards
ERIC Educational Resources Information Center
Gierl, Mark J.; Lai, Hollis; Hogan, James B.; Matovinovic, Donna
2015-01-01
The demand for test items far outstrips the current supply. This increased demand can be attributed, in part, to the transition to computerized testing, but it is also linked to dramatic changes in how 21st-century educational assessments are designed and administered. One way to address this growing demand is with automatic item generation.…
Seed germination test for toxicity evaluation of compost: Its roles, problems and prospects.
Luo, Yuan; Liang, Jie; Zeng, Guangming; Chen, Ming; Mo, Dan; Li, Guoxue; Zhang, Difang
2018-01-01
Compost is commonly used for the growth of plants and the remediation of environmental pollution. It is important to evaluate the quality of compost, and the seed germination test is a powerful tool for examining the toxicity of compost, which is the most important aspect of that quality. The test is now widely adopted, but the main problem is that its results vary with the method and seed species used, which limits its development and application. Standardizing the methods and adopting model seed species can help to solve this problem. Additionally, according to the probabilistic theory of seed germination, the error caused by the analysis and judgment of the test results can be reduced. Here, we reviewed the roles, problems and prospects of the seed germination test in studies of compost. Copyright © 2017 Elsevier Ltd. All rights reserved.
Quirós, Elia; Felicísimo, Angel M; Cuartero, Aurora
2009-01-01
This work proposes a new method to classify multi-spectral satellite images based on multivariate adaptive regression splines (MARS) and compares this classification system with the more common parallelepiped and maximum likelihood (ML) methods. We apply the classification methods to the land cover classification of a test zone located in southwestern Spain. The basis of the MARS method and its associated procedures are explained in detail, and the area under the ROC curve (AUC) is compared for the three methods. The results show that the MARS method provides better results than the parallelepiped method in all cases, and it provides better results than the maximum likelihood method in 13 cases out of 17. These results demonstrate that the MARS method can be used in isolation or in combination with other methods to improve the accuracy of soil cover classification. The improvement is statistically significant according to the Wilcoxon signed rank test.
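The significance claim in the last sentence rests on a paired Wilcoxon signed-rank test over per-case accuracy values. A minimal sketch with invented AUC values (not the paper's data):

```python
from scipy.stats import wilcoxon

# Hypothetical per-case AUC values for two classifiers over 17 test cases.
auc_mars = [0.91, 0.88, 0.93, 0.90, 0.87, 0.92, 0.89, 0.94,
            0.90, 0.86, 0.91, 0.93, 0.88, 0.90, 0.92, 0.89, 0.85]
auc_ml   = [0.88, 0.86, 0.90, 0.91, 0.84, 0.89, 0.87, 0.92,
            0.88, 0.87, 0.89, 0.90, 0.85, 0.88, 0.90, 0.86, 0.84]

# Paired, nonparametric test: are the AUC differences centered at zero?
stat, p = wilcoxon(auc_mars, auc_ml)
print(f"W = {stat}, p = {p:.4f}")
```

Because the test is paired and rank-based, it needs no normality assumption on the AUC differences, which suits small numbers of test zones.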
Gourmelon, Anne; Delrue, Nathalie
Ten years have elapsed since the OECD published the Guidance document on the validation and international regulatory acceptance of test methods for hazard assessment. Much experience has been gained since then in validation centres, in countries, and at the OECD on a variety of test methods that were subjected to validation studies. This chapter reviews validation principles and highlights common features that appear to be important for further regulatory acceptance across studies. Existing OECD-agreed validation principles will most likely remain relevant and applicable to the challenges associated with the validation of future test methods. Some adaptations may be needed to take into account the level of technology introduced in test systems, but demonstration of relevance and reliability will continue to play a central role as a prerequisite for regulatory acceptance. Demonstration of relevance will become more challenging for test methods that form part of a set of predictive tools and methods and that do not stand alone. The OECD is keen to ensure that, while these concepts evolve, countries can continue to rely on valid methods and harmonised approaches for the efficient testing and assessment of chemicals.
ERIC Educational Resources Information Center
Desoete, Annemie
2008-01-01
Third grade elementary school children solved tests on mathematical reasoning and numerical facility. Metacognitive skillfulness was assessed through think aloud protocols, prospective and retrospective child ratings, teacher questionnaires, calibration measures and EPA2000. In our dataset metacognition has a lot in common with intelligence, but…
ERIC Educational Resources Information Center
Li, Spencer D.
2011-01-01
Mediation analysis in child and adolescent development research is possible using large secondary data sets. This article provides an overview of two statistical methods commonly used to test mediated effects in secondary analysis: multiple regression and structural equation modeling (SEM). Two empirical studies are presented to illustrate the…
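The first of the two methods, regression-based mediation testing, can be sketched as the classic three-regression procedure with a Sobel test for the indirect effect. The data below are simulated and the coefficients arbitrary; this is an illustration, not the article's worked example:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)                      # predictor
m = 0.5 * x + rng.normal(size=n)            # mediator depends on x (path a)
y = 0.4 * m + 0.1 * x + rng.normal(size=n)  # outcome (paths b and c')

def ols(X, y):
    """Least-squares fit with intercept; returns coefficients and their SEs."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    return beta, se

b_a, se_a = ols(x, m)                        # regress mediator on predictor
a, sa = b_a[1], se_a[1]
b_b, se_b = ols(np.column_stack([x, m]), y)  # regress outcome on both
b, sb = b_b[2], se_b[2]

# Sobel test for the indirect effect a*b.
z = (a * b) / np.sqrt(a**2 * sb**2 + b**2 * sa**2)
print(f"indirect effect = {a*b:.3f}, Sobel z = {z:.2f}")
```

An SEM package would estimate the same paths simultaneously and also bootstrap the indirect effect, which is generally preferred to the Sobel approximation in small samples.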
Critical assessment of jet erosion test methodologies for cohesive soil and sediment
USDA-ARS?s Scientific Manuscript database
The submerged Jet Erosion Test (JET) is a commonly used technique to assess the erodibility of cohesive soil. Employing a linear excess shear stress equation and impinging jet theory, simple numerical methods have been developed to analyze data collected using a JET to determine the critical shear s...
ERIC Educational Resources Information Center
Jacob, Robin Tepper; Jacob, Brian
2012-01-01
Teacher and principal surveys are among the most common data collection techniques employed in education research. Yet there is remarkably little research on survey methods in education, or about the most cost-effective way to raise response rates among teachers and principals. In an effort to explore various methods for increasing survey response…
A Groupwise Association Test for Rare Mutations Using a Weighted Sum Statistic
Madsen, Bo Eskerod; Browning, Sharon R.
2009-01-01
Resequencing is an emerging tool for identification of rare disease-associated mutations. Rare mutations are difficult to tag with SNP genotyping, as genotyping studies are designed to detect common variants. However, studies have shown that genetic heterogeneity is a probable scenario for common diseases, in which multiple rare mutations together explain a large proportion of the genetic basis for the disease. Thus, we propose a weighted-sum method to jointly analyse a group of mutations in order to test for groupwise association with disease status. For example, such a group of mutations may result from resequencing a gene. We compare the proposed weighted-sum method to alternative methods and show that it is powerful for identifying disease-associated genes, on both simulated and ENCODE data. Using the weighted-sum method, a resequencing study can identify a disease-associated gene with an overall population attributable risk (PAR) of 2%, even when each individual mutation has a much lower PAR, using 1,000 to 7,000 affected and unaffected individuals, depending on the underlying genetic model. This study thus demonstrates that resequencing studies can identify important genetic associations, provided that specialised analysis methods, such as the weighted-sum method, are used. PMID:19214210
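The weighted-sum statistic itself is compact enough to sketch. The version below follows the published recipe — variant weights estimated from the unaffected group, per-individual genetic scores, and a permutation p-value on the rank sum of affected individuals — but simplifies it by holding the weights fixed across permutations; the genotypes are simulated:

```python
import numpy as np
from scipy.stats import rankdata

def weighted_sum_test(geno, affected, n_perm=999, seed=0):
    """Simplified Madsen-Browning weighted-sum test.

    geno : (n_individuals, n_variants) minor-allele counts (0/1/2).
    affected : boolean array, True for cases.
    Returns a permutation p-value for the case rank-sum.
    """
    rng = np.random.default_rng(seed)
    n_u = (~affected).sum()
    # Allele frequency among the unaffected, with a pseudo-count.
    m_u = geno[~affected].sum(axis=0)
    q = (m_u + 1) / (2 * n_u + 2)
    w = np.sqrt(len(affected) * q * (1 - q))  # down-weights common variants
    scores = (geno / w).sum(axis=1)           # per-individual genetic score
    ranks = rankdata(scores)

    def case_rank_sum(aff):
        return ranks[aff].sum()

    obs = case_rank_sum(affected)
    # Simplification: weights are held fixed while labels are permuted.
    perms = np.array([case_rank_sum(rng.permutation(affected))
                      for _ in range(n_perm)])
    return ((perms >= obs).sum() + 1) / (n_perm + 1)

# Simulated example: rare variants enriched in 100 cases vs 100 controls.
rng = np.random.default_rng(42)
cases = rng.binomial(2, 0.05, size=(100, 20))
controls = rng.binomial(2, 0.01, size=(100, 20))
geno = np.vstack([cases, controls])
affected = np.array([True] * 100 + [False] * 100)
print(weighted_sum_test(geno, affected))
```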
Yoga Latha, L; Darah, I; Sasidharan, S; Jain, K
2009-09-01
Chemical preservatives have been used in the food industry for many years. However, with increased health concerns, consumers prefer additive-free products or food preservatives based on natural products. This study evaluated antimicrobial activities of extracts from Emilia sonchifolia L. (common name: lilac tassel flower), Tridax procumbens L. (common name: tridax daisy) and Vernonia cinerea L. (common name: Sahadevi), belonging to the Asteraceae family, to explore their potential for use against general food spoilage and human pathogens so that new food preservatives may be developed. Three methanol extracts of these plants were tested in vitro against 20 bacterial species, 3 yeast species, and 12 filamentous fungi by the agar diffusion and broth dilution methods. The V. cinerea extract was found to be most effective against all of the tested organisms, and the methanol fraction showed the most significant (p < 0.05) antimicrobial activity among all the soluble fractions tested. The minimum inhibitory concentrations (MICs) of the extracts determined by the broth dilution method ranged from 1.56 to 100.00 mg/mL. The MIC of the methanol fraction was the lowest in comparison to the other four extracts. The study findings indicate that bioactive natural products from these plants may be isolated for further testing as leads in the development of new pharmaceuticals in food preservation as well as natural plant-based medicine.
Large-mirror testing facility at the National Optical Astronomy Observatories
NASA Astrophysics Data System (ADS)
Coudé du Foresto, V.; Fox, J.; Poczulp, G. A.; Richardson, J.; Roddier, Claude; Roddier, Francois; Barr, L. D.
1991-09-01
A method for testing the surfaces of large mirrors has been developed that can be used even when conditions of vibration and thermal turbulence in the light path cannot be eliminated. The full aperture of the mirror under test is examined by means of a scatterplate interferometer, which has the property of being a quasi-common-path method, although any means of obtaining interference fringes can be used. Because the test equipment is operated remotely, the optician does not cause unnecessary vibrations or heat in the testing area. The typical test is done with a camera exposure of about a millisecond to 'freeze' the fringe pattern on the detector. Averaging up to 10 separate exposures effectively eliminates the turbulence effects. From the intensity information, a phase map of the wavefront reflected from the surface is obtained using a phase-unwrapping technique. The method provides the optician with complete numerical information and visual plots for the surface under test and the diffracted image it will produce, to an accuracy of 0.01 micron measured peak-to-valley. The method has been used extensively for a variety of tests of a 1.8-m-diam borosilicate-glass honeycomb mirror, where it was shown to have a sensitivity equal to that of a Foucault test.
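The phase-unwrapping step, recovering a continuous wavefront from phase values known only modulo 2π, can be illustrated in one dimension (a toy sketch; reducing a real interferogram is a two-dimensional problem and considerably more involved):

```python
import numpy as np

# A smooth "wavefront" whose total phase excursion exceeds 2*pi.
x = np.linspace(0, 1, 200)
true_phase = 12 * x**2            # radians, rising to ~12 rad

# Interferometry measures phase only modulo 2*pi, wrapped into (-pi, pi].
wrapped = np.angle(np.exp(1j * true_phase))

# np.unwrap restores the missing multiples of 2*pi wherever
# successive samples jump by more than pi.
recovered = np.unwrap(wrapped)

print(np.allclose(recovered, true_phase, atol=1e-6))
```

Unwrapping succeeds here because the phase change between adjacent samples stays below π; densely sampled fringes serve the same purpose in a real interferogram.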
Could Daylight Glare Be Defined Mathematically? Results of Testing the DGIN Method in Japan
NASA Astrophysics Data System (ADS)
Nazzal, Ali; Oki, Masato
Discomfort glare from daylight is a common problem for which no valid prediction method has so far existed. A new mathematical method, the DGIN (New Daylight Glare Index), attempts to address this challenge. This paper reports on experiments carried out in a daylit office environment in Japan to test the applicability of the method. A slight positive correlation was found between the DGIN and the subjective evaluations. Additionally, a high L_adaptation value together with a small ratio of L_window to L_adaptation was evidently sufficient to neutralize the effect of glare discomfort. However, subjective assessments are poor glare indicators and are not reliable for testing glare prediction methods. The DGIN is a good indicator of daylight glare, and when the DGIN value is analyzed together with the measured illuminance ratios, discomfort glare from daylight can be assessed in a quantitative manner. The DGIN method could serve architects and lighting designers in testing daylighting systems, and could also guide the action of daylight-responsive lighting controls.
Aleid, Nouf M.; Fertig, Raymond; Maddy, Austin; Tosti, Antonella
2017-01-01
Background Contact dermatitis of the scalp is common and might be caused by many chemicals including metals, ingredients of shampoos and conditioners, dyes, or other hair treatments. Eliciting a careful history and patch tests are necessary to identify the responsible allergen and prevent relapses. Objectives To identify allergens that may cause contact dermatitis of the scalp by reviewing patch test results. Methods We reviewed the records of 1,015 patients referred for patch testing at the Dermatology Department of the University of Miami. A total of 226 patients (205 females and 21 males) with suspected scalp contact dermatitis were identified, and the patch test results and clinical data for those patients were analyzed. Most patients were referred for patch testing from a specialized hair clinic at our institution. Results The most common allergens in our study population were nickel (23.8%), cobalt (21.0%), balsam of Peru (18.2%), fragrance mix (14.4%), carba mix (11.6%), and propylene glycol (PG) (8.8%). The majority of patients were females aged 40–59 years, and scalp itching or burning were reported as the most common symptom. Conclusion Frequent sources of allergens for metals include hair clasps, pins, and brushes, while frequent sources of allergens for preservatives, fragrance mix, and balsam of Peru include shampoos, conditioners, and hair gels. Frequent sources of allergens for PG include topical medications. PMID:28611994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Labby, Z.
Physicists are often expected to have a solid grounding in experimental design and statistical analysis, sometimes filling in when biostatisticians or other experts are not available for consultation. Unfortunately, graduate education on these topics is seldom emphasized and few opportunities for continuing education exist. Clinical physicists incorporate new technology and methods into their practice based on published literature. A poor understanding of experimental design and analysis could result in inappropriate use of new techniques. Clinical physicists also improve current practice through quality initiatives that require sound experimental design and analysis. Academic physicists with a poor understanding of design and analysis may produce ambiguous (or misleading) results. This can result in unnecessary rewrites, publication rejection, and experimental redesign (wasting time, money, and effort). This symposium will provide a practical review of error and uncertainty, common study designs, and statistical tests. Instruction will primarily focus on practical implementation through examples and will answer questions such as: where would you typically apply the test/design, and where is the test/design typically misapplied (i.e., common pitfalls)? An analysis of error and uncertainty will also be explored using biological studies and associated modeling as a specific use case. Learning Objectives: Understand common experimental testing and clinical trial designs, what questions they can answer, and how to interpret the results. Determine where specific statistical tests are appropriate and identify common pitfalls. Understand how uncertainty and error are addressed in biological testing and associated biological modeling.
NASA Astrophysics Data System (ADS)
Obuchowski, Nancy A.; Bullen, Jennifer A.
2018-04-01
Receiver operating characteristic (ROC) analysis is a tool used to describe the discrimination accuracy of a diagnostic test or prediction model. While sensitivity and specificity are the basic metrics of accuracy, they have many limitations when characterizing test accuracy, particularly when comparing the accuracies of competing tests. In this article we review the basic study design features of ROC studies, illustrate sample size calculations, present statistical methods for measuring and comparing accuracy, and highlight commonly used ROC software. We include descriptions of multi-reader ROC study design and analysis, address frequently seen problems of verification and location bias, discuss clustered data, and provide strategies for testing endpoints in ROC studies. The methods are illustrated with a study of transmission ultrasound for diagnosing breast lesions.
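The core accuracy metric, the area under the ROC curve, can be computed directly from the rank (Mann-Whitney) formulation, and the Hanley-McNeil approximation gives a quick standard error for sample-size thinking. The scores below are simulated, not from the ultrasound study:

```python
import numpy as np
from scipy.stats import rankdata

def auc_with_se(scores_pos, scores_neg):
    """AUC via the Mann-Whitney statistic plus the Hanley-McNeil SE."""
    m, n = len(scores_pos), len(scores_neg)
    ranks = rankdata(np.concatenate([scores_pos, scores_neg]))
    # Sum of ranks of positives -> Mann-Whitney U -> AUC.
    auc = (ranks[:m].sum() - m * (m + 1) / 2) / (m * n)
    # Hanley & McNeil (1982) variance approximation.
    q1 = auc / (2 - auc)
    q2 = 2 * auc**2 / (1 + auc)
    var = (auc * (1 - auc) + (m - 1) * (q1 - auc**2)
           + (n - 1) * (q2 - auc**2)) / (m * n)
    return auc, np.sqrt(var)

rng = np.random.default_rng(3)
diseased = rng.normal(1.0, 1.0, size=50)   # test scores, lesion present
healthy = rng.normal(0.0, 1.0, size=50)    # test scores, lesion absent
auc, se = auc_with_se(diseased, healthy)
print(f"AUC = {auc:.3f} +/- {se:.3f}")
```

Multi-reader, clustered, or verification-biased designs discussed in the article require more elaborate variance estimators than this single-reader formula.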
Acute Unilateral Vestibular Failure Does Not Cause Spatial Hemineglect
Conrad, Julian; Habs, Maximilian; Brandt, Thomas; Dieterich, Marianne
2015-01-01
Objectives Visuo-spatial neglect and vestibular disorders have common clinical findings and involve the same cortical areas. We questioned (1) whether visuo-spatial hemineglect is not only a disorder of spatial attention but may also reflect a disorder of higher cortical vestibular function and (2) whether a vestibular tone imbalance due to an acute peripheral dysfunction can also cause symptoms of neglect or extinction. Therefore, patients with an acute unilateral peripheral vestibular failure (VF) were tested for symptoms of hemineglect. Methods Twenty-eight patients with acute VF were assessed for signs of vestibular deficits and spatial neglect using clinical measures and various common standardized paper-pencil tests. Neglect severity was evaluated further with the Center of Cancellation method. Pathological neglect test scores were correlated with the degree of vestibular dysfunction determined by the subjective visual vertical and caloric testing. Results Three patients showed isolated pathological scores in one or the other neglect test, either ipsilesionally or contralesionally to the VF. None of the patients fulfilled the diagnostic criteria of spatial hemineglect or extinction. Conclusions A vestibular tone imbalance due to unilateral failure of the vestibular endorgan does not cause spatial hemineglect, but evidence indicates it causes mild attentional deficits in both visual hemifields. PMID:26247469
Statistics, Handle with Care: Detecting Multiple Model Components with the Likelihood Ratio Test
NASA Astrophysics Data System (ADS)
Protassov, Rostislav; van Dyk, David A.; Connors, Alanna; Kashyap, Vinay L.; Siemiginowska, Aneta
2002-05-01
The likelihood ratio test (LRT) and the related F-test, popularized in astrophysics by Eadie and coworkers in 1971, Bevington in 1969, Lampton, Margon, & Bowyer, in 1976, Cash in 1979, and Avni in 1978, do not (even asymptotically) adhere to their nominal χ2 and F-distributions in many statistical tests common in astrophysics, thereby casting many marginal line or source detections and nondetections into doubt. Although the above authors illustrate the many legitimate uses of these statistics, in some important cases it can be impossible to compute the correct false positive rate. For example, it has become common practice to use the LRT or the F-test to detect a line in a spectral model or a source above background despite the lack of certain required regularity conditions. (These applications were not originally suggested by Cash or by Bevington.) In these and other settings that involve testing a hypothesis that is on the boundary of the parameter space, contrary to common practice, the nominal χ2 distribution for the LRT or the F-distribution for the F-test should not be used. In this paper, we characterize an important class of problems in which the LRT and the F-test fail and illustrate this nonstandard behavior. We briefly sketch several possible acceptable alternatives, focusing on Bayesian posterior predictive probability values. We present this method in some detail since it is a simple, robust, and intuitive approach. This alternative method is illustrated using the gamma-ray burst of 1997 May 8 (GRB 970508) to investigate the presence of an Fe K emission line during the initial phase of the observation. There are many legitimate uses of the LRT and the F-test in astrophysics, and even when these tests are inappropriate, there remain several statistical alternatives (e.g., judicious use of error bars and Bayes factors). 
Nevertheless, there are numerous cases of the inappropriate use of the LRT and similar tests in the literature, bringing substantive scientific results into question.
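The boundary problem is easy to demonstrate by simulation: testing a non-negative line amplitude against zero places the null hypothesis on the boundary of the parameter space, so the LRT statistic follows a 50:50 mixture of a point mass at zero and χ² with one degree of freedom rather than the nominal χ²₁. A minimal Gaussian sketch of this standard result (not the paper's GRB analysis):

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(7)
n_sim = 200_000

# Under H0 the unconstrained amplitude estimate is z ~ N(0, 1), but the
# constrained MLE is max(z, 0), so the LRT statistic is max(z, 0)**2.
z = rng.normal(size=n_sim)
lrt = np.maximum(z, 0.0) ** 2

# Compare against the nominal chi-square(1) 5% critical value.
crit = chi2.ppf(0.95, df=1)
false_positive_rate = (lrt > crit).mean()
print(false_positive_rate)   # ~0.025, half the nominal 0.05
```

Here the nominal χ²₁ reference is conservative; in more complex boundary problems the miscalibration can go either way, which is why the paper recommends calibrating by simulation or posterior predictive checks.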
Chang, Edward C; Yu, Tina; Najarian, Alexandria S-M; Wright, Kaitlin M; Chen, Wenting; Chang, Olivia D; Du, Yifeng; Hirsch, Jameson K
2017-06-01
We tested a hypothesized model consistent with the notion that self-compassion mediates the association between negative life events and suicidal risk (viz., depressive symptoms and suicidal behaviors) in college students. Method: The sample comprised 331 college students. Self-compassion facets (viz., self-kindness, self-judgment, common humanity, isolation, mindfulness, and overidentification) were used in testing for multiple mediation, controlling for sex. Common humanity, mindfulness, and overidentification were found to mediate the association between negative life events (NLE) and depressive symptoms. However, common humanity was found to be the only mediator of the association between NLE and suicidal behaviors. These findings suggest that there are specific facets of self-compassion that account for the association between NLE and suicidal risk in college students and that (loss of) common humanity plays a central role in this process. © 2016 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Xi, Huixing
2017-03-01
With the continuous development of network technology and the rapid spread of the Internet, computer networks now reach every corner of the world. However, network attacks occur frequently. The ARP protocol vulnerability is one of the most common vulnerabilities in the TCP/IP four-layer architecture. Network protocol vulnerabilities can lead to intrusion into and attacks on information systems, and can weaken or disable a system's normal defenses [1]. At present, ARP-spoofing Trojans spread widely within LANs, posing a serious hidden danger to network operation; they are the primary threat to LAN security. In this paper, the author summarizes the research status of the ARP protocol and the key technologies involved, analyzes the formation mechanism of the ARP protocol vulnerability, and analyzes the feasibility of the attack technique. After a summary of the common defensive methods, the advantages and disadvantages of each defense method are discussed. The current defense methods are then improved, and the advantages of the improved defense algorithm are presented. Finally, an appropriate test method is selected, a test environment is set up, and experiments and tests are carried out for each proposed improved defense algorithm.
Homogeneity study of fixed-point continuous marine environmental and meteorological data: a review
NASA Astrophysics Data System (ADS)
Yang, Jinkun; Yang, Yang; Miao, Qingsheng; Dong, Mingmei; Wan, Fangfang
2018-02-01
The principle of inhomogeneity and the classification of homogeneity test methods are briefly described, and several common homogeneity test methods and their relative merits are discussed in detail. Based on applications of the different homogeneity methods to ground meteorological data and marine environmental data, the present status and recent progress are then reviewed. At present, homogeneity research on radiosonde and ground meteorological data is mature both domestically and abroad, and research and application in marine environmental data should also be given full attention. Applying a variety of test and correction methods in combination with a multi-method testing system will make the results more reasonable and scientific, and can provide accurate first-hand information for coastal climate change research.
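One widely used homogeneity test of the kind the review covers, the standard normal homogeneity test (SNHT), is simple to sketch: standardize the series, then scan every candidate break point for a shift in mean. The series below is synthetic, with an artificial step inserted to imitate an instrument or station change:

```python
import numpy as np

def snht(series):
    """Standard normal homogeneity test statistic and break position.

    Returns (T_max, k): the maximum test statistic and the index at
    which the most likely mean shift begins.
    """
    x = np.asarray(series, dtype=float)
    n = len(x)
    z = (x - x.mean()) / x.std(ddof=1)
    t_stat = np.empty(n - 1)
    for k in range(1, n):
        z1 = z[:k].mean()            # mean standardized anomaly before k
        z2 = z[k:].mean()            # mean standardized anomaly after k
        t_stat[k - 1] = k * z1**2 + (n - k) * z2**2
    k_best = int(np.argmax(t_stat)) + 1
    return t_stat[k_best - 1], k_best

# Synthetic sea-surface temperature series with a +1.5 step at index 60.
rng = np.random.default_rng(0)
data = rng.normal(20.0, 0.5, size=100)
data[60:] += 1.5
t_max, k = snht(data)
print(t_max, k)
```

In practice T_max is compared against tabulated critical values that depend on the series length, and candidate breaks are cross-checked against station metadata or neighboring reference series.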
Angst, Ueli M.; Boschmann, Carolina; Wagner, Matthias; Elsener, Bernhard
2017-01-01
The aging of reinforced concrete infrastructure in developed countries imposes an urgent need for methods to reliably assess the condition of these structures. Corrosion of the embedded reinforcing steel is the most frequent cause for degradation. While it is well known that the ability of a structure to withstand corrosion depends strongly on factors such as the materials used or the age, it is common practice to rely on threshold values stipulated in standards or textbooks. These threshold values for corrosion initiation (Ccrit) are independent of the actual properties of a certain structure, which clearly limits the accuracy of condition assessments and service life predictions. The practice of using tabulated values can be traced to the lack of reliable methods to determine Ccrit on-site and in the laboratory. Here, an experimental protocol to determine Ccrit for individual engineering structures or structural members is presented. A number of reinforced concrete samples are taken from structures and laboratory corrosion testing is performed. The main advantage of this method is that it ensures real conditions concerning parameters that are well known to greatly influence Ccrit, such as the steel-concrete interface, which cannot be representatively mimicked in laboratory-produced samples. At the same time, the accelerated corrosion test in the laboratory permits the reliable determination of Ccrit prior to corrosion initiation on the tested structure; this is a major advantage over all common condition assessment methods that only permit estimating the conditions for corrosion after initiation, i.e., when the structure is already damaged. The protocol yields the statistical distribution of Ccrit for the tested structure. This serves as a basis for probabilistic prediction models for the remaining time to corrosion, which is needed for maintenance planning. 
This method can potentially be used in material testing of civil infrastructures, similar to established methods used for mechanical testing. PMID:28892023
Big data in sleep medicine: prospects and pitfalls in phenotyping
Bianchi, Matt T; Russo, Kathryn; Gabbidon, Harriett; Smith, Tiaundra; Goparaju, Balaji; Westover, M Brandon
2017-01-01
Clinical polysomnography (PSG) databases are a rich resource in the era of “big data” analytics. We explore the uses and potential pitfalls of clinical data mining of PSG using statistical principles and analysis of clinical data from our sleep center. We performed retrospective analysis of self-reported and objective PSG data from adults who underwent overnight PSG (diagnostic tests, n=1835). Self-reported symptoms overlapped markedly between the two most common categories, insomnia and sleep apnea, with the majority reporting symptoms of both disorders. Standard clinical metrics routinely reported on objective data were analyzed for basic properties (missing values, distributions), pairwise correlations, and descriptive phenotyping. Of 41 continuous variables, including clinical and PSG derived, none passed testing for normality. Objective findings of sleep apnea and periodic limb movements were common, with 51% having an apnea–hypopnea index (AHI) >5 per hour and 25% having a leg movement index >15 per hour. Different visualization methods are shown for common variables to explore population distributions. Phenotyping methods based on clinical databases are discussed for sleep architecture, sleep apnea, and insomnia. Inferential pitfalls are discussed using the current dataset and case examples from the literature. The increasing availability of clinical databases for large-scale analytics holds important promise in sleep medicine, especially as it becomes increasingly important to demonstrate the utility of clinical testing methods in management of sleep disorders. Awareness of the strengths, as well as caution regarding the limitations, will maximize the productive use of big data analytics in sleep medicine. PMID:28243157
Usability Test of an Interactive Dietary Recording
ERIC Educational Resources Information Center
Chung, Louisa Ming Yan; Chung, Joanne Wai Yee; Wong, Thomas Kwok Shing
2009-01-01
Dietary intake methods are used to collect information on a person's dietary habits, which is essential in nutrition assessment. Food diaries, food frequency questionnaires (FFQ) and 24-hour recalls are the most common dietary intake methods. However, they are not welcomed by most clients. Digital handheld devices are now readily available, and the cost of digital…
A drawback of current in vitro chemical testing is that many commonly used cell lines lack chemical metabolism. To address this challenge, we present a method for assessing the impact of cellular metabolism on chemical-based cellular toxicity. A cell line with low endogenous meta...
Testing common stream sampling methods for broad-scale, long-term monitoring
Eric K. Archer; Brett B. Roper; Richard C. Henderson; Nick Bouwes; S. Chad Mellison; Jeffrey L. Kershner
2004-01-01
We evaluated sampling variability of stream habitat sampling methods used by the USDA Forest Service and the USDI Bureau of Land Management monitoring program for the upper Columbia River Basin. Three separate studies were conducted to describe the variability of individual measurement techniques, variability between crews, and temporal variation throughout the summer...
Emotion Recognition Ability: A Multimethod-Multitrait Study.
ERIC Educational Resources Information Center
Gaines, Margie; And Others
A common paradigm in measuring the ability to recognize facial expressions of emotion is to present photographs of facial expressions and to ask subjects to identify the emotion. The Affect Blend Test (ABT) uses this method of assessment and is scored for accuracy on specific affects as well as total accuracy. Another method of measuring affect…
Using Loss Functions for DIF Detection: An Empirical Bayes Approach.
ERIC Educational Resources Information Center
Zwick, Rebecca; Thayer, Dorothy; Lewis, Charles
2000-01-01
Studied a method for flagging differential item functioning (DIF) based on loss functions. Builds on earlier research that led to the development of an empirical Bayes enhancement to the Mantel-Haenszel DIF analysis. Tested the method through simulation and found its performance better than some commonly used DIF classification systems. (SLD)
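The Mantel-Haenszel analysis that such DIF methods build on reduces to a common odds ratio across matched ability strata, re-expressed on the ETS delta scale as -2.35·ln(α_MH). Below is a schematic computation with invented counts; the empirical Bayes and loss-function enhancements studied in the article are not shown:

```python
import numpy as np

def mh_ddif(tables):
    """Mantel-Haenszel common odds ratio and ETS delta-DIF.

    tables : list of 2x2 arrays, one per matched score stratum,
             [[ref_correct, ref_wrong], [focal_correct, focal_wrong]].
    """
    num = sum(t[0, 0] * t[1, 1] / t.sum() for t in tables)
    den = sum(t[0, 1] * t[1, 0] / t.sum() for t in tables)
    alpha = num / den                    # common odds ratio alpha_MH
    return alpha, -2.35 * np.log(alpha)  # negative values disfavor focal group

# Hypothetical counts in three ability strata (reference vs focal group).
strata = [np.array([[40.0, 10.0], [30.0, 20.0]]),
          np.array([[50.0, 10.0], [40.0, 18.0]]),
          np.array([[60.0,  5.0], [50.0, 12.0]])]
alpha, ddif = mh_ddif(strata)
print(f"alpha_MH = {alpha:.2f}, MH D-DIF = {ddif:.2f}")
```

On the ETS scale, |MH D-DIF| values beyond roughly 1.5 (with a significant test) flag large, "C-level" DIF, which is the kind of classification the loss-function approach refines.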
Kwon, Jae-Sung; Kim, Kwang-Mahn; Kim, Kyoung-Nam
2014-10-01
The biocompatibility evaluation of nanomaterials is essential for their medical diagnostic and therapeutic usage, and a cytotoxicity test is the simplest form of biocompatibility evaluation. Three methods have been commonly used in previous studies for the cytotoxicity testing of nanomaterials: trypan blue exclusion, colorimetric assay using water-soluble tetrazolium (WST), and imaging under a microscope following calcein AM/ethidium homodimer-1 staining. However, no study has yet compared these methods. Therefore, in this study the three methods were compared using the standard reference material sodium lauryl sulfate (SLS). Each cytotoxicity test was carried out using L-929 mouse fibroblasts exposed to different concentrations of SLS. Compared to the gold-standard trypan blue exclusion test, both the colorimetric WST assay and imaging under a microscope with calcein AM/ethidium homodimer-1 staining showed results that were not statistically different. Each method also exhibited various advantages and disadvantages, including the equipment needed, the time taken for the experiment, and the provision of additional information such as cell morphology. Therefore, this study concludes that all three methods of cytotoxicity testing may be valid, though careful consideration will be needed when selecting tests with regard to time, finances, and the amount of information required by the researcher(s).
Moazami-Goudarzi, K; Laloë, D
2002-01-01
To determine the relationships among closely related populations or species, two methods are commonly used in the literature: phylogenetic reconstruction or multivariate analysis. The aim of this article is to assess the reliability of multivariate analysis. We describe a method that is based on principal component analysis and Mantel correlations, using a two-step process: The first step consists of a single-marker analysis and the second step tests if each marker reveals the same typology concerning population differentiation. We conclude that if single markers are not congruent, the compromise structure is not meaningful. Our model is not based on any particular mutation process and it can be applied to most of the commonly used genetic markers. This method is also useful to determine the contribution of each marker to the typology of populations. We test whether our method is efficient with two real data sets based on microsatellite markers. Our analysis suggests that for closely related populations, it is not always possible to accept the hypothesis that an increase in the number of markers will increase the reliability of the typology analysis. PMID:12242255
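The Mantel correlation step described above can be sketched generically as a permutation test between two distance matrices. This is a plain Pearson-correlation Mantel test on the lower triangles, an assumption of this illustration rather than the authors' exact single-marker procedure.

```python
import random

def flatten_lower(mat):
    """Lower triangle (excluding diagonal) of a square distance matrix."""
    n = len(mat)
    return [mat[i][j] for i in range(n) for j in range(i)]

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def mantel_test(d1, d2, n_perm=999, seed=0):
    """Permutation test of the correlation between two distance
    matrices; rows and columns of d2 are permuted together."""
    rng = random.Random(seed)
    obs = pearson(flatten_lower(d1), flatten_lower(d2))
    n = len(d1)
    hits = 1  # count the observed statistic itself
    for _ in range(n_perm):
        p = list(range(n))
        rng.shuffle(p)
        perm = [[d2[p[i]][p[j]] for j in range(n)] for i in range(n)]
        if pearson(flatten_lower(d1), flatten_lower(perm)) >= obs:
            hits += 1
    return obs, hits / (n_perm + 1)
```

Two identical typologies yield a correlation near 1 with a small permutation p-value; incongruent single markers yield low correlations, the situation in which the authors warn the compromise structure is not meaningful.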
Ibrahim, Sarah A; Martini, Luigi
2014-08-01
Dissolution method transfer is a complicated yet common process in the pharmaceutical industry. With increased pharmaceutical product manufacturing and dissolution acceptance requirements, dissolution testing has become one of the most labor-intensive quality control testing methods. There is an increased trend for automation in dissolution testing, particularly for large pharmaceutical companies to reduce variability and increase personnel efficiency. There is no official guideline for dissolution testing method transfer from a manual, semi-automated, to automated dissolution tester. In this study, a manual multipoint dissolution testing procedure for an enteric-coated aspirin tablet was transferred effectively and reproducibly to a fully automated dissolution testing device, RoboDis II. Enteric-coated aspirin samples were used as a model formulation to assess the feasibility and accuracy of media pH change during continuous automated dissolution testing. Several RoboDis II parameters were evaluated to ensure the integrity and equivalency of dissolution method transfer from a manual dissolution tester. This current study provides a systematic outline for the transfer of the manual dissolution testing protocol to an automated dissolution tester. This study further supports that automated dissolution testers compliant with regulatory requirements and similar to manual dissolution testers facilitate method transfer. © 2014 Society for Laboratory Automation and Screening.
The Structural Heat Intercept-Insulation-Vibration Evaluation Rig (SHIVER)
NASA Technical Reports Server (NTRS)
Johnson, W. L.; Zoeckler, J. G.; Best-Ameen, L. M.
2015-01-01
NASA is currently investigating methods to reduce the boil-off rate on large cryogenic upper stages. Two such methods to reduce the total heat load on existing upper stages are vapor cooling of the cryogenic tank support structure and integration of thick multilayer insulation systems to the upper stage of a launch vehicle. Previous efforts have flown a 2-layer MLI blanket and shown an improved thermal performance, and other efforts have ground-tested blankets up to 70 layers thick on tanks with diameters between 2 and 3 meters. However, thick multilayer insulation installation and testing in both thermal and structural modes has not been completed on a large-scale tank. Similarly, multiple vapor cooled shields are commonplace on science payload helium dewars; however, minimal effort has gone into intercepting heat on large structural surfaces associated with rocket stages. A majority of the vapor cooling effort focuses on metallic cylinders called skirts, which are the most common structural components for launch vehicles. In order to provide test data for comparison with analytical models, a representative test tank is currently being designed to include skirt structural systems with integral vapor cooling. The tank is 4 m in diameter and 6.8 m tall to contain 5000 kg of liquid hydrogen. A multilayer insulation system will be designed to insulate the tank and structure while being installed in a representative manner that can be extended to tanks up to 10 meters in diameter. In order to prove that the insulation system and vapor cooling attachment methods are structurally sound, acoustic testing will also be performed on the system. The test tank with insulation and vapor cooled shield installed will be tested thermally in the B2 test facility at NASA's Plumbrook Station both before and after being vibration tested at Plumbrook's Space Power Facility.
Bias correction for selecting the minimal-error classifier from many machine learning models.
Ding, Ying; Tang, Shaowu; Liao, Serena G; Jia, Jia; Oesterreich, Steffi; Lin, Yan; Tseng, George C
2014-11-15
Supervised machine learning is commonly applied in genomic research to construct a classifier from the training data that is generalizable to predict independent testing data. When test datasets are not available, cross-validation is commonly used to estimate the error rate. Many machine learning methods are available, and it is well known that no universally best method exists in general. It has been a common practice to apply many machine learning methods and report the method that produces the smallest cross-validation error rate. Theoretically, such a procedure produces a selection bias. Consequently, many clinical studies with moderate sample sizes (e.g. n = 30-60) risk reporting a falsely small cross-validation error rate that could not be validated later in independent cohorts. In this article, we illustrated the probabilistic framework of the problem and explored the statistical and asymptotic properties. We proposed a new bias correction method based on learning curve fitting by inverse power law (IPL) and compared it with three existing methods: nested cross-validation, weighted mean correction and Tibshirani-Tibshirani procedure. All methods were compared in simulation datasets, five moderate size real datasets and two large breast cancer datasets. The result showed that IPL outperforms the other methods in bias correction with smaller variance, and it has an additional advantage to extrapolate error estimates for larger sample sizes, a practical feature to recommend whether more samples should be recruited to improve the classifier and accuracy. An R package 'MLbias' and all source files are publicly available at tsenglab.biostat.pitt.edu/software.htm. Contact: ctseng@pitt.edu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
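The learning-curve idea behind the IPL correction can be sketched by fitting an inverse power law, err(n) ≈ a·n^(−b) + c, to error estimates at several sample sizes and extrapolating. This is a minimal grid-search least-squares sketch, not the MLbias implementation; the function names and the grid over b are assumptions of the illustration.

```python
def fit_inverse_power_law(ns, errs, b_grid=None):
    """Fit err ~ a * n**(-b) + c by grid search over b, with
    closed-form least squares for (a, c) at each candidate b."""
    if b_grid is None:
        b_grid = [i / 100 for i in range(1, 201)]  # b in (0, 2]
    best = None
    for b in b_grid:
        xs = [n ** (-b) for n in ns]
        m = len(xs)
        sx, sy = sum(xs), sum(errs)
        sxx = sum(x * x for x in xs)
        sxy = sum(x * y for x, y in zip(xs, errs))
        denom = m * sxx - sx * sx
        if abs(denom) < 1e-12:
            continue
        a = (m * sxy - sx * sy) / denom
        c = (sy - a * sx) / m
        sse = sum((a * x + c - y) ** 2 for x, y in zip(xs, errs))
        if best is None or sse < best[0]:
            best = (sse, a, b, c)
    _, a, b, c = best
    return a, b, c

def predict_error(n, a, b, c):
    """Extrapolate the fitted learning curve to a larger sample size."""
    return a * n ** (-b) + c
```

The extrapolation step is what lets the method advise whether recruiting more samples would meaningfully improve the classifier.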
Testing and Evaluation of Passive Radiation Detection Equipment for Homeland Security
West, David L.; Wood, Nathan L.; Forrester, Christina D.
2017-12-01
This article is concerned with test and evaluation methods for passive radiation detection equipment used in homeland security applications. The different types of equipment used in these applications are briefly reviewed and then test and evaluation methods discussed. The primary emphasis is on the test and evaluation standards developed by the American National Standards Institute’s N42 committees. Commonalities among the standards are then reviewed as well as examples of unique aspects for specific equipment types. Throughout, sample test configurations and results from testing and evaluation at Oak Ridge National Laboratory are given. The article concludes with a brief discussion of typical tests and evaluations not covered by the N42 standards and some examples of test and evaluation that involve the end users of the equipment.
A close examination of double filtering with fold change and t test in microarray analysis
2009-01-01
Background Many researchers use the double filtering procedure with fold change and t test to identify differentially expressed genes, in the hope that the double filtering will provide extra confidence in the results. Due to its simplicity, the double filtering procedure has been popular with applied researchers despite the development of more sophisticated methods. Results This paper, for the first time to our knowledge, provides theoretical insight on the drawback of the double filtering procedure. We show that fold change assumes all genes to have a common variance while t statistic assumes gene-specific variances. The two statistics are based on contradicting assumptions. Under the assumption that gene variances arise from a mixture of a common variance and gene-specific variances, we develop the theoretically most powerful likelihood ratio test statistic. We further demonstrate that the posterior inference based on a Bayesian mixture model and the widely used significance analysis of microarrays (SAM) statistic are better approximations to the likelihood ratio test than the double filtering procedure. Conclusion We demonstrate through hypothesis testing theory, simulation studies and real data examples, that well constructed shrinkage testing methods, which can be united under the mixture gene variance assumption, can considerably outperform the double filtering procedure. PMID:19995439
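The double filtering procedure the paper critiques is simple to state in code: a gene is flagged only if it passes both a fold-change cutoff and a t-statistic cutoff. The sketch below assumes log2-scale expression data and uses a Welch t statistic; the cutoff values are illustrative.

```python
import math
from statistics import mean, variance

def welch_t(xs, ys):
    """Welch two-sample t statistic (gene-specific variances)."""
    vx, vy = variance(xs), variance(ys)
    return (mean(xs) - mean(ys)) / math.sqrt(vx / len(xs) + vy / len(ys))

def double_filter(group1, group2, fc_cut=1.0, t_cut=2.0):
    """Flag genes passing BOTH an absolute log2-fold-change cutoff
    (which implicitly assumes a common variance across genes) and a
    |t| cutoff (which assumes gene-specific variances)."""
    flagged = []
    for g, (xs, ys) in enumerate(zip(group1, group2)):
        log_fc = mean(xs) - mean(ys)  # difference of log2 means
        if abs(log_fc) >= fc_cut and abs(welch_t(xs, ys)) >= t_cut:
            flagged.append(g)
    return flagged
```

The paper's point is visible in the code itself: the two filters rest on contradicting variance assumptions, which is why a shrinkage statistic built on a mixture variance model can outperform the intersection of the two.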
Comparison of subject-reported allergy versus skin test results in a common cold trial.
Krahnke, Jason S; Gentile, Deborah A; Cordoro, Kelly M; Angelini, Betty L; Cohen, Sheldon A; Doyle, William J; Skoner, David P
2003-01-01
Few studies have examined the relationship between subject-reported allergy and results of allergy skin testing in large unselected or unbiased cohorts. The objective of this study was to compare the results of self-reported allergy via verbal questioning with the results of allergy skin testing by the puncture method in 237 healthy adult subjects enrolled in a common cold study. On enrollment, all subjects were verbally asked if they had a history of allergy and then underwent puncture skin testing to 19 relevant aeroallergens, as well as appropriate positive and negative controls. A skin test was considered positive if its wheal diameter was at least 3 mm larger than that obtained with the negative control. Forty-eight (20%) subjects reported a history of allergy and 124 (52%) subjects had at least one positive skin test response. A history of allergy was reported in 40 (32%) of the skin test-positive subjects and 8 (7%) of the skin test-negative subjects. At least one positive skin test response was found in 40 (83%) of those subjects reporting a history of allergy and 84 (44%) of those subjects denying a history of allergy. These data indicate that there is a relatively poor correlation between self-reported history of allergy and skin test results in subjects enrolled in a common cold study. These results have implications in both clinical practice and research settings.
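The "relatively poor correlation" can be quantified from the 2x2 counts reported in the abstract (40 both positive, 8 report-positive/test-negative, 84 report-negative/test-positive, 105 both negative) using Cohen's kappa. The kappa calculation below is standard; the paper itself does not report kappa, so this is an illustration from its published counts.

```python
def cohen_kappa(a, b, c, d):
    """Chance-corrected agreement for a 2x2 table:
    a = both positive, b = report+/test-,
    c = report-/test+, d = both negative."""
    n = a + b + c + d
    po = (a + d) / n                                  # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2  # chance agreement
    return (po - pe) / (1 - pe)
```

For the study's counts this gives kappa of roughly 0.24, conventionally read as only "fair" agreement, consistent with the authors' conclusion.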
Slotnick, Scott D
2017-07-01
Analysis of functional magnetic resonance imaging (fMRI) data typically involves over one hundred thousand independent statistical tests; therefore, it is necessary to correct for multiple comparisons to control familywise error. In a recent paper, Eklund, Nichols, and Knutsson used resting-state fMRI data to evaluate commonly employed methods to correct for multiple comparisons and reported unacceptable rates of familywise error. Eklund et al.'s analysis was based on the assumption that resting-state fMRI data reflect null data; however, their 'null data' actually reflected default network activity that inflated familywise error. As such, Eklund et al.'s results provide no basis to question the validity of the thousands of published fMRI studies that have corrected for multiple comparisons or the commonly employed methods to correct for multiple comparisons.
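Voxel-level multiple-comparison correction can be illustrated generically with the two standard procedures below (Bonferroni for familywise error, Benjamini-Hochberg for false discovery rate). These are textbook implementations, not the cluster-extent methods that Eklund et al. and Slotnick debate, which require spatial models of the data.

```python
def bonferroni(pvals, alpha=0.05):
    """Familywise error control: reject only p <= alpha / m."""
    return [p <= alpha / len(pvals) for p in pvals]

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up FDR control: reject the k smallest
    p-values, where k is the largest rank with p_(k) <= k*alpha/m."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * alpha / m:
            k_max = rank
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k_max:
            reject[i] = True
    return reject
```

With over one hundred thousand tests per fMRI analysis, the Bonferroni threshold becomes extremely strict, which is why cluster-based and FDR approaches are common in the field.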
Optical microtopographic inspection of the surface of tooth subjected to stripping reduction
NASA Astrophysics Data System (ADS)
Costa, Manuel F.; Pereira, Pedro B.
2011-05-01
In orthodontics, the decreasing of tooth-size by reducing interproximal enamel surfaces (stripping) of teeth is a common procedure which allows dental alignment with minimal changes in the facial profile and no arch expansion. In order to achieve smooth surfaces, clinicians have been testing various methods and progressively improved this therapeutic technique. In order to evaluate the surface roughness of teeth subject to interproximal reduction through the five most commonly used methods, teeth were inspected by scanning electron microscopy and microtopographically measured using the optical active triangulation based microtopographer MICROTOP.06.MFC. The metrological procedure will be presented as well as the comparative results concluding on the most suitable tooth interproximal reduction method.
Construction of an Exome-Wide Risk Score for Schizophrenia Based on a Weighted Burden Test.
Curtis, David
2018-01-01
Polygenic risk scores obtained as a weighted sum of associated variants can be used to explore association in additional data sets and to assign risk scores to individuals. The methods used to derive polygenic risk scores from common SNPs are not suitable for variants detected in whole exome sequencing studies. Rare variants, which may have major effects, are seen too infrequently to judge whether they are associated and may not be shared between training and test subjects. A method is proposed whereby variants are weighted according to their frequency, their annotations and the genes they affect. A weighted sum across all variants provides an individual risk score. Scores constructed in this way are used in a weighted burden test and are shown to be significantly different between schizophrenia cases and controls using a five-way cross-validation procedure. This approach represents a first attempt to summarise exome sequence variation into a summary risk score, which could be combined with risk scores from common variants and from environmental factors. It is hoped that the method could be developed further. © 2017 John Wiley & Sons Ltd/University College London.
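The core of the proposed score, a weighted sum across all variants carried by an individual, can be sketched as follows. The frequency-based weight shown is one common convention for up-weighting rare variants and stands in for the paper's combined frequency-and-annotation weights, which are not specified in the abstract.

```python
import math

def frequency_weight(maf):
    """Up-weight rarer variants; an illustrative convention standing in
    for the paper's frequency/annotation/gene-based weighting."""
    return 1.0 / math.sqrt(maf * (1.0 - maf))

def burden_score(allele_counts, weights):
    """Individual risk score: weighted sum of allele counts (0/1/2)
    across all variants, rare and common alike."""
    return sum(g * w for g, w in zip(allele_counts, weights))
```

Because the score aggregates over all variants rather than testing each one, rare variants contribute even when they are too infrequent to be judged individually or shared between training and test subjects.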
Luo, Yuan; Szolovits, Peter; Dighe, Anand S; Baron, Jason M
2018-06-01
A key challenge in clinical data mining is that most clinical datasets contain missing data. Since many commonly used machine learning algorithms require complete datasets (no missing data), clinical analytic approaches often entail an imputation procedure to "fill in" missing data. However, although most clinical datasets contain a temporal component, most commonly used imputation methods do not adequately accommodate longitudinal time-based data. We sought to develop a new imputation algorithm, 3-dimensional multiple imputation with chained equations (3D-MICE), that can perform accurate imputation of missing clinical time series data. We extracted clinical laboratory test results for 13 commonly measured analytes (clinical laboratory tests). We imputed missing test results for the 13 analytes using 3 imputation methods: multiple imputation with chained equations (MICE), Gaussian process (GP), and 3D-MICE. 3D-MICE utilizes both MICE and GP imputation to integrate cross-sectional and longitudinal information. To evaluate imputation method performance, we randomly masked selected test results and imputed these masked results alongside results missing from our original data. We compared predicted results to measured results for masked data points. 3D-MICE performed significantly better than MICE and GP-based imputation in a composite of all 13 analytes, predicting missing results with a normalized root-mean-square error of 0.342, compared to 0.373 for MICE alone and 0.358 for GP alone. 3D-MICE offers a novel and practical approach to imputing clinical laboratory time series data. 3D-MICE may provide an additional tool for use as a foundation in clinical predictive analytics and intelligent clinical decision support.
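The masked-value evaluation described above scores imputation by a normalized root-mean-square error. The sketch below normalizes RMSE by the range of the measured values, a common convention; the paper's exact normalization is not stated in the abstract, so treat this as an assumption.

```python
import math

def nrmse(true_vals, imputed_vals):
    """Normalized RMSE over masked test results: RMSE divided by the
    range of the measured values (one common normalization)."""
    n = len(true_vals)
    rmse = math.sqrt(
        sum((t, p) and (t - p) ** 2 for t, p in zip(true_vals, imputed_vals)) / n
        if False else
        sum((t - p) ** 2 for t, p in zip(true_vals, imputed_vals)) / n
    )
    rng = max(true_vals) - min(true_vals)
    return rmse / rng
```

Lower is better: on this scale the reported 0.342 for 3D-MICE beats 0.373 for MICE alone and 0.358 for GP alone.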
Ultrasonic-Based Nondestructive Evaluation Methods for Wood: A Primer and Historical Review
Adam C. Senalik; Greg Schueneman; Robert J. Ross
2014-01-01
The authors conducted a review of ultrasonic testing and evaluation of wood and wood products, starting with a description of basic ultrasonic inspection setups and commonly used equations. The literature review primarily covered wood research presented between 1965 and 2013 in the Proceedings of the Nondestructive Testing of Wood Symposiums. A table that lists the...
ERIC Educational Resources Information Center
Makransky, Guido; Mortensen, Erik Lykke; Glas, Cees A. W.
2013-01-01
Narrowly defined personality facet scores are commonly reported and used for making decisions in clinical and organizational settings. Although these facets are typically related, scoring is usually carried out for a single facet at a time. This method can be ineffective and time consuming when personality tests contain many highly correlated…
ERIC Educational Resources Information Center
Longabach, Tanya; Peyton, Vicki
2018-01-01
K-12 English language proficiency tests that assess multiple content domains (e.g., listening, speaking, reading, writing) often have subsections based on these content domains; scores assigned to these subsections are commonly known as subscores. Testing programs face increasing customer demands for the reporting of subscores in addition to the…
Linking Outcomes from Peabody Picture Vocabulary Test Forms Using Item Response Models
ERIC Educational Resources Information Center
Hoffman, Lesa; Templin, Jonathan; Rice, Mabel L.
2012-01-01
Purpose: The present work describes how vocabulary ability as assessed by 3 different forms of the Peabody Picture Vocabulary Test (PPVT; Dunn & Dunn, 1997) can be placed on a common latent metric through item response theory (IRT) modeling, by which valid comparisons of ability between samples or over time can then be made. Method: Responses…
Development of fire test methods for airplane interior materials
NASA Technical Reports Server (NTRS)
Tustin, E. A.
1978-01-01
Fire tests were conducted in a 737 airplane fuselage at NASA-JSC to characterize jet fuel fires in open steel pans (simulating post-crash fire sources and a ruptured airplane fuselage) and to characterize fires in some common combustibles (simulating in-flight fire sources). Design post-crash and in-flight fire source selections were based on these data. Large panels of airplane interior materials were exposed to closely-controlled large scale heating simulations of the two design fire sources in a Boeing fire test facility utilizing a surplused 707 fuselage section. Small samples of the same airplane materials were tested by several laboratory fire test methods. Large scale and laboratory scale data were examined for correlative factors. Published data for dangerous hazard levels in a fire environment were used as the basis for developing a method to select the most desirable material where trade-offs in heat, smoke and gaseous toxicant evolution must be considered.
The Stress Corrosion Performance Research of Three Kinds of Commonly Used Pipe Materials
NASA Astrophysics Data System (ADS)
Hu, Yayun; Zhang, Yiliang; Jia, Xiaoliang
Corrosion of pipe is one of the most common problems in the oil and gas industry. In this article, three kinds of tubes are analyzed in terms of their resistance to stress corrosion: N80/1, N80/Q and P110. The loading method chosen for this test is constant tensile stress loading. In the test, samples are separated into groups, gradually loaded at specific stress levels and then soaked in H2S-saturated solution. The test yields the threshold value of stress corrosion and the stress-life curve, which can be used to evaluate the stress corrosion properties of the materials and to give guidance for practical engineering.
Analysis of Statistical Methods and Errors in the Articles Published in the Korean Journal of Pain
Yim, Kyoung Hoon; Han, Kyoung Ah; Park, Soo Young
2010-01-01
Background Statistical analysis is essential for obtaining objective reliability in medical research. However, medical researchers do not have enough statistical knowledge to properly analyze their study data. To help understand and potentially alleviate this problem, we have analyzed the statistical methods and errors of articles published in the Korean Journal of Pain (KJP), with the intention to improve the statistical quality of the journal. Methods All the articles, except case reports and editorials, published from 2004 to 2008 in the KJP were reviewed. The types of applied statistical methods and errors in the articles were evaluated. Results One hundred and thirty-nine original articles were reviewed. Inferential statistics and descriptive statistics were used in 119 papers and 20 papers, respectively. Only 20.9% of the papers were free from statistical errors. The most commonly adopted statistical method was the t-test (21.0%) followed by the chi-square test (15.9%). Errors of omission were encountered 101 times in 70 papers. Among the errors of omission, "no statistics used even though statistical methods were required" was the most common (40.6%). The errors of commission were encountered 165 times in 86 papers, among which "parametric inference for nonparametric data" was the most common (33.9%). Conclusions We found various types of statistical errors in the articles published in the KJP. This suggests that meticulous attention should be given not only in applying statistical procedures but also in the reviewing process to improve the value of the article. PMID:20552071
Ozarda, Yesim; Ichihara, Kiyoshi; Aslan, Diler; Aybek, Hulya; Ari, Zeki; Taneli, Fatma; Coker, Canan; Akan, Pinar; Sisman, Ali Riza; Bahceci, Onur; Sezgin, Nurzen; Demir, Meltem; Yucel, Gultekin; Akbas, Halide; Ozdem, Sebahat; Polat, Gurbuz; Erbagci, Ayse Binnur; Orkmez, Mustafa; Mete, Nuriye; Evliyaoglu, Osman; Kiyici, Aysel; Vatansev, Husamettin; Ozturk, Bahadir; Yucel, Dogan; Kayaalp, Damla; Dogan, Kubra; Pinar, Asli; Gurbilek, Mehmet; Cetinkaya, Cigdem Damla; Akin, Okhan; Serdar, Muhittin; Kurt, Ismail; Erdinc, Selda; Kadicesme, Ozgur; Ilhan, Necip; Atali, Dilek Sadak; Bakan, Ebubekir; Polat, Harun; Noyan, Tevfik; Can, Murat; Bedir, Abdulkerim; Okuyucu, Ali; Deger, Orhan; Agac, Suret; Ademoglu, Evin; Kaya, Ayşem; Nogay, Turkan; Eren, Nezaket; Dirican, Melahat; Tuncer, GulOzlem; Aykus, Mehmet; Gunes, Yeliz; Ozmen, Sevda Unalli; Kawano, Reo; Tezcan, Sehavet; Demirpence, Ozlem; Degirmen, Elif
2014-12-01
A nationwide multicenter study was organized to establish reference intervals (RIs) in the Turkish population for 25 commonly tested biochemical analytes and to explore sources of variation in reference values, including regionality. Blood samples were collected nationwide in 28 laboratories from the seven regions (≥400 samples/region, 3066 in all). The sera were collectively analyzed in Uludag University in Bursa using Abbott reagents and analyzer. Reference materials were used for standardization of test results. After secondary exclusion using the latent abnormal values exclusion method, RIs were derived by a parametric method employing the modified Box-Cox formula and compared with the RIs by the non-parametric method. Three-level nested ANOVA was used to evaluate variations among sexes, ages and regions. Associations between test results and age, body mass index (BMI) and region were determined by multiple regression analysis (MRA). By ANOVA, differences of reference values among seven regions were significant in none of the 25 analytes. Significant sex-related and age-related differences were observed for 10 and seven analytes, respectively. MRA revealed BMI-related changes in results for uric acid, glucose, triglycerides, high-density lipoprotein (HDL)-cholesterol, alanine aminotransferase, and γ-glutamyltransferase. Their RIs were thus derived by applying stricter criteria excluding individuals with BMI >28 kg/m2. Ranges of RIs by non-parametric method were wider than those by parametric method especially for those analytes affected by BMI. With the lack of regional differences and the well-standardized status of test results, the RIs derived from this nationwide study can be used for the entire Turkish population.
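The non-parametric comparison method mentioned above can be sketched as simple rank-based percentiles of the reference sample. This sketch uses the common r = p(n+1) rank-interpolation convention; it is not the study's modified Box-Cox parametric procedure, and the exact rank convention used by the study is an assumption here.

```python
def nonparametric_ri(values, lo=0.025, hi=0.975):
    """Central 95% reference interval by the rank-based
    (non-parametric) method: interpolated 2.5th and 97.5th
    percentiles of the sorted reference values."""
    xs = sorted(values)
    n = len(xs)

    def pct(p):
        r = p * (n + 1)          # rank-interpolation convention
        k = max(1, min(int(r), n - 1))
        frac = r - k
        return xs[k - 1] + frac * (xs[k] - xs[k - 1])

    return pct(lo), pct(hi)
```

As the abstract notes, rank-based limits tend to be wider than parametric ones for analytes with skewed, BMI-affected distributions, since no transformation toward normality is applied.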
Further investigations of the W-test for pairwise epistasis testing.
Howey, Richard; Cordell, Heather J
2017-01-01
Background: In a recent paper, a novel W-test for pairwise epistasis testing was proposed that appeared, in computer simulations, to have higher power than competing alternatives. Application to genome-wide bipolar data detected significant epistasis between SNPs in genes of relevant biological function. Network analysis indicated that the implicated genes formed two separate interaction networks, each containing genes highly related to autism and neurodegenerative disorders. Methods: Here we investigate further the properties and performance of the W-test via theoretical evaluation, computer simulations and application to real data. Results: We demonstrate that, for common variants, the W-test is closely related to several existing tests of association allowing for interaction, including logistic regression on 8 degrees of freedom, although logistic regression can show inflated type I error for low minor allele frequencies, whereas the W-test shows good/conservative type I error control. Although in some situations the W-test can show higher power, logistic regression is not limited to tests on 8 degrees of freedom but can instead be tailored to impose greater structure on the assumed alternative hypothesis, offering a power advantage when the imposed structure matches the true structure. Conclusions: The W-test is a potentially useful method for testing for association - without necessarily implying interaction - between genetic variants and disease, particularly when one or more of the genetic variants are rare. For common variants, the advantages of the W-test are less clear, and, indeed, there are situations where existing methods perform better. In our investigations, we further uncover a number of problems with the practical implementation and application of the W-test (to bipolar disorder) previously described, apparently due to inadequate use of standard data quality-control procedures. This observation leads us to urge caution in interpretation of the previously-presented results, most of which we consider are highly likely to be artefacts.
Orodispersible tablets: A new trend in drug delivery
Dey, Paramita; Maiti, Sabyasachi
2010-01-01
The most common and preferred route of drug administration is the oral route. Orodispersible tablets are gaining importance among novel oral drug-delivery systems as they improve patient compliance and have some additional advantages over other oral formulations. They are solid unit dosage forms that disintegrate in the mouth within a minute in the presence of saliva, owing to superdisintegrants in the formulation. This type of drug delivery therefore facilitates peroral administration in pediatric and geriatric populations for whom swallowing is difficult. Various scientists have prepared orodispersible tablets by various methods; the most common method of preparation is the compression method. Other special methods are molding, melt granulation, the phase-transition process, sublimation, freeze-drying, spray-drying, and the effervescent method. Since these tablets dissolve directly in the mouth, their taste is also an important factor, and various approaches have been taken to mask the bitter taste of the drug. A number of scientists have explored several drugs in this field. Like all other solid dosage forms, orodispersible tablets are evaluated for hardness, friability, wetting time, moisture uptake, disintegration, and dissolution. PMID:22096326
Pisa, Pedro T; Landais, Edwige; Margetts, Barrie; Vorster, Hester H; Friedenreich, Christine M; Huybrechts, Inge; Martin-Prevel, Yves; Branca, Francesco; Lee, Warren T K; Leclercq, Catherine; Jerling, Johann; Zotor, Francis; Amuna, Paul; Al Jawaldeh, Ayoub; Aderibigbe, Olaide Ruth; Amoussa, Waliou Hounkpatin; Anderson, Cheryl A M; Aounallah-Skhiri, Hajer; Atek, Madjid; Benhura, Chakare; Chifamba, Jephat; Covic, Namukolo; Dary, Omar; Delisle, Hélène; El Ati, Jalila; El Hamdouchi, Asmaa; El Rhazi, Karima; Faber, Mieke; Kalimbira, Alexander; Korkalo, Liisa; Kruger, Annamarie; Ledo, James; Machiweni, Tatenda; Mahachi, Carol; Mathe, Nonsikelelo; Mokori, Alex; Mouquet-Rivier, Claire; Mutie, Catherine; Nashandi, Hilde Liisa; Norris, Shane A; Onabanjo, Oluseye Olusegun; Rambeloson, Zo; Saha, Foudjo Brice U; Ubaoji, Kingsley Ikechukwu; Zaghloul, Sahar; Slimani, Nadia
2018-01-02
To carry out an inventory on the availability, challenges, and needs of dietary assessment (DA) methods in Africa as a pre-requisite to provide evidence, and set directions (strategies) for implementing common dietary methods and support web-research infrastructure across countries. The inventory was performed within the framework of the "Africa's Study on Physical Activity and Dietary Assessment Methods" (AS-PADAM) project. It involves international institutional and African networks. An inventory questionnaire was developed and disseminated through the networks. Eighteen countries responded to the dietary inventory questionnaire. Various DA tools were reported in Africa; 24-Hour Dietary Recall and Food Frequency Questionnaire were the most commonly used tools. Few tools were validated and tested for reliability. Face-to-face interview was the common method of administration. No computerized software or other new (web) technologies were reported. No tools were standardized across countries. The lack of comparable DA methods across represented countries is a major obstacle to implement comprehensive and joint nutrition-related programmes for surveillance, programme evaluation, research, and prevention. There is a need to develop new or adapt existing DA methods across countries by employing related research infrastructure that has been validated and standardized in other settings, with the view to standardizing methods for wider use.
Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio
2013-03-01
To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a score of <6. Our findings document that only a few of the studies reviewed applied statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.
Psychometric evaluation of commonly used game-specific skills tests in rugby: A systematic review
Oorschot, Sander; Chiwaridzo, Matthew; CM Smits-Engelsman, Bouwien
2017-01-01
Objectives To (1) give an overview of commonly used game-specific skills tests in rugby and (2) evaluate the available psychometric information on these tests. Methods The databases PubMed, MEDLINE, CINAHL and Africa-Wide Information were systematically searched for articles published between January 1995 and March 2017. First, commonly used game-specific skills tests were identified. Second, the available psychometrics of these tests were evaluated and the methodological quality of the studies was assessed using the Consensus-based Standards for the selection of health Measurement Instruments (COSMIN) checklist. Studies included in the first step had to report detailed information on the construct and testing procedure of at least one game-specific skill; studies included in the second step additionally had to report at least one psychometric property evaluating reliability, validity or responsiveness. Results 287 articles were identified in the first step, of which 30 met the inclusion criteria; 64 articles were identified in the second step, of which 10 were included. Reactive agility, tackling and simulated rugby games were the most commonly used tests. All 10 studies reporting psychometrics reported reliability outcomes, revealing mainly strong evidence; however, all of these studies scored poor or fair on methodological quality. Four studies reported validity outcomes, indicating mainly moderate evidence, but all had fair methodological quality. Conclusion Game-specific skills tests showed mainly high reliability and validity evidence, but the studies lacked methodological quality. Reactive agility seems to be a promising domain, but the specific tests need further development. Future studies of high methodological quality are required in order to develop valid and reliable test batteries for rugby talent identification. Trial registration number PROSPERO CRD42015029747. PMID:29259812
Final Report on Jobin Yvon Contained Inductively Coupled Plasma Emission Spectrometer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pennebaker, F.M.
2003-03-17
A new Inductively Coupled Plasma -- Emission Spectrometer (ICP-ES) was recently purchased and installed in Lab B-147/151 at SRTC. The contained JY Model Ultima 170-C ICP-ES has been tested and compared to current ADS ICP-ES instrumentation. The testing has included both performance tests to evaluate instrumental ability, and the measurement of matrix standards commonly analyzed by ICP-ES at Savannah River. In developing operating procedures for this instrument, we have implemented the use of internal standards and off-peak background subtraction. Both of these techniques are recommended by EPA SW-846 ICP-ES methods and are common to current ICP-ES operations. Based on the testing and changes, the JY Model Ultima 170-C ICP-ES provides improved performance for elemental analysis of radioactive samples in the Analytical Development Section.
Garg, Uttam; Munar, Ada; Frazee, Clinton; Scott, David
2012-09-01
Vitamin D plays a vital role not only in bone health but also in pathophysiology of many other body functions. In recent years, there has been significant increase in testing of 25-hydroxyvitamin D (25-OH vitamin D), a marker of vitamin D deficiency. The most commonly used methods for the measurement of 25-OH vitamin D are immunoassays and liquid chromatography tandem mass spectrometry (LC-MS-MS). Since immunoassays suffer from inaccuracies and interferences, LC-MS-MS is a preferred method. In LC-MS-MS methods, 25-OH vitamin D is extracted from serum or plasma by solid-phase or liquid-phase extraction. Because these extraction methods are time consuming, we developed an easy method that uses simple protein precipitation followed by injection of the supernatant to LC-MS-MS. Several mass-to-charge (m/z) ratio transitions, including commonly used transitions based on water loss, were evaluated and several tube types were tested. The optimal transitions for 25-OH vitamin D2 and D3 were 395.5 > 269.5 and 383.4 > 257.3, respectively. The reportable range of the method was 1-100 ng/mL, and repeatability (within-run) and within-laboratory imprecision were <4% and <6%, respectively. The method agreed well with the solid-phase extraction methods. © 2012 Wiley Periodicals, Inc.
Khodaei, Kazem; Mohammadi, Abbas; Badri, Neda
2017-10-01
The purpose of this study was to compare the effects of assisted, resisted and common plyometric training modes on sprint and agility performance. Thirty active young males (age 20.67±1.12, height 174.83±4.69, weight 63.45±7.51) volunteered to participate in this study, of whom 24 completed testing. The participants were randomly assigned to the assisted, resisted or common plyometric exercise group. Plyometric training involved three sessions per week for 4 weeks, and the volume load was equated between the groups. The posttest was performed 48 hours after the last training session. Between-group differences were analyzed with ANCOVA and LSD post-hoc tests, and within-group differences were analyzed by a paired t-test. The findings indicated that the 0-10-m and 20-30-m sprint times and the Illinois Agility Test time decreased significantly in the assisted and resisted plyometric modes compared to the common plyometric training mode (P≤0.05). Also, the 0-10-m and 0-30-m sprint times and agility T-test time were significantly reduced in the resisted plyometric mode compared to the assisted and common plyometric modes (P≤0.05). There was no significant difference in the 10-20-m sprint time among the three plyometric training modes. The results demonstrate that assisted and resisted plyometric training with elastic bands is more effective than common plyometric training for improving sprint and agility performance in active males, and that the resisted mode is superior to the assisted mode for improving sprint and agility tasks.
Davoren, Jon; Vanek, Daniel; Konjhodzić, Rijad; Crews, John; Huffine, Edwin; Parsons, Thomas J.
2007-01-01
Aim To quantitatively compare a silica extraction method with a commonly used phenol/chloroform extraction method for DNA analysis of specimens exhumed from mass graves. Methods DNA was extracted from twenty randomly chosen femur samples using the International Commission on Missing Persons (ICMP) silica method, based on the Qiagen Blood Maxi Kit, and compared with the DNA extracted by the standard phenol/chloroform-based method. The efficacy of the extraction methods was compared by real-time polymerase chain reaction (PCR), to measure DNA quantity and the presence of inhibitors, and by amplification with the PowerPlex 16 (PP16) multiplex nuclear short tandem repeat (STR) kit. Results DNA quantification showed that the silica-based method extracted on average 1.94 ng of DNA per gram of bone (range 0.25-9.58 ng/g), compared with only 0.68 ng/g extracted by the organic method (range 0.0016-4.4880 ng/g). Inhibition tests showed that there were on average significantly lower levels of PCR inhibitors in DNA isolated by the organic method. When amplified with PP16, all samples extracted by the silica-based method produced full 16-locus profiles, while only 75% of the DNA extracts obtained by the organic technique amplified full 16-locus profiles. Conclusions The silica-based extraction method showed better results in nuclear STR typing from degraded bone samples than a commonly used phenol/chloroform method. PMID:17696302
Impact of Gene Patents and Licensing Practices on Access to Genetic Testing for Hearing Loss
Chandrasekharan, Subhashini; Fiffer, Melissa
2011-01-01
Genetic testing for heritable hearing loss involves a mix of patented and unpatented genes, mutations and testing methods. More than half of all hearing loss is linked to inherited mutations, and five genes are most commonly tested in the United States. There are no patents on three of these genes, but Athena Diagnostics holds exclusive licenses to test for a common mutation in the GJB2 gene associated with about 50% of all cases, as well as mutations in the MTRNR1 gene. This fragmented intellectual property landscape made hearing loss a useful case study for assessing whether patent rights in genetic testing can proliferate or overlap, and whether it is possible to gather the rights necessary to perform testing. Testing for hearing loss is widely available, primarily from academic medical centers. Based on literature reviews and interviews with researchers, research on the genetics of hearing loss has generally not been impeded by patents. There is no consistent evidence of a premium in testing prices attributable to patent status. Athena Diagnostics has, however, used its intellectual property to discourage other providers from offering some tests. There is no definitive answer about the suitability of current patenting and licensing of commonly tested genes because of continuing legal uncertainty about the extent of enforcement of patent rights. Clinicians have also expressed concerns that multiplex tests will be difficult to develop because of overlapping intellectual property and conflict with Athena’s sole provider business model. PMID:20393307
ERIC Educational Resources Information Center
Sforza, Dario; Tienken, Christopher H.; Kim, Eunyoung
2016-01-01
The creators and supporters of the Common Core State Standards claim that the Standards require greater emphasis on higher-order thinking than previous state standards in mathematics and English language arts. We used a qualitative case study design with content analysis methods to test the claim. We compared the levels of thinking required by the…
Study on antibacterial effect of medlar and hawthorn compound extract in vitro.
Niu, Yang; Nan, Yi; Yuan, Ling; Wang, Rong
2013-01-01
This paper evaluated the antibacterial effect of a medlar and hawthorn compound extract in vitro. A water extraction method and an ethanol extraction method were adopted to prepare the compound extracts, and the disc diffusion method and an improved test-tube doubling dilution method were used to conduct antibacterial tests on two common pathogenic bacteria, Staphylococcus aureus and Klebsiella pneumoniae, in vitro. The results showed that Staphylococcus aureus was moderately sensitive to the medlar and hawthorn compound extract, while the inhibiting effect on Klebsiella pneumoniae was particularly significant; moreover, the antibacterial effect of the ethanol extract was better than that of the water extract. Medlar and hawthorn compounds had a good antibacterial effect on the two pathogenic bacteria.
Study on electrochemical corrosion mechanism of steel foot of insulators for HVDC lines
NASA Astrophysics Data System (ADS)
Zheng, Weihua; Sun, Xiaoyu; Fan, Youping
2017-09-01
This paper examines the mechanism of electrochemical corrosion of insulator steel feet in HVDC transmission lines and summarizes five commonly used artificial accelerated electrochemical corrosion test methods. The various methods are analyzed and compared, and a simulation test of electrochemical corrosion of insulator steel feet is carried out by the water jet method. The experimental results show that the environment simulated by the water jet method is close to the real operating environment. Of the three insulator suspension modes used in actual operation, corrosion is most severe for the V-type suspension hardware, followed by the tension string suspension, while the linear string corrodes slowest.
Parental attitudes toward pediatric use of complementary/alternative medicine in Turkey.
Ustuner Top, Fadime; Konuk Sener, Dilek; Cangur, Sengul
2017-07-01
This study was conducted to determine the pediatric use of complementary/alternative medicine (CAM) by parents in Turkey, the incidence of using these methods, and the factors affecting their use. The cross-sectional, correlational study included a sample of 497 parents who took their children for treatment at the Maternity and Children's Hospital in Giresun, Turkey. Data were collected via the Personal Information Form and the Evaluation Form for Complementary/Alternative Treatment Use; the data collection tools were filled out by the researchers during face-to-face interviews. Data were analyzed by Pearson chi-square, Fisher-Freeman-Halton and Fisher's exact (post hoc Bonferroni) tests and the Z-test. It was determined that 97.7% of the parents had used at least one CAM method, mostly for respiratory complaints; the CAM methods were most commonly used for the symptoms of fever, diarrhea, and cough. The most commonly used alternative methods in the past were vitamin/mineral remedies, cold treatments, and hodja (Islamic teacher) consultations, while the most common alternative methods currently used are massage, music, and cold treatment. The differences found between CAM users in terms of sociodemographic characteristics were not statistically significant. It is crucial for nurses to learn the characteristics of the health/disease treatments used by those with whom they work in order to increase the efficiency of the service they provide. It was therefore recommended that nurses be knowledgeable and aware of the benefits/side effects, treatment methods, and contraindications of CAM. © 2017 Wiley Periodicals, Inc.
Elahian Firouz, Zahra; Kaboosi, Hami; Faghih Nasiri, Abdolreza; Tabatabaie, Seyed Saleh; Golhasani-Keshtan, Farideh; Zaboli, Fatemeh
2014-04-01
Toxoplasmosis is a common zoonotic disease of humans and animals caused by the protozoan parasite Toxoplasma gondii. The disease is usually asymptomatic in immunocompetent individuals; its most common symptom is lymphadenopathy. If acquired shortly before or during the first trimester of pregnancy, the disease can be transferred to the fetus and cause serious fetal infection. In late pregnancy (third trimester), the complications of this infection are mild or absent. Because clinical symptoms in pregnant women are absent or non-specific, prenatal diagnosis is often impossible. Since no research had compared the ELISA and chemiluminescence methods, we compared them to determine which works better for the diagnosis of toxoplasmosis. In this study, 50 pregnant women referred to the Chalus Health Center laboratory were included, and their blood samples were tested for the presence of IgG and IgM antibodies to Toxoplasma gondii by both ELISA and chemiluminescence. Of the 50 samples tested by ELISA, 26 (52%) were positive for IgG; none were positive for IgM. Of the 50 samples tested by chemiluminescence, 28 (56%) were positive for IgG; none were positive for IgM. A significant relationship between the age of the youngest child and the infection rate was seen. No significant correlation was seen between infection levels and age, number of individuals in the household, number of children, location, type of construction, consumption of greens, the way greens and meat are consumed, drug use, or history of stillbirth.
A new IRT-based standard setting method: application to eCat-listening.
García, Pablo Eduardo; Abad, Francisco José; Olea, Julio; Aguado, David
2013-01-01
Criterion-referenced interpretations of tests are highly necessary, which usually involves the difficult task of establishing cut scores. In contrast with other Item Response Theory (IRT)-based standard setting methods, a non-judgmental approach is proposed in this study, in which Item Characteristic Curve (ICC) transformations lead to the final cut scores. eCat-Listening, a computerized adaptive test for the evaluation of English listening, was administered to 1,576 participants, and the proposed standard setting method was applied to classify them into the performance standards of the Common European Framework of Reference for Languages (CEFR). The results showed a classification closely related to relevant external measures of English language proficiency, according to the CEFR. It is concluded that the proposed method is a practical and valid standard setting alternative for IRT-based test interpretation.
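The ICC-based mapping at the heart of such methods can be sketched in a few lines. This is a generic illustration with invented item parameters, not the eCat-Listening items or the paper's actual transformation: under a three-parameter logistic (3PL) model, a cut point on the latent trait maps to an expected number-correct cut score by summing item characteristic curves.

```python
import math

def icc_3pl(theta, a, b, c):
    """Three-parameter logistic item characteristic curve:
    probability of a correct response at ability theta."""
    return c + (1 - c) / (1 + math.exp(-1.7 * a * (theta - b)))

def true_score(theta, items):
    """Test characteristic curve: expected number-correct score at theta."""
    return sum(icc_3pl(theta, a, b, c) for a, b, c in items)

# Invented item parameters (discrimination a, difficulty b, guessing c)
items = [(1.2, -0.5, 0.2), (0.9, 0.0, 0.25), (1.5, 0.8, 0.2)]

# Map a hypothetical latent-trait cut point (e.g. a CEFR boundary at
# theta = 0.5) to its expected number-correct score on this mini-test.
cut_theta = 0.5
cut_score = true_score(cut_theta, items)
```

Because the test characteristic curve is monotone in theta, any cut on the latent scale corresponds to a unique expected-score cut, which is what makes a non-judgmental, curve-based standard setting possible.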
Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS).
Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M; Khan, Ajmal
2016-01-01
This article compares the study designs and statistical methods used in the 2005, 2010 and 2015 volumes of the Pakistan Journal of Medical Sciences (PJMS). Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, which were then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods were found in the analysis. The most frequent methods included descriptive statistics (n=315, 73.4%), chi-square/Fisher's exact tests (n=205, 47.8%) and Student's t-test (n=186, 43.4%). There was a significant increase in the use of several statistical methods over the time period: t-test, chi-square/Fisher's exact test, logistic regression, epidemiological statistics, and non-parametric tests. This study shows that a diverse variety of statistical methods have been used in the research articles of PJMS and that their frequency increased from 2005 to 2015. However, descriptive statistics was the most frequent method of statistical analysis, and the cross-sectional design the most common study design, in the published articles.
Antecedents and Consequences of Supplier Performance Evaluation Efficacy
2016-06-30
forming groups of high and low values. These tests are contingent on the reliable and valid measure of high and low rating inflation and high and...year)? Future research could deploy a SPM system as a test case on a limited set of transactions. Using a quasi-experimental design, comparisons...single source, common method bias must be of concern. Harman's one-factor test showed that when latent-indicator items were forced onto a single
Karasz, Alison; Patel, Viraj; Kabita, Mahbhooba; Shimu, Parvin
2015-01-01
Background Though common mental disorder (CMD) is highly prevalent among South Asian immigrant women, they rarely seek mental health treatment. This may be due in part to the lack of conceptual synchrony between medical models of mental disorder and the social models of distress common in South Asian communities. Furthermore, common mental health screening and diagnostic measures may not adequately capture distress in this group. Community-based participatory research (CBPR) is ideally suited to help address measurement issues in CMD as well as to develop culturally appropriate treatment models. Objectives To use participatory methods to identify an appropriate, culturally specific mental health syndrome and develop an instrument to measure this syndrome. Methods We formed a partnership between researchers, clinicians, and community members. The partnership selected a culturally specific model of emotional distress/illness, “Tension,” as a focus for further study. Partners developed a scale to measure Tension and tested the new scale on 162 Bangladeshi immigrant women living in the Bronx. Results The 24-item “Tension Scale” had high internal consistency (alpha = 0.83). In bivariate analysis, the scale correlated significantly, in the expected direction, with depression as measured by the PHQ-2, age, education, self-rated health, having seen a physician in the past year, and other variables. Conclusions Using participatory techniques, we created a new measure designed to assess common mental disorder in an isolated immigrant group. The new measure shows excellent psychometric properties and will be helpful in the implementation of a community-based, culturally synchronous intervention for depression. We describe a useful strategy for the rapid development and field testing of culturally appropriate measures of mental distress and disorder. PMID:24375184
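The internal-consistency figure reported for the Tension Scale (alpha = 0.83) is Cronbach's alpha. A minimal sketch of the computation on toy data follows; the actual scale items and responses are not reproduced here:

```python
def cronbach_alpha(rows):
    """Cronbach's alpha for a matrix of respondents' item scores.
    rows: list of equal-length lists, one list per respondent."""
    k = len(rows[0])          # number of items
    def var(xs):              # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([r[i] for r in rows]) for i in range(k)]
    total_var = var([sum(r) for r in rows])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Toy data: 4 respondents x 3 items (invented, for illustration only)
data = [[1, 2, 1], [2, 3, 3], [3, 4, 4], [4, 5, 5]]
alpha = cronbach_alpha(data)
```

Alpha rises when items covary strongly relative to their individual variances, which is why a value like 0.83 on 24 items is read as high internal consistency.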
The precision and torque production of common hip adductor squeeze tests used in elite football.
Light, N; Thorborg, K
2016-11-01
Decreased hip adductor strength is a known risk factor for groin injury in footballers, with clinicians testing adductor strength in various positions and using different protocols. Understanding how reliable different adductor squeeze tests are, and how much torque they produce, will facilitate choosing the most appropriate method for future testing. In this study, the reliability and torque production of three common adductor squeeze tests were investigated using a test-retest design and cross-sectional comparison. Twenty elite-level footballers (16-33 years) without previous or current groin pain were recruited. Relative and absolute test-retest reliability, and torque production, of three adductor squeeze tests (long-lever in abduction, short-lever in adduction and short-lever in abduction/external rotation) were investigated. Each participant performed a series of isometric strength tests measured by hand-held dynamometry in each position, on two test days separated by two weeks. No systematic variation was seen for any of the tests when using the mean of three measures (ICC=0.84-0.97, MDC%=6.6-19.5). The smallest variation was observed when taking the mean of three repetitions in the long-lever position (ICC=0.97, MDC%=6.6). The long-lever test also yielded the highest mean torque values, which were 69% and 11% higher than those of the short-lever in adduction test and the short-lever in abduction/external rotation test, respectively (p<0.001). All three tests described in this study are reliable methods of measuring adductor squeeze strength. However, the test performed in the long-lever position seems the most promising, as it displays high test-retest precision and the highest adductor torque production. Copyright © 2015 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
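The MDC% values quoted above follow from the standard test-retest chain: SEM = SD·sqrt(1 − ICC), MDC95 = 1.96·sqrt(2)·SEM, and MDC% = 100·MDC95/mean. A sketch with illustrative numbers, since the study's raw torque data are not given in the abstract:

```python
import math

def mdc_percent(sd, icc, mean):
    """Minimal detectable change at the 95% level, as a % of the mean."""
    sem = sd * math.sqrt(1 - icc)        # standard error of measurement
    mdc95 = 1.96 * math.sqrt(2) * sem    # change needed to exceed noise
    return 100 * mdc95 / mean

# Hypothetical long-lever squeeze values: between-subject SD of 30 units,
# ICC = 0.97 (as reported), mean torque of 250 units.
mdc = mdc_percent(sd=30.0, icc=0.97, mean=250.0)
```

The chain makes explicit why a high ICC and a high mean torque together drive MDC% down: less measurement noise relative to a larger signal.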
Lower Cutoffs for LC-MS/MS Urine Drug Testing Indicates Better Patient Compliance.
Krock, Kevin; Pesce, Amadeo; Ritz, Dennis; Thomas, Richard; Cua, Agnes; Rogers, Ryan; Lipnick, Phil; Kilbourn, Kristen
2017-11-01
Urine drug testing is used by health care providers to determine a patient's compliance to their prescribed regimen and to detect non-prescribed medications and illicit drugs. However, the cutoff levels used by clinical labs are often arbitrarily set and may not reflect the urine drug concentrations of compliant patients. Our aim was to test the hypothesis that commonly used cutoffs for many prescribed and illicit drugs were set too high, and methods using these cutoffs may yield a considerable number of false-negative results. The goals of this study were to outline the way to analyze patient results and estimate a more appropriate cutoff, develop and validate a high sensitivity analytical method capable of quantitating drugs and metabolites at lower than the commonly used cutoffs, and determine the number of true positive results that would have been missed when using the common cutoffs. This was a retrospective study of urine specimens submitted for urine drug testing as part of the monitoring of prescription drug compliance described in chronic opioid therapy treatment guidelines. The study was set in a clinical toxicology laboratory, using specimens submitted for routine analysis by health care providers in the normal course of business. Lognormal distributions of test results were generated and fitted with a trendline to estimate the required cutoff level necessary to capture the normal distributions of each drug for the patient population study. A validated laboratory derived liquid chromatography tandem mass spectrometry (LC-MS/MS) analysis capable of achieving the required cutoff levels was developed for each drug and/or metabolite. The study shows that a lognormal distribution of patient urine test results fitted with a trendline is appropriate for estimating the required cutoff levels needed to assess medication adherence. 
The study showed a wide variation in the false-negative rate, ranging from 1.5% to 94.3% across a range of prescribed and illicit drugs. The patient specimens were largely sourced from patients in either a long-term pain management program or in treatment for substance use disorder in the US; these specimens may not be representative of patients in other types of treatment or in countries with different approaches to these issues. The high-sensitivity method reduces false-negative results, which could otherwise negatively impact patient care. Clinicians using less sensitive methods for detecting and quantifying drugs and metabolites in urine should exercise caution when assessing patient adherence and changing the treatment plan based on those results. Keywords: urine drug testing, patient adherence, clinical toxicology, immunoassay, LC-MS, definitive drug testing, REMS, negative test results, false negative.
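The authors' cutoff-estimation idea, fitting a lognormal to the distribution of positive results and placing the cutoff low enough to capture nearly all of it, can be sketched as follows. The concentrations below are synthetic; the study's drug-specific distributions and trendline procedure are not reproduced here:

```python
import math

def lognormal_cutoff(concs, z=-1.96):
    """Fit a lognormal to positive urine concentrations and return the
    cutoff below which only the lower ~2.5% tail of the fitted
    distribution falls (so ~97.5% of true positives exceed it)."""
    logs = [math.log(c) for c in concs]
    mu = sum(logs) / len(logs)
    var = sum((x - mu) ** 2 for x in logs) / (len(logs) - 1)
    return math.exp(mu + z * math.sqrt(var))

def false_negative_rate(concs, cutoff):
    """Fraction of truly positive specimens falling below the cutoff."""
    return sum(1 for c in concs if c < cutoff) / len(concs)

# Synthetic positive specimens (ng/mL), roughly lognormal in shape
concs = [25, 40, 60, 90, 150, 220, 350, 500, 800, 1200]
low_cut = lognormal_cutoff(concs)
fn_high = false_negative_rate(concs, 300)      # an arbitrary high cutoff
fn_low = false_negative_rate(concs, low_cut)   # the fitted low cutoff
```

On this toy distribution the arbitrary cutoff of 300 misclassifies the majority of true positives as negative, while the fitted cutoff misses none, which is the paper's core argument against arbitrarily set cutoff levels.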
Antibiotic Conditioned Growth Medium of Pseudomonas Aeruginosa
ERIC Educational Resources Information Center
Benathen, Isaiah A.; Cazeau, Barbara; Joseph, Njeri
2004-01-01
A simple method to study the consequences of bacterial antibiosis after interspecific competition between microorganisms is presented. Common microorganisms are used as the test organisms, and Pseudomonas aeruginosa is used as the source of the inhibitor agents.
Requirements for diagnosis of malaria at different levels of the laboratory network in Africa.
Long, Earl G
2009-06-01
The rapid increase of resistance to cheap, reliable antimalarials, the increasing cost of effective drugs, and the low specificity of clinical diagnosis have increased the need for more reliable diagnostic methods for malaria. The most commonly used and most reliable method remains microscopic examination of stained blood smears, but this technique requires skilled personnel, precision instruments, and ideally a source of electricity. Microscopy has the advantage of enabling the examiner to identify the species, stage, and density of an infection. An alternative to microscopy is the rapid diagnostic test (RDT), which uses a labeled monoclonal antibody to detect circulating parasite antigens. This test is most commonly used to detect Plasmodium falciparum infections and is available in a plastic cassette format. Both microscopy and RDTs should be available at all levels of laboratory service in endemic areas, but in peripheral laboratories with minimally trained staff, the RDT may be the more practical diagnostic method.
Aircraft Power-Plant Instruments
NASA Technical Reports Server (NTRS)
Sontag, Harcourt; Brombacher, W G
1934-01-01
This report supersedes NACA-TR-129, which is now obsolete. Aircraft power-plant instruments include tachometers, engine thermometers, pressure gages, fuel-quantity gages, fuel flow meters and indicators, and manifold pressure gages. The report includes a description of the commonly used types and some others, the underlying principles utilized in the design, and some design data. The inherent errors of the instruments, the methods of making laboratory tests, descriptions of the test apparatus, and data in considerable detail on the performance of commonly used instruments are presented. Standard instruments and, in cases where it appears to be of interest, those used as secondary standards are described. A bibliography of important articles is included.
Proposed acceptance, qualification, and characterization tests for thin-film PV modules
NASA Technical Reports Server (NTRS)
Waddington, D.; Mrig, L.; Deblasio, R.; Ross, R.
1988-01-01
Details of a proposed test program for PV thin-film modules which the Department of Energy has directed the Solar Energy Research Institute (SERI) to prepare are presented. Results of one of the characterization tests that SERI has performed are also presented. The objective is to establish a common approach to testing modules that will be acceptable to both users and manufacturers. The tests include acceptance, qualification, and characterization tests. Acceptance tests verify that randomly selected modules have similar characteristics. Qualification tests are based on accelerated test methods designed to simulate adverse conditions. Characterization tests provide data on performance in a predefined environment.
Farhadi, Ashkan; Keshavarzian, Ali; Fields, Jeremy Z; Sheikh, Maliha; Banan, Ali
2006-05-19
The most widely accepted method for the evaluation of intestinal barrier integrity is the measurement of the permeation of sugar probes following an oral test dose of sugars. The most widely used sugar probes are sucrose, lactulose, mannitol and sucralose. Measuring these sugars using a sensitive gas chromatographic (GC) method, we noticed interference in the area of the lactulose and mannitol peaks. We tested different sugars to identify the makeup of these interferences and determined that lactose interferes with the lactulose peak and fructose with the mannitol peak. On further development of our method, we were able to separate these peaks reasonably well using different columns and conditions for our assay. Sample preparation was rapid and simple and included adding internal standard sugars, derivatization and silylation. We used two chromatographic methods. In the first method we used a Megabore column with a run time of 34 min; this resulted in partial separation of the peaks. In the second method we used a thin capillary column and were able to reasonably separate the lactose and lactulose peaks and the mannitol and fructose peaks with a run time of 22 min. The sugar probes, including mannitol, sucrose, lactulose, sucralose, fructose and lactose, were detected precisely, without interference. The assay was linear between lactulose concentrations of 0.5 and 40 g/L (r(2)=1.000, P<0.0001) and mannitol concentrations of 0.01 and 40 g/L (r(2)=1.000). The sensitivity of this method remained high using the new column and assay conditions. The minimum detectable concentration calculated for both methods was 0.5 mg/L for lactulose and 1 mg/L for mannitol. This is the first report of interference of commonly consumed sugars with tests of intestinal permeability. These sugars are found in most fruits and dairy products and could easily interfere with the results of permeability tests.
Our new GC assay of urine sugar probes permits the simultaneous quantitation of sucralose, sucrose, mannitol and lactulose, without interference from lactose and fructose. This assay is a rapid, simple, sensitive and reproducible method to accurately measure intestinal permeability.
Detection methods and performance criteria for genetically modified organisms.
Bertheau, Yves; Diolez, Annick; Kobilinsky, André; Magin, Kimberly
2002-01-01
Detection methods for genetically modified organisms (GMOs) are necessary for many applications, from seed purity assessment to compliance with food labeling in several countries. Numerous analytical methods are currently used or under development to support these needs. The currently used methods are bioassays and protein- and DNA-based detection protocols. To avoid discrepancies in results between such largely different methods and, for instance, the potential resulting legal actions, compatibility of the methods is urgently needed. Performance criteria allow methods to be evaluated against a common standard. The more common performance criteria for detection methods are precision, accuracy, sensitivity, and specificity, which together address other terms used to describe the performance of a method, such as applicability, selectivity, calibration, trueness, precision, recovery, operating range, limit of quantitation, limit of detection, and ruggedness. Performance criteria should provide objective tools to accept or reject specific methods, to validate them, to ensure compatibility between validated methods, and to be used on a routine basis to reject data outside an acceptable range of variability. When selecting a method of detection, it is also important to consider its applicability, its field of applications, and its limitations, including factors such as its ability to detect the target analyte in a given matrix, the duration of the analyses, its cost effectiveness, and the necessary sample sizes for testing. Thus, current GMO detection methods should be evaluated against a common set of performance criteria.
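Two of the criteria listed, limit of detection and limit of quantitation, are commonly operationalized from replicate blank measurements using the 3-sigma and 10-sigma conventions. This is a generic analytical-chemistry sketch with invented blank readings, not a protocol from this paper:

```python
def detection_limits(blank_signals):
    """Estimate the limit of detection (LOD) and limit of quantitation
    (LOQ) from replicate blank measurements, using the common
    mean + 3*SD and mean + 10*SD conventions."""
    n = len(blank_signals)
    mean = sum(blank_signals) / n
    sd = (sum((x - mean) ** 2 for x in blank_signals) / (n - 1)) ** 0.5
    return mean + 3 * sd, mean + 10 * sd

# Invented blank readings (arbitrary signal units) for illustration
blanks = [0.10, 0.12, 0.09, 0.11, 0.10, 0.13, 0.08, 0.11]
lod, loq = detection_limits(blanks)
```

Expressing LOD and LOQ this way makes them comparable across bioassay, protein-based and DNA-based protocols, which is precisely what a common set of performance criteria requires.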
ERIC Educational Resources Information Center
Merrel, Jeremy D.; Cirillo, Pier F.; Schwartz, Pauline M.; Webb, Jeffrey A.
2015-01-01
Multiple choice testing is a common but often ineffective method for evaluating learning. A newer approach, however, using Immediate Feedback Assessment Technique (IF AT®, Epstein Educational Enterprise, Inc.) forms, offers several advantages. In particular, a student learns immediately if his or her answer is correct and, in the case of an…
ERIC Educational Resources Information Center
Gugel, John F.
A new method for estimating the parameters of the normal ogive three-parameter model for multiple-choice test items--the normalized direct (NDIR) procedure--is examined. The procedure is compared to a more commonly used estimation procedure, Lord's LOGIST, using computer simulations. The NDIR procedure uses the normalized (mid-percentile)…
USDA-ARS?s Scientific Manuscript database
In 2011, the USDA-Food Safety and Inspection Service (FSIS) changed the method used for screening swine tissues for antimicrobial residues to the Kidney Inhibition Swab (KIS(TM)) from the Fast Antimicrobial Screen Test. A high dose of penicillin G procaine relative to a label dose is commonly used ...
A Method for the Comparison of Item Selection Rules in Computerized Adaptive Testing
ERIC Educational Resources Information Center
Barrada, Juan Ramon; Olea, Julio; Ponsoda, Vicente; Abad, Francisco Jose
2010-01-01
In a typical study comparing the relative efficiency of two item selection rules in computerized adaptive testing, the common result is that they simultaneously differ in accuracy and security, making it difficult to reach a conclusion on which is the more appropriate rule. This study proposes a strategy to conduct a global comparison of two or…
Predicting Envelope Leakage in Attached Dwellings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faakye, O.; Arena, L.; Griffiths, D.
2013-07-01
The most common method for measuring air leakage is to use a single blower door to pressurize and/or depressurize the test unit. In detached housing, the test unit is the entire home and the single blower door measures air leakage to the outside. In attached housing, this 'single unit', 'total', or 'solo' test method measures both the air leakage between adjacent units through common surfaces as well as air leakage to the outside. Measuring and minimizing this total leakage is recommended to avoid indoor air quality issues between units, reduce energy losses to the outside, reduce pressure differentials between units, and control stack effect. However, two significant limitations of the total leakage measurement in attached housing are: for retrofit work, if total leakage is assumed to be all to the outside, the energy benefits of air sealing can be significantly overpredicted; for new construction, the total leakage values may result in failing to meet an energy-based house tightness program criterion. The scope of this research is to investigate an approach for developing a viable simplified algorithm that can be used by contractors to assess energy efficiency program qualification and/or compliance based upon solo test results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brink, A.; Kilpinen, P.; Hupa, M.
1996-01-01
Two methods to improve the modeling of NO{sub x} emissions in numerical flow simulation of combustion are investigated. The models used are a reduced mechanism for nitrogen chemistry in methane combustion and a new model based on regression analysis of perfectly stirred reactor simulations using detailed comprehensive reaction kinetics. The applicability of the methods to numerical flow simulation of practical furnaces, especially in the near burner region, is tested against experimental data from a pulverized coal fired single burner furnace. The results are also compared to those obtained using a commonly used description for the overall reaction rate of NO.
Measurement of aspheric mirror segments using Fizeau interferometry with CGH correction
NASA Astrophysics Data System (ADS)
Burge, James H.; Zhao, Chunyu; Dubin, Matt
2010-07-01
Large aspheric primary mirrors are proposed that use hundreds of segments, all of which must be aligned and phased to approximate the desired continuous mirror. We present a method of measuring these concave segments with a Fizeau interferometer in which a spherical convex reference surface is held a few millimeters from the aspheric segment. The aspheric shape is accommodated by a small computer generated hologram (CGH). Different segments are measured by replacing the CGH. As a Fizeau test, nearly all of the optical elements and air spaces are common to both the measurement and reference wavefronts, so the alignment sensitivities are not tight. Also, since the reference surface of the test plate is common to all tests, this system achieves excellent control of the radius of curvature variation from one part to another. This paper describes the design and analysis of such a test system and presents data from a similar 1.4-m test performed at the University of Arizona.
Bayes Factor Covariance Testing in Item Response Models.
Fox, Jean-Paul; Mulder, Joris; Sinharay, Sandip
2017-12-01
Two marginal one-parameter item response theory models are introduced, by integrating out the latent variable or random item parameter. It is shown that both marginal response models are multivariate (probit) models with a compound symmetry covariance structure. Several common hypotheses concerning the underlying covariance structure are evaluated using (fractional) Bayes factor tests. The support for a unidimensional factor (i.e., assumption of local independence) and differential item functioning are evaluated by testing the covariance components. The posterior distribution of common covariance components is obtained in closed form by transforming latent responses with an orthogonal (Helmert) matrix. This posterior distribution is defined as a shifted-inverse-gamma, thereby introducing a default prior and a balanced prior distribution. Based on that, an MCMC algorithm is described to estimate all model parameters and to compute (fractional) Bayes factor tests. Simulation studies are used to show that the (fractional) Bayes factor tests have good properties for testing the underlying covariance structure of binary response data. The method is illustrated with two real data studies.
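The compound symmetry structure and the Helmert transformation mentioned above can be illustrated numerically. The sketch below is illustrative only: the function names are ours, and the covariance used is the simple marginal form I + τ²J (J the all-ones matrix) obtained by integrating out a shared latent variable with variance τ², rather than the paper's full probit model. It shows that an orthogonal Helmert matrix diagonalizes such a covariance, which is what makes closed-form posteriors for the covariance components possible.

```python
def helmert(p):
    """Orthogonal Helmert matrix of order p (rows are orthonormal)."""
    H = [[0.0] * p for _ in range(p)]
    for j in range(p):
        H[0][j] = 1.0 / p ** 0.5
    for i in range(1, p):
        norm = (i * (i + 1)) ** 0.5
        for j in range(i):
            H[i][j] = 1.0 / norm
        H[i][i] = -i / norm
    return H

def compound_symmetry(p, tau2):
    """Cov = I + tau2 * J: identical variances, identical covariances."""
    return [[(1.0 if i == j else 0.0) + tau2 for j in range(p)]
            for i in range(p)]

def transform(H, S):
    """Compute H S H^T with plain list-of-lists matrices."""
    p = len(H)
    HS = [[sum(H[i][k] * S[k][j] for k in range(p)) for j in range(p)]
          for i in range(p)]
    return [[sum(HS[i][k] * H[j][k] for k in range(p)) for j in range(p)]
            for i in range(p)]
```

The transformed covariance is diagonal with one eigenvalue 1 + p·τ² and p − 1 eigenvalues equal to 1, so the covariance components decouple after the transformation.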
Multiple well-shutdown tests and site-scale flow simulation in fractured rocks
Tiedeman, Claire; Lacombe, Pierre J.; Goode, Daniel J.
2010-01-01
A new method was developed for conducting aquifer tests in fractured-rock flow systems that have a pump-and-treat (P&T) operation for containing and removing groundwater contaminants. The method involves temporary shutdown of individual pumps in wells of the P&T system. Conducting aquifer tests in this manner has several advantages, including (1) no additional contaminated water is withdrawn, and (2) hydraulic containment of contaminants remains largely intact because pumping continues at most wells. The well-shutdown test method was applied at the former Naval Air Warfare Center (NAWC), West Trenton, New Jersey, where a P&T operation is designed to contain and remove trichloroethene and its daughter products in the dipping fractured sedimentary rocks underlying the site. The detailed site-scale subsurface geologic stratigraphy, a three-dimensional MODFLOW model, and inverse methods in UCODE_2005 were used to analyze the shutdown tests. In the model, a deterministic method was used for representing the highly heterogeneous hydraulic conductivity distribution and simulations were conducted using an equivalent porous media method. This approach was very successful for simulating the shutdown tests, contrary to a common perception that flow in fractured rocks must be simulated using a stochastic or discrete fracture representation of heterogeneity. Use of inverse methods to simultaneously calibrate the model to the multiple shutdown tests was integral to the effectiveness of the approach.
Angular spectral framework to test full corrections of paraxial solutions.
Mahillo-Isla, R; González-Morales, M J
2015-07-01
Different correction methods have been applied to paraxial solutions when such solutions extend beyond the paraxial regime, with authors guided by either experience or educated hypotheses pertinent to the particular problem at hand. This article provides a framework for classifying full-wave correction schemes, so that, for a given solution of the paraxial wave equation, the best available correction scheme can be selected. Some common correction methods are considered and evaluated under the proposed scope. A further contribution is the statement of the necessary conditions that two solutions of the Helmholtz equation must satisfy for a common solution of the parabolic wave equation to be a paraxial approximation of both.
Field test comparison of two dermal tolerance assessment methods of hand hygiene products.
Girard, R; Carré, E; Pires-Cronenberger, S; Bertin-Mandy, M; Favier-Bulit, M C; Coyault, C; Coudrais, S; Billard, M; Regard, A; Kerhoas, A; Valdeyron, M L; Cracco, B; Misslin, P
2008-06-01
This study aimed to compare the sensitivity and workload requirements of two dermal tolerance assessment methods for hand hygiene products, in order to select a suitable pilot testing method for field tests. An observer-rating method and a self-assessment method were compared in 12 volunteer hospital departments (autumn/winter of 2005-2006). Three test periods of three weeks were separated by two-week intervals during which the routine products were reintroduced. The observer-rating method scored dryness and irritation on four-point scales. In the self-assessment method, the user rated appearance, intactness, moisture content, and sensation on a visual analogue scale which was converted into a 10-point numerical scale. Eleven products (soaps) were tested (223/250 complete reports for observer rating, 131/251 for self-assessment). Two products were significantly less well tolerated than the routine product according to the observers, and four products according to the self-assessments. There was no significant difference between the two methods when products were classified according to tolerance (Fisher's test: P=0.491). For the symptom common to both assessment methods (dryness), there was a good correlation between the two methods (Spearman's rho: P=0.032). The workload was higher for the observer-rating method (288 h of observer time plus 122 h of prevention team and pharmacist time, compared with 15 h of prevention team and pharmacist time for self-assessment). In conclusion, the self-assessment method was considered more suitable for pilot testing, although further time should be allocated to educational measures, as the return rate of complete self-assessment forms was poor.
Airside HVAC BESTEST: HVAC Air-Distribution System Model Test Cases for ASHRAE Standard 140
DOE Office of Scientific and Technical Information (OSTI.GOV)
Judkoff, Ronald; Neymark, Joel; Kennedy, Mike D.
This paper summarizes recent work to develop new airside HVAC equipment model analytical verification test cases for ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs. The analytical verification test method allows comparison of simulation results from a wide variety of building energy simulation programs with quasi-analytical solutions, further described below. Standard 140 is widely cited for evaluating software for use with performance-path energy efficiency analysis, in conjunction with well-known energy-efficiency standards including ASHRAE Standard 90.1, the International Energy Conservation Code, and other international standards. Airside HVAC equipment is a common area of modelling not previously explicitly tested by Standard 140. Integration of the completed test suite into Standard 140 is in progress.
Characterization of DUT impedance in immunity test setups
NASA Astrophysics Data System (ADS)
Hassanpour Razavi, Seyyed Ali; Frei, Stephan
2016-09-01
Several immunity test procedures for narrowband radiated electromagnetic energy are available for automotive components. The ISO 11452 series describes the most commonly used test methods. The absorber line shielded enclosure (ALSE) is often considered the most reliable method. However, testing with bulk current injection (BCI) can be done with less effort and is often preferred. As the test setups in both procedures are quite similar, there have been several attempts to find appropriate modifications to the BCI method in order to improve its agreement with the ALSE method. However, the lack of knowledge regarding the impedance of the tested component makes it impossible to find the equivalent current to be injected by the BCI, and a good match cannot be achieved. In this paper, three approaches are proposed to estimate the termination impedance indirectly by using different current probes.
Ranjith, Konduri; Sontam, Bhavani; Sharma, Savitri; Joseph, Joveeta; Chathoth, Kanchana N; Sama, Kalyana C; Murthy, Somasheila I; Shivaji, Sisinthy
2017-08-01
To determine the type of Candida species in ocular infections and to investigate the relationship of antifungal susceptibility profiles to virulence factors. Fifty isolates of yeast-like fungi from patients with keratitis, endophthalmitis, and orbital cellulitis were identified by the Vitek-2 compact system and by DNA sequencing of the ITS1-5.8S-ITS2 regions of the rRNA gene followed by phylogenetic analysis, for phenotypic and genotypic identification, respectively. Minimum inhibitory concentrations of six antifungal drugs were determined by E-test/microbroth dilution methods. Phenotypic and genotypic methods were used to determine the virulence factors. Phylogenetic analysis showed the clustering of all isolates into eight distinct groups, with a major cluster formed by Candida parapsilosis (n = 21), which was the most common species by both Vitek 2 and DNA sequencing. Using the χ2 test, no significant difference was noted between the techniques, except that Vitek 2 did not identify C. viswanathii, C. orthopsilosis, and two non-Candida genera. Of the 43 tested Candida isolates, high susceptibility to amphotericin B (39/43, 90.6%) and natamycin (43/43, 100%) was noted. While none of the isolates produced coagulase, all produced esterase and catalase. The potential to form biofilm was detected in 23/43 (53.4%) isolates. Distribution of virulence factors by heat map analysis showed a difference in the metabolic activity of biofilm producers compared with nonbiofilm producers. C. parapsilosis, identified by both Vitek 2 and DNA sequencing, was the most common species associated with eye infections. Irrespective of the virulence factors elaborated, the Candida isolates were susceptible to commonly used antifungal drugs such as amphotericin B and natamycin.
Jenke, Dennis; Castner, James; Egert, Thomas; Feinberg, Tom; Hendricker, Alan; Houston, Christopher; Hunt, Desmond G; Lynch, Michael; Shaw, Arthur; Nicholas, Kumudini; Norwood, Daniel L; Paskiet, Diane; Ruberto, Michael; Smith, Edward J; Holcomb, Frank
2013-01-01
Polymeric and elastomeric materials are commonly encountered in medical devices and packaging systems used to manufacture, store, deliver, and/or administer drug products. Characterizing extractables from such materials is a necessary step in establishing their suitability for use in these applications. In this study, five individual materials representative of polymers and elastomers commonly used in packaging systems and devices were extracted under conditions and with solvents that are relevant to parenteral and ophthalmic drug products (PODPs). Extraction methods included elevated temperature sealed vessel extraction, sonication, refluxing, and Soxhlet extraction. Extraction solvents included a low-pH (pH = 2.5) salt mixture, a high-pH (pH = 9.5) phosphate buffer, a 1/1 isopropanol/water mixture, isopropanol, and hexane. The resulting extracts were chemically characterized via spectroscopic and chromatographic means to establish the metal/trace element and organic extractables profiles. Additionally, the test articles themselves were tested for volatile organic substances. The results of this testing established the extractables profiles of the test articles, which are reported herein. Trends in the extractables, and their estimated concentrations, as a function of the extraction and testing methodologies are considered in the context of the use of the test article in medical applications and with respect to establishing best demonstrated practices for extractables profiling of materials used in PODP-related packaging systems and devices. Plastic and rubber materials are commonly encountered in medical devices and packaging/delivery systems for drug products. Characterizing the extractables from these materials is an important part of determining that they are suitable for use. 
In this study, five materials representative of plastics and rubbers used in packaging and medical devices were extracted by several means, and the extracts were analytically characterized to establish each material's profile of extracted organic compounds and trace element/metals. This information was utilized to make generalizations about the appropriateness of the test methods and the appropriate use of the test materials.
NASA Astrophysics Data System (ADS)
Palmberg, Irmeli; Berg, Ida; Jeronen, Eila; Kärkkäinen, Sirpa; Norrgård-Sillanpää, Pia; Persson, Christel; Vilkonis, Rytis; Yli-Panula, Eija
2015-10-01
Knowledge of species, interest in nature, and nature experiences are the factors that best promote interest in and understanding of environmental issues, biodiversity and sustainable life. The aim of this study is to investigate how well student teachers identify common local species, their interest in and ideas about species identification, and their perceptions of the importance of species identification and biodiversity for sustainable development. In total, 456 student teachers for primary schools were tested using an identification test and a questionnaire consisting of fixed and open questions. A combination of quantitative and qualitative methods was used to get a more holistic view of students' level of knowledge and their preferred learning methods. The student teachers' ability to identify very common species was low, and only 3% were able to identify most of the tested species. Experiential learning outdoors was suggested by the majority of students as the most efficient learning method, followed by experiential learning indoors, project work and experimental learning. They looked upon the identification of plants and animals as 'important' or 'very important' for citizens today and for sustainable development. Likewise, they looked upon biodiversity as 'important' or 'very important' for sustainable development. Our conclusion is that teaching and learning methods for identification and knowledge of species, and for education on biodiversity and sustainable development, should always include experiential and project-based methods in authentic environments.
Hypothesis testing for band size detection of high-dimensional banded precision matrices.
An, Baiguo; Guo, Jianhua; Liu, Yufeng
2014-06-01
Many statistical analysis procedures require a good estimator of a high-dimensional covariance matrix or of its inverse, the precision matrix. When the precision matrix is banded, the Cholesky-based method often yields a good estimator of the precision matrix. One important aspect of this method is determination of the band size of the precision matrix. In practice, cross-validation is commonly used; however, we show that cross-validation is not only computationally intensive but can also be very unstable. In this paper, we propose a new hypothesis testing procedure to determine the band size in high dimensions. Our proposed test statistic is shown to be asymptotically normal under the null hypothesis, and its theoretical power is studied. Numerical examples demonstrate the effectiveness of our testing procedure.
NASA Astrophysics Data System (ADS)
Majidi, Omid; Jahazi, Mohammad; Bombardier, Nicolas; Samuel, Ehab
2017-10-01
The strain rate sensitivity index, or m-value, is commonly used to evaluate the impact of strain rate on the viscoplastic behaviour of materials. The m-value, treated as a constant, has frequently been used for modeling material behaviour in numerical simulations of superplastic forming processes. However, the impact of testing variables on the measured m-values has not been investigated comprehensively. In this study, the m-value for a superplastic grade of an aluminum alloy (AA5083) has been investigated. The conditions and parameters that influence the strain rate sensitivity of the material are compared across three different testing methods: the monotonic uniaxial tension test, the strain rate jump test and the stress relaxation test. All tests were conducted at elevated temperature (470°C) and at strain rates up to 0.1 s⁻¹. The results show that the m-value is not constant and is highly dependent on the applied strain rate, strain level and testing method.
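The m-value itself is defined as m = ∂ln σ / ∂ln ε̇ (σ the flow stress, ε̇ the strain rate). A minimal two-point estimate, of the kind commonly used with strain rate jump tests, can be sketched as follows (illustrative code; the function name is ours):

```python
import math

def m_value(stress1, rate1, stress2, rate2):
    """Two-point estimate of the strain rate sensitivity index
    m = d(ln sigma) / d(ln strain-rate), from flow stresses measured
    at two strain rates (e.g., before and after a rate jump)."""
    return math.log(stress2 / stress1) / math.log(rate2 / rate1)
```

Because the flow stress at a given rate also depends on the accumulated strain and on how the rate change is imposed, the three testing methods above can yield different m-values from this same formula, which is the paper's central point.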
Rifaximin-resistant Clostridium difficile strains isolated from symptomatic patients.
Reigadas, E; Muñoz-Pacheco, P; Vázquez-Cuesta, S; Alcalá, L; Marín, M; Martin, A; Bouza, E
2017-12-01
Rifaximin has been proposed as an alternative treatment for specific cases of Clostridium difficile infection (CDI) and for intestinal decontamination. Rifaximin-resistant C. difficile has occasionally been reported. Antibiotic susceptibility testing relies on anaerobic agar dilution (the reference method), which is cumbersome and not routinely used, and there is no commercial test for detection of resistance to rifaximin. We aimed to assess resistance to rifaximin in C. difficile and to evaluate the correlation between the results of the rifampicin E-test and susceptibility to rifaximin. We compared the in vitro susceptibility of clinical CDI isolates to rifaximin over a 6-month period using the agar dilution method with susceptibility to rifampicin using the E-test. All isolates were characterized using PCR ribotyping. Clinical data were recorded prospectively. We recovered 276 consecutive C. difficile isolates and found that 32.2% of episodes were caused by rifaximin-resistant strains. The MICs for rifaximin ranged from <0.0009 to >256 mg/L, with a geometric mean (GM) of 0.256 mg/L and an MIC50/90 of 0.015/>256 mg/L. Rifaximin and rifampicin MICs were comparable, and all strains classed as resistant by agar dilution were correctly classified as resistant by E-test. The most common ribotypes were 001 (37.2%), 078/126 (14.3%), and 014 (12.0%). Ribotype 001 exhibited the highest MICs for rifaximin. Resistance to rifaximin was common, and resistance rates were higher in ribotype 001 strains. Susceptibility to rifaximin determined by agar dilution correlated with susceptibility to rifampicin determined using the E-test, including for rifaximin-resistant strains. Our results suggest that the rifampicin E-test is a valid method for predicting rifaximin-resistant C. difficile. Copyright © 2017 Elsevier Ltd. All rights reserved.
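The summary statistics reported above (geometric mean, MIC50/90) are standard in susceptibility surveys. A minimal sketch of how they are computed is shown below (illustrative only; censored MIC values such as "<0.0009" or ">256" need special handling that is omitted here, and the function name is ours):

```python
import math

def mic_summary(mics):
    """MIC50/MIC90 (the lowest concentration inhibiting >=50% / >=90%
    of the isolates, taken from the sorted MIC list) and the geometric
    mean MIC."""
    s = sorted(mics)
    n = len(s)
    idx50 = (n * 50 + 99) // 100  # ceil(0.50 * n), in integer arithmetic
    idx90 = (n * 90 + 99) // 100  # ceil(0.90 * n)
    gm = math.exp(sum(math.log(x) for x in s) / n)
    return s[idx50 - 1], s[idx90 - 1], gm
```

The large gap between MIC50 (0.015 mg/L) and MIC90 (>256 mg/L) reported in the study reflects a strongly bimodal MIC distribution, i.e., a distinct resistant subpopulation.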
Gene set analysis using variance component tests.
Huang, Yen-Tsung; Lin, Xihong
2013-06-28
Gene set analyses have become increasingly important in genomic research, as many complex diseases arise from the joint alteration of numerous genes. Genes often act together as a functional repertoire, e.g., a biological pathway/network, and are highly correlated. However, most of the existing gene set analysis methods do not fully account for the correlation among the genes. Here we propose to exploit this important feature of a gene set to improve statistical power in gene set analyses. We model the effects of an independent variable, e.g., exposure/biological status (yes/no), on multiple gene expression values in a gene set using a multivariate linear regression model, where the correlation among the genes is explicitly modeled using a working covariance matrix. We develop TEGS (Test for the Effect of a Gene Set), a variance component test for gene set effects that assumes a common distribution for the regression coefficients in the multivariate linear regression model, and calculate p-values using permutation and a scaled chi-square approximation. We show using simulations that the type I error is protected under different choices of working covariance matrix and that power improves as the working covariance approaches the true covariance. The global test is a special case of TEGS in which correlation among genes in a gene set is ignored. Using both simulated data and a published diabetes dataset, we show that our test outperforms two commonly used approaches, the global test and gene set enrichment analysis (GSEA). In summary, we develop a gene set analysis method (TEGS) under the multivariate regression framework that directly models the interdependence of expression values in a gene set using a working covariance. TEGS outperforms two widely used methods, GSEA and the global test, in both simulation and a diabetes microarray dataset.
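The permutation p-value calculation mentioned above can be sketched in simplified form. The statistic below is a crude stand-in (the sum over genes of squared group-mean differences), not the actual TEGS variance component score, which also incorporates the working covariance; all names are illustrative.

```python
import random

def set_statistic(group_a, group_b):
    """Sum over genes of squared group-mean differences: a simple
    set-level signal of differential expression between two groups.
    Each group is a list of samples; each sample is a list of
    expression values, one per gene."""
    n_genes = len(group_a[0])
    stat = 0.0
    for g in range(n_genes):
        mean_a = sum(row[g] for row in group_a) / len(group_a)
        mean_b = sum(row[g] for row in group_b) / len(group_b)
        stat += (mean_a - mean_b) ** 2
    return stat

def permutation_pvalue(group_a, group_b, n_perm=200, seed=0):
    """Permutation p-value: repeatedly shuffle the sample labels and
    recompute the statistic; the p-value is the fraction of permuted
    statistics at least as large as the observed one (with the usual
    +1 correction)."""
    rng = random.Random(seed)
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    observed = set_statistic(group_a, group_b)
    exceed = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if set_statistic(pooled[:n_a], pooled[n_a:]) >= observed:
            exceed += 1
    return (1 + exceed) / (1 + n_perm)
```

Permuting whole sample rows, as done here, preserves the inter-gene correlation within each sample, which is the property the paper emphasizes that gene-wise methods ignore.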
Sex differences in mechanical allodynia: how can it be preclinically quantified and analyzed?
Nicotra, Lauren; Tuke, Jonathan; Grace, Peter M.; Rolan, Paul E.; Hutchinson, Mark R.
2014-01-01
Translating promising preclinical drug discoveries into successful clinical trials remains a significant hurdle in pain research. Although animal models have contributed significantly to the understanding of chronic pain pathophysiology, the majority of research has focused on male rodents, using testing procedures that produce sex-difference data that do not align well with comparable clinical experience. Additionally, the use of animal pain models presents ongoing ethical challenges demanding continuing refinement of preclinical methods. To this end, this study sought to test a quantitative allodynia assessment technique and associated statistical analysis in a modified graded nerve injury pain model, with the aim of further examining sex differences in allodynia. Graded allodynia was established in male and female Sprague Dawley rats by altering the number of sutures placed around the sciatic nerve, and was quantified by the von Frey test. Linear mixed effects modeling regressed response on each fixed effect (sex, oestrus cycle, pain treatment). In comparison with other common von Frey assessment techniques, utilizing lower-threshold filaments than those ordinarily tested, applied at 1 s intervals, successfully captured female mechanical allodynia, revealing significant sex and oestrus cycle differences across the graded allodynia that other common behavioral methods were unable to detect. Utilizing this modified von Frey approach and the graded allodynia model, a single suture inflicting less allodynia was sufficient to demonstrate exaggerated female mechanical allodynia throughout the phases of dioestrus and pro-oestrus. Refining the von Frey testing method, the statistical analysis technique and the use of a graded model of chronic pain allowed for examination of the influences on female mechanical nociception that other von Frey methods cannot provide. PMID:24592221
Schön, T; Miotto, P; Köser, C U; Viveiros, M; Böttger, E; Cambau, E
2017-03-01
Drug-resistance testing, or antimicrobial susceptibility testing (AST), is mandatory for Mycobacterium tuberculosis in cases of failure on standard therapy. We reviewed the different methods and techniques of phenotypic and genotypic approaches. Although multiresistant and extensively drug-resistant (MDR/XDR) tuberculosis is present worldwide, AST for M. tuberculosis (AST-MTB) is still performed mainly according to the resources available rather than the drug-resistance rates. Phenotypic methods, i.e. culture-based AST, are commonly used in high-income countries to confirm susceptibility of new cases of tuberculosis. They are also used to detect resistance in tuberculosis cases with risk factors, in combination with genotypic tests. In low-income countries, genotypic methods screening for hot-spot mutations known to confer resistance have been found easier to perform, because they avoid the culture and biosafety constraints. Given that genotypic tests can rapidly detect the prominent mechanisms of resistance, such as the rpoB mutation for rifampicin resistance, we are facing new challenges with the observation of false-resistance (mutations not conferring resistance) and false-susceptibility (mutations differing from the common mechanism) results. Phenotypic and genotypic approaches are therefore complementary for obtaining the high sensitivity and specificity needed to detect drug resistance and susceptibility, to accurately predict MDR/XDR cure, and to gather relevant data for resistance surveillance. Although AST-MTB was established in the 1960s, there is no consensus reference method for MIC determination against which the numerous AST-MTB techniques can be compared. This information is necessary for assessing in vitro activity and setting breakpoints for future anti-tuberculosis agents. Copyright © 2016 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
Rasmussen, Kirsten; Rauscher, Hubert; Mech, Agnieszka; Riego Sintes, Juan; Gilliland, Douglas; González, Mar; Kearns, Peter; Moss, Kenneth; Visser, Maaike; Groenewold, Monique; Bleeker, Eric A J
2018-02-01
Identifying and characterising nanomaterials require additional information on physico-chemical properties and test methods, compared with chemicals in general. Furthermore, regulatory decisions for chemicals are usually based upon certain toxicological properties, and these effects may not be equivalent for nanomaterials. However, regulatory agencies lack an authoritative decision framework for nanomaterials that links the relevance of certain physico-chemical endpoints to toxicological effects. This paper investigates various physico-chemical endpoints and available test methods that could be used to produce such a decision framework for nanomaterials. It presents an overview of the regulatory relevance of, and methods used for, testing fifteen proposed physico-chemical properties of eleven nanomaterials in the OECD Working Party on Manufactured Nanomaterials' Testing Programme, complemented with methods from the literature, and assesses the methods' adequacy and application limits. Most endpoints are of regulatory relevance, though the specific parameters depend on the nanomaterial and the type of assessment. Size (distribution) is the common characteristic of all nanomaterials and is decisive information for classifying a material as a nanomaterial. Shape is an important particle descriptor. The octanol-water partitioning coefficient is undefined for particulate nanomaterials. Methods, including sample preparation, need to be further standardised, and some new methods are needed. The current work of the OECD's Test Guidelines Programme regarding physico-chemical properties is highlighted. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Hardware Demonstration: Radiated Emissions as a Function of Common Mode Current
NASA Technical Reports Server (NTRS)
Mc Closkey, John; Roberts, Jen
2016-01-01
This presentation describes the electromagnetic compatibility (EMC) tests performed on the Integrated Science Instrument Module (ISIM), the science payload of the James Webb Space Telescope (JWST), at NASA's Goddard Space Flight Center (GSFC) in August 2015. Because ISIM is an integrated payload, its testing could be treated as neither a unit-level test nor an integrated spacecraft/observatory test. Non-standard test criteria are described, along with the non-standard test methods that had to be developed in order to evaluate them. Results are presented demonstrating that all test criteria were met in less than the allocated time.
Invited Commentary: Beware the Test-Negative Design.
Westreich, Daniel; Hudgens, Michael G
2016-09-01
In this issue of the Journal, Sullivan et al. (Am J Epidemiol. 2016;184(5):345-353) carefully examine the theoretical justification for use of the test-negative design, a common observational study design, in assessing the effectiveness of influenza vaccination. Using modern causal inference methods (in particular, directed acyclic graphs), they describe different threats to the validity of inferences drawn about the effect of vaccination from test-negative design studies. These threats include confounding, selection bias, and measurement error in either the exposure or the outcome. While confounding and measurement error are common in observational studies, the potential for selection bias inherent in the test-negative design brings into question the validity of inferences drawn from such studies. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Schmidt, Robert L; Factor, Rachel E; Affolter, Kajsa E; Cook, Joshua B; Hall, Brian J; Narra, Krishna K; Witt, Benjamin L; Wilson, Andrew R; Layfield, Lester J
2012-01-01
Diagnostic test accuracy (DTA) studies on fine-needle aspiration cytology (FNAC) often show considerable variability in diagnostic accuracy between study centers. Many factors affect the accuracy of FNAC. A complete description of the testing parameters would help make valid comparisons between studies and determine causes of performance variation. We investigated how test conditions are specified in FNAC DTA studies, to determine which parameters are most commonly specified, how frequently they are specified, and whether there is significant variability in reporting practice. We identified 17 frequently reported test parameters and found significant variation in the reporting of these test specifications across studies. On average, studies reported 5 of the 17 items that would be required to specify the test conditions completely. More complete and standardized reporting of methods, perhaps by means of a checklist, would improve the interpretation of FNAC DTA studies.
Comparison of attrition test methods: ASTM standard fluidized bed vs jet cup
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, R.; Goodwin, J.G. Jr.; Jothimurugesan, K.
2000-05-01
Attrition resistance is one of the key design parameters for catalysts used in fluidized-bed and slurry-phase reactors. The ASTM fluidized-bed test has been one of the most commonly used attrition resistance evaluation methods; however, it requires the use of 50 g samples, a large amount for catalyst development studies. Recently, a test using the jet cup, requiring only 5 g samples, has been proposed. In the present study, two series of spray-dried iron catalysts were evaluated using both the ASTM fluidized-bed test and a test based on the jet cup to determine their comparability. It is shown that the two tests give comparable results. This paper, by reporting a comparison of the jet-cup test with the ASTM standard, provides a basis for utilizing the more efficient jet cup with confidence in catalyst attrition studies.
Chan, Lai Wah; Cheah, Emily LC; Saw, Constance LL; Weng, Wanyu; Heng, Paul WS
2008-01-01
Background: Eight medicinal plants were tested for their antimicrobial and antioxidant activities. Different extraction methods were also tested for their effects on the bioactivities of the medicinal plants. Methods: Eight plants, namely Herba Polygonis Hydropiperis (Laliaocao), Folium Murraya Koenigii (Jialiye), Rhizoma Arachis Hypogea (Huashenggen), Herba Houttuyniae (Yuxingcao), Epipremnum pinnatum (Pashulong), Rhizoma Typhonium Flagelliforme (Laoshuyu), Cortex Magnoliae Officinalis (Houpo) and Rhizoma Imperatae (Baimaogen) were investigated for their potential antimicrobial and antioxidant properties. Results: Extracts of Cortex Magnoliae Officinalis had the strongest activities against M. smegmatis, C. albicans, B. subtilis and S. aureus. Boiled extracts of Cortex Magnoliae Officinalis, Folium Murraya Koenigii, Herba Polygonis Hydropiperis and Herba Houttuyniae demonstrated greater antioxidant activities than the other tested medicinal plants. Conclusion: Among the eight tested medicinal plants, Cortex Magnoliae Officinalis showed the highest antimicrobial and antioxidant activities. Different methods of extraction yield different spectra of bioactivities. PMID:19038060
NASA Technical Reports Server (NTRS)
Carpenter, J. L., Jr.; Stuhrke, W. F.
1976-01-01
Technical abstracts are presented for about 100 significant documents relating to nondestructive testing of aircraft structures or related structural testing and the reliability of the more commonly used evaluation methods. Particular attention is directed toward acoustic emission, liquid penetrant, magnetic particle, ultrasonics, eddy current, and radiography. The introduction of the report includes an overview of the state of the art represented in the abstracted documents.
The most common mistakes on dermatoscopy of melanocytic lesions
Kamińska-Winciorek, Grażyna
2015-01-01
Dermatoscopy is a method of in vivo evaluation of structures within the epidermis and dermis. Currently, it may be the most precise pre-surgical method of diagnosing melanocytic lesions. Diagnostic errors may result in unnecessary removal of benign lesions or, worse, cause early and very early melanomas to be overlooked. Errors in dermatoscopic assessment can be divided into those arising from failure to maintain proper test procedures (procedural and technical errors) and knowledge-based mistakes related to a lack of sufficient familiarity and experience in dermatoscopy. The article discusses the most common mistakes made by beginner or inexperienced dermatoscopists. PMID:25821425
Mohd Nasir, Mohd Desa; Parasakthi, Navaratnam
2004-06-01
The increasing prevalence of penicillin-resistant Streptococcus pneumoniae calls for fast and accurate susceptibility testing methods. This study evaluated the comparability of three commonly used techniques, disk diffusion, E-test and agar dilution, for detecting penicillin susceptibility in clinical isolates of S. pneumoniae. Fifty pneumococcal isolates, obtained from patients at the University of Malaya Medical Centre, were selected to include both penicillin-susceptible strains and those with decreased susceptibility (resistant and intermediate) to penicillin. The minimum inhibitory concentration (MIC) values of penicillin, serving as the reference, were determined by the agar dilution method; based on the MIC breakpoints recommended by the National Committee for Clinical Laboratory Standards (NCCLS), 27 strains had decreased susceptibility to penicillin, 17 resistant and 10 intermediate. Compared with the agar dilution method, the oxacillin disk diffusion test detected all strains with decreased penicillin susceptibility, while the E-test showed close agreement (92%) in the penicillin susceptibility of the isolates. This confirms that oxacillin is a good screening test for S. pneumoniae isolates with decreased susceptibility to penicillin, while the E-test is very reliable for rapid and accurate detection of penicillin susceptibility.
ERIC Educational Resources Information Center
Finch, Holmes; Stage, Alan Kirk; Monahan, Patrick
2008-01-01
A primary assumption underlying several of the common methods for modeling item response data is unidimensionality, that is, test items tap into only one latent trait. This assumption can be assessed several ways, using nonlinear factor analysis and DETECT, a method based on the item conditional covariances. When multidimensionality is identified,…
ERIC Educational Resources Information Center
Ozdemir, Burhanettin
2017-01-01
The purpose of this study is to equate Trends in International Mathematics and Science Study (TIMSS) mathematics subtest scores obtained from TIMSS 2011 to scores obtained from TIMSS 2007 form with different nonlinear observed score equating methods under Non-Equivalent Anchor Test (NEAT) design where common items are used to link two or more test…
Confidence Limits for the Indirect Effect: Distribution of the Product and Resampling Methods
ERIC Educational Resources Information Center
MacKinnon, David P.; Lockwood, Chondra M.; Williams, Jason
2004-01-01
The most commonly used method to test an indirect effect is to divide the estimate of the indirect effect by its standard error and compare the resulting z statistic with a critical value from the standard normal distribution. Confidence limits for the indirect effect are also typically based on critical values from the standard normal…
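The normal-theory procedure the abstract describes, dividing the indirect effect by its standard error and comparing the z statistic to a standard normal critical value, can be sketched as follows. The path estimates below are made-up illustrative values, `sobel_test` is a hypothetical helper name, and the first-order delta-method (Sobel) standard error is assumed:

```python
import math

def sobel_test(a, sa, b, sb, z_crit=1.959964):
    """Normal-theory (Sobel) test and confidence limits for the
    indirect effect a*b.

    a, b   : path coefficient estimates
    sa, sb : their standard errors
    """
    indirect = a * b
    # First-order delta-method (Sobel) standard error of a*b
    se = math.sqrt(b * b * sa * sa + a * a * sb * sb)
    z = indirect / se
    ci = (indirect - z_crit * se, indirect + z_crit * se)
    return indirect, se, z, ci

# Illustrative, made-up path estimates and standard errors
indirect, se, z, ci = sobel_test(a=0.4, sa=0.1, b=0.3, sb=0.1)
# indirect = 0.12, se = 0.05, z = 2.4
```

The abstract's point is that this symmetric normal-theory interval can be inaccurate because the distribution of a product of coefficients is skewed, which motivates the distribution-of-the-product and resampling alternatives named in the title.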
The Effectiveness of Circular Equating as a Criterion for Evaluating Equating.
ERIC Educational Resources Information Center
Wang, Tianyou; Hanson, Bradley A.; Harris, Deborah J.
Equating a test form to itself through a chain of equatings, commonly referred to as circular equating, has been widely used as a criterion to evaluate the adequacy of equating. This paper uses both analytical methods and simulation methods to show that this criterion is in general invalid in serving this purpose. For the random groups design done…
Effects of three mulch treatments on initial postfire erosion in north-central Arizona
George H. Riechers; Jan L. Beyers; Peter R. Robichaud; Karen Jennings; Erin Kreutz; Jeff Moll
2008-01-01
Mulching after wildfires is a common treatment designed to protect bare ground from raindrop impact and reduce subsequent erosion. We tested the effectiveness of three mulching methods on the Indian Fire near Prescott, Arizona, USA. The first method felled all fire-killed trees, chipped the logs and limbs, and spread the chips across the hillslope with a mobile...
Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS)
Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M.; Khan, Ajmal
2016-01-01
Objective: This article compares the study designs and statistical methods used in the 2005, 2010 and 2015 volumes of the Pakistan Journal of Medical Sciences (PJMS). Methods: Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. Results: A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods were found in the analysis. The most frequent methods include: descriptive statistics (n=315, 73.4%), chi-square/Fisher’s exact tests (n=205, 47.8%) and Student’s t-test (n=186, 43.4%). There was a significant increase in the use of several statistical methods over the time period: t-test, chi-square/Fisher’s exact test, logistic regression, epidemiological statistics, and non-parametric tests. Conclusion: This study shows that a diverse variety of statistical methods have been used in the research articles of PJMS and that their frequency improved from 2005 to 2015. However, descriptive statistics were the most frequent method of statistical analysis, and the cross-sectional design was the most common study design in the published articles. PMID:27022365
Barker, Timothy Hugh; Howarth, Gordon Stanley; Whittaker, Alexandra Louise
2018-01-01
Extinction of learning is a common, yet under-reported, limitation of judgment bias testing methods. Repeated exposure to the ambiguous probe of a judgment bias paradigm encourages the animal to cease displaying the required behaviours. However, there remains a need to test animals repeatedly to achieve statistical power. A delicate balance therefore needs to be struck between over- and under-exposure of the animals to the test conditions. This study presents data from rats, a common animal subject of judgment bias testing. Rats were exposed to the ambiguous probe of a common, active-choice judgment bias test for 11 consecutive days. There was a significant increase in the latency to respond to the ambiguous probe following day 8, with no significant increase for either the positive or less-positive probes. Following day 8 there was a significant increase in both optimistic and pessimistic latencies in response to the ambiguous probe. Therefore, repeated exposure to the ambiguous probe caused an increased response latency even though optimistic interpretations were recorded. This implies that the use of response latency alone as a measure in judgment bias testing can falsely identify pessimism. Researchers should modify experimental designs to include both choice and latency measures. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
Wiuf, Carsten; Schaumburg-Müller Pallesen, Jonatan; Foldager, Leslie; Grove, Jakob
2016-08-01
In many areas of science it is customary to perform many, potentially millions, of tests simultaneously. To gain statistical power it is common to group tests based on a priori criteria such as predefined regions or by sliding windows. However, it is not straightforward to choose grouping criteria, and the results might depend on the chosen criteria. Methods that summarize, or aggregate, test statistics or p-values without relying on a priori criteria are therefore desirable. We present a simple method to aggregate a sequence of stochastic variables, such as test statistics or p-values, into fewer variables without assuming a priori defined groups. We provide different ways to evaluate the significance of the aggregated variables based on theoretical considerations and resampling techniques, and show that under certain assumptions the FWER is controlled in the strong sense. Validity of the method was demonstrated using simulations and real data analyses. Our method may be a useful supplement to standard procedures relying on evaluation of test statistics individually. Moreover, by being agnostic and not relying on predefined selected regions, it might be a practical alternative to conventionally used methods of aggregation of p-values over regions. The method is implemented in Python and freely available online (through GitHub, see the Supplementary information).
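The conventional a priori grouping the abstract contrasts with, combining p-values within predefined sliding windows, can be sketched with Fisher's method. This is not the authors' aggregation scheme (which the abstract does not fully specify); the function names are illustrative, and the closed-form chi-square tail used here holds because combining k p-values gives an even number (2k) of degrees of freedom:

```python
import math

def fisher_combine(pvals):
    """Fisher's method: combine p-values into one chi-square test
    with 2k degrees of freedom (k = number of p-values)."""
    x = -2.0 * sum(math.log(p) for p in pvals)
    k = len(pvals)
    # Chi-square survival function for even df = 2k has a closed form:
    # P(X > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!
    term, total = 1.0, 0.0
    for i in range(k):
        total += term
        term *= (x / 2.0) / (i + 1)
    return total * math.exp(-x / 2.0)

def windowed_scan(pvals, width):
    """Combined p-value for each sliding window over the sequence,
    the a priori grouping strategy the paper seeks to avoid."""
    return [fisher_combine(pvals[i:i + width])
            for i in range(len(pvals) - width + 1)]

print(windowed_scan([0.9, 0.04, 0.03, 0.8], 2))
```

Note that the resulting window-level p-values are correlated (windows overlap) and their number depends on the chosen width, which is exactly the sensitivity to grouping criteria that motivates the agnostic aggregation the abstract proposes.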
Visvanathan, Rizliya; Jayathilake, Chathuni; Liyanage, Ruvini
2016-11-15
For the first time, a reliable, simple, rapid and high-throughput analytical method for the detection and quantification of α-amylase inhibitory activity using the glucose assay kit was developed. The new method facilitates rapid screening of a large number of samples, reduces labor, time and reagents and is also suitable for kinetic studies. This method is based on the reaction of maltose with glucose oxidase (GOD) and the development of a red quinone. The test is done in microtitre plates with a total volume of 260 μL and an assay time of 40 min including the pre-incubation steps. The new method is tested for linearity, sensitivity, precision, reproducibility and applicability. The new method is also compared with the most commonly used 3,5-dinitrosalicylic acid (DNSA) method for determining α-amylase activity. Copyright © 2016 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Chard, Melissa; Roulin, Jean-Luc; Bouvard, Martine
2014-01-01
Background: The use of common psychological assessment tools is invalidated with persons with PIMD. The aim of this study was to test the feasibility of using a visual habituation procedure with a group of adults with PIMD, to develop a new theoretical and practical framework for the assessment of cognitive abilities. Methods: To test the…
ERIC Educational Resources Information Center
Moses, Tim; Liu, Jinghua
2011-01-01
In equating research and practice, equating functions that are smooth are typically assumed to be more accurate than equating functions with irregularities. This assumption presumes that population test score distributions are relatively smooth. In this study, two examples were used to reconsider common beliefs about smoothing and equating. The…
Statistical considerations for harmonization of the global multicenter study on reference values.
Ichihara, Kiyoshi
2014-05-15
The global multicenter study on reference values coordinated by the Committee on Reference Intervals and Decision Limits (C-RIDL) of the IFCC was launched in December 2011, targeting 45 commonly tested analytes with the following objectives: 1) to derive reference intervals (RIs) country by country using a common protocol, and 2) to explore regionality/ethnicity of reference values by aligning test results among the countries. To achieve these objectives, it is crucial to harmonize 1) the protocol for recruitment and sampling, 2) statistical procedures for deriving the RI, and 3) test results through measurement of a panel of sera in common. For harmonized recruitment, very lenient inclusion/exclusion criteria were adopted in view of differences in interpretation of what constitutes healthiness by different cultures and investigators. This policy may require secondary exclusion of individuals according to the standard of each country at the time of deriving RIs. An iterative optimization procedure, called the latent abnormal values exclusion (LAVE) method, can be applied to automate the process of refining the choice of reference individuals. For global comparison of reference values, test results must be harmonized, based on the among-country, pair-wise linear relationships of test values for the panel. Traceability of reference values can be ensured based on values assigned indirectly to the panel through collaborative measurement of certified reference materials. The validity of the adopted strategies is discussed in this article, based on interim results obtained to date from five countries. Special considerations are made for dissociation of RIs by parametric and nonparametric methods and between-country difference in the effect of body mass index on reference values. Copyright © 2014 Elsevier B.V. All rights reserved.
Dellanno, Christine; Vega, Quinn; Boesenberg, Diane
2009-10-01
The 2003 outbreak of severe acute respiratory syndrome (SARS) infected over 8000 people and killed 774. Transmission of SARS occurred through direct and indirect contact and large droplet nuclei. The World Health Organization recommended the use of household disinfectants, which had not been previously tested against SARS coronavirus (SARS-CoV), to disinfect potentially contaminated environmental surfaces. A surrogate test system is needed given the limited availability of SARS-CoV for testing and the biosafety requirements necessary to handle it safely. In this study, the antiviral activity of standard household products was assayed against murine hepatitis virus (MHV), a potential surrogate for SARS-CoV. A surface test method, which involves drying an amount of virus on a surface and then applying the product for a specific contact time, was used to determine virucidal activity. Virus titers and log reductions were determined by the Reed and Muench tissue culture infective dose (TCID50) endpoint method. When tested as directed, common household disinfectants or antiseptics, containing either 0.050% triclosan, 0.12% PCMX, 0.21% sodium hypochlorite, 0.23% pine oil, or 0.10% of a quaternary compound with 79% ethanol, demonstrated a 3-log reduction or better against MHV, with no virus recovered, at a 30-second contact time. Common household disinfectants and antiseptics were effective at inactivating MHV, a possible surrogate for SARS-CoV, on surfaces when used as directed. In an outbreak caused by novel agents, it is important to know the effectiveness of disinfectants and antiseptics in preventing or reducing the possibility of human-to-human transmission via surfaces.
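The Reed and Muench endpoint calculation named in the abstract is linear interpolation on pooled well counts; a minimal sketch with made-up titration data (the function name is illustrative):

```python
def reed_muench_tcid50(log_dilutions, infected, total):
    """Reed-Muench 50% endpoint, returned as a log10 dilution.

    log_dilutions : log10 of each dilution, most concentrated first,
                    e.g. [-1, -2, -3, -4, -5]
    infected/total: positive wells and wells inoculated per dilution
    """
    n = len(infected)
    uninfected = [t - i for i, t in zip(infected, total)]
    # Reed-Muench pooling: infected accumulate toward higher dilutions,
    # uninfected accumulate toward lower dilutions.
    cum_inf = [sum(infected[i:]) for i in range(n)]
    cum_uninf = [sum(uninfected[:i + 1]) for i in range(n)]
    pct = [100.0 * ci / (ci + cu) for ci, cu in zip(cum_inf, cum_uninf)]
    for i in range(n - 1):
        if pct[i] >= 50.0 > pct[i + 1]:
            # Proportionate distance between the bracketing dilutions
            pd = (pct[i] - 50.0) / (pct[i] - pct[i + 1])
            return log_dilutions[i] + pd * (log_dilutions[i + 1] - log_dilutions[i])
    raise ValueError("titration does not bracket the 50% endpoint")

# Made-up well counts: 8 wells per 10-fold dilution step
titre = reed_muench_tcid50([-1, -2, -3, -4, -5], [8, 8, 6, 2, 0], [8] * 5)
# titre = -3.5, i.e. the 50% endpoint falls at the 10**-3.5 dilution
```

The log reductions reported in the abstract are then differences between such endpoint titres measured with and without the disinfectant treatment.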
Reiter, Paul L.; McRee, Annie-Laurie
2017-01-01
Objective Lesbian and bisexual women are at risk for human papillomavirus (HPV) infection and cervical disease. We examined Pap testing among these women and their acceptability of HPV self-testing at home, a potential cervical cancer screening strategy. Methods We analyzed data from a national sample of lesbian and bisexual women ages 21–26 who completed our online survey during Fall 2013 (n=418). Logistic regression identified correlates of: 1) receipt of a Pap test in the last three years; and 2) willingness to use an HPV self-test at home. Results About 70% of women had received a Pap test in the last three years. Pap testing was more common among women who had disclosed their sexual orientation to their healthcare provider (OR=2.01, 95% CI: 1.02–3.95) and less common among women who self-identified as lesbian (OR=0.48, 95% CI: 0.25–0.93). Just over half of women (51%) were willing to use an HPV self-test at home. Women were more willing to use an HPV self-test at home if they were older (OR=1.16, 95% CI: 1.03–1.30) or reported higher levels of worry about getting an HPV-related disease (OR=1.28, 95% CI: 1.01–1.63). The most common concerns about HPV self-testing at home were using the test incorrectly (70%) and test accuracy (64%). Conclusions Many young lesbian and bisexual women have not received a recent Pap test. HPV self-testing at home may be a promising future strategy for reaching and screening these women. Findings highlight beliefs and concerns that could be addressed by self-test programs. PMID:25385868
Developing a test method to investigate water susceptibility of joint and crack sealants.
DOT National Transportation Integrated Search
2016-10-31
Sealants are commonly used to seal cracks and joints, preventing water from entering the underlying structure. However, extended exposure of sealants to water has been shown to negatively impact sealant properties, causing gradual degradation of sealan...
Test procedure for determining organic matter content in soils : UV-VIS method.
DOT National Transportation Integrated Search
2010-11-01
The Texas Department of Transportation has been having problems with organic matter in soils that they : stabilize for use as subgrade layers in road construction. The organic matter reduces the effectiveness of : common soil additives (lime/cement) ...
DEVELOPMENT OF A MOLECULAR METHOD TO IDENTIFY ASTROVIRUS IN WATER
Astrovirus is a common cause of gastroenteritis that has been determined to be responsible for several outbreaks. Since astrovirus can be waterborne, there is interest in testing environmental water for astrovirus. We have developed a sensitive reverse transcription-polymerase ...
Resilient moduli of typical Missouri soils and unbound granular base materials.
DOT National Transportation Integrated Search
2009-01-01
The objective of this project was to determine the resilient moduli for common Missouri subgrade soils and typical unbound granular base materials in accordance with the AASHTO T 307 test method. The results allow Missouri Department of Transportatio...
Armour, John A. L.; Palla, Raquel; Zeeuwen, Patrick L. J. M.; den Heijer, Martin; Schalkwijk, Joost; Hollox, Edward J.
2007-01-01
Recent work has demonstrated an unexpected prevalence of copy number variation in the human genome, and has highlighted the part this variation may play in predisposition to common phenotypes. Some important genes vary in number over a high range (e.g. DEFB4, which commonly varies between two and seven copies), and have posed formidable technical challenges for accurate copy number typing, so that there are no simple, cheap, high-throughput approaches suitable for large-scale screening. We have developed a simple comparative PCR method based on dispersed repeat sequences, using a single pair of precisely designed primers to amplify products simultaneously from both test and reference loci, which are subsequently distinguished and quantified via internal sequence differences. We have validated the method for the measurement of copy number at DEFB4 by comparison of results from >800 DNA samples with copy number measurements by MAPH/REDVR, MLPA and array-CGH. The new Paralogue Ratio Test (PRT) method can require as little as 10 ng genomic DNA, appears to be comparable in accuracy to the other methods, and for the first time provides a rapid, simple and inexpensive method for copy number analysis, suitable for application to typing thousands of samples in large case-control association studies. PMID:17175532
Evaluation of surface renewal and flux-variance methods above agricultural and forest surfaces
NASA Astrophysics Data System (ADS)
Fischer, M.; Katul, G. G.; Noormets, A.; Poznikova, G.; Domec, J. C.; Trnka, M.; King, J. S.
2016-12-01
Measurements of turbulent surface energy fluxes are of high interest in agricultural and forest research. During the last decades, eddy covariance (EC) has been adopted as the most commonly used micrometeorological method for measuring fluxes of greenhouse gases, energy and other scalars at the surface-atmosphere interface. Despite its robustness and accuracy, the cost of EC hinders its deployment in some research experiments and in practice, e.g. for irrigation scheduling. Therefore, testing and development of other cost-effective methods is of high interest. In our study, we tested the performance of the surface renewal (SR) and flux-variance (FV) methods for estimating sensible heat flux density. The surface renewal method is based on the concept of non-random transport of scalars via so-called coherent structures which, if accurately identified, can be used to compute the associated flux. The flux-variance method predicts the flux from the scalar variance following surface-layer similarity theory. We tested SR and FV against EC in three types of ecosystem with very distinct aerodynamic properties. The first site was an agricultural wheat field in the Czech Republic. The second site was a 20-m tall mixed deciduous wetland forest on the coast of North Carolina, USA. The third site was a pine-switchgrass intercropping agro-forestry system located in the coastal plain of North Carolina, USA. Apart from resolving the coherent structures in the SR framework from structure functions (the most common approach), we applied a wavelet-based ramp detection scheme to test the hypothesis that the durations and amplitudes of the coherent structures are normally distributed within particular 30-minute intervals, so that estimates of their averages suffice for accurate flux determination. Further, we tested whether orthonormal wavelet thresholding can be used to isolate the coherent-structure scales associated with flux transport. Finally, we tested whether low-pass filtering in the Fourier domain based on the integral length scale can improve the estimates of both SR and FV, as it supposedly removes the low-frequency portion of the signal not related to the investigated fluxes.
Proposed Objective Odor Control Test Methodology for Waste Containment
NASA Technical Reports Server (NTRS)
Vos, Gordon
2010-01-01
The Orion Cockpit Working Group has requested that an odor control testing methodology be proposed to evaluate the odor containment effectiveness of waste disposal bags to be flown on the Orion Crew Exploration Vehicle. As a standardized "odor containment" test does not appear to be a matter of record for the project, a new test method is being proposed. This method is based on existing test methods used in industrial hygiene for the evaluation of respirator fit in occupational settings, and takes into consideration peer-reviewed documentation of human odor thresholds for standardized contaminants, industry-standard atmospheric testing methodologies, and established criteria for laboratory analysis. The proposed methodology is quantitative, though it can readily be complemented with a qualitative subjective assessment. Isoamyl acetate (IAA, also known as isopentyl acetate) is commonly used in respirator fit testing, and there are documented methodologies for measuring its airborne concentrations. IAA is a clear, colorless liquid with a banana-like odor, a documented human odor detection threshold of 0.025 ppm, and a 15 ppb limit of quantitation.
Tornambè, A; Manfra, L; Canepa, S; Oteri, F; Martuccio, G; Cicero, A M; Magaletti, E
2018-02-01
The OECD TG 215 method (2000) (C.14 method of EC Regulation 440/2008) was developed on the rainbow trout (Oncorhynchus mykiss) to assess the chronic toxicity (28 d) of chemicals to juvenile fish. It allows the use of other well-documented species, provided suitable conditions for evaluating their growth are identified. The OECD proposes the European sea bass (Dicentrarchus labrax, L. 1758) as a Mediterranean species among the vertebrates recommended in the OECD guidelines for the toxicity testing of chemicals. In this context, our study aims to propose the adaptation of the growth test (OECD TG 215, 2000) to D. labrax. For this purpose, toxicity tests were performed with sodium dodecyl sulfate, a reference toxicant commonly used in fish toxicity assays. The main aspects of the testing procedure were reviewed: fish size (weight), environmental conditions, dilution water type, experimental design, loading rate and stocking density, feeding (food type and ration), and test validity criteria. The experience gained from growth tests with the sea bass supports its inclusion among the species to be used for the C.14 method. Copyright © 2016. Published by Elsevier Inc.
A general statistical test for correlations in a finite-length time series.
Hanson, Jeffery A; Yang, Haw
2008-06-07
The statistical properties of the autocorrelation function of a time series composed of independently and identically distributed stochastic variables have been studied. Analytical expressions for the autocorrelation function's variance have been derived. It has been found that two common ways of calculating the autocorrelation, moving-average and Fourier transform, exhibit different uncertainty characteristics. For periodic time series, the Fourier transform method is preferred because it gives smaller uncertainties that are uniform through all time lags. Based on these analytical results, a statistically robust method has been proposed to test for the existence of correlations in a time series. The statistical test is verified by computer simulations, and an application to single-molecule fluorescence spectroscopy is discussed.
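The two estimators the abstract compares can be sketched in pure Python. The function names are illustrative, and the Fourier-transform route is represented here by its circular (wrap-around) definition, which is what an unpadded FFT computes via the Wiener-Khinchin relation; the difference in how the two handle the series end is one source of their differing uncertainty characteristics:

```python
def autocorr_direct(x, max_lag):
    """Moving-average (direct) estimator: average of lagged products,
    normalised by the number of overlapping pairs at each lag."""
    n = len(x)
    m = sum(x) / n
    d = [v - m for v in x]
    var = sum(v * v for v in d) / n
    return [sum(d[i] * d[i + k] for i in range(n - k)) / ((n - k) * var)
            for k in range(max_lag + 1)]

def autocorr_circular(x, max_lag):
    """Circular estimator, equivalent to an unpadded FFT approach:
    lagged products wrap around the end of the series."""
    n = len(x)
    m = sum(x) / n
    d = [v - m for v in x]
    var = sum(v * v for v in d) / n
    return [sum(d[i] * d[(i + k) % n] for i in range(n)) / (n * var)
            for k in range(max_lag + 1)]

# For a strictly periodic series the two estimators agree, consistent
# with the abstract's preference for the Fourier method in that case.
x = [1.0, 0.0, -1.0, 0.0] * 4
print(autocorr_direct(x, 2), autocorr_circular(x, 2))
```

In production code one would replace the O(n²) sums with an FFT-based computation for long series; the sketch keeps the two definitions explicit instead.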
The application of systems thinking in health: why use systems thinking?
Peters, David H
2014-08-26
This paper explores the question of what systems thinking adds to the field of global health. Observing that elements of systems thinking are already common in public health research, the article discusses which of the large body of theories, methods, and tools associated with systems thinking are more useful. The paper reviews the origins of systems thinking, describing a range of the theories, methods, and tools. A common thread is the idea that the behavior of systems is governed by common principles that can be discovered and expressed. They each address problems of complexity, which is a frequent challenge in global health. The different methods and tools are suited to different types of inquiry and involve both qualitative and quantitative techniques. The paper concludes by emphasizing that explicit models used in systems thinking provide new opportunities to understand and continuously test and revise our understanding of the nature of things, including how to intervene to improve people's health.
Comparison of statistical tests for association between rare variants and binary traits.
Bacanu, Silviu-Alin; Nelson, Matthew R; Whittaker, John C
2012-01-01
Genome-wide association studies have found thousands of common genetic variants associated with a wide variety of diseases and other complex traits. However, a large portion of the predicted genetic contribution to many traits remains unknown. One plausible explanation is that some of the missing variation is due to the effects of rare variants. Nonetheless, the statistical analysis of rare variants is challenging. A commonly used method is to contrast, within the same region (gene), the frequency of minor alleles at rare variants between cases and controls. However, this strategy is most useful under the assumption that the tested variants have similar effects. We previously proposed a method that can accommodate heterogeneous effects in the analysis of quantitative traits. Here we extend this method to binary traits in a way that can accommodate covariates. We use simulations for a variety of causal and covariate impact scenarios to compare the performance of the proposed method to standard logistic regression, C-alpha, SKAT, and EREC. We found that (i) logistic regression methods perform well when the heterogeneity of the effects is not extreme and (ii) SKAT and EREC have good performance under all tested scenarios, but they can be computationally intensive. Consequently, it would be more computationally desirable to use a two-step strategy: (i) select promising genes using faster methods and (ii) analyze the selected genes using SKAT/EREC. To select promising genes one can use (1) regression methods when effect heterogeneity is assumed to be low and the covariates explain a non-negligible part of trait variability, (2) C-alpha when heterogeneity is assumed to be large and covariates explain a small fraction of the trait's variability, and (3) the proposed trend and heterogeneity test when the heterogeneity is assumed to be non-trivial and the covariates explain a large fraction of trait variability.
Use of the dynamic stiffness method to interpret experimental data from a nonlinear system
NASA Astrophysics Data System (ADS)
Tang, Bin; Brennan, M. J.; Gatti, G.
2018-05-01
The interpretation of experimental data from nonlinear structures is challenging, primarily because of dependency on the types and levels of excitation, and coupling issues with test equipment. In this paper, the dynamic stiffness method, commonly applied in the analysis of linear systems, is used to interpret the data from a vibration test of a controllable compressed beam structure coupled to a test shaker. For a single mode of the system, this method facilitates the separation of mass, stiffness and damping effects, including nonlinear stiffness effects. It also allows the dynamics of the shaker to be separated from those of the structure under test. The approach needs to be used with care, and is only suitable if the nonlinear system has a response that is predominantly at the excitation frequency. For the structure under test, the raw experimental data revealed little about the underlying causes of the dynamic behaviour. However, the dynamic stiffness approach allowed the effects due to the nonlinear stiffness to be easily determined.
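For a single linear mode, the dynamic stiffness K(w) = F(w)/X(w) of a mass-spring-damper is (k - m*w^2) + i*c*w, so its real part separates mass and stiffness effects while its imaginary part isolates damping. A minimal noise-free sketch, with hypothetical parameter values (not the paper's beam structure):

```python
import numpy as np

# Hypothetical single-degree-of-freedom system m*x'' + c*x' + k*x = f
# (values invented for illustration)
m_true, c_true, k_true = 2.0, 5.0, 8.0e3
omega = np.linspace(50.0, 400.0, 200)                 # rad/s

# Dynamic stiffness K(w) = F/X = (k - m*w^2) + i*c*w
K = (k_true - m_true * omega**2) + 1j * c_true * omega

# Re(K) is linear in w^2: slope = -m, intercept = k
slope, intercept = np.polyfit(omega**2, K.real, 1)
m_est, k_est = -slope, intercept
# Im(K)/w is constant and equals c
c_est = float(np.mean(K.imag / omega))
print(round(m_est, 4), round(k_est, 2), round(c_est, 4))
```

With measured data the same separation is done by fitting, and a stiffness nonlinearity shows up as an amplitude-dependent intercept k.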
Draft Plan to Develop Non-Intrusive Load Monitoring Test Protocols
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayhorn, Ebony T.; Sullivan, Greg P.; Petersen, Joseph M.
2015-09-29
This document presents a Draft Plan proposed to develop a common test protocol that can be used to evaluate the performance requirements of Non-Intrusive Load Monitoring (NILM). Development of the test protocol will focus on providing a consistent method that can be used to quantify and compare the performance characteristics of NILM products. Elements of the protocol include specifications for the appliances to be used, metrics, instrumentation, and a procedure to simulate appliance behavior during tests. In addition, three priority use cases for NILM will be identified and their performance requirements will be specified.
Tests for a disease-susceptibility locus allowing for an inbreeding coefficient (F).
Song, Kijoung; Elston, Robert C
2003-11-01
We begin by discussing the false positive test results that arise because of cryptic relatedness and population substructure when testing a disease susceptibility locus. We extend and evaluate the Hardy-Weinberg disequilibrium (HWD) method, allowing for an inbreeding coefficient (F) much as Devlin and Roeder (1999) allowed for inbreeding in a case-control study. Then we compare the HWD measure and the common direct measure of linkage disequilibrium, both when there is no population substructure (F = 0) and when there is population substructure (F ≠ 0), for a single marker. The HWD test statistic gives rise to false positives caused by population stratification. These false positives can be controlled by adjusting the test statistic for the amount of variance inflation caused by the inbreeding coefficient (F). The power loss that arises for the HWD test when controlling for population structure is much less than that which arises for the common direct measure of linkage disequilibrium. However, under the multiplicative model, the HWD test has virtually no power even when allowing for non-zero F.
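A sketch of the basic HWD chi-square for a single biallelic marker, with an optional variance-inflation divisor standing in for the F-adjustment (a genomic-control-style simplification of the paper's approach; the genotype counts are invented):

```python
import numpy as np
from scipy import stats

def hwd_chi2(n_AA, n_Aa, n_aa, inflation=1.0):
    """Hardy-Weinberg disequilibrium chi-square for one biallelic marker.
    `inflation` deflates the statistic in genomic-control style, a crude
    stand-in for adjusting the variance for the inbreeding coefficient F."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)                   # frequency of allele A
    expected = np.array([n * p**2, 2 * n * p * (1 - p), n * (1 - p)**2])
    observed = np.array([n_AA, n_Aa, n_aa])
    chi2 = float(((observed - expected) ** 2 / expected).sum()) / inflation
    return chi2, float(stats.chi2.sf(chi2, df=1))

# Invented genotype counts exactly at Hardy-Weinberg proportions (p = 0.5)
chi2, pv = hwd_chi2(250, 500, 250)
print(chi2, pv)
```

Setting `inflation` above 1 mimics the correction for stratification: it shrinks the statistic so that excess heterozygote deficit induced by substructure is not mistaken for a disease-locus signal.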
Brocious, Jeffrey; Tarver, Michelle E; Hampton, Denise; Eydelman, Malvina
2018-04-24
With the increasing incidence of pathogens that can cause microbial keratitis (MK), it is necessary to periodically reassess the testing requirements for multipurpose disinfection solutions to ensure that they are challenged with relevant organisms. Current testing protocols have included common pathogens such as Pseudomonas aeruginosa, Staphylococcus aureus, Serratia marcescens, Candida albicans, and Fusarium solani but have omitted less common pathogens such as Acanthamoeba. Specifically, Acanthamoeba sp. has recently been identified as a prevalent cause of MK in certain countries. Developing an appropriate protocol for this unique organism presents a challenge, given its two distinct life stages, the methods used to grow the organism, encystment techniques, and many other parameters that can affect testing outcomes. The appropriate combination of these parameters is therefore crucial to developing a protocol that ensures consistent, accurate results. The FDA has recognized the importance of establishing a standardized protocol for this pathogen and has embarked on research efforts to provide a recommended protocol for testing contact lens care products.
Shtessel, Maria; Lobell, Elizabeth; Hudes, Golda; Rosenstreich, David; de Vos, Gabriele
2017-01-01
Background: Allergists commonly perform intradermal skin testing (IDST) after negative skin-prick testing (SPT) to comprehensively diagnose environmental allergic sensitization. However, with the availability of modern methods to detect serum-specific immunoglobulin E (ssIgE), it is unclear if ssIgE testing could substitute for IDST. Objective: To determine the efficacy of ssIgE testing and IDST when added to SPT in diagnosing environmental allergic sensitizations. Methods: SPT, IDST, and ssIgE testing to nine common environmental allergens were analyzed in 75 patients with oculonasal symptoms who presented to our allergy clinics in the Bronx, New York, between January 2014 and May 2015. Results: A total of 651 SPT and 499 ssIgE tests were independently performed and revealed 162 (25%) and 127 (25%) sensitizations, respectively. When SPT results were negative, IDST results revealed 108 of 452 additional sensitizations (24%). In contrast, when SPT results were negative, ssIgE test results only revealed 9% additional sensitizations. When both SPT and IDST results were negative, ssIgE testing only detected 3% of additional sensitizations, and ssIgE levels were typically low in these cases (median, 1.25 kU/L; range, 0.357–4.47 kU/L). When both SPT and ssIgE test results were negative, IDST results detected 15% additional sensitizations. Conclusion: IDST detected more additional environmental sensitizations compared with ssIgE testing. IDST, therefore, may be useful when the SPT and/or ssIgE testing results were negative, but the exposure history indicated relevant allergic sensitization. Serology added only a little more information if both SPT and IDST results were negative but may be useful in combination with SPT if IDST cannot be performed. PMID:28583228
Cruz, N; Rodrigues, S M; Tavares, D; Monteiro, R J R; Carvalho, L; Trindade, T; Duarte, A C; Pereira, E; Römkens, Paul F A M
2015-09-01
To assess if the geochemical reactivity and human bioaccessibility of silver nanoparticles (AgNPs) in soils can be determined by routine soil tests commonly applied to other metals in soil, colloidal Ag was introduced into five pots containing urban soils (equivalent to 6.8 mg Ag kg⁻¹ soil). Following a 45-day stabilization period, the geochemical reactivity was determined by extraction using 0.43 M and 2 M HNO3. The bioaccessibility of AgNPs was evaluated using the Simplified Bioaccessibility Extraction Test (SBET), the "Unified BARGE Method" (UBM), and two simulated lung fluids (modified Gamble's solution (MGS) and artificial lysosomal fluid (ALF)). The amount of Ag extracted by the 0.43 M and 2 M HNO3 soil tests was <8% and <50%, respectively, of the total amount of Ag added to the soils, suggesting that the reactivity of Ag present in the soil can be relatively low. The bioaccessibility of Ag as determined by the four in vitro tests ranged from 17% (ALF extraction) to 99% (SBET), indicating that almost all Ag can be released from soil due to specific interactions with the organic ligands present in the simulated body fluids. This study shows that to develop sound soil risk evaluations regarding soil contamination with AgNPs, aspects of Ag biochemistry need to be considered, particularly when linking commonly applied soil tests to human risk assessment. Copyright © 2015 Elsevier Ltd. All rights reserved.
Bayes factors based on robust TDT-type tests for family trio design.
Yuan, Min; Pan, Xiaoqing; Yang, Yaning
2015-06-01
Adaptive transmission disequilibrium test (aTDT) and MAX3 are two robust, efficient association tests for case-parent family trio data. Both tests incorporate information from the common genetic models (recessive, additive and dominant) and are efficient in power and robust to genetic model specification. The aTDT uses information on departure from Hardy-Weinberg disequilibrium to identify the potential genetic model underlying the data and then applies the corresponding TDT-type test; the MAX3 test is defined as the maximum of the absolute values of three TDT-type tests under the three common genetic models. In this article, we propose three robust Bayes procedures for association analysis with the case-parent trio design: the aTDT-based Bayes factor, the MAX3-based Bayes factor, and Bayes model averaging (BMA). The asymptotic distributions of the aTDT under the null and alternative hypotheses are derived in order to calculate its Bayes factor. Extensive simulations show that the Bayes factors and the p-values of the corresponding tests are generally consistent, and that these Bayes factors are robust to genetic model specification, especially so when the priors on the genetic models are equal. When equal priors are used for the underlying genetic models, the Bayes factor method based on aTDT is more powerful than those based on MAX3 and Bayes model averaging. When the prior places a small (large) probability on the true model, the Bayes factor based on aTDT (BMA) is more powerful. An analysis of simulated rheumatoid arthritis (RA) data from GAW15 is presented to illustrate applications of the proposed methods.
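The MAX3 construction (take the maximum absolute statistic over recessive, additive and dominant scorings) can be sketched with a case-control trend statistic standing in for the TDT-type statistics used with trio data; the genotype counts are invented:

```python
import numpy as np

def score_trend_z(cases, controls, scores):
    """Two-sample z-statistic on genotype scores, asymptotically equivalent
    to a Cochran-Armitage-type trend test for a 2 x 3 genotype table."""
    cases, controls, scores = (np.asarray(v, dtype=float)
                               for v in (cases, controls, scores))
    R, S = cases.sum(), controls.sum()
    n = cases + controls
    N = n.sum()
    xbar = (scores * n).sum() / N
    pooled_var = (n * (scores - xbar) ** 2).sum() / N
    diff = (scores * cases).sum() / R - (scores * controls).sum() / S
    return diff / np.sqrt(pooled_var * (1.0 / R + 1.0 / S))

def max3(cases, controls):
    """MAX3: maximum absolute trend statistic over the three common models."""
    models = {"recessive": [0, 0, 1], "additive": [0, 1, 2],
              "dominant": [0, 1, 1]}
    zs = {m: score_trend_z(cases, controls, x) for m, x in models.items()}
    best = max(zs, key=lambda m: abs(zs[m]))
    return best, abs(zs[best])

# Invented genotype counts (AA, Aa, aa); the risk allele acts dominantly
best, z = max3(cases=[30, 90, 40], controls=[70, 60, 30])
print(best, round(z, 2))
```

Because the maximum is taken over three correlated statistics, MAX3's null distribution is not standard normal; in practice its p-value is obtained by simulation or from the joint distribution of the three statistics.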
Corrêa, A M; Pereira, M I S; de Abreu, H K A; Sharon, T; de Melo, C L P; Ito, M A; Teodoro, P E; Bhering, L L
2016-10-17
The common bean, Phaseolus vulgaris, is predominantly grown on small farms and lacks accurate genotype recommendations for specific micro-regions in Brazil. This contributes to a low national average yield. The aim of this study was to use the methods of the harmonic mean of the relative performance of genetic values (HMRPGV) and the centroid, for selecting common bean genotypes with high yield, adaptability, and stability for the Cerrado/Pantanal ecotone region in Brazil. We evaluated 11 common bean genotypes in three trials carried out in the dry season in Aquidauana in 2013, 2014, and 2015. A likelihood ratio test detected a significant interaction between genotype x year, contributing 54% to the total phenotypic variation in grain yield. The three genotypes selected by the joint analysis of genotypic values in all years (Carioca Precoce, BRS Notável, and CNFC 15875) were the same as those recommended by the HMRPGV method. Using the centroid method, genotypes BRS Notável and CNFC 15875 were considered ideal genotypes based on their high stability to unfavorable environments and high responsiveness to environmental improvement. We identified a high association between the methods of adaptability and stability used in this study. However, the use of centroid method provided a more accurate and precise recommendation of the behavior of the evaluated genotypes.
A comparison of solute-transport solution techniques based on inverse modelling results
Mehl, S.; Hill, M.C.
2000-01-01
Five common numerical techniques (finite difference, predictor-corrector, total-variation-diminishing, method-of-characteristics, and modified-method-of-characteristics) were tested using simulations of a controlled conservative tracer-test experiment through a heterogeneous, two-dimensional sand tank. The experimental facility was constructed using randomly distributed homogeneous blocks of five sand types. This experimental model provides an outstanding opportunity to compare the solution techniques because of the heterogeneous hydraulic conductivity distribution of known structure, and the availability of detailed measurements with which to compare simulated concentrations. The present work uses this opportunity to investigate how three common types of results (simulated breakthrough curves, sensitivity analysis, and calibrated parameter values) change in this heterogeneous situation, given the different methods of simulating solute transport. The results show that simulated peak concentrations, even at very fine grid spacings, varied because of different amounts of numerical dispersion. Sensitivity analysis results were robust in that they were independent of the solution technique. They revealed extreme correlation between hydraulic conductivity and porosity, and that the breakthrough curve data did not provide enough information about the dispersivities to estimate individual values for the five sands. However, estimated hydraulic conductivity values are significantly influenced by both the large possible variations in model dispersion and the amount of numerical dispersion present in the solution technique.
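The numerical-dispersion effect noted above (simulated peak concentrations varying with the solution technique) can be demonstrated with a first-order upwind finite-difference scheme for 1-D advection, which smears a sharp tracer pulse even though the governing equation has no physical dispersion. This is a generic illustration, not the study's sand-tank model:

```python
import numpy as np

# 1-D advection of a tracer pulse with a first-order upwind scheme.
# The scheme's numerical dispersion smears the pulse and lowers the peak,
# even though pure advection should preserve it exactly.
nx, v, dx, dt = 200, 1.0, 1.0, 0.5
courant = v * dt / dx                     # 0.5: stable but dispersive
c = np.zeros(nx)
c[40:60] = 1.0                            # rectangular tracer pulse, peak = 1
mass0 = c.sum()
for _ in range(100):
    c[1:] = c[1:] - courant * (c[1:] - c[:-1])   # explicit upwind update
print(round(c.max(), 3), round(c.sum() - mass0, 6))
```

The scheme conserves mass but the peak decays below 1.0, so a calibrated dispersivity absorbs both physical and numerical dispersion, which is why the study's estimated parameters depended on the solution technique.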
Evaluation of 39 medical implants at 7.0 T
Feng, David X; McCauley, Joseph P; Morgan–Curtis, Fea K; Salam, Redoan A; Pennell, David R; Loveless, Mary E
2015-01-01
Objective: With increased signal to noise ratios, 7.0-T MRI has the potential to contribute unique information regarding anatomy and pathophysiology of a disease. However, concerns for the safety of subjects with metallic medical implants have hindered advancement in this field. The purpose of the present research was to evaluate the MRI safety for 39 commonly used medical implants at 7.0 T. Methods: Selected metallic implants were tested for magnetic field interactions, radiofrequency-induced heating and artefacts using standardized testing techniques. Results: 5 of the 39 implants tested may be unsafe for subjects undergoing MRI at 7.0 T. Conclusion: Implants were deemed either “MR Conditional” or “MR Unsafe” for the 7.0-T MRI environment. Further research is needed to expand the existing database categorizing implants that are acceptable for patients referred for MRI examinations at 7.0 T. Advances in knowledge: Lack of MRI testing for common metallic medical implants limits the translational potential of 7.0-T MRI. For safety reasons, patients with metallic implants are not allowed to undergo a 7.0-T MRI scan, precluding part of the population that can benefit from the detailed resolution of ultra-high-field MRIs. This investigation provides necessary MRI testing of common medical implants at 7.0 T. PMID:26481696
Methanogenic activity tests by Infrared Tunable Diode Laser Absorption Spectroscopy.
Martinez-Cruz, Karla; Sepulveda-Jauregui, Armando; Escobar-Orozco, Nayeli; Thalasso, Frederic
2012-10-01
Methanogenic activity (MA) tests are commonly carried out to estimate the capability of anaerobic biomass to treat effluents, to evaluate anaerobic activity in bioreactors or natural ecosystems, or to quantify inhibitory effects on methanogenic activity. These activity tests are usually based on the measurement of the volume of biogas produced, by volumetric, pressure-increase or gas chromatography (GC) methods. In this study, we present an alternative method for non-invasive measurement of the methane produced during activity tests in closed vials, based on Infrared Tunable Diode Laser Absorption Spectroscopy (MA-TDLAS). This new method was tested during model acetoclastic and hydrogenotrophic methanogenic activity tests and was compared to a more traditional method based on gas chromatography. From the results obtained, the CH₄ detection limit of the method was estimated at 60 ppm and the minimum measurable methane production rate at 1.09 × 10⁻³ mg l⁻¹ h⁻¹, which is below the CH₄ production rates usually reported in both anaerobic reactors and natural ecosystems. In addition to its sensitivity, the method has several potential advantages over more traditional methods, among which are short measurement times (allowing a large number of MA test vials to be measured), non-invasive measurement (avoiding leakage and external interference), and a cost similar to that of GC-based methods. It is concluded that MA-TDLAS is a promising method that could be of interest not only in the field of anaerobic digestion but also in the field of environmental ecology, where CH₄ production rates are usually very low. Copyright © 2012 Elsevier B.V. All rights reserved.
Testing for significance of phase synchronisation dynamics in the EEG.
Daly, Ian; Sweeney-Reed, Catherine M; Nasuto, Slawomir J
2013-06-01
A number of tests exist to check for statistical significance of phase synchronisation within the Electroencephalogram (EEG); however, the majority suffer from a lack of generality and applicability. They may also fail to account for temporal dynamics in the phase synchronisation, regarding synchronisation as a constant state instead of a dynamical process. Therefore, a novel test is developed for identifying the statistical significance of phase synchronisation based upon a combination of work characterising temporal dynamics of multivariate time-series and Markov modelling. We show how this method is better able to assess the significance of phase synchronisation than a range of commonly used significance tests. We also show how the method may be applied to identify and classify significantly different phase synchronisation dynamics in both univariate and multivariate datasets.
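The quantity such significance tests assess can be illustrated with the phase-locking value (PLV) computed from Hilbert-transform phases; this shows only the static synchronisation measure, not the paper's Markov-model test of its temporal dynamics:

```python
import numpy as np
from scipy.signal import hilbert

def plv(x, y):
    """Phase-locking value |mean(exp(i*(phi_x - phi_y)))| in [0, 1], using
    instantaneous phases from the analytic (Hilbert-transformed) signals."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return float(np.abs(np.mean(np.exp(1j * dphi))))

t = np.linspace(0.0, 10.0, 2000)
x = np.sin(2 * np.pi * 5.0 * t)
locked = np.sin(2 * np.pi * 5.0 * t + 0.8)    # same frequency, fixed lag
drifting = np.sin(2 * np.pi * 7.3 * t)        # different frequency
print(round(plv(x, locked), 2), round(plv(x, drifting), 2))
```

A significance test must then decide whether an observed PLV exceeds what chance coupling would produce, which is where the surrogate-data and model-based approaches the abstract compares come in.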
NASA Technical Reports Server (NTRS)
Kolyer, J. M.
1978-01-01
An important principle is that encapsulants should be tested in a total array system that allows realistic interaction of components. Therefore, micromodule test specimens were fabricated with a variety of encapsulants, substrates, and types of circuitry. One common failure mode was corrosion of circuitry and solar cell metallization due to moisture penetration. Another was darkening and/or opacification of the encapsulant. A test program plan was proposed; it includes multicondition accelerated exposure. Another method was hyperaccelerated photochemical exposure using a solar concentrator, which simulates 20 years of sunlight exposure in a short period of one to two weeks. The study was beneficial in identifying some cost-effective encapsulants and array designs.
Noar, Seth M; Mehrotra, Purnima
2011-03-01
Traditional theory testing commonly applies cross-sectional (and occasionally longitudinal) survey research to test health behavior theory. Since such correlational research cannot demonstrate causality, a number of researchers have called for the increased use of experimental methods for theory testing. We introduce the multi-methodological theory-testing (MMTT) framework for testing health behavior theory. The MMTT framework introduces a set of principles that broaden the perspective of how we view evidence for health behavior theory. It suggests that while correlational survey research designs represent one method of testing theory, the weaknesses of this approach demand that complementary approaches be applied. Such approaches include randomized lab and field experiments, mediation analysis of theory-based interventions, and meta-analysis. These alternative approaches to theory testing can demonstrate causality in a much more robust way than is possible with correlational survey research methods. Such approaches should thus be increasingly applied in order to more completely and rigorously test health behavior theory. Greater application of research derived from the MMTT may lead researchers to refine and modify theory and ultimately make theory more valuable to practitioners. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Determination of elastic modulus of ceramics using ultrasonic testing
NASA Astrophysics Data System (ADS)
Sasmita, Firmansyah; Wibisono, Gatot; Judawisastra, Hermawan; Priambodo, Toni Agung
2018-04-01
Elastic modulus is an important material property in structural ceramics applications. However, the bending test, a common method for determining this property, requires particular specimen preparation. Furthermore, the elastic modulus of ceramics can vary because it depends on porosity content. For the structural ceramics industry, such as ceramic tiles, this property is very important, which drives the development of new measurement methods and of verification methods as well. In this research, ultrasonic testing was conducted to determine the elastic modulus of soda-lime glass and ceramic tiles. The experimental parameter was the probe frequency (1, 2, and 4 MHz). Density and porosity were also characterized for the analysis. Results from ultrasonic testing were compared with the elastic modulus obtained from bending tests. The elastic modulus of soda-lime glass based on ultrasonic testing showed excellent agreement, with an error of 2.69% for the 2 MHz probe relative to the bending test result. Tests on red and white ceramic tiles still contained errors of up to 41% and 158%, respectively. The results for the red ceramic tile showed a trend in which the 1 MHz probe gave better accuracy in determining elastic modulus; however, testing on the white ceramic tile showed a different trend. This was attributed to the presence of porosity and the near-field effect.
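Ultrasonic determination of elastic modulus typically combines a measured wave speed with density; for a longitudinal wave in a bulk solid, E = rho * v^2 * (1 + nu)(1 - 2*nu)/(1 - nu). A sketch with nominal handbook-style values for soda-lime glass (illustrative assumptions, not the paper's measurements):

```python
def longitudinal_modulus(density, velocity, poisson):
    """Elastic modulus (Pa) from the longitudinal ultrasonic wave speed in
    a bulk solid: E = rho * v^2 * (1 + nu) * (1 - 2*nu) / (1 - nu)."""
    nu = poisson
    return density * velocity**2 * (1 + nu) * (1 - 2 * nu) / (1 - nu)

# Nominal handbook-style values for soda-lime glass (illustrative only):
# density ~2500 kg/m^3, longitudinal speed ~5800 m/s, Poisson's ratio ~0.22
E = longitudinal_modulus(2500.0, 5800.0, 0.22)
print(round(E / 1e9, 1), "GPa")
```

Porosity lowers both density and wave speed, which is one reason the tile results above deviate from the bending-test values.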
A scoping review of biomechanical testing for proximal humerus fracture implants.
Cruickshank, David; Lefaivre, Kelly A; Johal, Herman; MacIntyre, Norma J; Sprague, Sheila A; Scott, Taryn; Guy, Pierre; Cripton, Peter A; McKee, Michael; Bhandari, Mohit; Slobogean, Gerard P
2015-07-30
Fixation failure is a relatively common sequela of surgical management of proximal humerus fractures (PHF). The purpose of this study is to understand the current state of the literature with regard to the biomechanical testing of proximal humerus fracture implants. A scoping review of the proximal humerus fracture literature was performed, and studies testing the mechanical properties of a PHF treatment were included in this review. Descriptive statistics were used to summarize the characteristics and methods of the included studies. 1,051 proximal humerus fracture studies were reviewed; 67 studies met our inclusion criteria. The most common specimen used was cadaver bone (87%), followed by sawbones (7%) and animal bones (4%). A two-part fracture pattern was tested most frequently (68%), followed by three-part (23%), and four-part (8%). Implants tested included locking plates (52%), intramedullary devices (25%), and non-locking plates (25%). Hemi-arthroplasty was tested in 5 studies (7%), with no studies using reverse total shoulder arthroplasty (RTSA) implants. Torque was the most common mode of force applied (51%), followed by axial loading (45%), and cantilever bending (34%). Substantial testing diversity was observed across all studies. The biomechanical literature was found to be both diverse and heterogeneous. More complex fracture patterns and RTSA implants have not been adequately tested. These gaps in the current literature will need to be addressed to ensure that future biomechanical research is clinically relevant and capable of improving the outcomes of challenging proximal humerus fracture patterns.
ToxCast Communications and Outreach Strategy (SETAC)
US EPA's Chemical Safety for Sustainability Research Program has been using in vitro testing methods in an effort to accelerate the pace of chemical evaluations and address the significant lack of health and environmental data on the thousands of chemicals found in commonly used ...
Nondestructive evaluation of pavement structural condition for rehabilitation design : final report.
DOT National Transportation Integrated Search
2016-05-31
Falling Weight Deflectometer (FWD) is the common non-destructive testing method for in-situ evaluation of pavement condition. This study aims to develop finite element (FE) models that can simulate FWD loading on a pavement system and capture the c...
Diagnostic Performance of a Molecular Test versus Clinician Assessment of Vaginitis
Gaydos, Charlotte A.; Nyirjesy, Paul; Paradis, Sonia; Kodsi, Salma; Cooper, Charles K.
2018-01-01
ABSTRACT Vaginitis is a common complaint, diagnosed either empirically or using Amsel's criteria and wet mount microscopy. This study sought to determine characteristics of an investigational test (a molecular test for vaginitis), compared to reference, for detection of bacterial vaginosis, Candida spp., and Trichomonas vaginalis. Vaginal specimens from a cross-sectional study were obtained from 1,740 women (≥18 years old), with vaginitis symptoms, during routine clinic visits (across 10 sites in the United States). Specimens were analyzed using a commercial PCR/fluorogenic probe-based investigational test that detects bacterial vaginosis, Candida spp., and Trichomonas vaginalis. Clinician diagnosis and in-clinic testing (Amsel's test, potassium hydroxide preparation, and wet mount) were also employed to detect the three vaginitis causes. All testing methods were compared to the respective reference methods (Nugent Gram stain for bacterial vaginosis, detection of the Candida gene its2, and Trichomonas vaginalis culture). The investigational test, clinician diagnosis, and in-clinic testing were compared to reference methods for bacterial vaginosis, Candida spp., and Trichomonas vaginalis. The investigational test resulted in significantly higher sensitivity and negative predictive value than clinician diagnosis or in-clinic testing. In addition, the investigational test showed a statistically higher overall percent agreement with each of the three reference methods than did clinician diagnosis or in-clinic testing. The investigational test showed significantly higher sensitivity for detecting vaginitis, involving more than one cause, than did clinician diagnosis. Taken together, these results suggest that a molecular investigational test can facilitate accurate detection of vaginitis. PMID:29643195
An experimental method to simulate incipient decay of wood basidiomycete fungi
Simon Curling; Jerrold E. Winandy; Carol A. Clausen
2000-01-01
At very early stages of decay of wood by basidiomycete fungi, strength loss can be measured from wood before any measurable weight loss. Therefore, strength loss is a more efficient measure of incipient decay than weight loss. However, common standard decay tests (e.g. EN 113 or ASTM D2017) use weight loss as the measure of decay. A method was developed that allowed...
Redd, Andrew M; Gundlapalli, Adi V; Divita, Guy; Carter, Marjorie E; Tran, Le-Thuy; Samore, Matthew H
2017-07-01
Templates in text notes pose challenges for automated information extraction algorithms. We propose a method that identifies novel templates in plain text medical notes. The identification can then be used to either include or exclude templates when processing notes for information extraction. The two-module method is based on the framework of information foraging and addresses the hypothesis that documents containing templates and the templates within those documents can be identified by common features. The first module takes documents from the corpus and groups those with common templates. This is accomplished through a binned word count hierarchical clustering algorithm. The second module extracts the templates. It uses the groupings and performs a longest common subsequence (LCS) algorithm to obtain the constituent parts of the templates. The method was developed and tested on a random document corpus of 750 notes derived from a large database of US Department of Veterans Affairs (VA) electronic medical notes. The grouping module, using hierarchical clustering, identified 23 groups with 3 documents or more, consisting of 120 documents from the 750 documents in our test corpus. Of these, 18 groups had at least one common template that was present in all documents in the group for a positive predictive value of 78%. The LCS extraction module performed with 100% positive predictive value, 94% sensitivity, and 83% negative predictive value. The human review determined that in 4 groups the template covered the entire document, with the remaining 14 groups containing a common section template. Among documents with templates, the number of templates per document ranged from 1 to 14. The mean and median number of templates per group was 5.9 and 5, respectively. The grouping method was successful in finding like documents containing templates. 
Of the groups of documents containing templates, the LCS module was successful in deciphering text belonging to the template and text that was extraneous. Major obstacles to improved performance included documents composed of multiple templates, templates that included other templates embedded within them, and variants of templates. We demonstrate proof of concept of the grouping and extraction method of identifying templates in electronic medical records in this pilot study and propose methods to improve performance and scaling up. Published by Elsevier Inc.
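The core extraction step, a longest common subsequence over documents grouped as sharing a template, can be sketched with the classic dynamic-programming LCS on word tokens; the two toy "notes" below are invented:

```python
def lcs(a, b):
    """Longest common subsequence of two token lists via dynamic
    programming; the shared tokens approximate the common template."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = (dp[i][j] + 1 if a[i] == b[j]
                                else max(dp[i][j + 1], dp[i + 1][j]))
    out, i, j = [], m, n                  # backtrack to recover the tokens
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return out[::-1]

# Two invented toy "notes" sharing a section template
doc1 = "CHIEF COMPLAINT : cough HISTORY : 3 days of cough".split()
doc2 = "CHIEF COMPLAINT : fever HISTORY : fever since Monday".split()
template = " ".join(lcs(doc1, doc2))
print(template)
```

The recovered subsequence keeps the boilerplate headings and drops the patient-specific text, which is exactly the separation the extraction module performs; embedded and variant templates, noted above as obstacles, break the assumption that one subsequence fits all documents in a group.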
Interference of avian guano in analyses of fuel-contaminated soils
DOE Office of Scientific and Technical Information (OSTI.GOV)
James, D.E.; Johnson, T.E.; Kreamer, D.K.
1996-01-01
Site characterization on Johnston Island, Johnston Atoll, Pacific Ocean, has yielded preliminary data that seabird guano can be an interference in three common petroleum hydrocarbon quantification methods. Volatiles from seabird guano were measured on a hydrocarbon-specific handheld vapor meter (catalytic detector) in concentrations as high as 256 ppm by volume total hydrocarbon. Analysis of guano solids produced measurable concentrations of total petroleum hydrocarbon (TPH) as diesel using both an immunoassay test and the EPA 8015 Modified Method. The testing was conducted on one surface sample of guano collected from a seabird roosting and nesting area. Source species were not identified. Positive hydrocarbon test results for guano raise concerns regarding the effectiveness of standard methods of petroleum-contaminated site characterization for Johnston Island, other Pacific islands, and coastal areas with historic or contemporary seabird populations.
Method study on fuzzy-PID adaptive control of electric-hydraulic hitch system
NASA Astrophysics Data System (ADS)
Li, Mingsheng; Wang, Liubu; Liu, Jian; Ye, Jin
2017-03-01
In this paper, a fuzzy-PID adaptive control method is applied to the control of a tractor electric-hydraulic hitch system. According to the characteristics of the system, a fuzzy-PID adaptive controller is designed and the electric-hydraulic hitch system model is established. Traction control and position control performance simulations are carried out and compared with the common PID control method. A field test rig was set up to test the electric-hydraulic hitch system. The test results showed that, after fuzzy-PID adaptive control is adopted, when the tillage depth steps from 0.1 m to 0.3 m, the system transition time is 4 s without overshoot, and when the tractive force steps from 3000 N to 7000 N, the system transition time is 5 s with an overshoot of 25%.
Reference Intervals of Common Clinical Chemistry Analytes for Adults in Hong Kong.
Lo, Y C; Armbruster, David A
2012-04-01
Defining reference intervals is a major challenge because of the difficulty in recruiting volunteers to participate and testing samples from a significant number of healthy reference individuals. Historical literature citation intervals are often suboptimal because they may be based on obsolete methods and/or only a small number of poorly defined reference samples. Blood donors in Hong Kong gave permission for additional blood to be collected for reference interval testing. The samples were tested for twenty-five routine analytes on the Abbott ARCHITECT clinical chemistry system. Results were analyzed using the Rhoads EP Evaluator software program, which is based on the CLSI/IFCC C28-A guideline and defines the reference interval as the 95% central range. Method-specific reference intervals were established for twenty-five common clinical chemistry analytes for a Chinese ethnic population. The intervals were defined for each gender separately and for genders combined, and gender-specific or combined-gender intervals were adopted as appropriate for each analyte. A large number of healthy, apparently normal blood donors from a local ethnic population were tested to provide current reference intervals for a new clinical chemistry system. Intervals were determined following an accepted international guideline. Laboratories using the same or similar methodologies may adopt these intervals if validated and deemed suitable for their patient population. Laboratories using different methodologies may be able to successfully adapt the intervals for their facilities using the reference interval transference technique based on a method comparison study.
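The 95% central range described above is straightforward to compute nonparametrically. A minimal sketch (the function name and simulated values are illustrative, not the study's data or the EP Evaluator implementation):

```python
import numpy as np

def reference_interval(values, central=0.95):
    """Nonparametric reference interval: the central 95% range
    (2.5th-97.5th percentiles), following the CLSI C28-A convention
    described in the abstract."""
    lo = (1.0 - central) / 2.0 * 100
    hi = 100 - lo
    lower, upper = np.percentile(values, [lo, hi])
    return lower, upper

# Illustrative simulated analyte results (e.g. sodium, mmol/L)
rng = np.random.default_rng(0)
results = rng.normal(loc=140, scale=3, size=500)
lower, upper = reference_interval(results)
```

In practice the guideline also calls for outlier screening and a minimum of roughly 120 reference subjects per partition before the percentiles are trusted.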
Use of Contemporary Genetics in Cardiovascular Diagnosis
George, Alfred L.
2015-01-01
An explosion of knowledge regarding the genetic and genomic basis for rare and common diseases has provided a framework for revolutionizing the practice of medicine. Achieving the reality of a genomic medicine era requires that basic discoveries are effectively translated into clinical practice through implementation of genetic and genomic testing. Clinical genetic tests have become routine for many inherited disorders and can be regarded as the standard-of-care in many circumstances including disorders affecting the cardiovascular system. New, high-throughput methods for determining the DNA sequence of all coding exons or complete genomes are being adopted for clinical use to expand the speed and breadth of genetic testing. Along with these extraordinary advances have emerged new challenges to practicing physicians for understanding when and how to use genetic testing along with how to appropriately interpret test results. This review will acquaint readers with general principles of genetic testing including newer technologies, test interpretation and pitfalls. The focus will be on testing genes responsible for monogenic disorders and on other emerging applications such as pharmacogenomic profiling. The discussion will be extended to the new paradigm of direct-to-consumer genetic testing and the value of assessing genomic risk for common diseases. PMID:25421045
Xue, Xiaonan; Kim, Mimi Y; Castle, Philip E; Strickler, Howard D
2014-03-01
Studies to evaluate clinical screening tests often face the problem that the "gold standard" diagnostic approach is costly and/or invasive. It is therefore common to verify only a subset of negative screening tests using the gold standard method. However, undersampling the screen negatives can lead to substantial overestimation of the sensitivity and underestimation of the specificity of the diagnostic test. Our objective was to develop a simple and accurate statistical method to address this "verification bias." We developed a weighted generalized estimating equation approach to estimate, in a single model, the accuracy (eg, sensitivity/specificity) of multiple assays and simultaneously compare results between assays while addressing verification bias. This approach can be implemented using standard statistical software. Simulations were conducted to assess the proposed method. An example is provided using a cervical cancer screening trial that compared the accuracy of human papillomavirus and Pap tests, with histologic data as the gold standard. The proposed approach performed well in estimating and comparing the accuracy of multiple assays in the presence of verification bias. The proposed approach is an easy to apply and accurate method for addressing verification bias in studies of multiple screening methods. Copyright © 2014 Elsevier Inc. All rights reserved.
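The verification-bias problem and the weighting idea behind the authors' approach can be illustrated with a simpler inverse-probability-weighted estimate (a toy simulation, not the authors' GEE model; all prevalences and probabilities are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
disease = rng.random(n) < 0.05
# Imperfect screening test: 85% sensitivity, 90% specificity
test_pos = np.where(disease, rng.random(n) < 0.85, rng.random(n) < 0.10)

# All screen-positives get the gold standard; only 10% of negatives do
p_verify = np.where(test_pos, 1.0, 0.10)
verified = rng.random(n) < p_verify

# Naive sensitivity from verified subjects only is biased upward,
# because verified diseased subjects are mostly screen-positive
d_v = disease & verified
naive_sens = (test_pos & d_v).sum() / d_v.sum()

# Weighting each verified subject by 1/P(verified) removes the bias
w = 1.0 / p_verify
ipw_sens = (w * (test_pos & d_v)).sum() / (w * d_v).sum()
```

Here `naive_sens` lands well above the true 0.85 while the weighted estimate recovers it, which is the overestimation the abstract warns about.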
Lin, Shan-Yang; Wang, Shun-Li
2012-04-01
The solid-state chemistry of drugs has seen growing importance in the pharmaceutical industry for the development of useful APIs (active pharmaceutical ingredients) and stable dosage forms. The stability of drugs in various solid dosage forms is an important issue because solid dosage forms are the most common pharmaceutical formulation in clinical use. In solid-state stability studies of drugs, an ideal accelerated method must not only be selected from among different complicated methods, but must also detect the formation of degradation products. In this review article, an analytical technique combining differential scanning calorimetry and Fourier-transform infrared (DSC-FTIR) microspectroscopy simulates the accelerated stability test and simultaneously detects the decomposed products in real time. The pharmaceutical dipeptides aspartame hemihydrate, lisinopril dihydrate, and enalapril maleate, either with or without Eudragit E, were used as testing examples. This one-step simultaneous DSC-FTIR technique for real-time detection of diketopiperazine (DKP) directly evidenced the dehydration process and the formation of DKP, an impurity common in pharmaceutical dipeptides. DKP formation in various dipeptides determined by different analytical methods has been collected and compiled. Although many analytical methods have been applied, the combined DSC-FTIR technique is an easy and fast analytical method which not only simulates accelerated drug stability testing but also enables exploration of phase transformation as well as degradation due to thermal-related reactions. This technique offers quick and proper interpretations. Copyright © 2012 Elsevier B.V. All rights reserved.
Automating testbed documentation and database access using World Wide Web (WWW) tools
NASA Technical Reports Server (NTRS)
Ames, Charles; Auernheimer, Brent; Lee, Young H.
1994-01-01
A method for providing uniform transparent access to disparate distributed information systems was demonstrated. A prototype testing interface was developed to access documentation and information using publicly available hypermedia tools. The prototype gives testers a uniform, platform-independent user interface to on-line documentation, user manuals, and mission-specific test and operations data. Mosaic was the common user interface, and HTML (Hypertext Markup Language) provided hypertext capability.
ERIC Educational Resources Information Center
Sinharay, Sandip; Dorans, Neil J.
2010-01-01
The Mantel-Haenszel (MH) procedure (Mantel and Haenszel) is a popular method for estimating and testing a common two-factor association parameter in a 2 x 2 x K table. Holland and Holland and Thayer described how to use the procedure to detect differential item functioning (DIF) for tests with dichotomously scored items. Wang, Bradlow, Wainer, and…
Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China.
Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li'an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling
2016-03-01
A nationwide multicenter study was conducted in China to explore sources of variation of reference values and establish reference intervals for 28 common biochemical analytes, as part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area of China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. The need for partition of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box-Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. By the ANOVA, significant sex-related and age-related differences were each observed in 12 analytes. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in the results of 9 analytes for men and 6 for women. Reference intervals of 28 analytes were computed, with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals of 28 common chemistry analytes applicable to the Chinese Han population were established by use of the latest methodology. Reference intervals of 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas the others can be used as assay system-specific reference intervals in China.
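A parametric interval of the kind described (normalizing transform, then mean ± 1.96 SD, then back-transform) can be sketched using the log transform, the λ = 0 special case of the Box-Cox family named in the abstract; the data below are simulated, not the study's:

```python
import numpy as np

rng = np.random.default_rng(2)
# Illustrative right-skewed analyte values (e.g. triglycerides)
values = rng.lognormal(mean=0.2, sigma=0.4, size=2000)

# Transform toward normality, take the central 95% on that scale,
# then back-transform the limits to the original units.
t = np.log(values)
m, s = t.mean(), t.std(ddof=1)
lower, upper = np.exp(m - 1.96 * s), np.exp(m + 1.96 * s)

coverage = ((values >= lower) & (values <= upper)).mean()
```

The study's modified Box-Cox formula additionally estimates λ and a shift parameter from the data; the log case shown here is just the simplest member of that family.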
Development of a hybrid pollution index for heavy metals in marine and estuarine sediments.
Brady, James P; Ayoko, Godwin A; Martens, Wayde N; Goonetilleke, Ashantha
2015-05-01
Heavy metal pollution of sediments is a growing concern in most parts of the world, and numerous studies focussed on identifying contaminated sediments by using a range of digestion methods and pollution indices to estimate sediment contamination have been described in the literature. The current work provides a critical review of the more commonly used sediment digestion methods and identifies that weak acid digestion is more likely to provide guidance on elements that are likely to be bioavailable than other traditional methods of digestion. This work also reviews common pollution indices and identifies the Nemerow Pollution Index as the most appropriate method for establishing overall sediment quality. Consequently, a modified Pollution Index that can lead to a more reliable understanding of whole sediment quality is proposed. This modified pollution index is then tested against a number of existing studies and demonstrated to give a reliable and rapid estimate of sediment contamination and quality.
Multi-frame image processing with panning cameras and moving subjects
NASA Astrophysics Data System (ADS)
Paolini, Aaron; Humphrey, John; Curt, Petersen; Kelmelis, Eric
2014-06-01
Imaging scenarios commonly involve erratic, unpredictable camera behavior or subjects that are prone to movement, complicating multi-frame image processing techniques. To address these issues, we developed three techniques that can be applied to multi-frame image processing algorithms in order to mitigate the adverse effects observed when cameras are panning or subjects within the scene are moving. We provide a detailed overview of the techniques and discuss the applicability of each to various movement types. In addition to this, we evaluated algorithm efficacy with demonstrated benefits using field test video, which has been processed using our commercially available surveillance product. Our results show that algorithm efficacy is significantly improved in common scenarios, expanding our software's operational scope. Our methods introduce little computational burden, enabling their use in real-time and low-power solutions, and are appropriate for long observation periods. Our test cases focus on imaging through turbulence, a common use case for multi-frame techniques. We present results of a field study designed to test the efficacy of these techniques under expanded use cases.
A portable thoracic closed drainage instrument for hemopneumothorax.
Tang, Hua; Pan, Tiewen; Qin, Xiong; Xue, Lei; Wu, Bin; Zhao, Xuewei; Sun, Guangyuan; Yuan, Xinyu; Xu, Zhifei
2012-03-01
Hemopneumothorax is a common sequela of traumatic thoracic injury. The most effective treatment of this condition is thoracic drainage. Despite the common occurrence of this condition, available instruments are difficult to use emergently, particularly when large numbers of patients require drainage. In the present experiment, a newly designed chest tube and thoracic closed drainage package is described and preliminarily evaluated with the goal of improving the treatment of traumatic hemopneumothorax. Twenty canines were divided into two groups. In one group, the newly designed thoracic closed drainage package was used, whereas in the other group a currently available chest tube and bottle were used. A drainage test, ultrasound examination, flushing test, and tension test were performed to evaluate the effectiveness of the drainage package. We found that the newly designed drainage tube is as effective as the common tube when evaluated using all of the chosen methods. In addition, the package is very lightweight and portable. The newly designed thoracic drainage package is very effective in the emergency treatment of thoracic trauma and may be more suitable for the emergency treatment of hemopneumothorax.
Mata, Caio Augusto Sterse; Ota, Luiz Hirotoshi; Suzuki, Iunis; Telles, Adriana; Miotto, Andre; Leão, Luiz Eduardo Vilaça
2012-01-01
This study compares the traditional live lecture to a web-based approach in the teaching of bronchoscopy and evaluates the positive and negative aspects of both methods. We developed a web-based bronchoscopy curriculum, which integrates texts, images and animations. It was applied to first-year interns, who were later administered a multiple-choice test. Another group of eight first-year interns received the traditional teaching method and the same test. The two groups were compared using the Student's t-test. The mean scores (± SD) of students who used the website were 14.63 ± 1.41 (range 13-17). The test scores of the other group had the same range, with a mean score of 14.75 ± 1. The Student's t-test showed no difference between the test results. The common positive point noted was the presence of multimedia content. The web group cited as positive the ability to review the pages, while the lecture group cited the role of the teacher. Web-based bronchoscopy education showed results similar to the traditional live lecture in effectiveness.
Chiu, Chi-yang; Jung, Jeesun; Wang, Yifan; Weeks, Daniel E.; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Amos, Christopher I.; Mills, James L.; Boehnke, Michael; Xiong, Momiao; Fan, Ruzong
2016-01-01
In this paper, extensive simulations are performed to compare two statistical methods to analyze multiple correlated quantitative phenotypes: (1) approximate F-distributed tests of multivariate functional linear models (MFLM) and additive models of multivariate analysis of variance (MANOVA), and (2) Gene Association with Multiple Traits (GAMuT) for association testing of high-dimensional genotype data. It is shown that approximate F-distributed tests of MFLM and MANOVA have higher power and are more appropriate for major gene association analysis (i.e., scenarios in which some genetic variants have relatively large effects on the phenotypes); GAMuT has higher power and is more appropriate for analyzing polygenic effects (i.e., effects from a large number of genetic variants each of which contributes a small amount to the phenotypes). MFLM and MANOVA are very flexible and can be used to perform association analysis for: (i) rare variants, (ii) common variants, and (iii) a combination of rare and common variants. Although GAMuT was designed to analyze rare variants, it can be applied to analyze a combination of rare and common variants and it performs well when (1) the number of genetic variants is large and (2) each variant contributes a small amount to the phenotypes (i.e., polygenes). MFLM and MANOVA are fixed effect models which perform well for major gene association analysis. GAMuT can be viewed as an extension of sequence kernel association tests (SKAT). Both GAMuT and SKAT are more appropriate for analyzing polygenic effects and they perform well not only in the rare variant case, but also in the case of a combination of rare and common variants. Data analyses of European cohorts and the Trinity Students Study are presented to compare the performance of the two methods. PMID:27917525
Beck, H J; Birch, G F
2013-06-01
Stormwater contaminant loading estimates using event mean concentration (EMC), rainfall/runoff relationship calculations and computer modelling (Model of Urban Stormwater Infrastructure Conceptualisation--MUSIC) demonstrated high variability in common methods of water quality assessment. Predictions of metal, nutrient and total suspended solid loadings for three highly urbanised catchments in Sydney estuary, Australia, varied greatly within and amongst methods tested. EMC and rainfall/runoff relationship calculations produced similar estimates (within 1 SD) in a statistically significant number of trials; however, considerable variability within estimates (∼50 and ∼25 % relative standard deviation, respectively) questions the reliability of these methods. Likewise, upper and lower default inputs in a commonly used loading model (MUSIC) produced an extensive range of loading estimates (3.8-8.3 times above and 2.6-4.1 times below typical default inputs, respectively). Default and calibrated MUSIC simulations produced loading estimates that agreed with EMC and rainfall/runoff calculations in some trials (4-10 from 18); however, they were not frequent enough to statistically infer that these methods produced the same results. Great variance within and amongst mean annual loads estimated by common methods of water quality assessment has important ramifications for water quality managers requiring accurate estimates of the quantities and nature of contaminants requiring treatment.
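The EMC approach named above reduces to multiplying an event mean concentration by an estimated runoff volume. A minimal sketch with hypothetical catchment numbers (none of these values come from the study):

```python
# Hypothetical inputs, not from the study
rainfall_m = 1.2        # annual rainfall depth, m
area_m2 = 2.5e6         # catchment area, m^2
runoff_coeff = 0.6      # fraction of rainfall that becomes runoff
emc_mg_per_l = 0.05     # event mean concentration of a metal, mg/L

# Annual runoff volume: depth x area x coefficient, converted m^3 -> L
runoff_l = rainfall_m * area_m2 * runoff_coeff * 1000.0

# Annual load: concentration x volume, converted mg -> kg
load_kg = emc_mg_per_l * runoff_l / 1e6
```

The study's point is that each input (the EMC itself, the runoff coefficient, the rainfall/runoff relationship) carries enough uncertainty that estimates from different methods can diverge widely.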
Advantages and limitations of common testing methods for antioxidants.
Amorati, R; Valgimigli, L
2015-05-01
Owing to the importance of antioxidants in the protection of both natural and man-made materials, a large variety of testing methods have been proposed and applied. These include methods based on inhibited autoxidation studies, which are better followed by monitoring the kinetics of oxygen consumption or of the formation of hydroperoxides, the primary oxidation products. Analytical determination of secondary oxidation products (e.g. carbonyl compounds) has also been used. The majority of testing methods, however, do not involve substrate autoxidation. They are based on the competitive bleaching of a probe (e.g. ORAC assay, β-carotene, crocin bleaching assays, and luminol assay), on reaction with a different probe (e.g. spin-trapping and TOSC assay), or they are indirect methods based on the reduction of persistent radicals (e.g. galvinoxyl, DPPH and TEAC assays), or of inorganic oxidizing species (e.g. FRAP, CUPRAC and Folin-Ciocalteu assays). Yet other methods are specific for preventive antioxidants. The relevance, advantages, and limitations of these methods are critically discussed, with respect to their chemistry and the mechanisms of antioxidant activity. A variety of cell-based assays have also been proposed, to investigate the biological activity of antioxidants. Their importance and critical aspects are discussed, along with arguments for the selection of the appropriate testing methods according to the different needs.
Testing antimicrobial paint efficacy on gypsum wallboard contaminated with Stachybotrys chartarum.
Menetrez, M Y; Foarde, K K; Webber, T D; Dean, T R; Betancourt, D A
2008-02-01
The goal of this research was to reduce occupant exposure to indoor mold through the efficacy testing of antimicrobial paints. An accepted method for handling Stachybotrys chartarum-contaminated gypsum wallboard (GWB) is removal and replacement. This practice is also recommended for water-damaged or mold-contaminated GWB but is not always followed completely. The efficacy of antimicrobial paints to eliminate or control mold regrowth on surfaces can be tested easily on nonporous surfaces. The testing of antimicrobial efficacy on porous surfaces found in the indoor environment, such as gypsum wallboard, can be more complicated and prone to incorrect conclusions regarding residual organisms. The mold S. chartarum has been studied for toxin production and its occurrence in water-damaged buildings. Research to control its growth using seven different antimicrobial paints and two commonly used paints on contaminated, common gypsum wallboard was performed in laboratory testing at high relative humidity. The results indicate differences in antimicrobial efficacy for the period of testing, and that proper cleaning and resurfacing of GWB with an antimicrobial paint can be an option in those unique circumstances when removal may not be possible.
Rhebergen, Martijn D F; Visser, Maaike J; Verberk, Maarten M; Lenderink, Annet F; van Dijk, Frank J H; Kezic, Sanja; Hulshof, Carel T J
2012-10-01
We compared three common user involvement methods in revealing barriers and facilitators from intended users that might influence their use of a new genetic test. The study was part of the development of a new genetic test on the susceptibility to hand eczema for nurses. Eighty student nurses participated in five focus groups (n = 33), 15 interviews (n = 15) or questionnaires (n = 32). For each method, data were collected until saturation. We compared the mean number of items and relevant remarks that could influence the use of the genetic test obtained per method, divided by the number of participants in that method. Thematic content analysis was performed using MAXQDA software. The focus groups revealed 30 unique items compared to 29 in the interviews and 21 in the questionnaires. The interviews produced more items and relevant remarks per participant (1.9 and 8.4 pp) than focus groups (0.9 and 4.8 pp) or questionnaires (0.7 and 2.3 pp). All three involvement methods revealed relevant barriers and facilitators to use a new genetic test. Focus groups and interviews revealed substantially more items than questionnaires. Furthermore, this study suggests a preference for the use of interviews because the number of items per participant was higher than for focus groups and questionnaires. This conclusion may be valid for other genetic tests as well.
Diagnostic methods of TSH in thyroid screening tests.
Matyjaszek-Matuszek, Beata; Pyzik, Aleksandra; Nowakowski, Andrzej; Jarosz, Mirosław J
2013-01-01
Reliable and quick thyreologic diagnostics, as well as verification of the effectiveness of the therapy undertaken, is of great importance for the state of health of society. The measurement of plasma TSH is the commonly accepted and most sensitive screening test for primary thyroid disorders, which are the most frequent diseases related to the endocrine glands. At present, the available methods for the determination of TSH are characterized by high sensitivity (≤0.01 µIU/ml) and a lack of cross-reactivity. However, many drugs and substances, as well as pathological conditions, may affect the TSH level. The objective of this review is the evaluation of contemporary laboratory methods for the determination of TSH and the principles of interpretation of screening tests. In many countries, the TSH test is the only test performed in the diagnostics of thyroid function; nevertheless, it seems that for a genuine and objective assessment of thyroid status the TSH level should absolutely be determined together with the FT4 level, which allows the differentiation and assessment of the intensity of thyroid function disorders and the prediction of their consequences. The interpretation of TSH results in screening tests differs in such population groups as children aged under 14, pregnant women, the elderly, and patients with non-thyroidal illnesses. Of the currently used laboratory methods for determination of TSH levels, third-generation non-isotopic methods are most frequently recommended, especially the immunochemiluminescence method.
Cotton, Robin W; Fisher, Matthew B
2015-09-01
Forensic DNA testing is grounded in molecular biology and population genetics. The technologies that were the basis of restriction fragment length polymorphism (RFLP) testing have given way to PCR-based technologies. While PCR has been the pillar of short tandem repeat (STR) methods and will continue to be used as DNA sequencing and analysis of single nucleotide polymorphisms (SNPs) are introduced into human identification, the molecular biology techniques in use today represent significant advances since the introduction of STR testing. Large forensic laboratories with dedicated research teams and forensic laboratories which are part of academic institutions have the resources to keep track of advances, which can then be considered for further research or incorporated into current testing methods. However, many laboratories have limited ability to keep up with research advances outside of the immediate area of forensic science and may not have access to large university library systems. This review focuses on filling this gap with respect to areas of research that intersect with selected methods used in forensic biology. The review summarizes information collected from several areas of the scientific literature where advances in molecular biology have produced information relevant to DNA analysis of sexual assault evidence and to methods used in presumptive and confirmatory identification of semen. Older information from the literature is also included where this information may not be commonly known and is relevant to current methods. The topics selected highlight (1) information from applications of proteomics to sperm biology and human reproduction, (2) seminal fluid proteins and prostate cancer diagnostics, (3) developmental biology of sperm from the fertility literature, and (4) areas where methods are common to forensic analysis and research in contraceptive use and monitoring. Information and progress made in these areas coincide with the research interests of forensic biology, and cross-talk between these disciplines may benefit both. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
The variability of software scoring of the CDMAM phantom associated with a limited number of images
NASA Astrophysics Data System (ADS)
Yang, Chang-Ying J.; Van Metter, Richard
2007-03-01
Software scoring approaches provide an attractive alternative to human evaluation of CDMAM images from digital mammography systems, particularly for annual quality control testing as recommended by the European Protocol for the Quality Control of the Physical and Technical Aspects of Mammography Screening (EPQCM). Methods for correlating CDCOM-based results with human observer performance have been proposed. A common feature of all methods is the use of a small number (at most eight) of CDMAM images to evaluate the system. This study focuses on the potential variability in the estimated system performance that is associated with these methods. Sets of 36 CDMAM images were acquired under carefully controlled conditions from three different digital mammography systems. The threshold visibility thickness (TVT) for each disk diameter was determined using previously reported post-analysis methods from the CDCOM scorings for a randomly selected group of eight images for one measurement trial. This random selection process was repeated 3000 times to estimate the variability in the resulting TVT values for each disk diameter. The results from using different post-analysis methods, different random selection strategies, and different digital systems were compared. Additional variability for the 0.1 mm disk diameter was explored by comparing the results from two different image data sets acquired under the same conditions from the same system. The magnitude and the type of error estimated for experimental data were explained through modeling. The modeled results also suggest a limitation in the current phantom design for the 0.1 mm diameter disks. Through modeling, it was also found that, because of the binomial statistical nature of the CDMAM test, the true variability of the test could be underestimated by the commonly used method of random re-sampling.
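The random-selection variability studied above can be reproduced in miniature: score a single disk across 36 hypothetical images, then repeatedly estimate its detection fraction from random subsets of eight (a sketch under an assumed detection probability, not the CDCOM post-analysis):

```python
import numpy as np

rng = np.random.default_rng(3)
n_images, n_subset, n_trials = 36, 8, 3000

# Hypothetical correct/incorrect scoring of one disk cell per image,
# with an assumed true detection probability of 0.6
scores = rng.random(n_images) < 0.6

# Repeat the study's selection: 8 images drawn at random per trial
estimates = np.empty(n_trials)
for i in range(n_trials):
    subset = rng.choice(n_images, size=n_subset, replace=False)
    estimates[i] = scores[subset].mean()
```

The spread of `estimates` shows how much a detection fraction based on only eight images can wander, which propagates directly into the TVT variability the paper measures.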
Tyrer, Jonathan P; Guo, Qi; Easton, Douglas F; Pharoah, Paul D P
2013-06-06
The development of genotyping arrays containing hundreds of thousands of rare variants across the genome and advances in high-throughput sequencing technologies have made feasible empirical genetic association studies to search for rare disease susceptibility alleles. As single variant testing is underpowered to detect associations, the development of statistical methods to combine analysis across variants - so-called "burden tests" - is an area of active research interest. We previously developed a method, the admixture maximum likelihood test, to test multiple common variants for association with a trait of interest. We have extended this method, called the rare admixture maximum likelihood test (RAML), for the analysis of rare variants. In this paper we compare the performance of RAML with six other burden tests designed to test for association of rare variants. We used simulation testing over a range of scenarios to test the power of RAML compared to the other rare variant association testing methods. These scenarios modelled differences in effect variability, the average direction of effect, and the proportion of associated variants. We evaluated the power for all the different scenarios. RAML tended to have the greatest power for most scenarios where the proportion of associated variants was small, whereas SKAT-O performed a little better for the scenarios with a higher proportion of associated variants. The RAML method makes no assumptions about the proportion of variants that are associated with the phenotype of interest or the magnitude and direction of their effect. The method is flexible and can be applied to both dichotomous and quantitative traits and allows for the inclusion of covariates in the underlying regression model. The RAML method performed well compared to the other methods over a wide range of scenarios. Generally power was moderate in most of the scenarios, underscoring the need for large sample sizes in any form of association testing.
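A simple collapsing burden statistic, the baseline idea that RAML and the compared methods build on, can be sketched as follows (simulated genotypes, allele frequencies, and effect sizes are all assumptions, not the paper's simulation design):

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 2000, 30                       # individuals, rare variants
maf = rng.uniform(0.005, 0.02, m)     # hypothetical minor-allele freqs
geno = rng.binomial(2, maf, size=(n, m))

# A subset of the variants truly affects the quantitative trait
effects = np.zeros(m)
effects[:6] = 1.5
trait = geno @ effects + rng.normal(size=n)

# Collapsing burden test: per-person rare-allele count regressed on
# the trait, summarized here as a correlation-based z statistic
burden = geno.sum(axis=1)
r = np.corrcoef(burden, trait)[0, 1]
z = r * np.sqrt(n - 2) / np.sqrt(1 - r**2)
```

Collapsing assumes associated variants act in the same direction; the paper's scenarios with mixed effect directions are exactly where this simple statistic loses power and methods like RAML and SKAT-O differ.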
Method of testing gear wheels in impact bending
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tikhonov, A.K.; Palagin, Y.M.
1995-05-01
Chemicothermal treatment processes are widely used in engineering to improve the working lives of important components; the most common such process is nitrocementation. That process has been applied at the Volga Automobile Plant mainly to sprockets in gear transmissions, which need high hardness and wear resistance in the surfaces with relatively ductile cores. Although various forms of chemicothermal treatment are widely used, there has been no universal method of evaluating the strengths of gear wheels. Standard methods of estimating strength (σ_u, σ_t, σ_b, and hardness) have a major shortcoming: they can determine only the characteristics of the cores of case-hardened materials. Here we consider a method of impact bending testing, which enables one to evaluate the actual strength of gear teeth.
Preparation of pyrolysis reference samples: evaluation of a standard method using a tube furnace.
Sandercock, P Mark L
2012-05-01
A new, simple method for the reproducible creation of pyrolysis products from different materials that may be found at a fire scene is described. A temperature-programmable steady-state tube furnace was used to generate pyrolysis products from different substrates, including softwoods, paper, vinyl sheet flooring, and carpet. The temperature profile of the tube furnace was characterized, and the suitability of the method for reproducibly creating pyrolysates similar to those found in real fire debris was assessed. The use of this method to create proficiency tests that realistically test an examiner's ability to interpret complex gas chromatography-mass spectrometry fire debris data, and to create a library of pyrolysates generated from materials commonly found at a fire scene, is demonstrated. © 2011 American Academy of Forensic Sciences.
Instrumental Surveillance of Water Quality.
ERIC Educational Resources Information Center
Miller, J. A.; And Others
The role analytical instrumentation performs in the surveillance and control of the quality of water resources is reviewed. Commonly performed analyses may range from simple tests for physical parameters to more highly sophisticated radiological or spectrophotometric methods. This publication explores many of these types of water quality analyses…
FEASIBILITY OF HYDRAULIC FRACTURING OF SOILS TO IMPROVE REMEDIAL ACTIONS
Hydraulic fracturing, a technique commonly used to increase the yields of oil wells, could improve the effectiveness of several methods of in situ remediation. This project consisted of laboratory and field tests in which hydraulic fractures were created in soil. Laboratory te...
DEVELOPMENT OF A MOLECULAR METHOD TO DETECT ASTROVIRUS
Astrovirus is a common cause of gastroenteritis that has been determined to be responsible for several outbreaks. Since astrovirus can be waterborne, there is interest in testing environmental water for astrovirus and we have developed a sensitive RT-PCR assay that is designed t...
DOT National Transportation Integrated Search
2012-02-01
Hot mix asphalt (HMA) overlay is one of the most commonly used methods for rehabilitating deteriorated pavements. One major type of distress influencing the life of an overlay is reflective cracking. Many departments of transportation have implemente...
A novel tensile test method to assess texture and gaping in salmon fillets.
Ashton, Thomas J; Michie, Ian; Johnston, Ian A
2010-05-01
A new tensile strength method was developed to quantify the force required to tear a standardized block of Atlantic salmon muscle with the aim of identifying those samples more prone to factory downgrading as a result of softness and fillet gaping. The new method effectively overcomes problems of sample attachment encountered with previous tensile strength tests. The repeatability, sensitivity, and predictability of the new technique were evaluated against other common instrumental texture measurement methods. The relationship between sensory assessments of firmness and parameters from the instrumental texture methods was also determined. Data from the new method were shown to have the strongest correlations with gaping severity (r = -0.514, P < 0.001) and the highest level of repeatability when analyzing cold-smoked samples. The Warner Bratzler shear method gave the most repeatable data from fresh samples and had the highest correlations between fresh and smoked product from the same fish (r = 0.811, P < 0.001). A hierarchical cluster analysis placed the tensile test in the top cluster, alongside the Warner Bratzler method, demonstrating that it also yields adequate data with respect to these tests. None of the tested sensory analysis attributes showed significant relationships to mechanical tests except fillet firmness, with correlations (r) of 0.42 for cylinder probe maximum force (P = 0.005) and 0.31 for tensile work (P = 0.04). It was concluded that the tensile test method developed provides an important addition to the available tools for mechanical analysis of salmon quality, particularly with respect to the prediction of gaping during factory processing, which is a serious commercial problem. This novel, reliable method of measuring flesh tensile strength in salmon provides data relevant to gaping.
Improved wheal detection from skin prick test images
NASA Astrophysics Data System (ADS)
Bulan, Orhan
2014-03-01
Skin prick test is a commonly used method for the diagnosis of allergic diseases (e.g., pollen allergy, food allergy, etc.) in allergy clinics. The results of this test are erythema and a wheal provoked on the skin where the test is applied. The sensitivity of the patient to a specific allergen is determined by the physical size of the wheal, which can be estimated from images captured by digital cameras. Accurate wheal detection from these images is an important step for precise estimation of wheal size. In this paper, we propose a method for improved wheal detection on prick test images captured by digital cameras. Our method operates by first localizing the test region by detecting calibration marks drawn on the skin. The luminance variation across the localized region is eliminated by applying a color transformation from RGB to YCbCr and discarding the luminance channel. We enhance the contrast of the captured images for the purpose of wheal detection by performing principal component analysis on the blue-difference (Cb) and red-difference (Cr) color channels. We finally perform morphological operations on the contrast-enhanced image to detect the wheal on the image plane. Our experiments performed on images acquired from 36 different patients show the efficiency of the proposed method for wheal detection from skin prick test images captured in an uncontrolled environment.
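The core of the contrast-enhancement step described above (RGB to YCbCr, drop luminance, PCA on the chroma channels) can be sketched with numpy. This is a minimal illustration, not the authors' implementation; it omits the calibration-mark localization and morphological detection steps, and the BT.601 conversion coefficients are an assumption.

```python
import numpy as np

def chroma_pca_contrast(rgb):
    """Project the chroma (Cb, Cr) channels of an RGB image onto their
    first principal component, discarding luminance to suppress
    illumination variation.  rgb: float array (H, W, 3) in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # ITU-R BT.601 RGB -> YCbCr; the Y (luminance) channel is dropped
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 0.5 * r - 0.418688 * g - 0.081312 * b
    x = np.stack([cb.ravel(), cr.ravel()], axis=1)
    x -= x.mean(axis=0)
    # PCA via eigen-decomposition of the 2x2 chroma covariance matrix
    vals, vecs = np.linalg.eigh(np.cov(x.T))
    pc1 = vecs[:, np.argmax(vals)]       # direction of largest chroma variance
    return (x @ pc1).reshape(r.shape)    # contrast-enhanced single-channel image

# toy input: a red patch (stand-in for a wheal) on a dark background
img = np.zeros((8, 8, 3))
img[2:6, 2:6, 0] = 1.0
enhanced = chroma_pca_contrast(img)
```

On real photographs, the enhanced image would then be thresholded and cleaned up with morphological opening/closing, as the abstract describes.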
O'Neill, Suzanne C.; Tercyak, Kenneth P.; Baytop, Chanza; Alford, Sharon Hensley; McBride, Colleen M.
2015-01-01
Aims Personal genomic testing (PGT) for common disease risk is becoming increasingly frequent, but little is known about people's array of emotional reactions to learning their genomic risk profiles and the psychological harms/benefits of PGT. We conducted a study of post-PGT affect, including positive, neutral, and negative states that may arise after testing. Methods Two hundred twenty-eight healthy adults received PGT for common disease variants and completed a semi-structured research interview within two weeks of disclosure. Study participants reported how PGT results made them feel in their own words. Using an iterative coding process, responses were organized into three broad affective categories (Negative, Neutral, and Positive affect). Results Neutral affect was the most prevalent response (53.9%), followed by Positive affect (26.9%) and Negative affect (19.2%). We found no differences by gender, race or education. Conclusions While <20% of participants reported negative affect in response to learning their genomic risk profile for common disease, a majority experience either neutral or positive emotions. These findings contribute to the growing evidence that PGT does not impose significant psychological harms. Moreover, they point to a need to better link theories and assessments in both emotional and cognitive processing to capitalize on PGT information for healthy behavior change. PMID:25612474
Community Compensatory Trend Prevails from Tropical to Temperate Forest
Xiao, Lin; Yu, Shixiao; Li, Mingguang; Wang, Yongfan
2012-01-01
Community compensatory trend (CCT) is thought to facilitate persistence of rare species and thus stabilize species composition in tropical forests. However, whether CCT acts over broad geographical ranges is still in question. In this study, we tested for the presence of negative density dependence (NDD) and CCT in three forests along a tropical-temperate gradient. Inventory data were collected from forest communities located in three different latitudinal zones in China. Two widely used methods were used to test for NDD at the community level. The first method considered relationships between the relative abundance ratio and adult abundance. The second method emphasized the effect of adult abundance on abundance of established younger trees. Evidence for NDD acting on different growth forms was tested by using the first method, and the presence of CCT was tested by checking whether adult abundance of rare species affected that of established younger trees less than did abundance of common species. Both analyses indicated that NDD existed in seedling, sapling and pole stages in all three plant communities and that this effect increased with latitude. However, the extent of NDD varied among understory, midstory and canopy trees in the three communities along the gradient. Additionally, despite evidence of NDD for almost all common species, only a portion of rare species showed NDD, supporting the action of CCT in all three communities. So, we conclude that NDD and CCT prevail in the three recruitment stages of the tree communities studied; rare species achieve relative advantage through CCT and thus persist in these communities; CCT clearly facilitates newly established species and maintains tree diversity within communities across our latitudinal gradient. PMID:22701682
Jaureguiberry, María; Madoz, Laura Vanina; Giuliodori, Mauricio Javier; Wagener, Karen; Prunner, Isabella; Grunert, Tom; Ehling-Schulz, Monika; Drillich, Marc; de la Sota, Rodolfo Luzbel
2016-11-28
Uterine disorders are common postpartum diseases in dairy cows. In practice, uterine treatment is often based on systemic or locally applied antimicrobials with no previous identification of pathogens. Accurate on-farm diagnostics are not available, and routine testing is time-consuming and cost intensive. An accurate method that could simplify the identification of uterine pathogenic bacteria and improve pathogen-specific treatments could be an important advance to practitioners. The objective of the present study was to evaluate whether a database built with uterine bacteria from European dairy cows could be used to identify bacteria from Argentinean cows by Fourier transformed infrared (FTIR) spectroscopy. Uterine samples from 64 multiparous dairy cows with different types of vaginal discharge (VD) were collected between 5 and 60 days postpartum, analyzed by routine bacteriological testing methods and then re-evaluated by FTIR spectroscopy (n = 27). FTIR spectroscopy identified Escherichia coli in 12 out of 14 samples and Trueperella pyogenes in 8 out of 10 samples. The agreement between the two methods was good, with a kappa coefficient of 0.73. In addition, the likelihood of bacterial growth of common uterine pathogens such as E. coli and T. pyogenes tended to increase with VD score. The odds of a positive result for E. coli or T. pyogenes were 1.88 times higher in cows with fetid VD than in herdmates with clear normal VD. We conclude that the presence of E. coli and T. pyogenes in uterine samples from Argentinean dairy cows can be detected with FTIR with the use of a database built with uterine bacteria from European dairy cows. Future studies are needed to determine if FTIR can be used as an alternative to routine bacteriological testing methods.
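The agreement statistic reported above (kappa = 0.73) is Cohen's kappa, which corrects raw percent agreement between two methods for the agreement expected by chance. A minimal sketch, using made-up paired labels rather than the study's data:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    (here, two diagnostic methods) over the same samples."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    ca, cb = Counter(labels_a), Counter(labels_b)
    # agreement expected if the two methods labelled independently
    expected = sum(ca[k] * cb[k] for k in ca) / n ** 2
    return (observed - expected) / (1 - expected)

# illustrative labels only (1 = pathogen detected, 0 = not detected)
kappa = cohens_kappa([1, 1, 0, 0], [1, 1, 0, 1])
```

By the common Landis-Koch rubric, values between 0.61 and 0.80, such as the study's 0.73, are read as substantial agreement.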
Space Telecommunications Radio System (STRS) Architecture Standard. Release 1.02.1
NASA Technical Reports Server (NTRS)
Reinhart, Richard C.; Kacpura, Thomas J.; Handler, Louis M.; Hall, C. Steve; Mortensen, Dale J.; Johnson, Sandra K.; Briones, Janette C.; Nappier, Jennifer M.; Downey, Joseph A.; Lux, James P.
2012-01-01
This document contains the NASA architecture standard for software defined radios used in space- and ground-based platforms to enable commonality among radio developments to enhance capability and services while reducing mission and programmatic risk. Transceivers (or transponders) with functionality primarily defined in software (e.g., firmware) have the ability to change their functional behavior through software alone. This radio architecture standard offers value by employing common waveform software interfaces, method of instantiation, operation, and testing among different compliant hardware and software products. These common interfaces within the architecture abstract application software from the underlying hardware to enable technology insertion independently at either the software or hardware layer.
A Survey of Symplectic and Collocation Integration Methods for Orbit Propagation
NASA Technical Reports Server (NTRS)
Jones, Brandon A.; Anderson, Rodney L.
2012-01-01
Demands on numerical integration algorithms for astrodynamics applications continue to increase. Common methods, like explicit Runge-Kutta, meet the orbit propagation needs of most scenarios, but more specialized scenarios require new techniques to meet both computational efficiency and accuracy needs. This paper provides an extensive survey on the application of symplectic and collocation methods to astrodynamics. Both of these methods benefit from relatively recent theoretical developments, which improve their applicability to artificial satellite orbit propagation. This paper also details their implementation, with several tests demonstrating their advantages and disadvantages.
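The survey's own integrators are not reproduced here; as a minimal illustration of the symplectic property that motivates it, the sketch below runs a velocity-Verlet (leapfrog) scheme on a circular Kepler orbit. Its energy error stays bounded over many revolutions, whereas a non-symplectic explicit method of comparable order accumulates secular drift.

```python
import math

def leapfrog_kepler(x, y, vx, vy, dt, steps, mu=1.0):
    """Velocity-Verlet (leapfrog) integration of the planar two-body
    problem  r'' = -mu * r / |r|^3.  The scheme is symplectic, so the
    energy error oscillates but does not drift."""
    def acc(x, y):
        r3 = (x * x + y * y) ** 1.5
        return -mu * x / r3, -mu * y / r3
    ax, ay = acc(x, y)
    for _ in range(steps):
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay   # half kick
        x += dt * vx; y += dt * vy                 # drift
        ax, ay = acc(x, y)
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay   # half kick
    return x, y, vx, vy

def energy(x, y, vx, vy, mu=1.0):
    return 0.5 * (vx * vx + vy * vy) - mu / math.hypot(x, y)

# circular orbit with mu = 1, r = 1, v = 1 (specific energy -0.5)
e0 = energy(1.0, 0.0, 0.0, 1.0)
xf, yf, vxf, vyf = leapfrog_kepler(1.0, 0.0, 0.0, 1.0, dt=0.01, steps=10000)
drift = abs(energy(xf, yf, vxf, vyf) - e0)
```

This bounded-energy behavior is the practical reason symplectic schemes are attractive for long-arc orbit propagation despite their typically lower order.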
SU-E-T-468: Implementation of the TG-142 QA Process for Seven Linacs with Enhanced Beam Conformance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woollard, J; Ayan, A; DiCostanzo, D
2015-06-15
Purpose: To develop a TG-142 compliant QA process for 7 Varian TrueBeam linear accelerators (linacs) with enhanced beam conformance and dosimetrically matched beam models. To ensure consistent performance of all 7 linacs, the QA process should include a common set of baseline values for use in routine QA on all linacs. Methods: The TG-142 report provides recommended tests, tolerances and frequencies for quality assurance of medical accelerators. Based on the guidance provided in the report, measurement tests were developed to evaluate each of the applicable parameters listed for daily, monthly and annual QA. These tests were then performed on each of our 7 new linacs as they came on line at our institution. Results: The tolerance values specified in TG-142 for each QA test are either absolute tolerances (e.g., ±2 mm) or require a comparison to a baseline value. The results of our QA tests were first used to ensure that all 7 linacs were operating within the suggested tolerance values provided in TG-142 for those tests with absolute tolerances and that the performance of the linacs was adequately matched. The QA test results were then used to develop a set of common baseline values for those QA tests that require comparison to a baseline value at routine monthly and annual QA. The procedures and baseline values were incorporated into spreadsheets for use in monthly and annual QA. Conclusion: We have developed a set of procedures for daily, monthly and annual QA of our linacs that are consistent with the TG-142 report. A common set of baseline values was developed for routine QA tests. The use of this common set of baseline values for comparison at monthly and annual QA will ensure consistent performance of all 7 linacs.
Comparison of Shear-wave Profiles for a Compacted Fill in a Geotechnical Test Pit
NASA Astrophysics Data System (ADS)
Sylvain, M. B.; Pando, M. A.; Whelan, M.; Bents, D.; Park, C.; Ogunro, V.
2014-12-01
This paper investigates the use of common methods for geological seismic site characterization, including: i) multichannel analysis of surface waves (MASW), ii) crosshole seismic surveys, and iii) seismic cone penetrometer tests. The in-situ tests were performed in a geotechnical test pit located at the University of North Carolina at Charlotte High Bay Laboratory. The test pit has dimensions of 12 feet wide by 12 feet long by 10 feet deep. The pit was filled with a silty sand (SW-SM) soil, which was compacted in lifts using a vibratory plate compactor. The shear wave velocity values from the 3 techniques are compared in terms of magnitude versus depth as well as spatially. The comparison was carried out before and after inducing soil disturbance at controlled locations to evaluate which methods were better suited to capture the induced soil disturbance.
Measuring Diffusion of Liquids by Common-Path Interferometry
NASA Technical Reports Server (NTRS)
Rashidnia, Nasser
2003-01-01
A method of observing the interdiffusion of a pair of miscible liquids is based on the use of a common-path interferometer (CPI) to measure the spatially varying gradient of the index of refraction in the interfacial region in which the interdiffusion takes place. Assuming that the indices of refraction of the two liquids are different and that the gradient of the index of refraction of the liquid is proportional to the gradient in the relative concentrations of either liquid, the diffusivity of the pair of liquids can be calculated from the temporal variation of the spatial variation of the index of refraction. This method yields robust measurements and does not require precise knowledge of the indices of refraction of the pure liquids. Moreover, the CPI instrumentation is compact and is optomechanically robust by virtue of its common-path design. The two liquids are placed in a transparent rectangular parallelepiped test cell. Initially, the interface between the liquids is a horizontal plane, above which lies pure liquid 2 (the less-dense liquid) and below which lies pure liquid 1 (the denser liquid). The subsequent interdiffusion of the liquids gives rise to a gradient of concentration and a corresponding gradient of the index of refraction in a mixing layer. For the purpose of observing the interdiffusion, the test cell is placed in the test section of the CPI, in which a collimated, polarized beam of light from a low-power laser is projected horizontally through a region that contains the mixing layer.
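The reduction from measured gradients to a diffusivity is not spelled out in this summary. Under the standard one-dimensional diffusion model it implies, the concentration profile is an error function, so the index-of-refraction gradient is a Gaussian whose variance grows as sigma^2 = 2*D*t. The sketch below performs that last fitting step on synthetic numbers, not CPI data:

```python
def diffusivity_from_gradient_widths(times, variances):
    """1-D interdiffusion of two miscible liquids: the concentration
    profile is an erf, so its gradient is a Gaussian whose variance
    grows linearly, sigma^2 = 2*D*t.  Fit the slope by least squares
    through the origin and read off D = slope / 2."""
    num = sum(t * v for t, v in zip(times, variances))
    den = sum(t * t for t in times)
    return num / den / 2.0

# synthetic check: variances generated with D = 1e-9 m^2/s
D_true = 1e-9
ts = [10.0, 20.0, 40.0, 80.0]  # seconds
D_est = diffusivity_from_gradient_widths(ts, [2.0 * D_true * t for t in ts])
```

In practice each variance would come from fitting a Gaussian to the measured gradient profile at one instant, and noise would make the slope fit inexact rather than exact as here.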
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, R.; Harrison, D. E. Jr.
A variable time step integration algorithm for carrying out molecular dynamics simulations of atomic collision cascades is proposed which evaluates the interaction forces only once per time step. The algorithm is tested on some model problems which have exact solutions and is compared against other common methods. These comparisons show that the method has good stability and accuracy. Applications to Ar+ bombardment of Cu and Si show good accuracy and improved speed over the original method (D. E. Harrison, W. L. Gay, and H. M. Effron, J. Math. Phys. 10, 1179 (1969)).
Shen, Shaoshuai; Abe, Takumi; Tsuji, Taishi; Fujii, Keisuke; Ma, Jingyu; Okura, Tomohiro
2017-01-01
[Purpose] The purpose of this study was to investigate which of four chair-rising methods has a low load and the highest success rate, and whether the ground reaction force (GRF) parameters in that method are useful for measuring lower extremity function among physically frail Japanese older adults. [Subjects and Methods] Fifty-two individuals participated in this study. The participants voluntarily attempted four types of Sit-to-stand test (one variation without and three variations with the use of their arms). The following parameters were measured: peak reaction force (F/w), two force development rate parameters (RFD1.25/w, RFD8.75/w) and two time-related parameters (T1, T2). Three additional commonly employed clinical tests (One-leg balance with eyes open, Timed up and go and 5-meter walk test) were also conducted. [Results] The "hands on a chair" chair-rising method produced the highest success rate among the four methods. All parameters were highly reliable between testing occasions. T2 showed strongly significant associations with Timed up and go and 5-meter walk test in males. RFD8.75/w showed significant associations with Timed up and go and 5-meter walk test in females. [Conclusion] Ground reaction force parameters in the Sit-to-stand test are a reliable and useful method for assessment of lower extremity function in physically frail Japanese older adults. PMID:28931988
Yang, Chan; Xu, Bing; Zhang, Zhi-Qiang; Wang, Xin; Shi, Xin-Yuan; Fu, Jing; Qiao, Yan-Jiang
2016-10-01
Blending uniformity is essential to ensure the homogeneity of Chinese medicine formula particles within each batch. This study was based on the blending process of ebony spray-dried powder and dextrin (the proportion of dextrin was 10%), in which near infrared (NIR) diffuse reflectance spectra collected from six different sampling points were analyzed with a moving window F test method in order to assess the uniformity of the blending process. The method was validated against changes in citric acid content determined by HPLC. The results of the moving window F test method showed that the ebony spray-dried powder and dextrin were homogeneous during 200-300 r and segregated during 300-400 r. An advantage of this method is that the threshold value is defined statistically, not empirically, and thus does not suffer from the threshold ambiguities common with the moving block standard deviation (MBSD) method. This method could be employed to monitor other blending processes of Chinese medicine powders on line. Copyright© by the Chinese Pharmaceutical Association.
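The abstract does not give the exact form of the F statistic, so the sketch below is one plausible minimal version (an assumption, not the authors' algorithm): slide two adjacent windows along a signal, take the ratio of their sample variances, and flag segregation where the ratio exceeds an F-distribution critical value.

```python
def moving_window_f(series, window):
    """Slide two adjacent windows along a signal and return the ratio
    of their sample variances (larger over smaller, so it is >= 1).
    Ratios near 1 suggest homogeneity; large ratios suggest a change
    in variability, to be judged against an F critical value."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    ratios = []
    for i in range(len(series) - 2 * window + 1):
        v1 = var(series[i:i + window])
        v2 = var(series[i + window:i + 2 * window])
        ratios.append(max(v1, v2) / min(v1, v2))
    return ratios

# deterministic toy signal: variability jumps mid-series
series = [0, 1] * 20 + [0, 5] * 20
ratios = moving_window_f(series, window=10)
```

In the real application each point would be a summary of one NIR spectrum, and the threshold would come from the F distribution with (window - 1, window - 1) degrees of freedom, which is what makes the cutoff statistical rather than empirical.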
Huang, Jennifer Y; Henao, Olga L; Griffin, Patricia M; Vugia, Duc J; Cronquist, Alicia B; Hurd, Sharon; Tobin-D'Angelo, Melissa; Ryan, Patricia; Smith, Kirk; Lathrop, Sarah; Zansky, Shelley; Cieslak, Paul R; Dunn, John; Holt, Kristin G; Wolpert, Beverly J; Patrick, Mary E
2016-04-15
To evaluate progress toward prevention of enteric and foodborne illnesses in the United States, the Foodborne Diseases Active Surveillance Network (FoodNet) monitors the incidence of laboratory-confirmed infections caused by nine pathogens transmitted commonly through food in 10 U.S. sites. This report summarizes preliminary 2015 data and describes trends since 2012. In 2015, FoodNet reported 20,107 confirmed cases (defined as culture-confirmed bacterial infections and laboratory-confirmed parasitic infections), 4,531 hospitalizations, and 77 deaths. FoodNet also received reports of 3,112 positive culture-independent diagnostic tests (CIDTs) without culture-confirmation, a number that has markedly increased since 2012. Diagnostic testing practices for enteric pathogens are rapidly moving away from culture-based methods. The continued shift from culture-based methods to CIDTs that do not produce the isolates needed to distinguish between strains and subtypes affects the interpretation of public health surveillance data and ability to monitor progress toward prevention efforts. Expanded case definitions and strategies for obtaining bacterial isolates are crucial during this transition period.
Recent advances in genetic testing for familial hypercholesterolemia.
Iacocca, Michael A; Hegele, Robert A
2017-07-01
Familial hypercholesterolemia (FH) is a common genetic cause of premature coronary heart disease that is widely underdiagnosed and undertreated. To improve the identification of FH and initiate timely and appropriate treatment strategies, genetic testing is becoming increasingly offered worldwide as a central part of diagnosis. Areas covered: Recent advances have been propelled by an improved understanding of the genetic determinants of FH together with substantially reduced costs of appropriate screening strategies. Here we review the various methods available for obtaining a molecular diagnosis of FH, and highlight the particular advantages of targeted next-generation sequencing (NGS) platforms as the most robust approach. Furthermore, we note the importance of screening for copy number variants and common polymorphisms to aid in molecularly defining suspected FH cases. Expert commentary: The need for genetic analysis of FH will increase, both for diagnosis and reimbursement of new therapies. An effective molecular diagnostic method must detect: 1) molecular and gene locus heterogeneity; 2) a wide range of mutation types; and 3) the polygenic component of FH. As availability of genetic testing for FH expands, standardization of variant curation, maintenance of clinical databases and registries, and wider health care provider education all assume greater importance.
High-Tensile Strength Tape Versus High-Tensile Strength Suture: A Biomechanical Study.
Gnandt, Ryan J; Smith, Jennifer L; Nguyen-Ta, Kim; McDonald, Lucas; LeClere, Lance E
2016-02-01
To determine which suture design, high-tensile strength tape or high-tensile strength suture, performed better at securing human tissue across 4 selected suture techniques commonly used in tendinous repair, by comparing the total load at failure measured during a fixed-rate longitudinal single load to failure using a biomechanical testing machine. Matched sets of tendon specimens with bony attachments were dissected from 15 human cadaveric lower extremities in a manner allowing for direct comparison testing. With the use of selected techniques (simple Mason-Allen in the patellar tendon specimens, whip stitch in the quadriceps tendon specimens, and Krackow stitch in the Achilles tendon specimens), 1 sample of each set was sutured with a 2-mm braided, nonabsorbable, high-tensile strength tape and the other with a No. 2 braided, nonabsorbable, high-tensile strength suture. A total of 120 specimens were tested. Each model was loaded to failure at a fixed longitudinal traction rate of 100 mm/min. The maximum load and failure method were recorded. In the whip-stitch and the Krackow-stitch models, the high-tensile strength tape had a significantly greater mean load at failure, with a difference of 181 N (P = .001) and 94 N (P = .015) respectively. No significant difference was found in the Mason-Allen and simple stitch models. Pull-through remained the most common method of failure at an overall rate of 56.7% (suture = 55%; tape = 58.3%). In biomechanical testing during a single load to failure, high-tensile strength tape performs more favorably than high-tensile strength suture, with a greater mean load to failure, in both the whip- and Krackow-stitch models. Although suture pull-through remains the most common method of failure, high-tensile strength tape requires a significantly greater load to pull through in the whip-stitch and Krackow-stitch models.
The biomechanical data obtained in the current study indicates that high-tensile strength tape may provide better repair strength compared with high-tensile strength suture at time-zero simulated testing. Published by Elsevier Inc.
Testing a Novel Method to Approximate Wood Specific Gravity of Trees
Michael C. Wiemann; G. Bruce Williamson
2012-01-01
Wood specific gravity (SG) has long been used by foresters as an index for wood properties. More recently, SG has been widely used by ecologists as a plant functional trait and as a key variable in estimates of biomass. However, sampling wood to determine SG can be problematic; at present, the most common method is sampling with an increment borer to extract a bark-to-...
Experimental Fatigue Study of Composite Patch Repaired Steel Plates with Cracks
NASA Astrophysics Data System (ADS)
Karatzas, Vasileios A.; Kotsidis, Elias A.; Tsouvalis, Nicholas G.
2015-10-01
Cracks are among the most commonly encountered defects in metallic structures operating at sea. Composite patch repairing is a repair method which is gaining popularity as it counters most of the problems faced by conventional renewal repairs. Extensive studies can be found in the literature addressing the efficiency of this novel repair method using techniques which meet higher performance and monitoring standards than those commonly found in naval applications. In this work the efficiency of practices widely used in the ship repair industry for the implementation of composite patch repairing is addressed. To this end, steel plates repaired with composite patches were tested under fatigue loading. The composite patches consisted of carbon fibers in an epoxy matrix and were laminated directly onto the steel surface using the vacuum infusion method. Two different surface preparation methods, namely grit-blasting and mechanical treatment with a needle gun, were studied. In addition, in order to account for the harsh environmental conditions during the operating life of the structure and to study their effect on the repair, two different aging scenarios were considered. Non-destructive evaluation of the patches was performed so as to assess the quality of the repair and the evolution of debonding during testing.
Economic Evaluations of Pathology Tests, 2010-2015: A Scoping Review.
Watts, Rory D; Li, Ian W; Geelhoed, Elizabeth A; Sanfilippo, Frank M; St John, Andrew
2017-09-01
Concerns about pathology testing, such as the value provided by new tests and the potential for inappropriate utilization, have led to a greater need to assess costs and benefits. Economic evaluations are a formal method of analyzing costs and benefits, yet for pathology tests, questions remain about the scope and quality of the economic evidence. To describe the extent and quality of published evidence provided by economic evaluations of pathology tests from 2010 to 2015. Economic evaluations relating to pathology tests from 2010 to 2015 were reviewed. Eight databases were searched for published studies, and details recorded for the country, clinical focus, type of testing, and consideration of sensitivity, specificity, and false test results. The reporting quality of studies was assessed using the Consolidated Health Economic Evaluation Reporting Standards checklist, and cost-effectiveness ratios were analyzed for publication bias. We found 356 economic evaluations of pathology tests, most of which concerned developed countries. The most common economic evaluations were cost-utility analyses and the most common clinical focus was infectious diseases. More than half of the studies considered sensitivity and specificity, but few studies considered the impact of false test results. The average Consolidated Health Economic Evaluation Reporting Standards checklist score was 17 out of 24. Cost-utility ratios were commonly less than $10,000/quality-adjusted life-year or more than $200,000/quality-adjusted life-year. The number of economic evaluations of pathology tests has increased in recent years, but the rate of increase has plateaued. Furthermore, the quality of studies in the past 5 years was highly variable, and there is some question of publication bias in reporting cost-effectiveness ratios. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Development of phytotoxicity tests using wetland species
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, M.K.; Fairchild, J.F.
1994-12-31
Laboratory phytotoxicity tests used to assess contaminant effects may not effectively protect wetland communities. The authors are developing routine culture and testing methods for selected freshwater plants that can be used in risk assessments and monitoring of existing wetland systems. Utility of these tests includes evaluating the effects of point or non-point source contamination that may cause water or sediment quality degradation. Selected species include algae (blue-green, green), phytoflagellates (Chlamydomonas, Euglena), and floating or submerged vascular plants (milfoil, coontail, wild celery, elodea, duckweed). Algae toxicity tests include 2-d, 4-d, and 7-d tests, and macrophyte tests range from 10-d to 14-d. Metribuzin and boron are the selected contaminants for developing the test methods. Metribuzin, a triazinone herbicide, is a photosystem II inhibitor and is commonly used for control of grass and broad-leaf plants. As a plant micronutrient, boron is required in very small amounts, but excessive levels can result in phytotoxicity or accumulation. The investigations focus on important factors including light quality and quantity, and nutrient media. Reference toxicant exposures with potassium chloride are used to establish baseline data for sensitivity and vitality of the plants. These culture and test methods will be incorporated into recommendations for standard phytotoxicity test designs.
Statistical methods for investigating quiescence and other temporal seismicity patterns
Matthews, M.V.; Reasenberg, P.A.
1988-01-01
We propose a statistical model and a technique for objective recognition of one of the most commonly cited seismicity patterns: microearthquake quiescence. We use a Poisson process model for seismicity and define a process with quiescence as one with a particular type of piecewise constant intensity function. From this model, we derive a statistic for testing stationarity against a 'quiescence' alternative. The large-sample null distribution of this statistic is approximated from simulated distributions of appropriate functionals applied to Brownian bridge processes. We point out the restrictiveness of the particular model we propose and of the quiescence idea in general. The fact that there are many point processes which have neither constant nor quiescent rate functions underscores the need to test for and describe nonuniformity thoroughly. We advocate the use of the quiescence test in conjunction with various other tests for nonuniformity and with graphical methods such as density estimation. Ideally these methods may promote accurate description of temporal seismicity distributions and useful characterizations of interesting patterns. © 1988 Birkhäuser Verlag.
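A much simpler relative of the idea is a likelihood-ratio test of a constant-rate Poisson process against a rate that changes at a known time. The sketch below illustrates only that simplified case; the paper's statistic handles an unknown change time via Brownian-bridge functionals, so this is not the authors' test:

```python
import numpy as np
from scipy.special import xlogy
from scipy.stats import chi2

def quiescence_lr_test(event_times, t_change, t_end):
    """Likelihood-ratio test of a constant-rate Poisson process on [0, t_end]
    against a piecewise-constant rate that changes at t_change."""
    times = np.asarray(event_times, dtype=float)
    n1 = int(np.sum(times < t_change))
    n2 = times.size - n1
    lam1 = n1 / t_change                    # MLE rate before the candidate change
    lam2 = n2 / (t_end - t_change)          # MLE rate after it
    lam0 = times.size / t_end               # single rate under the null
    # Poisson-process log-likelihood is n*log(lam) - lam*T (xlogy handles n = 0)
    ll_alt = (xlogy(n1, lam1) - lam1 * t_change
              + xlogy(n2, lam2) - lam2 * (t_end - t_change))
    ll_null = xlogy(times.size, lam0) - lam0 * t_end
    lr = 2.0 * (ll_alt - ll_null)
    return lr, chi2.sf(lr, df=1)            # asymptotic p-value, one extra parameter
```

With the change time fixed in advance, the statistic is asymptotically chi-squared with one degree of freedom; scanning over candidate change times is what forces the Brownian-bridge machinery in the abstract.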
NASA Astrophysics Data System (ADS)
Parkin, G.; O'Donnell, G.; Ewen, J.; Bathurst, J. C.; O'Connell, P. E.; Lavabre, J.
1996-02-01
Validation methods commonly used to test catchment models are not capable of demonstrating a model's fitness for making predictions for catchments where the catchment response is not known (including hypothetical catchments, and future conditions of existing catchments which are subject to land-use or climate change). This paper describes the first use of a new method of validation (Ewen and Parkin, 1996. J. Hydrol., 175: 583-594) designed to address these types of application; the method involves making 'blind' predictions of selected hydrological responses which are considered important for a particular application. SHETRAN (a physically based, distributed catchment modelling system) is tested on a small Mediterranean catchment. The test involves quantification of the uncertainty in four predicted features of the catchment response (continuous hydrograph, peak discharge rates, monthly runoff, and total runoff), and comparison of observations with the predicted ranges for these features. The results of this test are considered encouraging.
Simple F Test Reveals Gene-Gene Interactions in Case-Control Studies
Chen, Guanjie; Yuan, Ao; Zhou, Jie; Bentley, Amy R.; Adeyemo, Adebowale; Rotimi, Charles N.
2012-01-01
Missing heritability is still a challenge for Genome Wide Association Studies (GWAS). Gene-gene interactions may partially explain this residual genetic influence and contribute broadly to complex disease. To analyze the gene-gene interactions in case-control studies of complex disease, we propose a simple, non-parametric method that utilizes the F-statistic. This approach consists of three steps. First, we examine the joint distribution of a pair of SNPs in cases and controls separately. Second, an F-test is used to evaluate the ratio of dependence in cases to that of controls. Finally, results are adjusted for multiple tests. This method was used to evaluate gene-gene interactions that are associated with risk of Type 2 Diabetes among African Americans in the Howard University Family Study. We identified 18 gene-gene interactions (P < 0.0001). Compared with the commonly used logistic regression method, we demonstrate that the F-ratio test is an efficient approach to measuring gene-gene interactions, especially for studies with limited sample size. PMID:22837643
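The three steps can be caricatured with a variance-ratio F test. This is only a hedged illustration: the published statistic is defined on the joint genotype distributions themselves, whereas here "dependence" is summarized by within-group regression residual variance, and the data in the test are synthetic:

```python
import numpy as np
from scipy.stats import f as f_dist

def interaction_f_test(snp1_cases, snp2_cases, snp1_controls, snp2_controls):
    """Illustrative F-ratio test: does the SNP1-SNP2 dependence differ
    between cases and controls?  Dependence is summarized here by the
    residual variance of regressing SNP2 on SNP1 within each group."""
    def resid_var(x, y):
        slope, intercept = np.polyfit(x, y, 1)      # within-group regression
        resid = y - (slope * x + intercept)
        return resid.var(ddof=2), len(y) - 2
    v_cases, df_cases = resid_var(snp1_cases, snp2_cases)
    v_controls, df_controls = resid_var(snp1_controls, snp2_controls)
    F = v_cases / v_controls
    # two-sided p-value for the variance ratio
    p = 2.0 * min(f_dist.cdf(F, df_cases, df_controls),
                  f_dist.sf(F, df_cases, df_controls))
    return F, min(p, 1.0)
```

The third step of the abstract, multiple-testing adjustment, would then be applied across all SNP pairs (e.g., Bonferroni over the number of pairs tested).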
Translating Computational Toxicology Data Through Stakeholder Outreach & Engagement (SOT)
US EPA has been using in vitro testing methods in an effort to accelerate the pace of chemical evaluations and address the significant lack of health and environmental data on the thousands of chemicals found in commonly used products. Since 2005, EPA’s researchers have generated...
Testing Interaction Effects without Discarding Variance.
ERIC Educational Resources Information Center
Lopez, Kay A.
Analysis of variance (ANOVA) and multiple regression are two of the most commonly used methods of data analysis in behavioral science research. Although ANOVA was intended for use with experimental designs, educational researchers have used ANOVA extensively in aptitude-treatment interaction (ATI) research. This practice tends to make researchers…
DEVELOPMENT OF A MOLECULAR METHOD TO IDENTIFY ASTROVIRUS IN WATER.
Astrovirus is a common cause of gastroenteritis that has been determined to be responsible for several outbreaks. Since astrovirus can be waterborne, there is interest in testing environmental water for astrovirus and we have developed a sensitive RT-PCR assay that is designed t...
NASA Technical Reports Server (NTRS)
Jones, R. T.
1976-01-01
For acoustic tests the violin is driven laterally at the bridge by a small speaker of the type commonly found in pocket transistor radios. An audio oscillator excites the tone which is picked up by a sound level meter. Gross patterns of vibration modes are obtained by the Chladni method.
Comparing Management Approaches for Automatic Test Systems: A Strategic Missile Case Study
2005-03-01
ground up, and is commonly conducted following five methods: ethnography, grounded theory, case study, phenomenological study, and biography... traditions frequently used (Creswell, 1998:5). The five traditions are biography, phenomenological study, grounded theory study, ethnography, and...
Trends in Mediation Analysis in Nursing Research: Improving Current Practice.
Hertzog, Melody
2018-06-01
The purpose of this study was to describe common approaches used by nursing researchers to test mediation models and evaluate them within the context of current methodological advances. MEDLINE was used to locate studies testing a mediation model and published from 2004 to 2015 in nursing journals. Design (experimental/correlation, cross-sectional/longitudinal, model complexity) and analysis (method, inclusion of test of mediated effect, violations/discussion of assumptions, sample size/power) characteristics were coded for 456 studies. General trends were identified using descriptive statistics. Consistent with findings of reviews in other disciplines, evidence was found that nursing researchers may not be aware of the strong assumptions and serious limitations of their analyses. Suggestions for strengthening the rigor of such studies and an overview of current methods for testing more complex models, including longitudinal mediation processes, are presented.
Testing Common Envelopes on Double White Dwarf Binaries
NASA Astrophysics Data System (ADS)
Nandez, Jose L. A.; Ivanova, Natalia; Lombardi, James C., Jr.
2015-06-01
The formation of a double white dwarf binary likely involves a common envelope (CE) event between a red giant and a white dwarf (WD) during the most recent episode of Roche lobe overflow mass transfer. We study the role of recombination energy with hydrodynamic simulations of such stellar interactions. We find that the recombination energy helps to expel the common envelope entirely, while if recombination energy is not taken into account, a significant fraction of the common envelope remains bound. We apply our numerical methods to constrain the progenitor system for WD 1101+364 - a double WD binary that has a well-measured mass ratio of q=0.87±0.03 and an orbital period of 0.145 days. Our best-fit progenitor for the pre-common envelope donor is a 1.5 M⊙ red giant.
Automated building of organometallic complexes from 3D fragments.
Foscato, Marco; Venkatraman, Vishwesh; Occhipinti, Giovanni; Alsberg, Bjørn K; Jensen, Vidar R
2014-07-28
A method for the automated construction of three-dimensional (3D) molecular models of organometallic species in design studies is described. Molecular structure fragments derived from crystallographic structures and accurate molecular-level calculations are used as 3D building blocks in the construction of multiple molecular models of analogous compounds. The method allows for precise control of stereochemistry and geometrical features that may otherwise be very challenging, or even impossible, to achieve with commonly available generators of 3D chemical structures. The new method was tested in the construction of three sets of active or metastable organometallic species of catalytic reactions in the homogeneous phase. The performance of the method was compared with those of commonly available methods for automated generation of 3D models, demonstrating higher accuracy of the prepared 3D models in general, and, in particular, a much wider range with respect to the kind of chemical structures that can be built automatically, with capabilities far beyond standard organic and main-group chemistry.
The application of biotechnological methods in authenticity testing.
Popping, Bert
2002-09-11
Counterfeiting of brand names in the food and drink industry, together with the fraudulent labelling and sale of low-quality products as premium products, has cost this sector of the industry significant amounts of money and has deceived the consumer. While it was difficult to establish certain types of fraud before the advent of modern biotechnology, DNA-based methods make an important contribution to protecting high-quality brand names and the consumer. Several years ago, DNA technologies were considered methods used in universities, primarily for research purposes, not so much for 'real-life' applications. However, this has changed, and a number of laboratories have specialised in offering such services to the industry. This article reviews DNA-based techniques commonly used for authenticity testing.
Gu, Hai Ting; Xie, Ping; Sang, Yan Fang; Wu, Zi Yi
2018-04-01
Abrupt change is an important manifestation of dramatic variation in hydrological processes in the context of global climate change, and its accurate recognition is of great significance for understanding hydrological process changes and for practical hydrology and water resources work. Traditional methods are not reliable near both ends of a sample series, and the results of different methods are often inconsistent. To solve this problem, we propose a comprehensive weighted recognition method for hydrological abrupt change, based on weighting 12 commonly used change-point tests. The reliability of the method was verified by Monte Carlo statistical tests. The results showed that the efficiency of the 12 methods was influenced by factors including the coefficient of variation (Cv) and the deviation coefficient (Cs) before the change point, the mean value difference coefficient, the Cv difference coefficient, and the Cs difference coefficient, but had no significant relationship with the mean value of the sequence. Based on the performance of each method in the statistical tests, a weight was assigned to each test method. The sliding rank sum test and the sliding run test had the highest weights, whereas the RS test had the lowest. By this means, the change point with the largest comprehensive weight can be selected as the final result when the results of the different methods are inconsistent. The method was used to analyze the daily maximum sequences (1-day, 3-day, 5-day, 7-day, and 1-month) of Jiajiu station in the lower reaches of the Lancang River. The results showed that each sequence had an obvious jump variation in 2004, in agreement with the physical causes of hydrological process change and water conservancy construction. The rationality and reliability of the proposed method were thus verified.
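The final selection step — choosing the change point with the largest comprehensive weight when the individual tests disagree — reduces to a weighted vote. In the sketch below the method names and weights are placeholders, not the calibrated values from the Monte Carlo experiments:

```python
from collections import defaultdict

def combine_change_points(detections, weights):
    """Sum the weight of every method voting for a candidate change point
    and return the point with the largest comprehensive weight."""
    score = defaultdict(float)
    for method, change_point in detections.items():
        score[change_point] += weights.get(method, 0.0)
    return max(score, key=score.get)

# Hypothetical detections from five change-point tests on an annual series.
detections = {"sliding_rank_sum": 2004, "sliding_run": 2004,
              "ordered_clustering": 1998, "pettitt": 2004, "rs_analysis": 1986}
weights = {"sliding_rank_sum": 1.0, "sliding_run": 0.9,
           "ordered_clustering": 0.6, "pettitt": 0.7, "rs_analysis": 0.2}
```

Here the three high-weight methods agreeing on 2004 outvote the two dissenters, mirroring how the paper resolves inconsistent results.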
A Statistical Approach for Testing Cross-Phenotype Effects of Rare Variants
Broadaway, K. Alaine; Cutler, David J.; Duncan, Richard; Moore, Jacob L.; Ware, Erin B.; Jhun, Min A.; Bielak, Lawrence F.; Zhao, Wei; Smith, Jennifer A.; Peyser, Patricia A.; Kardia, Sharon L.R.; Ghosh, Debashis; Epstein, Michael P.
2016-01-01
Increasing empirical evidence suggests that many genetic variants influence multiple distinct phenotypes. When cross-phenotype effects exist, multivariate association methods that consider pleiotropy are often more powerful than univariate methods that model each phenotype separately. Although several statistical approaches exist for testing cross-phenotype effects for common variants, there is a lack of similar tests for gene-based analysis of rare variants. In order to fill this important gap, we introduce a statistical method for cross-phenotype analysis of rare variants using a nonparametric distance-covariance approach that compares similarity in multivariate phenotypes to similarity in rare-variant genotypes across a gene. The approach can accommodate both binary and continuous phenotypes and further can adjust for covariates. Our approach yields a closed-form test whose significance can be evaluated analytically, thereby improving computational efficiency and permitting application on a genome-wide scale. We use simulated data to demonstrate that our method, which we refer to as the Gene Association with Multiple Traits (GAMuT) test, provides increased power over competing approaches. We also illustrate our approach using exome-chip data from the Genetic Epidemiology Network of Arteriopathy. PMID:26942286
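The core idea — comparing similarity in multivariate phenotypes to similarity in rare-variant genotypes across individuals — can be sketched with cross-product similarity matrices. This omits the projection/centering, kernel choices, and analytic p-value machinery of the actual GAMuT test, so treat it as the flavor only:

```python
import numpy as np

def similarity_statistic(Y, G):
    """Cross-similarity of multivariate phenotypes Y (n x p) and
    rare-variant genotypes G (n x m, minor-allele counts).  Larger values
    mean phenotype similarity tracks genotype similarity."""
    Yc = Y - Y.mean(axis=0)
    Gc = G - G.mean(axis=0)
    P = Yc @ Yc.T                     # n x n phenotype similarity
    K = Gc @ Gc.T                     # n x n genotype similarity
    n = Y.shape[0]
    return np.trace(P @ K) / n ** 2   # equals ||Yc.T @ Gc||_F**2 / n**2, so >= 0
```

Because the statistic is a squared Frobenius norm of the phenotype-genotype cross-covariance, it is non-negative and zero only when the centered blocks are uncorrelated; the published test derives the null distribution of a statistic of this type analytically.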
Conductive surge testing of circuits and systems
NASA Technical Reports Server (NTRS)
Richman, P.
1980-01-01
Techniques are given for conductive surge testing of powered electronic equipment. The correct definitions of common and normal mode are presented. Testing requires not only spike-surge generators with a suitable range of open-circuit voltage and short-circuit current waveshapes, but also appropriate means, termed couplers, for connecting test surges to the equipment under test. Key among coupler design considerations is minimization of false positives resulting from reduction in delivered surge energy due to the coupler. Back-filters, and the lines on which they are necessary, are considered, as well as ground faults and ground potential rise. A method for monitoring delivered and resulting surge waves is mentioned.
Chronic ankle instability: Current perspectives
Al-Mohrej, Omar A.; Al-Kenani, Nader S.
2016-01-01
Ankle sprain is reported to be among the most common recurrent injuries. About 20% of acute ankle sprain patients develop chronic ankle instability. The failure of functional rehabilitation after acute ankle sprain leads to the development of chronic ankle instability. Differentiation between functional and anatomical ankle instability is essential to guide proper treatment. Stability testing by the varus stress test and the anterior drawer test should be carried out. Subtalar instability is an important pathology that is commonly bypassed during the assessment of chronic ankle instability. Unlike acute ankle sprain, chronic ankle instability might require surgical intervention. The surgical and conservative management options are greatly informed by in-depth knowledge of ankle anatomy, biomechanics, and pathology. Anatomical repair, augmentation by tendon, or both are the basic methods of surgical intervention. Arthroscopy is becoming more popular in the management of chronic ankle instability. PMID:27843798
QCL spectroscopy combined with the least squares method for substance analysis
NASA Astrophysics Data System (ADS)
Samsonov, D. A.; Tabalina, A. S.; Fufurin, I. L.
2017-11-01
The article briefly describes distinctive features of quantum cascade lasers (QCL). It also describes an experimental set-up for acquiring mid-infrared absorption spectra using QCL. The paper demonstrates experimental results in the form of normalized spectra. We tested the application of the least squares method for spectrum analysis, using it for substance identification and extraction of concentration data. We compare the results with more common methods of absorption spectroscopy. Eventually, we prove the feasibility of using this simple method for quantitative and qualitative analysis of experimental data acquired with QCL.
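The least-squares step amounts to expressing the measured spectrum as a linear combination of reference spectra and reading the fitted coefficients as concentrations. The reference shapes, grid, and mixture fractions below are invented for illustration:

```python
import numpy as np

# Reference absorption spectra of two hypothetical substances on a
# common wavenumber grid (made-up Gaussian band shapes).
grid = np.linspace(900, 1200, 300)
ref_a = np.exp(-((grid - 1000) / 15.0) ** 2)
ref_b = np.exp(-((grid - 1100) / 25.0) ** 2)
A = np.column_stack([ref_a, ref_b])

# Simulated measured spectrum: a 0.7/0.3 mixture of the two, plus noise.
rng = np.random.default_rng(1)
measured = 0.7 * ref_a + 0.3 * ref_b + rng.normal(0, 0.005, grid.size)

# Ordinary least squares: coefficients that best explain the mixture.
conc, residuals, rank, _ = np.linalg.lstsq(A, measured, rcond=None)
```

The recovered `conc` approximates the true mixture fractions; substance identification follows by checking which reference spectra receive non-negligible coefficients.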
On the efficacy of procedures to normalize Ex-Gaussian distributions.
Marmolejo-Ramos, Fernando; Cousineau, Denis; Benites, Luis; Maehara, Rocío
2014-01-01
Reaction time (RT) is one of the most common types of measure used in experimental psychology. Its distribution is not normal (Gaussian) but resembles a convolution of normal and exponential distributions (Ex-Gaussian). One of the major assumptions in parametric tests (such as ANOVAs) is that variables are normally distributed; hence, it is acknowledged by many that the normality assumption is not met. This paper presents different procedures to normalize data sampled from an Ex-Gaussian distribution in such a way that they are suitable for parametric tests based on the normality assumption. Using simulation studies, various outlier elimination and transformation procedures were tested against the level of normality they provide. The results suggest that the transformation methods are better than elimination methods in normalizing positively skewed data, and that the more skewed the distribution, the more effective the transformation methods are in normalizing such data. Specifically, transformation with parameter lambda -1 leads to the best results.
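Reading "parameter lambda -1" as the reciprocal member of the Box-Cox power-transform family (an assumption on our part), the effect on a simulated Ex-Gaussian sample can be seen directly; the distribution parameters below are arbitrary, chosen only to mimic reaction times in milliseconds:

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(42)
# Ex-Gaussian sample: a normal component plus an exponential tail.
rt = rng.normal(400.0, 40.0, 5000) + rng.exponential(150.0, 5000)

# Box-Cox with lambda = -1 is, up to shift and scale, the negative
# reciprocal transform (negated so the order of scores is preserved).
transformed = -1.0 / rt

print(f"skew before: {skew(rt):.2f}, after: {skew(transformed):.2f}")
```

The raw sample is strongly right-skewed (theoretical skewness about 1.8 for these parameters), while the transformed sample is much closer to symmetric.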
NASA Astrophysics Data System (ADS)
Abdellatef, Hisham E.
2007-04-01
Picric acid, bromocresol green, bromothymol blue, cobalt thiocyanate and molybdenum(V) thiocyanate have been tested as spectrophotometric reagents for the determination of disopyramide and irbesartan. Reaction conditions have been optimized to obtain coloured complexes of higher sensitivity and longer stability. The absorbance of the ion-pair complexes formed was found to increase linearly with increasing concentrations of disopyramide and irbesartan, as corroborated by the correlation coefficient values. The developed methods have been successfully applied for the determination of disopyramide and irbesartan in bulk drugs and pharmaceutical formulations. The common excipients and additives did not interfere in their determination. The results obtained by the proposed methods have been statistically compared by means of Student's t-test and the variance-ratio F-test. The validity was assessed by applying the standard addition technique. The results were compared statistically with the official or reference methods, showing good agreement with high precision and accuracy.
On determining the most appropriate test cut-off value: the case of tests with continuous results
Habibzadeh, Parham; Yadollahie, Mahboobeh
2016-01-01
There are several criteria for determination of the most appropriate cut-off value in a diagnostic test with continuous results. Mostly based on receiver operating characteristic (ROC) analysis, there are various methods to determine the test cut-off value. The most common criteria are the point on ROC curve where the sensitivity and specificity of the test are equal; the point on the curve with minimum distance from the left-upper corner of the unit square; and the point where the Youden’s index is maximum. There are also methods mainly based on Bayesian decision analysis. Herein, we show that a proposed method that maximizes the weighted number needed to misdiagnose, an index of diagnostic test effectiveness we previously proposed, is the most appropriate technique compared to the aforementioned ones. For determination of the cut-off value, we need to know the pretest probability of the disease of interest as well as the costs incurred by misdiagnosis. This means that even for a certain diagnostic test, the cut-off value is not universal and should be determined for each region and for each disease condition. PMID:27812299
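Of the ROC-based criteria listed, Youden's index is the easiest to make concrete: scan candidate thresholds and keep the one maximizing J = sensitivity + specificity - 1. The sketch below deliberately ignores pretest probability and misdiagnosis costs — the very quantities the authors argue should drive the choice — so it illustrates one criterion, not the paper's recommended method:

```python
import numpy as np

def youden_cutoff(scores, labels):
    """Threshold maximizing Youden's J = sensitivity + specificity - 1.
    scores: continuous test results; labels: 1/True = diseased."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    best_t, best_j = None, -1.0
    for t in np.unique(scores):
        pred = scores >= t                    # call "positive" at/above t
        sens = np.mean(pred[labels])          # true-positive rate
        spec = np.mean(~pred[~labels])        # true-negative rate
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j
```

A cost-weighted criterion would replace J with a weighted combination of the error rates, which is why the optimal cut-off is not universal but depends on disease prevalence and misdiagnosis costs in each setting.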
Wang, Xuefeng; Lee, Seunggeun; Zhu, Xiaofeng; Redline, Susan; Lin, Xihong
2013-12-01
Family-based genetic association studies of related individuals provide opportunities to detect genetic variants that complement studies of unrelated individuals. Most statistical methods for family association studies of common variants are single-marker based, testing one SNP at a time. In this paper, we consider testing the effect of an SNP set, e.g., SNPs in a gene, in family studies, for both continuous and discrete traits. Specifically, we propose a generalized estimating equations (GEE) based kernel association test, a variance component based testing method, to test for the association between a phenotype and multiple variants in an SNP set jointly using family samples. The proposed approach allows for both continuous and discrete traits, where the correlation among family members is taken into account through the use of an empirical covariance estimator. We derive the theoretical distribution of the proposed statistic under the null and develop analytical methods to calculate the P-values. We also propose an efficient resampling method for correcting for small sample size bias in family studies. The proposed method allows for easily incorporating covariates and SNP-SNP interactions. Simulation studies show that the proposed method properly controls for type I error rates under both random and ascertained sampling schemes in family studies. We demonstrate through simulation studies that our approach has superior performance for association mapping compared to the single marker based minimum P-value GEE test for an SNP-set effect over a range of scenarios. We illustrate the application of the proposed method using data from the Cleveland Family GWAS Study. © 2013 WILEY PERIODICALS, INC.
Occupational rhinitis in the Slovak Republic--a long-term retrospective study.
Perečinský, Slavomir; Legáth, L'ubomír; Varga, Marek; Javorský, Martin; Bátora, Igor; Klimentová, Gabriela
2014-12-01
Allergic and non-allergic rhinitis rank among the common occupational health problems. However, data on the incidence of occupational rhinitis are lacking, since comprehensive studies are rare. The study includes a group of patients in the Slovak Republic who were reported as having occupational rhinitis in the years 1990-2011. The following parameters were tracked in the investigated sample: age, gender, number of cases by individual years, occupations, causative factors, and the length of exposure to the given agent. Possible progression of rhinitis to bronchial asthma was evaluated as well. The diagnostic algorithm, which included skin tests, the examination of specific IgE antibodies, and nasal provocation tests, was also analysed retrospectively. A total of 70 cases of occupational rhinitis were reported. The disease most often occurred in food industry workers (50% of cases). The most common aetiological factor was flour. Among other relatively common allergens were synthetic textiles, wool, cotton and different types of moulds. Significant agents were also various chemical factors causing allergic and irritant rhinitis. The average length of exposure was 14.8 years. Exposure was shorter in men than in women (11 years vs. 16 years) (p = 0.04). Bronchial asthma as a comorbidity was diagnosed in 13 patients (19.7%). The skin test was the critical diagnostic method, confirming the causal association between rhinitis and the work environment in 59% of cases; confirmation of the occupational cause using the nasal provocation test was less frequent (18%). The food industry, textile industry and agriculture were the most risky occupational environments. Workers in these sectors require preventive intervention. In cases showing rhinitis symptoms it is necessary to confirm the occupational aetiology of the disease by objective diagnostic methods.
Since occupational rhinitis mostly precedes occupational asthma, elimination of the exposure from the workplace is necessary.
Diagnostic Performance of a Molecular Test versus Clinician Assessment of Vaginitis.
Schwebke, Jane R; Gaydos, Charlotte A; Nyirjesy, Paul; Paradis, Sonia; Kodsi, Salma; Cooper, Charles K
2018-06-01
Vaginitis is a common complaint, diagnosed either empirically or using Amsel's criteria and wet mount microscopy. This study sought to determine characteristics of an investigational test (a molecular test for vaginitis), compared to reference, for detection of bacterial vaginosis, Candida spp., and Trichomonas vaginalis. Vaginal specimens from a cross-sectional study were obtained from 1,740 women (≥18 years old), with vaginitis symptoms, during routine clinic visits (across 10 sites in the United States). Specimens were analyzed using a commercial PCR/fluorogenic probe-based investigational test that detects bacterial vaginosis, Candida spp., and Trichomonas vaginalis. Clinician diagnosis and in-clinic testing (Amsel's test, potassium hydroxide preparation, and wet mount) were also employed to detect the three vaginitis causes. All testing methods were compared to the respective reference methods (Nugent Gram stain for bacterial vaginosis, detection of the Candida gene its2, and Trichomonas vaginalis culture). The investigational test, clinician diagnosis, and in-clinic testing were compared to reference methods for bacterial vaginosis, Candida spp., and Trichomonas vaginalis. The investigational test resulted in significantly higher sensitivity and negative predictive value than clinician diagnosis or in-clinic testing. In addition, the investigational test showed a statistically higher overall percent agreement with each of the three reference methods than did clinician diagnosis or in-clinic testing. The investigational test showed significantly higher sensitivity for detecting vaginitis involving more than one cause than did clinician diagnosis. Taken together, these results suggest that a molecular investigational test can facilitate accurate detection of vaginitis. Copyright © 2018 Schwebke et al.
[Reliable microbiological diagnosis of vulvovaginal candidiasis].
Baykushev, R; Ouzounova-Raykova, V; Stoykova, V; Mitov, I
2014-01-01
Vulvovaginal candidiasis is a common infection among those affecting the vulva and vagina. It is caused by representatives of the genus Candida, in most cases C. albicans (85-90%). An increase in the percentage of the so-called non-albicans agents is seen, and these pathogens are often resistant to the antifungals most commonly used in practice. Faulty diagnosis, incorrect use of azoles, and self-treatment lead to selection of resistant strains and recurrent infections. The aim was identification of Candida species associated with vulvovaginal candidiasis by conventional and PCR techniques. For six months a total number of 213 vaginal secretions were tested applying Gram stain and cultivation on ChromAgar. API Candida fermentation tests and API 20C AUX assimilation tests were performed for identification of the isolates. Extraction of DNA from all the smears with subsequent PCR detection of different Candida species was done. 80.7% of materials showed presence of blastospores and/or hyphae. Positive culture results were detected in 60 (28.2%) samples. The species-specific identification revealed presence of C. albicans in 51 (85%) smears, C. glabrata in 8 (13.3%), C. krusei in 2 (3.3%), and S. cerevisiae in 1 (2.1%). The PCR technique confirmed the results of the conventional methods. It is worth mentioning that 51 of the tested smears were positive for G. vaginalis using additional PCR. The correct diagnosis of the cause of vulvovaginal candidiasis helps in the correct choice of appropriate antifungal therapy and prevents development of recurrent infections and consequences. The PCR-based method is rapid, specific and sensitive. It correlates perfectly with the results from the conventional diagnostic tests, so it could be selected as a method of choice for the diagnosis of vulvovaginal candidiasis.
Computational Design and Analysis of a Transonic Natural Laminar Flow Wing for a Wind Tunnel Model
NASA Technical Reports Server (NTRS)
Lynde, Michelle N.; Campbell, Richard L.
2017-01-01
A natural laminar flow (NLF) wind tunnel model has been designed and analyzed for a wind tunnel test in the National Transonic Facility (NTF) at the NASA Langley Research Center. The NLF design method is built into the CDISC design module and uses a Navier-Stokes flow solver, a boundary layer profile solver, and stability analysis and transition prediction software. The NLF design method alters the pressure distribution to support laminar flow on the upper surface of wings with high sweep and flight Reynolds numbers. The method addresses transition due to attachment line contamination/transition, Gortler vortices, and crossflow and Tollmien-Schlichting modal instabilities. The design method is applied to the wing of the Common Research Model (CRM) at transonic flight conditions. Computational analysis predicts significant extents of laminar flow on the wing upper surface, which results in drag savings. A 5.2 percent scale semispan model of the CRM NLF wing will be built and tested in the NTF. This test will aim to validate the NLF design method, as well as characterize the laminar flow testing capabilities in the wind tunnel facility.
Association analysis of multiple traits by an approach of combining P values.
Chen, Lili; Wang, Yong; Zhou, Yajing
2018-03-01
Increasing evidence shows that one variant can affect multiple traits, a widespread phenomenon in complex diseases. Joint analysis of multiple traits can increase the statistical power of association analysis and uncover the underlying genetic mechanism. Although there are many statistical methods to analyse multiple traits, most of these methods are usually suitable for detecting common variants associated with multiple traits. However, because of the low minor allele frequencies of rare variants, these methods are not optimal for rare variant association analysis. In this paper, we extend an adaptive combination of P values method (termed ADA) for a single trait to test association between multiple traits and rare variants in a given region. For a given region, we use a reverse regression model to test each rare variant for association with multiple traits and obtain the P value of the single-variant test. Further, we take the weighted combination of these P values as the test statistic. Extensive simulation studies show that our approach is more powerful than several other comparison methods in most cases and is robust to the inclusion of a high proportion of neutral variants and to different directions of effects of causal variants.
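The final combination step can be sketched with a Stouffer-style weighted sum of transformed P values. The actual ADA statistic combines P values adaptively, truncating uninformative variants, so this conveys only the general flavor of weighted P-value combination:

```python
import numpy as np
from scipy.stats import norm

def weighted_p_combination(p_values, weights):
    """Stouffer-style weighted combination of single-variant P values:
    convert each P value to a z-score, take the weighted sum, and
    convert back to a combined one-sided P value."""
    p = np.asarray(p_values, dtype=float)
    w = np.asarray(weights, dtype=float)
    z = norm.isf(p)                            # per-variant z-scores
    z_comb = np.sum(w * z) / np.sqrt(np.sum(w ** 2))
    return norm.sf(z_comb)                     # combined P value
```

Weights would typically up-weight variants judged more likely causal (e.g., by frequency or annotation); with uninformative P values near 0.5, the combination stays near 0.5.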
Evaluation of Thermoelectric Performance and Durability of Functionalized Skutterudite Legs
NASA Astrophysics Data System (ADS)
Skomedal, Gunstein; Kristiansen, Nils R.; Sottong, Reinhard; Middleton, Hugh
2017-04-01
Thermoelectric generators are a promising technology for waste heat recovery. As new materials and devices enter a market penetration stage, it is of interest to employ fast and efficient measurement methods to evaluate the long-term stability of thermoelectric materials in combination with metallization and coating (functionalized thermoelectric legs). We have investigated a method for measuring several thermoelectric legs simultaneously. The legs are put under a common temperature gradient, and the electrical characteristics of each leg are measured individually during thermal cycling. Using this method, one can test different types of metallization and coating applied to skutterudite thermoelectric legs and look at the relative changes over time. Postcharacterization of these initial tests with skutterudite legs using a potential Seebeck microprobe and an electron microscope showed that oxidation and interlayer diffusion are the main reasons for the gradual increase in internal resistance and the decrease in open-circuit voltage. Although we only tested skutterudite material in this work, the method is fully capable of testing all kinds of material, metallization, and coating. It is thus a promising method for studying the relationship between failure modes and mechanisms of functionalized thermoelectric legs.
Robustness of S1 statistic with Hodges-Lehmann for skewed distributions
NASA Astrophysics Data System (ADS)
Ahad, Nor Aishah; Yahaya, Sharipah Soaad Syed; Yin, Lee Ping
2016-10-01
Analysis of variance (ANOVA) is a commonly used parametric method to test for differences in means across more than two groups when the populations are normally distributed. ANOVA is highly inefficient under non-normal and heteroscedastic settings. When its assumptions are violated, researchers look for alternatives such as the nonparametric Kruskal-Wallis test or robust methods. This study focused on a flexible method, the S1 statistic, for comparing groups using the median as the location estimator. The S1 statistic was modified by substituting the median with the Hodges-Lehmann estimator, and the default scale estimator with the variance of the Hodges-Lehmann estimator and with MADn, to produce two different test statistics for comparing groups. The bootstrap method was used for testing the hypotheses, since the sampling distributions of these modified S1 statistics are unknown. The performance of the proposed statistics in terms of Type I error was measured and compared against the original S1 statistic, ANOVA and Kruskal-Wallis. The proposed procedures show improvement over the original statistic, especially under extremely skewed distributions.
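The one-sample Hodges-Lehmann estimator substituted into S1 can be computed as the median of all pairwise Walsh averages. This is a generic sketch of the estimator itself, not the authors' full bootstrap testing procedure:

```python
import itertools
import statistics

def hodges_lehmann(x):
    """One-sample Hodges-Lehmann location estimator: the median of all
    pairwise Walsh averages (x_i + x_j) / 2 for i <= j.  It is robust
    to skewness and outliers, which motivates substituting it for the
    median in the modified S1 statistics; the null distribution of the
    resulting test statistics is then approximated by bootstrapping."""
    walsh = [(a + b) / 2
             for a, b in itertools.combinations_with_replacement(x, 2)]
    return statistics.median(walsh)

# One extreme outlier barely moves the estimate (the mean would be 26.5).
est = hodges_lehmann([1, 2, 3, 100])  # -> 2.75
```

The O(n²) pairwise construction is the price paid for robustness; for the moderate group sizes typical of simulation studies this is not a concern.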
Mapping common aphasia assessments to underlying cognitive processes and their neural substrates
Lacey, Elizabeth H.; Skipper-Kallal, LM; Xing, S; Fama, ME; Turkeltaub, PE
2017-01-01
Background: Understanding the relationships between clinical tests, the processes they measure, and the brain networks underlying them is critical in order for clinicians to move beyond aphasia syndrome classification toward specification of individual language process impairments. Objective: To understand the cognitive, language, and neuroanatomical factors underlying scores on commonly used aphasia tests. Methods: Twenty-five behavioral tests were administered to a group of 38 chronic left hemisphere stroke survivors, and high-resolution MRI was obtained. Test scores were entered into a principal components analysis to extract the latent variables (factors) measured by the tests. Multivariate lesion-symptom mapping was used to localize lesions associated with the factor scores. Results: The principal components analysis yielded four dissociable factors, which we labeled Word Finding/Fluency, Comprehension, Phonology/Working Memory Capacity, and Executive Function. While many tests loaded onto the factors in predictable ways, some relied heavily on factors not commonly associated with the tests. Lesion-symptom mapping demonstrated discrete brain structures associated with each factor, including frontal, temporal, and parietal areas extending beyond the classical language network. Specific functions mapped onto brain anatomy largely in correspondence with modern neural models of language processing. Conclusions: An extensive clinical aphasia assessment identifies four independent language functions, relying on discrete parts of the left middle cerebral artery territory. A better understanding of the processes underlying cognitive tests and the link between lesion and behavior may lead to improved aphasia diagnosis and may yield treatments better targeted to an individual's specific pattern of deficits and preserved abilities. PMID:28135902
Statistical inference methods for two crossing survival curves: a comparison of methods.
Li, Huimin; Han, Dong; Hou, Yawen; Chen, Huilin; Chen, Zheng
2015-01-01
A common problem that is encountered in medical applications is the overall homogeneity of survival distributions when two survival curves cross each other. A survey demonstrated that under this condition, which was an obvious violation of the assumption of proportional hazard rates, the log-rank test was still used in 70% of studies. Several statistical methods have been proposed to solve this problem. However, in many applications, it is difficult to specify the types of survival differences and choose an appropriate method prior to analysis. Thus, we conducted an extensive series of Monte Carlo simulations to investigate the power and type I error rate of these procedures under various patterns of crossing survival curves with different censoring rates and distribution parameters. Our objective was to evaluate the strengths and weaknesses of tests in different situations and for various censoring rates and to recommend an appropriate test that will not fail for a wide range of applications. Simulation studies demonstrated that adaptive Neyman's smooth tests and the two-stage procedure offer higher power and greater stability than other methods when the survival distributions cross at early, middle or late times. Even for proportional hazards, both methods maintain acceptable power compared with the log-rank test. In terms of the type I error rate, Renyi and Cramér-von Mises tests are relatively conservative, whereas the statistics of the Lin-Xu test exhibit apparent inflation as the censoring rate increases. Other tests produce results close to the nominal 0.05 level. In conclusion, adaptive Neyman's smooth tests and the two-stage procedure are found to be the most stable and feasible approaches for a variety of situations and censoring rates. Therefore, they are applicable to a wider spectrum of alternatives compared with other tests.
Gülseren, Duygu; Hapa, Asli; Ersoy-Evans, Sibel; Elçin, Gonca; Karaduman, Ayşen
2017-03-01
Recurrent aphthous stomatitis (RAS) is a common disease of the oral mucosa with an unknown etiology. This study aimed to determine if food additives play a role in the etiology of RAS as well as to determine if patch testing can be used to detect which allergens cause RAS. This prospective study included 24 patients with RAS and 22 healthy controls. All the participants underwent patch testing for 23 food additives. In total, 21 (87.5%) RAS patients and 3 (13.6%) controls had positive patch test reactions to ≥1 allergens; the difference in the patch test positivity rate between groups was significant (P < 0.05). The most common allergen that elicited positive patch test results in the patient group was cochineal red (n = 15 [62.5%]), followed by azorubine (n = 11 [45.8%]) and amaranth (n = 6 [25%]). The present findings show that food additives might play a role in the etiology of RAS and that patch testing could be a method for determining the etiology of RAS. © 2016 The International Society of Dermatology.
Larson, Nicholas B; McDonnell, Shannon; Cannon Albright, Lisa; Teerlink, Craig; Stanford, Janet; Ostrander, Elaine A; Isaacs, William B; Xu, Jianfeng; Cooney, Kathleen A; Lange, Ethan; Schleutker, Johanna; Carpten, John D; Powell, Isaac; Bailey-Wilson, Joan E; Cussenot, Olivier; Cancel-Tassin, Geraldine; Giles, Graham G; MacInnis, Robert J; Maier, Christiane; Whittemore, Alice S; Hsieh, Chih-Lin; Wiklund, Fredrik; Catalona, William J; Foulkes, William; Mandal, Diptasri; Eeles, Rosalind; Kote-Jarai, Zsofia; Ackerman, Michael J; Olson, Timothy M; Klein, Christopher J; Thibodeau, Stephen N; Schaid, Daniel J
2017-05-01
Next-generation sequencing technologies have afforded unprecedented characterization of low-frequency and rare genetic variation. Due to low power for single-variant testing, aggregative methods are commonly used to combine observed rare variation within a single gene. Causal variation may also aggregate across multiple genes within relevant biomolecular pathways. Kernel-machine regression and adaptive testing methods for aggregative rare-variant association testing have been demonstrated to be powerful approaches for pathway-level analysis, although these methods tend to be computationally intensive at high-variant dimensionality and require access to complete data. An additional analytical issue in scans of large pathway definition sets is multiple testing correction. Gene set definitions may exhibit substantial genic overlap, and the impact of the resultant correlation in test statistics on Type I error rate control for large agnostic gene set scans has not been fully explored. Herein, we first outline a statistical strategy for aggregative rare-variant analysis using component gene-level linear kernel score test summary statistics as well as derive simple estimators of the effective number of tests for family-wise error rate control. We then conduct extensive simulation studies to characterize the behavior of our approach relative to direct application of kernel and adaptive methods under a variety of conditions. We also apply our method to two case-control studies, respectively, evaluating rare variation in hereditary prostate cancer and schizophrenia. Finally, we provide open-source R code for public use to facilitate easy application of our methods to existing rare-variant analysis results. © 2017 WILEY PERIODICALS, INC.
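The effective-number-of-tests idea for family-wise error control can be illustrated with a generic Nyholt/Cheverud-style estimator. This is a sketch offered for illustration, not the estimators derived in the paper; it exploits an algebraic identity so no eigendecomposition is needed:

```python
def effective_number_of_tests(corr):
    """Nyholt/Cheverud-style estimate of the effective number of
    independent tests from a correlation matrix of test statistics.
    Uses the identity sum_i (lambda_i - 1)^2 = 2 * sum_{i<j} r_ij^2
    for correlation-matrix eigenvalues, avoiding an explicit
    eigendecomposition.  Generic illustration, not the paper's method."""
    m = len(corr)
    if m < 2:
        return float(m)
    off_sq = sum(corr[i][j] ** 2
                 for i in range(m) for j in range(i + 1, m))
    var_lam = 2.0 * off_sq / (m - 1)  # variance of the eigenvalues
    return 1.0 + (m - 1) * (1.0 - var_lam / m)

identity = [[float(i == j) for j in range(3)] for i in range(3)]
m_eff = effective_number_of_tests(identity)  # independent tests -> 3.0
```

A Bonferroni-style family-wise correction then divides the nominal level by `m_eff` rather than by the raw number of (possibly overlapping) gene sets.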
Vachhani, Raj; Patel, Toral; Centor, Robert M; Estrada, Carlos A
2017-01-01
Meta-analyses based on peer-reviewed publications report a sensitivity of approximately 85% for rapid antigen streptococcus tests to diagnose group A streptococcal (GAS) pharyngitis. Because these meta-analyses excluded package inserts, we examined the test characteristics of rapid antigen streptococcal tests and molecular methods that manufacturers report in their package inserts. We included tests available in the US market (Food and Drug Administration, period searched 1993-2015) and used package insert data to calculate pooled sensitivity and specificity. To examine quality, we used the Quality Assessment of Diagnostic Accuracy Studies-2. We excluded 26 tests having different trade names but identical methods and data. The study design was prospective in 41.7% (10 of 24). The pooled sensitivity of the most commonly used method, lateral flow/immunochromatographic, was 95% (95% confidence interval [CI] 94-96) and the pooled specificity was 98% (96-98); 7108 patients. The pooled sensitivity of the polymerase chain reaction or molecular methods was 98% (95% CI 96-98) and the pooled specificity was 96% (95% CI 95-97); 5685 patients. Package inserts include sponsored studies that overestimate the sensitivity of rapid tests to diagnose GAS pharyngitis by approximately 10%. Physicians should understand that package inserts overestimate diagnostic test utility; a negative test cannot be used to exclude GAS pharyngitis.
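Pooling a diagnostic proportion across studies can be sketched by simple count aggregation with a normal-approximation confidence interval. The counts below are made up for illustration; real meta-analyses, including this one, may use more sophisticated (e.g., random-effects) pooling:

```python
import math

def pooled_proportion(successes, totals, z=1.96):
    """Pool a diagnostic proportion across studies by aggregating raw
    counts, e.g., sensitivity = sum(TP) / sum(TP + FN), with a
    normal-approximation 95% CI.  Hypothetical counts; a simple sketch
    of pooling, not the exact meta-analytic method of the study."""
    s, n = sum(successes), sum(totals)
    p = s / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# Three hypothetical studies: true positives over diseased patients.
p, lo, hi = pooled_proportion([90, 85, 95], [100, 100, 100])
```

Comparing such pooled estimates from package inserts against independent peer-reviewed estimates is what reveals the roughly 10% overestimation noted above.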
Seismic wavefield propagation in 2D anisotropic media: Ray theory versus wave-equation simulation
NASA Astrophysics Data System (ADS)
Bai, Chao-ying; Hu, Guang-yi; Zhang, Yan-teng; Li, Zhong-sheng
2014-05-01
Although ray theory is based on the high-frequency approximation of the elastic wave equation, ray-theoretical and wave-equation simulation methods should serve as mutual checks on each other and hence be developed jointly; in practice, however, they have progressed along parallel, independent tracks. For this reason, in this paper we take an alternative approach and mutually verify and test the computational accuracy and solution correctness of both the ray theory (the multistage irregular shortest-path method) and the wave-equation simulation methods (both the staggered-grid finite-difference method and the pseudo-spectral method) in anisotropic VTI and TTI media. Through analysis and comparison of wavefield snapshots, common-source gather profiles and synthetic seismograms, we are able not only to verify the accuracy and correctness of each method, at least for kinematic features, but also to understand thoroughly the kinematic and dynamic features of wave propagation in anisotropic media. The results show that both the staggered-grid finite-difference method and the pseudo-spectral method yield the same results even for complex anisotropic media (such as a fault model), and that the multistage irregular shortest-path method predicts kinematic features similar to those of the wave-equation simulation methods, so the two classes of methods can be used to test each other for methodological accuracy and solution correctness. In addition, with the aid of the ray-tracing results, it is easy to identify the multiple phases (or multiples) in the wavefield snapshots, common-source gather sections and synthetic seismograms predicted by the wave-equation simulation methods, which is a key issue for later seismic applications.
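The staggered-grid finite-difference idea can be illustrated in its simplest setting: a 1-D velocity-stress scheme, second order in space and time, with velocity and stress stored on interleaved grids. All parameters are illustrative, and this isotropic 1-D sketch is only the structural skeleton of the 2-D anisotropic schemes discussed above:

```python
def staggered_fd_1d(nx=200, nt=200, dx=1.0, dt=0.4, c=1.0, rho=1.0):
    """Minimal 1-D velocity-stress staggered-grid finite-difference
    scheme: particle velocity v lives at integer grid points, stress s
    at half points, and the two are leapfrogged in time.  Stability
    requires c * dt / dx <= 1.  Illustrative sketch only; the paper's
    simulations are 2-D and anisotropic."""
    mu = c * c * rho                 # shear modulus from wave speed
    v = [0.0] * nx                   # velocity at integer points
    s = [0.0] * (nx - 1)             # stress at half points
    v[nx // 2] = 1.0                 # point-source initial condition
    for _ in range(nt):
        for i in range(nx - 1):      # stress update from velocity gradient
            s[i] += dt * mu * (v[i + 1] - v[i]) / dx
        for i in range(1, nx - 1):   # velocity update from stress gradient
            v[i] += dt / rho * (s[i] - s[i - 1]) / dx
    return v

snapshot = staggered_fd_1d()  # wavefield after nt time steps
```

With a centered point source the snapshot stays symmetric about the source until boundary interaction, which is one cheap kinematic sanity check of the kind the paper performs by cross-comparison with ray tracing.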
Lee, J C; Cole, M; Linacre, A
2000-08-14
Abuse of hallucinogens produced by the fungal genera Psilocybe and Panaeolus is a growing problem. Five species from each of the two genera were examined in this preliminary research, and a method that will unambiguously identify fungal samples as belonging to one of these two genera has been developed. The method uses genus-specific DNA sequences within the internal transcribed spacer of the ribosomal gene complex. Amplification of a common DNA product and a genus-specific product results in two identifiable products, which facilitates the unambiguous identification of material from these two fungi to the generic level.
Ghiabi, Edmond; Taylor, K Lynn
2010-06-01
This project aimed at documenting the surgical training curricula offered by North American graduate periodontics programs. A survey consisting of questions on teaching methods employed and the content of the surgical training program was mailed to directors of all fifty-eight graduate periodontics programs in Canada and the United States. The chi-square test was used to assess whether the residents' clinical experience was significantly (P<0.05) influenced by having a) a structured preclinical program or b) another dental residency program in the institution. Thirty-four programs (59 percent) responded to the survey. Twenty-six programs (76 percent of respondents) reported offering a structured preclinical component. Traditional teaching methods such as slides, live demonstration, DVD/CD, and animal cadavers were the most common teaching methods used, whereas online courses, computer simulation, and various surgical mannequins were least commonly used. The most commonly performed surgical procedures were conventional flaps, periodontal plastic procedures, hard tissue grafts, and implants. Furthermore, residents in programs offering a structured preclinical component performed significantly more procedures (P=0.012) using lasers than those in programs not offering a structured preclinical program. Devising new and innovative teaching methods is a clear avenue for future development in North American graduate periodontics programs.
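The chi-square comparison described above can be sketched for a 2×2 table; the counts here are hypothetical, not the survey's actual data:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df) for a 2x2 contingency table
    [[a, b], [c, d]], e.g., structured preclinical program (yes/no)
    versus use of a given teaching method (yes/no).  Compare against
    3.841, the 0.05 critical value at 1 df.  Counts are hypothetical."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / (
        (a + b) * (c + d) * (a + c) * (b + d))

stat = chi_square_2x2(20, 6, 3, 5)  # hypothetical survey counts
significant = stat > 3.841          # True here (stat ~ 4.34)
```

With the small cell counts typical of a 34-program survey, an exact test (e.g., Fisher's) may be preferable in practice; the chi-square form is shown because it is the test named in the abstract.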
Hypothesis testing of scientific Monte Carlo calculations.
Wallerberger, Markus; Gull, Emanuel
2017-11-01
The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.
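The idea of hypothesis-testing a stochastic program can be sketched with a Monte Carlo estimator whose exact answer is known. Function name, seed, and the three-sigma bound are illustrative choices, not the authors' specific test suite:

```python
import math
import random

def mc_pi_hypothesis_test(n=100_000, seed=1):
    """Statistical test for a Monte Carlo program, in the spirit of the
    abstract above: estimate a quantity with a known exact value (the
    probability pi/4 that a uniform point in the unit square lands in
    the quarter circle) and return the estimate and its z-score against
    the truth.  A correct simulation rarely exceeds |z| = 3; a biased
    one fails systematically across seeds."""
    rng = random.Random(seed)
    hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
               for _ in range(n))
    p_hat = hits / n
    p_true = math.pi / 4
    se = math.sqrt(p_true * (1.0 - p_true) / n)  # binomial SE under H0
    return p_hat, (p_hat - p_true) / se

p_hat, z = mc_pi_hypothesis_test()
passed = abs(z) < 3.0  # three-sigma acceptance criterion
```

Because a correct program fails such a test with small but nonzero probability, automated suites typically rerun a failing test with fresh seeds before flagging a bug.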
Rapacz, Marcin; Sasal, Monika; Kalaji, Hazem M.; Kościelniak, Janusz
2015-01-01
OJIP analysis, which explores changes in photosystem II (PSII) photochemical performance, has been used as a measure of plant susceptibility to stress. However, in the case of freezing tolerance and winter hardiness, which are highly environmentally variable, the use of this method can give ambiguous results depending on the species as well as the sampling year and time. To clarify this issue, we performed chlorophyll fluorescence measurements over three subsequent winters (2010/11, 2011/12 and 2012/13) on 220 accessions of common winter wheat and 139 accessions of winter triticale. After freezing, leaves were collected from cold-acclimated plants in the laboratory and field-grown plants. Observations of field survival in seven locations across Poland and measurements of freezing tolerance of the studied plants were also recorded. Our results confirm that the OJIP test is a reliable indicator of winter hardiness and freezing tolerance of common wheat and triticale under unstable winter environments. Regardless of species, the testing conditions giving the most reliable results were identical, and the reliability of the test could be easily checked by analysis of some relationships between OJIP-test parameters. We also found that triticale is more winter hardy and freezing tolerant than wheat. In addition, the two species were characterized by different patterns of photosynthetic apparatus acclimation to cold. PMID:26230839
Williams, Janet K.; Erwin, Cheryl; Juhl, Andrew; Mills, James; Brossman, Bradley
2010-01-01
Aims: A family history of Huntington disease (HD) or receiving results of HD predictive genetic testing can influence individual well-being, family relationships, and social interactions in positive and negative ways. The aim of this study was to examine benefits reported by people with an HD family history or those who have undergone predictive HD testing, as well as the personal variables associated with perceived benefits. Methods: Seventy-four of 433 people completing the International Response of a Sample Population to HD risk (I-RESPOND-HD) survey reported benefits. Knowledge and understanding was perceived as the most common benefit from participants in both groups. The next most frequent perceived benefits from a family history were connecting with others and achieving life meaning and insights. The next most common perceived benefits from genetic testing were life planning and social support. The least common perceived benefit for both groups was renewed hope and optimism. Older age and spirituality were significantly associated with benefits in both groups. Conclusions: Perceptions of benefit may not be as likely until later years in people with prodromal HD. A developed sense of spirituality is identified as a personal resource associated with the perception of benefit from genetic testing for HD. Associations among spirituality, perceived benefits, and other indicators of personal and family well-being may be useful in genetic counseling and health care of people with prodromal HD. PMID:20722493
Hanson, Erik A; Lundervold, Arvid
2013-11-01
Multispectral, multichannel, or time-series image segmentation is important for image analysis in a wide range of applications. Regularization of the segmentation is commonly performed using local image information, causing the segmented image to be locally smooth or piecewise constant. A new spatial regularization method, incorporating non-local information, was developed and tested. Our spatial regularization method applies to feature-space classification in multichannel images such as color images and MR image sequences. The spatial regularization involves local edge properties, region boundary minimization, and non-local similarities. The method is implemented in a discrete graph-cut setting, allowing fast computations. The method was tested on multidimensional MRI recordings from human kidney and brain in addition to simulated MRI volumes. The proposed method successfully segments regions with both smooth and complex non-smooth shapes with a minimum of user interaction.
Evaluation of shrinkage and cracking in concrete of ring test by acoustic emission method
NASA Astrophysics Data System (ADS)
Watanabe, Takeshi; Hashimoto, Chikanori
2015-03-01
Drying shrinkage of concrete is one of the typical problems that reduce the durability and cause deterioration of concrete structures. Limestone, expansive additive and low-heat Portland cement are used in Japan to reduce drying shrinkage. Drying shrinkage is commonly evaluated by measuring the length change of mortar and concrete specimens. These methods detect the strain due to drying shrinkage of an unrestrained specimen, in which visible cracking does not occur. In this study, the ring test was employed to detect the strain and the age at cracking of restrained concrete. The acoustic emission (AE) method was adopted to detect microcracking due to shrinkage. It was recognized that limestone, expansive additive and low-heat Portland cement are effective in decreasing drying shrinkage and visible cracking. Microcracking due to shrinkage of this concrete was detected and evaluated by the AE method.
Ozarda, Yesim; Ichihara, Kiyoshi; Barth, Julian H; Klee, George
2013-05-01
The reference intervals (RIs) given in laboratory reports have an important role in aiding clinicians in interpreting test results in reference to values of healthy populations. In this report, we present a proposed protocol and standard operating procedures (SOPs) for common use in conducting multicenter RI studies on a national or international scale. The protocols and consensus on their contents were refined through discussions in recent C-RIDL meetings. The protocol describes in detail (1) the scheme and organization of the study, (2) the target population, inclusion/exclusion criteria, ethnicity, and sample size, (3) health status questionnaire, (4) target analytes, (5) blood collection, (6) sample processing and storage, (7) assays, (8) cross-check testing, (9) ethics, (10) data analyses, and (11) reporting of results. In addition, the protocol proposes the common measurement of a panel of sera when no standard materials exist for harmonization of test results. It also describes the requirements of the central laboratory, including the method of cross-check testing between the central laboratory of each country and local laboratories. This protocol and the SOPs remain largely exploratory and may require a reevaluation from the practical point of view after their implementation in the ongoing worldwide study. The paper is mainly intended to be a basis for discussion in the scientific community.
Truszczyński, M; Osek, J
1987-01-01
Three hundred fifty-eight E. coli strains isolated from piglets were tested for the presence of hemagglutinins by use of the active hemagglutination test with or without mannose. Additionally, 86 of these strains were investigated for the presence of common fimbriae using the same method, but growing the strains in media especially suited to the development of this kind of fimbriae. These 358 strains, and an additional 202 E. coli strains, were tested using antisera for the 987P and K88 antigens. Using the active hemagglutination test, 51.4% of the strains were found to be hemagglutinating. The hemagglutinating strains carried the K88 antigen. All these strains were isolated from new-born and weaned piglets with the enterotoxic form of colibacillosis, also called E. coli diarrhea. Cases of this form of colibacillosis also yielded the 26.7% of strains in which common (type 1) fimbriae were detected. This result was obtained when the BHI medium was used for cultivation; with the TSA medium, only 2.3% of strains were positive. No specific or common fimbriae were found in strains recovered from the septic form of colibacillosis or from oedema disease (also called the enterotoxaemic form of colibacillosis). None of the 560 strains examined showed the presence of the fimbrial 987P antigen.
Multi-version software reliability through fault-avoidance and fault-tolerance
NASA Technical Reports Server (NTRS)
Vouk, Mladen A.; Mcallister, David F.
1989-01-01
A number of experimental and theoretical issues associated with the practical use of multi-version software to provide run-time tolerance to software faults were investigated. A specialized tool was developed and evaluated for measuring testing coverage for a variety of metrics. The tool was used to collect information on the relationships between software faults and the coverage provided by the testing process as measured by different metrics (including data flow metrics). Considerable correlation was found between the coverage provided by some higher metrics and the elimination of faults in the code. Back-to-back testing continued to prove an efficient mechanism for removal of uncorrelated faults and common-cause faults of variable span. Work also continued on software reliability estimation methods based on non-random sampling and on the relationship between software reliability and the code coverage provided through testing. New fault tolerance models were formulated. Simulation studies of the Acceptance Voting and Multi-stage Voting algorithms were finished, and it was found that these two schemes for software fault tolerance are superior in many respects to some commonly used schemes. Particularly encouraging are the safety properties of the Acceptance Voting scheme.
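A minimal back-to-back testing harness can be sketched as follows. The harness, the two integer-square-root versions, and the seeded bug are all hypothetical illustrations of the technique, not artifacts from the study:

```python
import random

def back_to_back(impl_a, impl_b, gen_input, trials=1000, seed=42):
    """Minimal back-to-back testing harness: run two independently
    developed versions on the same random inputs and collect every
    disagreement.  This catches uncorrelated faults; common-cause
    faults that make both versions fail identically go undetected,
    which is exactly the limitation noted in the abstract."""
    rng = random.Random(seed)
    disagreements = []
    for _ in range(trials):
        x = gen_input(rng)
        if impl_a(x) != impl_b(x):
            disagreements.append(x)
    return disagreements

# Two hypothetical versions of integer square root.
def isqrt_a(x):
    return int(x ** 0.5)            # version A: float-based

def isqrt_b(x):
    r = 0
    while (r + 1) * (r + 1) < x:    # BUG: strict < misses perfect squares
        r += 1
    return r

bad = back_to_back(isqrt_a, isqrt_b, lambda rng: rng.randrange(1, 100))
```

Running the harness flags exactly the perfect-square inputs, on which the two versions disagree.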
Estimating False Discovery Proportion Under Arbitrary Covariance Dependence*
Fan, Jianqing; Han, Xu; Gu, Weijie
2012-01-01
Multiple hypothesis testing is a fundamental problem in high-dimensional inference, with wide applications in many scientific fields. In genome-wide association studies, tens of thousands of tests are performed simultaneously to find whether any SNPs are associated with some traits, and those tests are correlated. When test statistics are correlated, false discovery control becomes very challenging under arbitrary dependence. In the current paper, we propose a novel method based on principal factor approximation, which successfully subtracts the common dependence and significantly weakens the correlation structure, to deal with an arbitrary dependence structure. We derive an approximate expression for the false discovery proportion (FDP) in large-scale multiple testing when a common threshold is used and provide a consistent estimate of the realized FDP. This result has important applications in controlling the FDR and FDP. Our estimate of the realized FDP compares favorably with Efron's (2007) approach, as demonstrated in the simulated examples. Our approach is further illustrated by some real data applications. We also propose a dependence-adjusted procedure, which is more powerful than the fixed-threshold procedure. PMID:24729644
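The quantity being estimated can be sketched in its simplest, weak-dependence form. This baseline is only the starting point: the paper's contribution is the principal factor correction of this estimate under arbitrary covariance dependence, which is not reproduced here:

```python
def fdp_estimate(p_values, t):
    """Baseline estimate of the false discovery proportion at P-value
    threshold t: FDP(t) ~ m * t / max(R(t), 1), where m is the number
    of tests and R(t) the number of rejections.  Reasonable under weak
    dependence; the paper's principal factor approximation corrects
    this quantity under arbitrary covariance dependence."""
    m = len(p_values)
    r = sum(p <= t for p in p_values)
    return m * t / max(r, 1)

# 10 strong signals among 100 tests; all 10 are rejected at t = 0.001,
# and the expected number of null rejections is m * t = 0.1.
est = fdp_estimate([0.0001] * 10 + [0.5] * 90, t=0.001)  # ~ 0.01
```

Under strong correlation the realized number of false rejections fluctuates far more than `m * t` suggests, which is precisely why a dependence-adjusted estimate is needed.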
Commercial Molecular Tests for Fungal Diagnosis from a Practical Point of View.
Lackner, Michaela; Lass-Flörl, Cornelia
2017-01-01
The increasing interest in molecular diagnostics is a result of tremendously improved knowledge of fungal infections over the past 20 years and the rapid development of new methods, in particular the polymerase chain reaction. High expectations have been placed on molecular diagnostics, and the number of laboratories using the relevant technology is rapidly increasing, resulting in an obvious need for standardization and for definition of laboratory organization. In the past 10 years, multiple new molecular tools were marketed for the detection of DNA, antibodies, cell wall components, or other antigens. In contrast to classical culture methods, molecular methods do not detect a viable organism, but only molecules that indicate its presence; these can be nucleic acids, cell components (antigens), or antibodies (Fig. 1). This chapter provides an overview of commercially available detection tools, their strengths, and how to use them. A main focus is placed on providing tips and tricks that make daily life easier. We try to highlight methodological details that are not covered in the manufacturers' instructions for these test kits but are based on our personal experience in the laboratory. It is important to keep in mind that molecular tools cannot replace culture, microscopy, or a critical view of the patient's clinical history, signs, and symptoms; rather, they provide a valuable add-on tool. Diagnosis should not be based solely on a molecular test, but molecular tools can deliver an important piece of information that helps fit the diagnostic puzzle together, particularly since few tests are in vitro diagnostic (IVD) tests, or only part of the whole test carries the IVD certificate (e.g., DNA extraction is often not included). Please be aware that the authors do not claim to provide a complete overview of all commercially available diagnostic assays currently marketed for fungal detection, as these are subject to constant change.
A main focus is put on commonly used panfungal assays and pathogen-specific assays, including Aspergillus-specific, Candida-specific, Cryptococcus-specific, Histoplasma-specific, and Pneumocystis-specific assays. Assays are categorized according to their underlying principle as either antigen-detecting, antibody-detecting, or DNA-detecting (Fig. 1). Other non-DNA-detecting nucleic acid methods such as FISH and PNA-FISH are not summarized in this chapter; an overview of test performance, common false positives, and the clinical evaluation of commercial tests in studies is already provided in a previous volume of this book series by Javier Yugueros Marcos and David H. Pincus (Marcos and Pincus, Methods Mol Biol 968:25-54, 2013).
Comparison of infusion pumps calibration methods
NASA Astrophysics Data System (ADS)
Batista, Elsa; Godinho, Isabel; do Céu Ferreira, Maria; Furtado, Andreia; Lucas, Peter; Silva, Claudia
2017-12-01
Nowadays, several types of infusion pump, such as syringe pumps and peristaltic pumps, are commonly used for drug delivery. These instruments present different measuring features and capacities according to their use and therapeutic application. In order to ensure the metrological traceability of this flow- and volume-measuring equipment, it is necessary to use suitable calibration methods and standards. Two different calibration methods can be used to determine the flow error of infusion pumps. One is the gravimetric method, considered a primary method and commonly used by National Metrology Institutes. The other, a secondary method, relies on an infusion device analyser (IDA) and is typically used by hospital maintenance offices. The suitability of the IDA calibration method was assessed by testing several infusion instruments at different flow rates using the gravimetric method. In addition, a measurement comparison between Portuguese accredited laboratories and hospital maintenance offices was performed under the coordination of the Portuguese Institute for Quality, the National Metrology Institute. The results obtained were directly related to the calibration method used and are presented in this paper. This work has been developed in the framework of the EURAMET projects EMRP MeDD and EMPIR 15SIP03.
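The arithmetic behind the gravimetric method can be sketched in a few lines. The function name and fixed water density below are our assumptions, and a real gravimetric calibration also corrects for buoyancy and evaporation, which this sketch omits.

```python
# Minimal sketch of the gravimetric flow-error calculation (our simplification:
# fixed water density, no buoyancy or evaporation corrections).
def flow_error_percent(mass_g, duration_min, set_flow_ml_h, density_g_ml=0.998):
    """Percent flow error from the mass of water collected over a timed interval."""
    measured_ml_h = (mass_g / density_g_ml) / (duration_min / 60.0)
    return 100.0 * (measured_ml_h - set_flow_ml_h) / set_flow_ml_h

# e.g. 4.99 g collected in 30 min at a 10 ml/h setting -> essentially 0% error
print(round(flow_error_percent(4.99, 30, 10.0), 3))
```

The IDA-based secondary method reports the measured flow directly, so the same percent-error formula applies to its readout.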
ERIC Educational Resources Information Center
Goncher, Andrea M.; Jayalath, Dhammika; Boles, Wageeh
2016-01-01
Concept inventory tests are one method to evaluate conceptual understanding and identify possible misconceptions. The multiple-choice question format, offering a choice between a correct selection and common misconceptions, can provide an assessment of students' conceptual understanding in various dimensions. Misconceptions of some engineering…
- Lifesaving & Fire Safety « Coast Guard Maritime Commons
. and Canadian implementation of lifejacket safety requirements and testing methods. 11/22/2017: Notice explore other contributing factors, it uncovered evidence of an ineffective safety management system Guard itself to provide effective oversight of the vessel's compliance with safety regulations. 9/26
40 CFR 60.714 - Installation of monitoring devices and recordkeeping.
Code of Federal Regulations, 2013 CFR
2013-07-01
... by the test method described in § 60.713(b)(1) (liquid material balance) shall maintain records of... equipment controlled by a carbon adsorption system and demonstrating compliance by the procedures described..., as appropriate. (1) For carbon adsorption systems with a common exhaust stack for all the individual...
40 CFR 60.714 - Installation of monitoring devices and recordkeeping.
Code of Federal Regulations, 2014 CFR
2014-07-01
... by the test method described in § 60.713(b)(1) (liquid material balance) shall maintain records of... equipment controlled by a carbon adsorption system and demonstrating compliance by the procedures described..., as appropriate. (1) For carbon adsorption systems with a common exhaust stack for all the individual...
40 CFR 60.714 - Installation of monitoring devices and recordkeeping.
Code of Federal Regulations, 2012 CFR
2012-07-01
... by the test method described in § 60.713(b)(1) (liquid material balance) shall maintain records of... equipment controlled by a carbon adsorption system and demonstrating compliance by the procedures described..., as appropriate. (1) For carbon adsorption systems with a common exhaust stack for all the individual...
40 CFR 60.714 - Installation of monitoring devices and recordkeeping.
Code of Federal Regulations, 2010 CFR
2010-07-01
... by the test method described in § 60.713(b)(1) (liquid material balance) shall maintain records of... equipment controlled by a carbon adsorption system and demonstrating compliance by the procedures described..., as appropriate. (1) For carbon adsorption systems with a common exhaust stack for all the individual...
40 CFR 60.714 - Installation of monitoring devices and recordkeeping.
Code of Federal Regulations, 2011 CFR
2011-07-01
... by the test method described in § 60.713(b)(1) (liquid material balance) shall maintain records of... equipment controlled by a carbon adsorption system and demonstrating compliance by the procedures described..., as appropriate. (1) For carbon adsorption systems with a common exhaust stack for all the individual...
In-Situ Air Sparging: Engineering and Design
2008-01-31
Construction Materials. Although PVC casing is commonly used, flexible or rigid polyethylene pipe may be more efficient for certain excavation methods, such as...depth, etc.) Piping insulation/ heat tape installed Piping flushed/cleaned/pressure tested Subsurface as-built equipment...4-4 Figure 4-2 Pilot-Scale Piping and Instrumentation Diagram
The Language, Working Memory, and Other Cognitive Demands of Verbal Tasks
ERIC Educational Resources Information Center
Archibald, Lisa M. D.
2013-01-01
Purpose: To gain a better understanding of the cognitive processes supporting verbal abilities, the underlying structure and interrelationships between common verbal measures were investigated. Methods: An epidemiological sample (n = 374) of school-aged children completed standardized tests of language, intelligence, and short-term and working…
Empirical Performance of Covariates in Education Observational Studies
ERIC Educational Resources Information Center
Wong, Vivian C.; Valentine, Jeffrey C.; Miller-Bains, Kate
2017-01-01
This article summarizes results from 12 empirical evaluations of observational methods in education contexts. We look at the performance of three common covariate-types in observational studies where the outcome is a standardized reading or math test. They are: pretest measures, local geographic matching, and rich covariate sets with a strong…
Total organic halide (TOX) analyzers are commonly used to measure the amount of dissolved halogenated organic byproducts in disinfected waters. Because of the lack of information on the identity of disinfection byproducts, rigorous testing of the dissolved organic halide (DOX) pr...
The Variance Normalization Method of Ridge Regression Analysis.
ERIC Educational Resources Information Center
Bulcock, J. W.; And Others
The testing of contemporary sociological theory often calls for the application of structural-equation models to data which are inherently collinear. It is shown that simple ridge regression, which is commonly used for controlling the instability of ordinary least squares regression estimates in ill-conditioned data sets, is not a legitimate…
Analyzing Longitudinal Item Response Data via the Pairwise Fitting Method
ERIC Educational Resources Information Center
Fu, Zhi-Hui; Tao, Jian; Shi, Ning-Zhong; Zhang, Ming; Lin, Nan
2011-01-01
Multidimensional item response theory (MIRT) models can be applied to longitudinal educational surveys where a group of individuals are administered different tests over time with some common items. However, computational problems typically arise as the dimension of the latent variables increases. This is especially true when the latent variable…
DOT National Transportation Integrated Search
2008-10-01
Purpose: : The purpose of the National Transportation Product Evaluation Program (NTPEP) is to provide a cost-effective method of evaluation for materials of common interest among all participating NTPEP member departments. NTPEP reports allow member...
40 CFR 60.745 - Test methods and procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... determination of the efficiency of a fixed-bed carbon adsorption system with a common exhaust stack for all the... separate runs, each coinciding with one or more complete system rotations through the adsorption cycles of... efficiency of a fixed-bed carbon adsorption system with individual exhaust stacks for each adsorber vessel...
Arabidopsis Ecotypes: A Model for Course Projects in Organismal Plant Biology & Evolution
ERIC Educational Resources Information Center
Wyatt, Sarah; Ballard, Harvey E.
2007-01-01
We present an inquiry-based project using readily-available seed stocks of Arabidopsis. Seedlings are grown under simulated "common garden" conditions to test evolutionary and organismal principles. Students learn scientific method by developing hypotheses and selecting appropriate data and analyses for their experiments. Experiments can be…
Three common finishing treatments of stainless steel that are used for equipment during poultry processing were tested for resistance to bacterial contamination. Methods were developed to measure attached bacteria and to identify factors that make surface finishes susceptible or ...
40 CFR 60.745 - Test methods and procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... determination of the efficiency of a fixed-bed carbon adsorption system with a common exhaust stack for all the... separate runs, each coinciding with one or more complete system rotations through the adsorption cycles of... efficiency of a fixed-bed carbon adsorption system with individual exhaust stacks for each adsorber vessel...
ERIC Educational Resources Information Center
Traube, Dorian E.; Begun, Stephanie; Petering, Robin; Flynn, Marilyn L.
2017-01-01
The field of social work does not currently have a widely adopted method for expediting innovations into micro- or macropractice. Although it is common in fields such as engineering and business to have formal processes for accelerating scientific advances into consumer markets, few comparable mechanisms exist in the social sciences or social…
Quantification of soil surface roughness evolution under simulated rainfall
USDA-ARS?s Scientific Manuscript database
Soil surface roughness is commonly identified as one of the dominant factors governing runoff and interrill erosion. The objective of this study was to compare several existing soil surface roughness indices and to test the Revised Triangular Prism surface area Method (RTPM) as a new approach to cal...
This project involves development, validation testing and application of a fast, efficient method of quantitatively measuring occurrence and concentration of common human viral pathogens, enterovirus and hepatitis A virus, in ground water samples using real-time reverse transcrip...
Developmental efforts and experimental data are described that focused on quantifying the transfer of particles on a mass basis from indoor surfaces to human skin. Methods were developed that utilized a common fluorescein-tagged Arizona Test Dust (ATD) as a possible surrogate ...
Functional assessment of the ex vivo vocal folds through biomechanical testing: A review
Dion, Gregory R.; Jeswani, Seema; Roof, Scott; Fritz, Mark; Coelho, Paulo; Sobieraj, Michael; Amin, Milan R.; Branski, Ryan C.
2016-01-01
The human vocal folds are complex structures made up of distinct layers that vary in cellular and extracellular composition. The mechanical properties of vocal fold tissue are fundamental to the study of both the acoustics and the biomechanics of voice production. To date, quantitative methods have been applied to characterize vocal fold tissue in both normal and pathologic conditions. This review describes, summarizes, and discusses the most commonly employed methods for vocal fold biomechanical testing. Force-elongation testing, torsional parallel-plate rheometry, simple-shear parallel-plate rheometry, linear skin rheometry, and indentation are the most frequently employed biomechanical tests for vocal fold tissues, and each provides material properties data that can be used to compare native tissue versus diseased or treated tissue. Force-elongation testing is clinically useful, as it allows for functional unit testing, while rheometry provides physiologically relevant shear data, and nanoindentation permits micrometer-scale testing across different areas of the vocal fold as well as whole-organ testing. Thoughtful selection of the testing technique during experimental design to evaluate a hypothesis is important to optimizing biomechanical testing of vocal fold tissues. PMID:27127075
Hydrogen peroxide test for intraoperative bile leak detection.
Trehan, V; Rao, Pankaj P; Naidu, C S; Sharma, Anuj K; Singh, A K; Sharma, Sanjay; Gaur, Amit; Kulkarni, S V; Pathak, N
2017-07-01
Bile leakage (BL) is a common complication following liver surgery, with incidence ranging from 3 to 27% in different series. To reduce the incidence of post-operative BL, various BL tests have long been applied, but no method is foolproof and every method has its own limitations. In this study we used a relatively simple technique to detect BL intra-operatively: topical application of diluted 1.5% hydrogen peroxide (H2O2) to detect BL from the cut surface of the liver, compared with the conventional saline method to assess efficacy. A total of 31 patients were included, comprising all patients who underwent liver resection or donor hepatectomy as part of living donor liver transplantation. After complete liver resection, the conventional saline test followed by the topical diluted 1.5% H2O2 test was performed on all patients. A BL was demonstrated in 11 patients (35.48%) by the conventional saline method and in 19 patients (61.29%) by the H2O2 method. Statistical comparison by the Wilcoxon signed-rank test showed a significant difference for the minor liver resection group (P = 0.014) and the major liver resection group (P = 0.002). Topical application of H2O2 is a simple and effective method for detecting BL from the cut surface of the liver. It is an easy, non-invasive, cheap, less time-consuming, reproducible, and sensitive technique with no obvious disadvantages.
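A quick way to sanity-check this kind of paired detection comparison is an exact sign test on the discordant pairs. The sketch below is our own reconstruction: the paper used the Wilcoxon signed-rank test, and the per-patient outcomes here are invented so that the marginal counts match the abstract's 11/31 (saline) and 19/31 (H2O2), assuming the H2O2 detections are a superset of the saline ones.

```python
from math import comb

# Sign-test sketch (a simpler stand-in for the paper's Wilcoxon signed-rank
# test). Per patient: 1 if a method detected a leak, 0 otherwise. Counts are
# illustrative reconstructions, not the study's raw data.
saline = [1] * 11 + [0] * 20            # 11/31 detected by saline
h2o2 = [1] * 11 + [1] * 8 + [0] * 12    # 19/31 detected by H2O2 (superset assumed)

diffs = [h - s for h, s in zip(h2o2, saline) if h != s]  # discordant pairs
n_pos = sum(d > 0 for d in diffs)
n = len(diffs)
# two-sided exact binomial sign test on the discordant pairs
k = max(n_pos, n - n_pos)
p_value = min(1.0, 2 * sum(comb(n, i) for i in range(k, n + 1)) / 2**n)
print(n, n_pos, round(p_value, 4))
```

With all 8 discordant pairs favoring H2O2, the exact two-sided p-value is 2/256 ≈ 0.0078, consistent in direction with the significance the authors report.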
Hybrid Residual Flexibility/Mass-Additive Method for Structural Dynamic Testing
NASA Technical Reports Server (NTRS)
Tinker, M. L.
2003-01-01
A large fixture was designed and constructed for modal vibration testing of International Space Station elements. This fixed-base test fixture, which weighs thousands of pounds and is anchored to a massive concrete floor, initially utilized spherical bearings and pendulum mechanisms to simulate Shuttle orbiter boundary constraints for launch of the hardware. Many difficulties were encountered during a checkout test of the common module prototype structure, mainly due to undesirable friction and excessive clearances in the test-article-to-fixture interface bearings. Measured mode shapes and frequencies were not representative of orbiter-constrained modes because of these friction and clearance effects. As a result, a major redesign effort for the interface mechanisms was undertaken; the total cost of the fixture design, construction, checkout, and redesign was over $2 million. Because of the problems experienced with fixed-base testing, alternative free-suspension methods were studied, including the residual flexibility and mass-additive approaches. Free-suspension structural dynamics test methods utilize soft elastic bungee cords and overhead frame suspension systems that are less complex and much less expensive than fixed-base systems. The cost of free-suspension fixturing is on the order of tens of thousands of dollars, as opposed to millions for large fixed-base fixturing. In addition, free-suspension test configurations are portable, allowing modal tests to be done at sites without modal test facilities; for example, a mass-additive modal test of the ASTRO-1 Shuttle payload was done at the Kennedy Space Center launch site. In this Technical Memorandum, the mass-additive and residual flexibility test methods are described in detail. A discussion of a hybrid approach that combines the best characteristics of each method follows and is the focus of the study.
A Study of Impact Point Detecting Method Based on Seismic Signal
NASA Astrophysics Data System (ADS)
Huo, Pengju; Zhang, Yu; Xu, Lina; Huang, Yong
In targeting tests, the projectile's landing position must be determined for recovery and for measuring range. In this paper, a global search method based on the velocity variance is proposed. In order to verify the applicability of this method, a simulation analysis over an area of four million square meters was conducted using the same array structure as the commonly used linear positioning method, and MATLAB was used to compare and analyze the two methods. The simulation results show that the global search method based on the velocity variance has high positioning accuracy and stability, and can meet the needs of impact point location.
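The velocity-variance idea can be sketched as a grid search: for each candidate impact point, compute the propagation speed implied at every sensor and pick the point where those speeds are most consistent. The geometry, wave speed, and noise-free arrival times below are synthetic assumptions of ours; the paper's exact formulation may differ.

```python
import numpy as np

# Velocity-variance grid search (our construction): for each candidate point,
# the implied speed at sensor i is d_i / t_i; at the true impact point all
# implied speeds agree, so their variance is minimal.
sensors = np.array([[0.0, 0.0], [2000.0, 0.0], [0.0, 2000.0], [2000.0, 2000.0]])
true_point, v_true = np.array([700.0, 1200.0]), 300.0   # m, m/s (synthetic)
t = np.linalg.norm(sensors - true_point, axis=1) / v_true  # arrival times (s)

xs = ys = np.arange(0.0, 2000.0, 10.0)                  # 10 m search grid
gx, gy = np.meshgrid(xs, ys)
cand = np.stack([gx.ravel(), gy.ravel()], axis=1)
d = np.linalg.norm(cand[:, None, :] - sensors[None, :, :], axis=2)
speeds = d / t                                          # implied speed per sensor
best = cand[np.argmin(speeds.var(axis=1))]              # minimum-variance candidate
print(best)
```

With noise-free arrival times the variance is exactly zero at the true point, so the search recovers it; with real seismic data the minimum is merely small, which is where the method's robustness claims matter.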
Malikowska-Racia, Natalia; Podkowa, Adrian; Sałat, Kinga
2018-04-21
Nowadays, cognitive impairments are a growing unresolved medical issue which may accompany many diseases and therapies; furthermore, numerous researchers investigate various neurobiological aspects of human memory to find possible ways to improve it. Until another method is discovered, in vivo studies remain the only available tool for memory evaluation. First, researchers need to choose a model of amnesia, which may strongly influence the observed results; a deeper insight into the model itself may therefore increase the quality and reliability of results. The most common method to impair memory in rodents is pretreatment with drugs that disrupt learning and memory. Taking this into consideration, we compared the activity of agents commonly used for this purpose. We investigated the effects of phencyclidine (PCP), a non-competitive NMDA receptor antagonist, and scopolamine (SCOP), an antagonist of muscarinic receptors, on short-term spatial memory and classical fear conditioning in mice. PCP (3 mg/kg) and SCOP (1 mg/kg) were administered intraperitoneally 30 min before the behavioral paradigms. To assess the influence of PCP and SCOP on short-term spatial memory, the Barnes maze test in C57BL/J6 mice was used. Effects on classical conditioning were evaluated using the contextual fear conditioning test. Additionally, spontaneous locomotor activity was measured; these two tests were performed in CD-1 mice. Our study reports that both tested agents disturbed short-term spatial memory in the Barnes maze test; however, SCOP revealed a higher activity. Surprisingly, learning in the contextual fear conditioning test was impaired only by SCOP.
Model Considerations for Memory-based Automatic Music Transcription
NASA Astrophysics Data System (ADS)
Albrecht, Štěpán; Šmídl, Václav
2009-12-01
The problem of automatic music description is considered. The recorded music is modeled as a superposition of known sounds from a library weighted by unknown weights. Similar observation models are commonly used in statistics and machine learning, and many methods for estimation of the weights are available. These methods differ in the assumptions imposed on the weights. In the Bayesian paradigm, these assumptions are typically expressed in the form of a prior probability density function (pdf) on the weights. In this paper, commonly used assumptions about the music signal are summarized and complemented by a new assumption. These assumptions are translated into pdfs and combined into a single prior density using a combination of pdfs. Validity of the model is tested in simulation using synthetic data.
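The observation model, a recorded signal as a weighted superposition of library sounds, can be sketched with the simplest possible estimator, ordinary least squares. The library, true weights, and noise level below are invented; the paper's point is precisely that richer priors on the weights improve on this baseline.

```python
import numpy as np

# Toy sketch of the superposition observation model (all data invented):
# signal = library @ weights + noise, with weights recovered here by ordinary
# least squares; Bayesian priors on the weights would refine this estimate.
rng = np.random.default_rng(1)
library = rng.standard_normal((256, 5))       # 5 known sounds, 256 samples each
w_true = np.array([0.0, 0.7, 0.0, 1.2, 0.3])  # sparse true weights
signal = library @ w_true + 0.01 * rng.standard_normal(256)

w_hat, *_ = np.linalg.lstsq(library, signal, rcond=None)
print(np.round(w_hat, 1))
```

A sparsity-favoring prior would push the two near-zero estimates exactly to zero, which is one of the assumptions the paper translates into a prior pdf.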
Fleetwood, V A; Gross, K N; Alex, G C; Cortina, C S; Smolevitz, J B; Sarvepalli, S; Bakhsh, S R; Poirier, J; Myers, J A; Singer, M A; Orkin, B A
2017-03-01
Anastomotic leak (AL) increases costs and cancer recurrence. Studies show decreased AL with side-to-side stapled anastomosis (SSA), but none identify risk factors within SSAs. We hypothesized that stapler characteristics and closure technique of the common enterotomy affect AL rates. Retrospective review of bowel SSAs was performed. Data included stapler brand, staple line oversewing, and closure method (handsewn, HC; linear stapler [Barcelona technique], BT; transverse stapler, TX). Primary endpoint was AL. Statistical analysis included Fisher's test and logistic regression. 463 patients were identified, 58.5% BT, 21.2% HC, and 20.3% TX. Covidien staplers comprised 74.9%, Ethicon 18.1%. There were no differences between stapler types (Covidien 5.8%, Ethicon 6.0%). However, AL rates varied by common side closure (BT 3.7% vs. TX 10.6%, p = 0.017), remaining significant on multivariate analysis. Closure method of the common side impacts AL rates. Barcelona technique has fewer leaks than transverse stapled closure. Further prospective evaluation is recommended. Copyright © 2017. Published by Elsevier Inc.
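The reported BT-versus-TX difference can be spot-checked with Fisher's exact test, which the abstract names among its methods. The counts below are reconstructed from the abstract's percentages (463 patients, 58.5% BT, 20.3% TX, leak rates 3.7% and 10.6%), so they are approximate, and for simplicity we compute only the one-sided hypergeometric tail rather than a two-sided p-value.

```python
from math import comb

# One-sided Fisher's exact test via the hypergeometric distribution.
# Counts reconstructed from the abstract's percentages (approximate).
bt_leak, bt_n = 10, 271   # ~3.7% of ~271 Barcelona-technique closures
tx_leak, tx_n = 10, 94    # ~10.6% of ~94 transverse-stapler closures

def fisher_one_sided(a, n1, c, n2):
    """P(X >= c) where X ~ Hypergeom: leaks landing in the second arm."""
    leaks, total = a + c, n1 + n2
    return sum(comb(n2, k) * comb(total - n2, leaks - k)
               for k in range(c, min(leaks, n2) + 1)) / comb(total, leaks)

p = fisher_one_sided(bt_leak, bt_n, tx_leak, tx_n)
print(round(p, 4))
```

The tail probability lands well below 0.05, consistent in direction with the p = 0.017 the authors report from their exact data.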
Coliform Bacteria Monitoring in Fish Systems: Current Practices in Public Aquaria.
Culpepper, Erin E; Clayton, Leigh A; Hadfield, Catherine A; Arnold, Jill E; Bourbon, Holly M
2016-06-01
Public aquaria evaluate coliform indicator bacteria levels in fish systems, but the purpose of testing, testing methods, and management responses are not standardized, unlike with the coliform bacteria testing for marine mammal enclosures required by the U.S. Department of Agriculture. An online survey was sent to selected aquaria to document current testing and management practices in fish systems without marine mammals. The information collected included indicator bacteria species, the size and type of systems monitored, the primary purpose of testing, sampling frequency, test methods, the criteria for interpreting results, corrective actions, and management changes to limit human exposure. Of the 25 institutions to which surveys were sent, 19 (76%) responded. Fourteen reported testing for fecal indicator bacteria in fish systems. The most commonly tested indicator species were total (86%) and fecal (79%) coliform bacteria, which were detected by means of the membrane filtration method (64%). Multiple types and sizes of systems were tested, and the guidelines for testing and corrective actions were highly variable. Only three institutions performed additional tests to confirm the identification of indicator organisms. The results from this study can be used to compare bacterial monitoring practices and protocols in fish systems, as an aid to discussions relating to the accuracy and reliability of test results, and to help implement appropriate management responses. Received August 23, 2015; accepted December 29, 2015.
Li, Ye; Yu, Baiying; Pang, Yong; Vigneron, Daniel B; Zhang, Xiaoliang
2013-01-01
The use of quadrature RF magnetic fields has been demonstrated to be an efficient method to reduce transmit power and to increase the signal-to-noise ratio (SNR) in magnetic resonance (MR) imaging. The goal of this project was to develop a new method using the common-mode and differential-mode (CMDM) technique for compact, planar, distributed-element quadrature transmit/receive resonators for MR signal excitation and detection, and to investigate its performance for MR imaging, particularly at ultrahigh magnetic fields. A prototype resonator based on the CMDM method, implemented using microstrip transmission line, was designed and fabricated for 7T imaging. Both the common mode (CM) and the differential mode (DM) of the resonator were tuned and matched at 298 MHz independently. Numerical electromagnetic simulation was performed to verify the orthogonal B1 field directions of the two modes of the CMDM resonator. Both workbench tests and MR imaging experiments were carried out to evaluate the performance. The intrinsic decoupling between the two modes of the CMDM resonator was demonstrated by the bench test, showing a transmission coefficient between the two modes of better than -36 dB at the resonance frequency. The MR images acquired using each mode, and the images combined in quadrature, showed that the CM and DM of the proposed resonator provided similar B1 coverage and achieved SNR improvement in the entire region of interest. The simulation and experimental results demonstrate that the proposed CMDM method with the distributed-element transmission line technique is a feasible and efficient technique for planar quadrature RF coil design at ultrahigh fields, providing intrinsic decoupling between the two quadrature channels and high-frequency capability. Due to its simple and compact geometry and the easy implementation of decoupling methods, the CMDM quadrature resonator could be a good candidate as a design block in multichannel RF coil arrays.
Satuito, Cyril Glenn Perez; Katsuyama, Ichiro; Ando, Hirotomo; Seki, Yasuyuki; Senda, Tetsuya
2016-01-01
A laboratory test with a flow-through system was designed and its applicability for testing antifouling paints of varying efficacies was investigated. Six different formulations of antifouling paints were prepared to have increasing contents (0 to 40 wt.%) of Cu2O, which is the most commonly used antifouling substance, and each formulation of paint was coated on just one surface of every test plate. The test plates were aged for 45 days by rotating them at a speed of 10 knots inside a cylinder drum. A behavioral test was then conducted using five mussels (Mytilus galloprovincialis) that were pasted onto the coated surface of each aged test plate. The number of the byssus threads produced by each mussel generally decreased with increasing Cu2O content of the paint. The newly designed method was considered valid owing to the high consistency of its results with observations from the field experiment. PMID:27959916
Hassan, Wafaa El-Sayed
2008-08-01
Three rapid, simple, reproducible, and sensitive extractive colorimetric methods (A-C) for assaying dothiepin hydrochloride (I) and risperidone (II) in bulk samples and in dosage forms were investigated. Methods A and B are based on the formation of ion-pair complexes with methyl orange (A) and orange G (B), whereas method C depends on ternary complex formation between cobalt thiocyanate and the studied drug I or II. The optimum reaction conditions were investigated, and it was observed that the calibration curves resulting from the absorbance-concentration measurements of the extracted complexes were linear over the concentration ranges 0.1-12 µg ml(-1) for method A, 0.5-11 µg ml(-1) for method B, and 3.2-80 µg ml(-1) for method C, with relative standard deviations (RSD) of 1.17 and 1.28 for drugs I and II, respectively. The molar absorptivity, Sandell sensitivity, Ringbom optimum concentration ranges, and detection and quantification limits for all complexes were calculated and evaluated at maximum wavelengths of 423, 498, and 625 nm for methods A, B, and C, respectively. Interference from excipients commonly present in dosage forms and from common degradation products was studied. The proposed methods are highly specific for the determination of drugs I and II in their dosage forms, applying the standard additions technique without any interference from common excipients. The proposed methods have been compared statistically to the reference methods and found to be simple, accurate (t-test), and reproducible (F-value).
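The detection and quantification limits mentioned here are typically derived from the calibration fit. Below is a generic sketch with invented absorbance data; the 3.3σ/S and 10σ/S rules are the common ICH-style conventions, not values taken from this paper.

```python
import numpy as np

# Generic calibration-curve workup (illustrative data, not the paper's):
# fit absorbance vs. concentration, then estimate limits of detection (LOD)
# and quantification (LOQ) from the residual scatter and the slope.
conc = np.array([0.5, 2.0, 4.0, 6.0, 8.0, 11.0])              # µg/ml
absorb = np.array([0.031, 0.121, 0.238, 0.362, 0.479, 0.662]) # absorbance units

slope, intercept = np.polyfit(conc, absorb, 1)
resid = absorb - (slope * conc + intercept)
sigma = resid.std(ddof=2)              # residual standard deviation (2 params fit)
lod = 3.3 * sigma / slope              # limit of detection, µg/ml
loq = 10.0 * sigma / slope             # limit of quantification, µg/ml
print(round(slope, 4), round(lod, 3), round(loq, 3))
```

The same fit also yields the molar absorptivity once the path length and molar mass are folded in, which is how the abstract's sensitivity figures would be tabulated.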
Meta-Analysis of Rare Binary Adverse Event Data
Bhaumik, Dulal K.; Amatya, Anup; Normand, Sharon-Lise; Greenhouse, Joel; Kaizar, Eloise; Neelon, Brian; Gibbons, Robert D.
2013-01-01
We examine the use of fixed-effects and random-effects moment-based meta-analytic methods for the analysis of binary adverse event data. Special attention is paid to the case of rare adverse events, which are commonly encountered in routine practice. We study estimation of model parameters and between-study heterogeneity. In addition, we examine traditional approaches to hypothesis testing of the average treatment effect and detection of the heterogeneity of treatment effect across studies. We derive three new methods: a simple (unweighted) average treatment effect estimator, a new heterogeneity estimator, and a parametric bootstrapping test for heterogeneity. We then study the statistical properties of both the traditional and new methods via simulation. We find that, in general, moment-based estimators of combined treatment effects and heterogeneity are biased, and the degree of bias is proportional to the rarity of the event under study. The new methods eliminate much, but not all, of this bias. The various estimators and hypothesis testing methods are then compared and contrasted using an example dataset on treatment of stable coronary artery disease. PMID:23734068
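The simple (unweighted) average treatment effect idea can be sketched for rare binary events. The three toy studies and the 0.5 continuity correction below are our illustration, not the paper's data or its exact estimator.

```python
from math import log

# Unweighted average of per-study log odds ratios for rare binary events
# (toy data; a 0.5 continuity correction handles zero cells).
# Each study: (events_trt, n_trt, events_ctl, n_ctl)
studies = [(1, 100, 0, 100), (2, 150, 1, 150), (0, 80, 1, 80)]

def log_or(a, n1, c, n2, cc=0.5):
    """Log odds ratio with continuity correction applied when any cell is zero."""
    b, d = n1 - a, n2 - c
    if 0 in (a, b, c, d):
        a, b, c, d = a + cc, b + cc, c + cc, d + cc
    return log((a * d) / (b * c))

effects = [log_or(*s) for s in studies]
avg_effect = sum(effects) / len(effects)   # the simple unweighted estimator
print(round(avg_effect, 3))
```

Unlike inverse-variance weighting, this estimator does not down-weight the zero-event studies whose variance estimates are unstable, which is the motivation the abstract gives for it in the rare-event setting.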
Comparisons of geoid models over Alaska computed with different Stokes' kernel modifications
NASA Astrophysics Data System (ADS)
Li, X.; Wang, Y.
2011-01-01
Various Stokes kernel modification methods have been developed over the years. The goal of this paper is to test the most commonly used Stokes kernel modifications numerically, using Alaska as a test area and EGM08 as a reference model. The tests show that some methods are more sensitive than others to the integration cap size. For instance, using the methods of Vaníček and Kleusberg or Featherstone et al. with kernel modification at degree 60, the geoid decreases by 30 cm (on average) when the cap size increases from 1° to 25°. The corresponding changes with the methods of Wong and Gore and of Heck and Grüninger are only at the 1 cm level. At high modification degrees, above 360, the methods of Vaníček and Kleusberg and of Featherstone et al. become unstable because of numerical problems in the modification coefficients; similar conclusions have been reported by Featherstone (2003). In contrast, the methods of Wong and Gore, Heck and Grüninger, and the least-squares spectral combination are stable at any modification degree, though they do not provide as good a fit as the best case of the Molodenskii-type methods at the GPS/leveling benchmarks. However, certain tests for choosing the cap size and modification degree have to be performed in advance to avoid abrupt mean geoid changes if the latter methods are applied.
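For reference, the spherical Stokes kernel has a spectral expansion in Legendre polynomials, and the Wong and Gore modification simply removes the low-degree terms up to the modification degree L. This standard statement is supplied for context and is not taken from the entry above:

```latex
S(\psi) = \sum_{n=2}^{\infty} \frac{2n+1}{n-1}\, P_n(\cos\psi),
\qquad
S^{\mathrm{WG}}_{L}(\psi) = S(\psi) - \sum_{n=2}^{L} \frac{2n+1}{n-1}\, P_n(\cos\psi).
```

Removing the low degrees makes the kernel less sensitive to long-wavelength errors in the terrestrial data, which is consistent with the stability reported for the Wong and Gore family above.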
Modified alignment CGHs for aspheric surface test
NASA Astrophysics Data System (ADS)
Song, Jae-Bong; Yang, Ho-Soon; Rhee, Hyug-Gyo; Lee, Yun-Woo
2009-08-01
Computer Generated Holograms (CGHs) for optical testing commonly consist of one main pattern for testing the aspheric surface and several alignment patterns for aligning the interferometer, the CGH, and the test optics. To align the CGH plate and the test optics, we designed alignment CGHs modified from the cat's eye alignment method, which consist of a pair of CGH patterns. The incident beam that passes through one part of the alignment CGH pattern is focused onto one radius position of the test aspheric surface and is reflected to the other part, and vice versa. This method has several merits compared to the conventional cat's eye alignment method. First, it can be used to test optics with a center hole, and the center part of the CGH plate can be assigned to the alignment pattern. Second, the alignment pattern becomes a concentric circular arc pattern. The whole CGH pattern, including the main pattern and the alignment patterns, then consists only of concentric circular fringes. Such a concentric circular pattern can easily be made by a polar-coordinate writer with circular scanning. The required diffraction angle becomes relatively small, so the 1st-order diffraction beams can be used as alignment beams instead of the 3rd-order diffraction beam, and the visibility can be improved. This alignment method is also more sensitive to the tilt and lateral shift of the test aspheric surface. Using this alignment pattern, a 200 mm diameter F/0.5 aspheric mirror and a 600 mm diameter F/0.9 mirror were tested.
The Ex Vivo Eye Irritation Test as an alternative test method for serious eye damage/eye irritation.
Spöler, Felix; Kray, Oya; Kray, Stefan; Panfil, Claudia; Schrage, Norbert F
2015-07-01
Ocular irritation testing is a common requirement for the classification, labelling and packaging of chemicals (substances and mixtures). The in vivo Draize rabbit eye test (OECD Test Guideline 405) is considered to be the regulatory reference method for the classification of chemicals according to their potential to induce eye injury. In the Draize test, chemicals are applied to rabbit eyes in vivo, and changes are monitored over time. If no damage is observed, the chemical is not categorised. Otherwise, the classification depends on the severity and reversibility of the damage. Alternative test methods have to be designed to match the classifications from the in vivo reference method. However, observation of damage reversibility is usually not possible in vitro. Within the present study, a new organotypic method based on rabbit corneas obtained from food production is demonstrated to close this gap. The Ex Vivo Eye Irritation Test (EVEIT) retains the full biochemical activity of the corneal epithelium, epithelial stem cells and endothelium. This permits the in-depth analysis of ocular chemical trauma beyond that achievable by using established in vitro methods. In particular, the EVEIT is the first test to permit the direct monitoring of recovery of all corneal layers after damage. To develop a prediction model for the EVEIT that is comparable to the GHS system, 37 reference chemicals were analysed. The experimental data were used to derive a three-level potency ranking of eye irritation and corrosion that best fits the GHS categorisation. In vivo data available in the literature were used for comparison. When compared with GHS classification predictions, the overall accuracy of the three-level potency ranking was 78%. The classification of chemicals as irritating versus non-irritating resulted in 96% sensitivity, 91% specificity and 95% accuracy. 2015 FRAME.
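The reported sensitivity, specificity, and accuracy follow from a 2x2 confusion matrix. The counts below are a hypothetical split of the 37 chemicals chosen to be numerically consistent with the reported figures, not the study's actual tallies:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and accuracy from a 2x2 confusion matrix
    (positive = irritating)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, accuracy

# Hypothetical split of 37 chemicals: 25 irritants, 12 non-irritants
sens, spec, acc = diagnostic_metrics(tp=24, fp=1, fn=1, tn=11)
print(round(sens, 2), round(spec, 2), round(acc, 2))  # 0.96 0.92 0.95
```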
An analytical approach to obtaining JWL parameters from cylinder tests
NASA Astrophysics Data System (ADS)
Sutton, B. D.; Ferguson, J. W.; Hodgson, A. N.
2017-01-01
An analytical method for determining parameters for the JWL Equation of State from cylinder test data is described. This method is applied to four datasets obtained from two 20.3 mm diameter EDC37 cylinder tests. The calculated pressure-relative volume (p-Vr) curves agree with those produced by hydro-code modelling. The average calculated Chapman-Jouguet (CJ) pressure is 38.6 GPa, compared to the model value of 38.3 GPa; the CJ relative volume is 0.729 for both. The analytical pressure-relative volume curves produced agree with the one used in the model out to the commonly reported expansion of 7 relative volumes, as do the predicted energies generated by integrating under the p-Vr curve. The calculated energy is within 1.6% of that predicted by the model.
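For context, the JWL Equation of State referred to above is conventionally written in terms of pressure p, relative volume V, and internal energy E per unit initial volume. This standard form is supplied for reference and is not reproduced from the entry itself:

```latex
p(V, E) = A\left(1 - \frac{\omega}{R_1 V}\right) e^{-R_1 V}
        + B\left(1 - \frac{\omega}{R_2 V}\right) e^{-R_2 V}
        + \frac{\omega E}{V},
```

where A, B, R_1, R_2, and ω are the fitted parameters. Integrating p with respect to V along the expansion isentrope gives the energy comparison mentioned above.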
Paper-based surface enhanced Raman spectroscopy for drug level testing with tear fluid
NASA Astrophysics Data System (ADS)
Yamada, Kenji; Yokoyama, Moe; Jeong, Hieyong; Kido, Michiko; Ohno, Yuko
2015-07-01
The purpose of this study was to show the effectiveness of therapeutic drug level testing by Paper-based Surface Enhanced Raman Spectroscopy (PSERS) for artificial lacrimal fluid. We used substrates consisting of common filter paper and gold nanorods. The target was phenobarbital (PB) dissolved in artificial lacrimal fluid. Samples were measured using PSERS at a wavelength of 785 nm and a power of 30 mW. Strong PB peaks were found at 997 cm-1 and 1026 cm-1 in 1 mM artificial lacrimal fluid, corresponding with the solid PB spectral peaks. The results demonstrate the usefulness and efficiency of this method for therapeutic drug level testing.
[Antagonism in vitro among phytopathogenic and saprobic fungi from horticultural soils].
Alippi, H E; Monaco, C
1990-01-01
Two methods were tested in order to determine the existence of in vitro antagonism among saprobic and pathogenic fungi. These microorganisms were the most common isolates from horticultural soils of La Plata (Buenos Aires). Trichoderma harzianum, T. koningii and Penicillium sp. were antagonistic to all the pathogenic fungi tested: Fusarium solani, F. oxysporum, Alternaria solani, Colletotrichum sp. and Sclerotium rolfsii. Spicaria sp., Paecilomyces sp. and Chaetomium sp. were antagonistic only to Colletotrichum sp. and Fusarium solani.
40 CFR Appendix C to Part 60 - Determination of Emission Rate Change
Code of Federal Regulations, 2011 CFR
2011-07-01
... emission rate to the atmosphere. The method used is the Student's t test, commonly used to make inferences ... 3.5 Calculate the test statistic, t, using Equation 4. ... 4. Results 4.1 If E_b > E_a ... occurred. Table 1: Degrees of freedom (n_a + n_b − 2) and t′ (95 percent confidence level): 2, 2.920; 3, 2.353; 4, 2.132; 5, ...
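The underlying computation is the pooled-variance two-sample Student's t statistic with n_a + n_b − 2 degrees of freedom. A minimal sketch with hypothetical emission-rate runs (the regulation's own notation and Equation 4 are not reproduced here):

```python
import math

def students_t(a, b):
    """Two-sample Student's t statistic with pooled variance, as used to test
    whether mean emission rates before (a) and after (b) a change differ.
    Returns (t, degrees of freedom = n_a + n_b - 2)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    ssa = sum((x - ma) ** 2 for x in a)
    ssb = sum((x - mb) ** 2 for x in b)
    sp2 = (ssa + ssb) / (na + nb - 2)              # pooled variance
    t = (mb - ma) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2

# Hypothetical emission-rate runs (kg/hr) before and after the change
before = [10.2, 9.8, 10.5, 10.1]
after = [11.0, 11.3, 10.9, 11.2]
t, df = students_t(before, after)
print(df)  # 6
```

The computed t is then compared against the tabulated t′ at the 95 percent confidence level for the appropriate degrees of freedom, as in Table 1 of the appendix.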
Atom Chips on Direct Bonded Copper Substrates (Postprint)
2012-01-19
... joining of a thin sheet of pure copper to a ceramic substrate [ref. 14] and is commonly used in power electronics due to its high current handling and heat ... Squires et al., Rev. Sci. Instrum. 82, 023101 (2011), Fig. 1: a scanning electron micrograph of the top view of test chip A ... the photolithographically ... the etching processes and masking methods were quantified using a scanning electron microscope. Two test chips (A and B) are presented below and are ...
Fernandez Montenegro, Juan Manuel; Argyriou, Vasileios
2017-05-01
Alzheimer's screening tests are commonly used by doctors to diagnose the patient's condition and stage as early as possible. Most of these tests are based on pen-and-paper interaction and do not embrace the advantages provided by new technologies. This paper proposes novel Alzheimer's screening tests based on virtual environments and game principles, using new immersive technologies combined with advanced Human Computer Interaction (HCI) systems. These new tests are focused on the immersion of the patient in a virtual room, in order to mislead and deceive the patient's mind. In addition, we propose two novel variations of the Turing Test, originally proposed by Alan Turing, as a method to detect dementia. As a result, four tests are introduced, demonstrating the wide range of screening mechanisms that could be designed using virtual environments and game concepts. The proposed tests are focused on the evaluation of memory loss related to common objects, recent conversations and events; the diagnosis of problems in expressing and understanding language; the ability to recognize abnormalities; and the ability to differentiate between virtual worlds and reality, or humans and machines. The proposed screening tests were evaluated and tested using both patients and healthy adults in a comparative study with state-of-the-art Alzheimer's screening tests. The results show the capacity of the new tests to distinguish healthy people from Alzheimer's patients. Copyright © 2017. Published by Elsevier Inc.
2018-01-01
Background We developed skin prick test (SPT) reagents for common inhalant allergens that reflected the real exposure in Korea. The study aim was to evaluate diagnostic usefulness and allergen potency of our inhalant SPT reagents in comparison with commercial products. Methods We produced eight common inhalant allergen SPT reagents using total extract (Prolagen): Dermatophagoides farinae, Dermatophagoides pteronyssinus, oak, ragweed, mugwort, Humulus japonicus pollens, as well as cat and dog allergens. We compared the newly developed reagents with three commercially available SPT reagents (Allergopharma, Hollister-Stier, Lofarma). We measured total protein concentrations, sodium dodecyl sulphate-polyacrylamide gel electrophoresis (SDS-PAGE), major allergen concentration, and biological allergen potencies measured by immunoglobulin E (IgE) immunoblotting and ImmunoCAP inhibition test. Results Diagnostic values of these SPT reagents were expressed as positivity rate and concordance rate of the results from ImmunoCAP allergen-specific IgE test in 94 allergic patients. In vitro analysis showed marked differences in protein concentrations, SDS-PAGE features, major allergen concentrations, and biological allergen potencies of four different SPT reagents. In vivo analysis showed that positive rates and concordance rates of Prolagen® SPT reagents were similar compared to the three commercial SPT reagents. Conclusion The newly developed Prolagen® inhalant SPT reagents are not inferior to the commercially available SPT reagents in allergy diagnosis. PMID:29573248
Methods Used to Evaluate Pain Behaviors in Rodents
Deuis, Jennifer R.; Dvorakova, Lucie S.; Vetter, Irina
2017-01-01
Rodents are commonly used to study the pathophysiological mechanisms of pain as studies in humans may be difficult to perform and ethically limited. As pain cannot be directly measured in rodents, many methods that quantify “pain-like” behaviors or nociception have been developed. These behavioral methods can be divided into stimulus-evoked or non-stimulus evoked (spontaneous) nociception, based on whether or not application of an external stimulus is used to elicit a withdrawal response. Stimulus-evoked methods, which include manual and electronic von Frey, Randall-Selitto and the Hargreaves test, were the first to be developed and continue to be in widespread use. However, concerns over the clinical translatability of stimulus-evoked nociception in recent years has led to the development and increasing implementation of non-stimulus evoked methods, such as grimace scales, burrowing, weight bearing and gait analysis. This review article provides an overview, as well as discussion of the advantages and disadvantages of the most commonly used behavioral methods of stimulus-evoked and non-stimulus-evoked nociception used in rodents. PMID:28932184
Development of a database and processing method for detecting hematotoxicity adverse drug events.
Shimai, Yoshie; Takeda, Toshihiro; Manabe, Shirou; Teramoto, Kei; Mihara, Naoki; Matsumura, Yasushi
2015-01-01
Adverse events are detected by monitoring the patient's status, including blood test results. However, it is difficult to identify all adverse events based on recognition by individual doctors. We developed a system that can be used to detect hematotoxicity adverse events according to blood test results recorded in an electronic medical record system. The blood test results were graded based on the Common Terminology Criteria for Adverse Events (CTCAE), and changes in the blood test results (Up, Down, Flat) were assessed according to the variation in the grade. The changes in the blood test results and the injection data were stored in a database. By comparing the date of injection with the start and end dates of the change in the blood test results, the system detects adverse events related to a designated drug. Using this method, we searched for the occurrence of serious adverse events (CTCAE Grade 3 or 4) concerning WBC, ALT and creatinine related to paclitaxel at Osaka University Hospital. The rates of occurrence of a decreased WBC count, increased ALT level and increased creatinine level were 36.0%, 0.6% and 0.4%, respectively. This method is useful for detecting and estimating the rate of occurrence of hematotoxicity adverse drug events.
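The detection logic described, grading blood-test results and comparing the dates of the graded change against injection dates, can be sketched as follows. The grade cutoffs and the 30-day window are illustrative assumptions, not the system's actual parameters:

```python
from datetime import date, timedelta

def wbc_decrease_grade(wbc):
    """Grade a WBC count (10^3 cells/uL) into CTCAE-style bands.
    Cutoffs are illustrative (lower limit of normal assumed 4.0);
    use the published CTCAE tables in practice."""
    if wbc >= 4.0:
        return 0
    if wbc >= 3.0:
        return 1
    if wbc >= 2.0:
        return 2
    if wbc >= 1.0:
        return 3
    return 4

def detect_events(injections, grade_changes, window_days=30):
    """Flag injections plausibly linked to a Grade >= 3 change by comparing
    the injection date with the start/end dates of the graded change."""
    events = []
    for drug, inj_date in injections:
        for start, end, peak_grade in grade_changes:
            in_window = inj_date <= end and start <= inj_date + timedelta(days=window_days)
            if peak_grade >= 3 and in_window:
                events.append((drug, inj_date, peak_grade))
    return events

injections = [("paclitaxel", date(2014, 3, 1))]
changes = [(date(2014, 3, 5), date(2014, 3, 20), wbc_decrease_grade(0.9))]
print(detect_events(injections, changes))  # one flagged Grade 4 event
```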
Improving the quality of parameter estimates obtained from slug tests
Butler, J.J.; McElwee, C.D.; Liu, W.
1996-01-01
The slug test is one of the most commonly used field methods for obtaining in situ estimates of hydraulic conductivity. Despite its prevalence, this method has received criticism from many quarters in the ground-water community. This criticism emphasizes the poor quality of the estimated parameters, a condition that is primarily a product of the somewhat casual approach that is often employed in slug tests. Recently, the Kansas Geological Survey (KGS) has pursued research directed at improving methods for the performance and analysis of slug tests. Based on extensive theoretical and field research, a series of guidelines has been proposed that should enable the quality of parameter estimates to be improved. The most significant of these guidelines are: (1) three or more slug tests should be performed at each well during a given test period; (2) two or more different initial displacements (H0) should be used at each well during a test period; (3) the method used to initiate a test should enable the slug to be introduced in a near-instantaneous manner and should allow a good estimate of H0 to be obtained; (4) data-acquisition equipment that enables a large quantity of high-quality data to be collected should be employed; (5) if an estimate of the storage parameter is needed, an observation well other than the test well should be employed; (6) the method chosen for analysis of the slug-test data should be appropriate for site conditions; (7) use of pre- and post-analysis plots should be an integral component of the analysis procedure; and (8) appropriate well construction parameters should be employed. Data from slug tests performed at a number of KGS field sites demonstrate the importance of these guidelines.
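As an illustration of the kind of analysis such guidelines feed into, the sketch below fits the log-linear head decline of a slug test and applies the Hvorslev formula. This is one common analysis choice, not necessarily the KGS method, and the well geometry values are hypothetical:

```python
import math

def hvorslev_K(times, heads, H0, rc, Le, R):
    """Estimate hydraulic conductivity with the Hvorslev method
    (commonly applied for Le/R > 8): ln(H/H0) declines linearly with
    time at slope -1/T0, where T0 is the basic time lag, and
    K = rc^2 * ln(Le/R) / (2 * Le * T0).
    A least-squares fit of ln(H/H0) vs t gives the slope."""
    y = [math.log(h / H0) for h in heads]
    n = len(times)
    xm, ym = sum(times) / n, sum(y) / n
    slope = (sum((xi - xm) * (yi - ym) for xi, yi in zip(times, y))
             / sum((xi - xm) ** 2 for xi in times))
    T0 = -1.0 / slope
    return rc ** 2 * math.log(Le / R) / (2.0 * Le * T0)

# Synthetic head-recovery data with a known basic time lag T0 = 50 s
T0_true = 50.0
times = [float(t) for t in range(0, 201, 20)]
heads = [1.0 * math.exp(-t / T0_true) for t in times]
K = hvorslev_K(times, heads, H0=1.0, rc=0.05, Le=3.0, R=0.1)  # m/s
```

Guideline (7) above corresponds to checking the straightness of the ln(H/H0) versus t plot before and after such a fit.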
Berthels, Nele; Matthijs, Gert; Van Overwalle, Geertrui
2011-01-01
Recent reports in Europe and the United States raise concern about the potential negative impact of gene patents on the freedom to operate of diagnosticians and on the access of patients to genetic diagnostic services. Patents, historically seen as legal instruments to trigger innovation, could cause undesired side effects in the public health domain. Clear empirical evidence on the alleged hindering effect of gene patents is still scarce. We therefore developed a patent categorization method to determine which gene patents could indeed be problematic. The method is applied to patents relevant for genetic testing of spinocerebellar ataxia (SCA). The SCA test is probably the most widely used DNA test in (adult) neurology, as well as one of the most challenging due to the heterogeneity of the disease. Typically tested as a gene panel covering the five common SCA subtypes, we show that the patenting of SCA genes and testing methods and the associated licensing conditions could have far-reaching consequences on legitimate access to this gene panel. Moreover, with genetic testing being increasingly standardized, simply ignoring patents is unlikely to hold out indefinitely. This paper aims to differentiate among so-called ‘gene patents' by lifting out the truly problematic ones. In doing so, awareness is raised among all stakeholders in the genetic diagnostics field who are not necessarily familiar with the ins and outs of patenting and licensing. PMID:21811306
Esophageal function testing: Billing and coding update.
Khan, A; Massey, B; Rao, S; Pandolfino, J
2018-01-01
Esophageal function testing is being increasingly utilized in diagnosis and management of esophageal disorders. There have been several recent technological advances in the field to allow practitioners the ability to more accurately assess and treat such conditions, but there has been a relative lack of education in the literature regarding the associated Common Procedural Terminology (CPT) codes and methods of reimbursement. This review, commissioned and supported by the American Neurogastroenterology and Motility Society Council, aims to summarize each of the CPT codes for esophageal function testing and show the trends of associated reimbursement, as well as recommend coding methods in a practical context. We also aim to encourage many of these codes to be reviewed on a gastrointestinal (GI) societal level, by providing evidence of both discrepancies in coding definitions and inadequate reimbursement in this new era of esophageal function testing. © 2017 John Wiley & Sons Ltd.
Neurodevelopmental Reflex Testing in Neonatal Rat Pups.
Nguyen, Antoinette T; Armstrong, Edward A; Yager, Jerome Y
2017-04-24
Neurodevelopmental reflex testing is commonly used in clinical practice to assess the maturation of the nervous system. Neurodevelopmental reflexes are also referred to as primitive reflexes. They are sensitive and consistent with later outcomes. Abnormal reflexes, described as an absence, persistence, reappearance, or latency of reflexes, are predictive indices for infants at high risk of neurodevelopmental disorders. Animal models of neurodevelopmental disabilities, such as cerebral palsy, often display aberrant developmental reflexes, as would be observed in human infants. The techniques described assess a variety of neurodevelopmental reflexes in neonatal rats. Neurodevelopmental reflex testing offers the investigator a testing method that is not otherwise available in such young animals. The methodology presented here aims to provide a general guideline to assist investigators in examining developmental milestones in neonatal rats, as a method of detecting early-onset brain injury and/or determining the effectiveness of therapeutic interventions.
Evaluation of two methods for quantifying passeriform lice
Koop, Jennifer A. H.; Clayton, Dale H.
2013-01-01
Two methods commonly used to quantify ectoparasites on live birds are visual examination and dust-ruffling. Visual examination provides an estimate of ectoparasite abundance based on an observer’s timed inspection of various body regions on a bird. Dust-ruffling involves application of insecticidal powder to feathers that are then ruffled to dislodge ectoparasites onto a collection surface where they can then be counted. Despite the common use of these methods in the field, the proportion of actual ectoparasites they account for has only been tested with Rock Pigeons (Columba livia), a relatively large-bodied species (238–302 g) with dense plumage. We tested the accuracy of the two methods using European Starlings (Sturnus vulgaris; ~75 g). We first quantified the number of lice (Brueelia nebulosa) on starlings using visual examination, followed immediately by dust-ruffling. Birds were then euthanized and the proportion of lice accounted for by each method was compared to the total number of lice on each bird as determined with a body-washing method. Visual examination and dust-ruffling each accounted for a relatively small proportion of total lice (14% and 16%, respectively), but both were still significant predictors of abundance. The number of lice observed by visual examination accounted for 68% of the variation in total abundance. Similarly, the number of lice recovered by dust-ruffling accounted for 72% of the variation in total abundance. Our results show that both methods can be used to reliably quantify the abundance of lice on European Starlings and other similar-sized passerines. PMID:24039328
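The "percent of variation accounted for" figures reported above are coefficients of determination from simple linear regression. A minimal sketch with hypothetical louse counts (not the study's data):

```python
def r_squared(x, y):
    """Coefficient of determination (R^2) for a simple linear regression of
    y on x: the proportion of variation in the total count explained by a
    partial count, as in the 68% and 72% figures reported above."""
    n = len(x)
    xm, ym = sum(x) / n, sum(y) / n
    sxx = sum((xi - xm) ** 2 for xi in x)
    sxy = sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y))
    syy = sum((yi - ym) ** 2 for yi in y)
    return (sxy * sxy) / (sxx * syy)

# Hypothetical visual-exam counts vs. totals from body washing
visual = [3, 7, 2, 10, 5, 8]
total = [20, 45, 18, 70, 33, 55]
r2 = r_squared(visual, total)
```

A high R² despite each method recovering only 14 to 16 percent of the lice is exactly what makes the partial counts usable as predictors of total abundance.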
Lin, Long-Ze; Harnly, James M
2008-11-12
A screening method using LC-DAD-ESI/MS was developed for the identification of common hydroxycinnamoylquinic acids based on direct comparison with standards. A complete standard set for mono-, di-, and tricaffeoylquinic isomers was assembled from commercially available standards, positively identified compounds in common plants (artichokes, asparagus, coffee bean, honeysuckle flowers, sweet potato, and Vernonia amygdalina leaves) and chemically modified standards. Four C18 reversed phase columns were tested using the standardized profiling method (based on LC-DAD-ESI/MS) for 30 phenolic compounds, and their elution order and retention times were evaluated. Using only two columns under standardized LC condition and the collected phenolic compound database, it was possible to separate all of the hydroxycinnamoylquinic acid conjugates and to identify 28 and 18 hydroxycinnamoylquinic acids in arnica flowers (Arnica montana L.) and burdock roots (Arctium lappa L.), respectively. Of these, 22 are reported for the first time.
Music and movement share a dynamic structure that supports universal expressions of emotion
Sievers, Beau; Polansky, Larry; Casey, Michael; Wheatley, Thalia
2013-01-01
Music moves us. Its kinetic power is the foundation of human behaviors as diverse as dance, romance, lullabies, and the military march. Despite its significance, the music-movement relationship is poorly understood. We present an empirical method for testing whether music and movement share a common structure that affords equivalent and universal emotional expressions. Our method uses a computer program that can generate matching examples of music and movement from a single set of features: rate, jitter (regularity of rate), direction, step size, and dissonance/visual spikiness. We applied our method in two experiments, one in the United States and another in an isolated tribal village in Cambodia. These experiments revealed three things: (i) each emotion was represented by a unique combination of features, (ii) each combination expressed the same emotion in both music and movement, and (iii) this common structure between music and movement was evident within and across cultures. PMID:23248314
A rapid, one step molecular identification of Trichoderma citrinoviride and Trichoderma reesei.
Saroj, Dina B; Dengeti, Shrinivas N; Aher, Supriya; Gupta, Anil K
2015-06-01
Trichoderma species are widely used as production hosts for industrial enzymes. Identification of Trichoderma species requires a complex molecular biology based identification involving amplification and sequencing of multiple genes. Industrial laboratories are required to run identification tests repeatedly in cell banking procedures and also to prove the absence of the production host in the product. Such demands can be met by a brief method that enables confirmation of strain identity. This communication describes a one-step identification method for two common Trichoderma species, T. citrinoviride and T. reesei, based on identification of a polymorphic region in the nucleotide sequence of translation elongation factor 1 alpha. A unique forward primer and a common reverse primer resulted in 153 and 139 bp amplicons for T. citrinoviride and T. reesei, respectively. The method was further simplified by using mycelium directly as the template for PCR amplification. The method described in this communication allows rapid, one-step identification of the two Trichoderma species.
Huang, Ay Huey; Wu, Jiunn Jong; Weng, Yu Mei; Ding, Hwia Cheng; Chang, Tsung Chain
1998-01-01
Nonfastidious aerobic gram-negative bacilli (GNB) are commonly isolated from blood cultures. The feasibility of using an electrochemical method for direct antimicrobial susceptibility testing of GNB in positive blood cultures was evaluated. An aliquot (10 μl) of 1:10-diluted positive blood cultures containing GNB was inoculated into the Bactometer module well (bioMérieux Vitek, Hazelwood, Mo.) containing 1 ml of Mueller-Hinton broth supplemented with an antibiotic. Susceptibility tests were performed in a breakpoint broth dilution format, with the results being categorized as resistant, intermediate, or susceptible. Seven antibiotics (ampicillin, cephalothin, gentamicin, amikacin, cefamandole, cefotaxime, and ciprofloxacin) were used in this study, with each agent being tested at the two interpretive breakpoint concentrations. The inoculated modules were incubated at 35°C, and the change in impedance in each well was continuously monitored for 24 h by the Bactometer. The MICs of the seven antibiotics for each blood isolate were also determined by the standardized broth microdilution method. Of 146 positive blood cultures (1,022 microorganism-antibiotic combinations) containing GNB tested by the direct method, the rates of very major, major, and minor errors were 0, 1.1, and 2.5%, respectively. The impedance method was simple; no centrifugation, preincubation, or standardization of the inocula was required, and the susceptibility results were normally available within 3 to 6 h after inoculation. The rapid method may allow proper antimicrobial treatment almost 30 to 40 h before the results of the standard methods are available. PMID:9738038
Evolutionary optimization methods for accelerator design
NASA Astrophysics Data System (ADS)
Poklonskiy, Alexey A.
Many problems from the fields of accelerator physics and beam theory can be formulated as optimization problems and, as such, solved using optimization methods. Despite the growing efficiency of optimization methods, the adoption of modern optimization techniques in these fields is rather limited. Evolutionary Algorithms (EAs) form a relatively new and actively developed family of optimization methods. They possess many attractive features, such as ease of implementation, modest requirements on the objective function, good tolerance to noise, robustness, and the ability to perform a global search efficiently. In this work we study the application of EAs to problems from accelerator physics and beam theory. We review the most commonly used methods of unconstrained optimization and describe GATool, the evolutionary algorithm and software package used in this work, in detail. Then we use a set of test problems to assess its performance in terms of computational resources, quality of the obtained result, and the tradeoff between them. We justify the choice of GATool as a heuristic method to generate cutoff values for the COSY-GO rigorous global optimization package for the COSY Infinity scientific computing package. We design a model of their mutual interaction and demonstrate that the quality of the result obtained by GATool increases as the information about the search domain is refined, which supports the usefulness of this model. We discuss GATool's performance on problems suffering from static and dynamic noise and study useful strategies of GATool parameter tuning for these and other difficult problems. We review the challenges of constrained optimization with EAs and the methods commonly used to overcome them. We describe REPA, a new constrained optimization method based on repairing, in detail, including the properties of its two repairing techniques: REFIND and REPROPT.
We assess REPROPT's performance on the standard constrained optimization test problems for EAs with a variety of different configurations and suggest optimal default parameter values based on the results. Then we study the performance of the REPA method on the same set of test problems and compare the obtained results with those of several commonly used constrained optimization methods with EAs. Based on the obtained results, particularly on the outstanding performance of REPA on a test problem that presents significant difficulty for the other reviewed EAs, we conclude that the proposed method is useful and competitive. We discuss REPA parameter tuning for difficult problems and critically review some of the problems from the de facto standard test problem set for constrained optimization with EAs. In order to demonstrate the practical usefulness of the developed method, we study several problems of accelerator design and demonstrate how they can be solved with EAs. These problems include a simple accelerator design problem (design a quadrupole triplet to be stigmatically imaging; find all possible solutions), a complex real-life accelerator design problem (optimization of the front-end section for the future neutrino factory), and a problem of normal form defect function optimization, which is used to rigorously estimate the stability of the beam dynamics in circular accelerators. The positive results we obtained suggest that the application of EAs to problems from accelerator theory can be very beneficial and has large potential. The developed optimization scenarios and tools can be used to approach similar problems.
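The flavor of an evolutionary algorithm with repair-based constraint handling can be conveyed in a few lines. This is a generic sketch in the spirit of repair methods such as REPA, not the GATool/REPA implementation itself; the operators and parameters are illustrative:

```python
import random

def genetic_algorithm(fitness, repair, dim, bounds, pop_size=40, gens=60, seed=1):
    """Minimal real-coded EA with a repair step for constraint handling:
    infeasible candidates are pushed back into the feasible region before
    selection rather than penalized."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop = [repair(ind) for ind in pop]        # repair infeasible points
        pop.sort(key=fitness)                     # minimize
        parents = pop[: pop_size // 2]            # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            w = rng.random()
            child = [w * x + (1 - w) * y for x, y in zip(a, b)]  # blend crossover
            if rng.random() < 0.2:                # Gaussian mutation
                i = rng.randrange(dim)
                child[i] += rng.gauss(0, 0.1 * (hi - lo))
            children.append(child)
        pop = parents + children
    pop = [repair(ind) for ind in pop]
    return min(pop, key=fitness)

# Minimize the sphere function subject to x_i >= 0.5; repair = clipping
sphere = lambda x: sum(xi * xi for xi in x)
clip = lambda x: [min(max(xi, 0.5), 5.0) for xi in x]
best = genetic_algorithm(sphere, clip, dim=3, bounds=(0.5, 5.0))
```

Clipping is the simplest possible repair operator; REPA's REFIND and REPROPT techniques are considerably more elaborate ways of mapping an infeasible candidate to a feasible one.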
Missing data imputation and haplotype phase inference for genome-wide association studies
Browning, Sharon R.
2009-01-01
Imputation of missing data and the use of haplotype-based association tests can improve the power of genome-wide association studies (GWAS). In this article, I review methods for haplotype inference and missing data imputation, and discuss their application to GWAS. I discuss common features of the best algorithms for haplotype phase inference and missing data imputation in large-scale data sets, as well as some important differences between classes of methods, and highlight the methods that provide the highest accuracy and fastest computational performance. PMID:18850115
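As a point of contrast with the haplotype-based methods reviewed here, the simplest imputation baseline fills each missing genotype with the most common observed genotype at that marker. The sketch below is only that naive baseline (the function name and data layout are invented for illustration); the reviewed methods exploit linkage disequilibrium between markers for far higher accuracy:

```python
from collections import Counter

def impute_by_mode(genotypes):
    """Fill missing calls (None) with the most common observed genotype
    at each marker.  Rows are individuals, columns are markers."""
    n_markers = len(genotypes[0])
    modes = []
    for j in range(n_markers):
        observed = [row[j] for row in genotypes if row[j] is not None]
        modes.append(Counter(observed).most_common(1)[0][0])
    return [[modes[j] if g is None else g for j, g in enumerate(row)]
            for row in genotypes]

# Genotypes coded as minor-allele counts (0/1/2); None means missing.
data = [[0, 1, None], [0, None, 2], [0, 1, 2], [1, 1, 2]]
filled = impute_by_mode(data)
```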
Some attributes of a language for property-based testing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neagoe, Vicentiu; Bishop, Matt
Property-based testing is a testing technique that evaluates executions of a program. The method checks that specifications, called properties, hold throughout the execution of the program. TASpec is a language used to specify these properties. This paper compares some attributes of the language with the specification patterns used for model-checking languages, and then presents some descriptions of properties that can be used to detect common security flaws in programs. This report describes the results of a one-year research project at the University of California, Davis, which was funded by a University Collaboration LDRD entitled "Property-based Testing for Cyber Security Assurance."
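TASpec itself is not reproduced in the report's abstract, but the underlying idea — check that a property holds over many executions of a program — can be sketched with a QuickCheck-style random-testing loop. All names below are invented for illustration, and this is a property-based testing toy, not a specification language:

```python
import random

def check_property(fn, prop, gen, trials=200, seed=1):
    """Run fn on random inputs and verify that prop(input, output)
    holds on every execution; return the first counterexample, if any."""
    rng = random.Random(seed)
    for _ in range(trials):
        x = gen(rng)
        if not prop(x, fn(x)):
            return x
    return None

# Property: a sort must return its input's elements in nondecreasing order.
def is_sorted(xs, ys):
    return ys == sorted(xs) and all(a <= b for a, b in zip(ys, ys[1:]))

def gen_list(rng):
    return [rng.randint(-50, 50) for _ in range(rng.randint(0, 10))]

ok = check_property(sorted, is_sorted, gen_list)                # passes
bad = check_property(lambda xs: list(xs), is_sorted, gen_list)  # identity fails
```

A specification language such as TASpec expresses the property declaratively and checks it against traces of the running program, rather than generating random inputs as this toy does.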
Using Cardiac Biomarkers in Veterinary Practice.
Oyama, Mark A
2015-09-01
Blood-based assays for various cardiac biomarkers can assist in the diagnosis of heart disease in dogs and cats. The two most common markers are cardiac troponin-I and N-terminal pro-B-type natriuretic peptide. Biomarker assays can assist in differentiating cardiac from noncardiac causes of respiratory signs and detection of preclinical cardiomyopathy. Increasingly, studies indicate that cardiac biomarker testing can help assess the risk of morbidity and mortality in animals with heart disease. Usage of cardiac biomarker testing in clinical practice relies on proper patient selection, correct interpretation of test results, and incorporation of biomarker testing into existing diagnostic methods. Copyright © 2015 Elsevier Inc. All rights reserved.
Using cardiac biomarkers in veterinary practice.
Oyama, Mark A
2013-11-01
Blood-based assays for various cardiac biomarkers can assist in the diagnosis of heart disease in dogs and cats. The two most common markers are cardiac troponin-I and N-terminal pro-B-type natriuretic peptide. Biomarker assays can assist in differentiating cardiac from noncardiac causes of respiratory signs and detection of preclinical cardiomyopathy. Increasingly, studies indicate that cardiac biomarker testing can help assess the risk of morbidity and mortality in animals with heart disease. Usage of cardiac biomarker testing in clinical practice relies on proper patient selection, correct interpretation of test results, and incorporation of biomarker testing into existing diagnostic methods. Copyright © 2013 Elsevier Inc. All rights reserved.
A joint sparse representation-based method for double-trial evoked potentials estimation.
Yu, Nannan; Liu, Haikuan; Wang, Xiaoyan; Lu, Hanbing
2013-12-01
In this paper, we present a novel approach to solving an evoked potentials estimating problem. Generally, the evoked potentials in two consecutive trials obtained by repeated identical stimuli of the nerves are extremely similar. In order to trace evoked potentials, we propose a joint sparse representation-based double-trial evoked potentials estimation method, taking full advantage of this similarity. The estimation process is performed in three stages: first, according to the similarity of evoked potentials and the randomness of a spontaneous electroencephalogram, the two consecutive observations of evoked potentials are considered as superpositions of the common component and the unique components; second, making use of their characteristics, the two sparse dictionaries are constructed; and finally, we apply the joint sparse representation method in order to extract the common component of double-trial observations, instead of the evoked potential in each trial. A series of experiments carried out on simulated and human test responses confirmed the superior performance of our method. © 2013 Elsevier Ltd. Published by Elsevier Ltd. All rights reserved.
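The full joint sparse representation machinery is beyond an abstract, but its core intuition — the evoked potential is common to both trials while the spontaneous EEG is not — can be illustrated with plain two-trial averaging, which the paper's sparse-dictionary method refines. Everything below (signal shape, noise level, names) is a synthetic illustration, not the authors' algorithm:

```python
import math
import random

def common_component(trial1, trial2):
    """Crude stand-in for common-component extraction: averaging keeps
    the shared evoked potential and attenuates the trial-unique noise."""
    return [(a + b) / 2.0 for a, b in zip(trial1, trial2)]

def rmse(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

rng = random.Random(0)
signal = [math.sin(2 * math.pi * t / 32) for t in range(64)]  # "evoked potential"
trial1 = [s + rng.gauss(0, 0.5) for s in signal]              # EP + EEG noise
trial2 = [s + rng.gauss(0, 0.5) for s in signal]
estimate = common_component(trial1, trial2)
```

The sparse representation over two tailored dictionaries plays the same role as the average here, but separates common and unique components far more effectively than averaging when only two trials are available.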
Does rational selection of training and test sets improve the outcome of QSAR modeling?
Martin, Todd M; Harten, Paul; Young, Douglas M; Muratov, Eugene N; Golbraikh, Alexander; Zhu, Hao; Tropsha, Alexander
2012-10-22
Prior to using a quantitative structure-activity relationship (QSAR) model for external predictions, its predictive power should be established and validated. In the absence of a true external data set, the best way to validate the predictive ability of a model is to perform statistical external validation, in which the overall data set is divided into training and test sets. Commonly, this splitting is performed using random division. Rational splitting methods can divide data sets into training and test sets in an intelligent fashion. The purpose of this study was to determine whether rational division methods lead to more predictive models than random division. A special data splitting procedure was used to facilitate the comparison between random and rational division methods. For each toxicity end point, the overall data set was divided into a modeling set (80% of the overall set) and an external evaluation set (20% of the overall set) using random division. The modeling set was then subdivided into a training set (80% of the modeling set) and a test set (20% of the modeling set) using rational division methods and, separately, random division. The Kennard-Stone, minimal test set dissimilarity, and sphere exclusion algorithms were used as the rational division methods. The hierarchical clustering, random forest, and k-nearest neighbor (kNN) methods were used to develop QSAR models based on the training sets. For kNN QSAR, multiple training and test sets were generated, and multiple QSAR models were built. The results of this study indicate that models based on rational division methods generate better statistical results for the test sets than models based on random division, but the predictive power of the two types of models is comparable.
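Of the rational division methods named above, Kennard-Stone is the most easily sketched: seed the training set with the two most distant samples, then repeatedly add the sample farthest from its nearest selected neighbor. The O(n²) sketch below assumes Euclidean distance on raw descriptors (real QSAR pipelines typically normalize descriptors first), and the example data are invented:

```python
def kennard_stone(points, n_train):
    """Return (train_indices, test_indices) via the Kennard-Stone
    algorithm: greedily pick samples that span the descriptor space."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    n = len(points)
    # Seed with the most distant pair of samples.
    i0, j0 = max(((i, j) for i in range(n) for j in range(i + 1, n)),
                 key=lambda ij: dist(points[ij[0]], points[ij[1]]))
    train = [i0, j0]
    rest = [k for k in range(n) if k not in train]
    while len(train) < n_train:
        # Add the sample whose nearest selected neighbor is farthest away.
        k = max(rest, key=lambda r: min(dist(points[r], points[s])
                                        for s in train))
        train.append(k)
        rest.remove(k)
    return train, rest

# Four corners plus an interior point: the corners are picked first.
corners_and_center = [(0, 0), (10, 10), (0, 10), (10, 0), (5, 5)]
train_idx, test_idx = kennard_stone(corners_and_center, n_train=3)
```

This is exactly the "intelligent" behavior the study compares against random division: the training set covers the edges of descriptor space, leaving interior points for the test set.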
Ji, Chengdong; Guo, Xuan; Li, Zhen; Qian, Shuwen; Zheng, Feng; Qin, Haiqing
2013-01-01
Many studies have been conducted on colorectal anastomotic leakage with the aim of reducing its incidence. However, a precise way to measure the maximum pressure a colorectal anastomosis can withstand, known as the anastomotic bursting pressure, has been lacking. A task force developed an experimental animal hollow-organ mechanical testing system to provide precise measurement of the maximum pressure an anastomotic colon can withstand, and compared it with commonly used methods, such as mercury and air-bag pressure manometers, in a rat colon rupture pressure test. Forty-five male Sprague-Dawley rats were randomly divided into the manual ball manometry (H) group, the tracing machine manometry pressure gauge head (MP) group, and the experimental animal hollow-organ mechanical testing system (ME) group. The rats in each group were subjected to a cut colon rupture pressure test after anesthesia was injected via the tail vein. Colonic end-to-end anastomosis was performed, and the rats were rested for 1 week before anastomotic bursting pressure was determined by one of the three methods. Neither the normal colon rupture pressure nor the colonic anastomotic bursting pressure differed among the three manometry methods. However, several advantages, such as a reduction in measurement errors, were identified in the ME group. Different types of manometry can be applied to the normal rat colon, but the colonic anastomotic bursting pressure test using the experimental animal hollow-organ mechanical testing system is superior to traditional methods. Copyright © 2013 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Koelfgen, Syri J.; Faber, James J.
2010-01-01
The National Aeronautics and Space Administration (NASA) and the aviation industry have recognized a need for developing a method to identify and combine resources to carry out research and testing more efficiently. The Integrated Vehicle Health Management (IVHM) Research Test and Integration Plan (RTIP) Wiki is a tool that is used to visualize, plan, and accomplish collaborative research and testing. Synergistic test opportunities are developed using the RTIP Wiki, and include potential common resource testing that combines assets and personnel from NASA, industry, academia, and other government agencies. A research scenario is linked to the appropriate IVHM milestones and resources detailed in the wiki, reviewed by the research team members, and integrated into a collaborative test strategy. The scenario is then implemented by creating a test plan when appropriate and the research is performed. The benefits of performing collaborative research and testing are achieving higher Technology Readiness Level (TRL) test opportunities with little or no additional cost, improved quality of research, and increased communication among researchers. In addition to a description of the method of creating these joint research scenarios, examples of the successful development and implementation of cooperative research using the IVHM RTIP Wiki are given.
He, Hua; McDermott, Michael P.
2012-01-01
Sensitivity and specificity are common measures of the accuracy of a diagnostic test. The usual estimators of these quantities are unbiased if data on the diagnostic test result and the true disease status are obtained from all subjects in an appropriately selected sample. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the result of the diagnostic test and other characteristics of the subjects. Estimators of sensitivity and specificity based on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias under the assumption that the missing data on disease status are missing at random (MAR), that is, the probability of missingness depends on the true (missing) disease status only through the test result and observed covariate information. When some of the covariates are continuous, or the number of covariates is relatively large, the existing methods require parametric models for the probability of disease or the probability of verification (given the test result and covariates), and hence are subject to model misspecification. We propose a new method for correcting verification bias based on the propensity score, defined as the predicted probability of verification given the test result and observed covariates. This is estimated separately for those with positive and negative test results. The new method classifies the verified sample into several subsamples that have homogeneous propensity scores and allows correction for verification bias. Simulation studies demonstrate that the new estimators are more robust to model misspecification than existing methods, but still perform well when the models for the probability of disease and probability of verification are correctly specified. PMID:21856650
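A close cousin of the proposed propensity-score approach is inverse-probability weighting, which corrects the same bias by weighting each verified subject by one over its verification probability. The sketch below is that simpler IPW estimator for sensitivity, not the authors' stratified estimator; the record layout and verification probabilities are invented for illustration:

```python
def ipw_sensitivity(records):
    """Verification-bias-corrected sensitivity, P(test+ | diseased).
    Each record is (test_positive, verified, diseased, p_verify), with
    disease status known only when verified (None otherwise).  Verified
    diseased subjects are weighted by 1 / p_verify."""
    num = sum(t / p for t, v, d, p in records if v and d)
    den = sum(1.0 / p for t, v, d, p in records if v and d)
    return num / den

# 100 diseased subjects, true sensitivity 0.8.  All test-positives are
# verified; test-negatives are verified with probability 0.5, so the
# naive estimate from verified subjects alone is 80/90, biased upward.
records = ([(1, 1, 1, 1.0)] * 80        # diseased, test+, verified
           + [(0, 1, 1, 0.5)] * 10      # diseased, test-, verified
           + [(0, 0, None, 0.5)] * 10)  # diseased, test-, unverified
corrected = ipw_sensitivity(records)
```

The propensity-score method in the paper replaces these per-subject weights with stratification on the estimated verification probability, which the simulations show is more robust when the verification model is misspecified.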
Szemraj, Magdalena; Kwaszewska, Anna; Pawlak, Renata; Szewczyk, Eligia M
2014-10-01
Corynebacteria exist as part of the human skin microbiota. However, under some circumstances, they can cause opportunistic infections. The aim of the study was to examine macrolide-lincosamide-streptogramin B (MLSB) antibiotic resistance in 99 lipophilic strains of the genus Corynebacterium isolated from the skin of healthy men. Over 70% of the tested strains were resistant to erythromycin and clindamycin, all of which demonstrated a constitutive type of MLSB resistance mechanism. All strains were screened for the erm(A), erm(B), erm(C), erm(X), lin(A), msr(A), and mph(C) genes, which could be responsible for the different types of resistance to macrolides, lincosamides, and streptogramin B. In all strains with the MLSB resistance phenotype, the erm(X) gene was detected; none of the other tested genes were found. Strains harboring the erm(X) gene were identified using a phenotypic method based on numerous biological and biochemical tests. Identification of the chosen strains was compared with the results of API Coryne, MALDI-TOF MS, and 16S rDNA sequencing; only 7 of the 23 investigated resistant strains were successfully identified by all the methods used, showing that identification of this group of bacteria is still a great challenge. The MLSB resistance mechanism was common in Corynebacterium tuberculostearicum and Corynebacterium jeikeium, the species most frequently isolated from healthy human skin. This represents a threat, as these species are also commonly described as etiological factors of opportunistic infections.
Block Play: Practical Suggestions for Common Dilemmas
ERIC Educational Resources Information Center
Tunks, Karyn Wellhousen
2009-01-01
Learning materials and teaching methods used in early childhood classrooms have fluctuated greatly over the past century. However, one learning tool has stood the test of time: Wood building blocks, often called unit blocks, continue to be a source of pleasure and learning for young children at play. Wood blocks have the unique capacity to engage…
The Impacts of Language Background and Language-Related Disorders in Auditory Processing Assessment
ERIC Educational Resources Information Center
Loo, Jenny Hooi Yin; Bamiou, Doris-Eva; Rosen, Stuart
2013-01-01
Purpose: To examine the impact of language background and language-related disorders (LRDs--dyslexia and/or language impairment) on performance in English speech and nonspeech tests of auditory processing (AP) commonly used in the clinic. Method: A clinical database concerning 133 multilingual children (mostly with English as an additional…
Online Calibration Methods for the DINA Model with Independent Attributes in CD-CAT
ERIC Educational Resources Information Center
Chen, Ping; Xin, Tao; Wang, Chun; Chang, Hua-Hua
2012-01-01
Item replenishing is essential for item bank maintenance in cognitive diagnostic computerized adaptive testing (CD-CAT). In regular CAT, online calibration is commonly used to calibrate the new items continuously. However, until now no reference has publicly become available about online calibration for CD-CAT. Thus, this study investigates the…
Analyzing Response Times in Tests with Rank Correlation Approaches
ERIC Educational Resources Information Center
Ranger, Jochen; Kuhn, Jorg-Tobias
2013-01-01
It is common practice to log-transform response times before analyzing them with standard factor analytical methods. However, sometimes the log-transformation is not capable of linearizing the relation between the response times and the latent traits. Therefore, a more general approach to response time analysis is proposed in the current…
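The point about monotone but non-log-linear relations can be made concrete with a rank correlation: Spearman's coefficient reaches -1 for any strictly decreasing relation between trait and response time, while Pearson's coefficient on the log-transformed times does not. The toy data below are invented, and a pure-Python Pearson is included only to keep the sketch self-contained:

```python
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def ranks(xs):  # no ties in this demo
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for pos, i in enumerate(order):
        r[i] = pos + 1.0
    return r

def spearman(x, y):
    return pearson(ranks(x), ranks(y))

trait = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
# Strictly decreasing in the trait, but not linear even after a log.
times = [math.exp(2 - t) + 0.3 * t for t in trait]
rho = spearman(trait, times)                          # -1 up to rounding
r_log = pearson(trait, [math.log(v) for v in times])  # strictly above -1
```

This is the motivation for the rank-based approaches proposed in the article: they capture any monotone trait-time relation that a fixed log-transformation plus linear factor model can miss.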
Making the Cut in Gifted Selection: Score Combination Rules and Their Impact on Program Diversity
ERIC Educational Resources Information Center
Lakin, Joni M.
2018-01-01
The recommendation of using "multiple measures" is common in policy guidelines for gifted and talented assessment systems. However, the integration of multiple test scores in a system that uses cut-scores requires choosing between different methods of combining quantitative scores. Past research has indicated that OR combination rules…
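The difference between combination rules is easy to state in code. The sketch below (cut scores and student data invented) contrasts the OR rule, which admits any student who clears at least one cut, with the AND rule, which requires clearing every cut:

```python
def qualifies_or(scores, cuts):
    """OR rule: qualify if ANY measure meets its cut score."""
    return any(s >= c for s, c in zip(scores, cuts))

def qualifies_and(scores, cuts):
    """AND rule: qualify only if EVERY measure meets its cut score."""
    return all(s >= c for s, c in zip(scores, cuts))

cuts = (120, 120)
students = [(130, 110),   # strong on one measure only
            (125, 125),   # clears both cuts
            (110, 115)]   # clears neither
or_count = sum(qualifies_or(s, cuts) for s in students)
and_count = sum(qualifies_and(s, cuts) for s in students)
```

The OR rule admits the student who is strong on only one measure, which is why it tends to identify a larger and more diverse pool than the AND rule at the same cut scores.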
Imputation of Missing Genotypes From Sparse to High Density Using Long-Range Phasing
USDA-ARS?s Scientific Manuscript database
Related individuals in a population share long chromosome segments which trace to a common ancestor. We describe a long-range phasing algorithm that makes use of this property to phase whole chromosomes and simultaneously impute a large number of missing markers. We test our method by imputing marke...
A drawback of current in vitro chemical testing is that many commonly used cell lines lack chemical metabolism. This hinders the use and relevance of cell culture in high throughput chemical toxicity screening. To address this challenge, we engineered HEK293T cells to overexpress...
ERIC Educational Resources Information Center
Benathen, Isaiah A.
1991-01-01
Alternatives to the traditional unknown tests that permit a clear and unequivocal differential identification decision between Bacillus subtilis and Bacillus megaterium are presented. Plates of Phenylethyl Alcohol agar with Blood (PEAB), slants of Bile Esculin agar and plates of DNA agar are used. The materials, methods, results, and conclusions…
ERIC Educational Resources Information Center
Glenn, David
2007-01-01
Most college instructors probably are not about to start giving the daily quizzes that some researchers recommend to improve learning, so students might want to try testing themselves when they study on their own. But there's a catch: When people study with flashcards, by far the most common method of self-quizzing, they're notoriously bad at…
Simultaneous Estimation of Regression Functions for Marine Corps Technical Training Specialties.
ERIC Educational Resources Information Center
Dunbar, Stephen B.; And Others
This paper considers the application of Bayesian techniques for simultaneous estimation to the specification of regression weights for selection tests used in various technical training courses in the Marine Corps. Results of a method for m-group regression developed by Molenaar and Lewis (1979) suggest that common weights for training courses…