Sample records for multiple test correction

  1. Correcting for multiple-testing in multi-arm trials: is it necessary and is it done?

    PubMed

    Wason, James M S; Stecher, Lynne; Mander, Adrian P

    2014-09-17

    Multi-arm trials enable the evaluation of multiple treatments within a single trial. They provide a way of substantially increasing the efficiency of the clinical development process. However, since multi-arm trials test multiple hypotheses, some regulators require that a statistical correction be made to control the chance of making a type-1 error (false-positive). Several conflicting viewpoints are expressed in the literature regarding the circumstances in which a multiple-testing correction should be used. In this article we discuss these conflicting viewpoints and review the frequency with which correction methods are currently used in practice. We identified all multi-arm clinical trials published in 2012 by four major medical journals. Summary data on several aspects of the trial design were extracted, including whether the trial was exploratory or confirmatory, whether a multiple-testing correction was applied and, if one was used, what type it was. We found that almost half (49%) of published multi-arm trials report using a multiple-testing correction. The percentage that corrected was higher for trials in which the experimental arms included multiple doses or regimens of the same treatments (67%). The percentage that corrected was higher in exploratory than confirmatory trials, although this is explained by a greater proportion of exploratory trials testing multiple doses and regimens of the same treatment. A sizeable proportion of published multi-arm trials do not correct for multiple-testing. Clearer guidance about whether multiple-testing correction is needed for multi-arm trials that test separate treatments against a common control group is required.
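
    As a rough illustration of the type-1 error inflation at stake here, the short Python sketch below computes the familywise error rate for k independent comparisons and the corresponding Bonferroni- and Šidák-adjusted per-test thresholds; the arm counts and alpha are illustrative assumptions, not values taken from the trial survey.

        # Familywise error inflation across k hypothesis tests, with two simple
        # adjustments. Assumes independent tests; arms sharing a control group are
        # positively correlated, so this slightly overstates the inflation.
        alpha = 0.05
        for k in (2, 3, 5, 10):                      # number of experimental arms (assumed)
            fwer = 1 - (1 - alpha) ** k              # chance of at least one false positive
            bonferroni = alpha / k                   # Bonferroni per-test threshold
            sidak = 1 - (1 - alpha) ** (1 / k)       # Sidak per-test threshold
            print(f"k={k:2d}  FWER={fwer:.3f}  Bonferroni={bonferroni:.4f}  Sidak={sidak:.4f}")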

  2. All of the above: When multiple correct response options enhance the testing effect.

    PubMed

    Bishara, Anthony J; Lanzo, Lauren A

    2015-01-01

    Previous research has shown that multiple choice tests often improve memory retention. However, the presence of incorrect lures often attenuates this memory benefit. The current research examined the effects of "all of the above" (AOTA) options. When such options are correct, no incorrect lures are present. In the first three experiments, a correct AOTA option on an initial test led to a larger memory benefit than no test and standard multiple choice test conditions. The benefits of a correct AOTA option occurred even without feedback on the initial test; for both 5-minute and 48-hour retention delays; and for both cued recall and multiple choice final test formats. In the final experiment, an AOTA question led to better memory retention than did a control condition that had identical timing and exposure to response options. However, the benefits relative to this control condition were similar regardless of the type of multiple choice test (AOTA or not). Results suggest that retrieval contributes to multiple choice testing effects. However, the extra testing effect from a correct AOTA option, rather than being due to more retrieval, might be due simply to more exposure to correct information.

  3. "None of the above" as a correct and incorrect alternative on a multiple-choice test: implications for the testing effect.

    PubMed

    Odegard, Timothy N; Koen, Joshua D

    2007-11-01

    Both positive and negative testing effects have been demonstrated with a variety of materials and paradigms (Roediger & Karpicke, 2006b). The present series of experiments replicate and extend the research of Roediger and Marsh (2005) with the addition of a "none-of-the-above" response option. Participants (n=32 in both experiments) read a set of passages, took an initial multiple-choice test, completed a filler task, and then completed a final cued-recall test (Experiment 1) or multiple-choice test (Experiment 2). Questions were manipulated on the initial multiple-choice test by adding a "none-of-the-above" response alternative (choice "E") that was incorrect ("E" Incorrect) or correct ("E" Correct). The results from both experiments demonstrated that the positive testing effect was negated when the "none-of-the-above" alternative was the correct response on the initial multiple-choice test, but was still present when the "none-of-the-above" alternative was an incorrect response.

  4. Rapid and Accurate Multiple Testing Correction and Power Estimation for Millions of Correlated Markers

    PubMed Central

    Han, Buhm; Kang, Hyun Min; Eskin, Eleazar

    2009-01-01

    With the development of high-throughput sequencing and genotyping technologies, the number of markers collected in genetic association studies is growing rapidly, increasing the importance of methods for correcting for multiple hypothesis testing. The permutation test is widely considered the gold standard for accurate multiple testing correction, but it is often computationally impractical for these large datasets. Recently, several studies proposed efficient alternative approaches to the permutation test based on the multivariate normal distribution (MVN). However, they cannot accurately correct for multiple testing in genome-wide association studies for two reasons. First, these methods require partitioning of the genome into many disjoint blocks and ignore all correlations between markers from different blocks. Second, the true null distribution of the test statistic often fails to follow the asymptotic distribution at the tails of the distribution. We propose an accurate and efficient method for multiple testing correction in genome-wide association studies—SLIDE. Our method accounts for all correlation within a sliding window and corrects for the departure of the true null distribution of the statistic from the asymptotic distribution. In simulations using the Wellcome Trust Case Control Consortium data, the error rate of SLIDE's corrected p-values is more than 20 times smaller than the error rate of the previous MVN-based methods' corrected p-values, while SLIDE is orders of magnitude faster than the permutation test and other competing methods. We also extend the MVN framework to the problem of estimating the statistical power of an association study with correlated markers and propose an efficient and accurate power estimation method SLIP. SLIP and SLIDE are available at http://slide.cs.ucla.edu. PMID:19381255
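
    The sketch below is not the SLIDE implementation; it only illustrates the multivariate-normal idea the abstract builds on: draw correlated null statistics from an MVN with the marker correlation (LD) matrix and use the null distribution of the maximum |Z| to convert a pointwise p-value into a corrected one. The toy AR(1) correlation structure, marker count, and pointwise p-value are assumptions.

        import numpy as np
        from scipy.stats import norm

        # Toy MVN-based multiple-testing correction (general idea only; not SLIDE's
        # sliding-window algorithm or its correction for non-asymptotic tails).
        rng = np.random.default_rng(0)
        m = 50                                             # correlated markers (assumed)
        idx = np.arange(m)
        cov = 0.6 ** np.abs(np.subtract.outer(idx, idx))   # AR(1)-style LD decay (assumed)

        null_z = rng.multivariate_normal(np.zeros(m), cov, size=20000)
        max_abs = np.abs(null_z).max(axis=1)               # null distribution of max |Z|

        p_pointwise = 1e-3                                 # best per-marker p-value (assumed)
        z_obs = norm.isf(p_pointwise / 2)                  # two-sided observed statistic
        p_corrected = (max_abs >= z_obs).mean()            # Monte Carlo corrected p-value
        print(f"pointwise p={p_pointwise:.0e} -> corrected p ~ {p_corrected:.3f}")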

  5. Best (but oft-forgotten) practices: the multiple problems of multiplicity-whether and how to correct for many statistical tests.

    PubMed

    Streiner, David L

    2015-10-01

    Testing many null hypotheses in a single study results in an increased probability of detecting a significant finding just by chance (the problem of multiplicity). Debates have raged over many years with regard to whether to correct for multiplicity and, if so, how it should be done. This article first discusses how multiple tests lead to an inflation of the α level, then explores the following different contexts in which multiplicity arises: testing for baseline differences in various types of studies, having >1 outcome variable, conducting statistical tests that produce >1 P value, taking multiple "peeks" at the data, and unplanned, post hoc analyses (i.e., "data dredging," "fishing expeditions," or "P-hacking"). It then discusses some of the methods that have been proposed for correcting for multiplicity, including single-step procedures (e.g., Bonferroni); multistep procedures, such as those of Holm, Hochberg, and Šidák; false discovery rate control; and resampling approaches. Note that these various approaches describe different aspects and are not necessarily mutually exclusive. For example, resampling methods could be used to control the false discovery rate or the family-wise error rate (as defined later in this article). However, the use of one of these approaches presupposes that we should correct for multiplicity, which is not universally accepted, and the article presents the arguments for and against such "correction." The final section brings together these threads and presents suggestions with regard to when it makes sense to apply the corrections and how to do so. © 2015 American Society for Nutrition.
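
    Several of the procedures reviewed here are implemented in statsmodels; the snippet below applies a few of them to a made-up vector of p-values purely to show how the adjusted values and rejection decisions differ across methods. The p-values are invented for illustration.

        import numpy as np
        from statsmodels.stats.multitest import multipletests

        # Illustrative p-values (assumed, not taken from any study cited here).
        pvals = np.array([0.001, 0.008, 0.012, 0.041, 0.049, 0.20, 0.74])

        for method in ("bonferroni", "sidak", "holm", "fdr_bh"):
            reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method=method)
            print(f"{method:10s} rejected={reject.sum()}  adjusted={np.round(p_adj, 3)}")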

  6. Piloting a Polychotomous Partial-Credit Scoring Procedure in a Multiple-Choice Test

    ERIC Educational Resources Information Center

    Tsopanoglou, Antonios; Ypsilandis, George S.; Mouti, Anna

    2014-01-01

    Multiple-choice (MC) tests are frequently used to measure language competence because they are quick, economical and straightforward to score. While degrees of correctness have been investigated for partially correct responses in combined-response MC tests, degrees of incorrectness in distractors and the role they play in determining the…

  7. Multiple testing corrections in quantitative proteomics: A useful but blunt tool.

    PubMed

    Pascovici, Dana; Handler, David C L; Wu, Jemma X; Haynes, Paul A

    2016-09-01

    Multiple testing corrections are a useful tool for restricting the FDR, but can be blunt in the context of low power, as we demonstrate by a series of simple simulations. Unfortunately, in proteomics experiments low power can be common, driven by proteomics-specific issues such as small effects due to ratio compression and few replicates due to high reagent cost, limited instrument time, and other constraints; in such situations, most multiple testing correction methods, if used with conventional thresholds, will fail to detect any true positives even when many exist. In this low-power, medium-scale situation, other methods such as effect size considerations or peptide-level calculations may be a more effective option, even if they do not offer the same theoretical guarantee of a low FDR. Thus, we aim to highlight in this article that proteomics presents some specific challenges to the standard multiple testing correction methods, which should be employed as a useful tool but not be regarded as a required rubber stamp. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
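
    A minimal simulation in the spirit of the ones described here: small effects, three replicates per group, and a Benjamini-Hochberg correction that leaves essentially no true positives. The effect size, variances, and protein counts are assumptions chosen only to reproduce the qualitative point, not the authors' simulation settings.

        import numpy as np
        from scipy.stats import ttest_ind
        from statsmodels.stats.multitest import multipletests

        rng = np.random.default_rng(1)
        n_prot, n_true, n_rep = 2000, 100, 3           # proteins, true changes, replicates (assumed)
        effect = 0.5                                   # compressed log-ratio effect (assumed)

        ctrl = rng.normal(0.0, 1.0, (n_prot, n_rep))
        trt = rng.normal(0.0, 1.0, (n_prot, n_rep))
        trt[:n_true] += effect                         # the first n_true proteins truly change

        p = ttest_ind(trt, ctrl, axis=1).pvalue
        reject, _, _, _ = multipletests(p, alpha=0.05, method="fdr_bh")
        print(f"raw p<0.05: {(p < 0.05).sum()}   BH-significant: {reject.sum()} of {n_true} true changes")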

  8. Non-parametric combination and related permutation tests for neuroimaging.

    PubMed

    Winkler, Anderson M; Webster, Matthew A; Brooks, Jonathan C; Tracey, Irene; Smith, Stephen M; Nichols, Thomas E

    2016-04-01

    In this work, we show how permutation methods can be applied to combination analyses such as those that include multiple imaging modalities, multiple data acquisitions of the same modality, or simply multiple hypotheses on the same data. Using the well-known definition of union-intersection tests and closed testing procedures, we use synchronized permutations to correct for such multiplicity of tests, allowing flexibility to integrate imaging data with different spatial resolutions, surface and/or volume-based representations of the brain, including non-imaging data. For the problem of joint inference, we propose and evaluate a modification of the recently introduced non-parametric combination (NPC) methodology, such that instead of a two-phase algorithm and large data storage requirements, the inference can be performed in a single phase, with reasonable computational demands. The method compares favorably to classical multivariate tests (such as MANCOVA), even when the latter is assessed using permutations. We also evaluate, in the context of permutation tests, various combining methods that have been proposed in the past decades, and identify those that provide the best control over error rate and power across a range of situations. We show that one of these, the method of Tippett, provides a link between correction for the multiplicity of tests and their combination. Finally, we discuss how the correction can solve certain problems of multiple comparisons in one-way ANOVA designs, and how the combination is distinguished from conjunctions, even though both can be assessed using permutation tests. We also provide a common algorithm that accommodates combination and correction. © 2016 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
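
    This is not the paper's NPC implementation (there are no spatial statistics and no single-phase machinery here); the toy below only shows the two ingredients named in the abstract: synchronized permutations across two modalities measured on the same subjects, and Tippett (minimum p-value) combination, which doubles as a correction over the two tests. Sample size and effect sizes are assumptions.

        import numpy as np
        from scipy.stats import ttest_ind

        # Synchronized-permutation Tippett combination for two modalities (toy sketch).
        rng = np.random.default_rng(2)
        n = 40
        group = np.repeat([0, 1], n // 2)
        mod1 = rng.normal(0, 1, n) + 0.6 * group       # modality 1 carries an effect (assumed)
        mod2 = rng.normal(0, 1, n)                     # modality 2 is pure noise (assumed)

        def tippett(labels):
            # Same permuted labels applied to both modalities = synchronized permutation.
            p1 = ttest_ind(mod1[labels == 1], mod1[labels == 0]).pvalue
            p2 = ttest_ind(mod2[labels == 1], mod2[labels == 0]).pvalue
            return min(p1, p2)                         # Tippett combining function

        observed = tippett(group)
        null = np.array([tippett(rng.permutation(group)) for _ in range(2000)])
        print(f"observed min-p={observed:.4f}  combined (corrected) p={(null <= observed).mean():.4f}")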

  9. Score Increase and Partial-Credit Validity When Administering Multiple-Choice Tests Using an Answer-Until-Correct Format

    ERIC Educational Resources Information Center

    Slepkov, Aaron D.; Vreugdenhil, Andrew J.; Shiell, Ralph C.

    2016-01-01

    There are numerous benefits to answer-until-correct (AUC) approaches to multiple-choice testing, not the least of which is the straightforward allotment of partial credit. However, the benefits of granting partial credit can be tempered by the inevitable increase in test scores and by fears that such increases are further contaminated by a large…

  10. Feedback-related brain activity predicts learning from feedback in multiple-choice testing.

    PubMed

    Ernst, Benjamin; Steinhauser, Marco

    2012-06-01

    Different event-related potentials (ERPs) have been shown to correlate with learning from feedback in decision-making tasks and with learning in explicit memory tasks. In the present study, we investigated which ERPs predict learning from corrective feedback in a multiple-choice test, which combines elements from both paradigms. Participants worked through sets of multiple-choice items of a Swahili-German vocabulary task. Whereas the initial presentation of an item required the participants to guess the answer, corrective feedback could be used to learn the correct response. Initial analyses revealed that corrective feedback elicited components related to reinforcement learning (FRN), as well as to explicit memory processing (P300) and attention (early frontal positivity). However, only the P300 and early frontal positivity were positively correlated with successful learning from corrective feedback, whereas the FRN was even larger when learning failed. These results suggest that learning from corrective feedback crucially relies on explicit memory processing and attentional orienting to corrective feedback, rather than on reinforcement learning.

  11. Accurate and fast multiple-testing correction in eQTL studies.

    PubMed

    Sul, Jae Hoon; Raj, Towfique; de Jong, Simone; de Bakker, Paul I W; Raychaudhuri, Soumya; Ophoff, Roel A; Stranger, Barbara E; Eskin, Eleazar; Han, Buhm

    2015-06-04

    In studies of expression quantitative trait loci (eQTLs), it is of increasing interest to identify eGenes, the genes whose expression levels are associated with variation at a particular genetic variant. Detecting eGenes is important for follow-up analyses and prioritization because genes are the main entities in biological processes. To detect eGenes, one typically focuses on the genetic variant with the minimum p value among all variants in cis with a gene and corrects for multiple testing to obtain a gene-level p value. For performing multiple-testing correction, a permutation test is widely used. Because of growing sample sizes of eQTL studies, however, the permutation test has become a computational bottleneck in eQTL studies. In this paper, we propose an efficient approach for correcting for multiple testing and assessing eGene p values by utilizing a multivariate normal distribution. Our approach properly takes into account the linkage-disequilibrium structure among variants, and its time complexity is independent of sample size. By applying our small-sample correction techniques, our method achieves high accuracy in both small and large studies. We have shown that our method consistently produces extremely accurate p values (accuracy > 98%) for three human eQTL datasets with different sample sizes and SNP densities: the Genotype-Tissue Expression pilot dataset, the multi-region brain dataset, and the HapMap 3 dataset. Copyright © 2015 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
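
    The code below sketches the permutation baseline that the paper's MVN approach is designed to replace: take the minimum p-value across a gene's cis variants, then permute the expression vector and recompute that minimum to obtain a gene-level (eGene) p-value. Sample size, variant count, and permutation number are assumptions, and no LD structure is simulated.

        import numpy as np
        from scipy.stats import pearsonr

        # Permutation-based gene-level p-value for one gene (the computational
        # bottleneck described in the abstract, not the proposed MVN method).
        rng = np.random.default_rng(3)
        n, n_var = 200, 100                            # samples and cis variants (assumed)
        geno = rng.binomial(2, 0.3, (n, n_var)).astype(float)
        expr = rng.normal(0, 1, n) + 0.3 * geno[:, 0]  # variant 0 is a true eQTL (assumed)

        def min_p(y):
            return min(pearsonr(geno[:, j], y)[1] for j in range(n_var))

        obs = min_p(expr)
        null = np.array([min_p(rng.permutation(expr)) for _ in range(1000)])
        print(f"minimum cis p={obs:.2e}  gene-level p={(null <= obs).mean():.3f}")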

  12. Non‐parametric combination and related permutation tests for neuroimaging

    PubMed Central

    Webster, Matthew A.; Brooks, Jonathan C.; Tracey, Irene; Smith, Stephen M.; Nichols, Thomas E.

    2016-01-01

    Abstract In this work, we show how permutation methods can be applied to combination analyses such as those that include multiple imaging modalities, multiple data acquisitions of the same modality, or simply multiple hypotheses on the same data. Using the well‐known definition of union‐intersection tests and closed testing procedures, we use synchronized permutations to correct for such multiplicity of tests, allowing flexibility to integrate imaging data with different spatial resolutions, surface and/or volume‐based representations of the brain, including non‐imaging data. For the problem of joint inference, we propose and evaluate a modification of the recently introduced non‐parametric combination (NPC) methodology, such that instead of a two‐phase algorithm and large data storage requirements, the inference can be performed in a single phase, with reasonable computational demands. The method compares favorably to classical multivariate tests (such as MANCOVA), even when the latter is assessed using permutations. We also evaluate, in the context of permutation tests, various combining methods that have been proposed in the past decades, and identify those that provide the best control over error rate and power across a range of situations. We show that one of these, the method of Tippett, provides a link between correction for the multiplicity of tests and their combination. Finally, we discuss how the correction can solve certain problems of multiple comparisons in one‐way ANOVA designs, and how the combination is distinguished from conjunctions, even though both can be assessed using permutation tests. We also provide a common algorithm that accommodates combination and correction. Hum Brain Mapp 37:1486‐1511, 2016. © 2016 Wiley Periodicals, Inc. PMID:26848101

  13. Is the NIHSS Certification Process Too Lenient?

    PubMed Central

    Hills, Nancy K.; Josephson, S. Andrew; Lyden, Patrick D.; Johnston, S. Claiborne

    2009-01-01

    Background and Purpose The National Institutes of Health Stroke Scale (NIHSS) is a widely used measure of neurological function in clinical trials and patient assessment; inter-rater scoring variability could impact communications and trial power. The manner in which the rater certification test is scored yields multiple correct answers that have changed over time. We examined the range of possible total NIHSS scores from answers given in certification tests by over 7,000 individual raters who were certified. Methods We analyzed the results of all raters who completed one of two standard multiple-patient videotaped certification examinations between 1998 and 2004. The range for the correct score, calculated using NIHSS ‘correct answers’, was determined for each patient. The distribution of scores derived from those who passed the certification test then was examined. Results A total of 6,268 raters scored 5 patients on Test 1; 1,240 scored 6 patients on Test 2. Using a National Stroke Association (NSA) answer key, we found that the number of different correct total scores per patient ranged from 2 to as many as 12. Among raters who achieved a passing score and were therefore qualified to administer the NIHSS, score distributions were even wider, with 1 certification patient receiving 18 different correct total scores. Conclusions Allowing multiple acceptable answers for questions on the NIHSS certification test introduces scoring variability. It seems reasonable to assume that the wider the range of acceptable answers in the certification test, the greater the variability in the performance of the test in trials and clinical practice by certified examiners. Greater consistency may be achieved by deriving a set of ‘best’ answers through expert consensus on all questions where this is possible, then teaching raters how to derive these answers using a required interactive training module. PMID:19295205

  14. Multiple testing and power calculations in genetic association studies.

    PubMed

    So, Hon-Cheong; Sham, Pak C

    2011-01-01

    Modern genetic association studies typically involve multiple single-nucleotide polymorphisms (SNPs) and/or multiple genes. With the development of high-throughput genotyping technologies and the reduction in genotyping cost, investigators can now assay up to a million SNPs for direct or indirect association with disease phenotypes. In addition, some studies involve multiple disease or related phenotypes and use multiple methods of statistical analysis. The combination of multiple genetic loci, multiple phenotypes, and multiple methods of evaluating associations between genotype and phenotype means that modern genetic studies often involve the testing of an enormous number of hypotheses. When multiple hypothesis tests are performed in a study, there is a risk of inflation of the type I error rate (i.e., the chance of falsely claiming an association when there is none). Several methods for multiple-testing correction are in popular use, and they all have strengths and weaknesses. Because no single method is universally adopted or always appropriate, it is important to understand the principles, strengths, and weaknesses of the methods so that they can be applied appropriately in practice. In this article, we review the three principal methods for multiple-testing correction and provide guidance for calculating statistical power.
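
    As a worked illustration of the power side of this review, the sketch below computes approximate power for a single SNP at a Bonferroni-corrected genome-wide threshold using a normal approximation to the association statistic. The number of tests, the standardized per-sample effect, and the sample sizes are assumptions, not figures from the article.

        import numpy as np
        from scipy.stats import norm

        # Approximate two-sided power at a Bonferroni-corrected threshold.
        m = 1_000_000                                  # number of SNPs tested (assumed)
        alpha = 0.05 / m                               # Bonferroni per-test alpha
        z_crit = norm.isf(alpha / 2)                   # two-sided critical value

        delta = 0.02                                   # standardized per-sample effect (assumed)
        for n in (5_000, 20_000, 50_000):
            ncp = np.sqrt(n) * delta                   # mean of the test statistic under H1
            power = norm.sf(z_crit - ncp) + norm.cdf(-z_crit - ncp)
            print(f"n={n:6d}  per-test alpha={alpha:.1e}  power~{power:.3f}")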

  15. Resting-state fMRI data reflects default network activity rather than null data: A defense of commonly employed methods to correct for multiple comparisons.

    PubMed

    Slotnick, Scott D

    2017-07-01

    Analysis of functional magnetic resonance imaging (fMRI) data typically involves over one hundred thousand independent statistical tests; therefore, it is necessary to correct for multiple comparisons to control familywise error. In a recent paper, Eklund, Nichols, and Knutsson used resting-state fMRI data to evaluate commonly employed methods to correct for multiple comparisons and reported unacceptable rates of familywise error. Eklund et al.'s analysis was based on the assumption that resting-state fMRI data reflect null data; however, their 'null data' actually reflected default network activity that inflated familywise error. As such, Eklund et al.'s results provide no basis to question the validity of the thousands of published fMRI studies that have corrected for multiple comparisons or the commonly employed methods to correct for multiple comparisons.

  16. Do Streaks Matter in Multiple-Choice Tests?

    ERIC Educational Resources Information Center

    Kiss, Hubert János; Selei, Adrienn

    2018-01-01

    Success in life is determined to a large extent by school performance, which in turn depends heavily on grades obtained in exams. In this study, we investigate a particular type of exam: multiple-choice tests. More concretely, we study if patterns of correct answers in multiple-choice tests affect performance. We design an experiment to study if…

  17. The Use of Meta-Analytic Statistical Significance Testing

    ERIC Educational Resources Information Center

    Polanin, Joshua R.; Pigott, Terri D.

    2015-01-01

    Meta-analysis multiplicity, the concept of conducting multiple tests of statistical significance within one review, is an underdeveloped literature. We address this issue by considering how Type I errors can impact meta-analytic results, suggest how statistical power may be affected through the use of multiplicity corrections, and propose how…

  18. Investigating the potential influence of established multiple-choice test-taking cues on item response in a pharmacotherapy board certification examination preparatory manual: a pilot study.

    PubMed

    Gettig, Jacob P

    2006-04-01

    To determine the prevalence of established multiple-choice test-taking correct and incorrect answer cues in the American College of Clinical Pharmacy's Updates in Therapeutics: The Pharmacotherapy Preparatory Course, 2005 Edition, as an equal or lesser surrogate indication of the prevalence of such cues in the Pharmacotherapy board certification examination. All self-assessment and patient case question-and-answer sets were assessed individually to determine if they were subject to selected correct and incorrect answer cues commonly seen in multiple-choice question writing. If the question was considered evaluable, correct answer cues (longest answer, mid-range number, one of two similar choices, and one of two opposite choices) were tallied. In addition, incorrect answer cues (inclusionary language and grammatical mismatch) were also tallied. Each cue was counted if it did what was expected or did the opposite of what was expected. Multiple cues could be identified in each question. A total of 237 (47.7%) of 497 questions in the manual were deemed evaluable. A total of 325 correct answer cues and 35 incorrect answer cues were identified in the 237 evaluable questions. Most evaluable questions contained one to two correct and/or incorrect answer cue(s). Longest answer was the most frequently identified correct answer cue; however, it was the least likely to identify the correct answer. Inclusionary language was the most frequently identified incorrect answer cue. Incorrect answer cues were considerably more likely to identify incorrect answer choices than correct answer cues were able to identify correct answer choices. The use of established multiple-choice test-taking cues is unlikely to be of significant help when taking the Pharmacotherapy board certification examination, primarily because of the lack of questions subject to such cues and the inability of correct answer cues to accurately identify correct answers. Incorrect answer cues, especially the use of inclusionary language, almost always will accurately identify an incorrect answer choice. Assuming that questions in the preparatory course manual were equal or lesser surrogates of those in the board certification examination, it is unlikely that intuition alone can replace adequate preparation and studying as the sole determinant of examination success.

  19. Correction for Guessing in the Framework of the 3PL Item Response Theory

    ERIC Educational Resources Information Center

    Chiu, Ting-Wei

    2010-01-01

    Guessing behavior is an important topic with regard to assessing proficiency on multiple-choice tests, particularly for examinees at lower levels of proficiency, due to the greater potential for systematic error or bias that inflates observed test scores. Methods that incorporate a correction for guessing on high-stakes tests generally rely…

  20. Optimizing multiple-choice tests as tools for learning.

    PubMed

    Little, Jeri L; Bjork, Elizabeth Ligon

    2015-01-01

    Answering multiple-choice questions with competitive alternatives can enhance performance on a later test, not only on questions about the information previously tested, but also on questions about related information not previously tested-in particular, on questions about information pertaining to the previously incorrect alternatives. In the present research, we assessed a possible explanation for this pattern: When multiple-choice questions contain competitive incorrect alternatives, test-takers are led to retrieve previously studied information pertaining to all of the alternatives in order to discriminate among them and select an answer, with such processing strengthening later access to information associated with both the correct and incorrect alternatives. Supporting this hypothesis, we found enhanced performance on a later cued-recall test for previously nontested questions when their answers had previously appeared as competitive incorrect alternatives in the initial multiple-choice test, but not when they had previously appeared as noncompetitive alternatives. Importantly, however, competitive alternatives were not more likely than noncompetitive alternatives to be intruded as incorrect responses, indicating that a general increased accessibility for previously presented incorrect alternatives could not be the explanation for these results. The present findings, replicated across two experiments (one in which corrective feedback was provided during the initial multiple-choice testing, and one in which it was not), thus strongly suggest that competitive multiple-choice questions can trigger beneficial retrieval processes for both tested and related information, and the results have implications for the effective use of multiple-choice tests as tools for learning.

  21. Comment on 3PL IRT Adjustment for Guessing

    ERIC Educational Resources Information Center

    Chiu, Ting-Wei; Camilli, Gregory

    2013-01-01

    Guessing behavior is an issue discussed widely with regard to multiple choice tests. Its primary effect is on number-correct scores for examinees at lower levels of proficiency. This is a systematic error or bias, which increases observed test scores. Guessing also can inflate random error variance. Correction or adjustment for guessing formulas…

  22. Improved workflow for quantification of left ventricular volumes and mass using free-breathing motion corrected cine imaging.

    PubMed

    Cross, Russell; Olivieri, Laura; O'Brien, Kendall; Kellman, Peter; Xue, Hui; Hansen, Michael

    2016-02-25

    Traditional cine imaging for cardiac functional assessment requires breath-holding, which can be problematic in some situations. Free-breathing techniques have relied on multiple averages or real-time imaging, producing images that can be spatially and/or temporally blurred. To overcome this, methods have been developed to acquire real-time images over multiple cardiac cycles, which are subsequently motion corrected and reformatted to yield a single image series displaying one cardiac cycle with high temporal and spatial resolution. Application of these algorithms has required significant additional reconstruction time. The use of distributed computing was recently proposed as a way to improve clinical workflow with such algorithms. In this study, we have deployed a distributed computing version of motion corrected re-binning reconstruction for free-breathing evaluation of cardiac function. Twenty five patients and 25 volunteers underwent cardiovascular magnetic resonance (CMR) for evaluation of left ventricular end-systolic volume (ESV), end-diastolic volume (EDV), and end-diastolic mass. Measurements using motion corrected re-binning were compared to those using breath-held SSFP and to free-breathing SSFP with multiple averages, and were performed by two independent observers. Pearson correlation coefficients and Bland-Altman plots tested agreement across techniques. Concordance correlation coefficient and Bland-Altman analysis tested inter-observer variability. Total scan plus reconstruction times were tested for significant differences using paired t-test. Measured volumes and mass obtained by motion corrected re-binning and by averaged free-breathing SSFP compared favorably to those obtained by breath-held SSFP (r = 0.9863/0.9813 for EDV, 0.9550/0.9685 for ESV, 0.9952/0.9771 for mass). Inter-observer variability was good with concordance correlation coefficients between observers across all acquisition types suggesting substantial agreement. Both motion corrected re-binning and averaged free-breathing SSFP acquisition and reconstruction times were shorter than breath-held SSFP techniques (p < 0.0001). On average, motion corrected re-binning required 3 min less than breath-held SSFP imaging, a 37% reduction in acquisition and reconstruction time. The motion corrected re-binning image reconstruction technique provides robust cardiac imaging that can be used for quantification that compares favorably to breath-held SSFP as well as multiple average free-breathing SSFP, but can be obtained in a fraction of the time when using cloud-based distributed computing reconstruction.
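
    The agreement statistics named here (Pearson correlation and Bland-Altman analysis) are standard; the sketch below computes them for two hypothetical series of EDV measurements. The numbers are invented placeholders, not data from this study.

        import numpy as np

        # Bland-Altman agreement between two measurement techniques (illustrative values).
        edv_breath_held = np.array([120.0, 95.0, 140.0, 110.0, 150.0, 88.0])   # mL (assumed)
        edv_moco_rebin = np.array([118.0, 97.0, 138.0, 112.0, 149.0, 90.0])    # mL (assumed)

        r = np.corrcoef(edv_breath_held, edv_moco_rebin)[0, 1]
        diff = edv_moco_rebin - edv_breath_held
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)                  # limits of agreement = bias +/- 1.96 SD
        print(f"r={r:.3f}  bias={bias:.1f} mL  limits of agreement=({bias - loa:.1f}, {bias + loa:.1f}) mL")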

  23. Feedback enhances the positive effects and reduces the negative effects of multiple-choice testing.

    PubMed

    Butler, Andrew C; Roediger, Henry L

    2008-04-01

    Multiple-choice tests are used frequently in higher education without much consideration of the impact this form of assessment has on learning. Multiple-choice testing enhances retention of the material tested (the testing effect); however, unlike other tests, multiple-choice can also be detrimental because it exposes students to misinformation in the form of lures. The selection of lures can lead students to acquire false knowledge (Roediger & Marsh, 2005). The present research investigated whether feedback could be used to boost the positive effects and reduce the negative effects of multiple-choice testing. Subjects studied passages and then received a multiple-choice test with immediate feedback, delayed feedback, or no feedback. In comparison with the no-feedback condition, both immediate and delayed feedback increased the proportion of correct responses and reduced the proportion of intrusions (i.e., lure responses from the initial multiple-choice test) on a delayed cued recall test. Educators should provide feedback when using multiple-choice tests.

  24. Experimental evaluation of the extended Dytlewski-style dead time correction formalism for neutron multiplicity counting

    NASA Astrophysics Data System (ADS)

    Lockhart, M.; Henzlova, D.; Croft, S.; Cutler, T.; Favalli, A.; McGahee, Ch.; Parker, R.

    2018-01-01

    Over the past few decades, neutron multiplicity counting has played an integral role in Special Nuclear Material (SNM) characterization pertaining to nuclear safeguards. Current neutron multiplicity analysis techniques use singles, doubles, and triples count rates because a methodology to extract and dead time correct higher order count rates (i.e. quads and pents) was not fully developed. This limitation is overcome by the recent extension of a popular dead time correction method developed by Dytlewski. This extended dead time correction algorithm, named Dytlewski-Croft-Favalli (DCF), is detailed in reference Croft and Favalli (2017), which gives an extensive explanation of the theory and implications of this new development. Dead time corrected results can then be used to assay SNM by inverting a set of extended point model equations which themselves have only recently been formulated. The current paper discusses and presents the experimental evaluation of the practical feasibility of the DCF dead time correction algorithm to demonstrate its performance and applicability in nuclear safeguards applications. In order to test the validity and effectiveness of the dead time correction for quads and pents, 252Cf and SNM sources were measured in high efficiency neutron multiplicity counters at the Los Alamos National Laboratory (LANL) and the count rates were extracted up to the fifth order and corrected for dead time. In order to assess the DCF dead time correction, the corrected data are compared to the traditional dead time correction treatment within INCC. The DCF dead time correction is found to provide adequate dead time treatment for a broad range of count rates available in practical applications.
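
    The DCF formalism itself is not reproduced in this abstract, so the snippet below shows only the textbook non-paralyzable dead-time correction for a singles count rate, as a much simpler stand-in for the kind of adjustment that the paper extends to doubles, triples, quads, and pents. The dead time and measured rate are assumed values.

        # Textbook non-paralyzable dead-time correction for a singles rate.
        # NOT the DCF algorithm for higher-order multiplicity rates; illustration only.
        tau = 100e-9                                   # detector dead time, seconds (assumed)
        measured_rate = 2.0e5                          # observed counts per second (assumed)
        true_rate = measured_rate / (1.0 - measured_rate * tau)
        print(f"measured {measured_rate:.3g} cps -> dead-time corrected {true_rate:.3g} cps")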

  25. Multiple-Choice Tests with Correction Allowed in Autism: An Excel Applet

    ERIC Educational Resources Information Center

    Martinez, Elisabetta Monari

    2010-01-01

    The valuation of academic achievements in students with severe language impairment is problematic if they also have difficulties in sustaining attention and in praxic skills. In severe autism all of these difficulties may occur together. Multiple-choice tests offer the advantage that simple praxic skills are required, allowing the tasks to be…

  26. Reproducibility of R-fMRI metrics on the impact of different strategies for multiple comparison correction and sample sizes.

    PubMed

    Chen, Xiao; Lu, Bin; Yan, Chao-Gan

    2018-01-01

    Concerns regarding reproducibility of resting-state functional magnetic resonance imaging (R-fMRI) findings have been raised. Little is known about how to operationally define R-fMRI reproducibility and to what extent it is affected by multiple comparison correction strategies and sample size. We comprehensively assessed two aspects of reproducibility, test-retest reliability and replicability, on widely used R-fMRI metrics in both between-subject contrasts of sex differences and within-subject comparisons of eyes-open and eyes-closed (EOEC) conditions. We noted that the permutation test with Threshold-Free Cluster Enhancement (TFCE), a strict multiple comparison correction strategy, reached the best balance between family-wise error rate (under 5%) and test-retest reliability/replicability (e.g., 0.68 for test-retest reliability and 0.25 for replicability of amplitude of low-frequency fluctuations (ALFF) for between-subject sex differences, 0.49 for replicability of ALFF for within-subject EOEC differences). Although R-fMRI indices attained moderate reliabilities, they replicated poorly in distinct datasets (replicability < 0.3 for between-subject sex differences, < 0.5 for within-subject EOEC differences). By randomly drawing different sample sizes from a single site, we found reliability, sensitivity and positive predictive value (PPV) rose as sample size increased. Small sample sizes (e.g., < 80 [40 per group]) not only minimized power (sensitivity < 2%), but also decreased the likelihood that significant results reflect "true" effects (PPV < 0.26) in sex differences. Our findings have implications for how to select multiple comparison correction strategies and highlight the importance of sufficiently large sample sizes in R-fMRI studies to enhance reproducibility. Hum Brain Mapp 39:300-318, 2018. © 2017 Wiley Periodicals, Inc.
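
    The link the authors draw between small samples, low power, and the chance that a significant result reflects a true effect can be made explicit with the standard positive-predictive-value identity; the prior probability, alpha, and power values below are assumptions used only for illustration.

        # PPV = (power * prior) / (power * prior + alpha * (1 - prior))
        def ppv(power, alpha=0.05, prior=0.2):
            """Probability that a significant finding is a true effect (assumed prior)."""
            return (power * prior) / (power * prior + alpha * (1 - prior))

        for power in (0.02, 0.20, 0.80):
            print(f"power={power:.2f} -> PPV={ppv(power):.2f}")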

  27. Set of Criteria for Efficiency of the Process Forming the Answers to Multiple-Choice Test Items

    ERIC Educational Resources Information Center

    Rybanov, Alexander Aleksandrovich

    2013-01-01

    A set of criteria is offered for assessing the efficiency of the process of forming answers to multiple-choice test items. To increase the accuracy of computer-assisted testing results, it is suggested that the dynamics of forming the final answer be assessed using the following factors: a loss-of-time factor and a correct-choice factor. The model…

  28. The "None of the Above" Option in Multiple-Choice Testing: An Experimental Study

    ERIC Educational Resources Information Center

    DiBattista, David; Sinnige-Egger, Jo-Anne; Fortuna, Glenda

    2014-01-01

    The authors assessed the effects of using "none of the above" as an option in a 40-item, general-knowledge multiple-choice test administered to undergraduate students. Examinees who selected "none of the above" were given an incentive to write the correct answer to the question posed. Using "none of the above" as the…

  29. Experimental evaluation of the extended Dytlewski-style dead time correction formalism for neutron multiplicity counting

    DOE PAGES

    Lockhart, M.; Henzlova, D.; Croft, S.; ...

    2017-09-20

    Over the past few decades, neutron multiplicity counting has played an integral role in Special Nuclear Material (SNM) characterization pertaining to nuclear safeguards. Current neutron multiplicity analysis techniques use singles, doubles, and triples count rates because a methodology to extract and dead time correct higher order count rates (i.e. quads and pents) was not fully developed. This limitation is overcome by the recent extension of a popular dead time correction method developed by Dytlewski. This extended dead time correction algorithm, named Dytlewski-Croft-Favalli (DCF), is detailed in reference Croft and Favalli (2017), which gives an extensive explanation of the theory and implications of this new development. Dead time corrected results can then be used to assay SNM by inverting a set of extended point model equations which themselves have only recently been formulated. Here, we discuss and present the experimental evaluation of the practical feasibility of the DCF dead time correction algorithm to demonstrate its performance and applicability in nuclear safeguards applications. In order to test the validity and effectiveness of the dead time correction for quads and pents, 252Cf and SNM sources were measured in high efficiency neutron multiplicity counters at the Los Alamos National Laboratory (LANL) and the count rates were extracted up to the fifth order and corrected for dead time. To assess the DCF dead time correction, the corrected data are compared to the traditional dead time correction treatment within INCC. In conclusion, the DCF dead time correction is found to provide adequate dead time treatment for a broad range of count rates available in practical applications.

  30. Experimental evaluation of the extended Dytlewski-style dead time correction formalism for neutron multiplicity counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lockhart, M.; Henzlova, D.; Croft, S.

    Over the past few decades, neutron multiplicity counting has played an integral role in Special Nuclear Material (SNM) characterization pertaining to nuclear safeguards. Current neutron multiplicity analysis techniques use singles, doubles, and triples count rates because a methodology to extract and dead time correct higher order count rates (i.e. quads and pents) was not fully developed. This limitation is overcome by the recent extension of a popular dead time correction method developed by Dytlewski. This extended dead time correction algorithm, named Dytlewski-Croft-Favalli (DCF), is detailed in reference Croft and Favalli (2017), which gives an extensive explanation of the theory and implications of this new development. Dead time corrected results can then be used to assay SNM by inverting a set of extended point model equations which themselves have only recently been formulated. Here, we discuss and present the experimental evaluation of the practical feasibility of the DCF dead time correction algorithm to demonstrate its performance and applicability in nuclear safeguards applications. In order to test the validity and effectiveness of the dead time correction for quads and pents, 252Cf and SNM sources were measured in high efficiency neutron multiplicity counters at the Los Alamos National Laboratory (LANL) and the count rates were extracted up to the fifth order and corrected for dead time. To assess the DCF dead time correction, the corrected data are compared to the traditional dead time correction treatment within INCC. In conclusion, the DCF dead time correction is found to provide adequate dead time treatment for a broad range of count rates available in practical applications.

  31. Multiple-Choice Testing Using Immediate Feedback--Assessment Technique (IF AT®) Forms: Second-Chance Guessing vs. Second-Chance Learning?

    ERIC Educational Resources Information Center

    Merrel, Jeremy D.; Cirillo, Pier F.; Schwartz, Pauline M.; Webb, Jeffrey A.

    2015-01-01

    Multiple choice testing is a common but often ineffective method for evaluating learning. A newer approach, however, using Immediate Feedback Assessment Technique (IF AT®, Epstein Educational Enterprise, Inc.) forms, offers several advantages. In particular, a student learns immediately if his or her answer is correct and, in the case of an…

  32. Evaluation of five guidelines for option development in multiple-choice item-writing.

    PubMed

    Martínez, Rafael J; Moreno, Rafael; Martín, Irene; Trigo, M Eva

    2009-05-01

    This paper evaluates certain guidelines for writing multiple-choice test items. The analysis of the responses of 5013 subjects to 630 items from 21 university classroom achievement tests suggests that an option should not be heterogeneous in content relative to the other options, because such an error has a slight but harmful effect on item discrimination. This also occurs with the "None of the above" option when it is the correct one. In contrast, results do not show the supposedly negative effects of a different-length option, the use of specific determiners, or the use of the "All of the above" option, which not only decreases difficulty but also improves discrimination when it is the correct option.

  33. Pick-N Multiple Choice-Exams: A Comparison of Scoring Algorithms

    ERIC Educational Resources Information Center

    Bauer, Daniel; Holzer, Matthias; Kopp, Veronika; Fischer, Martin R.

    2011-01-01

    To compare different scoring algorithms for Pick-N multiple correct answer multiple-choice (MC) exams regarding test reliability, student performance, total item discrimination and item difficulty. Data from six 3rd year medical students' end of term exams in internal medicine from 2005 to 2008 at Munich University were analysed (1,255 students,…

  34. The Use of EEG as a Workload Assessment Tool in Flight Test

    DTIC Science & Technology

    1993-10-01

    resource, single pool, mental model (Wickens), which postulates that the human has a limited source of mental potential and when tasked with multiple...psychological spectrum presents an interesting challenge for future research. [Figure residue: EP amplitude (microvolts), single task vs. difficult task.] ...for example, they obtained a p value of .000025 for a single test and then applied a Bonferroni correction to yield a conservatively corrected value of p

  35. Genetic variation in cell death genes and risk of non-Hodgkin lymphoma.

    PubMed

    Schuetz, Johanna M; Daley, Denise; Graham, Jinko; Berry, Brian R; Gallagher, Richard P; Connors, Joseph M; Gascoyne, Randy D; Spinelli, John J; Brooks-Wilson, Angela R

    2012-01-01

    Non-Hodgkin lymphomas are a heterogeneous group of solid tumours that constitute the 5th highest cause of cancer mortality in the United States and Canada. Poor control of cell death in lymphocytes can lead to autoimmune disease or cancer, making genes involved in programmed cell death of lymphocytes logical candidate genes for lymphoma susceptibility. Using an established population-based study, we tested SNPs in lymphocyte cell death genes for association with NHL and NHL subtypes. Seventeen candidate genes were chosen based on biological function, with 123 SNPs tested. These included tagSNPs from HapMap and novel SNPs discovered by re-sequencing 47 cases in genes for which SNP representation was judged to be low. The main analysis, which estimated odds ratios by fitting data to an additive logistic regression model, used European ancestry samples that passed quality control measures (569 cases and 547 controls). A two-tiered approach for multiple testing correction was used: correction for the number of tests within each gene by permutation-based methodology, followed by correction for the number of genes tested using the false discovery rate. Variant rs928883, near miR-155, showed an association (OR per A-allele: 2.80 [95% CI: 1.63-4.82]; p(F) = 0.027) with marginal zone lymphoma that is significant after correction for multiple testing. This is the first reported association between a germline polymorphism at a miRNA locus and lymphoma.
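
    A compact sketch of the two-tiered strategy described here: a permutation-based correction of the minimum p-value within each gene, followed by false-discovery-rate control across genes. The per-gene SNP counts, the use of a t-test rather than additive logistic regression, and the permutation count are simplifying assumptions.

        import numpy as np
        from scipy.stats import ttest_ind
        from statsmodels.stats.multitest import multipletests

        # Tier 1: within-gene permutation correction of the minimum SNP p-value.
        # Tier 2: Benjamini-Hochberg FDR across genes.
        rng = np.random.default_rng(4)
        n_case, n_ctrl = 100, 100
        status = np.repeat([1, 0], [n_case, n_ctrl])
        genes = {f"gene{i}": rng.normal(0, 1, (n_case + n_ctrl, int(rng.integers(3, 10))))
                 for i in range(17)}                   # 17 candidate genes, SNP counts assumed

        def gene_p(snps, labels, n_perm=500):
            def min_p(lab):
                return ttest_ind(snps[lab == 1], snps[lab == 0], axis=0).pvalue.min()
            obs = min_p(labels)
            null = np.array([min_p(rng.permutation(labels)) for _ in range(n_perm)])
            # Add-one smoothing so the permutation p-value is never exactly zero.
            return (1 + (null <= obs).sum()) / (n_perm + 1)

        gene_pvals = np.array([gene_p(snps, status) for snps in genes.values()])
        reject, q, _, _ = multipletests(gene_pvals, alpha=0.05, method="fdr_bh")
        print(f"genes passing FDR: {reject.sum()} of {len(genes)}")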

  36. Beyond hypercorrection: remembering corrective feedback for low-confidence errors.

    PubMed

    Griffiths, Lauren; Higham, Philip A

    2018-02-01

    Correcting errors based on corrective feedback is essential to successful learning. Previous studies have found that corrections to high-confidence errors are better remembered than low-confidence errors (the hypercorrection effect). The aim of this study was to investigate whether corrections to low-confidence errors can also be successfully retained in some cases. Participants completed an initial multiple-choice test consisting of control, trick and easy general-knowledge questions, rated their confidence after answering each question, and then received immediate corrective feedback. After a short delay, they were given a cued-recall test consisting of the same questions. In two experiments, we found high-confidence errors to control questions were better corrected on the second test compared to low-confidence errors - the typical hypercorrection effect. However, low-confidence errors to trick questions were just as likely to be corrected as high-confidence errors. Most surprisingly, we found that memory for the feedback and original responses, not confidence or surprise, were significant predictors of error correction. We conclude that for some types of material, there is an effortful process of elaboration and problem solving prior to making low-confidence errors that facilitates memory of corrective feedback.

  37. Comparison of paragraph comprehension test scores with reading versus listening-reading and multiple-choice versus nominal recall administration techniques: justification for the bypass approach.

    PubMed

    Weinberg, W A; McLean, A; Snider, R L; Rintelmann, J W; Brumback, R A

    1989-12-01

    Eight groups of learning disabled children (N = 100), categorized by the clinical Lexical Paradigm as good readers or poor readers, were individually administered the Gilmore Oral Reading Test, Form D, by one of four input/retrieval methods: (1) the standardized method of administration in which the child reads each paragraph aloud and then answers five questions relating to the paragraph [read/recall method]; (2) the child reads each paragraph aloud and then for each question selects the correct answer from among three choices read by the examiner [read/choice method]; (3) the examiner reads each paragraph aloud and reads each of the five questions to the child to answer [listen/recall method]; and (4) the examiner reads each paragraph aloud and then for each question reads three multiple-choice answers from which the child selects the correct answer [listen/choice method]. The major difference in scores was between the groups tested by the recall versus the orally read multiple-choice methods. This study indicated that poor readers who listened to the material and were tested by orally read multiple-choice format could perform as well as good readers. The performance of good readers was not affected by listening or by the method of testing. The multiple-choice testing improved the performance of poor readers independent of the input method. This supports the arguments made previously that a "bypass approach" to education of poor readers in which testing is accomplished using an orally read multiple-choice format can enhance the child's school performance on reading-related tasks. Using a listening while reading input method may further enhance performance.

  38. GUIDELINES FOR INSTALLATION AND SAMPLING OF SUB-SLAB VAPOR PROBES TO SUPPORT ASSESSMENT OF VAPOR INTRUSION

    EPA Science Inventory

    The purpose of this paper is to provide guidelines for sub-slab sampling using dedicated vapor probes. Use of dedicated vapor probes allows for multiple sample events before and after corrective action and for vacuum testing to enhance the design and monitoring of a corrective m...

  39. Atmospheric correction of the ocean color observations of the medium resolution imaging spectrometer (MERIS)

    NASA Astrophysics Data System (ADS)

    Antoine, David; Morel, Andre

    1997-02-01

    An algorithm is proposed for the atmospheric correction of the ocean color observations by the MERIS instrument. The principle of the algorithm, which accounts for all multiple scattering effects, is presented. The algorithm is then tested, and its accuracy assessed in terms of errors in the retrieved marine reflectances.

  40. Tests of multiplicative models in psychology: a case study using the unified theory of implicit attitudes, stereotypes, self-esteem, and self-concept.

    PubMed

    Blanton, Hart; Jaccard, James

    2006-01-01

    Theories that posit multiplicative relationships between variables are common in psychology. A. G. Greenwald et al. recently presented a theory that explicated relationships between group identification, group attitudes, and self-esteem. Their theory posits a multiplicative relationship between concepts when predicting a criterion variable. Greenwald et al. suggested analytic strategies to test their multiplicative model that researchers might assume are appropriate for testing multiplicative models more generally. The theory and analytic strategies of Greenwald et al. are used as a case study to show the strong measurement assumptions that underlie certain tests of multiplicative models. It is shown that the approach used by Greenwald et al. can lead to declarations of theoretical support when the theory is wrong as well as rejection of the theory when the theory is correct. A simple strategy for testing multiplicative models that makes weaker measurement assumptions than the strategy proposed by Greenwald et al. is suggested and discussed.
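
    The "multiplicative relationship" at issue corresponds to an interaction term in a regression model; the snippet below fits such a moderated regression on simulated data, which is one common way these models are tested (and which carries exactly the kind of measurement-level assumptions the authors caution about). Variable names, effect sizes, and sample size are assumptions.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Moderated regression: criterion ~ identification + attitude + their product.
        rng = np.random.default_rng(5)
        n = 300
        ident = rng.normal(0, 1, n)                    # group identification (assumed scale)
        attitude = rng.normal(0, 1, n)                 # group attitude (assumed scale)
        esteem = 0.3 * ident + 0.2 * attitude + 0.4 * ident * attitude + rng.normal(0, 1, n)
        df = pd.DataFrame({"esteem": esteem, "ident": ident, "attitude": attitude})

        fit = smf.ols("esteem ~ ident * attitude", data=df).fit()
        print(fit.params["ident:attitude"], fit.pvalues["ident:attitude"])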

  41. [Continuing medical education: how to write multiple choice questions].

    PubMed

    Soler Fernández, R; Méndez Díaz, C; Rodríguez García, E

    2013-06-01

    Evaluating professional competence in medicine is a difficult but indispensable task because it makes it possible to evaluate, at different times and from different perspectives, the extent to which the knowledge, skills, and values required for exercising the profession have been acquired. Tests based on multiple choice questions have been and continue to be among the most useful tools for objectively evaluating learning in medicine. When these tests are well designed and correctly used, they can stimulate learning and even measure higher cognitive skills. Designing a multiple choice test is a difficult task that requires knowledge of the material to be tested and of the methodology of test preparation as well as time to prepare the test. The aim of this article is to review what can be evaluated through multiple choice tests, the rules and guidelines that should be taken into account when writing multiple choice questions, the different formats that can be used, the most common errors in elaborating multiple choice tests, and how to analyze the results of the test to verify its quality. Copyright © 2012 SERAM. Published by Elsevier Espana. All rights reserved.

  42. Gender and Ethnicity Differences in Multiple-Choice Testing. Effects of Self-Assessment and Risk-Taking Propensity

    DTIC Science & Technology

    1993-05-01

    correctness of the response provides some advantages. They are: 1. Increased reliability of the test; 2. Examinees pay more attention to the multiple...their choice of test date. Each sign-up sheet was divided into four cells: Non-Hispanic males and females and Hispanic males and females. ...certain prestige and financial rewards; or entering a conservatory of music for advanced training with a well-known pianist. Mr. H realizes that even

  43. Are Learning Disabled Students "Test-Wise?": An Inquiry into Reading Comprehension Test Items.

    ERIC Educational Resources Information Center

    Scruggs, Thomas E.; Lifson, Steve

    The ability to correctly answer reading comprehension test items, without having read the accompanying reading passage, was compared for third grade learning disabled students and their peers from a regular classroom. In the first experiment, fourteen multiple choice items were selected from the Stanford Achievement Test. No reading passages were…

  44. The Positive and Negative Effects of Science Concept Tests on Student Conceptual Understanding

    ERIC Educational Resources Information Center

    Chang, Chun-Yen; Yeh, Ting-Kuang; Barufaldi, James P.

    2010-01-01

    This study explored the phenomenon of testing effect during science concept assessments, including the mechanism behind it and its impact upon a learner's conceptual understanding. The participants consisted of 208 high school students, in either the 11th or 12th grade. Three types of tests (traditional multiple-choice test, correct concept test,…

  45. A Comparison of Three Tests of Mediation

    ERIC Educational Resources Information Center

    Warbasse, Rosalia E.

    2009-01-01

    A simulation study was conducted to evaluate the performance of three tests of mediation: the bias-corrected and accelerated bootstrap (Efron & Tibshirani, 1993), the asymmetric confidence limits test (MacKinnon, 2008), and a multiple regression approach described by Kenny, Kashy, and Bolger (1998). The evolution of these methods is reviewed and…

  46. Passage Independence within Standardized Reading Comprehension Tests

    ERIC Educational Resources Information Center

    Roy-Charland, Annie; Colangelo, Gabrielle; Foglia, Victoria; Reguigui, Leïla

    2017-01-01

    In tests used to measure reading comprehension, validity is important in obtaining accurate results. Unfortunately, studies have shown that people can correctly answer some questions of these tests without reading the related passage. These findings bring forth the need to address whether this phenomenon is observed in multiple-choice only tests…

  7. Initial Experiences with Machine-Assisted Reconsiderative Test Scoring: A New Method for Partial Credit and Multiple Correct Responses.

    ERIC Educational Resources Information Center

    Anderson, Paul S.

    Initial experiences with computer-assisted reconsiderative scoring are described. Reconsiderative scoring occurs when student responses are received and reviewed by the teacher before points for correctness are assigned. Manually scored completion-style questions are reconsiderative. A new method of machine assistance produces an item analysis on…

  8. 40 CFR 65.158 - Performance test procedures for control devices.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... simultaneously from multiple loading arms, each run shall represent at least one complete tank truck or tank car... the combustion air or as a secondary fuel into a boiler or process heater with a design capacity less... corrected to 3 percent oxygen if a combustion device is the control device. (A) The emission rate correction...

  9. Evaluation of the laboratory mouse model for screening topical mosquito repellents.

    PubMed

    Rutledge, L C; Gupta, R K; Wirtz, R A; Buescher, M D

    1994-12-01

    Eight commercial repellents were tested against Aedes aegypti 0 and 4 h after application in serial dilution to volunteers and laboratory mice. Results were analyzed by multiple regression of percentage of biting (probit scale) on dose (logarithmic scale) and time. Empirical correction terms for conversion of values obtained in tests on mice to values expected in tests on human volunteers were calculated from data obtained on 4 repellents and evaluated with data obtained on 4 others. Corrected values from tests on mice did not differ significantly from values obtained in tests on volunteers. Test materials used in the study were dimethyl phthalate, butopyronoxyl, butoxy polypropylene glycol, MGK Repellent 11, deet, ethyl hexanediol, Citronyl, and dibutyl phthalate.
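
    The analysis described above (regression of probit-transformed biting percentages on log dose and time) can be sketched as follows; the doses, replication, and simulated responses are invented and only illustrate the transformation and model form.

        # Probit-scale dose-response regression: probit(% biting) ~ log10(dose) + time.
        # Doses, replication, and responses are simulated for illustration only.
        import numpy as np
        import statsmodels.api as sm
        from scipy.stats import norm

        rng = np.random.default_rng(1)
        dose = np.tile(np.repeat([0.01, 0.05, 0.25, 1.25], 2), 5)    # serial dilution (made-up units)
        time_h = np.tile([0.0, 4.0], 20)                             # hours after application

        # Simulate percentage biting that falls with dose and rises with time
        p_true = norm.cdf(-0.4 - 0.9 * np.log10(dose) + 0.15 * time_h)
        pct = np.clip(rng.binomial(20, p_true) / 20, 0.01, 0.99)     # 20 mosquitoes per test

        y = norm.ppf(pct)                                            # probit transform
        X = sm.add_constant(np.column_stack([np.log10(dose), time_h]))
        fit = sm.OLS(y, X).fit()
        print(fit.params)                                            # intercept, log-dose slope, time slope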

  10. Atypical nucleus accumbens morphology in psychopathy: another limbic piece in the puzzle.

    PubMed

    Boccardi, Marina; Bocchetta, Martina; Aronen, Hannu J; Repo-Tiihonen, Eila; Vaurio, Olli; Thompson, Paul M; Tiihonen, Jari; Frisoni, Giovanni B

    2013-01-01

    Psychopathy has been associated with increased putamen and striatum volumes. The nucleus accumbens - a key structure in reversal learning, less effective in psychopathy - has not yet received specific attention. Moreover, basal ganglia morphology has never been explored. We examined the morphology of the caudate, putamen and accumbens, manually segmented from magnetic resonance images of 26 offenders (age: 32.5 ± 8.4) with medium-high psychopathy (mean PCL-R=30 ± 5) and 25 healthy controls (age: 34.6 ± 10.8). Local differences were statistically modeled using a surface-based radial distance mapping method (p<0.05; multiple comparisons correction through permutation tests). In psychopathy, the caudate and putamen had normal global volume, but different morphology, significant after correction for multiple comparisons, for the right dorsal putamen (permutation test: p=0.02). The volume of the nucleus accumbens was 13% smaller in psychopathy (p corrected for multiple comparisons <0.006). The atypical morphology consisted of predominant anterior hypotrophy bilaterally (10-30%). Caudate and putamen local morphology displayed negative correlation with the lifestyle factor of the PCL-R (permutation test: p=0.05 and 0.03). From these data, psychopathy appears to be associated with an atypical striatal morphology, with highly significant global and local differences of the accumbens. This is consistent with the clinical syndrome and with theories of limbic involvement. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. The catechol-O-methyltransferase gene (COMT) and cognitive function from childhood through adolescence

    PubMed Central

    Gaysina, Darya; Xu, Man K.; Barnett, Jennifer H.; Croudace, Tim J.; Wong, Andrew; Richards, Marcus; Jones, Peter B.

    2013-01-01

    Genetic variation in the catechol-O-methyltransferase gene (COMT) can influence cognitive function, and this effect may depend on developmental stage. Using a large representative British birth cohort, we investigated the effect of COMT on cognitive function (verbal and non-verbal) at ages 8 and 15 years, taking into account the possible modifying effect of pubertal stage. Five functional COMT polymorphisms, rs6269, rs4818, rs4680, rs737865 and rs165599, were analysed. Associations between COMT polymorphisms and cognition were tested using regression and latent variable structural equation modelling (SEM). Before correction for multiple testing, COMT rs737865 showed association with reading comprehension, verbal ability and global cognition at age 15 years in pubescent boys only. Although there was some evidence for age- and sex-specific effects of COMT rs737865, none remained significant after correction for multiple testing. Further studies are necessary to draw firmer conclusions. PMID:23178897

  12. Mass univariate analysis of event-related brain potentials/fields I: a critical tutorial review.

    PubMed

    Groppe, David M; Urbach, Thomas P; Kutas, Marta

    2011-12-01

    Event-related potentials (ERPs) and magnetic fields (ERFs) are typically analyzed via ANOVAs on mean activity in a priori windows. Advances in computing power and statistics have produced an alternative, mass univariate analyses consisting of thousands of statistical tests and powerful corrections for multiple comparisons. Such analyses are most useful when one has little a priori knowledge of effect locations or latencies, and for delineating effect boundaries. Mass univariate analyses complement and, at times, obviate traditional analyses. Here we review this approach as applied to ERP/ERF data and four methods for multiple comparison correction: strong control of the familywise error rate (FWER) via permutation tests, weak control of FWER via cluster-based permutation tests, false discovery rate control, and control of the generalized FWER. We end with recommendations for their use and introduce free MATLAB software for their implementation. Copyright © 2011 Society for Psychophysiological Research.
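
    For readers unfamiliar with the corrections being compared, the sketch below applies two of them, Bonferroni (strong FWER control) and Benjamini-Hochberg FDR, to a grid of simulated point-wise t-tests; the data dimensions and effect are invented.

        # Bonferroni FWER control and Benjamini-Hochberg FDR applied to a grid of
        # point-wise two-sample t-tests on simulated data; dimensions are invented.
        import numpy as np
        from scipy.stats import ttest_ind
        from statsmodels.stats.multitest import multipletests

        rng = np.random.default_rng(2)
        n_subj, n_tests = 20, 2000                   # e.g. channels x time points, flattened
        cond_a = rng.normal(0.0, 1.0, (n_subj, n_tests))
        cond_b = rng.normal(0.0, 1.0, (n_subj, n_tests))
        cond_b[:, :100] += 0.8                       # a true effect in the first 100 tests

        _, pvals = ttest_ind(cond_a, cond_b, axis=0)
        bonf_hits = (pvals < 0.05 / n_tests).sum()                         # familywise control
        fdr_reject, _, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
        print("Bonferroni hits:", bonf_hits, " FDR hits:", fdr_reject.sum())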

  13. Automated plasma control with optical emission spectroscopy

    NASA Astrophysics Data System (ADS)

    Ward, P. P.

    Plasma etching and desmear processes for printed wiring board (PWB) manufacture are difficult to predict and control. Non-uniformity of most plasma processes and sensitivity to environmental changes make it difficult to maintain process stability from day to day. To assure plasma process performance, weight loss coupons or post-plasma destructive testing must be used. These techniques are not real-time methods, however, and do not allow for immediate diagnosis and process correction. These tests often require scrapping some fraction of a batch to ensure the integrity of the rest. Since these tests verify a successful cycle with post-plasma diagnostics, poor test results often determine that a batch is substandard and the resulting parts unusable. These tests add substantially to the overall fabrication cost. A more efficient method of testing would allow for constant monitoring of plasma conditions and process control. Process anomalies should be detected and corrected before the parts being treated are damaged. Real time monitoring would allow for instantaneous corrections. Multiple site monitoring would allow for process mapping within one system or simultaneous monitoring of multiple systems. Optical emission spectroscopy conducted external to the plasma apparatus would allow for this sort of multifunctional analysis without perturbing the glow discharge. In this paper, optical emission spectroscopy for non-intrusive, in situ process control will be explored, along with applications of this technique to process control, failure analysis, and endpoint determination in PWB manufacture.

  14. The Effects of Item by Item Feedback Given during an Ability Test.

    ERIC Educational Resources Information Center

    Whetton, C.; Childs, R.

    1981-01-01

    Answer-until-correct (AUC) is a procedure for providing feedback during a multiple-choice test, giving an increased range of scores. The performance of secondary students on a verbal ability test using AUC procedures was compared with a group using conventional instructions. AUC scores considerably enhanced reliability but not validity.…

  15. Learning From Tests: Facilitation of Delayed Recall by Initial Recognition Alternatives.

    ERIC Educational Resources Information Center

    Whitten, William B., II; Leonard, Janet Mauriello

    1980-01-01

    Two experiments were designed to determine the effects of multiple-choice recognition test alternatives on subsequent memory for the correct answers. Results of both experiments are interpreted as demonstrations of the principle that long-term retention is facilitated when memory evaluation occurs during initial recognition tests. (Author/RD)

  16. How well does multiple OCR error correction generalize?

    NASA Astrophysics Data System (ADS)

    Lund, William B.; Ringger, Eric K.; Walker, Daniel D.

    2013-12-01

    As the digitization of historical documents, such as newspapers, becomes more common, archive patrons' need for accurate digital text from those documents increases. Building on our earlier work, the contributions of this paper are: (1) demonstrating the applicability of novel methods for correcting optical character recognition (OCR) errors on disparate data sets, including a new synthetic training set; (2) enhancing the correction algorithm with novel features; and (3) assessing the data requirements of the correction learning method. First, we correct errors using conditional random fields (CRF) trained on synthetic training data sets in order to demonstrate the applicability of the methodology to unrelated test sets. Second, we show the strength of lexical features from the training sets on two unrelated test sets, yielding a 6.52% relative reduction in word error rate (WER) on the test sets. New features capture the recurrence of hypothesis tokens and yield an additional relative reduction in WER of 2.30%. Further, we show that only 2.0% of the full training corpus of over 500,000 feature cases is needed to achieve correction results comparable to those using the entire training corpus, effectively reducing both the complexity of the training process and the learned correction model.

  17. The MAX Statistic is Less Powerful for Genome Wide Association Studies Under Most Alternative Hypotheses.

    PubMed

    Shifflett, Benjamin; Huang, Rong; Edland, Steven D

    2017-01-01

    Genotypic association studies are prone to inflated type I error rates if multiple hypothesis testing is performed, e.g., sequentially testing for recessive, multiplicative, and dominant risk. Alternatives to multiple hypothesis testing include the model-independent genotypic χ² test, the efficiency robust MAX statistic, which corrects for multiple comparisons but with some loss of power, or a single Armitage test for multiplicative trend, which has optimal power when the multiplicative model holds but with some loss of power when dominant or recessive models underlie the genetic association. We used Monte Carlo simulations to describe the relative performance of these three approaches under a range of scenarios. All three approaches maintained their nominal type I error rates. The genotypic χ² and MAX statistics were more powerful when testing a strictly recessive genetic effect or when testing a dominant effect when the allele frequency was high. The Armitage test for multiplicative trend was most powerful for the broad range of scenarios where heterozygote risk is intermediate between recessive and dominant risk. Moreover, all tests had limited power to detect recessive genetic risk unless the sample size was large, and conversely all tests were relatively well powered to detect dominant risk. Taken together, these results suggest the general utility of the multiplicative trend test when the underlying genetic model is unknown.
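
    A minimal implementation of the Armitage (Cochran-Armitage) trend test is shown below, using the identity that the trend chi-square equals N times the squared Pearson correlation between the 0/1 phenotype and the genotype score; the genotype counts are made up.

        # Cochran-Armitage trend test via the N * r^2 identity, where r is the
        # Pearson correlation between the 0/1 phenotype and the genotype score.
        # The genotype counts below are illustrative only.
        import numpy as np
        from scipy.stats import chi2

        def armitage_trend(case_counts, control_counts, scores=(0, 1, 2)):
            """Trend chi-square and p-value for a 2 x k genotype table."""
            y = np.concatenate([np.ones(int(sum(case_counts))),
                                np.zeros(int(sum(control_counts)))])
            x = np.concatenate([np.repeat(scores, case_counts),
                                np.repeat(scores, control_counts)])
            r = np.corrcoef(x, y)[0, 1]
            stat = len(y) * r ** 2
            return stat, chi2.sf(stat, df=1)

        # Counts of genotypes aa/Aa/AA in cases and controls (made-up numbers)
        stat, p = armitage_trend(case_counts=[90, 160, 70], control_counts=[120, 150, 50])
        print(f"trend chi-square = {stat:.2f}, p = {p:.4f}")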

  18. The frequency of item writing flaws in multiple-choice questions used in high stakes nursing assessments.

    PubMed

    Tarrant, Marie; Knierim, Aimee; Hayes, Sasha K; Ware, James

    2006-12-01

    Multiple-choice questions are a common assessment method in nursing examinations. Few nurse educators, however, have formal preparation in constructing multiple-choice questions. Consequently, questions used in baccalaureate nursing assessments often contain item-writing flaws, or violations to accepted item-writing guidelines. In one nursing department, 2770 MCQs were collected from tests and examinations administered over a five-year period from 2001 to 2005. Questions were evaluated for 19 frequently occurring item-writing flaws, for cognitive level, for question source, and for the distribution of correct answers. Results show that almost half (46.2%) of the questions contained violations of item-writing guidelines and over 90% were written at low cognitive levels. Only a small proportion of questions were teacher generated (14.1%), while 36.2% were taken from testbanks and almost half (49.4%) had no source identified. MCQs written at a lower cognitive level were significantly more likely to contain item-writing flaws. While there was no relationship between the source of the question and item-writing flaws, teacher-generated questions were more likely to be written at higher cognitive levels (p<0.001). Correct answers were evenly distributed across all four options and no bias was noted in the placement of correct options. Further training in item-writing is recommended for all faculty members who are responsible for developing tests. Pre-test review and quality assessment is also recommended to reduce the occurrence of item-writing flaws and to improve the quality of test questions.
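
    The check on whether correct answers were evenly distributed across the four options can be reproduced with a simple goodness-of-fit test, as in the sketch below; the counts are hypothetical, not those of the study.

        # Chi-square goodness-of-fit test for an even spread of correct answers
        # across options A-D; counts are hypothetical.
        from scipy.stats import chisquare

        correct_option_counts = [702, 688, 695, 685]     # A, B, C, D (illustrative)
        stat, p = chisquare(correct_option_counts)       # H0: uniform across options
        print(f"chi-square = {stat:.2f}, p = {p:.3f}")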

  19. Small-Sample Adjustments for Tests of Moderators and Model Fit in Robust Variance Estimation in Meta-Regression

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Pustejovsky, James E.

    2015-01-01

    Randomized experiments are commonly used to evaluate the effectiveness of educational interventions. The goal of the present investigation is to develop small-sample corrections for multiple contrast hypothesis tests (i.e., F-tests) such as the omnibus test of meta-regression fit or a test for equality of three or more levels of a categorical…

  20. Correction of the significance level when attempting multiple transformations of an explanatory variable in generalized linear models

    PubMed Central

    2013-01-01

    Background In statistical modeling, finding the most favorable coding for an explanatory quantitative variable involves many tests. This process creates a multiple testing problem and requires correction of the significance level. Methods For each coding, a test on the nullity of the coefficient associated with the new coded variable is computed. The selected coding corresponds to that associated with the largest statistical test (or equivalently the smallest p-value). In the context of the Generalized Linear Model, Liquet and Commenges (Stat Probability Lett, 71:33–38, 2005) proposed an asymptotic correction of the significance level. This procedure, based on the score test, has been developed for dichotomous and Box-Cox transformations. In this paper, we suggest the use of resampling methods to estimate the significance level for categorical transformations with more than two levels and, by definition, those that involve more than one parameter in the model. The categorical transformation is a more flexible way to explore the unknown shape of the effect between an explanatory and a dependent variable. Results The simulations we ran in this study showed good performance of the proposed methods. These methods were illustrated using data from a study of the relationship between cholesterol and dementia. Conclusion The algorithms were implemented using R, and the associated CPMCGLM R package is available on the CRAN. PMID:23758852
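
    The resampling idea can be illustrated with a generic min-p permutation sketch in which several dichotomizations of an explanatory variable are screened and the smallest p-value is re-calibrated against its permutation null; this shows only the general principle, not the CPMCGLM implementation, and all data are simulated.

        # Permutation-based adjustment of the significance level when the "best" of
        # several dichotomizations of an explanatory variable is kept. Generic
        # min-p resampling sketch with simulated data, not the CPMCGLM package.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        n = 200
        x = rng.normal(size=n)                        # explanatory variable (e.g. cholesterol)
        y = rng.normal(size=n)                        # outcome, unrelated to x here (null case)
        cutpoints = np.quantile(x, [0.25, 0.5, 0.75]) # candidate dichotomizations

        def min_p(xv, yv):
            """Smallest p-value over the candidate dichotomous codings of xv."""
            return min(stats.ttest_ind(yv[xv > c], yv[xv <= c]).pvalue for c in cutpoints)

        observed = min_p(x, y)

        # Null distribution of the minimum p-value under permutation of the outcome
        null_min_p = np.array([min_p(x, rng.permutation(y)) for _ in range(2000)])
        adjusted_p = np.mean(null_min_p <= observed)
        print(f"raw min p = {observed:.3f}, resampling-adjusted p = {adjusted_p:.3f}")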

  1. Multiple Choice Test Bias Uncovered by Use of an "I Don't Know" Alternative.

    ERIC Educational Resources Information Center

    Sherman, Susan W.

    The multiple-choice science exercises used by the National Assessment of Educational Progress include an "I Don't Know" (IDK) alternative to estimate more accurately knowledge of groups of respondents. Group percentages of IDK responses were examined and compared with correct responses to see if the IDK introduces bias. Variance common…

  2. Regression-based pediatric norms for the brief visuospatial memory test: revised and the symbol digit modalities test.

    PubMed

    Smerbeck, A M; Parrish, J; Yeh, E A; Hoogs, M; Krupp, Lauren B; Weinstock-Guttman, B; Benedict, R H B

    2011-04-01

    The Brief Visuospatial Memory Test - Revised (BVMTR) and the Symbol Digit Modalities Test (SDMT) oral-only administration are known to be sensitive to cerebral disease in adult samples, but pediatric norms are not available. A demographically balanced sample of healthy control children (N = 92) ages 6-17 was tested with the BVMTR and SDMT. Multiple regression analysis (MRA) was used to develop demographically controlled normative equations. This analysis provided equations that were then used to construct demographically adjusted z-scores for the BVMTR Trial 1, Trial 2, Trial 3, Total Learning, and Delayed Recall indices, as well as the SDMT total correct score. To demonstrate the utility of this approach, a comparison group of children with acute disseminated encephalomyelitis (ADEM) or multiple sclerosis (MS) were also assessed. We find that these visual processing tests discriminate neurological patients from controls. As the tests are validated in adult multiple sclerosis, they are likely to be useful in monitoring pediatric onset multiple sclerosis patients as they transition into adulthood.
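
    Regression-based norming of the kind described above can be sketched as follows: the expected raw score is predicted from age in healthy controls, and an observed score is converted to a demographically adjusted z-score; the control data and coefficients are simulated, not the published equations.

        # Regression-based norming: predict the expected raw score from age in
        # healthy controls, then express an observed score as an adjusted z-score.
        # Control data are simulated; these are not the published equations.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        age = rng.uniform(6, 17, 92)                               # control ages (years)
        raw = 20 + 2.0 * age + rng.normal(0, 5, age.size)          # e.g. SDMT total correct

        fit = sm.OLS(raw, sm.add_constant(age)).fit()
        resid_sd = np.sqrt(fit.mse_resid)                          # residual standard deviation

        def adjusted_z(score, patient_age):
            expected = fit.params[0] + fit.params[1] * patient_age
            return (score - expected) / resid_sd

        print(f"z for a 12-year-old scoring 30: {adjusted_z(30, 12):.2f}")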

  3. Force Concept Inventory-based multiple-choice test for investigating students' representational consistency

    NASA Astrophysics Data System (ADS)

    Nieminen, Pasi; Savinainen, Antti; Viiri, Jouni

    2010-07-01

    This study investigates students’ ability to interpret multiple representations consistently (i.e., representational consistency) in the context of the force concept. For this purpose we developed the Representational Variant of the Force Concept Inventory (R-FCI), which makes use of nine items from the 1995 version of the Force Concept Inventory (FCI). These original FCI items were redesigned using various representations (such as motion map, vectorial and graphical), yielding 27 multiple-choice items concerning four central concepts underpinning the force concept: Newton’s first, second, and third laws, and gravitation. We provide some evidence for the validity and reliability of the R-FCI; this analysis is limited to the student population of one Finnish high school. The students took the R-FCI at the beginning and at the end of their first high school physics course. We found that students’ (n=168) representational consistency (whether scientifically correct or not) varied considerably depending on the concept. On average, representational consistency and scientifically correct understanding increased during the instruction, although in the post-test only a few students performed consistently both in terms of representations and scientifically correct understanding. We also compared students’ (n=87) results of the R-FCI and the FCI, and found that they correlated quite well.

  4. New Stuff in I/O (In-Baskets and Orals). The Development, Administration and Scoring of In-Baskets and Orals for the New York State Correction Captain Examination.

    ERIC Educational Resources Information Center

    Kaiser, Paul D.; Brull, Harry

    The design, administration, scoring, and results of the 1993 New York State Correctional Captain Examination are described. The examination was administered to 405 candidates. As in previous Sergeant and Lieutenant examinations, candidates also completed latent image written simulation problems and open/closed book multiple choice test components.…

  5. Project Physics Tests 5, Models of the Atom.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Harvard Project Physics.

    Test items relating to Project Physics Unit 5 are presented in this booklet. Included are 70 multiple-choice and 23 problem-and-essay questions. Concepts of atomic model are examined on aspects of relativistic corrections, electron emission, photoelectric effects, Compton effect, quantum theories, electrolysis experiments, atomic number and mass,…

  6. Statistical testing and power analysis for brain-wide association study.

    PubMed

    Gong, Weikang; Wan, Lin; Lu, Wenlian; Ma, Liang; Cheng, Fan; Cheng, Wei; Grünewald, Stefan; Feng, Jianfeng

    2018-04-05

    The identification of connexel-wise associations, which involves examining functional connectivities between pairwise voxels across the whole brain, is both statistically and computationally challenging. Although such a connexel-wise methodology has recently been adopted by brain-wide association studies (BWAS) to identify connectivity changes in several mental disorders, such as schizophrenia, autism and depression, multiple-comparison correction and power analysis methods designed specifically for connexel-wise analysis are still lacking. Therefore, we herein report the development of a rigorous statistical framework for connexel-wise significance testing based on the Gaussian random field theory. It includes controlling the family-wise error rate (FWER) of multiple hypothesis tests using topological inference methods, and calculating power and sample size for a connexel-wise study. Our theoretical framework can control the false-positive rate accurately, as validated empirically using two resting-state fMRI datasets. Compared with Bonferroni correction and false discovery rate (FDR), it can reduce the false-positive rate and increase statistical power by appropriately utilizing the spatial information of fMRI data. Importantly, our method bypasses the need for non-parametric permutation to correct for multiple comparisons; thus, it can efficiently tackle large datasets with high-resolution fMRI images. The utility of our method is shown in a case-control study. Our approach can identify altered functional connectivities in a major depression disorder dataset, whereas existing methods fail. A software package is available at https://github.com/weikanggong/BWAS. Copyright © 2018 Elsevier B.V. All rights reserved.
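
    The sketch below is not the Gaussian-random-field machinery of the paper; it only illustrates how a corrected per-test alpha enters a power calculation for a single connexel-level two-sample test, using a Bonferroni-style threshold and invented numbers.

        # Generic power sketch for one connexel-wise two-sample test at a
        # Bonferroni-corrected threshold; effect sizes and test count are invented
        # and this is not the GRF-based procedure described above.
        import numpy as np
        from scipy.stats import norm

        n_connexels = 1_000_000                      # number of pairwise connectivities tested (made up)
        alpha_per_test = 0.05 / n_connexels
        z_crit = norm.isf(alpha_per_test / 2)        # two-sided critical value

        def power(d, n_per_group):
            """Approximate power of a two-sample z-test for effect size d."""
            ncp = d * np.sqrt(n_per_group / 2)
            return norm.sf(z_crit - ncp) + norm.cdf(-z_crit - ncp)

        for n in (100, 200, 400, 800):
            print(n, round(power(0.5, n), 3))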

  7. Analysis of 30 Genes (355 SNPS) Related to Energy Homeostasis for Association with Adiposity in European-American and Yup'ik Eskimo Populations

    PubMed Central

    Chung, Wendy K.; Patki, Amit; Matsuoka, Naoki; Boyer, Bert B.; Liu, Nianjun; Musani, Solomon K.; Goropashnaya, Anna V.; Tan, Perciliz L.; Katsanis, Nicholas; Johnson, Stephen B.; Gregersen, Peter K.; Allison, David B.; Leibel, Rudolph L.; Tiwari, Hemant K.

    2009-01-01

    Objective Human adiposity is highly heritable, but few of the genes that predispose to obesity in most humans are known. We tested candidate genes in pathways related to food intake and energy expenditure for association with measures of adiposity. Methods We studied 355 genetic variants in 30 candidate genes in 7 molecular pathways related to obesity in two groups of adult subjects: 1,982 unrelated European Americans living in the New York metropolitan area drawn from the extremes of their body mass index (BMI) distribution and 593 related Yup'ik Eskimos living in rural Alaska characterized for BMI, body composition, waist circumference, and skin fold thicknesses. Data were analyzed by using a mixed model in conjunction with a false discovery rate (FDR) procedure to correct for multiple testing. Results After correcting for multiple testing, two single nucleotide polymorphisms (SNPs) in Ghrelin (GHRL) (rs35682 and rs35683) were associated with BMI in the New York European Americans. This association was not replicated in the Yup'ik participants. There was no evidence for gene × gene interactions among genes within the same molecular pathway after adjusting for multiple testing via FDR control procedure. Conclusion Genetic variation in GHRL may have a modest impact on BMI in European Americans. PMID:19077438

  8. Analysis of 30 genes (355 SNPS) related to energy homeostasis for association with adiposity in European-American and Yup'ik Eskimo populations.

    PubMed

    Chung, Wendy K; Patki, Amit; Matsuoka, Naoki; Boyer, Bert B; Liu, Nianjun; Musani, Solomon K; Goropashnaya, Anna V; Tan, Perciliz L; Katsanis, Nicholas; Johnson, Stephen B; Gregersen, Peter K; Allison, David B; Leibel, Rudolph L; Tiwari, Hemant K

    2009-01-01

    Human adiposity is highly heritable, but few of the genes that predispose to obesity in most humans are known. We tested candidate genes in pathways related to food intake and energy expenditure for association with measures of adiposity. We studied 355 genetic variants in 30 candidate genes in 7 molecular pathways related to obesity in two groups of adult subjects: 1,982 unrelated European Americans living in the New York metropolitan area drawn from the extremes of their body mass index (BMI) distribution and 593 related Yup'ik Eskimos living in rural Alaska characterized for BMI, body composition, waist circumference, and skin fold thicknesses. Data were analyzed by using a mixed model in conjunction with a false discovery rate (FDR) procedure to correct for multiple testing. After correcting for multiple testing, two single nucleotide polymorphisms (SNPs) in Ghrelin (GHRL) (rs35682 and rs35683) were associated with BMI in the New York European Americans. This association was not replicated in the Yup'ik participants. There was no evidence for gene x gene interactions among genes within the same molecular pathway after adjusting for multiple testing via FDR control procedure. Genetic variation in GHRL may have a modest impact on BMI in European Americans.

  9. Improving the analysis of slug tests

    USGS Publications Warehouse

    McElwee, C.D.

    2002-01-01

    This paper examines several techniques that have the potential to improve the quality of slug test analysis. These techniques are applicable in the range from low hydraulic conductivities with overdamped responses to high hydraulic conductivities with nonlinear oscillatory responses. Four techniques for improving slug test analysis will be discussed: use of an extended capability nonlinear model, sensitivity analysis, correction for acceleration and velocity effects, and use of multiple slug tests. The four-parameter nonlinear slug test model used in this work is shown to allow accurate analysis of slug tests with widely differing character. The parameter β represents a correction to the water column length caused primarily by radius variations in the wellbore and is most useful in matching the oscillation frequency and amplitude. The water column velocity at slug initiation (V0) is an additional model parameter, which would ideally be zero but may not be due to the initiation mechanism. The remaining two model parameters are A (parameter for nonlinear effects) and K (hydraulic conductivity). Sensitivity analysis shows that in general β and V0 have the lowest sensitivity and K usually has the highest. However, for very high K values the sensitivity to A may surpass the sensitivity to K. Oscillatory slug tests involve higher accelerations and velocities of the water column; thus, the pressure transducer responses are affected by these factors and the model response must be corrected to allow maximum accuracy for the analysis. The performance of multiple slug tests will allow some statistical measure of the experimental accuracy and of the reliability of the resulting aquifer parameters. © 2002 Elsevier Science B.V. All rights reserved.
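
    As a generic illustration of fitting an oscillatory slug-test response by nonlinear least squares, the sketch below fits a simple damped-oscillation model with scipy; it is not the four-parameter model of the paper, and the parameter names and values are invented.

        # Illustrative nonlinear fit of an oscillatory head response with
        # scipy.optimize.curve_fit; a generic damped-oscillation model, not the
        # four-parameter model of the paper.
        import numpy as np
        from scipy.optimize import curve_fit

        def head_response(t, h0, damping, omega, phase):
            """Damped oscillation of normalized head displacement."""
            return h0 * np.exp(-damping * t) * np.cos(omega * t + phase)

        rng = np.random.default_rng(5)
        t = np.linspace(0, 20, 200)                                   # seconds
        observed = head_response(t, 1.0, 0.25, 1.8, 0.0) + rng.normal(0, 0.02, t.size)

        popt, pcov = curve_fit(head_response, t, observed, p0=[1.0, 0.1, 2.0, 0.0])
        perr = np.sqrt(np.diag(pcov))                                 # 1-sigma parameter errors
        print(dict(zip(["h0", "damping", "omega", "phase"], np.round(popt, 3))))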

  10. A new surgical technique for medial collateral ligament balancing: multiple needle puncturing.

    PubMed

    Bellemans, Johan; Vandenneucker, Hilde; Van Lauwe, Johan; Victor, Jan

    2010-10-01

    In this article, we present our experience with a new technique for medial soft tissue balancing, where we make multiple punctures in the medial collateral ligament (MCL) using a 19-gauge needle, to progressively stretch the MCL until a correct ligament balance is achieved. Ligament status was evaluated both before and after the procedure using computer navigation and mediolateral stress testing. The procedure was considered successful when 2 to 4-mm mediolateral joint line opening was obtained in extension and 2 to 6 mm in flexion. In 34 of 35 cases, a progressive correction of medial tightness was achieved according to the above described criteria. One case was considered overreleased in extension. Needle puncturing is a new, effective, and safe technique for progressive correction of MCL tightness in the varus knee. Copyright © 2010 Elsevier Inc. All rights reserved.

  11. Comprehension and Data-Sharing Behavior of Direct-To-Consumer Genetic Test Customers.

    PubMed

    McGrath, Scott P; Coleman, Jason; Najjar, Lotfollah; Fruhling, Ann; Bastola, Dhundy R

    2016-01-01

    The aim of this study was to evaluate current direct-to-consumer (DTC) genetic customers' ability to interpret and comprehend test results and to determine if honest brokers are needed. One hundred and twenty-two customers of the DTC genetic testing company 23andMe were polled in an online survey. The subjects were asked about their personal test results and to interpret the results of two mock test cases (type 2 diabetes and multiple sclerosis), where results were translated into disease probability for an individual compared to the public. When asked to evaluate the risk, 72.1% correctly assessed the first case and 77% were correct on the second case. Only 23.8% of those surveyed were able to interpret both cases correctly. χ² and logistic regression were used to interpret the results. Participants who took the time to read the DTC test-provided supplemental material were 3.93 times (p = 0.040) more likely to correctly interpret the test results than those who did not. The odds for correctly interpreting the test cases were 3.289 times (p = 0.011) higher for those who made more than USD 50,000 than those who made less. Survey results were compared to the Health Information National Trends Survey (HINTS) phase 4 cycle 3 data to evaluate national trends. Most of the subjects were able to correctly interpret the test cases, yet a majority did not share their results with a health-care professional. As the market for DTC genetic testing grows, test comprehension will become more critical. Involving more health professionals in this process may be necessary to ensure proper interpretations. © 2016 S. Karger AG, Basel.

  12. The Effect of SSM Grading on Reliability When Residual Items Have No Discriminating Power.

    ERIC Educational Resources Information Center

    Kane, Michael T.; Moloney, James M.

    Gilman and Ferry have shown that when the student's score on a multiple choice test is the total number of responses necessary to get all items correct, substantial increases in reliability can occur. In contrast, similar procedures giving partial credit on multiple choice items have resulted in relatively small gains in reliability. The analysis…

  13. A Systematic Assessment of "None of the Above" on Multiple Choice Tests in a First Year Psychology Classroom

    ERIC Educational Resources Information Center

    Pachai, Matthew V.; DiBattista, David; Kim, Joseph A.

    2015-01-01

    Multiple choice writing guidelines are decidedly split on the use of "none of the above" (NOTA), with some authors discouraging and others advocating its use. Moreover, empirical studies of NOTA have produced mixed results. Generally, these studies have utilized NOTA as either the correct response or a distractor and assessed its effect…

  14. An analytical framework for whole-genome sequence association studies and its implications for autism spectrum disorder.

    PubMed

    Werling, Donna M; Brand, Harrison; An, Joon-Yong; Stone, Matthew R; Zhu, Lingxue; Glessner, Joseph T; Collins, Ryan L; Dong, Shan; Layer, Ryan M; Markenscoff-Papadimitriou, Eirene; Farrell, Andrew; Schwartz, Grace B; Wang, Harold Z; Currall, Benjamin B; Zhao, Xuefang; Dea, Jeanselle; Duhn, Clif; Erdman, Carolyn A; Gilson, Michael C; Yadav, Rachita; Handsaker, Robert E; Kashin, Seva; Klei, Lambertus; Mandell, Jeffrey D; Nowakowski, Tomasz J; Liu, Yuwen; Pochareddy, Sirisha; Smith, Louw; Walker, Michael F; Waterman, Matthew J; He, Xin; Kriegstein, Arnold R; Rubenstein, John L; Sestan, Nenad; McCarroll, Steven A; Neale, Benjamin M; Coon, Hilary; Willsey, A Jeremy; Buxbaum, Joseph D; Daly, Mark J; State, Matthew W; Quinlan, Aaron R; Marth, Gabor T; Roeder, Kathryn; Devlin, Bernie; Talkowski, Michael E; Sanders, Stephan J

    2018-05-01

    Genomic association studies of common or rare protein-coding variation have established robust statistical approaches to account for multiple testing. Here we present a comparable framework to evaluate rare and de novo noncoding single-nucleotide variants, insertion/deletions, and all classes of structural variation from whole-genome sequencing (WGS). Integrating genomic annotations at the level of nucleotides, genes, and regulatory regions, we define 51,801 annotation categories. Analyses of 519 autism spectrum disorder families did not identify association with any categories after correction for 4,123 effective tests. Without appropriate correction, biologically plausible associations are observed in both cases and controls. Despite excluding previously identified gene-disrupting mutations, coding regions still exhibited the strongest associations. Thus, in autism, the contribution of de novo noncoding variation is probably modest in comparison to that of de novo coding variants. Robust results from future WGS studies will require large cohorts and comprehensive analytical strategies that consider the substantial multiple-testing burden.

  15. Immediate vs. Delayed Feedback in a Computer-Managed Test: Effects on Long-Term Retention. Technical Report, March 1976-August 1976.

    ERIC Educational Resources Information Center

    Sturges, Persis T.

    This experiment was designed to test the effect of immediate and delayed feedback on retention of learning in an educational situation. Four groups of college undergraduates took a multiple-choice computer-managed test. Three of these groups received informative feedback (the entire item with the correct answer identified) either: (1) immediately…

  16. Development and Application of a Four-Tier Test to Assess Pre-Service Physics Teachers' Misconceptions about Geometrical Optics

    ERIC Educational Resources Information Center

    Kaltakci-Gurel, Derya; Eryilmaz, Ali; McDermott, Lillian Christie

    2017-01-01

    Background: Correct identification of misconceptions is an important first step in order to gain an understanding of student learning. More recently, four-tier multiple choice tests have been found to be effective in assessing misconceptions. Purpose: The purposes of this study are (1) to develop and validate a four-tier misconception test to…

  17. Multiple Phenotype Association Tests Using Summary Statistics in Genome-Wide Association Studies

    PubMed Central

    Liu, Zhonghua; Lin, Xihong

    2017-01-01

    Summary We study in this paper jointly testing the associations of a genetic variant with correlated multiple phenotypes using the summary statistics of individual phenotype analysis from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes by accounting for between-phenotype correlation without the need to access individual-level data. Since genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple phenotype testing procedures by jointly testing a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their powers in various settings with the existing methods. We applied the proposed tests to a GWAS Global Lipids Genetics Consortium summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. PMID:28653391
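
    The paper's test jointly evaluates a common mean and a variance component in a mixed model; as a simpler point of reference, the sketch below shows a standard omnibus chi-square that combines per-phenotype z-statistics for one variant using the estimated between-phenotype correlation matrix, with simulated numbers.

        # Omnibus chi-square combining per-phenotype GWAS z-statistics for one
        # variant using the between-phenotype correlation matrix R:
        # Q = z' R^{-1} z ~ chi2_K under H0. A standard Wald-type combination,
        # not the mixed-model test proposed in the paper; numbers are simulated.
        import numpy as np
        from scipy.stats import chi2

        rng = np.random.default_rng(6)
        K = 4                                           # number of correlated phenotypes
        R = 0.4 * np.ones((K, K)) + 0.6 * np.eye(K)     # assumed between-phenotype correlation

        z = rng.multivariate_normal(np.zeros(K), R)     # simulated z-statistics under the null

        Q = z @ np.linalg.solve(R, z)                   # z' R^{-1} z
        print(f"Q = {Q:.2f}, omnibus p = {chi2.sf(Q, df=K):.3f}")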

  18. Multiple phenotype association tests using summary statistics in genome-wide association studies.

    PubMed

    Liu, Zhonghua; Lin, Xihong

    2018-03-01

    We study in this article jointly testing the associations of a genetic variant with correlated multiple phenotypes using the summary statistics of individual phenotype analysis from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes by accounting for between-phenotype correlation without the need to access individual-level data. Since genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple phenotype testing procedures by jointly testing a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their powers in various settings with the existing methods. We applied the proposed tests to a GWAS Global Lipids Genetics Consortium summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. © 2017, The International Biometric Society.

  19. Nonparametric relevance-shifted multiple testing procedures for the analysis of high-dimensional multivariate data with small sample sizes.

    PubMed

    Frömke, Cornelia; Hothorn, Ludwig A; Kropf, Siegfried

    2008-01-27

    In many research areas it is necessary to find differences between treatment groups with several variables. For example, studies of microarray data seek to find, for each variable, a significant difference in location parameters from zero, or from one for ratios thereof. However, in some studies a significant deviation of the difference in locations from zero (or 1 in terms of the ratio) is biologically meaningless. A relevant difference or ratio is sought in such cases. This article addresses the use of relevance-shifted tests on ratios for a multivariate parallel two-sample group design. Two empirical procedures are proposed which embed the relevance-shifted test on ratios. As both procedures test a hypothesis for each variable, the resulting multiple testing problem has to be considered. Hence, the procedures include a multiplicity correction. Both procedures are extensions of available procedures for point null hypotheses achieving exact control of the familywise error rate. Whereas the shift of the null hypothesis alone would give straightforward solutions, the problems that motivate the empirical considerations discussed here arise from the fact that the shift is considered in both directions and the whole parameter space between these two limits has to be accepted as the null hypothesis. The first algorithm to be discussed uses a permutation algorithm and is appropriate for designs with a moderately large number of observations. However, many experiments have limited sample sizes. Then the second procedure might be more appropriate, where multiplicity is corrected according to a concept of data-driven order of hypotheses.
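
    The permutation machinery underlying such procedures can be illustrated with a generic single-step max-T (Westfall-Young style) adjustment for a multivariate two-sample design; the sketch below uses ordinary t-statistics on simulated data and does not implement the relevance-shifted ratio hypotheses themselves.

        # Single-step max-T permutation adjustment for a multivariate two-sample
        # design (generic FWER machinery only); data are simulated.
        import numpy as np
        from scipy.stats import ttest_ind

        rng = np.random.default_rng(7)
        n_per_group, n_vars = 8, 50                      # small samples, many variables
        grp_a = rng.normal(0, 1, (n_per_group, n_vars))
        grp_b = rng.normal(0, 1, (n_per_group, n_vars))
        grp_b[:, :5] += 2.0                              # true differences in 5 variables

        obs_t = np.abs(ttest_ind(grp_a, grp_b, axis=0).statistic)

        pooled = np.vstack([grp_a, grp_b])
        max_null = []
        for _ in range(2000):
            perm = rng.permutation(2 * n_per_group)
            pa, pb = pooled[perm[:n_per_group]], pooled[perm[n_per_group:]]
            max_null.append(np.abs(ttest_ind(pa, pb, axis=0).statistic).max())
        max_null = np.array(max_null)

        # Adjusted p-value per variable: P(max null |t| >= observed |t|)
        adj_p = (max_null[:, None] >= obs_t[None, :]).mean(axis=0)
        print("variables significant at FWER 0.05:", np.where(adj_p <= 0.05)[0])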

  20. HIV Testing, HIV Positivity, and Linkage and Referral Services in Correctional Facilities in the United States, 2009–2013

    PubMed Central

    Seth, Puja; Figueroa, Argelia; Wang, Guoshen; Reid, Laurie; Belcher, Lisa

    2016-01-01

    Background Because of health disparities, incarcerated persons are at higher risk for multiple health issues, including HIV. Correctional facilities have an opportunity to provide HIV services to an underserved population. This article describes Centers for Disease Control and Prevention (CDC)–funded HIV testing and service delivery in correctional facilities. Methods Data on HIV testing and service delivery were submitted to CDC by 61 health department jurisdictions in 2013. HIV testing, HIV positivity, receipt of test results, linkage, and referral services were described, and differences across demographic characteristics for linkage and referral services were assessed. Finally, trends were examined for HIV testing, HIV positivity, and linkage from 2009 to 2013. Results Of CDC-funded tests in 2013 among persons 18 years and older, 254,719 (7.9%) were conducted in correctional facilities. HIV positivity was 0.9%, and HIV positivity for newly diagnosed persons was 0.3%. Blacks accounted for the highest percentage of HIV-infected persons (1.3%) and newly diagnosed persons (0.5%). Only 37.9% of newly diagnosed persons were linked within 90 days; 67.5% were linked within any time frame; 49.7% were referred to partner services; and 45.2% were referred to HIV prevention services. There was a significant percent increase in HIV testing, overall HIV positivity, and linkage from 2009 to 2013. However, trends were stable for newly diagnosed persons. Conclusions Identification of newly diagnosed persons in correctional facilities has remained stable from 2009 to 2013. Correctional facilities seem to be reaching blacks, likely due to higher incarceration rates. The current findings indicate that improvements are needed in HIV testing strategies, service delivery during incarceration, and linkage to care postrelease. PMID:26462190

  1. A simple test of association for contingency tables with multiple column responses.

    PubMed

    Decady, Y J; Thomas, D R

    2000-09-01

    Loughin and Scherer (1998, Biometrics 54, 630-637) investigated tests of association in two-way tables when one of the categorical variables allows for multiple-category responses from individual respondents. Standard chi-squared tests are invalid in this case, and they developed a bootstrap test procedure that provides good control of test levels under the null hypothesis. This procedure and some others that have been proposed are computationally involved and are based on techniques that are relatively unfamiliar to many practitioners. In this paper, the methods introduced by Rao and Scott (1981, Journal of the American Statistical Association 76, 221-230) for analyzing complex survey data are used to develop a simple test based on a corrected chi-squared statistic.

  2. gsSKAT: Rapid gene set analysis and multiple testing correction for rare-variant association studies using weighted linear kernels.

    PubMed

    Larson, Nicholas B; McDonnell, Shannon; Cannon Albright, Lisa; Teerlink, Craig; Stanford, Janet; Ostrander, Elaine A; Isaacs, William B; Xu, Jianfeng; Cooney, Kathleen A; Lange, Ethan; Schleutker, Johanna; Carpten, John D; Powell, Isaac; Bailey-Wilson, Joan E; Cussenot, Olivier; Cancel-Tassin, Geraldine; Giles, Graham G; MacInnis, Robert J; Maier, Christiane; Whittemore, Alice S; Hsieh, Chih-Lin; Wiklund, Fredrik; Catalona, William J; Foulkes, William; Mandal, Diptasri; Eeles, Rosalind; Kote-Jarai, Zsofia; Ackerman, Michael J; Olson, Timothy M; Klein, Christopher J; Thibodeau, Stephen N; Schaid, Daniel J

    2017-05-01

    Next-generation sequencing technologies have afforded unprecedented characterization of low-frequency and rare genetic variation. Due to low power for single-variant testing, aggregative methods are commonly used to combine observed rare variation within a single gene. Causal variation may also aggregate across multiple genes within relevant biomolecular pathways. Kernel-machine regression and adaptive testing methods for aggregative rare-variant association testing have been demonstrated to be powerful approaches for pathway-level analysis, although these methods tend to be computationally intensive at high-variant dimensionality and require access to complete data. An additional analytical issue in scans of large pathway definition sets is multiple testing correction. Gene set definitions may exhibit substantial genic overlap, and the impact of the resultant correlation in test statistics on Type I error rate control for large agnostic gene set scans has not been fully explored. Herein, we first outline a statistical strategy for aggregative rare-variant analysis using component gene-level linear kernel score test summary statistics as well as derive simple estimators of the effective number of tests for family-wise error rate control. We then conduct extensive simulation studies to characterize the behavior of our approach relative to direct application of kernel and adaptive methods under a variety of conditions. We also apply our method to two case-control studies, respectively, evaluating rare variation in hereditary prostate cancer and schizophrenia. Finally, we provide open-source R code for public use to facilitate easy application of our methods to existing rare-variant analysis results. © 2017 WILEY PERIODICALS, INC.
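
    One widely used way to estimate an effective number of tests from a correlation matrix of test statistics is the eigenvalue-based estimator of Li and Ji (2005), sketched below with a synthetic correlation matrix; the paper derives its own estimators from kernel score statistics, so this is only an analogous illustration.

        # Eigenvalue-based effective number of tests (Li & Ji, 2005) from a
        # correlation matrix of correlated test statistics, then a Bonferroni-style
        # threshold. The correlation matrix here is synthetic.
        import numpy as np

        rng = np.random.default_rng(8)
        n_sets, n_samples = 30, 500
        latent = rng.normal(size=(n_samples, 5))         # shared structure (e.g. overlapping genes)
        stats_mat = latent @ rng.normal(size=(5, n_sets)) + rng.normal(size=(n_samples, n_sets))
        R = np.corrcoef(stats_mat, rowvar=False)

        eigvals = np.abs(np.linalg.eigvalsh(R))
        m_eff = np.sum((eigvals >= 1).astype(float) + (eigvals - np.floor(eigvals)))
        print(f"nominal tests: {n_sets}, effective tests: {m_eff:.1f}, alpha: {0.05 / m_eff:.4f}")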

  3. Testing and Performance Analysis of the Multichannel Error Correction Code Decoder

    NASA Technical Reports Server (NTRS)

    Soni, Nitin J.

    1996-01-01

    This report provides the test results and performance analysis of the multichannel error correction code decoder (MED) system for a regenerative satellite with asynchronous, frequency-division multiple access (FDMA) uplink channels. It discusses the system performance relative to various critical parameters: the coding length, data pattern, unique word value, unique word threshold, and adjacent-channel interference. Testing was performed under laboratory conditions and used a computer control interface with specifically developed control software to vary these parameters. Needed technologies - the high-speed Bose Chaudhuri-Hocquenghem (BCH) codec from Harris Corporation and the TRW multichannel demultiplexer/demodulator (MCDD) - were fully integrated into the mesh very small aperture terminal (VSAT) onboard processing architecture and were demonstrated.

  4. Longitudinal plasma metabolic profiles, infant feeding, and islet autoimmunity in the MIDIA study.

    PubMed

    Jørgenrud, Benedicte; Stene, Lars C; Tapia, German; Bøås, Håkon; Pepaj, Milaim; Berg, Jens P; Thorsby, Per M; Orešič, Matej; Hyötyläinen, Tuulia; Rønningen, Kjersti S

    2017-03-01

    The aim of this study was to investigate the longitudinal plasma metabolic profiles in healthy infants and the potential association with breastfeeding duration and islet autoantibodies predictive of type 1 diabetes. Up to four longitudinal plasma samples from age 3 months from case children who developed islet autoimmunity (n = 29) and autoantibody-negative control children (n = 29) with the HLA DR4-DQ8/DR3-DQ2 genotype were analyzed using two-dimensional gas chromatography coupled to a time-of-flight mass spectrometer for detection of small polar metabolites. Plasma metabolite levels were found to depend strongly on age, with fold changes varying up to 50% from age 3 to 24 months (p < 0.001 after correction for multiple testing). Tyrosine levels tended to be lower in case children, but this was not significant after correction for multiple testing. Ornithine levels were lower in case children compared with the controls at the time of seroconversion, but the difference was not statistically significant after correcting for multiple testing. Breastfeeding for at least 3 months as compared with shorter duration was associated with higher plasma levels of isoleucine, and lower levels of methionine and 3,4-dihydroxybutyric acid at 3 months of age. Plasma levels of several small, polar metabolites changed with age during early childhood, independent of later islet autoimmunity status and sex. Breastfeeding was associated with higher levels of branched-chain amino acids, and lower levels of methionine and 3,4-dihydroxybutyric acid. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. Surgical correction of urethral dilatation in an intersex goat.

    PubMed

    Karras, S; Modransky, P; Welker, B

    1992-11-15

    Multiple congenital urethral abnormalities were successfully corrected in a polled goat kid. Anatomic genito-urinary abnormalities identified were paired testes with associated epididymis, ductus deferens, and active endometrial tissue. Blood karyotyping revealed a female karyotype (XX sex chromosomes). This case exemplifies the complex interactions, in addition to Y-dominant Mendelian genetics, that determine reproductive tract development in goats. The resultant intersex state is clinically recognized with greater frequency in polled progeny.

  6. Further evidence for the increased power of LOD scores compared with nonparametric methods.

    PubMed

    Durner, M; Vieland, V J; Greenberg, D A

    1999-01-01

    In genetic analysis of diseases in which the underlying model is unknown, "model free" methods, such as affected sib pair (ASP) tests, are often preferred over LOD-score methods, although LOD-score methods under the correct or even approximately correct model are more powerful than ASP tests. However, there might be circumstances in which nonparametric methods will outperform LOD-score methods. Recently, Dizier et al. reported that, in some complex two-locus (2L) models, LOD-score methods with segregation analysis-derived parameters had less power to detect linkage than ASP tests. We investigated whether these particular models, in fact, represent a situation in which ASP tests are more powerful than LOD scores. We simulated data according to the parameters specified by Dizier et al. and analyzed the data using (a) a single-locus (SL) LOD-score analysis performed twice, under a simple dominant and a recessive mode of inheritance (MOI); (b) ASP methods; and (c) nonparametric linkage (NPL) analysis. We show that SL analysis performed twice and corrected for the type I error increase due to multiple testing yields almost as much linkage information as does an analysis under the correct 2L model and is more powerful than either the ASP method or the NPL method. We demonstrate that, even for complex genetic models, the most important condition for linkage analysis is that the assumed MOI at the disease locus being tested is approximately correct, not that the inheritance of the disease per se is correctly specified. In the analysis by Dizier et al., segregation analysis led to estimates of dominance parameters that were grossly misspecified for the locus tested in those models in which ASP tests appeared to be more powerful than LOD-score analyses.

  7. Demographically corrected norms for African Americans and Caucasians on the Hopkins Verbal Learning Test-Revised, Brief Visuospatial Memory Test-Revised, Stroop Color and Word Test, and Wisconsin Card Sorting Test 64-Card Version.

    PubMed

    Norman, Marc A; Moore, David J; Taylor, Michael; Franklin, Donald; Cysique, Lucette; Ake, Chris; Lazarretto, Deborah; Vaida, Florin; Heaton, Robert K

    2011-08-01

    Memory and executive functioning are two important components of clinical neuropsychological (NP) practice and research. Multiple demographic factors are known to affect performance differentially on most NP tests, but adequate normative corrections, inclusive of race/ethnicity, are not available for many widely used instruments. This study compared demographic contributions for widely used tests of verbal and visual learning and memory (Brief Visuospatial Memory Test-Revised, Hopkins Verbal Learning Test-Revised) and executive functioning (Stroop Color and Word Test, Wisconsin Card Sorting Test-64) in groups of healthy Caucasians (n = 143) and African Americans (n = 103). Demographic factors of age, education, gender, and race/ethnicity were found to be significant factors on some indices of all four tests. The magnitude of demographic contributions (especially age) was greater for African Americans than for Caucasians on most measures. New, demographically corrected T-score formulas were calculated for each race/ethnicity. The rates of NP impairment using previously published normative standards significantly overestimated NP impairment in African Americans. Utilizing the new demographic corrections developed and presented herein, NP impairment rates were comparable between the two race/ethnicities and were unrelated to the other demographic characteristics (age, education, gender) in either race/ethnicity group. Findings support the need to consider extended demographic contributions to neuropsychological test performance in clinical and research settings.

  8. SU-F-T-180: Evaluation of a Scintillating Screen Detector for Proton Beam QA and Acceptance Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghebremedhin, A; Taber, M; Koss, P

    2016-06-15

    Purpose: To test the performance of a commercial scintillating screen detector for acceptance testing and Quality Assurance of a proton pencil beam scanning system. Method: The detector (Lexitek DRD 400) has a 40 cm × 40 cm field and uses a thin scintillator imaged onto a 16-bit scientific CCD with ∼0.5 mm resolution. A grid target and LED illuminators are provided for spatial calibration and relative gain correction. The detector mounts to the nozzle with micron precision. Tools are provided for image processing and analysis of single or multiple Gaussian spots. Results: The bias and gain of the detector were studied to measure repeatability and accuracy. Gain measurements were taken with the LED illuminators to measure repeatability and variation of the lens-CCD pair as a function of f-stop. Overall system gain was measured with a passive scattering (broad) beam whose shape is calibrated with EDR film placed in front of the scintillator. To create a large uniform field, overlapping small fields were recorded with the detector translated laterally and stitched together to cover the full field. Due to the long exposures required to obtain multiple spills of the synchrotron and very high detector sensitivity, borated polyethylene shielding was added to reduce direct radiation events hitting the CCD. Measurements with a micro ion chamber were compared to the detector's spot profile. Software was developed to process arrays of Gaussian spots and to correct for radiation events. Conclusion: The detector background has a fixed bias, a small component linear in time, and is easily corrected. The gain correction method was validated with 2% accuracy. The detector spot profile matches the micro ion chamber data over 4 orders of magnitude. The multiple spot analyses can be easily used with plan data for measuring pencil beam uniformity and for regular QA comparison.

  9. Segmentation-based retrospective shading correction in fluorescence microscopy E. coli images for quantitative analysis

    NASA Astrophysics Data System (ADS)

    Mai, Fei; Chang, Chunqi; Liu, Wenqing; Xu, Weichao; Hung, Yeung S.

    2009-10-01

    Due to the inherent imperfections in the imaging process, fluorescence microscopy images often suffer from spurious intensity variations, usually referred to as intensity inhomogeneity, intensity non-uniformity, shading, or bias field. In this paper, a retrospective shading correction method for fluorescence microscopy Escherichia coli (E. coli) images is proposed based on segmentation results. Segmentation and shading correction are coupled together, so we iteratively correct the shading effects based on the segmentation result and refine the segmentation by segmenting the image after shading correction. A fluorescence microscopy E. coli image can be segmented (based on its intensity value) into two classes: the background and the cells, where the intensity variation within each class is close to zero if there is no shading. Therefore, we make use of this characteristic to correct the shading in each iteration. Shading is mathematically modeled as a multiplicative component and an additive noise component. The additive component is removed by a denoising process, and the multiplicative component is estimated using a fast algorithm to minimize the intra-class intensity variation. We tested our method on synthetic images and real fluorescence E. coli images. It works well not only for visual inspection but also for numerical evaluation. Our proposed method should be useful for further quantitative analysis, especially for protein expression value comparison.
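
    A minimal retrospective correction for the multiplicative component can be sketched by estimating the shading field with heavy Gaussian smoothing and dividing it out, as below on a synthetic image; this is a simple baseline, not the segmentation-coupled iterative method of the paper.

        # Baseline retrospective shading correction: estimate the multiplicative
        # shading field by heavy Gaussian smoothing and divide it out. Synthetic
        # image; not the segmentation-coupled method of the paper.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(9)
        h, w = 256, 256
        yy, xx = np.mgrid[0:h, 0:w]

        # Synthetic fluorescence image: dim background, sparse bright "cells", smooth shading
        cells = (rng.random((h, w)) > 0.999).astype(float)
        image_true = 10 + 200 * gaussian_filter(cells, 2)
        shading = 0.6 + 0.8 * np.exp(-((xx - w / 2) ** 2 + (yy - h / 2) ** 2) / (2 * 120 ** 2))
        observed = image_true * shading + rng.normal(0, 1, (h, w))    # + additive noise

        field = gaussian_filter(observed, sigma=50)                    # estimated shading field
        field /= field.mean()                                          # scale-free up to a constant
        corrected = observed / field
        print(f"estimated field range: {field.min():.2f} to {field.max():.2f}; corrected mean: {corrected.mean():.1f}")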

  10. Calibration of weak-lensing shear in the Kilo-Degree Survey

    NASA Astrophysics Data System (ADS)

    Fenech Conti, I.; Herbonnet, R.; Hoekstra, H.; Merten, J.; Miller, L.; Viola, M.

    2017-05-01

    We describe and test the pipeline used to measure the weak-lensing shear signal from the Kilo-Degree Survey (KiDS). It includes a novel method of 'self-calibration' that partially corrects for the effect of noise bias. We also discuss the 'weight bias' that may arise in optimally weighted measurements, and present a scheme to mitigate that bias. To study the residual biases arising from both galaxy selection and shear measurement, and to derive an empirical correction to reduce the shear biases to ≲1 per cent, we create a suite of simulated images whose properties are close to those of the KiDS survey observations. We find that the use of 'self-calibration' reduces the additive and multiplicative shear biases significantly, although further correction via a calibration scheme is required, which also corrects for a dependence of the bias on galaxy properties. We find that the calibration relation itself is biased by the use of noisy, measured galaxy properties, which may limit the final accuracy that can be achieved. We assess the accuracy of the calibration in the tomographic bins used for the KiDS cosmic shear analysis, testing in particular the effect of possible variations in the uncertain distributions of galaxy size, magnitude and ellipticity, and conclude that the calibration procedure is accurate at the level of multiplicative bias ≲1 per cent required for the KiDS cosmic shear analysis.

  11. A new compound control method for sine-on-random mixed vibration test

    NASA Astrophysics Data System (ADS)

    Zhang, Buyun; Wang, Ruochen; Zeng, Falin

    2017-09-01

    Vibration environmental testing (VET) is one of the important and effective methods of supporting the strength design, reliability and durability testing of mechanical products. A new separation control strategy was proposed for multiple-input multiple-output (MIMO) sine-on-random (SOR) mixed-mode vibration testing, an advanced and demanding type of VET. As the key element of the strategy, a correlation integral method was applied to separate the mixed signals into their random and sinusoidal components. The feedback control formula of a MIMO linear random vibration system was systematically derived in the frequency domain, and a Jacobi control algorithm was proposed based on elements of the power spectral density (PSD) matrix such as the auto-spectrum, coherence and phase. Because the excitation tends to be over-corrected in sine vibration tests, a compression factor was introduced to reduce the excitation correction and avoid damage to the vibration table or other devices. The two methods were combined and applied in a MIMO SOR vibration test system. Finally, a verification test system using the vibration of a cantilever beam as the control object was established to verify the reliability and effectiveness of the proposed methods. The test results show that the exceeding values can be accurately controlled within the tolerance range of the references, and the method provides theoretical and practical support for mechanical engineering.

  12. Corrective response times in a coordinated eye-head-arm countermanding task.

    PubMed

    Tao, Gordon; Khan, Aarlenne Z; Blohm, Gunnar

    2018-06-01

    Inhibition of motor responses has been described as a race between two competing decision processes of motor initiation and inhibition, which manifest as the reaction time (RT) and the stop signal reaction time (SSRT); in the case where motor initiation wins out over inhibition, an erroneous movement occurs that usually needs to be corrected, leading to corrective response times (CRTs). Here we used a combined eye-head-arm movement countermanding task to investigate the mechanisms governing multiple effector coordination and the timing of corrective responses. We found a high degree of correlation between effector response times for RT, SSRT, and CRT, suggesting that decision processes are strongly dependent across effectors. To gain further insight into the mechanisms underlying CRTs, we tested multiple models to describe the distribution of RTs, SSRTs, and CRTs. The best-ranked model (according to 3 information criteria) extends the LATER race model governing RTs and SSRTs, whereby a second motor initiation process triggers the corrective response (CRT) only after the inhibition process completes in an expedited fashion. Our model suggests that the neural processing underpinning a failed decision has a residual effect on subsequent actions. NEW & NOTEWORTHY Failure to inhibit erroneous movements typically results in corrective movements. For coordinated eye-head-hand movements we show that corrective movements are only initiated after the erroneous movement cancellation signal has reached a decision threshold in an accelerated fashion.

  13. An improved standardization procedure to remove systematic low frequency variability biases in GCM simulations

    NASA Astrophysics Data System (ADS)

    Mehrotra, Rajeshwar; Sharma, Ashish

    2012-12-01

    The quality of the absolute estimates of general circulation models (GCMs) calls into question the direct use of GCM outputs for climate change impact assessment studies, particularly at regional scales. Statistical correction of GCM output is often necessary when significant systematic biases occur between the modeled output and observations. A common procedure is to correct the GCM output by removing the systematic biases in low-order moments relative to observations or to reanalysis data at daily, monthly, or seasonal timescales. In this paper, we present an extension of a recently published nested bias correction (NBC) technique to correct for low- as well as higher-order moment biases in the GCM-derived variables across selected multiple timescales. The proposed recursive nested bias correction (RNBC) approach offers an improved basis for applying bias correction at multiple timescales over the original NBC procedure. The method ensures that the bias-corrected series exhibits improvements that are consistently spread over all of the timescales considered. Different variations of the approach, from the standard NBC to the more complex recursive alternatives, are tested to assess their impacts on a range of GCM-simulated atmospheric variables of interest in downscaling applications related to hydrology and water resources. Results of the study suggest that three to five iterations of the RNBC are most effective in removing distributional and persistence-related biases across the timescales considered.

  14. Error determination of a successive correction type objective analysis scheme. [for surface meteorological data

    NASA Technical Reports Server (NTRS)

    Smith, D. R.; Leslie, F. W.

    1984-01-01

    The Purdue Regional Objective Analysis of the Mesoscale (PROAM) is a successive correction type scheme for the analysis of surface meteorological data. The scheme is subjected to a series of experiments to evaluate its performance under a variety of analysis conditions. The tests include use of a known analytic temperature distribution to quantify error bounds for the scheme. Similar experiments were conducted using actual atmospheric data. Results indicate that the multiple pass technique increases the accuracy of the analysis. Furthermore, the tests suggest appropriate values for the analysis parameters in resolving disturbances for the data set used in this investigation.

  15. Identification of mistakes and their correction by a small group discussion as a revision exercise at the end of a teaching module in biochemistry.

    PubMed

    Bobby, Zachariah; Nandeesha, H; Sridhar, M G; Soundravally, R; Setiya, Sajita; Babu, M Sathish; Niranjan, G

    2014-01-01

    Graduate medical students often have little opportunity to clarify their doubts and reinforce their concepts after lecture classes. The Medical Council of India (MCI) encourages group discussions among students. We evaluated the effect of having graduate medical students identify mistakes in a given set of wrong statements and correct them in a small group discussion as a revision exercise. At the end of a module, a pre-test consisting of multiple-choice questions (MCQs) was conducted. Later, a set of incorrect statements related to the topic was given to the students, who were asked to identify the mistakes and correct them in a small group discussion. The effects on low, medium and high achievers were evaluated by a post-test and delayed post-tests with the same set of MCQs. The mean post-test marks were significantly higher in all three groups compared with the pre-test marks. The gain from the small group discussion was equal among low, medium and high achievers, and the gain from the exercise was retained among low, medium and high achievers after 15 days. Identification of mistakes in statements and their correction by a small group discussion is an effective, though unconventional, revision exercise in biochemistry. Copyright 2014, NMJI.

  16. SAT Wars: The Case for Test-Optional College Admissions

    ERIC Educational Resources Information Center

    Soares, Joseph A., Ed.

    2011-01-01

    What can a college admissions officer safely predict about the future of a 17-year-old? Are the best and the brightest students the ones who can check off the most correct boxes on a multiple-choice exam? Or are there better ways of measuring ability and promise? In this penetrating and revealing look at high-stakes standardized admissions tests,…

  17. Replication of an association of variation in the FOXO3A gene with human longevity using both case–control and longitudinal data

    PubMed Central

    Soerensen, Mette; Dato, Serena; Christensen, Kaare; McGue, Matt; Stevnsner, Tinna; Bohr, Vilhelm A.; Christiansen, Lene

    2010-01-01

    Summary Genetic variation in FOXO3A has previously been associated with human longevity. Studies published so far have been case–control studies and hence vulnerable to bias introduced by cohort effects. In this study we extended the previous findings in cohorts of the oldest old Danes (the Danish 1905 cohort, N = 1089) and middle-aged Danes (N = 736), applying a longitudinal study design as well as the case–control study design. Fifteen SNPs were chosen in order to cover the known common variation in FOXO3A. Comparing SNP frequencies in the oldest old with those in middle-aged individuals, we found association (after correction for multiple testing) of eight SNPs: four (rs13217795, rs2764264, rs479744, and rs9400239) previously reported to be associated with longevity and four novel SNPs (rs12206094, rs13220810, rs7762395, and rs9486902; corrected P-values 0.001–0.044). Moreover, we found association of the haplotypes TAC and CAC of rs9486902, rs10499051, and rs12206094 (corrected P-values: 0.01–0.03) with longevity. Finally, we present data applying a longitudinal study design; when using follow-up survival data on the oldest old in a longitudinal analysis, no SNPs remained significant after correction for multiple testing (Bonferroni correction). Hence, our results support and extend the proposed role of FOXO3A as a candidate longevity gene for survival from younger ages to old age, yet not during old age. PMID:20849522
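
    The longitudinal analysis above applies a Bonferroni adjustment across the fifteen SNPs tested. As a minimal sketch of how such a family-wise adjustment is typically computed (the raw p-values below are invented for illustration and are not taken from the study):

    ```python
    # Sketch: Bonferroni adjustment of per-SNP p-values (hypothetical values only).
    from statsmodels.stats.multitest import multipletests

    raw_p = [0.0004, 0.0021, 0.0100, 0.0430, 0.2500]  # invented raw p-values for 5 SNPs

    reject, p_adj, _, _ = multipletests(raw_p, alpha=0.05, method="bonferroni")
    for p, padj, rej in zip(raw_p, p_adj, reject):
        print(f"raw={p:.4f}  bonferroni={padj:.4f}  significant={rej}")
    ```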

  18. Self-adaptive calibration for staring infrared sensors

    NASA Astrophysics Data System (ADS)

    Kendall, William B.; Stocker, Alan D.

    1993-10-01

    This paper presents a new, self-adaptive technique for the correction of non-uniformities (fixed-pattern noise) in high-density infrared focal-plane detector arrays. We have developed a new approach to non-uniformity correction in which we use multiple image frames of the scene itself, taking advantage of the aim-point wander caused by jitter, residual tracking errors, or deliberately induced motion. Such wander causes each detector in the array to view multiple scene elements, and each scene element to be viewed by multiple detectors. It is therefore possible to formulate (and solve) a set of simultaneous equations from which correction parameters can be computed for the detectors. We have tested our approach with actual images collected by the ARPA-sponsored MUSIC infrared sensor. For these tests we employed a 60-frame (0.75-second) sequence of terrain images for which an out-of-date calibration was deliberately used. The sensor was aimed at a point on the ground via an operator-assisted tracking system having a maximum aim-point wander on the order of ten pixels. With these data, we were able to improve the calibration accuracy by a factor of approximately 100.

  19. Evaluation of multiple comparison correction procedures in drug assessment studies using LORETA maps.

    PubMed

    Alonso, Joan Francesc; Romero, Sergio; Mañanas, Miguel Ángel; Rojas, Mónica; Riba, Jordi; Barbanoj, Manel José

    2015-10-01

    The identification of the brain regions involved in neuropharmacological action is a potential procedure for drug development. These regions are commonly determined as the voxels showing significant statistical differences when placebo-induced effects are compared with drug-elicited effects. LORETA is an electroencephalography (EEG) source imaging technique frequently used to identify brain structures affected by a drug. The aim of the present study was to evaluate different methods for the correction of multiple comparisons in LORETA maps. These methods, which have been commonly used in neuroimaging and in simulation studies, were applied to a real pharmaco-EEG study in which the effects of increasing benzodiazepine doses on the central nervous system, as measured by LORETA, were investigated. Data consisted of EEG recordings obtained from nine volunteers who received single oral doses of alprazolam 0.25, 0.5, and 1 mg, and placebo in a randomized crossover double-blind design. The identification of active regions was highly dependent on the selected multiple-test correction procedure. The combined-criteria approach known as cluster mass was useful in revealing that increasing drug doses led to higher intensity and spread of the pharmacologically induced changes in intracerebral current density.
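
    The cluster-mass criterion combines the spatial extent of suprathreshold voxels with the magnitude of their statistics, and its corrected p-value is usually obtained by permutation. The following is a minimal one-dimensional sketch of that general idea (sign-flipping paired drug-minus-placebo differences from nine subjects over a toy "voxel" axis); it is illustrative only and is not the authors' LORETA implementation.

    ```python
    # Minimal 1D sketch of a cluster-mass permutation correction (sign-flip test on
    # paired drug-minus-placebo differences). Illustrative only.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_subj, n_vox = 9, 200
    diff = rng.normal(0.0, 1.0, (n_subj, n_vox))
    diff[:, 80:100] += 1.0                      # hypothetical region with a drug effect

    t_crit = stats.t.ppf(0.975, df=n_subj - 1)  # cluster-forming threshold (two-sided 0.05)

    def max_cluster_mass(d):
        """Largest sum of |t| over a contiguous run of suprathreshold voxels."""
        t = d.mean(0) / (d.std(0, ddof=1) / np.sqrt(len(d)))
        best, mass = 0.0, 0.0
        for t_abs in np.abs(t):
            mass = mass + t_abs if t_abs > t_crit else 0.0
            best = max(best, mass)
        return best

    observed = max_cluster_mass(diff)
    null = np.array([max_cluster_mass(diff * rng.choice([-1.0, 1.0], (n_subj, 1)))
                     for _ in range(1000)])
    p_corrected = (1 + np.sum(null >= observed)) / (1 + len(null))
    print(f"max cluster mass = {observed:.1f}, corrected p = {p_corrected:.3f}")
    ```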

  20. Multiple balance tests improve the assessment of postural stability in subjects with Parkinson's disease

    PubMed Central

    Jacobs, J V; Horak, F B; Tran, V K; Nutt, J G

    2006-01-01

    Objectives Clinicians often base the implementation of therapies on the presence of postural instability in subjects with Parkinson's disease (PD). These decisions are frequently based on the pull test from the Unified Parkinson's Disease Rating Scale (UPDRS). We sought to determine whether combining the pull test, the one‐leg stance test, the functional reach test, and UPDRS items 27–29 (arise from chair, posture, and gait) predicts balance confidence and falling better than any test alone. Methods The study included 67 subjects with PD. Subjects performed the one‐leg stance test, the functional reach test, and the UPDRS motor exam. Subjects also responded to the Activities‐specific Balance Confidence (ABC) scale and reported how many times they fell during the previous year. Regression models determined the combination of tests that optimally predicted mean ABC scores or categorised fall frequency. Results When all tests were included in a stepwise linear regression, only gait (UPDRS item 29), the pull test (UPDRS item 30), and the one‐leg stance test, in combination, represented significant predictor variables for mean ABC scores (r2 = 0.51). A multinomial logistic regression model including the one‐leg stance test and gait represented the model with the fewest significant predictor variables that correctly identified the most subjects as fallers or non‐fallers (85% of subjects were correctly identified). Conclusions Multiple balance tests (including the one‐leg stance test, and the gait and pull test items of the UPDRS) that assess different types of postural stress provide an optimal assessment of postural stability in subjects with PD. PMID:16484639
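
    The fall-prediction step described above amounts to fitting a classification model to a handful of clinical balance measures. A minimal sketch of that kind of model is given below; the one-leg stance times, gait scores and faller labels are invented for illustration and are not the study's data.

    ```python
    # Sketch: classify fallers vs. non-fallers from two balance measures (invented data).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # columns: one-leg stance time (s), UPDRS gait item (0-4)
    X = np.array([[30, 0], [25, 1], [18, 1], [12, 2], [8, 2],
                  [5, 3], [3, 3], [22, 1], [6, 2], [2, 4]])
    y = np.array([0, 0, 0, 0, 1, 1, 1, 0, 1, 1])   # 1 = faller

    model = LogisticRegression().fit(X, y)
    print("training accuracy:", model.score(X, y))
    print("P(faller) for 10 s stance, gait=2:", model.predict_proba([[10, 2]])[0, 1])
    ```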

  1. Should "Multiple Imputations" Be Treated as "Multiple Indicators"?

    ERIC Educational Resources Information Center

    Mislevy, Robert J.

    1993-01-01

    Multiple imputations for latent variables are constructed so that analyses treating them as true variables have the correct expectations for population characteristics. Analyzing multiple imputations in accordance with their construction yields correct estimates of population characteristics, whereas analyzing them as multiple indicators generally…

  2. An introduction to multiplicity issues in clinical trials: the what, why, when and how.

    PubMed

    Li, Guowei; Taljaard, Monica; Van den Heuvel, Edwin R; Levine, Mitchell Ah; Cook, Deborah J; Wells, George A; Devereaux, Philip J; Thabane, Lehana

    2017-04-01

    In clinical trials it is not uncommon to face a multiple testing problem which can have an impact on both type I and type II error rates, leading to inappropriate interpretation of trial results. Multiplicity issues may need to be considered at the design, analysis and interpretation stages of a trial. The proportion of trial reports not adequately correcting for multiple testing remains substantial. The purpose of this article is to provide an introduction to multiple testing issues in clinical trials, and to reduce confusion around the need for multiplicity adjustments. We use a tutorial, question-and-answer approach to address the key issues of why, when and how to consider multiplicity adjustments in trials. We summarize the relevant circumstances under which multiplicity adjustments ought to be considered, as well as options for carrying out multiplicity adjustments in terms of trial design factors including Population, Intervention/Comparison, Outcome, Time frame and Analysis (PICOTA). Results are presented in an easy-to-use table and flow diagrams. Confusion about multiplicity issues can be reduced or avoided by considering the potential impact of multiplicity on type I and II errors and, if necessary, pre-specifying statistical approaches to either avoid or adjust for multiplicity in the trial protocol or analysis plan. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.
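
    The core of the multiplicity problem is that the family-wise chance of at least one false positive grows as 1 − (1 − α)^k for k independent tests at level α. A small numeric sketch of that growth, together with a Holm adjustment of some hypothetical outcome p-values (the numbers are illustrative and not from any trial):

    ```python
    # Sketch: family-wise error inflation with unadjusted testing, plus a Holm adjustment.
    from statsmodels.stats.multitest import multipletests

    alpha = 0.05
    for k in (1, 2, 5, 10, 20):
        print(f"{k:2d} tests: family-wise error rate = {1 - (1 - alpha) ** k:.3f}")

    raw_p = [0.012, 0.034, 0.041]          # hypothetical p-values for 3 co-primary outcomes
    reject, p_holm, _, _ = multipletests(raw_p, alpha=alpha, method="holm")
    print("Holm-adjusted:", [round(p, 3) for p in p_holm], "reject:", list(reject))
    ```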

  3. Immediate detailed feedback to test-enhanced learning: an effective online educational tool.

    PubMed

    Wojcikowski, Ken; Kirk, Leslie

    2013-11-01

    Test-enhanced learning has gained popularity because it is an effective way to increase retention of knowledge, provided the student receives the correct answer soon after the test is taken. To determine whether detailed feedback provided with test-enhanced learning questions is an effective online educational tool for improving performance on complex biomedical information exams. A series of online multiple choice tests were developed to test knowledge of biomedical information that students were expected to know after each patient case. Following submission of the student answers, one cohort (n = 52) received answers only, while the following year a second cohort (n = 51) received the answers with detailed feedback explaining why each answer was correct or incorrect. Students in both groups progressed through the series of online tests with little assessor intervention. Students receiving the answers along with the explanations within their feedback performed significantly better in the final biomedical information exam than those students receiving correct answers only. This pilot study found that detailed feedback on test-enhanced learning questions is an important online learning tool. The increase in student performance in the complex biomedical information exam in this study suggests that detailed feedback should be investigated not only for increasing knowledge, but also for its effect on the retention and application of knowledge.

  4. Multiple regression analysis in nomogram development for myopic wavefront laser in situ keratomileusis: Improving astigmatic outcomes.

    PubMed

    Allan, Bruce D; Hassan, Hala; Ieong, Alvin

    2015-05-01

    To describe and evaluate a new multiple regression-derived nomogram for myopic wavefront laser in situ keratomileusis (LASIK). Moorfields Eye Hospital, London, United Kingdom. Prospective comparative case series. Multiple regression modeling was used to derive a simplified formula for adjusting attempted spherical correction in myopic LASIK. An adaptation of Thibos' power vector method was then applied to derive adjustments to attempted cylindrical correction in eyes with 1.0 diopter (D) or more of preoperative cylinder. These elements were combined in a new nomogram (nomogram II). The 3-month refractive results for myopic wavefront LASIK (spherical equivalent ≤11.0 D; cylinder ≤4.5 D) were compared between 299 consecutive eyes treated using the earlier nomogram (nomogram I) in 2009 and 2010 and 414 eyes treated using nomogram II in 2011 and 2012. There was no significant difference in treatment accuracy (variance in the postoperative manifest refraction spherical equivalent error) between nomogram I and nomogram II (P = .73, Bartlett test). Fewer patients treated with nomogram II had more than 0.5 D of residual postoperative astigmatism (P = .0001, Fisher exact test). There was no significant coupling between adjustments to the attempted cylinder and the achieved sphere (P = .18, t test). Discarding marginal influences from a multiple regression-derived nomogram for myopic wavefront LASIK had no clinically significant effect on treatment accuracy. Thibos' power vector method can be used to guide adjustments to the treatment cylinder alongside nomograms designed to optimize postoperative spherical equivalent results in myopic LASIK. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  5. Statistical significance of combinatorial regulations

    PubMed Central

    Terada, Aika; Okada-Hatakeyama, Mariko; Tsuda, Koji; Sese, Jun

    2013-01-01

    More than three transcription factors often work together to enable cells to respond to various signals. The detection of combinatorial regulation by multiple transcription factors, however, is not only computationally nontrivial but also extremely unlikely because of multiple testing correction. The exponential growth in the number of tests forces us to set a strict limit on the maximum arity. Here, we propose an efficient branch-and-bound algorithm called the “limitless arity multiple-testing procedure” (LAMP) to count the exact number of testable combinations and calibrate the Bonferroni factor to the smallest possible value. LAMP lists significant combinations without any limit, whereas the family-wise error rate is rigorously controlled under the threshold. In the human breast cancer transcriptome, LAMP discovered statistically significant combinations of as many as eight binding motifs. This method may contribute to uncover pathways regulated in a coordinated fashion and find hidden associations in heterogeneous data. PMID:23882073
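
    LAMP's central observation is that a combination supported by only a few samples can never reach significance, so it need not inflate the Bonferroni factor. The sketch below illustrates that testability idea with a Tarone-style calibration based on the minimum achievable one-sided Fisher p-value; the sample size, class size and support counts are hypothetical, and this is not LAMP's branch-and-bound algorithm itself.

    ```python
    # Sketch of the testability idea behind calibrating the Bonferroni factor:
    # a combination carried by only x samples has a floor on its one-sided Fisher p-value.
    from scipy.stats import hypergeom

    N, n_pos = 100, 30                      # samples, and samples in the "positive" class
    supports = [2, 3, 5, 8, 12, 20]         # hypothetical support counts of candidate combinations
    alpha = 0.05

    def min_achievable_p(x, N, n_pos):
        """Smallest one-sided Fisher p: all x carriers fall in the positive class."""
        x_extreme = min(x, n_pos)
        return hypergeom.sf(x_extreme - 1, N, n_pos, x)

    min_ps = [min_achievable_p(x, N, n_pos) for x in supports]

    # Tarone-style K: smallest K with #{combinations whose minimum p <= alpha/K} <= K
    K = 1
    while sum(p <= alpha / K for p in min_ps) > K:
        K += 1
    print(f"calibrated correction factor K = {K} (instead of {len(supports)} tests)")
    ```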

  6. Development of an Accelerated Test Design for Predicting the Service Life of the Solar Array at Mead, Nebraska

    NASA Technical Reports Server (NTRS)

    Gaines, G. B.; Thomas, R. E.; Noel, G. T.; Shilliday, T. S.; Wood, V. E.; Carmichael, D. C.

    1979-01-01

    An accelerated life test is described which was developed to predict the life of the 25 kW photovoltaic array installed near Mead, Nebraska. A quantitative model for accelerating testing using multiple environmental stresses was used to develop the test design. The model accounts for the effects of thermal stress by a relation of the Arrhenius form. This relation was then corrected for the effects of nonthermal environmental stresses, such as relative humidity, atmospheric pollutants, and ultraviolet radiation. The correction factors for the nonthermal stresses included temperature-dependent exponents to account for the effects of interactions between thermal and nonthermal stresses on the rate of degradation of power output. The test conditions, measurements, and data analyses for the accelerated tests are presented. Constant-temperature, cyclic-temperature, and UV types of tests are specified, incorporating selected levels of relative humidity and chemical contamination and an imposed forward-bias current and static electric field.

  7. Insights into Students' Conceptual Understanding Using Textual Analysis: A Case Study in Signal Processing

    ERIC Educational Resources Information Center

    Goncher, Andrea M.; Jayalath, Dhammika; Boles, Wageeh

    2016-01-01

    Concept inventory tests are one method to evaluate conceptual understanding and identify possible misconceptions. The multiple-choice question format, offering a choice between a correct selection and common misconceptions, can provide an assessment of students' conceptual understanding in various dimensions. Misconceptions of some engineering…

  8. Wafer hotspot prevention using etch aware OPC correction

    NASA Astrophysics Data System (ADS)

    Hamouda, Ayman; Power, Dave; Salama, Mohamed; Chen, Ao

    2016-03-01

    As technology development advances into deep-sub-wavelength nodes, multiple patterning is becoming more essential to achieving the technology shrink requirements. Recently, Optical Proximity Correction (OPC) approaches have proposed the simultaneous correction of multiple mask patterns to enable multiple-patterning awareness during OPC correction. This is essential to prevent inter-layer hot-spots during the final pattern transfer. In the state-of-the-art literature, multi-layer awareness is achieved using simultaneous resist-contour simulations to predict and correct for hot-spots during mask generation. However, this approach assumes a uniform etch shrink response for all patterns independent of their proximity, which is not sufficient for the full prevention of inter-exposure hot-spots, for example spacing violations between different colors post-etch or via coverage/enclosure issues post-etch. In this paper, we explain the need to include the etch component during multiple-patterning OPC. We also introduce a novel approach for etch-aware simultaneous multiple-patterning OPC, in which we calibrate and verify a lumped model that includes the combined resist and etch responses. Adding this extra simulation condition during OPC is suitable for full-chip processing from a computational-intensity point of view. Using this model during OPC to predict and correct inter-exposure hot-spots is similar to previously proposed multiple-patterning OPC, yet our proposed approach also corrects post-etch defects more accurately.

  9. An Empirical Test of a Strategy for Training Examinees in the Use of Partial Information in Taking Multiple Choice Tests.

    ERIC Educational Resources Information Center

    Bliss, Leonard B.

    The aim of this study was to show that the superiority of corrected-for-guessing scores over number right scores as true score estimates depends on the ability of examinees to recognize situations where they can eliminate one or more alternatives as incorrect and to omit items where they would only be guessing randomly. Previous investigations…

  10. Effects of Barometric Fluctuations on Well Water-Level Measurements and Aquifer Test Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spane, Frank A.

    1999-12-16

    This report examines the effects of barometric fluctuations on well water-level measurements and evaluates adjustment and removal methods for determining areal aquifer head conditions and for aquifer test analysis. Two examples of Hanford Site unconfined aquifer tests are examined that demonstrate barometric response analysis and illustrate the predictive/removal capabilities of various methods for well water-level and aquifer total head values. Good predictive/removal characteristics were demonstrated, with the best corrective results provided by multiple-regression deconvolution methods.
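
    Multiple-regression deconvolution in this context means regressing water-level changes on current and lagged barometric-pressure changes and subtracting the fitted barometric response. A minimal sketch on synthetic data is shown below; the lag length and response coefficients are invented, and this is not the report's Hanford analysis.

    ```python
    # Sketch: remove barometric effects from well water levels by multiple regression
    # on current and lagged barometric-pressure changes. Synthetic data for illustration.
    import numpy as np

    rng = np.random.default_rng(1)
    n, n_lags = 500, 6
    baro = np.cumsum(rng.normal(0, 0.02, n))                 # barometric pressure (m of water)
    true_resp = np.array([-0.6, -0.15, -0.1, -0.05, 0, 0])   # hypothetical lagged response
    wl = rng.normal(0, 0.002, n)                              # water-level noise
    d_baro = np.diff(baro)
    d_wl = np.diff(wl)
    for lag, c in enumerate(true_resp):
        d_wl[lag:] += c * d_baro[: len(d_baro) - lag]         # add barometric influence

    # design matrix of lagged barometric changes
    X = np.column_stack([np.concatenate([np.zeros(lag), d_baro[: len(d_baro) - lag]])
                         for lag in range(n_lags)])
    coef, *_ = np.linalg.lstsq(X, d_wl, rcond=None)
    corrected = d_wl - X @ coef                               # barometric-corrected changes
    print("estimated lag response:", np.round(coef, 2))
    print("std of changes before/after:", d_wl.std().round(4), corrected.std().round(4))
    ```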

  11. The influence of question design on the response to self-assessment in www.elearnSCI.org: a submodule pilot study.

    PubMed

    Liu, N; Li, X-W; Zhou, M-W; Biering-Sørensen, F

    2015-08-01

    This was an interventional training-session study. The objective was to investigate the difference in responses to the self-assessment questions of a www.elearnSCI.org submodule between the original and an adjusted version, among student nurses. The study was conducted in a teaching hospital affiliated to Peking University, China. In all, 28 student nurses divided into two groups (groups A and B; 14 in each) received a print-out of a Chinese translation of the slides from the 'Maintaining skin integrity following spinal cord injury' submodule in www.elearnSCI.org for self-study. Both groups were then tested using the 10 self-assessment multiple-choice questions (MCQs) related to the same submodule. Group A used the original questions, whereas group B received an adjusted questionnaire. The responses to four conventional single-answer MCQs were nearly all correct in both groups. However, in three questions, group A, with the option 'All of the above', had a higher number of correct answers than group B, with multiple-answer MCQs. In addition, in another three questions, group A, using the original multiple-answer MCQs, had fewer correct answers than group B, where it was only necessary to tick a single incorrect answer. Variations in design influence the responses to questions. The use of conventional single-answer MCQs should be reconsidered, as they only examine the recall of isolated knowledge facts. The 'All of the above' option should be avoided because it increases the number of correct answers arrived at by guessing. When using multiple-answer MCQs, it is recommended that the questions asked be in accordance with the content of www.elearnSCI.org.

  12. [Rapid Identification of Epicarpium Citri Grandis via Infrared Spectroscopy and Fluorescence Spectrum Imaging Technology Combined with Neural Network].

    PubMed

    Pan, Sha-sha; Huang, Fu-rong; Xiao, Chi; Xian, Rui-yi; Ma, Zhi-guo

    2015-10-01

    To explore rapid and reliable methods for the detection of Epicarpium citri grandis (ECG), Fourier transform attenuated total reflection infrared spectroscopy (FTIR/ATR) and fluorescence spectrum imaging technology were combined with multilayer perceptron (MLP) neural network pattern recognition for the identification of ECG, and the two methods were compared. Infrared spectra and fluorescence spectral images of 118 samples, 81 ECG and 37 of other kinds, were collected. According to the differences in the spectra, the spectral data in the 550-1800 cm(-1) wavenumber range and the 400-720 nm wavelength range were taken as the objects of discriminant analysis. Principal component analysis (PCA) was then applied to reduce the dimension of the spectroscopic data of ECG, and an MLP neural network was used to classify them. During the experiment, the effects of different data preprocessing methods on the model were compared: multiplicative scatter correction (MSC), standard normal variate correction (SNV), first-order derivative (FD), second-order derivative (SD) and Savitzky-Golay (SG) smoothing. The results showed that, after Savitzky-Golay (SG) pretreatment of the infrared spectra, an MLP neural network with a sigmoid hidden-layer activation gave the best discrimination of ECG, with correct classification rates of 100% for both the training and testing sets. For the fluorescence spectral imaging data, multiplicative scatter correction (MSC) was the most effective pretreatment; after preprocessing, a three-layer MLP neural network with a sigmoid hidden-layer activation achieved 100% correct classification on the training set and 96.7% on the testing set. These results show that FTIR/ATR and fluorescence spectral imaging combined with an MLP neural network can be used for the identification of ECG, with the advantages of being rapid and reliable.
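
    The processing chain described above (spectral preprocessing, PCA dimensionality reduction, then an MLP with a sigmoid hidden layer) can be sketched as follows. The spectra are synthetic stand-ins rather than the FTIR/ATR data, and scikit-learn/scipy are assumptions standing in for whatever software the authors used.

    ```python
    # Sketch: Savitzky-Golay smoothing -> PCA -> MLP classifier on synthetic spectra.
    import numpy as np
    from scipy.signal import savgol_filter
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(2)
    n_samples, n_wavenumbers = 118, 300
    spectra = rng.normal(0, 0.05, (n_samples, n_wavenumbers))
    labels = np.r_[np.ones(81, int), np.zeros(37, int)]       # 81 genuine vs 37 other samples
    spectra[labels == 1, 100:140] += 0.5                       # hypothetical absorption band

    smoothed = savgol_filter(spectra, window_length=11, polyorder=3, axis=1)

    model = make_pipeline(PCA(n_components=10),
                          MLPClassifier(hidden_layer_sizes=(8,), activation="logistic",
                                        max_iter=2000, random_state=0))
    X_train, X_test, y_train, y_test = train_test_split(smoothed, labels, test_size=0.3,
                                                        stratify=labels, random_state=0)
    model.fit(X_train, y_train)
    print("test accuracy:", model.score(X_test, y_test))
    ```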

  13. Three-dimensional mapping of equiprobable hydrostratigraphic units at the Frenchman Flat Corrective Action Unit, Nevada Test Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shirley, C.; Pohlmann, K.; Andricevic, R.

    1996-09-01

    Geological and geophysical data are used with the sequential indicator simulation algorithm of Gomez-Hernandez and Srivastava to produce multiple, equiprobable, three-dimensional maps of informal hydrostratigraphic units at the Frenchman Flat Corrective Action Unit, Nevada Test Site. The upper 50 percent of the Tertiary volcanic lithostratigraphic column comprises the study volume. Semivariograms are modeled from indicator-transformed geophysical tool signals. Each equiprobable study volume is subdivided into discrete classes using the ISIM3D implementation of the sequential indicator simulation algorithm. Hydraulic conductivity is assigned within each class using the sequential Gaussian simulation method of Deutsch and Journel. The resulting maps show the contiguity of high and low hydraulic conductivity regions.

  14. The Voice of Holland: Allograph Production in Written Dutch Past Tense Inflection

    ERIC Educational Resources Information Center

    de Bree, Elise; van der Ven, Sanne; van der Maas, Han

    2017-01-01

    According to the Integration of Multiple Patterns hypothesis (IMP; Treiman & Kessler, 2014), the spelling difficulty of a word is affected by the number of cues converging on the correct answer. We tested this hypothesis in children's regular past tense formation in Dutch. Past tenses are formed by adding either-"de"…

  15. La Mujer Chicana. (The Chicana Woman).

    ERIC Educational Resources Information Center

    Herrera, Gloria; Lizcano, Jeanette

    As objectives for this secondary level unit, students are to: (1) read the unit with comprehension; (2) demonstrate their comprehension of the Chicana's history by participating in an oral discussion utilizing four discussion questions; and (3) correctly answer 15 of the 20 questions on a multiple choice test. The unit consists of a brief history…

  16. No association between oxytocin or prolactin gene variants and childhood-onset mood disorders

    PubMed Central

    Strauss, John S.; Freeman, Natalie L.; Shaikh, Sajid A.; Vetró, Ágnes; Kiss, Enikő; Kapornai, Krisztina; Daróczi, Gabriella; Rimay, Timea; Kothencné, Viola Osváth; Dombovári, Edit; Kaczvinszk, Emília; Tamás, Zsuzsa; Baji, Ildikó; Besny, Márta; Gádoros, Julia; DeLuca, Vincenzo; George, Charles J.; Dempster, Emma; Barr, Cathy L.; Kovacs, Maria; Kennedy, James L.

    2010-01-01

    Background Oxytocin (OXT) and prolactin (PRL) are neuropeptide hormones that interact with the serotonin system and are involved in the stress response and social affiliation. In human studies, serum OXT and PRL levels have been associated with depression and related phenotypes. Our purpose was to determine if single nucleotide polymorphisms (SNPs) at the loci for OXT, PRL and their receptors, OXTR and PRLR, were associated with childhood-onset mood disorders (COMD). Methods Using 678 families in a family-based association design, we genotyped sixteen SNPs at OXT, PRL, OXTR and PRLR to test for association with COMD. Results No significant associations were found for SNPs in the OXTR, PRL, or PRLR genes. Two of three SNPs 3' of the OXT gene were associated with COMD (p ≤ 0.02), significant after spectral decomposition, but were not significant after additionally correcting for the number of genes tested. Supplementary analyses of parent-of-origin and proband sex effects for OXT SNPs by Fisher’s Exact test were not significant after Bonferroni correction. Conclusions We have examined sixteen OXT and PRL system gene variants, with no evidence of statistically significant association after correction for multiple tests. PMID:20547007

  17. Grayscale inhomogeneity correction method for multiple mosaicked electron microscope images

    NASA Astrophysics Data System (ADS)

    Zhou, Fangxu; Chen, Xi; Sun, Rong; Han, Hua

    2018-04-01

    Electron microscope image stitching is highly desired to acquire microscopic-resolution images of large target scenes in neuroscience. However, mosaicked electron microscope images may exhibit severe grayscale inhomogeneity due to the instability of the electron microscope system and registration errors, which degrades the visual quality of the mosaicked EM images and complicates follow-up processing such as automatic object recognition. Consequently, a grayscale correction method for multiple mosaicked electron microscope images is indispensable in these areas. Unlike most previous grayscale correction methods, this paper designs a grayscale correction process for multiple EM images that tackles the difficulty of multi-image monochrome correction and achieves grayscale consistency in the overlap regions. We adjust the overall grayscale of the mosaicked images using the location and grayscale information of manually selected seed images, and then fuse the local overlap regions between adjacent images using Poisson image editing. Experimental results demonstrate the effectiveness of the proposed method.
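
    One elementary step in such a pipeline is matching the grayscale of an adjacent tile to a reference tile using their shared overlap before blending. The sketch below shows only that gain-matching step on synthetic tiles with an assumed 64-pixel overlap; it is not the authors' seed-image or Poisson-editing implementation.

    ```python
    # Sketch: scale an adjacent tile so its overlap matches the reference tile's mean intensity.
    import numpy as np

    rng = np.random.default_rng(3)
    ref = rng.uniform(80, 120, (512, 512))
    adj = 0.8 * rng.uniform(80, 120, (512, 512)) + 5.0        # tile with a different gain/offset

    overlap_ref = ref[:, -64:]        # assume a 64-pixel-wide overlap on the right edge of ref
    overlap_adj = adj[:, :64]         # ... matching the left edge of the adjacent tile

    gain = overlap_ref.mean() / overlap_adj.mean()
    adj_corrected = adj * gain
    print(f"overlap means before: {overlap_ref.mean():.1f} vs {overlap_adj.mean():.1f}")
    print(f"after gain correction: {adj_corrected[:, :64].mean():.1f}")
    ```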

  18. Harmonic source wavefront aberration correction for ultrasound imaging

    PubMed Central

    Dianis, Scott W.; von Ramm, Olaf T.

    2011-01-01

    A method is proposed which uses a lower-frequency transmit to create a known harmonic acoustical source in tissue suitable for wavefront correction without a priori assumptions of the target or requiring a transponder. The measurement and imaging steps of this method were implemented on the Duke phased array system with a two-dimensional (2-D) array. The method was tested with multiple electronic aberrators [0.39π to 1.16π radians root-mean-square (rms) at 4.17 MHz] and with a physical aberrator 0.17π radians rms at 4.17 MHz) in a variety of imaging situations. Corrections were quantified in terms of peak beam amplitude compared to the unaberrated case, with restoration between 0.6 and 36.6 dB of peak amplitude with a single correction. Standard phantom images before and after correction were obtained and showed both visible improvement and 14 dB contrast improvement after correction. This method, when combined with previous phase correction methods, may be an important step that leads to improved clinical images. PMID:21303031

  19. Pilot study of the impact that bilateral sacroiliac joint manipulation using a drop table technique has on gait parameters in asymptomatic individuals with a leg length inequality.

    PubMed

    Ward, John; Sorrels, Ken; Coats, Jesse; Pourmoghaddam, Amir; Deleon, Carlos; Daigneault, Paige

    2014-03-01

    The purpose of this study was to pilot test our study procedures and estimate parameters for sample size calculations for a randomized controlled trial to determine if bilateral sacroiliac (SI) joint manipulation affects specific gait parameters in asymptomatic individuals with a leg length inequality (LLI). Twenty-one asymptomatic chiropractic students engaged in a baseline 90-second walking kinematic analysis using infrared Vicon® cameras. Following this, participants underwent a functional LLI test. Upon examination participants were classified as: left short leg, right short leg, or no short leg. Half of the participants in each short leg group were then randomized to receive bilateral corrective SI joint chiropractic manipulative therapy (CMT). All participants then underwent another 90-second gait analysis. Pre- versus post-intervention gait data were then analyzed within treatment groups by an individual who was blinded to participant group status. For the primary analysis, all p-values were corrected for multiple comparisons using the Bonferroni method. Within groups, no differences in measured gait parameters were statistically significant after correcting for multiple comparisons. The protocol of this study was acceptable to all subjects who were invited to participate. No participants refused randomization. Based on the data collected, we estimated that a larger main study would require 34 participants in each comparison group to detect a moderate effect size.

  20. The Positive and Negative Effects of Science Concept Tests on Student Conceptual Understanding

    NASA Astrophysics Data System (ADS)

    Chang, Chun-Yen; Yeh, Ting-Kuang; Barufaldi, James P.

    2010-01-01

    This study explored the phenomenon of the testing effect during science concept assessments, including the mechanism behind it and its impact upon a learner's conceptual understanding. The participants consisted of 208 high school students, in either the 11th or 12th grade. Three types of tests (traditional multiple-choice test, correct concept test, and incorrect concept test) related to the greenhouse effect and global warming were developed to explore the mechanisms underlying the testing effect. Interview data analyzed by means of the flow-map method were used to examine the two-week post-test consequences of taking one of these three tests. The results indicated: (1) Traditional tests can affect participants' long-term memory, both positively and negatively; in addition, when students ponder repeatedly and think harder about highly distracting choices during a test, they may gradually develop new conceptions; (2) Students develop more correct conceptions when more true descriptions are provided on the tests; on the other hand, students develop more misconceptions while completing tests in which more false descriptions of choices are provided. Finally, the results of this study revealed a noteworthy phenomenon: tests, if employed appropriately, may also be an effective instrument for assisting students' conceptual understanding.

  1. Applications of multivariate modeling to neuroimaging group analysis: A comprehensive alternative to univariate general linear model

    PubMed Central

    Chen, Gang; Adleman, Nancy E.; Saad, Ziad S.; Leibenluft, Ellen; Cox, Robert W.

    2014-01-01

    All neuroimaging packages can handle group analysis with t-tests or general linear modeling (GLM). However, they are quite hamstrung when there are multiple within-subject factors or when quantitative covariates are involved in the presence of a within-subject factor. In addition, sphericity is typically assumed for the variance–covariance structure when there are more than two levels in a within-subject factor. To overcome such limitations in the traditional AN(C)OVA and GLM, we adopt a multivariate modeling (MVM) approach to analyzing neuroimaging data at the group level with the following advantages: a) there is no limit on the number of factors as long as sample sizes are deemed appropriate; b) quantitative covariates can be analyzed together with within-subject factors; c) when a within-subject factor is involved, three testing methodologies are provided: traditional univariate testing (UVT) with sphericity assumption (UVT-UC) and with correction when the assumption is violated (UVT-SC), and within-subject multivariate testing (MVT-WS); d) to correct for sphericity violation at the voxel level, we propose a hybrid testing (HT) approach that achieves equal or higher power via combining traditional sphericity correction methods (Greenhouse–Geisser and Huynh–Feldt) with MVT-WS. PMID:24954281

  2. Repeated readings and science: Fluency with expository passages

    NASA Astrophysics Data System (ADS)

    Kostewicz, Douglas E.

    The current study investigated the effects of repeated readings to a fluency criterion (RRFC) for seven students with disabilities using science text. The study employed a single-subject design, specifically two multiple-probe multiple-baselines across subjects, to evaluate the effects of the RRFC intervention. Results indicated that students met the criterion (200 or more correct words per minute with 2 or fewer errors) on four consecutive passages. A majority of students displayed accelerations in correct words per minute and decelerations in incorrect words per minute on successive initial intervention readings, suggesting reading transfer. Students' reading scores during post-test and maintenance outperformed pre-test and baseline readings, providing additional evidence of reading transfer. Regarding the relationship to comprehension, students scored higher on oral retell measures after meeting the criterion than on initial readings. Overall, the research findings suggest that the RRFC intervention improves science reading fluency for students with disabilities and may also indirectly benefit comprehension.

  3. Moving beyond Univariate Post-Hoc Testing in Exercise Science: A Primer on Descriptive Discriminate Analysis

    ERIC Educational Resources Information Center

    Barton, Mitch; Yeatts, Paul E.; Henson, Robin K.; Martin, Scott B.

    2016-01-01

    There has been a recent call to improve data reporting in kinesiology journals, including the appropriate use of univariate and multivariate analysis techniques. For example, a multivariate analysis of variance (MANOVA) with univariate post hocs and a Bonferroni correction is frequently used to investigate group differences on multiple dependent…

  4. 76 FR 22311 - Airworthiness Directives; Lockheed Martin Corporation/Lockheed Martin Aeronautics Company Model...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-21

    ... a report of fatigue cracking of the wing upper and lower rainbow fittings during durability testing... are susceptible to multiple site fatigue damage. We are issuing this AD to detect and correct such fatigue cracks, which could grow large and lead to the failure of the fitting and a catastrophic failure...

  5. Conceptual Coherence of Non-Newtonian Worldviews in Force Concept Inventory Data

    ERIC Educational Resources Information Center

    Scott, Terry F.; Schumayer, Dániel

    2017-01-01

    The Force Concept Inventory is one of the most popular and most analyzed multiple-choice concept tests used to investigate students' understanding of Newtonian mechanics. The correct answers poll a set of underlying Newtonian concepts and the coherence of these underlying concepts has been found in the data. However, this inventory was constructed…

  6. Individual and shared effects of social environment and polygenic risk scores on adolescent body mass index.

    PubMed

    Coleman, Jonathan R I; Krapohl, Eva; Eley, Thalia C; Breen, Gerome

    2018-04-20

    Juvenile obesity is associated with adverse health outcomes. Understanding genetic and environmental influences on body mass index (BMI) during adolescence could inform interventions. We investigated independent and interactive effects of parenting, socioeconomic status (SES) and polygenic risk on BMI pre-adolescence, and on the rate of change in BMI across adolescence. Genome-wide genotype data, BMI and child perceptions of parental warmth and punitive discipline were available at 11 years old, and parental SES was available from birth on 3,414 unrelated participants. Linear models were used to test the effects of social environment and polygenic risk on pre-adolescent BMI. Change in BMI across adolescence was assessed in a subset (N = 1943). Sex-specific effects were assessed. Higher genetic risk was associated with increased BMI pre-adolescence and across adolescence (p < 0.00417, corrected for multiple tests). Negative parenting was not significantly associated with either phenotype, but lower SES was associated with increased BMI pre-adolescence. No interactions passed correction for multiple testing. Polygenic risk scores from adult GWAS meta-analyses are associated with BMI in juveniles, suggesting a stable genetic component. Pre-adolescent BMI was associated with social environment, but parental style has, at most, a small effect.

  7. Multiple directed graph large-class multi-spectral processor

    NASA Technical Reports Server (NTRS)

    Casasent, David; Liu, Shiaw-Dong; Yoneyama, Hideyuki

    1988-01-01

    Numerical analysis techniques for the interpretation of high-resolution imaging-spectrometer data are described and demonstrated. The method proposed involves the use of (1) a hierarchical classifier with a tree structure generated automatically by a Fisher linear-discriminant-function algorithm and (2) a novel multiple-directed-graph scheme which reduces the local maxima and the number of perturbations required. Results for a 500-class test problem involving simulated imaging-spectrometer data are presented in tables and graphs; 100-percent-correct classification is achieved with an improvement factor of 5.

  8. Impression formation of tests: retrospective judgments of performance are higher when easier questions come first.

    PubMed

    Jackson, Abigail; Greene, Robert L

    2014-11-01

    Four experiments are reported on the importance of retrospective judgments of performance (postdictions) on tests. Participants answered general knowledge questions and estimated how many questions they answered correctly. They gave higher postdictions when easy questions preceded difficult questions. This was true when time to answer each question was equalized and constrained, when participants were instructed not to write answers, and when questions were presented in a multiple-choice format. Results are consistent with the notion that first impressions predominate in overall perception of test difficulty.

  9. Genetic variability of VEGF pathway genes in six randomized phase III trials assessing the addition of bevacizumab to standard therapy.

    PubMed

    de Haas, Sanne; Delmar, Paul; Bansal, Aruna T; Moisse, Matthieu; Miles, David W; Leighl, Natasha; Escudier, Bernard; Van Cutsem, Eric; Carmeliet, Peter; Scherer, Stefan J; Pallaud, Celine; Lambrechts, Diether

    2014-10-01

    Despite extensive translational research, no validated biomarkers predictive of bevacizumab treatment outcome have been identified. We performed a meta-analysis of individual patient data from six randomized phase III trials in colorectal, pancreatic, lung, renal, breast, and gastric cancer to explore the potential relationships between 195 common genetic variants in the vascular endothelial growth factor (VEGF) pathway and bevacizumab treatment outcome. The analysis included 1,402 patients (716 bevacizumab-treated and 686 placebo-treated). Twenty variants were associated (P < 0.05) with progression-free survival (PFS) in bevacizumab-treated patients. Of these, 4 variants in EPAS1 survived correction for multiple testing (q < 0.05). Genotype-by-treatment interaction tests revealed that, across these 20 variants, 3 variants in VEGF-C (rs12510099), EPAS1 (rs4953344), and IL8RA (rs2234671) were potentially predictive (P < 0.05) but did not survive correction for multiple testing (q > 0.05). A weak genotype-by-treatment interaction effect was also observed for rs699946 in VEGF-A, whereas Bayesian genewise analysis revealed that genetic variability in VHL was associated with PFS in the bevacizumab arm (q < 0.05). Variants in VEGF-A, EPAS1, and VHL were located in expression quantitative trait loci derived from lymphoblastoid cell lines, indicating that they affect the expression levels of their respective genes. This large genetic analysis suggests that variants in VEGF-A, EPAS1, IL8RA, VHL, and VEGF-C have potential value in predicting bevacizumab treatment outcome across tumor types. Although these associations did not survive correction for multiple testing in a genotype-by-interaction analysis, they are among the strongest predictive effects reported to date for genetic variants and bevacizumab efficacy.
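
    The q < 0.05 criterion above refers to false-discovery-rate-adjusted p-values; one common route to such values is the Benjamini-Hochberg adjustment (the exact method used in the trials is not specified here, and the raw p-values below are invented for illustration):

    ```python
    # Sketch: Benjamini-Hochberg FDR adjustment of a set of hypothetical p-values.
    from statsmodels.stats.multitest import multipletests

    raw_p = [0.0002, 0.0009, 0.004, 0.011, 0.023, 0.031, 0.042, 0.048]
    reject, q_values, _, _ = multipletests(raw_p, alpha=0.05, method="fdr_bh")
    for p, q in zip(raw_p, q_values):
        print(f"p = {p:.4f}  ->  q = {q:.4f}")
    ```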

  10. PubMed Central

    PANATTO, D.; ARATA, L.; BEVILACQUA, I.; APPRATO, L.; GASPARINI, R.; AMICIZIA, D.

    2015-01-01

    Summary Introduction. Health-related knowledge is often assessed through multiple-choice tests. Among the different types of formats, researchers may opt to use multiple-mark items, i.e. with more than one correct answer. Although multiple-mark items have long been used in the academic setting – sometimes with scant or inconclusive results – little is known about the implementation of this format in research on in-field health education and promotion. Methods. A study population of secondary school students completed a survey on nutrition-related knowledge, followed by a single-lecture intervention. Answers were scored by means of eight different scoring algorithms and analyzed from the perspective of classical test theory. The same survey was re-administered to a sample of the students in order to evaluate the short-term change in their knowledge. Results. In all, 286 questionnaires were analyzed. Partial scoring algorithms displayed better psychometric characteristics than the dichotomous rule. In particular, the algorithm proposed by Ripkey and the balanced rule showed greater internal consistency and relative efficiency in scoring multiple-mark items. A penalizing algorithm in which the proportion of marked distracters was subtracted from that of marked correct answers was the only one that highlighted a significant difference in performance between natives and immigrants, probably owing to its slightly better discriminatory ability. This algorithm was also associated with the largest effect size in the pre-/post-intervention score change. Discussion. The choice of an appropriate rule for scoring multiple-mark items in research on health education and promotion should consider not only the psychometric properties of single algorithms but also the study aims and outcomes, since scoring rules differ in terms of bias, reliability, difficulty, sensitivity to guessing and discrimination. PMID:26900331
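
    The penalizing algorithm described above subtracts the proportion of marked distracters from the proportion of marked correct options. A minimal sketch of such a rule follows; the flooring at zero and the item key are illustrative assumptions, not the exact formula used in the study.

    ```python
    # Sketch of a penalizing scoring rule for a multiple-mark item:
    # proportion of marked correct options minus proportion of marked distracters,
    # floored at zero (the flooring is an assumption for this sketch).
    def penalizing_score(marked, correct_options, all_options):
        distracters = all_options - correct_options
        hit_rate = len(marked & correct_options) / len(correct_options)
        false_rate = len(marked & distracters) / len(distracters) if distracters else 0.0
        return max(0.0, hit_rate - false_rate)

    options = {"A", "B", "C", "D", "E"}
    key = {"A", "C", "E"}                                     # hypothetical correct answers
    print(penalizing_score({"A", "C", "E"}, key, options))    # 1.0 (all correct, no distracters)
    print(penalizing_score({"A", "B"}, key, options))         # 1/3 - 1/2 -> floored to 0.0
    print(penalizing_score({"A", "C"}, key, options))         # 0.667
    ```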

  11. Effect of Air Pollution on Exacerbations of Bronchiectasis in Badalona, Spain, 2008-2016.

    PubMed

    Garcia-Olivé, Ignasi; Stojanovic, Zoran; Radua, Joaquim; Rodriguez-Pons, Laura; Martinez-Rivera, Carlos; Ruiz Manzano, Juan

    2018-05-17

    Air pollution has been widely associated with respiratory diseases. Nevertheless, the association between air pollution and exacerbations of bronchiectasis has been less studied. To analyze the effect of air pollution on exacerbations of bronchiectasis. This was a retrospective observational study conducted in Badalona. The number of daily hospital admissions and emergency room visits related to exacerbation of bronchiectasis (ICD-9 code 494.1) between 2008 and 2016 was obtained. We used simple Poisson regressions to test the effects of daily mean temperature, SO2, NO2, CO, and PM10 levels on bronchiectasis-related emergencies and hospitalizations on the same day and 1-4 days after. All p values were corrected for multiple comparisons. SO2 was significantly associated with an increase in the number of hospitalizations (lags 0, 1, 2, and 3). None of these associations remained significant after correcting for multiple comparisons. The number of emergency room visits was associated with higher levels of SO2 (lags 0-4). After correcting for multiple comparisons, the association between emergency room visits and SO2 levels was statistically significant for lag 0 (p = 0.043), lag 1 (p = 0.018), and lag 3 (p = 0.050). The number of emergency room visits for exacerbation of bronchiectasis is associated with higher levels of SO2. © 2018 S. Karger AG, Basel.
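
    A lag-specific Poisson regression of daily counts on a pollutant level, with the resulting p-values corrected across the lags tested, can be sketched as follows. The daily series is synthetic and the effect size is invented; this is not the Badalona data set.

    ```python
    # Sketch: Poisson regressions of daily visit counts on SO2 at lags 0-4 days,
    # with a Bonferroni correction over the five lag-specific tests. Synthetic data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n_days = 1000
    so2 = rng.gamma(shape=3.0, scale=2.0, size=n_days)         # daily mean SO2 (arbitrary units)
    visits = rng.poisson(np.exp(-0.5 + 0.05 * so2))            # counts driven by same-day SO2

    p_values = []
    for lag in range(5):
        y = visits[lag:]
        x = sm.add_constant(so2[: n_days - lag])
        fit = sm.GLM(y, x, family=sm.families.Poisson()).fit()
        p_values.append(fit.pvalues[1])

    bonferroni = np.minimum(np.array(p_values) * len(p_values), 1.0)
    for lag, (p, pc) in enumerate(zip(p_values, bonferroni)):
        print(f"lag {lag}: p = {p:.3g}, corrected p = {pc:.3g}")
    ```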

  12. Measuring University students' understanding of the greenhouse effect - a comparison of multiple-choice, short answer and concept sketch assessment tools with respect to students' mental models

    NASA Astrophysics Data System (ADS)

    Gold, A. U.; Harris, S. E.

    2013-12-01

    The greenhouse effect comes up in most discussions about climate and is a key concept related to climate change. Existing studies have shown that students and adults alike lack a detailed understanding of this important concept or might hold misconceptions. We studied the effectiveness of different interventions on university-level students' understanding of the greenhouse effect. Introductory-level science students were tested for their pre-knowledge of the greenhouse effect using validated multiple-choice questions, short answers and concept sketches. All students participated in a common lesson about the greenhouse effect and were then randomly assigned to one of two lab groups. One group explored an existing simulation about the greenhouse effect (PhET lesson) and the other group worked with absorption spectra of different greenhouse gases (Data lesson) to deepen their understanding of the greenhouse effect. All students completed the same assessment, including multiple choice, short answers and concept sketches, after participation in their lab lesson. 164 students completed all the assessments: 76 completed the PhET lesson and 77 completed the data lesson, while 11 students missed the contrasting lesson. In this presentation we compare the multiple-choice questions, short answer questions and concept sketches of students. We explore how well each of these assessment types represents students' knowledge. We also identify items that are indicators of the level of understanding of the greenhouse effect, as measured by the correspondence of student answers to an expert mental model and expert responses. Preliminary data analysis shows that students who produce concept sketch drawings that come close to expert drawings also choose correct multiple-choice answers. However, correct multiple-choice answers do not necessarily indicate that a student produces the corresponding expert-like concept sketch items. Multiple-choice questions that require detailed knowledge of the greenhouse effect (e.g. the direction of re-emission of infrared energy from greenhouse gases) are significantly more likely to be answered correctly by students who also produce expert-like concept sketch items than by students who do not include this aspect in their sketches. This difference is not as apparent for less technical multiple-choice questions (e.g. the type of radiation emitted by the Sun). Our findings explore the formation of students' mental models throughout different interventions and how well the different assessment techniques used in this study represent students' understanding of the overall concept.

  13. [Experimental research of turbidity influence on water quality monitoring of COD in UV-visible spectroscopy].

    PubMed

    Tang, Bin; Wei, Biao; Wu, De-Cao; Mi, De-Ling; Zhao, Jing-Xiao; Feng, Peng; Jiang, Shang-Hai; Mao, Ben-Jiang

    2014-11-01

    Eliminating the influence of turbidity is a key technical problem in the direct spectroscopic detection of COD. UV-visible detection of water quality parameters depends on an accurate and effective analytical model, and turbidity is an important parameter affecting that model. In this paper, formazine turbidity solutions and standard solutions of potassium hydrogen phthalate were used to study the effect of turbidity on UV-visible absorption spectroscopy for COD detection. At the characteristic wavelengths of 245, 300, 360 and 560 nm, the change in absorbance with turbidity was fitted by least-squares curves in order to analyze how absorbance varies with turbidity. The results show that in the ultraviolet range of 240 to 380 nm, where particle-induced turbidity interacts with the organic compounds, the effect of turbidity on the ultraviolet spectra of water samples is relatively complicated; in the visible region of 380 to 780 nm, the turbidity contribution to the spectrum weakens as wavelength increases. On this basis, a multiplicative scatter correction method was applied to calibrate water-sample spectra affected by turbidity. Comparison of the spectra before and after treatment showed that the turbidity-induced baseline shifts were effectively corrected while the spectral features in the ultraviolet region were preserved. Multiplicative scatter correction was then applied to three selected UV-visible absorption spectra; the experimental results show that, while preserving the characteristic features of the UV-visible absorption spectra of the water samples, the method improves the signal-to-noise ratio of spectroscopic COD detection and provides an efficient data-conditioning scheme for establishing accurate chemical measurement models.
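    The record above applies multiplicative scatter correction (MSC) to turbid water-sample spectra but gives no implementation. Below is a minimal sketch of the standard MSC step, assuming the mean spectrum serves as the reference; the synthetic spectra, wavelength grid and function name are illustrative and not taken from the paper.

```python
import numpy as np

def multiplicative_scatter_correction(spectra, reference=None):
    """Apply multiplicative scatter correction (MSC) to a set of spectra.

    Each spectrum is modeled as x_i ~= a_i + b_i * reference; the additive
    (a_i) and multiplicative (b_i) terms are estimated by ordinary least
    squares and removed: x_corrected = (x_i - a_i) / b_i.
    """
    spectra = np.asarray(spectra, dtype=float)
    if reference is None:
        reference = spectra.mean(axis=0)          # mean spectrum as reference
    corrected = np.empty_like(spectra)
    for i, x in enumerate(spectra):
        b, a = np.polyfit(reference, x, deg=1)    # fit x = a + b * reference
        corrected[i] = (x - a) / b
    return corrected

# Synthetic example: a common spectral shape distorted by offset and scale.
base = np.sin(np.linspace(0, 3, 200)) + 2.0
raw = np.array([0.8 * base + 0.3, 1.2 * base - 0.1, 1.0 * base + 0.05])
print(np.round(multiplicative_scatter_correction(raw)[:, :5], 3))
```

    The main design choice is the reference spectrum: the mean of the calibration set is the usual default, but a known low-turbidity spectrum could be substituted.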

  14. System stability and calibrations for hand-held electromagnetic frequency domain instruments

    NASA Astrophysics Data System (ADS)

    Saksa, Pauli J.; Sorsa, Joona

    2017-05-01

    There are a few multiple-frequency domain electromagnetic induction (EMI) hand-held rigid-boom systems available for shallow geophysical resistivity investigations. They basically measure the real and imaginary components of the secondary field after system calibration. One multiple-frequency system, the EMP-400 Profiler from Geophysical Survey Systems Inc., was tested for system calibration, stability and various effects present in normal measurements, such as height variation, tilting, signal stacking and time stability. Results indicated that, in test conditions, repeatable high-accuracy imaginary component values can be recorded for near-surface frequency soundings. Real components are also stable in test conditions but vary strongly in normal surveying measurements. However, certain calibration issues related to the combination of user influence and measurement system height were recognised as an important factor in reducing data errors and in further processing steps such as static offset corrections.

  15. Limited Associations of Dopamine System Genes With Alcohol Dependence and Related Traits in the Irish Affected Sib Pair Study of Alcohol Dependence (IASPSAD)

    PubMed Central

    Hack, Laura M.; Kalsi, Gursharan; Aliev, Fazil; Kuo, Po-Hsiu; Prescott, Carol A.; Patterson, Diana G.; Walsh, Dermot; Dick, Danielle M.; Riley, Brien P.; Kendler, Kenneth S.

    2012-01-01

    Background Over 50 years of evidence from research has established that the central dopaminergic reward pathway is likely involved in alcohol dependence (AD). Additional evidence supports a role for dopamine (DA) in other disinhibitory psychopathology, which is often comorbid with AD. Family and twin studies demonstrate that a common genetic component accounts for most of the genetic variance in these traits. Thus, DA-related genes represent putative candidates for the genetic risk that underlies not only AD but also behavioral disinhibition. Many linkage and association studies have examined these relationships with inconsistent results, possibly because of low power, poor marker coverage, and/or an inappropriate correction for multiple testing. Methods We conducted an association study on the products encoded by 10 DA-related genes (DRD1-D5, SLC18A2, SLC6A3, DDC, TH, COMT) using a large, ethnically homogeneous sample with severe AD (n = 545) and screened controls (n = 509). We collected genotypes from linkage disequilibrium (LD)-tagging single nucleotide polymorphisms (SNPs) and employed a gene-based method of correction. We tested for association with AD diagnosis in cases and controls and with a variety of alcohol-related traits (including age-at-onset, initial sensitivity, tolerance, maximum daily drinks, and a withdrawal factor score), disinhibitory symptoms, and a disinhibitory factor score in cases only. A total of 135 SNPs were genotyped using the Illumina GoldenGate and Taqman Assays-on-Demand protocols. Results Of the 101 SNPs entered into standard analysis, 6 independent SNPs from 5 DA genes were associated with AD or a quantitative alcohol-related trait. Two SNPs across 2 genes were associated with a disinhibitory symptom count, while 1 SNP in DRD5 was positive for association with the general disinhibitory factor score. Conclusions Our study provides evidence of modest associations between a small number of DA-related genes and AD as well as a range of alcohol-related traits and measures of behavioral disinhibition. While we did conduct gene-based correction for multiple testing, we did not correct for multiple traits because the traits are correlated. However, false-positive findings remain possible, so our results must be interpreted with caution. PMID:21083670
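    The abstract mentions a gene-based correction for multiple testing without specifying the procedure. As a hedged illustration only, the sketch below applies a simple two-stage Bonferroni-style correction (within gene, then across genes); the function name and example p-values are hypothetical and the study's actual method may differ.

```python
from collections import defaultdict

def gene_based_bonferroni(snp_results):
    """Two-stage Bonferroni-style correction.

    snp_results: list of (gene, snp_id, p_value) tuples.  Within each gene
    the smallest p-value is multiplied by the number of SNPs tested in that
    gene; the resulting gene-level p-value is then multiplied by the number
    of genes.  Returns {gene: corrected_p}.
    """
    by_gene = defaultdict(list)
    for gene, snp, p in snp_results:
        by_gene[gene].append(p)
    n_genes = len(by_gene)
    corrected = {}
    for gene, pvals in by_gene.items():
        gene_p = min(1.0, min(pvals) * len(pvals))    # within-gene correction
        corrected[gene] = min(1.0, gene_p * n_genes)  # across-gene correction
    return corrected

# Hypothetical SNP results, not the study's data.
results = [("DRD2", "rs1", 0.004), ("DRD2", "rs2", 0.20),
           ("COMT", "rs3", 0.0009), ("COMT", "rs4", 0.03), ("COMT", "rs5", 0.6)]
print(gene_based_bonferroni(results))
```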

  16. Interference correction by extracting the information of interference dominant regions: Application to near-infrared spectra

    NASA Astrophysics Data System (ADS)

    Bi, Yiming; Tang, Liang; Shan, Peng; Xie, Qiong; Hu, Yong; Peng, Silong; Tan, Jie; Li, Changwen

    2014-08-01

    Interference such as baseline drift and light scattering can degrade model predictability in multivariate analysis of near-infrared (NIR) spectra. Such interference can usually be represented by an additive and a multiplicative factor. To eliminate these interferences, correction parameters need to be estimated from the spectra. However, the spectra are often a mixture of physical light-scattering effects and chemical absorbance effects, which makes parameter estimation difficult. Herein, a novel algorithm was proposed to automatically find a spectral region in which the chemical absorbance of interest and the noise are both low, that is, an interference-dominant region (IDR). Based on the definition of the IDR, a two-step method was proposed to find the optimal IDR and the corresponding correction parameters estimated from it. Finally, the correction was applied to the full spectral range using the previously obtained parameters for the calibration set and the test set, respectively. The method can be applied to multi-target systems, with one IDR suitable for all targeted analytes. Tested on two benchmark near-infrared data sets, the proposed method provided considerable improvement over full-spectrum estimation methods and was comparable with other state-of-the-art methods.
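    As a rough illustration of estimating correction parameters from an interference-dominant region and applying them to the full spectral range, the sketch below fits and subtracts a linear baseline from a hand-picked IDR. It covers only the additive part of the correction and omits the automatic two-step region search described in the abstract; all names and data are illustrative.

```python
import numpy as np

def idr_baseline_correction(spectra, wavelengths, idr_slice):
    """Subtract an additive baseline estimated from an interference region.

    A straight-line baseline is fitted to each spectrum inside the
    interference-dominant region (idr_slice), where chemical absorbance is
    assumed negligible, and subtracted over the full wavelength range.
    """
    spectra = np.atleast_2d(np.asarray(spectra, dtype=float))
    wl = np.asarray(wavelengths, dtype=float)
    corrected = np.empty_like(spectra)
    for i, x in enumerate(spectra):
        slope, intercept = np.polyfit(wl[idr_slice], x[idr_slice], deg=1)
        corrected[i] = x - (slope * wl + intercept)
    return corrected

# Synthetic NIR-like spectrum: one analyte band plus a sloped baseline drift.
wl = np.linspace(1100, 2500, 300)                 # wavelength axis in nm
band = np.exp(-((wl - 2200) / 40.0) ** 2)         # analyte absorbance band
drifted = band + 0.002 * (wl - 1100) + 0.3        # additive baseline drift
print(np.round(idr_baseline_correction(drifted, wl, slice(0, 120))[0, -5:], 3))
```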

  17. Adaptive offset correction for intracortical brain-computer interfaces.

    PubMed

    Homer, Mark L; Perge, Janos A; Black, Michael J; Harrison, Matthew T; Cash, Sydney S; Hochberg, Leigh R

    2014-03-01

    Intracortical brain-computer interfaces (iBCIs) decode intended movement from neural activity for the control of external devices such as a robotic arm. Standard approaches include a calibration phase to estimate decoding parameters. During iBCI operation, the statistical properties of the neural activity can depart from those observed during calibration, sometimes hindering a user's ability to control the iBCI. To address this problem, we adaptively correct the offset terms within a Kalman filter decoder via penalized maximum likelihood estimation. The approach can handle rapid shifts in neural signal behavior (on the order of seconds) and requires no knowledge of the intended movement. The algorithm, called multiple offset correction algorithm (MOCA), was tested using simulated neural activity and evaluated retrospectively using data collected from two people with tetraplegia operating an iBCI. In 19 clinical research test cases, where a nonadaptive Kalman filter yielded relatively high decoding errors, MOCA significantly reduced these errors (10.6 ± 10.1%; p < 0.05, pairwise t-test). MOCA did not significantly change the error in the remaining 23 cases where a nonadaptive Kalman filter already performed well. These results suggest that MOCA provides more robust decoding than the standard Kalman filter for iBCIs.
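    MOCA itself adapts the Kalman filter offsets via penalized maximum likelihood; the sketch below is a deliberately simplified stand-in that tracks a drifting observation offset with an exponentially weighted residual average, only to show where such a correction enters a linear decoding model. The class, variable names and toy data are hypothetical.

```python
import numpy as np

class AdaptiveOffsetTracker:
    """Track a drifting offset in a linear observation model.

    Assumes observations follow y_t = C @ x_t + d_t + noise and follows the
    slowly drifting offset d_t with an exponentially weighted average of the
    observation residuals.  (MOCA uses penalized maximum likelihood inside a
    Kalman filter; this running-mean update is only a simplified stand-in.)
    """

    def __init__(self, C, d0, alpha=0.1):
        self.C = np.asarray(C, dtype=float)
        self.d = np.asarray(d0, dtype=float).copy()
        self.alpha = alpha                      # adaptation rate

    def update(self, y, x_est):
        residual = y - self.C @ x_est - self.d  # part of y the model cannot explain
        self.d += self.alpha * residual         # drift the offset estimate
        return y - self.d                       # offset-corrected observation

# Toy usage: two "neural channels" driven by one kinematic state, with a
# slow baseline shift that the tracker should follow.
C = np.array([[1.0], [0.5]])
tracker = AdaptiveOffsetTracker(C, d0=np.zeros(2))
rng = np.random.default_rng(1)
for t in range(200):
    x_true = np.array([np.sin(t / 20.0)])
    drift = np.array([0.5, -0.3]) * (t / 200.0)
    y = C @ x_true + drift + 0.05 * rng.standard_normal(2)
    y_corrected = tracker.update(y, x_true)     # a decoder would consume this
print(np.round(tracker.d, 2))                   # approaches the final drift of about [0.5, -0.3]
```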

  18. Multiple-choice tests stabilize access to marginal knowledge.

    PubMed

    Cantor, Allison D; Eslick, Andrea N; Marsh, Elizabeth J; Bjork, Robert A; Bjork, Elizabeth Ligon

    2015-02-01

    Marginal knowledge refers to knowledge that is stored in memory, but is not accessible at a given moment. For example, one might struggle to remember who wrote The Call of the Wild, even if that knowledge is stored in memory. Knowing how best to stabilize access to marginal knowledge is important, given that new learning often requires accessing and building on prior knowledge. While even a single opportunity to restudy marginal knowledge boosts its later accessibility (Berger, Hall, & Bahrick, 1999), in many situations explicit relearning opportunities are not available. Our question is whether multiple-choice tests (which by definition expose the learner to the correct answers) can also serve this function and, if so, how testing compares to restudying given that tests can be particularly powerful learning devices (Roediger & Karpicke, 2006). In four experiments, we found that multiple-choice testing had the power to stabilize access to marginal knowledge, and to do so for at least up to a week. Importantly, such tests did not need to be paired with feedback, although testing was no more powerful than studying. Overall, the results support the idea that one's knowledge base is unstable, with individual pieces of information coming in and out of reach. The present findings have implications for a key educational challenge: ensuring that students have continuing access to information they have learned.

  19. Initial Correction versus Negative Marking in Multiple Choice Examinations

    ERIC Educational Resources Information Center

    Van Hecke, Tanja

    2015-01-01

    Optimal assessment tools should measure in a limited time the knowledge of students in a correct and unbiased way. A method for automating the scoring is multiple choice scoring. This article compares scoring methods from a probabilistic point of view by modelling the probability to pass: the number right scoring, the initial correction (IC) and…

  20. Non-Bayesian Noun Generalization in 3-to 5-Year-Old Children: Probing the Role of Prior Knowledge in the Suspicious Coincidence Effect

    ERIC Educational Resources Information Center

    Jenkins, Gavin W.; Samuelson, Larissa K.; Smith, Jodi R.; Spencer, John P.

    2015-01-01

    It is unclear how children learn labels for multiple overlapping categories such as "Labrador," "dog," and "animal." Xu and Tenenbaum (2007a) suggested that learners infer correct meanings with the help of Bayesian inference. They instantiated these claims in a Bayesian model, which they tested with preschoolers and…

  1. Contingency Contracting and Its Impact on the Use of Punctuation Skills by Fifth Graders with Learning Disabilities

    ERIC Educational Resources Information Center

    Grünke, Matthias; Coeppicus, Christin

    2017-01-01

    The purpose of this study was to assess the effectiveness of contingency contracting on the percentage of correctly used punctuation marks in free writing tasks. Participants were three 11-year-old boys with learning disabilities (LD). A multiple-baseline across-subjects design was employed to test our prediction that the students would show…

  2. Systemic Ecological Illiteracy? Shedding Light on Meaning as an Act of Thought in Higher Learning

    ERIC Educational Resources Information Center

    Puk, Thomas G.; Stibbards, Adam

    2012-01-01

    Research on ecological literacy often takes for granted that participants understand, and can construct the meaning within, the complex concepts involved, simply because they are able to use the appropriate terminology in a "fluent" manner and/or can select the correct option on multiple choice tests. In this study, and in the larger…

  3. Evaluating Reported Candidate Gene Associations with Polycystic Ovary Syndrome

    PubMed Central

    Pau, Cindy; Saxena, Richa; Welt, Corrine Kolka

    2013-01-01

    Objective To replicate variants in candidate genes associated with PCOS in a population of European PCOS and control subjects. Design Case-control association analysis and meta-analysis. Setting Major academic hospital. Patients Women of European ancestry with PCOS (n=525) and controls (n=472), aged 18 to 45 years. Intervention Variants previously associated with PCOS in candidate gene studies were genotyped (n=39). Metabolic, reproductive and anthropometric parameters were examined as a function of the candidate variants. All genetic association analyses were adjusted for age, BMI and ancestry and were reported after correction for multiple testing. Main Outcome Measure Association of candidate gene variants with PCOS. Results Three variants, rs3797179 (SRD5A1), rs12473543 (POMC), and rs1501299 (ADIPOQ), were nominally associated with PCOS. However, they did not remain significant after correction for multiple testing and none of the variants replicated in a sufficiently powered meta-analysis. Variants in the FBN3 gene (rs17202517 and rs73503752) were associated with smaller waist circumferences and variant rs727428 in the SHBG gene was associated with lower SHBG levels. Conclusion Previously identified variants in candidate genes do not appear to be associated with PCOS risk. PMID:23375202

  4. Detection of BCG bacteria using a magnetoresistive biosensor: A step towards a fully electronic platform for tuberculosis point-of-care detection.

    PubMed

    Barroso, Teresa G; Martins, Rui C; Fernandes, Elisabete; Cardoso, Susana; Rivas, José; Freitas, Paulo P

    2018-02-15

    Tuberculosis is one of the major public health concerns. This highly contagious disease affects more than 10.4 million people, being a leading cause of morbidity by infection. Tuberculosis is diagnosed at the point-of-care by the Ziehl-Neelsen sputum smear microscopy test. Ziehl-Neelsen is laborious, prone to human error and infection risk, with a limit of detection of 10^4 cells/mL. In resource-poor nations, a more practical test, with lower detection limit, is paramount. This work uses a magnetoresistive biosensor to detect BCG bacteria for tuberculosis diagnosis. Herein we report: i) nanoparticle assembly method and specificity for tuberculosis detection; ii) demonstration of proportionality between BCG cell concentration and magnetoresistive voltage signal; iii) application of multiplicative signal correction for systematic effects removal; iv) investigation of calibration effectiveness using chemometrics methods; and v) comparison with state-of-the-art point-of-care tuberculosis biosensors. Results present a clear correspondence between voltage signal and cell concentration. Multiplicative signal correction removes baseline shifts within and between biochip sensors, allowing accurate and precise voltage signal between different biochips. The corrected signal was used for multivariate regression models, which significantly decreased the calibration standard error from 0.50 to 0.03 log10(cells/mL). Results show that Ziehl-Neelsen detection limits and below are achievable with the magnetoresistive biochip, when pre-processing and chemometrics are used. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Consortium analysis of gene and gene-folate interactions in purine and pyrimidine metabolism pathways with ovarian carcinoma risk

    PubMed Central

    Kelemen, Linda E.; Terry, Kathryn L.; Goodman, Marc T.; Webb, Penelope M.; Bandera, Elisa V.; McGuire, Valerie; Rossing, Mary Anne; Wang, Qinggang; Dicks, Ed; Tyrer, Jonathan P.; Song, Honglin; Kupryjanczyk, Jolanta; Dansonka-Mieszkowska, Agnieszka; Plisiecka-Halasa, Joanna; Timorek, Agnieszka; Menon, Usha; Gentry-Maharaj, Aleksandra; Gayther, Simon A.; Ramus, Susan J.; Narod, Steven A.; Risch, Harvey A.; McLaughlin, John R.; Siddiqui, Nadeem; Glasspool, Rosalind; Paul, James; Carty, Karen; Gronwald, Jacek; Lubiński, Jan; Jakubowska, Anna; Cybulski, Cezary; Kiemeney, Lambertus A.; Massuger, Leon F. A. G.; van Altena, Anne M.; Aben, Katja K. H.; Olson, Sara H.; Orlow, Irene; Cramer, Daniel W.; Levine, Douglas A.; Bisogna, Maria; Giles, Graham G.; Southey, Melissa C.; Bruinsma, Fiona; Kjær, Susanne Krüger; Høgdall, Estrid; Jensen, Allan; Høgdall, Claus K.; Lundvall, Lene; Engelholm, Svend-Aage; Heitz, Florian; du Bois, Andreas; Harter, Philipp; Schwaab, Ira; Butzow, Ralf; Nevanlinna, Heli; Pelttari, Liisa M.; Leminen, Arto; Thompson, Pamela J.; Lurie, Galina; Wilkens, Lynne R.; Lambrechts, Diether; Van Nieuwenhuysen, Els; Lambrechts, Sandrina; Vergote, Ignace; Beesley, Jonathan; Fasching, Peter A.; Beckmann, Matthias W.; Hein, Alexander; Ekici, Arif B.; Doherty, Jennifer A.; Wu, Anna H.; Pearce, Celeste L.; Pike, Malcolm C.; Stram, Daniel; Chang-Claude, Jenny; Rudolph, Anja; Dörk, Thilo; Dürst, Matthias; Hillemanns, Peter; Runnebaum, Ingo B.; Bogdanova, Natalia; Antonenkova, Natalia; Odunsi, Kunle; Edwards, Robert P.; Kelley, Joseph L.; Modugno, Francesmary; Ness, Roberta B.; Karlan, Beth Y.; Walsh, Christine; Lester, Jenny; Orsulic, Sandra; Fridley, Brooke L.; Vierkant, Robert A.; Cunningham, Julie M.; Wu, Xifeng; Lu, Karen; Liang, Dong; Hildebrandt, Michelle A.T.; Weber, Rachel Palmieri; Iversen, Edwin S.; Tworoger, Shelley S.; Poole, Elizabeth M.; Salvesen, Helga B.; Krakstad, Camilla; Bjorge, Line; Tangen, Ingvild L.; Pejovic, Tanja; Bean, Yukie; Kellar, Melissa; Wentzensen, Nicolas; Brinton, Louise A.; Lissowska, Jolanta; Garcia-Closas, Montserrat; Campbell, Ian G.; Eccles, Diana; Whittemore, Alice S.; Sieh, Weiva; Rothstein, Joseph H.; Anton-Culver, Hoda; Ziogas, Argyrios; Phelan, Catherine M.; Moysich, Kirsten B.; Goode, Ellen L.; Schildkraut, Joellen M.; Berchuck, Andrew; Pharoah, Paul D.P.; Sellers, Thomas A.; Brooks-Wilson, Angela; Cook, Linda S.; Le, Nhu D.

    2014-01-01

    Scope We re-evaluated previously reported associations between variants in pathways of one-carbon (folate) transfer genes and ovarian carcinoma (OC) risk, and in related pathways of purine and pyrimidine metabolism, and assessed interactions with folate intake. Methods and Results Odds ratios (OR) for 446 genetic variants were estimated among 13,410 OC cases and 22,635 controls and among 2,281 cases and 3,444 controls with folate information. Following multiple testing correction, the most significant main effect associations were for DPYD variants rs11587873 (OR=0.92, P=6×10^-5) and rs828054 (OR=1.06, P=1×10^-4). Thirteen variants in the pyrimidine metabolism genes, DPYD, DPYS, PPAT and TYMS, also interacted significantly with folate in a multi-variant analysis (corrected P=9.9×10^-6) but collectively explained only 0.2% of OC risk. Although no other associations were significant after multiple testing correction, variants in SHMT1 in one-carbon transfer, previously reported with OC, suggested lower risk at higher folate (P-interaction=0.03–0.006). Conclusions Variation in pyrimidine metabolism genes, particularly DPYD, which was previously reported to be associated with OC, may influence risk; however, stratification by folate intake is unlikely to modify disease risk appreciably in these women. SHMT1 SNP-by-folate interactions are plausible but require further validation. Polymorphisms in selected genes in purine metabolism were not associated with OC. PMID:25066213

  6. Evaluation of the Vitek MS v3.0 Matrix-Assisted Laser Desorption Ionization-Time of Flight Mass Spectrometry System for Identification of Mycobacterium and Nocardia Species.

    PubMed

    Body, Barbara A; Beard, Melodie A; Slechta, E Susan; Hanson, Kimberly E; Barker, Adam P; Babady, N Esther; McMillen, Tracy; Tang, Yi-Wei; Brown-Elliott, Barbara A; Iakhiaeva, Elena; Vasireddy, Ravikiran; Vasireddy, Sruthi; Smith, Terry; Wallace, Richard J; Turner, S; Curtis, L; Butler-Wu, Susan; Rychert, Jenna

    2018-06-01

    This multicenter study was designed to assess the accuracy and reproducibility of the Vitek MS v3.0 matrix-assisted laser desorption ionization-time of flight (MALDI-TOF) mass spectrometry system for identification of Mycobacterium and Nocardia species compared to DNA sequencing. A total of 963 clinical isolates representing 51 taxa were evaluated. In all, 663 isolates were correctly identified to the species level (69%), with another 231 (24%) correctly identified to the complex or group level. Fifty-five isolates (6%) could not be identified despite repeat testing. All of the tuberculous mycobacteria (45/45; 100%) and most of the nontuberculous mycobacteria (569/606; 94%) were correctly identified at least to the group or complex level. However, not all species or subspecies within the M. tuberculosis, M. abscessus, and M. avium complexes and within the M. fortuitum and M. mucogenicum groups could be differentiated. Among the 312 Nocardia isolates tested, 236 (76%) were correctly identified to the species level, with an additional 44 (14%) correctly identified to the complex level. Species within the N. nova and N. transvalensis complexes could not always be differentiated. Eleven percent of the isolates (103/963) underwent repeat testing in order to get a final result. Identification of a representative set of Mycobacterium and Nocardia species was highly reproducible, with 297 of 300 (99%) replicates correctly identified using multiple kit lots, instruments, analysts, and sites. These findings demonstrate that the system is robust and has utility for the routine identification of mycobacteria and Nocardia in clinical practice. Copyright © 2018 American Society for Microbiology.

  7. Lorazepam induces multiple disturbances in selective attention: attentional overload, decrement in target processing efficiency, and shifts in perceptual discrimination and response bias.

    PubMed

    Michael, George Andrew; Bacon, Elisabeth; Offerlin-Meyer, Isabelle

    2007-09-01

    There is a general consensus that benzodiazepines affect attentional processes, yet only a few studies have tried to investigate these impairments in detail. The purpose of the present study was to investigate the effects of a single dose of lorazepam on performance in a target cancellation task with important time constraints. We measured correct target detections, correct distractor rejections, misses and false positives. The results show that lorazepam produces multiple kinds of shifts in performance, which suggests that it impairs multiple processes: (a) the evolution of performance over time was not the same in the placebo and lorazepam groups, with lorazepam affecting performance quite early after the beginning of the test, suggestive of a depletion of attentional resources during sequential attentional processing; (b) lorazepam affected target and distractor processing differently, with target detection being the most impaired; (c) misses were more frequent under lorazepam than under placebo, but no such difference was observed for false positives. Signal detection analyses showed that lorazepam (d) decreased perceptual discrimination, and (e) reliably increased response bias. Our results bring new insights into the multiple effects of lorazepam on selective attention which, when combined, may have deleterious effects on human performance.

  8. Percolation galaxy groups and clusters in the sdss redshift survey: identification, catalogs, and the multiplicity function

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berlind, Andreas A.; Frieman, Joshua A.; Weinberg, David H.

    2006-01-01

    We identify galaxy groups and clusters in volume-limited samples of the SDSS redshift survey, using a redshift-space friends-of-friends algorithm. We optimize the friends-of-friends linking lengths to recover galaxy systems that occupy the same dark matter halos, using a set of mock catalogs created by populating halos of N-body simulations with galaxies. Extensive tests with these mock catalogs show that no combination of perpendicular and line-of-sight linking lengths is able to yield groups and clusters that simultaneously recover the true halo multiplicity function, projected size distribution, and velocity dispersion. We adopt a linking length combination that yields, for galaxy groups with ten or more members: a group multiplicity function that is unbiased with respect to the true halo multiplicity function; an unbiased median relation between the multiplicities of groups and their associated halos; a spurious group fraction of less than ~1%; a halo completeness of more than ~97%; the correct projected size distribution as a function of multiplicity; and a velocity dispersion distribution that is ~20% too low at all multiplicities. These results hold over a range of mock catalogs that use different input recipes of populating halos with galaxies. We apply our group-finding algorithm to the SDSS data and obtain three group and cluster catalogs for three volume-limited samples that cover 3495.1 square degrees on the sky. We correct for incompleteness caused by fiber collisions and survey edges, and obtain measurements of the group multiplicity function, with errors calculated from realistic mock catalogs. These multiplicity function measurements provide a key constraint on the relation between galaxy populations and dark matter halos.
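    The group catalogs above are built with a redshift-space friends-of-friends algorithm. A minimal isotropic friends-of-friends sketch is shown below; the survey implementation uses separate perpendicular and line-of-sight linking lengths, which would replace the single Euclidean distance test used here. Function name and test points are illustrative.

```python
import numpy as np

def friends_of_friends(positions, linking_length):
    """Group points whose pairwise separation is below the linking length.

    Percolates groups by repeatedly adding any unassigned point closer than
    the linking length to a point already in the group.
    """
    positions = np.asarray(positions, dtype=float)
    n = len(positions)
    group = np.full(n, -1, dtype=int)
    next_group = 0
    for i in range(n):
        if group[i] != -1:
            continue
        group[i] = next_group
        stack = [i]
        while stack:
            j = stack.pop()
            d = np.linalg.norm(positions - positions[j], axis=1)
            friends = np.where((d < linking_length) & (group == -1))[0]
            group[friends] = next_group
            stack.extend(friends.tolist())
        next_group += 1
    return group

pts = np.array([[0, 0, 0], [0.5, 0, 0], [0.9, 0.1, 0], [5, 5, 5], [5.3, 5, 5]])
print(friends_of_friends(pts, linking_length=1.0))   # -> [0 0 0 1 1]
```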

  9. Station Correction Uncertainty in Multiple Event Location Algorithms and the Effect on Error Ellipses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Jason P.; Carlson, Deborah K.; Ortiz, Anne

    Accurate location of seismic events is crucial for nuclear explosion monitoring. There are several sources of error in seismic location that must be taken into account to obtain high confidence results. Most location techniques account for uncertainties in the phase arrival times (measurement error) and the bias of the velocity model (model error), but they do not account for the uncertainty of the velocity model bias. By determining and incorporating this uncertainty in the location algorithm we seek to improve the accuracy of the calculated locations and uncertainty ellipses. In order to correct for deficiencies in the velocity model, it is necessary to apply station-specific corrections to the predicted arrival times. Both master event and multiple event location techniques assume that the station corrections are known perfectly, when in reality there is an uncertainty associated with these corrections. For multiple event location algorithms that calculate station corrections as part of the inversion, it is possible to determine the variance of the corrections. The variance can then be used to weight the arrivals associated with each station, thereby giving more influence to stations with consistent corrections. We have modified an existing multiple event location program (based on PMEL, Pavlis and Booker, 1983). We are exploring weighting arrivals with the inverse of the station correction standard deviation as well as using the conditional probability of the calculated station corrections. This is in addition to the weighting already given to the measurement and modeling error terms. We re-locate a group of mining explosions that occurred at Black Thunder, Wyoming, and compare the results to those generated without accounting for station correction uncertainty.
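    The weighting idea described above, down-weighting arrivals from stations whose corrections are poorly determined, amounts to inverse-variance weighting. The sketch below shows only that weighting step on made-up residuals, not the full multiple-event relocation inversion; the function name and numbers are illustrative.

```python
import numpy as np

def weighted_station_mean(residuals, correction_std):
    """Combine arrival-time residuals, down-weighting inconsistent stations.

    Weights are the inverse variances of the estimated station corrections,
    so stations whose corrections scatter widely contribute less.
    """
    residuals = np.asarray(residuals, dtype=float)
    w = 1.0 / np.asarray(correction_std, dtype=float) ** 2
    return np.sum(w * residuals) / np.sum(w)

# Station 3 has a noisy correction estimate and is strongly down-weighted.
print(weighted_station_mean([0.12, 0.10, 0.55], correction_std=[0.05, 0.04, 0.40]))
```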

  10. Evaluation of intensity drift correction strategies using MetaboDrift, a normalization tool for multi-batch metabolomics data.

    PubMed

    Thonusin, Chanisa; IglayReger, Heidi B; Soni, Tanu; Rothberg, Amy E; Burant, Charles F; Evans, Charles R

    2017-11-10

    In recent years, mass spectrometry-based metabolomics has increasingly been applied to large-scale epidemiological studies of human subjects. However, the successful use of metabolomics in this context is subject to the challenge of detecting biologically significant effects despite substantial intensity drift that often occurs when data are acquired over a long period or in multiple batches. Numerous computational strategies and software tools have been developed to aid in correcting for intensity drift in metabolomics data, but most of these techniques are implemented using command-line driven software and custom scripts which are not accessible to all end users of metabolomics data. Further, it has not yet become routine practice to assess the quantitative accuracy of drift correction against techniques which enable true absolute quantitation such as isotope dilution mass spectrometry. We developed an Excel-based tool, MetaboDrift, to visually evaluate and correct for intensity drift in a multi-batch liquid chromatography - mass spectrometry (LC-MS) metabolomics dataset. The tool enables drift correction based on either quality control (QC) samples analyzed throughout the batches or using QC-sample independent methods. We applied MetaboDrift to an original set of clinical metabolomics data from a mixed-meal tolerance test (MMTT). The performance of the method was evaluated for multiple classes of metabolites by comparison with normalization using isotope-labeled internal standards. QC sample-based intensity drift correction significantly improved correlation with IS-normalized data, and resulted in detection of additional metabolites with significant physiological response to the MMTT. The relative merits of different QC-sample curve fitting strategies are discussed in the context of batch size and drift pattern complexity. Our drift correction tool offers a practical, simplified approach to drift correction and batch combination in large metabolomics studies. Copyright © 2017 Elsevier B.V. All rights reserved.
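    MetaboDrift is an Excel-based tool, so the sketch below only illustrates the underlying QC-based correction idea: fit a smooth curve to the QC-sample intensities as a function of injection order and rescale all samples by the fitted trend. The quadratic fit, function name and synthetic intensities are assumptions for illustration; the tool itself offers several curve-fit options.

```python
import numpy as np

def qc_drift_correction(intensity, run_order, is_qc, degree=2):
    """Normalize one metabolite's intensities using QC samples.

    A polynomial is fitted to the QC-sample intensities as a function of
    injection order, and every sample is divided by the fitted trend and
    re-scaled to the median QC intensity.
    """
    intensity = np.asarray(intensity, dtype=float)
    run_order = np.asarray(run_order, dtype=float)
    is_qc = np.asarray(is_qc, dtype=bool)
    coeffs = np.polyfit(run_order[is_qc], intensity[is_qc], deg=degree)
    trend = np.polyval(coeffs, run_order)
    return intensity * np.median(intensity[is_qc]) / trend

# Synthetic batch with a downward intensity drift; every third injection is a QC pool.
order = np.arange(12)
raw = 1000 - 25 * order + np.array([0, 5, -4, 8, 0, -6, 3, 7, -2, 4, 1, -3])
qc = (order % 3 == 0)
print(np.round(qc_drift_correction(raw, order, qc), 1))
```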

  11. Lessons learnt on biases and uncertainties in personal exposure measurement surveys of radiofrequency electromagnetic fields with exposimeters.

    PubMed

    Bolte, John F B

    2016-09-01

    Personal exposure measurements of radio frequency electromagnetic fields are important for epidemiological studies and for developing prediction models. Minimizing biases and uncertainties and handling spatial and temporal variability are important aspects of these measurements. This paper reviews the lessons learnt from testing the different types of exposimeters and from personal exposure measurement surveys performed between 2005 and 2015. Applying them will improve the comparability and ranking of exposure levels for different microenvironments, activities or (groups of) people, such that epidemiological studies are better able to find potential weak correlations with health effects. Over 20 papers have been published on how to prevent biases and minimize uncertainties due to mechanical errors; the design of hardware and software filters; anisotropy; and the influence of the body. A number of biases can be corrected for by determining multiplicative correction factors. In addition, a good protocol on how to wear the exposimeter, a sufficiently small sampling interval and a sufficiently long measurement duration will minimize biases. Corrections to biases are possible for: non-detects (through the detection limit), erroneous manufacturer calibration and temporal drift. Corrections not deemed necessary, because no significant biases have been observed, are: linearity of response and resolution. Corrections difficult to perform after measurements are for: modulation/duty-cycle sensitivity; out-of-band response (cross talk); temperature and humidity sensitivity. Corrections not possible to perform after measurements are for: detection of multiple signals in one band; flatness of response within a frequency band; anisotropy to waves of different elevation angles. An analysis of 20 microenvironmental surveys showed that early studies using exposimeters with logarithmic detectors overestimated exposure to signals with bursts, such as uplink signals from mobile phones and WiFi appliances. Further, the possible corrections for biases have not been fully applied. The main finding is that if the biases are not corrected for, the actual exposure will on average be underestimated. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Construction of a multiple myeloma diagnostic model by magnetic bead-based MALDI-TOF mass spectrometry of serum and pattern recognition software.

    PubMed

    Wang, Qing-Tao; Li, Yong-Zhe; Liang, Yu-Fang; Hu, Chao-Jun; Zhai, Yu-Hua; Zhao, Guan-Fei; Zhang, Jian; Li, Ning; Ni, An-Ping; Chen, Wen-Ming; Xu, Yang

    2009-04-01

    A diagnosis of multiple myeloma (MM) is difficult to make on the basis of any single laboratory test result. Accurate diagnosis of MM generally results from a number of costly and invasive laboratory tests and medical procedures. The aim of this work is to find a new, highly specific and sensitive method for MM diagnosis. Serum samples were tested in groups representing MM (n = 54) and non-MM (n = 108). These included a subgroup of 17 plasma cell dyscrasias, a subgroup of 17 reactive plasmacytosis, 5 B cell lymphomas, and 7 other tumors with osseous metastasis, as well as 62 healthy donors as controls. Bioinformatic calculations associated with MM were performed. The decision algorithm, with a panel of three biomarkers, correctly identified 24 of 24 (100%) MM samples and 46 of 49 (93.88%) non-MM samples in the training set. During the masked test for the discriminatory model, 26 of 30 MM patients (sensitivity, 86.67%) were precisely recognized, and all 34 normal donors were successfully classified; patients with reactive plasmacytosis were also correctly classified into the non-MM group, and 11 of the other patients were incorrectly classified as MM. The results suggested that proteomic fingerprint technology combining magnetic beads with MALDI-TOF-MS has the potential for identifying individuals with MM. The biomarker classification model was suitable for preliminary assessment of MM and could potentially serve as a useful tool for MM diagnosis and differential diagnosis.

  13. Testing of next-generation nonlinear calibration based non-uniformity correction techniques using SWIR devices

    NASA Astrophysics Data System (ADS)

    Lovejoy, McKenna R.; Wickert, Mark A.

    2017-05-01

    A known problem with infrared imaging devices is their non-uniformity. This non-uniformity is the result of dark current and amplifier mismatch as well as the individual photo response of the detectors. To improve performance, non-uniformity correction (NUC) techniques are applied. Standard calibration techniques use linear or piecewise-linear models to approximate the non-uniform gain and offset characteristics as well as the nonlinear response. Piecewise-linear models perform better than the one- and two-point models, but in many cases require storing an unmanageable number of correction coefficients. Most nonlinear NUC algorithms use a second-order polynomial to improve performance and allow for a minimal number of stored coefficients. However, advances in technology now make higher-order polynomial NUC algorithms feasible. This study comprehensively tests higher-order polynomial NUC algorithms targeted at short-wave infrared (SWIR) imagers. Using data collected from actual SWIR cameras, the nonlinear techniques and corresponding performance metrics are compared with current linear methods, including the standard one- and two-point algorithms. Machine learning, including principal component analysis, is explored for identifying and replacing bad pixels. The data sets are analyzed and the impact of hardware implementation is discussed. Average floating-point results show 30% less non-uniformity in post-corrected data when using a third-order polynomial correction algorithm rather than a second-order algorithm. To maximize overall performance, a trade-off analysis of polynomial order and coefficient precision is performed. Comprehensive testing across multiple data sets provides next-generation model validation and performance benchmarks for higher-order polynomial NUC methods.
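    As a sketch of what a higher-order polynomial calibration-based NUC involves, the code below fits a third-order polynomial per pixel that maps raw response onto the frame-mean response over several uniform illumination levels, then applies it to a raw frame. The tiny synthetic "detector" and function names are illustrative; hardware constraints such as coefficient precision, discussed in the abstract, are ignored.

```python
import numpy as np

def fit_polynomial_nuc(calib_frames, order=3):
    """Fit per-pixel polynomial non-uniformity correction coefficients.

    calib_frames: array of shape (n_levels, rows, cols) taken at several
    uniform illumination levels.  Each pixel's raw response is mapped onto
    the frame-mean response with a polynomial of the given order.
    """
    calib = np.asarray(calib_frames, dtype=float)
    n_levels, rows, cols = calib.shape
    target = calib.reshape(n_levels, -1).mean(axis=1)   # desired output per level
    coeffs = np.empty((order + 1, rows, cols))
    flat = calib.reshape(n_levels, rows * cols)
    for p in range(rows * cols):
        c = np.polyfit(flat[:, p], target, deg=order)
        coeffs[:, p // cols, p % cols] = c
    return coeffs

def apply_nuc(frame, coeffs):
    """Apply the per-pixel polynomial correction to a raw frame."""
    order = coeffs.shape[0] - 1
    out = np.zeros_like(frame, dtype=float)
    for k in range(order + 1):                          # np.polyval ordering
        out += coeffs[k] * np.asarray(frame, dtype=float) ** (order - k)
    return out

# Synthetic calibration: 5 illumination levels, 2x2 detector with gain/offset
# spread and a mild nonlinearity on one pixel.
levels = np.array([100.0, 400.0, 800.0, 1200.0, 1600.0])
gain = np.array([[0.9, 1.1], [1.0, 0.95]])
offset = np.array([[20.0, -15.0], [5.0, 0.0]])
frames = gain * levels[:, None, None] + offset
frames[:, 0, 0] += 1e-5 * levels ** 2
coeffs = fit_polynomial_nuc(frames, order=3)
print(np.round(apply_nuc(frames[2], coeffs), 1))        # nearly uniform output
```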

  14. A Two-Tier Multiple Choice Questions to Diagnose Thermodynamic Misconception of Thai and Laos Students

    NASA Astrophysics Data System (ADS)

    Kamcharean, Chanwit; Wattanakasiwich, Pornrat

    The objective of this study was to diagnose misconceptions of Thai and Lao students in thermodynamics by using a two-tier multiple-choice test. Two-tier multiple-choice questions consist of a first tier, a content-based question, and a second tier, a reasoning-based question. Data on student understanding were collected using 10 two-tier multiple-choice questions. Thai participants were first-year students (N = 57) taking a fundamental physics course at Chiang Mai University in 2012. Lao participants were high school students in Grade 11 (N = 57) and Grade 12 (N = 83) at Muengnern high school in Xayaboury province, Lao PDR. Most students answered the content-tier questions correctly but chose incorrect answers for the reason-tier questions. When further investigating their incorrect reasons, we found misconceptions similar to those reported in previous studies, such as incorrectly relating pressure with temperature when presented with multiple variables.

  15. Gender differences in judgments of multiple emotions from facial expressions.

    PubMed

    Hall, Judith A; Matsumoto, David

    2004-06-01

    The authors tested gender differences in emotion judgments by utilizing a new judgment task (Studies 1 and 2) and presenting stimuli at the edge of conscious awareness (Study 2). Women were more accurate than men even under conditions of minimal stimulus information. Women's ratings were more variable across scales, and they rated correct target emotions higher than did men. Copyright 2004 American Psychological Association

  16. Detection of suboptimal effort with symbol span: development of a new embedded index.

    PubMed

    Young, J Christopher; Caron, Joshua E; Baughman, Brandon C; Sawyer, R John

    2012-03-01

    Developing embedded indicators of suboptimal effort on objective neurocognitive testing is essential for detecting increasingly sophisticated forms of symptom feigning. The current study explored whether Symbol Span, a novel Wechsler Memory Scale-fourth edition measure of supraspan visual attention, could be used to discriminate adequate effort from suboptimal effort. Archival data were collected from 136 veterans classified into Poor Effort (n = 42) and Good Effort (n = 94) groups based on symptom validity test (SVT) performance. The Poor Effort group had significantly lower raw scores (p < .001) and age-corrected scaled scores (p < .001) than the Good Effort group on the Symbol Span test. A raw score cutoff of <14 produced 83% specificity and 50% sensitivity for detection of Poor Effort. Similarly, sensitivity was 52% and specificity was 84% when employing a cutoff of <7 for Age-Corrected Scale Score. Collectively, present results suggest that Symbol Span can effectively differentiate veterans with multiple failures on established free-standing and embedded SVTs.
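    The reported cutoffs are simple threshold rules whose sensitivity and specificity can be computed directly. The sketch below does this for a "<14 raw score" style rule on fabricated scores; the simulated data are not the study's data, and the function name is an assumption.

```python
import numpy as np

def cutoff_performance(scores, poor_effort, cutoff):
    """Sensitivity and specificity of a 'score below cutoff' effort flag."""
    scores = np.asarray(scores)
    poor = np.asarray(poor_effort, dtype=bool)
    flagged = scores < cutoff
    sensitivity = np.mean(flagged[poor])        # poor-effort cases caught
    specificity = np.mean(~flagged[~poor])      # good-effort cases cleared
    return sensitivity, specificity

# Fabricated score distributions for 94 good-effort and 42 poor-effort cases.
rng = np.random.default_rng(2)
good = rng.normal(18, 3, 94).round()
bad = rng.normal(13, 4, 42).round()
scores = np.concatenate([good, bad])
labels = np.concatenate([np.zeros(94, bool), np.ones(42, bool)])
print(cutoff_performance(scores, labels, cutoff=14))
```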

  17. Laboratory studies of scales for measuring helicopter noise

    NASA Technical Reports Server (NTRS)

    Ollerhead, J. B.

    1982-01-01

    The adequacy of the effective perceived noise level (EPNL) procedure for rating helicopter noise annoyance was investigated. Recordings of 89 helicopters and 30 fixed wing aircraft (CTOL) flyover sounds were rated with respect to annoyance by groups of approximately 40 subjects. The average annoyance scores were transformed to annoyance levels defined as the equally annoying sound levels of a fixed reference sound. The sound levels of the test sounds were measured on various scales, with and without corrections for duration, tones, and impulsiveness. On average, the helicopter sounds were judged equally annoying to CTOL sounds when their duration corrected levels are approximately 2 dB higher. Multiple regression analysis indicated that, provided the helicopter/CTOL difference of about 2 dB is taken into account, the particular linear combination of level, duration, and tone corrections inherent in EPNL is close to optimum. The results reveal no general requirement for special EPNL correction terms to penalize helicopter sounds which are particularly impulsive; impulsiveness causes spectral and temporal changes which themselves adequately amplify conventionally measured sound levels.

  18. Cancer prevention knowledge of people with profound hearing loss.

    PubMed

    Zazove, Philip; Meador, Helen E; Reed, Barbara D; Sen, Ananda; Gorenflo, Daniel W

    2009-03-01

    Deaf persons, a documented minority population, have low reading levels and difficulty communicating with physicians. The effect of these on their knowledge of cancer prevention recommendations is unknown. A cross-sectional study of 222 d/Deaf persons in Michigan, age 18 and older, chose one of four ways (voice, video of a certified American Sign Language interpreter, captions, or printed English) to complete a self-administered computer video questionnaire about demographics, hearing loss, language history, health-care utilization, and health-care information sources, as well as family and social variables. Twelve questions tested their knowledge of cancer prevention recommendations. The outcome measures were the percentage of correct answers to the questions and the association of multiple variables with these responses. Participants averaged 22.9% correct answers with no gender difference. Univariate analysis revealed that smoking history, types of medical problems, last physician visit, and women having previous cancer preventive tests did not affect scores. Improved scores occurred with computer use (p = 0.05), higher education (p < 0.01) and income (p = 0.01), hearing spouses (p < 0.01), speaking English in multiple situations (p < 0.001), and in men with previous prostate cancer testing (p = 0.04). Obtaining health information from books (p = 0.05), physicians (p = 0.008), nurses (p = 0.03) or the internet (p = 0.02), and believing that smoking is bad (p < 0.001) also improved scores. Multivariate analysis revealed that English use (p = 0.01) and believing that smoking was bad (p = 0.05) were associated with improved scores. Persons with profound hearing loss have poor knowledge of recommended cancer prevention interventions. English use in multiple settings was strongly associated with increased knowledge.

  19. Epigenetic Variation in the Mu-opioid Receptor Gene in Infants with Neonatal Abstinence Syndrome

    PubMed Central

    Wachman, Elisha M; Hayes, Marie J; Lester, Barry M; Terrin, Norma; Brown, Mark S; Nielsen, David A; Davis, Jonathan M

    2014-01-01

    Objective Neonatal abstinence syndrome (NAS) from in utero opioid exposure is highly variable with genetic factors appearing to play an important role. Epigenetic changes in cytosine:guanine (CpG) dinucleotide methylation can occur after drug exposure and may help to explain NAS variability. We measured DNA methylation levels in the mu-opioid receptor (OPRM1) promoter of opioid-exposed infants and correlated them with NAS outcomes. Study design DNA samples from cord blood or saliva were analyzed for 86 infants being treated for NAS according to institutional protocol. Methylation levels at 16 OPRM1 CpG sites were determined and correlated with NAS outcome measures, including need for treatment, treatment with >2 medications, and length of hospital stay. We adjusted for co-variates and multiple genetic testing. Results Sixty-five percent of infants required treatment for NAS, and 24% required ≥2 medications. Hypermethylation of the OPRM1 promoter was measured at the −10 CpG in treated versus non-treated infants [adjusted difference δ=3.2% (95% CI 0.3–6.0%), p=0.03; NS after multiple testing correction]. There was hypermethylation at the −14 [δ=4.9% (95% CI 1.8–8.1%), p=0.003], −10 [δ=5.0% (95% CI 2.3–7.7%), p=0.0005)], and +84 [δ=3.5% (95% CI 0.6–6.4%), p=0.02] CpG sites in infants requiring ≥2 medications, which remained significant for −14 and −10 after multiple testing correction. Conclusions Increased methylation within the OPRM1 promoter is associated with worse NAS outcomes, consistent with gene silencing. PMID:24996986

  20. Cancer Prevention Knowledge of People with Profound Hearing Loss

    PubMed Central

    Meador, Helen E.; Reed, Barbara D.; Sen, Ananda; Gorenflo, Daniel W.

    2009-01-01

    BACKGROUND Deaf persons, a documented minority population, have low reading levels and difficulty communicating with physicians. The effect of these on their knowledge of cancer prevention recommendations is unknown. METHODS A cross-sectional study of 222 d/Deaf persons in Michigan, age 18 and older, chose one of four ways (voice, video of a certified American Sign Language interpreter, captions, or printed English) to complete a self-administered computer video questionnaire about demographics, hearing loss, language history, health-care utilization, and health-care information sources, as well as family and social variables. Twelve questions tested their knowledge of cancer prevention recommendations. The outcome measures were the percentage of correct answers to the questions and the association of multiple variables with these responses. RESULTS Participants averaged 22.9% correct answers with no gender difference. Univariate analysis revealed that smoking history, types of medical problems, last physician visit, and women having previous cancer preventive tests did not affect scores. Improved scores occurred with computer use (p = 0.05), higher education (p < 0.01) and income (p = 0.01), hearing spouses (p < 0.01), speaking English in multiple situations (p < 0.001), and in men with previous prostate cancer testing (p = 0.04). Obtaining health information from books (p = 0.05), physicians (p = 0.008), nurses (p = 0.03) or the internet (p = 0.02), and believing that smoking is bad (p < 0.001) also improved scores. Multivariate analysis revealed that English use (p = 0.01) and believing that smoking was bad (p = 0.05) were associated with improved scores. CONCLUSION Persons with profound hearing loss have poor knowledge of recommended cancer prevention interventions. English use in multiple settings was strongly associated with increased knowledge. PMID:19132325

  1. Certainty rating in pre- and post-tests of study modules in an online clinical pharmacy course - A pilot study to evaluate teaching and learning.

    PubMed

    Luetsch, Karen; Burrows, Judith

    2016-10-14

    Graduate and post-graduate education for health professionals is increasingly delivered in an e-learning environment, where automated, continuous formative testing with integrated feedback can guide students' self-assessment and learning. Asking students to rate the certainty they assign to the correctness of their answers to test questions can potentially provide deeper insights into the success of teaching, with test results informing course designers whether learning outcomes have been achieved. It may also have implications for decision making in clinical practice. A study of pre- and post-tests for five study modules was designed to evaluate the teaching and learning within a pharmacotherapeutic course in an online postgraduate clinical pharmacy program. Certainty-based marking of multiple choice questions (MCQ) was adapted for formative pre- and post-study module testing by asking students to rate their certainty of correctness of MCQ answers. Paired t-tests and a coding scheme were used to analyse changes in answers and certainty between pre- and post-tests. A survey evaluated students' experience with the novel formative testing design. Twenty-nine pharmacists enrolled in the postgraduate program participated in the study. Overall, 1315 matched pairs of MCQ answers and certainty ratings between pre- and post-module tests were available for evaluation. Most students identified correct answers in post-tests and increased their certainty compared to pre-tests. Evaluation of certainty ratings, in addition to the correctness of answers, identified for course designers MCQs and topic areas in need of revision. A survey of students showed that assigning certainty ratings to their answers assisted in structuring and focusing their learning throughout online study modules, facilitating identification of areas of uncertainty and gaps in their clinical knowledge. Adding certainty ratings to MCQ answers seems to engage students with formative testing and feedback and focus their learning in a web-based postgraduate pharmacy course. It also offers deeper insight into the successful delivery of online course content, identifying areas for improvement of teaching and content delivery as well as test question design.

  2. Testing for a slope-based decoupling algorithm in a woofer-tweeter adaptive optics system.

    PubMed

    Cheng, Tao; Liu, WenJin; Yang, KangJian; He, Xin; Yang, Ping; Xu, Bing

    2018-05-01

    It is well known that using two or more deformable mirrors (DMs) can improve the compensation ability of an adaptive optics (AO) system. However, to keep an AO system stable, the correlation between the multiple DMs must be suppressed during the correction. In this paper, we proposed a slope-based decoupling algorithm to simultaneously control multiple DMs. To examine the validity and practicality of this algorithm, a typical woofer-tweeter (W-T) AO system was set up. A theoretical model of the W-T system was simulated, and the results indicated that the proposed algorithm can selectively make the woofer and the tweeter correct different spatial-frequency aberrations and suppress the cross-coupling between the dual DMs. At the same time, the experimental results for the W-T AO system were consistent with the simulation, which demonstrated in practice that this algorithm is practical for an AO system with dual DMs.

  3. SU-E-T-472: Improvement of IMRT QA Passing Rate by Correcting Angular Dependence of MatriXX

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Q; Watkins, W; Kim, T

    2015-06-15

    Purpose: Multi-channel planar detector arrays utilized for IMRT QA, such as the MatriXX, exhibit an incident-beam angular-dependent response which can result in false-positive gamma-based QA results, especially for helical tomotherapy plans, which encompass the full range of beam angles. Although the MatriXX can be used with a gantry angle sensor to apply an automatic angular correction, this sensor does not work with tomotherapy. The purpose of this study is to reduce IMRT-QA false-positives by correcting for the MatriXX angular dependence. Methods: MatriXX angular dependence was characterized by comparing multiple fixed-angle irradiation measurements with corresponding TPS-computed doses. For 81 Tomo-helical IMRT-QA measurements, two different correction schemes were tested: (1) a Monte Carlo dose engine was used to compute the MatriXX signal based on the angular-response curve, and the computed signal was then compared with measurement; (2) the uncorrected computed signal was compared with measurements uniformly scaled to account for the average angular dependence. Three scaling factors (+2%, +2.5%, +3%) were tested. Results: The MatriXX response is 8% less than predicted for a PA beam even when the couch is fully accounted for. Without angular correction, only 67% of the cases pass the criterion of >90% of points with γ<1 (3%, 3 mm). After full angular correction, 96% of the cases pass the criterion. Of the three scaling factors, +2% gave the highest passing rate (89%), which is still less than that of the full angular correction method. With a stricter γ (2%, 3 mm) criterion, the full angular correction method was still able to achieve a 90% passing rate while the scaling method only gave a 53% passing rate. Conclusion: Correction for the MatriXX angular dependence reduced the false-positive rate of our IMRT-QA process. It is necessary to correct for the angular dependence to achieve the IMRT passing criteria specified in TG-129.

  4. Improved electron probe microanalysis of trace elements in quartz

    USGS Publications Warehouse

    Donovan, John J.; Lowers, Heather; Rusk, Brian G.

    2011-01-01

    Quartz occurs in a wide range of geologic environments throughout the Earth's crust. The concentration and distribution of trace elements in quartz provide information such as temperature and other physical conditions of formation. Trace element analyses with modern electron-probe microanalysis (EPMA) instruments can achieve 99% confidence detection of ~100 ppm with fairly minimal effort for many elements in samples of low to moderate average atomic number such as many common oxides and silicates. However, trace element measurements below 100 ppm in many materials are limited, not only by the precision of the background measurement, but also by the accuracy with which background levels are determined. A new "blank" correction algorithm has been developed and tested on both Cameca and JEOL instruments, which applies a quantitative correction to the emitted X-ray intensities during the iteration of the sample matrix correction based on a zero level (or known trace) abundance calibration standard. This iterated blank correction, when combined with improved background fit models, and an "aggregate" intensity calculation utilizing multiple spectrometer intensities in software for greater geometric efficiency, yields a detection limit of 2 to 3 ppm for Ti and 6 to 7 ppm for Al in quartz at 99% t-test confidence with similar levels for absolute accuracy.

  5. Improvement of structural models using covariance analysis and nonlinear generalized least squares

    NASA Technical Reports Server (NTRS)

    Glaser, R. J.; Kuo, C. P.; Wada, B. K.

    1992-01-01

    The next generation of large, flexible space structures will be too light to support their own weight, requiring a system of structural supports for ground testing. The authors have proposed multiple boundary-condition testing (MBCT), using more than one support condition to reduce uncertainties associated with the supports. MBCT would revise the mass and stiffness matrix, analytically qualifying the structure for operation in space. The same procedure is applicable to other common test conditions, such as empty/loaded tanks and subsystem/system level tests. This paper examines three techniques for constructing the covariance matrix required by nonlinear generalized least squares (NGLS) to update structural models based on modal test data. The methods range from a complicated approach used to generate the simulation data (i.e., the correct answer) to a diagonal matrix based on only two constants. The results show that NGLS is very insensitive to assumptions about the covariance matrix, suggesting that a workable NGLS procedure is possible. The examples also indicate that the multiple boundary condition procedure more accurately reduces errors than individual boundary condition tests alone.

  6. Students’ understanding of forces: Force diagrams on horizontal and inclined plane

    NASA Astrophysics Data System (ADS)

    Sirait, J.; Hamdani; Mursyid, S.

    2018-03-01

    This study aims to analyse students' difficulties in understanding force diagrams on horizontal surfaces and inclined planes. Physics education students (pre-service physics teachers) of Tanjungpura University, who had completed a Basic Physics course, took a force concept test with six questions covering three concepts: an object at rest, an object moving at constant speed, and an object moving at constant acceleration, both on a horizontal surface and on an inclined plane. The test is in a multiple-choice format and examines the ability of students to select appropriate force diagrams depending on the context. The results show that 44% of students had difficulty solving the test (these students could solve only one or two of the six items). About 50% of students had difficulty finding the correct diagram for an object moving at constant speed or constant acceleration in both contexts. In general, students could only correctly identify 48% of the force diagrams on the test. The most difficult task for the students was identifying the force diagram representing the forces exerted on an object on an inclined plane.

  7. Artificial neural network EMG classifier for functional hand grasp movements prediction.

    PubMed

    Gandolla, Marta; Ferrante, Simona; Ferrigno, Giancarlo; Baldassini, Davide; Molteni, Franco; Guanziroli, Eleonora; Cotti Cottini, Michele; Seneci, Carlo; Pedrocchi, Alessandra

    2017-12-01

    Objective To design and implement an electromyography (EMG)-based controller for a hand robotic assistive device, which is able to classify the user's motion intention before the effective kinematic movement execution. Methods Multiple degrees-of-freedom hand grasp movements (i.e. pinching, grasp an object, grasping) were predicted by means of surface EMG signals, recorded from 10 bipolar EMG electrodes arranged in a circular configuration around the forearm 2-3 cm from the elbow. Two cascaded artificial neural networks were then exploited to detect the patient's motion intention from the EMG signal window starting from the electrical activity onset to movement onset (i.e. electromechanical delay). Results The proposed approach was tested on eight healthy control subjects (4 females; age range 25-26 years) and it demonstrated a mean ± SD testing performance of 76% ± 14% for correctly predicting healthy users' motion intention. Two post-stroke patients tested the controller and obtained 79% and 100% of correctly classified movements under testing conditions. Conclusion A task-selection controller was developed to estimate the intended movement from the EMG measured during the electromechanical delay.
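
    As a rough illustration of the cascaded-classifier idea (not the authors' architecture), the sketch below uses two small scikit-learn networks: the first decides whether an EMG window contains a movement intention at all, the second labels the grasp type. The RMS features, layer sizes, and label encoding are assumptions.

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      def rms_features(window):
          # window: samples x 10 channels -> one RMS value per channel.
          return np.sqrt(np.mean(np.square(window), axis=0))

      # Stage 1: rest vs movement intention; stage 2: grasp type.
      stage1 = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000)
      stage2 = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000)
      # Both must be fitted first, e.g. stage1.fit(X, y_move) on all windows
      # and stage2.fit(X_move, y_grasp) on the movement windows only.

      def predict_intention(window):
          x = rms_features(window).reshape(1, -1)
          if stage1.predict(x)[0] == 0:        # 0 = no movement intention
              return "rest"
          return stage2.predict(x)[0]          # e.g. "pinch", "grasp_object"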

  8. Web-Based Evaluation System to Measure Learning Effectiveness in Kampo Medicine

    PubMed Central

    Usuku, Koichiro; Segawa, Makoto; Wang, Yue; Ogashiwa, Kahori; Fujita, Yusuke; Ogihara, Hiroyuki; Tazuma, Susumu

    2016-01-01

    Measuring the learning effectiveness of Kampo Medicine (KM) education is challenging. The aim of this study was to develop a web-based test to measure the learning effectiveness of KM education among medical students (MSs). We used an open-source Moodle platform to test 30 multiple-choice questions classified into 8-type fields (eight basic concepts of KM) including “qi-blood-fluid” and “five-element” theories, on 117 fourth-year MSs. The mean (±standard deviation [SD]) score on the web-based test was 30.2 ± 11.9 (/100). The correct answer rate ranged from 17% to 36%. A pattern-based portfolio enabled these rates to be individualized in terms of KM proficiency. MSs with scores higher (n = 19) or lower (n = 14) than mean ± 1SD were defined as high or low achievers, respectively. Cluster analysis using the correct answer rates for the 8-type field questions revealed clear divisions between high and low achievers. Interestingly, each high achiever had a different proficiency pattern. In contrast, three major clusters were evident among low achievers, all of whom responded with a low percentage of or no correct answers. In addition, a combination of three questions accurately classified high and low achievers. These findings suggest that our web-based test allows individual quantitative assessment of the learning effectiveness of KM education among MSs. PMID:27738440

  9. Web-Based Evaluation System to Measure Learning Effectiveness in Kampo Medicine.

    PubMed

    Iizuka, Norio; Usuku, Koichiro; Nakae, Hajime; Segawa, Makoto; Wang, Yue; Ogashiwa, Kahori; Fujita, Yusuke; Ogihara, Hiroyuki; Tazuma, Susumu; Hamamoto, Yoshihiko

    2016-01-01

    Measuring the learning effectiveness of Kampo Medicine (KM) education is challenging. The aim of this study was to develop a web-based test to measure the learning effectiveness of KM education among medical students (MSs). We used an open-source Moodle platform to test 30 multiple-choice questions classified into 8-type fields (eight basic concepts of KM) including "qi-blood-fluid" and "five-element" theories, on 117 fourth-year MSs. The mean (±standard deviation [SD]) score on the web-based test was 30.2 ± 11.9 (/100). The correct answer rate ranged from 17% to 36%. A pattern-based portfolio enabled these rates to be individualized in terms of KM proficiency. MSs with scores higher (n = 19) or lower (n = 14) than mean ± 1SD were defined as high or low achievers, respectively. Cluster analysis using the correct answer rates for the 8-type field questions revealed clear divisions between high and low achievers. Interestingly, each high achiever had a different proficiency pattern. In contrast, three major clusters were evident among low achievers, all of whom responded with a low percentage of or no correct answers. In addition, a combination of three questions accurately classified high and low achievers. These findings suggest that our web-based test allows individual quantitative assessment of the learning effectiveness of KM education among MSs.
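
    The cluster analysis step can be illustrated with a few lines of SciPy; the data layout (one row per student, one column per field of correct-answer rates), the Ward linkage, and the number of clusters are assumptions for the sketch, not details taken from the study.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      # rates: students x 8 fields, each entry a correct-answer rate (simulated).
      rates = np.random.default_rng(0).uniform(0.0, 1.0, size=(33, 8))

      Z = linkage(rates, method="ward")                  # agglomerative tree
      labels = fcluster(Z, t=3, criterion="maxclust")    # cut into 3 clusters
      print(labels)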

  10. Establishing books as conditioned reinforcers for preschool children as a function of an observational intervention.

    PubMed

    Singer-Dudek, Jessica; Oblak, Mara; Greer, R Douglas

    2011-01-01

    We tested the effects of an observational intervention (Greer & Singer-Dudek, 2008) on establishing children's books as conditioned reinforcers using a delayed multiple baseline design. Three preschool students with mild language and developmental delays served as the participants. Prior to the intervention, books did not function as reinforcers for any of the participants. The observational intervention consisted of a situation in which the participant observed a confederate being presented with access to books contingent on correct responses and the participant received nothing for correct responses. After several sessions of this treatment, the previously neutral books acquired reinforcing properties for maintenance and acquisition responses for all three participants.

  11. ESTABLISHING BOOKS AS CONDITIONED REINFORCERS FOR PRESCHOOL CHILDREN AS A FUNCTION OF AN OBSERVATIONAL INTERVENTION

    PubMed Central

    Singer-Dudek, Jessica; Oblak, Mara; Greer, R. Douglas

    2011-01-01

    We tested the effects of an observational intervention (Greer & Singer-Dudek, 2008) on establishing children's books as conditioned reinforcers using a delayed multiple baseline design. Three preschool students with mild language and developmental delays served as the participants. Prior to the intervention, books did not function as reinforcers for any of the participants. The observational intervention consisted of a situation in which the participant observed a confederate being presented with access to books contingent on correct responses and the participant received nothing for correct responses. After several sessions of this treatment, the previously neutral books acquired reinforcing properties for maintenance and acquisition responses for all three participants. PMID:21941376

  12. Comprehension of confidence intervals - development and piloting of patient information materials for people with multiple sclerosis: qualitative study and pilot randomised controlled trial.

    PubMed

    Rahn, Anne C; Backhus, Imke; Fuest, Franz; Riemann-Lorenz, Karin; Köpke, Sascha; van de Roemer, Adrianus; Mühlhauser, Ingrid; Heesen, Christoph

    2016-09-20

    Presentation of confidence intervals alongside information about treatment effects can support informed treatment choices in people with multiple sclerosis. We aimed to develop and pilot-test different written patient information materials explaining confidence intervals in people with relapsing-remitting multiple sclerosis. Further, a questionnaire on comprehension of confidence intervals was developed and piloted. We developed different patient information versions aiming to explain confidence intervals. We used an illustrative example to test three different approaches: (1) short version, (2) "average weight" version and (3) "worm prophylaxis" version. Interviews were conducted using think-aloud and teach-back approaches to test feasibility and analysed using qualitative content analysis. To assess comprehension of confidence intervals, a six-item multiple choice questionnaire was developed and tested in a pilot randomised controlled trial using the online survey software UNIPARK. Here, the average weight version (intervention group) was tested against a standard patient information version on confidence intervals (control group). People with multiple sclerosis were invited to take part using existing mailing-lists of people with multiple sclerosis in Germany and were randomised using the UNIPARK algorithm. Participants were blinded towards group allocation. Primary endpoint was comprehension of confidence intervals, assessed with the six-item multiple choice questionnaire with six points representing perfect knowledge. Feasibility of the patient information versions was tested with 16 people with multiple sclerosis. For the pilot randomised controlled trial, 64 people with multiple sclerosis were randomised (intervention group: n = 36; control group: n = 28). More questions were answered correctly in the intervention group compared to the control group (mean 4.8 vs 3.8, mean difference 1.1 (95 % CI 0.42-1.69), p = 0.002). The questionnaire's internal consistency was moderate (Cronbach's alpha = 0.56). The pilot-phase shows promising results concerning acceptability and feasibility. Pilot randomised controlled trial results indicate that the patient information is well understood and that knowledge gain on confidence intervals can be assessed with a set of six questions. German Clinical Trials Register: DRKS00008561 . Registered 8th of June 2015.
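
    The reported internal consistency (Cronbach's alpha = 0.56) can be reproduced for any 0/1-scored item matrix with a few lines; the responses below are made up purely to show the computation.

      import numpy as np

      def cronbach_alpha(items):
          # items: respondents x k binary item scores (1 = correct answer).
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_var_sum = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_var_sum / total_var)

      responses = [[1, 1, 0, 1, 1, 1],     # illustrative 6-item answer patterns
                   [0, 1, 0, 1, 0, 1],
                   [1, 0, 1, 1, 1, 0],
                   [0, 0, 0, 1, 0, 1]]
      print(round(cronbach_alpha(responses), 2))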

  13. Optimal Sensor Scheduling for Multiple Hypothesis Testing

    DTIC Science & Technology

    1981-09-01

    Naval Research, under contract N00014-77-0532 is gratefully acknowledged. 2 Laboratory for Information and Decision Systems, MIT Room 35-213, Cambridge...treat the more general problem [9,10]. However, two common threads connect these approaches: they obtain feedback laws mapping posterior distributions ...objective of a detection or identification algorithm is to produce correct estimates of the true state of a system. It is also beneficial if these

  14. ANRIL Genetic Variants in Iranian Breast Cancer Patients

    PubMed Central

    Khorshidi, Hamid Reza; Taheri, Mohammad; Noroozi, Rezvan; Sarrafzadeh, Shaghayegh; Sayad, Arezou; Ghafouri-Fard, Soudeh

    2017-01-01

    Objective The genetic variants of the long non-coding RNA ANRIL (an antisense noncoding RNA in the INK4 locus) as well as its expression have been shown to be associated with several human diseases including cancers. The aim of this study was to examine the association of ANRIL variants with breast cancer susceptibility in Iranian patients. Materials and Methods In this case-control study, we genotyped rs1333045, rs4977574, rs1333048 and rs10757278 single nucleotide polymorphisms (SNPs) in 122 breast cancer patients as well as in 200 normal age-matched subjects by tetra-primer amplification refractory mutation system polymerase chain reaction (T-ARMS-PCR). Results The TT genotype at rs1333045 was significantly over-represented among patients (P=0.038) but did not remain significant after multiple-testing correction. In addition, among all observed haplotypes (with SNP order of rs1333045, rs1333048, rs4977574 and rs10757278), four haplotypes were shown to be associated with breast cancer risk. However, after multiple testing corrections, TCGA was the only haplotype which remained significant. Conclusion These results suggest that breast cancer risk is significantly associated with ANRIL variants. Future work analyzing the expression of different associated ANRIL haplotypes would further shed light on the role of ANRIL in this disease. PMID:28580310

  15. An analysis of gene expression in PTSD implicates genes involved in the glucocorticoid receptor pathway and neural responses to stress

    PubMed Central

    Logue, Mark W.; Smith, Alicia K.; Baldwin, Clinton; Wolf, Erika J.; Guffanti, Guia; Ratanatharathorn, Andrew; Stone, Annjanette; Schichman, Steven A.; Humphries, Donald; Binder, Elisabeth B.; Arloth, Janine; Menke, Andreas; Uddin, Monica; Wildman, Derek; Galea, Sandro; Aiello, Allison E.; Koenen, Karestan C.; Miller, Mark W.

    2015-01-01

    We examined the association between posttraumatic stress disorder (PTSD) and gene expression using whole blood samples from a cohort of trauma-exposed white non-Hispanic male veterans (115 cases and 28 controls). 10,264 probes of genes and gene transcripts were analyzed. We found 41 that were differentially expressed in PTSD cases versus controls (multiple-testing corrected p<0.05). The most significant was DSCAM, a neurological gene expressed widely in the developing brain and in the amygdala and hippocampus of the adult brain. We then examined the 41 differentially expressed genes in a meta-analysis using two replication cohorts and found significant associations with PTSD for 7 of the 41 (p<0.05), one of which (ATP6AP1L) survived multiple-testing correction. There was also broad evidence of overlap across the discovery and replication samples for the entire set of genes implicated in the discovery data based on the direction of effect and an enrichment of p<0.05 significant probes beyond what would be expected under the null. Finally, we found that the set of differentially expressed genes from the discovery sample was enriched for genes responsive to glucocorticoid signaling with most showing reduced expression in PTSD cases compared to controls. PMID:25867994
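
    The abstract does not state which per-probe test or correction method was applied, so the sketch below is only an illustration of the general workflow: a two-group test for each probe followed by a multiplicity adjustment (Benjamini-Hochberg here) over all 10,264 probes, on simulated data.

      import numpy as np
      from scipy.stats import ttest_ind
      from statsmodels.stats.multitest import multipletests

      rng = np.random.default_rng(1)
      expr = rng.normal(size=(10264, 143))            # probes x samples (simulated)
      is_case = np.array([True] * 115 + [False] * 28) # 115 PTSD cases, 28 controls

      pvals = np.array([ttest_ind(row[is_case], row[~is_case],
                                  equal_var=False).pvalue for row in expr])
      reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
      print(int(reject.sum()), "probes pass the corrected 0.05 threshold")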

  16. Benchmarking and performance analysis of the CM-2. [SIMD computer

    NASA Technical Reports Server (NTRS)

    Myers, David W.; Adams, George B., II

    1988-01-01

    A suite of benchmarking routines testing communication, basic arithmetic operations, and selected kernel algorithms written in LISP and PARIS was developed for the CM-2. Experiment runs are automated via a software framework that sequences individual tests, allowing for unattended overnight operation. Multiple measurements are made and treated statistically to generate well-characterized results from the noisy values given by cm:time. The results obtained provide a comparison with similar, but less extensive, testing done on a CM-1. Tests were chosen to aid the algorithmist in constructing fast, efficient, and correct code on the CM-2, as well as to gain insight into what performance criteria are needed when evaluating parallel processing machines.

  17. Recognition memory reveals just how CONTRASTIVE contrastive accenting really is

    PubMed Central

    Fraundorf, Scott H.; Watson, Duane G.; Benjamin, Aaron S.

    2010-01-01

    The effects of pitch accenting on memory were investigated in three experiments. Participants listened to short recorded discourses that contained contrast sets with two items (e.g. British scientists and French scientists); a continuation specified one item from the set. Pitch accenting on the critical word in the continuation was manipulated between non-contrastive (H* in the ToBI system) and contrastive (L+H*). On subsequent recognition memory tests, the L+H* accent increased hits to correct statements and correct rejections of the contrast item (Experiments 1–3), but did not impair memory for other parts of the discourse (Experiment 2). L+H* also did not facilitate correct rejections of lures not in the contrast set (Experiment 3), indicating that contrastive accents do not simply strengthen the representation of the target item. These results suggest comprehenders use pitch accenting to encode and update information about multiple elements in a contrast set. PMID:20835405

  18. Coliform Bacteria Monitoring in Fish Systems: Current Practices in Public Aquaria.

    PubMed

    Culpepper, Erin E; Clayton, Leigh A; Hadfield, Catherine A; Arnold, Jill E; Bourbon, Holly M

    2016-06-01

    Public aquaria evaluate coliform indicator bacteria levels in fish systems, but the purpose of testing, testing methods, and management responses are not standardized, unlike with the coliform bacteria testing for marine mammal enclosures required by the U.S. Department of Agriculture. An online survey was sent to selected aquaria to document current testing and management practices in fish systems without marine mammals. The information collected included indicator bacteria species, the size and type of systems monitored, the primary purpose of testing, sampling frequency, test methods, the criteria for interpreting results, corrective actions, and management changes to limit human exposure. Of the 25 institutions to which surveys were sent, 19 (76%) responded. Fourteen reported testing for fecal indicator bacteria in fish systems. The most commonly tested indicator species were total (86%) and fecal (79%) coliform bacteria, which were detected by means of the membrane filtration method (64%). Multiple types and sizes of systems were tested, and the guidelines for testing and corrective actions were highly variable. Only three institutions performed additional tests to confirm the identification of indicator organisms. The results from this study can be used to compare bacterial monitoring practices and protocols in fish systems, as an aid to discussions relating to the accuracy and reliability of test results, and to help implement appropriate management responses. Received August 23, 2015; accepted December 29, 2015.

  19. Effect of pH Test-Strip Characteristics on Accuracy of Readings.

    PubMed

    Metheny, Norma A; Gunn, Emily M; Rubbelke, Cynthia S; Quillen, Terrilynn Fox; Ezekiel, Uthayashanker R; Meert, Kathleen L

    2017-06-01

    Little is known about characteristics of colorimetric pH test strips that are most likely to be associated with accurate interpretations in clinical situations. To compare the accuracy of 4 pH test strips with varying characteristics (ie, multiple vs single colorimetric squares per calibration, and differing calibration units [1.0 vs 0.5]). A convenience sample of 100 upper-level nursing students with normal color vision was recruited to evaluate the accuracy of the test strips. Six buffer solutions (pH range, 3.0 to 6.0) were used during the testing procedure. Each of the 100 participants performed 20 pH tests in random order, providing a total of 2000 readings. The sensitivity and specificity of each test strip was computed. In addition, the degree to which the test strips under- or overestimated the pH values was analyzed using descriptive statistics. Our criterion for correct readings was an exact match with the pH buffer solution being evaluated. Although none of the test strips evaluated in our study was 100% accurate at all of the measured pH values, those with multiple squares per pH calibration were clearly superior overall to those with a single test square. Test strips with multiple squares per calibration were associated with greater overall accuracy than test strips with a single square per calibration. However, because variable degrees of error were observed in all of the test strips, use of a pH meter is recommended when precise readings are crucial. ©2017 American Association of Critical-Care Nurses.

  20. Multiple scattering corrections to the Beer-Lambert law. 1: Open detector.

    PubMed

    Tam, W G; Zardecki, A

    1982-07-01

    Multiple scattering corrections to the Beer-Lambert law are analyzed by means of a rigorous small-angle solution to the radiative transfer equation. Transmission functions for predicting the received radiant power-a directly measured quantity in contrast to the spectral radiance in the Beer-Lambert law-are derived. Numerical algorithms and results relating to the multiple scattering effects for laser propagation in fog, cloud, and rain are presented.
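
    For reference, the Beer-Lambert law predicts a transmission of exp(-tau) for optical depth tau. An open detector also collects part of the forward-scattered light, so the received power exceeds that value; the snippet contrasts the two using a crude first-order adjustment (reducing the effective optical depth by the captured scattered fraction), which is only a stand-in for the paper's rigorous small-angle radiative-transfer solution.

      import numpy as np

      def beer_lambert(tau):
          return np.exp(-tau)

      def open_detector_transmission(tau, albedo, forward_fraction):
          # albedo: single-scattering albedo; forward_fraction: share of the
          # scattered power that stays inside the detector field of view.
          # First-order approximation only, not the paper's solution.
          return np.exp(-tau * (1.0 - albedo * forward_fraction))

      tau = 3.0
      print(beer_lambert(tau),
            open_detector_transmission(tau, albedo=0.9, forward_fraction=0.6))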

  1. Visualization and statistical comparisons of microbial communities using R packages on Phylochip data.

    PubMed

    Holmes, Susan; Alekseyenko, Alexander; Timme, Alden; Nelson, Tyrrell; Pasricha, Pankaj Jay; Spormann, Alfred

    2011-01-01

    This article explains the statistical and computational methodology used to analyze species abundances collected using the LBNL PhyloChip in a study of Irritable Bowel Syndrome (IBS) in rats. Some tools already available for the analysis of ordinary microarray data are useful in this type of statistical analysis. For instance, in correcting for multiple testing we use family-wise error rate control and step-down tests (available in the multtest package). Once the most significant species are chosen, we use the hypergeometric tests familiar from testing GO categories to test specific phyla and families. We provide examples of normalization, multivariate projections, batch effect detection and integration of phylogenetic covariation, as well as tree equalization and robustification methods.
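
    The two statistical steps named above have direct Python analogues, shown here only as an illustration: Holm's step-down procedure for family-wise error control (in the spirit of multtest's step-down tests) and a hypergeometric enrichment test for a phylum among the significant species. All counts and p-values below are invented.

      import numpy as np
      from scipy.stats import hypergeom
      from statsmodels.stats.multitest import multipletests

      pvals = np.array([0.0004, 0.003, 0.011, 0.04, 0.2, 0.7])
      reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="holm")

      # Enrichment: M species total, n in the phylum, N selected as significant,
      # k of the selected fall in the phylum.
      M, n, N, k = 800, 60, 25, 8
      p_enrich = hypergeom.sf(k - 1, M, n, N)     # P(X >= k)
      print(reject, round(p_enrich, 4))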

  2. Assisting people with multiple disabilities actively correct abnormal standing posture with a Nintendo Wii balance board through controlling environmental stimulation.

    PubMed

    Shih, Ching-Hsiang; Shih, Ching-Tien; Chu, Chiung-Ling

    2010-01-01

    The latest researches adopted software technology turning the Nintendo Wii Balance Board into a high performance change of standing posture (CSP) detector, and assessed whether two persons with multiple disabilities would be able to control environmental stimulation using body swing (changing standing posture). This study extends Wii Balance Board functionality for standing posture correction (i.e., actively adjust abnormal standing posture) to assessed whether two persons with multiple disabilities would be able to actively correct their standing posture by controlling their favorite stimulation on/off using a Wii Balance Board with a newly developed standing posture correcting program (SPCP). The study was performed according to an ABAB design, in which A represented baseline and B represented intervention phases. Data showed that both participants significantly increased time duration of maintaining correct standing posture (TDMCSP) to activate the control system to produce environmental stimulation during the intervention phases. Practical and developmental implications of the findings were discussed.

  3. Genome-wide analysis of epistasis in body mass index using multiple human populations.

    PubMed

    Wei, Wen-Hua; Hemani, Gib; Gyenesei, Attila; Vitart, Veronique; Navarro, Pau; Hayward, Caroline; Cabrera, Claudia P; Huffman, Jennifer E; Knott, Sara A; Hicks, Andrew A; Rudan, Igor; Pramstaller, Peter P; Wild, Sarah H; Wilson, James F; Campbell, Harry; Hastie, Nicholas D; Wright, Alan F; Haley, Chris S

    2012-08-01

    We surveyed gene-gene interactions (epistasis) in human body mass index (BMI) in four European populations (n<1200) via exhaustive pair-wise genome scans, where interactions were computed as F ratios by testing a linear regression model fitting two single-nucleotide polymorphisms (SNPs) with their interaction against the model without it. Before the association tests, BMI was corrected for sex and age, normalised and adjusted for relatedness. Neither single SNPs nor SNP interactions were genome-wide significant in any cohort based on the consensus threshold (P=5.0E-08) and a Bonferroni corrected threshold (P=1.1E-12), respectively. Next we compared sub genome-wide significant SNP interactions (P<5.0E-08) across cohorts to identify common epistatic signals, where SNPs were annotated to genes to test for gene ontology (GO) enrichment. Among the epistatic genes contributing to the commonly enriched GO terms, 19 were shared across study cohorts, of which 15 are previously published genome-wide association loci, including CDH13 (cadherin 13), associated with height, and SORCS2 (sortilin-related VPS10 domain containing receptor 2), associated with circulating insulin-like growth factor 1 and binding protein 3. Interactions between the 19 shared epistatic genes and those involving BMI candidate loci (P<5.0E-08) were tested across cohorts, and eight replicated at the SNP level (P<0.05) in at least one cohort; these were further tested and showed limited replication in a separate European population (n>5000). We conclude that genome-wide analysis of epistasis in multiple populations is an effective approach to provide new insights into the genetic regulation of BMI but requires additional efforts to confirm the findings.
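
    A single pair-wise interaction test of the kind described can be sketched as a comparison of nested ordinary least squares models, with the interaction F ratio read from the ANOVA table; in the genome scan this is repeated for every SNP pair and judged against a Bonferroni-corrected threshold. The simulated data, additive SNP coding, and package choice below are assumptions, not the study's pipeline.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf
      from statsmodels.stats.anova import anova_lm

      rng = np.random.default_rng(2)
      df = pd.DataFrame({
          "bmi_adj": rng.normal(size=1000),        # BMI already corrected for sex
          "snp1": rng.integers(0, 3, size=1000),   # and age, normalised, adjusted
          "snp2": rng.integers(0, 3, size=1000),   # for relatedness (simulated)
      })
      additive = smf.ols("bmi_adj ~ snp1 + snp2", data=df).fit()
      full = smf.ols("bmi_adj ~ snp1 * snp2", data=df).fit()  # adds snp1:snp2
      print(anova_lm(additive, full)["Pr(>F)"].iloc[1])       # interaction p-value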

  4. Association of Common Mitochondrial DNA Variants with Multiple Sclerosis and Systemic Lupus Erythematosus

    PubMed Central

    Vyshkina, Tamara; Sylvester, Andrew; Sadiq, Saud; Bonilla, Eduardo; Canter, Jeff A.; Perl, Andras; Kalman, Bernadette

    2008-01-01

    Mitochondrial dysfunction has been implicated in the pathogenesis of multiple sclerosis (MS) and systemic lupus erythematosus (SLE). This study re-investigates the roles of previously suggested candidate genes of energy metabolism (Complex I genes located in the nucleus and in the mitochondria) in patients with MS relative to ethnically matched SLE patients and healthy controls. After stringent correction for multiple testing, we reproduce the association of the mitochondrial (mt)DNA haplotype K* with MS, but reject the importance of previously suggested borderline associations with nuclear genes of Complex I. In addition, we detect the association of common variants of the mitochondrial ND2 and ATP6 genes with both MS and SLE, which raises the possibility of a shared mitochondrial genetic background of these two autoimmune diseases. PMID:18708297

  5. Analyses of single nucleotide polymorphisms in selected nutrient-sensitive genes in weight-regain prevention: the DIOGENES study.

    PubMed

    Larsen, Lesli H; Angquist, Lars; Vimaleswaran, Karani S; Hager, Jörg; Viguerie, Nathalie; Loos, Ruth J F; Handjieva-Darlenska, Teodora; Jebb, Susan A; Kunesova, Marie; Larsen, Thomas M; Martinez, J Alfredo; Papadaki, Angeliki; Pfeiffer, Andreas F H; van Baak, Marleen A; Sørensen, Thorkild Ia; Holst, Claus; Langin, Dominique; Astrup, Arne; Saris, Wim H M

    2012-05-01

    Differences in the interindividual response to dietary intervention could be modified by genetic variation in nutrient-sensitive genes. This study examined single nucleotide polymorphisms (SNPs) in presumed nutrient-sensitive candidate genes for obesity and obesity-related diseases for main and dietary interaction effects on weight, waist circumference, and fat mass regain over 6 mo. In total, 742 participants who had lost ≥ 8% of their initial body weight were randomly assigned to follow 1 of 5 different ad libitum diets with different glycemic indexes and contents of dietary protein. The SNP main and SNP-diet interaction effects were analyzed by using linear regression models, corrected for multiple testing by using Bonferroni correction and evaluated by using quantile-quantile (Q-Q) plots. After correction for multiple testing, none of the SNPs were significantly associated with weight, waist circumference, or fat mass regain. Q-Q plots showed that ALOX5AP rs4769873 showed a higher observed than predicted P value for the association with less waist circumference regain over 6 mo (-3.1 cm/allele; 95% CI: -4.6, -1.6; P/Bonferroni-corrected P = 0.000039/0.076), independently of diet. Additional associations were identified by using Q-Q plots for SNPs in ALOX5AP, TNF, and KCNJ11 for main effects; in LPL and TUB for glycemic index interaction effects on waist circumference regain; in GHRL, CCK, MLXIPL, and LEPR on weight; in PPARC1A, PCK2, ALOX5AP, PYY, and ADRB3 on waist circumference; and in PPARD, FABP1, PLAUR, and LPIN1 on fat mass regain for dietary protein interaction. The observed effects of SNP-diet interactions on weight, waist, and fat mass regain suggest that genetic variation in nutrient-sensitive genes can modify the response to diet. This trial was registered at clinicaltrials.gov as NCT00390637.
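
    The Q-Q evaluation used above amounts to plotting observed association p-values against their expected uniform quantiles, with a Bonferroni line for reference; the sketch below shows the construction on simulated p-values.

      import numpy as np
      import matplotlib.pyplot as plt

      rng = np.random.default_rng(3)
      pvals = np.sort(rng.uniform(size=1950))                   # simulated SNP tests
      expected = (np.arange(1, pvals.size + 1) - 0.5) / pvals.size

      plt.scatter(-np.log10(expected), -np.log10(pvals), s=5)
      plt.axline((0, 0), slope=1, linestyle="--")               # null expectation
      plt.axhline(-np.log10(0.05 / pvals.size), color="red")    # Bonferroni threshold
      plt.xlabel("expected -log10(p)")
      plt.ylabel("observed -log10(p)")
      plt.show()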

  6. Improving Mixed Variable Optimization of Computational and Model Parameters Using Multiple Surrogate Functions

    DTIC Science & Technology

    2008-03-01

    multiplicative corrections as well as space mapping transformations for models defined over a lower dimensional space. A corrected surrogate model for the...correction functions used in [72]. If the low fidelity model g(x̃) is defined over a lower dimensional space then a space mapping transformation is...required. As defined in [21, 72], space mapping is a method of mapping between models of different dimensionality or fidelity. Let P denote the space

  7. Virtual test: A student-centered software to measure student's critical thinking on human disease

    NASA Astrophysics Data System (ADS)

    Rusyati, Lilit; Firman, Harry

    2016-02-01

    The study "Virtual Test: A Student-Centered Software to Measure Student's Critical Thinking on Human Disease" is descriptive research. The background is importance of computer-based test that use element and sub element of critical thinking. Aim of this study is development of multiple choices to measure critical thinking that made by student-centered software. Instruments to collect data are (1) construct validity sheet by expert judge (lecturer and medical doctor) and professional judge (science teacher); and (2) test legibility sheet by science teacher and junior high school student. Participants consisted of science teacher, lecturer, and medical doctor as validator; and the students as respondent. Result of this study are describe about characteristic of virtual test that use to measure student's critical thinking on human disease, analyze result of legibility test by students and science teachers, analyze result of expert judgment by science teachers and medical doctor, and analyze result of trial test of virtual test at junior high school. Generally, result analysis shown characteristic of multiple choices to measure critical thinking was made by eight elements and 26 sub elements that developed by Inch et al.; complete by relevant information; and have validity and reliability more than "enough". Furthermore, specific characteristic of multiple choices to measure critical thinking are information in form science comic, table, figure, article, and video; correct structure of language; add source of citation; and question can guide student to critical thinking logically.

  8. Protocadherin α (PCDHA) as a novel susceptibility gene for autism

    PubMed Central

    Anitha, Ayyappan; Thanseem, Ismail; Nakamura, Kazuhiko; Yamada, Kazuo; Iwayama, Yoshimi; Toyota, Tomoko; Iwata, Yasuhide; Suzuki, Katsuaki; Sugiyama, Toshiro; Tsujii, Masatsugu; Yoshikawa, Takeo; Mori, Norio

    2013-01-01

    Background Synaptic dysfunction has been shown to be involved in the pathogenesis of autism. We hypothesized that the protocadherin α gene cluster (PCDHA), which is involved in synaptic specificity and in serotonergic innervation of the brain, could be a suitable candidate gene for autism. Methods We examined 14 PCDHA single nucleotide polymorphisms (SNPs) for genetic association with autism in DNA samples of 3211 individuals (841 families, including 574 multiplex families) obtained from the Autism Genetic Resource Exchange. Results Five SNPs (rs251379, rs1119032, rs17119271, rs155806 and rs17119346) showed significant associations with autism. The strongest association (p < 0.001) was observed for rs1119032 (z score of risk allele G = 3.415) in multiplex families; this SNP association withstood multiple-testing correction in multiplex families (p = 0.041). Haplotypes involving rs1119032 showed very strong associations with autism, withstanding multiple-testing correction. In quantitative transmission disequilibrium testing of multiplex families, the G allele of rs1119032 showed a significant association (p = 0.033) with scores on the Autism Diagnostic Interview–Revised (ADI-R)_D (early developmental abnormalities). We also found a significant difference in the distribution of ADI-R_A (social interaction) scores between the A/A, A/G and G/G genotypes of rs17119346 (p = 0.002). Limitations Our results should be replicated in an independent population and/or in samples of different racial backgrounds. Conclusion Our study provides strong genetic evidence of PCDHA as a potential candidate gene for autism. PMID:23031252

  9. Solar multi-conjugate adaptive optics based on high order ground layer adaptive optics and low order high altitude correction.

    PubMed

    Zhang, Lanqiang; Guo, Youming; Rao, Changhui

    2017-02-20

    Multi-conjugate adaptive optics (MCAO) is the most promising technique currently developed to enlarge the corrected field of view of adaptive optics for astronomy. In this paper, we propose a new configuration of solar MCAO based on high order ground layer adaptive optics and low order high altitude correction, which results in a homogeneous correction effect across the whole field of view. An individual high order multiple-direction Shack-Hartmann wavefront sensor is employed in the configuration to detect the ground layer turbulence for low altitude correction. A second, low order multiple-direction Shack-Hartmann wavefront sensor supplies the wavefront information arising from high-altitude turbulence, obtained through atmospheric tomography, for high altitude correction. Simulation results based on the system design at the 1-meter New Vacuum Solar Telescope show that the correction uniformity of the new scheme is clearly improved compared with the conventional solar MCAO configuration.

  10. Brain-targeted stem cell gene therapy corrects mucopolysaccharidosis type II via multiple mechanisms.

    PubMed

    Gleitz, Hélène Fe; Liao, Ai Yin; Cook, James R; Rowlston, Samuel F; Forte, Gabriella Ma; D'Souza, Zelpha; O'Leary, Claire; Holley, Rebecca J; Bigger, Brian W

    2018-06-08

    The pediatric lysosomal storage disorder mucopolysaccharidosis type II is caused by mutations in IDS, resulting in accumulation of heparan and dermatan sulfate, causing severe neurodegeneration, skeletal disease, and cardiorespiratory disease. Most patients manifest with cognitive symptoms, which cannot be treated with enzyme replacement therapy, as native IDS does not cross the blood-brain barrier. We tested a brain-targeted hematopoietic stem cell gene therapy approach using lentiviral IDS fused to ApoEII (IDS.ApoEII) compared to a lentivirus expressing normal IDS or a normal bone marrow transplant. In mucopolysaccharidosis II mice, all treatments corrected peripheral disease, but only IDS.ApoEII mediated complete normalization of brain pathology and behavior, providing significantly enhanced correction compared to IDS. A normal bone marrow transplant achieved no brain correction. Whilst corrected macrophages traffic to the brain, secreting IDS/IDS.ApoEII enzyme for cross-correction, IDS.ApoEII was additionally more active in plasma and was taken up and transcytosed across brain endothelia significantly better than IDS via both heparan sulfate/ApoE-dependent receptors and mannose-6-phosphate receptors. Brain-targeted hematopoietic stem cell gene therapy provides a promising therapy for MPS II patients. © 2018 The Authors. Published under the terms of the CC BY 4.0 license.

  11. Improved H-κ Method by Harmonic Analysis on Ps and Crustal Multiples in Receiver Functions with respect to Dipping Moho and Crustal Anisotropy

    NASA Astrophysics Data System (ADS)

    Li, J.; Song, X.; Wang, P.; Zhu, L.

    2017-12-01

    The H-κ method (Zhu and Kanamori, 2000) has been widely used to estimate the crustal thickness and Vp/Vs ratio with receiver functions. However, in regions where the crustal structure is complicated, the method may produce uncertain or even unrealistic results, arising particularly from dipping Moho and/or crustal anisotropy. Here, we propose an improved H-κ method, which corrects for these effects first before stacking. The effect of dipping Moho and crustal anisotropy on Ps receiver function has been well studied, but not as much on crustal multiples (PpPs and PpSs+PsPs). Synthetic tests show that the effect of crustal anisotropy on the multiples are similar to Ps, while the effect of dipping Moho on the multiples is 5 times that on Ps (same cosine trend but 5 times in time shift). A Harmonic Analysis (HA) method for dipping/anisotropy was developed by Wang et al. (2017) for crustal Ps receiver functions to extract parameters of dipping Moho and crustal azimuthal anisotropy. In real data, the crustal multiples are much more complicated than the Ps. Therefore, we use the HA method (Wang et al., 2017), but apply separately to Ps and the multiples. It shows that although complicated, the trend of multiples can still be reasonably well represented by the HA. We then perform separate azimuthal corrections for Ps and the multiples and stack to obtain a combined receiver function. Lastly, the traditional H-κ procedure is applied to the stacked receiver function. We apply the improved H-κ method on 40 CNDSN (Chinese National Digital Seismic Network) stations distributed in a variety of geological setting across the Chinese continent. The results show apparent improvement compared to the traditional H-κ method, with clearer traces of multiples and stronger stacking energy in the grid search, as well as more reliable H-κ values.
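
    For orientation, the traditional H-κ grid search that the corrected, stacked receiver function is finally fed into can be sketched as below, using the standard Zhu and Kanamori (2000) arrival-time predictions for Ps, PpPs and PpSs+PsPs. The azimuthal harmonic corrections proposed in the abstract are assumed to have been applied beforehand, and the weights, Vp, ray parameter and synthetic trace are illustrative choices.

      import numpy as np

      def hk_stack(r, dt, p=0.06, vp=6.3, w=(0.6, 0.3, 0.1),
                   H_grid=np.arange(20.0, 60.0, 0.5),
                   k_grid=np.arange(1.6, 2.0, 0.01)):
          def amp(t):                       # sample r(t) at a predicted arrival
              i = int(round(t / dt))
              return r[i] if 0 <= i < len(r) else 0.0
          best_H, best_k, best_s = None, None, -np.inf
          eta_p = np.sqrt(1.0 / vp**2 - p**2)
          for H in H_grid:
              for k in k_grid:
                  eta_s = np.sqrt(k**2 / vp**2 - p**2)   # Vs = Vp / kappa
                  s = (w[0] * amp(H * (eta_s - eta_p))   # Ps
                       + w[1] * amp(H * (eta_s + eta_p)) # PpPs
                       - w[2] * amp(2.0 * H * eta_s))    # PpSs + PsPs
                  if s > best_s:
                      best_H, best_k, best_s = H, k, s
          return best_H, best_k

      # Synthetic receiver function at dt = 0.1 s with spikes placed near the
      # Ps, PpPs and PpSs+PsPs times expected for H ~ 35 km, kappa ~ 1.75.
      r = np.zeros(600)
      r[[43, 146, 190]] = [1.0, 0.5, -0.4]
      print(hk_stack(r, dt=0.1))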

  12. Delayed, but not immediate, feedback after multiple-choice questions increases performance on a subsequent short-answer, but not multiple-choice, exam: evidence for the dual-process theory of memory.

    PubMed

    Sinha, Neha; Glass, Arnold Lewis

    2015-01-01

    Three experiments, two performed in the laboratory and one embedded in a college psychology lecture course, investigated the effects of immediate versus delayed feedback following a multiple-choice exam on subsequent short answer and multiple-choice exams. Performance on the subsequent multiple-choice exam was not affected by the timing of the feedback on the prior exam; however, performance on the subsequent short answer exam was better following delayed than following immediate feedback. This was true regardless of the order in which immediate versus delayed feedback was given. Furthermore, delayed feedback only had a greater effect than immediate feedback on subsequent short answer performance following correct, confident responses on the prior exam. These results indicate that delayed feedback cues a student's prior response and increases subsequent recollection of that response. The practical implication is that delayed feedback is better than immediate feedback during academic testing.

  13. Multiplicity-dependent and nonbinomial efficiency corrections for particle number cumulants

    NASA Astrophysics Data System (ADS)

    Bzdak, Adam; Holzmann, Romain; Koch, Volker

    2016-12-01

    In this article we extend previous work on efficiency corrections for cumulant measurements [Bzdak and Koch, Phys. Rev. C 86, 044904 (2012), 10.1103/PhysRevC.86.044904; Phys. Rev. C 91, 027901 (2015), 10.1103/PhysRevC.91.027901]. We will discuss the limitations of the methods presented in these papers. Specifically we will consider multiplicity dependent efficiencies as well as nonbinomial efficiency distributions. We will discuss the most simple and straightforward methods to implement those corrections.

  14. Multiplicity-dependent and nonbinomial efficiency corrections for particle number cumulants

    DOE PAGES

    Bzdak, Adam; Holzmann, Romain; Koch, Volker

    2016-12-19

    Here, we extend previous work on efficiency corrections for cumulant measurements [Bzdak and Koch, Phys. Rev. C 86, 044904 (2012)PRVCAN0556-281310.1103/PhysRevC.86.044904; Phys. Rev. C 91, 027901 (2015)PRVCAN0556-281310.1103/PhysRevC.91.027901]. We will then discuss the limitations of the methods presented in these papers. Specifically we will consider multiplicity dependent efficiencies as well as nonbinomial efficiency distributions. We will discuss the most simple and straightforward methods to implement those corrections.
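
    For the baseline case of a constant, purely binomial efficiency, the k-th factorial moment of the measured multiplicity equals the true one multiplied by the efficiency raised to the k-th power, so the correction is a division by powers of the efficiency. The sketch below implements only that baseline relation; the multiplicity-dependent and nonbinomial extensions discussed in the abstract go beyond it.

      def true_factorial_moments(measured_factorial_moments, eps):
          # F_k(true) = F_k(measured) / eps**k for binomial efficiency eps,
          # with the input list ordered F_1, F_2, F_3, ...
          return [f / eps ** (k + 1)
                  for k, f in enumerate(measured_factorial_moments)]

      # Illustrative measured factorial moments F1..F3 at 65% efficiency.
      print(true_factorial_moments([13.0, 175.0, 2400.0], eps=0.65))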

  15. [Problem based learning: achievement of educational goals in the information and comprehension sub-categories of Bloom cognitive domain].

    PubMed

    Montecinos, P; Rodewald, A M

    1994-06-01

    The aim of this work was to assess and compare the achievements of medical students subjected to a problem based learning methodology. The information and comprehension categories of Bloom were tested in 17 medical students on four different occasions during the physiopathology course, using a multiple choice knowledge test. There was a significant improvement in the number of correct answers towards the end of the course. It is concluded that these medical students obtained adequate learning achievements in the information subcategory of Bloom using problem based learning methodology during the physiopathology course.

  16. Effect of response format on cognitive reflection: Validating a two- and four-option multiple choice question version of the Cognitive Reflection Test.

    PubMed

    Sirota, Miroslav; Juanchich, Marie

    2018-03-27

    The Cognitive Reflection Test, measuring intuition inhibition and cognitive reflection, has become extremely popular because it reliably predicts reasoning performance, decision-making, and beliefs. Across studies, the response format of CRT items sometimes differs, based on the assumed construct equivalence of tests with open-ended versus multiple-choice items (the equivalence hypothesis). Evidence and theoretical reasons, however, suggest that the cognitive processes measured by these response formats and their associated performances might differ (the nonequivalence hypothesis). We tested the two hypotheses experimentally by assessing the performance in tests with different response formats and by comparing their predictive and construct validity. In a between-subjects experiment (n = 452), participants answered stem-equivalent CRT items in an open-ended, a two-option, or a four-option response format and then completed tasks on belief bias, denominator neglect, and paranormal beliefs (benchmark indicators of predictive validity), as well as on actively open-minded thinking and numeracy (benchmark indicators of construct validity). We found no significant differences between the three response formats in the numbers of correct responses, the numbers of intuitive responses (with the exception of the two-option version, which had a higher number than the other tests), and the correlational patterns of the indicators of predictive and construct validity. All three test versions were similarly reliable, but the multiple-choice formats were completed more quickly. We speculate that the specific nature of the CRT items helps build construct equivalence among the different response formats. We recommend using the validated multiple-choice version of the CRT presented here, particularly the four-option CRT, for practical and methodological reasons. Supplementary materials and data are available at https://osf.io/mzhyc/ .

  17. Terrestrial Gamma Radiation Dose Rate of West Sarawak

    NASA Astrophysics Data System (ADS)

    Izham, A.; Ramli, A. T.; Saridan Wan Hassan, W. M.; Idris, H. N.; Basri, N. A.

    2017-10-01

    A study of terrestrial gamma radiation (TGR) dose rate was conducted in the west of Sarawak, covering the Kuching, Samarahan, Serian, Sri Aman, and Betong divisions, to construct baseline TGR dose rate data for the areas. The total area covered was 20,259.2 km2. In-situ measurements of TGR dose rate were taken with a NaI(Tl) scintillation detector (Ludlum 19 micro-R meter) held approximately 1 meter above ground level. Twenty-nine soil samples were taken across the 5 divisions, covering 26 pairings of 9 geological formations and 7 soil types. A hyperpure germanium detector was then used to determine the samples' 238U, 232Th, and 40K radionuclide concentrations, producing a correction factor Cf = 0.544. A total of 239 measured values were corrected with Cf, resulting in a mean Dm of 47 ± 1 nGy h-1 and a range of 5 nGy h-1 to 103 nGy h-1. A multiple regression analysis of the geological-formation means and soil-type means against the corrected TGR dose rate Dm generated the prediction model Dg,s = 0.847Dg + 0.637Ds - 22.313, with a normalized beta equation of Dg,s = 0.605Dg + 0.395Ds. The model showed an 84.6% acceptance of the Mann-Whitney test null hypothesis when tested against the corrected TGR dose rates.
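
    The reported prediction model is an ordinary two-predictor linear regression, which can be reproduced on any data set of the same shape; the sketch below fits the same form Dg,s = a*Dg + b*Ds + c on simulated placeholder values (so the fitted coefficients will not match the published ones).

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(4)
      df = pd.DataFrame({"D_g": rng.uniform(30, 70, 239),    # geological means
                         "D_s": rng.uniform(30, 70, 239)})   # soil-type means
      df["D_gs"] = 0.85 * df.D_g + 0.64 * df.D_s - 22 + rng.normal(0, 5, 239)

      model = smf.ols("D_gs ~ D_g + D_s", data=df).fit()
      print(model.params)          # intercept and the two slope coefficients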

  18. Using two MEMS deformable mirrors in an adaptive optics test bed for multiconjugate correction

    NASA Astrophysics Data System (ADS)

    Andrews, Jonathan R.; Martinez, Ty; Teare, Scott W.; Restaino, Sergio R.; Wilcox, Christopher C.; Santiago, Freddie; Payne, Don M.

    2010-02-01

    Adaptive optics systems have advanced considerably over the past decade and have become common tools for optical engineers. The most recent advances in adaptive optics technology have led to significant reductions in the cost of most of the key components. Most significantly, the cost of deformable elements and wavefront sensor components has dropped to the point where multiple deformable mirrors and Shack-Hartmann array based wavefront sensor cameras can be included in a single system. Matched with the appropriate hardware and software, formidable systems can be operated in research laboratories of nearly any size. The significant advancement of MEMS deformable mirrors has made them very popular as the active corrective element in multi-conjugate adaptive optics systems, which, particularly for astronomical applications, allow correction in more than one plane. The NRL compact AO system and atmospheric simulation systems have now been expanded to support Multi Conjugate Adaptive Optics (MCAO), taking advantage of liquid crystal spatial light modulator (SLM) driven aberration generators in two conjugate planes that are well separated spatially. Thus, by using two SLM based aberration generators and two separate wavefront sensors, the system can measure and apply wavefront correction with two MEMS deformable mirrors. This paper describes the multi-conjugate adaptive optics system, the testing and calibration of the system, and preliminary results obtained with it.

  19. A multi-instructor, team-based, active-learning exercise to integrate basic and clinical sciences content.

    PubMed

    Kolluru, Srikanth; Roesch, Darren M; Akhtar de la Fuente, Ayesha

    2012-03-12

    To introduce a multiple-instructor, team-based, active-learning exercise to promote the integration of basic sciences (pathophysiology, pharmacology, and medicinal chemistry) and clinical sciences in a doctor of pharmacy curriculum. A team-based learning activity that involved pre-class reading assignments, individual- and team-answered multiple-choice questions, and evaluation and discussion of a clinical case was designed, implemented, and moderated by 3 faculty members from the pharmaceutical sciences and pharmacy practice departments. Student performance was assessed using a multiple-choice examination, an individual readiness assurance test (IRAT), a team readiness assurance test (TRAT), and a subjective, objective, assessment, and plan (SOAP) note. Student attitudes were assessed using a pre- and post-exercise survey instrument. Students' understanding of possible correct treatment strategies for depression improved. Students appreciated this true integration of basic sciences knowledge in a pharmacotherapy course and having faculty members from both disciplines present to answer questions. Mean student score on the depression module of the examination was 80.4%, indicating mastery of the content. An exercise led by multiple instructors improved student perceptions of the importance of team-based teaching. Integrated teaching and learning may be achieved when instructors from multiple disciplines work together in the classroom using proven team-based, active-learning exercises.

  20. Lip line changes in Class III facial asymmetry patients after orthodontic camouflage treatment, one-jaw surgery, and two-jaw surgery: A preliminary study.

    PubMed

    Lee, Gung-Chol; Yoo, Jo-Kwang; Kim, Seong-Hun; Moon, Cheol-Hyun

    2017-03-01

    To evaluate the effects of orthodontic camouflage treatment (OCT), one-jaw surgery, and two-jaw surgery on the correction of lip line cant (LLC) and to examine factors affecting the correction of LLC in Class III craniofacial asymmetry patients. A sample of 30 Class III craniofacial asymmetry patients was divided into OCT (n = 10), one-jaw surgery (n = 10), and two-jaw surgery (n = 10) groups such that the pretreatment LLC was similar in each group. Pretreatment and posttreatment cone-beam computed tomography scans were used to measure dental and skeletal parameters and LLC. Pretreatment and posttreatment measurements were compared within groups and between groups. Pearson's correlation tests and multiple regression analyses were performed to investigate factors affecting the amount and rate of LLC correction. The average LLC correction was 1.00° in the one-jaw surgery group, and in the two-jaw surgery group, it was 1.71°. In the OCT group it was -0.04°, which differed statistically significantly from the LLC correction in the other two groups. The amount and rate of LLC correction could be explained by settling of skeletal discrepancies or LLC at pretreatment with goodness of fit percentages of approximately 82% and 41%, respectively. Orthognathic surgery resulted in significant correction of LLC in Class III craniofacial asymmetry patients, while OCT did not.

  1. Feasibility of investigating differential proteomic expression in depression: implications for biomarker development in mood disorders

    PubMed Central

    Frye, M A; Nassan, M; Jenkins, G D; Kung, S; Veldic, M; Palmer, B A; Feeder, S E; Tye, S J; Choi, D S; Biernacka, J M

    2015-01-01

    The objective of this study was to determine whether proteomic profiling in serum samples can be utilized in identifying and differentiating mood disorders. A consecutive sample of patients with a confirmed diagnosis of unipolar (UP n=52) or bipolar depression (BP-I n=46, BP-II n=49) and controls (n=141) were recruited. A 7.5-ml blood sample was drawn for proteomic multiplex profiling of 320 proteins utilizing the Myriad RBM Discovery Multi-Analyte Profiling platform. After correcting for multiple testing and adjusting for covariates, growth differentiation factor 15 (GDF-15), hemopexin (HPX), hepsin (HPN), matrix metalloproteinase-7 (MMP-7), retinol-binding protein 4 (RBP-4) and transthyretin (TTR) all showed statistically significant differences among groups. In a series of three post hoc analyses correcting for multiple testing, MMP-7 was significantly different in mood disorder (BP-I+BP-II+UP) vs controls, MMP-7, GDF-15, HPN were significantly different in bipolar cases (BP-I+BP-II) vs controls, and GDF-15, HPX, HPN, RBP-4 and TTR proteins were all significantly different in BP-I vs controls. Good diagnostic accuracy (ROC-AUC⩾0.8) was obtained most notably for GDF-15, RBP-4 and TTR when comparing BP-I vs controls. While based on a small sample not adjusted for medication state, this discovery sample with a conservative method of correction suggests feasibility in using proteomic panels to assist in identifying and distinguishing mood disorders, in particular bipolar I disorder. Replication studies for confirmation, consideration of state vs trait serial assays to delineate proteomic expression of bipolar depression vs previous mania, and utility studies to assess proteomic expression profiling as an advanced decision making tool or companion diagnostic are encouraged. PMID:26645624

  2. [Development of critical thinking skill evaluation scale for nursing students].

    PubMed

    You, So Young; Kim, Nam Cho

    2014-04-01

    To develop a Critical Thinking Skill Test for Nursing Students. The construct concepts were drawn from a literature review and in-depth interviews with hospital nurses, and surveys were conducted among students (n=607) from nursing colleges. The data were collected from September 13 to November 23, 2012 and analyzed using the SAS program, version 9.2. Reliability was evaluated with the KR-20 coefficient, and validity was examined using the difficulty index, discrimination index, corrected item-total correlations and the known-groups technique. Four domains and 27 skills were identified and 35 multiple-choice items were developed. Thirty multiple-choice items with scores higher than .80 on the content validity index were selected for the pre-test. From the analysis of the pre-test data, a modified set of 30 items was selected for the main test. In the main test, the KR-20 coefficient was .70 and the corrected item-total correlations ranged from .11 to .38. There was a statistically significant difference between the two academic systems (p=.001). The developed instrument is the first critical thinking skill test reflecting nursing perspectives in hospital settings and is expected to be utilized as a tool which contributes to improvement of the critical thinking ability of nursing students.

  3. Threat detection of liquid explosives and precursors from their x-ray scattering pattern using energy dispersive detector technology

    NASA Astrophysics Data System (ADS)

    Kehres, Jan; Lyksborg, Mark; Olsen, Ulrik L.

    2017-09-01

    Energy dispersive X-ray diffraction (EDXRD) can be applied to the identification of liquid threats in luggage scanning for security applications. To define the instrumental design and the framework for data reduction and analysis, and to test the performance of threat detection in various scenarios, a flexible laboratory EDXRD test setup was built. A data set of 570 EDXRD spectra was acquired for training and testing of threat identification algorithms. The EDXRD data were acquired with limited count statistics and at multiple detector angles, and merged after correction and normalization. Initial testing of the threat detection algorithms with this data set indicates that detection levels of >95% true positives with <6% false-positive alarms are feasible.

  4. Efficacy and safety of one-stage posterior hemivertebral resection for unbalanced multiple hemivertebrae: A more than 2-year follow-up.

    PubMed

    Huang, Yong; Feng, Ganjun; Song, Yueming; Liu, Limin; Zhou, Chunguang; Wang, Lei; Zhou, Zhongjie; Yang, Xi

    2017-09-01

    One-stage posterior hemivertebral resection has been proven to be an effective, reliable surgical option for treating congenital scoliosis due to a single hemivertebra. To date, however, no studies of treating unbalanced multiple hemivertebrae have appeared. This study evaluated the efficacy and safety of one-stage posterior hemivertebral resection for unbalanced multiple hemivertebrae. Altogether, we studied 15 patients with unbalanced multiple hemivertebrae who had undergone hemivertebral resection using the one-stage posterior approach with at least 2 years of follow-up. Clinical outcomes were assessed radiographically and with the Scoliosis Research Society-22 (SRS-22) score. Related complications were also recorded. The mean Cobb angle of the main curve was 62.4° (46°-98°) before surgery and 18.2° (9°-33°) at the most recent follow-up (average correction 73.3%). The compensatory cranial curve was corrected from 28.5° (11°-52°) to 9.1° (0°-30°) (average correction 70.0%). The compensatory caudal curve was corrected from 31.6° (14°-54°) to 6.9° (0°-19°) (average correction 79.1%). The segmental kyphosis/lordosis was corrected from 41.1° (-40° to 98°) to 12.3° (-25° to 41°) (average correction 65.5%). The mean growth rate of the T1-S1 length in immature patients was 9.8 mm/year during the follow-up period. Health-related quality of life (SRS-22 score) had significantly improved. Complications included one wound infection and one developing deformity. One-stage posterior hemivertebral resection for unbalanced multiple hemivertebrae provides good radiographic and clinical outcomes with no severe complications when performed by an experienced surgeon. Longer follow-up to detect late complications is obligatory. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Artificial neural network EMG classifier for functional hand grasp movements prediction

    PubMed Central

    Ferrante, Simona; Ferrigno, Giancarlo; Baldassini, Davide; Molteni, Franco; Guanziroli, Eleonora; Cotti Cottini, Michele; Seneci, Carlo; Pedrocchi, Alessandra

    2016-01-01

    Objective To design and implement an electromyography (EMG)-based controller for a hand robotic assistive device, which is able to classify the user's motion intention before the effective kinematic movement execution. Methods Multiple degrees-of-freedom hand grasp movements (i.e. pinching, grasp an object, grasping) were predicted by means of surface EMG signals, recorded from 10 bipolar EMG electrodes arranged in a circular configuration around the forearm 2–3 cm from the elbow. Two cascaded artificial neural networks were then exploited to detect the patient's motion intention from the EMG signal window starting from the electrical activity onset to movement onset (i.e. electromechanical delay). Results The proposed approach was tested on eight healthy control subjects (4 females; age range 25–26 years) and it demonstrated a mean ± SD testing performance of 76% ± 14% for correctly predicting healthy users' motion intention. Two post-stroke patients tested the controller and obtained 79% and 100% of correctly classified movements under testing conditions. Conclusion A task-selection controller was developed to estimate the intended movement from the EMG measured during the electromechanical delay. PMID:27677300

  6. VRT (verbal reasoning test): a new test for assessment of verbal reasoning. Test realization and Italian normative data from a multicentric study.

    PubMed

    Basagni, Benedetta; Luzzatti, Claudio; Navarrete, Eduardo; Caputo, Marina; Scrocco, Gessica; Damora, Alessio; Giunchi, Laura; Gemignani, Paola; Caiazzo, Annarita; Gambini, Maria Grazia; Avesani, Renato; Mancuso, Mauro; Trojano, Luigi; De Tanti, Antonio

    2017-04-01

    Verbal reasoning is a complex, multicomponent function, which involves activation of functional processes and neural circuits distributed in both brain hemispheres. Thus, this ability is often impaired after brain injury. The aim of the present study is to describe the construction of a new verbal reasoning test (VRT) for patients with brain injury and to provide normative values in a sample of healthy Italian participants. Three hundred and eighty healthy Italian subjects (193 women and 187 men) of different ages (range 16-75 years) and educational level (primary school to postgraduate degree) underwent the VRT. VRT is composed of seven subtests, investigating seven different domains. Multiple linear regression analysis revealed a significant effect of age and education on the participants' performance in terms of both VRT total score and all seven subtest scores. No gender effect was found. A correction grid for raw scores was built from the linear equation derived from the scores. Inferential cut-off scores were estimated using a non-parametric technique, and equivalent scores were computed. We also provided a grid for the correction of results by z scores.
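
    The sketch below illustrates the general idea of a demographic correction grid and z scores for a test like the VRT: raw scores are adjusted by the score predicted from age and education relative to a reference subject, then standardized. All coefficients, reference values and inputs are hypothetical placeholders, not the published normative data.

    ```python
    # Illustrative sketch of a demographic correction of raw scores via a linear
    # regression on age and education, followed by a z score against normative data.
    # All coefficients, reference values and inputs are hypothetical placeholders.
    B_AGE, B_EDU, INTERCEPT = -0.08, 0.45, 30.0   # hypothetical regression coefficients
    REF_AGE, REF_EDU = 45.0, 13.0                 # hypothetical reference subject

    def corrected_score(raw, age, edu):
        """Remove the part of the raw score predicted by age/education relative to the reference subject."""
        predicted = INTERCEPT + B_AGE * age + B_EDU * edu
        reference = INTERCEPT + B_AGE * REF_AGE + B_EDU * REF_EDU
        return raw - (predicted - reference)

    def z_score(corrected, norm_mean, norm_sd):
        return (corrected - norm_mean) / norm_sd

    adj = corrected_score(raw=32.0, age=70, edu=8)      # older, less educated participant gains points
    print(adj, z_score(adj, norm_mean=35.0, norm_sd=5.0))
    ```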

  7. Correction of terrestrial LiDAR intensity channel using Oren-Nayar reflectance model: An application to lithological differentiation

    NASA Astrophysics Data System (ADS)

    Carrea, Dario; Abellan, Antonio; Humair, Florian; Matasci, Battista; Derron, Marc-Henri; Jaboyedoff, Michel

    2016-03-01

    Ground-based LiDAR has been traditionally used for surveying purposes via 3D point clouds. In addition to XYZ coordinates, an intensity value is also recorded by LiDAR devices. The intensity of the backscattered signal can be a significant source of information for various applications in geosciences. Previous attempts to account for the scattering of the laser signal usually model the surface as a perfect diffuse reflector. Nevertheless, experience on natural outcrops shows that rock surfaces do not behave as perfect diffuse reflectors. The geometry (or relief) of the scanned surfaces plays a major role in the recorded intensity values. Our study proposes a new terrestrial LiDAR intensity correction, which takes into consideration the range, the incidence angle and the geometry of the scanned surfaces. The proposed correction equation combines the classical radar equation for LiDAR with the bidirectional reflectance distribution function of the Oren-Nayar model. It is based on the idea that the surface geometry can be modelled by a relief of multiple micro-facets. This model is constrained by only one tuning parameter: the standard deviation of the slope angle distribution (σ_slope) of micro-facets. Firstly, a series of tests have been carried out in laboratory conditions on a 2 m² board covered by black/white matte paper (perfect diffuse reflector) and scanned at different ranges and incidence angles. Secondly, other tests were carried out on rock blocks of different lithologies and surface conditions. Those tests demonstrated that the non-perfect diffuse reflectance of rock surfaces can be practically handled by the proposed correction method. Finally, the intensity correction method was applied to a real case study, with two scans of the carbonate rock outcrop of the Dents-du-Midi (Swiss Alps), to improve the lithological identification for geological mapping purposes. After correction, the intensity values are proportional to the intrinsic material reflectance and are independent of range, incidence angle and scanned surface geometry. The corrected intensity values significantly improve the material differentiation.
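
    A hedged sketch of the kind of correction described above is given below: the standard Oren-Nayar reflectance factor evaluated for the monostatic LiDAR geometry (incidence angle equal to observation angle), combined with a simple range-squared normalization. This is a generic form based on the published Oren-Nayar model, not necessarily the paper's exact correction equation; the reference range and example values are assumptions.

    ```python
    # Hedged sketch: Oren-Nayar reflectance factor in the monostatic LiDAR geometry
    # (incidence angle = observation angle), plus an illustrative normalization that
    # removes the range-squared and angular dependence from a raw intensity value.
    # Generic form based on the standard Oren-Nayar model, not the paper's exact equation.
    import numpy as np

    def oren_nayar_factor(theta, sigma_slope):
        """Reflectance relative to a Lambertian surface; theta and sigma_slope in radians."""
        s2 = sigma_slope ** 2
        a = 1.0 - 0.5 * s2 / (s2 + 0.33)
        b = 0.45 * s2 / (s2 + 0.09)
        return a + b * np.sin(theta) * np.tan(theta)

    def corrected_intensity(raw, distance, theta, sigma_slope, ref_distance=50.0):
        """Normalize to a reference range and to normal incidence on a Lambertian-equivalent surface."""
        geometry = np.cos(theta) * oren_nayar_factor(theta, sigma_slope)
        return raw * (distance / ref_distance) ** 2 / geometry

    print(corrected_intensity(raw=1200.0, distance=80.0, theta=np.radians(45), sigma_slope=np.radians(20)))
    ```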

  8. Arithmetic memory networks established in childhood are changed by experience in adulthood

    PubMed Central

    Martinez-Lincoln, Amanda; Cortinas, Christina; Wicha, Nicole Y. Y.

    2014-01-01

    Adult bilinguals show stronger access to multiplication tables when using the language in which they learned arithmetic during childhood (LA+) than the other language (LA−), implying language-specific encoding of math facts. However, most bilinguals use LA+ throughout their life, confounding the impact of encoding and use. We tested if using arithmetic facts in LA− could reduce this LA− disadvantage. We measured event related brain potentials while bilingual teachers judged the correctness of multiplication problems in each of their languages. Critically, each teacher taught arithmetic in either LA+ or LA−. Earlier N400 peak latency was observed in both groups for the teaching than non-teaching language, showing more efficient access to these facts with use. LA+ teachers maintained an LA+ advantage, while LA− teachers showed equivalent N400 congruency effects (for incorrect versus correct solutions) in both languages. LA− teachers also showed a late positive component that may reflect conflict monitoring between their LA+ and a strong LA−. Thus, the LA− disadvantage for exact arithmetic established in early bilingual education can be mitigated by later use of LA−. PMID:25445361

  9. An analysis of the ArcCHECK-MR diode array's performance for ViewRay quality assurance.

    PubMed

    Ellefson, Steven T; Culberson, Wesley S; Bednarz, Bryan P; DeWerd, Larry A; Bayouth, John E

    2017-07-01

    The ArcCHECK-MR diode array utilizes a correction system with a virtual inclinometer to correct the angular response dependencies of the diodes. However, this correction system cannot be applied to measurements on the ViewRay MR-IGRT system due to the virtual inclinometer's incompatibility with the ViewRay's multiple simultaneous beams. Additionally, the ArcCHECK's current correction factors were determined without magnetic field effects taken into account. In the course of performing ViewRay IMRT quality assurance with the ArcCHECK, measurements were observed to be consistently higher than the ViewRay TPS predictions. The goals of this study were to quantify the observed discrepancies and test whether applying the current factors improves the ArcCHECK's accuracy for measurements on the ViewRay. Gamma and frequency analysis were performed on 19 ViewRay patient plans. Ion chamber measurements were performed at a subset of diode locations using a PMMA phantom with the same dimensions as the ArcCHECK. A new method for applying directionally dependent factors utilizing beam information from the ViewRay TPS was developed in order to analyze the current ArcCHECK correction factors. To test the current factors, nine ViewRay plans were altered to be delivered with only a single simultaneous beam and were measured with the ArcCHECK. The current correction factors were applied using both the new and current methods. The new method was also used to apply corrections to the original 19 ViewRay plans. It was found the ArcCHECK systematically reports doses higher than those actually delivered by the ViewRay. Application of the current correction factors by either method did not consistently improve measurement accuracy. As dose deposition and diode response have both been shown to change under the influence of a magnetic field, it can be concluded the current ArcCHECK correction factors are invalid and/or inadequate to correct measurements on the ViewRay system. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
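
    For context, the sketch below implements a basic one-dimensional global gamma comparison (dose-difference and distance-to-agreement criteria) between a measured profile and a TPS prediction. It is a textbook illustration, not the ArcCHECK or ViewRay analysis software, and the example profiles are hypothetical.

    ```python
    # Textbook sketch of a 1D global gamma comparison (e.g. 3%/3 mm) between a
    # measured dose profile and a TPS prediction. Not the ArcCHECK/ViewRay software;
    # the example profiles are hypothetical.
    import numpy as np

    def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_crit=0.03, dta_mm=3.0):
        """Gamma value at each reference point (global dose normalization)."""
        d_norm = dose_crit * d_ref.max()
        gammas = np.empty_like(d_ref, dtype=float)
        for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
            dist2 = ((x_eval - xr) / dta_mm) ** 2
            dose2 = ((d_eval - dr) / d_norm) ** 2
            gammas[i] = np.sqrt(np.min(dist2 + dose2))
        return gammas

    x = np.linspace(-50, 50, 201)                        # mm
    tps = 100 * np.exp(-(x / 30) ** 2)                   # hypothetical predicted profile (cGy)
    meas = 103 * np.exp(-((x - 1.0) / 30) ** 2)          # hypothetical measurement, ~3% hot and 1 mm shifted
    print((gamma_1d(x, tps, x, meas) <= 1).mean())       # gamma pass rate
    ```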

  10. Psychometrics of Multiple Choice Questions with Non-Functioning Distracters: Implications to Medical Education.

    PubMed

    Deepak, Kishore K; Al-Umran, Khalid Umran; AI-Sheikh, Mona H; Dkoli, B V; Al-Rubaish, Abdullah

    2015-01-01

    The functionality of distracters in a multiple choice question plays a very important role. We examined the frequency and impact of functioning and non-functioning distracters on the psychometric properties of 5-option items in clinical disciplines. We analyzed item statistics of 1115 multiple choice questions from 15 summative assessments of undergraduate medical students and classified the items into five groups by their number of non-functioning distracters. We analyzed the effect of varying degrees of non-functionality, ranging from 0 to 4, on test reliability, difficulty index, discrimination index and point biserial correlation. The non-functionality of distracters inversely affected the test reliability and quality of items in a predictable manner. The non-functioning distracters made the items easier and lowered the discrimination index significantly. Three non-functional distracters in a 5-option MCQ significantly affected all psychometric properties (p < 0.05). The corrected point biserial correlation revealed that the items with 3 functional options were psychometrically as effective as 5-option items. Our study reveals that a multiple choice question with 3 functional options provides the lowermost limit of the item format that has adequate psychometric properties. Tests containing items with fewer functioning options have significantly lower reliability. Distracter function analysis and revision of non-functioning distracters can serve as important methods to improve the psychometrics and reliability of assessment.
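
    The item statistics discussed above can be computed from a 0/1 response matrix as sketched below (difficulty as proportion correct, discrimination as an upper-minus-lower group difference, and a corrected point-biserial that excludes the item from the total score). These are generic formulas, not the authors' software, and the example data are random.

    ```python
    # Generic item-analysis sketch for a 0/1 response matrix (rows = examinees,
    # columns = items): difficulty, upper-lower discrimination and corrected
    # point-biserial correlation. Example data are random, for illustration only.
    import numpy as np

    def item_statistics(responses, top_frac=0.27):
        r = np.asarray(responses, dtype=float)
        total = r.sum(axis=1)
        difficulty = r.mean(axis=0)                               # proportion answering the item correctly
        order = np.argsort(total)
        n_group = max(1, int(round(top_frac * len(total))))
        low, high = order[:n_group], order[-n_group:]
        discrimination = r[high].mean(axis=0) - r[low].mean(axis=0)
        # corrected point-biserial: correlate each item with the total score excluding that item
        pbis = np.array([np.corrcoef(r[:, j], total - r[:, j])[0, 1] for j in range(r.shape[1])])
        return difficulty, discrimination, pbis

    responses = np.random.default_rng(1).integers(0, 2, size=(300, 40))
    difficulty, discrimination, pbis = item_statistics(responses)
    print(difficulty[:3], discrimination[:3], pbis[:3])
    ```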

  11. Using the Geoscience Concept Inventory to Understand how Students Learn about Geologic Time

    NASA Astrophysics Data System (ADS)

    Teed, R. E.

    2009-12-01

    108 pre-service teachers completed a standardized multiple-choice test at the beginning and at the end of a ten-week introductory survey course on geology. Four of the fifteen questions dealt explicitly with geologic time. Correct student answers that the age of the Earth is known from uranium-series dating increased significantly, but only from ~0% to about 20%. However, answers that included U-series dating with other (irrelevant) sources of evidence increased from ~10% to ~70%. On the pre-test, students avoided the U-series dating in favor of incorrect, but probably more familiar, dating techniques or combinations of dating techniques. They seem to have gained familiarity with, if not an understanding of U-series dating in the class. There was no real change in students’ conceptions of what the newly-formed Earth would have looked like. Most (70% on pre-test, 65% on post-test) chose an image that looked like the modern Earth with a single continent (which they may have believed to be Pangea). Interestingly, 78% of those who chose that image on the pre-test chose it on the post-test, so they were not guessing. This is a powerful misconception and remained intact in most cases despite the work the students did in the geology class. On the other hand, most students appeared to be guessing when they answered how long Pangea took to break up. There were no significant changes in the totals for any response, but about half students changed their answers between the pre- and post-test with no significant pattern in the changes. Responses to a choice of timelines which all included the formation of the Earth, the appearances of life, dinosaurs, and humans, and the disappearance of dinosaurs, were more complex. There was an increase in the number of students who chose the correct timeline (from 20% to 42%), and a decrease in the number who chose a timeline in which all life appears at once (from 14% to 10%). In this case some misconceptions (based on incorrect answers on the pre-test) were more likely than others to grow into a correct understanding. For example, students who chose “C”, an incorrect timeline with events in the correct order but incorrectly scaled, on the pretest, were more likely to choose correctly on the post-test than students who gave other incorrect answers on the pre-test.

  12. Rescuing mutant CFTR: a multi-task approach to a better outcome in treating cystic fibrosis.

    PubMed

    Amaral, Margarida D; Farinha, Carlos M

    2013-01-01

    Correcting multiple defects of mutant CFTR with small molecule compounds has been the goal of an increasing number of recent Cystic Fibrosis (CF) drug discovery programmes. However, the mechanism of action (MoA) by which these molecules restore mutant CFTR is still poorly understood, in particular for CFTR correctors, i.e., compounds rescuing to the cell surface the most prevalent mutant in CF patients--F508del-CFTR. However, there is increasing evidence that to fully restore the multiple defects associated with F508del-CFTR, different small molecules with distinct corrective properties may be required. Towards this goal, a better insight into the MoA of correctors is needed and several constraints should be addressed. The methodological approaches to achieve this include: 1) testing the combined effect of compounds with that of other (non-pharmacological) rescuing strategies (e.g., revertants or low temperature); 2) assessing effects in multiple cellular models (non-epithelial vs epithelial, non-human vs human, immortalized vs primary cultures, polarized vs non-polarized, cells vs tissues); 3) assessing compound effects on isolated CFTR domains (e.g., compound binding by surface plasmon resonance, assessing effects on domain folding and aggregation); and finally 4) assessing compound specificity in rescuing different CFTR mutants and other mutant proteins. These topics are reviewed and discussed here so as to provide a state-of-the-art review on how to combine multiple ways of rescuing mutant CFTR to the ultimate benefit of CF patients.

  13. Common genetic variation in the indoleamine-2,3-dioxygenase genes and antidepressant treatment outcome in major depressive disorder.

    PubMed

    Cutler, Jessica A; Rush, A John; McMahon, Francis J; Laje, Gonzalo

    2012-03-01

    The essential amino acid tryptophan is the precursor to serotonin, but it can also be metabolized into kynurenine through indoleamine-2,3-dioxygenase (IDO). Increased immune activation has long been associated with symptoms of depression and has been shown to upregulate the expression of IDO. The presence of additional IDO directs more tryptophan down the kynurenine pathway, leaving less available for synthesis of serotonin and its metabolites. Kynurenine can be metabolized through a series of enzymes to quinolinic acid, a potent N-methyl-D-aspartate receptor agonist with demonstrated neurotoxic effects. We tested the hypothesis that IDO plays a role in outcome of treatment with the selective serotonin reuptake inhibitor, citalopram. Patients consisted of 1953 participants enrolled in the Sequenced Treatment Alternatives to Relieve Depression study (STAR*D). Genotypes corresponding to 94 single nucleotide polymorphisms (SNPs) in the genes IDO1 and IDO2, which encode IDO and IDO2, were extracted from a larger genome-wide set and analyzed using single marker tests to look for association with previously defined response, remission and QIDS-C score change phenotypes, with adequate correction for racial stratification and multiple testing. One SNP, rs2929115, showed evidence of association with citalopram response (OR = 0.64, p = 0.0005) after experiment-wide correction for multiple testing. Another closely associated marker, rs2929116 (OR = 0.64, p = 0.0006) had an experiment-wide significant result. Both implicated SNPs are located between 26 kb and 28 kb downstream of IDO2. We conclude that common genetic variation in IDO1 and IDO2 may play a role in antidepressant treatment outcome. These results are modest in a genome-wide context and need to be replicated in an independent sample.
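
    For context, the simplest possible experiment-wide adjustment over the 94 genotyped SNPs is a Bonferroni correction, sketched below. The study's actual correction also had to handle racial stratification and several outcome phenotypes, so this is only an illustration; the example p-values are placeholders.

    ```python
    # Simplest experiment-wide adjustment over the 94 genotyped SNPs: a Bonferroni
    # correction. The study's actual correction additionally handled stratification
    # and several outcome phenotypes; the p-values below are placeholders.
    n_tests = 94
    alpha = 0.05
    per_test_threshold = alpha / n_tests            # ~5.3e-4

    def bonferroni_adjust(p_values, m=n_tests):
        return [min(1.0, p * m) for p in p_values]

    example_p = [4e-4, 6e-4, 2e-2]                  # placeholder raw p-values
    print(per_test_threshold)
    print(bonferroni_adjust(example_p))
    ```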

  14. Hormone-Related Pathways and Risk of Breast Cancer Subtypes in African American Women

    PubMed Central

    Haddad, Stephen A.; Lunetta, Kathryn L.; Ruiz-Narváez, Edward A.; Bensen, Jeannette T.; Hong, Chi-Chen; Sucheston-Campbell, Lara E.; Yao, Song; Bandera, Elisa V.; Rosenberg, Lynn; Haiman, Christopher A.; Troester, Melissa A.; Ambrosone, Christine B.; Palmer, Julie R.

    2016-01-01

    Purpose We sought to investigate genetic variation in hormone pathways in relation to risk of overall and subtype-specific breast cancer in women of African ancestry (AA). Methods Genotyping and imputation yielded data on 143,934 SNPs in 308 hormone-related genes for 3663 breast cancer cases (1098 ER-, 1983 ER+, 582 ER unknown) and 4687 controls from the African American Breast Cancer Epidemiology and Risk (AMBER) Consortium. AMBER includes data from four large studies of AA women: the Carolina Breast Cancer Study, the Women's Circle of Health Study, the Black Women's Health Study, and the Multiethnic Cohort Study. Pathway- and gene-based analyses were conducted, and single SNP tests were run for the top genes. Results There were no strong associations at the pathway level. The most significantly associated genes were GHRH, CALM2, CETP, and AKR1C1 for overall breast cancer (gene-based nominal p ≤0.01); NR0B1, IGF2R, CALM2, CYP1B1, and GRB2 for ER+ breast cancer (p ≤0.02); and PGR, MAPK3, MAP3K1, and LHCGR for ER- disease (p ≤0.02). Single-SNP tests for SNPs with pairwise linkage disequilibrium r2 <0.8 in the top genes identified 12 common SNPs (in CALM2, CETP, NR0B1, IGF2R, CYP1B1, PGR, MAPK3, and MAP3K1) associated with overall or subtype-specific breast cancer after gene-level correction for multiple testing. Rs11571215 in PGR (progesterone receptor) was the SNP most strongly associated with ER- disease. Conclusion We identified eight genes in hormone pathways that contain common variants associated with breast cancer in AA women after gene-level correction for multiple testing. PMID:26458823

  15. An Association Between Functional Polymorphisms of the Interleukin 1 Gene Complex and Schizophrenia Using Transmission Disequilibrium Test.

    PubMed

    Kapelski, Pawel; Skibinska, Maria; Maciukiewicz, Malgorzata; Pawlak, Joanna; Dmitrzak-Weglarz, Monika; Szczepankiewicz, Aleksandra; Zaremba, Dorota; Twarowska-Hauser, Joanna

    2016-12-01

    The IL1 gene complex has been implicated in the etiology of schizophrenia. To assess whether the IL1 gene complex is associated with susceptibility to schizophrenia in the Polish population, we conducted a family-based study. Functional polymorphisms from the IL1A (rs1800587, rs17561, rs11677416), IL1B (rs1143634, rs1143643, rs16944, rs4848306, rs1143623, rs1143633, rs1143627) and IL1RN (rs419598, rs315952, rs9005, rs4251961) genes were genotyped in 143 trios with schizophrenia. Statistical analysis was performed using the transmission disequilibrium test. We found a trend toward an association of rs1143627, rs16944, and rs1143623 in the IL1B gene with the risk of schizophrenia. Our results show a protective effect of allele T of rs4251961 in IL1RN against schizophrenia. We also performed haplotype analysis of the IL1 gene complex and found a trend toward an association with schizophrenia for the GAGG haplotype (rs1143627, rs16944, rs1143623, rs4848306) in the IL1B gene and for the haplotypes TG (rs315952, rs9005) and TT (rs4251961, rs419598) in IL1RN. The haplotype CT (rs4251961, rs419598) in IL1RN was found to be associated with schizophrenia. After correction for multiple testing, the associations did not reach the significance level. Our results might support the theory that polymorphisms of interleukin 1 complex genes (rs1143627, rs16944, rs1143623, rs4848306 in the IL1B gene and rs4251961, rs419598, rs315952, rs9005 in the IL1RN gene) are involved in the pathogenesis of schizophrenia; however, none of the results reached the significance level after correction for multiple testing.
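
    The basic transmission disequilibrium test statistic for a biallelic marker can be sketched as below: b and c count heterozygous parents who do and do not transmit the test allele to the affected child, and (b - c)²/(b + c) is compared with a chi-square distribution with one degree of freedom. The counts in the example are hypothetical.

    ```python
    # Basic TDT statistic for one biallelic marker: b = heterozygous parents
    # transmitting the test allele, c = not transmitting it. Counts are hypothetical.
    from math import erf, sqrt

    def tdt_chi2(b, c):
        return (b - c) ** 2 / (b + c)

    def chi2_1df_pvalue(x):
        """Survival function of a 1-df chi-square via the normal distribution."""
        return 2.0 * (1.0 - 0.5 * (1.0 + erf(sqrt(x) / sqrt(2.0))))

    chi2 = tdt_chi2(b=85, c=58)
    print(chi2, chi2_1df_pvalue(chi2))
    ```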

  16. Genetic Association of Insulin-like Growth Factor-1 Polymorphisms with High-Grade Myopia in an International Family Cohort

    PubMed Central

    Metlapally, Ravikanth; Ki, Chang-Seok; Li, Yi-Ju; Tran-Viet, Khanh-Nhat; Abbott, Diana; Malecaze, Francois; Calvas, Patrick; Mackey, David A.; Rosenberg, Thomas; Paget, Sandrine; Guggenheim, Jeremy A.

    2010-01-01

    Purpose. Evidence from human myopia genetic mapping studies (MYP3 locus), modulated animal models, and observations of glycemic control in humans suggests that insulin-like growth factor (IGF)-1 plays a role in the control of eye growth. This study was conducted to determine whether IGF-1 polymorphisms are associated with myopia in a large, international dataset of Caucasian high-grade myopia pedigrees. Methods. Two hundred sixty-five multiplex families with 1391 subjects participated in the study. IGF-1 genotyping was performed with 13 selected tag single nucleotide polymorphisms (SNPs) using allelic discrimination assays. A family-based pedigree disequilibrium test (PDT) was performed to test for association. Myopia status was defined using sphere (SPH) or spherical equivalent (SE), and analyses assessed the association of (1) high-grade myopia (≤ −5.00 D), and (2) any myopia (≤ −0.50 D) with IGF-1 markers. Results were declared significant at P ≤ 0.0038 after Bonferroni correction. Q values that take into account multiple testing were also obtained. Results. In all, three SNPs—rs10860860, rs2946834, and rs6214—were present at P < 0.05. SNP rs6214 showed positive association with both the high-grade– and any-myopia groups (P = 2 × 10⁻³ and P = 2 × 10⁻³, respectively) after correction for multiple testing. Conclusions. The study supports a genetic association between IGF-1 and high-grade myopia. These findings are in line with recent evidence in an experimental myopia model showing that IGF-1 promotes ocular growth and axial myopia. IGF-1 may be a myopia candidate gene for further investigation. PMID:20435602

  17. Students' confusions with reciprocal and inverse functions

    NASA Astrophysics Data System (ADS)

    Kontorovich, Igor'

    2017-02-01

    These classroom notes are focused on undergraduate students' understanding of the polysemous symbol of superscript (-1), which can be interpreted as a reciprocal or an inverse function. Examination of 240 scripts in a mid-term test identified that some first-year students struggle with choosing the contextually correct interpretation and there are students who use both interpretations in their solutions. Several students also confuse between composition and multiplication of functions denoted by resembling symbols of '°' and 'ṡ'.
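
    A minimal worked contrast between the two readings of the superscript -1, using f(x) = x + 2 as an example of my own (not taken from the examined scripts):

    ```latex
    % The two readings of the superscript -1, illustrated with f(x) = x + 2.
    \[
      f^{-1}(x) = x - 2
      \qquad \text{(inverse function: } f^{-1}(f(x)) = x \text{)}
    \]
    \[
      \bigl(f(x)\bigr)^{-1} = \frac{1}{x+2}
      \qquad \text{(reciprocal of the function value)}
    \]
    % The composition f^{-1} \circ f is the identity map, whereas the product
    % f \cdot (f)^{-1} is the constant function 1 (where defined).
    ```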

  18. Association of cytomegalovirus and Epstein-Barr virus with cognitive functioning and risk of dementia in the general population: 11-year follow-up study.

    PubMed

    Torniainen-Holm, Minna; Suvisaari, Jaana; Lindgren, Maija; Härkänen, Tommi; Dickerson, Faith; Yolken, Robert H

    2018-03-01

    Earlier studies have documented an association between cytomegalovirus and cognitive impairment, but results have been inconsistent. Few studies have investigated the association of cytomegalovirus and Epstein-Barr virus with cognitive decline longitudinally. Our aim was to examine whether cytomegalovirus and Epstein-Barr virus are associated with cognitive decline in adults. The study sample is from the Finnish Health 2000 Survey (BRIF8901, n = 7112), which is representative of the Finnish adult population. The sample was followed up after 11 years in the Health 2011 Survey. In addition, persons with dementia were identified from healthcare registers. In the Finnish population aged 30 and over, the seroprevalence of cytomegalovirus was estimated to be 84% and the seroprevalence of Epstein-Barr virus 98%. Seropositivity of the viruses and antibody levels were mostly not associated with cognitive performance. In the middle-aged adult group, cytomegalovirus serointensity was associated with impaired performance in verbal learning. However, the association disappeared when corrected for multiple testing. No interactions between infection and time or between the two infections were significant when corrected for multiple testing. Seropositivity did not predict dementia diagnosis. The results suggest that adult levels of antibodies to cytomegalovirus and Epstein-Barr virus may not be associated with a significant decline in cognitive function or with dementia at population level. Copyright © 2018. Published by Elsevier Inc.

  19. Virtual Excitation and Multiple Scattering Correction Terms to the Neutron Index of Refraction for Hydrogen.

    PubMed

    Schoen, K; Snow, W M; Kaiser, H; Werner, S A

    2005-01-01

    The neutron index of refraction is generally derived theoretically in the Fermi approximation. However, the Fermi approximation neglects the effects of the binding of the nuclei of a material as well as multiple scattering. Calculations by Nowak introduced correction terms to the neutron index of refraction that are quadratic in the scattering length and of order 10⁻³ fm for hydrogen and deuterium. These correction terms produce a small shift in the final value for the coherent scattering length of H₂ in a recent neutron interferometry experiment.
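
    For context, the standard Fermi-approximation expression for the neutron index of refraction is shown below (N is the atomic number density, b_c the bound coherent scattering length, λ the neutron wavelength); the binding and multiple-scattering correction terms discussed above are of higher order in b_c and are not reproduced here.

    ```latex
    % Fermi-approximation form of the neutron index of refraction, for context.
    % N = atomic number density, b_c = bound coherent scattering length,
    % \lambda = neutron wavelength. The binding and multiple-scattering
    % corrections discussed above add terms of higher order in b_c.
    \[
      n \simeq 1 - \frac{\lambda^{2} N b_{c}}{2\pi}
    \]
    ```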

  20. Evaluating methods of correcting for multiple comparisons implemented in SPM12 in social neuroscience fMRI studies: an example from moral psychology.

    PubMed

    Han, Hyemin; Glenn, Andrea L

    2018-06-01

    In fMRI research, the goal of correcting for multiple comparisons is to identify areas of activity that reflect true effects, and thus would be expected to replicate in future studies. Finding an appropriate balance between trying to minimize false positives (Type I error) while not being too stringent and omitting true effects (Type II error) can be challenging. Furthermore, the advantages and disadvantages of these types of errors may differ for different areas of study. In many areas of social neuroscience that involve complex processes and considerable individual differences, such as the study of moral judgment, effects are typically smaller and statistical power weaker, leading to the suggestion that less stringent corrections that allow for more sensitivity may be beneficial and also result in more false positives. Using moral judgment fMRI data, we evaluated four commonly used methods for multiple comparison correction implemented in Statistical Parametric Mapping 12 by examining which method produced the most precise overlap with results from a meta-analysis of relevant studies and with results from nonparametric permutation analyses. We found that voxelwise thresholding with familywise error correction based on Random Field Theory provides a more precise overlap (i.e., without omitting too few regions or encompassing too many additional regions) than either clusterwise thresholding, Bonferroni correction, or false discovery rate correction methods.
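
    Of the four families compared above, the false discovery rate approach is the easiest to sketch in isolation: the snippet below is a textbook Benjamini-Hochberg procedure applied to a vector of voxelwise p-values. It is not SPM12's implementation, which additionally provides RFT-based voxelwise and clusterwise FWE thresholds and Bonferroni correction; the example p-values are placeholders.

    ```python
    # Textbook Benjamini-Hochberg FDR procedure on a vector of voxelwise p-values.
    # Not SPM12's implementation; example p-values are placeholders.
    import numpy as np

    def benjamini_hochberg(p_values, q=0.05):
        """Boolean mask of tests declared significant at FDR level q."""
        p = np.asarray(p_values, dtype=float)
        order = np.argsort(p)
        m = p.size
        thresholds = q * np.arange(1, m + 1) / m
        passed = p[order] <= thresholds
        significant = np.zeros(m, dtype=bool)
        if passed.any():
            k = np.nonzero(passed)[0].max()          # largest i with p_(i) <= q*i/m
            significant[order[:k + 1]] = True
        return significant

    print(benjamini_hochberg([0.0005, 0.004, 0.03, 0.2, 0.6], q=0.05))
    ```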

  1. Temporally contiguous pencast instruction promotes meaningful learning for dental and dental hygiene students in physiology.

    PubMed

    Roesch, Darren M

    2014-01-01

    Smartpens allow for the creation of computerized "pencasts" that combine voice narration with handwritten notes and illustrations. The purpose of this study was to test the effects of voluntary participation in extracurricular instruction with a pencast on student learning. Dental and dental hygiene students were given instruction in a complex physiological topic using lecture and static slides. An Internet link to a pencast that covered the complex topic in a more temporally contiguous fashion was also provided for voluntary review. The students were given a multiple-choice exam that consisted of retention and transfer test questions. Sixty-nine percent of the students who did not watch the pencast and 89 percent of the students who watched the pencast answered the retention test question correctly (p=0.08). Fifty-four percent of the students who did not watch the pencast and 90 percent of the students who watched the pencast answered the transfer test question correctly (p=0.005). This finding indicates that students who watched the pencast performed better on a transfer test, a measurement of meaningful learning, than students who received only the narrated instruction with static images. This supports the hypothesis that temporally contiguous instruction promotes more meaningful learning than lecture accompanied only by static slide images.

  2. Analysing and correcting the differences between multi-source and multi-scale spatial remote sensing observations.

    PubMed

    Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun

    2014-01-01

    Differences exist among the analysis results of agriculture monitoring and crop production based on remote sensing observations, which are obtained at different spatial scales from multiple remote sensors in the same time period and processed by the same algorithms, models or methods. These differences can be quantitatively described mainly from three aspects, i.e. the multiple remote sensing observations, the crop parameter estimation models, and the spatial scale effects of surface parameters. Our research proposes a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide a reference for further studies in agricultural applications with multiple remotely sensed observations from different sources. The new method was constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Statistical theory was applied to extract the statistical characteristics of the multiple surface reflectance datasets and to quantitatively analyse the spatial variations of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, a Gaussian-distribution-based correction of the multiple surface reflectance datasets was applied, based on the physical characteristics, the mathematical distribution properties obtained above, and their spatial variations. The proposed method was verified with two sets of multiple satellite images, obtained in two experimental fields located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. Experimental results indicate that differences among surface reflectance datasets at multiple spatial scales can be effectively corrected over non-homogeneous underlying surfaces, providing a database for further multi-source and multi-scale crop growth monitoring and yield prediction, and for the corresponding consistency analysis and evaluation.
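
    One plausible reading of the Gaussian-statistics correction described above is a moment-matching step: standardize the coarse-scale reflectance dataset and rescale it to the mean and standard deviation of the fine-scale baseline. The sketch below illustrates that general idea only and is not the authors' exact procedure; the example data are synthetic.

    ```python
    # Hedged sketch of one plausible form of the Gaussian-statistics correction:
    # rescale a coarse-scale reflectance dataset to the mean and standard deviation
    # of the fine-scale baseline dataset (moment matching). Illustration only.
    import numpy as np

    def match_to_baseline(reflectance, baseline):
        r = np.asarray(reflectance, dtype=float)
        b = np.asarray(baseline, dtype=float)
        z = (r - r.mean()) / r.std()            # standardize the dataset to be corrected
        return z * b.std() + b.mean()           # rescale to the baseline's Gaussian parameters

    rng = np.random.default_rng(0)
    coarse = rng.normal(0.30, 0.06, 1000)       # hypothetical coarse-scale reflectance
    fine = rng.normal(0.27, 0.04, 4000)         # hypothetical fine-scale baseline
    print(match_to_baseline(coarse, fine).mean())
    ```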

  3. Analysing and Correcting the Differences between Multi-Source and Multi-Scale Spatial Remote Sensing Observations

    PubMed Central

    Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun

    2014-01-01

    Differences exist among the analysis results of agriculture monitoring and crop production based on remote sensing observations, which are obtained at different spatial scales from multiple remote sensors in the same time period and processed by the same algorithms, models or methods. These differences can be quantitatively described mainly from three aspects, i.e. the multiple remote sensing observations, the crop parameter estimation models, and the spatial scale effects of surface parameters. Our research proposes a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide a reference for further studies in agricultural applications with multiple remotely sensed observations from different sources. The new method was constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Statistical theory was applied to extract the statistical characteristics of the multiple surface reflectance datasets and to quantitatively analyse the spatial variations of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, a Gaussian-distribution-based correction of the multiple surface reflectance datasets was applied, based on the physical characteristics, the mathematical distribution properties obtained above, and their spatial variations. The proposed method was verified with two sets of multiple satellite images, obtained in two experimental fields located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. Experimental results indicate that differences among surface reflectance datasets at multiple spatial scales can be effectively corrected over non-homogeneous underlying surfaces, providing a database for further multi-source and multi-scale crop growth monitoring and yield prediction, and for the corresponding consistency analysis and evaluation. PMID:25405760

  4. Plasma process control with optical emission spectroscopy

    NASA Astrophysics Data System (ADS)

    Ward, P. P.

    Plasma processes for cleaning, etching and desmear of electronic components and printed wiring boards (PWB) are difficult to predict and control. Non-uniformity of most plasma processes and sensitivity to environmental changes make it difficult to maintain process stability from day to day. To assure plasma process performance, weight loss coupons or post-plasma destructive testing must be used. The problem with these techniques is that they are not real-time methods and do not allow for immediate diagnosis and process correction. These methods often require scrapping some fraction of a batch to ensure the integrity of the rest. Since these methods verify a successful cycle with post-plasma diagnostics, poor test results often determine that a batch is substandard and the resulting parts unusable. Both of these methods add to the overall fabrication cost. A more efficient method of testing would allow for constant monitoring of plasma conditions and process control. Process failures should be detected before the parts being treated are damaged. Real-time monitoring would allow for instantaneous corrections. Multiple-site monitoring would allow for process mapping within one system or simultaneous monitoring of multiple systems. Optical emission spectroscopy conducted external to the plasma apparatus would allow for this sort of multifunctional analysis without perturbing the glow discharge. In this paper, optical emission spectroscopy for non-intrusive, in situ process control will be explored. A discussion of this technique as it applies towards process control, failure analysis and endpoint determination will be conducted. Methods for identifying process failures, progress and the end of etch-back and desmear processes will be discussed.
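
    A minimal sketch of OES-based endpoint detection is given below: smooth a monitored emission-line intensity trace and flag the time at which it falls to a fixed fraction of its early plateau. The smoothing window, threshold fraction and example trace are hypothetical; a production recipe would be tuned per process and emission line.

    ```python
    # Hypothetical endpoint-detection sketch for a monitored emission-line trace:
    # smooth the intensity and flag when it drops below a fraction of its early
    # plateau. Window, threshold and example trace are illustrative only.
    import numpy as np

    def detect_endpoint(times, intensity, window=15, fraction=0.5):
        smoothed = np.convolve(intensity, np.ones(window) / window, mode="same")
        plateau = np.median(smoothed[window:3 * window])      # early, stable part of the process
        below = np.nonzero(smoothed < fraction * plateau)[0]
        return times[below[0]] if below.size else None        # endpoint time, or None if never reached

    t = np.linspace(0, 120, 1200)                                                   # seconds
    trace = 1.0 / (1.0 + np.exp((t - 90) / 3)) + 0.02 * np.random.default_rng(2).random(t.size)
    print(detect_endpoint(t, trace))                                                # ~90 s
    ```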

  5. Association of genetic variants in RAB23 and ANXA11 with uveitis in sarcoidosis

    PubMed Central

    Davoudi, Samaneh; Chang, Victoria S.; Navarro-Gomez, Daniel; Stanwyck, Lynn K.; Sevgi, Damla Duriye; Papavasileiou, Evangelia; Ren, Aiai; Uchiyama, Eduardo; Sullivan, Lynn; Lobo, Ann-Marie; Papaliodis, George N.

    2018-01-01

    Purpose Uveitis occurs in a subset of patients with sarcoidosis. The purpose of this study was to determine whether genetic variants that have been associated previously with overall sarcoidosis are associated with increased risk of developing uveitis. Methods Seventy-seven subjects were enrolled, including 45 patients diagnosed with sarcoidosis-related uveitis as cases and 32 patients with systemic sarcoidosis without ocular involvement as controls. Thirty-eight single nucleotide polymorphisms (SNPs) previously associated with sarcoidosis, sarcoidosis severity, or other organ-specific sarcoidosis involvement were identified. Allele frequencies in ocular sarcoidosis cases versus controls were compared using the chi-square test, and p values were corrected for multiple hypothesis testing using permutation. All analyses were conducted with PLINK. Results SNPs rs1040461 and rs61860052, in the Ras-related protein Rab-23 (RAB23) and annexin A11 (ANXA11) genes, respectively, were associated with sarcoidosis-associated uveitis. The T allele of rs1040461 and the A allele of rs61860052 were found to be more prevalent in ocular sarcoidosis cases. These associations remained after correction for the multiple hypotheses tested (p=0.01 and p=0.02). In a subanalysis of Caucasian Americans only, two additional variants within the major histocompatibility complex (MHC) genes on chromosome 6, in HLA-DRB5 and HLA-DRB1, were associated with uveitis as well (p=0.009 and p=0.04). Conclusions Genetic variants in the RAB23 and ANXA11 genes were associated with an increased risk of sarcoidosis-associated uveitis. These loci have previously been associated with overall sarcoidosis risk. PMID:29416296
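
    The permutation-based correction used here can be illustrated generically as below: case/control labels are shuffled many times, the minimum p-value across all SNPs is recorded for each shuffle, and each observed p-value is compared against that null distribution. This mirrors the idea of a max-statistic correction but is not PLINK's implementation; the genotype data in the example are simulated.

    ```python
    # Generic sketch of a permutation-based (max-statistic style) correction across
    # several SNP association tests. Mirrors the idea, not PLINK's implementation;
    # genotypes and labels below are simulated.
    import numpy as np
    from scipy.stats import chi2_contingency

    def allele_test_p(genotypes, labels):
        """Chi-square allelic test; genotypes are 0/1/2 copies of the test allele."""
        g_case, g_ctrl = genotypes[labels == 1], genotypes[labels == 0]
        table = [[g_case.sum(), 2 * g_case.size - g_case.sum()],
                 [g_ctrl.sum(), 2 * g_ctrl.size - g_ctrl.sum()]]
        return chi2_contingency(table)[1]

    def permutation_corrected_p(geno_matrix, labels, n_perm=1000, seed=0):
        rng = np.random.default_rng(seed)
        observed = np.array([allele_test_p(g, labels) for g in geno_matrix])
        min_null = np.empty(n_perm)
        for i in range(n_perm):
            shuffled = rng.permutation(labels)
            min_null[i] = min(allele_test_p(g, shuffled) for g in geno_matrix)
        return np.array([(min_null <= p).mean() for p in observed])   # family-wise corrected p-values

    rng = np.random.default_rng(3)
    labels = np.array([1] * 45 + [0] * 32)                  # cases / controls, matching the sample sizes above
    geno = rng.binomial(2, 0.3, size=(38, labels.size))     # 38 simulated SNPs
    print(permutation_corrected_p(geno, labels, n_perm=200)[:5])
    ```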

  6. "An integrative formal model of motivation and decision making: The MGPM*": Correction to Ballard et al. (2016).

    PubMed

    2017-02-01

    Reports an error in "An integrative formal model of motivation and decision making: The MGPM*" by Timothy Ballard, Gillian Yeo, Shayne Loft, Jeffrey B. Vancouver and Andrew Neal (Journal of Applied Psychology, 2016[Sep], Vol 101[9], 1240-1265). Equation A3 contained an error. The correct equation is provided in the erratum. (The following abstract of the original article appeared in record 2016-28692-001.) We develop and test an integrative formal model of motivation and decision making. The model, referred to as the extended multiple-goal pursuit model (MGPM*), is an integration of the multiple-goal pursuit model (Vancouver, Weinhardt, & Schmidt, 2010) and decision field theory (Busemeyer & Townsend, 1993). Simulations of the model generated predictions regarding the effects of goal type (approach vs. avoidance), risk, and time sensitivity on prioritization. We tested these predictions in an experiment in which participants pursued different combinations of approach and avoidance goals under different levels of risk. The empirical results were consistent with the predictions of the MGPM*. Specifically, participants pursuing 1 approach and 1 avoidance goal shifted priority from the approach to the avoidance goal over time. Among participants pursuing 2 approach goals, those with low time sensitivity prioritized the goal with the larger discrepancy, whereas those with high time sensitivity prioritized the goal with the smaller discrepancy. Participants pursuing 2 avoidance goals generally prioritized the goal with the smaller discrepancy. Finally, all of these effects became weaker as the level of risk increased. We used quantitative model comparison to show that the MGPM* explained the data better than the original multiple-goal pursuit model, and that the major extensions from the original model were justified. The MGPM* represents a step forward in the development of a general theory of decision making during multiple-goal pursuit. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  7. Retention of Gadolinium-Based Contrast Agents in Multiple Sclerosis: Retrospective Analysis of an 18-Year Longitudinal Study.

    PubMed

    Forslin, Y; Shams, S; Hashim, F; Aspelin, P; Bergendal, G; Martola, J; Fredrikson, S; Kristoffersen-Wiberg, M; Granberg, T

    2017-07-01

    Gadolinium-based contrast agents have been associated with lasting high T1-weighted signal intensity in the dentate nucleus and globus pallidus, with histopathologically confirmed gadolinium retention. We aimed to longitudinally investigate the relationship of multiple gadolinium-based contrast agent administrations to the Signal Intensity Index in the dentate nucleus and globus pallidus and any associations with cognitive function in multiple sclerosis. The Signal Intensity Index in the dentate nucleus and globus pallidus was retrospectively evaluated on T1-weighted MR imaging in an 18-year longitudinal cohort study of 23 patients with MS receiving multiple gadolinium-based contrast agent administrations and 23 healthy age- and sex-matched controls. Participants also underwent comprehensive neuropsychological testing. Patients with MS had a higher Signal Intensity Index in the dentate nucleus ( P < .001), but not in the globus pallidus ( P = .19), compared with non-gadolinium-based contrast agent-exposed healthy controls by an unpaired t test. Increasing numbers of gadolinium-based contrast agent administrations were associated with an increased Signal Intensity Index in the dentate nucleus (β = 0.45, P < .001) and globus pallidus (β = 0.60, P < .001). This association remained stable with corrections for the age, disease duration, and physical disability for both the dentate nucleus (β = 0.43, P = .001) and globus pallidus (β = 0.58, P < .001). An increased Signal Intensity Index in the dentate nucleus among patients with MS was associated with lower verbal fluency scores, which remained significant after correction for several aspects of disease severity (β = -0.40 P = .013). Our data corroborate previous reports of lasting gadolinium retention in brain tissues. An increased Signal Intensity Index in the dentate nucleus and globus pallidus was associated with lower verbal fluency, which does not prove causality but encourages further studies on cognition and gadolinium-based contrast agent administration. © 2017 by American Journal of Neuroradiology.

  8. Scatter characterization and correction for simultaneous multiple small-animal PET imaging.

    PubMed

    Prasad, Rameshwar; Zaidi, Habib

    2014-04-01

    The rapid growth and usage of small-animal positron emission tomography (PET) in molecular imaging research has led to increased demand on PET scanner's time. One potential solution to increase throughput is to scan multiple rodents simultaneously. However, this is achieved at the expense of deterioration of image quality and loss of quantitative accuracy owing to enhanced effects of photon attenuation and Compton scattering. The purpose of this work is, first, to characterize the magnitude and spatial distribution of the scatter component in small-animal PET imaging when scanning single and multiple rodents simultaneously and, second, to assess the relevance and evaluate the performance of scatter correction under similar conditions. The LabPET™-8 scanner was modelled as realistically as possible using Geant4 Application for Tomographic Emission Monte Carlo simulation platform. Monte Carlo simulations allow the separation of unscattered and scattered coincidences and as such enable detailed assessment of the scatter component and its origin. Simple shape-based and more realistic voxel-based phantoms were used to simulate single and multiple PET imaging studies. The modelled scatter component using the single-scatter simulation technique was compared to Monte Carlo simulation results. PET images were also corrected for attenuation and the combined effect of attenuation and scatter on single and multiple small-animal PET imaging evaluated in terms of image quality and quantitative accuracy. A good agreement was observed between calculated and Monte Carlo simulated scatter profiles for single- and multiple-subject imaging. In the LabPET™-8 scanner, the detector covering material (kovar) contributed the maximum amount of scatter events while the scatter contribution due to lead shielding is negligible. The out-of field-of-view (FOV) scatter fraction (SF) is 1.70, 0.76, and 0.11% for lower energy thresholds of 250, 350, and 400 keV, respectively. The increase in SF ranged between 25 and 64% when imaging multiple subjects (three to five) of different size simultaneously in comparison to imaging a single subject. The spill-over ratio (SOR) increases with increasing the number of subjects in the FOV. Scatter correction improved the SOR for both water and air cold compartments of single and multiple imaging studies. The recovery coefficients for different body parts of the mouse whole-body and rat whole-body anatomical models were improved for multiple imaging studies following scatter correction. The magnitude and spatial distribution of the scatter component in small-animal PET imaging of single and multiple subjects simultaneously were characterized, and its impact was evaluated in different situations. Scatter correction improves PET image quality and quantitative accuracy for single rat and simultaneous multiple mice and rat imaging studies, whereas its impact is insignificant in single mouse imaging.
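
    As simple bookkeeping for the Monte Carlo-separated coincidences described above, the scatter fraction and a subtraction-style correction can be sketched as follows (counts are hypothetical):

    ```python
    # Simple bookkeeping for Monte Carlo-separated coincidences: scatter fraction and
    # a subtraction-style correction of the prompts. Counts are hypothetical.
    def scatter_fraction(true_counts, scattered_counts):
        return scattered_counts / (true_counts + scattered_counts)

    def scatter_corrected(prompts, estimated_scatter):
        return prompts - estimated_scatter      # estimated e.g. by single-scatter simulation

    print(scatter_fraction(true_counts=9.0e5, scattered_counts=2.5e5))
    ```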

  9. Correlation processing for correction of phase distortions in subaperture imaging.

    PubMed

    Tavh, B; Karaman, M

    1999-01-01

    Ultrasonic subaperture imaging combines synthetic aperture and phased array approaches and permits low-cost systems with improved image quality. In subaperture processing, a large array is synthesized using echo signals collected from a number of receive subapertures by multiple firings of a phased transmit subaperture. Tissue inhomogeneities and displacements in subaperture imaging may cause significant phase distortions on received echo signals. Correlation processing on reference echo signals can be used for correction of the phase distortions, for which the accuracy and robustness are critically limited by the signal correlation. In this study, we explore correlation processing techniques for adaptive subaperture imaging with phase correction for motion and tissue inhomogeneities. The proposed techniques use new subaperture data acquisition schemes to produce reference signal sets with improved signal correlation. The experimental test results were obtained using raw radio frequency (RF) data acquired from two different phantoms with 3.5 MHz, 128-element transducer array. The results show that phase distortions can effectively be compensated by the proposed techniques in real-time adaptive subaperture imaging.
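
    The core of correlation-based phase correction can be sketched as a delay estimate from the cross-correlation peak between a reference echo and a received echo, as below. This is a generic numpy illustration with an integer-sample shift, not the authors' subaperture acquisition scheme; signal parameters are assumed.

    ```python
    # Generic delay estimation from the cross-correlation peak between a reference
    # echo and a received echo, with a crude integer-sample compensation. Signal
    # parameters are assumed for illustration.
    import numpy as np

    def estimate_delay(reference, received, fs):
        xcorr = np.correlate(received, reference, mode="full")
        lag = int(np.argmax(xcorr)) - (len(reference) - 1)   # samples; positive = received arrives late
        return lag / fs                                      # seconds

    def apply_delay(signal, lag_samples):
        return np.roll(signal, -lag_samples)                 # crude integer-sample compensation

    fs = 40e6                                                # 40 MHz sampling, illustrative
    t = np.arange(0, 2e-6, 1 / fs)
    ref = np.exp(-((t - 1e-6) / 2e-7) ** 2) * np.sin(2 * np.pi * 3.5e6 * t)
    rx = np.roll(ref, 12)                                    # received copy delayed by 12 samples
    print(estimate_delay(ref, rx, fs) * fs)                  # ~12
    ```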

  10. Quantification of γ-aminobutyric acid (GABA) in ¹H MRS volumes composed heterogeneously of grey and white matter.

    PubMed

    Mikkelsen, Mark; Singh, Krish D; Brealy, Jennifer A; Linden, David E J; Evans, C John

    2016-11-01

    The quantification of γ-aminobutyric acid (GABA) concentration using localised MRS suffers from partial volume effects related to differences in the intrinsic concentration of GABA in grey (GM) and white (WM) matter. These differences can be represented as a ratio between intrinsic GABA in GM and WM: r_M. Individual differences in GM tissue volume can therefore potentially drive apparent concentration differences. Here, a quantification method that corrects for these effects is formulated and empirically validated. Quantification using tissue water as an internal concentration reference has been described previously. Partial volume effects attributed to r_M can be accounted for by incorporating into this established method an additional multiplicative correction factor based on measured or literature values of r_M weighted by the proportion of GM and WM within tissue-segmented MRS volumes. Simulations were performed to test the sensitivity of this correction using different assumptions of r_M taken from previous studies. The tissue correction method was then validated by applying it to an independent dataset of in vivo GABA measurements using an empirically measured value of r_M. It was shown that incorrect assumptions of r_M can lead to overcorrection and inflation of GABA concentration measurements quantified in volumes composed predominantly of WM. For the independent dataset, GABA concentration was linearly related to GM tissue volume when only the water signal was corrected for partial volume effects. Performing a full correction that additionally accounts for partial volume effects ascribed to r_M successfully removed this dependence. With an appropriate assumption of the ratio of intrinsic GABA concentration in GM and WM, GABA measurements can be corrected for partial volume effects, potentially leading to a reduction in between-participant variance, increased power in statistical tests and better discriminability of true effects. Copyright © 2016 John Wiley & Sons, Ltd.
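
    One plausible form of an r_M-based multiplicative correction, under simplifying assumptions (no GABA in CSF, and relaxation and water-visibility differences ignored), is sketched below: if intrinsic GABA in GM is r_M times that in WM, the volume-weighted measurement can be rescaled to a GM-referenced value. This illustrates the idea only and is not necessarily the paper's exact equation; the input values are hypothetical.

    ```python
    # One plausible r_M-based partial-volume correction, assuming no GABA in CSF and
    # ignoring relaxation/water-visibility differences. Not necessarily the paper's
    # exact equation; inputs are hypothetical.
    def gm_referenced_gaba(c_measured, f_gm, f_wm, r_m):
        # measured ~ (f_gm * c_gm + f_wm * c_gm / r_m) / (f_gm + f_wm), solved for c_gm
        correction = (f_gm + f_wm) / (f_gm + f_wm / r_m)
        return c_measured * correction

    print(gm_referenced_gaba(c_measured=1.0, f_gm=0.40, f_wm=0.55, r_m=2.0))
    ```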

  11. Detector-Response Correction of Two-Dimensional γ -Ray Spectra from Neutron Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rusev, G.; Jandel, M.; Arnold, C. W.

    2015-05-28

    The neutron-capture reaction produces a large variety of γ-ray cascades with different γ-ray multiplicities. A measured spectral distribution of these cascades for each γ-ray multiplicity is of importance to applications and studies of γ-ray statistical properties. The DANCE array, a 4π ball of 160 BaF₂ detectors, is an ideal tool for measurement of neutron-capture γ-rays. The high granularity of DANCE enables measurements of high-multiplicity γ-ray cascades. The measured two-dimensional spectra (γ-ray energy, γ-ray multiplicity) have to be corrected for the DANCE detector response in order to compare them with predictions of the statistical model or use them in applications. The detector-response correction problem becomes more difficult for a 4π detection system than for a single detector. A trial-and-error approach and an iterative decomposition of γ-ray multiplets have been successfully applied to the detector-response correction. As a result, applications of the decomposition methods are discussed for two-dimensional γ-ray spectra measured at DANCE from γ-ray sources and from the ¹⁰B(n,γ) and ¹¹³Cd(n,γ) reactions.

  12. An Experimental Evaluation of Blockage Corrections for Current Turbines

    NASA Astrophysics Data System (ADS)

    Ross, Hannah; Polagye, Brian

    2017-11-01

    Flow confinement has been shown to significantly alter the performance of turbines that extract power from water currents. These performance effects are related to the degree of constraint, defined by the ratio of turbine projected area to channel cross-sectional area. This quantity is referred to as the blockage ratio. Because it is often desirable to adjust experimental observations in water channels to unconfined conditions, analytical corrections for both wind and current turbines have been derived. These are generally based on linear momentum actuator disk theory but have been applied to turbines without experimental validation. This work tests multiple blockage corrections on performance and thrust data from a cross-flow turbine and porous plates (experimental analogues to actuator disks) collected in laboratory flumes at blockage ratios ranging between 10 and 35%. To isolate the effects of blockage, the Reynolds number, Froude number, and submergence depth were held constant while the channel width was varied. Corrected performance data are compared to performance in a towing tank at a blockage ratio of less than 5%. In addition to examining the accuracy of each correction, underlying assumptions are assessed to determine why some corrections perform better than others. This material is based upon work supported by the National Science Foundation Graduate Research Fellowship Program under Grant No. DGE-1256082 and the Naval Facilities Engineering Command (NAVFAC).

  13. Metabolic pathways as possible therapeutic targets for progressive multiple sclerosis.

    PubMed

    Heidker, Rebecca M; Emerson, Mitchell R; LeVine, Steven M

    2017-08-01

    Unlike relapsing-remitting multiple sclerosis, progressive forms of multiple sclerosis have very few therapeutic options. While immune mechanisms are key participants in the pathogenesis of relapsing-remitting multiple sclerosis, the mechanisms underlying the development of progressive multiple sclerosis are less well understood. Putative mechanisms behind progressive multiple sclerosis have been put forth: insufficient energy production via mitochondrial dysfunction, activated microglia, iron accumulation, oxidative stress, activated astrocytes, Wallerian degeneration, apoptosis, etc. Furthermore, repair processes such as remyelination are incomplete. Experimental therapies that strive to improve metabolism within neurons and glia, e.g., oligodendrocytes, could act to counter inadequate energy supplies and/or support remyelination. Most experimental approaches have been examined as standalone interventions; however, it is apparent that the biochemical steps being targeted are part of larger pathways, which are further intertwined with other metabolic pathways. Thus, the potential benefits of a tested intervention, or of an established therapy, e.g., ocrelizumab, could be undermined by constraints on upstream and/or downstream steps. If correct, then this argues for a more comprehensive, multifaceted approach to therapy. Here we review experimental approaches to support neuronal and glial metabolism and/or promote remyelination, which may have potential to lessen or delay progressive multiple sclerosis.

  14. Limb Correction of Polar-Orbiting Imagery for the Improved Interpretation of RGB Composites

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.; Elmer, Nicholas

    2016-01-01

    Red-Green-Blue (RGB) composite imagery combines information from several spectral channels into one image to aid in the operational analysis of atmospheric processes. However, infrared channels are adversely affected by the limb effect, the result of an increase in optical path length of the absorbing atmosphere between the satellite and the earth as viewing zenith angle increases. This paper reviews a newly developed technique to quickly correct for limb effects in both clear and cloudy regions using latitudinally and seasonally varying limb correction coefficients for real-time applications. These limb correction coefficients account for the increase in optical path length in order to produce limb-corrected RGB composites. The improved utility of a limb-corrected Air Mass RGB composite from the application of this approach is demonstrated using Aqua Moderate Resolution Imaging Spectroradiometer (MODIS) imagery. However, the limb correction can be applied to any polar-orbiting sensor infrared channels, provided the proper limb correction coefficients are calculated. Corrected RGB composites provide multiple advantages over uncorrected RGB composites, including increased confidence in the interpretation of RGB features, improved situational awareness for operational forecasters, and the ability to use RGB composites from multiple sensors jointly to increase the temporal frequency of observations.

  15. Ensuring production-worthy OPC recipes using large test structure arrays

    NASA Astrophysics Data System (ADS)

    Cork, Christopher; Zimmermann, Rainer; Mei, Xin; Shahin, Alexander

    2007-03-01

    The continual shrinking of design rules as the industry follows Moore's Law, and the associated need for low-k1 processes, have resulted in more layout configurations becoming difficult to print within the required tolerances. OPC recipes have needed to become more complex as tolerances have decreased and acceptable corrections have become harder to find with simple algorithms. With this complexity comes the possibility of coding errors and the challenge of ensuring that the solutions are truly general. OPC verification tools can check the quality of a correction based on pre-determined specifications for CD variation, line-end pullback and edge placement error, and then highlight layout configurations where violations are found. The problem facing a mask tape-out group is that it usually has little control over the design styles coming in. Different approaches to eliminating problematic layouts have included highly restrictive design rules [1], whereby certain pitches or orientations are disallowed. Now these design rules are either becoming too complex or they overly restrict the designer from benefiting from the reduced pitch of the new node. The tight link between design and mask tape-out found in Integrated Device Manufacturers (IDMs) [2], i.e. companies that control both design and manufacturing, can do much to dictate manufacturing-friendly layout styles and push ownership of problem resolution back to design groups. In fact, this has been perceived as such an issue that a new class of products for designers that perform lithographic compliance checks on design layouts is an emerging technology [3]. In contrast to IDMs, semiconductor foundries are presented with a much larger variety of design styles and a set of fabless customers who are generally less knowledgeable about the impact of their layout on manufacturability and how to correct issues. The robustness requirements of a foundry's OPC correction recipe therefore need to be greater than those for an IDM's tape-out group. An OPC correction recipe that gives acceptable verification results based solely on one customer's GDS is clearly not sufficient to guarantee that all future tape-outs from multiple customers will be similarly clean. Ad hoc changes made in reaction to problems seen at verification are risky: while they may solve one particular layout issue on one product, there is no guarantee that the problem will not simply shift to another configuration on a yet-to-be-manufactured part. The need to re-qualify a recipe over multiple products at each recipe change can easily result in excessive computational requirements. A single layer at an advanced node typically needs overnight runs on a large processor farm. Much of this layout, however, is extremely repetitive, made from a few standard cells placed tens of thousands of times. An alternative and more efficient approach, suggested by this paper as a screening methodology, is to encapsulate the problematic structures into a programmable test structure array. The dimensions of these test structures are parameterized in software such that they can be generated with these dimensions varied over the space of the design rules and conceivable design styles. By verifying the new recipe over these test structures, one could more quickly gain confidence that the recipe would be robust over multiple tape-outs. This paper gives some examples of the implementation of this methodology.

  16. Single-Trial Normalization for Event-Related Spectral Decomposition Reduces Sensitivity to Noisy Trials

    PubMed Central

    Grandchamp, Romain; Delorme, Arnaud

    2011-01-01

    In electroencephalography, the classical event-related potential model often proves to be a limited method to study complex brain dynamics. For this reason, spectral techniques adapted from signal processing such as event-related spectral perturbation (ERSP) – and its variant event-related synchronization and event-related desynchronization – have been used over the past 20 years. They represent average spectral changes in response to a stimulus. These spectral methods do not have strong consensus for comparing pre- and post-stimulus activity. When computing ERSP, pre-stimulus baseline removal is usually performed after averaging the spectral estimate of multiple trials. Correcting the baseline of each single-trial prior to averaging spectral estimates is an alternative baseline correction method. However, we show that this method leads to positively skewed post-stimulus ERSP values. We eventually present new single-trial-based ERSP baseline correction methods that perform trial normalization or centering prior to applying classical baseline correction methods. We show that single-trial correction methods minimize the contribution of artifactual data trials with high-amplitude spectral estimates and are robust to outliers when performing statistical inference testing. We then characterize these methods in terms of their time–frequency responses and behavior compared to classical ERSP methods. PMID:21994498
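
    A minimal sketch, assuming NumPy and a precomputed trial-by-frequency-by-time array of spectral power, of one plausible single-trial correction in the spirit described above: each trial is normalized by its own full-epoch mean before trials are averaged and a classical dB baseline correction is applied. This illustrates the general idea only and is not the exact set of estimators compared in the paper.

    ```python
    import numpy as np

    def ersp_single_trial_correction(power, baseline_idx):
        """
        power: array (n_trials, n_freqs, n_times) of spectral power estimates.
        baseline_idx: index array selecting pre-stimulus time points.

        Each trial is first divided by its own full-epoch mean power (per frequency),
        reducing the weight of trials with outlying high-amplitude estimates, before
        the classical average-baseline correction in dB is applied.
        """
        # Single-trial normalization (full-epoch gain correction per trial/frequency)
        trial_gain = power.mean(axis=2, keepdims=True)        # (n_trials, n_freqs, 1)
        power_norm = power / trial_gain

        # Average over trials, then classical divisive baseline correction in dB
        mean_power = power_norm.mean(axis=0)                  # (n_freqs, n_times)
        baseline = mean_power[:, baseline_idx].mean(axis=1, keepdims=True)
        return 10.0 * np.log10(mean_power / baseline)

    # Example with random data standing in for trial spectrograms
    rng = np.random.default_rng(0)
    ersp = ersp_single_trial_correction(rng.gamma(2.0, size=(40, 30, 200)),
                                        baseline_idx=np.arange(0, 50))
    ```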

  17. Altered Amygdala Resting-State Functional Connectivity in Maintenance Hemodialysis End-Stage Renal Disease Patients with Depressive Mood.

    PubMed

    Chen, Hui Juan; Wang, Yun Fei; Qi, Rongfeng; Schoepf, U Joseph; Varga-Szemes, Akos; Ball, B Devon; Zhang, Zhe; Kong, Xiang; Wen, Jiqiu; Li, Xue; Lu, Guang Ming; Zhang, Long Jiang

    2017-04-01

    The purpose of this study was to investigate patterns in the amygdala-based emotional processing circuit of hemodialysis patients using resting-state functional MR imaging (rs-fMRI). Fifty hemodialysis patients (25 with depressed mood and 25 without depressed mood) and 26 healthy controls were included. All subjects underwent neuropsychological tests and rs-fMRI, and patients also underwent laboratory tests. Functional connectivity of the bilateral amygdala was compared among the three groups. The relationship between functional connectivity and clinical markers was investigated. Depressed patients showed increased positive functional connectivity of the left amygdala with the left superior temporal gyrus and right parahippocampal gyrus (PHG) but decreased amygdala functional connectivity with the left precuneus, angular gyrus, posterior cingulate cortex (PCC), and left inferior parietal lobule compared with non-depressed patients (P < 0.05, AlphaSim corrected). Depressed patients had increased positive functional connectivity of the right amygdala with bilateral supplementary motor areas and PHG but decreased amygdala functional connectivity with the right superior frontal gyrus, superior parietal lobule, bilateral precuneus, and PCC (P < 0.05, AlphaSim corrected). After including anxiety as a covariate, we discovered additional decreased functional connectivity with anterior cingulate cortex (ACC) for bilateral amygdala (P < 0.05, AlphaSim corrected). For the depressed, neuropsychological test scores were correlated with functional connectivity of multiple regions (P < 0.05, AlphaSim corrected). In conclusion, functional connectivity in the amygdala-prefrontal-PCC-limbic circuits was impaired in depressive hemodialysis patients, with a gradual decrease in ACC between controls, non-depressed, and depressed patients for the right amygdala. This indicates that ACC plays a role in amygdala-based emotional regulatory circuits in these patients.

  18. Multicenter Evaluation of the Vitek MS v3.0 System for the Identification of Filamentous Fungi.

    PubMed

    Rychert, Jenna; Slechta, E Sue; Barker, Adam P; Miranda, Edwin; Babady, N Esther; Tang, Yi-Wei; Gibas, Connie; Wiederhold, Nathan; Sutton, DeAnna; Hanson, Kimberly E

    2018-02-01

    Invasive fungal infections are an important cause of morbidity and mortality affecting primarily immunocompromised patients. While fungal identification to the species level is critical to providing appropriate therapy, it can be slow and laborious and often relies on subjective morphological criteria. The use of matrix-assisted laser desorption ionization-time of flight (MALDI-TOF) mass spectrometry has the potential to speed up and improve the accuracy of identification. In this multicenter study, we evaluated the accuracy of the Vitek MS v3.0 system in identifying 1,601 clinical mold isolates compared to identification by DNA sequence analysis and supported by morphological and phenotypic testing. Among the 1,519 isolates representing organisms in the v3.0 database, 91% ( n = 1,387) were correctly identified to the species level. An additional 27 isolates (2%) were correctly identified to the genus level. Fifteen isolates were incorrectly identified, due to either a single incorrect identification ( n = 13) or multiple identifications from different genera ( n = 2). In those cases, when a single identification was provided that was not correct, the misidentification was within the same genus. The Vitek MS v3.0 was unable to identify 91 (6%) isolates, despite repeat testing. These isolates were distributed among all the genera. When considering all isolates tested, even those that were not represented in the database, the Vitek MS v3.0 provided a single correct identification 98% of the time. These findings demonstrate that the Vitek MS v3.0 system is highly accurate for the identification of common molds encountered in the clinical mycology laboratory. Copyright © 2018 American Society for Microbiology.

  19. Bleed-through correction for rendering and correlation analysis in multi-colour localization microscopy

    PubMed Central

    Kim, Dahan; Curthoys, Nikki M.; Parent, Matthew T.; Hess, Samuel T.

    2015-01-01

    Multi-colour localization microscopy has enabled sub-diffraction studies of colocalization between multiple biological species and quantification of their correlation at length scales previously inaccessible with conventional fluorescence microscopy. However, bleed-through, or misidentification of probe species, creates false colocalization and artificially increases certain types of correlation between two imaged species, affecting the reliability of information provided by colocalization and quantified correlation. Despite the potential risk of these artefacts of bleed-through, neither the effect of bleed-through on correlation nor methods of its correction in correlation analyses has been systematically studied at typical rates of bleed-through reported to affect multi-colour imaging. Here, we present a reliable method of bleed-through correction applicable to image rendering and correlation analysis of multi-colour localization microscopy. Application of our bleed-through correction shows our method accurately corrects the artificial increase in both types of correlations studied (Pearson coefficient and pair correlation), at all rates of bleed-through tested, in all types of correlations examined. In particular, anti-correlation could not be quantified without our bleed-through correction, even at rates of bleed-through as low as 2%. Demonstrated with dichroic-based multi-colour FPALM here, our presented method of bleed-through correction can be applied to all types of localization microscopy (PALM, STORM, dSTORM, GSDIM, etc.), including both simultaneous and sequential multi-colour modalities, provided the rate of bleed-through can be reliably determined. PMID:26185614
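
    The paper's specific correction procedure is not detailed in the abstract; the sketch below is a generic, hypothetical two-channel linear unmixing given a known bleed-through rate, intended only to illustrate how a known misidentification fraction can be removed from rendered channel images before correlation analysis.

    ```python
    import numpy as np

    def unmix_bleedthrough(img_a, img_b, alpha_ab, alpha_ba=0.0):
        """
        Toy linear unmixing of bleed-through between two rendered channels.

        img_a, img_b : 2-D arrays of localization counts (or intensity) per pixel.
        alpha_ab     : fraction of species-A localizations misassigned to channel B.
        alpha_ba     : fraction of species-B localizations misassigned to channel A.

        Observed = M @ True, with M = [[1-alpha_ab, alpha_ba],
                                       [alpha_ab,   1-alpha_ba]], solved per pixel.
        """
        m = np.array([[1.0 - alpha_ab, alpha_ba],
                      [alpha_ab, 1.0 - alpha_ba]])
        m_inv = np.linalg.inv(m)
        stacked = np.stack([img_a.ravel(), img_b.ravel()])   # shape (2, n_pixels)
        true_a, true_b = m_inv @ stacked
        # Negative values can arise from noise; clip to zero before correlation analysis.
        return (np.clip(true_a, 0, None).reshape(img_a.shape),
                np.clip(true_b, 0, None).reshape(img_b.shape))
    ```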

  20. Bleed-through correction for rendering and correlation analysis in multi-colour localization microscopy.

    PubMed

    Kim, Dahan; Curthoys, Nikki M; Parent, Matthew T; Hess, Samuel T

    2013-09-01

    Multi-colour localization microscopy has enabled sub-diffraction studies of colocalization between multiple biological species and quantification of their correlation at length scales previously inaccessible with conventional fluorescence microscopy. However, bleed-through, or misidentification of probe species, creates false colocalization and artificially increases certain types of correlation between two imaged species, affecting the reliability of information provided by colocalization and quantified correlation. Despite the potential risk of these artefacts of bleed-through, neither the effect of bleed-through on correlation nor methods of its correction in correlation analyses has been systematically studied at typical rates of bleed-through reported to affect multi-colour imaging. Here, we present a reliable method of bleed-through correction applicable to image rendering and correlation analysis of multi-colour localization microscopy. Application of our bleed-through correction shows our method accurately corrects the artificial increase in both types of correlations studied (Pearson coefficient and pair correlation), at all rates of bleed-through tested, in all types of correlations examined. In particular, anti-correlation could not be quantified without our bleed-through correction, even at rates of bleed-through as low as 2%. Demonstrated with dichroic-based multi-colour FPALM here, our presented method of bleed-through correction can be applied to all types of localization microscopy (PALM, STORM, dSTORM, GSDIM, etc.), including both simultaneous and sequential multi-colour modalities, provided the rate of bleed-through can be reliably determined.

  1. Local multiplicity adjustment for the spatial scan statistic using the Gumbel distribution.

    PubMed

    Gangnon, Ronald E

    2012-03-01

    The spatial scan statistic is an important and widely used tool for cluster detection. It is based on the simultaneous evaluation of the statistical significance of the maximum likelihood ratio test statistic over a large collection of potential clusters. In most cluster detection problems, there is variation in the extent of local multiplicity across the study region. For example, using a fixed maximum geographic radius for clusters, urban areas typically have many overlapping potential clusters, whereas rural areas have relatively few. The spatial scan statistic does not account for local multiplicity variation. We describe a previously proposed local multiplicity adjustment based on a nested Bonferroni correction and propose a novel adjustment based on a Gumbel distribution approximation to the distribution of a local scan statistic. We compare the performance of all three statistics in terms of power and a novel unbiased cluster detection criterion. These methods are then applied to the well-known New York leukemia dataset and a Wisconsin breast cancer incidence dataset. © 2011, The International Biometric Society.
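
    A minimal sketch of the general idea, assuming Monte Carlo replicates of a local scan statistic are available for a single candidate cluster centre: fit a Gumbel distribution to the replicates and read off a locally adjusted p-value. The replicate values below are synthetic stand-ins, and this is not the authors' exact estimator.

    ```python
    import numpy as np
    from scipy.stats import gumbel_r

    def gumbel_local_pvalue(observed_stat, null_stats):
        """
        Fit a Gumbel distribution to Monte Carlo replicates of the *local*
        maximum scan statistic for one location, then evaluate the observed
        statistic against the fitted tail.
        """
        loc, scale = gumbel_r.fit(null_stats)
        return gumbel_r.sf(observed_stat, loc=loc, scale=scale)

    # Example: 999 simulated local maxima under the null for one candidate centre
    rng = np.random.default_rng(1)
    null_local_maxima = rng.gumbel(loc=4.0, scale=1.2, size=999)  # stand-in replicates
    p_local = gumbel_local_pvalue(observed_stat=9.5, null_stats=null_local_maxima)
    print(f"locally adjusted p-value = {p_local:.4f}")
    ```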

  2. Local multiplicity adjustment for the spatial scan statistic using the Gumbel distribution

    PubMed Central

    Gangnon, Ronald E.

    2011-01-01

    Summary The spatial scan statistic is an important and widely used tool for cluster detection. It is based on the simultaneous evaluation of the statistical significance of the maximum likelihood ratio test statistic over a large collection of potential clusters. In most cluster detection problems, there is variation in the extent of local multiplicity across the study region. For example, using a fixed maximum geographic radius for clusters, urban areas typically have many overlapping potential clusters, while rural areas have relatively few. The spatial scan statistic does not account for local multiplicity variation. We describe a previously proposed local multiplicity adjustment based on a nested Bonferroni correction and propose a novel adjustment based on a Gumbel distribution approximation to the distribution of a local scan statistic. We compare the performance of all three statistics in terms of power and a novel unbiased cluster detection criterion. These methods are then applied to the well-known New York leukemia dataset and a Wisconsin breast cancer incidence dataset. PMID:21762118

  3. Serum cytokine levels related to multiple dimensions of fatigue in patients with primary Sjögren's syndrome

    PubMed Central

    Hartkamp, A; Geenen, R; Bijl, M; Kruize, A; Godaert, G; Derksen, R

    2004-01-01

    Methods: Sixty female patients with pSS filled out a questionnaire to assess multiple dimensions of fatigue. Scores were compared with values in a population-based control group (n = 139). Levels of interleukin (IL)1β, IL2, IL6, IL10, and tumour necrosis factor α were measured in serum with commercial sandwich ELISAs. The relationship between self-reported dimensions of fatigue and these serum cytokine levels was determined. Results: Patients with pSS had high scores on all dimensions of fatigue (p<0.001): general fatigue, physical fatigue, reduced activity, reduced motivation, and mental fatigue. Fatigue levels were not related to serum cytokine levels. The incidental finding that reduced motivation was higher in patients with detectable serum levels of IL10 (p = 0.04) disappeared after correction for multiple testing. Conclusion: Fatigue is prominent in patients with pSS and involves all dimensions of fatigue. The findings do not suggest a widespread effect of circulating cytokines on multiple aspects of fatigue. PMID:15361396

  4. Intensity response function of the photopic negative response (PhNR): effect of age and test-retest reliability.

    PubMed

    Joshi, Nabin R; Ly, Emma; Viswanathan, Suresh

    2017-08-01

    To assess the effect of age and test-retest reliability of the intensity response function of the full-field photopic negative response (PhNR) in normal healthy human subjects. Full-field electroretinograms (ERGs) were recorded from one eye of 45 subjects, and 39 of these subjects were tested on two separate days with a Diagnosys Espion System (Lowell, MA, USA). The visual stimuli consisted of brief (<5 ms) red flashes ranging from 0.00625 to 6.4 phot cd·s/m², delivered on a constant 7 cd/m² blue background. PhNR amplitude was measured at its trough from baseline (BT) and from the preceding b-wave peak (PT), and b-wave amplitude was measured at its peak from the preceding a-wave trough or baseline if the a-wave was not present. The intensity response data of all three ERG measures were fitted with a generalized Naka-Rushton function to derive the saturated amplitude (Vmax), semisaturation constant (K) and slope (n) parameters. Effect of age on the fit parameters was assessed with linear regression, and test-retest reliability was assessed with the Wilcoxon signed-rank test and Bland-Altman analysis. Holm's correction was applied to account for multiple comparisons. Vmax of BT was significantly smaller than that of PT and b-wave, and the Vmax of PT and b-wave were not significantly different from each other. The slope parameter n was smallest for BT and largest for the b-wave, and the differences between the slopes of all three measures were statistically significant. Small differences observed in the mean values of K for the different measures did not reach statistical significance. The Wilcoxon signed-rank test indicated no significant differences between the two test visits for any of the Naka-Rushton parameters for the three ERG measures, and the Bland-Altman plots indicated that the mean difference between test and retest measurements of the different fit parameters was close to zero and within 6% of the average of the test and retest values of the respective parameters for all three ERG measurements, indicating minimal bias. While the coefficient of reliability (COR, defined as 1.96 times the standard deviation of the test and retest difference) of each fit parameter was more or less comparable across the three ERG measurements, the %COR (COR normalized to the mean test and retest measures) was generally larger for BT compared to both PT and b-wave for each fit parameter. The Naka-Rushton fit parameters did not show statistically significant changes with age for any of the ERG measures when corrections were applied for multiple comparisons. However, the Vmax of BT demonstrated a weak correlation with age prior to correction for multiple comparisons, and the effect of age on this parameter showed greater significance when the measure was expressed as a ratio of the Vmax of the b-wave from the same subject. The Vmax of the BT amplitude measure of the PhNR was at best weakly correlated with age. None of the other parameters of the Naka-Rushton fit to the intensity response data of either the PhNR or the b-wave showed any systematic changes with age. The test-retest reliability of the fit parameters for PhNR BT amplitude measurements appears to be lower than that of the PhNR PT and b-wave amplitude measurements.
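
    A small sketch of fitting the generalized Naka-Rushton function V(I) = Vmax·I^n / (I^n + K^n) to intensity-response data with SciPy; the intensities match the stimulus range quoted above, but the amplitude values and starting guesses are made up for illustration.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def naka_rushton(intensity, v_max, k, n):
        """Generalized Naka-Rushton function: V = Vmax * I^n / (I^n + K^n)."""
        return v_max * intensity**n / (intensity**n + k**n)

    # Illustrative flash intensities (phot cd·s/m²) and PhNR amplitudes (µV);
    # the amplitude numbers are invented for the sketch, not taken from the study.
    intensities = np.array([0.00625, 0.025, 0.1, 0.4, 1.6, 6.4])
    amplitudes = np.array([3.0, 8.0, 15.0, 22.0, 26.0, 27.0])

    params, _ = curve_fit(naka_rushton, intensities, amplitudes,
                          p0=[30.0, 0.1, 1.0], maxfev=10000)
    v_max, k, n = params
    print(f"Vmax={v_max:.1f} µV, K={k:.3f} cd·s/m², n={n:.2f}")
    ```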

  5. Facial emotion recognition is inversely correlated with tremor severity in essential tremor.

    PubMed

    Auzou, Nicolas; Foubert-Samier, Alexandra; Dupouy, Sandrine; Meissner, Wassilios G

    2014-04-01

    We here assess limbic and orbitofrontal control in 20 patients with essential tremor (ET) and 18 age-matched healthy controls using the Ekman Facial Emotion Recognition Task and the IOWA Gambling Task. Our results show an inverse relation between facial emotion recognition and tremor severity. ET patients also showed worse performance in joy and fear recognition, as well as subtle abnormalities in risk detection, but these differences did not reach significance after correction for multiple testing.

  6. Data Analysis Techniques for Physical Scientists

    NASA Astrophysics Data System (ADS)

    Pruneau, Claude A.

    2017-10-01

    Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.

  7. Implementation and Initial Testing of Advanced Processing and Analysis Algorithms for Correlated Neutron Counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santi, Peter Angelo; Cutler, Theresa Elizabeth; Favalli, Andrea

    In order to improve the accuracy and capabilities of neutron multiplicity counting, additional quantifiable information is needed in order to address the assumptions that are present in the point model. Extracting and utilizing higher order moments (Quads and Pents) from the neutron pulse train represents the most direct way of extracting additional information from the measurement data to allow for an improved determination of the physical properties of the item of interest. The extraction of higher order moments from a neutron pulse train required the development of advanced dead time correction algorithms which could correct for dead time effects in all of the measurement moments in a self-consistent manner. In addition, advanced analysis algorithms have been developed to address specific assumptions that are made within the current analysis model, namely that all neutrons are created at a single point within the item of interest, and that all neutrons that are produced within an item are created with the same energy distribution. This report will discuss the current status of implementation and initial testing of the advanced dead time correction and analysis algorithms that have been developed in an attempt to utilize higher order moments to improve the capabilities of correlated neutron measurement techniques.
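
    As a simple illustration of the moment-based quantities mentioned above, the sketch below computes reduced factorial moments (singles through pents) from a measured multiplicity histogram; the histogram values are invented, and the dead-time correction that is the subject of the report is not included.

    ```python
    import numpy as np
    from scipy.special import comb

    def reduced_factorial_moments(multiplicity_histogram, max_order=5):
        """
        Reduced factorial moments m_k = sum_n C(n, k) * P(n) of a neutron
        multiplicity distribution, where P(n) is the probability of registering
        n counts in a gate. Orders 1..5 correspond to singles, doubles, triples,
        quads, and pents. Dead-time corrections are not included in this sketch.
        """
        counts = np.asarray(multiplicity_histogram, dtype=float)
        p = counts / counts.sum()                   # P(n), n = 0, 1, 2, ...
        n = np.arange(p.size)
        return [float(np.sum(comb(n, k) * p)) for k in range(1, max_order + 1)]

    # Example: a made-up histogram of gate multiplicities 0..8
    hist = [5200, 2100, 830, 260, 70, 18, 4, 1, 0]
    singles, doubles, triples, quads, pents = reduced_factorial_moments(hist)
    ```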

  8. Evaluation of Second-Level Inference in fMRI Analysis

    PubMed Central

    Roels, Sanne P.; Loeys, Tom; Moerkerke, Beatrijs

    2016-01-01

    We investigate the impact of decisions in the second-level (i.e., over subjects) inferential process in functional magnetic resonance imaging on (1) the balance between false positives and false negatives and on (2) the data-analytical stability, both proxies for the reproducibility of results. Second-level analysis based on a mass univariate approach typically consists of 3 phases. First, one proceeds via a general linear model for a test image that consists of pooled information from different subjects. We evaluate models that take into account first-level (within-subjects) variability and models that do not take into account this variability. Second, one proceeds via inference based on parametrical assumptions or via permutation-based inference. Third, we evaluate 3 commonly used procedures to address the multiple testing problem: familywise error rate correction, False Discovery Rate (FDR) correction, and a two-step procedure with minimal cluster size. Based on a simulation study and real data we find that the two-step procedure with minimal cluster size results in most stable results, followed by the familywise error rate correction. The FDR results in most variable results, for both permutation-based inference and parametrical inference. Modeling the subject-specific variability yields a better balance between false positives and false negatives when using parametric inference. PMID:26819578
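
    One of the inference choices discussed above, permutation-based familywise error control, can be sketched with a sign-flipping max-statistic procedure for a one-sample second-level design; this is a generic illustration, not the specific pipelines evaluated in the paper.

    ```python
    import numpy as np

    def permutation_fwe_threshold(data, n_perm=1000, alpha=0.05, seed=0):
        """
        Sign-flipping permutation test with the max-statistic method for
        familywise error control in a one-sample second-level analysis.

        data : array (n_subjects, n_voxels) of first-level contrast estimates.
        Returns the FWE-corrected threshold on the one-sample t statistic.
        """
        rng = np.random.default_rng(seed)
        n_sub = data.shape[0]

        def one_sample_t(x):
            return x.mean(0) / (x.std(0, ddof=1) / np.sqrt(n_sub))

        max_t = np.empty(n_perm)
        for i in range(n_perm):
            signs = rng.choice([-1.0, 1.0], size=(n_sub, 1))   # flip subject signs
            max_t[i] = one_sample_t(signs * data).max()
        return np.quantile(max_t, 1.0 - alpha)
    ```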

  9. Method for measuring multiple scattering corrections between liquid scintillators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verbeke, J. M.; Glenn, A. M.; Keefer, G. J.

    2016-04-11

    In this study, a time-of-flight method is proposed to experimentally quantify the fractions of neutrons scattering between scintillators. An array of scintillators is characterized in terms of crosstalk with this method by measuring a californium source, for different neutron energy thresholds. The spectral information recorded by the scintillators can be used to estimate the fractions of neutrons multiple scattering. With the help of a correction to Feynman's point model theory to account for multiple scattering, these fractions can in turn improve the mass reconstruction of fissile materials under investigation.

  10. The multiple hop test: a discriminative or evaluative instrument for chronic ankle instability?

    PubMed

    Eechaute, Christophe; Bautmans, Ivan; De Hertogh, Willem; Vaes, Peter

    2012-05-01

    To determine whether the multiple hop test should be used as an evaluative or a discriminative instrument for chronic ankle instability (CAI). Blinded case-control study. University research laboratory. Twenty-nine healthy subjects (21 men, 8 women, mean age 21.8 years) and 29 patients with CAI (17 men, 12 women, mean age 24.9 years) were selected. Subjects performed a multiple hop test and hopped on 10 different tape markers while trying to avoid any postural correction. Minimal detectable changes (MDC) of the number of balance errors, the time value, and the visual analog scale (VAS) score (perceived difficulty) were calculated as evaluative measures. For the discriminative properties, a receiver operating characteristic curve was determined and the area under the curve (AUC), the sensitivity, specificity, diagnostic accuracy (DA), and likelihood ratios (LR) were calculated according to whether 1, 2, or 3 outcomes were positive. Based on their MDC, outcomes should, respectively, change by more than 7 errors (41%), 6 seconds (15%), and 27 mm (55%, VAS score) before a change can be considered real. Areas under the curve were, respectively, 79% (errors), 77% (time value), and 65% (VAS score). The optimal cutoff points were, respectively, 13.5 errors, 35 seconds, and 32.5 mm. When 2 of 3 outcomes were positive, the sensitivity was 86%, the specificity was 79%, the DA was 83%, the positive LR was 4.2, and the negative LR was 0.17. The multiple hop test seems to be more a discriminative instrument for CAI, and its responsiveness needs to be demonstrated.
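
    The discriminative summary statistics reported above can be reproduced approximately from a hypothetical 2x2 table; the counts below were chosen to be roughly consistent with the reported 86% sensitivity and 79% specificity (the actual table is not given in the abstract).

    ```python
    def diagnostic_summary(tp, fn, tn, fp):
        """Sensitivity, specificity, accuracy, and likelihood ratios from a 2x2 table."""
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        accuracy = (tp + tn) / (tp + fn + tn + fp)
        lr_pos = sens / (1.0 - spec)          # positive likelihood ratio
        lr_neg = (1.0 - sens) / spec          # negative likelihood ratio
        return sens, spec, accuracy, lr_pos, lr_neg

    # Hypothetical counts for "2 of 3 outcomes positive" (29 patients, 29 controls)
    print(diagnostic_summary(tp=25, fn=4, tn=23, fp=6))
    ```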

  11. Time-Series Analysis: Assessing the Effects of Multiple Educational Interventions in a Small-Enrollment Course

    NASA Astrophysics Data System (ADS)

    Warren, Aaron R.

    2009-11-01

    Time-series designs are an alternative to pretest-posttest methods that are able to identify and measure the impacts of multiple educational interventions, even for small student populations. Here, we use an instrument employing standard multiple-choice conceptual questions to collect data from students at regular intervals. The questions are modified by asking students to distribute 100 Confidence Points among the options in order to indicate the perceived likelihood of each answer option being the correct one. Tracking the class-averaged ratings for each option produces a set of time-series. ARIMA (autoregressive integrated moving average) analysis is then used to test for, and measure, changes in each series. In particular, it is possible to discern which educational interventions produce significant changes in class performance. Cluster analysis can also identify groups of students whose ratings evolve in similar ways. A brief overview of our methods and an example are presented.
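
    A minimal sketch of the kind of intervention analysis described above, assuming statsmodels: an ARIMA model with a step-function exogenous regressor whose coefficient estimates the level change at the intervention. The series, the intervention timing, and the effect size are all simulated for illustration.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    # Synthetic class-average confidence ratings (one value per assessment occasion)
    rng = np.random.default_rng(2)
    n_obs = 24
    intervention_at = 12                     # occasion at which the intervention occurs
    step = (np.arange(n_obs) >= intervention_at).astype(float)
    series = 40 + 10 * step + rng.normal(0, 3, n_obs)   # simulated +10-point shift

    # ARIMA(1,0,0) with a step intervention regressor: the exogenous coefficient
    # estimates the change in level attributable to the intervention.
    model = ARIMA(pd.Series(series),
                  exog=pd.DataFrame({"intervention": step}),
                  order=(1, 0, 0))
    result = model.fit()
    print(result.params["intervention"], result.pvalues["intervention"])
    ```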

  12. Unilateral spatial neglect in the acute phase of ischemic stroke can predict long-term disability and functional capacity.

    PubMed

    Luvizutto, Gustavo José; Moliga, Augusta Fabiana; Rizzatti, Gabriela Rizzo Soares; Fogaroli, Marcelo Ortolani; Moura Neto, Eduardo de; Nunes, Hélio Rubens de Carvalho; Resende, Luiz Antônio de Lima; Bazan, Rodrigo

    2018-05-21

    The aim of this study was to assess the relationship between the degree of unilateral spatial neglect during the acute phase of stroke and long-term functional independence. This was a prospective study of right ischemic stroke patients in which the independent variable was the degree of spatial neglect and the outcome that was measured was functional independence. The potential confounding factors included sex, age, stroke severity, topography of the lesion, risk factors, glycemia and the treatment received. Unilateral spatial neglect was measured using the line cancellation test, the star cancellation test and the line bisection test within 48 hours of the onset of symptoms. Functional independence was measured using the modified Rankin and Barthel scales at 90 days after discharge. The relationship between unilateral spatial neglect and functional independence was analyzed using multiple logistic regression that was corrected for confounding factors. We studied 60 patients with a median age of 68 (34-89) years, 52% of whom were male and 74% of whom were Caucasian. The risk for moderate to severe disability increased with increasing star cancellation test scores (OR=1.14 [1.03-1.26], p=0.01) corrected for the stroke severity, which was a confounding factor that had a statistically positive association with disability (OR=1.63 [1.13-2.65], p=0.01). The best chance of functional independence decreased with increasing star cancellation test scores (OR=0.86 [0.78-0.96], p=0.006) corrected for the stroke severity, which was a confounding factor that had a statistically negative association with independence (OR=0.66 [0.48-0.92], p=0.017). The severity of unilateral spatial neglect in acute stroke worsens the degree of long-term disability and functional independence.

  13. Improved nine-node shell element MITC9i with reduced distortion sensitivity

    NASA Astrophysics Data System (ADS)

    Wisniewski, K.; Turska, E.

    2017-11-01

    The 9-node quadrilateral shell element MITC9i is developed for the Reissner-Mindlin shell kinematics, the extended potential energy and Green strain. The following features of its formulation ensure an improved behavior: 1. The MITC technique is used to avoid locking, and we propose improved transformations for bending and transverse shear strains, which render that all patch tests are passed for the regular mesh, i.e. with straight element sides and middle positions of midside nodes and a central node. 2. To reduce shape distortion effects, the so-called corrected shape functions of Celia and Gray (Int J Numer Meth Eng 20:1447-1459, 1984) are extended to shells and used instead of the standard ones. In effect, all patch tests are passed additionally for shifts of the midside nodes along straight element sides and for arbitrary shifts of the central node. 3. Several extensions of the corrected shape functions are proposed to enable computations of non-flat shells. In particular, a criterion is put forward to determine the shift parameters associated with the central node for non-flat elements. Additionally, the method is presented to construct a parabolic side for a shifted midside node, which improves accuracy for symmetric curved edges. Drilling rotations are included by using the drilling Rotation Constraint equation, in a way consistent with the additive/multiplicative rotation update scheme for large rotations. We show that the corrected shape functions reduce the sensitivity of the solution to the regularization parameter γ of the penalty method for this constraint. The MITC9i shell element is subjected to a range of linear and non-linear tests to show passing the patch tests, the absence of locking, very good accuracy and insensitivity to node shifts. It favorably compares to several other tested 9-node elements.

  14. Prenatal detection of fetal triploidy from cell-free DNA testing in maternal blood.

    PubMed

    Nicolaides, Kypros H; Syngelaki, Argyro; del Mar Gil, Maria; Quezada, Maria Soledad; Zinevich, Yana

    2014-01-01

    To investigate potential performance of cell-free DNA (cfDNA) testing in maternal blood in detecting fetal triploidy. Plasma and buffy coat samples obtained at 11-13 weeks' gestation from singleton pregnancies with diandric triploidy (n=4), digynic triploidy (n=4), euploid fetuses (n=48) were sent to Natera, Inc. (San Carlos, Calif., USA) for cfDNA testing. Multiplex polymerase chain reaction amplification of cfDNA followed by sequencing of single nucleotide polymorphic loci covering chromosomes 13, 18, 21, X, and Y was performed. Sequencing data were analyzed using the NATUS algorithm which identifies copy number for each of the five chromosomes. cfDNA testing provided a result in 44 (91.7%) of the 48 euploid cases and correctly predicted the fetal sex and the presence of two copies each of chromosome 21, 18 and 13. In diandric triploidy, cfDNA testing identified multiple paternal haplotypes (indicating fetal trisomy 21, trisomy 18 and trisomy 13) suggesting the presence of either triploidy or dizygotic twins. In digynic triploidy the fetal fraction corrected for maternal weight and gestational age was below the 0.5th percentile. cfDNA testing by targeted sequencing and allelic ratio analysis of single nucleotide polymorphisms covering chromosomes 21, 18, 13, X, and Y can detect diandric triploidy and raise the suspicion of digynic triploidy. © 2013 S. Karger AG, Basel.

  15. Maritime Adaptive Optics Beam Control

    DTIC Science & Technology

    2010-09-01

    Acronym list fragment: Liquid Crystal; LMS, Least Mean Square; MIMO, Multiple-Input Multiple-Output; MMDM, Micromachined Membrane Deformable Mirror; MSE, Mean Square Error. Abstract fragments: "...determine how the beam is distorted, a control computer to calculate the correction to be applied, and a corrective element, usually a deformable mirror..."; "...during this research, an overview of the system modification is provided here. Using additional mirrors and reflecting the beam to and from an..."

  16. Preliminary results after upper cervical chiropractic care in patients with chronic cerebro-spinal venous insufficiency and multiple sclerosis.

    PubMed

    Mandolesi, Sandro; Marceca, Giuseppe; Moser, Jon; Niglio, Tarcisio; d'Alessandro, Aldo; Ciccone, Matteo Marco; Zito, Annapaola; Mandolesi, Dimitri; d'Alessandro, Alessandro; Fedele, Francesco

    2015-01-01

    The aim of the study is to evaluate the clinical and X-ray results of Upper Cervical Chiropractic care through specific adjustments (corrections) of C1-C2 in patients with chronic cerebro-spinal venous insufficiency (CCSVI) and multiple sclerosis (MS). We studied a sample of 77 patients before and after Upper Cervical Chiropractic care, and we analyzed: A) the change in the X-ray parameters; B) the clinical results using a new set of questions. The protocol of the C1-C2 Upper Cervical Chiropractic treatment, specific for these patients, lasts four months. From a haemodynamic point of view we divided the patients into 3 types: Type 1 - purely vascular with intravenous alterations; Type 2 - "mechanical" with external venous compressions; Type 3 - mixed. We found an improvement in all kinds of subluxations after the treatment with respect to the pre-treatment X-ray evaluation, with a statistically significant difference. The differences between the clinical symptoms before and after the specific treatment of C1-C2 are statistically significant with p<0.001 according to the chi-square test with Yates' correction. The preliminary X-ray and clinical improvements of the Upper Cervical Chiropractic corrections on C1-C2 in these patients with CCSVI and MS encourage us to continue with our studies. We believe that the Upper Cervical correction on C1-C2 could be the main non-invasive treatment of the CCSVI mechanical type in patients with MS. Further studies are required to evaluate the correlation between the Upper Cervical Chiropractic correction on C1-C2 and cerebral venous drainage and cerebro-spinal fluid flow.

  17. [Is ultrasound equal to X-ray in pediatric fracture diagnosis?].

    PubMed

    Moritz, J D; Hoffmann, B; Meuser, S H; Sehr, D H; Caliebe, A; Heller, M

    2010-08-01

    Ultrasound is currently not established for the diagnosis of fractures. The aim of this study was to compare ultrasound and X-ray beyond their use solely for the identification of fractures, i. e., for the detection of fracture type and dislocation for pediatric fracture diagnosis. Limb bones of dead young pigs served as a model for pediatric bones. The fractured bones were examined with ultrasound, X-ray, and CT, which served as the gold standard. 162 of 248 bones were fractured. 130 fractures were identified using ultrasound, and 148 using X-ray. There were some advantages of X-ray over ultrasound in the detection of fracture type (80 correct results using X-ray, 66 correct results using ultrasound). Ultrasound, however, was superior to X-ray for dislocation identification (41 correct results using X-ray, 51 correct results using ultrasound). Both findings were not statistically significant after adjustment for multiple testing. Ultrasound not only has comparable sensitivity to that of X-ray for the identification of limb fractures but is also equally effective for the diagnosis of fracture type and dislocation. Thus, ultrasound can be used as an adequate alternative method to X-ray for pediatric fracture diagnosis. Georg Thieme Verlag KG Stuttgart, New York.

  18. Simple aerosol correction technique based on the spectral relationships of the aerosol multiple-scattering reflectances for atmospheric correction over the oceans.

    PubMed

    Ahn, Jae-Hyun; Park, Young-Je; Kim, Wonkook; Lee, Boram

    2016-12-26

    An estimation of the aerosol multiple-scattering reflectance is an important part of the atmospheric correction procedure in satellite ocean color data processing. Most commonly, the utilization of two near-infrared (NIR) bands to estimate the aerosol optical properties has been adopted for the estimation of the effects of aerosols. Previously, the operational Geostationary Ocean Color Imager (GOCI) atmospheric correction scheme relied on a single-scattering reflectance ratio (SSE), which was developed for the processing of the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) data to determine the appropriate aerosol models and their aerosol optical thicknesses. The scheme computes reflectance contributions (weighting factors) of candidate aerosol models in the single-scattering domain and then spectrally extrapolates the single-scattering aerosol reflectance from NIR to visible (VIS) bands using the SSE. However, it directly applies the weighting factor to all wavelengths in the multiple-scattering domain, even though the multiple-scattering aerosol reflectance is non-linearly related to the single-scattering reflectance and the inter-band relationship of multiple-scattering aerosol reflectances is itself non-linear. To avoid these issues, we propose an alternative scheme for estimating the aerosol reflectance that uses the spectral relationships in the aerosol multiple-scattering reflectance between different wavelengths (called SRAMS). The process directly calculates the multiple-scattering reflectance contributions in the NIR with no residual errors for selected aerosol models. Then it spectrally extrapolates the reflectance contribution from NIR to visible bands for each selected model using the SRAMS. To assess the performance of the algorithm regarding the errors in the water reflectance at the surface or remote-sensing reflectance retrieval, we compared the SRAMS atmospheric correction results with the SSE atmospheric correction using both simulations and in situ match-ups with the GOCI data. From simulations, the mean errors for bands from 412 to 555 nm were 5.2% for the SRAMS scheme and 11.5% for the SSE scheme in case-I waters. From in situ match-ups, the mean errors were 16.5% for the SRAMS scheme and 17.6% for the SSE scheme in both case-I and case-II waters. Although we applied the SRAMS algorithm to the GOCI, it can be applied to other ocean color sensors which have two NIR wavelengths.

  19. Improved multidimensional semiclassical tunneling theory.

    PubMed

    Wagner, Albert F

    2013-12-12

    We show that the analytic multidimensional semiclassical tunneling formula of Miller et al. [Miller, W. H.; Hernandez, R.; Handy, N. C.; Jayatilaka, D.; Willets, A. Chem. Phys. Lett. 1990, 172, 62] is qualitatively incorrect for deep tunneling at energies well below the top of the barrier. The origin of this deficiency is that the formula uses an effective barrier weakly related to the true energetics but correctly adjusted to reproduce the harmonic description and anharmonic corrections of the reaction path at the saddle point as determined by second order vibrational perturbation theory. We present an analytic improved semiclassical formula that correctly includes energetic information and allows a qualitatively correct representation of deep tunneling. This is done by constructing a three segment composite Eckart potential that is continuous everywhere in both value and derivative. This composite potential has an analytic barrier penetration integral from which the semiclassical action can be derived and then used to define the semiclassical tunneling probability. The middle segment of the composite potential by itself is superior to the original formula of Miller et al. because it incorporates the asymmetry of the reaction barrier produced by the known reaction exoergicity. Comparison of the semiclassical and exact quantum tunneling probability for the pure Eckart potential suggests a simple threshold multiplicative factor to the improved formula to account for quantum effects very near threshold not represented by semiclassical theory. The deep tunneling limitations of the original formula are echoed in semiclassical high-energy descriptions of bound vibrational states perpendicular to the reaction path at the saddle point. However, typically ab initio energetic information is not available to correct it. The Supporting Information contains a Fortran code, test input, and test output that implements the improved semiclassical tunneling formula.
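
    For orientation, the sketch below evaluates a plain semiclassical tunneling probability P(E) = 1/(1 + e^(2θ(E))) from a numerical barrier penetration integral θ(E) for a symmetric Eckart barrier in reduced units; it illustrates the barrier-penetration construction only and is not the improved composite-Eckart formula developed in the paper.

    ```python
    import numpy as np
    from scipy.integrate import quad

    HBAR = 1.0   # reduced units
    MU = 1.0     # reduced mass

    def eckart_barrier(x, v0=0.02, a=1.0):
        """Symmetric Eckart barrier V(x) = V0 / cosh^2(x/a) (reduced units)."""
        return v0 / np.cosh(x / a) ** 2

    def semiclassical_transmission(energy, v0=0.02, a=1.0):
        """
        Semiclassical tunneling probability P(E) = 1 / (1 + exp(2*theta)), with the
        barrier penetration integral
        theta = (1/hbar) * integral of sqrt(2*mu*(V(x) - E)) dx between turning points.
        """
        if energy >= v0:
            # At/above the barrier top this below-barrier formula saturates at 1/2;
            # the full treatment analytically continues theta above the barrier.
            return 0.5
        x_turn = a * np.arccosh(np.sqrt(v0 / energy))        # classical turning point
        integrand = lambda x: np.sqrt(2.0 * MU * (eckart_barrier(x, v0, a) - energy))
        theta, _ = quad(integrand, -x_turn, x_turn)
        return 1.0 / (1.0 + np.exp(2.0 * theta / HBAR))

    for e in (0.005, 0.01, 0.015, 0.019):
        print(e, semiclassical_transmission(e))
    ```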

  20. Knowledge of Case Workers and Correctional Officers towards HIV and HCV Infections: Opportunity for Public Health Education in the Correctional System.

    PubMed

    Pérez, Cynthia M; del Carmen Santos, María; Torres, Aurinés; Grana, Carlos; Albizu-García, Carmen

    2015-09-01

    Given the heavy burden of hepatitis C virus (HCV) and human immunodeficiency virus (HIV) infections in correctional facilities, we examined knowledge about these infections among case workers and correctional officers in penal institutions in Puerto Rico. We used data from a cross-sectional study of state prisons, commissioned by the Puerto Rico Department of Correction and Rehabilitation, to assess knowledge about HCV and HIV (10 items each) among 256 case workers and correctional officers from 18 penal institutions selected in the prison system. Total scores for each scale ranged from 0 to 10 points, with higher scores reflecting more knowledge. Of 256 participants, 64.8% were males, 39.6% were aged 30-39 years, and 70.3% were case workers. The percentage of correct responses for knowledge items ranged from 8.5% to 97.0% for HCV infection and from 38.7% to 99.6% for HIV infection. The vast majority (>96%) of participants knew that injection drug users should be tested for HCV infection and that sharing of needle injection equipment and multiple sex partners increase the risk of HIV infection. However, misconceptions about routes of transmission for these viral infections were found, with larger gaps in knowledge for HCV infection. Mean knowledge scores for HCV and HIV infections were 4.20±0.17 and 6.95±0.22, respectively, being significantly (p<0.05) higher for case workers. The findings about HCV and HIV knowledge in an important segment of the correctional system staff support the urgent need for increasing educational opportunities for correctional staff.

  1. Estimation of Sensory Analysis Cupping Test Arabica Coffee Using NIR Spectroscopy

    NASA Astrophysics Data System (ADS)

    Safrizal; Sutrisno; Lilik, P. E. N.; Ahmad, U.; Samsudin

    2018-05-01

    Flavor has become the most important coffee quality parameter: many coffee-consuming countries require certain taste scores for the coffee they order, and the cupping appraisal method currently in use is the one designed by the Specialty Coffee Association of America (SCAA). Several previous studies have found that Near-Infrared Spectroscopy (NIRS) can be used to detect the chemical composition of certain materials, including compounds associated with flavor, so it should also be applicable to coffee powder. The aim of this research is to correlate the NIRS spectrum with cupping scores assigned by a tester and then assess the possibility of sensory testing of coffee taste using the NIRS spectrum. The coffee samples were taken from various places, altitudes, and postharvest handling methods, and were prepared following the SCAA protocol; sensory analysis was done in two ways, with the expert tester and with the NIRS test. Calibration between the two showed that, without pretreatment, PLS gave an RMSE of cross-validation of 6.14; using Multiplicative Scatter Correction of the spectra gave an RMSE of cross-validation of 5.43; and the best RMSE of cross-validation, 1.73, was achieved with de-trending correction. NIRS can therefore be used to predict the cupping score.
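
    A hedged sketch of the modelling pipeline implied above, assuming scikit-learn: Multiplicative Scatter Correction of the spectra followed by PLS regression with cross-validated RMSE. The spectra, cupping scores, and number of latent variables are synthetic stand-ins, not the study's data.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict
    from sklearn.metrics import mean_squared_error

    def msc(spectra):
        """Multiplicative Scatter Correction: regress each spectrum on the mean spectrum."""
        reference = spectra.mean(axis=0)
        corrected = np.empty_like(spectra)
        for i, spectrum in enumerate(spectra):
            slope, intercept = np.polyfit(reference, spectrum, deg=1)
            corrected[i] = (spectrum - intercept) / slope
        return corrected

    # Synthetic stand-ins: 60 coffee samples x 700 NIR wavelengths, plus cupping scores
    rng = np.random.default_rng(3)
    spectra = rng.normal(size=(60, 700)).cumsum(axis=1)      # smooth-ish fake spectra
    scores = rng.uniform(75, 90, size=60)                    # fake SCAA cupping scores

    x = msc(spectra)
    pls = PLSRegression(n_components=8)
    predicted = cross_val_predict(pls, x, scores, cv=10)
    rmse_cv = np.sqrt(mean_squared_error(scores, predicted))
    print(f"RMSE of cross-validation: {rmse_cv:.2f}")
    ```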

  2. Detecting and removing multiplicative spatial bias in high-throughput screening technologies.

    PubMed

    Caraus, Iurie; Mazoure, Bogdan; Nadon, Robert; Makarenkov, Vladimir

    2017-10-15

    Considerable attention has been paid recently to improve data quality in high-throughput screening (HTS) and high-content screening (HCS) technologies widely used in drug development and chemical toxicity research. However, several environmentally- and procedurally-induced spatial biases in experimental HTS and HCS screens decrease measurement accuracy, leading to increased numbers of false positives and false negatives in hit selection. Although effective bias correction methods and software have been developed over the past decades, almost all of these tools have been designed to reduce the effect of additive bias only. Here, we address the case of multiplicative spatial bias. We introduce three new statistical methods meant to reduce multiplicative spatial bias in screening technologies. We assess the performance of the methods with synthetic and real data affected by multiplicative spatial bias, including comparisons with current bias correction methods. We also describe a wider data correction protocol that integrates methods for removing both assay and plate-specific spatial biases, which can be either additive or multiplicative. The methods for removing multiplicative spatial bias and the data correction protocol are effective in detecting and cleaning experimental data generated by screening technologies. As our protocol is of a general nature, it can be used by researchers analyzing current or next-generation high-throughput screens. The AssayCorrector program, implemented in R, is available on CRAN. makarenkov.vladimir@uqam.ca. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
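
    The sketch below shows one generic way to remove multiplicative row/column bias from a single plate by median-polishing in log space; it is only illustrative of the problem setting and is not the algorithm implemented in AssayCorrector.

    ```python
    import numpy as np

    def remove_multiplicative_bias(plate, n_iter=10, eps=1e-9):
        """
        Toy multiplicative bias removal for one HTS plate (rows x columns of raw
        measurements): work in log space, alternately remove row and column
        medians (a median-polish), then exponentiate back.
        """
        log_plate = np.log(np.asarray(plate, dtype=float) + eps)
        overall = np.median(log_plate)
        residual = log_plate - overall
        for _ in range(n_iter):
            residual -= np.median(residual, axis=1, keepdims=True)   # row effects
            residual -= np.median(residual, axis=0, keepdims=True)   # column effects
        return np.exp(residual + overall)

    # Example: a 16x24 (384-well) plate with a simulated multiplicative row gradient
    rng = np.random.default_rng(4)
    truth = rng.lognormal(mean=0.0, sigma=0.3, size=(16, 24))
    bias = np.linspace(0.7, 1.4, 16)[:, None]                        # row-wise gain
    corrected = remove_multiplicative_bias(truth * bias)
    ```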

  3. Associations between cerebral amyloid and changes in cognitive function and falls risk in subcortical ischemic vascular cognitive impairment.

    PubMed

    Dao, Elizabeth; Best, John R; Hsiung, Ging-Yuek Robin; Sossi, Vesna; Jacova, Claudia; Tam, Roger; Liu-Ambrose, Teresa

    2017-06-28

    To determine the association between amyloid-beta (Aβ) plaque deposition and changes in global cognition, executive functions, information processing speed, and falls risk over a 12-month period in older adults with a primary clinical diagnosis of subcortical ischemic vascular cognitive impairment (SIVCI). This is a secondary analysis of data acquired from a subset of participants (N = 22) who were enrolled in a randomized controlled trial of aerobic exercise (NCT01027858). The subset of individuals completed an 11C-Pittsburgh compound B (PIB) scan. Cognitive function and falls risk were assessed at baseline, 6 months, and 12 months. Global cognition, executive functions, and information processing speed were measured using: 1) ADAS-Cog; 2) Trail Making Test; 3) Digit Span Test; 4) Stroop Test, and 5) Digit Symbol Substitution Test. Falls risk was measured using the Physiological Profile Assessment. Hierarchical multiple linear regression analyses determined the unique contribution of Aβ on changes in cognitive function and falls risk at 12 months after controlling for experimental group (i.e. aerobic exercise training or usual care control) and baseline performance. To correct for multiple comparisons, we applied the Benjamini-Hochberg procedure to obtain a false discovery rate corrected threshold using alpha = 0.05. Higher PIB retention was significantly associated with greater decrements in set shifting (Trail Making Test, adjusted R² = 35.3%, p = 0.002), attention and conflict resolution (Stroop Test, adjusted R² = 33.4%, p = 0.01), and information processing speed (Digit Symbol Substitution Test, adjusted R² = 24.4%, p = 0.001) over a 12-month period. Additionally, higher PIB retention was significantly associated with increased falls risk (Physiological Profile Assessment, adjusted R² = 49.1%, p = 0.04). PIB retention was not significantly associated with change in ADAS-Cog and Verbal Digit Span Test (p > 0.05). Symptoms associated with SIVCI may be amplified by secondary Aβ pathology. ClinicalTrials.gov, NCT01027858, December 7, 2009.
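
    For reference, a small implementation of the Benjamini-Hochberg step-up procedure used above for false discovery rate control; the example p-values are arbitrary and are not meant to reproduce the study's corrected results.

    ```python
    import numpy as np

    def benjamini_hochberg(p_values, alpha=0.05):
        """
        Benjamini-Hochberg step-up procedure: returns a boolean array marking
        which hypotheses are rejected at false discovery rate alpha.
        """
        p = np.asarray(p_values, dtype=float)
        m = p.size
        order = np.argsort(p)
        thresholds = alpha * (np.arange(1, m + 1) / m)
        below = p[order] <= thresholds
        rejected = np.zeros(m, dtype=bool)
        if below.any():
            k = np.nonzero(below)[0].max()     # largest i with p_(i) <= i*alpha/m
            rejected[order[: k + 1]] = True
        return rejected

    # Example with six arbitrary p-values
    print(benjamini_hochberg([0.001, 0.008, 0.012, 0.041, 0.21, 0.49]))
    ```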

  4. Examination of the Involvement of Cholinergic-Associated Genes in Nicotine Behaviors in European and African Americans.

    PubMed

    Melroy-Greif, Whitney E; Simonson, Matthew A; Corley, Robin P; Lutz, Sharon M; Hokanson, John E; Ehringer, Marissa A

    2017-04-01

    Cigarette smoking is a physiologically harmful habit. Nicotinic acetylcholine receptors (nAChRs) are bound by nicotine and upregulated in response to chronic exposure to nicotine. It is known that upregulation of these receptors is not due to a change in mRNA of these genes, however, more precise details on the process are still uncertain, with several plausible hypotheses describing how nAChRs are upregulated. We have manually curated a set of genes believed to play a role in nicotine-induced nAChR upregulation. Here, we test the hypothesis that these genes are associated with and contribute risk for nicotine dependence (ND) and the number of cigarettes smoked per day (CPD). Studies with genotypic data on European and African Americans (EAs and AAs, respectively) were collected and a gene-based test was run to test for an association between each gene and ND and CPD. Although several novel genes were associated with CPD and ND at P < 0.05 in EAs and AAs, these associations did not survive correction for multiple testing. Previous associations between CHRNA3, CHRNA5, CHRNB4 and CPD in EAs were replicated. Our hypothesis-driven approach avoided many of the limitations inherent in pathway analyses and provided nominal evidence for association between cholinergic-related genes and nicotine behaviors. We evaluated the evidence for association between a manually curated set of genes and nicotine behaviors in European and African Americans. Although no genes were associated after multiple testing correction, this study has several strengths: by manually curating a set of genes we circumvented the limitations inherent in many pathway analyses and tested several genes that had not yet been examined in a human genetic study; gene-based tests are a useful way to test for association with a set of genes; and these genes were collected based on literature review and conversations with experts, highlighting the importance of scientific collaboration. © The Author 2016. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  5. Long-term reproducibility of phantom signal intensities in nonuniformity corrected STIR-MRI examinations of skeletal muscle.

    PubMed

    Viddeleer, Alain R; Sijens, Paul E; van Ooijen, Peter M A; Kuypers, Paul D L; Hovius, Steven E R; Oudkerk, Matthijs

    2009-08-01

    Nerve regeneration could be monitored by comparing MRI image intensities in time, as denervated muscles display increased signal intensity in STIR sequences. In this study long-term reproducibility of STIR image intensity was assessed under clinical conditions and the required image intensity nonuniformity correction was improved by using phantom scans obtained at multiple positions. Three-dimensional image intensity nonuniformity was investigated in phantom scans. Next, over a three-year period, 190 clinical STIR hand scans were obtained using a standardized acquisition protocol, and corrected for intensity nonuniformity by using the results of phantom scanning. The results of correction with 1, 3, and 11 phantom scans were compared. The image intensities in calibration tubes close to the hands were measured every time to determine the reproducibility of our method. With calibration, the reproducibility of STIR image intensity improved from 7.8 to 6.4%. Image intensity nonuniformity correction with 11 phantom scans gave significantly better results than correction with 1 or 3 scans. The image intensities in clinical STIR images acquired at different times can be compared directly, provided that the acquisition protocol is standardized and that nonuniformity correction is applied. Nonuniformity correction is preferably based on multiple phantom scans.
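
    A toy sketch, assuming NumPy/SciPy, of the general phantom-based correction idea: average multiple phantom scans into a smoothed sensitivity map, divide the clinical image by it, and normalize to a calibration-tube region. All array sizes, smoothing settings, and the ROI are hypothetical and not the authors' protocol.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def nonuniformity_correct(image, phantom_scans, calibration_roi, sigma=10.0):
        """
        Average several phantom scans, smooth them into a sensitivity map,
        divide the clinical image by the normalized map, and express the result
        relative to the mean intensity in a calibration-tube ROI.
        """
        sensitivity = gaussian_filter(np.mean(phantom_scans, axis=0), sigma)
        sensitivity /= sensitivity.mean()                    # unit-mean gain map
        corrected = image / np.clip(sensitivity, 1e-6, None)
        calibration = corrected[calibration_roi].mean()      # e.g., a tube near the hand
        return corrected / calibration

    # Example with synthetic data: 11 phantom scans and one "clinical" image
    rng = np.random.default_rng(6)
    phantoms = rng.normal(100, 2, size=(11, 256, 256))
    clinical = rng.normal(50, 5, size=(256, 256))
    roi = (slice(10, 30), slice(10, 30))                     # stand-in calibration ROI
    out = nonuniformity_correct(clinical, phantoms, roi)
    ```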

  6. Many-body effects and ultraviolet renormalization in three-dimensional Dirac materials

    NASA Astrophysics Data System (ADS)

    Throckmorton, Robert E.; Hofmann, Johannes; Barnes, Edwin; Das Sarma, S.

    2015-09-01

    We develop a theory for electron-electron interaction-induced many-body effects in three-dimensional Weyl or Dirac semimetals, including interaction corrections to the polarizability, electron self-energy, and vertex function, up to second order in the effective fine-structure constant of the Dirac material. These results are used to derive the higher-order ultraviolet renormalization of the Fermi velocity, effective coupling, and quasiparticle residue, revealing that the corrections to the renormalization group flows of both the velocity and coupling counteract the leading-order tendencies of velocity enhancement and coupling suppression at low energies. This in turn leads to the emergence of a critical coupling above which the interaction strength grows with decreasing energy scale. In addition, we identify a range of coupling strengths below the critical point in which the Fermi velocity varies nonmonotonically as the low-energy, noninteracting fixed point is approached. Furthermore, we find that while the higher-order correction to the flow of the coupling is generally small compared to the leading order, the corresponding correction to the velocity flow carries an additional factor of the Dirac cone flavor number (the multiplicity of electron species, e.g. ground-state valley degeneracy arising from the band structure) relative to the leading-order result. Thus, for materials with a larger multiplicity, the regime of velocity nonmonotonicity is reached for modest values of the coupling strength. This is in stark contrast to an approach based on a large-N expansion or the random phase approximation (RPA), where higher-order corrections are strongly suppressed for larger values of the Dirac cone multiplicity. This suggests that perturbation theory in the coupling constant (i.e., the loop expansion) and the RPA/large-N expansion are complementary in the sense that they are applicable in different parameter regimes of the theory. We show how our results for the ultraviolet renormalization of quasiparticle properties can be tested experimentally through measurements of quantities such as the optical conductivity or dielectric function (with carrier density or temperature acting as the scale being varied to induce the running coupling). Although experiments typically access the finite-density regime, we show that our zero-density results still capture clear many-body signatures that should be visible at higher temperatures even in real systems with disorder and finite doping.

  7. Immediate Feedback Assessment Technique in a Chemistry Classroom

    NASA Astrophysics Data System (ADS)

    Taylor, Kate R.

    The Immediate Feedback Assessment Technique, or IFAT, is a new testing system that turns a student's traditional multiple-choice testing into a chance for hands-on learning and provides teachers with an opportunity to obtain more information about a student's knowledge during testing. In the current study we wanted to know whether, when students are given the second chance afforded by the IFAT system, they are guessing or using prior knowledge when making their second-chance choice. Additionally, while there has been some adoption of this testing system in non-science disciplines, we wanted to study whether the IFAT system would be well-received among faculty in the sciences, more specifically chemistry faculty. By comparing the students' rate of success on the second chance afforded by the IFAT system with the statistical likelihood of guessing correctly, statistical analysis was used to determine whether we observed enough students earning the second-chance points to reject the likelihood that students were randomly guessing. Our data analysis revealed that it is statistically highly unlikely that students were only guessing when the IFAT system was utilized. (It is important to note that while we can show that students answer correctly at a much higher rate than random guessing, we can never truly know whether every student is using thought or not.)
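
    The comparison against chance guessing described above can be framed as a one-sided binomial test; the counts below are hypothetical, and the 1/3 chance level assumes a four-option item on which the first choice was wrong.

    ```python
    from scipy.stats import binomtest

    # Hypothetical counts: on an IFAT item with 4 options, a wrong first choice leaves
    # 3 remaining options, so random guessing succeeds on the second try with p = 1/3.
    second_chance_correct = 120     # assumed number of successful second attempts
    second_chance_total = 240       # assumed number of second attempts observed

    result = binomtest(second_chance_correct, second_chance_total, p=1/3,
                       alternative="greater")
    print(result.pvalue)   # small p-value -> success rate exceeds chance guessing
    ```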

  8. Corrective Action Decision Document/Closure Report for Corrective Action Unit 105: Area 2 Yucca Flat Atmospheric Test Sites, Nevada National Security Site, Nevada, Revision 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthews, Patrick

    2013-09-01

    This Corrective Action Decision Document/Closure Report presents information supporting the closure of Corrective Action Unit (CAU) 105: Area 2 Yucca Flat Atmospheric Test Sites, Nevada National Security Site, Nevada. CAU 105 comprises the following five corrective action sites (CASs): -02-23-04 Atmospheric Test Site - Whitney Closure In Place -02-23-05 Atmospheric Test Site T-2A Closure In Place -02-23-06 Atmospheric Test Site T-2B Clean Closure -02-23-08 Atmospheric Test Site T-2 Closure In Place -02-23-09 Atmospheric Test Site - Turk Closure In Place The purpose of this Corrective Action Decision Document/Closure Report is to provide justification and documentation supporting the recommendation that no further corrective action is needed for CAU 105 based on the implementation of the corrective actions. Corrective action investigation (CAI) activities were performed from October 22, 2012, through May 23, 2013, as set forth in the Corrective Action Investigation Plan for Corrective Action Unit 105: Area 2 Yucca Flat Atmospheric Test Sites; and in accordance with the Soils Activity Quality Assurance Plan, which establishes requirements, technical planning, and general quality practices.

  9. Are multiple-trial experiments appropriate for eyewitness identification studies? Accuracy, choosing, and confidence across trials.

    PubMed

    Mansour, J K; Beaudry, J L; Lindsay, R C L

    2017-12-01

    Eyewitness identification experiments typically involve a single trial: A participant views an event and subsequently makes a lineup decision. As compared to this single-trial paradigm, multiple-trial designs are more efficient, but significantly reduce ecological validity and may affect the strategies that participants use to make lineup decisions. We examined the effects of a number of forensically relevant variables (i.e., memory strength, type of disguise, degree of disguise, and lineup type) on eyewitness accuracy, choosing, and confidence across 12 target-present and 12 target-absent lineup trials (N = 349; 8,376 lineup decisions). The rates of correct rejections and choosing (across both target-present and target-absent lineups) did not vary across the 24 trials, as reflected by main effects or interactions with trial number. Trial number had a significant but trivial quadratic effect on correct identifications (OR = 0.99) and interacted significantly, but again trivially, with disguise type (OR = 1.00). Trial number did not significantly influence participants' confidence in correct identifications, confidence in correct rejections, or confidence in target-absent selections. Thus, multiple-trial designs appear to have minimal effects on eyewitness accuracy, choosing, and confidence. Researchers should thus consider using multiple-trial designs for conducting eyewitness identification experiments.
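    A rough sketch of one way to test for a trial-number effect of the kind reported above, using simulated long-format data (participant, trial, correct/incorrect) and a logistic regression with linear and quadratic trial terms. All data and variable names are invented, and a mixed-effects model accounting for repeated measures within participants would be closer to what such a study requires.

      # Hypothetical data; tests linear and quadratic trial-number effects on accuracy.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n_participants, n_trials = 100, 24
      df = pd.DataFrame({
          "participant": np.repeat(np.arange(n_participants), n_trials),
          "trial": np.tile(np.arange(1, n_trials + 1), n_participants),
      })
      # Simulate a flat 60% accuracy rate that does not depend on trial number.
      df["correct"] = rng.binomial(1, 0.6, size=len(df))

      model = smf.logit("correct ~ trial + I(trial ** 2)", data=df).fit(disp=False)
      print(model.summary().tables[1])
      print("odds ratios:", np.exp(model.params).round(3))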

  10. Flux or speed? Examining speckle contrast imaging of vascular flows

    PubMed Central

    Kazmi, S. M. Shams; Faraji, Ehssan; Davis, Mitchell A.; Huang, Yu-Yen; Zhang, Xiaojing J.; Dunn, Andrew K.

    2015-01-01

    Speckle contrast imaging enables rapid mapping of relative blood flow distributions using camera detection of back-scattered laser light. However, speckle derived flow measures deviate from direct measurements of erythrocyte speeds by 47 ± 15% (n = 13 mice) in vessels of various calibers. Alternatively, deviations with estimates of volumetric flux are on average 91 ± 43%. We highlight and attempt to alleviate this discrepancy by accounting for the effects of multiple dynamic scattering with speckle imaging of microfluidic channels of varying sizes and then with red blood cell (RBC) tracking correlated speckle imaging of vascular flows in the cerebral cortex. By revisiting the governing dynamic light scattering models, we test the ability to predict the degree of multiple dynamic scattering across vessels in order to correct for the observed discrepancies between relative RBC speeds and multi-exposure speckle imaging estimates of inverse correlation times. The analysis reveals that traditional speckle contrast imagery of vascular flows is neither a measure of volumetric flux nor particle speed, but rather the product of speed and vessel diameter. The corrected speckle estimates of the relative RBC speeds have an average 10 ± 3% deviation in vivo with those obtained from RBC tracking. PMID:26203384
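    As a rough illustration of how multi-exposure speckle data can be inverted for an inverse correlation time, the sketch below fits a commonly used simplified speckle visibility expression, K²(T) = β(e^(−2x) − 1 + 2x)/(2x²) with x = T/τc, to synthetic contrast-versus-exposure measurements. This generic form omits the static-scattering and noise terms of full multi-exposure speckle imaging models, and every number below is invented.

      # Fit a simplified speckle-visibility model to synthetic multi-exposure data.
      import numpy as np
      from scipy.optimize import curve_fit

      def speckle_contrast_sq(T, beta, tau_c):
          x = T / tau_c
          return beta * (np.exp(-2 * x) - 1 + 2 * x) / (2 * x**2)

      # Synthetic "measurements": exposures from ~50 us to ~5 ms, true tau_c = 1 ms.
      T = np.logspace(-4.3, -2.3, 12)                    # seconds
      rng = np.random.default_rng(1)
      K2 = speckle_contrast_sq(T, beta=0.9, tau_c=1e-3)
      K2 *= 1 + 0.02 * rng.standard_normal(T.size)       # 2% multiplicative noise

      (beta_fit, tau_fit), _ = curve_fit(speckle_contrast_sq, T, K2, p0=[1.0, 5e-4])
      print(f"fitted beta = {beta_fit:.2f}, inverse correlation time = {1.0 / tau_fit:.0f} 1/s")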

  11. Flux or speed? Examining speckle contrast imaging of vascular flows.

    PubMed

    Kazmi, S M Shams; Faraji, Ehssan; Davis, Mitchell A; Huang, Yu-Yen; Zhang, Xiaojing J; Dunn, Andrew K

    2015-07-01

    Speckle contrast imaging enables rapid mapping of relative blood flow distributions using camera detection of back-scattered laser light. However, speckle derived flow measures deviate from direct measurements of erythrocyte speeds by 47 ± 15% (n = 13 mice) in vessels of various calibers. Alternatively, deviations with estimates of volumetric flux are on average 91 ± 43%. We highlight and attempt to alleviate this discrepancy by accounting for the effects of multiple dynamic scattering with speckle imaging of microfluidic channels of varying sizes and then with red blood cell (RBC) tracking correlated speckle imaging of vascular flows in the cerebral cortex. By revisiting the governing dynamic light scattering models, we test the ability to predict the degree of multiple dynamic scattering across vessels in order to correct for the observed discrepancies between relative RBC speeds and multi-exposure speckle imaging estimates of inverse correlation times. The analysis reveals that traditional speckle contrast imagery of vascular flows is neither a measure of volumetric flux nor particle speed, but rather the product of speed and vessel diameter. The corrected speckle estimates of the relative RBC speeds have an average 10 ± 3% deviation in vivo with those obtained from RBC tracking.

  12. Stable Gene Targeting in Human Cells Using Single-Strand Oligonucleotides with Modified Bases

    PubMed Central

    Rios, Xavier; Briggs, Adrian W.; Christodoulou, Danos; Gorham, Josh M.; Seidman, Jonathan G.; Church, George M.

    2012-01-01

    Recent advances allow multiplexed genome engineering in E. coli, employing easily designed oligonucleotides to edit multiple loci simultaneously. A similar technology in human cells would greatly expedite functional genomics, both by enhancing our ability to test how individual variants such as single nucleotide polymorphisms (SNPs) are related to specific phenotypes, and potentially allowing simultaneous mutation of multiple loci. However, oligo-mediated targeting of human cells is currently limited by low targeting efficiencies and low survival of modified cells. Using a HeLa-based EGFP-rescue reporter system we show that use of modified base analogs can increase targeting efficiency, in part by avoiding the mismatch repair machinery. We investigate the effects of oligonucleotide toxicity and find a strong correlation between the number of phosphorothioate bonds and toxicity. Stably EGFP-corrected cells were generated at a frequency of ~0.05% with an optimized oligonucleotide design combining modified bases and reduced number of phosphorothioate bonds. We provide evidence from comparative RNA-seq analysis suggesting cellular immunity induced by the oligonucleotides might contribute to the low viability of oligo-corrected cells. Further optimization of this method should allow rapid and scalable genome engineering in human cells. PMID:22615794

  13. Ultra high tip speed (670.6 m/sec) fan stage with composite rotor: Aerodynamic and mechanical design

    NASA Technical Reports Server (NTRS)

    Halle, J. E.; Burger, G. D.; Dundas, R. E.

    1977-01-01

    A highly loaded, single-stage compressor having a tip speed of 670.6 m/sec was designed for the purpose of investigating very high tip speeds and high aerodynamic loadings to obtain high stage pressure ratios at acceptable levels of efficiency. The design pressure ratio is 2.8 at an adiabatic efficiency of 84.4%. Corrected design flow is 83.4 kg/sec; corrected design speed is 15,200 rpm; and rotor inlet tip diameter is 0.853 m. The rotor uses multiple-circular-arc airfoils from 0 to 15% span, precompression airfoils assuming single, strong oblique shocks from 21 to 43% span, and precompression airfoils assuming multiple oblique shocks from 52% span to the tip. Because of the high tip speeds, the rotor blades are designed to be fabricated of composite materials. Two composite materials were investigated: Courtaulds HTS graphite fiber in a Kerimid 601 polyimide matrix and the same fibers in a PMR polyimide matrix. In addition to providing a description of the aerodynamic and mechanical design of the 670.6 m/sec fan, discussion is presented of the results of structural tests of blades fabricated with both types of matrices.

  14. Correction of distortions in distressed mothers' ratings of their preschool children's psychopathology.

    PubMed

    Müller, Jörg M; Furniss, Tilman

    2013-11-30

    The often-reported low agreement about child psychopathology between multiple informants has led to various suggestions about how to address discrepant ratings. Factors discussed as potentially lowering agreement include informant credibility, reliability, and psychopathology; the last of these is of interest in this paper. We tested three different models, namely the accuracy model, the distortion model, and an integrated, so-called combined model, which conceptualize how parental ratings assess child psychopathology. The data comprise ratings of child psychopathology from multiple informants (mother, therapist, and kindergarten teacher) and ratings of maternal psychopathology. The children were patients in a preschool psychiatry unit (N=247). The results from structural equation modeling show that maternal ratings of child psychopathology were biased by maternal psychopathology (distortion model). On this statistical basis, we suggest a method to adjust biased maternal ratings. We illustrate the maternal bias by comparing maternal ratings to expert ratings (combined kindergarten teacher and therapist ratings) and show that the correction equation increases the agreement between maternal and expert ratings. We conclude that this approach may help to reduce misclassification of preschool children as 'clinical' on the basis of biased maternal ratings. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
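    A minimal sketch of how such a correction could look in its simplest form, assuming a linear distortion model in which the maternal rating equals the expert rating plus a bias proportional to maternal symptom load; the bias slope is estimated on a calibration sample and then subtracted. This is an illustrative simplification with simulated data, not the structural-equation-based correction equation developed in the paper.

      # Illustrative linear bias correction; not the authors' SEM-derived equation.
      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(2)
      n = 247
      expert_rating = rng.normal(50, 10, n)          # consensus "true" child rating
      maternal_symptoms = rng.normal(0, 1, n)        # maternal psychopathology score

      # Simulated distortion: mothers over-report in proportion to their own symptoms.
      maternal_rating = expert_rating + 6.0 * maternal_symptoms + rng.normal(0, 3, n)

      # Estimate the bias slope on a calibration sample with both ratings available.
      X = np.column_stack([expert_rating, maternal_symptoms])
      bias_slope = LinearRegression().fit(X, maternal_rating).coef_[1]

      corrected = maternal_rating - bias_slope * maternal_symptoms
      for label, r in [("raw", maternal_rating), ("corrected", corrected)]:
          print(label, "agreement r =", round(np.corrcoef(r, expert_rating)[0, 1], 3))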

  15. Infant multiple breath washout using a new commercially available device: Ready to replace the previous setup?

    PubMed

    Kentgens, Anne-Christianne; Guidi, Marisa; Korten, Insa; Kohler, Lena; Binggeli, Severin; Singer, Florian; Latzin, Philipp; Anagnostopoulou, Pinelopi

    2018-05-01

    Multiple breath washout (MBW) is a sensitive test to measure lung volumes and ventilation inhomogeneity from infancy on. The commonly used setup for infant MBW, based on an ultrasonic flowmeter, requires extensive signal processing, which may reduce robustness. A new setup may overcome some previous limitations but formal validation is lacking. We assessed the feasibility of infant MBW testing with the new setup and compared functional residual capacity (FRC) values of the old and the new setup in vivo and in vitro. We performed MBW in four healthy infants and four infants with cystic fibrosis, as well as in a Plexiglas lung simulator using realistic lung volumes and breathing patterns, with the new (Exhalyzer D, Spiroware 3.2.0, Ecomedics) and the old setup (Exhalyzer D, WBreath 3.18.0, ndd) in random sequence. The technical feasibility of MBW with the new device-setup was 100%. Intra-subject variability in FRC was low in both setups, but differences in FRC between the setups were considerable (mean relative difference 39.7%, range 18.9-65.7%, P = 0.008). Corrections of software settings decreased FRC differences (mean relative difference 14.0%, range -6.4 to 42.3%, P = 0.08). Results were confirmed in vitro. MBW measurements with the new setup were feasible in infants. However, despite attempts to correct software settings, outcomes between setups were not interchangeable. Further work is needed before widespread application of the new setup can be recommended. © 2018 Wiley Periodicals, Inc.

  16. Validating the Astronomy Diagnostics Test for Undergraduate Non-Science Majors

    NASA Astrophysics Data System (ADS)

    Slater, T. F.; Hufnagel, B.; Adams, J. P.

    1999-05-01

    The Astronomy Diagnostics Test (ADT) is a standard diagnostic test for undergraduate non-science majors taking introductory astronomy. Serving to compare the effectiveness of various instructional interventions, the ADT has been developed and field-tested over the last year by a multi-institutional team, known as the Collaboration for Astronomy Education Research (CAER). The team includes Jeff Adams, Rebecca Lindell Adrian, Christine Brick, Gina Brissenden, Grace Deming, Beth Hufnagel, Tim Slater, and Michael Zeilik, among others. The need for a nationally normed, valid, and reliable assessment instrument in astronomy has been articulated in a wide variety of forums. This need results from the simultaneous occurrence of several important phenomena over the last decade including: the inclusion of astronomy concepts in national science education standards; documentation of widespread astronomical misconceptions; the influence of the Force Concept Inventory guiding reform in physics; and the call for university faculty to document improvements in instruction. In a triangulated effort to validate the ADT for widespread use, the researchers relied on a three-phase strategy. In this context, "validity" means that the ADT measures what it purports to measure. In other words, do students give the correct answer for the scientifically correct reasons or, alternatively, do students give the correct answer even though they have misunderstandings about the phenomena being tested? These three phases were: (1) conduct statistical item-analysis on each test question for a large and diverse student population (n=2000 from 21 institutions); (2) conduct 60 clinical student interviews using the test questions as the script; and (3) conduct an inductive analysis of 30 student-supplied written responses to ADT questions posed without the multiple choices provided. The ADT and its supporting comparative database are available at URL: http://solar.physics.montana.edu/aae/adt/. This research was supported in part by NSF DGE-9714489 (BH) and NASA Grant #CERES-NAG54576 (TS).

  17. A library least-squares approach for scatter correction in gamma-ray tomography

    NASA Astrophysics Data System (ADS)

    Meric, Ilker; Anton Johansen, Geir; Valgueiro Malta Moreira, Icaro

    2015-03-01

    Scattered radiation is known to lead to distortion in reconstructed images in Computed Tomography (CT). The effects of scattered radiation are especially more pronounced in non-scanning, multiple source systems which are preferred for flow imaging where the instantaneous density distribution of the flow components is of interest. In this work, a new method based on a library least-squares (LLS) approach is proposed as a means of estimating the scatter contribution and correcting for this. The validity of the proposed method is tested using the 85-channel industrial gamma-ray tomograph previously developed at the University of Bergen (UoB). The results presented here confirm that the LLS approach can effectively estimate the amounts of transmission and scatter components in any given detector in the UoB gamma-ray tomography system.
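    A generic sketch of a library least-squares decomposition of the kind described above: the measured detector spectrum is modeled as a non-negative combination of pre-computed library spectra (here one transmission-like and one scatter-like component), and the component weights are recovered by least squares. The library shapes and counts below are entirely synthetic.

      # Generic library least-squares (LLS) decomposition with synthetic spectra.
      import numpy as np
      from scipy.optimize import nnls

      channels = np.arange(256)

      # Synthetic library spectra (unit area): a photopeak-like transmission
      # component and a broad, low-energy scatter component.
      transmission = np.exp(-0.5 * ((channels - 200) / 6.0) ** 2)
      scatter = np.exp(-channels / 80.0)
      transmission /= transmission.sum()
      scatter /= scatter.sum()
      A = np.column_stack([transmission, scatter])

      # Synthetic measurement: 8000 transmission counts + 3000 scatter counts + noise.
      rng = np.random.default_rng(3)
      measured = rng.poisson(A @ np.array([8000.0, 3000.0])).astype(float)

      coeffs, _ = nnls(A, measured)
      print("estimated transmission counts: %.0f, scatter counts: %.0f" % tuple(coeffs))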

  18. A scan-angle correction for thermal infrared multispectral data using side lapping images

    USGS Publications Warehouse

    Watson, K.

    1996-01-01

    Thermal infrared multispectral scanner (TIMS) images, acquired with side lapping flight lines, provide dual angle observations of the same area on the ground and can thus be used to estimate variations in the atmospheric transmission with scan angle. The method was tested using TIMS aircraft data for six flight lines with about 30% sidelap for an area within Joshua Tree National Park, California. Generally the results correspond to predictions for the transmission scan-angle coefficient based on a standard atmospheric model although some differences were observed at the longer wavelength channels. A change was detected for the last pair of lines that may indicate either spatial or temporal atmospheric variation. The results demonstrate that the method provides information for correcting regional survey data (requiring multiple adjacent flight lines) that can be important in detecting subtle changes in lithology.

  19. 12 CFR 1002.15 - Incentives for self-testing and self-correction.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 8 2013-01-01 2013-01-01 false Incentives for self-testing and self-correction... OPPORTUNITY ACT (REGULATION B) § 1002.15 Incentives for self-testing and self-correction. (a) General rules—(1) Voluntary self-testing and correction. The report or results of a self-test that a creditor voluntarily...

  20. 12 CFR 202.15 - Incentives for self-testing and self-correction.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 2 2014-01-01 2014-01-01 false Incentives for self-testing and self-correction... RESERVE SYSTEM EQUAL CREDIT OPPORTUNITY ACT (REGULATION B) § 202.15 Incentives for self-testing and self-correction. (a) General rules—(1) Voluntary self-testing and correction. The report or results of a self-test...

  1. 12 CFR 1002.15 - Incentives for self-testing and self-correction.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 8 2012-01-01 2012-01-01 false Incentives for self-testing and self-correction... OPPORTUNITY ACT (REGULATION B) § 1002.15 Incentives for self-testing and self-correction. (a) General rules—(1) Voluntary self-testing and correction. The report or results of a self-test that a creditor voluntarily...

  2. 12 CFR 202.15 - Incentives for self-testing and self-correction.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 2 2013-01-01 2013-01-01 false Incentives for self-testing and self-correction... RESERVE SYSTEM EQUAL CREDIT OPPORTUNITY ACT (REGULATION B) § 202.15 Incentives for self-testing and self-correction. (a) General rules—(1) Voluntary self-testing and correction. The report or results of a self-test...

  3. 12 CFR 1002.15 - Incentives for self-testing and self-correction.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 8 2014-01-01 2014-01-01 false Incentives for self-testing and self-correction... OPPORTUNITY ACT (REGULATION B) § 1002.15 Incentives for self-testing and self-correction. (a) General rules—(1) Voluntary self-testing and correction. The report or results of a self-test that a creditor voluntarily...

  4. 12 CFR 202.15 - Incentives for self-testing and self-correction.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 2 2011-01-01 2011-01-01 false Incentives for self-testing and self-correction... RESERVE SYSTEM EQUAL CREDIT OPPORTUNITY ACT (REGULATION B) § 202.15 Incentives for self-testing and self-correction. (a) General rules—(1) Voluntary self-testing and correction. The report or results of a self-test...

  5. 12 CFR 202.15 - Incentives for self-testing and self-correction.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 2 2012-01-01 2012-01-01 false Incentives for self-testing and self-correction... RESERVE SYSTEM EQUAL CREDIT OPPORTUNITY ACT (REGULATION B) § 202.15 Incentives for self-testing and self-correction. (a) General rules—(1) Voluntary self-testing and correction. The report or results of a self-test...

  6. Dietary fatty acids were not independently associated with lipoprotein subclasses in elderly women.

    PubMed

    Alaghehband, Fatemeh Ramezan; Lankinen, Maria; Värri, Miika; Sirola, Joonas; Kröger, Heikki; Erkkilä, Arja T

    2017-07-01

    Dietary fatty acids are known to affect serum lipoproteins; however, little is known about the associations between consumption of dietary fatty acids and lipoprotein subclasses. In this study, we hypothesized that there is an association between dietary fatty acids and lipoprotein subclasses and investigated the cross-sectional association of dietary fat intake with subclasses of lipoproteins in elderly women. Altogether, 547 women (aged ≥65 years) who were part of the OSTPRE cohort participated. Dietary intake was assessed by 3-day food records, lifestyle and health information were obtained through self-administered questionnaires, and lipoprotein subclasses were determined by nuclear magnetic resonance spectroscopy. To analyze the associations between fatty acids and lipoprotein subclasses, we used Pearson and Spearman correlation coefficients and the analysis of covariance (ANCOVA) test with adjustment for physical activity, body mass index, age, smoking status, and intake of lipid-lowering drugs. There were significant correlations between saturated fatty acids (SFA; % of energy) and concentrations of large, medium, and small low-density lipoproteins (LDL); total cholesterol in large, medium, and small LDL; and phospholipids in large, medium, and small LDL, after correction for multiple testing. After adjustment for covariates, the higher intake of SFA was associated with smaller size of LDL particles (P = .04, ANCOVA) and a lower amount of triglycerides in small very low-density lipoproteins (P = .046, ANCOVA). However, these associations did not remain significant after correction for multiple testing. In conclusion, high intake of SFA may be associated with the size of LDL particles, but the results do not support significant, independent associations between dietary fatty acids and lipoprotein subclasses. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Association of common variants in PAH and LAT1 with non-syndromic cleft lip with or without cleft palate (NSCL/P) in the Polish population.

    PubMed

    Hozyasz, Kamil K; Mostowska, Adrianna; Wójcicki, Piotr; Lasota, Agnieszka; Wołkowicz, Anna; Dunin-Wilczyńska, Izabella; Jagodziński, Paweł P

    2014-04-01

    Non-syndromic cleft lip with or without cleft palate (NSCL/P) is a common structural malformation with a complex and multifactorial aetiology. Associations of abnormalities in phenylalanine metabolism and orofacial clefts have been suggested. Eight single nucleotide polymorphisms (SNPs) of genes encoding phenylalanine hydroxylase (PAH) and large neutral l-amino acid transporter type 1 (LAT1), as well as the PAH mutation that is most common in the Polish population (rs5030858; R408W), were investigated in 263 patients with NSCL/P and 270 matched controls using high resolution melting curve analysis (HRM). We found that two polymorphic variants of PAH appear to be risk factors for NSCL/P. The odds ratio (OR) for individuals with the rs7485331 A allele (AC or AA) compared to CC homozygotes was 0.616 (95% confidence interval [CI]=0.437-0.868; p=0.005) and this association remains statistically significant after multiple testing correction. The PAH rs12425434, previously associated with schizophrenia, was borderline associated with orofacial clefts. Moreover, haplotype analysis of polymorphisms in the PAH gene revealed a 4-marker combination that was significantly associated with NSCL/P. The global p-value for a haplotype comprised of SNPs rs74385331, rs12425434, rs1722392, and the mutation rs5030858 was 0.032, but this association did not survive multiple testing correction. This study suggests the involvement of the PAH gene in the aetiology of NSCL/P in the tested population. Further replication will be required in separate cohorts to confirm the consistency of the observed association. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Interaction between polymorphisms in aspirin metabolic pathways, regular aspirin use and colorectal cancer risk: A case-control study in unselected white European populations.

    PubMed

    Sheth, Harsh; Northwood, Emma; Ulrich, Cornelia M; Scherer, Dominique; Elliott, Faye; Barrett, Jennifer H; Forman, David; Wolf, C Roland; Smith, Gillian; Jackson, Michael S; Santibanez-Koref, Mauro; Haile, Robert; Casey, Graham; Jenkins, Mark; Win, Aung Ko; Hopper, John L; Marchand, Loic Le; Lindor, Noralane M; Thibodeau, Stephen N; Potter, John D; Burn, John; Bishop, D Timothy

    2018-01-01

    Regular aspirin use is associated with reduced risk of colorectal cancer (CRC). Variation in aspirin's chemoprevention efficacy has been attributed to the presence of single nucleotide polymorphisms (SNPs). We conducted a meta-analysis using two large population-based case-control datasets, the UK-Leeds Colorectal Cancer Study Group and the NIH-Colon Cancer Family Registry, having a combined total of 3325 cases and 2262 controls. The aim was to assess 42 candidate SNPs in 15 genes whose association with colorectal cancer risk was putatively modified by aspirin use in the literature. Log odds ratios (ORs) and standard errors were estimated for each dataset separately using logistic regression adjusting for age, sex and study site, and dataset-specific results were combined using random effects meta-analysis. Meta-analysis showed association between SNPs rs6983267, rs11694911 and rs2302615 with CRC risk reduction (all P<0.05). Association for SNP rs6983267 in the CCAT2 gene only was noteworthy after multiple test correction (P = 0.001). Site-specific analysis showed association between SNPs rs1799853 and rs2302615 with reduced colon cancer risk only (P = 0.01 and P = 0.004, respectively), however neither reached the significance threshold following multiple test correction. Meta-analysis of SNPs rs2070959 and rs1105879 in the UGT1A6 gene showed interaction between aspirin use and CRC risk (Pinteraction = 0.01 and 0.02, respectively); stratification by aspirin use showed an association for decreased CRC risk for aspirin users having a wild-type genotype (rs2070959 OR = 0.77, 95% CI = 0.68-0.86; rs1105879 OR = 0.77, 95% CI = 0.69-0.86) compared to variant allele carriers. The direction of the interaction however is in contrast to that published in studies on colorectal adenomas. Both SNPs showed potential site-specific interaction with aspirin use and colon cancer risk only (Pinteraction = 0.006 and 0.008, respectively), with the direction of association similar to that observed for CRC. Additionally, they showed interaction between any non-steroidal anti-inflammatory drug (including aspirin) use and CRC risk (Pinteraction = 0.01 for both). None of the gene x environment (GxE) interactions, however, remained significant after multiple test correction. Candidate gene investigation indicated no evidence of GxE interaction between genetic variants in genes involved in aspirin pathways, regular aspirin use and colorectal cancer risk.
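    A compact sketch of the random-effects pooling step described above (DerSimonian-Laird), combining per-dataset log odds ratios and standard errors into a single estimate; the two example inputs are made-up numbers, not results from the study.

      # DerSimonian-Laird random-effects pooling of log odds ratios (illustrative inputs).
      import numpy as np
      from scipy.stats import norm

      log_or = np.array([-0.15, -0.05])      # per-dataset log odds ratios (assumed)
      se = np.array([0.06, 0.08])            # their standard errors (assumed)

      w = 1.0 / se**2                                    # fixed-effect weights
      pooled_fe = np.sum(w * log_or) / np.sum(w)
      Q = np.sum(w * (log_or - pooled_fe) ** 2)          # heterogeneity statistic
      k = len(log_or)
      tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

      w_re = 1.0 / (se**2 + tau2)                        # random-effects weights
      pooled = np.sum(w_re * log_or) / np.sum(w_re)
      se_pooled = np.sqrt(1.0 / np.sum(w_re))
      p = 2 * norm.sf(abs(pooled / se_pooled))
      lo, hi = np.exp(pooled - 1.96 * se_pooled), np.exp(pooled + 1.96 * se_pooled)
      print(f"pooled OR = {np.exp(pooled):.3f}, 95% CI = ({lo:.3f}, {hi:.3f}), p = {p:.3f}")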

  9. Interaction between polymorphisms in aspirin metabolic pathways, regular aspirin use and colorectal cancer risk: A case-control study in unselected white European populations

    PubMed Central

    Ulrich, Cornelia M.; Scherer, Dominique; Elliott, Faye; Barrett, Jennifer H.; Forman, David; Wolf, C. Roland; Smith, Gillian; Jackson, Michael S.; Santibanez-Koref, Mauro; Haile, Robert; Casey, Graham; Jenkins, Mark; Win, Aung Ko; Hopper, John L.; Marchand, Loic Le; Lindor, Noralane M.; Thibodeau, Stephen N.; Potter, John D.; Burn, John; Bishop, D. Timothy

    2018-01-01

    Regular aspirin use is associated with reduced risk of colorectal cancer (CRC). Variation in aspirin’s chemoprevention efficacy has been attributed to the presence of single nucleotide polymorphisms (SNPs). We conducted a meta-analysis using two large population-based case-control datasets, the UK-Leeds Colorectal Cancer Study Group and the NIH-Colon Cancer Family Registry, having a combined total of 3325 cases and 2262 controls. The aim was to assess 42 candidate SNPs in 15 genes whose association with colorectal cancer risk was putatively modified by aspirin use in the literature. Log odds ratios (ORs) and standard errors were estimated for each dataset separately using logistic regression adjusting for age, sex and study site, and dataset-specific results were combined using random effects meta-analysis. Meta-analysis showed association between SNPs rs6983267, rs11694911 and rs2302615 with CRC risk reduction (all P<0.05). Association for SNP rs6983267 in the CCAT2 gene only was noteworthy after multiple test correction (P = 0.001). Site-specific analysis showed association between SNPs rs1799853 and rs2302615 with reduced colon cancer risk only (P = 0.01 and P = 0.004, respectively), however neither reached the significance threshold following multiple test correction. Meta-analysis of SNPs rs2070959 and rs1105879 in the UGT1A6 gene showed interaction between aspirin use and CRC risk (Pinteraction = 0.01 and 0.02, respectively); stratification by aspirin use showed an association for decreased CRC risk for aspirin users having a wild-type genotype (rs2070959 OR = 0.77, 95% CI = 0.68–0.86; rs1105879 OR = 0.77, 95% CI = 0.69–0.86) compared to variant allele carriers. The direction of the interaction however is in contrast to that published in studies on colorectal adenomas. Both SNPs showed potential site-specific interaction with aspirin use and colon cancer risk only (Pinteraction = 0.006 and 0.008, respectively), with the direction of association similar to that observed for CRC. Additionally, they showed interaction between any non-steroidal anti-inflammatory drug (including aspirin) use and CRC risk (Pinteraction = 0.01 for both). None of the gene x environment (GxE) interactions, however, remained significant after multiple test correction. Candidate gene investigation indicated no evidence of GxE interaction between genetic variants in genes involved in aspirin pathways, regular aspirin use and colorectal cancer risk. PMID:29425227

  10. Sure, or unsure? Measuring students' confidence and the potential impact on patient safety in multiple-choice questions.

    PubMed

    Rangel, Rafael Henrique; Möller, Leona; Sitter, Helmut; Stibane, Tina; Strzelczyk, Adam

    2017-11-01

    Multiple-choice questions (MCQs) provide useful information about correct and incorrect answers, but they do not offer information about students' confidence. Ninety medical students participated in one curricular neurology multiple-choice exam and another 81 in a second exam, and each student indicated their confidence for every single MCQ. Each MCQ had a defined level of potential clinical impact on patient safety (uncritical, risky, harmful). Our first objective was to detect informed (IF), guessed (GU), misinformed (MI), and uninformed (UI) answers. Further, we evaluated whether there were significant differences in confidence between correct and incorrect answers. Then, we explored whether clinical impact had a significant influence on students' confidence. There were 1818 IF, 635 GU, 71 MI, and 176 UI answers in exam I and 1453 IF, 613 GU, 92 MI, and 191 UI answers in exam II. Students' confidence was significantly higher for correct than for incorrect answers in both exams (p < 0.001). For exam I, students' confidence was significantly higher for incorrect harmful than for incorrect risky classified MCQs (p = 0.01). For exam II, students' confidence was significantly higher for incorrect harmful than for incorrect benign (p < 0.01) and significantly higher for correct benign than for correct harmful categorized MCQs (p = 0.01). We were pleased to see that there were more informed than guessed answers, more uninformed than misinformed answers, and higher students' confidence for correct than for incorrect answers. Our expectation that students would state higher confidence for correct and harmful MCQs and lower confidence for incorrect and harmful MCQs could not be confirmed.

  11. The Oxytocin Receptor Gene (OXTR) and Face Recognition.

    PubMed

    Verhallen, Roeland J; Bosten, Jenny M; Goodbourn, Patrick T; Lawrance-Owen, Adam J; Bargary, Gary; Mollon, J D

    2017-01-01

    A recent study has linked individual differences in face recognition to rs237887, a single-nucleotide polymorphism (SNP) of the oxytocin receptor gene (OXTR; Skuse et al., 2014). In that study, participants were assessed using the Warrington Recognition Memory Test for Faces, but performance on Warrington's test has been shown not to rely purely on face recognition processes. We administered the widely used Cambridge Face Memory Test, a purer test of face recognition, to 370 participants. Performance was not significantly associated with rs237887, with 16 other SNPs of OXTR that we genotyped, or with a further 75 imputed SNPs. We also administered three other tests of face processing (the Mooney Face Test, the Glasgow Face Matching Test, and the Composite Face Test), but performance was never significantly associated with rs237887 or with any of the other genotyped or imputed SNPs, after corrections for multiple testing. In addition, we found no associations between OXTR and Autism-Spectrum Quotient scores.

  12. Underwater hydrophone location survey

    NASA Technical Reports Server (NTRS)

    Cecil, Jack B.

    1993-01-01

    The Atlantic Undersea Test and Evaluation Center (AUTEC) is a U.S. Navy test range located on Andros Island, Bahamas, and a Division of the Naval Undersea Warfare Center (NUWC), Newport, RI. The Headquarters of AUTEC is located at a facility in West Palm Beach, FL. AUTEC's primary mission is to provide the U.S. Navy with a deep-water test and evaluation facility for making underwater acoustic measurements, testing and calibrating sonars, and providing accurate underwater, surface, and in-air tracking data on surface ships, submarines, aircraft, and weapon systems. Many of these programs are in support of Antisubmarine Warfare (ASW), undersea research and development programs, and Fleet assessment and operational readiness trials. Most tests conducted at AUTEC require precise underwater tracking (plus or minus 3 yards) of multiple acoustic signals emitted with the correct waveshape and repetition criteria from either a surface craft or underwater vehicle.

  13. Status of molten carbonate fuel cell technology development

    NASA Astrophysics Data System (ADS)

    Parsons, E. L., Jr.; Williams, M. C.; George, T. J.

    The MCFC technology has been identified by the DOE as a promising product for commercialization. Development of the MCFC technology supports the National Energy Strategy. Review of the status of the MCFC technology indicates that developers are making rapid and significant progress. Manufacturing facility development and extensive testing are occurring. Improvements in performance (power density), lower costs, improved packaging, and scale-up to full height are planned. MCFC developers need to continue to be responsive to end-users in potential markets. Market demands for the correct product definition will ultimately determine the character of MCFC power plants. There is a need for continued MCFC product improvement and multiple product development tests.

  14. Age-related prevalence and met need for correctable and uncorrectable near vision impairment in a multi-country study.

    PubMed

    He, Mingguang; Abdou, Amza; Ellwein, Leon B; Naidoo, Kovin S; Sapkota, Yuddha D; Thulasiraj, R D; Varma, Rohit; Zhao, Jialiang; Kocur, Ivo; Congdon, Nathan G

    2014-01-01

    To estimate the prevalence, potential determinants, and proportion of met need for near vision impairment (NVI) correctable with refraction approximately 2 years after initial examination of a multi-country cohort. Population-based, prospective cohort study. People aged ≥35 years examined at baseline in semi-rural (Shunyi) and urban (Guangzhou) sites in China; rural sites in Nepal (Kaski), India (Madurai), and Niger (Dosso); a semi-urban site (Durban) in South Africa; and an urban site (Los Angeles) in the United States. Near visual acuity (NVA) with and without current near correction was measured at 40 cm using a logarithm of the minimum angle of resolution near vision tumbling E chart. Participants with uncorrected binocular NVA ≤20/40 were tested with plus sphere lenses to obtain best-corrected binocular NVA. Prevalence of total NVI (defined as uncorrected NVA ≤20/40) and NVI correctable and uncorrectable to >20/40, and current spectacle wearing among those with bilateral NVA ≤20/63 improving to >20/40 with near correction (met need). Among 13 671 baseline participants, 10 533 (77.2%) attended the follow-up examination. The prevalence of correctable NVI increased with age from 35 to 50-60 years and then decreased at all sites. Multiple logistic regression modeling suggested that correctable NVI was not associated with gender at any site, whereas more educated persons aged >54 years were associated with a higher prevalence of correctable NVI in Nepal and India. Although near vision spectacles were provided free at baseline, wear among those who could benefit was <40% at all but 2 centers (Guangzhou and Los Angeles). Prevalence of correctable NVI is greatest among persons of working age, and rates of correction are low in many settings, suggesting that strategies targeting the workplace may be needed. Copyright © 2014 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  15. Accuracies and Contrasts of Models of the Diffusion-Weighted-Dependent Attenuation of the MRI Signal at Intermediate b-values.

    PubMed

    Nicolas, Renaud; Sibon, Igor; Hiba, Bassem

    2015-01-01

    The diffusion-weighted-dependent attenuation of the MRI signal E(b) is extremely sensitive to microstructural features. The aim of this study was to determine which mathematical model of the E(b) signal most accurately describes it in the brain. The models compared were the monoexponential model, the stretched exponential model, the truncated cumulant expansion (TCE) model, the biexponential model, and the triexponential model. Acquisition was performed with nine b-values up to 2500 s/mm² in 12 healthy volunteers. The goodness-of-fit was studied with F-tests and with the Akaike information criterion. Tissue contrasts were differentiated with a multiple-comparison-corrected nonparametric analysis of variance. F-test showed that the TCE model was better than the biexponential model in gray and white matter. Corrected Akaike information criterion showed that the TCE model has the best accuracy and produced the most reliable contrasts in white matter among all models studied. In conclusion, the TCE model was found to be the best model to infer the microstructural properties of brain tissue.
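    A rough sketch of the kind of model comparison described above: a monoexponential and a kurtosis-type (TCE-like) signal model are fitted to synthetic E(b) data and ranked with the corrected Akaike information criterion (AICc). The functional forms are standard textbook expressions and the data are simulated; this is not the authors' fitting pipeline.

      # Fit two diffusion signal models to synthetic E(b) data and compare via AICc.
      import numpy as np
      from scipy.optimize import curve_fit

      def mono(b, D):
          return np.exp(-b * D)

      def kurtosis(b, D, K):                      # truncated-cumulant-expansion form
          return np.exp(-b * D + (b * D) ** 2 * K / 6.0)

      def aicc(rss, n, k):
          return n * np.log(rss / n) + 2 * k + 2 * k * (k + 1) / (n - k - 1)

      b = np.array([0, 200, 500, 800, 1200, 1600, 2000, 2200, 2500], float)  # s/mm^2
      rng = np.random.default_rng(4)
      signal = kurtosis(b, D=0.8e-3, K=1.0) + 0.005 * rng.standard_normal(b.size)

      scores = {}
      for name, model, p0 in [("mono", mono, [1e-3]), ("kurtosis", kurtosis, [1e-3, 0.5])]:
          popt, _ = curve_fit(model, b, signal, p0=p0)
          rss = np.sum((signal - model(b, *popt)) ** 2)
          scores[name] = aicc(rss, b.size, len(popt))
      print("AICc (lower is better):", {k: round(v, 1) for k, v in scores.items()})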

  16. Effect of central corneal thickness, corneal curvature, and axial length on applanation tonometry.

    PubMed

    Kohlhaas, Markus; Boehm, Andreas G; Spoerl, Eberhard; Pürsten, Antje; Grein, Hans J; Pillunat, Lutz E

    2006-04-01

    To evaluate the effect of central corneal thickness (CCT), corneal curvature, and axial length on applanation tonometry in an in vivo study. In a masked, prospective clinical trial, we examined 125 eyes of 125 patients scheduled for cataract surgery. Corneal curvature was measured by means of keratometry and axial length by A-scan ultrasonography. By cannulating the anterior chamber before surgery, intraocular pressure (IOP) was set to 20, 35, and 50 mm Hg in a closed system by means of a water column. After measuring thickness, the IOP was measured with an applanation tonometer. Pearson product moment correlations and multiple linear regression analyses were performed, and significance levels were evaluated by the paired, 2-tailed t test. The difference between measured and real IOP was significantly dependent (P < .001) on CCT. The associations between IOP and corneal curvature or IOP and axial length were not statistically significant (P = .31). The association between IOP reading and CCT is shown in the "Dresdner correction table," which illustrates an approximately 1-mm Hg correction for every 25-µm deviation from a CCT of 550 µm. The correction values were positive as thickness decreased and negative as thickness increased. Central corneal thickness significantly affects IOP readings obtained by applanation tonometry according to the Goldmann principle. A correction of IOP readings by considering CCT according to the Dresdner correction table might be helpful for determining an accurate IOP value.
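    A tiny sketch of the rule of thumb stated above (roughly 1 mm Hg per 25 µm of deviation from a 550-µm cornea), applied as a linear adjustment. The published Dresdner table is tabulated rather than strictly linear, so this should be read as an approximation of the stated rule, not a reproduction of the table.

      # Approximate, linearized version of the CCT-based IOP adjustment described above.
      def corrected_iop(measured_iop_mmhg, cct_um, reference_cct_um=550.0, mmhg_per_25um=1.0):
          """Add ~1 mm Hg per 25 um the cornea is thinner than 550 um; subtract if thicker."""
          return measured_iop_mmhg + (reference_cct_um - cct_um) / 25.0 * mmhg_per_25um

      print(corrected_iop(16.0, 600.0))   # thick cornea -> reading adjusted downward (14.0)
      print(corrected_iop(16.0, 500.0))   # thin cornea  -> reading adjusted upward (18.0)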

  17. The Cognitive and Perceptual Laws of the Inclined Plane.

    PubMed

    Masin, Sergio Cesare

    2016-09-01

    The study explored whether laypersons correctly tacitly know Galileo's law of the inclined plane and what the basis of such knowledge could be. Participants predicted the time a ball would take to roll down a slope with factorial combination of ball travel distance and slope angle. The resulting pattern of factorial curves relating the square of predicted time to travel distance for each slope angle was identical to that implied by Galileo's law, indicating a correct cognitive representation of this law. Intuitive physics research suggests that this cognitive representation may result from memories of past perceptions of objects rolling down a slope. Such a basis and the correct cognitive representation of Galileo's law led to the hypothesis that Galileo's law is also perceptually represented correctly. To test this hypothesis, participants were asked to judge the perceived travel time of a ball actually rolling down a slope, with perceived travel distance and perceived slope angle varied in a factorial design. The obtained pattern of factorial curves was equal to that implied by Galileo's law, indicating that the functional relationships defined in this law were perceptually represented correctly. The results foster the idea that laypersons may tacitly know both linear and nonlinear multiplicative physical laws of the everyday world. As a practical implication, the awareness of this conclusion may help develop more effective methods for teaching physics and for improving human performance in the physical environment.
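    For reference, the functional relationship underlying the factorial pattern described above can be written out explicitly. For a ball rolling without slipping down an incline, a standard result gives the travel time t over distance d at slope angle θ (with β = I/(m r²) = 2/5 for a uniform solid sphere), so that the squared time is proportional to distance and inversely proportional to the sine of the slope angle:

      % Standard rolling-ball result; t^2 is proportional to d / sin(theta).
      \[
        t = \sqrt{\frac{2d\,(1+\beta)}{g\sin\theta}},
        \qquad \beta = \frac{I}{m r^{2}} = \tfrac{2}{5}\ \text{(uniform solid sphere)},
        \qquad\Longrightarrow\qquad
        t^{2} \propto \frac{d}{\sin\theta}.
      \]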

  18. Joint association of nicotinic acetylcholine receptor variants with abdominal obesity in American Indians: the Strong Heart Family Study.

    PubMed

    Zhu, Yun; Yang, Jingyun; Yeh, Fawn; Cole, Shelley A; Haack, Karin; Lee, Elisa T; Howard, Barbara V; Zhao, Jinying

    2014-01-01

    Cigarette smoke is a strong risk factor for obesity and cardiovascular disease. The effect of genetic variants involved in nicotine metabolism on obesity or body composition has not been well studied. Though many genetic variants have previously been associated with adiposity or body fat distribution, a single variant usually confers a minimal individual risk. The goal of this study is to evaluate the joint association of multiple variants involved in cigarette smoke or nicotine dependence with obesity-related phenotypes in American Indians. To achieve this goal, we genotyped 61 tagSNPs in seven genes encoding nicotinic acetylcholine receptors (nAChRs) in 3,665 American Indians participating in the Strong Heart Family Study. Single SNP association with obesity-related traits was tested using family-based association, adjusting for traditional risk factors including smoking. Joint association of all SNPs in the seven nAChR genes was examined by gene-family analysis based on the weighted truncated product method (TPM). Multiple testing was controlled by false discovery rate (FDR). Results demonstrate that multiple SNPs showed weak individual association with one or more measures of obesity, but none survived correction for multiple testing. However, gene-family analysis revealed significant associations with waist circumference (p = 0.0001) and waist-to-hip ratio (p = 0.0001), but not body mass index (p = 0.20) and percent body fat (p = 0.29), indicating that genetic variants are jointly associated with abdominal, but not general, obesity among American Indians. The observed combined genetic effect is independent of cigarette smoking per se. In conclusion, multiple variants in the nAChR gene family are jointly associated with abdominal obesity in American Indians, independent of general obesity and cigarette smoking per se.
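    A small sketch of a truncated product method (TPM) test like the gene-family analysis described above: p-values below a truncation threshold are multiplied together, and the significance of the product is calibrated against a Monte Carlo null of independent uniform p-values. The real analysis used weighted statistics and had to respect correlation among SNPs and family structure, which this toy version ignores; all inputs are invented.

      # Toy truncated product method (TPM) with a Monte Carlo null of independent uniforms.
      import numpy as np

      def tpm_statistic(pvals, tau=0.05):
          selected = pvals[pvals <= tau]
          # Log of the truncated product; an empty selection gives the null value 0.
          return np.sum(np.log(selected)) if selected.size else 0.0

      def tpm_pvalue(pvals, tau=0.05, n_sim=20000, seed=5):
          rng = np.random.default_rng(seed)
          pvals = np.asarray(pvals, dtype=float)
          observed = tpm_statistic(pvals, tau)
          null = np.array([tpm_statistic(rng.uniform(size=pvals.size), tau)
                           for _ in range(n_sim)])
          # Smaller (more negative) log-products are more extreme.
          return (np.sum(null <= observed) + 1) / (n_sim + 1)

      snp_pvalues = [0.03, 0.20, 0.004, 0.65, 0.11, 0.02, 0.48, 0.09]   # hypothetical
      print("gene-family TPM p-value:", round(tpm_pvalue(snp_pvalues), 4))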

  19. Frequency-domain method for measuring spectral properties in multiple-scattering media: methemoglobin absorption spectrum in a tissuelike phantom

    NASA Astrophysics Data System (ADS)

    Fishkin, Joshua B.; So, Peter T. C.; Cerussi, Albert E.; Gratton, Enrico; Fantini, Sergio; Franceschini, Maria Angela

    1995-03-01

    We have measured the optical absorption and scattering coefficient spectra of a multiple-scattering medium (i.e., a biological tissue-simulating phantom comprising a lipid colloid) containing methemoglobin by using frequency-domain techniques. The methemoglobin absorption spectrum determined in the multiple-scattering medium is in excellent agreement with a corrected methemoglobin absorption spectrum obtained from a steady-state spectrophotometer measurement of the optical density of a minimally scattering medium. The determination of the corrected methemoglobin absorption spectrum takes into account the scattering from impurities in the methemoglobin solution containing no lipid colloid. Frequency-domain techniques allow for the separation of the absorbing from the scattering properties of multiple-scattering media, and these techniques thus provide an absolute determination of these optical properties.

  20. Glucose hypometabolism is highly localized, but lower cortical thickness and brain atrophy are widespread in cognitively normal older adults.

    PubMed

    Nugent, Scott; Castellano, Christian-Alexandre; Goffaux, Philippe; Whittingstall, Kevin; Lepage, Martin; Paquet, Nancy; Bocti, Christian; Fulop, Tamas; Cunnane, Stephen C

    2014-06-01

    Several studies have suggested that glucose hypometabolism may be present in specific brain regions in cognitively normal older adults and could contribute to the risk of subsequent cognitive decline. However, certain methodological shortcomings, including a lack of partial volume effect (PVE) correction or insufficient cognitive testing, confound the interpretation of most studies on this topic. We combined [(18)F]fluorodeoxyglucose ([(18)F]FDG) positron emission tomography (PET) and magnetic resonance (MR) imaging to quantify cerebral metabolic rate of glucose (CMRg) as well as cortical volume and thickness in 43 anatomically defined brain regions from a group of cognitively normal younger (25 ± 3 yr old; n = 25) and older adults (71 ± 9 yr old; n = 31). After correcting for PVE, we observed 11-17% lower CMRg in three specific brain regions of the older group: the superior frontal cortex, the caudal middle frontal cortex, and the caudate (P ≤ 0.01 false discovery rate-corrected). In the older group, cortical volumes and cortical thickness were 13-33 and 7-18% lower, respectively, in multiple brain regions (P ≤ 0.01 FDR correction). There were no differences in CMRg between individuals who were or were not prescribed antihypertensive medication. There were no significant correlations between CMRg and cognitive performance or metabolic parameters measured in fasting plasma. We conclude that highly localized glucose hypometabolism and widespread cortical thinning and atrophy can be present in older adults who are cognitively normal, as assessed using age-normed neuropsychological testing measures. Copyright © 2014 the American Physiological Society.

  1. Matrix metalloproteinases and educational attainment in refractive error: evidence of gene-environment interactions in the AREDS study

    PubMed Central

    Wojciechowski, Robert; Yee, Stephanie S.; Simpson, Claire L.; Bailey-Wilson, Joan E.; Stambolian, Dwight

    2012-01-01

    Purpose A previous study of Old Order Amish families has shown association of ocular refraction with markers proximal to matrix metalloproteinase (MMP) genes MMP1 and MMP10 and intragenic to MMP2. We conducted a candidate gene replication study of association between refraction and single nucleotide polymorphisms (SNPs) within these genomic regions. Design Candidate gene genetic association study. Participants 2,000 participants drawn from the Age Related Eye Disease Study (AREDS) were chosen for genotyping. After quality control filtering, 1912 individuals were available for analysis. Methods Microarray genotyping was performed using the HumanOmni 2.5 bead array. SNPs originally typed in the previous Amish association study were extracted for analysis. In addition, haplotype tagging SNPs were genotyped using TaqMan assays. Quantitative trait association analyses of mean spherical equivalent refraction (MSE) were performed on 30 markers using linear regression models and an additive genetic risk model, while adjusting for age, sex, education, and population substructure. Post-hoc analyses were performed after stratifying on a dichotomous education variable. Pointwise (P-emp) and multiple-test study-wise (P-multi) significance levels were calculated empirically through permutation. Main outcome measures MSE was used as a quantitative measure of ocular refraction. Results The mean age and ocular refraction were 68 years (SD=4.7) and +0.55 D (SD=2.14), respectively. Pointwise statistical significance was obtained for rs1939008 (P-emp=0.0326). No SNP attained statistical significance after correcting for multiple testing. In stratified analyses, multiple SNPs reached pointwise significance in the lower-education group: 2 of these were statistically significant after multiple testing correction. The two highest-ranking SNPs in Amish families (rs1939008 and rs9928731) showed pointwise P-emp<0.01 in the lower-education stratum of AREDS participants. Conclusions We show suggestive evidence of replication of an association signal for ocular refraction to a marker between MMP1 and MMP10. We also provide evidence of a gene-environment interaction between previously-reported markers and education on refractive error. Variants in MMP1- MMP10 and MMP2 regions appear to affect population variation in ocular refraction in environmental conditions less favorable for myopia development. PMID:23098370

  2. A New Clinical Pain Knowledge Test for Nurses: Development and Psychometric Evaluation.

    PubMed

    Bernhofer, Esther I; St Marie, Barbara; Bena, James F

    2017-08-01

    All nurses care for patients with pain, and pain management knowledge and attitude surveys for nurses have been around since 1987. However, no validated knowledge test exists to measure postlicensure clinicians' knowledge of the core competencies of pain management in current complex patient populations. To develop and test the psychometric properties of an instrument designed to measure pain management knowledge of postlicensure nurses. Psychometric instrument validation. Four large Midwestern U.S. hospitals. Registered nurses employed full time and part time August 2015 to April 2016, aged M = 43.25 years; time as RN, M = 16.13 years. Prospective survey design using e-mail to invite nurses to take an electronic multiple choice pain knowledge test. Content validity of initial 36-item test "very good" (95.1% agreement). Completed tests that met analysis criteria, N = 747. Mean initial test score, 69.4% correct (range 27.8-97.2). After revision/removal of 13 unacceptable questions, mean test score was 50.4% correct (range 8.7-82.6). Initial test item percent difficulty range was 15.2%-98.1%; discrimination values range, 0.03-0.50; final test item percent difficulty range, 17.6%-91.1%, discrimination values range, -0.04 to 1.04. Split-half reliability final test was 0.66. A high decision consistency reliability was identified, with test cut-score of 75%. The final 23-item Clinical Pain Knowledge Test has acceptable discrimination, difficulty, decision consistency, reliability, and validity in the general clinical inpatient nurse population. This instrument will be useful in assessing pain management knowledge of clinical nurses to determine gaps in education, evaluate knowledge after pain management education, and measure research outcomes. Copyright © 2017 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.
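    A short sketch of the classical item-analysis quantities mentioned above: item difficulty as the proportion answering correctly, and item discrimination as the corrected point-biserial correlation between an item and the rest-of-test score. The response matrix is simulated and the review threshold is arbitrary.

      # Classical test theory item analysis on a simulated 0/1 response matrix.
      import numpy as np

      rng = np.random.default_rng(6)
      n_examinees, n_items = 747, 23
      ability = rng.normal(size=n_examinees)
      easiness = rng.uniform(-1.5, 1.5, size=n_items)
      responses = rng.binomial(1, 1 / (1 + np.exp(-(ability[:, None] + easiness[None, :]))))

      difficulty = responses.mean(axis=0)                     # proportion correct per item
      rest_scores = responses.sum(axis=1, keepdims=True) - responses
      discrimination = np.array([np.corrcoef(responses[:, j], rest_scores[:, j])[0, 1]
                                 for j in range(n_items)])    # corrected point-biserial

      for j in range(n_items):
          flag = " <- review" if discrimination[j] < 0.20 else ""
          print(f"item {j + 1:2d}: difficulty={difficulty[j]:.2f}, "
                f"discrimination={discrimination[j]:.2f}{flag}")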

  3. 77 FR 64768 - Regulations Regarding the Application of Section 172(h) Including Consolidated Groups; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-23

    ... treatment of multiple step plans for the acquisition of stock and CERTs involving members of a consolidated... language ``Service, 1111 Constitution Avenue NW.,'' is corrected to read ``Service, 1111 Constitution... from the bottom of the page, the language ``return group; (4) application of these'' is corrected to...

  4. Accuracy of Range Restriction Correction with Multiple Imputation in Small and Moderate Samples: A Simulation Study

    ERIC Educational Resources Information Center

    Pfaffel, Andreas; Spiel, Christiane

    2016-01-01

    Approaches to correcting correlation coefficients for range restriction have been developed under the framework of large sample theory. The accuracy of missing data techniques for correcting correlation coefficients for range restriction has thus far only been investigated with relatively large samples. However, researchers and evaluators are…
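    As background to the correction being studied above, Thorndike's Case II formula for correcting a correlation for direct range restriction on the predictor is easy to state and implement; the sketch below applies it to made-up numbers and is not the article's multiple-imputation approach.

      # Thorndike Case II correction for direct range restriction on the predictor.
      # Illustrative numbers only; the article studies a multiple-imputation alternative.
      import math

      def correct_range_restriction(r_restricted, sd_unrestricted, sd_restricted):
          u = sd_unrestricted / sd_restricted
          return (r_restricted * u) / math.sqrt(1 - r_restricted**2 + (r_restricted * u) ** 2)

      # Example: observed r = .30 in a selected sample whose predictor SD is 60% of
      # the unrestricted SD; the corrected estimate is about .46.
      print(round(correct_range_restriction(0.30, sd_unrestricted=1.0, sd_restricted=0.6), 3))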

  5. Successful correction of tibial bone deformity through multiple surgical procedures, liquid nitrogen-pretreated bone tumor autograft, three-dimensional external fixation, and internal fixation in a patient with primary osteosarcoma: a case report.

    PubMed

    Takeuchi, Akihiko; Yamamoto, Norio; Shirai, Toshiharu; Nishida, Hideji; Hayashi, Katsuhiro; Watanabe, Koji; Miwa, Shinji; Tsuchiya, Hiroyuki

    2015-12-07

    In a previous report, we described a method of reconstruction using tumor-bearing autograft treated by liquid nitrogen for malignant bone tumor. Here we present the first case of bone deformity correction following a tumor-bearing frozen autograft via three-dimensional computerized reconstruction after multiple surgeries. A 16-year-old female student presented with pain in the left lower leg and was diagnosed with a low-grade central tibial osteosarcoma. Surgical bone reconstruction was performed using a tumor-bearing frozen autograft. Bone union was achieved at 7 months after the first surgical procedure. However, local tumor recurrence and lung metastases occurred 2 years later, at which time a second surgical procedure was performed. Five years later, the patient developed a 19° varus deformity and underwent a third surgical procedure, during which an osteotomy was performed using the Taylor Spatial Frame three-dimensional external fixation technique. A fourth corrective surgical procedure was performed in which internal fixation was achieved with a locking plate. Two years later, and 10 years after the initial diagnosis of tibial osteosarcoma, the bone deformity was completely corrected, and the patient's limb function was good. We present the first report in which a bone deformity due to a primary osteosarcoma was corrected using a tumor-bearing frozen autograft, followed by multiple corrective surgical procedures that included osteotomy, three-dimensional external fixation, and internal fixation.

  6. Multiple-Event Seismic Location Using the Markov-Chain Monte Carlo Technique

    NASA Astrophysics Data System (ADS)

    Myers, S. C.; Johannesson, G.; Hanley, W.

    2005-12-01

    We develop a new multiple-event location algorithm (MCMCloc) that utilizes the Markov-Chain Monte Carlo (MCMC) method. Unlike most inverse methods, the MCMC approach produces a suite of solutions, each of which is consistent with observations and prior estimates of data and model uncertainties. Model parameters in MCMCloc consist of event hypocenters and travel-time predictions. Data are arrival time measurements and phase assignments. Posterior estimates of event locations, path corrections, pick errors, and phase assignments are made through analysis of the posterior suite of acceptable solutions. Prior uncertainty estimates include correlations between travel-time predictions, correlations between measurement errors, the probability of misidentifying one phase for another, and the probability of spurious data. Inclusion of prior constraints on location accuracy allows direct utilization of ground-truth locations or well-constrained location parameters (e.g. from InSAR) that aid in the accuracy of the solution. Implementation of a correlation structure for travel-time predictions allows MCMCloc to operate over arbitrarily large geographic areas. Transition in behavior between a multiple-event locator for tightly clustered events and a single-event locator for solitary events is controlled by the spatial correlation of travel-time predictions. We test the MCMC locator on a regional data set of Nevada Test Site nuclear explosions. Event locations and origin times are known for these events, allowing us to test the features of MCMCloc using a high-quality ground truth data set. Preliminary tests suggest that MCMCloc provides excellent relative locations, often outperforming traditional multiple-event location algorithms, and excellent absolute locations are attained when constraints from one or more ground truth event are included. When phase assignments are switched, we find that MCMCloc properly corrects the error when predicted arrival times are separated by several seconds. In cases where the predicted arrival times are within the combined uncertainty of prediction and measurement errors, MCMCloc determines the probability of one or the other phase assignment and propagates this uncertainty into all model parameters. We find that MCMCloc is a promising method for simultaneously locating large, geographically distributed data sets. Because we incorporate prior knowledge on many parameters, MCMCloc is ideal for combining trusted data with data of unknown reliability. This work was performed under the auspices of the U.S. Department of Energy by the University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48, Contribution UCRL-ABS-215048
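    A bare-bones sketch of MCMC event location in the spirit described above: a random-walk Metropolis sampler draws from the posterior of a single event's epicenter and origin time given arrival times at a few stations, assuming a homogeneous velocity model, Gaussian pick errors, and flat priors. The real MCMCloc handles multiple events, correlated travel-time predictions, and phase-assignment uncertainty; every number below (stations, velocity, noise) is invented for illustration.

      # Random-walk Metropolis sampling of a single event's (x, y, t0) from arrival times.
      import numpy as np

      rng = np.random.default_rng(7)
      stations = np.array([[0, 0], [100, 0], [0, 100], [100, 100], [50, 120]], float)  # km
      v, sigma = 6.0, 0.15                       # km/s, pick error in seconds

      true_xy, true_t0 = np.array([62.0, 41.0]), 10.0
      arrivals = true_t0 + np.linalg.norm(stations - true_xy, axis=1) / v
      arrivals += rng.normal(0, sigma, arrivals.size)

      def log_post(x, y, t0):
          pred = t0 + np.linalg.norm(stations - np.array([x, y]), axis=1) / v
          return -0.5 * np.sum((arrivals - pred) ** 2) / sigma**2   # flat priors assumed

      samples, cur = [], np.array([50.0, 50.0, 8.0])
      cur_lp = log_post(*cur)
      for _ in range(20000):
          prop = cur + rng.normal(0, [1.0, 1.0, 0.1])
          prop_lp = log_post(*prop)
          if np.log(rng.uniform()) < prop_lp - cur_lp:
              cur, cur_lp = prop, prop_lp
          samples.append(cur.copy())
      samples = np.array(samples[5000:])                            # drop burn-in
      print("posterior mean (x, y, t0):", samples.mean(axis=0).round(2))
      print("posterior std  (x, y, t0):", samples.std(axis=0).round(2))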

  7. Evaluation of forensic genetic parameters of 12 STR loci in the Korean population using the InvestigatorⓇ HDplex kit.

    PubMed

    Jung, Ju Yeon; Kim, Eun Hye; Oh, Yu-Li; Park, Hyun-Chul; Hwang, Jung Ho; Lim, Si-Keun

    2017-09-01

    We genotyped and calculated the forensic parameters of 10 non-CODIS loci and 2 CODIS loci of 990 Korean individuals using the InvestigatorⓇ HDplex kit. No significant deviations from Hardy-Weinberg equilibrium (after Bonferroni correction for multiple testing) or genetic linkage disequilibrium were observed. The calculated matching probability and power of discrimination ranged from 0.0080 to 0.2014 and from 0.7986 to 0.9920, respectively. We conclude that the markers of the kit are highly informative corroborative tools for forensic DNA analysis.
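
    As a simple illustration of the Bonferroni step mentioned above, the sketch below compares per-locus Hardy-Weinberg exact-test p-values against the corrected threshold 0.05/12; the locus labels and p-values are invented placeholders, not the study's results.

    ```python
    # Hedged sketch: Bonferroni-adjusted screening of per-locus HWE p-values (placeholder values).
    p_values = {"locus_01": 0.012, "locus_02": 0.48, "locus_03": 0.91, "locus_04": 0.034,
                "locus_05": 0.27, "locus_06": 0.66, "locus_07": 0.08, "locus_08": 0.53,
                "locus_09": 0.19, "locus_10": 0.74, "locus_11": 0.41, "locus_12": 0.22}

    alpha = 0.05
    threshold = alpha / len(p_values)        # Bonferroni-corrected per-locus level (0.05/12)
    for locus, p in p_values.items():
        status = "possible deviation" if p < threshold else "consistent with HWE"
        print(f"{locus}: p = {p:.3f} -> {status} (threshold {threshold:.4f})")
    ```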

  8. [Evidence-based pedagogical principles used at medical specialist workshop].

    PubMed

    Holm, Ellen Astrid; Rosholm, Jens Ulrik; Mørch, Marianne Metz

    2010-09-27

    How should a theoretical postgraduate course be organized to obtain maximum effect? We report an example of a two-day course planned and implemented according to educational approaches previously shown to be effective. The theme of the course is "The old patient", and the course is compulsory for residents in internal medicine. This case study showed that the methods used were feasible, and the participants gained knowledge. A multiple-choice test before and after the course showed 44% (before) and 64% (after) correct answers, p < 0.001.

  9. Avoiding false discoveries in association studies.

    PubMed

    Sabatti, Chiara

    2007-01-01

    We consider the problem of controlling false discoveries in association studies. We assume that the design of the study is adequate so that the "false discoveries" are potentially due only to random chance, not to confounding or other flaws. Under this premise, we review the statistical framework for hypothesis testing and correction for multiple comparisons. We consider in detail the currently accepted strategies in linkage analysis. We then examine the underlying similarities and differences between linkage and association studies and document some of the most recent methodological developments for association mapping.

  10. The need for precise and well-documented experimental data on prompt fission neutron spectra from neutron-induced fission of 239Pu

    DOE PAGES

    Neudecker, Denise; Taddeucci, Terry Nicholas; Haight, Robert Cameron; ...

    2016-01-06

    The spectrum of neutrons emitted promptly after 239Pu(n,f), the so-called prompt fission neutron spectrum (PFNS), is a quantity of high interest, for instance, for reactor physics and global security. However, there are only a few experimental data sets available that are suitable for evaluations. In addition, some of those data sets differ by more than their 1-σ uncertainty boundaries. We present the results of MCNP studies indicating that these differences are partly caused by underestimated multiple scattering contributions, over-corrected background, and inconsistent deconvolution methods. A detailed uncertainty quantification for suitable experimental data was undertaken including these effects, and test-evaluations were performed with the improved uncertainty information. The test-evaluations illustrate that the inadequately estimated effects and detailed uncertainty quantification have an impact on the evaluated PFNS and associated uncertainties as well as the neutron multiplicity of selected critical assemblies. A summary of data and documentation needs to improve the quality of the experimental database is provided based on the results of simulations and test-evaluations. Furthermore, given the possibly substantial distortion of the PFNS by multiple scattering and background effects, special care should be taken to reduce these effects in future measurements, e.g., by measuring the 239Pu PFNS as a ratio to either the 235U or 252Cf PFNS.

  11. Physical characteristics of experienced and junior open-wheel car drivers.

    PubMed

    Raschner, Christian; Platzer, Hans-Peter; Patterson, Carson

    2013-01-01

    Despite the popularity of open-wheel car racing, scientific literature about the physical characteristics of competitive race car drivers is scarce. The purpose of this study was to compare selected fitness parameters of experienced and junior open-wheel race car drivers. The experienced drivers consisted of five Formula One, two GP2 and two Formula 3 drivers, and the nine junior drivers drove in the Formula Master, Koenig, BMW and Renault series. The following fitness parameters were tested: multiple reactions, multiple anticipation, postural stability, isometric upper body strength, isometric leg extension strength, isometric grip strength, cyclic foot speed and jump height. The group differences were calculated using the Mann-Whitney U-test. Because of the multiple testing strategy used, the statistical significance was Bonferroni corrected and set at P < 0.004. Significant differences between the experienced and junior drivers were found only for the jump height parameter (P = 0.002). The experienced drivers tended to perform better in leg strength (P = 0.009), cyclic foot speed (P = 0.024) and grip strength (P = 0.058). None of the other variables differed between the groups. The results suggested that the experienced drivers were significantly more powerful than the junior drivers: they tended to be quicker and stronger (18% to 25%) but without statistical significance. The experienced drivers demonstrated excellent strength and power compared with other high-performance athletes.
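
    The comparison strategy described above can be sketched as follows, with simulated scores standing in for the measured fitness parameters; the corrected significance level of P < 0.004 is taken directly from the abstract.

    ```python
    # Mann-Whitney U tests across several parameters with a Bonferroni-corrected alpha (simulated data).
    import numpy as np
    from scipy.stats import mannwhitneyu

    rng = np.random.default_rng(1)
    parameters = ["reaction", "anticipation", "posture", "upper_body_strength",
                  "leg_extension", "grip_strength", "foot_speed", "jump_height"]

    experienced = {p: rng.normal(105, 10, 9) for p in parameters}   # 9 experienced drivers
    juniors = {p: rng.normal(100, 10, 9) for p in parameters}       # 9 junior drivers

    alpha_corrected = 0.004    # Bonferroni-corrected level reported in the abstract
    for p in parameters:
        stat, pval = mannwhitneyu(experienced[p], juniors[p], alternative="two-sided")
        print(f"{p:20s} U={stat:5.1f}  p={pval:.3f}  significant={pval < alpha_corrected}")
    ```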

  12. Modeling Photo-multiplier Gain and Regenerating Pulse Height Data for Application Development

    NASA Astrophysics Data System (ADS)

    Aspinall, Michael D.; Jones, Ashley R.

    2018-01-01

    Systems that adopt organic scintillation detector arrays often require a calibration process prior to the intended measurement campaign to correct for significant performance variances between detectors within the array. These differences exist because of low tolerances associated with photo-multiplier tube technology and environmental influences. Differences in detector response can be corrected for by adjusting the supplied photo-multiplier tube voltage to control its gain and the effect that this has on the pulse height spectra from a gamma-only calibration source with a defined photo-peak. Automated methods that analyze these spectra and adjust the photo-multiplier tube bias accordingly are emerging for hardware that integrates acquisition electronics and high voltage control. However, development of such algorithms requires access to the hardware, multiple detectors and a calibration source for prolonged periods, all with associated constraints and risks. In this work, we report on a software function and related models developed to rescale and regenerate pulse height data acquired from a single scintillation detector. Such a function could be used to generate significant and varied pulse height data that can be used to integration-test algorithms that are capable of automatically response-matching multiple detectors using pulse height spectrum analysis. Furthermore, a function of this sort removes the dependence on multiple detectors, digital analyzers and a calibration source. Results show a good match between the real and regenerated pulse height data. The function has also been used successfully to develop auto-calibration algorithms.
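
    A minimal sketch of the rescale-and-regenerate idea is given below, assuming a synthetic pulse-height sample with a single photo-peak and a simple multiplicative gain model; the paper's actual gain model and detector data are not reproduced.

    ```python
    # Rescale pulse heights from one reference detector by a relative gain and re-histogram them,
    # emulating the spectrum another detector (or another PMT bias) would produce. Illustration only.
    import numpy as np

    rng = np.random.default_rng(2)

    # Stand-in for measured pulse heights (arbitrary units) from a gamma-only calibration source.
    reference_pulses = np.concatenate([rng.exponential(0.3, 20000),     # low-energy continuum
                                       rng.normal(1.0, 0.05, 5000)])    # photo-peak

    def regenerate(pulses, gain_ratio, smear=0.02, bins=256, vmax=2.0):
        """Rescale pulse heights by a relative gain and add a small resolution smear."""
        scaled = pulses * gain_ratio * rng.normal(1.0, smear, pulses.size)
        hist, edges = np.histogram(scaled, bins=bins, range=(0.0, vmax))
        return hist, edges

    spectrum_low, _ = regenerate(reference_pulses, gain_ratio=0.8)
    spectrum_high, _ = regenerate(reference_pulses, gain_ratio=1.2)

    # Locate the photo-peak above the low-energy continuum (channels above 50).
    print("photo-peak channel at gain 0.8:", 50 + int(spectrum_low[50:].argmax()))
    print("photo-peak channel at gain 1.2:", 50 + int(spectrum_high[50:].argmax()))
    ```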

  13. Multiple ligand simultaneous docking: orchestrated dancing of ligands in binding sites of protein.

    PubMed

    Li, Huameng; Li, Chenglong

    2010-07-30

    Present docking methodologies simulate only a single ligand at a time during the docking process. In reality, the molecular recognition process always involves multiple molecular species. Typical protein-ligand interactions are, for example, substrate and cofactor in a catalytic cycle; metal ion coordination together with ligand(s); and ligand binding with water molecules. To simulate the real molecular binding processes, we propose a novel multiple ligand simultaneous docking (MLSD) strategy, which can deal with all the above processes, vastly improving docking sampling and binding free energy scoring. The work also compares two search strategies: Lamarckian genetic algorithm and particle swarm optimization, which have respective advantages depending on the specific systems. The methodology proves robust through systematic testing against several diverse model systems: E. coli purine nucleoside phosphorylase (PNP) complex with two substrates, SHP2 N-SH2 complex with two peptides and Bcl-xL complex with ABT-737 fragments. In all cases, the final correct docking poses and relative binding free energies were obtained. In the PNP case, the simulations also capture the binding intermediates and reveal the binding dynamics during the recognition processes, which are consistent with the proposed enzymatic mechanism. In the other two cases, conventional single-ligand docking fails due to energetic and dynamic coupling among ligands, whereas MLSD results in the correct binding modes. These three cases also represent potential applications in the areas of exploring enzymatic mechanisms, interpreting noisy X-ray crystallographic maps, and aiding fragment-based drug design, respectively. 2010 Wiley Periodicals, Inc.

  14. Patterns of human papillomavirus types in multiple infections: an analysis in women and men of the high throughput human papillomavirus monitoring study.

    PubMed

    Vaccarella, Salvatore; Söderlund-Strand, Anna; Franceschi, Silvia; Plummer, Martyn; Dillner, Joakim

    2013-01-01

    To evaluate the pattern of co-infection of human papillomavirus (HPV) types in both sexes in Sweden. Cell samples from genital swabs, first-void urine, and genital swabs immersed in first-void urine were collected in the present cross-sectional High Throughput HPV Monitoring study. Overall, 31,717 samples from women and 9,949 from men (mean age 25) were tested for 16 HPV types using mass spectrometry. Multilevel logistic regression was used to estimate the expected number of multiple infections with specific HPV types, adjusted for age, type of sample, and accounting for correlations between HPV types due to unobserved risk factors using sample-level random effects. Bonferroni correction was used to allow for multiple comparisons (120). Observed-to-expected ratio for any multiple infections was slightly above unity in both sexes, but, for most 2-type combinations, there was no evidence of significant departure from expected numbers. HPV6/18 was found more often and HPV51/68 and 6/68 less often than expected. However, HPV68 tended to be generally underrepresented in co-infections, suggesting a sub-optimal performance of our testing method for this HPV type. We found no evidence for positive or negative clustering between HPV types included in the current prophylactic vaccines and other untargeted oncogenic types, in either sex.
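
    The observed-to-expected logic can be illustrated with the simplified, independence-based sketch below on simulated infection indicators; the study itself used multilevel logistic regression with sample-level random effects, which is not reproduced here.

    ```python
    # Observed vs. expected pairwise co-infection counts with a Bonferroni-corrected test level.
    import numpy as np
    from itertools import combinations
    from scipy.stats import binomtest

    rng = np.random.default_rng(3)
    n_samples = 5000
    prev = {"HPV6": 0.04, "HPV16": 0.07, "HPV18": 0.03, "HPV51": 0.05, "HPV68": 0.02}  # assumed prevalences
    data = {t: rng.random(n_samples) < p for t, p in prev.items()}

    pairs = list(combinations(prev, 2))
    alpha = 0.05 / len(pairs)                 # Bonferroni over all 2-type combinations
    for a, b in pairs:
        observed = int(np.sum(data[a] & data[b]))
        p_joint = prev[a] * prev[b]           # expected co-infection probability under independence
        expected = p_joint * n_samples
        pval = binomtest(observed, n_samples, p_joint).pvalue
        print(f"{a}/{b}: O={observed:3d}  E={expected:5.1f}  O/E={observed / expected:.2f}  "
              f"significant at corrected level: {pval < alpha}")
    ```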

  15. Logarithmic r-θ mapping for hybrid optical neural network filter for multiple objects recognition within cluttered scenes

    NASA Astrophysics Data System (ADS)

    Kypraios, Ioannis; Young, Rupert C. D.; Chatwin, Chris R.; Birch, Phil M.

    2009-04-01

    The window unit in the design of the complex logarithmic r-θ mapping for hybrid optical neural network filter can allow multiple objects of the same class to be detected within the input image. Additionally, the architecture of the neural network unit of the complex logarithmic r-θ mapping for hybrid optical neural network filter becomes attractive for accommodating the recognition of multiple objects of different classes within the input image by modifying the output layer of the unit. We test the overall filter for the recognition of multiple objects of the same and of different classes within cluttered input images and video sequences of cluttered scenes. The logarithmic r-θ mapping for hybrid optical neural network filter is shown to exhibit, with a single pass over the input data, simultaneous in-plane rotation, out-of-plane rotation, scale, log r-θ map translation and shift invariance, and good clutter tolerance by recognizing correctly the different objects within the cluttered scenes. We record in our results additional extracted information from the cluttered scenes about the objects' relative position, scale and in-plane rotation.

  16. The role of appraisal and coping style in relation with societal participation in fatigued patients with multiple sclerosis: a cross-sectional multiple mediator analysis.

    PubMed

    van den Akker, Lizanne Eva; Beckerman, Heleen; Collette, Emma Hubertine; Bleijenberg, Gijs; Dekker, Joost; Knoop, Hans; de Groot, Vincent

    2016-10-01

    To determine the relationship between appraisal and societal participation in fatigued patients with Multiple Sclerosis (MS), and whether this relation is mediated by coping styles. 265 severely-fatigued MS patients. Appraisal, a latent construct, was created from the General Self-Efficacy Scale and the helplessness and acceptance subscales of the Illness Cognition Questionnaire. Coping styles were assessed using the Coping Inventory for Stressful Situations (CISS21) and societal participation was assessed using the Impact on Participation and Autonomy questionnaire. A multiple mediator model was developed and tested by structural equation modeling on cross-sectional data. We corrected for confounding by disease-related factors. Mediation was determined using a product-of-coefficients approach. A significant relationship existed between appraisal and participation (β = 0.21, 95 % CI 0.04-0.39). The pathways via coping styles were not significant. In patients with severe MS-related fatigue, appraisal and societal participation show a positive relationship that is not mediated by coping styles.

  17. The ranking probability approach and its usage in design and analysis of large-scale studies.

    PubMed

    Kuo, Chia-Ling; Zaykin, Dmitri

    2013-01-01

    In experiments with many statistical tests there is need to balance type I and type II error rates while taking multiplicity into account. In the traditional approach, the nominal [Formula: see text]-level such as 0.05 is adjusted by the number of tests, [Formula: see text], i.e., as 0.05/[Formula: see text]. Assuming that some proportion of tests represent "true signals", that is, originate from a scenario where the null hypothesis is false, power depends on the number of true signals and the respective distribution of effect sizes. One way to define power is for it to be the probability of making at least one correct rejection at the assumed [Formula: see text]-level. We advocate an alternative way of establishing how "well-powered" a study is. In our approach, useful for studies with multiple tests, the ranking probability [Formula: see text] is controlled, defined as the probability of making at least [Formula: see text] correct rejections while rejecting hypotheses with the [Formula: see text] smallest P-values. The two approaches are statistically related. The probability that the smallest P-value is a true signal (i.e., [Formula: see text]) is equal to the power at the level [Formula: see text], to an excellent approximation. Ranking probabilities are also related to the false discovery rate and to the Bayesian posterior probability of the null hypothesis. We study properties of our approach when the effect size distribution is replaced for convenience by a single "typical" value taken to be the mean of the underlying distribution. We conclude that its performance is often satisfactory under this simplification; however, substantial imprecision is to be expected when [Formula: see text] is very large and [Formula: see text] is small. Precision is largely restored when three values with the respective abundances are used instead of a single typical effect size value.
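
    The quantity "power at the level alpha divided by the number of tests" referred to above can be computed under a normal approximation as in the sketch below; the number of tests and the standardized effect size are purely illustrative.

    ```python
    # Power of a single two-sided z-test at the Bonferroni-adjusted level alpha/m.
    from scipy.stats import norm

    alpha, m = 0.05, 1_000_000       # nominal level and number of tests (illustrative)
    delta = 5.5                      # assumed standardized effect size (non-centrality)

    z_crit = norm.ppf(1 - alpha / (2 * m))                 # two-sided critical value at alpha/m
    power = norm.sf(z_crit - delta) + norm.cdf(-z_crit - delta)
    print(f"critical z at alpha/m: {z_crit:.2f}, approximate power: {power:.3f}")
    ```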

  18. “The Relationship between Executive Functioning, Processing Speed and White Matter Integrity in Multiple Sclerosis”

    PubMed Central

    Genova, Helen M.; DeLuca, John; Chiaravalloti, Nancy; Wylie, Glenn

    2014-01-01

    The primary purpose of the current study was to examine the relationship between performance on executive tasks and white matter integrity, assessed by diffusion tensor imaging (DTI) in Multiple Sclerosis (MS). A second aim was to examine how processing speed affects the relationship between executive functioning and FA. This relationship was examined in two executive tasks that rely heavily on processing speed: the Color-Word Interference Test and Trail-Making Test (Delis-Kaplan Executive Function System). It was hypothesized that reduced fractional anisotropy (FA) is related to poor performance on executive tasks in MS, but that this relationship would be affected by the statistical correction of processing speed from the executive tasks. 15 healthy controls and 25 persons with MS participated. Regression analyses were used to examine the relationship between executive functioning and FA, both before and after processing speed was removed from the executive scores. Before processing speed was removed from the executive scores, reduced FA was associated with poor performance on Color-Word Interference Test and Trail-Making Test in a diffuse network including corpus callosum and superior longitudinal fasciculus. However, once processing speed was removed, the relationship between executive functions and FA was no longer significant on the Trail Making test, and significantly reduced and more localized on the Color-Word Interference Test. PMID:23777468
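
    The "statistical correction of processing speed" step can be sketched as a residualization: regress the executive score on the processing-speed measure and relate the residuals to FA. The sketch below uses simulated data and ordinary least squares; it is not the study's analysis code.

    ```python
    # Residualize an executive score on processing speed, then correlate with FA (simulated data).
    import numpy as np

    rng = np.random.default_rng(4)
    n = 40
    speed = rng.normal(50, 10, n)                      # processing-speed measure
    executive = 0.8 * speed + rng.normal(0, 5, n)      # executive score partly driven by speed
    fa = 0.02 * speed + rng.normal(0.5, 0.05, n)       # fractional anisotropy

    # Ordinary least squares of executive score on speed; keep the residuals.
    X = np.column_stack([np.ones(n), speed])
    beta, *_ = np.linalg.lstsq(X, executive, rcond=None)
    executive_resid = executive - X @ beta

    print("r(FA, executive)          =", round(float(np.corrcoef(fa, executive)[0, 1]), 3))
    print("r(FA, executive residual) =", round(float(np.corrcoef(fa, executive_resid)[0, 1]), 3))
    ```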

  19. Comparison of answer-until-correct and full-credit assessments in a team-based learning course.

    PubMed

    Farland, Michelle Z; Barlow, Patrick B; Levi Lancaster, T; Franks, Andrea S

    2015-03-25

    To assess the impact of awarding partial credit to team assessments on team performance and on quality of team interactions using an answer-until-correct method compared to traditional methods of grading (multiple-choice, full-credit). Subjects were students from 3 different offerings of an ambulatory care elective course, taught using team-based learning. The control group (full-credit) consisted of those enrolled in the course when traditional methods of assessment were used (2 course offerings). The intervention group consisted of those enrolled in the course when answer-until-correct method was used for team assessments (1 course offering). Study outcomes included student performance on individual and team readiness assurance tests (iRATs and tRATs), individual and team final examinations, and student assessment of quality of team interactions using the Team Performance Scale. Eighty-four students enrolled in the courses were included in the analysis (full-credit, n=54; answer-until-correct, n=30). Students who used traditional methods of assessment performed better on iRATs (full-credit mean 88.7 (5.9), answer-until-correct mean 82.8 (10.7), p<0.001). Students who used answer-until-correct method of assessment performed better on the team final examination (full-credit mean 45.8 (1.5), answer-until-correct 47.8 (1.4), p<0.001). There was no significant difference in performance on tRATs and the individual final examination. Students who used the answer-until-correct method had higher quality of team interaction ratings (full-credit 97.1 (9.1), answer-until-correct 103.0 (7.8), p=0.004). Answer-until-correct assessment method compared to traditional, full-credit methods resulted in significantly lower scores for iRATs, similar scores on tRATs and individual final examinations, improved scores on team final examinations, and improved perceptions of the quality of team interactions.

  20. The Rise and Fall of Boot Camps: A Case Study in Common-Sense Corrections

    ERIC Educational Resources Information Center

    Cullen, Francis T.; Blevins, Kristie R.; Trager, Jennifer S.; Gendreau, Paul

    2005-01-01

    "Common sense" is often used as a powerful rationale for implementing correctional programs that have no basis in criminology and virtually no hope of reducing recidivism. Within this context, we undertake a case study in "common-sense' corrections by showing how the rise of boot camps, although having multiple causes, was ultimately legitimized…

  1. The parietal memory network activates similarly for true and associative false recognition elicited via the DRM procedure.

    PubMed

    McDermott, Kathleen B; Gilmore, Adrian W; Nelson, Steven M; Watson, Jason M; Ojemann, Jeffrey G

    2017-02-01

    Neuroimaging investigations of human memory encoding and retrieval have revealed that multiple regions of parietal cortex contribute to memory. Recently, a sparse network of regions within parietal cortex has been identified using resting state functional connectivity MRI techniques. The regions within this network exhibit consistent task-related responses during memory formation and retrieval, leading to its being called the parietal memory network (PMN). Among its signature patterns are: deactivation during initial experience with an item (e.g., encoding); activation during subsequent repetitions (e.g., at retrieval); greater activation for successfully retrieved familiar words than novel words (e.g., hits relative to correctly-rejected lures). The question of interest here is whether novel words that are subjectively experienced as having been recently studied would elicit PMN activation similar to that of hits. That is, we compared old items correctly recognized to two types of novel items on a recognition test: those correctly identified as new and those incorrectly labeled as old due to their strong associative relation to the studied words (in the DRM false memory protocol). Subjective oldness plays a strong role in driving activation, as hits and false alarms activated similarly (and greater than correctly-rejected lures). Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Orbit correction in a linear nonscaling fixed field alternating gradient accelerator

    DOE PAGES

    Kelliher, D. J.; Machida, S.; Edmonds, C. S.; ...

    2014-11-20

    In a linear non-scaling FFAG the large natural chromaticity of the machine results in a betatron tune that varies by several integers over the momentum range. In addition, orbit correction is complicated by the consequent variation of the phase advance between lattice elements. Here we investigate how the correction of multiple closed orbit harmonics allows correction of both the closed orbit distortion (COD) and the accelerated orbit distortion over the momentum range.

  3. ICESat-2 laser Nd:YVO4 amplifier

    NASA Astrophysics Data System (ADS)

    Sawruk, Nicholas W.; Burns, Patrick M.; Edwards, Ryan E.; Litvinovitch, Viatcheslav; Martin, Nigel; Witt, Greg; Fakhoury, Elias; Iskander, John; Pronko, Mark S.; Troupaki, Elisavet; Bay, Michael M.; He, Charles C.; Wang, Liqin L.; Cavanaugh, John F.; Farrokh, Babak; Salem, Jonathan A.; Baker, Eric

    2018-02-01

    We report on the cause and corrective actions of three amplifier crystal fractures in the space-qualified laser systems used in NASA Goddard Space Flight Center's (GSFC) Ice, Cloud, and Land Elevation Satellite-2 (ICESat-2). The ICESat-2 lasers each contain three end-pumped Nd:YVO4 amplifier stages. The crystals are clamped between two gold plated copper heat spreaders with an indium foil thermal interface material, and the crystal fractures occurred after multiple years of storage and over a year of operational run-time. The primary contributors are high compressive loading of the Nd:YVO4 crystals at the beginning of life, a time dependent crystal stress caused by an intermetallic reaction of the gold plating and indium, and slow crack growth resulting in a reduction in crystal strength over time. An updated crystal mounting scheme was designed, analyzed, fabricated and tested. The fracture slab failure analysis, finite-element modeling and corrective actions are presented.

  4. Characterizations of double pulsing in neutron multiplicity and coincidence counting systems

    DOE PAGES

    Koehler, Katrina E.; Henzl, Vladimir; Croft, Stephen; ...

    2016-06-29

    Passive neutron coincidence/multiplicity counters are subject to non-ideal behavior, such as double pulsing and dead time. It has been shown in the past that double-pulsing exhibits a distinct signature in a Rossi-alpha distribution, which is not readily noticed using traditional Multiplicity Shift Register analysis. However, it has been assumed that the use of a pre-delay in shift register analysis removes any effects of double pulsing. Here, we use high-fidelity simulations accompanied by experimental measurements to study the effects of double pulsing on multiplicity rates. By exploiting the information from the double pulsing signature peak observable in the Rossi-alpha distribution, the double pulsing fraction can be determined. Algebraic correction factors for the multiplicity rates in terms of the double pulsing fraction have been developed. We also discuss the role of these corrections across a range of scenarios.

  5. Rapid Vision Correction by Special Operations Forces.

    PubMed

    Reynolds, Mark E

    This report describes a rapid method of vision correction used by Special Operations Medics in multiple operational engagements. Between 2011 and 2015, Special Operations Medics used an algorithm-driven refraction technique. A standard block of instruction was provided to the medics, along with a packaged kit. The technique was used in multiple operational engagements with host nation military and civilians. Data collected for program evaluation were later analyzed to assess the utility of the technique. Glasses were distributed to 230 patients with complaints of decreased distance or near (reading) vision. Most patients (84%) with distance complaints achieved corrected binocular vision of 20/40 or better, and 97% of patients with near-vision complaints achieved corrected near-binocular vision of 20/40 or better. There was no statistically significant difference between the percentages of patients achieving 20/40 when medics used the technique under direct supervision versus independent use. A basic refraction technique using a designed kit allows for meaningful improvement in distance and/or near vision at austere locations. Special Operations Medics can leverage this approach after specific training with minimal time commitment. It can serve as a rapid, effective intervention with multiple applications in diverse operational environments.

  6. The cost of large numbers of hypothesis tests on power, effect size and sample size.

    PubMed

    Lazzeroni, L C; Ray, A

    2012-01-01

    Advances in high-throughput biology and computer science are driving an exponential increase in the number of hypothesis tests in genomics and other scientific disciplines. Studies using current genotyping platforms frequently include a million or more tests. In addition to the monetary cost, this increase imposes a statistical cost owing to the multiple testing corrections needed to avoid large numbers of false-positive results. To safeguard against the resulting loss of power, some have suggested sample sizes on the order of tens of thousands that can be impractical for many diseases or may lower the quality of phenotypic measurements. This study examines the relationship between the number of tests on the one hand and power, detectable effect size or required sample size on the other. We show that once the number of tests is large, power can be maintained at a constant level, with comparatively small increases in the effect size or sample size. For example at the 0.05 significance level, a 13% increase in sample size is needed to maintain 80% power for ten million tests compared with one million tests, whereas a 70% increase in sample size is needed for 10 tests compared with a single test. Relative costs are less when measured by increases in the detectable effect size. We provide an interactive Excel calculator to compute power, effect size or sample size when comparing study designs or genome platforms involving different numbers of hypothesis tests. The results are reassuring in an era of extreme multiple testing.
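
    The trade-off described above can be reproduced approximately with a two-sided z-test sample-size formula, in which the required n scales with (z(1 - alpha/2m) + z(power))^2 and the effect size cancels out of the ratio between two numbers of tests. The sketch below is an approximation, not the paper's calculator.

    ```python
    # Relative increase in sample size needed to keep power fixed when the number of tests grows.
    from scipy.stats import norm

    def inflation(m_new, m_old, alpha=0.05, power=0.80):
        """Ratio of required sample sizes for m_new vs. m_old two-sided z-tests at fixed power."""
        z_b = norm.ppf(power)
        z_new = norm.ppf(1 - alpha / (2 * m_new))
        z_old = norm.ppf(1 - alpha / (2 * m_old))
        return ((z_new + z_b) / (z_old + z_b)) ** 2

    print("10 tests vs 1 test:      ", round(inflation(10, 1), 2))                   # roughly 1.7
    print("10 million vs 1 million: ", round(inflation(10_000_000, 1_000_000), 2))   # roughly 1.13
    ```

    Under this approximation the two ratios come out close to the 70% and 13% increases quoted in the abstract.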

  7. Paced Auditory Serial Addition Test (PASAT 3.0 s): Demographically corrected norms for the Portuguese population.

    PubMed

    Sousa, Claudia Sofia; Neves, Mariana Rigueiro; Passos, Ana Margarida; Ferreira, Aristides; Sá, Maria José

    2017-05-23

    The main goal of this study was to produce adjusted normative data for the Portuguese population on the Paced Auditory Serial Addition Test (PASAT 3.0 s), the version used in the Brief Repeatable Battery of Neuropsychological Tests developed by the National Multiple Sclerosis Society. The study included 326 community-dwelling individuals (199 women and 127 men) aged between 20 and 70 (mean = 40.33, SD = 14.40), who had educational backgrounds ranging from 4 to 23 years of schooling (mean = 12.28, SD = 4.39). Age, gender and education were associated with differences in performance on the PASAT 3.0 s. Men had significantly better performance on the PASAT 3.0 s than women, even though this represents a small effect size, r = 0.18. Demographically corrected normative data were developed and important information regarding performance on the PASAT 3.0 s test is provided. Results are discussed and presented in tables, and a formula is presented for computing age-, gender- and education-adjusted T-scores for performance on the PASAT 3.0 s. These results should be considered as useful reference values for clinicians and investigators when applying the PASAT 3.0 s to assess cognitive functions such as information processing speed in different pathologies.
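
    The published regression formula and coefficients are not quoted in the abstract; the snippet below is a hypothetical illustration of how such a demographic adjustment is typically structured (predicted score from a regression on age, gender and education, then a T-score from the residual). All coefficients are invented.

    ```python
    # Hypothetical regression-based demographic correction (invented coefficients, illustration only).
    def adjusted_t_score(raw, age, years_edu, male,
                         b0=35.0, b_age=-0.15, b_edu=0.9, b_male=2.0, sd_resid=8.0):
        """Convert a raw PASAT 3.0 s score into a demographically adjusted T-score."""
        predicted = b0 + b_age * age + b_edu * years_edu + b_male * male
        return 50 + 10 * (raw - predicted) / sd_resid

    print(round(adjusted_t_score(raw=45, age=38, years_edu=12, male=0), 1))
    ```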

  8. Sharply curved turn around duct flow predictions using spectral partitioning of the turbulent kinetic energy and a pressure modified wall law

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    1986-01-01

    Computational predictions of turbulent flow in sharply curved 180 degree turn around ducts are presented. The CNS2D computer code is used to solve the equations of motion for two-dimensional incompressible flows transformed to a nonorthogonal body-fitted coordinate system. This procedure incorporates the pressure velocity correction algorithm SIMPLE-C to iteratively solve a discretized form of the transformed equations. A multiple scale turbulence model based on simplified spectral partitioning is employed to obtain closure. Flow field predictions utilizing the multiple scale model are compared to features predicted by the traditional single scale k-epsilon model. Tuning parameter sensitivities of the multiple scale model applied to turn around duct flows are also determined. In addition, a wall function approach based on a wall law suitable for incompressible turbulent boundary layers under strong adverse pressure gradients is tested. Turn around duct flow characteristics utilizing this modified wall law are presented and compared to results based on a standard wall treatment.

  9. True color scanning laser ophthalmoscopy and optical coherence tomography handheld probe

    PubMed Central

    LaRocca, Francesco; Nankivil, Derek; Farsiu, Sina; Izatt, Joseph A.

    2014-01-01

    Scanning laser ophthalmoscopes (SLOs) are able to achieve superior contrast and axial sectioning capability compared to fundus photography. However, SLOs typically use monochromatic illumination and are thus unable to extract color information of the retina. Previous color SLO imaging techniques utilized multiple lasers or narrow band sources for illumination, which allowed for multiple color but not “true color” imaging as done in fundus photography. We describe the first “true color” SLO, handheld color SLO, and combined color SLO integrated with a spectral domain optical coherence tomography (OCT) system. To achieve accurate color imaging, the SLO was calibrated with a color test target and utilized an achromatizing lens when imaging the retina to correct for the eye’s longitudinal chromatic aberration. Color SLO and OCT images from volunteers were then acquired simultaneously with a combined power under the ANSI limit. Images from this system were then compared with those from commercially available SLOs featuring multiple narrow-band color imaging. PMID:25401032

  10. 49 CFR 40.203 - What problems cause a drug test to be cancelled unless they are corrected?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    What problems cause a drug test to be cancelled unless they are corrected? (a) As the MRO, when a... the CCF, you must cancel the test unless the flaw is corrected. (d) The following are correctable...

  11. Design, development, and testing of the DCT Cassegrain instrument support assembly

    NASA Astrophysics Data System (ADS)

    Bida, Thomas A.; Dunham, Edward W.; Nye, Ralph A.; Chylek, Tomas; Oliver, Richard C.

    2012-09-01

    The 4.3m Discovery Channel Telescope delivers an f/6.1 unvignetted 0.5° field to its RC focal plane. In order to support guiding, wavefront sensing, and instrument installations, a Cassegrain instrument support assembly has been developed which includes a facility guider and wavefront sensor package (GWAVES) and multiple interfaces for instrumentation. A 2-element, all-spherical, fused-silica corrector compensates for field curvature and astigmatism over the 0.5° FOV, while reducing ghost pupil reflections to minimal levels. Dual roving GWAVES camera probes pick off stars in the outer annulus of the corrected field, providing simultaneous guiding and wavefront sensing for telescope operations. The instrument cube supports 5 co-mounted instruments with rapid feed selection via deployable fold mirrors. The corrected beam passes through a dual filter wheel before imaging with the 6K x 6K single CCD of the Large Monolithic Imager (LMI). We describe key development strategies for the DCT Cassegrain instrument assembly and GWAVES, including construction of a prime focus test assembly with wavefront sensor utilized in fall 2011 to begin characterization of the DCT primary mirror support. We also report on 2012 on-sky test results of wavefront sensing, guiding, and imaging with the integrated Cassegrain cube.

  12. Design and implementation of a low-cost multiple-range digital phase detector

    NASA Astrophysics Data System (ADS)

    Omran, Hesham; Albasha, Lutfi; Al-Ali, A. R.

    2012-06-01

    This article describes the design, simulation, implementation and testing of a novel low-cost multiple-range programmable digital phase detector. The detector receives two periodic signals and calculates the ratio of the time difference to the time period to measure and display the phase difference. The resulting output values are in integer form ranging from -180° to 180°. Users can select the detector pre-set operation frequency ranges using a three-bit pre-scalar. This enables to use the detector for various applications. The proposed detector can be programmed over a frequency range of 10 Hz to 25 kHz by configuring its clock divider circuit. Detector simulations were conducted and verified using ModelSim and the design was implemented and tested using an Altera Cyclone II field-programmable gate array board. Both the simulation and actual circuit testing results showed that the phase detector has a magnitude of error of only 1°. The detector is ideal for applications such as power factor measurement and correction, self-tuning resonant circuits and in metal detection systems. Unlike other stand-alone phase detection systems, the reported system has the ability to be programmed to several frequency ranges, hence expanding its bandwidth.
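
    The measurement principle, the ratio of the time offset between the two signals to the signal period scaled to degrees and folded into the -180° to 180° range, can be expressed compactly as below; the FPGA implementation itself is not reproduced.

    ```python
    # Phase difference as the ratio of time offset to period, wrapped to [-180, 180] degrees.
    def phase_degrees(delta_t, period):
        """Return the integer phase difference in degrees, wrapped to [-180, 180]."""
        phase = round(360.0 * delta_t / period)
        return ((phase + 180) % 360) - 180

    print(phase_degrees(delta_t=2.5e-3, period=20e-3))   # 45 degrees at 50 Hz
    print(phase_degrees(delta_t=15e-3, period=20e-3))    # 270 degrees wraps to -90
    ```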

  13. A genotyping protocol for multiple tissue types from the polyploid tree species Sequoia sempervirens (Cupressaceae)1

    PubMed Central

    Narayan, Lakshmi; Dodd, Richard S.; O’Hara, Kevin L.

    2015-01-01

    Premise of the study: Identifying clonal lineages in asexually reproducing plants using microsatellite markers is complicated by the possibility of nonidentical genotypes from the same clonal lineage due to somatic mutations, null alleles, and scoring errors. We developed and tested a clonal identification protocol that is robust to these issues for the asexually reproducing hexaploid tree species coast redwood (Sequoia sempervirens). Methods: Microsatellite data from four previously published and two newly developed primers were scored using a modified protocol, and clones were identified using Bruvo genetic distances. The effectiveness of this clonal identification protocol was assessed using simulations and by genotyping a test set of paired samples of different tissue types from the same trees. Results: Data from simulations showed that our protocol allowed us to accurately identify clonal lineages. Multiple test samples from the same trees were identified correctly, although certain tissue type pairs had larger genetic distances on average. Discussion: The methods described in this paper will allow for the accurate identification of coast redwood clones, facilitating future studies of the reproductive ecology of this species. The techniques used in this paper can be applied to studies of other clonal organisms as well. PMID:25798341

  14. A genotyping protocol for multiple tissue types from the polyploid tree species Sequoia sempervirens (Cupressaceae).

    PubMed

    Narayan, Lakshmi; Dodd, Richard S; O'Hara, Kevin L

    2015-03-01

    Identifying clonal lineages in asexually reproducing plants using microsatellite markers is complicated by the possibility of nonidentical genotypes from the same clonal lineage due to somatic mutations, null alleles, and scoring errors. We developed and tested a clonal identification protocol that is robust to these issues for the asexually reproducing hexaploid tree species coast redwood (Sequoia sempervirens). Microsatellite data from four previously published and two newly developed primers were scored using a modified protocol, and clones were identified using Bruvo genetic distances. The effectiveness of this clonal identification protocol was assessed using simulations and by genotyping a test set of paired samples of different tissue types from the same trees. Data from simulations showed that our protocol allowed us to accurately identify clonal lineages. Multiple test samples from the same trees were identified correctly, although certain tissue type pairs had larger genetic distances on average. The methods described in this paper will allow for the accurate identification of coast redwood clones, facilitating future studies of the reproductive ecology of this species. The techniques used in this paper can be applied to studies of other clonal organisms as well.

  15. Assessing group differences in biodiversity by simultaneously testing a user-defined selection of diversity indices.

    PubMed

    Pallmann, Philip; Schaarschmidt, Frank; Hothorn, Ludwig A; Fischer, Christiane; Nacke, Heiko; Priesnitz, Kai U; Schork, Nicholas J

    2012-11-01

    Comparing diversities between groups is a task biologists are frequently faced with, for example in ecological field trials or when dealing with metagenomics data. However, researchers often waver about which measure of diversity to choose as there is a multitude of approaches available. As Jost (2008, Molecular Ecology, 17, 4015) has pointed out, widely used measures such as the Shannon or Simpson index have undesirable properties which make them hard to compare and interpret. Many of the problems associated with the use of these 'raw' indices can be corrected by transforming them into 'true' diversity measures. We introduce a technique that allows the comparison of two or more groups of observations while simultaneously testing a user-defined selection of 'true' diversity measures. This procedure yields multiplicity-adjusted P-values according to the method of Westfall and Young (1993, Resampling-Based Multiple Testing: Examples and Methods for p-Value Adjustment, 49, 941), which ensures that the rate of false positives (type I error) does not rise when the number of groups and/or diversity indices is extended. Software is available in the R package 'simboot'. © 2012 Blackwell Publishing Ltd.
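
    The Westfall-Young adjustment is implemented for diversity indices in the R package 'simboot'; the sketch below shows only the generic single-step maxT resampling idea in Python on simulated data, not the package's procedure.

    ```python
    # Single-step maxT permutation adjustment (Westfall-Young style) for a family of group comparisons.
    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(5)
    n_indices, n_per_group, n_perm = 4, 20, 2000

    group_a = rng.normal(0.0, 1.0, (n_per_group, n_indices))
    group_b = rng.normal(0.4, 1.0, (n_per_group, n_indices))    # small shift in every index

    t_obs = np.abs(ttest_ind(group_a, group_b, axis=0).statistic)

    pooled = np.vstack([group_a, group_b])
    max_t_null = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(pooled)                           # permute group labels
        t_perm = np.abs(ttest_ind(perm[:n_per_group], perm[n_per_group:], axis=0).statistic)
        max_t_null[i] = t_perm.max()

    # Adjusted p-value: how often the permutation maximum exceeds each observed statistic.
    p_adjusted = (max_t_null[:, None] >= t_obs[None, :]).mean(axis=0)
    print("multiplicity-adjusted p-values:", np.round(p_adjusted, 3))
    ```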

  16. Comparison and testing of extended Kalman filters for attitude estimation of the Earth radiation budget satellite

    NASA Technical Reports Server (NTRS)

    Deutschmann, Julie; Bar-Itzhack, Itzhack Y.; Rokni, Mohammad

    1990-01-01

    The testing and comparison of two Extended Kalman Filters (EKFs) developed for the Earth Radiation Budget Satellite (ERBS) is described. One EKF updates the attitude quaternion using a four component additive error quaternion. This technique is compared to that of a second EKF, which uses a multiplicative error quaternion. A brief development of the multiplicative algorithm is included. The mathematical development of the additive EKF was presented in the 1989 Flight Mechanics/Estimation Theory Symposium along with some preliminary testing results using real spacecraft data. A summary of the additive EKF algorithm is included. The convergence properties, singularity problems, and normalization techniques of the two filters are addressed. Both filters are also compared to those from the ERBS operational ground support software, which uses a batch differential correction algorithm to estimate attitude and gyro biases. Sensitivity studies are performed on the estimation of sensor calibration states. The potential application of the EKF for real time and non-real time ground attitude determination and sensor calibration for future missions such as the Gamma Ray Observatory (GRO) and the Small Explorer Mission (SMEX) is also presented.

  17. Participation in Pre-High School Football and Neurological, Neuroradiological, and Neuropsychological Findings in Later Life: A Study of 45 Retired National Football League Players.

    PubMed

    Solomon, Gary S; Kuhn, Andrew W; Zuckerman, Scott L; Casson, Ira R; Viano, David C; Lovell, Mark R; Sills, Allen K

    2016-05-01

    A recent study found that an earlier age of first exposure (AFE) to tackle football was associated with long-term neurocognitive impairment in retired National Football League (NFL) players. To assess the association between years of exposure to pre-high school football (PreYOE) and neuroradiological, neurological, and neuropsychological outcome measures in a different sample of retired NFL players. Cross-sectional study; Level of evidence, 3. Forty-five former NFL players were included in this study. All participants prospectively completed extensive history taking, a neurological examination, brain magnetic resonance imaging, and a comprehensive battery of neuropsychological tests. To measure the associations between PreYOE and these outcome measures, multiple regression models were utilized while controlling for several covariates. After applying a Bonferroni correction for multiple comparisons, none of the neurological, neuroradiological, or neuropsychological outcome measures yielded a significant relationship with PreYOE. A second Bonferroni-corrected analysis of a subset of these athletes with self-reported learning disability yielded no significant relationships on paper-and-pencil neurocognitive tests but did result in a significant association between learning disability and computerized indices of visual motor speed and reaction time. The current study failed to replicate the results of a prior study, which concluded that an earlier AFE to tackle football might result in long-term neurocognitive deficits. In 45 retired NFL athletes, there were no associations between PreYOE and neuroradiological, neurological, and neuropsychological outcome measures. © 2016 The Author(s).

  18. Evaluating the utility of mid-infrared spectral subspaces for predicting soil properties.

    PubMed

    Sila, Andrew M; Shepherd, Keith D; Pokhariyal, Ganesh P

    2016-04-15

    We propose four methods for finding local subspaces in large spectral libraries. The proposed four methods include (a) cosine angle spectral matching; (b) hit quality index spectral matching; (c) self-organizing maps and (d) archetypal analysis methods. We then evaluate prediction accuracies for global and subspace calibration models. These methods were tested on a mid-infrared spectral library containing 1907 soil samples collected from 19 different countries under the Africa Soil Information Service project. Calibration models for pH, Mehlich-3 Ca, Mehlich-3 Al, total carbon and clay soil properties were developed for the whole library and for the subspaces. Root mean square error of prediction was used to evaluate the predictive performance of subspace and global models. The root mean square error of prediction was computed using a one-third-holdout validation set. The effect of pretreating spectra with different methods was tested for the 1st and 2nd derivative Savitzky-Golay algorithm, multiplicative scatter correction, standard normal variate and standard normal variate followed by detrending methods. In summary, the results show that global models outperformed the subspace models. We therefore conclude that global models are more accurate than the local models except in a few cases. For instance, sand and clay root mean square error values from local models obtained with the archetypal analysis method were 50% poorer than those of the global models, except for subspace models obtained using multiplicative scatter corrected spectra, which were 12% better. However, the subspace approach provides novel methods for discovering data patterns that may exist in large spectral libraries.
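
    Two small pieces of this workflow, standard normal variate (SNV) pretreatment and the root mean square error of prediction (RMSEP) used to compare models, can be sketched as follows on simulated spectra; the subspace-selection methods themselves are not reproduced.

    ```python
    # SNV pretreatment of spectra and RMSEP computation (simulated data, illustration only).
    import numpy as np

    rng = np.random.default_rng(6)
    spectra = rng.normal(1.0, 0.1, (10, 1700))          # 10 spectra x 1700 wavenumber channels

    def snv(x):
        """Standard normal variate: centre and scale each spectrum individually."""
        return (x - x.mean(axis=1, keepdims=True)) / x.std(axis=1, keepdims=True)

    def rmsep(y_true, y_pred):
        """Root mean square error of prediction on a hold-out set."""
        y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
        return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

    spectra_snv = snv(spectra)
    print("per-spectrum means after SNV:", np.round(spectra_snv.mean(axis=1)[:3], 6))
    print("RMSEP example:", round(rmsep([6.1, 5.4, 7.2], [5.9, 5.6, 7.0]), 3))
    ```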

  19. An approach to holistically assess (dairy) farm eco-efficiency by combining Life Cycle Analysis with Data Envelopment Analysis models and methodologies.

    PubMed

    Soteriades, A D; Faverdin, P; Moreau, S; Charroin, T; Blanchard, M; Stott, A W

    2016-11-01

    Eco-efficiency is a useful guide to dairy farm sustainability analysis aimed at increasing output (physical or value added) and minimizing environmental impacts (EIs). Widely used partial eco-efficiency ratios (EIs per some functional unit, e.g. kg milk) can be problematic because (i) substitution possibilities between EIs are ignored, (ii) multiple ratios can complicate decision making and (iii) EIs are not usually associated with just the functional unit in the ratio's denominator. The objective of this study was to demonstrate a 'global' eco-efficiency modelling framework dealing with issues (i) to (iii) by combining Life Cycle Analysis (LCA) data and the multiple-input, multiple-output production efficiency method Data Envelopment Analysis (DEA). With DEA each dairy farm's outputs and LCA-derived EIs are aggregated into a single, relative, bounded, dimensionless eco-efficiency score, thus overcoming issues (i) to (iii). A novelty of this study is that a model providing a number of additional desirable properties was employed, known as the Range Adjusted Measure (RAM) of inefficiency. These properties altogether make RAM advantageous over other DEA models and are as follows. First, RAM is able to simultaneously minimize EIs and maximize outputs. Second, it indicates which EIs and/or outputs contribute the most to a farm's eco-inefficiency. Third it can be used to rank farms in terms of eco-efficiency scores. Thus, non-parametric rank tests can be employed to test for significant differences in terms of eco-efficiency score ranks between different farm groups. An additional DEA methodology was employed to 'correct' the farms' eco-efficiency scores for inefficiencies attributed to managerial factors. By removing managerial inefficiencies it was possible to detect differences in eco-efficiency between farms solely attributed to uncontrollable factors such as region. Such analysis is lacking in previous dairy studies combining LCA with DEA. RAM and the 'corrective' methodology were demonstrated with LCA data from French specialized dairy farms grouped by region (West France, Continental France) and feeding strategy (regardless of region). Mean eco-efficiency score ranks were significantly higher for farms with 30% maize in the total forage area before correcting for managerial inefficiencies. Mean eco-efficiency score ranks were higher for West than Continental farms, but significantly higher only after correcting for managerial inefficiencies. These results helped identify the eco-efficiency potential of each region and feeding strategy and could therefore aid advisors and policy makers at farm or region/sector level. The proposed framework helped better measure and understand (dairy) farm eco-efficiency, both within and between different farm groups.

  20. Universal opt-out screening for hepatitis C virus (HCV) within correctional facilities is an effective intervention to improve public health.

    PubMed

    Morris, Meghan D; Brown, Brandon; Allen, Scott A

    2017-09-11

    Purpose: Worldwide efforts to identify individuals infected with the hepatitis C virus (HCV) focus almost exclusively on community healthcare systems, thereby failing to reach high-risk populations and those with poor access to primary care. In the USA, community-based HCV testing policies and guidelines overlook correctional facilities, where HCV rates are believed to be as high as 40 percent. This is a missed opportunity: more than ten million Americans move through correctional facilities each year. Herein, the purpose of this paper is to examine HCV testing practices in the US correctional system, California and describe how universal opt-out HCV testing could expand early HCV detection, improve public health in correctional facilities and communities, and prove cost-effective over time. Design/methodology/approach: A commentary on the value of standardizing screening programs across facilities by mandating all facilities (universal) to implement opt-out testing policies for all prisoners upon entry to the correctional facilities. Findings: Current variability in facility-level testing programs results in inconsistent testing levels across correctional facilities, and therefore makes estimating the actual number of HCV-infected adults in the USA difficult. The authors argue that universal opt-out testing policies ensure earlier diagnosis of HCV among a population most affected by the disease and are more cost-effective than selective testing policies. Originality/value: The commentary explores the current limitations of selective testing policies in correctional systems and provides recommendations and implications for public health and correctional organizations.

  1. LCC demons with divergence term for liver MRI motion correction

    NASA Astrophysics Data System (ADS)

    Oh, Jihun; Martin, Diego; Skrinjar, Oskar

    2010-03-01

    Contrast-enhanced liver MR image sequences acquired at multiple times before and after contrast administration have been shown to be critically important for the diagnosis and monitoring of liver tumors and may be used for the quantification of liver inflammation and fibrosis. However, over multiple acquisitions, the liver moves and deforms due to patient and respiratory motion. In order to analyze contrast agent uptake one first needs to correct for liver motion. In this paper we present a method for the motion correction of dynamic contrast-enhanced liver MR images. For this purpose we use a modified version of the Local Correlation Coefficient (LCC) Demons non-rigid registration method. Since the liver is nearly incompressible its displacement field has small divergence. For this reason we add a divergence term to the energy that is minimized in the LCC Demons method. We applied the method to four sequences of contrast-enhanced liver MR images. Each sequence had a pre-contrast scan and seven post-contrast scans. For each post-contrast scan we corrected for the liver motion relative to the pre-contrast scan. Quantitative evaluation showed that the proposed method improved the liver alignment relative to the non-corrected and translation-corrected scans and visual inspection showed no visible misalignment of the motion-corrected contrast-enhanced scans and the pre-contrast scan.
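
    The divergence term can be illustrated in isolation: for a 2-D displacement field the penalty is the squared divergence summed over the image, which discourages local volume change in the nearly incompressible liver. The sketch below computes only this term on a random field; the full LCC Demons energy and its optimization are not reproduced.

    ```python
    # Divergence of a 2-D displacement field and the associated quadratic penalty (illustration only).
    import numpy as np

    rng = np.random.default_rng(7)
    ux = rng.normal(0, 0.5, (64, 64))      # x-component of the displacement field (voxels)
    uy = rng.normal(0, 0.5, (64, 64))      # y-component

    div = np.gradient(ux, axis=1) + np.gradient(uy, axis=0)    # du_x/dx + du_y/dy
    lambda_div = 0.1                                            # assumed weight of the divergence term
    divergence_penalty = lambda_div * float(np.sum(div ** 2))

    print("mean |divergence|:", round(float(np.abs(div).mean()), 4))
    print("penalty added to the registration energy:", round(divergence_penalty, 2))
    ```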

  2. Calibration and correction procedures for cosmic-ray neutron soil moisture probes located across Australia

    NASA Astrophysics Data System (ADS)

    Hawdon, Aaron; McJannet, David; Wallace, Jim

    2014-06-01

    The cosmic-ray probe (CRP) provides continuous estimates of soil moisture over an area of ˜30 ha by counting fast neutrons produced from cosmic rays which are predominantly moderated by water molecules in the soil. This paper describes the setup, measurement correction procedures, and field calibration of CRPs at nine locations across Australia with contrasting soil type, climate, and land cover. These probes form the inaugural Australian CRP network, which is known as CosmOz. CRP measurements require neutron count rates to be corrected for effects of atmospheric pressure, water vapor pressure changes, and variations in incoming neutron intensity. We assess the magnitude and importance of these corrections and present standardized approaches for network-wide analysis. In particular, we present a new approach to correct for incoming neutron intensity variations and test its performance against existing procedures used in other studies. Our field calibration results indicate that a generalized calibration function for relating neutron counts to soil moisture is suitable for all soil types, with the possible exception of very sandy soils with low water content. Using multiple calibration data sets, we demonstrate that the generalized calibration function only applies after accounting for persistent sources of hydrogen in the soil profile. Finally, we demonstrate that by following standardized correction procedures and scaling neutron counting rates of all CRPs to a single reference location, differences in calibrations between sites are related to site biomass. This observation provides a means for estimating biomass at a given location or for deriving coefficients for the calibration function in the absence of field calibration data.
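
    The three corrections have a commonly used multiplicative form, sketched below with typical literature coefficients for the barometric and water-vapour factors; these values and the simple intensity ratio are stand-ins, not the paper's revised intensity-correction approach.

    ```python
    # Multiplicative correction of raw neutron counts to reference atmospheric conditions.
    import math

    def correct_counts(n_raw, pressure, p_ref, humidity_abs, humidity_ref,
                       intensity, intensity_ref, beta=0.0076, a=0.0054):
        """Return neutron counts corrected for pressure, water vapour and incoming intensity."""
        f_pressure = math.exp(beta * (pressure - p_ref))      # pressure in hPa; beta ~ 1/attenuation length
        f_vapour = 1.0 + a * (humidity_abs - humidity_ref)    # absolute humidity in g/m^3
        f_intensity = intensity_ref / intensity               # neutron-monitor scaling
        return n_raw * f_pressure * f_vapour * f_intensity

    print(round(correct_counts(n_raw=1500, pressure=1002.0, p_ref=1013.25,
                               humidity_abs=12.0, humidity_ref=0.0,
                               intensity=155.0, intensity_ref=150.0), 1))
    ```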

  3. Command and Control Software Development

    NASA Technical Reports Server (NTRS)

    Wallace, Michael

    2018-01-01

    The future of the National Aeronautics and Space Administration (NASA) depends on its innovation and efficiency in the coming years. With ambitious goals to reach Mars and explore the vast universe, correct steps must be taken to ensure our space program reaches its destination safely. The interns in the Exploration Systems and Operations Division at the Kennedy Space Center (KSC) have been tasked with building command line tools to ease the process of managing and testing the data being produced by the ground control systems while its recording system is not in use. While working alongside full-time engineers, we were able to create multiple programs that reduce the cost and time it takes to test the subsystems that launch rockets to outer space.

  4. Combustion of liquid fuels in a flowing combustion gas environment at high pressures

    NASA Technical Reports Server (NTRS)

    Canada, G. S.; Faeth, G. M.

    1975-01-01

    The combustion of fuel droplets in gases which simulate combustion chamber conditions was considered both experimentally and theoretically. The fuel droplets were simulated by porous spheres and allowed to gasify in combustion gases produced by a burner. Tests were conducted for pressures of 1-40 atm, temperatures of 600-1500 K, oxygen concentrations of 0-13% (molar) and approach Reynolds numbers of 40-680. The fuels considered in the tests included methanol, ethanol, propanol-1, n-pentane, n-heptane and n-decane. Measurements were made of both the rate of gasification of the droplet and the liquid surface temperature. Measurements were compared with theory, involving various models of gas phase transport properties with a multiplicative correction for the effect of forced convection.

  5. New Method for the Approximation of Corrected Calcium Concentrations in Chronic Kidney Disease Patients.

    PubMed

    Kaku, Yoshio; Ookawara, Susumu; Miyazawa, Haruhisa; Ito, Kiyonori; Ueda, Yuichirou; Hirai, Keiji; Hoshino, Taro; Mori, Honami; Yoshida, Izumi; Morishita, Yoshiyuki; Tabei, Kaoru

    2016-02-01

    The following conventional calcium correction formula (Payne) is broadly applied for serum calcium estimation: corrected total calcium (TCa) (mg/dL) = TCa (mg/dL) + (4 - albumin (g/dL)); however, it is inapplicable to chronic kidney disease (CKD) patients. A total of 2503 venous samples were collected from 942 all-stage CKD patients, and levels of TCa (mg/dL), ionized calcium ([iCa(2+)], mmol/L), phosphate (mg/dL), albumin (g/dL), pH, and other clinical parameters were measured. We assumed corrected TCa (the gold standard) to be equal to eight times the iCa(2+) value (measured corrected TCa). We then performed stepwise multiple linear regression analysis using the clinical parameters and derived a simple formula for corrected TCa approximation: approximated corrected TCa (mg/dL) = TCa + 0.25 × (4 - albumin) + 4 × (7.4 - pH) + 0.1 × (6 - phosphate) + 0.3. Receiver operating characteristic curve analysis illustrated that the areas under the curve of approximated corrected TCa for detection of measured corrected TCa ≥ 8.4 mg/dL and ≤ 10.4 mg/dL were 0.994 and 0.919, respectively. The intraclass correlation coefficient demonstrated superior agreement using this new formula compared with other formulas (new formula: 0.826, Payne: 0.537, Jain: 0.312, Portale: 0.582, Ferrari: 0.362). In CKD patients, TCa correction should include not only albumin but also pH and phosphate. The approximated corrected TCa from this formula demonstrates superior agreement with the measured corrected TCa in comparison to other formulas. © 2016 International Society for Apheresis, Japanese Society for Apheresis, and Japanese Society for Dialysis Therapy.
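
    The approximation above can be applied directly as a small calculation. Below is a minimal Python sketch of the formula quoted in the abstract; the function and variable names are illustrative choices, not part of the original study.

      def approx_corrected_tca(tca_mg_dl, albumin_g_dl, ph, phosphate_mg_dl):
          """Approximated corrected total calcium (mg/dL), per the formula quoted above:

          corrected TCa = TCa + 0.25*(4 - albumin) + 4*(7.4 - pH) + 0.1*(6 - phosphate) + 0.3
          """
          return (tca_mg_dl
                  + 0.25 * (4.0 - albumin_g_dl)
                  + 4.0 * (7.4 - ph)
                  + 0.1 * (6.0 - phosphate_mg_dl)
                  + 0.3)

      # Example: TCa 8.0 mg/dL, albumin 3.0 g/dL, pH 7.30, phosphate 5.5 mg/dL
      print(round(approx_corrected_tca(8.0, 3.0, 7.30, 5.5), 2))  # 9.0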

  6. Item Reliabilities for a Family of Answer-Until-Correct (AUC) Scoring Rules.

    ERIC Educational Resources Information Center

    Kane, Michael T.; Moloney, James M.

    The Answer-Until-Correct (AUC) procedure has been proposed in order to increase the reliability of multiple-choice items. A model for examinees' behavior when they must respond to each item until they answer it correctly is presented. An expression for the reliability of AUC items, as a function of the characteristics of the item and the scoring…

  7. Color correction with blind image restoration based on multiple images using a low-rank model

    NASA Astrophysics Data System (ADS)

    Li, Dong; Xie, Xudong; Lam, Kin-Man

    2014-03-01

    We present a method that can handle the color correction of multiple photographs with blind image restoration simultaneously and automatically. We prove that the local colors of a set of images of the same scene exhibit the low-rank property locally both before and after a color-correction operation. This property allows us to correct all kinds of errors in an image under a low-rank matrix model without particular priors or assumptions. The possible errors may be caused by changes of viewpoint, large illumination variations, gross pixel corruptions, partial occlusions, etc. Furthermore, a new iterative soft-segmentation method is proposed for local color transfer using color influence maps. Due to the fact that the correct color information and the spatial information of images can be recovered using the low-rank model, more precise color correction and many other image-restoration tasks, including image denoising, image deblurring, and gray-scale image colorizing, can be performed simultaneously. Experiments have verified that our method can achieve consistent and promising results on uncontrolled real photographs acquired from the Internet and that it outperforms current state-of-the-art methods.

  8. 49 CFR 40.205 - How are drug test problems corrected?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...), you must try to correct the problem promptly, if doing so is practicable. You may conduct another... (49 CFR § 40.205, Procedures for Transportation Workplace Drug and Alcohol Testing Programs, Problems in Drug Tests: How are drug test problems corrected?)

  9. Modified Hitschfeld-Bordan Equations for Attenuation-Corrected Radar Rain Reflectivity: Application to Nonuniform Beamfilling at Off-Nadir Incidence

    NASA Technical Reports Server (NTRS)

    Meneghini, Robert; Liao, Liang

    2013-01-01

    As shown by Takahashi et al., multiple path attenuation estimates over the field of view of an airborne or spaceborne weather radar are feasible for off-nadir incidence angles. This follows from the fact that the surface reference technique, which provides path attenuation estimates, can be applied to each radar range gate that intersects the surface. This study builds on this result by showing that three of the modified Hitschfeld-Bordan estimates for the attenuation-corrected radar reflectivity factor can be generalized to the case where multiple path attenuation estimates are available, thereby providing a correction to the effects of nonuniform beamfilling. A simple simulation is presented showing some strengths and weaknesses of the approach.

  10. Integrated data analysis for genome-wide research.

    PubMed

    Steinfath, Matthias; Repsilber, Dirk; Scholz, Matthias; Walther, Dirk; Selbig, Joachim

    2007-01-01

    Integrated data analysis is introduced as the intermediate level of a systems biology approach to analyse different 'omics' datasets, i.e., genome-wide measurements of transcripts, protein levels or protein-protein interactions, and metabolite levels aiming at generating a coherent understanding of biological function. In this chapter we focus on different methods of correlation analyses ranging from simple pairwise correlation to kernel canonical correlation which were recently applied in molecular biology. Several examples are presented to illustrate their application. The input data for this analysis frequently originate from different experimental platforms. Therefore, preprocessing steps such as data normalisation and missing value estimation are inherent to this approach. The corresponding procedures, potential pitfalls and biases, and available software solutions are reviewed. The multiplicity of observations obtained in omics-profiling experiments necessitates the application of multiple testing correction techniques.

  11. Measuring technical and mathematical investigation of multiple reignitions at the switching of a motor using vacuum circuit breakers

    NASA Astrophysics Data System (ADS)

    Luxa, Andreas

    The necessary conditions in the switching system and the vacuum circuit breaker for the occurrence of multiple reignitions, together with the accompanying effects, were examined. The shape of the occurring voltages was determined in relation to other types of overvoltage. A phenomenological model of the arc, based on an extension of the Mayr equation, was used with the simulation program NETOMAC to compute the switching transients. Factors which affect the arc parameters were analyzed. The results were statistically verified by 3000 three-phase switching tests on 3 standard vacuum circuit breakers under realistic system conditions; the occurring overvoltage level was measured. Dimensioning criteria for motor simulation circuits in power plants were formulated on the basis of a theoretical equivalence analysis and experimental studies. The simulation model allows a sufficiently correct estimation of all effects.
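
    The arc model referred to above extends the classical Mayr equation, dg/dt = (g/tau)(u*i/P0 - 1), where g is the arc conductance, u*i the instantaneous arc power, tau the arc time constant, and P0 the cooling power. The sketch below integrates only the classical (unextended) equation for a prescribed arc current; the parameter values and the forward-Euler scheme are illustrative assumptions and are not taken from the study.

      import numpy as np

      def mayr_arc_conductance(i_of_t, tau=1e-6, p_loss=200.0, g0=1e-2, dt=1e-8, n_steps=20000):
          """Integrate the classical Mayr arc equation with forward Euler for a
          prescribed arc current i(t):

              dg/dt = (g / tau) * (u * i / P_loss - 1),   u = i / g

          `tau` and `p_loss` are illustrative placeholders; the study used an
          *extended* Mayr model whose parameters are not given in the abstract.
          """
          g = g0
          trace = np.empty(n_steps)
          for k in range(n_steps):
              i = i_of_t(k * dt)
              u = i / g                                     # arc voltage for the imposed current
              g += dt * (g / tau) * (u * i / p_loss - 1.0)  # Euler step of the Mayr equation
              g = max(g, 1e-12)                             # keep conductance positive
              trace[k] = g
          return trace

      # Example: constant 10 A arc current; g settles near i**2 / P_loss = 0.5 S
      trace = mayr_arc_conductance(lambda t: 10.0)
      print(round(trace[-1], 3))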

  12. Performance of Certification and Recertification Examinees on Multiple Choice Test Items: Does Physician Age Have an Impact?

    PubMed

    Shen, Linjun; Juul, Dorthea; Faulkner, Larry R

    2016-01-01

    The development of recertification programs (now referred to as Maintenance of Certification or MOC) by the members of the American Board of Medical Specialties provides the opportunity to study knowledge base across the professional lifespan of physicians. Research results to date are mixed with some studies finding negative associations between age and various measures of competency and others finding no or minimal relationships. Four groups of multiple choice test items that were independently developed for certification and MOC examinations in psychiatry and neurology were administered to certification and MOC examinees within each specialty. Percent correct scores were calculated for each examinee. Differences between certification and MOC examinees were compared using unpaired t tests, and logistic regression was used to compare MOC and certification examinee performance on the common test items. Except for the neurology certification test items that addressed basic neurology concepts, the performance of the certification and MOC examinees was similar. The differences in performance on individual test items did not consistently favor one group or the other and could not be attributed to any distinguishable content or format characteristics of those items. The findings of this study are encouraging in that physicians who had recently completed residency training possessed clinical knowledge that was comparable to that of experienced physicians, and the experienced physicians' clinical knowledge was equivalent to that of recent residency graduates. The role testing can play in enhancing expertise is described.

  13. Identification and Correction of Additive and Multiplicative Spatial Biases in Experimental High-Throughput Screening.

    PubMed

    Mazoure, Bogdan; Caraus, Iurie; Nadon, Robert; Makarenkov, Vladimir

    2018-06-01

    Data generated by high-throughput screening (HTS) technologies are prone to spatial bias. Traditionally, bias correction methods used in HTS assume either a simple additive or, more recently, a simple multiplicative spatial bias model. These models do not, however, always provide an accurate correction of measurements in wells located at the intersection of rows and columns affected by spatial bias. The measurements in these wells depend on the nature of interaction between the involved biases. Here, we propose two novel additive and two novel multiplicative spatial bias models accounting for different types of bias interactions. We describe a statistical procedure that allows for detecting and removing different types of additive and multiplicative spatial biases from multiwell plates. We show how this procedure can be applied by analyzing data generated by the four HTS technologies (homogeneous, microorganism, cell-based, and gene expression HTS), the three high-content screening (HCS) technologies (area, intensity, and cell-count HCS), and the only small-molecule microarray technology available in the ChemBank small-molecule screening database. The proposed methods are included in the AssayCorrector program, implemented in R, and available on CRAN.
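
    The abstract contrasts simple additive and simple multiplicative row/column bias models. As background, the sketch below removes a purely additive row/column bias from a multiwell plate by median polish and handles the purely multiplicative case by polishing log-transformed values. This is only a generic illustration of those two baseline models; it is not the interaction-aware procedure the authors implement in AssayCorrector, and the plate layout and polish settings are assumptions.

      import numpy as np

      def remove_additive_bias(plate, n_iter=10):
          """Remove additive row/column bias from a multiwell plate by median polish.

          Additive model: x[i, j] = mu + r[i] + c[j] + noise."""
          corrected = plate.astype(float).copy()
          for _ in range(n_iter):
              corrected -= np.median(corrected, axis=1, keepdims=True)  # row effects
              corrected -= np.median(corrected, axis=0, keepdims=True)  # column effects
          return corrected + np.median(plate)   # restore the overall plate level

      def remove_multiplicative_bias(plate, n_iter=10):
          """Same idea for the multiplicative model x[i, j] = mu * r[i] * c[j],
          handled by median-polishing the log-transformed measurements."""
          return np.exp(remove_additive_bias(np.log(plate), n_iter))

      # Example: an 8x12 plate with an artificial additive column drift
      rng = np.random.default_rng(0)
      plate = rng.normal(100, 5, size=(8, 12)) + np.arange(12)
      print(np.median(remove_additive_bias(plate), axis=0).round(1))  # column trend flattened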

  14. 77 FR 20291 - Energy Conservation Program: Test Procedures for Residential Clothes Washers; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-04

    ... Conservation Program: Test Procedures for Residential Clothes Washers; Correction AGENCY: Office of Energy.... Department of Energy (DOE) is correcting a final rule establishing revised test procedures for residential... factor calculation section of the currently applicable test procedure. DATES: Effective: April 6, 2012...

  15. Evaluation of a 3D serious game for advanced life support retraining.

    PubMed

    Buttussi, Fabio; Pellis, Tommaso; Cabas Vidani, Alberto; Pausler, Daniele; Carchietti, Elio; Chittaro, Luca

    2013-09-01

    Advanced life support (ALS) knowledge and skills decrease in as little as three months, but only a few ALS providers actually attend retraining courses. We assess the effectiveness of a 3D serious game as a new tool for frequent ALS retraining. We developed a 3D serious game for scenario-based ALS retraining. The serious game, called EMSAVE, was designed to promote self-correction while playing. We organized a retraining course in which 40 ALS providers played two cardiac arrest scenarios with EMSAVE and took a test with 38 multiple-choice questions before and after playing. We administered the same test again 3 months later to evaluate retention. Participants also rated EMSAVE and the overall retraining experience. After using EMSAVE, the number of correct answers per participant increased by 4.8 (95%CI +3.4, +6.2, p<0.001) and all but one participant improved. After 3 months, despite an expected decrease in ALS knowledge and skills (-1.9 correct answers, 95%CI -0.6, -3.3, p<0.01), there was a significant retention benefit (+2.9 correct answers per participant, 95%CI +1.5, +4.2, p<0.001). Moreover, all but one participant regarded EMSAVE as a valuable tool to refresh ALS knowledge and skills, and 85% of participants were also willing to devote 1h/month to retrain with the serious game. A 3D serious game for scenario-based retraining proved effective to retrain in ALS and supported retention of acquired knowledge and skills at 3 months. EMSAVE also positively engaged and motivated participants. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  16. Magnetic Resonance Fingerprinting of Adult Brain Tumors: Initial Experience

    PubMed Central

    Badve, Chaitra; Yu, Alice; Dastmalchian, Sara; Rogers, Matthew; Ma, Dan; Jiang, Yun; Margevicius, Seunghee; Pahwa, Shivani; Lu, Ziang; Schluchter, Mark; Sunshine, Jeffrey; Griswold, Mark; Sloan, Andrew; Gulani, Vikas

    2016-01-01

    Background: Magnetic resonance fingerprinting (MRF) allows rapid simultaneous quantification of T1 and T2 relaxation times. This study assesses the utility of MRF in differentiating between common types of adult intra-axial brain tumors. Methods: MRF acquisition was performed in 31 patients with untreated intra-axial brain tumors: 17 glioblastomas, 6 WHO grade II lower-grade gliomas and 8 metastases. T1, T2 of the solid tumor (ST), immediate peritumoral white matter (PW), and contralateral white matter (CW) were summarized within each region of interest. Statistical comparisons on mean, standard deviation, skewness and kurtosis were performed using univariate Wilcoxon rank sum test across various tumor types. Bonferroni correction was used to correct for multiple comparisons testing. Multivariable logistic regression analysis was performed for discrimination between glioblastomas and metastases and area under the receiver operator curve (AUC) was calculated. Results: Mean T2 values could differentiate solid tumor regions of lower-grade gliomas from metastases (mean ± SD: 172 ± 53 ms and 105 ± 27 ms, respectively; p = 0.004, significant after Bonferroni correction). Mean T1 of PW surrounding lower-grade gliomas differed from PW around glioblastomas (mean ± SD: 1066 ± 218 ms and 1578 ± 331 ms, respectively; p = 0.004, significant after Bonferroni correction). Logistic regression analysis revealed that mean T2 of ST offered best separation between glioblastomas and metastases with AUC of 0.86 (95% CI 0.69–1.00, p < 0.0001). Conclusion: MRF allows rapid simultaneous T1, T2 measurement in brain tumors and surrounding tissues. MRF-based relaxometry can identify quantitative differences between solid-tumor regions of lower grade gliomas and metastases and between peritumoral regions of glioblastomas and lower grade gliomas. PMID:28034994

  17. MR Fingerprinting of Adult Brain Tumors: Initial Experience.

    PubMed

    Badve, C; Yu, A; Dastmalchian, S; Rogers, M; Ma, D; Jiang, Y; Margevicius, S; Pahwa, S; Lu, Z; Schluchter, M; Sunshine, J; Griswold, M; Sloan, A; Gulani, V

    2017-03-01

    MR fingerprinting allows rapid simultaneous quantification of T1 and T2 relaxation times. This study assessed the utility of MR fingerprinting in differentiating common types of adult intra-axial brain tumors. MR fingerprinting acquisition was performed in 31 patients with untreated intra-axial brain tumors: 17 glioblastomas, 6 World Health Organization grade II lower grade gliomas, and 8 metastases. T1, T2 of the solid tumor, immediate peritumoral white matter, and contralateral white matter were summarized within each ROI. Statistical comparisons on mean, SD, skewness, and kurtosis were performed by using the univariate Wilcoxon rank sum test across various tumor types. Bonferroni correction was used to correct for multiple-comparison testing. Multivariable logistic regression analysis was performed for discrimination between glioblastomas and metastases, and area under the receiver operator curve was calculated. Mean T2 values could differentiate solid tumor regions of lower grade gliomas from metastases (mean, 172 ± 53 ms, and 105 ± 27 ms, respectively; P = .004, significant after Bonferroni correction). The mean T1 of peritumoral white matter surrounding lower grade gliomas differed from peritumoral white matter around glioblastomas (mean, 1066 ± 218 ms, and 1578 ± 331 ms, respectively; P = .004, significant after Bonferroni correction). Logistic regression analysis revealed that the mean T2 of solid tumor offered the best separation between glioblastomas and metastases with an area under the curve of 0.86 (95% CI, 0.69-1.00; P < .0001). MR fingerprinting allows rapid simultaneous T1 and T2 measurement in brain tumors and surrounding tissues. MR fingerprinting-based relaxometry can identify quantitative differences between solid tumor regions of lower grade gliomas and metastases and between peritumoral regions of glioblastomas and lower grade gliomas. © 2017 by American Journal of Neuroradiology.

  18. Visual performance on detection tasks with double-targets of the same and different difficulty.

    PubMed

    Chan, Alan H S; Courtney, Alan J; Ma, C W

    2002-10-20

    This paper reports a study of measurement of horizontal visual sensitivity limits for 16 subjects in single-target and double-targets detection tasks. Two phases of tests were conducted in the double-targets task; targets of the same difficulty were tested in phase one while targets of different difficulty were tested in phase two. The range of sensitivity for the double-targets test was found to be smaller than that for single-target in both the same and different target difficulty cases. The presence of another target was found to affect performance to a marked degree. Interference effect of the difficult target on detection of the easy one was greater than that of the easy one on the detection of the difficult one. Performance decrement was noted when correct percentage detection was plotted against eccentricity of target in both the single-target and double-targets tests. Nevertheless, the non-significant correlation found between the performance for the two tasks demonstrated that it was impossible to predict quantitatively ability for detection of double targets from the data for single targets. This indicated probable problems in generalizing data for single target visual lobes to those for multiple targets. Also lobe area values obtained from measurements using a single-target task cannot be applied in a mathematical model for situations with multiple occurrences of targets.

  19. Statistical technique for analysing functional connectivity of multiple spike trains.

    PubMed

    Masud, Mohammad Shahed; Borisyuk, Roman

    2011-03-15

    A new statistical technique, the Cox method, used for analysing functional connectivity of simultaneously recorded multiple spike trains is presented. This method is based on the theory of modulated renewal processes and it estimates a vector of influence strengths from multiple spike trains (called reference trains) to the selected (target) spike train. Selecting another target spike train and repeating the calculation of the influence strengths from the reference spike trains enables researchers to find all functional connections among multiple spike trains. In order to study functional connectivity an "influence function" is identified. This function recognises the specificity of neuronal interactions and reflects the dynamics of postsynaptic potential. In comparison to existing techniques, the Cox method has the following advantages: it does not use bins (binless method); it is applicable to cases where the sample size is small; it is sufficiently sensitive such that it estimates weak influences; it supports the simultaneous analysis of multiple influences; it is able to identify a correct connectivity scheme in difficult cases of "common source" or "indirect" connectivity. The Cox method has been thoroughly tested using multiple sets of data generated by the neural network model of the leaky integrate and fire neurons with a prescribed architecture of connections. The results suggest that this method is highly successful for analysing functional connectivity of simultaneously recorded multiple spike trains. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. Relativistic corrections to the multiple scattering effect on the Sunyaev-Zel'dovich effect in the isotropic approximation

    NASA Astrophysics Data System (ADS)

    Itoh, Naoki; Kawana, Youhei; Nozawa, Satoshi; Kohyama, Yasuharu

    2001-10-01

    We extend the formalism for the calculation of the relativistic corrections to the Sunyaev-Zel'dovich effect for clusters of galaxies and include the multiple scattering effects in the isotropic approximation. We present the results of the calculations by the Fokker-Planck expansion method as well as by the direct numerical integration of the collision term of the Boltzmann equation. The multiple scattering contribution is found to be very small compared with the single scattering contribution. For high-temperature galaxy clusters of k_B T_e ~ 15 keV, the ratio of both the contributions is -0.2 per cent in the Wien region. In the Rayleigh-Jeans region the ratio is -0.03 per cent. Therefore the multiple scattering contribution is safely neglected for the observed galaxy clusters.

  1. Training Correctional Educators: A Needs Assessment Study.

    ERIC Educational Resources Information Center

    Jurich, Sonia; Casper, Marta; Hull, Kim A.

    2001-01-01

    Focus groups and a training needs survey of Virginia correctional educators identified educational philosophy, communication skills, human behavior, and teaching techniques as topics of interest. Classroom observations identified additional areas: teacher isolation, multiple challenges, absence of grade structure, and safety constraints. (Contains…

  2. Assessment of representational competence in kinematics

    NASA Astrophysics Data System (ADS)

    Klein, P.; Müller, A.; Kuhn, J.

    2017-06-01

    A two-tier instrument for representational competence in the field of kinematics (KiRC) is presented, designed for a standard (1st year) calculus-based introductory mechanics course. It comprises 11 multiple choice (MC) and 7 multiple true-false (MTF) questions involving multiple representational formats, such as graphs, pictures, and formal (mathematical) expressions (1st tier). Furthermore, students express their answer confidence for selected items, providing additional information (2nd tier). Measurement characteristics of KiRC were assessed in a validation sample (pre- and post-test, N = 83 and N = 46, respectively), including usefulness for measuring learning gain. Validity is checked by interviews and by benchmarking KiRC against related measures. Values for item difficulty, discrimination, and consistency are in the desired ranges; in particular, a good reliability was obtained (KR-20 = 0.86). Confidence intervals were computed, and a replication study yielded values within them. For practical and research purposes, KiRC as a diagnostic tool goes beyond related extant instruments both in the representational formats (e.g., mathematical expressions) and in the scope of content covered (e.g., choice of coordinate systems). Together with the satisfactory psychometric properties, it appears a versatile and reliable tool for assessing students' representational competency in kinematics (and its potential change). Confidence judgments add further information to the diagnostic potential of the test, in particular for representational misconceptions. Moreover, we present an analytic result for the question, arising from guessing correction or educational considerations, of how the total effect size (Cohen's d) varies upon combination of two test components with known individual effect sizes, and then discuss the results in the case of KiRC (MC and MTF combination). The introduced method of test combination analysis can be applied to any test comprising two components for the purpose of finding effect size ranges.

  3. 49 CFR 40.205 - How are drug test problems corrected?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    49 CFR § 40.205 (Procedures for Transportation Workplace Drug and Alcohol Testing Programs, Problems in Drug Tests): How are drug test problems corrected? (a) As a collector, you have the...

  4. Why We (Usually) Don't Have to Worry about Multiple Comparisons

    ERIC Educational Resources Information Center

    Gelman, Andrew; Hill, Jennifer; Yajima, Masanao

    2012-01-01

    Applied researchers often find themselves making statistical inferences in settings that would seem to require multiple comparisons adjustments. We challenge the Type I error paradigm that underlies these corrections. Moreover we posit that the problem of multiple comparisons can disappear entirely when viewed from a hierarchical Bayesian…

  5. The ‘Pokemon’ (ZBTB7) Gene: No Evidence of Association with Sporadic Breast Cancer

    PubMed Central

    Salas, Antonio; Vega, Ana; Milne, Roger L.; García-Magariños, Manuel; Ruibal, Álvaro; Benítez, Javier; Carracedo, Ángel

    2008-01-01

    It has been proposed that the excess of familial risk associated with breast cancer could be explained by the cumulative effect of multiple weakly predisposing alleles. The transcriptional repressor FBI1, also known as Pokemon, has recently been identified as a critical factor in oncogenesis. This protein is encoded by the ZBTB7 gene. Here we aimed to determine whether polymorphisms in ZBTB7 are associated with breast cancer risk in a sample of cases and controls collected in hospitals from North and Central Spanish patients. We genotyped 15 SNPs in ZBTB7, including the flanking regions, with an average coverage of 1 SNP/2.4 Kb, in 360 sporadic breast cancer cases and 402 controls. Comparison of allele, genotype and haplotype frequencies between cases and controls did not reveal associations using Pearson's chi-square test and a permutation procedure to correct for multiple testing. In this, the first study of the ZBTB7 gene in relation to sporadic breast cancer, we found no evidence of an association. PMID:21892298

  6. Signal Detection and Frame Synchronization of Multiple Wireless Networking Waveforms

    DTIC Science & Technology

    2007-09-01

    Convolutional forward error correction coding, punctured to obtain coding rates of 2/3 and 3/4, is used to detect and correct bit errors, which are likely to be isolated and therefore correctable by the convolutional decoder. A shortened Reed-Solomon technique is employed ahead of the binary convolutional code, with the code shortened depending upon the data rate; the waveform's rate set is specified in terms of data rate (Mbps), modulation, coding rate, and coded bits per subcarrier.

  7. Improved Statistical Methods Enable Greater Sensitivity in Rhythm Detection for Genome-Wide Data

    PubMed Central

    Hutchison, Alan L.; Maienschein-Cline, Mark; Chiang, Andrew H.; Tabei, S. M. Ali; Gudjonson, Herman; Bahroos, Neil; Allada, Ravi; Dinner, Aaron R.

    2015-01-01

    Robust methods for identifying patterns of expression in genome-wide data are important for generating hypotheses regarding gene function. To this end, several analytic methods have been developed for detecting periodic patterns. We improve one such method, JTK_CYCLE, by explicitly calculating the null distribution such that it accounts for multiple hypothesis testing and by including non-sinusoidal reference waveforms. We term this method empirical JTK_CYCLE with asymmetry search, and we compare its performance to JTK_CYCLE with Bonferroni and Benjamini-Hochberg multiple hypothesis testing correction, as well as to five other methods: cyclohedron test, address reduction, stable persistence, ANOVA, and F24. We find that ANOVA, F24, and JTK_CYCLE consistently outperform the other three methods when data are limited and noisy; empirical JTK_CYCLE with asymmetry search gives the greatest sensitivity while controlling for the false discovery rate. Our analysis also provides insight into experimental design and we find that, for a fixed number of samples, better sensitivity and specificity are achieved with higher numbers of replicates than with higher sampling density. Application of the methods to detecting circadian rhythms in a metadataset of microarrays that quantify time-dependent gene expression in whole heads of Drosophila melanogaster reveals annotations that are enriched among genes with highly asymmetric waveforms. These include a wide range of oxidation reduction and metabolic genes, as well as genes with transcripts that have multiple splice forms. PMID:25793520
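
    For readers unfamiliar with the two correction procedures compared above, the sketch below applies the standard Bonferroni and Benjamini-Hochberg adjustments to a vector of p-values. These are the generic textbook procedures, not the empirical JTK_CYCLE implementation.

      import numpy as np

      def bonferroni(p_values, alpha=0.05):
          """Reject H0 where p <= alpha / m (family-wise error rate control)."""
          p = np.asarray(p_values, dtype=float)
          return p <= alpha / p.size

      def benjamini_hochberg(p_values, alpha=0.05):
          """Benjamini-Hochberg step-up procedure (false discovery rate control):
          find the largest k with p_(k) <= (k/m) * alpha and reject the k smallest."""
          p = np.asarray(p_values, dtype=float)
          m = p.size
          order = np.argsort(p)
          thresholds = (np.arange(1, m + 1) / m) * alpha
          passed = p[order] <= thresholds
          rejected = np.zeros(m, dtype=bool)
          if passed.any():
              k = np.max(np.nonzero(passed)[0])   # largest index meeting its threshold
              rejected[order[: k + 1]] = True
          return rejected

      pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.3, 0.74]
      print(bonferroni(pvals).sum(), benjamini_hochberg(pvals).sum())  # BH rejects more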

  8. Stressors and anxiety in dementia caregiving: multiple mediation analysis of rumination, experiential avoidance, and leisure.

    PubMed

    Romero-Moreno, R; Losada, A; Márquez-González, M; Mausbach, B T

    2016-11-01

    Despite the robust associations between stressors and anxiety in dementia caregiving, there is a lack of research examining which factors help explain this relationship. This study was designed to test a multiple mediation model of behavioral and psychological symptoms of dementia (BPSD) and anxiety that proposes higher levels of rumination and experiential avoidance and lower levels of leisure satisfaction as potential mediating variables. The sample consisted of 256 family caregivers. To test a simultaneous parallel multiple mediation model of the BPSD-to-anxiety pathway, the PROCESS method was used, with bias-corrected and accelerated bootstrapping to estimate confidence intervals. Higher levels of stressors significantly predicted anxiety. Greater stressors significantly predicted higher levels of rumination and experiential avoidance, and lower levels of leisure satisfaction. These three coping variables significantly predicted anxiety. Finally, rumination, experiential avoidance, and leisure satisfaction significantly mediated the link between stressors and anxiety. The explained variance for the final model was 47.09%. Significant contrasts were found between rumination and leisure satisfaction, with rumination being the stronger mediator. The results suggest that caregivers' experiential avoidance, rumination, and leisure satisfaction may function as mechanisms through which BPSD influence caregivers' anxiety. Training caregivers to reduce their levels of experiential avoidance and rumination, through techniques that foster acceptance of their negative internal experiences, and to increase their leisure satisfaction may help reduce the anxiety symptoms that develop in response to stressors.

  9. The Need for Precise and Well-documented Experimental Data on Prompt Fission Neutron Spectra from Neutron-induced Fission of ²³⁹Pu

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neudecker, D., E-mail: dneudecker@lanl.gov; Taddeucci, T.N.; Haight, R.C.

    2016-01-15

    The spectrum of neutrons emitted promptly after ²³⁹Pu(n,f), the so-called prompt fission neutron spectrum (PFNS), is a quantity of high interest, for instance, for reactor physics and global security. However, there are only a few experimental data sets available that are suitable for evaluations. In addition, some of those data sets differ by more than their 1-σ uncertainty boundaries. We present the results of MCNP studies indicating that these differences are partly caused by underestimated multiple scattering contributions, over-corrected background, and inconsistent deconvolution methods. A detailed uncertainty quantification for suitable experimental data was undertaken including these effects, and test-evaluations were performed with the improved uncertainty information. The test-evaluations illustrate that the inadequately estimated effects and detailed uncertainty quantification have an impact on the evaluated PFNS and associated uncertainties as well as the neutron multiplicity of selected critical assemblies. A summary of data and documentation needs to improve the quality of the experimental database is provided based on the results of simulations and test-evaluations. Given the possibly substantial distortion of the PFNS by multiple scattering and background effects, special care should be taken to reduce these effects in future measurements, e.g., by measuring the ²³⁹Pu PFNS as a ratio to either the ²³⁵U or ²⁵²Cf PFNS.

  10. Sticky foam as a less-than-lethal technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, S.H.

    1996-12-31

    Sandia National Labs (SNL) in 1994 completed a project funded by the National Institute of Justice (NIJ) to determine the applicability of sticky foam for correctional applications. Sticky foam is an extremely tacky, tenacious material used to block, entangle, and impair individuals. The NIJ project developed a gun capable of firing multiple shots of sticky foam, tested the gun and sticky foam effectiveness on SNL volunteers acting out prison and law enforcement scenarios, and had the gun and sticky foam evaluated by correctional representatives. Based on the NIJ project work, SNL supported the Marine Corps Mission, Operation United Shield, with sticky foam guns and supporting equipment to assist in the withdrawal of UN Peacekeepers from Somalia. Prior to the loan of the equipment, the Marines were given training in sticky foam characterization, toxicology, safety issues, cleanup and waste disposal, use limitations, use protocol and precautions, emergency facial clean-up, skin cleanup, gun filling, targeting and firing, and gun cleaning. The Marine Corps successfully used the sticky foam guns as part of that operation. This paper describes these recent developments of sticky foam for non-lethal uses and some of the lessons learned from scenario and application testing.

  11. An efficient empirical Bayes method for genomewide association studies.

    PubMed

    Wang, Q; Wei, J; Pan, Y; Xu, S

    2016-08-01

    Linear mixed model (LMM) is one of the most popular methods for genomewide association studies (GWAS). Numerous forms of LMM have been developed; however, there are two major issues in GWAS that have not been fully addressed before. The two issues are (i) the genomic background noise and (ii) low statistical power after Bonferroni correction. We proposed an empirical Bayes (EB) method by assigning each marker effect a normal prior distribution, resulting in shrinkage estimates of marker effects. We found that such a shrinkage approach can selectively shrink marker effects and reduce the noise level to zero for majority of non-associated markers. In the meantime, the EB method allows us to use an 'effective number of tests' to perform Bonferroni correction for multiple tests. Simulation studies for both human and pig data showed that EB method can significantly increase statistical power compared with the widely used exact GWAS methods, such as GEMMA and FaST-LMM-Select. Real data analyses in human breast cancer identified improved detection signals for markers previously known to be associated with breast cancer. We therefore believe that EB method is a valuable tool for identifying the genetic basis of complex traits. © 2015 Blackwell Verlag GmbH.
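
    One common way to operationalize the 'effective number of tests' mentioned above is to derive it from the eigenvalues of the marker correlation matrix and then use alpha / M_eff as the Bonferroni threshold. The estimator sketched below follows the eigenvalue-based form often attributed to Li and Ji; it is only an illustration of the idea and is not necessarily the estimator used by the authors.

      import numpy as np

      def effective_number_of_tests(genotype_matrix):
          """Estimate M_eff from the marker correlation matrix (Li & Ji-style estimator).

          genotype_matrix: n_individuals x n_markers array. The eigenvalue-based formula
          below is one common estimator of the effective number of independent tests;
          it is used here only to illustrate the adjusted Bonferroni threshold."""
          corr = np.corrcoef(genotype_matrix, rowvar=False)
          eig = np.abs(np.linalg.eigvalsh(corr))
          return float(np.sum((eig >= 1.0).astype(float) + (eig - np.floor(eig))))

      def adjusted_bonferroni_threshold(genotype_matrix, alpha=0.05):
          return alpha / effective_number_of_tests(genotype_matrix)

      # Example: 200 individuals, 50 correlated markers (blocks of 5 near-copies)
      rng = np.random.default_rng(1)
      base = rng.normal(size=(200, 10))
      geno = np.repeat(base, 5, axis=1) + 0.3 * rng.normal(size=(200, 50))
      print(round(adjusted_bonferroni_threshold(geno), 5))   # larger than 0.05 / 50 = 0.001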

  12. Control Surface Interaction Effects of the Active Aeroelastic Wing Wind Tunnel Model

    NASA Technical Reports Server (NTRS)

    Heeg, Jennifer

    2006-01-01

    This paper presents results from testing the Active Aeroelastic Wing wind tunnel model in NASA Langley's Transonic Dynamics Tunnel. The wind tunnel test provided an opportunity to study aeroelastic system behavior under combined control surface deflections, testing for control surface interaction effects. Control surface interactions were observed in both static control surface actuation testing and dynamic control surface oscillation testing. The primary method of evaluating interactions was examination of the goodness of the linear superposition assumptions. Responses produced by independently actuating single control surfaces were combined and compared with those produced by simultaneously actuating and oscillating multiple control surfaces. Adjustments to the data were required to isolate the control surface influences. With dynamic data, the task becomes more demanding, as both the amplitude and phase have to be considered in the data corrections. The goodness of static linear superposition was examined and analysis of variance was used to evaluate significant factors influencing that goodness. The dynamic data showed interaction effects in both the aerodynamic measurements and the structural measurements.

  13. [Posture and aging. Current fundamental studies and management concepts].

    PubMed

    Mourey, F; Camus, A; Pfitzenmeyer, P

    2000-02-19

    FUNDAMENTAL IMPORTANCE OF POSTURE: In the elderly subject, preservation of posture is fundamental to maintaining functional independence. In recent years, there has been much progress in our understanding of the mechanisms underlying strategies used to control equilibrium in the upright position. Physiological aging, associated with diverse disease states, dangerously alters the postural function, particularly anticipated adjustments which allow an adaptation of posture to movement. CLINICAL ASSESSMENT OF POSTURE: Several tests have been developed to assess posture in the elderly subject, particularly the time it takes to start walking. We selected certain tests which can be used in everyday practice to predict falls: the stance test, the improved Romberg test, the "timed get up and go test", measurement of walking cadence, assessment of balance reactions, sitting-standing and standing-sitting movements and capacity to get up off the floor. PATIENT CARE: Elderly patients with equilibrium disorders can benefit from specific personalized rehabilitation protocols. Different techniques have been developed for multiple afferential stimulation, reprogramming postural strategies, and correcting for deficient motor automatisms.

  14. How Question Types Reveal Student Thinking: An Experimental Comparison of Multiple-True-False and Free-Response Formats

    PubMed Central

    Hubbard, Joanna K.; Potts, Macy A.; Couch, Brian A.

    2017-01-01

    Assessments represent an important component of undergraduate courses because they affect how students interact with course content and gauge student achievement of course objectives. To make decisions on assessment design, instructors must understand the affordances and limitations of available question formats. Here, we use a crossover experimental design to identify differences in how multiple-true-false (MTF) and free-response (FR) exam questions reveal student thinking regarding specific conceptions. We report that correct response rates correlate across the two formats but that a higher percentage of students provide correct responses for MTF questions. We find that MTF questions reveal a high prevalence of students with mixed (correct and incorrect) conceptions, while FR questions reveal a high prevalence of students with partial (correct and unclear) conceptions. These results suggest that MTF question prompts can direct students to address specific conceptions but obscure nuances in student thinking and may overestimate the frequency of particular conceptions. Conversely, FR questions provide a more authentic portrait of student thinking but may face limitations in their ability to diagnose specific, particularly incorrect, conceptions. We further discuss an intrinsic tension between question structure and diagnostic capacity and how instructors might use multiple formats or hybrid formats to overcome these obstacles. PMID:28450446

  15. Intracalibration of particle detectors on a three-axis stabilized geostationary platform

    NASA Astrophysics Data System (ADS)

    Rowland, W.; Weigel, R. S.

    2012-11-01

    We describe an algorithm for intracalibration of measurements from plasma or energetic particle detectors on a three-axis stabilized platform. Modeling and forecasting of Earth's radiation belt environment requires data from particle instruments, and these data depend on measurements which have an inherent calibration uncertainty. Pre-launch calibration is typically performed, but on-orbit changes in the instrument often necessitate adjustment of calibration parameters to mitigate the effect of these changes on the measurements. On-orbit calibration practices for particle detectors aboard spin-stabilized spacecraft are well established. Three-axis stabilized platforms, however, pose unique challenges even when comparisons are being performed between multiple telescopes measuring the same energy ranges aboard the same satellite. This algorithm identifies time intervals when different telescopes are measuring particles with the same pitch angles. These measurements are used to compute scale factors which can be multiplied by the pre-launch geometric factor to correct any changes. The approach is first tested using measurements from GOES-13 MAGED particle detectors over a 5-month time period in 2010. We find statistically significant variations which are generally on the order of 5% or less. These results do not appear to be dependent on Poisson statistics nor upon whether a dead time correction was performed. When applied to data from a 5-month interval in 2011, one telescope shows a 10% shift from the 2010 scale factors. This technique has potential for operational use to help maintain relative calibration between multiple telescopes aboard a single satellite. It should also be extensible to inter-calibration between multiple satellites.
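
    The core of the described intracalibration can be illustrated as follows: whenever two telescopes observe (nearly) the same pitch angle, the ratio of their mean fluxes gives a relative scale factor. The matching tolerance, data layout, and use of a simple mean-flux ratio in the sketch below are assumptions for illustration, not the exact GOES-13 MAGED procedure.

      import numpy as np

      def relative_scale_factor(flux_a, pitch_a, flux_b, pitch_b, tol_deg=2.0):
          """Estimate a relative scale factor between two telescopes.

          flux_*  : measured fluxes from telescopes A and B (same energy channel)
          pitch_* : corresponding pitch angles (degrees)

          Samples are paired whenever the two telescopes view (nearly) the same pitch
          angle; the scale factor is the ratio of mean fluxes over those pairs."""
          matched = np.abs(pitch_a - pitch_b) <= tol_deg
          if not matched.any():
              raise ValueError("no intervals with matching pitch angles")
          return float(np.mean(flux_b[matched]) / np.mean(flux_a[matched]))

      # Example with synthetic data: telescope B reads about 5% high relative to A
      rng = np.random.default_rng(2)
      pitch = rng.uniform(0, 180, size=1000)
      flux_a = rng.poisson(100, size=1000).astype(float)
      flux_b = 1.05 * rng.poisson(100, size=1000)
      print(round(relative_scale_factor(flux_a, pitch, flux_b, pitch + rng.normal(0, 1, 1000)), 3))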

  16. Explanation of Two Anomalous Results in Statistical Mediation Analysis

    ERIC Educational Resources Information Center

    Fritz, Matthew S.; Taylor, Aaron B.; MacKinnon, David P.

    2012-01-01

    Previous studies of different methods of testing mediation models have consistently found two anomalous results. The first result is elevated Type I error rates for the bias-corrected and accelerated bias-corrected bootstrap tests not found in nonresampling tests or in resampling tests that did not include a bias correction. This is of special…

  17. Increased Incidence and Severity of Postoperative Radiographic Hallux Valgus Interphalangeus With Surgical Correction of Hallux Valgus.

    PubMed

    Dixon, Alexis E; Lee, Lydia C; Charlton, Timothy P; Thordarson, David B

    2015-08-01

    A previous study has shown an increased radiographic prevalence and severity of hallux valgus interphalangeus (HVIP) after surgical correction of hallux valgus (HV) due to correction of pronation deformity. The purpose of this study was to evaluate the change in pre- and postoperative HVIP deformity with correction of HV with multiple radiographic parameters. A retrospective chart review identified all bunion surgeries performed at a single center from July 1, 2009, to September 30, 2012. Exclusion criteria included prior bony surgery to the first ray, inadequate films, nonadult bunion, Akin osteotomy, or surgical treatment other than bunion correction. Pre- and postoperative films were reviewed for 2 HV angular measurements and 5 HVIP measurements, which were compared. The angles measured were hallux valgus angle (HVA), first intermetatarsal angle (IMA), hallux interphalangeus angle (HIA), distal metatarsal articular angle (DMAA), proximal phalangeal articular angle (PPAA), proximal to distal phalangeal articular angle (PDPAA), and total distal deformity (TDD). Prevalence of HVIP was analyzed in pre- and postoperative radiographs. A 1-sided Student t test was used to compare continuous data, and a chi-square test was used to compare categorical data. Ninety-two feet in 82 patients were eligible. The average preoperative HV improved with surgery. Preoperative HVA improved from 27 to 11 degrees (P < .001). Preoperative IMA improved from 13.6 to 6.1 degrees (P < .001). HVIP worsened after surgery. Preoperative HIA increased from 7.2 to 13.2 degrees (P < .001). DMAA worsened from 7.3 to 9.2 degrees (P = .001). PPAA worsened from 3.2 to 6.2 degrees. PDPAA worsened from 6.7 to 8.2 degrees (P < .001). The TDD increased from 14.6 to 17.9 degrees (P < .001). The prevalence of HVIP pre- and postoperatively as defined by HIA increased from 26% to 79% (P < .001) and by PPAA from 12% to 46% (P < .001). Initial assessment of preoperative radiographs underestimated HVIP. Postoperative correction of the deformity revealed HVIP that was not obvious preoperatively. Level III, retrospective comparative series. © The Author(s) 2015.

  18. A metabolomic study of low estimated GFR in non-proteinuric type 2 diabetes mellitus.

    PubMed

    Ng, D P K; Salim, A; Liu, Y; Zou, L; Xu, F G; Huang, S; Leong, H; Ong, C N

    2012-02-01

    We carried out a urinary metabolomic study to gain insight into low estimated GFR (eGFR) in patients with non-proteinuric type 2 diabetes. Patients were identified as being non-proteinuric using multiple urinalyses. Cases (n = 44) with low eGFR and controls (n = 46) had eGFR values <60 and ≥60 ml min(-1) 1.73 m(-2), respectively, as calculated using the Modification of Diet in Renal Disease formula. Urine samples were analysed by liquid chromatography/mass spectrometry (LC/MS) and GC/MS. False discovery rates were used to adjust for multiple hypotheses testing, and selection of metabolites that best predicted low eGFR status was achieved using least absolute shrinkage and selection operator logistic regression. Eleven GC/MS metabolites were strongly associated with low eGFR after correction for multiple hypotheses testing (smallest adjusted p value = 2.62 × 10(-14), largest adjusted p value = 3.84 × 10(-2)). In regression analysis, octanol, oxalic acid, phosphoric acid, benzamide, creatinine, 3,5-dimethoxymandelic amide and N-acetylglutamine were selected as the best subset for prediction and allowed excellent classification of low eGFR (AUC = 0.996). In LC/MS, 19 metabolites remained significant after multiple hypotheses testing had been taken into account (smallest adjusted p value = 2.04 × 10(-4), largest adjusted p value = 4.48 × 10(-2)), and several metabolites showed stronger evidence of association relative to the uraemic toxin, indoxyl sulphate (adjusted p value = 3.03 × 10(-2)). The potential effect of confounding on the association between metabolites was excluded. Our study has yielded substantial new insight into low eGFR and provided a collection of potential urinary biomarkers for its detection.

  19. HIV testing in correctional institutions: evaluating existing strategies, setting new standards.

    PubMed

    Basu, Sanjay; Smith-Rohrberg, Duncan; Hanck, Sarah; Altice, Frederick L

    2005-01-01

    Before introducing an HIV testing protocol into correctional facilities, the unique nature of these environments must be taken into account. We analyze three testing strategies that have been used in correctional settings--mandatory, voluntary, and routine "opt out" testing--and conclude that routine testing is most likely beneficial to inmates, the correctional system, and the outside community. The ethics of pre-release testing, and the issues surrounding segregation, confidentiality, and linking prisoners with community-based care, also play a role in determining how best to establish HIV testing strategies in correctional facilities. Testing must be performed in a manner that is not simply beneficial to public health, but also enhances the safety and health status of individual inmates. Longer-stay prison settings provide ample opportunities not just for testing but also for in-depth counseling, mental health and substance abuse treatment, and antiretroviral therapy. Jails present added complexities because of their shorter stay with respect to prisons, and testing, treatment, and counseling policies must be adapted to these settings.

  20. Evaluations and Comparisons of Treatment Effects Based on Best Combinations of Biomarkers with Applications to Biomedical Studies

    PubMed Central

    Chen, Xiwei; Yu, Jihnhee

    2014-01-01

    Many clinical and biomedical studies evaluate treatment effects based on multiple biomarkers that commonly consist of pre- and post-treatment measurements. Some biomarkers can show significant positive treatment effects, while other biomarkers can reflect no effects or even negative effects of the treatments, giving rise to a necessity to develop methodologies that may correctly and efficiently evaluate the treatment effects based on multiple biomarkers as a whole. In the setting of pre- and post-treatment measurements of multiple biomarkers, we propose to apply a receiver operating characteristic (ROC) curve methodology based on the best combination of biomarkers, maximizing an area under the ROC curve (AUC)-type criterion among all possible linear combinations. In the particular case with independent pre- and post-treatment measurements, we show that the proposed method recovers the well-known result of Su and Liu (1993). Further, proceeding from the derived best combinations of biomarkers' measurements, we propose an efficient technique via likelihood ratio tests to compare treatment effects. We present an extensive Monte Carlo study that confirms the superiority of the proposed test for comparing treatment effects based on multiple biomarkers in a paired data setting. For practical applications, the proposed method is illustrated with a randomized trial of chlorhexidine gluconate on oral bacterial pathogens in mechanically ventilated patients as well as a treatment study for children with attention deficit-hyperactivity disorder and severe mood dysregulation. PMID:25019920
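
    Under binormal assumptions, the Su and Liu (1993) result cited above gives the best linear combination in closed form: coefficients proportional to (S0 + S1)^(-1) (mu1 - mu0), with AUC = Phi(sqrt((mu1 - mu0)' (S0 + S1)^(-1) (mu1 - mu0))). The sketch below computes that classical unpaired combination from sample moments; it is not the authors' paired-data extension or their likelihood ratio test.

      import numpy as np
      from scipy.stats import norm

      def su_liu_combination(x_healthy, x_diseased):
          """Best linear combination of biomarkers in the sense of Su and Liu (1993).

          Assumes multivariate normal biomarker vectors in each group. Returns the
          coefficient vector a ~ (S0 + S1)^(-1) (mu1 - mu0) and the corresponding
          binormal AUC, Phi(sqrt((mu1 - mu0)' (S0 + S1)^(-1) (mu1 - mu0)))."""
          mu0, mu1 = x_healthy.mean(axis=0), x_diseased.mean(axis=0)
          s0 = np.cov(x_healthy, rowvar=False)
          s1 = np.cov(x_diseased, rowvar=False)
          delta = mu1 - mu0
          a = np.linalg.solve(s0 + s1, delta)          # optimal combination weights
          auc = norm.cdf(np.sqrt(delta @ a))           # AUC of the combined marker
          return a, float(auc)

      # Example with two synthetic biomarkers
      rng = np.random.default_rng(3)
      healthy = rng.multivariate_normal([0, 0], [[1, 0.3], [0.3, 1]], size=300)
      diseased = rng.multivariate_normal([0.8, 0.5], [[1, 0.3], [0.3, 1]], size=300)
      coefs, auc = su_liu_combination(healthy, diseased)
      print(coefs.round(3), round(auc, 3))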

  1. Closure report for Corrective Action Unit 211, Area 15 EPA Farm waste sites, Nevada Test Site, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-04-01

    This Closure Report summarizes the corrective actions which were completed at the Corrective Action Sites within Corrective Action Unit 211, Area 15 Farm Waste Sites, at the Nevada Test Site. Current site descriptions, observations and identification of wastes removed are included on FFACO Corrective Action Site housekeeping closure verification forms.

  2. Common-pull, multiple-push, vacuum-activated telescope mirror cell.

    PubMed

    Ruiz, Elfego; Sohn, Erika; Salas, Luis; Luna, Esteban; Araiza-Durán, José A

    2014-11-20

    A new concept for push-pull active optics is presented, where the push-force is provided by means of individual airbag type actuators and a common force in the form of a vacuum is applied to the entire back of the mirror. The vacuum provides the pull-component of the system, in addition to gravity. Vacuum is controlled as a function of the zenithal angle, providing correction for the axial component of the mirror's weight. In this way, the push actuators are only responsible for correcting mirror deformations, as well as for supporting the axial mirror weight at the zenith, allowing for a uniform, full dynamic-range behavior of the system along the telescope's pointing range. This can result in the ability to perform corrections of up to a few microns for low-order aberrations. This mirror support concept was simulated using a finite element model and was tested experimentally at the 2.12 m San Pedro Mártir telescope. Advantages such as stress-free attachments, lighter weight, large actuator area, lower system complexity, and lower required mirror-cell stiffness could make this a method to consider for future large telescopes.

  3. A portable pattern-based design technology co-optimization flow to reduce optical proximity correction run-time

    NASA Astrophysics Data System (ADS)

    Chen, Yi-Chieh; Li, Tsung-Han; Lin, Hung-Yu; Chen, Kao-Tun; Wu, Chun-Sheng; Lai, Ya-Chieh; Hurat, Philippe

    2018-03-01

    As process technology improves and integrated circuit (IC) design complexity increases, the failure rate attributable to optical effects grows in semiconductor manufacturing. To maintain chip quality, optical proximity correction (OPC) plays an indispensable role in the manufacturing industry. However, OPC, which includes model creation, correction, simulation, and verification, is a bottleneck between design and manufacturing because of its multiple iterations and the mathematically complex descriptions of physical behavior it requires. This paper therefore presents a pattern-based design technology co-optimization (PB-DTCO) flow that works in cooperation with OPC to identify patterns that will negatively affect yield and to fix them automatically in advance, reducing run-time in OPC operation. The PB-DTCO flow can generate a large set of test patterns for model creation and yield improvement, systematically classify candidate patterns, and quickly build banks of paired matching and optimization patterns. These banks can be used for hotspot fixing and layout optimization, and can also be referenced for the next technology node. The combination of the PB-DTCO flow with OPC therefore not only reduces time-to-market but is also flexible and can be easily adapted to diverse OPC flows.

  4. Statistical Evaluation of Combined Daily Gauge Observations and Rainfall Satellite Estimations over Continental South America

    NASA Technical Reports Server (NTRS)

    Vila, Daniel; deGoncalves, Luis Gustavo; Toll, David L.; Rozante, Jose Roberto

    2008-01-01

    This paper describes a comprehensive assessment of a new high-resolution, high-quality gauge-satellite based analysis of daily precipitation over continental South America during 2004. The methodology is based on a combination of additive and multiplicative bias correction schemes in order to achieve the lowest bias when compared with the observed values. Inter-comparison and cross-validation tests have been carried out for the control algorithm (TMPA real-time algorithm) and different merging schemes: additive bias correction (ADD), ratio bias correction (RAT) and the TMPA research version, for different months belonging to different seasons and for different network densities. All compared merging schemes produce better results than the control algorithm, but when finer temporal (daily) and spatial scale (regional network) gauge datasets are included in the analysis, the improvement is remarkable. The Combined Scheme (CoSch) consistently presents the best performance among the five techniques. This is also true when a degraded daily gauge network is used instead of the full dataset. This technique appears to be a suitable tool to produce real-time, high-resolution, high-quality gauge-satellite based analyses of daily precipitation over land in regional domains.
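
    The additive (ADD) and ratio (RAT) schemes named above can be illustrated, in their simplest form, by removing either the mean satellite-minus-gauge difference or the gauge-to-satellite ratio computed at gauge locations. The sketch below omits the spatial analysis step of the actual merging procedure, and its implementation details are assumptions for illustration.

      import numpy as np

      def additive_bias_correction(satellite, gauge):
          """ADD scheme: subtract the mean satellite-minus-gauge difference computed
          at gauge locations (gauge is NaN where no station exists)."""
          bias = np.nanmean(satellite - gauge)
          return np.clip(satellite - bias, 0.0, None)   # keep rainfall non-negative

      def ratio_bias_correction(satellite, gauge):
          """RAT scheme: scale the satellite field by the ratio of mean gauge rainfall
          to mean satellite rainfall at gauge locations."""
          sat_at_gauges = np.where(np.isnan(gauge), np.nan, satellite)
          factor = np.nanmean(gauge) / max(np.nanmean(sat_at_gauges), 1e-6)
          return satellite * factor

      # Example: satellite overestimates daily rainfall by ~20% plus a 1 mm offset
      rng = np.random.default_rng(4)
      gauge = rng.gamma(2.0, 3.0, size=100)        # "observed" daily totals (mm)
      satellite = 1.2 * gauge + 1.0
      print(round(np.nanmean(additive_bias_correction(satellite, gauge) - gauge), 2))
      print(round(np.nanmean(ratio_bias_correction(satellite, gauge) - gauge), 2))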

  5. Accuracy of p53 Codon 72 Polymorphism Status Determined by Multiple Laboratory Methods: A Latent Class Model Analysis

    PubMed Central

    Walter, Stephen D.; Riddell, Corinne A.; Rabachini, Tatiana; Villa, Luisa L.; Franco, Eduardo L.

    2013-01-01

    Introduction: Studies on the association of a polymorphism in codon 72 of the p53 tumour suppressor gene (rs1042522) with cervical neoplasia have inconsistent results. While several methods for genotyping p53 exist, they vary in accuracy and are often discrepant. Methods: We used latent class models (LCM) to examine the accuracy of six methods for p53 determination, all conducted by the same laboratory. We also examined the association of p53 with cytological cervical abnormalities, recognising potential test inaccuracy. Results: Pairwise disagreement between laboratory methods occurred approximately 10% of the time. Given the estimated true p53 status of each woman, we found that each laboratory method is most likely to classify a woman to her correct status. Arg/Arg women had the highest risk of squamous intraepithelial lesions (SIL). Test accuracy was independent of cytology. There was no strong evidence for correlations of test errors. Discussion: Empirical analyses ignore possible laboratory errors, and so are inherently biased, but test accuracy estimated by the LCM approach is unbiased when model assumptions are met. LCM analysis avoids ambiguities arising from empirical test discrepancies, obviating the need to regard any of the methods as a "gold" standard measurement. The methods we presented here to analyse the p53 data can be applied in many other situations where multiple tests exist, but where none of them is a gold standard. PMID:23441193

  6. Corrective Action Decision Document/Corrective Action Plan for Corrective Action Unit 104: Area 7 Yucca Flat Atmospheric Test Sites Nevada National Security Site, Nevada, Revision 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patrick Matthews

    2012-10-01

    CAU 104 comprises the following corrective action sites (CASs): • 07-23-03, Atmospheric Test Site T-7C • 07-23-04, Atmospheric Test Site T7-1 • 07-23-05, Atmospheric Test Site • 07-23-06, Atmospheric Test Site T7-5a • 07-23-07, Atmospheric Test Site - Dog (T-S) • 07-23-08, Atmospheric Test Site - Baker (T-S) • 07-23-09, Atmospheric Test Site - Charlie (T-S) • 07-23-10, Atmospheric Test Site - Dixie • 07-23-11, Atmospheric Test Site - Dixie • 07-23-12, Atmospheric Test Site - Charlie (Bus) • 07-23-13, Atmospheric Test Site - Baker (Buster) • 07-23-14, Atmospheric Test Site - Ruth • 07-23-15, Atmospheric Test Site T7-4 • 07-23-16, Atmospheric Test Site B7-b • 07-23-17, Atmospheric Test Site - Climax. These 15 CASs include releases from 30 atmospheric tests conducted in the approximately 1 square mile of CAU 104. Because releases associated with the CASs included in this CAU overlap and are not separate and distinguishable, these CASs are addressed jointly at the CAU level. The purpose of this CADD/CAP is to evaluate potential corrective action alternatives (CAAs), provide the rationale for the selection of recommended CAAs, and provide the plan for implementation of the recommended CAA for CAU 104. Corrective action investigation (CAI) activities were performed from October 4, 2011, through May 3, 2012, as set forth in the CAU 104 Corrective Action Investigation Plan.

  7. Wind tunnel-sidewall-boundary-layer effects in transonic airfoil testing-some correctable, but some not

    NASA Technical Reports Server (NTRS)

    Lynch, F. T.; Johnson, C. B.

    1988-01-01

    The need to correct transonic airfoil wind tunnel test data for the influence of the tunnel sidewall boundary layers, in addition to the well-established wall corrections, was examined. An analytical and experimental investigation was carried out in order to evaluate sidewall boundary-layer effects on transonic airfoil characteristics and to validate proposed corrections and the limits of their application. The investigation involved testing modern airfoil configurations in two different transonic airfoil test facilities: the 15 x 60 inch two-dimensional insert of the National Aeronautical Establishment (NAE) 5 foot tunnel in Ottawa, Canada, and the two-dimensional test section of the NASA Langley 0.3 m Transonic Cryogenic Tunnel (TCT). Results presented include the effects of variations in sidewall boundary-layer bleed in both facilities, different sidewall boundary-layer correction procedures, tunnel-to-tunnel comparisons of corrected results, and flow conditions with and without separation.

  8. Multiple Choice Items: How to Gain the Most out of Them.

    ERIC Educational Resources Information Center

    Talmir, Pinchas

    1991-01-01

    Describes how multiple-choice items can be designed and used as an effective diagnostic tool by avoiding their pitfalls and by taking advantage of their potential benefits. The following issues are discussed: 'correct' versus best answers; construction of diagnostic multiple-choice items; the problem of guessing; the use of justifications of…

  9. First laboratory results with the LINC-NIRVANA high layer wavefront sensor.

    PubMed

    Zhang, Xianyu; Gaessler, Wolfgang; Conrad, Albert R; Bertram, Thomas; Arcidiacono, Carmelo; Herbst, Thomas M; Kuerster, Martin; Bizenberger, Peter; Meschke, Daniel; Rix, Hans-Walter; Rao, Changhui; Mohr, Lars; Briegel, Florian; Kittmann, Frank; Berwein, Juergen; Trowitzsch, Jan; Schreiber, Laura; Ragazzoni, Roberto; Diolaiti, Emiliano

    2011-08-15

    In the field of adaptive optics, multi-conjugate adaptive optics (MCAO) can greatly increase the size of the corrected field of view (FoV) and also extend sky coverage. By applying layer oriented MCAO (LO-MCAO) [4], together with multiple guide stars (up to 20) and pyramid wavefront sensors [7], LINC-NIRVANA (L-N for short) [1] will provide two AO-corrected beams to a Fizeau interferometer to achieve 10 milliarcsecond angular resolution on the Large Binocular Telescope. This paper presents first laboratory results of the AO performance achieved with the high layer wavefront sensor (HWS). This sensor, together with its associated deformable mirror (a Xinetics-349), is being operated in one of the L-N laboratories. AO reference stars, spread across a 2 arc-minute FoV and with aberrations resulting from turbulence introduced at specific layers in the atmosphere, are simulated in this lab environment. This is achieved with the Multi-Atmosphere Phase screen and Stars (MAPS) [2] unit. From the wavefront data, the approximate residual wavefront error after correction has been calculated for different turbulent layer altitudes and wind speeds. Using a somewhat undersampled CCD, the FWHM of stars in the nearly 2 arc-minute FoV has also been measured. These test results demonstrate that the high layer wavefront sensor of LINC-NIRVANA will be able to achieve uniform AO correction across a large FoV. © 2011 Optical Society of America

  10. Performance analysis of multiple PRF technique for ambiguity resolution

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Curlander, J. C.

    1992-01-01

    For short wavelength spaceborne synthetic aperture radar (SAR), ambiguity in Doppler centroid estimation occurs when the azimuth squint angle uncertainty is larger than the azimuth antenna beamwidth. Multiple pulse recurrence frequency (PRF) hopping is a technique developed to resolve the ambiguity by operating the radar in different PRF's in the pre-imaging sequence. Performance analysis results of the multiple PRF technique are presented, given the constraints of the attitude bound, the drift rate uncertainty, and the arbitrary numerical values of PRF's. The algorithm performance is derived in terms of the probability of correct ambiguity resolution. Examples, using the Shuttle Imaging Radar-C (SIR-C) and X-SAR parameters, demonstrate that the probability of correct ambiguity resolution obtained by the multiple PRF technique is greater than 95 percent and 80 percent for the SIR-C and X-SAR applications, respectively. The success rate is significantly higher than that achieved by the range cross correlation technique.

  11. Multiple Time-Step Dual-Hamiltonian Hybrid Molecular Dynamics — Monte Carlo Canonical Propagation Algorithm

    PubMed Central

    Weare, Jonathan; Dinner, Aaron R.; Roux, Benoît

    2016-01-01

    A multiple time-step integrator based on a dual Hamiltonian and a hybrid method combining molecular dynamics (MD) and Monte Carlo (MC) is proposed to sample systems in the canonical ensemble. The Dual Hamiltonian Multiple Time-Step (DHMTS) algorithm is based on two similar Hamiltonians: a computationally expensive one that serves as a reference and a computationally inexpensive one to which the workload is shifted. The central assumption is that the difference between the two Hamiltonians is slowly varying. Earlier work has shown that such dual Hamiltonian multiple time-step schemes effectively precondition nonlinear differential equations for dynamics by reformulating them into a recursive root finding problem that can be solved by propagating a correction term through an internal loop, analogous to RESPA. Of special interest in the present context, a hybrid MD-MC version of the DHMTS algorithm is introduced to enforce detailed balance via a Metropolis acceptance criterion and ensure consistency with the Boltzmann distribution. The Metropolis criterion suppresses the discretization errors normally associated with the propagation according to the computationally inexpensive Hamiltonian, treating the discretization error as an external work. Illustrative tests are carried out to demonstrate the effectiveness of the method. PMID:26918826
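
    A one-dimensional toy sketch of the hybrid MD-MC idea described above: leapfrog dynamics are propagated under an inexpensive surrogate potential, and a Metropolis test on the expensive Hamiltonian restores sampling from its Boltzmann distribution. This illustrates the general surrogate-guided scheme only, not the DHMTS/RESPA implementation of the paper:

      # Surrogate-guided hybrid MD-MC step: propagate with cheap forces, accept
      # with the expensive Hamiltonian (1D double well vs. harmonic surrogate).
      import numpy as np

      beta = 1.0
      U_exp = lambda x: (x**2 - 1.0)**2      # "expensive" reference potential (double well)
      F_chp = lambda x: -2.0 * x             # force of the cheap harmonic surrogate U = x**2
      rng = np.random.default_rng(1)

      def hybrid_step(x, dt=0.05, n_steps=20):
          p = rng.normal()                   # fresh unit-mass momentum (canonical ensemble)
          h_old = U_exp(x) + 0.5 * p**2      # expensive Hamiltonian before the move
          xn, pn = x, p
          pn += 0.5 * dt * F_chp(xn)         # leapfrog under the CHEAP forces only
          for i in range(n_steps):
              xn += dt * pn
              if i < n_steps - 1:
                  pn += dt * F_chp(xn)
          pn += 0.5 * dt * F_chp(xn)
          h_new = U_exp(xn) + 0.5 * pn**2    # expensive Hamiltonian after the move
          dh = h_new - h_old
          # Metropolis criterion removes the bias from propagating with the surrogate
          if dh <= 0 or rng.random() < np.exp(-beta * dh):
              return xn
          return x

      x, traj = 0.0, []
      for _ in range(5000):
          x = hybrid_step(x)
          traj.append(x)
      print("mean |x| =", np.mean(np.abs(traj)))   # samples should concentrate near the wells at +/-1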

  12. Should essays and other "open-ended"-type questions retain a place in written summative assessment in clinical medicine?

    PubMed

    Hift, Richard J

    2014-11-28

    Written assessments fall into two classes: constructed-response or open-ended questions, such as the essay and a number of variants of the short-answer question, and selected-response or closed-ended questions, typically in the form of multiple choice. It is widely believed that constructed-response written questions test higher-order cognitive processes in a manner that multiple-choice questions cannot, and consequently have higher validity. An extensive review of the literature suggests that in summative assessment neither premise is evidence-based. Well-structured open-ended and multiple-choice questions appear equivalent in their ability to assess higher cognitive functions, and performance in multiple-choice assessments may correlate more highly than the open-ended format with competence demonstrated in clinical practice following graduation. Studies of construct validity suggest that both formats measure essentially the same dimension, at least in mathematics, the physical sciences, biology and medicine. The persistence of the open-ended format in summative assessment may be due to the intuitive appeal of the belief that synthesising an answer to an open-ended question must be both more cognitively taxing and more similar to actual experience than selecting a correct response. I suggest that cognitive-constructivist learning theory would predict that a well-constructed, context-rich multiple-choice item represents a complex problem-solving exercise which activates a sequence of cognitive processes which closely parallel those required in clinical practice, hence explaining the high validity of the multiple-choice format. The evidence does not support the proposition that the open-ended assessment format is superior to the multiple-choice format, at least in exit-level summative assessment, in terms of either its ability to test higher-order cognitive functioning or its validity. This is explicable using a theory of mental models, which might predict that the multiple-choice format will have higher validity, a statement for which some empiric support exists. Given the superior reliability and cost-effectiveness of the multiple-choice format, consideration should be given to phasing out open-ended format questions in summative assessment. Whether the same applies to non-exit-level assessment and formative assessment is a question which remains to be answered, particularly in terms of the educational effect of testing, an area which deserves intensive study.

  13. Determination of corrections to flow direction measurements obtained with a wing-tip mounted sensor. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Moul, T. M.

    1983-01-01

    The nature of corrections for flow direction measurements obtained with a wing-tip mounted sensor was investigated. Corrections for the angle of attack and sideslip, measured by sensors mounted in front of each wing tip of a general aviation airplane, were determined. These flow corrections were obtained from both wind-tunnel and flight tests over a large angle-of-attack range. Both the angle-of-attack and angle-of-sideslip flow corrections were found to be substantial. The corrections were a function of the angle of attack and angle of sideslip. The effects of wing configuration changes, small changes in Reynolds number, and spinning rotation on the angle-of-attack flow correction were found to be small. The angle-of-attack flow correction determined from the static wind-tunnel tests agreed reasonably well with the correction determined from flight tests.

  14. Psychosomatic Medicine for Non-Psychiatric Residents: Video Education and Incorporation of Technology.

    PubMed

    Saunders, J; Gopalan, P; Puri, N; Azzam, P N; Zhou, L; Ghinassi, F; Jain, A; Travis, M; Ryan, N D

    2015-12-01

    Psychiatric education for non-psychiatric residents varies between training programs, and may affect resident comfort with psychiatric topics. This study's goals were to identify non-psychiatric residents' comfort with psychiatric topics and to test the effectiveness of a video intervention. Residents in various departments were given a survey. They were asked to rank their comfort level with multiple psychiatric topics, answer questions about medical decision making capacity (MDMC), watch a 15-min video about MDMC, and answer a post-test section. In total, 91 Internal Medicine, General Surgery, and Obstetrics and Gynecology residents responded to the study. Of the 91 residents, 55 completed the pre- and post-test assessments. There was no significant difference in correct responses. Residents' comfort levels were assessed, and a significant improvement in comfort level with MDMC was found. This study highlights potential opportunities for psychiatric education, and suggests brief video interventions can increase resident physicians' comfort with a psychiatric topic.

  15. Evaluation and comparison of ERTS measurements of major crops and soil associations for selected test sites in the central United States. [Texas, Indiana, Kansas, Iowa, Nebraska, and North Dakota

    NASA Technical Reports Server (NTRS)

    Baumgardner, M. F. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. Multispectral scanner data obtained by ERTS-1 over six test sites in the Central United States were analyzed and interpreted. ERTS-1 data for some of the test sites were geometrically corrected and temporally overlaid. Computer-implemented pattern recognition techniques were used in the analysis of all multispectral data. These techniques were used to evaluate ERTS-1 data as a tool for soil survey. Geology maps and land use inventories were prepared by digital analysis of multispectral data. Identification and mapping of crop species and rangelands were achieved through the analysis of 1972 and 1973 ERTS-1 data. Multiple dates of ERTS-1 data were examined to determine the variation with time of the areal extent of surface water resources on the Southern Great Plains.

  16. Associations between the oxytocin receptor gene (OXTR) and "mind-reading" in humans--an exploratory study.

    PubMed

    Lucht, Michael J; Barnow, Sven; Sonnenfeld, Christine; Ulrich, Ines; Grabe, Hans Joergen; Schroeder, Winnie; Völzke, Henry; Freyberger, Harald J; John, Ulrich; Herrmann, Falko H; Kroemer, Heyo; Rosskopf, Dieter

    2013-02-01

    The application of intranasal oxytocin enhances facial emotion recognition in normal subjects and in subjects with autism spectrum disorders (ASD). In addition, various features of social cognition have been associated with variants of the oxytocin receptor gene (OXTR). Therefore, we tested for associations between mind-reading, a measure of social recognition, and OXTR polymorphisms. 76 healthy adolescents and young adults were tested for associations between OXTR rs53576, rs2254298, rs2228485 and mind-reading using the "Reading the Mind in the Eyes Test" (RMET). After Bonferroni correction for multiple comparisons, rs2228485 was associated with the number of incorrect answers when subjects evaluated male faces (P = 0.000639). There were also associations between OXTR rs53576, rs2254298 and rs2228485 and other RMET dimensions at P < 0.05 (uncorrected). This study adds further evidence to the hypothesis that genetic variations in the OXTR modulate mind-reading and social behaviour.
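
    The Bonferroni adjustment referred to here is the standard one: with m comparisons, each raw p-value is multiplied by m (equivalently, compared against alpha/m). A small sketch; apart from the reported 0.000639, the p-values are invented for illustration:

      # Standard Bonferroni correction for m comparisons.
      def bonferroni(p_values, alpha=0.05):
          m = len(p_values)
          adjusted = [min(1.0, p * m) for p in p_values]          # adjusted p-values
          significant = [p < alpha / m for p in p_values]         # compare raw p to alpha/m
          return adjusted, significant

      raw = [0.000639, 0.012, 0.031, 0.20, 0.47, 0.83]            # one reported, five hypothetical
      adj, sig = bonferroni(raw)
      for p, a, s in zip(raw, adj, sig):
          print(f"raw={p:.6f}  bonferroni={a:.6f}  significant={s}")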

  17. Multilayer active shell mirrors for space telescopes

    NASA Astrophysics Data System (ADS)

    Steeves, John; Jackson, Kathryn; Pellegrino, Sergio; Redding, David; Wallace, J. Kent; Bradford, Samuel Case; Barbee, Troy

    2016-07-01

    A novel active mirror technology based on carbon fiber reinforced polymer (CFRP) substrates and replication techniques has been developed. Multiple additional layers are implemented into the design serving various functions. Nanolaminate metal films are used to provide a high quality reflective front surface. A backing layer of thin active material is implemented to provide the surface-parallel actuation scheme. Printed electronics are used to create a custom electrode pattern and flexible routing layer. Mirrors of this design are thin (< 1.0 mm), lightweight (2.7 kg/m2), and have large actuation capabilities. These capabilities, along with the associated manufacturing processes, represent a significant change in design compared to traditional optics. Such mirrors could be used as lightweight primaries for small CubeSat-based telescopes or as meter-class segments for future large aperture observatories. Multiple mirrors can be produced under identical conditions enabling a substantial reduction in manufacturing cost and complexity. An overview of the mirror design and manufacturing processes is presented. Predictions on the actuation performance have been made through finite element simulations demonstrating correctabilities on the order of 250-300× for astigmatic modes with only 41 independent actuators. A description of the custom metrology system used to characterize the active mirrors is also presented. The system is based on a Reverse Hartmann test and can accommodate extremely large deviations in mirror figure (> 100 μm PV) down to sub-micron precision. The system has been validated against several traditional techniques including photogrammetry and interferometry. The mirror performance has been characterized using this system, as well as closed-loop figure correction experiments on 150 mm dia. prototypes. The mirrors have demonstrated post-correction figure accuracies of 200 nm RMS (two dead actuators limiting performance).

  18. Audiometric analyses confirm a cochlear component, disproportional to age, in stapedial otosclerosis.

    PubMed

    Topsakal, Vedat; Fransen, Erik; Schmerber, Sébastien; Declau, Frank; Yung, Matthew; Gordts, Frans; Van Camp, Guy; Van de Heyning, Paul

    2006-09-01

    To report the preoperative audiometric profile of surgically confirmed otosclerosis. Retrospective, multicenter study. Four tertiary referral centers. One thousand sixty-four surgically confirmed patients with otosclerosis. Therapeutic ear surgery for hearing improvement. Preoperative audiometric air conduction (AC) and bone conduction (BC) hearing thresholds were obtained retrospectively for 1064 patients with otosclerosis. A cross-sectional multiple linear regression analysis was performed on audiometric data of affected ears. Influences of age and sex were analyzed and age-related typical audiograms were created. Bone conduction thresholds were corrected for the Carhart effect and presbyacusis; in addition, we tested whether a separate cochlear otosclerosis component existed. Corrected thresholds were then analyzed separately for progression of cochlear otosclerosis. The study population consisted of 35% men and 65% women (mean age, 44 yr). The mean pure-tone average at 0.5, 1, and 2 kHz was 57 dB hearing level. Multiple linear regression analysis showed significant progression for all measured AC and BC thresholds. The average annual threshold deterioration for AC was 0.45 dB/yr and the annual threshold deterioration for BC was 0.37 dB/yr. The average annual gap expansion was 0.08 dB/yr. The BC thresholds corrected for the Carhart effect and presbyacusis remained significantly different from zero, but only showed progression at 2 kHz. The preoperative audiological profile of otosclerosis is described. There is a significant sensorineural component in patients with otosclerosis planned for stapedotomy, which is worse than age-related hearing loss by itself. Deterioration rates of AC and BC thresholds have been reported, which can be helpful in clinical practice and might also guide the characterization of allegedly different phenotypes for familial and sporadic otosclerosis.

  19. Multiplicity Counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geist, William H.

    2015-12-01

    This set of slides begins by giving background and a review of neutron counting; three attributes of a verification item are discussed: 240Pu eff mass; α, the ratio of (α,n) neutrons to spontaneous fission neutrons; and leakage multiplication. It then takes up neutron detector systems – theory & concepts (coincidence counting, moderation, die-away time); detector systems – some important details (deadtime, corrections); introduction to multiplicity counting; multiplicity electronics and example distributions; singles, doubles, and triples from measured multiplicity distributions; and the point model: multiplicity mathematics.
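
    A hedged sketch of one step mentioned in the slides, going from a measured multiplicity distribution to singles-, doubles-, and triples-like quantities via reduced factorial moments; the efficiency, gate-fraction, deadtime, and die-away corrections of the full point model are deliberately omitted here:

      # Reduced factorial moments of a measured multiplicity histogram
      # (hypothetical counts of gates containing 0..6 detected neutrons).
      import numpy as np

      def factorial_moments(counts):
          counts = np.asarray(counts, dtype=float)
          n = np.arange(counts.size)
          p = counts / counts.sum()                       # normalized multiplicity distribution
          m1 = np.sum(n * p)                              # singles-like (mean counts per gate)
          m2 = np.sum(n * (n - 1) / 2.0 * p)              # doubles-like (pairs per gate)
          m3 = np.sum(n * (n - 1) * (n - 2) / 6.0 * p)    # triples-like (triplets per gate)
          return m1, m2, m3

      hist = [52000, 21000, 6400, 1500, 310, 55, 8]
      print(factorial_moments(hist))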

  20. 16 CFR 1209.37 - Corrective actions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    16 CFR Part 1209, Safety Standard for Cellulose Insulation, Certification, § 1209.37 Corrective actions. (a) Test failure. When any test required by § 1209.36 yields failing or unacceptable results, corrective action must be...

  1. Atmospheric Correction Algorithm for Hyperspectral Imagery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. J. Pollina

    1999-09-01

    In December 1997, the US Department of Energy (DOE) established a Center of Excellence (Hyperspectral-Multispectral Algorithm Research Center, HyMARC) for promoting the research and development of algorithms to exploit spectral imagery. This center is located at the DOE Remote Sensing Laboratory in Las Vegas, Nevada, and is operated for the DOE by Bechtel Nevada. This paper presents the results to date of a research project begun at the center during 1998 to investigate the correction of hyperspectral data for atmospheric aerosols. Results of a project conducted by the Rochester Institute of Technology to define, implement, and test procedures for absolute calibration and correction of hyperspectral data to absolute units of high spectral resolution imagery will be presented. Hybrid techniques for atmospheric correction using image or spectral scene data coupled through radiative propagation models will be specifically addressed. Results of this effort to analyze HYDICE sensor data will be included. Preliminary results based on studying the performance of standard routines, such as Atmospheric Pre-corrected Differential Absorption and Nonlinear Least Squares Spectral Fit, in retrieving reflectance spectra show overall reflectance retrieval errors of approximately one to two reflectance units in the 0.4- to 2.5-micron-wavelength region (outside of the absorption features). These results are based on HYDICE sensor data collected from the Southern Great Plains Atmospheric Radiation Measurement site during overflights conducted in July of 1997. Results of an upgrade made in the model-based atmospheric correction techniques, which take advantage of updates made to the moderate resolution atmospheric transmittance model (MODTRAN 4.0) software, will also be presented. Data will be shown to demonstrate how the reflectance retrieval in the shorter wavelengths of the blue-green region will be improved because of enhanced modeling of multiple scattering effects.

  2. Test of the Angle Detecting Inclined Sensor (ADIS) Technique for Measuring Space Radiation

    NASA Astrophysics Data System (ADS)

    Connell, J. J.; Lopate, C.; McLaughlin, K. R.

    2008-12-01

    In February 2008 we exposed an Angle Detecting Inclined Sensor (ADIS) prototype to beams of 150 MeV/u 78Kr and fragments at the National Superconducting Cyclotron Laboratory's (NSCL) Coupled Cyclotron Facility (CCF). ADIS is a highly innovative and uniquely simple detector configuration used to determine the angles of incidence of heavy ions in energetic charged particle instruments. Corrections for angle of incidence are required for good charge and mass separation. An ADIS instrument is under development to fly on the GOES-R series of weather satellites. The prototype tested consisted of three ADIS detectors, two of which were inclined at an angle to the telescope axis, forming the initial detectors in a five-detector telescope stack. By comparing the signals from the ADIS detectors, the angle of incidence may be determined and a pathlength correction applied to charge and mass determinations. Thus, ADIS replaces complex position sensing detectors with a system of simple, reliable and robust Si detectors. Accelerator data were taken at multiple angles to both primary and secondary beams with a spread of energies. This test instrument represents an improvement over the previous ADIS prototype in that it used oval inclined detectors and a much lower-mass support structure, thus reducing the number of events passing through dead material. We will present the results of this test. The ADIS instrument development project was partially funded by NASA under the Living With a Star (LWS) Targeted Research and Technology program (grant NAG5-12493).

  3. Flexible rotor balancing by the influence coefficient method: Multiple critical speeds with rigid or flexible supports

    NASA Technical Reports Server (NTRS)

    Tessarzik, J. M.

    1975-01-01

    Experimental tests were conducted to demonstrate the ability of the influence coefficient method to achieve precise balance of flexible rotors of virtually any design for operation through virtually any speed range. Various practical aspects of flexible-rotor balancing were investigated. Tests were made on a laboratory quality machine having a 122 cm (48 in.) long rotor weighing 50 kg (110 lb) and covering a speed range up to 18000 rpm. The balancing method was in every instance effective, practical, and economical and permitted safe rotor operation over the full speed range covering four rotor bending critical speeds. Improved correction weight removal methods for rotor balancing were investigated. Material removal from a rotating disk was demonstrated through application of a commercially available laser.
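
    A minimal sketch of the influence coefficient calculation itself: vibration readings are treated as complex phasors (amplitude and phase), trial-weight runs define an influence matrix, and the correction weights are the least-squares solution that best cancels the baseline vibration. All numbers below are hypothetical:

      # Influence coefficient balancing: solve A @ w ~= -v0 in the least-squares sense.
      import numpy as np

      # Baseline vibration at three probe/speed combinations (complex phasors).
      v0 = np.array([1.2 + 0.4j, 0.8 - 0.6j, 0.3 + 0.9j])

      # Influence coefficients: response per unit trial weight in two balance planes.
      A = np.array([[0.50 + 0.10j, 0.20 - 0.05j],
                    [0.15 - 0.30j, 0.40 + 0.20j],
                    [0.05 + 0.25j, 0.30 - 0.10j]])

      w, *_ = np.linalg.lstsq(A, -v0, rcond=None)     # complex correction weights
      residual = v0 + A @ w                           # predicted vibration after correction
      print("weight magnitudes:", np.abs(w))
      print("weight angles (deg):", np.degrees(np.angle(w)))
      print("predicted residual vibration:", np.abs(residual))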

  4. Parametric, nonparametric and parametric modelling of a chaotic circuit time series

    NASA Astrophysics Data System (ADS)

    Timmer, J.; Rust, H.; Horbelt, W.; Voss, H. U.

    2000-09-01

    The determination of a differential equation underlying a measured time series is a frequently arising task in nonlinear time series analysis. In the validation of a proposed model one often faces the dilemma that it is hard to decide whether possible discrepancies between the time series and model output are caused by an inappropriate model or by bad estimates of parameters in a correct type of model, or both. We propose a combination of parametric modelling based on Bock's multiple shooting algorithm and nonparametric modelling based on optimal transformations as a strategy to test proposed models and if rejected suggest and test new ones. We exemplify this strategy on an experimental time series from a chaotic circuit where we obtain an extremely accurate reconstruction of the observed attractor.

  5. Effects of desmopressin on platelet function under conditions of hypothermia and acidosis: an in vitro study using multiple electrode aggregometry*.

    PubMed

    Hanke, A A; Dellweg, C; Kienbaum, P; Weber, C F; Görlinger, K; Rahe-Meyer, N

    2010-07-01

    Hypothermia and acidosis lead to an impairment of coagulation. It has been demonstrated that desmopressin improves platelet function under hypothermia. We tested platelet function ex vivo during hypothermia and acidosis. Blood samples were taken from 12 healthy subjects and assigned as follows: normal pH, pH 7.2, and pH 7.0, each with and without incubation with desmopressin. Platelet aggregation was assessed by multiple electrode aggregometry. Baseline was normal pH and 36 degrees C. The other samples were incubated for 30 min and measured at 32 degrees C. Acidosis significantly impaired aggregation. Desmopressin significantly increased aggregability during hypothermia and acidosis regardless of pH, but did not return it to normal values at low pH. During acidosis and hypothermia, acidosis should be corrected first; desmopressin can then be administered to improve platelet function as a bridge until normothermia can be achieved.

  6. Visual feature integration and focused attention: response competition from multiple distractor features.

    PubMed

    Lavie, N

    1997-05-01

    Predictions from Treisman's feature integration theory of attention were tested in a variant of the response-competition paradigm. Subjects made choice responses to particular color-shape conjunctions (e.g., a purple cross vs. a green circle) while withholding their responses to the opposite conjunctions (i.e., a purple circle vs. a green cross). The results showed that compatibility effects were based on both distractor color and shape. For unattended distractors in preknown irrelevant positions, compatibility effects were equivalent for conjunctive distractors (e.g., a purple cross and a blue triangle) and for disjunctive distractors (e.g., a purple triangle and a blue cross). Manipulation of attention to the distractors' positions resulted in larger compatibility effects from conjoined features. These results accord with Treisman's claim that correct conjunction information is unavailable under conditions of inattention, and they provide new information on response-competition effects from multiple features.

  7. Teaching physical activities to students with significant disabilities using video modeling.

    PubMed

    Cannella-Malone, Helen I; Mizrachi, Sharona V; Sabielny, Linsey M; Jimenez, Eliseo D

    2013-06-01

    The objective of this study was to examine the effectiveness of video modeling on teaching physical activities to three adolescents with significant disabilities. The study implemented a multiple baseline across six physical activities (three per student): jumping rope, scooter board with cones, ladder drill (i.e., feet going in and out), ladder design (i.e., multiple steps), shuttle run, and disc ride. Additional prompt procedures (i.e., verbal, gestural, visual cues, and modeling) were implemented within the study. After the students mastered the physical activities, we tested to see if they would link the skills together (i.e., complete an obstacle course). All three students made progress learning the physical activities, but only one learned them with video modeling alone (i.e., without error correction). Video modeling can be an effective tool for teaching students with significant disabilities various physical activities, though additional prompting procedures may be needed.

  8. Analysis on trust influencing factors and trust model from multiple perspectives of online Auction

    NASA Astrophysics Data System (ADS)

    Yu, Wang

    2017-10-01

    Current reputation models lack research on online auction trading, so they cannot fully reflect the reputation status of users and may raise problems of operability. To evaluate user trust in online auctions correctly, a trust computing model based on multiple influencing factors is established. It aims to overcome the inefficiency of current trust computing methods and the limitations of traditional theoretical trust models. The improved model comprehensively considers the trust degree evaluation factors of three types of participants according to the different participation modes of online auctioneers, in order to improve the accuracy, effectiveness and robustness of the trust degree. The experiments test the efficiency and performance of the model under different proportions of malicious users, in environments resembling eBay and the Sporas model. The experimental analysis shows that the proposed model makes up for the deficiencies of existing models and has better feasibility.

  9. A multiple choice testing program coupled with a year-long elective experience is associated with improved performance on the internal medicine in-training examination.

    PubMed

    Mathis, Bradley R; Warm, Eric J; Schauer, Daniel P; Holmboe, Eric; Rouan, Gregory W

    2011-11-01

    The Internal Medicine In-Training Exam (IM-ITE) assesses the content knowledge of internal medicine trainees. Many programs use the IM-ITE to counsel residents, to create individual remediation plans, and to make fundamental programmatic and curricular modifications. To assess the association between a multiple-choice testing program administered during 12 consecutive months of ambulatory and inpatient elective experience and IM-ITE percentile scores in third post-graduate year (PGY-3) categorical residents. Retrospective cohort study. One hundred and four categorical internal medicine residents. Forty-five residents in the 2008 and 2009 classes participated in the study group, and the 59 residents in the three classes that preceded the use of the testing program, 2005-2007, served as controls. A comprehensive, elective rotation specific, multiple-choice testing program and a separate board review program, both administered during a continuous long-block elective experience during the twelve months between the second post-graduate year (PGY-2) and PGY-3 in-training examinations. We analyzed the change in median individual percent correct and percentile scores between the PGY-1 and PGY-2 IM-ITE and between the PGY-2 and PGY-3 IM-ITE in both control and study cohorts. For our main outcome measure, we compared the change in median individual percentile rank between the control and study cohorts between the PGY-2 and the PGY-3 IM-ITE testing opportunities. After experiencing the educational intervention, the study group demonstrated a significant increase in median individual IM-ITE percentile score between PGY-2 and PGY-3 examinations of 8.5 percentile points (p < 0.01). This is significantly better than the increase of 1.0 percentile point seen in the control group between its PGY-2 and PGY-3 examination (p < 0.01). A comprehensive multiple-choice testing program aimed at PGY-2 residents during a 12-month continuous long-block elective experience is associated with improved PGY-3 IM-ITE performance.

  10. A Point Kinetics Model for Estimating Neutron Multiplication of Bare Uranium Metal in Tagged Neutron Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tweardy, Matthew C.; McConchie, Seth; Hayward, Jason P.

    An extension of the point kinetics model is developed in this paper to describe the neutron multiplicity response of a bare uranium object under interrogation by an associated particle imaging deuterium-tritium (D-T) measurement system. This extended model is used to estimate the total neutron multiplication of the uranium. Both MCNPX-PoliMi simulations and data from active interrogation measurements of highly enriched and depleted uranium geometries are used to evaluate the potential of this method and to identify the sources of systematic error. The detection efficiency correction for measured coincidence response is identified as a large source of systematic error. If the detection process is not considered, results suggest that the method can estimate total multiplication to within 13% of the simulated value. Values for multiplicity constants in the point kinetics equations are sensitive to enrichment due to (n, xn) interactions by D-T neutrons and can introduce another significant source of systematic bias. This can theoretically be corrected if isotopic composition is known a priori. Finally, the spatial dependence of multiplication is also suspected of introducing further systematic bias for high multiplication uranium objects.

  11. A Point Kinetics Model for Estimating Neutron Multiplication of Bare Uranium Metal in Tagged Neutron Measurements

    DOE PAGES

    Tweardy, Matthew C.; McConchie, Seth; Hayward, Jason P.

    2017-06-13

    An extension of the point kinetics model is developed in this paper to describe the neutron multiplicity response of a bare uranium object under interrogation by an associated particle imaging deuterium-tritium (D-T) measurement system. This extended model is used to estimate the total neutron multiplication of the uranium. Both MCNPX-PoliMi simulations and data from active interrogation measurements of highly enriched and depleted uranium geometries are used to evaluate the potential of this method and to identify the sources of systematic error. The detection efficiency correction for measured coincidence response is identified as a large source of systematic error. If the detection process is not considered, results suggest that the method can estimate total multiplication to within 13% of the simulated value. Values for multiplicity constants in the point kinetics equations are sensitive to enrichment due to (n, xn) interactions by D-T neutrons and can introduce another significant source of systematic bias. This can theoretically be corrected if isotopic composition is known a priori. Finally, the spatial dependence of multiplication is also suspected of introducing further systematic bias for high multiplication uranium objects.

  12. Incorporating gene-environment interaction in testing for association with rare genetic variants.

    PubMed

    Chen, Han; Meigs, James B; Dupuis, Josée

    2014-01-01

    The incorporation of gene-environment interactions could improve the ability to detect genetic associations with complex traits. For common genetic variants, single-marker interaction tests and joint tests of genetic main effects and gene-environment interaction have been well-established and used to identify novel association loci for complex diseases and continuous traits. For rare genetic variants, however, single-marker tests are severely underpowered due to the low minor allele frequency, and only a few gene-environment interaction tests have been developed. We aimed at developing powerful and computationally efficient tests for gene-environment interaction with rare variants. In this paper, we propose interaction and joint tests for testing gene-environment interaction of rare genetic variants. Our approach is a generalization of existing gene-environment interaction tests for multiple genetic variants under certain conditions. We show in our simulation studies that our interaction and joint tests have correct type I errors, and that the joint test is a powerful approach for testing genetic association, allowing for gene-environment interaction. We also illustrate our approach in a real data example from the Framingham Heart Study. Our approach can be applied to both binary and continuous traits, it is powerful and computationally efficient.
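
    For a single common variant, the joint test of genetic main effect and gene-environment interaction can be illustrated as a 2-degree-of-freedom Wald test in a logistic model; the set-based rare-variant tests proposed in the paper are more elaborate, and the data below are simulated purely for illustration:

      # Joint 2-df Wald test of genetic main effect and GxE interaction
      # in a logistic regression (simulated data).
      import numpy as np
      from scipy import stats
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 2000
      g = rng.binomial(2, 0.05, size=n)                 # genotype (0/1/2)
      e = rng.normal(size=n)                            # environmental exposure
      logit = -2.0 + 0.4 * g + 0.2 * e + 0.5 * g * e
      y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit))) # binary trait

      X = sm.add_constant(np.column_stack([g, e, g * e]))
      fit = sm.Logit(y, X).fit(disp=0)

      idx = [1, 3]                                      # G and GxE coefficients
      b = fit.params[idx]
      V = fit.cov_params()[np.ix_(idx, idx)]
      wald = float(b @ np.linalg.solve(V, b))           # joint Wald statistic
      p_joint = stats.chi2.sf(wald, df=2)
      print(f"joint Wald = {wald:.2f}, p = {p_joint:.3g}")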

  13. Inflammatory cytokines in major depressive disorder: A case-control study.

    PubMed

    Cassano, Paolo; Bui, Eric; Rogers, Andrew H; Walton, Zandra E; Ross, Rachel; Zeng, Mary; Nadal-Vicens, Mireya; Mischoulon, David; Baker, Amanda W; Keshaviah, Aparna; Worthington, John; Hoge, Elizabeth A; Alpert, Jonathan; Fava, Maurizio; Wong, Kwok K; Simon, Naomi M

    2017-01-01

    There is mixed evidence in the literature on the role of inflammation in major depressive disorder. Contradictory findings are attributed to lack of rigorous characterization of study subjects, to the presence of concomitant medical illnesses, to the small sample sizes, and to the limited number of cytokines tested. Subjects aged 18-70 years, diagnosed with major depressive disorder and presenting with chronic course of illness, as well as matched controls ( n = 236), were evaluated by trained raters and provided blood for cytokine measurements. Cytokine levels in EDTA plasma were measured with the MILLIPLEX Multi-Analyte Profiling Human Cytokine/Chemokine Assay employing Luminex technology. The Wilcoxon rank-sum test was used to compare cytokine levels between major depressive disorder subjects and healthy volunteers, before (interleukin [IL]-1β, IL-6, and tumor necrosis factor-α) and after Bonferroni correction for multiple comparisons (IL-1α, IL-2, IL-3, IL-4, IL-5, IL-7, IL-8, IL-10, IL-12(p40), IL-12(p70), IL-13, IL-15, IFN-γ-inducible protein 10, Eotaxin, interferon-γ, monotype chemoattractant protein-1, macrophage inflammatory protein-1α, granulocyte-macrophage colony-stimulating factor and vascular endothelial growth factor). There were no significant differences in cytokine levels between major depressive disorder subjects and controls, both prior to and after correction for multiple analyses (significance set at p ⩽ 0.05 and p ⩽ 0.002, respectively). Our well-characterized examination of cytokine plasma levels did not support the association of major depressive disorder with systemic inflammation. The heterogeneity of major depressive disorder, as well as a potential sampling bias selecting for non-inflammatory depression, might have determined our findings discordant with the literature.

  14. Identification of FGF7 as a novel susceptibility locus for chronic obstructive pulmonary disease.

    PubMed

    Brehm, John M; Hagiwara, Koichi; Tesfaigzi, Yohannes; Bruse, Shannon; Mariani, Thomas J; Bhattacharya, Soumyaroop; Boutaoui, Nadia; Ziniti, John P; Soto-Quiros, Manuel E; Avila, Lydiana; Cho, Michael H; Himes, Blanca; Litonjua, Augusto A; Jacobson, Francine; Bakke, Per; Gulsvik, Amund; Anderson, Wayne H; Lomas, David A; Forno, Erick; Datta, Soma; Silverman, Edwin K; Celedón, Juan C

    2011-12-01

    Traditional genome-wide association studies (GWASs) of large cohorts of subjects with chronic obstructive pulmonary disease (COPD) have successfully identified novel candidate genes, but several other plausible loci do not meet strict criteria for genome-wide significance after correction for multiple testing. The authors hypothesise that by applying unbiased weights derived from unique populations, additional COPD susceptibility loci can be identified. The authors performed a homozygosity haplotype analysis on a group of subjects with and without COPD to identify regions of conserved homozygosity haplotype (RCHHs). Weights were constructed based on the frequency of these RCHHs in cases versus controls, and used to adjust the p values from a large collaborative GWAS of COPD. The authors identified 2318 RCHHs, of which 576 were significantly (p<0.05) over-represented in cases. After applying the weights constructed from these regions to a collaborative GWAS of COPD, the authors identified two single nucleotide polymorphisms (SNPs) in a novel gene (fibroblast growth factor-7 (FGF7)) that gained genome-wide significance by the false discovery rate method. In a follow-up analysis, both SNPs (rs12591300 and rs4480740) were significantly associated with COPD in an independent population (combined p values of 7.9E-7 and 2.8E-6, respectively). In another independent population, increased lung tissue FGF7 expression was associated with worse measures of lung function. Weights constructed from a homozygosity haplotype analysis of an isolated population successfully identify novel genetic associations from a GWAS on a separate population. This method can be used to identify promising candidate genes that fail to meet strict correction for multiple testing.
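
    A generic weighted false-discovery-rate sketch in the spirit of the weighting described above: GWAS p-values are divided by prior weights (rescaled to average one) and the Benjamini-Hochberg step-up rule is applied. The weights and p-values are hypothetical, and this is not the authors' exact procedure:

      # Weighted Benjamini-Hochberg: divide p-values by mean-one weights, then
      # apply the usual step-up rule at level alpha.
      import numpy as np

      def weighted_bh(p, weights, alpha=0.05):
          w = np.asarray(weights, dtype=float)
          w = w / w.mean()                               # weights must average to one
          q = np.asarray(p, dtype=float) / w             # weighted p-values
          order = np.argsort(q)
          m = len(q)
          thresh = alpha * np.arange(1, m + 1) / m       # BH step-up thresholds
          passed = q[order] <= thresh
          k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
          rejected = np.zeros(m, dtype=bool)
          rejected[order[:k]] = True                     # reject the k smallest weighted p-values
          return rejected

      p = [2.1e-7, 4.5e-6, 3.0e-4, 0.02, 0.31]           # hypothetical GWAS p-values
      w = [2.0, 2.0, 0.5, 1.0, 0.5]                      # hypothetical RCHH-based weights
      print(weighted_bh(p, w))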

  15. Recalibration of blood analytes over 25 years in the Atherosclerosis Risk in Communities Study: The impact of recalibration on chronic kidney disease prevalence and incidence

    PubMed Central

    Parrinello, Christina M.; Grams, Morgan E.; Couper, David; Ballantyne, Christie M.; Hoogeveen, Ron C.; Eckfeldt, John H.; Selvin, Elizabeth; Coresh, Josef

    2016-01-01

    Background Equivalence of laboratory tests over time is important for longitudinal studies. Even a small systematic difference (bias) can result in substantial misclassification. Methods We selected 200 Atherosclerosis Risk in Communities Study participants attending all 5 study visits over 25 years. Eight analytes were re-measured in 2011–13 from stored blood samples from multiple visits: creatinine, uric acid, glucose, total cholesterol, HDL-cholesterol, LDL-cholesterol, triglycerides, and high-sensitivity C-reactive protein. Original values were recalibrated to re-measured values using Deming regression. Differences >10% were considered to reflect substantial bias, and correction equations were applied to affected analytes in the total study population. We examined trends in chronic kidney disease (CKD) pre- and post-recalibration. Results Repeat measures were highly correlated with original values (Pearson’s r>0.85 after removing outliers [median 4.5% of paired measurements]), but 2 of 8 analytes (creatinine and uric acid) had differences >10%. Original values of creatinine and uric acid were recalibrated to current values using correction equations. CKD prevalence differed substantially after recalibration of creatinine (visits 1, 2, 4 and 5 pre-recalibration: 21.7%, 36.1%, 3.5%, 29.4%; post-recalibration: 1.3%, 2.2%, 6.4%, 29.4%). For HDL-cholesterol, the current direct enzymatic method differed substantially from magnesium dextran precipitation used during visits 1–4. Conclusions Analytes re-measured in samples stored for ~25 years were highly correlated with original values, but two of the 8 analytes showed substantial bias at multiple visits. Laboratory recalibration improved reproducibility of test results across visits and resulted in substantial differences in CKD prevalence. We demonstrate the importance of consistent recalibration of laboratory assays in a cohort study. PMID:25952043
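
    A brief sketch of a Deming-regression correction equation of the kind described, assuming equal measurement error variance in the original and re-measured values (delta = 1); the creatinine-like numbers are hypothetical:

      # Deming regression of re-measured on original values, then a correction equation.
      import numpy as np

      def deming(x, y, delta=1.0):
          x, y = np.asarray(x, float), np.asarray(y, float)
          sxx, syy = np.var(x, ddof=1), np.var(y, ddof=1)
          sxy = np.cov(x, y, ddof=1)[0, 1]
          slope = (syy - delta * sxx + np.sqrt((syy - delta * sxx) ** 2
                   + 4 * delta * sxy ** 2)) / (2 * sxy)
          intercept = y.mean() - slope * x.mean()
          return slope, intercept

      original   = [0.7, 0.9, 1.1, 1.4, 1.8, 2.3]        # original-visit assay (mg/dL)
      remeasured = [0.8, 1.0, 1.3, 1.6, 2.1, 2.6]        # 2011-13 re-measurement (mg/dL)
      slope, intercept = deming(original, remeasured)
      print(f"recalibrated = {slope:.3f} * original + {intercept:.3f}")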

  16. Inflammatory cytokines in major depressive disorder: A case–control study

    PubMed Central

    Cassano, Paolo; Bui, Eric; Rogers, Andrew H; Walton, Zandra E; Ross, Rachel; Zeng, Mary; Nadal-Vicens, Mireya; Mischoulon, David; Baker, Amanda W; Keshaviah, Aparna; Worthington, John; Hoge, Elizabeth A; Alpert, Jonathan; Fava, Maurizio; Wong, Kwok K; Simon, Naomi M

    2017-01-01

    Introduction There is mixed evidence in the literature on the role of inflammation in major depressive disorder. Contradictory findings are attributed to lack of rigorous characterization of study subjects, to the presence of concomitant medical illnesses, to the small sample sizes, and to the limited number of cytokines tested. Methods Subjects aged 18–70 years, diagnosed with major depressive disorder and presenting with chronic course of illness, as well as matched controls (n = 236), were evaluated by trained raters and provided blood for cytokine measurements. Cytokine levels in EDTA plasma were measured with the MILLIPLEX Multi-Analyte Profiling Human Cytokine/Chemokine Assay employing Luminex technology. The Wilcoxon rank-sum test was used to compare cytokine levels between major depressive disorder subjects and healthy volunteers, before (interleukin [IL]-1 β, IL-6, and tumor necrosis factor-α) and after Bonferroni correction for multiple comparisons (IL-1α, IL-2, IL-3, IL-4, IL-5, IL-7, IL-8, IL-10, IL-12(p40), IL-12(p70), IL-13, IL-15, IFN-γ-inducible protein 10, Eotaxin, interferon-γ, monotype chemoattractant protein-1, macrophage inflammatory protein-1α, granulocyte-macrophage colony-stimulating factor and vascular endothelial growth factor). Results There were no significant differences in cytokine levels between major depressive disorder subjects and controls, both prior to and after correction for multiple analyses (significance set at p ≤ 0.05 and p ≤ 0.002, respectively). Conclusion Our well-characterized examination of cytokine plasma levels did not support the association of major depressive disorder with systemic inflammation. The heterogeneity of major depressive disorder, as well as a potential sampling bias selecting for non-inflammatory depression, might have determined our findings discordant with the literature. PMID:27313138

  17. Genetic variation in the endocannabinoid system and response to Cognitive Behavior Therapy for child anxiety disorders

    PubMed Central

    Coleman, Jonathan R. I.; Roberts, Susanna; Keers, Robert; Breen, Gerome; Bögels, Susan; Creswell, Cathy; Hudson, Jennifer L.; McKinnon, Anna; Nauta, Maaike; Rapee, Ronald M.; Schneider, Silvia; Silverman, Wendy K.; Thastum, Mikael; Waite, Polly; Wergeland, Gro Janne H.; Eley, Thalia C.

    2016-01-01

    Extinction learning is an important mechanism in the successful psychological treatment of anxiety. Individual differences in response and relapse following Cognitive Behavior Therapy may in part be explained by variability in the ease with which fears are extinguished or the vulnerability of these fears to re‐emerge. Given the role of the endocannabinoid system in fear extinction, this study investigates whether genetic variation in the endocannabinoid system explains individual differences in response to CBT. Children (N = 1,309) with a primary anxiety disorder diagnosis were recruited. We investigated the relationship between variation in the CNR1, CNR2, and FAAH genes and change in primary anxiety disorder severity between pre‐ and post‐treatment and during the follow‐up period in the full sample and a subset with fear‐based anxiety disorder diagnoses. Change in symptom severity during active treatment was nominally associated (P < 0.05) with two SNPs. During the follow‐up period, five SNPs were nominally associated with a poorer treatment response (rs806365 [CNR1]; rs2501431 [CNR2]; rs2070956 [CNR2]; rs7769940 [CNR1]; rs2209172 [FAAH]) and one with a more favorable response (rs6928813 [CNR1]). Within the fear‐based subset, the effect of rs806365 survived multiple testing corrections (P < 0.0016). We found very limited evidence for an association between variants in endocannabinoid system genes and treatment response once multiple testing corrections were applied. Larger, more homogenous cohorts are needed to allow the identification of variants of small but statistically significant effect and to estimate effect sizes for these variants with greater precision in order to determine their potential clinical utility. © 2016 The Authors. American Journal of Medical Genetics Part B: Neuropsychiatric Genetics Published by Wiley Periodicals, Inc. PMID:27346075

  18. Genetic variants in VEGF pathway genes in neoadjuvant breast cancer patients receiving bevacizumab: Results from the randomized phase III GeparQuinto study.

    PubMed

    Hein, Alexander; Lambrechts, Diether; von Minckwitz, Gunter; Häberle, Lothar; Eidtmann, Holger; Tesch, Hans; Untch, Michael; Hilfrich, Jörn; Schem, Christian; Rezai, Mahdi; Gerber, Bernd; Dan Costa, Serban; Blohmer, Jens-Uwe; Schwedler, Kathrin; Kittel, Kornelia; Fehm, Tanja; Kunz, Georg; Beckmann, Matthias W; Ekici, Arif B; Hanusch, Claus; Huober, Jens; Liedtke, Cornelia; Mau, Christine; Moisse, Matthieu; Müller, Volkmar; Nekljudova, Valentina; Peuteman, Gilian; Rack, Brigitte; Rübner, Matthias; Van Brussel, Thomas; Wang, Liewei; Weinshilboum, Richard M; Loibl, Sibylle; Fasching, Peter A

    2015-12-15

    Studies assessing the effect of bevacizumab (BEV) on breast cancer (BC) outcome have shown different effects on progression-free and overall survival, suggesting that a subgroup of patients may benefit from this treatment. Unfortunately, no biomarkers exist to identify these patients. Here, we investigate whether single nucleotide polymorphisms (SNPs) in VEGF pathway genes correlate with pathological complete response (pCR) in the neoadjuvant GeparQuinto trial. HER2-negative patients were randomized into treatment arms receiving either BEV combined with standard chemotherapy or chemotherapy alone. In a pre-planned biomarker study, DNA was collected from 729 and 724 patients, respectively from both treatment arms, and genotyped for 125 SNPs. Logistic regression assessed interaction between individual SNPs and both treatment arms to predict pCR. Five SNPs may be associated with a better response to BEV, but none of them remained significant after correction for multiple testing. The two SNPs most strongly associated, rs833058 and rs699947, were located upstream of the VEGF-A promoter. Odds ratios for the homozygous common, heterozygous and homozygous rare rs833058 genotypes were 2.36 (95% CI, 1.49-3.75), 1.20 (95% CI, 0.88-1.64) and 0.61 (95% CI, 0.34-1.12). Notably, some SNPs in VEGF-A exhibited a more pronounced effect in the triple-negative subgroup. Several SNPs in VEGF-A may be associated with improved pCR when receiving BEV in the neoadjuvant setting. Although none of the observed effects survived correction for multiple testing, our observations are consistent with previous studies on BEV efficacy in BC. Further research is warranted to clarify the predictive value of these markers. © 2015 UICC.

  19. Circadian CLOCK gene polymorphisms in relation to sleep patterns and obesity in African Americans: findings from the Jackson heart study.

    PubMed

    Riestra, Pia; Gebreab, Samson Y; Xu, Ruihua; Khan, Rumana J; Gaye, Amadou; Correa, Adolfo; Min, Nancy; Sims, Mario; Davis, Sharon K

    2017-06-23

    Circadian rhythms regulate key biological processes and the dysregulation of the intrinsic clock mechanism affects sleep patterns and obesity onset. The CLOCK (circadian locomotor output cycles protein kaput) gene encodes a core transcription factor of the molecular circadian clock influencing diverse metabolic pathways, including glucose and lipid homeostasis. The primary objective of this study was to evaluate the associations between CLOCK single nucleotide polymorphisms (SNPs) and body mass index (BMI). We also evaluated the association of SNPs with BMI-related factors such as sleep duration and quality, adiponectin and leptin, in 2962 participants (1116 men and 1810 women) from the Jackson Heart Study. Genotype data for the selected 23 CLOCK gene SNPs were obtained by imputation with IMPUTE2 software and reference phase data from the 1000 Genomes Project. Genetic analyses were conducted with PLINK. We found a significant association between the CLOCK SNP rs2070062 and sleep duration: carriers of the T allele showed significantly shorter sleep duration than non-carriers after adjustment for individual proportions of European ancestry (PEA), socio-economic status (SES), body mass index (BMI), alcohol consumption and smoking status, an association that reached the significance threshold after multiple testing correction. In addition, we found nominal associations of the CLOCK SNP rs6853192 with longer sleep duration and of rs6820823, rs3792603 and rs11726609 with BMI. However, these associations did not reach the significance threshold after correction for multiple testing. In this work, CLOCK gene variants were associated with sleep duration and BMI, suggesting that the effects of these polymorphisms on circadian rhythmicity may affect sleep duration and body weight regulation in African Americans.

  20. Genetic variants in endotoxin signalling pathway, domestic endotoxin exposure and asthma exacerbations.

    PubMed

    Kljaic-Bukvic, Blazenka; Blekic, Mario; Aberle, Neda; Curtin, John A; Hankinson, Jenny; Semic-Jusufagic, Aida; Belgrave, Danielle; Simpson, Angela; Custovic, Adnan

    2014-10-01

    We investigated the interaction between genetic variants in the endotoxin signalling pathway and domestic endotoxin exposure in relation to asthma presence, and amongst children with asthma, we explored the association of these genetic variants and endotoxin exposure with hospital admissions due to asthma exacerbations. In a case-control study, we analysed data from 824 children (417 asthmatics, 407 controls; age 5-18 yr). Amongst asthmatics, we extracted data on hospitalization for asthma exacerbation from medical records. Endotoxin exposure was measured in dust samples collected from homes. We included 26 single-nucleotide polymorphisms (SNPs) in the final analysis (5 in CD14, 7 in LY96 and 14 in TLR4). Two variants remained significantly associated with hospital admissions with asthma exacerbations after correction for multiple testing: for CD14 SNP rs5744455, carriers of the T allele had a decreased risk of repeated hospital admissions compared with homozygotes for the C allele [OR (95% CI), 0.42 (0.25-0.88), p = 0.01, False Discovery Rate (FDR) p = 0.02]; for LY96 SNP rs17226566, C-allele carriers were at a lower risk of hospital admissions compared with T-allele homozygotes [0.59 (0.38-0.90), p = 0.01, FDR p = 0.04]. We observed two interactions between SNPs in CD14 and LY96 with environmental endotoxin exposure in relation to hospital admissions due to asthma exacerbation which remained significant after correction for multiple testing (CD14 SNP rs2915863 and LY96 SNP rs17226566). Amongst children with asthma, genetic variants in CD14 and LY96 may increase the risk of hospital admissions with acute exacerbations. Polymorphisms in the endotoxin pathway interact with domestic endotoxin exposure to further modify the risk of hospitalization. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. Differences in fecal microbial metabolites and microbiota of children with autism spectrum disorders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kang, Dae-Wook; Ilhan, Zehra Esra; Isern, Nancy G.

    Evidence supporting that gut problems are linked to ASD symptoms has been accumulating in both humans and animal models of ASD. Gut microbes and their metabolites may be linked not only to GI problems but also to ASD behavioral symptoms. Despite this high interest, most previous studies have looked mainly at microbial structure, and studies on fecal metabolites are rare in the context of ASD. Thus, we aimed to detect fecal metabolites that may be present at significantly different concentrations between 21 children with ASD and 23 neurotypical children and to investigate their possible link to the human gut microbiome. Using NMR spectroscopy and 16S rRNA gene amplicon sequencing, we examined metabolite profiles and microbial compositions in fecal samples, respectively. Of the 59 metabolites detected, isopropanol concentrations were significantly higher in feces of children with ASD after multiple testing corrections. We also observed trends similar to previous studies: children with ASD have higher fecal p-cresol and possibly lower GABA concentrations. In addition, Fisher Discriminant Analysis (FDA) with leave-out validation suggested that a group of metabolites (caprate, nicotinate, glutamine, thymine, and aspartate) may potentially function as a biomarker to separate ASD participants from the neurotypical group (78% sensitivity and 81% specificity). Consistent with our previous Arizona cohort study, we also confirmed lower gut microbial diversity and reduced relative abundances of Prevotella copri in children with ASD. After multiple testing corrections, we also found that relative abundances of Faecalibacterium prausnitzii and Haemophilus parainfluenzae were lower in feces of children with ASD. Despite a relatively short list of fecal metabolites, the data in this study support that children with ASD have altered metabolite profiles in feces when compared with neurotypical children and warrant further investigation of metabolites in larger cohorts.
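
    A leave-one-out linear discriminant classifier on a small metabolite panel gives a hedged sketch of the kind of analysis described above (this is not the authors' FDA code; the input file and column names are hypothetical, and the panel follows the metabolites named in the abstract):

        import pandas as pd
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import LeaveOneOut, cross_val_predict

        df = pd.read_csv("fecal_metabolites.csv")            # hypothetical metabolite table
        X = df[["caprate", "nicotinate", "glutamine", "thymine", "aspartate"]]
        y = df["group"]                                       # "ASD" or "NT" labels

        # Leave-one-out predictions from a linear discriminant model
        pred = cross_val_predict(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())

        sensitivity = ((pred == "ASD") & (y == "ASD")).sum() / (y == "ASD").sum()
        specificity = ((pred == "NT") & (y == "NT")).sum() / (y == "NT").sum()
        print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")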

  2. Genetic and environmental (physical fitness and sedentary activity) interaction effects on cardiometabolic risk factors in Mexican American children and adolescents.

    PubMed

    Arya, Rector; Farook, Vidya S; Fowler, Sharon P; Puppala, Sobha; Chittoor, Geetha; Resendez, Roy G; Mummidi, Srinivas; Vanamala, Jairam; Almasy, Laura; Curran, Joanne E; Comuzzie, Anthony G; Lehman, Donna M; Jenkinson, Christopher P; Lynch, Jane L; DeFronzo, Ralph A; Blangero, John; Hale, Daniel E; Duggirala, Ravindranath; Diego, Vincent P

    2018-06-01

    Knowledge of genetic and environmental (G × E) interaction effects on cardiometabolic risk factors (CMRFs) in children is limited. The purpose of this study was to examine the impact of G × E interaction effects on CMRFs in Mexican American (MA) children (n = 617, ages 6-17 years). The environments examined were sedentary activity (SA), assessed by recalls from "yesterday" (SAy) and "usually" (SAu), and physical fitness (PF), assessed by Harvard PF scores (HPFS). CMRF data included body mass index (BMI), waist circumference (WC), fat mass (FM), fasting insulin (FI), homeostasis model of assessment-insulin resistance (HOMA-IR), high-density lipoprotein cholesterol (HDL-C), triglycerides (TG), systolic (SBP) and diastolic (DBP) blood pressure, and number of metabolic syndrome components (MSC). We examined potential G × E interaction in the phenotypic expression of CMRFs using variance component models and likelihood-based statistical inference. Significant G × SA interactions were identified for six CMRFs (BMI, WC, FI, HOMA-IR, MSC, and HDL-C), and significant G × HPFS interactions were observed for four CMRFs (BMI, WC, FM, and HOMA-IR). However, after correction for multiple hypothesis testing, only the WC × SAy, FM × SAy, and FI × SAu interactions remained marginally significant. In the comparison of the reduced G × E model with the constrained model, most CMRFs exhibited significant G × E interactions after correction for multiple testing. These findings provide evidence that genetic factors interact with SA and PF to influence variation in CMRFs, and underscore the need for a better understanding of these relationships to develop strategies and interventions to effectively reduce or prevent cardiometabolic risk in children. © 2018 WILEY PERIODICALS, INC.

  3. Serum glycerophosphate levels are increased in Japanese men with type 2 diabetes.

    PubMed

    Daimon, Makoto; Soga, Tomoyoshi; Hozawa, Atsushi; Oizumi, Toshihide; Kaino, Wataru; Takase, Kaoru; Karasawa, Shigeru; Jimbu, Yumi; Wada, Kiriko; Kameda, Wataru; Susa, Shinji; Kayama, Takamasa; Saito, Kaori; Tomita, Masaru; Kato, Takeo

    2012-01-01

    To identify metabolites showing changes in serum levels among Japanese men with diabetes. We performed metabolite profiling by coupling capillary electrophoresis with electrospray ionization time-of-flight mass spectrometry using fasting serum samples from Japanese male subjects with diabetes (n=17), impaired glucose tolerance (IGT; n=5) and normal glucose tolerance (NGT; n=14). Other than the expected differences in characteristics related to abnormal glucose metabolism, percent body fat was significantly different among subjects with diabetes, IGT and NGT (27.3±6.2, 22.2±4.5 and 19.2±6.0%, respectively, p=0.0022). Therefore, percent body fat was considered a possible confounding factor in subsequent analyses. Of 560 metabolites detected using our platform, the levels of 74 metabolites were quantified in all of the serum samples. Significant differences between diabetes and NGT were observed for 24 metabolites. The top-ranked metabolite was glycerol-3-phosphate (glycerophosphate), which was significantly higher in subjects with diabetes than in those with NGT, even after Bonferroni correction for multiple testing (11.7±3.6 vs. 6.4±1.9 µM, respectively; corrected p=0.0222). Stepwise multiple regression analyses revealed that serum glycerophosphate levels were significantly correlated with 2-h plasma glucose (2-h PG) after a 75-g oral glucose tolerance test (r=0.553, p=0.0005), independently of other characteristics, including fasting plasma glucose (FPG) and HbA1c. Serum glycerophosphate levels were found to be elevated in Japanese men with diabetes and correlated with 2-h PG, independent of FPG and HbA1c. Thus, the fasting serum glycerophosphate level may serve as a marker of glucose intolerance. These results warrant further studies to evaluate the relevance of glycerophosphate in the pathophysiology of diabetes.
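
    For reference, the Bonferroni adjustment used above is simple arithmetic: the corrected p-value is the raw p-value multiplied by the number of tests, capped at 1. With the 74 quantified metabolites, the reported corrected p of 0.0222 therefore corresponds to a raw p of roughly 3 × 10^-4 (an illustrative back-calculation, not a value reported in the abstract):

        m = 74                        # number of metabolites quantified in all samples
        p_corrected = 0.0222          # Bonferroni-corrected p reported for glycerophosphate
        p_raw = p_corrected / m       # back-calculated raw p, about 3.0e-04
        print(f"raw p ~= {p_raw:.1e}; corrected p = {min(1.0, p_raw * m):.4f}")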

  4. Hydrologic extremes - an intercomparison of multiple gridded statistical downscaling methods

    NASA Astrophysics Data System (ADS)

    Werner, A. T.; Cannon, A. J.

    2015-06-01

    Gridded statistical downscaling methods are the main means of preparing climate model data to drive distributed hydrological models. Past work on the validation of climate downscaling methods has focused on temperature and precipitation, with less attention paid to the ultimate outputs from hydrological models. Also, as attention shifts towards projections of extreme events, downscaling comparisons now commonly assess methods in terms of climate extremes, but hydrologic extremes are less well explored. Here, we test the ability of gridded downscaling models to replicate historical properties of climate and hydrologic extremes, as measured in terms of temporal sequencing (i.e., correlation tests) and distributional properties (i.e., tests for equality of probability distributions). Outputs from seven downscaling methods - bias correction constructed analogues (BCCA), double BCCA (DBCCA), BCCA with quantile mapping reordering (BCCAQ), bias correction spatial disaggregation (BCSD), BCSD using minimum/maximum temperature (BCSDX), climate imprint delta method (CI), and bias corrected CI (BCCI) - are used to drive the Variable Infiltration Capacity (VIC) model over the snow-dominated Peace River basin, British Columbia. Outputs are tested using split-sample validation on 26 climate extremes indices (ClimDEX) and two hydrologic extremes indices (3 day peak flow and 7 day peak flow). To characterize observational uncertainty, four atmospheric reanalyses are used as climate model surrogates and two gridded observational datasets are used as downscaling target data. The skill of the downscaling methods generally depended on reanalysis and gridded observational dataset. However, CI failed to reproduce the distribution and BCSD and BCSDX the timing of winter 7 day low flow events, regardless of reanalysis or observational dataset. Overall, DBCCA passed the greatest number of tests for the ClimDEX indices, while BCCAQ, which is designed to more accurately resolve event-scale spatial gradients, passed the greatest number of tests for hydrologic extremes. Non-stationarity in the observational/reanalysis datasets complicated the evaluation of downscaling performance. Comparing temporal homogeneity and trends in climate indices and hydrological model outputs calculated from downscaled reanalyses and gridded observations was useful for diagnosing the reliability of the various historical datasets. We recommend that such analyses be conducted before such data are used to construct future hydro-climatic change scenarios.

  5. Hydrologic extremes - an intercomparison of multiple gridded statistical downscaling methods

    NASA Astrophysics Data System (ADS)

    Werner, Arelia T.; Cannon, Alex J.

    2016-04-01

    Gridded statistical downscaling methods are the main means of preparing climate model data to drive distributed hydrological models. Past work on the validation of climate downscaling methods has focused on temperature and precipitation, with less attention paid to the ultimate outputs from hydrological models. Also, as attention shifts towards projections of extreme events, downscaling comparisons now commonly assess methods in terms of climate extremes, but hydrologic extremes are less well explored. Here, we test the ability of gridded downscaling models to replicate historical properties of climate and hydrologic extremes, as measured in terms of temporal sequencing (i.e. correlation tests) and distributional properties (i.e. tests for equality of probability distributions). Outputs from seven downscaling methods - bias correction constructed analogues (BCCA), double BCCA (DBCCA), BCCA with quantile mapping reordering (BCCAQ), bias correction spatial disaggregation (BCSD), BCSD using minimum/maximum temperature (BCSDX), the climate imprint delta method (CI), and bias corrected CI (BCCI) - are used to drive the Variable Infiltration Capacity (VIC) model over the snow-dominated Peace River basin, British Columbia. Outputs are tested using split-sample validation on 26 climate extremes indices (ClimDEX) and two hydrologic extremes indices (3-day peak flow and 7-day peak flow). To characterize observational uncertainty, four atmospheric reanalyses are used as climate model surrogates and two gridded observational data sets are used as downscaling target data. The skill of the downscaling methods generally depended on reanalysis and gridded observational data set. However, CI failed to reproduce the distribution and BCSD and BCSDX the timing of winter 7-day low-flow events, regardless of reanalysis or observational data set. Overall, DBCCA passed the greatest number of tests for the ClimDEX indices, while BCCAQ, which is designed to more accurately resolve event-scale spatial gradients, passed the greatest number of tests for hydrologic extremes. Non-stationarity in the observational/reanalysis data sets complicated the evaluation of downscaling performance. Comparing temporal homogeneity and trends in climate indices and hydrological model outputs calculated from downscaled reanalyses and gridded observations was useful for diagnosing the reliability of the various historical data sets. We recommend that such analyses be conducted before such data are used to construct future hydro-climatic change scenarios.
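
    The bias-correction step shared by several of the methods above (e.g. BCSD, BCCAQ) is, at its core, a quantile mapping of model output onto the observed distribution. The sketch below is a generic empirical quantile-mapping routine on synthetic data, not the code used in the study:

        import numpy as np

        def quantile_map(model_hist, obs_hist, model_values):
            """Map model values onto the observed distribution via empirical quantiles."""
            q = np.linspace(0.0, 1.0, 101)
            model_q = np.quantile(model_hist, q)
            obs_q = np.quantile(obs_hist, q)
            # Locate each model value within the model's historical quantiles,
            # then read off the corresponding observed value.
            return np.interp(model_values, model_q, obs_q)

        rng = np.random.default_rng(0)
        obs = rng.gamma(2.0, 3.0, 5000)    # "observed" daily precipitation (synthetic)
        mod = rng.gamma(2.0, 4.0, 5000)    # biased "model" output (synthetic)
        corrected = quantile_map(mod, obs, mod)
        print(mod.mean(), obs.mean(), corrected.mean())   # corrected mean tracks the observations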

  6. 49 CFR 40.269 - What problems cause an alcohol test to be cancelled unless they are corrected?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... cancelled unless they are corrected? 40.269 Section 40.269 Transportation Office of the Secretary of... Testing § 40.269 What problems cause an alcohol test to be cancelled unless they are corrected? As a BAT or STT, or employer, you must cancel an alcohol test if any of the following problems occur, unless...

  7. Effectiveness of a patient education plan on knowledge of post-op venous thromboembolism survival skills.

    PubMed

    Green, Julie; Bernhofer, Esther I

    2018-04-01

    To investigate the effectiveness of a multimethod venous thromboembolism prevention patient education plan on participants' knowledge retention. A potential complication of surgery requiring general anaesthesia, worldwide, is the development of life-threatening venous thromboembolism. Patients need education on preventing, recognising and immediately responding to a suspected thromboembolism. Written instructional materials given to patients at discharge may be inadequate. A randomised controlled trial. The setting was multiple general surgery units at a large Midwestern United States academic medical centre. The sample included patients recovering from surgery with general anaesthesia (N = 66; 68% female; 34 experimental, 32 usual care). Prior to discharge, participants in the experimental group were given a multimethod venous thromboembolism prevention education plan including a video, pamphlet and verbal instruction; the control group received the usual instructional pamphlet. Both groups received a knowledge test immediately before instruction. Two weeks following discharge, a phone call was made to participants to complete the postinstruction test. The relevant EQUATOR guideline, the CONSORT checklist, was used for reporting this study. There were no statistically significant differences between groups in age, gender, race, length of stay, type of surgery or history of venous thromboembolism, nor in test score results. No statistically significant difference in postinstruction score was found between groups. However, there was a trend toward greater perception of importance in both groups and higher knowledge scores in the experimental group, with the percentage of experimental-group participants answering all questions correctly rising from 38.2% to 73.5%. Teaching patients the importance of knowing venous thromboembolism signs and preventive/survival skills is potentially life-saving, and nurses must know the importance of using the most effective methods for the learning needs of their patients. Further research including different education methods and testing is suggested. © 2018 John Wiley & Sons Ltd.

  8. Overcoming the effects of false positives and threshold bias in graph theoretical analyses of neuroimaging data.

    PubMed

    Drakesmith, M; Caeyenberghs, K; Dutt, A; Lewis, G; David, A S; Jones, D K

    2015-09-01

    Graph theory (GT) is a powerful framework for quantifying topological features of neuroimaging-derived functional and structural networks. However, false positive (FP) connections arise frequently and influence the inferred topology of networks. Thresholding is often used to overcome this problem, but an appropriate threshold often relies on a priori assumptions, which will alter inferred network topologies. Four common network metrics (global efficiency, mean clustering coefficient, mean betweenness and small-worldness) were tested using a model tractography dataset. It was found that all four network metrics were significantly affected even by just one FP. Results also show that thresholding effectively dampens the impact of FPs, but at the expense of adding significant bias to network metrics. In a larger number (n=248) of tractography datasets, statistics were computed across random group permutations for a range of thresholds, revealing that statistics for network metrics varied significantly more than for non-network metrics (i.e., number of streamlines and number of edges). Varying degrees of network atrophy were introduced artificially to half the datasets, to test sensitivity to genuine group differences. For some network metrics, this atrophy was detected as significant (p<0.05, determined using permutation testing) only across a limited range of thresholds. We propose a multi-threshold permutation correction (MTPC) method, based on the cluster-enhanced permutation correction approach, to identify sustained significant effects across clusters of thresholds. This approach minimises the requirement to determine a single threshold a priori. We demonstrate improved sensitivity of MTPC-corrected metrics to genuine group effects compared to an existing approach and demonstrate the use of MTPC on a previously published network analysis of tractography data derived from a clinical population. In conclusion, we show that there are large biases and instability induced by thresholding, making statistical comparisons of network metrics difficult. However, by testing for effects across multiple thresholds using MTPC, true group differences can be robustly identified. Copyright © 2015. Published by Elsevier Inc.
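
    A stripped-down illustration of the idea behind testing across multiple thresholds with permutation-based family-wise error control (a simplified sketch on synthetic data, not the published MTPC implementation):

        import numpy as np

        rng = np.random.default_rng(1)
        n_subj, n_thresh, n_perm = 40, 20, 1000
        # metric[i, t]: e.g. global efficiency of subject i's network at threshold t (synthetic)
        metric = rng.normal(0.5, 0.05, (n_subj, n_thresh))
        groups = np.array([0] * 20 + [1] * 20)
        metric[groups == 1] -= 0.05            # inject a group difference ("atrophy")

        def group_diff(m, g):
            # one group-difference value per threshold
            return m[g == 0].mean(axis=0) - m[g == 1].mean(axis=0)

        observed = group_diff(metric, groups)
        null_max = np.empty(n_perm)
        for i in range(n_perm):
            perm = rng.permutation(groups)
            null_max[i] = group_diff(metric, perm).max()   # max over thresholds controls FWE

        p_fwe = (null_max[:, None] >= observed[None, :]).mean(axis=0)
        print("thresholds significant at p < 0.05:", np.where(p_fwe < 0.05)[0])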

  9. New decoding methods of interleaved burst error-correcting codes

    NASA Astrophysics Data System (ADS)

    Nakano, Y.; Kasahara, M.; Namekawa, T.

    1983-04-01

    A probabilistic method of single burst error correction, using the syndrome correlation of the subcodes that constitute the interleaved code, is presented. This method makes it possible to realize a high burst-error-correction capability with less decoding delay. By generalizing this method, a probabilistic method of multiple (m-fold) burst error correction is obtained. After estimating the burst error positions using the syndrome correlation of subcodes, which are interleaved m-fold burst-error-detecting codes, this second method corrects erasure errors in each subcode and m-fold burst errors. The performance of the two methods is analyzed via computer simulation, and their effectiveness is demonstrated.
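
    The benefit of interleaving that such decoders build on can be seen in a toy example: writing symbols row-wise into a block and transmitting column-wise spreads a contiguous burst across the subcodes, so each subcode is left with at most a short, correctable error. This is a generic sketch of block interleaving, not the probabilistic decoder proposed in the paper:

        def interleave(symbols, depth):
            # write row-wise into a depth x width block, transmit column-wise
            width = len(symbols) // depth
            return [symbols[r * width + c] for c in range(width) for r in range(depth)]

        def deinterleave(symbols, depth):
            width = len(symbols) // depth
            out = [None] * len(symbols)
            k = 0
            for c in range(width):
                for r in range(depth):
                    out[r * width + c] = symbols[k]
                    k += 1
            return out

        msg = list("ABCDEFGHIJKL")        # 12 symbols over 3 subcodes (rows) of length 4
        tx = interleave(msg, 3)
        tx[4:7] = ["?", "?", "?"]         # a burst of 3 consecutive channel errors
        rx = deinterleave(tx, 3)
        print("".join(rx))                # errors are spread: at most one per subcode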

  10. Decision making under internal uncertainty: the case of multiple-choice tests with different scoring rules.

    PubMed

    Bereby-Meyer, Yoella; Meyer, Joachim; Budescu, David V

    2003-02-01

    This paper assesses framing effects on decision making with internal uncertainty, i.e., partial knowledge, by focusing on examinees' behavior in multiple-choice (MC) tests with different scoring rules. In two experiments participants answered a general-knowledge MC test that consisted of 34 solvable and 6 unsolvable items. Experiment 1 studied two scoring rules involving Positive (only gains) and Negative (only losses) scores. Although answering all items was the dominating strategy for both rules, the results revealed a greater tendency to answer under the Negative scoring rule. These results are in line with the predictions derived from Prospect Theory (PT) [Econometrica 47 (1979) 263]. The second experiment studied two scoring rules, which allowed respondents to exhibit partial knowledge. Under the Inclusion-scoring rule the respondents mark all answers that could be correct, and under the Exclusion-scoring rule they exclude all answers that might be incorrect. As predicted by PT, respondents took more risks under the Inclusion rule than under the Exclusion rule. The results illustrate that the basic process that underlies choice behavior under internal uncertainty and especially the effect of framing is similar to the process of choice under external uncertainty and can be described quite accurately by PT. Copyright 2002 Elsevier Science B.V.

  11. Yoga leads to multiple physical improvements after stroke, a pilot study.

    PubMed

    Schmid, Arlene A; Miller, Kristine K; Van Puymbroeck, Marieke; DeBaun-Sprague, Erin

    2014-12-01

    To assess change in physical functioning (pain, range of motion (ROM), strength, and endurance) after 8 weeks of therapeutic yoga. Planned analyses of data from a randomized pilot study of yoga after stroke. University-based research laboratory. People with chronic stroke (N=47) randomized to therapeutic yoga (n=37) or wait-list control (n=10). Sixteen sessions of therapeutic yoga (twice a week for 8 weeks) were delivered in a standardized and progressive format with postures, breathing, meditation, and relaxation in sitting, standing, and supine positions. Pain was assessed with the PEG, a 3-item functional measure of the interference of pain. ROM measures included neck and hip active and passive ROM. Upper and lower extremity strength were assessed with the arm curl test and chair-to-stand test, respectively. Endurance was assessed with the 6-minute walk and modified 2-minute step test. After Bonferroni correction, pain, neck ROM, hip passive ROM, upper extremity strength, and 6-minute walk scores all significantly improved after 8 weeks of engaging in yoga. No changes occurred in the wait-list control group. A group therapeutic-yoga intervention may improve multiple aspects of physical functioning after stroke. Such an intervention may be complementary to traditional rehabilitation. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Development of the Central Dogma Concept Inventory (CDCI) Assessment Tool

    PubMed Central

    Newman, Dina L.; Snyder, Christopher W.; Fisk, J. Nick; Wright, L. Kate

    2016-01-01

    Scientific teaching requires scientifically constructed, field-tested instruments to accurately evaluate student thinking and gauge teacher effectiveness. We have developed a 23-question, multiple select–format assessment of student understanding of the essential concepts of the central dogma of molecular biology that is appropriate for all levels of undergraduate biology. Questions for the Central Dogma Concept Inventory (CDCI) tool were developed and iteratively revised based on student language and review by experts. The ability of the CDCI to discriminate between levels of understanding of the central dogma is supported by field testing (N = 54), and large-scale beta testing (N = 1733). Performance on the assessment increased with experience in biology; scores covered a broad range and showed no ceiling effect, even with senior biology majors, and pre/posttesting of a single class focused on the central dogma showed significant improvement. The multiple-select format reduces the chances of correct answers by random guessing, allows students at different levels to exhibit the extent of their knowledge, and provides deeper insight into the complexity of student thinking on each theme. To date, the CDCI is the first tool dedicated to measuring student thinking about the central dogma of molecular biology, and version 5 is ready to use. PMID:27055775

  13. Some ideas and opportunities concerning three-dimensional wind-tunnel wall corrections

    NASA Technical Reports Server (NTRS)

    Rubbert, P. E.

    1982-01-01

    Opportunities for improving the accuracy and reliability of wall corrections in conventional ventilated test sections are presented. The approach encompasses state-of-the-art technology in transonic computational methods combined with the measurement of tunnel-wall pressures. The objective is to arrive at correction procedures of known, verifiable accuracy that are practical within a production testing environment. It is concluded that: accurate and reliable correction procedures can be developed for cruise-type aerodynamic testing for any wall configuration; passive walls can be optimized for minimal interference for cruise-type aerodynamic testing (tailored slots, variable open area ratio, etc.); monitoring and assessment of noncorrectable interference (buoyancy and curvature in a transonic stream) can be an integral part of a correction procedure; and reasonably good correction procedures can probably be developed for complex flows involving extensive separation and other unpredictable phenomena.

  14. Multiple needle puncturing: balancing the varus knee.

    PubMed

    Bellemans, Johan

    2011-09-09

    The so-called "pie crusting" technique using multiple stab incisions is a well-established procedure for correcting tightness of the iliotibial band in the valgus knee. It is, however, not applicable for balancing the medial side in varus knees because of the risk of iatrogenic transection of the medial collateral ligament (MCL). This article presents our experience with a safer alternative and minimally invasive technique for medial soft tissue balancing, where we make multiple punctures in the MCL using a 19-gauge needle to progressively stretch the MCL until a correct ligament balance is achieved. Our technique requires minimal to no additional soft tissue dissection and can even be performed percutaneously when necessary. This technique, therefore, does not impact the length of the skin or soft tissue incisions. We analyzed 61 cases with varus deformity that were intraoperatively treated using this technique. In 4 other cases, the technique was used as a percutaneous procedure to correct postoperative medial tightness that caused persistent pain on the medial side. The procedure was considered successful when a 2- to 4-mm mediolateral joint line opening was obtained in extension and 2 to 6 mm in flexion. In 62 cases (95%), a progressive correction of medial tightness was achieved according to the above-described criteria. Three cases were overreleased and required compensatory release of the lateral structures and use of a thicker insert. Based on these results, we consider needle puncturing an effective and safe technique for progressive correction of MCL tightness during minimally invasive total knee arthroplasty. Copyright 2011, SLACK Incorporated.

  15. Exploring viewing behavior data from whole slide images to predict correctness of students' answers during practical exams in oral pathology.

    PubMed

    Walkowski, Slawomir; Lundin, Mikael; Szymas, Janusz; Lundin, Johan

    2015-01-01

    The way of viewing whole slide images (WSI) can be tracked and analyzed. In particular, it can be useful to learn how medical students view WSIs during exams and how their viewing behavior is correlated with the correctness of the answers they give. We used a software-based view path tracking method that enabled gathering data about the viewing behavior of multiple simultaneous WSI users. This approach was implemented and applied during two practical exams in oral pathology in 2012 (88 students) and 2013 (91 students), which were based on questions with attached WSIs. Gathered data were visualized and analyzed in multiple ways. As part of an extended analysis, we tried to use machine learning approaches to predict the correctness of students' answers based on how they viewed WSIs. We compared the results of analyses for 2012 and 2013 - done for a single question, for student groups, and for a set of questions. The overall patterns were generally consistent across both years. Moreover, viewing behavior data appeared to have some potential for predicting answer correctness, and some outcomes of the machine learning approaches were in the right direction. However, general prediction results were not satisfactory in terms of precision and recall. Our work confirmed that the view path tracking method is useful for discovering the viewing behavior of students analyzing WSIs. It provided multiple useful insights in this area, and the general results of our analyses were consistent across the two exams. On the other hand, predicting answer correctness appeared to be a difficult task - students' answers often seem to be unpredictable.
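
    A minimal sketch of the prediction task described above, under assumed features (the input file and feature names such as view_time_s, max_zoom and pan_distance are hypothetical; the study used its own viewing-behavior features and classifiers):

        import pandas as pd
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_predict
        from sklearn.metrics import precision_score, recall_score

        df = pd.read_csv("viewing_features.csv")      # hypothetical per-answer feature table
        X = df[["view_time_s", "max_zoom", "pan_distance"]]
        y = df["answer_correct"]                      # 1 = correct, 0 = incorrect

        # Cross-validated predictions, then the precision/recall the abstract refers to
        pred = cross_val_predict(LogisticRegression(max_iter=1000), X, y, cv=5)
        print("precision:", precision_score(y, pred))
        print("recall:   ", recall_score(y, pred))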

  16. Improving Graduate Students' Graphing Skills of Multiple Baseline Designs with Microsoft[R] Excel 2007

    ERIC Educational Resources Information Center

    Lo, Ya-yu; Starling, A. Leyf Peirce

    2009-01-01

    This study examined the effects of a graphing task analysis using the Microsoft[R] Office Excel 2007 program on the single-subject multiple baseline graphing skills of three university graduate students. Using a multiple probe across participants design, the study demonstrated a functional relationship between the number of correct graphing…

  17. Application of distance correction to ChemCam laser-induced breakdown spectroscopy measurements

    DOE PAGES

    Mezzacappa, A.; Melikechi, N.; Cousin, A.; ...

    2016-04-04

    Laser-induced breakdown spectroscopy (LIBS) provides chemical information from atomic, ionic, and molecular emissions from which geochemical composition can be deciphered. Analysis of LIBS spectra in cases where targets are observed at different distances, as is the case for the ChemCam instrument on the Mars rover Curiosity, which performs analyses at distances between 2 and 7.4 m, is not a simple task. Previously, we showed that spectral distance correction based on a proxy spectroscopic standard created from first-shot dust observations on Mars targets ameliorates the distance bias in multivariate-based elemental-composition predictions of laboratory data. In this work, we correct an expanded set of neutral and ionic spectral emissions for distance bias in the ChemCam data set. By using and testing different selection criteria to generate multiple proxy standards, we find a correction that minimizes the difference in spectral intensity measured at two different distances and increases spectral reproducibility. When the quantitative performance of distance correction is assessed, there is improvement for SiO2, Al2O3, CaO, FeOT, Na2O, K2O, that is, for most of the major rock-forming elements, and for the total major-element weight percent predicted. However, for MgO the method does not provide improvements, while for TiO2 it yields inconsistent results. Additionally, we observed that many emission lines do not behave consistently with distance, as evidenced by laboratory analogue measurements and ChemCam data. This limits the effectiveness of the method.

  18. Image Registration to Compensate for EPI Distortion in Patients with Brain Tumors: An Evaluation of Tract-Specific Effects.

    PubMed

    Albi, Angela; Meola, Antonio; Zhang, Fan; Kahali, Pegah; Rigolo, Laura; Tax, Chantal M W; Ciris, Pelin Aksit; Essayed, Walid I; Unadkat, Prashin; Norton, Isaiah; Rathi, Yogesh; Olubiyi, Olutayo; Golby, Alexandra J; O'Donnell, Lauren J

    2018-03-01

    Diffusion magnetic resonance imaging (dMRI) provides preoperative maps of neurosurgical patients' white matter tracts, but these maps suffer from echo-planar imaging (EPI) distortions caused by magnetic field inhomogeneities. In clinical neurosurgical planning, these distortions are generally not corrected and thus contribute to the uncertainty of fiber tracking. Multiple image processing pipelines have been proposed for image-registration-based EPI distortion correction in healthy subjects. In this article, we perform the first comparison of such pipelines in neurosurgical patient data. Five pipelines were tested in a retrospective clinical dMRI dataset of 9 patients with brain tumors. Pipelines differed in the choice of fixed and moving images and the similarity metric for image registration. Distortions were measured in two important tracts for neurosurgery, the arcuate fasciculus and corticospinal tracts. Significant differences in distortion estimates were found across processing pipelines. The most successful pipeline used dMRI baseline and T2-weighted images as inputs for distortion correction. This pipeline gave the most consistent distortion estimates across image resolutions and brain hemispheres. Quantitative results of mean tract distortions on the order of 1-2 mm are in line with other recent studies, supporting the potential need for distortion correction in neurosurgical planning. Novel results include significantly higher distortion estimates in the tumor hemisphere and greater effect of image resolution choice on results in the tumor hemisphere. Overall, this study demonstrates possible pitfalls and indicates that care should be taken when implementing EPI distortion correction in clinical settings. Copyright © 2018 by the American Society of Neuroimaging.

  19. Simulating Streamflow Using Bias-corrected Multiple Satellite Rainfall Products in the Tekeze Basin, Ethiopia

    NASA Astrophysics Data System (ADS)

    Abitew, T. A.; Roy, T.; Serrat-Capdevila, A.; van Griensven, A.; Bauwens, W.; Valdes, J. B.

    2016-12-01

    The Tekeze Basin, in northern Ethiopia, hosts one of Africa's largest arch dams and plays a vital role in hydropower generation. However, little has been done on the hydrology of the basin because of limited in situ hydroclimatological data. Therefore, the main objective of this research is to simulate streamflow upstream of the Tekeze Dam using the Soil and Water Assessment Tool (SWAT) forced by bias-corrected multiple satellite rainfall products (CMORPH, TMPA and PERSIANN-CCS). This talk will present the potential as well as the skill of bias-corrected satellite rainfall products for streamflow prediction in tropical Africa. Additionally, the SWAT model results will be compared with previous conceptual hydrological models (HyMOD and HBV) from the SERVIR streamflow forecasting in African basins project (http://www.swaat.arizona.edu/index.html).

  20. Modeling non-linear growth responses to temperature and hydrology in wetland trees

    NASA Astrophysics Data System (ADS)

    Keim, R.; Allen, S. T.

    2016-12-01

    Growth responses of wetland trees to flooding and climate variations are difficult to model because they depend on multiple, apparently interacting factors, but are a critical link in hydrological control of wetland carbon budgets. To more generally understand tree growth to hydrological forcing, we modeled non-linear responses of tree ring growth to flooding and climate at sub-annual time steps, using Vaganov-Shashkin response functions. We calibrated the model to six baldcypress tree-ring chronologies from two hydrologically distinct sites in southern Louisiana, and tested several hypotheses of plasticity in wetlands tree responses to interacting environmental variables. The model outperformed traditional multiple linear regression. More importantly, optimized response parameters were generally similar among sites with varying hydrological conditions, suggesting generality to the functions. Model forms that included interacting responses to multiple forcing factors were more effective than were single response functions, indicating the principle of a single limiting factor is not correct in wetlands and both climatic and hydrological variables must be considered in predicting responses to hydrological or climate change.

  1. Model selection with multiple regression on distance matrices leads to incorrect inferences.

    PubMed

    Franckowiak, Ryan P; Panasci, Michael; Jarvis, Karl J; Acuña-Rodriguez, Ian S; Landguth, Erin L; Fortin, Marie-Josée; Wagner, Helene H

    2017-01-01

    In landscape genetics, model selection procedures based on Information Theoretic and Bayesian principles have been used with multiple regression on distance matrices (MRM) to test the relationship between multiple vectors of pairwise genetic, geographic, and environmental distance. Using Monte Carlo simulations, we examined the ability of model selection criteria based on Akaike's information criterion (AIC), its small-sample correction (AICc), and the Bayesian information criterion (BIC) to reliably rank candidate models when applied with MRM while varying the sample size. The results showed a serious problem: all three criteria exhibit a systematic bias toward selecting unnecessarily complex models containing spurious random variables and erroneously suggest a high level of support for the incorrectly ranked best model. These problems effectively increased with increasing sample size. The failure of AIC, AICc, and BIC was likely driven by the inflated sample size and different sum-of-squares partitioned by MRM, and the resulting effect on delta values. Based on these findings, we strongly discourage the continued application of AIC, AICc, and BIC for model selection with MRM.
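
    For reference, the three criteria compared in the study are simple functions of the residual sum of squares, the sample size and the number of fitted parameters (constants common to all candidate models dropped). The sketch below shows the formulas only; the distance-matrix regression itself is omitted, and the example numbers are made up:

        import numpy as np

        def information_criteria(rss, n, k):
            """AIC, AICc and BIC for a Gaussian-error regression with k parameters."""
            aic = n * np.log(rss / n) + 2 * k
            aicc = aic + (2 * k * (k + 1)) / (n - k - 1)
            bic = n * np.log(rss / n) + k * np.log(n)
            return aic, aicc, bic

        # Two candidate models fitted to the same response: lower values are preferred
        print(information_criteria(rss=12.4, n=50, k=3))   # simpler model
        print(information_criteria(rss=12.1, n=50, k=6))   # more complex model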

  2. New approach to CT pixel-based photon dose calculations in heterogeneous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, J.W.; Henkelman, R.M.

    The effects of small cavities on dose in water and the dose in a homogeneous nonunit density medium illustrate that inhomogeneities do not act independently in photon dose perturbation, and serve as two constraints which should be satisfied by approximate methods of computed tomography (CT) pixel-based dose calculations. Current methods at best satisfy only one of the two constraints and show inadequacies in some intermediate geometries. We have developed an approximate method that satisfies both these constraints and treats much of the synergistic effect of multiple inhomogeneities correctly. The method calculates primary and first-scatter doses by first-order ray tracing, with the first-scatter contribution augmented by a component of second scatter that behaves like first scatter. Multiple-scatter dose perturbation values extracted from small cavity experiments are used in a function which approximates the small residual multiple-scatter dose. For a wide range of geometries tested, our method agrees very well with measurements. The average deviation is less than 2% with a maximum of 3%. In comparison, calculations based on existing methods can have errors larger than 10%.

  3. Multiple search methods for similarity-based virtual screening: analysis of search overlap and precision

    PubMed Central

    2011-01-01

    Background Data fusion methods are widely used in virtual screening, and make the implicit assumption that the more often a molecule is retrieved in multiple similarity searches, the more likely it is to be active. This paper tests the correctness of this assumption. Results Sets of 25 searches using either the same reference structure and 25 different similarity measures (similarity fusion) or 25 different reference structures and the same similarity measure (group fusion) show that large numbers of unique molecules are retrieved by just a single search, but that the numbers of unique molecules decrease very rapidly as more searches are considered. This rapid decrease is accompanied by a rapid increase in the fraction of those retrieved molecules that are active. There is an approximately log-log relationship between the numbers of different molecules retrieved and the number of searches carried out, and a rationale for this power-law behaviour is provided. Conclusions Using multiple searches provides a simple way of increasing the precision of a similarity search, and thus provides a justification for the use of data fusion methods in virtual screening. PMID:21824430
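
    The frequency-based fusion idea tested above can be written in a few lines: molecules retrieved by more of the individual similarity searches are ranked higher. The hit lists below are placeholder IDs, not screening data:

        from collections import Counter

        searches = [                              # top-k hit lists from separate searches (3 of 25 shown)
            ["mol_17", "mol_02", "mol_99", "mol_41"],
            ["mol_02", "mol_17", "mol_55", "mol_08"],
            ["mol_17", "mol_08", "mol_02", "mol_77"],
        ]

        counts = Counter(mol for hits in searches for mol in hits)
        fused_ranking = [mol for mol, _ in counts.most_common()]
        print(fused_ranking)                      # mol_17 and mol_02 (retrieved 3 times) rank first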

  4. Providing Counseling for Transgendered Inmates: A Survey of Correctional Services

    ERIC Educational Resources Information Center

    von Dresner, Kara Sandor; Underwood, Lee A.; Suarez, Elisabeth; Franklin, Timothy

    2013-01-01

    The purpose of this study was to survey the current assessment, housing, and mental health treatment needs of transsexual inmates within state correctional facilities. The literature reviewed epidemiology, prevalence, multiple uses of terms, assessment, and current standards of care. Along with the rise of the multicultural movement, growing…

  5. [Posttraumatic retroperitoneal hematoma in injured persons with severe closed multiple and combined abdominal trauma].

    PubMed

    Rylov, A I; Kravets, N S

    2001-01-01

    The experience of treating 69 injured persons with posttraumatic retroperitoneal hematoma and severe multiple combined abdominal trauma was analyzed. Application of the proposed classification makes it possible to formulate the diagnosis and to choose the treatment tactics correctly. An intraoperative tactics algorithm was elaborated; it promotes correct analysis of intraoperative findings and reduces the frequency of diagnostic mistakes. In the presence of a large defect that made it impossible to suture the parietal peritoneum, extraperitonization using cerebral dura mater was performed. The operative intervention was concluded with drainage and subsequent laser therapy.

  6. Documentation for the machine-readable version of the general catalogue of 33342 stars for the epoch 1950 (Boss 1937)

    NASA Technical Reports Server (NTRS)

    Roman, N. G.; Warren, W. H., Jr.

    1983-01-01

    A revised and corrected version of the machine-readable catalog has been prepared. Cross identifications of the GC stars to the HD and DM catalogs have been replaced by data from the new SAO-HD-GC-DM Cross Index (Roman, Warren and Schofield 1983), including component identifications for multiple SAO entries having identical DM numbers in the SAO Catalog, supplemental Bonner Durchmusterung stars (lower case letter designations) and codes for multiple HD stars. Additional individual corrections have been incorporated based upon errors found during analyses of other catalogs.

  7. Convergent genetic and expression data implicate immunity in Alzheimer's disease

    PubMed Central

    Jones, Lesley; Lambert, Jean-Charles; Wang, Li-San; Choi, Seung-Hoan; Harold, Denise; Vedernikov, Alexey; Escott-Price, Valentina; Stone, Timothy; Richards, Alexander; Bellenguez, Céline; Ibrahim-Verbaas, Carla A; Naj, Adam C; Sims, Rebecca; Gerrish, Amy; Jun, Gyungah; DeStefano, Anita L; Bis, Joshua C; Beecham, Gary W; Grenier-Boley, Benjamin; Russo, Giancarlo; Thornton-Wells, Tricia A; Jones, Nicola; Smith, Albert V; Chouraki, Vincent; Thomas, Charlene; Ikram, M Arfan; Zelenika, Diana; Vardarajan, Badri N; Kamatani, Yoichiro; Lin, Chiao-Feng; Schmidt, Helena; Kunkle, Brian; Dunstan, Melanie L; Ruiz, Agustin; Bihoreau, Marie-Thérèse; Reitz, Christiane; Pasquier, Florence; Hollingworth, Paul; Hanon, Olivier; Fitzpatrick, Annette L; Buxbaum, Joseph D; Campion, Dominique; Crane, Paul K; Becker, Tim; Gudnason, Vilmundur; Cruchaga, Carlos; Craig, David; Amin, Najaf; Berr, Claudine; Lopez, Oscar L; De Jager, Philip L; Deramecourt, Vincent; Johnston, Janet A; Evans, Denis; Lovestone, Simon; Letteneur, Luc; Kornhuber, Johanes; Tárraga, Lluís; Rubinsztein, David C; Eiriksdottir, Gudny; Sleegers, Kristel; Goate, Alison M; Fiévet, Nathalie; Huentelman, Matthew J; Gill, Michael; Emilsson, Valur; Brown, Kristelle; Kamboh, M Ilyas; Keller, Lina; Barberger-Gateau, Pascale; McGuinness, Bernadette; Larson, Eric B; Myers, Amanda J; Dufouil, Carole; Todd, Stephen; Wallon, David; Love, Seth; Kehoe, Pat; Rogaeva, Ekaterina; Gallacher, John; George-Hyslop, Peter St; Clarimon, Jordi; Lleὀ, Alberti; Bayer, Anthony; Tsuang, Debby W; Yu, Lei; Tsolaki, Magda; Bossù, Paola; Spalletta, Gianfranco; Proitsi, Petra; Collinge, John; Sorbi, Sandro; Garcia, Florentino Sanchez; Fox, Nick; Hardy, John; Naranjo, Maria Candida Deniz; Razquin, Cristina; Bosco, Paola; Clarke, Robert; Brayne, Carol; Galimberti, Daniela; Mancuso, Michelangelo; Moebus, Susanne; Mecocci, Patrizia; del Zompo, Maria; Maier, Wolfgang; Hampel, Harald; Pilotto, Alberto; Bullido, Maria; Panza, Francesco; Caffarra, Paolo; Nacmias, Benedetta; Gilbert, John R; Mayhaus, Manuel; Jessen, Frank; Dichgans, Martin; Lannfelt, Lars; Hakonarson, Hakon; Pichler, Sabrina; Carrasquillo, Minerva M; Ingelsson, Martin; Beekly, Duane; Alavarez, Victoria; Zou, Fanggeng; Valladares, Otto; Younkin, Steven G; Coto, Eliecer; Hamilton-Nelson, Kara L; Mateo, Ignacio; Owen, Michael J; Faber, Kelley M; Jonsson, Palmi V; Combarros, Onofre; O'Donovan, Michael C; Cantwell, Laura B; Soininen, Hilkka; Blacker, Deborah; Mead, Simon; Mosley, Thomas H; Bennett, David A; Harris, Tamara B; Fratiglioni, Laura; Holmes, Clive; de Bruijn, Renee FAG; Passmore, Peter; Montine, Thomas J; Bettens, Karolien; Rotter, Jerome I; Brice, Alexis; Morgan, Kevin; Foroud, Tatiana M; Kukull, Walter A; Hannequin, Didier; Powell, John F; Nalls, Michael A; Ritchie, Karen; Lunetta, Kathryn L; Kauwe, John SK; Boerwinkle, Eric; Riemenschneider, Matthias; Boada, Mercè; Hiltunen, Mikko; Martin, Eden R; Pastor, Pau; Schmidt, Reinhold; Rujescu, Dan; Dartigues, Jean-François; Mayeux, Richard; Tzourio, Christophe; Hofman, Albert; Nöthen, Markus M; Graff, Caroline; Psaty, Bruce M; Haines, Jonathan L; Lathrop, Mark; Pericak-Vance, Margaret A; Launer, Lenore J; Farrer, Lindsay A; van Duijn, Cornelia M; Van Broekhoven, Christine; Ramirez, Alfredo; Schellenberg, Gerard D; Seshadri, Sudha; Amouyel, Philippe; Holmans, Peter A

    2015-01-01

    Background Late-onset Alzheimer's disease (AD) is heritable with 20 genes showing genome-wide association in the International Genomics of Alzheimer's Project (IGAP). To identify the biology underlying the disease we extended these genetic data in a pathway analysis. Methods The ALIGATOR and GSEA algorithms were used in the IGAP data to identify associated functional pathways and correlated gene expression networks in human brain. Results ALIGATOR identified an excess of curated biological pathways showing enrichment of association. Enriched areas of biology included the immune response (p = 3.27 × 10(-12) after multiple testing correction for pathways), regulation of endocytosis (p = 1.31 × 10(-11)), cholesterol transport (p = 2.96 × 10(-9)) and proteasome-ubiquitin activity (p = 1.34 × 10(-6)). Correlated gene expression analysis identified four significant network modules, all related to the immune response (corrected p = 0.002-0.05). Conclusions The immune response, regulation of endocytosis, cholesterol transport and protein ubiquitination represent prime targets for AD therapeutics. PMID:25533204

  8. A new scheme for perturbative triples correction to (0,1) sector of Fock space multi-reference coupled cluster method: theory, implementation, and examples.

    PubMed

    Dutta, Achintya Kumar; Vaval, Nayana; Pal, Sourav

    2015-01-28

    We propose a new elegant strategy to implement the third-order triples correction, in the light of many-body perturbation theory, in the Fock space multi-reference coupled cluster method for the ionization problem. The computational scaling as well as the storage requirement are key concerns in any many-body calculation. Our proposed approach scales as N(6), does not require the storage of triples amplitudes, and gives superior agreement over all previous attempts. This approach is capable of calculating multiple roots in a single calculation, in contrast to the inclusion of perturbative triples in the equation-of-motion variant of coupled cluster theory, where each root needs to be computed in a state-specific way and requires both the left and right state vectors together. The performance of the newly implemented scheme is tested by applying it to methylene, the boron nitride (B2N) anion, nitrogen, water, carbon monoxide, acetylene, formaldehyde, and the thymine monomer, a DNA base.

  9. Differential transfer processes in incremental visuomotor adaptation.

    PubMed

    Seidler, Rachel D

    2005-01-01

    Visuomotor adaptive processes were examined by testing transfer of adaptation between similar conditions. Participants made manual aiming movements with a joystick to hit targets on a computer screen, with real-time feedback display of their movement. They adapted to three different rotations of the display in a sequential fashion, with a return to baseline display conditions between rotations. Adaptation was better when participants had prior adaptive experiences. When performance was assessed using direction error (calculated at the time of peak velocity) and initial endpoint error (error before any overt corrective actions), transfer was greater when the final rotation reflected an addition of previously experienced rotations (adaptation order 30 degrees rotation, 15 degrees, 45 degrees) than when it was a subtraction of previously experienced conditions (adaptation order 45 degrees rotation, 15 degrees, 30 degrees). Transfer was equal regardless of adaptation order when performance was assessed with final endpoint error (error following any discrete, corrective actions). These results imply the existence of multiple independent processes in visuomotor adaptation.

  10. Convergent genetic and expression data implicate immunity in Alzheimer's disease.

    PubMed

    2015-06-01

    Late-onset Alzheimer's disease (AD) is heritable with 20 genes showing genome-wide association in the International Genomics of Alzheimer's Project (IGAP). To identify the biology underlying the disease, we extended these genetic data in a pathway analysis. The ALIGATOR and GSEA algorithms were used in the IGAP data to identify associated functional pathways and correlated gene expression networks in human brain. ALIGATOR identified an excess of curated biological pathways showing enrichment of association. Enriched areas of biology included the immune response (P = 3.27 × 10(-12) after multiple testing correction for pathways), regulation of endocytosis (P = 1.31 × 10(-11)), cholesterol transport (P = 2.96 × 10(-9)), and proteasome-ubiquitin activity (P = 1.34 × 10(-6)). Correlated gene expression analysis identified four significant network modules, all related to the immune response (corrected P = .002-.05). The immune response, regulation of endocytosis, cholesterol transport, and protein ubiquitination represent prime targets for AD therapeutics. Copyright © 2015. Published by Elsevier Inc.

  11. IQ Scores Should Be Corrected for the Flynn Effect in High-Stakes Decisions

    ERIC Educational Resources Information Center

    Fletcher, Jack M.; Stuebing, Karla K.; Hughes, Lisa C.

    2010-01-01

    IQ test scores should be corrected for high stakes decisions that employ these assessments, including capital offense cases. If scores are not corrected, then diagnostic standards must change with each generation. Arguments against corrections, based on standards of practice, information present and absent in test manuals, and related issues,…

  12. Measurement Via Optical Near-Nulling and Subaperture Stitching

    NASA Technical Reports Server (NTRS)

    Forbes, Greg; De Vries, Gary; Murphy, Paul; Brophy, Chris

    2012-01-01

    A subaperture stitching interferometer system provides near-nulling of a subaperture wavefront reflected from an object of interest over a portion of a surface of the object. A variable optical element located in the radiation path adjustably provides near-nulling to facilitate stitching of subaperture interferograms, creating an interferogram representative of the entire surface of interest. This enables testing of aspheric surfaces without null optics customized for each surface prescription. The surface shapes of objects such as lenses and other precision components are often measured with interferometry. However, interferometers have a limited capture range, and thus the test wavefront cannot be too different from the reference or the interference cannot be analyzed. Furthermore, the performance of the interferometer is usually best when the test and reference wavefronts are nearly identical (referred to as a null condition). Thus, it is necessary when performing such measurements to correct for known variations in shape to ensure that unintended variations are within the capture range of the interferometer and accurately measured. This invention is a system for near-nulling within a subaperture stitching interferometer, although in principle, the concept can be employed by wavefront measuring gauges other than interferometers. The system employs a light source for providing coherent radiation of a subaperture extent. An object of interest is placed to modify the radiation (e.g., to reflect or pass the radiation), and a variable optical element is located to interact with, and nearly null, the affected radiation. A detector or imaging device is situated to obtain interference patterns in the modified radiation. Multiple subaperture interferograms are taken and are stitched, or joined, to provide an interferogram representative of the entire surface of the object of interest. The primary aspect of the invention is the use of adjustable corrective optics in the context of subaperture stitching near-nulling interferometry, wherein a complex surface is analyzed via multiple, separate, overlapping interferograms. For complex surfaces, the problem of managing the identification and placement of corrective optics becomes even more pronounced, to the extent that in most cases the null corrector optics are specific to the particular asphere prescription and no others (i.e. another asphere requires completely different null correction optics). In principle, the near-nulling technique does not require subaperture stitching at all. Building a near-null system that is practically useful relies on two key features: simplicity and universality. If the system is too complex, it will be difficult to calibrate and model its manufacturing errors, rendering it useless as a precision metrology tool and/or prohibitively expensive. If the system is not applicable to a wide range of test parts, then it does not provide significant value over conventional null-correction technology. Subaperture stitching enables simpler and more universal near-null systems to be effective, because a fraction of a surface is necessarily less complex than the whole surface (excepting the extreme case of a fractal surface description). The technique of near-nulling can significantly enhance aspheric subaperture stitching capability by allowing the interferometer to capture a wider range of aspheres.
    Moreover, subaperture stitching is essential to a truly effective near-nulling system, since looking at a fraction of the surface keeps the wavefront complexity within the capability of a relatively simple near-null apparatus. Furthermore, by reducing the subaperture size, the complexity of the measured wavefront can be reduced until it is within the capability of the near-null design.

  13. Multimodal partial volume correction: Application to [11C]PIB PET/MRI myelin imaging in multiple sclerosis.

    PubMed

    Grecchi, Elisabetta; Veronese, Mattia; Bodini, Benedetta; García-Lorenzo, Daniel; Battaglini, Marco; Stankoff, Bruno; Turkheimer, Federico E

    2017-12-01

    The [11C]PIB PET tracer, originally developed for amyloid imaging, has recently been repurposed to quantify demyelination and remyelination in multiple sclerosis (MS). Myelin PET imaging, however, is limited by its low resolution, which deteriorates the quantification accuracy of white matter (WM) lesions. Here, we introduce a novel partial volume correction (PVC) method called Multiresolution-Multimodal Resolution-Recovery (MM-RR), which uses the wavelet transform and a synergistic statistical model to exploit MRI structural images to improve the resolution of [11C]PIB PET myelin imaging. MM-RR performance was tested on a phantom acquisition and in a dataset comprising [11C]PIB PET and MR T1- and T2-weighted images of 8 healthy controls and 20 MS patients. For the control group, the MM-RR PET images showed an average increase of 5.7% in WM uptake while the grey-matter (GM) uptake remained constant, resulting in +31% WM/GM contrast. Furthermore, MM-RR PET binding maps correlated significantly with the mRNA expression of the most represented proteins in the myelin sheath (R(2) = 0.57 ± 0.09). In the patient group, MM-RR PET images showed sharper lesion contours and significant improvement in normal-appearing tissue/WM-lesion contrast compared to standard PET (contrast improvement > +40%). These results were consistent with MM-RR performance in the phantom experiments.

  14. Pushing Critical Thinking Skills With Multiple-Choice Questions: Does Bloom's Taxonomy Work?

    PubMed

    Zaidi, Nikki L Bibler; Grob, Karri L; Monrad, Seetha M; Kurtz, Joshua B; Tai, Andrew; Ahmed, Asra Z; Gruppen, Larry D; Santen, Sally A

    2018-06-01

    Medical school assessments should foster the development of higher-order thinking skills to support clinical reasoning and a solid foundation of knowledge. Multiple-choice questions (MCQs) are commonly used to assess student learning, and well-written MCQs can support learner engagement in higher levels of cognitive reasoning such as application or synthesis of knowledge. Bloom's taxonomy has been used to identify MCQs that assess students' critical thinking skills, with evidence suggesting that higher-order MCQs support a deeper conceptual understanding of scientific process skills. Similarly, clinical practice also requires learners to develop higher-order thinking skills that include all of Bloom's levels. Faculty question writers and examinees may approach the same material differently based on varying levels of knowledge and expertise, and these differences can influence the cognitive levels being measured by MCQs. Consequently, faculty question writers may perceive that certain MCQs require higher-order thinking skills to process the question, whereas examinees may only need to employ lower-order thinking skills to render a correct response. Likewise, seemingly lower-order questions may actually require higher-order thinking skills to respond correctly. In this Perspective, the authors describe some of the cognitive processes examinees use to respond to MCQs. The authors propose that various factors affect both the question writer and examinee's interaction with test material and subsequent cognitive processes necessary to answer a question.

  15. Driving performance in persons with mild to moderate symptoms of multiple sclerosis.

    PubMed

    Devos, Hannes; Brijs, Tom; Alders, Geert; Wets, Geert; Feys, Peter

    2013-08-01

    To investigate whether driving performance is impaired in persons with mild to moderate multiple sclerosis (MS). This study included 15 persons with MS (pwMS) and 17 healthy controls. The MS group exhibited mild to moderate impairments on the Expanded Disability Status Scale (median [Q1-Q3]: 3.5 [2.5-4]). The driving simulation required participants to drive in daily traffic while attending to a divided attention (DA) task. Computerized measures on the driving task included number of accidents, tickets, speed maintenance, standard deviation of lateral position, and time to collision. Response times and accuracy on the DA task were also computer generated. Additionally, pwMS completed a clinical evaluation encompassing motor, functional, visual, psychosocial and cognitive tests. No differences between healthy controls and pwMS were observed on any of the measures of the primary driving task. PwMS performed worse than healthy controls on DA response time (3.10 s, 2.87-3.68 versus 2.15 s, 2.04-2.43; p = 0.001) and accuracy (15 correct answers, 11-18 versus 24 correct answers, 22-25; p < 0.0001). Depression was significantly associated with time to collision (r = -0.77; p < 0.01). Subjects with mild to moderate MS are able to prioritize the driving task above the DA task. The relationship between depression and driving performance in MS merits further investigation.

  16. Artificial intelligence in mitral valve analysis.

    PubMed

    Jeganathan, Jelliffe; Knio, Ziyad; Amador, Yannis; Hai, Ting; Khamooshian, Arash; Matyal, Robina; Khabbaz, Kamal R; Mahmood, Feroze

    2017-01-01

    Echocardiographic analysis of mitral valve (MV) has become essential for diagnosis and management of patients with MV disease. Currently, the various software used for MV analysis require manual input and are prone to interobserver variability in the measurements. The aim of this study is to determine the interobserver variability in an automated software that uses artificial intelligence for MV analysis. Retrospective analysis of intraoperative three-dimensional transesophageal echocardiography data acquired from four patients with normal MV undergoing coronary artery bypass graft surgery in a tertiary hospital. Echocardiographic data were analyzed using the eSie Valve Software (Siemens Healthcare, Mountain View, CA, USA). Three examiners analyzed three end-systolic (ES) frames from each of the four patients. A total of 36 ES frames were analyzed and included in the study. A multiple mixed-effects ANOVA model was constructed to determine if the examiner, the patient, and the loop had a significant effect on the average value of each parameter. A Bonferroni correction was used to correct for multiple comparisons, and P = 0.0083 was considered to be significant. Examiners did not have an effect on any of the six parameters tested. Patient and loop had an effect on the average parameter value for each of the six parameters as expected (P < 0.0083 for both). We were able to conclude that using automated analysis, it is possible to obtain results with good reproducibility, which only requires minimal user intervention.
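
    For readers who want to see the arithmetic behind the Bonferroni threshold quoted above (0.05 divided by the six tested parameters gives P = 0.0083), here is a minimal Python sketch. The parameter names and p-values in it are hypothetical placeholders, not values from the study.

    ```python
    # Minimal illustration of a Bonferroni correction for k simultaneous tests.
    # The six parameter names and p-values below are hypothetical examples.
    alpha = 0.05
    p_values = {
        "annulus_area": 0.004,
        "annulus_circumference": 0.030,
        "anterior_leaflet_area": 0.0009,
        "posterior_leaflet_area": 0.120,
        "tenting_height": 0.007,
        "tenting_volume": 0.060,
    }

    threshold = alpha / len(p_values)          # 0.05 / 6 ~= 0.0083
    for name, p in p_values.items():
        verdict = "significant" if p < threshold else "not significant"
        print(f"{name}: p = {p:.4f} -> {verdict} at Bonferroni threshold {threshold:.4f}")
    ```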

  17. Status of ARGOS - The Laser Guide Star System for the LBT

    NASA Astrophysics Data System (ADS)

    Raab, Walfried; Rabien, Sebastian; Gaessler, Wolfgang; Esposito, Simone; Antichi, Jacopo; Lloyd-Hart, Michael; Barl, Lothar; Beckmann, Udo; Bonaglia, Marco; Borelli, Jose; Brynnel, Joar; Buschkamp, Peter; Busoni, Lorenzo; Carbonaro, Luca; Christou, Julian; Connot, Claus; Davies, Richard; Deysenroth, Matthias; Durney, Olivier; Green, Richard; Gemperlein, Hans; Gasho, Victor; Haug, Marcus; Hubbard, Pete; Ihle, Sebastian; Kulas, Martin; Loose, Christina; Lehmitz, Michael; Noenickx, Jamison; Nussbaum, Edmund; Orban De Xivry, Gilles; Quirrenbach, Andreas; Peter, Diethard; Rahmer, Gustavo; Rademacher, Matt; Storm, Jesper; Schwab, Christian; Vaitheeswaran, Vidhya; Ziegleder, Julian

    2013-12-01

    ARGOS is an innovative multiple laser guide star adaptive optics system for the Large Binocular Telescope (LBT), designed to perform effective GLAO correction over a very wide field of view. The system uses high-powered pulsed green (532 nm) lasers to generate a set of three guide stars above each of the LBT mirrors. The laser beams are launched through a 40 cm telescope and focused at an altitude of 12 km, creating laser beacons by means of Rayleigh scattering. The returning scattered light, primarily sensitive to the turbulence close to the ground, is detected by a gated wavefront sensor system. The derived ground layer correction signals directly drive the adaptive secondary mirror of the LBT. ARGOS is especially designed for operation with the multiple object spectrograph Luci, which will benefit from both the improved spatial resolution and the strongly enhanced flux. In addition to the GLAO Rayleigh beacon system, ARGOS was also designed for a possible future upgrade with a hybrid sodium-laser/Rayleigh-beacon combination, enabling diffraction-limited operation. The ARGOS laser system has undergone extensive tests during Summer 2012 and is scheduled for installation at the LBT in Spring 2013. The remaining sub-systems will be installed during the course of 2013. We report on the overall status of the ARGOS system and the results of the sub-system characterizations carried out so far.

  18. On the Computation of the RMSEA and CFI from the Mean-And-Variance Corrected Test Statistic with Nonnormal Data in SEM.

    PubMed

    Savalei, Victoria

    2018-01-01

    A new type of nonnormality correction to the RMSEA has recently been developed, which has several advantages over existing corrections. In particular, the new correction adjusts the sample estimate of the RMSEA for the inflation due to nonnormality, while leaving its population value unchanged, so that established cutoff criteria can still be used to judge the degree of approximate fit. A confidence interval (CI) for the new robust RMSEA based on the mean-corrected ("Satorra-Bentler") test statistic has also been proposed. Follow-up work has provided the same type of nonnormality correction for the CFI (Brosseau-Liard & Savalei, 2014). These developments have recently been implemented in lavaan. This note has three goals: a) to show how to compute the new robust RMSEA and CFI from the mean-and-variance corrected test statistic; b) to offer a new CI for the robust RMSEA based on the mean-and-variance corrected test statistic; and c) to caution that the logic of the new nonnormality corrections to RMSEA and CFI is most appropriate for the maximum likelihood (ML) estimator, and cannot easily be generalized to the most commonly used categorical data estimators.
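
    As background for this note, the conventional (non-robust) RMSEA and CFI are simple functions of a model chi-square statistic, its degrees of freedom, and the sample size. The sketch below shows only those textbook formulas with made-up inputs; it does not implement the robust corrections or the new confidence interval proposed in the article.

    ```python
    import math

    def rmsea(chisq, df, n):
        """One common form of the sample RMSEA (noncentrality-based)."""
        return math.sqrt(max(chisq - df, 0.0) / (df * (n - 1)))

    def cfi(chisq_m, df_m, chisq_b, df_b):
        """Comparative Fit Index from the target (m) and baseline (b) model statistics."""
        d_m = max(chisq_m - df_m, 0.0)
        d_b = max(chisq_b - df_b, d_m)
        return 1.0 - d_m / d_b if d_b > 0 else 1.0

    # Hypothetical values: a mean-and-variance corrected statistic could be plugged in
    # here, but the robust corrections discussed in the article require additional
    # scaling terms that this sketch does not include.
    print(rmsea(chisq=85.3, df=48, n=300))   # ~0.051
    print(cfi(85.3, 48, 950.0, 66))          # ~0.958
    ```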

  19. Virtual rough samples to test 3D nanometer-scale scanning electron microscopy stereo photogrammetry.

    PubMed

    Villarrubia, J S; Tondare, V N; Vladár, A E

    2016-01-01

    The combination of scanning electron microscopy for high spatial resolution, images from multiple angles to provide 3D information, and commercially available stereo photogrammetry software for 3D reconstruction offers promise for nanometer-scale dimensional metrology in 3D. A method is described to test 3D photogrammetry software by the use of virtual samples: mathematical samples from which simulated images are made for use as inputs to the software under test. The virtual sample is constructed by wrapping a rough skin with any desired power spectral density around a smooth near-trapezoidal line with rounded top corners. Reconstruction is performed with images simulated from different angular viewpoints. The software's reconstructed 3D model is then compared to the known geometry of the virtual sample. Three commercial photogrammetry software packages were tested. Two of them produced results for line height and width that were within approximately 1 nm of the correct values. All of the packages exhibited some difficulty in reconstructing details of the surface roughness.
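
    To make the idea of "a rough skin with any desired power spectral density" concrete, the following sketch shows one standard way to synthesize a 1-D random roughness profile from a prescribed PSD using random phases and an inverse FFT. It is a generic illustration under that assumption, not the construction actually used by the authors, and the power-law PSD is arbitrary.

    ```python
    import numpy as np

    def rough_profile(n, dx, psd, seed=0):
        """Synthesize a real-valued 1-D roughness profile whose power spectral
        density approximates psd(f), using random phases and an inverse FFT."""
        rng = np.random.default_rng(seed)
        freqs = np.fft.rfftfreq(n, d=dx)               # one-sided frequencies
        amplitude = np.sqrt(psd(freqs) * n / dx)       # amplitude spectrum (approximate scaling)
        phases = rng.uniform(0.0, 2.0 * np.pi, size=freqs.size)
        spectrum = amplitude * np.exp(1j * phases)
        spectrum[0] = 0.0                              # force a zero-mean profile
        return np.fft.irfft(spectrum, n=n)

    # Example: a simple power-law PSD with an arbitrarily chosen corner frequency.
    profile = rough_profile(n=1024, dx=1.0,
                            psd=lambda f: np.where(f > 0, 1.0 / (1.0 + (f / 0.05) ** 2), 0.0))
    print(profile.std())
    ```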

  20. Spectroscopy of Kerr Black Holes with Earth- and Space-Based Interferometers.

    PubMed

    Berti, Emanuele; Sesana, Alberto; Barausse, Enrico; Cardoso, Vitor; Belczynski, Krzysztof

    2016-09-02

    We estimate the potential of present and future interferometric gravitational-wave detectors to test the Kerr nature of black holes through "gravitational spectroscopy," i.e., the measurement of multiple quasinormal mode frequencies from the remnant of a black hole merger. Using population synthesis models of the formation and evolution of stellar-mass black hole binaries, we find that Voyager-class interferometers will be necessary to perform these tests. Gravitational spectroscopy in the local Universe may become routine with the Einstein Telescope, but a 40-km facility like Cosmic Explorer is necessary to go beyond z∼3. In contrast, detectors like eLISA (evolved Laser Interferometer Space Antenna) should carry out a few, or even hundreds, of these tests every year, depending on uncertainties in massive black hole formation models. Many space-based spectroscopic measurements will occur at high redshift, testing the strong gravity dynamics of Kerr black holes in domains where cosmological corrections to general relativity (if they occur in nature) must be significant.

  1. Use of the Moodle Platform to Promote an Ongoing Learning When Lecturing General Physics in the Physics, Mathematics and Electronic Engineering Programmes at the University of the Basque Country UPV/EHU

    NASA Astrophysics Data System (ADS)

    López, Gabriel A.; Sáenz, Jon; Leonardo, Aritz; Gurtubay, Idoia G.

    2016-08-01

    The Moodle platform has been used to put into practice an ongoing evaluation of the students' Physics learning process. The evaluation has been carried out within the framework of the course General Physics, which is taught during the first year of the Physics, Mathematics and Electronic Engineering Programmes at the Faculty of Science and Technology of the University of the Basque Country (UPV/EHU). A test bank with more than 1000 multiple-choice questions, including conceptual and numerical problems, has been prepared. Throughout the course, the students answer a 10-question multiple-choice test for each of the blocks into which the course is divided, after the corresponding material has been covered in the theoretical lectures and problem-solving sessions. The tests are automatically corrected by Moodle, and under certain criteria, the corresponding mark is taken into account for the final mark of the course. According to the results obtained from a statistical study of the data on the student performances during the last four academic years, there is a clear correlation between the marks obtained in the Moodle tests and the final mark of the course. In addition, students who pass the Moodle tests increase their odds of passing the course, with an odds ratio close to 3.
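
    As a reminder of how an odds ratio such as the one reported above is computed, the sketch below works through a hypothetical 2×2 table of Moodle-test outcome versus course outcome; the counts are invented purely for illustration.

    ```python
    # Odds ratio from a hypothetical 2x2 table (counts are illustrative only):
    #                        passed course   failed course
    # passed Moodle tests        a = 120         b = 30
    # failed Moodle tests        c = 40          d = 30
    a, b, c, d = 120, 30, 40, 30
    odds_pass_moodle = a / b        # odds of passing the course if the Moodle tests were passed
    odds_fail_moodle = c / d        # odds of passing the course if the Moodle tests were failed
    odds_ratio = odds_pass_moodle / odds_fail_moodle
    print(f"odds ratio = {odds_ratio:.2f}")   # 3.00 with these made-up counts
    ```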

  2. Novel single nucleotide polymorphism associations with colorectal cancer on chromosome 8q24 in African and European Americans

    PubMed Central

    Kupfer, Sonia S.; Torres, Jada Benn; Hooker, Stanley; Anderson, Jeffrey R.; Skol, Andrew D.; Ellis, Nathan A.; Kittles, Rick A.

    2009-01-01

    Regions on chromosome 8q24 harbor susceptibility alleles for multiple cancers including colorectal (region 3) and prostate cancer (regions 1–4). The objectives of the present study were (i) to test whether single-nucleotide polymorphisms (SNPs) in region 4 are associated with colorectal cancer (CRC) in European or African Americans; (ii) to test whether 8q24 SNPs previously shown to be associated with colorectal and prostate cancer also show association in our multiethnic series and (iii) to test for association between 100 ancestry informative markers (AIMs) and CRC in both the African American and European American cohorts. In total, we genotyped nine markers on 8q24 and 100 unlinked AIMs in 569 CRC cases and 439 controls (490 European Americans and 518 African Americans) obtained retrospectively from a hospital-based sample. We found rs7008482 in 8q24 region 4 to be significantly associated with CRC in European Americans (P = 0.03). Also in region 4, we found that a second SNP, rs16900305, trended toward association with CRC in African Americans. The rs6983267 in region 3, previously implicated in CRC risk, trended toward association with disease in European Americans but not in African Americans. Finally, none of the 100 AIMs tested for association reached statistical significance after correction for multiple hypothesis testing. In summary, these results are evidence that 8q24 region 4 contains novel CRC-associated alleles in European and African Americans. PMID:19520795

  3. Elaborated Corrective Feedback and the Acquisition of Reasoning Skills: A Study of Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    Collins, Maria; And Others

    1987-01-01

    Thirteen learning disabled and 15 remedial high school students were taught reasoning skills using computer-assisted instruction and were given basic or elaborated corrections. Criterion-referenced test scores were significantly higher for the elaborated-corrections treatment on the post- and maintenance tests and on a transfer test assessing…

  4. On the Hedges Correction for a "t"-Test

    ERIC Educational Resources Information Center

    VanHoudnos, Nathan M.; Greenhouse, Joel B.

    2016-01-01

    When cluster randomized experiments are analyzed as if units were independent, test statistics for treatment effects can be anticonservative. Hedges proposed a correction for such tests by scaling them to control their Type I error rate. This article generalizes the Hedges correction from a posttest-only experimental design to more common designs…

  5. Corrective Action Decision Document/Closure Report for Corrective Action Unit 252: Area 25 Engine Test Stand 1 Decontamination Pad, Nevada Test Site, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DOE /NV

    This Corrective Action Decision Document/Closure Report (CADD/CR) has been prepared for Corrective Action Unit (CAU) 252: Area 25 Engine Test Stand-1 Decontamination Pad, in accordance with the Federal Facility Agreement and Consent Order (FFACO). Located at the Nevada Test Site in Nevada, CAU 252 consists of only one Corrective Action Site (25-07-04, Decontamination Pad). This CADD/CR identifies and rationalizes the U.S. Department of Energy, Nevada Operations Office's (DOE/NV's) recommendation that no corrective action is deemed necessary at CAU 252. The Corrective Action Decision Document and Closure Report have been combined into one report because the potential contaminants of concern were either not detected during the corrective action investigation or were only present at naturally occurring concentrations. Based on the field results, neither corrective action nor a corrective action plan is required at this site. A Notice of Completion to DOE/NV is being requested from the Nevada Division of Environmental Protection for closure of CAU 252, as well as a request that this site be moved from Appendix III to Appendix IV of the FFACO. Further, no use restrictions are required to be placed on this CAU.

  6. Association of urinary metal profiles with altered glucose levels and diabetes risk: a population-based study in China.

    PubMed

    Feng, Wei; Cui, Xiuqing; Liu, Bing; Liu, Chuanyao; Xiao, Yang; Lu, Wei; Guo, Huan; He, Meian; Zhang, Xiaomin; Yuan, Jing; Chen, Weihong; Wu, Tangchun

    2015-01-01

    Elevated heavy metals and fasting plasma glucose (FPG) levels were both associated with increased risk of cardiovascular diseases. However, studies on the associations of heavy metals and essential elements with altered FPG and diabetes risk were limited or conflicting. The objective of this study was to evaluate the potential associations of heavy metals and essential trace elements with FPG and diabetes risk among general Chinese population. We conducted a cross-sectional study to investigate the associations of urinary concentrations of 23 metals with FPG, impaired fasting glucose (IFG) and diabetes among 2242 community-based Chinese adults in Wuhan. We used the false discovery rate (FDR) method to correct for multiple hypothesis tests. After adjusting for potential confounders, urinary aluminum, titanium, cobalt, nickel, copper, zinc, selenium, rubidium, strontium, molybdenum, cadmium, antimony, barium, tungsten and lead were associated with altered FPG, IFG or diabetes risk (all P< 0.05); arsenic was only dose-dependently related to diabetes (P< 0.05). After additional adjustment for multiple testing, titanium, copper, zinc, selenium, rubidium, tungsten and lead were still significantly associated with one or more outcomes (all FDR-adjusted P< 0.05). Our results suggest that multiple metals in urine are associated with FPG, IFG or diabetes risk. Because the cross-sectional design precludes inferences about causality, further prospective studies are warranted to validate our findings.
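
    The false discovery rate adjustment cited above is commonly implemented with the Benjamini-Hochberg step-up procedure. A minimal, generic sketch follows; the p-values in it are placeholders rather than results from the study.

    ```python
    def benjamini_hochberg(p_values, q=0.05):
        """Benjamini-Hochberg step-up procedure.
        Returns a boolean list marking which hypotheses are rejected at FDR level q."""
        m = len(p_values)
        order = sorted(range(m), key=lambda i: p_values[i])
        k_max = 0
        for rank, idx in enumerate(order, start=1):
            if p_values[idx] <= rank * q / m:
                k_max = rank                      # largest rank satisfying the step-up criterion
        rejected = [False] * m
        for rank, idx in enumerate(order, start=1):
            if rank <= k_max:
                rejected[idx] = True
        return rejected

    # Placeholder p-values standing in for, e.g., a set of metal-outcome tests:
    pvals = [0.001, 0.004, 0.008, 0.012, 0.030, 0.049, 0.200, 0.510, 0.730]
    print(benjamini_hochberg(pvals, q=0.05))
    ```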

  7. Grayscale Optical Correlator Workbench

    NASA Technical Reports Server (NTRS)

    Hanan, Jay; Zhou, Hanying; Chao, Tien-Hsin

    2006-01-01

    Grayscale Optical Correlator Workbench (GOCWB) is a computer program for use in automatic target recognition (ATR). GOCWB performs ATR with an accurate simulation of a hardware grayscale optical correlator (GOC). This simulation is performed to test filters that are created in GOCWB. Thus, GOCWB can be used as a stand-alone ATR software tool or in combination with GOC hardware for building (target training), testing, and optimization of filters. The software is divided into three main parts, denoted filter, testing, and training. The training part is used for assembling training images as input to a filter. The filter part is used for combining training images into a filter and optimizing that filter. The testing part is used for testing new filters and for general simulation of GOC output. The current version of GOCWB relies on the mathematical software tools from MATLAB binaries for performing matrix operations and fast Fourier transforms. Optimization of filters is based on an algorithm, known as OT-MACH, in which variables specified by the user are parameterized and the best filter is selected on the basis of an average result for correct identification of targets in multiple test images.
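
    The OT-MACH filter synthesis itself is not reproduced here, but the frequency-plane correlation that a grayscale optical correlator emulates can be illustrated generically in a few lines of Python. The arrays below are synthetic and the code is a sketch of plain FFT-based correlation, not GOCWB's actual implementation.

    ```python
    import numpy as np

    def correlate_fft(scene, template):
        """Generic frequency-plane correlation: multiply the scene spectrum by the
        conjugate template spectrum and inverse-transform to get the correlation plane."""
        S = np.fft.fft2(scene)
        T = np.fft.fft2(template, s=scene.shape)
        plane = np.fft.ifft2(S * np.conj(T))
        return np.abs(plane)

    # Synthetic example: embed a small bright patch (the "target") in noise.
    rng = np.random.default_rng(1)
    scene = rng.normal(0.0, 0.1, (128, 128))
    target = np.ones((8, 8))
    scene[40:48, 70:78] += target
    plane = correlate_fft(scene, target)
    peak = np.unravel_index(np.argmax(plane), plane.shape)
    print("correlation peak near:", peak)   # expected near (40, 70)
    ```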

  8. Testing Seam Concepts for Advanced Multilayer Insulation

    NASA Technical Reports Server (NTRS)

    Chato, D. J.; Johnson, W. L.; Alberts, Samantha J.

    2017-01-01

    Multilayer insulation (MLI) is considered the state-of-the-art insulation for cryogenic propellant tanks in the space environment. MLI traditionally consists of multiple layers of metalized films separated by low conductivity spacers. In order to better understand some of the details within MLI design and construction, GRC has been investigating the heat loads caused by multiple types of seams. To date, testing has been completed with 20 layer and 50 layer blankets. Although a truly seamless blanket is not practical, a blanket lay-up where each individual layer was overlapped and taped together was used as a baseline for the other seam tests. Other seam concepts tested included an overlap where the complete blanket was overlapped on top of itself; a butt joint where the blankets were just trimmed and butted up against each other; and a staggered butt joint where the seam in the outer layers is offset from the seam in the inner layers. Measured performance is based on a preliminary analysis of rod calibration tests conducted prior to the start of seam testing. Baseline performance for the 50 layer blanket showed a measured heat load of 0.46 Watts with a degradation to about 0.47 Watts in the seamed blankets. Baseline performance for the 20 layer blanket showed a measured heat load of 0.57 Watts. Heat loads for the seamed tests are still being analyzed. So far, analysis work has suggested the need for corrections due to heat loads from both the heater leads and the instrumentation wires. A careful re-examination of the calibration test results with these factors accounted for is also underway. This presentation will discuss the theory of seams in MLI, our test results to date, and the uncertainties in our measurements.

  9. Least-Squares Neutron Spectral Adjustment with STAYSL PNNL

    NASA Astrophysics Data System (ADS)

    Greenwood, L. R.; Johnson, C. D.

    2016-02-01

    The STAYSL PNNL computer code, a descendant of the STAY'SL code [1], performs neutron spectral adjustment of a starting neutron spectrum, applying a least squares method to determine adjustments based on saturated activation rates, neutron cross sections from evaluated nuclear data libraries, and all associated covariances. STAYSL PNNL is provided as part of a comprehensive suite of programs [2], where additional tools in the suite are used for assembling a set of nuclear data libraries and determining all required corrections to the measured data to determine saturated activation rates. Neutron cross section and covariance data are taken from the International Reactor Dosimetry File (IRDF-2002) [3], which was sponsored by the International Atomic Energy Agency (IAEA), though work is planned to update to data from the IAEA's International Reactor Dosimetry and Fusion File (IRDFF) [4]. The nuclear data and associated covariances are extracted from IRDF-2002 using the third-party NJOY99 computer code [5]. The NJpp translation code converts the extracted data into a library data array format suitable for use as input to STAYSL PNNL. The software suite also includes three utilities to calculate corrections to measured activation rates. Neutron self-shielding corrections are calculated as a function of neutron energy with the SHIELD code and are applied to the group cross sections prior to spectral adjustment, thus making the corrections independent of the neutron spectrum. The SigPhi Calculator is a Microsoft Excel spreadsheet used for calculating saturated activation rates from raw gamma activities by applying corrections for gamma self-absorption, neutron burn-up, and the irradiation history. Gamma self-absorption and neutron burn-up corrections are calculated (iteratively in the case of the burn-up) within the SigPhi Calculator spreadsheet. The irradiation history corrections are calculated using the BCF computer code and are inserted into the SigPhi Calculator workbook for use in correcting the measured activities. Output from the SigPhi Calculator is automatically produced, and consists of a portion of the STAYSL PNNL input file data that is required to run the spectral adjustment calculations. Within STAYSL PNNL, the least-squares process is performed in one step, without iteration, and provides rapid results on PC platforms. STAYSL PNNL creates multiple output files with tabulated results, data suitable for plotting, and data formatted for use in subsequent radiation damage calculations using the SPECTER computer code (which is not included in the STAYSL PNNL suite). All components of the software suite have undergone extensive testing and validation prior to release and test cases are provided with the package.
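
    In very generic terms, the adjustment step can be viewed as a generalized least-squares update of a prior group-wise spectrum using measured reaction rates and all covariances. The sketch below shows only that generic update with invented matrices; it is not the STAYSL PNNL algorithm and omits the self-shielding, burn-up, and irradiation-history corrections described above.

    ```python
    import numpy as np

    def gls_adjust(phi0, cov_phi, response, rates, cov_rates):
        """Generic generalized least-squares adjustment of a group-wise spectrum phi0
        given measured saturated activation rates, rates ~ response @ phi."""
        residual = rates - response @ phi0
        gain = cov_phi @ response.T @ np.linalg.inv(
            response @ cov_phi @ response.T + cov_rates)
        phi_adj = phi0 + gain @ residual
        cov_adj = cov_phi - gain @ response @ cov_phi
        return phi_adj, cov_adj

    # Toy example: 3 energy groups, 2 activation reactions (all numbers invented).
    phi0 = np.array([1.0, 0.8, 0.3])
    cov_phi = np.diag([0.04, 0.03, 0.02])
    response = np.array([[0.2, 0.5, 0.1],
                         [0.1, 0.2, 0.6]])
    rates = np.array([0.70, 0.45])
    cov_rates = np.diag([0.001, 0.001])
    phi_adj, cov_adj = gls_adjust(phi0, cov_phi, response, rates, cov_rates)
    print(phi_adj)
    ```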

  10. The development and validation of a test of science critical thinking for fifth graders.

    PubMed

    Mapeala, Ruslan; Siew, Nyet Moi

    2015-01-01

    The paper described the development and validation of the Test of Science Critical Thinking (TSCT) to measure the three critical thinking skill constructs: comparing and contrasting, sequencing, and identifying cause and effect. The initial TSCT consisted of 55 multiple choice test items, each of which required participants to select a correct response and a correct choice of critical thinking used for their response. Data were obtained from a purposive sampling of 30 fifth graders in a pilot study carried out in a primary school in Sabah, Malaysia. Students underwent 9 weeks of teaching and learning activities using the Thinking Maps-aided Problem-Based Learning Module before they answered the TSCT test. Analyses were conducted to check on difficulty index (p) and discrimination index (d), internal consistency reliability, content validity, and face validity. Analysis of the test-retest reliability data was conducted separately for a group of fifth graders with similar ability. Findings of the pilot study showed that out of the initial 55 administered items, only 30 items with a relatively good difficulty index (p), ranging from 0.40 to 0.60, and a good discrimination index (d), ranging from 0.20 to 1.00, were selected. The Kuder-Richardson reliability value was found to be appropriate and relatively high, at 0.70, 0.73 and 0.92 for identifying cause and effect, sequencing, and comparing and contrasting respectively. The content validity index obtained from three expert judgments equalled or exceeded 0.95. In addition, test-retest reliability showed good, statistically significant correlations ([Formula: see text]). From the above results, the selected 30-item TSCT was found to have sufficient reliability and validity and would therefore represent a useful tool for measuring critical thinking ability among fifth graders in primary science.
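
    For context, a commonly used form of the item difficulty index (p) and an upper/lower-group discrimination index (d) can be computed as in the sketch below. The response data are invented and the exact formulas used in the study may differ.

    ```python
    def item_stats(correct_by_student, scores):
        """correct_by_student: list of 0/1 responses to one item, one entry per student.
        scores: total test scores used to split students into upper and lower groups."""
        n = len(correct_by_student)
        p = sum(correct_by_student) / n                      # difficulty index
        order = sorted(range(n), key=lambda i: scores[i])
        k = max(1, n // 3)                                   # lower/upper thirds (one common choice)
        lower, upper = order[:k], order[-k:]
        d = (sum(correct_by_student[i] for i in upper) -
             sum(correct_by_student[i] for i in lower)) / k  # discrimination index
        return p, d

    # Invented responses for one item from 9 students:
    item = [1, 0, 1, 1, 0, 1, 0, 1, 1]
    totals = [22, 10, 25, 18, 9, 27, 12, 20, 24]
    print(item_stats(item, totals))   # (0.67, 1.0) with these made-up data
    ```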

  11. Corrective Action Decision Document/Closure Report for Corrective Action Unit 105: Area 2 Yucca Flat Atmospheric Test Sites, Nevada National Security Site, Nevada, Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthews, Patrick

    2014-01-01

    The purpose of this Corrective Action Decision Document/Closure Report is to provide justification and documentation supporting the recommendation that no further corrective action is needed for CAU 105 based on the implementation of the corrective actions. Corrective action investigation (CAI) activities were performed from October 22, 2012, through May 23, 2013, as set forth in the Corrective Action Investigation Plan for Corrective Action Unit 105: Area 2 Yucca Flat Atmospheric Test Sites; and in accordance with the Soils Activity Quality Assurance Plan, which establishes requirements, technical planning, and general quality practices.

  12. An analysis of regional cerebral blood flow in impulsive murderers using single photon emission computed tomography.

    PubMed

    Amen, Daniel G; Hanks, Chris; Prunella, Jill R; Green, Aisa

    2007-01-01

    The authors explored differences in regional cerebral blood flow (rCBF) in 11 impulsive murderers and 11 healthy comparison subjects using single photon emission computed tomography. The authors assessed subjects at rest and during a computerized go/no-go concentration task. Using statistical parametric mapping software, the authors performed voxel-by-voxel t tests to assess significant differences, making family-wise error corrections for multiple comparisons. Murderers were found to have significantly lower relative rCBF during concentration, particularly in areas associated with concentration and impulse control. These results indicate that nonemotionally laden stimuli may result in frontotemporal dysregulation in people predisposed to impulsive violence.

  13. TU-H-206-04: An Effective Homomorphic Unsharp Mask Filtering Method to Correct Intensity Inhomogeneity in Daily Treatment MR Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, D; Gach, H; Li, H

    Purpose: The daily treatment MRIs acquired on MR-IGRT systems, like diagnostic MRIs, suffer from intensity inhomogeneity associated with B1 and B0 inhomogeneities. An improved homomorphic unsharp mask (HUM) filtering method, automatic and robust body segmentation, and imaging field-of-view (FOV) detection methods were developed to compute the multiplicative slowly varying correction field and correct the intensity inhomogeneity. The goal is to improve and normalize the voxel intensity so that the images can be processed more accurately by quantitative methods (e.g., segmentation and registration) that require consistent image voxel intensity values. Methods: HUM methods have been widely used for years. A body mask is required; otherwise, the body surface in the corrected image would be incorrectly bright due to the sudden intensity transition at the body surface. In this study, we developed an improved HUM-based correction method that includes three main components: 1) Robust body segmentation on the normalized image gradient map, 2) Robust FOV detection (needed for body segmentation) using region growing and morphologic filters, and 3) An effective implementation of HUM using repeated Gaussian convolution. Results: The proposed method was successfully tested on patient images of common anatomical sites (H/N, lung, abdomen and pelvis). Initial qualitative comparisons showed that this improved HUM method outperformed three recently published algorithms (FCM, LEMS, MICO) in both computation speed (by 50+ times) and robustness (in intermediate to severe inhomogeneity situations). Currently implemented in MATLAB, it takes 20 to 25 seconds to process a 3D MRI volume. Conclusion: Compared to more sophisticated MRI inhomogeneity correction algorithms, the improved HUM method is simple and effective. The inhomogeneity correction, body mask, and FOV detection methods developed in this study would be useful as preprocessing tools for many MRI-related research and clinical applications in radiotherapy. Authors have received research grants from ViewRay and Varian.
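
    A heavily reduced sketch of a homomorphic unsharp mask correction, using only a given body mask and repeated Gaussian convolution as the low-pass filter, is shown below. It is a generic illustration under those assumptions (the automatic FOV detection and body segmentation of the actual method are omitted), and the function and parameter names are hypothetical.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def hum_correct(image, body_mask, sigma=25.0, passes=3, eps=1e-6):
        """Homomorphic unsharp mask: divide the image by a smooth estimate of the
        slowly varying bias field, estimated inside the body mask only."""
        masked = np.where(body_mask, image, 0.0)
        weights = body_mask.astype(float)
        low, norm = masked.copy(), weights.copy()
        for _ in range(passes):                       # repeated Gaussian convolution
            low = gaussian_filter(low, sigma)
            norm = gaussian_filter(norm, sigma)
        bias = np.where(body_mask, low / np.maximum(norm, eps), 1.0)
        mean_level = image[body_mask].mean()
        return np.where(body_mask, image * mean_level / np.maximum(bias, eps), image)

    # Synthetic 2-D example: a flat circular "body" modulated by a slowly varying field.
    yy, xx = np.mgrid[0:128, 0:128]
    mask = (xx - 64) ** 2 + (yy - 64) ** 2 < 50 ** 2
    img = np.where(mask, 100.0 * (1.0 + 0.4 * xx / 128.0), 0.0)
    out = hum_correct(img, mask)
    print(img[mask].std(), out[mask].std())   # intensity variation inside the body shrinks
    ```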

  14. Wall interference tests of a CAST 10-2/DOA 2 airfoil in an adaptive-wall test section

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.

    1987-01-01

    A wind-tunnel investigation of a CAST 10-2/DOA 2 airfoil model has been conducted in the adaptive-wall test section of the Langley 0.3-Meter Transonic Cryogenic Tunnel (TCT) and in the National Aeronautical Establishment High Reynolds Number Two-Dimensional Test Facility. The primary goal of the tests was to assess two different wall-interference correction techniques: adaptive test-section walls and classical analytical corrections. Tests were conducted over a Mach number range from 0.3 to 0.8 and over a chord Reynolds number range from 6 million to 70 million. The airfoil aerodynamic characteristics from the tests in the 0.3-m TCT have been corrected for wall interference by the movement of the adaptive walls. No additional corrections for any residual interference have been applied to the data, to allow comparison with the classically corrected data from the same model in the conventional National Aeronautical Establishment facility. The data are presented graphically in this report as integrated force-and-moment coefficients and chordwise pressure distributions.

  15. A Systematic Replication and Extension of Using Incremental Rehearsal to Improve Multiplication Skills: An Investigation of Generalization

    ERIC Educational Resources Information Center

    Codding, Robin S.; Archer, Jillian; Connell, James

    2010-01-01

    The purpose of this study was to replicate and extend a previous study by Burns ("Education and Treatment of Children" 28: 237-249, 2005) examining the effectiveness of incremental rehearsal on computation performance. A multiple-probe design across multiplication problem sets was employed for one participant to examine digits correct per minute…

  16. Lidar inelastic multiple-scattering parameters of cirrus particle ensembles determined with geometrical-optics crystal phase functions.

    PubMed

    Reichardt, J; Hess, M; Macke, A

    2000-04-20

    Multiple-scattering correction factors for cirrus particle extinction coefficients measured with Raman and high spectral resolution lidars are calculated with a radiative-transfer model. Cirrus particle-ensemble phase functions are computed from single-crystal phase functions derived in a geometrical-optics approximation. Seven crystal types are considered. In cirrus clouds with height-independent particle extinction coefficients the general pattern of the multiple-scattering parameters has a steep onset at cloud base with values of 0.5-0.7 followed by a gradual and monotonic decrease to 0.1-0.2 at cloud top. The larger the scattering particles are, the more gradual is the rate of decrease. Multiple-scattering parameters of complex crystals and of imperfect hexagonal columns and plates can be well approximated by those of projected-area equivalent ice spheres, whereas perfect hexagonal crystals show values as much as 70% higher than those of spheres. The dependencies of the multiple-scattering parameters on cirrus particle spectrum, base height, and geometric depth, and on the lidar parameters (laser wavelength and receiver field of view), are discussed, and a set of multiple-scattering parameter profiles for the correction of extinction measurements in homogeneous cirrus is provided.

  17. Error correcting circuit design with carbon nanotube field effect transistors

    NASA Astrophysics Data System (ADS)

    Liu, Xiaoqiang; Cai, Li; Yang, Xiaokuo; Liu, Baojun; Liu, Zhongyong

    2018-03-01

    In this work, a parallel error correcting circuit based on the (7, 4) Hamming code is designed and implemented with carbon nanotube field effect transistors, and its function is validated by simulation in HSpice with the Stanford model. A grouping method which is able to correct multiple bit errors in 16-bit and 32-bit applications is proposed, and its error correction capability is analyzed. The performance of circuits implemented with CNTFETs and with traditional MOSFETs is also compared; the former shows a 34.4% reduction in layout area and a 56.9% reduction in power consumption.
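
    As a software companion to the (7, 4) Hamming code mentioned above, here is a hedged Python sketch of encoding and single-bit error correction. It uses one standard generator/parity-check pair and is not a model of the specific parallel circuit or grouping method in the paper.

    ```python
    import numpy as np

    # One standard (7,4) Hamming code: codeword = [d1 d2 d3 d4 p1 p2 p3].
    G = np.array([[1, 0, 0, 0, 1, 1, 0],
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])
    H = np.array([[1, 1, 0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])

    def encode(data4):
        return (np.array(data4) @ G) % 2

    def correct(received7):
        r = np.array(received7).copy()
        syndrome = (H @ r) % 2
        if syndrome.any():
            # The syndrome equals the column of H at the error position.
            for col in range(7):
                if np.array_equal(H[:, col], syndrome):
                    r[col] ^= 1
                    break
        return r

    data = [1, 0, 1, 1]
    cw = encode(data)
    noisy = cw.copy()
    noisy[2] ^= 1                                # flip one bit
    print(np.array_equal(correct(noisy), cw))    # True: the single-bit error is fixed
    ```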

  18. Examination of association to autism of common genetic variationin genes related to dopamine.

    PubMed

    Anderson, B M; Schnetz-Boutaud, N; Bartlett, J; Wright, H H; Abramson, R K; Cuccaro, M L; Gilbert, J R; Pericak-Vance, M A; Haines, J L

    2008-12-01

    Autism is a severe neurodevelopmental disorder characterized by a triad of complications. Autistic individuals display significant disturbances in language and reciprocal social interactions, combined with repetitive and stereotypic behaviors. Prevalence studies suggest that autism is more common than originally believed, with recent estimates citing a rate of one in 150. Although multiple genetic linkage and association studies have yielded multiple suggestive genes or chromosomal regions, a specific risk locus has yet to be identified and widely confirmed. Because many etiologies have been suggested for this complex syndrome, we hypothesize that one of the difficulties in identifying autism genes is that multiple genetic variants may be required to significantly increase the risk of developing autism. Thus, we took the alternative approach of examining 14 prominent dopamine pathway candidate genes for detailed study by genotyping 28 single nucleotide polymorphisms. Although we did observe a nominally significant association for rs2239535 (P=0.008) on chromosome 20, single-locus analysis did not reveal any results as significant after correction for multiple comparisons. No significant interaction was identified when Multifactor Dimensionality Reduction was employed to test specifically for multilocus effects. Although genome-wide linkage scans in autism have provided support for linkage to various loci along the dopamine pathway, our study does not provide strong evidence of linkage or association to any specific gene or combination of genes within the pathway. These results demonstrate that common genetic variation within the tested genes located within this pathway at most play a minor to moderate role in overall autism pathogenesis.

  19. 78 FR 60679 - Airworthiness Directives; The Boeing Company Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-02

    ... Company Model 717-200 airplanes. This AD was prompted by multiple reports of cracks of overwing frames. This AD requires repetitive inspections for cracking of the overwing frames, and corrective actions if necessary. We are issuing this AD to detect and correct such cracking that could sever a frame, which may...

  20. From Corrections to Community: The Juvenile Reentry Experience as Characterized by Multiple Systems Involvement

    ERIC Educational Resources Information Center

    Cusick, Gretchen Ruth; Goerge, Robert M.; Bell, Katie Claussen

    2009-01-01

    This Chapin Hall report describes findings on the extent of system involvement among Illinois youth released from correctional facilities, tracking a population of youth under age 18 in Illinois following their release. Using administrative records, researchers develop profiles of reentry experiences across the many systems that serve youth and…

  1. The Core: Teaching Your Child the Foundations of Classical Education

    ERIC Educational Resources Information Center

    Bortins, Leigh A.

    2010-01-01

    In the past, correct spelling, the multiplication tables, the names of the state capitals and the American presidents were basics that all children were taught in school. Today, many children graduate without this essential knowledge. Most curricula today follow a haphazard sampling of topics with a focus on political correctness instead of…

  2. Confidence-Based Assessments within an Adult Learning Environment

    ERIC Educational Resources Information Center

    Novacek, Paul

    2013-01-01

    Traditional knowledge assessments rely on multiple-choice type questions that only report a right or wrong answer. The reliance within the education system on this technique infers that a student who provides a correct answer purely through guesswork possesses knowledge equivalent to a student who actually knows the correct answer. A more complete…

  3. The Effect and Implications of a "Self-Correcting" Assessment Procedure

    ERIC Educational Resources Information Center

    Francis, Alisha L.; Barnett, Jerrold

    2012-01-01

    We investigated Montepare's (2005, 2007) self-correcting procedure for multiple-choice exams. Findings related to memory suggest this procedure should lead to improved retention by encouraging students to distribute the time spent reviewing the material. Results from a general psychology class (n = 98) indicate that the benefits are not as…

  4. Electronic camera-management system for 35-mm and 70-mm film cameras

    NASA Astrophysics Data System (ADS)

    Nielsen, Allan

    1993-01-01

    Military and commercial test facilities have been tasked with the need for increasingly sophisticated data collection and data reduction. A state-of-the-art electronic control system for high speed 35 mm and 70 mm film cameras designed to meet these tasks is described. Data collection in today's test range environment is difficult at best. The need for a completely integrated image and data collection system is mandated by the increasingly complex test environment. Instrumentation film cameras have been used on test ranges to capture images for decades. Their high frame rates coupled with exceptionally high resolution make them an essential part of any test system. In addition to documenting test events, today's camera system is required to perform many additional tasks. Data reduction to establish TSPI (time- space-position information) may be performed after a mission and is subject to all of the variables present in documenting the mission. A typical scenario would consist of multiple cameras located on tracking mounts capturing the event along with azimuth and elevation position data. Corrected data can then be reduced using each camera's time and position deltas and calculating the TSPI of the object using triangulation. An electronic camera control system designed to meet these requirements has been developed by Photo-Sonics, Inc. The feedback received from test technicians at range facilities throughout the world led Photo-Sonics to design the features of this control system. These prominent new features include: a comprehensive safety management system, full local or remote operation, frame rate accuracy of less than 0.005 percent, and phase locking capability to Irig-B. In fact, Irig-B phase lock operation of multiple cameras can reduce the time-distance delta of a test object traveling at mach-1 to less than one inch during data reduction.
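
    To make the triangulation step concrete, the sketch below intersects the bearing lines from two hypothetical camera stations in a flat 2-D plane. Real TSPI reduction also uses elevation angles, timing corrections, and more than two stations, all of which this toy example omits.

    ```python
    import math

    def triangulate_2d(p1, az1_deg, p2, az2_deg):
        """Intersect two bearing rays in the plane.
        p1, p2: (x, y) station positions; az*: azimuths in degrees measured from +y (north)."""
        d1 = (math.sin(math.radians(az1_deg)), math.cos(math.radians(az1_deg)))
        d2 = (math.sin(math.radians(az2_deg)), math.cos(math.radians(az2_deg)))
        # Solve p1 + t1*d1 = p2 + t2*d2 for t1 using the 2x2 cross-product formula.
        denom = d1[0] * d2[1] - d1[1] * d2[0]
        if abs(denom) < 1e-12:
            raise ValueError("bearings are parallel; no unique intersection")
        dx, dy = p2[0] - p1[0], p2[1] - p1[1]
        t1 = (dx * d2[1] - dy * d2[0]) / denom
        return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

    # Hypothetical stations 1000 m apart sighting the same object:
    print(triangulate_2d((0.0, 0.0), 45.0, (1000.0, 0.0), 315.0))   # -> (500.0, 500.0)
    ```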

  5. Reliability model of a monopropellant auxiliary propulsion system

    NASA Technical Reports Server (NTRS)

    Greenberg, J. S.

    1971-01-01

    A mathematical model and associated computer code has been developed which computes the reliability of a monopropellant blowdown hydrazine spacecraft auxiliary propulsion system as a function of time. The propulsion system is used to adjust or modify the spacecraft orbit over an extended period of time. The multiple orbit corrections are the multiple objectives which the auxiliary propulsion system is designed to achieve. Thus the reliability model computes the probability of successfully accomplishing each of the desired orbit corrections. To accomplish this, the reliability model interfaces with a computer code that models the performance of a blowdown (unregulated) monopropellant auxiliary propulsion system. The computer code acts as a performance model and as such gives an accurate time history of the system operating parameters. The basic timing and status information is passed on to and utilized by the reliability model which establishes the probability of successfully accomplishing the orbit corrections.

  6. Investigation of genetic variation in scavenger receptor class B, member 1 (SCARB1) and association with serum carotenoids

    PubMed Central

    McKay, Gareth J; Loane, Edward; Nolan, John M; Patterson, Christopher C; Meyers, Kristin J; Mares, Julie A; Yonova-Doing, Ekaterina; Hammond, Christopher J; Beatty, Stephen; Silvestri, Giuliana

    2013-01-01

    Objective: To investigate association of scavenger receptor class B, member 1 (SCARB1) genetic variants with serum carotenoid levels of lutein (L) and zeaxanthin (Z) and macular pigment optical density (MPOD). Design: A cross-sectional study of healthy adults aged 20-70. Participants: 302 participants recruited following local advertisement. Methods: MPOD was measured by customized heterochromatic flicker photometry. Fasting blood samples were taken for serum L and Z measurement by HPLC and lipoprotein analysis by spectrophotometric assay. Forty-seven single nucleotide polymorphisms (SNPs) across SCARB1 were genotyped using Sequenom technology. Association analyses were performed using PLINK to compare allele and haplotype means, with adjustment for potential confounding and correction for multiple comparisons by permutation testing. Replication analysis was performed in the TwinsUK and CAREDS cohorts. Main outcome measures: Odds ratios (ORs) for macular pigment optical density area, serum lutein and zeaxanthin concentrations associated with genetic variations in SCARB1 and interactions between SCARB1 and sex. Results: Following multiple regression analysis with adjustment for age, body mass index, sex, high-density lipoprotein cholesterol (HDLc), low-density lipoprotein cholesterol (LDLc), triglycerides, smoking, dietary L and Z levels, 5 SNPs were significantly associated with serum L concentration and 1 SNP with MPOD (P<0.01). Only the association between rs11057841 and serum L withstood correction for multiple comparisons by permutation testing (P<0.01) and replicated in the TwinsUK cohort (P=0.014). Independent replication was also observed in the CAREDS cohort with rs10846744 (P = 2 × 10(-4)), a SNP in high linkage disequilibrium with rs11057841 (r² = 0.93). No significant interactions by sex were found. Haplotype analysis revealed no stronger association than obtained with single SNP analyses. Conclusions: Our study has identified association between rs11057841 and serum L concentration (24% increase per T allele) in healthy subjects, independent of potential confounding factors. Our data supports further evaluation of the role for SCARB1 in the transport of macular pigment and the possible modulation of AMD risk through combating the effects of oxidative stress within the retina. PMID:23562302

  7. Plasma lipidomic profiles and cardiovascular events in a randomized intervention trial with the Mediterranean diet.

    PubMed

    Toledo, Estefanía; Wang, Dong D; Ruiz-Canela, Miguel; Clish, Clary B; Razquin, Cristina; Zheng, Yan; Guasch-Ferré, Marta; Hruby, Adela; Corella, Dolores; Gómez-Gracia, Enrique; Fiol, Miquel; Estruch, Ramón; Ros, Emilio; Lapetra, José; Fito, Montserrat; Aros, Fernando; Serra-Majem, Luis; Liang, Liming; Salas-Salvadó, Jordi; Hu, Frank B; Martínez-González, Miguel A

    2017-10-01

    Background: Lipid metabolites may partially explain the inverse association between the Mediterranean diet (MedDiet) and cardiovascular disease (CVD). Objective: We evaluated the associations between 1) lipid species and the risk of CVD (myocardial infarction, stroke, or cardiovascular death); 2) a MedDiet intervention [supplemented with extra virgin olive oil (EVOO) or nuts] and 1-y changes in these molecules; and 3) 1-y changes in lipid species and subsequent CVD. Design: With the use of a case-cohort design, we profiled 202 lipid species at baseline and after 1 y of intervention in the PREDIMED (PREvención con DIeta MEDiterránea) trial in 983 participants [230 cases and a random subcohort of 790 participants (37 overlapping cases)]. Results: Baseline concentrations of cholesterol esters (CEs) were inversely associated with CVD. A shorter chain length and higher saturation of some lipids were directly associated with CVD. After adjusting for multiple testing, direct associations remained significant for 20 lipids, and inverse associations remained significant for 6 lipids. When lipid species were weighted by the number of carbon atoms and double bonds, the strongest inverse association was found for CEs [HR: 0.39 (95% CI: 0.22, 0.68)] between extreme quintiles (P-trend = 0.002). Participants in the MedDiet + EVOO and MedDiet + nut groups experienced significant (P < 0.05) 1-y changes in 20 and 17 lipids, respectively, compared with the control group. Of these changes, only those in CE(20:3) in the MedDiet + nuts group remained significant after correcting for multiple testing. None of the 1-y changes was significantly associated with CVD risk after correcting for multiple comparisons. Conclusions: Although the MedDiet interventions induced some significant 1-y changes in the lipidome, they were not significantly associated with subsequent CVD risk. Lipid metabolites with a longer acyl chain and higher number of double bonds at baseline were significantly and inversely associated with the risk of CVD. © 2017 American Society for Nutrition.

  8. Classification of mineral deposits into types using mineralogy with a probabilistic neural network

    USGS Publications Warehouse

    Singer, Donald A.; Kouda, Ryoichi

    1997-01-01

    In order to determine whether it is desirable to quantify mineral-deposit models further, a test of the ability of a probabilistic neural network to classify deposits into types based on mineralogy was conducted. Presence or absence of ore and alteration mineralogy in well-typed deposits were used to train the network. To reduce the number of minerals considered, the analyzed data were restricted to minerals present in at least 20% of at least one deposit type. An advantage of this restriction is that single or rare occurrences of minerals did not dominate the results. Probabilistic neural networks can provide mathematically sound confidence measures based on Bayes theorem and are relatively insensitive to outliers. Founded on Parzen density estimation, they require no assumptions about distributions of random variables used for classification, even handling multimodal distributions. They train quickly and work as well as, or better than, multiple-layer feedforward networks. Tests were performed with a probabilistic neural network employing a Gaussian kernel and separate sigma weights for each class and each variable. The training set was reduced to the presence or absence of 58 reported minerals in eight deposit types. The training set included: 49 Cyprus massive sulfide deposits; 200 kuroko massive sulfide deposits; 59 Comstock epithermal vein gold districts; 17 quartz-alunite epithermal gold deposits; 25 Creede epithermal gold deposits; 28 sedimentary-exhalative zinc-lead deposits; 28 Sado epithermal vein gold deposits; and 100 porphyry copper deposits. The most common training problem was the error of classifying about 27% of Cyprus-type deposits in the training set as kuroko. In independent tests with deposits not used in the training set, 88% of 224 kuroko massive sulfide deposits were classed correctly, 92% of 25 porphyry copper deposits, 78% of 9 Comstock epithermal gold-silver districts, and 83% of six quartz-alunite epithermal gold deposits were classed correctly. Across all deposit types, 88% of deposits in the validation dataset were correctly classed. Misclassifications were most common if a deposit was characterized by only a few minerals, e.g., pyrite, chalcopyrite, and sphalerite. The success rate jumped to 98% correctly classed deposits when just two rock types were added. Such a high success rate of the probabilistic neural network suggests that not only should this preliminary test be expanded to include other deposit types, but that other deposit features should be added.
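
    The Parzen-window idea behind a probabilistic neural network with a Gaussian kernel can be illustrated compactly on synthetic presence/absence vectors, as in the sketch below. It uses a single sigma per class for brevity, whereas the study used separate sigma weights for each class and each variable, so this is an illustration rather than a reproduction of that network; the mineral vectors and labels are invented.

    ```python
    import numpy as np

    def pnn_classify(x, train_X, train_y, sigmas):
        """Probabilistic neural network with a Gaussian (Parzen) kernel.
        train_X: (n, d) binary presence/absence vectors; train_y: class labels;
        sigmas: one smoothing width per class (per-variable sigmas are omitted here)."""
        classes = sorted(set(train_y))
        scores = {}
        for c in classes:
            Xc = train_X[np.array(train_y) == c]
            d2 = ((Xc - x) ** 2).sum(axis=1)
            scores[c] = np.exp(-d2 / (2.0 * sigmas[c] ** 2)).mean()   # class-conditional density estimate
        total = sum(scores.values())
        return max(scores, key=scores.get), {c: s / total for c, s in scores.items()}

    # Tiny synthetic example with 5 "minerals" and two deposit types:
    X = np.array([[1, 1, 0, 0, 0],
                  [1, 1, 1, 0, 0],
                  [0, 0, 1, 1, 1],
                  [0, 1, 1, 1, 1]])
    y = ["kuroko", "kuroko", "porphyry", "porphyry"]
    label, posterior = pnn_classify(np.array([1, 1, 0, 1, 0]), X, y,
                                    {"kuroko": 0.8, "porphyry": 0.8})
    print(label, posterior)
    ```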

  9. Interspecific hybridization causes long-term phylogenetic discordance between nuclear and mitochondrial genomes in freshwater fishes.

    PubMed

    Wallis, Graham P; Cameron-Christie, Sophia R; Kennedy, Hannah L; Palmer, Gemma; Sanders, Tessa R; Winter, David J

    2017-06-01

    Classification, phylogeography and the testing of evolutionary hypotheses rely on correct estimation of species phylogeny. Early molecular phylogenies often relied on mtDNA alone, which acts as a single linkage group with one history. Over the last decade, the use of multiple nuclear sequences has often revealed conflict among gene trees. This observation can be attributed to hybridization, lineage sorting, paralogy or selection. Here, we use 54 groups of fishes from 48 studies to estimate the degree of concordance between mitochondrial and nuclear gene trees in two ecological grades of fishes: marine and freshwater. We test the hypothesis that freshwater fish phylogenies should, on average, show more discordance because of their higher propensity for hybridization in the past. In keeping with this idea, concordance between mitochondrial and nuclear gene trees (as measured by proportion of components shared) is on average 50% higher in marine fishes. We discuss why this difference almost certainly results from introgression caused by greater historical hybridization among lineages in freshwater groups, and further emphasize the need to use multiple nuclear genes, and identify conflict among them, in estimation of species phylogeny. © 2017 John Wiley & Sons Ltd.

  10. Phase-field-based multiple-relaxation-time lattice Boltzmann model for incompressible multiphase flows.

    PubMed

    Liang, H; Shi, B C; Guo, Z L; Chai, Z H

    2014-05-01

    In this paper, a phase-field-based multiple-relaxation-time lattice Boltzmann (LB) model is proposed for incompressible multiphase flow systems. In this model, one distribution function is used to solve the Cahn-Hilliard equation and the other is adopted to solve the Navier-Stokes equations. Unlike previous phase-field-based LB models, a proper source term is incorporated in the interfacial evolution equation such that the Cahn-Hilliard equation can be derived exactly and also a pressure distribution is designed to recover the correct hydrodynamic equations. Furthermore, the pressure and velocity fields can be calculated explicitly. A series of numerical tests, including Zalesak's disk rotation, a single vortex, a deformation field, and a static droplet, have been performed to test the accuracy and stability of the present model. The results show that, compared with the previous models, the present model is more stable and achieves an overall improvement in the accuracy of capturing the interface. In addition, compared to the single-relaxation-time LB model, the present model can effectively reduce the spurious velocity and fluctuation of the kinetic energy. Finally, as an application, the Rayleigh-Taylor instability at high Reynolds numbers is investigated.

  11. Pleiotropic analysis of cancer risk loci on esophageal adenocarcinoma risk

    PubMed Central

    Lee, Eunjung; Stram, Daniel O.; Ek, Weronica E.; Onstad, Lynn E; MacGregor, Stuart; Gharahkhani, Puya; Ye, Weimin; Lagergren, Jesper; Shaheen, Nicholas J.; Murray, Liam J.; Hardie, Laura J; Gammon, Marilie D.; Chow, Wong-Ho; Risch, Harvey A.; Corley, Douglas A.; Levine, David M; Whiteman, David C.; Bernstein, Leslie; Bird, Nigel C.; Vaughan, Thomas L.; Wu, Anna H.

    2015-01-01

    Background: Several cancer-associated loci identified from genome-wide association studies (GWAS) have been associated with risks of multiple cancer sites, suggesting pleiotropic effects. We investigated whether GWAS-identified risk variants for other common cancers are associated with risk of esophageal adenocarcinoma (EA) or its precursor, Barrett's esophagus (BE). Methods: We examined the associations between risks of EA and BE and 387 single nucleotide polymorphisms (SNPs) that have been associated with risks of other cancers, by using genotype imputation data on 2,163 control participants and 3,885 (1,501 EA and 2,384 BE) case patients from the Barrett's and Esophageal Adenocarcinoma Genetic Susceptibility Study, and investigated effect modification by smoking history, body mass index (BMI), and reflux/heartburn. Results: After correcting for multiple testing, none of the tested 387 SNPs were statistically significantly associated with risk of EA or BE. No evidence of effect modification by smoking, BMI, or reflux/heartburn was observed. Conclusions: Genetic risk variants for common cancers identified from GWAS appear not to be associated with risks of EA or BE. Impact: To our knowledge, this is the first investigation of pleiotropic genetic associations with risks of EA and BE. PMID:26364162

  12. Comprehensive verticality analysis and web-based rehabilitation system for people with multiple sclerosis with supervised medical monitoring.

    PubMed

    Eguiluz-Perez, Gonzalo; Garcia-Zapirain, Begonya

    2014-01-01

    People with Multiple Sclerosis (MS) need regular physical activity along with medical treatment, whatever their ability or disability level. However, poorly performed exercises could aggravate muscle imbalances and worsen their health. The goal of our work is to create a comprehensive system encompassing face-to-face sessions performed by MS patients one day a week at the medical center and exercises at home the rest of the week through a web platform, in combination with a tracking tool that analyzes the position of patients during exercise and corrects them in real time. The whole system is currently being tested over six months with ten participants: five persons with MS and five professionals working with MS. Two tests, the Functional Independence Measure and the Berg Balance Scale, will act as a barometer for measuring the degree of independence obtained by the people with MS and also the validity of the whole system as a rehabilitation tool. Preliminary results on the usability of the system, measured with the SUS scale (72 and 76 points out of 100 for patients and professionals, respectively), demonstrate that our system is usable for both patients and professionals.

  13. The (in)famous GWAS P-value threshold revisited and updated for low-frequency variants.

    PubMed

    Fadista, João; Manning, Alisa K; Florez, Jose C; Groop, Leif

    2016-08-01

    Genome-wide association studies (GWAS) have long relied on proposed statistical significance thresholds to be able to differentiate true positives from false positives. Although the genome-wide significance P-value threshold of 5 × 10(-8) has become a standard for common-variant GWAS, it has not been updated to cope with the lower allele frequency spectrum used in many recent array-based GWAS studies and sequencing studies. Using a whole-genome- and -exome-sequencing data set of 2875 individuals of European ancestry from the Genetics of Type 2 Diabetes (GoT2D) project and a whole-exome-sequencing data set of 13 000 individuals from five ancestries from the GoT2D and T2D-GENES (Type 2 Diabetes Genetic Exploration by Next-generation sequencing in multi-Ethnic Samples) projects, we describe guidelines for genome- and exome-wide association P-value thresholds needed to correct for multiple testing, explaining the impact of linkage disequilibrium thresholds for distinguishing independent variants, minor allele frequency and ancestry characteristics. We emphasize the advantage of studying recent genetic isolate populations when performing rare and low-frequency genetic association analyses, as the multiple testing burden is diminished due to higher genetic homogeneity.
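
    The logic behind such thresholds is essentially a Bonferroni correction over the effective number of independent tests: the conventional 5 × 10(-8) corresponds to roughly one million independent common variants. The short sketch below reproduces that arithmetic; the alternative test count is illustrative only.

    ```python
    def genomewide_threshold(alpha=0.05, n_independent_tests=1_000_000):
        """Bonferroni-style genome-wide significance threshold."""
        return alpha / n_independent_tests

    # The classical common-variant threshold:
    print(genomewide_threshold())                                # 5e-08
    # A stricter threshold if low-frequency variants add more independent tests (illustrative count):
    print(genomewide_threshold(n_independent_tests=10_000_000))  # 5e-09
    ```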

  14. Curriculum-Based Measurement in Writing: Predicting the Success of High-School Students on State Standards Tests

    ERIC Educational Resources Information Center

    Espin, Christine; Wallace, Teri; Campbell, Heather; Lembke, Erica S.; Long, Jeffrey D.; Ticha, Renata

    2008-01-01

    We examined the technical adequacy of writing progress measures as indicators of success on state standards tests. Tenth-grade students wrote for 10 min, marking their samples at 3, 5, and 7 min. Samples were scored for words written, words spelled correctly, and correct and correct minus incorrect word sequences. The number of correct minus…
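
    A minimal sketch of the scoring metrics named above, assuming a toy word list and ignoring the grammar, capitalization, and punctuation rules that real curriculum-based measurement scoring applies:

        # Hypothetical, simplified scoring of a writing sample.
        DICTIONARY = {"the", "dog", "ran", "fast", "down", "road"}   # toy spelling list

        def score_sample(text):
            words = text.lower().split()
            words_written = len(words)
            words_spelled_correctly = sum(w in DICTIONARY for w in words)
            # Approximate a "correct word sequence" as two adjacent, correctly spelled words.
            cws = sum(a in DICTIONARY and b in DICTIONARY for a, b in zip(words, words[1:]))
            incorrect_ws = max(len(words) - 1, 0) - cws
            return words_written, words_spelled_correctly, cws, cws - incorrect_ws

        print(score_sample("the dog rann fast down the road"))   # -> (7, 6, 4, 2)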

  15. Iterative channel decoding of FEC-based multiple-description codes.

    PubMed

    Chang, Seok-Ho; Cosman, Pamela C; Milstein, Laurence B

    2012-03-01

    Multiple description coding has been receiving attention as a robust transmission framework for multimedia services. This paper studies the iterative decoding of FEC-based multiple description codes. The proposed decoding algorithms take advantage of the error detection capability of Reed-Solomon (RS) erasure codes. The information of correctly decoded RS codewords is exploited to enhance the error correction capability of the Viterbi algorithm at the next iteration of decoding. In the proposed algorithm, an intradescription interleaver is synergistically combined with the iterative decoder. The interleaver does not affect the performance of noniterative decoding but greatly enhances the performance when the system is iteratively decoded. We also address the optimal allocation of RS parity symbols for unequal error protection. For the optimal allocation in iterative decoding, we derive mathematical equations from which the probability distributions of description erasures can be generated in a simple way. The performance of the algorithm is evaluated over an orthogonal frequency-division multiplexing system. The results show that the performance of the multiple description codes is significantly enhanced.
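
    The core erasure-protection idea can be sketched independently of the iterative decoder (toy numbers, not the paper's model): a source layer protected by an (N, k) Reed-Solomon erasure code spread across N descriptions is recoverable exactly when at most N - k descriptions are lost, so layer loss probabilities follow directly from the distribution of description erasures:

        # Illustrative sketch of unequal erasure protection in FEC-based MDC.
        def layer_loss_probability(erasure_pmf, n_descriptions, k):
            """P(more than n_descriptions - k of the N descriptions are erased)."""
            return sum(p for lost, p in enumerate(erasure_pmf) if lost > n_descriptions - k)

        # erasure_pmf[i] = probability that exactly i of the N = 4 descriptions are erased
        erasure_pmf = [0.50, 0.25, 0.15, 0.07, 0.03]   # toy distribution
        for k in (1, 2, 3, 4):
            print(f"k = {k}: loss probability = {layer_loss_probability(erasure_pmf, 4, k):.2f}")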

  16. Cooperative MIMO communication at wireless sensor network: an error correcting code approach.

    PubMed

    Islam, Mohammad Rakibul; Han, Young Shin

    2011-01-01

    Cooperative communication in a wireless sensor network (WSN) explores energy-efficient wireless communication schemes between multiple sensors and a data gathering node (DGN) by exploiting multiple input multiple output (MIMO) and multiple input single output (MISO) configurations. In this paper, an energy-efficient cooperative MIMO (C-MIMO) technique is proposed in which a low density parity check (LDPC) code is used as the error correcting code. The rate of the LDPC code is varied by varying the length of the message and parity bits. Simulation results show that the cooperative communication scheme outperforms the SISO scheme in the presence of the LDPC code. LDPC codes with different code rates are compared using bit error rate (BER) analysis. BER is also analyzed under different Nakagami fading scenarios. Energy efficiencies are compared for different targeted probabilities of bit error pb. It is observed that C-MIMO performs more efficiently when the targeted pb is smaller. Also, a lower encoding rate for the LDPC code offers better error characteristics.

  17. Cooperative MIMO Communication at Wireless Sensor Network: An Error Correcting Code Approach

    PubMed Central

    Islam, Mohammad Rakibul; Han, Young Shin

    2011-01-01

    Cooperative communication in a wireless sensor network (WSN) explores energy-efficient wireless communication schemes between multiple sensors and a data gathering node (DGN) by exploiting multiple input multiple output (MIMO) and multiple input single output (MISO) configurations. In this paper, an energy-efficient cooperative MIMO (C-MIMO) technique is proposed in which a low density parity check (LDPC) code is used as the error correcting code. The rate of the LDPC code is varied by varying the length of the message and parity bits. Simulation results show that the cooperative communication scheme outperforms the SISO scheme in the presence of the LDPC code. LDPC codes with different code rates are compared using bit error rate (BER) analysis. BER is also analyzed under different Nakagami fading scenarios. Energy efficiencies are compared for different targeted probabilities of bit error pb. It is observed that C-MIMO performs more efficiently when the targeted pb is smaller. Also, a lower encoding rate for the LDPC code offers better error characteristics. PMID:22163732
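
    The code-rate trade-off described above reduces to a simple ratio (illustrative block lengths, not the paper's parameters): adding parity bits lowers the LDPC code rate, trading throughput for stronger error correction:

        # Sketch: LDPC code rate as message bits over total transmitted bits.
        def code_rate(message_bits, parity_bits):
            return message_bits / (message_bits + parity_bits)

        for parity in (128, 256, 512):
            print(f"k = 1024, m = {parity}: rate = {code_rate(1024, parity):.3f}")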

  18. Quantifying radionuclide signatures from a γ-γ coincidence system.

    PubMed

    Britton, Richard; Jackson, Mark J; Davies, Ashley V

    2015-11-01

    A method for quantifying gamma coincidence signatures has been developed and tested in conjunction with a high-efficiency multi-detector system to quickly identify trace amounts of radioactive material. The γ-γ system utilises fully digital electronics and list-mode acquisition to time-stamp each event, allowing coincidence matrices to be easily produced alongside typical 'singles' spectra. To quantify the coincidence signatures, a software package has been developed to calculate efficiency- and cascade-summing-corrected branching ratios. It utilises ENSDF records as input and can be fully automated, allowing the user to quickly and easily create or update a coincidence library that contains all possible γ and conversion electron cascades, the associated cascade emission probabilities, and true-coincidence-summing-corrected γ cascade detection probabilities. The library is also fully searchable by energy, nuclide, coincidence pair, γ multiplicity, cascade probability and half-life of the cascade. The calculated probabilities were tested using measurements performed on the γ-γ system and found to provide accurate results for the nuclides investigated. Given the flexibility of the method (it relies only on evaluated nuclear data and accurate efficiency characterisations), the software can now be utilised for a variety of systems, quickly and easily calculating coincidence signature probabilities.
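
    The kind of quantity such a coincidence library stores can be sketched for the simplest case of a two-step cascade (hypothetical efficiencies, not the system's calibration): the probability of registering the coincidence is the cascade emission probability multiplied by the full-energy-peak efficiency for each γ ray:

        # Illustrative two-gamma cascade detection probability.
        def cascade_detection_probability(p_cascade, eff_gamma1, eff_gamma2):
            return p_cascade * eff_gamma1 * eff_gamma2

        # e.g. a 60Co-like cascade with toy per-detector efficiencies of 5%
        print(cascade_detection_probability(p_cascade=1.0, eff_gamma1=0.05, eff_gamma2=0.05))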

  19. Measuring Response Styles Across the Big Five: A Multiscale Extension of an Approach Using Multinomial Processing Trees.

    PubMed

    Khorramdel, Lale; von Davier, Matthias

    2014-01-01

    This study shows how to address the problem of trait-unrelated response styles (RS) in rating scales using multidimensional item response theory. The aim is to test and correct data for RS in order to provide fair assessments of personality. Expanding on an approach presented by Böckenholt (2012), observed rating data are decomposed into multiple response processes based on a multinomial processing tree. The data come from a questionnaire of 50 International Personality Item Pool items measuring the Big Five dimensions, administered with a 5-point rating scale to 2,026 U.S. students. It is shown that this approach can be used to test whether RS exist in the data and that RS can be differentiated from trait-related responses. Although the extreme RS appear to be unidimensional after exclusion of only 1 item, a unidimensional measure for the midpoint RS is obtained only after exclusion of 10 items. Both RS measurements show high cross-scale correlations and item response theory-based (marginal) reliabilities. Cultural differences were found in the tendency to give extreme responses. Moreover, it is shown how to score rating data to correct for RS once they have been shown to exist in the data.
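
    A sketch of the Böckenholt-style decomposition referred to above, using one common coding convention (not necessarily the exact coding used in the study): each 5-point rating is recoded into a midpoint process, a direction (trait) process, and an extremity process, with None marking processes that are never reached:

        # Hypothetical recoding of 5-point ratings into three response processes.
        def decompose(rating):
            midpoint = 1 if rating == 3 else 0
            direction = None if rating == 3 else int(rating > 3)      # 1 = agree side
            extreme = None if rating == 3 else int(rating in (1, 5))  # 1 = extreme category
            return midpoint, direction, extreme

        for r in range(1, 6):
            print(r, decompose(r))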

  20. Some practical turbulence modeling options for Reynolds-averaged full Navier-Stokes calculations of three-dimensional flows

    NASA Technical Reports Server (NTRS)

    Bui, Trong T.

    1993-01-01

    New turbulence modeling options recently implemented in the 3-D version of Proteus, a Reynolds-averaged compressible Navier-Stokes code, are described. The implemented turbulence models include the Baldwin-Lomax algebraic model, the Baldwin-Barth one-equation model, the Chien k-epsilon model, and the Launder-Sharma k-epsilon model. Features of this turbulence modeling package include: well documented and easy to use turbulence modeling options, uniform integration of turbulence models from different classes, automatic initialization of turbulence variables for calculations using one- or two-equation turbulence models, treatment of multiple solid boundaries, and a fully vectorized L-U solver for the one- and two-equation models. Validation test cases include incompressible and compressible flat-plate turbulent boundary layers, turbulent developing S-duct flow, and glancing shock wave/turbulent boundary layer interaction. Good agreement is obtained between the computational results and experimental data. The sensitivity of the compressible turbulent solutions to the method of y+ computation, the turbulent length scale correction, and some compressibility corrections is examined in detail. The test cases show that the highly optimized one- and two-equation turbulence models can be used in routine 3-D Navier-Stokes computations with no significant increase in CPU time compared with the Baldwin-Lomax algebraic model.
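
    The y+ quantity whose computation method is examined above follows the standard wall-scaling definition; a minimal sketch with illustrative values (not the Proteus implementation):

        # y+ = y * u_tau / nu, with friction velocity u_tau = sqrt(tau_wall / rho).
        import math

        def y_plus(wall_distance, tau_wall, rho, nu):
            u_tau = math.sqrt(tau_wall / rho)
            return wall_distance * u_tau / nu

        # e.g. first cell centre 1e-5 m off the wall in air-like conditions
        print(y_plus(wall_distance=1e-5, tau_wall=1.2, rho=1.2, nu=1.5e-5))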
