Sample records for "provide rigorous tests"

  1. Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness

    NASA Technical Reports Server (NTRS)

    Staats, Matt; Whalen, Michael W.; Heimdahl, Mats P. E.; Rajan, Ajitha

    2010-01-01

    In black-box testing, the tester creates a set of tests to exercise a system under test without regard to the internal structure of the system. Generally, no objective metric is used to measure the adequacy of black-box tests. In recent work, we have proposed three requirements coverage metrics, allowing testers to objectively measure the adequacy of a black-box test suite with respect to a set of requirements formalized as Linear Temporal Logic (LTL) properties. In this report, we evaluate the effectiveness of these coverage metrics with respect to fault finding. Specifically, we conduct an empirical study to investigate two questions: (1) do test suites satisfying a requirements coverage metric provide better fault finding than randomly generated test suites of approximately the same size, and (2) do test suites satisfying a more rigorous requirements coverage metric provide better fault finding than test suites satisfying a less rigorous one? Our results indicate (1) that only one of the proposed coverage metrics -- Unique First Cause (UFC) coverage -- is sufficiently rigorous to ensure that satisfying test suites outperform randomly generated test suites of similar size, and (2) that test suites satisfying more rigorous coverage metrics provide better fault finding than test suites satisfying less rigorous ones.
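    To make "requirements formalized as Linear Temporal Logic (LTL) properties" concrete, here is a minimal sketch (an assumed illustration, not the authors' tooling): finite execution traces are checked against "globally" (G) and "eventually" (F) obligations, two basic LTL operators.

```python
# Minimal sketch of LTL-style requirement checks over finite traces.
# All names and the example system below are hypothetical illustrations.

def holds_globally(trace, p):
    """G p: predicate p must hold in every state of the trace."""
    return all(p(s) for s in trace)

def holds_eventually(trace, p):
    """F p: predicate p must hold in at least one state of the trace."""
    return any(p(s) for s in trace)

# Example black-box trace: one (input_command, brake_engaged) pair per step.
trace = [("idle", False), ("brake_cmd", True), ("idle", False)]

# R1 (safety): the brake is never engaged without a brake command.
r1 = holds_globally(trace, lambda s: (not s[1]) or s[0] == "brake_cmd")
# R2 (bounded liveness): the brake engages at some point in the trace.
r2 = holds_eventually(trace, lambda s: s[1])
print(r1, r2)  # True True
```

A black-box test suite exercises such requirements without inspecting the implementation; a coverage metric like UFC then grades how thoroughly the structure of each property was exercised by the suite.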

  2. Rotation and anisotropy of galaxies revisited

    NASA Astrophysics Data System (ADS)

    Binney, James

    2005-11-01

    The use of the tensor virial theorem (TVT) as a diagnostic of anisotropic velocity distributions in galaxies is revisited. The TVT provides a rigorous global link between velocity anisotropy, rotation and shape, but the quantities appearing in it are not easily estimated observationally. Traditionally, use has been made of a centrally averaged velocity dispersion and the peak rotation velocity. Although this procedure cannot be rigorously justified, tests on model galaxies show that it works surprisingly well. With the advent of integral-field spectroscopy it is now possible to establish a rigorous connection between the TVT and observations. The TVT is reformulated in terms of sky-averages, and the new formulation is tested on model galaxies.
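    For reference, the global balance the TVT expresses can be written in its standard textbook form (the generic steady-state version; the paper's sky-averaged reformulation is not reproduced here):

```latex
% Tensor virial theorem for a stellar system in a steady state:
% ordered (streaming) motions T, random motions \Pi, and the
% potential-energy tensor W balance component by component.
2T_{jk} + \Pi_{jk} + W_{jk} = 0, \qquad
T_{jk} = \tfrac{1}{2}\int \rho\,\bar{v}_j\,\bar{v}_k\,\mathrm{d}^3x, \qquad
\Pi_{jk} = \int \rho\,\sigma_{jk}^2\,\mathrm{d}^3x .
% A common global anisotropy measure compares vertical and in-plane
% random motions: \delta = 1 - \Pi_{zz}/\Pi_{xx}.
```

The diagnostic difficulty the abstract describes is that T, Pi and W are volume integrals over the whole galaxy, whereas observations supply line-of-sight, sky-projected averages.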

  3. Real-Time Ada Problem Study

    DTIC Science & Technology

    1989-03-24

    Specified Test Verification Matrix ... Test Generation Assistance ... Maintenance ... lack of intimate knowledge of how the runtime links to the compiler-generated code. Furthermore, the runtime must meet a rigorous set of tests to ensure ... projects, and is not provided. Along with the library, a set of tests should be provided to verify the accuracy of the library after changes have been

  4. Verification of Compartmental Epidemiological Models using Metamorphic Testing, Model Checking and Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramanathan, Arvind; Steed, Chad A; Pullum, Laura L

    Compartmental models in epidemiology are widely used as a means to model disease spread mechanisms and understand how one can best control the disease in case an outbreak of a widespread epidemic occurs. However, a significant challenge within the community is in the development of approaches that can be used to rigorously verify and validate these models. In this paper, we present an approach to rigorously examine and verify the behavioral properties of compartmental epidemiological models under several common modeling scenarios including birth/death rates and multi-host/pathogen species. Using metamorphic testing, a novel visualization tool and model checking, we build a workflow that provides insights into the functionality of compartmental epidemiological models. Our initial results indicate that metamorphic testing can be used to verify the implementation of these models and provide insights into special conditions where these mathematical models may fail. The visualization front-end allows the end-user to scan through a variety of parameters commonly used in these models to elucidate the conditions under which an epidemic can occur. Further, specifying these models using a process algebra allows one to automatically construct behavioral properties that can be rigorously verified using model checking. Taken together, our approach allows for detecting implementation errors as well as handling conditions under which compartmental epidemiological models may fail to provide insights into disease spread dynamics.

  5. Increased scientific rigor will improve reliability of research and effectiveness of management

    USGS Publications Warehouse

    Sells, Sarah N.; Bassing, Sarah B.; Barker, Kristin J.; Forshee, Shannon C.; Keever, Allison; Goerz, James W.; Mitchell, Michael S.

    2018-01-01

    Rigorous science that produces reliable knowledge is critical to wildlife management because it increases accurate understanding of the natural world and informs management decisions effectively. Application of a rigorous scientific method based on hypothesis testing minimizes unreliable knowledge produced by research. To evaluate the prevalence of scientific rigor in wildlife research, we examined 24 issues of the Journal of Wildlife Management from August 2013 through July 2016. We found 43.9% of studies did not state or imply a priori hypotheses, which are necessary to produce reliable knowledge. We posit that this is due, at least in part, to a lack of common understanding of what rigorous science entails, how it produces more reliable knowledge than other forms of interpreting observations, and how research should be designed to maximize inferential strength and usefulness of application. Current primary literature does not provide succinct explanations of the logic behind a rigorous scientific method or readily applicable guidance for employing it, particularly in wildlife biology; we therefore synthesized an overview of the history, philosophy, and logic that define scientific rigor for biological studies. A rigorous scientific method includes 1) generating a research question from theory and prior observations, 2) developing hypotheses (i.e., plausible biological answers to the question), 3) formulating predictions (i.e., facts that must be true if the hypothesis is true), 4) designing and implementing research to collect data potentially consistent with predictions, 5) evaluating whether predictions are consistent with collected data, and 6) drawing inferences based on the evaluation. Explicitly testing a priori hypotheses reduces overall uncertainty by reducing the number of plausible biological explanations to only those that are logically well supported. 
Such research also draws inferences that are robust to idiosyncratic observations and unavoidable human biases. Offering only post hoc interpretations of statistical patterns (i.e., a posteriori hypotheses) adds to uncertainty because it increases the number of plausible biological explanations without determining which have the greatest support. Further, post hoc interpretations are strongly subject to human biases. Testing hypotheses maximizes the credibility of research findings, makes the strongest contributions to theory and management, and improves reproducibility of research. Management decisions based on rigorous research are most likely to result in effective conservation of wildlife resources.

  6. A rigorous approach to facilitate and guarantee the correctness of the genetic testing management in human genome information systems.

    PubMed

    Araújo, Luciano V; Malkowski, Simon; Braghetto, Kelly R; Passos-Bueno, Maria R; Zatz, Mayana; Pu, Calton; Ferreira, João E

    2011-12-22

    Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control-flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability, achieved through the relational database implementation, and 2) correctness of processes, using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge of business process notation or process algebra. This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end-user interfaces.
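    The role of process algebra in such a workflow system can be illustrated with a toy sketch (an assumed example, not the CEGH implementation): test plans are built from atomic laboratory actions using ACP-style sequential composition and alternative choice, and a trace semantics enumerates the orderings a plan allows.

```python
# Toy ACP-style process terms (assumed illustration, not CEGH code):
# an atomic action is a string; ("seq", p, q) is sequential composition
# (p . q); ("alt", p, q) is alternative choice (p + q). The trace
# semantics below enumerates every action sequence the process allows.

def traces(proc):
    if isinstance(proc, str):    # atomic action
        return [[proc]]
    op, left, right = proc
    if op == "seq":              # p . q: every left trace followed by every right trace
        return [l + r for l in traces(left) for r in traces(right)]
    if op == "alt":              # p + q: either branch
        return traces(left) + traces(right)
    raise ValueError(op)

# Hypothetical plan: extract DNA, then either PCR-and-sequence or an array test.
plan = ("seq", "extract_dna", ("alt", ("seq", "pcr", "sequence"), "array_test"))
for t in traces(plan):
    print(" -> ".join(t))
# extract_dna -> pcr -> sequence
# extract_dna -> array_test
```

A workflow engine can then restrict execution to exactly these traces, which is what makes the control flow verifiable rather than ad hoc.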

  7. A rigorous approach to facilitate and guarantee the correctness of the genetic testing management in human genome information systems

    PubMed Central

    2011-01-01

    Background Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control-flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability, achieved through the relational database implementation, and 2) correctness of processes, using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge of business process notation or process algebra. Conclusions This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end-user interfaces. PMID:22369688

  8. A Case Study to Explore Rigorous Teaching and Testing Practices to Narrow the Achievement Gap

    ERIC Educational Resources Information Center

    Isler, Tesha

    2012-01-01

    The problem examined in this study: Does the majority of teachers use rigorous teaching and testing practices? The purpose of this qualitative exploratory case study was to explore the classroom techniques of six effective teachers who use rigorous teaching and testing practices. The hypothesis for this study is that the examination of the…

  9. A Prospective Test of Cognitive Vulnerability Models of Depression with Adolescent Girls

    ERIC Educational Resources Information Center

    Bohon, Cara; Stice, Eric; Burton, Emily; Fudell, Molly; Nolen-Hoeksema, Susan

    2008-01-01

    This study sought to provide a more rigorous prospective test of two cognitive vulnerability models of depression with longitudinal data from 496 adolescent girls. Results supported the cognitive vulnerability model in that stressors predicted future increases in depressive symptoms and onset of clinically significant major depression for…

  10. Testing Adaptive Toolbox Models: A Bayesian Hierarchical Approach

    ERIC Educational Resources Information Center

    Scheibehenne, Benjamin; Rieskamp, Jorg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox…

  11. ASTM Committee C28: International Standards for Properties and Performance of Advanced Ceramics, Three Decades of High-quality, Technically-rigorous Normalization

    NASA Technical Reports Server (NTRS)

    Jenkins, Michael G.; Salem, Jonathan A.

    2016-01-01

    Physical and mechanical properties and performance of advanced ceramics and glasses are difficult to measure correctly without the proper techniques. For over three decades, ASTM Committee C28 on Advanced Ceramics has developed high-quality, rigorous, full-consensus standards (e.g., test methods, practices, guides, terminology) to measure properties and performance of monolithic and composite ceramics; in some cases these may also be applied to glasses. These standards detail testing particulars for many mechanical, physical, and thermal properties and for the performance of these materials. As a result, these standards provide accurate, reliable, repeatable and complete data. Within Committee C28, users, producers, researchers, designers, academicians, etc. have written, continually updated, and validated through round-robin test programs nearly 50 standards since the Committee's founding in 1986. This paper provides a retrospective review of the 30 years of ASTM Committee C28, including a graphical pictogram listing of C28 standards along with examples of the tangible benefits of advanced ceramics standards to demonstrate their practical applications.

  12. Memory Hazard Functions: A Vehicle for Theory Development and Test

    ERIC Educational Resources Information Center

    Chechile, Richard A.

    2006-01-01

    A framework is developed to rigorously test an entire class of memory retention functions by examining hazard properties. Evidence is provided that the memory hazard function is not monotonically decreasing. Yet most of the proposals for retention functions, which have emerged from the psychological literature, imply that memory hazard is…

  13. Louis Guttman's Contributions to Classical Test Theory

    ERIC Educational Resources Information Center

    Zimmerman, Donald W.; Williams, Richard H.; Zumbo, Bruno D.; Ross, Donald

    2005-01-01

    This article focuses on Louis Guttman's contributions to the classical theory of educational and psychological tests, one of the lesser known of his many contributions to quantitative methods in the social sciences. Guttman's work in this field provided a rigorous mathematical basis for ideas that, for many decades after Spearman's initial work,…

  14. A Rigorous Test of the Fit of the Circumplex Model to Big Five Personality Data: Theoretical and Methodological Issues and Two Large Sample Empirical Tests.

    PubMed

    DeGeest, David Scott; Schmidt, Frank

    2015-01-01

    Our objective was to apply the rigorous test developed by Browne (1992) to determine whether the circumplex model fits Big Five personality data. This test has yet to be applied to personality data. Another objective was to determine whether blended items explained correlations among the Big Five traits. We used two working adult samples, the Eugene-Springfield Community Sample and the Professional Worker Career Experience Survey. Fit to the circumplex was tested via Browne's (1992) procedure. Circumplexes were graphed to identify items with loadings on multiple traits (blended items), and to determine whether removing these items changed five-factor model (FFM) trait intercorrelations. In both samples, the circumplex structure fit the FFM traits well. Each sample had items with dual-factor loadings (8 items in the first sample, 21 in the second). Removing blended items had little effect on construct-level intercorrelations among FFM traits. We conclude that rigorous tests show that the fit of personality data to the circumplex model is good. This finding means the circumplex model is competitive with the factor model in understanding the organization of personality traits. The circumplex structure also provides a theoretically and empirically sound rationale for evaluating intercorrelations among FFM traits. Even after eliminating blended items, FFM personality traits remained correlated.

  15. Using the Inquiry Process to Motivate and Engage All (Including Struggling) Readers

    ERIC Educational Resources Information Center

    Savitz, Rachelle S.; Wallace, Kelly

    2016-01-01

    With increasingly rigorous standards and mounting high stakes testing, it seems harder than ever to motivate and engage struggling readers. In this article the authors provide an overview of the inquiry learning process, which details how providing students with choice and opportunities to collaborate with peers can keep students invested in their…

  16. Carrying capacity in a heterogeneous environment with habitat connectivity.

    PubMed

    Zhang, Bo; Kula, Alex; Mack, Keenan M L; Zhai, Lu; Ryce, Arrix L; Ni, Wei-Ming; DeAngelis, Donald L; Van Dyken, J David

    2017-09-01

    A large body of theory predicts that populations diffusing in heterogeneous environments reach higher total size than if non-diffusing, and, paradoxically, higher size than in a corresponding homogeneous environment. However, this theory and its assumptions have not been rigorously tested. Here, we extended previous theory to include exploitable resources, proving qualitatively novel results, which we tested experimentally using spatially diffusing laboratory populations of yeast. Consistent with previous theory, we predicted and experimentally observed that spatial diffusion increased total equilibrium population abundance in heterogeneous environments, with the effect size depending on the relationship between r and K. Refuting previous theory, however, we discovered that homogeneously distributed resources support higher total carrying capacity than heterogeneously distributed resources, even with species diffusion. Our results provide rigorous experimental tests of new and old theory, demonstrating how the traditional notion of carrying capacity is ambiguous for populations diffusing in spatially heterogeneous environments. © 2017 John Wiley & Sons Ltd/CNRS.
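    The two-patch caricature of this setup can be sketched numerically (an assumed toy model, not the authors' yeast experiments): logistic patches with different carrying capacities coupled by diffusion, here in the regime where growth rate scales with K, one instance of the r-K relationship the abstract highlights.

```python
# Toy two-patch model (assumed sketch, not the paper's experiments):
# logistic growth with per-patch r_i, K_i, coupled by diffusion at rate D.
# With r proportional to K, theory predicts that diffusion raises total
# equilibrium abundance above K1 + K2 ("the paradox").

def total_equilibrium(r, K, D, steps=100000, dt=0.01):
    n = [0.1, 0.1]  # small initial populations in both patches
    for _ in range(steps):
        d0 = r[0] * n[0] * (1 - n[0] / K[0]) + D * (n[1] - n[0])
        d1 = r[1] * n[1] * (1 - n[1] / K[1]) + D * (n[0] - n[1])
        n = [n[0] + dt * d0, n[1] + dt * d1]
    return sum(n)

r, K = [1.0, 3.0], [1.0, 3.0]
print(total_equilibrium(r, K, D=0.0))  # ~4.0: each patch settles at K_i
print(total_equilibrium(r, K, D=0.5))  # > 4.0 in this r-K regime
```

Changing the r-K relationship (e.g., equal r in both patches) can reverse the direction of the effect, which is the ambiguity in "carrying capacity" the abstract points to.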

  17. Carrying capacity in a heterogeneous environment with habitat connectivity

    USGS Publications Warehouse

    Zhang, Bo; Kula, Alex; Mack, Keenan M.L.; Zhai, Lu; Ryce, Arrix L.; Ni, Wei-Ming; DeAngelis, Donald L.; Van Dyken, J. David

    2017-01-01

    A large body of theory predicts that populations diffusing in heterogeneous environments reach higher total size than if non-diffusing, and, paradoxically, higher size than in a corresponding homogeneous environment. However, this theory and its assumptions have not been rigorously tested. Here, we extended previous theory to include exploitable resources, proving qualitatively novel results, which we tested experimentally using spatially diffusing laboratory populations of yeast. Consistent with previous theory, we predicted and experimentally observed that spatial diffusion increased total equilibrium population abundance in heterogeneous environments, with the effect size depending on the relationship between r and K. Refuting previous theory, however, we discovered that homogeneously distributed resources support higher total carrying capacity than heterogeneously distributed resources, even with species diffusion. Our results provide rigorous experimental tests of new and old theory, demonstrating how the traditional notion of carrying capacity is ambiguous for populations diffusing in spatially heterogeneous environments.

  18. A Randomized Study of How Physicians Interpret Research Funding Disclosures

    PubMed Central

    Kesselheim, Aaron S.; Robertson, Christopher T.; Myers, Jessica A.; Rose, Susannah L.; Gillet, Victoria; Ross, Kathryn M.; Glynn, Robert J.; Joffe, Steven; Avorn, Jerry

    2012-01-01

    BACKGROUND The effects of clinical-trial funding on the interpretation of trial results are poorly understood. We examined how such support affects physicians’ reactions to trials with a high, medium, or low level of methodologic rigor. METHODS We presented 503 board-certified internists with abstracts that we designed describing clinical trials of three hypothetical drugs. The trials had high, medium, or low methodologic rigor, and each report included one of three support disclosures: funding from a pharmaceutical company, NIH funding, or none. For both factors studied (rigor and funding), one of the three possible variations was randomly selected for inclusion in the abstracts. Follow-up questions assessed the physicians’ impressions of the trials’ rigor, their confidence in the results, and their willingness to prescribe the drugs. RESULTS The 269 respondents (53.5% response rate) perceived the level of study rigor accurately. Physicians reported that they would be less willing to prescribe drugs tested in low-rigor trials than those tested in medium-rigor trials (odds ratio, 0.64; 95% confidence interval [CI], 0.46 to 0.89; P = 0.008) and would be more willing to prescribe drugs tested in high-rigor trials than those tested in medium-rigor trials (odds ratio, 3.07; 95% CI, 2.18 to 4.32; P<0.001). Disclosure of industry funding, as compared with no disclosure of funding, led physicians to downgrade the rigor of a trial (odds ratio, 0.63; 95% CI, 0.46 to 0.87; P = 0.006), their confidence in the results (odds ratio, 0.71; 95% CI, 0.51 to 0.98; P = 0.04), and their willingness to prescribe the hypothetical drugs (odds ratio, 0.68; 95% CI, 0.49 to 0.94; P = 0.02). Physicians were half as willing to prescribe drugs studied in industry-funded trials as they were to prescribe drugs studied in NIH-funded trials (odds ratio, 0.52; 95% CI, 0.37 to 0.71; P<0.001). These effects were consistent across all levels of methodologic rigor. 
CONCLUSIONS Physicians discriminate among trials of varying degrees of rigor, but industry sponsorship negatively influences their perception of methodologic quality and reduces their willingness to believe and act on trial findings, independently of the trial’s quality. These effects may influence the translation of clinical research into practice. PMID:22992075

  19. Program Manager: Journal of the Defense Systems Management College. Volume 16, Number 3, May-June 1987

    DTIC Science & Technology

    1987-06-01

    ...redress a growing strategic imbalance and provide an en... capability to penetrate Soviet... test pilots conducted a rigorous flight test during... program...ment career path for rated officers (pilots and navigators) is different from... vidual rotates through assignments in engineering, test and evaluation... Acquiring the B-1B, or any other weapon system for that matter, entails developing, testing and producing new technology. In any high-tech en...

  20. Parent Management Training-Oregon Model: Adapting Intervention with Rigorous Research.

    PubMed

    Forgatch, Marion S; Kjøbli, John

    2016-09-01

    Parent Management Training-Oregon Model (PMTO®) is a set of theory-based parenting programs with status as evidence-based treatments. PMTO has been rigorously tested in efficacy and effectiveness trials in different contexts, cultures, and formats. Parents, the presumed agents of change, learn core parenting practices, specifically skill encouragement, limit setting, monitoring/supervision, interpersonal problem solving, and positive involvement. The intervention effectively prevents and ameliorates children's behavior problems by replacing coercive interactions with positive parenting practices. Delivery format includes sessions with individual families in agencies or families' homes, parent groups, and web-based and telehealth communication. Mediational models have tested parenting practices as mechanisms of change for children's behavior and found support for the theory underlying PMTO programs. Moderating effects include children's age, maternal depression, and social disadvantage. The Norwegian PMTO implementation is presented as an example of how PMTO has been tailored to reach diverse populations as delivered by multiple systems of care throughout the nation. An implementation and research center in Oslo provides infrastructure and promotes collaboration between practitioners and researchers to conduct rigorous intervention research. Although evidence-based and tested within a wide array of contexts and populations, PMTO must continue to adapt to an ever-changing world. © 2016 Family Process Institute.

  1. Statistically rigorous calculations do not support common input and long-term synchronization of motor-unit firings

    PubMed Central

    Kline, Joshua C.

    2014-01-01

    Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles—a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to use sufficient statistical tests necessary to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization from 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization. PMID:25210152
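    The statistical point about over-detection can be illustrated with a toy demonstration (an assumed sketch, not the SigMax code): cross-correlating two independent Poisson firing trains and applying a per-latency z-score threshold flags some latency bins by chance alone, roughly at the nominal 5% rate.

```python
# Toy demonstration (assumed illustration, not SigMax): a per-latency
# z-score test applied to two *independent* Poisson "motor-unit" trains
# still flags some bins as "synchronized" purely by chance.
import random

random.seed(1)

def poisson_train(rate, duration):
    """Firing times of a homogeneous Poisson process (events/second)."""
    t, times = 0.0, []
    while True:
        t += random.expovariate(rate)
        if t > duration:
            return times
        times.append(t)

a = poisson_train(10.0, 100.0)  # ~1000 firings per train
b = poisson_train(10.0, 100.0)

# Cross-correlation histogram of firing-time differences within +/-100 ms.
bin_w, half = 0.005, 0.1        # 5 ms bins over +/-100 ms -> 40 bins
nbins = int(2 * half / bin_w)
counts = [0] * nbins
for ta in a:
    for tb in b:
        d = tb - ta
        if -half <= d < half:
            counts[int((d + half) / bin_w)] += 1

mean = sum(counts) / nbins
sd = (sum((c - mean) ** 2 for c in counts) / nbins) ** 0.5
false_hits = sum(abs((c - mean) / sd) > 1.96 for c in counts)
print(false_hits)  # chance-level "significant" latencies despite independence
```

Controlling for this multiplicity, as a more rigorous test must, is precisely what separates chance coincidences from physiologically driven synchronization.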

  2. The MINERVA Software Development Process

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.

    2017-01-01

    This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.
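    The core idea of numerically evaluating an implementation against a verified specification can be sketched as follows (an assumed example, not NASA's MINERVA artifacts): for point-in-convex-polygon containment, a transparent "specification" using same-side edge tests is cross-checked against an even-odd ray-casting "implementation" on many random stress-test points.

```python
# Sketch of spec-vs-implementation numerical cross-checking (assumed
# example, not MINERVA code). The spec decides containment in a convex
# CCW polygon via same-side edge tests; the implementation under test
# uses even-odd ray casting. They must agree on every sampled point
# (boundary points occur with probability zero here).
import random

square = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]  # convex, CCW

def spec_contains(poly, p):
    """Spec: p is inside a CCW convex polygon iff it lies to the left
    of (or on) every directed edge."""
    n = len(poly)
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        if (x2 - x1) * (p[1] - y1) - (y2 - y1) * (p[0] - x1) < 0:
            return False
    return True

def impl_contains(poly, p):
    """Implementation under test: even-odd ray casting."""
    inside, n = False, len(poly)
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        if (y1 > p[1]) != (y2 > p[1]):
            x_cross = x1 + (p[1] - y1) * (x2 - x1) / (y2 - y1)
            if p[0] < x_cross:
                inside = not inside
    return inside

random.seed(0)
mismatches = 0
for _ in range(10000):
    p = (random.uniform(-1, 5), random.uniform(-1, 5))
    if spec_contains(square, p) != impl_contains(square, p):
        mismatches += 1
print(mismatches)  # 0
```

Any nonzero mismatch count would localize a disagreement between the "verified" description and the deployed code, which is the failure mode this style of process is designed to catch.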

  3. Bioinformatic genome comparisons for taxonomic and phylogenic assignments using Aeromonas as a test case

    USDA-ARS?s Scientific Manuscript database

    Prokaryotic taxonomy is the underpinning of microbiology, providing a framework for the proper identification and naming of organisms. The 'gold standard' of bacterial species delineation is the overall genome similarity as determined by DNA-DNA hybridization (DDH), a technically rigorous yet someti...

  4. 50 CFR 18.118 - What are the mitigation, monitoring, and reporting requirements?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... monitoring and research efforts will employ rigorous study designs and sampling protocols in order to provide... mitigation measures for offshore seismic surveys. Any offshore exploration activity expected to include the... 1 µPa. (ii) Ramp-up procedures. For all seismic surveys, including airgun testing, use the following...

  5. 50 CFR 18.118 - What are the mitigation, monitoring, and reporting requirements?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... monitoring and research efforts will employ rigorous study designs and sampling protocols in order to provide... mitigation measures for offshore seismic surveys. Any offshore exploration activity expected to include the... 1 µPa. (ii) Ramp-up procedures. For all seismic surveys, including airgun testing, use the following...

  6. Rigor Made Easy: Getting Started

    ERIC Educational Resources Information Center

    Blackburn, Barbara R.

    2012-01-01

    Bestselling author and noted rigor expert Barbara Blackburn shares the secrets to getting started, maintaining momentum, and reaching your goals. Learn what rigor looks like in the classroom, understand what it means for your students, and get the keys to successful implementation. Learn how to use rigor to raise expectations, provide appropriate…

  7. SSE software test management STM capability: Using STM in the Ground Systems Development Environment (GSDE)

    NASA Technical Reports Server (NTRS)

    Church, Victor E.; Long, D.; Hartenstein, Ray; Perez-Davila, Alfredo

    1992-01-01

    This report is one of a series discussing configuration management (CM) topics for Space Station ground systems software development. It provides a description of the Software Support Environment (SSE)-developed Software Test Management (STM) capability, and discusses the possible use of this capability for management of developed software during testing performed on target platforms. It is intended to supplement the formal documentation of STM provided by the SSE Project. The report also describes how STM can be used to integrate contractor CM and formal CM for software before delivery to operations. STM provides a level of control that is flexible enough to support integration and debugging, but sufficiently rigorous to ensure the integrity of the testing process.

  8. Illustrating idiographic methods for translation research: moderation effects, natural clinical experiments, and complex treatment-by-subgroup interactions.

    PubMed

    Ridenour, Ty A; Wittenborn, Andrea K; Raiff, Bethany R; Benedict, Neal; Kane-Gill, Sandra

    2016-03-01

A critical juncture in translation research involves the preliminary studies of intervention tools, provider training programs, policies, and other mechanisms used to leverage knowledge garnered at one translation stage into another stage. Potentially useful for such studies are rigorous techniques for conducting within-subject clinical trials, which have advanced incrementally over the last decade. However, these methods have largely not been utilized within prevention or translation contexts. The purpose of this manuscript is to demonstrate the flexibility, wide applicability, and rigor of idiographic clinical trials for preliminary testing of intervention mechanisms. Specifically demonstrated are novel uses of state-space modeling for testing intervention mechanisms of short-term outcomes, identifying heterogeneity in and moderation of within-person treatment mechanisms, a horizontal line plot to refine sampling design during the course of a clinic-based experimental study, and the need to test a treatment's efficacy as treatment is administered, in addition to distal outcomes (e.g., traditional 12-month outcomes).

  9. Boston Naming Test Discontinuation Rule: "Rigorous" versus "Lenient" Interpretations.

    ERIC Educational Resources Information Center

    Ferman, Tanis J.; Ivnik, Robert J.; Lucas, John A.

    1998-01-01

    Two interpretations of the Boston Naming Test (BNT) (E. Kaplan, H. Goodglass, and S. Weintraub, 1983) discontinuation rule of six consecutive failures were found in BNT use with 655 normal older adults and 140 people with Alzheimer's disease. Differences between lenient and rigorous interpretations of responses could have impact on…

  10. LIHE Spectral Dynamics and Jaguar Data Acquisition System Measurement Assurance Results 2014.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Covert, Timothy T.; Willis, Michael David; Radtke, Gregg Arthur

    2015-06-01

The Light Initiated High Explosive (LIHE) facility performs high rigor, high consequence impulse testing for the nuclear weapons (NW) community. To support the facility mission, LIHE's extensive data acquisition system (DAS) comprises several discrete components as well as a fully integrated system. Due to the high consequence and high rigor of the testing performed at LIHE, a measurement assurance plan (MAP) was developed in collaboration with NW system customers to meet their data quality needs and to provide assurance of the robustness of the LIHE DAS. While individual components of the DAS have been calibrated by the SNL Primary Standards Laboratory (PSL), the integrated nature of this complex system requires verification of the complete system, from end-to-end. This MAP report documents the results of verification and validation procedures used to ensure that the data quality meets customer requirements.

  11. Statistically rigorous calculations do not support common input and long-term synchronization of motor-unit firings.

    PubMed

    De Luca, Carlo J; Kline, Joshua C

    2014-12-01

    Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles--a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to use sufficient statistical tests necessary to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization from 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization. Copyright © 2014 the American Physiological Society.
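The false-detection problem the authors attribute to the z-score method can be illustrated with a toy simulation (this is not SigMax and not their data; the train lengths, 2 ms bin width, and ±100 ms window are hypothetical choices): even two statistically independent firing trains produce latency bins whose cross-correlogram counts clear a naive z threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two statistically independent firing trains (~10 firings/s over 100 s)
t1 = np.sort(rng.uniform(0.0, 100.0, 1000))
t2 = np.sort(rng.uniform(0.0, 100.0, 1000))

# Cross-correlogram of firing-time differences within +/-100 ms, in 2 ms bins
diffs = (t2[None, :] - t1[:, None]).ravel()
diffs = diffs[np.abs(diffs) <= 0.1]
counts, _ = np.histogram(diffs, bins=100, range=(-0.1, 0.1))

# Naive z-score test: flag any latency bin that deviates from the mean bin count
z = (counts - counts.mean()) / counts.std()
flagged = int(np.sum(np.abs(z) > 1.96))
print(flagged, "bins flagged despite independence")
```

With 100 bins tested at the 5% level, around five bins are expected to exceed the threshold by chance alone, which is exactly the multiple-comparisons pitfall a statistically rigorous method must correct for.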

  12. Validity of High-School Grades in Predicting Student Success beyond the Freshman Year: High-School Record vs. Standardized Tests as Indicators of Four-Year College Outcomes. Research & Occasional Paper Series: CSHE.6.07

    ERIC Educational Resources Information Center

    Geiser, Saul; Santelices, Maria Veronica

    2007-01-01

    High-school grades are often viewed as an unreliable criterion for college admissions, owing to differences in grading standards across high schools, while standardized tests are seen as methodologically rigorous, providing a more uniform and valid yardstick for assessing student ability and achievement. The present study challenges that…

  13. The Rigor Mortis of Education: Rigor Is Required in a Dying Educational System

    ERIC Educational Resources Information Center

    Mixon, Jason; Stuart, Jerry

    2009-01-01

    In an effort to answer the "Educational Call to Arms", our national public schools have turned to Advanced Placement (AP) courses as the predominate vehicle used to address the lack of academic rigor in our public high schools. Advanced Placement is believed by many to provide students with the rigor and work ethic necessary to…

  14. Dividing by Zero: Exploring Null Results in a Mathematics Professional Development Program

    ERIC Educational Resources Information Center

    Hill, Heather C.; Corey, Douglas Lyman; Jacob, Robin T.

    2018-01-01

    Background/Context: Since 2002, U.S. federal funding for educational research has favored the development and rigorous testing of interventions designed to improve student outcomes. However, recent reviews suggest that a large fraction of the programs developed and rigorously tested in the past decade have shown null results on student outcomes…

  15. Accuracy and performance of 3D mask models in optical projection lithography

    NASA Astrophysics Data System (ADS)

    Agudelo, Viviana; Evanschitzky, Peter; Erdmann, Andreas; Fühner, Tim; Shao, Feng; Limmer, Steffen; Fey, Dietmar

    2011-04-01

    Different mask models have been compared: rigorous electromagnetic field (EMF) modeling, rigorous EMF modeling with decomposition techniques and the thin mask approach (Kirchhoff approach) to simulate optical diffraction from different mask patterns in projection systems for lithography. In addition, each rigorous model was tested for two different formulations for partially coherent imaging: The Hopkins assumption and rigorous simulation of mask diffraction orders for multiple illumination angles. The aim of this work is to closely approximate results of the rigorous EMF method by the thin mask model enhanced with pupil filtering techniques. The validity of this approach for different feature sizes, shapes and illumination conditions is investigated.

  16. Mathematical Rigor vs. Conceptual Change: Some Early Results

    NASA Astrophysics Data System (ADS)

    Alexander, W. R.

    2003-05-01

Results from two different pedagogical approaches to teaching introductory astronomy at the college level will be presented. The first of these approaches is a descriptive, conceptually based approach that emphasizes conceptual change. This descriptive class is typically an elective for non-science majors. The other approach is a mathematically rigorous treatment that emphasizes problem solving and is designed to prepare students for further study in astronomy. The mathematically rigorous class is typically taken by science majors. It also fulfills an elective science requirement for these science majors. The Astronomy Diagnostic Test version 2 (ADT 2.0) was used as an assessment instrument since its validity and reliability have been investigated by previous researchers. The ADT 2.0 was administered as both a pre-test and post-test to both groups. Initial results show no significant difference between the two groups in the post-test. However, there is a slightly greater improvement for the descriptive class between pre- and post-testing compared to the mathematically rigorous course. Great care was taken to control for variables, including selection of text, class format, and instructor differences. Results indicate that the mathematically rigorous model does not improve conceptual understanding any more than the conceptual change model. Additional results indicate that there is a similar gender bias in favor of males that has been measured by previous investigators. This research has been funded by the College of Science and Mathematics at James Madison University.
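Pre/post comparisons like this one are commonly summarized with Hake's normalized gain, g = (post − pre)/(max − pre), the fraction of the possible improvement a class actually achieved. A minimal sketch with hypothetical class means (the scores below are invented for illustration, not the study's data):

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: fraction of the possible improvement achieved."""
    return (post - pre) / (max_score - pre)

# Hypothetical mean percent-correct scores, chosen only to illustrate the comparison
print(normalized_gain(30.0, 51.0))  # descriptive course: 0.3
print(normalized_gain(50.0, 64.0))  # mathematically rigorous course: 0.28
```

Because the gain is normalized by room-to-improve, it lets classes with different pre-test baselines be compared on one scale.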

  17. Structural Testing at the NWTC Helps Improve Blade Design and Increase System Reliability; NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2015-08-01

    Since 1990, the National Renewable Energy Laboratory’s (NREL's) National Wind Technology Center (NWTC) has tested more than 150 wind turbine blades. NWTC researchers can test full-scale and subcomponent articles, conduct data analyses, and provide engineering expertise on best design practices. Structural testing of wind turbine blades enables designers, manufacturers, and owners to validate designs and assess structural performance to specific load conditions. Rigorous structural testing can reveal design and manufacturing problems at an early stage of development that can lead to overall improvements in design and increase system reliability.

  18. NSSEFF Designing New Higher Temperature Superconductors

    DTIC Science & Technology

    2017-04-13

Electronic structure calculations are integrated with the synthesis of new superconducting materials, with the aim of providing a rigorous test of the apparent association of high temperature superconductivity with electron delocalization transitions occurring at quantum critical points. We will use realistic electronic structure calculations to assess which transition metal monopnictides are closest to electron delocalization, and hence optimal for...

  19. The Evaluation of the National Long Term Care Demonstration: Final Report. Executive Summary.

    ERIC Educational Resources Information Center

    Mathematica Policy Research, Inc., Plainsboro, NJ.

    This report describes the evaluation of the National Long-Term Care (Channeling) Demonstration, a rigorous test of comprehensive case management of community care as a way of containing long-term care costs for the impaired elderly while providing adequate care to those in need. The evaluation process is presented as an experimental design with…

  20. South Carolina Word List, Grades 1-12. Basic Skills Assessment Program.

    ERIC Educational Resources Information Center

    Instructional Objectives Exchange, Los Angeles, CA.

Designed as a resource for reading teachers who are attempting to enhance their students' fundamental reading skills and to permit the more rigorous determination of readability levels for both instructional materials and testing devices, this word list provides a grade-by-grade set of key words students need to master for grades 1 through 12. The…

  1. Conformance Testing: Measurement Decision Rules

    NASA Technical Reports Server (NTRS)

    Mimbs, Scott M.

    2010-01-01

The goal of a Quality Management System (QMS) as specified in ISO 9001 and AS9100 is to provide assurance to the customer that end products meet specifications. Measuring devices, often called measuring and test equipment (MTE), are used to provide the evidence of product conformity to specified requirements. Unfortunately, processes that employ MTE can become a weak link to the overall QMS if proper attention is not given to the measurement process design, capability, and implementation. Documented "decision rules" establish the requirements to ensure measurement processes provide the measurement data that supports the needs of the QMS. Measurement data are used to make the decisions that impact all areas of technology. Whether measurements support research, design, production, or maintenance, ensuring the data supports the decision is crucial. Measurement data quality can be critical to the resulting consequences of measurement-based decisions. Historically, most industries required simplistic, one-size-fits-all decision rules for measurements. One-size-fits-all rules in some cases are not rigorous enough to provide adequate measurement results, while in other cases are overly conservative and too costly to implement. Ideally, decision rules should be rigorous enough to match the criticality of the parameter being measured, while being flexible enough to be cost effective. The goal of a decision rule is to ensure that measurement processes provide data with a sufficient level of quality to support the decisions being made - no more, no less. This paper discusses the basic concepts of providing measurement-based evidence that end products meet specifications. Although relevant to all measurement-based conformance tests, the target audience is the MTE end-user, which is anyone using MTE other than calibration service providers. Topics include measurement fundamentals, the associated decision risks, verifying conformance to specifications, and basic measurement decision rules.
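One familiar decision rule of the kind discussed here is a simple guard band: the acceptance limits are the specification limits pulled inward by the expanded measurement uncertainty U, so a reading is accepted only when the true value is unlikely to lie outside specification. The rule and the numbers below are illustrative assumptions, not taken from the paper:

```python
def accept(measured, spec_lower, spec_upper, uncertainty):
    """Guard-banded acceptance: shrink the specification zone by the
    expanded measurement uncertainty U before comparing the reading."""
    return (spec_lower + uncertainty) <= measured <= (spec_upper - uncertainty)

# A 10 V reference measured with U = 0.05 V against a 9.9-10.1 V specification
print(accept(10.02, 9.9, 10.1, 0.05))  # comfortably inside guard-banded limits -> True
print(accept(10.07, 9.9, 10.1, 0.05))  # inside spec but too close to the limit -> False
```

Widening or narrowing the guard band is exactly the risk trade-off the paper describes: a larger band lowers the chance of falsely accepting nonconforming product at the cost of falsely rejecting good product.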

  2. The NISTmAb Reference Material 8671 lifecycle management and quality plan.

    PubMed

    Schiel, John E; Turner, Abigail

    2018-03-01

Comprehensive analysis of monoclonal antibody therapeutics involves an ever expanding cadre of technologies. Lifecycle-appropriate application of current and emerging techniques requires rigorous testing followed by discussion between industry and regulators in a pre-competitive space, an effort that may be facilitated by a widely available test metric. Biopharmaceutical quality materials, however, are often difficult to access and/or are protected by intellectual property rights. The NISTmAb, humanized IgG1κ Reference Material 8671 (RM 8671), has been established with the intent of filling that void. The NISTmAb embodies the quality and characteristics of a typical biopharmaceutical product, is widely available to the biopharmaceutical community, and is an open innovation tool for development and dissemination of results. The NISTmAb lifecycle management plan described herein provides a hierarchical strategy for maintenance of quality over time through rigorous method qualification detailed in additional submissions in the current publication series. The NISTmAb RM 8671 is a representative monoclonal antibody material and provides a means to continually evaluate current best practices, promote innovative approaches, and inform regulatory paradigms as technology advances. Graphical abstract The NISTmAb Reference Material (RM) 8671 is intended to be an industry standard monoclonal antibody for pre-competitive harmonization of best practices and designing next generation characterization technologies for identity, quality, and stability testing.

  3. Can power-law scaling and neuronal avalanches arise from stochastic dynamics?

    PubMed

    Touboul, Jonathan; Destexhe, Alain

    2010-02-11

The presence of self-organized criticality in biology is often evidenced by a power-law scaling of event size distributions, which can be measured by linear regression on logarithmic axes. We show here that such a procedure does not necessarily mean that the system exhibits self-organized criticality. We first provide an analysis of multisite local field potential (LFP) recordings of brain activity and show that event size distributions defined as negative LFP peaks can be close to power-law distributions. However, this result is not robust to changes in detection threshold, or when tested using more rigorous statistical analyses such as the Kolmogorov-Smirnov test. Similar power-law scaling is observed for surrogate signals, suggesting that power-law scaling may be a generic property of thresholded stochastic processes. We next investigate this problem analytically, and show that, indeed, stochastic processes can produce spurious power-law scaling without the presence of underlying self-organized criticality. However, this power-law scaling is only apparent in logarithmic representations, and does not survive more rigorous analysis such as the Kolmogorov-Smirnov test. The same analysis was also performed on an artificial network known to display self-organized criticality. In this case, both the graphical representations and the rigorous statistical analysis reveal with no ambiguity that the avalanche size is distributed as a power law. We conclude that logarithmic representations can lead to spurious power-law scaling induced by the stochastic nature of the phenomenon. This apparent power-law scaling does not constitute a proof of self-organized criticality, which should be demonstrated by more stringent statistical tests.
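The contrast the abstract draws is easy to reproduce on synthetic data: a log-log regression on samples from a plain stochastic process can look deceptively linear, while the Kolmogorov-Smirnov distance to the best-fit power law lands far above its critical value. A sketch under assumed conditions (a lognormal stand-in for the stochastic process, an arbitrary sample size, and xmin fixed at the sample minimum; this is not the authors' LFP data or their exact procedure):

```python
import numpy as np

rng = np.random.default_rng(1)

# Event sizes from a purely stochastic (lognormal) process -- no criticality involved
x = rng.lognormal(mean=1.0, sigma=1.0, size=5000)
xs = np.sort(x)

# Step 1: log-log regression on the empirical tail looks deceptively linear
ccdf = 1.0 - np.arange(1, xs.size + 1) / xs.size
mask = ccdf > 0
r = np.corrcoef(np.log(xs[mask]), np.log(ccdf[mask]))[0, 1]
print(f"log-log correlation: {r:.3f}")  # strongly negative: 'looks like' a power law

# Step 2: rigorous check -- KS distance between the data and the best-fit power law
xmin = xs.min()
alpha = 1.0 + xs.size / np.sum(np.log(xs / xmin))   # maximum-likelihood exponent
fitted_cdf = 1.0 - (xs / xmin) ** (1.0 - alpha)      # continuous power-law CDF
ecdf = np.arange(1, xs.size + 1) / xs.size
ks = np.max(np.abs(ecdf - fitted_cdf))
print(f"KS distance: {ks:.3f} (5% critical value ~ {1.36 / np.sqrt(xs.size):.3f})")
```

The regression alone would pass the data off as a power law; the KS statistic rejects it decisively, which is the paper's point.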

  4. Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.

    PubMed

    Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P

    2018-03-03

    Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Rate Coefficient for the (4)Heμ + CH4 Reaction at 500 K: Comparison between Theory and Experiment.

    PubMed

    Arseneau, Donald J; Fleming, Donald G; Li, Yongle; Li, Jun; Suleimanov, Yury V; Guo, Hua

    2016-03-03

    The rate constant for the H atom abstraction reaction from methane by the muonic helium atom, Heμ + CH4 → HeμH + CH3, is reported at 500 K and compared with theory, providing an important test of both the potential energy surface (PES) and reaction rate theory for the prototypical polyatomic CH5 reaction system. The theory used to characterize this reaction includes both variational transition-state (CVT/μOMT) theory (VTST) and ring polymer molecular dynamics (RPMD) calculations on a recently developed PES, which are compared as well with earlier calculations on different PESs for the H, D, and Mu + CH4 reactions, the latter, in particular, providing for a variation in atomic mass by a factor of 36. Though rigorous quantum calculations have been carried out for the H + CH4 reaction, these have not yet been extended to the isotopologues of this reaction (in contrast to H3), so it is important to provide tests of less rigorous theories in comparison with kinetic isotope effects measured by experiment. In this regard, the agreement between the VTST and RPMD calculations and experiment for the rate constant of the Heμ + CH4 reaction at 500 K is excellent, within 10% in both cases, which overlaps with experimental error.

  6. Enhancing causal interpretations of quality improvement interventions

    PubMed Central

    Cable, G

    2001-01-01

    In an era of chronic resource scarcity it is critical that quality improvement professionals have confidence that their project activities cause measured change. A commonly used research design, the single group pre-test/post-test design, provides little insight into whether quality improvement interventions cause measured outcomes. A re-evaluation of a quality improvement programme designed to reduce the percentage of bilateral cardiac catheterisations for the period from January 1991 to October 1996 in three catheterisation laboratories in a north eastern state in the USA was performed using an interrupted time series design with switching replications. The accuracy and causal interpretability of the findings were considerably improved compared with the original evaluation design. Moreover, the re-evaluation provided tangible evidence in support of the suggestion that more rigorous designs can and should be more widely employed to improve the causal interpretability of quality improvement efforts. Evaluation designs for quality improvement projects should be constructed to provide a reasonable opportunity, given available time and resources, for causal interpretation of the results. Evaluators of quality improvement initiatives may infrequently have access to randomised designs. Nonetheless, as shown here, other very rigorous research designs are available for improving causal interpretability. Unilateral methodological surrender need not be the only alternative to randomised experiments. Key Words: causal interpretations; quality improvement; interrupted time series design; implementation fidelity PMID:11533426
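The core of an interrupted time series analysis is a segmented regression whose step term estimates the level change at the intervention. A minimal sketch with simulated monthly data (the series, intervention month, and effect size are invented, not the study's; a full design with switching replications would add a second series interrupted at a different time):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical monthly % of bilateral catheterisations; intervention at month 36
months = np.arange(72)
intervention = (months >= 36).astype(float)
rate = 30.0 - 0.05 * months - 8.0 * intervention + rng.normal(0.0, 1.0, 72)

# Segmented regression: rate = b0 + b1*time + b2*step
# b2 estimates the level change at the interruption, net of the secular trend
X = np.column_stack([np.ones(72), months, intervention])
b0, b1, b2 = np.linalg.lstsq(X, rate, rcond=None)[0]
print(f"estimated level change at interruption: {b2:.1f} percentage points")
```

Because the pre-intervention trend is modeled explicitly, a drop at the interruption cannot be mistaken for the continuation of an existing decline, which is what gives the design its causal leverage over a simple pre/post comparison.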

  7. Enhancing causal interpretations of quality improvement interventions.

    PubMed

    Cable, G

    2001-09-01

    In an era of chronic resource scarcity it is critical that quality improvement professionals have confidence that their project activities cause measured change. A commonly used research design, the single group pre-test/post-test design, provides little insight into whether quality improvement interventions cause measured outcomes. A re-evaluation of a quality improvement programme designed to reduce the percentage of bilateral cardiac catheterisations for the period from January 1991 to October 1996 in three catheterisation laboratories in a north eastern state in the USA was performed using an interrupted time series design with switching replications. The accuracy and causal interpretability of the findings were considerably improved compared with the original evaluation design. Moreover, the re-evaluation provided tangible evidence in support of the suggestion that more rigorous designs can and should be more widely employed to improve the causal interpretability of quality improvement efforts. Evaluation designs for quality improvement projects should be constructed to provide a reasonable opportunity, given available time and resources, for causal interpretation of the results. Evaluators of quality improvement initiatives may infrequently have access to randomised designs. Nonetheless, as shown here, other very rigorous research designs are available for improving causal interpretability. Unilateral methodological surrender need not be the only alternative to randomised experiments.

  8. Rigor and Responsiveness in Classroom Activity

    ERIC Educational Resources Information Center

Thompson, Jessica; Hagenah, Sara; Kang, Hosun; Stroupe, David; Braaten, Melissa; Colley, Carolyn; Windschitl, Mark

    2016-01-01

    Background/Context: There are few examples from classrooms or the literature that provide a clear vision of teaching that simultaneously promotes rigorous disciplinary activity and is responsive to all students. Maintaining rigorous and equitable classroom discourse is a worthy goal, yet there is no clear consensus of how this actually works in a…

  9. ASTM Committee C28: International Standards for Properties and Performance of Advanced Ceramics-Three Decades of High-Quality, Technically-Rigorous Normalization

    NASA Technical Reports Server (NTRS)

    Jenkins, Michael G.; Salem, Jonathan A.

    2016-01-01

Physical and mechanical properties and performance of advanced ceramics and glasses are difficult to measure correctly without the proper techniques. For over three decades, ASTM Committee C28 on Advanced Ceramics has developed high-quality, technically-rigorous, full-consensus standards (e.g., test methods, practices, guides, terminology) to measure properties and performance of monolithic and composite ceramics that may be applied to glasses in some cases. These standards contain testing particulars for many mechanical, physical, and thermal properties and for the performance of these materials. As a result, these standards are used to generate accurate, reliable, repeatable, and complete data. Within Committee C28, users, producers, researchers, designers, academicians, and others have written, continually updated, and validated through round-robin test programs 50 standards since the Committee's founding in 1986. This paper provides a detailed retrospective of the 30 years of ASTM Committee C28, including a graphical pictogram listing of C28 standards, along with examples of the tangible benefits of standards for advanced ceramics to demonstrate their practical applications.

  10. A rigorous test of the accuracy of USGS digital elevation models in forested areas of Oregon and Washington.

    Treesearch

    Ward W. Carson; Stephen E. Reutebuch

    1997-01-01

    A procedure for performing a rigorous test of elevational accuracy of DEMs using independent ground coordinate data digitized photogrammetrically from aerial photography is presented. The accuracy of a sample set of 23 DEMs covering National Forests in Oregon and Washington was evaluated. Accuracy varied considerably between eastern and western parts of Oregon and...
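Elevational-accuracy tests of this kind typically reduce to the vertical error between DEM elevations and independent check points, summarized as RMSE and mean (systematic) error. A minimal sketch with invented elevations (the values are hypothetical, not drawn from the study's 23-DEM sample):

```python
import numpy as np

# Hypothetical DEM elevations vs. independent photogrammetric check points (metres)
dem_z   = np.array([312.4, 298.7, 305.1, 330.9, 287.2, 301.8])
check_z = np.array([310.9, 299.5, 303.0, 333.4, 286.0, 300.2])

err = dem_z - check_z
rmse = np.sqrt(np.mean(err ** 2))   # vertical RMSE, the usual USGS accuracy statistic
bias = err.mean()                   # systematic offset (positive = DEM reads high)
print(f"RMSE = {rmse:.2f} m, mean error = {bias:+.2f} m")
```

Reporting bias alongside RMSE matters here: a regional systematic offset (as between eastern and western Oregon in the study) can dominate the RMSE even when random error is small.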

  11. Long persistence of rigor mortis at constant low temperature.

    PubMed

    Varetto, Lorenzo; Curto, Ombretta

    2005-01-06

We studied the persistence of rigor mortis by physical manipulation. We tested the mobility of the knee on 146 corpses kept under refrigeration at Torino's city mortuary at a constant temperature of +4 degrees C. We found a persistence of complete rigor lasting for 10 days in all the cadavers we kept under observation, and in one case rigor lasted for 16 days. Between the 11th and the 17th days, a progressively increasing number of corpses showed a change from complete to partial rigor (characterized by partial bending of the articulation). After the 17th day, all the remaining corpses showed partial rigor, and in the two cadavers that were kept under observation "à outrance" we found that absolute resolution of rigor mortis occurred on the 28th day. Our results prove that it is possible to find a persistence of rigor mortis much longer than expected when environmental conditions resemble average outdoor winter temperatures in temperate zones. This datum must therefore be considered when a corpse is found in those environmental conditions, so that the long persistence of rigor mortis does not mislead the estimation of the time of death.

  12. Testing adaptive toolbox models: a Bayesian hierarchical approach.

    PubMed

    Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.

  13. Human-rated Safety Certification of a High Voltage Robonaut Lithium-ion Battery

    NASA Technical Reports Server (NTRS)

    Jeevarajan, Judith; Yayathi, S.; Johnson, M.; Waligora, T.; Verdeyen, W.

    2013-01-01

NASA's rigorous certification process is being followed for the R2 high voltage battery program for use of R2 on the International Space Station (ISS). Rigorous development testing at appropriate levels, against credible off-nominal conditions, and review of the test data led to design improvements for safety at the virtual cell, cartridge, and battery levels. Tests were carried out at all levels to confirm that both hardware and software controls work. Stringent flight acceptance testing of the flight battery will be completed before launch for mission use on the ISS.

  14. [Experimental study of restiffening of the rigor mortis].

    PubMed

    Wang, X; Li, M; Liao, Z G; Yi, X F; Peng, X M

    2001-11-01

To observe changes in sarcomere length in rats during restiffening after rigor mortis. We measured the sarcomere length of the quadriceps in 40 rats under different conditions by scanning electron microscopy. The sarcomere length in rigor mortis without disruption is markedly shorter than that after restiffening, and sarcomere length is negatively correlated with the intensity of rigor mortis. Measuring sarcomere length can determine the intensity of rigor mortis and provide evidence for estimating the time since death.

  15. Integrating into the Mental Health System from the Criminal Justice System: Jail Aftercare Services for Persons with a Severe Mental Illness

    ERIC Educational Resources Information Center

    Davis, Kristin; Fallon, John; Vogel, Sue; Teachout, Alexandra

    2008-01-01

    This article describes a mental health evidence based practice, Assertive Community Treatment (ACT). While ACT has scientific support, it has not been rigorously tested for persons with a severe mental illness and repeated forensic involvement. This article provides preliminary evidence that ACT is best suited for reentry into the mental health…

  16. The Aharonov-Bohm effect and Tonomura et al. experiments: Rigorous results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ballesteros, Miguel; Weder, Ricardo

    The Aharonov-Bohm effect is a fundamental issue in physics. It describes the physically important electromagnetic quantities in quantum mechanics. Its experimental verification constitutes a test of the theory of quantum mechanics itself. The remarkable experiments of Tonomura et al. ['Observation of Aharonov-Bohm effect by electron holography', Phys. Rev. Lett. 48, 1443 (1982) and 'Evidence for Aharonov-Bohm effect with magnetic field completely shielded from electron wave', Phys. Rev. Lett. 56, 792 (1986)] are widely considered as the only experimental evidence of the physical existence of the Aharonov-Bohm effect. Here we give the first rigorous proof that the classical ansatz of Aharonov and Bohm of 1959 ['Significance of electromagnetic potentials in the quantum theory', Phys. Rev. 115, 485 (1959)], that was tested by Tonomura et al., is a good approximation to the exact solution to the Schroedinger equation. This also proves that the electron, that is, represented by the exact solution, is not accelerated, in agreement with the recent experiment of Caprez et al. in 2007 ['Macroscopic test of the Aharonov-Bohm effect', Phys. Rev. Lett. 99, 210401 (2007)], which shows that the results of the Tonomura et al. experiments cannot be explained by the action of a force. Under the assumption that the incoming free electron is a Gaussian wave packet, we estimate the exact solution to the Schroedinger equation for all times. We provide a rigorous, quantitative error bound for the difference in norm between the exact solution and the Aharonov-Bohm ansatz. Our bound is uniform in time. We also prove that on the Gaussian asymptotic state the scattering operator is given by a constant phase shift, up to a quantitative error bound that we provide. Our results show that for intermediate-size electron wave packets, smaller than the ones used in the Tonomura et al. experiments, quantum mechanics predicts the results observed by Tonomura et al. with an error bound smaller than 10^-99. It would be quite interesting to perform experiments with electron wave packets of intermediate size. Furthermore, we provide a physical interpretation of our error bound.

  17. A Regional Seismic Travel Time Model for North America

    DTIC Science & Technology

    2010-09-01

    velocity at the Moho, the mantle velocity gradient, and the average crustal velocity. After tomography across Eurasia, rigorous tests find that Pn travel time residuals are reduced…and S-wave velocity in the crustal layers and in the upper mantle. A good prior model is essential because the RSTT tomography inversion is invariably

  18. A multi-zone muffle furnace design

    NASA Technical Reports Server (NTRS)

    Rowe, Neil D.; Kisel, Martin

    1993-01-01

    A Multi-Zone Muffle-Tube Furnace was designed, built, and tested for the purpose of providing an in-house experience base with tubular furnaces for materials processing in microgravity. As such, it must not only provide the desired temperatures and controlled thermal gradients at several discrete zones along its length but must also be capable of sustaining the rigors of a Space Shuttle launch. The furnace is insulated to minimize radial and axial heat losses. It is contained in a water-cooled enclosure for purposes of dissipating unwanted residual heat, keeping the outer surfaces of the furnace at a 'touch-safe' temperature, and providing a rugged housing. This report describes the salient features of the furnace, testing procedures and results, and concluding remarks evaluating the overall design.

  19. A consortium-driven framework to guide the implementation of ICH M7 Option 4 control strategies.

    PubMed

    Barber, Chris; Antonucci, Vincent; Baumann, Jens-Christoph; Brown, Roland; Covey-Crump, Elizabeth; Elder, David; Elliott, Eric; Fennell, Jared W; Gallou, Fabrice; Ide, Nathan D; Jordine, Guido; Kallemeyn, Jeffrey M; Lauwers, Dirk; Looker, Adam R; Lovelle, Lucie E; McLaughlin, Mark; Molzahn, Robert; Ott, Martin; Schils, Didier; Oestrich, Rolf Schulte; Stevenson, Neil; Talavera, Pere; Teasdale, Andrew; Urquhart, Michael W; Varie, David L; Welch, Dennie

    2017-11-01

    The ICH M7 Option 4 control of (potentially) mutagenic impurities is based on the use of scientific principles in lieu of routine analytical testing. This approach can reduce the burden of analytical testing without compromising patient safety, provided a scientifically rigorous approach is taken which is backed up by sufficient theoretical and/or analytical data. This paper introduces a consortium-led initiative and offers a proposal on the supporting evidence that could be presented in regulatory submissions. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. How Do You Measure That Ceramic Property?

    NASA Technical Reports Server (NTRS)

    Salem, Jonathan; Helfinstine, John; Quinn, George; Gonczy, Stephen

    2011-01-01

    By using the dozens of consensus test standards and practices developed by the Advanced Ceramics Committee of ASTM, C-28, the measurement of mechanical, physical, thermal, and performance properties can be properly performed. The what, how, how not, and why are clearly illustrated for beginning as well as experienced testers. Using these standards will provide accurate, reliable, and complete data for rigorous comparisons with other test results. The C-28 Committee has involved academics, producers, and users of ceramics to write and continually update more than 45 standards since the committee's inception in 1986.

  1. Microbicide safety/efficacy studies in animals: macaques and small animal models.

    PubMed

    Veazey, Ronald S

    2008-09-01

    A number of microbicide candidates have failed to prevent HIV transmission in human clinical trials, and there is uncertainty as to how many additional trials can be supported by the field. Regardless, there are far too many microbicide candidates in development, and a logical and consistent method for screening and selecting candidates for human clinical trials is desperately needed. The unique host and cell specificity of HIV, however, provides challenges for microbicide safety and efficacy screening, which can only be addressed by rigorous testing in relevant laboratory animal models. A number of laboratory animal model systems ranging from rodents to nonhuman primates, and single versus multiple dose challenges have recently been developed to test microbicide candidates. These models have shed light on both the safety and efficacy of candidate microbicides as well as the early mechanisms involved in transmission. This article summarizes the major advantages and disadvantages of the relevant animal models for microbicide safety and efficacy testing. Currently, nonhuman primates are the only relevant and effective laboratory model for screening microbicide candidates. Given the consistent failures of prior strategies, it is now clear that rigorous safety and efficacy testing in nonhuman primates should be a prerequisite for advancing additional microbicide candidates to human clinical trials.

  2. Microbicide Safety/Efficacy studies in animals -macaques and small animal models

    PubMed Central

    Veazey, Ronald S.

    2009-01-01

    Purpose of review A number of microbicide candidates have failed to prevent HIV transmission in human clinical trials, and there is uncertainty as to how many additional trials can be supported by the field. Regardless, there are far too many microbicide candidates in development, and a logical and consistent method for screening and selecting candidates for human clinical trials is desperately needed. However, the unique host and cell specificity of HIV provides challenges for microbicide safety and efficacy screening, which can only be addressed by rigorous testing in relevant laboratory animal models. Recent findings A number of laboratory animal model systems ranging from rodents to nonhuman primates, and single versus multiple dose challenges have recently been developed to test microbicide candidates. These models have shed light on both the safety and efficacy of candidate microbicides as well as the early mechanisms involved in transmission. This article summarizes the major advantages and disadvantages of the relevant animal models for microbicide safety and efficacy testing. Summary Currently, nonhuman primates are the only relevant and effective laboratory model for screening microbicide candidates. Given the consistent failures of prior strategies, it is now clear that rigorous safety and efficacy testing in nonhuman primates should be a prerequisite for advancing additional microbicide candidates to human clinical trials. PMID:19373023

  3. In Search of Golden Rules: Comment on Hypothesis-Testing Approaches to Setting Cutoff Values for Fit Indexes and Dangers in Overgeneralizing Hu and Bentler's (1999) Findings

    ERIC Educational Resources Information Center

    Marsh, Herbert W.; Hau, Kit-Tai; Wen, Zhonglin

    2004-01-01

    Goodness-of-fit (GOF) indexes provide "rules of thumb": recommended cutoff values for assessing fit in structural equation modeling. Hu and Bentler (1999) proposed a more rigorous approach to evaluating decision rules based on GOF indexes and, on this basis, proposed new and more stringent cutoff values for many indexes. This article discusses…

  4. Are we there yet? The state of the evidence base for guidelines on breaking bad news to cancer patients.

    PubMed

    Paul, C L; Clinton-McHarg, T; Sanson-Fisher, R W; Douglas, H; Webb, G

    2009-11-01

    The way clinicians break bad news to cancer patients has been retrospectively associated with poor psychosocial outcomes for patients. Education and practice in breaking bad news may be ineffective for improving patients' well-being unless it is informed by a sound evidence base. In the health field, research efforts are expected to advance evidence over time to inform evidence-based practice. Key characteristics of an advancing evidence base are a predominance of new data, and rigorous intervention studies which prospectively demonstrate improved outcomes. This review aimed to examine the progress of the evidence base in breaking bad news to cancer patients. Manual and computer-based searches (Medline and PsycINFO) were performed to identify publications on the topic of breaking bad news to cancer patients published between January 1995 and March 2009. Relevant publications were coded in terms of whether they provided new data, examined psychosocial outcomes for patients or tested intervention strategies and whether intervention studies met criteria for design rigour. Of the 245 relevant publications, 55.5% provided new data and 16.7% were intervention studies. Much of the intervention effort was directed towards improving provider skills rather than patient outcomes (9.8% of studies). Less than 2% of publications were rigorous intervention studies which addressed psychosocial outcomes for patients. Rigorous intervention studies which evaluate strategies for improving psychosocial outcomes in relation to breaking bad news to cancer patients are needed. Current practice and training regarding breaking bad news cannot be regarded as evidence-based until further research is completed.

  5. SPRUCE Advanced Molecular Techniques Provide a Rigorous Method for Characterizing Organic Matter Quality in Complex Systems: Supporting Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, Rachel M; Tfaily, Malak M

    These data are provided in support of the Commentary, Advanced molecular techniques provide a rigorous method for characterizing organic matter quality in complex systems, Wilson and Tfaily (2018). Measurement results demonstrate that optical characterization of peatland dissolved organic matter (DOM) may not fully capture classically identified chemical characteristics and may, therefore, not be the best measure of organic matter quality.

  6. Improved methods for distribution loss evaluation. Volume 1: analytic and evaluative techniques. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flinn, D.G.; Hall, S.; Morris, J.

    This volume describes the background research, the application of the proposed loss evaluation techniques, and the results. The research identified present loss calculation methods as appropriate, provided care was taken to represent the various system elements in sufficient detail. The literature search of past methods and typical data revealed that extreme caution in using typical values (load factor, etc.) should be taken to ensure that all factors were referred to the same time base (daily, weekly, etc.). The performance of the method (and computer program) proposed in this project was determined by comparison of results with a rigorous evaluation of losses on the Salt River Project system. This rigorous evaluation used statistical modeling of the entire system as well as explicit enumeration of all substation and distribution transformers. Further tests were conducted at Public Service Electric and Gas of New Jersey to check the appropriateness of the methods in a northern environment. Finally, sensitivity tests indicated the data elements whose inaccuracy would most affect the determination of losses using the method developed in this project.

  7. Social Norms: Do We Love Norms Too Much?

    PubMed

    Bell, David C; Cox, Mary L

    2015-03-01

    Social norms are often cited as the cause of many social phenomena, especially as an explanation for prosocial family and relationship behaviors. And yet maybe we love the idea of social norms too much, as suggested by our failure to subject them to rigorous test. Compared to the detail in social norms theoretical orientations, there is very little detail in tests of normative theories. To provide guidance to researchers who invoke social norms as explanations, we catalog normative orientations that have been proposed to account for consistent patterns of action. We call on researchers to conduct tests of normative theories and the processes such theories assert.

  8. Corona And Ultraviolet Equipment For Testing Materials

    NASA Technical Reports Server (NTRS)

    Laue, Eric G.

    1993-01-01

    Two assemblies of laboratory equipment developed for use in testing abilities of polymers, paints, and other materials to withstand ultraviolet radiation and charged particles. One is vacuum ultraviolet source built around commercial deuterium lamp. Other exposes specimen in partial vacuum to both ultraviolet radiation and brush corona discharge. Either or both assemblies used separately or together to simulate approximately combination of solar radiation and charged particles encountered by materials aboard spacecraft in orbit around Earth. Also used to provide rigorous environmental tests of materials exposed to artificial ultraviolet radiation and charged particles in industrial and scientific settings or to natural ultraviolet radiation and charged particles aboard aircraft at high altitudes.

  9. Social Norms: Do We Love Norms Too Much?

    PubMed Central

    Bell, David C.; Cox, Mary L.

    2014-01-01

    Social norms are often cited as the cause of many social phenomena, especially as an explanation for prosocial family and relationship behaviors. And yet maybe we love the idea of social norms too much, as suggested by our failure to subject them to rigorous test. Compared to the detail in social norms theoretical orientations, there is very little detail in tests of normative theories. To provide guidance to researchers who invoke social norms as explanations, we catalog normative orientations that have been proposed to account for consistent patterns of action. We call on researchers to conduct tests of normative theories and the processes such theories assert. PMID:25937833

  10. Using qualitative methods to improve questionnaires for Spanish speakers: assessing face validity of a food behavior checklist.

    PubMed

    Banna, Jinan C; Vera Becerra, Luz E; Kaiser, Lucia L; Townsend, Marilyn S

    2010-01-01

    Development of outcome measures relevant to health nutrition behaviors requires a rigorous process of testing and revision. Whereas researchers often report performance of quantitative data collection to assess questionnaire validity and reliability, qualitative testing procedures are often overlooked. This report outlines a procedure for assessing face validity of a Spanish-language dietary assessment tool. Reviewing the literature produced no rigorously validated Spanish-language food behavior assessment tools for the US Department of Agriculture's food assistance and education programs. In response to this need, this study evaluated the face validity of a Spanish-language food behavior checklist adapted from a 16-item English version of a food behavior checklist shown to be valid and reliable for limited-resource English speakers. The English version was translated using rigorous methods involving initial translation by one party and creation of five possible versions. Photos were modified based on client input and new photos were taken as necessary. A sample of low-income, Spanish-speaking women completed cognitive interviews (n=20). Spanish translation experts (n=7) fluent in both languages and familiar with both cultures made minor modifications but essentially approved client preferences. The resulting checklist generated a readability score of 93, indicating low reading difficulty. The Spanish-language checklist has adequate face validity in the target population and is ready for further validation using convergent measures. At the conclusion of testing, this instrument may be used to evaluate nutrition education interventions in California. These qualitative procedures provide a framework for designing evaluation tools for low-literate audiences participating in the US Department of Agriculture food assistance and education programs. Copyright 2010 American Dietetic Association. Published by Elsevier Inc. All rights reserved.
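
    The report does not name the readability metric behind its score of 93; as a hedged sketch, a Flesch-style reading-ease formula (for Spanish text, adaptations such as Fernández-Huerta use the same structure with different coefficients) is one common way such a score is computed:

```python
# Hypothetical sketch: Flesch Reading Ease-style score, where higher
# values mean easier text (90+ is typically read as "very easy").
# This is an assumed formula; the checklist study does not specify one.

def flesch_reading_ease(total_words: int, total_sentences: int,
                        total_syllables: int) -> float:
    words_per_sentence = total_words / total_sentences
    syllables_per_word = total_syllables / total_words
    return 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word

# A short, simple text: 100 words, 10 sentences, 120 syllables.
score = flesch_reading_ease(100, 10, 120)
print(round(score, 1))  # scores above 90 indicate very easy text
```

    Longer sentences and more syllables per word both push the score down, which is why checklists for low-literate audiences aim for scores in the 90s.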

  11. Using Qualitative Methods to Improve Questionnaires for Spanish Speakers: Assessing Face Validity of a Food Behavior Checklist

    PubMed Central

    BANNA, JINAN C.; VERA BECERRA, LUZ E.; KAISER, LUCIA L.; TOWNSEND, MARILYN S.

    2015-01-01

    Development of outcome measures relevant to health nutrition behaviors requires a rigorous process of testing and revision. Whereas researchers often report performance of quantitative data collection to assess questionnaire validity and reliability, qualitative testing procedures are often overlooked. This report outlines a procedure for assessing face validity of a Spanish-language dietary assessment tool. Reviewing the literature produced no rigorously validated Spanish-language food behavior assessment tools for the US Department of Agriculture’s food assistance and education programs. In response to this need, this study evaluated the face validity of a Spanish-language food behavior checklist adapted from a 16-item English version of a food behavior checklist shown to be valid and reliable for limited-resource English speakers. The English version was translated using rigorous methods involving initial translation by one party and creation of five possible versions. Photos were modified based on client input and new photos were taken as necessary. A sample of low-income, Spanish-speaking women completed cognitive interviews (n=20). Spanish translation experts (n=7) fluent in both languages and familiar with both cultures made minor modifications but essentially approved client preferences. The resulting checklist generated a readability score of 93, indicating low reading difficulty. The Spanish-language checklist has adequate face validity in the target population and is ready for further validation using convergent measures. At the conclusion of testing, this instrument may be used to evaluate nutrition education interventions in California. These qualitative procedures provide a framework for designing evaluation tools for low-literate audiences participating in the US Department of Agriculture food assistance and education programs. PMID:20102831

  12. Wideband Global SATCOM (WGS)

    DTIC Science & Technology

    2015-12-01

    system level testing. The WGS-6 financial data is not reported in this SAR because funding is provided by Australia in exchange for access to a… Confidence Level of cost estimate for current APB: 50%. The ICE to support the WGS Milestone C decision… to calculate mathematically the precise confidence levels associated with life-cycle cost estimates prepared for MDAPs. Based on the rigor in

  13. The psychology of psychology: A thought experiment.

    PubMed

    Ceci, Stephen J; Williams, Wendy M

    2015-01-01

    In the target article, Duarte et al. allege that the lack of political diversity reduces research efficacy. We pose a thought experiment that could provide an empirical test by examining whether institutional review board (IRB) members, granting agencies, and journal reviewers filter scientific products based on political values, invoking scientific criteria (rigor, etc.) as their justification. When these same products are cast in terms highlighting opposite values, do these people shift their decisions?

  14. Over half a century of studying carbon-12

    NASA Astrophysics Data System (ADS)

    Kokalova Wheldon, Tzany

    2015-09-01

    Carbon-12 is one of the most studied light nuclei yet it continues to surprise and provide a rigorous testing ground for a wide range of physics, from nucleosynthesis models to theories of symmetries. This paper discusses the background motivating the investigations of 12C and summarises the recent results, with an emphasis on collective excitations and the high-energy structure together with possible future directions for this most intriguing of nuclei.

  15. The strong Bell inequalities: A proposed experimental test

    NASA Technical Reports Server (NTRS)

    Fry, Edward S.

    1994-01-01

    All previous experimental tests of Bell inequalities have required additional assumptions. The strong Bell inequalities (i.e., those requiring no additional assumptions) have never been tested. An experiment has been designed that can, for the first time, provide a definitive test of the strong Bell inequalities. Not only will the detector efficiency loophole be closed, but the locality condition will also be rigorously enforced. The experiment involves producing two Hg-199 atoms by a resonant Raman dissociation of a mercury dimer ((199)Hg2) that is in an electronic and nuclear spin singlet state. Bell inequalities can be tested by measuring angular momentum correlations between the spin one-half nuclei of the two Hg-199 atoms. The method used to make these latter measurements will be described.

  16. Does McNemar's test compare the sensitivities and specificities of two diagnostic tests?

    PubMed

    Kim, Soeun; Lee, Woojoo

    2017-02-01

    McNemar's test is often used in practice to compare the sensitivities and specificities for the evaluation of two diagnostic tests. For correct evaluation of accuracy, an intuitive recommendation is to test the diseased and the non-diseased groups separately so that the sensitivities can be compared among the diseased, and specificities can be compared among the healthy group of people. This paper provides a rigorous theoretical framework for this argument and studies the validity of McNemar's test regardless of the conditional independence assumption. We derive McNemar's test statistic under the null hypothesis under both the conditional independence and conditional dependence assumptions. We then perform power analyses to show how the result is affected by the amount of conditional dependence under the alternative hypothesis.
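
    As an illustrative sketch (not code from the paper), the classical McNemar statistic is computed from the discordant counts of a paired 2x2 table; following the abstract's recommendation, it would be applied separately within the diseased group (comparing sensitivities) and the non-diseased group (comparing specificities):

```python
# Illustrative sketch of McNemar's test on paired binary outcomes.
# b = subjects positive on test A only, c = positive on test B only.
import math

def mcnemar(b: int, c: int) -> tuple[float, float]:
    """Return (chi-square statistic, approximate p-value) for the
    discordant counts b and c of a paired 2x2 table."""
    stat = (b - c) ** 2 / (b + c)
    # p-value from the chi-square distribution with 1 df, via the
    # tail identity P(X > s) = erfc(sqrt(s / 2)).
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Among diseased subjects: 15 detected only by test A, 5 only by test B.
stat, p = mcnemar(15, 5)
print(round(stat, 2), round(p, 3))  # statistic 5.0, p ≈ 0.025
```

    Only the discordant pairs enter the statistic; the concordant cells carry no information about a difference in sensitivity (or specificity) between the two tests.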

  17. Skill Assessment for Coupled Biological/Physical Models of Marine Systems.

    PubMed

    Stow, Craig A; Jolliff, Jason; McGillicuddy, Dennis J; Doney, Scott C; Allen, J Icarus; Friedrichs, Marjorie A M; Rose, Kenneth A; Wallhead, Philip

    2009-02-20

    Coupled biological/physical models of marine systems serve many purposes including the synthesis of information, hypothesis generation, and as a tool for numerical experimentation. However, marine system models are increasingly used for prediction to support high-stakes decision-making. In such applications it is imperative that a rigorous model skill assessment is conducted so that the model's capabilities are tested and understood. Herein, we review several metrics and approaches useful to evaluate model skill. The definition of skill and the determination of the skill level necessary for a given application is context specific and no single metric is likely to reveal all aspects of model skill. Thus, we recommend the use of several metrics, in concert, to provide a more thorough appraisal. The routine application and presentation of rigorous skill assessment metrics will also serve the broader interests of the modeling community, ultimately resulting in improved forecasting abilities as well as helping us recognize our limitations.
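
    The review surveys several such metrics; as a hedged sketch (standard metrics chosen for illustration, not necessarily the paper's exact list), bias, root-mean-square error, and Pearson correlation between observations and model output can be computed as:

```python
# Sketch of three common model-skill metrics computed on paired
# observation/model values: mean bias, RMSE, and Pearson correlation.
import math

def skill_metrics(obs: list[float], mod: list[float]) -> dict[str, float]:
    n = len(obs)
    bias = sum(m - o for o, m in zip(obs, mod)) / n          # mean error
    rmse = math.sqrt(sum((m - o) ** 2
                         for o, m in zip(obs, mod)) / n)     # typical magnitude of error
    mo, mm = sum(obs) / n, sum(mod) / n
    cov = sum((o - mo) * (m - mm) for o, m in zip(obs, mod))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    sm = math.sqrt(sum((m - mm) ** 2 for m in mod))
    return {"bias": bias, "rmse": rmse, "r": cov / (so * sm)}

# Toy observation/model pairs:
m = skill_metrics([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 4.2])
print(round(m["bias"], 2), round(m["rmse"], 3), round(m["r"], 3))
```

    As the review argues, no one of these reveals all aspects of skill: a model can correlate highly with observations yet carry a large bias, so the metrics are best reported together.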

  18. Moving toward Cognitive Alignment: Effective Data Provides Feedback Teachers Can Use to Make Adjustments in Learning Activities that Result in Standards Alignment with Content and Cognitive Rigor

    ERIC Educational Resources Information Center

    Manthey, George

    2005-01-01

    The most effective teaching strategies require higher order thinking, but the most used strategies seem to involve lower order thinking. If a comparison could be made between the cognitive rigor of the content standards that students are to be learning and the cognitive rigor of the actual work students are doing, then these kind of data are…

  19. Scientific rigor through videogames.

    PubMed

    Treuille, Adrien; Das, Rhiju

    2014-11-01

    Hypothesis-driven experimentation - the scientific method - can be subverted by fraud, irreproducibility, and lack of rigorous predictive tests. A robust solution to these problems may be the 'massive open laboratory' model, recently embodied in the internet-scale videogame EteRNA. Deploying similar platforms throughout biology could enforce the scientific method more broadly. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Presumptive and Confirmatory Drug Tests

    ERIC Educational Resources Information Center

    Anderson, Craig

    2005-01-01

    Most drug testing is first done with some kind of qualitative presumptive test. After the qualitative presumptive tests are performed, a confirmatory test is necessary; this demonstrates to the students the rigor needed to conclusively identify a substance.

  1. A Rigorous Temperature-Dependent Stochastic Modelling and Testing for MEMS-Based Inertial Sensor Errors.

    PubMed

    El-Diasty, Mohammed; Pagiatakis, Spiros

    2009-01-01

    In this paper, we examine the effect of changing the temperature points on MEMS-based inertial sensor random error. We collect static data under different temperature points using a MEMS-based inertial sensor mounted inside a thermal chamber. Rigorous stochastic models, namely Autoregressive-based Gauss-Markov (AR-based GM) models, are developed to describe the random error behaviour. The proposed AR-based GM model is initially applied to short stationary inertial data to develop the stochastic model parameters (correlation times). It is shown that the stochastic model parameters of a MEMS-based inertial unit, namely the ADIS16364, are temperature dependent. In addition, field kinematic test data collected at about 17 °C are used to test the performance of the stochastic models at different temperature points in the filtering stage using Unscented Kalman Filter (UKF). It is shown that the stochastic model developed at 20 °C provides a more accurate inertial navigation solution than the ones obtained from the stochastic models developed at -40 °C, -20 °C, 0 °C, +40 °C, and +60 °C. The temperature dependence of the stochastic model is significant and should be considered at all times to obtain optimal navigation solution for MEMS-based INS/GPS integration.
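
    As a minimal sketch (an assumption for illustration, not the paper's AR-based GM implementation), the familiar first-order Gauss-Markov process often used to model a slowly varying inertial-sensor bias can be simulated as follows, where the correlation time tau would be taken from the stochastic model identified at the relevant temperature:

```python
# First-order Gauss-Markov process: x[k+1] = beta * x[k] + w[k], with
# beta = exp(-dt/tau) and driving noise scaled so the stationary
# standard deviation equals sigma.
import math
import random

def simulate_gm(n: int, dt: float, tau: float, sigma: float,
                seed: int = 0) -> list[float]:
    rng = random.Random(seed)
    beta = math.exp(-dt / tau)                 # step-to-step correlation
    q = sigma * math.sqrt(1.0 - beta * beta)   # keeps the variance stationary
    x, out = 0.0, []
    for _ in range(n):
        x = beta * x + q * rng.gauss(0.0, 1.0)
        out.append(x)
    return out

xs = simulate_gm(n=200_000, dt=1.0, tau=10.0, sigma=1.0)
var = sum(v * v for v in xs) / len(xs)  # sample variance (zero-mean process)
print(round(var, 2))                    # close to sigma**2 = 1.0
```

    A temperature-dependent model in this framing simply means beta and q (through tau and sigma) change with the chamber temperature at which the static data were collected.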

  2. Psychometric analysis of the Brisbane Practice Environment Measure (B-PEM).

    PubMed

    Flint, Anndrea; Farrugia, Charles; Courtney, Mary; Webster, Joan

    2010-03-01

    To undertake rigorous psychometric testing of the newly developed contemporary work environment measure (the Brisbane Practice Environment Measure [B-PEM]) using exploratory factor analysis and confirmatory factor analysis. Content validity of the 33-item measure was established by a panel of experts. Initial testing involved 195 nursing staff using principal component factor analysis with varimax rotation (orthogonal) and Cronbach's alpha coefficients. Confirmatory factor analysis was conducted using data from a further 983 nursing staff. Principal component factor analysis yielded a four-factor solution with eigenvalues greater than 1 that explained 52.53% of the variance. These factors were then verified using confirmatory factor analysis. Goodness-of-fit indices showed an acceptable fit overall with the full model, explaining 21% to 73% of the variance. Deletion of items took place throughout the evolution of the instrument, resulting in a 26-item, four-factor measure called the Brisbane Practice Environment Measure-Tested. The B-PEM has undergone rigorous psychometric testing, providing evidence of internal consistency and goodness-of-fit indices within acceptable ranges. The measure can be utilised as a subscale or total score reflective of a contemporary nursing work environment. An up-to-date instrument to measure practice environment may be useful for nursing leaders to monitor the workplace and to assist in identifying areas for improvement, facilitating greater job satisfaction and retention.
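
    The "eigenvalues greater than 1" retention rule used above (the Kaiser criterion) can be sketched on a hypothetical toy correlation matrix (illustrative only, not the B-PEM data): with two correlated item pairs, exactly two components are retained.

```python
# Kaiser criterion sketch: retain components of a correlation matrix
# whose eigenvalues exceed 1, and report the variance they explain.
import numpy as np

# Toy 4-item correlation matrix: items 1-2 correlate 0.8, items 3-4
# correlate 0.8, and the two pairs are mutually independent.
R = np.array([[1.0, 0.8, 0.0, 0.0],
              [0.8, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.8],
              [0.0, 0.0, 0.8, 1.0]])

eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]   # 1.8, 1.8, 0.2, 0.2
retained = int(np.sum(eigvals > 1.0))            # Kaiser criterion
explained = eigvals[:retained].sum() / eigvals.sum()

print(retained, round(float(explained), 2))  # 2 components, 0.9 of variance
```

    For a correlation matrix the eigenvalues sum to the number of items, so each eigenvalue divided by that total is the fraction of variance a component explains; the B-PEM's four retained factors explaining 52.53% is the same computation on the real item data.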

  3. Trends: Rigor Mortis in the Arts.

    ERIC Educational Resources Information Center

    Blodget, Alden S.

    1991-01-01

    Outlines how past art education provided a refuge for students from the rigors of other academic subjects. Observes that in recent years art education has become "discipline based." Argues that art educators need to reaffirm their commitment to a humanistic way of knowing. (KM)

  4. A computational framework for automation of point defect calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goyal, Anuj; Gorai, Prashun; Peng, Haowei

    We have developed a complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory. Furthermore, the framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. This package provides the capability to compute widely-accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology.
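
    As a hedged illustration (not the package's actual API), the leading-order Makov-Payne formula is one common form of the "image-charge correction" such frameworks compute for charged defects in a periodic supercell:

```python
# Leading-order Makov-Payne image-charge correction for a charged defect
# in a cubic supercell: E = q^2 * alpha_M / (2 * eps * L), evaluated in
# atomic units and converted to eV.
HARTREE_TO_EV = 27.211386
BOHR_TO_ANGSTROM = 0.529177

def makov_payne_ev(charge: int, madelung: float, epsilon: float,
                   length_angstrom: float) -> float:
    """charge: defect charge state q (units of e)
    madelung: lattice Madelung constant alpha_M (~2.8373 for simple cubic)
    epsilon: static dielectric constant of the host
    length_angstrom: cubic supercell edge length L, in Angstrom"""
    length_bohr = length_angstrom / BOHR_TO_ANGSTROM
    e_hartree = charge ** 2 * madelung / (2.0 * epsilon * length_bohr)
    return e_hartree * HARTREE_TO_EV

# Hypothetical case: a q = +2 defect, 10 A cubic cell, epsilon = 10.
print(round(makov_payne_ev(2, 2.8373, 10.0, 10.0), 3))  # ≈ 0.82 eV
```

    The 1/L scaling is why the correction matters most for small supercells; potential alignment and band-filling corrections address separate finite-size artifacts and are computed independently.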

  5. A computational framework for automation of point defect calculations

    DOE PAGES

    Goyal, Anuj; Gorai, Prashun; Peng, Haowei; ...

    2017-01-13

    We have developed a complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory. Furthermore, the framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. This package provides the capability to compute widely-accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology.

  6. The effect of rigor mortis on the passage of erythrocytes and fluid through the myocardium of isolated dog hearts.

    PubMed

    Nevalainen, T J; Gavin, J B; Seelye, R N; Whitehouse, S; Donnell, M

    1978-07-01

    The effect of normal and artificially induced rigor mortis on the vascular passage of erythrocytes and fluid through isolated dog hearts was studied. Increased rigidity of 6-mm thick transmural sections through the centre of the posterior papillary muscle was used as an indication of rigor. The perfusibility of the myocardium was tested by injecting 10 ml of 1% sodium fluorescein in Hanks solution into the circumflex branch of the left coronary artery. In prerigor hearts (20 minute incubation) fluorescein perfused the myocardium evenly whether or not it was preceded by an injection of 10 ml of heparinized dog blood. Rigor mortis developed in all hearts after 90 minutes incubation or within 20 minutes of perfusing the heart with 50 ml of 5 mM iodoacetate in Hanks solution. Fluorescein injected into hearts in rigor did not enter the posterior papillary muscle and adjacent subendocardium whether or not it was preceded by heparinized blood. Thus the vascular occlusion caused by rigor in the dog heart appears to be so effective that it prevents flow into the subendocardium of small soluble ions such as fluorescein.

  7. A review of the evaluation of 47 drug abuse prevention curricula available nationally.

    PubMed

    Dusenbury, L; Falco, M; Lake, A

    1997-04-01

    This review determined how many drug prevention curricula available to schools have been shown in rigorous research studies to reduce substance use behavior. Forty-seven curricula which met the following criteria were included: 1) they focused on primary prevention of alcohol and/or drug use, 2) they were classroom-based curricula designed for any grade level P-12, 3) they were nationally and currently available, and 4) program distributors were willing to provide samples of curriculum materials to determine drug abuse prevention content. Of the 47 drug abuse prevention curricula identified, 10 (21%) had been subjected to sufficiently rigorous evaluations. At least eight of the 10 programs have been shown effective at reducing tobacco or drug use, in at least some studies. The remaining two programs did not appear to have sustained effects on drug use, although they had variable success at reducing substance use early on. One of the 10 programs has been shown to have positive effects lasting into young adulthood. Six of the 10 curricula have been shown to have effects lasting for at least two years after the pretest. Two curricula have not been evaluated beyond the post-test, so it is impossible to know whether their effectiveness will last. Recommendations to increase the number of programs rigorously evaluated are offered.

  8. A Rigorous Curriculum Really Matters

    ERIC Educational Resources Information Center

    Cook, Erika

    2013-01-01

    As every good secondary administrator knows, rigorous curricula matter. A challenging curriculum is the key factor in lifting each student toward his or her potential: "the academic intensity of the student's high school curriculum still counts more than anything else...in providing momentum toward completing a bachelor's degree" (Adelman, 2006,…

  9. Visit from JAXA to NASA MSFC: The Engines Element & Ideas for Collaboration

    NASA Technical Reports Server (NTRS)

    Greene, William D.

    2013-01-01

    System Design, Development, and Fabrication: Design, develop, and fabricate or procure MB-60 component hardware compliant with the imposed technical requirements and in sufficient quantities to fulfill the overall MB-60 development effort. System Development, Assembly, and Test: Manage the scope of the development, assembly, and test-related activities for MB-60 development, including engine-level development planning, engine assembly and disassembly, test planning, engine testing, inspection, anomaly resolution, and development of necessary ground support equipment and special test equipment. System Integration: Provide coordinated integration of the engineering, safety, quality, and manufacturing disciplines across the scope of the MB-60 design and associated product development, including Safety and Mission Assurance, structural design, fracture control, materials and processes, and thermal analysis. Systems Engineering and Analysis: Manage and perform systems engineering and analysis to provide rigor and structure to the overall MB-60 design and development effort, covering milestone reviews, requirements management, system analysis, and program management support. Program Management: Manage, plan, and coordinate the activities across all portions of the MB-60 work scope by providing direction for program administration, business management, and supplier management.

  10. Learning from Science and Sport - How we, Safety, "Engage with Rigor"

    NASA Astrophysics Data System (ADS)

    Herd, A.

    2012-01-01

    As the world of spaceflight safety is relatively small and potentially inward-looking, we need to be aware of the "outside world". We should remind ourselves to be open to the possibility that data, knowledge or experience from outside the spaceflight community may provide constructive alternate perspectives. This paper assesses aspects of two seemingly tangential fields, science and sport, and aligns them with the world of safety. In doing so it offers useful insights into the challenges we face and may provide solutions relevant to our everyday work of safety engineering. Sport, particularly a contact sport such as rugby union, requires direct interaction between members of two opposing teams: professional, accurately timed and positioned interaction for a desired outcome. These interactions, whilst an essential part of the game, are not without their constraints. The rugby scrum has constraints on the formation and engagement of the two teams; the controlled engagement provides for interaction between them in a safe manner. The constraints arise from the reality that an incorrect engagement could cause serious injury to members of either team. In academia, scientific rigor is applied to assure that the arguments provided and the conclusions drawn in papers presented for publication are valid, legitimate and credible. The need for rigor may be illustrated by the goal of achieving a statistically relevant sample size, n, in order to assure the validity of analysis of the data pool; a failure to apply rigor could place an entire study at risk of failing to have its paper published. This paper considers the merits of these two different aspects, scientific rigor and sports engagement, and offers a reflective look at how they may provide a "modus operandi" for safety engineers at any level, whether at their desks (creating or reviewing safety assessments) or in a safety review meeting (providing a verbal critique of the presented safety case).

  11. Improving student achievement through daily activities and assessments in Introduction to Physics

    NASA Astrophysics Data System (ADS)

    Coppins, Kelly Ann

    The combination of a hands-on approach to science with the accountability of daily assessments provides a greater opportunity for students who traditionally receive below-average grades to be successful in science classes. The addition of competitive elements and real world applications plays to their strengths as kinesthetic learners without sacrificing the rigor required to meet graduation standards. Further, daily assessment allows students to develop test-taking skills they will need for the standardized tests used by the state and for college admission. Finally, the combination of daily feedback and daily accountability prevents a struggling student from slipping through the cracks.

  12. Peer Review of EPA's Draft BMDS Document: Exponential ...

    EPA Pesticide Factsheets

    BMDS is one of the Agency's premier tools for risk assessment; therefore the validity and reliability of its statistical models are of paramount importance. This page provides links to peer reviews of BMDS and its models as they were developed and eventually released, documenting the rigorous review process taken to provide the best science tools available for statistical modeling.

  13. Building an Evidence Base to Inform Interventions for Pregnant and Parenting Adolescents: A Call for Rigorous Evaluation

    PubMed Central

    Burrus, Barri B.; Scott, Alicia Richmond

    2012-01-01

    Adolescent parents and their children are at increased risk for adverse short- and long-term health and social outcomes. Effective interventions are needed to support these young families. We studied the evidence base and found a dearth of rigorously evaluated programs. Strategies from successful interventions are needed to inform both intervention design and policies affecting these adolescents. The lack of rigorous evaluations may be attributable to inadequate emphasis on and sufficient funding for evaluation, as well as to challenges encountered by program evaluators working with this population. More rigorous program evaluations are urgently needed to provide scientifically sound guidance for programming and policy decisions. Evaluation lessons learned have implications for other vulnerable populations. PMID:22897541

  14. Sputum fungal smear

    MedlinePlus

    ... in the test sample. Some labs use different measurements or test different samples. Talk to your doctor ... A.D.A.M. follows rigorous standards of quality and accountability. A.D.A.M. is among ...

  15. High-Resolution Imaged-Based 3D Reconstruction Combined with X-Ray CT Data Enables Comprehensive Non-Destructive Documentation and Targeted Research of Astromaterials

    NASA Technical Reports Server (NTRS)

    Blumenfeld, E. H.; Evans, C. A.; Oshel, E. R.; Liddle, D. A.; Beaulieu, K.; Zeigler, R. A.; Righter, K.; Hanna, R. D.; Ketcham, R. A.

    2014-01-01

    Providing web-based data of complex and sensitive astromaterials (including meteorites and lunar samples) in novel formats enhances existing preliminary examination data on these samples and supports targeted sample requests and analyses. We have developed and tested a rigorous protocol for collecting highly detailed imagery of meteorites and complex lunar samples in non-contaminating environments. These data are reduced to create interactive 3D models of the samples. We intend to provide these data as they are acquired on NASA's Astromaterials Acquisition and Curation website at http://curator.jsc.nasa.gov/.

  16. Incorporating Pharmacogenomics into Health Information Technology, Electronic Health Record and Decision Support System: An Overview.

    PubMed

    Alanazi, Abdullah

    2017-02-01

    As the adoption of information technology in healthcare rises, the potential for moving Pharmacogenomics from benchside to bedside grows. This paper reviews the current status of Pharmacogenomics (PGx) information and the attempts to incorporate it into the Electronic Health Record (EHR) system through Decision Support Systems (DSSs). Rigorous review of PGx information, and provision of context-relevant recommendations in the form of an action plan (dose adjustment, laboratory tests) rather than information alone, would be ideal for making clinical recommendations from PGx information. Lastly, realistic projection of what pharmacogenomics can provide is another important aspect of incorporating Pharmacogenomics into health information technology.

  17. Climate Change: Providing Equitable Access to a Rigorous and Engaging Curriculum

    ERIC Educational Resources Information Center

    Cardichon, Jessica; Roc, Martens

    2013-01-01

    This report examines how implementing rigorous and engaging curriculum aligned with college- and career-ready standards fosters positive school climates in which students are motivated to succeed, achievement gaps narrow, and learning and outcomes improve. It includes federal, state, and local recommendations for increasing access to high-quality,…

  18. Challenges in Building Usable Knowledge in Education

    ERIC Educational Resources Information Center

    Hedges, Larry V.

    2018-01-01

    The scientific rigor of education research has improved dramatically since the year 2000. Much of the credit for this improvement is deserved by Institute of Education Sciences (IES) policies that helped create a demand for rigorous research; increased human capital capacity to carry out such work; provided funding for the work itself; and…

  19. Strategies Leaders Can Use to Improve Rigor in Their Schools

    ERIC Educational Resources Information Center

    Williamson, Ronald; Blackburn, Barbara R.

    2009-01-01

    Concern about rigor is not new. Since the release of "A Nation At Risk" (National Commission on Excellence in Education, 1983) the debate about the quality of America's schools has grown exponentially. This debate calls for dramatically different schools, schools that are much more responsive to student need, and provide a rigorous…

  20. Harnessing Implementation Science to Increase the Impact of Health Equity Research.

    PubMed

    Chinman, Matthew; Woodward, Eva N; Curran, Geoffrey M; Hausmann, Leslie R M

    2017-09-01

    Health disparities are differences in health or health care between groups based on social, economic, and/or environmental disadvantage. Disparity research often follows three steps: detecting (phase 1), understanding (phase 2), and reducing (phase 3) disparities. Although disparities have narrowed over time, many remain. We argue that implementation science could enhance disparities research by broadening the scope of phase 2 studies and offering rigorous methods to test disparity-reducing implementation strategies in phase 3 studies. We briefly review the focus of phase 2 and phase 3 disparities research. We then provide a decision tree and case examples to illustrate how implementation science frameworks and research designs could further enhance disparity research. Most health disparities research emphasizes patient and provider factors as predominant mechanisms underlying disparities. Applying implementation science frameworks like the Consolidated Framework for Implementation Research could help disparities research widen its scope in phase 2 studies and, in turn, develop broader disparities-reducing implementation strategies in phase 3 studies. Many phase 3 studies of disparity-reducing implementation strategies are similar to case studies, whose designs are not able to fully test causality. Implementation science research designs offer rigorous methods that could accelerate the pace at which equity is achieved in real-world practice. Disparities can be considered a "special case" of implementation challenges: when evidence-based clinical interventions are delivered to, and received by, vulnerable populations at lower rates. Bringing together health disparities research and implementation science could advance equity more than either could achieve on their own.

  1. Genetic and environmental effects on the muscle structure response post-mortem.

    PubMed

    Thompson, J M; Perry, D; Daly, B; Gardner, G E; Johnston, D J; Pethick, D W

    2006-09-01

    This paper reviewed the mechanisms by which glycolytic rate and pre-rigor stretching of muscle affect meat quality. If muscle is free to shorten during the rigor process, extremes in glycolytic rate can negatively affect meat quality by inducing either cold or rigor shortening. Factors that contribute to variation in glycolytic rate include the glycogen concentration at slaughter and the fibre type of the muscle. Glycolysis is highly sensitive to temperature, which is an important factor in heavy grain-fed carcasses. An alternative solution to controlling glycolysis is to stretch the muscle pre-rigor so that it cannot shorten, thus providing insurance against extremes in processing conditions. Results are presented which show a large reduction in variance (both additive and phenotypic) in tenderness caused by pre-rigor stretching. Whilst this did not affect the heritability of shear force, it did reduce genotype differences. The implications of these results for the magnitude of genotype effects on tenderness are discussed.

  2. International Seed Testing Association List of stabilized plant names, edition 6

    USDA-ARS?s Scientific Manuscript database

    Seed-testing laboratories determine the quality of seed lots in national and international seed commerce. Those services most commonly requested include purity analysis, noxious-weed seed detection, and viability tests. Rigorous procedures for performing various tests on specific crops have been est...

  3. Process tracing in political science: What's the story?

    PubMed

    Crasnow, Sharon

    2017-04-01

    Methodologists in political science have advocated for causal process tracing as a way of providing evidence for causal mechanisms. Recent analyses of the method have sought to provide more rigorous accounts of how it provides such evidence. These accounts have focused on the role of process tracing for causal inference and specifically on the way it can be used with case studies for testing hypotheses. While the analyses do provide an account of such testing, they pay little attention to the narrative elements of case studies. I argue that the role of narrative in case studies is not merely incidental. Narrative does cognitive work by both facilitating the consideration of alternative hypotheses and clarifying the relationship between evidence and explanation. I consider the use of process tracing in a particular case (the Fashoda Incident) in order to illustrate the role of narrative. I argue that process tracing contributes to knowledge production in ways that the current focus on inference tends to obscure. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Disseminated tuberculosis

    MedlinePlus

    ... lesions Interferon-gamma release blood test, such as the QFT-Gold test to test for prior exposure to TB ... verify that A.D.A.M. follows rigorous standards of quality and accountability. A.D.A.M. is among the first to achieve this important distinction for online ...

  5. How to Help Students Conceptualize the Rigorous Definition of the Limit of a Sequence

    ERIC Educational Resources Information Center

    Roh, Kyeong Hah

    2010-01-01

    This article suggests an activity, called the epsilon-strip activity, as an instructional method for conceptualization of the rigorous definition of the limit of a sequence via visualization. The article also describes the learning objectives of each instructional step of the activity, and then provides detailed instructional methods to guide…

  6. Rigor of cell fate decision by variable p53 pulses and roles of cooperative gene expression by p53

    PubMed Central

    Murakami, Yohei; Takada, Shoji

    2012-01-01

    Upon DNA damage, the cell fate decision between survival and apoptosis is largely regulated by p53-related networks. Recent experiments found a series of discrete p53 pulses in individual cells, which led to the hypothesis that the cell fate decision upon DNA damage is controlled by counting the number of p53 pulses. Under this hypothesis, Sun et al. (2009) modeled the Bax activation switch in the apoptosis signal transduction pathway that can rigorously “count” the number of uniform p53 pulses. Based on experimental evidence, here we use variable p53 pulses with Sun et al.’s model to investigate how the variability in p53 pulses affects the rigor of the cell fate decision by the pulse number. Our calculations showed that the experimentally anticipated variability in the pulse sizes reduces the rigor of the cell fate decision. In addition, we tested the roles of the cooperativity in PUMA expression by p53, finding that lower cooperativity is plausible for more rigorous cell fate decision. This is because the variability in the p53 pulse height is more amplified in PUMA expressions with more cooperative cases. PMID:27857606
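    The central effect described above, pulse-size variability degrading a count-based fate decision, can be illustrated with a deliberately minimal toy model. This is not Sun et al.'s Bax activation switch; the accumulator, threshold, and all parameter values are invented for illustration only.

    ```python
    import random

    # Toy illustration (NOT Sun et al.'s Bax activation model): each p53 pulse
    # adds its height to an accumulator, and "apoptosis" is triggered once the
    # accumulated signal crosses a fixed threshold. All parameter values are
    # invented for illustration.

    def fraction_flipped(n_pulses, height_cv, threshold, trials=10000, seed=1):
        """Fraction of simulated cells whose fate differs from the
        uniform-pulse outcome (mean pulse height = 1.0).

        height_cv -- coefficient of variation of the pulse height (0 = uniform)
        """
        rng = random.Random(seed)
        uniform_outcome = n_pulses * 1.0 >= threshold
        flipped = 0
        for _ in range(trials):
            total = sum(max(0.0, rng.gauss(1.0, height_cv))
                        for _ in range(n_pulses))
            if (total >= threshold) != uniform_outcome:
                flipped += 1
        return flipped / trials

    # Uniform pulses give a perfectly rigorous count-based decision;
    # variable pulse heights flip a fraction of cell fates.
    f_uniform = fraction_flipped(4, 0.0, 3.5)
    f_variable = fraction_flipped(4, 0.3, 3.5)
    print(f_uniform, f_variable > f_uniform)  # → 0.0 True
    ```

    The sketch captures only the counting logic: the closer the pulse count sits to the decision threshold, the more pulse-height variability blurs the outcome, which is the qualitative effect the abstract reports.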

  7. Well-Tempered Metadynamics: A Smoothly Converging and Tunable Free-Energy Method

    NASA Astrophysics Data System (ADS)

    Barducci, Alessandro; Bussi, Giovanni; Parrinello, Michele

    2008-01-01

    We present a method for determining the free-energy dependence on a selected number of collective variables using an adaptive bias. The formalism provides a unified description which has metadynamics and canonical sampling as limiting cases. Convergence and errors can be rigorously and easily controlled. The parameters of the simulation can be tuned so as to focus the computational effort only on the physically relevant regions of the order parameter space. The algorithm is tested on the reconstruction of an alanine dipeptide free-energy landscape.

  8. Well-tempered metadynamics: a smoothly converging and tunable free-energy method.

    PubMed

    Barducci, Alessandro; Bussi, Giovanni; Parrinello, Michele

    2008-01-18

    We present a method for determining the free-energy dependence on a selected number of collective variables using an adaptive bias. The formalism provides a unified description which has metadynamics and canonical sampling as limiting cases. Convergence and errors can be rigorously and easily controlled. The parameters of the simulation can be tuned so as to focus the computational effort only on the physically relevant regions of the order parameter space. The algorithm is tested on the reconstruction of an alanine dipeptide free-energy landscape.
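    The hallmark of the well-tempered scheme described in the two records above is the hill-height rule: each deposited Gaussian is scaled by exp(-V_bias(s)/ΔT), so hills shrink smoothly as the bias accumulates. Below is a minimal 1D sketch of that rule; the double-well potential, overdamped Langevin integrator, and all parameter values are illustrative choices, not the paper's setup.

    ```python
    import math
    import random

    # Minimal 1D sketch of the well-tempered hill-height rule: each deposited
    # Gaussian is scaled by exp(-V_bias(s) / dT). The potential, integrator,
    # and parameters are illustrative, not the paper's.

    def run_wtmetad(n_steps=10000, dt=0.005, kT=1.0, delta_T=5.0,
                    w0=0.5, sigma=0.2, stride=50, seed=0):
        rng = random.Random(seed)
        centers, heights = [], []

        def bias(s):  # sum of deposited Gaussians
            return sum(h * math.exp(-(s - c) ** 2 / (2 * sigma ** 2))
                       for c, h in zip(centers, heights))

        def total_force(s):
            # -dU/ds for U(s) = 4(s^2 - 1)^2, plus the force from the bias
            f = -16.0 * s * (s * s - 1.0)
            f += sum(h * (s - c) / sigma ** 2
                     * math.exp(-(s - c) ** 2 / (2 * sigma ** 2))
                     for c, h in zip(centers, heights))
            return f

        s = -1.0  # start in the left well
        for step in range(n_steps):
            if step % stride == 0:
                # well-tempered scaling of the new hill's height
                heights.append(w0 * math.exp(-bias(s) / delta_T))
                centers.append(s)
            # overdamped Langevin step
            s += total_force(s) * dt + math.sqrt(2 * kT * dt) * rng.gauss(0, 1)
        return centers, heights

    centers, heights = run_wtmetad()
    # The first hill has the full height w0; later hills are strictly smaller
    # because the deposited bias is positive everywhere.
    print(len(heights), heights[0], heights[-1] < heights[0])  # → 200 0.5 True
    ```

    The decaying heights are what give the method its smooth convergence: in the limit, the bias tends to a fixed fraction of the free energy set by ΔT rather than growing without bound as in ordinary metadynamics.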

  9. An operations-partnered evaluation of care redesign for high-risk patients in the Veterans Health Administration (VHA): Study protocol for the PACT Intensive Management (PIM) randomized quality improvement evaluation.

    PubMed

    Chang, Evelyn T; Zulman, Donna M; Asch, Steven M; Stockdale, Susan E; Yoon, Jean; Ong, Michael K; Lee, Martin; Simon, Alissa; Atkins, David; Schectman, Gordon; Kirsh, Susan R; Rubenstein, Lisa V

    2018-06-01

    Patient-centered medical homes have made great strides providing comprehensive care for patients with chronic conditions, but may not provide sufficient support for patients at highest risk for acute care use. To address this, the Veterans Health Administration (VHA) initiated a five-site demonstration project to evaluate the effectiveness of augmenting the VA's Patient Aligned Care Team (PACT) medical home with PACT Intensive Management (PIM) teams for Veterans at highest risk for hospitalization. Researchers partnered with VHA leadership to design a mixed-methods prospective multi-site evaluation that met leadership's desire for a rigorous evaluation conducted as quality improvement rather than research. We conducted a randomized QI evaluation and assigned high-risk patients to participate in PIM and compared them with high-risk Veterans receiving usual care through PACT. The summative evaluation examines whether PIM: 1) decreases VHA emergency department and hospital use; 2) increases satisfaction with VHA care; 3) decreases provider burnout; and 4) generates positive returns on investment. The formative evaluation aims to support improved care for high-risk patients at demonstration sites and to inform future initiatives for high-risk patients. The evaluation was reviewed by representatives from the VHA Office of Research and Development and the Office of Research Oversight and met criteria for quality improvement. VHA aims to function as a learning organization by rapidly implementing and rigorously testing QI innovations prior to final program or policy development. We observed challenges and opportunities in designing an evaluation consistent with QI standards and operations priorities, while also maintaining scientific rigor. This trial was retrospectively registered at ClinicalTrials.gov on April 3, 2017: NCT03100526. Protocol v1, FY14-17. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. Injection-salting of pre rigor fillets of Atlantic salmon (Salmo salar).

    PubMed

    Birkeland, Sveinung; Akse, Leif; Joensen, Sjurdur; Tobiassen, Torbjørn; Skåra, Torstein

    2007-01-01

    The effects of temperature (-1, 4, and 10 degrees C), brine concentration (12% and 25% NaCl), injection volumes, and needle densities were investigated on fillet weight gain (%), salt content (%), fillet contraction (%), and muscle gaping in pre rigor brine-injected fillets of Atlantic salmon (Salmo salar). Increased brine concentration (12% to 25%) significantly increased the initial (< 5 min after injection) and final contraction (24 h after injection) of pre rigor fillets. Increased brine concentration significantly reduced weight gain and increased salt content but had no significant effect on muscle gaping. The temperatures tested did not significantly affect weight gain, fillet contraction, or gaping score. Significant regressions (P < 0.01) between the injection volume and weight gain (range: 2.5% to 15.5%) and salt content (range: 1.7% to 6.5%) were observed for injections of pre rigor fillets. Double injections significantly increased the weight gain and salt content compared to single injections. Initial fillet contraction measured 30 min after brine injection increased significantly (P < 0.01) with increasing brine injection volume but no significant difference in the fillet contraction was observed 12 h after brine injection (range: 7.9% to 8.9%). Brine-injected post rigor control fillets obtained higher weight gain, higher salt content, more muscle gaping, and significantly lower fillet contraction compared to the pre rigor injected fillets. Injection-salting is an applicable technology as a means to obtain satisfactory salt contents and homogenously distribute the salt into the muscle of pre rigor fillets of Atlantic salmon before further processing steps such as drying and smoking.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buchholz, Stuart A.

    This memorandum documents laboratory thermomechanical triaxial strength testing of Waste Isolation Pilot Plant (WIPP) clean salt. The limited study completed independent, adjunct laboratory tests in the United States to assist in validating similar testing results being provided by the German facilities. The testing protocol consisted of completing confined triaxial, constant strain rate strength tests of intact WIPP clean salt at temperatures of 25°C and 100°C and at multiple confining pressures. The stratigraphy at WIPP also includes salt that has been labeled “argillaceous.” The much larger test matrix conducted in Germany included both the so-called clean and argillaceous salts. When combined, the total database of laboratory results will be used to develop input parameters for models, assess adequacy of existing models, and predict material behavior. These laboratory studies are also consistent with the goals of the international salt repository research program. The goal of this study was to complete a subset of a test matrix on clean salt from the WIPP undertaken by German research groups. The work was performed at RESPEC in Rapid City, South Dakota. A rigorous Quality Assurance protocol was applied, such that corroboration provides the potential of qualifying all of the test data gathered by German research groups.

  12. Instrument Selection for Randomized Controlled Trials Why This and Not That?

    PubMed Central

    Records, Kathie; Keller, Colleen; Ainsworth, Barbara; Permana, Paska

    2011-01-01

    A fundamental linchpin for obtaining rigorous findings in quantitative research involves the selection of survey instruments. Psychometric recommendations are available for the processes for scale development and testing and guidance for selection of established scales. These processes are necessary to address the validity link between the phenomena under investigation, the empirical measures and, ultimately, the theoretical ties between these and the world views of the participants. Detailed information is most often provided about study design and protocols, but far less frequently is a detailed theoretical explanation provided for why specific instruments are chosen. Guidance to inform choices is often difficult to find when scales are needed for specific cultural, ethnic, or racial groups. This paper details the rationale underlying instrument selection for measurement of the major processes (intervention, mediator and moderator variables, outcome variables) in an ongoing study of postpartum Latinas, Madres para la Salud [Mothers for Health]. The rationale underpinning our choices includes a discussion of alternatives, when appropriate. These exemplars may provide direction for other intervention researchers who are working with specific cultural, racial, or ethnic groups or for other investigators who are seeking to select the ‘best’ instrument. Thoughtful consideration of measurement and articulation of the rationale underlying our choices facilitates the maintenance of rigor within the study design and improves our ability to assess study outcomes. PMID:21986392

  13. Getting Closer to Countdown: Spacecraft Undergoes Readiness Tests

    NASA Image and Video Library

    2005-07-19

    It is no easy task getting NASA's Mars Reconnaissance Orbiter ready for launch. Workers stabilize the crane holding one of the enormous billboard-sized solar panels temporarily removed from the spacecraft prior to rigorous testing.

  14. Does Test Preparation Mean Low-Quality Instruction?

    ERIC Educational Resources Information Center

    Blazar, David; Pollard, Cynthia

    2017-01-01

    Critics of test-based accountability warn that test preparation has a negative influence on teachers' instruction due to a focus on procedural skills. Others advocate that the adoption of more rigorous assessments may be a way to incentivize more ambitious test preparation instruction. Drawing on classroom observations and teacher surveys, we do…

  15. Boeing CST-100 Heat Shield Testing

    NASA Image and Video Library

    2017-05-31

    A heat shield is used during separation test activities with Boeing's Starliner structural test article. The test article is undergoing rigorous qualification testing at the company's Huntington Beach Facility in California. Boeing’s CST-100 Starliner will launch on the Atlas V rocket to the International Space Station as part of NASA’s Commercial Crew Program.

  16. A Status Report on the Parachute Development for NASA's Next Manned Spacecraft

    NASA Technical Reports Server (NTRS)

    Sinclair, Robert

    2008-01-01

    NASA has determined that the parachute portion of the Landing System for the Crew Exploration Vehicle (CEV) will be Government Furnished Equipment (GFE). The Earth Landing System has been designated the CEV Parachute Assembly System (CPAS). Thus a program team was formed consisting of NASA Johnson Space Center (JSC) and Jacobs Engineering through their Engineering and Science Contract Group (ESCG). Following a rigorous competitive phase, Airborne Systems North America was selected to fill the parachute design, testing and manufacturing role in support of this team. The development program has begun with early flight testing of a Generation 1 parachute system. Future testing will continue to refine the design and complete a qualification phase prior to manned flight of the spacecraft. The program team will also support early spacecraft system testing, including a Pad Abort Flight Test in the fall of 2008.

  17. Waveform generation in the EETS

    NASA Astrophysics Data System (ADS)

    Wilshire, J. P.

    1985-05-01

Design decisions and analysis for the waveform generation portion of an electrical equipment test set are discussed. This test set is unlike conventional ATE in that it is portable and designed to operate at forward area sites for the USMC. It is also unique in that it provides functional testing for 32 electronic units from the AV-8B Harrier II aircraft. Specific requirements for the waveform generator are discussed, including a wide frequency range, high resolution and accuracy, and low total harmonic distortion. Several approaches to meeting these requirements are considered, and a specific concept is presented in detail: a digitally produced waveform that feeds a deglitched analog conversion circuit. Rigorous mathematical analysis is presented to prove that this concept meets the requirements. Finally, design alternatives and enhancements are considered.

  18. On the characterization of the heterogeneous mechanical response of human brain tissue.

    PubMed

    Forte, Antonio E; Gentleman, Stephen M; Dini, Daniele

    2017-06-01

    The mechanical characterization of brain tissue is a complex task that scientists have tried to accomplish for over 50 years. The results in the literature often differ by orders of magnitude because of the lack of a standard testing protocol. Different testing conditions (including humidity, temperature, strain rate), the methodology adopted, and the variety of the species analysed are all potential sources of discrepancies in the measurements. In this work, we present a rigorous experimental investigation on the mechanical properties of human brain, covering both grey and white matter. The influence of testing conditions is also shown and thoroughly discussed. The material characterization performed is finally adopted to provide inputs to a mathematical formulation suitable for numerical simulations of brain deformation during surgical procedures.

  19. Rigor and Relevance Redux: Director's Biennial Report to Congress. IES 2009-6010

    ERIC Educational Resources Information Center

    Whitehurst, Grover J.

    2008-01-01

    The mission of the Institute of Education Sciences (IES) is to provide rigorous evidence on which to ground education practice and policy and to encourage its use. The Education Sciences Reform Act of 2002 (ESRA) requires that the Director of IES, on a biennial basis, transmit to the President, the National Board for Education Sciences, and the…

  20. Using the bending beam rheometer for low temperature testing of asphalt mixtures : final report.

    DOT National Transportation Integrated Search

    2016-07-01

This work showed that the bending beam rheometer is a viable test to determine the low temperature performance of asphalt mixtures; it balances the rigor required of any mechanical test and the relation to field performance with the practicality ...

  1. Student peer assessment in evidence-based medicine (EBM) searching skills training: an experiment

    PubMed Central

    Eldredge, Jonathan D.; Bear, David G.; Wayne, Sharon J.; Perea, Paul P.

    2013-01-01

Background: Student peer assessment (SPA) has been used intermittently in medical education for more than four decades, particularly in connection with skills training. SPA generally has not been rigorously tested, so medical educators have limited evidence about SPA effectiveness. Methods: Experimental design: Seventy-one first-year medical students were stratified by previous test scores into problem-based learning tutorial groups, and these groups were then randomized further into intervention and control groups. All students received evidence-based medicine (EBM) training. Only the intervention group members received SPA training, practice with assessment rubrics, and then application of anonymous SPA to assignments submitted by other members of the intervention group. Results: Students in the intervention group had higher mean scores on the formative test (potential maximum: 49 points) than did students in the control group, 45.7 versus 43.5, respectively (P = 0.06). Conclusions: SPA training and the application of these skills by the intervention group resulted in higher scores on formative tests compared to those in the control group, a difference approaching statistical significance. The extra effort expended by librarians, other personnel, and medical students must be factored into the decision to use SPA in any specific educational context. Implications: SPA has not been rigorously tested, particularly in medical education. Future, similarly rigorous studies could further validate use of SPA so that librarians can optimally make use of limited contact time for information skills training in medical school curricula. PMID:24163593

  2. Experiment for validation of fluid-structure interaction models and algorithms.

    PubMed

    Hessenthaler, A; Gaddum, N R; Holub, O; Sinkus, R; Röhrle, O; Nordsletten, D

    2017-09-01

    In this paper a fluid-structure interaction (FSI) experiment is presented. The aim of this experiment is to provide a challenging yet easy-to-setup FSI test case that addresses the need for rigorous testing of FSI algorithms and modeling frameworks. Steady-state and periodic steady-state test cases with constant and periodic inflow were established. Focus of the experiment is on biomedical engineering applications with flow being in the laminar regime with Reynolds numbers 1283 and 651. Flow and solid domains were defined using computer-aided design (CAD) tools. The experimental design aimed at providing a straightforward boundary condition definition. Material parameters and mechanical response of a moderately viscous Newtonian fluid and a nonlinear incompressible solid were experimentally determined. A comprehensive data set was acquired by using magnetic resonance imaging to record the interaction between the fluid and the solid, quantifying flow and solid motion. Copyright © 2016 The Authors. International Journal for Numerical Methods in Biomedical Engineering published by John Wiley & Sons Ltd.

  3. The case for treatment fidelity in active music interventions: why and how.

    PubMed

    Wiens, Natalie; Gordon, Reyna L

    2018-05-04

    As the volume of studies testing the benefits of active music-making interventions increases exponentially, it is important to document what exactly is happening during music treatment sessions in order to provide evidence for the mechanisms through which music training affects other domains. Thus, to complement systematic and rigorous attention to outcomes of the treatment, we outline four vital components of treatment fidelity and discuss their implementation in nonmusic- and music-based interventions. We then describe the design of Music Impacting Language Expertise (MILEStone), a new intervention that aims to improve grammar skills in children with specific language impairment by increasing sensitivity to rhythmic structure, which may enhance general temporal processing and sensitivity to syntactic structure. We describe the approach to addressing treatment fidelity in MILEStone adapted from intervention research from other fields, including a behavioral coding system to track instructional episodes and child participation, a treatment manual, activity checklists, provider training and monitoring, a home practice log, and teacher ratings of participant engagement. This approach takes an important first step in modeling a formalized procedure for assessing treatment fidelity in active music-making intervention research, as a means of increasing methodological rigor in support of evidence-based practice in clinical and educational settings. © 2018 New York Academy of Sciences.

  4. Retaking the Test

    ERIC Educational Resources Information Center

    Backer, David Isaac; Lewis, Tyson Edward

    2015-01-01

    "Data-driven" teaching and learning is common sense in education today, and it is common sense that these data should come from standardized tests. Critiques of standardization either make no constructive suggestions for what to use in place of the tests or they call for better, more scientifically rigorous, reliable, and…

  5. Harnessing Implementation Science to Increase the Impact of Health Disparity Research

    PubMed Central

    Chinman, Matthew; Woodward, Eva N.; Curran, Geoffrey M.; Hausmann, Leslie R. M.

    2017-01-01

    Background Health disparities are differences in health or health care between groups based on social, economic, and/or environmental disadvantage. Disparity research often follows three steps: detecting (Phase 1), understanding (Phase 2), and reducing (Phase 3), disparities. While disparities have narrowed over time, many remain. Objectives We argue that implementation science could enhance disparities research by broadening the scope of Phase 2 studies and offering rigorous methods to test disparity-reducing implementation strategies in Phase 3 studies. Methods We briefly review the focus of Phase 2 and Phase 3 disparities research. We then provide a decision tree and case examples to illustrate how implementation science frameworks and research designs could further enhance disparity research. Results Most health disparities research emphasizes patient and provider factors as predominant mechanisms underlying disparities. Applying implementation science frameworks like the Consolidated Framework for Implementation Research could help disparities research widen its scope in Phase 2 studies and, in turn, develop broader disparities-reducing implementation strategies in Phase 3 studies. Many Phase 3 studies of disparity reducing implementation strategies are similar to case studies, whose designs are not able to fully test causality. Implementation science research designs offer rigorous methods that could accelerate the pace at which equity is achieved in real world practice. Conclusions Disparities can be considered a “special case” of implementation challenges—when evidence-based clinical interventions are delivered to, and received by, vulnerable populations at lower rates. Bringing together health disparities research and implementation science could advance equity more than either could achieve on their own. PMID:28806362

  6. Systematic review of mobile health behavioural interventions to improve uptake of HIV testing for vulnerable and key populations.

    PubMed

    Conserve, Donaldson F; Jennings, Larissa; Aguiar, Carolina; Shin, Grace; Handler, Lara; Maman, Suzanne

    2017-02-01

    Introduction This systematic narrative review examined the empirical evidence on the effectiveness of mobile health (mHealth) behavioural interventions designed to increase the uptake of HIV testing among vulnerable and key populations. Methods MEDLINE/PubMed, Embase, Web of Science, and Global Health electronic databases were searched. Studies were eligible for inclusion if they were published between 2005 and 2015, evaluated an mHealth intervention, and reported an outcome relating to HIV testing. We also reviewed the bibliographies of retrieved studies for other relevant citations. The methodological rigor of selected articles was assessed, and narrative analyses were used to synthesize findings from mixed methodologies. Results A total of seven articles met the inclusion criteria. Most mHealth interventions employed a text-messaging feature and were conducted in middle- and high-income countries. The methodological rigor was moderate among studies. The current literature suggests that mHealth interventions can have significant positive effects on HIV testing initiation among vulnerable and key populations, as well as the general public. In some cases, null results were observed. Qualitative themes relating to the use of mobile technologies to increase HIV testing included the benefits of having low-cost, confidential, and motivational communication. Reported barriers included cellular network restrictions, poor linkages with physical testing services, and limited knowledge of appropriate text-messaging dose. Discussion MHealth interventions may prove beneficial in reducing the proportion of undiagnosed persons living with HIV, particularly among vulnerable and key populations. However, more rigorous and tailored interventions are needed to assess the effectiveness of widespread use.

  7. Systematic review of mobile-health behavioral interventions to improve uptake of HIV testing for vulnerable and key populations

    PubMed Central

    Conserve, Donaldson F.; Jennings, Larissa; Aguiar, Carolina; Shin, Grace; Handler, Lara; Maman, Suzanne

    2016-01-01

    Objective This systematic narrative review examined the empirical evidence on the effectiveness of mobile health (mHealth) behavioral interventions designed to increase uptake of HIV testing among vulnerable and key populations. Methods MEDLINE/PubMed, Embase, Web of Science, and Global Health electronic databases were searched. Studies were eligible for inclusion if they were published between 2005 and 2015, evaluated an mHealth intervention, and reported an outcome relating to HIV testing. We also reviewed the bibliographies of retrieved studies for other relevant citations. The methodological rigor of selected articles was assessed, and narrative analyses were used to synthesize findings from mixed methodologies. Results A total of seven articles met the inclusion criteria. Most mHealth interventions employed a text-messaging feature and were conducted in middle- and high-income countries. The methodological rigor was moderate among studies. The current literature suggests that mHealth interventions can have significant positive effects on HIV testing initiation among vulnerable and key populations, as well as the general public. In some cases, null results were observed. Qualitative themes relating to use of mobile technologies to increase HIV testing included the benefits of having low-cost, confidential, and motivational communication. Reported barriers included cellular network restrictions, poor linkages with physical testing services, and limited knowledge of appropriate text-messaging dose. Conclusions MHealth interventions may prove beneficial in reducing the proportion of undiagnosed persons living with HIV, particularly among vulnerable and key populations. However, more rigorous and tailored intervention trials are needed to assess the effectiveness of widespread use. PMID:27056905

  8. New Tests Put States on Spot

    ERIC Educational Resources Information Center

    Ujifusa, Andrew

    2012-01-01

    As states begin to demand more rigor on their high-stakes tests--and the tests evolve to incorporate revised academic standards--many officials are gambling that an initial wave of lower scores will give way to greater student achievement in the future. Changes to statewide tests and subsequent plummeting scores sparked controversy and emergency…

  9. The Center For Medicare And Medicaid Innovation's blueprint for rapid-cycle evaluation of new care and payment models.

    PubMed

    Shrank, William

    2013-04-01

    The Affordable Care Act established the Center for Medicare and Medicaid Innovation to test innovative payment and service delivery models. The goal is to reduce program expenditures while preserving or improving the quality of care provided to beneficiaries of Medicare, Medicaid, and the Children's Health Insurance Program. Central to the success of the Innovation Center is a new, rapid-cycle approach to evaluation. This article describes that approach--setting forth how the Rapid Cycle Evaluation Group aims to deliver frequent feedback to providers in support of continuous quality improvement, while rigorously evaluating the outcomes of each model tested. This article also describes the relationship between the group's work and that of the Office of the Actuary at the Centers for Medicare and Medicaid Services, which plays a central role in the assessment of new models.

  10. Environmental risk assessments for transgenic crops producing output trait enzymes

    PubMed Central

    Tuttle, Ann; Shore, Scott; Stone, Terry

    2009-01-01

    The environmental risks from cultivating crops producing output trait enzymes can be rigorously assessed by testing conservative risk hypotheses of no harm to endpoints such as the abundance of wildlife, crop yield and the rate of degradation of crop residues in soil. These hypotheses can be tested with data from many sources, including evaluations of the agronomic performance and nutritional quality of the crop made during product development, and information from the scientific literature on the mode-of-action, taxonomic distribution and environmental fate of the enzyme. Few, if any, specific ecotoxicology or environmental fate studies are needed. The effective use of existing data means that regulatory decision-making, to which an environmental risk assessment provides essential information, is not unnecessarily complicated by evaluation of large amounts of new data that provide negligible improvement in the characterization of risk, and that may delay environmental benefits offered by transgenic crops containing output trait enzymes. PMID:19924556

  11. Space station dynamics, attitude control and momentum management

    NASA Technical Reports Server (NTRS)

    Sunkel, John W.; Singh, Ramen P.; Vengopal, Ravi

    1989-01-01

The Space Station Attitude Control System software test-bed provides a rigorous environment for the design, development and functional verification of GN&C algorithms and software. The approach taken for the simulation of the vehicle dynamics and environmental models using a computationally efficient algorithm is discussed. The simulation includes capabilities for docking/berthing dynamics, prescribed motion dynamics associated with the Mobile Remote Manipulator System (MRMS) and microgravity disturbances. The vehicle dynamics module interfaces with the test-bed through the central Communicator facility which is in turn driven by the Station Control Simulator (SCS) Executive. The Communicator addresses issues such as the interface between the discrete flight software and the continuous vehicle dynamics, and multi-programming aspects such as the complex flow of control in real-time programs. Combined with the flight software and redundancy management modules, the facility provides a flexible, user-oriented simulation platform.

  12. A Systematic Review of Strategies for Implementing Empirically Supported Mental Health Interventions

    PubMed Central

    Powell, Byron J.; Proctor, Enola K.; Glass, Joseph E.

    2013-01-01

    Objective This systematic review examines experimental studies that test the effectiveness of strategies intended to integrate empirically supported mental health interventions into routine care settings. Our goal was to characterize the state of the literature and to provide direction for future implementation studies. Methods A literature search was conducted using electronic databases and a manual search. Results Eleven studies were identified that tested implementation strategies with a randomized (n = 10) or controlled clinical trial design (n = 1). The wide range of clinical interventions, implementation strategies, and outcomes evaluated precluded meta-analysis. However, the majority of studies (n = 7; 64%) found a statistically significant effect in the hypothesized direction for at least one implementation or clinical outcome. Conclusions There is a clear need for more rigorous research on the effectiveness of implementation strategies, and we provide several suggestions that could improve this research area. PMID:24791131

  13. Conceptualizing Rigor and Its Implications for Education in the Era of the Common Core

    ERIC Educational Resources Information Center

    Paige, David D.; Smith, Grant S.; Sizemore, John M.

    2015-01-01

    The adoption of Common Core State Standards in the USA by 46 states and the District of Columbia has provided several new foci for K-12 instruction, not the least of which is the reading and understanding of complex text, a higher order thinking process. Closely associated with this is the notion of rigor, the focus of the present study. As…

  14. Rigor, vigor, and the study of health disparities

    PubMed Central

    Adler, Nancy; Bush, Nicole R.; Pantell, Matthew S.

    2012-01-01

    Health disparities research spans multiple fields and methods and documents strong links between social disadvantage and poor health. Associations between socioeconomic status (SES) and health are often taken as evidence for the causal impact of SES on health, but alternative explanations, including the impact of health on SES, are plausible. Studies showing the influence of parents’ SES on their children’s health provide evidence for a causal pathway from SES to health, but have limitations. Health disparities researchers face tradeoffs between “rigor” and “vigor” in designing studies that demonstrate how social disadvantage becomes biologically embedded and results in poorer health. Rigorous designs aim to maximize precision in the measurement of SES and health outcomes through methods that provide the greatest control over temporal ordering and causal direction. To achieve precision, many studies use a single SES predictor and single disease. However, doing so oversimplifies the multifaceted, entwined nature of social disadvantage and may overestimate the impact of that one variable and underestimate the true impact of social disadvantage on health. In addition, SES effects on overall health and functioning are likely to be greater than effects on any one disease. Vigorous designs aim to capture this complexity and maximize ecological validity through more complete assessment of social disadvantage and health status, but may provide less-compelling evidence of causality. Newer approaches to both measurement and analysis may enable enhanced vigor as well as rigor. Incorporating both rigor and vigor into studies will provide a fuller understanding of the causes of health disparities. PMID:23045672

  15. ZY3-02 Laser Altimeter Footprint Geolocation Prediction

    PubMed Central

    Xie, Junfeng; Tang, Xinming; Mo, Fan; Li, Guoyuan; Zhu, Guangbin; Wang, Zhenming; Fu, Xingke; Gao, Xiaoming; Dou, Xianhui

    2017-01-01

Successfully launched on 30 May 2016, ZY3-02 is the first Chinese surveying and mapping satellite equipped with a lightweight laser altimeter. Calibration is necessary before the laser altimeter becomes operational. Laser footprint location prediction is the first step in calibration based on ground infrared detectors, and it is difficult because the sample frequency of the ZY3-02 laser altimeter is 2 Hz and the distance between two adjacent laser footprints is about 3.5 km. In this paper, we build an on-orbit rigorous geometric prediction model referenced to the rigorous geometric model of optical remote sensing satellites. The model includes three kinds of data that must be predicted: pointing angle, orbit parameters, and attitude angles. The proposed method is verified by a ZY3-02 laser altimeter on-orbit geometric calibration test. Five laser footprint prediction experiments are conducted based on the model, and the laser footprint prediction accuracy is better than 150 m on the ground. The effectiveness and accuracy of the on-orbit rigorous geometric prediction model are confirmed by the test results. The geolocation is predicted precisely by the proposed method, which can serve as a reference for geolocation prediction of future land laser detectors in other laser altimeter calibration tests. PMID:28934160

  16. ZY3-02 Laser Altimeter Footprint Geolocation Prediction.

    PubMed

    Xie, Junfeng; Tang, Xinming; Mo, Fan; Li, Guoyuan; Zhu, Guangbin; Wang, Zhenming; Fu, Xingke; Gao, Xiaoming; Dou, Xianhui

    2017-09-21

Successfully launched on 30 May 2016, ZY3-02 is the first Chinese surveying and mapping satellite equipped with a lightweight laser altimeter. Calibration is necessary before the laser altimeter becomes operational. Laser footprint location prediction is the first step in calibration based on ground infrared detectors, and it is difficult because the sample frequency of the ZY3-02 laser altimeter is 2 Hz and the distance between two adjacent laser footprints is about 3.5 km. In this paper, we build an on-orbit rigorous geometric prediction model referenced to the rigorous geometric model of optical remote sensing satellites. The model includes three kinds of data that must be predicted: pointing angle, orbit parameters, and attitude angles. The proposed method is verified by a ZY3-02 laser altimeter on-orbit geometric calibration test. Five laser footprint prediction experiments are conducted based on the model, and the laser footprint prediction accuracy is better than 150 m on the ground. The effectiveness and accuracy of the on-orbit rigorous geometric prediction model are confirmed by the test results. The geolocation is predicted precisely by the proposed method, which can serve as a reference for geolocation prediction of future land laser detectors in other laser altimeter calibration tests.

  17. The great chemical residue detection debate: dog versus machine

    NASA Astrophysics Data System (ADS)

    Tripp, Alan C.; Walker, James C.

    2003-09-01

Many engineering groups desire to construct instrumentation to replace dog-handler teams in identifying and localizing chemical mixtures. This goal requires performance specifications for an "artificial dog-handler team". Progress toward generating such specifications from laboratory tests of dog-handler teams has been made recently at the Sensory Research Institute, and the method employed is amenable to the measurement of tasks representative of the decision-making that must go on when such teams solve problems in actual (and therefore informationally messy) situations. As progressively more quantitative data are obtained on progressively more complex odor tasks, the boundary conditions of dog-handler performance will be understood in great detail. From experiments leading to this knowledge, one can develop, as we do in this paper, a taxonomy of test conditions that contain various subsets of the variables encountered in "real world settings". These tests provide the foundation for rigorous evaluations that will better inform decisions about when biological sensing approaches (e.g., dog-handler teams) are best and when "artificial noses" are most valuable.

  18. Perspectives on Validation of High-Throughput Assays Supporting 21st Century Toxicity Testing

    EPA Science Inventory

    In vitro high-throughput screening (HTS) assays are seeing increasing use in toxicity testing. HTS assays can simultaneously test many chemicals but have seen limited use in the regulatory arena, in part because of the need to undergo rigorous, time-consuming formal validation. ...

  19. Research Says…/High-Stakes Testing Narrows the Curriculum

    ERIC Educational Resources Information Center

    David, Jane L.

    2011-01-01

    The current rationale for standards-based reform goes like this: If standards are demanding and tests accurately measure achievement of those standards, then curriculum and instruction will become richer and more rigorous. By attaching serious consequences to schools that fail to increase test scores, U.S. policymakers believe that educators will…

  20. Testing Intelligently Includes Double-Checking Wechsler IQ Scores

    ERIC Educational Resources Information Center

    Kuentzel, Jeffrey G.; Hetterscheidt, Lesley A.; Barnett, Douglas

    2011-01-01

    The rigors of standardized testing make for numerous opportunities for examiner error, including simple computational mistakes in scoring. Although experts recommend that test scoring be double-checked, the extent to which independent double-checking would reduce scoring errors is not known. A double-checking procedure was established at a…

  1. Trans-dimensional and hierarchical Bayesian approaches toward rigorous estimation of seismic sources and structures in the Northeast Asia

    NASA Astrophysics Data System (ADS)

    Kim, Seongryong; Tkalčić, Hrvoje; Mustać, Marija; Rhie, Junkee; Ford, Sean

    2016-04-01

A framework is presented within which we provide rigorous estimations for seismic sources and structures in Northeast Asia. We use Bayesian inversion methods, which enable statistical estimations of models and their uncertainties based on data information. Ambiguities in error statistics and model parameterizations are addressed by hierarchical and trans-dimensional (trans-D) techniques, which can be inherently implemented in the Bayesian inversions. Reliable estimation of model parameters and their uncertainties is therefore possible without arbitrary regularizations and parameterizations. Hierarchical and trans-D inversions are performed to develop a three-dimensional velocity model using ambient noise data. To further improve the model, we perform joint inversions with receiver function data using a newly developed Bayesian method. For the source estimation, a novel moment tensor inversion method is presented and applied to regional waveform data of the North Korean nuclear explosion tests. By the combination of new Bayesian techniques and the structural model, coupled with meaningful uncertainties related to each of the processes, more quantitative monitoring and discrimination of seismic events is possible.
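    The hierarchical idea described in this abstract, treating the error statistics as unknowns to be sampled rather than fixed in advance, can be illustrated with a minimal sketch: a Metropolis sampler that draws the noise level alongside the model parameter. The forward model, priors, and step sizes below are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np

    def hierarchical_mh(d, g, n_iter=20000, seed=0):
        """Minimal hierarchical Metropolis sampler: the noise level sigma
        is sampled alongside the model parameter m, so the data constrain
        the error statistics (illustrative sketch only)."""
        rng = np.random.default_rng(seed)
        m, log_sigma = 0.0, 0.0

        def log_post(m, log_sigma):
            sigma = np.exp(log_sigma)
            r = d - g(m)
            # Gaussian likelihood with unknown sigma; flat priors assumed.
            return -0.5 * np.sum(r**2) / sigma**2 - len(d) * log_sigma

        lp = log_post(m, log_sigma)
        samples = []
        for _ in range(n_iter):
            m2 = m + 0.1 * rng.standard_normal()
            ls2 = log_sigma + 0.1 * rng.standard_normal()
            lp2 = log_post(m2, ls2)
            if np.log(rng.random()) < lp2 - lp:   # Metropolis acceptance
                m, log_sigma, lp = m2, ls2, lp2
            samples.append((m, np.exp(log_sigma)))
        return np.array(samples)

    # Synthetic data: linear forward model g(m) = m * x, m_true = 2,
    # true noise level 0.1 (both unknown to the sampler).
    x = np.linspace(0.0, 1.0, 50)
    rng = np.random.default_rng(1)
    d = 2.0 * x + 0.1 * rng.standard_normal(50)
    s = hierarchical_mh(d, lambda m: m * x)
    m_est, sigma_est = s[5000:, 0].mean(), s[5000:, 1].mean()
    ```

    The posterior samples recover both the model parameter and the noise level; a trans-D extension would additionally propose changes to the number of parameters itself.
    
    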

  2. QTest: Quantitative Testing of Theories of Binary Choice.

    PubMed

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.

  3. Calibrating the Abaqus Crushable Foam Material Model using UNM Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schembri, Philip E.; Lewis, Matthew W.

Triaxial test data from the University of New Mexico and uniaxial test data from W-14 are used to calibrate the Abaqus crushable foam material model to represent the syntactic foam, comprised of an APO-BMI matrix and carbon microballoons, used in the W76. The material model is an elasto-plasticity model in which the yield strength depends on pressure. Both the elastic properties and the yield stress are estimated by fitting a line to the elastic region of each test response. The model parameters are fit to the data (in a non-rigorous way) to provide both a conservative and a non-conservative material model. The model is verified to perform as intended by comparing the values of pressure and shear stress at yield, as well as the shear and volumetric stress-strain response, to the test data.
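    The calibration step this record describes, estimating the elastic properties and yield stress by fitting a line to the elastic region of each test response, can be sketched as follows. The synthetic data, the 20% elastic-region fraction, and the departure tolerance are illustrative assumptions, not the report's actual procedure.

    ```python
    import numpy as np

    def calibrate_elastic_yield(strain, stress, elastic_frac=0.2):
        """Estimate elastic modulus and yield stress from a uniaxial
        stress-strain curve by fitting a line to the initial (assumed
        elastic) region. All names and tolerances are illustrative."""
        n = max(2, int(len(strain) * elastic_frac))
        # Fit a line to the elastic region; the slope is the modulus.
        modulus, intercept = np.polyfit(strain[:n], stress[:n], 1)
        # Take yield where the response departs from the fitted line
        # by more than a small tolerance.
        predicted = modulus * strain + intercept
        tol = 0.02 * stress.max()
        departed = np.nonzero(stress < predicted - tol)[0]
        yield_idx = departed[0] if departed.size else len(stress) - 1
        return modulus, stress[yield_idx]

    # Synthetic elastic-perfectly-plastic response: modulus 100, yield 5.0.
    strain = np.linspace(0.0, 0.2, 200)
    stress = np.minimum(100.0 * strain, 5.0)
    E, sy = calibrate_elastic_yield(strain, stress)
    ```

    On real triaxial data the same fit would be repeated per confinement pressure to populate the pressure-dependent yield surface of the crushable foam model.
    
    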

  4. Testability of evolutionary game dynamics based on experimental economics data

    NASA Astrophysics Data System (ADS)

    Wang, Yijia; Chen, Xiaojie; Wang, Zhijian

    2017-11-01

    Understanding the dynamic processes of a real game system requires an appropriate dynamics model, and rigorously testing a dynamics model is nontrivial. In our methodological research, we develop an approach to testing the validity of game dynamics models that considers the dynamic patterns of angular momentum and speed as measurement variables. Using Rock-Paper-Scissors (RPS) games as an example, we illustrate the geometric patterns in the experiment data. We then derive the related theoretical patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, we show that the validity of these models can be evaluated quantitatively. Our approach establishes a link between dynamics models and experimental systems, which is, to the best of our knowledge, the most effective and rigorous strategy for ascertaining the testability of evolutionary game dynamics models.
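
    The measurement variables named above can be operationalized as statistics of a discrete trajectory of the social state. Below is a minimal sketch of one plausible definition of mean angular momentum about the centroid; the paper's exact definition may differ, and the trajectory here is synthetic rather than experimental:

```python
import numpy as np

def mean_angular_momentum(traj):
    """Mean angular momentum of a 2-D trajectory about its centroid.

    traj: (T, 2) array of social-state coordinates over time.
    L_t is the z-component of (x_t - c) x (x_{t+1} - x_t).
    A persistently nonzero mean indicates systematic cycling,
    as expected in Rock-Paper-Scissors dynamics.
    """
    traj = np.asarray(traj, dtype=float)
    c = traj.mean(axis=0)
    r = traj[:-1] - c               # position relative to centroid
    v = np.diff(traj, axis=0)       # discrete velocity
    return np.mean(r[:, 0] * v[:, 1] - r[:, 1] * v[:, 0])

# Synthetic counter-clockwise cycle: positive angular momentum
theta = np.linspace(0, 4 * np.pi, 200)
cycle = np.c_[np.cos(theta), np.sin(theta)]
print(mean_angular_momentum(cycle))  # positive for counter-clockwise motion
```

    Comparing such a statistic between experimental data and trajectories generated by a candidate dynamics model is the kind of goodness-of-fit test the abstract describes.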

  5. Pose Measurement Performance of the Argon Relative Navigation Sensor Suite in Simulated Flight Conditions

    NASA Technical Reports Server (NTRS)

    Galante, Joseph M.; Eepoel, John Van; Strube, Matt; Gill, Nat; Gonzalez, Marcelo; Hyslop, Andrew; Patrick, Bryan

    2012-01-01

    Argon is a flight-ready sensor suite with two visual cameras, a flash LIDAR, an on-board flight computer, and associated electronics. Argon was designed to provide sensing capabilities for relative navigation during proximity, rendezvous, and docking operations between spacecraft. A rigorous ground test campaign assessed the performance capability of the Argon navigation suite to measure the relative pose of high-fidelity satellite mock-ups during a variety of simulated rendezvous and proximity maneuvers, facilitated by robot manipulators, in a variety of lighting conditions representative of the orbital environment. A brief description of the Argon suite and test setup is given, as well as an analysis of the performance of the system in simulated proximity and rendezvous operations.

  6. View of MISSE-8 taken during a session of EVA

    NASA Image and Video Library

    2011-07-12

    ISS028-E-016111 (12 July 2011) --- This close-up image, recorded during a July 12 spacewalk, shows the Materials on International Space Station Experiment - 8 (MISSE-8). The experiment package is a test bed for materials and computing elements attached to the outside of the orbiting complex. These materials and computing elements are being evaluated for the effects of atomic oxygen, ultraviolet, direct sunlight, radiation, and extremes of heat and cold. This experiment allows the development and testing of new materials and computing elements that can better withstand the rigors of space environments. Results will provide a better understanding of the durability of various materials and computing elements when they are exposed to the space environment, with applications in the design of future spacecraft.

  7. Testing for Mutagens Using Fruit Flies.

    ERIC Educational Resources Information Center

    Liebl, Eric C.

    1998-01-01

    Describes a laboratory employed in undergraduate teaching that uses fruit flies to test student-selected compounds for their ability to cause mutations. Requires no prior experience with fruit flies, incorporates a student design component, and employs both rigorous controls and statistical analyses. (DDR)

  8. Opisthotonos

    MedlinePlus

    ... The physical examination will include a complete checkup of the nervous system. Tests may include: Blood and urine tests Cerebrospinal ...

  9. The KP Approximation Under a Weak Coriolis Forcing

    NASA Astrophysics Data System (ADS)

    Melinand, Benjamin

    2018-02-01

    In this paper, we study the asymptotic behavior of weakly transverse water-waves under a weak Coriolis forcing in the long wave regime. We derive the Boussinesq-Coriolis equations in this setting and we provide a rigorous justification of this model. Then, from these equations, we derive two other asymptotic models. When the Coriolis forcing is weak, we fully justify the rotation-modified Kadomtsev-Petviashvili equation (also called Grimshaw-Melville equation). When the Coriolis forcing is very weak, we rigorously justify the Kadomtsev-Petviashvili equation. This work provides the first mathematical justification of the KP approximation under a Coriolis forcing.

  10. Enhancing Shared Decision Making Through Carefully Designed Interventions That Target Patient And Provider Behavior.

    PubMed

    Tai-Seale, Ming; Elwyn, Glyn; Wilson, Caroline J; Stults, Cheryl; Dillon, Ellis C; Li, Martina; Chuang, Judith; Meehan, Amy; Frosch, Dominick L

    2016-04-01

    Patient-provider communication and shared decision making are essential for primary care delivery and are vital contributors to patient experience and health outcomes. To alleviate communication shortfalls, we designed a novel, multidimensional intervention aimed at nudging both patients and primary care providers to communicate more openly. The intervention was tested against an existing intervention, which focused mainly on changing patients' behaviors, in four primary care clinics involving 26 primary care providers and 300 patients. Study results suggest that compared to usual care, both the novel and existing interventions were associated with better patient reports of how well primary care providers engaged them in shared decision making. Future research should build on the work in this pilot to rigorously examine the comparative effectiveness and scalability of these interventions to improve shared decision making at the point of care. Project HOPE—The People-to-People Health Foundation, Inc.

  11. Burnout in Mental Health Services: A Review of the Problem and Its Remediation

    PubMed Central

    Morse, Gary; Salyers, Michelle P.; Rollins, Angela L.; Monroe-DeVita, Maria; Pfahler, Corey

    2011-01-01

    Staff burnout is increasingly viewed as a concern in the mental health field. In this article we first examine the extent to which burnout is a problem for mental health services in terms of two critical issues: its prevalence and its association with a range of undesirable outcomes for staff, organizations, and consumers. We subsequently provide a comprehensive review of the limited research attempting to remediate burnout among mental health staff. We conclude with recommendations for the development and rigorous testing of intervention approaches to address this critical area. Keywords: burnout, burnout prevention, mental health staff PMID:21533847

  12. Well-tempered metadynamics: a smoothly-converging and tunable free-energy method

    NASA Astrophysics Data System (ADS)

    Barducci, Alessandro; Bussi, Giovanni; Parrinello, Michele

    2008-03-01

    We present [1] a method for determining the free energy dependence on a selected number of order parameters using an adaptive bias. The formalism provides a unified description which has metadynamics and canonical sampling as limiting cases. Convergence and errors can be rigorously and easily controlled. The parameters of the simulation can be tuned so as to focus the computational effort only on the physically relevant regions of the order parameter space. The algorithm is tested on the reconstruction of the alanine dipeptide free energy landscape. [1] A. Barducci, G. Bussi and M. Parrinello, Phys. Rev. Lett., accepted (2007).
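
    The adaptive bias described above follows the well-tempered rule, in which the height of each deposited Gaussian decays as exp(-V(s)/(kB*dT)). A minimal one-dimensional sketch on a double-well potential is given below; all parameters (temperatures, Gaussian height and width, deposition stride) are illustrative choices, not the authors' settings, and kB is set to 1:

```python
import numpy as np

rng = np.random.default_rng(0)

# Double-well potential and parameters (illustrative units, kB = 1)
U = lambda s: (s**2 - 1.0)**2
T, dT = 1.0, 9.0          # system temperature and well-tempered Delta T
w0, sigma = 0.1, 0.2      # initial Gaussian height and width
grid = np.linspace(-2, 2, 401)
V = np.zeros_like(grid)   # bias potential accumulated on the grid

def bias(s):
    return np.interp(s, grid, V)

s = -1.0
for step in range(20000):
    # Metropolis step on the biased potential U + V
    s_new = s + rng.normal(0, 0.1)
    dE = (U(s_new) + bias(s_new)) - (U(s) + bias(s))
    if dE < 0 or rng.random() < np.exp(-dE / T):
        s = s_new
    # Well-tempered deposition: Gaussian height decays as exp(-V(s)/dT)
    if step % 50 == 0:
        w = w0 * np.exp(-bias(s) / dT)
        V += w * np.exp(-(grid - s)**2 / (2 * sigma**2))

# With dT -> infinity this reduces to standard metadynamics;
# with dT -> 0 no bias is deposited and canonical sampling is recovered.
print(bias(-1.0), bias(1.0))
```

    The two limiting cases mentioned in the abstract correspond to the extremes of the dT parameter, as noted in the final comment.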

  13. Gravitation. [Book on general relativity

    NASA Technical Reports Server (NTRS)

    Misner, C. W.; Thorne, K. S.; Wheeler, J. A.

    1973-01-01

    This textbook on gravitation physics (Einstein's general relativity or geometrodynamics) is designed for a rigorous full-year course at the graduate level. The material is presented in two parallel tracks in an attempt to divide key physical ideas from more complex enrichment material to be selected at the discretion of the reader or teacher. The full book is intended to provide competence relative to the laws of physics in flat space-time, Einstein's geometric framework for physics, applications with pulsars and neutron stars, cosmology, the Schwarzschild geometry and gravitational collapse, gravitational waves, experimental tests of Einstein's theory, and mathematical concepts of differential geometry.

  14. Shelter from the Storm.

    ERIC Educational Resources Information Center

    Urbaniak, Al; Farber, Yuriy

    2002-01-01

    Discusses how door manufacturers are introducing products designed to pass the rigorous tests needed to withstand tornadoes, including the Federal Emergency Management Agency's 320 and 361 directives. (EV)

  15. Test Anxiety and the Curriculum: The Subject Matters.

    ERIC Educational Resources Information Center

    Everson, Howard T.; And Others

    College students' self-reported test anxiety levels in English, mathematics, physical science, and social science were compared to develop empirical support for the claim that students, in general, are more anxious about tests in rigorous academic subjects than in the humanities and to understand the curriculum-related sources of anxiety. It was…

  16. Small sample mediation testing: misplaced confidence in bootstrapped confidence intervals.

    PubMed

    Koopman, Joel; Howe, Michael; Hollenbeck, John R; Sin, Hock-Peng

    2015-01-01

    Bootstrapping is an analytical tool commonly used in psychology to test the statistical significance of the indirect effect in mediation models. Bootstrapping proponents have particularly advocated for its use for samples of 20-80 cases. This advocacy has been heeded, especially in the Journal of Applied Psychology, as researchers are increasingly utilizing bootstrapping to test mediation with samples in this range. We discuss reasons to be concerned with this escalation, and in a simulation study focused specifically on this range of sample sizes, we demonstrate not only that bootstrapping has insufficient statistical power to provide a rigorous hypothesis test in most conditions but also that bootstrapping has a tendency to exhibit an inflated Type I error rate. We then extend our simulations to investigate an alternative empirical resampling method as well as a Bayesian approach and demonstrate that they exhibit comparable statistical power to bootstrapping in small samples without the associated inflated Type I error. Implications for researchers testing mediation hypotheses in small samples are presented. For researchers wishing to use these methods in their own research, we have provided R syntax in the online supplemental materials. (c) 2015 APA, all rights reserved.
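
    The procedure under scrutiny above, a percentile bootstrap test of the indirect effect a*b, can be sketched as follows. The data are simulated with hypothetical path coefficients; this illustrates the method being evaluated, not the authors' simulation code:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated small-sample mediation data: X -> M -> Y with true a = b = 0.5,
# so the true indirect effect a*b = 0.25
n = 40
X = rng.normal(size=n)
M = 0.5 * X + rng.normal(size=n)
Y = 0.5 * M + rng.normal(size=n)

def indirect_effect(X, M, Y):
    # a-path: regress M on X; b-path: regress Y on M controlling for X
    a = np.polyfit(X, M, 1)[0]
    design = np.c_[np.ones_like(X), X, M]
    b = np.linalg.lstsq(design, Y, rcond=None)[0][2]
    return a * b

# Percentile bootstrap confidence interval for the indirect effect
boots = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boots.append(indirect_effect(X[idx], M[idx], Y[idx]))
lo_ci, hi_ci = np.percentile(boots, [2.5, 97.5])
print(f"indirect = {indirect_effect(X, M, Y):.3f}, "
      f"95% CI = [{lo_ci:.3f}, {hi_ci:.3f}]")
```

    The abstract's point is that in this 20-80 case range such intervals can have both low power and inflated Type I error, so the sketch should be read as the procedure being critiqued rather than a recommendation.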

  17. Impact of the Brain Injury Family Intervention (BIFI) training on rehabilitation providers: A mixed methods study.

    PubMed

    Meixner, Cara; O'Donoghue, Cynthia R; Hart, Vesna

    2017-01-01

    The psychological impact of TBI is vast, leading to adverse effects on survivors and their caregivers. Unhealthy family functioning may be mitigated by therapeutic strategies, particularly interdisciplinary family systems approaches like the well-documented Brain Injury Family Intervention (BIFI). Little is known about the experience of providers who offer such interventions. This mixed methods study aims to demonstrate that a structured three-day training on the BIFI protocol improves providers' knowledge and confidence in working with survivors and families, and that this outcome is sustainable. Participants were 34 providers who participated in an intensive training and completed a web-based survey at four points in time. Quantitative data were analyzed via Wilcoxon signed-rank tests and a binomial test of proportions. Qualitative data were analyzed according to rigorous coding procedures. Providers' knowledge of brain injury and their ability to conceptualize treatment models for survivors and their families increased significantly and mostly remained consistent over time. Qualitative data point to additional gains, such as understanding of family systems. Past studies quantify the BIFI as an evidence-based intervention. This study supports the effectiveness of training and serves as the first to demonstrate the benefit for providers short- and long-term.

  18. Image synthesis for SAR system, calibration and processor design

    NASA Technical Reports Server (NTRS)

    Holtzman, J. C.; Abbott, J. L.; Kaupp, V. H.; Frost, V. S.

    1978-01-01

    The Point Scattering Method of simulating radar imagery rigorously models all aspects of the imaging radar phenomena. Its computational algorithms operate on a symbolic representation of the terrain test site to calculate such parameters as range, angle of incidence, resolution cell size, etc. Empirical backscatter data and elevation data are utilized to model the terrain. Additionally, the important geometrical/propagation effects such as shadow, foreshortening, layover, and local angle of incidence are rigorously treated. Applications of radar image simulation to a proposed calibrated SAR system are highlighted: soil moisture detection and vegetation discrimination.

  19. Stem and progenitor cells: the premature desertion of rigorous definitions.

    PubMed

    Seaberg, Raewyn M; van der Kooy, Derek

    2003-03-01

    A current disturbing trend in stem cell biology is the abandonment of rigorous definitions of stem and progenitor cells in favor of more ambiguous, all-encompassing concepts. However, recent studies suggest that there are consistent, functional differences in the biology of these two cell types. Admittedly, it can be difficult to harmonize the in vivo and in vitro functional differences between stem and progenitor cells. Nonetheless, these distinctions between cell types should be emphasized rather than ignored, as they can be used to test specific hypotheses in neural stem cell biology.

  20. About Those Tests I Gave You... An Open Letter to My Students

    ERIC Educational Resources Information Center

    Dandrea, Ruth Ann

    2012-01-01

    This article presents the author's open letter to her students. In her letter, the author apologizes to her students for the state's narrow and deceptive standardized test. She asserts that she does not oppose rigorous testing and she understands the purpose of evaluation. A good test can measure achievement and even inspire. But, she argues that…

  1. Using Small-Scale Randomized Controlled Trials to Evaluate the Efficacy of New Curricular Materials

    PubMed Central

    Bass, Kristin M.; Stark, Louisa A.

    2014-01-01

    How can researchers in K–12 contexts stay true to the principles of rigorous evaluation designs within the constraints of classroom settings and limited funding? This paper explores this question by presenting a small-scale randomized controlled trial (RCT) designed to test the efficacy of curricular supplemental materials on epigenetics. The researchers asked whether the curricular materials improved students’ understanding of the content more than an alternative set of activities. The field test was conducted in a diverse public high school setting with 145 students who were randomly assigned to a treatment or comparison condition. Findings indicate that students in the treatment condition scored significantly higher on the posttest than did students in the comparison group (effect size: Cohen's d = 0.40). The paper discusses the strengths and limitations of the RCT, the contextual factors that influenced its enactment, and recommendations for others wishing to conduct small-scale rigorous evaluations in educational settings. Our intention is for this paper to serve as a case study for university science faculty members who wish to employ scientifically rigorous evaluations in K–12 settings while limiting the scope and budget of their work. PMID:25452482
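
    The reported effect size (Cohen's d = 0.40) is a standardized mean difference using the pooled standard deviation. A minimal sketch with made-up posttest scores (not the study's data):

```python
import numpy as np

def cohens_d(treatment, comparison):
    """Standardized mean difference with pooled standard deviation."""
    t = np.asarray(treatment, dtype=float)
    c = np.asarray(comparison, dtype=float)
    nt, nc = len(t), len(c)
    pooled_var = ((nt - 1) * t.var(ddof=1) +
                  (nc - 1) * c.var(ddof=1)) / (nt + nc - 2)
    return (t.mean() - c.mean()) / np.sqrt(pooled_var)

# Hypothetical posttest scores for two small groups
treatment = [14, 16, 15, 18, 17, 13, 16, 19]
comparison = [13, 14, 12, 16, 15, 12, 14, 15]
print(round(cohens_d(treatment, comparison), 2))  # prints 1.21
```

    By the conventional benchmarks, the study's d = 0.40 sits between a small (0.2) and a medium (0.5) effect.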

  2. Aflatoxin

    MedlinePlus

    ... found in the following foods: Peanuts and peanut butter, Tree nuts such as pecans, Corn, Wheat, Oil ... foods that may contain aflatoxin. Peanuts and peanut butter are some of the most rigorously tested products ...

  3. How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.

    PubMed

    Gray, Kurt

    2017-09-01

    Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification suggests psychological tension between them: The more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org). Theory mapping provides both precision and synthesis, which helps to resolve arguments, prevent redundancies, assess the theoretical contribution of papers, and evaluate the likelihood of surprising effects.

  4. Mechanical properties of frog skeletal muscles in iodoacetic acid rigor.

    PubMed Central

    Mulvany, M J

    1975-01-01

    1. Methods have been developed for describing the length: tension characteristics of frog skeletal muscles which go into rigor at 4 degrees C following iodoacetic acid poisoning either in the presence of Ca2+ (Ca-rigor) or its absence (Ca-free-rigor). 2. Such rigor muscles showed less resistance to slow stretch (slow rigor resistance) than to fast stretch (fast rigor resistance). The slow and fast rigor resistances of Ca-free-rigor muscles were much lower than those of Ca-rigor muscles. 3. The slow rigor resistance of Ca-rigor muscles was proportional to the amount of overlap between the contractile filaments present when the muscles were put into rigor. 4. Withdrawing Ca2+ from Ca-rigor muscles (induced-Ca-free rigor) reduced their slow and fast rigor resistances. Readdition of Ca2+ (but not Mg2+, Mn2+ or Sr2+) reversed the effect. 5. The slow and fast rigor resistances of Ca-rigor muscles (but not of Ca-free-rigor muscles) decreased with time. 6. The sarcomere structure of Ca-rigor and induced-Ca-free rigor muscles stretched by 0.2 lo was destroyed in proportion to the amount of stretch, but the lengths of the remaining intact sarcomeres were essentially unchanged. This suggests that there had been a successive yielding of the weakest sarcomeres. 7. The difference between the slow and fast rigor resistances and the effect of calcium on these resistances are discussed in relation to possible variations in the strength of crossbridges between the thick and thin filaments. PMID:1082023

  5. Sex begets violence: mating motives, social dominance, and physical aggression in men.

    PubMed

    Ainsworth, Sarah E; Maner, Jon K

    2012-11-01

    There are sizable gender differences in aggressive behavior, with men displaying a much higher propensity for violence than women. Evolutionary theories suggest that men's more violent nature derives in part from their historically greater need to compete over access to potential mates. The current research investigates this link between mating and male violence and provides rigorous experimental evidence that mating motives cause men to behave violently toward other men. In these studies, men and women were primed with a mating motive and then performed a noise-blast aggression task. Being primed with mating led men, but not women, to deliver more painful blasts of white noise to a same-sex partner (but not an opposite-sex partner). This effect was particularly pronounced among men with an unrestricted sociosexual orientation, for whom competition over access to new mates is an especially relevant concern. Findings also suggest that mating-induced male violence is motivated by a desire to assert one's dominance over other men: when men were given feedback that they had won a competition with their partner (and thus had achieved dominance through nonaggressive means), the effect of the mating prime on aggression was eliminated. These findings provide insight into the motivational roots of male aggression and illustrate the value of testing theories from evolutionary biology with rigorous experimental methods. (c) 2012 APA, all rights reserved.

  6. Semi-physical Simulation of the Airborne InSAR based on Rigorous Geometric Model and Real Navigation Data

    NASA Astrophysics Data System (ADS)

    Changyong, Dou; Huadong, Guo; Chunming, Han; yuquan, Liu; Xijuan, Yue; Yinghui, Zhao

    2014-03-01

    Raw signal simulation is a useful tool for the system design, mission planning, processing algorithm testing, and inversion algorithm design of Synthetic Aperture Radar (SAR). Due to the wide-ranging and high-frequency variation of the aircraft's trajectory and attitude, and the low accuracy of the Position and Orientation System (POS) recording data, it is difficult to quantitatively study the sensitivity of the key parameters of the airborne Interferometric SAR (InSAR) system, i.e., the baseline length and inclination, the absolute phase, and the orientation of the antennas, resulting in challenges for its applications. Furthermore, the imprecise estimation of the installation offset between the Global Positioning System (GPS), the Inertial Measurement Unit (IMU), and the InSAR antennas compounds the issue. An airborne InSAR simulation based on a rigorous geometric model and real navigation data is proposed in this paper, providing a way to quantitatively study the key parameters and to evaluate their effect on the applications of airborne InSAR, such as photogrammetric mapping, high-resolution Digital Elevation Model (DEM) generation, and surface deformation measurement by Differential InSAR technology. The simulation can also provide a reference for the optimal design of the InSAR system and for the improvement of InSAR data processing technologies such as motion compensation, imaging, image co-registration, and application parameter retrieval.

  7. Constructed-Response Problems

    ERIC Educational Resources Information Center

    Swinford, Ashleigh

    2016-01-01

    With rigor outlined in state and Common Core standards and the addition of constructed-response test items to most state tests, math constructed-response questions have become increasingly popular in today's classroom. Although constructed-response problems can present a challenge for students, they do offer a glimpse of students' learning through…

  8. View of MISSE-8 taken during a session of EVA

    NASA Image and Video Library

    2011-07-12

    ISS028-E-016107 (12 July 2011) --- This medium close-up image, recorded during a July 12 spacewalk, shows the Materials on International Space Station Experiment - 8 (MISSE-8). The experiment package is a test bed for materials and computing elements attached to the outside of the orbiting complex. These materials and computing elements are being evaluated for the effects of atomic oxygen, ultraviolet, direct sunlight, radiation, and extremes of heat and cold. This experiment allows the development and testing of new materials and computing elements that can better withstand the rigors of space environments. Results will provide a better understanding of the durability of various materials and computing elements when they are exposed to the space environment, with applications in the design of future spacecraft.

  9. Observational Research Rigor Alone Does Not Justify Causal Inference

    PubMed Central

    Ejima, Keisuke; Li, Peng; Smith, Daniel L.; Nagy, Tim R.; Kadish, Inga; van Groen, Thomas; Dawson, John A.; Yang, Yongbin; Patki, Amit; Allison, David B.

    2016-01-01

    Background: Differing opinions exist on whether associations obtained in observational studies can be reliable indicators of a causal effect if the observational study is sufficiently well controlled and executed. Materials and methods: To test this, we conducted two animal observational studies that were rigorously controlled and executed beyond what is achieved in studies of humans. In study 1, we randomized 332 genetically identical C57BL/6J mice into three diet groups with differing food energy allotments and recorded individual self-selected daily energy intake and lifespan. In study 2, 60 male mice (CD1) were paired and divided into two groups for a 2-week feeding regimen. We evaluated the association between weight gain and food consumption. Within each pair, one animal was randomly assigned to an S group in which the animals had free access to food. The second paired animal (R group) was provided exactly the same diet that their S partner ate the day before. Results: In study 1, across all three groups, we found a significant negative effect of energy intake on lifespan. However, we found a positive association between food intake and lifespan among the ad libitum feeding group: 29.99 (95% CI: 8.2 to 51.7) days per daily kcal. In study 2, we found a significant (P=0.003) group (randomized vs self-selected)-by-food consumption interaction effect on weight gain. Conclusions: At least in nutrition research, associations derived from observational studies may not be reliable indicators of causal effects, even with the most rigorous study designs achievable. PMID:27711975

  10. Advanced Ceramics Property Measurements

    NASA Technical Reports Server (NTRS)

    Salem, Jonathan; Helfinstine, John; Quinn, George; Gonczy, Stephen

    2013-01-01

    Mechanical and physical properties of ceramic bodies can be difficult to measure correctly unless the proper techniques are used. The Advanced Ceramics Committee of ASTM, C-28, has developed dozens of consensus test standards and practices to measure various properties of a ceramic monolith, composite, or coating. The standards give the "what, how, how not, and why" for measurement of many mechanical, physical, thermal, and performance properties. Using these standards will provide accurate, reliable, and complete data for rigorous comparisons with other test results from your test lab, or another. The C-28 Committee has involved academics, producers, and users of ceramics to write and continually update more than 45 standards since the committee's inception in 1986. Included in this poster is a pictogram of the C-28 standards and information on how to obtain individual copies with full details or the complete collection of standards in one volume.

  11. Advanced Ceramics Property and Performance Measurements

    NASA Technical Reports Server (NTRS)

    Jenkins, Michael; Salem, Jonathan; Helfinstine, John; Quinn, George; Gonczy, Stephen

    2015-01-01

    Mechanical and physical properties of ceramic bodies can be difficult to measure correctly unless the proper techniques are used. The Advanced Ceramics Committee of ASTM, C-28, has developed dozens of consensus test standards and practices to measure various properties of a ceramic monolith, composite, or coating. The standards give the "what, how, how not, and why" for measurement of many mechanical, physical, thermal, and performance properties. Using these standards will provide accurate, reliable, and complete data for rigorous comparisons with other test results from your test lab, or another. The C-28 Committee has involved academics, producers, and users of ceramics to write and continually update more than 45 standards since the committee's inception in 1986. Included in this poster is a pictogram of the C-28 standards and information on how to obtain individual copies with full details or the complete collection of all of the standards in one volume.

  12. Enhancing rigor and practice of scoping reviews in social policy research: considerations from a worked example on the Americans with disabilities act.

    PubMed

    Harris, Sarah Parker; Gould, Robert; Fujiura, Glenn

    2015-01-01

    There is increasing theoretical consideration about the use of systematic and scoping reviews of evidence in informing disability and rehabilitation research and practice. Indicative of this trend, this journal published a piece by Rumrill, Fitzgerald and Merchant in 2010 explaining the utility and process for conducting reviews of intervention-based research. There is still need to consider how to apply such rigor when conducting more exploratory reviews of heterogeneous research. This article explores the challenges, benefits, and procedures for conducting rigorous exploratory scoping reviews of diverse evidence. The article expands upon Rumrill, Fitzgerald and Merchant's framework and considers its application to more heterogeneous evidence on the impact of social policy. A worked example of a scoping review of the Americans with Disabilities Act is provided with a procedural framework for conducting scoping reviews on the effects of a social policy. The need for more nuanced techniques for enhancing rigor became apparent during the review process. There are multiple methodological steps that can enhance the utility of exploratory scoping reviews. The potential of systematic consideration during the exploratory review process is shown as a viable method to enhance the rigor in reviewing diverse bodies of evidence.

  13. High School Equivalency Testing in Washington. Forum: Responding to Changes in High School Equivalency Testing

    ERIC Educational Resources Information Center

    Kerr, Jon

    2015-01-01

    In 2013, as new high school equivalency exams were being developed and implemented across the nation and states were deciding which test was best for their population, Washington state identified the need to adopt the most rigorous test so that preparation to take it would equip students with the skills to be able to move directly from adult…

  14. The development of an Infrared Environmental System for TOPEX Solar Panel Testing

    NASA Technical Reports Server (NTRS)

    Noller, E.

    1994-01-01

    Environmental testing and flight qualification of the TOPEX/POSEIDON spacecraft solar panels were performed with infrared (IR) lamps and a control system that were newly designed and integrated. The basic goal was more rigorous testing of the costly panels' new composite-structure design without jeopardizing their safety. The technique greatly reduces the costs and high risks of testing flight solar panels.

  15. Integration of genomic medicine into pathology residency training: the Stanford Open Curriculum.

    PubMed

    Schrijver, Iris; Natkunam, Yasodha; Galli, Stephen; Boyd, Scott D

    2013-03-01

    Next-generation sequencing methods provide an opportunity for molecular pathology laboratories to perform genomic testing that is far more comprehensive than single-gene analyses. Genome-based test results are expected to develop into an integral component of diagnostic clinical medicine and to provide the basis for individually tailored health care. To achieve these goals, rigorous interpretation of high-quality data must be informed by the medical history and the phenotype of the patient. The discipline of pathology is well positioned to implement genome-based testing and to interpret its results, but new knowledge and skills must be included in the training of pathologists to develop expertise in this area. Pathology residents should be trained in emerging technologies to integrate genomic test results appropriately with more traditional testing, to accelerate clinical studies using genomic data, and to help develop appropriate standards of data quality and evidence-based interpretation of these test results. We have created a genomic pathology curriculum as a first step in helping pathology residents build a foundation for the understanding of genomic medicine and its implications for clinical practice. This curriculum is freely accessible online. Copyright © 2013 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  16. Rigorous mathematical modelling for a Fast Corrector Power Supply in TPS

    NASA Astrophysics Data System (ADS)

    Liu, K.-B.; Liu, C.-Y.; Chien, Y.-C.; Wang, B.-S.; Wong, Y. S.

    2017-04-01

    To enhance the stability of the beam orbit, a Fast Orbit Feedback System (FOFB) that eliminates undesired disturbances was installed and tested in the third-generation synchrotron light source of the Taiwan Photon Source (TPS) at the National Synchrotron Radiation Research Center (NSRRC). The effectiveness of the FOFB depends greatly on the output performance of the Fast Corrector Power Supply (FCPS); the design and implementation of an accurate FCPS is therefore essential. A rigorous mathematical model is very useful for shortening the design time and improving the design performance of an FCPS. This paper therefore proposes a rigorous mathematical model, derived by the state-space averaging method, for a full-bridge FCPS in the FOFB of the TPS. The MATLAB/SIMULINK software is used to construct the proposed model and to conduct simulations of the FCPS. The effects of different ADC resolutions on the output accuracy of the FCPS are investigated in simulation. An FCPS prototype is realized to demonstrate the effectiveness of the proposed model. Simulation and experimental results show that the proposed mathematical model is helpful for selecting components that meet the accuracy requirements of an FCPS.
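    The state-space averaging step in this record can be sketched numerically. A minimal illustration (not the authors' MATLAB/SIMULINK model; the 48 V bus and the 2 Ω / 5 mH magnet load are assumed values for the example): averaging a full-bridge stage over one switching period gives an output voltage of (2d - 1)·Vdc, so the load current obeys a first-order averaged equation.

```python
def averaged_fullbridge_current(Vdc, R, L, d, t_end, dt=1e-6):
    """State-space averaged model of a full-bridge stage driving an R-L
    magnet load.  Averaging over a switching period, the bridge output
    voltage is (2*d - 1)*Vdc for duty cycle d, so the averaged dynamics
    reduce to L*di/dt = (2*d - 1)*Vdc - R*i (forward-Euler integration)."""
    i = 0.0
    v_avg = (2.0 * d - 1.0) * Vdc
    for _ in range(int(t_end / dt)):
        i += dt * (v_avg - R * i) / L
    return i

# Steady state should approach (2*d - 1)*Vdc / R = 0.5 * 48 / 2 = 12 A.
i_ss = averaged_fullbridge_current(Vdc=48.0, R=2.0, L=5e-3, d=0.75, t_end=0.05)
print(round(i_ss, 2))   # ~12.0
```

    The averaged model deliberately discards switching ripple; that is what makes it useful for choosing components (ADC resolution, filter values) before building a switching-level simulation.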

  17. A Developmental Test of Mertonian Anomie Theory.

    ERIC Educational Resources Information Center

    Menard, Scott

    1995-01-01

    Carefully reviewed Merton's writings on anomie theory to construct a more complete and rigorous test of the theory for respondents in early, middle, and late adolescence. Concluded that misspecified models of strain theory have underestimated the predictive power of strain theory in general and of anomie theory in particular. (JBJ)

  18. Explaining the Sex Difference in Dyslexia

    ERIC Educational Resources Information Center

    Arnett, Anne B.; Pennington, Bruce F.; Peterson, Robin L.; Willcutt, Erik G.; DeFries, John C.; Olson, Richard K.

    2017-01-01

    Background: Males are diagnosed with dyslexia more frequently than females, even in epidemiological samples. This may be explained by greater variance in males' reading performance. Methods: We expand on previous research by rigorously testing the variance difference theory, and testing for mediation of the sex difference by cognitive correlates.…

  19. Mathematical Basis and Test Cases for Colloid-Facilitated Radionuclide Transport Modeling in GDSA-PFLOTRAN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reimus, Paul William

    This report provides documentation of the mathematical basis for a colloid-facilitated radionuclide transport modeling capability that can be incorporated into GDSA-PFLOTRAN. It also provides numerous test cases against which the modeling capability can be benchmarked once the model is implemented numerically in GDSA-PFLOTRAN. The test cases were run using a 1-D numerical model developed by the author, and the inputs and outputs from the 1-D model are provided in an electronic spreadsheet supplement to this report so that all cases can be reproduced in GDSA-PFLOTRAN and the outputs directly compared with the 1-D model. The cases include examples of all potential scenarios in which colloid-facilitated transport could result in the accelerated transport of a radionuclide relative to its transport in the absence of colloids. Although it cannot be claimed that all the model features described in the mathematical basis were rigorously exercised in the test cases, the goal was to test the features that matter most for colloid-facilitated transport; i.e., slow desorption of radionuclides from colloids, slow filtration of colloids, and equilibrium radionuclide partitioning to colloids that is strongly favored over partitioning to immobile surfaces, resulting in a substantial fraction of radionuclide mass being associated with mobile colloids.

  20. Thermo-electrochemical evaluation of lithium-ion batteries for space applications

    NASA Astrophysics Data System (ADS)

    Walker, W.; Yayathi, S.; Shaw, J.; Ardebili, H.

    2015-12-01

    Advanced energy storage and power management systems designed through rigorous materials selection, testing and analysis processes are essential to ensuring mission longevity and success for space exploration applications. Comprehensive testing of Boston Power Swing 5300 lithium-ion (Li-ion) cells utilized by the National Aeronautics and Space Administration (NASA) to power humanoid robot Robonaut 2 (R2) is conducted to support the development of a test-correlated Thermal Desktop (TD) Systems Improved Numerical Differencing Analyzer (SINDA) (TD-S) model for evaluation of power system thermal performance. Temperature, current, working voltage and open circuit voltage measurements are taken during nominal charge-discharge operations to provide necessary characterization of the Swing 5300 cells for TD-S model correlation. Building from test data, embedded FORTRAN statements directly simulate Ohmic heat generation of the cells during charge-discharge as a function of surrounding temperature, local cell temperature and state of charge. The unique capability gained by using TD-S is demonstrated by simulating R2 battery thermal performance in example orbital environments for hypothetical extra-vehicular activities (EVA) exterior to a small satellite. Results provide necessary demonstration of this TD-S technique for thermo-electrochemical analysis of Li-ion cells operating in space environments.

  1. Development and validation of rear impact computer simulation model of an adult manual transit wheelchair with a seated occupant.

    PubMed

    Salipur, Zdravko; Bertocci, Gina

    2010-01-01

    It has been shown that ANSI WC19 transit wheelchairs that are crashworthy in frontal impact exhibit catastrophic failures in rear impact and may not be able to provide stable seating support and thus occupant protection for the wheelchair occupant. Thus far only limited sled test and computer simulation data have been available to study rear impact wheelchair safety. Computer modeling can be used as an economic and comprehensive tool to gain critical knowledge regarding wheelchair integrity and occupant safety. This study describes the development and validation of a computer model simulating an adult wheelchair-seated occupant subjected to a rear impact event. The model was developed in MADYMO and validated rigorously using the results of three similar sled tests conducted to specifications provided in the draft ISO/TC 173 standard. Outcomes from the model can provide critical wheelchair loading information to wheelchair and tiedown manufacturers, resulting in safer wheelchair designs for rear impact conditions. (c) 2009 IPEM. Published by Elsevier Ltd. All rights reserved.

  2. Methodological Issues in Trials of Complementary and Alternative Medicine Interventions

    PubMed Central

    Sikorskii, Alla; Wyatt, Gwen; Victorson, David; Faulkner, Gwen; Rahbar, Mohammad Hossein

    2010-01-01

    Background Complementary and alternative medicine (CAM) use is widespread among cancer patients. Information on safety and efficacy of CAM therapies is needed for both patients and health care providers. Well-designed randomized clinical trials (RCTs) of CAM therapy interventions can inform both clinical research and practice. Objectives To review important issues that affect the design of RCTs for CAM interventions. Methods Using the methods component of the Consolidated Standards for Reporting Trials (CONSORT) as a guiding framework, and a National Cancer Institute-funded reflexology study as an exemplar, methodological issues related to participants, intervention, objectives, outcomes, sample size, randomization, blinding, and statistical methods were reviewed. Discussion Trials of CAM interventions designed and implemented according to appropriate methodological standards will facilitate the needed scientific rigor in CAM research. Interventions in CAM can be tested using proposed methodology, and the results of testing will inform nursing practice in providing safe and effective supportive care and improving the well-being of patients. PMID:19918155

  3. Robust Likelihoods for Inflationary Gravitational Waves from Maps of Cosmic Microwave Background Polarization

    NASA Technical Reports Server (NTRS)

    Switzer, Eric Ryan; Watts, Duncan J.

    2016-01-01

    The B-mode polarization of the cosmic microwave background provides a unique window into tensor perturbations from inflationary gravitational waves. Survey effects complicate the estimation and description of the power spectrum on the largest angular scales. The pixel-space likelihood yields parameter distributions without the power spectrum as an intermediate step, but it does not have the large suite of tests available to power spectral methods. Searches for primordial B-modes must rigorously reject and rule out contamination. Many forms of contamination vary or are uncorrelated across epochs, frequencies, surveys, or other data treatment subsets. The cross power and the power spectrum of the difference of subset maps provide approaches to reject and isolate excess variance. We develop an analogous joint pixel-space likelihood. Contamination not modeled in the likelihood produces parameter-dependent bias and complicates the interpretation of the difference map. We describe a null test that consistently weights the difference map. Excess variance should either be explicitly modeled in the covariance or be removed through reprocessing the data.
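    The pixel-space likelihood idea named here, parameter inference directly from the map with no power spectrum as an intermediate step, can be sketched on toy data. Everything below is illustrative (white signal and noise covariances, a single amplitude parameter r standing in for the tensor amplitude), not the survey's actual covariance model:

```python
import numpy as np

rng = np.random.default_rng(1)

npix = 50
S = np.eye(npix)         # toy signal covariance template (white, unit power)
N = 0.1 * np.eye(npix)   # toy noise covariance
r_true = 0.5
m = rng.multivariate_normal(np.zeros(npix), r_true * S + N)  # simulated map

def log_like(r):
    """Pixel-space Gaussian log-likelihood of the map for amplitude r:
    the map itself is the data vector, evaluated against C(r) = r*S + N."""
    C = r * S + N
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (m @ np.linalg.solve(C, m) + logdet)

rs = np.linspace(0.01, 2.0, 200)
r_hat = float(rs[np.argmax([log_like(r) for r in rs])])
print(round(r_hat, 2))   # maximum-likelihood amplitude, near r_true = 0.5
```

    A real analysis would use dense, correlated S and N built from the beam, mask, and scan strategy; the structure of the calculation (solve against C(r), add the log-determinant) is the same.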

  4. Adhesives and the ATS satellite. [construction of honeycomb panels

    NASA Technical Reports Server (NTRS)

    Hancock, F. E.

    1972-01-01

    Adhesives in the ATS satellite allow the designers to save weight, simplify design and fabrication, and provide thermal and electrical conductivity or resistivity as required. The selection of adhesives is restricted to the few that can pass rigorous outgassing tests, in order to avoid contaminating lenses and thermal control surfaces in space. An epoxy adhesive is used to construct the honeycomb panels that constitute most of the satellite's structure. General-purpose epoxy adhesives hold doublers and standoffs in place and bond the truss to its fittings. Specialized adhesives include a high-temperature-resistant polyamide, a flexible polyurethane, and filled epoxies that conduct heat or electricity.

  5. Electronic structure and microscopic model of V(2)GeO(4)F(2)-a quantum spin system with S = 1.

    PubMed

    Rahaman, Badiur; Saha-Dasgupta, T

    2007-07-25

    We present first-principles density functional calculations and downfolding studies of the electronic and magnetic properties of the oxide-fluoride quantum spin system V(2)GeO(4)F(2). We discuss explicitly the nature of the exchange paths and provide quantitative estimates of the magnetic exchange couplings. Microscopic modelling based on analysis of the electronic structure places this system in the interesting class of weakly coupled alternating-chain S = 1 systems. Based on the microscopic model, we make inferences about its spin excitation spectrum; these need to be tested by rigorous experimental study.

  6. Parameter inference in small world network disease models with approximate Bayesian Computational methods

    NASA Astrophysics Data System (ADS)

    Walker, David M.; Allingham, David; Lee, Heung Wing Joseph; Small, Michael

    2010-02-01

    Small world network models have been effective in capturing the variable behaviour of reported case data of the SARS coronavirus outbreak in Hong Kong during 2003. Simulations of these models have previously been realized using informed “guesses” of the proposed model parameters and tested for consistency with the reported data by surrogate analysis. In this paper we attempt to provide statistically rigorous parameter distributions using Approximate Bayesian Computation sampling methods. We find that such sampling schemes are a useful framework for fitting parameters of stochastic small world network models where simulation of the system is straightforward but expressing a likelihood is cumbersome.
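    The sampling scheme named here, Approximate Bayesian Computation, is standard and easy to sketch in its simplest (rejection) form. The toy model below, a binomial "outbreak size" with a flat prior on the infection probability, is an illustrative stand-in for the stochastic small world network model, not the authors' simulation:

```python
import random

def abc_rejection(observed, simulate, prior_sample, distance, eps, n):
    """Basic ABC rejection sampler: draw parameters from the prior and
    keep those whose simulated summary lies within eps of the data.
    No likelihood is ever written down; only forward simulation."""
    accepted = []
    while len(accepted) < n:
        theta = prior_sample()
        if distance(simulate(theta), observed) <= eps:
            accepted.append(theta)
    return accepted

random.seed(0)

def outbreak_size(p, n_people=1000):
    # toy "epidemic": each person independently infected with probability p
    return sum(random.random() < p for _ in range(n_people))

observed = outbreak_size(0.3)                       # pretend data, true p = 0.3
posterior = abc_rejection(observed, outbreak_size,
                          lambda: random.uniform(0, 1),  # flat prior on p
                          lambda a, b: abs(a - b), eps=10, n=100)
est = sum(posterior) / len(posterior)
print(round(est, 2))   # posterior mean should land near 0.3
```

    This is exactly the situation the abstract describes: simulating the model is straightforward, while expressing a likelihood is cumbersome, so accepted draws stand in for posterior samples.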

  7. Social Security And Mental Illness: Reducing Disability With Supported Employment

    PubMed Central

    Drake, Robert E.; Skinner, Jonathan S.; Bond, Gary R.; Goldman, Howard H.

    2010-01-01

    Social Security Administration disability programs are expensive, growing, and headed toward bankruptcy. People with psychiatric disabilities now constitute the largest and most rapidly expanding subgroup of program beneficiaries. Evidence-based supported employment is a well-defined, rigorously tested service model that helps people with psychiatric disabilities obtain and succeed in competitive employment. Providing evidence-based supported employment and mental health services to this population could reduce the growing rates of disability and enable those already disabled to contribute positively to the workforce and to their own welfare, at little or no cost (and, depending on assumptions, a possible savings) to the government. PMID:19414885

  8. Implicit filtered P_N for high-energy density thermal radiation transport using discontinuous Galerkin finite elements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laboure, Vincent M., E-mail: vincent.laboure@tamu.edu; McClarren, Ryan G., E-mail: rgm@tamu.edu; Hauck, Cory D., E-mail: hauckc@ornl.gov

    2016-09-15

    In this work, we provide a fully-implicit implementation of the time-dependent, filtered spherical harmonics (FP_N) equations for non-linear, thermal radiative transfer. We investigate local filtering strategies and analyze the effect of the filter on the conditioning of the system, showing in particular that the filter improves the convergence properties of the iterative solver. We also investigate numerically the rigorous error estimates derived in the linear setting, to determine whether they hold also for the non-linear case. Finally, we simulate a standard test problem on an unstructured mesh and make comparisons with implicit Monte Carlo (IMC) calculations.

  9. Drill Holes and Predation Traces versus Abrasion-Induced Artifacts Revealed by Tumbling Experiments

    PubMed Central

    Gorzelak, Przemysław; Salamon, Mariusz A.; Trzęsiok, Dawid; Niedźwiedzki, Robert

    2013-01-01

    Drill holes made by predators in prey shells are widely considered to be the most unambiguous bodies of evidence of predator-prey interactions in the fossil record. However, recognition of traces of predatory origin from those formed by abiotic factors still waits for a rigorous evaluation as a prerequisite to ascertain predation intensity through geologic time and to test macroevolutionary patterns. New experimental data from tumbling various extant shells demonstrate that abrasion may leave holes strongly resembling the traces produced by drilling predators. They typically represent singular, circular to oval penetrations perpendicular to the shell surface. These data provide an alternative explanation to the drilling predation hypothesis for the origin of holes recorded in fossil shells. Although various non-morphological criteria (evaluation of holes for non-random distribution) and morphometric studies (quantification of the drill hole shape) have been employed to separate biological from abiotic traces, these are probably insufficient to exclude abrasion artifacts, consequently leading to overestimate predation intensity. As a result, from now on, we must adopt more rigorous criteria to appropriately distinguish abrasion artifacts from drill holes, such as microstructural identification of micro-rasping traces. PMID:23505530

  10. Aerial photography flight quality assessment with GPS/INS and DEM data

    NASA Astrophysics Data System (ADS)

    Zhao, Haitao; Zhang, Bing; Shang, Jiali; Liu, Jiangui; Li, Dong; Chen, Yanyan; Zuo, Zhengli; Chen, Zhengchao

    2018-01-01

    The flight altitude, ground coverage, photo overlap, and other acquisition specifications of an aerial photography flight mission directly affect the quality and accuracy of the subsequent mapping tasks. To ensure smooth post-flight data processing and to meet the pre-defined mapping accuracy, flight quality assessments should be carried out promptly. This paper presents a novel and rigorous approach for flight quality evaluation of frame cameras using GPS/INS data and a DEM, based on geometric calculation rather than the image analysis of conventional methods. The approach rests mainly on the collinearity equations, through which the accuracy of a set of flight quality indicators is derived via a rigorous error propagation model and validated with scenario data. Theoretical analysis and a practical flight test of an aerial photography mission using an UltraCamXp camera showed that the calculated photo overlap is accurate enough for flight quality assessment of 5 cm ground sample distance imagery, using the SRTMGL3 DEM and the POSAV510 GPS/INS data. Even better overlap accuracy could be achieved for coarser-resolution aerial photography. With this new approach, flight quality evaluation can be conducted on site right after landing, providing accurate and timely information for decision making.
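    The geometric (image-free) flavor of the overlap check can be illustrated with a flat-terrain simplification. This is a rough sketch, not the paper's rigorous collinearity-plus-error-propagation method; the focal length, sensor size, and flight geometry below are made-up numbers:

```python
def forward_overlap(flying_height, terrain_height, focal_length,
                    sensor_along_track, air_base):
    """Simplified forward-overlap check for a nadir-pointing frame camera
    over flat terrain: along-track ground coverage is
        D = (sensor_along_track / focal_length) * height_above_ground,
    and overlap between consecutive exposures is p = 1 - air_base / D.
    (The paper's method is more rigorous: full collinearity equations
    with GPS/INS attitude and a DEM, plus error propagation.)"""
    h_ag = flying_height - terrain_height           # height above ground
    coverage = sensor_along_track / focal_length * h_ag
    return max(0.0, 1.0 - air_base / coverage)

# Illustrative numbers: 100 mm lens, 68 mm along-track sensor,
# flying at 3000 m over 500 m terrain, one exposure every 510 m.
p = forward_overlap(3000.0, 500.0, 0.100, 0.068, 510.0)
print(round(p, 2))   # 0.7, i.e. 70% forward overlap
```

    The DEM enters through the height-above-ground term: over high terrain, coverage shrinks and overlap drops, which is precisely why a flat-terrain check is insufficient in mountainous survey areas.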

  11. A Computational Framework for Automation of Point Defect Calculations

    NASA Astrophysics Data System (ADS)

    Goyal, Anuj; Gorai, Prashun; Peng, Haowei; Lany, Stephan; Stevanovic, Vladan; National Renewable Energy Laboratory, Golden, Colorado 80401 Collaboration

    A complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory has been developed. The framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. The package provides the capability to compute widely accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band-filling correction to shallow defects. Using Si, ZnO, and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology. We believe that a robust automated tool like this will enable the materials-by-design community to assess the impact of point defects on materials performance.

  12. Cell separation by immunoaffinity partitioning with polyethylene glycol-modified Protein A in aqueous polymer two-phase systems

    NASA Technical Reports Server (NTRS)

    Karr, Laurel J.; Van Alstine, James M.; Snyder, Robert S.; Shafer, Steven G.; Harris, J. Milton

    1988-01-01

    Previous work has shown that polyethylene glycol (PEG)-bound antibodies can be used as affinity ligands in PEG-dextran two-phase systems to provide selective partitioning of cells to the PEG-rich phase. In the present work it is shown that immunoaffinity partitioning can be simplified by use of PEG-modified Protein A which complexes with unmodified antibody and cells and shifts their partitioning into the PEG-rich phase, thus eliminating the need to prepare a PEG-modified antibody for each cell type. In addition, the paper provides a more rigorous test of the original technique with PEG-bound antibodies by showing that it is effective at shifting the partitioning of either cell type of a mixture of two cell populations.

  13. Evaluation of Metals Release from Oxidation of Fly Ash during Dredging of the Emory River, TN

    DTIC Science & Technology

    2011-08-01

    from an oil-free source (trickle flow, 2-5 bubbles per second) to provide some turbulent flow and to maintain dissolved oxygen levels. More rigorous...larval and (b) juvenile Pimephales promelas. ERDC/EL TR-11-9 79 five juvenile fish and was rigorously aerated from an oil-free source to...epithelial width. In contrast, juvenile pike from a reference lake had significantly thicker gill filaments compared to those exposed to Key Lake uranium

  14. Are Boys Discriminated in Swedish High Schools?

    ERIC Educational Resources Information Center

    Hinnerich, Bjorn Tyrefors; Hoglin, Erik; Johannesson, Magnus

    2011-01-01

    Girls typically have higher grades than boys in school and recent research suggests that part of this gender difference may be due to discrimination of boys in grading. We rigorously test this in a field experiment where a random sample of the same tests in the Swedish language is subject to blind and non-blind grading. The non-blind test score is…

  15. Discrimination against Students with Foreign Backgrounds: Evidence from Grading in Swedish Public High Schools

    ERIC Educational Resources Information Center

    Hinnerich, Bjorn Tyrefors; Höglin, Erik; Johannesson, Magnus

    2015-01-01

    We rigorously test for discrimination against students with foreign backgrounds in high school grading in Sweden. We analyse a random sample of national tests in the Swedish language graded both non-blindly by the student's own teacher and blindly without any identifying information. The increase in the test score due to non-blind grading is…

  16. 2011 Release of the Evaluated Nuclear Data Library (ENDL2011.0)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D. A.; Beck, B.; Descalles, M. A.

    LLNL’s Computational Nuclear Physics Group and Nuclear Theory and Modeling Group have collaborated to produce the last of three major releases of LLNL’s evaluated nuclear database, ENDL2011. ENDL2011 is designed to support LLNL’s current and future nuclear data needs by providing the best nuclear data available to our programmatic customers. This library contains many new evaluations for radiochemical diagnostics, structural materials, and thermonuclear reactions. We have made an effort to eliminate all holes in reaction networks, allowing in-line isotopic creation and depletion calculations. We have striven to keep ENDL2011 at the leading edge of nuclear data library development by reviewing and incorporating new evaluations as they are made available to the nuclear data community. Finally, this release is our most highly tested release, as we have strengthened our already rigorous testing regime by adding tests against IPPE Activation Ratio Measurements, many more new critical assemblies, and a more complete set of classified testing (to be detailed separately).

  17. QTest: Quantitative Testing of Theories of Binary Choice

    PubMed Central

    Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495

  18. A Theoretical Framework for Lagrangian Descriptors

    NASA Astrophysics Data System (ADS)

    Lopesino, C.; Balibrea-Iniesta, F.; García-Garrido, V. J.; Wiggins, S.; Mancho, A. M.

    This paper provides a theoretical background for Lagrangian Descriptors (LDs). The goal of achieving rigorous proofs that justify the ability of LDs to detect invariant manifolds is simplified by introducing an alternative definition for LDs. The definition is stated for n-dimensional systems with general time dependence; however, we rigorously prove that this method reveals the stable and unstable manifolds of hyperbolic points in four particular 2D cases: a hyperbolic saddle point for linear autonomous systems, for nonlinear autonomous systems, for linear nonautonomous systems, and for nonlinear nonautonomous systems. We also discuss further rigorous results that show the ability of LDs to highlight additional invariant sets, such as n-tori. These results are a simple extension of ergodic partition theory, which we illustrate by applying the methodology to well-known examples, such as the planar field of the harmonic oscillator and the 3D ABC flow. Finally, we provide a thorough discussion of the requirement of the objectivity (frame-invariance) property for tools designed to reveal phase space structures, and its implications for Lagrangian descriptors.
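    The underlying object, an arc-length Lagrangian descriptor, is simple to compute for the first of the four proven cases, the linear autonomous saddle. A minimal sketch (using the classical integral definition rather than the paper's alternative definition; step size and interval are illustrative):

```python
import math

def lagrangian_descriptor(x0, y0, tau, dt=1e-3):
    """Arc-length Lagrangian descriptor M for the linear saddle
        x' = x,  y' = -y:
    integrate the speed |(x, -y)| = sqrt(x^2 + y^2) along the trajectory
    through (x0, y0) from t = -tau to t = +tau.  The exact flow map
    (x, y) -> (x*e^h, y*e^-h) advances the state by a time step h."""
    total = 0.0
    for h in (dt, -dt):                      # forward and backward in time
        x, y = x0, y0
        for _ in range(int(round(tau / dt))):
            total += math.hypot(x, y) * dt   # accumulate arc length
            x *= math.exp(h)
            y *= math.exp(-h)
    return total

# M is non-smooth across the invariant manifolds x = 0 and y = 0, which is
# how LDs reveal them; by the time symmetry of the saddle, M(a,b) = M(b,a).
print(round(lagrangian_descriptor(0.1, 1.0, 2.0), 3))
```

    Scanning M along a line transverse to the stable manifold shows the characteristic singular feature at the crossing point, the signature the paper's proofs make rigorous.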

  19. The MIXED framework: A novel approach to evaluating mixed-methods rigor.

    PubMed

    Eckhardt, Ann L; DeVon, Holli A

    2017-10-01

    Evaluation of rigor in mixed-methods (MM) research is a persistent challenge due to the combination of inconsistent philosophical paradigms, the use of multiple research methods which require different skill sets, and the need to combine research at different points in the research process. Researchers have proposed a variety of ways to thoroughly evaluate MM research, but each method fails to provide a framework that is useful for the consumer of research. In contrast, the MIXED framework is meant to bridge the gap between an academic exercise and practical assessment of a published work. The MIXED framework (methods, inference, expertise, evaluation, and design) borrows from previously published frameworks to create a useful tool for the evaluation of a published study. The MIXED framework uses an experimental eight-item scale that allows for comprehensive integrated assessment of MM rigor in published manuscripts. Mixed methods are becoming increasingly prevalent in nursing and healthcare research requiring researchers and consumers to address issues unique to MM such as evaluation of rigor. © 2017 John Wiley & Sons Ltd.

  20. Postoperative cognitive dysfunction and its relationship to cognitive reserve in elderly total joint replacement patients.

    PubMed

    Scott, J E; Mathias, J L; Kneebone, A C; Krishnan, J

    2017-06-01

    Whether total joint replacement (TJR) patients are susceptible to postoperative cognitive dysfunction (POCD) remains unclear due to inconsistencies in research methodologies. Moreover, cognitive reserve may moderate the development of POCD after TJR, but has not been investigated in this context. The current study investigated POCD after TJR, and its relationship with cognitive reserve, using a more rigorous methodology than has previously been utilized. Fifty-three older adults (aged 50+) scheduled for TJR were assessed pre and post surgery (6 months). Forty-five healthy controls matched for age, gender, and premorbid IQ were re-assessed after an equivalent interval. Cognition, cognitive reserve, and physical and mental health were all measured. Standardized regression-based methods were used to assess cognitive changes, while controlling for the confounding effect of repeated cognitive testing. TJR patients only demonstrated a significant decline in Trail Making Test Part B (TMT B) performance, compared to controls. Cognitive reserve only predicted change in TMT B scores among a subset of TJR patients. Specifically, patients who showed the most improvement pre to post surgery had significantly higher reserve than those who showed the greatest decline. The current study provides limited evidence of POCD after TJR when examined using a rigorous methodology, which controlled for practice effects. Cognitive reserve only predicted performance within a subset of the TJR sample. However, the role of reserve in more cognitively compromised patients remains to be determined.
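    The "standardized regression-based methods" mentioned here have a standard form that is easy to sketch. The scores below are invented for illustration; the sketch shows only the mechanics of controlling for practice effects, not the study's actual analysis:

```python
import math
import statistics

def srb_change_scores(ctrl_t1, ctrl_t2, pat_t1, pat_t2):
    """Standardized regression-based (SRB) change scores: regress the
    controls' retest scores on their baseline scores, predict each
    patient's retest from that line, and express observed-minus-predicted
    change in units of the control standard error of the estimate.
    Because controls also improve with repeat testing, this controls
    for practice effects."""
    n = len(ctrl_t1)
    mx, my = statistics.fmean(ctrl_t1), statistics.fmean(ctrl_t2)
    sxx = sum((x - mx) ** 2 for x in ctrl_t1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(ctrl_t1, ctrl_t2))
    b = sxy / sxx                       # regression slope
    a = my - b * mx                     # regression intercept
    resid = [y - (a + b * x) for x, y in zip(ctrl_t1, ctrl_t2)]
    see = math.sqrt(sum(r * r for r in resid) / (n - 2))  # std. error of estimate
    return [(y - (a + b * x)) / see for x, y in zip(pat_t1, pat_t2)]

# Illustrative scores: controls gain about 2 points on retest (practice),
# so a patient whose score merely stays flat shows a negative SRB z-score.
z = srb_change_scores([20, 22, 24, 26, 28], [23, 23, 27, 27, 31], [24], [24])[0]
print(round(z, 2))   # about -1.74
```

    A conventional raw-difference score would call this patient "unchanged"; the SRB score flags the failure to show the expected practice-related gain, which is the point of the more rigorous methodology.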

  1. All That Glitters: A Glimpse into the Future of Cancer Screening

    Cancer.gov

    Developing new screening approaches and rigorously establishing their validity is challenging. Researchers are actively searching for new screening tests that improve the benefits of screening while limiting the harms.

  2. Identifying incompatible combinations of concrete materials: volume II, test protocol.

    DOT National Transportation Integrated Search

    2006-08-01

    Unexpected interactions between otherwise acceptable ingredients in portland cement concrete are becoming increasingly common as cementitious systems become more complex and demands on the systems are more rigorous. Examples of incompatibilities ...

  3. B-ALL minimal residual disease flow cytometry: an application of a novel method for optimization of a single-tube model.

    PubMed

    Shaver, Aaron C; Greig, Bruce W; Mosse, Claudio A; Seegmiller, Adam C

    2015-05-01

    Optimizing a clinical flow cytometry panel can be a subjective process dependent on experience. We develop a quantitative method to make this process more rigorous and apply it to B lymphoblastic leukemia/lymphoma (B-ALL) minimal residual disease (MRD) testing. We retrospectively analyzed our existing three-tube, seven-color B-ALL MRD panel and used our novel method to develop an optimized one-tube, eight-color panel, which was tested prospectively. The optimized one-tube, eight-color panel resulted in greater efficiency of time and resources with no loss in diagnostic power. Constructing a flow cytometry panel using a rigorous, objective, quantitative method permits optimization and avoids problems of interdependence and redundancy in a large, multiantigen panel. Copyright© by the American Society for Clinical Pathology.

  4. International Ultraviolet Explorer (IUE) Battery History and Performance

    NASA Technical Reports Server (NTRS)

    Rao, Gopalskrishna M.; Tiller, Smith E.

    1999-01-01

    The "International Ultraviolet Explorer (IUE) Battery History and Performance" report provides information on the cell/battery design, battery performance during the thirty-eight (38) solar eclipse seasons, and the end-of-life test data. It is noteworthy that the IUE spacecraft was an in-house project and that the batteries were designed, fabricated, and tested (Qualification and Acceptance) at the Goddard Space Flight Center. Detailed information is given on the cell and battery design criteria and designs, on the Qualification and Acceptance tests, and on the cell life-cycling tests. The environmental, thermal, and vibration tests were performed at the battery level as well as with the interface on the spacecraft. The telemetry data were acquired, analyzed, and trended for various parameters over the mission life. Rigorous and diligent battery management programs were developed and implemented from time to time to extend the mission life beyond eighteen years. Prior to the termination of spacecraft operation, special tests were conducted to check the battery switching operation, battery residual capacity, third-electrode performance, and battery impedance.

  5. 78 FR 31941 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-28

    ... burden 810 RISE Staff Pre-Test 157 1 .25 39 RISE Staff Post-Test 157 1 .25 39 RISE burden 78 Estimated... addition, evaluation plans were developed to support rigorous site-specific and cross-site studies to... and Lesbian Center's Recognize Intervene Support Empower (RISE) project. A third phase of the study...

  6. Standardized Test Results: KEEP and Control Students. 1975-1976, Technical Report #69.

    ERIC Educational Resources Information Center

    Antill, Ellen; Speidel, Gisela E.

    This report presents the results of various standardized measures administered to Kamehameha Early Education Program (KEEP) students and control students in the school year 1975-1976. In contrast to previous comparisons, KEEP employed more rigorous procedures for the selection of the control students and for the conditions of test administration.…

  7. Standards, Assessments & Opting Out, Spring 2015

    ERIC Educational Resources Information Center

    Advance Illinois, 2015

    2015-01-01

    In the spring, Illinois students will take new state assessments that reflect the rigor and relevance of the new Illinois Learning Standards. But some classmates will sit out and join the pushback against standardized testing. Opt-out advocates raise concerns about over-testing, and the resulting toll on students as well as the impact on classroom…

  8. LEAD LEACHING FROM IN-LINE BRASS DEVICES: A CRITICAL EVALUATION OF THE EXISTING STANDARD

    EPA Science Inventory

    The ANSI/NSF 61, Section 8 standard is intended to protect the public from in-line brass plumbing products that might leach excessive levels of lead to potable water. Experiments were conducted to examine the practical rigor of this test. Contrary to expectations, the test was no...

  9. Curve fitting air sample filter decay curves to estimate transuranic content.

    PubMed

    Hayes, Robert B; Chiou, Hung Cheng

    2004-01-01

    By testing industry-standard techniques for radon progeny evaluation on air sample filters, a new technique is developed to evaluate transuranic activity on air filters by curve fitting the decay curves. The industry method modified here is simply the use of filter activity measurements at different times to estimate the air concentrations of radon progeny. The primary modification was not to look for specific radon progeny values but rather for transuranic activity. By using a method that provides reasonably conservative estimates of the transuranic activity present on a filter, some credit for the decay curve shape can be taken. By carrying out rigorous statistical analysis of the curve fits to over 65 samples having no transuranic activity, taken over a 10-month period, an optimized fitting function and quality tests for this purpose were attained.
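    The decay-curve idea in this abstract can be sketched numerically. The model, the 45-minute effective half-life, and all data below are illustrative assumptions (not the paper's), treating gross filter activity as a decaying radon-progeny component plus a constant transuranic residual:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hedged sketch: gross filter activity modelled as a decaying radon-progeny
# component plus a constant transuranic residual. The 45-min effective
# half-life and all numbers are illustrative assumptions, not the paper's.
def filter_activity(t, a0, lam, tru):
    return a0 * np.exp(-lam * t) + tru

t = np.linspace(0.0, 600.0, 30)                 # minutes after end of sampling
true_lam = np.log(2) / 45.0
rng = np.random.default_rng(2)
counts = filter_activity(t, 500.0, true_lam, 3.0) + rng.normal(0.0, 1.0, t.size)

# Fit the decay curve; the transuranic estimate is the fitted constant term.
(a0_hat, lam_hat, tru_hat), _ = curve_fit(filter_activity, t, counts,
                                          p0=(400.0, 0.01, 0.0))
```

    Because the progeny component has decayed away at late times, the constant term is well constrained by the tail of the curve, which is the "credit for the decay curve shape" the abstract mentions.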

  10. Bayesian operational modal analysis with asynchronous data, part I: Most probable value

    NASA Astrophysics Data System (ADS)

    Zhu, Yi-Chen; Au, Siu-Kui

    2018-01-01

    In vibration tests, multiple sensors are used to obtain detailed mode shape information about the tested structure. Time synchronisation among data channels is required in conventional modal identification approaches. Modal identification can be more flexibly conducted if this is not required. Motivated by the potential gain in feasibility and economy, this work proposes a Bayesian frequency domain method for modal identification using asynchronous 'output-only' ambient data, i.e. 'operational modal analysis'. It provides a rigorous means for identifying the global mode shape taking into account the quality of the measured data and their asynchronous nature. This paper (Part I) proposes an efficient algorithm for determining the most probable values of modal properties. The method is validated using synthetic and laboratory data. The companion paper (Part II) investigates identification uncertainty and challenges in applications to field vibration data.

  11. A Prospective Test of Cognitive Vulnerability Models of Depression With Adolescent Girls

    PubMed Central

    Bohon, Cara; Stice, Eric; Burton, Emily; Fudell, Molly; Nolen-Hoeksema, Susan

    2009-01-01

    This study sought to provide a more rigorous prospective test of two cognitive vulnerability models of depression with longitudinal data from 496 adolescent girls. Results supported the cognitive vulnerability model in that stressors predicted future increases in depressive symptoms and onset of clinically significant major depression for individuals with a negative attributional style, but not for those with a positive attributional style, although these effects were small. This model appeared to be specific to depression, in that it did not predict future increases in bulimia nervosa or substance abuse symptoms. In contrast, results did not support the integrated cognitive vulnerability self-esteem model that asserts stressors should only predict increased depression for individuals with a confluence of negative attributional style and low self-esteem, and this model did not appear to be specific to depression. PMID:18328873

  12. Study designs for identifying risk compensation behavior among users of biomedical HIV prevention technologies: balancing methodological rigor and research ethics.

    PubMed

    Underhill, Kristen

    2013-10-01

    The growing evidence base for biomedical HIV prevention interventions - such as oral pre-exposure prophylaxis, microbicides, male circumcision, treatment as prevention, and eventually prevention vaccines - has given rise to concerns about the ways in which users of these biomedical products may adjust their HIV risk behaviors based on the perception that they are prevented from infection. Known as risk compensation, this behavioral adjustment draws on the theory of "risk homeostasis," which has previously been applied to phenomena as diverse as Lyme disease vaccination, insurance mandates, and automobile safety. Little rigorous evidence exists to answer risk compensation concerns in the biomedical HIV prevention literature, in part because the field has not systematically evaluated the study designs available for testing these behaviors. The goals of this Commentary are to explain the origins of risk compensation behavior in risk homeostasis theory, to reframe risk compensation as a testable response to the perception of reduced risk, and to assess the methodological rigor and ethical justification of study designs aiming to isolate risk compensation responses. Although the most rigorous methodological designs for assessing risk compensation behavior may be unavailable due to ethical flaws, several strategies can help investigators identify potential risk compensation behavior during Phase II, Phase III, and Phase IV testing of new technologies. Where concerns arise regarding risk compensation behavior, empirical evidence about the incidence, types, and extent of these behavioral changes can illuminate opportunities to better support the users of new HIV prevention strategies. This Commentary concludes by suggesting a new way to conceptualize risk compensation behavior in the HIV prevention context. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Study designs for identifying risk compensation behavior among users of biomedical HIV prevention technologies: Balancing methodological rigor and research ethics

    PubMed Central

    Underhill, Kristen

    2014-01-01

    The growing evidence base for biomedical HIV prevention interventions – such as oral pre-exposure prophylaxis, microbicides, male circumcision, treatment as prevention, and eventually prevention vaccines – has given rise to concerns about the ways in which users of these biomedical products may adjust their HIV risk behaviors based on the perception that they are prevented from infection. Known as risk compensation, this behavioral adjustment draws on the theory of “risk homeostasis,” which has previously been applied to phenomena as diverse as Lyme disease vaccination, insurance mandates, and automobile safety. Little rigorous evidence exists to answer risk compensation concerns in the biomedical HIV prevention literature, in part because the field has not systematically evaluated the study designs available for testing these behaviors. The goals of this Commentary are to explain the origins of risk compensation behavior in risk homeostasis theory, to reframe risk compensation as a testable response to the perception of reduced risk, and to assess the methodological rigor and ethical justification of study designs aiming to isolate risk compensation responses. Although the most rigorous methodological designs for assessing risk compensation behavior may be unavailable due to ethical flaws, several strategies can help investigators identify potential risk compensation behavior during Phase II, Phase III, and Phase IV testing of new technologies. Where concerns arise regarding risk compensation behavior, empirical evidence about the incidence, types, and extent of these behavioral changes can illuminate opportunities to better support the users of new HIV prevention strategies. This Commentary concludes by suggesting a new way to conceptualize risk compensation behavior in the HIV prevention context. PMID:23597916

  14. Bioengineered Temporomandibular Joint Disk Implants: Study Protocol for a Two-Phase Exploratory Randomized Preclinical Pilot Trial in 18 Black Merino Sheep (TEMPOJIMS)

    PubMed Central

    Monje, Florencio Gil; González-García, Raúl; Little, Christopher B; Mónico, Lisete; Pinho, Mário; Santos, Fábio Abade; Carrapiço, Belmira; Gonçalves, Sandra Cavaco; Morouço, Pedro; Alves, Nuno; Moura, Carla; Wang, Yadong; Jeffries, Eric; Gao, Jin; Sousa, Rita; Neto, Lia Lucas; Caldeira, Daniel; Salvado, Francisco

    2017-01-01

    Background Preclinical trials are essential to test efficacious options to substitute the temporomandibular joint (TMJ) disk. The contemporary absence of an ideal treatment for patients with severe TMJ disorders can be related to difficulties concerning the appropriate study design to conduct preclinical trials in the TMJ field. These difficulties can be associated with the use of heterogeneous animal models, the use of the contralateral TMJ as control, the absence of rigorous randomized controlled preclinical trials with blinded outcome assessors, and difficulties involving multidisciplinary teams. Objective This study aims to develop a new, reproducible, and effective study design for preclinical research in the TMJ domain, obtaining rigorous data related to (1) the impact of bilateral discectomy in black Merino sheep, (2) the impact of bilateral discopexy in black Merino sheep, and (3) the impact of three different bioengineered TMJ disks in black Merino sheep. Methods A two-phase exploratory randomized controlled preclinical trial with blinded outcomes is proposed. In the first phase, nine sheep are randomized into three different bilateral surgical procedures: bilateral discectomy, bilateral discopexy, and sham surgery. In the second phase, nine sheep are randomized to bilaterally test three different TMJ bioengineered disk implants. The primary outcome is the histological gradation of the TMJ. Secondary outcomes are imaging changes, absolute masticatory time, ruminant time per cycle, ruminant kinetics, ruminant area, and sheep weight. Results Previous preclinical studies in this field have used the contralateral unoperated side as a control and different animal models ranging from mice to a canine model, with nonrandomized, nonblinded, and uncontrolled study designs and limited outcome measures. The main goal of this exploratory preclinical protocol is to set a new standard for future preclinical trials in oromaxillofacial surgery, particularly in the TMJ field, by proposing a rigorous design in black Merino sheep. The authors also intend to test the feasibility of pilot outcomes. The authors expect to increase the quality of further studies in this field and to progress in future treatment options for patients undergoing surgery for TMJ disk replacement. Conclusions The study has commenced, but it is too early to provide results or conclusions. PMID:28254733

  15. Hypnotherapy and Test Anxiety: Two Cognitive-Behavioral Constructs. The Effects of Hypnosis in Reducing Test Anxiety and Improving Academic Achievement in College Students.

    ERIC Educational Resources Information Center

    Sapp, Marty

    A two-group randomized multivariate analysis of covariance (MANCOVA) was used to investigate the effects of cognitive-behavioral hypnosis in reducing test anxiety and improving academic performance in comparison to a Hawthorne control group. Subjects were enrolled in a rigorous introductory psychology course which covered an entire text in one…

  16. Forecasting volatility with neural regression: a contribution to model adequacy.

    PubMed

    Refenes, A N; Holt, W T

    2001-01-01

    Neural nets' usefulness for forecasting is limited by problems of overfitting and the lack of rigorous procedures for model identification, selection and adequacy testing. This paper describes a methodology for neural model misspecification testing. We introduce a generalization of the Durbin-Watson statistic for neural regression and discuss the general issues of misspecification testing using residual analysis. We derive a generalized influence matrix for neural estimators which enables us to evaluate the distribution of the statistic. We deploy Monte Carlo simulation to compare the power of the test for neural and linear regressors. While residual testing is not a sufficient condition for model adequacy, it is nevertheless a necessary condition to demonstrate that the model is a good approximation to the data generating process, particularly as neural-network estimation procedures are susceptible to partial convergence. The work is also an important step toward developing rigorous procedures for neural model identification, selection and adequacy testing which have started to appear in the literature. We demonstrate its applicability in the nontrivial problem of forecasting implied volatility innovations using high-frequency stock index options. Each step of the model building process is validated using statistical tests to verify variable significance and model adequacy with the results confirming the presence of nonlinear relationships in implied volatility innovations.
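    The classical Durbin-Watson statistic that this paper generalizes can be computed in a few lines; the neural-regression generalization itself is not reproduced here, and the residual series below are synthetic:

```python
import numpy as np

# Minimal sketch of the classical Durbin-Watson statistic on residuals.
# Values near 2 indicate no first-order autocorrelation; values well below 2
# indicate positive autocorrelation, a sign of model misspecification.
def durbin_watson(residuals):
    r = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(r) ** 2) / np.sum(r ** 2)

rng = np.random.default_rng(1)
white = rng.standard_normal(10_000)      # uncorrelated residuals
dw_white = durbin_watson(white)          # close to 2

ar = np.empty_like(white)                # AR(1) residuals, rho = 0.9
ar[0] = white[0]
for i in range(1, ar.size):
    ar[i] = 0.9 * ar[i - 1] + white[i]
dw_ar = durbin_watson(ar)                # well below 2
```

    For an AR(1) residual process the statistic is approximately 2(1 - rho), which is why autocorrelated residuals push it toward zero.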

  17. Single toxin dose-response models revisited

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demidenko, Eugene, E-mail: eugened@dartmouth.edu

    The goal of this paper is to offer a rigorous analysis of the sigmoid-shape single toxin dose-response relationship. The toxin efficacy function is introduced and four special points, including maximum toxin efficacy and inflection points, on the dose-response curve are defined. The special points define three phases of the toxin effect on mortality: (1) toxin concentrations smaller than the first inflection point or (2) larger than the second inflection point imply a low mortality rate, and (3) concentrations between the first and the second inflection points imply a high mortality rate. Probabilistic interpretation and mathematical analysis for each of the four models, Hill, logit, probit, and Weibull, is provided. Two general model extensions are introduced: (1) the multi-target hit model that accounts for the existence of several vital receptors affected by the toxin, and (2) a model with nonzero mortality at zero concentration to account for natural mortality. Special attention is given to statistical estimation in the framework of the generalized linear model with the binomial dependent variable as the mortality count in each experiment, contrary to the widespread nonlinear regression treating the mortality rate as a continuous variable. The models are illustrated using standard EPA Daphnia acute (48 h) toxicity tests with mortality as a function of NiCl or CuSO4 toxin. - Highlights: • The paper offers a rigorous study of a sigmoid dose-response relationship. • The concentration with the highest mortality rate is rigorously defined. • A table with four special points for five mortality curves is presented. • Two new sigmoid dose-response models have been introduced. • The generalized linear model is advocated for estimation of the sigmoid dose-response relationship.
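    The estimation approach the abstract advocates, a generalized linear model with the binomial mortality count as the dependent variable, can be sketched with a logit link fitted by iteratively reweighted least squares; the dose-mortality data below are synthetic, not the paper's Daphnia results:

```python
import numpy as np

# Hedged sketch of a binomial GLM with a logit link, fitted by iteratively
# reweighted least squares (IRLS). Synthetic data, not the paper's.
def fit_logit_glm(log_dose, deaths, n, iters=25):
    X = np.column_stack([np.ones_like(log_dose), log_dose])
    beta = np.zeros(2)
    for _ in range(iters):
        eta = X @ beta
        p = 1.0 / (1.0 + np.exp(-eta))
        w = n * p * (1.0 - p)                              # IRLS weights
        z = eta + (deaths - n * p) / np.maximum(w, 1e-12)  # working response
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * z))
    return beta

log_dose = np.linspace(-1.0, 4.0, 12)                      # log-concentrations
n = np.full(12, 200)                                       # organisms per dose
p_true = 1.0 / (1.0 + np.exp(-(-3.0 + 2.0 * log_dose)))
rng = np.random.default_rng(0)
deaths = rng.binomial(n, p_true)                           # mortality counts

b0, b1 = fit_logit_glm(log_dose, deaths, n)                # near (-3, 2)
```

    Treating the counts as binomial, rather than fitting the mortality rate by nonlinear least squares as if it were continuous, weights each dose by its actual information content.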

  18. Development of rigor mortis is not affected by muscle volume.

    PubMed

    Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H

    2001-04-01

    There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.

  19. Effects of rigor status during high-pressure processing on the physical qualities of farm-raised abalone (Haliotis rufescens).

    PubMed

    Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I

    2015-01-01

    High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone was significantly (P < 0.05) less firm and chewy than pre-rigor processed irrespective of muscle type, processing time, or pressure. L values increased with pressure to 68.9 at 300 MPa for pre-rigor processed foot, 73.8 for post-rigor processed foot, 90.9 for pre-rigor processed adductor, and 89.0 for post-rigor processed adductor. Scanning electron microscopy images showed fraying of collagen fibers in processed adductor, but did not show pressure-induced compaction of the foot myofibrils. Post-rigor processed abalone meat was more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®

  20. Do behavioral scientists really understand HIV-related sexual risk behavior? A systematic review of longitudinal and experimental studies predicting sexual behavior.

    PubMed

    Huebner, David M; Perry, Nicholas S

    2015-10-01

    Behavioral interventions to reduce sexual risk behavior depend on strong health behavior theory. By identifying the psychosocial variables that lead causally to sexual risk, theories provide interventionists with a guide for how to change behavior. However, empirical research is critical to determining whether a particular theory adequately explains sexual risk behavior. A large body of cross-sectional evidence, which has been reviewed elsewhere, supports the notion that certain theory-based constructs (e.g., self-efficacy) are correlates of sexual behavior. However, given the limitations of inferring causality from correlational research, it is essential that we review the evidence from more methodologically rigorous studies (i.e., longitudinal and experimental designs). This systematic review identified 44 longitudinal studies in which investigators attempted to predict sexual risk from psychosocial variables over time. We also found 134 experimental studies (i.e., randomized controlled trials of HIV interventions), but of these only 9 (6.7 %) report the results of mediation analyses that might provide evidence for the validity of health behavior theories in predicting sexual behavior. Results show little convergent support across both types of studies for most traditional, theoretical predictors of sexual behavior. This suggests that the field must expand the body of empirical work that utilizes the most rigorous study designs to test our theoretical assumptions. The inconsistent results of existing research would indicate that current theoretical models of sexual risk behavior are inadequate, and may require expansion or adaptation.

  1. TomoPhantom, a software package to generate 2D-4D analytical phantoms for CT image reconstruction algorithm benchmarks

    NASA Astrophysics Data System (ADS)

    Kazantsev, Daniil; Pickalov, Valery; Nagella, Srikanth; Pasca, Edoardo; Withers, Philip J.

    2018-01-01

    In the field of computerized tomographic imaging, many novel reconstruction techniques are routinely tested using simplistic numerical phantoms, e.g. the well-known Shepp-Logan phantom. These phantoms cannot sufficiently cover the broad spectrum of applications in CT imaging where, for instance, smooth or piecewise-smooth 3D objects are common. TomoPhantom provides quick access to an external library of modular analytical 2D/3D phantoms with temporal extensions. In TomoPhantom, quite complex phantoms can be built using additive combinations of geometrical objects, such as Gaussians, parabolas, cones, ellipses, rectangles, and volumetric extensions of them. The newly designed phantoms are better suited for benchmarking and testing of different image processing techniques. Specifically, tomographic reconstruction algorithms which employ 2D and 3D scanning geometries can be rigorously analyzed using the software. TomoPhantom also provides the capability of obtaining analytical tomographic projections, which further extends the applicability of the software towards more realistic testing, free from the "inverse crime". All core modules of the package are written in the C-OpenMP language, and wrappers for Python and MATLAB are provided to enable easy access. Due to the C-based multi-threaded implementation, volumetric phantoms of high spatial resolution can be obtained with computational efficiency.
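    The additive object model described above can be illustrated with a toy 2D phantom; this is a hedged numpy sketch of the idea, not the actual TomoPhantom API or its phantom library:

```python
import numpy as np

# Toy illustration: a 2D phantom built as an additive combination of two
# modular analytical objects (an ellipse and a Gaussian) on a common grid.
def ellipse(X, Y, cx, cy, a, b, value):
    return value * ((((X - cx) / a) ** 2 + ((Y - cy) / b) ** 2) <= 1.0)

def gaussian(X, Y, cx, cy, sigma, value):
    return value * np.exp(-((X - cx) ** 2 + (Y - cy) ** 2) / (2.0 * sigma ** 2))

N = 256
grid = np.linspace(-1.0, 1.0, N)
X, Y = np.meshgrid(grid, grid)

phantom = ellipse(X, Y, 0.0, 0.0, 0.7, 0.5, 1.0) \
        + gaussian(X, Y, 0.2, 0.1, 0.15, 0.5)
```

    Because each object is analytical, its sinogram can also be written in closed form, which is what allows projections to be generated without rasterizing the phantom first, avoiding the "inverse crime".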

  2. The impact of rigorous mathematical thinking as learning method toward geometry understanding

    NASA Astrophysics Data System (ADS)

    Nugraheni, Z.; Budiyono, B.; Slamet, I.

    2018-05-01

    To reach higher-order thinking skills, conceptual understanding must first be mastered. RMT is a unique realization of the cognitive conceptual construction approach based on Feuerstein's Mediated Learning Experience (MLE) theory and Vygotsky's sociocultural theory. This was a quasi-experimental study comparing an experimental class taught with Rigorous Mathematical Thinking (RMT) as the learning method and a control class taught with Direct Learning (DL), the conventional learning activity. The study examined whether the two learning methods had different effects on the conceptual understanding of junior high school students. The data were analyzed using an independent t-test, which showed a significant difference in mean geometry conceptual understanding between the experimental and control classes. Further, semi-structured interviews revealed that students taught by RMT had a deeper conceptual understanding than students taught in the conventional way. These results indicate that Rigorous Mathematical Thinking (RMT) as a learning method has a positive impact on geometry conceptual understanding.
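    The analysis described above, an independent t-test comparing the two classes, can be sketched with invented scores (these are illustrative numbers, not the study's data):

```python
import numpy as np
from scipy import stats

# Sketch of an independent t-test comparing conceptual-understanding scores
# of an RMT class and a DL control class. Scores are invented for
# illustration; they are not the study's data.
rmt_scores = np.array([78, 82, 85, 80, 77, 84, 81, 79], dtype=float)
dl_scores = np.array([65, 70, 68, 66, 72, 64, 69, 67], dtype=float)

t_stat, p_value = stats.ttest_ind(rmt_scores, dl_scores)
# A small p-value indicates a significant difference in mean scores.
```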

  3. Neurobehavioral testing in subarachnoid hemorrhage: A review of methods and current findings in rodents.

    PubMed

    Turan, Nefize; Miller, Brandon A; Heider, Robert A; Nadeem, Maheen; Sayeed, Iqbal; Stein, Donald G; Pradilla, Gustavo

    2017-11-01

    The most important aspect of a preclinical study seeking to develop a novel therapy for neurological diseases is whether the therapy produces any clinically relevant functional recovery. For this purpose, neurobehavioral tests are commonly used to evaluate the neuroprotective efficacy of treatments in a wide array of cerebrovascular diseases and neurotrauma. Their use, however, has been limited in experimental subarachnoid hemorrhage studies. After several randomized, double-blinded, controlled clinical trials repeatedly failed to produce a benefit in functional outcome despite some improvement in angiographic vasospasm, more rigorous methods of neurobehavioral testing became critical to provide a more comprehensive evaluation of the functional efficacy of proposed treatments. While several subarachnoid hemorrhage studies have incorporated an array of neurobehavioral assays, a standardized methodology has not been agreed upon. Here, we review neurobehavioral tests for rodents and their potential application to subarachnoid hemorrhage studies. Developing a standardized neurobehavioral testing regimen in rodent studies of subarachnoid hemorrhage would allow for better comparison of results between laboratories and a better prediction of what interventions would produce functional benefits in humans.

  4. Do we need methodological theory to do qualitative research?

    PubMed

    Avis, Mark

    2003-09-01

    Positivism is frequently used to stand for the epistemological assumption that empirical science based on principles of verificationism, objectivity, and reproducibility is the foundation of all genuine knowledge. Qualitative researchers sometimes feel obliged to provide methodological alternatives to positivism that recognize their different ethical, ontological, and epistemological commitments and have provided three theories: phenomenology, grounded theory, and ethnography. The author argues that positivism was a doomed attempt to define empirical foundations for knowledge through a rigorous separation of theory and evidence; offers a pragmatic, coherent view of knowledge; and suggests that rigorous, rational empirical investigation does not need methodological theory. Therefore, qualitative methodological theory is unnecessary and counterproductive because it hinders critical reflection on the relation between methodological theory and empirical evidence.

  5. Rigor force responses of permeabilized fibres from fast and slow skeletal muscles of aged rats.

    PubMed

    Plant, D R; Lynch, G S

    2001-09-01

    1. Ageing is generally associated with a decline in skeletal muscle mass and strength and a slowing of muscle contraction, factors that impact upon the quality of life for the elderly. The mechanisms underlying this age-related muscle weakness have not been fully resolved. The purpose of the present study was to determine whether the decrease in muscle force as a consequence of age could be attributed partly to a decrease in the number of cross-bridges participating during contraction. 2. Given that the rigor force is proportional to the approximate total number of interacting sites between the actin and myosin filaments, we tested the null hypothesis that the rigor force of permeabilized muscle fibres from young and old rats would not be different. 3. Permeabilized fibres from the extensor digitorum longus (fast-twitch; EDL) and soleus (predominantly slow-twitch) muscles of young (6 months of age) and old (27 months of age) male F344 rats were activated in Ca2+-buffered solutions to determine force-pCa characteristics (where pCa = -log(10)[Ca2+]) and then in solutions lacking ATP and Ca2+ to determine rigor force levels. 4. The rigor forces for EDL and soleus muscle fibres were not different between young and old rats, indicating that the approximate total number of cross-bridges that can be formed between filaments did not decline with age. We conclude that the age-related decrease in force output is more likely attributed to a decrease in the force per cross-bridge and/or decreases in the efficiency of excitation-contraction coupling.

  6. Inherited Retinal Degenerative Disease Clinical Trial Network

    DTIC Science & Technology

    2012-10-01

    strategies can be designed , tested and adopted as standard care. 2 While repeat evaluation and study of affected patients are vital to rigorously...following document is a summary of our experience and research in testing retinal structure and function in eyes with degenerative retinal diseases...Network PRINCIPAL INVESTIGATOR: Patricia Zilliox, Ph.D. CONTRACTING ORGANIZATION: National Neurovision Research Institute Owings

  7. 77 FR 55698 - National Emission Standards for Hazardous Air Pollutants From the Pulp and Paper Industry

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-11

    ... tests will help to ensure that control systems are maintained properly over time and a more rigorous... approach, industry is expected to save time in the performance test submittal process. Additionally this... pulping vent gas control at mills where the CCA approach would be adversely affected. Our revised cost...

  8. Separate Reading Exams Await Would-Be Elementary Teachers

    ERIC Educational Resources Information Center

    Sawchuk, Stephen

    2012-01-01

    A handful of states are gradually adopting licensing tests that measure aspiring elementary teachers' ability to master aspects of what's arguably their most important task: teaching students to read. In the most recent example of what appears to be a slow but steady push, Wisconsin became the latest state to adopt a rigorous, stand-alone test of…

  9. Integrated model development for liquid fueled rocket propulsion systems

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    1993-01-01

    As detailed in the original statement of work, the objective of phase two of this research effort was to develop a general framework for rocket engine performance prediction that integrates physical principles, a rigorous mathematical formalism, component level test data, system level test data, and theory-observation reconciliation. Specific phase two development tasks are defined.

  10. Next Generation of Leaching Tests

    EPA Science Inventory

    A corresponding abstract has been cleared for this presentation. The four methods comprising the Leaching Environmental Assessment Framework are described along with the tools to support implementation of the more rigorous and accurate source terms that are developed using LEAF ...

  11. A psychometric evaluation of the digital logic concept inventory

    NASA Astrophysics Data System (ADS)

    Herman, Geoffrey L.; Zilles, Craig; Loui, Michael C.

    2014-10-01

    Concept inventories hold tremendous promise for promoting the rigorous evaluation of teaching methods that might remedy common student misconceptions and promote deep learning. The measurements from concept inventories can be trusted only if the concept inventories are evaluated both by expert feedback and statistical scrutiny (psychometric evaluation). Classical Test Theory and Item Response Theory provide two psychometric frameworks for evaluating the quality of assessment tools. We discuss how these theories can be applied to assessment tools generally and then apply them to the Digital Logic Concept Inventory (DLCI). We demonstrate that the DLCI is sufficiently reliable for research purposes when used in its entirety and as a post-course assessment of students' conceptual understanding of digital logic. The DLCI can also discriminate between students across a wide range of ability levels, providing the most information about weaker students' ability levels.
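    One Classical Test Theory reliability check of the kind applied in psychometric evaluations like this can be sketched as Cronbach's alpha on item responses; the dichotomous responses below are synthetic, not DLCI data:

```python
import numpy as np

# Hedged sketch of Cronbach's alpha, a Classical Test Theory measure of
# internal consistency. Item responses are synthetic, not DLCI data.
def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)   # rows: examinees, columns: items
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)

rng = np.random.default_rng(3)
ability = rng.standard_normal((500, 1))                    # latent ability
logits = ability + 0.8 * rng.standard_normal((500, 10))    # 10 noisy items
responses = (rng.random((500, 10)) < 1 / (1 + np.exp(-logits))).astype(float)

alpha = cronbach_alpha(responses)   # internal consistency of the 10 items
```

    Because all ten synthetic items load on the same latent ability, alpha comes out high; items unrelated to the common trait would drive it toward zero.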

  12. EOSlib, Version 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woods, Nathan; Menikoff, Ralph

    2017-02-03

    Equilibrium thermodynamics underpins many of the technologies used throughout theoretical physics, yet verification of the various theoretical models in the open literature remains challenging. EOSlib provides a single, consistent, verifiable implementation of these models, in a single, easy-to-use software package. It consists of three parts: a software library implementing various published equation-of-state (EOS) models; a database of fitting parameters for various materials for these models; and a number of useful utility functions for simplifying thermodynamic calculations such as computing Hugoniot curves or Riemann problem solutions. Ready availability of this library will enable reliable code-to-code testing of equation-of-state implementations, as well as a starting point for more rigorous verification work. EOSlib also provides a single, consistent API for its analytic and tabular EOS models, which simplifies the process of comparing models for a particular application.
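    The Hugoniot-curve utilities the abstract mentions can be illustrated with the simplest analytic model. This is not EOSlib's actual API, just a hedged sketch of the principal Hugoniot for an ideal-gas (gamma-law) EOS derived from the Rankine-Hugoniot jump conditions:

```python
def hugoniot_pressure(v, v0=1.0, p0=1.0, gamma=1.4):
    """Pressure on the principal Hugoniot centered at (v0, p0) for an
    ideal gas with ratio of specific heats gamma; v is specific volume."""
    num = (gamma + 1) * v0 - (gamma - 1) * v
    den = (gamma + 1) * v - (gamma - 1) * v0
    return p0 * num / den

# Pressure rises steeply as the shock compresses the gas:
for v in (0.9, 0.7, 0.5):
    print(f"v/v0 = {v:.1f} -> p/p0 = {hugoniot_pressure(v):.3f}")
```

    At v = v0 the formula reduces to p = p0 (no shock), and pressure diverges as v approaches the ideal-gas compression limit v0(γ−1)/(γ+1), which is the behavior any EOS-library implementation of this curve must reproduce.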

  13. In defense of the classical height system

    NASA Astrophysics Data System (ADS)

    Foroughi, Ismael; Vaníček, Petr; Sheng, Michael; Kingdon, Robert William; Santos, Marcelo C.

    2017-11-01

    In many European countries, normal heights referred to the quasi-geoid as introduced by Molodenskij in the mid-20th century are preferred to the classical height system, which consists of orthometric heights and the geoid as a reference surface for these heights. The rationale for this choice is supposedly that in the classical height system neither the geoid nor the orthometric height can ever be known with centimetre-level accuracy, because one would need to know the topographical mass density to a level that can never be achieved. The aim of this paper is to question the validity of this rationale. The common way of assessing the congruency of a local geoid model and the orthometric heights is to compare the geoid heights with the differences between geodetic heights provided by GNSS and orthometric heights provided by leveling. For testing the congruency of a quasi-geoid model with normal heights, a similar procedure is used, except that normal heights are employed instead of orthometric heights. For the area of Auvergne, France, which is now a more or less standard choice for precise geoid or quasi-geoid testing, only the normal heights are supplied by the Institut Géographique National, the provider of the data. This is clearly a consequence of the European preference for the Molodenskij system. The quality of the height system is to be judged by how closely the geodetic heights minus the geoid/quasi-geoid heights agree with the orthometric/normal heights. To assess the congruency of the classical height system, the Helmert approximation of orthometric heights is typically used, as the transformation between normal and Helmert's heights is easily done. However, the evaluation of the differences between Helmert's and the rigorous orthometric heights is somewhat more involved, as will be seen from the review in this paper.
    For the area of interest, the differences between normal and Helmert's heights at the control leveling points range between -9.5 and 0 cm, and the differences between Helmert's and the rigorous orthometric heights vary between -3.6 and 1.1 cm. The local gravimetric geoid model of Auvergne, computed by the Stokes-Helmert technique, is used here to illustrate the accuracy of the classical height system. Results show a very reasonable standard deviation (STD) of 3.2 cm of the differences between geoid values derived from control leveling points and gravimetric geoid heights when Helmert's heights are employed, and an even smaller STD of 2.9 cm when rigorous orthometric heights are used. A corresponding comparison of a quasi-geoid model, computed by the Least-Squares Modification of Stokes method, with normal heights shows an STD of 3.4 cm.
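    The congruency assessment described above reduces to simple residual statistics at the control benchmarks: compare the gravimetric geoid height N with h − H, where h is the GNSS geodetic height and H the leveled orthometric (or normal) height. The station values below are invented for illustration (not the Auvergne data):

```python
import statistics

# (h: GNSS geodetic height, H: leveled orthometric height,
#  N: gravimetric geoid height), all in metres -- illustrative values.
stations = [
    (451.230, 400.812, 50.395),
    (512.904, 462.445, 50.431),
    (389.551, 339.160, 50.422),
    (477.008, 426.590, 50.390),
]

# Congruency residual at each benchmark: N - (h - H)
residuals = [N - (h - H) for h, H, N in stations]
std = statistics.pstdev(residuals)
print(f"STD of residuals: {std * 100:.1f} cm")
```

    The STD of these residuals is the figure of merit quoted in the abstract (3.2 cm, 2.9 cm, 3.4 cm for the three height choices).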

  14. Preserving pre-rigor meat functionality for beef patty production.

    PubMed

    Claus, J R; Sørheim, O

    2006-06-01

    Three methods were examined for preserving pre-rigor meat functionality in beef patties. Hot-boned semimembranosus muscles were processed as follows: (1) pre-rigor ground, salted, patties immediately cooked; (2) pre-rigor ground, salted and stored overnight; (3) pre-rigor injected with brine; and (4) post-rigor ground and salted. Raw patties contained 60% lean beef, 19.7% beef fat trim, 1.7% NaCl, 3.6% starch, and 15% water. Pre-rigor processing occurred at 3-3.5h postmortem. Patties made from pre-rigor ground meat had higher pH values; greater protein solubility; firmer, more cohesive, and chewier texture; and substantially lower cooking losses than the other treatments. Addition of salt was sufficient to reduce the rate and extent of glycolysis. Brine injection of intact pre-rigor muscles resulted in some preservation of the functional properties but not as pronounced as with salt addition to pre-rigor ground meat.

  15. Practical Work-up and Management of Recurrent Pregnancy Loss for the Front-Line Clinician.

    PubMed

    Branch, D Ware; Silver, Robert M

    2016-09-01

    Only a few so-called etiologies of recurrent pregnancy loss in otherwise healthy women are adequately supported by well-designed investigations of association. The majority of proposed "treatments" have not been subjected to rigorous trials. The American Board of Internal Medicine Choosing Wisely initiative urges providers and patients to have constructive dialog aimed at choosing health care that is supported by evidence, not duplicative of other tests or procedures already received, free from harm, and truly necessary. We support the refreshing, objective frankness promoted by this campaign. A version of the Choosing Wisely "Do" and "Don't" format for recurrent pregnancy loss is presented.

  16. An assessment of clinical chemical sensing technology for potential use in space station health maintenance facility

    NASA Technical Reports Server (NTRS)

    1987-01-01

    A Health Maintenance Facility that will provide capabilities equivalent to those found on Earth is currently under development for space station application. This final report addresses the study of alternate means of diagnosis and evaluation of impaired tissue perfusion in a microgravity environment. Chemical data variables related to the dysfunction and the sensors required to measure these variables are reviewed. A technology survey outlines the ability of existing systems to meet these requirements. The rigorous testing to which the candidate sensing system was subjected to determine its suitability is then explored. Recommendations for follow-on activities are included that would make the commercial system more appropriate for space station applications.

  17. Stem cell stratagems in alternative medicine.

    PubMed

    Sipp, Douglas

    2011-05-01

    Stem cell research has attracted an extraordinary amount of attention and expectation due to its potential for applications in the treatment of numerous medical conditions. These exciting clinical prospects have generated widespread support from both the public and private sectors, and numerous preclinical studies and rigorous clinical trials have already been initiated. Recent years, however, have also seen alarming growth in the number and variety of claims of clinical uses of notional 'stem cells' that have not been adequately tested for safety and/or efficacy. In this article, I will survey the contours of the stem cell industry as practiced by alternative medicine providers, and highlight points of commonality in their strategies for marketing.

  18. How to Practice Sports Cardiology: A Cardiology Perspective.

    PubMed

    Lawless, Christine E

    2015-07-01

    The rigorous cardiovascular (CV) demands of sport, combined with training-related cardiac adaptations, render the athlete a truly unique CV patient and sports cardiology a truly unique discipline. Cardiologists are advised to adopt a systematic approach to the CV evaluation of athletes, taking into consideration the individual sports culture, sports-specific CV demands, CV adaptations and their appearance on cardiac testing, any existing or potential interaction of the heart with the internal and external sports environment, short- and long-term CV risks, and potential effect of performance-enhancing agents and antidoping regulations. This article outlines the systematic approach, provides a detailed example, and outlines contemporary sports cardiology core competencies. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Military Ecological Risk Assessment Framework (MERAF) for Assessment of Risks of Military Training and Testing to Natural Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suter II, G.W.

    2003-06-18

    The objective of this research is to provide the DoD with a framework based on a systematic, risk-based approach to assess impacts for management of natural resources in an ecosystem context. This risk assessment framework is consistent with, but extends beyond, the EPA's ecological risk assessment framework, and specifically addresses DoD activities and management needs. MERAF is intended to be consistent with existing procedures for environmental assessment and planning with DoD testing and training. The intention is to supplement these procedures rather than creating new procedural requirements. MERAF is suitable for use for training and testing area assessment and management. It does not include human health risks nor does it address specific permitting or compliance requirements, although it may be useful in some of these cases. Use of MERAF fits into the National Environmental Policy Act (NEPA) process by providing a consistent and rigorous way of organizing and conducting the technical analysis for Environmental Impact Statements (EISs) (Sigal 1993; Carpenter 1995; Canter and Sadler 1997). It neither conflicts with, nor replaces, procedural requirements within the NEPA process or document management processes already in place within DoD.

  20. Extracting the distribution of laser damage precursors on fused silica surfaces for 351 nm, 3 ns laser pulses at high fluences (20-150 J/cm2).

    PubMed

    Laurence, Ted A; Bude, Jeff D; Ly, Sonny; Shen, Nan; Feit, Michael D

    2012-05-07

    Surface laser damage limits the lifetime of optics for systems guiding high fluence pulses, particularly damage in silica optics used for inertial confinement fusion-class lasers (nanosecond-scale high energy pulses at 355 nm/3.5 eV). The density of damage precursors at low fluence has been measured using large beams (1-3 cm); higher fluences cannot be measured easily since the high density of resulting damage initiation sites results in clustering. We developed automated experiments and analysis that allow us to damage test thousands of sites with small beams (10-30 µm), and automatically image the test sites to determine if laser damage occurred. We developed an analysis method that provides a rigorous connection between these small beam damage test results of damage probability versus laser pulse energy and the large beam damage results of damage precursor densities versus fluence. We find that for uncoated and coated fused silica samples, the distribution of precursors nearly flattens at very high fluences, up to 150 J/cm2, providing important constraints on the physical distribution and nature of these precursors.
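    The connection between small-beam damage probability and precursor density is commonly made through Poisson statistics: if precursors that initiate at or below a given fluence have areal density ρ, a beam of effective area A damages with probability P = 1 − exp(−ρA). A minimal sketch of that inversion, with an assumed illustrative spot size (the specific numbers are not from the paper):

```python
import math

# Assumed effective area of a ~15 µm diameter test beam (illustrative only).
beam_area_cm2 = math.pi * (15e-4 / 2) ** 2

def precursor_density(p_damage, area=beam_area_cm2):
    """Invert P = 1 - exp(-rho * A) for the areal precursor density rho."""
    return -math.log(1.0 - p_damage) / area

for p in (0.1, 0.5, 0.9):
    print(f"P = {p:.1f} -> rho ~ {precursor_density(p):.3g} per cm^2")
```

    With a micrometre-scale spot, even a 50% damage probability corresponds to a precursor density of order 10^5 per cm², which is why small beams can probe the high-fluence regime where large beams would produce clustered, uncountable damage sites.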

  1. Ontogenetic scaling of metabolism, growth, and assimilation: testing metabolic scaling theory with Manduca sexta larvae.

    PubMed

    Sears, Katie E; Kerkhoff, Andrew J; Messerman, Arianne; Itagaki, Haruhiko

    2012-01-01

    Metabolism, growth, and the assimilation of energy and materials are essential processes that are intricately related and depend heavily on animal size. However, models that relate the ontogenetic scaling of energy assimilation and metabolism to growth rely on assumptions that have yet to be rigorously tested. Based on detailed daily measurements of metabolism, growth, and assimilation in tobacco hornworms, Manduca sexta, we provide a first experimental test of the core assumptions of a metabolic scaling model of ontogenetic growth. Metabolic scaling parameters changed over development, in violation of the model assumptions. At the same time, the scaling of growth rate matches that of metabolic rate, with similar scaling exponents both across and within developmental instars. Rates of assimilation were much higher than expected during the first two instars and did not match the patterns of scaling of growth and metabolism, which suggests high costs of biosynthesis early in development. The rapid increase in size and discrete instars observed in larval insect development provide an ideal system for understanding how patterns of growth and metabolism emerge from fundamental cellular processes and the exchange of materials and energy between an organism and its environment.

  2. Accelerating Biomedical Discoveries through Rigor and Transparency.

    PubMed

    Hewitt, Judith A; Brown, Liliana L; Murphy, Stephanie J; Grieder, Franziska; Silberberg, Shai D

    2017-07-01

    Difficulties in reproducing published research findings have garnered a lot of press in recent years. As a funder of biomedical research, the National Institutes of Health (NIH) has taken measures to address underlying causes of low reproducibility. Extensive deliberations resulted in a policy, released in 2015, to enhance reproducibility through rigor and transparency. We briefly explain what led to the policy, describe its elements, provide examples and resources for the biomedical research community, and discuss the potential impact of the policy on translatability with a focus on research using animal models. Importantly, while increased attention to rigor and transparency may lead to an increase in the number of laboratory animals used in the near term, it will lead to more efficient and productive use of such resources in the long run. The translational value of animal studies will be improved through more rigorous assessment of experimental variables and data, leading to better assessments of the translational potential of animal models, for the benefit of the research community and society. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  3. Rigorous analysis of an electric-field-driven liquid crystal lens for 3D displays

    NASA Astrophysics Data System (ADS)

    Kim, Bong-Sik; Lee, Seung-Chul; Park, Woo-Sang

    2014-08-01

    We numerically analyzed the optical performance of an electric-field-driven liquid crystal (ELC) lens adopted for 3-dimensional liquid crystal displays (3D-LCDs) through rigorous ray tracing. For the calculation, we first obtain the director distribution profile of the liquid crystals by using the Ericksen-Leslie equation of motion; then, we calculate the transmission of light through the ELC lens by using the extended Jones matrix method. The simulation was carried out for a 9-view 3D-LCD with a diagonal of 17.1 inches, where the ELC lens was slanted to achieve natural stereoscopic images. The results show that each view exists separately according to the viewing position at an optimum viewing distance of 80 cm. In addition, our simulation results provide a quantitative explanation for the ghost or blurred images between views observed from a 3D-LCD with an ELC lens. The numerical simulations are also shown to be in good agreement with the experimental results. The present simulation method is expected to provide optimum design conditions for obtaining natural 3D images by rigorously analyzing the optical functionalities of an ELC lens.

  4. RS-25 Rocket Engine Test

    NASA Image and Video Library

    2017-08-09

    The 8.5-minute test conducted at NASA’s Stennis Space Center is part of a series of tests designed to put the upgraded former space shuttle engines through the rigorous temperature and pressure conditions they will experience during a launch. The tests also support the development of a new controller, or “brain,” for the engine, which monitors engine status and communicates between the rocket and the engine, relaying commands to the engine and transmitting data back to the rocket.

  5. RS 25 Hot Fire test

    NASA Image and Video Library

    2016-08-18

    The 7.5-minute test conducted at NASA’s Stennis Space Center is part of a series of tests designed to put the upgraded former space shuttle engines through the rigorous temperature and pressure conditions they will experience during a launch. The tests also support the development of a new controller, or “brain,” for the engine, which monitors engine status and communicates between the rocket and the engine, relaying commands to the engine and transmitting data back to the rocket.

  6. RS-25 Hot Fire test

    NASA Image and Video Library

    2016-08-18

    The 7.5-minute test conducted at NASA’s Stennis Space Center is part of a series of tests designed to put the upgraded former space shuttle engines through the rigorous temperature and pressure conditions they will experience during a launch. The tests also support the development of a new controller, or “brain,” for the engine, which monitors engine status and communicates between the rocket and the engine, relaying commands to the engine and transmitting data back to the rocket.

  7. Valuing goodwill: not-for-profits prepare for annual impairment testing.

    PubMed

    Heuer, Christian; Travers, Mary Ann K

    2011-02-01

    Accounting standards for valuing goodwill and intangible assets are becoming more rigorous for not-for-profit organizations: Not-for-profit healthcare organizations need to test for goodwill impairment at least annually. Impairment testing is a two-stage process: initial analysis to determine whether impairment exists and subsequent calculation of the magnitude of impairment. Certain "triggering" events compel all organizations--whether for-profit or not-for-profit--to perform an impairment test for goodwill or intangible assets.
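    The two-stage process described above can be sketched with hypothetical figures (all dollar amounts below are invented for the example):

```python
# Stage 1: compare the reporting unit's fair value with its carrying value.
# Stage 2: if impaired, measure the loss as carrying goodwill minus the
# implied fair value of goodwill from a hypothetical purchase-price allocation.
carrying_value = 12_000_000      # reporting unit, including goodwill
fair_value = 10_500_000          # stage 1: appraised fair value of the unit
goodwill_carrying = 3_000_000
implied_goodwill = 1_800_000     # stage 2: implied fair value of goodwill

if fair_value < carrying_value:  # stage 1 indicates impairment -> stage 2
    impairment = max(goodwill_carrying - implied_goodwill, 0)
else:
    impairment = 0
print(f"Goodwill impairment charge: ${impairment:,}")  # prints "Goodwill impairment charge: $1,200,000"
```

    If the stage 1 comparison had passed (fair value at or above carrying value), no stage 2 calculation would be performed and no charge recorded.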

  8. Assessing significance in a Markov chain without mixing.

    PubMed

    Chikina, Maria; Frieze, Alan; Pegden, Wesley

    2017-03-14

    We present a statistical test to detect that a presented state of a reversible Markov chain was not chosen from a stationary distribution. In particular, given a value function for the states of the Markov chain, we would like to show rigorously that the presented state is an outlier with respect to the values, by establishing a p value under the null hypothesis that it was chosen from a stationary distribution of the chain. A simple heuristic used in practice is to sample ranks of states from long random trajectories on the Markov chain and compare these with the rank of the presented state; if the presented state is a 0.1% outlier compared with the sampled ranks (its rank is in the bottom 0.1% of sampled ranks), then this observation should correspond to a p value of 0.001. This significance is not rigorous, however, without good bounds on the mixing time of the Markov chain. Our test is the following: Given the presented state in the Markov chain, take a random walk from the presented state for any number of steps. We prove that observing that the presented state is an ε-outlier on the walk is significant at p = √(2ε) under the null hypothesis that the state was chosen from a stationary distribution. We assume nothing about the Markov chain beyond reversibility and show that significance at p ≈ √ε is best possible in general. We illustrate the use of our test with a potential application to the rigorous detection of gerrymandering in Congressional districting.

  9. Assessing significance in a Markov chain without mixing

    PubMed Central

    Chikina, Maria; Frieze, Alan; Pegden, Wesley

    2017-01-01

    We present a statistical test to detect that a presented state of a reversible Markov chain was not chosen from a stationary distribution. In particular, given a value function for the states of the Markov chain, we would like to show rigorously that the presented state is an outlier with respect to the values, by establishing a p value under the null hypothesis that it was chosen from a stationary distribution of the chain. A simple heuristic used in practice is to sample ranks of states from long random trajectories on the Markov chain and compare these with the rank of the presented state; if the presented state is a 0.1% outlier compared with the sampled ranks (its rank is in the bottom 0.1% of sampled ranks), then this observation should correspond to a p value of 0.001. This significance is not rigorous, however, without good bounds on the mixing time of the Markov chain. Our test is the following: Given the presented state in the Markov chain, take a random walk from the presented state for any number of steps. We prove that observing that the presented state is an ε-outlier on the walk is significant at p = √(2ε) under the null hypothesis that the state was chosen from a stationary distribution. We assume nothing about the Markov chain beyond reversibility and show that significance at p ≈ √ε is best possible in general. We illustrate the use of our test with a potential application to the rigorous detection of gerrymandering in Congressional districting. PMID:28246331
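    The mechanics of the test can be sketched on a toy reversible chain, a lazy random walk on a path graph, whose stationary distribution is uniform. The chain, value function, walk length, and presented state are all illustrative choices, not the paper's gerrymandering application:

```python
import math
import random

random.seed(0)

n = 50  # states 0 .. n-1 on a path graph

def step(s):
    """Lazy symmetric random walk; reversible w.r.t. the uniform distribution."""
    r = random.random()
    if r < 0.5:
        return s                        # hold with probability 1/2
    t = s + (1 if r < 0.75 else -1)     # otherwise move +1 or -1
    return min(max(t, 0), n - 1)        # rejected moves at the ends stay put

def value(s):
    return s                            # toy value function on states

presented = 2                           # hypothetical "presented" state
walk = [presented]
for _ in range(100_000):
    walk.append(step(walk[-1]))

# epsilon: fraction of visited states whose value is as low as the presented one
eps = sum(value(s) <= value(presented) for s in walk) / len(walk)
p_bound = math.sqrt(2 * eps)            # significance bound from the theorem
print(f"epsilon = {eps:.4f}, significant at p <= {p_bound:.3f}")
```

    The point of the theorem is that no estimate of the chain's mixing time is needed: the walk can be of any length, and the observed outlier fraction ε alone yields a rigorous significance bound.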

  10. Use of laboratory tests for immune biomarkers in environmental health studies concerned with exposure to indoor air pollutants.

    PubMed Central

    Vogt, R F

    1991-01-01

    The immune system is likely to be involved in some of the health effects caused by certain indoor air exposures, and immune biomarkers can help determine which exposures and health effects have important immune components. However, the lack of standardized laboratory tests for most human immune markers and the many confounding variables that can influence them make interpretation of results for exposure and disease end points uncertain. This paper presents an overview of the immune system and the considerations involved in using tests for immune markers in clinical epidemiology studies, particularly those concerned with indoor air exposures. Careful study design, well-characterized laboratory methods, and rigorous documentation of exposure status are required to determine the predictive value of such tests. Clinical tests currently available for some immune markers could help identify and characterize both irritative and hypersensitivity reactions to indoor air pollutants. Newer tests developed in research settings might provide more incisive indicators of immune status that could help identify exposure, susceptibility, or preclinical disease states, but their methodologies must be refined and tested in multicenter studies before they can be used reliably in public health applications. PMID:1821385

  11. Validation study and routine control monitoring of moist heat sterilization procedures.

    PubMed

    Shintani, Hideharu

    2012-06-01

    The proposed approach to validation of steam sterilization in autoclaves follows the basic life-cycle concepts applicable to all validation programs: understand the function of the sterilization process, develop and understand the cycles that carry out the process, and define a suitable test or series of tests to confirm that the function of the process is suitably ensured by the structure provided. Sterilization of product, and of components and parts that come into direct contact with sterilized product, is the most critical of pharmaceutical processes. Consequently, this process requires a most rigorous and detailed approach to validation. An understanding of the process requires a basic understanding of microbial death, the parameters that facilitate that death, the accepted definition of sterility, and the relationship between the definition and sterilization parameters. Autoclaves and support systems need to be designed, installed, and qualified in a manner that ensures their continued reliability. Lastly, the test program must be complete and definitive. In this paper, in addition to the validation study, the documentation of IQ, OQ, and PQ is described concretely.

  12. Digital Hadron Calorimetry

    NASA Astrophysics Data System (ADS)

    Bilki, Burak

    2018-03-01

    The Particle Flow Algorithms attempt to measure each particle in a hadronic jet individually, using the detector providing the best energy/momentum resolution. Therefore, the spatial segmentation of the calorimeter plays a crucial role. In this context, the CALICE Collaboration developed the Digital Hadron Calorimeter. The Digital Hadron Calorimeter uses Resistive Plate Chambers as active media and has a 1-bit resolution (digital) readout of 1 × 1 cm2 pads. The calorimeter was tested with steel and tungsten absorber structures, as well as with no absorber structure, at the Fermilab and CERN test beam facilities over several years. In addition to conventional calorimetric measurements, the Digital Hadron Calorimeter offers detailed measurements of event shapes, rigorous tests of simulation models and various tools for improved performance due to its very high spatial granularity. Here we report on the results from the analysis of pion and positron events. Results of comparisons with the Monte Carlo simulations are also discussed. The analysis demonstrates the unique utilization of detailed event topologies.

  13. Simulation of Watts Bar Unit 1 Initial Startup Tests with Continuous Energy Monte Carlo Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Godfrey, Andrew T; Gehin, Jess C; Bekar, Kursat B

    2014-01-01

    The Consortium for Advanced Simulation of Light Water Reactors* is developing a collection of methods and software products known as VERA, the Virtual Environment for Reactor Applications. One component of the testing and validation plan for VERA is comparison of neutronics results to a set of continuous energy Monte Carlo solutions for a range of pressurized water reactor geometries using the SCALE component KENO-VI developed by Oak Ridge National Laboratory. Recent improvements in data, methods, and parallelism have enabled KENO, previously utilized predominately as a criticality safety code, to demonstrate excellent capability and performance for reactor physics applications. The highly detailed and rigorous KENO solutions provide a reliable numeric reference for VERA neutronics and also demonstrate the most accurate predictions achievable by modeling and simulation tools for comparison to operating plant data. This paper demonstrates the performance of KENO-VI for the Watts Bar Unit 1 Cycle 1 zero power physics tests, including reactor criticality, control rod worths, and isothermal temperature coefficients.

  14. Special Blood Donation Procedures

    MedlinePlus

    ... blood. For example, in the weeks before undergoing elective surgery, a person may donate several units of blood ... rigorous donor screening and testing. In addition, elderly patients may not tolerate donating blood before surgery because they are more likely to have side ...

  15. Development of a health-related website for parents of children receiving hematopoietic stem cell transplant: HSCT-CHESS.

    PubMed

    Mayer, Deborah K; Ratichek, S; Berhe, H; Stewart, S; McTavish, F; Gustafson, D; Parsons, S K

    2010-03-01

    Parents of pediatric hematopoietic stem cell transplant (HSCT) recipients play a pivotal role in the care of their child during and after transplant. In addition to comforting their child, parents serve as care coordinators and conduits of communication between various health care providers, family, and community members. The stress on the parent and family is enormous during this process, which for many is compounded by geographic dislocation to accompany their child during the rigorous treatment and recovery process. For many parents, their own recovery spans months to years. Parental activation, a process of becoming informed in order to participate in decisions, collaborate with health care providers, and manage care, provided the conceptual framework for developing an eHealth approach for this population. HSCT-CHESS was developed based on previous success with an existing eHealth system of integrated services, the Comprehensive Health Enhancement Support System (CHESS). CHESS(TM) is designed to help individuals and families cope with a health crisis or medical concern. The iterative user-centered development process for HSCT-CHESS included parents of HSCT recipients, representatives from an HSCT Advocacy Group, and members of the clinical, research, development, and design teams. This rigorous process, including online focus groups and surveys, use of a parental user group, and an editorial and development process, is described. As the population of cancer survivors and caregivers increases and the oncology workforce becomes more stretched, developing eHealth applications may be one approach to addressing many of caregivers' unmet needs. The purpose of describing this process is to help others considering such an endeavor. HSCT-CHESS is now being tested in a randomized controlled trial against standard care to evaluate its impact on the quality of life of both the parent and the child HSCT recipient.

  16. MODFLOW–LGR—Documentation of ghost node local grid refinement (LGR2) for multiple areas and the boundary flow and head (BFH2) package

    USGS Publications Warehouse

    Mehl, Steffen W.; Hill, Mary C.

    2013-01-01

    This report documents the addition of ghost node Local Grid Refinement (LGR2) to MODFLOW-2005, the U.S. Geological Survey modular, transient, three-dimensional, finite-difference groundwater flow model. LGR2 provides the capability to simulate groundwater flow using multiple block-shaped higher-resolution local grids (a child model) within a coarser-grid parent model. LGR2 accomplishes this by iteratively coupling separate MODFLOW-2005 models such that heads and fluxes are balanced across the grid-refinement interface boundary. LGR2 can be used in two- and three-dimensional, steady-state and transient simulations and for simulations of confined and unconfined groundwater systems. Traditional one-way coupled telescopic mesh refinement methods can have large, often undetected, inconsistencies in heads and fluxes across the interface between two model grids. The iteratively coupled ghost-node method of LGR2 provides a more rigorous coupling in which the solution accuracy is controlled by convergence criteria defined by the user. In realistic problems, this can result in substantially more accurate solutions and require an increase in computer processing time. The rigorous coupling enables sensitivity analysis, parameter estimation, and uncertainty analysis that reflects conditions in both model grids. This report describes the method used by LGR2, evaluates accuracy and performance for two- and three-dimensional test cases, provides input instructions, and lists selected input and output files for an example problem. It also presents the Boundary Flow and Head (BFH2) Package, which allows the child and parent models to be simulated independently using the boundary conditions obtained through the iterative process of LGR2.

  17. MODFLOW-2005, the U.S. Geological Survey modular ground-water model - documentation of shared node local grid refinement (LGR) and the boundary flow and head (BFH) package

    USGS Publications Warehouse

    Mehl, Steffen W.; Hill, Mary C.

    2006-01-01

    This report documents the addition of shared node Local Grid Refinement (LGR) to MODFLOW-2005, the U.S. Geological Survey modular, transient, three-dimensional, finite-difference ground-water flow model. LGR provides the capability to simulate ground-water flow using one block-shaped higher-resolution local grid (a child model) within a coarser-grid parent model. LGR accomplishes this by iteratively coupling two separate MODFLOW-2005 models such that heads and fluxes are balanced across the shared interfacing boundary. LGR can be used in two- and three-dimensional, steady-state and transient simulations and for simulations of confined and unconfined ground-water systems. Traditional one-way coupled telescopic mesh refinement (TMR) methods can have large, often undetected, inconsistencies in heads and fluxes across the interface between two model grids. The iteratively coupled shared-node method of LGR provides a more rigorous coupling in which the solution accuracy is controlled by convergence criteria defined by the user. In realistic problems, this can result in substantially more accurate solutions, at the cost of increased computer processing time. The rigorous coupling enables sensitivity analysis, parameter estimation, and uncertainty analysis that reflects conditions in both model grids. This report describes the method used by LGR, evaluates LGR accuracy and performance for two- and three-dimensional test cases, provides input instructions, and lists selected input and output files for an example problem. It also presents the Boundary Flow and Head (BFH) Package, which allows the child and parent models to be simulated independently using the boundary conditions obtained through the iterative process of LGR.

  18. Integrative medicine or infiltrative pseudoscience?

    PubMed

    Li, Ben; Forbes, Thomas L; Byrne, John

    2018-01-02

    Evidence-based medicine, first described in 1992, offers a clear, systematic, and scientific approach to the practice of medicine. Recently, the non-evidence-based practice of complementary and alternative medicine (CAM) has been increasing in the United States and around the world, particularly at medical institutions known for providing rigorous evidence-based care. The use of CAM may cause harm to patients through interactions with evidence-based medications or if patients choose to forego evidence-based care. CAM may also put financial strain on patients as most CAM expenditures are paid out-of-pocket. Despite these drawbacks, patients continue to use CAM due to media promotion of CAM therapies, dissatisfaction with conventional healthcare, and a desire for more holistic care. Given the increasing demand for CAM, many medical institutions now offer CAM services. Recently, there has been controversy surrounding the leaders of several CAM centres based at a highly respected academic medical institution, as they publicly expressed anti-vaccination views. These controversies demonstrate the non-evidence-based philosophies that run deep within CAM that are contrary to the evidence-based care that academic medical institutions should provide. Although there are financial incentives for institutions to provide CAM, it is important to recognize that this legitimizes CAM and may cause harm to patients. The poor regulation of CAM allows for the continued distribution of products and services that have not been rigorously tested for safety and efficacy. Governments in Australia and England have successfully improved regulation of CAM and can serve as a model to other countries. Copyright © 2017 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.

  19. The "Performance of Rotavirus and Oral Polio Vaccines in Developing Countries" (PROVIDE) study: description of methods of an interventional study designed to explore complex biologic problems.

    PubMed

    Kirkpatrick, Beth D; Colgate, E Ross; Mychaleckyj, Josyf C; Haque, Rashidul; Dickson, Dorothy M; Carmolli, Marya P; Nayak, Uma; Taniuchi, Mami; Naylor, Caitlin; Qadri, Firdausi; Ma, Jennie Z; Alam, Masud; Walsh, Mary Claire; Diehl, Sean A; Petri, William A

    2015-04-01

    Oral vaccines appear less effective in children in the developing world. Proposed biologic reasons include concurrent enteric infections, malnutrition, breast milk interference, and environmental enteropathy (EE). Rigorous study design and careful data management are essential to begin to understand this complex problem while assuring research subject safety. Herein, we describe the methodology and lessons learned in the PROVIDE study (Dhaka, Bangladesh). A randomized clinical trial platform evaluated the efficacy of delayed-dose oral rotavirus vaccine as well as the benefit of an injectable polio vaccine replacing one dose of oral polio vaccine. This rigorous infrastructure supported the additional examination of hypotheses of vaccine underperformance. Primary and secondary efficacy and immunogenicity measures for rotavirus and polio vaccines were measured, as well as the impact of EE and additional exploratory variables. Methods for the enrollment and 2-year follow-up of a 700 child birth cohort are described, including core laboratory, safety, regulatory, and data management practices. Intense efforts to standardize clinical, laboratory, and data management procedures in a developing world setting provide clinical trials rigor to all outcomes. Although this study infrastructure requires extensive time and effort, it allows optimized safety and confidence in the validity of data gathered in complex, developing country settings. © The American Society of Tropical Medicine and Hygiene.

  20. Nine Criteria for a Measure of Scientific Output

    PubMed Central

    Kreiman, Gabriel; Maunsell, John H. R.

    2011-01-01

    Scientific research produces new knowledge, technologies, and clinical treatments that can lead to enormous returns. Often, the path from basic research to new paradigms and direct impact on society takes time. Precise quantification of scientific output in the short term is not an easy task but is critical for evaluating scientists, laboratories, departments, and institutions. While there have been attempts to quantify scientific output, we argue that current methods are not ideal and suffer from solvable difficulties. Here we propose criteria that a metric should have to be considered a good index of scientific output. Specifically, we argue that such an index should be quantitative, based on robust data, rapidly updated and retrospective, presented with confidence intervals, normalized by number of contributors, career stage and discipline, impractical to manipulate, and focused on quality over quantity. Such an index should be validated through empirical testing. The purpose of quantitatively evaluating scientific output is not to replace careful, rigorous review by experts but rather to complement those efforts. Because it has the potential to greatly influence the efficiency of scientific research, we have a duty to reflect upon and implement novel and rigorous ways of evaluating scientific output. The criteria proposed here provide initial steps toward the systematic development and validation of a metric to evaluate scientific output. PMID:22102840

  1. Information flow and causality as rigorous notions ab initio

    NASA Astrophysics Data System (ADS)

    Liang, X. San

    2016-11-01

    Information flow, or information transfer, the widely applicable general physics notion, can be rigorously derived from first principles, rather than axiomatically proposed as an ansatz. Its logical association with causality is firmly rooted in the dynamical system that lies beneath. The principle of nil causality, which reads that an event is not causal to another if the evolution of the latter is independent of the former, and which transfer entropy analysis and the Granger causality test fail to verify in many situations, turns out to be a proven theorem here. Established in this study are the information flows among the components of time-discrete mappings and time-continuous dynamical systems, both deterministic and stochastic. They have been obtained explicitly in closed form and put to applications with benchmark systems such as the Kaplan-Yorke map, the Rössler system, the baker transformation, the Hénon map, and a stochastic potential flow. Besides unraveling the causal relations expected from the respective systems, some of the applications show that the information flow structure underlying a complex trajectory pattern can be tractable. For linear systems, the resulting remarkably concise formula asserts analytically that causation implies correlation, while correlation does not imply causation, providing a mathematical basis for the long-standing philosophical debate over causation versus correlation.
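
The closing claim for linear systems (causation implies correlation, but correlation does not imply causation) can be illustrated numerically. The sketch below is a toy simulation invented for this note, not Liang's information-flow formalism: a hidden common driver z makes x and y strongly correlated even though neither causes the other.

```python
import random

def corr(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

random.seed(42)
n = 5000
# Hidden common driver z causes both x and y; x and y never interact.
z = [random.gauss(0.0, 1.0) for _ in range(n)]
x = [zi + random.gauss(0.0, 0.5) for zi in z]
y = [zi + random.gauss(0.0, 0.5) for zi in z]

# Correlation without causation: expected value is 1/(1 + 0.25) = 0.8.
print(round(corr(x, y), 3))
```

A rigorous information-flow analysis such as Liang's would assign zero flow between x and y here despite the large correlation; the toy only shows why a correlation measure alone cannot settle causality.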

  2. Benchmarking of density functionals for a soft but accurate prediction and assignment of 1H and 13C NMR chemical shifts in organic and biological molecules.

    PubMed

    Benassi, Enrico

    2017-01-15

    A number of programs and tools that simulate 1H and 13C nuclear magnetic resonance (NMR) chemical shifts using empirical approaches are available. These tools are user-friendly, but they provide a very rough (and sometimes misleading) estimation of the NMR properties, especially for complex systems. Rigorous and reliable ways to predict and interpret NMR properties of simple and complex systems are available in many popular computational program packages. Nevertheless, experimentalists keep relying on these "unreliable" tools in their daily work because, to have a sufficiently high accuracy, these rigorous quantum mechanical methods need high levels of theory. An alternative, efficient, semi-empirical approach has been proposed by Bally, Rablen, Tantillo, and coworkers. This idea consists of creating linear calibration models, on the basis of the application of different combinations of functionals and basis sets. Following this approach, the predictive capability of a wider range of popular functionals was systematically investigated and tested. The NMR chemical shifts were computed in solvated phase at density functional theory level, using 30 different functionals coupled with three different triple-ζ basis sets. © 2016 Wiley Periodicals, Inc.
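
The linear-calibration approach attributed to Bally, Rablen, Tantillo, and coworkers amounts to regressing experimental shifts on computed ones and using the fitted line to scale new predictions. A minimal sketch follows; the shift values are hypothetical placeholders, not data from the paper.

```python
# Linear calibration of computed chemical shifts against experiment:
# fit delta_exp ~= a * delta_calc + b by least squares, then scale predictions.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

calc = [128.7, 77.2, 55.9, 21.3]   # computed 13C shifts (ppm), hypothetical
expt = [128.4, 77.0, 55.3, 21.1]   # experimental shifts (ppm), hypothetical

a, b = linear_fit(calc, expt)
scaled = [a * d + b for d in calc]  # calibrated predictions
```

In practice the slope and intercept are tabulated per functional/basis-set combination, so the calibration absorbs much of the systematic error of a given level of theory.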

  3. Using cancer to make cellular reproduction rigorous and relevant

    NASA Astrophysics Data System (ADS)

    Duncan, Cynthia F.

    The 1983 report A Nation at Risk highlighted the fact that test scores of American students were far below those of competing nations and that educational standards were being lowered. This trend has continued, and studies have also shown that students are not entering college ready for success. This trend can be reversed. Students can better understand and retain biology content expectations if they are taught in a way that is both rigorous and relevant. In the past, students have learned the details of cellular reproduction with little knowledge of why it is important to their everyday lives. This material is learned only for the test. Knowing the details of cellular reproduction is crucial for understanding cancer. Cancer is a topic that will likely affect all of my students at some point in their lives. Students used hands-on activities, including simulations, labs, and models, to learn about cellular reproduction with cancer as a theme throughout. Students were challenged to learn how to use the rigorous biology content expectations to think about cancer, including stem cell research. These students, who will some day be college students, voting citizens, and parents, will become better learners. Students were assessed before and after the completion of the unit to determine whether learning occurred. Students did learn the material and became more critical thinkers. Statistical analysis was completed to ensure confidence in the results.

  4. Beyond research: a primer for considerations on using viral metagenomics in the field and clinic.

    PubMed

    Hall, Richard J; Draper, Jenny L; Nielsen, Fiona G G; Dutilh, Bas E

    2015-01-01

    Powered by recent advances in next-generation sequencing technologies, metagenomics has already unveiled vast microbial biodiversity in a range of environments, and is increasingly being applied in clinics for difficult-to-diagnose cases. It can be tempting to suggest that metagenomics could be used as a "universal test" for all pathogens without the need to conduct lengthy serial testing using specific assays. While this is an exciting prospect, there are issues that need to be addressed before metagenomic methods can be applied with rigor as a diagnostic tool, including the potential for incidental findings, unforeseen consequences for trade and regulatory authorities, privacy and cultural issues, data sharing, and appropriate reporting of results to end-users. These issues will require consideration and discussion across a range of disciplines, with inclusion of scientists, ethicists, clinicians, diagnosticians, health practitioners, and ultimately the public. Here, we provide a primer for consideration on some of these issues.

  5. A Systematic Review of Behavioral Interventions to Reduce Condomless Sex and Increase HIV Testing for Latino MSM.

    PubMed

    Pérez, Ashley; Santamaria, E Karina; Operario, Don

    2017-12-15

    Latino men who have sex with men (MSM) in the United States are disproportionately affected by HIV, and there have been calls to improve availability of culturally sensitive HIV prevention programs for this population. This article provides a systematic review of intervention programs to reduce condomless sex and/or increase HIV testing among Latino MSM. We searched four electronic databases using a systematic review protocol, screened 1777 unique records, and identified ten interventions analyzing data from 2871 Latino MSM. Four studies reported reductions in condomless anal intercourse, and one reported reductions in number of sexual partners. All studies incorporated surface structure cultural features such as bilingual study recruitment, but the incorporation of deep structure cultural features, such as machismo and sexual silence, was lacking. There is a need for rigorously designed interventions that incorporate deep structure cultural features in order to reduce HIV among Latino MSM.

  6. Supported education for individuals with psychiatric disabilities: State of the practice and policy implications.

    PubMed

    Ringeisen, Heather; Langer Ellison, Marsha; Ryder-Burge, Amy; Biebel, Kathleen; Alikhan, Shums; Jones, Emily

    2017-06-01

    Supported education (SEd) is a promising practice that supports and encourages educational goals and attainment among individuals with psychiatric disabilities. This paper provides insights into how SEd objectives are pursued in different settings, assesses the evidence base, and discusses policy implications. Insights from 3 data sources were synthesized: published literature, an environmental scan, and 3 site visits to programs that support the education goals of individuals with psychiatric disabilities. While setting, target populations, level of coordination with supported employment, and financing strategies varied, common SEd components emerged: specialized and dedicated staffing, one-on-one and group skill-building activities, assistance with navigating the academic setting and coordinating different services, and linkages with mental health counseling. The evidence base is growing; however, many published studies to date do not employ rigorous methodology. Conclusions and Implications for Policy and Practice: Continued specification, operationalization, and testing of SEd core components are needed. The components of the evolving SEd model would benefit from rigorous testing to evaluate impact on degree completion and other key impacts such as employment; health, mental health, or recovery; and community participation. In addition to funding streams from special education and Medicaid, new opportunities for increasing the availability of SEd include the Workforce Innovation and Opportunities Act (WIOA) reauthorization, which requires state vocational rehabilitation agencies to fund preemployment services for transition-age individuals. New "set-aside" requirements for the Mental Health Services Block Grant will increase funding for early intervention services for individuals with serious mental illness, potentially including SEd. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  7. Postflight reconditioning for European Astronauts - A case report of recovery after six months in space.

    PubMed

    Petersen, Nora; Lambrecht, Gunda; Scott, Jonathan; Hirsch, Natalie; Stokes, Maria; Mester, Joachim

    2017-01-01

    Postflight reconditioning of astronauts is understudied. Despite a rigorous, daily inflight exercise countermeasures programme during six months in microgravity (μG) on-board the International Space Station (ISS), physiological impairments occur and postflight reconditioning is still required on return to Earth. Such postflight programmes are implemented by space agency reconditioning specialists. Case Description and Assessments: A 38-year-old male European Space Agency (ESA) crewmember's pre- and postflight (at six and 21 days after landing) physical performance from a six-month mission to the ISS is described. Outcome measures included muscle strength (squat and bench press 1-repetition maximum), power (vertical jump), core muscle endurance, and hip flexibility (Sit and Reach, Thomas Test). In-flight, the astronaut undertook a rigorous daily (2-h) exercise programme. The 21-day postflight reconditioning exercise concept focused on motor control and functional training, and was delivered in close co-ordination by the ESA physiotherapist and exercise specialist to provide the crewmember with comprehensive reconditioning support. Despite an intensive inflight exercise programme for this highly motivated crewmember, postflight performance showed impairments at R+6 for most parameters, all of which recovered by R+21 except muscular power (jump tests). Regardless of intense inflight exercise countermeasures and excellent compliance with postflight reconditioning, postflight performance showed impairments at R+6 for most parameters. Complex powerful performance tasks took longer to return to preflight values. Research is needed to develop optimal inflight and postflight exercise programmes to overcome the negative effects of microgravity and return the astronaut to preflight status as rapidly as possible. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Testing the Impact of a Pre-Instructional Digital Game on Middle-Grade Students' Understanding of Photosynthesis

    ERIC Educational Resources Information Center

    Culp, Katherine McMillan; Martin, Wendy; Clements, Margaret; Lewis Presser, Ashley

    2015-01-01

    Rigorous studies of the impact of digital games on student learning remain relatively rare, as do studies of games as supports for learning difficult, core curricular concepts in the context of normal classroom practices. This study uses a blocked, cluster randomized controlled trial design to test the impact of a digital game, played as homework…

  9. Giant Panda Maternal Care: A Test of the Experience Constraint Hypothesis.

    PubMed

    Snyder, Rebecca J; Perdue, Bonnie M; Zhang, Zhihe; Maple, Terry L; Charlton, Benjamin D

    2016-06-07

    The body condition constraint and experience constraint hypotheses have both been proposed to account for differences in reproductive success between multiparous (experienced) and primiparous (first-time) mothers. However, because primiparous mothers are typically characterized by both inferior body condition and lack of experience when compared to multiparous mothers, interpreting experience-related differences in maternal care as support for either the body condition constraint hypothesis or the experience constraint hypothesis is extremely difficult. Here, we examined maternal behaviour in captive giant pandas, allowing us to simultaneously control for body condition and provide a rigorous test of the experience constraint hypothesis in this endangered animal. We found that multiparous mothers spent more time engaged in key maternal behaviours (nursing, grooming, and holding cubs) and had significantly less vocal cubs than primiparous mothers. This study provides the first evidence supporting the experience constraint hypothesis in the order Carnivora, and may have utility for captive breeding programs in which it is important to monitor the welfare of this species' highly altricial cubs, whose survival is almost entirely dependent on receiving adequate maternal care during the first few weeks of life.

  10. A Practical Approach to Replication of Abstract Data Objects

    DTIC Science & Technology

    1990-05-01

    rigorous torture testing. Torture testing was done with the aid of a basher program that allows the user to configure an object and perform a specified...number of transactions, each containing a specified number of operations on the object. There are separate basher programs for RSMs and MRSMs. The...modify the object (Writes and Erases, for RSMs). The basher maintains a local, non- replicated table that tracks the RSM that is under test. Each
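
The "basher" pattern described, random operations applied to the object under test while a local, non-replicated shadow table tracks the expected state, is easy to sketch. Everything below (class name, operation mix) is hypothetical; a real basher would drive the replicated object over the replication protocol, not a plain in-memory dict.

```python
import random

class ReplicatedMap:
    """Hypothetical stand-in for the replicated object (RSM) under test."""
    def __init__(self):
        self._data = {}
    def write(self, key, value):
        self._data[key] = value
    def erase(self, key):
        self._data.pop(key, None)
    def read(self, key):
        return self._data.get(key)

def bash(obj, n_ops, seed=0):
    """Apply random Writes/Erases, mirroring each in a local shadow table."""
    rng = random.Random(seed)
    shadow = {}
    for _ in range(n_ops):
        key = rng.randrange(8)
        if rng.random() < 0.7:
            value = rng.randrange(1000)
            obj.write(key, value)
            shadow[key] = value
        else:
            obj.erase(key)
            shadow.pop(key, None)
        # after every operation the object must agree with the shadow table
        for k, v in shadow.items():
            assert obj.read(k) == v
    return shadow

final_state = bash(ReplicatedMap(), 500)
```

The strength of torture testing comes from the operation count and the per-operation cross-check: any divergence between the object and the shadow table surfaces immediately at the operation that caused it.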

  11. Is the 'driving test' a robust quality indicator of colonoscopy performance?

    PubMed

    Kelly, Nicholas M; Moorehead, John; Tham, Tony

    2010-04-16

    Colorectal cancer is a major cause of death in the western world and is currently the second commonest cause of death from malignant disease in the UK. Recently a "driving test" for colonoscopists wishing to take part in the National Health Service Bowel Cancer Screening Program has been introduced, with the aim of improving quality in colonoscopy. We describe the accreditation process and have reviewed the published evidence for its use. We compared this method of assessment to what occurs in other developed countries. To the authors' knowledge no other countries have similar methods of assessment of practicing colonoscopists, and instead use critical evaluation of key quality criteria. The UK appears to have one of the most rigorous accreditation processes, although this still has flaws. The published evidence suggests that the written part of the accreditation is not a good discriminating test and it needs to be improved or abandoned. Further work is needed on the best methods of assessing polypectomy skills. Rigorous systems need to be in place for the colonoscopist who fails the assessment.

  12. Revised Planning Methodology For Signalized Intersections And Operational Analysis Of Exclusive Left-Turn Lanes, Part-II: Models And Procedures (Final Report)

    DOT National Transportation Integrated Search

    1996-04-01

    This report also describes the procedures for direct estimation of intersection capacity with simulation, including a set of rigorous statistical tests for simulation parameter calibration from field data.

  13. The ICA Communication Audit: Rationale and Development.

    ERIC Educational Resources Information Center

    Goldhaber, Gerald M.

    After reviewing previous research on communication in organizations, the Organizational Communication Division of the International Communication Association (ICA) decided, in 1971, to develop its own measurement system, the ICA Communication Audit. Rigorous pilot-testing, refinement, standardization, and application would allow the construction…

  14. Analytical Methodology Used To Assess/Refine Observatory Thermal Vacuum Test Conditions For the Landsat 8 Data Continuity Mission

    NASA Technical Reports Server (NTRS)

    Fantano, Louis

    2015-01-01

    Thermal and Fluids Analysis Workshop, Silver Spring, MD, NCTS 21070-15. The Landsat 8 Data Continuity Mission, which is part of the United States Geological Survey (USGS), launched February 11, 2013. A Landsat environmental test requirement mandated that test conditions bound worst-case flight thermal environments. This paper describes a rigorous analytical methodology applied to assess and refine proposed thermal vacuum test conditions, and the issues encountered in attempting to satisfy this requirement.

  15. Quality control in cone-beam computed tomography (CBCT) EFOMP-ESTRO-IAEA protocol (summary report).

    PubMed

    de Las Heras Gala, Hugo; Torresin, Alberto; Dasu, Alexandru; Rampado, Osvaldo; Delis, Harry; Hernández Girón, Irene; Theodorakou, Chrysoula; Andersson, Jonas; Holroyd, John; Nilsson, Mats; Edyvean, Sue; Gershan, Vesna; Hadid-Beurrier, Lama; Hoog, Christopher; Delpon, Gregory; Sancho Kolster, Ismael; Peterlin, Primož; Garayoa Roca, Julia; Caprile, Paola; Zervides, Costas

    2017-07-01

    The aim of the guideline presented in this article is to unify the test parameters for image quality evaluation and radiation output in all types of cone-beam computed tomography (CBCT) systems. The applications of CBCT span dental and interventional radiology, guided surgery, and radiotherapy. The chosen tests provide the means to objectively evaluate the performance and monitor the constancy of the imaging chain. Experience from all involved associations has been collected to achieve a consensus that is rigorous and helpful for the practice. The guideline recommends to assess image quality in terms of uniformity, geometrical precision, voxel density values (or Hounsfield units where available), noise, low contrast resolution and spatial resolution measurements. These tests usually require the use of a phantom and evaluation software. Radiation output can be determined with a kerma-area product meter attached to the tube case. Alternatively, a solid state dosimeter attached to the flat panel and a simple geometric relationship can be used to calculate the dose to the isocentre. Summary tables including action levels and recommended frequencies for each test, as well as relevant references, are provided. If the radiation output or image quality deviates from expected values, or exceeds documented action levels for a given system, a more in-depth system analysis (using conventional tests) and corrective maintenance work may be required. Copyright © 2017. Published by Elsevier Ltd.
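
The "simple geometric relationship" for converting a flat-panel dose reading to dose at the isocentre is plausibly an inverse-square distance scaling from the focal spot; the helper below assumes exactly that (an assumption of this note; the protocol itself should be consulted for the prescribed formula and correction factors).

```python
def dose_at_isocentre(dose_at_panel, src_to_panel_cm, src_to_iso_cm):
    """Scale a flat-panel point-dose reading to the isocentre,
    assuming pure inverse-square falloff from the focal spot
    (no attenuation or scatter corrections)."""
    return dose_at_panel * (src_to_panel_cm / src_to_iso_cm) ** 2

# e.g. a 2.0 mGy reading at a panel 150 cm from the source,
# with the isocentre at 100 cm:
d_iso = dose_at_isocentre(2.0, 150.0, 100.0)  # 2.0 * 1.5**2 = 4.5 mGy
```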

  16. Rigor of non-dairy galactose restriction in early childhood, measured by retrospective survey, does not associate with severity of five long-term outcomes quantified in 231 children and adults with classic galactosemia

    PubMed Central

    Frederick, Allison B.; Cutler, David J.; Fridovich-Keil, Judith L.

    2017-01-01

    One of many vexing decisions faced by parents of an infant with classic galactosemia (CG) is how carefully to restrict non-dairy galactose from their growing child’s diet. Until recently, many experts recommended vigorous lifelong dietary restriction of milk and all high-galactose dairy products as well as some non-dairy sources of galactose such as legumes and specific fruits and vegetables. Recently, experts have begun to relax their recommendations. The new recommendations, that restrict only high galactose dairy products, were made in the face of uncertainty, however, because no sufficiently powered study had been reported testing for possible association between rigor of non-dairy galactose restriction and severity of long-term outcomes in CG. Here we describe the largest study of diet and outcomes in CG reported to date, conducted using information gathered from 231 patients with CG and 71 unaffected sibling controls. We compared rigor of dietary galactose restriction, measured using a 4-point scale by a retrospective parent-response survey, with outcomes including growth, adaptive behaviors, receipt of speech therapy, receipt of special educational services, and for girls and women, a plasma marker of ovarian function (AMH). Our results confirmed the expected differences between patients and controls, but among patients showed no significant association between rigor of non-dairy galactose restriction in early childhood and any of the outcomes quantified. Indeed, some weak associations were seen suggesting that rigorous restriction of non-dairy galactose may be deleterious rather than beneficial. Despite limitations, these findings support the ongoing trend toward diet liberalization with regard to non-dairy sources of galactose for children and adults with classic galactosemia. PMID:28695375

  17. Recent Developments: PKI Square Dish for the Soleras Project

    NASA Technical Reports Server (NTRS)

    Rogers, W. E.

    1984-01-01

    The Square Dish solar collectors are subjected to rigorous design attention regarding corrosion at the site, and certification of the collector structure. The microprocessor controls and tracking mechanisms are improved in the areas of fail-safe operations, durability, and low parasitic power requirements. Prototype testing demonstrates performance efficiency of approximately 72% at 730 F outlet temperature. Studies are conducted that include developing formal engineering design studies, developing formal engineering design drawing and fabrication details, establishing subcontracts for fabrication of major components, and developing a rigorous quality control system. The improved design is more cost-effective to produce, and the extensive manuals developed for assembly and operation/maintenance result in faster field assembly and ease of operation.

  18. Recent developments: PKI square dish for the Soleras Project

    NASA Astrophysics Data System (ADS)

    Rogers, W. E.

    1984-03-01

    The Square Dish solar collectors are subjected to rigorous design attention regarding corrosion at the site, and certification of the collector structure. The microprocessor controls and tracking mechanisms are improved in the areas of fail-safe operations, durability, and low parasitic power requirements. Prototype testing demonstrates performance efficiency of approximately 72% at 730 F outlet temperature. Studies are conducted that include developing formal engineering design studies, developing formal engineering design drawing and fabrication details, establishing subcontracts for fabrication of major components, and developing a rigorous quality control system. The improved design is more cost-effective to produce, and the extensive manuals developed for assembly and operation/maintenance result in faster field assembly and ease of operation.

  19. A Rigorous Framework for Optimization of Expensive Functions by Surrogates

    NASA Technical Reports Server (NTRS)

    Booker, Andrew J.; Dennis, J. E., Jr.; Frank, Paul D.; Serafini, David B.; Torczon, Virginia; Trosset, Michael W.

    1998-01-01

    The goal of the research reported here is to develop rigorous optimization algorithms to apply to some engineering design problems for which direct application of traditional optimization approaches is not practical. This paper presents and analyzes a framework for generating a sequence of approximations to the objective function and managing the use of these approximations as surrogates for optimization. The result is to obtain convergence to a minimizer of an expensive objective function subject to simple constraints. The approach is widely applicable because it does not require, or even explicitly approximate, derivatives of the objective. Numerical results are presented for a 31-variable helicopter rotor blade design example and for a standard optimization test example.
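
The surrogate-management loop can be sketched in one dimension: fit a cheap quadratic surrogate through the best sampled designs, jump to the surrogate's minimizer, evaluate the true (expensive) objective there, and repeat. This is a bare-bones illustration of the idea with an invented toy objective, no constraints, and none of the framework's convergence safeguards; it is not the paper's algorithm.

```python
# Bare-bones 1-D surrogate-managed optimization: fit a quadratic surrogate
# through the three best sampled designs, jump to its minimizer, evaluate the
# true (expensive) objective there, and repeat.

def quad_min(pts):
    """Vertex of the quadratic through three x-sorted points (Newton form)."""
    (x0, y0), (x1, y1), (x2, y2) = pts
    d1 = (y1 - y0) / (x1 - x0)
    d2 = (y2 - y1) / (x2 - x1)
    a = (d2 - d1) / (x2 - x0)          # curvature coefficient
    b = d1 - a * (x0 + x1)             # linear coefficient
    return -b / (2 * a)

def expensive(x):
    """Stand-in for a costly objective (e.g. a long simulation run)."""
    return (x - 2.0) ** 2 + 1.0

pts = [(x, expensive(x)) for x in (0.0, 1.0, 5.0)]
for _ in range(20):
    x_new = quad_min(pts)
    if any(abs(x_new - x) < 1e-9 for x, _ in pts):
        break                          # surrogate minimizer already sampled
    pts.append((x_new, expensive(x_new)))
    pts = sorted(sorted(pts, key=lambda p: p[1])[:3])  # 3 best, x-sorted

best_x, best_y = min(pts, key=lambda p: p[1])
```

Note the division of labor the abstract describes: the expensive function is called only once per outer iteration, while all the search effort is spent on the cheap surrogate.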

  20. BINARY CORRELATIONS IN IONIZED GASES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balescu, R.; Taylor, H.S.

    1961-01-01

    An equation of evolution for the binary distribution function in a classical homogeneous, nonequilibrium plasma was derived. It is shown that the asymptotic (long-time) solution of this equation is the Debye distribution, thus providing a rigorous dynamical derivation of the equilibrium distribution. This proof is free from the fundamental conceptual difficulties of conventional equilibrium derivations. Out of equilibrium, a closed formula was obtained for the long-lived correlations in terms of the momentum distribution function. These results should form an appropriate starting point for a rigorous theory of transport phenomena in plasmas, including the effect of molecular correlations. (auth)

  1. Rigorous Electromagnetic Analysis of the Focusing Action of Refractive Cylindrical Microlens

    NASA Astrophysics Data System (ADS)

    Liu, Juan; Gu, Ben-Yuan; Dong, Bi-Zhen; Yang, Guo-Zhen

    The focusing action of refractive cylindrical microlenses is investigated based on rigorous electromagnetic theory using the boundary element method. The focusing behaviors of these refractive microlenses with continuous and multilevel surface envelopes are characterized in terms of total electric-field patterns, the electric-field intensity distributions on the focal plane, and their diffractive efficiencies at the focal spots. The obtained results are also compared with those obtained by Kirchhoff's scalar diffraction theory. The present numerical and graphical results may provide useful information for the analysis and design of refractive elements in micro-optics.

  2. Homeopathy: An Introduction

    MedlinePlus

    ... are inconsistent with fundamental concepts of chemistry and physics. There are significant challenges in carrying out rigorous clinical research on homeopathic remedies. Tell all your health care providers about any complementary health practices you ...

  3. Corneal transplant - discharge

    MedlinePlus

    ... heavy lifting. Stay away from dust and blowing sand. Follow your provider's instructions for using eye drops ... A.D.A.M. follows rigorous standards of quality and accountability. A.D.A.M. is among ...

  4. Identifying Opportunities in Citizen Science for Academic Libraries

    ERIC Educational Resources Information Center

    Cohen, Cynthia M.; Cheney, Liz; Duong, Khue; Lea, Ben; Unno, Zoe Pettway

    2015-01-01

    Citizen science projects continue to grow in popularity, providing opportunities for nonexpert volunteers to contribute to and become personally invested in rigorous scientific research. Academic libraries, aiming to promote and provide tools and resources to master scientific and information literacy, can support these efforts. While few examples…

  5. Time-dependent neo-deterministic seismic hazard scenarios for the 2016 Central Italy earthquakes sequence

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Kossobokov, Vladimir; Romashkova, Leontina; Panza, Giuliano F.

    2017-04-01

    Predicting earthquakes and the related ground shaking is widely recognized as one of the most challenging scientific problems, both for its societal relevance and for the intrinsic complexity of the problem. The development of reliable forecasting tools requires their rigorous formalization and testing, first in retrospect and then in an experimental real-time mode, which implies a careful application of statistics to data sets of limited size and varying accuracy. Accordingly, the operational issues of prospective validation and use of time-dependent neo-deterministic seismic hazard scenarios are discussed, reviewing the results of their application in Italy and its surroundings. Long-term practice and the results obtained for the Italian territory in about two decades of rigorous prospective testing support the feasibility of earthquake forecasting based on the analysis of seismicity patterns at the intermediate-term middle-range scale. Italy is the only country worldwide where two independent, globally tested algorithms, namely CN and M8S, are applied simultaneously; they make it possible to deal with multiple sets of seismic precursors and to diagnose the intervals of time when a strong event is likely to occur inside a given region. Based on the routinely updated space-time information provided by the CN and M8S forecasts, an integrated procedure has been developed that allows for the definition of time-dependent seismic hazard scenarios through realistic modeling of ground motion by the neo-deterministic approach (NDSHA). This scenario-based methodology makes it possible to construct, at both regional and local scales, scenarios of ground motion for the time interval when a strong event is likely to occur within the alerted areas. The CN and M8S predictions, as well as the related time-dependent ground motion scenarios associated with the alarmed areas, have been routinely updated since 2006.
The issues and results from real-time testing of the integrated NDSHA scenarios are illustrated, with special emphasis on the sequence of destructive earthquakes that struck Central Italy beginning in August 2016. The results obtained so far demonstrate the validity of the proposed methodology in anticipating ground shaking from approaching strong earthquakes and prove that the information provided by time-dependent NDSHA can be useful in assigning priorities for timely and effective mitigation actions.

  6. Implementing Telerehabilitation Research for Stroke Rehabilitation with Community Dwelling Veterans: Lessons Learned

    PubMed Central

    Chumbler, Neale R.; Quigley, Patricia; Sanford, Jon; Griffiths, Patricia; Rose, Dorian; Morey, Miriam; Ely, E. Wesley; Hoenig, Helen

    2010-01-01

    Telerehabilitation (TR) is the use of telehealth technologies to provide distant support, rehabilitation services, and information exchange between people with disabilities and their clinical providers. This article discusses the barriers experienced when implementing a TR multi-site randomized controlled trial for stroke patients in their homes, and the lessons learned. The barriers are divided into two sections: those specific to TR and those pertinent to the conduct of tele-research. The TR-specific barriers included the rapidly changing telecommunications and health care environment and inconsistent equipment functionality. The barriers applicable to tele-research included the need to meet regulations in diverse departments and rapidly changing research regulations. Lessons learned included the need for: telehealth equipment options to allow for functionality within a diverse telecommunications infrastructure; rigorous pilot testing of all equipment in authentic situations; and on-call and on-site biomedical engineering and/or IT staff. PMID:25945169

  7. Design and analysis of a fast, two-mirror soft-x-ray microscope

    NASA Technical Reports Server (NTRS)

    Shealy, D. L.; Wang, C.; Jiang, W.; Jin, L.; Hoover, R. B.

    1992-01-01

    During the past several years, a number of investigators have addressed the design, analysis, fabrication, and testing of spherical Schwarzschild microscopes for soft-x-ray applications using multilayer coatings. Some of these systems have demonstrated diffraction limited resolution for small numerical apertures. Rigorously aplanatic, two-aspherical mirror Head microscopes can provide near diffraction limited resolution for very large numerical apertures. The relationships between the numerical aperture, mirror radii and diameters, magnifications, and total system length for Schwarzschild microscope configurations are summarized. Also, an analysis of the characteristics of the Head-Schwarzschild surfaces will be reported. The numerical surface data predicted by the Head equations were fit by a variety of functions and analyzed by conventional optical design codes. Efforts have been made to determine whether current optical substrate and multilayer coating technologies will permit construction of a very fast Head microscope which can provide resolution approaching that of the wavelength of the incident radiation.

  8. Investigation of the Thermomechanical Response of Shape Memory Alloy Hybrid Composite Beams

    NASA Technical Reports Server (NTRS)

    Davis, Brian A.

    2005-01-01

    Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical model. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. Excellent agreement is achieved between the predicted and measured results, thereby quantitatively validating the numerical tool.

  9. Complexities and potential pitfalls of clinical study design and data analysis in assisted reproduction.

    PubMed

    Patounakis, George; Hill, Micah J

    2018-06-01

    The purpose of the current review is to describe the common pitfalls in design and statistical analysis of reproductive medicine studies. It serves to guide both authors and reviewers toward reducing the incidence of spurious statistical results and erroneous conclusions. The large amount of data gathered in IVF cycles leads to problems with multiplicity, multicollinearity, and overfitting of regression models. Furthermore, the use of the word 'trend' to describe nonsignificant results has increased in recent years. Finally, methods to accurately account for female age in infertility research models are becoming more common and necessary. The pitfalls of study design and analysis reviewed provide a framework for authors and reviewers to approach clinical research in the field of reproductive medicine. By providing a more rigorous approach to study design and analysis, the literature in reproductive medicine will have more reliable conclusions that can stand the test of time.
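
The multiplicity pitfall the review names can be made concrete with a self-contained simulation (illustrative only, not from the review): when many hypotheses are each tested at alpha = 0.05, the chance of at least one spurious "significant" result grows quickly, and a Bonferroni-corrected threshold restores the family-wise error rate.

```python
import random

random.seed(42)

n_experiments, n_tests, alpha = 1000, 20, 0.05
naive_hits = bonferroni_hits = 0
for _ in range(n_experiments):
    # Under a true null hypothesis, p-values are Uniform(0, 1).
    pvalues = [random.random() for _ in range(n_tests)]
    if min(pvalues) < alpha:
        naive_hits += 1              # >= 1 false positive without correction
    if min(pvalues) < alpha / n_tests:
        bonferroni_hits += 1         # Bonferroni: compare against alpha / m

naive_rate = naive_hits / n_experiments        # near 1 - 0.95**20, about 0.64
bonferroni_rate = bonferroni_hits / n_experiments  # near the nominal 0.05
```

With 20 tests per study, roughly two thirds of all-null studies report at least one "significant" finding unless the threshold is corrected.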

  10. Key considerations for the experimental training and evaluation of cancer odour detection dogs: lessons learnt from a double-blind, controlled trial of prostate cancer detection

    PubMed Central

    2014-01-01

    Background Cancer detection using sniffer dogs is a potential technology for clinical use and research. Our study sought to determine whether dogs could be trained to discriminate the odour of urine from men with prostate cancer from controls, using rigorous testing procedures and well-defined samples from a major research hospital. Methods We attempted to train ten dogs by initially rewarding them for finding and indicating individual prostate cancer urine samples (Stage 1). If dogs were successful in Stage 1, we then attempted to train them to discriminate prostate cancer samples from controls (Stage 2). The number of samples used to train each dog varied depending on their individual progress. Overall, 50 unique prostate cancer samples and 67 control samples were collected and used during training. Dogs that passed Stage 2 were tested for their ability to discriminate 15 (Test 1) or 16 (Tests 2 and 3) unfamiliar prostate cancer samples from 45 (Test 1) or 48 (Tests 2 and 3) unfamiliar controls under double-blind conditions. Results Three dogs reached training Stage 2 and two of these learnt to discriminate potentially familiar prostate cancer samples from controls. However, during double-blind tests using new samples the two dogs did not indicate prostate cancer samples more frequently than expected by chance (Dog A: sensitivity 0.13, specificity 0.71; Dog B: sensitivity 0.25, specificity 0.75). The other dogs did not progress past Stage 1 as they did not have optimal temperaments for the sensitive odour discrimination training. Conclusions Although two dogs appeared to have learnt to select prostate cancer samples during training, they did not generalise to a common prostate cancer odour during robust double-blind tests involving new samples. Our study illustrates that these rigorous tests are vital to avoid drawing misleading conclusions about the abilities of dogs to indicate certain odours.
Dogs may memorise the individual odours of large numbers of training samples rather than generalise to a common odour. The results do not exclude the possibility that dogs could be trained to detect prostate cancer. We recommend that canine olfactory memory is carefully considered in all future studies and that rigorous double-blind methods are used to avoid confounding effects. PMID:24575737
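
The sensitivity and specificity figures quoted in the abstract derive from a 2x2 confusion table. A small sketch with hypothetical counts (chosen so the ratios reproduce Dog B's reported 0.25 and 0.75; these are not the study's raw data):

```python
def sensitivity(tp, fn):
    # Fraction of true cancer samples the dog indicated.
    return tp / (tp + fn)

def specificity(tn, fp):
    # Fraction of control samples the dog correctly passed over.
    return tn / (tn + fp)

# Hypothetical counts: 16 cancer samples of which 4 were indicated,
# and 48 controls of which 36 were passed over.
sens = sensitivity(tp=4, fn=12)   # 4 / 16
spec = specificity(tn=36, fp=12)  # 36 / 48
```

A chance-level indicator can score well on one axis while failing the other, which is why both metrics are reported for each dog.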

  11. Sign Language Studies with Chimpanzees and Children.

    ERIC Educational Resources Information Center

    Van Cantfort, Thomas E.; Rimpau, James B.

    1982-01-01

    Reviews methodologies of sign language studies with chimpanzees and compares major findings of those studies with studies of human children. Considers relevance of input conditions for language acquisition, evidence used to demonstrate linguistic achievements, and application of rigorous testing procedures in developmental psycholinguistics.…

  12. Accelerating Research Impact in a Learning Health Care System

    PubMed Central

    Elwy, A. Rani; Sales, Anne E.; Atkins, David

    2017-01-01

    Background: Since 1998, the Veterans Health Administration (VHA) Quality Enhancement Research Initiative (QUERI) has supported more rapid implementation of research into clinical practice. Objectives: With the passage of the Veterans Access, Choice and Accountability Act of 2014 (Choice Act), QUERI further evolved to support VHA’s transformation into a Learning Health Care System by aligning science with clinical priority goals based on a strategic planning process and alignment of funding priorities with updated VHA priority goals in response to the Choice Act. Design: QUERI updated its strategic goals in response to independent assessments mandated by the Choice Act that recommended VHA reduce variation in care by providing a clear path to implement best practices. Specifically, QUERI updated its application process to ensure its centers (Programs) focus on cross-cutting VHA priorities and specify roadmaps for implementation of research-informed practices across different settings. QUERI also increased funding for scientific evaluations of the Choice Act and other policies in response to Commission on Care recommendations. Results: QUERI’s national network of Programs deploys effective practices using implementation strategies across different settings. QUERI Choice Act evaluations informed the law’s further implementation, setting the stage for additional rigorous national evaluations of other VHA programs and policies including community provider networks. Conclusions: Grounded in implementation science and evidence-based policy, QUERI serves as an example of how to operationalize core components of a Learning Health Care System, notably through rigorous evaluation and scientific testing of implementation strategies to ultimately reduce variation in quality and improve overall population health. PMID:27997456

  13. Consumer Outcomes After Implementing CommonGround as an Approach to Shared Decision Making.

    PubMed

    Salyers, Michelle P; Fukui, Sadaaki; Bonfils, Kelsey A; Firmin, Ruth L; Luther, Lauren; Goscha, Rick; Rapp, Charles A; Holter, Mark C

    2017-03-01

    The authors examined consumer outcomes before and after implementing CommonGround, a computer-based shared decision-making program. Consumers with severe mental illness (N=167) were interviewed prior to implementation and 12 and 18 months later to assess changes in active treatment involvement, symptoms, and recovery-related attitudes. Providers also rated consumers on level of treatment involvement. Most consumers used CommonGround at least once (67%), but few used the program regularly. Mixed-effects regression analyses showed improvement in self-reported symptoms and recovery attitudes. Self-reported treatment involvement did not change; however, for a subset of consumers with the same providers over time (N=83), the providers rated consumers as more active in treatment. This study adds to the growing literature on tools to support shared decision making, showing the potential benefits of CommonGround for improving recovery outcomes. More work is needed to better engage consumers in CommonGround and to test the approach with more rigorous methods.

  14. Experimental evaluation of rigor mortis. V. Effect of various temperatures on the evolution of rigor mortis.

    PubMed

    Krompecher, T

    1981-01-01

    Objective measurements were carried out to study the evolution of rigor mortis on rats at various temperatures. Our experiments showed that: (1) at 6 degrees C rigor mortis reaches full development between 48 and 60 hours post mortem, and is resolved at 168 hours post mortem; (2) at 24 degrees C rigor mortis reaches full development at 5 hours post mortem, and is resolved at 16 hours post mortem; (3) at 37 degrees C rigor mortis reaches full development at 3 hours post mortem, and is resolved at 6 hours post mortem; (4) the intensity of rigor mortis grows with increase in temperature (difference between values obtained at 24 degrees C and 37 degrees C); and (5) at 6 degrees C a "cold rigidity" was found, in addition to and independent of rigor mortis.

  15. DESCQA: An Automated Validation Framework for Synthetic Sky Catalogs

    NASA Astrophysics Data System (ADS)

    Mao, Yao-Yuan; Kovacs, Eve; Heitmann, Katrin; Uram, Thomas D.; Benson, Andrew J.; Campbell, Duncan; Cora, Sofía A.; DeRose, Joseph; Di Matteo, Tiziana; Habib, Salman; Hearin, Andrew P.; Bryce Kalmbach, J.; Krughoff, K. Simon; Lanusse, François; Lukić, Zarija; Mandelbaum, Rachel; Newman, Jeffrey A.; Padilla, Nelson; Paillas, Enrique; Pope, Adrian; Ricker, Paul M.; Ruiz, Andrés N.; Tenneti, Ananth; Vega-Martínez, Cristian A.; Wechsler, Risa H.; Zhou, Rongpu; Zu, Ying; The LSST Dark Energy Science Collaboration

    2018-02-01

    The use of high-quality simulated sky catalogs is essential for the success of cosmological surveys. The catalogs have diverse applications, such as investigating signatures of fundamental physics in cosmological observables, understanding the effect of systematic uncertainties on measured signals and testing mitigation strategies for reducing these uncertainties, aiding analysis pipeline development and testing, and survey strategy optimization. The list of applications is growing with improvements in the quality of the catalogs and the details that they can provide. Given the importance of simulated catalogs, it is critical to provide rigorous validation protocols that enable both catalog providers and users to assess the quality of the catalogs in a straightforward and comprehensive way. For this purpose, we have developed the DESCQA framework for the Large Synoptic Survey Telescope Dark Energy Science Collaboration as well as for the broader community. The goal of DESCQA is to enable the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. In this paper, we present the design concept and first implementation of DESCQA. In order to establish and demonstrate its full functionality we use a set of interim catalogs and validation tests. We highlight several important aspects, both technical and scientific, that require thoughtful consideration when designing a validation framework, including validation metrics and how these metrics impose requirements on the synthetic sky catalogs.
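
The common-interface design the abstract describes can be sketched as a plugin pattern: every validation test implements one interface, and a runner applies each registered test to each catalog to build a comparison matrix. The class names, catalog fields, and mass range below are hypothetical illustrations, not DESCQA's actual API.

```python
class ValidationTest:
    """One validation check applied uniformly to every synthetic catalog."""
    name = "base"

    def run(self, catalog):
        raise NotImplementedError

class StellarMassRangeTest(ValidationTest):
    # Hypothetical metric: all stellar masses lie in a plausible range.
    name = "stellar_mass_range"

    def run(self, catalog):
        return all(1e5 < m < 1e13 for m in catalog["stellar_mass"])

def run_suite(tests, catalogs):
    # Comparison matrix: {(test name, catalog name): passed}.
    return {(t.name, cname): t.run(cat)
            for t in tests for cname, cat in catalogs.items()}

catalogs = {
    "mock_a": {"stellar_mass": [1e8, 5e10, 2e11]},
    "mock_b": {"stellar_mass": [1e8, 3e14]},   # one implausible entry
}
results = run_suite([StellarMassRangeTest()], catalogs)
```

Because every test sees every catalog through the same interface, an inhomogeneous set of catalogs can be validated and compared without per-catalog special cases, which is the design goal the paper emphasizes.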

  16. Examining the Statistical Rigor of Test and Evaluation Results in the Live, Virtual and Constructive Environment

    DTIC Science & Technology

    2011-06-01

    Committee Meeting, 23 June 2008. Bjorkman, Eileen A., and Frank B. Gray. "Testing in a Joint Environment 2004-2008: Findings, Conclusions and..." ...the LVC joint test environment to evaluate system performance and joint mission effectiveness (Bjorkman and Gray 2009a). The LVC battlespace... attack (Bjorkman and Gray 2009b). Figure 3 - JTEM Methodology (Bjorkman 2008). A key INTEGRAL FIRE lesson learned was realizing the need for each...

  17. Parents Guide to "First Grade" Instruction

    ERIC Educational Resources Information Center

    Department of Defense Education Activity, 2012

    2012-01-01

    The Department of Defense Education Activity (DoDEA) is committed to providing the highest quality of education to its students. One way to provide a quality education is with an effective curriculum that reflects high standards and expectations. Thus, DoDEA has developed rigorous content standards aligned with national guidelines and standards.…

  18. Parents Guide to "Kindergarten" Instruction

    ERIC Educational Resources Information Center

    Department of Defense Education Activity, 2012

    2012-01-01

    The Department of Defense Education Activity (DoDEA) is committed to providing the highest quality of education to its students. One way to provide a quality education is with an effective curriculum that reflects high standards and expectations. Thus, DoDEA has developed rigorous content standards aligned with national guidelines and standards.…

  19. Parents Guide to "Prekindergarten" Instruction

    ERIC Educational Resources Information Center

    Department of Defense Education Activity, 2012

    2012-01-01

    The Department of Defense Education Activity (DoDEA) is committed to providing the highest quality of education to its students. One way to provide a quality education is with an effective curriculum that reflects high standards and expectations. Thus, DoDEA has developed rigorous content standards aligned with national guidelines and standards.…

  20. Parents Guide to "Second Grade" Instruction

    ERIC Educational Resources Information Center

    Department of Defense Education Activity, 2012

    2012-01-01

    The Department of Defense Education Activity (DoDEA) is committed to providing the highest quality of education to its students. One way to provide a quality education is with an effective curriculum that reflects high standards and expectations. Thus, DoDEA has developed rigorous content standards aligned with national guidelines and standards.…

  1. Parents Guide to "Third Grade" Instruction

    ERIC Educational Resources Information Center

    Department of Defense Education Activity, 2012

    2012-01-01

    The Department of Defense Education Activity (DoDEA) is committed to providing the highest quality of education to its students. One way to provide a quality education is with an effective curriculum that reflects high standards and expectations. Thus, DoDEA has developed rigorous content standards aligned with national guidelines and standards.…

  2. Parents Guide to "Fourth Grade" Instruction

    ERIC Educational Resources Information Center

    Department of Defense Education Activity, 2012

    2012-01-01

    The Department of Defense Education Activity (DoDEA) is committed to providing the highest quality of education to its students. One way to provide a quality education is with an effective curriculum that reflects high standards and expectations. Thus, DoDEA has developed rigorous content standards aligned with national guidelines and standards.…

  3. Parents Guide to "Sixth Grade" Instruction

    ERIC Educational Resources Information Center

    Department of Defense Education Activity, 2012

    2012-01-01

    The Department of Defense Education Activity (DoDEA) is committed to providing the highest quality of education to its students. One way to provide a quality education is with an effective curriculum that reflects high standards and expectations. Thus, DoDEA has developed rigorous content standards aligned with national guidelines and standards.…

  4. Parents Guide to "Fifth Grade" Instruction

    ERIC Educational Resources Information Center

    Department of Defense Education Activity, 2012

    2012-01-01

    The Department of Defense Education Activity (DoDEA) is committed to providing the highest quality of education to its students. One way to provide a quality education is with an effective curriculum that reflects high standards and expectations. Thus, DoDEA has developed rigorous content standards aligned with national guidelines and standards.…

  5. Zoos, Aquariums, and Expanding Students' Data Literacy

    ERIC Educational Resources Information Center

    Mokros, Jan; Wright, Tracey

    2009-01-01

    Zoo and aquarium educators are increasingly providing educationally rigorous programs that connect their animal collections with curriculum standards in mathematics as well as science. Partnering with zoos and aquariums is a powerful way for teachers to provide students with more opportunities to observe, collect, and analyze scientific data. This…

  6. Biomedical text mining for research rigor and integrity: tasks, challenges, directions.

    PubMed

    Kilicoglu, Halil

    2017-06-13

    An estimated quarter of a trillion US dollars is invested in the biomedical research enterprise annually. There is growing alarm that a significant portion of this investment is wasted because of problems in reproducibility of research findings and in the rigor and integrity of research conduct and reporting. Recent years have seen a flurry of activities focusing on standardization and guideline development to enhance the reproducibility and rigor of biomedical research. Research activity is primarily communicated via textual artifacts, ranging from grant applications to journal publications. These artifacts can be both the source and the manifestation of practices leading to research waste. For example, an article may describe a poorly designed experiment, or the authors may reach conclusions not supported by the evidence presented. In this article, we pose the question of whether biomedical text mining techniques can assist the stakeholders in the biomedical research enterprise in doing their part toward enhancing research integrity and rigor. In particular, we identify four key areas in which text mining techniques can make a significant contribution: plagiarism/fraud detection, ensuring adherence to reporting guidelines, managing information overload and accurate citation/enhanced bibliometrics. We review the existing methods and tools for specific tasks, if they exist, or discuss relevant research that can provide guidance for future work. With the exponential increase in biomedical research output and the ability of text mining approaches to perform automatic tasks at large scale, we propose that such approaches can support tools that promote responsible research practices, providing significant benefits for the biomedical research enterprise. Published by Oxford University Press 2017. This work is written by a US Government employee and is in the public domain in the US.
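
One of the four areas identified above, plagiarism and fraud detection, is often bootstrapped with simple text-overlap measures before heavier NLP is applied. A self-contained sketch (illustrative only, not a method from the article) using word n-gram "shingles" and Jaccard similarity:

```python
def shingles(text, n=3):
    # Set of word n-grams of a lower-cased text.
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    # |A & B| / |A | B|; defined as 0.0 when both sets are empty.
    return len(a & b) / len(a | b) if a | b else 0.0

doc1 = "the experiment was repeated three times under identical conditions"
doc2 = "the experiment was repeated three times with minor modifications"
doc3 = "patients were randomized into two treatment arms"

sim_close = jaccard(shingles(doc1), shingles(doc2))  # shared long prefix
sim_far = jaccard(shingles(doc1), shingles(doc3))    # no overlap at all
```

Pairs of documents whose shingle similarity exceeds a chosen threshold are flagged for human review; the scale advantage of such automation over manual screening is exactly the point the article makes.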

  7. Reducing stillbirths: screening and monitoring during pregnancy and labour

    PubMed Central

    Haws, Rachel A; Yakoob, Mohammad Yawar; Soomro, Tanya; Menezes, Esme V; Darmstadt, Gary L; Bhutta, Zulfiqar A

    2009-01-01

    Background Screening and monitoring in pregnancy are strategies used by healthcare providers to identify high-risk pregnancies so that they can provide more targeted and appropriate treatment and follow-up care, and to monitor fetal well-being in both low- and high-risk pregnancies. The use of many of these techniques is controversial and their ability to detect fetal compromise often unknown. Theoretically, appropriate management of maternal and fetal risk factors and complications that are detected in pregnancy and labour could prevent a large proportion of the world's 3.2 million estimated annual stillbirths, as well as minimise maternal and neonatal morbidity and mortality. Methods The fourth in a series of papers assessing the evidence base for prevention of stillbirths, this paper reviews available published evidence for the impact of 14 screening and monitoring interventions in pregnancy on stillbirth, including identification and management of high-risk pregnancies, advanced monitoring techniques, and monitoring of labour. Using broad and specific strategies to search PubMed and the Cochrane Library, we identified 221 relevant reviews and studies testing screening and monitoring interventions during the antenatal and intrapartum periods and reporting stillbirth or perinatal mortality as an outcome. Results We found a dearth of rigorous evidence of direct impact of any of these screening procedures and interventions on stillbirth incidence. Observational studies testing some interventions, including fetal movement monitoring and Doppler monitoring, showed some evidence of impact on stillbirths in selected high-risk populations, but require larger rigorous trials to confirm impact. Other interventions, such as amniotic fluid assessment for oligohydramnios, appear predictive of stillbirth risk, but studies are lacking that assess the impact on perinatal mortality of subsequent intervention based on test findings.
Few rigorous studies of cardiotocography have reported stillbirth outcomes, but steep declines in stillbirth rates have been observed in high-income settings such as the U.S., where cardiotocography is used in conjunction with Caesarean section for fetal distress. Conclusion There are numerous research gaps and large, adequately controlled trials are still needed for most of the interventions we considered. The impact of monitoring interventions on stillbirth relies on use of effective and timely intervention should problems be detected. Numerous studies indicated that positive tests were associated with increased perinatal mortality, but while some tests had good sensitivity in detecting distress, false-positive rates were high for most tests, and questions remain about optimal timing, frequency, and implications of testing. Few studies included assessments of impact of subsequent intervention needed before recommending particular monitoring strategies as a means to decrease stillbirth incidence. In high-income countries such as the US, observational evidence suggests that widespread use of cardiotocography with Caesarean section for fetal distress has led to significant declines in stillbirth rates. Efforts to increase availability of Caesarean section in low-/middle-income countries should be coupled with intrapartum monitoring technologies where resources and provider skills permit. PMID:19426468

  8. Academic Rigor in the College Classroom: Two Federal Commissions Strive to Define Rigor in the Past 70 Years

    ERIC Educational Resources Information Center

    Francis, Clay

    2018-01-01

    Historic notions of academic rigor usually follow from critiques of the system--we often define our goals for academically rigorous work through the lens of our shortcomings. This chapter discusses how the Truman Commission in 1947 and the Spellings Commission in 2006 shaped the way we think about academic rigor in today's context.

  9. Measuring Patient-Reported Outcomes: Key Metrics in Reconstructive Surgery.

    PubMed

    Voineskos, Sophocles H; Nelson, Jonas A; Klassen, Anne F; Pusic, Andrea L

    2018-01-29

    Satisfaction and improved quality of life are among the most important outcomes for patients undergoing plastic and reconstructive surgery for a variety of diseases and conditions. Patient-reported outcome measures (PROMs) are essential tools for evaluating the benefits of newly developed surgical techniques. Modern PROMs are being developed with new psychometric approaches, such as Rasch Measurement Theory, and their measurement properties (validity, reliability, responsiveness) are rigorously tested. These advances have resulted in the availability of PROMs that provide clinically meaningful data and effectively measure functional as well as psychosocial outcomes. This article guides the reader through the steps of creating a PROM and highlights the potential research and clinical uses of such instruments. Limitations of PROMs and anticipated future directions in this field are discussed.

  10. Hidden attractors in dynamical models of phase-locked loop circuits: Limitations of simulation in MATLAB and SPICE

    NASA Astrophysics Data System (ADS)

    Kuznetsov, N. V.; Leonov, G. A.; Yuldashev, M. V.; Yuldashev, R. V.

    2017-10-01

During recent years it has been shown that hidden oscillations, whose basin of attraction does not overlap with small neighborhoods of equilibria, may significantly complicate simulation of dynamical models, lead to unreliable results and wrong conclusions, and cause serious damage in drilling systems, aircraft control systems, electromechanical systems, and other applications. This article provides a survey of various phase-locked loop based circuits (used in satellite navigation systems, optical, and digital communication), where such difficulties take place in MATLAB and SPICE. The examples considered can be used for testing other phase-locked loop based circuits and simulation tools, and motivate the development and application of rigorous analytical methods for the global analysis of phase-locked loop based circuits.

  11. Better Higgs-CP tests through information geometry

    NASA Astrophysics Data System (ADS)

    Brehmer, Johann; Kling, Felix; Plehn, Tilman; Tait, Tim M. P.

    2018-05-01

    Measuring the CP symmetry in the Higgs sector is one of the key tasks of the LHC and a crucial ingredient for precision studies, for example in the language of effective Lagrangians. We systematically analyze which LHC signatures offer dedicated CP measurements in the Higgs-gauge sector and discuss the nature of the information they provide. Based on the Fisher information measure, we compare the maximal reach for CP-violating effects in weak boson fusion, associated ZH production, and Higgs decays into four leptons. We find a subtle balance between more theory-independent approaches and more powerful analysis channels, indicating that rigorous evidence for CP violation in the Higgs-gauge sector will likely require a multistep process.
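As a toy illustration of the Fisher information measure the authors use to compare the reach of analysis channels (this is a generic statistics sketch, not the paper's collider analysis), a Monte Carlo estimate of the Fisher information for the mean of a Gaussian recovers the analytic value 1/σ²:

```python
import random

def fisher_information_gaussian_mean(theta, sigma, n_samples=200_000, seed=1):
    """Monte Carlo estimate of I(theta) = E[(d/dtheta log p(x|theta))^2]
    for x ~ N(theta, sigma^2); here the score is (x - theta)/sigma^2."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x = rng.gauss(theta, sigma)
        score = (x - theta) / sigma**2
        total += score * score
    return total / n_samples

# Analytically I(theta) = 1/sigma^2 = 0.25 for sigma = 2; the Monte
# Carlo estimate should be close to that value.
est = fisher_information_gaussian_mean(0.0, 2.0)
print(est)  # ≈ 0.25
```

Via the Cramér-Rao bound, larger Fisher information means a tighter achievable constraint on the parameter, which is the sense in which the paper compares channels.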

  12. Characterizing (rating) the performance of large photovoltaic arrays for all operating conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, D.L.; Eckert, P.E.

    1996-06-01

    A new method has been developed for characterizing the electrical performance of photovoltaic arrays. The method provides both a "rating" at standard reporting conditions and a rigorous yet straightforward model for predicting array performance at all operating conditions. For the first time, the performance model handles the influences of irradiance, module temperature, solar spectrum, solar angle-of-incidence, and temperature coefficients, in a practical way. Validity of the procedure was confirmed during field testing of a 25-kW array recently installed by Arizona Public Service Co. on Carol Spring Mountain (which powers microwave, cellular phone, and TV communications equipment). This paper describes the characterization procedure, measured array performance, and the predictive model.

  13. New cure or same old story? International (Africa).

    PubMed

    1996-03-25

    In Kenya, Professor Arthur Obel claimed that he has discovered a wonder drug, Pearl Omega, which can reverse HIV-positive status in some patients. In response to criticism, Obel modified his claim to assert that his drug alleviates the suffering of AIDS patients and prolongs their lives. Obel refused to provide proof that 7 of 32 HIV-positive patients who received Pearl Omega had become seronegative. While the Kenyan government supports local initiatives in the world-wide search for a cure for AIDS, it insists that all drugs being used to treat the disease undergo rigorous testing. With 5% of the Kenyan population believed to be HIV positive, medical authorities expect that the Pearl Omega debate is far from over.

  14. Digital morphogenesis via Schelling segregation

    NASA Astrophysics Data System (ADS)

    Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew

    2018-04-01

    Schelling’s model of segregation looks to explain the way in which particles or agents of two types may come to arrange themselves spatially into configurations consisting of large homogeneous clusters, i.e. connected regions consisting of only one type. As one of the earliest agent based models studied by economists and perhaps the most famous model of self-organising behaviour, it also has direct links to areas at the interface between computer science and statistical mechanics, such as the Ising model and the study of contagion and cascading phenomena in networks. While the model has been extensively studied, it has largely resisted rigorous analysis, prior results from the literature generally pertaining to variants of the model which are tweaked so as to be amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory. Brandt et al (2012 Proc. 44th Annual ACM Symp. on Theory of Computing) provided the first rigorous analysis of the unperturbed model, for a specific set of input parameters. Here we provide a rigorous analysis of the model’s behaviour much more generally and establish some surprising forms of threshold behaviour, notably the existence of situations where an increased level of intolerance for neighbouring agents of opposite type leads almost certainly to decreased segregation.

  15. From screening to synthesis: using NVivo to enhance transparency in qualitative evidence synthesis.

    PubMed

    Houghton, Catherine; Murphy, Kathy; Meehan, Ben; Thomas, James; Brooker, Dawn; Casey, Dympna

    2017-03-01

    To explore the experiences and perceptions of healthcare staff caring for people with dementia in the acute setting. This article focuses on the methodological process of conducting framework synthesis using NVivo for each stage of the review: screening, data extraction, synthesis and critical appraisal. Qualitative evidence synthesis brings together many research findings in a meaningful way that can be used to guide practice and policy development. For this purpose, synthesis must be conducted in a comprehensive and rigorous way. There has been previous discussion on how using NVivo can assist in enhancing and illustrating the rigorous processes involved. Qualitative framework synthesis. Twelve documents, or research reports, based on nine studies, were included for synthesis. The benefits of using NVivo are outlined in terms of facilitating teams of researchers to systematically and rigorously synthesise findings. NVivo functions were used to conduct a sensitivity analysis. Some valuable lessons were learned, and these are presented to assist and guide researchers who wish to use similar methods in future. Ultimately, good qualitative evidence synthesis will provide practitioners and policymakers with significant information that will guide decision-making on many aspects of clinical practice. The example provided explored how people with dementia are cared for in acute settings. © 2016 The Authors. Journal of Clinical Nursing Published by John Wiley & Sons Ltd.

  16. Concerns regarding a call for pluralism of information theory and hypothesis testing

    USGS Publications Warehouse

    Lukacs, P.M.; Thompson, W.L.; Kendall, W.L.; Gould, W.R.; Doherty, P.F.; Burnham, K.P.; Anderson, D.R.

    2007-01-01

    1. Stephens et al. (2005) argue for 'pluralism' in statistical analysis, combining null hypothesis testing and information-theoretic (I-T) methods. We show that I-T methods are more informative even in single variable problems and we provide an ecological example. 2. I-T methods allow inferences to be made from multiple models simultaneously. We believe multimodel inference is the future of data analysis, which cannot be achieved with null hypothesis-testing approaches. 3. We argue for a stronger emphasis on critical thinking in science in general and less reliance on exploratory data analysis and data dredging. Deriving alternative hypotheses is central to science; deriving a single interesting science hypothesis and then comparing it to a default null hypothesis (e.g. 'no difference') is not an efficient strategy for gaining knowledge. We think this single-hypothesis strategy has been relied upon too often in the past. 4. We clarify misconceptions presented by Stephens et al. (2005). 5. We think inference should be made about models, directly linked to scientific hypotheses, and their parameters conditioned on data, Prob(Hj | data). I-T methods provide a basis for this inference. Null hypothesis testing merely provides a probability statement about the data conditioned on a null model, Prob(data | H0). 6. Synthesis and applications. I-T methods provide a more informative approach to inference. I-T methods provide a direct measure of evidence for or against hypotheses and a means to consider simultaneously multiple hypotheses as a basis for rigorous inference. Progress in our science can be accelerated if modern methods can be used intelligently; this includes various I-T and Bayesian methods.
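The multimodel-inference idea can be sketched with a small information-theoretic comparison; this is a generic AIC example on simulated data, not the authors' ecological case, and all numbers are hypothetical.

```python
import math, random

def aic(log_lik, k):
    """Akaike Information Criterion: 2k - 2 log L."""
    return 2*k - 2*log_lik

def gaussian_loglik(residuals):
    """Maximized Gaussian log-likelihood using the MLE of the variance."""
    n = len(residuals)
    sigma2 = sum(r*r for r in residuals) / n
    return -0.5*n*(math.log(2*math.pi*sigma2) + 1)

# Simulated data with a genuine linear trend plus noise.
rng = random.Random(42)
x = [i/10 for i in range(50)]
y = [1.0 + 2.0*xi + rng.gauss(0, 0.5) for xi in x]
n = len(x)

# Candidate model 1: constant mean (k = 2: mean + variance).
mean_y = sum(y)/n
aic0 = aic(gaussian_loglik([yi - mean_y for yi in y]), k=2)

# Candidate model 2: simple linear regression (k = 3: slope, intercept, variance).
sx, sy = sum(x), sum(y)
sxx = sum(xi*xi for xi in x)
sxy = sum(xi*yi for xi, yi in zip(x, y))
slope = (n*sxy - sx*sy) / (n*sxx - sx*sx)
intercept = (sy - slope*sx) / n
aic1 = aic(gaussian_loglik([yi - (intercept + slope*xi)
                            for xi, yi in zip(x, y)]), k=3)

# Akaike weights: relative evidence for each model, summing to 1.
best = min(aic0, aic1)
w1 = math.exp(-(aic1 - best)/2) / (
    math.exp(-(aic0 - best)/2) + math.exp(-(aic1 - best)/2))
print(aic1 < aic0, round(w1, 3))  # the linear model is strongly favoured
```

Unlike a p-value against a single null, the Akaike weight quantifies evidence for each candidate model simultaneously, which is the "multimodel inference" contrast the authors draw.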

  17. Emergency cricothyrotomy for trismus caused by instantaneous rigor in cardiac arrest patients.

    PubMed

    Lee, Jae Hee; Jung, Koo Young

    2012-07-01

    Instantaneous rigor as muscle stiffening occurring in the moment of death (or cardiac arrest) can be confused with rigor mortis. If trismus is caused by instantaneous rigor, orotracheal intubation is impossible and a surgical airway should be secured. Here, we report 2 patients who had emergency cricothyrotomy for trismus caused by instantaneous rigor. This case report aims to help physicians understand instantaneous rigor and to emphasize the importance of securing a surgical airway quickly on the occurrence of trismus. Copyright © 2012 Elsevier Inc. All rights reserved.

  18. Programs and policies to assist high school dropouts in the transition to adulthood.

    PubMed

    Bloom, Dan

    2010-01-01

    Dan Bloom of MDRC examines policies and programs designed to help high school dropouts improve their educational attainment and labor market outcomes. So-called "second-chance" programs, he says, have long provided some combination of education, training, employment, counseling, and social services. But the research record on their effectiveness is fairly thin, he says, and the results are mixed. Bloom describes eleven employment- or education-focused programs serving high school dropouts that have been rigorously evaluated over the past thirty years. Some relied heavily on paid work experience, while others focused more on job training or education. Some programs, especially those that offered paid work opportunities, generated significant increases in employment or earnings in the short term, but none of the studies that followed participants for more than a couple of years found lasting improvements in economic outcomes. Nevertheless, the findings provide an important foundation on which to build. Because of the high individual and social costs of ignoring high school dropouts, the argument for investing more public funds in services, systems, and research for these young people is strong. The paucity of conclusive evidence, however, makes it hard to know how to direct resources and magnifies the importance of ensuring that all new initiatives provide for rigorous evaluation of their impacts. Bloom concludes with recommendations for policy and research aimed at building on current efforts to expand and improve effective programs for dropouts while simultaneously developing and testing new approaches that might be more effective and strengthening local systems to support vulnerable young people. He stresses the importance of identifying and disseminating strategies to engage young people who are more seriously disconnected and unlikely to join programs.
A recurring theme is that providing young people with opportunities for paid work may be useful both as an engagement tool and as a strategy for improving long-term labor market outcomes.

  19. Adjoint-Based Algorithms for Adaptation and Design Optimizations on Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.

    2006-01-01

    Schemes based on discrete adjoint algorithms present several exciting opportunities for significantly advancing the current state of the art in computational fluid dynamics. Such methods provide an extremely efficient means for obtaining discretely consistent sensitivity information for hundreds of design variables, opening the door to rigorous, automated design optimization of complex aerospace configurations using the Navier-Stokes equations. Moreover, the discrete adjoint formulation provides a mathematically rigorous foundation for mesh adaptation and systematic reduction of spatial discretization error. Error estimates are also an inherent by-product of an adjoint-based approach, valuable information that is virtually non-existent in today's large-scale CFD simulations. An overview of the adjoint-based algorithm work at NASA Langley Research Center is presented, with examples demonstrating the potential impact on complex computational problems related to design optimization as well as mesh adaptation.
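The efficiency argument for discrete adjoints (one extra linear solve yields sensitivities for all design variables at once) can be illustrated on a toy linear system; this is a generic sketch, not NASA Langley's solver, and the matrices below are arbitrary.

```python
# Discrete-adjoint sensitivity sketch: the state u solves A u = b(alpha)
# and the objective is J = c . u. A single adjoint solve A^T lam = c
# gives dJ/dalpha = lam . (db/dalpha) for any number of design
# variables, instead of one forward solve per variable.

def solve2(A, rhs):
    """Solve a 2x2 linear system by Cramer's rule."""
    (a, b), (c, d) = A
    det = a*d - b*c
    return [(rhs[0]*d - b*rhs[1]) / det, (a*rhs[1] - rhs[0]*c) / det]

A  = [[4.0, 1.0], [2.0, 3.0]]
AT = [[4.0, 2.0], [1.0, 3.0]]          # transpose of A
c  = [1.0, -1.0]                        # objective weights
b  = lambda alpha: [alpha, 2.0*alpha]   # forcing; db/dalpha = [1, 2]
J  = lambda alpha: sum(ci*ui for ci, ui in zip(c, solve2(A, b(alpha))))

lam = solve2(AT, c)                     # one adjoint solve
dJ_adjoint = lam[0]*1.0 + lam[1]*2.0    # lam . db/dalpha

eps = 1e-6                              # finite-difference check
dJ_fd = (J(1.0 + eps) - J(1.0)) / eps
print(dJ_adjoint, dJ_fd)                # both -0.5 (FD up to rounding)
```

For this system J(alpha) = -alpha/2, so the adjoint sensitivity -0.5 is exact and the finite difference agrees; with hundreds of design variables the adjoint approach still needs only the one transposed solve.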

  20. Efficiency versus speed in quantum heat engines: Rigorous constraint from Lieb-Robinson bound

    NASA Astrophysics Data System (ADS)

    Shiraishi, Naoto; Tajima, Hiroyasu

    2017-08-01

    The long-standing open problem of whether a heat engine with finite power can achieve the Carnot efficiency is investigated. We rigorously prove a general trade-off inequality between thermodynamic efficiency and the time interval of a cyclic process in quantum heat engines. As a first step, employing the Lieb-Robinson bound, we establish an inequality on the change in a local observable caused by an operation far from the support of that observable. This inequality provides a rigorous characterization of the intuitive picture that most of the energy emitted from the engine to the cold bath remains near the engine when the cyclic process is finished. Using this description, we prove an upper bound on efficiency with the aid of quantum information geometry. Our result generally excludes the possibility of a process with finite speed at the Carnot efficiency in quantum heat engines. In particular, the obtained constraint covers engines evolving with non-Markovian dynamics, which almost all previous studies on this topic fail to address.

  1. Efficiency versus speed in quantum heat engines: Rigorous constraint from Lieb-Robinson bound.

    PubMed

    Shiraishi, Naoto; Tajima, Hiroyasu

    2017-08-01

    The long-standing open problem of whether a heat engine with finite power can achieve the Carnot efficiency is investigated. We rigorously prove a general trade-off inequality between thermodynamic efficiency and the time interval of a cyclic process in quantum heat engines. As a first step, employing the Lieb-Robinson bound, we establish an inequality on the change in a local observable caused by an operation far from the support of that observable. This inequality provides a rigorous characterization of the intuitive picture that most of the energy emitted from the engine to the cold bath remains near the engine when the cyclic process is finished. Using this description, we prove an upper bound on efficiency with the aid of quantum information geometry. Our result generally excludes the possibility of a process with finite speed at the Carnot efficiency in quantum heat engines. In particular, the obtained constraint covers engines evolving with non-Markovian dynamics, which almost all previous studies on this topic fail to address.

  2. Rigorous electromagnetic simulation applied to alignment systems

    NASA Astrophysics Data System (ADS)

    Deng, Yunfei; Pistor, Thomas V.; Neureuther, Andrew R.

    2001-09-01

    Rigorous electromagnetic simulation with TEMPEST is used to provide benchmark data and understanding of key parameters in the design of topographical features of alignment marks. Periodic large silicon trenches are analyzed as a function of wavelength (530-800 nm), duty cycle, depth, slope and angle of incidence. The signals are well behaved except when the trench width becomes about 1 micrometer or smaller. Segmentation of the trenches to form 3D marks shows that a segmentation period of 2-5 wavelengths makes the diffraction in the (1,1) direction about 1/3 to 1/2 of that in the main first order (1,0). Transmission alignment marks for nanoimprint lithography, using the difference between the +1 and -1 reflected orders, showed a sensitivity of the difference signal to misalignment of 0.7%/nm for rigorous simulation and 0.5%/nm for simple ray-tracing. The sensitivity to a slanted substrate indentation was 10 nm of offset per degree of tilt from horizontal.

  3. Decomposing the Site Frequency Spectrum: The Impact of Tree Topology on Neutrality Tests.

    PubMed

    Ferretti, Luca; Ledda, Alice; Wiehe, Thomas; Achaz, Guillaume; Ramos-Onsins, Sebastian E

    2017-09-01

    We investigate the dependence of the site frequency spectrum on the topological structure of genealogical trees. We show that basic population genetic statistics, for instance, estimators of θ or neutrality tests such as Tajima's D, can be decomposed into components of waiting times between coalescent events and of tree topology. Our results clarify the relative impact of the two components on these statistics. We provide a rigorous interpretation of positive or negative values of an important class of neutrality tests in terms of the underlying tree shape. In particular, we show that values of Tajima's D and Fay and Wu's H depend in a direct way on a peculiar measure of tree balance, which is mostly determined by the root balance of the tree. We present a new test for selection in the same class as Fay and Wu's H and discuss its interpretation and power. Finally, we determine the trees corresponding to extreme expected values of these neutrality tests and present formulas for these extreme values as a function of sample size and number of segregating sites. Copyright © 2017 by the Genetics Society of America.
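The neutrality tests discussed can be computed from a few summary statistics; the sketch below implements the standard Tajima's D formula from sample size, segregating sites, and mean pairwise diversity. It computes the statistic itself, not the paper's topology decomposition, and the example inputs are hypothetical.

```python
import math

def tajimas_d(n: int, S: int, pi: float) -> float:
    """Tajima's D from sample size n, number of segregating sites S,
    and mean pairwise nucleotide diversity pi, using Tajima's (1989)
    standard variance coefficients."""
    a1 = sum(1.0/i for i in range(1, n))
    a2 = sum(1.0/i**2 for i in range(1, n))
    b1 = (n + 1) / (3.0*(n - 1))
    b2 = 2.0*(n*n + n + 3) / (9.0*n*(n - 1))
    c1 = b1 - 1.0/a1
    c2 = b2 - (n + 2)/(a1*n) + a2/a1**2
    e1 = c1/a1
    e2 = c2/(a1**2 + a2)
    theta_w = S / a1                      # Watterson's estimator of theta
    return (pi - theta_w) / math.sqrt(e1*S + e2*S*(S - 1))

# Hypothetical sample: D is positive when pi exceeds Watterson's theta
# (an excess of intermediate-frequency variants) and negative otherwise.
print(tajimas_d(10, 12, 5.0))
```

The sign behavior is what the paper reinterprets: positive and negative D values correspond to characteristic tree shapes, mostly via root balance.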

  4. Implementation and process evaluation of a workplace colorectal cancer screening program in eastern Washington.

    PubMed

    Hannon, Peggy A; Vu, Thuy; Ogdon, Sara; Fleury, Emily M; Yette, Emily; Wittenberg, Reva; Celedonia, Megan; Bowen, Deborah J

    2013-03-01

    Colorectal cancer screening is a life-saving intervention, but screening rates are low. The authors implemented and evaluated the Spokane Colorectal Cancer Screening Program-a novel worksite intervention to promote colorectal cancer screening that used a combination of evidence-based strategies recommended by the Guide to Community Preventive Services, as well as additional strategies. Over a period of approximately 3 months, participating worksites held one or more physician-led seminars about colorectal cancer screening for employees. They also distributed free fecal immunochemical tests at the worksite to employees 50 years and older, and they provided test results to employees and their primary care physician. The authors measured attendance at seminars, test kits taken and returned, employee awareness of the program, and colorectal cancer screening rates in participating and comparison worksites. It is estimated that 9% of eligible employees received kits at the worksite, and 4% were screened with these kits. The Spokane Colorectal Cancer Screening Program was a promising pilot test of an innovative worksite screening program that successfully translated evidence-based strategies into practical use in a brief period of time, and it merits a larger study to be able to test its effects more rigorously.

  5. Development and Justification of a Risk Evaluation Matrix To Guide Chemical Testing Necessary To Select and Qualify Plastic Components Used in Production Systems for Pharmaceutical Products.

    PubMed

    Jenke, Dennis

    2015-01-01

    An accelerating trend in the pharmaceutical industry is the use of plastic components in systems used to produce an active pharmaceutical ingredient or a finished drug product. If the active pharmaceutical ingredient, the finished drug product, or any solution used to generate them (for example, a process stream such as media, buffers, eluents, and the like) is contacted by a plastic component at any time during the production process, substances leached from the component may accumulate in the active pharmaceutical ingredient or finished drug product, affecting its safety and/or efficacy. In this article the author develops and justifies a semi-quantitative risk evaluation matrix that is used to determine the amount and rigor of component testing necessary and appropriate to establish that the component is chemically suitable for its intended use. By considering key properties of the component, the contact medium, the contact conditions, and the active pharmaceutical ingredient's or finished drug product's clinical conditions of use, use of the risk evaluation matrix produces a risk score whose magnitude reflects the accumulated risk that the component will interact with the contact solution to such an extent that component-related extractables will accumulate in the active pharmaceutical ingredient or finished drug product as leachables at levels sufficiently high to adversely affect user safety. The magnitude of the risk score establishes the amount and rigor of the testing that is required to select and qualify the component, and such testing is broadly grouped into three categories: baseline assessment, general testing, and full testing (extractables profiling). Production suites used to generate pharmaceuticals can include plastic components. It is possible that substances in the components could leach into manufacturing solutions and accumulate in the pharmaceutical product. 
In this article the author develops and justifies a semi-quantitative risk evaluation matrix that can be used to determine the amount and rigor of component testing that may be necessary and appropriate to establish that the component is suitable for its intended use. Use of the risk evaluation matrix allows a user of a component to determine the type and amount of testing that should be performed to establish the patient safety risk associated with using that component in order to manufacture an active pharmaceutical ingredient or a finished drug product. © PDA, Inc. 2015.

  6. Study of the quality characteristics in cold-smoked salmon (Salmo salar) originating from pre- or post-rigor raw material.

    PubMed

    Birkeland, S; Akse, L

    2010-01-01

    Improved slaughtering procedures in the salmon industry have caused a delayed onset of rigor mortis and, thus, a potential for pre-rigor secondary processing. The aim of this study was to investigate the effect of rigor status at time of processing on quality traits (color, texture, sensory, microbiological) in injection-salted, cold-smoked Atlantic salmon (Salmo salar). Injection of pre-rigor fillets caused a significant (P<0.001) contraction (-7.9%± 0.9%) on the caudal-cranial axis. No significant differences in instrumental color (a*, b*, C*, or h*), texture (hardness), or sensory traits (aroma, color, taste, and texture) were observed between pre- or post-rigor processed fillets; however, post-rigor fillets (1477 ± 38 g) had a significantly (P<0.05) higher fracturability than pre-rigor fillets (1369 ± 71 g). Pre-rigor fillets were significantly (P<0.01) lighter, L*, (39.7 ± 1.0) than post-rigor fillets (37.8 ± 0.8) and had significantly lower (P<0.05) aerobic plate count (APC), 1.4 ± 0.4 log CFU/g against 2.6 ± 0.6 log CFU/g, and psychrotrophic count (PC), 2.1 ± 0.2 log CFU/g against 3.0 ± 0.5 log CFU/g, than post-rigor processed fillets. This study showed that similar quality characteristics can be obtained in cold-smoked products processed either pre- or post-rigor when using suitable injection salting protocols and smoking techniques. © 2010 Institute of Food Technologists®

  7. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles.

    PubMed

    Kim, Hyun-Wook; Hwang, Ko-Eun; Song, Dong-Heon; Kim, Yong-Jae; Ham, Youn-Kyung; Yeo, Eui-Joo; Jeong, Tae-Jun; Choi, Yun-Sang; Kim, Cheon-Jei

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of high ultimate pH of chicken breast muscles at post-mortem 24 h. The addition of minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salting chicken breast muscle (p<0.05). On the other hand, the increase in pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, the difference in NaCl concentration between 3% and 4% had no great differences in the results of physicochemical and textural properties due to pre-rigor salting effects (p>0.05). Therefore, our study certified the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with equal NaCl concentration, and suggests that the 2% NaCl concentration is minimally required to ensure the definite pre-rigor salting effect on chicken breast muscle.

  8. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles

    PubMed Central

    Choi, Yun-Sang

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of high ultimate pH of chicken breast muscles at post-mortem 24 h. The addition of minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salting chicken breast muscle (p<0.05). On the other hand, the increase in pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, the difference in NaCl concentration between 3% and 4% had no great differences in the results of physicochemical and textural properties due to pre-rigor salting effects (p>0.05). Therefore, our study certified the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with equal NaCl concentration, and suggests that the 2% NaCl concentration is minimally required to ensure the definite pre-rigor salting effect on chicken breast muscle. PMID:26761884

  9. Use of software engineering techniques in the design of the ALEPH data acquisition system

    NASA Astrophysics Data System (ADS)

    Charity, T.; McClatchey, R.; Harvey, J.

    1987-08-01

    The SASD methodology is being used to provide a rigorous design framework for various components of the ALEPH data acquisition system. The Entity-Relationship data model is used to describe the layout and configuration of the control and acquisition systems and detector components. State Transition Diagrams are used to specify control applications such as run control and resource management and Data Flow Diagrams assist in decomposing software tasks and defining interfaces between processes. These techniques encourage rigorous software design leading to enhanced functionality and reliability. Improved documentation and communication ensures continuity over the system life-cycle and simplifies project management.

  10. Zonation in the deep benthic megafauna : Application of a general test.

    PubMed

    Gardiner, Frederick P; Haedrich, Richard L

    1978-01-01

    A test based on Maxwell-Boltzmann statistics, instead of the formerly suggested but inappropriate Bose-Einstein statistics (Pielou and Routledge, 1976), examines the distribution of the boundaries of species' ranges distributed along a gradient, and indicates whether they are random or clustered (zoned). The test is most useful as a preliminary to the application of more instructive but less statistically rigorous methods such as cluster analysis. The test indicates zonation is marked in the deep benthic megafauna living between 200 and 3000 m, but below 3000 m little zonation may be found.
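The null model of such a test (range boundaries placed independently and uniformly along the gradient, i.e. Maxwell-Boltzmann-style occupancy) can be emulated with a simple Monte Carlo sketch. This is an analogue of the approach, not the authors' exact statistic, and the depths, bin count, and clustering statistic below are hypothetical choices.

```python
import random

def clustering_pvalue(boundaries, lo, hi, n_bins=10, n_rep=2000, seed=7):
    """Monte Carlo test of boundary clustering along a gradient.
    Null model: boundaries fall independently and uniformly on [lo, hi].
    Test statistic: the maximum count in any of n_bins equal-width bins;
    large values indicate clustering (zonation)."""
    def stat(points):
        counts = [0]*n_bins
        for p in points:
            idx = min(int((p - lo)/(hi - lo)*n_bins), n_bins - 1)
            counts[idx] += 1
        return max(counts)

    rng = random.Random(seed)
    observed = stat(boundaries)
    n = len(boundaries)
    hits = sum(
        stat([rng.uniform(lo, hi) for _ in range(n)]) >= observed
        for _ in range(n_rep)
    )
    return (hits + 1) / (n_rep + 1)   # add-one correction for MC p-values

# Strongly zoned boundaries (all near 1000 m on a 200-3000 m gradient)
# give a very small p-value, rejecting the random-placement null.
zoned = [990, 995, 1000, 1002, 1005, 1008, 1010, 1015]
p = clustering_pvalue(zoned, 200, 3000)
print(p)
```

As the abstract suggests, a rejection here would then motivate more descriptive follow-up such as cluster analysis to locate the zone boundaries themselves.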

  11. Galileo attitude and articulation control subsystem closed loop testing

    NASA Technical Reports Server (NTRS)

    Lembeck, M. F.; Pignatano, N. D.

    1983-01-01

    In order to ensure the reliable operation of the Attitude and Articulation Control Subsystem (AACS) which will guide the Galileo spacecraft on its two and one-half year journey to Jupiter, the AACS is being rigorously tested. The primary objectives of the test program are the verification of the AACS's form, fit, and function, especially with regard to subsystem external interfaces and the functional operation of the flight software. Attention is presently given to the Galileo Closed Loop Test System, which simulates the dynamic and 'visual' flight environment for AACS components in the laboratory.

  12. Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Held, Isaac; V. Balaji; Fueglistaler, Stephan

    We have constructed and analyzed a series of idealized models of tropical convection interacting with large-scale circulations, with 25-50km resolution and with 1-2km cloud resolving resolution to set the stage for rigorous tests of convection closure schemes in high resolution global climate models. Much of the focus has been on the climatology of tropical cyclogenesis in rotating systems and the related problem of the spontaneous aggregation of convection in non-rotating systems. The PI (Held) will be delivering the honorary Bjerknes lecture at the Fall 2016 AGU meeting in December on this work. We have also provided new analyses of long-standing issues related to the interaction between convection and the large-scale circulation: Kelvin waves in the upper troposphere and lower stratosphere, water vapor transport into the stratosphere, and upper tropospheric temperature trends. The results of these analyses help to improve our understanding of processes, and provide tests for future high resolution global modeling. Our final goal of testing new convection schemes in next-generation global atmospheric models at GFDL has been left for future work due to the complexity of the idealized model results meant as tests for these models uncovered in this work and to computational resource limitations. 11 papers have been published with support from this grant, 2 are in review, and another major summary paper is in preparation.

  13. A Comparison of Single-Cycle Versus Multiple-Cycle Proof Testing Strategies

    NASA Technical Reports Server (NTRS)

    McClung, R. C.; Chell, G. G.; Millwater, H. R.; Russell, D. A.; Millwater, H. R.

    1999-01-01

    Single-cycle and multiple-cycle proof testing (SCPT and MCPT) strategies for reusable aerospace propulsion system components are critically evaluated and compared from a rigorous elastic-plastic fracture mechanics perspective. Earlier MCPT studies are briefly reviewed. New J-integral estimation methods for semielliptical surface cracks and cracks at notches are derived and validated. Engineering methods are developed to characterize crack growth rates during elastic-plastic fatigue crack growth (FCG) and the tear-fatigue interaction near instability. Surface crack growth experiments are conducted with Inconel 718 to characterize tearing resistance, FCG under small-scale yielding and elastic-plastic conditions, and crack growth during simulated MCPT. Fractography and acoustic emission studies provide additional insight. The relative merits of SCPT and MCPT are directly compared using a probabilistic analysis linked with an elastic-plastic crack growth computer code. The conditional probability of failure in service is computed for a population of components that have survived a previous proof test, based on an assumed distribution of initial crack depths. Parameter studies investigate the influence of proof factor, tearing resistance, crack shape, initial crack depth distribution, and notches on the MCPT versus SCPT comparison. The parameter studies provide a rational basis to formulate conclusions about the relative advantages and disadvantages of SCPT and MCPT. Practical engineering guidelines are proposed to help select the optimum proof test protocol in a given application.
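The conditional-probability calculation described (failure in service given survival of a proof test, over a distribution of initial crack depths) can be sketched as a toy Monte Carlo. The distribution parameters, critical crack sizes, and fixed growth increment below are hypothetical, and this stands in for, rather than reproduces, the paper's elastic-plastic crack growth analysis.

```python
import random

def service_failure_prob(a_crit_proof, a_crit_service, growth,
                         n=100_000, seed=3):
    """Toy proof-test screening. Initial crack depths (mm) are drawn
    from a lognormal distribution; the proof test rejects any part
    whose crack exceeds the proof-load critical size a_crit_proof
    (proof load > service load, so a_crit_proof < a_crit_service).
    A survivor fails in service if its crack, after growing by
    `growth`, exceeds the service-load critical size a_crit_service.
    Returns (conditional failure probability, proof-test yield)."""
    rng = random.Random(seed)
    depths = [rng.lognormvariate(-3.0, 0.5) for _ in range(n)]
    survivors = [a for a in depths if a <= a_crit_proof]
    failures = sum(a + growth > a_crit_service for a in survivors)
    return failures / len(survivors), len(survivors) / n

p_fail, yield_frac = service_failure_prob(
    a_crit_proof=0.12, a_crit_service=0.15, growth=0.05)
print(p_fail, yield_frac)
```

Screening out the deepest cracks is what drives the conditional failure probability below the unscreened one; the paper's parameter studies (proof factor, tearing resistance, crack shape) all act through the same kind of conditioning.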

  14. A Comparison of Single-Cycle Versus Multiple-Cycle Proof Testing Strategies

    NASA Technical Reports Server (NTRS)

    McClung, R. C.; Chell, G. G.; Millwater, H. R.; Russell, D. A.; Orient, G. E.

    1996-01-01

    Single-cycle and multiple-cycle proof testing (SCPT and MCPT) strategies for reusable aerospace propulsion system components are critically evaluated and compared from a rigorous elastic-plastic fracture mechanics perspective. Earlier MCPT studies are briefly reviewed. New J-integral estimation methods for semi-elliptical surface cracks and cracks at notches are derived and validated. Engineering methods are developed to characterize crack growth rates during elastic-plastic fatigue crack growth (FCG) and the tear-fatigue interaction near instability. Surface crack growth experiments are conducted with Inconel 718 to characterize tearing resistance, FCG under small-scale yielding and elastic-plastic conditions, and crack growth during simulated MCPT. Fractography and acoustic emission studies provide additional insight. The relative merits of SCPT and MCPT are directly compared using a probabilistic analysis linked with an elastic-plastic crack growth computer code. The conditional probability of failure in service is computed for a population of components that have survived a previous proof test, based on an assumed distribution of initial crack depths. Parameter studies investigate the influence of proof factor, tearing resistance, crack shape, initial crack depth distribution, and notches on the MCPT vs. SCPT comparison. The parameter studies provide a rational basis to formulate conclusions about the relative advantages and disadvantages of SCPT and MCPT. Practical engineering guidelines are proposed to help select the optimum proof test protocol in a given application.

  15. Rational selection of training and test sets for the development of validated QSAR models

    NASA Astrophysics Data System (ADS)

    Golbraikh, Alexander; Shen, Min; Xiao, Zhiyan; Xiao, Yun-De; Lee, Kuo-Hsiung; Tropsha, Alexander

    2003-02-01

    Quantitative Structure-Activity Relationship (QSAR) models are used increasingly to screen chemical databases and/or virtual chemical libraries for potentially bioactive molecules. These developments emphasize the importance of rigorous model validation to ensure that the models have acceptable predictive power. Using the k nearest neighbors (kNN) variable selection QSAR method for the analysis of several datasets, we have demonstrated recently that the widely accepted leave-one-out (LOO) cross-validated R2 (q2) is an inadequate characteristic to assess the predictive ability of the models [Golbraikh, A., Tropsha, A. Beware of q2! J. Mol. Graphics Mod. 20, 269-276, (2002)]. Herein, we provide additional evidence that there exists no correlation between the values of q2 for the training set and accuracy of prediction (R2) for the test set and argue that this observation is a general property of any QSAR model developed with LOO cross-validation. We suggest that external validation using rationally selected training and test sets provides a means to establish a reliable QSAR model. We propose several approaches to the division of experimental datasets into training and test sets and apply them in QSAR studies of 48 functionalized amino acid anticonvulsants and a series of 157 epipodophyllotoxin derivatives with antitumor activity. We formulate a set of general criteria for the evaluation of predictive power of QSAR models.
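    The idea of "rational selection" of training and test sets can be illustrated with a greedy maximin (Kennard-Stone-style) split, in which training compounds are chosen to span descriptor space and the remainder form the test set. This is a generic sketch with synthetic data; the function name and sizes are illustrative and it is not a reproduction of the authors' specific division procedures.

```python
import numpy as np

def maximin_split(X, n_train, seed=0):
    """Greedy Kennard-Stone-style split: pick training points that
    maximize coverage of descriptor space; the rest form the test set."""
    rng = np.random.default_rng(seed)
    n = len(X)
    chosen = [int(rng.integers(n))]          # arbitrary starting compound
    d = np.linalg.norm(X - X[chosen[0]], axis=1)
    while len(chosen) < n_train:
        nxt = int(np.argmax(d))              # farthest from the current training set
        chosen.append(nxt)
        d = np.minimum(d, np.linalg.norm(X - X[nxt], axis=1))
    train = np.array(chosen)
    test = np.setdiff1d(np.arange(n), train)
    return train, test

# Toy descriptor matrix: 20 compounds x 3 descriptors (synthetic data)
X = np.random.default_rng(1).normal(size=(20, 3))
train_idx, test_idx = maximin_split(X, n_train=14)
```

    A split of this kind guarantees that every test compound lies near some training compound in descriptor space, which is one of the preconditions for meaningful external validation.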

  16. Technology Proliferation: Acquisition Strategies and Opportunities for an Uncertain Future

    DTIC Science & Technology

    2018-04-20

    The large programs of record characteristic of federal acquisition consist of rigorous research, development, testing, and evaluation (RDT&E...and evaluation (IOT&E) activities drive the program toward the decision to enter full rate production (FRP). Finally, in the sustainment phase, the...the new feature by a full release at a later date, or halt the development altogether. As stated by the Director of Operational Test and Evaluation

  17. Forward modelling of global gravity fields with 3D density structures and an application to the high-resolution (2 km) gravity fields of the Moon

    NASA Astrophysics Data System (ADS)

    Šprlák, M.; Han, S.-C.; Featherstone, W. E.

    2017-12-01

    Rigorous modelling of the spherical gravitational potential spectra from the volumetric density and geometry of an attracting body is discussed. Firstly, we derive mathematical formulas for the spatial analysis of spherical harmonic coefficients. Secondly, we present a numerically efficient algorithm for rigorous forward modelling. We consider the finite-amplitude topographic modelling methods as special cases, with additional postulates on the volumetric density and geometry. Thirdly, we implement our algorithm in the form of computer programs and test their correctness with respect to the finite-amplitude topography routines. For this purpose, synthetic and realistic numerical experiments, applied to the gravitational field and geometry of the Moon, are performed. We also investigate the optimal choice of input parameters for the finite-amplitude modelling methods. Fourthly, we exploit the rigorous forward modelling for the determination of the spherical gravitational potential spectra inferred by lunar crustal models with uniform, laterally variable, radially variable, and spatially (3D) variable bulk density. Also, we analyse these four different crustal models in terms of their spectral characteristics and band-limited radial gravitation. We demonstrate the applicability of the rigorous forward modelling using currently available computational resources up to degree and order 2519 of the spherical harmonic expansion, which corresponds to a resolution of 2.2 km on the surface of the Moon. Computer codes, a user manual and scripts developed for the purposes of this study are publicly available to potential users.
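    The "spherical gravitational potential spectra" discussed here are the coefficients of the standard exterior spherical harmonic expansion, written below in conventional geodetic notation (which may differ in detail from the authors' own conventions):

```latex
V(r,\vartheta,\lambda) \;=\; \frac{GM}{r}
\sum_{n=0}^{n_{\max}}\left(\frac{R}{r}\right)^{n}
\sum_{m=0}^{n}\bar{P}_{nm}(\cos\vartheta)\,
\bigl(\bar{C}_{nm}\cos m\lambda + \bar{S}_{nm}\sin m\lambda\bigr)
```

    Here GM is the gravitational parameter, R a reference radius, \bar{P}_{nm} the fully normalized associated Legendre functions, and \bar{C}_{nm}, \bar{S}_{nm} the Stokes coefficients; truncation at n_max = 2519 corresponds to the 2.2 km lunar surface resolution quoted in the abstract.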

  18. Medicine, methodology, and values: trade-offs in clinical science and practice.

    PubMed

    Ho, Vincent K Y

    2011-01-01

    The current guidelines of evidence-based medicine (EBM) presuppose that clinical research and clinical practice should advance from rigorous scientific tests as they generate reliable, value-free knowledge. Under this presupposition, hypotheses postulated by doctors and patients in the process of their decision making are preferably tested in randomized clinical trials (RCTs), and in systematic reviews and meta-analyses summarizing outcomes from multiple RCTs. Since testing under this scheme is predominantly focused on the criteria of generality and precision achieved through methodological rigor, at the cost of the criterion of realism, translating test results to clinical practice is often problematic. Choices concerning which methodological criteria should have priority are inevitable, however, as clinical trials, and scientific research in general, cannot meet all relevant criteria at the same time. Since these choices may be informed by considerations external to science, we must acknowledge that science cannot be value-free in a strict sense, and this invites a more prominent role for value-laden considerations in evaluating clinical research. The urgency for this becomes even more apparent when we consider the important yet implicit role of scientific theories in EBM, which may also be subjected to methodological evaluation and for which selectiveness in methodological focus is likewise inevitable.

  19. 21st Century Mathematics

    ERIC Educational Resources Information Center

    Seeley, Cathy

    2004-01-01

    This article addresses some important issues in mathematics instruction at the middle and secondary levels, including the structuring of a district's mathematics program; the choice of textbooks and use of calculators in the classroom; the need for more rigorous lesson planning practices; and the dangers of teaching to standardized tests rather…

  20. NOVEL OXIDANT FOR ELEMENTAL MERCURY CONTROL FROM FLUE GAS

    EPA Science Inventory

    A novel economical oxidant has been developed for elemental mercury (Hg(0)) removal from coal-fired boilers. The oxidant was rigorously tested in a lab-scale fixed-bed system with the Norit America's FGD activated carbon (DOE's benchmark sorbent) in a typical PRB subbituminous/l...

  1. THE DETERMINATION OF TOTAL ORGANIC HALIDE IN WATER: A COMPARATIVE STUDY OF TWO INSTRUMENTS

    EPA Science Inventory

    Total organic halide (TOX) analyzers are commonly used to measure the amount of dissolved halogenated organic byproducts in disinfected waters. Because of the lack of information on the identity of disinfection byproducts, rigorous testing of the dissolved organic halide (DOX) pro...

  2. Teachers' Perceptions of Kindergarten Readiness Indicators

    ERIC Educational Resources Information Center

    Boylan, Tronya E.

    2017-01-01

    The study of school readiness is multifaceted, encompassing an understanding of many developmental areas and skills. In the current educational culture of high-stakes testing, increased rigor, and high learning expectations, parents may be concerned about a child's readiness to begin kindergarten. With increased accountability, teachers may also…

  3. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    PubMed

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.
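    The single-facet core of a G-theory analysis can be sketched in a few lines: estimate person and residual variance components from a persons x trials score matrix via expected mean squares, then form the generalizability coefficient for the trial-averaged score. The sketch below is a minimal Python illustration with synthetic data; it is not the ERA Toolbox's (MATLAB) implementation, and the function name and values are invented.

```python
import numpy as np

def g_coefficient(scores):
    """One-facet (persons x trials) G study via expected mean squares.
    Returns the generalizability coefficient E(rho^2) for the mean
    over the observed number of trials."""
    n_p, n_t = scores.shape
    grand = scores.mean()
    p_means = scores.mean(axis=1)
    t_means = scores.mean(axis=0)
    ss_p = n_t * np.sum((p_means - grand) ** 2)
    ss_t = n_p * np.sum((t_means - grand) ** 2)
    ss_res = np.sum((scores - grand) ** 2) - ss_p - ss_t
    ms_p = ss_p / (n_p - 1)
    ms_res = ss_res / ((n_p - 1) * (n_t - 1))
    var_p = max((ms_p - ms_res) / n_t, 0.0)  # person variance component
    var_res = ms_res                          # residual (person x trial) component
    return var_p / (var_p + var_res / n_t)

# Synthetic ERP-like scores: 30 subjects x 12 trials with a stable
# person effect plus trial-level noise (illustrative values only)
rng = np.random.default_rng(2)
person = rng.normal(0, 2.0, size=(30, 1))
scores = person + rng.normal(0, 1.5, size=(30, 12))
print(f"G coefficient: {g_coefficient(scores):.3f}")
```

    Note how the trial facet enters only through the error term divided by the number of trials, which is why retaining more trials for averaging raises ERP score reliability.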

  4. Experimental evaluation of rigor mortis. VI. Effect of various causes of death on the evolution of rigor mortis.

    PubMed

    Krompecher, T; Bergerioux, C; Brandt-Casadevall, C; Gujer, H R

    1983-07-01

    The evolution of rigor mortis was studied in cases of nitrogen asphyxia, drowning and strangulation, as well as in fatal intoxications due to strychnine, carbon monoxide and curariform drugs, using a modified method of measurement. Our experiments demonstrated that: (1) Strychnine intoxication hastens the onset and passing of rigor mortis. (2) CO intoxication delays the resolution of rigor mortis. (3) The intensity of rigor may vary depending upon the cause of death. (4) If the stage of rigidity is to be used to estimate the time of death, it is necessary: (a) to perform a succession of objective measurements of rigor mortis intensity; and (b) to verify the eventual presence of factors that could play a role in the modification of its development.

  5. Evaluating habitat suitability models for nesting white-headed woodpeckers in unburned forest

    Treesearch

    Quresh S. Latif; Victoria A. Saab; Kim Mellen-Mclean; Jonathan G. Dudley

    2015-01-01

    Habitat suitability models can provide guidelines for species conservation by predicting where species of interest are likely to occur. Presence-only models are widely used but typically provide only relative indices of habitat suitability (HSIs), necessitating rigorous evaluation often using independently collected presence-absence data. We refined and evaluated...

  6. Teacher Efficacy of High School Mathematics Co-Teachers

    ERIC Educational Resources Information Center

    Rimpola, Raquel C.

    2011-01-01

    High school mathematics inclusion classes help provide all students the access to rigorous curriculum. This study provides information about the teacher efficacy of high school mathematics co-teachers. It considers the influence of the amount of collaborative planning time on the efficacy of co-teachers. A quantitative research design was used,…

  7. RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT.

    PubMed

    Meltzer, S J; Auer, J

    1908-01-01

    Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions-nearly equimolecular to "physiological" solutions of sodium chloride-are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor, only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium hastens also the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle.

  8. RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT

    PubMed Central

    Meltzer, S. J.; Auer, John

    1908-01-01

    Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions—nearly equimolecular to "physiological" solutions of sodium chloride—are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor, only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium hastens also the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle. PMID:19867124

  9. Rigorous Performance Evaluation of Smartphone GNSS/IMU Sensors for ITS Applications

    PubMed Central

    Gikas, Vassilis; Perakis, Harris

    2016-01-01

    With the rapid growth in smartphone technologies and improvement in their navigation sensors, an increasing amount of location information is now available, opening the road to the provision of new Intelligent Transportation System (ITS) services. Current smartphone devices embody miniaturized Global Navigation Satellite System (GNSS), Inertial Measurement Unit (IMU) and other sensors capable of providing user position, velocity and attitude. However, it is hard to characterize their actual positioning and navigation performance capabilities due to the disparate sensor and software technologies adopted among manufacturers and the high influence of environmental conditions, and therefore, a unified certification process is missing. This paper presents the analysis results obtained from the assessment of two modern smartphones regarding their positioning accuracy (i.e., precision and trueness) capabilities (i.e., potential and limitations) based on a practical but rigorous methodological approach. Our investigation relies on the results of several vehicle tracking (i.e., cruising and maneuvering) tests realized through comparing smartphone obtained trajectories and kinematic parameters to those derived using a high-end GNSS/IMU system and advanced filtering techniques. Performance testing is undertaken for the HTC One S (Android) and iPhone 5s (iOS). Our findings indicate that the deviation of the smartphone locations from ground truth (trueness) deteriorates by a factor of two in obscured environments compared to those derived in open sky conditions. Moreover, it appears that iPhone 5s produces relatively smaller and less dispersed error values compared to those computed for HTC One S. Also, the navigation solution of the HTC One S appears to adapt faster to changes in environmental conditions, suggesting a somewhat different data filtering approach for the iPhone 5s. 
Testing the accuracy of the accelerometer and gyroscope sensors for a number of maneuvering (speeding, turning, etc.) events reveals high consistency between smartphones, whereas the small deviations from ground truth verify their high potential even for critical ITS safety applications. PMID:27527187
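    The trueness/precision evaluation described above amounts to comparing time-matched smartphone positions against the reference trajectory. The sketch below is a simplified illustration with synthetic tracks (a real evaluation would first align timestamps and project coordinates to a local frame); the function name and numbers are invented.

```python
import numpy as np

def position_error_stats(phone_xy, ref_xy):
    """Horizontal error of a smartphone track against a reference track
    (both as N x 2 arrays of local east/north coordinates, metres).
    Returns mean error magnitude (a simple trueness measure), its
    standard deviation (precision), and RMSE."""
    err = np.linalg.norm(phone_xy - ref_xy, axis=1)
    return err.mean(), err.std(ddof=1), np.sqrt(np.mean(err ** 2))

# Synthetic 1 Hz tracks: a reference path plus a biased, noisy phone solution
rng = np.random.default_rng(3)
ref = np.cumsum(rng.normal(0, 1, size=(600, 2)), axis=0)   # reference trajectory
phone = ref + 2.0 + rng.normal(0, 1.5, size=(600, 2))      # offset + noise
trueness, precision, rmse = position_error_stats(phone, ref)
```

    Separating the systematic offset (trueness) from the random scatter (precision) is what lets an evaluation like this attribute degradation in obscured environments to bias rather than noise, or vice versa.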

  10. Rigorous Performance Evaluation of Smartphone GNSS/IMU Sensors for ITS Applications.

    PubMed

    Gikas, Vassilis; Perakis, Harris

    2016-08-05

    With the rapid growth in smartphone technologies and improvement in their navigation sensors, an increasing amount of location information is now available, opening the road to the provision of new Intelligent Transportation System (ITS) services. Current smartphone devices embody miniaturized Global Navigation Satellite System (GNSS), Inertial Measurement Unit (IMU) and other sensors capable of providing user position, velocity and attitude. However, it is hard to characterize their actual positioning and navigation performance capabilities due to the disparate sensor and software technologies adopted among manufacturers and the high influence of environmental conditions, and therefore, a unified certification process is missing. This paper presents the analysis results obtained from the assessment of two modern smartphones regarding their positioning accuracy (i.e., precision and trueness) capabilities (i.e., potential and limitations) based on a practical but rigorous methodological approach. Our investigation relies on the results of several vehicle tracking (i.e., cruising and maneuvering) tests realized through comparing smartphone obtained trajectories and kinematic parameters to those derived using a high-end GNSS/IMU system and advanced filtering techniques. Performance testing is undertaken for the HTC One S (Android) and iPhone 5s (iOS). Our findings indicate that the deviation of the smartphone locations from ground truth (trueness) deteriorates by a factor of two in obscured environments compared to those derived in open sky conditions. Moreover, it appears that iPhone 5s produces relatively smaller and less dispersed error values compared to those computed for HTC One S. Also, the navigation solution of the HTC One S appears to adapt faster to changes in environmental conditions, suggesting a somewhat different data filtering approach for the iPhone 5s. 
Testing the accuracy of the accelerometer and gyroscope sensors for a number of maneuvering (speeding, turning, etc.) events reveals high consistency between smartphones, whereas the small deviations from ground truth verify their high potential even for critical ITS safety applications.

  11. The development of research tools used in the STORK Study (the Scottish Trial of Refer or Keep) to explore midwives' intrapartum decision making.

    PubMed

    Styles, Maggie; Cheyne, Helen; O'Carroll, Ronan; Greig, Fiona; Dagge-Bell, Fiona; Niven, Catherine

    2011-10-01

    The aim was to develop appropriate tools to assess midwives' attitudes and behaviour in relation to decision making involving risk. A questionnaire and a series of vignettes were developed and tested to explore midwives' intrapartum decision making in relation to their attitudes towards risk. An innovative online computer package was developed specifically for use in the STORK Study, which enabled the programme to be very tightly controlled, with limited functions accessible to participants. A pilot study was conducted with over 50 midwives and nurses to ensure face and content validity of the vignettes and questionnaire. Initially designed as a paper-based study, rigorous piloting highlighted many difficulties with that format; the solution was to develop the study as a secure online package. Online data collection gave the researchers a greater degree of control over the data collection process than is achievable using traditional paper survey methods. Another example of this control is the immediate entry of participants' responses into a background database, which automatically stores and backs up the data, so no additional time is required for data entry. The cost of employing an information technology professional was easily offset by the financial savings made through the limited use of stationery and postage. Although the development and testing of the research tools for the STORK Study was labour and time intensive, ultimately a questionnaire and vignette package was produced that had been rigorously tested by over 50 midwives and nurses. The researchers are confident in the reliability of the questionnaire and vignettes, as well as the validity of the data collected. 
The use of an online survey is clearly indicated when the population has readily available internet access, and where controlling the process of data collection is required, as such control cannot be achieved in traditional survey and questionnaire implementation. Copyright © 2010 Elsevier Ltd. All rights reserved.

  12. Hardware

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The full complement of EDOMP investigations called for a broad spectrum of flight hardware ranging from commercial items, modified for spaceflight, to custom designed hardware made to meet the unique requirements of testing in the space environment. In addition, baseline data collection before and after spaceflight required numerous items of ground-based hardware. Two basic categories of ground-based hardware were used in EDOMP testing before and after flight: (1) hardware used for medical baseline testing and analysis, and (2) flight-like hardware used both for astronaut training and medical testing. To ensure post-landing data collection, hardware was required at both the Kennedy Space Center (KSC) and the Dryden Flight Research Center (DFRC) landing sites. Items that were very large or sensitive to the rigors of shipping were housed permanently at the landing site test facilities. Therefore, multiple sets of hardware were required to adequately support the prime and backup landing sites plus the Johnson Space Center (JSC) laboratories. Development of flight hardware was a major element of the EDOMP. The challenges included obtaining or developing equipment that met the following criteria: (1) compact (small size and light weight), (2) battery-operated or requiring minimal spacecraft power, (3) sturdy enough to survive the rigors of spaceflight, (4) quiet enough to pass acoustics limitations, (5) shielded and filtered adequately to assure electromagnetic compatibility with spacecraft systems, (6) user-friendly in a microgravity environment, and (7) accurate and efficient operation to meet medical investigative requirements.

  13. Botanicals and Their Bioactive Phytochemicals for Women’s Health

    PubMed Central

    Dietz, Birgit M.; Hajirahimkhan, Atieh; Dunlap, Tareisha L.

    2016-01-01

    Botanical dietary supplements are increasingly popular for women’s health, particularly for older women. The specific botanicals women take vary as a function of age. Younger women will use botanicals for urinary tract infections, especially Vaccinium macrocarpon (cranberry), where there is evidence for efficacy. Botanical dietary supplements for premenstrual syndrome (PMS) are less commonly used, and rigorous clinical trials have not been done. Some examples include Vitex agnus-castus (chasteberry), Angelica sinensis (dong quai), Viburnum opulus/prunifolium (cramp bark and black haw), and Zingiber officinale (ginger). Pregnant women have also used ginger for relief from nausea. Natural galactagogues for lactating women include Trigonella foenum-graecum (fenugreek) and Silybum marianum (milk thistle); however, rigorous safety and efficacy studies are lacking. Older women suffering menopausal symptoms are increasingly likely to use botanicals, especially since the Women’s Health Initiative showed an increased risk for breast cancer associated with traditional hormone therapy. Serotonergic mechanisms similar to antidepressants have been proposed for Actaea/Cimicifuga racemosa (black cohosh) and Valeriana officinalis (valerian). Plant extracts with estrogenic activities for menopausal symptom relief include Glycine max (soy), Trifolium pratense (red clover), Pueraria lobata (kudzu), Humulus lupulus (hops), Glycyrrhiza species (licorice), Rheum rhaponticum (rhubarb), Vitex agnus-castus (chasteberry), Linum usitatissimum (flaxseed), Epimedium species (herba Epimedii, horny goat weed), and Medicago sativa (alfalfa). Some of the estrogenic botanicals have also been shown to have protective effects against osteoporosis. Several of these botanicals could have additional breast cancer preventive effects linked to hormonal, chemical, inflammatory, and/or epigenetic pathways. 
Finally, although botanicals are perceived as natural safe remedies, it is important for women and their healthcare providers to realize that they have not been rigorously tested for potential toxic effects and/or drug/botanical interactions. Understanding the mechanism of action of these supplements used for women’s health will ultimately lead to standardized botanical products with higher efficacy, safety, and chemopreventive properties. PMID:27677719

  14. Botanicals and Their Bioactive Phytochemicals for Women's Health.

    PubMed

    Dietz, Birgit M; Hajirahimkhan, Atieh; Dunlap, Tareisha L; Bolton, Judy L

    2016-10-01

    Botanical dietary supplements are increasingly popular for women's health, particularly for older women. The specific botanicals women take vary as a function of age. Younger women will use botanicals for urinary tract infections, especially Vaccinium macrocarpon (cranberry), where there is evidence for efficacy. Botanical dietary supplements for premenstrual syndrome (PMS) are less commonly used, and rigorous clinical trials have not been done. Some examples include Vitex agnus-castus (chasteberry), Angelica sinensis (dong quai), Viburnum opulus/prunifolium (cramp bark and black haw), and Zingiber officinale (ginger). Pregnant women have also used ginger for relief from nausea. Natural galactagogues for lactating women include Trigonella foenum-graecum (fenugreek) and Silybum marianum (milk thistle); however, rigorous safety and efficacy studies are lacking. Older women suffering menopausal symptoms are increasingly likely to use botanicals, especially since the Women's Health Initiative showed an increased risk for breast cancer associated with traditional hormone therapy. Serotonergic mechanisms similar to antidepressants have been proposed for Actaea/Cimicifuga racemosa (black cohosh) and Valeriana officinalis (valerian). Plant extracts with estrogenic activities for menopausal symptom relief include Glycine max (soy), Trifolium pratense (red clover), Pueraria lobata (kudzu), Humulus lupulus (hops), Glycyrrhiza species (licorice), Rheum rhaponticum (rhubarb), Vitex agnus-castus (chasteberry), Linum usitatissimum (flaxseed), Epimedium species (herba Epimedii, horny goat weed), and Medicago sativa (alfalfa). Some of the estrogenic botanicals have also been shown to have protective effects against osteoporosis. Several of these botanicals could have additional breast cancer preventive effects linked to hormonal, chemical, inflammatory, and/or epigenetic pathways. 
Finally, although botanicals are perceived as natural safe remedies, it is important for women and their healthcare providers to realize that they have not been rigorously tested for potential toxic effects and/or drug/botanical interactions. Understanding the mechanism of action of these supplements used for women's health will ultimately lead to standardized botanical products with higher efficacy, safety, and chemopreventive properties. Copyright © 2016 by The Author(s).

  15. Assessment and clinical management of bone disease in adults with eating disorders: a review.

    PubMed

    Drabkin, Anne; Rothman, Micol S; Wassenaar, Elizabeth; Mascolo, Margherita; Mehler, Philip S

    2017-01-01

    To review current medical literature regarding the causes and clinical management options for low bone mineral density (BMD) in adult patients with eating disorders. Low bone mineral density is a common complication of eating disorders with potentially lifelong debilitating consequences. Definitive, rigorous guidelines for screening, prevention and management are lacking. This article intends to provide a review of the literature to date and current options for prevention and treatment. Current, peer-reviewed literature was reviewed, interpreted and summarized. Any patient with lower than average BMD should weight restore and in premenopausal females, spontaneous menses should resume. Adequate vitamin D and calcium supplementation is important. Weight-bearing exercise should be avoided unless cautiously monitored by a treatment team in the setting of weight restoration. If a patient has a Z-score less than expected for age with a high fracture risk or likelihood of ongoing BMD loss, physiologic transdermal estrogen plus oral progesterone, bisphosphonates (alendronate or risedronate) or teriparatide could be considered. Other agents, such as denosumab and testosterone in men, have not been tested in eating-disordered populations and should only be trialed on an empiric basis if there is a high clinical concern for fractures or worsening bone mineral density. A rigorous peer-based approach to establish guidelines for evaluation and management of low bone mineral density is needed in this neglected subspecialty of eating disorders.

  16. Bayesian Reconstruction of Disease Outbreaks by Combining Epidemiologic and Genomic Data

    PubMed Central

    Jombart, Thibaut; Cori, Anne; Didelot, Xavier; Cauchemez, Simon; Fraser, Christophe; Ferguson, Neil

    2014-01-01

    Recent years have seen progress in the development of statistically rigorous frameworks to infer outbreak transmission trees (“who infected whom”) from epidemiological and genetic data. Making use of pathogen genome sequences in such analyses remains a challenge, however, with a variety of heuristic approaches having been explored to date. We introduce a statistical method exploiting both pathogen sequences and collection dates to unravel the dynamics of densely sampled outbreaks. Our approach identifies likely transmission events and infers dates of infections, unobserved cases and separate introductions of the disease. It also proves useful for inferring numbers of secondary infections and identifying heterogeneous infectivity and super-spreaders. After testing our approach using simulations, we illustrate the method with the analysis of the beginning of the 2003 Singaporean outbreak of Severe Acute Respiratory Syndrome (SARS), providing new insights into the early stage of this epidemic. Our approach is the first tool for disease outbreak reconstruction from genetic data widely available as free software, the R package outbreaker. It is applicable to various densely sampled epidemics, and improves previous approaches by detecting unobserved and imported cases, as well as allowing multiple introductions of the pathogen. Because of its generality, we believe this method will become a tool of choice for the analysis of densely sampled disease outbreaks, and will form a rigorous framework for subsequent methodological developments. PMID:24465202
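    The heuristic approaches the authors improve upon can be illustrated with a deliberately naive baseline: assign each case, as infector, the earlier-sampled case with the smallest genetic distance. The sketch below is not the outbreaker algorithm (which is Bayesian and models dates, unobserved cases, and importations jointly); the case IDs, dates, and sequences are hypothetical.

```python
def hamming(a, b):
    """Genetic distance between two aligned sequences (mismatch count)."""
    return sum(x != y for x, y in zip(a, b))

def naive_transmission_tree(cases):
    """For each case, pick as infector the earlier-sampled case with the
    smallest genetic distance; cases with no earlier case are introductions.
    cases: id -> (collection_date, aligned genome string)."""
    tree = {}
    for cid, (date, seq) in cases.items():
        earlier = [(hamming(seq, s), d, oid)
                   for oid, (d, s) in cases.items() if d < date]
        tree[cid] = min(earlier)[2] if earlier else None
    return tree

cases = {"A": (0, "AAAA"), "B": (1, "AAAT"), "C": (2, "AATT")}
tree = naive_transmission_tree(cases)  # {"A": None, "B": "A", "C": "B"}
```

    Such a greedy rule cannot detect unobserved intermediates or multiple introductions, which is precisely the gap the statistical method described above addresses.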

  17. Evidence-based Sensor Tasking for Space Domain Awareness

    NASA Astrophysics Data System (ADS)

    Jaunzemis, A.; Holzinger, M.; Jah, M.

    2016-09-01

    Space Domain Awareness (SDA) is the actionable knowledge required to predict, avoid, deter, operate through, recover from, and/or attribute cause to the loss and/or degradation of space capabilities and services. A main purpose of SDA is to provide decision-making processes with a quantifiable and timely body of evidence of behavior(s) attributable to specific space threats and/or hazards. To fulfill the promise of SDA, decision makers and analysts must pose specific hypotheses that may be supported or refuted by evidence, some of which can only be collected using sensor networks. While Bayesian inference may support some of these decision-making needs, it does not adequately capture ambiguity in supporting evidence; i.e., it struggles to rigorously quantify 'known unknowns' for decision makers. Over the past 40 years, evidential reasoning approaches such as Dempster-Shafer theory have been developed to address problems with ambiguous bodies of evidence. This paper applies mathematical theories of evidence using Dempster-Shafer expert systems to address the following critical issues: (1) how decision makers can pose critical decision criteria as rigorous, testable hypotheses; (2) how to interrogate these hypotheses to reduce ambiguity; and (3) how to task a network of sensors to gather evidence for multiple competing hypotheses. The theory is tested using a simulated sensor-tasking scenario balancing search versus track responsibilities.
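    Dempster-Shafer theory captures ambiguity by letting probability mass sit on sets of hypotheses rather than on single hypotheses, and fuses independent sources with Dempster's rule of combination. A minimal sketch follows; the two sensors and their mass assignments are hypothetical, not the paper's expert system.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets of hypotheses."""
    fused, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb  # mass on contradictory combinations
    return {k: v / (1.0 - conflict) for k, v in fused.items()}

# Frame of discernment: the tracked object is a threat (T) or benign (B).
T, B, TB = frozenset("T"), frozenset("B"), frozenset("TB")
sensor1 = {T: 0.6, TB: 0.4}           # 0.4 is explicit ambiguity ("unknown")
sensor2 = {T: 0.5, B: 0.2, TB: 0.3}
fused = combine(sensor1, sensor2)     # mass left on TB quantifies residual ambiguity
```

    The mass remaining on the full frame {T, B} after fusion is exactly the 'known unknown' a Bayesian posterior cannot express, and reducing it is a natural sensor-tasking objective.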

  18. The impact of employee assistance services on workplace outcomes: Results of a prospective, quasi-experimental study.

    PubMed

    Richmond, Melissa K; Pampel, Fred C; Wood, Randi C; Nunes, Ana P

    2017-04-01

    Employee Assistance Programs (EAPs) are widely used to help employees experiencing personal or work-related difficulties that impact work productivity. However, rigorous research on the effectiveness of these programs in improving work-related outcomes is lacking. The current study represents a major advance in EAP research by using a prospective, quasi-experimental design with a large and diverse employee base. Using propensity scores calculated from demographic, social, work-related, and psychological variables collected on baseline surveys, we matched 156 employees receiving EAP services to 188 non-EAP employees. Follow-up surveys were collected from 2 to 12 months post-baseline (M = 6.0). At follow-up, EAP employees had significantly greater reductions in absenteeism (b = -.596, p = .001) and presenteeism (b = -.217, p = .038), but not workplace distress (b = -.079, p = .448), than did non-EAP employees. Tests of moderation by baseline alcohol use, depression, anxiety, and productivity indicate that, for the most part, the program works equally well for all groups. However, EAP did more to reduce absenteeism for those who began with lower severity of depression and anxiety at baseline. Results provide the scientific rigor needed to demonstrate EAP impact on improved work outcomes. In the first study of its kind, findings confirm the value of EAPs in helping employees address personal and work-related concerns that are affecting job performance. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
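    The propensity-score matching step described above can be sketched as greedy 1:1 nearest-neighbour matching within a caliper. The unit IDs, scores, and caliper below are hypothetical; the study's actual matching procedure may differ.

```python
def greedy_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on propensity scores.
    treated, controls: dicts mapping unit id -> propensity score."""
    available = dict(controls)
    pairs = {}
    # Match the hardest-to-match (highest-score) treated units first.
    for t_id, t_ps in sorted(treated.items(), key=lambda kv: -kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_ps))
        if abs(available[c_id] - t_ps) <= caliper:
            pairs[t_id] = c_id
            del available[c_id]   # match without replacement
    return pairs

eap = {"t1": 0.81, "t2": 0.42, "t3": 0.30}                  # hypothetical scores
non_eap = {"c1": 0.80, "c2": 0.44, "c3": 0.10, "c4": 0.31}
matches = greedy_match(eap, non_eap)  # {"t1": "c1", "t2": "c2", "t3": "c4"}
```

    Treated units whose nearest control falls outside the caliper are left unmatched, trading sample size for comparability of the matched groups.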

  19. Systematic reviews of diagnostic tests in endocrinology: an audit of methods, reporting, and performance.

    PubMed

    Spencer-Bonilla, Gabriela; Singh Ospina, Naykky; Rodriguez-Gutierrez, Rene; Brito, Juan P; Iñiguez-Ariza, Nicole; Tamhane, Shrikant; Erwin, Patricia J; Murad, M Hassan; Montori, Victor M

    2017-07-01

    Systematic reviews provide clinicians and policymakers estimates of diagnostic test accuracy and their usefulness in clinical practice. We identified all available systematic reviews of diagnosis in endocrinology, summarized the diagnostic accuracy of the tests included, and assessed the credibility and clinical usefulness of the methods and reporting. We searched Ovid MEDLINE, EMBASE, and Cochrane CENTRAL from inception to December 2015 for systematic reviews and meta-analyses reporting accuracy measures of diagnostic tests in endocrinology. Experienced reviewers independently screened for eligible studies and collected data. We summarized the results, methods, and reporting of the reviews. We performed subgroup analyses to categorize diagnostic tests as most useful based on their accuracy. We identified 84 systematic reviews; half of the tests included were classified as helpful when positive, one-fourth as helpful when negative. Most authors adequately reported how studies were identified and selected and how their trustworthiness (risk of bias) was judged. Only one in three reviews, however, reported an overall judgment about trustworthiness, and one in five reported using adequate meta-analytic methods. One in four reported contacting authors for further information, and about half included only patients with diagnostic uncertainty. Up to half of the diagnostic endocrine tests for which the likelihood ratio was calculated or provided are likely to be helpful in practice when positive, as are one-quarter when negative. Most diagnostic systematic reviews in endocrinology lack methodological rigor and protection against bias, and offer limited credibility. Substantial efforts, therefore, seem necessary to improve the quality of diagnostic systematic reviews in endocrinology.

  20. Multivariate Qst–Fst Comparisons: A Neutrality Test for the Evolution of the G Matrix in Structured Populations

    PubMed Central

    Martin, Guillaume; Chapuis, Elodie; Goudet, Jérôme

    2008-01-01

    Neutrality tests in quantitative genetics provide a statistical framework for the detection of selection on polygenic traits in wild populations. However, the existing method based on comparisons of divergence at neutral markers and quantitative traits (Qst–Fst) suffers from several limitations that hinder a clear interpretation of the results with typical empirical designs. In this article, we propose a multivariate extension of this neutrality test based on empirical estimates of the among-populations (D) and within-populations (G) covariance matrices by MANOVA. A simple pattern is expected under neutrality: D = 2Fst/(1 − Fst)G, so that neutrality implies both proportionality of the two matrices and a specific value of the proportionality coefficient. This pattern is tested using Flury's framework for matrix comparison [common principal-component (CPC) analysis], a well-known tool in G matrix evolution studies. We show the importance of using a Bartlett adjustment of the test for the small sample sizes typically found in empirical studies. We propose a dual test: (i) that the proportionality coefficient is not different from its neutral expectation [2Fst/(1 − Fst)] and (ii) that the MANOVA estimates of mean square matrices between and among populations are proportional. These two tests combined provide a more stringent test for neutrality than the classic Qst–Fst comparison and avoid several statistical problems. Extensive simulations of realistic empirical designs suggest that these tests correctly detect the expected pattern under neutrality and have enough power to efficiently detect mild to strong selection (homogeneous, heterogeneous, or mixed) when it is occurring on a set of traits. This method also provides a rigorous and quantitative framework for disentangling the effects of different selection regimes and of drift on the evolution of the G matrix. We discuss practical requirements for the proper application of our test in empirical studies and potential extensions. PMID:18245845
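    The neutral expectation D = [2Fst/(1 − Fst)]G can be checked numerically: under neutrality, every element-wise ratio D_ij/G_ij should equal the same coefficient. A toy sketch with hypothetical matrices follows; the paper's actual test uses CPC analysis and Bartlett-adjusted MANOVA estimates, not this direct comparison.

```python
def neutral_coefficient(fst):
    """Proportionality coefficient expected under neutrality: D = c * G."""
    return 2.0 * fst / (1.0 - fst)

def elementwise_ratios(D, G):
    """D_ij / G_ij for nonzero G_ij; equal ratios mean D and G are proportional."""
    return [d / g
            for row_d, row_g in zip(D, G)
            for d, g in zip(row_d, row_g) if g != 0]

fst = 0.2
c = neutral_coefficient(fst)                 # 2*0.2/0.8 = 0.5
G = [[1.0, 0.3], [0.3, 0.8]]                 # within-population covariance (toy)
D = [[c * x for x in row] for row in G]      # D sitting exactly at the neutral expectation
ratios = elementwise_ratios(D, G)            # all ~0.5 -> consistent with neutrality
```

    Departures show up either as unequal ratios (non-proportionality) or as a common ratio differing from 2Fst/(1 − Fst), matching the dual test described above.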

  1. Incorporating uncertainty into medical decision making: an approach to unexpected test results.

    PubMed

    Bianchi, Matt T; Alexander, Brian M; Cash, Sydney S

    2009-01-01

    The utility of diagnostic tests derives from the ability to translate the population concepts of sensitivity and specificity into information that will be useful for the individual patient: the predictive value of the result. As the array of available diagnostic testing broadens, there is a temptation to de-emphasize history and physical findings and defer to the objective rigor of technology. However, diagnostic test interpretation is not always straightforward. One significant barrier to routine use of probability-based test interpretation is the uncertainty inherent in pretest probability estimation, the critical first step of Bayesian reasoning. The context in which this uncertainty presents the greatest challenge is when test results oppose clinical judgment. It is in this situation that decision support would be most helpful. The authors propose a simple graphical approach that incorporates uncertainty in pretest probability and has specific application to the interpretation of unexpected results. This method quantitatively demonstrates how uncertainty in disease probability may be amplified when test results are unexpected (opposing clinical judgment), even for tests with high sensitivity and specificity. The authors provide a simple nomogram for determining whether an unexpected test result suggests that one should "switch diagnostic sides." This graphical framework overcomes the limitation of pretest probability uncertainty in Bayesian analysis and guides decision making when it is most challenging: interpretation of unexpected test results.
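    The amplification the authors describe follows from Bayes' rule in odds form: post-test odds = pretest odds × likelihood ratio. A small sketch (hypothetical sensitivity, specificity, and pretest interval, not the authors' nomogram) shows how a negative result from an accurate test, set against strong clinical suspicion, can leave the diagnosis near a coin flip:

```python
def post_test_prob(pretest, lr):
    """Bayes' rule in odds form: post-test odds = pretest odds * LR."""
    odds = pretest / (1.0 - pretest) * lr
    return odds / (1.0 + odds)

sens, spec = 0.95, 0.95
lr_neg = (1.0 - sens) / spec      # likelihood ratio of a negative result (1/19)

# Pretest probability expressed as an interval of clinical judgment, not a point:
p_low = post_test_prob(0.70, lr_neg)   # ~0.11
p_high = post_test_prob(0.95, lr_neg)  # ~0.50: strong suspicion survives a negative test
```

    The interval [0.70, 0.95] of pretest belief maps to roughly [0.11, 0.50] after the unexpected negative result: the pretest uncertainty has been stretched, which is exactly when a "switch diagnostic sides" decision aid is needed.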

  2. Complementary and alternative medicine for the treatment and diagnosis of asthma and allergic diseases.

    PubMed

    Passalacqua, G; Compalati, E; Schiappoli, M; Senna, G

    2005-03-01

    The use of Complementary/Alternative Medicines (CAM) is widespread and constantly increasing, especially in the field of allergic diseases and asthma. Homeopathy, acupuncture and phytotherapy are the most frequently utilised treatments, whereas complementary diagnostic techniques are mainly used in the field of food allergy/intolerance. In the literature, the majority of clinical trials of CAM are of low methodological quality and thus difficult to interpret. Very few studies have been performed in a rigorously controlled fashion, and those studies provided inconclusive results. In asthma, no CAM has thus far been proved more effective than placebo or as effective as standard treatments. Some herbal products, containing active principles, have displayed some clinical effect, but herbal remedies are usually neither standardised nor quantified and thus carry the risk of toxic effects or interactions. None of the alternative diagnostic techniques (electrodermal testing, kinesiology, leukocytotoxic test, iridology, hair analysis) has been proved able to distinguish between healthy and allergic subjects or to diagnose sensitizations. Therefore these tests must not be used, since they can lead to delayed or incorrect diagnosis and therapy.

  3. Is it Code Imperfection or 'garbage in Garbage Out'? Outline of Experiences from a Comprehensive Adr Code Verification

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2013-12-01

    The advection-dispersion-reaction (ADR) equation describes many physical phenomena of interest in the field of water quality in natural streams and groundwater. In many cases, such as density-driven flow, multiphase reactive transport, and sediment transport, one or more terms in the ADR equation may become nonlinear. For that reason, numerical tools are the only practical choice for solving these PDEs. All numerical solvers developed for the transport equation need to undergo a code verification procedure before they are put into practice. Code verification is a mathematical activity to uncover failures and check for rigorous discretization of PDEs and implementation of initial/boundary conditions. In the computational PDE context, verification is not a well-defined procedure with a clear path. Thus, verification tests should be designed and implemented with in-depth knowledge of the numerical algorithms and the physics of the phenomena, as well as the mathematical behavior of the solution. Test results themselves need to be mathematically analyzed to distinguish between an inherent limitation of an algorithm and a coding error. Code verification therefore remains something of an art, in which innovative methods and case-based tricks are common. This study presents full verification of a general transport code. To that end, a complete test suite is designed to probe the ADR solver comprehensively and discover all possible imperfections. We convey our experiences in finding several errors that were not detectable with routine verification techniques. We developed a test suite including hundreds of unit tests and system tests, with a gradual increase in complexity from simple tests to the most sophisticated. Appropriate verification metrics are defined for the required capabilities of the solver as follows: mass conservation, convergence order, capability in handling stiff problems, nonnegative concentration, shape preservation, and absence of spurious wiggles. Thereby, we provide objective, quantitative values as opposed to subjective, qualitative descriptions such as 'weak' or 'satisfactory' agreement with those metrics. We start testing from a simple case of unidirectional advection, then bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity and reactions. For all of the mentioned cases we conduct mesh convergence tests, which compare the observed order of accuracy against the formal order of accuracy of the discretization. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh convergence study, and appropriate remedies, are also discussed. For cases in which appropriate benchmarks for a mesh convergence study are not available, we utilize Symmetry, Complete Richardson Extrapolation and the Method of False Injection to uncover bugs. Detailed discussions of the capabilities of these code verification techniques are given. Auxiliary subroutines for automation of the test suite and report generation were designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into an ADR solver's capabilities. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport.
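    The mesh-convergence metric described above, comparing observed against formal order of accuracy, can be sketched as follows. The synthetic second-order "solver" is hypothetical, and the three-grid variant is a Richardson-style estimate of the kind usable when no exact benchmark exists:

```python
import math

def observed_order(e_coarse, e_fine, r):
    """Observed convergence order from errors on two grids, refinement factor r."""
    return math.log(e_coarse / e_fine) / math.log(r)

def observed_order_3grid(f1, f2, f3, r):
    """Richardson-style order estimate from solutions on three grids
    (f1 coarsest, f3 finest) when no exact solution is available."""
    return math.log((f1 - f2) / (f2 - f3)) / math.log(r)

# A synthetic 'solver' whose error behaves like C*h**2 (formally second order):
exact = 1.0
hs = [0.4, 0.2, 0.1]
solutions = [exact + 0.01 * h**2 for h in hs]
errors = [abs(s - exact) for s in solutions]

p2 = observed_order(errors[0], errors[1], 2.0)   # ~2.0
p3 = observed_order_3grid(*solutions, 2.0)       # ~2.0, without using `exact`
```

    An observed order well below the formal order flags a discretization or implementation bug, which is the pass/fail criterion such convergence tests apply.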

  4. Choosing a primary care provider

    MedlinePlus

    ... A.D.A.M. follows rigorous standards of quality and accountability. A.D.A.M. is among the first to achieve this important distinction for online health information and services. Learn more about A.D.A.M.'s editorial ...

  5. 75 FR 71131 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22

    ... impacts. To complete this task with scientific rigor, it will be necessary to collect high quality survey... instruments, methodologies, procedures, and analytical techniques for this task. Moreover, they have been pilot tested in 11 States. The tools and techniques were submitted for review, and were approved, by...

  6. Anticipating and Incorporating Stakeholder Feedback When Developing Value-Added Models

    ERIC Educational Resources Information Center

    Balch, Ryan; Koedel, Cory

    2014-01-01

    State and local education agencies across the United States are increasingly adopting rigorous teacher evaluation systems. Most systems formally incorporate teacher performance as measured by student test-score growth, sometimes by state mandate. An important consideration that will influence the long-term persistence and efficacy of these systems…

  7. NOVEL ECONOMICAL HG(0) OXIDATION REAGENT FOR MERCURY EMISSIONS CONTROL FROM COAL-FIRED BOILERS

    EPA Science Inventory

    The authors have developed a novel, economical additive for elemental mercury (Hg0) removal from coal-fired boilers. The oxidation reagent was rigorously tested in a lab-scale fixed-bed column with Norit America's FGD activated carbon (DOE's benchmark sorbent) in a typical PRB...

  8. After Common Core, States Set Rigorous Standards

    ERIC Educational Resources Information Center

    Peterson, Paul E.; Barrows, Samuel; Gift, Thomas

    2016-01-01

    In spite of Tea Party criticism, union skepticism, and anti-testing outcries, the campaign to implement Common Core State Standards (otherwise known as Common Core) has achieved phenomenal success in statehouses across the country. Since 2011, 45 states have raised their standards for student proficiency in reading and math, with the greatest…

  9. THE DETERMINATION OF TOTAL ORGANIC HALIDE IN WATER: AN INTERLABORATORY COMPARATIVE STUDY OF TWO METHODS

    EPA Science Inventory

    Total organic halide (TOX) analyzers are commonly used to measure the amount of dissolved halogenated organic byproducts in disinfected waters. Because of the lack of information on the identity of disinfection byproducts, rigorous testing of the dissolved organic halide (DOX) pr...

  10. Reporting Randomized Controlled Trials in Education

    ERIC Educational Resources Information Center

    Mayo-Wilson, Evan; Grant, Sean; Montgomery, Paul

    2014-01-01

    Randomized controlled trials (RCTs) are increasingly used to evaluate programs and interventions in order to inform education policy and practice. High quality reports of these RCTs are needed for interested readers to understand the rigor of the study, the interventions tested, and the context in which the evaluation took place (Mayo-Wilson et…

  11. An Educational and Entrepreneurial Ecosystem to Actualize Technology-Based Social Ventures

    ERIC Educational Resources Information Center

    Mehta, Khanjan; Zappe, Sarah; Brannon, Mary Lynn; Zhao, Yu

    2016-01-01

    The Humanitarian Engineering and Social Entrepreneurship (HESE) Program engages students and faculty across Penn State in the rigorous research, design, field-testing, and launch of technology-based social enterprises that address global development challenges. HESE ventures are embedded in a series of five courses that integrate learning,…

  12. Challenges and Innovations in a Community-Based Participatory Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Goodkind, Jessica R.; Amer, Suha; Christian, Charlisa; Hess, Julia Meredith; Bybee, Deborah; Isakson, Brian L.; Baca, Brandon; Ndayisenga, Martin; Greene, R. Neil; Shantzek, Cece

    2017-01-01

    Randomized controlled trials (RCTs) are a long-standing and important design for conducting rigorous tests of the effectiveness of health interventions. However, many questions have been raised about the external validity of RCTs, their utility in explicating mechanisms of intervention and participants' intervention experiences, and their…

  13. New Assessments, New Rigor

    ERIC Educational Resources Information Center

    Joan Herman; Robert Linn

    2014-01-01

    Researching. Synthesizing. Reasoning with evidence. The PARCC and Smarter Balanced assessments are clearly setting their sights on complex thinking skills. Researchers Joan Herman and Robert Linn look at the new assessments to see how they stack up against Norman Webb's depth of knowledge framework as well as against current state tests. The…

  14. Reexamining Our Approach to College Access

    ERIC Educational Resources Information Center

    Pérez, Angel B.

    2017-01-01

    In this article Trinity College vice president for enrollment and student success, Angel Pérez addresses the nation's inability to offer consistent college preparation, academic rigor and counseling across varying socioeconomic communities. Research has highlighted the fact that standardized tests do more to keep low-income students out of top…

  15. Close Early Learning Gaps with Rigorous DAP

    ERIC Educational Resources Information Center

    Brown, Christopher P.; Mowry, Brian

    2015-01-01

    Rigorous DAP (developmentally appropriate practices) is a set of 11 principles of instruction intended to help close early childhood learning gaps. Academically rigorous learning environments create the conditions for children to learn at high levels. While academic rigor focuses on one dimension of education--academic--DAP considers the whole…

  16. (Energetics of silicate melts from thermal diffusion studies)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1989-01-01

    Research during the past year has been concentrated in four major areas. We are continuing work initiated during the first two years on modelling thermal diffusion in multicomponent silicate liquids. We have derived appropriate relations for ternary and quaternary systems and reanalyzed experimental thermal diffusion data for the ternary system fayalite-leucite-silica. In our manuscript entitled "Thermal Diffusion in Petrology," to be published in Adv. in Phy. Geochem., we show that these model results independently recover the compositional extent and temperature of liquid immiscibility in this system. Such retrieval provides a rigorous test of our theoretical predictions and of the simplified treatment of complex silicate liquids reported in Geochimica Cosmochimica Acta in 1986. The usefulness of our Soret research in providing mixing energies of silicate liquids has been recently confirmed by Ghiorso (1987, Cont. Min. Pet.). This demonstration provides a strategy for incorporating Soret data into the calibration of phase equilibrium-based solution models such as the one developed by Ghiorso. During the past year we also resumed our studies of thermal diffusion in borosilicate glasses, which also exhibit liquid immiscibility. Our objectives in studying these systems are (1) to further test our multicomponent thermal diffusion model and (2) to provide quantitative constraints on the mixing properties of these glass-forming systems, which are important for evaluating their suitability for storage of high-level nuclear waste. 16 refs.

  17. Rigorous Science: a How-To Guide.

    PubMed

    Casadevall, Arturo; Fang, Ferric C

    2016-11-08

    Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word "rigor" is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. Copyright © 2016 Casadevall and Fang.

  18. Identifying On-Orbit Test Targets for Space Fence Operational Testing

    NASA Astrophysics Data System (ADS)

    Pechkis, D.; Pacheco, N.; Botting, T.

    2014-09-01

    Space Fence will be an integrated system of two ground-based, S-band (2 to 4 GHz) phased-array radars located in Kwajalein and perhaps Western Australia [1]. Space Fence will cooperate with other Space Surveillance Network sensors to provide space object tracking and radar characterization data to support U.S. Strategic Command space object catalog maintenance and other space situational awareness needs. We present a rigorous statistical test design intended to test Space Fence to the letter of the program requirements as well as to characterize the system performance across the entire operational envelope. The design uses altitude, size, and inclination as independent factors in statistical tests of dependent variables (e.g., observation accuracy) linked to requirements. The analysis derives the type and number of necessary test targets. Comparing the resulting sample sizes with the number of currently known targets, we identify those areas where modelling and simulation methods are needed. Assuming hypothetical Kwajalein radar coverage and a conservative number of radar passes per object per day, we conclude that tests involving real-world space objects should take no more than 25 days to evaluate all operational requirements; almost 60 percent of the requirements can be tested in a single day and nearly 90 percent can be tested in one week or less. Reference: [1] L. Haines and P. Phu, Space Fence PDR Concept Development Phase, 2011 AMOS Conference Technical Papers.

  19. Experimental evaluation of rigor mortis. VII. Effect of ante- and post-mortem electrocution on the evolution of rigor mortis.

    PubMed

    Krompecher, T; Bergerioux, C

    1988-01-01

    The influence of electrocution on the evolution of rigor mortis was studied in rats. Our experiments showed that: (1) Electrocution hastens the onset of rigor mortis. After an electrocution of 90 s, complete rigor develops as early as 1 h post-mortem (p.m.), compared to 5 h p.m. for the controls. (2) Electrocution hastens the passing of rigor mortis. After an electrocution of 90 s, the first significant decrease occurs at 3 h p.m. (8 h p.m. in the controls). (3) These modifications in the evolution of rigor mortis are less pronounced in the limbs not directly touched by the electric current. (4) In cases of post-mortem electrocution, the changes are slightly less pronounced, the resistance is higher and the absorbed energy is lower compared with ante-mortem electrocution. The results are completed by two practical observations on human electrocution cases.

  20. Giant Panda Maternal Care: A Test of the Experience Constraint Hypothesis

    PubMed Central

    Snyder, Rebecca J.; Perdue, Bonnie M.; Zhang, Zhihe; Maple, Terry L.; Charlton, Benjamin D.

    2016-01-01

    The body condition constraint and the experience condition constraint hypotheses have both been proposed to account for differences in reproductive success between multiparous (experienced) and primiparous (first-time) mothers. However, because primiparous mothers are typically characterized by both inferior body condition and lack of experience when compared to multiparous mothers, interpreting experience related differences in maternal care as support for either the body condition constraint hypothesis or the experience constraint hypothesis is extremely difficult. Here, we examined maternal behaviour in captive giant pandas, allowing us to simultaneously control for body condition and provide a rigorous test of the experience constraint hypothesis in this endangered animal. We found that multiparous mothers spent more time engaged in key maternal behaviours (nursing, grooming, and holding cubs) and had significantly less vocal cubs than primiparous mothers. This study provides the first evidence supporting the experience constraint hypothesis in the order Carnivora, and may have utility for captive breeding programs in which it is important to monitor the welfare of this species’ highly altricial cubs, whose survival is almost entirely dependent on receiving adequate maternal care during the first few weeks of life. PMID:27272352

  1. EVA Health and Human Performance Benchmarking Study

    NASA Technical Reports Server (NTRS)

    Abercromby, A. F.; Norcross, J.; Jarvis, S. L.

    2016-01-01

    Multiple HRP Risks and Gaps require detailed characterization of human health and performance during exploration extravehicular activity (EVA) tasks; however, a rigorous and comprehensive methodology for characterizing and comparing the health and human performance implications of current and future EVA spacesuit designs does not exist. This study will identify and implement functional tasks and metrics, both objective and subjective, that are relevant to health and human performance, such as metabolic expenditure, suit fit, discomfort, suited postural stability, cognitive performance, and potentially biochemical responses, for humans working inside different EVA suits while performing functional tasks in the appropriate simulated reduced-gravity environments. This study will provide health and human performance benchmark data for humans working in current EVA suits (EMU, Mark III, and Z2) as well as in shirtsleeves, using a standard set of tasks and metrics with quantified reliability. Results and methodologies developed during this test will provide benchmark data against which future EVA suits and different suit configurations (e.g., varied pressure, mass, CG) may be reliably compared in subsequent tests. Results will also inform fitness-for-duty standards as well as design requirements and operations concepts for future EVA suits and other exploration systems.

  2. Performance of Al2O3:C optically stimulated luminescence dosimeters for clinical radiation therapy applications.

    PubMed

    Hu, B; Wang, Y; Zealey, W

    2009-12-01

    A commercial optically stimulated luminescence (OSL) dosimetry system developed by Landauer was tested to analyse the possibility of using OSL dosimetry for external beam radiotherapy planning checks. Experiments were performed to determine signal sensitivity, dose response range, beam type/energy dependency, reproducibility and linearity. Optical annealing processes to test OSL material reusability were also studied. In each case the measurements were converted into absorbed dose. The experimental results show that OSL dosimetry provides a wide dose response range, good linearity and reproducibility for doses up to 800 cGy. The OSL output is linear with dose up to 600 cGy, showing a maximum deviation from linearity of 2.0% for doses above 600 cGy. The standard deviation in the response of 20 dosimeters was 3.0%. After optical annealing using incandescent light, the readout intensity decreased by approximately 98% in the first 30 minutes. The readout intensity, I, decreased after repeated optical annealing as a power law, given by I ∝ t^(-1.3). This study concludes that OSL dosimetry can provide an alternative technique for in-vivo dosimetry if rigorous measurement protocols are established.
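    A power-law decay such as I ∝ t^(-1.3) is a straight line in log-log space, so the exponent can be recovered by ordinary least squares on log-transformed data. A sketch with synthetic (hypothetical) annealing data, not the study's measurements:

```python
import math

def loglog_slope(ts, intensities):
    """Least-squares slope of log(I) versus log(t): the power-law exponent."""
    xs = [math.log(t) for t in ts]
    ys = [math.log(i) for i in intensities]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

ts = [1.0, 2.0, 4.0, 8.0]                       # hypothetical annealing times
intensities = [100.0 * t ** -1.3 for t in ts]   # synthetic data following I = A * t**-1.3
slope = loglog_slope(ts, intensities)           # ~ -1.3
```

    With real, noisy readout data the fitted slope would carry an uncertainty, which is worth reporting alongside the exponent.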

  3. Upgrading geometry conceptual understanding and strategic competence through implementing rigorous mathematical thinking (RMT)

    NASA Astrophysics Data System (ADS)

    Nugraheni, Z.; Budiyono, B.; Slamet, I.

    2018-03-01

    To reach higher-order thinking skills (HOTS), students need to master conceptual understanding and strategic competence, two basic components of HOTS. Rigorous Mathematical Thinking (RMT) is a unique realization of the cognitive conceptual construction approach based on Feuerstein's theory of Mediated Learning Experience (MLE) and Vygotsky's sociocultural theory. This was a quasi-experimental study comparing an experimental class taught with RMT and a control class taught with Direct Learning (DL), the conventional learning activity, and it examined whether the two learning models had different effects on the conceptual understanding and strategic competence of junior high school students. The data were analyzed using Multivariate Analysis of Variance (MANOVA), which showed a significant difference between the experimental and control classes on mathematics conceptual understanding and strategic competence considered jointly (Wilks' Λ = 0.84). Further, independent t-tests showed significant differences between the two classes on both mathematical conceptual understanding and strategic competence. These results indicate that RMT had a positive impact on mathematics conceptual understanding and strategic competence.
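    The Wilks' Λ reported above is the ratio det(E)/det(E + H), where E is the within-groups and H the between-groups sum-of-squares-and-cross-products matrix; values near 0 indicate strong group separation, values near 1 none. A minimal sketch on hypothetical two-outcome scores (not the study's data):

```python
import numpy as np

def wilks_lambda(groups):
    """Wilks' Lambda for a one-way MANOVA.
    groups: list of (n_i x p) arrays, one row per subject."""
    all_data = np.vstack(groups)
    grand_mean = all_data.mean(axis=0)
    p = all_data.shape[1]
    E = np.zeros((p, p))  # within-groups SSCP
    H = np.zeros((p, p))  # between-groups SSCP
    for g in groups:
        m = g.mean(axis=0)
        d = g - m
        E += d.T @ d
        dm = (m - grand_mean).reshape(-1, 1)
        H += len(g) * (dm @ dm.T)
    return np.linalg.det(E) / np.linalg.det(E + H)

# Hypothetical (conceptual understanding, strategic competence) scores:
rng = np.random.default_rng(0)
rmt_class = rng.normal([75, 70], 8, size=(30, 2))
dl_class = rng.normal([68, 64], 8, size=(30, 2))
lam = wilks_lambda([rmt_class, dl_class])
print(round(lam, 2))  # between 0 and 1; smaller means stronger separation
```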

  4. New recommendations for measuring collagen solubility.

    PubMed

    Latorre, María E; Lifschitz, Adrian L; Purslow, Peter P

    2016-08-01

    Heat-solubility measurements of intramuscular collagen are usually conducted in 1/4-strength Ringer's solution at pH 7.4, despite this ionic strength and pH being inappropriate for post-rigor meat. The current work measured the percentage of soluble collagen and the hydrothermal isometric tension characteristics of perimysial strips from bovine semitendinosus muscle in 1/4 Ringer's solution, distilled water, PBS, or a solution with the same salt concentration as 1/4 Ringer's but at pH 5.6. Values of % soluble collagen were lower at pH 7.4 than at pH 5.6, and increasing ionic strength reduced % soluble collagen. The maximum perimysial isometric tension was independent of the bathing medium, but the percent relaxation was higher at pH 7.4 than at pH 5.6 and increased with the ionic strength of the medium. It is recommended that future measurements of collagen solubility and tests on connective tissue components of post-rigor meat be carried out in a solution with NaCl and KCl concentrations equivalent to those in 1/4 Ringer's, but at pH 5.6, a pH relevant to post-rigor meat. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Rigorous Schools and Classrooms: Leading the Way

    ERIC Educational Resources Information Center

    Williamson, Ronald; Blackburn, Barbara R.

    2010-01-01

    Turn your school into a student-centered learning environment, where rigor is at the heart of instruction in every classroom. From the bestselling author of "Rigor is Not a Four-Letter Word," Barbara Blackburn, and award-winning educator Ronald Williamson, this comprehensive guide to establishing a schoolwide culture of rigor is for principals and…

  6. Rigor Revisited: Scaffolding College Student Learning by Incorporating Their Lived Experiences

    ERIC Educational Resources Information Center

    Castillo-Montoya, Milagros

    2018-01-01

    This chapter explores how students' lived experiences contribute to the rigor of their thinking. Insights from research indicate faculty can enhance rigor by accounting for the many ways it may surface in the classroom. However, to see this type of rigor, we must revisit the way we conceptualize it for higher education.

  7. Effect of rigor temperature, ageing and display time on the meat quality and lipid oxidative stability of hot boned beef Semimembranosus muscle.

    PubMed

    Mungure, Tanyaradzwa E; Bekhit, Alaa El-Din A; Birch, E John; Stewart, Ian

    2016-04-01

    The effects of rigor temperature (5, 15, 20 and 25°C), ageing (3, 7, 14, and 21 days) and display time on the meat quality and lipid oxidative stability of hot boned beef M. semimembranosus (SM) muscle were investigated. Ultimate pH (pH(u)) was attained more rapidly at higher rigor temperatures. Electrical conductivity increased with rigor temperature (p<0.001). Tenderness, purge and cooking losses were not affected by rigor temperature; however, purge loss and tenderness increased with ageing (p<0.01). Lightness (L*) and redness (a*) of the SM increased as rigor temperature increased (p<0.01). Lipid oxidation was assessed using (1)H NMR, where changes in aliphatic to olefinic (R(ao)) and diallylmethylene (R(ad)) proton ratios can be rapidly monitored. R(ad), R(ao), PUFA and TBARS were not affected by rigor temperature; however, ageing and display time increased lipid oxidation (p<0.05). This study shows that rigor temperature manipulation of hot boned beef SM muscle does not have adverse effects on lipid oxidation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Mixed Criticality Scheduling for Industrial Wireless Sensor Networks

    PubMed Central

    Jin, Xi; Xia, Changqing; Xu, Huiting; Wang, Jintao; Zeng, Peng

    2016-01-01

    Wireless sensor networks (WSNs) have been widely used in industrial systems, where real-time performance and reliability are fundamental to production. Many works have studied these two aspects, but only for single-criticality WSNs. Mixed-criticality requirements exist in many advanced applications in which different data flows have different levels of importance (criticality). In this paper, we first propose a scheduling algorithm that guarantees the real-time performance and reliability requirements of data flows with different criticality levels. The algorithm supports centralized optimization and adaptive adjustment, improving both scheduling performance and flexibility. We then provide a schedulability test through rigorous theoretical analysis. We conduct extensive simulations, and the results demonstrate that the proposed scheduling algorithm and analysis significantly outperform existing ones. PMID:27589741
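    The abstract does not give the algorithm's details. As a purely illustrative sketch of the mixed-criticality idea only (hypothetical flow parameters and a simple utilization-based admission rule, not the paper's algorithm or its schedulability analysis): flows are considered in order of decreasing criticality, and a flow is admitted while total channel utilization stays within capacity.

```python
from dataclasses import dataclass

@dataclass
class Flow:
    name: str
    period_ms: float   # release period
    tx_time_ms: float  # channel time needed per release
    criticality: int   # higher = more critical

def admit(flows, capacity=1.0):
    """Admit flows in order of decreasing criticality (ties broken by
    shorter period) while total utilization stays within capacity."""
    admitted, used = [], 0.0
    for f in sorted(flows, key=lambda f: (-f.criticality, f.period_ms)):
        u = f.tx_time_ms / f.period_ms
        if used + u <= capacity:
            admitted.append(f.name)
            used += u
    return admitted

flows = [Flow("alarm", 10, 2, criticality=2),
         Flow("control", 20, 8, criticality=2),
         Flow("monitor", 50, 30, criticality=1)]
print(admit(flows))  # -> ['alarm', 'control']: the low-criticality flow is dropped
```

    A real industrial WSN schedulability test must also account for link reliability and retransmissions, which is precisely what the paper's analysis adds.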

  9. Nested variant of the method of moments of coupled cluster equations for vertical excitation energies and excited-state potential energy surfaces.

    PubMed

    Kowalski, Karol

    2009-05-21

    In this article we discuss the problem of properly balancing the noniterative corrections to ground- and excited-state energies obtained with approximate coupled cluster (CC) and equation-of-motion CC (EOMCC) approaches. It is demonstrated that, for a class of excited states dominated by single excitations and for states with a moderate doubly excited component, the newly introduced nested variant of the method of moments of CC equations provides a mathematically rigorous way of balancing ground- and excited-state correlation effects. The resulting noniterative methodology, which accounts for the effect of triples, is tested using its parallel implementation on systems for which iterative CC/EOMCC calculations with full inclusion of triply excited configurations, or their most important subset, are numerically feasible.

  10. Maximum entropy models as a tool for building precise neural controls.

    PubMed

    Savin, Cristina; Tkačik, Gašper

    2017-10-01

    Neural responses are highly structured, with population activity restricted to a small subset of the astronomical range of possible activity patterns. Characterizing these statistical regularities is important for understanding circuit computation, but challenging in practice. Here we review recent approaches based on the maximum entropy principle used for quantifying collective behavior in neural activity. We highlight recent models that capture population-level statistics of neural data, yielding insights into the organization of the neural code and its biological substrate. Furthermore, the MaxEnt framework provides a general recipe for constructing surrogate ensembles that preserve aspects of the data, but are otherwise maximally unstructured. This idea can be used to generate a hierarchy of controls against which rigorous statistical tests are possible. Copyright © 2017 Elsevier Ltd. All rights reserved.
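    The simplest such surrogate control, sampled here empirically by shuffling rather than by fitting a model, corresponds to the independent maximum entropy ensemble: each neuron's firing rate is preserved exactly while all correlations between neurons are destroyed. A minimal sketch on synthetic spike data (the raster dimensions and rate are hypothetical):

```python
import numpy as np

def rate_preserving_surrogate(spikes, rng):
    """Shuffle each neuron's spike train independently across time bins.
    First-order statistics (per-neuron spike counts) are preserved exactly;
    pairwise and higher-order correlations are destroyed."""
    surrogate = spikes.copy()
    for row in surrogate:
        rng.shuffle(row)  # in-place permutation of one neuron's bins
    return surrogate

rng = np.random.default_rng(2)
spikes = (rng.random((20, 500)) < 0.1).astype(int)  # 20 neurons, 500 bins
surr = rate_preserving_surrogate(spikes, rng)
print((spikes.sum(axis=1) == surr.sum(axis=1)).all())  # -> True: rates preserved
```

    Comparing a statistic of interest on the data against its distribution over many such surrogates gives the kind of rigorous statistical test the review describes; richer MaxEnt models (e.g. pairwise) preserve correspondingly more structure.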

  11. Real Time Optimal Control of Supercapacitor Operation for Frequency Response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Yusheng; Panwar, Mayank; Mohanpurkar, Manish

    2016-07-01

    Supercapacitors are gaining wider application in power systems due to their fast dynamic response, and utilizing them through power electronics interfaces for power compensation is a proven, effective technique. For applications such as frequency restoration, however, the cost of supercapacitor maintenance as well as the energy loss in the power electronics interfaces must be addressed, and it is infeasible to use traditional optimization control methods to mitigate the impacts of frequent cycling. This paper proposes a Front End Controller (FEC) using Generalized Predictive Control featuring real-time receding-horizon optimization. The optimization constraints are based on cost and thermal management to enhance the utilization efficiency of supercapacitors. A rigorous mathematical derivation is conducted, and test results acquired from a Digital Real Time Simulator are provided to demonstrate effectiveness.

  12. Computational Modeling Using OpenSim to Simulate a Squat Exercise Motion

    NASA Technical Reports Server (NTRS)

    Gallo, C. A.; Thompson, W. K.; Lewandowski, B. E.; Humphreys, B. T.; Funk, J. H.; Funk, N. H.; Weaver, A. S.; Perusek, G. P.; Sheehan, C. C.; Mulugeta, L.

    2015-01-01

    Long duration space travel to destinations such as Mars or an asteroid will expose astronauts to extended periods of reduced gravity. Astronauts will use an exercise regime for the duration of the space flight to minimize the loss of bone density, muscle mass and aerobic capacity that occurs during exposure to a reduced gravity environment. Since the area available in the spacecraft for an exercise device is limited and gravity is not present to aid loading, compact resistance exercise device prototypes are being developed. Since it is difficult to rigorously test these proposed devices in space flight, computational modeling provides an estimation of the muscle forces, joint torques and joint loads during exercise to gain insight on the efficacy to protect the musculoskeletal health of astronauts.

  13. Observations of fallibility in applications of modern programming methodologies

    NASA Technical Reports Server (NTRS)

    Gerhart, S. L.; Yelowitz, L.

    1976-01-01

    Errors, inconsistencies, or confusing points are noted in a variety of published algorithms, many of which are being used as examples in formulating or teaching principles of such modern programming methodologies as formal specification, systematic construction, and correctness proving. Common properties of these points of contention are abstracted. These properties are then used to pinpoint possible causes of the errors and to formulate general guidelines which might help to avoid further errors. The common characteristic of mathematical rigor and reasoning in these examples is noted, leading to some discussion about fallibility in mathematics, and its relationship to fallibility in these programming methodologies. The overriding goal is to cast a more realistic perspective on the methodologies, particularly with respect to older methodologies, such as testing, and to provide constructive recommendations for their improvement.

  14. Magpies can use local cues to retrieve their food caches.

    PubMed

    Feenders, Gesa; Smulders, Tom V

    2011-03-01

    Much importance has been placed on the use of spatial cues by food-hoarding birds in the retrieval of their caches. In this study, we investigate whether food-hoarding birds can be trained to use local cues ("beacons") in their cache retrieval. We test magpies (Pica pica) in an active hoarding-retrieval paradigm, where local cues are always reliable, while spatial cues are not. Our results show that the birds use the local cues to retrieve their caches, even when occasionally contradicting spatial information is available. The design of our study does not allow us to test rigorously whether the birds prefer using local over spatial cues, nor to investigate the process through which they learn to use local cues. We furthermore provide evidence that magpies develop landmark preferences, which improve their retrieval accuracy. Our findings support the hypothesis that birds are flexible in their use of memory information, using a combination of the most reliable or salient information to retrieve their caches. © Springer-Verlag 2010

  15. Retrieval induces forgetting, but only when nontested items compete for retrieval: Implication for interference, inhibition, and context reinstatement.

    PubMed

    Chan, Jason C K; Erdman, Matthew R; Davis, Sara D

    2015-09-01

    The mechanism responsible for retrieval-induced forgetting has been the subject of rigorous theoretical debate, with some researchers postulating that retrieval-induced forgetting can be explained by interference (J. G. W. Raaijmakers & E. Jakab, 2013) or context reinstatement (T. R. Jonker, P. Seli, & C. M. MacLeod, 2013), whereas others claim that it is better explained by inhibition (M. C. Anderson, 2003). A fundamental assumption of the inhibition account is that nonpracticed items are suppressed because they compete for retrieval during initial testing. In the current study, we manipulated competition in a novel interpolated testing paradigm by having subjects learn the nonpracticed items either before (high-competition condition) or after (low-competition condition) they practiced retrieval of the target items. We found retrieval-induced forgetting for the nonpracticed competitors only when they were studied before retrieval practice. This result provides support for a critical assumption of the inhibition account. (c) 2015 APA, all rights reserved.

  16. Double Dutch: A Tool for Designing Combinatorial Libraries of Biological Systems.

    PubMed

    Roehner, Nicholas; Young, Eric M; Voigt, Christopher A; Gordon, D Benjamin; Densmore, Douglas

    2016-06-17

    Recently, semirational approaches that rely on combinatorial assembly of characterized DNA components have been used to engineer biosynthetic pathways. In practice, however, it is not practical to assemble and test millions of pathway variants in order to elucidate how different DNA components affect the behavior of a pathway. To address this challenge, we apply a rigorous mathematical approach known as design of experiments (DOE) that can be used to construct empirical models of system behavior without testing all variants. To support this approach, we have developed a tool named Double Dutch, which uses a formal grammar and heuristic algorithms to automate the process of DOE library design. Compared to designing by hand, Double Dutch enables users to more efficiently and scalably design libraries of pathway variants that can be used in a DOE framework and uniquely provides a means to flexibly balance design considerations of statistical analysis, construction cost, and risk of homologous recombination, thereby demonstrating the utility of automating decision making when faced with complex design trade-offs.

  17. NextGen Operational Improvements: Will they Improve Human Performance

    NASA Technical Reports Server (NTRS)

    Beard, Bettina L.; Johnston, James C.; Holbrook, Jon

    2013-01-01

    Modernization of the National Airspace System depends critically on the development of advanced technology, including cutting-edge automation, controller decision-support tools and integrated on-demand information. The Next Generation Air Transportation System national plan envisions air traffic control tower automation that proposes solutions for seven problems: 1) departure metering, 2) taxi routing, 3) taxi and runway scheduling, 4) departure runway assignments, 5) departure flow management, 6) integrated arrival and departure scheduling and 7) runway configuration management. Government, academia and industry are simultaneously pursuing the development of these tools. For each tool, the development process typically begins by assessing its potential benefits, and then progresses to designing preliminary versions of the tool, followed by testing the tool's strengths and weaknesses using computational modeling, human-in-the-loop simulation and/or field tests. We compiled the literature, evaluated the methodological rigor of the studies and served as referee for partisan conclusions that were sometimes overly optimistic. Here we provide the results of this review.

  18. On-ground characterization of the Euclid's CCD273-based readout chain

    NASA Astrophysics Data System (ADS)

    Szafraniec, Magdalena; Azzollini, R.; Cropper, M.; Pottinger, S.; Khalil, A.; Hailey, M.; Hu, D.; Plana, C.; Cutts, A.; Hunt, T.; Kohley, R.; Walton, D.; Theobald, C.; Sharples, R.; Schmoll, J.; Ferrando, P.

    2016-07-01

    Euclid is a medium-class European Space Agency mission scheduled for launch in 2020. The goal of the survey is to examine the nature of Dark Matter and Dark Energy in the Universe. One of the cosmological probes used to analyze Euclid's data, the weak lensing technique, measures the distortions of galaxy shapes and therefore requires very accurate knowledge of the system point spread function (PSF). To ensure that galaxy shapes are not affected, the detector chain of the telescope's VISible Instrument (VIS) needs to meet specific performance requirements. Each of the 12 VIS readout chains, consisting of 3 CCDs, readout electronics (ROE) and a power supply unit (RPSU), will undergo rigorous on-ground testing to ensure that these requirements are met. This paper reports on the current status of the warm and cold testing of the VIS Engineering Model readout chain. Additionally, an early insight into the commissioning of the Flight Model calibration facility and program is provided.

  19. An operational global-scale ocean thermal analysis system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clancy, R. M.; Pollak, K.D.; Phoebus, P.A.

    1990-04-01

    The Optimum Thermal Interpolation System (OTIS) is an ocean thermal analysis system designed for operational use at FNOC. It is based on the optimum interpolation data assimilation technique and functions in an analysis-prediction-analysis data assimilation cycle with the TOPS mixed-layer model. OTIS provides a rigorous framework for combining real-time data, climatology, and predictions from numerical ocean prediction models to produce a large-scale synoptic representation of ocean thermal structure. The techniques and assumptions used in OTIS are documented, and results of operational tests of global-scale OTIS at FNOC are presented. The tests involved comparisons of OTIS against an existing operational ocean thermal structure model and were conducted during February, March, and April 1988. Qualitative comparison of the two products suggests that OTIS gives a more realistic representation of subsurface anomalies and horizontal gradients, and a more accurate analysis of the thermal structure, with the largest improvements below the mixed layer. 37 refs.
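    Optimum interpolation blends a background field x_b with observations y through the analysis equation x_a = x_b + K(y - H x_b), with gain K = B Hᵀ (H B Hᵀ + R)⁻¹ built from the background (B) and observation (R) error covariances. A minimal sketch with illustrative values (a 3-point temperature grid and one observation; nothing here reflects FNOC's actual configuration):

```python
import numpy as np

def oi_analysis(xb, B, y, H, R):
    """One optimum-interpolation analysis step:
    x_a = x_b + K (y - H x_b),  K = B H^T (H B H^T + R)^(-1)."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)

xb = np.array([10.0, 11.0, 12.0])   # background temperatures (deg C)
# Background error covariance with spatially correlated errors:
B = 0.5 * np.exp(-np.abs(np.subtract.outer(range(3), range(3))))
H = np.array([[1.0, 0.0, 0.0]])     # a single observation at grid point 0
R = np.array([[0.1]])               # observation error variance
y = np.array([10.8])                # observed temperature

xa = oi_analysis(xb, B, y, H, R)
print(np.round(xa, 2))  # point 0 is pulled toward the observation; via B, neighbours move too
```

    The spatial correlation in B is what spreads a single observation's influence to nearby grid points, which is how sparse real-time data, climatology, and model predictions get merged into one synoptic field.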

  20. Does metacognitive strategy instruction improve impaired receptive cognitive-communication skills following acquired brain injury?

    PubMed

    Copley, Anna; Smith, Kathleen; Savill, Katelyn; Finch, Emma

    2015-01-01

    To investigate whether metacognitive strategy instruction (MSI) improves the receptive language skills of adults with cognitive-communication disorders (CCDs) secondary to acquired brain injury (ABI), an ABA intervention programme was implemented with eight adults with ABI, aged 25-70 years. The Measure of Cognitive-Linguistic Abilities (MCLA) was administered at baseline and following treatment. The treatment involved three components: individual goal-based therapy, group remediation therapy using self-instruction, and home practice. No receptive language sub-test of the MCLA reached statistical significance; however, participants' raw-score improvements on receptive language sub-tests indicated that MSI may be effective at remediating CCDs following ABI. These preliminary findings indicate that MSI may improve receptive language skills in adults with CCDs following ABI. Further research with a more rigorous design, a larger sample size and a more reliable outcome measure is necessary and may provide statistically significant evidence for the effectiveness of MSI in remediating receptive language disorders.

  1. Dissolution curve comparisons through the F(2) parameter, a Bayesian extension of the f(2) statistic.

    PubMed

    Novick, Steven; Shen, Yan; Yang, Harry; Peterson, John; LeBlond, Dave; Altan, Stan

    2015-01-01

    Dissolution (or in vitro release) studies constitute an important aspect of pharmaceutical drug development. One important use of such studies is to justify a biowaiver for post-approval changes, which requires establishing equivalence between the new and old product. We propose a statistically rigorous modeling approach for this purpose based on estimation of what we refer to as the F2 parameter, an extension of the commonly used f2 statistic. A Bayesian test procedure is proposed for a set of composite hypotheses that capture the similarity requirement on the absolute mean differences between test and reference dissolution profiles. Several examples are provided to illustrate the application. Results of our simulation study comparing the performance of f2 and the proposed method show that the Bayesian approach is comparable to, and in many cases superior to, the f2 statistic as a decision rule. Further useful extensions of the method, such as the use of continuous-time dissolution modeling, are considered.
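    The conventional f2 statistic that the F2 parameter extends has a simple closed form, f2 = 50·log10(100 / sqrt(1 + mean squared difference)) over matched time points. A minimal sketch (the profile values are hypothetical):

```python
import math

def f2_similarity(ref, test):
    """Similarity factor f2 for two dissolution profiles given as
    percent dissolved at matched time points:
    f2 = 50 * log10(100 / sqrt(1 + mean squared difference))."""
    if len(ref) != len(test) or not ref:
        raise ValueError("profiles must be non-empty and equal length")
    msd = sum((r - t) ** 2 for r, t in zip(ref, test)) / len(ref)
    return 50.0 * math.log10(100.0 / math.sqrt(1.0 + msd))

# Identical profiles give the maximum value of 100;
# f2 >= 50 is the conventional similarity criterion.
reference = [15, 40, 65, 85, 95]
new_batch = [12, 38, 62, 83, 94]
print(round(f2_similarity(reference, reference), 1))  # -> 100.0
print(round(f2_similarity(reference, new_batch), 1))
```

    Being a point statistic, f2 ignores the uncertainty in the measured profiles, which is the gap the Bayesian F2 procedure is designed to close.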

  2. Standardization of Analysis Sets for Reporting Results from ADNI MRI Data

    PubMed Central

    Wyman, Bradley T.; Harvey, Danielle J.; Crawford, Karen; Bernstein, Matt A.; Carmichael, Owen; Cole, Patricia E.; Crane, Paul; DeCarli, Charles; Fox, Nick C.; Gunter, Jeffrey L.; Hill, Derek; Killiany, Ronald J.; Pachai, Chahin; Schwarz, Adam J.; Schuff, Norbert; Senjem, Matthew L.; Suhy, Joyce; Thompson, Paul M.; Weiner, Michael; Jack, Clifford R.

    2013-01-01

    The ADNI 3D T1-weighted MRI acquisitions provide a rich dataset for developing and testing analysis techniques for extracting structural endpoints. To promote greater rigor in analysis and meaningful comparison of different algorithms, the ADNI MRI Core has created standardized analysis sets of data comprising scans that met minimum quality control requirements. We encourage researchers to test and report their techniques against these data. Standard analysis sets of volumetric scans from ADNI-1 have been created, comprising: screening visits, 1 year completers (subjects who all have screening, 6 and 12 month scans), two year annual completers (screening, 1, and 2 year scans), two year completers (screening, 6 months, 1 year, 18 months (MCI only) and 2 years) and complete visits (screening, 6 months, 1 year, 18 months (MCI only), 2, and 3 year (normal and MCI only) scans). As the ADNI-GO/ADNI-2 data becomes available, updated standard analysis sets will be posted regularly. PMID:23110865

  3. Point-of-Care Technologies for Precision Cardiovascular Care and Clinical Research

    PubMed Central

    King, Kevin; Grazette, Luanda P.; Paltoo, Dina N.; McDevitt, John T.; Sia, Samuel K.; Barrett, Paddy M.; Apple, Fred S.; Gurbel, Paul A.; Weissleder, Ralph; Leeds, Hilary; Iturriaga, Erin J.; Rao, Anupama; Adhikari, Bishow; Desvigne-Nickens, Patrice; Galis, Zorina S.; Libby, Peter

    2016-01-01

    Point-of-care technologies (POC or POCT) are enabling innovative cardiovascular diagnostics that promise to improve patient care across diverse clinical settings. The National Heart, Lung, and Blood Institute convened a working group to discuss POCT in cardiovascular medicine. The multidisciplinary working group, which included clinicians, scientists, engineers, device manufacturers, regulatory officials, and program staff, reviewed the state of the POCT field; discussed opportunities for POCT to improve cardiovascular care, realize the promise of precision medicine, and advance the clinical research enterprise; and identified barriers facing translation and integration of POCT with existing clinical systems. A POCT development roadmap emerged to guide multidisciplinary teams of biomarker scientists, technologists, health care providers, and clinical trialists as they: 1) formulate needs assessments; 2) define device design specifications; 3) develop component technologies and integrated systems; 4) perform iterative pilot testing; and 5) conduct rigorous prospective clinical testing to ensure that POCT solutions have substantial effects on cardiovascular care. PMID:26977455

  4. Why Do We Need the Derivative for the Surface Area?

    ERIC Educational Resources Information Center

    Hristova, Yulia; Zeytuncu, Yunus E.

    2016-01-01

    Surface area and volume computations are the most common applications of integration in calculus books. When computing the surface area of a solid of revolution, students are usually told to use the frustum method instead of the disc method; however, a rigorous explanation is rarely provided. In this note, we provide one by using geometric…

  5. CMS-Wave

    DTIC Science & Technology

    2014-10-27

    a phase-averaged spectral wind-wave generation and transformation model and its interface in the Surface-water Modeling System (SMS). Ambrose...applications of the Boussinesq (BOUSS-2D) wave model that provides more rigorous calculations for design and performance optimization of integrated...navigation systems . Together these wave models provide reliable predictions on regional and local spatial domains and cost-effective engineering solutions

  6. High and low rigor temperature effects on sheep meat tenderness and ageing.

    PubMed

    Devine, Carrick E; Payne, Steven R; Peachey, Bridget M; Lowe, Timothy E; Ingram, John R; Cook, Christian J

    2002-02-01

    Immediately after electrical stimulation, the paired m. longissimus thoracis et lumborum (LT) of 40 sheep were boned out and wrapped tightly with polyethylene cling film. One of the paired LTs was chilled in 15°C air to reach a rigor mortis (rigor) temperature of 18°C; the other side was placed in a 35°C water bath and achieved rigor at that temperature. Wrapping reduced rigor shortening and mimicked meat left on the carcass. After rigor, the meat was aged at 15°C for 0, 8, 26 and 72 h and then frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values were obtained from a 1×1 cm cross-section. The shear force values for 18 and 35°C rigor meat were similar at zero ageing, but as ageing progressed the 18°C rigor meat aged faster and became more tender than meat that went into rigor at 35°C (P<0.001). The mean sarcomere lengths of the 18 and 35°C rigor samples differed significantly at each ageing time (P<0.001), the 35°C samples being shorter. When the short sarcomere length values and corresponding shear force values were removed from further data analysis, the shear force values for 35°C rigor were still significantly greater. Thus the toughness of 35°C meat was not a consequence of muscle shortening and appears to be due to both a faster rate of tenderisation and the meat tenderising to a greater extent at the lower temperature. Cook loss at 35°C rigor (30.5%) was greater than at 18°C rigor (28.4%) (P<0.01), and Hunter L colour values were higher at 35°C than at 18°C (P<0.01), but there were no significant differences in a or b values.

  7. Optimization of Wireless Power Transfer Systems Enhanced by Passive Elements and Metasurfaces

    NASA Astrophysics Data System (ADS)

    Lang, Hans-Dieter; Sarris, Costas D.

    2017-10-01

    This paper presents a rigorous optimization technique for wireless power transfer (WPT) systems enhanced by passive elements, ranging from simple reflectors and intermediate relays all the way to general electromagnetic guiding and focusing structures, such as metasurfaces and metamaterials. At its core is a convex semidefinite relaxation of the otherwise nonconvex optimization problem, whose tightness and optimality can be confirmed by a simple test of its solutions. The resulting method is rigorous, versatile, and general: it does not rely on any assumptions. As shown in various examples, it is able to efficiently and reliably optimize such WPT systems in order to find their physical limits on performance and optimal operating parameters, and to inspect their working principles, even for a large number of active transmitters and passive elements.

  8. GSA's Green Proving Ground: Identifying, Testing and Evaluating Innovative Technologies; Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kandt, A.; Lowell, M.

    2012-05-01

    This paper will provide an overview of the GPG program and its objectives as well as a summary and status update of the 16 technologies selected for enhanced testing and evaluation in 2011. The General Services Administration's (GSA) Public Buildings Service (PBS) acquires space on behalf of the federal government through new construction and leasing, and acts as a caretaker for federal properties across the country. PBS owns or leases 9,624 assets and maintains an inventory of more than 370.2 million square feet of workspace, and as such has enormous potential for implementing energy-efficient and renewable energy technologies to reduce energy and water use and associated emissions. The Green Proving Ground (GPG) program utilizes GSA's real estate portfolio to test and evaluate innovative and underutilized sustainable building technologies and practices. Findings are used to support the development of GSA performance specifications and inform decision making within GSA, other federal agencies, and the real estate industry. The program aims to drive innovation in environmental performance in federal buildings and help lead market transformation through deployment of new technologies. In 2011, the GPG program selected 16 technologies or practices for rigorous testing and evaluation. Evaluations are currently being performed in collaboration with the Department of Energy's national laboratories, and a steady stream of results will be forthcoming throughout 2012. Lastly, the paper provides a general overview of the 2012 program.

  9. Advanced High-Temperature Flexible TPS for Inflatable Aerodynamic Decelerators

    NASA Technical Reports Server (NTRS)

    DelCorso, Joseph A.; Cheatwood, F. McNeil; Bruce, Walter E., III; Hughes, Stephen J.; Calomino, Anthony M.

    2011-01-01

    Typical entry vehicle aeroshells are limited in size by the launch vehicle shroud. Inflatable aerodynamic decelerators allow larger aeroshell diameters for entry vehicles because they are not constrained to the launch vehicle shroud diameter. During launch, the hypersonic inflatable aerodynamic decelerator (HIAD) is packed in a stowed configuration. Prior to atmospheric entry, the HIAD is deployed to produce a drag device many times larger than the launch shroud diameter. The large surface area of the inflatable aeroshell provides deceleration of high-mass entry vehicles at relatively low ballistic coefficients. Even at these low ballistic coefficients there is still appreciable heating, requiring the HIAD to employ a thermal protection system (TPS). This TPS must be capable of surviving the heat pulse and the rigors of fabrication handling, high-density packing, deployment, and aerodynamic loading. This paper provides a comprehensive overview of flexible TPS tests and results conducted over the last three years, including an overview of each test facility, the general approach for testing flexible TPS, the thermal analysis methodology and results, and a comparison with 8-foot High Temperature Tunnel, Laser-Hardened Materials Evaluation Laboratory, and Panel Test Facility test data. Results are presented for a baseline TPS layup that can withstand a 20 W/cm2 heat flux, a silicon carbide (SiC) based TPS layup, and a polyimide insulator TPS layup. Recent work has focused on developing material layups expected to survive heat fluxes up to 50 W/cm2 (adequate for many potential applications); future work will consider concepts capable of withstanding more than 100 W/cm2 of incident radiant heat flux. This paper also provides an overview of the experimental setup, material layup configurations, facility conditions, and planned future flexible TPS activities.

  10. On analyticity of linear waves scattered by a layered medium

    NASA Astrophysics Data System (ADS)

    Nicholls, David P.

    2017-10-01

    The scattering of linear waves by periodic structures is a crucial phenomenon in many branches of applied physics and engineering. In this paper we establish rigorous analytic results necessary for the proper numerical analysis of a class of High-Order Perturbation of Surfaces methods for simulating such waves. More specifically, we prove a theorem on existence and uniqueness of solutions to a system of partial differential equations which model the interaction of linear waves with a multiply layered periodic structure in three dimensions. This result provides hypotheses under which a rigorous numerical analysis could be conducted for recent generalizations to the methods of Operator Expansions, Field Expansions, and Transformed Field Expansions.
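
    The High-Order Perturbation of Surfaces strategy the abstract refers to can be sketched schematically as follows (single-interface form, with illustrative notation not taken from the paper): the scattered field is expanded in powers of the boundary-deformation height, and the analyticity result is what justifies convergence of that expansion.

```latex
% Interface: y = \varepsilon f(x), with small deformation parameter \varepsilon.
% Expand the scattered field in powers of \varepsilon:
\[
  u(x, y; \varepsilon) = \sum_{n=0}^{\infty} u_n(x, y)\, \varepsilon^n ,
  \qquad \Delta u_n + k^2 u_n = 0 ,
\]
% Each corrector u_n solves a Helmholtz problem on the *flat* geometry,
% with boundary data forced by the lower-order terms u_0, ..., u_{n-1}.
% Analyticity of u in \varepsilon (the kind of result proved in the paper)
% guarantees the series converges for sufficiently small \varepsilon.
```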

  11. Guidelines for conducting rigorous health care psychosocial cross-cultural/language qualitative research.

    PubMed

    Arriaza, Pablo; Nedjat-Haiem, Frances; Lee, Hee Yun; Martin, Shadi S

    2015-01-01

    The purpose of this article is to synthesize and chronicle the authors' experiences as four bilingual and bicultural researchers, each experienced in conducting cross-cultural/cross-language qualitative research. Through narrative descriptions of experiences with Latinos, Iranians, and Hmong refugees, the authors discuss their rewards, challenges, and methods of enhancing rigor, trustworthiness, and transparency when conducting cross-cultural/cross-language research. The authors discuss and explore how to effectively manage cross-cultural qualitative data, how to effectively use interpreters and translators, how to identify best methods of transcribing data, and the role of creating strong community relationships. The authors provide guidelines for health care professionals to consider when engaging in cross-cultural qualitative research.

  12. DESCQA: An Automated Validation Framework for Synthetic Sky Catalogs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mao, Yao-Yuan; Kovacs, Eve; Heitmann, Katrin

    The use of high-quality simulated sky catalogs is essential for the success of cosmological surveys. The catalogs have diverse applications, such as investigating signatures of fundamental physics in cosmological observables, understanding the effect of systematic uncertainties on measured signals and testing mitigation strategies for reducing these uncertainties, aiding analysis pipeline development and testing, and survey strategy optimization. The list of applications is growing with improvements in the quality of the catalogs and the details that they can provide. Given the importance of simulated catalogs, it is critical to provide rigorous validation protocols that enable both catalog providers and users to assess the quality of the catalogs in a straightforward and comprehensive way. For this purpose, we have developed the DESCQA framework for the Large Synoptic Survey Telescope Dark Energy Science Collaboration as well as for the broader community. The goal of DESCQA is to enable the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. In this paper, we present the design concept and first implementation of DESCQA. In order to establish and demonstrate its full functionality we use a set of interim catalogs and validation tests. We highlight several important aspects, both technical and scientific, that require thoughtful consideration when designing a validation framework, including validation metrics and how these metrics impose requirements on the synthetic sky catalogs.
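
    The "common interface within an automated framework" idea can be sketched in a few lines of Python. All names below (the classes, the `get_quantity` accessor, the stellar-mass range check) are hypothetical illustrations, not DESCQA's actual API:

```python
from abc import ABC, abstractmethod

class ValidationTest(ABC):
    """One validation check, runnable against any catalog reader."""
    @abstractmethod
    def run(self, catalog) -> bool: ...

class StellarMassRangeTest(ValidationTest):
    """Hypothetical check: stellar masses lie in a plausible range (Msun)."""
    def __init__(self, lo=1e7, hi=1e13):
        self.lo, self.hi = lo, hi
    def run(self, catalog):
        masses = catalog.get_quantity("stellar_mass")
        return all(self.lo <= m <= self.hi for m in masses)

class DictCatalog:
    """Minimal catalog reader exposing the common accessor interface."""
    def __init__(self, data):
        self._data = data
    def get_quantity(self, name):
        return self._data[name]

def run_suite(tests, catalogs):
    """Run every test against every catalog; return {(test, catalog): bool}.
    An inhomogeneous set of catalogs only needs to implement get_quantity."""
    return {(t.__class__.__name__, name): t.run(cat)
            for t in tests for name, cat in catalogs.items()}
```

    The design point is that validation tests and catalog readers evolve independently: a new catalog only has to implement the accessor, and every registered test then applies to it automatically.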

  13. DESCQA: An Automated Validation Framework for Synthetic Sky Catalogs

    DOE PAGES

    Mao, Yao-Yuan; Kovacs, Eve; Heitmann, Katrin; ...

    2018-02-08

    The use of high-quality simulated sky catalogs is essential for the success of cosmological surveys. The catalogs have diverse applications, such as investigating signatures of fundamental physics in cosmological observables, understanding the effect of systematic uncertainties on measured signals and testing mitigation strategies for reducing these uncertainties, aiding analysis pipeline development and testing, and survey strategy optimization. The list of applications is growing with improvements in the quality of the catalogs and the details that they can provide. Given the importance of simulated catalogs, it is critical to provide rigorous validation protocols that enable both catalog providers and users to assess the quality of the catalogs in a straightforward and comprehensive way. For this purpose, we have developed the DESCQA framework for the Large Synoptic Survey Telescope Dark Energy Science Collaboration as well as for the broader community. The goal of DESCQA is to enable the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. In this paper, we present the design concept and first implementation of DESCQA. In order to establish and demonstrate its full functionality we use a set of interim catalogs and validation tests. We highlight several important aspects, both technical and scientific, that require thoughtful consideration when designing a validation framework, including validation metrics and how these metrics impose requirements on the synthetic sky catalogs.

  14. Antimüllerian hormone levels and antral follicle counts are not reduced compared with community controls in patients with rigorously defined unexplained infertility.

    PubMed

    Greenwood, Eleni A; Cedars, Marcelle I; Santoro, Nanette; Eisenberg, Esther; Kao, Chia-Ning; Haisenleder, Daniel J; Diamond, Michael P; Huddleston, Heather G

    2017-12-01

    To test the hypothesis that women with unexplained infertility demonstrate evidence of diminished ovarian reserve when compared with a population of community controls. Cross-sectional study. Multicenter university-based clinical practices. Study participants included 277 healthy, normo-ovulatory female partners with rigorously defined unexplained infertility randomly selected from a multicenter trial (Assessment of Multiple Intrauterine Gestations from Ovarian Stimulation). Controls included 226 healthy, normo-ovulatory women not seeking treatment for fertility from a community-based cohort (Ovarian Aging study). Measurements included serum antimüllerian hormone (AMH) assayed at a central laboratory, FSH, fasting serum metabolic testing, transvaginal ultrasonography for antral follicle counts (AFCs), and anthropometric measurements. Average AMH, AFC, and AMH/AFC were compared between infertile and control women by age. Analyses of covariance compared these outcomes while controlling for confounders, including age, race, body mass index, smoking history, and study site. In our models, the AMH, AFC, and AMH/AFC ovarian reserve indices did not differ between infertile women and community-based controls after controlling for these confounders. Currently utilized predictors of ovarian reserve do not discriminate women with rigorously defined unexplained infertility from healthy community-based women of similar demographic characteristics. Contrary to our hypothesis, among women with FSH in the normal range (≤12 IU/L), women with unexplained infertility did not show evidence of decreased ovarian reserve as measured by AMH and AFC. Ovarian reserve markers in isolation may not serve as predictors of future fertility. Copyright © 2017 American Society for Reproductive Medicine. All rights reserved.

  15. Exploring Student Perceptions of Rigor Online: Toward a Definition of Rigorous Learning

    ERIC Educational Resources Information Center

    Duncan, Heather E.; Range, Bret; Hvidston, David

    2013-01-01

    Technological advances in the last decade have impacted delivery methods of university courses. More and more courses are offered in a variety of formats. While academic rigor is a term often used, its definition is less clear. This mixed-methods study explored graduate student conceptions of rigor in the online learning environment embedded…

  16. Methodological rigor and citation frequency in patient compliance literature.

    PubMed Central

    Bruer, J T

    1982-01-01

    An exhaustive bibliography which assesses the methodological rigor of the patient compliance literature, and citation data from the Science Citation Index (SCI), are combined to determine whether methodologically rigorous papers are used with greater frequency than substandard articles by compliance investigators. There are low, but statistically significant, correlations between methodological rigor and citation indicators for 138 patient compliance papers published in SCI source journals during 1975 and 1976. The correlation is not strong enough to warrant use of citation measures as indicators of rigor on a paper-by-paper basis. The data do suggest that citation measures might be developed as crude indicators of methodological rigor. There is no evidence that randomized trials are cited more frequently than studies that employ other experimental designs. PMID:7114334
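
    Correlations between an ordinal rigor rating and citation counts are typically computed as rank correlations. The sketch below is a generic, stdlib-only Spearman implementation offered for illustration; it is not the paper's actual analysis code, and the data passed to it would be the per-paper (rigor score, citation count) pairs:

```python
def rank(xs):
    """Rank values 1..n, assigning tied values their average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average of positions i..j, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def spearman(a, b):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    return pearson(rank(a), rank(b))
```

    Because Spearman works on ranks, any monotone relationship between rigor and citations scores a perfect 1.0 even if the raw relationship is nonlinear, which suits skewed citation-count data.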

  17. Exploring the barriers to rigorous monitoring and evaluation of health systems strengthening activities: qualitative evidence from international development partners.

    PubMed

    Wisniewski, Janna M; Yeager, Valerie A; Diana, Mark L; Hotchkiss, David R

    2016-10-01

    The number of health systems strengthening (HSS) programs has increased in the last decade. However, a limited number of studies providing robust evidence for the value and impact of these programs are available. This study aims to identify knowledge gaps and challenges that impede rigorous monitoring and evaluation (M&E) of HSS, and to ascertain the extent to which these efforts are informed by existing technical guidance. Interviews were conducted with HSS advisors at United States Agency for International Development-funded missions as well as senior M&E advisors at implementing partner and multilateral organizations. Findings showed that mission staff do not use existing technical resources, either because they do not know about them or do not find them useful. Barriers to rigorous M&E included a lack of suitable indicators, data limitations, difficulty in demonstrating an impact on health, and insufficient funding and resources. Consensus and collaboration between international health partners and local governments may mitigate these challenges. Copyright © 2016 John Wiley & Sons, Ltd.

  18. Improving clinical cognitive testing

    PubMed Central

    Gale, Seth A.; Barrett, A.M.; Boeve, Bradley F.; Chatterjee, Anjan; Coslett, H. Branch; D'Esposito, Mark; Finney, Glen R.; Gitelman, Darren R.; Hart, John J.; Lerner, Alan J.; Meador, Kimford J.; Pietras, Alison C.; Voeller, Kytja S.; Kaufer, Daniel I.

    2015-01-01

    Objective: To evaluate the evidence basis of single-domain cognitive tests frequently used by behavioral neurologists in an effort to improve the quality of clinical cognitive assessment. Methods: Behavioral Neurology Section members of the American Academy of Neurology were surveyed about how they conduct clinical cognitive testing, with a particular focus on the Neurobehavioral Status Exam (NBSE). In contrast to general screening cognitive tests, an NBSE consists of tests of individual cognitive domains (e.g., memory or language) that provide a more comprehensive diagnostic assessment. Workgroups for each of 5 cognitive domains (attention, executive function, memory, language, and spatial cognition) conducted evidence-based reviews of frequently used tests. Reviews focused on suitability for office-based clinical practice, including test administration time, accessibility of normative data, disease populations studied, and availability in the public domain. Results: Demographic and clinical practice data were obtained from 200 respondents who reported using a wide range of cognitive tests. Based on survey data and ancillary information, between 5 and 15 tests in each cognitive domain were reviewed. Within each domain, several tests are highlighted as being well-suited for an NBSE. Conclusions: We identified frequently used single-domain cognitive tests that are suitable for an NBSE to help make informed choices about clinical cognitive assessment. Some frequently used tests have limited normative data or have not been well-studied in common neurologic disorders. Utilizing standardized cognitive tests, particularly those with normative data based on the individual's age and educational level, can enhance the rigor and utility of clinical cognitive assessment. PMID:26163433

  19. Sci-Sat AM: Radiation Dosimetry and Practical Therapy Solutions - 05: Not all geometries are equivalent for magnetic field Fano cavity tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malkov, Victor N.; Rogers, David W.O.

    The coupling of MRI and radiation treatment systems for the application of magnetic resonance guided radiation therapy necessitates a reliable magnetic field capable Monte Carlo (MC) code. In addition to the influence of the magnetic field on dose distributions, the question of proper calibration has arisen due to the several percent variation of ion chamber and solid state detector responses in magnetic fields when compared to the 0 T case (Reynolds et al., Med Phys, 2013). In the absence of a magnetic field, EGSnrc has been shown to pass the Fano cavity test (a rigorous benchmarking tool for MC codes) at the 0.1% level (Kawrakow, Med. Phys., 2000), and similar results should be required of magnetic field capable MC algorithms. To properly test such developing MC codes, the Fano cavity theorem has been adapted to function in a magnetic field (Bouchard et al., PMB, 2015). In this work, the Fano cavity test is applied in slab and ion-chamber-like geometries to test the transport options of an implemented magnetic field algorithm in EGSnrc. Results show that the deviation of the MC dose from the expected Fano cavity theory value is highly sensitive to the choice of geometry, and the ion chamber geometry appears to pass the test more easily than larger slab geometries. As magnetic field MC codes begin to be used for dose simulations and correction factor calculations, care must be taken to apply the most rigorous Fano test geometries to ensure reliability of such algorithms.
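
    The logic of a Fano cavity test, that a source emitting uniformly per unit mass in a medium of uniform composition but varying density must produce a uniform dose per unit mass, can be illustrated with a toy 1D Monte Carlo. This is a didactic sketch only, not EGSnrc's transport (it uses local energy deposition, periodic boundaries, and no magnetic field):

```python
import random

def fano_toy(densities, n_particles=200_000, seed=1):
    """Toy 1D periodic medium: emission uniform per unit mass, attenuation
    uniform per unit areal mass, energy deposited at the interaction point.
    Returns dose (energy / mass) per cell; Fano predicts a flat profile."""
    rng = random.Random(seed)
    n = len(densities)
    width = 1.0                          # cell width (arbitrary units)
    masses = [rho * width for rho in densities]
    total_mass = sum(masses)
    edep = [0.0] * n
    for _ in range(n_particles):
        # Sample emission cell with probability proportional to its mass.
        r = rng.random() * total_mass
        cell = 0
        while r > masses[cell]:
            r -= masses[cell]
            cell += 1
        pos = cell * width + rng.random() * width
        direction = 1 if rng.random() < 0.5 else -1
        # Free path sampled in areal-mass units (mean free path = 1).
        s = rng.expovariate(1.0)
        c = cell
        while True:
            if direction > 0:
                dist = (c + 1) * width - pos
            else:
                dist = pos - c * width
            areal = densities[c] * dist   # areal mass to the cell edge
            if s <= areal:
                edep[c] += 1.0            # interact: deposit unit energy
                break
            s -= areal                    # cross into the next cell
            c = (c + direction) % n       # periodic boundary
            pos = c * width if direction > 0 else (c + 1) * width
    return [e / m for e, m in zip(edep, masses)]
```

    In areal-mass coordinates the transport problem is homogeneous, so interactions land uniformly per unit mass and the dose profile is flat to within statistical noise regardless of how the density varies; a real code's pass criterion compares the simulated dose to this expectation at the sub-percent level.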

  20. Selecting participants when testing new drugs: the implications of age and gender discrimination.

    PubMed

    Ferguson, Pamela R

    2002-01-01

    Pharmaceutical products are rigorously tested for safety and efficacy prior to being licensed for use. During this testing process the archetypal research subject is a young male; women and older people are less frequently invited to participate. This is especially true at the early stages, but can also occur in the later phases of drug testing. This paper considers the reasons for the relative under-representation of these groups, and the legal implications of failing to include as research subjects the very types of people who will ultimately consume these drugs.

  1. Done in 60 Seconds - See a Massive Rocket Fuel Tank Built in a Minute

    NASA Image and Video Library

    2016-08-18

    The 7.5-minute test conducted at NASA’s Stennis Space Center is part of a series of tests designed to put the upgraded former space shuttle engines through the rigorous temperature and pressure conditions they will experience during a launch. The tests also support the development of a new controller, or “brain,” for the engine, which monitors engine status and communicates between the rocket and the engine, relaying commands to the engine and transmitting data back to the rocket.

  2. Testing Theoretical Models of Magnetic Damping Using an Air Track

    ERIC Educational Resources Information Center

    Vidaurre, Ana; Riera, Jaime; Monsoriu, Juan A.; Gimenez, Marcos H.

    2008-01-01

    Magnetic braking is a long-established application of Lenz's law. A rigorous analysis of the laws governing this problem involves solving Maxwell's equations in a time-dependent situation. Approximate models have been developed to describe different experimental results related to this phenomenon. In this paper we present a new method for the…

  3. Assessing face validity of a physical activity questionnaire for Spanish-speaking women in California

    USDA-ARS's Scientific Manuscript database

    Background: A review of the literature produced no rigorously tested and validated Spanish-language physical activity survey or evaluation tools for use by USDA’s food assistance and education programs. The purpose of the current study was to develop and evaluate the face validity of a visually enha...

  4. The Cognitive Processes Associated with Occupational/Career Indecision: A Model for Gifted Adolescents

    ERIC Educational Resources Information Center

    Jung, Jae Yup

    2013-01-01

    This study developed and tested a new model of the cognitive processes associated with occupational/career indecision for gifted adolescents. A survey instrument with rigorous psychometric properties, developed from a number of existing instruments, was administered to a sample of 687 adolescents attending three academically selective high schools…

  5. Ocean Profile Measurements during the Seasonal Ice Zone Reconnaissance Surveys

    DTIC Science & Technology

    2012-09-30

    physical processes that occur within the BCSIZ that require data from all components of SIZRS, and improve predictive models of the SIZ through model ...the IABP (Ignatius Rigor) are approved by the USCG for operation from the ADA aircraft, but we anticipate being informed of any Safety of Flight Test

  6. Tuskegee Airman Lee Hayes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hayes, Lee

    2006-08-03

    Hayes, a resident of Amagansett who worked at Brookhaven Lab as a custodian from 1958 to 1966, served in an all-black bomber squadron at Tuskegee Army Air Field in Alabama. He was among 994 precedent-breaking black soldiers at Tuskegee who passed rigorous tests between 1942 and 1946 to become pilots in the then-segregated armed forces.

  7. Immigrants as New Speakers in Galicia and Wales: Issues of Integration, Belonging and Legitimacy

    ERIC Educational Resources Information Center

    Bermingham, Nicola; Higham, Gwennan

    2018-01-01

    Immigrant integration in nation states increasingly focuses on the importance of learning the national state language. This is evidenced by increased emphasis on rigorous language testing and tighter citizenship regulations. This paper analyses immigrant integration in two sub-state contexts, Galicia and Wales, where presence of a national…

  8. Curriculum Articulation as a Means of Meeting the High Technology Challenge.

    ERIC Educational Resources Information Center

    Knight, Mary

    Education is being constantly criticized for its failure to produce literate graduates. Reformers are urging a return to basics, a more rigorous curriculum, and results that can be measured on achievement tests. Vocational education also is being criticized. Opponents think that vocational education prepares graduates for a narrow range of job…

  9. Outcomes of Global Environmentalism: Longitudinal and Cross-National Trends in Chemical Fertilizer and Pesticide Use

    ERIC Educational Resources Information Center

    Shorette, Kristen

    2012-01-01

    Previous research identifies changing world cultural norms as the impetus for a worldwide trend promoting environmentalism. However, the extent to which countries comply with the norms promoted and codified by environmental organizations and treaties has been less rigorously tested. Suspected noncompliance is generally explained as "decoupling"…

  10. REMOVAL OF ADDED NITRATE IN COTTON BURR COMPOST, MULCH COMPOST, AND PEAT: MECHANISMS AND POTENTIAL USE FOR GROUNDWATER NITRATE REMEDIATION

    EPA Science Inventory

    We conducted batch tests on the nature and kinetics of removal of added nitrate in cotton burr compost, mulch compost, and sphagnum peat that may be potentially used in a permeable reactive barrier (PRB) for groundwater nitrate remediation. A rigorous steam autoclaving protocol (...

  11. Edison - A New Cray Supercomputer Advances Discovery at NERSC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dosanjh, Sudip; Parkinson, Dula; Yelick, Kathy

    2014-02-06

    When a supercomputing center installs a new system, users are invited to make heavy use of the computer as part of the rigorous testing. In this video, find out what top scientists have discovered using Edison, a Cray XC30 supercomputer, and how NERSC's newest supercomputer will accelerate their future research.

  12. Shaping Social Work Science: What Should Quantitative Researchers Do?

    ERIC Educational Resources Information Center

    Guo, Shenyang

    2015-01-01

    Based on a review of economists' debates on mathematical economics, this article discusses a key issue for shaping the science of social work--research methodology. The article describes three important tasks quantitative researchers need to fulfill in order to enhance the scientific rigor of social work research. First, to test theories using…

  13. Tuskegee Airman Lee Hayes

    ScienceCinema

    Hayes, Lee

    2017-12-22

    Hayes, a resident of Amagansett who worked at Brookhaven Lab as a custodian from 1958 to 1966, served in an all-black bomber squadron at Tuskegee Army Air Field in Alabama. He was among 994 precedent-breaking black soldiers at Tuskegee who passed rigorous tests between 1942 and 1946 to become pilots in the then-segregated armed forces.

  14. 34 CFR 200.56 - Definition of “highly qualified teacher.”

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... areas of the basic elementary school curriculum; or (3) At the public middle and high school levels, demonstrate a high level of competency by— (i) Passing a rigorous State test in each academic subject in which... teacher— (1) Receives high-quality professional development that is sustained, intensive, and classroom...

  15. 34 CFR 200.56 - Definition of “highly qualified teacher.”

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... areas of the basic elementary school curriculum; or (3) At the public middle and high school levels, demonstrate a high level of competency by— (i) Passing a rigorous State test in each academic subject in which... teacher— (1) Receives high-quality professional development that is sustained, intensive, and classroom...

  16. 34 CFR 200.56 - Definition of “highly qualified teacher.”

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... areas of the basic elementary school curriculum; or (3) At the public middle and high school levels, demonstrate a high level of competency by— (i) Passing a rigorous State test in each academic subject in which... teacher— (1) Receives high-quality professional development that is sustained, intensive, and classroom...

  17. 34 CFR 200.56 - Definition of “highly qualified teacher.”

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... areas of the basic elementary school curriculum; or (3) At the public middle and high school levels, demonstrate a high level of competency by— (i) Passing a rigorous State test in each academic subject in which... teacher— (1) Receives high-quality professional development that is sustained, intensive, and classroom...

  18. 34 CFR 200.56 - Definition of “highly qualified teacher.”

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... areas of the basic elementary school curriculum; or (3) At the public middle and high school levels, demonstrate a high level of competency by— (i) Passing a rigorous State test in each academic subject in which... teacher— (1) Receives high-quality professional development that is sustained, intensive, and classroom...

  19. You've Shown the Program Model Is Effective. Now What?

    ERIC Educational Resources Information Center

    Ellickson, Phyllis L.

    2014-01-01

    Rigorous tests of theory-based programs require faithful implementation. Otherwise, lack of results might be attributable to faulty program delivery, faulty theory, or both. However, once the evidence indicates the model works and merits broader dissemination, implementation issues do not fade away. How can developers enhance the likelihood that…

  20. Integrating Pharmacology Topics in High School Biology and Chemistry Classes Improves Performance

    ERIC Educational Resources Information Center

    Schwartz-Bloom, Rochelle D.; Halpin, Myra J.

    2003-01-01

    Although numerous programs have been developed for Grade Kindergarten through 12 science education, evaluation has been difficult owing to the inherent problems conducting controlled experiments in the typical classroom. Using a rigorous experimental design, we developed and tested a novel program containing a series of pharmacology modules (e.g.,…

  1. Hydrophobin genes of the entomopathogenic fungus, Metarhizium brunneum, are differentially expressed and corresponding mutants are decreased in virulence

    USDA-ARS's Scientific Manuscript database

    Hydrophobins are small, cysteine-rich, secreted proteins, ubiquitously produced by filamentous fungi that are speculated to function in fungal growth, cell surface properties, and development, although this has been rigorously tested for only a few species. Herein, we report identification of three ...

  2. Academic Stress in an Achievement Driven Era: Time and School Culture

    ERIC Educational Resources Information Center

    Mrowka, Karyn Anne Kowalski

    2014-01-01

    Whether academic achievement is defined as passing a state-mandated test for graduation or earning "A's" in a rigorous course load and having a resume full of extra-curricular accomplishments, the pressure to achieve is pervading public education, creating a culture of competition and causing academic stress. A culture of competition…

  3. Mathematics Awareness through Technology, Teamwork, Engagement, and Rigor

    ERIC Educational Resources Information Center

    James, Laurie

    2016-01-01

    The purpose of this two-year observational study was to determine if the use of technology and intervention groups affected fourth-grade math scores. Specifically, the desire was to identify the percentage of students who met or exceeded grade-level standards on the state standardized test. This study indicated possible reasons that enhanced…

  4. An Exploratory Investigation of the Factor Structure of the Reynolds Intellectual Assessment Scales (RIAS)

    ERIC Educational Resources Information Center

    Dombrowski, Stefan C.; Watkins, Marley W.; Brogan, Michael J.

    2009-01-01

    This study investigated the factor structure of the Reynolds Intellectual Assessment Scales (RIAS) using rigorous exploratory factor analytic and factor extraction procedures. The results of this study indicate that the RIAS is a single factor test. Despite these results, higher order factor analysis using the Schmid-Leiman procedure indicates…

  5. Rigorous Tests of Student Outcomes in CTE Programs of Study: Final Report

    ERIC Educational Resources Information Center

    Castellano, Marisa; Sundell, Kirsten E.; Overman, Laura T.; Richardson, George B.; Stone, James R., III

    2014-01-01

    This study was designed to investigate the relationship between participation in federally mandated college and career-preparatory programs--known as programs of study (POS)--and high school achievement outcomes. POS are an organized approach to college and career readiness that offer an aligned sequence of courses spanning secondary and…

  6. Interactive visual analysis promotes exploration of long-term ecological data

    Treesearch

    T.N. Pham; J.A. Jones; R. Metoyer; F.J. Swanson; R.J. Pabst

    2013-01-01

    Long-term ecological data are crucial in helping ecologists understand ecosystem function and environmental change. Nevertheless, these kinds of data sets are difficult to analyze because they are usually large, multivariate, and spatiotemporal. Although existing analysis tools such as statistical methods and spreadsheet software permit rigorous tests of pre-conceived...

  7. Edison - A New Cray Supercomputer Advances Discovery at NERSC

    ScienceCinema

    Dosanjh, Sudip; Parkinson, Dula; Yelick, Kathy; Trebotich, David; Broughton, Jeff; Antypas, Katie; Lukic, Zarija; Borrill, Julian; Draney, Brent; Chen, Jackie

    2018-01-16

    When a supercomputing center installs a new system, users are invited to make heavy use of the computer as part of the rigorous testing. In this video, find out what top scientists have discovered using Edison, a Cray XC30 supercomputer, and how NERSC's newest supercomputer will accelerate their future research.

  8. Neurocognitive Functioning in AD/HD, Predominantly Inattentive and Combined Subtypes

    ERIC Educational Resources Information Center

    Solanto, Mary V.; Gilbert, Sharone N.; Raj, Anu; Zhu, John; Pope-Boyd, Sa'brina; Stepak, Brenda; Vail, Lucia; Newcorn, Jeffrey H.

    2007-01-01

    The Predominantly Inattentive (PI) and Combined (CB) subtypes of AD/HD differ in cognitive tempo, age of onset, gender ratio, and comorbidity, yet a differentiating endophenotype has not been identified. The aim of this study was to test rigorously diagnosed PI, CB, and typical children on measures selected for their potential to reveal…

  9. Cultivating Common Ground: Integrating Standards-Based Visual Arts, Math and Literacy in High-Poverty Urban Classrooms

    ERIC Educational Resources Information Center

    Cunnington, Marisol; Kantrowitz, Andrea; Harnett, Susanne; Hill-Ries, Aline

    2014-01-01

    The "Framing Student Success: Connecting Rigorous Visual Arts, Math and Literacy Learning" experimental demonstration project was designed to develop and test an instructional program integrating high-quality, standards-based instruction in the visual arts, math, and literacy. Developed and implemented by arts-in-education organization…

  10. SMAP validation of soil moisture products

    USDA-ARS's Scientific Manuscript database

    The Soil Moisture Active Passive (SMAP) satellite will be launched by the National Aeronautics and Space Administration in October 2014. SMAP will also incorporate a rigorous calibration and validation program that will support algorithm refinement and provide users with information on the accuracy ...

  11. Technological characteristics of pre- and post-rigor deboned beef mixtures from Holstein steers and quality attributes of cooked beef sausage.

    PubMed

    Sukumaran, Anuraj T; Holtcamp, Alexander J; Campbell, Yan L; Burnett, Derris; Schilling, Mark W; Dinh, Thu T N

    2018-06-07

    The objective of this study was to determine the effects of deboning time (pre- and post-rigor), processing steps (grinding - GB; salting - SB; batter formulation - BB), and storage time on the quality of raw beef mixtures and vacuum-packaged cooked sausage, produced using a commercial formulation with 0.25% phosphate. The pH was greater in pre-rigor GB and SB than in post-rigor GB and SB (P < .001). However, deboning time had no effect on metmyoglobin reducing activity, cooking loss, and color of raw beef mixtures. Protein solubility of pre-rigor beef mixtures (124.26 mg/kg) was greater than that of post-rigor beef (113.93 mg/kg; P = .071). TBARS increased in BB but decreased during vacuum storage of cooked sausage (P ≤ .018). Except for chewiness and saltiness being 52.9 N-mm and 0.3 points greater in post-rigor sausage (P = .040 and P = .054, respectively), texture profile analysis and trained panelists detected no difference in texture between pre- and post-rigor sausage. Published by Elsevier Ltd.

  12. Wireless Sensor Applications in Extreme Aeronautical Environments

    NASA Technical Reports Server (NTRS)

    Wilson, William C.; Atkinson, Gary M.

    2013-01-01

    NASA aeronautical programs require rigorous ground and flight testing. Many of the testing environments can be extremely harsh. These environments include cryogenic temperatures and high temperatures (greater than 1500 °C). Temperature, pressure, vibration, ionizing radiation, and chemical exposure may all be part of the harsh environment found in testing. This paper presents a survey of research opportunities for universities and industry to develop new wireless sensors that address anticipated structural health monitoring (SHM) and testing needs for aeronautical vehicles. Potential applications of passive wireless sensors for ground testing and high altitude aircraft operations are presented. Some of the challenges and issues of the technology are also presented.

  13. RIT Stability through the Transition to Common Core-Aligned MAP® Tests. How Using MAP to Measure Student Learning Growth is Reliable Now and in 2014

    ERIC Educational Resources Information Center

    Northwest Evaluation Association, 2013

    2013-01-01

    While many educators expect the Common Core State Standards (CCSS) to be more rigorous than previous state standards, some wonder if the transition to CCSS and to a Common Core aligned MAP test will have an impact on their students' RIT scores or the NWEA norms. MAP assessments use a proprietary scale known as the RIT (Rasch unit) scale to measure…

  14. ChemCam for Mars Science Laboratory rover, undergoing pre-flight testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2011-10-20

    Los Alamos National Laboratory and partners developed a laser instrument, ChemCam, that will ride on the elevated mast of the Mars Science Laboratory rover Curiosity. The system allows Curiosity to "zap" rocks from a distance, reading their chemical composition through spectroscopic analysis. In this video, laboratory shaker-table testing of the instrument ensures that all of its components are solidly attached and resistant to damage from the rigors of launch, travel and landing.

  15. ChemCam for Mars Science Laboratory rover, undergoing pre-flight testing

    ScienceCinema

    None

    2018-06-06

    Los Alamos National Laboratory and partners developed a laser instrument, ChemCam, that will ride on the elevated mast of the Mars Science Laboratory rover Curiosity. The system allows Curiosity to "zap" rocks from a distance, reading their chemical composition through spectroscopic analysis. In this video, laboratory shaker-table testing of the instrument ensures that all of its components are solidly attached and resistant to damage from the rigors of launch, travel and landing.

  16. DSN system performance test Doppler noise models; noncoherent configuration

    NASA Technical Reports Server (NTRS)

    Bunce, R.

    1977-01-01

    The newer model for variance, based on the Allan technique and now adopted for testing, is analyzed in the subject mode. A model is generated (including a considerable contribution from the station secondary frequency standard) and reconciled with existing data. The variance model is definitely sound; the Allan technique reconciles theory and measurement. The mean-frequency model is an estimate; this problem has yet to be rigorously resolved. The unaltered defining expressions are nonconvergent, and the observed mean is quite erratic.
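    As a rough illustration of the Allan technique mentioned above, the two-sample (Allan) variance of a series of fractional-frequency readings can be sketched as follows. This is a generic sketch, not the DSN test software; the function name and interface are illustrative.

```python
import numpy as np

def allan_variance(y, tau0=1.0, m=1):
    """Non-overlapping Allan variance of fractional-frequency samples y.

    y    : 1-D array of fractional-frequency readings, spaced tau0 seconds apart
    m    : averaging factor; the analysis interval is tau = m * tau0
    Returns (tau, sigma2) where sigma2 = 0.5 * <(ybar_{k+1} - ybar_k)^2>,
    the mean-squared first difference of consecutive interval averages.
    """
    y = np.asarray(y, dtype=float)
    n = len(y) // m
    ybar = y[: n * m].reshape(n, m).mean(axis=1)   # average over each interval of length m
    diffs = np.diff(ybar)                          # first differences of the averages
    return m * tau0, 0.5 * np.mean(diffs ** 2)
```

    Unlike the ordinary sample variance, this two-sample statistic converges for the divergent noise types common in frequency standards, which is why it mates theory and measurement where the unaltered defining expressions do not.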

  17. Personalized Telehealth in the Future: A Global Research Agenda

    PubMed Central

    2016-01-01

    As telehealth plays an even greater role in global health care delivery, it will be increasingly important to develop a strong evidence base of successful, innovative telehealth solutions that can lead to scalable and sustainable telehealth programs. This paper has two aims: (1) to describe the challenges of promoting telehealth implementation to advance adoption and (2) to present a global research agenda for personalized telehealth within chronic disease management. Using evidence from the United States and the European Union, this paper provides a global overview of the current state of telehealth services and benefits, presents fundamental principles that must be addressed to advance the status quo, and provides a framework for current and future research initiatives within telehealth for personalized care, treatment, and prevention. A broad, multinational research agenda can provide a uniform framework for identifying and rapidly replicating best practices, while concurrently fostering global collaboration in the development and rigorous testing of new and emerging telehealth technologies. In this paper, the members of the Transatlantic Telehealth Research Network offer a 12-point research agenda for future telehealth applications within chronic disease management. PMID:26932229

  18. Personalized Telehealth in the Future: A Global Research Agenda.

    PubMed

    Dinesen, Birthe; Nonnecke, Brandie; Lindeman, David; Toft, Egon; Kidholm, Kristian; Jethwani, Kamal; Young, Heather M; Spindler, Helle; Oestergaard, Claus Ugilt; Southard, Jeffrey A; Gutierrez, Mario; Anderson, Nick; Albert, Nancy M; Han, Jay J; Nesbitt, Thomas

    2016-03-01

    As telehealth plays an even greater role in global health care delivery, it will be increasingly important to develop a strong evidence base of successful, innovative telehealth solutions that can lead to scalable and sustainable telehealth programs. This paper has two aims: (1) to describe the challenges of promoting telehealth implementation to advance adoption and (2) to present a global research agenda for personalized telehealth within chronic disease management. Using evidence from the United States and the European Union, this paper provides a global overview of the current state of telehealth services and benefits, presents fundamental principles that must be addressed to advance the status quo, and provides a framework for current and future research initiatives within telehealth for personalized care, treatment, and prevention. A broad, multinational research agenda can provide a uniform framework for identifying and rapidly replicating best practices, while concurrently fostering global collaboration in the development and rigorous testing of new and emerging telehealth technologies. In this paper, the members of the Transatlantic Telehealth Research Network offer a 12-point research agenda for future telehealth applications within chronic disease management.

  19. On the Tracy-Widom β Distribution for β=6

    NASA Astrophysics Data System (ADS)

    Grava, Tamara; Its, Alexander; Kapaev, Andrei; Mezzadri, Francesco

    2016-11-01

    We study the Tracy-Widom distribution function for Dyson's β-ensemble with β = 6. The starting point of our analysis is the recent work of I. Rumanov, in which he produces a Lax-pair representation for the Bloemendal-Virág equation, a linear PDE that describes the Tracy-Widom functions for general values of β. Using his Lax pair, Rumanov derives an explicit formula for the Tracy-Widom β=6 function in terms of the second Painlevé transcendent and the solution of an auxiliary ODE, and shows that this formula allows him to derive, formally, the asymptotic expansion of the Tracy-Widom function. Our goal is to make Rumanov's approach, and hence the asymptotic analysis it provides, rigorous. In this paper, the first in a series, we show that Rumanov's Lax pair can be interpreted as a certain gauge transformation of the standard Lax pair for the second Painlevé equation. This gauge transformation, however, contains functional parameters defined via an auxiliary nonlinear ODE that is equivalent to the auxiliary ODE of Rumanov's formula. The gauge interpretation of Rumanov's Lax pair allows us to highlight the steps of Rumanov's original method that need rigorous justification in order to make the method complete, and we provide such a justification for one of these steps: we prove that the Painlevé function involved in Rumanov's formula is indeed, as Rumanov suggested, the Hastings-McLeod solution of the second Painlevé equation. The key issue, which we also discuss and which remains open, is the integrability of the auxiliary ODE in Rumanov's formula; this question is crucial for the rigorous asymptotic analysis of the Tracy-Widom function. We also note that our work partially answers one of the problems related to β-ensembles formulated by Percy Deift during the June 2015 Montreal Conference on integrable systems.
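    For reference, the second Painlevé equation and the boundary condition that singles out the Hastings-McLeod solution mentioned above can be written as:

```latex
q''(x) = 2\,q(x)^3 + x\,q(x),
\qquad
q(x) \sim \operatorname{Ai}(x) \quad \text{as } x \to +\infty,
```

    where Ai is the Airy function. Among all solutions of the second Painlevé equation, this asymptotic behavior determines the solution uniquely; it is this distinguished solution that also enters the classical Tracy-Widom formulas for β = 1, 2, 4.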

  20. Cost analysis of cassava cellulose utilization scenarios for ethanol production on flowsheet simulation platform.

    PubMed

    Zhang, Jian; Fang, Zhenhong; Deng, Hongbo; Zhang, Xiaoxi; Bao, Jie

    2013-04-01

    Cassava cellulose accounts for one quarter of cassava residues, and its utilization is important for improving the efficiency and profit of commercial scale cassava ethanol production. In this study, three scenarios of cassava cellulose utilization for ethanol production were experimentally tested using the same conditions and equipment. Based on the experimental results, a rigorous flowsheet simulation model was established on the Aspen Plus platform, and the cost of cellulase enzyme and steam energy in the three cases was calculated. The results show that the simultaneous co-saccharification of cassava starch/cellulose and ethanol fermentation process (Co-SSF) provided a cost-effective option for cassava cellulose utilization in ethanol production, while the utilization of cassava cellulose from cassava ethanol fermentation residues was not economically sound. Compared with the current fuel ethanol selling price, the Co-SSF process may provide an important option for enhancing cassava ethanol production efficiency and profit at commercial scale. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Approaching the third decade of paediatric palliative oncology investigation: historical progress and future directions

    PubMed Central

    Rosenberg, Abby R; Wolfe, Joanne

    2017-01-01

    Paediatric palliative care (PPC) endeavours to alleviate the suffering and improve the quality of life of children with serious illnesses and their families. In the past two decades since WHO defined PPC and called for its inclusion in paediatric oncology care, rigorous investigation has provided important insights. For example, the first decade of research focused on end-of-life experiences of the child and the family, underscoring the high prevalence of symptom burden, the barriers to parent–provider concordance with regard to prognosis, as well as the need for bereavement supports. The second decade expanded PPC oncology investigation to include the entire cancer continuum and the voices of patients. Other studies identified the need for support of parents, siblings, and racial and ethnic minority groups. Promising interventions designed to improve outcomes were tested in randomised clinical trials. Future research will build on these findings and pose novel questions about how to continue to reduce the burdens of paediatric cancer. PMID:29333484

  2. Latino Immigrants, Acculturation, and Health: Promising New Directions in Research.

    PubMed

    Abraído-Lanza, Ana F; Echeverría, Sandra E; Flórez, Karen R

    2016-01-01

    This article provides an analysis of novel topics emerging in recent years in research on Latino immigrants, acculturation, and health. In the past ten years, the number of studies assessing new ways to conceptualize and understand how acculturation-related processes may influence health has grown. These new frameworks draw from integrative approaches testing new ground to acknowledge the fundamental role of context and policy. We classify the emerging body of evidence according to themes that we identify as promising directions (intrapersonal, interpersonal, social environmental, community, political, and global contexts; cross-cutting themes in life course and developmental approaches; and segmented assimilation) and discuss the challenges and opportunities each theme presents. This body of work, which considers acculturation in context, points to the emergence of a new wave of research that holds great promise in driving forward the study of Latino immigrants, acculturation, and health. We provide suggestions to further advance the ideologic and methodologic rigor of this new wave.

  3. Self-Reported Pediatric Measures of Physical Activity, Sedentary Behavior and Strength Impact for PROMIS®: Conceptual Framework

    PubMed Central

    Tucker, Carole A.; Bevans, Katherine B.; Teneralli, Rachel E.; Smith, Ashley Wilder; Bowles, Heather R; Forrest, Christopher B.

    2014-01-01

    Purpose Children's physical activity (PA) levels are commonly assessed in pediatric clinical research, but rigorous self-report assessment tools for children are scarce, and computer adaptive test implementations are rare. Our objective was to improve pediatric self-report measures of activity using semi-structured interviews with experts and children for conceptualization of a child-informed framework. Methods Semi-structured interviews were conducted to conceptualize physical activity, sedentary behaviors, and strengthening activities. We performed systematic literature reviews to identify item-level concepts used to assess these 3 domains. Results We developed conceptual frameworks for each domain using words and phrases identified by children as relevant. Conclusions Semi-structured interview methods provide valuable information on children's perspectives and the ways children recall previous activities. The conceptualized domains of physical activity are based on the literature and expert views while also reflecting children's experiences and understanding, providing a basis for pediatric self-report instruments. PMID:25251789

  4. Molecular and Cellular Biophysics

    NASA Astrophysics Data System (ADS)

    Jackson, Meyer B.

    2006-01-01

    Molecular and Cellular Biophysics provides advanced undergraduate and graduate students with a foundation in the basic concepts of biophysics. Students who have taken physical chemistry and calculus courses will find this book an accessible and valuable aid in learning how these concepts can be used in biological research. The text provides a rigorous treatment of the fundamental theories in biophysics and illustrates their application with examples. Conformational transitions of proteins are studied first using thermodynamics, and subsequently with kinetics. Allosteric theory is developed as the synthesis of conformational transitions and association reactions. Basic ideas of thermodynamics and kinetics are applied to topics such as protein folding, enzyme catalysis and ion channel permeation. These concepts are then used as the building blocks in a treatment of membrane excitability. Through these examples, students will gain an understanding of the general importance and broad applicability of biophysical principles to biological problems. The book offers a unique synthesis of concepts across a wide range of biophysical topics and provides a rigorous theoretical treatment alongside applications in biological systems. The author has been teaching biophysics for nearly 25 years.

  5. The CUAHSI Water Data Center: Empowering scientists to discover, use, store, and share water data

    NASA Astrophysics Data System (ADS)

    Couch, A. L.; Hooper, R. P.; Arrigo, J. S.

    2012-12-01

    The proposed CUAHSI Water Data Center (WDC) will provide production-quality water data resources based upon the successful large-scale data services prototype developed by the CUAHSI Hydrologic Information System (HIS) project. The WDC, using the HIS technology, concentrates on providing time series data collected at fixed points or on moving platforms from sensors primarily (but not exclusively) in the medium of water. The WDC's mission includes providing simple and effective data discovery tools useful to researchers in a variety of water-related disciplines, and providing simple and cost-effective data publication mechanisms for projects that do not desire to run their own data servers. The WDC's activities will include: 1. Rigorous curation of the water data catalog already assembled during the CUAHSI HIS project, to ensure accuracy of records and existence of declared sources. 2. Data backup and failover services for "at risk" data sources. 3. Creation and support for ubiquitously accessible data discovery and access, web-based search and smartphone applications. 4. Partnerships with researchers to extend the state of the art in water data use. 5. Partnerships with industry to create plug-and-play data publishing from sensors, and to create domain-specific tools. The WDC will serve as a knowledge resource for researchers of water-related issues, and will interface with other data centers to make their data more accessible to water researchers. The WDC will serve as a vehicle for addressing some of the grand challenges of accessing and using water data, including: a. Cross-domain data discovery: different scientific domains refer to the same kind of water data using different terminologies, making discovery of data difficult for researchers outside the data provider's domain. b. Cross-validation of data sources: much water data comes from sources lacking rigorous quality control procedures; such sources can be compared against others with rigorous quality control. The WDC enables this by making both kinds of sources available in the same search interface. c. Data provenance: the appropriateness of data for use in a specific model or analysis often depends upon the exact details of how data was gathered and processed. The WDC will aid this by curating standards for metadata that are as descriptive as practical of the collection procedures. "Plug and play" sensor interfaces will fill in metadata appropriate to each sensor without human intervention. d. Contextual search: discovering data based upon geological (e.g. aquifer) or geographic (e.g., location in a stream network) features external to metadata. e. Data-driven search: discovering data that exhibit quality factors that are not described by the metadata. The WDC will partner with researchers desiring contextual and data-driven search, and make results available to all. Many major data providers (e.g. federal agencies) are not mandated to provide access to data other than those they collect. The HIS project assembled data from over 90 different sources, thus demonstrating the promise of this approach. Meeting the grand challenges listed above will greatly enhance scientists' ability to discover, interpret, access, and analyze water data from across domains and sources to test Earth system hypotheses.

  6. Discrete structures in continuum descriptions of defective crystals

    PubMed Central

    2016-01-01

    I discuss various mathematical constructions that combine together to provide a natural setting for discrete and continuum geometric models of defective crystals. In particular, I provide a quite general list of ‘plastic strain variables’, which quantifies inelastic behaviour, and exhibit rigorous connections between discrete and continuous mathematical structures associated with crystalline materials that have a correspondingly general constitutive specification. PMID:27002070

  7. Better Service through Data: Wai Sze (Lacey) Chan--Queens Borough Public Library, Jamaica, NY

    ERIC Educational Resources Information Center

    Library Journal, 2004

    2004-01-01

    The New Americans Program at Queens Borough Public Library (QBPL) is well known for the innovative collections and programs it provides to one of the nation's most diverse communities. What is less known is the rigorous analysis of demographic data that provides direction to the program. Wai Sze (Lacey) Chan uses demographics to create as complete…

  8. Circular instead of hierarchical: methodological principles for the evaluation of complex interventions

    PubMed Central

    Walach, Harald; Falkenberg, Torkel; Fønnebø, Vinjar; Lewith, George; Jonas, Wayne B

    2006-01-01

    Background The reasoning behind evaluating medical interventions is that a hierarchy of methods exists which successively produces improved, and therefore more rigorous, evidence-based medicine upon which to make clinical decisions. At the foundation of this hierarchy are case studies, retrospective and prospective case series, followed by cohort studies with historical and concomitant non-randomized controls. Open-label randomized controlled studies (RCTs), and finally blinded, placebo-controlled RCTs, which offer the most internal validity, are considered the most reliable evidence. Rigorous RCTs remove bias. Evidence from RCTs forms the basis of meta-analyses and systematic reviews. This hierarchy, founded on a pharmacological model of therapy, is generalized to other interventions which may be complex and non-pharmacological (healing, acupuncture and surgery). Discussion The hierarchical model is valid for limited questions of efficacy, for instance for regulatory purposes and newly devised products and pharmacological preparations. It is inadequate for the evaluation of complex interventions such as physiotherapy, surgery and complementary and alternative medicine (CAM). This has to do with the essential tension between internal validity (rigor and the removal of bias) and external validity (generalizability). Summary Instead of an Evidence Hierarchy, we propose a Circular Model. This would imply a multiplicity of methods, using different designs, counterbalancing their individual strengths and weaknesses to arrive at pragmatic but equally rigorous evidence which would provide significant assistance in clinical and health systems innovation. Such evidence would better inform national health care technology assessment agencies and promote evidence-based health reform. PMID:16796762

  9. Design and implementation of a controlled clinical trial to evaluate the effectiveness and efficiency of routine opt-out rapid human immunodeficiency virus screening in the emergency department.

    PubMed

    Haukoos, Jason S; Hopkins, Emily; Byyny, Richard L; Conroy, Amy A; Silverman, Morgan; Eisert, Sheri; Thrun, Mark; Wilson, Michael; Boyett, Brian; Heffelfinger, James D

    2009-08-01

    In 2006, the Centers for Disease Control and Prevention (CDC) released revised recommendations for performing human immunodeficiency virus (HIV) testing in health care settings, including implementing routine rapid HIV screening, the use of an integrated opt-out consent, and limited prevention counseling. Emergency departments (EDs) have been a primary focus of these efforts. These revised CDC recommendations were primarily based on feasibility studies and have not been evaluated through the application of rigorous research methods. This article describes the design and implementation of a large prospective controlled clinical trial to evaluate the CDC's recommendations in an ED setting. From April 15, 2007, through April 15, 2009, a prospective quasi-experimental equivalent time-samples clinical trial was performed to compare the clinical effectiveness and efficiency of routine (nontargeted) opt-out rapid HIV screening (intervention) to physician-directed diagnostic rapid HIV testing (control) in a high-volume urban ED. In addition, three nested observational studies were performed to evaluate the cost-effectiveness and patient and staff acceptance of the two rapid HIV testing methods. This article describes the rationale, methodologies, and study design features of this program evaluation clinical trial. It also provides details regarding the integration of the principal clinical trial and its nested observational studies. Such ED-based trials are rare, but serve to provide valid comparisons between testing approaches. Investigators should consider similar methodology when performing future ED-based health services research.

  10. On testing for spatial correspondence between maps of human brain structure and function.

    PubMed

    Alexander-Bloch, Aaron F; Shou, Haochang; Liu, Siyuan; Satterthwaite, Theodore D; Glahn, David C; Shinohara, Russell T; Vandekar, Simon N; Raznahan, Armin

    2018-06-01

    A critical issue in many neuroimaging studies is the comparison between brain maps. Nonetheless, it remains unclear how one should test hypotheses focused on the overlap or spatial correspondence between two or more brain maps. This "correspondence problem" affects, for example, the interpretation of comparisons between task-based patterns of functional activation, resting-state networks or modules, and neuroanatomical landmarks. To date, this problem has been addressed with remarkable variability in terms of methodological approaches and statistical rigor. In this paper, we address the correspondence problem using a spatial permutation framework to generate null models of overlap by applying random rotations to spherical representations of the cortical surface, an approach for which we also provide a theoretical statistical foundation. We use this method to derive clusters of cognitive functions that are correlated in terms of their functional neuroanatomical substrates. In addition, using publicly available data, we formally demonstrate the correspondence between maps of task-based functional activity, resting-state fMRI networks and gyral-based anatomical landmarks. We provide open-access code to implement the methods presented for two commonly-used tools for surface-based cortical analysis (https://www.github.com/spin-test). This spatial permutation approach constitutes a useful advance over widely-used methods for the comparison of cortical maps, thereby opening new possibilities for the integration of diverse neuroimaging data. Copyright © 2018 Elsevier Inc. All rights reserved.
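    The spatial permutation ("spin test") framework described above can be sketched as follows. This is a simplified illustration, not the authors' released implementation: it assumes per-vertex scalar maps on a spherical surface mesh, draws a uniformly random rotation, and reassigns each vertex the value of its nearest original neighbor, which preserves the maps' spatial autocorrelation in the null distribution. All function names are hypothetical.

```python
import numpy as np

def random_rotation(rng):
    # Draw a uniformly random 3x3 rotation matrix via QR decomposition.
    A = rng.standard_normal((3, 3))
    Q, R = np.linalg.qr(A)
    Q = Q * np.sign(np.diag(R))      # fix column signs so Q is Haar-uniform over O(3)
    if np.linalg.det(Q) < 0:         # force a proper rotation (det = +1)
        Q[:, 0] *= -1
    return Q

def spin_test(coords, map_a, map_b, n_perm=1000, seed=0):
    """Permutation p-value for the spatial correspondence of two surface maps.

    coords       : (n, 3) unit vectors, vertex positions on a spherical surface
    map_a, map_b : per-vertex scalar maps of length n
    Each permutation rotates the sphere and rebuilds map_a by nearest-neighbor
    lookup, yielding a null distribution of correlations under random alignment.
    """
    rng = np.random.default_rng(seed)
    observed = np.corrcoef(map_a, map_b)[0, 1]
    null = np.empty(n_perm)
    for i in range(n_perm):
        rotated = coords @ random_rotation(rng).T
        # nearest original vertex = largest dot product on the unit sphere
        nearest = np.argmax(rotated @ coords.T, axis=1)
        null[i] = np.corrcoef(map_a[nearest], map_b)[0, 1]
    # two-sided p-value with the standard +1 permutation correction
    return observed, (np.sum(np.abs(null) >= abs(observed)) + 1) / (n_perm + 1)
```

    Because entire maps are rotated rigidly rather than shuffled vertex-by-vertex, the null correlations remain inflated by whatever smoothness the maps share, which is exactly the property that naive permutation tests fail to respect.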

  11. Enhancing the Ability of Experimental Autoimmune Encephalomyelitis to Serve as a More Rigorous Model of Multiple Sclerosis through Refinement of the Experimental Design

    PubMed Central

    Emerson, Mitchell R; Gallagher, Ryan J; Marquis, Janet G; LeVine, Steven M

    2009-01-01

    Advancing the understanding of the mechanisms involved in the pathogenesis of multiple sclerosis (MS) likely will lead to new and better therapeutics. Although important information about the disease process has been obtained from research on pathologic specimens, peripheral blood lymphocytes and MRI studies, the elucidation of detailed mechanisms has progressed largely through investigations using animal models of MS. In addition, animal models serve as an important tool for the testing of putative interventions. The most commonly studied model of MS is experimental autoimmune encephalomyelitis (EAE). This model can be induced in a variety of species and by various means, but there has been concern that the model may not accurately reflect the disease process, and more importantly, it may give rise to erroneous findings when it is used to test possible therapeutics. Several reasons have been given to explain the shortcomings of this model as a useful testing platform, but one idea provides a framework for improving the value of this model, and thus, it deserves careful consideration. In particular, the idea asserts that EAE studies are inadequately designed to enable appropriate evaluation of putative therapeutics. Here we discuss problem areas within EAE study designs and provide suggestions for their improvement. This paper is principally directed at investigators new to the field of EAE, although experienced investigators may find useful suggestions herein. PMID:19389303

  12. A ten-year review of the literature on the use of standardized patients in teaching and learning: 1996-2005.

    PubMed

    May, Win; Park, Joo Hyun; Lee, Justin P

    2009-06-01

    Although there is a growing body of literature on the educational use of standardized patients (SPs) in teaching and learning, there have been no reviews of their value. The aim of this review was to determine whether the educational use of SPs has an effect on the knowledge, skills, and behaviour of learners in the health professions. English-language articles covering the period 1996-2005 were reviewed to address the extent to which the use of SPs has affected the knowledge, skills and performance of learners. Out of 797 abstracts, 69 articles that met the review criteria were selected, and an adaptation of Kirkpatrick's model was used to classify and analyse them. Most of the learners were students in medicine and nursing. SPs were used mostly to teach communication skills and clinical skills. The study designs were case-control (29%), pre-test/post-test (24.6%), post-test only (26.1%) and qualitative studies (20.3%). Most of the studies had weak research designs; more rigorous designs with control or comparison groups should be used in future research. Most studies reported that the educational use of SPs was valuable, and more rigorous studies would support the evidence-based use of SPs in teaching and learning.

  13. Linking Research to Practice: FEWS NET and Its Use of Satellite Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Brown, Molly E.; Brickley, Elizabeth B.

    2011-01-01

    The purpose of the Famine Early Warning Systems Network (FEWS NET) is to collaborate with international, regional and national partners to provide timely and rigorous early warning and vulnerability information on emerging and evolving food security issues.

  14. Separation Kernel Protection Profile Revisited: Choices and Rationale

    DTIC Science & Technology

    2010-12-01

    …"provide the most stringent protection and rigorous security countermeasures" [IATF]. In other words, robustness is not the same as assurance. [IATF] Information Assurance Technical Framework, Chapter 4, Release 3.1, National Security Agency, September 2002.

  15. Re-Establishing Broca's Initial Findings

    ERIC Educational Resources Information Center

    Richardson, Jessica D.; Fillmore, Paul; Rorden, Chris; LaPointe, Leonard L.; Fridriksson, Julius

    2012-01-01

    The importance of the left inferior pre-frontal cortex (LIPC) for speech production was first popularized by Paul Broca, providing a cornerstone of behavioral neurology and laying the foundation for future research examining brain-behavior relationships. Although Broca's findings were rigorously challenged, comprehensive contradictory evidence was…

  16. CTE's Role in Urban Education. Issue Brief

    ERIC Educational Resources Information Center

    Association for Career and Technical Education (ACTE), 2012

    2012-01-01

    This Issue Brief explores the promising role that career and technical education programs play in addressing key student achievement issues facing urban schools. CTE programs engage urban students by providing rigorous and relevant coursework, fostering positive relationships, establishing clear pathways and connecting education and…

  17. The Rigors of Aligning Performance

    DTIC Science & Technology

    2015-06-01

    The organization merged its field activities into regional facilities engineering commands (FECs). Today, FECs provide one-stop shopping for NAVFAC clients… Many are old and antiquated; sometimes the systems mesh together, other times they do not. Training is lacking on the various systems.

  18. Experimental evaluation of rigor mortis. III. Comparative study of the evolution of rigor mortis in different sized muscle groups in rats.

    PubMed

    Krompecher, T; Fryc, O

    1978-01-01

    The use of new methods and an appropriate apparatus has allowed us to make successive measurements of rigor mortis and a study of its evolution in the rat. By a comparative examination on the front and hind limbs, we have determined the following: (1) The muscular mass of the hind limbs is 2.89 times greater than that of the front limbs. (2) In the initial phase rigor mortis is more pronounced in the front limbs. (3) The front and hind limbs reach maximum rigor mortis at the same time and this state is maintained for 2 hours. (4) Resolution of rigor mortis is accelerated in the front limbs during the initial phase, but both front and hind limbs reach complete resolution at the same time.

  19. Onset of rigor mortis is earlier in red muscle than in white muscle.

    PubMed

    Kobayashi, M; Takatori, T; Nakajima, M; Sakurada, K; Hatanaka, K; Ikegaya, H; Matsuda, Y; Iwase, H

    2000-01-01

    Rigor mortis is thought to be related to falling ATP levels in muscles postmortem. We measured rigor mortis as tension determined isometrically in three rat leg muscles in liquid paraffin kept at 37 degrees C or 25 degrees C: two red muscles, red gastrocnemius (RG) and soleus (SO), and one white muscle, white gastrocnemius (WG). Onset, half and full rigor mortis occurred earlier in RG and SO than in WG both at 37 degrees C and at 25 degrees C even though RG and WG were portions of the same muscle. This suggests that rigor mortis directly reflects the postmortem intramuscular ATP level, which decreases more rapidly in red muscle than in white muscle after death. Rigor mortis was more retarded at 25 degrees C than at 37 degrees C in each type of muscle.

  20. Clinical decision making-a functional medicine perspective.

    PubMed

    Pizzorno, Joseph E

    2012-09-01

    As 21st century health care moves from a disease-based approach to a more patient-centric system that can address biochemical individuality to improve health and function, clinical decision making becomes more complex. Accentuating the problem is the lack of a clear standard for this more complex functional medicine approach. While there is relatively broad agreement in Western medicine for what constitutes competent assessment of disease and identification of related treatment approaches, the complex functional medicine model posits multiple and individualized diagnostic and therapeutic approaches, most or many of which have reasonable underlying science and principles, but which have not been rigorously tested in a research or clinical setting. This has led to non-rigorous thinking and sometimes to uncritical acceptance of both poorly documented diagnostic procedures and ineffective therapies, resulting in less than optimal clinical care.

  1. Clinical Decision Making—A Functional Medicine Perspective

    PubMed Central

    2012-01-01

    As 21st century health care moves from a disease-based approach to a more patient-centric system that can address biochemical individuality to improve health and function, clinical decision making becomes more complex. Accentuating the problem is the lack of a clear standard for this more complex functional medicine approach. While there is relatively broad agreement in Western medicine for what constitutes competent assessment of disease and identification of related treatment approaches, the complex functional medicine model posits multiple and individualized diagnostic and therapeutic approaches, most or many of which have reasonable underlying science and principles, but which have not been rigorously tested in a research or clinical setting. This has led to non-rigorous thinking and sometimes to uncritical acceptance of both poorly documented diagnostic procedures and ineffective therapies, resulting in less than optimal clinical care. PMID:24278827

  2. Principles to Products: Toward Realizing MOS 2.0

    NASA Technical Reports Server (NTRS)

    Bindschadler, Duane L.; Delp, Christopher L.

    2012-01-01

    This is a report on the Operations Revitalization Initiative, part of the ongoing NASA-funded Advanced Multi-Mission Operations Systems (AMMOS) program. We are implementing products that significantly improve efficiency and effectiveness of Mission Operations Systems (MOS) for deep-space missions. We take a multi-mission approach, in keeping with our organization's charter to "provide multi-mission tools and services that enable mission customers to operate at a lower total cost to NASA." Focusing first on architectural fundamentals of the MOS, we review the effort's progress. In particular, we note the use of stakeholder interactions and consideration of past lessons learned to motivate a set of Principles that guide the evolution of the AMMOS. Thus guided, we have created essential patterns and connections (detailed in companion papers) that are explicitly modeled and support elaboration at multiple levels of detail (system, sub-system, element...) throughout a MOS. This architecture is realized in design and implementation products that provide lifecycle support to a Mission at the system and subsystem level. The products include adaptable multi-mission engineering documentation that describes essentials such as operational concepts and scenarios, requirements, interfaces and agreements, information models, and mission operations processes. Because we have adopted a model-based system engineering method, these documents and their contents are meaningfully related to one another and to the system model. This means they are both more rigorous and reusable (from mission to mission) than standard system engineering products. The use of models also enables detailed, early (e.g., formulation phase) insight into the impact of changes (e.g., to interfaces or to software) that is rigorous and complete, allowing better decisions on cost or technical trades. Finally, our work provides clear and rigorous specification of operations needs to software developers, further enabling significant gains in productivity.

  3. Hypothesis testing of scientific Monte Carlo calculations.

    PubMed

    Wallerberger, Markus; Gull, Emanuel

    2017-11-01

    The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.
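    To make the idea concrete, here is a minimal sketch (not the authors' test suite) of a statistical hypothesis test applied to a Monte Carlo estimator. It estimates pi by rejection sampling and reports a z-score against the known value; a biased or buggy sampler would show up as a large |z|:

```python
import math
import random

def pi_zscore(n_samples=100_000, seed=0):
    """Estimate pi by sampling the unit square and return the z-score
    of the estimate against the known value of pi."""
    rng = random.Random(seed)
    # Count points falling inside the quarter circle of radius 1.
    hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
               for _ in range(n_samples))
    p_hat = hits / n_samples
    pi_hat = 4.0 * p_hat
    # Standard error of the Bernoulli proportion, scaled by 4.
    se = 4.0 * math.sqrt(p_hat * (1.0 - p_hat) / n_samples)
    return (pi_hat - math.pi) / se
    # Large |z| (say, beyond ~3) flags a numerical problem or bug.
```

The same pattern generalizes: replace the pi estimator with any stochastic simulation whose expectation is known (or computed independently), and fail the test when the z-score exceeds a chosen threshold.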

  4. Hypothesis testing of scientific Monte Carlo calculations

    NASA Astrophysics Data System (ADS)

    Wallerberger, Markus; Gull, Emanuel

    2017-11-01

    The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.

  5. Space radiator simulation system analysis

    NASA Technical Reports Server (NTRS)

    Black, W. Z.; Wulff, W.

    1972-01-01

    A transient heat transfer analysis was carried out on a space radiator heat rejection system exposed to an arbitrarily prescribed combination of aerodynamic heating, solar, albedo, and planetary radiation. A rigorous analysis was carried out for the radiation panel and tubes lying in one plane and an approximate analysis was used to extend the rigorous analysis to the case of a curved panel. The analysis permits the consideration of both gaseous and liquid coolant fluids, including liquid metals, under prescribed, time dependent inlet conditions. The analysis provided a method for predicting: (1) transient and steady-state, two dimensional temperature profiles, (2) local and total heat rejection rates, (3) coolant flow pressure in the flow channel, and (4) total system weight and protection layer thickness.

  6. Origin of the spike-timing-dependent plasticity rule

    NASA Astrophysics Data System (ADS)

    Cho, Myoung Won; Choi, M. Y.

    2016-08-01

    A biological synapse changes its efficacy depending on the difference between pre- and post-synaptic spike timings. Formulating spike-timing-dependent interactions in terms of the path integral, we establish a neural-network model, which makes it possible to predict relevant quantities rigorously by means of standard methods in statistical mechanics and field theory. In particular, the biological synaptic plasticity rule is shown to emerge as the optimal form for minimizing the free energy. It is further revealed that maximization of the entropy of neural activities gives rise to the competitive behavior of biological learning. This demonstrates that statistical mechanics helps to understand rigorously key characteristic behaviors of a neural network, thus providing the possibility of physics serving as a useful and relevant framework for probing life.

  7. Kinetics versus thermodynamics in materials modeling: The case of the di-vacancy in iron

    NASA Astrophysics Data System (ADS)

    Djurabekova, F.; Malerba, L.; Pasianot, R. C.; Olsson, P.; Nordlund, K.

    2010-07-01

    Monte Carlo models are widely used for the study of microstructural and microchemical evolution of materials under irradiation. However, they often link explicitly the relevant activation energies to the energy difference between local equilibrium states. We provide a simple example (di-vacancy migration in iron) in which a rigorous activation energy calculation, by means of both empirical interatomic potentials and density functional theory methods, clearly shows that such a link is not guaranteed, revealing a migration mechanism that a thermodynamics-linked activation energy model cannot predict. Such a mechanism is, however, fully consistent with thermodynamics. This example emphasizes the importance of basing Monte Carlo methods on models where the activation energies are rigorously calculated, rather than deduced from widespread heuristic equations.

  8. Properties of Coulomb crystals: rigorous results.

    PubMed

    Cioslowski, Jerzy

    2008-04-28

    Rigorous equalities and bounds for several properties of Coulomb crystals are presented. The energy e(N) per particle pair is shown to be a nondecreasing function of the particle number N for all clusters described by double-power-law pairwise-additive potentials epsilon(r) that are unbound at both r-->0 and r-->infinity. A lower bound for the ratio of the mean reciprocal crystal radius and e(N) is derived. The leading term in the asymptotic expression for the shell capacity that appears in the recently introduced approximate model of Coulomb crystals is obtained, providing in turn explicit large-N asymptotics for e(N) and the mean crystal radius. In addition, properties of the harmonic vibrational spectra are investigated, producing an upper bound for the zero-point energy.

  9. Measurements of the degree of development of rigor mortis as an indicator of stress in slaughtered pigs.

    PubMed

    Warriss, P D; Brown, S N; Knowles, T G

    2003-12-13

    The degree of development of rigor mortis in the carcases of slaughter pigs was assessed subjectively on a three-point scale 35 minutes after they were exsanguinated, and related to the levels of cortisol, lactate and creatine kinase in blood collected at exsanguination. Earlier rigor development was associated with higher concentrations of these stress indicators in the blood. This relationship suggests that the mean rigor score, and the frequency distribution of carcases that had or had not entered rigor, could be used as an index of the degree of stress to which the pigs had been subjected.

  10. Qualitative Methods in Mental Health Services Research

    PubMed Central

    Palinkas, Lawrence A.

    2014-01-01

    Qualitative and mixed methods play a prominent role in mental health services research. However, the standards for their use are not always evident, especially for those not trained in such methods. This paper reviews the rationale and common approaches to using qualitative and mixed methods in mental health services and implementation research based on a review of the papers included in this special series along with representative examples from the literature. Qualitative methods are used to provide a “thick description” or depth of understanding to complement breadth of understanding afforded by quantitative methods, elicit the perspective of those being studied, explore issues that have not been well studied, develop conceptual theories or test hypotheses, or evaluate the process of a phenomenon or intervention. Qualitative methods adhere to many of the same principles of scientific rigor as quantitative methods, but often differ with respect to study design, data collection and data analysis strategies. For instance, participants for qualitative studies are usually sampled purposefully rather than at random and the design usually reflects an iterative process alternating between data collection and analysis. The most common techniques for data collection are individual semi-structured interviews, focus groups, document reviews, and participant observation. Strategies for analysis are usually inductive, based on principles of grounded theory or phenomenology. Qualitative methods are also used in combination with quantitative methods in mixed method designs for convergence, complementarity, expansion, development, and sampling. Rigorously applied qualitative methods offer great potential in contributing to the scientific foundation of mental health services research. PMID:25350675

  11. Qualitative and mixed methods in mental health services and implementation research.

    PubMed

    Palinkas, Lawrence A

    2014-01-01

    Qualitative and mixed methods play a prominent role in mental health services research. However, the standards for their use are not always evident, especially for those not trained in such methods. This article reviews the rationale and common approaches to using qualitative and mixed methods in mental health services and implementation research based on a review of the articles included in this special series along with representative examples from the literature. Qualitative methods are used to provide a "thick description" or depth of understanding to complement breadth of understanding afforded by quantitative methods, elicit the perspective of those being studied, explore issues that have not been well studied, develop conceptual theories or test hypotheses, or evaluate the process of a phenomenon or intervention. Qualitative methods adhere to many of the same principles of scientific rigor as quantitative methods but often differ with respect to study design, data collection, and data analysis strategies. For instance, participants for qualitative studies are usually sampled purposefully rather than at random and the design usually reflects an iterative process alternating between data collection and analysis. The most common techniques for data collection are individual semistructured interviews, focus groups, document reviews, and participant observation. Strategies for analysis are usually inductive, based on principles of grounded theory or phenomenology. Qualitative methods are also used in combination with quantitative methods in mixed-method designs for convergence, complementarity, expansion, development, and sampling. Rigorously applied qualitative methods offer great potential in contributing to the scientific foundation of mental health services research.

  12. Statistical Analysis of the Processes Controlling Choline and Ethanolamine Glycerophospholipid Molecular Species Composition

    PubMed Central

    Kiebish, Michael A.; Yang, Kui; Han, Xianlin; Gross, Richard W.; Chuang, Jeffrey

    2012-01-01

    The regulation and maintenance of the cellular lipidome through biosynthetic, remodeling, and catabolic mechanisms are critical for biological homeostasis during development, health and disease. These complex mechanisms control the architectures of lipid molecular species, which have diverse yet highly regulated fatty acid chains at both the sn1 and sn2 positions. Phosphatidylcholine (PC) and phosphatidylethanolamine (PE) serve as the predominant biophysical scaffolds in membranes, acting as reservoirs for potent lipid signals and regulating numerous enzymatic processes. Here we report the first rigorous computational dissection of the mechanisms influencing PC and PE molecular architectures from high-throughput shotgun lipidomic data. Using novel statistical approaches, we have analyzed multidimensional mass spectrometry-based shotgun lipidomic data from developmental mouse heart and mature mouse heart, lung, brain, and liver tissues. We show that in PC and PE, sn1 and sn2 positions are largely independent, though for low abundance species regulatory processes may interact with both the sn1 and sn2 chain simultaneously, leading to cooperative effects. Chains with similar biochemical properties appear to be remodeled similarly. We also see that sn2 positions are more regulated than sn1, and that PC exhibits stronger cooperative effects than PE. A key aspect of our work is a novel statistically rigorous approach to determine cooperativity based on a modified Fisher's exact test using Markov Chain Monte Carlo sampling. This computational approach provides a novel tool for developing mechanistic insight into lipidomic regulation. PMID:22662143
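    As a hedged illustration of the statistical machinery described above (hypothetical chain names, not the authors' lipidomic pipeline or their exact modified Fisher's test), independence of the sn1 and sn2 positions can be probed with a Monte Carlo permutation test on observed (sn1, sn2) pairs:

```python
import random
from collections import Counter

def chi2_stat(pairs):
    """Chi-square statistic of a list of (sn1, sn2) pairs against the
    independence model built from the observed marginals."""
    n = len(pairs)
    joint = Counter(pairs)
    s1 = Counter(a for a, _ in pairs)
    s2 = Counter(b for _, b in pairs)
    stat = 0.0
    for a in s1:
        for b in s2:
            expected = s1[a] * s2[b] / n
            observed = joint.get((a, b), 0)
            stat += (observed - expected) ** 2 / expected
    return stat

def permutation_independence_test(pairs, n_perm=2000, seed=42):
    """Approximate p-value for the null hypothesis that the sn1 and sn2
    chains are assigned independently, via repeated shuffling of sn2."""
    rng = random.Random(seed)
    observed = chi2_stat(pairs)
    sn2 = [b for _, b in pairs]
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(sn2)  # destroys any sn1-sn2 association
        if chi2_stat([(a, b) for (a, _), b in zip(pairs, sn2)]) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction keeps p > 0
```

A small p-value indicates cooperative (non-independent) regulation of the two positions; abundance weighting and the Markov Chain sampling used in the paper would refine this basic scheme.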

  13. The rigorous bound on the transmission probability for massless scalar field of non-negative-angular-momentum mode emitted from a Myers-Perry black hole

    NASA Astrophysics Data System (ADS)

    Ngampitipan, Tritos; Boonserm, Petarpa; Chatrabhuti, Auttakit; Visser, Matt

    2016-06-01

    Hawking radiation is evidence for the existence of black holes. What an observer can measure through Hawking radiation is the transmission probability. In the laboratory, miniature black holes can successfully be generated; the generated black holes are, most commonly, Myers-Perry black holes. In this paper, we derive rigorous bounds on the transmission probabilities for massless scalar fields of non-negative-angular-momentum modes emitted from a generated Myers-Perry black hole in six, seven, and eight dimensions. The results show that for low energy, the rigorous bounds increase with the energy of the emitted particles, whereas for high energy they decrease with increasing energy. When the black holes spin faster, the rigorous bounds decrease. For dimension dependence, the rigorous bounds also decrease with the number of extra dimensions. Furthermore, in comparison with the approximate transmission probability, the rigorous bound proves to be useful.
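    For context, bounds of this kind typically build on the general Boonserm-Visser lower bound on one-dimensional transmission probabilities. A commonly quoted form (stated here from memory, so treat the exact normalization and conditions as an assumption rather than the authors' precise starting point) is:

```latex
% Rigorous lower bound on the transmission probability T(\omega) for a
% potential V(x), with h(x) > 0 a smooth trial function satisfying
% h(\pm\infty) = \omega:
T(\omega) \;\ge\; \operatorname{sech}^{2}\!\left(
  \int_{-\infty}^{\infty}
  \frac{\sqrt{\,[h'(x)]^{2} + \left[\omega^{2} - V(x) - h^{2}(x)\right]^{2}}}
       {2\,h(x)}\, dx \right)
```

Choosing different trial functions h(x) tightens or loosens the bound, which is how results for specific black-hole potentials are obtained.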

  14. The rigorous bound on the transmission probability for massless scalar field of non-negative-angular-momentum mode emitted from a Myers-Perry black hole

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ngampitipan, Tritos, E-mail: tritos.ngampitipan@gmail.com; Particle Physics Research Laboratory, Department of Physics, Faculty of Science, Chulalongkorn University, Phayathai Road, Patumwan, Bangkok 10330; Boonserm, Petarpa, E-mail: petarpa.boonserm@gmail.com

    Hawking radiation is evidence for the existence of black holes. What an observer can measure through Hawking radiation is the transmission probability. In the laboratory, miniature black holes can successfully be generated; the generated black holes are, most commonly, Myers-Perry black holes. In this paper, we derive rigorous bounds on the transmission probabilities for massless scalar fields of non-negative-angular-momentum modes emitted from a generated Myers-Perry black hole in six, seven, and eight dimensions. The results show that for low energy, the rigorous bounds increase with the energy of the emitted particles, whereas for high energy they decrease with increasing energy. When the black holes spin faster, the rigorous bounds decrease. For dimension dependence, the rigorous bounds also decrease with the number of extra dimensions. Furthermore, in comparison with the approximate transmission probability, the rigorous bound proves to be useful.

  15. Demodulation of messages received with low signal to noise ratio

    NASA Astrophysics Data System (ADS)

    Marguinaud, A.; Quignon, T.; Romann, B.

    The implementation of this all-digital demodulator is derived from maximum-likelihood considerations applied to an analytical representation of the received signal. Traditional matched filters and phase-locked loops are replaced by minimum-variance estimators and hypothesis tests. These statistical tests become very simple when working on the phase signal. These methods, combined with rigorous control of the data representation, allow significant computation savings compared with conventional realizations. Nominal operation has been verified on a QPSK demodulator down to a signal-to-noise ratio of -3 dB.

  16. Proficient vs. Prepared: Disparities between State Tests and the 2013 National Assessment of Educational Progress (NAEP)

    ERIC Educational Resources Information Center

    Achieve, Inc., 2015

    2015-01-01

    Today's economy demands that all young people develop high-level literacy, quantitative reasoning, problem solving, communication, and collaboration skills, all grounded in a rigorous and content-rich K-12 curriculum. Acquiring these skills ensures that high school graduates are academically prepared to pursue the future of their choosing.…

  17. The Case against Exit Exams. Policy Brief

    ERIC Educational Resources Information Center

    Hyslop, Anne

    2014-01-01

    In the 2013-14 school year, twenty-four states required students to be proficient on standardized tests in order to graduate from high school. But starting next year, and in the years to come, states will launch more rigorous, college- and career-ready assessments aligned to the Common Core. As they do so, they should revisit the stakes on these…

  18. Large eddy simulation of forest canopy flow for wildland fire modeling

    Treesearch

    Eric Mueller; William Mell; Albert Simeoni

    2014-01-01

    Large eddy simulation (LES) based computational fluid dynamics (CFD) simulators have obtained increasing attention in the wildland fire research community, as these tools allow the inclusion of important driving physics. However, due to the complexity of the models, individual aspects must be isolated and tested rigorously to ensure meaningful results. As wind is a...

  19. Burnout and Engagement in University Students: A Cross-National Study.

    ERIC Educational Resources Information Center

    Schaufeli, Wilmar B.; Martinez, Isabel M.; Pinto, Alexandra Marques; Salanova, Marisa; Bakker, Arnold B.

    2002-01-01

    Examines burnout and engagement among college students from Spain, Portugal, and the Netherlands using the Maslach Burnout Inventory Student Survey (MBI-SS) and the Utrecht Work Engagement Scale for students. Overall, these two instruments may be used for such a purpose, but both instruments, particularly the MBI-SS, do not pass a rigorous test of…

  20. Lichen elements as pollution indicators: evaluation of methods for large monitoring programmes

    Treesearch

    Susan Will-Wolf; Sarah Jovan; Michael C. Amacher

    2017-01-01

    Lichen element content is a reliable indicator for relative air pollution load in research and monitoring programmes requiring both efficiency and representation of many sites. We tested the value of costly rigorous field and handling protocols for sample element analysis using five lichen species. No relaxation of rigour was supported; four relaxed protocols generated...

Top