Sample records for statistical experiment design

  1. Assay optimization: a statistical design of experiments approach.

    PubMed

    Altekar, Maneesha; Homon, Carol A; Kashem, Mohammed A; Mason, Steven W; Nelson, Richard M; Patnaude, Lori A; Yingling, Jeffrey; Taylor, Paul B

    2007-03-01

    With the transition from manual to robotic HTS in the last several years, assay optimization has become a significant bottleneck. Recent advances in robotic liquid handling have made it feasible to reduce assay optimization timelines with the application of statistically designed experiments. When implemented, they can efficiently optimize assays by rapidly identifying significant factors, complex interactions, and nonlinear responses. This article focuses on the use of statistically designed experiments in assay optimization.

  2. Experimental toxicology: Issues of statistics, experimental design, and replication.

    PubMed

    Briner, Wayne; Kirwan, Jeral

    2017-01-01

    The difficulty of replicating experiments has drawn considerable attention. Issues with replication occur for a variety of reasons ranging from experimental design to laboratory errors to inappropriate statistical analysis. Here we review a variety of guidelines for statistical analysis, design, and execution of experiments in toxicology. In general, replication can be improved by using hypothesis driven experiments with adequate sample sizes, randomization, and blind data collection techniques. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Statistical power and optimal design in experiments in which samples of participants respond to samples of stimuli.

    PubMed

    Westfall, Jacob; Kenny, David A; Judd, Charles M

    2014-10-01

    Researchers designing experiments in which a sample of participants responds to a sample of stimuli are faced with difficult questions about optimal study design. The conventional procedures of statistical power analysis fail to provide appropriate answers to these questions because they are based on statistical models in which stimuli are not assumed to be a source of random variation in the data, models that are inappropriate for experiments involving crossed random factors of participants and stimuli. In this article, we present new methods of power analysis for designs with crossed random factors, and we give detailed, practical guidance to psychology researchers planning experiments in which a sample of participants responds to a sample of stimuli. We extensively examine 5 commonly used experimental designs, describe how to estimate statistical power in each, and provide power analysis results based on a reasonable set of default parameter values. We then develop general conclusions and formulate rules of thumb concerning the optimal design of experiments in which a sample of participants responds to a sample of stimuli. We show that in crossed designs, statistical power typically does not approach unity as the number of participants goes to infinity but instead approaches a maximum attainable power value that is possibly small, depending on the stimulus sample. We also consider the statistical merits of designs involving multiple stimulus blocks. Finally, we provide a simple and flexible Web-based power application to aid researchers in planning studies with samples of stimuli.
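    A sketch of why this happens, under a generic crossed random-effects model (illustrative only; not the paper's exact formulas): with n participants crossed with q stimuli, the sampling variance of the estimated condition effect has the form

    $$\operatorname{Var}(\hat{d}) \;\approx\; \frac{\sigma^2_{E}}{nq} \;+\; \frac{\sigma^2_{P\times C}}{n} \;+\; \frac{\sigma^2_{S\times C}}{q}$$

    (up to design-specific constants), where the three terms reflect residual, participant-by-condition, and stimulus-by-condition variance. As n grows without bound, the stimulus-by-condition term $\sigma^2_{S\times C}/q$ remains, so the noncentrality parameter, and therefore power, is capped by the size of the stimulus sample.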

  4. Statistical issues in quality control of proteomic analyses: good experimental design and planning.

    PubMed

    Cairns, David A

    2011-03-01

    Quality control is becoming increasingly important in proteomic investigations as experiments become more multivariate and quantitative. Quality control applies to all stages of an investigation and statistics can play a key role. In this review, the role of statistical ideas in the design and planning of an investigation is described. This involves the design of unbiased experiments using key concepts from statistical experimental design, the understanding of the biological and analytical variation in a system using variance components analysis and the determination of a required sample size to perform a statistically powerful investigation. These concepts are described through simple examples and an example data set from a 2-D DIGE pilot experiment. Each of these concepts can prove useful in producing better and more reproducible data. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Statistical issues in the design and planning of proteomic profiling experiments.

    PubMed

    Cairns, David A

    2015-01-01

    The statistical design of a clinical proteomics experiment is a critical part of a well-undertaken investigation. Standard concepts from experimental design such as randomization, replication and blocking should be applied in all experiments, and this is possible when the experimental conditions are well understood by the investigator. The large number of proteins simultaneously considered in proteomic discovery experiments means that determining the number of required replicates to perform a powerful experiment is more complicated than in simple experiments. However, by using information about the nature of an experiment and making simple assumptions, this is achievable for a variety of experiments useful for biomarker discovery and initial validation.

  6. Guidelines for the design and statistical analysis of experiments in papers submitted to ATLA.

    PubMed

    Festing, M F

    2001-01-01

    In vitro experiments need to be well designed and correctly analysed if they are to achieve their full potential to replace the use of animals in research. An "experiment" is a procedure for collecting scientific data in order to answer a hypothesis, or to provide material for generating new hypotheses, and differs from a survey because the scientist has control over the treatments that can be applied. Most experiments can be classified into one of a few formal designs, the most common being completely randomised, and randomised block designs. These are quite common with in vitro experiments, which are often replicated in time. Some experiments involve a single independent (treatment) variable, while other "factorial" designs simultaneously vary two or more independent variables, such as drug treatment and cell line. Factorial designs often provide additional information at little extra cost. Experiments need to be carefully planned to avoid bias, be powerful yet simple, provide for a valid statistical analysis and, in some cases, have a wide range of applicability. Virtually all experiments need some sort of statistical analysis in order to take account of biological variation among the experimental subjects. Parametric methods using the t test or analysis of variance are usually more powerful than non-parametric methods, provided the underlying assumptions of normality of the residuals and equal variances are approximately valid. The statistical analyses of data from a completely randomised design, and from a randomised-block design are demonstrated in Appendices 1 and 2, and methods of determining sample size are discussed in Appendix 3. Appendix 4 gives a checklist for authors submitting papers to ATLA.
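    As a minimal illustration of the kind of analysis demonstrated in Appendix 1 (a completely randomised design), the sketch below runs a one-way ANOVA in Python on hypothetical in vitro data; the group names and values are invented, not taken from the paper.

    ```python
    # One-way ANOVA for a completely randomised design with three treatment groups.
    # Hypothetical data; a randomised-block design would instead use a two-way ANOVA
    # or mixed model with a block term.
    from scipy import stats

    control   = [0.82, 0.91, 0.88, 0.79, 0.85]
    low_dose  = [0.74, 0.69, 0.77, 0.72, 0.70]
    high_dose = [0.55, 0.61, 0.58, 0.52, 0.60]

    f_stat, p_value = stats.f_oneway(control, low_dose, high_dose)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
    ```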

  7. Statistical models for the analysis and design of digital polymerase chain reaction (dPCR) experiments

    USGS Publications Warehouse

    Dorazio, Robert; Hunter, Margaret

    2015-01-01

    Statistical methods for the analysis and design of experiments using digital PCR (dPCR) have received only limited attention and have been misused in many instances. To address this issue and to provide a more general approach to the analysis of dPCR data, we describe a class of statistical models for the analysis and design of experiments that require quantification of nucleic acids. These models are mathematically equivalent to generalized linear models of binomial responses that include a complementary, log–log link function and an offset that is dependent on the dPCR partition volume. These models are both versatile and easy to fit using conventional statistical software. Covariates can be used to specify different sources of variation in nucleic acid concentration, and a model’s parameters can be used to quantify the effects of these covariates. For purposes of illustration, we analyzed dPCR data from different types of experiments, including serial dilution, evaluation of copy number variation, and quantification of gene expression. We also showed how these models can be used to help design dPCR experiments, as in selection of sample sizes needed to achieve desired levels of precision in estimates of nucleic acid concentration or to detect differences in concentration among treatments with prescribed levels of statistical power.

  8. Statistical Models for the Analysis and Design of Digital Polymerase Chain Reaction (dPCR) Experiments.

    PubMed

    Dorazio, Robert M; Hunter, Margaret E

    2015-11-03

    Statistical methods for the analysis and design of experiments using digital PCR (dPCR) have received only limited attention and have been misused in many instances. To address this issue and to provide a more general approach to the analysis of dPCR data, we describe a class of statistical models for the analysis and design of experiments that require quantification of nucleic acids. These models are mathematically equivalent to generalized linear models of binomial responses that include a complementary, log-log link function and an offset that is dependent on the dPCR partition volume. These models are both versatile and easy to fit using conventional statistical software. Covariates can be used to specify different sources of variation in nucleic acid concentration, and a model's parameters can be used to quantify the effects of these covariates. For purposes of illustration, we analyzed dPCR data from different types of experiments, including serial dilution, evaluation of copy number variation, and quantification of gene expression. We also showed how these models can be used to help design dPCR experiments, as in selection of sample sizes needed to achieve desired levels of precision in estimates of nucleic acid concentration or to detect differences in concentration among treatments with prescribed levels of statistical power.
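    The model described above can be written as follows: a partition of volume v is positive with probability p = 1 − exp(−λv), where λ is the concentration of target molecules, so cloglog(p) = log(−log(1 − p)) = log λ + log v, a binomial GLM with a complementary log-log link and offset log v. A hedged sketch of fitting such a model in Python with statsmodels is given below; the data, column names, and serial-dilution covariate are hypothetical, and the link class name may differ across statsmodels versions.

    ```python
    # Binomial GLM with complementary log-log link and offset = log(partition volume).
    # Hypothetical serial-dilution data; not the paper's dataset.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    df = pd.DataFrame({
        "positive":  [1200, 640, 310, 150],       # positive partitions per reaction
        "total":     [20000, 20000, 20000, 20000],
        "dilution":  [1.0, 0.5, 0.25, 0.125],
        "volume_nl": [0.85, 0.85, 0.85, 0.85],    # partition volume
    })

    endog = np.column_stack([df["positive"], df["total"] - df["positive"]])
    exog = sm.add_constant(np.log(df["dilution"]))
    offset = np.log(df["volume_nl"])

    model = sm.GLM(endog, exog,
                   family=sm.families.Binomial(link=sm.families.links.CLogLog()),
                   offset=offset)
    result = model.fit()
    print(result.summary())
    # exp(intercept) estimates copies per unit partition volume at dilution = 1;
    # a slope near 1 for log(dilution) is consistent with the serial dilution.
    ```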

  9. Study/experimental/research design: much more than statistics.

    PubMed

    Knight, Kenneth L

    2010-01-01

    The purpose of study, experimental, or research design in scientific manuscripts has changed significantly over the years. It has evolved from an explanation of the design of the experiment (ie, data gathering or acquisition) to an explanation of the statistical analysis. This practice makes "Methods" sections hard to read and understand. The objectives of this article are to clarify the difference between study design and statistical analysis, to show the advantages of a properly written study design for article comprehension, and to encourage authors to correctly describe study designs. The role of study design is explored from the introduction of the concept by Fisher through modern-day scientists and the AMA Manual of Style. At one time, when experiments were simpler, the study design and statistical design were identical or very similar. With the complex research that is common today, which often includes manipulating variables to create new variables and the multiple (and different) analyses of a single data set, data collection is very different from statistical design. Thus, both a study design and a statistical design are necessary. With both in place, scientific manuscripts will be much easier to read and comprehend. A proper experimental design serves as a road map to the study methods, helping readers to understand more clearly how the data were obtained and, therefore, assisting them in properly analyzing the results.

  10. Study/Experimental/Research Design: Much More Than Statistics

    PubMed Central

    Knight, Kenneth L.

    2010-01-01

    Abstract Context: The purpose of study, experimental, or research design in scientific manuscripts has changed significantly over the years. It has evolved from an explanation of the design of the experiment (ie, data gathering or acquisition) to an explanation of the statistical analysis. This practice makes “Methods” sections hard to read and understand. Objective: To clarify the difference between study design and statistical analysis, to show the advantages of a properly written study design on article comprehension, and to encourage authors to correctly describe study designs. Description: The role of study design is explored from the introduction of the concept by Fisher through modern-day scientists and the AMA Manual of Style. At one time, when experiments were simpler, the study design and statistical design were identical or very similar. With the complex research that is common today, which often includes manipulating variables to create new variables and the multiple (and different) analyses of a single data set, data collection is very different than statistical design. Thus, both a study design and a statistical design are necessary. Advantages: Scientific manuscripts will be much easier to read and comprehend. A proper experimental design serves as a road map to the study methods, helping readers to understand more clearly how the data were obtained and, therefore, assisting them in properly analyzing the results. PMID:20064054

  11. Statistical Engineering in Air Traffic Management Research

    NASA Technical Reports Server (NTRS)

    Wilson, Sara R.

    2015-01-01

    NASA is working to develop an integrated set of advanced technologies to enable efficient arrival operations in high-density terminal airspace for the Next Generation Air Transportation System. This integrated arrival solution is being validated and verified in laboratories and transitioned to a field prototype for an operational demonstration at a major U.S. airport. Within NASA, this is a collaborative effort between Ames and Langley Research Centers involving a multi-year iterative experimentation process. Designing and analyzing a series of sequential batch computer simulations and human-in-the-loop experiments across multiple facilities and simulation environments involves a number of statistical challenges. Experiments conducted in separate laboratories typically have different limitations and constraints, and can take different approaches with respect to the fundamental principles of statistical design of experiments. This often makes it difficult to compare results from multiple experiments and incorporate findings into the next experiment in the series. A statistical engineering approach is being employed within this project to support risk-informed decision making and maximize the knowledge gained within the available resources. This presentation describes a statistical engineering case study from NASA, highlights statistical challenges, and discusses areas where existing statistical methodology is adapted and extended.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heaney, Mike

    Statistically designed experiments can save researchers time and money by reducing the number of necessary experimental trials, while resulting in more conclusive experimental results. Surprisingly, many researchers are still not aware of this efficient and effective experimental methodology. As reported in a 2013 article from Chemical & Engineering News, there has been a resurgence of this methodology in recent years (http://cen.acs.org/articles/91/i13/Design-Experiments-Makes-Comeback.html?h=2027056365). This presentation will provide a brief introduction to statistically designed experiments. The main advantages will be reviewed along with some basic concepts such as factorial and fractional factorial designs. The recommended sequential approach to experiments will be introduced, and finally a case study will be presented to demonstrate this methodology.
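    A minimal sketch (not from the presentation) of the two design families it introduces: a 2^3 full factorial in coded −1/+1 units and a 2^(4−1) half-fraction built from it with the standard generator D = ABC. Factor labels are placeholders.

    ```python
    # Build a 2^3 full factorial, then add a fourth factor via the generator D = A*B*C
    # to obtain a 2^(4-1) half-fraction (8 runs instead of 16).
    import itertools
    import numpy as np

    full = np.array(list(itertools.product([-1, 1], repeat=3)))   # runs for A, B, C
    d = full[:, 0] * full[:, 1] * full[:, 2]                      # generator D = ABC
    half_fraction = np.column_stack([full, d])

    print("run   A   B   C   D")
    for i, row in enumerate(half_fraction, 1):
        print(f"{i:>3} " + " ".join(f"{int(v):+3d}" for v in row))
    ```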

  13. Optimizing ELISAs for precision and robustness using laboratory automation and statistical design of experiments.

    PubMed

    Joelsson, Daniel; Moravec, Phil; Troutman, Matthew; Pigeon, Joseph; DePhillips, Pete

    2008-08-20

    Transferring manual ELISAs to automated platforms requires optimizing the assays for each particular robotic platform. These optimization experiments are often time consuming and difficult to perform using a traditional one-factor-at-a-time strategy. In this manuscript we describe the development of an automated process using statistical design of experiments (DOE) to quickly optimize immunoassays for precision and robustness on the Tecan EVO liquid handler. By using fractional factorials and a split-plot design, five incubation time variables and four reagent concentration variables can be optimized in a short period of time.

  14. On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
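    As one concrete illustration of the metamodeling idea (a sketch, not from the paper): a second-order response surface fit by least squares to the outputs of an "expensive" analysis code, here stood in for by a cheap synthetic function of two design variables.

    ```python
    # Quadratic response-surface metamodel:
    #   y ~ b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, size=(30, 2))             # 30 sampled design points
    y = 3 + 2*x[:, 0] - x[:, 1] + 0.5*x[:, 0]*x[:, 1] + x[:, 0]**2 \
        + rng.normal(0, 0.05, 30)                    # stand-in for the analysis code

    X = np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
                         x[:, 0]*x[:, 1], x[:, 0]**2, x[:, 1]**2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("fitted coefficients:", np.round(coef, 3))  # cheap surrogate to pass to an optimizer
    ```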

  15. Determining significant material properties: A discovery approach

    NASA Technical Reports Server (NTRS)

    Karplus, Alan K.

    1992-01-01

    The following is a laboratory experiment designed to further understanding of materials science. The experiment itself can be informative for persons of any age past elementary school, and even for some in elementary school. The preparation of the plastic samples is readily accomplished by persons with reasonable dexterity in the cutting of paper designs. The completion of the statistical Design of Experiments, which uses Yates' Method, requires basic math (addition and subtraction). Interpretive work requires plotting of data and making observations. Knowledge of statistical methods would be helpful. The purpose of this experiment is to acquaint students with the seven classes of recyclable plastics, and provide hands-on learning about the response of these plastics to mechanical tensile loading.
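    A short sketch of the Yates' Method calculation mentioned above (repeated sums and differences of adjacent pairs) for responses listed in standard order; the numbers are hypothetical, not the experiment's data.

    ```python
    # Yates' algorithm for a 2^k factorial: k passes of pairwise sums then differences,
    # followed by scaling, yield the mean and the factorial effect estimates.
    def yates(responses):
        y = list(responses)
        n = len(y)
        k = n.bit_length() - 1                     # n must be 2**k
        for _ in range(k):
            sums  = [y[i] + y[i + 1] for i in range(0, n, 2)]
            diffs = [y[i + 1] - y[i] for i in range(0, n, 2)]
            y = sums + diffs
        total, *contrasts = y
        return total / n, [c / (n / 2) for c in contrasts]   # mean, effects (A, B, AB, ...)

    mean, effects = yates([8.2, 9.1, 7.4, 8.8])    # standard order: (1), a, b, ab
    print(mean, effects)
    ```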

  16. Process optimization using combinatorial design principles: parallel synthesis and design of experiment methods.

    PubMed

    Gooding, Owen W

    2004-06-01

    The use of parallel synthesis techniques with statistical design of experiment (DoE) methods is a powerful combination for the optimization of chemical processes. Advances in parallel synthesis equipment and easy to use software for statistical DoE have fueled a growing acceptance of these techniques in the pharmaceutical industry. As drug candidate structures become more complex at the same time that development timelines are compressed, these enabling technologies promise to become more important in the future.

  17. A comparison of InVivoStat with other statistical software packages for analysis of data generated from animal experiments.

    PubMed

    Clark, Robin A; Shoaib, Mohammed; Hewitt, Katherine N; Stanford, S Clare; Bate, Simon T

    2012-08-01

    InVivoStat is a free-to-use statistical software package for analysis of data generated from animal experiments. The package is designed specifically for researchers in the behavioural sciences, where exploiting the experimental design is crucial for reliable statistical analyses. This paper compares the analysis of three experiments conducted using InVivoStat with other widely used statistical packages: SPSS (V19), PRISM (V5), UniStat (V5.6) and Statistica (V9). We show that InVivoStat provides results that are similar to those from the other packages and, in some cases, are more advanced. This investigation provides evidence of further validation of InVivoStat and should strengthen users' confidence in this new software package.

  18. Accounting for variation in designing greenhouse experiments with special reference to greenhouses containing plants on conveyor systems

    PubMed Central

    2013-01-01

    Background There are a number of unresolved issues in the design of experiments in greenhouses. They include whether statistical designs should be used and, if so, which designs should be used. Also, are there thigmomorphogenic or other effects arising from the movement of plants on conveyor belts within a greenhouse? A two-phase, single-line wheat experiment involving four tactics was conducted in a conventional greenhouse and a fully-automated phenotyping greenhouse (Smarthouse) to investigate these issues. Results and discussion Analyses of our experiment show that there was a small east–west trend in total area of the plants in the Smarthouse. Analyses of the data from three multiline experiments reveal a large north–south trend. In the single-line experiment, there was no evidence of differences between trios of lanes, nor of movement effects. Swapping plant positions during the trial was found to decrease the east–west trend, but at the cost of increased error variance. The movement of plants in a north–south direction, through a shaded area for an equal amount of time, nullified the north–south trend. An investigation of alternative experimental designs for equally-replicated experiments revealed that generally designs with smaller blocks performed best, but that (nearly) trend-free designs can be effective when blocks are larger. Conclusions To account for variation in microclimate in a greenhouse, using statistical design and analysis is better than rearranging the position of plants during the experiment. For the relocation of plants to be successful requires that plants spend an equal amount of time in each microclimate, preferably during comparable growth stages. Even then, there is no evidence that this will be any more precise than statistical design and analysis of the experiment, and the risk is that it will not be successful at all. As for statistical design and analysis, it is best to use either (i) smaller blocks, (ii) (nearly) trend-free arrangement of treatments with a linear trend term included in the analysis, or, as a last resort, (iii) blocks of several complete rows with trend terms in the analysis. Also, we recommend that the greenhouse arrangement parallel that in the Smarthouse, but with randomization where appropriate. PMID:23391282

  19. Sb2Te3 and Its Superlattices: Optimization by Statistical Design.

    PubMed

    Behera, Jitendra K; Zhou, Xilin; Ranjan, Alok; Simpson, Robert E

    2018-05-02

    The objective of this work is to demonstrate the usefulness of fractional factorial design for optimizing the crystal quality of chalcogenide van der Waals (vdW) crystals. We statistically analyze the growth parameters of highly c-axis-oriented Sb2Te3 crystals and Sb2Te3-GeTe phase-change vdW heterostructured superlattices. The statistical significance of the growth parameters of temperature, pressure, power, buffer materials, and buffer layer thickness was found by fractional factorial design and response surface analysis. Temperature, pressure, power, and their second-order interactions are the major factors that significantly influence the quality of the crystals. Additionally, using tungsten rather than molybdenum as a buffer layer significantly enhances the crystal quality. Fractional factorial design minimizes the number of experiments that are necessary to find the optimal growth conditions, resulting in an order of magnitude improvement in the crystal quality. We highlight that statistical design of experiment methods, which are more commonly used in product design, should be considered more broadly by those designing and optimizing materials.

  20. Experimental design, power and sample size for animal reproduction experiments.

    PubMed

    Chapman, Phillip L; Seidel, George E

    2008-01-01

    The present paper concerns statistical issues in the design of animal reproduction experiments, with emphasis on the problems of sample size determination and power calculations. We include examples and non-technical discussions aimed at helping researchers avoid serious errors that may invalidate or seriously impair the validity of conclusions from experiments. Screen shots from interactive power calculation programs and basic SAS power calculation programs are presented to aid in understanding statistical power and computing power in some common experimental situations. Practical issues that are common to most statistical design problems are briefly discussed. These include one-sided hypothesis tests, power level criteria, equality of within-group variances, transformations of response variables to achieve variance equality, optimal specification of treatment group sizes, 'post hoc' power analysis and arguments for the increased use of confidence intervals in place of hypothesis tests.
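    For readers without SAS, a comparable calculation can be done in Python; this is a sketch with a hypothetical effect size, not a value from the paper.

    ```python
    # Sample size per group for a two-sided, two-sample t test at alpha = 0.05,
    # 80% power, and an assumed standardized effect size (Cohen's d) of 0.75.
    from statsmodels.stats.power import TTestIndPower

    n_per_group = TTestIndPower().solve_power(effect_size=0.75, alpha=0.05,
                                              power=0.80, alternative="two-sided")
    print(f"about {n_per_group:.1f} animals per group (round up)")
    ```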

  1. Teaching Statistical Inference for Causal Effects in Experiments and Observational Studies

    ERIC Educational Resources Information Center

    Rubin, Donald B.

    2004-01-01

    Inference for causal effects is a critical activity in many branches of science and public policy. The field of statistics is the one field most suited to address such problems, whether from designed experiments or observational studies. Consequently, it is arguably essential that departments of statistics teach courses in causal inference to both…

  2. Return to Our Roots: Raising Radishes To Teach Experimental Design.

    ERIC Educational Resources Information Center

    Stallings, William M.

    To provide practice in making design decisions, collecting and analyzing data, and writing and documenting results, a professor of statistics has his graduate students in statistics and research methodology classes design and perform an experiment on the effects of fertilizers on the growth of radishes. This project has been required of students…

  3. Proceedings of the Conference on the Design of Experiments (23rd) S

    DTIC Science & Technology

    1978-07-01

    of Statistics, Carnegie-Mellon University. [12] Duran, B. S. (1976). A survey of nonparametric tests for scale. Communications in Statistics A5, 1287... the twenty-third Design of Experiments Conference was the U.S. Army Combat Development Experimentation Command, Fort Ord, California. Excellent... Availability Prof. G. E. P. Box, Time Series Modelling, University of Wisconsin. Dr. Churchill Eisenhart was recipient this year of the Samuel S. Wilks Memorial

  4. Characterizing Variability in Smestad and Gratzel's Nanocrystalline Solar Cells: A Collaborative Learning Experience in Experimental Design

    ERIC Educational Resources Information Center

    Lawson, John; Aggarwal, Pankaj; Leininger, Thomas; Fairchild, Kenneth

    2011-01-01

    This article describes a collaborative learning experience in experimental design that closely approximates what practicing statisticians and researchers in applied science experience during consulting. Statistics majors worked with a teaching assistant from the chemistry department to conduct a series of experiments characterizing the variation…

  5. Experience and Explanation: Using Videogames to Prepare Students for Formal Instruction in Statistics

    ERIC Educational Resources Information Center

    Arena, Dylan A.; Schwartz, Daniel L.

    2014-01-01

    Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics,…

  6. Statistical validation and an empirical model of hydrogen production enhancement found by utilizing passive flow disturbance in the steam-reformation process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Paul A.; Liao, Chang-hsien

    2007-11-15

    A passive flow disturbance has been proven to enhance the conversion of fuel in a methanol-steam reformer. This study presents a statistical validation of the experiment based on a standard 2^k factorial experiment design and the resulting empirical model of the enhanced hydrogen-producing process. A factorial experiment design was used to statistically analyze the effects and interactions of various input factors in the experiment. Three input factors, including the number of flow disturbers, catalyst size, and reactant flow rate, were investigated for their effects on the fuel conversion in the steam-reformation process. Based on the experimental results, an empirical model was developed and further evaluated with an uncertainty analysis and interior point data. (author)

  7. Statistical inference for tumor growth inhibition T/C ratio.

    PubMed

    Wu, Jianrong

    2010-09-01

    The tumor growth inhibition T/C ratio is commonly used to quantify treatment effects in drug screening tumor xenograft experiments. The T/C ratio is converted to an antitumor activity rating using an arbitrary cutoff point and often without any formal statistical inference. Here, we applied a nonparametric bootstrap method and a small sample likelihood ratio statistic to make a statistical inference of the T/C ratio, including both hypothesis testing and a confidence interval estimate. Furthermore, sample size and power are also discussed for statistical design of tumor xenograft experiments. Tumor xenograft data from an actual experiment were analyzed to illustrate the application.
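    A minimal sketch of the bootstrap idea described above, with hypothetical tumor volumes (the paper's statistic and data differ):

    ```python
    # Percentile-bootstrap confidence interval for the T/C ratio of mean tumor volumes.
    import numpy as np

    rng = np.random.default_rng(1)
    treated = np.array([310., 420., 280., 390., 350., 300., 440., 370.])
    control = np.array([780., 910., 850., 990., 820., 880., 940., 860.])

    def tc_ratio(t, c):
        return t.mean() / c.mean()

    boot = np.array([tc_ratio(rng.choice(treated, treated.size, replace=True),
                              rng.choice(control, control.size, replace=True))
                     for _ in range(10_000)])
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"T/C = {tc_ratio(treated, control):.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
    ```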

  8. Using a Discussion about Scientific Controversy to Teach Central Concepts in Experimental Design

    ERIC Educational Resources Information Center

    Bennett, Kimberley Ann

    2015-01-01

    Students may need explicit training in informal statistical reasoning in order to design experiments or use formal statistical tests effectively. By using scientific scandals and media misinterpretation, we can explore the need for good experimental design in an informal way. This article describes the use of a paper that reviews the measles mumps…

  9. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.

  10. Software for the Integration of Multiomics Experiments in Bioconductor.

    PubMed

    Ramos, Marcel; Schiffer, Lucas; Re, Angela; Azhar, Rimsha; Basunia, Azfar; Rodriguez, Carmen; Chan, Tiffany; Chapman, Phil; Davis, Sean R; Gomez-Cabrero, David; Culhane, Aedin C; Haibe-Kains, Benjamin; Hansen, Kasper D; Kodali, Hanish; Louis, Marie S; Mer, Arvind S; Riester, Markus; Morgan, Martin; Carey, Vince; Waldron, Levi

    2017-11-01

    Multiomics experiments are increasingly commonplace in biomedical research and add layers of complexity to experimental design, data integration, and analysis. R and Bioconductor provide a generic framework for statistical analysis and visualization, as well as specialized data classes for a variety of high-throughput data types, but methods are lacking for integrative analysis of multiomics experiments. The MultiAssayExperiment software package, implemented in R and leveraging Bioconductor software and design principles, provides for the coordinated representation of, storage of, and operation on multiple diverse genomics data. We provide the unrestricted multiple 'omics data for each cancer tissue in The Cancer Genome Atlas as ready-to-analyze MultiAssayExperiment objects and demonstrate in these and other datasets how the software simplifies data representation, statistical analysis, and visualization. The MultiAssayExperiment Bioconductor package reduces major obstacles to efficient, scalable, and reproducible statistical analysis of multiomics data and enhances data science applications of multiple omics datasets. Cancer Res; 77(21); e39-42. ©2017 AACR.

  11. The Dependence of Strength in Plastics upon Polymer Chain Length and Chain Orientation: An Experiment Emphasizing the Statistical Handling and Evaluation of Data.

    ERIC Educational Resources Information Center

    Spencer, R. Donald

    1984-01-01

    Describes an experiment (using plastic bags) designed to give students practical understanding on using statistics to evaluate data and how statistical treatment of experimental results can enhance their value in solving scientific problems. Students also gain insight into the orientation and structure of polymers by examining the plastic bags.…

  12. Reducing statistics anxiety and enhancing statistics learning achievement: effectiveness of a one-minute strategy.

    PubMed

    Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze

    2014-08-01

    Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.

  13. The Design and Analysis of Transposon-Insertion Sequencing Experiments

    PubMed Central

    Chao, Michael C.; Abel, Sören; Davis, Brigid M.; Waldor, Matthew K.

    2016-01-01

    Preface Transposon-insertion sequencing (TIS) is a powerful approach that can be widely applied to genome-wide definition of loci that are required for growth in diverse conditions. However, experimental design choices and stochastic biological processes can heavily influence the results of TIS experiments and affect downstream statistical analysis. Here, we discuss TIS experimental parameters and how these factors relate to the benefits and limitations of the various statistical frameworks that can be applied to computational analysis of TIS data. PMID:26775926

  14. Statistical design of quantitative mass spectrometry-based proteomic experiments.

    PubMed

    Oberg, Ann L; Vitek, Olga

    2009-05-01

    We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations, and using real-data examples of profiling of disease.
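    One of the design principles discussed (randomization within blocks) can be sketched as follows; the sample labels, group sizes, and batch structure are hypothetical.

    ```python
    # Allocate case/control samples to mass-spectrometry batches (blocks) so each batch
    # is balanced by group, then randomize the run order within each batch.
    import random

    random.seed(42)
    cases = [f"case_{i}" for i in range(8)]
    ctrls = [f"ctrl_{i}" for i in range(8)]
    random.shuffle(cases)
    random.shuffle(ctrls)

    batches = []
    for b in range(4):                                 # four batches of four runs
        batch = cases[2*b:2*b + 2] + ctrls[2*b:2*b + 2]
        random.shuffle(batch)                          # randomize run order within block
        batches.append(batch)

    for b, runs in enumerate(batches, 1):
        print(f"batch {b}: {runs}")
    ```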

  15. A comparison of two experimental design approaches in applying conjoint analysis in patient-centered outcomes research: a randomized trial.

    PubMed

    Kinter, Elizabeth T; Prior, Thomas J; Carswell, Christopher I; Bridges, John F P

    2012-01-01

    While the application of conjoint analysis and discrete-choice experiments in health are now widely accepted, a healthy debate exists around competing approaches to experimental design. There remains, however, a paucity of experimental evidence comparing competing design approaches and their impact on the application of these methods in patient-centered outcomes research. Our objectives were to directly compare the choice-model parameters and predictions of an orthogonal and a D-efficient experimental design using a randomized trial (i.e., an experiment on experiments) within an application of conjoint analysis studying patient-centered outcomes among outpatients diagnosed with schizophrenia in Germany. Outpatients diagnosed with schizophrenia were surveyed and randomized to receive choice tasks developed using either an orthogonal or a D-efficient experimental design. The choice tasks elicited judgments from the respondents as to which of two patient profiles (varying across seven outcomes and process attributes) was preferable from their own perspective. The results from the two survey designs were analyzed using the multinomial logit model, and the resulting parameter estimates and their robust standard errors were compared across the two arms of the study (i.e., the orthogonal and D-efficient designs). The predictive performances of the two resulting models were also compared by computing their percentage of survey responses classified correctly, and the potential for variation in scale between the two designs of the experiments was tested statistically and explored graphically. The results of the two models were statistically identical. No difference was found using an overall chi-squared test of equality for the seven parameters (p = 0.69) or via uncorrected pairwise comparisons of the parameter estimates (p-values ranged from 0.30 to 0.98). The D-efficient design resulted in directionally smaller standard errors for six of the seven parameters, of which only two were statistically significant, and no differences were found in the observed D-efficiencies of their standard errors (p = 0.62). The D-efficient design resulted in poorer predictive performance, but this was not significant (p = 0.73); there was some evidence that the parameters of the D-efficient design were biased marginally towards the null. While no statistical difference in scale was detected between the two designs (p = 0.74), the D-efficient design had a higher relative scale (1.06). This could be observed when the parameters were explored graphically, as the D-efficient parameters were lower. Our results indicate that orthogonal and D-efficient experimental designs have produced results that are statistically equivalent. This said, we have identified several qualitative findings that speak to the potential differences in these results that may have been statistically identified in a larger sample. While more comparative studies focused on the statistical efficiency of competing design strategies are needed, a more pressing research problem is to document the impact the experimental design has on respondent efficiency.

  16. The Design and Analysis of Salmonid Tagging Studies in the Columbia Basin : Volume II: Experiment Salmonid Survival with Combined PIT-CWT Tagging.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, Ken

    1997-06-01

    Experiment designs to estimate the effect of transportation on survival and return rates of Columbia River system salmonids are discussed along with statistical modeling techniques. Besides transportation, river flow and dam spill are necessary components in the design and analysis; otherwise, questions as to the effects of reservoir drawdowns and increased dam spill may never be satisfactorily answered. Four criteria for comparing different experiment designs are: (1) feasibility, (2) clarity of results, (3) scope of inference, and (4) time to learn. In this report, alternative designs for conducting experimental manipulations of smolt tagging studies to study effects of river operations such as flow levels, spill fractions, and transporting outmigrating salmonids around dams in the Columbia River system are presented. The principles of study design discussed in this report have broad implications for the many studies proposed to investigate both smolt and adult survival relationships. The concepts are illustrated for the case of the design and analysis of smolt transportation experiments. The merits of proposed transportation studies should be measured relative to these principles of proper statistical design and analysis.

  17. Statistical Methodologies to Integrate Experimental and Computational Research

    NASA Technical Reports Server (NTRS)

    Parker, P. A.; Johnson, R. T.; Montgomery, D. C.

    2008-01-01

    Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.

  18. Predictive Model for the Design of Zwitterionic Polymer Brushes: A Statistical Design of Experiments Approach.

    PubMed

    Kumar, Ramya; Lahann, Joerg

    2016-07-06

    The performance of polymer interfaces in biology is governed by a wide spectrum of interfacial properties. With the ultimate goal of identifying design parameters for stem cell culture coatings, we developed a statistical model that describes the dependence of brush properties on surface-initiated polymerization (SIP) parameters. Employing a design of experiments (DOE) approach, we identified operating boundaries within which four gel architecture regimes can be realized, including a new regime of associated brushes in thin films. Our statistical model can accurately predict the brush thickness and the degree of intermolecular association of poly[{2-(methacryloyloxy) ethyl} dimethyl-(3-sulfopropyl) ammonium hydroxide] (PMEDSAH), a previously reported synthetic substrate for feeder-free and xeno-free culture of human embryonic stem cells. DOE-based multifunctional predictions offer a powerful quantitative framework for designing polymer interfaces. For example, model predictions can be used to decrease the critical thickness at which the wettability transition occurs by simply increasing the catalyst quantity from 1 to 3 mol %.

  19. Design of a factorial experiment with randomization restrictions to assess medical device performance on vascular tissue.

    PubMed

    Diestelkamp, Wiebke S; Krane, Carissa M; Pinnell, Margaret F

    2011-05-20

    Energy-based surgical scalpels are designed to efficiently transect and seal blood vessels using thermal energy to promote protein denaturation and coagulation. Assessment and design improvement of ultrasonic scalpel performance relies on both in vivo and ex vivo testing. The objective of this work was to design and implement a robust, experimental test matrix with randomization restrictions and predictive statistical power, which allowed for identification of those experimental variables that may affect the quality of the seal obtained ex vivo. The design of the experiment included three factors: temperature (two levels); the type of solution used to perfuse the artery during transection (three types); and artery type (two types) resulting in a total of twelve possible treatment combinations. Burst pressures of porcine carotid and renal arteries sealed ex vivo were assigned as the response variable. The experimental test matrix was designed and carried out as a split-plot experiment in order to assess the contributions of several variables and their interactions while accounting for randomization restrictions present in the experimental setup. The statistical software package SAS was utilized and PROC MIXED was used to account for the randomization restrictions in the split-plot design. The combination of temperature, solution, and vessel type had a statistically significant impact on seal quality. The design and implementation of a split-plot experimental test-matrix provided a mechanism for addressing the existing technical randomization restrictions of ex vivo ultrasonic scalpel performance testing, while preserving the ability to examine the potential effects of independent factors or variables. This method for generating the experimental design and the statistical analyses of the resulting data are adaptable to a wide variety of experimental problems involving large-scale tissue-based studies of medical or experimental device efficacy and performance.
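    A rough Python analogue of the split-plot analysis (the paper used SAS PROC MIXED): a linear mixed model with the full factorial of fixed effects and a random intercept for the whole-plot unit that carries the randomization restriction. The data below are synthetic placeholders, not the authors' measurements.

    ```python
    # Mixed model for a split-plot design: fixed factorial effects plus a random
    # whole-plot intercept. All data here are simulated for illustration.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    rows = []
    for wp in range(12):                      # whole plots (randomization restriction units)
        temp = ["low", "high"][wp % 2]        # hard-to-change factor applied per whole plot
        for solution in ["saline", "heparin", "plasma"]:
            for artery in ["carotid", "renal"]:
                rows.append({"whole_plot": wp, "temp": temp, "solution": solution,
                             "artery": artery,
                             "burst_pressure": 400 + 30*(temp == "high") + rng.normal(0, 25)})
    df = pd.DataFrame(rows)

    model = smf.mixedlm("burst_pressure ~ C(temp) * C(solution) * C(artery)",
                        data=df, groups=df["whole_plot"])
    print(model.fit().summary())
    ```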

  20. Selecting the best design for nonstandard toxicology experiments.

    PubMed

    Webb, Jennifer M; Smucker, Byran J; Bailer, A John

    2014-10-01

    Although many experiments in environmental toxicology use standard statistical experimental designs, there are situations that arise where no such standard design is natural or applicable because of logistical constraints. For example, the layout of a laboratory may suggest that each shelf serve as a block, with the number of experimental units per shelf either greater than or less than the number of treatments in a way that precludes the use of a typical block design. In such cases, an effective and powerful alternative is to employ optimal experimental design principles, a strategy that produces designs with precise statistical estimates. Here, a D-optimal design was generated for an experiment in environmental toxicology that has 2 factors, 16 treatments, and constraints similar to those described above. After initial consideration of a randomized complete block design and an intuitive cyclic design, it was decided to compare a D-optimal design and a slightly more complicated version of the cyclic design. Simulations were conducted generating random responses under a variety of scenarios that reflect conditions motivated by a similar toxicology study, and the designs were evaluated via D-efficiency as well as by a power analysis. The cyclic design performed well compared to the D-optimal design. © 2014 SETAC.
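    The D-efficiency comparison mentioned above can be sketched using one common form of the D-criterion, |X'X|^(1/p)/N (larger is better); the two candidate designs below are small hypothetical examples, not the study's 16-treatment designs.

    ```python
    # Compare two candidate designs for a main-effects-plus-interaction model
    # by a D-criterion computed from the model matrix X.
    import numpy as np

    def d_criterion(X):
        n, p = X.shape
        return np.linalg.det(X.T @ X) ** (1.0 / p) / n

    def model_matrix(runs):
        x1, x2 = runs[:, 0], runs[:, 1]
        return np.column_stack([np.ones(len(runs)), x1, x2, x1 * x2])

    design_a = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0], [0, 0]])
    design_b = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [1, 1], [-1, -1]])

    for name, runs in [("A", design_a), ("B", design_b)]:
        print(name, round(d_criterion(model_matrix(runs)), 3))
    ```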

  1. Investigating the impact of design characteristics on statistical efficiency within discrete choice experiments: A systematic survey.

    PubMed

    Vanniyasingam, Thuva; Daly, Caitlin; Jin, Xuejing; Zhang, Yuan; Foster, Gary; Cunningham, Charles; Thabane, Lehana

    2018-06-01

    This study reviews simulation studies of discrete choice experiments to determine (i) how survey design features affect statistical efficiency and (ii) to appraise their reporting quality. Statistical efficiency was measured using relative design (D-) efficiency, D-optimality, or D-error. For this systematic survey, we searched Journal Storage (JSTOR), ScienceDirect, PubMed, and OVID, which included a search within EMBASE. Searches were conducted up to year 2016 for simulation studies investigating the impact of DCE design features on statistical efficiency. Studies were screened and data were extracted independently and in duplicate. Results for each included study were summarized by design characteristic. Previously developed criteria for reporting quality of simulation studies were also adapted and applied to each included study. Of 371 potentially relevant studies, 9 were found to be eligible, with several varying in study objectives. Statistical efficiency improved when increasing the number of choice tasks or alternatives; decreasing the number of attributes or attribute levels; using an unrestricted continuous "manipulator" attribute; using model-based approaches with covariates incorporating response behaviour; using sampling approaches that incorporate previous knowledge of response behaviour; incorporating heterogeneity in a model-based design; correctly specifying Bayesian priors; minimizing parameter prior variances; and using an appropriate method to create the DCE design for the research question. The simulation studies performed well in terms of reporting quality. Improvement is needed with regard to clearly specifying study objectives, number of failures, random number generators, starting seeds, and the software used. These results identify the best approaches to structure a DCE. An investigator can manipulate design characteristics to help reduce response burden and increase statistical efficiency. Since studies varied in their objectives, conclusions were made on several design characteristics; however, the validity of each conclusion was limited. Further research should be conducted to explore all conclusions in various design settings and scenarios. Additional reviews to explore other statistical efficiency outcomes and databases can also be performed to enhance the conclusions identified from this review.

  2. Content-based VLE designs improve learning efficiency in constructivist statistics education.

    PubMed

    Wessa, Patrick; De Rycker, Antoon; Holliday, Ian Edward

    2011-01-01

    We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific-purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under investigation. The findings demonstrate that a content-based design outperforms the traditional VLE-based design.

  3. A Comparison of Methods to Test for Mediation in Multisite Experiments

    ERIC Educational Resources Information Center

    Pituch, Keenan A.; Whittaker, Tiffany A.; Stapleton, Laura M.

    2005-01-01

    A Monte Carlo study extended the research of MacKinnon, Lockwood, Hoffman, West, and Sheets (2002) for single-level designs by examining the statistical performance of four methods to test for mediation in a multilevel experimental design. The design studied was a two-group experiment that was replicated across several sites, included a single…

  4. Examining the Internal Validity and Statistical Precision of the Comparative Interrupted Time Series Design by Comparison with a Randomized Experiment

    ERIC Educational Resources Information Center

    St.Clair, Travis; Cook, Thomas D.; Hallberg, Kelly

    2014-01-01

    Although evaluators often use an interrupted time series (ITS) design to test hypotheses about program effects, there are few empirical tests of the design's validity. We take a randomized experiment on an educational topic and compare its effects to those from a comparative ITS (CITS) design that uses the same treatment group as the experiment…

  5. Good experimental design and statistics can save animals, but how can it be promoted?

    PubMed

    Festing, Michael F W

    2004-06-01

    Surveys of published papers show that there are many errors both in the design of the experiments and in the statistical analysis of the resulting data. This must result in a waste of animals and scientific resources, and it is surely unethical. Scientific quality might be improved, to some extent, by journal editors, but they are constrained by lack of statistical referees and inadequate statistical training of those referees that they do use. Other parties, such as welfare regulators, ethical review committees and individual scientists also have an interest in scientific quality, but they do not seem to be well placed to make the required changes. However, those who fund research would have the power to do something if they could be convinced that it is in their best interests to do so. More examples of the way in which better experimental design has led to improved experiments would be helpful in persuading these funding organisations to take further action.

  6. Exceedance statistics of accelerations resulting from thruster firings on the Apollo-Soyuz mission

    NASA Technical Reports Server (NTRS)

    Fichtl, G. H.; Holland, R. L.

    1981-01-01

    Spacecraft acceleration resulting from firings of vernier control system thrusters is an important consideration in the design, planning, execution, and post-flight analysis of laboratory experiments in space. In particular, scientists and technologists involved with the development of experiments to be performed in space in many instances require statistical information on the magnitude and rate of occurrence of spacecraft accelerations. Typically, these accelerations are stochastic in nature, so that it is useful to characterize these accelerations in statistical terms. Statistics of spacecraft accelerations are summarized.

  7. An Automated Statistical Process Control Study of Inline Mixing Using Spectrophotometric Detection

    ERIC Educational Resources Information Center

    Dickey, Michael D.; Stewart, Michael D.; Willson, C. Grant

    2006-01-01

    An experiment is described, which is designed for a junior-level chemical engineering "fundamentals of measurements and data analysis" course, where students are introduced to the concept of statistical process control (SPC) through a simple inline mixing experiment. The students learn how to create and analyze control charts in an effort to…

  8. Quasi-experimental study designs series-paper 10: synthesizing evidence for effects collected from quasi-experimental studies presents surmountable challenges.

    PubMed

    Becker, Betsy Jane; Aloe, Ariel M; Duvendack, Maren; Stanley, T D; Valentine, Jeffrey C; Fretheim, Atle; Tugwell, Peter

    2017-09-01

    To outline issues of importance to analytic approaches to the synthesis of quasi-experiments (QEs) and to provide a statistical model for use in analysis. We drew on studies of statistics, epidemiology, and social-science methodology to outline methods for synthesis of QE studies. The design and conduct of QEs, effect sizes from QEs, and moderator variables for the analysis of those effect sizes were discussed. Biases, confounding, design complexities, and comparisons across designs offer serious challenges to syntheses of QEs. Key components of meta-analyses of QEs were identified, including the aspects of QE study design to be coded and analyzed. Of utmost importance are the design and statistical controls implemented in the QEs. Such controls and any potential sources of bias and confounding must be modeled in analyses, along with aspects of the interventions and populations studied. Because of such controls, effect sizes from QEs are more complex than those from randomized experiments. A statistical meta-regression model that incorporates important features of the QEs under review was presented. Meta-analyses of QEs provide particular challenges, but thorough coding of intervention characteristics and study methods, along with careful analysis, should allow for sound inferences. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Radar derived spatial statistics of summer rain. Volume 1: Experiment description

    NASA Technical Reports Server (NTRS)

    Katz, I.; Arnold, A.; Goldhirsh, J.; Konrad, T. G.; Vann, W. L.; Dobson, E. B.; Rowland, J. R.

    1975-01-01

    An experiment was performed at Wallops Island, Virginia, to obtain a statistical description of summer rainstorms. Its purpose was to obtain information needed for design of earth and space communications systems in which precipitation in the earth's atmosphere scatters or attenuates the radio signal. Rainstorms were monitored with the high resolution SPANDAR radar and the 3-dimensional structures of the storms were recorded on digital tape. The equipment, the experiment, and tabulated data obtained during the experiment are described.

  10. The influence of narrative v. statistical information on perceiving vaccination risks.

    PubMed

    Betsch, Cornelia; Ulshöfer, Corina; Renkewitz, Frank; Betsch, Tilmann

    2011-01-01

    Health-related information found on the Internet is increasing and impacts patient decision making, e.g. regarding vaccination decisions. In addition to statistical information (e.g. incidence rates of vaccine adverse events), narrative information is also widely available, such as postings on online bulletin boards. Previous research has shown that narrative information can impact treatment decisions, even when statistical information is presented concurrently. As the determinants of this effect are largely unknown, we varied features of the narratives to identify mechanisms through which narratives impact risk judgments. An online bulletin board setting provided participants with statistical information and authentic narratives about the occurrence and nonoccurrence of adverse events. Experiment 1 followed a single factorial design with 1, 2, or 4 narratives out of 10 reporting adverse events. Experiment 2 implemented a 2 (statistical risk 20% vs. 40%) × 2 (2/10 vs. 4/10 narratives reporting adverse events) × 2 (high vs. low richness) × 2 (high vs. low emotionality) between-subjects design. Dependent variables were perceived risk of side-effects and vaccination intentions. Experiment 1 shows an inverse relation between the number of narratives reporting adverse events and vaccination intentions, which was mediated by the perceived risk of vaccinating. Experiment 2 showed a stronger influence of the number of narratives than of the statistical risk information. High (vs. low) emotional narratives had a greater impact on the perceived risk, while richness had no effect. The number of narratives thus influences risk judgments and can potentially override statistical information about risk.

  11. Towards evidence-based computational statistics: lessons from clinical research on the role and design of real-data benchmark studies.

    PubMed

    Boulesteix, Anne-Laure; Wilson, Rory; Hapfelmeier, Alexander

    2017-09-09

    The goal of medical research is to develop interventions that are in some sense superior, with respect to patient outcome, to interventions currently in use. Similarly, the goal of research in methodological computational statistics is to develop data analysis tools that are themselves superior to the existing tools. The methodology of the evaluation of medical interventions continues to be discussed extensively in the literature and it is now well accepted that medicine should be at least partly "evidence-based". Although we statisticians are convinced of the importance of unbiased, well-thought-out study designs and evidence-based approaches in the context of clinical research, we tend to ignore these principles when designing our own studies for evaluating statistical methods in the context of our methodological research. In this paper, we draw an analogy between clinical trials and real-data-based benchmarking experiments in methodological statistical science, with datasets playing the role of patients and methods playing the role of medical interventions. Through this analogy, we suggest directions for improvement in the design and interpretation of studies which use real data to evaluate statistical methods, in particular with respect to dataset inclusion criteria and the reduction of various forms of bias. More generally, we discuss the concept of "evidence-based" statistical research, its limitations and its impact on the design and interpretation of real-data-based benchmark experiments. We suggest that benchmark studies (a method of assessment of statistical methods using real-world datasets) might benefit from adopting (some) concepts from evidence-based medicine towards the goal of more evidence-based statistical research.

  12. Statistical results from the Virginia Tech propagation experiment using the Olympus 12, 20, and 30 GHz satellite beacons

    NASA Technical Reports Server (NTRS)

    Stutzman, Warren L.; Safaai-Jazi, A.; Pratt, Timothy; Nelson, B.; Laster, J.; Ajaz, H.

    1993-01-01

    Virginia Tech has performed a comprehensive propagation experiment using the Olympus satellite beacons at 12.5, 19.77, and 29.66 GHz (which we refer to as 12, 20, and 30 GHz). Four receive terminals were designed and constructed, one terminal at each frequency plus a portable one with 20 and 30 GHz receivers for microscale and scintillation studies. Total power radiometers were included in each terminal in order to set the clear air reference level for each beacon and also to predict path attenuation. More details on the equipment and the experiment design are found elsewhere. Statistical results for one year of data collection were analyzed. In addition, the following studies were performed: a microdiversity experiment in which two closely spaced 20 GHz receivers were used; a comparison of total power and Dicke switched radiometer measurements, frequency scaling of scintillations, and adaptive power control algorithm development. Statistical results are reported.

  13. Simulation study to determine the impact of different design features on design efficiency in discrete choice experiments

    PubMed Central

    Vanniyasingam, Thuva; Cunningham, Charles E; Foster, Gary; Thabane, Lehana

    2016-01-01

    Objectives Discrete choice experiments (DCEs) are routinely used to elicit patient preferences to improve health outcomes and healthcare services. While many fractional factorial designs can be created, some are more statistically optimal than others. The objective of this simulation study was to investigate how varying the number of (1) attributes, (2) levels within attributes, (3) alternatives and (4) choice tasks per survey will improve or compromise the statistical efficiency of an experimental design. Design and methods A total of 3204 DCE designs were created to assess how relative design efficiency (d-efficiency) is influenced by varying the number of choice tasks (2–20), alternatives (2–5), attributes (2–20) and attribute levels (2–5) of a design. Choice tasks were created by randomly allocating attribute and attribute level combinations into alternatives. Outcome Relative d-efficiency was used to measure the optimality of each DCE design. Results DCE design complexity influenced statistical efficiency. Across all designs, relative d-efficiency decreased as the number of attributes and attribute levels increased. It increased for designs with more alternatives. Lastly, relative d-efficiency converges as the number of choice tasks increases, where convergence may not be at 100% statistical optimality. Conclusions Achieving 100% d-efficiency is heavily dependent on the number of attributes, attribute levels, choice tasks and alternatives. Further exploration of overlaps and block sizes are needed. This study's results are widely applicable for researchers interested in creating optimal DCE designs to elicit individual preferences on health services, programmes, policies and products. PMID:27436671
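    To make the d-efficiency measure concrete, the sketch below computes D-efficiency for a simple linear design matrix (a full two-level factorial and an unbalanced subset of it). Note this is the linear-model analogue; DCE software typically computes relative d-efficiency on the information matrix of a choice model, which is more involved than what is shown here.

```python
# Minimal sketch of D-efficiency for a linear design matrix X with N runs
# and p columns: D_eff = 100 * det(X'X)^(1/p) / N. An orthogonal, balanced
# design scores 100; dropping runs degrades the score.
import numpy as np
from itertools import product

def d_efficiency(X):
    N, p = X.shape
    return 100 * np.linalg.det(X.T @ X) ** (1 / p) / N

# Full 2^3 factorial in coded (-1, +1) units, plus an intercept column
runs = np.array(list(product([-1, 1], repeat=3)), dtype=float)
X_full = np.column_stack([np.ones(len(runs)), runs])
print("full factorial:", d_efficiency(X_full))   # 100.0 (orthogonal)

# Dropping runs (an unbalanced subset) lowers efficiency
X_frac = X_full[:6]
print("6-run subset: ", d_efficiency(X_frac))    # below 100
```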

  14. Designing biomedical proteomics experiments: state-of-the-art and future perspectives.

    PubMed

    Maes, Evelyne; Kelchtermans, Pieter; Bittremieux, Wout; De Grave, Kurt; Degroeve, Sven; Hooyberghs, Jef; Mertens, Inge; Baggerman, Geert; Ramon, Jan; Laukens, Kris; Martens, Lennart; Valkenborg, Dirk

    2016-05-01

    With the current expanded technical capabilities to perform mass spectrometry-based biomedical proteomics experiments, an improved focus on the design of experiments is crucial. As it is clear that ignoring the importance of a good design leads to an unprecedented rate of false discoveries that would poison our results, more and more tools are being developed to help researchers design proteomic experiments. In this review, we apply statistical thinking to go through the entire proteomics workflow for biomarker discovery and validation and relate the considerations that should be made at the level of hypothesis building, technology selection, experimental design and the optimization of the experimental parameters.

  15. A d-statistic for single-case designs that is equivalent to the usual between-groups d-statistic.

    PubMed

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E; Boyajian, Jonathan G; Sullivan, Kristynn J; Andrade, Alma; Barrientos, Jeannette L

    2014-01-01

    We describe a standardised mean difference statistic (d) for single-case designs that is equivalent to the usual d in between-groups experiments. We show how it can be used to summarise treatment effects over cases within a study, to do power analyses in planning new studies and grant proposals, and to meta-analyse effects across studies of the same question. We discuss limitations of this d-statistic, and possible remedies to them. Even so, this d-statistic is better founded statistically than other effect size measures for single-case design, and unlike many general linear model approaches such as multilevel modelling or generalised additive models, it produces a standardised effect size that can be integrated over studies with different outcome measures. SPSS macros for both effect size computation and power analysis are available.
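    As a rough illustration of the idea, the sketch below computes a naive between-case standardized mean difference for hypothetical AB single-case data. The authors' d-statistic applies additional corrections (for example, for autocorrelation and small samples) that this simplified version ignores; the data are invented for illustration only.

```python
# Hedged sketch: a naive between-case standardized mean difference for
# AB single-case data, without the corrections of the full estimator.
import numpy as np

# Hypothetical data: each case has a baseline (A) and treatment (B) phase
cases = [
    {"A": np.array([3, 4, 3, 5]), "B": np.array([7, 8, 6, 9])},
    {"A": np.array([2, 3, 2, 4]), "B": np.array([5, 6, 7, 6])},
    {"A": np.array([4, 4, 5, 3]), "B": np.array([8, 7, 9, 8])},
]

diffs = np.array([c["B"].mean() - c["A"].mean() for c in cases])
# Pool within-phase variances across cases and phases
within_vars = [c[ph].var(ddof=1) for c in cases for ph in ("A", "B")]
s_pooled = np.sqrt(np.mean(within_vars))

d = diffs.mean() / s_pooled
print(f"naive between-case d = {d:.2f}")
```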

  16. Implementation of quality by design principles in the development of microsponges as drug delivery carriers: Identification and optimization of critical factors using multivariate statistical analyses and design of experiments studies.

    PubMed

    Simonoska Crcarevska, Maja; Dimitrovska, Aneta; Sibinovska, Nadica; Mladenovska, Kristina; Slavevska Raicki, Renata; Glavas Dodov, Marija

    2015-07-15

    A microsponge drug delivery system (MDDC) was prepared by the double emulsion-solvent-diffusion technique using rotor-stator homogenization. The quality by design (QbD) concept was implemented for the development of MDDC with the potential to be incorporated into a semisolid dosage form (gel). The quality target product profile (QTPP) and critical quality attributes (CQA) were defined and identified accordingly. Critical material attributes (CMA) and critical process parameters (CPP) were identified using the quality risk management (QRM) tool failure mode, effects and criticality analysis (FMECA). CMA and CPP were identified based on results obtained from principal component analysis (PCA-X&Y) and partial least squares (PLS) statistical analysis, along with literature data and product and process knowledge and understanding. FMECA identified the amounts of ethylcellulose, chitosan, acetone, dichloromethane, Span 80 and Tween 80 and the water ratio in the primary/multiple emulsions as CMA, and the rotation speed and stirrer type used for organic solvent removal as CPP. The relationship between the identified CPP and particle size as CQA was described within the design space using a one-factor response surface design of experiments. The results obtained from the statistically designed experiments enabled the establishment of mathematical models and equations that were used for detailed characterization of the influence of the identified CPP upon MDDC particle size and particle size distribution, and for their subsequent optimization. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Statistical methodology for the analysis of dye-switch microarray experiments

    PubMed Central

    Mary-Huard, Tristan; Aubert, Julie; Mansouri-Attia, Nadera; Sandra, Olivier; Daudin, Jean-Jacques

    2008-01-01

    Background In individually dye-balanced microarray designs, each biological sample is hybridized on two different slides, once with Cy3 and once with Cy5. While this strategy ensures an automatic correction of the gene-specific labelling bias, it also induces dependencies between log-ratio measurements that must be taken into account in the statistical analysis. Results We present two original statistical procedures for the statistical analysis of individually balanced designs. These procedures are compared with the usual ML and REML mixed model procedures proposed in most statistical toolboxes, on both simulated and real data. Conclusion The UP procedure we propose as an alternative to usual mixed model procedures is more efficient and significantly faster to compute. This result provides some useful guidelines for the analysis of complex designs. PMID:18271965

  18. Exceedance statistics of accelerations resulting from thruster firings on the Apollo-Soyuz mission

    NASA Technical Reports Server (NTRS)

    Fichtl, G. H.; Holland, R. L.

    1983-01-01

    Spacecraft acceleration resulting from firings of vernier control system thrusters is an important consideration in the design, planning, execution and post-flight analysis of laboratory experiments in space. In particular, scientists and technologists involved with the development of experiments to be performed in space in many instances required statistical information on the magnitude and rate of occurrence of spacecraft accelerations. Typically, these accelerations are stochastic in nature, so that it is useful to characterize these accelerations in statistical terms. Statistics of spacecraft accelerations are summarized. Previously announced in STAR as N82-12127

  19. 16 CFR 1000.26 - Directorate for Epidemiology.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .... In addition, staff in the Hazard Analysis Division design special studies, design and analyze data from experiments for testing of consumer products, and provide statistical expertise and advice to...

  20. 16 CFR 1000.26 - Directorate for Epidemiology.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    .... In addition, staff in the Hazard Analysis Division design special studies, design and analyze data from experiments for testing of consumer products, and provide statistical expertise and advice to...

  1. 16 CFR 1000.26 - Directorate for Epidemiology.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    .... In addition, staff in the Hazard Analysis Division design special studies, design and analyze data from experiments for testing of consumer products, and provide statistical expertise and advice to...

  2. Illinois crash facts and statistics, 2002

    DOT National Transportation Integrated Search

    2002-01-01

    This publication, Illinois Traffic Crash Facts and Statistics for 2002, is designed to provide an overview of motor vehicle crash experience in Illinois. In addition to a plethora of crash data, the publication includes key events in th...

  3. Illinois crash facts and statistics, 2001

    DOT National Transportation Integrated Search

    2001-01-01

    This publication, Illinois Traffic Crash Facts and Statistics for 2001, is designed to provide an overview of motor vehicle crash experience in Illinois. In addition to a plethora of crash data, the publication includes key events in th...

  4. Illinois crash facts and statistics, 2003

    DOT National Transportation Integrated Search

    2003-01-01

    This publication, Illinois Traffic Crash Facts and Statistics for 2003, is designed to provide an overview of motor vehicle crash experience in Illinois. In addition to a plethora of crash data, the publication includes key events in th...

  5. Modeling and Recovery of Iron (Fe) from Red Mud by Coal Reduction

    NASA Astrophysics Data System (ADS)

    Zhao, Xiancong; Li, Hongxu; Wang, Lei; Zhang, Lifeng

    Recovery of Fe from red mud has been studied using statistically designed experiments. The effects of three factors, namely reduction temperature, reduction time and proportion of additive, on the recovery of Fe have been investigated. Experiments have been carried out using orthogonal central composite design and factorial design methods. A model has been obtained through variance analysis at a 92.5% confidence level.
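    For orientation, the sketch below constructs a coded, rotatable central composite design for three factors, the same count of factors named in the abstract. The factor ranges and run layout are generic illustrations, not the paper's actual design matrix; responses would then be fitted with a second-order regression model.

```python
# Hedged sketch: a coded central composite design for three factors
# (e.g., reduction temperature, reduction time, additive proportion).
import numpy as np
from itertools import product

alpha = 2 ** (3 / 4)  # rotatable axial distance for 3 factors (~1.682)
cube = np.array(list(product([-1, 1], repeat=3)), dtype=float)     # 8 corner runs
axial = np.array([[s * alpha if i == j else 0.0 for j in range(3)]
                  for i in range(3) for s in (-1, 1)])              # 6 axial runs
center = np.zeros((3, 3))                                           # 3 center runs
design = np.vstack([cube, axial, center])
print(design.shape)  # (17, 3) coded runs, ready for second-order model fitting
```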

  6. 16 CFR § 1000.26 - Directorate for Epidemiology.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    .... In addition, staff in the Hazard Analysis Division design special studies, design and analyze data from experiments for testing of consumer products, and provide statistical expertise and advice to...

  7. Effects of Platform Design on the Customer Experience in an Online Solar PV Marketplace

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    OShaughnessy, Eric J; Margolis, Robert M; Leibowicz, Benjamin

    Residential solar photovoltaic (PV) customers are increasingly buying PV systems in online marketplaces, where customers can compare multiple quotes from several installers on quote platforms. In this study, we use data from an online marketplace to explore how quote platform design affects customer experiences. We analyze how four design changes affected customer experiences in terms of factors such as prices. We find that three of the four design changes are associated with statistically significant and robust price reductions, even though none of the changes were implemented specifically to reduce prices. The results suggest that even seemingly small platform design changes can affect PV customer experiences in online marketplaces.

  8. Design of a factorial experiment with randomization restrictions to assess medical device performance on vascular tissue

    PubMed Central

    2011-01-01

    Background Energy-based surgical scalpels are designed to efficiently transect and seal blood vessels using thermal energy to promote protein denaturation and coagulation. Assessment and design improvement of ultrasonic scalpel performance relies on both in vivo and ex vivo testing. The objective of this work was to design and implement a robust, experimental test matrix with randomization restrictions and predictive statistical power, which allowed for identification of those experimental variables that may affect the quality of the seal obtained ex vivo. Methods The design of the experiment included three factors: temperature (two levels); the type of solution used to perfuse the artery during transection (three types); and artery type (two types) resulting in a total of twelve possible treatment combinations. Burst pressures of porcine carotid and renal arteries sealed ex vivo were assigned as the response variable. Results The experimental test matrix was designed and carried out as a split-plot experiment in order to assess the contributions of several variables and their interactions while accounting for randomization restrictions present in the experimental setup. The statistical software package SAS was utilized and PROC MIXED was used to account for the randomization restrictions in the split-plot design. The combination of temperature, solution, and vessel type had a statistically significant impact on seal quality. Conclusions The design and implementation of a split-plot experimental test-matrix provided a mechanism for addressing the existing technical randomization restrictions of ex vivo ultrasonic scalpel performance testing, while preserving the ability to examine the potential effects of independent factors or variables. This method for generating the experimental design and the statistical analyses of the resulting data are adaptable to a wide variety of experimental problems involving large-scale tissue-based studies of medical or experimental device efficacy and performance. PMID:21599963

  9. A Robust Adaptive Autonomous Approach to Optimal Experimental Design

    NASA Astrophysics Data System (ADS)

    Gu, Hairong

    Experimentation is the fundamental tool of scientific inquiries to understand the laws governing nature and human behavior. Many complex real-world experimental scenarios, particularly those in quest of prediction accuracy, are difficult to conduct with an existing experimental procedure for the following two reasons. First, the existing experimental procedures require a parametric model to serve as the proxy of the latent data structure or data-generating mechanism at the beginning of an experiment. However, for those experimental scenarios of concern, a sound model is often unavailable before an experiment. Second, those experimental scenarios usually contain a large number of design variables, which potentially leads to a lengthy and costly data collection cycle. Moreover, the existing experimental procedures are unable to optimize large-scale experiments so as to minimize the experimental length and cost. Facing these two challenges, the aim of the present study is to develop a new experimental procedure that allows an experiment to be conducted without the assumption of a parametric model while still achieving satisfactory prediction, and that optimizes experimental designs to improve the efficiency of an experiment. The new experimental procedure developed in the present study is named the robust adaptive autonomous system (RAAS). RAAS is a procedure for sequential experiments composed of multiple experimental trials, which performs function estimation, variable selection, reverse prediction and design optimization on each trial. Directly addressing the challenges in those experimental scenarios of concern, function estimation and variable selection are performed by data-driven modeling methods to generate a predictive model from data collected during the course of an experiment, thus removing the requirement of a parametric model at the beginning of an experiment; design optimization selects experimental designs on the fly, based on their usefulness, so that the fewest designs are needed to reach useful inferential conclusions. Technically, function estimation is realized by Bayesian P-splines, variable selection by a Bayesian spike-and-slab prior, reverse prediction by grid search, and design optimization by the concepts of active learning. The present study demonstrated that RAAS achieves statistical robustness by making accurate predictions without the assumption of a parametric model serving as the proxy of the latent data structure, whereas the existing procedures can draw poor statistical inferences if a misspecified model is assumed; RAAS also achieves inferential efficiency by requiring fewer designs to acquire useful statistical inferences than non-optimal procedures. Thus, RAAS is expected to be a principled solution for real-world experimental scenarios pursuing robust prediction and efficient experimentation.

  10. Artificial Intelligence Approach to Support Statistical Quality Control Teaching

    ERIC Educational Resources Information Center

    Reis, Marcelo Menezes; Paladini, Edson Pacheco; Khator, Suresh; Sommer, Willy Arno

    2006-01-01

    Statistical quality control--SQC (consisting of Statistical Process Control, Process Capability Studies, Acceptance Sampling and Design of Experiments) is a very important tool to obtain, maintain and improve the Quality level of goods and services produced by an organization. Despite its importance, and the fact that it is taught in technical and…

  11. Obtaining mathematical models for assessing efficiency of dust collectors using integrated system of analysis and data management STATISTICA Design of Experiments

    NASA Astrophysics Data System (ADS)

    Azarov, A. V.; Zhukova, N. S.; Kozlovtseva, E. Yu; Dobrinsky, D. R.

    2018-05-01

    The article considers obtaining mathematical models to assess the efficiency of dust collectors using the integrated system of analysis and data management STATISTICA Design of Experiments. The procedure for obtaining mathematical models and processing data is illustrated by laboratory studies on a test rig containing a dust collector based on counter-swirling flows (CSF), using gypsum dust of various fractions. The experimental studies were planned so as to reduce the number of experiments and the cost of experimental research. A second-order Box-Behnken design was used, which reduced the number of trials from 81 to 27. The workflow for statistical analysis of the Box-Behnken design data using standard tools of the integrated system of analysis and data management STATISTICA Design of Experiments is described. Results of the statistical data processing, with significance estimates for the coefficients and adequacy checks of the mathematical models, are presented.
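    To show where the 27 trials come from, the sketch below assembles a four-factor Box-Behnken design by hand: each pair of factors is run at the ±1 corners while the remaining factors sit at their center levels, plus a few center points. Factor names and the number of center points (three) are assumed here for illustration.

```python
# Hedged sketch of a four-factor Box-Behnken design: 6 factor pairs x 4
# corner settings = 24 edge runs, plus 3 center points = 27 runs,
# compared with 3^4 = 81 runs for the full three-level factorial.
import numpy as np
from itertools import combinations, product

k = 4
runs = []
for i, j in combinations(range(k), 2):        # 6 factor pairs
    for a, b in product([-1, 1], repeat=2):   # 4 corner settings per pair
        row = [0] * k
        row[i], row[j] = a, b
        runs.append(row)
runs.extend([[0] * k] * 3)                    # center points
design = np.array(runs)
print(design.shape)                           # (27, 4): 81 trials reduced to 27
```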

  12. Content-Based VLE Designs Improve Learning Efficiency in Constructivist Statistics Education

    PubMed Central

    Wessa, Patrick; De Rycker, Antoon; Holliday, Ian Edward

    2011-01-01

    Background We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific–purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. Objectives The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Methods Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. Results The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under investigation. The findings demonstrate that a content–based design outperforms the traditional VLE–based design. PMID:21998652

  13. Analysis of Sensitivity Experiments - An Expanded Primer

    DTIC Science & Technology

    2017-03-08

    diehard practitioners. The difficulty associated with mastering statistical inference presents a true dilemma. Statistics is an extremely applied...lost, perhaps forever. In other words, when on this safari, you need a guide. This report is designed to be a guide, of sorts. It focuses on analytical...estimated accurately if our analysis is to have real meaning. For this reason, the sensitivity test procedure is designed to concentrate measurements

  14. Simulation study to determine the impact of different design features on design efficiency in discrete choice experiments.

    PubMed

    Vanniyasingam, Thuva; Cunningham, Charles E; Foster, Gary; Thabane, Lehana

    2016-07-19

    Discrete choice experiments (DCEs) are routinely used to elicit patient preferences to improve health outcomes and healthcare services. While many fractional factorial designs can be created, some are more statistically optimal than others. The objective of this simulation study was to investigate how varying the number of (1) attributes, (2) levels within attributes, (3) alternatives and (4) choice tasks per survey will improve or compromise the statistical efficiency of an experimental design. A total of 3204 DCE designs were created to assess how relative design efficiency (d-efficiency) is influenced by varying the number of choice tasks (2-20), alternatives (2-5), attributes (2-20) and attribute levels (2-5) of a design. Choice tasks were created by randomly allocating attribute and attribute level combinations into alternatives. Relative d-efficiency was used to measure the optimality of each DCE design. DCE design complexity influenced statistical efficiency. Across all designs, relative d-efficiency decreased as the number of attributes and attribute levels increased. It increased for designs with more alternatives. Lastly, relative d-efficiency converges as the number of choice tasks increases, where convergence may not be at 100% statistical optimality. Achieving 100% d-efficiency is heavily dependent on the number of attributes, attribute levels, choice tasks and alternatives. Further exploration of overlaps and block sizes are needed. This study's results are widely applicable for researchers interested in creating optimal DCE designs to elicit individual preferences on health services, programmes, policies and products. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  15. The Communicability of Graphical Alternatives to Tabular Displays of Statistical Simulation Studies

    PubMed Central

    Cook, Alex R.; Teo, Shanice W. L.

    2011-01-01

    Simulation studies are often used to assess the frequency properties and optimality of statistical methods. They are typically reported in tables, which may contain hundreds of figures to be contrasted over multiple dimensions. To assess the degree to which these tables are fit for purpose, we performed a randomised cross-over experiment in which statisticians were asked to extract information from (i) such a table sourced from the literature and (ii) a graphical adaptation designed by the authors, and were timed and assessed for accuracy. We developed hierarchical models accounting for differences between individuals of different experience levels (under- and post-graduate), within experience levels, and between different table-graph pairs. In our experiment, information could be extracted quicker and, for less experienced participants, more accurately from graphical presentations than tabular displays. We also performed a literature review to assess the prevalence of hard-to-interpret design features in tables of simulation studies in three popular statistics journals, finding that many are presented innumerately. We recommend simulation studies be presented in graphical form. PMID:22132184

  16. The communicability of graphical alternatives to tabular displays of statistical simulation studies.

    PubMed

    Cook, Alex R; Teo, Shanice W L

    2011-01-01

    Simulation studies are often used to assess the frequency properties and optimality of statistical methods. They are typically reported in tables, which may contain hundreds of figures to be contrasted over multiple dimensions. To assess the degree to which these tables are fit for purpose, we performed a randomised cross-over experiment in which statisticians were asked to extract information from (i) such a table sourced from the literature and (ii) a graphical adaptation designed by the authors, and were timed and assessed for accuracy. We developed hierarchical models accounting for differences between individuals of different experience levels (under- and post-graduate), within experience levels, and between different table-graph pairs. In our experiment, information could be extracted quicker and, for less experienced participants, more accurately from graphical presentations than tabular displays. We also performed a literature review to assess the prevalence of hard-to-interpret design features in tables of simulation studies in three popular statistics journals, finding that many are presented innumerately. We recommend simulation studies be presented in graphical form.

  17. Paradigms for adaptive statistical information designs: practical experiences and strategies.

    PubMed

    Wang, Sue-Jane; Hung, H M James; O'Neill, Robert

    2012-11-10

    In the last decade or so, interest in adaptive design clinical trials has gradually been directed towards their use in regulatory submissions by pharmaceutical drug sponsors to evaluate investigational new drugs. Methodological advances of adaptive designs are abundant in the statistical literature since the 1970s. The adaptive design paradigm has been enthusiastically perceived to increase the efficiency and to be more cost-effective than the fixed design paradigm for drug development. Much interest in adaptive designs is in those studies with two-stages, where stage 1 is exploratory and stage 2 depends upon stage 1 results, but where the data of both stages will be combined to yield statistical evidence for use as that of a pivotal registration trial. It was not until the recent release of the US Food and Drug Administration Draft Guidance for Industry on Adaptive Design Clinical Trials for Drugs and Biologics (2010) that the boundaries of flexibility for adaptive designs were specifically considered for regulatory purposes, including what are exploratory goals, and what are the goals of adequate and well-controlled (A&WC) trials (2002). The guidance carefully described these distinctions in an attempt to minimize the confusion between the goals of preliminary learning phases of drug development, which are inherently substantially uncertain, and the definitive inference-based phases of drug development. In this paper, in addition to discussing some aspects of adaptive designs in a confirmatory study setting, we underscore the value of adaptive designs when used in exploratory trials to improve planning of subsequent A&WC trials. One type of adaptation that is receiving attention is the re-estimation of the sample size during the course of the trial. We refer to this type of adaptation as an adaptive statistical information design. Specifically, a case example is used to illustrate how challenging it is to plan a confirmatory adaptive statistical information design. We highlight the substantial risk of planning the sample size for confirmatory trials when information is very uninformative and stipulate the advantages of adaptive statistical information designs for planning exploratory trials. Practical experiences and strategies as lessons learned from more recent adaptive design proposals will be discussed to pinpoint the improved utilities of adaptive design clinical trials and their potential to increase the chance of a successful drug development. Published 2012. This article is a US Government work and is in the public domain in the USA.

  18. Statistical evaluation of metal fill widths for emulated metal fill in parasitic extraction methodology

    NASA Astrophysics Data System (ADS)

    J-Me, Teh; Noh, Norlaili Mohd.; Aziz, Zalina Abdul

    2015-05-01

    In the chip industry today, the key goal of a chip development organization is to develop and market chips within a short time frame to gain a foothold in market share. This paper proposes a design flow around the area of parasitic extraction to improve the design cycle time. The proposed design flow utilizes metal fill emulation as opposed to the current flow, which performs metal fill insertion directly. By replacing metal fill structures with an emulation methodology in earlier iterations of the design flow, runtime in the fill insertion stage is expected to be reduced. A statistical design of experiments methodology utilizing the randomized complete block design was used to select an appropriate emulated metal fill width to improve emulation accuracy. The experiment was conducted on test cases of different sizes, ranging from 1000 gates to 21000 gates. The metal width was varied from 1 x minimum metal width to 6 x minimum metal width. Two-way analysis of variance and Fisher's least significant difference test were used to analyze the interconnect net capacitance values of the different test cases. This paper presents the results of the statistical analysis for the 45 nm process technology. The recommended emulated metal fill width was found to be 4 x the minimum metal width.
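    As a concrete analogue of the analysis described above, the sketch below runs a randomized complete block ANOVA (treatment = emulated fill width, block = test case) followed by a Fisher's LSD threshold. The capacitance values, block labels, and effect sizes are fabricated placeholders, not the study's 45 nm data.

```python
# Hedged sketch of a randomized complete block analysis with Fisher's LSD.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm
from scipy import stats

rng = np.random.default_rng(0)
widths = [1, 2, 3, 4, 5, 6]          # multiples of minimum metal width
blocks = ["1k", "5k", "13k", "21k"]  # hypothetical test-case sizes (gate counts)
rows = [{"width": w, "block": b,
         "cap": 1.0 + 0.05 * w + 0.1 * blocks.index(b) + rng.normal(0, 0.02)}
        for w in widths for b in blocks]
df = pd.DataFrame(rows)

model = smf.ols("cap ~ C(width) + C(block)", data=df).fit()
print(anova_lm(model, typ=2))        # blocked two-way ANOVA table

# Fisher's LSD: smallest treatment-mean difference declared significant,
# using the model's residual mean square and replicates per width level
mse, dfe, r = model.mse_resid, model.df_resid, len(blocks)
lsd = stats.t.ppf(0.975, dfe) * np.sqrt(2 * mse / r)
print(f"LSD at alpha=0.05: {lsd:.4f}")
```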

  19. Rhythmic grouping biases constrain infant statistical learning

    PubMed Central

    Hay, Jessica F.; Saffran, Jenny R.

    2012-01-01

    Linguistic stress and sequential statistical cues to word boundaries interact during speech segmentation in infancy. However, little is known about how the different acoustic components of stress constrain statistical learning. The current studies were designed to investigate whether intensity and duration each function independently as cues to initial prominence (trochaic-based hypothesis) or whether, as predicted by the Iambic-Trochaic Law (ITL), intensity and duration have characteristic and separable effects on rhythmic grouping (ITL-based hypothesis) in a statistical learning task. Infants were familiarized with an artificial language (Experiments 1 & 3) or a tone stream (Experiment 2) in which there was an alternation in either intensity or duration. In addition to potential acoustic cues, the familiarization sequences also contained statistical cues to word boundaries. In speech (Experiment 1) and non-speech (Experiment 2) conditions, 9-month-old infants demonstrated discrimination patterns consistent with an ITL-based hypothesis: intensity signaled initial prominence and duration signaled final prominence. The results of Experiment 3, in which 6.5-month-old infants were familiarized with the speech streams from Experiment 1, suggest that there is a developmental change in infants’ willingness to treat increased duration as a cue to word offsets in fluent speech. Infants’ perceptual systems interact with linguistic experience to constrain how infants learn from their auditory environment. PMID:23730217

  20. Statistical aspects of quantitative real-time PCR experiment design.

    PubMed

    Kitchen, Robert R; Kubista, Mikael; Tichopad, Ales

    2010-04-01

    Experiments using quantitative real-time PCR to test hypotheses are limited by technical and biological variability; we seek to minimise sources of confounding variability through optimum use of biological and technical replicates. The quality of an experiment design is commonly assessed by calculating its prospective power. Such calculations rely on knowledge of the expected variances of the measurements of each group of samples and the magnitude of the treatment effect; the estimation of which is often uninformed and unreliable. Here we introduce a method that exploits a small pilot study to estimate the biological and technical variances in order to improve the design of a subsequent large experiment. We measure the variance contributions at several 'levels' of the experiment design and provide a means of using this information to predict both the total variance and the prospective power of the assay. A validation of the method is provided through a variance analysis of representative genes in several bovine tissue-types. We also discuss the effect of normalisation to a reference gene in terms of the measured variance components of the gene of interest. Finally, we describe a software implementation of these methods, powerNest, that gives the user the opportunity to input data from a pilot study and interactively modify the design of the assay. The software automatically calculates expected variances, statistical power, and optimal design of the larger experiment. powerNest enables the researcher to minimise the total confounding variance and maximise prospective power for a specified maximum cost for the large study. Copyright 2010 Elsevier Inc. All rights reserved.
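    The sketch below illustrates the kind of calculation the abstract describes, though in a much simpler form than the powerNest tool: pilot estimates of biological and technical variance are combined into the variance of a group mean, and an approximate power is computed for a two-group comparison. The variance values, effect size, and replicate counts are hypothetical.

```python
# Hedged sketch of a nested-variance power calculation. Variance of a
# group mean with n_b biological and n_t technical replicates:
#   var_mean = sigma_b^2 / n_b + sigma_t^2 / (n_b * n_t)
import numpy as np
from scipy import stats

def power_two_group(delta, sigma_b, sigma_t, n_b, n_t, alpha=0.05):
    var_mean = sigma_b**2 / n_b + sigma_t**2 / (n_b * n_t)
    se_diff = np.sqrt(2 * var_mean)          # SE of the difference of two group means
    z_alpha = stats.norm.ppf(1 - alpha / 2)
    # normal approximation to the power of a two-sided test
    return 1 - stats.norm.cdf(z_alpha - delta / se_diff)

# Example: pilot estimates sigma_b=0.8, sigma_t=0.3 (Cq units), effect = 1.0 cycle
for n_b in (3, 5, 8):
    print(n_b, round(power_two_group(1.0, 0.8, 0.3, n_b, n_t=2), 2))
```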

  1. Power Analysis in Two-Level Unbalanced Designs

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2010-01-01

    Previous work on statistical power has discussed mainly single-level designs or 2-level balanced designs with random effects. Although balanced experiments are common, in practice balance cannot always be achieved. Work on class size is one example of unbalanced designs. This study provides methods for power analysis in 2-level unbalanced designs…

  2. A Tutorial on Adaptive Design Optimization

    PubMed Central

    Myung, Jay I.; Cavagnaro, Daniel R.; Pitt, Mark A.

    2013-01-01

    Experimentation is ubiquitous in the field of psychology and fundamental to the advancement of its science, and one of the biggest challenges for researchers is designing experiments that can conclusively discriminate the theoretical hypotheses or models under investigation. The recognition of this challenge has led to the development of sophisticated statistical methods that aid in the design of experiments and that are within the reach of everyday experimental scientists. This tutorial paper introduces the reader to an implementable experimentation methodology, dubbed Adaptive Design Optimization, that can help scientists to conduct “smart” experiments that are maximally informative and highly efficient, which in turn should accelerate scientific discovery in psychology and beyond. PMID:23997275

  3. DNA Fingerprinting in a Forensic Teaching Experiment

    ERIC Educational Resources Information Center

    Wagoner, Stacy A.; Carlson, Kimberly A.

    2008-01-01

    This article presents an experiment designed to provide students, in a classroom laboratory setting, a hands-on demonstration of the steps used in DNA forensic analysis by performing DNA extraction, DNA fingerprinting, and statistical analysis of the data. This experiment demonstrates how DNA fingerprinting is performed and how long it takes. It…

  4. Bio-based renewable additives for anti-icing applications (phase one).

    DOT National Transportation Integrated Search

    2016-09-04

    The performance and impacts of several bio-based anti-icers along with a traditional chloride-based anti-icer (salt brine) were evaluated. : A statistical design of experiments (uniform design) was employed for developing anti-icing liquids consistin...

  5. Testing Nelder-Mead based repulsion algorithms for multiple roots of nonlinear systems via a two-level factorial design of experiments.

    PubMed

    Ramadas, Gisela C V; Rocha, Ana Maria A C; Fernandes, Edite M G P

    2015-01-01

    This paper addresses the challenging task of computing multiple roots of a system of nonlinear equations. A repulsion algorithm that invokes the Nelder-Mead (N-M) local search method and uses a penalty-type merit function based on the error function, known as 'erf', is presented. In the N-M algorithm context, different strategies are proposed to enhance the quality of the solutions and improve the overall efficiency. The main goal of this paper is to use a two-level factorial design of experiments to analyze the statistical significance of the observed differences in selected performance criteria produced when testing different strategies in the N-M based repulsion algorithm.
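    To give a feel for the general mechanism, the sketch below minimizes an erf-based merit function with scipy's Nelder-Mead and adds a simple repulsion penalty around roots that have already been located. The example system, merit form, penalty form, and tolerances are illustrative assumptions and are not taken from the paper.

```python
# Hedged sketch: erf-based merit function plus a repulsion penalty,
# minimized repeatedly with Nelder-Mead to collect multiple roots.
import numpy as np
from scipy.optimize import minimize
from scipy.special import erf

def system(x):
    # example system with two roots: x0^2 + x1^2 = 1 and x0 = x1
    return np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])

def merit(x, found, radius=0.3):
    base = np.sum(np.abs(erf(system(x))))    # erf-based residual merit
    # repulsion: inflate the merit near previously located roots
    penalty = sum(np.exp(-np.linalg.norm(x - r) / radius) for r in found)
    return base + penalty

found = []
for start in ([0.9, 0.9], [-0.9, -0.9], [0.5, -0.5]):
    res = minimize(merit, start, args=(found,), method="Nelder-Mead")
    if np.linalg.norm(system(res.x)) < 1e-3:   # accept only genuine roots
        found.append(res.x)
print([r.round(3) for r in found])             # roots near (+-1/sqrt(2), +-1/sqrt(2))
```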

  6. The Role of Design-of-Experiments in Managing Flow in Compact Air Vehicle Inlets

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Miller, Daniel N.; Gridley, Marvin C.; Agrell, Johan

    2003-01-01

    It is the purpose of this study to demonstrate the viability and economy of Design-of-Experiments methodologies to arrive at microscale secondary flow control array designs that maintain optimal inlet performance over a wide range of the mission variables and to explore how these statistical methods provide a better understanding of the management of flow in compact air vehicle inlets. These statistical design concepts were used to investigate the robustness properties of low unit strength micro-effector arrays. Low unit strength micro-effectors are micro-vanes set at very low angles-of-incidence with very long chord lengths. They were designed to influence the near wall inlet flow over an extended streamwise distance, and their advantage lies in low total pressure loss and high effectiveness in managing engine face distortion. The term robustness is used in this paper in the same sense as it is used in the industrial problem solving community. It refers to minimizing the effects of the hard-to-control factors that influence the development of a product or process. In Robustness Engineering, the effects of the hard-to-control factors are often called noise, and the hard-to-control factors themselves are referred to as the environmental variables or sometimes as the Taguchi noise variables. Hence Robust Optimization refers to minimizing the effects of the environmental or noise variables on the development (design) of a product or process. In the management of flow in compact inlets, the environmental or noise variables can be identified with the mission variables. Therefore this paper formulates a statistical design methodology that minimizes the impact of variations in the mission variables on inlet performance and demonstrates that these statistical design concepts can lead to simpler inlet flow management systems.

  7. Designing to Support Critical Engagement with Statistics

    ERIC Educational Resources Information Center

    Gresalfi, Melissa Sommerfeld

    2015-01-01

    The purpose of this paper is to describe a trajectory of designing for particular forms of engagement with mathematics. The forms of engagement that were targeted through these design experiments involved making intentional choices about which procedures to leverage in order to support particular claims (what I call "critical…

  8. Experiment Design for Complex VTOL Aircraft with Distributed Propulsion and Tilt Wing

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Landman, Drew

    2015-01-01

    Selected experimental results from a wind tunnel study of a subscale VTOL concept with distributed propulsion and tilt lifting surfaces are presented. The vehicle complexity and automated test facility were ideal for use with a randomized designed experiment. Design of Experiments and Response Surface Methods were invoked to produce run efficient, statistically rigorous regression models with minimized prediction error. Static tests were conducted at the NASA Langley 12-Foot Low-Speed Tunnel to model all six aerodynamic coefficients over a large flight envelope. This work supports investigations at NASA Langley in developing advanced configurations, simulations, and advanced control systems.

  9. A Statistical Decision Model for Periodical Selection for a Specialized Information Center

    ERIC Educational Resources Information Center

    Dym, Eleanor D.; Shirey, Donald L.

    1973-01-01

    An experiment is described which attempts to define a quantitative methodology for the identification and evaluation of all possibly relevant periodical titles containing toxicological-biological information. A statistical decision model was designed and employed, along with yes/no criteria questions, a training technique and a quality control…

  10. Statistical Measures of Integrity in Online Testing: Empirical Study

    ERIC Educational Resources Information Center

    Wielicki, Tom

    2016-01-01

    This paper reports on longitudinal study regarding integrity of testing in an online format as used by e-learning platforms. Specifically, this study explains whether online testing, which implies an open book format is compromising integrity of assessment by encouraging cheating among students. Statistical experiment designed for this study…

  11. Intraclass Correlations and Covariate Outcome Correlations for Planning Two-and Three-Level Cluster-Randomized Experiments in Education

    ERIC Educational Resources Information Center

    Hedges, Larry V.; Hedberg, E. C.

    2013-01-01

    Background: Cluster-randomized experiments that assign intact groups such as schools or school districts to treatment conditions are increasingly common in educational research. Such experiments are inherently multilevel designs whose sensitivity (statistical power and precision of estimates) depends on the variance decomposition across levels.…

  12. Intraclass Correlations and Covariate Outcome Correlations for Planning 2 and 3 Level Cluster Randomized Experiments in Education

    ERIC Educational Resources Information Center

    Hedges, Larry V.; Hedberg, Eric C.

    2013-01-01

    Background: Cluster randomized experiments that assign intact groups such as schools or school districts to treatment conditions are increasingly common in educational research. Such experiments are inherently multilevel designs whose sensitivity (statistical power and precision of estimates) depends on the variance decomposition across levels.…

  13. NASA DOE POD NDE Capabilities Data Book

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    2015-01-01

    This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC Nondestructive Evaluation (NDE) Capabilities Data Book, 3rd ed., NTIAC DB-97-02. DOEPOD is designed as a decision support system for validating that an inspection system, its personnel, and its protocol demonstrate 0.90 POD with 95% confidence at the critical flaw size, a90/95. The test methodology used in DOEPOD is based on the field of statistical sequential analysis founded by Abraham Wald. Sequential analysis is a method of statistical inference whose characteristic feature is that the number of observations required by the procedure is not determined in advance of the experiment. The decision to terminate the experiment depends, at each stage, on the results of the observations previously made. A merit of the sequential method, as applied to testing statistical hypotheses, is that test procedures can be constructed which require, on average, a substantially smaller number of observations than equally reliable test procedures based on a predetermined number of observations.
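    For context, the sketch below reproduces the binomial reasoning that underlies classic 90/95 POD demonstrations (the "29 of 29" rule): if the true POD were only 0.90, observing 29 hits in 29 trials would occur with probability below 5%. DOEPOD's sequential, Wald-style procedure is more elaborate than this single calculation; the sketch is only the basic idea.

```python
# Hedged sketch of the 90/95 POD demonstration arithmetic.
def min_hits_for_pod(pod=0.90, confidence=0.95):
    # Smallest n such that n hits in n trials is improbable under POD = pod:
    # P(n hits | POD = pod) = pod**n <= 1 - confidence
    n = 1
    while pod**n > 1 - confidence:
        n += 1
    return n

print(min_hits_for_pod())   # 29, since 0.9**29 ~= 0.047 < 0.05
```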

  14. Experience and Explanation: Using Videogames to Prepare Students for Formal Instruction in Statistics

    NASA Astrophysics Data System (ADS)

    Arena, Dylan A.; Schwartz, Daniel L.

    2014-08-01

    Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics, where people's everyday experiences often conflict with normative statistical theories and a videogame might provide an alternate set of experiences for students to draw upon. The research used a game called Stats Invaders!, a variant of the classic videogame Space Invaders. In Stats Invaders!, the locations of descending alien invaders follow probability distributions, and players need to infer the shape of the distributions to play well. The experiment tested whether the game developed participants' intuitions about the structure of random events and thereby prepared them for future learning from a subsequent written passage on probability distributions. Community-college students who played the game and then read the passage learned more than participants who only read the passage.

  15. Statistical methods for quantitative mass spectrometry proteomic experiments with labeling.

    PubMed

    Oberg, Ann L; Mahoney, Douglas W

    2012-01-01

    Mass Spectrometry utilizing labeling allows multiple specimens to be subjected to mass spectrometry simultaneously. As a result, between-experiment variability is reduced. Here we describe use of fundamental concepts of statistical experimental design in the labeling framework in order to minimize variability and avoid biases. We demonstrate how to export data in the format that is most efficient for statistical analysis. We demonstrate how to assess the need for normalization, perform normalization, and check whether it worked. We describe how to build a model explaining the observed values and test for differential protein abundance along with descriptive statistics and measures of reliability of the findings. Concepts are illustrated through the use of three case studies utilizing the iTRAQ 4-plex labeling protocol.
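    As one small illustration of the normalization-and-check step mentioned above, the sketch below median-centres log-intensities across the four channels of a simulated iTRAQ 4-plex run and verifies that the channel medians agree afterwards. The data, channel offsets, and peptide count are simulated placeholders, and real analyses would continue with model-based testing as the abstract describes.

```python
# Hedged sketch: per-channel median normalisation of log-intensities.
import numpy as np

rng = np.random.default_rng(1)
log_intensity = rng.normal(loc=[10.0, 10.4, 9.8, 10.1], scale=0.5,
                           size=(500, 4))            # 500 peptides x 4 channels

before = np.median(log_intensity, axis=0)
normalised = log_intensity - before + before.mean()  # median centring
after = np.median(normalised, axis=0)

print("channel medians before:", before.round(2))
print("channel medians after: ", after.round(2))     # now approximately equal
```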

  16. 76 FR 9696 - Equipment Price Forecasting in Energy Conservation Standards Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ... for particular efficiency design options, an empirical experience curve fit to the available data may be used to forecast future costs of such design option technologies. If a statistical evaluation indicates a low level of confidence in estimates of the design option cost trend, this method should not be...

  17. Designing an Error Resolution Checklist for a Shared Manned-Unmanned Environment

    DTIC Science & Technology

    2010-06-01

    performance during the Olympics. Thank you to Birsen Donmez, who took an active role in my statistics instruction. I appreciate your time and patience...in teaching me the finer details of "varsity statistics". Also, thank you for being so responsive through e-mail, even though you are now located in...6.3. Experiment recommendations and future work. Appendix A: Descriptive Statistics

  18. Long-Term Bioeffects of 435-MHz Radiofrequency Radiation on Selected Blood-Borne Endpoints in Cannulated Rats. Volume 4. Plasma Catecholamines.

    DTIC Science & Technology

    1987-08-01

    out. To use each animal as its own control, arterial blood was sampled by means of chronically implanted aortic cannulas [12,13,14]. This simple... APPENDIX B: STATISTICAL METHODOLOGY. The balanced design of this experiment (requiring that 25 animals from each... protocol in that, in numerous cases, samples were collected at odd intervals (invalidating the orthogonality of the design) and the number of samples taken

  19. Exploring Tree Age & Diameter to Illustrate Sample Design & Inference in Observational Ecology

    ERIC Educational Resources Information Center

    Casady, Grant M.

    2015-01-01

    Undergraduate biology labs often explore the techniques of data collection but neglect the statistical framework necessary to express findings. Students can be confused about how to use their statistical knowledge to address specific biological questions. Growth in the area of observational ecology requires that students gain experience in…

  20. Using Paper Helicopters to Teach Statistical Process Control

    ERIC Educational Resources Information Center

    Johnson, Danny J.

    2011-01-01

    This hands-on project uses a paper helicopter to teach students how to distinguish between common and special causes of variability when developing and using statistical process control charts. It allows the student to experience a process that is out-of-control due to imprecise or incomplete product design specifications and to discover how the…

  1. An On-Line Virtual Environment for Teaching Statistical Sampling and Analysis

    ERIC Educational Resources Information Center

    Marsh, Michael T.

    2009-01-01

    Regardless of the related discipline, students in statistics courses invariably have difficulty understanding the connection between the numerical values calculated for end-of-the-chapter exercises and their usefulness in decision making. This disconnect is, in part, due to the lack of time and opportunity to actually design the experiments and…

  2. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build student’s intuition and enhance their learning. PMID:21451741

  3. Statistical optimization of process parameters for lipase-catalyzed synthesis of triethanolamine-based esterquats using response surface methodology in 2-liter bioreactor.

    PubMed

    Masoumi, Hamid Reza Fard; Basri, Mahiran; Kassim, Anuar; Abdullah, Dzulkefly Kuang; Abdollahi, Yadollah; Abd Gani, Siti Salwa; Rezaee, Malahat

    2013-01-01

    Lipase-catalyzed production of triethanolamine-based esterquat by esterification of oleic acid (OA) with triethanolamine (TEA) in n-hexane was performed in a 2 L stirred-tank reactor. A set of experiments was designed by central composite design for process modeling and statistical evaluation of the findings. Five independent process variables, including enzyme amount, reaction time, reaction temperature, substrate molar ratio of OA to TEA, and agitation speed, were studied under conditions designed with the Design Expert software. The experimental data were examined for normality before the data processing stage, and skewness and kurtosis indices were determined. The mathematical model developed was found to be adequate and statistically accurate in predicting the optimum conversion of product. Response surface methodology with central composite design gave the best performance in this study, and the methodology as a whole proved adequate for the design and optimization of the enzymatic process.
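    A sketch of the central-composite-design idea used above, reduced to two coded factors with a simulated response standing in for the measured conversions; the factor count, axial distance, and response surface are illustrative assumptions, not the study's actual settings or the Design Expert output.

        import itertools
        import numpy as np

        # Central composite design in coded units for k = 2 factors:
        # 2^k factorial points, 2k axial points at +/- alpha, and center points.
        k, alpha_ax, n_center = 2, np.sqrt(2), 4
        factorial = np.array(list(itertools.product([-1, 1], repeat=k)), float)
        axial = np.vstack([v * alpha_ax * np.eye(k)[i]
                           for i in range(k) for v in (-1, 1)])
        center = np.zeros((n_center, k))
        X = np.vstack([factorial, axial, center])

        # Hypothetical response surface with noise (stand-in for real runs).
        rng = np.random.default_rng(1)
        y = (80 - 3 * X[:, 0] ** 2 - 2 * X[:, 1] ** 2 + 1.5 * X[:, 0] * X[:, 1]
             + rng.normal(0, 0.5, len(X)))

        # Fit a full second-order model: 1, x1, x2, x1^2, x2^2, x1*x2.
        M = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                             X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
        coef, *_ = np.linalg.lstsq(M, y, rcond=None)
        print("fitted coefficients:", np.round(coef, 2))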

  4. Thinking Style Diversity and Collaborative Design Learning

    NASA Astrophysics Data System (ADS)

    Volpentesta, Antonio P.; Ammirato, Salvatore; Sofo, Francesco

    The paper explores the impact of structured learning experiences that were designed to challenge students’ ways of thinking and promote creativity. The aim was to develop the ability of students, coming from different engineering disciplines and characterized by particular thinking style profiles, to collaboratively work on a project-based learning experience in an educational environment. Three project-based learning experiences were structured using critical thinking methods to stimulate creativity. Pre and post-survey data using a specially modified thinking style inventory for 202 design students indicated a thinking style profile of preferences with a focus on exploring and questioning. Statistically significant results showed students successfully developed empathy and openness to multiple perspectives.

  5. A multimodality imaging-compatible insertion robot with a respiratory motion calibration module designed for ablation of liver tumors: a preclinical study.

    PubMed

    Li, Dongrui; Cheng, Zhigang; Chen, Gang; Liu, Fangyi; Wu, Wenbo; Yu, Jie; Gu, Ying; Liu, Fengyong; Ren, Chao; Liang, Ping

    2018-04-03

    To test the accuracy and efficacy of a multimodality imaging-compatible insertion robot with a respiratory motion calibration module designed for ablation of liver tumors in phantom and animal models, and to evaluate and compare the influence of intervention experience on robot-assisted and ultrasound-controlled ablation procedures. Accuracy tests on a rigid body/phantom model with a respiratory movement simulation device, and microwave ablation tests on porcine liver tumor/rabbit liver cancer, were performed either with the robot we designed or with traditional ultrasound guidance, by physicians with or without intervention experience. In the accuracy tests performed by physicians without intervention experience, the insertion accuracy and efficiency of the robot-assisted group were higher than those of the ultrasound-guided group, with statistically significant differences. In the microwave ablation tests performed by physicians without intervention experience, a better complete ablation rate was achieved when the robot was applied. In the microwave ablation tests performed by physicians with intervention experience, there was no statistically significant difference in insertion number or total ablation time between the robot-assisted and ultrasound-controlled groups. Evaluation with the NASA-TLX suggested that the robot-assisted insertion and microwave ablation procedures were more comfortable for physicians with or without experience. The multimodality imaging-compatible insertion robot with a respiratory motion calibration module could increase insertion accuracy and ablation efficacy and minimize the influence of the physicians' experience, and the ablation procedure could be performed more comfortably and with less stress when the robot is applied.

  6. Design of Experiments with Multiple Independent Variables: A Resource Management Perspective on Complete and Reduced Factorial Designs

    PubMed Central

    Collins, Linda M.; Dziak, John J.; Li, Runze

    2009-01-01

    An investigator who plans to conduct experiments with multiple independent variables must decide whether to use a complete or reduced factorial design. This article advocates a resource management perspective on making this decision, in which the investigator seeks a strategic balance between service to scientific objectives and economy. Considerations in making design decisions include whether research questions are framed as main effects or simple effects; whether and which effects are aliased (confounded) in a particular design; the number of experimental conditions that must be implemented in a particular design and the number of experimental subjects the design requires to maintain the desired level of statistical power; and the costs associated with implementing experimental conditions and obtaining experimental subjects. In this article four design options are compared: complete factorial, individual experiments, single factor, and fractional factorial designs. Complete and fractional factorial designs and single factor designs are generally more economical than conducting individual experiments on each factor. Although relatively unfamiliar to behavioral scientists, fractional factorial designs merit serious consideration because of their economy and versatility. PMID:19719358
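    The economy of a fractional factorial design comes from generating extra factor columns from interactions of a smaller full factorial, at the cost of aliasing. The sketch below builds a half fraction of a 2^4 design with the common generator D = ABC; the choice of four factors and this particular generator is illustrative, not taken from the article.

        import itertools
        import numpy as np

        # Half fraction of a 2^4 design: write the full 2^3 design in A, B, C
        # and generate the fourth factor from the defining relation D = ABC,
        # so D is aliased with the ABC interaction (a resolution IV design).
        base = np.array(list(itertools.product([-1, 1], repeat=3)))
        D = base[:, 0] * base[:, 1] * base[:, 2]
        design = np.column_stack([base, D])
        print(design)            # 8 runs instead of 16 for the full factorial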

  7. STRengthening analytical thinking for observational studies: the STRATOS initiative.

    PubMed

    Sauerbrei, Willi; Abrahamowicz, Michal; Altman, Douglas G; le Cessie, Saskia; Carpenter, James

    2014-12-30

    The validity and practical utility of observational medical research depends critically on good study design, excellent data quality, appropriate statistical methods and accurate interpretation of results. Statistical methodology has seen substantial development in recent times. Unfortunately, many of these methodological developments are ignored in practice. Consequently, design and analysis of observational studies often exhibit serious weaknesses. The lack of guidance on vital practical issues discourages many applied researchers from using more sophisticated and possibly more appropriate methods when analyzing observational studies. Furthermore, many analyses are conducted by researchers with a relatively weak statistical background and limited experience in using statistical methodology and software. Consequently, even 'standard' analyses reported in the medical literature are often flawed, casting doubt on their results and conclusions. An efficient way to help researchers to keep up with recent methodological developments is to develop guidance documents that are spread to the research community at large. These observations led to the initiation of the strengthening analytical thinking for observational studies (STRATOS) initiative, a large collaboration of experts in many different areas of biostatistical research. The objective of STRATOS is to provide accessible and accurate guidance in the design and analysis of observational studies. The guidance is intended for applied statisticians and other data analysts with varying levels of statistical education, experience and interests. In this article, we introduce the STRATOS initiative and its main aims, present the need for guidance documents and outline the planned approach and progress so far. We encourage other biostatisticians to become involved. © 2014 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd.

  8. An Intuitive Graphical Approach to Understanding the Split-Plot Experiment

    ERIC Educational Resources Information Center

    Robinson, Timothy J.; Brenneman, William A.; Myers, William R.

    2009-01-01

    While split-plot designs have received considerable attention in the literature over the past decade, there seems to be a general lack of intuitive understanding of the error structure of these designs and the resulting statistical analysis. Typically, students learn the proper error terms for testing factors of a split-plot design via "expected…

  9. Preparing a Data Scientist: A Pedagogic Experience in Designing a Big Data Analytics Course

    ERIC Educational Resources Information Center

    Asamoah, Daniel Adomako; Sharda, Ramesh; Hassan Zadeh, Amir; Kalgotra, Pankush

    2017-01-01

    In this article, we present an experiential perspective on how a big data analytics course was designed and delivered to students at a major Midwestern university. In reference to the "MSIS 2006 Model Curriculum," we designed this course as a level 2 course, with prerequisites in databases, computer programming, statistics, and data…

  10. Experimental Design and Power Calculation for RNA-seq Experiments.

    PubMed

    Wu, Zhijin; Wu, Hao

    2016-01-01

    Power calculation is a critical component of RNA-seq experimental design. The flexibility of RNA-seq experiment and the wide dynamic range of transcription it measures make it an attractive technology for whole transcriptome analysis. These features, in addition to the high dimensionality of RNA-seq data, bring complexity in experimental design, making an analytical power calculation no longer realistic. In this chapter we review the major factors that influence the statistical power of detecting differential expression, and give examples of power assessment using the R package PROPER.
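    PROPER itself is an R package; as a language-neutral illustration of the simulation-based power assessment the chapter advocates, the sketch below estimates power for one gene under assumed negative binomial counts, a chosen fold change and dispersion, and a simple t-test on log counts. All parameter values are hypothetical, and the test is a simplification of real RNA-seq differential-expression methods.

        import numpy as np
        from scipy import stats

        def sim_power(n_per_group=5, mu=100, dispersion=0.2, fold_change=2.0,
                      n_sim=2000, alpha=0.05, seed=0):
            """Fraction of simulated DE genes detected by a t-test on log2 counts."""
            rng = np.random.default_rng(seed)
            size = 1.0 / dispersion                      # NB 'n' parameter
            hits = 0
            for _ in range(n_sim):
                mu1, mu2 = mu, mu * fold_change
                c1 = rng.negative_binomial(size, size / (size + mu1), n_per_group)
                c2 = rng.negative_binomial(size, size / (size + mu2), n_per_group)
                _, p = stats.ttest_ind(np.log2(c1 + 1.0), np.log2(c2 + 1.0))
                hits += p < alpha
            return hits / n_sim

        print("estimated power:", sim_power())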

  11. Mechanistic analysis of challenge-response experiments.

    PubMed

    Shotwell, M S; Drake, K J; Sidorov, V Y; Wikswo, J P

    2013-09-01

    We present an application of mechanistic modeling and nonlinear longitudinal regression in the context of biomedical response-to-challenge experiments, a field where these methods are underutilized. In this type of experiment, a system is studied by imposing an experimental challenge, and then observing its response. The combination of mechanistic modeling and nonlinear longitudinal regression has brought new insight, and revealed an unexpected opportunity for optimal design. Specifically, the mechanistic aspect of our approach enables the optimal design of experimental challenge characteristics (e.g., intensity, duration). This article lays some groundwork for this approach. We consider a series of experiments wherein an isolated rabbit heart is challenged with intermittent anoxia. The heart responds to the challenge onset, and recovers when the challenge ends. The mean response is modeled by a system of differential equations that describe a candidate mechanism for cardiac response to anoxia challenge. The cardiac system behaves more variably when challenged than when at rest. Hence, observations arising from this experiment exhibit complex heteroscedasticity and sharp changes in central tendency. We present evidence that an asymptotic statistical inference strategy may fail to adequately account for statistical uncertainty. Two alternative methods are critiqued qualitatively (i.e., for utility in the current context), and quantitatively using an innovative Monte-Carlo method. We conclude with a discussion of the exciting opportunities in optimal design of response-to-challenge experiments. © 2013, The International Biometric Society.
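    A toy illustration of combining a mechanistic model with nonlinear longitudinal regression: the response is the solution of a first-order ODE that relaxes toward a challenge-dependent target and then recovers, and its parameters are estimated by nonlinear least squares. The model, the challenge timing, and the synthetic data are assumptions for illustration only, not the authors' cardiac anoxia model.

        import numpy as np
        from scipy.optimize import curve_fit

        T_OFF = 10.0   # challenge ends at t = 10 (a known design variable here)

        def response(t, y0, y_chal, k):
            """Solution of dy/dt = k*(target(t) - y): the target is the challenge
            level for t < T_OFF and the baseline y0 afterwards."""
            during = y_chal + (y0 - y_chal) * np.exp(-k * t)
            y_end = y_chal + (y0 - y_chal) * np.exp(-k * T_OFF)
            after = y0 + (y_end - y0) * np.exp(-k * (t - T_OFF))
            return np.where(t < T_OFF, during, after)

        rng = np.random.default_rng(2)
        t = np.linspace(0, 25, 60)
        y_obs = response(t, 1.0, 0.4, 0.6) + rng.normal(0, 0.03, t.size)  # synthetic data

        popt, pcov = curve_fit(response, t, y_obs, p0=[1.0, 0.5, 0.5])
        print("estimated (baseline, challenge level, rate):", np.round(popt, 3))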

  12. Adaptive design optimization: a mutual information-based approach to model discrimination in cognitive science.

    PubMed

    Cavagnaro, Daniel R; Myung, Jay I; Pitt, Mark A; Kujala, Janne V

    2010-04-01

    Discriminating among competing statistical models is a pressing issue for many experimentalists in the field of cognitive science. Resolving this issue begins with designing maximally informative experiments. To this end, the problem to be solved in adaptive design optimization is identifying experimental designs under which one can infer the underlying model in the fewest possible steps. When the models under consideration are nonlinear, as is often the case in cognitive science, this problem can be impossible to solve analytically without simplifying assumptions. However, as we show in this letter, a full solution can be found numerically with the help of a Bayesian computational trick derived from the statistics literature, which recasts the problem as a probability density simulation in which the optimal design is the mode of the density. We use a utility function based on mutual information and give three intuitive interpretations of the utility function in terms of Bayesian posterior estimates. As a proof of concept, we offer a simple example application to an experiment on memory retention.
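    A simplified, fully discrete version of the mutual-information utility described above: for each candidate design, compute the mutual information between the model indicator and the (discretized) outcome, and prefer the design with the larger value. The two models, two designs, and binary outcome probabilities below are hypothetical, and the full adaptive, simulation-based algorithm in the article is not reproduced here.

        import numpy as np

        def mutual_information(prior, likelihoods):
            """I(model; outcome) for one candidate design.

            prior       -- p(m), shape (M,)
            likelihoods -- p(y | m, design), shape (M, Y), rows sum to 1
            """
            joint = prior[:, None] * likelihoods            # p(m, y)
            p_y = joint.sum(axis=0)                         # marginal p(y)
            with np.errstate(divide="ignore", invalid="ignore"):
                terms = joint * np.log(joint / (prior[:, None] * p_y[None, :]))
            return np.nansum(terms)

        # Two hypothetical models, two candidate designs, binary outcome.
        prior = np.array([0.5, 0.5])
        design_A = np.array([[0.9, 0.1],     # p(y | model 1, design A)
                             [0.2, 0.8]])    # p(y | model 2, design A)
        design_B = np.array([[0.6, 0.4],
                             [0.5, 0.5]])
        utilities = {name: mutual_information(prior, L)
                     for name, L in [("A", design_A), ("B", design_B)]}
        print(utilities)   # design A separates the two models better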

  13. STRengthening Analytical Thinking for Observational Studies: the STRATOS initiative

    PubMed Central

    Sauerbrei, Willi; Abrahamowicz, Michal; Altman, Douglas G; le Cessie, Saskia; Carpenter, James

    2014-01-01

    The validity and practical utility of observational medical research depends critically on good study design, excellent data quality, appropriate statistical methods and accurate interpretation of results. Statistical methodology has seen substantial development in recent times. Unfortunately, many of these methodological developments are ignored in practice. Consequently, design and analysis of observational studies often exhibit serious weaknesses. The lack of guidance on vital practical issues discourages many applied researchers from using more sophisticated and possibly more appropriate methods when analyzing observational studies. Furthermore, many analyses are conducted by researchers with a relatively weak statistical background and limited experience in using statistical methodology and software. Consequently, even ‘standard’ analyses reported in the medical literature are often flawed, casting doubt on their results and conclusions. An efficient way to help researchers to keep up with recent methodological developments is to develop guidance documents that are spread to the research community at large. These observations led to the initiation of the strengthening analytical thinking for observational studies (STRATOS) initiative, a large collaboration of experts in many different areas of biostatistical research. The objective of STRATOS is to provide accessible and accurate guidance in the design and analysis of observational studies. The guidance is intended for applied statisticians and other data analysts with varying levels of statistical education, experience and interests. In this article, we introduce the STRATOS initiative and its main aims, present the need for guidance documents and outline the planned approach and progress so far. We encourage other biostatisticians to become involved. PMID:25074480

  14. iCFD: Interpreted Computational Fluid Dynamics - Degeneration of CFD to one-dimensional advection-dispersion models using statistical experimental design - The secondary clarifier.

    PubMed

    Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat; Plósz, Benedek Gy

    2015-10-15

    The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models - computationally light tools, used e.g., as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method - presented in a straightforward and transparent way - is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor screening study and system understanding, 50 different sets of design and flow conditions are selected using Latin Hypercube Sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. In the framework, to degenerate the 2-D model structure, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter then are identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure is carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii) assessment of modelling the onset of transient and compression settling. Furthermore, the optimal level of model discretization both in 2-D and 1-D was undertaken. Results suggest that the iCFD model developed for the SST through the proposed methodology is able to predict solid distribution with high accuracy - taking a reasonable computational effort - when compared to multi-dimensional numerical experiments, under a wide range of flow and design conditions. iCFD tools could play a crucial role in reliably predicting systems' performance under normal and shock events. Copyright © 2015 Elsevier Ltd. All rights reserved.
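    A sketch of the Latin Hypercube Sampling step used above to select boundary-condition sets, written with scipy's quasi-Monte Carlo module; the three factor names and their ranges are hypothetical placeholders, not the design and flow factors retained in the study's screening step.

        import numpy as np
        from scipy.stats import qmc

        # Hypothetical boundary-condition factors (illustrative names and ranges):
        # surface overflow rate [m/h], inlet solids [kg/m3], tank depth [m].
        lower = [0.5, 2.0, 3.0]
        upper = [2.5, 6.0, 5.0]

        sampler = qmc.LatinHypercube(d=3, seed=42)
        unit_sample = sampler.random(n=50)                 # 50 points in [0, 1)^3
        conditions = qmc.scale(unit_sample, lower, upper)  # map to physical ranges
        print(conditions[:5])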

  15. Optimized Design and Analysis of Sparse-Sampling fMRI Experiments

    PubMed Central

    Perrachione, Tyler K.; Ghosh, Satrajit S.

    2013-01-01

    Sparse-sampling is an important methodological advance in functional magnetic resonance imaging (fMRI), in which silent delays are introduced between MR volume acquisitions, allowing for the presentation of auditory stimuli without contamination by acoustic scanner noise and for overt vocal responses without motion-induced artifacts in the functional time series. As such, the sparse-sampling technique has become a mainstay of principled fMRI research into the cognitive and systems neuroscience of speech, language, hearing, and music. Despite being in use for over a decade, there has been little systematic investigation of the acquisition parameters, experimental design considerations, and statistical analysis approaches that bear on the results and interpretation of sparse-sampling fMRI experiments. In this report, we examined how design and analysis choices related to the duration of repetition time (TR) delay (an acquisition parameter), stimulation rate (an experimental design parameter), and model basis function (an analysis parameter) act independently and interactively to affect the neural activation profiles observed in fMRI. First, we conducted a series of computational simulations to explore the parameter space of sparse design and analysis with respect to these variables; second, we validated the results of these simulations in a series of sparse-sampling fMRI experiments. Overall, these experiments suggest the employment of three methodological approaches that can, in many situations, substantially improve the detection of neurophysiological response in sparse fMRI: (1) Sparse analyses should utilize a physiologically informed model that incorporates hemodynamic response convolution to reduce model error. (2) The design of sparse fMRI experiments should maintain a high rate of stimulus presentation to maximize effect size. (3) TR delays of short to intermediate length can be used between acquisitions of sparse-sampled functional image volumes to increase the number of samples and improve statistical power. PMID:23616742

  16. Optimized design and analysis of sparse-sampling FMRI experiments.

    PubMed

    Perrachione, Tyler K; Ghosh, Satrajit S

    2013-01-01

    Sparse-sampling is an important methodological advance in functional magnetic resonance imaging (fMRI), in which silent delays are introduced between MR volume acquisitions, allowing for the presentation of auditory stimuli without contamination by acoustic scanner noise and for overt vocal responses without motion-induced artifacts in the functional time series. As such, the sparse-sampling technique has become a mainstay of principled fMRI research into the cognitive and systems neuroscience of speech, language, hearing, and music. Despite being in use for over a decade, there has been little systematic investigation of the acquisition parameters, experimental design considerations, and statistical analysis approaches that bear on the results and interpretation of sparse-sampling fMRI experiments. In this report, we examined how design and analysis choices related to the duration of repetition time (TR) delay (an acquisition parameter), stimulation rate (an experimental design parameter), and model basis function (an analysis parameter) act independently and interactively to affect the neural activation profiles observed in fMRI. First, we conducted a series of computational simulations to explore the parameter space of sparse design and analysis with respect to these variables; second, we validated the results of these simulations in a series of sparse-sampling fMRI experiments. Overall, these experiments suggest the employment of three methodological approaches that can, in many situations, substantially improve the detection of neurophysiological response in sparse fMRI: (1) Sparse analyses should utilize a physiologically informed model that incorporates hemodynamic response convolution to reduce model error. (2) The design of sparse fMRI experiments should maintain a high rate of stimulus presentation to maximize effect size. (3) TR delays of short to intermediate length can be used between acquisitions of sparse-sampled functional image volumes to increase the number of samples and improve statistical power.
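    A sketch of recommendation (1) above: build the design regressor by convolving a stimulus train with a hemodynamic response model and sample it only at the sparse acquisition times. The double-gamma expression below is a widely used canonical approximation, not necessarily the authors' exact basis function, and the event spacing and TR are arbitrary example values.

        import numpy as np
        from scipy.stats import gamma

        dt = 0.1                                  # high-resolution time grid (s)
        t = np.arange(0, 30, dt)
        # Widely used double-gamma HRF approximation (peak ~6 s, undershoot ~16 s).
        hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0
        hrf /= hrf.sum()

        # Hypothetical stimulus train: one event every 4 s for 60 s.
        total = 60.0
        grid = np.arange(0, total, dt)
        stim = np.zeros_like(grid)
        stim[np.round(np.arange(0, total, 4.0) / dt).astype(int)] = 1.0

        predicted = np.convolve(stim, hrf)[: grid.size]   # predicted BOLD signal

        # Sample the regressor only at the sparse acquisitions (TR = 10 s, say).
        tr = 10.0
        acq_idx = np.round(np.arange(0, total, tr) / dt).astype(int)
        design_regressor = predicted[acq_idx]
        print(np.round(design_regressor, 3))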

  17. A statistical approach to selecting and confirming validation targets in -omics experiments

    PubMed Central

    2012-01-01

    Background Genomic technologies are, by their very nature, designed for hypothesis generation. In some cases, the hypotheses that are generated require that genome scientists confirm findings about specific genes or proteins. But one major advantage of high-throughput technology is that global genetic, genomic, transcriptomic, and proteomic behaviors can be observed. Manual confirmation of every statistically significant genomic result is prohibitively expensive. This has led researchers in genomics to adopt the strategy of confirming only a handful of the most statistically significant results, a small subset chosen for biological interest, or a small random subset. But there is no standard approach for selecting and quantitatively evaluating validation targets. Results Here we present a new statistical method and approach for statistically validating lists of significant results based on confirming only a small random sample. We apply our statistical method to show that the usual practice of confirming only the most statistically significant results does not statistically validate result lists. We analyze an extensively validated RNA-sequencing experiment to show that confirming a random subset can statistically validate entire lists of significant results. Finally, we analyze multiple publicly available microarray experiments to show that statistically validating random samples can both (i) provide evidence to confirm long gene lists and (ii) save thousands of dollars and hundreds of hours of labor over manual validation of each significant result. Conclusions For high-throughput -omics studies, statistical validation is a cost-effective and statistically valid approach to confirming lists of significant results. PMID:22738145

  18. EBSCO's Usage Consolidation Attempts to Streamline Gathering, Storage, and Reporting of Usage Statistics

    ERIC Educational Resources Information Center

    Remy, Charlie

    2012-01-01

    This paper provides an overview of EBSCO's new Usage Consolidation product designed to streamline the harvesting, storage, and analysis of usage statistics from electronic resources. Strengths and weaknesses of the product are discussed as well as an early beta partner's experience. In the current atmosphere of flat or declining budgets, libraries…

  19. Using VITA Service Learning Experiences to Teach Hypothesis Testing and P-Value Analysis

    ERIC Educational Resources Information Center

    Drougas, Anne; Harrington, Steve

    2011-01-01

    This paper describes a hypothesis testing project designed to capture student interest and stimulate classroom interaction and communication. Using an online survey instrument, the authors collected student demographic information and data regarding university service learning experiences. Introductory statistics students performed a series of…

  20. Intraclass Correlation Values for Planning Group-Randomized Trials in Education

    ERIC Educational Resources Information Center

    Hedges, Larry V.; Hedberg, E. C.

    2007-01-01

    Experiments that assign intact groups to treatment conditions are increasingly common in social research. In educational research, the groups assigned are often schools. The design of group-randomized experiments requires knowledge of the intraclass correlation structure to compute statistical power and sample sizes required to achieve adequate…
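    A sketch of how the intraclass correlation enters such planning through the standard design effect, deff = 1 + (m - 1) * ICC, which inflates the sample size needed relative to simple randomization; the cluster size, ICC, and baseline sample size below are hypothetical planning numbers, not values from the article.

        def design_effect(m, icc):
            """Variance inflation for group randomization with m individuals per
            group and intraclass correlation icc."""
            return 1 + (m - 1) * icc

        # Hypothetical planning numbers: 50 students per school, ICC = 0.15.
        m, icc = 50, 0.15
        deff = design_effect(m, icc)
        n_simple = 400                        # n needed under simple randomization
        n_clustered = n_simple * deff         # total students needed
        print(deff, n_clustered, n_clustered / m)   # ~8.35, 3340 students, ~67 schools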

  1. RANDOMIZATION PROCEDURES FOR THE ANALYSIS OF EDUCATIONAL EXPERIMENTS.

    ERIC Educational Resources Information Center

    COLLIER, RAYMOND O.

    Certain specific aspects of hypothesis tests used for analysis of results in randomized experiments were studied--(1) the development of the theoretical factor, that of providing information on statistical tests for certain experimental designs, and (2) the development of the applied element, that of supplying the experimenter with machinery for…

  2. Virtual Model Validation of Complex Multiscale Systems: Applications to Nonlinear Elastostatics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oden, John Tinsley; Prudencio, Ernest E.; Bauman, Paul T.

    We propose a virtual statistical validation process as an aid to the design of experiments for the validation of phenomenological models of the behavior of material bodies, with focus on those cases in which knowledge of the fabrication process used to manufacture the body can provide information on the micro-molecular-scale properties underlying macroscale behavior. One example is given by models of elastomeric solids fabricated using polymerization processes. We describe a framework for model validation that involves Bayesian updates of parameters in statistical calibration and validation phases. The process enables the quantification of uncertainty in quantities of interest (QoIs) and the determination of model consistency using tools of statistical information theory. We assert that microscale information drawn from molecular models of the fabrication of the body provides a valuable source of prior information on parameters as well as a means for estimating model bias and designing virtual validation experiments to provide information gain over calibration posteriors.

  3. A study of the feasibility of statistical analysis of airport performance simulation

    NASA Technical Reports Server (NTRS)

    Myers, R. H.

    1982-01-01

    The feasibility of conducting a statistical analysis of simulation experiments to study airport capacity is investigated. First, the form of the distribution of airport capacity is studied. Since the distribution is non-Gaussian, it is important to determine the effect of this distribution on standard analysis of variance techniques and power calculations. Next, power computations are made in order to determine how economical simulation experiments would be if they were designed to detect capacity changes from condition to condition. Many of the conclusions drawn are results of Monte Carlo techniques.

  4. Statistical challenges in a regulatory review of cardiovascular and CNS clinical trials.

    PubMed

    Hung, H M James; Wang, Sue-Jane; Yang, Peiling; Jin, Kun; Lawrence, John; Kordzakhia, George; Massie, Tristan

    2016-01-01

    There are several challenging statistical problems identified in the regulatory review of large cardiovascular (CV) clinical outcome trials and central nervous system (CNS) trials. The problems can be common or distinct due to disease characteristics and the differences in trial design elements such as endpoints, trial duration, and trial size. In schizophrenia trials, heavy missing data is a big problem. In Alzheimer trials, the endpoints for assessing symptoms and the endpoints for assessing disease progression are essentially the same; it is difficult to construct a good trial design to evaluate a test drug for its ability to slow the disease progression. In CV trials, reliance on a composite endpoint with low event rate makes the trial size so large that it is infeasible to study multiple doses necessary to find the right dose for study patients. These are just a few typical problems. In the past decade, adaptive designs were increasingly used in these disease areas and some challenges occur with respect to that use. Based on our review experiences, group sequential designs (GSDs) have borne many successful stories in CV trials and are also increasingly used for developing treatments targeting CNS diseases. There is also a growing trend of using more advanced unblinded adaptive designs for producing efficacy evidence. Many statistical challenges with these kinds of adaptive designs have been identified through our experiences with the review of regulatory applications and are shared in this article.

  5. Application of the experimental design of experiments (DoE) for the determination of organotin compounds in water samples using HS-SPME and GC-MS/MS.

    PubMed

    Coscollà, Clara; Navarro-Olivares, Santiago; Martí, Pedro; Yusà, Vicent

    2014-02-01

    When attempting to discover the important factors and then optimise a response by tuning these factors, experimental design (design of experiments, DoE) provides a powerful suite of statistical methodology. In method development, DoE identifies significant factors and then optimises a response with respect to them. In this work, a headspace solid-phase micro-extraction (HS-SPME) methodology combined with gas chromatography tandem mass spectrometry (GC-MS/MS) for the simultaneous determination of six important organotin compounds, namely monobutyltin (MBT), dibutyltin (DBT), tributyltin (TBT), monophenyltin (MPhT), diphenyltin (DPhT) and triphenyltin (TPhT), has been optimized using a statistical design of experiments (DOE). The analytical method is based on ethylation with NaBEt4 and simultaneous headspace solid-phase micro-extraction of the derivative compounds followed by GC-MS/MS analysis. The main experimental parameters influencing the extraction efficiency selected for optimization were pre-incubation time, incubation temperature, agitator speed, extraction time, desorption temperature, buffer (pH, concentration and volume), headspace volume, sample salinity, preparation of standards, ultrasonic time and desorption time in the injector. The main factors (excitation voltage, excitation time, ion source temperature, isolation time and electron energy) affecting the GC-IT-MS/MS response were also optimized using the same statistical design of experiments. The proposed method showed good linearity (coefficient of determination R² > 0.99) and repeatability (1-25%) for all the compounds under study. The accuracy of the method, measured as the average percentage recovery of the compounds in spiked surface and marine waters, was higher than 70% for all compounds studied. Finally, the optimized methodology was applied to real aqueous samples, enabling the simultaneous determination of all compounds under study in surface and marine water samples obtained from the Valencia region (Spain). © 2013 Elsevier B.V. All rights reserved.

  6. Optimal experimental designs for fMRI when the model matrix is uncertain.

    PubMed

    Kao, Ming-Hung; Zhou, Lin

    2017-07-15

    This study concerns optimal designs for functional magnetic resonance imaging (fMRI) experiments when the model matrix of the statistical model depends on both the selected stimulus sequence (fMRI design), and the subject's uncertain feedback (e.g. answer) to each mental stimulus (e.g. question) presented to her/him. While practically important, this design issue is challenging. This mainly is because that the information matrix cannot be fully determined at the design stage, making it difficult to evaluate the quality of the selected designs. To tackle this challenging issue, we propose an easy-to-use optimality criterion for evaluating the quality of designs, and an efficient approach for obtaining designs optimizing this criterion. Compared with a previously proposed method, our approach requires a much less computing time to achieve designs with high statistical efficiencies. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. The Taguchi methodology as a statistical tool for biotechnological applications: a critical appraisal.

    PubMed

    Rao, Ravella Sreenivas; Kumar, C Ganesh; Prakasham, R Shetty; Hobbs, Phil J

    2008-04-01

    Success in experiments and/or technology mainly depends on a properly designed process or product. The traditional method of process optimization involves the study of one variable at a time, which requires a number of combinations of experiments that are time, cost and labor intensive. The Taguchi method of design of experiments is a simple statistical tool involving a system of tabulated designs (arrays) that allows a maximum number of main effects to be estimated in an unbiased (orthogonal) fashion with a minimum number of experimental runs. It has been applied to predict the significant contribution of the design variable(s) and the optimum combination of each variable by conducting experiments on a real-time basis. The modeling that is performed essentially relates signal-to-noise ratio to the control variables in a 'main effect only' approach. This approach enables both multiple response and dynamic problems to be studied by handling noise factors. Taguchi principles and concepts have made extensive contributions to industry by bringing focused awareness to robustness, noise and quality. This methodology has been widely applied in many industrial sectors; however, its application in biological sciences has been limited. In the present review, the application and comparison of the Taguchi methodology has been emphasized with specific case studies in the field of biotechnology, particularly in diverse areas like fermentation, food processing, molecular biology, wastewater treatment and bioremediation.

  8. Measuring and Enhancing Creativity

    ERIC Educational Resources Information Center

    Mahboub, Kamyar C.; Portillo, Margaret B.; Liu, Yinhui; Chandraratna, Susantha

    2004-01-01

    The purpose of this study was to assess ways by which creativity may be enhanced in a design-oriented course. In order to demonstrate the validity of the approach, a statistically based study was employed. Additionally, the experiment was replicated in two design-oriented fields at the University of Kentucky. These fields were civil engineering…

  9. Statistical Modeling Studies of Iron Recovery from Red Mud Using Thermal Plasma

    NASA Astrophysics Data System (ADS)

    Swagat, S. Rath; Archana, Pany; Jayasankar, K.; Ajit, K. Mitra; C. Satish, Kumar; Partha, S. Mukherjee; Barada, K. Mishra

    2013-05-01

    Optimization studies of plasma smelting of red mud were carried out. Reduction of the dried red mud fines was done in an extended arc plasma reactor to recover the pig iron. Lime grit and low ash metallurgical (LAM) coke were used as the flux and reductant, respectively. A 2-level factorial design was used to study the influence of all parameters on the responses. Response surface modeling was carried out with the data obtained from the statistically designed experiments. Metal recovery at the optimum parameters was found to be 79.52%.

  10. Multivariable regression analysis of list experiment data on abortion: results from a large, randomly-selected population based study in Liberia.

    PubMed

    Moseson, Heidi; Gerdts, Caitlin; Dehlendorf, Christine; Hiatt, Robert A; Vittinghoff, Eric

    2017-12-21

    The list experiment is a promising measurement tool for eliciting truthful responses to stigmatized or sensitive health behaviors. However, investigators may be hesitant to adopt the method due to previously untestable assumptions and the perceived inability to conduct multivariable analysis. With a recently developed statistical test that can detect the presence of a design effect - the absence of which is a central assumption of the list experiment method - we sought to test the validity of a list experiment conducted on self-reported abortion in Liberia. We also aim to introduce recently developed multivariable regression estimators for the analysis of list experiment data, to explore relationships between respondent characteristics and having had an abortion - an important component of understanding the experiences of women who have abortions. To test the null hypothesis of no design effect in the Liberian list experiment data, we calculated the percentage of each respondent "type," characterized by response to the control items, and compared these percentages across treatment and control groups with a Bonferroni-adjusted alpha criterion. We then implemented two least squares and two maximum likelihood models (four total), each representing different bias-variance trade-offs, to estimate the association between respondent characteristics and abortion. We find no clear evidence of a design effect in list experiment data from Liberia (p = 0.18), affirming the first key assumption of the method. Multivariable analyses suggest a negative association between education and history of abortion. The retrospective nature of measuring lifetime experience of abortion, however, complicates interpretation of results, as the timing and safety of a respondent's abortion may have influenced her ability to pursue an education. Our work demonstrates that multivariable analyses, as well as statistical testing of a key design assumption, are possible with list experiment data, although with important limitations when considering lifetime measures. We outline how to implement this methodology with list experiment data in future research.
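    As background for the multivariable estimators discussed above, the sketch below shows the basic difference-in-means list experiment estimator (prevalence = mean item count under treatment minus mean count under control) on simulated data; the item counts, prevalence, and sample size are made up for illustration and no design-effect test or regression adjustment is included.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 2000
        true_prevalence = 0.30

        # Control respondents report how many of 4 non-sensitive items apply;
        # treatment respondents see the same items plus the sensitive item.
        control_counts = rng.binomial(4, 0.5, n)
        sensitive = rng.binomial(1, true_prevalence, n)
        treatment_counts = rng.binomial(4, 0.5, n) + sensitive

        estimate = treatment_counts.mean() - control_counts.mean()
        se = np.sqrt(treatment_counts.var(ddof=1) / n + control_counts.var(ddof=1) / n)
        print(f"estimated prevalence: {estimate:.3f} +/- {1.96 * se:.3f}")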

  11. A statistical approach to optimizing concrete mixture design.

    PubMed

    Ahmad, Shamsad; Alghamdi, Saeid A

    2014-01-01

    A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (3³). A total of 27 concrete mixtures with three replicates (81 specimens) were considered by varying the levels of key factors affecting compressive strength of concrete, namely, water/cementitious materials ratio (0.38, 0.43, and 0.48), cementitious materials content (350, 375, and 400 kg/m³), and fine/total aggregate ratio (0.35, 0.40, and 0.45). The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for compressive strength in terms of the three design factors considered in this study. The developed statistical model was used to show how optimization of concrete mixtures can be carried out with different possible options.

  12. A Statistical Approach to Optimizing Concrete Mixture Design

    PubMed Central

    Alghamdi, Saeid A.

    2014-01-01

    A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (3³). A total of 27 concrete mixtures with three replicates (81 specimens) were considered by varying the levels of key factors affecting compressive strength of concrete, namely, water/cementitious materials ratio (0.38, 0.43, and 0.48), cementitious materials content (350, 375, and 400 kg/m³), and fine/total aggregate ratio (0.35, 0.40, and 0.45). The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for compressive strength in terms of the three design factors considered in this study. The developed statistical model was used to show how optimization of concrete mixtures can be carried out with different possible options. PMID:24688405
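    A sketch of laying out the 3³ design in coded levels and fitting a quadratic model by least squares, in the spirit of the ANOVA/regression step described above; the simulated "strength" values and coefficients are placeholders, not the study's measured data or fitted model.

        import itertools
        import numpy as np

        # 3^3 full factorial in coded levels (-1, 0, +1) for w/cm ratio,
        # cementitious content, and fine/total aggregate ratio.
        levels = [-1, 0, 1]
        X = np.array(list(itertools.product(levels, repeat=3)), float)   # 27 runs

        # Simulated compressive strength (MPa) standing in for lab data.
        rng = np.random.default_rng(4)
        y = (45 - 6 * X[:, 0] + 3 * X[:, 1] + 1.5 * X[:, 2] - 2 * X[:, 0] ** 2
             + rng.normal(0, 1.0, len(X)))

        # Quadratic model: intercept, linear, squared, and two-way interaction terms.
        cols = ([np.ones(len(X))]
                + [X[:, i] for i in range(3)]
                + [X[:, i] ** 2 for i in range(3)]
                + [X[:, i] * X[:, j] for i in range(3) for j in range(i + 1, 3)])
        M = np.column_stack(cols)
        coef, *_ = np.linalg.lstsq(M, y, rcond=None)
        print(np.round(coef, 2))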

  13. Economic Statistical Design of Integrated X-bar-S Control Chart with Preventive Maintenance and General Failure Distribution

    PubMed Central

    Caballero Morales, Santiago Omar

    2013-01-01

    Preventive Maintenance (PM) and Statistical Process Control (SPC) are important practices for achieving high product quality, a low frequency of failures, and cost reduction in a production process. However, some points about their joint application have not been explored in depth. First, most SPC is performed with the X-bar control chart, which does not fully consider the variability of the production process. Second, many studies of the design of control charts consider just the economic aspect, while statistical restrictions must be considered to achieve charts with low probabilities of false detection of failures. Third, the effect of PM on processes with different failure probability distributions has not been studied. Hence, this paper covers these points, presenting the Economic Statistical Design (ESD) of joint X-bar-S control charts with a cost model that integrates PM with a general failure distribution. Experiments showed statistically significant reductions in costs when PM is performed on processes with high failure rates, together with reductions in the sampling frequency of units for testing under SPC. PMID:23527082
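    For reference, the sketch below computes conventional 3-sigma limits for a joint X-bar-S chart from subgroup data, deriving the A3, B3, and B4 constants from the c4 unbiasing factor; the simulated data and subgroup size are illustrative, and the economic-statistical optimization of the article is not reproduced.

        import numpy as np
        from math import sqrt, lgamma, exp

        def c4(n):
            """Unbiasing constant for the sample standard deviation."""
            return sqrt(2.0 / (n - 1)) * exp(lgamma(n / 2.0) - lgamma((n - 1) / 2.0))

        def xbar_s_limits(subgroups):
            """3-sigma limits for joint X-bar and S charts from subgroup data."""
            subgroups = np.asarray(subgroups, float)
            n = subgroups.shape[1]
            xbar_bar = subgroups.mean(axis=1).mean()
            s_bar = subgroups.std(axis=1, ddof=1).mean()
            a3 = 3.0 / (c4(n) * sqrt(n))
            b3 = max(0.0, 1.0 - 3.0 * sqrt(1.0 - c4(n) ** 2) / c4(n))
            b4 = 1.0 + 3.0 * sqrt(1.0 - c4(n) ** 2) / c4(n)
            return {"xbar": (xbar_bar - a3 * s_bar, xbar_bar + a3 * s_bar),
                    "s": (b3 * s_bar, b4 * s_bar)}

        rng = np.random.default_rng(5)
        data = rng.normal(10.0, 0.2, size=(25, 5))   # 25 subgroups of size 5
        print(xbar_s_limits(data))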

  14. The longevity of statistical learning: When infant memory decays, isolated words come to the rescue.

    PubMed

    Karaman, Ferhat; Hay, Jessica F

    2018-02-01

    Research over the past 2 decades has demonstrated that infants are equipped with remarkable computational abilities that allow them to find words in continuous speech. Infants can encode information about the transitional probability (TP) between syllables to segment words from artificial and natural languages. As previous research has tested infants immediately after familiarization, infants' ability to retain sequential statistics beyond the immediate familiarization context remains unknown. Here, we examine infants' memory for statistically defined words 10 min after familiarization with an Italian corpus. Eight-month-old English-learning infants were familiarized with Italian sentences that contained 4 embedded target words-2 words had high internal TP (HTP, TP = 1.0) and 2 had low TP (LTP, TP = .33)-and were tested on their ability to discriminate HTP from LTP words using the Headturn Preference Procedure. When tested after a 10-min delay, infants failed to discriminate HTP from LTP words, suggesting that memory for statistical information likely decays over even short delays (Experiment 1). Experiments 2-4 were designed to test whether experience with isolated words selectively reinforces memory for statistically defined (i.e., HTP) words. When 8-month-olds were given additional experience with isolated tokens of both HTP and LTP words immediately after familiarization, they looked significantly longer on HTP than LTP test trials 10 min later. Although initial representations of statistically defined words may be fragile, our results suggest that experience with isolated words may reinforce the output of statistical learning by helping infants create more robust memories for words with strong versus weak co-occurrence statistics. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
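    A sketch of the transitional probability computation underlying the HTP/LTP distinction above, using a toy syllable stream rather than the Italian corpus used in the experiments; the "words" and counts are invented for illustration.

        from collections import Counter

        # Toy syllable stream with a consistent word "bidaku" (TP = 1.0 inside
        # the word) and lower TPs across word boundaries.
        stream = ("bidaku golabu padoti bidaku padoti golabu bidaku golabu "
                  "padoti bidaku").split()
        syllables = [word[i:i + 2] for word in stream for i in range(0, len(word), 2)]

        pair_counts = Counter(zip(syllables, syllables[1:]))
        first_counts = Counter(syllables[:-1])

        def transitional_probability(s1, s2):
            """TP(s1 -> s2) = count(s1 s2) / count(s1)."""
            return pair_counts[(s1, s2)] / first_counts[s1]

        print(transitional_probability("bi", "da"))   # within-word TP (1.0)
        print(transitional_probability("ku", "go"))   # across a word boundary (< 1.0)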

  15. Experimental design in chemistry: A tutorial.

    PubMed

    Leardi, Riccardo

    2009-10-12

    In this tutorial the main concepts and applications of experimental design in chemistry will be explained. Unfortunately, nowadays experimental design is not as known and applied as it should be, and many papers can be found in which the "optimization" of a procedure is performed one variable at a time. Goal of this paper is to show the real advantages in terms of reduced experimental effort and of increased quality of information that can be obtained if this approach is followed. To do that, three real examples will be shown. Rather than on the mathematical aspects, this paper will focus on the mental attitude required by experimental design. The readers being interested to deepen their knowledge of the mathematical and algorithmical part can find very good books and tutorials in the references [G.E.P. Box, W.G. Hunter, J.S. Hunter, Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building, John Wiley & Sons, New York, 1978; R. Brereton, Chemometrics: Data Analysis for the Laboratory and Chemical Plant, John Wiley & Sons, New York, 1978; R. Carlson, J.E. Carlson, Design and Optimization in Organic Synthesis: Second Revised and Enlarged Edition, in: Data Handling in Science and Technology, vol. 24, Elsevier, Amsterdam, 2005; J.A. Cornell, Experiments with Mixtures: Designs, Models and the Analysis of Mixture Data, in: Series in Probability and Statistics, John Wiley & Sons, New York, 1991; R.E. Bruns, I.S. Scarminio, B. de Barros Neto, Statistical Design-Chemometrics, in: Data Handling in Science and Technology, vol. 25, Elsevier, Amsterdam, 2006; D.C. Montgomery, Design and Analysis of Experiments, 7th edition, John Wiley & Sons, Inc., 2009; T. Lundstedt, E. Seifert, L. Abramo, B. Thelin, A. Nyström, J. Pettersen, R. Bergman, Chemolab 42 (1998) 3; Y. Vander Heyden, LC-GC Europe 19 (9) (2006) 469].

  16. How-To-Do-It: Snails, Pill Bugs, Mealworms, and Chi-Square? Using Invertebrate Behavior to Illustrate Hypothesis Testing with Chi-Square.

    ERIC Educational Resources Information Center

    Biermann, Carol

    1988-01-01

    Described is a study designed to introduce students to the behavior of common invertebrate animals and to the use of the chi-square statistical technique. Discusses activities with snails, pill bugs, and mealworms. Provides an abbreviated chi-square table and instructions for performing the experiments and statistical tests. (CW)
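    A minimal sketch of the chi-square goodness-of-fit test used in such activities; the pill bug counts below are hypothetical example data, not results from the article.

        from scipy.stats import chisquare

        # Hypothetical choice-chamber counts: 40 pill bugs observed on the damp
        # side vs. 20 on the dry side; the null hypothesis expects a 30/30 split.
        observed = [40, 20]
        expected = [30, 30]
        stat, p = chisquare(observed, f_exp=expected)
        print(f"chi-square = {stat:.2f}, p = {p:.4f}")   # 6.67, p ~ 0.0098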

  17. Threats to the Internal Validity of Experimental and Quasi-Experimental Research in Healthcare.

    PubMed

    Flannelly, Kevin J; Flannelly, Laura T; Jankowski, Katherine R B

    2018-01-01

    The article defines, describes, and discusses the seven threats to the internal validity of experiments discussed by Donald T. Campbell in his classic 1957 article: history, maturation, testing, instrument decay, statistical regression, selection, and mortality. These concepts are said to be threats to the internal validity of experiments because they pose alternate explanations for the apparent causal relationship between the independent variable and dependent variable of an experiment if they are not adequately controlled. A series of simple diagrams illustrate three pre-experimental designs and three true experimental designs discussed by Campbell in 1957 and several quasi-experimental designs described in his book written with Julian C. Stanley in 1966. The current article explains why each design controls for or fails to control for these seven threats to internal validity.

  18. TEACHING ADULTS BY TELEVISION, A REPORT OF AN EXPERIMENT IN THE TEACHING OF ELEMENTARY ENGLISH AND ARITHMETIC TO ADULT AFRICANS ON THE COPPERBELT, ZAMBIA, 1963-1965.

    ERIC Educational Resources Information Center

    CRIPWELL, KENNETH K.R.

    Three experiments were designed to teach adult men with limited education a closed-circuit televised course in English and arithmetic, to be reinforced by conventional classroom instruction. Background and general procedures of the experiments are described, and statistical data are reported for comparisons of ability before and after instruction…

  19. A practical approach to automate randomized design of experiments for ligand-binding assays.

    PubMed

    Tsoi, Jennifer; Patel, Vimal; Shih, Judy

    2014-03-01

    Design of experiments (DOE) is utilized in optimizing ligand-binding assays by modeling factor effects. To reduce the analyst's workload and the error inherent in DOE, we propose the integration of automated liquid handlers to perform the randomized designs. A randomized design created in statistical software was imported into a custom macro that converts the design into a liquid-handler worklist to automate reagent delivery. An optimized assay was transferred to a contract research organization, resulting in a successful validation. We developed a practical solution for assay optimization by integrating DOE and automation to increase assay robustness and enable successful method transfer. The flexibility of this process allows it to be applied to a variety of assay designs.

  20. Directed Design of Experiments (DOE) for Determining Probability of Detection (POD) Capability of NDE Systems (DOEPOD)

    NASA Technical Reports Server (NTRS)

    Generazio, Ed

    2007-01-01

    This viewgraph presentation reviews some of the issues that people who specialize in nondestructive evaluation (NDE) have with determining the statistics of the probability of detection. There is discussion of the use of the binomial distribution, and the probability of hit. The presentation then reviews the concepts of Directed Design of Experiments for Validating Probability of Detection of Inspection Systems (DOEPOD). Several cases are reviewed and discussed. The concept of false calls is also reviewed.
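    As a generic illustration of the binomial logic mentioned above (not DOEPOD itself), the sketch below reproduces the often-cited "29 hits in 29 trials" demonstration: if the true POD were only 0.90, the chance of observing all hits is 0.90 to the power n, which first drops below 0.05 at n = 29.

        import math

        # If the true POD were only 0.90, the probability of observing n hits in
        # n trials is 0.90**n; 95% confidence requires this to fall below 0.05.
        n = 1
        while 0.90 ** n >= 0.05:
            n += 1
        print(n, 0.90 ** n)    # 29 hits in 29 trials, probability ~0.047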

  1. Proceedings of the Conference on the Design of Experiments in Army Research, Development, and Testing (33rd)

    DTIC Science & Technology

    1988-05-01

    Evaluation Directorate (ARMTE) was tasked to conduct a "side-by-side" comparison of EMPS vs. DATMs and to conduct a human factors evaluation of the EMPS...performance ("side-by-side") comparison of EMPS vs. DATMs and to conduct a human factors evaluation. The performance evaluation was based on the speed... independent targets over time. To acquire data for this research, the BRL conducted a statistically designed experiment, the Firepower Control Experiment

  2. Incorporating an Interactive Statistics Workshop into an Introductory Biology Course-Based Undergraduate Research Experience (CURE) Enhances Students’ Statistical Reasoning and Quantitative Literacy Skills †

    PubMed Central

    Olimpo, Jeffrey T.; Pevey, Ryan S.; McCabe, Thomas M.

    2018-01-01

    Course-based undergraduate research experiences (CUREs) provide an avenue for student participation in authentic scientific opportunities. Within the context of such coursework, students are often expected to collect, analyze, and evaluate data obtained from their own investigations. Yet, limited research has been conducted that examines mechanisms for supporting students in these endeavors. In this article, we discuss the development and evaluation of an interactive statistics workshop that was expressly designed to provide students with an open platform for graduate teaching assistant (GTA)-mentored data processing, statistical testing, and synthesis of their own research findings. Mixed methods analyses of pre/post-intervention survey data indicated a statistically significant increase in students’ reasoning and quantitative literacy abilities in the domain, as well as enhancement of student self-reported confidence in and knowledge of the application of various statistical metrics to real-world contexts. Collectively, these data reify an important role for scaffolded instruction in statistics in preparing emergent scientists to be data-savvy researchers in a globally expansive STEM workforce. PMID:29904549

  3. Incorporating an Interactive Statistics Workshop into an Introductory Biology Course-Based Undergraduate Research Experience (CURE) Enhances Students' Statistical Reasoning and Quantitative Literacy Skills.

    PubMed

    Olimpo, Jeffrey T; Pevey, Ryan S; McCabe, Thomas M

    2018-01-01

    Course-based undergraduate research experiences (CUREs) provide an avenue for student participation in authentic scientific opportunities. Within the context of such coursework, students are often expected to collect, analyze, and evaluate data obtained from their own investigations. Yet, limited research has been conducted that examines mechanisms for supporting students in these endeavors. In this article, we discuss the development and evaluation of an interactive statistics workshop that was expressly designed to provide students with an open platform for graduate teaching assistant (GTA)-mentored data processing, statistical testing, and synthesis of their own research findings. Mixed methods analyses of pre/post-intervention survey data indicated a statistically significant increase in students' reasoning and quantitative literacy abilities in the domain, as well as enhancement of student self-reported confidence in and knowledge of the application of various statistical metrics to real-world contexts. Collectively, these data reify an important role for scaffolded instruction in statistics in preparing emergent scientists to be data-savvy researchers in a globally expansive STEM workforce.

  4. An Electronic Engineering Curriculum Design Based on Concept-Mapping Techniques

    ERIC Educational Resources Information Center

    Toral, S. L.; Martinez-Torres, M. R.; Barrero, F.; Gallardo, S.; Duran, M. J.

    2007-01-01

    Curriculum design is a concern in European Universities as they face the forthcoming European Higher Education Area (EHEA). This process can be eased by the use of scientific tools such as Concept-Mapping Techniques (CMT) that extract and organize the most relevant information from experts' experience using statistics techniques, and helps a…

  5. The effects of DRIE operational parameters on vertically aligned micropillar arrays

    NASA Astrophysics Data System (ADS)

    Miller, Kane; Li, Mingxiao; Walsh, Kevin M.; Fu, Xiao-An

    2013-03-01

    Vertically aligned silicon micropillar arrays have been created by deep reactive ion etching (DRIE) and used for a number of microfabricated devices including microfluidic devices, micropreconcentrators and photovoltaic cells. This paper delineates an experimental design performed on the Bosch process of DRIE of micropillar arrays. The arrays are fabricated with direct-write optical lithography without a photomask, and the effects of DRIE process parameters, including etch cycle time, passivation cycle time, platen power and coil power on profile angle, scallop depth and scallop peak-to-peak distance are studied by statistical design of experiments. Scanning electron microscope images are used for measuring the resultant profile angles and characterizing the scalloping effect on the pillar sidewalls. The experimental results indicate the effects of the determining factors, etch cycle time, passivation cycle time and platen power, on the micropillar profile angles and scallop depths. An optimized DRIE process recipe for creating nearly 90° profile angles and smooth sidewalls (invisible scalloping) has been obtained as a result of the statistical design of experiments.

  6. Assessing signal-to-noise in quantitative proteomics: multivariate statistical analysis in DIGE experiments.

    PubMed

    Friedman, David B

    2012-01-01

    All quantitative proteomics experiments measure variation between samples. When performing large-scale experiments that involve multiple conditions or treatments, the experimental design should include the appropriate number of individual biological replicates from each condition to enable a relevant biological signal to be distinguished from technical noise. Multivariate statistical analyses, such as principal component analysis (PCA), provide a global perspective on experimental variation, thereby enabling the assessment of whether the variation describes the expected biological signal or the unanticipated technical/biological noise inherent in the system. Examples will be shown from high-resolution multivariable DIGE experiments where PCA was instrumental in demonstrating biologically significant variation as well as sample outliers, fouled samples, and overriding technical variation that would not be readily observed using standard univariate tests.
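
    As an editorial aside, the kind of PCA check described above can be sketched in a few lines of Python; the sample layout, spot counts, and effect sizes below are invented for illustration and are not from the cited work.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)

        # Hypothetical log-abundance matrix: 12 samples (2 conditions x 6
        # biological replicates) by 500 protein spots.
        condition = np.repeat([0, 1], 6)
        X = rng.normal(size=(12, 500))
        X[condition == 1, :25] += 1.0      # modest biological signal in 25 spots
        X[3, :] += rng.normal(2, 1, 500)   # one fouled/outlier sample

        # Project samples onto the first two principal components.
        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
        for sample, (pc1, pc2) in enumerate(scores):
            print(f"sample {sample:2d}  condition {condition[sample]}  "
                  f"PC1 {pc1:7.2f}  PC2 {pc2:7.2f}")
        # Replicates that separate by condition suggest real biological signal;
        # a lone extreme sample flags a technical outlier worth inspecting.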

  7. The Value of Interrupted Time-Series Experiments for Community Intervention Research

    PubMed Central

    Biglan, Anthony; Ary, Dennis; Wagenaar, Alexander C.

    2015-01-01

    Greater use of interrupted time-series experiments is advocated for community intervention research. Time-series designs enable the development of knowledge about the effects of community interventions and policies in circumstances in which randomized controlled trials are too expensive, premature, or simply impractical. The multiple baseline time-series design typically involves two or more communities that are repeatedly assessed, with the intervention introduced into one community at a time. It is particularly well suited to initial evaluations of community interventions and the refinement of those interventions. This paper describes the main features of multiple baseline designs and related repeated-measures time-series experiments, discusses the threats to internal validity in multiple baseline designs, and outlines techniques for statistical analyses of time-series data. Examples are given of the use of multiple baseline designs in evaluating community interventions and policy changes. PMID:11507793
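
    For readers who want a concrete starting point, a segmented (interrupted time-series) regression for a single community can be fit as below; the monthly series, effect sizes, and change point are simulated assumptions, and a full multiple baseline analysis would repeat this across communities with staggered intervention dates and attention to serial correlation.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(42)

        # Hypothetical monthly outcome for one community: 24 pre-intervention
        # and 24 post-intervention observations with a level drop at month 24.
        months = np.arange(48)
        post = (months >= 24).astype(float)
        time_since = np.where(post == 1, months - 24, 0)
        y = 50 - 0.1 * months - 6 * post - 0.2 * time_since + rng.normal(0, 2, 48)

        # Segmented regression: baseline trend, level change, and slope change.
        X = sm.add_constant(np.column_stack([months, post, time_since]))
        fit = sm.OLS(y, X).fit()
        print(fit.params)  # [intercept, pre-trend, level change, slope change]
        # Serial correlation should be checked (e.g., Durbin-Watson) and, if
        # present, handled with an autoregressive error model.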

  8. Anodal transcranial direct current stimulation of the right anterior temporal lobe did not significantly affect verbal insight.

    PubMed

    Aihara, Takatsugu; Ogawa, Takeshi; Shimokawa, Takeaki; Yamashita, Okito

    2017-01-01

    Humans often utilize past experience to solve difficult problems. However, if past experience is insufficient to solve a problem, solvers may reach an impasse. Insight can be valuable for breaking an impasse, enabling the reinterpretation or re-representation of a problem. Previous studies using between-subjects designs have revealed a causal relationship between the anterior temporal lobes (ATLs) and non-verbal insight, by enhancing the right ATL while inhibiting the left ATL using transcranial direct current stimulation (tDCS). In addition, neuroimaging studies have reported a correlation between right ATL activity and verbal insight. Based on these findings, we hypothesized that the right ATL is causally related to both non-verbal and verbal insight. To test this hypothesis, we conducted an experiment with 66 subjects using a within-subjects design, which typically has greater statistical power than a between-subjects design. Subjects participated in tDCS experiments across 2 days, in which they solved both non-verbal and verbal insight problems under active or sham stimulation conditions. To dissociate the effects of right ATL stimulation from those of left ATL stimulation, we used two montage types: anodal tDCS of the right ATL together with cathodal tDCS of the left ATL (stimulating both ATLs) and anodal tDCS of the right ATL with cathodal tDCS of the left cheek (stimulating only the right ATL). The montage used was counterbalanced across subjects. Statistical analyses revealed that, regardless of the montage type, there were no significant differences between the active and sham conditions for either verbal or non-verbal insight, although the finding for non-verbal insight was inconclusive because of a lack of statistical power. These results failed to support previous findings suggesting that the right ATL is the central locus of insight.

  9. Statistical analysis and application of quasi experiments to antimicrobial resistance intervention studies.

    PubMed

    Shardell, Michelle; Harris, Anthony D; El-Kamary, Samer S; Furuno, Jon P; Miller, Ram R; Perencevich, Eli N

    2007-10-01

    Quasi-experimental study designs are frequently used to assess interventions that aim to limit the emergence of antimicrobial-resistant pathogens. However, previous studies using these designs have often used suboptimal statistical methods, which may result in researchers making spurious conclusions. Methods used to analyze quasi-experimental data include 2-group tests, regression analysis, and time-series analysis, and they all have specific assumptions, data requirements, strengths, and limitations. An example of a hospital-based intervention to reduce methicillin-resistant Staphylococcus aureus infection rates and reduce overall length of stay is used to explore these methods.

  10. Predictor sort sampling and one-sided confidence bounds on quantiles

    Treesearch

    Steve Verrill; Victoria L. Herian; David W. Green

    2002-01-01

    Predictor sort experiments attempt to make use of the correlation between a predictor that can be measured prior to the start of an experiment and the response variable that we are investigating. Properly designed and analyzed, they can reduce necessary sample sizes, increase statistical power, and reduce the lengths of confidence intervals. However, if the non-random...

  11. Statistical mixture design and multivariate analysis of inkjet printed a-WO3/TiO2/WOX electrochromic films.

    PubMed

    Wojcik, Pawel Jerzy; Pereira, Luís; Martins, Rodrigo; Fortunato, Elvira

    2014-01-13

    An efficient mathematical strategy in the field of solution-processed electrochromic (EC) films is outlined as a combination of experimental work, modeling, and information extraction from massive computational data via statistical software. Design of Experiments (DOE) was used for statistical multivariate analysis and prediction of mixtures through a multiple regression model, as well as the optimization of a five-component sol-gel precursor subjected to complex constraints. This approach significantly reduces the number of experiments to be realized, from 162 in the full factorial (L=3) and 72 in the extreme vertices (D=2) approach down to only 30 runs, while still maintaining a high accuracy of the analysis. By carrying out a finite number of experiments, the empirical modeling in this study shows reasonably good prediction ability in terms of the overall EC performance. An optimized ink formulation was employed in a prototype of a passive EC matrix fabricated in order to test and trial this optically active material system together with a solid-state electrolyte for the prospective application in EC displays. Coupling of DOE with chromogenic material formulation shows the potential to maximize the capabilities of these systems and ensures increased productivity in many potential solution-processed electrochemical applications.

  12. Parametric study of the swimming performance of a fish robot propelled by a flexible caudal fin.

    PubMed

    Low, K H; Chong, C W

    2010-12-01

    In this paper, we aim to study the swimming performance of fish robots by using a statistical approach. A fish robot employing a carangiform swimming mode had been used as an experimental platform for the performance study. The experiments conducted aim to investigate the effect of various design parameters on the thrust capability of the fish robot with a flexible caudal fin. The controllable parameters associated with the fin include frequency, amplitude of oscillation, aspect ratio and the rigidity of the caudal fin. The significance of these parameters was determined in the first set of experiments by using a statistical approach. A more detailed parametric experimental study was then conducted with only those significant parameters. As a result, the parametric study could be completed with a reduced number of experiments and time spent. With the obtained experimental result, we were able to understand the relationship between various parameters and a possible adjustment of parameters to obtain a higher thrust. The proposed statistical method for experimentation provides an objective and thorough analysis of the effects of individual or combinations of parameters on the swimming performance. Such an efficient experimental design helps to optimize the process and determine factors that influence variability.

  13. Optimizing Mass Spectrometry Analyses: A Tailored Review on the Utility of Design of Experiments.

    PubMed

    Hecht, Elizabeth S; Oberg, Ann L; Muddiman, David C

    2016-05-01

    Mass spectrometry (MS) has emerged as a tool that can analyze nearly all classes of molecules, with its scope rapidly expanding in the areas of post-translational modifications, MS instrumentation, and many others. Yet integration of novel analyte preparatory and purification methods with existing or novel mass spectrometers can introduce new challenges for MS sensitivity. The mechanisms that govern detection by MS are particularly complex and interdependent, including ionization efficiency, ion suppression, and transmission. Performance of both off-line and MS methods can be optimized separately or, when appropriate, simultaneously through statistical designs, broadly referred to as "design of experiments" (DOE). The following review provides a tutorial-like guide into the selection of DOE for MS experiments, the practices for modeling and optimization of response variables, and the available software tools that support DOE implementation in any laboratory. This review comes 3 years after the latest DOE review (Hibbert DB, 2012), which provided a comprehensive overview on the types of designs available and their statistical construction. Since that time, new classes of DOE, such as the definitive screening design, have emerged and new calls have been made for mass spectrometrists to adopt the practice. Rather than exhaustively cover all possible designs, we have highlighted the three most practical DOE classes available to mass spectrometrists. This review further differentiates itself by providing expert recommendations for experimental setup and defining DOE entirely in the context of three case-studies that highlight the utility of different designs to achieve different goals. A step-by-step tutorial is also provided.
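
    To give a flavour of the screening designs discussed in such tutorials, the sketch below constructs a 2^(4-1) resolution IV fractional factorial (generator D = ABC) in plain Python; the instrument parameters and level values are hypothetical examples, not recommendations from the review.

        import itertools

        # Half-fraction of a 2^4 design: factors A, B, C in full factorial,
        # with the fourth factor aliased to the generator D = ABC.
        base = list(itertools.product([-1, 1], repeat=3))
        design = [(a, b, c, a * b * c) for a, b, c in base]  # 8 runs, resolution IV

        # Hypothetical mapping of coded levels to instrument settings.
        levels = {
            "spray_voltage_kV": {-1: 2.5, 1: 3.5},
            "sheath_gas": {-1: 10, 1: 40},
            "capillary_temp_C": {-1: 250, 1: 325},
            "collision_energy": {-1: 20, 1: 35},
        }
        names = list(levels)
        for run, coded in enumerate(design, start=1):
            settings = {name: levels[name][c] for name, c in zip(names, coded)}
            print(run, settings)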

  14. Statistical Metamodeling and Sequential Design of Computer Experiments to Model Glyco-Altered Gating of Sodium Channels in Cardiac Myocytes.

    PubMed

    Du, Dongping; Yang, Hui; Ednie, Andrew R; Bennett, Eric S

    2016-09-01

    Glycan structures account for up to 35% of the mass of cardiac sodium (Nav) channels. To investigate whether and how reduced sialylation affects Nav activity and cardiac electrical signaling, we conducted a series of in vitro experiments on ventricular apex myocytes under two different glycosylation conditions, reduced protein sialylation (ST3Gal4(-/-)) and full glycosylation (control). Although aberrant electrical signaling is observed in reduced sialylation, realizing a better understanding of mechanistic details of pathological variations in INa and AP is difficult without performing in silico studies. However, computer models of Nav channels and cardiac myocytes involve greater levels of complexity, e.g., a high-dimensional parameter space and nonlinear, nonconvex equations. Traditional linear and nonlinear optimization methods have encountered many difficulties in model calibration. This paper presents a new statistical metamodeling approach for efficient computer experiments and optimization of Nav models. First, we utilize a fractional factorial design to identify control variables from the large set of model parameters, thereby reducing the dimensionality of the parameter space. Further, we develop the Gaussian process model as a surrogate of expensive and time-consuming computer models and then identify the next best design point that yields the maximal probability of improvement. This process iterates until convergence, and the performance is evaluated and validated with real-world experimental data. Experimental results show the proposed algorithm achieves superior performance in modeling the kinetics of Nav channels under a variety of glycosylation conditions. As a result, in silico models provide a better understanding of glyco-altered mechanistic details in state transitions and distributions of Nav channels. Notably, ST3Gal4(-/-) myocytes are shown to have higher probabilities accumulated in intermediate inactivation during repolarization and to yield a shorter refractory period than wild-type (WT) myocytes. The proposed statistical design of computer experiments is generally extensible to many other disciplines that involve large-scale and computationally expensive models.
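
    The surrogate-plus-acquisition loop described above can be sketched on a toy one-parameter problem as follows; the "expensive model", kernel choice, and candidate grid are stand-ins invented for illustration and are far simpler than the Nav channel models in the record.

        import numpy as np
        from scipy.stats import norm
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        def expensive_model(x):
            # Stand-in for a costly simulation: a calibration error to minimize.
            return (x - 0.3) ** 2 + 0.05 * np.sin(20 * x)

        rng = np.random.default_rng(0)
        X = rng.uniform(0, 1, 5).reshape(-1, 1)       # initial design points
        y = expensive_model(X).ravel()

        candidates = np.linspace(0, 1, 201).reshape(-1, 1)
        for _ in range(10):
            gp = GaussianProcessRegressor(ConstantKernel() * RBF(0.2),
                                          normalize_y=True).fit(X, y)
            mu, sd = gp.predict(candidates, return_std=True)
            best = y.min()
            # Probability of improving on the best observation by a small margin.
            pi = norm.cdf((best - 0.01 - mu) / np.maximum(sd, 1e-9))
            x_next = candidates[np.argmax(pi)]
            X = np.vstack([X, x_next])
            y = np.append(y, expensive_model(x_next[0]))

        print("best parameter:", X[np.argmin(y)][0], "error:", y.min())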

  15. A Bayesian approach to the statistical analysis of device preference studies.

    PubMed

    Fu, Haoda; Qu, Yongming; Zhu, Baojin; Huster, William

    2012-01-01

    Drug delivery devices are required to have excellent technical specifications to deliver drugs accurately, and, in addition, the devices should provide a satisfactory experience to patients because this can have a direct effect on drug compliance. To compare patients' experience with two devices, cross-over studies with patient-reported outcomes (PRO) as response variables are often used. Because of the strength of cross-over designs, each subject can directly compare the two devices by using the PRO variables, and variables indicating preference (preferring A, preferring B, or no preference) can be easily derived. Traditionally, methods based on frequentist statistics can be used to analyze such preference data, but these frequentist methods have some limitations. Recently, Bayesian methods have come to be considered acceptable by the US Food and Drug Administration for designing and analyzing device studies. In this paper, we propose a Bayesian statistical method to analyze the data from preference trials. We demonstrate that the new Bayesian estimator enjoys some optimal properties relative to the frequentist estimator. Copyright © 2012 John Wiley & Sons, Ltd.
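
    As a hedged illustration of a Bayesian treatment of such preference data (not the estimator proposed in the record), the counts below are given a Dirichlet-multinomial model and summarized by Monte Carlo; the counts and prior are invented.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical preference counts from a cross-over device study.
        counts = np.array([52, 31, 17])    # prefer A, prefer B, no preference
        prior = np.array([1.0, 1.0, 1.0])  # uniform Dirichlet prior

        # Posterior is Dirichlet(counts + prior); summarize by simulation.
        draws = rng.dirichlet(counts + prior, size=100_000)
        p_a, p_b = draws[:, 0], draws[:, 1]
        print("posterior mean preference for device A:", round(p_a.mean(), 3))
        print("P(prefer A > prefer B):", round((p_a > p_b).mean(), 3))
        lo, hi = np.quantile(p_a - p_b, [0.025, 0.975])
        print(f"95% credible interval for the difference: ({lo:.3f}, {hi:.3f})")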

  16. Design of experiments enhanced statistical process control for wind tunnel check standard testing

    NASA Astrophysics Data System (ADS)

    Phillips, Ben D.

    The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification, and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that, by utilizing design of experiments methodology in conjunction with current SPC practices, one can characterize uncertainties more efficiently and robustly and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients, and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research can be applicable to any wind tunnel check standard testing program.
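
    To make the "tracked coefficient" idea concrete, here is a minimal individuals/moving-range control chart in Python; the coefficient values and the shift are simulated assumptions rather than facility data, and 1.128 is the standard d2 constant for moving ranges of two.

        import numpy as np

        rng = np.random.default_rng(7)

        # Hypothetical series of one tracked coefficient estimate, one value
        # per check-standard test entry, with a simulated shift at test 25.
        coef = rng.normal(0.052, 0.0004, 30)
        coef[25:] += 0.0015

        # Individuals chart: center line and limits from the moving range.
        center = coef.mean()
        mr_bar = np.abs(np.diff(coef)).mean()
        sigma_hat = mr_bar / 1.128          # d2 constant for subgroups of two
        ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

        for t, value in enumerate(coef):
            flag = "OUT OF CONTROL" if value > ucl or value < lcl else ""
            print(f"test {t:2d}: {value:.5f} {flag}")
        print(f"CL={center:.5f}  LCL={lcl:.5f}  UCL={ucl:.5f}")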

  17. Experiment design for pilot identification in compensatory tracking tasks

    NASA Technical Reports Server (NTRS)

    Wells, W. R.

    1976-01-01

    A design criterion for input functions in laboratory tracking tasks resulting in efficient parameter estimation is formulated. The criterion is that the statistical correlations between pairs of parameters be reduced in order to minimize the problem of nonuniqueness in the extraction process. The effectiveness of the method is demonstrated for a lower order dynamic system.

  18. Under What Circumstances Does External Knowledge about the Correlation Structure Improve Power in Cluster Randomized Designs?

    ERIC Educational Resources Information Center

    Rhoads, Christopher

    2014-01-01

    Recent publications have drawn attention to the idea of utilizing prior information about the correlation structure to improve statistical power in cluster randomized experiments. Because power in cluster randomized designs is a function of many different parameters, it has been difficult for applied researchers to discern a simple rule explaining…

  19. A new u-statistic with superior design sensitivity in matched observational studies.

    PubMed

    Rosenbaum, Paul R

    2011-09-01

    In an observational or nonrandomized study of treatment effects, a sensitivity analysis indicates the magnitude of bias from unmeasured covariates that would need to be present to alter the conclusions of a naïve analysis that presumes adjustments for observed covariates suffice to remove all bias. The power of sensitivity analysis is the probability that it will reject a false hypothesis about treatment effects allowing for a departure from random assignment of a specified magnitude; in particular, if this specified magnitude is "no departure" then this is the same as the power of a randomization test in a randomized experiment. A new family of u-statistics is proposed that includes Wilcoxon's signed rank statistic but also includes other statistics with substantially higher power when a sensitivity analysis is performed in an observational study. Wilcoxon's statistic has high power to detect small effects in large randomized experiments-that is, it often has good Pitman efficiency-but small effects are invariably sensitive to small unobserved biases. Members of this family of u-statistics that emphasize medium to large effects can have substantially higher power in a sensitivity analysis. For example, in one situation with 250 pair differences that are Normal with expectation 1/2 and variance 1, the power of a sensitivity analysis that uses Wilcoxon's statistic is 0.08 while the power of another member of the family of u-statistics is 0.66. The topic is examined by performing a sensitivity analysis in three observational studies, using an asymptotic measure called the design sensitivity, and by simulating power in finite samples. The three examples are drawn from epidemiology, clinical medicine, and genetic toxicology. © 2010, The International Biometric Society.

  20. The statistics of identifying differentially expressed genes in Expresso and TM4: a comparison

    PubMed Central

    Sioson, Allan A; Mane, Shrinivasrao P; Li, Pinghua; Sha, Wei; Heath, Lenwood S; Bohnert, Hans J; Grene, Ruth

    2006-01-01

    Background: Analysis of DNA microarray data takes as input spot intensity measurements from scanner software and returns differential expression of genes between two conditions, together with a statistical significance assessment. This process typically consists of two steps: data normalization and identification of differentially expressed genes through statistical analysis. The Expresso microarray experiment management system implements these steps with a two-stage, log-linear ANOVA mixed model technique, tailored to individual experimental designs. The complement of tools in TM4, on the other hand, is based on a number of preset design choices that limit its flexibility. In the TM4 microarray analysis suite, normalization, filter, and analysis methods form an analysis pipeline. TM4 computes integrated intensity values (IIV) from the average intensities and spot pixel counts returned by the scanner software as input to its normalization steps. By contrast, Expresso can use either IIV data or median intensity values (MIV). Here, we compare Expresso and TM4 analysis of two experiments and assess the results against qRT-PCR data. Results: The Expresso analysis using MIV data consistently identifies more genes as differentially expressed, when compared to Expresso analysis with IIV data. The typical TM4 normalization and filtering pipeline corrects systematic intensity-specific bias on a per microarray basis. Subsequent statistical analysis with Expresso or a TM4 t-test can effectively identify differentially expressed genes. The best agreement with qRT-PCR data is obtained through the use of Expresso analysis and MIV data. Conclusion: The results of this research are of practical value to biologists who analyze microarray data sets. The TM4 normalization and filtering pipeline corrects microarray-specific systematic bias and complements the normalization stage in Expresso analysis. The results of Expresso using MIV data have the best agreement with qRT-PCR results. In one experiment, MIV is a better choice than IIV as input to data normalization and statistical analysis methods, as it yields a greater number of statistically significant differentially expressed genes; TM4 does not support the choice of MIV input data. Overall, the more flexible and extensive statistical models of Expresso achieve more accurate analytical results, when judged by the yardstick of qRT-PCR data, in the context of an experimental design of modest complexity. PMID:16626497

  1. Results from the HARP Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Catanesi, M. G.

    2008-02-21

    Hadron production is a key ingredient in many aspects of ν physics. Precise prediction of atmospheric ν fluxes, characterization of accelerator ν beams, and quantification of π production and capture for ν-factory designs would all profit from hadron production measurements. HARP at the CERN PS was the first hadron production experiment designed specifically to match all these requirements. It combines a large, full phase space acceptance with low systematic errors and high statistics. HARP was operated in the range from 3 GeV to 15 GeV. We briefly describe here the most recent results.

  2. Modified Bayesian Kriging for Noisy Response Problems for Reliability Analysis

    DTIC Science & Technology

    2015-01-01

    52242, USA nicholas-gaul@uiowa.edu Mary Kathryn Cowles Department of Statistics & Actuarial Science College of Liberal Arts and Sciences, The...Forrester, A. I. J., & Keane, A. J. (2009). Recent advances in surrogate-based optimization. Progress in Aerospace Sciences, 45(1–3), 50-79. doi...Wiley. [27] Sacks, J., Welch, W. J., Toby J. Mitchell, & Wynn, H. P. (1989). Design and analysis of computer experiments. Statistical Science, 4

  3. Data mining and statistical inference in selective laser melting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamath, Chandrika

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.

  4. Data mining and statistical inference in selective laser melting

    DOE PAGES

    Kamath, Chandrika

    2016-01-11

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.
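
    One of the "improved sampling techniques" mentioned in these two records is space-filling sampling of the design space; the sketch below draws a Latin hypercube sample over three process parameters using SciPy's quasi-Monte Carlo module (available in SciPy 1.7 and later). The parameter names and ranges are hypothetical, not values from the paper.

        import numpy as np
        from scipy.stats import qmc

        # Hypothetical SLM process-parameter ranges: laser power (W),
        # scan speed (mm/s), hatch spacing (mm).
        lower = np.array([150.0, 500.0, 0.08])
        upper = np.array([400.0, 2500.0, 0.16])

        sampler = qmc.LatinHypercube(d=3, seed=0)
        unit_samples = sampler.random(n=20)      # 20 space-filling runs in [0, 1)^3
        runs = qmc.scale(unit_samples, lower, upper)

        for power, speed, hatch in runs:
            print(f"power={power:6.1f} W  speed={speed:7.1f} mm/s  hatch={hatch:.3f} mm")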

  5. An Analysis of Operational Suitability for Test and Evaluation of Highly Reliable Systems

    DTIC Science & Technology

    1994-03-04

    Exposition," Journal of the American Statistical Association 59: 353-375 (June 1964). 17. SYS 229, Test and Evaluation Management Coursebook, School of Systems...in hours, θ is the desired MTBCF in hours, R is the number of critical failures, and α is the P[type-I error] of the χ² statistic with 2R+2...design of experiments (DOE) tables and the use of Bayesian statistics to increase the confidence level of the test results that will be obtained from
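
    The garbled passage above appears to reference the standard chi-square relation for time-truncated reliability demonstration tests. As a hedged illustration (one common form, not necessarily the report's exact procedure), the total test time required to demonstrate an MTBCF of θ at confidence 1 − α while allowing up to R critical failures is θ·χ²(1−α, 2R+2)/2:

        from scipy.stats import chi2

        def required_test_hours(theta: float, r: int, alpha: float = 0.10) -> float:
            """Total test time for a time-truncated demonstration that
            MTBCF >= theta at confidence 1 - alpha, allowing up to r failures."""
            return theta * chi2.ppf(1.0 - alpha, 2 * r + 2) / 2.0

        # Example: demonstrate a 500-hour MTBCF at 90% confidence, allowing
        # up to 2 critical failures during the test.
        print(round(required_test_hours(500.0, 2, 0.10), 1))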

  6. SSSFD manipulator engineering using statistical experiment design techniques

    NASA Technical Reports Server (NTRS)

    Barnes, John

    1991-01-01

    The Satellite Servicer System Flight Demonstration (SSSFD) program is a series of Shuttle flights designed to verify major on-orbit satellite servicing capabilities, such as rendezvous and docking of free flyers, Orbital Replacement Unit (ORU) exchange, and fluid transfer. A major part of this system is the manipulator system that will perform the ORU exchange. The manipulator must possess adequate toolplate dexterity to maneuver a variety of EVA-type tools into position to interface with ORU fasteners, connectors, latches, and handles on the satellite, and to move workpieces and ORUs through 6 degree of freedom (dof) space from the Target Vehicle (TV) to the Support Module (SM) and back. Two cost efficient tools were combined to perform a study of robot manipulator design parameters. These tools are graphical computer simulations and Taguchi Design of Experiment methods. Using a graphics platform, an off-the-shelf robot simulation software package, and an experiment designed with Taguchi's approach, the sensitivities of various manipulator kinematic design parameters to performance characteristics are determined with minimal cost.

  7. Designing Intervention Studies: Selected Populations, Range Restrictions, and Statistical Power

    PubMed Central

    Miciak, Jeremy; Taylor, W. Pat; Stuebing, Karla K.; Fletcher, Jack M.; Vaughn, Sharon

    2016-01-01

    An appropriate estimate of statistical power is critical for the design of intervention studies. Although the inclusion of a pretest covariate in the test of the primary outcome can increase statistical power, samples selected on the basis of pretest performance may demonstrate range restriction on the selection measure and other correlated measures. This can result in attenuated pretest-posttest correlations, reducing the variance explained by the pretest covariate. We investigated the implications of two potential range restriction scenarios: direct truncation on a selection measure and indirect range restriction on correlated measures. Empirical and simulated data indicated direct range restriction on the pretest covariate greatly reduced statistical power and necessitated sample size increases of 82%–155% (dependent on selection criteria) to achieve equivalent statistical power to parameters with unrestricted samples. However, measures demonstrating indirect range restriction required much smaller sample size increases (32%–71%) under equivalent scenarios. Additional analyses manipulated the correlations between measures and pretest-posttest correlations to guide planning experiments. Results highlight the need to differentiate between selection measures and potential covariates and to investigate range restriction as a factor impacting statistical power. PMID:28479943

  8. Designing Intervention Studies: Selected Populations, Range Restrictions, and Statistical Power.

    PubMed

    Miciak, Jeremy; Taylor, W Pat; Stuebing, Karla K; Fletcher, Jack M; Vaughn, Sharon

    2016-01-01

    An appropriate estimate of statistical power is critical for the design of intervention studies. Although the inclusion of a pretest covariate in the test of the primary outcome can increase statistical power, samples selected on the basis of pretest performance may demonstrate range restriction on the selection measure and other correlated measures. This can result in attenuated pretest-posttest correlations, reducing the variance explained by the pretest covariate. We investigated the implications of two potential range restriction scenarios: direct truncation on a selection measure and indirect range restriction on correlated measures. Empirical and simulated data indicated direct range restriction on the pretest covariate greatly reduced statistical power and necessitated sample size increases of 82%-155% (dependent on selection criteria) to achieve equivalent statistical power to parameters with unrestricted samples. However, measures demonstrating indirect range restriction required much smaller sample size increases (32%-71%) under equivalent scenarios. Additional analyses manipulated the correlations between measures and pretest-posttest correlations to guide planning experiments. Results highlight the need to differentiate between selection measures and potential covariates and to investigate range restriction as a factor impacting statistical power.
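
    The attenuation mechanism described in the two records above is easy to demonstrate numerically; in the sketch below the pool size, correlation, truncation rule, and effect size are invented, and the per-group sample-size formula is the usual ANCOVA approximation in which the pretest covariate removes a fraction r² of the outcome variance.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(3)

        # Hypothetical screening pool with a pretest-posttest correlation of 0.7.
        rho = 0.7
        pre = rng.normal(size=20_000)
        post = rho * pre + np.sqrt(1 - rho**2) * rng.normal(size=20_000)

        # Direct truncation: keep only the lowest-scoring 25% on the pretest.
        sel = pre < np.quantile(pre, 0.25)
        r_full = np.corrcoef(pre, post)[0, 1]
        r_trunc = np.corrcoef(pre[sel], post[sel])[0, 1]
        print(f"pretest-posttest correlation: full {r_full:.2f}, truncated {r_trunc:.2f}")

        # Approximate per-group n for detecting a standardized effect d with an
        # ANCOVA-style test, given pretest-posttest correlation r.
        def n_per_group(d, r, alpha=0.05, power=0.80):
            z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
            return 2 * (1 - r**2) * (z / d) ** 2

        print("n per group, planning with the full-range correlation:",
              round(n_per_group(0.4, r_full)))
        print("n per group, planning with the truncated correlation:",
              round(n_per_group(0.4, r_trunc)))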

  9. Optimization of Soluble Expression and Purification of Recombinant Human Rhinovirus Type-14 3C Protease Using Statistically Designed Experiments: Isolation and Characterization of the Enzyme.

    PubMed

    Antoniou, Georgia; Papakyriacou, Irineos; Papaneophytou, Christos

    2017-10-01

    Human rhinovirus (HRV) 3C protease is widely used in recombinant protein production for various applications such as biochemical characterization and structural biology projects to separate recombinant fusion proteins from their affinity tags in order to prevent interference between these tags and the target proteins. Herein, we report the optimization of expression and purification conditions of glutathione S-transferase (GST)-tagged HRV 3C protease by statistically designed experiments. Soluble expression of GST-HRV 3C protease was initially optimized by response surface methodology (RSM), and a 5.5-fold increase in enzyme yield was achieved. Subsequently, we developed a new incomplete factorial (IF) design that examines four variables (bacterial strain, expression temperature, induction time, and inducer concentration) in a single experiment. The new design called Incomplete Factorial-Strain/Temperature/Time/Inducer (IF-STTI) was validated using three GST-tagged proteins. In all cases, IF-STTI resulted in only 10% lower expression yields than those obtained by RSM. Purification of GST-HRV 3C was optimized by an IF design that examines simultaneously the effect of the amount of resin, incubation time of cell lysate with resin, and glycerol and DTT concentration in buffers, and a further 15% increase in protease recovery was achieved. Purified GST-HRV 3C protease was active at both 4 and 25 °C in a variety of buffers.

  10. Changes in compensatory eye movements associated with simulated stimulus conditions of spaceflight

    NASA Technical Reports Server (NTRS)

    Harm, Deborah L.; Zografos, Linda M.; Skinner, Noel C.; Parker, Donald E.

    1993-01-01

    Compensatory vertical eye movement gain (CVEMG) was recorded during pitch oscillation in darkness before, during, and immediately after exposures to the stimulus rearrangement produced by the Preflight Adaptation Trainer (PAT) Tilt-Translation Device (TTD). The TTD is designed to elicit adaptive responses that are similar to those observed in microgravity-adapted astronauts. The data from Experiment 1 yielded a statistically significant CVEMG decrease following 15 min of exposure to a stimulus rearrangement condition where the phase angle between subject pitch tilt and visual scene translation was 270 deg; statistically significant gain decreases were not observed following exposures either to a condition where the phase angle between subject pitch and scene translation was 90 deg or to a no-stimulus-rearrangement condition. Experiment 2 replicated the 270-deg-phase condition from Experiment 1 and extended the exposure duration from 30 to 45 min. Statistically significant additional changes in CVEMG associated with the increased exposure duration were not observed. The adaptation time constant estimated from the combined data from Experiments 1 and 2 was 29 min.

  11. Changes in Compensatory Eye Movements Associated with Simulated Stimulus Conditions of Spaceflight

    NASA Technical Reports Server (NTRS)

    Harm, Deborah L.; Zografos, Linda M.; Skinner, Noel C.; Parker, Donald E.

    1993-01-01

    Compensatory vertical eye movement gain (CVEMG) was recorded during pitch oscillation in darkness before, during and immediately after exposures to the stimulus rearrangement produced by the Preflight Adaptation Trainer (PAT) Tilt-Translation Device (TTD). The TTD is designed to elicit adaptive responses that are similar to those observed in microgravity-adapted astronauts. The data from Experiment 1 yielded a statistically significant CVEMG decrease following 15 minutes of exposure to a stimulus rearrangement condition where the phase angle between subject pitch tilt and visual scene translation was 270 degrees; statistically significant gain decreases were not observed following exposures either to a condition where the phase angle between subject pitch and scene translation was 90 degrees or to a no-stimulus-rearrangement condition. Experiment 2 replicated the 270 degree phase condition from Experiment 1 and extended the exposure duration from 30 to 45 minutes. Statistically significant additional changes in CVEMG associated with the increased exposure duration were not observed. The adaptation time constant estimated from the combined data from Experiments 1 and 2 was 29 minutes.

  12. Design of experiments (DoE) in pharmaceutical development.

    PubMed

    N Politis, Stavros; Colombo, Paolo; Colombo, Gaia; M Rekkas, Dimitrios

    2017-06-01

    At the beginning of the twentieth century, Sir Ronald Fisher introduced the concept of applying statistical analysis during the planning stages of research rather than at the end of experimentation. When statistical thinking is applied from the design phase, it enables quality to be built into the product by adopting Deming's profound knowledge approach, comprising systems thinking, variation understanding, theory of knowledge, and psychology. The pharmaceutical industry was late in adopting these paradigms, compared to other sectors. It heavily focused on blockbuster drugs, while formulation development was mainly performed by One Factor At a Time (OFAT) studies, rather than implementing Quality by Design (QbD) and modern engineering-based manufacturing methodologies. Among various mathematical modeling approaches, Design of Experiments (DoE) is extensively used for the implementation of QbD in both research and industrial settings. In QbD, product and process understanding is the key enabler of assuring quality in the final product. Knowledge is achieved by establishing models correlating the inputs with the outputs of the process. The mathematical relationships of the Critical Process Parameters (CPPs) and Critical Material Attributes (CMAs) with the Critical Quality Attributes (CQAs) define the design space. Consequently, process understanding is well assured and rationally leads to a final product meeting the Quality Target Product Profile (QTPP). This review illustrates the principles of quality theory through the work of major contributors, the evolution of the QbD approach and the statistical toolset for its implementation. As such, DoE is presented in detail since it represents the first choice for rational pharmaceutical development.

  13. Do labeled versus unlabeled treatments of alternatives' names influence stated choice outputs? Results from a mode choice study.

    PubMed

    Jin, Wen; Jiang, Hai; Liu, Yimin; Klampfl, Erica

    2017-01-01

    Discrete choice experiments have been widely applied to elicit behavioral preferences in the literature. In many of these experiments, the alternatives are named alternatives, meaning that they are naturally associated with specific names. For example, in a mode choice study, the alternatives can be associated with names such as car, taxi, bus, and subway. A fundamental issue that arises in stated choice experiments is whether to treat the alternatives' names as labels (that is, labeled treatment), or as attributes (that is, unlabeled treatment) in the design as well as the presentation phases of the choice sets. In this research, we investigate the impact of labeled versus unlabeled treatments of alternatives' names on the outcome of stated choice experiments, a question that has not been thoroughly investigated in the literature. Using results from a mode choice study, we find that the labeled or the unlabeled treatment of alternatives' names in either the design or the presentation phase of the choice experiment does not statistically affect the estimates of the coefficient parameters. We then proceed to measure the influence on the willingness-to-pay (WTP) estimates. By using a random-effects model to relate the conditional WTP estimates to the socioeconomic characteristics of the individuals and the labeled versus unlabeled treatments of alternatives' names, we find that: a) Given the treatment of alternatives' names in the presentation phase, the treatment of alternatives' names in the design phase does not statistically affect the estimates of the WTP measures; and b) Given the treatment of alternatives' names in the design phase, the labeled treatment of alternatives' names in the presentation phase causes the corresponding WTP estimates to be slightly higher.

  14. Do labeled versus unlabeled treatments of alternatives’ names influence stated choice outputs? Results from a mode choice study

    PubMed Central

    Jin, Wen; Jiang, Hai; Liu, Yimin; Klampfl, Erica

    2017-01-01

    Discrete choice experiments have been widely applied to elicit behavioral preferences in the literature. In many of these experiments, the alternatives are named alternatives, meaning that they are naturally associated with specific names. For example, in a mode choice study, the alternatives can be associated with names such as car, taxi, bus, and subway. A fundamental issue that arises in stated choice experiments is whether to treat the alternatives’ names as labels (that is, labeled treatment), or as attributes (that is, unlabeled treatment) in the design as well as the presentation phases of the choice sets. In this research, we investigate the impact of labeled versus unlabeled treatments of alternatives’ names on the outcome of stated choice experiments, a question that has not been thoroughly investigated in the literature. Using results from a mode choice study, we find that the labeled or the unlabeled treatment of alternatives’ names in either the design or the presentation phase of the choice experiment does not statistically affect the estimates of the coefficient parameters. We then proceed to measure the influence on the willingness-to-pay (WTP) estimates. By using a random-effects model to relate the conditional WTP estimates to the socioeconomic characteristics of the individuals and the labeled versus unlabeled treatments of alternatives’ names, we find that: a) Given the treatment of alternatives’ names in the presentation phase, the treatment of alternatives’ names in the design phase does not statistically affect the estimates of the WTP measures; and b) Given the treatment of alternatives’ names in the design phase, the labeled treatment of alternatives’ names in the presentation phase causes the corresponding WTP estimates to be slightly higher. PMID:28806764

  15. Optimally designing games for behavioural research

    PubMed Central

    Rafferty, Anna N.; Zaharia, Matei; Griffiths, Thomas L.

    2014-01-01

    Computer games can be motivating and engaging experiences that facilitate learning, leading to their increasing use in education and behavioural experiments. For these applications, it is often important to make inferences about the knowledge and cognitive processes of players based on their behaviour. However, designing games that provide useful behavioural data is a difficult task that typically requires significant trial and error. We address this issue by creating a new formal framework that extends optimal experiment design, used in statistics, to apply to game design. In this framework, we use Markov decision processes to model players' actions within a game, and then make inferences about the parameters of a cognitive model from these actions. Using a variety of concept learning games, we show that in practice, this method can predict which games will result in better estimates of the parameters of interest. The best games require only half as many players to attain the same level of precision. PMID:25002821
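
    The record above frames game design as an optimal experiment design problem using MDP models of players. A much simpler toy sketch of the underlying principle is shown below: two candidate "games" (sequences of item difficulties) are compared by the expected posterior variance of a single ability parameter. The item-response model, prior, and difficulty values are invented for illustration and are not the paper's framework.

        import numpy as np

        rng = np.random.default_rng(0)
        grid = np.linspace(-4, 4, 401)                 # grid over ability theta
        prior = np.exp(-grid**2 / 2)
        prior /= prior.sum()

        def p_correct(theta, difficulty):
            return 1.0 / (1.0 + np.exp(-(theta - difficulty)))

        def expected_posterior_var(difficulties, sims=2000):
            """Average posterior variance of theta after one simulated playthrough."""
            total = 0.0
            for _ in range(sims):
                theta = rng.choice(grid, p=prior)       # simulate a player
                post = prior.copy()
                for b in difficulties:
                    y = rng.random() < p_correct(theta, b)
                    post *= p_correct(grid, b) if y else 1 - p_correct(grid, b)
                    post /= post.sum()
                mean = (grid * post).sum()
                total += ((grid - mean) ** 2 * post).sum()
            return total / sims

        easy_game = [-2.0] * 8                          # 8 very easy levels
        matched_game = [-1.0, -0.5, 0.0, 0.0, 0.5, 1.0, -0.5, 0.5]
        print("expected posterior variance, easy game:   ",
              round(expected_posterior_var(easy_game), 3))
        print("expected posterior variance, matched game:",
              round(expected_posterior_var(matched_game), 3))
        # The design yielding the smaller expected posterior variance extracts
        # more information per player about the parameter of interest.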

  16. Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: a primer and applications.

    PubMed

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E

    2014-04-01

    This article presents a d-statistic for single-case designs that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between case variance to total variance (between case plus within case variance), how to compute power using a macro, and how to use the d to conduct a meta-analysis of studies using single-case designs in the free program R, including syntax in an appendix. This syntax includes how to read data, compute fixed and random effect average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and do various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  17. Effects of Learning Style and Training Method on Computer Attitude and Performance in World Wide Web Page Design Training.

    ERIC Educational Resources Information Center

    Chou, Huey-Wen; Wang, Yu-Fang

    1999-01-01

    Compares the effects of two training methods on computer attitude and performance in a World Wide Web page design program in a field experiment with high school students in Taiwan. Discusses individual differences, Kolb's Experiential Learning Theory and Learning Style Inventory, Computer Attitude Scale, and results of statistical analyses.…

  18. Exploring Customization in Higher Education: An Experiment in Leveraging Computer Spreadsheet Technology to Deliver Highly Individualized Online Instruction to Undergraduate Business Students

    ERIC Educational Resources Information Center

    Kunzler, Jayson S.

    2012-01-01

    This dissertation describes a research study designed to explore whether customization of online instruction results in improved learning in a college business statistics course. The study involved utilizing computer spreadsheet technology to develop an intelligent tutoring system (ITS) designed to: a) collect and monitor individual real-time…

  19. Analytical procedure validation and the quality by design paradigm.

    PubMed

    Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno

    2015-01-01

    Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure developments to follow a similar approach. While development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims to show that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it, a control strategy can be set.

  20. Risk-Adjusted Returns and Stock Market Games.

    ERIC Educational Resources Information Center

    Kagan, Gary; And Others

    1995-01-01

    Maintains that stock market games are designed to provide students with a background for investing in securities, especially stocks. Reviews two games used with secondary students, analyzes statistical data from these experiences, and considers weaknesses in the games. (CFR)

  1. The use of statistical tools in field testing of putative effects of genetically modified plants on nontarget organisms

    PubMed Central

    Semenov, Alexander V; Elsas, Jan Dirk; Glandorf, Debora C M; Schilthuizen, Menno; Boer, Willem F

    2013-01-01

    To fulfill existing guidelines, applicants that aim to place their genetically modified (GM) insect-resistant crop plants on the market are required to provide data from field experiments that address the potential impacts of the GM plants on nontarget organisms (NTO's). Such data may be based on varied experimental designs. The recent EFSA guidance document for environmental risk assessment (2010) does not provide clear and structured suggestions that address the statistics of field trials on effects on NTO's. This review examines existing practices in GM plant field testing such as the way of randomization, replication, and pseudoreplication. Emphasis is placed on the importance of design features used for the field trials in which effects on NTO's are assessed. The importance of statistical power and the positive and negative aspects of various statistical models are discussed. Equivalence and difference testing are compared, and the importance of checking the distribution of experimental data is stressed to decide on the selection of the proper statistical model. While for continuous data (e.g., pH and temperature) classical statistical approaches – for example, analysis of variance (ANOVA) – are appropriate, for discontinuous data (counts) only generalized linear models (GLM) are shown to be efficient. There is no golden rule as to which statistical test is the most appropriate for any experimental situation. In particular, in experiments in which block designs are used and covariates play a role GLMs should be used. Generic advice is offered that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in this testing. The combination of decision trees and a checklist for field trials, which are provided, will help in the interpretation of the statistical analyses of field trials and to assess whether such analyses were correctly applied. We offer generic advice to risk assessors and applicants that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in field testing. PMID:24567836

  2. The use of statistical tools in field testing of putative effects of genetically modified plants on nontarget organisms.

    PubMed

    Semenov, Alexander V; Elsas, Jan Dirk; Glandorf, Debora C M; Schilthuizen, Menno; Boer, Willem F

    2013-08-01

    To fulfill existing guidelines, applicants that aim to place their genetically modified (GM) insect-resistant crop plants on the market are required to provide data from field experiments that address the potential impacts of the GM plants on nontarget organisms (NTO's). Such data may be based on varied experimental designs. The recent EFSA guidance document for environmental risk assessment (2010) does not provide clear and structured suggestions that address the statistics of field trials on effects on NTO's. This review examines existing practices in GM plant field testing such as the way of randomization, replication, and pseudoreplication. Emphasis is placed on the importance of design features used for the field trials in which effects on NTO's are assessed. The importance of statistical power and the positive and negative aspects of various statistical models are discussed. Equivalence and difference testing are compared, and the importance of checking the distribution of experimental data is stressed to decide on the selection of the proper statistical model. While for continuous data (e.g., pH and temperature) classical statistical approaches - for example, analysis of variance (ANOVA) - are appropriate, for discontinuous data (counts) only generalized linear models (GLM) are shown to be efficient. There is no golden rule as to which statistical test is the most appropriate for any experimental situation. In particular, in experiments in which block designs are used and covariates play a role GLMs should be used. Generic advice is offered that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in this testing. The combination of decision trees and a checklist for field trials, which are provided, will help in the interpretation of the statistical analyses of field trials and to assess whether such analyses were correctly applied. We offer generic advice to risk assessors and applicants that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in field testing.
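
    To illustrate the GLM recommendation for count data in these two records, here is a minimal Poisson GLM fit with statsmodels on simulated randomized-block counts; the taxon, block effects, and treatment effect are invented, and a negative binomial family is the usual substitution when counts are overdispersed.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(11)

        # Hypothetical field-trial counts of a nontarget arthropod taxon:
        # 4 blocks x 2 treatments (GM vs. comparator) x 6 plots each.
        rows = []
        for block in range(4):
            for treatment in ("comparator", "GM"):
                lam = np.exp(2.0 + 0.15 * block + (0.05 if treatment == "GM" else 0.0))
                for _ in range(6):
                    rows.append({"block": f"B{block}", "treatment": treatment,
                                 "count": rng.poisson(lam)})
        data = pd.DataFrame(rows)

        # Poisson GLM with block as a design factor; for overdispersed counts,
        # swap in sm.families.NegativeBinomial().
        fit = smf.glm("count ~ treatment + block", data=data,
                      family=sm.families.Poisson()).fit()
        print(fit.summary().tables[1])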

  3. Assessing Statistical Competencies in Clinical and Translational Science Education: One Size Does Not Fit All

    PubMed Central

    Lindsell, Christopher J.; Welty, Leah J.; Mazumdar, Madhu; Thurston, Sally W.; Rahbar, Mohammad H.; Carter, Rickey E.; Pollock, Bradley H.; Cucchiara, Andrew J.; Kopras, Elizabeth J.; Jovanovic, Borko D.; Enders, Felicity T.

    2014-01-01

    Introduction: Statistics is an essential training component for a career in clinical and translational science (CTS). Given the increasing complexity of statistics, learners may have difficulty selecting appropriate courses. Our question was: what depth of statistical knowledge do different CTS learners require? Methods: For three types of CTS learners (principal investigator, co‐investigator, informed reader of the literature), each with different backgrounds in research (no previous research experience, reader of the research literature, previous research experience), 18 experts in biostatistics, epidemiology, and research design proposed levels for 21 statistical competencies. Results: Statistical competencies were categorized as fundamental, intermediate, or specialized. CTS learners who intend to become independent principal investigators require more specialized training, while those intending to become informed consumers of the medical literature require more fundamental education. For most competencies, less training was proposed for those with more research background. Discussion: When selecting statistical coursework, the learner's research background and career goal should guide the decision. Some statistical competencies are considered to be more important than others. Baseline knowledge assessments may help learners identify appropriate coursework. Conclusion: Rather than one size fits all, tailoring education to baseline knowledge, learner background, and future goals increases learning potential while minimizing classroom time. PMID:25212569

  4. Factorial analysis of trihalomethanes formation in drinking water.

    PubMed

    Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James

    2010-06-01

    Disinfection of drinking water reduces pathogenic infection, but may pose risks to human health through the formation of disinfection byproducts. The effects of different factors on the formation of trihalomethanes were investigated using a statistically designed experimental program, and a predictive model for trihalomethanes formation was developed. Synthetic water samples with different factor levels were produced, and trihalomethanes concentrations were measured. A replicated fractional factorial design with center points was performed, and significant factors were identified through statistical analysis. A second-order trihalomethanes formation model was developed from 92 experiments, and its statistical adequacy was assessed through appropriate diagnostics. This model was validated using additional data from the Drinking Water Surveillance Program database and was applied to the Smiths Falls water supply system in Ontario, Canada. The model predictions were strongly correlated with the measured trihalomethanes, with correlation coefficients of 0.95 and 0.91, respectively. The resulting model can assist in analyzing risk-cost tradeoffs in the design and operation of water supply systems.
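
    A minimal sketch of fitting a second-order model of the kind described above is given below, assuming a small central composite design in two coded factors; the factor names, design, and simulated responses are illustrative and are not the 92-run design or data from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical central composite design in two coded factors (x1, x2 are
# illustrative stand-ins for, e.g., chlorine dose and reaction time).
alpha = np.sqrt(2.0)
factorial = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
axial = [(-alpha, 0), (alpha, 0), (0, -alpha), (0, alpha)]
center = [(0, 0)] * 4
design = pd.DataFrame(factorial + axial + center, columns=["x1", "x2"])

# Simulated response from a known quadratic surface plus noise (illustration only).
rng = np.random.default_rng(7)
design["y"] = (30 + 5 * design.x1 + 3 * design.x2 + 2 * design.x1 * design.x2
               - 1.5 * design.x1 ** 2 - 0.8 * design.x2 ** 2
               + rng.normal(0, 1, size=len(design)))

# Second-order response-surface model: main effects, interaction, and quadratics.
fit = smf.ols("y ~ x1 + x2 + x1:x2 + I(x1**2) + I(x2**2)", data=design).fit()
print(fit.summary())
```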

  5. LADES: a software for constructing and analyzing longitudinal designs in biomedical research.

    PubMed

    Vázquez-Alcocer, Alan; Garzón-Cortes, Daniel Ladislao; Sánchez-Casas, Rosa María

    2014-01-01

    One of the most important steps in biomedical longitudinal studies is choosing a good experimental design that can provide high accuracy in the analysis of results with a minimum sample size. Several methods for constructing efficient longitudinal designs have been developed based on power analysis and the statistical model used for analyzing the final results. However, this technology has not been made available to practitioners through user-friendly software. In this paper we introduce LADES (Longitudinal Analysis and Design of Experiments Software) as an alternative and easy-to-use tool for conducting longitudinal analysis and constructing efficient longitudinal designs. LADES incorporates methods for creating cost-efficient longitudinal designs, unequal longitudinal designs, and simple longitudinal designs. In addition, LADES includes different methods for analyzing longitudinal data, such as linear mixed models and generalized estimating equations. A study of European eels is reanalyzed in order to show LADES capabilities. Three treatments contained in three aquariums with five eels each were analyzed. Data were collected from week 0 up to the 12th week post treatment for all the eels (complete design). The response under evaluation is sperm volume. A linear mixed model was fitted to the results using LADES. The complete design had a power of 88.7% using 15 eels. With LADES we propose the use of an unequal design with only 14 eels and 89.5% efficiency. LADES was developed as a powerful and simple tool to promote the use of statistical methods for analyzing and creating longitudinal experiments in biomedical research.
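
    As a rough illustration of the kind of linear mixed model analysis mentioned above (not the LADES software itself), the sketch below fits a random-intercept model to invented longitudinal data laid out like the eel study: repeated weekly measurements nested within subjects assigned to treatments.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Invented longitudinal data laid out like the eel study: 15 subjects split
# across 3 treatments, measured weekly from week 0 to week 12.
rng = np.random.default_rng(3)
rows = []
for subj in range(15):
    treat = ["A", "B", "C"][subj % 3]
    subj_effect = rng.normal(0.0, 1.0)          # subject-specific random intercept
    for week in range(13):
        volume = (2.0 + 0.3 * week + {"A": 0.0, "B": 0.5, "C": 1.0}[treat]
                  + subj_effect + rng.normal(0.0, 0.5))
        rows.append({"subject": subj, "week": week,
                     "treatment": treat, "volume": volume})
df = pd.DataFrame(rows)

# Linear mixed model: fixed effects for treatment and time, random intercept
# per subject to account for the repeated measurements.
fit = smf.mixedlm("volume ~ treatment + week", data=df, groups=df["subject"]).fit()
print(fit.summary())
```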

  6. Testing for Mutagens Using Fruit Flies.

    ERIC Educational Resources Information Center

    Liebl, Eric C.

    1998-01-01

    Describes a laboratory employed in undergraduate teaching that uses fruit flies to test student-selected compounds for their ability to cause mutations. Requires no prior experience with fruit flies, incorporates a student design component, and employs both rigorous controls and statistical analyses. (DDR)

  7. Factorial Experiments: Efficient Tools for Evaluation of Intervention Components

    PubMed Central

    Collins, Linda M.; Dziak, John J.; Kugler, Kari C.; Trail, Jessica B.

    2014-01-01

    Background An understanding of the individual and combined effects of a set of intervention components is important for moving the science of preventive medicine interventions forward. This understanding can often be achieved in an efficient and economical way via a factorial experiment, in which two or more independent variables are manipulated. The factorial experiment is a complement to the randomized controlled trial (RCT); the two designs address different research questions. Purpose This article offers an introduction to factorial experiments aimed at investigators trained primarily in the RCT. Method The factorial experiment is compared and contrasted with other experimental designs used commonly in intervention science to highlight where each is most efficient and appropriate. Results Several points are made: factorial experiments make very efficient use of experimental subjects when the data are properly analyzed; a factorial experiment can have excellent statistical power even if it has relatively few subjects per experimental condition; and when conducting research to select components for inclusion in a multicomponent intervention, interactions should be studied rather than avoided. Conclusions Investigators in preventive medicine and related areas should begin considering factorial experiments alongside other approaches. Experimental designs should be chosen from a resource management perspective, which states that the best experimental design is the one that provides the greatest scientific benefit without exceeding available resources. PMID:25092122
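
    A brief sketch of how such a factorial experiment is analyzed is given below; the component names, effect sizes, and data are invented for illustration. With effect coding, every observation contributes to the estimate of every main effect and interaction, which is the efficiency argument made above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical 2x2x2 factorial experiment on three intervention components
# (component names, effect sizes, and outcomes are invented).
rng = np.random.default_rng(11)
levels = [-1, 1]
design = pd.DataFrame(
    [(a, b, c) for a in levels for b in levels for c in levels] * 10,
    columns=["counseling", "reminders", "incentive"],
)
design["outcome"] = (5 + 1.0 * design.counseling + 0.6 * design.reminders
                     + 0.4 * design.counseling * design.reminders
                     + rng.normal(0, 1, size=len(design)))

# With -1/+1 effect coding, every observation contributes to every main-effect
# and interaction estimate, which is what makes the design so efficient.
fit = smf.ols("outcome ~ counseling * reminders * incentive", data=design).fit()
print(fit.summary())
```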

  8. fMRI reliability: influences of task and experimental design.

    PubMed

    Bennett, Craig M; Miller, Michael B

    2013-12-01

    As scientists, it is imperative that we understand not only the power of our research tools to yield results, but also their ability to obtain similar results over time. This study is an investigation into how common decisions made during the design and analysis of a functional magnetic resonance imaging (fMRI) study can influence the reliability of the statistical results. To that end, we gathered back-to-back test-retest fMRI data during an experiment involving multiple cognitive tasks (episodic recognition and two-back working memory) and multiple fMRI experimental designs (block, event-related genetic sequence, and event-related m-sequence). Using these data, we were able to investigate the relative influences of task, design, statistical contrast (task vs. rest, target vs. nontarget), and statistical thresholding (unthresholded, thresholded) on fMRI reliability, as measured by the intraclass correlation (ICC) coefficient. We also utilized data from a second study to investigate test-retest reliability after an extended, six-month interval. We found that all of the factors above were statistically significant, but that they had varying levels of influence on the observed ICC values. We also found that these factors could interact, increasing or decreasing the relative reliability of certain Task × Design combinations. The results suggest that fMRI reliability is a complex construct whose value may be increased or decreased by specific combinations of factors.
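
    Reliability in such studies is typically summarized with an intraclass correlation. The sketch below computes ICC(1,1) from a one-way random-effects ANOVA on hypothetical test-retest values (the data and noise levels are invented); it is a generic formula, not the exact pipeline used in the study.

```python
import numpy as np

def icc_oneway(scores: np.ndarray) -> float:
    """ICC(1,1) from a subjects-by-sessions matrix (one-way random-effects ANOVA)."""
    n, k = scores.shape
    subj_means = scores.mean(axis=1)
    ms_between = k * ((subj_means - scores.mean()) ** 2).sum() / (n - 1)
    ms_within = ((scores - subj_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical test-retest values: rows are units (e.g., regions or subjects),
# columns are the two scan sessions; signal and noise levels are invented.
rng = np.random.default_rng(5)
signal = rng.normal(0.0, 1.0, size=200)
session1 = signal + rng.normal(0.0, 0.5, size=200)
session2 = signal + rng.normal(0.0, 0.5, size=200)
print(icc_oneway(np.column_stack([session1, session2])))
```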

  9. Quasi-experimental Studies in the Fields of Infection Control and Antibiotic Resistance, Ten Years Later: A Systematic Review.

    PubMed

    Alsaggaf, Rotana; O'Hara, Lyndsay M; Stafford, Kristen A; Leekha, Surbhi; Harris, Anthony D

    2018-02-01

    OBJECTIVE A systematic review of quasi-experimental studies in the field of infectious diseases was published in 2005. The aim of this study was to assess improvements in the design and reporting of quasi-experiments 10 years after the initial review. We also aimed to report the statistical methods used to analyze quasi-experimental data. DESIGN Systematic review of articles published from January 1, 2013, to December 31, 2014, in 4 major infectious disease journals. METHODS Quasi-experimental studies focused on infection control and antibiotic resistance were identified and classified based on 4 criteria: (1) type of quasi-experimental design used, (2) justification of the use of the design, (3) use of correct nomenclature to describe the design, and (4) statistical methods used. RESULTS Of 2,600 articles, 173 (7%) featured a quasi-experimental design, compared to 73 of 2,320 articles (3%) in the previous review (P<.01). Moreover, 21 articles (12%) utilized a study design with a control group; 6 (3.5%) justified the use of a quasi-experimental design; and 68 (39%) identified their design using the correct nomenclature. In addition, 2-group statistical tests were used in 75 studies (43%); 58 studies (34%) used standard regression analysis; 18 (10%) used segmented regression analysis; 7 (4%) used standard time-series analysis; 5 (3%) used segmented time-series analysis; and 10 (6%) did not utilize statistical methods for comparisons. CONCLUSIONS While some progress occurred over the decade, it is crucial to continue improving the design and reporting of quasi-experimental studies in the fields of infection control and antibiotic resistance to better evaluate the effectiveness of important interventions. Infect Control Hosp Epidemiol 2018;39:170-176.
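
    Segmented regression, one of the analysis methods tallied above, can be sketched as an ordinary regression with a level-change and a slope-change term at the intervention point. The example below uses invented monthly rates and an assumed intervention month purely to show the model structure.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Invented monthly infection rates with an assumed intervention at month 24.
rng = np.random.default_rng(2)
month = np.arange(48)
post = (month >= 24).astype(int)                   # level-change indicator
time_since = np.where(month >= 24, month - 24, 0)  # slope-change term
rate = 10 - 0.05 * month - 1.5 * post - 0.1 * time_since + rng.normal(0, 0.5, 48)
df = pd.DataFrame({"month": month, "post": post,
                   "time_since": time_since, "rate": rate})

# Segmented regression: baseline trend, immediate level change, and change in
# trend after the intervention. In practice autocorrelation should also be checked.
fit = smf.ols("rate ~ month + post + time_since", data=df).fit()
print(fit.params)
```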

  10. Statistical metrology—measurement and modeling of variation for advanced process development and design rule generation

    NASA Astrophysics Data System (ADS)

    Boning, Duane S.; Chung, James E.

    1998-11-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments (including the design of test structures and test masks to gather electrical or optical data), techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.
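
    One building block of such statistical decomposition is estimating variance components at different spatial levels. The sketch below is a generic method-of-moments decomposition of hypothetical thickness measurements into chip-to-chip and within-chip components; it is not the authors' specific decomposition methodology, and all values are invented.

```python
import numpy as np

# Hypothetical ILD thickness data: 20 chips, 9 measurement sites per chip.
# Chip-level and site-level standard deviations (in nm) are invented.
rng = np.random.default_rng(9)
chip_effect = rng.normal(0.0, 30.0, size=20)
thickness = 8000 + chip_effect[:, None] + rng.normal(0.0, 10.0, size=(20, 9))

n_chips, n_sites = thickness.shape
chip_means = thickness.mean(axis=1)
ms_between = n_sites * ((chip_means - thickness.mean()) ** 2).sum() / (n_chips - 1)
ms_within = ((thickness - chip_means[:, None]) ** 2).sum() / (n_chips * (n_sites - 1))

# Method-of-moments estimates from the one-way random-effects model.
var_within_chip = ms_within
var_chip_to_chip = max((ms_between - ms_within) / n_sites, 0.0)
print(f"within-chip variance:  {var_within_chip:.1f} nm^2")
print(f"chip-to-chip variance: {var_chip_to_chip:.1f} nm^2")
```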

  11. Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI)

    PubMed Central

    Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur

    2016-01-01

    We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non–expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI’s robustness and sensitivity in capturing useful data relating to the students’ conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. PMID:26903497

  12. Machine learning patterns for neuroimaging-genetic studies in the cloud.

    PubMed

    Da Mota, Benoit; Tudoran, Radu; Costan, Alexandru; Varoquaux, Gaël; Brasche, Goetz; Conrod, Patricia; Lemaitre, Herve; Paus, Tomas; Rietschel, Marcella; Frouin, Vincent; Poline, Jean-Baptiste; Antoniu, Gabriel; Thirion, Bertrand

    2014-01-01

    Brain imaging is a natural intermediate phenotype to understand the link between genetic information and behavior or brain pathology risk factors. Massive efforts have been made in the last few years to acquire high-dimensional neuroimaging and genetic data on large cohorts of subjects. The statistical analysis of such data is carried out with increasingly sophisticated techniques and represents a great computational challenge. Fortunately, increasing computational power in distributed architectures can be harnessed if new neuroinformatics infrastructures are designed and training to use these new tools is provided. Combining a MapReduce framework (TomusBLOB) with machine learning algorithms (Scikit-learn library), we design a scalable analysis tool that can deal with non-parametric statistics on high-dimensional data. End-users describe the statistical procedure to perform and can then test the model on their own computers before running the very same code in the cloud at a larger scale. We illustrate the potential of our approach on real data with an experiment showing how the functional signal in subcortical brain regions can be significantly fit with genome-wide genotypes. This experiment demonstrates the scalability and the reliability of our framework in the cloud with a two-week deployment on hundreds of virtual machines.

  13. Parabens abatement from surface waters by electrochemical advanced oxidation with boron doped diamond anodes.

    PubMed

    Domínguez, Joaquín R; Muñoz-Peña, Maria J; González, Teresa; Palo, Patricia; Cuerda-Correa, Eduardo M

    2016-10-01

    The removal efficiency of four commonly-used parabens by electrochemical advanced oxidation with boron-doped diamond anodes in two different aqueous matrices, namely ultrapure water and surface water from the Guadiana River, has been analyzed. Response surface methodology and a factorial, composite, central, orthogonal, and rotatable (FCCOR) statistical design of experiments have been used to optimize the process. The experimental results clearly show that the initial concentration of pollutants is the factor that most markedly influences the removal efficiency in both aqueous matrices. As a rule, as the initial concentration of parabens increases, the removal efficiency decreases. The current density also affects the removal efficiency in a statistically significant manner in both aqueous matrices. In the river water matrix, a noticeable synergistic effect on the removal efficiency has been observed, probably due to the presence of chloride ions that increase the conductivity of the solution and contribute to the generation of strong secondary oxidant species such as chlorine or HClO/ClO-. The use of a statistical design of experiments made it possible to determine the optimal conditions necessary to achieve total removal of the four parabens in both ultrapure and river water matrices.

  14. Directed Design of Experiments (DOE) for Determining Probability of Detection (POD) Capability of NDE Systems (DOEPOD)

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    2007-01-01

    This viewgraph presentation reviews some of the problems that designers of Non-Destructive Examination (NDE) systems encounter in determining the probability of detection. According to the author, "[the] NDE community should not blindly accept statistical results due to lack of knowledge." The work is an attempt to bridge the gap between people doing NDE and statisticians.

  15. Experiential Collaborative Learning and Preferential Thinking

    NASA Astrophysics Data System (ADS)

    Volpentesta, Antonio P.; Ammirato, Salvatore; Sofo, Francesco

    The paper presents a Project-Based Learning (PBL) approach in a collaborative educational environment aimed at developing the design ability and creativity of students from different engineering disciplines. Three collaborative learning experiences in product design were conducted in order to study their impact on students' preferred thinking styles. Using a thinking style inventory, pre- and post-survey data were collected and subsequently analyzed with ANOVA techniques. Statistically significant results showed that students successfully developed empathy and an openness to multiple perspectives. Furthermore, the data analysis confirms that the proposed collaborative learning experience contributes positively to increasing students' awareness of their thinking styles.

  16. Designing high speed diagnostics

    NASA Astrophysics Data System (ADS)

    Veliz Carrillo, Gerardo; Martinez, Adam; Mula, Swathi; Prestridge, Kathy; Extreme Fluids Team Team

    2017-11-01

    Timing and firing for shock-driven flows is complex because of jitter in the shock tube mechanical drivers. Consequently, experiments require dynamic triggering of diagnostics from pressure transducers. We explain the design process and criteria for setting up re-shock experiments at the Los Alamos Vertical Shock Tube facility, and the requirements for particle image velocimetry and planar laser induced fluorescence measurements necessary for calculating Richtmyer-Meshkov variable density turbulence statistics. Dynamic triggering of diagnostics allows for further investigation of the development of the Richtmyer-Meshkov instability at both initial shock and re-shock. Thanks to the Los Alamos National Laboratory for funding our project.

  17. Design and implementation of a hot-wire probe for simultaneous velocity and vorticity vector measurements in boundary layers

    NASA Astrophysics Data System (ADS)

    Zimmerman, S.; Morrill-Winter, C.; Klewicki, J.

    2017-10-01

    A multi-sensor hot-wire probe for simultaneously measuring all three components of velocity and vorticity in boundary layers has been designed, fabricated and implemented in experiments up to large Reynolds numbers. The probe consists of eight hot-wires, compactly arranged in two pairs of orthogonal ×-wire arrays. The ×-wire sub-arrays are symmetrically configured such that the full velocity and vorticity vectors are resolved about a single central location. During its design phase, the capacity of this sensor to accurately measure each component of velocity and vorticity was first evaluated via a synthetic experiment in a set of well-resolved DNS fields. The synthetic experiments clarified probe geometry effects, allowed assessment of various processing schemes, and predicted the effects of finite wire length and wire separation on turbulence statistics. The probe was subsequently fabricated and employed in large Reynolds number experiments in the Flow Physics Facility wind tunnel at the University of New Hampshire. Comparisons of statistics from the actual probe with those from the simulated sensor exhibit very good agreement in trend, but with some differences in magnitude. These comparisons also reveal that the use of gradient information in processing the probe data can significantly improve the accuracy of the spanwise velocity measurement near the wall. To the authors' knowledge, these are the largest Reynolds number laboratory-based measurements of all three vorticity components in boundary layers.

  18. 45 CFR 63.6 - Evaluation of applications.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... policy objectives; (2) Feasibility of the project; (3) Soundness of research design, statistical... qualifications and experience, including managerial, of personnel; (8) Adequacy of facilities and other resources... demonstrate to other potential users that such methods or techniques are feasible and cost-effective; (3) That...

  19. 45 CFR 63.6 - Evaluation of applications.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... policy objectives; (2) Feasibility of the project; (3) Soundness of research design, statistical... qualifications and experience, including managerial, of personnel; (8) Adequacy of facilities and other resources... demonstrate to other potential users that such methods or techniques are feasible and cost-effective; (3) That...

  20. 45 CFR 63.6 - Evaluation of applications.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... policy objectives; (2) Feasibility of the project; (3) Soundness of research design, statistical... qualifications and experience, including managerial, of personnel; (8) Adequacy of facilities and other resources... demonstrate to other potential users that such methods or techniques are feasible and cost-effective; (3) That...

  1. Performance Characterization of an Instrument.

    ERIC Educational Resources Information Center

    Salin, Eric D.

    1984-01-01

    Describes an experiment designed to teach students to apply the same statistical awareness to instrumentation they commonly apply to classical techniques. Uses propagation of error techniques to pinpoint instrumental limitations and breakdowns and to demonstrate capabilities and limitations of volumetric and gravimetric methods. Provides lists of…

  2. Best Bang for the Buck: Part 1 – The Size of Experiments Relative to Design Performance

    DOE PAGES

    Anderson-Cook, Christine Michaela; Lu, Lu

    2016-10-01

    There are many choices to make when designing an experiment for a study, such as which design factors to consider, which levels of the factors to use, and which model to focus on. One aspect of design, however, is often left unquestioned: the size of the experiment. When learning about design of experiments, problems are often posed as "select a design for a particular objective with N runs." It is tempting to consider the design size as a given constraint in the design-selection process. If you think of learning through designed experiments as a sequential process, however, strategically planning for the use of resources at different stages of data collection can be beneficial: saving experimental runs for later is advantageous if you can efficiently learn with less in the early stages. Alternatively, if you are too frugal in the early stages, you might not learn enough to proceed confidently with the next stages. Therefore, choosing the right-sized experiment is important, not too large or too small, but with a thoughtful balance to maximize the knowledge gained given the available resources. It can be a great advantage to think about the design size as flexible and include it as an aspect for comparisons. Sometimes you are asked to provide a small design that is too ambitious for the goals of the study. If you can show quantitatively how the suggested design size might be inadequate or lead to problems during analysis, and also offer a formal comparison to some alternatives of different (likely larger) sizes, you may have a better chance to ask for additional resources to deliver statistically sound and satisfying results.
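
    A quantitative comparison of candidate design sizes can be as simple as tabulating statistical power. The sketch below uses statsmodels' power calculations for a two-sample comparison; the effect size, significance level, and candidate sample sizes are arbitrary illustrative choices.

```python
from statsmodels.stats.power import TTestIndPower

# Illustrative power calculations for a two-sample comparison; the effect size
# (Cohen's d), alpha, and candidate group sizes are arbitrary choices.
analysis = TTestIndPower()
for n_per_group in (8, 16, 32, 64):
    power = analysis.power(effect_size=0.8, nobs1=n_per_group, alpha=0.05)
    print(f"n per group = {n_per_group:3d}  ->  power = {power:.2f}")

# Or invert the question: how many runs per group are needed for 90% power?
needed = analysis.solve_power(effect_size=0.8, power=0.9, alpha=0.05)
print(f"runs per group for 90% power: {needed:.1f}")
```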

  3. Experimental design and data-analysis in label-free quantitative LC/MS proteomics: A tutorial with MSqRob.

    PubMed

    Goeminne, Ludger J E; Gevaert, Kris; Clement, Lieven

    2018-01-16

    Label-free shotgun proteomics is routinely used to assess proteomes. However, extracting relevant information from the massive amounts of generated data remains difficult. This tutorial provides a strong foundation on analysis of quantitative proteomics data. We provide key statistical concepts that help researchers to design proteomics experiments and we showcase how to analyze quantitative proteomics data using our recent free and open-source R package MSqRob, which was developed to implement the peptide-level robust ridge regression method for relative protein quantification described by Goeminne et al. MSqRob can handle virtually any experimental proteomics design and outputs proteins ordered by statistical significance. Moreover, its graphical user interface and interactive diagnostic plots provide easy inspection and also detection of anomalies in the data and flaws in the data analysis, allowing deeper assessment of the validity of results and a critical review of the experimental design. Our tutorial discusses interactive preprocessing, data analysis and visualization of label-free MS-based quantitative proteomics experiments with simple and more complex designs. We provide well-documented scripts to run analyses in bash mode on GitHub, enabling the integration of MSqRob in automated pipelines on cluster environments (https://github.com/statOmics/MSqRob). The concepts outlined in this tutorial aid in designing better experiments and analyzing the resulting data more appropriately. The two case studies using the MSqRob graphical user interface will contribute to a wider adaptation of advanced peptide-based models, resulting in higher quality data analysis workflows and more reproducible results in the proteomics community. We also provide well-documented scripts for experienced users that aim at automating MSqRob on cluster environments. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Identification of crop cultivars with consistently high lignocellulosic sugar release requires the use of appropriate statistical design and modelling

    PubMed Central

    2013-01-01

    Background In this study, a multi-parent population of barley cultivars was grown in the field for two consecutive years and then straw saccharification (sugar release by enzymes) was subsequently analysed in the laboratory to identify the cultivars with the highest consistent sugar yield. This experiment was used to assess the benefit of accounting for both the multi-phase and multi-environment aspects of large-scale phenotyping experiments with field-grown germplasm through sound statistical design and analysis. Results Complementary designs at both the field and laboratory phases of the experiment ensured that non-genetic sources of variation could be separated from the genetic variation of cultivars, which was the main target of the study. The field phase included biological replication and plot randomisation. The laboratory phase employed re-randomisation and technical replication of samples within a batch, with a subset of cultivars chosen as duplicates that were randomly allocated across batches. The resulting data was analysed using a linear mixed model that incorporated field and laboratory variation and a cultivar by trial interaction, and ensured that the cultivar means were more accurately represented than if the non-genetic variation was ignored. The heritability detected was more than doubled in each year of the trial by accounting for the non-genetic variation in the analysis, clearly showing the benefit of this design and approach. Conclusions The importance of accounting for both field and laboratory variation, as well as the cultivar by trial interaction, by fitting a single statistical model (multi-environment trial, MET, model), was evidenced by the changes in list of the top 40 cultivars showing the highest sugar yields. Failure to account for this interaction resulted in only eight cultivars that were consistently in the top 40 in different years. The correspondence between the rankings of cultivars was much higher at 25 in the MET model. This approach is suited to any multi-phase and multi-environment population-based genetic experiment. PMID:24359577

  5. Anodal transcranial direct current stimulation of the right anterior temporal lobe did not significantly affect verbal insight

    PubMed Central

    Ogawa, Takeshi; Shimokawa, Takeaki; Yamashita, Okito

    2017-01-01

    Humans often utilize past experience to solve difficult problems. However, if past experience is insufficient to solve a problem, solvers may reach an impasse. Insight can be valuable for breaking an impasse, enabling the reinterpretation or re-representation of a problem. Previous studies using between-subjects designs have revealed a causal relationship between the anterior temporal lobes (ATLs) and non-verbal insight, by enhancing the right ATL while inhibiting the left ATL using transcranial direct current stimulation (tDCS). In addition, neuroimaging studies have reported a correlation between right ATL activity and verbal insight. Based on these findings, we hypothesized that the right ATL is causally related to both non-verbal and verbal insight. To test this hypothesis, we conducted an experiment with 66 subjects using a within-subjects design, which typically has greater statistical power than a between-subjects design. Subjects participated in tDCS experiments across 2 days, in which they solved both non-verbal and verbal insight problems under active or sham stimulation conditions. To dissociate the effects of right ATL stimulation from those of left ATL stimulation, we used two montage types; anodal tDCS of the right ATL together with cathodal tDCS of the left ATL (stimulating both ATLs) and anodal tDCS of the right ATL with cathodal tDCS of the left cheek (stimulating only the right ATL). The montage used was counterbalanced across subjects. Statistical analyses revealed that, regardless of the montage type, there were no significant differences between the active and sham conditions for either verbal or non-verbal insight, although the finding for non-verbal insight was inconclusive because of a lack of statistical power. These results failed to support previous findings suggesting that the right ATL is the central locus of insight. PMID:28902872

  6. 3D Simulation as a Learning Environment for Acquiring the Skill of Self-Management: An Experience Involving Spanish University Students of Education

    ERIC Educational Resources Information Center

    Cela-Ranilla, Jose María; Esteve-Gonzalez, Vanessa; Esteve-Mon, Francesc; Gisbert-Cervera, Merce

    2014-01-01

    In this study we analyze how 57 Spanish university students of Education developed a learning process in a virtual world by conducting activities that involved the skill of self-management. The learning experience comprised a serious game designed in a 3D simulation environment. Descriptive statistics and non-parametric tests were used in the…

  7. Flow Measurements Using Particle Image Velocimetry in the Ultra Compact Combustor

    DTIC Science & Technology

    2009-12-01

    addition effectively increases the flow velocity resulting in increased thrust. The afterburning cycle is much less efficient than the Brayton cycle used...31. Rekab, K., & Shaikh, M., Statistical Design of Experiments with Engineering Applications, Florida: CRC Press, Taylor & Francis Group, 2005

  8. Biotherapeutic formulation factors affecting metal leachables from stainless steel studied by design of experiments.

    PubMed

    Zhou, Shuxia; Evans, Brad; Schöneich, Christian; Singh, Satish K

    2012-03-01

    Trace amounts of metals are inevitably present in biotherapeutic products. They can arise from various sources. The impact of common formulation factors such as protein concentration, antioxidant, metal chelator concentration and type, surfactant, pH, and contact time with stainless steel on metal leachables was investigated by a design of experiments approach. Three major metal leachables, iron, chromium, and nickel, were monitored by inductively coupled plasma-mass spectrometry. It was observed that among all the tested factors, contact time, metal chelator concentration, and protein concentration were statistically significant, with higher temperature resulting in higher levels of leached metals. Within a pH range of 5.5-6.5, solution pH played a minor role for chromium leaching at 25°C. No statistically significant difference was observed due to type of chelator, presence of antioxidant, or surfactant. In order to optimize a biotherapeutic formulation to achieve a target drug product shelf life with acceptable quality, each formulation component must be evaluated for its impact.

  9. Bio hydrogen production from cassava starch by anaerobic mixed cultures: Multivariate statistical modeling

    NASA Astrophysics Data System (ADS)

    Tien, Hai Minh; Le, Kien Anh; Le, Phung Thi Kim

    2017-09-01

    Bio hydrogen is a sustainable energy resource due to its potentially high efficiency of conversion to usable power and its non-polluting nature. In this work, experiments were carried out to demonstrate the feasibility of generating bio hydrogen from cassava starch and to identify the influential factors and optimum conditions. An experimental design was used to investigate the effect of operating temperature (37-43 °C), pH (6-7), and inoculum ratio (6-10%) on the hydrogen yield, the COD reduction, and the ratio of hydrogen volume produced to COD reduction. The statistical analysis of the experiment indicated that the significant effects on the fermentation yield were the main effects of temperature, pH, and inoculum ratio; the interaction effects between them were not significant. The central composite design showed that the polynomial regression models were in good agreement with the experimental results. These results will be applied to enhance the treatment of cassava starch processing wastewater.

  10. Weather extremes in very large, high-resolution ensembles: the weatherathome experiment

    NASA Astrophysics Data System (ADS)

    Allen, M. R.; Rosier, S.; Massey, N.; Rye, C.; Bowery, A.; Miller, J.; Otto, F.; Jones, R.; Wilson, S.; Mote, P.; Stone, D. A.; Yamazaki, Y. H.; Carrington, D.

    2011-12-01

    Resolution and ensemble size are often seen as alternatives in climate modelling. Models with sufficient resolution to simulate many classes of extreme weather cannot normally be run often enough to assess the statistics of rare events, still less how these statistics may be changing. As a result, assessments of the impact of external forcing on regional climate extremes must be based either on statistical downscaling from relatively coarse-resolution models, or on statistical extrapolation from 10-year to 100-year events. Under the weatherathome experiment, part of the climateprediction.net initiative, we have compiled the Met Office Regional Climate Model HadRM3P to run at 25 and 50 km resolution on personal computers volunteered by the general public, embedded within the HadAM3P global atmosphere model. With a global network of about 50,000 volunteers, this allows us to run time-slice ensembles of essentially unlimited size, exploring the statistics of extreme weather under a range of scenarios for surface forcing and atmospheric composition, allowing for uncertainty in both boundary conditions and model parameters. Current experiments, developed with the support of Microsoft Research, focus on three regions: the Western USA, Europe and Southern Africa. We initially simulate the period 1959-2010 to establish which variables are realistically simulated by the model and on what scales. Our next experiments focus on the Event Attribution problem, exploring how the probability of various types of extreme weather would have been different over the recent past in a world unaffected by human influence, following the design of Pall et al (2011) but extended to a longer period and higher spatial resolution. We will present the first results of this unique, global, participatory experiment and discuss the implications for the attribution of recent weather events to anthropogenic influence on climate.

  11. Defense Small Business Innovation Research Program (SBIR), Volume 4, Defense Agencies Abstracts of Phase 1 Awards 1991

    DTIC Science & Technology

    1991-01-01

    EXPERIENCE IN DEVELOPING INTEGRATED OPTICAL DEVICES, NONLINEAR MAGNETIC-OPTIC MATERIALS, HIGH FREQUENCY MODULATORS, COMPUTER-AIDED MODELING AND SOPHISTICATED... HIGH -LEVEL PRESENTATION AND DISTRIBUTED CONTROL MODELS FOR INTEGRATING HETEROGENEOUS MECHANICAL ENGINEERING APPLICATIONS AND TOOLS. THE DESIGN IS FOCUSED...STATISTICALLY ACCURATE WORST CASE DEVICE MODELS FOR CIRCUIT SIMULATION. PRESENT METHODS OF WORST CASE DEVICE DESIGN ARE AD HOC AND DO NOT ALLOW THE

  12. Statistical optimization and fabrication of a press coated pulsatile dosage form to treat nocturnal acid breakthrough.

    PubMed

    Agarwal, Vaibhav; Bansal, Mayank

    2013-08-01

    Present work focuses on the use of mimosa seed gum to develop a drug delivery system making combined use of floating and pulsatile principles for the chrono-prevention of nocturnal acid breakthrough. The desired aim was achieved by fabricating a floating delivery system bearing a time-lagged coating of Mimosa pudica seed polymer for the programmed release of Famotidine. Response surface methodology was the statistical tool employed for experiment design, mathematical model generation, and optimization. A 3^2 full factorial design was used in designing the experiment. The % weight ratio of mimosa gum to hydroxypropyl methylcellulose in the coating combination and the coating weight were the independent variables, whereas the lag time and the cumulative % drug release in 360 minutes were the observed responses. Results revealed that both the coating composition and the coating weight significantly affected the release of drug from the dosage form. The optimized formulation, prepared according to the computer-generated software Design-Expert(®), gave responses that were in close proximity to the experimental responses, thus confirming the robustness and accuracy of the predicted model for the utilization of a natural polymer like mimosa seed gum in the chronotherapeutic treatment of nocturnal acid breakthrough.
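
    The layout of a 3^2 full factorial design like the one described above can be generated directly from the factor levels. In the sketch below, the factor names and level values are hypothetical placeholders, not the levels used in the study.

```python
from itertools import product
import pandas as pd

# The nine runs of a 3^2 full factorial layout; factor names and level values
# are hypothetical placeholders, not the levels used in the study.
gum_ratio_pct = [25, 50, 75]      # % weight ratio of mimosa gum to HPMC
coat_weight_mg = [200, 400, 600]  # coating weight

runs = pd.DataFrame(list(product(gum_ratio_pct, coat_weight_mg)),
                    columns=["gum_ratio_pct", "coat_weight_mg"])
print(runs)  # responses (lag time, cumulative % release) would be measured per run
```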

  13. Fulfilling the law of a single independent variable and improving the result of mathematical educational research

    NASA Astrophysics Data System (ADS)

    Pardimin, H.; Arcana, N.

    2018-01-01

    Many types of research in the field of mathematics education apply the quasi-experimental method and use the t-test for statistical analysis. The quasi-experiment has the weakness that it is difficult to fulfil “the law of a single independent variable”. The t-test also has the weakness that the generalization of the conclusions obtained is less powerful. This research aimed to find ways to reduce the weaknesses of the quasi-experimental method and to improve the generalization of the research results. The method applied in the research was a non-interactive qualitative method of the concept-analysis type. The concepts analysed were the concepts of statistics, educational research methods, and research reports. The result is a way to overcome the weaknesses of quasi-experiments and the t-test, namely to apply a combination of factorial design and balanced design, which the authors refer to as the Factorial-Balanced Design. The advantages of this design are: (1) it almost fulfils “the law of a single independent variable”, so there is no need to test the similarity of academic ability; and (2) the sample sizes of the experimental group and the control group become larger and equal, making the design robust to violations of the assumptions of the ANOVA test.

  14. Designing experiments on thermal interactions by secondary-school students in a simulated laboratory environment

    NASA Astrophysics Data System (ADS)

    Lefkos, Ioannis; Psillos, Dimitris; Hatzikraniotis, Euripides

    2011-07-01

    Background and purpose: The aim of this study was to explore the effect of investigative activities with manipulations in a virtual laboratory on students' ability to design experiments. Sample Fourteen students in a lower secondary school in Greece attended a teaching sequence on thermal phenomena based on the use of information and communication technology, and specifically of the simulated virtual laboratory 'ThermoLab'. Design and methods A pre-post comparison was applied. Students' design of experiments was rated in eight dimensions; namely, hypothesis forming and verification, selection of variables, initial conditions, device settings, materials and devices used, process and phenomena description. A three-level ranking scheme was employed for the evaluation of students' answers in each dimension. Results A Wilcoxon signed-rank test revealed a statistically significant difference between the students' pre- and post-test scores. Additional analysis by comparing the pre- and post-test scores using the Hake gain showed high gains in all but one dimension, which suggests that this improvement was almost inclusive. Conclusions We consider that our findings support the statement that there was an improvement in students' ability to design experiments.

  15. Type I error probabilities based on design-stage strategies with applications to noninferiority trials.

    PubMed

    Rothmann, Mark

    2005-01-01

    When testing the equality of means from two different populations, a t-test or large-sample normal test tends to be performed. For these tests, when the sample size or design for the second sample is dependent on the results of the first sample, the type I error probability is altered for each specific possibility in the null hypothesis. We will examine the impact on the type I error probabilities for two confidence interval procedures and for procedures using test statistics when the design for the second sample or experiment is dependent on the results from the first sample or experiment (or series of experiments). Ways of controlling a desired maximum type I error probability or a desired type I error rate will be discussed. Results are applied to the setting of noninferiority comparisons in active controlled trials where the use of a placebo is unethical.
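
    The effect described above can be illustrated by simulation: when the decision to run (and pool) a second experiment depends on the first experiment's result, the realized type I error probability differs from the nominal level. The decision rule, sample sizes, and threshold below are arbitrary illustrative choices, not those analyzed in the paper.

```python
import numpy as np
from scipy import stats

# Monte Carlo sketch: the second experiment is run, and the data pooled, only
# when the first experiment looks "promising". Sample sizes, the 0.30 threshold,
# and the rule itself are arbitrary illustrative choices.
rng = np.random.default_rng(4)
n_sim, n1, n2, alpha = 20_000, 20, 60, 0.05
rejections = 0

for _ in range(n_sim):
    # The null hypothesis is true: both groups share the same distribution.
    x1, y1 = rng.normal(0, 1, n1), rng.normal(0, 1, n1)
    _, p1 = stats.ttest_ind(x1, y1)
    if p1 < 0.30:  # design of the second experiment depends on the first result
        x2, y2 = rng.normal(0, 1, n2), rng.normal(0, 1, n2)
        _, p = stats.ttest_ind(np.concatenate([x1, x2]), np.concatenate([y1, y2]))
    else:
        p = p1
    rejections += p < alpha

print(f"realized type I error: {rejections / n_sim:.3f} (nominal {alpha})")
```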

  16. The 1985 Army Experience Survey. Data Sourcebook and User’s Manual

    DTIC Science & Technology

    1986-01-01

    on the survey data file produced for the 1985 AES.- 4 The survey data are available in Operating System (OS) as well as Statistical Analysis System ...version of the survey data files was produced using the Statistical Analysis System (SASJ. The survey data were also produced in Operating System (OS...impacts upon future enlistments. In order iThe OS data file was designed to make the survey data accessible on any IBM-compatible computer system . 3 N’ to

  17. Statistics, Uncertainty, and Transmitted Variation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendelberger, Joanne Roth

    2014-11-05

    The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
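
    A minimal illustration of transmitted variation is to push input noise through a response function and compare the simulated output spread with the first-order (delta-method) approximation. The response function and input distribution below are arbitrary choices for illustration.

```python
import numpy as np

# Transmitted variation: noise in an input x propagates through a response
# function f to the output. The function and input distribution are arbitrary.
def f(x):
    return np.exp(0.5 * x)

mu, sigma = 2.0, 0.2
rng = np.random.default_rng(8)
x = rng.normal(mu, sigma, size=100_000)

mc_sd = f(x).std()                              # Monte Carlo estimate of transmitted spread
delta_sd = abs(0.5 * np.exp(0.5 * mu)) * sigma  # first-order |f'(mu)| * sigma
print(f"Monte Carlo sd: {mc_sd:.4f}   delta-method sd: {delta_sd:.4f}")
```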

  18. Statistics of Sxy estimates

    NASA Technical Reports Server (NTRS)

    Freilich, M. H.; Pawka, S. S.

    1987-01-01

    The statistics of Sxy estimates derived from orthogonal-component measurements are examined. Based on results of Goodman (1957), the probability density function (pdf) for Sxy(f) estimates is derived, and a closed-form solution for arbitrary moments of the distribution is obtained. Characteristic functions are used to derive the exact pdf of Sxy(tot). In practice, a simple Gaussian approximation is found to be highly accurate even for relatively few degrees of freedom. Implications for experiment design are discussed, and a maximum-likelihood estimator for a posterior estimation is outlined.

  19. Design of experiments on 135 cloned poplar trees to map environmental influence in greenhouse.

    PubMed

    Pinto, Rui Climaco; Stenlund, Hans; Hertzberg, Magnus; Lundstedt, Torbjörn; Johansson, Erik; Trygg, Johan

    2011-01-31

    To find and ascertain phenotypic differences, minimal variation between biological replicates is always desired. Variation between the replicates can originate from genetic transformation but also from environmental effects in the greenhouse. Design of experiments (DoE) has been used in field trials for many years and proven its value but is underused within functional genomics including greenhouse experiments. We propose a strategy to estimate the effect of environmental factors with the ultimate goal of minimizing variation between biological replicates, based on DoE. DoE can be analyzed in many ways. We present a graphical solution together with solutions based on classical statistics as well as the newly developed OPLS methodology. In this study, we used DoE to evaluate the influence of plant specific factors (plant size, shoot type, plant quality, and amount of fertilizer) and rotation of plant positions on height and section area of 135 cloned wild type poplar trees grown in the greenhouse. Statistical analysis revealed that plant position was the main contributor to variability among biological replicates and applying a plant rotation scheme could reduce this variation. Copyright © 2010 Elsevier B.V. All rights reserved.

  20. Factorial experiments: efficient tools for evaluation of intervention components.

    PubMed

    Collins, Linda M; Dziak, John J; Kugler, Kari C; Trail, Jessica B

    2014-10-01

    An understanding of the individual and combined effects of a set of intervention components is important for moving the science of preventive medicine interventions forward. This understanding can often be achieved in an efficient and economical way via a factorial experiment, in which two or more independent variables are manipulated. The factorial experiment is a complement to the RCT; the two designs address different research questions. To offer an introduction to factorial experiments aimed at investigators trained primarily in the RCT. The factorial experiment is compared and contrasted with other experimental designs used commonly in intervention science to highlight where each is most efficient and appropriate. Several points are made: factorial experiments make very efficient use of experimental subjects when the data are properly analyzed; a factorial experiment can have excellent statistical power even if it has relatively few subjects per experimental condition; and when conducting research to select components for inclusion in a multicomponent intervention, interactions should be studied rather than avoided. Investigators in preventive medicine and related areas should begin considering factorial experiments alongside other approaches. Experimental designs should be chosen from a resource management perspective, which states that the best experimental design is the one that provides the greatest scientific benefit without exceeding available resources. Copyright © 2014 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  1. Systematic survey of the design, statistical analysis, and reporting of studies published in the 2008 volume of the Journal of Cerebral Blood Flow and Metabolism.

    PubMed

    Vesterinen, Hanna M; Vesterinen, Hanna V; Egan, Kieren; Deister, Amelie; Schlattmann, Peter; Macleod, Malcolm R; Dirnagl, Ulrich

    2011-04-01

    Translating experimental findings into clinically effective therapies is one of the major bottlenecks of modern medicine. As this has been particularly true for cerebrovascular research, attention has turned to the quality and validity of experimental cerebrovascular studies. We set out to assess the study design, statistical analyses, and reporting of cerebrovascular research. We assessed all original articles published in the Journal of Cerebral Blood Flow and Metabolism during the year 2008 against a checklist designed to capture the key attributes relating to study design, statistical analyses, and reporting. A total of 156 original publications were included (animal, in vitro, human). Few studies reported a primary research hypothesis, statement of purpose, or measures to safeguard internal validity (such as randomization, blinding, exclusion or inclusion criteria). Many studies lacked sufficient information regarding methods and results to form a reasonable judgment about their validity. In nearly 20% of studies, statistical tests were either not appropriate or information to allow assessment of appropriateness was lacking. This study identifies a number of factors that should be addressed if the quality of research in basic and translational biomedicine is to be improved. We support the widespread implementation of the ARRIVE (Animal Research Reporting In Vivo Experiments) statement for the reporting of experimental studies in biomedicine, for improving training in proper study design and analysis, and that reviewers and editors adopt a more constructively critical approach in the assessment of manuscripts for publication.

  2. Systematic survey of the design, statistical analysis, and reporting of studies published in the 2008 volume of the Journal of Cerebral Blood Flow and Metabolism

    PubMed Central

    Vesterinen, Hanna V; Egan, Kieren; Deister, Amelie; Schlattmann, Peter; Macleod, Malcolm R; Dirnagl, Ulrich

    2011-01-01

    Translating experimental findings into clinically effective therapies is one of the major bottlenecks of modern medicine. As this has been particularly true for cerebrovascular research, attention has turned to the quality and validity of experimental cerebrovascular studies. We set out to assess the study design, statistical analyses, and reporting of cerebrovascular research. We assessed all original articles published in the Journal of Cerebral Blood Flow and Metabolism during the year 2008 against a checklist designed to capture the key attributes relating to study design, statistical analyses, and reporting. A total of 156 original publications were included (animal, in vitro, human). Few studies reported a primary research hypothesis, statement of purpose, or measures to safeguard internal validity (such as randomization, blinding, exclusion or inclusion criteria). Many studies lacked sufficient information regarding methods and results to form a reasonable judgment about their validity. In nearly 20% of studies, statistical tests were either not appropriate or information to allow assessment of appropriateness was lacking. This study identifies a number of factors that should be addressed if the quality of research in basic and translational biomedicine is to be improved. We support the widespread implementation of the ARRIVE (Animal Research Reporting In Vivo Experiments) statement for the reporting of experimental studies in biomedicine, for improving training in proper study design and analysis, and that reviewers and editors adopt a more constructively critical approach in the assessment of manuscripts for publication. PMID:21157472

  3. Impact of Assimilation on Heavy Rainfall Simulations Using WRF Model: Sensitivity of Assimilation Results to Background Error Statistics

    NASA Astrophysics Data System (ADS)

    Rakesh, V.; Kantharao, B.

    2017-03-01

    Data assimilation is considered one of the effective tools for improving the forecast skill of mesoscale models. However, for optimum utilization and effective assimilation of observations, many factors need to be taken into account while designing the data assimilation methodology. One of the critical components that determines the amount and propagation of observation information into the analysis is the model background error statistics (BES). The objective of this study is to quantify how the BES used in data assimilation impact the simulation of heavy rainfall events over Karnataka, a southern state in India. Simulations of 40 heavy rainfall events were carried out using the Weather Research and Forecasting Model with and without data assimilation. The assimilation experiments were conducted using global and regional BES, while the experiment with no assimilation was used as the baseline for assessing the impact of data assimilation. The simulated rainfall is verified against high-resolution rain-gage observations over Karnataka. Statistical evaluation using several accuracy and skill measures shows that data assimilation has improved the heavy rainfall simulation. Our results showed that the experiment using regional BES outperformed the one that used global BES. Critical thermodynamic variables conducive to heavy rainfall, such as convective available potential energy, are simulated more realistically using regional BES than global BES. These results have important practical implications for the design of forecast platforms and for decision-making during extreme weather events.

  4. Observability of ionospheric space-time structure with ISR: A simulation study

    NASA Astrophysics Data System (ADS)

    Swoboda, John; Semeter, Joshua; Zettergren, Matthew; Erickson, Philip J.

    2017-02-01

    The sources of error from electronically steerable array (ESA) incoherent scatter radar (ISR) systems are investigated both theoretically and with use of an open-source ISR simulator, developed by the authors, called Simulator for ISR (SimISR). The main sources of error incorporated in the simulator include statistical uncertainty, which arises due to nature of the measurement mechanism and the inherent space-time ambiguity from the sensor. SimISR can take a field of plasma parameters, parameterized by time and space, and create simulated ISR data at the scattered electric field (i.e., complex receiver voltage) level, subsequently processing these data to show possible reconstructions of the original parameter field. To demonstrate general utility, we show a number of simulation examples, with two cases using data from a self-consistent multifluid transport model. Results highlight the significant influence of the forward model of the ISR process and the resulting statistical uncertainty on plasma parameter measurements and the core experiment design trade-offs that must be made when planning observations. These conclusions further underscore the utility of this class of measurement simulator as a design tool for more optimal experiment design efforts using flexible ESA class ISR systems.

  5. A methodology for the design of experiments in computational intelligence with multiple regression models.

    PubMed

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on a correct comparison between the results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.
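
    A simplified stand-in for the idea of statistically comparing regression models (not the RRegrs workflow itself) is to score candidate models on the same repeated cross-validation folds and test the paired differences, as sketched below with an arbitrary synthetic dataset and two off-the-shelf models.

```python
import numpy as np
from scipy import stats
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import RepeatedKFold, cross_val_score

# Score two regression models on identical repeated cross-validation folds and
# test the paired per-fold differences. Dataset and models are arbitrary.
X, y = make_regression(n_samples=300, n_features=20, noise=10.0, random_state=0)
cv = RepeatedKFold(n_splits=5, n_repeats=5, random_state=0)

ridge_scores = cross_val_score(Ridge(alpha=1.0), X, y, cv=cv, scoring="r2")
forest_scores = cross_val_score(RandomForestRegressor(n_estimators=200, random_state=0),
                                X, y, cv=cv, scoring="r2")

# Paired nonparametric test on per-fold scores (fold overlap makes this approximate).
stat, p = stats.wilcoxon(ridge_scores, forest_scores)
print(f"median R2: ridge={np.median(ridge_scores):.3f}, "
      f"forest={np.median(forest_scores):.3f}, p={p:.4f}")
```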

  6. A methodology for the design of experiments in computational intelligence with multiple regression models

    PubMed Central

    Gestal, Marcos; Munteanu, Cristian R.; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on a correct comparison between the different results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results differ for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is important to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report best models different from those of the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable. PMID:27920952

  7. Laser diode initiated detonators for space applications

    NASA Technical Reports Server (NTRS)

    Ewick, David W.; Graham, J. A.; Hawley, J. D.

    1993-01-01

    Ensign Bickford Aerospace Company (EBAC) has over ten years of experience in the design and development of laser ordnance systems. Recent efforts have focused on the development of laser diode ordnance systems for space applications. Because the laser initiated detonators contain only insensitive secondary explosives, a high degree of system safety is achieved. Typical performance characteristics of a laser diode initiated detonator are described in this paper, including all-fire level, function time, and output. A finite difference model used at EBAC to predict detonator performance is described, and calculated results are compared to experimental data. Finally, the use of statistically designed experiments to evaluate the performance of laser initiated detonators is discussed.

  8. Evaluation of Solid Rocket Motor Component Data Using a Commercially Available Statistical Software Package

    NASA Technical Reports Server (NTRS)

    Stefanski, Philip L.

    2015-01-01

    Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics that numerically and graphically summarize both sample and population data, (2) inferential statistics that draw conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results derive not only from the inherent power of the software package, but also from the skill and understanding of the data analyst.

  9. Parametric Studies of Flow Separation using Air Injection

    NASA Technical Reports Server (NTRS)

    Zhang, Wei

    2004-01-01

    Boundary layer separation causes the airfoil to stall and therefore imposes dramatic performance degradation on the airfoil. In recent years, flow separation control has been one of the active research areas in the field of aerodynamics due to its promising performance improvements for lifting devices. Active flow separation control techniques include steady and unsteady air injection as well as suction on the airfoil surface. This paper focuses on steady and unsteady air injection on the airfoil. Although wind tunnel experiments have revealed performance improvements on the airfoil using injection techniques, the details of how key variables such as air injection slot geometry and air injection angle affect the effectiveness of flow separation control via air injection have not been studied. A parametric study of both steady and unsteady air injection active flow control is the main objective for this summer. For steady injection, the key variables include the slot geometry, orientation, spacing, air injection velocity, and injection angle. For unsteady injection, the injection frequency will also be investigated. Key metrics such as lift coefficient, drag coefficient, total pressure loss, and total injection mass will be used to measure the effectiveness of the control technique. A design of experiments using the Box-Behnken design is set up in order to determine how each of the variables affects each of the key metrics. Design of experiments is used so that the number of experimental runs is kept to a minimum while still being able to predict which variables are the key contributors to the responses. The experiments will then be conducted in the 1 ft by 1 ft wind tunnel according to the design of experiments settings. The data obtained from the experiments will be imported into JMP statistical software to generate sets of response surface equations that represent the statistical empirical model for each of the metrics as a function of the key variables. Next, the variables such as the slot geometry can be optimized using the built-in optimizer within JMP. Finally, wind tunnel testing will be conducted using the optimized slot geometry and other key variables to verify the empirical statistical model. The long-term goal for this effort is to assess the impacts of active flow control using air injection at the system level, as one of the task plans included in NASA's URETI program with the Georgia Institute of Technology.
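
    The abstract above outlines a Box-Behnken design followed by a response-surface fit in JMP. The sketch below is only a minimal illustration, not the study's actual workflow: it builds a 3-factor Box-Behnken design in coded units and fits a quadratic response surface by ordinary least squares. The factor interpretation and the toy response values are assumptions.

      import itertools
      import numpy as np

      def box_behnken(k, n_center=3):
          """Box-Behnken design for k factors in coded units (-1, 0, +1)."""
          runs = []
          for i, j in itertools.combinations(range(k), 2):
              for a, b in itertools.product((-1, 1), repeat=2):
                  row = [0] * k
                  row[i], row[j] = a, b
                  runs.append(row)
          runs += [[0] * k for _ in range(n_center)]      # center points
          return np.array(runs, dtype=float)

      def quad_model_matrix(X):
          """Columns: intercept, linear, two-way interaction, and pure quadratic terms."""
          k = X.shape[1]
          cols = [np.ones(len(X))] + [X[:, i] for i in range(k)]
          cols += [X[:, i] * X[:, j] for i, j in itertools.combinations(range(k), 2)]
          cols += [X[:, i] ** 2 for i in range(k)]
          return np.column_stack(cols)

      X = box_behnken(3)            # e.g. coded injection angle, injection velocity, slot spacing
      rng = np.random.default_rng(1)
      y = 1.2 + 0.4 * X[:, 0] - 0.3 * X[:, 1] ** 2 + rng.normal(0, 0.05, len(X))   # fake lift data
      beta, *_ = np.linalg.lstsq(quad_model_matrix(X), y, rcond=None)
      print(len(X), "runs; fitted response-surface coefficients:", np.round(beta, 3))

    The fitted coefficients play the same role as the response-surface equations described in the abstract: once estimated, they can be handed to any optimizer to locate the factor settings that maximize lift or minimize drag.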

  10. [Pseudoreplication, chatter, and the international nature of science: a response to D.V. Tatarnikov].

    PubMed

    Kozlov, M V; Hurlbert, S H

    2006-01-01

    The commentary by Tatarnikov (2005) on the design and analysis of manipulative experiments in ecology represents an obvious danger to readers with poor knowledge of modern statistics due to its erroneous interpretation of pseudoreplication and statistical independence. Here we offer clarification of those concepts--and related ones such as experimental unit and evaluation unit--by reference to studies cited by Tatarnikov (2005). We stress the necessity of learning from the accumulated experience of the international scientific community in order not to repeat the errors found in earlier publications that have already been analyzed and widely written about. (An English translation of the full article is available as a PDF file from either of the authors.)

  11. Familiar units prevail over statistical cues in word segmentation.

    PubMed

    Poulin-Charronnat, Bénédicte; Perruchet, Pierre; Tillmann, Barbara; Peereman, Ronald

    2017-09-01

    In language acquisition research, the prevailing position is that listeners exploit statistical cues, in particular transitional probabilities between syllables, to discover words of a language. However, other cues are also involved in word discovery. Assessing the weight learners give to these different cues leads to a better understanding of the processes underlying speech segmentation. The present study evaluated whether adult learners preferentially used known units or statistical cues for segmenting continuous speech. Before the exposure phase, participants were familiarized with part-words of a three-word artificial language. This design allowed the dissociation of the influence of statistical cues and familiar units, with statistical cues favoring word segmentation and familiar units favoring (nonoptimal) part-word segmentation. In Experiment 1, performance in a two-alternative forced choice (2AFC) task between words and part-words revealed part-word segmentation (even though part-words were less cohesive in terms of transitional probabilities and less frequent than words). By contrast, an unfamiliarized group exhibited word segmentation, as usually observed in standard conditions. Experiment 2 used a syllable-detection task to remove the likely contamination of performance by memory and strategy effects in the 2AFC task. Overall, the results suggest that familiar units overrode statistical cues, ultimately questioning the need for computation mechanisms of transitional probabilities (TPs) in natural language speech segmentation.
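
    As a concrete illustration of the statistical cue discussed above, the sketch below computes forward transitional probabilities P(next syllable | current syllable) from a continuous syllable stream. The three-word artificial language is hypothetical, not the authors' stimulus set.

      import random
      from collections import Counter

      words = ["tupiro", "golabu", "bidaku"]                  # hypothetical 3-word language
      random.seed(0)
      stream = []
      for _ in range(300):
          w = random.choice(words)
          stream += [w[i:i + 2] for i in range(0, 6, 2)]      # each word = 3 two-letter syllables

      pair_counts = Counter(zip(stream, stream[1:]))
      first_counts = Counter(stream[:-1])
      tp = {pair: n / first_counts[pair[0]] for pair, n in pair_counts.items()}

      # Within-word transitions have TP near 1.0; transitions across word boundaries
      # have TP near 1/3, which is the drop a purely statistical learner could exploit.
      print(tp[("tu", "pi")], round(tp.get(("ro", "go"), 0.0), 2))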

  12. A Study of Particle Beam Spin Dynamics for High Precision Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiedler, Andrew J.

    In the search for physics beyond the Standard Model, high precision experiments to measure fundamental properties of particles are an important frontier. One group of such measurements involves magnetic dipole moment (MDM) values as well as searching for an electric dipole moment (EDM), both of which could provide insights about how particles interact with their environment at the quantum level and if there are undiscovered new particles. For these types of high precision experiments, minimizing statistical uncertainties in the measurements plays a critical role. This work leverages computer simulations to quantify the effects of statistical uncertainty for experiments investigating spin dynamics. In it, analysis of beam properties and lattice design effects on the polarization of the beam is performed. As a case study, the beam lines that will provide polarized muon beams to the Fermilab Muon g-2 experiment are analyzed to determine the effects of correlations between the phase space variables and the overall polarization of the muon beam.

  13. Climate Change Conceptual Change: Scientific Information Can Transform Attitudes.

    PubMed

    Ranney, Michael Andrew; Clark, Dav

    2016-01-01

    Of this article's seven experiments, the first five demonstrate that virtually no Americans know the basic global warming mechanism. Fortunately, Experiments 2-5 found that 2-45 min of physical-chemical climate instruction durably increased such understandings. This mechanistic learning, or merely receiving seven highly germane statistical facts (Experiment 6), also increased climate-change acceptance across the liberal-conservative spectrum. However, Experiment 7's misleading statistics decreased such acceptance (and, dramatically, knowledge-confidence). These readily available attitudinal and conceptual changes through scientific information disconfirm what we term "stasis theory"--which some researchers and many laypeople varyingly maintain. Stasis theory subsumes the claim that informing people (particularly Americans) about climate science may be largely futile or even counterproductive--a view that appears historically naïve, suffers from range restrictions (e.g., near-zero mechanistic knowledge), and/or misinterprets some polarization and (noncausal) correlational data. Our studies evidenced no polarizations. Finally, we introduce HowGlobalWarmingWorks.org--a website designed to directly enhance public "climate-change cognition." Copyright © 2016 Cognitive Science Society, Inc.

  14. Development of chemistry attitudes and experiences questionnaire (CAEQ)

    NASA Astrophysics Data System (ADS)

    Dalgety, Jacinta; Coll, Richard K.; Jones, Alister

    2003-09-01

    In this article we describe the development of the Chemistry Attitudes and Experiences Questionnaire (CAEQ) that measures first-year university chemistry students' attitude toward chemistry, chemistry self-efficacy, and learning experiences. The instrument was developed as part of a larger study and sought to fulfill a need for an instrument to investigate factors that influence student enrollment choice. We set out to design the instrument in a manner that would maximize construct validity. The CAEQ was piloted with a cohort of science and technology students (n = 129) at the end of their first year. Based on statistical analysis the instrument was modified and subsequently administered on two occasions at two tertiary institutions (n = 669). Statistical data along with additional data gathered from interviews suggest that the CAEQ possesses good construct validity and will prove a useful tool for tertiary level educators who wish to gain an understanding of factors that influence student choice of chemistry enrolment.

  15. Knowledge and opinions of Downsview physicians regarding the chiropractic profession

    PubMed Central

    Newton-Leo, Linda; King-Isaacs, Debra; Lichti, Janice

    1994-01-01

    This study was a preliminary investigation into the knowledge of and current attitudes towards the chiropractic profession by medical practitioners with varying years of clinical experience. A questionnaire was designed and mailed to seventy general practitioners in Downsview, Ontario who agreed to participate in the study. Twenty-six were returned, for a response rate of 37%. The data were analyzed and responses from doctors with differing years of practice experience were compared using the chi-square statistic. When comparing attitudes towards the chiropractic profession between medical practitioners with greater and less than 15 years of clinical experience, a statistically significant difference was found (p = 0.0005). However, no significant differences were observed in terms of their interaction with or knowledge of the chiropractic profession. Further, 88% of respondents reported that they had referred a patient to a chiropractor. The limitations of the study and suggestions for improvement are discussed.
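
    For readers unfamiliar with the test used above, the sketch below runs a chi-square test of independence on a 2x2 table of attitude by years in practice; the counts are illustrative only, not the study's data.

      import numpy as np
      from scipy.stats import chi2_contingency

      table = np.array([[10, 2],     # < 15 years in practice: favourable, unfavourable
                        [3, 11]])    # >= 15 years in practice: favourable, unfavourable
      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")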

  16. Semantically enabled and statistically supported biological hypothesis testing with tissue microarray databases

    PubMed Central

    2011-01-01

    Background Although many biological databases are applying semantic web technologies, meaningful biological hypothesis testing cannot be easily achieved. Database-driven high throughput genomic hypothesis testing requires both the capability to obtain semantically relevant experimental data and the capability to perform relevant statistical testing on the retrieved data. Tissue Microarray (TMA) data are semantically rich and contain many biologically important hypotheses waiting for high throughput conclusions. Methods An application-specific ontology was developed for managing TMA and DNA microarray databases with semantic web technologies. Data were represented as Resource Description Framework (RDF) according to the framework of the ontology. Applications for hypothesis testing (Xperanto-RDF) for TMA data were designed and implemented by (1) formulating the syntactic and semantic structures of the hypotheses derived from TMA experiments, (2) formulating SPARQL queries to reflect the semantic structures of the hypotheses, and (3) performing statistical tests with the result sets returned by the SPARQL queries. Results When a user designs a hypothesis in Xperanto-RDF and submits it, the hypothesis can be tested against TMA experimental data stored in Xperanto-RDF. When we evaluated four previously validated hypotheses as an illustration, all the hypotheses were supported by Xperanto-RDF. Conclusions We demonstrated the utility of high throughput biological hypothesis testing. We believe that preliminary investigation before performing highly controlled experiments can benefit from this approach. PMID:21342584

  17. Split-plot microarray experiments: issues of design, power and sample size.

    PubMed

    Tsai, Pi-Wen; Lee, Mei-Ling Ting

    2005-01-01

    This article focuses on microarray experiments with two or more factors in which treatment combinations of the factors corresponding to the samples paired together onto arrays are not completely random. A main effect of one (or more) factor(s) is confounded with arrays (the experimental blocks). This is called a split-plot microarray experiment. We utilise an analysis of variance (ANOVA) model to assess differentially expressed genes for between-array and within-array comparisons that are generic under a split-plot microarray experiment. Instead of standard t- or F-test statistics that rely on mean square errors of the ANOVA model, we use a robust method, referred to as 'a pooled percentile estimator', to identify genes that are differentially expressed across different treatment conditions. We illustrate the design and analysis of split-plot microarray experiments based on a case application described by Jin et al. A brief discussion of power and sample size for split-plot microarray experiments is also presented.

  18. Mapping Remote and Multidisciplinary Learning Barriers: Lessons from "Challenge-Based Innovation" at CERN

    ERIC Educational Resources Information Center

    Jensen, Matilde Bisballe; Utriainen, Tuuli Maria; Steinert, Martin

    2018-01-01

    This paper presents the experienced difficulties of students participating in the multidisciplinary, remote collaborating engineering design course challenge-based innovation at CERN. This is with the aim to identify learning barriers and improve future learning experiences. We statistically analyse the rated differences between distinct design…

  19. Bootstrapping Methods Applied for Simulating Laboratory Works

    ERIC Educational Resources Information Center

    Prodan, Augustin; Campean, Remus

    2005-01-01

    Purpose: The aim of this work is to implement bootstrapping methods into software tools, based on Java. Design/methodology/approach: This paper presents a category of software e-tools aimed at simulating laboratory works and experiments. Findings: Both students and teaching staff use traditional statistical methods to infer the truth from sample…
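
    A minimal sketch of the bootstrap idea behind such simulated laboratory works: resample the observed data with replacement many times and use the spread of the resampled statistic as an uncertainty estimate. The "measurements" below are simulated stand-ins for a student lab exercise, not data from the paper.

      import numpy as np

      rng = np.random.default_rng(42)
      measurements = rng.normal(loc=9.81, scale=0.15, size=20)    # e.g. 20 repeated lab readings

      boot_means = np.array([
          rng.choice(measurements, size=measurements.size, replace=True).mean()
          for _ in range(5000)
      ])
      lo, hi = np.percentile(boot_means, [2.5, 97.5])
      print(f"mean = {measurements.mean():.3f}, 95% bootstrap CI = ({lo:.3f}, {hi:.3f})")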

  20. A Comparison of Conjoint Analysis Response Formats

    Treesearch

    Kevin J. Boyle; Thomas P. Holmes; Mario F. Teisl; Brian Roe

    2001-01-01

    A split-sample design is used to evaluate the convergent validity of three response formats used in conjoint analysis experiments. We investigate whether recoding rating data to rankings and choose-one formats, and recoding ranking data to choose-one, results in structural models and welfare estimates that are statistically indistinguishable from...

  1. Using Alien Coins to Test Whether Simple Inference Is Bayesian

    ERIC Educational Resources Information Center

    Cassey, Peter; Hawkins, Guy E.; Donkin, Chris; Brown, Scott D.

    2016-01-01

    Reasoning and inference are well-studied aspects of basic cognition that have been explained as statistically optimal Bayesian inference. Using a simplified experimental design, we conducted quantitative comparisons between Bayesian inference and human inference at the level of individuals. In 3 experiments, with more than 13,000 participants, we…

  2. Congruence between Disabled Elders and Their Primary Caregivers

    ERIC Educational Resources Information Center

    Horowitz, Amy; Goodman, Caryn R.; Reinhardt, Joann P.

    2004-01-01

    Purpose: This study examines the extent and independent correlates of congruence between disabled elders and their caregivers on several aspects of the caregiving experience. Design and Methods: Participants were 117 visually impaired elders and their caregivers. Correlational analyses, kappa statistics, and paired t tests were used to examine the…

  3. Experience-Based Discrimination: Classroom Games

    ERIC Educational Resources Information Center

    Fryer, Roland G., Jr.; Goeree, Jacob K.; Holt, Charles A.

    2005-01-01

    The authors present a simple classroom game in which students are randomly designated as employers, purple workers, or green workers. This environment may generate "statistical" discrimination if workers of one color tend not to invest because they anticipate lower opportunities in the labor market, and these beliefs are self-confirming as…

  4. Spatial Probability Cuing and Right Hemisphere Damage

    ERIC Educational Resources Information Center

    Shaqiri, Albulena; Anderson, Britt

    2012-01-01

    In this experiment we studied statistical learning, inter-trial priming, and visual attention. We assessed healthy controls and right brain damaged (RBD) patients with and without neglect, on a simple visual discrimination task designed to measure priming effects and probability learning. All participants showed a preserved priming effect for item…

  5. 78 FR 39282 - Proposed Information Collection Request; Comment Request; Willingness to Pay Survey for Salmon...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-01

    ... the survey, a choice experiment framework is used with statistically designed tradeoff questions... reason for the survey is public value research. All survey responses will be kept confidential. Form... Collection Request; Comment Request; Willingness to Pay Survey for Salmon Recovery in the Willamette...

  6. How to Engage Medical Students in Chronobiology: An Example on Autorhythmometry

    ERIC Educational Resources Information Center

    Rol de Lama, M. A.; Lozano, J. P.; Ortiz, V.; Sanchez-Vazquez, F. J.; Madrid, J. A.

    2005-01-01

    This contribution describes a new laboratory experience that improves medical students' learning of chronobiology by introducing them to basic chronobiology concepts as well as to methods and statistical analysis tools specific for circadian rhythms. We designed an autorhythmometry laboratory session where students simultaneously played the role…

  7. Developing and Refining the Taiwan Birth Cohort Study (TBCS): Five Years of Experience

    ERIC Educational Resources Information Center

    Lung, For-Wey; Chiang, Tung-Liang; Lin, Shio-Jean; Shu, Bih-Ching; Lee, Meng-Chih

    2011-01-01

    The Taiwan Birth Cohort Study (TBCS) is the first nationwide birth cohort database in Asia designed to establish national norms of children's development. Several challenges during database development and data analysis were identified. Challenges include sampling methods, instrument development and statistical approach to missing data. The…

  8. Less label, more free: approaches in label-free quantitative mass spectrometry.

    PubMed

    Neilson, Karlie A; Ali, Naveid A; Muralidharan, Sridevi; Mirzaei, Mehdi; Mariani, Michael; Assadourian, Gariné; Lee, Albert; van Sluyter, Steven C; Haynes, Paul A

    2011-02-01

    In this review we examine techniques, software, and statistical analyses used in label-free quantitative proteomics studies for area under the curve and spectral counting approaches. Recent advances in the field are discussed in an order that reflects a logical workflow design. Examples of studies that follow this design are presented to highlight the requirement for statistical assessment and further experiments to validate results from label-free quantitation. Limitations of label-free approaches are considered, label-free approaches are compared with labelling techniques, and forward-looking applications for label-free quantitative data are presented. We conclude that label-free quantitative proteomics is a reliable, versatile, and cost-effective alternative to labelled quantitation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Optimizing Mass Spectrometry Analyses: A Tailored Review on the Utility of Design of Experiments

    PubMed Central

    Hecht, Elizabeth S.; Oberg, Ann L.; Muddiman, David

    2016-01-01

    Mass spectrometry (MS) has emerged as a tool that can analyze nearly all classes of molecules, with its scope rapidly expanding in the areas of post-translational modifications, MS instrumentation, and many others. Yet integration of novel analyte preparatory and purification methods with existing or novel mass spectrometers can introduce new challenges for MS sensitivity. The mechanisms that govern detection by MS are particularly complex and interdependent, including ionization efficiency, ion suppression, and transmission. Performance of both off-line and MS methods can be optimized separately or, when appropriate, simultaneously through statistical designs, broadly referred to as “design of experiments” (DOE). The following review provides a tutorial-like guide into the selection of DOE for MS experiments, the practices for modeling and optimization of response variables, and the available software tools that support DOE implementation in any laboratory. This review comes three years after the latest DOE review (Hibbert DB 2012), which provided a comprehensive overview on the types of designs available and their statistical construction. Since that time, new classes of DOE, such as the definitive screening design, have emerged and new calls have been made for mass spectrometrists to adopt the practice. Rather than exhaustively cover all possible designs, we have highlighted the three most practical DOE classes available to mass spectrometrists. This review further differentiates itself by providing expert recommendations for experimental setup and defining DOE entirely in the context of three case studies that highlight the utility of different designs to achieve different goals. A step-by-step tutorial is also provided. PMID:26951559

  10. SynGenics Optimization System (SynOptSys)

    NASA Technical Reports Server (NTRS)

    Ventresca, Carol; McMilan, Michelle L.; Globus, Stephanie

    2013-01-01

    The SynGenics Optimization System (SynOptSys) software application optimizes a product with respect to multiple, competing criteria using statistical Design of Experiments, Response-Surface Methodology, and the Desirability Optimization Methodology. The user is not required to be skilled in the underlying math; thus, SynOptSys can help designers and product developers overcome the barriers that prevent them from using powerful techniques to develop better products in a less costly manner. SynOptSys is applicable to the design of any product or process with multiple criteria to meet, and at least two factors that influence achievement of those criteria. The user begins with a selected solution principle or system concept and a set of criteria that needs to be satisfied. The criteria may be expressed in terms of documented desirements or defined responses that the future system needs to achieve. Documented desirements can be imported into SynOptSys or created and documented directly within SynOptSys. Subsequent steps include identifying factors, specifying model order for each response, designing the experiment, running the experiment and gathering the data, analyzing the results, and determining the specifications for the optimized system. The user may also enter textual information as the project progresses. Data is easily edited within SynOptSys, and the software design enables full traceability within any step in the process, and facilitates reporting as needed. SynOptSys is unique in the way responses are defined and the nuances of the goodness associated with changes in response values for each of the responses of interest. The Desirability Optimization Methodology provides the basis of this novel feature. Moreover, this is a complete, guided design and optimization process tool with embedded math that can remain invisible to the user. It is not a standalone statistical program; it is a design and optimization system.
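
    The sketch below illustrates the Desirability Optimization Methodology mentioned above in its textbook (Derringer-Suich) form, not SynOptSys itself: each predicted response is mapped onto a 0-1 desirability scale and the geometric mean of the individual desirabilities is maximized. The two response models are made-up stand-ins for response-surface fits from a designed experiment.

      import numpy as np

      def d_maximize(y, low, high):
          """Desirability for a response that should be as large as possible."""
          return np.clip((y - low) / (high - low), 0.0, 1.0)

      def d_minimize(y, low, high):
          """Desirability for a response that should be as small as possible."""
          return np.clip((high - y) / (high - low), 0.0, 1.0)

      def overall_desirability(x1, x2):
          strength = 50 + 8 * x1 + 5 * x2 - 3 * x1 * x2      # hypothetical fitted response model
          cost = 20 + 4 * x1 ** 2 + 2 * x2 ** 2              # hypothetical fitted response model
          d1 = d_maximize(strength, low=45, high=65)
          d2 = d_minimize(cost, low=18, high=30)
          return np.sqrt(d1 * d2)                            # geometric mean of the two desirabilities

      grid = np.linspace(-1, 1, 201)                         # coded factor settings
      X1, X2 = np.meshgrid(grid, grid)
      D = overall_desirability(X1, X2)
      i, j = np.unravel_index(np.argmax(D), D.shape)
      print(f"best coded settings: x1 = {X1[i, j]:.2f}, x2 = {X2[i, j]:.2f}, D = {D[i, j]:.3f}")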

  11. Does sadness impair color perception? Flawed evidence and faulty methods.

    PubMed

    Holcombe, Alex O; Brown, Nicholas J L; Goodbourn, Patrick T; Etz, Alexander; Geukes, Sebastian

    2016-01-01

    In their 2015 paper, Thorstenson, Pazda, and Elliot offered evidence from two experiments that perception of colors on the blue-yellow axis was impaired if the participants had watched a sad movie clip, compared to participants who watched clips designed to induce a happy or neutral mood. Subsequently, these authors retracted their article, citing a mistake in their statistical analyses and a problem with the data in one of their experiments. Here, we discuss a number of other methodological problems with Thorstenson et al.'s experimental design, and also demonstrate that the problems with the data go beyond what these authors reported. We conclude that repeating one of the two experiments, with the minor revisions proposed by Thorstenson et al., will not be sufficient to address the problems with this work.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson-Cook, Christine Michaela; Lu, Lu

    There are many choices to make when designing an experiment for a study, such as: what design factors to consider, which levels of the factors to use and which model to focus on. One aspect of design, however, is often left unquestioned: the size of the experiment. When learning about design of experiments, problems are often posed as "select a design for a particular objective with N runs." It’s tempting to consider the design size as a given constraint in the design-selection process. If you think of learning through designed experiments as a sequential process, however, strategically planning for the use of resources at different stages of data collection can be beneficial: Saving experimental runs for later is advantageous if you can efficiently learn with less in the early stages. Alternatively, if you’re too frugal in the early stages, you might not learn enough to proceed confidently with the next stages. Therefore, choosing the right-sized experiment is important—not too large or too small, but with a thoughtful balance to maximize the knowledge gained given the available resources. It can be a great advantage to think about the design size as flexible and include it as an aspect for comparisons. Sometimes you’re asked to provide a small design that is too ambitious for the goals of the study. Finally, if you can show quantitatively how the suggested design size might be inadequate or lead to problems during analysis—and also offer a formal comparison to some alternatives of different (likely larger) sizes—you may have a better chance to ask for additional resources to deliver statistically sound and satisfying results.
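
    One way to make the "right-sized experiment" argument above quantitative is a power-versus-size table. The sketch below computes the power of a two-sample t-test from the noncentral t distribution as the number of runs per group grows; the effect size and alpha are illustrative assumptions.

      import numpy as np
      from scipy.stats import nct, t

      def power_two_sample(n_per_group, effect_size=0.8, alpha=0.05):
          """Power of a two-sided two-sample t-test with equal group sizes."""
          df = 2 * n_per_group - 2
          ncp = effect_size * np.sqrt(n_per_group / 2)       # noncentrality parameter
          t_crit = t.ppf(1 - alpha / 2, df)
          return 1 - nct.cdf(t_crit, df, ncp) + nct.cdf(-t_crit, df, ncp)

      for n in (5, 10, 15, 20, 30):
          print(f"n = {n:2d} runs per group -> power = {power_two_sample(n):.2f}")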

  13. Assessing the applicability of the Taguchi design method to an interrill erosion study

    NASA Astrophysics Data System (ADS)

    Zhang, F. B.; Wang, Z. L.; Yang, M. Y.

    2015-02-01

    Full-factorial experimental designs have been used in soil erosion studies, but are time, cost and labor intensive, and sometimes they are impossible to conduct due to the increasing number of factors and their levels to consider. The Taguchi design is a simple, economical and efficient statistical tool that only uses a portion of the total possible factorial combinations to obtain the results of a study. Soil erosion studies that use the Taguchi design are scarce and no comparisons with full-factorial designs have been made. In this paper, a series of simulated rainfall experiments using a full-factorial design of five slope lengths (0.4, 0.8, 1.2, 1.6, and 2 m), five slope gradients (18%, 27%, 36%, 48%, and 58%), and five rainfall intensities (48, 62.4, 102, 149, and 170 mm h-1) were conducted. Validation of the applicability of a Taguchi design to interrill erosion experiments was achieved by extracting data from the full dataset according to a theoretical Taguchi design. The statistical parameters for the mean quasi-steady state erosion and runoff rates of each test, the optimum conditions for producing maximum erosion and runoff, and the main effect and percentage contribution of each factor obtained from the full-factorial and Taguchi designs were compared. Both designs generated almost identical results. Using the experimental data from the Taguchi design, it was possible to accurately predict the erosion and runoff rates under the conditions that had been excluded from the Taguchi design. All of the results obtained from analyzing the experimental data for both designs indicated that the Taguchi design could be applied to interrill erosion studies and could replace full-factorial designs. This would save time, labor and costs by generally reducing the number of tests to be conducted. Further work should test the applicability of the Taguchi design to a wider range of conditions.
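
    A minimal sketch of the contrast drawn above: the full 5 x 5 x 5 factorial requires 125 runs, while a Taguchi-style orthogonal array covers the same three factors in 25 runs with every pair of factor levels balanced. The modular construction below is one standard way to build such an array; the factor levels are those quoted in the abstract.

      import itertools

      lengths = [0.4, 0.8, 1.2, 1.6, 2.0]      # slope length (m)
      slopes = [18, 27, 36, 48, 58]            # slope gradient (%)
      rains = [48, 62.4, 102, 149, 170]        # rainfall intensity (mm/h)

      full = list(itertools.product(lengths, slopes, rains))                 # 125 combinations

      oa_rows = [(i, j, (i + j) % 5) for i in range(5) for j in range(5)]    # OA(25, 3, 5, 2)
      taguchi = [(lengths[a], slopes[b], rains[c]) for a, b, c in oa_rows]

      print(len(full), "full-factorial runs vs", len(taguchi), "orthogonal-array runs")
      # Balance check: every (length, rain) level pair occurs exactly once in the array.
      print(len({(a, c) for a, _, c in oa_rows}))                            # 25 distinct pairs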

  14. Failure Mode Identification Through Clustering Analysis

    NASA Technical Reports Server (NTRS)

    Arunajadai, Srikesh G.; Stone, Robert B.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Research has shown that nearly 80% of the costs and problems are created in product development and that cost and quality are essentially designed into products in the conceptual stage. Currently, failure identification procedures (such as FMEA (Failure Modes and Effects Analysis), FMECA (Failure Modes, Effects and Criticality Analysis) and FTA (Fault Tree Analysis)) and design of experiments are being used for quality control and for the detection of potential failure modes during the detail design stage or post-product launch. Though all of these methods have their own advantages, they do not indicate which predominant failures a designer should focus on while designing a product. This work uses a functional approach to identify failure modes, which hypothesizes that similarities exist between different failure modes based on the functionality of the product/component. In this paper, a statistical clustering procedure is proposed to retrieve information on the set of predominant failures that a function experiences. The various stages of the methodology are illustrated using a hypothetical design example.
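
    A minimal sketch of a clustering step of the kind described above: components are represented by normalized counts of the failure modes they have experienced, and a plain k-means (Lloyd's algorithm) groups components with similar failure profiles. The failure-count matrix and mode names are fabricated for illustration, not taken from the paper.

      import numpy as np

      rng = np.random.default_rng(3)
      # rows = components, columns = failure modes (e.g. fatigue, corrosion, wear, fracture)
      counts = np.vstack([rng.poisson([6, 1, 1, 0], size=(5, 4)),    # fatigue-dominated group
                          rng.poisson([0, 1, 5, 2], size=(5, 4))])   # wear-dominated group
      profiles = counts / counts.sum(axis=1, keepdims=True)

      def kmeans(X, k, iters=50, seed=0):
          rng = np.random.default_rng(seed)
          centers = X[rng.choice(len(X), k, replace=False)]
          for _ in range(iters):
              labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
              centers = np.array([X[labels == c].mean(axis=0) if np.any(labels == c) else centers[c]
                                  for c in range(k)])
          return labels, centers

      labels, centers = kmeans(profiles, k=2)
      print(labels)                      # components grouped by dominant failure profile
      print(np.round(centers, 2))        # cluster-average failure-mode profiles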

  15. Detector Development for the MARE Neutrino Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galeazzi, M.; Bogorin, D.; Molina, R.

    2009-12-16

    The MARE experiment is designed to measure the mass of the neutrino with sub-eV sensitivity by measuring the beta decay of ¹⁸⁷Re with cryogenic microcalorimeters. A preliminary analysis shows that, to achieve the necessary statistics, between 10,000 and 50,000 detectors are likely necessary. We have fabricated and characterized Iridium transition edge sensors with high reproducibility and uniformity for such a large scale experiment. We have also started a full scale simulation of the experimental setup for MARE, including thermalization in the absorber, detector response, and optimum filter analysis, to understand the issues related to reaching a sub-eV sensitivity and to optimize the design of the MARE experiment. We present our characterization of the Ir devices, including reproducibility, uniformity, and sensitivity, and we discuss the implementation and capabilities of our full scale simulation.

  16. The Role of Formal Experiment Design in Hypersonic Flight System Technology Development

    NASA Technical Reports Server (NTRS)

    McClinton, Charles R.; Ferlemann, Shelly M.; Rock, Ken E.; Ferlemann, Paul G.

    2002-01-01

    Hypersonic airbreathing engine (scramjet) powered vehicles are being considered to replace conventional rocket-powered launch systems. Effective utilization of scramjet engines requires careful integration with the air vehicle. This integration synergistically combines aerodynamic forces with propulsive cycle functions of the engine. Due to the highly integrated nature of the hypersonic vehicle design problem, the large flight envelope, and the large number of design variables, the use of a statistical approach to design is effective. Modern Design-of-Experiments (MDOE) has been used throughout the Hyper-X program, for both systems analysis and experimental testing. Applications of MDOE fall into four categories: (1) experimental testing; (2) studies of unit phenomena; (3) refining engine design; and (4) full vehicle system optimization. The MDOE process also provides analytical models, which are used to document lessons learned, supplement low-level design tools, and accelerate future studies. This paper will discuss the design considerations for scramjet-powered vehicles, specifics of MDOE utilized for Hyper-X, and present highlights from the use of these MDOE methods within the Hyper-X Program.

  17. Statistical optimization of lovastatin production by Omphalotus olearius (DC.) singer in submerged fermentation.

    PubMed

    Atlı, Burcu; Yamaç, Mustafa; Yıldız, Zeki; Isikhuemhen, Omoanghe S

    2016-01-01

    In this study, culture conditions were optimized to improve lovastatin production by Omphalotus olearius, isolate OBCC 2002, using statistical experimental designs. The Plackett-Burman design was used to select important variables affecting lovastatin production. Accordingly, glucose, peptone, and agitation speed were identified as the variables influencing lovastatin production. In a further experiment, these variables were optimized with a Box-Behnken design and applied in a submerged process; this resulted in 12.51 mg/L lovastatin production in a medium containing glucose (10 g/L), peptone (5 g/L), thiamine (1 mg/L), and NaCl (0.4 g/L) under static conditions. This level of lovastatin production is eight times higher than that produced under unoptimized media and growth conditions by Omphalotus olearius. To the best of our knowledge, this is the first attempt to optimize a submerged fermentation process for lovastatin production by Omphalotus olearius.
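
    A minimal sketch of the screening step described above: an 8-run two-level Hadamard (Plackett-Burman-type) design estimates main effects for up to seven factors, after which the few influential ones would go into a Box-Behnken optimization such as the one sketched earlier in this listing. The factor list and titre values below are illustrative assumptions, not the published data.

      import numpy as np

      H2 = np.array([[1, 1], [1, -1]])
      H8 = np.kron(np.kron(H2, H2), H2)           # Sylvester Hadamard matrix of order 8
      design = H8[:, 1:]                          # drop the all-ones column: 8 runs x 7 factors

      factors = ["glucose", "peptone", "thiamine", "NaCl", "agitation", "pH", "temperature"]
      rng = np.random.default_rng(7)
      # hypothetical lovastatin titres (mg/L), driven mainly by glucose, peptone and agitation
      y = (6 + 1.5 * design[:, 0] + 1.0 * design[:, 1] - 1.2 * design[:, 4]
           + rng.normal(0, 0.2, 8))

      effects = 2 * design.T @ y / len(y)         # high-minus-low average for each factor
      for name, eff in sorted(zip(factors, effects), key=lambda pair: -abs(pair[1])):
          print(f"{name:12s} {eff:+.2f}")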

  18. The UAE Rainfall Enhancement Assessment Program: Implications of Thermodynamic Profiles on the Development of Precipitation in Convective Clouds over the Oman Mountains

    NASA Astrophysics Data System (ADS)

    Breed, D.; Bruintjes, R.; Jensen, T.; Salazar, V.; Fowler, T.

    2005-12-01

    During the winter and summer seasons of 2001 and 2002, data were collected to assess the efficacy of cloud seeding to enhance precipitation in the United Arab Emirates (UAE). The results of the feasibility study concluded: 1) that winter clouds in the UAE rarely produced conditions amenable to hygroscopic cloud seeding; 2) that summer convective clouds developed often enough, particularly over the Oman Mountains (e.g., the Hajar Mountains along the eastern UAE border and into Oman) to justify a randomized seeding experiment; 3) that collecting quantitative radar observations continues to be a complex but essential part of evaluating a cloud seeding experiment; 4) that successful flight operations would require solving several logistical issues; and 5) that several scientific questions would need to be studied in order to fully evaluate the efficacy and feasibility of hygroscopic cloud seeding, including cloud physical responses, radar-derived rainfall estimates as related to rainfall at the ground, and hydrological impacts. Based on these results, the UAE program proceeded through the design and implementation of a randomized hygroscopic cloud seeding experiment during the summer seasons to statistically quantify the potential for cloud seeding to enhance rainfall, specifically over the UAE and Oman Mountains, while collecting concurrent and separate physical measurements to support the statistical results and provide substantiation for the physical hypothesis. The randomized seeding experiment was carried out over the summers of 2003 and 2004, and a total of 134 cases were treated over the two summer seasons, of which 96 met the analysis criteria established in the experimental design of the program. The statistical evaluation of these cases yielded largely inconclusive results. Evidence will show that the thermodynamic profile had a large influence on storm characteristics and on precipitation development. This in turn provided a confounding factor in the conduct of the seeding experiment, particularly in the lateness of treatment in the storm cycle. The prevalence of capping inversions and the sensitivity of clouds to the level of the inversions as well as to wind shear will be shown using several data sets (soundings, aircraft, radar, numerical models). Concurrent physical measurements with the randomized experiment provided new insights into the physical processes of precipitation that developed in summertime convective clouds over the UAE, which in turn helped in the interpretation of the statistical results.

  19. Maximizing Macromolecule Crystal Size for Neutron Diffraction Experiments

    NASA Technical Reports Server (NTRS)

    Judge, R. A.; Kephart, R.; Leardi, R.; Myles, D. A.; Snell, E. H.; vanderWoerd, M.; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    A challenge in neutron diffraction experiments is growing large (greater than 1 cu mm) macromolecule crystals. In taking up this challenge we have used statistical experiment design techniques to quickly identify crystallization conditions under which the largest crystals grow. These techniques provide the maximum information for minimal experimental effort, allowing optimal screening of crystallization variables in a simple experimental matrix, using the minimum amount of sample. Analysis of the results quickly tells the investigator what conditions are the most important for the crystallization. These can then be used to maximize the crystallization results in terms of reducing crystal numbers and providing large crystals of suitable habit. We have used these techniques to grow large crystals of Glucose isomerase. Glucose isomerase is an industrial enzyme used extensively in the food industry for the conversion of glucose to fructose. The aim of this study is the elucidation of the enzymatic mechanism at the molecular level. The accurate determination of hydrogen positions, which is critical for this, is a requirement that neutron diffraction is uniquely suited for. Preliminary neutron diffraction experiments with these crystals conducted at the Institute Laue-Langevin (Grenoble, France) reveal diffraction to beyond 2.5 angstrom. Macromolecular crystal growth is a process involving many parameters, and statistical experimental design is naturally suited to this field. These techniques are sample independent and provide an experimental strategy to maximize crystal volume and habit for neutron diffraction studies.

  20. Improving Robustness of Hydrologic Ensemble Predictions Through Probabilistic Pre- and Post-Processing in Sequential Data Assimilation

    NASA Astrophysics Data System (ADS)

    Wang, S.; Ancell, B. C.; Huang, G. H.; Baetz, B. W.

    2018-03-01

    Data assimilation using the ensemble Kalman filter (EnKF) has been increasingly recognized as a promising tool for probabilistic hydrologic predictions. However, little effort has been made to conduct the pre- and post-processing of assimilation experiments, posing a significant challenge in achieving the best performance of hydrologic predictions. This paper presents a unified data assimilation framework for improving the robustness of hydrologic ensemble predictions. Statistical pre-processing of assimilation experiments is conducted through the factorial design and analysis to identify the best EnKF settings with maximized performance. After the data assimilation operation, statistical post-processing analysis is also performed through the factorial polynomial chaos expansion to efficiently address uncertainties in hydrologic predictions, as well as to explicitly reveal potential interactions among model parameters and their contributions to the predictive accuracy. In addition, the Gaussian anamorphosis is used to establish a seamless bridge between data assimilation and uncertainty quantification of hydrologic predictions. Both synthetic and real data assimilation experiments are carried out to demonstrate feasibility and applicability of the proposed methodology in the Guadalupe River basin, Texas. Results suggest that statistical pre- and post-processing of data assimilation experiments provide meaningful insights into the dynamic behavior of hydrologic systems and enhance robustness of hydrologic ensemble predictions.
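
    For readers unfamiliar with the assimilation step that the pre- and post-processing above is wrapped around, the sketch below shows a stochastic (perturbed-observation) EnKF analysis; the toy state, observation operator and error statistics are assumptions, not the study's hydrologic model.

      import numpy as np

      rng = np.random.default_rng(0)
      n_state, n_obs, n_ens = 3, 1, 50

      Xf = rng.normal([10.0, 5.0, 2.0], [2.0, 1.0, 0.5], size=(n_ens, n_state)).T  # forecast ensemble
      H = np.array([[1.0, 0.0, 0.0]])              # observe the first state variable only
      R = np.array([[0.5 ** 2]])                   # observation error covariance
      y = np.array([12.0])                         # the observation

      A = Xf - Xf.mean(axis=1, keepdims=True)      # ensemble anomalies
      Pf = A @ A.T / (n_ens - 1)                   # sample forecast error covariance
      K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)                 # Kalman gain

      Y_pert = y[:, None] + rng.normal(0.0, np.sqrt(R[0, 0]), size=(n_obs, n_ens))
      Xa = Xf + K @ (Y_pert - H @ Xf)              # analysis ensemble

      print(np.round(Xf.mean(axis=1), 2), "->", np.round(Xa.mean(axis=1), 2))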

  1. Stochastic Analysis and Design of Heterogeneous Microstructural Materials System

    NASA Astrophysics Data System (ADS)

    Xu, Hongyi

    An advanced materials system refers to new materials that are composed of multiple traditional constituents but have complex microstructure morphologies, which lead to properties superior to those of conventional materials. To accelerate the development of new advanced materials systems, the objective of this dissertation is to develop a computational design framework and the associated techniques for design automation of microstructure materials systems, with an emphasis on addressing the uncertainties associated with the heterogeneity of microstructural materials. Five key research tasks are identified: design representation, design evaluation, design synthesis, material informatics and uncertainty quantification. Design representation of microstructure includes statistical characterization and stochastic reconstruction. This dissertation develops a new descriptor-based methodology, which characterizes 2D microstructures using descriptors of composition, dispersion and geometry. Statistics of 3D descriptors are predicted based on 2D information to enable 2D-to-3D reconstruction. An efficient sequential reconstruction algorithm is developed to reconstruct statistically equivalent random 3D digital microstructures. In design evaluation, a stochastic decomposition and reassembly strategy is developed to deal with the high computational costs and uncertainties induced by material heterogeneity. The properties of Representative Volume Elements (RVE) are predicted by stochastically reassembling Statistical Volume Elements (SVEs) with stochastic properties into a coarse representation of the RVE. In design synthesis, a new descriptor-based design framework is developed, which integrates computational methods of microstructure characterization and reconstruction, sensitivity analysis, Design of Experiments (DOE), metamodeling, and optimization to enable parametric optimization of the microstructure for achieving the desired material properties. Material informatics is studied to efficiently reduce the dimension of microstructure design space. This dissertation develops a machine learning-based methodology to identify the key microstructure descriptors that highly impact properties of interest. In uncertainty quantification, a comparative study on data-driven random process models is conducted to provide guidance for choosing the most accurate model in statistical uncertainty quantification. Two new goodness-of-fit metrics are developed to provide quantitative measurements of random process models' accuracy. The benefits of the proposed methods are demonstrated by the example of designing the microstructure of polymer nanocomposites. This dissertation provides material-generic, intelligent modeling/design methodologies and techniques to accelerate the process of analyzing and designing new microstructural materials systems.

  2. Senior Computational Scientist | Center for Cancer Research

    Cancer.gov

    The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP), Basic Science Program, HLA Immunogenetics Section, under the leadership of Dr. Mary Carrington, studies the influence of human leukocyte antigens (HLA) and specific KIR/HLA genotypes on risk of and outcomes to infection, cancer, autoimmune disease, and maternal-fetal disease. Recent studies have focused on the impact of HLA gene expression in disease, the molecular mechanism regulating expression levels, and the functional basis for the effect of differential expression on disease outcome. The lab’s further focus is on the genetic basis for resistance/susceptibility to disease conferred by immunogenetic variation. KEY ROLES/RESPONSIBILITIES The Senior Computational Scientist will provide research support to the CIP-BSP-HLA Immunogenetics Section performing bio-statistical design, analysis and reporting of research projects conducted in the lab. This individual will be involved in the implementation of statistical models and data preparation. The successful candidate should have 5 or more years of competent, innovative biostatistics/bioinformatics research experience beyond doctoral training; considerable experience with statistical software such as SAS, R, and S-Plus; sound knowledge and demonstrated experience of theoretical and applied statistics; the ability to write program code to analyze data using statistical analysis software; and the ability to contribute to the interpretation and publication of research results.

  3. Improving self-efficacy in spinal cord injury patients through "design thinking" rehabilitation workshops.

    PubMed

    Wolstenholme, Daniel; Downes, Tom; Leaver, Jackie; Partridge, Rebecca; Langley, Joseph

    2014-01-01

    Advances in surgical and medical management have significantly reduced the length of time that patients with spinal cord injury (SCI) have to stay in hospital, but have left patients with potentially less time to psychologically adjust. Following a pilot in 2012, this project was designed to test the effect of "design thinking" workshops on the self-efficacy of people undergoing rehabilitation following spinal injuries. Design thinking is about understanding the approaches and methods that designers use and then applying these to think creatively about problems and suggest ways to solve them. In this instance, design thinking is not about designing new products (although the approaches can be used to do this) but about developing a long term creative and explorative mind-set through skills such as lateral thinking, prototyping and verbal and visual communication. The principles of "design thinking" have underpinned design education and practice for many years; they are also recognised in business and innovation, for example, but a literature review indicated that there was no evidence of them being used in rehabilitation or spinal injury settings. Twenty participants took part in the study; 13 (65%) were male and the average age was 37 years (range 16 to 72). Statistically significant improvements were seen for EQ-5D score (t = -3.13, p = 0.007) and Patient Activation Measure score (t = -3.85, p = 0.001). Other outcome measures improved, but not statistically significantly. There were no statistical effects on length of stay or readmission rates, but qualitative interviews indicated improved patient experience.

  4. A general model-based design of experiments approach to achieve practical identifiability of pharmacokinetic and pharmacodynamic models.

    PubMed

    Galvanin, Federico; Ballan, Carlo C; Barolo, Massimiliano; Bezzo, Fabrizio

    2013-08-01

    The use of pharmacokinetic (PK) and pharmacodynamic (PD) models is a common and widespread practice in the preliminary stages of drug development. However, PK-PD models may be affected by structural identifiability issues intrinsically related to their mathematical formulation. A preliminary structural identifiability analysis is usually carried out to check if the set of model parameters can be uniquely determined from experimental observations under the ideal assumptions of noise-free data and no model uncertainty. However, even for structurally identifiable models, real-life experimental conditions and model uncertainty may strongly affect the practical possibility to estimate the model parameters in a statistically sound way. A systematic procedure coupling the numerical assessment of structural identifiability with advanced model-based design of experiments formulations is presented in this paper. The objective is to propose a general approach to design experiments in an optimal way, detecting a proper set of experimental settings that ensure the practical identifiability of PK-PD models. Two simulated case studies based on in vitro bacterial growth and killing models are presented to demonstrate the applicability and generality of the methodology to tackle model identifiability issues effectively, through the design of feasible and highly informative experiments.
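
    A minimal sketch of the idea behind such model-based design of experiments: compare the Fisher information provided by two candidate sampling schedules for a one-compartment IV-bolus PK model C(t) = (D/V) exp(-(CL/V) t). A near-singular information matrix signals practical identifiability problems. The model, parameter values and schedules are illustrative assumptions, not the paper's case studies.

      import numpy as np

      D = 100.0                                    # dose (mg)
      theta = np.array([20.0, 5.0])                # nominal parameters: V (L), CL (L/h)
      sigma = 0.2                                  # measurement SD (mg/L)

      def conc(times, theta):
          V, CL = theta
          return (D / V) * np.exp(-(CL / V) * times)

      def fisher_information(times, theta, h=1e-4):
          """FIM from central-difference sensitivities under iid Gaussian error."""
          S = np.zeros((len(times), len(theta)))
          for j in range(len(theta)):
              dp = np.zeros_like(theta)
              dp[j] = h * theta[j]
              S[:, j] = (conc(times, theta + dp) - conc(times, theta - dp)) / (2 * dp[j])
          return S.T @ S / sigma ** 2

      early = np.array([0.25, 0.5, 0.75, 1.0])     # all samples early: clearance barely informed
      spread = np.array([0.25, 1.0, 4.0, 12.0])    # samples spread over the elimination phase

      for name, times in (("early-only", early), ("spread", spread)):
          F = fisher_information(times, theta)
          print(f"{name:10s} det(FIM) = {np.linalg.det(F):10.3g}  cond(FIM) = {np.linalg.cond(F):10.3g}")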

  5. Optimizing the vacuum plasma spray deposition of metal, ceramic, and cermet coatings using designed experiments

    NASA Astrophysics Data System (ADS)

    Kingswell, R.; Scott, K. T.; Wassell, L. L.

    1993-06-01

    The vacuum plasma spray (VPS) deposition of metal, ceramic, and cermet coatings has been investigated using designed statistical experiments. Processing conditions that were considered likely to have a significant influence on the melting characteristics of the precursor powders and hence deposition efficiency were incorporated into full and fractional factorial experimental designs. The processing of an alumina powder was very sensitive to variations in the deposition conditions, particularly the injection velocity of the powder into the plasma flame, the plasma gas composition, and the power supplied to the gun. Using a combination of full and fractional factorial experimental designs, it was possible to rapidly identify the important spraying variables and adjust these to produce a deposition efficiency approaching 80 percent. The deposition of a nickel-base alloy metal powder was less sensitive to processing conditions. Generally, however, a high degree of particle melting was achieved for a wide range of spray conditions. Preliminary experiments performed using a tungsten carbide/cobalt cermet powder indicated that spray efficiency was not sensitive to deposition conditions. However, microstructural analysis revealed considerable variations in the degree of tungsten carbide dissolution. The structure and properties of the optimized coatings produced in the factorial experiments are also discussed.

  6. Space-filling designs for computer experiments: A review

    DOE PAGES

    Joseph, V. Roshan

    2016-01-29

    Improving the quality of a product/process using a computer simulator is a much less expensive option than real physical testing. However, simulation using computationally intensive computer models can be time consuming and, therefore, directly doing the optimization on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used for overcoming this problem. This article reviews experimental designs known as space-filling designs that are suitable for computer simulations. In the review, a special emphasis is given to a recently developed space-filling design called the maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.
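
    A minimal sketch of a space-filling design of the kind reviewed above: random Latin hypercube samples in the unit cube, keeping the candidate that maximizes the minimum pairwise distance (a crude maximin search, not the maximum projection design itself).

      import numpy as np

      def latin_hypercube(n, d, rng):
          """n points in d dimensions; every 1-D projection is stratified into n bins."""
          shuffled_bins = np.stack([rng.permutation(n) for _ in range(d)], axis=1)
          return (shuffled_bins + rng.random((n, d))) / n

      def min_pairwise_distance(X):
          diff = X[:, None, :] - X[None, :, :]
          dist = np.sqrt((diff ** 2).sum(axis=-1))
          return dist[np.triu_indices(len(X), k=1)].min()

      rng = np.random.default_rng(0)
      best, best_score = None, -np.inf
      for _ in range(200):                          # simple random-restart search
          candidate = latin_hypercube(n=10, d=3, rng=rng)
          score = min_pairwise_distance(candidate)
          if score > best_score:
              best, best_score = candidate, score

      print(round(float(best_score), 3))
      print(np.round(best, 2))                      # 10 space-filling runs for a 3-input simulator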

  7. Space-filling designs for computer experiments: A review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph, V. Roshan

    Improving the quality of a product/process using a computer simulator is a much less expensive option than real physical testing. However, simulation using computationally intensive computer models can be time consuming and, therefore, directly doing the optimization on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used for overcoming this problem. This article reviews experimental designs known as space-filling designs that are suitable for computer simulations. In the review, a special emphasis is given to a recently developed space-filling design called the maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.

  8. Image Understanding. Proceedings of a Workshop Held in Pittsburgh, Pennsylvania on 11-13 September, 1990

    DTIC Science & Technology

    1990-09-01

    performed some preliminary experiments to detect the ships in the high resolution... longest piers are about three times the length of a destroyer...statistics, and these coordinates are then shipped via a high-speed interface to a host where the stereo triangulation and kinematic control algorithms... Grasp...Design: Perception research includes the design of new sensor technologies, such as this hybrid analog/digital chip for a high-speed light-stripe

  9. High productivity chromatography refolding process for Hepatitis B Virus X (HBx) protein guided by statistical design of experiment studies.

    PubMed

    Basu, Anindya; Leong, Susanna Su Jan

    2012-02-03

    The Hepatitis B Virus X (HBx) protein is a potential therapeutic target for the treatment of hepatocellular carcinoma. However, consistent expression of the protein as insoluble inclusion bodies in bacteria host systems has largely hindered HBx manufacturing via economical biosynthesis routes, thereby impeding the development of anti-HBx therapeutic strategies. To eliminate this roadblock, this work reports the development of the first 'chromatography refolding'-based bioprocess for HBx using immobilised metal affinity chromatography (IMAC). This process enabled production of HBx at quantities and purity that facilitate their direct use in structural and molecular characterization studies. In line with the principles of quality by design (QbD), we used a statistical design of experiments (DoE) methodology to design the optimum process which delivered bioactive HBx at a productivity of 0.21 mg/ml/h at a refolding yield of 54% (at 10 mg/ml refolding concentration), which was 4.4-fold higher than that achieved in dilution refolding. The systematic DoE methodology adopted for this study enabled us to obtain important insights into the effect of different bioprocess parameters like the effect of buffer exchange gradients on HBx productivity and quality. Such a bioprocess design approach can play a pivotal role in developing intensified processes for other novel proteins, and hence helping to resolve validation and speed-to-market challenges faced by the biopharmaceutical industry today. Copyright © 2011 Elsevier B.V. All rights reserved.

  10. Proceedings of the Conference on the Design of Experiments in Army Research, Development and Testing (19th) Held at Rock Island, Illinois, on 24-26 October 1973

    DTIC Science & Technology

    1974-11-01

    everyone else. This is an incomplete and unproductive introduction to Bayesian statistics and is more likely to lead to polemics than to understanding. A...more informative introduction can be achieved by criticizing some of the ideas underlying the frequentist viewpoint. I shall therefore start by...statistics. It is not easy to discuss this point constructively, and so I simply record my own view that such rejection is premature, and that the introduction

  11. AAFE man-made noise experiment project. Volume 1: Introduction experiment definition and requirements

    NASA Technical Reports Server (NTRS)

    1974-01-01

    An experiment was conducted to measure and map the man-made radio frequency emanations which exist at earth orbital altitudes. The major objectives of the program are to develop a complete conceptual experiment and developmental hardware for the collection and processing of data required to produce meaningful statistics on man-made noise level variations as functions of time, frequency, and geographic location. A wide dispersion measurement receiver mounted in a spacecraft operating in a specialized orbit is used to obtain the data. A summary of the experiment design goals and constraints is provided. The recommended orbit for the spacecraft is defined. The characteristics of the receiver and the antennas are analyzed.

  12. Investigation of Sb-Containing Precursors for Cu(In, Ga)Se2 Thin Films Through Design of Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mansfield, Lorelle M.; To, Bobby; Reedy, Robert C.

    2016-11-21

    The Design of Experiments (DoE) module in JMP statistical software was used to determine the best parameters for Sb-containing CIGS precursors with a fixed selenization step. Solar cells were fabricated and measured for all completed films. The most important factor influencing the current-voltage device parameters was identified as the interaction between temperature and antimony flux. The DoE prediction profiler and predictive contour plots provided guidance to further improve the device parameters. In one follow-up run, we increased device efficiency from 14.9% to 15.5%. Additional gains in efficiency, to 16.9%, were realized by introducing an intentional Ga gradient and an antireflective coating.
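
    The temperature-by-antimony-flux interaction highlighted above is the kind of term a DoE analysis estimates directly; the sketch below fits such an interaction model to invented coded data (it does not use the JMP software or the reported measurements):

        import numpy as np

        # Coded settings (-1/+1) for substrate temperature and Sb flux, with invented
        # device efficiencies (%) for a replicated 2x2 design.
        T   = np.array([-1, -1,  1,  1, -1, -1,  1,  1], dtype=float)
        Sb  = np.array([-1,  1, -1,  1, -1,  1, -1,  1], dtype=float)
        eff = np.array([13.8, 14.2, 14.6, 15.4, 13.6, 14.1, 14.8, 15.5])

        # Model with main effects and the interaction: eff = b0 + b1*T + b2*Sb + b3*T*Sb
        X = np.column_stack([np.ones_like(T), T, Sb, T * Sb])
        beta, *_ = np.linalg.lstsq(X, eff, rcond=None)
        print("b0, b_T, b_Sb, b_TxSb =", np.round(beta, 3))

        # Profiler-style prediction over a grid of coded settings.
        grid = np.linspace(-1, 1, 5)
        pred = np.array([[beta @ [1, t, s, t * s] for s in grid] for t in grid])
        print("Predicted efficiency at T=+1, Sb=+1:", round(pred[-1, -1], 2), "%")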

  13. Statistical inference from multiple iTRAQ experiments without using common reference standards.

    PubMed

    Herbrich, Shelley M; Cole, Robert N; West, Keith P; Schulze, Kerry; Yager, James D; Groopman, John D; Christian, Parul; Wu, Lee; O'Meally, Robert N; May, Damon H; McIntosh, Martin W; Ruczinski, Ingo

    2013-02-01

    Isobaric tags for relative and absolute quantitation (iTRAQ) is a prominent mass spectrometry technology for protein identification and quantification that is capable of analyzing multiple samples in a single experiment. Frequently, iTRAQ experiments are carried out using an aliquot from a pool of all samples, or "masterpool", in one of the channels as a reference sample standard to estimate protein relative abundances in the biological samples and to combine abundance estimates from multiple experiments. In this manuscript, we show that using a masterpool is counterproductive. We obtain more precise estimates of protein relative abundance by using the available biological data instead of the masterpool and do not need to occupy a channel that could otherwise be used for another biological sample. In addition, we introduce a simple statistical method to associate proteomic data from multiple iTRAQ experiments with a numeric response and show that this approach is more powerful than the conventionally employed masterpool-based approach. We illustrate our methods using data from four replicate iTRAQ experiments on aliquots of the same pool of plasma samples and from a 406-sample project designed to identify plasma proteins that covary with nutrient concentrations in chronically undernourished children from South Asia.

  14. An Investigation of Civilians Preparedness to Compete with Individuals with Military Experience for Army Board Select Acquisition Positions

    DTIC Science & Technology

    2017-05-25

    The research employed a mixed research methodology – quantitative with descriptive statistical analysis and qualitative with a thematic analysis approach – using interviews to collect the data. The interviews included demographic and open-ended

  15. How to Introduce Historically the Normal Distribution in Engineering Education: A Classroom Experiment

    ERIC Educational Resources Information Center

    Blanco, Monica; Ginovart, Marta

    2010-01-01

    Little has been explored with regard to introducing historical aspects in the undergraduate statistics classroom in engineering studies. This article focuses on the design, implementation and assessment of a specific activity concerning the introduction of the normal probability curve and related aspects from a historical dimension. Following a…

  16. On the Hedges Correction for a "t"-Test

    ERIC Educational Resources Information Center

    VanHoudnos, Nathan M.; Greenhouse, Joel B.

    2016-01-01

    When cluster randomized experiments are analyzed as if units were independent, test statistics for treatment effects can be anticonservative. Hedges proposed a correction for such tests by scaling them to control their Type I error rate. This article generalizes the Hedges correction from a posttest-only experimental design to more common designs…

  17. A Meta-Analysis of Referential Communication Studies: A Computer Readable Literature Review.

    ERIC Educational Resources Information Center

    Dickson, W. Patrick; Moskoff, Mary

    A computer-assisted analysis of studies on referential communication (giving directions/explanations) located 66 reports involving 80 experiments, 114 referential tasks, and over 6,200 individuals. The studies were entered into a statistical software package system (SPSS) and analyzed for characteristics of the subjects and experimental designs,…

  18. A Study of the Effects of Multimedia Dynamic Teaching on Cognitive Load and Learning Outcome

    ERIC Educational Resources Information Center

    Zhang, Xiaozhu; Zhang, Xiurong; Yang, Xiaoming

    2016-01-01

    The statistics reveal that many students have learning difficulties. For this reason, appropriate curricula and materials should be planned to match multimedia teaching design in order to reduce students' learning frustration and the obstacles caused by insufficient experience and basic competence. Multimedia dynamic, a curriculum oriented…

  19. Latin and Magic Squares

    ERIC Educational Resources Information Center

    Emanouilidis, Emanuel

    2005-01-01

    Latin squares have existed for hundreds of years but it wasn't until rather recently that Latin squares were used in other areas such as statistics, graph theory, coding theory and the generation of random numbers as well as in the design and analysis of experiments. This note describes Latin and diagonal Latin squares, a method of constructing…

  20. Latin and Cross Latin Squares

    ERIC Educational Resources Information Center

    Emanouilidis, Emanuel

    2008-01-01

    Latin squares were first introduced and studied by the famous mathematician Leonhard Euler in the 1700s. Through the years, Latin squares have been used in areas such as statistics, graph theory, coding theory, the generation of random numbers as well as in the design and analysis of experiments. Recently, with the international popularity of…

  1. Design of experiment (DOE) based screening of factors affecting municipal solid waste (MSW) composting.

    PubMed

    Kazemi, Khoshrooz; Zhang, Baiyu; Lye, Leonard M; Cai, Qinghong; Cao, Tong

    2016-12-01

    A design of experiment (DOE) based methodology was adopted in this study to investigate the effects of multiple factors and their interactions on the performance of a municipal solid waste (MSW) composting process. The impact of four factors, carbon/nitrogen ratio (C/N), moisture content (MC), type of bulking agent (BA) and aeration rate (AR), on the maturity, stability and toxicity of the compost product was investigated. The statistically significant factors were identified using final C/N, germination index (GI) and especially the enzyme activities as responses. Experimental results validated the use of enzyme activities as proper indices during the course of composting. Maximum enzyme activities occurred during the active phase of decomposition. MC has a significant effect on dehydrogenase activity (DGH), β-glucosidase activity (BGH), phosphodiesterase activity (PDE) and the final moisture content of the compost. C/N is statistically significant for final C/N, DGH, BGH, and GI. The results provided guidance to optimize an MSW composting system that will lead to an increased decomposition rate and the production of more stable and mature compost. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Photoresist thin-film effects on alignment process capability

    NASA Astrophysics Data System (ADS)

    Flores, Gary E.; Flack, Warren W.

    1993-08-01

    Two photoresists were selected for alignment characterization based on their dissimilar coating properties and observed differences in alignment capability. The materials are Dynachem OFPR-800 and Shipley System 8. Both photoresists were examined on two challenging alignment levels in a submicron CMOS process, a nitride level and a planarized second-level metal. An Ultratech Stepper model 1500, which features a darkfield alignment system with broadband green light for alignment signal detection, was used for this project. Initially, statistically designed linear screening experiments were performed to examine six process factors for each photoresist: viscosity, spin acceleration, spin speed, spin time, softbake time, and softbake temperature. Using the results derived from the screening experiments, a more thorough examination of the statistically significant process factors was performed. A full quadratic experimental design was conducted to examine the effects of viscosity, spin speed, and spin time coating properties on alignment. This included a characterization of both intra- and inter-wafer alignment control and alignment process capability. The different alignment behavior is analyzed in terms of photoresist material properties and the physical nature of the alignment detection system.

  3. Use of statistical design of experiments for surface modification of Kapton films by CF4-O2 microwave plasma treatment

    NASA Astrophysics Data System (ADS)

    Grandoni, Andrea; Mannini, Giacomo; Glisenti, Antonella; Manariti, Antonella; Galli, Giancarlo

    2017-10-01

    A statistical design of experiments (DoE) was used to evaluate the effects of CF4-O2 plasma on Kapton films in which the duration of treatment, volume ratio of plasma gases, and microwave power were selected as effective experimental factors for systematic investigation of surface modification. Static water contact angle (θW), polar component of surface free energy (γSp) and surface O/C atomic ratio were analyzed as response variables. A significant enhancement in wettability and polarity of the treated films compared to untreated Kapton films was observed; depending on the experimental conditions, θW very significantly decreased, showing full wettability, and γSp rose dramatically, up to ten times. Within the DoE the conditions of plasma treatment were identified that resulted in selected optimal values of θW, γSp and O/C responses. Surface chemical changes were detected by XPS and ATR-IR investigations that evidenced both the introduction of fluorinated groups and the opening of the imide ring in the plasma-treated films.

  4. How to get statistically significant effects in any ERP experiment (and why you shouldn't).

    PubMed

    Luck, Steven J; Gaspelin, Nicholas

    2017-01-01

    ERP experiments generate massive datasets, often containing thousands of values for each participant, even after averaging. The richness of these datasets can be very useful in testing sophisticated hypotheses, but this richness also creates many opportunities to obtain effects that are statistically significant but do not reflect true differences among groups or conditions (bogus effects). The purpose of this paper is to demonstrate how common and seemingly innocuous methods for quantifying and analyzing ERP effects can lead to very high rates of significant but bogus effects, with the likelihood of obtaining at least one such bogus effect exceeding 50% in many experiments. We focus on two specific problems: using the grand-averaged data to select the time windows and electrode sites for quantifying component amplitudes and latencies, and using one or more multifactor statistical analyses. Reanalyses of prior data and simulations of typical experimental designs are used to show how these problems can greatly increase the likelihood of significant but bogus results. Several strategies are described for avoiding these problems and for increasing the likelihood that significant effects actually reflect true differences among groups or conditions. © 2016 Society for Psychophysiological Research.
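
    The multifactor-analysis problem described above can be reproduced with a small simulation; assuming seven orthogonal effect tests at α = .05 under the null (as in a 2x2x2 factorial ANOVA), the familywise rate of at least one bogus effect is already about 30%. Sample sizes and test counts below are assumptions for illustration:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        # A 2x2x2 factorial ANOVA yields 7 effect tests (3 main effects,
        # 3 two-way interactions, 1 three-way interaction).
        n_effects, alpha = 7, 0.05
        print("Analytic familywise rate:", round(1 - (1 - alpha) ** n_effects, 3))  # ~0.30

        # Monte Carlo check under the null: each effect is an orthogonal contrast
        # tested with a one-sample t test across subjects; all data are pure noise.
        n_subjects, n_experiments = 24, 10_000
        t_crit = stats.t.ppf(1 - alpha / 2, df=n_subjects - 1)
        contrasts = rng.standard_normal((n_experiments, n_effects, n_subjects))
        t_vals = contrasts.mean(axis=2) / (contrasts.std(axis=2, ddof=1) / np.sqrt(n_subjects))
        print("Simulated P(at least one bogus effect):",
              np.mean((np.abs(t_vals) > t_crit).any(axis=1)))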

  5. How to Get Statistically Significant Effects in Any ERP Experiment (and Why You Shouldn’t)

    PubMed Central

    Luck, Steven J.; Gaspelin, Nicholas

    2016-01-01

    Event-related potential (ERP) experiments generate massive data sets, often containing thousands of values for each participant, even after averaging. The richness of these data sets can be very useful in testing sophisticated hypotheses, but this richness also creates many opportunities to obtain effects that are statistically significant but do not reflect true differences among groups or conditions (bogus effects). The purpose of this paper is to demonstrate how common and seemingly innocuous methods for quantifying and analyzing ERP effects can lead to very high rates of significant-but-bogus effects, with the likelihood of obtaining at least one such bogus effect exceeding 50% in many experiments. We focus on two specific problems: using the grand average data to select the time windows and electrode sites for quantifying component amplitudes and latencies, and using one or more multi-factor statistical analyses. Re-analyses of prior data and simulations of typical experimental designs are used to show how these problems can greatly increase the likelihood of significant-but-bogus results. Several strategies are described for avoiding these problems and for increasing the likelihood that significant effects actually reflect true differences among groups or conditions. PMID:28000253

  6. Computational Analysis for Rocket-Based Combined-Cycle Systems During Rocket-Only Operation

    NASA Technical Reports Server (NTRS)

    Steffen, C. J., Jr.; Smith, T. D.; Yungster, S.; Keller, D. J.

    2000-01-01

    A series of Reynolds-averaged Navier-Stokes calculations were employed to study the performance of rocket-based combined-cycle systems operating in an all-rocket mode. This parametric series of calculations were executed within a statistical framework, commonly known as design of experiments. The parametric design space included four geometric and two flowfield variables set at three levels each, for a total of 729 possible combinations. A D-optimal design strategy was selected. It required that only 36 separate computational fluid dynamics (CFD) solutions be performed to develop a full response surface model, which quantified the linear, bilinear, and curvilinear effects of the six experimental variables. The axisymmetric, Reynolds-averaged Navier-Stokes simulations were executed with the NPARC v3.0 code. The response used in the statistical analysis was created from Isp efficiency data integrated from the 36 CFD simulations. The influence of turbulence modeling was analyzed by using both one- and two-equation models. Careful attention was also given to quantify the influence of mesh dependence, iterative convergence, and artificial viscosity upon the resulting statistical model. Thirteen statistically significant effects were observed to have an influence on rocket-based combined-cycle nozzle performance. It was apparent that the free-expansion process, directly downstream of the rocket nozzle, can influence the Isp efficiency. Numerical schlieren images and particle traces have been used to further understand the physical phenomena behind several of the statistically significant results.
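
    A simplified sketch of the design step described above is given below: 36 runs are selected from the 3^6 candidate grid by a basic point-exchange search that approximately maximizes det(X'X) for a full quadratic response-surface model. This is generic illustration code, not the tool used in the study:

        import itertools
        import numpy as np

        rng = np.random.default_rng(2)

        def quadratic_model_matrix(x):
            """Intercept, linear, squared, and two-factor-interaction terms."""
            n, k = x.shape
            cols = [np.ones(n)] + [x[:, i] for i in range(k)] + [x[:, i] ** 2 for i in range(k)]
            cols += [x[:, i] * x[:, j] for i, j in itertools.combinations(range(k), 2)]
            return np.column_stack(cols)

        # Candidate set: the 3^6 = 729 grid of coded settings (-1, 0, +1) for six variables.
        grid = np.array(list(itertools.product([-1, 0, 1], repeat=6)), dtype=float)
        F = quadratic_model_matrix(grid)                  # 729 x 28 model matrix

        def logdet(rows):
            sign, val = np.linalg.slogdet(F[rows].T @ F[rows])
            return val if sign > 0 else -np.inf

        # Greedy point-exchange search for an (approximately) D-optimal 36-run subset.
        design = list(rng.choice(len(grid), size=36, replace=False))
        for _ in range(2):                                # a couple of exchange passes
            for pos in range(36):
                current = logdet(design)
                for cand in range(len(grid)):
                    trial = design.copy()
                    trial[pos] = cand
                    if logdet(trial) > current:
                        design, current = trial, logdet(trial)
        print("log det(X'X) of the selected 36-run design:", round(logdet(design), 2))

    Exchange-type searches of this kind are a common way to approximate D-optimality on a finite candidate set; commercial DoE software typically uses more refined variants of the same idea.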

  7. In vivo evaluation of the effect of stimulus distribution on FIR statistical efficiency in event-related fMRI

    PubMed Central

    Jansma, J Martijn; de Zwart, Jacco A; van Gelderen, Peter; Duyn, Jeff H; Drevets, Wayne C; Furey, Maura L

    2013-01-01

    Technical developments in MRI have improved the signal-to-noise ratio, allowing the use of analysis methods such as finite impulse response (FIR) modeling of rapid event-related functional MRI (er-fMRI). FIR is one of the most informative analysis methods, as it determines the onset and full shape of the hemodynamic response function (HRF) without any a priori assumptions. FIR is, however, vulnerable to multicollinearity, which is directly related to the distribution of stimuli over time. Efficiency can be optimized by simplifying a design and restricting the stimulus distribution to specific sequences, while more design flexibility necessarily reduces efficiency. However, the actual effect of efficiency on fMRI results has never been tested in vivo. Thus, it is currently difficult to make an informed choice between protocol flexibility and statistical efficiency. The main goal of this study was to assign concrete fMRI signal-to-noise values to the abstract scale of FIR statistical efficiency. Ten subjects repeated a perception task with five random and m-sequence-based protocols, with varying but, according to the literature, acceptable levels of multicollinearity. Results indicated substantial differences in signal standard deviation, the level of which was a function of multicollinearity. Experiment protocols varied by up to 55.4% in standard deviation. The results confirm that the quality of fMRI in an FIR analysis can vary significantly and substantially with statistical efficiency. Our in vivo measurements can be used to aid in making an informed decision between freedom in protocol design and statistical efficiency. PMID:23473798
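
    The dependence of FIR estimation on the stimulus distribution can be made concrete by computing the classical efficiency of the FIR design matrix, 1/trace((X'X)^-1); the scan counts, lag window, and onset schedules below are assumptions for illustration only:

        import numpy as np

        rng = np.random.default_rng(3)

        def fir_design_matrix(onsets, n_scans, n_lags):
            """FIR design matrix: one regressor per post-stimulus lag, plus a baseline."""
            X = np.zeros((n_scans, n_lags))
            for onset in onsets:
                for lag in range(n_lags):
                    if onset + lag < n_scans:
                        X[onset + lag, lag] += 1.0
            return np.column_stack([X, np.ones(n_scans)])

        def efficiency(X):
            """Classical estimation efficiency: 1 / trace((X'X)^-1)."""
            return 1.0 / np.trace(np.linalg.inv(X.T @ X))

        n_scans, n_lags, n_events = 300, 12, 40
        random_onsets = np.sort(rng.choice(n_scans - n_lags, size=n_events, replace=False))
        fixed_onsets = np.arange(n_events) * ((n_scans - n_lags) // n_events)  # evenly spaced

        print("efficiency, randomly jittered onsets:",
              round(efficiency(fir_design_matrix(random_onsets, n_scans, n_lags)), 4))
        print("efficiency, fixed regular onsets    :",
              round(efficiency(fir_design_matrix(fixed_onsets, n_scans, n_lags)), 4))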

  8. Guidelines for Genome-Scale Analysis of Biological Rhythms.

    PubMed

    Hughes, Michael E; Abruzzi, Katherine C; Allada, Ravi; Anafi, Ron; Arpat, Alaaddin Bulak; Asher, Gad; Baldi, Pierre; de Bekker, Charissa; Bell-Pedersen, Deborah; Blau, Justin; Brown, Steve; Ceriani, M Fernanda; Chen, Zheng; Chiu, Joanna C; Cox, Juergen; Crowell, Alexander M; DeBruyne, Jason P; Dijk, Derk-Jan; DiTacchio, Luciano; Doyle, Francis J; Duffield, Giles E; Dunlap, Jay C; Eckel-Mahan, Kristin; Esser, Karyn A; FitzGerald, Garret A; Forger, Daniel B; Francey, Lauren J; Fu, Ying-Hui; Gachon, Frédéric; Gatfield, David; de Goede, Paul; Golden, Susan S; Green, Carla; Harer, John; Harmer, Stacey; Haspel, Jeff; Hastings, Michael H; Herzel, Hanspeter; Herzog, Erik D; Hoffmann, Christy; Hong, Christian; Hughey, Jacob J; Hurley, Jennifer M; de la Iglesia, Horacio O; Johnson, Carl; Kay, Steve A; Koike, Nobuya; Kornacker, Karl; Kramer, Achim; Lamia, Katja; Leise, Tanya; Lewis, Scott A; Li, Jiajia; Li, Xiaodong; Liu, Andrew C; Loros, Jennifer J; Martino, Tami A; Menet, Jerome S; Merrow, Martha; Millar, Andrew J; Mockler, Todd; Naef, Felix; Nagoshi, Emi; Nitabach, Michael N; Olmedo, Maria; Nusinow, Dmitri A; Ptáček, Louis J; Rand, David; Reddy, Akhilesh B; Robles, Maria S; Roenneberg, Till; Rosbash, Michael; Ruben, Marc D; Rund, Samuel S C; Sancar, Aziz; Sassone-Corsi, Paolo; Sehgal, Amita; Sherrill-Mix, Scott; Skene, Debra J; Storch, Kai-Florian; Takahashi, Joseph S; Ueda, Hiroki R; Wang, Han; Weitz, Charles; Westermark, Pål O; Wijnen, Herman; Xu, Ying; Wu, Gang; Yoo, Seung-Hee; Young, Michael; Zhang, Eric Erquan; Zielinski, Tomasz; Hogenesch, John B

    2017-10-01

    Genome biology approaches have made enormous contributions to our understanding of biological rhythms, particularly in identifying outputs of the clock, including RNAs, proteins, and metabolites, whose abundance oscillates throughout the day. These methods hold significant promise for future discovery, particularly when combined with computational modeling. However, genome-scale experiments are costly and laborious, yielding "big data" that are conceptually and statistically difficult to analyze. There is no obvious consensus regarding design or analysis. Here we discuss the relevant technical considerations to generate reproducible, statistically sound, and broadly useful genome-scale data. Rather than suggest a set of rigid rules, we aim to codify principles by which investigators, reviewers, and readers of the primary literature can evaluate the suitability of different experimental designs for measuring different aspects of biological rhythms. We introduce CircaInSilico, a web-based application for generating synthetic genome biology data to benchmark statistical methods for studying biological rhythms. Finally, we discuss several unmet analytical needs, including applications to clinical medicine, and suggest productive avenues to address them.

  10. Approach to design space from retrospective quality data.

    PubMed

    Puñal Peces, Daniel; García-Montoya, Encarna; Manich, Albert; Suñé-Negre, Josep Maria; Pérez-Lozano, Pilar; Miñarro, Montse; Ticó, Josep Ramon

    2016-01-01

    Nowadays, the entire manufacturing process is based on the current GMPs, which emphasize the reproducibility of the process, and companies hold large amounts of recorded data about their processes. The objective was the establishment of the design space (DS) from retrospective data for a wet compression process. A design of experiments (DoE) with historical data from 4 years of industrial production was carried out, using as experimental factors the results of a previous risk analysis and eight key parameters (quality specifications) that encompassed process and quality control data. Statgraphics 5.0 software was applied, and the data were processed to obtain eight DS as well as their safe and working ranges. Experience shows that it is possible to determine the DS retrospectively; the greatest difficulty lies in handling and processing large amounts of data. However, the approach is of considerable practical interest because it yields the DS with minimal investment in new experiments, since actual production batch data are processed statistically.

  11. A Test of the Effectiveness of Time Management Training in a Department of the Navy Program Management Office (PMO)

    DTIC Science & Technology

    1977-05-01

    An experiment, designed to introduce time management concepts, was conducted with 33 volunteers from a Department of the Navy PMO -- the experimental...group. The instruments used to conduct the experiment were a Time Management Survey and a Time Management Questionnaire. The survey was used to...data obtained from the experimental group were statistically compared with similar data from a control group. Time management principles and 'tips' on

  12. Magnetic Johnson Noise Constraints on Electron Electric Dipole Moment Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munger, C.

    2004-11-18

    Magnetic fields from statistical fluctuations in currents in conducting materials broaden atomic linewidths by the Zeeman effect. The constraints so imposed on the design of experiments to measure the electric dipole moment of the electron are analyzed. Contrary to the predictions of Lamoreaux [S.K. Lamoreaux, Phys. Rev. A60, 1717(1999)], the standard material for high-permeability magnetic shields proves to be as significant a source of broadening as an ordinary metal. A scheme that would replace this standard material with ferrite is proposed.

  13. Proceedings of the Conference on the Design of Experiments in Army Research Development and Testing (32nd)

    DTIC Science & Technology

    1987-06-01

    number of series among the 63 which were identified as a particular ARIMA form and were "best" modeled by a particular technique. Figure 1 illustrates a...th time from xe's. The integrated autoregressive-moving average model, denoted by ARIMA (p,d,q), is a result of combining the d-th differencing process...Experiments, (4) Data Analysis and Modeling, (5) Theory and Probabilistic Inference, (6) Fuzzy Statistics, (7) Forecasting and Prediction, (8) Small Sample

  14. An experimental study of the temporal statistics of radio signals scattered by rain

    NASA Technical Reports Server (NTRS)

    Hubbard, R. W.; Hull, J. A.; Rice, P. L.; Wells, P. I.

    1973-01-01

    A fixed-beam bistatic CW experiment designed to measure the temporal statistics of the volume reflectivity produced by hydrometeors at several selected altitudes, scattering angles, and at two frequencies (3.6 and 7.8 GHz) is described. Surface rain gauge data, local meteorological data, surveillance S-band radar, and great-circle path propagation measurements were also made to describe the general weather and propagation conditions and to distinguish precipitation scatter signals from those caused by ducting and other nonhydrometeor scatter mechanisms. The data analysis procedures were designed to provide an assessment of a one-year sample of data with a time resolution of one minute. The cumulative distributions of the bistatic signals for all of the rainy minutes during this period are presented for the several path geometries.

  15. Experimental design and data analysis of Ago-RIP-Seq experiments for the identification of microRNA targets.

    PubMed

    Tichy, Diana; Pickl, Julia Maria Anna; Benner, Axel; Sültmann, Holger

    2017-03-31

    The identification of microRNA (miRNA) target genes is crucial for understanding miRNA function. Many methods for the genome-wide miRNA target identification have been developed in recent years; however, they have several limitations, including the dependence on low-confidence prediction programs and artificial miRNA manipulations. Ago-RNA immunoprecipitation combined with high-throughput sequencing (Ago-RIP-Seq) is a promising alternative. However, appropriate statistical data analysis algorithms taking into account the experimental design and the inherent noise of such experiments are largely lacking. Here, we investigate the experimental design for Ago-RIP-Seq and examine biostatistical methods to identify de novo miRNA target genes. Statistical approaches considered are either based on a negative binomial model fit to the read count data or applied to transformed data using a normal distribution-based generalized linear model. We compare them by a real-data simulation study using plasmode data sets and evaluate the suitability of the approaches to detect true miRNA targets by sensitivity and false discovery rates. Our results suggest that simple approaches like linear regression models on (appropriately) transformed read count data are preferable. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  16. Design of experiments and data analysis challenges in calibration for forensics applications

    DOE PAGES

    Anderson-Cook, Christine M.; Burr, Thomas L.; Hamada, Michael S.; ...

    2015-07-15

    Forensic science aims to infer characteristics of source terms using measured observables. Our focus is on statistical design of experiments and data analysis challenges arising in nuclear forensics. More specifically, we focus on inferring aspects of experimental conditions (of a process to produce product Pu oxide powder), such as temperature, nitric acid concentration, and Pu concentration, using measured features of the product Pu oxide powder. The measured features, Y, include trace chemical concentrations and particle morphology such as particle size and shape of the produced Pu oxide powder particles. Making inferences about the nature of inputs X that were used to create nuclear materials having particular characteristics, Y, is an inverse problem. Therefore, statistical analysis can be used to identify the best set (or sets) of Xs for a new set of observed responses Y. One can fit a model (or models) such as Y = f(X) + error, for each of the responses, based on a calibration experiment and then "invert" to solve for the best set of Xs for a new set of Ys. This perspectives paper uses archived experimental data to consider aspects of data collection and experiment design for the calibration data to maximize the quality of the predicted Ys in the forward models; that is, we assume that well-estimated forward models are effective in the inverse problem. In addition, we consider how to identify a best solution for the inferred X, and evaluate the quality of the result and its robustness to a variety of initial assumptions and different correlation structures between the responses. Finally, we briefly review recent advances in metrology issues related to characterizing the particle morphology measurements used in the response vector, Y.
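
    A toy version of the fit-then-invert workflow described above is sketched below with invented linear forward models and simulated calibration data; the real problem involves far richer responses and nonlinear models:

        import numpy as np

        rng = np.random.default_rng(4)

        # Simulated calibration experiment: coded process inputs X (e.g. temperature,
        # acid molarity, Pu concentration) and measured product features Y (invented).
        n_runs = 60
        X = rng.uniform(-1, 1, size=(n_runs, 3))
        true_B = np.array([[ 1.0, -0.5,  0.2],     # each column maps X to one feature of Y
                           [ 0.3,  0.8, -0.4],
                           [-0.2,  0.1,  0.9]])
        Y = X @ true_B + 0.05 * rng.standard_normal((n_runs, 3))

        # Forward models: one linear fit per response, with intercept.
        Xd = np.column_stack([np.ones(n_runs), X])
        B_hat, *_ = np.linalg.lstsq(Xd, Y, rcond=None)

        # Inverse problem: given a newly observed feature vector, find the process
        # conditions whose predicted features match it best (grid search here).
        y_new = np.array([0.4, -0.1, 0.6])
        grid = np.array(np.meshgrid(*[np.linspace(-1, 1, 41)] * 3)).reshape(3, -1).T
        preds = np.column_stack([np.ones(len(grid)), grid]) @ B_hat
        x_best = grid[np.argmin(np.sum((preds - y_new) ** 2, axis=1))]
        print("Inferred process conditions (coded units):", np.round(x_best, 2))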

  18. A practical approach for the scale-up of roller compaction process.

    PubMed

    Shi, Weixian; Sprockel, Omar L

    2016-09-01

    An alternative approach for the scale-up of ribbon formation during roller compaction was investigated, which required only one batch at the commercial scale to set the operational conditions. The scale-up of ribbon formation was based on a probability method, which was sufficient to describe the mechanism of ribbon formation at both scales. In this method, a statistical relationship between roller compaction parameters and ribbon attributes (thickness and density) was first defined with a DoE using a pilot Alexanderwerk WP120 roller compactor. While the milling speed was included in the design, it had no practical effect on granule properties within the study range despite its statistical significance. The statistical relationship was then adapted to a commercial Alexanderwerk WP200 roller compactor with one experimental run. The experimental run served as a calibration of the statistical model parameters. The proposed transfer method was then confirmed by conducting a mapping study on the Alexanderwerk WP200 using a factorial DoE, which showed a match between the predictions and the verification experiments. The study demonstrates the applicability of the roller compaction transfer method using the statistical model from the development scale calibrated with one experiment point at the commercial scale. Copyright © 2016 Elsevier B.V. All rights reserved.
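
    One simple reading of the one-run calibration idea is an intercept (offset) correction to the pilot-scale model, as sketched below with invented numbers; this illustrates the concept only and is not necessarily the exact procedure used in the study:

        import numpy as np

        # Pilot-scale DoE data (invented): roll force [kN/cm] and gap [mm] vs ribbon density [g/cc].
        force = np.array([4.0, 4.0, 8.0, 8.0, 6.0, 6.0])
        gap   = np.array([1.5, 2.5, 1.5, 2.5, 2.0, 2.0])
        dens  = np.array([1.02, 0.95, 1.18, 1.10, 1.07, 1.08])

        X = np.column_stack([np.ones_like(force), force, gap])
        beta_pilot, *_ = np.linalg.lstsq(X, dens, rcond=None)

        # Single calibration run at commercial scale: same settings, different observed density.
        force_c, gap_c, dens_c = 6.0, 2.0, 1.12
        offset = dens_c - np.array([1.0, force_c, gap_c]) @ beta_pilot

        beta_commercial = beta_pilot.copy()
        beta_commercial[0] += offset          # shift only the intercept to the new scale

        new_point = np.array([1.0, 7.0, 1.8]) # predict a commercial operating point
        print("Predicted commercial ribbon density:", round(new_point @ beta_commercial, 3), "g/cc")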

  19. How often should we expect to be wrong? Statistical power, P values, and the expected prevalence of false discoveries.

    PubMed

    Marino, Michael J

    2018-05-01

    There is a clear perception in the literature that there is a crisis in reproducibility in the biomedical sciences. Many underlying factors contributing to the prevalence of irreproducible results have been highlighted with a focus on poor design and execution of experiments along with the misuse of statistics. While these factors certainly contribute to irreproducibility, relatively little attention outside of the specialized statistical literature has focused on the expected prevalence of false discoveries under idealized circumstances. In other words, when everything is done correctly, how often should we expect to be wrong? Using a simple simulation of an idealized experiment, it is possible to show the central role of sample size and the related quantity of statistical power in determining the false discovery rate, and in accurate estimation of effect size. According to our calculations, based on current practice many subfields of biomedical science may expect their discoveries to be false at least 25% of the time, and the only viable course to correct this is to require the reporting of statistical power and a minimum of 80% power (1 - β = 0.80) for all studies. Copyright © 2017 Elsevier Inc. All rights reserved.
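
    The arithmetic behind this expectation is compact: with prior probability π that a tested hypothesis is truly non-null, significance level α, and power 1 − β, the expected fraction of significant results that are false is α(1 − π) / (α(1 − π) + (1 − β)π). A short calculation with assumed values:

        def expected_false_discovery_rate(alpha, power, prior):
            """Expected fraction of significant results that are false positives."""
            false_pos = alpha * (1 - prior)     # truly null hypotheses declared significant
            true_pos = power * prior            # truly non-null hypotheses detected
            return false_pos / (false_pos + true_pos)

        # Illustrative values: one in four tested hypotheses is truly non-null.
        for power in (0.20, 0.50, 0.80):
            fdr = expected_false_discovery_rate(alpha=0.05, power=power, prior=0.25)
            print(f"power = {power:.2f} -> expected false discovery rate = {fdr:.1%}")

    With these assumed numbers the expected false discovery rate falls from roughly 43% at 20% power to about 16% at 80% power, which is the direction of the argument made above.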

  20. Use of Taguchi methodology to enhance the yield of caffeine removal with growing cultures of Pseudomonas pseudoalcaligenes.

    PubMed

    Ashengroph, Morahem; Ababaf, Sajad

    2014-12-01

    Microbial caffeine removal is a green solution for the treatment of caffeinated products and agro-industrial effluents. We directed this investigation to optimizing a bio-decaffeination process with growing cultures of Pseudomonas pseudoalcaligenes through the Taguchi methodology, a structured statistical approach that can lower variation in a process through design of experiments (DOE). Five parameters, i.e. initial fructose, tryptone, Zn(2+) ion and caffeine concentrations, as well as incubation time, were selected, and an L16 orthogonal array was applied to design experiments with four 4-level factors and one 3-level factor (4(4) × 1(3)). Data analysis was performed using the statistical analysis of variance (ANOVA) method. Furthermore, the optimal conditions were determined by combining the optimal levels of the significant factors and verified by a confirming experiment. Measurement of residual caffeine concentration in the reaction mixture was performed using high-performance liquid chromatography (HPLC). Use of the Taguchi methodology for optimization of design parameters resulted in about 86.14% reduction of caffeine in 48 h of incubation when 5 g/l fructose, 3 mM Zn(2+) ion and 4.5 g/l caffeine are present in the designed medium. Under the optimized conditions, the yield of degradation of caffeine (4.5 g/l) by the native strain of Pseudomonas pseudoalcaligenes TPS8 increased from 15.8% to 86.14%, which is 5.4-fold higher than the normal yield. According to the experimental results, the Taguchi methodology provides a powerful approach for identifying the favorable parameters for caffeine removal using strain TPS8, which suggests that the approach also has potential application with similar strains to improve the yield of caffeine removal from caffeine-containing solutions.
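
    After an orthogonal-array experiment has been run, the Taguchi analysis reduces to comparing mean signal-to-noise ratios across factor levels; the tiny design fragment and removal percentages below are invented for illustration (a real L16 array is orthogonal across all factors) and are not the study's data:

        import numpy as np

        # Invented fragment of an orthogonal-array experiment: level indices per run
        # and observed caffeine removal (%). A real L16 array is orthogonal in all factors.
        runs = np.array([
            # fructose, tryptone, Zn2+, caffeine, time
            [0, 0, 0, 0, 0],
            [0, 1, 1, 1, 1],
            [1, 0, 1, 0, 1],
            [1, 1, 0, 1, 0],
            [2, 0, 0, 1, 1],
            [2, 1, 1, 0, 0],
        ])
        removal = np.array([22.0, 35.0, 48.0, 41.0, 67.0, 73.0])
        factor_names = ["fructose", "tryptone", "Zn2+", "caffeine", "time"]

        # "Larger is better" signal-to-noise ratio per run, then mean S/N per factor level.
        sn = -10 * np.log10(1.0 / removal**2)
        for j, name in enumerate(factor_names):
            level_means = [sn[runs[:, j] == lvl].mean() for lvl in np.unique(runs[:, j])]
            print(f"{name:9s} best level = {int(np.argmax(level_means))} "
                  f"(mean S/N by level: {np.round(level_means, 2)})")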

  1. Statistical approaches to maximize recombinant protein expression in Escherichia coli: a general review.

    PubMed

    Papaneophytou, Christos P; Kontopidis, George

    2014-02-01

    The supply of many valuable proteins that have potential clinical or industrial use is often limited by their low natural availability. With the modern advances in genomics, proteomics and bioinformatics, the number of proteins being produced using recombinant techniques is exponentially increasing and seems to guarantee an unlimited supply of recombinant proteins. The demand for recombinant proteins has increased as more applications in several fields become a commercial reality. Escherichia coli (E. coli) is the most widely used expression system for the production of recombinant proteins for structural and functional studies. However, producing soluble proteins in E. coli is still a major bottleneck for structural biology projects. One of the most challenging steps in any structural biology project is predicting which protein or protein fragment will express solubly and purify for crystallographic studies. The production of soluble and active proteins is influenced by several factors including expression host, fusion tag, induction temperature and time. Statistically designed experiments are gaining acceptance in the production of recombinant proteins because they provide information on variable interactions that escape the "one-factor-at-a-time" method. Here, we review the most important factors affecting the production of recombinant proteins in a soluble form. Moreover, we provide information about how statistically designed experiments can increase protein yield and purity as well as identify conditions for crystal growth. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. Optimizing an experimental design for an electromagnetic experiment

    NASA Astrophysics Data System (ADS)

    Roux, Estelle; Garcia, Xavier

    2013-04-01

    Most geophysical studies focus on data acquisition and analysis, but another aspect which is gaining importance is the discussion on acquisition of suitable datasets. This can be done through the design of an optimal experiment. Optimizing an experimental design implies a compromise between maximizing the information we get about the target and reducing the cost of the experiment, considering a wide range of constraints (logistical, financial, experimental...). We are currently developing a method to design an optimal controlled-source electromagnetic (CSEM) experiment to detect a potential CO2 reservoir and monitor this reservoir during and after CO2 injection. Our statistical algorithm combines the use of linearized inverse theory (to evaluate the quality of one given design via the objective function) and stochastic optimization methods like genetic algorithms (to examine a wide range of possible surveys). The particularity of our method is that it uses a multi-objective genetic algorithm that searches for designs that fit several objective functions simultaneously. One main advantage of this kind of technique for designing an experiment is that it does not require the acquisition of any data and can thus be easily conducted before any geophysical survey. Our new experimental design algorithm has been tested with a realistic one-dimensional resistivity model of the Earth in the region of study (northern Spain CO2 sequestration test site). We show that a small number of well-distributed observations has the potential to resolve the target. This simple test also points out the importance of a well-chosen objective function. Finally, in the context of CO2 sequestration that motivates this study, we might be interested in maximizing the information we get about the reservoir layer. In that case, we show how the combination of two different objective functions considerably improves its resolution.

  3. Effects of Platform Design on the Customer Experience in an Online Solar PV Marketplace

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    OShaughnessy, Eric J.; Margolis, Robert M.; Leibowicz, Benjamin

    We analyze a unique dataset of residential solar PV quotes offered in an online marketplace to understand how platform design changes affect customer outcomes. Three of the four design changes are associated with statistically significant and robust reductions in offer prices, though none of the policies were designed explicitly to reduce prices. The results suggest that even small changes in how prospective solar PV customers interact with installers can affect customer outcomes such as prices. Specifically, the four changes we evaluate are: 1) a customer map that shows potential new EnergySage registrants the locations of nearby customers; 2) a quote cap that precludes more than seven installers from bidding on any one customer; 3) a price guidance feature that informs installers about competitive prices in the customer's market before they submit quotes; and 4) no pre-quote messaging to prohibit installers from contacting customers prior to offering quotes. We calculate descriptive statistics to investigate whether each design change accomplished its specific objectives. Then, we econometrically evaluate the impacts of the design changes on PV quote prices and purchase prices using a regression discontinuity approach.

  4. Can Propensity Score Analysis Approximate Randomized Experiments Using Pretest and Demographic Information in Pre-K Intervention Research?

    PubMed

    Dong, Nianbo; Lipsey, Mark W

    2017-01-01

    It is unclear whether propensity score analysis (PSA) based on pretest and demographic covariates will meet the ignorability assumption for replicating the results of randomized experiments. This study applies within-study comparisons to assess whether pre-Kindergarten (pre-K) treatment effects on achievement outcomes estimated using PSA based on a pretest and demographic covariates can approximate those found in a randomized experiment. Data: Four studies with samples of pre-K children each provided data on two math achievement outcome measures with baseline pretests and child demographic variables that included race, gender, age, language spoken at home, and mother's highest education. Research Design and Data Analysis: A randomized study of a pre-K math curriculum provided benchmark estimates of effects on achievement measures. Comparison samples from other pre-K studies were then substituted for the original randomized control and the effects were reestimated using PSA. The correspondence was evaluated using multiple criteria. The effect estimates using PSA were in the same direction as the benchmark estimates, had similar but not identical statistical significance, and did not differ from the benchmarks at statistically significant levels. However, the magnitude of the effect sizes differed and displayed both absolute and relative bias larger than required to show statistical equivalence with formal tests, but those results were not definitive because of the limited statistical power. We conclude that treatment effect estimates based on a single pretest and demographic covariates in PSA correspond to those from a randomized experiment on the most general criteria for equivalence.

  5. Instructional strategies for online introductory college physics based on learning styles

    NASA Astrophysics Data System (ADS)

    Ekwue, Eleazer U.

    The practical nature of physics and its reliance on mathematical presentations and problem solving pose a challenge toward presentation of the course in an online environment for effective learning experience. Most first-time introductory college physics students fail to grasp the basic concepts of the course and the problem solving skills if the instructional strategy used to deliver the course is not compatible with the learners' preferred learning styles. This study investigates the effect of four instructional strategies based on four learning styles (listening, reading, iconic, and direct-experience) to improve learning for introductory college physics in an online environment. Learning styles of 146 participants were determined with Canfield Learning Style inventory. Of the 85 learners who completed the study, research results showed a statistically significant increase in learning performance following the online instruction in all four learning style groups. No statistically significant differences in learning were found among the four groups. However, greater significant academic improvement was found among learners with iconic and direct-experience modes of learning. Learners in all four groups expressed that the design of the unit presentation to match their individual learning styles contributed most to their learning experience. They were satisfied with learning a new physics concept online that, in their opinion, is either comparable or better than an instructor-led classroom experience. Findings from this study suggest that learners' performance and satisfaction in an online introductory physics course could be improved by using instructional designs that are tailored to learners' preferred ways of learning. It could contribute toward the challenge of providing viable online physics instruction in colleges and universities.

  6. Optimal designs for copula models

    PubMed Central

    Perrone, E.; Müller, W.G.

    2016-01-01

    Copula modelling has in the past decade become a standard tool in many areas of applied statistics. However, a largely neglected aspect concerns the design of related experiments: in particular, whether the estimation of copula parameters can be enhanced by optimizing experimental conditions, and how robust the parameter estimates for the model are with respect to the type of copula employed. In this paper an equivalence theorem for (bivariate) copula models is provided that allows formulation of efficient design algorithms and quick checks of whether designs are optimal or at least efficient. Some examples illustrate that in practical situations considerable gains in design efficiency can be achieved. A natural comparison between different copula models with respect to design efficiency is provided as well. PMID:27453616

  7. Real-time PCR probe optimization using design of experiments approach.

    PubMed

    Wadle, S; Lehnert, M; Rubenwolf, S; Zengerle, R; von Stetten, F

    2016-03-01

    Primer and probe sequence designs are among the most critical input factors in real-time polymerase chain reaction (PCR) assay optimization. In this study, we present the use of statistical design of experiments (DOE) approach as a general guideline for probe optimization and more specifically focus on design optimization of label-free hydrolysis probes that are designated as mediator probes (MPs), which are used in reverse transcription MP PCR (RT-MP PCR). The effect of three input factors on assay performance was investigated: distance between primer and mediator probe cleavage site; dimer stability of MP and target sequence (influenza B virus); and dimer stability of the mediator and universal reporter (UR). The results indicated that the latter dimer stability had the greatest influence on assay performance, with RT-MP PCR efficiency increased by up to 10% with changes to this input factor. With an optimal design configuration, a detection limit of 3-14 target copies/10 μl reaction could be achieved. This improved detection limit was confirmed for another UR design and for a second target sequence, human metapneumovirus, with 7-11 copies/10 μl reaction detected in an optimum case. The DOE approach for improving oligonucleotide designs for real-time PCR not only produces excellent results but may also reduce the number of experiments that need to be performed, thus reducing costs and experimental times.

  8. A Comparison of Two Balance Calibration Model Building Methods

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard; Ulbrich, Norbert

    2007-01-01

    Simulated strain-gage balance calibration data is used to compare the accuracy of two balance calibration model building methods for different noise environments and calibration experiment designs. The first building method obtains a math model for the analysis of balance calibration data after applying a candidate math model search algorithm to the calibration data set. The second building method uses stepwise regression analysis in order to construct a model for the analysis. Four balance calibration data sets were simulated in order to compare the accuracy of the two math model building methods. The simulated data sets were prepared using the traditional One Factor At a Time (OFAT) technique and the Modern Design of Experiments (MDOE) approach. Random and systematic errors were introduced in the simulated calibration data sets in order to study their influence on the math model building methods. Residuals of the fitted calibration responses and other statistical metrics were compared in order to evaluate the calibration models developed with different combinations of noise environment, experiment design, and model building method. Overall, predicted math models and residuals of both math model building methods show very good agreement. Significant differences in model quality were attributable to noise environment, experiment design, and their interaction. Generally, the addition of systematic error significantly degraded the quality of calibration models developed from OFAT data by either method, but MDOE experiment designs were more robust with respect to the introduction of a systematic component of the unexplained variance.
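
    The stepwise (forward-selection) approach mentioned above can be sketched in a few lines on simulated calibration-style data; the loads, true model, and AIC stopping rule below are assumptions for illustration, not the balance data or the exact algorithms compared in the study:

        import numpy as np

        rng = np.random.default_rng(5)

        # Simulated calibration-style data: three applied loads and one gauge response
        # that truly depends on N1, N2 and the N1*N2 interaction (coefficients invented).
        n = 120
        N1, N2, N3 = rng.uniform(-1, 1, size=(3, n))
        response = 2.0 * N1 - 1.2 * N2 + 0.8 * N1 * N2 + 0.05 * rng.standard_normal(n)

        # Candidate regressors: linear, quadratic, and two-factor interaction terms.
        terms = {"N1": N1, "N2": N2, "N3": N3, "N1^2": N1**2, "N2^2": N2**2, "N3^2": N3**2,
                 "N1*N2": N1 * N2, "N1*N3": N1 * N3, "N2*N3": N2 * N3}

        def aic(cols):
            X = np.column_stack([np.ones(n)] + cols)
            resid = response - X @ np.linalg.lstsq(X, response, rcond=None)[0]
            return n * np.log(np.mean(resid**2)) + 2 * (X.shape[1] + 1)

        selected, remaining, current = [], dict(terms), aic([])
        while remaining:
            name, score = min(((t, aic([terms[s] for s in selected] + [c]))
                               for t, c in remaining.items()), key=lambda kv: kv[1])
            if score >= current:
                break
            selected.append(name)
            remaining.pop(name)
            current = score
        print("Terms retained by forward selection:", selected)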

  9. What I See Is Not Quite the Way It Really Is: Students' Emergent Reasoning about Sampling Variability

    ERIC Educational Resources Information Center

    Pfannkuch, Maxine; Arnold, Pip; Wild, Chris J.

    2015-01-01

    Currently, instruction pays little attention to the development of students' sampling variability reasoning in relation to statistical inference. In this paper, we briefly discuss the especially designed sampling variability learning experiences students aged about 15 engaged in as part of a research project. We examine assessment and…

  10. Integrating the statistical analysis of spatial data in ecology

    Treesearch

    A. M. Liebhold; J. Gurevitch

    2002-01-01

    In many areas of ecology there is an increasing emphasis on spatial relationships. Often ecologists are interested in new ways of analyzing data with the objective of quantifying spatial patterns, and in designing surveys and experiments in light of the recognition that there may be underlying spatial pattern in biotic responses. In doing so, ecologists have adopted a...

  11. Teaching Efficacy in the Classroom: Skill Based Training for Teachers' Empowerment

    ERIC Educational Resources Information Center

    Karimzadeh, Mansoureh; Salehi, Hadi; Embi, Mohamed Amin; Nasiri, Mehdi; Shojaee, Mohammad

    2014-01-01

    This study aims to use an experimental research design to enhance teaching efficacy through social-emotional skills training for teachers. The statistical sample comprised 68 elementary teachers (grades 4 and 5) with at least 10 years of teaching experience and a bachelor's degree who were randomly assigned into control (18 female, 16 male) and…

  12. NLS Handbook, 2005. National Longitudinal Surveys

    ERIC Educational Resources Information Center

    Bureau of Labor Statistics, 2006

    2006-01-01

    The National Longitudinal Surveys (NLS), sponsored by the U.S. Bureau of Labor Statistics (BLS), are a set of surveys designed to gather information at multiple points in time on the labor market experiences of groups of men and women. Each of the cohorts has been selected to represent all people living in the United States at the initial…

  13. I Remember You: Independence and the Binomial Model

    ERIC Educational Resources Information Center

    Levine, Douglas W.; Rockhill, Beverly

    2006-01-01

    We focus on the problem of ignoring statistical independence. A binomial experiment is used to determine whether judges could match, based on looks alone, dogs to their owners. The experimental design introduces dependencies such that the probability of a given judge correctly matching a dog and an owner changes from trial to trial. We show how…
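    The dependence at issue can be made concrete with a short simulation, sketched below under the assumption that judges match dogs to owners without replacement; the number of pairs and the comparison are illustrative rather than the article's actual data.

      # Why matching-without-replacement trials are not independent Bernoulli trials:
      # with n pairs, earlier guesses change the success probability of later ones,
      # so an independent Binomial(n, 1/n) model misstates the variance.
      import numpy as np

      rng = np.random.default_rng(1)
      n_pairs, n_sim = 6, 100_000

      # A judge's guess is a random permutation; correct matches are its fixed points.
      matches = np.array([np.sum(rng.permutation(n_pairs) == np.arange(n_pairs))
                          for _ in range(n_sim)])
      print("simulated mean, variance:", matches.mean(), matches.var())

      # An independent Binomial(n, 1/n) model gives mean 1 but variance 1 - 1/n.
      print("independent-model variance:", 1 - 1 / n_pairs)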

  14. Quantitative Comparison of Three Standardization Methods Using a One-Way ANOVA for Multiple Mean Comparisons

    ERIC Educational Resources Information Center

    Barrows, Russell D.

    2007-01-01

    A one-way ANOVA experiment is performed to determine whether or not the three standardization methods are statistically different in determining the concentration of the three paraffin analytes. The laboratory exercise asks students to combine the three methods in a single analytical procedure of their own design to determine the concentration of…
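    A minimal sketch of such a one-way ANOVA comparison is shown below; the three method names (external standard, internal standard, standard addition) and all concentration values are assumed placeholders rather than data from the exercise.

      # One-way ANOVA across three standardization methods using made-up
      # concentration determinations for a single analyte.
      from scipy import stats

      external_std = [5.1, 5.3, 4.9, 5.2, 5.0]
      internal_std = [5.4, 5.6, 5.5, 5.3, 5.7]
      std_addition = [5.2, 5.1, 5.3, 5.2, 5.4]

      f_stat, p_value = stats.f_oneway(external_std, internal_std, std_addition)
      print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
      # A small p-value indicates at least one method differs; a post hoc test
      # (e.g. Tukey HSD) would identify which pairs differ.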

  15. Instructional Strategies and Course Design for Teaching Statistics Online: Perspectives from Online Students

    ERIC Educational Resources Information Center

    Yang, Dazhi

    2017-01-01

    Background: Teaching online is a different experience from that of teaching in a face-to-face setting. Knowledge and skills developed for teaching face-to-face classes are not adequate preparation for teaching online. It is even more challenging to teach science, technology, engineering and math (STEM) courses completely online because these…

  16. The Use of AC-DC-AC Methods in Assessing Corrosion Resistance Performance of Coating Systems for Magnesium Alloys

    NASA Astrophysics Data System (ADS)

    McCune, Robert C.; Upadhyay, Vinod; Wang, Yar-Ming; Battocchi, Dante

    The potential utility of AC-DC-AC electrochemical methods in comparative measures of corrosion-resisting coating system performance for magnesium alloys under consideration for the USAMP "Magnesium Front End Research and Development" project was previously shown in this forum [1]. Additional studies of this approach using statistically-designed experiments have been conducted with focus on alloy types, pretreatment, topcoat material and topcoat thickness as the variables. Additionally, sample coupons made for these designed experiments were also subjected to a typical automotive cyclic corrosion test cycle (SAE J2334) as well as ASTM B117 for comparison of relative performance. Results of these studies are presented along with advantages and limitations of the proposed methodology.

  17. Design and experiment of FBG-based icing monitoring on overhead transmission lines with an improvement trial for windy weather.

    PubMed

    Zhang, Min; Xing, Yimeng; Zhang, Zhiguo; Chen, Qiguan

    2014-12-12

    A scheme for monitoring icing on overhead transmission lines with fiber Bragg grating (FBG) strain sensors is designed and evaluated both theoretically and experimentally. The influences of temperature and wind are considered. The results of field experiments using simulated ice loading on windless days indicate that the scheme is capable of monitoring icing thickness within 0-30 mm with an accuracy of ±1 mm, a load cell error of 0.0308v, a repeatability error of 0.3328v, and a hysteresis error of 0.026%. To improve the measurement during windy weather, a correction factor is applied to the effective gravitational acceleration, and the absolute FBG strain is replaced by its statistical average.

  18. Configural displays can improve nutrition-related decisions: an application of the proximity compatibility principle.

    PubMed

    Marino, Christopher J; Mahan, Robert R

    2005-01-01

    The nutrition label format currently used by consumers to make dietary-related decisions presents significant information-processing demands for integration-based decisions; however, those demands were not considered as primary factors when the format was adopted. Labels designed in accordance with known principles of cognitive psychology might enhance the kind of decision making that food labeling was intended to facilitate. Three experiments were designed on the basis of the proximity compatibility principle (PCP) to investigate the relationship between nutrition label format and decision making; the experiments involved two types of integration decisions and one type of filtering decision. Based on the PCP, decision performance was measured to test the overall hypothesis that matched task-display tandems would result in better decision performance than would mismatched tandems. In each experiment, a statistically significant increase in mean decision performance was found when the display design was cognitively matched to the demands of the task. Combined, the results from all three experiments support the general hypothesis that task-display matching is a design principle that may enhance the utility of nutrition labeling in nutrition-related decision making. Actual or potential applications of this research include developing robust display solutions that aid in less effortful assimilation of nutrition-related information for consumers.

  19. Proceedings of the Conference on the Design of Experiments in Army Research Development and Testing (35th) Held in Monterey California, California on 18-20 October 1989

    DTIC Science & Technology

    1990-08-01

    Statistics. John Wiley and Sons, 1980, p. 111. 7. Hanson, D. L. and Koopmans, L. H., Tolerance Limits for the Class of Distributions with Increasing... Am. Statist. Assoc., vol. 82, 1987, p. 918. 9. Lehmann, E. L., Testing Statistical Hypotheses. John Wiley and Sons, 1959, pp. 274-275. 10. Lemon, G. H... Surfaces," John Wiley & Sons, Inc., New York. 3. Box, G. E. P. and Wilson, K. B. (1951), "On the Experimental Attainment of Optimum Conditions," Journal of

  20. An Analysis Methodology for the Gamma-ray Large Area Space Telescope

    NASA Technical Reports Server (NTRS)

    Morris, Robin D.; Cohen-Tanugi, Johann

    2004-01-01

    The Large Area Telescope (LAT) instrument on the Gamma Ray Large Area Space Telescope (GLAST) has been designed to detect high-energy gamma rays and determine their direction of incidence and energy. We propose a reconstruction algorithm based on recent advances in statistical methodology. This method, an alternative to the standard event analysis inherited from high-energy collider physics experiments, incorporates the physical processes occurring in the detector more accurately and makes full use of the statistical information available. It could thus provide a better estimate of the direction and energy of the primary photon.

  1. Studies in Support of the Application of Statistical Theory to Design and Evaluation of Operational Tests. Annex D. An Application of Bayesian Statistical Methods in the Determination of Sample Size for Operational Testing in the U.S. Army

    DTIC Science & Technology

    1977-07-01

    [OCR-damaged Fortran listing fragment. Recoverable content: comment cards defining UE1 and UE2 as the utilities of experiments of size XN1 and XN2 and ICHECK as a variable used to check for termination; a DIMENSION statement for SUBLIM(20), UPLIM(20), and UE1(20); and a termination-check block (statement 944) that compares ICHECK against UPLIM(K) and branches to statements 930 or 920.]

  2. Six Guidelines for Interesting Research.

    PubMed

    Gray, Kurt; Wegner, Daniel M

    2013-09-01

    There are many guides on proper psychology, but far fewer on interesting psychology. This article presents six guidelines for interesting research. The first three (Phenomena First; Be Surprising; Grandmothers, Not Scientists) suggest how to choose your research question; the last three (Be The Participant; Simple Statistics; Powerful Beginnings) suggest how to answer your research question and offer perspectives on experimental design, statistical analysis, and effective communication. These guidelines serve as reminders that replicability is necessary but not sufficient for compelling psychological science. Interesting research considers subjective experience; it listens to the music of the human condition. © The Author(s) 2013.

  3. Does chess instruction improve mathematical problem-solving ability? Two experimental studies with an active control group.

    PubMed

    Sala, Giovanni; Gobet, Fernand

    2017-12-01

    It has been proposed that playing chess enables children to improve their ability in mathematics. These claims have been recently evaluated in a meta-analysis (Sala & Gobet, 2016, Educational Research Review, 18, 46-57), which indicated a significant effect in favor of the groups playing chess. However, the meta-analysis also showed that most of the reviewed studies used a poor experimental design (in particular, they lacked an active control group). We ran two experiments that used a three-group design including both an active and a passive control group, with a focus on mathematical ability. In the first experiment (N = 233), a group of third and fourth graders was taught chess for 25 hours and tested on mathematical problem-solving tasks. Participants also filled in a questionnaire assessing their meta-cognitive ability for mathematics problems. The group playing chess was compared to an active control group (playing checkers) and a passive control group. The three groups showed no statistically significant difference in mathematical problem-solving or metacognitive abilities in the posttest. The second experiment (N = 52) broadly used the same design, but the Oriental game of Go replaced checkers in the active control group. While the chess-treated group and the passive control group slightly outperformed the active control group with mathematical problem solving, the differences were not statistically significant. No differences were found with respect to metacognitive ability. These results suggest that the effects (if any) of chess instruction, when rigorously tested, are modest and that such interventions should not replace the traditional curriculum in mathematics.

  4. Design of a K/Q-Band Beacon Receiver for the Alphasat TDP#5 Experiment

    NASA Technical Reports Server (NTRS)

    Morse, Jacquelynne R.

    2014-01-01

    This paper describes the design and performance of a coherent K/Q-band (20/40 GHz) beacon receiver developed at NASA Glenn Research Center (GRC) that will be installed at the Politecnico di Milano (POLIMI) for use in the Alphasat Technology Demonstration Payload #5 (TDP#5) beacon experiment. The goal of this experiment is to characterize rain fade attenuation at 40 GHz to improve the performance of existing statistical rain attenuation models in the Q-band. The ground terminal developed by NASA GRC utilizes an FFT-based frequency estimation receiver capable of characterizing total path attenuation effects due to gaseous absorption, clouds, rain, and scintillation. The receiver system has been characterized in the lab and demonstrates a system dynamic range performance of better than 58 dB at 1 Hz and better than 48 dB at 10 Hz rates.

  5. Design of a K/Q-Band Beacon Receiver for the Alphasat Technology Demonstration Payload (TDP) #5 Experiment

    NASA Technical Reports Server (NTRS)

    Morse, Jacquelynne R.

    2014-01-01

    This paper describes the design and performance of a coherent K/Q-band (20/40 GHz) beacon receiver developed at NASA Glenn Research Center (GRC) that will be installed at the Politecnico di Milano (POLIMI) for use in the Alphasat Technology Demonstration Payload #5 (TDP#5) beacon experiment. The goal of this experiment is to characterize rain fade attenuation at 40 GHz to improve the performance of existing statistical rain attenuation models in the Q-band. The ground terminal developed by NASA GRC utilizes an FFT-based frequency estimation receiver capable of characterizing total path attenuation effects due to gaseous absorption, clouds, rain, and scintillation. The receiver system has been characterized in the lab and demonstrates a system dynamic range performance of better than 58 dB at 1 Hz and better than 48 dB at 10 Hz rates.

  6. Users manual for the US baseline corn and soybean segment classification procedure

    NASA Technical Reports Server (NTRS)

    Horvath, R.; Colwell, R. (Principal Investigator); Hay, C.; Metzler, M.; Mykolenko, O.; Odenweller, J.; Rice, D.

    1981-01-01

    A user's manual for the classification component of the FY-81 U.S. Corn and Soybean Pilot Experiment in the Foreign Commodity Production Forecasting Project of AgRISTARS is presented. This experiment is one of several major experiments in AgRISTARS designed to measure and advance the remote sensing technologies for cropland inventory. The classification procedure discussed is designed to produce segment proportion estimates for corn and soybeans in the U.S. Corn Belt (Iowa, Indiana, and Illinois) using LANDSAT data. The estimates are produced by an integrated Analyst/Machine procedure. The Analyst selects acquisitions, participates in stratification, and assigns crop labels to selected samples. In concert with the Analyst, the machine digitally preprocesses LANDSAT data to remove external effects, stratifies the data into field like units and into spectrally similar groups, statistically samples the data for Analyst labeling, and combines the labeled samples into a final estimate.

  7. Design of a K/Q-band Beacon Receiver for the Alphasat TDP#5 Experiment

    NASA Technical Reports Server (NTRS)

    Nessel, James A.; Zemba, Michael J.; Morse, Jacquelynne R.

    2014-01-01

    This paper describes the design and performance of a coherent K/Q-band (20/40GHz) beacon receiver developed at NASA Glenn Research Center (GRC) that will be installed at the Politecnico di Milano (POLIMI) for use in the Alphasat Technology Demonstration Payload #5 (TDP#5) beacon experiment. The goal of this experiment is to characterize rain fade attenuation at 40GHz to improve the performance of existing statistical rain attenuation models in the Q-band. The ground terminal developed by NASA GRC utilizes an FFT-based frequency estimation receiver capable of characterizing total path attenuation effects due to gaseous absorption, clouds, rain, and scintillation. The receiver system has been characterized in the lab and demonstrates a system dynamic range performance of better than 58dB at 1Hz and better than 48dB at 10Hz rates.

  8. A statistical experiment design approach for optimizing biodegradation of weathered crude oil in coastal sediments.

    PubMed

    Mohajeri, Leila; Aziz, Hamidi Abdul; Isa, Mohamed Hasnain; Zahed, Mohammad Ali

    2010-02-01

    This work studied the bioremediation of weathered crude oil (WCO) in coastal sediment samples using a central composite face-centered design (CCFD) under response surface methodology (RSM). Initial oil concentration, biomass, nitrogen, and phosphorus concentrations were used as independent variables (factors) and oil removal as the dependent variable (response) in a 60-day trial. A statistically significant model for WCO removal was obtained. The coefficient of determination (R² = 0.9732) and probability value (P < 0.0001) demonstrated the significance of the regression model. Numerical optimization based on a desirability function was carried out for initial oil concentrations of 2, 16 and 30 g per kg sediment, and 83.13, 78.06 and 69.92 per cent removal were observed, respectively, compared to 77.13, 74.17 and 69.87 per cent removal for the un-optimized results.
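    For orientation, a minimal sketch of a face-centred central composite design and a quadratic response-surface fit in coded units is given below; the run layout, placeholder responses, and plain least-squares fit are illustrative assumptions and do not reproduce the study's design matrix or data.

      # Face-centred central composite design (CCF) for four factors in coded units,
      # plus an ordinary least-squares quadratic response-surface fit.
      import itertools
      import numpy as np

      k = 4
      factorial = np.array(list(itertools.product([-1, 1], repeat=k)))          # 16 runs
      axial = np.vstack([v * np.eye(k)[i] for i in range(k) for v in (-1, 1)])  # 8 face points
      center = np.zeros((3, k))                                                 # 3 center replicates
      X_coded = np.vstack([factorial, axial, center])
      print("design size:", X_coded.shape)                                      # (27, 4)

      # Quadratic model columns: intercept, linear, squared, two-factor interactions.
      cols = [np.ones(len(X_coded))]
      cols += [X_coded[:, i] for i in range(k)]
      cols += [X_coded[:, i] ** 2 for i in range(k)]
      cols += [X_coded[:, i] * X_coded[:, j] for i in range(k) for j in range(i + 1, k)]
      X_model = np.column_stack(cols)

      y = np.random.default_rng(2).normal(70, 5, len(X_coded))   # placeholder removal (%)
      beta, *_ = np.linalg.lstsq(X_model, y, rcond=None)
      print("fitted coefficients:", np.round(beta, 2))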

  9. In vivo evaluation of the effect of stimulus distribution on FIR statistical efficiency in event-related fMRI.

    PubMed

    Jansma, J Martijn; de Zwart, Jacco A; van Gelderen, Peter; Duyn, Jeff H; Drevets, Wayne C; Furey, Maura L

    2013-05-15

    Technical developments in MRI have improved signal to noise, allowing the use of analysis methods such as finite impulse response (FIR) analysis of rapid event-related functional MRI (er-fMRI). FIR is one of the most informative analysis methods, as it determines the onset and full shape of the hemodynamic response function (HRF) without any a priori assumptions. FIR is, however, vulnerable to multicollinearity, which is directly related to the distribution of stimuli over time. Efficiency can be optimized by simplifying a design and restricting the stimulus distribution to specific sequences, while greater design flexibility necessarily reduces efficiency. However, the actual effect of efficiency on fMRI results has never been tested in vivo, so it is currently difficult to make an informed choice between protocol flexibility and statistical efficiency. The main goal of this study was to assign concrete fMRI signal-to-noise values to the abstract scale of FIR statistical efficiency. Ten subjects repeated a perception task with five random and m-sequence-based protocols, with varying but, according to the literature, acceptable levels of multicollinearity. Results indicated substantial differences in signal standard deviation, with the level a function of multicollinearity; experiment protocols varied by up to 55.4% in standard deviation. The results confirm that the quality of fMRI in an FIR analysis can vary significantly and substantially with statistical efficiency. Our in vivo measurements can be used to aid in making an informed decision between freedom in protocol design and statistical efficiency. Published by Elsevier B.V.
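    One common way to score such designs is sketched below: build the FIR design matrix from the stimulus onsets and take 1/trace((X'X)^-1) as the efficiency, so that less collinear (more jittered) stimulus schedules score higher. The onsets, scan counts, and this particular efficiency definition are illustrative assumptions, not the study's protocols.

      # Compare FIR estimation efficiency for a jittered versus a strictly periodic
      # stimulus schedule; values are illustrative, not the study's designs.
      import numpy as np

      def fir_design(onsets, n_scans, n_bins):
          X = np.zeros((n_scans, n_bins))
          for t in onsets:
              for b in range(n_bins):
                  if t + b < n_scans:
                      X[t + b, b] = 1.0
          return X

      def efficiency(X):
          return 1.0 / np.trace(np.linalg.inv(X.T @ X))

      rng = np.random.default_rng(3)
      jittered = np.sort(rng.choice(np.arange(0, 230, 2), size=40, replace=False))
      periodic = np.arange(0, 240, 6)          # fixed 6-scan spacing, 40 events
      for name, onsets in [("jittered", jittered), ("periodic", periodic)]:
          print(name, "efficiency:", round(efficiency(fir_design(onsets, 240, 10)), 3))
      # The jittered schedule typically scores higher because its FIR regressors
      # overlap less, i.e. the design matrix is less collinear.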

  10. [Triple-type theory of statistics and its application in the scientific research of biomedicine].

    PubMed

    Hu, Liang-ping; Liu, Hui-gang

    2005-07-20

    To identify why so many people fail to grasp statistics and to put forward a "triple-type theory of statistics" that solves the problem in a creative way. Based on long experience in teaching and researching statistics, the triple-type theory was formulated and clarified. Examples are provided to demonstrate that the three types, i.e., the expressive type, the prototype, and the standardized type, are essential for applying statistics rationally in both theory and practice; several instances also demonstrate that the three types are correlated with each other. The theory helps people see the essence of a problem when interpreting and analyzing experimental designs and statistical analyses in medical research. Investigation reveals that for some questions the three types are mutually identical; for some, the prototype is also the standardized type; and for others, the three types are distinct from each other. In some multifactor experimental studies, no standardized type corresponding to the prototype exists at all, because the researchers committed the mistake of "incomplete control" when setting up the experimental groups; this is a problem that should be solved by the concept and method of "division". Once the triple type for each question is clarified, a proper experimental design and statistical method can be chosen easily. The triple-type theory of statistics can help people avoid statistical mistakes, or at least decrease the misuse rate dramatically, and improve the quality, level, and speed of biomedical research when applying statistics. It can also help improve the quality of statistical textbooks and the teaching of statistics, and it shows a way to advance biomedical statistics.

  11. Design-of-experiments to Reduce Life-cycle Costs in Combat Aircraft Inlets

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Baust, Henry D.; Agrell, Johan

    2003-01-01

    It is the purpose of this study to demonstrate the viability and economy of Design-of-Experiments (DOE) to arrive at micro-secondary flow control installation designs that achieve optimal inlet performance for different mission strategies. These statistical design concepts were used to investigate the properties of "low unit strength" micro-effector installations. "Low unit strength" micro-effectors are micro-vanes, set at a very low angle of incidence, with very long chord lengths. They are designed to influence the near-wall inlet flow over an extended streamwise distance. In this study, however, the long chord lengths were replicated by a series of short-chord-length effectors arranged in series over multiple bands of effectors. In order to properly evaluate the performance differences between the single-band extended-chord-length installation designs and the segmented multiband short-chord-length designs, both sets of installations must be optimal. Critical to achieving optimal micro-secondary flow control installation designs is an understanding of the factor interactions that occur between the multiple bands of micro-scale vane effectors. These factor interactions are best understood and brought together in an optimal manner through a structured DOE process, or more specifically Response Surface Methods (RSM).

  12. Packet Randomized Experiments for Eliminating Classes of Confounders

    PubMed Central

    Pavela, Greg; Wiener, Howard; Fontaine, Kevin R.; Fields, David A.; Voss, Jameson D.; Allison, David B.

    2014-01-01

    Background: Although randomization is considered essential for causal inference, it is often not possible to randomize in nutrition and obesity research. To address this, we develop a framework for an experimental design, packet randomized experiments (PREs), which improves causal inferences when randomization on a single treatment variable is not possible. This situation arises when subjects are randomly assigned to a condition (such as a new roommate) which varies in one characteristic of interest (such as weight), but also varies across many others. There has been no general discussion of this experimental design, including its strengths, limitations, and statistical properties. As such, researchers are left to develop and apply PREs on an ad hoc basis, limiting its potential to improve causal inferences among nutrition and obesity researchers. Methods: We introduce PREs as an intermediary design between randomized controlled trials and observational studies. We review previous research that used the PRE design and describe its application in obesity-related research, including random roommate assignments, heterochronic parabiosis, and the quasi-random assignment of subjects to geographic areas. We then provide a statistical framework to control for potential packet-level confounders not accounted for by randomization. Results: PREs have successfully been used to improve causal estimates of the effect of roommates, altitude, and breastfeeding on weight outcomes. When certain assumptions are met, PREs can asymptotically control for packet-level characteristics. This has the potential to statistically estimate the effect of a single treatment even when randomization to a single treatment did not occur. Conclusions: Applying PREs to obesity-related research will improve decisions about clinical, public health, and policy actions insofar as it offers researchers new insight into cause and effect relationships among variables. PMID:25444088

  13. System Synthesis in Preliminary Aircraft Design using Statistical Methods

    NASA Technical Reports Server (NTRS)

    DeLaurentis, Daniel; Mavris, Dimitri N.; Schrage, Daniel P.

    1996-01-01

    This paper documents an approach to conceptual and preliminary aircraft design in which system synthesis is achieved using statistical methods, specifically design of experiments (DOE) and response surface methodology (RSM). These methods are employed in order to more efficiently search the design space for optimum configurations. In particular, a methodology incorporating three uses of these techniques is presented. First, response surface equations are formed which represent aerodynamic analyses, in the form of regression polynomials, which are more sophisticated than those generally available in early design stages. Next, a regression equation for an overall evaluation criterion is constructed for the purpose of constrained optimization at the system level. This optimization, though achieved in an innovative way, is still traditional in that it is a point-design solution. The methodology put forward here remedies this by introducing uncertainty into the problem, resulting in solutions which are probabilistic in nature. DOE/RSM is used for the third time in this setting. The process is demonstrated through a detailed aero-propulsion optimization of a high speed civil transport. Fundamental goals of the methodology, then, are to introduce higher fidelity disciplinary analyses to conceptual aircraft synthesis and to provide a roadmap for transitioning from point solutions to probabilistic designs (and eventually robust ones).

  14. Critical evaluation of challenges and future use of animals in experimentation for biomedical research.

    PubMed

    Singh, Vijay Pal; Pratap, Kunal; Sinha, Juhi; Desiraju, Koundinya; Bahal, Devika; Kukreti, Ritushree

    2016-12-01

    Animal experiments that are conducted worldwide contribute to significant findings and breakthroughs in the understanding of the underlying mechanisms of various diseases, bringing up appropriate clinical interventions. However, their predictive value is often low, leading to translational failure. Problems like translational failure of animal studies and poorly designed animal experiments lead to loss of animal lives and less translatable data which affect research outcomes ethically and economically. Due to increasing complexities in animal usage with changes in public perception and stringent guidelines, it is becoming difficult to use animals for conducting studies. This review deals with challenges like poor experimental design and ethical concerns and discusses key concepts like sample size, statistics in experimental design, humane endpoints, economic assessment, species difference, housing conditions, and systematic reviews and meta-analyses that are often neglected. If practiced, these strategies can refine the procedures effectively and help translate the outcomes efficiently. © The Author(s) 2016.

  15. Critical evaluation of challenges and future use of animals in experimentation for biomedical research

    PubMed Central

    Singh, Vijay Pal; Pratap, Kunal; Sinha, Juhi; Desiraju, Koundinya; Bahal, Devika; Kukreti, Ritushree

    2016-01-01

    Animal experiments that are conducted worldwide contribute to significant findings and breakthroughs in the understanding of the underlying mechanisms of various diseases, bringing up appropriate clinical interventions. However, their predictive value is often low, leading to translational failure. Problems like translational failure of animal studies and poorly designed animal experiments lead to loss of animal lives and less translatable data which affect research outcomes ethically and economically. Due to increasing complexities in animal usage with changes in public perception and stringent guidelines, it is becoming difficult to use animals for conducting studies. This review deals with challenges like poor experimental design and ethical concerns and discusses key concepts like sample size, statistics in experimental design, humane endpoints, economic assessment, species difference, housing conditions, and systematic reviews and meta-analyses that are often neglected. If practiced, these strategies can refine the procedures effectively and help translate the outcomes efficiently. PMID:27694614

  16. Statistics Graduate Students' Professional Development for Teaching: A Communities of Practice Model

    NASA Astrophysics Data System (ADS)

    Justice, Nicola

    Graduate teaching assistants (GTAs) are responsible for instructing approximately 25% of introductory statistics courses in the United States (Blair, Kirkman, & Maxwell, 2013). Most research on GTA professional development focuses on structured activities (e.g., courses, workshops) that have been developed to improve GTAs' pedagogy and content knowledge. Few studies take into account the social contexts of GTAs' professional development. However, GTAs perceive their social interactions with other GTAs to be a vital part of their preparation and support for teaching (e.g., Staton & Darling, 1989). Communities of practice (CoPs) are one way to bring together the study of the social contexts and structured activities of GTA professional development. CoPs are defined as groups of practitioners who deepen their knowledge and expertise by interacting with each other on an ongoing basis (e.g., Lave & Wenger, 1991). Graduate students may participate in CoPs related to teaching in many ways, including attending courses or workshops, participating in weekly meetings, engaging in informal discussions about teaching, or participating in e-mail conversations related to teaching tasks. This study explored the relationship between statistics graduate students' experiences in CoPs and the extent to which they hold student-centered teaching beliefs. A framework for characterizing GTAs' experiences in CoPs was described and a theoretical model relating these characteristics to GTAs' beliefs was developed. To gather data to test the model, the Graduate Students' Experiences Teaching Statistics (GETS) Inventory was created. Items were written to collect information about GTAs' current teaching beliefs, teaching beliefs before entering their degree programs, characteristics of GTAs' experiences in CoPs, and demographic information. Using an online program, the GETS Inventory was administered to N =218 statistics graduate students representing 37 institutions in 24 different U.S. states. The data gathered from the national survey suggest that statistics graduate students often experience CoPs through required meetings and voluntary discussions about teaching. Participants feel comfortable disagreeing with the people they perceive to be most influential on their teaching beliefs. Most participants perceive a faculty member to have the most influential role in shaping their teaching beliefs. The survey data did not provide evidence to support the proposed theoretical model relating characteristics of experiences in CoPs and beliefs about teaching statistics. Based on cross-validation results, prior beliefs about teaching statistics was the best predictor of current beliefs. Additional models were retained that included student characteristics suggested by previous literature to be associated with student-centered or traditional teaching beliefs (e.g., prior teaching experience, international student status). The results of this study can be used to inform future efforts to help promote student-centered teaching beliefs and teaching practices among statistics GTAs. Modifications to the GETS Inventory are suggested for use in future research designed to gather information about GTAs, their teaching beliefs, and their experiences in CoPs. Suggestions are also made for aspects of CoPs that might be studied further in order to learn how CoPs can promote teaching beliefs and practices that support student learning.

  17. Précis of statistical significance: rationale, validity, and utility.

    PubMed

    Chow, S L

    1998-04-01

    The null-hypothesis significance-test procedure (NHSTP) is defended in the context of the theory-corroboration experiment, as well as the following contrasts: (a) substantive hypotheses versus statistical hypotheses, (b) theory corroboration versus statistical hypothesis testing, (c) theoretical inference versus statistical decision, (d) experiments versus nonexperimental studies, and (e) theory corroboration versus treatment assessment. The null hypothesis can be true because it is the hypothesis that errors are randomly distributed in data. Moreover, the null hypothesis is never used as a categorical proposition. Statistical significance means only that chance influences can be excluded as an explanation of data; it does not identify the nonchance factor responsible. The experimental conclusion is drawn with the inductive principle underlying the experimental design. A chain of deductive arguments gives rise to the theoretical conclusion via the experimental conclusion. The anomalous relationship between statistical significance and the effect size often used to criticize NHSTP is more apparent than real. The absolute size of the effect is not an index of evidential support for the substantive hypothesis. Nor is the effect size, by itself, informative as to the practical importance of the research result. Being a conditional probability, statistical power cannot be the a priori probability of statistical significance. The validity of statistical power is debatable because statistical significance is determined with a single sampling distribution of the test statistic based on H0, whereas it takes two distributions to represent statistical power or effect size. Sample size should not be determined in the mechanical manner envisaged in power analysis. It is inappropriate to criticize NHSTP for nonstatistical reasons. At the same time, neither effect size, nor confidence interval estimate, nor posterior probability can be used to exclude chance as an explanation of data. Neither can any of them fulfill the nonstatistical functions expected of them by critics.
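    As a concrete reminder that statistical power is a conditional probability computed under a fixed alternative (not under H0), a small simulation is sketched below; the two-sample t test, sample size, and effect size are arbitrary illustrative choices and are not taken from the target article.

      # Estimate power by simulation: the long-run proportion of significant results
      # when a fixed alternative (here a 0.5 SD mean difference) is actually true.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      n, effect_size, alpha, n_sim = 30, 0.5, 0.05, 5000

      hits = 0
      for _ in range(n_sim):
          a = rng.normal(0.0, 1.0, n)
          b = rng.normal(effect_size, 1.0, n)
          if stats.ttest_ind(a, b).pvalue < alpha:
              hits += 1
      print("estimated power:", hits / n_sim)   # roughly 0.47 for these settings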

  18. The Functional Measurement Experiment Builder suite: two Java-based programs to generate and run functional measurement experiments.

    PubMed

    Mairesse, Olivier; Hofmans, Joeri; Theuns, Peter

    2008-05-01

    We propose a free, easy-to-use computer program that does not require prior knowledge of computer programming to generate and run experiments using textual or pictorial stimuli. Although the FM Experiment Builder suite was initially programmed for building and conducting FM experiments, it can also be applied to non-FM experiments that necessitate randomized, single, or multifactorial designs. The program is highly configurable, allowing multilingual use and a wide range of different response formats. The outputs of the experiments are Microsoft Excel compatible .xls files that allow easy copy-paste of the results into Weiss's FM CalSTAT program (2006) or any other statistical package. Its Java-based structure is compatible with both Windows and Macintosh operating systems, and its compactness (< 1 MB) makes it easily distributable over the Internet.

  19. Accessible Information Without Disturbing Partially Known Quantum States on a von Neumann Algebra

    NASA Astrophysics Data System (ADS)

    Kuramochi, Yui

    2018-04-01

    This paper addresses the problem of how much information we can extract without disturbing a statistical experiment, which is a family of partially known normal states on a von Neumann algebra. We define the classical part of a statistical experiment as the restriction of the equivalent minimal sufficient statistical experiment to the center of the outcome space, which, in the case of density operators on a Hilbert space, corresponds to the classical probability distributions appearing in the maximal decomposition by Koashi and Imoto (Phys. Rev. A 66, 022318, 2002). We show that we can access by a Schwarz or completely positive channel at most the classical part of a statistical experiment if we do not disturb the states. We apply this result to the broadcasting problem of a statistical experiment. We also show that the classical part of the direct product of statistical experiments is the direct product of the classical parts of the statistical experiments. The proof of the latter result is based on the theorem that the direct product of minimal sufficient statistical experiments is also minimal sufficient.
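    A notational sketch of the objects involved, written under the assumption of standard conventions (the symbols below are illustrative and not quoted from the paper), is:

      \[
        \mathcal{E} = \bigl(\mathcal{M}, \{\rho_\theta\}_{\theta \in \Theta}\bigr),
        \qquad
        \mathcal{E}_{\mathrm{cl}} := \bigl(Z(\mathcal{N}), \{\sigma_\theta|_{Z(\mathcal{N})}\}_{\theta \in \Theta}\bigr),
      \]
      where $(\mathcal{N}, \{\sigma_\theta\}_{\theta \in \Theta})$ is the minimal sufficient statistical experiment equivalent to $\mathcal{E}$ and $Z(\mathcal{N})$ is the center of its outcome algebra; the result then says that a Schwarz or completely positive channel that does not disturb the states can extract at most the information carried by $\mathcal{E}_{\mathrm{cl}}$.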

  20. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.

  1. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.

  2. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  3. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  4. Intelligent Condition Diagnosis Method Based on Adaptive Statistic Test Filter and Diagnostic Bayesian Network

    PubMed Central

    Li, Ke; Zhang, Qiuju; Wang, Kun; Chen, Peng; Wang, Huaqing

    2016-01-01

    A new fault diagnosis method for rotating machinery based on an adaptive statistic test filter (ASTF) and a Diagnostic Bayesian Network (DBN) is presented in this paper. The ASTF is proposed to obtain weak fault features under background noise; it is based on statistical hypothesis testing in the frequency domain to evaluate the similarity between a reference (noise) signal and the original signal and to remove the components of high similarity. The optimal level of significance α is obtained using particle swarm optimization (PSO). To evaluate the performance of the ASTF, an evaluation factor Ipq is also defined. In addition, a simulation experiment is designed to verify the effectiveness and robustness of the ASTF. A sensitivity evaluation method using principal component analysis (PCA) is proposed to evaluate the sensitivity of symptom parameters (SPs) for condition diagnosis; in this way, SPs with high sensitivity for condition diagnosis can be selected. A three-layer DBN is developed to identify the condition of rotating machinery based on Bayesian Belief Network (BBN) theory. A condition diagnosis experiment on rolling element bearings demonstrates the effectiveness of the proposed method. PMID:26761006
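    The general idea behind such a filter can be sketched as below: compare the spectrum of the measured signal with that of a noise-only reference, keep only the frequency bins where the two differ significantly, and reconstruct the remainder. The per-bin Mann-Whitney test, fixed alpha, and spectral averaging used here are simplifying assumptions for illustration and do not reproduce the paper's ASTF (which, among other things, tunes the significance level with PSO).

      # Simplified statistic-test filtering sketch: per-bin significance test of
      # signal spectra against a noise-only reference, then inverse FFT of the
      # surviving bins. Parameters and the test itself are illustrative choices.
      import numpy as np
      from scipy import stats

      def statistic_test_filter(signal_segs, noise_segs, alpha=0.05):
          # signal_segs, noise_segs: arrays of shape (n_segments, segment_length)
          S = np.abs(np.fft.rfft(signal_segs, axis=1)) ** 2    # per-segment spectra
          N = np.abs(np.fft.rfft(noise_segs, axis=1)) ** 2
          keep = np.zeros(S.shape[1], dtype=bool)
          for k in range(S.shape[1]):
              _, p = stats.mannwhitneyu(S[:, k], N[:, k], alternative="greater")
              keep[k] = p < alpha                              # keep only "non-noise-like" bins
          spec = np.fft.rfft(signal_segs.mean(axis=0))
          spec[~keep] = 0.0
          return np.fft.irfft(spec, n=signal_segs.shape[1])

      # Example: a 62.5 Hz tone buried in noise, 20 segments of 256 samples at 1 kHz.
      rng = np.random.default_rng(5)
      t = np.arange(256) / 1000.0
      noise_segs = rng.normal(0, 1, (20, 256))
      signal_segs = 0.5 * np.sin(2 * np.pi * 62.5 * t) + rng.normal(0, 1, (20, 256))
      filtered = statistic_test_filter(signal_segs, noise_segs)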

  5. Intelligent Condition Diagnosis Method Based on Adaptive Statistic Test Filter and Diagnostic Bayesian Network.

    PubMed

    Li, Ke; Zhang, Qiuju; Wang, Kun; Chen, Peng; Wang, Huaqing

    2016-01-08

    A new fault diagnosis method for rotating machinery based on an adaptive statistic test filter (ASTF) and a Diagnostic Bayesian Network (DBN) is presented in this paper. The ASTF is proposed to obtain weak fault features under background noise; it is based on statistical hypothesis testing in the frequency domain to evaluate the similarity between a reference (noise) signal and the original signal and to remove the components of high similarity. The optimal level of significance α is obtained using particle swarm optimization (PSO). To evaluate the performance of the ASTF, an evaluation factor Ipq is also defined. In addition, a simulation experiment is designed to verify the effectiveness and robustness of the ASTF. A sensitivity evaluation method using principal component analysis (PCA) is proposed to evaluate the sensitivity of symptom parameters (SPs) for condition diagnosis; in this way, SPs with high sensitivity for condition diagnosis can be selected. A three-layer DBN is developed to identify the condition of rotating machinery based on Bayesian Belief Network (BBN) theory. A condition diagnosis experiment on rolling element bearings demonstrates the effectiveness of the proposed method.

  6. Automated optimal coordination of multiple-DOF neuromuscular actions in feedforward neuroprostheses.

    PubMed

    Lujan, J Luis; Crago, Patrick E

    2009-01-01

    This paper describes a new method for designing feedforward controllers for multiple-muscle, multiple-DOF, motor system neural prostheses. The design process is based on experimental measurement of the forward input/output properties of the neuromechanical system and numerical optimization of stimulation patterns to meet muscle coactivation criteria, thus resolving the muscle redundancy (i.e., overcontrol) and the coupled DOF problems inherent in neuromechanical systems. We designed feedforward controllers to control the isometric forces at the tip of the thumb in two directions during stimulation of three thumb muscles as a model system. We tested the method experimentally in ten able-bodied individuals and one patient with spinal cord injury. Good control of isometric force in both DOFs was observed, with rms errors less than 10% of the force range in seven experiments and statistically significant correlations between the actual and target forces in all ten experiments. Systematic bias and slope errors were observed in a few experiments, likely due to the neuromuscular fatigue. Overall, the tests demonstrated the ability of a general design approach to satisfy both control and coactivation criteria in multiple-muscle, multiple-axis neuromechanical systems, which is applicable to a wide range of neuromechanical systems and stimulation electrodes.

  7. Design preferences and cognitive styles: experimentation by automated website synthesis.

    PubMed

    Leung, Siu-Wai; Lee, John; Johnson, Chris; Robertson, David

    2012-06-29

    This article aims to demonstrate the computational synthesis of Web-based experiments for studying the relationships among participants' design preferences, rationale, and cognitive test performance. The exemplified experiments were computationally synthesised, including the websites as materials, experiment protocols as methods, and cognitive tests as protocol modules. This work also exemplifies the use of a website synthesiser as an essential instrument enabling participants to explore different possible designs, generated on the fly, before selecting their preferred designs. The participants were given interactive tree and table generators so that they could explore different ways of presenting causality information in tables and trees as the visualisation formats. The participants gave their preference ratings for the available designs, as well as their rationale (criteria) for their design decisions. The participants were also asked to take four cognitive tests, which focus on aspects of visualisation and analogy-making. The relationships among preference ratings, rationale, and cognitive test results were analysed with conservative non-parametric statistics, including the Wilcoxon test, the Kruskal-Wallis test, and Kendall correlation. In the test, 41 of the 64 participants preferred graphical (tree-form) to tabular presentation. Despite the popular preference for graphical presentation, the tabular presentation was generally rated easier to interpret than the graphical presentation, especially by those who scored lower in the visualisation and analogy-making tests. This evidence supports the hypothesis that design preferences are related to specific cognitive abilities. Without computational synthesis, the experimental setup and scientific results would have been impractical to obtain.
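    A hypothetical sketch of this kind of non-parametric analysis is given below; the ratings, scores, and groupings are invented for illustration and do not correspond to the study's data or exact comparisons.

      # Illustrative non-parametric comparisons with SciPy: a paired Wilcoxon test,
      # a Kruskal-Wallis test, and a Kendall rank correlation on made-up data.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)
      ratings_tree = rng.integers(1, 8, 30)     # 1-7 preference ratings, tree format
      ratings_table = rng.integers(1, 8, 30)    # same participants, table format
      visual_score = rng.integers(0, 21, 30)    # visualization test score (0-20)

      print(stats.wilcoxon(ratings_tree, ratings_table))   # paired format comparison
      print(stats.kruskal(ratings_tree, ratings_table))    # unpaired alternative
      print(stats.kendalltau(ratings_tree, visual_score))  # preference vs. ability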

  8. Remedial Coursetaking at U.S. Public 2- and 4-Year Institutions: Scope, Experiences, and Outcomes. Statistical Analysis Report. NCES 2016-405

    ERIC Educational Resources Information Center

    Chen, Xianglei

    2016-01-01

    Every year, millions of new college students arrive on campus lacking the necessary academic skills to perform at the college level. Postsecondary institutions address this problem with extensive remedial programs designed to strengthen students' basic skills. While much research on the effectiveness of remedial education has been conducted,…

  9. Impact of Monetary Incentives and Mailing Procedures: An Experiment in a Federally Sponsored Telephone Survey. Methodology Report. NCES 2006-066

    ERIC Educational Resources Information Center

    Brick, J. Michael; Hagedorn, Mary Collins; Montaquila, Jill; Roth, Shelley Brock; Chapman, Christopher

    2006-01-01

    The National Household Education Surveys Program (NHES) includes a series of random digit dial (RDD) surveys developed by the National Center for Education Statistics (NCES) in the Institute of Education Sciences, U.S. Department of Education. It is designed to collect information on important educational issues through telephone surveys of…

  10. A primer of statistical methods for correlating parameters and properties of electrospun poly(L-lactide) scaffolds for tissue engineering--PART 1: design of experiments.

    PubMed

    Seyedmahmoud, Rasoul; Rainer, Alberto; Mozetic, Pamela; Maria Giannitelli, Sara; Trombetta, Marcella; Traversa, Enrico; Licoccia, Silvia; Rinaldi, Antonio

    2015-01-01

    Tissue engineering scaffolds produced by electrospinning are of enormous interest, but still lack a true understanding about the fundamental connection between the outstanding functional properties, the architecture, the mechanical properties, and the process parameters. Fragmentary results from several parametric studies only render some partial insights that are hard to compare and generally miss the role of parameters interactions. To bridge this gap, this article (Part-1 of 2) features a case study on poly-L-lactide scaffolds to demonstrate how statistical methods such as design of experiments can quantitatively identify the correlations existing between key scaffold properties and control parameters, in a systematic, consistent, and comprehensive manner disentangling main effects from interactions. The morphological properties (i.e., fiber distribution and porosity) and mechanical properties (Young's modulus) are "charted" as a function of molecular weight (MW) and other electrospinning process parameters (the Xs), considering the single effect as well as interactions between Xs. For the first time, the major role of the MW emerges clearly in controlling all scaffold properties. The correlation between mechanical and morphological properties is also addressed. © 2014 Wiley Periodicals, Inc.

  11. Highly Efficient Design-of-Experiments Methods for Combining CFD Analysis and Experimental Data

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Haller, Harold S.

    2009-01-01

    It is the purpose of this study to examine the impact of "highly efficient" Design-of-Experiments (DOE) methods for combining sets of CFD-generated analysis data with smaller sets of experimental test data in order to accurately predict performance results where experimental test data were not obtained. The study examines the impact of micro-ramp flow control on the shock wave boundary layer (SWBL) interaction, where a complete paired set of data exists from both CFD analysis and experimental measurements. By combining the complete set of CFD analysis data, composed of fifteen (15) cases, with a smaller subset of experimental test data containing four or five (4/5) cases, compound data sets (CFD/EXP) were generated which allow the prediction of the complete set of experimental results. No statistical differences were found between the combined (CFD/EXP) generated data sets and the complete experimental data set composed of fifteen (15) cases. The same optimal micro-ramp configuration was obtained using the (CFD/EXP) generated data as with the complete set of experimental data, and the DOE response surfaces generated by the two data sets were also not statistically different.

  12. Effect of grit-blasting on substrate roughness and coating adhesion

    NASA Astrophysics Data System (ADS)

    Varacalle, Dominic J.; Guillen, Donna Post; Deason, Douglas M.; Rhodaberger, William; Sampson, Elliott

    2006-09-01

    Statistically designed experiments were performed to compare the surface roughness produced by grit blasting A36/1020 steel using different abrasives. Grit blast media, blast pressure, and working distance were varied using a Box-type statistical design of experiment (SDE) approach. The surface textures produced by four metal grits (HG16, HG18, HG25, and HG40) and three conventional grits (copper slag, coal slag, and chilled iron) were compared. Substrate roughness was measured using surface profilometry and correlated with operating parameters. The HG16 grit produced the highest surface roughness of all the grits tested. Aluminum and zinc-aluminum coatings were deposited on the grit-blasted substrates using the twin-wire electric arc (TWEA) process. Bond strength of the coatings was measured with a portable adhesion tester in accordance with ASTM standard D 4541. The coatings on substrates roughened with steel grit exhibit superior bond strength to those prepared with conventional grit. For aluminum coatings sprayed onto surfaces prepared with the HG16 grit, the bond strength was most influenced by current, spray distance, and spray gun pressure (in that order). The highest bond strength for the zinc-aluminum coatings was attained on surfaces prepared using the metal grits.

  13. Design Steps for Physic STEM Education Learning in Secondary School

    NASA Astrophysics Data System (ADS)

    Teevasuthonsakul, C.; Yuvanatheeme, V.; Sriput, V.; Suwandecha, S.

    2017-09-01

    This study aimed to develop a process for designing STEM Education activities for Physics subjects in Thai secondary schools. The researchers conducted the study by reviewing the literature and related works, interviewing Physics experts, designing and revising the process accordingly, and piloting the designed process in actual classrooms. This yielded a five-step process of STEM Education activity design, which Physics teachers applied to their actual teaching contexts. The after-class evaluation revealed a statistically significant increase (p < .05) in students' satisfaction with the Physics subject and in their critical thinking skills. Moreover, teachers were advised to integrate the principles of science, mathematics, technology, and the engineering design process as the foundation when creating case studies of problems and solutions.

  14. The application of statistically designed experiments to resistance spot welding

    NASA Technical Reports Server (NTRS)

    Hafley, Robert A.; Hales, Stephen J.

    1991-01-01

    State-of-the-art Resistance Spot Welding (RSW) equipment has the potential to permit real-time monitoring of operations through advances in computerized process control. In order to realize adaptive feedback capabilities, it is necessary to establish correlations among process variables, welder outputs, and weldment properties. The initial step toward achieving this goal must involve assessment of the effect of specific process inputs, and the interactions among these variables, on spot weld characteristics. This investigation evaluated these effects through the application of a statistically designed experiment to the RSW process. A half-factorial Taguchi L16 design was used to understand and refine an RSW schedule developed for welding dissimilar aluminum-lithium alloys of different thickness. The baseline schedule had been established previously by traditional trial-and-error methods based on engineering judgment and one-factor-at-a-time studies. A hierarchy of inputs with respect to each other was established, and the significance of these inputs with respect to experimental noise was determined. Useful insight was gained into the effect of interactions among process variables, particularly with respect to weldment defects. The effects of equipment-related changes associated with disassembly and recalibration were also identified. In spite of an apparent decrease in equipment performance, a significant improvement in the maximum strength of defect-free welds compared to the baseline schedule was achieved.
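
    To make the factorial logic concrete, the following is a minimal Python sketch of a 16-run, two-level half-fraction (2^(5-1)) design, analogous in size to an L16 array, with main effects estimated by contrasting the high and low settings of each factor. The factor names and weld-strength values are invented stand-ins, not the study's actual RSW schedule or data.

```python
# Sketch: a 16-run half-fraction (2^(5-1)) two-level design with main effects
# estimated by high/low contrasts. Factor names and responses are hypothetical.
import itertools
import numpy as np

factors = ["current", "force", "weld_time", "hold_time", "electrode_dia"]

# Full 2^4 design in coded units; the fifth column uses the generator E = ABCD,
# giving a resolution V half-fraction with 16 runs.
base = np.array(list(itertools.product([-1, 1], repeat=4)))
design = np.column_stack([base, base.prod(axis=1)])

# Hypothetical defect-free weld strength (N) for each run.
rng = np.random.default_rng(0)
strength = 2000 + 150 * design[:, 0] + 90 * design[:, 2] + rng.normal(0, 25, 16)

# Main effect of each factor = mean(response at +1) - mean(response at -1).
for name, col in zip(factors, design.T):
    effect = strength[col == 1].mean() - strength[col == -1].mean()
    print(f"{name:14s} main effect: {effect:7.1f} N")
```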

  15. The MSFC UNIVAC 1108 EXEC 8 simulation model

    NASA Technical Reports Server (NTRS)

    Williams, T. G.; Richards, F. M.; Weatherbee, J. E.; Paul, L. K.

    1972-01-01

    A model is presented which simulates the MSFC Univac 1108 multiprocessor system. The hardware/operating system is described to enable a good statistical measurement of the system behavior. The performance of the 1108 is evaluated by performing twenty-four different experiments designed to locate system bottlenecks and also to test the sensitivity of system throughput with respect to perturbation of the various Exec 8 scheduling algorithms. The model is implemented in the general purpose system simulation language and the techniques described can be used to assist in the design, development, and evaluation of multiprocessor systems.

  16. Product placement of computer games in cyberspace.

    PubMed

    Yang, Heng-Li; Wang, Cheng-Shu

    2008-08-01

    Computer games are considered an emerging media and are even regarded as an advertising channel. By a three-phase experiment, this study investigated the advertising effectiveness of computer games for different product placement forms, product types, and their combinations. As the statistical results revealed, computer games are appropriate for placement advertising. Additionally, different product types and placement forms produced different advertising effectiveness. Optimum combinations of product types and placement forms existed. An advertisement design model is proposed for use in game design environments. Some suggestions are given for advertisers and game companies respectively.

  17. Evaluation and optimization of hepatocyte culture media factors by design of experiments (DoE) methodology

    PubMed Central

    Dong, Jia; Lübberstedt, Marc; Urbaniak, Thomas; Nüssler, Andreas K.N.; Knobeloch, Daniel; Gerlach, Jörg C.; Zeilinger, Katrin

    2008-01-01

    Optimization of cell culture media based on statistical experimental design methodology is a widely used approach for improving cultivation conditions. We applied this methodology to refine the composition of an established culture medium for growth of a human hepatoma cell line, C3A. A selection of growth factors and nutrient supplements were systematically screened according to standard design of experiments (DoE) procedures. The results of the screening indicated that the medium additives hepatocyte growth factor, oncostatin M, and fibroblast growth factor 4 significantly influenced the metabolic activities of the C3A cell line. Surface response methodology revealed that the optimum levels for these factors were 30 ng/ml for hepatocyte growth factor and 35 ng/ml for oncostatin M. Additional experiments on primary human hepatocyte cultures showed high variance in metabolic activities between cells from different individuals, making determination of optimal levels of factors more difficult. Still, it was possible to conclude that hepatocyte growth factor, epidermal growth factor, and oncostatin M had decisive effects on the metabolic functions of primary human hepatocytes. PMID:19003182

  18. Evaluation and optimization of hepatocyte culture media factors by design of experiments (DoE) methodology.

    PubMed

    Dong, Jia; Mandenius, Carl-Fredrik; Lübberstedt, Marc; Urbaniak, Thomas; Nüssler, Andreas K N; Knobeloch, Daniel; Gerlach, Jörg C; Zeilinger, Katrin

    2008-07-01

    Optimization of cell culture media based on statistical experimental design methodology is a widely used approach for improving cultivation conditions. We applied this methodology to refine the composition of an established culture medium for growth of a human hepatoma cell line, C3A. A selection of growth factors and nutrient supplements were systematically screened according to standard design of experiments (DoE) procedures. The results of the screening indicated that the medium additives hepatocyte growth factor, oncostatin M, and fibroblast growth factor 4 significantly influenced the metabolic activities of the C3A cell line. Surface response methodology revealed that the optimum levels for these factors were 30 ng/ml for hepatocyte growth factor and 35 ng/ml for oncostatin M. Additional experiments on primary human hepatocyte cultures showed high variance in metabolic activities between cells from different individuals, making determination of optimal levels of factors more difficult. Still, it was possible to conclude that hepatocyte growth factor, epidermal growth factor, and oncostatin M had decisive effects on the metabolic functions of primary human hepatocytes.

  19. Factors that influence the tribocharging of pulverulent materials in compressed-air devices

    NASA Astrophysics Data System (ADS)

    Das, S.; Medles, K.; Mihalcioiu, A.; Beleca, R.; Dragan, C.; Dascalescu, L.

    2008-12-01

    Tribocharging of pulverulent materials in compressed-air devices is a typical multi-factorial process. This paper aims at demonstrating the value of using the design of experiments methodology in association with virtual instrumentation for quantifying the effects of various process variables and of their interactions, as a prerequisite for the development of new tribocharging devices for industrial applications. The study is focused on the tribocharging of PVC powders in compressed-air devices similar to those employed in electrostatic painting. A classical 2³ full-factorial design (three factors at two levels) was employed for conducting the experiments. The response function was the charge/mass ratio of the material collected in a modified Faraday cage at the exit of the tribocharging device. The charge/mass ratio was found to increase with the injection pressure and the vortex pressure in the tribocharging device, and to decrease with increasing feed rate. In the present study an in-house design-of-experiments software package was employed for the statistical analysis of the experimental data and validation of the experimental model.
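
    A minimal Python sketch of the kind of 2³ full-factorial analysis described above, estimating main effects and two-factor interactions from the charge/mass response; the numerical responses are invented for illustration and only loosely mimic the reported trends.

```python
# Sketch: a 2^3 full-factorial design in coded units and the resulting main effects
# and two-factor interactions for the charge/mass ratio. Response values are invented.
import itertools
import numpy as np

names = ["injection_p", "vortex_p", "feed_rate"]
runs = np.array(list(itertools.product([-1, 1], repeat=3)))   # 8 coded runs

# Hypothetical charge/mass ratios (arbitrary units), one per run, in the same order.
q_m = np.array([1.22, 0.78, 1.66, 1.13, 1.87, 1.34, 2.21, 1.77])

# Main effects: mean response at the high level minus mean response at the low level.
for j, name in enumerate(names):
    eff = q_m[runs[:, j] == 1].mean() - q_m[runs[:, j] == -1].mean()
    print(f"main effect {name:12s}: {eff:+.2f}")

# Two-factor interactions: the same contrast applied to the product column.
for a, b in itertools.combinations(range(3), 2):
    prod = runs[:, a] * runs[:, b]
    eff = q_m[prod == 1].mean() - q_m[prod == -1].mean()
    print(f"interaction {names[a]} x {names[b]}: {eff:+.2f}")
```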

  20. Quantitative knowledge acquisition for expert systems

    NASA Technical Reports Server (NTRS)

    Belkin, Brenda L.; Stengel, Robert F.

    1991-01-01

    A common problem in the design of expert systems is the definition of rules from data obtained in system operation or simulation. While it is relatively easy to collect data and to log the comments of human operators engaged in experiments, generalizing such information to a set of rules has not previously been a direct task. A statistical method is presented for generating rule bases from numerical data, motivated by an example based on aircraft navigation with multiple sensors. The specific objective is to design an expert system that selects a satisfactory suite of measurements from a dissimilar, redundant set, given an arbitrary navigation geometry and possible sensor failures. The systematic development of a Navigation Sensor Management (NSM) Expert System from Kalman Filter covariance data is described. The method invokes two statistical techniques: Analysis of Variance (ANOVA) and the ID3 Algorithm. The ANOVA technique indicates whether variations of problem parameters give statistically different covariance results, and the ID3 algorithm identifies the relationships between the problem parameters using probabilistic knowledge extracted from a simulation example set. Both are detailed.

  1. Photocatalytic degradation using design of experiments: a review and example of the Congo red degradation.

    PubMed

    Sakkas, Vasilios A; Islam, Md Azharul; Stalikas, Constantine; Albanis, Triantafyllos A

    2010-03-15

    The use of chemometric methods such as response surface methodology (RSM) based on statistical design of experiments (DOEs) is becoming increasingly widespread in several sciences such as analytical chemistry, engineering, and environmental chemistry. Applied catalysis is certainly no exception. It is clear that photocatalytic processes mated with chemometric experimental design play a crucial role in the ability to reach the optimum of the catalytic reactions. The present article reviews the major applications of RSM in modern experimental design combined with photocatalytic degradation processes. Moreover, the theoretical principles and designs that enable one to obtain a polynomial regression equation, which expresses the influence of process parameters on the response, are thoroughly discussed. An original experimental work, the photocatalytic degradation of the dye Congo red (CR) using TiO(2) suspensions and H(2)O(2) in natural surface water (river water), is comprehensively described as a case study, in order to provide sufficient guidelines to deal with this subject in a rational and integrated way. (c) 2009 Elsevier B.V. All rights reserved.
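
    As a brief illustration of the polynomial regression step at the heart of RSM, the following Python sketch fits a second-order model to an invented two-factor data set (hypothetical TiO2 load and H2O2 dose in coded units); it is not the paper's actual design or data.

```python
# Sketch: fitting the second-order (quadratic) response-surface polynomial typically
# used in RSM, with two hypothetical coded factors and an invented response.
import numpy as np

# Coded factor settings of a small face-centred design plus three centre points.
x1 = np.array([-1, 1, -1, 1, -1, 1, 0, 0, 0, 0, 0])
x2 = np.array([-1, -1, 1, 1, 0, 0, -1, 1, 0, 0, 0])
y  = np.array([52, 64, 61, 70, 58, 69, 55, 66, 63, 62, 64])  # % degradation (invented)

# Model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

for label, b in zip(["b0", "b1", "b2", "b12", "b11", "b22"], beta):
    print(f"{label:4s} = {b:6.2f}")
# The fitted polynomial can then be maximised (analytically or on a grid) to locate
# the predicted optimum operating conditions.
```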

  2. Point process statistics in atom probe tomography.

    PubMed

    Philippe, T; Duguay, S; Grancher, G; Blavette, D

    2013-09-01

    We present a review of spatial point processes as statistical models that we have designed for the analysis and treatment of atom probe tomography (APT) data. As a major advantage, these methods do not require sampling. The mean distance to nearest neighbour is an attractive approach to exhibit a non-random atomic distribution. A χ(2) test based on distance distributions to nearest neighbour has been developed to detect deviation from randomness. Best-fit methods based on first nearest neighbour distance (1 NN method) and pair correlation function are presented and compared to assess the chemical composition of tiny clusters. Delaunay tessellation for cluster selection has been also illustrated. These statistical tools have been applied to APT experiments on microelectronics materials. Copyright © 2012 Elsevier B.V. All rights reserved.
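
    The following Python sketch illustrates the nearest-neighbour idea on synthetic 3-D coordinates: the observed mean first-nearest-neighbour distance is compared with a Monte Carlo reference generated under complete spatial randomness. All coordinates and box dimensions are invented; this is not the authors' implementation.

```python
# Sketch: mean nearest-neighbour distance test for randomness in a 3-D point set,
# in the spirit of the 1NN approach described for APT data; synthetic coordinates only.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
box = 50.0                                     # nm, edge of a cubic analysis volume
pts = rng.uniform(0, box, size=(5000, 3))      # stand-in for solute atom positions

def mean_nn_distance(points):
    tree = cKDTree(points)
    d, _ = tree.query(points, k=2)             # k=2: first neighbour after the point itself
    return d[:, 1].mean()

observed = mean_nn_distance(pts)

# Monte Carlo reference distribution under complete spatial randomness (same density).
reference = np.array([
    mean_nn_distance(rng.uniform(0, box, size=pts.shape)) for _ in range(200)
])
print(f"observed mean 1NN distance: {observed:.3f} nm")
print(f"CSR reference: {reference.mean():.3f} +/- {reference.std():.3f} nm")
# An observed value markedly smaller than the CSR reference indicates clustering.
```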

  3. Use of iPhone technology in improving acetabular component position in total hip arthroplasty.

    PubMed

    Tay, Xiau Wei; Zhang, Benny Xu; Gayagay, George

    2017-09-01

    Improper acetabular cup positioning is associated with a high risk of complications after total hip arthroplasty. The aim of our study is to objectively compare 3 methods, namely (1) free hand, (2) alignment jig (Sputnik), and (3) iPhone application, to identify an easy, reproducible, and accurate method of improving acetabular cup placement. We designed a simple setup and carried out an experiment (see Method section). Using statistical analysis, the difference in inclination angles between the iPhone application and the freehand method was found to be statistically significant (F(2,51) = 4.17, P = .02) in the "untrained group". No statistically significant differences were detected for the other groups. This suggests a potential role for iPhone applications in helping junior surgeons overcome the steep learning curve.

  4. Analyzing Single-Molecule Protein Transportation Experiments via Hierarchical Hidden Markov Models

    PubMed Central

    Chen, Yang; Shen, Kuang

    2017-01-01

    To maintain proper cellular functions, over 50% of proteins encoded in the genome need to be transported to cellular membranes. The molecular mechanism behind such a process, often referred to as protein targeting, is not well understood. Single-molecule experiments are designed to unveil the detailed mechanisms and reveal the functions of different molecular machineries involved in the process. The experimental data consist of hundreds of stochastic time traces from the fluorescence recordings of the experimental system. We introduce a Bayesian hierarchical model on top of hidden Markov models (HMMs) to analyze these data and use the statistical results to answer the biological questions. In addition to resolving the biological puzzles and delineating the regulating roles of different molecular complexes, our statistical results enable us to propose a more detailed mechanism for the late stages of the protein targeting process. PMID:28943680
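
    A minimal Python sketch of one building block of such an analysis, the scaled forward algorithm for a two-state Gaussian hidden Markov model applied to a single synthetic fluorescence trace; all parameter values are invented, and the full Bayesian hierarchical layer described above is not shown.

```python
# Sketch: scaled forward-algorithm likelihood of a two-state Gaussian HMM for one
# synthetic time trace; a minimal building block, with invented parameter values.
import numpy as np
from scipy.stats import norm

def hmm_log_likelihood(trace, trans, means, sds, start):
    """Scaled forward algorithm; returns log p(trace | parameters)."""
    alpha = start * norm.pdf(trace[0], means, sds)
    log_like = np.log(alpha.sum())
    alpha /= alpha.sum()
    for y in trace[1:]:
        alpha = (alpha @ trans) * norm.pdf(y, means, sds)
        log_like += np.log(alpha.sum())
        alpha /= alpha.sum()
    return log_like

# Hypothetical low- and high-signal states.
trans = np.array([[0.95, 0.05],
                  [0.10, 0.90]])
means, sds = np.array([0.2, 0.7]), np.array([0.05, 0.05])
start = np.array([0.5, 0.5])

rng = np.random.default_rng(3)
trace = np.concatenate([rng.normal(0.2, 0.05, 100), rng.normal(0.7, 0.05, 60)])
print(f"log-likelihood: {hmm_log_likelihood(trace, trans, means, sds, start):.1f}")
```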

  5. Downscaling of Global Climate Change Estimates to Regional Scales: An Application to Iberian Rainfall in Wintertime.

    NASA Astrophysics Data System (ADS)

    von Storch, Hans; Zorita, Eduardo; Cubasch, Ulrich

    1993-06-01

    A statistical strategy to deduce regional-scale features from climate general circulation model (GCM) simulations has been designed and tested. The main idea is to interrelate the characteristic patterns of observed simultaneous variations of regional climate parameters and of large-scale atmospheric flow using the canonical correlation technique. The large-scale North Atlantic sea level pressure (SLP) is related to the regional variable, the winter (DJF) mean Iberian Peninsula rainfall. The skill of the resulting statistical model is shown by reproducing, to a good approximation, the winter mean Iberian rainfall from 1900 to the present from the observed North Atlantic mean SLP distributions. It is shown that the observed relationship between these two variables is not well reproduced in the output of a general circulation model (GCM). The implications for Iberian rainfall changes as the response to increasing atmospheric greenhouse-gas concentrations simulated by two GCM experiments are examined with the proposed statistical model. In an instantaneous '2 x CO2' doubling experiment, using the simulated change of the mean North Atlantic SLP field to predict Iberian rainfall yields an insignificant increase of area-averaged rainfall of 1 mm/month, with maximum values of 4 mm/month in the northwest of the peninsula. In contrast, for the four GCM grid points representing the Iberian Peninsula, the directly simulated change is a decrease of 10 mm/month, reaching 19 mm/month in the southwest. In the second experiment, with the IPCC scenario A ("business as usual") increase of CO2, the statistical-model results partially differ from the directly simulated rainfall changes: over the experimental range of 100 years, the area-averaged rainfall decreases by 7 mm/month (statistical model) and by 9 mm/month (GCM); at the same time the amplitude of the interdecadal variability is quite different.

  6. Optimization of fermentation medium for the production of atrazine degrading strain Acinetobacter sp. DNS(32) by statistical analysis system.

    PubMed

    Zhang, Ying; Wang, Yang; Wang, Zhi-Gang; Wang, Xi; Guo, Huo-Sheng; Meng, Dong-Fang; Wong, Po-Keung

    2012-01-01

    Statistical experimental designs provided by the statistical analysis system (SAS) software were applied to optimize the fermentation medium composition for the production of the atrazine-degrading Acinetobacter sp. DNS(32) in shake-flask cultures. A "Plackett-Burman Design" was employed to evaluate the effects of different components in the medium. The concentrations of corn flour, soybean flour, and K(2)HPO(4) were found to significantly influence Acinetobacter sp. DNS(32) production. The steepest ascent method was employed to determine the optimal regions of these three significant factors. Then, these three factors were optimized using the central composite design of "response surface methodology." The optimized fermentation medium was composed as follows (g/L): corn flour 39.49, soybean flour 25.64, CaCO(3) 3, K(2)HPO(4) 3.27, MgSO(4)·7H(2)O 0.2, and NaCl 0.2. The predicted and experimentally verified values in the medium with the optimized concentrations of components in shake-flask experiments were 7.079 × 10(8) CFU/mL and 7.194 × 10(8) CFU/mL, respectively. The validated model can precisely predict the growth of the atrazine-degrading bacterium Acinetobacter sp. DNS(32).
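
    The screening step can be sketched in Python as a small two-level design built from a Hadamard matrix (a Plackett-Burman-type construction), with main effects used to rank medium components; the component list follows the abstract, but the design size and the yield values are invented for illustration.

```python
# Sketch: an 8-run two-level screening design (Plackett-Burman-type, built from a
# Hadamard matrix) for ranking medium components by their main effects.
import numpy as np
from scipy.linalg import hadamard

H = hadamard(8)                 # 8 x 8 Hadamard matrix of +/-1 (Sylvester construction)
design = H[:, 1:]               # drop the all-ones column -> up to 7 factors in 8 runs
factors = ["corn_flour", "soy_flour", "K2HPO4", "CaCO3", "MgSO4", "NaCl", "dummy"]

# Hypothetical cell yields (1e8 CFU/mL) for the 8 runs.
yield_cfu = np.array([5.1, 6.8, 4.9, 7.2, 5.5, 6.1, 4.7, 7.0])

for name, col in zip(factors, design.T):
    effect = yield_cfu[col == 1].mean() - yield_cfu[col == -1].mean()
    print(f"{name:11s} main effect: {effect:+.2f}")
# Effects much larger in magnitude than that of the unassigned 'dummy' column
# (which estimates noise) flag components worth carrying into the RSM step.
```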

  7. High-throughput optimization by statistical designs: example with rat liver slices cryopreservation.

    PubMed

    Martin, H; Bournique, B; Blanchi, B; Lerche-Langrand, C

    2003-08-01

    The purpose of this study was to optimize cryopreservation conditions of rat liver slices in a high-throughput format, with a focus on reproducibility. A statistical design of 32 experiments was performed, and intracellular lactate dehydrogenase (LDHi) activity and antipyrine (AP) metabolism were evaluated as biomarkers. At freezing, modified University of Wisconsin solution was better than Williams' E medium, and pure dimethyl sulfoxide was better than a cryoprotectant mixture. The best cryoprotectant concentrations were 10% for LDHi and 20% for AP metabolism. Fetal calf serum could be used at 50 or 80%, and incubation of slices with the cryoprotectant could last 10 or 20 min. At thawing, 42 degrees C was better than 22 degrees C. After thawing, 1 h of preculture was better than 3 h. Cryopreservation increased the interslice variability of the biomarkers. After cryopreservation, LDHi and AP metabolism levels were up to 84 and 80% of fresh values, respectively. However, these high levels were not reproducibly achieved. Two factors involved in the day-to-day variability of LDHi were identified: the incubation time with the cryoprotectant and the preculture time. In conclusion, the statistical design was very efficient for quickly determining optimized conditions by simultaneously measuring the role of numerous factors. The cryopreservation procedure developed appears suitable for qualitative metabolic profiling studies.

  8. Complete scanpaths analysis toolbox.

    PubMed

    Augustyniak, Piotr; Mikrut, Zbigniew

    2006-01-01

    This paper presents a complete open software environment for the control, data processing, and assessment of visual experiments. Visual experiments are widely used in research on human perception physiology, and the results are applicable to various visual information-based man-machine interfaces, human-emulated automatic visual systems, and scanpath-based learning of perceptual habits. The toolbox is designed for the Matlab platform and supports an infra-red reflection-based eyetracker in calibration and scanpath analysis modes. Toolbox procedures are organized in three layers: the lower one communicating with the eyetracker output file, the middle one detecting scanpath events on a physiological basis, and the upper one consisting of experiment schedule scripts, statistics, and summaries. Several examples of visual experiments carried out with the presented toolbox complete the paper.

  9. Comparison of student's learning achievement through realistic mathematics education (RME) approach and problem solving approach on grade VII

    NASA Astrophysics Data System (ADS)

    Ilyas, Muhammad; Salwah

    2017-02-01

    This research was experimental. Its purpose was to determine the difference in, and the quality of, students' learning achievement between students taught through the Realistic Mathematics Education (RME) approach and students taught through a problem-solving approach. This was a quasi-experimental study with a non-equivalent experimental group design. The population was all grade VII students in one junior high school in Palopo, in the second semester of the 2015/2016 academic year. Two classes were selected purposively as the research sample: class VII-5 (28 students) as experiment group I and class VII-6 (23 students) as experiment group II. Experiment group I was taught using the RME approach, whereas experiment group II was taught using the problem-solving approach. Data were collected through a pretest and posttest given to the students. The analysis used descriptive statistics and inferential statistics based on a t-test. The descriptive analysis showed that the average mathematics score of students taught using the problem-solving approach was similar to that of students taught using the RME approach, with both at the high category. In addition, it can be concluded that (1) there was no difference in the mathematics learning outcomes of students taught using the RME approach and students taught using the problem-solving approach, and (2) the quality of learning achievement was the same for both approaches, at the high category.

  10. Preliminary results from DIMES: Dispersion in the ACC

    NASA Astrophysics Data System (ADS)

    Balwada, D.; Speer, K.; LaCasce, J. H.; Owens, B.

    2012-04-01

    The Diapycnal and Isopycnal Mixing Experiment in the Southern Ocean (DIMES) is a CLIVAR process study designed to study mixing in the Antarctic Circumpolar Current. The experiment includes tracer release, float, and small-scale turbulence components. This presentation will report on some results of the float component, from floats deployed across the ACC in the Southeast Pacific Ocean. These are the first subsurface Lagrangian trajectories from the ACC. Floats were deployed to follow approximately a constant density surface for a period of 1-3 years. To complement the experimental results, virtual floats were advected using AVISO data, and basic statistics were derived from both the deployed and virtual float trajectories. The experimental design, initial results, comparison to virtual floats, and single-particle and relative dispersion calculations will be presented.

  11. Design and Experiment of FBG-Based Icing Monitoring on Overhead Transmission Lines with an Improvement Trial for Windy Weather

    PubMed Central

    Zhang, Min; Xing, Yimeng; Zhang, Zhiguo; Chen, Qiguan

    2014-01-01

    A scheme for monitoring icing on overhead transmission lines with fiber Bragg grating (FBG) strain sensors is designed and evaluated both theoretically and experimentally. The influences of temperature and wind are considered. The results of field experiments using simulated ice loading on windless days indicate that the scheme is capable of monitoring icing thickness within 0–30 mm with an accuracy of ±1 mm, a load cell error of 0.0308v, a repeatability error of 0.3328v, and a hysteresis error of 0.026%. To improve the measurement during windy weather, a correction factor is added to the effective gravity acceleration, and the absolute FBG strain is replaced by its statistical average. PMID:25615733

  12. Teacher Professional Development to Foster Authentic Student Research Experiences

    NASA Astrophysics Data System (ADS)

    Conn, K.; Iyengar, E.

    2004-12-01

    This presentation reports on a new teacher workshop design that encourages teachers to initiate and support long-term student-directed research projects in the classroom setting. Teachers were recruited and engaged in an intensive marine ecology learning experience at Shoals Marine Laboratory, Appledore Island, Maine. Part of the weeklong summer workshop was spent in field work, part in laboratory work, and part in learning experimental design and basic statistical analysis of experimental results. Teachers were presented with strategies to adapt their workshop learnings to formulate plans for initiating and managing authentic student research projects in their classrooms. The authors will report on the different considerations and constraints facing the teachers in their home school settings and teachers' progress in implementing their plans. Suggestions for replicating the workshop will be offered.

  13. Plant growth modeling at the JSC variable pressure growth chamber - An application of experimental design

    NASA Technical Reports Server (NTRS)

    Miller, Adam M.; Edeen, Marybeth; Sirko, Robert J.

    1992-01-01

    This paper describes the approach and results of an effort to characterize plant growth under various environmental conditions at the Johnson Space Center variable pressure growth chamber. Using a field of applied mathematics and statistics known as design of experiments (DOE), we developed a test plan for varying environmental parameters during a lettuce growth experiment. The test plan was developed using a Box-Behnken approach to DOE. As a result of the experimental runs, we have developed empirical models of both the transpiration process and carbon dioxide assimilation for Waldman's Green lettuce over specified ranges of environmental parameters including carbon dioxide concentration, light intensity, dew-point temperature, and air velocity. This model also predicts transpiration and carbon dioxide assimilation for different ages of the plant canopy.
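
    The Box-Behnken construction itself is easy to reproduce; the following Python sketch builds the coded design for the four environmental factors named above. The number of centre points is an illustrative choice, and the mapping from coded to physical levels is left to the experimenter.

```python
# Sketch: constructing a Box-Behnken design in coded units for the four environmental
# factors named in the abstract. Level codings and centre-point count are illustrative.
import itertools
import numpy as np

factors = ["CO2", "light", "dew_point", "air_velocity"]
k, n_center = len(factors), 3

runs = []
for i, j in itertools.combinations(range(k), 2):
    # For each pair of factors, run a 2x2 factorial with all other factors at 0.
    for a, b in itertools.product([-1, 1], repeat=2):
        row = [0] * k
        row[i], row[j] = a, b
        runs.append(row)
runs.extend([[0] * k] * n_center)          # centre points

design = np.array(runs)
print(f"{design.shape[0]} runs for {k} factors")   # 6 pairs x 4 + 3 = 27 runs
print(design[:5])
# Each run is then mapped from coded levels (-1, 0, +1) to physical set points, and a
# quadratic response-surface model is fitted to the measured transpiration and CO2
# assimilation rates.
```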

  14. Response properties of ON-OFF retinal ganglion cells to high-order stimulus statistics.

    PubMed

    Xiao, Lei; Gong, Han-Yan; Gong, Hai-Qing; Liang, Pei-Ji; Zhang, Pu-Ming

    2014-10-17

    Visual stimulus statistics are the fundamental parameters that provide the reference for studying visual coding rules. In this study, multi-electrode extracellular recording experiments were designed and implemented on bullfrog retinal ganglion cells to explore how neural response properties change with stimulus statistics. Changes in low-order stimulus statistics, such as intensity and contrast, were clearly reflected in the neuronal firing rate. However, it was difficult to distinguish changes in high-order statistics, such as skewness and kurtosis, based on the neuronal firing rate alone. The neuronal temporal filtering and sensitivity characteristics were further analyzed. We observed that the peak-to-peak amplitude of the temporal filter and the neuronal sensitivity, obtained from either neuronal ON spikes or OFF spikes, could exhibit significant changes when the high-order stimulus statistics were changed. These results indicate that in the retina, the neuronal response properties may be reliable and powerful in carrying some complex and subtle visual information. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
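
    The distinction between low- and high-order stimulus statistics can be illustrated with the short Python sketch below, which builds three stimulus ensembles matched in mean and variance but differing in skewness and kurtosis; the particular distributions are illustrative and unrelated to the actual stimuli used in the experiments.

```python
# Sketch: stimulus ensembles matched in mean and contrast (variance) but differing in
# the higher-order statistics (skewness, kurtosis) discussed in the abstract.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 100_000

def standardize(x):
    return (x - x.mean()) / x.std()

stimuli = {
    "gaussian":    standardize(rng.normal(size=n)),
    "exponential": standardize(rng.exponential(size=n)),   # positive skew
    "uniform":     standardize(rng.uniform(size=n)),        # negative excess kurtosis
}

for name, s in stimuli.items():
    print(f"{name:11s} mean={s.mean():+.2f} var={s.var():.2f} "
          f"skew={stats.skew(s):+.2f} kurt={stats.kurtosis(s):+.2f}")
# A cell coding only low-order statistics would respond identically to all three
# ensembles; differences in filter shape or sensitivity reveal higher-order coding.
```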

  15. Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI).

    PubMed

    Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur

    2016-01-01

    We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non-expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI's robustness and sensitivity in capturing useful data relating to the students' conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. © 2016 T. Deane et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  16. The impact of particle size and initial solid loading on thermochemical pretreatment of wheat straw for improving sugar recovery.

    PubMed

    Rojas-Rejón, Oscar A; Sánchez, Arturo

    2014-07-01

    This work studies the effect of initial solid load (4-32 %; w/v, DS) and particle size (0.41-50 mm) on the monosaccharide yield of wheat straw subjected to dilute H(2)SO(4) (0.75 %, v/v) pretreatment and enzymatic saccharification. Response surface methodology (RSM) based on a full factorial design (FFD) was used for the statistical analysis of pretreatment and enzymatic hydrolysis. The highest xylose yield obtained during pretreatment (ca. 86 % of theoretical) was achieved at 4 % (w/v, DS) and 25 mm. The solid fraction obtained from the first set of experiments was subjected to enzymatic hydrolysis at a constant enzyme dosage (17 FPU/g); statistical analysis revealed that glucose yield was favored with solids pretreated at low initial solid loads and small particle sizes. Dynamic experiments showed that glucose yield did not increase after 48 h of enzymatic hydrolysis. Once the pretreatment conditions were established, experiments were carried out with several initial solid loadings (4-24 %; w/v, DS) and enzyme dosages (5-50 FPU/g). Two straw sizes (0.41 and 50 mm) were used for verification purposes. The highest glucose yield (ca. 55 % of theoretical) was achieved at 4 % (w/v, DS), 0.41 mm, and 50 FPU/g. Statistical analysis of the experiments showed that at low enzyme dosage, particle size had a remarkable effect on glucose yield, and initial solid load was the main factor affecting glucose yield.

  17. The statistical evaluation of duct tape end match as physical evidence

    NASA Astrophysics Data System (ADS)

    Chan, Ka Lok

    Duct tapes are often submitted to crime laboratories as evidence associated with abductions, homicides, or the construction of explosive devices. As a result, trace evidence examiners are often asked to analyze and compare commercial duct tapes so that they can establish possible evidentiary links. Duct tape end matches are believed to be the strongest association between exemplar and questioned samples because they are considered evidence with unique individual characteristics. While end match analysis and comparison have long been undertaken by trace evidence examiners, there is a significant lack of scientific research on associating two or more segments of duct tape. This study is designed to obtain statistical inferences on the uniqueness of duct tape tears. Three experiments were devised to compile the basis for a statistical assessment of the probability of duct tape end matches along with a proposed error rate. In the first experiment, we conducted the equivalent of 10,000 end match examinations with an error rate of 0%. In the second experiment, we performed 2,704 end match examinations, also with a 0% error rate. In the third experiment, using duct tape torn with an Elmendorf Tear tester, we conducted 576 end match examinations with an error rate of 0% and all samples correctly associated. The results of this study indicate that end matches are distinguishable within a single roll of duct tape and between two different rolls of duct tape having very similar surface features and weave pattern.

  18. Methods for processing microarray data.

    PubMed

    Ares, Manuel

    2014-02-01

    Quality control must be maintained at every step of a microarray experiment, from RNA isolation through statistical evaluation. Here we provide suggestions for analyzing microarray data. Because the utility of the results depends directly on the design of the experiment, the first critical step is to ensure that the experiment can be properly analyzed and interpreted. What is the biological question? What is the best way to perform the experiment? How many replicates will be required to obtain the desired statistical resolution? Next, the samples must be prepared, pass quality controls for integrity and representation, and be hybridized and scanned. Also, slides with defects, missing data, high background, or weak signal must be rejected. Data from individual slides must be normalized and combined so that the data are as free of systematic bias as possible. The third phase is to apply statistical filters and tests to the data to determine genes (1) expressed above background, (2) whose expression level changes in different samples, and (3) whose RNA-processing patterns or protein associations change. Next, a subset of the data should be validated by an alternative method, such as reverse transcription-polymerase chain reaction (RT-PCR). Provided that this endorses the general conclusions of the array analysis, gene sets whose expression, splicing, polyadenylation, protein binding, etc. change in different samples can be classified with respect to function, sequence motif properties, as well as other categories to extract hypotheses for their biological roles and regulatory logic.
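
    Two of the steps described above, per-array normalization and statistical filtering with multiple-testing control, are sketched in Python below on a synthetic expression matrix; the matrix sizes, spiked-in effect, and 5% false-discovery threshold are illustrative choices, not recommendations from the source.

```python
# Sketch: per-array median normalization and a gene-wise test with Benjamini-Hochberg
# FDR control on a synthetic (genes x arrays) log-expression matrix.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n_genes, n_ctrl, n_trt = 2000, 4, 4
log_expr = rng.normal(8.0, 1.0, size=(n_genes, n_ctrl + n_trt))
log_expr[:50, n_ctrl:] += 1.5                      # 50 genes truly up-regulated

# Per-array median normalization removes systematic array-to-array bias.
log_expr -= np.median(log_expr, axis=0, keepdims=True)

# Gene-wise two-sample t-test between control and treatment arrays.
t_stat, p_val = stats.ttest_ind(log_expr[:, :n_ctrl], log_expr[:, n_ctrl:], axis=1)

# Benjamini-Hochberg adjusted p-values (monotone step-up procedure).
order = np.argsort(p_val)
adj = p_val[order] * n_genes / np.arange(1, n_genes + 1)
adj = np.minimum.accumulate(adj[::-1])[::-1]
print(f"genes called differentially expressed at 5% FDR: {(adj <= 0.05).sum()}")
```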

  19. The Roles of Experience, Gender, and Individual Differences in Statistical Reasoning

    ERIC Educational Resources Information Center

    Martin, Nadia; Hughes, Jeffrey; Fugelsang, Jonathan

    2017-01-01

    We examine the joint effects of gender and experience on statistical reasoning. Participants with various levels of experience in statistics completed the Statistical Reasoning Assessment (Garfield, 2003), along with individual difference measures assessing cognitive ability and thinking dispositions. Although the performance of both genders…

  20. Experiences with an adaptive design for a dose-finding study in patients with osteoarthritis.

    PubMed

    Miller, Frank; Björnsson, Marcus; Svensson, Ola; Karlsten, Rolf

    2014-03-01

    Dose-finding studies in non-oncology areas are usually conducted in Phase II of the development process of a new potential medicine, and it is key to choose a good design for such a study, as the results will decide if and how to proceed to Phase III. The present article focuses on the design of a dose-finding study for pain in osteoarthritis patients treated with the TRPV1 antagonist AZD1386. We describe the different design alternatives considered in the planning of this study, the reasoning for choosing the adaptive design, and experiences with its conduct and interim analysis. Three alternatives were proposed: a single dose-finding study with a parallel design, a programme with a smaller Phase IIa study followed by a Phase IIb dose-finding study, and an adaptive dose-finding study. We describe these alternatives in detail and explain why the adaptive design was chosen for the study. We give insights into design aspects of the adaptive study that need to be pre-planned, such as interim decision criteria, the statistical analysis method, and the setup of a Data Monitoring Committee. Based on the interim analysis it was recommended to stop the study for futility, since AZD1386 showed no significant pain decrease based on the primary variable. We discuss results and experiences from the conduct of the study with the novel design approach. Substantial cost savings were achieved compared with the option of a single Phase II dose-finding study. However, we point out several challenges with this approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. Proteomic Workflows for Biomarker Identification Using Mass Spectrometry — Technical and Statistical Considerations during Initial Discovery

    PubMed Central

    Orton, Dennis J.; Doucette, Alan A.

    2013-01-01

    Identification of biomarkers capable of differentiating between pathophysiological states of an individual is a laudable goal in the field of proteomics. Protein biomarker discovery generally employs high throughput sample characterization by mass spectrometry (MS), being capable of identifying and quantifying thousands of proteins per sample. While MS-based technologies have rapidly matured, the identification of truly informative biomarkers remains elusive, with only a handful of clinically applicable tests stemming from proteomic workflows. This underlying lack of progress is attributed in large part to erroneous experimental design, biased sample handling, as well as improper statistical analysis of the resulting data. This review will discuss in detail the importance of experimental design and provide some insight into the overall workflow required for biomarker identification experiments. Proper balance between the degree of biological vs. technical replication is required for confident biomarker identification. PMID:28250400

  2. Artificial neural networks in evaluation and optimization of modified release solid dosage forms.

    PubMed

    Ibrić, Svetlana; Djuriš, Jelena; Parojčić, Jelena; Djurić, Zorica

    2012-10-18

    Implementation of the Quality by Design (QbD) approach in pharmaceutical development has compelled researchers in the pharmaceutical industry to employ Design of Experiments (DoE) as a statistical tool, in product development. Among all DoE techniques, response surface methodology (RSM) is the one most frequently used. Progress of computer science has had an impact on pharmaceutical development as well. Simultaneous with the implementation of statistical methods, machine learning tools took an important place in drug formulation. Twenty years ago, the first papers describing application of artificial neural networks in optimization of modified release products appeared. Since then, a lot of work has been done towards implementation of new techniques, especially Artificial Neural Networks (ANN) in modeling of production, drug release and drug stability of modified release solid dosage forms. The aim of this paper is to review artificial neural networks in evaluation and optimization of modified release solid dosage forms.

  3. Artificial Neural Networks in Evaluation and Optimization of Modified Release Solid Dosage Forms

    PubMed Central

    Ibrić, Svetlana; Djuriš, Jelena; Parojčić, Jelena; Djurić, Zorica

    2012-01-01

    Implementation of the Quality by Design (QbD) approach in pharmaceutical development has compelled researchers in the pharmaceutical industry to employ Design of Experiments (DoE) as a statistical tool, in product development. Among all DoE techniques, response surface methodology (RSM) is the one most frequently used. Progress of computer science has had an impact on pharmaceutical development as well. Simultaneous with the implementation of statistical methods, machine learning tools took an important place in drug formulation. Twenty years ago, the first papers describing application of artificial neural networks in optimization of modified release products appeared. Since then, a lot of work has been done towards implementation of new techniques, especially Artificial Neural Networks (ANN) in modeling of production, drug release and drug stability of modified release solid dosage forms. The aim of this paper is to review artificial neural networks in evaluation and optimization of modified release solid dosage forms. PMID:24300369

  4. Stated Choice design comparison in a developing country: recall and attribute nonattendance

    PubMed Central

    2014-01-01

    Background Experimental designs constitute a vital component of all Stated Choice (aka discrete choice experiment) studies. However, there exists limited empirical evaluation of the statistical benefits of Stated Choice (SC) experimental designs that employ non-zero prior estimates in constructing non-orthogonal constrained designs. This paper statistically compares the performance of contrasting SC experimental designs. In so doing, the effect of respondent literacy on patterns of Attribute non-Attendance (ANA) across fractional factorial orthogonal and efficient designs is also evaluated. The study uses a ‘real’ SC design to model consumer choice of primary health care providers in rural north India. A total of 623 respondents were sampled across four villages in Uttar Pradesh, India. Methods Comparison of orthogonal and efficient SC experimental designs is based on several measures. Appropriate comparison of each design’s respective efficiency measure is made using D-error results. Standardised Akaike Information Criteria are compared between designs and across recall periods. Comparisons control for stated and inferred ANA. Coefficient and standard error estimates are also compared. Results The added complexity of the efficient SC design, theorised elsewhere, is reflected in higher estimated amounts of ANA among illiterate respondents. However, controlling for ANA using stated and inferred methods consistently shows that the efficient design performs statistically better. Modelling SC data from the orthogonal and efficient design shows that model-fit of the efficient design outperform the orthogonal design when using a 14-day recall period. The performance of the orthogonal design, with respect to standardised AIC model-fit, is better when longer recall periods of 30-days, 6-months and 12-months are used. Conclusions The effect of the efficient design’s cognitive demand is apparent among literate and illiterate respondents, although, more pronounced among illiterate respondents. This study empirically confirms that relaxing the orthogonality constraint of SC experimental designs increases the information collected in choice tasks, subject to the accuracy of the non-zero priors in the design and the correct specification of a ‘real’ SC recall period. PMID:25386388

  5. Demystification of Bell inequality

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2009-08-01

    The main aim of this review is to show that the common conclusion that Bell's argument implies that any attempt to proceed beyond quantum mechanics induces a nonlocal model was not totally justified. Our analysis of Bell's argument demonstrates that violation of Bell's inequality implies neither the "death of realism" nor nonlocality. This violation is just a sign of the non-Kolmogorovness of the statistical data: the impossibility of putting statistical data collected in a few different experiments (corresponding to incompatible settings of the polarization beam splitters) into one probability space. This inequality has been well known in theoretical probability since the 19th century (from the works of Boole). We couple the non-Kolmogorovness of the data with the design of modern photon detectors.

  6. The influence of previous subject experience on interactions during peer instruction in an introductory physics course: A mixed methods analysis

    NASA Astrophysics Data System (ADS)

    Vondruska, Judy A.

    Over the past decade, peer instruction and the introduction of student response systems have provided a means of improving student engagement and achievement in large-lecture settings. While the nature of the student discourse occurring during peer instruction is less understood, existing studies have shown that student ideas about the subject, extraneous cues, and confidence level appear to matter in the student-student discourse. Using a mixed methods research design, this study examined the influence of previous subject experience on peer instruction in an introductory, one-semester Survey of Physics course. Quantitative results indicated that students in discussion pairs where both had previous subject experience were more likely to answer clicker questions correctly, both before and after peer discussion, compared to student groups where neither partner had previous subject experience. Students in mixed discussion pairs did not differ statistically in correct response rates from the other pairings. There was no statistically significant difference between the experience pairs on unit exam scores or the Peer Instruction Partner Survey. Although there was a statistically significant difference between the pre-MPEX and post-MPEX scores, there was no difference between the members of the various subject experience peer discussion pairs. The qualitative study, conducted after the quantitative study, helped to inform the quantitative results by exploring the nature of the peer interactions through survey questions and a series of focus group discussions. While the majority of participants described a benefit to the use of clickers in the lecture, their experience with their discussion partners varied. Students with previous subject experience tended to describe peer instruction more positively than students who did not have previous subject experience, regardless of the experience level of their partner. They were also more likely to report favorable levels of comfort with the peer instruction experience. Students with no previous subject experience were more likely to describe discomfort at being assigned a stranger as a discussion partner and were more likely to report communication issues with their partner. Most group members, regardless of previous subject experience, related deeper discussions occurring when partners did not initially have the same answer to the clicker questions.

  7. Some challenges with statistical inference in adaptive designs.

    PubMed

    Hung, H M James; Wang, Sue-Jane; Yang, Peiling

    2014-01-01

    Adaptive designs have generated a great deal of attention in clinical trial communities. The literature contains many statistical methods to deal with the added statistical uncertainties concerning the adaptations. Increasingly encountered in regulatory applications are adaptive statistical information designs, which allow modification of the sample size or related statistical information, and adaptive selection designs, which allow selection of doses or patient populations during the course of a clinical trial. For adaptive statistical information designs, a few statistical testing methods are mathematically equivalent, as a number of articles have stipulated, but arguably there are large differences in their practical ramifications. We pinpoint some undesirable features of these methods in this work. For adaptive selection designs, selection based on biomarker data for testing the correlated clinical endpoints may increase statistical uncertainty in terms of type I error probability, and most importantly the increased statistical uncertainty may be impossible to assess.
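
    The type I error concern can be made concrete with a small Monte Carlo sketch in Python: a naive fixed-sample test applied after a data-driven sample-size modification no longer controls the nominal error rate. The decision rule, sample sizes, and test are invented for illustration and are not taken from the article.

```python
# Sketch: Monte Carlo illustration of how an unadjusted ("naive") final analysis after
# an adaptive sample-size modification can inflate the type I error probability.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n1 = 50                                   # stage-1 sample size
alpha, n_sim = 0.05, 20_000
crit = stats.norm.ppf(1 - alpha)
rejections = 0

for _ in range(n_sim):
    stage1 = rng.normal(0.0, 1.0, n1)     # data generated under the null (no effect)
    z1 = stage1.mean() * np.sqrt(n1)
    # Adaptive rule (invented): keep stage 2 small if the interim looks promising,
    # otherwise enlarge stage 2 to "rescue" the trial.
    n2 = 25 if z1 >= 1.0 else 200
    stage2 = rng.normal(0.0, 1.0, n2)
    pooled = np.concatenate([stage1, stage2])
    z_final = pooled.mean() * np.sqrt(pooled.size) / pooled.std(ddof=1)
    rejections += z_final > crit          # naive one-sided test ignoring the adaptation

print(f"empirical type I error: {rejections / n_sim:.3f} (nominal {alpha})")
# A pre-specified combination test (e.g., inverse-normal weighting of the stage-wise
# statistics) would control the error rate despite the adaptation.
```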

  8. Statistical modeling of isoform splicing dynamics from RNA-seq time series data.

    PubMed

    Huang, Yuanhua; Sanguinetti, Guido

    2016-10-01

    Isoform quantification is an important goal of RNA-seq experiments, yet it remains problematic for genes with low expression or several isoforms. These difficulties may in principle be ameliorated by exploiting correlated experimental designs, such as time series or dosage response experiments. Time series RNA-seq experiments, in particular, are becoming increasingly popular, yet there are no methods that explicitly leverage the experimental design to improve isoform quantification. Here, we present DICEseq, the first isoform quantification method tailored to correlated RNA-seq experiments. DICEseq explicitly models the correlations between different RNA-seq experiments to aid the quantification of isoforms across experiments. Numerical experiments on simulated datasets show that DICEseq yields more accurate results than state-of-the-art methods, an advantage that can become considerable at low coverage levels. On real datasets, our results show that DICEseq provides substantially more reproducible and robust quantifications, increasing the correlation of estimates from replicate datasets by up to 10% on genes with low or moderate expression levels (bottom third of all genes). Furthermore, DICEseq permits to quantify the trade-off between temporal sampling of RNA and depth of sequencing, frequently an important choice when planning experiments. Our results have strong implications for the design of RNA-seq experiments, and offer a novel tool for improved analysis of such datasets. Python code is freely available at http://diceseq.sf.net. Contact: G.Sanguinetti@ed.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  9. 'Mind genomics': the experimental, inductive science of the ordinary, and its application to aspects of food and feeding.

    PubMed

    Moskowitz, Howard R

    2012-11-05

    The paper introduces the empirical science of 'mind genomics', whose objective is to understand the dimensions of ordinary, everyday experience, identify mind-set segments of people who value different aspects of that everyday experience, and then assign a new person to a mind-set by a statistically appropriate procedure. By studying different experiences using experimental design of ideas, 'mind genomics' constructs an empirical, inductive science of perception and experience, layer by layer. The ultimate objective of 'mind genomics' is a large-scale science of experience created using induction, with the science based upon emergent commonalities across many different types of daily experience. The particular topic investigated in the paper is the experience of healthful snacks, what makes a person 'want' them, and the dollar value of different sensory aspects of the healthful snack. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. [Review of research design and statistical methods in Chinese Journal of Cardiology].

    PubMed

    Zhang, Li-jun; Yu, Jin-ming

    2009-07-01

    To evaluate the research design and the use of statistical methods in the Chinese Journal of Cardiology, we reviewed the research design and statistical methods in all of the original papers published in the Chinese Journal of Cardiology from December 2007 to November 2008. The most frequently used research designs were cross-sectional design (34%), prospective design (21%), and experimental design (25%). Of all the articles, 49 (25%) used incorrect statistical methods, 29 (15%) lacked some sort of statistical analysis, and 23 (12%) had inconsistencies in the description of methods. There were significant differences between different statistical methods (P < 0.001). The rate of correct use of multifactor analysis was low, and repeated-measurement data were not analyzed with repeated-measures methods. Many problems exist in the Chinese Journal of Cardiology. Better research design and correct use of statistical methods are still needed, and stricter review by statisticians and epidemiologists is also required to improve the quality of the literature.

  11. Single element injector cold flow testing for STME swirl coaxial injector element design

    NASA Technical Reports Server (NTRS)

    Hulka, J.; Schneider, J. A.

    1993-01-01

    An oxidizer-swirled coaxial element injector is being investigated for application in the Space Transportation Main Engine (STME). Single element cold flow experiments were conducted to provide characterization of the STME injector element for future analysis, design, and optimization. All tests were conducted under quiescent, ambient backpressure conditions. Spray angle, circumferential spray uniformity, dropsize, and dropsize distribution were measured in water-only and water/nitrogen flows. Rupe mixing efficiency was measured using water/sucrose solution flows with a large grid patternator for simple comparative evaluation of mixing. Factorial designs of experiments were used for statistical evaluation of the effects of injector geometrical design features and propellant flow conditions on mixing and atomization. Increasing the free swirl angle of the liquid oxidizer had the greatest influence on increasing the mixing efficiency. The addition of gas assistance had the most significant effect on reducing oxidizer droplet size parameters and increasing droplet size distribution. Increasing the oxidizer injection velocity had the greatest influence on reducing oxidizer droplet size parameters and increasing size distribution for non-gas-assisted flows. Single element and multi-element subscale hot fire testing are recommended to verify optimized designs before committing to the STME design.

  12. Results from the MWA EoR Experiment

    NASA Astrophysics Data System (ADS)

    Webster, Rachel L.; MWA EoR Collaboration

    2018-05-01

    The MWA EoR experiment is one of a small handful of experiments designed to detect the statistical signal from the Epoch of Reionisation. Each of these experiments has reached a level of maturity where the challenges, in particular foreground removal, are being more fully understood. Over the past decade, the MWA EoR Collaboration has developed expertise in, and an understanding of, the elements of the telescope array, the end-to-end pipelines, ionospheric conditions, and the foreground emissions. Sufficient data have been collected to detect the theoretically predicted EoR signal. Limits have been published regularly; however, we are still several orders of magnitude from a possible detection. This paper outlines recent progress and indicates directions for future efforts.

  13. Radiation induced rotation of interplanetary dust particles - A feasibility study for a space experiment

    NASA Technical Reports Server (NTRS)

    Ratcliff, K. F.; Misconi, N. Y.; Paddack, S. J.

    1980-01-01

    Irregular interplanetary dust particles may acquire a considerable spin rate due to two non-statistical dynamical mechanisms induced by solar radiation. These arise from variations in surface albedo discussed by Radzievskii (1954) and from irregularities in surface geometry discussed by Paddack (1969). An experiment is reported which will lead to an evaluation in space of the effectiveness of these two spin mechanisms. The technique of optical levitation in an argon laser beam provides a stable trap for particles 10-60 microns in diameter. The objective is to design an optical trap for dielectric particles in vacuum to study these rotation mechanisms in the gravity-free environment of a Spacelab experiment.

  14. Evaluation of the flame propagation within an SI engine using flame imaging and LES

    NASA Astrophysics Data System (ADS)

    He, Chao; Kuenne, Guido; Yildar, Esra; van Oijen, Jeroen; di Mare, Francesca; Sadiki, Amsini; Ding, Carl-Philipp; Baum, Elias; Peterson, Brian; Böhm, Benjamin; Janicka, Johannes

    2017-11-01

    This work shows experiments and simulations of the fired operation of a spark ignition engine with port-fuelled injection. The test rig considered is an optically accessible single cylinder engine specifically designed at TU Darmstadt for the detailed investigation of in-cylinder processes and model validation. The engine was operated under lean conditions using iso-octane as a substitute for gasoline. Experiments have been conducted to provide a sound database of the combustion process. A planar flame imaging technique has been applied within the swirl- and tumble-planes to provide statistical information on the combustion process to complement a pressure-based comparison between simulation and experiments. This data is then analysed and used to assess the large eddy simulation performed within this work. For the simulation, the engine code KIVA has been extended by the dynamically thickened flame model combined with chemistry reduction by means of pressure dependent tabulation. Sixty cycles have been simulated to perform a statistical evaluation. Based on a detailed comparison with the experimental data, a systematic study has been conducted to obtain insight into the most crucial modelling uncertainties.

  15. Critical Analysis of Primary Literature in a Master’s-Level Class: Effects on Self-Efficacy and Science-Process Skills

    PubMed Central

    Abdullah, Christopher; Parris, Julian; Lie, Richard; Guzdar, Amy; Tour, Ella

    2015-01-01

    The ability to think analytically and creatively is crucial for success in the modern workforce, particularly for graduate students, who often aim to become physicians or researchers. Analysis of the primary literature provides an excellent opportunity to practice these skills. We describe a course that includes a structured analysis of four research papers from diverse fields of biology and group exercises in proposing experiments that would follow up on these papers. To facilitate a critical approach to primary literature, we included a paper with questionable data interpretation and two papers investigating the same biological question yet reaching opposite conclusions. We report a significant increase in students’ self-efficacy in analyzing data from research papers, evaluating authors’ conclusions, and designing experiments. Using our science-process skills test, we observe a statistically significant increase in students’ ability to propose an experiment that matches the goal of investigation. We also detect gains in interpretation of controls and quantitative analysis of data. No statistically significant changes were observed in questions that tested the skills of interpretation, inference, and evaluation. PMID:26250564

  16. Optimization Under Uncertainty for Electronics Cooling Design

    NASA Astrophysics Data System (ADS)

    Bodla, Karthik K.; Murthy, Jayathi Y.; Garimella, Suresh V.

    Optimization under uncertainty is a powerful methodology used in design and optimization to produce robust, reliable designs. Such an optimization methodology, employed when the input quantities of interest are uncertain, produces output uncertainties, helping the designer choose input parameters that would result in satisfactory thermal solutions. Apart from providing basic statistical information such as mean and standard deviation in the output quantities, auxiliary data from an uncertainty based optimization, such as local and global sensitivities, help the designer decide the input parameter(s) to which the output quantity of interest is most sensitive. This helps the design of experiments based on the most sensitive input parameter(s). A further crucial output of such a methodology is the solution to the inverse problem - finding the allowable uncertainty range in the input parameter(s), given an acceptable uncertainty range in the output quantity of interest...

  17. Application of a quality by design approach to the cell culture process of monoclonal antibody production, resulting in the establishment of a design space.

    PubMed

    Nagashima, Hiroaki; Watari, Akiko; Shinoda, Yasuharu; Okamoto, Hiroshi; Takuma, Shinya

    2013-12-01

    This case study describes the application of Quality by Design elements to the process of culturing Chinese hamster ovary cells in the production of a monoclonal antibody. All steps in the cell culture process and all process parameters in each step were identified by using a cause-and-effect diagram. Prospective risk assessment using failure mode and effects analysis identified the following four potential critical process parameters in the production culture step: initial viable cell density, culture duration, pH, and temperature. These parameters and lot-to-lot variability in raw material were then evaluated by process characterization utilizing a design of experiments approach consisting of a face-centered central composite design integrated with a full factorial design. Process characterization was conducted using a scaled down model that had been qualified by comparison with large-scale production data. Multivariate regression analysis was used to establish statistical prediction models for performance indicators and quality attributes; with these, we constructed contour plots and conducted Monte Carlo simulation to clarify the design space. The statistical analyses, especially for raw materials, identified set point values, which were most robust with respect to the lot-to-lot variability of raw materials while keeping the product quality within the acceptance criteria. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.

  18. Using volcano plots and regularized-chi statistics in genetic association studies.

    PubMed

    Li, Wentian; Freudenberg, Jan; Suh, Young Ju; Yang, Yaning

    2014-02-01

    Labor intensive experiments are typically required to identify the causal disease variants from a list of disease associated variants in the genome. For designing such experiments, candidate variants are ranked by their strength of genetic association with the disease. However, the two commonly used measures of genetic association, the odds-ratio (OR) and p-value, may rank variants in a different order. To integrate these two measures into a single analysis, here we transfer the volcano plot methodology from gene expression analysis to genetic association studies. In its original setting, volcano plots are scatter plots of fold-change and t-test statistic (or -log of the p-value), with the latter being more sensitive to sample size. In genetic association studies, the OR and Pearson's chi-square statistic (or equivalently its square root, chi; or the standardized log(OR)) can be analogously used in a volcano plot, allowing for their visual inspection. Moreover, the geometric interpretation of these plots leads to an intuitive method for filtering results by a combination of both OR and chi-square statistic, which we term "regularized-chi". This method selects associated markers by a smooth curve in the volcano plot instead of the right-angled lines which correspond to independent cutoffs for OR and chi-square statistic. The regularized-chi incorporates relatively more signals from variants with lower minor-allele-frequencies than the chi-square test statistic. As rare variants tend to have stronger functional effects, regularized-chi is better suited to the task of prioritization of candidate genes. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Performance mapping of the STM4-120 kinematic Stirling engine using a statistical design of experiments method

    NASA Astrophysics Data System (ADS)

    Powell, M. A.; Rawlinson, K. S.

    A kinematic Stirling cycle engine, the Stirling Thermal Motors (STM) STM4-120, was tested at the Sandia National Laboratories Engine Test Facility (ETF) from March 1989 to August 1992. Sandia is interested in determining this engine's potential for solar-thermal-electric applications. The last round of testing was conducted from July to August 1992 using Sandia-designed gas-fired heat pipe evaporators as the heat input system to the engine. The STM4-120 was performance mapped over a range of sodium vapor temperatures, cooling water temperatures, and cycle pressures. The resulting shaft power output levels ranged from 5-9 kW. The engine demonstrated high conversion efficiency (24-31%) even though the power output level was less than 40% of the rated output of 25 kW. The engine had been previously derated from 25 kW to 10 kW shaft power due to mechanical limitations that were identified by STM during parallel testing at their facility in Ann Arbor, MI. A statistical method was used to design the experiment, to choose the experimental points, and to generate correlation equations describing the engine performance given the operating parameters. The testing was truncated due to a failure of the heat pipe system caused by entrainment of liquid sodium in the condenser section of the heat pipes. Enough data was gathered to generate the correlations and to demonstrate the experimental technique. The correlation is accurate in the experimental space and is simple enough for use in hand calculations and spreadsheet-based system models. Use of this method can simplify the construction of accurate performance and economic models of systems in which the engine is a component. The purpose of this paper is to present the method used to design the experiments and to analyze the performance data.
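
    The following minimal sketch shows the kind of least-squares correlation equation such a design can yield, fitting shaft power against the operating parameters. All numbers are invented placeholders, not the STM4-120 data, and the model form (linear terms plus one interaction) is an assumption for illustration.

      # Sketch: DOE-style correlation of shaft power vs. operating parameters.
      # All values are hypothetical, not the STM4-120 measurements.
      import numpy as np

      # Columns: sodium vapor temp (C), cooling water temp (C), cycle pressure (MPa)
      X = np.array([
          [700.0, 25.0, 6.0],
          [700.0, 40.0, 9.0],
          [750.0, 25.0, 9.0],
          [750.0, 40.0, 6.0],
          [725.0, 32.5, 7.5],
          [775.0, 32.5, 7.5],
          [725.0, 32.5, 9.0],
      ])
      power_kw = np.array([5.2, 6.1, 7.8, 6.9, 6.5, 8.4, 7.2])  # hypothetical

      # Correlation: b0 + b1*T_Na + b2*T_cw + b3*P + b4*(T_Na*P)
      A = np.column_stack([np.ones(len(X)), X, X[:, 0] * X[:, 2]])
      coef, *_ = np.linalg.lstsq(A, power_kw, rcond=None)
      print("correlation coefficients:", np.round(coef, 4))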

  20. Statistical power, the Belmont report, and the ethics of clinical trials.

    PubMed

    Vollmer, Sara H; Howard, George

    2010-12-01

    Achieving a good clinical trial design increases the likelihood that a trial will take place as planned, including that data will be obtained from a sufficient number of participants, and the total number of participants will be the minimal required to gain the knowledge sought. A good trial design also increases the likelihood that the knowledge sought by the experiment will be forthcoming. Achieving such a design is more than good sense; it is ethically required in experiments when participants are at risk of harm. This paper argues that doing a power analysis effectively contributes to ensuring that a trial design is good. The ethical importance of good trial design has long been recognized for trials in which there is risk of serious harm to participants. However, whether the quality of a trial design, when the risk to participants is only minimal, is an ethical issue is rarely discussed. This paper argues that even in cases when the risk is minimal, the quality of the trial design is an ethical issue, and that this is reflected in the emphasis the Belmont Report places on the importance of the benefit of knowledge gained by society. The paper also argues that good trial design is required for true informed consent.
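
    As a hedged illustration of the kind of power analysis the paper advocates, the sketch below sizes a two-arm trial for a two-sample t-test using statsmodels. The effect size, alpha, and target power are assumed values chosen only for the example.

      # Illustrative a priori power analysis for a two-arm trial (assumed inputs).
      from statsmodels.stats.power import TTestIndPower

      analysis = TTestIndPower()
      n_per_arm = analysis.solve_power(effect_size=0.5,   # assumed Cohen's d
                                       alpha=0.05,
                                       power=0.80,
                                       alternative='two-sided')
      print(f"participants needed per arm: {n_per_arm:.1f}")  # roughly 64 per arm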

  1. Does reviewing lead to better learning and decision making? Answers from a randomized stock market experiment.

    PubMed

    Wessa, Patrick; Holliday, Ian E

    2012-01-01

    The literature is not univocal about the effects of Peer Review (PR) within the context of constructivist learning. Due to the predominant focus on using PR as an assessment tool, rather than a constructivist learning activity, and because most studies implicitly assume that the benefits of PR are limited to the reviewee, little is known about the effects upon students who are required to review their peers. Much of the theoretical debate in the literature is focused on explaining how and why constructivist learning is beneficial. At the same time these discussions are marked by an underlying presupposition of a causal relationship between reviewing and deep learning. The purpose of the study is to investigate whether the writing of PR feedback causes students to benefit in terms of: perceived utility of statistics, actual use of statistics, better understanding of statistical concepts and associated methods, changed attitudes towards market risks, and outcomes of decisions that were made. We conducted a randomized experiment, assigning students randomly to receive PR or non-PR treatments, and used two cohorts with a different time span. The paper discusses the experimental design and all the software components that we used to support the learning process: Reproducible Computing technology which allows students to reproduce or re-use statistical results from peers, Collaborative PR, and an AI-enhanced Stock Market Engine. The results establish that the writing of PR feedback messages causes students to experience benefits in terms of Behavior, Non-Rote Learning, and Attitudes, provided the sequence of PR activities is maintained for a sufficiently long period.

  2. Potential sources of variability in mesocosm experiments on the response of phytoplankton to ocean acidification

    NASA Astrophysics Data System (ADS)

    Moreno de Castro, Maria; Schartau, Markus; Wirtz, Kai

    2017-04-01

    Mesocosm experiments on phytoplankton dynamics under high CO2 concentrations mimic the response of marine primary producers to future ocean acidification. However, potential acidification effects can be hindered by the high standard deviation typically found in the replicates of the same CO2 treatment level. In experiments with multiple unresolved factors and a sub-optimal number of replicates, post-processing statistical inference tools might fail to detect an effect that is present. We propose that in such cases, data-based model analyses might be suitable tools to unearth potential responses to the treatment and identify the uncertainties that could produce the observed variability. As test cases, we used data from two independent mesocosm experiments. Both experiments showed high standard deviations and, according to statistical inference tools, biomass appeared insensitive to changing CO2 conditions. Conversely, our simulations showed earlier and more intense phytoplankton blooms in modeled replicates at high CO2 concentrations and suggested that uncertainties in average cell size, phytoplankton biomass losses, and initial nutrient concentration potentially outweigh acidification effects by triggering strong variability during the bloom phase. We also estimated the thresholds below which uncertainties do not escalate to high variability. This information might help in designing future mesocosm experiments and interpreting controversial results on the effect of acidification or other pressures on ecosystem functions.

  3. Statistical Evaluation and Improvement of Methods for Combining Random and Harmonic Loads

    NASA Technical Reports Server (NTRS)

    Brown, A. M.; McGhee, D. S.

    2003-01-01

    Structures in many environments experience both random and harmonic excitation. A variety of closed-form techniques has been used in the aerospace industry to combine the loads resulting from the two sources. The resulting combined loads are then used to design for both yield/ultimate strength and high-cycle fatigue capability. This Technical Publication examines the cumulative distribution function (CDF) percentiles obtained using each method by integrating the joint probability density function of the sine and random components. A new Microsoft Excel spreadsheet macro that links with the software program Mathematica to calculate the combined value corresponding to any desired percentile is then presented along with a curve fit to this value. Another Excel macro that calculates the combination using Monte Carlo simulation is shown. Unlike the traditional techniques, these methods quantify the calculated load value with a consistent percentile. Using either of the presented methods can be extremely valuable in probabilistic design, which requires a statistical characterization of the loading. Additionally, since the CDF at high probability levels is very flat, the design value is extremely sensitive to the predetermined percentile; therefore, applying the new techniques can substantially lower the design loading without losing any of the identified structural reliability.
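
    The Monte Carlo combination idea can be sketched in a few lines: sample the Gaussian random component and the harmonic component at a uniformly random phase, then read off a high percentile of the combined load. The amplitudes and the chosen percentile below are assumptions for illustration, not values from the publication.

      # Sketch of Monte Carlo combination of random and harmonic loads.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 1_000_000
      sigma_random = 1.0    # RMS of the random load component (assumed)
      amp_harmonic = 1.5    # amplitude of the harmonic component (assumed)

      random_part = rng.normal(0.0, sigma_random, n)
      harmonic_part = amp_harmonic * np.sin(rng.uniform(0.0, 2.0 * np.pi, n))
      combined = random_part + harmonic_part

      p = 99.87  # roughly the one-sided "3-sigma" percentile
      print(f"{p}th-percentile combined load: {np.percentile(combined, p):.3f}")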

  4. Statistical Comparison and Improvement of Methods for Combining Random and Harmonic Loads

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; McGhee, David S.

    2004-01-01

    Structures in many environments experience both random and harmonic excitation. A variety of closed-form techniques has been used in the aerospace industry to combine the loads resulting from the two sources. The resulting combined loads are then used to design for both yield/ultimate strength and high-cycle fatigue capability. This paper examines the cumulative distribution function (CDF) percentiles obtained using each method by integrating the joint probability density function of the sine and random components. A new Microsoft Excel spreadsheet macro that links with the software program Mathematica is then used to calculate the combined value corresponding to any desired percentile along with a curve fit to this value. Another Excel macro is used to calculate the combination using a Monte Carlo simulation. Unlike the traditional techniques, these methods quantify the calculated load value with a consistent percentile. Using either of the presented methods can be extremely valuable in probabilistic design, which requires a statistical characterization of the loading. Also, since the CDF at high probability levels is very flat, the design value is extremely sensitive to the predetermined percentile; therefore, applying the new techniques can lower the design loading substantially without losing any of the identified structural reliability.

  5. Design optimization for cost and quality: The robust design approach

    NASA Technical Reports Server (NTRS)

    Unal, Resit

    1990-01-01

    Designing reliable, low cost, and operable space systems has become the key to future space operations. Designing high quality space systems at low cost is an economic and technological challenge to the designer. A systematic and efficient way to meet this challenge is a new method of design optimization for performance, quality, and cost, called Robust Design. Robust Design is an approach for design optimization. It consists of: making system performance insensitive to material and subsystem variation, thus allowing the use of less costly materials and components; making designs less sensitive to the variations in the operating environment, thus improving reliability and reducing operating costs; and using a new structured development process so that engineering time is used most productively. The objective in Robust Design is to select the best combination of controllable design parameters so that the system is most robust to uncontrollable noise factors. The robust design methodology uses a mathematical tool called an orthogonal array, from design of experiments theory, to study a large number of decision variables with a relatively small number of experiments. Robust design also uses a statistical measure of performance, called a signal-to-noise ratio, from electrical control theory, to evaluate the level of performance and the effect of noise factors. The purpose is to investigate the Robust Design methodology for improving quality and cost, demonstrate its application by the use of an example, and suggest its use as an integral part of the space system design process.
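
    A minimal sketch of the signal-to-noise calculation used in robust design is given below for a "larger-is-better" characteristic. The replicate values and the two candidate settings are hypothetical; they simply show how a noisier setting is penalized even when its mean is comparable.

      # Robust-design S/N ratio for a larger-is-better response (hypothetical data).
      import numpy as np

      def sn_larger_is_better(replicates):
          """S/N = -10*log10(mean(1/y^2)) for larger-is-better responses."""
          y = np.asarray(replicates, dtype=float)
          return -10.0 * np.log10(np.mean(1.0 / y**2))

      setting_a = [92.0, 95.0, 90.0]   # consistent performance
      setting_b = [97.0, 78.0, 99.0]   # similar mean, but noisier

      print("S/N (A):", round(sn_larger_is_better(setting_a), 2))
      print("S/N (B):", round(sn_larger_is_better(setting_b), 2))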

  6. Designing Preclinical Perceptibility Measures to Evaluate Topical Vaginal Gel Formulations: Relating User Sensory Perceptions and Experiences to Formulation Properties

    PubMed Central

    Fava, Joseph L.; Rosen, Rochelle K.; Vargas, Sara; Shaw, Julia G.; Kojic, E. Milu; Kiser, Patrick F.; Friend, David R.; Katz, David F.

    2014-01-01

    The effectiveness of any biomedical prevention technology relies on both biological efficacy and behavioral adherence. Microbicide trials have been hampered by low adherence, limiting the ability to draw meaningful conclusions about product effectiveness. Central to this problem may be an inadequate conceptualization of how product properties themselves impact user experience and adherence. Our goal is to expand the current microbicide development framework to include product “perceptibility,” the objective measurement of user sensory perceptions (i.e., sensations) and experiences of formulation performance during use. For vaginal gels, a set of biophysical properties, including rheological properties and measures of spreading and retention, may critically impact user experiences. Project LINK sought to characterize the user experience in this regard, and to validate measures of user sensory perceptions and experiences (USPEs) using four prototype topical vaginal gel formulations designed for pericoital use. Perceptibility scales captured a range of USPEs during the product application process (five scales), ambulation after product insertion (six scales), and during sexual activity (eight scales). Comparative statistical analyses provided empirical support for hypothesized relationships between gel properties, spreading performance, and the user experience. Project LINK provides preliminary evidence for the utility of evaluating USPEs, introducing a paradigm shift in the field of microbicide formulation design. We propose that these user sensory perceptions and experiences initiate cognitive processes in users resulting in product choice and willingness-to-use. By understanding the impact of USPEs on that process, formulation development can optimize both drug delivery and adherence. PMID:24180360

  7. Quasi-experimental study designs series-paper 1: introduction: two historical lineages.

    PubMed

    Bärnighausen, Till; Røttingen, John-Arne; Rockers, Peter; Shemilt, Ian; Tugwell, Peter

    2017-09-01

    The objective of this study was to contrast the historical development of experiments and quasi-experiments and provide the motivation for a journal series on quasi-experimental designs in health research. A short historical narrative, with concrete examples, and arguments based on an understanding of the practice of health research and evidence synthesis. Health research has played a key role in developing today's gold standard for causal inference-the randomized controlled multiply blinded trial. Historically, allocation approaches developed from convenience and purposive allocation to alternate and, finally, to random allocation. This development was motivated both by concerns for manipulation in allocation as well as statistical and theoretical developments demonstrating the power of randomization in creating counterfactuals for causal inference. In contrast to the sequential development of experiments, quasi-experiments originated at very different points in time, from very different scientific perspectives, and with frequent and long interruptions in their methodological development. Health researchers have only recently started to recognize the value of quasi-experiments for generating novel insights on causal relationships. While quasi-experiments are unlikely to replace experiments in generating the efficacy and safety evidence required for clinical guidelines and regulatory approval of medical technologies, quasi-experiments can play an important role in establishing the effectiveness of health care practice, programs, and policies. The papers in this series describe and discuss a range of important issues in utilizing quasi-experimental designs for primary research and quasi-experimental results for evidence synthesis. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Designing preclinical perceptibility measures to evaluate topical vaginal gel formulations: relating user sensory perceptions and experiences to formulation properties.

    PubMed

    Morrow, Kathleen M; Fava, Joseph L; Rosen, Rochelle K; Vargas, Sara; Shaw, Julia G; Kojic, E Milu; Kiser, Patrick F; Friend, David R; Katz, David F

    2014-01-01

    The effectiveness of any biomedical prevention technology relies on both biological efficacy and behavioral adherence. Microbicide trials have been hampered by low adherence, limiting the ability to draw meaningful conclusions about product effectiveness. Central to this problem may be an inadequate conceptualization of how product properties themselves impact user experience and adherence. Our goal is to expand the current microbicide development framework to include product "perceptibility," the objective measurement of user sensory perceptions (i.e., sensations) and experiences of formulation performance during use. For vaginal gels, a set of biophysical properties, including rheological properties and measures of spreading and retention, may critically impact user experiences. Project LINK sought to characterize the user experience in this regard, and to validate measures of user sensory perceptions and experiences (USPEs) using four prototype topical vaginal gel formulations designed for pericoital use. Perceptibility scales captured a range of USPEs during the product application process (five scales), ambulation after product insertion (six scales), and during sexual activity (eight scales). Comparative statistical analyses provided empirical support for hypothesized relationships between gel properties, spreading performance, and the user experience. Project LINK provides preliminary evidence for the utility of evaluating USPEs, introducing a paradigm shift in the field of microbicide formulation design. We propose that these user sensory perceptions and experiences initiate cognitive processes in users resulting in product choice and willingness-to-use. By understanding the impact of USPEs on that process, formulation development can optimize both drug delivery and adherence.

  9. Kerr Reservoir LANDSAT experiment analysis for November 1980

    NASA Technical Reports Server (NTRS)

    Lecroy, S. R.

    1982-01-01

    An experiment was conducted on the waters of Kerr Reservoir to determine if reliable algorithms could be developed that relate water quality parameters to remotely sensed data. LANDSAT radiance data was used in the analysis since it is readily available and covers the area of interest on a regular basis. By properly designing the experiment, many of the unwanted variations due to atmosphere, solar, and hydraulic changes were minimized. The algorithms developed were constrained to satisfy rigorous statistical criteria before they could be considered dependable in predicting water quality parameters. A complete mix of different types of algorithms using the LANDSAT bands was generated to provide a thorough understanding of the relationships among the data involved. The study demonstrated that for the ranges measured, the algorithms that satisfactorily represented the data are mostly linear and only require a maximum of one or two LANDSAT bands. Ratioing techniques did not improve the results since the initial design of the experiment minimized the errors that this procedure is effective against. Good correlations were established for inorganic suspended solids, iron, turbidity, and Secchi depth.

  10. Secure and scalable deduplication of horizontally partitioned health data for privacy-preserving distributed statistical computation.

    PubMed

    Yigzaw, Kassaye Yitbarek; Michalas, Antonis; Bellika, Johan Gustav

    2017-01-03

    Techniques have been developed to compute statistics on distributed datasets without revealing private information except the statistical results. However, duplicate records in a distributed dataset may lead to incorrect statistical results. Therefore, to increase the accuracy of the statistical analysis of a distributed dataset, secure deduplication is an important preprocessing step. We designed a secure protocol for the deduplication of horizontally partitioned datasets with deterministic record linkage algorithms. We provided a formal security analysis of the protocol in the presence of semi-honest adversaries. The protocol was implemented and deployed across three microbiology laboratories located in Norway, and we ran experiments on the datasets in which the number of records for each laboratory varied. Experiments were also performed on simulated microbiology datasets and data custodians connected through a local area network. The security analysis demonstrated that the protocol protects the privacy of individuals and data custodians under a semi-honest adversarial model. More precisely, the protocol remains secure with the collusion of up to N - 2 corrupt data custodians. The total runtime for the protocol scales linearly with the addition of data custodians and records. One million simulated records distributed across 20 data custodians were deduplicated within 45 s. The experimental results showed that the protocol is more efficient and scalable than previous protocols for the same problem. The proposed deduplication protocol is efficient and scalable for practical uses while protecting the privacy of patients and data custodians.

  11. Gender differences in learning physical science concepts: Does computer animation help equalize them?

    NASA Astrophysics Data System (ADS)

    Jacek, Laura Lee

    This dissertation details an experiment designed to identify gender differences in learning using three experimental treatments: animation, static graphics, and verbal instruction alone. Three learning presentations were used in testing of 332 university students. Statistical analysis was performed using ANOVA, binomial tests for differences of proportion, and descriptive statistics. Results showed that animation significantly improved women's long-term learning over static graphics (p = 0.067), but did not significantly improve men's long-term learning over static graphics. In all cases, women's scores improved with animation over both other forms of instruction for long-term testing, indicating that future research should not abandon the study of animation as a tool that may promote gender equity in science. Short-term test differences were smaller, and not statistically significant. Variation present in short-term scores was related more to presentation topic than treatment. This research also details characteristics of each of the three presentations, to identify variables (e.g. level of abstraction in presentation) affecting score differences within treatments. Differences between men's and women's scores were non-standard between presentations, but these differences were not statistically significant (long-term p = 0.2961, short-term p = 0.2893). In future research, experiments might be better designed to test these presentational variables in isolation, possibly yielding more distinctive differences between presentational scores. Differences in confidence interval overlaps between presentations suggested that treatment superiority may be somewhat dependent on the design or topic of the learning presentation. Confidence intervals greatly overlapped in all situations. This undercut, to some degree, the surety of conclusions indicating superiority of one treatment type over the others. However, confidence intervals for animation were smaller, overlapped nearly completely for men and women (there was less overlap between the genders for the other two treatments), and centered around slightly higher means, lending further support to the conclusion that animation helped equalize men's and women's learning. The most important conclusion identified in this research is that gender is an important variable in experimental populations testing animation as a learning device. Averages indicated that both men and women prefer to work with animation over either static graphics or verbal instruction alone.

  12. Optimization of solid content, carbon/nitrogen ratio and food/inoculum ratio for biogas production from food waste.

    PubMed

    Dadaser-Celik, Filiz; Azgin, Sukru Taner; Yildiz, Yalcin Sevki

    2016-12-01

    Biogas production from food waste has been used as an efficient waste treatment option for years. The methane yields from decomposition of waste are, however, highly variable under different operating conditions. In this study, a statistical experimental design method (Taguchi OA9) was implemented to investigate the effects of simultaneous variations of three parameters on methane production. The parameters investigated were solid content (SC), carbon/nitrogen ratio (C/N) and food/inoculum ratio (F/I). Two sets of experiments were conducted with nine anaerobic reactors operating under different conditions. Optimum conditions were determined using statistical analysis, such as analysis of variance (ANOVA). A confirmation experiment was carried out at optimum conditions to investigate the validity of the results. Statistical analysis showed that SC was the most important parameter for methane production with a 45% contribution, followed by the F/I ratio with a 35% contribution. The optimum methane yield of 151 l kg⁻¹ volatile solids (VS) was achieved after 24 days of digestion when SC was 4%, C/N was 28 and F/I was 0.3. The confirmation experiment provided a methane yield of 167 l kg⁻¹ VS after 24 days. The analysis showed that biogas production from food waste may be increased by optimization of operating conditions. © The Author(s) 2016.
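
    A minimal sketch of the Taguchi-style percent-contribution analysis is shown below for three factors on the standard L9 orthogonal array. The methane yields in the example are invented stand-ins, not the study's measurements.

      # Percent contribution of SC, C/N and F/I on an L9(3^4) array (hypothetical yields).
      import numpy as np

      # First three columns of the standard L9 orthogonal array (levels 1-3).
      L9 = np.array([
          [1, 1, 1], [1, 2, 2], [1, 3, 3],
          [2, 1, 2], [2, 2, 3], [2, 3, 1],
          [3, 1, 3], [3, 2, 1], [3, 3, 2],
      ])
      factors = ["SC", "C/N", "F/I"]
      methane_yield = np.array([95, 110, 120, 130, 151, 105, 140, 88, 125.0])

      grand_mean = methane_yield.mean()
      ss_total = ((methane_yield - grand_mean) ** 2).sum()
      for j, name in enumerate(factors):
          level_means = [methane_yield[L9[:, j] == lvl].mean() for lvl in (1, 2, 3)]
          ss_factor = 3 * sum((m - grand_mean) ** 2 for m in level_means)
          print(f"{name:4s} contribution: {100 * ss_factor / ss_total:5.1f} %")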

  13. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  14. Application of Plackett-Burman experimental design in the development of muffin using adlay flour

    NASA Astrophysics Data System (ADS)

    Valmorida, J. S.; Castillo-Israel, K. A. T.

    2018-01-01

    The Plackett-Burman experimental design was applied to identify significant formulation and process variables in the development of a muffin using adlay flour. Out of the seven screened variables, the levels of sugar, the levels of butter and the baking temperature had the most significant influence on the product model in terms of physicochemical and sensory acceptability. Results of the experiment further demonstrate the effectiveness of the Plackett-Burman design in choosing the best adlay variety for muffin production. Hence, the statistical method used in the study permits an efficient selection of the important variables needed in the development of a muffin from adlay, which can then be optimized using response surface methodology.

  15. Mutual interference between statistical summary perception and statistical learning.

    PubMed

    Zhao, Jiaying; Ngo, Nhi; McKendrick, Ryan; Turk-Browne, Nicholas B

    2011-09-01

    The visual system is an efficient statistician, extracting statistical summaries over sets of objects (statistical summary perception) and statistical regularities among individual objects (statistical learning). Although these two kinds of statistical processing have been studied extensively in isolation, their relationship is not yet understood. We first examined how statistical summary perception influences statistical learning by manipulating the task that participants performed over sets of objects containing statistical regularities (Experiment 1). Participants who performed a summary task showed no statistical learning of the regularities, whereas those who performed control tasks showed robust learning. We then examined how statistical learning influences statistical summary perception by manipulating whether the sets being summarized contained regularities (Experiment 2) and whether such regularities had already been learned (Experiment 3). The accuracy of summary judgments improved when regularities were removed and when learning had occurred in advance. In sum, calculating summary statistics impeded statistical learning, and extracting statistical regularities impeded statistical summary perception. This mutual interference suggests that statistical summary perception and statistical learning are fundamentally related.

  16. Experimental design data for the biosynthesis of citric acid using Central Composite Design method.

    PubMed

    Kola, Anand Kishore; Mekala, Mallaiah; Goli, Venkat Reddy

    2017-06-01

    In the present investigation, we report the statistical design and optimization of significant variables for the microbial production of citric acid from sucrose in the presence of the filamentous fungus A. niger NCIM 705. Various combinations of experiments were designed with the Central Composite Design (CCD) of Response Surface Methodology (RSM) for the production of citric acid as a function of six variables. The variables are: initial sucrose concentration, initial pH of medium, fermentation temperature, incubation time, stirrer rotational speed, and oxygen flow rate. From the experimental data, a statistical model for this process has been developed. The optimum conditions reported in the present article are an initial sucrose concentration of 163.6 g/L, initial medium pH of 5.26, stirrer rotational speed of 247.78 rpm, incubation time of 8.18 days, fermentation temperature of 30.06 °C and oxygen flow rate of 1.35 lpm. Under optimum conditions the predicted maximum citric acid concentration is 86.42 g/L. Experimental validation carried out under the optimal conditions gave a citric acid concentration of 82.0 g/L. The model is able to represent the experimental data, and the agreement between the model and the experimental data is good.
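
    The sketch below illustrates the CCD/RSM workflow in miniature: build a face-centered central composite design in coded units and fit a full quadratic model by least squares. It is reduced to two factors with invented yields, whereas the study used six variables; it is meant only to show the mechanics.

      # Face-centered CCD in two coded factors plus a quadratic fit (illustrative).
      import itertools
      import numpy as np

      factorial = np.array(list(itertools.product([-1, 1], repeat=2)))
      axial = np.array([[-1, 0], [1, 0], [0, -1], [0, 1]])   # face-centered points
      center = np.zeros((3, 2))                              # replicated center runs
      X = np.vstack([factorial, axial, center])

      # Hypothetical citric-acid yields (g/L) for each run.
      y = np.array([60, 70, 66, 78, 63, 80, 72, 74, 85, 86, 84.0])

      # Full quadratic model: 1, x1, x2, x1*x2, x1^2, x2^2
      A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                           X[:, 0] * X[:, 1], X[:, 0] ** 2, X[:, 1] ** 2])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)
      print("quadratic coefficients:", np.round(coef, 3))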

  17. Computer Optimization of Biodegradable Nanoparticles Fabricated by Dispersion Polymerization.

    PubMed

    Akala, Emmanuel O; Adesina, Simeon; Ogunwuyi, Oluwaseun

    2015-12-22

    Quality by design (QbD) in the pharmaceutical industry involves designing and developing drug formulations and manufacturing processes which ensure predefined drug product specifications. QbD helps to understand how process and formulation variables affect product characteristics and the subsequent optimization of these variables vis-à-vis final specifications. Statistical design of experiments (DoE) identifies important parameters in a pharmaceutical dosage form design followed by optimizing the parameters with respect to certain specifications. DoE establishes in mathematical form the relationships between critical process parameters together with critical material attributes and critical quality attributes. We focused on the fabrication of biodegradable nanoparticles by dispersion polymerization. Aided by statistical software, a d-optimal mixture design was used to vary the components (crosslinker, initiator, stabilizer, and macromonomers) to obtain twenty nanoparticle formulations (PLLA-based nanoparticles) and thirty formulations (poly-ɛ-caprolactone-based nanoparticles). Scheffe polynomial models were generated to predict particle size (nm), zeta potential, and yield (%) as functions of the composition of the formulations. Simultaneous optimizations were carried out on the response variables. Solutions were returned from simultaneous optimization of the response variables for component combinations to (1) minimize nanoparticle size; (2) maximize the surface negative zeta potential; and (3) maximize percent yield to make the nanoparticle fabrication an economic proposition.
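
    For readers unfamiliar with Scheffe mixture polynomials, the sketch below fits a quadratic Scheffe model (linear blending terms plus binary-blend terms, no intercept) to a three-component mixture. The component labels, proportions, and particle sizes are illustrative assumptions, not the published formulations.

      # Quadratic Scheffe polynomial fit for a three-component mixture (illustrative).
      import numpy as np

      # Component proportions per formulation (rows sum to 1):
      # [macromonomer, crosslinker, stabilizer] -- hypothetical labels.
      X = np.array([
          [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0],
          [0.5, 0.5, 0.0], [0.5, 0.0, 0.5], [0.0, 0.5, 0.5],
          [1/3, 1/3, 1/3],
      ])
      size_nm = np.array([210, 180, 250, 160, 230, 190, 175.0])  # hypothetical

      x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
      A = np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])
      coef, *_ = np.linalg.lstsq(A, size_nm, rcond=None)
      print("Scheffe coefficients:", np.round(coef, 1))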

  18. Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS).

    PubMed

    Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M; Khan, Ajmal

    2016-01-01

    This article compares the study designs and statistical methods used in the 2005, 2010 and 2015 issues of the Pakistan Journal of Medical Sciences (PJMS). Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods was found in the analysis. The most frequent methods include: descriptive statistics (n=315, 73.4%), chi-square/Fisher's exact tests (n=205, 47.8%) and Student's t-test (n=186, 43.4%). There was a significant increase in the use of statistical methods over the time period: t-test, chi-square/Fisher's exact test, logistic regression, epidemiological statistics, and non-parametric tests. This study shows that a diverse variety of statistical methods has been used in the research articles of PJMS and their frequency increased from 2005 to 2015. However, descriptive statistics was the most frequent method of statistical analysis in the published articles, while the cross-sectional design was the most common study design.

  19. Framework for the rapid optimization of soluble protein expression in Escherichia coli combining microscale experiments and statistical experimental design.

    PubMed

    Islam, R S; Tisi, D; Levy, M S; Lye, G J

    2007-01-01

    A major bottleneck in drug discovery is the production of soluble human recombinant protein in sufficient quantities for analysis. This problem is compounded by the complex relationship between protein yield and the large number of variables which affect it. Here, we describe a generic framework for the rapid identification and optimization of factors affecting soluble protein yield in microwell plate fermentations as a prelude to the predictive and reliable scaleup of optimized culture conditions. Recombinant expression of firefly luciferase in Escherichia coli was used as a model system. Two rounds of statistical design of experiments (DoE) were employed to first screen (D-optimal design) and then optimize (central composite face design) the yield of soluble protein. Biological variables from the initial screening experiments included medium type and growth and induction conditions. To provide insight into the impact of the engineering environment on cell growth and expression, plate geometry, shaking speed, and liquid fill volume were included as factors since these strongly influence oxygen transfer into the wells. Compared to standard reference conditions, both the screening and optimization designs gave up to 3-fold increases in the soluble protein yield, i.e., a 9-fold increase overall. In general the highest protein yields were obtained when cells were induced at a relatively low biomass concentration and then allowed to grow slowly up to a high final biomass concentration, >8 g.L-1. Consideration and analysis of the model results showed 6 of the original 10 variables to be important at the screening stage and 3 after optimization. The latter included the microwell plate shaking speeds pre- and postinduction, indicating the importance of oxygen transfer into the microwells and identifying this as a critical parameter for subsequent scale translation studies. The optimization process, also known as response surface methodology (RSM), predicted there to be a distinct optimum set of conditions for protein expression which could be verified experimentally. This work provides a generic approach to protein expression optimization in which both biological and engineering variables are investigated from the initial screening stage. The application of DoE reduces the total number of experiments needed to be performed, while experimentation at the microwell scale increases experimental throughput and reduces cost.
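
    Once a second-order model has been fitted at the optimization stage, the predicted optimum inside the coded design region can be located numerically. The sketch below does this for an assumed quadratic in three coded factors; the coefficients are invented for illustration and do not come from the paper.

      # Locating the predicted optimum of an assumed fitted quadratic (RSM stage).
      import numpy as np
      from scipy.optimize import minimize

      b0 = 3.0
      b_lin = np.array([0.6, 0.4, -0.2])
      B = np.array([[-0.50, 0.10, 0.00],    # symmetric quadratic/interaction terms
                    [ 0.10, -0.30, 0.05],
                    [ 0.00, 0.05, -0.40]])

      def predicted_yield(x):
          return b0 + b_lin @ x + x @ B @ x

      res = minimize(lambda x: -predicted_yield(x), x0=np.zeros(3),
                     bounds=[(-1, 1)] * 3)   # stay inside the coded region
      print("optimum coded settings:", np.round(res.x, 2))
      print("predicted soluble yield:", round(-res.fun, 2))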

  20. Single-Case Experimental Designs to Evaluate Novel Technology-Based Health Interventions

    PubMed Central

    Cassidy, Rachel N; Raiff, Bethany R

    2013-01-01

    Technology-based interventions to promote health are expanding rapidly. Assessing the preliminary efficacy of these interventions can be achieved by employing single-case experiments (sometimes referred to as n-of-1 studies). Although single-case experiments are often misunderstood, they offer excellent solutions to address the challenges associated with testing new technology-based interventions. This paper provides an introduction to single-case techniques and highlights advances in developing and evaluating single-case experiments, which help ensure that treatment outcomes are reliable, replicable, and generalizable. These advances include quality control standards, heuristics to guide visual analysis of time-series data, effect size calculations, and statistical analyses. They also include experimental designs to isolate the active elements in a treatment package and to assess the mechanisms of behavior change. The paper concludes with a discussion of issues related to the generality of findings derived from single-case research and how generality can be established through replication and through analysis of behavioral mechanisms. PMID:23399668

  1. Using experimental design modules for process characterization in manufacturing/materials processes laboratories

    NASA Technical Reports Server (NTRS)

    Ankenman, Bruce; Ermer, Donald; Clum, James A.

    1994-01-01

    Modules dealing with statistical experimental design (SED), process modeling and improvement, and response surface methods have been developed and tested in two laboratory courses. One course was a manufacturing processes course in Mechanical Engineering and the other course was a materials processing course in Materials Science and Engineering. Each module is used as an 'experiment' in the course with the intent that subsequent course experiments will use SED methods for analysis and interpretation of data. Evaluation of the modules' effectiveness has been done by both survey questionnaires and inclusion of the module methodology in course examination questions. Results of the evaluation have been very positive. Those evaluation results and details of the modules' content and implementation are presented. The modules represent an important component for updating laboratory instruction and to provide training in quality for improved engineering practice.

  2. Flux control coefficients determined by inhibitor titration: the design and analysis of experiments to minimize errors.

    PubMed Central

    Small, J R

    1993-01-01

    This paper is a study into the effects of experimental error on the estimated values of flux control coefficients obtained using specific inhibitors. Two possible techniques for analysing the experimental data are compared: a simple extrapolation method (the so-called graph method) and a non-linear function fitting method. For these techniques, the sources of systematic errors are identified and the effects of systematic and random errors are quantified, using both statistical analysis and numerical computation. It is shown that the graph method is very sensitive to random errors and, under all conditions studied, that the fitting method, even under conditions where the assumptions underlying the fitted function do not hold, outperformed the graph method. Possible ways of designing experiments to minimize the effects of experimental errors are analysed and discussed. PMID:8257434
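
    A hedged sketch of the function-fitting approach is given below: fit flux versus inhibitor concentration to a smooth saturating function and take the fitted initial slope, from which a control coefficient is typically derived. Both the data points and the functional form are illustrative assumptions, not those used in the paper.

      # Non-linear fit of an inhibitor titration curve (illustrative data and model).
      import numpy as np
      from scipy.optimize import curve_fit

      inhibitor = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])    # arbitrary units
      flux = np.array([1.00, 0.93, 0.87, 0.78, 0.66, 0.55])   # relative flux

      def flux_model(i, j0, a, k):
          """Simple saturating decline: J(i) = j0 * (1 - a*i / (i + k))."""
          return j0 * (1.0 - a * i / (i + k))

      params, cov = curve_fit(flux_model, inhibitor, flux, p0=[1.0, 0.5, 2.0])
      j0, a, k = params
      initial_slope = -j0 * a / k      # dJ/di at i = 0 from the fitted model
      print("fitted parameters:", np.round(params, 3))
      print("initial slope dJ/di at i = 0:", round(initial_slope, 3))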

  3. Applying the methodology of Design of Experiments to stability studies: a Partial Least Squares approach for evaluation of drug stability.

    PubMed

    Jordan, Nika; Zakrajšek, Jure; Bohanec, Simona; Roškar, Robert; Grabnar, Iztok

    2018-05-01

    The aim of the present research is to show that the methodology of Design of Experiments can be applied to stability data evaluation, as they can be seen as multi-factor and multi-level experimental designs. Linear regression analysis is usually an approach for analyzing stability data, but multivariate statistical methods could also be used to assess drug stability during the development phase. Data from a stability study for a pharmaceutical product with hydrochlorothiazide (HCTZ) as an unstable drug substance was used as a case example in this paper. The design space of the stability study was modeled using Umetrics MODDE 10.1 software. We showed that a Partial Least Squares model could be used for a multi-dimensional presentation of all data generated in a stability study and for determination of the relationship among factors that influence drug stability. It might also be used for stability predictions and potentially for the optimization of the extent of stability testing needed to determine shelf life and storage conditions, which would be time and cost-effective for the pharmaceutical industry.
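
    A minimal sketch of a PLS fit on stability-study-like data is shown below, with storage time, temperature, and humidity as predictors and assay as the response. The design points and assay values are invented; the study's own factors and responses will differ.

      # PLS regression on hypothetical stability data (scikit-learn).
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      # time (months), temperature (C), relative humidity (%)
      X = np.array([
          [0, 25, 60], [3, 25, 60], [6, 25, 60], [12, 25, 60],
          [0, 40, 75], [3, 40, 75], [6, 40, 75],
      ], dtype=float)
      assay_pct = np.array([100.0, 99.2, 98.5, 97.1, 100.0, 97.3, 94.8])

      pls = PLSRegression(n_components=2)
      pls.fit(X, assay_pct)
      print("R^2 on training data:", round(pls.score(X, assay_pct), 3))
      print("predicted assay at 9 months / 40 C / 75% RH:",
            np.round(pls.predict([[9.0, 40.0, 75.0]]), 2))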

  4. Data Assimilation - Advances and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Brian J.

    2014-07-30

    This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
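
    As a small illustration of one of the update techniques mentioned, the sketch below performs a single ensemble Kalman filter analysis step for a directly observed scalar state. The ensemble size, prior spread, and observation are arbitrary choices for the example.

      # One ensemble Kalman filter (EnKF) update for a scalar state (illustrative).
      import numpy as np

      rng = np.random.default_rng(42)
      n_ens = 100
      prior = rng.normal(loc=2.0, scale=0.5, size=n_ens)   # forecast ensemble
      obs, obs_sd = 2.6, 0.3                               # observation and its error

      # Kalman gain from the ensemble variance (direct observation, H = 1).
      gain = prior.var(ddof=1) / (prior.var(ddof=1) + obs_sd**2)
      # Perturbed-observation update of each ensemble member.
      perturbed_obs = obs + rng.normal(0.0, obs_sd, size=n_ens)
      posterior = prior + gain * (perturbed_obs - prior)

      print("prior mean/sd:    ", round(prior.mean(), 3), round(prior.std(ddof=1), 3))
      print("posterior mean/sd:", round(posterior.mean(), 3), round(posterior.std(ddof=1), 3))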

  5. Agent-Based Simulation and Analysis of a Defensive UAV Swarm Against an Enemy UAV Swarm

    DTIC Science & Technology

    2011-06-01

    ...this defensive swarm system, an agent-based simulation model is developed, and appropriate designs of experiments and statistical analyses are... development and implementation of counter UAV technology from readily-available commercial products. The organization leverages the “largest...

  6. Case Studies for the Statistical Design of Experiments Applied to Powered Rotor Wind Tunnel Tests

    NASA Technical Reports Server (NTRS)

    Overmeyer, Austin D.; Tanner, Philip E.; Martin, Preston B.; Commo, Sean A.

    2015-01-01

    The application of statistical Design of Experiments (DOE) to helicopter wind tunnel testing was explored during two powered rotor wind tunnel entries during the summers of 2012 and 2013. These tests were performed jointly by the U.S. Army Aviation Development Directorate Joint Research Program Office and the NASA Rotary Wing Project Office, currently the Revolutionary Vertical Lift Project, at NASA Langley Research Center located in Hampton, Virginia. Both entries were conducted in the 14- by 22-Foot Subsonic Tunnel with a small portion of the overall tests devoted to developing case studies of the DOE approach as it applies to powered rotor testing. A 16-47 times reduction in the number of data points required was estimated by comparing the DOE approach to conventional testing methods. The average error of the DOE response surface model for the OH-58F test was 0.95 percent and 4.06 percent for drag and download, respectively. The DOE response surface model of the Active Flow Control test captured the drag within 4.1 percent of measured data. The operational differences between the two testing approaches are identified, but did not prevent the safe operation of the powered rotor model throughout the DOE test matrices.

  7. Data visualization, bar naked: A free tool for creating interactive graphics.

    PubMed

    Weissgerber, Tracey L; Savic, Marko; Winham, Stacey J; Stanisavljevic, Dejana; Garovic, Vesna D; Milic, Natasa M

    2017-12-15

    Although bar graphs are designed for categorical data, they are routinely used to present continuous data in studies that have small sample sizes. This presentation is problematic, as many data distributions can lead to the same bar graph, and the actual data may suggest different conclusions from the summary statistics. To address this problem, many journals have implemented new policies that require authors to show the data distribution. This paper introduces a free, web-based tool for creating an interactive alternative to the bar graph (http://statistika.mfub.bg.ac.rs/interactive-dotplot/). This tool allows authors with no programming expertise to create customized interactive graphics, including univariate scatterplots, box plots, and violin plots, for comparing values of a continuous variable across different study groups. Individual data points may be overlaid on the graphs. Additional features facilitate visualization of subgroups or clusters of non-independent data. A second tool enables authors to create interactive graphics from data obtained with repeated independent experiments (http://statistika.mfub.bg.ac.rs/interactive-repeated-experiments-dotplot/). These tools are designed to encourage exploration and critical evaluation of the data behind the summary statistics and may be valuable for promoting transparency, reproducibility, and open science in basic biomedical research. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  8. Study designs, use of statistical tests, and statistical analysis software choice in 2015: Results from two Pakistani monthly Medline indexed journals.

    PubMed

    Shaikh, Masood Ali

    2017-09-01

    Assessment of research articles in terms of study designs used, statistical tests applied and the use of statistical analysis programmes help determine research activity profile and trends in the country. In this descriptive study, all original articles published by Journal of Pakistan Medical Association (JPMA) and Journal of the College of Physicians and Surgeons Pakistan (JCPSP), in the year 2015 were reviewed in terms of study designs used, application of statistical tests, and the use of statistical analysis programmes. JPMA and JCPSP published 192 and 128 original articles, respectively, in the year 2015. Results of this study indicate that cross-sectional study design, bivariate inferential statistical analysis entailing comparison between two variables/groups, and use of statistical software programme SPSS to be the most common study design, inferential statistical analysis, and statistical analysis software programmes, respectively. These results echo previously published assessment of these two journals for the year 2014.

  9. A persuasive concept of research-oriented teaching in Soil Biochemistry

    NASA Astrophysics Data System (ADS)

    Blagodatskaya, Evgenia; Kuzyakova, Irina

    2013-04-01

    One of the main problems of existing bachelor programs is the disconnection of basic and experimental education: even during practical training, the methods learned are not related to the characterization of soil field experiments and observed soil processes. We introduce a multi-level research-oriented teaching system that involves Bachelor students in four semesters of active study by integrating basic knowledge, experimental techniques, statistical approaches, and project design and its realization. The novelty of the research-oriented teaching system is based 1) on the linkage of an ongoing experiment to the study of statistical methods and 2) on students' own responsibility for interpreting the soil chemical and biochemical characteristics obtained at the very beginning of their study by analysing a set of soil samples that allows full-factorial data treatment. This experimental data set is related to a specific soil stand and is used as the backbone of the teaching system, accelerating the students' interest in soil studies and motivating them to apply basic knowledge from lecture courses. The multi-level system includes: 1) a basic lecture course on soil biochemistry with analysis of research questions; 2) a practical training course on laboratory analytics, where small groups of students are responsible for the analysis of soil samples related to a specific land use, forest type, and forest age; 3) a training course on biotic (e.g. respiration) and abiotic (e.g. temperature, moisture, fire) interactions in the same soil samples; 4) theoretical seminars where students present and make a first attempt to explain the soil characteristics of various soil stands as affected by abiotic factors (first semester); 5) a lecture and seminar course on soil statistics where students apply newly learned statistical methods to test their conclusions and to find relationships between the soil characteristics obtained during the first semester; 6) a seminar course on project design where students develop their own scientific projects to study the uncertainties revealed in soil responses to abiotic factors (second and third semesters); 7) lecture, seminar and training courses on the estimation of active microbial biomass in soil, where students realize their projects by applying the new knowledge to the soils from the stands they are responsible for (fourth semester). Thus, during four semesters the students continuously combine theoretical knowledge from the lectures with their own experimental experience, compare and discuss the results of the various groups during seminars, and obtain skills in project design. The successful application of the research-oriented teaching system at the University of Göttingen allowed each student to reveal knowledge gaps at an early stage, accelerated their involvement in ongoing research projects, and motivated them to begin their own scientific careers.

  10. Direct atomic force microscopy observation of DNA tile crystal growth at the single-molecule level.

    PubMed

    Evans, Constantine G; Hariadi, Rizal F; Winfree, Erik

    2012-06-27

    While the theoretical implications of models of DNA tile self-assembly have been extensively researched and such models have been used to design DNA tile systems for use in experiments, there has been little research testing the fundamental assumptions of those models. In this paper, we use direct observation of individual tile attachments and detachments of two DNA tile systems on a mica surface imaged with an atomic force microscope (AFM) to compile statistics of tile attachments and detachments. We show that these statistics fit the widely used kinetic Tile Assembly Model and demonstrate AFM movies as a viable technique for directly investigating DNA tile systems during growth rather than after assembly.

  11. Soft x-ray speckle from rough surfaces

    NASA Astrophysics Data System (ADS)

    Porter, Matthew Stanton

    Dynamic light scattering has been of great use in determining diffusion times for polymer solutions. At the same time, polymer thin films are becoming of increasing importance, especially in the semiconductor industry where they are used as photoresists and interlevel dielectrics. As the dimensions of these devices decrease we will reach a point where lasers will no longer be able to probe the length scales of interest. Current laser wavelengths limit the size of observable diffusion lengths to 180-700 nm. This dissertation will discuss attempts at pushing dynamic light scattering experiments into the soft x-ray region so that we can examine fluctuations in polymer thin films on the molecular length scale. The dissertation explores the possibility of carrying out a dynamic light scattering experiment in the soft x-ray regime. A detailed account of how to meet the basic requirements for a coherent scattering experiment in the soft x-ray regime will be given. In addition, a complete description of the chamber design will be discussed. We used our custom designed scattering chamber to collect reproducible coherent soft x-ray scattering data from etched silicon wafers and from polystyrene coated silicon wafers. The data from the silicon wafers followed the statistics for a well-developed speckle pattern while the data from the polystyrene films exhibited Poisson statistics. We used the data from both the etched wafers and the polystyrene coated wafers to place a lower limit of ~20 Å on the RMS surface roughness of samples which will produce well-defined speckle patterns for the current detector setup. Future experiments which use the criteria set forth in this dissertation have the opportunity to be even more successful than this dissertation project.

  12. Plackett-Burman experimental design to facilitate syntactic foam development

    DOE PAGES

    Smith, Zachary D.; Keller, Jennie R.; Bello, Mollie; ...

    2015-09-14

    The use of an eight-experiment Plackett–Burman method can assess six experimental variables and eight responses in a polysiloxane-glass microsphere syntactic foam. The approach aims to decrease the time required to develop a tunable polymer composite by identifying a reduced set of variables and responses suitable for future predictive modeling. The statistical design assesses the main effects of mixing process parameters, polymer matrix composition, microsphere density and volume loading, and the blending of two grades of microspheres, using a dummy factor in statistical calculations. Responses cover rheological, physical, thermal, and mechanical properties. The cure accelerator content of the polymer matrix and the volume loading of the microspheres have the largest effects on foam properties. These factors are the most suitable for controlling the gel point of the curing foam, and the density of the cured foam. The mixing parameters introduce widespread variability and therefore should be fixed at effective levels during follow-up testing. Some responses may require greater contrast in microsphere-related factors. As a result, compared to other possible statistical approaches, the run economy of the Plackett–Burman method makes it a valuable tool for rapidly characterizing new foams.
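
    As an illustration of the kind of screening matrix used in such studies, the following minimal Python sketch constructs the standard 8-run Plackett–Burman design from its cyclic generator row and attaches six hypothetical factor labels plus a dummy column; the factor names are placeholders, not the study's actual variables.

        import numpy as np

        # Standard 8-run Plackett-Burman design: seven cyclic shifts of the
        # generator row (+ + + - + - -) followed by a row of all -1.
        generator = np.array([1, 1, 1, -1, 1, -1, -1])
        rows = [np.roll(generator, k) for k in range(7)]
        design = np.vstack(rows + [-np.ones(7, dtype=int)])

        # Hypothetical factor labels: six process/composition variables plus
        # one dummy column used to estimate error, as described in the abstract.
        factors = ["mix_speed", "mix_time", "accelerator", "sphere_density",
                   "sphere_loading", "sphere_blend", "dummy"]

        for run, levels in enumerate(design, start=1):
            settings = ", ".join(f"{name}={int(lvl):+d}" for name, lvl in zip(factors, levels))
            print(f"run {run}: {settings}")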

  13. Determination of kinetic parameters of 1,3-propanediol fermentation by Clostridium diolis using statistically optimized medium.

    PubMed

    Kaur, Guneet; Srivastava, Ashok K; Chand, Subhash

    2012-09-01

    1,3-propanediol (1,3-PD) is a chemical compound of immense importance, used primarily as a raw material for the fiber and textile industry. It can be produced by the fermentation of glycerol, which is abundantly available as a by-product of biodiesel plants. The present study aimed to determine the key kinetic parameters of 1,3-PD fermentation by Clostridium diolis. Initial experiments on microbial growth inhibition were followed by statistical optimization of the nutrient medium recipe. Batch kinetic data from bioreactor studies, run at the optimum variable concentrations obtained from the statistical medium design, were used to estimate the kinetic parameters of 1,3-PD production. Direct use of raw glycerol from a biodiesel plant without any pre-treatment for 1,3-PD production by this strain, investigated for the first time in this work, gave results comparable to those obtained with commercial glycerol. The parameter values obtained in this study will be used to develop a mathematical model for 1,3-PD production that can guide the design of various reactor operating strategies for further improving 1,3-PD production. An outline of the protocol for model development is also discussed.

  14. Illuminating Tradespace Decisions Using Efficient Experimental Space-Filling Designs for the Engineered Resilient System Architecture

    DTIC Science & Technology

    2015-06-30

    Fragments from the report (table of contents and abstract): 7. Building Statistical Metamodels using Simulation Experimental Designs; 7.1. Statistical Design ... system design drivers across several different domain models, our methodology uses statistical metamodeling to approximate the simulations' behavior. A ... output. We build metamodels using a number of statistical methods that include stepwise regression, boosted trees, neural nets, and bootstrap forest.

  15. Illuminating Tradespace Decisions Using Efficient Experimental Space-Filling Designs for the Engineered Resilient System Architecture

    DTIC Science & Technology

    2015-06-01

    Fragments from the report (table of contents and abstract): 7. Building Statistical Metamodels using Simulation Experimental Designs; 7.1. Statistical Design ... system design drivers across several different domain models, our methodology uses statistical metamodeling to approximate the simulations' behavior. A ... output. We build metamodels using a number of statistical methods that include stepwise regression, boosted trees, neural nets, and bootstrap forest.

  16. Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS)

    PubMed Central

    Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M.; Khan, Ajmal

    2016-01-01

    Objective: This article compares the study designs and statistical methods used in the 2005, 2010 and 2015 issues of the Pakistan Journal of Medical Sciences (PJMS). Methods: Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. Results: A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods was found in the analysis. The most frequent methods included descriptive statistics (n=315, 73.4%), chi-square/Fisher's exact tests (n=205, 47.8%) and Student's t-test (n=186, 43.4%). There was a significant increase over the time period in the use of several statistical methods: t-test, chi-square/Fisher's exact test, logistic regression, epidemiological statistics, and non-parametric tests. Conclusion: This study shows that a diverse variety of statistical methods has been used in the research articles of PJMS and that their frequency increased from 2005 to 2015. However, descriptive statistics was the most frequent method of statistical analysis in the published articles, while the cross-sectional design was the most common study design. PMID:27022365

  17. Understanding the significance variables for fabrication of fish gelatin nanoparticles by Plackett-Burman design

    NASA Astrophysics Data System (ADS)

    Subara, Deni; Jaswir, Irwandi; Alkhatib, Maan Fahmi Rashid; Noorbatcha, Ibrahim Ali

    2018-01-01

    The aim of this experiment was to screen and understand the effects of process variables on the fabrication of fish gelatin nanoparticles using a quality-by-design approach. The most influential process variables were screened using a Plackett-Burman design. Mean particle size, size distribution, and zeta potential were found to be about 240±9.76 nm, 0.3, and -9 mV, respectively. The statistical results showed that the concentration of acetone, the pH of the solution during the precipitation step and the volume of cross-linker had the most significant effects on the particle size of the fish gelatin nanoparticles. Time and chemical consumption were found to be lower than in previous research. This study revealed the potential of quality-by-design for understanding the effects of process variables on fish gelatin nanoparticle production.
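
    For readers unfamiliar with how such screening conclusions are reached, the sketch below shows the usual main-effect calculation for a two-level screening design (Plackett-Burman or factorial): each factor's effect is the mean response at its high level minus the mean at its low level. The coded matrix (a full two-level factorial in three hypothetical factors) and the particle-size responses are invented for illustration and are not the study's data.

        import numpy as np

        # Coded design matrix (+1/-1) for three hypothetical factors and
        # invented particle-size responses in nm (illustration only).
        X = np.array([[+1, +1, -1],
                      [+1, -1, +1],
                      [-1, +1, +1],
                      [-1, -1, -1],
                      [+1, +1, +1],
                      [-1, -1, +1],
                      [+1, -1, -1],
                      [-1, +1, -1]])
        y = np.array([265.0, 240.0, 231.0, 252.0, 238.0, 229.0, 258.0, 249.0])

        factors = ["acetone_conc", "pH", "crosslinker_vol"]
        for j, name in enumerate(factors):
            effect = y[X[:, j] == 1].mean() - y[X[:, j] == -1].mean()
            print(f"main effect of {name}: {effect:+.1f} nm")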

  18. Mapping remote and multidisciplinary learning barriers: lessons from challenge-based innovation at CERN

    NASA Astrophysics Data System (ADS)

    Jensen, Matilde Bisballe; Utriainen, Tuuli Maria; Steinert, Martin

    2018-01-01

    This paper presents the difficulties experienced by students participating in the multidisciplinary, remote-collaboration engineering design course challenge-based innovation at CERN, with the aim of identifying learning barriers and improving future learning experiences. We statistically analyse the rated differences between distinct design activities, educational backgrounds and remote vs. co-located collaboration. The analysis is based on a quantitative and qualitative questionnaire (N = 37) and found significant ranking differences between remote and co-located activities, which raises the question of whether the remote factor is a barrier to the originally intended learning goals. Furthermore, a correlation between the analytical and converging design phases was identified. Hence, it is suggested that future facilitators help students in the transition from one design phase to the next rather than only teaching methods for the individual design phases. Finally, we discuss how educators can address the identified learning barriers when designing future courses that include multidisciplinary or remote collaboration.

  19. Design-based and model-based inference in surveys of freshwater mollusks

    USGS Publications Warehouse

    Dorazio, R.M.

    1999-01-01

    Well-known concepts in statistical inference and sampling theory are used to develop recommendations for planning and analyzing the results of quantitative surveys of freshwater mollusks. Two methods of inference commonly used in survey sampling (design-based and model-based) are described and illustrated using examples relevant in surveys of freshwater mollusks. The particular objectives of a survey and the type of information observed in each unit of sampling can be used to help select the sampling design and the method of inference. For example, the mean density of a sparsely distributed population of mollusks can be estimated with higher precision by using model-based inference or by using design-based inference with adaptive cluster sampling than by using design-based inference with conventional sampling. More experience with quantitative surveys of natural assemblages of freshwater mollusks is needed to determine the actual benefits of different sampling designs and inferential procedures.

  20. Development and Validation of the Caring Loneliness Scale.

    PubMed

    Karhe, Liisa; Kaunonen, Marja; Koivisto, Anna-Maija

    2016-12-01

    The Caring Loneliness Scale (CARLOS) includes 5 categories derived from earlier qualitative research. This article assesses the reliability and construct validity of a scale designed to measure patient experiences of loneliness in a professional caring relationship. Statistical analysis with 4 different sample sizes included Cronbach's alpha and exploratory factor analysis with principal axis factoring extraction. The sample size of 250 gave the most useful and comprehensible structure, but all 4 samples yielded underlying content of loneliness experiences. The initial 5 categories were reduced to 4 factors with 24 items and Cronbach's alpha ranging from .77 to .90. The findings support the reliability and validity of CARLOS for the assessment of Finnish breast cancer and heart surgery patients' experiences but, as with all instruments, further validation is needed.
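
    As a minimal illustration of the reliability statistic reported here, the sketch below computes Cronbach's alpha from an item-by-respondent score matrix using the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The small response matrix is invented and does not come from the CARLOS data.

        import numpy as np

        def cronbach_alpha(scores: np.ndarray) -> float:
            """scores: 2-D array with rows = respondents and columns = scale items."""
            k = scores.shape[1]
            item_vars = scores.var(axis=0, ddof=1)       # variance of each item
            total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scale
            return k / (k - 1) * (1 - item_vars.sum() / total_var)

        # Invented 5-point Likert responses (6 respondents x 4 items).
        demo = np.array([[4, 5, 4, 4],
                         [2, 2, 3, 2],
                         [5, 4, 5, 5],
                         [3, 3, 3, 4],
                         [1, 2, 1, 2],
                         [4, 4, 5, 4]])
        print(f"Cronbach's alpha = {cronbach_alpha(demo):.2f}")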

  1. Latest results on νμ → ντ oscillations from the OPERA experiment

    NASA Astrophysics Data System (ADS)

    Komatsu, Masahiro; OPERA Collaboration

    2016-04-01

    The OPERA experiment is designed to prove neutrino oscillations in the νμ to ντ channel through the direct observation of the tau lepton in tau neutrino charged current interactions. The experiment has accumulated data for five years, from 2008 to 2012, with the CERN Neutrinos to Gran Sasso (CNGS), an almost pure νμ beam. In the last two years, a very large amount of the data accumulated in the nuclear emulsions has been analyzed. The latest results on oscillations with the increased statistics, which include a fourth tau neutrino candidate event, will be presented. Given the extremely low expected background, this result corresponds to the observation of the oscillation process with a four sigma level significance.

  2. Detection and rate discrimination of amplitude modulation in electrical hearing.

    PubMed

    Chatterjee, Monita; Oberzut, Cherish

    2011-09-01

    Three experiments were designed to examine temporal envelope processing by cochlear implant (CI) listeners. In experiment 1, the hypothesis that listeners' modulation sensitivity would in part determine their ability to discriminate between temporal modulation rates was examined. Temporal modulation transfer functions (TMTFs) obtained in an amplitude modulation detection (AMD) task were compared to threshold functions obtained in an amplitude modulation rate discrimination (AMRD) task. Statistically significant nonlinear correlations were observed between the two measures. In experiment 2, results of loudness-balancing showed small increases in the loudness of modulated over unmodulated stimuli beyond a modulation depth of 16%. Results of experiment 3 indicated small but statistically significant effects of level-roving on the overall gain of the TMTF, but no impact of level-roving on the average shape of the TMTF across subjects. This suggested that level-roving simply increased the task difficulty for most listeners, but did not indicate increased use of intensity cues under more challenging conditions. Data obtained with one subject, however, suggested that the most sensitive listeners may derive some benefit from intensity cues in these tasks. Overall, results indicated that intensity cues did not play an important role in temporal envelope processing by the average CI listener. © 2011 Acoustical Society of America

  3. Choice Experiments to Quantify Preferences for Health and Healthcare: State of the Practice.

    PubMed

    Mühlbacher, Axel; Johnson, F Reed

    2016-06-01

    Stated-preference methods increasingly are used to quantify preferences in health economics, health technology assessment, benefit-risk analysis and health services research. The objective of stated-preference studies is to acquire information about trade-off preferences among treatment outcomes, prioritization of clinical decision criteria, likely uptake or adherence to healthcare products and acceptability of healthcare services or policies. A widely accepted approach to eliciting preferences is discrete-choice experiments. Patient, physician, insurant or general-public respondents choose among constructed, experimentally controlled alternatives described by decision-relevant features or attributes. Attributes can represent complete health states, sets of treatment outcomes or characteristics of a healthcare system. The observed pattern of choice reveals how different respondents or groups of respondents implicitly weigh, value and assess different characteristics of treatments, products or services. An important advantage of choice experiments is their foundation in microeconomic utility theory. This conceptual framework provides tests of internal validity, guidance for statistical analysis of latent preference structures, and testable behavioural hypotheses. Choice experiments require expertise in survey-research methods, random-utility theory, experimental design and advanced statistical analysis. This paper should be understood as an introduction to setting up a basic experiment rather than an exhaustive critique of the latest findings and procedures. Where appropriate, we have identified topics of active research where a broad consensus has not yet been established.
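
    To make the random-utility foundation concrete, the following sketch computes conditional-logit choice probabilities for a single hypothetical choice set: each alternative's utility is a weighted sum of its attributes, and the probability of choosing it is the softmax of those utilities. The attributes, coefficients and labels are invented for illustration and are not estimates from any study.

        import numpy as np

        # Hypothetical attributes of three alternatives:
        # columns = (efficacy gain, side-effect risk, out-of-pocket cost).
        alternatives = np.array([[0.30, 0.05, 120.0],
                                 [0.20, 0.02,  60.0],
                                 [0.00, 0.00,   0.0]])   # opt-out / no treatment

        # Assumed part-worth coefficients (marginal utilities per attribute unit).
        beta = np.array([8.0, -25.0, -0.01])

        utilities = alternatives @ beta
        probabilities = np.exp(utilities - utilities.max())   # stabilised softmax
        probabilities /= probabilities.sum()

        for label, p in zip(["treatment A", "treatment B", "opt-out"], probabilities):
            print(f"P(choose {label}) = {p:.2f}")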

  4. Concurrent Movement Impairs Incidental but Not Intentional Statistical Learning

    ERIC Educational Resources Information Center

    Stevens, David J.; Arciuli, Joanne; Anderson, David I.

    2015-01-01

    The effect of concurrent movement on incidental versus intentional statistical learning was examined in two experiments. In Experiment 1, participants learned the statistical regularities embedded within familiarization stimuli implicitly, whereas in Experiment 2 they were made aware of the embedded regularities and were instructed explicitly to…

  5. The Relationship Between Levels of Fidelity in Simulation, Traditional Clinical Experiences and Objectives.

    PubMed

    Gore, Teresa

    2017-06-15

    The purpose of this study was to use the Clinical Learning Environments Comparison Survey to explore baccalaureate nursing (BSN) students' perceived learning effectiveness across different levels of simulation fidelity and traditional clinical experiences. A convenience sample of 103 first-semester BSN students enrolled in a fundamentals/assessment clinical course and 155 fifth-semester BSN students enrolled in a leadership clinical course participated in this study. A descriptive correlational design was used for this cross-sectional study to evaluate students' perceptions after a simulation experience and the completion of the traditional clinical experiences. The subscales measured were communication, nursing leadership, and the teaching-learning dyad. No statistical differences were noted based on the learning objectives. The communication subscale showed a tendency toward preference for traditional clinical experiences in meeting students' perceived learning needs for communication. For students' perceived learning effectiveness, faculty should determine the appropriate level of simulation fidelity based on the learning objectives.

  6. The ICF has made a difference to functioning and disability measurement and statistics.

    PubMed

    Madden, Rosamond H; Bundy, Anita

    2018-02-12

    Fifteen years after the publication of the International Classification of Functioning, Disability and Health (ICF), we investigated: How ICF applications align with ICF aims, contents and principles, and how the ICF has been used to improve measurement of functioning and related statistics. In a scoping review, we investigated research published 2001-2015 relating to measurement and statistics for evidence of: a change in thinking; alignment of applications with ICF specifications and philosophy; and the emergence of new knowledge. The ICF is used in diverse applications, settings and countries, with processes largely aligned with the ICF and intended to improve measurement and statistics: new national surveys, information systems and ICF-based instruments; and international efforts to improve disability data. Knowledge is growing about the components and interactions of the ICF model, the diverse effects of the environment on functioning, and the meaning and measurement of participation. The ICF provides specificity and a common language in the complex world of functioning and disability and is stimulating new thinking, new applications in measurement and statistics, and the assembling of new knowledge. Nevertheless, the field needs to mature. Identified gaps suggest ways to improve measurement and statistics to underpin policies, services and outcomes. Implications for Rehabilitation The ICF offers a conceptualization of functioning and disability that can underpin assessment and documentation in rehabilitation, with a growing body of experience to draw on for guidance. Experience with the ICF reminds practitioners to consider all the domains of participation, the effect of the environment on participation and the importance of involving clients/patients in assessment and service planning. Understanding the variability of functioning within everyday environments and designing interventions for removing barriers in various environments is a vital part of rehabilitation planning.

  7. Experiments and Model for Serration Statistics in Low-Entropy, Medium-Entropy, and High-Entropy Alloys

    DOE PAGES

    Carroll, Robert; Lee, Chi; Tsai, Che-Wei; ...

    2015-11-23

    In this study, high-entropy alloys (HEAs) are new alloys that contain five or more elements in roughly-equal proportion. We present new experiments and theory on the deformation behavior of HEAs under slow stretching (straining), and observe differences, compared to conventional alloys with fewer elements. For a specific range of temperatures and strain-rates, HEAs deform in a jerky way, with sudden slips that make it difficult to precisely control the deformation. An analytic model explains these slips as avalanches of slipping weak spots and predicts the observed slip statistics, stress-strain curves, and their dependence on temperature, strain-rate, and material composition. The ratio of the weak spots' healing rate to the strain-rate is the main tuning parameter, reminiscent of the Portevin-Le Chatelier effect and time-temperature superposition in polymers. Our model predictions agree with the experimental results. The proposed widely-applicable deformation mechanism is useful for deformation control and alloy design.

  8. US Geological Survey nutrient preservation experiment : experimental design, statistical analysis, and interpretation of analytical results

    USGS Publications Warehouse

    Patton, Charles J.; Gilroy, Edward J.

    1999-01-01

    Data on which this report is based, including nutrient concentrations in synthetic reference samples determined concurrently with those in real samples, are extensive (greater than 20,000 determinations) and have been published separately. In addition to confirming the well-documented instability of nitrite in acidified samples, this study also demonstrates that when biota are removed from samples at collection sites by 0.45-micrometer membrane filtration, subsequent preservation with sulfuric acid or mercury (II) provides no statistically significant improvement in nutrient concentration stability during storage at 4 degrees Celsius for 30 days. Biocide preservation had no statistically significant effect on the 30-day stability of phosphorus concentrations in whole-water splits from any of the 15 stations, but did stabilize Kjeldahl nitrogen concentrations in whole-water splits from three data-collection stations where ammonium accounted for at least half of the measured Kjeldahl nitrogen.

  9. Does Reviewing Lead to Better Learning and Decision Making? Answers from a Randomized Stock Market Experiment

    PubMed Central

    Wessa, Patrick; Holliday, Ian E.

    2012-01-01

    Background The literature is not univocal about the effects of Peer Review (PR) within the context of constructivist learning. Due to the predominant focus on using PR as an assessment tool, rather than a constructivist learning activity, and because most studies implicitly assume that the benefits of PR are limited to the reviewee, little is known about the effects upon students who are required to review their peers. Much of the theoretical debate in the literature is focused on explaining how and why constructivist learning is beneficial. At the same time these discussions are marked by an underlying presupposition of a causal relationship between reviewing and deep learning. Objectives The purpose of the study is to investigate whether the writing of PR feedback causes students to benefit in terms of: perceived utility about statistics, actual use of statistics, better understanding of statistical concepts and associated methods, changed attitudes towards market risks, and outcomes of decisions that were made. Methods We conducted a randomized experiment, assigning students randomly to receive PR or non–PR treatments and used two cohorts with a different time span. The paper discusses the experimental design and all the software components that we used to support the learning process: Reproducible Computing technology which allows students to reproduce or re–use statistical results from peers, Collaborative PR, and an AI–enhanced Stock Market Engine. Results The results establish that the writing of PR feedback messages causes students to experience benefits in terms of Behavior, Non–Rote Learning, and Attitudes, provided the sequence of PR activities is maintained for a period that is sufficiently long. PMID:22666385

  10. A Survey of Statistical Capstone Projects

    ERIC Educational Resources Information Center

    Martonosi, Susan E.; Williams, Talithia D.

    2016-01-01

    In this article, we highlight the advantages of incorporating a statistical capstone experience in the undergraduate curriculum, where students perform an in-depth analysis of real-world data. Capstone experiences develop statistical thinking by allowing students to engage in a consulting-like experience that requires skills outside the scope of…

  11. [The modeling of the ricochet shot fired from a light weapon].

    PubMed

    Gusentsov, A O; Chuchko, V A; Kil'dyushev, E M; Tumanov, E V

    The objective of the present study was to choose the optimal method for modeling the glancing rebound (ricochet) of a bullet after hitting a target under laboratory conditions. The study required the design and construction of an original device for modeling the rebound effect of a shot fired from a light firearm under experimental conditions. The device was tested in a laboratory experiment. The trials demonstrated the possibility of using barriers of different weights and dimensions in the device, of positioning and fixing them depending on the purpose of the experiment, and of dynamically altering the experimental conditions, with due regard for safety and security arrangements to protect the health and life of the experimenters and without compromising the statistical significance and scientific validity of the results.

  12. Do information, price, or morals influence ethical consumption? A natural field experiment and customer survey on the purchase of Fair Trade coffee.

    PubMed

    Andorfer, Veronika A; Liebe, Ulf

    2015-07-01

    We address ethical consumption using a natural field experiment on the actual purchase of Fair Trade (FT) coffee in three supermarkets in Germany. Based on a quasi-experimental before-and-after design, the effects of three different treatments - information, 20% price reduction, and a moral appeal - are analyzed. Sales data cover actual ethical purchase behavior and avoid problems of social desirability, but they offer only limited insights into the motivations of individual consumers. We therefore complemented the field experiment with a customer survey that allows us to contrast observed (ethical) buying behavior with self-reported FT consumption. Results from the experiment suggest that only the price reduction had the expected positive and statistically significant effect on FT consumption. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Pareto fronts for multiobjective optimization design on materials data

    NASA Astrophysics Data System (ADS)

    Gopakumar, Abhijith; Balachandran, Prasanna; Gubernatis, James E.; Lookman, Turab

    Optimizing multiple properties simultaneously is vital in materials design. Here we apply information-driven statistical optimization strategies, blended with machine learning methods, to address multi-objective optimization tasks on materials data. These strategies aim to find the Pareto front, consisting of non-dominated data points, from a set of candidate compounds with known characteristics. The objective is to find the Pareto front in as few additional measurements or calculations as possible. We show how exploration of the data space to find the front is achieved by using uncertainties in predictions from regression models. We test our proposed design strategies on multiple, independent data sets, including those from computations as well as experiments. These include data sets for MAX phases, piezoelectrics and multicomponent alloys.
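
    As a concrete example of the non-dominated set referred to here, the sketch below extracts the Pareto front from a small set of candidate compounds, assuming both properties are to be maximised; the property values are invented placeholders.

        import numpy as np

        def pareto_front(points: np.ndarray) -> np.ndarray:
            """Boolean mask of non-dominated rows, assuming all columns are maximised."""
            n = points.shape[0]
            keep = np.ones(n, dtype=bool)
            for i in range(n):
                # Point i is dominated if some other point is >= in every objective
                # and strictly > in at least one.
                dominates_i = np.all(points >= points[i], axis=1) & np.any(points > points[i], axis=1)
                if dominates_i.any():
                    keep[i] = False
            return keep

        # Invented two-objective data for five candidate compounds.
        props = np.array([[1.2, 0.8], [0.9, 1.5], [1.1, 1.1], [0.7, 0.7], [1.3, 0.6]])
        mask = pareto_front(props)
        print("non-dominated candidates:", np.where(mask)[0])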

  14. Simulation methods to estimate design power: an overview for applied research.

    PubMed

    Arnold, Benjamin F; Hogan, Daniel R; Colford, John M; Hubbard, Alan E

    2011-06-20

    Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. We illustrate the method using two examples (one simple, one complex) based on sanitation and nutritional interventions to improve child growth. We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research.
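
    A minimal Python sketch of the simulation approach described here (the authors provide R and Stata code; this is an independent illustration, not theirs): repeatedly simulate a two-arm individually randomized trial under an assumed effect size, analyse each simulated data set with a t-test, and report the fraction of significant results as the estimated power. The sample size, effect size and alpha below are arbitrary illustration values.

        import numpy as np
        from scipy import stats

        def simulated_power(n_per_arm=50, effect=0.5, sd=1.0, alpha=0.05,
                            n_sims=2000, seed=1):
            """Monte Carlo power for a two-arm comparison analysed with Welch's t-test."""
            rng = np.random.default_rng(seed)
            hits = 0
            for _ in range(n_sims):
                control = rng.normal(0.0, sd, n_per_arm)
                treated = rng.normal(effect, sd, n_per_arm)
                _, p = stats.ttest_ind(treated, control, equal_var=False)
                hits += p < alpha
            return hits / n_sims

        print(f"estimated power: {simulated_power():.2f}")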

  15. Simulation methods to estimate design power: an overview for applied research

    PubMed Central

    2011-01-01

    Background Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. Methods We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. We illustrate the method using two examples (one simple, one complex) based on sanitation and nutritional interventions to improve child growth. Results We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Conclusions Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research. PMID:21689447

  16. Study of the effect of cloud inhomogeneity on the earth radiation budget experiment

    NASA Technical Reports Server (NTRS)

    Smith, Phillip J.

    1988-01-01

    The Earth Radiation Budget Experiment (ERBE) is the most recent and probably the most intensive mission designed to gather precise measurements of the Earth's radiation components. The data obtained from ERBE are of great importance for future climatological studies. A statistical study reveals that the ERBE scanner data are highly correlated and that instantaneous measurements corresponding to neighboring pixels contain almost the same information. It is therefore suggested that only a fraction of the data set be analyzed when sampling, and applications of this strategy are given for the calculation of the Earth's albedo and of the cloud forcing over the ocean.

  17. A revised design for microarray experiments to account for experimental noise and uncertainty of probe response.

    PubMed

    Pozhitkov, Alex E; Noble, Peter A; Bryk, Jarosław; Tautz, Diethard

    2014-01-01

    Although microarrays are analysis tools in biomedical research, they are known to yield noisy output that usually requires experimental confirmation. To tackle this problem, many studies have developed rules for optimizing probe design and devised complex statistical tools to analyze the output. However, less emphasis has been placed on systematically identifying the noise component as part of the experimental procedure. One source of noise is the variance in probe binding, which can be assessed by replicating array probes. The second source is poor probe performance, which can be assessed by calibrating the array based on a dilution series of target molecules. Using model experiments for copy number variation and gene expression measurements, we investigate here a revised design for microarray experiments that addresses both of these sources of variance. Two custom arrays were used to evaluate the revised design: one based on 25 mer probes from an Affymetrix design and the other based on 60 mer probes from an Agilent design. To assess experimental variance in probe binding, all probes were replicated ten times. To assess probe performance, the probes were calibrated using a dilution series of target molecules and the signal response was fitted to an adsorption model. We found that significant variance of the signal could be controlled by averaging across probes and removing probes that are nonresponsive or poorly responsive in the calibration experiment. Taking this into account, one can obtain a more reliable signal with the added option of obtaining absolute rather than relative measurements. The assessment of technical variance within the experiments, combined with the calibration of probes, makes it possible to remove poorly responding probes and yields more reliable signals for the remaining ones. Once an array is properly calibrated, absolute quantification of signals becomes straightforward, alleviating the need for normalization and reference hybridizations.

  18. The Box Task: A tool to design experiments for assessing visuospatial working memory.

    PubMed

    Kessels, Roy P C; Postma, Albert

    2017-09-15

    The present paper describes the Box Task, a paradigm for the computerized assessment of visuospatial working memory. In this task, hidden objects have to be searched by opening closed boxes that are shown at different locations on the computer screen. The set size (i.e., number of boxes that must be searched) can be varied and different error scores can be computed that measure specific working memory processes (i.e., the number of within-search and between-search errors). The Box Task also has a developer's mode in which new stimulus displays can be designed for use in tailored experiments. The Box Task comes with a standard set of stimulus displays (including practice trials, as well as stimulus displays with 4, 6, and 8 boxes). The raw data can be analyzed easily and the results of individual participants can be aggregated into one spreadsheet for further statistical analyses.

  19. A Developed Meta-model for Selection of Cotton Fabrics Using Design of Experiments and TOPSIS Method

    NASA Astrophysics Data System (ADS)

    Chakraborty, Shankar; Chatterjee, Prasenjit

    2017-12-01

    Selection of cotton fabrics for providing optimal clothing comfort is often treated as a multi-criteria decision-making problem in which an array of candidate alternatives must be evaluated on several conflicting properties. In this paper, design of experiments and the technique for order preference by similarity to ideal solution (TOPSIS) are integrated to develop regression meta-models for identifying the most suitable cotton fabrics with respect to the computed TOPSIS scores. The applicability of the adopted method is demonstrated using two real-life examples. The developed models can also identify the statistically significant fabric properties and their interactions affecting the computed TOPSIS scores and the final selection decisions. There is a good degree of congruence between the rankings derived using these meta-models and those from existing methods for cotton fabric ranking and selection.
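
    For readers unfamiliar with the TOPSIS score used as the response variable here, the sketch below implements the standard steps (vector normalisation, weighting, distances to the ideal and anti-ideal solutions, and the closeness coefficient) on an invented decision matrix of fabric alternatives; the weights and benefit/cost labels are placeholders, not values from the paper.

        import numpy as np

        def topsis_scores(matrix, weights, benefit):
            """matrix: alternatives x criteria; benefit[j] is True if larger is better."""
            norm = matrix / np.sqrt((matrix ** 2).sum(axis=0))   # vector normalisation
            weighted = norm * weights
            ideal = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
            anti = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))
            d_ideal = np.sqrt(((weighted - ideal) ** 2).sum(axis=1))
            d_anti = np.sqrt(((weighted - anti) ** 2).sum(axis=1))
            return d_anti / (d_ideal + d_anti)   # closeness coefficient in [0, 1]

        # Invented fabric data: columns = (air permeability, thermal resistance, cost).
        fabrics = np.array([[120.0, 0.030, 4.0],
                            [ 95.0, 0.045, 3.2],
                            [140.0, 0.025, 5.1]])
        weights = np.array([0.4, 0.4, 0.2])
        benefit = np.array([True, True, False])   # cost is a cost-type criterion

        scores = topsis_scores(fabrics, weights, benefit)
        print("TOPSIS scores:", np.round(scores, 3), "best alternative:", int(scores.argmax()))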

  20. A designed experiment in stitched/RTM composites

    NASA Technical Reports Server (NTRS)

    Dickinson, Larry C.

    1993-01-01

    The damage tolerance of composite laminates can be significantly improved by the addition of through-the-thickness fibrous reinforcement such as stitching. However, there are numerous stitching parameters which can be independently varied, and their separate and combined effects on mechanical properties need to be determined. A statistically designed experiment (a 2^(5-1) fractional factorial, also known as a Taguchi L16 test matrix) used to evaluate five important parameters is described. The effects and interactions of stitch thread material, stitch thread strength, stitch row spacing and stitch pitch are examined for both thick (48 ply) and thin (16 ply) carbon/epoxy (AS4/E905L) composites. Tension, compression and compression after impact tests are described. Preliminary results of completed tension testing are discussed. Larger threads decreased tensile strength. Panel thickness was found not to be an important stitching parameter for tensile properties. Tensile modulus was unaffected by stitching.
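
    The half-fraction layout mentioned above can be generated in a few lines: write out the full two-level factorial for four base factors and define the fifth column by a generator, here assumed to be E = ABCD (a common resolution-V choice; the report does not state which generator was used). The factor names follow the stitching parameters listed in the abstract, but the mapping to columns is only illustrative.

        import numpy as np
        from itertools import product

        # Full 2^4 design in coded (-1/+1) units for four base factors.
        base = np.array(list(product([-1, 1], repeat=4)))

        # Fifth factor from the assumed generator E = ABCD, giving a 16-run 2^(5-1) fraction.
        fifth = base.prod(axis=1, keepdims=True)
        design = np.hstack([base, fifth])

        factors = ["thread_material", "thread_strength", "row_spacing",
                   "stitch_pitch", "panel_thickness"]
        print("run   " + "  ".join(f"{name:>15}" for name in factors))
        for i, row in enumerate(design, start=1):
            print(f"{i:3d}   " + "  ".join(f"{int(v):>15d}" for v in row))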

  1. ``Learning to Research'' in a Virtual Learning Environment: A Case Study on the Effectiveness of a Socio-constructivist Learning Design

    NASA Astrophysics Data System (ADS)

    López-Alonso, C.; Fernández-Pampillón, A.; de-Miguel, E.; Pita, G.

    Learning is the basis for research and lifelong training. The implementation of virtual environments for developing this competency requires the use of effective learning models. In this study we present an experiment in positive learning from the virtual campus of the Complutense University of Madrid (UCM). In order to carry it out we have used E-Ling, an e-learning environment that has been developed with an innovative didactic design based on a socio-constructivist learning approach. E-Ling has been used since 2006 to train future teachers and researchers in “learning to research”. Some of the results of this experiment have been statistically analysed in order to compare them with other learning models. From the obtained results we have concluded that E-Ling is a more productive proposal for developing competences in learning to research.

  2. International Educational Interactions and Students' Critical Consciousness: A Pilot Study.

    PubMed

    Aldrich, Rebecca M; Grajo, Lenin C

    Online technologies facilitate connections between students around the world, but their impact on occupational science and occupational therapy students' critical consciousness about culture is underexplored. In this article we present research on five groups of occupational science and occupational therapy students across two cohorts at one Midwestern university. We used a pretest-posttest group design and the Multicultural Experiences Questionnaire to investigate the potential influence of students' exposure to international educational interactions on their multicultural experiences and desires. Of 157 students surveyed, those who experienced the greatest number of international educational interactions demonstrated statistically significant increases in their desire to become acquainted with other people of different backgrounds and to explore their own prejudices and biases. Given the transformative potential of international educational interactions, future research must assess the ways in which such interactions affect critical cultural consciousness apart from other educational content and design. Copyright © 2017 by the American Occupational Therapy Association, Inc.

  3. Relating design and environmental variables to reliability

    NASA Astrophysics Data System (ADS)

    Kolarik, William J.; Landers, Thomas L.

    The combination of space application and nuclear power source demands high-reliability hardware. The possibilities of failure, either an inability to provide power or a catastrophic accident, must be minimized. Nuclear power experiences on the ground have led to highly sophisticated probabilistic risk assessment procedures, most of which require quantitative information to adequately assess such risks. In the area of hardware risk analysis, reliability information plays a key role. One of the lessons learned from the Three Mile Island experience is that thorough analyses of critical components are essential. Nuclear grade equipment shows some reliability advantages over commercial equipment. However, no statistically significant difference has been found. A recent study pertaining to spacecraft electronics reliability examined some 2500 malfunctions on more than 300 aircraft. The study classified the equipment failures into seven general categories. Design deficiencies and lack of environmental protection accounted for about half of all failures. Within each class, limited reliability modeling was performed using a Weibull failure model.

  4. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    PubMed

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses such as linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for the most up-to-date information and newly added models.

  5. Modelling short time series in metabolomics: a functional data analysis approach.

    PubMed

    Montana, Giovanni; Berk, Maurice; Ebbels, Tim

    2011-01-01

    Metabolomics is the study of the complement of small molecule metabolites in cells, biofluids and tissues. Many metabolomic experiments are designed to compare changes observed over time under two or more experimental conditions (e.g. a control and drug-treated group), thus producing time course data. Models from traditional time series analysis are often unsuitable because, by design, only very few time points are available and there are a high number of missing values. We propose a functional data analysis approach for modelling short time series arising in metabolomic studies which overcomes these obstacles. Our model assumes that each observed time series is a smooth random curve, and we propose a statistical approach for inferring this curve from repeated measurements taken on the experimental units. A test statistic for detecting differences between temporal profiles associated with two experimental conditions is then presented. The methodology has been applied to NMR spectroscopy data collected in a pre-clinical toxicology study.

  6. Nurse-Administered Hand Massage: Integration Into an Infusion Suite's Standard of Care.

    PubMed

    Braithwaite, Caitlin M; Ringdahl, Deborah

    2017-08-01

    Nurse-delivered hand massage is a safe and effective intervention that has potential for positively affecting nursing and patient outcomes. Nurses in a National Cancer Institute-designated academic health center outpatient chemotherapy infusion suite were taught how to administer a hand massage to strengthen the nurse-patient relationship and improve patient experience, comfort, satisfaction, stress, and anxiety. A pre-/postimplementation group comparison design was used. Patients in both groups completed self-reported measures of stress, comfort, satisfaction, and anxiety. Nurses completed Likert-type scales pre- and postimplementation on the perceived benefits of hand massage to the patient and nursing practice, impact on patient anxiety, and preparation in providing a hand massage. A positive trend was seen in all indicators. Patients who received a hand massage had a statistically significant improvement in comfort (p = 0.025) compared to those who did not. A statistically significant improvement was seen in all nurse indicators pre- to postimplementation.

  7. How to hit HIV where it hurts

    NASA Astrophysics Data System (ADS)

    Chakraborty, Arup

    No medical procedure has saved more lives than vaccination. But, today, some pathogens have evolved which have defied successful vaccination using the empirical paradigms pioneered by Pasteur and Jenner. One characteristic of many pathogens for which successful vaccines do not exist is that they present themselves in various guises. HIV is an extreme example because of its high mutability. This highly mutable virus can evade natural or vaccine induced immune responses, often by mutating at multiple sites linked by compensatory interactions. I will describe first how by bringing to bear ideas from statistical physics (e.g., maximum entropy models, Hopfield models, Feynman variational theory) together with in vitro experiments and clinical data, the fitness landscape of HIV is beginning to be defined with explicit account for collective mutational pathways. I will describe how this knowledge can be harnessed for vaccine design. Finally, I will describe how ideas at the intersection of evolutionary biology, immunology, and statistical physics can help guide the design of strategies that may be able to induce broadly neutralizing antibodies.

  8. Downscaling of global climate change estimates to regional scales: An application to Iberian rainfall in wintertime

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    von Storch, H.; Zorita, E.; Cubasch, U.

    A statistical strategy to deduce regional-scale features from climate general circulation model (GCM) simulations has been designed and tested. The main idea is to interrelate the characteristic patterns of observed simultaneous variations of regional climate parameters and of large-scale atmospheric flow using the canonical correlation technique. The large-scale North Atlantic sea level pressure (SLP) is related to the regional variable, the winter (DJF) mean Iberian Peninsula rainfall. The skill of the resulting statistical model is shown by reproducing, to a good approximation, the winter mean Iberian rainfall from 1900 to the present from the observed North Atlantic mean SLP distributions. It is shown that this observed relationship between these two variables is not well reproduced in the output of a general circulation model (GCM). The implications for Iberian rainfall changes as the response to increasing atmospheric greenhouse-gas concentrations simulated by two GCM experiments are examined with the proposed statistical model. In an instantaneous "2 x CO2" doubling experiment, using the simulated change of the mean North Atlantic SLP field to predict Iberian rainfall yields an insignificant increase of area-averaged rainfall of 1 mm/month, with maximum values of 4 mm/month in the northwest of the peninsula. In contrast, for the four GCM grid points representing the Iberian Peninsula, the change is -10 mm/month, with a minimum of -19 mm/month in the southwest. In the second experiment, with the IPCC scenario A ("business as usual") increase of CO2, the statistical-model results partially differ from the directly simulated rainfall changes: in the experimental range of 100 years, the area-averaged rainfall decreases by 7 mm/month (statistical model) and by 9 mm/month (GCM); at the same time the amplitude of the interdecadal variability is quite different. 17 refs., 10 figs.
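
    A minimal sketch of the statistical idea described here, relating a large-scale field to a regional variable through canonical correlation, using scikit-learn's CCA on randomly generated stand-ins for gridded SLP anomalies and station rainfall; in a real application the inputs would be observed anomaly fields, usually truncated by a prior EOF/PCA step.

        import numpy as np
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(0)
        n_winters = 90

        # Synthetic stand-ins: X ~ winter-mean SLP anomalies at 40 grid points,
        # Y ~ winter-mean rainfall anomalies at 12 stations, sharing a common signal.
        common = rng.normal(size=(n_winters, 2))
        X = common @ rng.normal(size=(2, 40)) + 0.5 * rng.normal(size=(n_winters, 40))
        Y = common @ rng.normal(size=(2, 12)) + 0.5 * rng.normal(size=(n_winters, 12))

        cca = CCA(n_components=2)
        cca.fit(X, Y)
        X_scores, Y_scores = cca.transform(X, Y)

        # Correlation of each pair of canonical score series.
        for k in range(2):
            r = np.corrcoef(X_scores[:, k], Y_scores[:, k])[0, 1]
            print(f"canonical pair {k + 1}: correlation = {r:.2f}")

        # The fitted relationship can then map a (simulated) SLP change onto an
        # estimated regional rainfall change via cca.predict().
        Y_hat = cca.predict(X)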

  9. Global Sensitivity Analysis of Environmental Systems via Multiple Indices based on Statistical Moments of Model Outputs

    NASA Astrophysics Data System (ADS)

    Guadagnini, A.; Riva, M.; Dell'Oca, A.

    2017-12-01

    We propose to ground sensitivity of uncertain parameters of environmental models on a set of indices based on the main (statistical) moments, i.e., mean, variance, skewness and kurtosis, of the probability density function (pdf) of a target model output. This enables us to perform Global Sensitivity Analysis (GSA) of a model in terms of multiple statistical moments and yields a quantification of the impact of model parameters on features driving the shape of the pdf of model output. Our GSA approach includes the possibility of being coupled with the construction of a reduced complexity model that allows approximating the full model response at a reduced computational cost. We demonstrate our approach through a variety of test cases. These include a commonly used analytical benchmark, a simplified model representing pumping in a coastal aquifer, a laboratory-scale tracer experiment, and the migration of fracturing fluid through a naturally fractured reservoir (source) to reach an overlying formation (target). Our strategy allows discriminating the relative importance of model parameters to the four statistical moments considered. We also provide an appraisal of the error associated with the evaluation of our sensitivity metrics by replacing the original system model through the selected surrogate model. Our results suggest that one might need to construct a surrogate model with increasing level of accuracy depending on the statistical moment considered in the GSA. The methodological framework we propose can assist the development of analysis techniques targeted to model calibration, design of experiment, uncertainty quantification and risk assessment.
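
    A crude Monte Carlo illustration of the moment-based sensitivity idea (a simplified stand-in, not the authors' exact indices): sample the uncertain parameters, run the model, and for each parameter compare conditional moments, obtained by binning the output on that parameter, against the unconditional mean and variance. The toy model and sampler below are invented.

        import numpy as np

        def moment_based_sensitivity(sample_params, model, n=20000, bins=20, seed=0):
            """Return, per parameter, normalised average shifts of the conditional
            mean and variance relative to their unconditional values."""
            rng = np.random.default_rng(seed)
            X = sample_params(rng, n)            # shape (n, n_params)
            y = model(X)
            mu, var = y.mean(), y.var()
            indices = []
            for j in range(X.shape[1]):
                order = np.argsort(X[:, j])
                chunks = np.array_split(y[order], bins)   # conditioning by binning on X_j
                d_mean = np.mean([abs(c.mean() - mu) for c in chunks]) / abs(mu)
                d_var = np.mean([abs(c.var() - var) for c in chunks]) / var
                indices.append((d_mean, d_var))
            return indices

        # Toy model with three uncertain parameters of unequal influence.
        model = lambda X: X[:, 0] ** 2 + 0.5 * X[:, 1] + 0.1 * X[:, 2]
        sampler = lambda rng, n: rng.uniform(1.0, 2.0, size=(n, 3))
        for j, (dm, dv) in enumerate(moment_based_sensitivity(sampler, model)):
            print(f"parameter {j}: mean-shift = {dm:.3f}, variance-shift = {dv:.3f}")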

  10. Impact of audio-visual storytelling in simulation learning experiences of undergraduate nursing students.

    PubMed

    Johnston, Sandra; Parker, Christina N; Fox, Amanda

    2017-09-01

    Use of high fidelity simulation has become increasingly popular in nursing education to the extent that it is now an integral component of most nursing programs. Anecdotal evidence suggests that students have difficulty engaging with simulation manikins due to their unrealistic appearance. Introduction of the manikin as a 'real patient' with the use of an audio-visual narrative may engage students in the simulated learning experience and impact on their learning. A paucity of literature currently exists on the use of audio-visual narratives to enhance simulated learning experiences. This study aimed to determine if viewing an audio-visual narrative during a simulation pre-brief altered undergraduate nursing student perceptions of the learning experience. A quasi-experimental post-test design was utilised with a convenience sample of final-year baccalaureate nursing students at a large metropolitan university. Participants completed a modified version of the Student Satisfaction with Simulation Experiences survey. This 12-item questionnaire contained questions relating to the ability to transfer skills learned in simulation to the real clinical world, the realism of the simulation and the overall value of the learning experience. Descriptive statistics were used to summarise demographic information. Two-tailed, independent group t-tests were used to determine statistical differences within the categories. Findings indicated that students reported high levels of value, realism and transferability in relation to the viewing of an audio-visual narrative. Statistically significant results (t=2.38, p<0.02) were evident in the subscale of transferability of learning from simulation to clinical practice. The subgroups of age and gender, although not statistically significant, indicated some interesting results. High satisfaction with simulation was indicated by all students in relation to value and realism. There was a significant finding in relation to transferability of knowledge, which is vital to quality educational outcomes. Copyright © 2017. Published by Elsevier Ltd.

  11. Improved silicon nitride for advanced heat engines

    NASA Technical Reports Server (NTRS)

    Yeh, Hun C.; Fang, Ho T.

    1987-01-01

    The technology base required to fabricate silicon nitride components with the strength, reliability, and reproducibility necessary for actual heat engine applications is presented. Task 2 was set up to develop test bars with high Weibull slope and greater high temperature strength, and to conduct an initial net shape component fabrication evaluation. Screening experiments were performed in Task 7 on advanced materials and processing for input to Task 2. The technical efforts performed in the second year of a 5-yr program are covered. The first iteration of Task 2 was completed as planned. Two half-replicated, fractional factorial (2^5), statistically designed matrix experiments were conducted. These experiments have identified Denka 9FW Si3N4 as an alternate raw material to GTE SN502 Si3N4 for subsequent process evaluation. A detailed statistical analysis was conducted to correlate processing conditions with as-processed test bar properties. One processing condition produced a material with a 97 ksi average room temperature MOR (100 percent of goal) with 13.2 Weibull slope (83 percent of goal); another condition produced 86 ksi (6 percent over baseline) room temperature strength with a Weibull slope of 20 (125 percent of goal).

  12. Measuring medical students' motivation to learning anatomy by cadaveric dissection.

    PubMed

    Abdel Meguid, Eiman M; Khalil, Mohammed K

    2017-07-01

    Motivation and learning are inter-related. It is well known that motivating learners is a complex endeavor, which can be influenced by the educational program and the learning environment. Limited research has been conducted to examine students' motivation as a method to assess the effectiveness of dissection in medical education. This study aimed to assess and analyze students' motivation following their dissection experience. A 29-item survey was developed based on the Attention, Relevance, Confidence, and Satisfaction model of motivation. Descriptive statistics were used to summarize students' motivation following the dissection experience. t-tests and ANOVA were used to compare differences in motivational scores across gender and students' educational characteristics. Dissection activities appear to promote students' motivation. The gender difference was statistically significant, with males more motivated by the dissection experience than females. The comparison between students with different levels of prior anatomy knowledge was also statistically significant. The study is an important step in applying motivational design to improve students' motivation to learn. The outcome of this study provides guidance for the selection of specific strategies to increase motivation by generating motivational strategies/tactics to facilitate learning. Anat Sci Educ 10: 363-371. © 2016 American Association of Anatomists.

  13. Holos: A collaborative environment for similarity-based holistic approaches.

    PubMed

    Lê, Tâm Minh; Brard, Margot; Lê, Sébastien

    2017-10-01

    Through this article, we aim to introduce Holos, a new collaborative environment that allows researchers to carry out experiments based on similarity assessments between stimuli, such as projective-mapping and sorting tasks. An important feature of Holos is its capacity to assess real-time individual processes during the task. Within the Holos environment, researchers can design experiments on its platform, which can handle four kinds of stimuli: concepts, images, sounds, and videos. In addition, researchers can share their study resources within the scientific community, including stimuli, experimental protocols, and/or the data collected. With a dedicated Android application combined with a tactile human-machine interface, subjects can perform experiments on a tablet to obtain similarity measures between stimuli. On the tablet, the stimuli are displayed as icons that can be dragged with one finger and positioned according to the ways they are perceived. By recording the x,y coordinates of the stimuli while subjects move the icons, the obtained data can reveal the cognitive processes of the subjects during the experiment. Such data, named digit-tracking data, can be analyzed with the SensoMineR package. In this article, we describe how researchers can design an experiment, how subjects can perform the experiment, and how digit-tracking data can be statistically analyzed within the Holos environment. At the end of the article, a short exemplary experiment is presented.

  14. A novel statistical method for quantitative comparison of multiple ChIP-seq datasets.

    PubMed

    Chen, Li; Wang, Chi; Qin, Zhaohui S; Wu, Hao

    2015-06-15

    ChIP-seq is a powerful technology for measuring protein binding or histone modification strength on a whole-genome scale. Although a number of methods are available for single ChIP-seq data analysis (e.g. 'peak detection'), a rigorous statistical method for quantitative comparison of multiple ChIP-seq datasets that accounts for control-experiment data, signal-to-noise ratios, biological variation and multiple-factor experimental designs remains underdeveloped. In this work, we develop a statistical method to perform quantitative comparison of multiple ChIP-seq datasets and detect genomic regions showing differential protein binding or histone modification. We first detect peaks from all datasets and then take their union to form a single set of candidate regions. The read counts from the IP experiment at the candidate regions are assumed to follow a Poisson distribution. The underlying Poisson rates are modeled as an experiment-specific function of artifacts and biological signals. We then obtain the estimated biological signals and compare them through a hypothesis-testing procedure in a linear-model framework. Simulations and real data analyses demonstrate that the proposed method provides more accurate and robust results than existing ones. An R software package, ChIPComp, is freely available at http://web1.sph.emory.edu/users/hwu30/software/ChIPComp.html. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
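
    For illustration only (this is not the ChIPComp implementation), a minimal Python sketch of the core modeling idea, a Poisson regression on the IP read counts at one candidate region with a condition effect and a library-size offset, might look like this; the counts and library sizes are invented.

      # Minimal sketch (not ChIPComp): Poisson GLM on IP read counts at one candidate
      # region, with a condition indicator and log library size as an offset.
      # All numbers below are made up for illustration.
      import numpy as np
      import statsmodels.api as sm

      counts    = np.array([85, 92, 78, 150, 160, 141])   # hypothetical IP read counts
      condition = np.array([0, 0, 0, 1, 1, 1])            # two conditions, 3 replicates each
      lib_size  = np.array([2.0e7, 2.2e7, 1.9e7, 2.1e7, 2.0e7, 2.3e7])

      X = sm.add_constant(condition)                      # intercept + condition indicator
      fit = sm.GLM(counts, X, family=sm.families.Poisson(),
                   offset=np.log(lib_size)).fit()

      print("log fold-change estimate:", fit.params[1])
      print("Wald p-value for differential binding:", fit.pvalues[1])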

  15. Three Dimensional CFD Analysis of the GTX Combustor

    NASA Technical Reports Server (NTRS)

    Steffen, C. J., Jr.; Bond, R. B.; Edwards, J. R.

    2002-01-01

    The annular combustor geometry of a combined-cycle engine has been analyzed with three-dimensional computational fluid dynamics. Both subsonic combustion and supersonic combustion flowfields have been simulated. The subsonic combustion analysis was executed in conjunction with a direct-connect test rig. Results from two cold-flow cases and one hot-flow case are presented. The simulations compare favorably with the test data for the two cold-flow calculations; the hot-flow data were not yet available. The hot-flow simulation indicates that the conventional ejector-ramjet cycle would not provide adequate mixing at the conditions tested. The supersonic combustion ramjet flowfield was simulated with a frozen-chemistry model. A five-parameter test matrix was specified according to statistical design-of-experiments theory. Twenty-seven separate simulations were used to assemble surrogate models for combustor mixing efficiency and total pressure recovery. Scramjet injector design parameters (injector angle, location, and fuel split) as well as mission variables (total fuel massflow and freestream Mach number) were included in the analysis. A promising injector design has been identified that provides good mixing characteristics with low total pressure losses. The surrogate models can be used to develop performance maps for different injector designs. Several complex three-way variable interactions appear within the dataset that are not adequately resolved with the current statistical analysis.
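
    A minimal sketch of the surrogate-modeling step, assuming a 27-run design in five coded variables and fitting a quadratic response surface with scikit-learn; the inputs and the stand-in response function are placeholders, not the actual combustor variables or CFD outputs.

      # Minimal sketch: quadratic response-surface surrogate fitted to a 27-run
      # designed computer experiment. Inputs and the stand-in "simulation" are
      # placeholders, not the GTX design variables or CFD results.
      import numpy as np
      from sklearn.preprocessing import PolynomialFeatures
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(0)
      X = rng.uniform(-1.0, 1.0, size=(27, 5))      # 27 runs, 5 coded design variables

      # Stand-in response (e.g. mixing efficiency); replace with real run outputs.
      y = 0.7 + 0.1*X[:, 0] - 0.05*X[:, 1]**2 + 0.08*X[:, 0]*X[:, 2] \
          + rng.normal(0, 0.01, size=27)

      surrogate = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                                LinearRegression())
      surrogate.fit(X, y)

      # Query the surrogate at a candidate injector setting (coded units).
      candidate = np.array([[0.5, -0.2, 0.8, 0.0, 0.3]])
      print("predicted mixing efficiency:", surrogate.predict(candidate)[0])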

  16. DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTICAL ASSESSMENT

    EPA Science Inventory

    Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...

  17. Design of a Ka-Band Propagation Terminal for Atmospheric Measurements in Polar Regions

    NASA Technical Reports Server (NTRS)

    Houts, Jacquelynne R.; Nessel, James A.; Zemba, Michael J.

    2016-01-01

    This paper describes the design and performance of a Ka-Band beacon receiver developed at NASA Glenn Research Center (GRC) that will be installed alongside an existing Ka-Band Radiometer [2] located at the east end of the Svalbard Near Earth Network (NEN) complex. The goal of this experiment is to characterize rain fade attenuation to improve the performance of existing statistical rain attenuation models. The ground terminal developed by NASA GRC utilizes an FFT-based frequency estimation [3] receiver capable of characterizing total path attenuation effects due to gaseous absorption, clouds, rain, and scintillation by directly measuring the propagated signal from the satellite Thor 7.

  18. Design of a Ka-band Propagation Terminal for Atmospheric Measurements in Polar Regions

    NASA Technical Reports Server (NTRS)

    Houts, Jacquelynne R.; Nessel, James A.; Zemba, Michael J.

    2016-01-01

    This paper describes the design and performance of a Ka-Band beacon receiver developed at NASA Glenn Research Center (GRC) that will be installed alongside an existing Ka-Band Radiometer located at the east end of the Svalbard Near Earth Network (NEN) complex. The goal of this experiment is to characterize rain fade attenuation to improve the performance of existing statistical rain attenuation models. The ground terminal developed by NASA GRC utilizes an FFT-based frequency estimation receiver capable of characterizing total path attenuation effects due to gaseous absorption, clouds, rain, and scintillation by directly measuring the propagated signal from the satellite Thor 7.

  19. CARES/Life Software for Designing More Reliable Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Baker, Eric H.

    1997-01-01

    Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software eases this task by providing a tool to optimize the design and manufacture of brittle material components using probabilistic reliability analysis techniques. Probabilistic component design involves predicting the probability of failure for a thermomechanically loaded component from specimen rupture data. Typically, these experiments are performed using many simple-geometry flexural or tensile test specimens. A static, dynamic, or cyclic load is applied to each specimen until fracture. Statistical strength and SCG (fatigue) parameters are then determined from these data. Using these parameters and the results obtained from a finite element analysis, the time-dependent reliability for a complex component geometry and loading is then predicted. Appropriate design changes are made until an acceptable probability of failure has been reached.
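
    As a minimal sketch (not the CARES/Life code itself), two-parameter Weibull strength parameters can be estimated from specimen fracture stresses by maximum likelihood, and the resulting probability of failure evaluated for a single uniform stress state; the rupture data below are invented.

      # Minimal sketch (not CARES/Life): maximum-likelihood fit of two-parameter
      # Weibull strength parameters from specimen fracture stresses, then a simple
      # probability-of-failure estimate. The rupture data are made up.
      import numpy as np
      from scipy import stats

      fracture_stress_mpa = np.array([612, 655, 580, 701, 690, 640, 720, 598,
                                      668, 705, 630, 685, 710, 645, 675])

      # Fix the location parameter at zero (floc=0) so only the Weibull modulus m
      # and the characteristic strength sigma_0 are estimated.
      m, loc, sigma0 = stats.weibull_min.fit(fracture_stress_mpa, floc=0)
      print(f"Weibull modulus m ~ {m:.1f}")
      print(f"Characteristic strength sigma_0 ~ {sigma0:.0f} MPa")

      # Probability of failure at a given applied stress (single uniform stress
      # state, no size scaling): Pf = 1 - exp[-(sigma/sigma_0)^m]
      sigma = 500.0
      pf = 1 - np.exp(-(sigma / sigma0) ** m)
      print(f"P(failure) at {sigma:.0f} MPa ~ {pf:.3f}")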

  20. Treatment of dyeing wastewater by TiO2/H2O2/UV process: experimental design approach for evaluating total organic carbon (TOC) removal efficiency.

    PubMed

    Lee, Seung-Mok; Kim, Young-Gyu; Cho, Il-Hyoung

    2005-01-01

    Optimal operating conditions for treating dyeing wastewater were investigated using factorial design and response surface methodology (RSM). The experiment was statistically designed and carried out according to a 2^2 full factorial design with four factorial points, three center points, and four axial points. Linear and nonlinear regression were then applied to the data using the SAS software package. The independent variables were TiO2 dosage and H2O2 concentration; the total organic carbon (TOC) removal efficiency of the dyeing wastewater was the dependent variable. From the factorial design and RSM, a maximum removal efficiency of 85% was obtained at a TiO2 dosage of 1.82 gL(-1) and an H2O2 concentration of 980 mgL(-1) for a 20-min oxidation reaction.
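
    A small sketch of the design construction implied above: a two-factor central composite design with four factorial points, four axial points, and three center points, expressed in coded and natural units. The TiO2 and H2O2 ranges used here are illustrative assumptions, not the levels from the paper.

      # Minimal sketch: the 11-run central composite design implied above
      # (2^2 factorial + 4 axial points + 3 center points). The natural-unit
      # ranges for TiO2 and H2O2 are assumptions, not the paper's levels.
      import numpy as np

      alpha = np.sqrt(2)                      # rotatable axial distance for 2 factors
      factorial = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1]])
      axial = np.array([[-alpha, 0], [alpha, 0], [0, -alpha], [0, alpha]])
      center = np.zeros((3, 2))
      coded = np.vstack([factorial, axial, center])      # 11 runs, coded units

      # Assumed factor ranges (center +/- 1 coded unit): TiO2 in g/L, H2O2 in mg/L.
      centers = np.array([1.5, 800.0])
      halfwidths = np.array([0.5, 300.0])
      natural = centers + coded * halfwidths

      for i, (c, n) in enumerate(zip(coded, natural), start=1):
          print(f"run {i:2d}: coded=({c[0]:+.2f}, {c[1]:+.2f})  "
                f"TiO2={n[0]:.2f} g/L  H2O2={n[1]:.0f} mg/L")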

  1. The Content of Statistical Requirements for Authors in Biomedical Research Journals

    PubMed Central

    Liu, Tian-Yi; Cai, Si-Yu; Nie, Xiao-Lu; Lyu, Ya-Qi; Peng, Xiao-Xia; Feng, Guo-Shuang

    2016-01-01

    Background: Robust statistical design, sound statistical analysis, and standardized presentation are important for enhancing the quality and transparency of biomedical research. This systematic review was conducted to summarize the statistical reporting requirements introduced by biomedical research journals with an impact factor of 10 or above, so that researchers give statistical issues serious consideration not only at the stage of data analysis but also at the stage of methodological design. Methods: Detailed statistical instructions for authors were downloaded from the homepage of each of the included journals or obtained directly from the editors via email. We then described the types and numbers of statistical guidelines introduced by different press groups. Items from statistical reporting guidelines, as well as particular requirements, were summarized by frequency and grouped into design, method of analysis, and presentation. Finally, updated statistical guidelines and particular requirements for improvement were summed up. Results: In total, 21 of 23 press groups introduced at least one statistical guideline. More than half of the press groups update their statistical instructions for authors as new statistical reporting guidelines are issued. In addition, 16 press groups, covering 44 journals, address particular statistical requirements. Most of the particular requirements focused on the performance of statistical analysis and transparency in statistical reporting, including “address issues relevant to research design, including participant flow diagram, eligibility criteria, and sample size estimation,” and “statistical methods and the reasons.” Conclusions: Statistical requirements for authors are becoming increasingly refined. They remind researchers to give sufficient consideration not only to statistical methods during research design but also to standardized statistical reporting, which would be beneficial in providing stronger evidence and making critical appraisal of that evidence more accessible. PMID:27748343

  2. The Content of Statistical Requirements for Authors in Biomedical Research Journals.

    PubMed

    Liu, Tian-Yi; Cai, Si-Yu; Nie, Xiao-Lu; Lyu, Ya-Qi; Peng, Xiao-Xia; Feng, Guo-Shuang

    2016-10-20

    Robust statistical design, sound statistical analysis, and standardized presentation are important for enhancing the quality and transparency of biomedical research. This systematic review was conducted to summarize the statistical reporting requirements introduced by biomedical research journals with an impact factor of 10 or above, so that researchers give statistical issues serious consideration not only at the stage of data analysis but also at the stage of methodological design. Detailed statistical instructions for authors were downloaded from the homepage of each of the included journals or obtained directly from the editors via email. We then described the types and numbers of statistical guidelines introduced by different press groups. Items from statistical reporting guidelines, as well as particular requirements, were summarized by frequency and grouped into design, method of analysis, and presentation. Finally, updated statistical guidelines and particular requirements for improvement were summed up. In total, 21 of 23 press groups introduced at least one statistical guideline. More than half of the press groups update their statistical instructions for authors as new statistical reporting guidelines are issued. In addition, 16 press groups, covering 44 journals, address particular statistical requirements. Most of the particular requirements focused on the performance of statistical analysis and transparency in statistical reporting, including "address issues relevant to research design, including participant flow diagram, eligibility criteria, and sample size estimation," and "statistical methods and the reasons." Statistical requirements for authors are becoming increasingly refined. They remind researchers to give sufficient consideration not only to statistical methods during research design but also to standardized statistical reporting, which would be beneficial in providing stronger evidence and making critical appraisal of that evidence more accessible.

  3. Validating an Air Traffic Management Concept of Operation Using Statistical Modeling

    NASA Technical Reports Server (NTRS)

    He, Yuning; Davies, Misty Dawn

    2013-01-01

    Validating a concept of operation for a complex, safety-critical system (like the National Airspace System) is challenging because of the high dimensionality of the controllable parameters and the infinite number of states of the system. In this paper, we use statistical modeling techniques to explore the behavior of a conflict detection and resolution algorithm designed for the terminal airspace. These techniques predict the robustness of the system simulation to both nominal and off-nominal behaviors within the overall airspace. They can also be used to evaluate the output of the simulation against recorded airspace data. Additionally, the techniques carry with them a mathematical measure of the worth of each prediction: a statistical uncertainty for any robustness estimate. Uncertainty Quantification (UQ) is the process of quantitatively characterizing and ultimately reducing uncertainties in complex systems. UQ is important for understanding the influence of uncertainties on the behavior of a system and is therefore valuable for design, analysis, and verification and validation. In this paper, we apply advanced statistical modeling methodologies and techniques to an advanced air traffic management system, namely the Terminal Tactical Separation Assured Flight Environment (T-TSAFE). We show initial results for a parameter analysis and safety boundary (envelope) detection in the high-dimensional parameter space. For the boundary analysis, we developed a new sequential approach based on the design of computer experiments, allowing us to incorporate knowledge from domain experts into our modeling and to determine the most likely boundary shapes and their parameters. We carried out the analysis on system parameters and describe an initial approach that will allow us to include time-series inputs, such as radar track data, in the analysis.

  4. PsyToolkit: a software package for programming psychological experiments using Linux.

    PubMed

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the GNU General Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software integrates easily with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light-emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.

  5. Attachment styles, earlier interpersonal relationships and schizotypy in a non-clinical sample.

    PubMed

    Berry, Katherine; Band, Rebecca; Corcoran, Rhiannon; Barrowclough, Christine; Wearden, Alison

    2007-12-01

    This paper investigates associations between adult attachment style, relationships with significant others during childhood, traumatic life-events and schizotypy. Relationships between attachment and hypothesized correlates were investigated in a cross-sectional design using an analogue sample. The reliability of the attachment and trauma measures was investigated using a test-retest design. Three hundred and four students completed the self-report version of the Psychosis Attachment Measure (PAM), maternal and paternal versions of the Parental Bonding Instrument, the Attachment History Questionnaire, a measure of trauma and the Oxford-Liverpool Inventory of Feelings and Experiences scale through an internet website. As predicted, there were statistically significant associations between insecure attachment in adult relationships and experiences of negative interpersonal events. Both earlier interpersonal experiences and adult attachment style predicted schizotypy, and adult attachment style emerged as an independent predictor of positive schizotypal characteristics. The findings support associations between adult attachment style and previous interpersonal experiences and between adult attachment and schizotypy. The PAM is a reliable and valid instrument that can be used to explore attachment styles in analogue samples and associations between attachment styles and psychotic symptoms in clinical samples.

  6. A method to accelerate creation of plasma etch recipes using physics and Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Chopra, Meghali J.; Verma, Rahul; Lane, Austin; Willson, C. G.; Bonnecaze, Roger T.

    2017-03-01

    Next generation semiconductor technologies like high density memory storage require precise 2D and 3D nanopatterns. Plasma etching processes are essential to achieving the nanoscale precision required for these structures. Current plasma process development methods rely primarily on iterative trial and error or factorial design of experiment (DOE) to define the plasma process space. Here we evaluate the efficacy of the software tool Recipe Optimization for Deposition and Etching (RODEo) against standard industry methods at determining the process parameters of a high density O2 plasma system with three case studies. In the first case study, we demonstrate that RODEo is able to predict etch rates more accurately than a regression model based on a full factorial design while using 40% fewer experiments. In the second case study, we demonstrate that RODEo performs significantly better than a full factorial DOE at identifying optimal process conditions to maximize anisotropy. In the third case study we experimentally show how RODEo maximizes etch rates while using half the experiments of a full factorial DOE method. With enhanced process predictions and more accurate maps of the process space, RODEo reduces the number of experiments required to develop and optimize plasma processes.
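
    RODEo itself is a proprietary tool, so the following is only a generic sketch of sequential, model-guided experiment selection (a Gaussian-process surrogate with an expected-improvement criterion) that illustrates why such approaches can need fewer runs than a full factorial DOE; the etch_rate function and factor ranges are stand-ins.

      # Generic sketch of sequential model-based recipe optimization (Gaussian-process
      # surrogate + expected improvement). This is NOT the RODEo algorithm, only an
      # illustration of model-guided run selection on a made-up etch-rate function.
      import numpy as np
      from scipy.stats import norm
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import Matern

      rng = np.random.default_rng(1)

      def etch_rate(x):
          """Stand-in 'experiment': noisy etch rate vs. (power, pressure), coded in [0, 1]^2."""
          return -((x[0] - 0.7) ** 2 + (x[1] - 0.3) ** 2) + rng.normal(0, 0.02)

      X = rng.uniform(0, 1, size=(4, 2))                 # small initial design
      y = np.array([etch_rate(x) for x in X])
      grid = np.array([[a, b] for a in np.linspace(0, 1, 25)
                              for b in np.linspace(0, 1, 25)])

      for _ in range(8):                                 # eight sequential experiments
          gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
          mu, sd = gp.predict(grid, return_std=True)
          z = (mu - y.max()) / np.maximum(sd, 1e-9)
          ei = (mu - y.max()) * norm.cdf(z) + sd * norm.pdf(z)   # expected improvement
          x_next = grid[np.argmax(ei)]
          X = np.vstack([X, x_next])
          y = np.append(y, etch_rate(x_next))

      print("best observed setting (coded units):", X[np.argmax(y)], "score:", y.max())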

  7. Eigenvalues of Random Matrices with Isotropic Gaussian Noise and the Design of Diffusion Tensor Imaging Experiments.

    PubMed

    Gasbarra, Dario; Pajevic, Sinisa; Basser, Peter J

    2017-01-01

    Tensor-valued and matrix-valued measurements of different physical properties are increasingly available in material sciences and medical imaging applications. The eigenvalues and eigenvectors of such multivariate data provide novel and unique information, but at the cost of requiring a more complex statistical analysis. In this work we derive the distributions of eigenvalues and eigenvectors in the special but important case of m×m symmetric random matrices, D , observed with isotropic matrix-variate Gaussian noise. The properties of these distributions depend strongly on the symmetries of the mean tensor/matrix, D̄ . When D̄ has repeated eigenvalues, the eigenvalues of D are not asymptotically Gaussian, and repulsion is observed between the eigenvalues corresponding to the same D̄ eigenspaces. We apply these results to diffusion tensor imaging (DTI), with m = 3, addressing an important problem of detecting the symmetries of the diffusion tensor, and seeking an experimental design that could potentially yield an isotropic Gaussian distribution. In the 3-dimensional case, when the mean tensor is spherically symmetric and the noise is Gaussian and isotropic, the asymptotic distribution of the first three eigenvalue central moment statistics is simple and can be used to test for isotropy. In order to apply such tests, we use quadrature rules of order t ≥ 4 with constant weights on the unit sphere to design a DTI-experiment with the property that isotropy of the underlying true tensor implies isotropy of the Fisher information. We also explain the potential implications of the methods using simulated DTI data with a Rician noise model.
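
    A short simulation in the spirit of the result above: symmetric 3 × 3 matrices observed with (approximately) isotropic Gaussian noise around a spherically symmetric mean show eigenvalue repulsion, so the sorted eigenvalues do not behave like independent Gaussians around the true, equal eigenvalues. The noise level and sample size are arbitrary.

      # Minimal simulation: eigenvalue repulsion for noisy symmetric 3x3 matrices
      # around a spherically symmetric mean (all true eigenvalues equal). Noise
      # construction is an illustrative symmetrized Gaussian, not the paper's exact model.
      import numpy as np

      rng = np.random.default_rng(0)
      d_bar = np.eye(3)                 # spherically symmetric mean tensor
      sigma, n = 0.05, 10000

      gaps = []
      for _ in range(n):
          g = rng.normal(0, sigma, size=(3, 3))
          noise = (g + g.T) / 2.0                   # symmetric Gaussian perturbation
          evals = np.linalg.eigvalsh(d_bar + noise)
          gaps.append(evals[2] - evals[0])          # spread between extreme eigenvalues

      gaps = np.array(gaps)
      print("mean eigenvalue spread:", gaps.mean())     # stays well above 0: repulsion
      print("P(spread < 0.01):", np.mean(gaps < 0.01))  # spread essentially never collapses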

  8. Characterization of a Saccharomyces cerevisiae fermentation process for production of a therapeutic recombinant protein using a multivariate Bayesian approach.

    PubMed

    Fu, Zhibiao; Baker, Daniel; Cheng, Aili; Leighton, Julie; Appelbaum, Edward; Aon, Juan

    2016-05-01

    The principle of quality by design (QbD) has been widely applied to biopharmaceutical manufacturing processes. Process characterization is an essential step in implementing the QbD concept to establish the design space and to define the proven acceptable ranges (PAR) for critical process parameters (CPPs). In this study, we present characterization of a Saccharomyces cerevisiae fermentation process using risk assessment analysis, statistical design of experiments (DoE), and a multivariate Bayesian predictive approach. The critical quality attributes (CQAs) and CPPs were identified with a risk assessment. The statistical model for each attribute was established using the results from the DoE study, with consideration given to interactions between CPPs. Both the conventional overlapping contour plot and the multivariate Bayesian predictive approaches were used to establish the region of process operating conditions where all attributes met their specifications simultaneously. The quantitative Bayesian predictive approach was chosen to define the PARs for the CPPs, which apply to the manufacturing control strategy. Experience from the 10,000 L manufacturing scale process validation, including 64 continued process verification batches, indicates that the CPPs remain in a state of control and within the established PARs. The end product quality attributes were within their drug substance specifications. The probability generated with the Bayesian approach was also used as a tool to assess CPP deviations. This approach can be extended to the characterization of other production processes and to quantifying a reliable operating region. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:799-812, 2016. © 2016 American Institute of Chemical Engineers.
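
    Not the paper's actual model, but a minimal sketch of the underlying idea of a predictive design space: for each candidate setting of two process parameters, estimate the probability that a quality attribute meets its specification under a fitted response model plus predictive uncertainty. The coefficients, specification limit, and noise level are invented placeholders.

      # Minimal sketch of a predictive-probability design space: over a grid of
      # process parameter settings, estimate P(CQA meets spec) under an assumed
      # response model with predictive noise. All numbers are placeholders.
      import numpy as np

      rng = np.random.default_rng(42)

      def predicted_cqa(temp_c, ph, n_draws=2000):
          """Draws from an assumed predictive distribution of a CQA (e.g. titer, g/L)."""
          mean = 5.0 + 0.3 * (temp_c - 30) - 0.8 * (ph - 6.0) ** 2
          return mean + rng.normal(0, 0.4, size=n_draws)

      spec_lower = 4.5                          # assumed CQA specification limit
      temps = np.linspace(28, 34, 13)
      phs = np.linspace(5.5, 6.5, 11)

      print("P(CQA >= spec) over the (temperature, pH) grid:")
      for t in temps:
          probs = [np.mean(predicted_cqa(t, p) >= spec_lower) for p in phs]
          print(f"T={t:4.1f} C : " + "  ".join(f"{pr:4.2f}" for pr in probs))
      # Settings whose probability exceeds a chosen threshold (e.g. 0.95) would form
      # the reliable operating region / proven acceptable ranges in this toy example.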

  9. Optimization of Anodic Porous Alumina Fabricated from Commercial Aluminum Food Foils: A Statistical Approach

    PubMed Central

    Riccomagno, Eva; Shayganpour, Amirreza; Salerno, Marco

    2017-01-01

    Anodic porous alumina is a known material based on an old industry, yet with emerging applications in nanoscience and nanotechnology. This is promising, but the nanostructured alumina should be fabricated from inexpensive raw material. We fabricated porous alumina from commercial aluminum food plate in 0.4 M aqueous phosphoric acid, aiming to design an effective manufacturing protocol for the material used as nanoporous filler in dental restorative composites, an application demonstrated previously by our group. We identified the critical input parameters of anodization voltage, bath temperature and anodization time, and the main output parameters of pore diameter, pore spacing and oxide thickness. Scanning electron microscopy and grain analysis allowed us to assess the nanostructured material, and the statistical design of experiments was used to optimize its fabrication. We analyzed a preliminary dataset, designed a second dataset aimed at clarifying the correlations between input and output parameters, and ran a confirmation dataset. Anodization conditions close to 125 V, 20 °C, and 7 h were identified as the best for obtaining, in the shortest possible time, pore diameters and spacing of 100–150 nm and 150–275 nm respectively, and thickness of 6–8 µm, which are desirable for the selected application according to previously published results. Our analysis confirmed the linear dependence of pore size on anodization voltage and of thickness on anodization time. The importance of proper control on the experiment was highlighted, since batch effects emerge when the experimental conditions are not exactly reproduced. PMID:28772776

  10. Eigenvalues of Random Matrices with Isotropic Gaussian Noise and the Design of Diffusion Tensor Imaging Experiments*

    PubMed Central

    Gasbarra, Dario; Pajevic, Sinisa; Basser, Peter J.

    2017-01-01

    Tensor-valued and matrix-valued measurements of different physical properties are increasingly available in material sciences and medical imaging applications. The eigenvalues and eigenvectors of such multivariate data provide novel and unique information, but at the cost of requiring a more complex statistical analysis. In this work we derive the distributions of eigenvalues and eigenvectors in the special but important case of m×m symmetric random matrices, D, observed with isotropic matrix-variate Gaussian noise. The properties of these distributions depend strongly on the symmetries of the mean tensor/matrix, D̄. When D̄ has repeated eigenvalues, the eigenvalues of D are not asymptotically Gaussian, and repulsion is observed between the eigenvalues corresponding to the same D̄ eigenspaces. We apply these results to diffusion tensor imaging (DTI), with m = 3, addressing an important problem of detecting the symmetries of the diffusion tensor, and seeking an experimental design that could potentially yield an isotropic Gaussian distribution. In the 3-dimensional case, when the mean tensor is spherically symmetric and the noise is Gaussian and isotropic, the asymptotic distribution of the first three eigenvalue central moment statistics is simple and can be used to test for isotropy. In order to apply such tests, we use quadrature rules of order t ≥ 4 with constant weights on the unit sphere to design a DTI-experiment with the property that isotropy of the underlying true tensor implies isotropy of the Fisher information. We also explain the potential implications of the methods using simulated DTI data with a Rician noise model. PMID:28989561

  11. Investigation of Statistical Inference Methodologies Through Scale Model Propagation Experiments

    DTIC Science & Technology

    2015-09-30

    statistical inference methodologies for ocean-acoustic problems by investigating and applying statistical methods to data collected from scale-model...to begin planning experiments for statistical inference applications. APPROACH In the ocean acoustics community over the past two decades...solutions for waveguide parameters. With the introduction of statistical inference to the field of ocean acoustics came the desire to interpret marginal

  12. Application of Taguchi methods to infrared window design

    NASA Astrophysics Data System (ADS)

    Osmer, Kurt A.; Pruszynski, Charles J.

    1990-10-01

    Dr. Genichi Taguchi, a prominent quality consultant, reduced a branch of statistics known as "Design of Experiments" to a cookbook methodology that can be employed by any competent engineer. This technique has been extensively employed by Japanese manufacturers, and is widely credited with helping them attain their current level of success in low cost, high quality product design and fabrication. Although this technique was originally put forth as a tool to streamline the determination of improved production processes, it can also be applied to a wide range of engineering problems. As part of an internal research project, this method of experimental design has been adapted to window trade studies and materials research. Two of these analyses are presented herein, chosen to illustrate the breadth of applications to which the Taguchi method can be applied.
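
    A small sketch of the cookbook workflow the abstract refers to, assuming placeholder factors and responses rather than the window trade-study data: build an L8 orthogonal array for up to seven two-level factors and rank the columns by a larger-is-better signal-to-noise ratio.

      # Minimal Taguchi-style sketch: construct an L8 orthogonal array (from a 2^3
      # factorial and its interaction columns) and compute main effects on a
      # larger-is-better S/N ratio. Factor assignments and responses are placeholders.
      import itertools
      import numpy as np

      base = np.array(list(itertools.product([-1, 1], repeat=3)))   # columns A, B, C
      A, B, C = base[:, 0], base[:, 1], base[:, 2]
      L8 = np.column_stack([A, B, A * B, C, A * C, B * C, A * B * C])  # 8 x 7, orthogonal

      # Hypothetical responses for the 8 runs (e.g. window transmittance, %).
      y = np.array([78.0, 81.5, 74.2, 80.1, 83.3, 79.8, 76.5, 84.0])

      # Larger-is-better S/N ratio per run: -10*log10(mean(1/y^2)); one replicate here.
      sn = -10 * np.log10(1.0 / y ** 2)

      # Main effect of each column on S/N: mean at level +1 minus mean at level -1.
      for j in range(7):
          effect = sn[L8[:, j] == 1].mean() - sn[L8[:, j] == -1].mean()
          print(f"column {j + 1}: S/N effect = {effect:+.3f} dB")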

  13. Iterative LQG Controller Design Through Closed-Loop Identification

    NASA Technical Reports Server (NTRS)

    Hsiao, Min-Hung; Huang, Jen-Kuang; Cox, David E.

    1996-01-01

    This paper presents an iterative Linear Quadratic Gaussian (LQG) controller design approach for a linear stochastic system with an uncertain open-loop model and unknown noise statistics. This approach consists of closed-loop identification and controller redesign cycles. In each cycle, the closed-loop identification method is used to identify an open-loop model and a steady-state Kalman filter gain from closed-loop input/output test data obtained by using a feedback LQG controller designed in the previous cycle. The identified open-loop model is then used to redesign the state feedback. The state feedback and the identified Kalman filter gain are used to form an updated LQG controller for the next cycle. This iterative process continues until the updated controller converges. The proposed controller design is demonstrated by numerical simulations and experiments on a highly unstable large-gap magnetic suspension system.
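
    A minimal sketch of one LQG redesign step for a discrete-time model, using toy matrices rather than the identified magnetic-suspension model: compute the LQR state-feedback gain and the steady-state Kalman gain from an (A, B, C) triple and assumed noise covariances via the discrete algebraic Riccati equation.

      # Minimal sketch of one LQG redesign step for a discrete-time model. The system
      # matrices and noise covariances are toy placeholders, not the paper's identified model.
      import numpy as np
      from scipy.linalg import solve_discrete_are

      A = np.array([[1.0, 0.1],
                    [0.0, 1.0]])
      B = np.array([[0.005],
                    [0.1]])
      C = np.array([[1.0, 0.0]])

      Q, R = np.eye(2), np.array([[0.1]])            # LQR state / input weights
      W, V = 1e-4 * np.eye(2), np.array([[1e-2]])    # process / measurement noise covariances

      # LQR state feedback u = -K x, from the control Riccati equation.
      P = solve_discrete_are(A, B, Q, R)
      K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

      # Steady-state Kalman gain from the dual (estimation) Riccati equation.
      S = solve_discrete_are(A.T, C.T, W, V)
      L = S @ C.T @ np.linalg.inv(C @ S @ C.T + V)

      print("LQR gain K:", K)
      print("Kalman gain L:", L.ravel())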

  14. Merging National Forest and National Forest Health Inventories to Obtain an Integrated Forest Resource Inventory – Experiences from Bavaria, Slovenia and Sweden

    PubMed Central

    Kovač, Marko; Bauer, Arthur; Ståhl, Göran

    2014-01-01

    Background, Material and Methods: To meet the demands of sustainable forest management and international commitments, European nations have designed a variety of forest-monitoring systems for specific needs. While the majority of countries are committed to independent, single-purpose inventorying, a minority of countries have merged their single-purpose forest inventory systems into integrated forest resource inventories. The statistical efficiencies of the Bavarian, Slovene and Swedish integrated forest resource inventory designs are investigated using various statistical parameters of the variables growing stock volume, share of damaged trees, and deadwood volume. The parameters are derived using the estimators for the given inventory designs. The required sample sizes are derived via the general formula for non-stratified independent samples and via statistical power analyses. The cost effectiveness of the designs is compared via two simple cost effectiveness ratios. Results: In terms of precision, the most illustrative parameters of the variables are relative standard errors; their values range between 1% and 3% if the variables’ variations are low (s%<80%) and are higher in the case of higher variations. A comparison of the actual and required sample sizes shows that the actual sample sizes were deliberately set high to provide precise estimates for the majority of variables and strata. In turn, the successive inventories are statistically efficient, because they allow mean changes in the variables to be detected with power higher than 90%; the highest precision is attained for changes in growing stock volume and the lowest for changes in the share of damaged trees. Two indicators of cost effectiveness also show that the time spent measuring one variable decreases with the complexity of inventories. Conclusion: There is an increasing need for credible information on forest resources to be used for decision making and national and international policy making. Such information can be cost-efficiently provided through integrated forest resource inventories. PMID:24941120

  15. InChIKey collision resistance: an experimental testing

    PubMed Central

    2012-01-01

    InChIKey is a 27-character compacted (hashed) version of InChI which is intended for Internet and database searching/indexing and is based on an SHA-256 hash of the InChI character string. The first block of InChIKey encodes molecular skeleton while the second block represents various kinds of isomerism (stereo, tautomeric, etc.). InChIKey is designed to be a nearly unique substitute for the parent InChI. However, a single InChIKey may occasionally map to two or more InChI strings (collision). The appearance of collision itself does not compromise the signature as collision-free hashing is impossible; the only viable approach is to set and keep a reasonable level of collision resistance which is sufficient for typical applications. We tested, in computational experiments, how well the real-life InChIKey collision resistance corresponds to the theoretical estimates expected by design. For this purpose, we analyzed the statistical characteristics of InChIKey for datasets of variable size in comparison to the theoretical statistical frequencies. For the relatively short second block, an exhaustive direct testing was performed. We computed and compared to theory the numbers of collisions for the stereoisomers of Spongistatin I (using the whole set of 67,108,864 isomers and its subsets). For the longer first block, we generated, using custom-made software, InChIKeys for more than 3 × 10^10 chemical structures. The statistical behavior of this block was tested by comparison of experimental and theoretical frequencies for the various four-letter sequences which may appear in the first block body. From the results of our computational experiments we conclude that the observed characteristics of InChIKey collision resistance are in good agreement with theoretical expectations. PMID:23256896

  16. InChIKey collision resistance: an experimental testing.

    PubMed

    Pletnev, Igor; Erin, Andrey; McNaught, Alan; Blinov, Kirill; Tchekhovskoi, Dmitrii; Heller, Steve

    2012-12-20

    InChIKey is a 27-character compacted (hashed) version of InChI which is intended for Internet and database searching/indexing and is based on an SHA-256 hash of the InChI character string. The first block of InChIKey encodes molecular skeleton while the second block represents various kinds of isomerism (stereo, tautomeric, etc.). InChIKey is designed to be a nearly unique substitute for the parent InChI. However, a single InChIKey may occasionally map to two or more InChI strings (collision). The appearance of collision itself does not compromise the signature as collision-free hashing is impossible; the only viable approach is to set and keep a reasonable level of collision resistance which is sufficient for typical applications. We tested, in computational experiments, how well the real-life InChIKey collision resistance corresponds to the theoretical estimates expected by design. For this purpose, we analyzed the statistical characteristics of InChIKey for datasets of variable size in comparison to the theoretical statistical frequencies. For the relatively short second block, an exhaustive direct testing was performed. We computed and compared to theory the numbers of collisions for the stereoisomers of Spongistatin I (using the whole set of 67,108,864 isomers and its subsets). For the longer first block, we generated, using custom-made software, InChIKeys for more than 3 × 10^10 chemical structures. The statistical behavior of this block was tested by comparison of experimental and theoretical frequencies for the various four-letter sequences which may appear in the first block body. From the results of our computational experiments we conclude that the observed characteristics of InChIKey collision resistance are in good agreement with theoretical expectations.
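
    The theoretical baseline the authors compare against can be illustrated with a birthday-problem estimate of the expected number of collisions when N keys are hashed uniformly into M possible values. Treating the 14-letter first block as roughly 26^14 possible values is an approximation of the real encoded space, and the dataset size below is only an example.

      # Sketch of the birthday-problem baseline for hash collisions: expected number
      # of collisions when n keys are hashed uniformly into m possible values.
      # Treating the first block as 26^14 values is an approximation of the encoded space.
      import math

      def expected_collisions(n_items, n_buckets):
          """n - m*(1 - (1 - 1/m)^n), evaluated stably via log1p/expm1."""
          expected_distinct = -n_buckets * math.expm1(n_items * math.log1p(-1.0 / n_buckets))
          return n_items - expected_distinct

      m_first_block = 26 ** 14          # 14 uppercase letters in the first block (approx.)
      n = 3 * 10 ** 10                  # order of the structure count tested above

      print("approx. hash space size:", m_first_block)
      print("expected collisions:", expected_collisions(n, m_first_block))
      print("birthday approximation n^2/(2m):", n ** 2 / (2 * m_first_block))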

  17. ‘Dignity therapy’, a promising intervention in palliative care: A comprehensive systematic literature review

    PubMed Central

    Martínez, Marina; Arantzamendi, María; Belar, Alazne; Carrasco, José Miguel; Carvajal, Ana; Rullán, María; Centeno, Carlos

    2016-01-01

    Background: Dignity therapy is a psychotherapy to relieve psychological and existential distress in patients at the end of life. Little is known about its effect. Aim: To analyse the outcomes of dignity therapy in patients with advanced life-threatening diseases. Design: A systematic review was conducted. Three authors extracted data from the articles and evaluated quality using the Critical Appraisal Skills Programme. Data were synthesized, considering study objectives. Data sources: PubMed, CINAHL, Cochrane Library and PsycINFO. The years searched were 2002 (year of dignity therapy development) to January 2016. ‘Dignity therapy’ was used as the search term. Studies with patients with advanced life-threatening diseases were included. Results: Of 121 studies, 28 were included. The quality of the studies is high. Results were grouped into effectiveness, satisfaction, suitability and feasibility, and adaptability to different diseases and cultures. Two of five randomized controlled trials applied dignity therapy to patients with high levels of baseline psychological distress. One showed a statistically significant decrease in patients’ anxiety and depression scores over time. The other showed a statistically significant decrease in anxiety scores from pre- to post-dignity therapy, but not in depression scores. Nonrandomized studies suggested statistically significant improvements in existential and psychosocial measurements. Patients, relatives and professionals perceived that it improved the end-of-life experience. Conclusion: Evidence suggests that dignity therapy is beneficial. One randomized controlled trial with patients with high levels of psychological distress shows the efficacy of dignity therapy on anxiety and depression scores. Studies with other designs report beneficial outcomes in terms of end-of-life experience. Further research should clarify how dignity therapy functions, establish a means of measuring its impact, and assess whether patients with high levels of distress benefit most from this therapy. PMID:27566756

  18. Power of mental health nursing research: a statistical analysis of studies in the International Journal of Mental Health Nursing.

    PubMed

    Gaskin, Cadeyrn J; Happell, Brenda

    2013-02-01

    Having sufficient power to detect effect sizes of an expected magnitude is a core consideration when designing studies in which inferential statistics will be used. The main aim of this study was to investigate the statistical power in studies published in the International Journal of Mental Health Nursing. From volumes 19 (2010) and 20 (2011) of the journal, studies were analysed for their power to detect small, medium, and large effect sizes, according to Cohen's guidelines. The power of the 23 studies included in this review to detect small, medium, and large effects was 0.34, 0.79, and 0.94, respectively. In 90% of papers, no adjustments for experiment-wise error were reported. With a median of nine inferential tests per paper, the mean experiment-wise error rate was 0.51. A priori power analyses were only reported in 17% of studies. Although effect sizes for correlations and regressions were routinely reported, effect sizes for other tests (χ(2)-tests, t-tests, ANOVA/MANOVA) were largely absent from the papers. All types of effect sizes were infrequently interpreted. Researchers are strongly encouraged to conduct power analyses when designing studies, and to avoid scattergun approaches to data analysis (i.e. undertaking large numbers of tests in the hope of finding 'significant' results). Because reviewing effect sizes is essential for determining the clinical significance of study findings, researchers would better serve the field of mental health nursing if they reported and interpreted effect sizes. © 2012 The Authors. International Journal of Mental Health Nursing © 2012 Australian College of Mental Health Nurses Inc.
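
    The kind of a priori power analysis the authors recommend can be sketched with statsmodels: the per-group sample size needed for a two-sided independent-samples t-test to reach 80% power at alpha = 0.05, for Cohen's small, medium, and large effect sizes, plus the achieved power of a hypothetical fixed design.

      # Sketch of an a priori power analysis for an independent-samples t-test,
      # using Cohen's conventional small/medium/large effect sizes. The fixed design
      # in the last line (n = 30 per group) is a hypothetical example.
      from statsmodels.stats.power import TTestIndPower

      analysis = TTestIndPower()
      for label, d in [("small", 0.2), ("medium", 0.5), ("large", 0.8)]:
          n_per_group = analysis.solve_power(effect_size=d, alpha=0.05, power=0.80,
                                             alternative="two-sided")
          print(f"{label:6s} effect (d={d}): n ~ {n_per_group:.0f} per group")

      # Achieved power for a fixed design, e.g. 30 participants per group and d = 0.5.
      print("power(n=30/group, d=0.5):",
            round(analysis.power(effect_size=0.5, nobs1=30, alpha=0.05, ratio=1.0), 2))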

  19. Investigate-and-redesign tasks as a context for learning and doing science and technology: A study of naive, novice and expert high school and adult designers doing product comparisons and redesign tasks

    NASA Astrophysics Data System (ADS)

    Crismond, David Paul

    This thesis studied high school students and adults with varying degrees of design experience doing two technology investigate-and-redesign (I&R) tasks. Each involved subjects investigating products, designing experiments to compare them fairly, and then redesigning the devices. A total of 25 pairs of subjects participated in this investigation and included naive and novice high school designers, as well as naive, novice, and expert adult designers. Subjects of similar age and design experience worked in same-gender teams and met for two 2-hour sessions. The essential research question of this thesis was: "What process skills and concepts do naive, novice and expert designers use and learn when investigating devices, designing experiments, and redesigning the devices?" Three methodologies were used to gather and analyze the data: clinical interviewing (Piaget, 1929/1960), protocol analysis (Ericsson & Simon, 1984) and interaction analysis (Jordan and Henderson, 1995). The thesis provides composite case-studies of 10 of the 50 test sessions, buttressed by descriptions of performance trends for all subjects. Given the small sample sizes involved, the findings are by necessity tentative and not supported by statistical analysis: (1) I&R activities are engaging, less time-intensive complements to design-and-build tasks, which involve simple mechanical devices and carry with them a host of potential "alternative understandings" in science and technology. Much gets learned during these tasks, more involving "device knowledge" and "device inquiry skills" than "big ideas" in science and technology. (2) Redesign tasks scaffold naive and novice designers to improved performance in the multidimensional and context-specific activity of design. The performances of naive and novice designers were more like that of expert designers when redesigning existing devices than when doing start-from-scratch designing. (3) Conceptual redesign involved more analysis- than synthesis-related design strategies, suggesting that opportunities for teaching science and technology during design are present, but underutilized since only experts made frequent connections to key science concepts. (4) Naive subjects focused mostly on product features and functions in their designs and made analogies mostly to concrete objects, while experts focused more on problem-finding, determining appropriate mechanisms, and made connections using analogies and concepts at both abstract and concrete levels.

  20. Estimating weak ratiometric signals in imaging data. II. Meta-analysis with multiple, dual-channel datasets.

    PubMed

    Sornborger, Andrew; Broder, Josef; Majumder, Anirban; Srinivasamoorthy, Ganesh; Porter, Erika; Reagin, Sean S; Keith, Charles; Lauderdale, James D

    2008-09-01

    Ratiometric fluorescent indicators are used for making quantitative measurements of a variety of physiological variables. Their utility is often limited by noise. This is the second in a series of papers describing statistical methods for denoising ratiometric data with the aim of obtaining improved quantitative estimates of variables of interest. Here, we outline a statistical optimization method that is designed for the analysis of ratiometric imaging data in which multiple measurements have been taken of systems responding to the same stimulation protocol. This method takes advantage of correlated information across multiple datasets for objectively detecting and estimating ratiometric signals. We demonstrate our method by showing results of its application on multiple, ratiometric calcium imaging experiments.
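
    The following is not the authors' estimator, only a generic illustration of the benefit of pooling repeated recordings that share a stimulation protocol: stacking the datasets and taking an SVD recovers the common temporal response much more cleanly than any single noisy recording. The simulated signal and noise levels are invented.

      # Generic illustration (not the paper's method): extract the temporal response
      # shared by several noisy recordings of the same stimulation protocol with an
      # SVD across datasets. Signal shape, scaling, and noise are made up.
      import numpy as np

      rng = np.random.default_rng(3)
      t = np.linspace(0, 10, 500)
      true_signal = np.exp(-((t - 4.0) ** 2) / 0.5)          # common response to stimulus

      n_datasets = 6
      recordings = np.array([0.8 * true_signal * rng.uniform(0.7, 1.3)
                             + rng.normal(0, 0.5, size=t.size)
                             for _ in range(n_datasets)])     # datasets x time

      # SVD across datasets: the leading right singular vector captures the waveform
      # shared by all recordings, attenuating dataset-specific noise.
      U, s, Vt = np.linalg.svd(recordings - recordings.mean(axis=1, keepdims=True),
                               full_matrices=False)
      shared = Vt[0]

      corr = np.corrcoef(shared, true_signal)[0, 1]
      print(f"|correlation| of leading component with true response: {abs(corr):.2f}")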
