Sample records for high statistics experiments

  1. QCD Precision Measurements and Structure Function Extraction at a High Statistics, High Energy Neutrino Scattering Experiment: NuSOnG

    NASA Astrophysics Data System (ADS)

    Adams, T.; Batra, P.; Bugel, L.; Camilleri, L.; Conrad, J. M.; de Gouvêa, A.; Fisher, P. H.; Formaggio, J. A.; Jenkins, J.; Karagiorgi, G.; Kobilarcik, T. R.; Kopp, S.; Kyle, G.; Loinaz, W. A.; Mason, D. A.; Milner, R.; Moore, R.; Morfín, J. G.; Nakamura, M.; Naples, D.; Nienaber, P.; Olness, F. I.; Owens, J. F.; Pate, S. F.; Pronin, A.; Seligman, W. G.; Shaevitz, M. H.; Schellman, H.; Schienbein, I.; Syphers, M. J.; Tait, T. M. P.; Takeuchi, T.; Tan, C. Y.; van de Water, R. G.; Yamamoto, R. K.; Yu, J. Y.

    We extend the physics case for a new high-energy, ultra-high statistics neutrino scattering experiment, NuSOnG (Neutrino Scattering On Glass), to address a variety of issues including precision QCD measurements, extraction of structure functions, and the derived Parton Distribution Functions (PDFs). This experiment uses a Tevatron-based neutrino beam to obtain a sample of Deep Inelastic Scattering (DIS) events which is over two orders of magnitude larger than past samples. We outline an innovative method for fitting the structure functions using a parametrized energy shift which yields reduced systematic uncertainties. High statistics measurements, in combination with improved systematics, will enable NuSOnG to perform discerning tests of fundamental Standard Model parameters as we search for deviations which may hint at "Beyond the Standard Model" physics.

  2. The influence of narrative v. statistical information on perceiving vaccination risks.

    PubMed

    Betsch, Cornelia; Ulshöfer, Corina; Renkewitz, Frank; Betsch, Tilmann

    2011-01-01

    Health-related information found on the Internet is increasing and impacts patient decision making, e.g. regarding vaccination decisions. In addition to statistical information (e.g. incidence rates of vaccine adverse events), narrative information is also widely available, such as postings on online bulletin boards. Previous research has shown that narrative information can impact treatment decisions, even when statistical information is presented concurrently. As the determinants of this effect are largely unknown, we varied features of the narratives to identify mechanisms through which narratives impact risk judgments. An online bulletin board setting provided participants with statistical information and authentic narratives about the occurrence and nonoccurrence of adverse events. Experiment 1 followed a single factorial design with 1, 2, or 4 narratives out of 10 reporting adverse events. Experiment 2 implemented a 2 (statistical risk 20% vs. 40%) × 2 (2/10 vs. 4/10 narratives reporting adverse events) × 2 (high vs. low richness) × 2 (high vs. low emotionality) between-subjects design. Dependent variables were perceived risk of side-effects and vaccination intentions. Experiment 1 showed an inverse relation between the number of narratives reporting adverse events and vaccination intentions, which was mediated by the perceived risk of vaccinating. Experiment 2 showed a stronger influence of the number of narratives than of the statistical risk information. Highly (vs. less) emotional narratives had a greater impact on perceived risk, while richness had no effect. The number of narratives thus influences risk judgments and can potentially override statistical information about risk.

  3. Musical Experience Influences Statistical Learning of a Novel Language

    PubMed Central

    Shook, Anthony; Marian, Viorica; Bartolotti, James; Schroeder, Scott R.

    2014-01-01

    Musical experience may benefit learning a new language by enhancing the fidelity with which the auditory system encodes sound. In the current study, participants with varying degrees of musical experience were exposed to two statistically-defined languages consisting of auditory Morse-code sequences which varied in difficulty. We found an advantage for highly-skilled musicians, relative to less-skilled musicians, in learning novel Morse-code based words. Furthermore, in the more difficult learning condition, performance of lower-skilled musicians was mediated by their general cognitive abilities. We suggest that musical experience may lead to enhanced processing of statistical information and that musicians’ enhanced ability to learn statistical probabilities in a novel Morse-code language may extend to natural language learning. PMID:23505962

  4. A statistical approach to selecting and confirming validation targets in -omics experiments

    PubMed Central

    2012-01-01

    Background Genomic technologies are, by their very nature, designed for hypothesis generation. In some cases, the hypotheses that are generated require that genome scientists confirm findings about specific genes or proteins. But one major advantage of high-throughput technology is that global genetic, genomic, transcriptomic, and proteomic behaviors can be observed. Manual confirmation of every statistically significant genomic result is prohibitively expensive. This has led researchers in genomics to adopt the strategy of confirming only a handful of the most statistically significant results, a small subset chosen for biological interest, or a small random subset. But there is no standard approach for selecting and quantitatively evaluating validation targets. Results Here we present a new statistical method and approach for statistically validating lists of significant results based on confirming only a small random sample. We apply our statistical method to show that the usual practice of confirming only the most statistically significant results does not statistically validate result lists. We analyze an extensively validated RNA-sequencing experiment to show that confirming a random subset can statistically validate entire lists of significant results. Finally, we analyze multiple publicly available microarray experiments to show that statistically validating random samples can both (i) provide evidence to confirm long gene lists and (ii) save thousands of dollars and hundreds of hours of labor over manual validation of each significant result. Conclusions For high-throughput -omics studies, statistical validation is a cost-effective and statistically valid approach to confirming lists of significant results. PMID:22738145
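The random-subset idea in this record can be illustrated with a short sketch. This is a minimal, hypothetical example, not the paper's method: the function and data names are ours, and it simply estimates a one-sided lower confidence bound on the confirmation rate of a significant-result list from a small random validation sample, using a normal approximation.

```python
import math
import random

def validate_random_subset(significant_results, n_sample, confirm, z=1.645):
    """Estimate the confirmation rate of a significant-result list from a
    random validation sample, with a one-sided normal-approximation lower
    bound (default ~95%). `confirm` returns True if a result is confirmed."""
    sample = random.sample(significant_results, n_sample)
    confirmed = sum(1 for r in sample if confirm(r))
    p_hat = confirmed / n_sample
    se = math.sqrt(p_hat * (1 - p_hat) / n_sample)
    return p_hat, max(0.0, p_hat - z * se)

# Hypothetical scenario: 1,000 significant genes, validate 30 at random;
# the confirm() rule here is invented purely for illustration.
random.seed(7)
genes = list(range(1000))
p_hat, lower = validate_random_subset(genes, 30, confirm=lambda g: g % 10 != 0)
```

If `lower` stays high, the random sample provides evidence for the whole list without validating every entry, which is the cost saving the record describes.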

  5. Statistical Engineering in Air Traffic Management Research

    NASA Technical Reports Server (NTRS)

    Wilson, Sara R.

    2015-01-01

    NASA is working to develop an integrated set of advanced technologies to enable efficient arrival operations in high-density terminal airspace for the Next Generation Air Transportation System. This integrated arrival solution is being validated and verified in laboratories and transitioned to a field prototype for an operational demonstration at a major U.S. airport. Within NASA, this is a collaborative effort between Ames and Langley Research Centers involving a multi-year iterative experimentation process. Designing and analyzing a series of sequential batch computer simulations and human-in-the-loop experiments across multiple facilities and simulation environments involves a number of statistical challenges. Experiments conducted in separate laboratories typically have different limitations and constraints, and can take different approaches with respect to the fundamental principles of statistical design of experiments. This often makes it difficult to compare results from multiple experiments and incorporate findings into the next experiment in the series. A statistical engineering approach is being employed within this project to support risk-informed decision making and maximize the knowledge gained within the available resources. This presentation describes a statistical engineering case study from NASA, highlights statistical challenges, and discusses areas where existing statistical methodology is adapted and extended.

  6. Physical fitness modulates incidental but not intentional statistical learning of simultaneous auditory sequences during concurrent physical exercise.

    PubMed

    Daikoku, Tatsuya; Takahashi, Yuji; Futagami, Hiroko; Tarumoto, Nagayoshi; Yasuda, Hideki

    2017-02-01

    In real-world auditory environments, humans are exposed to overlapping auditory information, such as human voices and musical instruments, even during routine physical activities such as walking and cycling. The present study investigated how concurrent physical exercise affects incidental and intentional learning of overlapping auditory streams, and whether physical fitness modulates learning performance. Participants were divided into two groups of 11 with lower and higher fitness, based on their VO2max values. They were presented with two simultaneous auditory sequences, each with a distinct statistical regularity (i.e., statistical learning), both while pedaling on a stationary bike and while seated on the bike at rest. In Experiment 1, they were instructed to attend to one of the two sequences and ignore the other. In Experiment 2, they were instructed to attend to both sequences. After exposure to the sequences, learning effects were evaluated by a familiarity test. In Experiment 1, performance on statistical learning of the ignored sequences during concurrent pedaling was higher in participants with high than with low physical fitness, whereas for the attended sequences there was no significant difference between the fitness groups. Furthermore, there was no significant effect of physical fitness on learning while resting. In Experiment 2, participants with both high and low physical fitness performed intentional statistical learning of the two simultaneous sequences in both the exercise and rest sessions. Improved physical fitness might thus facilitate incidental, but not intentional, statistical learning of simultaneous auditory sequences during concurrent physical exercise.

  7. Multiplexed single-molecule force spectroscopy using a centrifuge.

    PubMed

    Yang, Darren; Ward, Andrew; Halvorsen, Ken; Wong, Wesley P

    2016-03-17

    We present a miniature centrifuge force microscope (CFM) that repurposes a benchtop centrifuge for high-throughput single-molecule experiments with high-resolution particle tracking, a large force range, temperature control and simple push-button operation. Incorporating DNA nanoswitches to enable repeated interrogation by force of single molecular pairs, we demonstrate increased throughput, reliability and the ability to characterize population heterogeneity. We perform spatiotemporally multiplexed experiments to collect 1,863 bond rupture statistics from 538 traceable molecular pairs in a single experiment, and show that 2 populations of DNA zippers can be distinguished using per-molecule statistics to reduce noise.

  8. Multiplexed single-molecule force spectroscopy using a centrifuge

    PubMed Central

    Yang, Darren; Ward, Andrew; Halvorsen, Ken; Wong, Wesley P.

    2016-01-01

    We present a miniature centrifuge force microscope (CFM) that repurposes a benchtop centrifuge for high-throughput single-molecule experiments with high-resolution particle tracking, a large force range, temperature control and simple push-button operation. Incorporating DNA nanoswitches to enable repeated interrogation by force of single molecular pairs, we demonstrate increased throughput, reliability and the ability to characterize population heterogeneity. We perform spatiotemporally multiplexed experiments to collect 1,863 bond rupture statistics from 538 traceable molecular pairs in a single experiment, and show that 2 populations of DNA zippers can be distinguished using per-molecule statistics to reduce noise. PMID:26984516

  9. Radar derived spatial statistics of summer rain. Volume 1: Experiment description

    NASA Technical Reports Server (NTRS)

    Katz, I.; Arnold, A.; Goldhirsh, J.; Konrad, T. G.; Vann, W. L.; Dobson, E. B.; Rowland, J. R.

    1975-01-01

    An experiment was performed at Wallops Island, Virginia, to obtain a statistical description of summer rainstorms. Its purpose was to obtain information needed for design of earth and space communications systems in which precipitation in the earth's atmosphere scatters or attenuates the radio signal. Rainstorms were monitored with the high resolution SPANDAR radar and the 3-dimensional structures of the storms were recorded on digital tape. The equipment, the experiment, and tabulated data obtained during the experiment are described.

  10. Adaptive interference cancel filter for evoked potential using high-order cumulants.

    PubMed

    Lin, Bor-Shyh; Lin, Bor-Shing; Chong, Fok-Ching; Lai, Feipei

    2004-01-01

    This paper presents evoked potential (EP) processing using an adaptive interference cancellation (AIC) filter with second- and higher-order cumulants. In the conventional ensemble averaging method, experiments must be repeated many times to record the required data. The use of the AIC structure with second-order statistics for processing EPs has recently proved more efficient than the traditional averaging method, but it is sensitive both to the statistics of the reference signal and to the choice of step size. We therefore propose a higher-order-statistics-based AIC method to overcome these disadvantages. The study was conducted on somatosensory EPs corrupted by EEG, using a gradient-type algorithm in the AIC method. Comparisons of AIC filters based on second-, third-, and fourth-order statistics are also presented. We observed that the AIC filter with third-order statistics has better convergence performance for EP processing and is not sensitive to the choice of step size or reference input.
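For context, the baseline AIC structure referenced here is an adaptive noise canceller. Below is a minimal second-order (LMS) sketch; this is not the paper's cumulant-based variant (that method replaces this least-squares weight update with higher-order cumulant estimates), and all names and signals are illustrative.

```python
import math

def lms_cancel(primary, reference, mu=0.05, taps=4):
    """Least-mean-squares adaptive interference canceller (second-order statistics).
    primary: signal of interest plus interference; reference: correlated interference.
    Returns the error sequence, i.e., the cleaned-signal estimate."""
    w = [0.0] * taps
    cleaned = []
    for n in range(len(primary)):
        x = [reference[n - k] if n - k >= 0 else 0.0 for k in range(taps)]
        y = sum(wi * xi for wi, xi in zip(w, x))        # estimate of the interference
        e = primary[n] - y                              # residual = signal estimate
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]  # gradient-type update
        cleaned.append(e)
    return cleaned

# Pure-interference check: with primary == reference, the residual should shrink
# as the filter learns to cancel the sinusoidal interference.
ref = [math.sin(0.3 * n) for n in range(500)]
cleaned = lms_cancel(ref, ref)
```

The sensitivity to `mu` (step size) that this baseline exhibits is precisely what the record says the third-order-cumulant variant improves upon.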

  11. Increasing the statistical significance of entanglement detection in experiments.

    PubMed

    Jungnitsch, Bastian; Niekamp, Sönke; Kleinmann, Matthias; Gühne, Otfried; Lu, He; Gao, Wei-Bo; Chen, Yu-Ao; Chen, Zeng-Bing; Pan, Jian-Wei

    2010-05-28

    Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. Experimentally, we observe this phenomenon in a four-photon experiment, testing the Mermin and Ardehali inequality for different levels of noise. Furthermore, we provide a way to develop entanglement tests with high statistical significance.

  12. Data Literacy is Statistical Literacy

    ERIC Educational Resources Information Center

    Gould, Robert

    2017-01-01

    Past definitions of statistical literacy should be updated in order to account for the greatly amplified role that data now play in our lives. Experience working with high-school students in an innovative data science curriculum has shown that teaching statistical literacy, augmented by data literacy, can begin early.

  13. Black Females in High School: A Statistical Educational Profile

    ERIC Educational Resources Information Center

    Muhammad, Crystal Gafford; Dixson, Adrienne D.

    2008-01-01

    In life as in literature, both the mainstream public and the Black community writ large overlook Black female experiences, both adolescent and adult. In order to contribute to the knowledge base regarding this population, we present through our study a statistical portrait of Black females in high school. To do so, we present an analysis of…

  14. Educational Statistics Authentic Learning CAPSULES: Community Action Projects for Students Utilizing Leadership and E-Based Statistics

    ERIC Educational Resources Information Center

    Thompson, Carla J.

    2009-01-01

    Since educational statistics is a core or general requirement of all students enrolled in graduate education programs, the need for high quality student engagement and appropriate authentic learning experiences is critical for promoting student interest and student success in the course. Based in authentic learning theory and engagement theory…

  15. Software for the Integration of Multiomics Experiments in Bioconductor.

    PubMed

    Ramos, Marcel; Schiffer, Lucas; Re, Angela; Azhar, Rimsha; Basunia, Azfar; Rodriguez, Carmen; Chan, Tiffany; Chapman, Phil; Davis, Sean R; Gomez-Cabrero, David; Culhane, Aedin C; Haibe-Kains, Benjamin; Hansen, Kasper D; Kodali, Hanish; Louis, Marie S; Mer, Arvind S; Riester, Markus; Morgan, Martin; Carey, Vince; Waldron, Levi

    2017-11-01

    Multiomics experiments are increasingly commonplace in biomedical research and add layers of complexity to experimental design, data integration, and analysis. R and Bioconductor provide a generic framework for statistical analysis and visualization, as well as specialized data classes for a variety of high-throughput data types, but methods are lacking for integrative analysis of multiomics experiments. The MultiAssayExperiment software package, implemented in R and leveraging Bioconductor software and design principles, provides for the coordinated representation of, storage of, and operation on multiple diverse genomics data. We provide the unrestricted multiple 'omics data for each cancer tissue in The Cancer Genome Atlas as ready-to-analyze MultiAssayExperiment objects and demonstrate in these and other datasets how the software simplifies data representation, statistical analysis, and visualization. The MultiAssayExperiment Bioconductor package reduces major obstacles to efficient, scalable, and reproducible statistical analysis of multiomics data and enhances data science applications of multiple omics datasets. Cancer Res; 77(21); e39-42. ©2017 American Association for Cancer Research.

  16. A Study of Particle Beam Spin Dynamics for High Precision Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiedler, Andrew J.

    In the search for physics beyond the Standard Model, high precision experiments to measure fundamental properties of particles are an important frontier. One group of such measurements involves magnetic dipole moment (MDM) values as well as searching for an electric dipole moment (EDM), both of which could provide insights about how particles interact with their environment at the quantum level and whether there are undiscovered new particles. For these types of high precision experiments, minimizing statistical uncertainties in the measurements plays a critical role. This work leverages computer simulations to quantify the effects of statistical uncertainty for experiments investigating spin dynamics. In it, analysis of beam properties and lattice design effects on the polarization of the beam is performed. As a case study, the beam lines that will provide polarized muon beams to the Fermilab Muon g-2 experiment are analyzed to determine the effects of correlations between the phase space variables and the overall polarization of the muon beam.

  17. Potential sources of variability in mesocosm experiments on the response of phytoplankton to ocean acidification

    NASA Astrophysics Data System (ADS)

    Moreno de Castro, Maria; Schartau, Markus; Wirtz, Kai

    2017-04-01

    Mesocosm experiments on phytoplankton dynamics under high CO2 concentrations mimic the response of marine primary producers to future ocean acidification. However, potential acidification effects can be hindered by the high standard deviation typically found in the replicates of the same CO2 treatment level. In experiments with multiple unresolved factors and a sub-optimal number of replicates, post-processing statistical inference tools might fail to detect an effect that is present. We propose that in such cases, data-based model analyses might be suitable tools to unearth potential responses to the treatment and identify the uncertainties that could produce the observed variability. As test cases, we used data from two independent mesocosm experiments. Both experiments showed high standard deviations and, according to statistical inference tools, biomass appeared insensitive to changing CO2 conditions. Conversely, our simulations showed earlier and more intense phytoplankton blooms in modeled replicates at high CO2 concentrations and suggested that uncertainties in average cell size, phytoplankton biomass losses, and initial nutrient concentration potentially outweigh acidification effects by triggering strong variability during the bloom phase. We also estimated the thresholds below which uncertainties do not escalate to high variability. This information might help in designing future mesocosm experiments and interpreting controversial results on the effect of acidification or other pressures on ecosystem functions.

  18. Data exploration, quality control and statistical analysis of ChIP-exo/nexus experiments

    PubMed Central

    Welch, Rene; Chung, Dongjun; Grass, Jeffrey; Landick, Robert

    2017-01-01

    ChIP-exo/nexus experiments rely on innovative modifications of the commonly used ChIP-seq protocol for high resolution mapping of transcription factor binding sites. Although many aspects of the ChIP-exo data analysis are similar to those of ChIP-seq, these high throughput experiments pose a number of unique quality control and analysis challenges. We develop a novel statistical quality control pipeline and accompanying R/Bioconductor package, ChIPexoQual, to enable exploration and analysis of ChIP-exo and related experiments. ChIPexoQual evaluates a number of key issues including strand imbalance, library complexity, and signal enrichment of data. Assessment of these features is facilitated through diagnostic plots and summary statistics computed over regions of the genome with varying levels of coverage. We evaluated our QC pipeline with both large collections of public ChIP-exo/nexus data and multiple, new ChIP-exo datasets from Escherichia coli. ChIPexoQual analysis of these datasets resulted in guidelines for using these QC metrics across a wide range of sequencing depths and provided further insights for modelling ChIP-exo data. PMID:28911122

  19. Data exploration, quality control and statistical analysis of ChIP-exo/nexus experiments.

    PubMed

    Welch, Rene; Chung, Dongjun; Grass, Jeffrey; Landick, Robert; Keles, Sündüz

    2017-09-06

    ChIP-exo/nexus experiments rely on innovative modifications of the commonly used ChIP-seq protocol for high resolution mapping of transcription factor binding sites. Although many aspects of the ChIP-exo data analysis are similar to those of ChIP-seq, these high throughput experiments pose a number of unique quality control and analysis challenges. We develop a novel statistical quality control pipeline and accompanying R/Bioconductor package, ChIPexoQual, to enable exploration and analysis of ChIP-exo and related experiments. ChIPexoQual evaluates a number of key issues including strand imbalance, library complexity, and signal enrichment of data. Assessment of these features is facilitated through diagnostic plots and summary statistics computed over regions of the genome with varying levels of coverage. We evaluated our QC pipeline with both large collections of public ChIP-exo/nexus data and multiple, new ChIP-exo datasets from Escherichia coli. ChIPexoQual analysis of these datasets resulted in guidelines for using these QC metrics across a wide range of sequencing depths and provided further insights for modelling ChIP-exo data. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. The longevity of statistical learning: When infant memory decays, isolated words come to the rescue.

    PubMed

    Karaman, Ferhat; Hay, Jessica F

    2018-02-01

    Research over the past 2 decades has demonstrated that infants are equipped with remarkable computational abilities that allow them to find words in continuous speech. Infants can encode information about the transitional probability (TP) between syllables to segment words from artificial and natural languages. As previous research has tested infants immediately after familiarization, infants' ability to retain sequential statistics beyond the immediate familiarization context remains unknown. Here, we examine infants' memory for statistically defined words 10 min after familiarization with an Italian corpus. Eight-month-old English-learning infants were familiarized with Italian sentences that contained 4 embedded target words-2 words had high internal TP (HTP, TP = 1.0) and 2 had low TP (LTP, TP = .33)-and were tested on their ability to discriminate HTP from LTP words using the Headturn Preference Procedure. When tested after a 10-min delay, infants failed to discriminate HTP from LTP words, suggesting that memory for statistical information likely decays over even short delays (Experiment 1). Experiments 2-4 were designed to test whether experience with isolated words selectively reinforces memory for statistically defined (i.e., HTP) words. When 8-month-olds were given additional experience with isolated tokens of both HTP and LTP words immediately after familiarization, they looked significantly longer on HTP than LTP test trials 10 min later. Although initial representations of statistically defined words may be fragile, our results suggest that experience with isolated words may reinforce the output of statistical learning by helping infants create more robust memories for words with strong versus weak co-occurrence statistics. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
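The transitional-probability computation described in this record is straightforward to sketch. The code below is an illustrative toy, not the study's materials (the syllable stream is invented, not the Italian corpus): it computes forward TPs, count(xy)/count(x), over adjacent syllables, showing how within-word transitions reach TP = 1.0 while the word-final syllable, followed by different syllables across word boundaries, yields boundary TPs well below 1.

```python
from collections import Counter

def transitional_probabilities(syllables):
    """Forward transitional probability TP(x -> y) = count(x, y) / count(x)
    between adjacent syllables in a continuous stream."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {pair: c / first_counts[pair[0]] for pair, c in pair_counts.items()}

# Hypothetical stream: the "word" bu-pa-da always occurs intact, so its
# internal transitions have TP = 1.0; what follows "da" varies (go, ti, go),
# so the boundary TPs are 2/3 and 1/3.
stream = ["bu", "pa", "da", "go", "la", "tu",
          "bu", "pa", "da", "ti",
          "bu", "pa", "da", "go", "la", "tu"]
tps = transitional_probabilities(stream)
```

A learner tracking these statistics can segment "bupada" from the stream because TP dips at its edges, which is the HTP/LTP contrast the experiments manipulate.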

  1. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    PubMed

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.
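As a rough illustration of the control-chart idea (not the SProCoP implementation, whose thresholds are determined empirically from user-defined QC standards), the sketch below derives Shewhart-style limits from baseline QC runs and flags new observations that fall outside them. The retention-time numbers are invented.

```python
import statistics

def control_limits(qc_values, k=3.0):
    """Shewhart-style control limits from baseline QC runs: mean +/- k*stdev."""
    mu = statistics.mean(qc_values)
    sigma = statistics.stdev(qc_values)
    return mu - k * sigma, mu + k * sigma

def flag_out_of_control(values, lower, upper):
    """Indices of observations outside the control limits."""
    return [i for i, v in enumerate(values) if not (lower <= v <= upper)]

# Hypothetical retention-time QC (minutes): baseline runs establish limits,
# then new runs are monitored against them.
baseline = [12.1, 12.0, 12.2, 11.9, 12.1, 12.0, 12.1, 12.2]
lo, hi = control_limits(baseline)
drifting = flag_out_of_control([12.1, 12.0, 13.5, 12.1], lo, hi)
```

The same mean-and-limits logic underlies control charts for any of the metrics the record lists (peak asymmetry, mass accuracy, ion intensity); only the monitored quantity changes.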

  2. Comparison of student's learning achievement through realistic mathematics education (RME) approach and problem solving approach on grade VII

    NASA Astrophysics Data System (ADS)

    Ilyas, Muhammad; Salwah

    2017-02-01

    This research was experimental. The purpose of this study was to determine the difference in, and the quality of, students' learning achievement between students taught through the Realistic Mathematics Education (RME) approach and students taught through the problem solving approach. This was a quasi-experimental study with a non-equivalent experiment group design. The population was all grade VII students in one junior high school in Palopo, in the second semester of the 2015/2016 academic year. Two classes were selected purposively as the sample: class VII-5 (28 students) as experiment group I and class VII-6 (23 students) as experiment group II. The treatment in experiment group I was learning through the RME approach, whereas experiment group II learned through the problem solving approach. Data were collected by administering a pretest and a posttest. The analysis used descriptive statistics and inferential statistics with a t-test. Based on the descriptive analysis, the average mathematics score of students taught using the problem solving approach was similar to that of students taught using the RME approach, both at the high category. In addition, it can be concluded that (1) there was no difference in the mathematics learning achievement of students taught using the RME approach and those taught using the problem solving approach, and (2) the quality of learning achievement was the same for both approaches, at the high category.

  3. Teaching Probability with the Support of the R Statistical Software

    ERIC Educational Resources Information Center

    dos Santos Ferreira, Robson; Kataoka, Verônica Yumi; Karrer, Monica

    2014-01-01

    The objective of this paper is to discuss aspects of high school students' learning of probability in a context where they are supported by the statistical software R. We report on the application of a teaching experiment, constructed using the perspective of Gal's probabilistic literacy and Papert's constructionism. The results show improvement…

  4. Preservice Teachers' Memories of Their Secondary Science Education Experiences

    ERIC Educational Resources Information Center

    Hudson, Peter; Usak, Muhammet; Fancovicova, Jana; Erdogan, Mehmet; Prokop, Pavol

    2010-01-01

    Understanding preservice teachers' memories of their education may aid towards articulating high-impact teaching practices. This study describes 246 preservice teachers' perceptions of their secondary science education experiences through a questionnaire and 28-item survey. ANOVA was statistically significant about participants' memories of…

  5. Preservice Teachers' Memories of Their Secondary Science Education Experiences

    NASA Astrophysics Data System (ADS)

    Hudson, Peter; Usak, Muhammet; Fančovičová, Jana; Erdoğan, Mehmet; Prokop, Pavol

    2010-12-01

    Understanding preservice teachers' memories of their education may aid towards articulating high-impact teaching practices. This study describes 246 preservice teachers' perceptions of their secondary science education experiences through a questionnaire and 28-item survey. ANOVA showed statistically significant differences in participants' memories of science for 15 of the 28 survey items. Descriptive statistics through SPSS further showed that a teacher's enthusiastic nature (87%) and positive attitude towards science (87%) were regarded as highly memorable. In addition, explaining abstract concepts well (79%) and guiding the students' conceptual development with practical science activities (73%) may be considered memorable secondary science teaching strategies. Implementing science lessons with one or more of these memorable science teaching practices may "make a difference" towards influencing high school students' positive long-term memories about science and their science education. Further research in other key learning areas may provide a clearer picture of high-impact teaching and a way to enhance pedagogical practices.

  6. Parallel processing of genomics data

    NASA Astrophysics Data System (ADS)

    Agapito, Giuseppe; Guzzi, Pietro Hiram; Cannataro, Mario

    2016-10-01

    The availability of high-throughput experimental platforms for the analysis of biological samples, such as mass spectrometry, microarrays and Next Generation Sequencing, has made it possible to analyze a whole genome in a single experiment. Such platforms produce an enormous volume of data per experiment, so the analysis of this flow of data poses several challenges in terms of data storage, preprocessing, and analysis. To face these issues, efficient, possibly parallel, bioinformatics software is needed to preprocess and analyze the data, for instance to highlight genetic variation associated with complex diseases. In this paper we present an algorithm for the parallel preprocessing and statistical analysis of genomics data that can handle high-dimensional data with good response times. The proposed system can find statistically significant biological markers that discriminate between classes of patients who respond to drugs in different ways. Experiments performed on real and synthetic genomic datasets show good speed-up and scalability.
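    The per-marker testing described in this record can be sketched in a few lines. This is an illustration only, not the paper's algorithm: the marker names, the simple two-proportion z test, and the thread pool are all assumptions made for the example.

```python
# Sketch: parallel per-marker statistics for a case/control genomics study.
# Marker names and the two-proportion z test are illustrative assumptions.
from concurrent.futures import ThreadPoolExecutor
from math import sqrt

def two_proportion_z(marker):
    """z statistic for the difference in minor-allele frequency."""
    name, cases, n_cases, controls, n_controls = marker
    p1, p2 = cases / n_cases, controls / n_controls
    p = (cases + controls) / (n_cases + n_controls)           # pooled frequency
    se = sqrt(p * (1 - p) * (1 / n_cases + 1 / n_controls))
    return name, (p1 - p2) / se if se else 0.0

markers = [
    ("rs0001", 60, 100, 40, 100),  # (id, allele count in cases, n, in controls, n)
    ("rs0002", 50, 100, 50, 100),
    ("rs0003", 75, 100, 30, 100),
]

# Threads keep the sketch self-contained; a real pipeline would shard the
# marker list across processes or cluster nodes.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(two_proportion_z, markers))

significant = sorted(m for m, z in results.items() if abs(z) > 1.96)
```

    Each marker is independent, which is what makes this workload embarrassingly parallel.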

  7. Which Type of Risk Information to Use for Whom? Moderating Role of Outcome-Relevant Involvement in the Effects of Statistical and Exemplified Risk Information on Risk Perceptions.

    PubMed

    So, Jiyeon; Jeong, Se-Hoon; Hwang, Yoori

    2017-04-01

    The extant empirical research examining the effectiveness of statistical and exemplar-based health information is largely inconsistent. Under the premise that the inconsistency may be due to an unacknowledged moderator (O'Keefe, 2002), this study examined a moderating role of outcome-relevant involvement (Johnson & Eagly, 1989) in the effects of statistical and exemplified risk information on risk perception. Consistent with predictions based on the elaboration likelihood model (Petty & Cacioppo, 1984), findings from an experiment (N = 237) concerning alcohol consumption risks showed that statistical risk information predicted risk perceptions of individuals with high, rather than low, involvement, while exemplified risk information predicted risk perceptions of those with low, rather than high, involvement. Moreover, statistical risk information contributed to a negative attitude toward drinking via increased risk perception only for highly involved individuals, while exemplified risk information influenced the attitude through the same mechanism only for individuals with low involvement. Theoretical and practical implications for health risk communication are discussed.

  8. Assessing signal-to-noise in quantitative proteomics: multivariate statistical analysis in DIGE experiments.

    PubMed

    Friedman, David B

    2012-01-01

    All quantitative proteomics experiments measure variation between samples. When performing large-scale experiments that involve multiple conditions or treatments, the experimental design should include the appropriate number of individual biological replicates from each condition so that relevant biological signal can be distinguished from technical noise. Multivariate statistical analyses, such as principal component analysis (PCA), provide a global perspective on experimental variation, thereby enabling the assessment of whether the variation describes the expected biological signal or the unanticipated technical/biological noise inherent in the system. Examples will be shown from high-resolution multivariable DIGE experiments where PCA was instrumental in demonstrating biologically significant variation as well as sample outliers, fouled samples, and overriding technical variation that would not be readily observed using standard univariate tests.

  9. Teaching Highs and Lows: Exploring University Teaching Assistants' Experiences

    ERIC Educational Resources Information Center

    Green, Jennifer L.

    2010-01-01

    Recent reforms in statistics education have initiated the need to prepare graduate teaching assistants (TAs) for these changes. A focus group study explored the experiences and perceptions of University of Nebraska-Lincoln TAs. The results reinforced the idea that content, pedagogy, and technology are central aspects for teaching an introductory…

  10. Guide to Using Onionskin Analysis Code (U)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fugate, Michael Lynn; Morzinski, Jerome Arthur

    2016-09-15

    This document is a guide to using R-code written for the purpose of analyzing onionskin experiments. We expect the user to be very familiar with statistical methods and the R programming language. For more details about onionskin experiments and the statistical methods mentioned in this document see Storlie, Fugate, et al. (2013). Engineers at LANL experiment with detonators and high explosives to assess performance. The experimental unit, called an onionskin, is a hemisphere consisting of a detonator and a booster pellet surrounded by explosive material. When the detonator explodes, a streak camera mounted above the pole of the hemisphere records when the shock wave arrives at the surface. The output from the camera is a two-dimensional image that is transformed into a curve that shows the arrival time as a function of polar angle. The statistical challenge is to characterize a baseline population of arrival time curves and to compare the baseline curves to curves from a new, so-called, test series. The hope is that the new test series of curves is statistically similar to the baseline population.

  11. Whose statistical reasoning is facilitated by a causal structure intervention?

    PubMed

    McNair, Simon; Feeney, Aidan

    2015-02-01

    People often struggle when making Bayesian probabilistic estimates on the basis of competing sources of statistical evidence. Recently, Krynski and Tenenbaum (Journal of Experimental Psychology: General, 136, 430-450, 2007) proposed that a causal Bayesian framework accounts for people's errors in Bayesian reasoning and showed that, by clarifying the causal relations among the pieces of evidence, judgments on a classic statistical reasoning problem could be significantly improved. We aimed to understand whose statistical reasoning is facilitated by the causal structure intervention. In Experiment 1, although we observed causal facilitation effects overall, the effect was confined to participants high in numeracy. We did not find an overall facilitation effect in Experiment 2 but did replicate the earlier interaction between numerical ability and the presence or absence of causal content. This effect held when we controlled for general cognitive ability and thinking disposition. Our results suggest that clarifying causal structure facilitates Bayesian judgments, but only for participants with sufficient understanding of basic concepts in probability and statistics.
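    The kind of estimate participants struggle with is a straightforward application of Bayes' rule. The sketch below uses made-up illustrative numbers, not the problem from Krynski and Tenenbaum:

```python
# Bayes' rule for a base-rate problem (numbers are illustrative assumptions).
def posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test)."""
    true_pos = prior * sensitivity
    false_pos = (1 - prior) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# 1% base rate, 80% sensitivity, 9.6% false-positive rate:
p = posterior(0.01, 0.80, 0.096)  # ~0.078, far lower than most intuitive answers
```

    The low posterior despite a "positive test" is exactly the base-rate effect that causal framing is meant to make transparent.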

  12. Response properties of ON-OFF retinal ganglion cells to high-order stimulus statistics.

    PubMed

    Xiao, Lei; Gong, Han-Yan; Gong, Hai-Qing; Liang, Pei-Ji; Zhang, Pu-Ming

    2014-10-17

    Visual stimulus statistics are fundamental parameters that provide a reference for studying visual coding rules. In this study, multi-electrode extracellular recording experiments were designed and implemented on bullfrog retinal ganglion cells to explore the neural response properties to changes in stimulus statistics. Changes in low-order stimulus statistics, such as intensity and contrast, were clearly reflected in the neuronal firing rate. However, it was difficult to distinguish changes in high-order statistics, such as skewness and kurtosis, based on the neuronal firing rate alone. The neuronal temporal filtering and sensitivity characteristics were further analyzed. We observed that the peak-to-peak amplitude of the temporal filter and the neuronal sensitivity, which were obtained from either neuronal ON spikes or OFF spikes, could exhibit significant changes when the high-order stimulus statistics were changed. These results indicate that in the retina, the neuronal response properties may be reliable and powerful in carrying some complex and subtle visual information. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
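    The high-order statistics named in this record are standard sample moments. A plain-Python sketch (population formulas, no small-sample bias correction) of how they are computed:

```python
# Mean, variance, skewness and excess kurtosis from a sample.
# Population (biased) formulas -- a sketch, not a stimulus-generation routine.
from math import sqrt

def moments(xs):
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    sd = sqrt(var)
    skew = sum((x - mean) ** 3 for x in xs) / (n * sd ** 3)
    kurt = sum((x - mean) ** 4 for x in xs) / (n * var ** 2) - 3.0  # excess
    return mean, var, skew, kurt

# A symmetric sample has ~0 skewness; heavier tails raise the kurtosis.
m, v, s, k = moments([1.0, 2.0, 3.0, 4.0, 5.0])
```

    Two stimulus ensembles can share mean and contrast (variance) yet differ in skewness or kurtosis, which is what makes firing rate alone insufficient to tell them apart.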

  13. Standardized seawater rearing of chinook salmon smolts to evaluate hatchery practices showed low statistical power

    USGS Publications Warehouse

    Palmisano, Aldo N.; Elder, N.E.

    2001-01-01

    We examined, under standardized conditions, seawater survival of chinook salmon Oncorhynchus tshawytscha at the smolt stage to evaluate the experimental hatchery practices applied to their rearing. The experimental rearing practices included rearing fish at different densities; attempting to control bacterial kidney disease with broodstock segregation, erythromycin injection, and an experimental diet; rearing fish on different water sources; and freeze branding the fish. After application of experimental rearing practices in hatcheries, smolts were transported to a rearing facility for about 2-3 months of seawater rearing. Of 16 experiments, 4 yielded statistically significant differences in seawater survival. In general we found that high variability among replicates, plus the low numbers of replicates available, resulted in low statistical power. We recommend including four or five replicates and using α = 0.10 in 1-tailed tests of hatchery experiments to try to increase the statistical power to 0.80.
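    The power problem this record describes can be checked by simulation. The sketch below is illustrative: the effect size and variability are made-up values, and it uses a normal-approximation critical value (a real analysis would use the t distribution for groups this small).

```python
# Monte Carlo power estimate for a one-tailed two-sample comparison with few
# replicates at alpha = 0.10. Effect size and sd are illustrative assumptions.
import random
from math import sqrt

random.seed(42)  # make the simulation reproducible

def one_tailed_reject(a, b, z_crit=1.2816):  # z_crit ~ alpha = 0.10, one-tailed
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se = sqrt(va / na + vb / nb)
    return (mb - ma) / se > z_crit if se else False

def power(n_reps, effect, sd, trials=2000):
    hits = sum(
        one_tailed_reject([random.gauss(0, sd) for _ in range(n_reps)],
                          [random.gauss(effect, sd) for _ in range(n_reps)])
        for _ in range(trials)
    )
    return hits / trials

# With 4 replicates and sd larger than the effect, power stays well below 0.80.
low = power(4, effect=1.0, sd=1.5)
```

    Raising `n_reps` or lowering the between-replicate variability is what pushes the estimate toward the 0.80 target the authors recommend.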

  14. From Statistics to Meaning: Infants’ Acquisition of Lexical Categories

    PubMed Central

    Lany, Jill; Saffran, Jenny R.

    2013-01-01

    Infants are highly sensitive to statistical patterns in their auditory language input that mark word categories (e.g., noun and verb). However, it is unknown whether experience with these cues facilitates the acquisition of semantic properties of word categories. In a study testing this hypothesis, infants first listened to an artificial language in which word categories were reliably distinguished by statistical cues (experimental group) or in which these properties did not cue category membership (control group). Both groups were then trained on identical pairings between the words and pictures from two categories (animals and vehicles). Only infants in the experimental group learned the trained associations between specific words and pictures. Moreover, these infants generalized the pattern to include novel pairings. These results suggest that experience with statistical cues marking lexical categories sets the stage for learning the meanings of individual words and for generalizing meanings to new category members. PMID:20424058

  15. Algorithm for computing descriptive statistics for very large data sets and the exa-scale era

    NASA Astrophysics Data System (ADS)

    Beekman, Izaak

    2017-11-01

    An algorithm for Single-point, Parallel, Online, Converging Statistics (SPOCS) is presented. It is suited for in situ analysis that traditionally would be relegated to post-processing, and can be used to monitor the statistical convergence and estimate the error/residual in the quantity, which is also useful for uncertainty quantification. Today, data may be generated at an overwhelming rate by numerical simulations and proliferating sensing apparatuses in experiments and engineering applications. Monitoring descriptive statistics in real time lets costly computations and experiments be gracefully aborted if an error has occurred, and monitoring the level of statistical convergence allows them to be run for the shortest amount of time required to obtain good results. This algorithm extends work by Pébay (Sandia Report SAND2008-6212). Pébay's algorithms are recast into a converging delta formulation, with provably favorable properties. The mean, variance, covariances and arbitrary higher order statistical moments are computed in one pass. The algorithm is tested using Sillero, Jiménez, & Moser's (2013, 2014) publicly available UPM high Reynolds number turbulent boundary layer data set, demonstrating numerical robustness, efficiency and other favorable properties.
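    The one-pass idea this record builds on can be shown with the classic Welford update, which Pébay's formulas generalize to covariances and higher moments. This is a sketch of that building block, not the SPOCS implementation:

```python
# One-pass ("online") mean and variance via the Welford delta update.
# A sketch of the building block behind Pébay-style one-pass statistics.
class OnlineStats:
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations from the mean

    def push(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n          # converging "delta" update
        self.m2 += delta * (x - self.mean)   # uses the *updated* mean

    @property
    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

s = OnlineStats()
for x in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]:
    s.push(x)   # each sample is seen once and never stored
```

    Because no sample is retained, the same update can run in situ alongside a simulation, with the shrinking per-sample `delta` serving as a convergence monitor.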

  16. An experimental study of the surface elevation probability distribution and statistics of wind-generated waves

    NASA Technical Reports Server (NTRS)

    Huang, N. E.; Long, S. R.

    1980-01-01

    Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data were compared with some limited field data. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.

  17. A significant-loophole-free test of Bell's theorem with entangled photons

    NASA Astrophysics Data System (ADS)

    Giustina, Marissa; Versteegh, Marijn A. M.; Wengerowsky, Sören; Handsteiner, Johannes; Hochrainer, Armin; Phelan, Kevin; Steinlechner, Fabian; Kofler, Johannes; Larsson, Jan-Åke; Abellán, Carlos; Amaya, Waldimar; Mitchell, Morgan W.; Beyer, Jörn; Gerrits, Thomas; Lita, Adriana E.; Shalm, Lynden K.; Nam, Sae Woo; Scheidl, Thomas; Ursin, Rupert; Wittmann, Bernhard; Zeilinger, Anton

    2017-10-01

    John Bell's theorem of 1964 states that local elements of physical reality, existing independent of measurement, are inconsistent with the predictions of quantum mechanics (Bell, J. S., Physics 1, 195 (1964)). Specifically, correlations between measurement results from distant entangled systems would be smaller than predicted by quantum physics. This is expressed in Bell's inequalities. Employing modifications of Bell's inequalities, many experiments have been performed that convincingly support the quantum predictions. Yet, all experiments rely on assumptions, which provide loopholes for a local realist explanation of the measurement. Here we report an experiment with polarization-entangled photons that simultaneously closes the most significant of these loopholes. We used a highly efficient source of entangled photons, distributed them over a distance of 58.5 meters, and implemented rapid random setting generation and high-efficiency detection to observe a violation of a Bell inequality with high statistical significance. The purely statistical probability of our results occurring under local realism is less than 3.74×10^-31, corresponding to an 11.5 standard deviation effect.

  18. Linking sounds to meanings: infant statistical learning in a natural language.

    PubMed

    Hay, Jessica F; Pelucchi, Bruna; Graf Estes, Katharine; Saffran, Jenny R

    2011-09-01

    The processes of infant word segmentation and infant word learning have largely been studied separately. However, the ease with which potential word forms are segmented from fluent speech seems likely to influence subsequent mappings between words and their referents. To explore this process, we tested the link between the statistical coherence of sequences presented in fluent speech and infants' subsequent use of those sequences as labels for novel objects. Notably, the materials were drawn from a natural language unfamiliar to the infants (Italian). The results of three experiments suggest that there is a close relationship between the statistics of the speech stream and subsequent mapping of labels to referents. Mapping was facilitated when the labels contained high transitional probabilities in the forward and/or backward direction (Experiment 1). When no transitional probability information was available (Experiment 2), or when the internal transitional probabilities of the labels were low in both directions (Experiment 3), infants failed to link the labels to their referents. Word learning appears to be strongly influenced by infants' prior experience with the distribution of sounds that make up words in natural languages. Copyright © 2011 Elsevier Inc. All rights reserved.
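    The transitional probabilities this record manipulates have a simple definition: the forward TP of B given A is count(AB)/count(A) over the syllable stream. A sketch with toy syllables (illustrative, not the Italian materials used in the experiments):

```python
# Forward transitional probabilities over a syllable stream.
# TP(B|A) = count(AB) / count(A). Toy syllables are illustrative assumptions.
from collections import Counter

def forward_tps(syllables):
    pairs = Counter(zip(syllables, syllables[1:]))   # adjacent-pair counts
    firsts = Counter(syllables[:-1])                 # how often each syllable leads
    return {(a, b): c / firsts[a] for (a, b), c in pairs.items()}

stream = ["bi", "du", "pa", "bi", "du", "ko", "bi", "du", "pa"]
tps = forward_tps(stream)
# "bi" -> "du" always co-occur (TP = 1.0); "du" -> "pa" only sometimes.
```

    High-TP sequences like "bi du" behave as cohesive word candidates, which is exactly the property the label-mapping experiments exploit; a backward TP would divide by the count of the second syllable instead.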

  19. A Numerical Simulation and Statistical Modeling of High Intensity Radiated Fields Experiment Data

    NASA Technical Reports Server (NTRS)

    Smith, Laura J.

    2004-01-01

    Tests are conducted on a quad-redundant fault tolerant flight control computer to establish upset characteristics of an avionics system in an electromagnetic field. A numerical simulation and statistical model are described in this work to analyze the open loop experiment data collected in the reverberation chamber at NASA LaRC as a part of an effort to examine the effects of electromagnetic interference on fly-by-wire aircraft control systems. By comparing thousands of simulation and model outputs, the models that best describe the data are first identified, and then a systematic statistical analysis is performed on the data. These efforts are combined, culminating in an extrapolation of values that are in turn used to support previous efforts in evaluating the data.

  20. Accessible Information Without Disturbing Partially Known Quantum States on a von Neumann Algebra

    NASA Astrophysics Data System (ADS)

    Kuramochi, Yui

    2018-04-01

    This paper addresses the problem of how much information we can extract without disturbing a statistical experiment, which is a family of partially known normal states on a von Neumann algebra. We define the classical part of a statistical experiment as the restriction of the equivalent minimal sufficient statistical experiment to the center of the outcome space, which, in the case of density operators on a Hilbert space, corresponds to the classical probability distributions appearing in the maximal decomposition by Koashi and Imoto (Phys. Rev. A 66, 022318, 2002). We show that we can access by a Schwarz or completely positive channel at most the classical part of a statistical experiment if we do not disturb the states. We apply this result to the broadcasting problem of a statistical experiment. We also show that the classical part of the direct product of statistical experiments is the direct product of the classical parts of the statistical experiments. The proof of the latter result is based on the theorem that the direct product of minimal sufficient statistical experiments is also minimal sufficient.

  1. Semantically enabled and statistically supported biological hypothesis testing with tissue microarray databases

    PubMed Central

    2011-01-01

    Background Although many biological databases are applying semantic web technologies, meaningful biological hypothesis testing cannot be easily achieved. Database-driven high throughput genomic hypothesis testing requires both the capability to obtain semantically relevant experimental data and the capability to perform relevant statistical tests on the retrieved data. Tissue Microarray (TMA) data are semantically rich and contain many biologically important hypotheses waiting for high throughput conclusions. Methods An application-specific ontology was developed for managing TMA and DNA microarray databases by semantic web technologies. Data were represented as Resource Description Framework (RDF) according to the framework of the ontology. Applications for hypothesis testing (Xperanto-RDF) for TMA data were designed and implemented by (1) formulating the syntactic and semantic structures of the hypotheses derived from TMA experiments, (2) formulating SPARQLs to reflect the semantic structures of the hypotheses, and (3) performing statistical tests with the result sets returned by the SPARQLs. Results When a user designs a hypothesis in Xperanto-RDF and submits it, the hypothesis can be tested against TMA experimental data stored in Xperanto-RDF. When we evaluated four previously validated hypotheses as an illustration, all the hypotheses were supported by Xperanto-RDF. Conclusions We demonstrated the utility of high throughput biological hypothesis testing. We believe that preliminary investigation before performing highly controlled experiments can benefit from this approach. PMID:21342584

  2. Impact of audio-visual storytelling in simulation learning experiences of undergraduate nursing students.

    PubMed

    Johnston, Sandra; Parker, Christina N; Fox, Amanda

    2017-09-01

    Use of high fidelity simulation has become increasingly popular in nursing education to the extent that it is now an integral component of most nursing programs. Anecdotal evidence suggests that students have difficulty engaging with simulation manikins due to their unrealistic appearance. Introduction of the manikin as a 'real patient' with the use of an audio-visual narrative may engage students in the simulated learning experience and impact on their learning. A paucity of literature currently exists on the use of audio-visual narratives to enhance simulated learning experiences. This study aimed to determine if viewing an audio-visual narrative during a simulation pre-brief altered undergraduate nursing student perceptions of the learning experience. A quasi-experimental post-test design was utilised. A convenience sample of final year baccalaureate nursing students at a large metropolitan university. Participants completed a modified version of the Student Satisfaction with Simulation Experiences survey. This 12-item questionnaire contained questions relating to the ability to transfer skills learned in simulation to the real clinical world, the realism of the simulation and the overall value of the learning experience. Descriptive statistics were used to summarise demographic information. Two tailed, independent group t-tests were used to determine statistical differences within the categories. Findings indicated that students reported high levels of value, realism and transferability in relation to the viewing of an audio-visual narrative. Statistically significant results (t=2.38, p<0.02) were evident in the subscale of transferability of learning from simulation to clinical practice. The subgroups of age and gender although not significant indicated some interesting results. High satisfaction with simulation was indicated by all students in relation to value and realism. 
There was a significant finding in relation to transferability of knowledge, and this is vital to quality educational outcomes. Copyright © 2017. Published by Elsevier Ltd.

  3. A Virtual Study of Grid Resolution on Experiments of a Highly-Resolved Turbulent Plume

    NASA Astrophysics Data System (ADS)

    Maisto, Pietro M. F.; Marshall, Andre W.; Gollner, Michael J.; Fire Protection Engineering Department Collaboration

    2017-11-01

    An accurate representation of sub-grid scale turbulent mixing is critical for modeling fire plumes and smoke transport. In this study, PLIF and PIV diagnostics are used with the saltwater modeling technique to provide highly-resolved instantaneous field measurements in unconfined turbulent plumes useful for statistical analysis, physical insight, and model validation. The effect of resolution was investigated employing a virtual interrogation window (of varying size) applied to the high-resolution field measurements. Motivated by LES low-pass filtering concepts, the high-resolution experimental data in this study can be analyzed within the interrogation windows (i.e. statistics at the sub-grid scale) and on interrogation windows (i.e. statistics at the resolved scale). A dimensionless resolution threshold (L/D*) criterion was determined to achieve converged statistics on the filtered measurements. Such a criterion was then used to establish the relative importance between large and small-scale turbulence phenomena while investigating specific scales for the turbulent flow. First order data sets start to collapse at a resolution of 0.3D*, while for second and higher order statistical moments the interrogation window size drops down to 0.2D*.

  4. Weather extremes in very large, high-resolution ensembles: the weatherathome experiment

    NASA Astrophysics Data System (ADS)

    Allen, M. R.; Rosier, S.; Massey, N.; Rye, C.; Bowery, A.; Miller, J.; Otto, F.; Jones, R.; Wilson, S.; Mote, P.; Stone, D. A.; Yamazaki, Y. H.; Carrington, D.

    2011-12-01

    Resolution and ensemble size are often seen as alternatives in climate modelling. Models with sufficient resolution to simulate many classes of extreme weather cannot normally be run often enough to assess the statistics of rare events, still less how these statistics may be changing. As a result, assessments of the impact of external forcing on regional climate extremes must be based either on statistical downscaling from relatively coarse-resolution models, or statistical extrapolation from 10-year to 100-year events. Under the weatherathome experiment, part of the climateprediction.net initiative, we have compiled the Met Office Regional Climate Model HadRM3P to run on personal computers volunteered by the general public at 25 and 50km resolution, embedded within the HadAM3P global atmosphere model. With a global network of about 50,000 volunteers, this allows us to run time-slice ensembles of essentially unlimited size, exploring the statistics of extreme weather under a range of scenarios for surface forcing and atmospheric composition, allowing for uncertainty in both boundary conditions and model parameters. Current experiments, developed with the support of Microsoft Research, focus on three regions: the Western USA, Europe and Southern Africa. We initially simulate the period 1959-2010 to establish which variables are realistically simulated by the model and on what scales. Our next experiments focus on the Event Attribution problem, exploring how the probability of various types of extreme weather would have been different over the recent past in a world unaffected by human influence, following the design of Pall et al (2011), but extended to a longer period and higher spatial resolution. We will present the first results of this unique, global, participatory experiment and discuss the implications for the attribution of recent weather events to anthropogenic influence on climate.

  5. An Analysis of Operational Suitability for Test and Evaluation of Highly Reliable Systems

    DTIC Science & Technology

    1994-03-04

    Exposition," Journal of the American Statistical Association 59: 353-375 (June 1964). 17. SYS 229, Test and Evaluation Management Coursebook, School of Systems... in hours, θ is the desired MTBCF in hours, R is the number of critical failures, and α is the P[Type-I error] of the χ² statistic with 2*R+2... design of experiments (DOE) tables and the use of Bayesian statistics to increase the confidence level of the test results that will be obtained from

  6. Pentaquark Searches at Jlab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rossi, Patrizia

    2007-01-01

    Since the LEPS collaboration reported the first evidence for an S=+1 baryon resonance in early 2003 with a mass of 1.54 GeV, dubbed Θ+, more than ten experiments have confirmed this exotic state, among them two carried out at Jefferson Laboratory. At the same time, a number of experiments, mostly at high energies, report null results. To try to clarify this situation, during the past year the CLAS Collaboration at Jefferson Laboratory has undertaken a second-generation high-statistics experimental program to search for exotic baryons. Here the preliminary results from these experiments are reported.

  7. Benchmarking statistical averaging of spectra with HULLAC

    NASA Astrophysics Data System (ADS)

    Klapisch, Marcel; Busquet, Michel

    2008-11-01

    Knowledge of the radiative properties of hot plasmas is important for ICF, astrophysics, etc. When mid-Z or high-Z elements are present, the spectra are so complex that one commonly uses a statistically averaged description of atomic systems [1]. In a recent experiment on Fe [2], performed under controlled conditions, high-resolution transmission spectra were obtained. The new version of HULLAC [3] allows the use of the same model with different levels of detail/averaging. We will take advantage of this feature to check the effect of averaging by comparison with experiment. [1] A. Bar-Shalom, J. Oreg, and M. Klapisch, J. Quant. Spectrosc. Radiat. Transf. 65, 43 (2000). [2] J. E. Bailey, G. A. Rochau, C. A. Iglesias et al., Phys. Rev. Lett. 99, 265002-4 (2007). [3] M. Klapisch, M. Busquet, and A. Bar-Shalom, AIP Conference Proceedings 926, 206-15 (2007).

  8. Significant-Loophole-Free Test of Bell's Theorem with Entangled Photons.

    PubMed

    Giustina, Marissa; Versteegh, Marijn A M; Wengerowsky, Sören; Handsteiner, Johannes; Hochrainer, Armin; Phelan, Kevin; Steinlechner, Fabian; Kofler, Johannes; Larsson, Jan-Åke; Abellán, Carlos; Amaya, Waldimar; Pruneri, Valerio; Mitchell, Morgan W; Beyer, Jörn; Gerrits, Thomas; Lita, Adriana E; Shalm, Lynden K; Nam, Sae Woo; Scheidl, Thomas; Ursin, Rupert; Wittmann, Bernhard; Zeilinger, Anton

    2015-12-18

    Local realism is the worldview in which physical properties of objects exist independently of measurement and where physical influences cannot travel faster than the speed of light. Bell's theorem states that this worldview is incompatible with the predictions of quantum mechanics, as is expressed in Bell's inequalities. Previous experiments convincingly supported the quantum predictions. Yet, every experiment requires assumptions that provide loopholes for a local realist explanation. Here, we report a Bell test that closes the most significant of these loopholes simultaneously. Using a well-optimized source of entangled photons, rapid setting generation, and highly efficient superconducting detectors, we observe a violation of a Bell inequality with high statistical significance. The purely statistical probability of our results to occur under local realism does not exceed 3.74×10^{-31}, corresponding to an 11.5 standard deviation effect.

  9. Direct evidence for a dual process model of deductive inference.

    PubMed

    Markovits, Henry; Brunet, Marie-Laurence; Thompson, Valerie; Brisson, Janie

    2013-07-01

    In 2 experiments, we tested a strong version of a dual process theory of conditional inference (cf. Verschueren et al., 2005a, 2005b) that assumes that most reasoners have 2 strategies available, the choice of which is determined by situational variables, cognitive capacity, and metacognitive control. The statistical strategy evaluates inferences probabilistically, accepting those with high conditional probability. The counterexample strategy rejects inferences when a counterexample shows the inference to be invalid. To discriminate strategy use, we presented reasoners with conditional statements (if p, then q) and explicit statistical information about the relative frequency of the probability of p/q (50% vs. 90%). A statistical strategy would accept the more probable inferences more frequently, whereas the counterexample one would reject both. In Experiment 1, reasoners under time pressure used the statistical strategy more, but switched to the counterexample strategy when time constraints were removed; the former took less time than the latter. These data are consistent with the hypothesis that the statistical strategy is the default heuristic. Under a free-time condition, reasoners preferred the counterexample strategy and kept it when put under time pressure. Thus, it is not simply a lack of capacity that produces a statistical strategy; instead, it seems that time pressure disrupts the ability to make good metacognitive choices. In line with this conclusion, in a 2nd experiment, we measured reasoners' confidence in their performance; those under time pressure were less confident in the statistical than the counterexample strategy and more likely to switch strategies under free-time conditions. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  10. The Relative Importance of Low Significance Level and High Power in Multiple Tests of Significance.

    ERIC Educational Resources Information Center

    Westermann, Rainer; Hager, Willi

    1983-01-01

    Two psychological experiments--Anderson and Shanteau (1970) and Berkowitz and LePage (1967)--are reanalyzed to present the problem of the relative importance of low Type I error probability and high power when answering a research question by testing several statistical hypotheses. (Author/PN)

  11. Intelligent Condition Diagnosis Method Based on Adaptive Statistic Test Filter and Diagnostic Bayesian Network

    PubMed Central

    Li, Ke; Zhang, Qiuju; Wang, Kun; Chen, Peng; Wang, Huaqing

    2016-01-01

    A new fault diagnosis method for rotating machinery based on an adaptive statistic test filter (ASTF) and a Diagnostic Bayesian Network (DBN) is presented in this paper. The ASTF is proposed to extract weak fault features under background noise: it applies statistical hypothesis testing in the frequency domain to evaluate the similarity between a reference (noise) signal and the original signal, and removes the components of high similarity. The optimal significance level α is obtained using particle swarm optimization (PSO). To evaluate the performance of the ASTF, an evaluation factor Ipq is also defined, and a simulation experiment is designed to verify the effectiveness and robustness of the ASTF. A sensitivity evaluation method using principal component analysis (PCA) is proposed to rank symptom parameters (SPs) by their sensitivity for condition diagnosis, so that the SPs most sensitive for condition diagnosis can be selected. A three-layer DBN is developed to identify the condition of rotating machinery based on Bayesian Belief Network (BBN) theory. A condition diagnosis experiment on rolling element bearings demonstrates the effectiveness of the proposed method. PMID:26761006

  12. Intelligent Condition Diagnosis Method Based on Adaptive Statistic Test Filter and Diagnostic Bayesian Network.

    PubMed

    Li, Ke; Zhang, Qiuju; Wang, Kun; Chen, Peng; Wang, Huaqing

    2016-01-08

    A new fault diagnosis method for rotating machinery based on an adaptive statistic test filter (ASTF) and a Diagnostic Bayesian Network (DBN) is presented in this paper. The ASTF is proposed to extract weak fault features under background noise: it applies statistical hypothesis testing in the frequency domain to evaluate the similarity between a reference (noise) signal and the original signal, and removes the components of high similarity. The optimal significance level α is obtained using particle swarm optimization (PSO). To evaluate the performance of the ASTF, an evaluation factor Ipq is also defined, and a simulation experiment is designed to verify the effectiveness and robustness of the ASTF. A sensitivity evaluation method using principal component analysis (PCA) is proposed to rank symptom parameters (SPs) by their sensitivity for condition diagnosis, so that the SPs most sensitive for condition diagnosis can be selected. A three-layer DBN is developed to identify the condition of rotating machinery based on Bayesian Belief Network (BBN) theory. A condition diagnosis experiment on rolling element bearings demonstrates the effectiveness of the proposed method.
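    The paper defines the ASTF precisely; as a rough, hypothetical illustration of the general idea only (a per-bin comparison of a signal spectrum against a noise reference, with noise-like components removed), consider this stdlib-only sketch, where the fixed threshold factor `alpha` stands in for the PSO-optimized significance level:

    ```python
    import cmath
    import math
    import random

    def dft_mag(x):
        """Naive DFT magnitude spectrum (stdlib only; real code would use an FFT)."""
        n = len(x)
        return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n)))
                for k in range(n // 2)]

    def similarity_filter(signal, noise_ref, alpha=2.0):
        """Suppress spectral bins whose energy is comparable to the noise reference.

        A crude stand-in for a per-bin hypothesis test: keep a bin only if the
        signal clearly exceeds alpha times the reference noise level there.
        """
        s_mag, n_mag = dft_mag(signal), dft_mag(noise_ref)
        return [s if s > alpha * max(n, 1e-12) else 0.0 for s, n in zip(s_mag, n_mag)]

    random.seed(0)
    N = 128
    noise_ref = [random.gauss(0.0, 1.0) for _ in range(N)]
    # weak 10-cycle tone buried in fresh background noise
    signal = [2.0 * math.sin(2 * math.pi * 10 * t / N) + random.gauss(0.0, 1.0)
              for t in range(N)]
    filtered = similarity_filter(signal, noise_ref)
    peak_bin = max(range(len(filtered)), key=lambda k: filtered[k])
    print(peak_bin)
    ```

    The tone at bin 10 should survive the filter while noise-like bins are suppressed; in the actual method, the retained threshold is derived from a statistical test at the optimized significance level rather than a fixed factor.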

  13. The Roles of Experience, Gender, and Individual Differences in Statistical Reasoning

    ERIC Educational Resources Information Center

    Martin, Nadia; Hughes, Jeffrey; Fugelsang, Jonathan

    2017-01-01

    We examine the joint effects of gender and experience on statistical reasoning. Participants with various levels of experience in statistics completed the Statistical Reasoning Assessment (Garfield, 2003), along with individual difference measures assessing cognitive ability and thinking dispositions. Although the performance of both genders…

  14. Data mining and statistical inference in selective laser melting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamath, Chandrika

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes with added complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.

  15. Data mining and statistical inference in selective laser melting

    DOE PAGES

    Kamath, Chandrika

    2016-01-11

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes with added complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.

  16. Comparison of the Earth's high-latitude disturbances with energetic electrons measured by the ERG/Arase satellite

    NASA Astrophysics Data System (ADS)

    Chiang, C. Y.; Chang, T. F.; Tam, S. W. Y.; Syugu, W. J.; Kazama, Y.; Wang, B. J.; Wang, S. Y.; Kasahara, S.; Yokota, S.; Hori, T.; Yoshizumi, M.; Shinohara, I.

    2017-12-01

    The Exploration of energization and Radiation in Geospace (ERG) satellite was successfully launched from the Uchinoura Space Center in December 2016. The main goal of the ERG project is to elucidate the acceleration and loss mechanisms of relativistic electrons in the radiation belts. In addition, the apogee of the ERG satellite's orbit often exceeds the edge of the outer radiation belt in radial distance, so data measured in this higher-L region may be associated with activities observed in the Earth's high-latitude region. We statistically compare the Auroral Electrojet (AE) index with data measured by the Low-Energy Particle Experiments - Electron Analyzer (LEP-e) and the Medium-Energy Particle Experiments - Electron Analyzer (MEP-e) onboard the ERG satellite over the past months. With the data selected for L > 7, we statistically investigate the contributions of different electron energies observed in various magnetic local time (MLT) sectors to the Earth's high-latitude disturbances.

  17. Interaction effects of metals and salinity on biodegradation of a complex hydrocarbon waste.

    PubMed

    Amatya, Prasanna L; Hettiaratchi, Joseph Patrick A; Joshi, Ramesh C

    2006-02-01

    The presence of high levels of salts from the disposal of produced brine water at flare pits, and of metals at concentrations sufficient to impact microbial activity, is of concern for the bioremediation of flare pit waste in the upstream oil and gas industry. Two slurry-phase biotreatment experiments based on a three-level factorial statistical experimental design were conducted with a flare pit waste. The experiments separately studied the primary effect of cadmium [Cd(II)] and the Cd(II)-salinity interaction, and the primary effect of zinc [Zn(II)] and the Zn(II)-salinity interaction, on hydrocarbon biodegradation. The results showed 42-52.5% hydrocarbon removal in slurries spiked with Cd and 47-62.5% in slurries spiked with Zn. The analysis of variance showed that the primary effect of Cd and the Cd-salinity interaction were statistically significant for hydrocarbon degradation. The primary effect of Zn and the Zn-salinity interaction were statistically insignificant, whereas the quadratic effect of Zn was highly significant. The study of the effects of metallic chloro-complexes showed that the total aqueous concentration of Cd or Zn does not give a reliable indication of overall toxicity to microbial activity in the presence of high salinity levels.

  18. An Analysis Methodology for the Gamma-ray Large Area Space Telescope

    NASA Technical Reports Server (NTRS)

    Morris, Robin D.; Cohen-Tanugi, Johann

    2004-01-01

    The Large Area Telescope (LAT) instrument on the Gamma Ray Large Area Space Telescope (GLAST) has been designed to detect high-energy gamma rays and determine their direction of incidence and energy. We propose a reconstruction algorithm based on recent advances in statistical methodology. This method, an alternative to the standard event analysis inherited from high-energy collider physics experiments, incorporates more accurately the physical processes occurring in the detector and makes full use of the statistical information available. It could thus provide a better estimate of the direction and energy of the primary photon.

  19. Operating Experience Review of the INL HTE Gas Monitoring System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    L. C. Cadwallader; K. G. DeWall

    2010-06-01

    This paper describes the operation of several types of gas monitors in use at the Idaho National Laboratory (INL) High Temperature Electrolysis Experiment (HTE) laboratory. The gases monitored are hydrogen, carbon monoxide, carbon dioxide, and oxygen. The operating time, calibration session durations, and unwanted alarms are described. Some simple statistics are given for the reliability of these monitors, and the results are compared to operating experience with other types of monitors.

  20. 3-D High-Lift Flow-Physics Experiment - Transition Measurements

    NASA Technical Reports Server (NTRS)

    McGinley, Catherine B.; Jenkins, Luther N.; Watson, Ralph D.; Bertelrud, Arild

    2005-01-01

    An analysis of the flow state on a trapezoidal wing model from the NASA 3-D High Lift Flow Physics Experiment is presented. The objective of the experiment was to characterize the flow over a non-proprietary semi-span three-element high-lift configuration to aid in assessing the state of the art in the computation of three-dimensional high-lift flows. Surface pressures and hot-film sensors are used to determine the flow conditions on the slat, main element, and flap. The locations of the attachment lines and the values of the attachment-line Reynolds number are estimated from the model surface pressures. Data from the hot-films are used to determine whether the flow is laminar, transitional, or turbulent by examining the hot-film time histories, statistics, and frequency spectra.

  1. Hadoop and friends - first experience at CERN with a new platform for high throughput analysis steps

    NASA Astrophysics Data System (ADS)

    Duellmann, D.; Surdy, K.; Menichetti, L.; Toebbicke, R.

    2017-10-01

    The statistical analysis of infrastructure metrics comes with several specific challenges, including the fairly large volume of unstructured metrics from a large set of independent data sources. Hadoop and Spark provide an ideal environment, in particular for the first steps of skimming rapidly through hundreds of TB of low-relevance data to find and extract the much smaller data volume that is relevant for statistical analysis and modelling. This presentation will describe the new Hadoop service at CERN and the use of several of its components for high-throughput data aggregation and ad-hoc pattern searches. We will describe the hardware setup used, the service structure with a small set of decoupled clusters, and our first experience with co-hosting different applications and performing software upgrades. We will further detail the common infrastructure used for data extraction and preparation from continuous monitoring and database input sources.

  2. Mutual interference between statistical summary perception and statistical learning.

    PubMed

    Zhao, Jiaying; Ngo, Nhi; McKendrick, Ryan; Turk-Browne, Nicholas B

    2011-09-01

    The visual system is an efficient statistician, extracting statistical summaries over sets of objects (statistical summary perception) and statistical regularities among individual objects (statistical learning). Although these two kinds of statistical processing have been studied extensively in isolation, their relationship is not yet understood. We first examined how statistical summary perception influences statistical learning by manipulating the task that participants performed over sets of objects containing statistical regularities (Experiment 1). Participants who performed a summary task showed no statistical learning of the regularities, whereas those who performed control tasks showed robust learning. We then examined how statistical learning influences statistical summary perception by manipulating whether the sets being summarized contained regularities (Experiment 2) and whether such regularities had already been learned (Experiment 3). The accuracy of summary judgments improved when regularities were removed and when learning had occurred in advance. In sum, calculating summary statistics impeded statistical learning, and extracting statistical regularities impeded statistical summary perception. This mutual interference suggests that statistical summary perception and statistical learning are fundamentally related.

  3. Study of Isospin Correlation in High Energy Heavy Ion Interactions with the RHIC PHENIX. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, Y.

    This report describes the research work performed under the support of DOE research grant E-FG02-97ER4108. The work is composed of three parts: (1) visual analysis and quality control of the Micro Vertex Detector (MVD) of the PHENIX experiment, carried out at Brookhaven National Laboratory; (2) continuation of the data analysis of the EMU05/09/16 experiments for the study of inclusive particle production spectra and multi-particle correlations; and (3) exploration of a new statistical means of studying very high-multiplicity nuclear-particle ensembles, and the prospects for applying it to higher-energy experiments.

  4. Built-Up Area Detection from High-Resolution Satellite Images Using Multi-Scale Wavelet Transform and Local Spatial Statistics

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Zhang, Y.; Gao, J.; Yuan, Y.; Lv, Z.

    2018-04-01

    Recently, built-up area detection from high-resolution satellite images (HRSI) has attracted increasing attention because HRSI can provide more detailed object information. In this paper, a multi-scale wavelet transform and a local spatial autocorrelation statistic are introduced to model the spatial patterns of built-up areas. First, the input image is decomposed into high- and low-frequency subbands by a wavelet transform at three levels. Then the high-frequency detail information in three directions (horizontal, vertical, and diagonal) is extracted, followed by a maximization operation to integrate the information from all directions. Afterward, a cross-scale operation is implemented to fuse the different levels of information. Finally, the local spatial autocorrelation statistic is applied to enhance the saliency of built-up features, and an adaptive threshold algorithm is used to detect the built-up areas. Experiments are conducted on ZY-3 and Quickbird panchromatic satellite images, and the results show that the proposed method is very effective for built-up area detection.
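    The local spatial autocorrelation step can be sketched in simplified form. The following is not the authors' algorithm; it is a minimal Getis-Ord-style local z-score on a toy "detail energy" grid, assuming a square neighborhood of radius `r`:

    ```python
    import statistics

    def local_z(grid, r=1):
        """Z-score of each cell's neighborhood mean against the global mean
        (a simplified Getis-Ord-style local spatial statistic)."""
        h, w = len(grid), len(grid[0])
        flat = [v for row in grid for v in row]
        mu, sd = statistics.mean(flat), statistics.pstdev(flat)
        out = [[0.0] * w for _ in range(h)]
        for i in range(h):
            for j in range(w):
                nb = [grid[y][x]
                      for y in range(max(0, i - r), min(h, i + r + 1))
                      for x in range(max(0, j - r), min(w, j + r + 1))]
                out[i][j] = (statistics.mean(nb) - mu) / (sd / len(nb) ** 0.5)
        return out

    # toy high-frequency "detail energy" map: a bright 2x2 patch (built-up area)
    # in a flat background
    grid = [[5.0 if (i in (1, 2) and j in (1, 2)) else 1.0 for j in range(6)]
            for i in range(6)]
    z = local_z(grid)
    print(z[1][1] > z[4][4])  # saliency is enhanced inside the patch
    ```

    Thresholding such a z-map (adaptively in the paper) then yields the detected built-up regions.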

  5. How to inhibit a distractor location? Statistical learning versus active, top-down suppression.

    PubMed

    Wang, Benchi; Theeuwes, Jan

    2018-05-01

    Recently, Wang and Theeuwes (Journal of Experimental Psychology: Human Perception and Performance, 44(1), 13-17, 2018a) demonstrated the role of lingering selection biases in an additional singleton search task in which the distractor singleton appeared much more often in one location than in all other locations. For this location, there was less capture and selection efficiency was reduced. It was argued that statistical learning induces plasticity within the spatial priority map such that particular locations that are highly likely to contain a distractor are suppressed relative to all other locations. The current study replicated these findings regarding statistical learning (Experiment 1) and investigated whether similar effects can be obtained by cueing the distractor location in a top-down way on a trial-by-trial basis. The results show that top-down cueing of the distractor location with long (1,500 ms; Experiment 2) and short (600 ms; Experiment 3) stimulus-onset asynchronies (SOAs) does not result in suppression: neither the amount of capture nor the efficiency of selection was affected by the cue. If anything, we found an attentional benefit (instead of suppression) for the short SOA. We argue that through statistical learning, weights within the attentional priority map are changed such that a location likely to contain a salient distractor is suppressed relative to all other locations. Our cueing experiments show that this effect cannot be accomplished by active, top-down suppression. Consequences for recent theories of distractor suppression are discussed.

  6. Towards Direct Simulation of Future Tropical Cyclone Statistics in a High-Resolution Global Atmospheric Model

    DOE PAGES

    Wehner, Michael F.; Bala, G.; Duffy, Phillip; ...

    2010-01-01

    We present a set of high-resolution global atmospheric general circulation model (AGCM) simulations focusing on the model's ability to represent tropical storms and their statistics. We find that the model produces storms of hurricane strength with realistic dynamical features. We also find that tropical storm statistics are reasonable, both globally and in the north Atlantic, when compared to recent observations. The sensitivity of simulated tropical storm statistics to increases in sea surface temperature (SST) is also investigated, revealing that a credible late 21st century SST increase produced increases in simulated tropical storm numbers and intensities in all ocean basins. While this paper supports previous high-resolution model and theoretical findings that the frequency of very intense storms will increase in a warmer climate, it differs notably from previous medium- and high-resolution model studies that show a global reduction in total tropical storm frequency. However, we are quick to point out that this particular model finding remains speculative due to a lack of radiative forcing changes in our time-slice experiments as well as a focus on the Northern hemisphere tropical storm seasons.

  7. Counting statistics of chaotic resonances at optical frequencies: Theory and experiments

    NASA Astrophysics Data System (ADS)

    Lippolis, Domenico; Wang, Li; Xiao, Yun-Feng

    2017-07-01

    A deformed dielectric microcavity is used as an experimental platform for the analysis of the statistics of chaotic resonances, with a view to testing fractal Weyl laws at optical frequencies. To surmount the difficulties that arise from reading strongly overlapping spectra, we exploit the mixed nature of the phase space at hand and directly count only the high-Q whispering-gallery modes (WGMs). That enables us to draw statistical information on the more lossy chaotic resonances, which are coupled to the high-Q regular modes via dynamical tunneling. Three different models [classical, Random-Matrix-Theory (RMT) based, and semiclassical] for interpreting the experimental data are discussed. On the basis of least-squares analysis, theoretical estimates of the Ehrenfest time, and independent measurements, we find that a semiclassically modified RMT-based expression best describes the experiment in all its realizations, particularly when the resonator is coupled to visible light, while RMT alone still works quite well in the infrared. In this work we reexamine and substantially extend the results of a short paper published earlier [L. Wang et al., Phys. Rev. E 93, 040201(R) (2016), 10.1103/PhysRevE.93.040201].

  8. Machine learning patterns for neuroimaging-genetic studies in the cloud.

    PubMed

    Da Mota, Benoit; Tudoran, Radu; Costan, Alexandru; Varoquaux, Gaël; Brasche, Goetz; Conrod, Patricia; Lemaitre, Herve; Paus, Tomas; Rietschel, Marcella; Frouin, Vincent; Poline, Jean-Baptiste; Antoniu, Gabriel; Thirion, Bertrand

    2014-01-01

    Brain imaging is a natural intermediate phenotype for understanding the link between genetic information and behavior or brain pathology risk factors. Massive efforts have been made in the last few years to acquire high-dimensional neuroimaging and genetic data on large cohorts of subjects. The statistical analysis of such data is carried out with increasingly sophisticated techniques and represents a great computational challenge. Fortunately, increasing computational power in distributed architectures can be harnessed, if new neuroinformatics infrastructures are designed and training to use these new tools is provided. Combining a MapReduce framework (TomusBLOB) with machine learning algorithms (the Scikit-learn library), we design a scalable analysis tool that can deal with non-parametric statistics on high-dimensional data. End-users describe the statistical procedure to perform and can then test the model on their own computers before running the very same code in the cloud at a larger scale. We illustrate the potential of our approach on real data with an experiment showing how the functional signal in subcortical brain regions can be significantly fit with genome-wide genotypes. This experiment demonstrates the scalability and reliability of our framework in the cloud with a two-week deployment on hundreds of virtual machines.
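    As a toy illustration of the kind of non-parametric statistic such a framework parallelizes (a sketch only, not the paper's TomusBLOB/Scikit-learn pipeline; the data here are simulated), consider a two-sample permutation test in stdlib Python:

    ```python
    import random
    import statistics

    def permutation_test(x, y, n_perm=2000, seed=1):
        """Two-sample permutation test on the difference of means.

        Non-parametric: the p-value is the fraction of random relabelings whose
        mean difference is at least as extreme as the observed one.
        """
        rng = random.Random(seed)
        observed = abs(statistics.mean(x) - statistics.mean(y))
        pooled, n = list(x) + list(y), len(x)
        hits = 0
        for _ in range(n_perm):
            rng.shuffle(pooled)
            if abs(statistics.mean(pooled[:n]) - statistics.mean(pooled[n:])) >= observed:
                hits += 1
        return (hits + 1) / (n_perm + 1)   # add-one correction avoids p = 0

    rng = random.Random(0)
    x = [rng.gauss(1.5, 1.0) for _ in range(30)]   # clearly shifted group
    y = [rng.gauss(0.0, 1.0) for _ in range(30)]
    print(permutation_test(x, y) < 0.05)
    ```

    In a genome-wide setting this inner loop is repeated for huge numbers of phenotype/genotype pairs, which is why a MapReduce-style split across machines pays off.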

  9. Sb2Te3 and Its Superlattices: Optimization by Statistical Design.

    PubMed

    Behera, Jitendra K; Zhou, Xilin; Ranjan, Alok; Simpson, Robert E

    2018-05-02

    The objective of this work is to demonstrate the usefulness of fractional factorial design for optimizing the crystal quality of chalcogenide van der Waals (vdW) crystals. We statistically analyze the growth parameters of highly c-axis-oriented Sb2Te3 crystals and Sb2Te3-GeTe phase change vdW heterostructured superlattices. The statistical significance of the growth parameters of temperature, pressure, power, buffer material, and buffer layer thickness was found by fractional factorial design and response surface analysis. Temperature, pressure, power, and their second-order interactions are the major factors that significantly influence the quality of the crystals. Additionally, using tungsten rather than molybdenum as a buffer layer significantly enhances the crystal quality. Fractional factorial design minimizes the number of experiments that are necessary to find the optimal growth conditions, resulting in an order of magnitude improvement in the crystal quality. We highlight that statistical design-of-experiments methods, which are more commonly used in product design, should be considered more broadly by those designing and optimizing materials.
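    A fractional factorial design of this kind can be generated mechanically. This sketch builds a 2^(5-2) design for the five growth factors named in the abstract; the generators D = AB and E = AC are illustrative, not taken from the paper:

    ```python
    from itertools import product

    # 2^(5-2) fractional factorial: vary A, B, C freely; alias D = A*B and E = A*C.
    # Factor names follow the abstract; the generator choice here is illustrative.
    factors = ["temperature", "pressure", "power", "buffer_material", "buffer_thickness"]
    runs = [dict(zip(factors, (a, b, c, a * b, a * c)))
            for a, b, c in product((-1, 1), repeat=3)]

    print(len(runs))  # 8 runs instead of the 32 of a full 2^5 factorial
    ```

    Each -1/+1 level is then mapped to a concrete low/high setting of the corresponding growth parameter, and the response surface is fit to the measured crystal quality.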

  10. Climate Change Conceptual Change: Scientific Information Can Transform Attitudes.

    PubMed

    Ranney, Michael Andrew; Clark, Dav

    2016-01-01

    Of this article's seven experiments, the first five demonstrate that virtually no Americans know the basic global warming mechanism. Fortunately, Experiments 2-5 found that 2-45 min of physical-chemical climate instruction durably increased such understandings. This mechanistic learning, or merely receiving seven highly germane statistical facts (Experiment 6), also increased climate-change acceptance--across the liberal-conservative spectrum. However, Experiment 7's misleading statistics decreased such acceptance (and, dramatically, knowledge confidence). These readily available attitudinal and conceptual changes through scientific information disconfirm what we term "stasis theory"--which some researchers and many laypeople varyingly maintain. Stasis theory subsumes the claim that informing people (particularly Americans) about climate science may be largely futile or even counterproductive--a view that appears historically naïve, suffers from range restrictions (e.g., near-zero mechanistic knowledge), and/or misinterprets some polarization and (noncausal) correlational data. Our studies evidenced no polarizations. Finally, we introduce HowGlobalWarmingWorks.org--a website designed to directly enhance public "climate-change cognition." Copyright © 2016 Cognitive Science Society, Inc.

  11. Patterns of shading tolerance determined from experimental light reduction studies of seagrasses

    EPA Science Inventory

    An extensive review of the experimental literature on seagrass shading evaluated the relationship between experimental light reductions, duration of experiment and seagrass response metrics to determine whether there were consistent statistical patterns. There were highly signif...

  12. Concurrent Movement Impairs Incidental but Not Intentional Statistical Learning

    ERIC Educational Resources Information Center

    Stevens, David J.; Arciuli, Joanne; Anderson, David I.

    2015-01-01

    The effect of concurrent movement on incidental versus intentional statistical learning was examined in two experiments. In Experiment 1, participants learned the statistical regularities embedded within familiarization stimuli implicitly, whereas in Experiment 2 they were made aware of the embedded regularities and were instructed explicitly to…

  13. Constructing a Reward-Related Quality of Life Statistic in Daily Life—a Proof of Concept Study Using Positive Affect

    PubMed Central

    Verhagen, Simone J. W.; Simons, Claudia J. P.; van Zelst, Catherine; Delespaul, Philippe A. E. G.

    2017-01-01

    Background: Mental healthcare needs person-tailored interventions. Experience Sampling Method (ESM) can provide daily life monitoring of personal experiences. This study aims to operationalize and test a measure of momentary reward-related Quality of Life (rQoL). Intuitively, quality of life improves by spending more time on rewarding experiences. ESM clinical interventions can use this information to coach patients to find a realistic, optimal balance of positive experiences (maximize reward) in daily life. rQoL combines the frequency of engaging in a relevant context (a ‘behavior setting’) with concurrent (positive) affect. High rQoL occurs when the most frequent behavior settings are combined with positive affect or infrequent behavior settings co-occur with low positive affect. Methods: Resampling procedures (Monte Carlo experiments) were applied to assess the reliability of rQoL using various behavior setting definitions under different sampling circumstances, for real or virtual subjects with low-, average- and high contextual variability. Furthermore, resampling was used to assess whether rQoL is a distinct concept from positive affect. Virtual ESM beep datasets were extracted from 1,058 valid ESM observations for virtual and real subjects. Results: Behavior settings defined by Who-What contextual information were most informative. Simulations of at least 100 ESM observations are needed for reliable assessment. Virtual ESM beep datasets of a real subject can be defined by Who-What-Where behavior setting combinations. Large sample sizes are necessary for reliable rQoL assessments, except for subjects with low contextual variability. rQoL is distinct from positive affect. Conclusion: rQoL is a feasible concept. Monte Carlo experiments should be used to assess the reliable implementation of an ESM statistic. Future research in ESM should assess the behavior of summary statistics under different sampling situations. This exploration is especially relevant in clinical implementation, where often only small datasets are available. PMID:29163294
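    The resampling idea can be sketched concretely. The following toy Monte Carlo experiment (a heavily simplified stand-in, not the paper's exact rQoL definition; the behavior-setting names and affect levels are invented) draws virtual ESM datasets of different sizes from one subject's pool of (behavior setting, positive affect) beeps and shows that the summary statistic stabilizes as the number of observations grows:

    ```python
    import random
    import statistics

    def rqol(beeps):
        """Toy reward-related QoL: frequency-weighted mean positive affect per
        behavior setting (a simplification of the paper's rQoL measure)."""
        by_setting = {}
        for setting, affect in beeps:
            by_setting.setdefault(setting, []).append(affect)
        total = sum(len(v) for v in by_setting.values())
        return sum(len(v) / total * statistics.mean(v) for v in by_setting.values())

    def resample_spread(pool, n_beeps, n_sim=500, seed=7):
        """Monte Carlo experiment: spread of the statistic over virtual ESM
        datasets of n_beeps observations resampled from one subject's pool."""
        rng = random.Random(seed)
        sims = [rqol([rng.choice(pool) for _ in range(n_beeps)])
                for _ in range(n_sim)]
        return statistics.stdev(sims)

    # hypothetical subject: three Who-What behavior settings with different
    # typical levels of positive affect
    rng = random.Random(0)
    means = {"home/chores": 3.0, "work/meeting": 4.0, "friends/leisure": 6.0}
    pool = [(s, rng.gauss(means[s], 1.0)) for s in means for _ in range(120)]

    print(resample_spread(pool, 30) > resample_spread(pool, 150))  # more beeps, more stable
    ```

    The same pattern of simulation is what lets the authors ask how many real beeps are needed before the statistic becomes reliable for a given subject.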

  14. Constructing a Reward-Related Quality of Life Statistic in Daily Life-a Proof of Concept Study Using Positive Affect.

    PubMed

    Verhagen, Simone J W; Simons, Claudia J P; van Zelst, Catherine; Delespaul, Philippe A E G

    2017-01-01

    Background: Mental healthcare needs person-tailored interventions. Experience Sampling Method (ESM) can provide daily life monitoring of personal experiences. This study aims to operationalize and test a measure of momentary reward-related Quality of Life (rQoL). Intuitively, quality of life improves by spending more time on rewarding experiences. ESM clinical interventions can use this information to coach patients to find a realistic, optimal balance of positive experiences (maximize reward) in daily life. rQoL combines the frequency of engaging in a relevant context (a 'behavior setting') with concurrent (positive) affect. High rQoL occurs when the most frequent behavior settings are combined with positive affect or infrequent behavior settings co-occur with low positive affect. Methods: Resampling procedures (Monte Carlo experiments) were applied to assess the reliability of rQoL using various behavior setting definitions under different sampling circumstances, for real or virtual subjects with low-, average- and high contextual variability. Furthermore, resampling was used to assess whether rQoL is a distinct concept from positive affect. Virtual ESM beep datasets were extracted from 1,058 valid ESM observations for virtual and real subjects. Results: Behavior settings defined by Who-What contextual information were most informative. Simulations of at least 100 ESM observations are needed for reliable assessment. Virtual ESM beep datasets of a real subject can be defined by Who-What-Where behavior setting combinations. Large sample sizes are necessary for reliable rQoL assessments, except for subjects with low contextual variability. rQoL is distinct from positive affect. Conclusion: rQoL is a feasible concept. Monte Carlo experiments should be used to assess the reliable implementation of an ESM statistic. Future research in ESM should assess the behavior of summary statistics under different sampling situations. This exploration is especially relevant in clinical implementation, where often only small datasets are available.

  15. Residency Program Directors' View on the Value of Teaching.

    PubMed

    Korte, Catherine; Smith, Andrew; Pace, Heather

    2016-08-01

There is no standardization of teaching activities, nor a requirement for residency programs to offer specific teaching programs to pharmacy residents. This study determines the perceived value of providing teaching opportunities to postgraduate year 1 (PGY-1) pharmacy residents from the perspective of the residency program director, and identifies the features, depth, and breadth of the teaching experiences afforded to PGY-1 pharmacy residents. A 20-question survey was distributed electronically to 868 American Society of Health-System Pharmacists-accredited PGY-1 residency program directors; 322 program directors completed it. Developing pharmacy educators was highly valued by 57% of the program directors. When program demographics were compared, advertisement of teaching opportunities was statistically significantly associated with program directors who placed a high value on providing teaching opportunities. Statistically significant differences were also identified associating development of a teaching portfolio, evaluation of Advanced Pharmacy Practice Experiences students, and delivery of didactic lectures with program directors who highly value developing pharmacy educators. Future residency candidates interested in teaching or a career in academia may use these findings to identify programs that are more likely to value developing pharmacy educators. The implementation of a standardized teaching experience among all programs may be difficult. © The Author(s) 2015.

  16. Economic Statistical Design of Integrated X-bar-S Control Chart with Preventive Maintenance and General Failure Distribution

    PubMed Central

    Caballero Morales, Santiago Omar

    2013-01-01

The application of Preventive Maintenance (PM) and Statistical Process Control (SPC) is an important practice for achieving high product quality, a low frequency of failures, and cost reduction in a production process. However, some aspects of their joint application have not been explored in depth. First, most SPC is performed with the X-bar control chart, which does not fully consider the variability of the production process. Second, many studies on the design of control charts consider only the economic aspect, whereas statistical restrictions must be considered to achieve charts with low probabilities of false detection of failures. Third, the effect of PM on processes with different failure probability distributions has not been studied. Hence, this paper covers these points, presenting the Economic Statistical Design (ESD) of joint X-bar-S control charts with a cost model that integrates PM with a general failure distribution. Experiments showed statistically significant reductions in costs when PM is performed on processes with high failure rates, as well as reductions in the sampling frequency of units for testing under SPC. PMID:23527082
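The statistical side of a joint X-bar-S chart can be sketched with the standard 3-sigma limits; this uses the textbook c4/A3/B3/B4 constants rather than the paper's economic cost model, and the subgroup data are made up.

```python
import math

def xbar_s_limits(subgroups):
    """3-sigma control limits for joint X-bar and S charts, computed
    from rational subgroups of equal size."""
    n = len(subgroups[0])
    k = len(subgroups)
    means = [sum(g) / n for g in subgroups]
    sds = [math.sqrt(sum((x - m) ** 2 for x in g) / (n - 1))
           for g, m in zip(subgroups, means)]
    xbarbar = sum(means) / k
    sbar = sum(sds) / k
    # c4 corrects the bias of the sample standard deviation
    c4 = math.sqrt(2 / (n - 1)) * math.gamma(n / 2) / math.gamma((n - 1) / 2)
    a3 = 3 / (c4 * math.sqrt(n))
    b3 = max(0.0, 1 - 3 * math.sqrt(1 - c4 ** 2) / c4)
    b4 = 1 + 3 * math.sqrt(1 - c4 ** 2) / c4
    return {"xbar": (xbarbar - a3 * sbar, xbarbar + a3 * sbar),
            "s": (b3 * sbar, b4 * sbar)}

limits = xbar_s_limits([[10.1, 9.9, 10.0, 10.2, 9.8],
                        [10.0, 10.1, 9.9, 10.0, 10.1],
                        [9.9, 10.0, 10.2, 9.8, 10.1]])
```

For subgroup size n = 5 the constants evaluate to c4 ≈ 0.940, B3 = 0 and B4 ≈ 2.089, matching the published SPC tables.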

  17. Statistical Learning in a Natural Language by 8-Month-Old Infants

    PubMed Central

    Pelucchi, Bruna; Hay, Jessica F.; Saffran, Jenny R.

    2013-01-01

    Numerous studies over the past decade support the claim that infants are equipped with powerful statistical language learning mechanisms. The primary evidence for statistical language learning in word segmentation comes from studies using artificial languages, continuous streams of synthesized syllables that are highly simplified relative to real speech. To what extent can these conclusions be scaled up to natural language learning? In the current experiments, English-learning 8-month-old infants’ ability to track transitional probabilities in fluent infant-directed Italian speech was tested (N = 72). The results suggest that infants are sensitive to transitional probability cues in unfamiliar natural language stimuli, and support the claim that statistical learning is sufficiently robust to support aspects of real-world language acquisition. PMID:19489896

  18. Statistical learning in a natural language by 8-month-old infants.

    PubMed

    Pelucchi, Bruna; Hay, Jessica F; Saffran, Jenny R

    2009-01-01

    Numerous studies over the past decade support the claim that infants are equipped with powerful statistical language learning mechanisms. The primary evidence for statistical language learning in word segmentation comes from studies using artificial languages, continuous streams of synthesized syllables that are highly simplified relative to real speech. To what extent can these conclusions be scaled up to natural language learning? In the current experiments, English-learning 8-month-old infants' ability to track transitional probabilities in fluent infant-directed Italian speech was tested (N = 72). The results suggest that infants are sensitive to transitional probability cues in unfamiliar natural language stimuli, and support the claim that statistical learning is sufficiently robust to support aspects of real-world language acquisition.
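The transitional-probability cue at the heart of these segmentation studies is straightforward to compute. A toy example with an invented two-word "language" (not the Italian stimuli used in the study): within-word transitions are perfectly predictable, while transitions across word boundaries are not.

```python
import random
from collections import Counter

def transitional_probabilities(syllables):
    """TP(y | x) = count(x followed by y) / count(x): high within
    words, low across word boundaries."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {(x, y): c / first_counts[x] for (x, y), c in pair_counts.items()}

# Toy "language": two bisyllabic words concatenated in random order
rng = random.Random(0)
words = [("bi", "du"), ("pa", "go")]
stream = [syl for _ in range(200) for syl in rng.choice(words)]
tp = transitional_probabilities(stream)
# Within-word TP is exactly 1.0; across a word boundary it is near 0.5
```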

  19. A pedagogical approach to the Boltzmann factor through experiments and simulations

    NASA Astrophysics Data System (ADS)

    Battaglia, O. R.; Bonura, A.; Sperandeo-Mineo, R. M.

    2009-09-01

The Boltzmann factor is the basis of a huge amount of thermodynamic and statistical physics, both classical and quantum. It governs the behaviour of all systems in nature that exchange energy with their environment. Understanding why the expression has this specific form involves a deep mathematical analysis whose flow of logic is hard to see and is beyond the preparation of high school or college students. We here present some experiments and simulations aimed at directly deriving its mathematical expression and illustrating the fundamental concepts on which it is grounded. The experiments use easily available apparatuses, and the simulations are developed in the NetLogo environment, which, besides having a user-friendly interface, allows easy interaction with the algorithm. The approach supplies pedagogical support for the introduction of the Boltzmann factor at the undergraduate level to students without a background in statistical mechanics.
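A simulation of the kind described can be sketched in a few lines (here in Python rather than NetLogo; the agent counts and step numbers are illustrative): agents repeatedly exchange single energy quanta, and the energy histogram relaxes toward the exponential (Boltzmann) form.

```python
import random
from collections import Counter

def energy_exchange(n_agents=1000, quanta_each=5, steps=300000, seed=3):
    """At each step a randomly chosen agent, if it has any energy,
    gives one quantum to another random agent.  The equilibrium
    energy histogram approaches the Boltzmann form p(E) ~ exp(-E/<E>)."""
    rng = random.Random(seed)
    energy = [quanta_each] * n_agents
    for _ in range(steps):
        giver = rng.randrange(n_agents)
        if energy[giver] > 0:
            energy[giver] -= 1
            energy[rng.randrange(n_agents)] += 1
    return Counter(energy)

hist = energy_exchange()
# Total energy is conserved, and low energies become the most probable,
# even though every agent started with the same energy
```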

  20. Enrichment analysis in high-throughput genomics - accounting for dependency in the NULL.

    PubMed

    Gold, David L; Coombes, Kevin R; Wang, Jing; Mallick, Bani

    2007-03-01

Translating the overwhelming amount of data generated in high-throughput genomics experiments into biologically meaningful evidence, which may for example point to a series of biomarkers or hint at a relevant pathway, is a matter of great interest in bioinformatics these days. Genes showing similar experimental profiles, it is hypothesized, share biological mechanisms that, if understood, could provide clues to the molecular processes leading to pathological events. It is the topic of further study to learn if or how a priori information about the known genes may serve to explain coexpression. One popular method of knowledge discovery in high-throughput genomics experiments, enrichment analysis (EA), seeks to infer whether an interesting collection of genes is 'enriched' for a particular set of a priori Gene Ontology Consortium (GO) classes. For the purposes of statistical testing, the conventional methods offered in EA software implicitly assume independence between the GO classes. Genes may be annotated for more than one biological classification, and therefore the resulting test statistics of enrichment between GO classes can be highly dependent if the overlapping gene sets are relatively large. There is a need to formally determine whether conventional EA results are robust to the independence assumption. We derive the exact null distribution for testing enrichment of GO classes by relaxing the independence assumption using well-known statistical theory. In applications with publicly available data sets, our test results are similar to those of the conventional approach which assumes independence. We argue that the independence assumption is not detrimental.
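The conventional per-class enrichment test that the paper takes as its baseline is typically a one-sided hypergeometric (Fisher) test. A minimal sketch with made-up gene counts:

```python
from fractions import Fraction
from math import comb

def enrichment_pvalue(n_genome, n_class, n_selected, n_hits):
    """One-sided hypergeometric p-value: probability of drawing at
    least `n_hits` genes from a GO class of size `n_class` when
    `n_selected` genes are sampled from `n_genome` without replacement."""
    total = 0
    for k in range(n_hits, min(n_class, n_selected) + 1):
        total += comb(n_class, k) * comb(n_genome - n_class, n_selected - k)
    # Exact rational arithmetic avoids overflow with large binomials
    return float(Fraction(total, comb(n_genome, n_selected)))

# Hypothetical counts: 8 hits from a 100-gene class in a 200-gene list
p = enrichment_pvalue(n_genome=20000, n_class=100, n_selected=200, n_hits=8)
```

Under these counts only about one hit is expected by chance, so the p-value is very small; the paper's point is that running this test independently for many overlapping classes ignores the dependence between the resulting statistics.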

  1. A Survey of Statistical Capstone Projects

    ERIC Educational Resources Information Center

    Martonosi, Susan E.; Williams, Talithia D.

    2016-01-01

    In this article, we highlight the advantages of incorporating a statistical capstone experience in the undergraduate curriculum, where students perform an in-depth analysis of real-world data. Capstone experiences develop statistical thinking by allowing students to engage in a consulting-like experience that requires skills outside the scope of…

  2. [Study on early intervention of compound nutrition for cognitive dysfunction in Alzheimer's disease].

    PubMed

    Wang, Chao; Xie, Wei; Zhu, Jinfeng; Dang, Rui; Wang, Decai

    2014-01-01

To observe the early preventive effect of a compound nutrient recipe on cognitive dysfunction in the APP-PSN transgenic mouse model of Alzheimer's disease, 36 two-month-old APP-PSN transgenic mice were randomly divided into an intervention group, supplied with the compound recipe in the diet at high and low doses, and a control group fed the basal diet; 12 two-month-old transgene-negative mice fed the basal diet served as the negative control. After 3 months of intervention, cognitive function in the four groups was evaluated with the Morris water maze, an active avoidance experiment, and a jumping stair experiment. There were no statistically significant differences among the four groups in weight or food intake. Compared with the control group, the intervention group had a markedly shorter latency in the Morris water maze and a markedly longer latency in the jumping stair experiment. In the active avoidance experiment, conditioned responses accounted for about 46.67% and 45.00% in the high- and low-dose intervention groups, respectively, versus about 20.83% in the control group. For all three behavioral experiments, the differences between the control and intervention groups were statistically significant (P < 0.05), as were those between the control and negative control groups (P < 0.05); there was no difference between the intervention and negative control groups. Early supplementation with the compound nutrients could postpone the onset and progression of cognitive dysfunction in this Alzheimer's disease mouse model.

  3. Experiments on Nucleation in Different Flow Regimes

    NASA Technical Reports Server (NTRS)

    Bayuzick, R. J.; Hofmeister, W. H.; Morton, C. M.; Robinson, M. B.

    1998-01-01

The vast majority of metallic engineering materials are solidified from the liquid phase. Understanding the solidification process is essential to control microstructure, which in turn determines the properties of materials. The genesis of solidification is nucleation, where the first stable solid forms from the liquid phase. Nucleation kinetics determine the degree of undercooling and phase selection. As such, it is important to understand nucleation phenomena in order to control solidification or glass formation in metals and alloys. Early experiments in nucleation kinetics were accomplished by droplet dispersion methods. Dilatometry was used by Turnbull and others, and more recently differential thermal analysis and differential scanning calorimetry have been used for kinetic studies. These techniques have enjoyed success; however, there are difficulties with these experiments. Since materials are dispersed in a medium, the character of the emulsion/metal interface affects the nucleation behavior. Statistics are derived from the large number of particles observed in a single experiment, but dispersions have a finite size distribution which adds to the uncertainty of the kinetic determinations. Even though temperature can be controlled quite well before the onset of nucleation, the release of the latent heat of fusion during nucleation of particles complicates the assumption of isothermality during these experiments. Containerless processing has enabled another approach to the study of nucleation kinetics. With levitation techniques it is possible to undercool one sample to nucleation repeatedly in a controlled manner, such that the statistics of the nucleation process can be derived from multiple experiments on a single sample. The authors have fully developed the analysis of nucleation experiments on single samples following the suggestions of Skripov. The advantage of these experiments is that the samples are directly observable.
The nucleation temperature can be measured by noncontact optical pyrometry, the mass of the sample is known, and post-processing analysis can be conducted on the sample. The disadvantages are that temperature measurement must have exceptionally high precision, and it is not possible to isolate specific heterogeneous sites as in droplet dispersions. Levitation processing of refractory materials in ultra high vacuum provides an avenue to conduct these kinetic studies on single samples. Two experimental methods have been identified where ultra high vacuum experiments are possible: electrostatic levitation in ground-based experiments and electromagnetic processing in low earth orbit on TEMPUS. Such experiments, reported here, were conducted on zirconium. Liquid zirconium is an excellent solvent and has a high solubility for contaminants contained in the bulk material as well as those contaminants found in the vacuum environment. Oxides, nitrides, and carbides do not exist in the melt, and do not form on the surface of molten zirconium, for the materials and vacuum levels used in this study. Ground-based experiments with electrostatic levitation have shown that the statistical nucleation kinetic experiments are viable and yield results which are consistent with classical nucleation theory. The advantage of low earth orbit experiments is the ability to vary the flow conditions in the liquid prior to nucleation. The purpose of nucleation experiments in TEMPUS was to examine.

  4. Statistical analysis of time transfer data from Timation 2. [US Naval Observatory and Australia

    NASA Technical Reports Server (NTRS)

    Luck, J. M.; Morgan, P.

    1974-01-01

    Between July 1973 and January 1974, three time transfer experiments using the Timation 2 satellite were conducted to measure time differences between the U.S. Naval Observatory and Australia. Statistical tests showed that the results are unaffected by the satellite's position with respect to the sunrise/sunset line or by its closest approach azimuth at the Australian station. Further tests revealed that forward predictions of time scale differences, based on the measurements, can be made with high confidence.

  5. Statistics of Sxy estimates

    NASA Technical Reports Server (NTRS)

    Freilich, M. H.; Pawka, S. S.

    1987-01-01

The statistics of Sxy estimates derived from orthogonal-component measurements are examined. Based on results of Goodman (1957), the probability density function (pdf) for Sxy(f) estimates is derived, and a closed-form solution for arbitrary moments of the distribution is obtained. Characteristic functions are used to derive the exact pdf of Sxy(tot). In practice, a simple Gaussian approximation is found to be highly accurate even for relatively few degrees of freedom. Implications for experiment design are discussed, and a maximum-likelihood estimator for a posteriori estimation is outlined.

  6. Neutrino Oscillations at Proton Accelerators

    NASA Astrophysics Data System (ADS)

    Michael, Douglas

    2002-12-01

    Data from many different experiments have started to build a first glimpse of the phenomenology associated with neutrino oscillations. Results on atmospheric and solar neutrinos are particularly clear while a third result from LSND suggests a possibly very complex oscillation phenomenology. As impressive as the results from current experiments are, it is clear that we are just getting started on a long-term experimental program to understand neutrino masses, mixings and the physics which produce them. A number of exciting fundamental physics possibilities exist, including that neutrino oscillations could demonstrate CP or CPT violation and could be tied to exotic high-energy phenomena including strings and extra dimensions. A complete exploration of oscillation phenomena demands many experiments, including those possible using neutrino beams produced at high energy proton accelerators. Most existing neutrino experiments are statistics limited even though they use gigantic detectors. High intensity proton beams are essential for producing the intense neutrino beams which we need for next generation neutrino oscillation experiments.

  7. How to get statistically significant effects in any ERP experiment (and why you shouldn't).

    PubMed

    Luck, Steven J; Gaspelin, Nicholas

    2017-01-01

    ERP experiments generate massive datasets, often containing thousands of values for each participant, even after averaging. The richness of these datasets can be very useful in testing sophisticated hypotheses, but this richness also creates many opportunities to obtain effects that are statistically significant but do not reflect true differences among groups or conditions (bogus effects). The purpose of this paper is to demonstrate how common and seemingly innocuous methods for quantifying and analyzing ERP effects can lead to very high rates of significant but bogus effects, with the likelihood of obtaining at least one such bogus effect exceeding 50% in many experiments. We focus on two specific problems: using the grand-averaged data to select the time windows and electrode sites for quantifying component amplitudes and latencies, and using one or more multifactor statistical analyses. Reanalyses of prior data and simulations of typical experimental designs are used to show how these problems can greatly increase the likelihood of significant but bogus results. Several strategies are described for avoiding these problems and for increasing the likelihood that significant effects actually reflect true differences among groups or conditions. © 2016 Society for Psychophysiological Research.

  8. How to Get Statistically Significant Effects in Any ERP Experiment (and Why You Shouldn’t)

    PubMed Central

    Luck, Steven J.; Gaspelin, Nicholas

    2016-01-01

    Event-related potential (ERP) experiments generate massive data sets, often containing thousands of values for each participant, even after averaging. The richness of these data sets can be very useful in testing sophisticated hypotheses, but this richness also creates many opportunities to obtain effects that are statistically significant but do not reflect true differences among groups or conditions (bogus effects). The purpose of this paper is to demonstrate how common and seemingly innocuous methods for quantifying and analyzing ERP effects can lead to very high rates of significant-but-bogus effects, with the likelihood of obtaining at least one such bogus effect exceeding 50% in many experiments. We focus on two specific problems: using the grand average data to select the time windows and electrode sites for quantifying component amplitudes and latencies, and using one or more multi-factor statistical analyses. Re-analyses of prior data and simulations of typical experimental designs are used to show how these problems can greatly increase the likelihood of significant-but-bogus results. Several strategies are described for avoiding these problems and for increasing the likelihood that significant effects actually reflect true differences among groups or conditions. PMID:28000253
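The "exceeding 50% in many experiments" figure follows directly from the familywise error rate of independent tests, 1 - (1 - alpha)^n. A quick Monte Carlo confirmation under the global null; the choice of 14 tests and all simulation parameters are illustrative, not taken from the paper.

```python
import random

def familywise_error_rate(n_tests, alpha=0.05, n_sim=20000, seed=7):
    """Monte Carlo probability of at least one significant result when
    every null hypothesis is true and each test has level alpha."""
    rng = random.Random(seed)
    hits = sum(
        any(rng.random() < alpha for _ in range(n_tests))
        for _ in range(n_sim)
    )
    return hits / n_sim

fwer = familywise_error_rate(14)
analytic = 1 - (1 - 0.05) ** 14   # already past 50% at 14 comparisons
```

With implicit choices of time windows, electrode sites, and multifactor ANOVAs, an ERP analysis can easily exceed 14 effective comparisons, which is the paper's core warning.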

  9. Under the hood of statistical learning: A statistical MMN reflects the magnitude of transitional probabilities in auditory sequences.

    PubMed

    Koelsch, Stefan; Busch, Tobias; Jentschke, Sebastian; Rohrmeier, Martin

    2016-02-02

    Within the framework of statistical learning, many behavioural studies investigated the processing of unpredicted events. However, surprisingly few neurophysiological studies are available on this topic, and no statistical learning experiment has investigated electroencephalographic (EEG) correlates of processing events with different transition probabilities. We carried out an EEG study with a novel variant of the established statistical learning paradigm. Timbres were presented in isochronous sequences of triplets. The first two sounds of all triplets were equiprobable, while the third sound occurred with either low (10%), intermediate (30%), or high (60%) probability. Thus, the occurrence probability of the third item of each triplet (given the first two items) was varied. Compared to high-probability triplet endings, endings with low and intermediate probability elicited an early anterior negativity that had an onset around 100 ms and was maximal at around 180 ms. This effect was larger for events with low than for events with intermediate probability. Our results reveal that, when predictions are based on statistical learning, events that do not match a prediction evoke an early anterior negativity, with the amplitude of this mismatch response being inversely related to the probability of such events. Thus, we report a statistical mismatch negativity (sMMN) that reflects statistical learning of transitional probability distributions that go beyond auditory sensory memory capabilities.

  10. Investigation of Statistical Inference Methodologies Through Scale Model Propagation Experiments

    DTIC Science & Technology

    2015-09-30

statistical inference methodologies for ocean-acoustic problems by investigating and applying statistical methods to data collected from scale-model ... to begin planning experiments for statistical inference applications. APPROACH: In the ocean acoustics community over the past two decades ... solutions for waveguide parameters. With the introduction of statistical inference to the field of ocean acoustics came the desire to interpret marginal

  11. Improved silicon nitride for advanced heat engines

    NASA Technical Reports Server (NTRS)

    Yeh, Hun C.; Fang, Ho T.

    1987-01-01

The technology base required to fabricate silicon nitride components with the strength, reliability, and reproducibility necessary for actual heat engine applications is presented. Task 2 was set up to develop test bars with high Weibull slope and greater high temperature strength, and to conduct an initial net shape component fabrication evaluation. Screening experiments were performed in Task 7 on advanced materials and processing for input to Task 2. The technical efforts performed in the second year of a 5-yr program are covered. The first iteration of Task 2 was completed as planned. Two half-replicated, fractional factorial (2^5), statistically designed matrix experiments were conducted. These experiments identified Denka 9FW Si3N4 as an alternate raw material to GTE SN502 Si3N4 for subsequent process evaluation. A detailed statistical analysis was conducted to correlate processing conditions with as-processed test bar properties. One processing condition produced a material with a 97 ksi average room temperature MOR (100 percent of goal) and a 13.2 Weibull slope (83 percent of goal); another condition produced 86 ksi (6 percent over baseline) room temperature strength with a Weibull slope of 20 (125 percent of goal).
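A half-replicated 2^5 fractional factorial of the kind mentioned can be generated directly. This sketch assumes the common defining relation I = ABCDE, which the report may or may not have used:

```python
from itertools import product

def half_fraction_2_to_5():
    """Half-replicate of a 2^5 two-level factorial: the 16 runs
    satisfying the defining relation I = ABCDE (level product = +1)."""
    return [run for run in product((-1, 1), repeat=5)
            if run[0] * run[1] * run[2] * run[3] * run[4] == 1]

runs = half_fraction_2_to_5()
# 16 of the 32 full-factorial runs; every factor remains balanced,
# so main effects are still estimable at half the experimental cost
```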

  12. Experiments and Model for Serration Statistics in Low-Entropy, Medium-Entropy, and High-Entropy Alloys

    DOE PAGES

    Carroll, Robert; Lee, Chi; Tsai, Che-Wei; ...

    2015-11-23

In this study, high-entropy alloys (HEAs) are new alloys that contain five or more elements in roughly equal proportion. We present new experiments and theory on the deformation behavior of HEAs under slow stretching (straining), and observe differences compared to conventional alloys with fewer elements. For a specific range of temperatures and strain-rates, HEAs deform in a jerky way, with sudden slips that make it difficult to precisely control the deformation. An analytic model explains these slips as avalanches of slipping weak spots and predicts the observed slip statistics, stress-strain curves, and their dependence on temperature, strain-rate, and material composition. The ratio of the weak spots' healing rate to the strain-rate is the main tuning parameter, reminiscent of the Portevin-Le Chatelier effect and time-temperature superposition in polymers. Our model predictions agree with the experimental results. The proposed widely applicable deformation mechanism is useful for deformation control and alloy design.

  13. Communications Link Characterization Experiment (CLCE) technical data report, volume 2

    NASA Technical Reports Server (NTRS)

    1977-01-01

The results are presented of the long-term rain rate statistical analysis and of the investigation into determining worst-month statistics from the measured attenuation data caused by precipitation. The rain rate statistics cover a period of 11 months, from July 1974 to May 1975, for measurements taken at the NASA Rosman station. The rain rate statistical analysis is a continuation of the analysis of the rain rate data accumulated for the ATS-6 Millimeter Wave Propagation Experiment. The statistical characteristics of the rain rate data through December 1974 are also presented for the above experiment.

  14. Statistical learning of action: the role of conditional probability.

    PubMed

    Meyer, Meredith; Baldwin, Dare

    2011-12-01

    Identification of distinct units within a continuous flow of human action is fundamental to action processing. Such segmentation may rest in part on statistical learning. In a series of four experiments, we examined what types of statistics people can use to segment a continuous stream involving many brief, goal-directed action elements. The results of Experiment 1 showed no evidence for sensitivity to conditional probability, whereas Experiment 2 displayed learning based on joint probability. In Experiment 3, we demonstrated that additional exposure to the input failed to engender sensitivity to conditional probability. However, the results of Experiment 4 showed that a subset of adults-namely, those more successful at identifying actions that had been seen more frequently than comparison sequences-were also successful at learning conditional-probability statistics. These experiments help to clarify the mechanisms subserving processing of intentional action, and they highlight important differences from, as well as similarities to, prior studies of statistical learning in other domains, including language.
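The joint- versus conditional-probability distinction these experiments turn on can be made concrete with a toy sequence (the action names are invented): a pair can be rare overall, and thus have low joint probability, while still being perfectly predictable given its first element.

```python
from collections import Counter

def pair_statistics(stream):
    """Joint probability P(x, y) and conditional probability P(y | x)
    for adjacent elements of a sequence."""
    pairs = list(zip(stream, stream[1:]))
    pair_counts = Counter(pairs)
    first_counts = Counter(x for x, _ in pairs)
    joint = {p: c / len(pairs) for p, c in pair_counts.items()}
    conditional = {(x, y): c / first_counts[x]
                   for (x, y), c in pair_counts.items()}
    return joint, conditional

# 'wave' is rare in absolute terms (low joint probability with anything)
# yet always follows 'point' (conditional probability 1.0)
stream = ["reach", "grasp"] * 9 + ["point", "wave"]
joint, cond = pair_statistics(stream)
```

A learner tracking only joint probability would treat 'point wave' as an unlikely unit; a learner tracking conditional probability would segment it confidently.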

  15. Statistical Patterns of Ionospheric Convection Derived From Mid-Latitude, High-Latitude, and Polar SuperDARN HF Radar Observations

    NASA Astrophysics Data System (ADS)

    Thomas, E. G.; Shepherd, S. G.

    2017-12-01

    Global patterns of ionospheric convection have been widely studied in terms of the interplanetary magnetic field (IMF) magnitude and orientation in both the Northern and Southern Hemispheres using observations from the Super Dual Auroral Radar Network (SuperDARN). The dynamic range of driving conditions under which existing SuperDARN statistical models are valid is currently limited to periods when the high-latitude convection pattern remains above about 60° geomagnetic latitude. Cousins and Shepherd [2010] found this to correspond to intervals when the solar wind electric field Esw < 4.1 mV/m and IMF Bz is negative. Conversely, under northward IMF conditions (Bz > 0) the high-latitude radars often experience difficulties in measuring convection above about 85° geomagnetic latitude. In this presentation, we introduce a new statistical model of ionospheric convection which is valid for much more dominant IMF Bz conditions than was previously possible by including velocity measurements from the newly constructed tiers of radars in the Northern Hemisphere at midlatitudes and in the polar cap. This new model (TS17) is compared to previous statistical models derived from high-latitude SuperDARN observations (RG96, PSR10, CS10) and its impact on instantaneous Map Potential solutions is examined.

  16. Facilitating Student Experimentation with Statistical Concepts.

    ERIC Educational Resources Information Center

    Smith, Patricia K.

    2002-01-01

    Offers a Web page with seven Java applets allowing students to experiment with key concepts in an introductory statistics course. Indicates the applets can be used in three ways: to place links to the applets, to create in-class demonstrations of statistical concepts, and to lead students through experiments and discover statistical relationships.…

  17. Statistical Methodologies to Integrate Experimental and Computational Research

    NASA Technical Reports Server (NTRS)

    Parker, P. A.; Johnson, R. T.; Montgomery, D. C.

    2008-01-01

Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with the statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.

  18. Detecting Patchy Reionization in the Cosmic Microwave Background.

    PubMed

    Smith, Kendrick M; Ferraro, Simone

    2017-07-14

    Upcoming cosmic microwave background (CMB) experiments will measure temperature fluctuations on small angular scales with unprecedented precision. Small-scale CMB fluctuations are a mixture of late-time effects: gravitational lensing, Doppler shifting of CMB photons by moving electrons [the kinematic Sunyaev-Zel'dovich (KSZ) effect], and residual foregrounds. We propose a new statistic which separates the KSZ signal from the others, and also allows the KSZ signal to be decomposed in redshift bins. The decomposition extends to high redshift and does not require external data sets such as galaxy surveys. In particular, the high-redshift signal from patchy reionization can be cleanly isolated, enabling future CMB experiments to make high-significance and qualitatively new measurements of the reionization era.

  19. Analysis of counting data: Development of the SATLAS Python package

    NASA Astrophysics Data System (ADS)

    Gins, W.; de Groote, R. P.; Bissell, M. L.; Granados Buitrago, C.; Ferrer, R.; Lynch, K. M.; Neyens, G.; Sels, S.

    2018-01-01

For the analysis of low-statistics counting experiments, a traditional nonlinear least squares minimization routine may not always provide correct parameter and uncertainty estimates due to the assumptions inherent in the algorithm(s). In response to this, a user-friendly Python package (SATLAS) was written to provide an easy interface between the data and a variety of minimization algorithms suited for analyzing low- as well as high-statistics data. The advantage of this package is that it allows the user to define their own model function and then compare different minimization routines to determine the optimal parameter values and their respective (correlated) errors. Experimental validation of the different approaches in the package is done through analysis of hyperfine structure data of 203Fr gathered by the CRIS experiment at ISOLDE, CERN.
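The package's motivation, that least-squares assumptions break down at low counts, can be illustrated without SATLAS itself. This sketch compares the Poisson maximum-likelihood estimate of a constant rate with a Neyman chi-square (1/n-weighted least squares) estimate; the simulated data and the zero-bin clamp are illustrative choices, not the package's actual method.

```python
import random

def fit_constant_rate(counts, poisson=True):
    """Best-fit constant rate for binned counting data.  The Poisson
    maximum-likelihood estimate is the plain mean; a Neyman chi-square
    fit (weights 1/n_i, zero bins clamped to 1) is biased low when
    counts are small."""
    if poisson:
        return sum(counts) / len(counts)
    weights = [1 / max(n, 1) for n in counts]
    return sum(w * n for w, n in zip(weights, counts)) / sum(weights)

rng = random.Random(5)
# Low-statistics bins with true mean rate 3 (binomial approximates Poisson)
counts = [sum(rng.random() < 0.01 for _ in range(300)) for _ in range(500)]
mle = fit_constant_rate(counts, poisson=True)
lsq = fit_constant_rate(counts, poisson=False)
# lsq systematically undershoots mle on such data
```

The bias disappears as the counts grow, which is why the least-squares shortcut is harmless for high-statistics spectra but not for low-statistics ones.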

  20. Angular velocity estimation based on star vector with improved current statistical model Kalman filter.

    PubMed

    Zhang, Hao; Niu, Yanxiong; Lu, Jiazhen; Zhang, He

    2016-11-20

    Angular velocity information is a requisite for a spacecraft guidance, navigation, and control system. In this paper, an approach for angular velocity estimation based merely on star vector measurement with an improved current statistical model Kalman filter is proposed. High-precision angular velocity estimation can be achieved under dynamic conditions. The amount of calculation is also reduced compared to a Kalman filter. Different trajectories are simulated to test this approach, and experiments with real starry sky observation are implemented for further confirmation. The estimation accuracy is proved to be better than 10^-4 rad/s under various conditions. Both the simulation and the experiment demonstrate that the described approach is effective and shows an excellent performance under both static and dynamic conditions.
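
    As a hedged illustration of the underlying measurement idea (not the paper's current statistical model Kalman filter), a single star vector tracked in the body frame already constrains the component of angular velocity perpendicular to it; all values below are invented:

```python
import numpy as np

dt = 0.1                                  # sampling interval, s (invented)
omega_true = np.array([0.0, 2e-4, 1e-3])  # body rates, rad/s (invented)

def propagate(v, omega, dt):
    # First-order body-frame evolution of an inertially fixed unit vector:
    # dv/dt = -omega x v, then renormalize
    v_new = v - np.cross(omega, v) * dt
    return v_new / np.linalg.norm(v_new)

v1 = np.array([1.0, 0.0, 0.0])            # star unit vector at time t
v2 = propagate(v1, omega_true, dt)        # same star at time t + dt

# Instantaneous estimate of the rate component perpendicular to v1
omega_est = -np.cross(v1, v2) / dt
print(omega_est)
```

    Because the invented rate here is perpendicular to v1, the single-star estimate recovers it fully; in general several star vectors must be fused, and a filter such as the paper's smooths the noisy instantaneous estimates.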

  1. 'Dignity therapy', a promising intervention in palliative care: A comprehensive systematic literature review.

    PubMed

    Martínez, Marina; Arantzamendi, María; Belar, Alazne; Carrasco, José Miguel; Carvajal, Ana; Rullán, María; Centeno, Carlos

    2017-06-01

    Dignity therapy is a psychotherapy intended to relieve psychological and existential distress in patients at the end of life. Little is known about its effects. To analyse the outcomes of dignity therapy in patients with advanced life-threatening diseases, a systematic review was conducted. Three authors extracted data from the articles and evaluated quality using the Critical Appraisal Skills Programme. Data were synthesized, considering study objectives. PubMed, CINAHL, Cochrane Library and PsycINFO were searched. The years searched were 2002 (year of dignity therapy development) to January 2016. 'Dignity therapy' was used as the search term. Studies with patients with advanced life-threatening diseases were included. Of 121 studies, 28 were included. The quality of the studies was high. Results were grouped into effectiveness, satisfaction, suitability and feasibility, and adaptability to different diseases and cultures. Two of five randomized controlled trials applied dignity therapy to patients with high levels of baseline psychological distress. One showed a statistically significant decrease in patients' anxiety and depression scores over time. The other showed a statistically significant pre-post decrease in anxiety scores, but not in depression scores. Nonrandomized studies suggested statistically significant improvements in existential and psychosocial measurements. Patients, relatives and professionals perceived that it improved the end-of-life experience. Evidence suggests that dignity therapy is beneficial. One randomized controlled trial with patients with high levels of psychological distress shows dignity therapy's efficacy on anxiety and depression scores. Studies of other designs report beneficial outcomes in terms of end-of-life experience. Further research should examine how dignity therapy functions, establish a means for measuring its impact, and assess whether patients with high levels of distress benefit most from this therapy.

  2. Temperature and Voltage Offsets in High-ZT Thermoelectrics

    NASA Astrophysics Data System (ADS)

    Levy, George S.

    2018-06-01

    Thermodynamic temperature can take on different meanings. Kinetic temperature is an expectation value and a function of the kinetic energy distribution. Statistical temperature is a parameter of the distribution. Kinetic temperature and statistical temperature, identical in Maxwell-Boltzmann statistics, can differ in other statistics such as those of Fermi-Dirac or Bose-Einstein when a field is present. Thermal equilibrium corresponds to zero statistical temperature gradient, not zero kinetic temperature gradient. Since heat carriers in thermoelectrics are fermions, the difference between these two temperatures may explain voltage and temperature offsets observed during meticulous Seebeck measurements in which the temperature-voltage curve does not go through the origin. In conventional semiconductors, temperature offsets produced by fermionic electrical carriers are not observable because they are shorted by heat phonons in the lattice. In high-ZT materials, however, these offsets have been detected but attributed to faulty laboratory procedures. Additional supporting evidence for spontaneous voltages and temperature gradients includes data collected in epistatic experiments and in the plasma Q-machine. Device fabrication guidelines for testing the hypothesis are suggested including using unipolar junctions stacked in a superlattice, alternating n/n+ and p/p+ junctions, selecting appropriate dimensions, doping, and loading.

  3. Temperature and Voltage Offsets in High-ZT Thermoelectrics

    NASA Astrophysics Data System (ADS)

    Levy, George S.

    2017-10-01

    Thermodynamic temperature can take on different meanings. Kinetic temperature is an expectation value and a function of the kinetic energy distribution. Statistical temperature is a parameter of the distribution. Kinetic temperature and statistical temperature, identical in Maxwell-Boltzmann statistics, can differ in other statistics such as those of Fermi-Dirac or Bose-Einstein when a field is present. Thermal equilibrium corresponds to zero statistical temperature gradient, not zero kinetic temperature gradient. Since heat carriers in thermoelectrics are fermions, the difference between these two temperatures may explain voltage and temperature offsets observed during meticulous Seebeck measurements in which the temperature-voltage curve does not go through the origin. In conventional semiconductors, temperature offsets produced by fermionic electrical carriers are not observable because they are shorted by heat phonons in the lattice. In high-ZT materials, however, these offsets have been detected but attributed to faulty laboratory procedures. Additional supporting evidence for spontaneous voltages and temperature gradients includes data collected in epistatic experiments and in the plasma Q-machine. Device fabrication guidelines for testing the hypothesis are suggested including using unipolar junctions stacked in a superlattice, alternating n/n+ and p/p+ junctions, selecting appropriate dimensions, doping, and loading.

  4. Statistical inference for tumor growth inhibition T/C ratio.

    PubMed

    Wu, Jianrong

    2010-09-01

    The tumor growth inhibition T/C ratio is commonly used to quantify treatment effects in drug screening tumor xenograft experiments. The T/C ratio is converted to an antitumor activity rating using an arbitrary cutoff point and often without any formal statistical inference. Here, we applied a nonparametric bootstrap method and a small sample likelihood ratio statistic to make a statistical inference of the T/C ratio, including both hypothesis testing and a confidence interval estimate. Furthermore, sample size and power are also discussed for statistical design of tumor xenograft experiments. Tumor xenograft data from an actual experiment were analyzed to illustrate the application.
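
    A minimal sketch of the nonparametric bootstrap idea for the T/C ratio (the tumor volumes below are invented, and the paper's small-sample likelihood-ratio statistic is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(42)
# Invented final tumor volumes (mm^3) for treated and control arms
treated = np.array([310.0, 420.0, 275.0, 500.0, 360.0, 290.0, 450.0, 330.0])
control = np.array([880.0, 1020.0, 760.0, 940.0, 1100.0, 850.0, 990.0, 905.0])

tc_ratio = treated.mean() / control.mean()

# Percentile bootstrap: resample animals within each arm with replacement
B = 10_000
boot = np.empty(B)
for b in range(B):
    t = rng.choice(treated, size=treated.size, replace=True)
    c = rng.choice(control, size=control.size, replace=True)
    boot[b] = t.mean() / c.mean()

ci_lo, ci_hi = np.percentile(boot, [2.5, 97.5])
print(f"T/C = {tc_ratio:.2f}, 95% bootstrap CI [{ci_lo:.2f}, {ci_hi:.2f}]")
```

    The confidence interval replaces the arbitrary cutoff-point rating with a formal inference: an activity claim such as T/C < 0.4 is supported only if the whole interval lies below the threshold.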

  5. Projected Changes in Hydrological Extremes in a Cold Region Watershed: Sensitivity of Results to Statistical Methods of Analysis

    NASA Astrophysics Data System (ADS)

    Dibike, Y. B.; Eum, H. I.; Prowse, T. D.

    2017-12-01

    Flows originating from alpine-dominated cold region watersheds typically experience extended winter low flows followed by spring snowmelt and summer rainfall driven high flows. In a warmer climate, there will be a temperature-induced shift in precipitation from snow towards rain, as well as changes in snowmelt timing affecting the frequency of extreme high and low flow events, which could significantly alter ecosystem services. This study examines the potential changes in the frequency and severity of hydrologic extremes in the Athabasca River watershed in Alberta, Canada based on the Variable Infiltration Capacity (VIC) hydrologic model and selected, statistically downscaled climate change scenario data from the latest Coupled Model Intercomparison Project (CMIP5). The sensitivity of these projected changes is also examined by applying different extreme flow analysis methods. The hydrological model projections show an overall increase in mean annual streamflow in the watershed and a corresponding shift of the freshet timing to an earlier period. Most of the streams are projected to experience increases during the winter and spring seasons and decreases during the summer and early fall seasons, with overall projected increases in extreme high flows, especially for low-frequency events. While the middle and lower parts of the watershed are characterised by projected increases in extreme high flows, the high elevation alpine region is mainly characterised by corresponding decreases in extreme low flow events. However, the magnitude of projected changes in extreme flow varies over a wide range, especially for low-frequency events, depending on the climate scenario and period of analysis, and sometimes in a nonlinear way. Nonetheless, the sensitivity of the projected changes to the statistical method of analysis is found to be relatively small compared to the inter-model variability.
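
    The sensitivity to the statistical method of analysis can be sketched with a toy comparison (synthetic annual maxima; distribution choices and parameters are invented, not the study's): fitting the same series with a generalized extreme value (GEV) versus a Gumbel distribution can give noticeably different 100-year return levels.

```python
import numpy as np
from scipy.stats import genextreme, gumbel_r

rng = np.random.default_rng(3)
# Synthetic annual-maximum flows (m^3/s) for one scenario period (invented)
ann_max = genextreme.rvs(c=-0.1, loc=800.0, scale=150.0, size=40,
                         random_state=rng)

T = 100  # return period, years
gev = genextreme(*genextreme.fit(ann_max))   # three-parameter fit
gum = gumbel_r(*gumbel_r.fit(ann_max))       # shape fixed at zero
rl_gev = gev.ppf(1 - 1 / T)
rl_gum = gum.ppf(1 - 1 / T)
print(f"100-year flow: GEV {rl_gev:.0f}, Gumbel {rl_gum:.0f}")
```

    The gap between the two estimates grows for rarer events, which is consistent with the study's observation that method sensitivity matters most for low-frequency extremes even when it remains smaller than inter-model variability.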

  6. Modeling genome coverage in single-cell sequencing

    PubMed Central

    Daley, Timothy; Smith, Andrew D.

    2014-01-01

    Motivation: Single-cell DNA sequencing is necessary for examining genetic variation at the cellular level, which remains hidden in bulk sequencing experiments. Because these experiments begin with such small amounts of starting material, the amount of information obtained from single-cell sequencing is highly sensitive to the choice of protocol employed and to variability in library preparation. In particular, the fraction of the genome represented in single-cell sequencing libraries exhibits extreme variability due to quantitative biases in amplification and loss of genetic material. Results: We propose a method to predict the genome coverage of a deep sequencing experiment using information from an initial shallow sequencing experiment mapped to a reference genome. The observed coverage statistics are used in a non-parametric empirical Bayes Poisson model to estimate the gain in coverage from deeper sequencing. This approach allows researchers to know statistical features of deep sequencing experiments without actually sequencing deeply, providing a basis for optimizing and comparing single-cell sequencing protocols or screening libraries. Availability and implementation: The method is available as part of the preseq software package. Source code is available at http://smithlabresearch.org/preseq. Contact: andrewds@usc.edu Supplementary information: Supplementary material is available at Bioinformatics online. PMID:25107873
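
    The flavor of coverage extrapolation can be sketched with the classical Good-Toulmin series, a simplified stand-in for (not a reimplementation of) preseq's empirical Bayes Poisson machinery; the simulated genome, bias model, and depths below are invented:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)

genome = 100_000
# Position-specific Poisson rates with amplification bias (invented model)
rates = rng.gamma(shape=0.5, scale=2.0, size=genome)
shallow = rng.poisson(0.05 * rates)       # shallow sequencing pass

# Coverage counts n_j = number of positions covered exactly j times
n_j = Counter(int(c) for c in shallow[shallow > 0])

def good_toulmin(n_j, t):
    # Classical Good-Toulmin series for the expected number of newly
    # covered positions after sequencing t-fold more (valid for t <= 1)
    return sum((-1) ** (j + 1) * (t ** j) * nj for j, nj in n_j.items())

covered_now = int((shallow > 0).sum())
gain = good_toulmin(n_j, 1.0)             # what doubling the depth adds
print(covered_now, gain)
```

    The alternating series diverges for large extrapolation factors, which is precisely why preseq replaces it with rational-function approximations for long-range prediction.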

  7. Schools and Labor Market Outcomes. EQW Working Papers WP33.

    ERIC Educational Resources Information Center

    Crawford, David L.; And Others

    The relationship between school characteristics and labor market outcomes was examined through a literature review and an econometric analysis of the effects of various characteristics of the schooling experience on students' labor market performance after high school. Data from the National Center on Education Statistics' longitudinal survey of…

  8. Image Understanding. Proceedings of a Workshop Held in Pittsburgh, Pennsylvania on 11-13 September, 1990

    DTIC Science & Technology

    1990-09-01

    performed some preliminary experiments to detect the ships in the high resolution imagery (the longest piers are about three times the length of a destroyer)...statistics, and these coordinates are then shipped via a high-speed interface to a host where the stereo triangulation and kinematic control algorithms run...Grasp Design: Perception research includes the design of new sensor technologies, such as this hybrid analog/digital chip for a high-speed light-stripe

  9. Attitudes towards statistics of graduate entry medical students: the role of prior learning experiences

    PubMed Central

    2014-01-01

    Background While statistics is increasingly taught as part of the medical curriculum, it can be an unpopular subject and feedback from students indicates that some find it more difficult than other subjects. Understanding attitudes towards statistics on entry to graduate entry medical programmes is particularly important, given that many students may have been exposed to quantitative courses in their previous degree and hence bring preconceptions of their ability and interest to their medical education programme. The aim of this study therefore is to explore, for the first time, attitudes towards statistics of graduate entry medical students from a variety of backgrounds and focus on understanding the role of prior learning experiences. Methods 121 first year graduate entry medical students completed the Survey of Attitudes toward Statistics instrument together with information on demographics and prior learning experiences. Results Students tended to appreciate the relevance of statistics in their professional life and be prepared to put effort into learning statistics. They had neutral to positive attitudes about their interest in statistics and their intellectual knowledge and skills when applied to it. Their feelings towards statistics were slightly less positive e.g. feelings of insecurity, stress, fear and frustration and they tended to view statistics as difficult. Even though 85% of students had taken a quantitative course in the past, only 24% of students described it as likely that they would take any course in statistics if the choice was theirs. How well students felt they had performed in mathematics in the past was a strong predictor of many of the components of attitudes. Conclusion The teaching of statistics to medical students should start with addressing the association between students’ past experiences in mathematics and their attitudes towards statistics and encouraging students to recognise the difference between the two disciplines. 
Addressing these issues may reduce students’ anxiety and perception of difficulty at the start of their learning experience and encourage students to engage with statistics in their future careers. PMID:24708762

  10. Attitudes towards statistics of graduate entry medical students: the role of prior learning experiences.

    PubMed

    Hannigan, Ailish; Hegarty, Avril C; McGrath, Deirdre

    2014-04-04

    While statistics is increasingly taught as part of the medical curriculum, it can be an unpopular subject and feedback from students indicates that some find it more difficult than other subjects. Understanding attitudes towards statistics on entry to graduate entry medical programmes is particularly important, given that many students may have been exposed to quantitative courses in their previous degree and hence bring preconceptions of their ability and interest to their medical education programme. The aim of this study therefore is to explore, for the first time, attitudes towards statistics of graduate entry medical students from a variety of backgrounds and focus on understanding the role of prior learning experiences. 121 first year graduate entry medical students completed the Survey of Attitudes toward Statistics instrument together with information on demographics and prior learning experiences. Students tended to appreciate the relevance of statistics in their professional life and be prepared to put effort into learning statistics. They had neutral to positive attitudes about their interest in statistics and their intellectual knowledge and skills when applied to it. Their feelings towards statistics were slightly less positive e.g. feelings of insecurity, stress, fear and frustration and they tended to view statistics as difficult. Even though 85% of students had taken a quantitative course in the past, only 24% of students described it as likely that they would take any course in statistics if the choice was theirs. How well students felt they had performed in mathematics in the past was a strong predictor of many of the components of attitudes. The teaching of statistics to medical students should start with addressing the association between students' past experiences in mathematics and their attitudes towards statistics and encouraging students to recognise the difference between the two disciplines. 
Addressing these issues may reduce students' anxiety and perception of difficulty at the start of their learning experience and encourage students to engage with statistics in their future careers.

  11. Patient Experience with the Patient-Centered Medical Home in Michigan's Statewide Multi-Payer Demonstration: A Cross-Sectional Study.

    PubMed

    Sarinopoulos, Issidoros; Bechel-Marriott, Diane L; Malouin, Jean M; Zhai, Shaohui; Forney, Jason C; Tanner, Clare L

    2017-11-01

    The literature on patient-centered medical homes (PCMHs) and patient experience is somewhat mixed. Government and private payers are promoting multi-payer PCMH initiatives to align requirements and resources and to enhance practice transformation outcomes. To this end, the multipayer Michigan Primary Care Transformation (MiPCT) demonstration project was carried out. To examine whether the PCMH is associated with a better patient experience, and whether a mature, multi-payer PCMH demonstration is associated with even further improvement in the patient experience. This is a cross-sectional comparison of adults attributed to MiPCT PCMH, non-participating PCMH, and non-PCMH practices, statistically controlling for potential confounders, and conducted among both general and high-risk patient samples. Responses came from 3893 patients in the general population and 4605 in the high-risk population (response rates of 31.8% and 34.1%, respectively). The Clinician and Group Consumer Assessment of Healthcare Providers and Systems survey, with PCMH supplemental questions, was administered in January and February 2015. MiPCT general and high-risk patients reported a significantly better experience than non-PCMH patients in most domains. Adjusted mean differences were as follows: access (0.35**, 0.36***), communication (0.19*, 0.18*), and coordination (0.33**, 0.35***), respectively (on a 10-point scale, with significance indicated by: *= p<0.05, **= p<0.01, and ***= p<0.001). Adjusted mean differences in overall provider ratings were not significant. Global odds ratios were significant for the domains of self-management support (1.38**, 1.41***) and comprehensiveness (1.67***, 1.61***). Non-participating PCMH ratings fell between MiPCT and non-PCMH across all domains and populations, sometimes attaining statistical significance. PCMH practices have more positive patient experiences across domains characteristic of advanced primary care. 
A mature multi-payer model has the strongest, most consistent association with a better patient experience, pointing to the need to provide consistent expectations, resources, and time for practice transformation. Our results held for a general population and a high-risk population which has much more contact with the healthcare system.

  12. Systematic review of restraint interventions for challenging behaviour among persons with intellectual disabilities: focus on effectiveness in single-case experiments.

    PubMed

    Heyvaert, Mieke; Saenen, Lore; Maes, Bea; Onghena, Patrick

    2014-11-01

    This article is the first in a two-part series: we focus on the effectiveness of restraint interventions (RIs) for reducing challenging behaviour (CB) among persons with intellectual disabilities in this first article. In the second article, we focus on experiences with RIs for CB among people with intellectual disabilities. A mixed-methods research synthesis involving statistical meta-analysis and qualitative meta-synthesis techniques was applied to synthesize 76 retrieved articles. This first article reports on the meta-analysis of 59 single-case experiments (SCEs) on effectiveness of RIs for CB among people with intellectual disabilities. The RIs reported on in the SCEs were on average highly effective in reducing CB for people with intellectual disabilities, and this reduction in CB was statistically significant. However, the effects vary significantly over the included participants, and the published data and reported outcomes are rather unrepresentative of the everyday use of RIs among persons with intellectual disabilities. © 2014 John Wiley & Sons Ltd.

  13. A Developmental Approach to Machine Learning?

    PubMed Central

    Smith, Linda B.; Slone, Lauren K.

    2017-01-01

    Visual learning depends on both the algorithms and the training material. This essay considers the natural statistics of infant- and toddler-egocentric vision. These natural training sets for human visual object recognition are very different from the training data fed into machine vision systems. Rather than equal experiences with all kinds of things, toddlers experience extremely skewed distributions with many repeated occurrences of a very few things. And though highly variable when considered as a whole, individual views of things are experienced in a specific order – with slow, smooth visual changes moment-to-moment, and developmentally ordered transitions in scene content. We propose that the skewed, ordered, biased visual experiences of infants and toddlers are the training data that allow human learners to develop a way to recognize everything, both the pervasively present entities and the rarely encountered ones. The joint consideration of real-world statistics for learning by researchers of human and machine learning seems likely to bring advances in both disciplines. PMID:29259573

  14. Matter-wave diffraction approaching limits predicted by Feynman path integrals for multipath interference

    NASA Astrophysics Data System (ADS)

    Barnea, A. Ronny; Cheshnovsky, Ori; Even, Uzi

    2018-02-01

    Interference experiments have been paramount in our understanding of quantum mechanics and are frequently the basis of testing the superposition principle in the framework of quantum theory. In recent years, several studies have challenged the nature of wave-function interference from the perspective of Born's rule—namely, the manifestation of so-called high-order interference terms in a superposition generated by diffraction of the wave functions. Here we present an experimental test of multipath interference in the diffraction of metastable helium atoms, with large-number counting statistics, comparable to photon-based experiments. We use a variation of the original triple-slit experiment and accurate single-event counting techniques to provide a new experimental bound of 2.9 × 10^-5 on the statistical deviation from the commonly approximated null third-order interference term in Born's rule for matter waves. Our value is on the order of the maximal contribution predicted for multipath trajectories by Feynman path integrals.
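
    The quantity being bounded is the third-order (Sorkin) interference term, built from the intensities of all slit-opening combinations. Under Born's rule it vanishes identically, which a short numerical check confirms (idealized point slits; the geometry below is invented):

```python
import numpy as np

wavelength = 1.0
k = 2 * np.pi / wavelength
slit_pos = {"A": 0.0, "B": 5.0, "C": 10.0}   # slit positions (invented)
theta = np.linspace(-0.2, 0.2, 2001)          # detection angles, rad

def intensity(open_slits):
    # Wave model: amplitudes of the open (idealized point) slits add;
    # the detection probability is |amplitude|^2 (Born's rule)
    amp = sum(np.exp(1j * k * slit_pos[s] * np.sin(theta))
              for s in open_slits)
    return np.abs(amp) ** 2

# Third-order (Sorkin) term: any nonzero value would violate Born's rule
eps = (intensity("ABC") - intensity("AB") - intensity("AC") - intensity("BC")
       + intensity("A") + intensity("B") + intensity("C"))
print(np.max(np.abs(eps)))
```

    Expanding the squared moduli shows every pairwise cross term cancels exactly, so eps is zero to floating-point precision; the experiment's 2.9 × 10^-5 bound is a statistical limit on any real deviation from this cancellation.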

  15. A new u-statistic with superior design sensitivity in matched observational studies.

    PubMed

    Rosenbaum, Paul R

    2011-09-01

    In an observational or nonrandomized study of treatment effects, a sensitivity analysis indicates the magnitude of bias from unmeasured covariates that would need to be present to alter the conclusions of a naïve analysis that presumes adjustments for observed covariates suffice to remove all bias. The power of sensitivity analysis is the probability that it will reject a false hypothesis about treatment effects allowing for a departure from random assignment of a specified magnitude; in particular, if this specified magnitude is "no departure" then this is the same as the power of a randomization test in a randomized experiment. A new family of u-statistics is proposed that includes Wilcoxon's signed rank statistic but also includes other statistics with substantially higher power when a sensitivity analysis is performed in an observational study. Wilcoxon's statistic has high power to detect small effects in large randomized experiments-that is, it often has good Pitman efficiency-but small effects are invariably sensitive to small unobserved biases. Members of this family of u-statistics that emphasize medium to large effects can have substantially higher power in a sensitivity analysis. For example, in one situation with 250 pair differences that are Normal with expectation 1/2 and variance 1, the power of a sensitivity analysis that uses Wilcoxon's statistic is 0.08 while the power of another member of the family of u-statistics is 0.66. The topic is examined by performing a sensitivity analysis in three observational studies, using an asymptotic measure called the design sensitivity, and by simulating power in finite samples. The three examples are drawn from epidemiology, clinical medicine, and genetic toxicology. © 2010, The International Biometric Society.
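
    The logic of a sensitivity analysis can be sketched with the sign test, the simplest statistic of this kind (deliberately not the paper's u-statistic family): if an unobserved bias of magnitude Γ is allowed, the null chance that a matched pair's difference is positive is at most Γ/(1+Γ), and a binomial tail then bounds the worst-case p-value. The sample below mirrors the text's 250 pair differences that are Normal with expectation 1/2 and variance 1.

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(7)
# 250 matched-pair differences, Normal(1/2, 1), as in the text's example
d = rng.normal(0.5, 1.0, size=250)

def worst_case_pvalue(d, gamma):
    # Under bias at most gamma, P(positive difference | null) <= p_max,
    # so the binomial upper tail bounds the one-sided sign-test p-value
    n = int(np.count_nonzero(d))
    pos = int(np.count_nonzero(d > 0))
    p_max = gamma / (1.0 + gamma)
    return binom.sf(pos - 1, n, p_max)

for gamma in (1.0, 2.0, 3.0):
    print(gamma, worst_case_pvalue(d, gamma))
```

    At Γ = 1 (no bias, i.e. a randomization test) the result is highly significant, but the worst-case p-value climbs rapidly with Γ; statistics that emphasize medium-to-large effects, like the u-statistics proposed in the paper, retain significance at larger Γ.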

  16. Equilibrium statistical-thermal models in high-energy physics

    NASA Astrophysics Data System (ADS)

    Tawfik, Abdel Nasser

    2014-05-01

    We review some recent highlights from the applications of statistical-thermal models to different experimental measurements and lattice QCD thermodynamics that have been made during the last decade. We start with a short review of the historical milestones on the path of constructing statistical-thermal models for heavy-ion physics. We found that Heinz Koppe formulated, in 1948, an almost complete recipe for the statistical-thermal models. In 1950, Enrico Fermi generalized this statistical approach, in which he started with a general cross-section formula and inserted into it the simplifying assumptions about the matrix element of the interaction process that likely reflects many features of the high-energy reactions dominated by density in the phase space of final states. In 1964, Hagedorn systematically analyzed the high-energy phenomena using all tools of statistical physics and introduced the concept of limiting temperature based on the statistical bootstrap model. It turns out quite often that many-particle systems can be studied with the help of statistical-thermal methods. The analysis of yield multiplicities in high-energy collisions gives overwhelming evidence for the chemical equilibrium in the final state. The strange particles might be an exception, as they are suppressed at lower beam energies. However, their relative yields fulfill statistical equilibrium as well. We review the equilibrium statistical-thermal models for particle production, fluctuations and collective flow in heavy-ion experiments. We also review their reproduction of the lattice QCD thermodynamics at vanishing and finite chemical potential. During the last decade, five conditions have been suggested to describe the universal behavior of the chemical freeze-out parameters. The higher order moments of multiplicity have been discussed. They offer deep insights into particle production and critical fluctuations. 
Therefore, we use them to describe the freeze-out parameters and suggest the location of the QCD critical endpoint. Various extensions have been proposed in order to take into consideration the possible deviations of the ideal hadron gas. We highlight various types of interactions, dissipative properties and location-dependences (spatial rapidity). Furthermore, we review three models combining hadronic with partonic phases; quasi-particle model, linear sigma model with Polyakov potentials and compressible bag model.

  17. Development of Large Area Emulsion Chamber Methods with a Super Conducting Magnet for Observation of Cosmic Ray Nuclei from 1 GeV to 1,000 TeV (Emulsion Techniques)

    NASA Technical Reports Server (NTRS)

    Takahashi, Yoshiyuki; Gregory, John C.; Tominaga, Taka; Dong, Bei Lei

    1997-01-01

    The research developed the fundamental techniques of the emulsion chamber methods that permit measurements of the composition and energy spectra of cosmic rays at energies ranging from 1 GeV/n to over 1,000 TeV/n. The research program consisted of exploring new principles and techniques in measuring very high energy cosmic nuclei with large-area emulsion chambers for high statistics experiments. These tasks have been accomplished and their use was essential in successful analysis of the balloon-borne emulsion chamber experiments up to 10(exp 14) eV. It also provided the fundamental technologies for designing large-area detectors that are aimed at measuring the composition above the 10(exp 15) eV region. The latter is now partially succeeded by a NASA Mission Concept, Advanced Cosmic Composition Experiments on the Space Station (ACCESS). The cosmic ray group at the University of Alabama in Huntsville has performed technological R & D as well as contributing to the Japanese-American-Emulsion-Chamber-Experiments (JACEE) Collaboration with the regular data analysis. While primary research support for other institutions' efforts in the JACEE experiments came from NSF and DOE, primary support for the University of Alabama in Huntsville was this contract. Supplemental tasks to standardize the database and hardware upgrades (automated microscope) had this institution's cooperation. Investigation of new techniques in this program consisted of development of a fast calorimetry, magnetic/scattering selection of high momentum tracks for a pairmeter, and high statistics momentum measurements for low energy nuclei (E < 1 TeV/n). The highest energy calorimetry and a pairmeter have been considered as strawman instruments by the GOAL (Galactic Origin and Acceleration Limit) proposal of the NASA Cosmic Ray Working Group for long-duration balloon flights. We accomplished the objectives of the GOAL program with three circumpolar, Antarctic JACEE balloon flights during 1992 - 1994.

  18. Defense Small Business Innovation Research Program (SBIR), Volume 4, Defense Agencies Abstracts of Phase 1 Awards 1991

    DTIC Science & Technology

    1991-01-01

    EXPERIENCE IN DEVELOPING INTEGRATED OPTICAL DEVICES, NONLINEAR MAGNETIC-OPTIC MATERIALS, HIGH FREQUENCY MODULATORS, COMPUTER-AIDED MODELING AND SOPHISTICATED... HIGH-LEVEL PRESENTATION AND DISTRIBUTED CONTROL MODELS FOR INTEGRATING HETEROGENEOUS MECHANICAL ENGINEERING APPLICATIONS AND TOOLS. THE DESIGN IS FOCUSED...STATISTICALLY ACCURATE WORST CASE DEVICE MODELS FOR CIRCUIT SIMULATION. PRESENT METHODS OF WORST CASE DEVICE DESIGN ARE AD HOC AND DO NOT ALLOW THE

  19. Progress and Challenges in Short to Medium Range Coupled Prediction

    NASA Technical Reports Server (NTRS)

    Brassington, G. B.; Martin, M. J.; Tolman, H. L.; Akella, Santha; Balmeseda, M.; Chambers, C. R. S.; Cummings, J. A.; Drillet, Y.; Jansen, P. A. E. M.; Laloyaux, P.; et al.

    2014-01-01

    The availability of GODAE Oceanview-type ocean forecast systems provides the opportunity to develop high-resolution, short- to medium-range coupled prediction systems. Several groups have undertaken the first experiments based on relatively unsophisticated approaches. Progress is being driven at the institutional level targeting a range of applications that represent their respective national interests with clear overlaps and opportunities for information exchange and collaboration. These include general circulation, hurricanes, extra-tropical storms, high-latitude weather and sea-ice forecasting as well as coastal air-sea interaction. In some cases, research has moved beyond case and sensitivity studies to controlled experiments to obtain statistically significant metrics.

  20. Patient experience and process measures of quality of care at home health agencies: Factors associated with high performance.

    PubMed

    Smith, Laura M; Anderson, Wayne L; Lines, Lisa M; Pronier, Cristalle; Thornburg, Vanessa; Butler, Janelle P; Teichman, Lori; Dean-Whittaker, Debra; Goldstein, Elizabeth

    2017-01-01

    We examined the effects of provider characteristics on home health agency performance on patient experience of care (Home Health CAHPS) and process (OASIS) measures. Descriptive, multivariate, and factor analyses were used. While agencies score high on both domains, factor analyses showed that the underlying items represent separate constructs. Freestanding and Visiting Nurse Association agencies, higher number of home health aides per 100 episodes, and urban location were statistically significant predictors of lower performance. Lack of variation in composite measures potentially led to counterintuitive results for effects of organizational characteristics. This exploratory study showed the value of having separate quality domains.

  1. Statistical mixture design and multivariate analysis of inkjet printed a-WO3/TiO2/WOX electrochromic films.

    PubMed

    Wojcik, Pawel Jerzy; Pereira, Luís; Martins, Rodrigo; Fortunato, Elvira

    2014-01-13

    An efficient mathematical strategy in the field of solution-processed electrochromic (EC) films is outlined as a combination of experimental work, modeling, and information extraction from massive computational data via statistical software. Design of Experiments (DOE) was used for statistical multivariate analysis and prediction of mixtures through a multiple regression model, as well as for the optimization of a five-component sol-gel precursor subject to complex constraints. This approach significantly reduces the number of experiments to be realized, from 162 in the full factorial (L=3) and 72 in the extreme-vertices (D=2) approach down to only 30 runs, while still maintaining high accuracy of the analysis. By carrying out a finite number of experiments, the empirical modeling in this study shows reasonably good prediction ability in terms of overall EC performance. An optimized ink formulation was employed in a prototype passive EC matrix fabricated to test this optically active material system together with a solid-state electrolyte for prospective application in EC displays. Coupling DOE with chromogenic material formulation shows the potential to maximize the capabilities of these systems and ensures increased productivity in many potential solution-processed electrochemical applications.
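    The run-count reduction described above comes from constrained mixture designs. As a rough illustration (not the authors' actual design), a simplex-lattice enumeration shows how a candidate set of blends is generated for a five-component mixture before a smaller (e.g. D-optimal) subset is selected to fit a run budget:

```python
from itertools import product
from fractions import Fraction

def simplex_lattice(q, m):
    """Enumerate a {q, m} simplex-lattice design: all q-component blends
    whose proportions are multiples of 1/m and sum to 1."""
    levels = [Fraction(i, m) for i in range(m + 1)]
    return [pt for pt in product(levels, repeat=q) if sum(pt) == 1]

# A {5, 2} lattice gives C(5+2-1, 2) = 15 candidate blends; in practice a
# D-optimal subset of a richer candidate set is chosen to meet a run budget.
design = simplex_lattice(5, 2)
print(len(design))  # 15 candidate mixtures
```

    The `Fraction` arithmetic keeps the mixture constraint exact; with real constraints on component ranges, the candidate set is filtered further before subset selection.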

  2. Water Quality Statistics

    ERIC Educational Resources Information Center

    Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain

    2004-01-01

    Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…

  3. Development of High Sensitivity Nuclear Emulsion and Fine Grained Emulsion

    NASA Astrophysics Data System (ADS)

    Kawahara, H.; Asada, T.; Naka, T.; Naganawa, N.; Kuwabara, K.; Nakamura, M.

    2014-08-01

    Nuclear emulsion is a particle detector with high spatial and angular resolution. It became useful for large statistics experiments thanks to the development of automatic scanning systems. In 2010, a facility for emulsion production was introduced and R&D of nuclear emulsion began at Nagoya University. In this paper, we present results of the development of high sensitivity emulsion and fine-grained emulsion for dark matter search experiments. Sensitivity is improved by raising the density of silver halide crystals and doping a well-adjusted amount of chemicals. Production of fine-grained emulsion was difficult because of unexpected crystal condensation. By mixing polyvinyl alcohol (PVA) into the gelatin binder, we succeeded in making a stable fine-grained emulsion.

  4. Magnetic Johnson Noise Constraints on Electron Electric Dipole Moment Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munger, C.

    2004-11-18

    Magnetic fields from statistical fluctuations in currents in conducting materials broaden atomic linewidths through the Zeeman effect. The constraints this imposes on the design of experiments to measure the electric dipole moment of the electron are analyzed. Contrary to the predictions of Lamoreaux [S. K. Lamoreaux, Phys. Rev. A 60, 1717 (1999)], the standard material for high-permeability magnetic shields proves to be as significant a source of broadening as an ordinary metal. A scheme that would replace this standard material with ferrite is proposed.

  5. Medical intelligence in Sweden. Vitamin B12: oral compared with parenteral?

    PubMed

    Nilsson, M; Norberg, B; Hultdin, J; Sandström, H; Westman, G; Lökk, J

    2005-03-01

    Sweden is the only country in which oral high-dose vitamin B12 has gained widespread use in the treatment of deficiency states. The aim of the study was to describe prescribing patterns and sales statistics of vitamin B12 tablets and injections in Sweden from 1990 to 2000. Design, setting, and sources: official statistics of cobalamin prescriptions and sales were used. The use of vitamin B12 increased in Sweden over 1990-2000, mainly because of an increase in the use of oral high-dose vitamin B12 therapy. The experience, in statistical terms a "total investigation", comprised 1,000,000 patient-years for tablets and 750,000 patient-years for injections. During 2000, 13% of residents aged 70 and over were treated with vitamin B12, two out of three with the tablet preparation. Most patients in Sweden requiring vitamin B12 therapy have switched from parenteral to oral high-dose vitamin B12 since 1964, when the oral preparation was introduced. The findings suggest that many patients in other post-industrial societies may also be suitable for oral vitamin B12 treatment.

  6. Assay optimization: a statistical design of experiments approach.

    PubMed

    Altekar, Maneesha; Homon, Carol A; Kashem, Mohammed A; Mason, Steven W; Nelson, Richard M; Patnaude, Lori A; Yingling, Jeffrey; Taylor, Paul B

    2007-03-01

    With the transition from manual to robotic HTS in the last several years, assay optimization has become a significant bottleneck. Recent advances in robotic liquid handling have made it feasible to reduce assay optimization timelines with the application of statistically designed experiments. When implemented, they can efficiently optimize assays by rapidly identifying significant factors, complex interactions, and nonlinear responses. This article focuses on the use of statistically designed experiments in assay optimization.

  7. The Developing Infant Creates a Curriculum for Statistical Learning.

    PubMed

    Smith, Linda B; Jayaraman, Swapnaa; Clerkin, Elizabeth; Yu, Chen

    2018-04-01

    New efforts are using head cameras and eye-trackers worn by infants to capture everyday visual environments from the point of view of the infant learner. From this vantage point, the training sets for statistical learning develop as the sensorimotor abilities of the infant develop, yielding a series of ordered datasets for visual learning that differ in content and structure between timepoints but are highly selective at each timepoint. These changing environments may constitute a developmentally ordered curriculum that optimizes learning across many domains. Future advances in computational models will be necessary to connect the developmentally changing content and statistics of infant experience to the internal machinery that does the learning. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Making Spatial Statistics Service Accessible On Cloud Platform

    NASA Astrophysics Data System (ADS)

    Mu, X.; Wu, J.; Li, T.; Zhong, Y.; Gao, X.

    2014-04-01

    Web services can bring together applications running on diverse platforms; through such a platform, users can access and share data, information and models more effectively and conveniently. Cloud computing has emerged as a paradigm of Internet computing in which dynamic, scalable and often virtualized resources are provided as services. With the rapid growth of massive data and the limitations of the network, traditional web service platforms face prominent problems such as computational efficiency, maintenance cost and data security. In this paper, we offer a spatial statistics service based on the Microsoft cloud. An experiment was carried out to evaluate the availability and efficiency of this service. The results show that the spatial statistics service is conveniently accessible to the public with high processing efficiency.

  9. The effect of the flipped classroom on urban high school students' motivation and academic achievement in a high school science course

    NASA Astrophysics Data System (ADS)

    Dixon, Keshia L.

    This study investigated the effect of the flipped classroom on urban high school students' motivation and academic achievement in a high school science course. In this quantitative study, the sample population was comprised of North Star High School 12th grade students enrolled in human anatomy and physiology. A quasi-experimental, pretest-posttest non-equivalent group design was conducted. After approval was received from the Liberty University Institutional Review Board and the school district's Department of Research and Evaluation for School Improvement, students completed a pretest comprised of the Science Motivation Questionnaire II (SMQ-II) and the Human Anatomy and Physiology Unit Test. Participants in the experimental group engaged in the treatment, the flipped classroom, using instructional materials on the educational website Edmodo(TM), and applied content material taught using hands-on activities inclusive of assigned laboratory experiments. Participants in the control group received instruction using a traditional face-to-face lecture-homework format while also engaging in assigned laboratory experiments. After completion of the treatment, all participants completed a posttest. Pretest and posttest data were analyzed using two separate one-way ANOVA/ANCOVA analyses, and the researcher reported the results of the statistical analyses. After the analyses were completed and the results interpreted, recommendations for future research were given.

  10. Methods for processing microarray data.

    PubMed

    Ares, Manuel

    2014-02-01

    Quality control must be maintained at every step of a microarray experiment, from RNA isolation through statistical evaluation. Here we provide suggestions for analyzing microarray data. Because the utility of the results depends directly on the design of the experiment, the first critical step is to ensure that the experiment can be properly analyzed and interpreted. What is the biological question? What is the best way to perform the experiment? How many replicates will be required to obtain the desired statistical resolution? Next, the samples must be prepared, pass quality controls for integrity and representation, and be hybridized and scanned. Also, slides with defects, missing data, high background, or weak signal must be rejected. Data from individual slides must be normalized and combined so that the data are as free of systematic bias as possible. The third phase is to apply statistical filters and tests to the data to determine genes (1) expressed above background, (2) whose expression level changes in different samples, and (3) whose RNA-processing patterns or protein associations change. Next, a subset of the data should be validated by an alternative method, such as reverse transcription-polymerase chain reaction (RT-PCR). Provided that this endorses the general conclusions of the array analysis, gene sets whose expression, splicing, polyadenylation, protein binding, etc. change in different samples can be classified with respect to function, sequence motif properties, as well as other categories to extract hypotheses for their biological roles and regulatory logic.
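    The normalization step described above can take many forms; one common choice for combining data across slides (not necessarily the method intended here) is quantile normalization, which forces all arrays to share a common intensity distribution. A minimal stdlib sketch with hypothetical intensity values:

```python
def quantile_normalize(arrays):
    """Quantile-normalize equal-length intensity vectors: replace each
    value by the mean of the values sharing its rank across arrays."""
    n = len(arrays[0])
    # within-array rank order (indices sorted by intensity)
    orders = [sorted(range(n), key=a.__getitem__) for a in arrays]
    # reference distribution: mean of the k-th smallest values across arrays
    ref = [sum(a[o[k]] for a, o in zip(arrays, orders)) / len(arrays)
           for k in range(n)]
    out = [[0.0] * n for _ in arrays]
    for a_out, o in zip(out, orders):
        for rank, idx in enumerate(o):
            a_out[idx] = ref[rank]
    return out

slides = [[2.0, 8.0, 4.0], [3.0, 9.0, 6.0]]  # illustrative intensities
norm = quantile_normalize(slides)
# both normalized slides now share the same value distribution
```

    After this step, systematic intensity differences between slides are removed while within-slide rank order is preserved, which is what downstream statistical filters assume.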

  11. Statistical models for RNA-seq data derived from a two-condition 48-replicate experiment.

    PubMed

    Gierliński, Marek; Cole, Christian; Schofield, Pietà; Schurch, Nicholas J; Sherstnev, Alexander; Singh, Vijender; Wrobel, Nicola; Gharbi, Karim; Simpson, Gordon; Owen-Hughes, Tom; Blaxter, Mark; Barton, Geoffrey J

    2015-11-15

    High-throughput RNA sequencing (RNA-seq) is now the standard method to determine differential gene expression. Identifying differentially expressed genes crucially depends on estimates of read-count variability. These estimates are typically based on statistical models such as the negative binomial distribution, which is employed by the tools edgeR, DESeq and cuffdiff. Until now, the validity of these models has usually been tested on either low-replicate RNA-seq data or simulations. A 48-replicate RNA-seq experiment in yeast was performed and data tested against theoretical models. The observed gene read counts were consistent with both log-normal and negative binomial distributions, while the mean-variance relation followed the line of constant dispersion parameter of ∼0.01. The high-replicate data also allowed for strict quality control and screening of 'bad' replicates, which can drastically affect the gene read-count distribution. RNA-seq data have been submitted to ENA archive with project ID PRJEB5348. g.j.barton@dundee.ac.uk. © The Author 2015. Published by Oxford University Press.
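    The mean-variance relation reported above (constant dispersion of about 0.01) corresponds to Var = mu + phi * mu^2 under the negative binomial model. A minimal NumPy sketch, with illustrative values for mu and phi rather than values taken from the paper's data, shows how the size/probability parameterization maps onto that relation:

```python
import numpy as np

rng = np.random.default_rng(0)
phi, mu = 0.01, 1000.0        # dispersion and mean read count (illustrative)
n = 1.0 / phi                 # NB "size" parameter
p = n / (n + mu)              # success probability chosen so the mean is mu

counts = rng.negative_binomial(n, p, size=200_000)

# under this parameterization: Var = mu + phi * mu^2 = 11,000 here
predicted_var = mu + phi * mu**2
print(counts.mean(), counts.var())  # ~1000, ~11000
```

    At low mu the Poisson term dominates; at high read counts the quadratic overdispersion term phi * mu^2 takes over, which is why tools such as edgeR and DESeq estimate phi rather than assume Poisson variability.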

  12. Statistical models for RNA-seq data derived from a two-condition 48-replicate experiment

    PubMed Central

    Cole, Christian; Schofield, Pietà; Schurch, Nicholas J.; Sherstnev, Alexander; Singh, Vijender; Wrobel, Nicola; Gharbi, Karim; Simpson, Gordon; Owen-Hughes, Tom; Blaxter, Mark; Barton, Geoffrey J.

    2015-01-01

    Motivation: High-throughput RNA sequencing (RNA-seq) is now the standard method to determine differential gene expression. Identifying differentially expressed genes crucially depends on estimates of read-count variability. These estimates are typically based on statistical models such as the negative binomial distribution, which is employed by the tools edgeR, DESeq and cuffdiff. Until now, the validity of these models has usually been tested on either low-replicate RNA-seq data or simulations. Results: A 48-replicate RNA-seq experiment in yeast was performed and data tested against theoretical models. The observed gene read counts were consistent with both log-normal and negative binomial distributions, while the mean-variance relation followed the line of constant dispersion parameter of ∼0.01. The high-replicate data also allowed for strict quality control and screening of ‘bad’ replicates, which can drastically affect the gene read-count distribution. Availability and implementation: RNA-seq data have been submitted to ENA archive with project ID PRJEB5348. Contact: g.j.barton@dundee.ac.uk PMID:26206307

  13. ‘Dignity therapy’, a promising intervention in palliative care: A comprehensive systematic literature review

    PubMed Central

    Martínez, Marina; Arantzamendi, María; Belar, Alazne; Carrasco, José Miguel; Carvajal, Ana; Rullán, María; Centeno, Carlos

    2016-01-01

    Background: Dignity therapy is a psychotherapy intended to relieve psychological and existential distress in patients at the end of life. Little is known about its effect. Aim: To analyse the outcomes of dignity therapy in patients with advanced life-threatening diseases. Design: A systematic review was conducted. Three authors extracted data from the articles and evaluated quality using the Critical Appraisal Skills Programme. Data were synthesized, considering study objectives. Data sources: PubMed, CINAHL, Cochrane Library and PsycINFO. The years searched were 2002 (the year dignity therapy was developed) to January 2016. ‘Dignity therapy’ was used as the search term. Studies with patients with advanced life-threatening diseases were included. Results: Of 121 studies, 28 were included. The quality of the included studies is high. Results were grouped into effectiveness; satisfaction; suitability and feasibility; and adaptability to different diseases and cultures. Two of five randomized controlled trials applied dignity therapy to patients with high levels of baseline psychological distress. One showed a statistically significant decrease in patients’ anxiety and depression scores over time; the other showed a statistically significant decrease in anxiety scores pre–post dignity therapy, but not in depression. Nonrandomized studies suggested statistically significant improvements in existential and psychosocial measurements. Patients, relatives and professionals perceived that it improved the end-of-life experience. Conclusion: Evidence suggests that dignity therapy is beneficial. One randomized controlled trial with patients with high levels of psychological distress shows the therapy’s efficacy on anxiety and depression scores. Studies with other designs report beneficial outcomes in terms of end-of-life experience. Further research should examine how dignity therapy functions, establish a means for measuring its impact, and assess whether patients with high levels of distress benefit most from this therapy. PMID:27566756

  14. Rhythmic grouping biases constrain infant statistical learning

    PubMed Central

    Hay, Jessica F.; Saffran, Jenny R.

    2012-01-01

    Linguistic stress and sequential statistical cues to word boundaries interact during speech segmentation in infancy. However, little is known about how the different acoustic components of stress constrain statistical learning. The current studies were designed to investigate whether intensity and duration each function independently as cues to initial prominence (trochaic-based hypothesis) or whether, as predicted by the Iambic-Trochaic Law (ITL), intensity and duration have characteristic and separable effects on rhythmic grouping (ITL-based hypothesis) in a statistical learning task. Infants were familiarized with an artificial language (Experiments 1 & 3) or a tone stream (Experiment 2) in which there was an alternation in either intensity or duration. In addition to potential acoustic cues, the familiarization sequences also contained statistical cues to word boundaries. In speech (Experiment 1) and non-speech (Experiment 2) conditions, 9-month-old infants demonstrated discrimination patterns consistent with an ITL-based hypothesis: intensity signaled initial prominence and duration signaled final prominence. The results of Experiment 3, in which 6.5-month-old infants were familiarized with the speech streams from Experiment 1, suggest that there is a developmental change in infants’ willingness to treat increased duration as a cue to word offsets in fluent speech. Infants’ perceptual systems interact with linguistic experience to constrain how infants learn from their auditory environment. PMID:23730217

  15. Experimental toxicology: Issues of statistics, experimental design, and replication.

    PubMed

    Briner, Wayne; Kirwan, Jeral

    2017-01-01

    The difficulty of replicating experiments has drawn considerable attention. Issues with replication arise for a variety of reasons, ranging from experimental design to laboratory errors to inappropriate statistical analysis. Here we review a variety of guidelines for the statistical analysis, design, and execution of experiments in toxicology. In general, replication can be improved by using hypothesis-driven experiments with adequate sample sizes, randomization, and blind data-collection techniques. Copyright © 2016 Elsevier B.V. All rights reserved.
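    One of the guidelines mentioned above, an adequate sample size, is commonly checked with a normal-approximation power calculation before an experiment is run. A stdlib sketch of the standard two-sample formula, with hypothetical effect-size inputs:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.8):
    """Normal-approximation sample size per group for a two-sample
    comparison of means with two-sided significance level alpha."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # e.g. 1.96 for alpha = 0.05
    z_beta = z.inv_cdf(power)            # e.g. 0.84 for 80% power
    return ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)

# detecting a half-standard-deviation difference between two groups:
print(n_per_group(delta=0.5, sigma=1.0))  # 63 per group
```

    The exact t-based answer is slightly larger; the approximation is usually close enough for planning, and it makes explicit how the required n grows with the square of sigma/delta.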

  16. Sparse approximation of currents for statistics on curves and surfaces.

    PubMed

    Durrleman, Stanley; Pennec, Xavier; Trouvé, Alain; Ayache, Nicholas

    2008-01-01

    Computing, processing, and visualizing statistics on shapes such as curves or surfaces is a real challenge, with applications ranging from medical image analysis to computational geometry. Modelling such geometrical primitives with currents avoids both feature-based approaches and point-correspondence methods. This framework has proved powerful for registering brain surfaces and for measuring geometrical invariants. However, while state-of-the-art methods perform pairwise registrations efficiently, new numerical schemes are required to process groupwise statistics, whose complexity grows with the size of the database. Statistics such as the mean and principal modes of a set of shapes often have a heavy and highly redundant representation. We therefore propose to find an adapted basis on which the mean and principal modes have a sparse decomposition. Besides the computational improvement, this sparse representation offers a way to visualize and interpret statistics on currents. Experiments show the relevance of the approach on 34 sets of 70 sulcal lines and on 50 sets of 10 meshes of deep brain structures.
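    The idea of a sparse decomposition on an adapted basis can be illustrated with plain matching pursuit, a greedy sparse-approximation scheme; this is a simplification on synthetic vectors, not the paper's actual algorithm for currents:

```python
import numpy as np

def matching_pursuit(signal, dictionary, k):
    """Greedy sparse approximation: repeatedly pick the atom most
    correlated with the residual. `dictionary` has unit-norm columns."""
    residual = signal.astype(float)
    coeffs = np.zeros(dictionary.shape[1])
    for _ in range(k):
        corr = dictionary.T @ residual          # correlation with every atom
        j = int(np.argmax(np.abs(corr)))        # best-matching atom
        coeffs[j] += corr[j]
        residual = residual - corr[j] * dictionary[:, j]
    return coeffs, residual

# With an orthonormal dictionary, k steps exactly recover a k-sparse signal.
rng = np.random.default_rng(1)
D, _ = np.linalg.qr(rng.standard_normal((20, 20)))  # orthonormal columns
true = np.zeros(20)
true[[3, 7, 11]] = [2.0, -1.5, 0.8]
x = D @ true
est, res = matching_pursuit(x, D, k=3)
```

    With redundant (non-orthogonal) dictionaries, as in the currents setting, orthogonal matching pursuit or a dedicated scheme is preferred, but the greedy pick-subtract loop above is the common core.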

  17. Real-world visual statistics and infants' first-learned object names

    PubMed Central

    Clerkin, Elizabeth M.; Hart, Elizabeth; Rehg, James M.; Yu, Chen

    2017-01-01

    We offer a new solution to the unsolved problem of how infants break into word learning based on the visual statistics of everyday infant-perspective scenes. Images from head camera video captured by 8 1/2 to 10 1/2 month-old infants at 147 at-home mealtime events were analysed for the objects in view. The images were found to be highly cluttered with many different objects in view. However, the frequency distribution of object categories was extremely right skewed such that a very small set of objects was pervasively present—a fact that may substantially reduce the problem of referential ambiguity. The statistical structure of objects in these infant egocentric scenes differs markedly from that in the training sets used in computational models and in experiments on statistical word-referent learning. Therefore, the results also indicate a need to re-examine current explanations of how infants break into word learning. This article is part of the themed issue ‘New frontiers for statistical learning in the cognitive sciences’. PMID:27872373

  18. The Functional Measurement Experiment Builder suite: two Java-based programs to generate and run functional measurement experiments.

    PubMed

    Mairesse, Olivier; Hofmans, Joeri; Theuns, Peter

    2008-05-01

    We propose a free, easy-to-use computer program that does not require prior knowledge of computer programming to generate and run experiments using textual or pictorial stimuli. Although the FM Experiment Builder suite was initially programmed for building and conducting FM experiments, it can also be applied to non-FM experiments that require randomized, single-, or multifactorial designs. The program is highly configurable, allowing multilingual use and a wide range of response formats. The output of each experiment is a Microsoft Excel compatible .xls file that allows easy copy-and-paste of the results into Weiss's FM CalSTAT program (2006) or any other statistical package. Its Java-based structure is compatible with both Windows and Macintosh operating systems, and its compactness (< 1 MB) makes it easily distributable over the Internet.

  19. The Asymmetry Parameter and Branching Ratio of Sigma Plus Radiative Decay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foucher, Maurice Emile

    1992-05-01

    We have measured the asymmetry parameter and branching ratio of the $\Sigma^+$ radiative decay. This high statistics experiment (FNAL 761) was performed in the Proton Center charged hyperon beam at Fermi National Accelerator Laboratory in Batavia, Illinois. We find for the asymmetry parameter $-0.720 \pm 0.086 \pm 0.045$, where the first error is statistical and the second is systematic. This result is based on a sample of $34754 \pm 212$ events. We find a preliminary value for the branching ratio $Br(\Sigma^+ \to p\gamma) / Br(\Sigma^+ \to p\pi^0) = (2.14 \pm 0.07 \pm 0.11) \times 10^{-3}$, where the first error is statistical and the second is systematic. This result is based on a sample of $31040 \pm 650$ events. Both results are in agreement with previous low statistics measurements.

  20. Image statistics underlying natural texture selectivity of neurons in macaque V4

    PubMed Central

    Okazawa, Gouki; Tajima, Satohiro; Komatsu, Hidehiko

    2015-01-01

    Our daily visual experiences are inevitably linked to recognizing the rich variety of textures. However, how the brain encodes and differentiates a plethora of natural textures remains poorly understood. Here, we show that many neurons in macaque V4 selectively encode sparse combinations of higher-order image statistics to represent natural textures. We systematically explored neural selectivity in a high-dimensional texture space by combining texture synthesis and efficient-sampling techniques. This yielded parameterized models for individual texture-selective neurons. The models provided parsimonious but powerful predictors of each neuron’s preferred textures using a sparse combination of image statistics. As a whole population, the neuronal tuning was distributed in a way suitable for categorizing textures and quantitatively predicts human ability to discriminate textures. Together, these results suggest that the collective representation of visual image statistics in V4 plays a key role in organizing natural texture perception. PMID:25535362

  1. Landing Site Dispersion Analysis and Statistical Assessment for the Mars Phoenix Lander

    NASA Technical Reports Server (NTRS)

    Bonfiglio, Eugene P.; Adams, Douglas; Craig, Lynn; Spencer, David A.; Strauss, William; Seelos, Frank P.; Seelos, Kimberly D.; Arvidson, Ray; Heet, Tabatha

    2008-01-01

    The Mars Phoenix Lander launched on August 4, 2007 and successfully landed on Mars 10 months later, on May 25, 2008. Landing ellipse predictions and hazard maps were key in selecting safe surface targets for Phoenix. Hazard maps were based on terrain slopes, geomorphology maps, and automated rock counts from MRO's High Resolution Imaging Science Experiment (HiRISE) images. The expected landing dispersion that led to the selection of Phoenix's surface target is discussed, as well as the actual landing dispersion predictions determined during operations in the weeks, days, and hours before landing. A statistical assessment of these dispersions is performed, comparing the actual landing-safety probabilities to criteria levied by the project. Also discussed are applications of this statistical analysis that were used by the Phoenix project, including verifying the effectiveness of a pre-planned maneuver menu and calculating the probability of future maneuvers.

  2. The invariant statistical rule of aerosol scattering pulse signal modulated by random noise

    NASA Astrophysics Data System (ADS)

    Yan, Zhen-gang; Bian, Bao-Min; Yang, Juan; Peng, Gang; Li, Zhen-hua

    2010-11-01

    A model of random background noise acting on particle signals is established to study how the background noise of the photoelectric sensor in a laser airborne particle counter affects the statistical character of aerosol scattering pulse signals. The results show that the noise broadens the statistical distribution of the particle measurements. Further numerical study shows that the output signal amplitude retains the same distribution when airborne particles with a lognormal distribution are modulated by random noise that is also lognormally distributed; that is, the distribution obeys a law of statistical invariance. Based on this model, the background noise of the photoelectric sensor and the counting distributions of the random aerosol scattering pulse signal are obtained and analyzed using a high-speed data acquisition card (PCI-9812). The experimental and simulation results agree well.
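    The invariance reported here has a simple interpretation when the modulation is multiplicative: the product of two lognormal variables is itself lognormal, because the log-amplitudes add as Gaussians. A short stdlib sketch with illustrative parameters (not those of the PCI-9812 measurements):

```python
import random
from math import log, sqrt
from statistics import fmean, stdev

random.seed(0)
# signal amplitude and multiplicative noise, both lognormal (illustrative)
mu_s, s_s = 1.0, 0.4
mu_n, s_n = 0.0, 0.3
N = 50_000
out = [random.lognormvariate(mu_s, s_s) * random.lognormvariate(mu_n, s_n)
       for _ in range(N)]

# log of the product is Gaussian with mean mu_s + mu_n
# and standard deviation sqrt(s_s**2 + s_n**2) = 0.5 here
logs = [log(v) for v in out]
print(fmean(logs), stdev(logs))  # ~1.0, ~0.5
```

    The same closure does not hold for additive noise, which is one reason the multiplicative (modulation) model is a natural fit for the observed distribution broadening.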

  3. Statistical Analysis of Spectral Properties and Prosodic Parameters of Emotional Speech

    NASA Astrophysics Data System (ADS)

    Přibil, J.; Přibilová, A.

    2009-01-01

    The paper addresses the reflection of microintonation and spectral properties in male and female acted emotional speech. The microintonation component of speech melody is analyzed with regard to its spectral and statistical parameters. According to psychological research on emotional speech, different emotions are accompanied by different spectral noise. We control its amount by spectral flatness, according to which high-frequency noise is mixed into voiced frames during cepstral speech synthesis. Our experiments are aimed at statistical analysis of cepstral coefficient values and ranges of spectral flatness in three emotions (joy, sadness, anger) and a neutral state for comparison. Calculated histograms of the spectral flatness distribution are compared visually and modelled by a Gamma probability distribution. Histograms of the cepstral coefficient distribution are evaluated and compared using skewness and kurtosis. The statistical results show good agreement between male and female voices for all emotional states, portrayed by several Czech and Slovak professional actors.
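    The moment-based summaries used above (Gamma modelling and skewness) can be sketched with a method-of-moments Gamma fit and a sample-skewness estimate; the parameter values below are synthetic, not drawn from the speech data:

```python
import random
from statistics import fmean, pvariance

def gamma_moments_fit(xs):
    """Method-of-moments Gamma fit: shape k = mean^2/var, scale = var/mean."""
    m, v = fmean(xs), pvariance(xs)
    return m * m / v, v / m

def skewness(xs):
    """Sample skewness: third central moment over variance^(3/2)."""
    m, v = fmean(xs), pvariance(xs)
    return fmean([(x - m) ** 3 for x in xs]) / v ** 1.5

random.seed(0)
xs = [random.gammavariate(2.0, 3.0) for _ in range(100_000)]
k, theta = gamma_moments_fit(xs)
# a Gamma(k) distribution has skewness 2/sqrt(k), so ~1.41 for k = 2
print(k, theta, skewness(xs))
```

    Fitting the histogram with a Gamma and then comparing skewness/kurtosis across emotions, as the paper does, reduces each distribution to a few comparable numbers.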

  4. [Road Extraction in Remote Sensing Images Based on Spectral and Edge Analysis].

    PubMed

    Zhao, Wen-zhi; Luo, Li-qun; Guo, Zhou; Yue, Jun; Yu, Xue-ying; Liu, Hui; Wei, Jing

    2015-10-01

    Roads are typical man-made objects in urban areas, and road extraction from high-resolution images has important applications in urban planning and transportation development. However, owing to confusable spectral characteristics, it is difficult to distinguish roads from other objects using traditional classification methods that depend mainly on spectral information. Edges are an important feature for identifying linear objects (e.g., roads), and their distribution patterns vary greatly among different objects, so it is crucial to merge edge statistics with spectral information. In this study, a new method that combines spectral information with edge statistical features is proposed. First, edge detection is conducted using a self-adaptive mean-shift algorithm on the panchromatic band, which greatly reduces pseudo-edges and noise effects. Then, edge statistical features are obtained from an edge statistical model that measures the length and angle distribution of edges. Finally, by integrating the spectral and edge statistical features, an SVM algorithm is used to classify the image, and roads are ultimately extracted. A series of experiments shows that the overall accuracy of the proposed method is 93%, compared with only 78% for the traditional method. The results demonstrate that the proposed method is efficient and valuable for road extraction, especially from high-resolution images.

  5. Application of multivariate statistical techniques in microbial ecology

    PubMed Central

    Paliy, O.; Shankar, V.

    2016-01-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large scale ecological datasets. Especially noticeable effect has been attained in the field of microbial ecology, where new experimental approaches provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amounts of data, powerful statistical techniques of multivariate analysis are well suited to analyze and interpret these datasets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular dataset. In this review we describe and compare the most widely used multivariate statistical techniques including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and dataset structure. PMID:26786791

  6. Effect of Task Presentation on Students' Performances in Introductory Statistics Courses

    ERIC Educational Resources Information Center

    Tomasetto, Carlo; Matteucci, Maria Cristina; Carugati, Felice; Selleri, Patrizia

    2009-01-01

    Research on academic learning indicates that many students experience major difficulties with introductory statistics and methodology courses. We hypothesized that students' difficulties may depend in part on the fact that statistics tasks are commonly viewed as related to the threatening domain of math. In two field experiments which we carried…

  7. Efficient summary statistical representation when change localization fails.

    PubMed

    Haberman, Jason; Whitney, David

    2011-10-01

    People are sensitive to the summary statistics of the visual world (e.g., average orientation/speed/facial expression). We readily derive this information from complex scenes, often without explicit awareness. Given the fundamental and ubiquitous nature of summary statistical representation, we tested whether this kind of information is subject to the attentional constraints imposed by change blindness. We show that information regarding the summary statistics of a scene is available despite limited conscious access. In a novel experiment, we found that while observers can suffer from change blindness (i.e., not localize where change occurred between two views of the same scene), observers could nevertheless accurately report changes in the summary statistics (or "gist") about the very same scene. In the experiment, observers saw two successively presented sets of 16 faces that varied in expression. Four of the faces in the first set changed from one emotional extreme (e.g., happy) to another (e.g., sad) in the second set. Observers performed poorly when asked to locate any of the faces that changed (change blindness). However, when asked about the ensemble (which set was happier, on average), observer performance remained high. Observers were sensitive to the average expression even when they failed to localize any specific object change. That is, even when observers could not locate the very faces driving the change in average expression between the two sets, they nonetheless derived a precise ensemble representation. Thus, the visual system may be optimized to process summary statistics in an efficient manner, allowing it to operate despite minimal conscious access to the information presented.

  8. Atomic Bose-Hubbard Systems with Single-Particle Control

    NASA Astrophysics Data System (ADS)

    Preiss, Philipp Moritz

    Experiments with ultracold atoms in optical lattices provide outstanding opportunities to realize exotic quantum states due to a high degree of tunability and control. In this thesis, I present experiments that extend this control from global parameters to the level of individual particles. Using a quantum gas microscope for ⁸⁷Rb, we have developed a single-site addressing scheme based on digital amplitude holograms. The system self-corrects for aberrations in the imaging setup and creates arbitrary beam profiles. We are thus able to shape optical potentials on the scale of single lattice sites and control the dynamics of individual atoms. We study the role of quantum statistics and interactions in the Bose-Hubbard model on the fundamental level of two particles. Bosonic quantum statistics are apparent in the Hong-Ou-Mandel interference of massive particles, which we observe in tailored double-well potentials. These underlying statistics, in combination with tunable repulsive interactions, dominate the dynamics in single- and two-particle quantum walks. We observe highly coherent position-space Bloch oscillations, bosonic bunching in Hanbury Brown-Twiss interference and the fermionization of strongly interacting bosons. Many-body states of indistinguishable quantum particles are characterized by large-scale spatial entanglement, which is difficult to detect in itinerant systems. Here, we extend the concept of Hong-Ou-Mandel interference from individual particles to many-body states to directly quantify entanglement entropy. We perform collective measurements on two copies of a quantum state and detect entanglement entropy through many-body interference. We measure the second-order Rényi entropy in small Bose-Hubbard systems and detect the buildup of spatial entanglement across the superfluid-insulator transition. Our experiments open new opportunities for the single-particle-resolved preparation and characterization of many-body quantum states.

  9. Method and data evaluation at NASA endocrine laboratory. [Skylab 3 experiments

    NASA Technical Reports Server (NTRS)

    Johnston, D. A.

    1974-01-01

    The biomedical data of the astronauts on Skylab 3 were analyzed to evaluate the univariate statistical methods for comparing endocrine series experiments in relation to other medical experiments. It was found that an information storage and retrieval system was needed to facilitate statistical analyses.

  10. Own and parental war experience as a risk factor for mental health problems among adolescents with an immigrant background: results from a cross sectional study in Oslo, Norway

    PubMed Central

    2006-01-01

    Background An increasing proportion of immigrants to Western countries in the past decade are from war-affected countries. The aim of this study was to estimate the prevalence of war experience among adolescents and their parents and to investigate possible differences in internalizing and externalizing mental health problems between adolescents exposed and unexposed to own and parental war experience. Method The study is based on a cross-sectional population-based survey of all 10th grade pupils in Oslo for two consecutive years. A total of 1,758 adolescents were included, all with both parents born outside of Norway. Internalizing and externalizing mental health problems were measured by the Hopkins Symptom Checklist-10 and subscales of the Strengths and Difficulties Questionnaire, respectively. Own and parental war experience is based on adolescent self-report. Results The proportion of adolescents with own war experience was 14%, with the highest prevalence in immigrants from Eastern Europe and Sub-Saharan Africa. The proportion of parental war experience was 33%, with the highest prevalence among those from Sub-Saharan Africa. Adolescents reporting own war experience had higher scores for both internalizing and externalizing mental health problems compared to immigrants without war experience, but only the difference in externalizing problems reached statistical significance. There was a statistically significant relationship between parental war experience and internalizing mental health problems. The association remained significant after adjustment for parental educational level and adolescents' own war experience. Conclusion War exposure is highly prevalent among immigrants living in Oslo, Norway, both among adolescents themselves and their parents. Among immigrants to Norway, parental war experience appears to be more strongly associated with mental health problems than adolescents' own exposure to war. PMID:17081315

  11. Teaching statistics to nursing students: an expert panel consensus.

    PubMed

    Hayat, Matthew J; Eckardt, Patricia; Higgins, Melinda; Kim, MyoungJin; Schmiege, Sarah J

    2013-06-01

    Statistics education is a necessary element of nursing education, and its inclusion is recommended in the American Association of Colleges of Nursing guidelines for nurse training at all levels. This article presents a cohesive summary of an expert panel discussion, "Teaching Statistics to Nursing Students," held at the 2012 Joint Statistical Meetings. All panelists were statistics experts, had extensive teaching and consulting experience, and held faculty appointments in a U.S.-based nursing college or school. The panel discussed degree-specific curriculum requirements, course content, how to ensure nursing students understand the relevance of statistics, approaches to integrating statistics consulting knowledge, experience with classroom instruction, use of knowledge from the statistics education research field to make improvements in statistics education for nursing students, and classroom pedagogy and instruction on the use of statistical software. Panelists also discussed the need for evidence to make data-informed decisions about statistics education and training for nurses. Copyright 2013, SLACK Incorporated.

  12. A Tutorial on Adaptive Design Optimization

    PubMed Central

    Myung, Jay I.; Cavagnaro, Daniel R.; Pitt, Mark A.

    2013-01-01

    Experimentation is ubiquitous in the field of psychology and fundamental to the advancement of its science, and one of the biggest challenges for researchers is designing experiments that can conclusively discriminate the theoretical hypotheses or models under investigation. The recognition of this challenge has led to the development of sophisticated statistical methods that aid in the design of experiments and that are within the reach of everyday experimental scientists. This tutorial paper introduces the reader to an implementable experimentation methodology, dubbed Adaptive Design Optimization, that can help scientists to conduct “smart” experiments that are maximally informative and highly efficient, which in turn should accelerate scientific discovery in psychology and beyond. PMID:23997275

  13. Statistical Analysis Experiment for Freshman Chemistry Lab.

    ERIC Educational Resources Information Center

    Salzsieder, John C.

    1995-01-01

    Describes a laboratory experiment dissolving zinc from galvanized nails in which data can be gathered very quickly for statistical analysis. The data have sufficient significant figures and the experiment yields a nice distribution of random errors. Freshman students can gain an appreciation of the relationships between random error, number of…

  14. Probability distributions of molecular observables computed from Markov models. II. Uncertainties in observables and their time-evolution

    NASA Astrophysics Data System (ADS)

    Chodera, John D.; Noé, Frank

    2010-09-01

    Discrete-state Markov (or master equation) models provide a useful simplified representation for characterizing the long-time statistical evolution of biomolecules in a manner that allows direct comparison with experiments as well as the elucidation of mechanistic pathways for an inherently stochastic process. A vital part of meaningful comparison with experiment is the characterization of the statistical uncertainty in the predicted experimental measurement, which may take the form of an equilibrium measurement of some spectroscopic signal, the time-evolution of this signal following a perturbation, or the observation of some statistic (such as the correlation function) of the equilibrium dynamics of a single molecule. Without meaningful error bars (which arise from both approximation and statistical error), there is no way to determine whether the deviations between model and experiment are statistically meaningful. Previous work has demonstrated that a Bayesian method that enforces microscopic reversibility can be used to characterize the statistical component of correlated uncertainties in state-to-state transition probabilities (and functions thereof) for a model inferred from molecular simulation data. Here, we extend this approach to include the uncertainty in observables that are functions of molecular conformation (such as surrogate spectroscopic signals) characterizing each state, permitting the full statistical uncertainty in computed spectroscopic experiments to be assessed. We test the approach in a simple model system to demonstrate that the computed uncertainties provide a useful indicator of statistical variation, and then apply it to the computation of the fluorescence autocorrelation function measured for a dye-labeled peptide previously studied by both experiment and simulation.
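    The statistical component of the uncertainty described here can be illustrated with a much simpler sketch than the paper's scheme: sample transition matrices from a Dirichlet posterior over observed transition counts and propagate the spread into a derived observable (here, a stationary probability). All counts below are invented, and the microscopic-reversibility constraint central to the cited method is omitted.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    counts = np.array([[90, 10], [20, 80]])   # observed state-to-state transition counts

    stat_probs = []
    for _ in range(1000):
        # Posterior sample of each row of the transition matrix (uniform Dirichlet prior)
        T = np.vstack([rng.dirichlet(row + 1) for row in counts])
        # Stationary distribution = eigenvector of T' with the largest eigenvalue
        evals, evecs = np.linalg.eig(T.T)
        pi = np.real(evecs[:, np.argmax(np.real(evals))])
        pi = pi / pi.sum()
        stat_probs.append(pi[0])

    mean_pi0 = float(np.mean(stat_probs))
    err_pi0 = float(np.std(stat_probs))       # "error bar" on the derived observable
    ```

    The spread of `stat_probs` is exactly the kind of statistical error bar the paper argues is needed before model-experiment deviations can be called meaningful.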

  15. The Rise and Fall of Pentaquarks in Experiments

    NASA Astrophysics Data System (ADS)

    Schumacher, Reinhard A.

    2006-07-01

    Experimental evidence for and against the existence of pentaquarks has accumulated rapidly in the last three years. If they exist, they would be dramatic examples of hadronic states beyond our well-tested and successful particle models. The positive evidence suggests existence of baryonic objects with widths of at most a few MeV, some displaying exotic quantum numbers, such as baryons with strangeness S = +1. The non-observations of these states have often come from reaction channels very different from the positive evidence channels, making comparisons difficult. The situation has now been largely clarified, however, by high-statistics repetitions of the positive sightings, with the result that none of the positive sightings have been convincingly reproduced. The most recent unconfirmed positive sightings suffer again from low statistics and large backgrounds. It seems that a kind of "bandwagon" effect led to the overly-optimistic interpretation of numerous experiments in the earlier reports of exotic pentaquarks.

  16. Statistical numeracy as a moderator of (pseudo)contingency effects on decision behavior.

    PubMed

    Fleig, Hanna; Meiser, Thorsten; Ettlin, Florence; Rummel, Jan

    2017-03-01

    Pseudocontingencies denote contingency estimates inferred from base rates rather than from cell frequencies. We examined the role of statistical numeracy for effects of such fallible but adaptive inferences on choice behavior. In Experiment 1, we provided information on single observations as well as on base rates and tracked participants' eye movements. In Experiment 2, we manipulated the availability of information on cell frequencies and base rates between conditions. Our results demonstrate that a focus on base rates rather than cell frequencies benefits pseudocontingency effects. Learners who are more proficient in (conditional) probability calculation prefer to rely on cell frequencies in order to judge contingencies, though, as was evident from their gaze behavior. If cell frequencies are available in summarized format, they may infer the true contingency between options and outcomes. Otherwise, however, even highly numerate learners are susceptible to pseudocontingency effects. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Using Bayes' theorem for free energy calculations

    NASA Astrophysics Data System (ADS)

    Rogers, David M.

    Statistical mechanics is fundamentally based on calculating the probabilities of molecular-scale events. Although Bayes' theorem has generally been recognized as providing key guiding principles for the setup and analysis of statistical experiments [83], classical frequentist models still predominate in the world of computational experimentation. As a starting point for widespread application of Bayesian methods in statistical mechanics, we investigate the central quantity of free energies from this perspective. This dissertation thus reviews the basics of Bayes' view of probability theory, and the maximum entropy formulation of statistical mechanics, before providing examples of its application to several advanced research areas. We first apply Bayes' theorem to a multinomial counting problem in order to determine inner shell and hard sphere solvation free energy components of Quasi-Chemical Theory [140]. We proceed to consider the general problem of free energy calculations from samples of interaction energy distributions. From there, we turn to spline-based estimation of the potential of mean force [142], and empirical modeling of observed dynamics using integrator matching. The results of this research are expected to advance the state of the art in coarse-graining methods, as they allow a systematic connection from high-resolution (atomic) to low-resolution (coarse) structure and dynamics. In total, our work on these problems constitutes a critical starting point for further application of Bayes' theorem in all areas of statistical mechanics. It is hoped that the understanding so gained will allow for improvements in comparisons between theory and experiment.

  18. Constraining nuclear photon strength functions by the decay properties of photo-excited states

    NASA Astrophysics Data System (ADS)

    Isaak, J.; Savran, D.; Krtička, M.; Ahmed, M. W.; Beller, J.; Fiori, E.; Glorius, J.; Kelley, J. H.; Löher, B.; Pietralla, N.; Romig, C.; Rusev, G.; Scheck, M.; Schnorrenberger, L.; Silva, J.; Sonnabend, K.; Tonchev, A. P.; Tornow, W.; Weller, H. R.; Zweidinger, M.

    2013-12-01

    A new approach for constraining the low-energy part of the electric dipole Photon Strength Function (E1-PSF) is presented. Experiments at the Darmstadt High-Intensity Photon Setup and the High Intensity γ-Ray Source have been performed to investigate the decay properties of 130Te between 5.50 and 8.15 MeV excitation energy. In particular, the average γ-ray branching ratio to the ground state and the population intensity of low-lying excited states have been studied. A comparison to the statistical model shows that the latter is sensitive to the low-energy behavior of the E1-PSF, while the average ground state branching ratio cannot be described by the statistical model in the energy range between 5.5 and 6.5 MeV.

  19. Teaching Statistical Inference for Causal Effects in Experiments and Observational Studies

    ERIC Educational Resources Information Center

    Rubin, Donald B.

    2004-01-01

    Inference for causal effects is a critical activity in many branches of science and public policy. The field of statistics is the one field most suited to address such problems, whether from designed experiments or observational studies. Consequently, it is arguably essential that departments of statistics teach courses in causal inference to both…

  20. A Laboratory Experiment, Based on the Maillard Reaction, Conducted as a Project in Introductory Statistics

    ERIC Educational Resources Information Center

    Kravchuk, Olena; Elliott, Antony; Bhandari, Bhesh

    2005-01-01

    A simple laboratory experiment, based on the Maillard reaction, served as a project in Introductory Statistics for undergraduates in Food Science and Technology. By using the principles of randomization and replication and reflecting on the sources of variation in the experimental data, students reinforced the statistical concepts and techniques…

  1. Number of Black Children in Extreme Poverty Hits Record High. Analysis Background.

    ERIC Educational Resources Information Center

    Children's Defense Fund, Washington, DC.

    To examine the experiences of black children and poverty, researchers conducted a computer analysis of data from the U.S. Census Bureau's Current Population Survey, the source of official government poverty statistics. The data are through 2001. Results indicated that nearly 1 million black children were living in extreme poverty, with after-tax…

  2. First-Generation and Continuing-Generation College Students: A Comparison of High School and Postsecondary Experiences. Stats in Brief. NCES 2018-009

    ERIC Educational Resources Information Center

    Redford, Jeremy; Hoyer, Kathleen Mulvaney

    2017-01-01

    This Statistics in Brief examines background and educational characteristics, plans for college, postsecondary enrollment, and postsecondary completion patterns of first-generation college students and their peers whose parents have college degrees. The brief also explores how postsecondary plans, attendance, and completion vary between these…

  3. HIV Antibody Testing among Adults in the United States: Data from 1988 NHIS.

    ERIC Educational Resources Information Center

    Hardy, Ann M.; Dawson, Deborah A.

    1990-01-01

    Analyzes statistical data from 1988 National Health Interview Survey to determine adult awareness of and experience with HIV antibody testing. Following findings reported: most knew of test; 17 percent had been tested; Blacks and Hispanics were more likely than Whites to have been voluntarily tested; and high-risk group members were more likely…

  4. Use of iPhone technology in improving acetabular component position in total hip arthroplasty.

    PubMed

    Tay, Xiau Wei; Zhang, Benny Xu; Gayagay, George

    2017-09-01

    Improper acetabular cup positioning is associated with a high risk of complications after total hip arthroplasty. The aim of our study was to objectively compare 3 methods, namely (1) freehand, (2) alignment jig (Sputnik), and (3) iPhone application, to identify an easy, reproducible, and accurate method for improving acetabular cup placement. We designed a simple setup and carried out an experiment (see Method section). Using statistical analysis, the difference in inclination angles between the iPhone application and the freehand method was found to be statistically significant (F[2,51] = 4.17, P = .02) in the "untrained group". No statistically significant difference was detected for the other groups. This suggests a potential role for iPhone applications in helping junior surgeons overcome the steep learning curve.

  5. Gift from statistical learning: Visual statistical learning enhances memory for sequence elements and impairs memory for items that disrupt regularities.

    PubMed

    Otsuka, Sachio; Saiki, Jun

    2016-02-01

    Prior studies have shown that visual statistical learning (VSL) enhances familiarity (a type of memory) with sequences. How do statistical regularities influence the processing of each triplet element and of inserted distractors that disrupt the regularity? Given the increased attention to triplets induced by VSL and the inhibition of unattended triplets, we predicted that VSL would promote memory for each triplet constituent and degrade memory for inserted stimuli. Across the first two experiments, we found that objects from structured sequences were more likely to be remembered than objects from random sequences, and that letters (Experiment 1) or objects (Experiment 2) inserted into structured sequences were less likely to be remembered than those inserted into random sequences. In the subsequent two experiments, we examined an alternative account for our results, whereby the difference in memory for inserted items between structured and random conditions is due to individuation of items within random sequences. Our findings replicated even when control letters (Experiment 3A) or objects (Experiment 3B) were presented before or after, rather than inserted into, random sequences. Our findings suggest that statistical learning enhances memory for each item in a regular set and impairs memory for items that disrupt the regularity. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Optimization of solid content, carbon/nitrogen ratio and food/inoculum ratio for biogas production from food waste.

    PubMed

    Dadaser-Celik, Filiz; Azgin, Sukru Taner; Yildiz, Yalcin Sevki

    2016-12-01

    Biogas production from food waste has been used as an efficient waste treatment option for years. The methane yields from decomposition of waste are, however, highly variable under different operating conditions. In this study, a statistical experimental design method (Taguchi OA9) was implemented to investigate the effects of simultaneous variations of three parameters on methane production. The parameters investigated were solid content (SC), carbon/nitrogen ratio (C/N) and food/inoculum ratio (F/I). Two sets of experiments were conducted with nine anaerobic reactors operating under different conditions. Optimum conditions were determined using statistical analysis, such as analysis of variance (ANOVA). A confirmation experiment was carried out at optimum conditions to investigate the validity of the results. Statistical analysis showed that SC was the most important parameter for methane production with a 45% contribution, followed by the F/I ratio with a 35% contribution. The optimum methane yield of 151 l kg⁻¹ volatile solids (VS) was achieved after 24 days of digestion when SC was 4%, C/N was 28 and F/I was 0.3. The confirmation experiment provided a methane yield of 167 l kg⁻¹ VS after 24 days. The analysis showed that biogas production from food waste may be increased by optimization of operating conditions. © The Author(s) 2016.
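    The percent-contribution figures quoted in this abstract (45% for SC, 35% for F/I) come from apportioning ANOVA sums of squares among factors. A toy sketch of that calculation, on an invented two-factor, two-level design rather than the study's three-factor Taguchi OA9 array (all yield values below are made up):

    ```python
    # Hypothetical methane yields for a 2-factor, 2-level toy design
    yields = {("SC=4%", "F/I=0.3"): 150.0, ("SC=4%", "F/I=0.5"): 130.0,
              ("SC=8%", "F/I=0.3"): 110.0, ("SC=8%", "F/I=0.5"): 95.0}

    grand_mean = sum(yields.values()) / len(yields)

    def factor_ss(level_index):
        """Sum of squares for one factor: runs-per-level times the squared
        deviation of each level mean from the grand mean, summed over levels."""
        levels = {}
        for combo, y in yields.items():
            levels.setdefault(combo[level_index], []).append(y)
        return sum(len(v) * (sum(v) / len(v) - grand_mean) ** 2
                   for v in levels.values())

    ss_sc, ss_fi = factor_ss(0), factor_ss(1)
    total_ss = sum((y - grand_mean) ** 2 for y in yields.values())
    contribution_sc = ss_sc / total_ss   # fraction of variation attributed to SC
    ```

    In the paper's terms, `contribution_sc` expressed as a percentage plays the role of the 45% figure for solid content; whatever `total_ss` is not explained by the factors is residual (interaction and error).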

  7. Estimating pseudocounts and fold changes for digital expression measurements.

    PubMed

    Erhard, Florian

    2018-06-19

    Fold changes from count-based high-throughput experiments such as RNA-seq suffer from a zero-frequency problem. To circumvent division by zero, so-called pseudocounts are added to make all observed counts strictly positive. The magnitude of pseudocounts for digital expression measurements, and the stage of the analysis at which they are introduced, has remained an arbitrary choice. Moreover, in the strict sense, fold changes are not quantities that can be computed directly. Instead, due to the stochasticity involved in the experiments, they must be estimated by statistical inference. Here, we build on a statistical framework for fold changes in which pseudocounts correspond to the parameters of the prior distribution used for Bayesian inference of the fold change. We show that arbitrary and widely used choices for applying pseudocounts can lead to biased results. As a statistically rigorous alternative, we propose and test an empirical Bayes procedure to choose appropriate pseudocounts. Moreover, we introduce the novel estimator ΨLFC for fold changes, showing favorable properties with small counts and smaller deviations from the truth in simulations and real data compared to existing methods. Our results have direct implications for entities with few reads in sequencing experiments, and indirectly also affect results for entities with many reads. ΨLFC is available as an R package under https://github.com/erhard-lab/lfc (Apache 2.0 license); R scripts to generate all figures are available at zenodo (doi:10.5281/zenodo.1163029).
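    The zero-frequency problem and the role of the pseudocount can be made concrete with a minimal sketch. The pseudocount value 0.5 below is a common but ad hoc choice, exactly the kind of arbitrary decision the paper's empirical Bayes procedure is meant to replace; the ΨLFC estimator itself is not implemented here.

    ```python
    import math

    def log2_fold_change(count_a, count_b, pseudocount=0.5):
        """log2 fold change of condition A over B, with a pseudocount added
        to both counts so the ratio is defined even for zero counts."""
        return math.log2((count_a + pseudocount) / (count_b + pseudocount))

    lfc_zero = log2_fold_change(0, 8)      # defined even though count_a == 0
    lfc_small = log2_fold_change(2, 8)     # small counts: noticeably shrunk by c
    lfc_big = log2_fold_change(200, 800)   # large counts: c barely matters
    ```

    Note how the same underlying 1:4 ratio gives different estimates at small and large counts: the pseudocount acts like a prior that pulls small-count fold changes toward zero, which is the Bayesian interpretation the paper builds on.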

  8. Dealing with the Conflicting Results of Psycholinguistic Experiments: How to Resolve Them with the Help of Statistical Meta-analysis.

    PubMed

    Rákosi, Csilla

    2018-01-22

    This paper proposes the use of the tools of statistical meta-analysis as a method of conflict resolution with respect to experiments in cognitive linguistics. With the help of statistical meta-analysis, the effect size of similar experiments can be compared, a well-founded and robust synthesis of the experimental data can be achieved, and possible causes of any divergence(s) in the outcomes can be revealed. This application of statistical meta-analysis offers a novel method of how diverging evidence can be dealt with. The workability of this idea is exemplified by a case study dealing with a series of experiments conducted as non-exact replications of Thibodeau and Boroditsky (PLoS ONE 6(2):e16782, 2011. https://doi.org/10.1371/journal.pone.0016782 ).
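    The core step of such a synthesis, pooling per-study effect sizes by inverse-variance weighting under a fixed-effect model, can be sketched as follows. The numbers are invented and are not from the replications of Thibodeau and Boroditsky discussed in the paper.

    ```python
    effects = [0.42, 0.10, 0.31]      # per-study effect sizes (e.g. Cohen's d)
    variances = [0.04, 0.02, 0.05]    # per-study sampling variances

    # Fixed-effect meta-analysis: weight each study by 1/variance
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_var = 1.0 / sum(weights)   # variance of the pooled estimate
    ```

    The pooled estimate is more precise than any single study (its variance is smaller than each study's), which is why meta-analysis can resolve conflicts that individual underpowered experiments cannot.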

  9. First experiences of high-fidelity simulation training in junior nursing students in Korea.

    PubMed

    Lee, Suk Jeong; Kim, Sang Suk; Park, Young-Mi

    2015-07-01

    This study was conducted to explore first experiences of high-fidelity simulation training in Korean nursing students, in order to develop and establish more effective guidelines for future simulation training in Korea. Thirty-three junior nursing students participated in high-fidelity simulation training for the first time. Using both qualitative and quantitative methods, data were collected from reflective journals and questionnaires of simulation effectiveness after simulation training. Descriptive statistics were used to analyze simulation effectiveness and content analysis was performed with the reflective journal data. Five dimensions and 31 domains, both positive and negative experiences, emerged from qualitative analysis: (i) machine-human interaction in a safe environment; (ii) perceived learning capability; (iii) observational learning; (iv) reconciling practice with theory; and (v) follow-up debriefing effect. More than 70% of students scored high on increased ability to identify changes in the patient's condition, critical thinking, decision-making, effectiveness of peer observation, and debriefing in effectiveness of simulation. This study reported both positive and negative experiences of simulation. The results of this study could be used to set the level of task difficulty in simulation. Future simulation programs can be designed by reinforcing the positive experiences and modifying the negative results. © 2014 The Authors. Japan Journal of Nursing Science © 2014 Japan Academy of Nursing Science.

  10. Abstraction and generalization in statistical learning: implications for the relationship between semantic types and episodic tokens

    PubMed Central

    2017-01-01

    Statistical approaches to emergent knowledge have tended to focus on the process by which experience of individual episodes accumulates into generalizable experience across episodes. However, there is a seemingly opposite, but equally critical, process that such experience affords: the process by which, from a space of types (e.g. onions—a semantic class that develops through exposure to individual episodes involving individual onions), we can perceive or create, on-the-fly, a specific token (a specific onion, perhaps one that is chopped) in the absence of any prior perceptual experience with that specific token. This article reviews a selection of statistical learning studies that lead to the speculation that this process—the generation, on the basis of semantic memory, of a novel episodic representation—is itself an instance of a statistical, in fact associative, process. The article concludes that the same processes that enable statistical abstraction across individual episodes to form semantic memories also enable the generation, from those semantic memories, of representations that correspond to individual tokens, and of novel episodic facts about those tokens. Statistical learning is a window onto these deeper processes that underpin cognition. This article is part of the themed issue ‘New frontiers for statistical learning in the cognitive sciences’. PMID:27872378

  11. Abstraction and generalization in statistical learning: implications for the relationship between semantic types and episodic tokens.

    PubMed

    Altmann, Gerry T M

    2017-01-05

    Statistical approaches to emergent knowledge have tended to focus on the process by which experience of individual episodes accumulates into generalizable experience across episodes. However, there is a seemingly opposite, but equally critical, process that such experience affords: the process by which, from a space of types (e.g. onions-a semantic class that develops through exposure to individual episodes involving individual onions), we can perceive or create, on-the-fly, a specific token (a specific onion, perhaps one that is chopped) in the absence of any prior perceptual experience with that specific token. This article reviews a selection of statistical learning studies that lead to the speculation that this process-the generation, on the basis of semantic memory, of a novel episodic representation-is itself an instance of a statistical, in fact associative, process. The article concludes that the same processes that enable statistical abstraction across individual episodes to form semantic memories also enable the generation, from those semantic memories, of representations that correspond to individual tokens, and of novel episodic facts about those tokens. Statistical learning is a window onto these deeper processes that underpin cognition.This article is part of the themed issue 'New frontiers for statistical learning in the cognitive sciences'. © 2016 The Author(s).

  12. Sequential Tests of Multiple Hypotheses Controlling Type I and II Familywise Error Rates

    PubMed Central

    Bartroff, Jay; Song, Jinlin

    2014-01-01

    This paper addresses the following general scenario: A scientist wishes to perform a battery of experiments, each generating a sequential stream of data, to investigate some phenomenon. The scientist would like to control the overall error rate in order to draw statistically-valid conclusions from each experiment, while being as efficient as possible. The between-stream data may differ in distribution and dimension but also may be highly correlated, even duplicated exactly in some cases. Treating each experiment as a hypothesis test and adopting the familywise error rate (FWER) metric, we give a procedure that sequentially tests each hypothesis while controlling both the type I and II FWERs regardless of the between-stream correlation, and only requires arbitrary sequential test statistics that control the error rates for a given stream in isolation. The proposed procedure, which we call the sequential Holm procedure because of its inspiration from Holm’s (1979) seminal fixed-sample procedure, shows simultaneous savings in expected sample size and less conservative error control relative to fixed sample, sequential Bonferroni, and other recently proposed sequential procedures in a simulation study. PMID:25092948
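The fixed-sample Holm (1979) step-down procedure that inspired the proposed sequential test can be sketched in a few lines. This is a minimal illustration of the classical fixed-sample procedure only, not the authors' sequential extension:

```python
def holm_rejections(p_values, alpha=0.05):
    """Holm (1979) step-down test of m hypotheses controlling the FWER.

    Sort p-values ascending and compare the i-th smallest (0-indexed)
    against alpha / (m - i); stop at the first non-rejection, since all
    larger p-values must then fail as well.
    """
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    rejected = [False] * m
    for rank, idx in enumerate(order):
        if p_values[idx] <= alpha / (m - rank):
            rejected[idx] = True
        else:
            break  # once one test fails, all larger p-values fail too
    return rejected

# Four streams: only the two strong signals survive the step-down thresholds.
print(holm_rejections([0.001, 0.04, 0.03, 0.005]))  # [True, False, False, True]
```

Note how the threshold tightens from alpha/m for the smallest p-value to alpha for the largest, which is what makes Holm uniformly more powerful than Bonferroni while still controlling the type I FWER.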

  13. Experimental Design and Power Calculation for RNA-seq Experiments.

    PubMed

    Wu, Zhijin; Wu, Hao

    2016-01-01

    Power calculation is a critical component of RNA-seq experimental design. The flexibility of RNA-seq experiments and the wide dynamic range of transcription they measure make the technology attractive for whole-transcriptome analysis. These features, together with the high dimensionality of RNA-seq data, add complexity to experimental design and make a purely analytical power calculation unrealistic. In this chapter we review the major factors that influence the statistical power to detect differential expression, and give examples of power assessment using the R package PROPER.
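PROPER itself is an R package; as a language-neutral illustration of the underlying idea (estimating power by simulating overdispersed counts rather than relying on a closed-form formula), here is a hedged Python sketch. The parameter names and the negative-binomial-via-Gamma-Poisson construction are illustrative choices, not PROPER's actual model:

```python
import math
import random

def neg_binom(mu, dispersion, rng):
    """Negative binomial count via the Gamma-Poisson mixture, using
    Knuth's algorithm for the Poisson draw (fine for moderate means)."""
    lam = rng.gammavariate(1.0 / dispersion, mu * dispersion)
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def simulated_power(mu, fold_change, dispersion, n_per_group,
                    n_sim=200, seed=1):
    """Fraction of simulated two-group experiments in which a z-test on
    log counts detects the fold change at the two-sided 5% level."""
    rng = random.Random(seed)
    crit = 1.959964  # two-sided 5% normal critical value
    hits = 0
    for _ in range(n_sim):
        a = [math.log1p(neg_binom(mu, dispersion, rng))
             for _ in range(n_per_group)]
        b = [math.log1p(neg_binom(mu * fold_change, dispersion, rng))
             for _ in range(n_per_group)]
        ma, mb = sum(a) / len(a), sum(b) / len(b)
        va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
        vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
        se = math.sqrt(va / len(a) + vb / len(b))
        if se > 0 and abs(mb - ma) / se > crit:
            hits += 1
    return hits / n_sim
```

Under these assumptions, a two-fold change at a mean of 100 with five replicates per group is detected in most simulations, while a null fold change stays near the nominal false-positive rate; a real design tool would additionally model library size, per-gene dispersion, and multiple testing.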

  14. Results from the HARP Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Catanesi, M. G.

    2008-02-21

    Hadron production is a key ingredient in many aspects of ν physics. Precise prediction of atmospheric ν fluxes, characterization of accelerator ν beams, and quantification of π production and capture for ν-factory designs would all profit from hadron production measurements. HARP at the CERN PS was the first hadron production experiment designed specifically to meet all these requirements. It combines large, full-phase-space acceptance with low systematic errors and high statistics. HARP was operated in the range from 3 GeV to 15 GeV. We briefly describe here the most recent results.

  15. The LDEF ultra heavy cosmic ray experiment

    NASA Technical Reports Server (NTRS)

    O'Sullivan, D.; Thompson, A.; Bosch, J.; Keegan, R.; Wenzel, K.-P.; Smit, A.; Domingo, C.

    1992-01-01

    The LDEF Ultra Heavy Cosmic Ray Experiment (UHCRE) used 16 side viewing LDEF trays giving a total geometry factor for high energy cosmic rays of 30 sq m sr. The total exposure factor was 170 sq m sr y. The experiment is based on a modular array of 192 solid state nuclear track detector stacks, mounted in sets of four in 48 pressure vessels. The extended duration of the LDEF mission has resulted in a greatly enhanced potential scientific yield from the UHCRE. Initial scanning results indicate that at least 1800 cosmic ray nuclei with Z greater than 65 were collected, including the world's first statistically significant sample of actinides. Post flight work to date and the current status of the experiment are reviewed.

  16. Applying Statistics in the Undergraduate Chemistry Laboratory: Experiments with Food Dyes.

    ERIC Educational Resources Information Center

    Thomasson, Kathryn; Lofthus-Merschman, Sheila; Humbert, Michelle; Kulevsky, Norman

    1998-01-01

    Describes several experiments to teach different aspects of the statistical analysis of data using household substances and a simple analysis technique. Each experiment can be performed in three hours. Students learn about treatment of spurious data, application of a pooled variance, linear least-squares fitting, and simultaneous analysis of dyes…

  17. A Laboratory Experiment on the Statistical Theory of Nuclear Reactions

    ERIC Educational Resources Information Center

    Loveland, Walter

    1971-01-01

    Describes an undergraduate laboratory experiment on the statistical theory of nuclear reactions. The experiment involves measuring the relative cross sections for formation of a nucleus in its metastable excited state and its ground state by applying gamma-ray spectroscopy to an irradiated sample. Involves 3-4 hours of laboratory time plus…

  18. Statistical issues in quality control of proteomic analyses: good experimental design and planning.

    PubMed

    Cairns, David A

    2011-03-01

    Quality control is becoming increasingly important in proteomic investigations as experiments become more multivariate and quantitative. Quality control applies to all stages of an investigation and statistics can play a key role. In this review, the role of statistical ideas in the design and planning of an investigation is described. This involves the design of unbiased experiments using key concepts from statistical experimental design, the understanding of the biological and analytical variation in a system using variance components analysis and the determination of a required sample size to perform a statistically powerful investigation. These concepts are described through simple examples and an example data set from a 2-D DIGE pilot experiment. Each of these concepts can prove useful in producing better and more reproducible data. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
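The sample-size reasoning the review describes can be illustrated with the standard two-group formula (a generic sketch, not the specific calculation in the 2-D DIGE example): to detect a mean difference delta with standard deviation sigma at two-sided level alpha and power 1-beta, each group needs roughly n = 2*((z_alpha + z_beta) * sigma / delta)**2 subjects, where sigma would come from a variance components analysis of the biological and analytical variation:

```python
import math
from statistics import NormalDist

def sample_size_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Approximate n per group for a two-sample comparison of means."""
    nd = NormalDist()
    z_alpha = nd.inv_cdf(1 - alpha / 2)  # two-sided test
    z_beta = nd.inv_cdf(power)
    return math.ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)

# A one-standard-deviation effect needs about 16 samples per group;
# halving the effect size roughly quadruples the requirement.
print(sample_size_per_group(1.0, 1.0))  # 16
print(sample_size_per_group(0.5, 1.0))  # 63
```

The quadratic dependence on sigma/delta is why reducing analytical variance (e.g. by averaging technical replicates) pays off directly in smaller required cohorts.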

  19. Calibrating a numerical model's morphology using high-resolution spatial and temporal datasets from multithread channel flume experiments.

    NASA Astrophysics Data System (ADS)

    Javernick, L.; Bertoldi, W.; Redolfi, M.

    2017-12-01

    Accessing or acquiring high-quality, low-cost topographic data has never been easier, thanks to recent developments in the photogrammetric technique of Structure-from-Motion (SfM). Researchers can acquire the necessary SfM imagery with various platforms, capturing millimetre resolution and accuracy, or covering large-scale areas with the help of unmanned platforms. Such datasets, in combination with numerical modelling, have opened up new opportunities to study the physical and ecological relationships of river environments. While a numerical model's overall predictive accuracy is most influenced by topography, proper model calibration requires hydraulic and morphological data; however, rich hydraulic and morphological datasets remain scarce. This lack of field and laboratory data has limited model advancement through the inability to properly calibrate, assess the sensitivity of, and validate model performance. However, new time-lapse imagery techniques have shown success in identifying instantaneous sediment transport in flume experiments and in improving hydraulic model calibration. With new capabilities to capture high-resolution spatial and temporal datasets of flume experiments, there is a need to further assess model performance. To address this demand, this research used braided-river flume experiments and captured time-lapse observations of sediment transport and repeat SfM elevation surveys to provide unprecedented spatial and temporal datasets. The numerical model Delft3D was calibrated using newly created metrics that quantified observed and modelled activation, deactivation, and bank erosion rates. This increased temporal data, combining high-resolution time series with long-term coverage, supported significantly improved calibration routines that refined the calibration parameterization. Model results show a trade-off between achieving quantitative statistical and qualitative morphological representations. Specifically, simulations tuned for statistical agreement struggled to reproduce braided planforms (evolving toward meandering), while parameterizations that preserved braiding produced exaggerated activation and bank erosion rates. Marie Skłodowska-Curie Individual Fellowship: River-HMV, 656917

  20. Assessing Fun Items' Effectiveness in Increasing Learning of College Introductory Statistics Students: Results of a Randomized Experiment

    ERIC Educational Resources Information Center

    Lesser, Lawrence M.; Pearl, Dennis K.; Weber, John J., III

    2016-01-01

    There has been a recent emergence of scholarship on the use of fun in the college statistics classroom, with at least 20 modalities identified. While there have been randomized experiments that suggest that fun can enhance student achievement or attitudes in statistics, these studies have generally been limited to one particular fun modality or…

  1. Stochastic space interval as a link between quantum randomness and macroscopic randomness?

    NASA Astrophysics Data System (ADS)

    Haug, Espen Gaarder; Hoff, Harald

    2018-03-01

    For many stochastic phenomena, we observe statistical distributions that have fat tails and high peaks compared to the Gaussian distribution. In this paper, we explain how observable statistical distributions in the macroscopic world could be related to randomness in the subatomic world. We show that fat-tailed (leptokurtic) phenomena in our everyday macroscopic world are ultimately rooted in Gaussian, or very nearly Gaussian, distributed subatomic particle randomness, but they are not, in a strict sense, Gaussian distributions. By running a truly random experiment over a three-and-a-half-year period, we observed a type of random behavior in trillions of photons. Combining our results with simple logic, we find that fat-tailed and high-peaked statistical distributions are exactly what we would expect to observe if the subatomic world is quantized and not continuously divisible. We extend our analysis to the fact that one typically observes fat tails and high peaks relative to the Gaussian distribution in stock and commodity prices and in many aspects of the natural world; these are all observable and documentable macro phenomena that strongly suggest that the ultimate building blocks of nature are discrete (i.e. they appear in quanta).

  2. Where statistics and the molecular biology of microarray experiments meet.

    PubMed

    Kelmansky, Diana M

    2013-01-01

    This review chapter presents a statistical point of view on microarray experiments, with the purpose of understanding the apparent contradictions that often appear in relation to their results. We give a brief introduction to molecular biology for nonspecialists. We describe microarray experiments from their construction and the biological principles the experiments rely on, through data acquisition and analysis. The roles of epidemiological approaches and sample size considerations are also discussed.

  3. Exploratory Visual Analysis of Statistical Results from Microarray Experiments Comparing High and Low Grade Glioma

    PubMed Central

    Reif, David M.; Israel, Mark A.; Moore, Jason H.

    2007-01-01

    The biological interpretation of gene expression microarray results is a daunting challenge. For complex diseases such as cancer, wherein the body of published research is extensive, the incorporation of expert knowledge provides a useful analytical framework. We have previously developed the Exploratory Visual Analysis (EVA) software for exploring data analysis results in the context of annotation information about each gene, as well as biologically relevant groups of genes. We present EVA as a flexible combination of statistics and biological annotation that provides a straightforward visual interface for the interpretation of microarray analyses of gene expression in the most commonly occurring class of brain tumors, glioma. We demonstrate the utility of EVA for the biological interpretation of statistical results by analyzing publicly available gene expression profiles of two important glial tumors. The results of a statistical comparison between 21 malignant, high-grade glioblastoma multiforme (GBM) tumors and 19 indolent, low-grade pilocytic astrocytomas were analyzed using EVA. By using EVA to examine the results of a relatively simple statistical analysis, we were able to identify tumor class-specific gene expression patterns having both statistical and biological significance. Our interactive analysis highlighted the potential importance of genes involved in cell cycle progression, proliferation, signaling, adhesion, migration, motility, and structure, as well as candidate gene loci on a region of Chromosome 7 that has been implicated in glioma. Because EVA does not require statistical or computational expertise and has the flexibility to accommodate any type of statistical analysis, we anticipate EVA will prove a useful addition to the repertoire of computational methods used for microarray data analysis. EVA is available at no charge to academic users and can be found at http://www.epistasis.org. PMID:19390666

  4. Self-similarity in high Atwood number Rayleigh-Taylor experiments

    NASA Astrophysics Data System (ADS)

    Mikhaeil, Mark; Suchandra, Prasoon; Pathikonda, Gokul; Ranjan, Devesh

    2017-11-01

    Self-similarity is a critical concept in turbulent and mixing flows. In the Rayleigh-Taylor instability, theory and simulations have shown that the flow exhibits properties of self-similarity as the mixing Reynolds number exceeds 20000 and the flow enters the turbulent regime. Here, we present results from the first large Atwood number (0.7) Rayleigh-Taylor experimental campaign for mixing Reynolds number beyond 20000 in an effort to characterize the self-similar nature of the instability. Experiments are performed in a statistically steady gas tunnel facility, allowing for the evaluation of turbulence statistics. A visualization diagnostic is used to study the evolution of the mixing width as the instability grows. This allows for computation of the instability growth rate. For the first time in such a facility, stereoscopic particle image velocimetry is used to resolve three-component velocity information in a plane. Velocity means, fluctuations, and correlations are considered as well as their appropriate scaling. Probability density functions of velocity fields, energy spectra, and higher-order statistics are also presented. The energy budget of the flow is described, including the ratio of the kinetic energy to the released potential energy. This work was supported by the DOE-NNSA SSAA Grant DE-NA0002922.
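Self-similar Rayleigh-Taylor growth is commonly characterized by the scaling h(t) = α·A·g·t² for the mixing width, so a growth-rate coefficient α can be recovered from measured widths by a one-parameter least-squares fit. A minimal sketch under that standard scaling assumption (illustrative only; the campaign's actual diagnostics and fitting are more involved):

```python
def growth_rate_alpha(times, widths, atwood, g=9.81):
    """Least-squares fit of h = alpha * A * g * t**2 through the origin:
    alpha = sum(h_i * x_i) / sum(x_i**2) with x_i = A * g * t_i**2."""
    xs = [atwood * g * t * t for t in times]
    return (sum(h * x for h, x in zip(widths, xs))
            / sum(x * x for x in xs))

# Synthetic check: widths generated with alpha = 0.06 at A = 0.7
# are recovered by the fit.
times = [0.1, 0.2, 0.3, 0.4, 0.5]
widths = [0.06 * 0.7 * 9.81 * t * t for t in times]
print(round(growth_rate_alpha(times, widths, 0.7), 6))  # 0.06
```

Fitting through the origin in the transformed variable x = A·g·t² keeps the estimate linear in the data, which is convenient when propagating measurement uncertainty on the widths.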

  5. Practical Advice on Calculating Confidence Intervals for Radioprotection Effects and Reducing Animal Numbers in Radiation Countermeasure Experiments

    PubMed Central

    Landes, Reid D.; Lensing, Shelly Y.; Kodell, Ralph L.; Hauer-Jensen, Martin

    2014-01-01

    The dose of a substance that causes death in P% of a population is called an LDP, where LD stands for lethal dose. In radiation research, a common LDP of interest is the radiation dose that kills 50% of the population by a specified time, i.e., lethal dose 50 or LD50. When comparing LD50 between two populations, relative potency is the parameter of interest; in radiation research, this is commonly known as the dose reduction factor (DRF). Unfortunately, statistical inference on the dose reduction factor is seldom reported. We illustrate how to calculate confidence intervals for the dose reduction factor, which may then be used for statistical inference. Further, most dose reduction factor experiments use hundreds, rather than tens, of animals. Through better dosing strategies and the use of a recently available sample size formula, we also show how animal numbers may be reduced while maintaining high statistical power. The illustrations center on realistic examples comparing LD50 values between a radiation countermeasure group and a radiation-only control. We also provide easy-to-use spreadsheets for sample size calculations and confidence interval calculations, as well as SAS® and R code for the latter. PMID:24164553
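The authors supply spreadsheets and SAS/R code for their calculations; as a rough stand-in that conveys the idea, a delta-method interval on the log scale works when the two LD50 estimates can be treated as independent with known standard errors (an assumption made here purely for illustration, with hypothetical numbers):

```python
import math
from statistics import NormalDist

def drf_confidence_interval(ld50_treated, se_treated,
                            ld50_control, se_control, level=0.95):
    """Approximate CI for DRF = LD50_treated / LD50_control via the
    delta method on log(DRF), assuming independent estimates."""
    z = NormalDist().inv_cdf(0.5 + level / 2)
    log_ratio = math.log(ld50_treated / ld50_control)
    se_log = math.sqrt((se_treated / ld50_treated) ** 2
                       + (se_control / ld50_control) ** 2)
    return (math.exp(log_ratio - z * se_log),
            math.exp(log_ratio + z * se_log))

# Hypothetical countermeasure: LD50 of 9.0 Gy vs 7.5 Gy in controls.
low, high = drf_confidence_interval(9.0, 0.3, 7.5, 0.25)
print(low > 1.0)  # True: the interval excludes 1, so the DRF is significant
```

Working on the log scale keeps the interval positive and roughly symmetric in relative terms; Fieller-type intervals are the more careful alternative when the denominator estimate is noisy.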

  6. Velocity distributions of granular gases with drag and with long-range interactions.

    PubMed

    Kohlstedt, K; Snezhko, A; Sapozhnikov, M V; Aranson, I S; Olafsen, J S; Ben-Naim, E

    2005-08-05

    We study the velocity statistics of electrostatically driven granular gases. For two different experiments, (i) nonmagnetic particles in a viscous fluid and (ii) magnetic particles in air, the velocity distribution is non-Maxwellian, and its high-energy tail is exponential, P(v) ∼ exp(−|v|). This behavior is consistent with the kinetic theory of driven dissipative particles. For particles immersed in a fluid, viscous damping is responsible for the exponential tail, while for magnetic particles, long-range interactions cause the exponential tail. We conclude that the velocity statistics of dissipative gases are sensitive to the fluid environment and to the form of the particle interaction.
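The qualitative difference between a Maxwellian (Gaussian) and an exponential-tailed (Laplace-like) velocity distribution shows up directly in the excess kurtosis, which is 0 for a Gaussian and 3 for a double exponential. A quick numerical check with synthetic samples (illustrative sampling, not the experimental data):

```python
import random

def excess_kurtosis(xs):
    """Fourth standardized moment minus 3 (0 for a Gaussian)."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / (m2 * m2) - 3.0

rng = random.Random(0)
n = 100_000
gaussian = [rng.gauss(0.0, 1.0) for _ in range(n)]
# Double exponential: exponential magnitude with a random sign.
laplace = [rng.expovariate(1.0) * (1 if rng.random() < 0.5 else -1)
           for _ in range(n)]
print(abs(excess_kurtosis(gaussian)) < 0.5)  # True: near 0
print(excess_kurtosis(laplace) > 2.0)        # True: heavy tails
```

In practice one would estimate the tail exponent from the measured P(v) directly, but the kurtosis gives a one-number diagnostic of the departure from Maxwellian statistics.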

  7. Experience and Explanation: Using Videogames to Prepare Students for Formal Instruction in Statistics

    ERIC Educational Resources Information Center

    Arena, Dylan A.; Schwartz, Daniel L.

    2014-01-01

    Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics,…

  8. The Dependence of Strength in Plastics upon Polymer Chain Length and Chain Orientation: An Experiment Emphasizing the Statistical Handling and Evaluation of Data.

    ERIC Educational Resources Information Center

    Spencer, R. Donald

    1984-01-01

    Describes an experiment (using plastic bags) designed to give students practical understanding on using statistics to evaluate data and how statistical treatment of experimental results can enhance their value in solving scientific problems. Students also gain insight into the orientation and structure of polymers by examining the plastic bags.…

  9. Selective activation around the left occipito-temporal sulcus for words relative to pictures: individual variability or false positives?

    PubMed

    Wright, Nicholas D; Mechelli, Andrea; Noppeney, Uta; Veltman, Dick J; Rombouts, Serge A R B; Glensman, Janice; Haynes, John-Dylan; Price, Cathy J

    2008-08-01

    We used high-resolution fMRI to investigate claims that learning to read results in greater left occipito-temporal (OT) activation for written words relative to pictures of objects. In the first experiment, 9/16 subjects performing a one-back task showed activation in ≥1 left OT voxel for words relative to pictures (P < 0.05 uncorrected). In a second experiment, another 9/15 subjects performing a semantic decision task activated ≥1 left OT voxel for words relative to pictures. However, at this low statistical threshold false positives need to be excluded. The semantic decision paradigm was therefore repeated, within subject, in two different scanners (1.5 and 3 T). Both scanners consistently localised left OT activation for words relative to fixation and pictures relative to words, but there were no consistent effects for words relative to pictures. Finally, in a third experiment, we minimised the voxel size (1.5 × 1.5 × 1.5 mm³) and demonstrated a striking concordance between the voxels activated for words and pictures, irrespective of task (naming vs. one-back) or script (English vs. Hebrew). In summary, although we detected differential activation for words relative to pictures, these effects: (i) do not withstand statistical rigour; (ii) do not replicate within or between subjects; and (iii) are observed in voxels that also respond to pictures of objects. Our findings have implications for the role of left OT activation during reading. More generally, they show that studies using low statistical thresholds in single subject analyses should correct the statistical threshold for the number of comparisons made or replicate effects within subject. © 2007 Wiley-Liss, Inc.

  10. Selective Activation Around the Left Occipito-Temporal Sulcus for Words Relative to Pictures: Individual Variability or False Positives?

    PubMed Central

    Wright, Nicholas D; Mechelli, Andrea; Noppeney, Uta; Veltman, Dick J; Rombouts, Serge ARB; Glensman, Janice; Haynes, John-Dylan; Price, Cathy J

    2008-01-01

    We used high-resolution fMRI to investigate claims that learning to read results in greater left occipito-temporal (OT) activation for written words relative to pictures of objects. In the first experiment, 9/16 subjects performing a one-back task showed activation in ≥1 left OT voxel for words relative to pictures (P < 0.05 uncorrected). In a second experiment, another 9/15 subjects performing a semantic decision task activated ≥1 left OT voxel for words relative to pictures. However, at this low statistical threshold false positives need to be excluded. The semantic decision paradigm was therefore repeated, within subject, in two different scanners (1.5 and 3 T). Both scanners consistently localised left OT activation for words relative to fixation and pictures relative to words, but there were no consistent effects for words relative to pictures. Finally, in a third experiment, we minimised the voxel size (1.5 × 1.5 × 1.5 mm3) and demonstrated a striking concordance between the voxels activated for words and pictures, irrespective of task (naming vs. one-back) or script (English vs. Hebrew). In summary, although we detected differential activation for words relative to pictures, these effects: (i) do not withstand statistical rigour; (ii) do not replicate within or between subjects; and (iii) are observed in voxels that also respond to pictures of objects. Our findings have implications for the role of left OT activation during reading. More generally, they show that studies using low statistical thresholds in single subject analyses should correct the statistical threshold for the number of comparisons made or replicate effects within subject. Hum Brain Mapp 2008. © 2007 Wiley-Liss, Inc. PMID:17712786

  11. Peer Assessment Enhances Student Learning: The Results of a Matched Randomized Crossover Experiment in a College Statistics Class.

    PubMed

    Sun, Dennis L; Harris, Naftali; Walther, Guenther; Baiocchi, Michael

    2015-01-01

    Feedback has a powerful influence on learning, but it is also expensive to provide. In large classes it may even be impossible for instructors to provide individualized feedback. Peer assessment is one way to provide personalized feedback that scales to large classes. Besides these obvious logistical benefits, it has been conjectured that students also learn from the practice of peer assessment. However, this has never been conclusively demonstrated. Using an online educational platform that we developed, we conducted an in-class matched-set, randomized crossover experiment with high power to detect small effects. We establish that peer assessment causes a small but significant gain in student achievement. Our study also demonstrates the potential of web-based platforms to facilitate the design of high-quality experiments to identify small effects that were previously not detectable.
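A matched randomized crossover design is naturally analyzed on within-pair differences; a sign-flip permutation test is one simple, assumption-light way to do that. This is an illustrative sketch of the general technique, not the authors' actual analysis, and the example numbers are hypothetical:

```python
import random

def paired_permutation_pvalue(diffs, n_perm=5000, seed=0):
    """Sign-flip permutation test: under H0 each matched difference is
    equally likely to be positive or negative, so compare the observed
    |sum of differences| to its sign-flipped distribution."""
    rng = random.Random(seed)
    observed = abs(sum(diffs))
    exceed = 0
    for _ in range(n_perm):
        flipped = sum(d if rng.random() < 0.5 else -d for d in diffs)
        if abs(flipped) >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)  # add-one to avoid p = 0

# Consistent gains across matched pairs yield a small p-value.
gains = [2, 3, 1, 2, 4, 3, 2, 1, 3, 2]
print(paired_permutation_pvalue(gains) < 0.05)  # True
```

Because the test permutes only within matched pairs, it respects the blocking structure of the design and needs no normality assumption, at the cost of Monte Carlo error in the p-value.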

  12. Effects of Learning Style and Training Method on Computer Attitude and Performance in World Wide Web Page Design Training.

    ERIC Educational Resources Information Center

    Chou, Huey-Wen; Wang, Yu-Fang

    1999-01-01

    Compares the effects of two training methods on computer attitude and performance in a World Wide Web page design program in a field experiment with high school students in Taiwan. Discusses individual differences, Kolb's Experiential Learning Theory and Learning Style Inventory, Computer Attitude Scale, and results of statistical analyses.…

  13. A new assessment of the alleged link between element 115 and element 117 decay chains

    NASA Astrophysics Data System (ADS)

    Forsberg, U.; Rudolph, D.; Fahlander, C.; Golubev, P.; Sarmiento, L. G.; Åberg, S.; Block, M.; Düllmann, Ch. E.; Heßberger, F. P.; Kratz, J. V.; Yakushev, A.

    2016-09-01

    A novel rigorous statistical treatment is applied to available data (May 9, 2016) from search and spectroscopy experiments on the elements with atomic numbers Z = 115 and Z = 117. The present analysis implies that the hitherto proposed cross-reaction link between α-decay chains associated with the isotopes ²⁹³117 and ²⁸⁹115 is highly improbable.

  14. Millimeter wave propagation measurements using the ATS 5 satellite

    NASA Technical Reports Server (NTRS)

    Ippolito, L. J.

    1972-01-01

    The ATS 5 millimeter wave propagation experiment determines long- and short-term attenuation statistics of operational millimeter-wavelength earth-space links as functions of defined meteorological conditions. A preliminary analysis of results with the 15 GHz downlink and 32 GHz uplink frequency bands indicates that both frequency bands exhibit excellent potential for use in reliable, high-data-rate earth-space communications systems.

  15. Exploring Customization in Higher Education: An Experiment in Leveraging Computer Spreadsheet Technology to Deliver Highly Individualized Online Instruction to Undergraduate Business Students

    ERIC Educational Resources Information Center

    Kunzler, Jayson S.

    2012-01-01

    This dissertation describes a research study designed to explore whether customization of online instruction results in improved learning in a college business statistics course. The study involved utilizing computer spreadsheet technology to develop an intelligent tutoring system (ITS) designed to: a) collect and monitor individual real-time…

  16. Constructed Response Tests in the NELS:88 High School Effectiveness Study. National Education Longitudinal Study of 1988 Second Followup. Statistical Analysis Report.

    ERIC Educational Resources Information Center

    Pollock, Judith M.; And Others

    This report describes an experiment in constructed response testing undertaken in conjunction with the National Education Longitudinal Study of 1988 (NELS:88). Constructed response questions are those that require students to produce their own response rather than selecting the correct answer from several options. Participants in this experiment…

  17. Highly Efficient Design-of-Experiments Methods for Combining CFD Analysis and Experimental Data

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Haller, Harold S.

    2009-01-01

    It is the purpose of this study to examine the impact of "highly efficient" Design-of-Experiments (DOE) methods for combining sets of CFD-generated analysis data with smaller sets of experimental test data in order to accurately predict performance results where experimental test data were not obtained. The study examines the impact of micro-ramp flow control on the shock wave boundary layer (SWBL) interaction, where a complete paired set of data exists from both CFD analysis and experimental measurements. By combining the complete set of CFD analysis data, composed of fifteen (15) cases, with a smaller subset of experimental test data containing four or five (4/5) cases, compound data sets (CFD/EXP) were generated which allow the prediction of the complete set of experimental results. No statistical differences were found to exist between the combined (CFD/EXP) generated data sets and the complete experimental data set composed of fifteen (15) cases. The same optimal micro-ramp configuration was obtained using the (CFD/EXP) generated data as with the complete set of experimental data, and the DOE response surfaces generated by the two data sets were also not statistically different.

  18. A novel bi-level meta-analysis approach: applied to biological pathway analysis.

    PubMed

    Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin

    2016-02-01

    The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present real challenges in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework against classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases: acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors in correctly identifying pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors (sorin@wayne.edu). Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
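Two of the classical combiners the framework is compared against are easy to state: Stouffer's method sums probit-transformed p-values, and the additive method uses the CDF of a sum of independent uniforms (the Irwin-Hall distribution). These are minimal sketches of the classical combiners only, not the authors' bi-level procedure:

```python
import math
from statistics import NormalDist

def stouffer(p_values):
    """Stouffer's Z: average the probit scores and convert back to a p-value."""
    nd = NormalDist()
    z = sum(nd.inv_cdf(1.0 - p) for p in p_values) / math.sqrt(len(p_values))
    return 1.0 - nd.cdf(z)

def additive(p_values):
    """Additive (Edgington-style) combination: P(sum of k independent
    uniforms <= observed sum), i.e. the Irwin-Hall CDF."""
    k, s = len(p_values), sum(p_values)
    total = sum((-1) ** j * math.comb(k, j) * max(s - j, 0.0) ** k
                for j in range(k + 1))
    return total / math.factorial(k)

print(round(stouffer([0.5, 0.5]), 6))  # 0.5: no combined evidence
print(round(additive([0.1, 0.1]), 6))  # 0.02: s**2 / 2 for two small p's
```

The additive method's reliance on the sum of p-values, rather than their minimum or product, is part of what makes it less sensitive to a single outlying study than minP- or Fisher-style combiners.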

  19. Study of muon-induced neutron production using accelerator muon beam at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakajima, Y.; Lin, C. J.; Ochoa-Ricoux, J. P.

    2015-08-17

    Cosmogenic muon-induced neutrons are one of the most problematic backgrounds for various underground experiments searching for rare events. In order to accurately understand such backgrounds, experimental data with high statistics and well-controlled systematics are essential. We performed a test experiment to measure the muon-induced neutron production yield and energy spectrum using a high-energy accelerator muon beam at CERN. We successfully observed neutrons from 160 GeV/c muon interactions on lead, and measured kinetic energy distributions for various production angles. Work towards evaluation of the absolute neutron production yield is underway. This work also demonstrates that the setup is feasible for a future large-scale experiment for a more comprehensive study of muon-induced neutron production.

  20. DEM Particle Fracture Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Boning; Herbold, Eric B.; Homel, Michael A.

    2015-12-01

    An adaptive particle fracture model for the poly-ellipsoidal Discrete Element Method is developed. A poly-ellipsoidal particle breaks into several sub-poly-ellipsoids according to the Hoek-Brown fracture criterion, based on the continuum stress and the maximum tensile stress in contacts. Weibull theory is also introduced to account for statistical and size effects on particle strength. Finally, a high strain-rate split Hopkinson pressure bar experiment on silica sand is simulated using this newly developed model. Comparisons with experiments show that the particle fracture model captures the mechanical behavior of this experiment very well, both in stress-strain response and in particle size redistribution. The effects of density and packing of the samples are also studied in numerical examples.
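Weibull weakest-link statistics make the size effect on particle strength explicit: the failure probability of a particle of volume V under stress σ is P_f = 1 − exp(−(V/V0)·(σ/σ0)^m), so the characteristic strength scales as (V0/V)^(1/m). A hedged sketch of that scaling, where σ0, V0 and the modulus m are generic Weibull parameters rather than values from the report:

```python
import math

def failure_probability(stress, volume, v0, sigma0, m):
    """Weibull weakest-link failure probability for a particle of the
    given volume: P_f = 1 - exp(-(V/V0) * (stress/sigma0)**m)."""
    return 1.0 - math.exp(-(volume / v0) * (stress / sigma0) ** m)

def characteristic_strength(volume, v0, sigma0, m):
    """Stress at which P_f = 1 - 1/e; larger particles are weaker."""
    return sigma0 * (v0 / volume) ** (1.0 / m)

# With modulus m = 3, a particle 8x larger by volume is half as strong.
print(characteristic_strength(8.0, 1.0, 100.0, 3))  # ~50.0
```

A low modulus m means widely scattered strengths and a strong size effect, which is why sampling per-particle strengths from a Weibull distribution matters in a DEM fracture model.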

  1. Anatomy of a Jam

    NASA Astrophysics Data System (ADS)

    Tang, Junyao; Sagdighpour, Sepehr; Behringer, Robert

    2008-11-01

    Flow in a hopper is both a fertile testing ground for understanding models of granular flow and industrially highly relevant. However, the formation of arches in the hopper opening, which halts the hopper flow unpredictably, is still poorly understood. In this work, we conduct two-dimensional hopper experiments using photoelastic particles and characterize them in terms of a statistical model that considers the probability of jamming. The distribution of hopper flow times exhibits an exponential decay, which shows the existence of a characteristic ``mean flow time.'' We then conduct further experiments to examine the connection between the mean flow time, the hopper geometry, the local density, and geometric structures and forces at the particle scale.
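    An exponential flow-time distribution is exactly what follows if an arch forms with a fixed small probability at each discharge step. A minimal sketch, with an assumed per-step jamming probability (not a measured value):

```python
import numpy as np

rng = np.random.default_rng(3)
p_jam = 0.002  # assumed probability of forming an arch per discharge step

# With a constant per-step jamming probability, flow times are geometric,
# i.e., exponentially distributed in the continuum limit, with mean 1/p_jam.
flow_times = rng.geometric(p_jam, size=50_000)
mean_flow_time = flow_times.mean()  # estimate of the characteristic mean flow time
```

    The memoryless property of this distribution is the statistical signature of a jamming probability that does not depend on how long the hopper has already flowed.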

  2. Hospital graduate social work field work programs: a study in New York City.

    PubMed

    Showers, N

    1990-02-01

    Twenty-seven hospital field work programs in New York City were studied. Questionnaires were administered to program coordinators and to 238 graduate social work students participating in the study programs. High degrees of program structural complexity and variation were found, indicating a state of the art well beyond that described in the general field work literature. The high rates of student satisfaction with learning, field instructors, programs, and the overall field work experience suggest that the complexity of the study programs may be more effective than traditional field work models. Statistically nonsignificant findings indicate areas in which hospital social work departments may develop field work programs consistent with shifting organizational needs, without undue risk to educational effectiveness. Statistically significant findings suggest areas in which inflexibility in program design may be more beneficial in the diagnosis-related groups era.

  3. Effects of heat-treatment and explosive brisance on fragmentation of high strength steel

    NASA Astrophysics Data System (ADS)

    Stolken, James; Kumar, Mukul; Gold, Vladimir; Baker, Ernest; Lawrence Livermore National Laboratory Collaboration; Armament Research Development and Engineering Collaboration

    2011-06-01

    Tubes of AISI-4340 steel were heat-treated to three distinct microstructures resulting in nominal hardness values of 25 Rc, 38 Rc and 48 Rc. The specimens were then explosively fragmented using TNT and PETN. The experiments were conducted in a contained firing facility with high fragment collection efficiency. Statistical analyses of recovered fragments were performed. Fragment rank-order statistics and generalized goodness-of-fit tests were used to characterize the fragment mass distributions. These analyses indicated significant interaction effects between the heat-treatment (and the resulting microstructure) and the explosive brisance. The role of the microstructure in relation to the yield-strength and toughness will also be discussed. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  4. Real-world visual statistics and infants' first-learned object names.

    PubMed

    Clerkin, Elizabeth M; Hart, Elizabeth; Rehg, James M; Yu, Chen; Smith, Linda B

    2017-01-05

    We offer a new solution to the unsolved problem of how infants break into word learning based on the visual statistics of everyday infant-perspective scenes. Images from head camera video captured by 8 1/2 to 10 1/2 month-old infants at 147 at-home mealtime events were analysed for the objects in view. The images were found to be highly cluttered with many different objects in view. However, the frequency distribution of object categories was extremely right skewed such that a very small set of objects was pervasively present, a fact that may substantially reduce the problem of referential ambiguity. The statistical structure of objects in these infant egocentric scenes differs markedly from that in the training sets used in computational models and in experiments on statistical word-referent learning. Therefore, the results also indicate a need to re-examine current explanations of how infants break into word learning. This article is part of the themed issue 'New frontiers for statistical learning in the cognitive sciences'. © 2016 The Author(s).

  5. The LDEF ultra heavy cosmic ray experiment

    NASA Technical Reports Server (NTRS)

    Osullivan, D.; Thompson, A.; Bosch, J.; Keegan, R.; Wenzel, K.-P.; Smit, A.; Domingo, C.

    1991-01-01

    The Long Duration Exposure Facility (LDEF) Ultra Heavy Cosmic Ray Experiment (UHCRE) used 16 side-viewing LDEF trays, giving a total geometry factor for high-energy cosmic rays of 30 sq m sr. The total exposure factor was 170 sq m sr y. The experiment is based on a modular array of 192 solid state nuclear track detector stacks, mounted in sets of four in pressure vessels (three per experiment tray). The extended duration of the LDEF mission has resulted in a greatly enhanced potential scientific yield from the UHCRE. Initial scanning results indicate that at least 2000 cosmic ray nuclei with Z greater than 65 were collected, including the world's first statistically significant sample of actinides. Postflight work to date and the current status of the experiment are reviewed. Provisional results from the analysis of preflight and postflight calibrations are presented.

  6. Statistical issues in the design and planning of proteomic profiling experiments.

    PubMed

    Cairns, David A

    2015-01-01

    The statistical design of a clinical proteomics experiment is a critical part of a well-conducted investigation. Standard concepts from experimental design, such as randomization, replication, and blocking, should be applied in all experiments; this is possible when the experimental conditions are well understood by the investigator. The large number of proteins simultaneously considered in proteomic discovery experiments means that determining the number of replicates required for a powerful experiment is more complicated than in simple experiments. However, by using information about the nature of an experiment and making simple assumptions, this is achievable for a variety of experiments useful for biomarker discovery and initial validation.

  7. PageMan: an interactive ontology tool to generate, display, and annotate overview graphs for profiling experiments.

    PubMed

    Usadel, Björn; Nagel, Axel; Steinhauser, Dirk; Gibon, Yves; Bläsing, Oliver E; Redestig, Henning; Sreenivasulu, Nese; Krall, Leonard; Hannah, Matthew A; Poree, Fabien; Fernie, Alisdair R; Stitt, Mark

    2006-12-18

    Microarray technology has become a widely accepted and standardized tool in biology. The first microarray data analysis programs were developed to support pairwise comparisons. However, as microarray experiments have become more routine, large-scale experiments have become more common, which investigate multiple time points or sets of mutants or transgenics. To extract biological information from such high-throughput expression data, it is necessary to develop efficient analytical platforms, which combine manually curated gene ontologies with efficient visualization and navigation tools. Currently, most tools focus on a few limited biological aspects, rather than offering a holistic, integrated analysis. Here we introduce PageMan, a multiplatform, user-friendly, and stand-alone software tool that annotates, investigates, and condenses high-throughput microarray data in the context of functional ontologies. It includes a GUI tool to transform different ontologies into a suitable format, enabling the user to compare and choose between different ontologies. It is equipped with several statistical modules for data analysis, including over-representation analysis and Wilcoxon statistical testing. Results are exported in a graphical format for direct use, or for further editing in graphics programs. PageMan provides a fast overview of single treatments and allows genome-level responses to be compared across several microarray experiments, covering, for example, stress responses at multiple time points. This aids in searching for trait-specific changes in pathways using mutants or transgenics, analyzing developmental time courses, and comparisons between species.
In a case study, we analyze the results of publicly available microarrays from multiple cold stress experiments using PageMan, and compare the results to a previously published meta-analysis. PageMan offers a complete user's guide, a web-based over-representation analysis, and a tutorial, and is freely available at http://mapman.mpimp-golm.mpg.de/pageman/. PageMan allows multiple microarray experiments to be efficiently condensed into a single-page graphical display. The flexible interface allows data to be quickly and easily visualized, facilitating comparisons within experiments and to published experiments, thus enabling researchers to gain a rapid overview of the biological responses in the experiments.

  8. Exceedance statistics of accelerations resulting from thruster firings on the Apollo-Soyuz mission

    NASA Technical Reports Server (NTRS)

    Fichtl, G. H.; Holland, R. L.

    1981-01-01

    Spacecraft accelerations resulting from firings of vernier control system thrusters are an important consideration in the design, planning, execution, and post-flight analysis of laboratory experiments in space. In particular, scientists and technologists developing experiments to be performed in space often require statistical information on the magnitude and rate of occurrence of spacecraft accelerations. Typically, these accelerations are stochastic in nature, so it is useful to characterize them in statistical terms. Statistics of spacecraft accelerations are summarized.

  9. Statistical models for the analysis and design of digital polymerase chain reaction (dPCR) experiments

    USGS Publications Warehouse

    Dorazio, Robert; Hunter, Margaret

    2015-01-01

    Statistical methods for the analysis and design of experiments using digital PCR (dPCR) have received only limited attention and have been misused in many instances. To address this issue and to provide a more general approach to the analysis of dPCR data, we describe a class of statistical models for the analysis and design of experiments that require quantification of nucleic acids. These models are mathematically equivalent to generalized linear models of binomial responses that include a complementary, log–log link function and an offset that is dependent on the dPCR partition volume. These models are both versatile and easy to fit using conventional statistical software. Covariates can be used to specify different sources of variation in nucleic acid concentration, and a model’s parameters can be used to quantify the effects of these covariates. For purposes of illustration, we analyzed dPCR data from different types of experiments, including serial dilution, evaluation of copy number variation, and quantification of gene expression. We also showed how these models can be used to help design dPCR experiments, as in selection of sample sizes needed to achieve desired levels of precision in estimates of nucleic acid concentration or to detect differences in concentration among treatments with prescribed levels of statistical power.
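    The complementary log-log model described above can be fit even without a GLM package, because for an intercept-only model the log-likelihood is concave in the log-concentration. The sketch below maximizes it by bisection on the score function; the partition counts, partition volume, and dilution series are made-up illustrative numbers, not data from the paper.

```python
import numpy as np

# Hypothetical dPCR data: three reactions on dilutions of one stock (assumed numbers).
n = np.array([20000, 20000, 20000])  # partitions per reaction
k = np.array([1800, 950, 480])       # positive partitions
v = 0.00085                          # partition volume in microliters (assumed)
d = np.array([1.0, 0.5, 0.25])       # dilution factors
offset = np.log(v * d)               # cloglog offset: log of the effective volume

def score(beta):
    """Derivative of the binomial log-likelihood in beta = log(concentration),
    under P(positive) = 1 - exp(-exp(beta + offset))."""
    mu = np.exp(beta + offset)  # expected copies per partition
    return np.sum((k * np.exp(-mu) / (1.0 - np.exp(-mu)) - (n - k)) * mu)

lo, hi = -10.0, 15.0  # bracket for log-concentration; the score is decreasing
for _ in range(80):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if score(mid) > 0 else (lo, mid)
concentration = np.exp(0.5 * (lo + hi))  # MLE of copies per microliter
```

    In practice one would fit the same model with standard GLM software (binomial family, cloglog link, log-volume offset), which also yields standard errors and lets covariates replace the single intercept.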

  10. Statistical Models for the Analysis and Design of Digital Polymerase Chain Reaction (dPCR) Experiments.

    PubMed

    Dorazio, Robert M; Hunter, Margaret E

    2015-11-03

    Statistical methods for the analysis and design of experiments using digital PCR (dPCR) have received only limited attention and have been misused in many instances. To address this issue and to provide a more general approach to the analysis of dPCR data, we describe a class of statistical models for the analysis and design of experiments that require quantification of nucleic acids. These models are mathematically equivalent to generalized linear models of binomial responses that include a complementary, log-log link function and an offset that is dependent on the dPCR partition volume. These models are both versatile and easy to fit using conventional statistical software. Covariates can be used to specify different sources of variation in nucleic acid concentration, and a model's parameters can be used to quantify the effects of these covariates. For purposes of illustration, we analyzed dPCR data from different types of experiments, including serial dilution, evaluation of copy number variation, and quantification of gene expression. We also showed how these models can be used to help design dPCR experiments, as in selection of sample sizes needed to achieve desired levels of precision in estimates of nucleic acid concentration or to detect differences in concentration among treatments with prescribed levels of statistical power.

  11. Dynamic weakening is limited by granular dynamics

    NASA Astrophysics Data System (ADS)

    Kuwano, O.; Hatano, T.

    2011-12-01

    Earthquakes are the result of the frictional instability of faults containing fine rock powder, called gouge, derived from attrition during past fault motions. Understanding the frictional instability of granular matter in terms of constitutive laws is thus important. Because of the importance of granular matter in industry and engineering, the friction of granular matter has been studied both in solid earth science and in other fields, such as statistical physics. In solid earth science, the rate- and state-dependent friction law was established by laboratory experiments at very low sliding velocities (μm/s to mm/s). Recent experiments conducted at sub-seismic to seismic sliding velocities (mm/s to m/s), however, show that frictional properties are much richer than those predicted by the rate- and state-dependent friction law. One of the most important findings of such experiments is the remarkable weakening due to mechano-chemical effects of frictional heating [Tullis, 2007]. In statistical physics, another empirical law holds for much faster deformation than the former, showing positive shear-rate dependence. Until recently, the friction of granular matter was investigated independently in solid earth science and in statistical physics, and thus the relation between these distinct constitutive laws is not clear. Recently, experimental studies have been reported that connect the achievements of the two fields. For example, a laboratory experiment on dry glass beads under very low normal stress (0.02 to 0.05 MPa), in which the frictional heat is negligible, reveals a transition from velocity-weakening friction at low sliding velocities to velocity-strengthening friction at high sliding velocities [Kuwano et al., 2011]. Importantly, the velocity-strengthening behavior at high sliding velocities is quantitatively the same as that observed in simulations.
The inelastic deformation of the grains therefore plays a vital role at high sliding velocities. In this study, we report a friction experiment under higher pressure (0.1 to 0.9 MPa), in which the frictional heat is significant. To clarify the effect of frictional heat in high-speed friction systematically, we investigated both the pressure and the velocity dependence of the friction coefficient over a wide range of sliding velocities, from aseismic to seismic slip velocities. We observed considerable weakening, described well by a flash-heating theory, above a sliding velocity of 1 cm/s regardless of pressure. At higher velocities, velocity-strengthening behavior replaced the velocity-weakening behavior. This strengthening at higher velocities agrees with data from numerical simulations of sheared granular matter and is therefore described in terms of energy dissipation due to the inelastic deformation of grains. We propose a unified steady-state friction law that describes the velocity and pressure dependence of the steady-state friction coefficient well.

  12. Defining window-boundaries for genomic analyses using smoothing spline techniques

    DOE PAGES

    Beissinger, Timothy M.; Rosa, Guilherme J.M.; Kaeppler, Shawn M.; ...

    2015-04-17

    High-density genomic data is often analyzed by combining information over windows of adjacent markers. Interpretation of data grouped in windows versus at individual locations may increase statistical power, simplify computation, reduce sampling noise, and reduce the total number of tests performed. However, use of adjacent marker information can result in over- or under-smoothing, undesirable window boundary specifications, or highly correlated test statistics. We introduce a method for defining windows based on statistically guided breakpoints in the data, as a foundation for the analysis of multiple adjacent data points. This method involves first fitting a cubic smoothing spline to the data and then identifying the inflection points of the fitted spline, which serve as the boundaries of adjacent windows. This technique does not require prior knowledge of linkage disequilibrium, and therefore can be applied to data collected from individual or pooled sequencing experiments. Moreover, in contrast to existing methods, an arbitrary choice of window size is not necessary, since these are determined empirically and allowed to vary along the genome.
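    A minimal sketch of the boundary-finding idea, using synthetic data in place of real genomic statistics; the smoothing-parameter choice here (the expected residual sum of squares) is an assumption, not the paper's procedure.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
pos = np.arange(300.0)                                       # marker positions (synthetic)
signal = np.sin(pos / 25.0) + rng.normal(0, 0.3, pos.size)   # noisy window statistic

# Cubic smoothing spline; s is set near the expected residual sum of squares n * sigma^2
spline = UnivariateSpline(pos, signal, k=3, s=pos.size * 0.3**2)
second = spline.derivative(n=2)(pos)                         # second derivative on the grid
# Sign changes of the second derivative mark inflection points: the window boundaries
change = np.sign(second[:-1]) != np.sign(second[1:])
boundaries = pos[1:][change]
windows = np.split(pos, np.searchsorted(pos, boundaries))
```

    Each element of `windows` is a run of adjacent markers between consecutive inflection points, so window sizes vary along the sequence instead of being fixed in advance.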

  13. Application of multivariate statistical techniques in microbial ecology.

    PubMed

    Paliy, O; Shankar, V

    2016-03-01

    Recent advances in high-throughput methods of molecular analysis have led to an explosion of studies generating large-scale ecological data sets. A particularly noticeable effect has been attained in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces a large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. © 2016 John Wiley & Sons Ltd.
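    As a concrete instance of the exploratory class of methods mentioned above, a principal-component ordination of a made-up taxon abundance table can be sketched with plain NumPy; the community matrix and the enrichment pattern are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical community table: 12 samples x 30 taxa of raw counts (assumed data)
counts = rng.poisson(5.0, size=(12, 30)).astype(float)
counts[:6, :5] += 20.0  # first six samples enriched in five taxa

rel = counts / counts.sum(axis=1, keepdims=True)  # convert to relative abundances
centered = rel - rel.mean(axis=0)
# PCA via singular value decomposition: an unconstrained (exploratory) ordination
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U * S                    # sample coordinates on the principal axes
explained = S**2 / np.sum(S**2)   # share of variance captured by each axis
```

    Plotting the first two columns of `scores` would show the enriched and non-enriched samples as separate clusters, which is the kind of structure ordination methods are used to reveal.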

  14. Mathematics Anxiety and Statistics Anxiety. Shared but Also Unshared Components and Antagonistic Contributions to Performance in Statistics

    PubMed Central

    Paechter, Manuela; Macher, Daniel; Martskvishvili, Khatuna; Wimmer, Sigrid; Papousek, Ilona

    2017-01-01

    In many social science majors, e.g., psychology, students report high levels of statistics anxiety. However, these majors are often chosen by students who are less prone to mathematics and who might have experienced difficulties and unpleasant feelings in their mathematics courses at school. The present study investigates whether statistics anxiety is a genuine form of anxiety that impairs students' achievements or whether learners mainly transfer previous experiences in mathematics and their anxiety in mathematics to statistics. The relationship between mathematics anxiety and statistics anxiety, their relationship to learning behaviors and to performance in a statistics examination were investigated in a sample of 225 undergraduate psychology students (164 women, 61 men). Data were recorded at three points in time: At the beginning of term students' mathematics anxiety, general proneness to anxiety, school grades, and demographic data were assessed; 2 weeks before the end of term, they completed questionnaires on statistics anxiety and their learning behaviors. At the end of term, examination scores were recorded. Mathematics anxiety and statistics anxiety correlated highly but the comparison of different structural equation models showed that they had genuine and even antagonistic contributions to learning behaviors and performance in the examination. Surprisingly, mathematics anxiety was positively related to performance. It might be that students realized over the course of their first term that knowledge and skills in higher secondary education mathematics are not sufficient to be successful in statistics. Part of mathematics anxiety may then have strengthened positive extrinsic effort motivation by the intention to avoid failure and may have led to higher effort for the exam preparation. However, via statistics anxiety mathematics anxiety also had a negative contribution to performance. 
Statistics anxiety led to higher procrastination in the structural equation model and, therefore, contributed indirectly and negatively to performance. Furthermore, it had a direct negative impact on performance (probably via increased tension and worry in the exam). The results of the study speak for shared but also unique components of statistics anxiety and mathematics anxiety. They are also important for instruction and give recommendations to learners as well as to instructors. PMID:28790938

  15. Mathematics Anxiety and Statistics Anxiety. Shared but Also Unshared Components and Antagonistic Contributions to Performance in Statistics.

    PubMed

    Paechter, Manuela; Macher, Daniel; Martskvishvili, Khatuna; Wimmer, Sigrid; Papousek, Ilona

    2017-01-01

    In many social science majors, e.g., psychology, students report high levels of statistics anxiety. However, these majors are often chosen by students who are less prone to mathematics and who might have experienced difficulties and unpleasant feelings in their mathematics courses at school. The present study investigates whether statistics anxiety is a genuine form of anxiety that impairs students' achievements or whether learners mainly transfer previous experiences in mathematics and their anxiety in mathematics to statistics. The relationship between mathematics anxiety and statistics anxiety, their relationship to learning behaviors and to performance in a statistics examination were investigated in a sample of 225 undergraduate psychology students (164 women, 61 men). Data were recorded at three points in time: At the beginning of term students' mathematics anxiety, general proneness to anxiety, school grades, and demographic data were assessed; 2 weeks before the end of term, they completed questionnaires on statistics anxiety and their learning behaviors. At the end of term, examination scores were recorded. Mathematics anxiety and statistics anxiety correlated highly but the comparison of different structural equation models showed that they had genuine and even antagonistic contributions to learning behaviors and performance in the examination. Surprisingly, mathematics anxiety was positively related to performance. It might be that students realized over the course of their first term that knowledge and skills in higher secondary education mathematics are not sufficient to be successful in statistics. Part of mathematics anxiety may then have strengthened positive extrinsic effort motivation by the intention to avoid failure and may have led to higher effort for the exam preparation. However, via statistics anxiety mathematics anxiety also had a negative contribution to performance. 
Statistics anxiety led to higher procrastination in the structural equation model and, therefore, contributed indirectly and negatively to performance. Furthermore, it had a direct negative impact on performance (probably via increased tension and worry in the exam). The results of the study speak for shared but also unique components of statistics anxiety and mathematics anxiety. They are also important for instruction and give recommendations to learners as well as to instructors.

  16. Engaging Diverse Students in Statistical Inquiry: A Comparison of Learning Experiences and Outcomes of Under-Represented and Non-Underrepresented Students Enrolled in a Multidisciplinary Project-Based Statistics Course

    ERIC Educational Resources Information Center

    Dierker, Lisa; Alexander, Jalen; Cooper, Jennifer L.; Selya, Arielle; Rose, Jennifer; Dasgupta, Nilanjana

    2016-01-01

    Introductory statistics needs innovative, evidence-based teaching practices that support and engage diverse students. To evaluate the success of a multidisciplinary, project-based course, we compared experiences of under-represented (URM) and non-underrepresented students in 4 years of the course. While URM students considered the material more…

  17. Statistical power and optimal design in experiments in which samples of participants respond to samples of stimuli.

    PubMed

    Westfall, Jacob; Kenny, David A; Judd, Charles M

    2014-10-01

    Researchers designing experiments in which a sample of participants responds to a sample of stimuli are faced with difficult questions about optimal study design. The conventional procedures of statistical power analysis fail to provide appropriate answers to these questions because they are based on statistical models in which stimuli are not assumed to be a source of random variation in the data, models that are inappropriate for experiments involving crossed random factors of participants and stimuli. In this article, we present new methods of power analysis for designs with crossed random factors, and we give detailed, practical guidance to psychology researchers planning experiments in which a sample of participants responds to a sample of stimuli. We extensively examine 5 commonly used experimental designs, describe how to estimate statistical power in each, and provide power analysis results based on a reasonable set of default parameter values. We then develop general conclusions and formulate rules of thumb concerning the optimal design of experiments in which a sample of participants responds to a sample of stimuli. We show that in crossed designs, statistical power typically does not approach unity as the number of participants goes to infinity but instead approaches a maximum attainable power value that is possibly small, depending on the stimulus sample. We also consider the statistical merits of designs involving multiple stimulus blocks. Finally, we provide a simple and flexible Web-based power application to aid researchers in planning studies with samples of stimuli.
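    The power ceiling described above can be illustrated with a simplified normal-approximation formula (not the article's exact expressions); the variance components, effect size, and design layout below are all assumptions for the sketch.

```python
import math
from statistics import NormalDist

def approx_power(n_participants, n_stimuli, d=0.5,
                 var_stim=0.1, var_error=0.8, alpha=0.05):
    """Rough two-sided z-test power for a condition contrast in a crossed design
    with stimuli nested in conditions. The stimulus term 4*var_stim/n_stimuli
    does not shrink as participants increase, capping attainable power."""
    se2 = 4.0 * var_stim / n_stimuli + 4.0 * var_error / (n_participants * n_stimuli)
    z = d / math.sqrt(se2)
    zc = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    return NormalDist().cdf(z - zc) + NormalDist().cdf(-z - zc)
```

    With 16 stimuli, power rises with the number of participants but approaches a ceiling below 1 (roughly 0.89 for these assumed components); adding stimuli, not participants, raises the ceiling, matching the qualitative conclusion above.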

  18. "Describing our whole experience": the statistical philosophies of W. F. R. Weldon and Karl Pearson.

    PubMed

    Pence, Charles H

    2011-12-01

    There are two motivations commonly ascribed to historical actors for taking up statistics: to reduce complicated data to a mean value (e.g., Quetelet), and to take account of diversity (e.g., Galton). Different motivations will, it is assumed, lead to different methodological decisions in the practice of the statistical sciences. Karl Pearson and W. F. R. Weldon are generally seen as following directly in Galton's footsteps. I argue for two related theses in light of this standard interpretation, based on a reading of several sources in which Weldon, independently of Pearson, reflects on his own motivations. First, while Pearson does approach statistics from this "Galtonian" perspective, he is, consistent with his positivist philosophy of science, utilizing statistics to simplify the highly variable data of biology. Weldon, on the other hand, is brought to statistics by a rich empiricism and a desire to preserve the diversity of biological data. Secondly, we have here a counterexample to the claim that divergence in motivation will lead to a corresponding separation in methodology. Pearson and Weldon, despite embracing biometry for different reasons, settled on precisely the same set of statistical tools for the investigation of evolution. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. A comparison of InVivoStat with other statistical software packages for analysis of data generated from animal experiments.

    PubMed

    Clark, Robin A; Shoaib, Mohammed; Hewitt, Katherine N; Stanford, S Clare; Bate, Simon T

    2012-08-01

    InVivoStat is a free-to-use statistical software package for analysis of data generated from animal experiments. The package is designed specifically for researchers in the behavioural sciences, where exploiting the experimental design is crucial for reliable statistical analyses. This paper compares the analysis of three experiments conducted using InVivoStat with other widely used statistical packages: SPSS (V19), PRISM (V5), UniStat (V5.6) and Statistica (V9). We show that InVivoStat provides results that are similar to those from the other packages and, in some cases, are more advanced. This investigation provides further validation of InVivoStat and should strengthen users' confidence in this new software package.

  20. Impact of Assimilation on Heavy Rainfall Simulations Using WRF Model: Sensitivity of Assimilation Results to Background Error Statistics

    NASA Astrophysics Data System (ADS)

    Rakesh, V.; Kantharao, B.

    2017-03-01

    Data assimilation is considered one of the most effective tools for improving the forecast skill of mesoscale models. However, for optimum utilization and effective assimilation of observations, many factors need to be taken into account while designing the data assimilation methodology. One of the critical components that determine how much observation information enters the analysis, and how it propagates, is the model background error statistics (BES). The objective of this study is to quantify how the BES used in data assimilation affect the simulation of heavy rainfall events over Karnataka, a southern state in India. Simulations of 40 heavy rainfall events were carried out using the Weather Research and Forecasting Model with and without data assimilation. The assimilation experiments were conducted using global and regional BES, while the experiment with no assimilation was used as the baseline for assessing the impact of data assimilation. The simulated rainfall is verified against high-resolution rain-gage observations over Karnataka. Statistical evaluation using several accuracy and skill measures shows that data assimilation improved the heavy rainfall simulation. Our results showed that the experiment using regional BES outperformed the one that used global BES. Critical thermodynamic variables conducive to heavy rainfall, such as convective available potential energy, are simulated more realistically using regional BES than global BES. These results have important practical implications for the design of forecast platforms and for decision-making during extreme weather events.

  1. Bag-breakup control of surface drag in hurricanes

    NASA Astrophysics Data System (ADS)

    Troitskaya, Yuliya; Zilitinkevich, Sergej; Kandaurov, Alexander; Ermakova, Olga; Kozlov, Dmitry; Sergeev, Daniil

    2016-04-01

    Air-sea interaction at extreme winds is of special interest now in connection with the problem of sea-surface drag reduction at wind speeds exceeding 30-35 m/s. This phenomenon, predicted by Emanuel (1995) and confirmed by a number of field (e.g., Powell et al., 2003) and laboratory (Donelan et al., 2004) experiments, still awaits a physical explanation. Several papers attribute the drag reduction to spume droplets - spray torn off the crests of breaking waves (e.g., Kudryavtsev and Makin, 2011; Bao et al., 2011). The fluxes associated with the spray are determined by the rate of droplet production at the surface, quantified by the sea spray generation function (SSGF), defined as the number of spray particles of radius r produced from a unit area of water surface in unit time. However, the mechanism of spume droplet formation is unknown, and empirical estimates of the SSGF vary over six orders of magnitude; therefore, the production rate of large sea spray droplets is not adequately described, and there are significant uncertainties in estimates of exchange processes in hurricanes. In particular, it is unknown what the air-sea interface looks like and how water is fragmented into spray at hurricane winds. Using high-speed video filming, we observed the mechanisms of spume droplet production at strong winds, investigated their statistics, and compared their efficiency. The experiments showed that generation of spume droplets near the wave crest is caused by the following events: bursting of submerged bubbles, generation and breakup of "projections", and "bag breakup". Statistical analysis of these experiments showed that the main spray-generation mechanism is "bag breakup": the inflation and subsequent bursting of short-lived, sail-like pieces of the water-surface film ("bags").
On the basis of general principles of statistical physics (the model of a canonical ensemble), we developed statistics of the "bag-breakup" events: their number and the statistical distribution of their geometrical parameters as functions of wind speed. Based on these statistics, we estimated the surface stress caused by bags as the average sum of the stresses caused by individual bags, depending on their geometrical parameters. The resulting stress is subject to counteracting effects of increasing wind speed - the increasing number of bags and their decreasing sizes and lifetimes - and the balance yields a peaking dependence of the bag resistance on wind speed: the share of bag stress peaks at U10 ≈ 35 m/s and then decreases. This peaking of the surface stress associated with "bag breakup" explains the seemingly paradoxical non-monotonic wind dependence of the surface drag coefficient, which peaks at winds of about 35 m/s. This work was supported by the Russian Foundation for Basic Research (14-05-91767, 13-05-12093, 16-05-00839, 16-55-52025, 15-35-20953); the experiment and equipment were supported by the Russian Science Foundation (Agreements 14-17-00667 and 15-17-20009, respectively). Yu. Troitskaya, A. Kandaurov and D. Sergeev were partially supported by FP7 Collaborative Project No. 612610.
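    The SSGF mentioned in the abstract can be written compactly. A minimal sketch of the conventional definition (symbols are the standard ones from the spray literature, not taken from this abstract): if dN droplets with radii in [r, r + dr] are produced from surface area dA in time dt, then

```latex
\frac{dF}{dr}(r) \;=\; \frac{dN}{dA \, dt \, dr},
\qquad
\left[\frac{dF}{dr}\right] = \mathrm{m^{-2}\,s^{-1}\,\mu m^{-1}},
```

    so the total number flux of droplets from the surface is the integral of dF/dr over all radii; the six-orders-of-magnitude spread in empirical estimates refers to this function.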

  2. Statistical Optimization of 1,3-Propanediol (1,3-PD) Production from Crude Glycerol by Considering Four Objectives: 1,3-PD Concentration, Yield, Selectivity, and Productivity.

    PubMed

    Supaporn, Pansuwan; Yeom, Sung Ho

    2018-04-30

    This study investigated the biological conversion of crude glycerol, generated as a by-product of a commercial biodiesel production plant, to 1,3-propanediol (1,3-PD). Statistical analysis was employed to derive a model for the individual and interactive effects of glycerol, (NH4)2SO4, trace elements, pH, and cultivation time on four objectives: 1,3-PD concentration, yield, selectivity, and productivity. Optimum conditions for each objective, with its maximum value, were predicted by statistical optimization, and experiments under the optimum conditions verified the predictions. In addition, by systematic analysis of the values of the four objectives, the optimum conditions for 1,3-PD concentration (49.8 g/L initial glycerol, 4.0 g/L (NH4)2SO4, 2.0 mL/L trace elements, pH 7.5, and 11.2 h cultivation time) were determined to be the global optimum culture conditions for 1,3-PD production. Under these conditions, we achieved high 1,3-PD yield (47.4%), selectivity (88.8%), and productivity (2.1 g/L/h) as well as high 1,3-PD concentration (23.6 g/L).

  3. Designing high speed diagnostics

    NASA Astrophysics Data System (ADS)

    Veliz Carrillo, Gerardo; Martinez, Adam; Mula, Swathi; Prestridge, Kathy; Extreme Fluids Team Team

    2017-11-01

    Timing and firing for shock-driven flows is complex because of jitter in the shock tube's mechanical drivers. Consequently, experiments require dynamic triggering of diagnostics from pressure transducers. We explain the design process and criteria for setting up re-shock experiments at the Los Alamos Vertical Shock Tube facility, and the requirements for the particle image velocimetry and planar laser-induced fluorescence measurements needed to calculate Richtmyer-Meshkov variable-density turbulence statistics. Dynamic triggering of diagnostics allows further investigation of the development of the Richtmyer-Meshkov instability at both initial shock and re-shock. Thanks to Los Alamos National Laboratory for funding our project.

  4. Modeling the Test-Retest Statistics of a Localization Experiment in the Full Horizontal Plane.

    PubMed

    Morsnowski, André; Maune, Steffen

    2016-10-01

    Two approaches to modeling the test-retest statistics of a localization experiment, one based on Gaussian distributions and one on surrogate data, are introduced. Their efficiency is investigated using different measures of directional hearing ability. A localization experiment in the full horizontal plane is a challenging task for hearing-impaired patients. In clinical routine, we use this experiment to evaluate the progress of our cochlear implant (CI) recipients. Listening and time effort limit its reproducibility. The localization experiment uses a circle of 12 loudspeakers placed in an anechoic room, a "camera silens". In darkness, HSM sentences are presented at 65 dB pseudo-randomly from all 12 directions with five repetitions. This experiment is modeled by a set of Gaussian distributions with different standard deviations added to a perfect estimator, as well as by surrogate data. Five repetitions per direction are used to produce surrogate-data distributions for the sensation directions. To investigate the statistics, we retrospectively use the data of 33 CI patients with 92 pairs of test-retest measurements from the same day. The first model does not take inversions into account (i.e., permutations of the direction from back to front and vice versa are not considered), although they are common for hearing-impaired persons, particularly in the rear hemisphere. The second model considers these inversions but does not work with all measures. The introduced models successfully describe the test-retest statistics of directional hearing. However, since their applications to the investigated measures perform differently, no general recommendation can be provided. The presented test-retest statistics enable paired test comparisons for localization experiments.
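    The first model (Gaussian directional noise added to a perfect estimator) is simple enough to sketch. The function below is our own illustration, not the authors' code; the noise level and seed are arbitrary, and front/back inversions are deliberately omitted, matching the first model described in the abstract.

```python
import random

def simulate_localization(n_directions=12, reps=5, sigma_deg=30.0, seed=1):
    """Toy test-retest model: response = presented direction plus
    Gaussian angular noise, snapped to the nearest of the 12
    loudspeakers on the circle.  Returns the fraction of trials
    localized correctly.  Inversions are NOT modeled here."""
    rng = random.Random(seed)
    step = 360.0 / n_directions          # 30 degrees between speakers
    hits, total = 0, 0
    for k in range(n_directions):
        presented = k * step
        for _ in range(reps):
            response = presented + rng.gauss(0.0, sigma_deg)
            # snap to nearest loudspeaker, wrapping around the circle
            snapped = round(response / step) % n_directions
            hits += (snapped == k)
            total += 1
    return hits / total
```

    With a small standard deviation nearly every trial is localized correctly; as the noise grows, accuracy falls toward the chance level of 1/12, which is the qualitative behavior such a model is meant to capture.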

  5. 3D reconstruction of carbon nanotube networks from neutron scattering experiments

    DOE PAGES

    Mahdavi, Mostafa; Baniassadi, Majid; Baghani, Mostafa; ...

    2015-09-03

    Structure reconstruction from statistical descriptors, such as scattering data obtained using x-rays or neutrons, is essential to understanding various properties of nanocomposites. Scattering-based reconstruction can provide a realistic model, over various length scales, that can be used for numerical simulations. In this study, 3D reconstruction of a highly loaded carbon nanotube (CNT)-conducting polymer system was performed based on small- and ultra-small-angle neutron scattering (SANS and USANS, respectively) data. These lightweight and flexible materials have recently shown great promise for high-performance thermoelectric energy conversion, and their further improvement requires a thorough understanding of their structure-property relationships. The first step in achieving such understanding is to generate models that contain the hierarchy of CNT networks over nano and micron scales. The studied system is a single-walled carbon nanotube (SWCNT)/poly(3,4-ethylenedioxythiophene):poly(styrene sulfonate) (PEDOT:PSS). SANS and USANS patterns of the different samples containing 10, 30, and 50 wt% SWCNTs were measured. These curves were then used to calculate statistical two-point correlation functions of the nanostructure. These functions, along with the geometrical information extracted from the SANS data and scanning electron microscopy images, were used to reconstruct a representative volume element (RVE) of the nanostructure. Generated RVEs can be used for simulations of various mechanical and physical properties. This work therefore introduces a framework for the reconstruction of 3D RVEs of high-volume-fraction nanocomposites containing high-aspect-ratio fillers from scattering experiments.

  7. Decay pattern of the Pygmy Dipole Resonance in 130Te

    NASA Astrophysics Data System (ADS)

    Isaak, J.; Beller, J.; Fiori, E.; Krtička, M.; Löher, B.; Pietralla, N.; Romig, C.; Rusev, G.; Savran, D.; Scheck, M.; Silva, J.; Sonnabend, K.; Tonchev, A.; Tornow, W.; Weller, H.; Zweidinger, M.

    2014-03-01

    The electric dipole strength distribution in 130Te has been investigated using the method of Nuclear Resonance Fluorescence. The experiments were performed at the Darmstadt High Intensity Photon Setup, using bremsstrahlung as the photon source, and at the High Intensity γ-Ray Source, where quasi-monochromatic, polarized photon beams are provided. Average decay properties of 130Te below the neutron separation energy are determined. Comparison of the experimental data with the predictions of the statistical model indicates that nuclear structure effects play an important role even at sufficiently high excitation energies. Preliminary results will be presented.

  8. Nuclear science outreach program for high school girls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, D.E.; Stone, C.A.

    1996-12-31

    The authors have developed a 2-week summer school on nuclear science for high school girls. This summer school is an outgrowth of a recent American Nuclear Society high school teachers workshop held at San Jose State University. Young scientists are introduced to concepts in nuclear science through a combination of lectures, laboratory experiments, literature research, and visits to local national laboratories and nuclear facilities. Lectures cover a range of topics, including radioactivity and radioactive decay, statistics, fission and fusion, nuclear medicine, and food irradiation. A variety of applications of nuclear science concepts are also presented.

  9. Knowledge of general dentists in the current guidelines for emergency treatment of avulsed teeth and dental trauma prevention.

    PubMed

    de Vasconcellos, Luis Gustavo Oliveira; Brentel, Aline Scalone; Vanderlei, Aleska Dias; de Vasconcellos, Luana Marotta Reis; Valera, Márcia Carneiro; de Araújo, Maria Amélia Máximo

    2009-12-01

    A high prevalence of dental trauma exists, and its effects on function and esthetics deserve the attention of general dentists. The aim of this study was to use a questionnaire to assess the level of general dental practitioners' (GDPs) knowledge of guidelines for dental avulsion and its prevention. The 21-item questionnaire was distributed among 264 GDPs, and the survey was conducted between August and November 2006. The data obtained were statistically analyzed using descriptive analysis and Pearson's chi-square test to determine associations between knowledge regarding emergency treatment and the dentists' training at public or private dental schools and their years of experience. The results showed that the participants exhibited appropriate knowledge of procedures in cases of tooth avulsion and its prevention. The number of correct answers was low in relation to recommended treatment at the site of injury. Storage medium, preparation of the alveolus, and splinting time for receiving the avulsed tooth received a high number of correct answers. A statistically significant association was found between years of experience and recommended treatment at the site of the injury in the case of an avulsed tooth (χ² = 9.384, P = 0.009). In conclusion, this survey showed appropriate knowledge of dental avulsion management and its prevention among the surveyed dentists. The findings also showed that communication between dentists and the population is deficient, especially concerning practitioners of high-risk and contact sports.
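    Pearson's chi-square test of association, as used in the abstract above, reduces to comparing observed and expected cell counts in a contingency table. A minimal self-contained sketch (the table in the usage example is made up for illustration, not the study's data):

```python
def chi_square(table):
    """Pearson's chi-square statistic for a contingency table given as
    a list of rows of observed counts.  Returns (chi2, degrees_of_freedom).
    Expected counts come from the usual independence model:
    E[i][j] = row_total[i] * col_total[j] / grand_total."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            chi2 += (observed - expected) ** 2 / expected
    dof = (len(table) - 1) * (len(table[0]) - 1)
    return chi2, dof

# Hypothetical 2x2 table: rows = experience bands, columns = correct/incorrect.
stat, dof = chi_square([[30, 20], [15, 35]])
```

    The statistic is then compared against the chi-square distribution with the given degrees of freedom to obtain a P value such as the study's P = 0.009.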

  10. Calocube - A highly segmented calorimeter for a space-based experiment

    NASA Astrophysics Data System (ADS)

    D`Alessandro, R.; Adriani, O.; Agnesi, A.; Albergo, S.; Auditore, L.; Basti, A.; Berti, E.; Bigongiari, G.; Bonechi, L.; Bonechi, S.; Bongi, M.; Bonvicini, V.; Bottai, S.; Brogi, P.; Carotenuto, G.; Castellini, G.; Cattaneo, P. W.; Cauz, D.; Chiari, M.; Daddi, N.; Detti, S.; Fasoli, M.; Finetti, N.; Gregorio, A.; Lenzi, P.; Maestro, P.; Marrocchesi, P. S.; Miritello, M.; Mori, N.; Pacini, L.; Papini, P.; Pauletta, G.; Pirzio, F.; Rappazzo, G. F.; Rappoldi, A.; Ricciarini, S.; Santi, L. G.; Spillantini, P.; Starodubtsev, O.; Suh, J. E.; Sulaj, A.; Tiberio, A.; Tricomi, A.; Trifiro, A.; Trimarchi, M.; Vannuccini, E.; Vedda, A.; Zampa, G.; Zampa, N.; Zerbo, B.

    2016-07-01

    Future research in High Energy Cosmic Ray Physics concerns fundamental questions on the origin, acceleration mechanism, and composition of cosmic rays. Unambiguous measurements of the energy spectra and of the composition of cosmic rays at the "knee" region could provide some of the answers to these questions. Only ground-based observations, which rely on sophisticated models describing high-energy interactions in the Earth's atmosphere, have been possible so far due to the extremely low particle rates at these energies. A calorimeter-based space experiment can provide not only flux measurements but also energy spectra and particle identification, especially when coupled to a dE/dx measuring detector, and thus overcome some of the limitations plaguing ground-based experiments. For this to be possible, very large acceptances are needed if enough statistics are to be collected in a reasonable time. This contrasts with the lightness and compactness requirements for space-based experiments. A novel idea in calorimetry is discussed here which addresses these issues while limiting the mass and volume of the detector. In fact, a small prototype is currently being built and tested with ions. In this paper the results obtained will be presented in light of the simulations performed.

  11. Is there an emotional cost of completing high school? Ecological factors and psychological distress among LGBT homeless youth.

    PubMed

    Bidell, Markus P

    2014-01-01

    This study explored the nexus of home and school climate on the psychological distress of lesbian, gay, bisexual, and transgender (LGBT) homeless youth, as well as their experiences during high school. Of the LGBT homeless youth (N = 89) surveyed, 39.3% reported not completing high school. Most participants did not seek support from school staff nor did they report attending a school with a Gay-Straight Alliance. Significantly higher levels of psychological distress were found among high school graduates and those reporting LGBT harassment at home; however, harassment experienced at school was not statistically related to psychological distress. Findings are discussed.

  12. Capturing readiness to learn and collaboration as explored with an interprofessional simulation scenario: A mixed-methods research study.

    PubMed

    Rossler, Kelly L; Kimble, Laura P

    2016-01-01

    Didactic lecture does not lend itself to teaching interprofessional collaboration. High-fidelity human patient simulation, with its focus on clinical situations and scenarios, is highly conducive to interprofessional education. Consequently, there is a need for research supporting the incorporation of interprofessional education into high-fidelity patient simulation-based technology. The purpose of this study was to explore readiness for interprofessional learning and collaboration among pre-licensure health professions students participating in an interprofessional education human patient simulation experience. Using a mixed-methods convergent parallel design, a sample of 53 pre-licensure health professions students enrolled in nursing, respiratory therapy, health administration, and physical therapy programs within a college of health professions participated in high-fidelity human patient simulation experiences. Perceptions of interprofessional learning and collaboration were measured with the revised Readiness for Interprofessional Learning Scale (RIPLS) and the Health Professional Collaboration Scale (HPCS). Focus groups were conducted during the simulation post-briefing to obtain qualitative data. Statistical analysis included non-parametric inferential statistics. Qualitative data were analyzed using a phenomenological approach. Pre- and post-RIPLS scores demonstrated that pre-licensure health professions students reported significantly more positive attitudes about readiness for interprofessional learning post-simulation in the areas of teamwork and collaboration, negative professional identity, and positive professional identity. Post-simulation HPCS scores revealed that pre-licensure nursing and health administration groups reported greater health collaboration during simulation than physical therapy students. Qualitative analysis yielded three themes: "exposure to experiential learning," "acquisition of interactional relationships," and "presence of chronology in role preparation." 
Quantitative and qualitative data converged around the finding that physical therapy students had less positive perceptions of the experience because they viewed physical therapy practice as occurring one-on-one rather than in groups. Findings support that pre-licensure students are ready to engage in interprofessional education through exposure to an experiential format such as high-fidelity human patient simulation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Persistence in STEM: An investigation of the relationship between high school experiences in science and mathematics and college degree completion in STEM fields

    NASA Astrophysics Data System (ADS)

    Maltese, Adam V.

    While the number of Bachelor's degrees awarded annually has nearly tripled over the past 40 years (NSF, 2008), the same cannot be said for degrees in the STEM (science, technology, engineering, and mathematics) fields. The Bureau of Labor Statistics projects that by the year 2014 the combination of new positions and retirements will lead to 2 million job openings in STEM (BLS, 2005). Thus, the research questions I sought to answer with this study were: (1) What are the most common enrollment patterns for students who enter into and exit from the STEM pipeline during high school and college? (2) Controlling for differences in student background and early interest in STEM careers, what are the high school science and mathematics classroom experiences that characterize student completion of a college major in STEM? Using data from NELS:88, I analyzed descriptive statistics and completed logistic regressions to gain an understanding of factors related to student persistence in STEM. Approximately 4,700 students who had transcript records and participated in all survey rounds were included in the analyses. The results of the descriptive analysis demonstrated that most students who went on to complete majors in STEM completed at least three or four years of STEM courses during high school and enrolled in advanced high school mathematics and science courses at higher rates. At almost every pipeline checkpoint, indicators of the level of coursework and achievement were significant in predicting student completion of a STEM degree. The results also support previous research showing that demographic variables have little effect on persistence once the sample is limited to those who have the intrinsic ability and desire to complete a college degree. 
The most significant finding is that measures of student interest and engagement in science and mathematics were significant in predicting completion of a STEM degree, above and beyond the effects of course enrollment and performance. A final analysis, which involved the comparison of descriptive statistics for students who switched into and out of the STEM pipeline during high school, suggested that attitudes toward mathematics and science play a major role in choices regarding pipeline persistence.

  14. Mapping probabilities of extreme continental water storage changes from space gravimetry

    NASA Astrophysics Data System (ADS)

    Kusche, J.; Eicker, A.; Forootan, E.; Springer, A.; Longuevergne, L.

    2016-08-01

    Using data from the Gravity Recovery And Climate Experiment (GRACE) mission, we derive statistically robust "hot spot" regions of high probability of peak anomalous—i.e., with respect to the seasonal cycle—water storage (of up to 0.7 m one-in-five-year return level) and flux (up to 0.14 m/month). Analysis of, and comparison with, up to 32 years of ERA-Interim reanalysis fields reveals generally good agreement of these hot spot regions to GRACE results and that most exceptions are located in the tropics. However, a simulation experiment reveals that differences observed by GRACE are statistically significant, and further error analysis suggests that by around the year 2020, it will be possible to detect temporal changes in the frequency of extreme total fluxes (i.e., combined effects of mainly precipitation and floods) for at least 10-20% of the continental area, assuming that we have a continuation of GRACE by its follow-up GRACE Follow-On (GRACE-FO) mission.

  15. Inferring Models of Bacterial Dynamics toward Point Sources

    PubMed Central

    Jashnsaz, Hossein; Nguyen, Tyler; Petrache, Horia I.; Pressé, Steve

    2015-01-01

    Experiments have shown that bacteria can be sensitive to small variations in chemoattractant (CA) concentrations. Motivated by these findings, our focus here is on a regime rarely studied in experiments: bacteria tracking point CA sources (such as food patches or even prey). In tracking point sources, the CA detected by bacteria may show very large spatiotemporal fluctuations which vary with distance from the source. We present a general statistical model to describe how bacteria locate point sources of food on the basis of stochastic event detection, rather than CA gradient information. We show how all model parameters can be directly inferred from single cell tracking data even in the limit of high detection noise. Once parameterized, our model recapitulates bacterial behavior around point sources such as the “volcano effect”. In addition, while the search by bacteria for point sources such as prey may appear random, our model identifies key statistical signatures of a targeted search for a point source given any arbitrary source configuration. PMID:26466373

  16. Bio hydrogen production from cassava starch by anaerobic mixed cultures: Multivariate statistical modeling

    NASA Astrophysics Data System (ADS)

    Tien, Hai Minh; Le, Kien Anh; Le, Phung Thi Kim

    2017-09-01

    Bio hydrogen is a sustainable energy resource due to its potentially high efficiency of conversion to usable power and its non-polluting nature. In this work, experiments were carried out to demonstrate the possibility of generating bio hydrogen from cassava starch and to identify the effective factors and optimum conditions. Experimental design was used to investigate the effect of operating temperature (37-43 °C), pH (6-7), and inoculum ratio (6-10%) on the hydrogen production yield, the COD reduction, and the ratio of the volume of hydrogen produced to the COD reduction. The statistical analysis of the experiment indicated that the significant effects on the fermentation yield were the main effects of temperature, pH, and inoculum ratio; the interaction effects between them appeared not significant. The central composite design showed that the polynomial regression models were in good agreement with the experimental results. This result will be applied to enhance the treatment of cassava starch processing wastewater.
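    The central composite design mentioned above supports fitting a full second-order (quadratic) response surface by least squares. A minimal two-factor sketch with NumPy; the function name, coded factor levels, and coefficients below are our illustration, not the study's data (the real study used three factors):

```python
import numpy as np

def fit_quadratic_surface(x1, x2, y):
    """Least-squares fit of a second-order response surface
    y ≈ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2,
    the kind of polynomial model paired with a central composite design."""
    X = np.column_stack([np.ones_like(x1), x1, x2,
                         x1 ** 2, x2 ** 2, x1 * x2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Two-factor CCD in coded units: 4 factorial points, 4 axial points, 1 center.
x1 = np.array([-1.0, 1.0, -1.0, 1.0, -1.414, 1.414, 0.0, 0.0, 0.0])
x2 = np.array([-1.0, -1.0, 1.0, 1.0, 0.0, 0.0, -1.414, 1.414, 0.0])
```

    The stationary point of the fitted surface (where both partial derivatives vanish) then gives the predicted optimum conditions, which the study verified experimentally.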

  17. Exceedance statistics of accelerations resulting from thruster firings on the Apollo-Soyuz mission

    NASA Technical Reports Server (NTRS)

    Fichtl, G. H.; Holland, R. L.

    1983-01-01

    Spacecraft accelerations resulting from firings of vernier control system thrusters are an important consideration in the design, planning, execution, and post-flight analysis of laboratory experiments in space. In particular, scientists and technologists developing experiments to be performed in space often require statistical information on the magnitude and rate of occurrence of spacecraft accelerations. Typically, these accelerations are stochastic in nature, so it is useful to characterize them in statistical terms. Statistics of spacecraft accelerations are summarized. Previously announced in STAR as N82-12127.

  18. Eustachian Tube Mucosal Inflammation Scale Validation Based on Digital Video Images.

    PubMed

    Kivekäs, Ilkka; Pöyhönen, Leena; Aarnisalo, Antti; Rautiainen, Markus; Poe, Dennis

    2015-12-01

    The most common cause of Eustachian tube dilatory dysfunction is mucosal inflammation. The aim of this study was to validate a scale for Eustachian tube mucosal inflammation based on digital video clips obtained during diagnostic rigid endoscopy. A previously described four-step scale for grading the degree of inflammation of the mucosa of the Eustachian tube lumen was used for this validation study. A tutorial for use of the scale, including static images and 10-second video clips, was presented to 26 clinicians with various levels of experience. Each clinician then reviewed 35 short digital video samples of Eustachian tubes from patients and rated the degree of inflammation. A subset of the clinicians performed a second rating of the same video clips at a subsequent time. Statistical analysis of the ratings provided inter- and intrarater reliability scores. Twenty-six clinicians with various levels of experience rated a total of 35 videos; thirteen clinicians rated the videos twice. The overall correlation coefficient for the rating of inflammation severity was relatively good (0.74; 95% confidence interval, 0.72-0.76). The intraclass correlation coefficient for intrarater reliability was high (0.86). For those who rated videos twice, the intraclass correlation coefficient improved after the first rating (0.73 to 0.76), but the improvement was not statistically significant. The inflammation scale used for Eustachian tube mucosal inflammation is reliable, and this scale can be used with a high level of consistency by clinicians with various levels of experience.
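    The inter-rater reliability indices reported above are intraclass correlations, which compare between-target to within-target variance. A minimal sketch of the simplest one-way variant, ICC(1,1) (our own illustration; the study's exact ICC variant is not stated in the abstract):

```python
def icc_oneway(ratings):
    """One-way random-effects intraclass correlation, ICC(1,1).
    ratings: list of per-target lists (e.g. one list of rater scores
    per video).  Returns a value near 1 when raters agree closely
    relative to the spread between targets."""
    n = len(ratings)            # number of targets (videos)
    k = len(ratings[0])         # raters per target
    grand = sum(sum(r) for r in ratings) / (n * k)
    target_means = [sum(r) / k for r in ratings]
    ss_between = k * sum((m - grand) ** 2 for m in target_means)
    ss_within = sum((x - m) ** 2
                    for r, m in zip(ratings, target_means) for x in r)
    ms_between = ss_between / (n - 1)
    ms_within = ss_within / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```

    Perfect agreement across raters yields 1.0; agreement no better than chance yields values near or below zero, which is why the reported 0.86 indicates high intrarater reliability.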

  19. Self-reported experience of bullying of students who stutter: relations with life satisfaction, life orientation, and self-esteem.

    PubMed

    Blood, Gordon W; Blood, Ingrid M; Tramontana, G Michael; Sylvia, Anna J; Boyle, Michael P; Motzko, Gina R

    2011-10-01

    Self-reported self-esteem, life orientation, satisfaction with life, and bullying were examined in relation to victimization experiences among 54 students who stuttered and 54 students who did not stutter. Those who stuttered reported greater, i.e., clinically significant, victimization (44.4%) than students who did not stutter (9.2%). Significant differences were found between means for self-esteem and life orientation, with students who stuttered reporting lower self-esteem and less optimistic life orientation than those who did not stutter. In both groups of students, high victimization scores had statistically significant negative correlations with optimistic life orientation, high self-esteem, and high satisfaction with life scores. Given the increased likelihood of students who stuttered being bullied, the negative relation of adjustment variables and bullying, and the potentially negative long-term effects of bullying, increased vigilance and early intervention are discussed.

  20. Correcting systematic errors in high-sensitivity deuteron polarization measurements

    NASA Astrophysics Data System (ADS)

    Brantjes, N. P. M.; Dzordzhadze, V.; Gebel, R.; Gonnella, F.; Gray, F. E.; van der Hoek, D. J.; Imig, A.; Kruithof, W. L.; Lazarus, D. M.; Lehrach, A.; Lorentz, B.; Messi, R.; Moricciani, D.; Morse, W. M.; Noid, G. A.; Onderwater, C. J. G.; Özben, C. S.; Prasuhn, D.; Levi Sandri, P.; Semertzidis, Y. K.; da Silva e Silva, M.; Stephenson, E. J.; Stockhorst, H.; Venanzoni, G.; Versolato, O. O.

    2012-02-01

    This paper reports deuteron vector and tensor beam polarization measurements taken to investigate the systematic variations due to geometric beam misalignments and high data rates. The experiments used the In-Beam Polarimeter at KVI-Groningen and the EDDA detector at the Cooler Synchrotron COSY at Jülich. By measuring with very high statistical precision, the contributions that are second-order in the systematic errors become apparent. By calibrating the sensitivity of the polarimeter to such errors, it becomes possible to obtain information from the raw count rates on the size of the errors and to use this information to correct the polarization measurements. During the experiment, it was possible to demonstrate that corrections were satisfactory at the level of 10^-5 for deliberately large errors. This may facilitate the real-time observation of vector polarization changes smaller than 10^-6 in a search for an electric dipole moment using a storage ring.

  1. Dynamic Conductivity and Partial Ionization in Warm, Dense Hydrogen

    NASA Astrophysics Data System (ADS)

    Zaghoo, M.; Silvera, I. F.

    2017-10-01

    A theoretical description for optical conduction experiments in dense fluid hydrogen is presented. Different quantum statistical approaches are used to describe the mechanism of electron transport in hydrogen's high-temperature dense phase. We show that at the onset of the metallic transition, optical conduction could be described by a strong rise in the atomic polarizability, resulting from increased ionization; whereas in the highly degenerate limit, the Ziman weak-scattering model better describes the observed saturation of reflectance. In the highly degenerate region, the inclusion of partial ionization effects provides excellent agreement with experimental results. Hydrogen's fluid metallic state is revealed to be a partially ionized free-electron plasma. These results provide a crucial benchmark for ab initio calculations as well as an important guide for future experiments. Research supported by DOE Stockpile Stewardship Academic Alliance Program, Grant DE-FG52-10NA29656, and NASA Earth and Space Science Fellowship Program, Award NNX14AP17H.

  2. Characterization of DNA-protein interactions using high-throughput sequencing data from pulldown experiments

    NASA Astrophysics Data System (ADS)

    Moreland, Blythe; Oman, Kenji; Curfman, John; Yan, Pearlly; Bundschuh, Ralf

    Methyl-binding domain (MBD) protein pulldown experiments have been a valuable tool in measuring the levels of methylated CpG dinucleotides. Due to the frequent use of this technique, high-throughput sequencing data sets are available that allow a detailed quantitative characterization of the underlying interaction between methylated DNA and MBD proteins. Analyzing such data sets, we first found that two such proteins cannot bind closer to each other than 2 bp, consistent with structural models of the DNA-protein interaction. Second, the large amount of sequencing data allowed us to find rather weak but nevertheless clearly statistically significant sequence preferences for several bases around the required CpG. These results demonstrate that pulldown sequencing is a high-precision tool in characterizing DNA-protein interactions. This material is based upon work supported by the National Science Foundation under Grant No. DMR-1410172.

  3. 'Chain pooling' model selection as developed for the statistical analysis of a rotor burst protection experiment

    NASA Technical Reports Server (NTRS)

    Holms, A. G.

    1977-01-01

    A statistical decision procedure called chain pooling had been developed for model selection in fitting the results of a two-level fixed-effects full or fractional factorial experiment without replication. The basic strategy included the use of one nominal level of significance for a preliminary test and a second nominal level of significance for the final test. The subject has been reexamined from the point of view of using as many as three successive statistical model deletion procedures in fitting the results of a single experiment. The investigation consisted of random number studies intended to simulate the results of a proposed aircraft turbine-engine rotor-burst-protection experiment. As a conservative approach, population model coefficients were chosen to represent a saturated 2⁴ experiment with a distribution of parameter values unfavorable to the decision procedures. Three model selection strategies were developed.
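
    The pool-interactions-as-error idea underlying such procedures can be sketched with an invented unreplicated 2³ factorial (an illustration of the general approach, not Holms' exact chain-pooling algorithm; the design size, responses, and variable names are all made up for the example):

```python
import itertools

# Toy unreplicated two-level factorial (2^3 for brevity): estimate all
# effects, then pool the interaction contrasts as an error estimate.
design = list(itertools.product([-1, 1], repeat=3))      # 8 runs (A, B, C)
response = [3.0, 3.1, 2.9, 3.0, 9.0, 9.1, 8.9, 9.0]      # invented data

def effect(contrast):
    """Difference between mean response at high and low contrast settings."""
    return sum(c * y for c, y in zip(contrast, response)) / (len(response) / 2)

contrasts = {
    "A":   [a for a, b, c in design],
    "B":   [b for a, b, c in design],
    "C":   [c for a, b, c in design],
    "AB":  [a * b for a, b, c in design],
    "AC":  [a * c for a, b, c in design],
    "BC":  [b * c for a, b, c in design],
    "ABC": [a * b * c for a, b, c in design],
}
effects = {name: effect(con) for name, con in contrasts.items()}

# Pool the interaction effects (assumed null) into a noise estimate; chain
# pooling repeats this deletion step at successive nominal significance
# levels, testing the remaining effects against the pooled error.
pooled = [effects[k] for k in ("AB", "AC", "BC", "ABC")]
noise = (sum(e * e for e in pooled) / len(pooled)) ** 0.5
print(sorted(effects.items(), key=lambda kv: -abs(kv[1])))
print(f"pooled error estimate: {noise:.3f}")
```

    With these invented responses only factor A is active, so its contrast dominates and the pooled interactions supply a near-zero error estimate.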

  4. Fear and loathing: undergraduate nursing students' experiences of a mandatory course in applied statistics.

    PubMed

    Hagen, Brad; Awosoga, Oluwagbohunmi A; Kellett, Peter; Damgaard, Marie

    2013-04-23

    This article describes the results of a qualitative research study evaluating nursing students' experiences of a mandatory course in applied statistics, and the perceived effectiveness of teaching methods implemented during the course. Fifteen nursing students in the third year of a four-year baccalaureate program in nursing participated in focus groups before and after taking the mandatory course in statistics. The interviews were transcribed and analyzed using content analysis to reveal four major themes: (i) "one of those courses you throw out?," (ii) "numbers and terrifying equations," (iii) "first aid for statistics casualties," and (iv) "re-thinking curriculum." Overall, the data revealed that although nursing students initially enter statistics courses with considerable skepticism, fear, and anxiety, there are a number of concrete actions statistics instructors can take to reduce student fear and increase the perceived relevance of courses in statistics.

  5. Simulation and statistics: Like rhythm and song

    NASA Astrophysics Data System (ADS)

    Othman, Abdul Rahman

    2013-04-01

    Simulation has been introduced to solve problems in the form of systems. Two kinds of problem can be addressed with this technique. First, a problem may have an analytical solution, yet running a real experiment to study it would be too costly in money or even in lives. Second, a problem may have no analytical solution at all. In the field of statistical inference the second kind is often encountered. With the advent of high-speed computing devices, a statistician can now use resampling techniques such as the bootstrap and permutation methods to form a pseudo-sampling distribution that leads to the solution of a problem that cannot be solved analytically. This paper discusses how Monte Carlo simulation was, and still is, used to verify analytical solutions in inference. It also discusses resampling techniques as simulation techniques, examines common misunderstandings about the two, and describes successful uses of both.
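
    The resampling idea described in this record can be sketched with a minimal percentile bootstrap (the data are invented; this illustrates the general technique, not the paper's specific examples):

```python
import random
import statistics

random.seed(42)  # reproducible pseudo-sampling distribution

# A small sample whose statistic (here the median) has no convenient
# analytical sampling distribution.
sample = [12.1, 9.8, 14.3, 10.5, 11.7, 13.2, 9.9, 15.0, 10.8, 12.6]

# Percentile bootstrap: resample with replacement many times and treat the
# empirical distribution of the recomputed statistic as a pseudo-sampling
# distribution for the estimator.
boot = sorted(
    statistics.median(random.choices(sample, k=len(sample)))
    for _ in range(10_000)
)
lo, hi = boot[249], boot[9749]  # central 95% of the bootstrap replicates
print(f"median = {statistics.median(sample):.2f}, 95% CI ≈ ({lo:.2f}, {hi:.2f})")
```

    The interval comes entirely from resampling the observed data, which is exactly the situation the record describes: an inference problem solved by simulation rather than by a closed-form sampling distribution.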

  6. Commissioning of the NPDGamma Detector Array: Counting Statistics in Current Mode Operation and Parity Violation in the Capture of Cold Neutrons on B₄C and ²⁷Al.

    PubMed

    Gericke, M T; Bowman, J D; Carlini, R D; Chupp, T E; Coulter, K P; Dabaghyan, M; Desai, D; Freedman, S J; Gentile, T R; Gillis, R C; Greene, G L; Hersman, F W; Ino, T; Ishimoto, S; Jones, G L; Lauss, B; Leuschner, M B; Losowski, B; Mahurin, R; Masuda, Y; Mitchell, G S; Muto, S; Nann, H; Page, S A; Penttila, S I; Ramsay, W D; Santra, S; Seo, P-N; Sharapov, E I; Smith, T B; Snow, W M; Wilburn, W S; Yuan, V; Zhu, H

    2005-01-01

    The NPDGamma γ-ray detector has been built to measure, with high accuracy, the size of the small parity-violating asymmetry in the angular distribution of gamma rays from the capture of polarized cold neutrons by protons. The high cold neutron flux at the Los Alamos Neutron Scattering Center (LANSCE) spallation neutron source and control of systematic errors require the use of current mode detection with vacuum photodiodes and low-noise solid-state preamplifiers. We show that the detector array operates at the counting-statistics limit and that the asymmetries due to B₄C and ²⁷Al are zero to within 2 × 10⁻⁶ and 7 × 10⁻⁷, respectively. Boron and aluminum are used throughout the experiment. The results presented here are preliminary.

  7. Ice Mass Change in Greenland and Antarctica Between 1993 and 2013 from Satellite Gravity Measurements

    NASA Technical Reports Server (NTRS)

    Talpe, Matthieu J.; Nerem, R. Steven; Forootan, Ehsan; Schmidt, Michael; Lemoine, Frank G.; Enderlin, Ellyn M.; Landerer, Felix W.

    2017-01-01

    We construct long-term time series of Greenland and Antarctic ice sheet mass change from satellite gravity measurements. A statistical reconstruction approach is developed based on a principal component analysis (PCA) to combine high-resolution spatial modes from the Gravity Recovery and Climate Experiment (GRACE) mission with the gravity information from conventional satellite tracking data. Uncertainties of this reconstruction are rigorously assessed; they include temporal limitations for short GRACE measurements, spatial limitations for the low-resolution conventional tracking data measurements, and limitations of the estimated statistical relationships between low- and high-degree potential coefficients reflected in the PCA modes. Trends of mass variations in Greenland and Antarctica are assessed against a number of previous studies. The resulting time series for Greenland show a higher rate of mass loss than other methods before 2000, while the Antarctic ice sheet appears heavily influenced by interannual variations.

  8. Does a single session of reading literary fiction prime enhanced mentalising performance? Four replication experiments of Kidd and Castano (2013).

    PubMed

    Samur, Dalya; Tops, Mattie; Koole, Sander L

    2018-02-01

    Prior experiments indicated that reading literary fiction improves mentalising performance relative to reading popular fiction, non-fiction, or not reading. However, the experiments had relatively small sample sizes and hence low statistical power. To address this limitation, the present authors conducted four high-powered replication experiments (combined N = 1006) testing the causal impact of reading literary fiction on mentalising. Relative to the original research, the present experiments used the same literary texts in the reading manipulation; the same mentalising task; and the same kind of participant samples. Moreover, one experiment was pre-registered as a direct replication. In none of the experiments did reading literary fiction have any effect on mentalising relative to control conditions. The results replicate earlier findings that familiarity with fiction is positively correlated with mentalising. Taken together, the present findings call into question whether a single session of reading fiction leads to immediate improvements in mentalising.

  9. High precision measurements of ²⁶Na β⁻ decay

    NASA Astrophysics Data System (ADS)

    Grinyer, G. F.; Svensson, C. E.; Andreoiu, C.; Andreyev, A. N.; Austin, R. A.; Ball, G. C.; Chakrawarthy, R. S.; Finlay, P.; Garrett, P. E.; Hackman, G.; Hardy, J. C.; Hyland, B.; Iacob, V. E.; Koopmans, K. A.; Kulp, W. D.; Leslie, J. R.; MacDonald, J. A.; Morton, A. C.; Ormand, W. E.; Osborne, C. J.; Pearson, C. J.; Phillips, A. A.; Sarazin, F.; Schumaker, M. A.; Scraggs, H. C.; Schwarzenberg, J.; Smith, M. B.; Valiente-Dobón, J. J.; Waddington, J. C.; Wood, J. L.; Zganjar, E. F.

    2005-04-01

    The half-life and β-branching ratios for the β⁻ decay of ²⁶Na to ²⁶Mg have been measured with high precision in β-counting and γ-decay experiments, respectively. A 4π proportional counter and fast tape transport system were employed for the half-life measurement, whereas the γ rays emitted by the daughter nucleus ²⁶Mg were detected with the 8π γ-ray spectrometer, both located at TRIUMF's isotope separator and accelerator radioactive beam facility. The half-life of ²⁶Na was determined to be T₁/₂ = 1.07128 ± 0.00013 ± 0.00021 s, where the first error is statistical and the second systematic. The log ft values derived from these experiments are compared with theoretical values from a full sd-shell model calculation.

  10. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.

  11. The impact of being bullied at school on psychological distress and work engagement in a community sample of adult workers in Japan

    PubMed Central

    Iwanaga, Mai; Imamura, Kotaro; Shimazu, Akihito

    2018-01-01

    Objective The aim of this study was to investigate the long-term impact of being bullied at school on current psychological distress and work engagement in adulthood among Japanese workers. We hypothesized that workers who had been bullied at school could have higher psychological distress and lower work engagement compared to those who had not been bullied. Methods We used data from the Japanese Study on Stratification, Health, Income, and Neighborhood (J-SHINE) project, conducted from July 2010 to February 2011 in Japan. This survey randomly selected local residents around a metropolitan area in Japan. Of 13,920 adults originally selected, 4,317 people participated in this survey, for a total response rate of 31%. The self-administered questionnaires assessed current psychological distress (K6), work engagement (UWES), experiences of being bullied in elementary or junior high school, and other covariates. Statistical analyses were conducted only for workers. Hierarchical multiple regression analyses with six steps were conducted to determine associations between experiences of being bullied at school and psychological distress/work engagement. Results Statistical analysis was conducted for 3,111 workers. The number of respondents who reported being bullied in elementary or junior high school was 1,318 (42%). We found that the experience of being bullied at school was significantly associated with high psychological distress in adulthood (β = .079, p < .0001); however, the work engagement scores of respondents who were bullied were significantly higher than for people who were not bullied at school (β = .068, p < .0001), after adjusting for all covariates. Conclusion Being bullied at school was positively associated with both psychological distress and work engagement in a sample of workers. Being bullied at school may be a predisposing factor for psychological distress, as previously reported. The higher levels of work engagement among people who experienced being bullied at school may be because some of them overcame the experience and gained greater psychological resilience. PMID:29746552

  12. The impact of being bullied at school on psychological distress and work engagement in a community sample of adult workers in Japan.

    PubMed

    Iwanaga, Mai; Imamura, Kotaro; Shimazu, Akihito; Kawakami, Norito

    2018-01-01

    The aim of this study was to investigate the long-term impact of being bullied at school on current psychological distress and work engagement in adulthood among Japanese workers. We hypothesized that workers who had been bullied at school could have higher psychological distress and lower work engagement compared to those who had not been bullied. We used data from the Japanese Study on Stratification, Health, Income, and Neighborhood (J-SHINE) project, conducted from July 2010 to February 2011 in Japan. This survey randomly selected local residents around a metropolitan area in Japan. Of 13,920 adults originally selected, 4,317 people participated in this survey, for a total response rate of 31%. The self-administered questionnaires assessed current psychological distress (K6), work engagement (UWES), experiences of being bullied in elementary or junior high school, and other covariates. Statistical analyses were conducted only for workers. Hierarchical multiple regression analyses with six steps were conducted to determine associations between experiences of being bullied at school and psychological distress/work engagement. Statistical analysis was conducted for 3,111 workers. The number of respondents who reported being bullied in elementary or junior high school was 1,318 (42%). We found that the experience of being bullied at school was significantly associated with high psychological distress in adulthood (β = .079, p < .0001); however, the work engagement scores of respondents who were bullied were significantly higher than for people who were not bullied at school (β = .068, p < .0001), after adjusting for all covariates. Being bullied at school was positively associated with both psychological distress and work engagement in a sample of workers. Being bullied at school may be a predisposing factor for psychological distress, as previously reported. The higher levels of work engagement among people who experienced being bullied at school may be because some of them overcame the experience and gained greater psychological resilience.

  13. High-throughput optimization by statistical designs: example with rat liver slices cryopreservation.

    PubMed

    Martin, H; Bournique, B; Blanchi, B; Lerche-Langrand, C

    2003-08-01

    The purpose of this study was to optimize cryopreservation conditions of rat liver slices in a high-throughput format, with focus on reproducibility. A statistical design of 32 experiments was performed and intracellular lactate dehydrogenase (LDHi) activity and antipyrine (AP) metabolism were evaluated as biomarkers. At freezing, modified University of Wisconsin solution was better than Williams'E medium, and pure dimethyl sulfoxide was better than a cryoprotectant mixture. The best cryoprotectant concentrations were 10% for LDHi and 20% for AP metabolism. Fetal calf serum could be used at 50 or 80%, and incubation of slices with the cryoprotectant could last 10 or 20 min. At thawing, 42 degrees C was better than 22 degrees C. After thawing, 1 h of preculture was better than 3 h. Cryopreservation increased the interslice variability of the biomarkers. After cryopreservation, LDHi and AP metabolism levels were up to 84 and 80% of fresh values. However, these high levels were not reproducibly achieved. Two factors involved in the day-to-day variability of LDHi were identified: the incubation time with the cryoprotectant and the preculture time. In conclusion, the statistical design was very efficient to quickly determine optimized conditions by simultaneously measuring the role of numerous factors. The cryopreservation procedure developed appears suitable for qualitative metabolic profiling studies.

  14. Spatial correlations and probability density function of the phase difference in a developed speckle-field: numerical and natural experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mysina, N Yu; Maksimova, L A; Ryabukho, V P

    Investigated are statistical properties of the phase difference of oscillations in speckle-fields at two points in the far-field diffraction region, with different shapes of the scatterer aperture. Statistical and spatial nonuniformity of the probability density function of the field phase difference is established. Numerical experiments show that, for speckle-fields with an oscillating alternating-sign transverse correlation function, a significant nonuniformity of the probability density function of the phase difference in the correlation region of the field complex amplitude, with the most probable values 0 and π, is observed. A natural statistical interference experiment using Young diagrams has confirmed the results of numerical experiments. (laser applications and other topics in quantum electronics)

  15. Contrasting effects of feature-based statistics on the categorisation and identification of visual objects

    PubMed Central

    Taylor, Kirsten I.; Devereux, Barry J.; Acres, Kadia; Randall, Billi; Tyler, Lorraine K.

    2013-01-01

    Conceptual representations are at the heart of our mental lives, involved in every aspect of cognitive functioning. Despite their centrality, a long-standing debate persists as to how the meanings of concepts are represented and processed. Many accounts agree that the meanings of concrete concepts are represented by their individual features, but disagree about the importance of different feature-based variables: some views stress the importance of the information carried by distinctive features in conceptual processing, others the features which are shared over many concepts, and still others the extent to which features co-occur. We suggest that previously disparate theoretical positions and experimental findings can be unified by an account which claims that task demands determine how concepts are processed in addition to the effects of feature distinctiveness and co-occurrence. We tested these predictions in a basic-level naming task which relies on distinctive feature information (Experiment 1) and a domain decision task which relies on shared feature information (Experiment 2). Both used large-scale regression designs with the same visual objects, and mixed-effects models incorporating participant, session, stimulus-related and feature statistic variables to model the performance. We found that concepts with relatively more distinctive and more highly correlated distinctive relative to shared features facilitated basic-level naming latencies, while concepts with relatively more shared and more highly correlated shared relative to distinctive features speeded domain decisions. These findings demonstrate that the feature statistics of distinctiveness (shared vs. distinctive) and correlational strength, as well as the task demands, determine how concept meaning is processed in the conceptual system. PMID:22137770

  16. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages such as STATA, S-PLUS, R, SPSS, SAS, and Systat, we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning. PMID:21451741

  17. Experience and Explanation: Using Videogames to Prepare Students for Formal Instruction in Statistics

    NASA Astrophysics Data System (ADS)

    Arena, Dylan A.; Schwartz, Daniel L.

    2014-08-01

    Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics, where people's everyday experiences often conflict with normative statistical theories and a videogame might provide an alternate set of experiences for students to draw upon. The research used a game called Stats Invaders!, a variant of the classic videogame Space Invaders. In Stats Invaders!, the locations of descending alien invaders follow probability distributions, and players need to infer the shape of the distributions to play well. The experiment tested whether the game developed participants' intuitions about the structure of random events and thereby prepared them for future learning from a subsequent written passage on probability distributions. Community-college students who played the game and then read the passage learned more than participants who only read the passage.

  18. Chained Bell Inequality Experiment with High-Efficiency Measurements

    NASA Astrophysics Data System (ADS)

    Tan, T. R.; Wan, Y.; Erickson, S.; Bierhorst, P.; Kienzler, D.; Glancy, S.; Knill, E.; Leibfried, D.; Wineland, D. J.

    2017-03-01

    We report correlation measurements on two ⁹Be⁺ ions that violate a chained Bell inequality obeyed by any local-realistic theory. The correlations can be modeled as derived from a mixture of a local-realistic probabilistic distribution and a distribution that violates the inequality. A statistical framework is formulated to quantify the local-realistic fraction allowable in the observed distribution without the fair-sampling or independent-and-identical-distributions assumptions. We exclude models of our experiment whose local-realistic fraction is above 0.327 at the 95% confidence level. This bound is significantly lower than 0.586, the minimum fraction derived from a perfect Clauser-Horne-Shimony-Holt inequality experiment. Furthermore, our data provide a device-independent certification of the deterministically created Bell states.

  19. A Statistical Discrimination Experiment for Eurasian Events Using a Twenty-Seven-Station Network

    DTIC Science & Technology

    1980-07-08

    to test the effectiveness of a multivariate method of analysis for distinguishing earthquakes from explosions. The data base for the experiment...the weight assigned to each variable whenever a new one is added. Jennrich, R. I. (1977). Stepwise discriminant analysis, in Statistical Methods for

  20. Balancing Treatment and Control Groups in Quasi-Experiments: An Introduction to Propensity Scoring

    ERIC Educational Resources Information Center

    Connelly, Brian S.; Sackett, Paul R.; Waters, Shonna D.

    2013-01-01

    Organizational and applied sciences have long struggled with improving causal inference in quasi-experiments. We introduce organizational researchers to propensity scoring, a statistical technique that has become popular in other applied sciences as a means for improving internal validity. Propensity scoring statistically models how individuals in…
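
    The core of propensity scoring is modeling P(treatment | covariates) and then comparing only units with similar fitted probabilities. A minimal sketch with invented data and a hand-rolled logistic fit (an illustration of the general technique, not the authors' procedure; all names and numbers here are made up):

```python
import math
import random

random.seed(0)

# Hypothetical quasi-experiment: a single covariate ("ability") drives both
# selection into treatment and would confound a naive group comparison.
n = 400
ability = [random.gauss(0.0, 1.0) for _ in range(n)]
treated = [1 if random.random() < 1 / (1 + math.exp(-a)) else 0
           for a in ability]

# Propensity score: P(treated | covariates), here fit with a tiny logistic
# regression trained by averaged gradient ascent (no third-party libraries).
w0 = w1 = 0.0
for _ in range(2000):
    g0 = g1 = 0.0
    for a, t in zip(ability, treated):
        p = 1.0 / (1.0 + math.exp(-(w0 + w1 * a)))
        g0 += t - p
        g1 += (t - p) * a
    w0 += 0.05 * g0 / n
    w1 += 0.05 * g1 / n

scores = [1.0 / (1.0 + math.exp(-(w0 + w1 * a))) for a in ability]

def mean(xs):
    return sum(xs) / len(xs) if xs else float("nan")

# The raw treated/control comparison is imbalanced on the covariate;
# restricting to a stratum of similar propensity scores shrinks the gap.
raw_gap = (mean([a for a, t in zip(ability, treated) if t]) -
           mean([a for a, t in zip(ability, treated) if not t]))
mid = [(a, t) for a, s, t in zip(ability, scores, treated) if 0.4 < s < 0.6]
stratum_gap = (mean([a for a, t in mid if t]) -
               mean([a for a, t in mid if not t]))
print(f"covariate gap: raw {raw_gap:.2f}, within-stratum {stratum_gap:.2f}")
```

    Matching or stratifying on the score is what lets a quasi-experiment approximate the covariate balance that randomization would have provided.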

  1. Temporal and Statistical Information in Causal Structure Learning

    ERIC Educational Resources Information Center

    McCormack, Teresa; Frosch, Caren; Patrick, Fiona; Lagnado, David

    2015-01-01

    Three experiments examined children's and adults' abilities to use statistical and temporal information to distinguish between common cause and causal chain structures. In Experiment 1, participants were provided with conditional probability information and/or temporal information and asked to infer the causal structure of a 3-variable mechanical…

  2. Visualizing and Understanding Probability and Statistics: Graphical Simulations Using Excel

    ERIC Educational Resources Information Center

    Gordon, Sheldon P.; Gordon, Florence S.

    2009-01-01

    The authors describe a collection of dynamic interactive simulations for teaching and learning most of the important ideas and techniques of introductory statistics and probability. The modules cover such topics as randomness, simulations of probability experiments such as coin flipping, dice rolling and general binomial experiments, a simulation…
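
    The same kind of simulation is easy to sketch outside Excel; the following is a minimal binomial coin-flipping experiment in Python (an illustration of the idea only, not the authors' modules):

```python
import random
from collections import Counter
from math import comb

random.seed(1)

# Repeat the experiment "flip a fair coin 10 times and count heads" many
# times, then compare the simulated frequencies with the binomial formula.
reps = 1000
heads = [sum(random.random() < 0.5 for _ in range(10)) for _ in range(reps)]
freq = Counter(heads)

for k in range(11):
    theoretical = comb(10, k) * 0.5 ** 10
    print(f"{k:2d} heads: simulated {freq.get(k, 0) / reps:.3f}, "
          f"binomial {theoretical:.3f}")
```

    Seeing the simulated frequencies settle onto the theoretical binomial probabilities is exactly the visual intuition such classroom simulations aim for.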

  3. Flipping the Classroom and Student Performance in Advanced Statistics: Evidence from a Quasi-Experiment

    ERIC Educational Resources Information Center

    Touchton, Michael

    2015-01-01

    I administer a quasi-experiment using undergraduate political science majors in statistics classes to evaluate whether "flipping the classroom" (the treatment) alters students' applied problem-solving performance and satisfaction relative to students in a traditional classroom environment (the control). I also assess whether general…

  4. Jordanian twelfth-grade science teachers' self-reported usage of science and engineering practices in the next generation science standards

    NASA Astrophysics Data System (ADS)

    Malkawi, Amal Reda; Rababah, Ebtesam Qassim

    2018-06-01

    This study investigated the degree to which Science and Engineering Practices (SEPs) criteria from the Next Generation Science Standards (NGSS) were included in the self-reported teaching practices of twelfth-grade science teachers in Jordan. The study sampled 315 science teachers recruited from eight different public school directorates, surveyed using an instrument adapted from Kawasaki (2015). Results found that Jordanian science teachers incorporate SEPs in their classroom teaching at only a moderate level. The SEPs applied most frequently included 'using the diagram, table or graphic through instructions to clarify the subject of a new science' and 'discuss with the students how to interpret the quantitative data from the experiment or investigation'. The practice with the lowest frequency was 'teach a lesson on interpreting statistics or quantitative data,' which was moderately applied. No statistically significant differences at (α = 0.05) were found among these Jordanian science teachers' self-estimates of SEP application in their own teaching according to the study's demographic variables (specialisation, educational qualification, teaching experience). However, a statistically significant difference at (α = 0.05) was found among Jordanian high school science teachers' practice means based on gender, with female teachers using SEPs at a higher rate than male teachers.

  5. Application of the Statistical ICA Technique in the DANCE Data Analysis

    NASA Astrophysics Data System (ADS)

    Baramsai, Bayarbadrakh; Jandel, M.; Bredeweg, T. A.; Rusev, G.; Walker, C. L.; Couture, A.; Mosby, S.; Ullmann, J. L.; Dance Collaboration

    2015-10-01

    The Detector for Advanced Neutron Capture Experiments (DANCE) at the Los Alamos Neutron Science Center is used to improve our understanding of the neutron capture reaction. DANCE is a highly efficient 4π γ-ray detector array consisting of 160 BaF₂ crystals, which makes it an ideal tool for neutron capture experiments. The (n, γ) reaction Q-value equals the summed energy of all γ rays emitted in the de-excitation cascades from the excited capture state to the ground state. The total γ-ray energy is used to identify reactions on different isotopes as well as the background. However, it is challenging to separate contributions to the Esum spectra from isotopes with similar Q-values. Recently we have tested the applicability of modern statistical methods such as Independent Component Analysis (ICA) to identify and separate the (n, γ) reaction yields on the different isotopes present in the target material. ICA is a recently developed computational tool for separating multidimensional data into statistically independent additive subcomponents. In this conference talk, we present results of the application of ICA algorithms and their modification for the DANCE experimental data analysis. This research is supported by the U.S. Department of Energy, Office of Science, Nuclear Physics under the Early Career Award No. LANL20135009.
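
    The separation principle behind ICA can be shown in miniature: for a rotational mixture of two independent sub-Gaussian signals, the independent basis is the un-rotation whose components are least Gaussian (here, most negative excess kurtosis). The sketch below uses invented uniform sources and a brute-force angle scan — a toy stand-in for the multi-detector spectra, not DANCE data or the collaboration's algorithm:

```python
import math
import random

random.seed(7)

# Two independent, zero-mean uniform sources (sub-Gaussian), observed only
# through an unknown rotational mixing.
n = 10_000
s1 = [random.uniform(-1, 1) for _ in range(n)]
s2 = [random.uniform(-1, 1) for _ in range(n)]
theta = math.radians(30)                      # "unknown" mixing angle
x1 = [math.cos(theta) * a - math.sin(theta) * b for a, b in zip(s1, s2)]
x2 = [math.sin(theta) * a + math.cos(theta) * b for a, b in zip(s1, s2)]

def excess_kurtosis(xs):
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / len(xs)
    m4 = sum((x - m) ** 4 for x in xs) / len(xs)
    return m4 / var ** 2 - 3.0

# ICA idea in miniature: mixtures of independent signals look "more
# Gaussian" than the signals themselves, so scan un-rotation angles for
# the basis with the most strongly non-Gaussian (most negative) kurtosis.
def total_kurtosis(angle):
    c, s = math.cos(angle), math.sin(angle)
    u = [c * a + s * b for a, b in zip(x1, x2)]
    v = [-s * a + c * b for a, b in zip(x1, x2)]
    return excess_kurtosis(u) + excess_kurtosis(v)

best = min((total_kurtosis(math.radians(d)), d) for d in range(90))
print(f"estimated mixing angle ≈ {best[1]} degrees")
```

    With enough samples the scan lands at (or within a degree or two of) the true 30° mixing angle; practical ICA algorithms such as FastICA replace the brute-force scan with fixed-point optimization of a non-Gaussianity measure.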

  6. Acoustic correlates of Japanese expressions associated with voice quality of male adults

    NASA Astrophysics Data System (ADS)

    Kido, Hiroshi; Kasuya, Hideki

    2004-05-01

    Japanese expressions associated with the voice quality of male adults were extracted by a series of questionnaire surveys and statistical multivariate analysis. One hundred and thirty-seven Japanese expressions were collected through the first questionnaire and careful investigations of well-established Japanese dictionaries and articles. From the second questionnaire about familiarity with each of the expressions and synonymity that were addressed to 249 subjects, 25 expressions were extracted. The third questionnaire was about an evaluation of their own voice quality. By applying a statistical clustering method and a correlation analysis to the results of the questionnaires, eight bipolar expressions and one unipolar expression were obtained. They constituted high-pitched/low-pitched, masculine/feminine, hoarse/clear, calm/excited, powerful/weak, youthful/elderly, thick/thin, tense/lax, and nasal, respectively. Acoustic correlates of each of the eight bipolar expressions were extracted by means of perceptual evaluation experiments that were made with sentence utterances of 36 males and by a statistical decision tree method. They included an average of the fundamental frequency (F0) of the utterance, speaking rate, spectral tilt, formant frequency parameter, standard deviation of F0 values, and glottal noise, when SPL of each of the stimuli was maintained identical in the perceptual experiments.

  7. Second Language Experience Facilitates Statistical Learning of Novel Linguistic Materials.

    PubMed

    Potter, Christine E; Wang, Tianlin; Saffran, Jenny R

    2017-04-01

    Recent research has begun to explore individual differences in statistical learning, and how those differences may be related to other cognitive abilities, particularly their effects on language learning. In this research, we explored a different type of relationship between language learning and statistical learning: the possibility that learning a new language may also influence statistical learning by changing the regularities to which learners are sensitive. We tested two groups of participants, Mandarin Learners and Naïve Controls, at two time points, 6 months apart. At each time point, participants performed two different statistical learning tasks: an artificial tonal language statistical learning task and a visual statistical learning task. Only the Mandarin-learning group showed significant improvement on the linguistic task, whereas both groups improved equally on the visual task. These results support the view that there are multiple influences on statistical learning. Domain-relevant experiences may affect the regularities that learners can discover when presented with novel stimuli. Copyright © 2016 Cognitive Science Society, Inc.

  8. Second language experience facilitates statistical learning of novel linguistic materials

    PubMed Central

    Potter, Christine E.; Wang, Tianlin; Saffran, Jenny R.

    2016-01-01

    Recent research has begun to explore individual differences in statistical learning, and how those differences may be related to other cognitive abilities, particularly their effects on language learning. In the present research, we explored a different type of relationship between language learning and statistical learning: the possibility that learning a new language may also influence statistical learning by changing the regularities to which learners are sensitive. We tested two groups of participants, Mandarin Learners and Naïve Controls, at two time points, six months apart. At each time point, participants performed two different statistical learning tasks: an artificial tonal language statistical learning task and a visual statistical learning task. Only the Mandarin-learning group showed significant improvement on the linguistic task, while both groups improved equally on the visual task. These results support the view that there are multiple influences on statistical learning. Domain-relevant experiences may affect the regularities that learners can discover when presented with novel stimuli. PMID:27988939

  9. Development of the Concept of Energy Conservation using Simple Experiments for Grade 10 Students

    NASA Astrophysics Data System (ADS)

    Rachniyom, S.; Toedtanya, K.; Wuttiprom, S.

    2017-09-01

    The purpose of this research was to develop students' concept of energy conservation and their retention of it. Activities included simple and easy experiments that considered energy transformation from potential to kinetic energy. The participants were 30 purposively selected grade 10 students in the second semester of the 2016 academic year. The research tools consisted of learning lesson plans and a learning achievement test. Results showed that the experiments worked well and were appropriate as learning activities. The students' achievement scores increased significantly at the .05 statistical level, the students' retention rates were at a high level, and learning behaviour was at a good level. These simple experiments allowed students to demonstrate what they had learned to their peers and encouraged them to use familiar models to explain phenomena in daily life.

  10. High-risk behaviors and experiences with traffic law among night drivers in Curitiba, Brazil.

    PubMed

    Ulinski, Sandra L; Moysés, Simone T; Werneck, Renata I; Moysés, Samuel J

    2016-01-08

    To explore high-risk behaviors and experiences with traffic law among night drivers in Curitiba, Brazil. Data from 398 drivers on sociodemographic parameters, high-risk behaviors, experiences with traffic law, and traffic law violations were collected through interviews conducted at sobriety checkpoints. Exploratory-descriptive and analytical statistics were used. The mean age of the participants was 32.6±11.2 years (range, 18 to 75 years). Half of the drivers reported having driven after drinking in the last year, predominantly single men aged 18 to 29 years who drive cars and drink alcohol frequently. Only 55% of the drivers who had driven after drinking in the last year self-reported some concern about being detected in a police operation. A significant association was found between sociodemographic variables and behavior, which can help tailor public interventions to a specific group of drivers: young men who exhibit high-risk behaviors in traffic, such as driving after drinking alcohol, some of whom report heavy alcohol consumption. This group represents a challenge for educational and enforcement interventions, particularly because they admit to violating current laws and have a low perception of punishment due to the low risk of being detected by the police.

  11. Pilot Domain Task Experience in Night Fatal Helicopter Emergency Medical Service Accidents.

    PubMed

    Aherne, Bryan B; Zhang, Chrystal; Newman, David G

    2016-06-01

    In the United States, accident and fatality rates in helicopter emergency medical service (HEMS) operations increase significantly under nighttime environmentally hazardous operational conditions. Other studies have found pilots' total flight hours to be unrelated to HEMS accident outcomes. Many factors affect pilots' decision making, including their experience. This study investigates whether pilot domain task experience (DTE) in HEMS plays a protective role against the likelihood of accidents at night when hazardous operational conditions are entered. Thirty-two single-pilot nighttime fatal HEMS accidents between 1995 and 2013 with findings of controlled flight into terrain (CFIT) or loss of control (LCTRL) due to spatial disorientation (SD) were identified. The HEMS DTE of the pilots was compared with industry survey data. Of the pilots, 56% had ≤2 yr of HEMS experience and 9% had >10 yr of HEMS experience. Twenty-one (66%) accidents occurred in non-visual flight rules (VFR) conditions despite all flights being required to be conducted under VFR. There was a statistically significant increase in accident rates among pilots with <2 and <4 yr HEMS DTE and a statistically significant decrease among pilots with >10 yr HEMS DTE. HEMS DTE plays a preventive role against the likelihood of a night operational accident. Pilots with limited HEMS DTE are more likely to make a poor assessment of hazardous conditions at night, placing HEMS flight crews at high risk in the VFR night domain.

  12. Statistical fluctuations in pedestrian evacuation times and the effect of social contagion

    NASA Astrophysics Data System (ADS)

    Nicolas, Alexandre; Bouzat, Sebastián; Kuperman, Marcelo N.

    2016-08-01

    Mathematical models of pedestrian evacuation and the associated simulation software have become essential tools for the assessment of the safety of public facilities and buildings. While a variety of models is now available, their calibration and test against empirical data are generally restricted to global averaged quantities; the statistics compiled from the time series of individual escapes ("microscopic" statistics) measured in recent experiments are thus overlooked. In the same spirit, much research has primarily focused on the average global evacuation time, whereas the whole distribution of evacuation times over some set of realizations should matter. In the present paper we propose and discuss the validity of a simple relation between this distribution and the microscopic statistics, which is theoretically valid in the absence of correlations. To this purpose, we develop a minimal cellular automaton, with features that afford a semiquantitative reproduction of the experimental microscopic statistics. We then introduce a process of social contagion of impatient behavior in the model and show that the simple relation under test may dramatically fail at high contagion strengths, the latter being responsible for the emergence of strong correlations in the system. We conclude with comments on the potential practical relevance for safety science of calculations based on microscopic statistics.
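
    The relation under test can be illustrated with a zero-dimensional toy model (this is not the authors' cellular automaton; the escape probability and counts below are arbitrary choices): if the agent at the exit escapes with a fixed probability at each time step and successive escape-time gaps are uncorrelated, the distribution of the global evacuation time is the n-fold convolution of the single-gap ("microscopic") distribution, so its mean and variance follow directly from the microscopic statistics.

```python
import numpy as np

rng = np.random.default_rng(42)
n_agents, p_escape, n_runs = 50, 0.3, 2000

# Toy bottleneck: one agent may pass the exit per step with probability
# p_escape, so the gaps between successive escapes are geometric.
gaps = rng.geometric(p_escape, size=(n_runs, n_agents))  # microscopic statistics
total_times = gaps.sum(axis=1)                           # evacuation time per run

# Without correlations (no contagion), mean and variance of the total time
# follow from the single-gap distribution alone:
predicted_mean = n_agents / p_escape
predicted_var = n_agents * (1 - p_escape) / p_escape**2
```

    Social contagion would correlate the gaps, which is exactly the regime in which this simple convolution relation fails in the paper's model.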

  13. Reflection on Training, Experience, and Introductory Statistics: A Mini-Survey of Tertiary Level Statistics Instructors

    ERIC Educational Resources Information Center

    Hassad, Rossi A.

    2006-01-01

    Instructors of statistics who teach non-statistics majors possess varied academic backgrounds, and hence it is reasonable to expect variability in their content knowledge, and pedagogical approach. The aim of this study was to determine the specific course(s) that contributed mostly to instructors' understanding of statistics. Courses reported…

  14. Effects of Caffeine and Warrior Stress on Behavioral : An Animal Model

    DTIC Science & Technology

    2016-03-14

    contributes invaluably to ethical and humane research. A special thank you to Erin Barry for providing statistical expertise and methodological support...of behavioral health in rats. Several ethical and logistical issues prevent the use of humans in true controlled experiments that manipulate stress...play in the development or maintenance of behavioral problems. There are ethical issues associated with exposing humans to high caffeine doses and

  15. Transition and Skills Development through Education, Training and Work Experiences: A Follow-up Study, Seven Oaks School Division.

    ERIC Educational Resources Information Center

    Taylor, Lynn; Simpson, Wayne; McClure, Karen; Graham, Barbara; Levin, Benjamin

    A Canadian study of the school-to-work transition followed students enrolled in grade 11 in 1990 (n=177), 1992 (n=172), and 1994 (n=347) in Seven Oaks School Division's three high schools. Based largely on questions from the Statistics Canada (SC) School Leavers Survey and SC Graduates Study (1997), the telephone survey focused on these elements:…

  16. Implicit Statistical Learning and Language Skills in Bilingual Children

    ERIC Educational Resources Information Center

    Yim, Dongsun; Rudoy, John

    2013-01-01

    Purpose: Implicit statistical learning in 2 nonlinguistic domains (visual and auditory) was used to investigate (a) whether linguistic experience influences the underlying learning mechanism and (b) whether there are modality constraints in predicting implicit statistical learning with age and language skills. Method: Implicit statistical learning…

  17. Using Guided Reinvention to Develop Teachers' Understanding of Hypothesis Testing Concepts

    ERIC Educational Resources Information Center

    Dolor, Jason; Noll, Jennifer

    2015-01-01

    Statistics education reform efforts emphasize the importance of informal inference in the learning of statistics. Research suggests statistics teachers experience similar difficulties understanding statistical inference concepts as students and how teacher knowledge can impact student learning. This study investigates how teachers reinvented an…

  18. The influence of temperature and relative humidity on the development of Lepidoglyphus destructor (Acari: Glycyphagidae) and its production of allergens: a laboratory experiment.

    PubMed

    Danielsen, Charlotte; Hansen, Lise Stengård; Nachman, Gösta; Herling, Christian

    2004-01-01

    Laboratory experiments with Lepidoglyphus destructor on a diet of mainly whole wheat were conducted to study the mite's development and production of a specific allergen, Lep d 2, at four different temperatures (5, 10, 15 and 20 degrees C) and three levels of relative humidity (ca. 70-88%). Statistical models were used to analyse the role played by temperature, relative humidity and time in explaining the observed number of L. destructor and the amount of allergen produced. Moreover, the life stage distributions of the mites were determined and related to the population growth. Based on a statistical model the intrinsic rate of natural increase, rm, was computed for a range of different temperatures and relative humidities. High relative humidity in combination with temperatures at about 25 degrees C will lead to the highest rm (ca. 0.15 day-1). The highest concentration of Lep d 2 was 3 micrograms g-1 grain, found at 20 degrees C and high relative humidity at a mite density of 254 mites g-1 grain. The concentration of allergens in the grain was best explained by a model that incorporated both the current and the cumulative numbers of mites.

  19. Analyzing social experiments as implemented: A reexamination of the evidence from the HighScope Perry Preschool Program

    PubMed Central

    Heckman, James; Moon, Seong Hyeok; Pinto, Rodrigo; Savelyev, Peter; Yavitz, Adam

    2012-01-01

    Social experiments are powerful sources of information about the effectiveness of interventions. In practice, initial randomization plans are almost always compromised. Multiple hypotheses are frequently tested. “Significant” effects are often reported with p-values that do not account for preliminary screening from a large candidate pool of possible effects. This paper develops tools for analyzing data from experiments as they are actually implemented. We apply these tools to analyze the influential HighScope Perry Preschool Program. The Perry program was a social experiment that provided preschool education and home visits to disadvantaged children during their preschool years. It was evaluated by the method of random assignment. Both treatments and controls have been followed from age 3 through age 40. Previous analyses of the Perry data assume that the planned randomization protocol was implemented. In fact, as in many social experiments, the intended randomization protocol was compromised. Accounting for compromised randomization, multiple-hypothesis testing, and small sample sizes, we find statistically significant and economically important program effects for both males and females. We also examine the representativeness of the Perry study. PMID:23255883
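
    The paper's own small-sample permutation procedures are not reproduced here; as a generic sketch of accounting for multiple hypothesis testing after screening several candidate effects, a Holm step-down adjustment (with invented p-values) looks like:

```python
import numpy as np

def holm_adjust(pvals):
    """Holm step-down adjusted p-values (controls the family-wise error rate)."""
    p = np.asarray(pvals, dtype=float)
    order = np.argsort(p)
    m = len(p)
    adjusted = np.empty(m)
    running_max = 0.0
    for rank, idx in enumerate(order):
        # Multiply the (rank+1)-th smallest p-value by (m - rank),
        # then enforce monotonicity across the ordered p-values.
        running_max = max(running_max, (m - rank) * p[idx])
        adjusted[idx] = min(1.0, running_max)
    return adjusted

# Hypothetical raw p-values from screening several candidate outcomes
raw = [0.004, 0.030, 0.047, 0.300]
adj = holm_adjust(raw)
```

    An effect that looks "significant" at raw p = 0.047 survives much less comfortably once the adjustment accounts for the size of the candidate pool.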

  20. Deciphering the Landauer-Büttiker Transmission Function from Single Molecule Break Junction Experiments

    NASA Astrophysics Data System (ADS)

    Reuter, Matthew; Tschudi, Stephen

    When investigating the electrical response properties of molecules, experiments often measure conductance whereas computation predicts transmission probabilities. Although the Landauer-Büttiker theory relates the two in the limit of coherent scattering through the molecule, a direct comparison between experiment and computation can still be difficult. Experimental data (specifically that from break junctions) is statistical and computational results are deterministic. Many studies compare the most probable experimental conductance with computation, but such an analysis discards almost all of the experimental statistics. In this work we develop tools to decipher the Landauer-Büttiker transmission function directly from experimental statistics and then apply them to enable a fairer comparison between experimental and computational results.

  1. [Correlation of adverse childhood experiences with psychiatric disorders and aggressiveness in adulthood].

    PubMed

    Samardzić, Ljiljana; Nikolić, Gordana; Grbesa, Grozdanko; Simonović, Maja; Milenković, Tatjana

    2010-08-01

    Consequences of individual adverse childhood experiences for adult mental health have been precisely studied during past decades. The focus of past research was mainly on childhood maltreatment and neglect. The aim of this paper was to determine the association between multiple adverse childhood experiences and psychiatric disorders, as well as their correlation with the degree and type of aggressiveness in adult psychiatric patients. One hundred and thirteen psychiatric outpatients were divided into three diagnostic groups (psychotic, non-psychotic and alcohol-dependent patients) and compared with forty healthy individuals. Adverse childhood experiences data were gathered retrospectively, using the Adverse Childhood Experiences questionnaire and an explanatory interview. Aggressiveness was assessed using the Buss-Perry Aggression Questionnaire. Student's t test, ANOVA and correlational analysis were used to evaluate the statistical significance of differences among the groups. A value of p < 0.05 was considered statistically significant. Our results showed that the mean number of adverse childhood experiences in each group of psychiatric patients, as well as in the whole group of patients, was statistically significantly higher than in the group of healthy individuals (p < 0.001); there was a statistically significant difference in scores of physical aggressiveness between the patients exposed to adverse childhood experiences and those who were not exposed to them (p < 0.05); scores of physical aggressiveness were in positive correlation with the number of adverse childhood experiences (p < 0.05). The highest mean score of adverse childhood experiences was evidenced in the group of patients with psychotic disorders. Multiple adverse childhood experiences are significantly associated with psychotic disorders, non-psychotic disorders and alcohol dependence in adulthood, and their presence is an important morbidity risk factor for psychiatric disorders. They are in positive correlation with the physical aggressiveness of the patients from these diagnostic groups.

  2. Melody and pitch processing in five musical savants with congenital blindness.

    PubMed

    Pring, Linda; Woolf, Katherine; Tadic, Valerie

    2008-01-01

    We examined absolute-pitch (AP) and short-term musical memory abilities of five musical savants with congenital blindness, seven musicians, and seven non-musicians with good vision and normal intelligence in two experiments. In the first, short-term memory for musical phrases was tested and the savants and musicians performed statistically indistinguishably, both significantly outperforming the non-musicians and remembering more material from the C major scale sequences than random trials. In the second experiment, participants learnt associations between four pitches and four objects using a non-verbal paradigm. This experiment approximates to testing AP ability. Low statistical power meant the savants were not statistically better than the musicians, although only the savants scored statistically higher than the non-musicians. The results are evidence for a musical module, separate from general intelligence; they also support the anecdotal reporting of AP in musical savants, which is thought to be necessary for the development of musical-savant skill.

  3. Statistical learning using real-world scenes: extracting categorical regularities without conscious intent.

    PubMed

    Brady, Timothy F; Oliva, Aude

    2008-07-01

    Recent work has shown that observers can parse streams of syllables, tones, or visual shapes and learn statistical regularities in them without conscious intent (e.g., learn that A is always followed by B). Here, we demonstrate that these statistical-learning mechanisms can operate at an abstract, conceptual level. In Experiments 1 and 2, observers incidentally learned which semantic categories of natural scenes covaried (e.g., kitchen scenes were always followed by forest scenes). In Experiments 3 and 4, category learning with images of scenes transferred to words that represented the categories. In each experiment, the category of the scenes was irrelevant to the task. Together, these results suggest that statistical-learning mechanisms can operate at a categorical level, enabling generalization of learned regularities using existing conceptual knowledge. Such mechanisms may guide learning in domains as disparate as the acquisition of causal knowledge and the development of cognitive maps from environmental exploration.

  4. Childhood adversity and behavioral health outcomes for youth: An investigation using state administrative data.

    PubMed

    Lucenko, Barbara A; Sharkova, Irina V; Huber, Alice; Jemelka, Ron; Mancuso, David

    2015-09-01

    This study aimed to measure the relative contribution of adverse experiences to adolescent behavioral health problems using administrative data. Specifically, we sought to understand the predictive value of adverse experiences on the presence of mental health and substance abuse problems for youth receiving publicly funded social and health services. Medicaid claims and other service records were analyzed for 125,123 youth aged 12-17 and their biological parents. Measures from administrative records reflected the presence of parental domestic violence, mental illness, substance abuse, criminal justice involvement, child abuse and/or neglect, homelessness, and death of a biological parent. Mental health and substance abuse status of adolescents were analyzed as functions of adverse experiences and other youth characteristics using logistic regression. In multivariate analyses, all predictors except parental domestic violence were statistically significant for substance abuse; parental death, parental mental illness, child abuse or neglect, and homelessness were statistically significant for mental illness. Odds ratios for child abuse/neglect were particularly high in both models. The ability to identify risks during childhood using administrative data suggests the potential to target prevention and early intervention efforts for children with specific family risk factors who are at increased risk of developing behavioral health problems during adolescence. This study illustrates the utility of administrative data in understanding the effects of adverse experiences on children, along with the advantages and disadvantages of this approach. Copyright © 2015 Elsevier Ltd. All rights reserved.
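
    A minimal sketch of this kind of analysis, with simulated records and invented effect sizes (not the study's data or estimates): logistic regression of a youth outcome on binary family-risk flags, fit by Newton-Raphson, with odds ratios read off the fitted coefficients.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000
# Hypothetical binary administrative-record flags and prevalences
abuse = rng.binomial(1, 0.2, n)        # child abuse/neglect flag
parent_mi = rng.binomial(1, 0.3, n)    # parental mental illness flag
X = np.c_[np.ones(n), abuse, parent_mi]
beta_true = np.array([-2.0, 1.5, 0.7])  # invented intercept and log-odds effects
y = rng.binomial(1, 1 / (1 + np.exp(-X @ beta_true)))

# Logistic regression via Newton-Raphson (iteratively reweighted least squares)
beta = np.zeros(3)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))       # fitted probabilities
    grad = X.T @ (y - mu)                  # score
    H = X.T @ (X * (mu * (1 - mu))[:, None])  # observed information
    beta += np.linalg.solve(H, grad)

odds_ratios = np.exp(beta[1:])  # per-flag odds ratios for the outcome
```

    With large n the fitted coefficients recover the generating log-odds, and the abuse/neglect flag shows the larger odds ratio, mirroring the qualitative pattern the abstract describes.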

  5. Statistical validation and an empirical model of hydrogen production enhancement found by utilizing passive flow disturbance in the steam-reformation process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Paul A.; Liao, Chang-hsien

    2007-11-15

    A passive flow disturbance has been proven to enhance the conversion of fuel in a methanol-steam reformer. This study presents a statistical validation of the experiment based on a standard 2^k factorial experiment design and the resulting empirical model of the enhanced hydrogen-producing process. A factorial experiment design was used to statistically analyze the effects and interactions of various input factors in the experiment. Three input factors, including the number of flow disturbers, catalyst size, and reactant flow rate, were investigated for their effects on the fuel conversion in the steam-reformation process. Based on the experimental results, an empirical model was developed and further evaluated with an uncertainty analysis and interior point data. (author)
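
    A minimal sketch of how main effects are estimated in a 2^k factorial design (the factor names follow the abstract, but the response function and effect sizes are invented for illustration):

```python
import numpy as np
from itertools import product

# Coded -1/+1 levels for three factors: flow disturbers, catalyst size, flow rate
runs = np.array(list(product([-1, 1], repeat=3)), dtype=float)  # 2^3 = 8 runs

def conversion(x):
    """Hypothetical fuel-conversion response with invented coefficients."""
    disturber, catalyst, flow = x
    return 70 + 5 * disturber - 2 * catalyst - 3 * flow + 1.5 * disturber * flow

y = np.array([conversion(run) for run in runs])

# Main effect of each factor: mean response at +1 minus mean response at -1.
# Orthogonality of the design means interactions average out of these contrasts.
effects = {name: y[runs[:, i] == 1].mean() - y[runs[:, i] == -1].mean()
           for i, name in enumerate(["disturbers", "catalyst_size", "flow_rate"])}
```

    In a real 2^k analysis each run would be a measured conversion (with replication feeding the uncertainty analysis), and interaction contrasts would be computed the same way from products of the coded columns.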

  6. Are We Able to Pass the Mission of Statistics to Students?

    ERIC Educational Resources Information Center

    Hindls, Richard; Hronová, Stanislava

    2015-01-01

    The article illustrates our long-term experience in teaching statistics to non-statisticians, especially students of economics and the humanities. The article focuses on some problems of the basic course that can weaken interest in statistics or lead to the false use of statistical methods.

  7. Errors in statistical decision making Chapter 2 in Applied Statistics in Agricultural, Biological, and Environmental Sciences

    USDA-ARS?s Scientific Manuscript database

    Agronomic and Environmental research experiments result in data that are analyzed using statistical methods. These data are unavoidably accompanied by uncertainty. Decisions about hypotheses, based on statistical analyses of these data are therefore subject to error. This error is of three types,...

  8. Forest statistics of central and northern Indiana

    Treesearch

    The Forest Survey Organization Central States Forest Experiment Station

    1952-01-01

    The Forest Survey is conducted in the various regions by the forest experiment stations of the Forest Service. In Indiana the project is directed by the Central States Forest Experiment Station with headquarters in Columbus, Ohio. This Survey Release presents the more significant preliminary statistics on the forest area and timber volume for Central and Northern...

  9. Forest statistics of eastern Kentucky

    Treesearch

    The Forest Survey Organization Central States Forest Experiment Station

    1952-01-01

    The Forest Survey is conducted in the various regions by the forest experiment stations of the Forest Service. In Kentucky the project is directed by the Central States Forest Experiment Station with headquarters in Columbus, Ohio. This Survey Release presents the more significant preliminary statistics on the forest area and timber volume for the Eastern Kentucky...

  10. Forest statistics of Indiana

    Treesearch

    The Forest Survey Organization Central States Forest Experiment Station

    1953-01-01

    The Forest Survey is conducted in the various regions by the forest experiment stations of the Forest Service. In Indiana the project is directed by the Central States Forest Experiment Station with headquarters in Columbus, Ohio. This Survey Release presents the more significant preliminary statistics on the forest area timber volume, timber growth, and timber drain...

  11. Forest statistics of Kentucky

    Treesearch

    The Forest Survey Organization Central States Forest Experiment Station

    1952-01-01

    The Forest Survey is conducted in the various regions by the forest experiment stations of the Forest Service. In Kentucky the project is directed by the Central States Forest Experiment Station with headquarters in Columbus, Ohio. This Survey Release presents the more significant preliminary statistics on the forest area, timber volume, timber growth, and timber drain...

  12. Statistical and Cooperative Learning in Reading: An Artificial Orthography Learning Study

    ERIC Educational Resources Information Center

    Zhao, Jingjing; Li, Tong; Elliott, Mark A.; Rueckl, Jay G.

    2018-01-01

    This article reports two experiments in which the artificial orthography paradigm was used to investigate the mechanisms underlying learning to read. In each experiment, participants were taught the meanings and pronunications of words written in an unfamiliar orthography, and the statistical structure of the mapping between written and spoken…

  13. Study of the effect of cloud inhomogeneity on the earth radiation budget experiment

    NASA Technical Reports Server (NTRS)

    Smith, Phillip J.

    1988-01-01

    The Earth Radiation Budget Experiment (ERBE) is the most recent and probably the most intensive mission designed to gather precise measurements of the Earth's radiation components. The data obtained from ERBE are of great importance for future climatological studies. A statistical study reveals that the ERBE scanner data are highly correlated and that instantaneous measurements corresponding to neighboring pixels contain almost the same information. Analyzing only a fraction of the data set is therefore suggested when sampling, and applications of this strategy are given in the calculation of the albedo of the Earth and of the cloud forcing over ocean.

  14. Status of a Deep Learning Based Measurement of the Inclusive Muon Neutrino Charged-current Cross Section in the NOvA Near Detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Behera, Biswaranjan

    NOvA is a long-baseline neutrino oscillation experiment. It uses the NuMI beam from Fermilab and two sampling calorimeter detectors placed off-axis from the beam. The 293 ton Near Detector measures the unoscillated neutrino energy spectrum, which can be used to predict the neutrino energy spectrum observed at the 14 kton Far Detector. The Near Detector also provides an excellent opportunity to measure neutrino interaction cross sections with high statistics, which will benefit current and future long-baseline neutrino oscillation experiments. This analysis implements new algorithms to identify…

  15. Integrating teaching and authentic research in the field and laboratory settings

    NASA Astrophysics Data System (ADS)

    Daryanto, S.; Wang, L.; Kaseke, K. F.; Ravi, S.

    2016-12-01

    Typically, authentic research activities are separated from rigorous classroom teaching. Here we assessed the potential of integrating teaching and research activities in both field and laboratory settings. We worked with students from the US and abroad without strong science backgrounds, who used advanced environmental sensors and statistical tools to conduct innovative projects. The students included one from Namibia and two local high school students in Indianapolis (through Project SEED, Summer Experience for the Economically Disadvantaged). They conducted leaf potential measurements, isotope measurements and meta-analysis. The experience showed us the great potential of integrating teaching and research in both field and laboratory settings.

  16. [Formal sample size calculation and its limited validity in animal studies of medical basic research].

    PubMed

    Mayer, B; Muche, R

    2013-01-01

    Animal studies are highly relevant for basic medical research, although their usage is discussed controversially in public. Thus, an optimal sample size for these projects should be aimed at from a biometrical point of view. Statistical sample size calculation is usually the appropriate methodology in planning medical research projects. However, required information is often not valid or only available during the course of an animal experiment. This article critically discusses the validity of formal sample size calculation for animal studies. Within the discussion, some requirements are formulated to fundamentally regulate the process of sample size determination for animal experiments.

  17. Laser diode initiated detonators for space applications

    NASA Technical Reports Server (NTRS)

    Ewick, David W.; Graham, J. A.; Hawley, J. D.

    1993-01-01

    Ensign Bickford Aerospace Company (EBAC) has over ten years of experience in the design and development of laser ordnance systems. Recent efforts have focused on the development of laser diode ordnance systems for space applications. Because the laser-initiated detonators contain only insensitive secondary explosives, a high degree of system safety is achieved. Typical performance characteristics of a laser diode initiated detonator are described in this paper, including all-fire level, function time, and output. A finite difference model used at EBAC to predict detonator performance is described, and calculated results are compared to experimental data. Finally, the use of statistically designed experiments to evaluate the performance of laser-initiated detonators is discussed.

  18. Knowledge and utilization of computer-software for statistics among Nigerian dentists.

    PubMed

    Chukwuneke, F N; Anyanechi, C E; Obiakor, A O; Amobi, O; Onyejiaka, N; Alamba, I

    2013-01-01

    The use of computer software for statistical analysis has transformed health information and data to their simplest form in the areas of access, storage, retrieval and analysis in the field of research. This survey was therefore carried out to assess the level of knowledge and utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. Questionnaires on the use of computer software for statistical analysis were randomly distributed to 65 practicing dental surgeons with more than 5 years of experience in the tertiary academic hospitals in eastern Nigeria. The focus was on: years of clinical experience; research work experience; and knowledge and application of computer software for data processing and statistical analysis. Sixty-two (62/65; 95.4%) of these questionnaires were returned anonymously and used in our data analysis. Twenty-nine (29/62; 46.8%) respondents had 5-10 years of clinical experience, of whom none had completed the specialist training programme. Practitioners with more than 10 years of clinical experience numbered 33 (33/62; 53.2%), of whom 15 (15/33; 45.5%) were specialists, representing 24.2% (15/62) of the total number of respondents. All 15 specialists were actively involved in research activities, but only five (5/15; 33.3%) could use statistical software unaided. This study has identified poor utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. This is strongly associated with a lack of early exposure to the use of this software, especially during undergraduate training. This calls for the introduction of a computer training programme in the dental curriculum to enable practitioners to develop the habit of using computer software in their research.

  19. RepExplore: addressing technical replicate variance in proteomics and metabolomics data analysis.

    PubMed

    Glaab, Enrico; Schneider, Reinhard

    2015-07-01

High-throughput omics datasets often contain technical replicates included to account for technical sources of noise in the measurement process. Although summarizing these replicate measurements by using robust averages may help to reduce the influence of noise on downstream data analysis, the information on the variance across the replicate measurements is lost in the averaging process and therefore typically disregarded in subsequent statistical analyses. We introduce RepExplore, a web service dedicated to exploiting the information captured in the technical replicate variance to provide more reliable and informative differential expression and abundance statistics for omics datasets. The software builds on previously published statistical methods, which have been applied successfully to biomedical omics data but are difficult to use without prior experience in programming or scripting. RepExplore facilitates the analysis by providing fully automated data processing and interactive ranking tables, whisker plot, heat map and principal component analysis visualizations to interpret omics data and derived statistics. Availability and implementation: freely available at http://www.repexplore.tk. Contact: enrico.glaab@uni.lu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
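The general idea behind replicate-variance-aware summaries can be sketched as follows. This is a hedged illustration of inverse-variance weighting under invented data, not RepExplore's actual code or algorithm:

```python
# Hedged sketch (not RepExplore's code): rather than collapsing technical
# replicates to a plain mean and discarding their spread, weight each
# sample's mean by the inverse of its replicate variance, so noisier
# measurements contribute less to the pooled summary.
from statistics import mean, variance

def inverse_variance_summary(replicate_sets):
    """replicate_sets: one list of technical-replicate values per sample."""
    means = [mean(r) for r in replicate_sets]
    variances = [variance(r) for r in replicate_sets]
    weights = [1.0 / v for v in variances]
    pooled = sum(w * m for w, m in zip(weights, means)) / sum(weights)
    return pooled, means, variances

samples = [
    [10.1, 9.9, 10.0],   # tight replicates -> large weight
    [12.0, 8.0, 10.3],   # noisy replicates -> small weight
]
pooled, means, variances = inverse_variance_summary(samples)
# pooled sits very close to the mean of the low-variance sample.
```

Both sample means here are near 10, but the pooled value is dominated by the first, tightly replicated sample.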

  20. Statistical characteristics of transient enclosure voltage in ultra-high-voltage gas-insulated switchgear

    NASA Astrophysics Data System (ADS)

    Cai, Yuanji; Guan, Yonggang; Liu, Weidong

    2017-06-01

    Transient enclosure voltage (TEV), which is a phenomenon induced by the inner dielectric breakdown of SF6 during disconnector operations in a gas-insulated switchgear (GIS), may cause issues relating to shock hazard and electromagnetic interference to secondary equipment. This is a critical factor regarding the electromagnetic compatibility of ultra-high-voltage (UHV) substations. In this paper, the statistical characteristics of TEV at UHV level are collected from field experiments, and are analyzed and compared to those from a repeated strike process. The TEV waveforms during disconnector operations are recorded by a self-developed measurement system first. Then, statistical characteristics, such as the pulse number, duration of pulses, frequency components, magnitude and single pulse duration, are extracted. The transmission line theory is introduced to analyze the TEV and is validated by the experimental results. Finally, the relationship between the TEV and the repeated strike process is analyzed. This proves that the pulse voltage of the TEV is proportional to the corresponding breakdown voltage. The results contribute to the definition of the standard testing waveform of the TEV, and can aid the protection of electronic devices in substations by minimizing the threat of this phenomenon.

  1. Why Wait? The Influence of Academic Self-Regulation, Intrinsic Motivation, and Statistics Anxiety on Procrastination in Online Statistics

    ERIC Educational Resources Information Center

    Dunn, Karee

    2014-01-01

    Online graduate education programs are expanding rapidly. Many of these programs require a statistics course, resulting in an increasing need for online statistics courses. The study reported here grew from experiences teaching online, graduate statistics courses. In seeking answers on how to improve this class, I discovered that research has yet…

  2. An examination of ESI triage scoring accuracy in relationship to ED nursing attitudes and experience.

    PubMed

    Martin, Andrew; Davidson, Carolyn L; Panik, Anne; Buckenmyer, Charlotte; Delpais, Paul; Ortiz, Michele

    2014-09-01

This research was designed to examine whether there is a difference in attitudes and experience between nurses who assign Emergency Severity Index (ESI) scores accurately and those who do not. Studies that have used ESI scoring discussed the role of experience but have not specifically addressed how the amount of experience and attitude toward patients in triage affect the triage nurse's decision-making capabilities. A descriptive, exploratory study design was used. Data from 64 nurses and 1,644 triage events at 3 emergency departments were collected. Participants completed demographic data, an attitude survey (Caring Nurse Patient Interaction, CNPI-23), and triage data collection tools during a continuous 8-hour triage shift. Clinical nurse expert raters retrospectively reviewed the charts and assigned an ESI score to be compared with the nurse participant's score. Descriptive statistics were used to describe the nurse participants, and Pearson's correlation was used to examine the relationship between experience and attitude. In this study of 64 nurse participants, the ESI score assigned by nurse participants did not differ significantly based on years of experience or CNPI mean score. The Kappa statistic ranged from a high of 0.63 in the nurse participants with 1.00 to 1.99 years of experience to a low of 0.51 in the nurse participants with 15 to 19 years of experience. The nurse participants with an overall mean CNPI-23 score of 106 to 115 achieved the highest agreement, compared with a single participant with a CNPI-23 overall mean score of less than 77 who had a Kappa agreement of 0.50. The nurse participants with a CNPI-23 overall mean score between 81 and 92 demonstrated agreement of 0.54 to 0.60. Based on the high level of liability the triage area presents, special consideration needs to be made when deciding which nurse should be assigned to that area. The evidence produced from this study should provide some reassurance to ED managers and nurses alike that nurses with minimal ED experience and a working understanding of the ESI 5-level triage algorithm possess the knowledge and the capacity to safely and appropriately triage patients in the emergency department. Copyright © 2014 Emergency Nurses Association. Published by Elsevier Inc. All rights reserved.
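The Kappa agreement statistic used in this record can be computed as below. This is a generic sketch of unweighted Cohen's kappa with invented ESI scores, not the study's data:

```python
# Hedged sketch: unweighted Cohen's kappa quantifying agreement between
# triage-nurse ESI scores and expert-assigned ESI scores (1-5).
# The score lists are invented for illustration; they are not study data.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two equal-length label sequences."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement assuming the two raters label independently.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

nurse_scores  = [3, 2, 3, 4, 2, 3, 5, 1, 3, 2]
expert_scores = [3, 2, 3, 3, 2, 3, 5, 2, 3, 2]
kappa = cohens_kappa(nurse_scores, expert_scores)  # about 0.70 here
```

Kappa corrects raw percent agreement for agreement expected by chance, which is why it is preferred over simple accuracy for inter-rater comparisons like this one.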

  3. Statistical Computations Underlying the Dynamics of Memory Updating

    PubMed Central

    Gershman, Samuel J.; Radulescu, Angela; Norman, Kenneth A.; Niv, Yael

    2014-01-01

    Psychophysical and neurophysiological studies have suggested that memory is not simply a carbon copy of our experience: Memories are modified or new memories are formed depending on the dynamic structure of our experience, and specifically, on how gradually or abruptly the world changes. We present a statistical theory of memory formation in a dynamic environment, based on a nonparametric generalization of the switching Kalman filter. We show that this theory can qualitatively account for several psychophysical and neural phenomena, and present results of a new visual memory experiment aimed at testing the theory directly. Our experimental findings suggest that humans can use temporal discontinuities in the structure of the environment to determine when to form new memory traces. The statistical perspective we offer provides a coherent account of the conditions under which new experience is integrated into an old memory versus forming a new memory, and shows that memory formation depends on inferences about the underlying structure of our experience. PMID:25375816

  4. Toward an affective neuroscience account of financial risk taking.

    PubMed

    Wu, Charlene C; Sacchet, Matthew D; Knutson, Brian

    2012-01-01

    To explain human financial risk taking, economic, and finance theories typically refer to the mathematical properties of financial options, whereas psychological theories have emphasized the influence of emotion and cognition on choice. From a neuroscience perspective, choice emanates from a dynamic multicomponential process. Recent technological advances in neuroimaging have made it possible for researchers to separately visualize perceptual input, intermediate processing, and motor output. An affective neuroscience account of financial risk taking thus might illuminate affective mediators that bridge the gap between statistical input and choice output. To test this hypothesis, we conducted a quantitative meta-analysis (via activation likelihood estimate or ALE) of functional magnetic resonance imaging experiments that focused on neural responses to financial options with varying statistical moments (i.e., mean, variance, skewness). Results suggested that different statistical moments elicit both common and distinct patterns of neural activity. Across studies, high versus low mean had the highest probability of increasing ventral striatal activity, but high versus low variance had the highest probability of increasing anterior insula activity. Further, high versus low skewness had the highest probability of increasing ventral striatal activity. Since ventral striatal activity has been associated with positive aroused affect (e.g., excitement), whereas anterior insular activity has been associated with negative aroused affect (e.g., anxiety) or general arousal, these findings are consistent with the notion that statistical input influences choice output by eliciting anticipatory affect. The findings also imply that neural activity can be used to predict financial risk taking - both when it conforms to and violates traditional models of choice.

  5. Toward an Affective Neuroscience Account of Financial Risk Taking

    PubMed Central

    Wu, Charlene C.; Sacchet, Matthew D.; Knutson, Brian

    2012-01-01

    To explain human financial risk taking, economic, and finance theories typically refer to the mathematical properties of financial options, whereas psychological theories have emphasized the influence of emotion and cognition on choice. From a neuroscience perspective, choice emanates from a dynamic multicomponential process. Recent technological advances in neuroimaging have made it possible for researchers to separately visualize perceptual input, intermediate processing, and motor output. An affective neuroscience account of financial risk taking thus might illuminate affective mediators that bridge the gap between statistical input and choice output. To test this hypothesis, we conducted a quantitative meta-analysis (via activation likelihood estimate or ALE) of functional magnetic resonance imaging experiments that focused on neural responses to financial options with varying statistical moments (i.e., mean, variance, skewness). Results suggested that different statistical moments elicit both common and distinct patterns of neural activity. Across studies, high versus low mean had the highest probability of increasing ventral striatal activity, but high versus low variance had the highest probability of increasing anterior insula activity. Further, high versus low skewness had the highest probability of increasing ventral striatal activity. Since ventral striatal activity has been associated with positive aroused affect (e.g., excitement), whereas anterior insular activity has been associated with negative aroused affect (e.g., anxiety) or general arousal, these findings are consistent with the notion that statistical input influences choice output by eliciting anticipatory affect. The findings also imply that neural activity can be used to predict financial risk taking – both when it conforms to and violates traditional models of choice. PMID:23129993

  6. Promoting the Positive Development of Boys in High-Poverty Neighborhoods: Evidence From Four Anti-Poverty Experiments

    PubMed Central

    Snell, Emily K.; Castells, Nina; Duncan, Greg; Gennetian, Lisa; Magnuson, Katherine; Morris, Pamela

    2012-01-01

    This study uses geocoded address data and information about parent’s economic behavior and children’s development from four random-assignment welfare and anti-poverty experiments conducted during the 1990s. We find that the impacts of these welfare and anti-poverty programs on boys’ and girls’ developmental outcomes during the transition to early adolescence differ as a function of neighborhood poverty levels. The strongest positive impacts of these programs are among boys who lived in high-poverty neighborhoods at the time their parents enrolled in the studies, with smaller or non-statistically significant effects for boys in lower poverty neighborhoods and for girls across all neighborhoods. This research informs our understanding of how neighborhood context and child gender may interact with employment-based policies to affect children’s well-being. PMID:24348000

  7. Radiation from particles moving in small-scale magnetic fields created in solid-density laser-plasma laboratory experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keenan, Brett D., E-mail: bdkeenan@ku.edu; Medvedev, Mikhail V.

    2015-11-15

Plasmas created by high-intensity lasers are often subject to the formation of kinetic streaming instabilities, such as the Weibel instability, which lead to the spontaneous generation of high-amplitude, tangled magnetic fields. These fields typically exist on small spatial scales, i.e., “sub-Larmor scales.” Radiation from charged particles moving through small-scale electromagnetic (EM) turbulence has spectral characteristics distinct from both synchrotron and cyclotron radiation, and it carries valuable information on the statistical properties of the EM field structure and evolution. Consequently, this radiation from laser-produced plasmas may offer insight into the underlying electromagnetic turbulence. Here, we investigate the prospects for, and demonstrate the feasibility of, such direct radiative diagnostics for mildly relativistic, solid-density laser plasmas produced in lab experiments.

  8. A review of the Fermilab fixed-target program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rameika, R.

    1994-12-01

All eyes are now on the Fermilab collider program as the intense search for the top quark continues. Nevertheless, Fermilab's long tradition of operating a strong, diverse physics program depends not only on collider physics but also on effective use of the facilities the Laboratory was founded on, the fixed-target beamlines. In this talk the author presents highlights of the Fermilab fixed-target program from its (not too distant) past, (soon to be) present, and (hopefully, not too distant) future program. The author concentrates on those experiments which are unique to the fixed-target program, in particular hadron structure measurements which use the varied beams and targets available in this mode, and the physics results from kaon, hyperon and high statistics charm experiments which are not easily accessible in high-p_T hadron collider detectors.

  9. Experimental characterization of the saturating, near infrared, self-amplified spontaneous emission free electron laser: Analysis of radiation properties and electron beam dynamics

    NASA Astrophysics Data System (ADS)

    Murokh, Alex

    2002-01-01

In this work, the main results of the VISA experiment (Visible to Infrared SASE Amplifier) are presented and analyzed. The purpose of the experiment was to build a state-of-the-art single-pass self-amplified spontaneous emission (SASE) free electron laser (FEL) based on a high-brightness electron beam, and to characterize its operation, including saturation, in the near-infrared spectral region. The experiment was hosted by the Accelerator Test Facility (ATF) at Brookhaven National Laboratory, a user facility that provides high-brightness relativistic electron beams generated with a photoinjector. During the experiment, SASE FEL performance was studied in two regimes: a long-bunch, lower-gain operation, and a short-bunch, high-gain regime. The transition between the two conditions was possible due to a novel bunch compression mechanism, which was discovered in the course of the experiment. This compression allowed the variation of peak current in the electron beam before it was launched into the 4-m VISA undulator. In the long-bunch regime, a SASE FEL power gain length of 29 cm was obtained, and the spectral and statistical properties of the generated radiation were characterized. In the short-bunch regime, a power gain length of under 18 cm was achieved at 842 nm, at least a factor of two shorter than previously achieved in this spectral range. Further, FEL saturation was obtained before the undulator exit. The FEL system's performance was measured along the length of the VISA undulator, and in the final state. Statistical, spectral and angular properties of the short-bunch SASE radiation were measured in the exponential gain regime and at saturation. One of the most important aspects of the data analysis presented in this thesis was the development and use of start-to-end numerical simulations of the experiment. 
The dynamics of the ATF electron beam was modeled starting from the photocathode, through acceleration, transport, and inside the VISA undulator. The model allowed simulation of SASE process for different beam conditions, including the effects of the novel bunch compression mechanism on the electron beam 6-D phase space distribution. The numerical simulations displayed an excellent agreement with the experimental data, and became key to understanding complex dynamics of the SASE FEL process at VISA.

  10. Incorporating big data into treatment plan evaluation: Development of statistical DVH metrics and visualization dashboards.

    PubMed

Mayo, Charles S; Yao, John; Eisbruch, Avraham; Balter, James M; Litzenberg, Dale W; Matuszak, Martha M; Kessler, Marc L; Weyburn, Grant; Anderson, Carlos J; Owen, Dawn; Jackson, William C; Ten Haken, Randall

    2017-01-01

    To develop statistical dose-volume histogram (DVH)-based metrics and a visualization method to quantify the comparison of treatment plans with historical experience and among different institutions. The descriptive statistical summary (ie, median, first and third quartiles, and 95% confidence intervals) of volume-normalized DVH curve sets of past experiences was visualized through the creation of statistical DVH plots. Detailed distribution parameters were calculated and stored in JavaScript Object Notation files to facilitate management, including transfer and potential multi-institutional comparisons. In the treatment plan evaluation, structure DVH curves were scored against computed statistical DVHs and weighted experience scores (WESs). Individual, clinically used, DVH-based metrics were integrated into a generalized evaluation metric (GEM) as a priority-weighted sum of normalized incomplete gamma functions. Historical treatment plans for 351 patients with head and neck cancer, 104 with prostate cancer who were treated with conventional fractionation, and 94 with liver cancer who were treated with stereotactic body radiation therapy were analyzed to demonstrate the usage of statistical DVH, WES, and GEM in a plan evaluation. A shareable dashboard plugin was created to display statistical DVHs and integrate GEM and WES scores into a clinical plan evaluation within the treatment planning system. Benchmarking with normal tissue complication probability scores was carried out to compare the behavior of GEM and WES scores. DVH curves from historical treatment plans were characterized and presented, with difficult-to-spare structures (ie, frequently compromised organs at risk) identified. Quantitative evaluations by GEM and/or WES compared favorably with the normal tissue complication probability Lyman-Kutcher-Burman model, transforming a set of discrete threshold-priority limits into a continuous model reflecting physician objectives and historical experience. 
Statistical DVH offers an easy-to-read, detailed, and comprehensive way to visualize the quantitative comparison with historical experiences and among institutions. WES and GEM metrics offer a flexible means of incorporating discrete threshold-prioritizations and historic context into a set of standardized scoring metrics. Together, they provide a practical approach for incorporating big data into clinical practice for treatment plan evaluations.
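The statistical-DVH summary described above (pointwise median, quartiles, and 95% interval over a library of historical curves) can be sketched as follows. Names and toy data are illustrative assumptions; this is not the paper's software:

```python
# Hedged sketch of a "statistical DVH": pointwise quantile bands computed
# over volume-normalized DVH curves sampled on a shared dose grid.
import numpy as np

def statistical_dvh(curves, quantiles=(0.025, 0.25, 0.5, 0.75, 0.975)):
    """Pointwise quantiles over historical DVH curves.

    curves: (n_plans, n_dose_bins) array; each row is the fractional
    volume receiving at least the dose of each bin.
    """
    return {q: np.quantile(curves, q, axis=0) for q in quantiles}

# Toy historical library: 30 noisy monotone-decreasing curves on 5 dose bins.
rng = np.random.default_rng(0)
base = np.linspace(1.0, 0.0, 5)
library = np.clip(base + rng.normal(0.0, 0.05, size=(30, 5)), 0.0, 1.0)

bands = statistical_dvh(library)
# A new plan's DVH curve could then be scored against, e.g., the
# interquartile band bands[0.25]..bands[0.75] at each dose bin.
```

Storing only these quantile arrays (e.g., serialized to JSON) rather than every historical curve is what makes the summaries easy to transfer for multi-institutional comparison.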

  11. Search for the Theta+ in photoproduction on the deuteron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    K.H. Hicks

    2005-07-26

A high-statistics experiment on a deuterium target was performed using a real photon beam with energies up to 3.6 GeV at the CLAS detector of Jefferson Lab. The reaction reported here is γd → pK⁻K⁺n, where the neutron was identified using the missing-mass technique. No statistically significant narrow peak in the mass region from 1.5-1.6 GeV was found. An upper limit on the elementary process γn → K⁻Θ⁺ was estimated to be about 4-5 nb, using a model-dependent correction for rescattering determined from Λ(1520) production. Other reactions with less model dependence are being pursued.

  12. The Search for the Pentaquark

    NASA Astrophysics Data System (ADS)

    Eyrich, Wolfgang

    2007-01-01

In the last two years, starting with the LEPS collaboration [1], several experiments reported evidence for a manifestly exotic narrow state with a mass of about 1530 MeV/c². The state was found to decay into K0p and K+n. This object with strangeness S = +1 was named Θ+ and identified with the lightest exotic antidecuplet baryon predicted in the soliton model [2]. Many experiments have scanned their data for a pentaquark signal with varying results. Some searches resulted in evidence for the Θ+ while others failed to produce any narrow structure in the region of interest. Currently, a number of high statistics experiments are being evaluated with the goal of confirming or refuting the existence of the Θ+. In this contribution the experimental status and further prospects will be discussed.

  13. Strengthening Statistics Graduate Programs with Statistical Collaboration--The Case of Hawassa University, Ethiopia

    ERIC Educational Resources Information Center

    Goshu, Ayele Taye

    2016-01-01

This paper describes the experiences gained from the statistical collaboration center established at Hawassa University in May 2015 as part of the LISA 2020 (Laboratory for Interdisciplinary Statistical Analysis) network. The center has a setup similar to that of LISA at Virginia Tech. Statisticians are trained on how to become more effective scientific…

  14. Inductive reasoning and judgment interference: experiments on Simpson's paradox.

    PubMed

    Fiedler, Klaus; Walther, Eva; Freytag, Peter; Nickel, Stefanie

    2003-01-01

In a series of experiments on inductive reasoning, participants assessed the relationship between gender, success, and a covariate in a situation akin to Simpson's paradox: Although women were less successful than men according to overall statistics, they actually fared better than men at either of two universities. Understanding trivariate relationships of this kind requires cognitive routines similar to analysis of covariance. Across the first five experiments, however, participants generalized the disadvantage of women at the aggregate level to judgments referring to the different levels of the covariate, even when motivation was high and appropriate mental models were activated. The remaining three experiments demonstrated that Simpson's paradox could be mastered when the salience of the covariate was increased and when the salience of gender was decreased by the inclusion of temporal cues that disambiguate the causal status of the covariate. Copyright 2003 Society for Personality and Social Psychology, Inc.
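The aggregate-versus-subgroup reversal at the heart of this design can be reproduced with a few invented counts. The numbers below are illustrative, not the study's materials:

```python
# Invented counts illustrating Simpson's paradox in the gender/success/
# university setting: women out-succeed men at each university, yet trail
# in the aggregate, because most women applied to the harder university.
admissions = {
    # (university, gender): (successes, applicants)
    ("A", "women"): (18, 20),    # 90% at the easier university
    ("A", "men"):   (80, 100),   # 80%
    ("B", "women"): (30, 100),   # 30% at the harder university
    ("B", "men"):   (4, 20),     # 20%
}

def rate(successes, applicants):
    return successes / applicants

# Within each university, women are ahead.
for uni in ("A", "B"):
    assert rate(*admissions[(uni, "women")]) > rate(*admissions[(uni, "men")])

# Aggregated over both universities, the ordering reverses.
women_total = rate(18 + 30, 20 + 100)   # 48/120 = 0.40
men_total = rate(80 + 4, 100 + 20)      # 84/120 = 0.70
assert men_total > women_total
```

The reversal arises because the covariate (which university one applies to) is confounded with gender, which is exactly why the paradox requires covariance-style reasoning to resolve.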

  15. Latest results from Daya Bay

    NASA Astrophysics Data System (ADS)

    Vorobel, Vit; Daya Bay Collaboration

    2017-07-01

The Daya Bay Reactor Neutrino Experiment was designed to measure θ13, the smallest mixing angle in the three-neutrino mixing framework, with unprecedented precision. The experiment consists of eight functionally identical detectors placed underground at different baselines from three pairs of nuclear reactors in South China. Since Dec. 2011, the experiment has been running stably for more than 4 years and has collected the largest reactor antineutrino sample to date. Daya Bay is able to greatly improve the precision on θ13 and to make an independent measurement of the effective mass splitting in the electron antineutrino disappearance channel. Daya Bay can also perform a number of other precise measurements, such as a high-statistics determination of the absolute reactor antineutrino flux and spectrum, as well as a search for sterile neutrino mixing, among others. The most recent results from Daya Bay are discussed in this paper, as well as the current status and future prospects of the experiment.

  16. The role of fear in delusions of the paranormal.

    PubMed

    Lange, R; Houran, J

    1999-03-01

    Based on an extended process model derived from attribution theory, we hypothesized that pervasive and persistent delusions of the paranormal are characterized by the existence of a positive (self-reinforcing), rather than a negative (self-correcting), feedback loop involving paranormal beliefs, fears, and experiences, as moderated by gender and tolerance of ambiguity. A cross-cultural sample of "international" students who reported poltergeist-like experiences showing high fear of the paranormal was identified. As in earlier research, path analysis showed statistically significant and positive effects of belief on experience and/or fear on belief. However, paranormal experience now had a positive effect on fear as well. Thus, as predicted, increased fear removes the option of neutralizing ambiguous events by labeling them as "paranormal." Although female subjects showed significantly greater fear of the paranormal than male subjects, there is no evidence that the nature of the delusional process is gender specific.

  17. Detector Development for the MARE Neutrino Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galeazzi, M.; Bogorin, D.; Molina, R.

    2009-12-16

The MARE experiment is designed to measure the mass of the neutrino with sub-eV sensitivity by measuring the beta decay of ¹⁸⁷Re with cryogenic microcalorimeters. A preliminary analysis shows that, to achieve the necessary statistics, between 10,000 and 50,000 detectors are likely necessary. We have fabricated and characterized iridium transition edge sensors with high reproducibility and uniformity for such a large-scale experiment. We have also started a full-scale simulation of the experimental setup for MARE, including thermalization in the absorber, detector response, and optimum filter analysis, to understand the issues related to reaching a sub-eV sensitivity and to optimize the design of the MARE experiment. We present our characterization of the Ir devices, including reproducibility, uniformity, and sensitivity, and we discuss the implementation and capabilities of our full-scale simulation.

  18. Tracking Multiple Statistics: Simultaneous Learning of Object Names and Categories in English and Mandarin Speakers

    ERIC Educational Resources Information Center

    Chen, Chi-hsin; Gershkoff-Stowe, Lisa; Wu, Chih-Yi; Cheung, Hintat; Yu, Chen

    2017-01-01

    Two experiments were conducted to examine adult learners' ability to extract multiple statistics in simultaneously presented visual and auditory input. Experiment 1 used a cross-situational learning paradigm to test whether English speakers were able to use co-occurrences to learn word-to-object mappings and concurrently form object categories…

  19. Organization and Carrying out the Educational Experiment and Statistical Analysis of Its Results in IHL

    ERIC Educational Resources Information Center

    Sidorov, Oleg V.; Kozub, Lyubov' V.; Goferberg, Alexander V.; Osintseva, Natalya V.

    2018-01-01

    The article discusses the methodological approach to the technology of the educational experiment performance, the ways of the research data processing by means of research methods and methods of mathematical statistics. The article shows the integrated use of some effective approaches to the training of the students majoring in…

  20. Pedagogy of the Spirit: Comparing Evangelical and Latter-Day Saint Youth Self-Reported In-Class Spiritual Experiences

    ERIC Educational Resources Information Center

    Wong, Arch Chee Keen; Sweat, Anthony; Gardner, Ryan

    2017-01-01

    This study statistically analyzes data from 756 evangelical and Latter-day Saint youth regarding their perceived in-class spiritual experiences of twenty items related to Christian theology. The data indicates similar spiritual outcomes between the two groups, with no statistically significant differences between eleven of the twenty spiritual…

  1. An Automated Statistical Process Control Study of Inline Mixing Using Spectrophotometric Detection

    ERIC Educational Resources Information Center

    Dickey, Michael D.; Stewart, Michael D.; Willson, C. Grant

    2006-01-01

    An experiment is described, which is designed for a junior-level chemical engineering "fundamentals of measurements and data analysis" course, where students are introduced to the concept of statistical process control (SPC) through a simple inline mixing experiment. The students learn how to create and analyze control charts in an effort to…

  2. Content, Affective, and Behavioral Challenges to Learning: Students' Experiences Learning Statistics

    ERIC Educational Resources Information Center

    McGrath, April L.

    2014-01-01

    This study examined the experiences of and challenges faced by students when completing a statistics course. As part of the requirement for this course, students completed a learning check-in, which consisted of an individual meeting with the instructor to discuss questions and the completion of a learning reflection and study plan. Forty…

  3. Second-career teachers

    NASA Astrophysics Data System (ADS)

    2012-11-01

Last month we saw that the age distribution for high school physics teachers skewed older than that of all teachers. We also noted that, even though they are older, at least three-fourths of the high school physics teachers indicated that they planned to teach high school for at least six more years. The figure shows the age and years of teaching experience for high school physics teachers. We see that almost 12% of the teachers who are 50 years old or older have five years or fewer of teaching experience; these are likely second-career teachers. The typical age range for second-career teachers is 33 to 59. However, the younger second-career teachers are more difficult to isolate because the average duration of the previous career is one and one-half to three years. In the December issue, we will look at teaching activities physics teachers use in the classroom. Susan White is Research Manager in the Statistical Research Center at the American Institute of Physics; she directs the Nationwide Survey of High School Physics Teachers. If you have any questions, please contact Susan at swhite@aip.org.

  4. The Role of Stromally Produced Cathepsin D in Promoting Prostate Tumorigenesis

    DTIC Science & Technology

    2014-11-01

was related to TGF-β activity. It has been previously shown in in vitro experiments that CathD can liberate TGF-β from the latency inhibitor complex…was performed following a protocol that was described previously [31]. Tissue slides were then incubated…low grade and high grade malignant prostate tissue. P-values less than 0.05 were considered statistically significant.

  5. Impact of Assimilating Surface Velocity Observations on the Model Sea Surface Height Using the NCOM-4DVAR

    DTIC Science & Technology

    2016-09-26

    statistical analysis is done by not only examining the SSH forecast error across the entire domain, but also by concentrating on the area most densely covered...over (b) entire GoM domain and (d) GLAD region only. Statistics shown for FR (thin black), SSH1 (thick black), and VEL (gray) experiment 96-h SSH...coefficient. To statistically FIG. 9. Sea surface height (m) for AVISO (a) 1 Aug, (b) 20 Aug, (c) 10 Sep, and (d) 30 Sep; for SSH1 experiment (e) 1

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kooperman, Gabriel J.; Pritchard, Michael S.; Burt, Melissa A.

    Changes in the character of rainfall are assessed using a holistic set of statistics based on rainfall frequency and amount distributions in climate change experiments with three conventional and superparameterized versions of the Community Atmosphere Model (CAM and SPCAM). Previous work has shown that high-order statistics of present-day rainfall intensity are significantly improved with superparameterization, especially in regions of tropical convection. Globally, the two modeling approaches project a similar future increase in mean rainfall, especially across the Inter-Tropical Convergence Zone (ITCZ) and at high latitudes, but over land, SPCAM predicts a smaller mean change than CAM. Changes in high-order statistics are similar at high latitudes in the two models but diverge at lower latitudes. In the tropics, SPCAM projects a large intensification of moderate and extreme rain rates in regions of organized convection associated with the Madden Julian Oscillation, ITCZ, monsoons, and tropical waves. In contrast, this signal is missing in all versions of CAM, which are found to be prone to predicting increases in the amount but not intensity of moderate rates. Predictions from SPCAM exhibit a scale-insensitive behavior with little dependence on horizontal resolution for extreme rates, while lower resolution (~2°) versions of CAM are not able to capture the response simulated with higher resolution (~1°). Furthermore, moderate rain rates analyzed by the “amount mode” and “amount median” are found to be especially telling as a diagnostic for evaluating climate model performance and tracing future changes in rainfall statistics to tropical wave modes in SPCAM.
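The "amount median" diagnostic mentioned above can be illustrated directly: it is the rain rate below which half of the total rainfall amount accumulates, which for skewed rain-rate distributions sits well above the ordinary (count-based) median. A minimal numpy sketch on synthetic gamma-distributed rain rates (the distribution and its parameters are illustrative, not CAM/SPCAM output):

```python
import numpy as np

def amount_median(rates):
    """Rain rate below which half of the total rainfall amount accumulates."""
    r = np.sort(np.asarray(rates))
    cum = np.cumsum(r)                      # cumulative rainfall amount
    return r[np.searchsorted(cum, 0.5 * cum[-1])]

rng = np.random.default_rng(5)
rates = rng.gamma(shape=0.5, scale=10.0, size=10_000)  # skewed rain rates (mm/day)

# The amount-weighted median exceeds the count median for a right-skewed
# distribution, because heavy rates contribute disproportionately to the total.
print(amount_median(rates), np.median(rates))
```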

  7. WE-G-18A-04: 3D Dictionary Learning Based Statistical Iterative Reconstruction for Low-Dose Cone Beam CT Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bai, T; UT Southwestern Medical Center, Dallas, TX; Yan, H

    2014-06-15

    Purpose: To develop a 3D dictionary learning based statistical reconstruction algorithm on graphic processing units (GPU) to improve the quality of low-dose cone beam CT (CBCT) imaging with high efficiency. Methods: A 3D dictionary containing 256 small volumes (atoms) of 3x3x3 voxels was trained from a high quality volume image. During reconstruction, we utilized a Cholesky decomposition based orthogonal matching pursuit algorithm to find a sparse representation on this dictionary basis of each patch in the reconstructed image, in order to regularize the image quality. To accelerate the time-consuming sparse coding in the 3D case, we implemented our algorithm in a parallel fashion by taking advantage of the tremendous computational power of GPU. Evaluations are performed based on a head-neck patient case. FDK reconstruction with the full dataset of 364 projections is used as the reference. We compared the proposed 3D dictionary learning based method with a tight frame (TF) based one using a subset of 121 projections. The image qualities under different resolutions in the z-direction, with or without statistical weighting, are also studied. Results: Compared to the TF-based CBCT reconstruction, our experiments indicated that 3D dictionary learning based CBCT reconstruction is able to recover finer structures, to remove more streaking artifacts, and is less susceptible to blocky artifacts. It is also observed that the statistical reconstruction approach is sensitive to inconsistency between the forward and backward projection operations in parallel computing. Using a high spatial resolution along the z direction helps improve the algorithm's robustness. Conclusion: The 3D dictionary learning based CBCT reconstruction algorithm is able to sense the structural information while suppressing noise, and hence to achieve high quality reconstruction. The GPU realization of the whole algorithm offers a significant efficiency enhancement, making this algorithm more feasible for potential clinical application. A high z resolution is preferred to stabilize statistical iterative reconstruction. This work was supported in part by NIH (1R01CA154747-01), NSFC (No. 61172163), Research Fund for the Doctoral Program of Higher Education of China (No. 20110201110011), and China Scholarship Council.
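The sparse-coding step described in the Methods can be sketched with a toy orthogonal matching pursuit in numpy. This illustrative version codes one 3x3x3 patch (27 voxels) on a 256-atom dictionary using plain least-squares updates rather than the Cholesky-based scheme the abstract describes, and the random dictionary and patch are stand-ins for the trained quantities:

```python
import numpy as np

def omp(D, x, n_nonzero):
    """Greedy orthogonal matching pursuit: sparse-code x on dictionary D."""
    residual = x.copy()
    idx = []
    coef = np.zeros(0)
    for _ in range(n_nonzero):
        j = int(np.argmax(np.abs(D.T @ residual)))   # best-matching atom
        if j not in idx:
            idx.append(j)
        coef, *_ = np.linalg.lstsq(D[:, idx], x, rcond=None)
        residual = x - D[:, idx] @ coef              # re-fit on selected atoms
    z = np.zeros(D.shape[1])
    z[idx] = coef
    return z

rng = np.random.default_rng(0)
D = rng.normal(size=(27, 256))          # 256 atoms of 3x3x3 = 27 voxels each
D /= np.linalg.norm(D, axis=0)          # unit-norm atoms
x = 2.0 * D[:, 5] - 1.5 * D[:, 40]      # a patch built from two atoms
z = omp(D, x, n_nonzero=2)              # 2-sparse code of the patch
```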

  8. Contrasting effects of feature-based statistics on the categorisation and basic-level identification of visual objects.

    PubMed

    Taylor, Kirsten I; Devereux, Barry J; Acres, Kadia; Randall, Billi; Tyler, Lorraine K

    2012-03-01

    Conceptual representations are at the heart of our mental lives, involved in every aspect of cognitive functioning. Despite their centrality, a long-standing debate persists as to how the meanings of concepts are represented and processed. Many accounts agree that the meanings of concrete concepts are represented by their individual features, but disagree about the importance of different feature-based variables: some views stress the importance of the information carried by distinctive features in conceptual processing, others the features which are shared over many concepts, and still others the extent to which features co-occur. We suggest that previously disparate theoretical positions and experimental findings can be unified by an account which claims that task demands determine how concepts are processed in addition to the effects of feature distinctiveness and co-occurrence. We tested these predictions in a basic-level naming task which relies on distinctive feature information (Experiment 1) and a domain decision task which relies on shared feature information (Experiment 2). Both used large-scale regression designs with the same visual objects, and mixed-effects models incorporating participant, session, stimulus-related and feature statistic variables to model the performance. We found that concepts with relatively more distinctive and more highly correlated distinctive relative to shared features facilitated basic-level naming latencies, while concepts with relatively more shared and more highly correlated shared relative to distinctive features speeded domain decisions. These findings demonstrate that the feature statistics of distinctiveness (shared vs. distinctive) and correlational strength, as well as the task demands, determine how concept meaning is processed in the conceptual system. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Statistical contact angle analyses; "slow moving" drops on a horizontal silicon-oxide surface.

    PubMed

    Schmitt, M; Grub, J; Heib, F

    2015-06-01

    Sessile drop experiments on horizontal surfaces are commonly used to characterise surface properties in science and in industry. The advancing angle and the receding angle are measurable on every solid, although on horizontal surfaces even the notions themselves are critically questioned by some authors. Building a standard, reproducible and valid method of measuring and defining specific (advancing/receding) contact angles is an important challenge of surface science. Recently we have developed three approaches, by sigmoid fitting, by independent and by dependent statistical analyses, which are practicable for the determination of specific angles/slopes when inclining the sample surface. These approaches lead to contact angle data which are independent of the "user skills" and subjectivity of the operator, which is also urgently needed to evaluate dynamic measurements of contact angles. We show in this contribution that the slightly modified procedures are also applicable to finding specific angles for experiments on horizontal surfaces. As an example, droplets on a flat, freshly cleaned silicon-oxide surface (wafer) are measured dynamically by the sessile drop technique while the volume of the liquid is increased or decreased. The triple points, the time, and the contact angles during the advancing and the receding of the drop, obtained by high-precision drop shape analysis, are statistically analysed. As stated in the previous contribution, the procedure is called "slow movement" analysis due to the small covered distance and the dominance of data points with low velocity. Even the smallest variations in velocity, such as the minimal advancing motion during the withdrawing of the liquid, are identifiable, which confirms the flatness and chemical homogeneity of the sample surface and the high sensitivity of the presented approaches. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. Evaluation of Two Compressed Air Foam Systems for Culling Caged Layer Hens.

    PubMed

    Benson, Eric R; Weiher, Jaclyn A; Alphin, Robert L; Farnell, Morgan; Hougentogler, Daniel P

    2018-04-24

    Outbreaks of avian influenza (AI) and other highly contagious poultry diseases continue to be a concern for those involved in the poultry industry. In the event of an outbreak, emergency depopulation of the birds involved is necessary. In this project, two compressed air foam systems (CAFS) were evaluated for mass emergency depopulation of layer hens in a manure-belt-equipped cage system. In both experiments, a randomized block design was used, with multiple commercial layer hens treated with one of three randomly selected depopulation methods: CAFS, CAFS with CO₂ gas, and CO₂ gas. In Experiment 1, a Rowe-manufactured CAFS was used, a selection of birds were instrumented, and the time to unconsciousness, brain death, altered terminal cardiac activity, and motion cessation were recorded. CAFS with and without CO₂ was faster to unconsciousness; however, the differences in the other parameters were not statistically significant. In Experiment 2, a custom Hale-based CAFS was used to evaluate the impact of bird age; a selection of birds were instrumented, and the time to motion cessation was recorded. The difference in time to cessation of movement between pullets and spent hens using CAFS was not statistically significant. Both CAFS can depopulate caged layers; however, there was no benefit to including CO₂.

  11. Effects of consumer motives on search behavior using internet advertising.

    PubMed

    Yang, Kenneth C C

    2004-08-01

    Past studies on uses and gratifications theory suggested that consumer motives affect how they will use media and media contents. Recent advertising research has extended the theory to study the use of Internet advertising. The current study explores the effects of consumer motives on their search behavior using Internet advertising. The study employed a 2 by 2 between-subjects factorial experiment design. A total of 120 subjects were assigned to an experiment condition that contains an Internet advertisement varying by advertising appeals (i.e., rational vs. emotional) and product involvement levels (high vs. low). Consumer search behavior (measured by the depth, breadth, total amount of search), demographics, and motives were collected by post-experiment questionnaires. Because all three dependent variables measuring search behavior were conceptually related to each other, MANCOVA procedures were employed to examine the moderating effects of consumer motives on the dependent variables in four product involvement-advertising appeal conditions. Results indicated that main effects for product involvements and advertising appeals were statistically significant. Univariate ANOVA also showed that advertising appeals and product involvement levels influenced the total amount of search. Three-way interactions among advertising appeals, product involvement levels, and information motive were also statistically significant. Implications and future research directions are discussed.

  12. The Necessity of the Medial Temporal Lobe for Statistical Learning

    PubMed Central

    Schapiro, Anna C.; Gregory, Emma; Landau, Barbara; McCloskey, Michael; Turk-Browne, Nicholas B.

    2014-01-01

    The sensory input that we experience is highly patterned, and we are experts at detecting these regularities. Although the extraction of such regularities, or statistical learning (SL), is typically viewed as a cortical process, recent studies have implicated the medial temporal lobe (MTL), including the hippocampus. These studies have employed fMRI, leaving open the possibility that the MTL is involved but not necessary for SL. Here, we examined this issue in a case study of LSJ, a patient with complete bilateral hippocampal loss and broader MTL damage. In Experiments 1 and 2, LSJ and matched control participants were passively exposed to a continuous sequence of shapes, syllables, scenes, or tones containing temporal regularities in the co-occurrence of items. In a subsequent test phase, the control groups exhibited reliable SL in all conditions, successfully discriminating regularities from recombinations of the same items into novel foil sequences. LSJ, however, exhibited no SL, failing to discriminate regularities from foils. Experiment 3 ruled out more general explanations for this failure, such as inattention during exposure or difficulty following test instructions, by showing that LSJ could discriminate which individual items had been exposed. These findings provide converging support for the importance of the MTL in extracting temporal regularities. PMID:24456393

  13. A comparison of spectral magnitude and phase-locking value analyses of the frequency-following response to complex tones

    PubMed Central

    Zhu, Li; Bharadwaj, Hari; Xia, Jing; Shinn-Cunningham, Barbara

    2013-01-01

    Two experiments, both presenting diotic, harmonic tone complexes (100 Hz fundamental), were conducted to explore the envelope-related component of the frequency-following response (FFRENV), a measure of synchronous, subcortical neural activity evoked by a periodic acoustic input. Experiment 1 directly compared two common analysis methods, computing the magnitude spectrum and the phase-locking value (PLV). Bootstrapping identified which FFRENV frequency components were statistically above the noise floor for each metric and quantified the statistical power of the approaches. Across listeners and conditions, the two methods produced highly correlated results. However, PLV analysis required fewer processing stages to produce readily interpretable results. Moreover, at the fundamental frequency of the input, PLVs were farther above the metric's noise floor than spectral magnitudes. Having established the advantages of PLV analysis, the efficacy of the approach was further demonstrated by investigating how different acoustic frequencies contribute to FFRENV, analyzing responses to complex tones composed of different acoustic harmonics of 100 Hz (Experiment 2). Results show that the FFRENV response is dominated by peripheral auditory channels responding to unresolved harmonics, although low-frequency channels driven by resolved harmonics also contribute. These results demonstrate the utility of the PLV for quantifying the strength of FFRENV across conditions. PMID:23862815
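The PLV itself is straightforward to compute: extract each trial's Fourier phase at the frequency of interest and take the magnitude of the mean unit phasor across trials (1 means perfect phase locking, values near 0 mean random phase). A minimal sketch on synthetic trials (the sampling rate, trial count, and noise level are illustrative, not the paper's recordings):

```python
import numpy as np

def plv(trials, fs, freq):
    """Phase-locking value across trials at one frequency (trials x samples)."""
    n = trials.shape[1]
    k = int(round(freq * n / fs))                    # FFT bin of target frequency
    phases = np.angle(np.fft.rfft(trials, axis=1)[:, k])
    return np.abs(np.mean(np.exp(1j * phases)))      # length of mean unit phasor

fs, n, n_trials = 1000, 1000, 50
t = np.arange(n) / fs
rng = np.random.default_rng(1)
# A 100 Hz component with the same phase on every trial, buried in noise
x = np.sin(2 * np.pi * 100 * t) + rng.normal(scale=2.0, size=(n_trials, n))
print(plv(x, fs, 100))   # near 1: phase-locked component
print(plv(x, fs, 180))   # small: noise only
```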

  14. Organizational citizenship behavior and work experience.

    PubMed

    Kegans, Loyd; McCamey, Randy B; Hammond, Honor

    2012-01-01

    The authors compared the relationship of elements of Organizational Citizenship Behavior (OCB) and years of work experience of registered nurses in the state of Texas. Work experience research has shown that a relationship between OCB and work experience, through the mediating roles of various work-related characteristics, does exist. Work experience is described as the overall length of time in an occupation or workforce. Civic virtue was the only element of organizational citizenship behavior to have a statistically significant correlation with years of work experience in this study. The other elements were found to have no statistically significant correlation with years of work experience. Further research should be undertaken to determine whether the correlations between these two constructs hold up when the population under study is further refined by job classification, such as management and staff, or by industry segment.

  15. Investigation of 124Xe nuclear structure with the 8Pi spectrometer at TRIUMF-ISAC

    NASA Astrophysics Data System (ADS)

    Radich, Allison; Garrett, P.; Jigmeddorj, B.; Michetti-Wilson, J.; Diaz Varela, A.; Hadinia, B.; Bianco, L.; Wong, J.; Chagnon-Lessard, S.; Dunlop, R.; Finlay, P.; Laffoley, A.; Leach, K. G.; Rand, E.; Sumithrarachchi, C.; Svennson, C. E.; Wood, J. L.; Yates, S. W.; Andreoiu, C.; Starosta, K.; Cross, D.; Garnsworthy, A. B.; Hackman, G.; Ball, G.; Triambak, S.

    2013-10-01

    The 124Xe nucleus has been thought to obey O(6) symmetry, but a recent Coulomb excitation study has found that while O(5) may be preserved, O(6) appears to be badly broken. To further characterize the structure of this nucleus, a beta-decay experiment was performed at the TRIUMF-ISAC facility. A beam of radioactive 124Cs at a rate of 9.8 × 10⁷ ions/s was implanted at the center of the 8Pi spectrometer, where it underwent β+/EC decay into stable 124Xe. High-statistics gamma-gamma coincidence measurements have been analyzed to add to the level scheme of 124Xe, which has been extended considerably. The high-statistics data set has revealed a new decay branch from a 124Cs high-spin isomer as well as several very weak transitions between low-spin states in 124Xe. Branching ratios and B(E2) transition strengths have been calculated for the updated level scheme. The results will be important in determining the collective properties and nuclear structure of 124Xe.

  16. Collaborative classification of hyperspectral and visible images with convolutional neural network

    NASA Astrophysics Data System (ADS)

    Zhang, Mengmeng; Li, Wei; Du, Qian

    2017-10-01

    Recent advances in remote sensing technology have made multisensor data available for the same area, and it is well-known that remote sensing data processing and analysis often benefit from multisource data fusion. Specifically, low spatial resolution of hyperspectral imagery (HSI) degrades the quality of the subsequent classification task while using visible (VIS) images with high spatial resolution enables high-fidelity spatial analysis. A collaborative classification framework is proposed to fuse HSI and VIS images for finer classification. First, the convolutional neural network model is employed to extract deep spectral features for HSI classification. Second, effective binarized statistical image features are learned as contextual basis vectors for the high-resolution VIS image, followed by a classifier. The proposed approach employs diversified data in a decision fusion, leading to an integration of the rich spectral information, spatial information, and statistical representation information. In particular, the proposed approach eliminates the potential problems of the curse of dimensionality and excessive computation time. The experiments evaluated on two standard data sets demonstrate better classification performance offered by this framework.

  17. Theory of hydromagnetic turbulence

    NASA Technical Reports Server (NTRS)

    Montgomery, D.

    1983-01-01

    The present state of MHD turbulence theory as a possible solar wind research tool is surveyed. The theory is statistical, and does not make statements about individual events. The ensembles considered typically have individual realizations which differ qualitatively, unlike equilibrium statistical mechanics. Most of the theory deals with highly symmetric situations; most of these symmetries have yet to be tested in the solar wind. The applicability of MHD itself to solar wind parameters is highly questionable; yet it has no competitors as a potentially comprehensive dynamical description. The purposes of solar wind research require sharper articulation. If the goal is to understand radial turbulent plasma flows from spheres, laboratory experiments and numerical solution of the equations of motion may be cheap alternatives to spacecraft. If "real life" information is demanded, multiple spacecraft with variable separation may be necessary to go further. The principal emphasis in the theory so far has been on spectral behavior for spatial covariances in wave-number space. There is no respectable theory of these for highly anisotropic situations. A rather slow development of theory acts as a brake on justifiable measurement at this point.

  18. Influences of credibility of testimony and strength of statistical evidence on children’s and adolescents’ reasoning

    PubMed Central

    Kail, Robert V.

    2013-01-01

    According to dual-process models that include analytic and heuristic modes of processing, analytic processing is often expected to become more common with development. Consistent with this view, on reasoning problems, adolescents are more likely than children to select alternatives that are backed by statistical evidence. It is shown here that this pattern depends on the quality of the statistical evidence and the quality of the testimonial that is the typical alternative to statistical evidence. In Experiment 1, 9- and 13-year-olds (N = 64) were presented with scenarios in which solid statistical evidence was contrasted with casual or expert testimonial evidence. When testimony was casual, children relied on it but adolescents did not; when testimony was expert, both children and adolescents relied on it. In Experiment 2, 9- and 13-year-olds (N = 83) were presented with scenarios in which casual testimonial evidence was contrasted with weak or strong statistical evidence. When statistical evidence was weak, children and adolescents relied on both testimonial and statistical evidence; when statistical evidence was strong, most children and adolescents relied on it. Results are discussed in terms of their implications for dual-process accounts of cognitive development. PMID:23735681

  19. A Method for Characterizing Phenotypic Changes in Highly Variable Cell Populations and its Application to High Content Screening of Arabidopsis thaliana Protoplasts

    PubMed Central

    Johnson, Gregory R.; Kangas, Joshua D.; Dovzhenko, Alexander; Trojok, Rüdiger; Voigt, Karsten; Majarian, Timothy D.; Palme, Klaus; Murphy, Robert F.

    2017-01-01

    Quantitative image analysis procedures are necessary for the automated discovery of effects of drug treatment in large collections of fluorescent micrographs. When compared to their mammalian counterparts, the effects of drug conditions on protein localization in plant species are poorly understood and underexplored. To investigate this relationship, we generated a large collection of images of single plant cells after various drug treatments. For this, protoplasts were isolated from six transgenic lines of A. thaliana expressing fluorescently tagged proteins. Nine drugs at three concentrations were applied to protoplast cultures followed by automated image acquisition. For image analysis, we developed a cell segmentation protocol for detecting drug effects using a Hough-transform based region of interest detector and a novel cross-channel texture feature descriptor. In order to determine treatment effects, we summarized differences between treated and untreated experiments with an L1 Cramér-von Mises statistic. The distribution of these statistics across all pairs of treated and untreated replicates was compared to the variation within control replicates to determine the statistical significance of observed effects. Using this pipeline, we report the dose dependent drug effects in the first high-content Arabidopsis thaliana drug screen of its kind. These results can function as a baseline for comparison to other protein organization modeling approaches in plant cells. PMID:28245335
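An L1 Cramér-von Mises-type two-sample statistic can be sketched as the mean absolute difference between the two empirical CDFs over the pooled sample. This is a generic reconstruction of the idea (the paper's exact weighting and significance procedure may differ), applied here to synthetic "treated" and "control" feature values:

```python
import numpy as np

def l1_cvm(a, b):
    """L1 Cramér-von Mises-type statistic: mean absolute difference between
    the two empirical CDFs, evaluated over the pooled sample."""
    pooled = np.sort(np.concatenate([a, b]))
    Fa = np.searchsorted(np.sort(a), pooled, side="right") / len(a)
    Fb = np.searchsorted(np.sort(b), pooled, side="right") / len(b)
    return float(np.mean(np.abs(Fa - Fb)))

rng = np.random.default_rng(0)
control = rng.normal(0.0, 1.0, 500)     # untreated replicate (synthetic)
treated = rng.normal(0.8, 1.0, 500)     # shifted "drug effect" replicate
same    = rng.normal(0.0, 1.0, 500)     # second untreated replicate
print(l1_cvm(control, treated))         # large: distributions differ
print(l1_cvm(control, same))            # small: control-level variation
```

Comparing the treated-vs-control statistic against the spread of control-vs-control statistics, as the abstract describes, then gives an empirical significance level.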

  20. Sources of computer self-efficacy: The relationship to outcome expectations, computer anxiety, and intention to use computers

    NASA Astrophysics Data System (ADS)

    Antoine, Marilyn V.

    2011-12-01

    The purpose of this research was to extend earlier research on sources of self-efficacy (Lent, Lopez, & Biechke, 1991; Usher & Pajares, 2009) to the information technology domain. The principal investigator examined how Bandura's (1977) sources of self-efficacy information---mastery experience, vicarious experience, verbal persuasion, and physiological states---shape computer self-efficacy beliefs and influence the decision to use or not use computers. The study took place at a mid-sized Historically Black College or University in the South. A convenience sample of 105 undergraduates was drawn from students enrolled in multiple sections of two introductory computer courses. There were 67 females and 38 males. This research was a correlational study of the following variables: sources of computer self-efficacy, general computer self-efficacy, outcome expectations, computer anxiety, and intention to use computers. The principal investigator administered a survey questionnaire containing 52 Likert items to measure the major study variables. Additionally, the survey instrument collected demographic variables such as gender, age, race, intended major, classification, technology use, technology adoption category, and whether the student owns a computer. The results reveal the following: (1) Mastery experience and verbal persuasion had statistically significant relationships to general computer self-efficacy, while vicarious experience and physiological states had non-significant relationships. Mastery experience had the strongest correlation to general computer self-efficacy. (2) All of the sources of computer self-efficacy had statistically significant relationships to personal outcome expectations. Vicarious experience had the strongest correlation to personal outcome expectations. (3) All of the sources of self-efficacy had statistically significant relationships to performance outcome expectations.
Vicarious experience had the strongest correlation to performance outcome expectations. (4) Mastery experience and physiological states had statistically significant relationships to computer anxiety, while vicarious experience and verbal persuasion had non-significant relationships. Physiological states had the strongest correlation to computer anxiety. (5) Mastery experience, vicarious experience, and physiological states had statistically significant relationships to intention to use computers, while verbal persuasion had a non-significant relationship. Mastery experience had the strongest correlation to intention to use computers. Gender-related findings indicate that females reported higher average mastery experience, vicarious experience, physiological states, and intention to use computers than males. Females reported lower average general computer self-efficacy, computer anxiety, verbal persuasion, personal outcome expectations, and performance outcome expectations than males. The results of this study can be used to develop strategies for increasing general computer self-efficacy, outcome expectations, and intention to use computers. The results can also be used to develop strategies for reducing computer anxiety.

  1. Search for Point Sources of Ultra-High-Energy Cosmic Rays above 4.0 × 10¹⁹ eV Using a Maximum Likelihood Ratio Test

    NASA Astrophysics Data System (ADS)

    Abbasi, R. U.; Abu-Zayyad, T.; Amann, J. F.; Archbold, G.; Atkins, R.; Bellido, J. A.; Belov, K.; Belz, J. W.; Ben-Zvi, S. Y.; Bergman, D. R.; Boyer, J. H.; Burt, G. W.; Cao, Z.; Clay, R. W.; Connolly, B. M.; Dawson, B. R.; Deng, W.; Farrar, G. R.; Fedorova, Y.; Findlay, J.; Finley, C. B.; Hanlon, W. F.; Hoffman, C. M.; Holzscheiter, M. H.; Hughes, G. A.; Hüntemeyer, P.; Jui, C. C. H.; Kim, K.; Kirn, M. A.; Knapp, B. C.; Loh, E. C.; Maestas, M. M.; Manago, N.; Mannel, E. J.; Marek, L. J.; Martens, K.; Matthews, J. A. J.; Matthews, J. N.; O'Neill, A.; Painter, C. A.; Perera, L.; Reil, K.; Riehle, R.; Roberts, M. D.; Sasaki, M.; Schnetzer, S. R.; Seman, M.; Simpson, K. M.; Sinnis, G.; Smith, J. D.; Snow, R.; Sokolsky, P.; Song, C.; Springer, R. W.; Stokes, B. T.; Thomas, J. R.; Thomas, S. B.; Thomson, G. B.; Tupa, D.; Westerhoff, S.; Wiencke, L. R.; Zech, A.

    2005-04-01

    We present the results of a search for cosmic-ray point sources at energies in excess of 4.0×10¹⁹ eV in the combined data sets recorded by the Akeno Giant Air Shower Array and High Resolution Fly's Eye stereo experiments. The analysis is based on a maximum likelihood ratio test using the probability density function for each event rather than requiring an a priori choice of a fixed angular bin size. No statistically significant clustering of events consistent with a point source is found.
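The essence of such an unbinned likelihood ratio test can be sketched in a few lines: each event enters through its signal PDF value S_i (a point-spread function centred on the candidate source) and its background PDF value B_i, and the test statistic compares the best-fit signal strength against the background-only hypothesis. Everything below (Gaussian PSF, uniform background, patch geometry, event counts) is an invented toy, not the actual AGASA/HiRes analysis:

```python
import numpy as np

def ts_point_source(S, B):
    """Unbinned likelihood-ratio test statistic, scanning signal strength n_s."""
    N = len(S)
    def lnL(ns):
        return np.sum(np.log(ns / N * S + (1.0 - ns / N) * B))
    grid = np.linspace(0.0, N / 2, 501)       # candidate signal strengths
    vals = np.array([lnL(ns) for ns in grid])
    i = int(np.argmax(vals))
    return 2.0 * (vals[i] - lnL(0.0)), grid[i]

rng = np.random.default_rng(2)
sigma = 0.2                                   # toy angular resolution (deg)
bg = rng.uniform(-5, 5, size=(200, 2))        # background, uniform on 10x10 deg
sig = rng.normal(0.0, sigma, size=(10, 2))    # 10 events from a source at origin
ev = np.vstack([bg, sig])
r2 = np.sum(ev**2, axis=1)
S = np.exp(-r2 / (2 * sigma**2)) / (2 * np.pi * sigma**2)   # Gaussian PSF PDF
B = np.full(len(ev), 1.0 / 100.0)             # uniform over the patch
ts, ns_hat = ts_point_source(S, B)
```

Scanning the sky with this per-direction test statistic, rather than counting events in fixed angular bins, is what removes the a priori bin-size choice mentioned in the abstract.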

  2. Attitudes toward statistics in medical postgraduates: measuring, evaluating and monitoring.

    PubMed

    Zhang, Yuhai; Shang, Lei; Wang, Rui; Zhao, Qinbo; Li, Chanjuan; Xu, Yongyong; Su, Haixia

    2012-11-23

    In medical training, statistics is considered a very difficult course to learn and teach. Current studies have found that students' attitudes toward statistics can influence their learning process. Measuring, evaluating and monitoring the changes of students' attitudes toward statistics are therefore important. Few studies have focused on the attitudes of postgraduates, especially medical postgraduates. Our purpose was to understand current attitudes regarding statistics held by medical postgraduates and explore their effects on students' achievement. We also wanted to explore the influencing factors and the sources of these attitudes and monitor their changes after a systematic statistics course. A total of 539 medical postgraduates enrolled in a systematic statistics course completed the pre-form of the Survey of Attitudes Toward Statistics-28 scale, and 83 postgraduates were selected randomly from among them to complete the post-form scale after the course. Most medical postgraduates held positive attitudes toward statistics, but they thought statistics was a very difficult subject. The attitudes mainly came from experiences in a former statistical or mathematical class. Age, level of statistical education, research experience, specialty and mathematics background may influence postgraduate attitudes toward statistics. There were significant positive correlations between course achievement and attitudes toward statistics. In general, student attitudes showed negative changes after completing a statistics course. The importance of student attitudes toward statistics must be recognized in medical postgraduate training. To make sure all students have a positive learning environment, statistics teachers should measure their students' attitudes and monitor their change of status during a course. Some necessary assistance should be offered for those students who develop negative attitudes.

  3. Relationship Between Physicians' Active Participation in Maintenance of Certification and Patients' Perspective of Care Surveys.

    PubMed

    Morrell, Jessica; Stratman, Erik J

    2016-06-01

    Medical specialty boards have a Maintenance of Certification (MOC) paradigm whose intention is to ensure high-quality patient care. How the patient experience is affected by physician MOC enrollment/participation is unknown. Our goal was to determine if patient experience is associated with physician board certification and MOC status. We analyzed physician experience and MOC databases to determine the relationships among physicians' patient experience national percentile rankings and board certification status and MOC enrollment and activity status. Board-certified physicians enrolled in MOC did not have statistically significant different patient experience scores compared to board-certified physicians not enrolled in MOC. Mid-career physicians enrolled in MOC had patients more likely to recommend them and reported higher confidence in them. Patients did not perceive physicians participating in MOC patient safety modules as more cautious in providing patient care. Although most analyses did not demonstrate significant differences in patient experience scores for physicians actively participating in MOC compared to those not, some differences were noted. Higher provider-specific patient experience scores were noted, particularly for mid-career physicians.

  4. Mapping probabilities of extreme continental water storage changes from space gravimetry

    NASA Astrophysics Data System (ADS)

    Kusche, J.; Eicker, A.; Forootan, E.; Springer, A.; Longuevergne, L.

    2016-12-01

Using data from the Gravity Recovery and Climate Experiment (GRACE) mission, we derive statistically robust 'hotspot' regions of high probability of peak anomalous - i.e. with respect to the seasonal cycle - water storage (of up to 0.7 m one-in-five-year return level) and flux (up to 0.14 m/mon). Analysis of, and comparison with, up to 32 years of ERA-Interim reanalysis fields reveals generally good agreement of these hotspot regions with the GRACE results, with most exceptions located in the Tropics. However, a simulation experiment reveals that the differences observed by GRACE are statistically significant, and further error analysis suggests that by around the year 2020 it will be possible to detect temporal changes in the frequency of extreme total fluxes (i.e. combined effects of mainly precipitation and floods) for at least 10-20% of the continental area, assuming a continuation of GRACE by its follow-on mission GRACE-FO. J. Kusche et al. (2016): Mapping probabilities of extreme continental water storage changes from space gravimetry, Geophysical Research Letters, accepted online, doi:10.1002/2016GL069538
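The 'one-in-five-year return level' quoted above is the anomaly expected to be exceeded on average once every five years. A minimal sketch of how such a return level can be estimated from annual maxima, assuming a Gumbel fit by the method of moments (the distribution choice and the synthetic data are illustrative assumptions, not the authors' method):

```python
import math
import random

EULER_GAMMA = 0.5772156649015329

def gumbel_return_level(annual_maxima, T):
    """T-year return level from a Gumbel fit by the method of moments
    (illustrative sketch, not the paper's estimator)."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi      # Gumbel scale parameter
    mu = mean - EULER_GAMMA * beta             # Gumbel location parameter
    p = 1.0 - 1.0 / T                          # annual non-exceedance probability
    return mu - beta * math.log(-math.log(p))  # Gumbel quantile at p

# Synthetic "annual maxima" of storage anomalies (m), one per year for 32 years
random.seed(0)
maxima = [max(random.gauss(0.3, 0.1) for _ in range(12)) for _ in range(32)]
rl5 = gumbel_return_level(maxima, 5)           # five-year return level (m)
```

By construction the return level grows with the return period T, which is the sense in which the 0.7 m figure above is tied to a five-year recurrence.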

  5. Salicylate-induced changes in spontaneous activity of single units in the inferior colliculus of the guinea pig.

    PubMed

    Jastreboff, P J; Sasaki, C T

    1986-11-01

Changes in spontaneous neuronal activity of the inferior colliculus in albino guinea pigs before and after administration of sodium salicylate were analyzed. Animals were anesthetized with pentobarbital, and two microelectrodes separated by a few hundred microns were driven through the inferior colliculus. After collecting a sufficiently large sample of cells, sodium salicylate (450 mg/kg) was injected i.p. and recordings were again made 2 h after the injection. Comparison of spontaneous activity recorded before and after salicylate administration revealed highly statistically significant differences (p < 0.001). After salicylate, the mean rate of the cell population increased from 29 to 83 Hz and the median from 26 to 74 Hz. Control experiments in which sodium salicylate was replaced by saline injection revealed no statistically significant differences in cell discharges. Recordings made during the same experiments from lobulus V of the cerebellar vermis revealed no changes in response to salicylate. The observed changes in single-unit activity due to salicylate administration may represent the first systematic evidence of a tinnitus-like phenomenon in animals.

  6. Applying compressive sensing to TEM video: A substantial frame rate increase on any camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, Andrew; Kovarik, Libor; Abellan, Patricia

One of the main limitations of imaging at high spatial and temporal resolution during in-situ transmission electron microscopy (TEM) experiments is the frame rate of the camera being used to image the dynamic process. While the recent development of direct detectors has provided the hardware to achieve frame rates approaching 0.1 ms, the cameras are expensive and must replace existing detectors. In this paper, we examine the use of coded aperture compressive sensing (CS) methods to increase the frame rate of any camera with simple, low-cost hardware modifications. The coded aperture approach allows multiple sub-frames to be coded and integrated into a single camera frame during the acquisition process, and then extracted upon readout using statistical CS inversion. Here we describe the background of CS and statistical methods in depth and simulate the frame rates and efficiencies for in-situ TEM experiments. Depending on the resolution and signal/noise of the image, it should be possible to increase the speed of any camera by more than an order of magnitude using this approach.
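The 'statistical CS inversion' step can be caricatured as recovering a sparse sub-frame vector from a few coded measurements via L1-regularized least squares. A toy sketch using the ISTA algorithm (the binary mask, problem sizes and penalty are invented for illustration; the paper's actual reconstruction pipeline may differ):

```python
import numpy as np

def ista(A, y, lam=0.05, iters=500):
    """Iterative shrinkage-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2               # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = x - A.T @ (A @ x - y) / L           # gradient step on the quadratic term
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft-thresholding
    return x

rng = np.random.default_rng(0)
n, m, k = 64, 16, 3                             # 64 sub-frame coefficients, 16 coded measurements
A = rng.choice([0.0, 1.0], size=(m, n))         # binary coded-aperture-style sensing matrix
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = 1.0  # sparse "events" across sub-frames
y = A @ x_true                                  # one integrated camera frame
x_hat = ista(A, y)                              # statistical CS inversion
```

The point of the sketch is that many unknowns (sub-frames) can be recovered from far fewer measurements (camera frames) when the signal is sparse, which is the mechanism behind the claimed frame-rate increase.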

  7. Applying compressive sensing to TEM video: A substantial frame rate increase on any camera

    DOE PAGES

    Stevens, Andrew; Kovarik, Libor; Abellan, Patricia; ...

    2015-08-13

One of the main limitations of imaging at high spatial and temporal resolution during in-situ transmission electron microscopy (TEM) experiments is the frame rate of the camera being used to image the dynamic process. While the recent development of direct detectors has provided the hardware to achieve frame rates approaching 0.1 ms, the cameras are expensive and must replace existing detectors. In this paper, we examine the use of coded aperture compressive sensing (CS) methods to increase the frame rate of any camera with simple, low-cost hardware modifications. The coded aperture approach allows multiple sub-frames to be coded and integrated into a single camera frame during the acquisition process, and then extracted upon readout using statistical CS inversion. Here we describe the background of CS and statistical methods in depth and simulate the frame rates and efficiencies for in-situ TEM experiments. Depending on the resolution and signal/noise of the image, it should be possible to increase the speed of any camera by more than an order of magnitude using this approach.

  8. From access to success in science: An academic-student affairs intervention for undergraduate freshmen biology students

    NASA Astrophysics Data System (ADS)

    Aldridge, Jacqueline Nouvelle

The first-year experience is known to present an array of challenges for traditional college students. In particular, freshmen who major in a STEM discipline have their own unique set of challenges when they transition from high school science and math to college science and math, especially chemistry. As a result, students may encounter negative experiences which lower academic and social confidence. This project was designed as a pilot study intervention for a small group of freshmen biology students who were considered academically at-risk due to their math SAT scores. The study occurred during the fall semester and involved an enhanced active learning component based on the Peer-led Team Learning (PLTL) general chemistry supplemental pedagogy model, and a biology-focused First Year Experience (FYE). PLTL workshops took place in freshmen residence halls, creating a live-n-learn community environment. Mid-term and final chemistry grades and final math grades were collected to measure academic progress. Self-reporting surveys and journals were used to encourage participants to reconstruct their experiences and perceptions of the study. Descriptive analysis was performed to assess the statistical significance of differences between midterm and final grade performance, and a general inductive qualitative method was used to assess academic and social confidence as well as experiences and perceptions of the project. Findings of this project revealed a statistically significant improvement between chemistry midterm and final grades of the sample participants. Although academic confidence did not increase, results reveal that social confidence progressed as the majority of students developed a value for studying in groups.

  9. Fostering Students' Statistical Literacy through Significant Learning Experience

    ERIC Educational Resources Information Center

    Krishnan, Saras

    2015-01-01

    A major objective of statistics education is to develop students' statistical literacy that enables them to be educated users of data in context. Teaching statistics in today's educational settings is not an easy feat because teachers have a huge task in keeping up with the demands of the new generation of learners. The present day students have…

  10. Double-slit experiment with single wave-driven particles and its relation to quantum mechanics.

    PubMed

    Andersen, Anders; Madsen, Jacob; Reichelt, Christian; Rosenlund Ahl, Sonja; Lautrup, Benny; Ellegaard, Clive; Levinsen, Mogens T; Bohr, Tomas

    2015-07-01

In a thought-provoking paper, Couder and Fort [Phys. Rev. Lett. 97, 154101 (2006)] describe a version of the famous double-slit experiment performed with droplets bouncing on a vertically vibrated fluid surface. In the experiment, an interference pattern in the single-particle statistics is found even though it is possible to determine unambiguously which slit the walking droplet passes. Here we argue, however, that the single-particle statistics in such an experiment will be fundamentally different from the single-particle statistics of quantum mechanics. Quantum mechanical interference takes place between different classical paths with precise amplitude and phase relations. In the double-slit experiment with walking droplets, these relations are lost since one of the paths is singled out by the droplet. To support our conclusions, we have carried out our own double-slit experiment, and our results, in particular the long and variable slit passage times of the droplets, cast strong doubt on the feasibility of the interference claimed by Couder and Fort. To understand theoretically the limitations of wave-driven particle systems as analogs to quantum mechanics, we introduce a Schrödinger equation with a source term originating from a localized particle that generates a wave while being simultaneously guided by it. We show that the ensuing particle-wave dynamics can capture some characteristics of quantum mechanics such as orbital quantization. However, the particle-wave dynamics cannot reproduce quantum mechanics in general, and we show that the single-particle statistics for our model in a double-slit experiment with an additional splitter plate differs qualitatively from that of quantum mechanics.

  11. Statistics of vacuum breakdown in the high-gradient and low-rate regime

    NASA Astrophysics Data System (ADS)

    Wuensch, Walter; Degiovanni, Alberto; Calatroni, Sergio; Korsbäck, Anders; Djurabekova, Flyura; Rajamäki, Robin; Giner-Navarro, Jorge

    2017-01-01

In an increasing number of high-gradient linear accelerator applications, accelerating structures must operate with both high surface electric fields and low breakdown rates. Understanding the statistical properties of breakdown occurrence in such a regime is of practical importance for optimizing accelerator conditioning and operation algorithms, as well as of interest for efforts to understand the physical processes which underlie the breakdown phenomenon. Experimental breakdown data have been collected in two distinct high-gradient experimental setups: a prototype linear accelerating structure operated in the Compact Linear Collider Xbox 12 GHz test stands, and a parallel-plate electrode system operated with pulsed DC in the kV range. The collected data are presented, analyzed and compared. The two systems show similar, distinctive, two-part distributions of the number of pulses between breakdowns, with each part corresponding to a specific, constant event rate. The correlation between distance and number of pulses between breakdowns indicates that the two parts of the distribution, and their corresponding event rates, represent independent primary and induced follow-up breakdowns. The similarity of the results from pulsed DC to 12 GHz rf indicates a similar vacuum arc triggering mechanism over the range of conditions covered by the experiments.
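A 'constant event rate' means the waiting times (in pulses) are approximately geometric within each part of the distribution. A toy simulation of the two-part picture described above, splitting the waiting times at an arbitrary cutoff to recover the two rates (the rates, mixture weight and cutoff are invented for illustration, not the paper's fitted values):

```python
import math
import random

random.seed(1)
P_PRIMARY, P_FOLLOWUP, W_FOLLOWUP = 1e-4, 1e-2, 0.3   # illustrative rates and weight

def geometric(p):
    """Number of pulses until a breakdown at constant per-pulse rate p
    (inverse-transform sampling)."""
    return int(math.log(1.0 - random.random()) / math.log(1.0 - p)) + 1

def pulses_between_breakdowns():
    # each interval is either a rare primary or a frequent induced follow-up
    p = P_FOLLOWUP if random.random() < W_FOLLOWUP else P_PRIMARY
    return geometric(p)

waits = [pulses_between_breakdowns() for _ in range(2000)]
short = [n for n in waits if n <= 300]   # part dominated by follow-up breakdowns
long_ = [n for n in waits if n > 300]    # part dominated by primary breakdowns
rate_short = len(short) / sum(short)     # ~1/mean: estimated per-pulse event rate
rate_long = len(long_) / sum(long_)
```

On a log-linear survival plot such a mixture shows exactly the two straight-line segments the abstract describes, one per constant rate.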

  12. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    PubMed

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
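The final propagation step above (push a posterior over model parameters through the model to get uncertainties on predicted observables) is easy to sketch. The one-parameter "model" and Gaussian posterior below are stand-ins for illustration, not the Skyrme functional or the paper's Gaussian-process emulator:

```python
import random
import statistics

random.seed(0)

def observable(theta, x):
    """Stand-in for an emulated model prediction (e.g. a mass) at input x."""
    return theta * x + 0.1 * x ** 2

# Stand-in posterior samples for one model parameter
theta_post = [random.gauss(1.0, 0.05) for _ in range(5000)]

# Propagate the parameter uncertainty to a prediction at x = 2.0
preds = [observable(t, 2.0) for t in theta_post]
pred_mean = statistics.fmean(preds)   # central prediction
pred_sd = statistics.stdev(preds)     # propagated statistical uncertainty
```

Here the propagated spread is just |x| times the parameter's posterior spread, the linear-propagation result one would check against; the value of the sampling approach is that it works unchanged for nonlinear observables.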

  13. Guidelines for the design and statistical analysis of experiments in papers submitted to ATLA.

    PubMed

    Festing, M F

    2001-01-01

    In vitro experiments need to be well designed and correctly analysed if they are to achieve their full potential to replace the use of animals in research. An "experiment" is a procedure for collecting scientific data in order to answer a hypothesis, or to provide material for generating new hypotheses, and differs from a survey because the scientist has control over the treatments that can be applied. Most experiments can be classified into one of a few formal designs, the most common being completely randomised, and randomised block designs. These are quite common with in vitro experiments, which are often replicated in time. Some experiments involve a single independent (treatment) variable, while other "factorial" designs simultaneously vary two or more independent variables, such as drug treatment and cell line. Factorial designs often provide additional information at little extra cost. Experiments need to be carefully planned to avoid bias, be powerful yet simple, provide for a valid statistical analysis and, in some cases, have a wide range of applicability. Virtually all experiments need some sort of statistical analysis in order to take account of biological variation among the experimental subjects. Parametric methods using the t test or analysis of variance are usually more powerful than non-parametric methods, provided the underlying assumptions of normality of the residuals and equal variances are approximately valid. The statistical analyses of data from a completely randomised design, and from a randomised-block design are demonstrated in Appendices 1 and 2, and methods of determining sample size are discussed in Appendix 3. Appendix 4 gives a checklist for authors submitting papers to ATLA.
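The randomised-block analysis mentioned above partitions the total variation into treatment, block and error components. A minimal sketch with made-up data (three treatments replicated in three blocks, e.g. runs on different days; not the worked examples from the ATLA appendices):

```python
# Rows = treatments, columns = blocks (e.g. independent runs on different days);
# all numbers are invented for illustration.
data = [
    [5.1, 4.8, 5.4],   # control
    [6.0, 5.7, 6.3],   # treatment A
    [6.9, 6.2, 7.0],   # treatment B
]
t, b = len(data), len(data[0])
grand = sum(sum(row) for row in data) / (t * b)
ss_treat = b * sum((sum(row) / b - grand) ** 2 for row in data)
ss_block = t * sum((sum(data[i][j] for i in range(t)) / t - grand) ** 2
                   for j in range(b))
ss_total = sum((x - grand) ** 2 for row in data for x in row)
ss_error = ss_total - ss_treat - ss_block          # residual (error) variation
ms_treat = ss_treat / (t - 1)
ms_error = ss_error / ((t - 1) * (b - 1))
f_treat = ms_treat / ms_error                      # compare to F(t-1, (t-1)(b-1))
```

Removing the block sum of squares from the error term is exactly why blocking increases power: day-to-day variation no longer inflates the denominator of the F test.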

  14. Statistical Surrogate Modeling of Atmospheric Dispersion Events Using Bayesian Adaptive Splines

    NASA Astrophysics Data System (ADS)

    Francom, D.; Sansó, B.; Bulaevskaya, V.; Lucas, D. D.

    2016-12-01

Uncertainty in the inputs of complex computer models, including atmospheric dispersion and transport codes, is often assessed via statistical surrogate models. Surrogate models are computationally efficient statistical approximations of expensive computer models that enable uncertainty analysis. We introduce Bayesian adaptive spline methods for producing surrogate models that capture the major spatiotemporal patterns of the parent model while satisfying the requirements of flexibility, accuracy and computational feasibility. We present novel methodological and computational approaches motivated by a controlled atmospheric tracer release experiment conducted at the Diablo Canyon nuclear power plant in California. Traditional methods for building statistical surrogate models often do not scale well to experiments with large amounts of data. Our approach is well suited to experiments involving large numbers of model inputs, large numbers of simulations, and functional output for each simulation. Our approach allows us to perform global sensitivity analysis with ease. We also present an approach to calibration of simulators using field data.
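The surrogate idea can be sketched in a few lines: fit a cheap spline regression to a handful of expensive simulator runs, then evaluate the spline instead of the simulator. The sketch below uses a fixed-knot linear spline fitted by least squares; the paper's splines are adaptive (knots chosen by Bayesian model search), and the "simulator" here is a stand-in function:

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_model(x):
    """Stand-in for a costly simulator (illustrative, not a dispersion code)."""
    return np.sin(3.0 * x) + 0.5 * x

# A modest design of simulator runs
x_train = rng.uniform(0.0, 2.0, 40)
y_train = expensive_model(x_train)

# Fixed-knot linear spline basis: 1, x, and hinge functions max(x - k, 0)
knots = np.linspace(0.25, 1.75, 7)

def basis(x):
    cols = [np.ones_like(x), x] + [np.maximum(x - k, 0.0) for k in knots]
    return np.column_stack(cols)

coef, *_ = np.linalg.lstsq(basis(x_train), y_train, rcond=None)

# The surrogate basis(x) @ coef is now cheap to evaluate anywhere
x_test = np.linspace(0.0, 2.0, 50)
max_err = np.max(np.abs(basis(x_test) @ coef - expensive_model(x_test)))
```

Once the surrogate is this cheap, the many model evaluations needed for global sensitivity analysis or calibration become affordable, which is the role it plays in the abstract above.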

  15. Statistical Learning of Two Artificial Languages Presented Successively: How Conscious?

    PubMed Central

    Franco, Ana; Cleeremans, Axel; Destrebecqz, Arnaud

    2011-01-01

    Statistical learning is assumed to occur automatically and implicitly, but little is known about the extent to which the representations acquired over training are available to conscious awareness. In this study, we focus on whether the knowledge acquired in a statistical learning situation is available to conscious control. Participants were first exposed to an artificial language presented auditorily. Immediately thereafter, they were exposed to a second artificial language. Both languages were composed of the same corpus of syllables and differed only in the transitional probabilities. We first determined that both languages were equally learnable (Experiment 1) and that participants could learn the two languages and differentiate between them (Experiment 2). Then, in Experiment 3, we used an adaptation of the Process-Dissociation Procedure (Jacoby, 1991) to explore whether participants could consciously manipulate the acquired knowledge. Results suggest that statistical information can be used to parse and differentiate between two different artificial languages, and that the resulting representations are available to conscious control. PMID:21960981
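The two languages above differ only in their transitional probabilities, i.e. P(next syllable | current syllable). A toy computation of transitional probabilities from a syllable stream (the syllables and tri-syllabic "words" are invented; the study's actual materials differ):

```python
import random
from collections import Counter

random.seed(0)
# Invented artificial language: three tri-syllabic words concatenated at random,
# so within-word transitions are perfectly predictive and between-word ones are not.
words = [("bi", "du", "pa"), ("ta", "go", "la"), ("ro", "ki", "se")]
stream = [syll for _ in range(200) for syll in random.choice(words)]

pair_counts = Counter(zip(stream, stream[1:]))     # bigram counts
first_counts = Counter(stream[:-1])                # unigram counts of the first element
tp = {(a, b): c / first_counts[a] for (a, b), c in pair_counts.items()}
```

Within-word transitions such as ('bi', 'du') get TP = 1, while word-boundary transitions such as ('pa', 'ta') hover near 1/3; statistical-learning accounts assume listeners exploit exactly this contrast to segment and distinguish the languages.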

  16. Statistic analyses of the color experience according to the age of the observer.

    PubMed

    Hunjet, Anica; Parac-Osterman, Durdica; Vucaj, Edita

    2013-04-01

Psychological experience of color is a real state of communication between the environment and color, and it depends on the source of the light, the angle of view, and particularly on the observer and his or her health condition. Hering's theory, or the theory of opponent processes, supposes that the cones situated in the retina of the eye are not sensitive to three chromatic domains (red, green and purple-blue) separately, but produce a signal based on the principle of opposed pairs of colors. Support for this theory comes from the fact that certain disorders of color eyesight, which include blindness to certain colors, cause blindness to pairs of opponent colors. This paper presents a demonstration of the experience of blue and yellow tones according to the age of the observer. For testing the statistically significant differences in color experience according to the color of the background, we used the following statistical tests: the Mann-Whitney U test, Kruskal-Wallis ANOVA and the Median test. It was shown that the differences are statistically significant in elderly persons (older than 35 years).
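Of the tests listed, the Mann-Whitney U statistic is simple enough to sketch directly: it counts how often a value from one group exceeds a value from the other. A minimal version with a normal-approximation p-value and no tie correction (the data are invented; the study used standard statistical software, not this code):

```python
import math

def mann_whitney_u(x, y):
    """Two-sided Mann-Whitney U test via the normal approximation
    (no tie correction; illustrative sketch only)."""
    u = sum((xi > yj) + 0.5 * (xi == yj) for xi in x for yj in y)
    nx, ny = len(x), len(y)
    mu = nx * ny / 2.0                                  # mean of U under H0
    sigma = math.sqrt(nx * ny * (nx + ny + 1) / 12.0)   # sd of U under H0
    z = (u - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2.0))              # two-sided p-value
    return u, p

# Invented colour-difference judgements for two age groups
younger = [1.2, 0.8, 1.5, 1.1, 0.9, 1.3, 1.0, 1.4]
older = [1.9, 2.4, 1.7, 2.1, 2.6, 1.8, 2.2, 2.0]
u, p = mann_whitney_u(younger, older)   # here u == 0: complete separation
```

Because the test uses only the ordering of the values, it needs no normality assumption, which is why rank tests like these suit ordinal color-judgement data.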

  17. Assessment of Multiple Daily Precipitation Statistics in ERA-Interim Driven Med-CORDEX and EURO-CORDEX Experiments Against High Resolution Observations

    NASA Astrophysics Data System (ADS)

    Coppola, E.; Fantini, A.; Raffaele, F.; Torma, C. Z.; Bacer, S.; Giorgi, F.; Ahrens, B.; Dubois, C.; Sanchez, E.; Verdecchia, M.

    2017-12-01

We assess the statistics of different daily precipitation indices in ensembles of Med-CORDEX and EURO-CORDEX experiments at high resolution (grid spacing of ~0.11°, or RCM11) and medium resolution (grid spacing of ~0.44°, or RCM44) with regional climate models (RCMs) driven by the ERA-Interim reanalysis of observations for the period 1989-2008. The assessment is carried out by comparison with a set of high-resolution observation datasets for 9 European subregions. The statistics analyzed include quantitative metrics for mean precipitation, daily precipitation Probability Density Functions (PDFs), daily precipitation intensity, frequency, 95th percentile and the 95th percentile of dry spell length. We assess both an ensemble including all Med-CORDEX and EURO-CORDEX models and one including the Med-CORDEX models alone. For the All Models ensembles, the RCM11 one shows a remarkable performance in reproducing the spatial patterns and seasonal cycle of mean precipitation over all regions, with a consistent and marked improvement compared to the RCM44 ensemble and the ERA-Interim reanalysis. A good consistency with observations by the RCM11 ensemble (and a substantial improvement compared to RCM44 and ERA-Interim) is found also for the daily precipitation PDFs, mean intensity and, to a lesser extent, the 95th percentile. For some regions the RCM11 ensemble overestimates the occurrence of very high intensity events, while for one region the models underestimate the occurrence of the largest extremes. The RCM11 ensemble still shows a general tendency to underestimate the dry-day frequency and the 95th percentile of dry spell length over wetter regions, with only a marginal improvement compared to the lower resolution models. This indicates that the problem of the excessive production of low precipitation events found in many climate models persists also at relatively high resolutions, at least in wet climate regimes.
Concerning the Med-CORDEX model ensembles, we find that their performance over the Mediterranean regions analyzed is of similar quality to that of the All Models ensembles. Finally, we stress the need for consistent and quality-checked fine-scale observation datasets for the assessment of RCMs run at increasingly high horizontal resolutions.
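Several of the indices above (wet-day frequency, mean intensity, 95th percentile of wet-day amounts, dry spell length) are easy to compute from a daily series. A sketch on synthetic data, assuming the common >= 1 mm wet-day threshold (the generator and the threshold are illustrative; the exact index definitions in the analysis may differ in detail):

```python
import random

random.seed(0)
# Synthetic daily precipitation series (mm/day) for ~20 years:
# ~70% dry days, exponential amounts otherwise (illustrative only)
days = [0.0 if random.random() < 0.7 else random.expovariate(1.0 / 8.0)
        for _ in range(7300)]

wet = sorted(p for p in days if p >= 1.0)    # wet day: precipitation >= 1 mm
wet_day_freq = len(wet) / len(days)          # frequency of wet days
mean_intensity = sum(wet) / len(wet)         # mean precipitation on wet days
p95 = wet[int(0.95 * len(wet))]              # 95th percentile of wet-day amounts

# Longest dry spell: longest run of consecutive days below the wet threshold
longest_dry_spell, run = 0, 0
for p in days:
    run = run + 1 if p < 1.0 else 0
    longest_dry_spell = max(longest_dry_spell, run)
```

Computing the same indices from both model output and gridded observations, region by region, is what the quantitative comparison above amounts to.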

  18. Interactivity fosters Bayesian reasoning without instruction.

    PubMed

    Vallée-Tourangeau, Gaëlle; Abadie, Marlène; Vallée-Tourangeau, Frédéric

    2015-06-01

Successful statistical reasoning emerges from a dynamic system including a cognitive agent, material artifacts with their action possibilities, and the thoughts and actions that are realized while reasoning takes place. Five experiments provide evidence that enabling the physical manipulation of the problem information (through the use of playing cards) substantially improves statistical reasoning, without training or instruction, not only with natural frequency statements (Experiment 1) but also with single-event probability statements (Experiment 2). Improved statistical reasoning was not simply a matter of making all sets and subsets explicit in the pack of cards (Experiment 3); it was not merely due to the discrete and countable layout resulting from the manipulation of the cards; and it was not mediated by participants' level of engagement with the task (Experiment 5). The positive effect of increased manipulability of the problem information on participants' reasoning performance was generalizable both over problems whose numeric properties did not map perfectly onto the cards and over different types of cards (Experiment 4). A systematic analysis of participants' behaviors revealed that manipulating cards improved performance when reasoners spent more time actively changing the presentation layout "in the world" as opposed to when they spent more time passively pointing at cards, seemingly attempting to solve the problem "in their head." Although they often go unnoticed, the action possibilities of the material artifacts available and the actions that are realized on those artifacts are constitutive of successful statistical reasoning, even in adults who have ostensibly reached cognitive maturity. (c) 2015 APA, all rights reserved.
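The contrast between "natural frequency" and "single-event probability" statements is concrete: the same Bayesian inference that requires Bayes' rule in probability format reduces to one division in frequency format. A worked example with invented screening numbers (not the problems used in the experiments):

```python
from fractions import Fraction

# Invented screening problem: of 1000 people, 10 have the condition;
# 8 of those 10 test positive, and 95 of the 990 without it also test positive.

# Natural-frequency reasoning: just compare the two counts of positives
posterior_freq = Fraction(8, 8 + 95)

# Single-event probability format requires Bayes' rule explicitly
p_d = Fraction(10, 1000)        # prior P(condition)
p_pos_d = Fraction(8, 10)       # P(positive | condition)
p_pos_h = Fraction(95, 990)     # P(positive | no condition)
posterior_bayes = (p_d * p_pos_d) / (p_d * p_pos_d + (1 - p_d) * p_pos_h)
```

Both routes give 8/103, about 7.8%; the frequency format simply keeps the joint counts visible, and physically sorting cards into those subsets makes the same structure manipulable "in the world."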

  19. Evaluating the assumption of power-law late time scaling of breakthrough curves in highly heterogeneous media

    NASA Astrophysics Data System (ADS)

    Pedretti, Daniele

    2017-04-01

Power-law (PL) distributions are widely adopted to define the late-time scaling of solute breakthrough curves (BTCs) during transport experiments in highly heterogeneous media. However, from a statistical perspective, distinguishing between a PL distribution and another tailed distribution is difficult, particularly when a qualitative assessment based on visual analysis of double-logarithmic plots is used. This presentation discusses the results of a recent analysis in which a suite of statistical tools was applied to rigorously evaluate the scaling of BTCs from experiments that generate tailed distributions typically described as PL at late time. To this end, a set of BTCs from numerical simulations in highly heterogeneous media was generated using a transition probability approach (T-PROGS) coupled to a finite-difference numerical solver of the flow equation (MODFLOW) and a random walk particle tracking approach for Lagrangian transport (RW3D). The T-PROGS fields assumed randomly distributed hydraulic heterogeneities with long correlation scales, creating solute channeling and anomalous transport. For simplicity, transport was simulated as purely advective. This combination of tools generates strongly non-symmetric BTCs visually resembling PL distributions at late time when plotted on double-log scales. Unlike other combinations of modeling parameters and boundary conditions (e.g. matrix diffusion in fractures), at late time no direct link exists between the mathematical functions describing the scaling of these curves and the physical parameters controlling transport. The results suggest that the statistical tests fail to describe the majority of curves as PL distributed. Moreover, they suggest that PL and lognormal distributions have the same likelihood to represent parametrically the shape of the tails.
Notably, forcing a model to reproduce the tail as a PL function results in a distribution of PL slopes between 1.2 and 4, which are the typical values observed during field experiments. We conclude that care must be taken when defining a BTC late-time distribution as a power-law function. Even though the estimated scaling factors are found to fall in traditional ranges, the actual distribution controlling the scaling of concentration may differ from a power-law function, with direct consequences, for instance, for the selection of effective parameters in upscaling modeling solutions.
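The core of such a tail analysis is that a power law and a lognormal can be compared by maximum likelihood on the same tail sample rather than by eye. A condensed sketch (the data are synthetic and drawn lognormal, and the lognormal fit ignores the tail truncation for brevity; rigorous procedures such as Clauset-style tests add goodness-of-fit and likelihood-ratio significance steps):

```python
import math
import random

random.seed(0)
xmin = 1.0
# Synthetic late-time tail sample, drawn lognormal here, so the likelihood
# comparison should (correctly) not favour the power law
tail = [x for x in (random.lognormvariate(1.0, 0.6) for _ in range(3000))
        if x >= xmin]
n = len(tail)

# Continuous power-law MLE on x >= xmin: alpha = 1 + n / sum(ln(x/xmin))
alpha = 1.0 + n / sum(math.log(x / xmin) for x in tail)
ll_pl = sum(math.log(alpha - 1.0) - math.log(xmin) - alpha * math.log(x / xmin)
            for x in tail)

# Lognormal MLE (truncation at xmin ignored for brevity)
logs = [math.log(x) for x in tail]
mu = sum(logs) / n
sd = math.sqrt(sum((l - mu) ** 2 for l in logs) / n)
ll_ln = sum(-math.log(x * sd * math.sqrt(2.0 * math.pi))
            - (math.log(x) - mu) ** 2 / (2.0 * sd ** 2) for x in tail)
```

Even though the fitted power-law slope here lands in the "typical" field range, the lognormal attains the higher likelihood, which is exactly the kind of ambiguity the abstract warns about.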

  20. Immunodepletion Plasma Proteomics by TripleTOF 5600 and Orbitrap Elite/LTQ-Orbitrap Velos/Q Exactive Mass Spectrometers

    PubMed Central

    Patel, Bhavinkumar B.; Kelsen, Steven G.; Braverman, Alan; Swinton, Derrick J.; Gafken, Philip R.; Jones, Lisa A.; Lane, William S.; Neveu, John M.; Leung, Hon-Chiu E.; Shaffer, Scott A.; Leszyk, John D.; Stanley, Bruce A.; Fox, Todd E.; Stanley, Anne; Hall, Michael J.; Hampel, Heather; South, Christopher D.; de la Chapelle, Albert; Burt, Randall W.; Jones, David A.; Kopelovich, Levy; Yeung, Anthony T.

    2013-01-01

Plasma proteomic experiments performed rapidly and economically using several of the latest high-resolution mass spectrometers were compared. Four quantitative hyperfractionated plasma proteomics experiments were analyzed in replicates by two AB SCIEX TripleTOF 5600 and three Thermo Scientific Orbitrap (Elite/LTQ-Orbitrap Velos/Q Exactive) instruments. Each experiment compared two iTRAQ isobaric-labeled immunodepleted plasma proteomes, provided as 30 labeled peptide fractions. 480 LC-MS/MS runs delivered >250 GB of data in two months. Several analysis algorithms were compared. At a 1% false discovery rate, the relative comparative findings concluded that the Thermo Scientific Q Exactive mass spectrometer resulted in the highest number of identified proteins and unique sequences with iTRAQ quantitation. The confidence of the iTRAQ fold-change for each protein depends on the overall ion statistics (Mascot Protein Score) attainable by each instrument. The benchmarking also suggested how to further improve the mass spectrometry parameters and HPLC conditions. Our findings highlight the special challenges presented by the low-abundance peptide ions of the iTRAQ plasma proteome, because the dynamic range of plasma protein abundance is uniquely high compared with cell lysates, necessitating high instrument sensitivity. PMID:24004147

  1. Providing responsive nursing care to new mothers with high and low confidence.

    PubMed

    Mantha, Shannon; Davies, Barbara; Moyer, Alwyn; Crowe, Katherine

    2008-01-01

    To describe new mothers' experiences with family-centered maternity care in relation to their confidence level and to determine how care could have been more responsive to their needs. Using data from a prospective Canadian survey of 596 postpartum women, a subsample of women with low and high confidence (N = 74) was selected. Data were analyzed using descriptive statistics and content analysis. Women with both high and low confidence expressed negative experiences with similar frequency (n = 47/74, 64%). Women wanted more nursing support for breastfeeding and postpartum teaching and education. Women who reported a language other than English or French as their first language were significantly less confident than English- and French-speaking women (p < .05). A multilevel framework about family-centered care is presented for healthcare providers in prenatal, labor and birth, and postpartum care. It is recommended that nurses ask new mothers about their confidence level and give special consideration to cultural background in order to provide supportive care in hospital and community settings.

  2. Low-level contrast statistics are diagnostic of invariance of natural textures

    PubMed Central

    Groen, Iris I. A.; Ghebreab, Sennay; Lamme, Victor A. F.; Scholte, H. Steven

    2012-01-01

    Texture may provide important clues for real world object and scene perception. To be reliable, these clues should ideally be invariant to common viewing variations such as changes in illumination and orientation. In a large image database of natural materials, we found textures with low-level contrast statistics that varied substantially under viewing variations, as well as textures that remained relatively constant. This led us to ask whether textures with constant contrast statistics give rise to more invariant representations compared to other textures. To test this, we selected natural texture images with either high (HV) or low (LV) variance in contrast statistics and presented these to human observers. In two distinct behavioral categorization paradigms, participants more often judged HV textures as “different” compared to LV textures, showing that textures with constant contrast statistics are perceived as being more invariant. In a separate electroencephalogram (EEG) experiment, evoked responses to single texture images (single-image ERPs) were collected. The results show that differences in contrast statistics correlated with both early and late differences in occipital ERP amplitude between individual images. Importantly, ERP differences between images of HV textures were mainly driven by illumination angle, which was not the case for LV images: there, differences were completely driven by texture membership. These converging neural and behavioral results imply that some natural textures are surprisingly invariant to illumination changes and that low-level contrast statistics are diagnostic of the extent of this invariance. PMID:22701419

  3. Statistics Graduate Students' Professional Development for Teaching: A Communities of Practice Model

    NASA Astrophysics Data System (ADS)

    Justice, Nicola

    Graduate teaching assistants (GTAs) are responsible for instructing approximately 25% of introductory statistics courses in the United States (Blair, Kirkman, & Maxwell, 2013). Most research on GTA professional development focuses on structured activities (e.g., courses, workshops) that have been developed to improve GTAs' pedagogy and content knowledge. Few studies take into account the social contexts of GTAs' professional development. However, GTAs perceive their social interactions with other GTAs to be a vital part of their preparation and support for teaching (e.g., Staton & Darling, 1989). Communities of practice (CoPs) are one way to bring together the study of the social contexts and structured activities of GTA professional development. CoPs are defined as groups of practitioners who deepen their knowledge and expertise by interacting with each other on an ongoing basis (e.g., Lave & Wenger, 1991). Graduate students may participate in CoPs related to teaching in many ways, including attending courses or workshops, participating in weekly meetings, engaging in informal discussions about teaching, or participating in e-mail conversations related to teaching tasks. This study explored the relationship between statistics graduate students' experiences in CoPs and the extent to which they hold student-centered teaching beliefs. A framework for characterizing GTAs' experiences in CoPs was described and a theoretical model relating these characteristics to GTAs' beliefs was developed. To gather data to test the model, the Graduate Students' Experiences Teaching Statistics (GETS) Inventory was created. Items were written to collect information about GTAs' current teaching beliefs, teaching beliefs before entering their degree programs, characteristics of GTAs' experiences in CoPs, and demographic information. Using an online program, the GETS Inventory was administered to N =218 statistics graduate students representing 37 institutions in 24 different U.S. 
states. The data gathered from the national survey suggest that statistics graduate students often experience CoPs through required meetings and voluntary discussions about teaching. Participants feel comfortable disagreeing with the people they perceive to be most influential on their teaching beliefs. Most participants perceive a faculty member to have the most influential role in shaping their teaching beliefs. The survey data did not provide evidence to support the proposed theoretical model relating characteristics of experiences in CoPs and beliefs about teaching statistics. Based on cross-validation results, prior beliefs about teaching statistics were the best predictor of current beliefs. Additional models were retained that included student characteristics suggested by previous literature to be associated with student-centered or traditional teaching beliefs (e.g., prior teaching experience, international student status). The results of this study can be used to inform future efforts to help promote student-centered teaching beliefs and teaching practices among statistics GTAs. Modifications to the GETS Inventory are suggested for use in future research designed to gather information about GTAs, their teaching beliefs, and their experiences in CoPs. Suggestions are also made for aspects of CoPs that might be studied further in order to learn how CoPs can promote teaching beliefs and practices that support student learning.

  4. High Fidelity Simulation Experience in Emergency settings: doctors and nurses satisfaction levels.

    PubMed

    Calamassi, Diletta; Nannelli, Tiziana; Guazzini, Andrea; Rasero, Laura; Bambi, Stefano

    2016-11-22

    Many studies describe High Fidelity Simulation (HFS) as an experience well accepted by learners. This study explored doctors' and nurses' satisfaction levels during HFS sessions and examined associations with the setting of the simulation events (simulation center or in-the-field simulation). Moreover, we studied the correlation between HFS experience satisfaction levels and the socio-demographic features of the participants. This was a mixed-methods study using the Satisfaction of High-Fidelity Simulation Experience (SESAF) questionnaire, administered through an online survey to doctors and nurses who had previously taken part in HFS sessions in a simulation center or in the field. Quantitative data were analyzed through descriptive and inferential statistical methods; qualitative data were analyzed using the Giorgi method. A total of 143 doctors and 94 nurses completed the questionnaire. The satisfaction level was high: on a 10-point scale, the mean score was 8.17 (SD±1.924). There was no significant difference between doctors' and nurses' satisfaction levels in almost all the SESAF factors, and we found no correlation between gender and HFS experience satisfaction levels. Knowledge of the theoretical aspects of the simulated case before the HFS experience was related to higher general satisfaction (r=0.166, p=0.05), higher effectiveness of debriefing (r=0.143, p=0.05), and higher professional impact (r=0.143, p=0.05). Respondents who performed HFS in the field were more satisfied than the others and reported higher "professional impact", "clinical reasoning and self-efficacy", and "team dynamics" (p<0.01). Narrative data suggest that HFS facilitators should improve their behaviors during debriefing, that healthcare managers should extend HFS to all kinds of healthcare workers in real clinical settings, and that there is a need to improve the communication competences of HFS facilitators.

  5. GenomeGraphs: integrated genomic data visualization with R.

    PubMed

    Durinck, Steffen; Bullard, James; Spellman, Paul T; Dudoit, Sandrine

    2009-01-06

    Biological studies involve a growing number of distinct high-throughput experiments to characterize samples of interest. There is a lack of methods to visualize these different genomic datasets in a versatile manner. In addition, genomic data analysis requires integrated visualization of experimental data along with constantly changing genomic annotation and statistical analyses. We developed GenomeGraphs, as an add-on software package for the statistical programming environment R, to facilitate integrated visualization of genomic datasets. GenomeGraphs uses the biomaRt package to perform on-line annotation queries to Ensembl and translates these to gene/transcript structures in viewports of the grid graphics package. This allows genomic annotation to be plotted together with experimental data. GenomeGraphs can also be used to plot custom annotation tracks in combination with different experimental data types together in one plot using the same genomic coordinate system. GenomeGraphs is a flexible and extensible software package which can be used to visualize a multitude of genomic datasets within the statistical programming environment R.

  6. A statistical framework to predict functional non-coding regions in the human genome through integrated analysis of annotation data.

    PubMed

    Lu, Qiongshi; Hu, Yiming; Sun, Jiehuan; Cheng, Yuwei; Cheung, Kei-Hoi; Zhao, Hongyu

    2015-05-27

    Identifying functional regions in the human genome is a major goal in human genetics. Great efforts have been made to functionally annotate the human genome either through computational predictions, such as genomic conservation, or high-throughput experiments, such as the ENCODE project. These efforts have resulted in a rich collection of functional annotation data of diverse types that need to be jointly analyzed for integrated interpretation and annotation. Here we present GenoCanyon, a whole-genome annotation method that performs unsupervised statistical learning using 22 computational and experimental annotations thereby inferring the functional potential of each position in the human genome. With GenoCanyon, we are able to predict many of the known functional regions. The ability of predicting functional regions as well as its generalizable statistical framework makes GenoCanyon a unique and powerful tool for whole-genome annotation. The GenoCanyon web server is available at http://genocanyon.med.yale.edu.
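    GenoCanyon's actual model integrates 22 computational and experimental annotations; as an illustrative sketch only, the underlying unsupervised idea, scoring each position by the posterior probability of a "functional" mixture component, can be shown with a toy one-dimensional two-component Gaussian mixture fit by EM. All data and parameters below are invented for illustration.

```python
import numpy as np

def em_two_gaussians(x, n_iter=200):
    """Fit a two-component 1-D Gaussian mixture by EM and return the
    posterior probability that each point belongs to the higher-mean
    ("functional") component."""
    x = np.asarray(x, dtype=float)
    # Crude initialization from the data quantiles.
    mu = np.percentile(x, [25.0, 75.0])
    sigma = np.array([x.std(), x.std()]) + 1e-6
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point.
        dens = (pi / (sigma * np.sqrt(2 * np.pi)) *
                np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and standard deviations.
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
    hi = np.argmax(mu)  # component with the larger mean
    return resp[:, hi]

rng = np.random.default_rng(0)
# Synthetic "annotation score": mostly background, a minority shifted upward.
scores = np.concatenate([rng.normal(0.0, 1.0, 900), rng.normal(4.0, 1.0, 100)])
posterior = em_two_gaussians(scores)
```

    The shifted minority receives posterior probabilities near one, mimicking how an unsupervised mixture can flag likely functional positions without labeled training data.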

  7. Color Image Segmentation Based on Statistics of Location and Feature Similarity

    NASA Astrophysics Data System (ADS)

    Mori, Fumihiko; Yamada, Hiromitsu; Mizuno, Makoto; Sugano, Naotoshi

    The process of “image segmentation and extraction of remarkable regions” is an important research subject in image understanding. However, algorithms based on global features are rarely found. The requirement for such an image segmentation algorithm is to reduce over-segmentation and over-unification as much as possible. We developed an algorithm using the multidimensional convex hull based on density as the global feature. Concretely, we propose a new algorithm in which regions are expanded according to region statistics such as the mean value, standard deviation, maximum, and minimum of pixel location, brightness, and color elements, with the statistics updated as the regions grow. We also introduced a new concept of conspicuity degree and applied the method to 21 different images to examine its effectiveness. The remarkable object regions extracted by the presented system coincided closely with those pointed out by the sixty-four subjects who took part in the psychological experiment.
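    A minimal sketch of region growing driven by running region statistics, in the spirit of the algorithm described above. The admission rule, the constants, and the synthetic image are assumptions for illustration, not the authors' exact method.

```python
import numpy as np
from collections import deque

def grow_region(img, seed, k=2.5, min_sigma=2.0):
    """Grow a region from `seed`, admitting 4-connected neighbors whose
    intensity lies within k standard deviations of the running region
    mean; mean/std statistics are updated as pixels are added."""
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    total, total_sq, n = float(img[seed]), float(img[seed]) ** 2, 1
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                mean = total / n
                sigma = max(np.sqrt(max(total_sq / n - mean ** 2, 0.0)), min_sigma)
                if abs(img[ny, nx] - mean) <= k * sigma:
                    mask[ny, nx] = True
                    v = float(img[ny, nx])
                    total, total_sq, n = total + v, total_sq + v * v, n + 1
                    queue.append((ny, nx))
    return mask

# Synthetic scene: noisy background with a bright 20x20 object.
rng = np.random.default_rng(1)
img = rng.normal(10.0, 1.0, (64, 64))
img[20:40, 20:40] += 50.0
region = grow_region(img, seed=(30, 30))
```

    On this synthetic scene the grown region recovers the bright object while rejecting the background, illustrating how updated region statistics keep the expansion from leaking across a strong boundary.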

  8. Full Counting Statistics for Interacting Fermions with Determinantal Quantum Monte Carlo Simulations.

    PubMed

    Humeniuk, Stephan; Büchler, Hans Peter

    2017-12-08

    We present a method for computing the full probability distribution function of quadratic observables such as particle number or magnetization for the Fermi-Hubbard model within the framework of determinantal quantum Monte Carlo calculations. Especially in cold atom experiments with single-site resolution, such full counting statistics can be obtained from repeated projective measurements. We demonstrate that the full counting statistics can provide important information on the size of preformed pairs. Furthermore, we compute the full counting statistics of the staggered magnetization in the repulsive Hubbard model at half filling and find excellent agreement with recent experimental results. We show that current experiments are capable of probing the difference between the Hubbard model and the limiting Heisenberg model.

  9. Statistical methods for quantitative mass spectrometry proteomic experiments with labeling.

    PubMed

    Oberg, Ann L; Mahoney, Douglas W

    2012-01-01

    Mass spectrometry utilizing labeling allows multiple specimens to be analyzed simultaneously. As a result, between-experiment variability is reduced. Here we describe the use of fundamental concepts of statistical experimental design in the labeling framework in order to minimize variability and avoid biases. We demonstrate how to export data in the format that is most efficient for statistical analysis. We demonstrate how to assess the need for normalization, perform normalization, and check whether it worked. We describe how to build a model explaining the observed values and test for differential protein abundance, along with descriptive statistics and measures of reliability of the findings. Concepts are illustrated through the use of three case studies utilizing the iTRAQ 4-plex labeling protocol.
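    The "assess, normalize, check" step described above can be sketched as follows: simulated log2 reporter-ion intensities with channel-specific offsets mimic unequal sample loading, median normalization aligns the channels, and a final check confirms the medians agree. The data and offsets are invented; this is a hedged illustration, not the chapter's exact protocol.

```python
import numpy as np

rng = np.random.default_rng(2)
# Simulated log2 reporter-ion intensities: 500 proteins x 4 labeled channels,
# with channel-specific offsets mimicking unequal sample loading.
true = rng.normal(20.0, 2.0, size=(500, 1))
offsets = np.array([0.0, 0.8, -0.5, 0.3])
log_intensity = true + offsets + rng.normal(0.0, 0.2, size=(500, 4))

# Assess the need for normalization: channel medians should agree but do not.
medians_before = np.median(log_intensity, axis=0)

# Median normalization: shift every channel onto the overall median.
normalized = log_intensity - medians_before + np.median(log_intensity)

# Check that it worked: per-channel medians are now identical.
medians_after = np.median(normalized, axis=0)
```

    Because the median is translation-equivariant, the post-normalization channel medians match the overall median exactly, which is the simplest "did it work" diagnostic.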

  10. Calculations with the quasirelativistic local-spin-density-functional theory for high-Z atoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Y.; Whitehead, M.A.

    1988-10-01

    The generalized-exchange local-spin-density-functional theory (LSD-GX) with relativistic corrections of the mass velocity and Darwin terms has been used to calculate statistical total energies for the neutral atoms, the positive ions, and the negative ions for high-Z elements. The effect of the correlation and relaxation correction on the statistical total energy is discussed. Comparing the calculated results for the ionization potentials and electron affinities for the atoms (atomic number Z from 37 to 56 and 72 to 80) with experiment shows that for the atoms rubidium to barium both the LSD-GX and the quasirelativistic LSD-GX, with self-interaction correction, Gopinathan, Whitehead, and Bogdanovic's Fermi-hole parameters (Phys. Rev. A 14, 1 (1976)), and Vosko, Wilk, and Nusair's correlation correction (Can. J. Phys. 58, 1200 (1980)), are very good methods for calculating ionization potentials and electron affinities. For the atoms hafnium to mercury the relativistic effect has to be considered.

  11. How to hit HIV where it hurts

    NASA Astrophysics Data System (ADS)

    Chakraborty, Arup

    No medical procedure has saved more lives than vaccination. But today some pathogens defy successful vaccination using the empirical paradigms pioneered by Pasteur and Jenner. One characteristic of many pathogens for which successful vaccines do not exist is that they present themselves in various guises. HIV is an extreme example because of its high mutability. This highly mutable virus can evade natural or vaccine-induced immune responses, often by mutating at multiple sites linked by compensatory interactions. I will describe first how, by bringing to bear ideas from statistical physics (e.g., maximum entropy models, Hopfield models, Feynman variational theory) together with in vitro experiments and clinical data, the fitness landscape of HIV is beginning to be defined with explicit account of collective mutational pathways. I will describe how this knowledge can be harnessed for vaccine design. Finally, I will describe how ideas at the intersection of evolutionary biology, immunology, and statistical physics can help guide the design of strategies that may be able to induce broadly neutralizing antibodies.
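    As a toy illustration of the maximum-entropy idea, a pairwise (Ising-like) energy model can encode the compensatory interactions mentioned above: two individually costly mutations become favorable in combination. The fields h and couplings J below are invented for illustration, not inferred from HIV data.

```python
import numpy as np

def energy(s, h, J):
    """Pairwise maximum-entropy (Ising-like) energy of a binary mutation
    pattern s; lower energy corresponds to higher inferred fitness."""
    s = np.asarray(s, dtype=float)
    return -(h @ s) - 0.5 * s @ J @ s

def fitness(s, h, J):
    """Boltzmann-style fitness proxy: prevalence ~ exp(-energy)."""
    return np.exp(-energy(s, h, J))

L = 3
h = np.array([-2.0, -2.0, 0.5])   # sites 0 and 1: costly as single mutations
J = np.zeros((L, L))
J[0, 1] = J[1, 0] = 5.0           # ...but strongly compensatory when combined

wild_type  = fitness([0, 0, 0], h, J)
single_mut = fitness([1, 0, 0], h, J)
double_mut = fitness([1, 1, 0], h, J)
```

    Each single mutant is less fit than the wild type, but the coupled double mutant is fitter than both, capturing how the virus can reach escape variants only through collective mutational pathways.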

  12. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates.

    PubMed

    Schwämmle, Veit; León, Ileana Rodríguez; Jensen, Ole Nørregaard

    2013-09-06

    Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry driven proteomics experiments are frequently performed with few biological or technical replicates because of sample scarcity, duty-cycle or sensitivity constraints, or limited capacity of the available instrumentation, leading to incomplete results in which the detection of significant feature changes becomes a challenge. This problem is further exacerbated for the detection of significant changes on the peptide level, for example, in phospho-proteomics experiments. In order to assess the extent of this problem and the implications for large-scale proteome analysis, we investigated and optimized the performance of three statistical approaches by using simulated and experimental data sets with varying numbers of missing values. We applied three tools, the standard t test, the moderated t test (also known as limma), and rank products, for the detection of significantly changing features in simulated and experimental proteomics data sets with missing values. The rank product method was improved to work with data sets containing missing values. Extensive analysis of simulated and experimental data sets revealed that the performance of the statistical analysis tools depended on simple properties of the data sets. High-confidence results were obtained by using the limma and rank products methods for analyses of triplicate data sets that exhibited more than 1000 features and more than 50% missing values. The maximum number of differentially represented features was identified by using the limma and rank products methods in a complementary manner. We therefore recommend combined usage of these methods as a novel and optimal way to detect significantly changing features in these data sets. 
This approach is suitable for large quantitative data sets from stable isotope labeling and mass spectrometry experiments and should be applicable to large data sets of any type. An R script that implements the improved rank products algorithm and the combined analysis is available.
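    A minimal sketch of a rank-product statistic adapted to missing values, in the spirit of the improvement described above: each feature's statistic is the geometric mean of its per-replicate ranks, taken only over the replicates in which the feature was observed. This omits the permutation-based significance estimation of the full method, and the data are invented.

```python
import numpy as np

def rank_product(changes):
    """Rank-product statistic for a features x replicates matrix of log
    fold-changes that may contain NaNs.  Within each replicate, rank 1 is
    the largest up-regulation; a feature's statistic is the geometric mean
    of its ranks over the replicates where it was observed."""
    changes = np.asarray(changes, dtype=float)
    n_feat, n_rep = changes.shape
    log_ranks = np.full_like(changes, np.nan)
    for k in range(n_rep):
        col = changes[:, k]
        obs = ~np.isnan(col)
        order = np.argsort(-col[obs])          # descending: biggest change first
        ranks = np.empty(obs.sum())
        ranks[order] = np.arange(1, obs.sum() + 1)
        log_ranks[obs, k] = np.log(ranks)
    # Geometric mean of ranks over the replicates actually observed.
    with np.errstate(invalid="ignore"):
        return np.exp(np.nanmean(log_ranks, axis=1))

data = np.array([
    [ 3.1,  2.8, np.nan],   # consistently up-regulated, one missing value
    [ 0.1, -0.2,  0.05],
    [-0.3,  0.4, -0.1],
    [ 2.5, np.nan,  2.9],   # also consistently up-regulated
])
rp = rank_product(data)     # small values flag consistently changing features
```

    Features that rank near the top in every observed replicate get a rank product near 1 even when a replicate is missing, which is exactly the behavior needed for sparse data sets.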

  13. Statistical Metamodeling and Sequential Design of Computer Experiments to Model Glyco-Altered Gating of Sodium Channels in Cardiac Myocytes.

    PubMed

    Du, Dongping; Yang, Hui; Ednie, Andrew R; Bennett, Eric S

    2016-09-01

    Glycan structures account for up to 35% of the mass of cardiac sodium (Nav) channels. To investigate whether and how reduced sialylation affects Nav activity and cardiac electrical signaling, we conducted a series of in vitro experiments on ventricular apex myocytes under two different glycosylation conditions, reduced protein sialylation (ST3Gal4(-/-)) and full glycosylation (control). Although aberrant electrical signaling is observed under reduced sialylation, realizing a better understanding of the mechanistic details of pathological variations in INa and AP is difficult without performing in silico studies. However, computer models of Nav channels and cardiac myocytes involve great complexity, e.g., a high-dimensional parameter space and nonlinear, nonconvex equations. Traditional linear and nonlinear optimization methods have encountered many difficulties in model calibration. This paper presents a new statistical metamodeling approach for efficient computer experiments and optimization of Nav models. First, we utilize a fractional factorial design to identify control variables from the large set of model parameters, thereby reducing the dimensionality of the parametric space. Further, we develop a Gaussian process model as a surrogate of the expensive and time-consuming computer models and then identify the next best design point, the one that yields the maximal probability of improvement. This process iterates until convergence, and the performance is evaluated and validated with real-world experimental data. Experimental results show the proposed algorithm achieves superior performance in modeling the kinetics of Nav channels under a variety of glycosylation conditions. As a result, in silico models provide a better understanding of glyco-altered mechanistic details in state transitions and distributions of Nav channels. Notably, ST3Gal4(-/-) myocytes are shown to have higher probabilities accumulated in intermediate inactivation during repolarization and to yield a shorter refractory period than wild-type myocytes. The proposed statistical design of computer experiments is generally extensible to many other disciplines that involve large-scale and computationally expensive models.
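    The core loop of the metamodeling approach, a Gaussian process surrogate scored with a probability-of-improvement acquisition, can be sketched in one dimension. The kernel, its length scale, and the stand-in "expensive" function are assumptions for illustration; the paper's Nav models are far more complex.

```python
import numpy as np
from math import erf, sqrt

def rbf(a, b, length=0.15):
    """Squared-exponential kernel for 1-D inputs (unit signal variance)."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-6):
    """Gaussian-process posterior mean and standard deviation."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_test, x_train)
    mean = Ks @ np.linalg.solve(K, y_train)
    v = np.linalg.solve(K, Ks.T)
    var = np.clip(1.0 - np.sum(Ks * v.T, axis=1), 1e-12, None)
    return mean, np.sqrt(var)

def prob_improvement(mean, std, best):
    """P(f(x) < best) under the GP: chance a candidate beats the incumbent."""
    z = (best - mean) / std
    return np.array([0.5 * (1.0 + erf(zi / sqrt(2.0))) for zi in z])

def expensive(x):                       # stand-in for a costly simulation
    return np.sin(3.0 * x) + x ** 2

x_train = np.array([0.05, 0.35, 0.65, 0.95])   # initial design points
y_train = expensive(x_train)
x_cand = np.linspace(0.0, 1.0, 201)            # candidate design points

mean, std = gp_posterior(x_train, y_train, x_cand)
pi = prob_improvement(mean, std, best=y_train.min())
x_next = x_cand[np.argmax(pi)]                 # next simulation to run
```

    In the full procedure, `x_next` would be evaluated with the expensive model, appended to the training set, and the fit-and-acquire loop repeated until convergence.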

  14. Ultracold Neutron Sources

    NASA Astrophysics Data System (ADS)

    Martin, Jeffery

    2016-09-01

    The free neutron is an excellent laboratory for searches for physics beyond the standard model. Ultracold neutrons (UCN) are free neutrons that can be confined to material, magnetic, and gravitational traps. UCN are compelling for experiments requiring long observation times, high polarization, or low energies. The challenge of experiments has been to create enough UCN to reach the statistical precision required. Production techniques involving neutron interactions with condensed matter systems have resulted in some successes, and new UCN sources are being pursued worldwide to exploit the higher UCN densities offered by these techniques. I will review the physics of how the UCN sources work, along with the present status of the world's efforts. Research supported by NSERC, CFI, and CRC.

  15. An Experiment Quantifying The Effect Of Clutter On Target Detection

    NASA Astrophysics Data System (ADS)

    Weathersby, Marshall R.; Schmieder, David E.

    1985-01-01

    Experiments were conducted to determine the influence of background clutter on target detection criteria. The experiment consisted of placing observers in front of displayed images on a TV monitor. Observer ability to detect military targets embedded in simulated natural and manmade background clutter was measured when there was unlimited viewing time. Results were described in terms of detection probability versus target resolution for various signal-to-clutter ratios (SCR). The experiments were preceded by a search for a meaningful clutter definition. The selected definition was a statistical measure computed by averaging the standard deviation of contiguous scene cells over the whole scene. The cell size was comparable to the target size. Observer test results confirmed the expectation that the resolution required for a given detection probability was a continuum function of the clutter level. At the lower SCRs the resolution required for a high probability of detection was near 6 line pairs per target (LP/TGT), while at the higher SCRs a resolution of less than 0.25 LP/TGT would yield a high probability of detection. These results are expected to aid in target acquisition performance modeling and to lead to improved specifications for imaging automatic target screeners.
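    The clutter definition quoted above is straightforward to compute. The sketch below tiles a synthetic scene into target-sized cells, averages the per-cell standard deviations, and forms a signal-to-clutter ratio; the scene, the target contrast, and the exact averaging convention are assumptions for illustration, not the paper's calibrated setup.

```python
import numpy as np

def clutter_level(scene, cell):
    """Clutter metric in the spirit of the abstract: average the standard
    deviations of contiguous, non-overlapping scene cells whose size is
    comparable to the target size."""
    h, w = scene.shape
    stds = [scene[y:y + cell, x:x + cell].std()
            for y in range(0, h - cell + 1, cell)
            for x in range(0, w - cell + 1, cell)]
    return float(np.mean(stds))

def signal_to_clutter(scene, target_mask, cell):
    """SCR: mean target-to-background contrast divided by the clutter level."""
    contrast = scene[target_mask].mean() - scene[~target_mask].mean()
    return abs(contrast) / clutter_level(scene, cell)

rng = np.random.default_rng(3)
scene = rng.normal(100.0, 8.0, (128, 128))   # textured background
target_mask = np.zeros((128, 128), dtype=bool)
target_mask[60:68, 60:68] = True             # 8x8-pixel target
scene[target_mask] += 30.0                   # additive target contrast
scr = signal_to_clutter(scene, target_mask, cell=8)
```

    With this convention, raising the background texture raises the clutter level and lowers the SCR, matching the experiment's finding that more resolution is needed at lower SCR.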

  16. A method to accelerate creation of plasma etch recipes using physics and Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Chopra, Meghali J.; Verma, Rahul; Lane, Austin; Willson, C. G.; Bonnecaze, Roger T.

    2017-03-01

    Next generation semiconductor technologies like high density memory storage require precise 2D and 3D nanopatterns. Plasma etching processes are essential to achieving the nanoscale precision required for these structures. Current plasma process development methods rely primarily on iterative trial and error or factorial design of experiment (DOE) to define the plasma process space. Here we evaluate the efficacy of the software tool Recipe Optimization for Deposition and Etching (RODEo) against standard industry methods at determining the process parameters of a high density O2 plasma system with three case studies. In the first case study, we demonstrate that RODEo is able to predict etch rates more accurately than a regression model based on a full factorial design while using 40% fewer experiments. In the second case study, we demonstrate that RODEo performs significantly better than a full factorial DOE at identifying optimal process conditions to maximize anisotropy. In the third case study we experimentally show how RODEo maximizes etch rates while using half the experiments of a full factorial DOE method. With enhanced process predictions and more accurate maps of the process space, RODEo reduces the number of experiments required to develop and optimize plasma processes.

  17. Incorporating an Interactive Statistics Workshop into an Introductory Biology Course-Based Undergraduate Research Experience (CURE) Enhances Students’ Statistical Reasoning and Quantitative Literacy Skills

    PubMed Central

    Olimpo, Jeffrey T.; Pevey, Ryan S.; McCabe, Thomas M.

    2018-01-01

    Course-based undergraduate research experiences (CUREs) provide an avenue for student participation in authentic scientific opportunities. Within the context of such coursework, students are often expected to collect, analyze, and evaluate data obtained from their own investigations. Yet, limited research has been conducted that examines mechanisms for supporting students in these endeavors. In this article, we discuss the development and evaluation of an interactive statistics workshop that was expressly designed to provide students with an open platform for graduate teaching assistant (GTA)-mentored data processing, statistical testing, and synthesis of their own research findings. Mixed methods analyses of pre/post-intervention survey data indicated a statistically significant increase in students’ reasoning and quantitative literacy abilities in the domain, as well as enhancement of student self-reported confidence in and knowledge of the application of various statistical metrics to real-world contexts. Collectively, these data reify an important role for scaffolded instruction in statistics in preparing emergent scientists to be data-savvy researchers in a globally expansive STEM workforce. PMID:29904549

  18. Incorporating an Interactive Statistics Workshop into an Introductory Biology Course-Based Undergraduate Research Experience (CURE) Enhances Students' Statistical Reasoning and Quantitative Literacy Skills.

    PubMed

    Olimpo, Jeffrey T; Pevey, Ryan S; McCabe, Thomas M

    2018-01-01

    Course-based undergraduate research experiences (CUREs) provide an avenue for student participation in authentic scientific opportunities. Within the context of such coursework, students are often expected to collect, analyze, and evaluate data obtained from their own investigations. Yet, limited research has been conducted that examines mechanisms for supporting students in these endeavors. In this article, we discuss the development and evaluation of an interactive statistics workshop that was expressly designed to provide students with an open platform for graduate teaching assistant (GTA)-mentored data processing, statistical testing, and synthesis of their own research findings. Mixed methods analyses of pre/post-intervention survey data indicated a statistically significant increase in students' reasoning and quantitative literacy abilities in the domain, as well as enhancement of student self-reported confidence in and knowledge of the application of various statistical metrics to real-world contexts. Collectively, these data reify an important role for scaffolded instruction in statistics in preparing emergent scientists to be data-savvy researchers in a globally expansive STEM workforce.

  19. Choosing an Appropriate Modelling Framework for Analysing Multispecies Co-culture Cell Biology Experiments.

    PubMed

    Markham, Deborah C; Simpson, Matthew J; Baker, Ruth E

    2015-04-01

    In vitro cell biology assays play a crucial role in informing our understanding of the migratory, proliferative and invasive properties of many cell types in different biological contexts. While mono-culture assays involve the study of a population of cells composed of a single cell type, co-culture assays study a population of cells composed of multiple cell types (or subpopulations of cells). Such co-culture assays can provide more realistic insights into many biological processes including tissue repair, tissue regeneration and malignant spreading. Typically, system parameters, such as motility and proliferation rates, are estimated by calibrating a mathematical or computational model to the observed experimental data. However, parameter estimates can be highly sensitive to the choice of model and modelling framework. This observation motivates us to consider the fundamental question of how we can best choose a model to facilitate accurate parameter estimation for a particular assay. In this work we describe three mathematical models of mono-culture and co-culture assays that include different levels of spatial detail. We study various spatial summary statistics to explore if they can be used to distinguish between the suitability of each model over a range of parameter space. Our results for mono-culture experiments are promising, in that we suggest two spatial statistics that can be used to direct model choice. However, co-culture experiments are far more challenging: we show that these same spatial statistics which provide useful insight into mono-culture systems are insufficient for co-culture systems. Therefore, we conclude that great care ought to be exercised when estimating the parameters of co-culture assays.

  20. High statistics search for the θ+(1.54) pentaquark state

    NASA Astrophysics Data System (ADS)

    Longo, M. J.; Burnstein, R. A.; Chakravorty, A.; Chen, Y. C.; Choong, W. S.; Clark, K.; Dukes, E. C.; Durandet, C.; Felix, J.; Fu, Y.; Gidal, G.; Gustafson, H. R.; Holmstrom, T.; Huang, M.; James, C.; Jenkins, C. M.; Jones, T.; Kaplan, D. M.; Lederman, L. M.; Leros, N.; Lopez, F.; Lu, L. C.; Luebke, W.; Luk, K. B.; Nelson, K. S.; Park, H. K.; Perroud, J.-P.; Rajaram, D.; Rubin, H. A.; Volk, J.; White, C. G.; White, S.; Zyla, P.

    2004-12-01

    We have searched for θ+(1.54)→K0p decays using data from the 1999 run of the HyperCP experiment at Fermilab. We see no evidence for a narrow peak in the K0Sp mass distribution near 1.54 GeV/c among 106 000 K0Sp candidates, and obtain an upper limit for the fraction of θ+(1.54) to K0Sp candidates of <0.3% at 90% confidence.

  1. Engaging Underrepresented High School Students in Data Driven Storytelling: An Examination of Learning Experiences and Outcomes for a Cohort of Rising Seniors Enrolled in the Gaining Early Awareness and Readiness for Undergraduate Program (GEAR UP)

    ERIC Educational Resources Information Center

    Dierker, Lisa; Ward, Nadia; Alexander, Jalen; Donate, Emmanuel

    2017-01-01

    Background: Upward trends in data-oriented careers threaten to further increase the underrepresentation of both females and individuals from racial minority groups in programs focused on data analysis and applied statistics. To begin to develop the necessary skills for a data-oriented career, project-based learning seems the most promising given…

  2. Emulating Simulations of Cosmic Dawn for 21 cm Power Spectrum Constraints on Cosmology, Reionization, and X-Ray Heating

    NASA Astrophysics Data System (ADS)

    Kern, Nicholas S.; Liu, Adrian; Parsons, Aaron R.; Mesinger, Andrei; Greig, Bradley

    2017-10-01

    Current and upcoming radio interferometric experiments are aiming to make a statistical characterization of the high-redshift 21 cm fluctuation signal spanning the hydrogen reionization and X-ray heating epochs of the universe. However, connecting 21 cm statistics to the underlying physical parameters is complicated by the theoretical challenge of modeling the relevant physics at computational speeds quick enough to enable exploration of the high-dimensional and weakly constrained parameter space. In this work, we use machine learning algorithms to build a fast emulator that can accurately mimic an expensive simulation of the 21 cm signal across a wide parameter space. We embed our emulator within a Markov Chain Monte Carlo framework in order to perform Bayesian parameter constraints over a large number of model parameters, including those that govern the Epoch of Reionization, the Epoch of X-ray Heating, and cosmology. As a worked example, we use our emulator to present an updated parameter constraint forecast for the Hydrogen Epoch of Reionization Array experiment, showing that its characterization of a fiducial 21 cm power spectrum will considerably narrow the allowed parameter space of reionization and heating parameters, and could help strengthen Planck's constraints on σ8. We provide both our generalized emulator code and its implementation specifically for 21 cm parameter constraints as publicly available software.
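    The emulator-in-MCMC idea can be sketched with a deliberately simple stand-in: fit a cheap surrogate to a few runs of an "expensive" model, then run Metropolis-Hastings calling only the surrogate. The polynomial emulator, the toy model, and the flat prior below are assumptions for illustration; the paper uses machine-learning emulators of full 21 cm simulations over many parameters.

```python
import numpy as np

def expensive_simulation(theta):
    """Stand-in for a slow simulation: maps one parameter to a single
    summary statistic (a smooth, monotonic toy function)."""
    return np.tanh(theta) + 0.5 * theta

# Step 1: train a cheap emulator on a handful of expensive runs.
design = np.linspace(-2.0, 2.0, 15)
runs = expensive_simulation(design)
emulator = np.poly1d(np.polyfit(design, runs, deg=7))

# Step 2: Metropolis-Hastings over the parameter, calling only the
# emulator inside the chain (the expensive model is never re-run).
observed, sigma = expensive_simulation(0.7), 0.1   # mock "measured" statistic

def log_post(theta):
    if not -2.0 <= theta <= 2.0:                   # flat prior on [-2, 2]
        return -np.inf
    return -0.5 * ((emulator(theta) - observed) / sigma) ** 2

rng = np.random.default_rng(4)
theta, lp, samples = 0.0, log_post(0.0), []
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.3)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:       # accept/reject step
        theta, lp = prop, lp_prop
    samples.append(theta)
posterior = np.array(samples[2000:])               # discard burn-in
```

    The chain concentrates around the true parameter value while paying only the emulator's cost per step, which is the speedup that makes wide parameter-space exploration feasible.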

  3. Association Between Medicare Summary Star Ratings for Patient Experience and Clinical Outcomes in US Hospitals.

    PubMed

    Trzeciak, Stephen; Gaughan, John P; Bosire, Joshua; Mazzarelli, Anthony J

    2016-03-01

    In 2015, the Centers for Medicare and Medicaid Services (CMS) released new summary star ratings for US hospitals based on patient experience. We aimed to test the association between CMS patient experience star ratings and clinical outcomes. We analyzed risk-adjusted data for more than 3000 US hospitals from CMS Hospital Compare using linear regression. We found that better patient experience was associated with favorable clinical outcomes. Specifically, a higher number of stars for patient experience had a statistically significant association with lower rates of many in-hospital complications. A higher patient experience star rating also had a statistically significant association with lower rates of unplanned readmissions to the hospital within 30 days. Better patient experience according to the CMS star ratings is associated with favorable clinical outcomes. These results support the inclusion of patient experience data in the framework of how hospitals are paid for services.

  4. Experience and Sentence Processing: Statistical Learning and Relative Clause Comprehension

    PubMed Central

    Wells, Justine B.; Christiansen, Morten H.; Race, David S.; Acheson, Daniel J.; MacDonald, Maryellen C.

    2009-01-01

    Many explanations of the difficulties associated with interpreting object relative clauses appeal to the demands that object relatives make on working memory. MacDonald and Christiansen (2002) pointed to variations in reading experience as a source of differences, arguing that the unique word order of object relatives makes their processing more difficult and more sensitive to the effects of previous experience than the processing of subject relatives. This hypothesis was tested in a large-scale study manipulating reading experiences of adults over several weeks. The group receiving relative clause experience increased reading speeds for object relatives more than for subject relatives, whereas a control experience group did not. The reading time data were compared to performance of a computational model given different amounts of experience. The results support claims for experience-based individual differences and an important role for statistical learning in sentence comprehension processes. PMID:18922516

  5. Effects of systemic and non-systemic stresses on the thermal characteristics of corn

    NASA Technical Reports Server (NTRS)

    Kumar, R.; Silva, L. F.; Baer, M. E.

    1978-01-01

    Experiments were conducted on corn plants using a calibrated spectroradiometer under field conditions in the indium antimonide channel (InSb, 2.8 to 5.6 μm) and the mercury cadmium telluride channel (HgCdTe, 7 to 14 μm). A ground cover experiment, an experiment on nonsystemic corn plants, and an experiment on systemic-stressed corn plants were included. The average spectral radiance temperature of corn plant populations was found (1) to be statistically significantly different for four healthy corn plant populations, (2) to increase with increased blight severity, and (3) to be statistically significantly different for varying rates of nitrogen applications.

  6. Rapid Configurational Fluctuations in a Model of Methylcellulose

    NASA Astrophysics Data System (ADS)

    Li, Xiaolan; Dorfman, Kevin

    Methylcellulose is a thermoresponsive polymer that undergoes a phase transition at elevated temperature, forming fibrils of uniform diameter. However, the gelation mechanism remains unclear, in particular at higher polymer concentrations. We have investigated a coarse-grained model for methylcellulose, proposed by Larson and coworkers, that produces collapsed toroids in dilute solution with a radius close to that observed in experiments. Using Brownian dynamics simulations, we demonstrate that this model's dihedral potential generates "flipping events," which help the chain avoid kinetic traps by undergoing sudden transitions between a coiled and a collapsed state. If the dihedral potential is removed, the chains cannot escape from their collapsed configuration, whereas at high dihedral potentials, the chains cannot stabilize the collapsed state. We will present quantitative results on the effect of the dihedral potential on both chain statistics and dynamic behavior, and discuss the implications of our results for the spontaneous formation of high-aspect-ratio fibrils in experiments.
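    The role of a dihedral term can be illustrated with a toy cosine-series potential of the kind common in coarse-grained polymer models. The coefficients below are invented for illustration and are not the parameters of the Larson model.

```python
# Toy cosine-series dihedral potential; the coefficients are invented,
# not the actual parameters of the Larson coarse-grained model.
import math

def dihedral_energy(phi, coeffs=(1.0, -0.5, 0.25)):
    """U(phi) = sum_n c_n * cos(n * phi) for n = 1..len(coeffs), phi in radians."""
    return sum(c * math.cos((n + 1) * phi) for n, c in enumerate(coeffs))

# With these (made-up) coefficients the trans state (phi = pi) lies
# below the cis state (phi = 0), so the dihedral term biases the
# chain's local conformation.
u_trans = dihedral_energy(math.pi)
u_cis = dihedral_energy(0.0)
```

    In a Brownian dynamics integrator, the negative gradient of such a term acts as a force along the dihedral coordinate; tuning the coefficient magnitudes changes how easily chains cross between coiled and collapsed states.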

  7. Integrated, multi-scale, spatial-temporal cell biology--A next step in the post genomic era.

    PubMed

    Horwitz, Rick

    2016-03-01

    New microscopic approaches, high-throughput imaging, and gene editing promise major new insights into cellular behaviors. When coupled with genomic and other 'omic information and "mined" for correlations and associations, a new breed of powerful and useful cellular models should emerge. These top-down, coarse-grained, statistical models can, in turn, be used to form hypotheses that merge with the fine-grained, bottom-up mechanistic studies and models that are the backbone of cell biology. The goal of the Allen Institute for Cell Science is to develop this top-down approach through a high-throughput microscopy pipeline, integrated with modeling and using gene-edited hiPS cell lines in various physiological and pathological contexts. The output of these experiments and models will be an "animated" cell, capable of integrating and analyzing image data generated from experiments and models. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Search for high mass dilepton resonances in pp collisions at √s = 7 TeV with the ATLAS experiment

    DOE PAGES

    Aad, G.

    2011-06-01

    This article presents a search for high mass e⁺e⁻ or μ⁺μ⁻ resonances in pp collisions at √s = 7 TeV at the LHC. The data were recorded by the ATLAS experiment during 2010 and correspond to a total integrated luminosity of ~ 40 pb⁻¹. No statistically significant excess above the Standard Model expectation is observed in the search region of dilepton invariant mass above 110 GeV. Upper limits at the 95% confidence level are set on the cross section times branching ratio of Z' resonances decaying to dielectrons and dimuons as a function of the resonance mass. Lastly, a lower mass limit of 1.048 TeV on the Sequential Standard Model Z' boson is derived, as well as mass limits on Z' and E₆-motivated Z' models.
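    The counting-experiment logic behind such upper limits can be sketched in a few lines. This is a generic 95% CL limit for a Poisson counting experiment with invented inputs, not the ATLAS statistical procedure, which uses likelihood-based methods over the full dilepton mass spectrum.

```python
# Toy 95% CL upper limit on a signal rate in a Poisson counting
# experiment. n_obs and the background are invented inputs, not
# values from the ATLAS analysis.
import math

def pois_cdf(n, mu):
    """P(N <= n) for N ~ Poisson(mu)."""
    return sum(math.exp(-mu) * mu ** k / math.factorial(k) for k in range(n + 1))

def upper_limit(n_obs, bkg, cl=0.95, step=0.001):
    """Smallest signal s with P(N <= n_obs | bkg + s) <= 1 - cl."""
    s = 0.0
    while pois_cdf(n_obs, bkg + s) > 1.0 - cl:
        s += step
    return s

# With 0 observed events and no background, this scan converges to
# the textbook limit of ln(20), roughly 3 signal events.
limit = upper_limit(0, 0.0)
```

    For zero observed events and zero background the scan reproduces the textbook result of about 3.0 signal events at 95% CL.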

  9. High-Intensity Radiated Field Fault-Injection Experiment for a Fault-Tolerant Distributed Communication System

    NASA Technical Reports Server (NTRS)

    Yates, Amy M.; Torres-Pomales, Wilfredo; Malekpour, Mahyar R.; Gonzalez, Oscar R.; Gray, W. Steven

    2010-01-01

    Safety-critical distributed flight control systems require robustness in the presence of faults. In general, these systems consist of a number of input/output (I/O) and computation nodes interacting through a fault-tolerant data communication system. The communication system transfers sensor data and control commands and can handle most faults under typical operating conditions. However, the performance of the closed-loop system can be adversely affected as a result of operating in harsh environments. In particular, High-Intensity Radiated Field (HIRF) environments have the potential to cause random fault manifestations in individual avionic components and to generate simultaneous system-wide communication faults that overwhelm existing fault management mechanisms. This paper presents the design of an experiment conducted at the NASA Langley Research Center's HIRF Laboratory to statistically characterize the faults that a HIRF environment can trigger on a single node of a distributed flight control system.

  10. Improved silicon nitride for advanced heat engines

    NASA Technical Reports Server (NTRS)

    Yeh, H. C.; Wimmer, J. M.

    1986-01-01

    Silicon nitride is a high temperature material currently under consideration for heat engine and other applications. The objective is to improve the net shape fabrication technology of Si3N4 by injection molding. This is to be accomplished by optimizing the process through a series of statistically designed matrix experiments. To provide input to the matrix experiments, a wide range of alternate materials and processing parameters was investigated throughout the whole program. The improvement in the processing is to be demonstrated by a 20 percent increase in strength and a 100 percent increase in the Weibull modulus over that of the baseline material. A full characterization of the baseline process was completed. Material properties were found to be highly dependent on each step of the process. Several important parameters identified thus far are the starting raw materials, sinter/hot isostatic pressing cycle, powder bed, mixing methods, and sintering aid levels.
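    The Weibull modulus used here as a target metric is typically estimated from a set of strength measurements. The sketch below applies a median-rank regression estimator to synthetic strengths drawn from a known Weibull distribution; all numbers are illustrative, not program data.

```python
# Sketch: estimating the Weibull modulus m from strength data via
# linear regression of ln(ln(1/(1-F))) on ln(sigma). Strengths are
# synthetic, generated with a known modulus so the estimate can be
# checked; they are not measurements from the Si3N4 program.
import math
import random

def weibull_modulus(strengths):
    """Least-squares (median-rank) estimate of the Weibull modulus."""
    s = sorted(strengths)
    n = len(s)
    xs = [math.log(v) for v in s]
    # Median-rank probability estimator F_i = (i - 0.3) / (n + 0.4)
    ys = [math.log(math.log(1.0 / (1.0 - (i + 1 - 0.3) / (n + 0.4))))
          for i in range(n)]
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den  # the regression slope is the Weibull modulus m

random.seed(0)
# Synthetic strengths from Weibull(m=10, scale=600 MPa) by inversion
data = [600.0 * (-math.log(1.0 - random.random())) ** (1.0 / 10.0)
        for _ in range(200)]
m_est = weibull_modulus(data)
```

    Doubling the Weibull modulus, as the program targets, corresponds to a much narrower strength distribution and hence more reliable components.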

  11. Optimal Design of Experiments by Combining Coarse and Fine Measurements

    NASA Astrophysics Data System (ADS)

    Lee, Alpha A.; Brenner, Michael P.; Colwell, Lucy J.

    2017-11-01

    In many contexts, it is extremely costly to perform enough high-quality experimental measurements to accurately parametrize a predictive quantitative model. However, it is often much easier to carry out large numbers of experiments that indicate whether each sample is above or below a given threshold. Can many such categorical or "coarse" measurements be combined with a much smaller number of high-resolution or "fine" measurements to yield accurate models? Here, we demonstrate an intuitive strategy, inspired by statistical physics, wherein the coarse measurements are used to identify the salient features of the data, while the fine measurements determine the relative importance of these features. A linear model is inferred from the fine measurements, augmented by a quadratic term that captures the correlation structure of the coarse data. We illustrate our strategy by considering the problems of predicting the antimalarial potency and aqueous solubility of small organic molecules from their 2D molecular structure.
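    A toy version of this two-step strategy can be sketched as follows. Here a simple correlation score between features and the coarse binary labels stands in for the paper's quadratic correlation term, and all data, dimensions, and thresholds are invented for illustration.

```python
# Toy sketch of the coarse-plus-fine idea: many cheap binary labels
# select salient features, then a few precise measurements fit the
# weights. Data, feature count, and the selection rule are invented.
import random

random.seed(1)
d, n_coarse, n_fine = 8, 400, 40
true_w = [2.0, -1.5, 1.0] + [0.0] * (d - 3)   # only 3 features matter

def y(x):
    """Underlying (hidden) response used to generate labels."""
    return sum(w * xi for w, xi in zip(true_w, x))

X = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n_coarse)]
coarse = [1 if y(x) > 0 else 0 for x in X]     # cheap threshold labels

# Step 1: score each feature by its correlation with the coarse label
def score(j):
    return abs(sum(x[j] * (c - 0.5) for x, c in zip(X, coarse)) / n_coarse)

salient = sorted(range(d), key=score, reverse=True)[:3]

# Step 2: fit weights from the few fine measurements, salient features
# only (per-feature least squares; features here are independent)
Xf = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n_fine)]
yf = [y(x) for x in Xf]
w_hat = {j: sum(x[j] * t for x, t in zip(Xf, yf)) /
            sum(x[j] ** 2 for x in Xf) for j in salient}
```

    The cheap labels reliably pick out the three informative features, after which only 40 fine measurements suffice to recover their weights, mirroring the coarse/fine division of labor described in the abstract.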

  12. Search for high mass dilepton resonances in pp collisions at √s = 7 TeV with the ATLAS experiment

    NASA Astrophysics Data System (ADS)

    Aad, G.; Abbott, B.; Abdallah, J.; Abdelalim, A. A.; Abdesselam, A.; Abdinov, O.; Abi, B.; Abolins, M.; Abramowicz, H.; Abreu, H.; Acerbi, E.; Acharya, B. S.; Adams, D. L.; Addy, T. N.; Adelman, J.; Aderholz, M.; Adomeit, S.; Adragna, P.; Adye, T.; Aefsky, S.; Aguilar-Saavedra, J. A.; Aharrouche, M.; Ahlen, S. P.; Ahles, F.; Ahmad, A.; Ahsan, M.; Aielli, G.; Akdogan, T.; Åkesson, T. P. A.; Akimoto, G.; Akimov, A. V.; Akiyama, A.; Alam, M. S.; Alam, M. A.; Albrand, S.; Aleksa, M.; Aleksandrov, I. N.; Alessandria, F.; Alexa, C.; Alexander, G.; Alexandre, G.; Alexopoulos, T.; Alhroob, M.; Aliev, M.; Alimonti, G.; Alison, J.; Aliyev, M.; Allport, P. P.; Allwood-Spiers, S. E.; Almond, J.; Aloisio, A.; Alon, R.; Alonso, A.; Alviggi, M. G.; Amako, K.; Amaral, P.; Amelung, C.; Ammosov, V. V.; Amorim, A.; Amorós, G.; Amram, N.; Anastopoulos, C.; Andeen, T.; Anders, C. F.; Anderson, K. J.; Andreazza, A.; Andrei, V.; Andrieux, M.-L.; Anduaga, X. S.; Angerami, A.; Anghinolfi, F.; Anjos, N.; Annovi, A.; Antonaki, A.; Antonelli, M.; Antonelli, S.; Antonov, A.; Antos, J.; Anulli, F.; Aoun, S.; Aperio Bella, L.; Apolle, R.; Arabidze, G.; Aracena, I.; Arai, Y.; Arce, A. T. H.; Archambault, J. P.; Arfaoui, S.; Arguin, J.-F.; Arik, E.; Arik, M.; Armbruster, A. J.; Arnaez, O.; Arnault, C.; Artamonov, A.; Artoni, G.; Arutinov, D.; Asai, S.; Asfandiyarov, R.; Ask, S.; Åsman, B.; Asquith, L.; Assamagan, K.; Astbury, A.; Astvatsatourov, A.; Atoian, G.; Aubert, B.; Auerbach, B.; Auge, E.; Augsten, K.; Aurousseau, M.; Austin, N.; Avramidou, R.; Axen, D.; Ay, C.; Azuelos, G.; Azuma, Y.; Baak, M. A.; Baccaglioni, G.; Bacci, C.; Bach, A. M.; Bachacou, H.; Bachas, K.; Bachy, G.; Backes, M.; Backhaus, M.; Badescu, E.; Bagnaia, P.; Bahinipati, S.; Bai, Y.; Bailey, D. C.; Bain, T.; Baines, J. T.; Baker, O. K.; Baker, M. D.; Baker, S.; Baltasar Dos Santos Pedrosa, F.; Banas, E.; Banerjee, P.; Banerjee, Sw.; Banfi, D.; Bangert, A.; Bansal, V.; Bansil, H. S.; Barak, L.; Baranov, S. 
P.; Barashkou, A.; Barbaro Galtieri, A.; Barber, T.; Barberio, E. L.; Barberis, D.; Barbero, M.; Bardin, D. Y.; Barillari, T.; Barisonzi, M.; Barklow, T.; Barlow, N.; Barnett, B. M.; Barnett, R. M.; Baroncelli, A.; Barr, A. J.; Barreiro, F.; Barreiro Guimarães da Costa, J.; Barrillon, P.; Bartoldus, R.; Barton, A. E.; Bartsch, D.; Bartsch, V.; Bates, R. L.; Batkova, L.; Batley, J. R.; Battaglia, A.; Battistin, M.; Battistoni, G.; Bauer, F.; Bawa, H. S.; Beare, B.; Beau, T.; Beauchemin, P. H.; Beccherle, R.; Bechtle, P.; Beck, H. P.; Beckingham, M.; Becks, K. H.; Beddall, A. J.; Beddall, A.; Bedikian, S.; Bednyakov, V. A.; Bee, C. P.; Begel, M.; Behar Harpaz, S.; Behera, P. K.; Beimforde, M.; Belanger-Champagne, C.; Bell, P. J.; Bell, W. H.; Bella, G.; Bellagamba, L.; Bellina, F.; Bellomo, M.; Belloni, A.; Beloborodova, O.; Belotskiy, K.; Beltramello, O.; Ben Ami, S.; Benary, O.; Benchekroun, D.; Benchouk, C.; Bendel, M.; Benedict, B. H.; Benekos, N.; Benhammou, Y.; Benjamin, D. P.; Benoit, M.; Bensinger, J. R.; Benslama, K.; Bentvelsen, S.; Berge, D.; Bergeaas Kuutmann, E.; Berger, N.; Berghaus, F.; Berglund, E.; Beringer, J.; Bernardet, K.; Bernat, P.; Bernhard, R.; Bernius, C.; Berry, T.; Bertin, A.; Bertinelli, F.; Bertolucci, F.; Besana, M. I.; Besson, N.; Bethke, S.; Bhimji, W.; Bianchi, R. M.; Bianco, M.; Biebel, O.; Bieniek, S. P.; Biesiada, J.; Biglietti, M.; Bilokon, H.; Bindi, M.; Binet, S.; Bingul, A.; Bini, C.; Biscarat, C.; Bitenc, U.; Black, K. M.; Blair, R. E.; Blanchard, J.-B.; Blanchot, G.; Blocker, C.; Blocki, J.; Blondel, A.; Blum, W.; Blumenschein, U.; Bobbink, G. J.; Bobrovnikov, V. B.; Bocchetta, S. S.; Bocci, A.; Boddy, C. R.; Boehler, M.; Boek, J.; Boelaert, N.; Böser, S.; Bogaerts, J. A.; Bogdanchikov, A.; Bogouch, A.; Bohm, C.; Boisvert, V.; Bold, T.; Boldea, V.; Bolnet, N. M.; Bona, M.; Bondarenko, V. G.; Boonekamp, M.; Boorman, G.; Booth, C. 
N.; Booth, P.; Bordoni, S.; Borer, C.; Borisov, A.; Borissov, G.; Borjanovic, I.; Borroni, S.; Bos, K.; Boscherini, D.; Bosman, M.; Boterenbrood, H.; Botterill, D.; Bouchami, J.; Boudreau, J.; Bouhova-Thacker, E. V.; Boulahouache, C.; Bourdarios, C.; Bousson, N.; Boveia, A.; Boyd, J.; Boyko, I. R.; Bozhko, N. I.; Bozovic-Jelisavcic, I.; Bracinik, J.; Braem, A.; Branchini, P.; Brandenburg, G. W.; Brandt, A.; Brandt, G.; Brandt, O.; Bratzler, U.; Brau, B.; Brau, J. E.; Braun, H. M.; Brelier, B.; Bremer, J.; Brenner, R.; Bressler, S.; Breton, D.; Brett, N. D.; Britton, D.; Brochu, F. M.; Brock, I.; Brock, R.; Brodbeck, T. J.; Brodet, E.; Broggi, F.; Bromberg, C.; Brooijmans, G.; Brooks, W. K.; Brown, G.; Brubaker, E.; Bruckman de Renstrom, P. A.; Bruncko, D.; Bruneliere, R.; Brunet, S.; Bruni, A.; Bruni, G.; Bruschi, M.; Buanes, T.; Bucci, F.; Buchanan, J.; Buchanan, N. J.; Buchholz, P.; Buckingham, R. M.; Buckley, A. G.; Buda, S. I.; Budagov, I. A.; Budick, B.; Büscher, V.; Bugge, L.; Buira-Clark, D.; Buis, E. J.; Bulekov, O.; Bunse, M.; Buran, T.; Burckhart, H.; Burdin, S.; Burgess, T.; Burke, S.; Busato, E.; Bussey, P.; Buszello, C. P.; Butin, F.; Butler, B.; Butler, J. M.; Buttar, C. M.; Butterworth, J. M.; Buttinger, W.; Byatt, T.; Cabrera Urbán, S.; Caforio, D.; Cakir, O.; Calafiura, P.; Calderini, G.; Calfayan, P.; Calkins, R.; Caloba, L. P.; Caloi, R.; Calvet, D.; Calvet, S.; Camacho Toro, R.; Camard, A.; Camarri, P.; Cambiaghi, M.; Cameron, D.; Cammin, J.; Campana, S.; Campanelli, M.; Canale, V.; Canelli, F.; Canepa, A.; Cantero, J.; Capasso, L.; Capeans Garrido, M. D. M.; Caprini, I.; Caprini, M.; Capriotti, D.; Capua, M.; Caputo, R.; Caramarcu, C.; Cardarelli, R.; Carli, T.; Carlino, G.; Carminati, L.; Caron, B.; Caron, S.; Carpentieri, C.; Carrillo Montoya, G. D.; Carter, A. A.; Carter, J. R.; Carvalho, J.; Casadei, D.; Casado, M. P.; Cascella, M.; Caso, C.; Castaneda Hernandez, A. M.; Castaneda-Miranda, E.; Castillo Gimenez, V.; Castro, N. 
F.; Cataldi, G.; Cataneo, F.; Catinaccio, A.; Catmore, J. R.; Cattai, A.; Cattani, G.; Caughron, S.; Cauz, D.; Cavallari, A.; Cavalleri, P.; Cavalli, D.; Cavalli-Sforza, M.; Cavasinni, V.; Cazzato, A.; Ceradini, F.; Cerqueira, A. S.; Cerri, A.; Cerrito, L.; Cerutti, F.; Cetin, S. A.; Cevenini, F.; Chafaq, A.; Chakraborty, D.; Chan, K.; Chapleau, B.; Chapman, J. D.; Chapman, J. W.; Chareyre, E.; Charlton, D. G.; Chavda, V.; Cheatham, S.; Chekanov, S.; Chekulaev, S. V.; Chelkov, G. A.; Chelstowska, M. A.; Chen, C.; Chen, H.; Chen, L.; Chen, S.; Chen, T.; Chen, X.; Cheng, S.; Cheplakov, A.; Chepurnov, V. F.; Cherkaoui El Moursli, R.; Chernyatin, V.; Cheu, E.; Cheung, S. L.; Chevalier, L.; Chiefari, G.; Chikovani, L.; Childers, J. T.; Chilingarov, A.; Chiodini, G.; Chizhov, M. V.; Choudalakis, G.; Chouridou, S.; Christidi, I. A.; Christov, A.; Chromek-Burckhart, D.; Chu, M. L.; Chudoba, J.; Ciapetti, G.; Ciba, K.; Ciftci, A. K.; Ciftci, R.; Cinca, D.; Cindro, V.; Ciobotaru, M. D.; Ciocca, C.; Ciocio, A.; Cirilli, M.; Ciubancan, M.; Clark, A.; Clark, P. J.; Cleland, W.; Clemens, J. C.; Clement, B.; Clement, C.; Clifft, R. W.; Coadou, Y.; Cobal, M.; Coccaro, A.; Cochran, J.; Coe, P.; Cogan, J. G.; Coggeshall, J.; Cogneras, E.; Cojocaru, C. D.; Colas, J.; Colijn, A. P.; Collard, C.; Collins, N. J.; Collins-Tooth, C.; Collot, J.; Colon, G.; Comune, G.; Conde Muiño, P.; Coniavitis, E.; Conidi, M. C.; Consonni, M.; Constantinescu, S.; Conta, C.; Conventi, F.; Cook, J.; Cooke, M.; Cooper, B. D.; Cooper-Sarkar, A. M.; Cooper-Smith, N. J.; Copic, K.; Cornelissen, T.; Corradi, M.; Corriveau, F.; Cortes-Gonzalez, A.; Cortiana, G.; Costa, G.; Costa, M. J.; Costanzo, D.; Costin, T.; Côté, D.; Coura Torres, R.; Courneyea, L.; Cowan, G.; Cowden, C.; Cox, B. E.; Cranmer, K.; Crescioli, F.; Cristinziani, M.; Crosetti, G.; Crupi, R.; Crépé-Renaudin, S.; Cuenca Almenar, C.; Cuhadar Donszelmann, T.; Cuneo, S.; Curatolo, M.; Curtis, C. 
J.; Cwetanski, P.; Czirr, H.; Czyczula, Z.; D'Auria, S.; D'Onofrio, M.; D'Orazio, A.; da Rocha Gesualdi Mello, A.; da Silva, P. V. M.; da Via, C.; Dabrowski, W.; Dahlhoff, A.; Dai, T.; Dallapiccola, C.; Dallison, S. J.; Dam, M.; Dameri, M.; Damiani, D. S.; Danielsson, H. O.; Dankers, R.; Dannheim, D.; Dao, V.; Darbo, G.; Darlea, G. L.; Daum, C.; Dauvergne, J. P.; Davey, W.; Davidek, T.; Davidson, N.; Davidson, R.; Davies, M.; Davison, A. R.; Dawe, E.; Dawson, I.; Dawson, J. W.; Daya, R. K.; de, K.; de Asmundis, R.; de Castro, S.; de Castro Faria Salgado, P. E.; de Cecco, S.; de Graat, J.; de Groot, N.; de Jong, P.; de La Taille, C.; de la Torre, H.; de Lotto, B.; de Mora, L.; de Nooij, L.; de Oliveira Branco, M.; de Pedis, D.; de Saintignon, P.; de Salvo, A.; de Sanctis, U.; de Santo, A.; de Vivie de Regie, J. B.; Dean, S.; Dedovich, D. V.; Degenhardt, J.; Dehchar, M.; Deile, M.; Del Papa, C.; Del Peso, J.; Del Prete, T.; Dell'Acqua, A.; Dell'Asta, L.; Della Pietra, M.; Della Volpe, D.; Delmastro, M.; Delpierre, P.; Delruelle, N.; Delsart, P. A.; Deluca, C.; Demers, S.; Demichev, M.; Demirkoz, B.; Deng, J.; Denisov, S. P.; Derendarz, D.; Derkaoui, J. E.; Derue, F.; Dervan, P.; Desch, K.; Devetak, E.; Deviveiros, P. O.; Dewhurst, A.; Dewilde, B.; Dhaliwal, S.; Dhullipudi, R.; di Ciaccio, A.; di Ciaccio, L.; di Girolamo, A.; di Girolamo, B.; di Luise, S.; di Mattia, A.; di Micco, B.; di Nardo, R.; di Simone, A.; di Sipio, R.; Diaz, M. A.; Diblen, F.; Diehl, E. B.; Dietl, H.; Dietrich, J.; Dietzsch, T. A.; Diglio, S.; Dindar Yagci, K.; Dingfelder, J.; Dionisi, C.; Dita, P.; Dita, S.; Dittus, F.; Djama, F.; Djilkibaev, R.; Djobava, T.; Do Vale, M. A. B.; Do Valle Wemans, A.; Doan, T. K. O.; Dobbs, M.; Dobinson, R.; Dobos, D.; Dobson, E.; Dobson, M.; Dodd, J.; Dogan, O. B.; Doglioni, C.; Doherty, T.; Doi, Y.; Dolejsi, J.; Dolenc, I.; Dolezal, Z.; Dolgoshein, B. 
A.; Dohmae, T.; Donadelli, M.; Donega, M.; Donini, J.; Dopke, J.; Doria, A.; Dos Anjos, A.; Dosil, M.; Dotti, A.; Dova, M. T.; Dowell, J. D.; Doxiadis, A. D.; Doyle, A. T.; Drasal, Z.; Drees, J.; Dressnandt, N.; Drevermann, H.; Driouichi, C.; Dris, M.; Drohan, J. G.; Dubbert, J.; Dubbs, T.; Dube, S.; Duchovni, E.; Duckeck, G.; Dudarev, A.; Dudziak, F.; Dührssen, M.; Duerdoth, I. P.; Duflot, L.; Dufour, M.-A.; Dunford, M.; Duran Yildiz, H.; Duxfield, R.; Dwuznik, M.; Dydak, F.; Dzahini, D.; Düren, M.; Ebenstein, W. L.; Ebke, J.; Eckert, S.; Eckweiler, S.; Edmonds, K.; Edwards, C. A.; Ehrenfeld, W.; Ehrich, T.; Eifert, T.; Eigen, G.; Einsweiler, K.; Eisenhandler, E.; Ekelof, T.; El Kacimi, M.; Ellert, M.; Elles, S.; Ellinghaus, F.; Ellis, K.; Ellis, N.; Elmsheuser, J.; Elsing, M.; Ely, R.; Emeliyanov, D.; Engelmann, R.; Engl, A.; Epp, B.; Eppig, A.; Erdmann, J.; Ereditato, A.; Eriksson, D.; Ernst, J.; Ernst, M.; Ernwein, J.; Errede, D.; Errede, S.; Ertel, E.; Escalier, M.; Escobar, C.; Espinal Curull, X.; Esposito, B.; Etienne, F.; Etienvre, A. I.; Etzion, E.; Evangelakou, D.; Evans, H.; Fabbri, L.; Fabre, C.; Facius, K.; Fakhrutdinov, R. M.; Falciano, S.; Falou, A. C.; Fang, Y.; Fanti, M.; Farbin, A.; Farilla, A.; Farley, J.; Farooque, T.; Farrington, S. M.; Farthouat, P.; Fasching, D.; Fassnacht, P.; Fassouliotis, D.; Fatholahzadeh, B.; Favareto, A.; Fayard, L.; Fazio, S.; Febbraro, R.; Federic, P.; Fedin, O. L.; Fedorko, I.; Fedorko, W.; Fehling-Kaschek, M.; Feligioni, L.; Fellmann, D.; Felzmann, C. U.; Feng, C.; Feng, E. J.; Fenyuk, A. B.; Ferencei, J.; Ferland, J.; Fernandes, B.; Fernando, W.; Ferrag, S.; Ferrando, J.; Ferrara, V.; Ferrari, A.; Ferrari, P.; Ferrari, R.; Ferrer, A.; Ferrer, M. L.; Ferrere, D.; Ferretti, C.; Ferretto Parodi, A.; Fiascaris, M.; Fiedler, F.; Filipčič, A.; Filippas, A.; Filthaut, F.; Fincke-Keeler, M.; Fiolhais, M. C. N.; Fiorini, L.; Firan, A.; Fischer, G.; Fischer, P.; Fisher, M. J.; Fisher, S. 
M.; Flammer, J.; Flechl, M.; Fleck, I.; Fleckner, J.; Fleischmann, P.; Fleischmann, S.; Flick, T.; Flores Castillo, L. R.; Flowerdew, M. J.; Föhlisch, F.; Fokitis, M.; Fonseca Martin, T.; Forbush, D. A.; Formica, A.; Forti, A.; Fortin, D.; Foster, J. M.; Fournier, D.; Foussat, A.; Fowler, A. J.; Fowler, K.; Fox, H.; Francavilla, P.; Franchino, S.; Francis, D.; Frank, T.; Franklin, M.; Franz, S.; Fraternali, M.; Fratina, S.; French, S. T.; Froeschl, R.; Froidevaux, D.; Frost, J. A.; Fukunaga, C.; Fullana Torregrosa, E.; Fuster, J.; Gabaldon, C.; Gabizon, O.; Gadfort, T.; Gadomski, S.; Gagliardi, G.; Gagnon, P.; Galea, C.; Gallas, E. J.; Gallas, M. V.; Gallo, V.; Gallop, B. J.; Gallus, P.; Galyaev, E.; Gan, K. K.; Gao, Y. S.; Gapienko, V. A.; Gaponenko, A.; Garberson, F.; Garcia-Sciveres, M.; García, C.; García Navarro, J. E.; Gardner, R. W.; Garelli, N.; Garitaonandia, H.; Garonne, V.; Garvey, J.; Gatti, C.; Gaudio, G.; Gaumer, O.; Gaur, B.; Gauthier, L.; Gavrilenko, I. L.; Gay, C.; Gaycken, G.; Gayde, J.-C.; Gazis, E. N.; Ge, P.; Gee, C. N. P.; Geerts, D. A. A.; Geich-Gimbel, Ch.; Gellerstedt, K.; Gemme, C.; Gemmell, A.; Genest, M. H.; Gentile, S.; George, M.; George, S.; Gerlach, P.; Gershon, A.; Geweniger, C.; Ghazlane, H.; Ghez, P.; Ghodbane, N.; Giacobbe, B.; Giagu, S.; Giakoumopoulou, V.; Giangiobbe, V.; Gianotti, F.; Gibbard, B.; Gibson, A.; Gibson, S. M.; Gieraltowski, G. F.; Gilbert, L. M.; Gilchriese, M.; Gilewsky, V.; Gillberg, D.; Gillman, A. R.; Gingrich, D. M.; Ginzburg, J.; Giokaris, N.; Giordano, R.; Giorgi, F. M.; Giovannini, P.; Giraud, P. F.; Giugni, D.; Giusti, P.; Gjelsten, B. K.; Gladilin, L. K.; Glasman, C.; Glatzer, J.; Glazov, A.; Glitza, K. W.; Glonti, G. L.; Godfrey, J.; Godlewski, J.; Goebel, M.; Göpfert, T.; Goeringer, C.; Gössling, C.; Göttfert, T.; Goldfarb, S.; Goldin, D.; Golling, T.; Golovnia, S. N.; Gomes, A.; Gomez Fajardo, L. 
S.; Gonçalo, R.; Goncalves Pinto Firmino da Costa, J.; Gonella, L.; Gonidec, A.; Gonzalez, S.; González de La Hoz, S.; Gonzalez Silva, M. L.; Gonzalez-Sevilla, S.; Goodson, J. J.; Goossens, L.; Gorbounov, P. A.; Gordon, H. A.; Gorelov, I.; Gorfine, G.; Gorini, B.; Gorini, E.; Gorišek, A.; Gornicki, E.; Gorokhov, S. A.; Goryachev, V. N.; Gosdzik, B.; Gosselink, M.; Gostkin, M. I.; Gouanère, M.; Gough Eschrich, I.; Gouighri, M.; Goujdami, D.; Goulette, M. P.; Goussiou, A. G.; Goy, C.; Grabowska-Bold, I.; Grabski, V.; Grafström, P.; Grah, C.; Grahn, K.-J.; Grancagnolo, F.; Grancagnolo, S.; Grassi, V.; Gratchev, V.; Grau, N.; Gray, H. M.; Gray, J. A.; Graziani, E.; Grebenyuk, O. G.; Greenfield, D.; Greenshaw, T.; Greenwood, Z. D.; Gregor, I. M.; Grenier, P.; Griesmayer, E.; Griffiths, J.; Grigalashvili, N.; Grillo, A. A.; Grinstein, S.; Gris, P. L. Y.; Grishkevich, Y. V.; Grivaz, J.-F.; Grognuz, J.; Groh, M.; Gross, E.; Grosse-Knetter, J.; Groth-Jensen, J.; Gruwe, M.; Grybel, K.; Guarino, V. J.; Guest, D.; Guicheney, C.; Guida, A.; Guillemin, T.; Guindon, S.; Guler, H.; Gunther, J.; Guo, B.; Guo, J.; Gupta, A.; Gusakov, Y.; Gushchin, V. N.; Gutierrez, A.; Gutierrez, P.; Guttman, N.; Gutzwiller, O.; Guyot, C.; Gwenlan, C.; Gwilliam, C. B.; Haas, A.; Haas, S.; Haber, C.; Hackenburg, R.; Hadavand, H. K.; Hadley, D. R.; Haefner, P.; Hahn, F.; Haider, S.; Hajduk, Z.; Hakobyan, H.; Haller, J.; Hamacher, K.; Hamal, P.; Hamilton, A.; Hamilton, S.; Han, H.; Han, L.; Hanagaki, K.; Hance, M.; Handel, C.; Hanke, P.; Hansen, C. J.; Hansen, J. R.; Hansen, J. B.; Hansen, J. D.; Hansen, P. H.; Hansson, P.; Hara, K.; Hare, G. A.; Harenberg, T.; Harper, D.; Harrington, R. D.; Harris, O. M.; Harrison, K.; Hartert, J.; Hartjes, F.; Haruyama, T.; Harvey, A.; Hasegawa, S.; Hasegawa, Y.; Hassani, S.; Hatch, M.; Hauff, D.; Haug, S.; Hauschild, M.; Hauser, R.; Havranek, M.; Hawes, B. M.; Hawkes, C. M.; Hawkings, R. J.; Hawkins, D.; Hayakawa, T.; Hayden, D.; Hayward, H. S.; Haywood, S. 
J.; Hazen, E.; He, M.; Head, S. J.; Hedberg, V.; Heelan, L.; Heim, S.; Heinemann, B.; Heisterkamp, S.; Helary, L.; Heldmann, M.; Heller, M.; Hellman, S.; Helsens, C.; Henderson, R. C. W.; Henke, M.; Henrichs, A.; Henriques Correia, A. M.; Henrot-Versille, S.; Henry-Couannier, F.; Hensel, C.; Henß, T.; Hernández Jiménez, Y.; Herrberg, R.; Hershenhorn, A. D.; Herten, G.; Hertenberger, R.; Hervas, L.; Hessey, N. P.; Hidvegi, A.; Higón-Rodriguez, E.; Hill, D.; Hill, J. C.; Hill, N.; Hiller, K. H.; Hillert, S.; Hillier, S. J.; Hinchliffe, I.; Hines, E.; Hirose, M.; Hirsch, F.; Hirschbuehl, D.; Hobbs, J.; Hod, N.; Hodgkinson, M. C.; Hodgson, P.; Hoecker, A.; Hoeferkamp, M. R.; Hoffman, J.; Hoffmann, D.; Hohlfeld, M.; Holder, M.; Holmes, A.; Holmgren, S. O.; Holy, T.; Holzbauer, J. L.; Homma, Y.; Hooft van Huysduynen, L.; Horazdovsky, T.; Horn, C.; Horner, S.; Horton, K.; Hostachy, J.-Y.; Hou, S.; Houlden, M. A.; Hoummada, A.; Howarth, J.; Howell, D. F.; Hristova, I.; Hrivnac, J.; Hruska, I.; Hryn'ova, T.; Hsu, P. J.; Hsu, S.-C.; Huang, G. S.; Hubacek, Z.; Hubaut, F.; Huegging, F.; Huffman, T. B.; Hughes, E. W.; Hughes, G.; Hughes-Jones, R. E.; Huhtinen, M.; Hurst, P.; Hurwitz, M.; Husemann, U.; Huseynov, N.; Huston, J.; Huth, J.; Iacobucci, G.; Iakovidis, G.; Ibbotson, M.; Ibragimov, I.; Ichimiya, R.; Iconomidou-Fayard, L.; Idarraga, J.; Idzik, M.; Iengo, P.; Igonkina, O.; Ikegami, Y.; Ikeno, M.; Ilchenko, Y.; Iliadis, D.; Imbault, D.; Imhaeuser, M.; Imori, M.; Ince, T.; Inigo-Golfin, J.; Ioannou, P.; Iodice, M.; Ionescu, G.; Irles Quiles, A.; Ishii, K.; Ishikawa, A.; Ishino, M.; Ishmukhametov, R.; Issever, C.; Istin, S.; Itoh, Y.; Ivashin, A. V.; Iwanski, W.; Iwasaki, H.; Izen, J. M.; Izzo, V.; Jackson, B.; Jackson, J. N.; Jackson, P.; Jaekel, M. R.; Jain, V.; Jakobs, K.; Jakobsen, S.; Jakubek, J.; Jana, D. K.; Jankowski, E.; Jansen, E.; Jantsch, A.; Janus, M.; Jarlskog, G.; Jeanty, L.; Jelen, K.; Jen-La Plante, I.; Jenni, P.; Jeremie, A.; Jež, P.; Jézéquel, S.; Jha, M. 
K.; Ji, H.; Ji, W.; Jia, J.; Jiang, Y.; Jimenez Belenguer, M.; Jin, G.; Jin, S.; Jinnouchi, O.; Joergensen, M. D.; Joffe, D.; Johansen, L. G.; Johansen, M.; Johansson, K. E.; Johansson, P.; Johnert, S.; Johns, K. A.; Jon-And, K.; Jones, G.; Jones, R. W. L.; Jones, T. W.; Jones, T. J.; Jonsson, O.; Joram, C.; Jorge, P. M.; Joseph, J.; Ju, X.; Juranek, V.; Jussel, P.; Kabachenko, V. V.; Kabana, S.; Kaci, M.; Kaczmarska, A.; Kadlecik, P.; Kado, M.; Kagan, H.; Kagan, M.; Kaiser, S.; Kajomovitz, E.; Kalinin, S.; Kalinovskaya, L. V.; Kama, S.; Kanaya, N.; Kaneda, M.; Kanno, T.; Kantserov, V. A.; Kanzaki, J.; Kaplan, B.; Kapliy, A.; Kaplon, J.; Kar, D.; Karagoz, M.; Karnevskiy, M.; Karr, K.; Kartvelishvili, V.; Karyukhin, A. N.; Kashif, L.; Kasmi, A.; Kass, R. D.; Kastanas, A.; Kataoka, M.; Kataoka, Y.; Katsoufis, E.; Katzy, J.; Kaushik, V.; Kawagoe, K.; Kawamoto, T.; Kawamura, G.; Kayl, M. S.; Kazanin, V. A.; Kazarinov, M. Y.; Kazi, S. I.; Keates, J. R.; Keeler, R.; Kehoe, R.; Keil, M.; Kekelidze, G. D.; Kelly, M.; Kennedy, J.; Kenney, C. J.; Kenyon, M.; Kepka, O.; Kerschen, N.; Kerševan, B. P.; Kersten, S.; Kessoku, K.; Ketterer, C.; Khakzad, M.; Khalil-Zada, F.; Khandanyan, H.; Khanov, A.; Kharchenko, D.; Khodinov, A.; Kholodenko, A. G.; Khomich, A.; Khoo, T. J.; Khoriauli, G.; Khovanskiy, N.; Khovanskiy, V.; Khramov, E.; Khubua, J.; Kilvington, G.; Kim, H.; Kim, M. S.; Kim, P. C.; Kim, S. H.; Kimura, N.; Kind, O.; King, B. T.; King, M.; King, R. S. B.; Kirk, J.; Kirsch, G. P.; Kirsch, L. E.; Kiryunin, A. E.; Kisielewska, D.; Kittelmann, T.; Kiver, A. M.; Kiyamura, H.; Kladiva, E.; Klaiber-Lodewigs, J.; Klein, M.; Klein, U.; Kleinknecht, K.; Klemetti, M.; Klier, A.; Klimentov, A.; Klingenberg, R.; Klinkby, E. B.; Klioutchnikova, T.; Klok, P. F.; Klous, S.; Kluge, E.-E.; Kluge, T.; Kluit, P.; Kluth, S.; Kneringer, E.; Knobloch, J.; Knoops, E. B. F. G.; Knue, A.; Ko, B. R.; Kobayashi, T.; Kobel, M.; Koblitz, B.; Kocian, M.; Kocnar, A.; Kodys, P.; Köneke, K.; König, A. 
C.; Koenig, S.; Köpke, L.; Koetsveld, F.; Koevesarki, P.; Koffas, T.; Koffeman, E.; Kohn, F.; Kohout, Z.; Kohriki, T.; Koi, T.; Kokott, T.; Kolachev, G. M.; Kolanoski, H.; Kolesnikov, V.; Koletsou, I.; Koll, J.; Kollar, D.; Kollefrath, M.; Kolya, S. D.; Komar, A. A.; Komaragiri, J. R.; Kondo, T.; Kono, T.; Kononov, A. I.; Konoplich, R.; Konstantinidis, N.; Kootz, A.; Koperny, S.; Kopikov, S. V.; Korcyl, K.; Kordas, K.; Koreshev, V.; Korn, A.; Korol, A.; Korolkov, I.; Korolkova, E. V.; Korotkov, V. A.; Kortner, O.; Kortner, S.; Kostyukhin, V. V.; Kotamäki, M. J.; Kotov, S.; Kotov, V. M.; Kotwal, A.; Kourkoumelis, C.; Kouskoura, V.; Koutsman, A.; Kowalewski, R.; Kowalski, H.; Kowalski, T. Z.; Kozanecki, W.; Kozhin, A. S.; Kral, V.; Kramarenko, V. A.; Kramberger, G.; Krasel, O.; Krasny, M. W.; Krasznahorkay, A.; Kraus, J.; Kreisel, A.; Krejci, F.; Kretzschmar, J.; Krieger, N.; Krieger, P.; Kroeninger, K.; Kroha, H.; Kroll, J.; Kroseberg, J.; Krstic, J.; Kruchonak, U.; Krüger, H.; Krumshteyn, Z. V.; Kruth, A.; Kubota, T.; Kuehn, S.; Kugel, A.; Kuhl, T.; Kuhn, D.; Kukhtin, V.; Kulchitsky, Y.; Kuleshov, S.; Kummer, C.; Kuna, M.; Kundu, N.; Kunkle, J.; Kupco, A.; Kurashige, H.; Kurata, M.; Kurochkin, Y. A.; Kus, V.; Kuykendall, W.; Kuze, M.; Kuzhir, P.; Kvasnicka, O.; Kvita, J.; Kwee, R.; La Rosa, A.; La Rotonda, L.; Labarga, L.; Labbe, J.; Lablak, S.; Lacasta, C.; Lacava, F.; Lacker, H.; Lacour, D.; Lacuesta, V. R.; Ladygin, E.; Lafaye, R.; Laforge, B.; Lagouri, T.; Lai, S.; Laisne, E.; Lamanna, M.; Lampen, C. L.; Lampl, W.; Lancon, E.; Landgraf, U.; Landon, M. P. J.; Landsman, H.; Lane, J. L.; Lange, C.; Lankford, A. J.; Lanni, F.; Lantzsch, K.; Lapin, V. V.; Laplace, S.; Lapoire, C.; Laporte, J. F.; Lari, T.; Larionov, A. V.; Larner, A.; Lasseur, C.; Lassnig, M.; Lau, W.; Laurelli, P.; Lavorato, A.; Lavrijsen, W.; Laycock, P.; Lazarev, A. 
B.; Lazzaro, A.; Le Dortz, O.; Le Guirriec, E.; Le Maner, C.; Le Menedeu, E.; Lebedev, A.; Lebel, C.; Lecompte, T.; Ledroit-Guillon, F.; Lee, H.; Lee, J. S. H.; Lee, S. C.; Lee, L.; Lefebvre, M.; Legendre, M.; Leger, A.; Legeyt, B. C.; Legger, F.; Leggett, C.; Lehmacher, M.; Lehmann Miotto, G.; Lei, X.; Leite, M. A. L.; Leitner, R.; Lellouch, D.; Lellouch, J.; Leltchouk, M.; Lendermann, V.; Leney, K. J. C.; Lenz, T.; Lenzen, G.; Lenzi, B.; Leonhardt, K.; Leontsinis, S.; Leroy, C.; Lessard, J.-R.; Lesser, J.; Lester, C. G.; Leung Fook Cheong, A.; Levêque, J.; Levin, D.; Levinson, L. J.; Levitski, M. S.; Lewandowska, M.; Lewis, G. H.; Leyton, M.; Li, B.; Li, H.; Li, S.; Li, X.; Liang, Z.; Liang, Z.; Liberti, B.; Lichard, P.; Lichtnecker, M.; Lie, K.; Liebig, W.; Lifshitz, R.; Lilley, J. N.; Limbach, C.; Limosani, A.; Limper, M.; Lin, S. C.; Linde, F.; Linnemann, J. T.; Lipeles, E.; Lipinsky, L.; Lipniacka, A.; Liss, T. M.; Lissauer, D.; Lister, A.; Litke, A. M.; Liu, C.; Liu, D.; Liu, H.; Liu, J. B.; Liu, M.; Liu, S.; Liu, Y.; Livan, M.; Livermore, S. S. A.; Lleres, A.; Lloyd, S. L.; Lobodzinska, E.; Loch, P.; Lockman, W. S.; Lockwitz, S.; Loddenkoetter, T.; Loebinger, F. K.; Loginov, A.; Loh, C. W.; Lohse, T.; Lohwasser, K.; Lokajicek, M.; Loken, J.; Lombardo, V. P.; Long, R. E.; Lopes, L.; Lopez Mateos, D.; Losada, M.; Loscutoff, P.; Lo Sterzo, F.; Losty, M. J.; Lou, X.; Lounis, A.; Loureiro, K. F.; Love, J.; Love, P. A.; Lowe, A. J.; Lu, F.; Lu, L.; Lubatti, H. J.; Luci, C.; Lucotte, A.; Ludwig, A.; Ludwig, D.; Ludwig, I.; Ludwig, J.; Luehring, F.; Luijckx, G.; Lumb, D.; Luminari, L.; Lund, E.; Lund-Jensen, B.; Lundberg, B.; Lundberg, J.; Lundquist, J.; Lungwitz, M.; Lupi, A.; Lutz, G.; Lynn, D.; Lys, J.; Lytken, E.; Ma, H.; Ma, L. L.; Macana Goia, J. A.; Maccarrone, G.; Macchiolo, A.; Maček, B.; Machado Miguens, J.; Macina, D.; Mackeprang, R.; Madaras, R. J.; Mader, W. F.; Maenner, R.; Maeno, T.; Mättig, P.; Mättig, S.; Magalhaes Martins, P. 
J.; Magnoni, L.; Magradze, E.; Mahalalel, Y.; Mahboubi, K.; Mahout, G.; Maiani, C.; Maidantchik, C.; Maio, A.; Majewski, S.; Makida, Y.; Makovec, N.; Mal, P.; Malecki, Pa.; Malecki, P.; Maleev, V. P.; Malek, F.; Mallik, U.; Malon, D.; Maltezos, S.; Malyshev, V.; Malyukov, S.; Mameghani, R.; Mamuzic, J.; Manabe, A.; Mandelli, L.; Mandić, I.; Mandrysch, R.; Maneira, J.; Mangeard, P. S.; Manjavidze, I. D.; Mann, A.; Manning, P. M.; Manousakis-Katsikakis, A.; Mansoulie, B.; Manz, A.; Mapelli, A.; Mapelli, L.; March, L.; Marchand, J. F.; Marchese, F.; Marchiori, G.; Marcisovsky, M.; Marin, A.; Marino, C. P.; Marroquim, F.; Marshall, R.; Marshall, Z.; Martens, F. K.; Marti-Garcia, S.; Martin, A. J.; Martin, B.; Martin, B.; Martin, F. F.; Martin, J. P.; Martin, Ph.; Martin, T. A.; Martin Dit Latour, B.; Martinez, M.; Martinez Outschoorn, V.; Martyniuk, A. C.; Marx, M.; Marzano, F.; Marzin, A.; Masetti, L.; Mashimo, T.; Mashinistov, R.; Masik, J.; Maslennikov, A. L.; Maß, M.; Massa, I.; Massaro, G.; Massol, N.; Mastroberardino, A.; Masubuchi, T.; Mathes, M.; Matricon, P.; Matsumoto, H.; Matsunaga, H.; Matsushita, T.; Mattravers, C.; Maugain, J. M.; Maxfield, S. J.; Maximov, D. A.; May, E. N.; Mayne, A.; Mazini, R.; Mazur, M.; Mazzanti, M.; Mazzoni, E.; Mc Kee, S. P.; McCarn, A.; McCarthy, R. L.; McCarthy, T. G.; McCubbin, N. A.; McFarlane, K. W.; McFayden, J. A.; McGlone, H.; McHedlidze, G.; McLaren, R. A.; McLaughlan, T.; McMahon, S. J.; McPherson, R. A.; Meade, A.; Mechnich, J.; Mechtel, M.; Medinnis, M.; Meera-Lebbai, R.; Meguro, T.; Mehdiyev, R.; Mehlhase, S.; Mehta, A.; Meier, K.; Meinhardt, J.; Meirose, B.; Melachrinos, C.; Mellado Garcia, B. R.; Mendoza Navas, L.; Meng, Z.; Mengarelli, A.; Menke, S.; Menot, C.; Meoni, E.; Mercurio, K. M.; Mermod, P.; Merola, L.; Meroni, C.; Merritt, F. S.; Messina, A.; Metcalfe, J.; Mete, A. S.; Meuser, S.; Meyer, C.; Meyer, J.-P.; Meyer, J.; Meyer, J.; Meyer, T. C.; Meyer, W. T.; Miao, J.; Michal, S.; Micu, L.; Middleton, R. 
P.; Miele, P.; Migas, S.; Mijović, L.; Mikenberg, G.; Mikestikova, M.; Mikulec, B.; Mikuž, M.; Miller, D. W.; Miller, R. J.; Mills, W. J.; Mills, C.; Milov, A.; Milstead, D. A.; Milstein, D.; Minaenko, A. A.; Miñano, M.; Minashvili, I. A.; Mincer, A. I.; Mindur, B.; Mineev, M.; Ming, Y.; Mir, L. M.; Mirabelli, G.; Miralles Verge, L.; Misiejuk, A.; Mitrevski, J.; Mitrofanov, G. Y.; Mitsou, V. A.; Mitsui, S.; Miyagawa, P. S.; Miyazaki, K.; Mjörnmark, J. U.; Moa, T.; Mockett, P.; Moed, S.; Moeller, V.; Mönig, K.; Möser, N.; Mohapatra, S.; Mohn, B.; Mohr, W.; Mohrdieck-Möck, S.; Moisseev, A. M.; Moles-Valls, R.; Molina-Perez, J.; Moneta, L.; Monk, J.; Monnier, E.; Montesano, S.; Monticelli, F.; Monzani, S.; Moore, R. W.; Moorhead, G. F.; Mora Herrera, C.; Moraes, A.; Morais, A.; Morange, N.; Morello, G.; Moreno, D.; Moreno Llácer, M.; Morettini, P.; Morii, M.; Morin, J.; Morita, Y.; Morley, A. K.; Mornacchi, G.; Morone, M.-C.; Morozov, S. V.; Morris, J. D.; Moser, H. G.; Mosidze, M.; Moss, J.; Mount, R.; Mountricha, E.; Mouraviev, S. V.; Moyse, E. J. W.; Mudrinic, M.; Mueller, F.; Mueller, J.; Mueller, K.; Müller, T. A.; Muenstermann, D.; Muijs, A.; Muir, A.; Munwes, Y.; Murakami, K.; Murray, W. J.; Mussche, I.; Musto, E.; Myagkov, A. G.; Myska, M.; Nadal, J.; Nagai, K.; Nagano, K.; Nagasaka, Y.; Nairz, A. M.; Nakahama, Y.; Nakamura, K.; Nakano, I.; Nanava, G.; Napier, A.; Nash, M.; Nation, N. R.; Nattermann, T.; Naumann, T.; Navarro, G.; Neal, H. A.; Nebot, E.; Nechaeva, P. Yu.; Negri, A.; Negri, G.; Nektarijevic, S.; Nelson, A.; Nelson, S.; Nelson, T. K.; Nemecek, S.; Nemethy, P.; Nepomuceno, A. A.; Nessi, M.; Nesterov, S. Y.; Neubauer, M. S.; Neusiedl, A.; Neves, R. M.; Nevski, P.; Newman, P. R.; Nickerson, R. 
B.; Nicolaidou, R.; Nicolas, L.; Nicquevert, B.; Niedercorn, F.; Nielsen, J.; Niinikoski, T.; Nikiforov, A.; Nikolaenko, V.; Nikolaev, K.; Nikolic-Audit, I.; Nikolopoulos, K.; Nilsen, H.; Nilsson, P.; Ninomiya, Y.; Nisati, A.; Nishiyama, T.; Nisius, R.; Nodulman, L.; Nomachi, M.; Nomidis, I.; Nomoto, H.; Nordberg, M.; Nordkvist, B.; Norton, P. R.; Novakova, J.; Nozaki, M.; Nožička, M.; Nozka, L.; Nugent, I. M.; Nuncio-Quiroz, A.-E.; Nunes Hanninger, G.; Nunnemann, T.; Nurse, E.; Nyman, T.; O'Brien, B. J.; O'Neale, S. W.; O'Neil, D. C.; O'Shea, V.; Oakham, F. G.; Oberlack, H.; Ocariz, J.; Ochi, A.; Oda, S.; Odaka, S.; Odier, J.; Ogren, H.; Oh, A.; Oh, S. H.; Ohm, C. C.; Ohshima, T.; Ohshita, H.; Ohska, T. K.; Ohsugi, T.; Okada, S.; Okawa, H.; Okumura, Y.; Okuyama, T.; Olcese, M.; Olchevski, A. G.; Oliveira, M.; Oliveira Damazio, D.; Oliver Garcia, E.; Olivito, D.; Olszewski, A.; Olszowska, J.; Omachi, C.; Onofre, A.; Onyisi, P. U. E.; Oram, C. J.; Oreglia, M. J.; Orellana, F.; Oren, Y.; Orestano, D.; Orlov, I.; Oropeza Barrera, C.; Orr, R. S.; Ortega, E. O.; Osculati, B.; Ospanov, R.; Osuna, C.; Otero Y Garzon, G.; Ottersbach, J. P.; Ouchrif, M.; Ould-Saada, F.; Ouraou, A.; Ouyang, Q.; Owen, M.; Owen, S.; Øye, O. K.; Ozcan, V. E.; Ozturk, N.; Pacheco Pages, A.; Padilla Aranda, C.; Paganis, E.; Paige, F.; Pajchel, K.; Palestini, S.; Pallin, D.; Palma, A.; Palmer, J. D.; Pan, Y. B.; Panagiotopoulou, E.; Panes, B.; Panikashvili, N.; Panitkin, S.; Pantea, D.; Panuskova, M.; Paolone, V.; Paoloni, A.; Papadelis, A.; Papadopoulou, Th. D.; Paramonov, A.; Park, W.; Parker, M. A.; Parodi, F.; Parsons, J. A.; Parzefall, U.; Pasqualucci, E.; Passeri, A.; Pastore, F.; Pastore, Fr.; Pásztor, G.; Pataraia, S.; Patel, N.; Pater, J. R.; Patricelli, S.; Pauly, T.; Pecsy, M.; Pedraza Morales, M. I.; Peleganchuk, S. V.; Peng, H.; Pengo, R.; Penson, A.; Penwell, J.; Perantoni, M.; Perez, K.; Perez Cavalcanti, T.; Perez Codina, E.; Pérez García-Estañ, M. 
T.; Perez Reale, V.; Peric, I.; Perini, L.; Pernegger, H.; Perrino, R.; Perrodo, P.; Persembe, S.; Peshekhonov, V. D.; Peters, O.; Petersen, B. A.; Petersen, J.; Petersen, T. C.; Petit, E.; Petridis, A.; Petridou, C.; Petrolo, E.; Petrucci, F.; Petschull, D.; Petteni, M.; Pezoa, R.; Phan, A.; Phillips, A. W.; Phillips, P. W.; Piacquadio, G.; Piccaro, E.; Piccinini, M.; Pickford, A.; Piec, S. M.; Piegaia, R.; Pilcher, J. E.; Pilkington, A. D.; Pina, J.; Pinamonti, M.; Pinder, A.; Pinfold, J. L.; Ping, J.; Pinto, B.; Pirotte, O.; Pizio, C.; Placakyte, R.; Plamondon, M.; Plano, W. G.; Pleier, M.-A.; Pleskach, A. V.; Poblaguev, A.; Poddar, S.; Podlyski, F.; Poggioli, L.; Poghosyan, T.; Pohl, M.; Polci, F.; Polesello, G.; Policicchio, A.; Polini, A.; Poll, J.; Polychronakos, V.; Pomarede, D. M.; Pomeroy, D.; Pommès, K.; Pontecorvo, L.; Pope, B. G.; Popeneciu, G. A.; Popovic, D. S.; Poppleton, A.; Portell Bueso, X.; Porter, R.; Posch, C.; Pospelov, G. E.; Pospisil, S.; Potrap, I. N.; Potter, C. J.; Potter, C. T.; Poulard, G.; Poveda, J.; Prabhu, R.; Pralavorio, P.; Prasad, S.; Pravahan, R.; Prell, S.; Pretzl, K.; Pribyl, L.; Price, D.; Price, L. E.; Price, M. J.; Prichard, P. M.; Prieur, D.; Primavera, M.; Prokofiev, K.; Prokoshin, F.; Protopopescu, S.; Proudfoot, J.; Prudent, X.; Przysiezniak, H.; Psoroulas, S.; Ptacek, E.; Purdham, J.; Purohit, M.; Puzo, P.; Pylypchenko, Y.; Qian, J.; Qian, Z.; Qin, Z.; Quadt, A.; Quarrie, D. R.; Quayle, W. B.; Quinonez, F.; Raas, M.; Radescu, V.; Radics, B.; Rador, T.; Ragusa, F.; Rahal, G.; Rahimi, A. M.; Rahm, D.; Rajagopalan, S.; Rammensee, M.; Rammes, M.; Ramstedt, M.; Randrianarivony, K.; Ratoff, P. N.; Rauscher, F.; Rauter, E.; Raymond, M.; Read, A. L.; Rebuzzi, D. M.; Redelbach, A.; Redlinger, G.; Reece, R.; Reeves, K.; Reichold, A.; Reinherz-Aronis, E.; Reinsch, A.; Reisinger, I.; Reljic, D.; Rembser, C.; Ren, Z. 
L.; Renaud, A.; Renkel, P.; Rensch, B.; Rescigno, M.; Resconi, S.; Resende, B.; Reznicek, P.; Rezvani, R.; Richards, A.; Richter, R.; Richter-Was, E.; Ridel, M.; Rieke, S.; Rijpstra, M.; Rijssenbeek, M.; Rimoldi, A.; Rinaldi, L.; Rios, R. R.; Riu, I.; Rivoltella, G.; Rizatdinova, F.; Rizvi, E.; Robertson, S. H.; Robichaud-Veronneau, A.; Robinson, D.; Robinson, J. E. M.; Robinson, M.; Robson, A.; Rocha de Lima, J. G.; Roda, C.; Roda Dos Santos, D.; Rodier, S.; Rodriguez, D.; Rodriguez Garcia, Y.; Roe, A.; Roe, S.; Røhne, O.; Rojo, V.; Rolli, S.; Romaniouk, A.; Romanov, V. M.; Romeo, G.; Romero Maltrana, D.; Roos, L.; Ros, E.; Rosati, S.; Rose, M.; Rosenbaum, G. A.; Rosenberg, E. I.; Rosendahl, P. L.; Rosselet, L.; Rossetti, V.; Rossi, E.; Rossi, L. P.; Rossi, L.; Rotaru, M.; Roth, I.; Rothberg, J.; Rousseau, D.; Royon, C. R.; Rozanov, A.; Rozen, Y.; Ruan, X.; Rubinskiy, I.; Ruckert, B.; Ruckstuhl, N.; Rud, V. I.; Rudolph, G.; Rühr, F.; Ruggieri, F.; Ruiz-Martinez, A.; Rulikowska-Zarebska, E.; Rumiantsev, V.; Rumyantsev, L.; Runge, K.; Runolfsson, O.; Rurikova, Z.; Rusakovich, N. A.; Rust, D. R.; Rutherfoord, J. P.; Ruwiedel, C.; Ruzicka, P.; Ryabov, Y. F.; Ryadovikov, V.; Ryan, P.; Rybar, M.; Rybkin, G.; Ryder, N. C.; Rzaeva, S.; Saavedra, A. F.; Sadeh, I.; Sadrozinski, H. F.-W.; Sadykov, R.; Safai Tehrani, F.; Sakamoto, H.; Salamanna, G.; Salamon, A.; Saleem, M.; Salihagic, D.; Salnikov, A.; Salt, J.; Salvachua Ferrando, B. M.; Salvatore, D.; Salvatore, F.; Salzburger, A.; Sampsonidis, D.; Samset, B. H.; Sandaker, H.; Sander, H. G.; Sanders, M. P.; Sandhoff, M.; Sandhu, P.; Sandoval, T.; Sandstroem, R.; Sandvoss, S.; Sankey, D. P. C.; Sansoni, A.; Santamarina Rios, C.; Santoni, C.; Santonico, R.; Santos, H.; Saraiva, J. G.; Sarangi, T.; Sarkisyan-Grinbaum, E.; Sarri, F.; Sartisohn, G.; Sasaki, O.; Sasaki, T.; Sasao, N.; Satsounkevitch, I.; Sauvage, G.; Sauvan, J. B.; Savard, P.; Savinov, V.; Savu, D. O.; Savva, P.; Sawyer, L.; Saxon, D. H.; Says, L. 
P.; Sbarra, C.; Sbrizzi, A.; Scallon, O.; Scannicchio, D. A.; Schaarschmidt, J.; Schacht, P.; Schäfer, U.; Schaepe, S.; Schaetzel, S.; Schaffer, A. C.; Schaile, D.; Schamberger, R. D.; Schamov, A. G.; Scharf, V.; Schegelsky, V. A.; Scheirich, D.; Scherzer, M. I.; Schiavi, C.; Schieck, J.; Schioppa, M.; Schlenker, S.; Schlereth, J. L.; Schmidt, E.; Schmidt, M. P.; Schmieden, K.; Schmitt, C.; Schmitz, M.; Schöning, A.; Schott, M.; Schouten, D.; Schovancova, J.; Schram, M.; Schroeder, C.; Schroer, N.; Schuh, S.; Schuler, G.; Schultes, J.; Schultz-Coulon, H.-C.; Schulz, H.; Schumacher, J. W.; Schumacher, M.; Schumm, B. A.; Schune, Ph.; Schwanenberger, C.; Schwartzman, A.; Schwemling, Ph.; Schwienhorst, R.; Schwierz, R.; Schwindling, J.; Scott, W. G.; Searcy, J.; Sedykh, E.; Segura, E.; Seidel, S. C.; Seiden, A.; Seifert, F.; Seixas, J. M.; Sekhniaidze, G.; Seliverstov, D. M.; Sellden, B.; Sellers, G.; Seman, M.; Semprini-Cesari, N.; Serfon, C.; Serin, L.; Seuster, R.; Severini, H.; Sevior, M. E.; Sfyrla, A.; Shabalina, E.; Shamim, M.; Shan, L. Y.; Shank, J. T.; Shao, Q. T.; Shapiro, M.; Shatalov, P. B.; Shaver, L.; Shaw, C.; Shaw, K.; Sherman, D.; Sherwood, P.; Shibata, A.; Shimizu, S.; Shimojima, M.; Shin, T.; Shmeleva, A.; Shochet, M. J.; Short, D.; Shupe, M. A.; Sicho, P.; Sidoti, A.; Siebel, A.; Siegert, F.; Siegrist, J.; Sijacki, Dj.; Silbert, O.; Silva, J.; Silver, Y.; Silverstein, D.; Silverstein, S. B.; Simak, V.; Simard, O.; Simic, Lj.; Simion, S.; Simmons, B.; Simonyan, M.; Sinervo, P.; Sinev, N. B.; Sipica, V.; Siragusa, G.; Sisakyan, A. N.; Sivoklokov, S. Yu.; Sjölin, J.; Sjursen, T. B.; Skinnari, L. A.; Skovpen, K.; Skubic, P.; Skvorodnev, N.; Slater, M.; Slavicek, T.; Sliwa, K.; Sloan, T. J.; Sloper, J.; Smakhtin, V.; Smirnov, S. Yu.; Smirnova, L. N.; Smirnova, O.; Smith, B. C.; Smith, D.; Smith, K. M.; Smizanska, M.; Smolek, K.; Snesarev, A. A.; Snow, S. W.; Snow, J.; Snuverink, J.; Snyder, S.; Soares, M.; Sobie, R.; Sodomka, J.; Soffer, A.; Solans, C. 
A.; Solar, M.; Solc, J.; Soldatov, E.; Soldevila, U.; Solfaroli Camillocci, E.; Solodkov, A. A.; Solovyanov, O. V.; Sondericker, J.; Soni, N.; Sopko, V.; Sopko, B.; Sorbi, M.; Sosebee, M.; Soukharev, A.; Spagnolo, S.; Spanò, F.; Spighi, R.; Spigo, G.; Spila, F.; Spiriti, E.; Spiwoks, R.; Spousta, M.; Spreitzer, T.; Spurlock, B.; St. Denis, R. D.; Stahl, T.; Stahlman, J.; Stamen, R.; Stanecka, E.; Stanek, R. W.; Stanescu, C.; Stapnes, S.; Starchenko, E. A.; Stark, J.; Staroba, P.; Starovoitov, P.; Staude, A.; Stavina, P.; Stavropoulos, G.; Steele, G.; Steinbach, P.; Steinberg, P.; Stekl, I.; Stelzer, B.; Stelzer, H. J.; Stelzer-Chilton, O.; Stenzel, H.; Stevenson, K.; Stewart, G. A.; Stillings, J. A.; Stockmanns, T.; Stockton, M. C.; Stoerig, K.; Stoicea, G.; Stonjek, S.; Strachota, P.; Stradling, A. R.; Straessner, A.; Strandberg, J.; Strandberg, S.; Strandlie, A.; Strang, M.; Strauss, E.; Strauss, M.; Strizenec, P.; Ströhmer, R.; Strom, D. M.; Strong, J. A.; Stroynowski, R.; Strube, J.; Stugu, B.; Stumer, I.; Stupak, J.; Sturm, P.; Soh, D. A.; Su, D.; Subramania, H. S.; Succurro, A.; Sugaya, Y.; Sugimoto, T.; Suhr, C.; Suita, K.; Suk, M.; Sulin, V. V.; Sultansoy, S.; Sumida, T.; Sun, X.; Sundermann, J. E.; Suruliz, K.; Sushkov, S.; Susinno, G.; Sutton, M. R.; Suzuki, Y.; Sviridov, Yu. M.; Swedish, S.; Sykora, I.; Sykora, T.; Szeless, B.; Sánchez, J.; Ta, D.; Tackmann, K.; Taffard, A.; Tafirout, R.; Taga, A.; Taiblum, N.; Takahashi, Y.; Takai, H.; Takashima, R.; Takeda, H.; Takeshita, T.; Talby, M.; Talyshev, A.; Tamsett, M. C.; Tanaka, J.; Tanaka, R.; Tanaka, S.; Tanaka, S.; Tanaka, Y.; Tani, K.; Tannoury, N.; Tappern, G. P.; Tapprogge, S.; Tardif, D.; Tarem, S.; Tarrade, F.; Tartarelli, G. F.; Tas, P.; Tasevsky, M.; Tassi, E.; Tatarkhanov, M.; Taylor, C.; Taylor, F. E.; Taylor, G. N.; Taylor, W.; Teixeira Dias Castanheira, M.; Teixeira-Dias, P.; Temming, K. K.; Ten Kate, H.; Teng, P. K.; Terada, S.; Terashi, K.; Terron, J.; Terwort, M.; Testa, M.; Teuscher, R. 
J.; Tevlin, C. M.; Thadome, J.; Therhaag, J.; Theveneaux-Pelzer, T.; Thioye, M.; Thoma, S.; Thomas, J. P.; Thompson, E. N.; Thompson, P. D.; Thompson, P. D.; Thompson, A. S.; Thomson, E.; Thomson, M.; Thun, R. P.; Tic, T.; Tikhomirov, V. O.; Tikhonov, Y. A.; Timmermans, C. J. W. P.; Tipton, P.; Tique Aires Viegas, F. J.; Tisserant, S.; Tobias, J.; Toczek, B.; Todorov, T.; Todorova-Nova, S.; Toggerson, B.; Tojo, J.; Tokár, S.; Tokunaga, K.; Tokushuku, K.; Tollefson, K.; Tomoto, M.; Tompkins, L.; Toms, K.; Tong, G.; Tonoyan, A.; Topfel, C.; Topilin, N. D.; Torchiani, I.; Torrence, E.; Torró Pastor, E.; Toth, J.; Touchard, F.; Tovey, D. R.; Traynor, D.; Trefzger, T.; Treis, J.; Tremblet, L.; Tricoli, A.; Trigger, I. M.; Trincaz-Duvoid, S.; Trinh, T. N.; Tripiana, M. F.; Triplett, N.; Trischuk, W.; Trivedi, A.; Trocmé, B.; Troncon, C.; Trottier-McDonald, M.; Trzupek, A.; Tsarouchas, C.; Tseng, J. C.-L.; Tsiakiris, M.; Tsiareshka, P. V.; Tsionou, D.; Tsipolitis, G.; Tsiskaridze, V.; Tskhadadze, E. G.; Tsukerman, I. I.; Tsulaia, V.; Tsung, J.-W.; Tsuno, S.; Tsybychev, D.; Tua, A.; Tuggle, J. M.; Turala, M.; Turecek, D.; Turk Cakir, I.; Turlay, E.; Turra, R.; Tuts, P. M.; Tykhonov, A.; Tylmad, M.; Tyndel, M.; Tyrvainen, H.; Tzanakos, G.; Uchida, K.; Ueda, I.; Ueno, R.; Ugland, M.; Uhlenbrock, M.; Uhrmacher, M.; Ukegawa, F.; Unal, G.; Underwood, D. G.; Undrus, A.; Unel, G.; Unno, Y.; Urbaniec, D.; Urkovsky, E.; Urrejola, P.; Usai, G.; Uslenghi, M.; Vacavant, L.; Vacek, V.; Vachon, B.; Vahsen, S.; Valderanis, C.; Valenta, J.; Valente, P.; Valentinetti, S.; Valkar, S.; Valladolid Gallego, E.; Vallecorsa, S.; Valls Ferrer, J. A.; van der Graaf, H.; van der Kraaij, E.; van der Leeuw, R.; van der Poel, E.; van der Ster, D.; van Eijk, B.; van Eldik, N.; van Gemmeren, P.; van Kesteren, Z.; van Vulpen, I.; Vandelli, W.; Vandoni, G.; Vaniachine, A.; Vankov, P.; Vannucci, F.; Varela Rodriguez, F.; Vari, R.; Varnes, E. W.; Varouchas, D.; Vartapetian, A.; Varvell, K. 
E.; Vassilakopoulos, V. I.; Vazeille, F.; Vegni, G.; Veillet, J. J.; Vellidis, C.; Veloso, F.; Veness, R.; Veneziano, S.; Ventura, A.; Ventura, D.; Venturi, M.; Venturi, N.; Vercesi, V.; Verducci, M.; Verkerke, W.; Vermeulen, J. C.; Vest, A.; Vetterli, M. C.; Vichou, I.; Vickey, T.; Viehhauser, G. H. A.; Viel, S.; Villa, M.; Villaplana Perez, M.; Vilucchi, E.; Vincter, M. G.; Vinek, E.; Vinogradov, V. B.; Virchaux, M.; Viret, S.; Virzi, J.; Vitale, A.; Vitells, O.; Viti, M.; Vivarelli, I.; Vives Vaque, F.; Vlachos, S.; Vlasak, M.; Vlasov, N.; Vogel, A.; Vokac, P.; Volpi, G.; Volpi, M.; Volpini, G.; von der Schmitt, H.; von Loeben, J.; von Radziewski, H.; von Toerne, E.; Vorobel, V.; Vorobiev, A. P.; Vorwerk, V.; Vos, M.; Voss, R.; Voss, T. T.; Vossebeld, J. H.; Vovenko, A. S.; Vranjes, N.; Vranjes Milosavljevic, M.; Vrba, V.; Vreeswijk, M.; Vu Anh, T.; Vuillermet, R.; Vukotic, I.; Wagner, W.; Wagner, P.; Wahlen, H.; Wakabayashi, J.; Walbersloh, J.; Walch, S.; Walder, J.; Walker, R.; Walkowiak, W.; Wall, R.; Waller, P.; Wang, C.; Wang, H.; Wang, H.; Wang, J.; Wang, J.; Wang, J. C.; Wang, R.; Wang, S. M.; Warburton, A.; Ward, C. P.; Warsinsky, M.; Watkins, P. M.; Watson, A. T.; Watson, M. F.; Watts, G.; Watts, S.; Waugh, A. T.; Waugh, B. M.; Weber, J.; Weber, M.; Weber, M. S.; Weber, P.; Weidberg, A. R.; Weigell, P.; Weingarten, J.; Weiser, C.; Wellenstein, H.; Wells, P. S.; Wen, M.; Wenaus, T.; Wendler, S.; Weng, Z.; Wengler, T.; Wenig, S.; Wermes, N.; Werner, M.; Werner, P.; Werth, M.; Wessels, M.; Whalen, K.; Wheeler-Ellis, S. J.; Whitaker, S. P.; White, A.; White, M. J.; White, S.; Whitehead, S. R.; Whiteson, D.; Whittington, D.; Wicek, F.; Wicke, D.; Wickens, F. J.; Wiedenmann, W.; Wielers, M.; Wienemann, P.; Wiglesworth, C.; Wiik, L. A. M.; Wijeratne, P. A.; Wildauer, A.; Wildt, M. A.; Wilhelm, I.; Wilkens, H. G.; Will, J. Z.; Williams, E.; Williams, H. H.; Willis, W.; Willocq, S.; Wilson, J. A.; Wilson, M. 
G.; Wilson, A.; Wingerter-Seez, I.; Winkelmann, S.; Winklmeier, F.; Wittgen, M.; Wolter, M. W.; Wolters, H.; Wooden, G.; Wosiek, B. K.; Wotschack, J.; Woudstra, M. J.; Wraight, K.; Wright, C.; Wrona, B.; Wu, S. L.; Wu, X.; Wu, Y.; Wulf, E.; Wunstorf, R.; Wynne, B. M.; Xaplanteris, L.; Xella, S.; Xie, S.; Xie, Y.; Xu, C.; Xu, D.; Xu, G.; Yabsley, B.; Yamada, M.; Yamamoto, A.; Yamamoto, K.; Yamamoto, S.; Yamamura, T.; Yamaoka, J.; Yamazaki, T.; Yamazaki, Y.; Yan, Z.; Yang, H.; Yang, U. K.; Yang, Y.; Yang, Y.; Yang, Z.; Yanush, S.; Yao, W.-M.; Yao, Y.; Yasu, Y.; Ybeles Smit, G. V.; Ye, J.; Ye, S.; Yilmaz, M.; Yoosoofmiya, R.; Yorita, K.; Yoshida, R.; Young, C.; Youssef, S.; Yu, D.; Yu, J.; Yu, J.; Yuan, L.; Yurkewicz, A.; Zaets, V. G.; Zaidan, R.; Zaitsev, A. M.; Zajacova, Z.; Zalite, Yo. K.; Zanello, L.; Zarzhitsky, P.; Zaytsev, A.; Zeitnitz, C.; Zeller, M.; Zema, P. F.; Zemla, A.; Zendler, C.; Zenin, A. V.; Zenin, O.; Ženiš, T.; Zenonos, Z.; Zenz, S.; Zerwas, D.; Zevi Della Porta, G.; Zhan, Z.; Zhang, D.; Zhang, H.; Zhang, J.; Zhang, X.; Zhang, Z.; Zhao, L.; Zhao, T.; Zhao, Z.; Zhemchugov, A.; Zheng, S.; Zhong, J.; Zhou, B.; Zhou, N.; Zhou, Y.; Zhu, C. G.; Zhu, H.; Zhu, Y.; Zhuang, X.; Zhuravlov, V.; Zieminska, D.; Zimmermann, R.; Zimmermann, S.; Zimmermann, S.; Ziolkowski, M.; Zitoun, R.; Živković, L.; Zmouchko, V. V.; Zobernig, G.; Zoccoli, A.; Zolnierowski, Y.; Zsenei, A.; Zur Nedden, M.; Zutshi, V.; Zwalinski, L.; Atlas Collaboration

    2011-06-01

This Letter presents a search for high-mass e⁺e⁻ or μ⁺μ⁻ resonances in pp collisions at √s = 7 TeV at the LHC. The data were recorded by the ATLAS experiment during 2010 and correspond to a total integrated luminosity of ∼40 pb⁻¹. No statistically significant excess above the Standard Model expectation is observed in the search region of dilepton invariant mass above 110 GeV. Upper limits at the 95% confidence level are set on the cross section times branching ratio of Z′ resonances decaying to dielectrons and dimuons as a function of the resonance mass. A lower mass limit of 1.048 TeV on the Sequential Standard Model Z′ boson is derived, as well as mass limits on Z* and E6-motivated Z′ models.
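
    The limit-setting step can be illustrated with a deliberately simplified counting-experiment sketch (the actual ATLAS analysis fits the full dilepton mass spectrum; the pure-Poisson model and the bisection search below are illustrative assumptions, not the published procedure). For a counting experiment with expected background b and observed count n_obs, the 95% CL upper limit on the mean signal s is the smallest s for which observing n_obs or fewer events has probability at most 5%.

```python
import math

def poisson_cdf(n, mu):
    """P(N <= n) for a Poisson random variable with mean mu."""
    return sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n + 1))

def upper_limit_95(n_obs, background):
    """Smallest signal mean s with P(N <= n_obs | b + s) <= 0.05, by bisection."""
    lo, hi = 0.0, 10.0 * (n_obs + background + 1)
    while hi - lo > 1e-9:
        mid = 0.5 * (lo + hi)
        if poisson_cdf(n_obs, background + mid) > 0.05:
            lo = mid
        else:
            hi = mid
    return hi

# classic check: zero observed events over zero background gives s = ln(20) ~ 3.0
print(round(upper_limit_95(0, 0.0), 2))  # 3.0
```

    Dividing such a count limit by the integrated luminosity times acceptance and efficiency converts it into a limit on cross section times branching ratio.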

  13. Relating design and environmental variables to reliability

    NASA Astrophysics Data System (ADS)

    Kolarik, William J.; Landers, Thomas L.

The combination of a space application and a nuclear power source demands high-reliability hardware. The possibilities of failure, either an inability to provide power or a catastrophic accident, must be minimized. Ground-based nuclear power experience has led to highly sophisticated probabilistic risk assessment procedures, most of which require quantitative information to adequately assess such risks. In the area of hardware risk analysis, reliability information plays a key role. One of the lessons learned from the Three Mile Island experience is that thorough analyses of critical components are essential. Nuclear-grade equipment shows some reliability advantage over commercial equipment; however, no statistically significant difference has been found. A recent study pertaining to spacecraft electronics reliability examined some 2500 malfunctions on more than 300 aircraft. The study classified the equipment failures into seven general categories. Design deficiencies and lack of environmental protection accounted for about half of all failures. Within each class, limited reliability modeling was performed using a Weibull failure model.
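
    The Weibull failure model mentioned above can be sketched in a few lines (the parameter values here are illustrative assumptions, not data from the study). The shape parameter beta distinguishes failure mechanisms, which is why the model suits classified failure data.

```python
import math

def weibull_reliability(t, beta, eta):
    """R(t) = exp(-(t/eta)^beta): probability the unit survives past time t."""
    return math.exp(-((t / eta) ** beta))

def weibull_hazard(t, beta, eta):
    """Instantaneous failure rate h(t) = (beta/eta) * (t/eta)^(beta - 1)."""
    return (beta / eta) * (t / eta) ** (beta - 1.0)

# beta < 1: infant mortality; beta = 1: random (exponential) failures; beta > 1: wear-out
for beta in (0.5, 1.0, 2.0):
    print(f"beta={beta}: R(1000 h) = {weibull_reliability(1000.0, beta, 2000.0):.3f}")
```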

  14. Statistical learning from nonrecurrent experience with discrete input variables and recursive-error-minimization equations

    NASA Astrophysics Data System (ADS)

    Carter, Jeffrey R.; Simon, Wayne E.

    1990-08-01

Neural networks are trained using Recursive Error Minimization (REM) equations to perform statistical classification. Using REM equations with continuous input variables reduces the required number of training experiences by factors of one to two orders of magnitude over standard back propagation. Replacing the continuous input variables with discrete binary representations reduces the number of connections by a factor proportional to the number of variables, reducing the required number of experiences by another order of magnitude. Undesirable effects of using recurrent experience to train neural networks for statistical classification problems are demonstrated, and nonrecurrent experience is used to avoid these undesirable effects. 1. THE I-4I PROBLEM. The statistical classification problem which we address is that of assigning points in d-dimensional space to one of two classes. The first class has a covariance matrix of I (the identity matrix); the covariance matrix of the second class is 4I. For this reason the problem is known as the I-4I problem. Both classes have equal probability of occurrence, and samples from both classes may appear anywhere throughout the d-dimensional space. Most samples near the origin of the coordinate system will be from the first class, while most samples away from the origin will be from the second class. Since the two classes completely overlap, it is impossible to have a classifier with zero error. The minimum possible error is known as the Bayes error and
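
    A minimal Monte Carlo sketch of the I-4I problem (assuming, as the abstract states, zero-mean Gaussian classes with covariances I and 4I and equal priors; the decision threshold follows from the likelihood ratio, which for these two densities reduces to a radius test):

```python
import math, random

def bayes_classify(x, d):
    """Optimal rule for N(0, I) vs N(0, 4I) with equal priors: the likelihood
    ratio reduces to choosing the wide class iff ||x||^2 > (4d/3) * ln 4."""
    r2 = sum(xi * xi for xi in x)
    return 2 if r2 > (4.0 * d / 3.0) * math.log(4.0) else 1

def estimate_bayes_error(d, n=20000, seed=1):
    """Monte Carlo estimate of the error rate of the optimal classifier."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(n):
        true_class = rng.choice((1, 2))
        sd = 1.0 if true_class == 1 else 2.0  # per-coordinate variance 1 vs 4
        x = [rng.gauss(0.0, sd) for _ in range(d)]
        errors += bayes_classify(x, d) != true_class
    return errors / n

print(estimate_bayes_error(d=2))  # close to the analytic value of about 0.264
```

    Because no classifier can beat this rule, the estimate above approximates the Bayes error that any trained network for this problem is bounded below by.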

  15. Statistical results on restorative dentistry experiments: effect of the interaction between main variables

    PubMed Central

    CAVALCANTI, Andrea Nóbrega; MARCHI, Giselle Maria; AMBROSANO, Gláucia Maria Bovi

    2010-01-01

The interpretation of statistical analyses is a critical issue in scientific research. When more than one main variable is studied, the effect of the interaction between those variables is fundamental to the discussion of the experiment. However, doubts can arise when the p-value of the interaction is greater than the significance level. Objective: To determine the most adequate interpretation for factorial experiments with interaction p-values slightly higher than the significance level. Materials and methods: The p-values of the interactions found in two restorative dentistry experiments (0.053 and 0.068) were interpreted in two distinct ways: considering the interaction as not significant and as significant. Results: Different findings were observed between the two analyses, and the studies' results became more coherent when the significant interaction was used. Conclusion: The p-value of the interaction between main variables must be analyzed with caution because it can change the outcomes of research studies. Researchers are strongly advised to interpret the results of their statistical analyses carefully in order to discuss the findings of their experiments properly. PMID:20857003
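
    The interaction logic can be made concrete with a toy 2 × 2 factorial computation (the factor names and cell means below are hypothetical, invented for illustration, not taken from the study). The interaction contrast is the difference between the two simple effects, so declaring the interaction "not significant" amounts to asserting that the effect of one factor is the same at every level of the other.

```python
# hypothetical 2 x 2 cell means (factor names and values invented for illustration)
means = {("materialA", "curingX"): 18.2, ("materialA", "curingY"): 21.5,
         ("materialB", "curingX"): 17.9, ("materialB", "curingY"): 25.4}

def simple_effects(means):
    """Effect of the curing factor computed separately at each material level."""
    effect_a = means[("materialA", "curingY")] - means[("materialA", "curingX")]
    effect_b = means[("materialB", "curingY")] - means[("materialB", "curingX")]
    return effect_a, effect_b

def interaction_contrast(means):
    """Difference of the simple effects; zero means the factors are additive."""
    effect_a, effect_b = simple_effects(means)
    return effect_b - effect_a

a, b = simple_effects(means)
print(round(a, 2), round(b, 2))               # 3.3 7.5: the effect depends on the material
print(round(interaction_contrast(means), 2))  # 4.2
```

    With a borderline interaction p-value, the choice between reporting the pooled effect (3.3 and 7.5 averaged) or the separate simple effects is exactly the interpretive decision the paper examines.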

  16. Experiments on Nucleation in Different Flow Regimes

    NASA Technical Reports Server (NTRS)

    Bayuzick, R. J.; Hofmeister, W. H.; Morton, C. M.; Robinson, M. B.

    1999-01-01

    The vast majority of metallic engineering materials are solidified from the liquid phase. Understanding the solidification process is essential to control microstructure, which in turn, determines the properties of materials. The genesis of solidification is nucleation, where the first stable solid forms from the liquid phase. Nucleation kinetics determine the degree of undercooling and phase selection. As such, it is important to understand nucleation phenomena in order to control solidification or glass formation in metals and alloys. Early experiments in nucleation kinetics were accomplished by droplet dispersion methods. Dilatometry was used by Turnbull and others, and more recently differential thermal analysis and differential scanning calorimetry have been used for kinetic studies. These techniques have enjoyed success; however, there are difficulties with these experiments. Since materials are dispersed in a medium, the character of the emulsion/metal interface affects the nucleation behavior. Statistics are derived from the large number of particles observed in a single experiment, but dispersions have a finite size distribution which adds to the uncertainty of the kinetic determinations. Even though temperature can be controlled quite well before the onset of nucleation, the release of the latent heat of fusion during nucleation of particles complicates the assumption of isothermality during these experiments. Containerless processing has enabled another approach to the study of nucleation kinetics. With levitation techniques it is possible to undercool one sample to nucleation repeatedly in a controlled manner, such that the statistics of the nucleation process can be derived from multiple experiments on a single sample. The authors have fully developed the analysis of nucleation experiments on single samples following the suggestions of Skripov. The advantage of these experiments is that the samples are directly observable. 
The nucleation temperature can be measured by noncontact optical pyrometry, the mass of the sample is known, and post processing analysis can be conducted on the sample. The disadvantages are that temperature measurement must have exceptionally high precision, and it is not possible to isolate specific heterogeneous sites as in droplet dispersions.
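
    The repeated-trials statistics described above can be sketched as follows (the temperatures are invented for illustration; an actual Skripov-type analysis goes further and converts such a survival curve into a nucleation rate as a function of undercooling):

```python
import random

def survival_curve(nucleation_temps):
    """Fraction of cooling runs still liquid at each observed nucleation
    temperature; repeated undercooling of a single sample yields one
    temperature per run, and this empirical curve is the starting point
    of a statistical analysis of nucleation kinetics."""
    temps = sorted(nucleation_temps, reverse=True)  # cooling passes high T first
    n = len(temps)
    return [(t, 1.0 - (i + 1) / n) for i, t in enumerate(temps)]

# hypothetical nucleation temperatures (K) from ten repeated runs on one sample
rng = random.Random(3)
runs = [round(rng.gauss(1650.0, 4.0), 1) for _ in range(10)]

for t, frac in survival_curve(runs):
    print(f"{t:7.1f} K  still-liquid fraction {frac:.2f}")
```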

  17. Reducing statistics anxiety and enhancing statistics learning achievement: effectiveness of a one-minute strategy.

    PubMed

    Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze

    2014-08-01

    Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.

  18. High-precision X-ray spectroscopy of highly-charged ions at the experimental storage ring using silicon microcalorimeters

    NASA Astrophysics Data System (ADS)

    Scholz, Pascal A.; Andrianov, Victor; Echler, Artur; Egelhof, Peter; Kilbourne, Caroline; Kiselev, Oleg; Kraft-Bermuth, Saskia; McCammon, Dan

    2017-10-01

X-ray spectroscopy of highly charged heavy ions provides a sensitive test of quantum electrodynamics in very strong Coulomb fields. One limitation on the current accuracy of such experiments is the energy resolution of available X-ray detectors for energies up to 100 keV. To improve this accuracy, a novel detector concept, the microcalorimeter, is exploited for this kind of measurement. The microcalorimeters used in the present experiments consist of silicon thermometers, ensuring a high dynamic range, and absorbers made of high-Z material to provide high X-ray absorption efficiency. Recently, in addition to a previously used detector, a new compact detector design, housed in a new dry cryostat equipped with a pulse-tube cooler, was deployed during a test beamtime at the experimental storage ring (ESR) of the GSI facility in Darmstadt. A U89+ beam at 75 MeV/u and a 124Xe54+ beam at various beam energies, both interacting with an internal gas-jet target, were used in different cycles. This test was an important benchmark for designing a larger array with improved lateral sensitivity and statistical accuracy.

  19. Detecting Spatial Patterns in Biological Array Experiments

    PubMed Central

    ROOT, DAVID E.; KELLEY, BRIAN P.; STOCKWELL, BRENT R.

    2005-01-01

    Chemical genetic screening and DNA and protein microarrays are among a number of increasingly important and widely used biological research tools that involve large numbers of parallel experiments arranged in a spatial array. It is often difficult to ensure that uniform experimental conditions are present throughout the entire array, and as a result, one often observes systematic spatially correlated errors, especially when array experiments are performed using robots. Here, the authors apply techniques based on the discrete Fourier transform to identify and quantify spatially correlated errors superimposed on a spatially random background. They demonstrate that these techniques are effective in identifying common spatially systematic errors in high-throughput 384-well microplate assay data. In addition, the authors employ a statistical test to allow for automatic detection of such errors. Software tools for using this approach are provided. PMID:14567791
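
    The Fourier-based detection idea can be sketched with a toy 384-well plate (the plate values, artifact size, and seed are invented for illustration; the paper's actual method and thresholds differ). A spatially systematic error with period 2 across columns, such as one produced by an alternating-tip liquid handler, concentrates its power at a single spatial frequency, where it stands out against the spatially random background.

```python
import cmath, math, random

def dft_magnitudes(signal):
    """Magnitude of the discrete Fourier transform at each integer frequency."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * f * t / n)
                    for t in range(n))) for f in range(n)]

# simulate a 384-well (16 x 24) plate: Gaussian noise plus a systematic
# error that raises every other column
rng = random.Random(0)
rows, cols = 16, 24
plate = [[rng.gauss(100.0, 5.0) + (8.0 if c % 2 else 0.0) for c in range(cols)]
         for r in range(rows)]

col_means = [sum(plate[r][c] for r in range(rows)) / rows for c in range(cols)]
grand_mean = sum(col_means) / cols
mags = dft_magnitudes([m - grand_mean for m in col_means])

# a period-2 (alternating-column) artifact puts its power at frequency n/2
peak = max(range(1, cols // 2 + 1), key=lambda f: mags[f])
print(peak)  # 12, the Nyquist frequency for 24 columns
```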

  20. Attitudes toward statistics in medical postgraduates: measuring, evaluating and monitoring

    PubMed Central

    2012-01-01

Background: In medical training, statistics is considered a very difficult course to learn and teach. Current studies have found that students' attitudes toward statistics can influence their learning process. Measuring, evaluating and monitoring the changes in students' attitudes toward statistics are therefore important. Few studies have focused on the attitudes of postgraduates, especially medical postgraduates. Our purpose was to understand the current attitudes toward statistics held by medical postgraduates and explore their effects on students' achievement. We also wanted to explore the sources of and factors influencing these attitudes and to monitor their changes after a systematic statistics course. Methods: A total of 539 medical postgraduates enrolled in a systematic statistics course completed the pre-form of the Survey of Attitudes Toward Statistics-28 scale, and 83 postgraduates selected randomly from among them completed the post-form scale after the course. Results: Most medical postgraduates held positive attitudes toward statistics, but they thought statistics was a very difficult subject. The attitudes mainly came from experiences in a former statistics or mathematics class. Age, level of statistical education, research experience, specialty and mathematical background may influence postgraduate attitudes toward statistics. There were significant positive correlations between course achievement and attitudes toward statistics. In general, student attitudes showed negative changes after completing the statistics course. Conclusions: The importance of student attitudes toward statistics must be recognized in medical postgraduate training. To ensure that all students have a positive learning environment, statistics teachers should measure their students' attitudes and monitor changes during the course. Necessary assistance should be offered to those students who develop negative attitudes. PMID:23173770

  1. Implementation of cross correlation for energy discrimination on the time-of-flight spectrometer CORELLI.

    PubMed

    Ye, Feng; Liu, Yaohua; Whitfield, Ross; Osborn, Ray; Rosenkranz, Stephan

    2018-04-01

The CORELLI instrument at Oak Ridge National Laboratory is a statistical chopper spectrometer designed and optimized to probe complex disorder in crystalline materials through diffuse scattering experiments. On CORELLI, the high efficiency of white-beam Laue diffraction combined with elastic discrimination has enabled an unprecedented data collection rate to obtain both the total and the elastic-only scattering over a large volume of reciprocal space from a single measurement. To achieve this, CORELLI is equipped with a statistical chopper to modulate the incoming neutron beam quasi-randomly, and then the cross-correlation method is applied to reconstruct the elastic component from the scattering data. Details of the implementation of the cross-correlation method on CORELLI are given and its performance is discussed.
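
    The cross-correlation idea can be shown with a toy one-dimensional model (the sequence length, seed, and single-line sample response are invented; CORELLI's real chopper uses pseudorandom sequences with near-ideal autocorrelation, and the real analysis must also separate inelastic background). The detector signal is the convolution of the chopper's transmission pattern with the sample response; cross-correlating the signal with the known pattern concentrates the elastic intensity back at its true time-of-flight bin.

```python
import random

def circular_xcorr(z, m):
    """Circular cross-correlation of measured signal z with chopper sequence m."""
    n = len(m)
    return [sum(z[t] * m[(t - tau) % n] for t in range(n)) for tau in range(n)]

rng = random.Random(7)
n = 255
# quasi-random open(1)/closed(0) transmission pattern of the statistical chopper
chopper = [rng.randint(0, 1) for _ in range(n)]

# hypothetical sample response: one strong elastic line at time-of-flight bin 40
response = [0.0] * n
response[40] = 10.0

# detector signal: circular convolution of the chopper pattern with the response
signal = [sum(chopper[(t - tau) % n] * response[tau] for tau in range(n))
          for t in range(n)]

recovered = circular_xcorr(signal, chopper)
peak_bin = max(range(n), key=lambda tau: recovered[tau])
print(peak_bin)  # 40
```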

  2. Implementation of cross correlation for energy discrimination on the time-of-flight spectrometer CORELLI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ye, Feng; Liu, Yaohua; Whitfield, Ross

The CORELLI instrument at Oak Ridge National Laboratory is a statistical chopper spectrometer designed and optimized to probe complex disorder in crystalline materials through diffuse scattering experiments. On CORELLI, the high efficiency of white-beam Laue diffraction combined with elastic discrimination has enabled an unprecedented data collection rate to obtain both the total and the elastic-only scattering over a large volume of reciprocal space from a single measurement. To achieve this, CORELLI is equipped with a statistical chopper to modulate the incoming neutron beam quasi-randomly, and then the cross-correlation method is applied to reconstruct the elastic component from the scattering data. Lastly, details of the implementation of the cross-correlation method on CORELLI are given and its performance is discussed.

  3. Implementation of cross correlation for energy discrimination on the time-of-flight spectrometer CORELLI

    DOE PAGES

    Ye, Feng; Liu, Yaohua; Whitfield, Ross; ...

    2018-03-26

    The CORELLI instrument at Oak Ridge National Laboratory is a statistical chopper spectrometer designed and optimized to probe complex disorder in crystalline materials through diffuse scattering experiments. On CORELLI, the high efficiency of white-beam Laue diffraction combined with elastic discrimination has enabled an unprecedented data collection rate to obtain both the total and the elastic-only scattering over a large volume of reciprocal space from a single measurement. To achieve this, CORELLI is equipped with a statistical chopper to modulate the incoming neutron beam quasi-randomly, and then the cross-correlation method is applied to reconstruct the elastic component from the scattering data. Lastly, details of the implementation of the cross-correlation method on CORELLI are given and its performance is discussed.

  4. Compositionality and Statistics in Adjective Acquisition: 4-Year-Olds Interpret "Tall" and "Short" Based on the Size Distributions of Novel Noun Referents

    ERIC Educational Resources Information Center

    Barner, David; Snedeker, Jesse

    2008-01-01

    Four experiments investigated 4-year-olds' understanding of adjective-noun compositionality and their sensitivity to statistics when interpreting scalar adjectives. In Experiments 1 and 2, children selected "tall" and "short" items from 9 novel objects called "pimwits" (1-9 in. in height) or from this array plus 4 taller or shorter distractor…

  5. Difference in Learning among Students Doing Pen-and-Paper Homework Compared to Web-Based Homework in an Introductory Statistics Course

    ERIC Educational Resources Information Center

    Jonsdottir, Anna Helga; Bjornsdottir, Audbjorg; Stefansson, Gunnar

    2017-01-01

    A repeated crossover experiment comparing learning among students handing in pen-and-paper homework (PPH) with students handing in web-based homework (WBH) has been conducted. The system used in the experiments, the tutor-web, has been used to deliver homework problems to thousands of students in mathematics and statistics over several years.…

  6. The Calibration System of the E989 Experiment at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anastasi, Antonio

    The muon anomaly aµ is one of the most precisely known quantities in physics, both experimentally and theoretically. This level of accuracy permits using the measurement of aµ as a test of the Standard Model by comparison with the theoretical calculation. After the impressive result obtained at Brookhaven National Laboratory in 2001 with a total accuracy of 0.54 ppm, a new experiment, E989, is under construction at Fermilab, motivated by the difference aµ(exp) - aµ(SM) ~ 3σ. The purpose of the E989 experiment is a fourfold reduction of the error, with a goal of 0.14 ppm, improving both the systematic and statistical uncertainty. Using the Fermilab beam complex, a statistics sample 21 times larger than BNL's will be collected in almost 2 years of data taking, improving the statistical uncertainty to 0.1 ppm. Improvement on the systematic error involves the measurement techniques for ωa and ωp, the anomalous precession frequency of the muon and the Larmor precession frequency of the proton, respectively. The measurement of ωp relies on the magnetic field measurement, and improvements in the uniformity of the field should reduce the corresponding systematic uncertainty from 170 ppb at BNL to 70 ppb. A reduction from 180 ppb to 70 ppb is also required for the measurement of ωa; a new DAQ, faster electronics, and new detectors and a new calibration system will be implemented with respect to E821 to reach this goal. In particular, the laser calibration system will reduce the systematic error due to gain fluctuations of the photodetectors from 0.12 to 0.02 ppm. The 0.02 ppm limit on the systematic requires a system with a stability of 10^-4 on short time scales (700 µs), while on longer time scales percent-level stability suffices. The required 10^-4 stability is almost an order of magnitude better than existing laser calibration systems in particle physics, making the calibration system a very challenging item.
In addition to the high level of stability, the particular environment, with a 14 m diameter storage ring, a highly uniform magnetic field, and detectors distributed around the storage ring, sets specific guidelines and constraints. This thesis focuses on the final design of the laser calibration system developed for the E989 experiment. Chapter 1 introduces the anomalous magnetic moment of the muon; chapter 2 presents previous measurements of g-2, while chapter 3 discusses the Standard Model prediction and possible new-physics scenarios. Chapter 4 describes the E989 experiment, covering the experimental technique and the experimental apparatus, with a focus on the improvements needed to reduce the statistical and systematic errors. The main topic of the thesis is discussed in the last two chapters: chapter 5 is focused on the laser calibration system, while chapter 6 describes the test beam performed at the Beam Test Facility of the Laboratori Nazionali di Frascati from 29 February to 7 March as a final test of the full calibration system. An introduction explains the physics motivation of the system and the different devices implemented. The final chapter describes the setup used and presents some of the results obtained.

  7. Prevalence of High Blood Pressure, Heart Disease, Thalassemia, Sickle-Cell Anemia, and Iron-Deficiency Anemia among the UAE Adolescent Population

    PubMed Central

    Barakat-Haddad, Caroline

    2013-01-01

    This study examined the prevalence of high blood pressure, heart disease, and medical diagnoses in relation to blood disorders, among 6,329 adolescent students (age 15 to 18 years) who reside in the United Arab Emirates (UAE). Findings indicated that the overall prevalence of high blood pressure and heart disease was 1.8% and 1.3%, respectively. Overall, the prevalence for thalassemia, sickle-cell anemia, and iron-deficiency anemia was 0.9%, 1.6%, and 5%, respectively. Bivariate analysis revealed statistically significant differences in the prevalence of high blood pressure among the local and expatriate adolescent population in the Emirate of Sharjah. Similarly, statistically significant differences in the prevalence of iron-deficiency anemia were observed among the local and expatriate population in Abu Dhabi city, the western region of Abu Dhabi, and Al-Ain. Multivariate analysis revealed the following significant predictors of high blood pressure: residing in proximity to industry, nonconventional substance abuse, and age when smoking or exposure to smoking began. Ethnicity was a significant predictor of heart disease, thalassemia, sickle-cell anemia, and iron-deficiency anemia. In addition, predictors of thalassemia included gender (female) and participating in physical activity. Participants diagnosed with sickle-cell anemia and iron-deficiency anemia were more likely to experience different physical activities. PMID:23606864

  8. Research Associate | Center for Cancer Research

    Cancer.gov

    The Basic Science Program (BSP) at the Frederick National Laboratory for Cancer Research (FNLCR) pursues independent, multidisciplinary research programs in basic or applied molecular biology, immunology, retrovirology, cancer biology or human genetics. As part of the BSP, the Microbiome and Genetics Core (the Core) characterizes microbiomes by next-generation sequencing to determine their composition and variation, as influenced by immune, genetic, and host health factors. The Core provides support across a spectrum of processes, from nucleic acid isolation through bioinformatics and statistical analysis. KEY ROLES/RESPONSIBILITIES The Research Associate II will provide support in the areas of automated isolation, preparation, PCR and sequencing of DNA on next generation platforms (Illumina MiSeq and NextSeq). An opportunity exists to join the Core’s team of highly trained experimentalists and bioinformaticians working to characterize microbiome samples. The following represent requirements of the position: A minimum of five (5) years of related biomedical experience. Experience with high-throughput nucleic acid (DNA/RNA) extraction. Experience in performing PCR amplification (including quantitative real-time PCR). Experience or familiarity with robotic liquid handling protocols (especially on the Eppendorf epMotion 5073 or 5075 platforms). Experience in operating and maintaining benchtop Illumina sequencers (MiSeq and NextSeq). Ability to evaluate experimental quality and to troubleshoot molecular biology protocols. Experience with sample tracking, inventory management and biobanking. Ability to operate and communicate effectively in a team-oriented work environment.

  9. Enhancing residents’ neonatal resuscitation competency through unannounced simulation-based training

    PubMed Central

    Surcouf, Jeffrey W.; Chauvin, Sheila W.; Ferry, Jenelle; Yang, Tong; Barkemeyer, Brian

    2013-01-01

    Background Almost half of pediatric third-year residents surveyed in 2000 had never led a resuscitation event. With increasing restrictions on residency work hours and a decline in patient volume in some hospitals, there is potential for fewer opportunities. Purpose Our primary purpose was to test the hypothesis that an unannounced mock resuscitation in a high-fidelity in-situ simulation training program would improve both residents’ self-confidence and observed performance of adopted best practices in neonatal resuscitation. Methods Each pediatric and medicine–pediatric resident in one pediatric residency program responded to an unannounced scenario that required resuscitation of the high-fidelity infant simulator. Structured debriefing followed in the same setting, and a second cycle of scenario response and debriefing occurred before ending the 1-hour training experience. Measures included pre- and post-program confidence questionnaires and trained observer assessments of live and videotaped performances. Results Statistically significant pre–post gains for self-confidence were observed for 8 of the 14 NRP critical behaviors (p=0.00–0.03) reflecting knowledge, technical, and non-technical (teamwork) skills. The pre–post gain in overall confidence score was statistically significant (p=0.00). With a maximum possible assessment score of 41, the average pre–post gain was 8.28 and statistically significant (p<0.001). Results of the video-based assessments revealed statistically significant performance gains (p<0.0001). Correlations between live and video-based assessments were strong for pre–post training scenario performances (pre: r=0.64, p<0.0001; post: r=0.75, p<0.0001). Conclusions Results revealed high receptivity to in-situ, simulation-based training and significant positive gains in confidence and observed competency-related abilities. Results support the potential for other applications in residency and continuing education. PMID:23522399

  10. [Tracking study to improve basic academic ability in chemistry for freshmen].

    PubMed

    Sato, Atsuko; Morone, Mieko; Azuma, Yutaka

    2010-08-01

    The aims of this study were to assess the basic academic ability of freshmen with regard to chemistry and implement suitable educational guidance measures. At Tohoku Pharmaceutical University, basic academic ability examinations are conducted in chemistry for freshmen immediately after entrance into the college. From 2003 to 2009, the examination was conducted using the same questions, and the secular changes in the mean percentage of correct responses were statistically analyzed. An experience survey was also conducted on 2007 and 2009 freshmen regarding chemical experiments at senior high school. Analysis of the basic academic ability examinations revealed a significant decrease in the mean percentage of correct responses after 2007. With regard to the answers for each question, there was a significant decrease in the percentage of correct answers for approximately 80% of questions. In particular, a marked decrease was observed for calculation questions involving percentages. A significant decrease was also observed in the number of students who had experience with chemical experiments in high school. However, notable results have been achieved through the implementation of practice incorporating calculation problems in order to improve calculation ability. Learning of chemistry and a lack of experimental experience in high school may be contributory factors in the decrease in chemistry academic ability. In consideration of the professional ability demanded of pharmacists, the decrease in calculation ability should be regarded as a serious issue and suitable measures for improving calculation ability are urgently required.

  11. Influences of credibility of testimony and strength of statistical evidence on children's and adolescents' reasoning.

    PubMed

    Kail, Robert V

    2013-11-01

    According to dual-process models that include analytic and heuristic modes of processing, analytic processing is often expected to become more common with development. Consistent with this view, on reasoning problems, adolescents are more likely than children to select alternatives that are backed by statistical evidence. It is shown here that this pattern depends on the quality of the statistical evidence and the quality of the testimonial that is the typical alternative to statistical evidence. In Experiment 1, 9- and 13-year-olds (N=64) were presented with scenarios in which solid statistical evidence was contrasted with casual or expert testimonial evidence. When testimony was casual, children relied on it but adolescents did not; when testimony was expert, both children and adolescents relied on it. In Experiment 2, 9- and 13-year-olds (N=83) were presented with scenarios in which casual testimonial evidence was contrasted with weak or strong statistical evidence. When statistical evidence was weak, children and adolescents relied on both testimonial and statistical evidence; when statistical evidence was strong, most children and adolescents relied on it. Results are discussed in terms of their implications for dual-process accounts of cognitive development. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. Distinct contributions of attention and working memory to visual statistical learning and ensemble processing.

    PubMed

    Hall, Michelle G; Mattingley, Jason B; Dux, Paul E

    2015-08-01

    The brain exploits redundancies in the environment to efficiently represent the complexity of the visual world. One example of this is ensemble processing, which provides a statistical summary of elements within a set (e.g., mean size). Another is statistical learning, which involves the encoding of stable spatial or temporal relationships between objects. It has been suggested that ensemble processing over arrays of oriented lines disrupts statistical learning of structure within the arrays (Zhao, Ngo, McKendrick, & Turk-Browne, 2011). Here we asked whether ensemble processing and statistical learning are mutually incompatible, or whether this disruption might occur because ensemble processing encourages participants to process the stimulus arrays in a way that impedes statistical learning. In Experiment 1, we replicated Zhao and colleagues' finding that ensemble processing disrupts statistical learning. In Experiments 2 and 3, we found that statistical learning was unimpaired by ensemble processing when task demands necessitated (a) focal attention to individual items within the stimulus arrays and (b) the retention of individual items in working memory. Together, these results are consistent with an account suggesting that ensemble processing and statistical learning can operate over the same stimuli given appropriate stimulus processing demands during exposure to regularities. (c) 2015 APA, all rights reserved.

  13. Statistical results from the Virginia Tech propagation experiment using the Olympus 12, 20, and 30 GHz satellite beacons

    NASA Technical Reports Server (NTRS)

    Stutzman, Warren L.; Safaai-Jazi, A.; Pratt, Timothy; Nelson, B.; Laster, J.; Ajaz, H.

    1993-01-01

    Virginia Tech has performed a comprehensive propagation experiment using the Olympus satellite beacons at 12.5, 19.77, and 29.66 GHz (which we refer to as 12, 20, and 30 GHz). Four receive terminals were designed and constructed, one terminal at each frequency plus a portable one with 20 and 30 GHz receivers for microscale and scintillation studies. Total power radiometers were included in each terminal in order to set the clear air reference level for each beacon and also to predict path attenuation. More details on the equipment and the experiment design are found elsewhere. Statistical results for one year of data collection were analyzed. In addition, the following studies were performed: a microdiversity experiment in which two closely spaced 20 GHz receivers were used; a comparison of total power and Dicke switched radiometer measurements, frequency scaling of scintillations, and adaptive power control algorithm development. Statistical results are reported.

  14. Statistical anisotropy in free turbulence for mixing layers at high Reynolds numbers

    NASA Astrophysics Data System (ADS)

    Gardner, Patrick J.; Roggemann, Michael C.; Welsh, Byron M.; Bowersox, Rodney D.; Luke, Theodore E.

    1996-08-01

    A lateral shearing interferometer was used to measure the slope of perturbed wave fronts after propagating through free turbulent mixing layers. Shearing interferometers provide a two-dimensional flow visualization that is nonintrusive. Slope measurements were used to reconstruct the phase of the turbulence-corrupted wave front. The random phase fluctuations induced by the mixing layer were captured in a large ensemble of wave-front measurements. Experiments were performed on an unbounded, plane shear mixing layer of helium and nitrogen gas at fixed velocities and high Reynolds numbers for six locations in the flow development. Statistical autocorrelation functions and structure functions were computed on the reconstructed phase maps. The autocorrelation function results indicated that the turbulence-induced phase fluctuations were not wide-sense stationary. The structure functions exhibited statistical homogeneity, indicating that the phase fluctuations were stationary in first increments. However, the turbulence-corrupted phase was not isotropic. A five-thirds power law is shown to fit orthogonal slices of the structure function, analogous to the Kolmogorov model for isotropic turbulence. Strehl ratios were computed from the phase structure functions and compared with classical estimates that assume isotropy. The isotropic models are shown to overestimate the optical degradation by nearly 3 orders of magnitude compared with the structure function calculations.

  15. Clinical evaluation of selected Yogic procedures in individuals with low back pain

    PubMed Central

    Pushpika Attanayake, A. M.; Somarathna, K. I. W. K.; Vyas, G. H.; Dash, S. C.

    2010-01-01

    The present study has been conducted to evaluate selected yogic procedures on individuals with low back pain. The understanding of back pain as one of the commonest clinical presentations during clinical practice made the path to the present study. It has also been calculated that more than three-quarters of the world's population experience back pain at some time in their lives. Twelve patients were selected and randomly divided into two groups, viz., group A (yogic group) and group B (control group). Advice for life style and diet was given to all the patients. The effect of the therapy was assessed subjectively and objectively. Particular scores drawn for the yogic group and control group were individually analyzed before and after treatment and the values were compared using standard statistical protocols. Yogic intervention revealed 79% relief in both subjective and objective parameters (i.e., 7 out of 14 parameters showed statistically highly significant (P < 0.01) results, while 4 showed significant results (P < 0.05)). The comparative effect of the yogic group and control group showed 79% relief in both subjective and objective parameters (i.e., a total of 6 out of 14 parameters showed statistically highly significant (P < 0.01) results, while 5 showed significant results (P < 0.05)). PMID:22131719

  16. Data management in large-scale collaborative toxicity studies: how to file experimental data for automated statistical analysis.

    PubMed

    Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette

    2013-06-01

    High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
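The core idea above is that automated analysis only works if every experiment file follows one fixed layout that a script can validate and pool. A minimal sketch of that validation step, with CSV standing in for the paper's Excel files and all column names and toy data being illustrative assumptions:

```python
# Hedged sketch: enforce a standard column layout so files from many labs
# can be pooled for one automated statistical analysis script.
import csv
import io

REQUIRED = ["compound", "concentration", "response"]

def load_experiment(text):
    """Parse one experiment file, rejecting non-standard layouts."""
    rows = list(csv.DictReader(io.StringIO(text)))
    missing = [c for c in REQUIRED if rows and c not in rows[0]]
    if missing:
        raise ValueError("non-standard file, missing columns: %s" % missing)
    # Keep only the standardised columns, in a uniform record shape.
    return [{c: r[c] for c in REQUIRED} for r in rows]

# Two toy files in the standard format, as they might arrive from two labs.
file_a = "compound,concentration,response\nA,1.0,0.95\nA,10.0,0.40\n"
file_b = "compound,concentration,response\nB,1.0,0.99\nB,10.0,0.70\n"

pooled = load_experiment(file_a) + load_experiment(file_b)
print(len(pooled))  # pooled rows ready for automated concentration-response fitting
```

In practice the same check would run over a directory of files, with a clear error report back to the contributing lab when a file deviates from the standard.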

  17. A Guideline to Univariate Statistical Analysis for LC/MS-Based Untargeted Metabolomics-Derived Data

    PubMed Central

    Vinaixa, Maria; Samino, Sara; Saez, Isabel; Duran, Jordi; Guinovart, Joan J.; Yanes, Oscar

    2012-01-01

    Several metabolomic software programs provide methods for peak picking, retention time alignment and quantification of metabolite features in LC/MS-based metabolomics. Statistical analysis, however, is needed in order to discover those features significantly altered between samples. By comparing the retention time and MS/MS data of a model compound to that from the altered feature of interest in the research sample, metabolites can then be unequivocally identified. This paper reports on a comprehensive overview of a workflow for statistical analysis to rank relevant metabolite features that will be selected for further MS/MS experiments. We focus on univariate data analysis applied in parallel on all detected features. Characteristics and challenges of this analysis are discussed and illustrated using four different real LC/MS untargeted metabolomic datasets. We demonstrate the influence of considering or violating mathematical assumptions on which univariate statistical tests rely, using high-dimensional LC/MS datasets. Issues in data analysis such as determination of sample size, analytical variation, assumption of normality and homoscedasticity, or correction for multiple testing are discussed and illustrated in the context of our four untargeted LC/MS working examples. PMID:24957762
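When one univariate test is run per detected feature, the multiple-testing correction the guideline discusses becomes essential. A minimal sketch of the Benjamini-Hochberg false-discovery-rate procedure, one common choice for this step; the p-values are made-up illustrative numbers, not from the paper:

```python
# Hedged sketch: Benjamini-Hochberg FDR correction over per-feature p-values.
def benjamini_hochberg(pvals, alpha=0.05):
    """Return the indices of features declared significant at FDR level alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # features by ascending p
    k = 0  # largest rank whose p-value clears its BH threshold rank*alpha/m
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * alpha / m:
            k = rank
    # Everything ranked at or below k is declared significant.
    return sorted(order[:k])

# One p-value per detected LC/MS feature (illustrative numbers).
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205, 0.212, 0.216]
print(benjamini_hochberg(pvals))
```

Note that a naive per-test cutoff of 0.05 would also accept the features with p around 0.04, which BH rejects here: that gap is exactly the false-discovery control the workflow calls for.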

  18. A Guideline to Univariate Statistical Analysis for LC/MS-Based Untargeted Metabolomics-Derived Data.

    PubMed

    Vinaixa, Maria; Samino, Sara; Saez, Isabel; Duran, Jordi; Guinovart, Joan J; Yanes, Oscar

    2012-10-18

    Several metabolomic software programs provide methods for peak picking, retention time alignment and quantification of metabolite features in LC/MS-based metabolomics. Statistical analysis, however, is needed in order to discover those features significantly altered between samples. By comparing the retention time and MS/MS data of a model compound to that from the altered feature of interest in the research sample, metabolites can then be unequivocally identified. This paper reports on a comprehensive overview of a workflow for statistical analysis to rank relevant metabolite features that will be selected for further MS/MS experiments. We focus on univariate data analysis applied in parallel on all detected features. Characteristics and challenges of this analysis are discussed and illustrated using four different real LC/MS untargeted metabolomic datasets. We demonstrate the influence of considering or violating mathematical assumptions on which univariate statistical tests rely, using high-dimensional LC/MS datasets. Issues in data analysis such as determination of sample size, analytical variation, assumption of normality and homoscedasticity, or correction for multiple testing are discussed and illustrated in the context of our four untargeted LC/MS working examples.

  19. The effect of iconicity of visual displays on statistical reasoning: evidence in favor of the null hypothesis.

    PubMed

    Sirota, Miroslav; Kostovičová, Lenka; Juanchich, Marie

    2014-08-01

    Knowing which properties of visual displays facilitate statistical reasoning bears practical and theoretical implications. Therefore, we studied the effect of one property of visual displays - iconicity (i.e., the resemblance of a visual sign to its referent) - on Bayesian reasoning. Two main accounts of statistical reasoning predict different effects of iconicity on Bayesian reasoning. The ecological-rationality account predicts a positive iconicity effect, because more highly iconic signs resemble more individuated objects, which tap better into an evolutionarily designed frequency-coding mechanism that, in turn, facilitates Bayesian reasoning. The nested-sets account predicts a null iconicity effect, because iconicity does not affect the salience of a nested-sets structure - the factor facilitating Bayesian reasoning processed by a general reasoning mechanism. In two well-powered experiments (N = 577), we found no support for a positive iconicity effect across different iconicity levels that were manipulated in different visual displays (meta-analytical overall effect: log OR = -0.13, 95% CI [-0.53, 0.28]). A Bayes factor analysis provided strong evidence in favor of the null hypothesis - the null iconicity effect. Thus, these findings corroborate the nested-sets rather than the ecological-rationality account of statistical reasoning.

  20. Interpreting support vector machine models for multivariate group wise analysis in neuroimaging

    PubMed Central

    Gaonkar, Bilwaj; Shinohara, Russell T; Davatzikos, Christos

    2015-01-01

    Machine learning based classification algorithms like support vector machines (SVMs) have shown great promise for turning high-dimensional neuroimaging data into clinically useful decision criteria. However, tracing imaging based patterns that contribute significantly to classifier decisions remains an open problem. This is an issue of critical importance in imaging studies seeking to determine which anatomical or physiological imaging features contribute to the classifier’s decision, thereby allowing users to critically evaluate the findings of such machine learning methods and to understand disease mechanisms. The majority of published work addresses the question of statistical inference for support vector classification using permutation tests based on SVM weight vectors. Such permutation testing ignores the SVM margin, which is critical in SVM theory. In this work we emphasize the use of a statistic that explicitly accounts for the SVM margin and show that the null distributions associated with this statistic are asymptotically normal. Further, our experiments show that this statistic is far less conservative than weight-based permutation tests and yet specific enough to tease out multivariate patterns in the data. Thus, we can better understand the multivariate patterns that the SVM uses for neuroimaging based classification. PMID:26210913
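The permutation-testing logic underlying both the weight-based approach and the margin-aware statistic is the same: recompute the statistic under random relabelings and compare the observed value against that null distribution. A minimal sketch of that logic, using a simple mean-difference statistic in place of an SVM-derived one (the group values and sample sizes are illustrative assumptions, not the paper's data):

```python
# Hedged sketch: generic two-sample permutation test. An SVM weight or
# margin statistic would replace the mean difference in the real method.
import random

def permutation_pvalue(xs, ys, n_perm=2000, seed=0):
    """P-value for the observed |mean(xs) - mean(ys)| under label permutation."""
    rng = random.Random(seed)
    observed = abs(sum(xs) / len(xs) - sum(ys) / len(ys))
    pooled = xs + ys
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # randomly reassign group labels
        px, py = pooled[:len(xs)], pooled[len(xs):]
        if abs(sum(px) / len(px) - sum(py) / len(py)) >= observed:
            count += 1
    # Add-one correction keeps the p-value strictly positive.
    return (count + 1) / (n_perm + 1)

group_a = [2.9, 3.1, 3.3, 3.0, 3.2]   # e.g. feature values in patients
group_b = [1.0, 1.2, 0.9, 1.1, 1.3]   # e.g. feature values in controls
p = permutation_pvalue(group_a, group_b)
print(p)  # small, since the groups barely overlap under permutation
```

The paper's contribution is about which statistic goes inside the loop (and an asymptotically normal null that avoids the loop entirely), not about the permutation scheme itself.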

  1. Directing Improvements in Primary Care Patient Experience through Analysis of Service Quality.

    PubMed

    Hudson Smith, Mel; Smith, David

    2018-06-03

    To examine the influence of dimensions of service quality on patient experience of primary care. Data from the national GP Patient Survey in England 2014/15, with responses from 858,351 patients registered at 7,918 practices. Expert panel and principal component analysis helped identify relevant dimensions of service quality. Regression was then used to examine the relationships between these dimensions and reported patient experience. Aggregated scores for each practice were used, comprising the proportion of positive responses to each element of the study. Of eight service quality dimensions identified, six have statistically significant impacts on patient experience but only two have large effects. Patient experience is highly influenced by practice responsiveness and the interactions with the physician. Other dimensions have small or even slightly negative influence. Service quality provided by nurses has negligible effect on patient experience. To improve patient experience in primary health care, efforts should focus on practice responsiveness and interactions with the physician. Other areas have little influence over patient experience. This suggests a gap in patients' perspectives on health care, which has policy implications for patient education. © Health Research and Educational Trust.

  2. Rediscovery rate estimation for assessing the validation of significant findings in high-throughput studies.

    PubMed

    Ganna, Andrea; Lee, Donghwan; Ingelsson, Erik; Pawitan, Yudi

    2015-07-01

    It is common and advised practice in biomedical research to validate experimental or observational findings in a population different from the one where the findings were initially assessed. This practice increases the generalizability of the results and decreases the likelihood of reporting false-positive findings. Validation becomes critical when dealing with high-throughput experiments, where the large number of tests increases the chance to observe false-positive results. In this article, we review common approaches to determine statistical thresholds for validation and describe the factors influencing the proportion of significant findings from a 'training' sample that are replicated in a 'validation' sample. We refer to this proportion as rediscovery rate (RDR). In high-throughput studies, the RDR is a function of false-positive rate and power in both the training and validation samples. We illustrate the application of the RDR using simulated data and real data examples from metabolomics experiments. We further describe an online tool to calculate the RDR using t-statistics. We foresee two main applications. First, if the validation study has not yet been collected, the RDR can be used to decide the optimal combination between the proportion of findings taken to validation and the size of the validation study. Secondly, if a validation study has already been done, the RDR estimated using the training data can be compared with the observed RDR from the validation data; hence, the success of the validation study can be assessed. © The Author 2014. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
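The abstract states that in high-throughput studies the RDR is a function of the false-positive rate and power in both samples. A hedged closed-form sketch of that dependence, derived here under an independence assumption for illustration (it is our own toy derivation, not the authors' online tool, and the parameter values are made up):

```python
# Hedged sketch: expected rediscovery rate (RDR) as a function of the
# per-test alpha, the fraction of truly non-null tests, and the power of
# the training and validation samples.
def rediscovery_rate(pi1, alpha, power_train, power_valid):
    """Expected fraction of training-significant findings that replicate."""
    true_pos = pi1 * power_train        # truly non-null, found in training
    false_pos = (1 - pi1) * alpha       # null, significant in training by chance
    # True positives replicate with validation power; false positives only
    # replicate by chance, at rate alpha.
    replicated = true_pos * power_valid + false_pos * alpha
    return replicated / (true_pos + false_pos)

# With few true effects and modest power, most training hits fail to replicate.
low = rediscovery_rate(pi1=0.01, alpha=0.05, power_train=0.5, power_valid=0.5)
# Increasing the validation sample (hence validation power) raises the RDR.
high = rediscovery_rate(pi1=0.01, alpha=0.05, power_train=0.5, power_valid=0.9)
print(round(low, 3), round(high, 3))
```

This mirrors the article's first proposed use: before the validation cohort exists, one can trade off how many findings to carry forward against how large the validation study must be to reach an acceptable expected RDR.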

  3. Advances in single-cell experimental design made possible by automated imaging platforms with feedback through segmentation.

    PubMed

    Crick, Alex J; Cammarota, Eugenia; Moulang, Katie; Kotar, Jurij; Cicuta, Pietro

    2015-01-01

    Live optical microscopy has become an essential tool for studying the dynamical behaviors and variability of single cells, and cell-cell interactions. However, experiments and data analysis in this area are often extremely labor intensive, and it has often not been achievable or practical to perform properly standardized experiments on a statistically viable scale. We have addressed this challenge by developing automated live imaging platforms to help standardize experiments, increase throughput, and unlock previously impossible ones. Our real-time cell tracking programs communicate in feedback with microscope and camera control software, and they are highly customizable, flexible, and efficient. As examples of our current research which utilize these automated platforms, we describe two quite different applications: egress-invasion interactions of malaria parasites and red blood cells, and imaging of immune cells which possess high motility and internal dynamics. The automated imaging platforms are able to track a large number of motile cells simultaneously, over hours or even days at a time, greatly increasing data throughput and opening up new experimental possibilities. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Association between interpersonal trust, reciprocity, and suicidal behaviors: A longitudinal cohort study in South Korea.

    PubMed

    Kim, Ja Young; Yoon, Jaehong; Kim, Myoung-Hee; Kim, Seung-Sup

    2017-06-01

    While a growing body of evidence suggests that social capital, including interpersonal trust and reciprocity, might be associated with mental health outcomes, few studies have explored the relationship with suicidal behaviors. This research examined the prospective association between interpersonal trust and reciprocity and suicidal behaviors using the Korea Welfare Panel Study, a nationally representative longitudinal cohort dataset in South Korea. Interpersonal trust and reciprocity were assessed at the 7th wave of the survey (2012), and each measure was classified into two categories (low vs. high). Experience of suicidal ideation, planning, and attempt was assessed between the 8th (2013) and 10th wave (2015) of the surveys. After adjusting for confounders including lifetime experience of suicidal behaviors at the 7th wave of the survey (2012) as well as socio-demographic information, the low interpersonal trust group was more likely to experience suicidal ideation (OR: 1.30, 95% CI: 1.11-1.53) compared to the high interpersonal trust group, whereas no statistically significant association was observed in the reciprocity analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. What's in a Face? Visual Contributions to Speech Segmentation

    ERIC Educational Resources Information Center

    Mitchel, Aaron D.; Weiss, Daniel J.

    2010-01-01

    Recent research has demonstrated that adults successfully segment two interleaved artificial speech streams with incongruent statistics (i.e., streams whose combined statistics are noisier than the encapsulated statistics) only when provided with an indexical cue of speaker voice. In a series of five experiments, our study explores whether…

  6. SOCR: Statistics Online Computational Resource

    ERIC Educational Resources Information Center

    Dinov, Ivo D.

    2006-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an…

  7. Flexibility in Statistical Word Segmentation: Finding Words in Foreign Speech

    ERIC Educational Resources Information Center

    Graf Estes, Katharine; Gluck, Stephanie Chen-Wu; Bastos, Carolina

    2015-01-01

    The present experiments investigated the flexibility of statistical word segmentation. There is ample evidence that infants can use statistical cues (e.g., syllable transitional probabilities) to segment fluent speech. However, it is unclear how effectively infants track these patterns in unfamiliar phonological systems. We examined whether…

  8. Second Language Experience Facilitates Statistical Learning of Novel Linguistic Materials

    ERIC Educational Resources Information Center

    Potter, Christine E.; Wang, Tianlin; Saffran, Jenny R.

    2017-01-01

    Recent research has begun to explore individual differences in statistical learning, and how those differences may be related to other cognitive abilities, particularly their effects on language learning. In this research, we explored a different type of relationship between language learning and statistical learning: the possibility that learning…

  9. K-nearest neighbors based methods for identification of different gear crack levels under different motor speeds and loads: Revisited

    NASA Astrophysics Data System (ADS)

    Wang, Dong

    2016-03-01

    Gears are the most commonly used components in mechanical transmission systems. Their failures may cause transmission system breakdown and result in economic loss. Identification of different gear crack levels is important to prevent any unexpected gear failure, because gear cracks lead to gear tooth breakage. Signal processing based methods mainly require expertise to interpret gear fault signatures, which ordinary users usually cannot easily achieve. In order to automatically identify different gear crack levels, intelligent gear crack identification methods should be developed. The previous case studies experimentally proved that K-nearest neighbors based methods exhibit high prediction accuracies for identification of 3 different gear crack levels under different motor speeds and loads. In this short communication, to further enhance prediction accuracies of existing K-nearest neighbors based methods and extend identification of 3 different gear crack levels to identification of 5 different gear crack levels, redundant statistical features are constructed by using Daubechies 44 (db44) binary wavelet packet transform at different wavelet decomposition levels, prior to the use of a K-nearest neighbors method. The dimensionality of redundant statistical features is 620, which provides richer gear fault signatures. Since many of these statistical features are redundant and highly correlated with each other, dimensionality reduction of redundant statistical features is conducted to obtain new significant statistical features. Finally, the K-nearest neighbors method is used to identify 5 different gear crack levels under different motor speeds and loads. A case study including 3 experiments is investigated to demonstrate that the developed method provides higher prediction accuracies than the existing K-nearest neighbors based methods for recognizing different gear crack levels under different motor speeds and loads.
Based on the new significant statistical features, some other popular statistical models including linear discriminant analysis, quadratic discriminant analysis, classification and regression tree and naive Bayes classifier, are compared with the developed method. The results show that the developed method has the highest prediction accuracies among these statistical models. Additionally, selection of the number of new significant features and parameter selection of K-nearest neighbors are thoroughly investigated.
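
    A minimal sketch of the classification step is given below. Synthetic Gaussian clusters stand in for the paper's wavelet-packet statistical features after dimensionality reduction, and the plain Euclidean majority-vote K-nearest neighbors rule is an assumption about the variant used; the cluster means, dimensions, and sample sizes are all illustrative.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=5):
    """Classify each test row by majority vote among its k nearest
    training rows (Euclidean distance)."""
    preds = []
    for x in X_test:
        d = np.linalg.norm(X_train - x, axis=1)       # distances to all training rows
        nearest = y_train[np.argsort(d)[:k]]          # labels of the k nearest
        vals, counts = np.unique(nearest, return_counts=True)
        preds.append(vals[np.argmax(counts)])         # majority vote
    return np.array(preds)

rng = np.random.default_rng(0)
# Hypothetical stand-in for reduced wavelet-packet statistical features:
# 5 crack levels, each a Gaussian cluster in an 8-dimensional feature space.
levels, dim, per_level = 5, 8, 40
X = np.vstack([rng.normal(loc=3.0 * lvl, scale=1.0, size=(per_level, dim))
               for lvl in range(levels)])
y = np.repeat(np.arange(levels), per_level)

# Simple random train/test split
idx = rng.permutation(len(y))
tr, te = idx[:150], idx[150:]
acc = (knn_predict(X[tr], y[tr], X[te], k=5) == y[te]).mean()
```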

  10. Experimental design, power and sample size for animal reproduction experiments.

    PubMed

    Chapman, Phillip L; Seidel, George E

    2008-01-01

    The present paper concerns statistical issues in the design of animal reproduction experiments, with emphasis on the problems of sample size determination and power calculations. We include examples and non-technical discussions aimed at helping researchers avoid serious errors that may invalidate or seriously weaken the conclusions drawn from experiments. Screen shots from interactive power calculation programs and basic SAS power calculation programs are presented to aid in understanding statistical power and computing power in some common experimental situations. Practical issues that are common to most statistical design problems are briefly discussed. These include one-sided hypothesis tests, power level criteria, equality of within-group variances, transformations of response variables to achieve variance equality, optimal specification of treatment group sizes, 'post hoc' power analysis and arguments for the increased use of confidence intervals in place of hypothesis tests.
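
    A normal-approximation sketch of the power and sample-size logic discussed above follows; it is not the SAS or interactive programs shown in the paper, and the two-sided two-sample z-test form (rather than the exact noncentral t) is an assumption made to keep the example self-contained.

```python
from math import sqrt
from statistics import NormalDist

def power_two_sample(delta, sigma, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample z-test for a mean
    difference `delta`, common SD `sigma`, and `n_per_group` animals per group."""
    nd = NormalDist()
    z_a = nd.inv_cdf(1 - alpha / 2)
    ncp = delta / (sigma * sqrt(2.0 / n_per_group))  # standardized group difference
    return nd.cdf(ncp - z_a) + nd.cdf(-ncp - z_a)

def n_for_power(delta, sigma, target=0.8, alpha=0.05):
    """Smallest per-group sample size reaching the target power."""
    n = 2
    while power_two_sample(delta, sigma, n, alpha) < target:
        n += 1
    return n
```

For a one-SD difference (delta = sigma) at 80% power and alpha = 0.05, this search reproduces the familiar rule-of-thumb answer of about 16 animals per group.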

  11. Small-scale plasma turbulence and intermittency in the high latitude F region based on the ICI-2 sounding rocket experiment

    NASA Astrophysics Data System (ADS)

    Spicher, A.; Miloch, W.; Moen, J. I.; Clausen, L. B. N.

    2015-12-01

    Small-scale plasma irregularities and turbulence are common phenomena in the F layer of the ionosphere, both in the equatorial and polar regions. A common approach to analyzing data from experiments on space and ionospheric plasma irregularities is the power spectrum. Power spectra give no information about the phases of the waveforms, and thus do not allow one to determine whether some of the phases are correlated or whether they exhibit a random character. The former case would imply the presence of nonlinear wave-wave interactions, while the latter suggests a more turbulent-like process. Discerning between these mechanisms is crucial for understanding high latitude plasma irregularities and can be addressed with bispectral analysis and higher order statistics. In this study, we use higher order spectra and statistics to analyze electron density data observed with the ICI-2 sounding rocket experiment at a meter-scale resolution. The main objective of ICI-2 was to investigate plasma irregularities in the cusp in the F layer ionosphere. We study in detail two regions intersected during the rocket flight and which are characterized by large density fluctuations: a trailing edge of a cold polar cap patch, and a density enhancement subject to cusp auroral particle precipitation. While these two regions exhibit similar power spectra, our analysis reveals that their internal structure is different. The structures on the edge of the polar cap patch are characterized by significant coherent mode coupling and intermittency, while the plasma enhancement associated with precipitation exhibits stronger random characteristics. This indicates that particle precipitation may play a fundamental role in ionospheric plasma structuring by creating turbulent-like structures.

  12. Data analysis report on ATS-F COMSAT millimeter wave propagation experiment, part 1. [effects of hydrometeors on ground to satellite communication

    NASA Technical Reports Server (NTRS)

    Hyde, G.

    1976-01-01

    The 13/18 GHz COMSAT Propagation Experiment (CPE) was performed to measure attenuation caused by hydrometeors along slant paths from transmitting terminals on the ground to the ATS-6 satellite. The effectiveness of site diversity in overcoming this impairment was also studied. Problems encountered in assembling a valid data base of rain induced attenuation data for statistical analysis are considered. The procedures used to obtain the various statistics are then outlined. The graphs and tables of statistical data for the 15 dual frequency (13 and 18 GHz) site diversity locations are discussed. Cumulative rain rate statistics for the Fayetteville and Boston sites based on point rainfall data collected are presented along with extrapolations of the attenuation and point rainfall data.

  13. Effect of theory-based intervention to promote physical activity among adolescent girls: a randomized control trial

    PubMed Central

    Darabi, Fatemeh; Kaveh, Mohammad Hossein; Majlessi, Fereshteh; Farahani, Farideh Khalaj Abadi; Yaseri, Mehdi; Shojaeizadeh, Davoud

    2017-01-01

    Background Physical activity (PA) rates decline among most high school female students, and due to cultural restrictions, the reduction of physical activity might be exacerbated in female Iranian adolescents. Objective To determine the effects of a theory-based physical activity education intervention to promote activity among adolescent girls. Methods This randomized clinical trial was conducted at public high schools in Tehran, Iran, from September 2015 to July 2016 on 578 girls. The subjects were assigned randomly to two groups of experiment and control (n=289 per group). All participants in the experimental group received an educational program based on a modified TPB. Measures were assessed before and 6 months after the experiment. The data were analyzed using SPSS version 23. We used descriptive statistics, multilevel analysis, and the likelihood ratio (LR) test; p-values less than 0.05 were considered statistically significant. Results Five hundred and seventy-eight participants with a mean age of 14.26±0.96 years were studied in two groups of experiment (n=289) and control (n=289). Adjusted for the baseline values, the mean scores for knowledge (84.1±13.6), attitude (31.2±13.6), subjective norm (40.4±11.1), behavioral intention (34.3±14.7), perceived behavioral control (38.4±11.6), perceived parental control (42.9±14.2), and behavior (42.6±17.1) were significantly higher in the experiment group compared with the control group (p<0.001). Conclusions The results of this study indicate that a theory-based educational intervention is effective in improving physical activity in adolescents. This result can be used to strengthen adolescent health promotion. Trial registration The trial was registered at the Iranian Registry of Clinical Trials (IRCT) with the identification number: IRCT2015070623089N2. Funding The authors received no financial support for the research from Kermanshah University of Medical Sciences. PMID:28607661

  14. The Role of Design-of-Experiments in Managing Flow in Compact Air Vehicle Inlets

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Miller, Daniel N.; Gridley, Marvin C.; Agrell, Johan

    2003-01-01

    It is the purpose of this study to demonstrate the viability and economy of Design-of-Experiments methodologies to arrive at microscale secondary flow control array designs that maintain optimal inlet performance over a wide range of the mission variables and to explore how these statistical methods provide a better understanding of the management of flow in compact air vehicle inlets. These statistical design concepts were used to investigate the robustness properties of low unit strength micro-effector arrays. Low unit strength micro-effectors are micro-vanes set at very low angles-of-incidence with very long chord lengths. They were designed to influence the near wall inlet flow over an extended streamwise distance, and their advantage lies in low total pressure loss and high effectiveness in managing engine face distortion. The term robustness is used in this paper in the same sense as it is used in the industrial problem solving community. It refers to minimizing the effects of the hard-to-control factors that influence the development of a product or process. In Robustness Engineering, the effects of the hard-to-control factors are often called 'noise', and the hard-to-control factors themselves are referred to as the environmental variables or sometimes as the Taguchi noise variables. Hence Robust Optimization refers to minimizing the effects of the environmental or noise variables on the development (design) of a product or process. In the management of flow in compact inlets, the environmental or noise variables can be identified with the mission variables. Therefore this paper formulates a statistical design methodology that minimizes the impact of variations in the mission variables on inlet performance and demonstrates that these statistical design concepts can lead to simpler inlet flow management systems.

  15. TSP Symposium 2012 Proceedings

    DTIC Science & Technology

    2012-11-01

    and Statistical Model 78 7.3 Analysis and Results 79 7.4 Threats to Validity and Limitations 85 7.5 Conclusions 86 7.6 Acknowledgments 87 7.7...Table 12: Overall Statistics of the Experiment 32 Table 13: Results of Pairwise ANOVA Analysis, Highlighting Statistically Significant Differences...we calculated the percentage of defects injected. The distribution statistics are shown in Table 2. Table 2: Mean Lower, Upper Confidence Interval

  16. Molecular Dynamics of Hot Dense Plasmas: New Horizons

    NASA Astrophysics Data System (ADS)

    Graziani, Frank

    2011-10-01

    We describe the status of a new time-dependent simulation capability for hot dense plasmas. The backbone of this multi-institutional computational and experimental effort--the Cimarron Project--is the massively parallel molecular dynamics (MD) code ``ddcMD''. The project's focus is material conditions such as exist in inertial confinement fusion experiments, and in many stellar interiors: high temperatures, high densities, significant electromagnetic fields, mixtures of high- and low-Z elements, and non-Maxwellian particle distributions. Of particular importance is our ability to incorporate into this classical MD code key atomic, radiative, and nuclear processes, so that their interacting effects under non-ideal plasma conditions can be investigated. This talk summarizes progress in computational methodology, discusses strengths and weaknesses of quantum statistical potentials as effective interactions for MD, explains the model used for quantum events possibly occurring in a collision, and highlights some significant results obtained to date. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  17. Chemical quality of bottom sediments in selected streams, Jefferson County, Kentucky, April-July 1992

    USGS Publications Warehouse

    Moore, B.L.; Evaldi, R.D.

    1995-01-01

    Bottom sediments from 25 stream sites in Jefferson County, Ky., were analyzed for percent volatile solids and concentrations of nutrients, major metals, trace elements, miscellaneous inorganic compounds, and selected organic compounds. Statistical high outliers of the constituent concentrations analyzed for in the bottom sediments were defined as a measure of possible elevated concentrations. Statistical high outliers were determined for at least 1 constituent at each of 12 sampling sites in Jefferson County. Of the 10 stream basins sampled in Jefferson County, the Middle Fork Beargrass Basin, Cedar Creek Basin, and Harrods Creek Basin were the only three basins where a statistical high outlier was not found for any of the measured constituents. In the Pennsylvania Run Basin, total volatile solids, nitrate plus nitrite, and endrin constituents were statistical high outliers. Pond Creek was the only basin where five constituents were statistical high outliers: barium, beryllium, cadmium, chromium, and silver. Nitrate plus nitrite and copper constituents were the only statistical high outliers found in the Mill Creek Basin. In the Floyds Fork Basin, nitrate plus nitrite, phosphorus, mercury, and silver constituents were the only statistical high outliers. Ammonia was the only statistical high outlier found in the South Fork Beargrass Basin. In the Goose Creek Basin, mercury and silver constituents were the only statistical high outliers. Cyanide was the only statistical high outlier in the Muddy Fork Basin.
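
    The report does not state its outlier rule in this abstract; a common boxplot-style definition of a "statistical high outlier" (values above Q3 + 1.5 × IQR) can be sketched as follows, with the rule itself a hypothetical stand-in for the one used:

```python
def high_outliers(values):
    """Return values above the upper boxplot fence, Q3 + 1.5*IQR.
    (A hypothetical stand-in for the report's outlier definition.)"""
    xs = sorted(values)
    n = len(xs)

    def quartile(q):
        # Linear interpolation between order statistics
        pos = q * (n - 1)
        lo = int(pos)
        frac = pos - lo
        return xs[lo] + frac * (xs[min(lo + 1, n - 1)] - xs[lo])

    q1, q3 = quartile(0.25), quartile(0.75)
    fence = q3 + 1.5 * (q3 - q1)
    return [v for v in values if v > fence]
```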

  18. Statistical evaluation of the metallurgical test data in the ORR-PSF-PVS irradiation experiment. [PWR; BWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stallmann, F.W.

    1984-08-01

    A statistical analysis of Charpy test results of the two-year Pressure Vessel Simulation metallurgical irradiation experiment was performed. Determination of transition temperature and upper shelf energy derived from computer fits compare well with eyeball fits. Uncertainties for all results can be obtained with computer fits. The results were compared with predictions in Regulatory Guide 1.99 and other irradiation damage models.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le, K. C.; Tran, T. M.; Langer, J. S.

    The statistical-thermodynamic dislocation theory developed in previous papers is used here in an analysis of high-temperature deformation of aluminum and steel. Using physics-based parameters that we expect theoretically to be independent of strain rate and temperature, we are able to fit experimental stress-strain curves for three different strain rates and three different temperatures for each of these two materials. Here, our theoretical curves include yielding transitions at zero strain in agreement with experiment. We find that thermal softening effects are important even at the lowest temperatures and smallest strain rates.

  20. Measurement of the D(s)+ lifetime.

    PubMed

    Link, J M; Yager, P M; Anjos, J C; Bediaga, I; Castromonte, C; Machado, A A; Magnin, J; Massafferi, A; de Miranda, J M; Pepe, I M; Polycarpo, E; dos Reis, A C; Carrillo, S; Casimiro, E; Cuautle, E; Sánchez-Hernández, A; Uribe, C; Vázquez, F; Agostino, L; Cinquini, L; Cumalat, J P; O'Reilly, B; Segoni, I; Stenson, K; Butler, J N; Cheung, H W K; Chiodini, G; Gaines, I; Garbincius, P H; Garren, L A; Gottschalk, E; Kasper, P H; Kreymer, A E; Kutschke, R; Wang, M; Benussi, L; Bertani, M; Bianco, S; Fabbri, F L; Pacetti, S; Zallo, A; Reyes, M; Cawlfield, C; Kim, D Y; Rahimi, A; Wiss, J; Gardner, R; Kryemadhi, A; Chung, Y S; Kang, J S; Ko, B R; Kwak, J W; Lee, K B; Cho, K; Park, H; Alimonti, G; Barberis, S; Boschini, M; Cerutti, A; D'Angelo, P; DiCorato, M; Dini, P; Edera, L; Erba, S; Inzani, P; Leveraro, F; Malvezzi, S; Menasce, D; Mezzadri, M; Milazzo, L; Moroni, L; Pedrini, D; Pontoglio, C; Prelz, F; Rovere, M; Sala, S; Davenport, T F; Arena, V; Boca, G; Bonomi, G; Gianini, G; Liguori, G; Pegna, D Lopes; Merlo, M M; Pantea, D; Ratti, S P; Riccardi, C; Vitulo, P; Göbel, C; Hernandez, H; Lopez, A M; Mendez, H; Paris, A; Quinones, J; Ramirez, J E; Zhang, Y; Wilson, J R; Handler, T; Mitchell, R; Engh, D; Hosack, M; Johns, W E; Luiggi, E; Moore, J E; Nehring, M; Sheldon, P D; Vaandering, E W; Webster, M; Sheaff, M

    2005-07-29

    A high statistics measurement of the D(s)+ lifetime from the Fermilab fixed-target FOCUS photoproduction experiment is presented. We describe the analysis of the two decay modes, D(s)+ --> phi(1020)pi+ and D(s)+ -->K*(892)0K+, used for the measurement. The measured lifetime is 507.4 +/- 5.5(stat) +/- 5.1(syst) fs using 8961 +/- 105 D(s)+ --> phi(1020)pi+ and 4680 +/- 90 D(s)+ --> K*(892)0K+ decays. This is a significant improvement over the present world average.

  1. Determining significant material properties: A discovery approach

    NASA Technical Reports Server (NTRS)

    Karplus, Alan K.

    1992-01-01

    The following is a laboratory experiment designed to further understanding of materials science. The experiment itself can be informative for persons of any age past elementary school, and even for some in elementary school. The preparation of the plastic samples is readily accomplished by persons with reasonable dexterity in the cutting of paper designs. The completion of the statistical Design of Experiments, which uses Yates' Method, requires basic math (addition and subtraction). Interpretive work requires plotting of data and making observations. Knowledge of statistical methods would be helpful. The purpose of this experiment is to acquaint students with the seven classes of recyclable plastics, and provide hands-on learning about the response of these plastics to mechanical tensile loading.
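
    Yates' Method referred to above can be sketched as follows: for a 2^k factorial experiment, k passes of pairwise sums and differences over the response column yield the grand total and all effect contrasts. This is a generic implementation, assuming responses are listed in standard Yates order ((1), a, b, ab, c, ac, ...), not the worksheet used in the lab exercise.

```python
def yates(responses):
    """Yates' method for a 2^k factorial experiment.
    `responses` lists treatment totals in standard (Yates) order.
    Returns the column after k add/subtract passes: entry 0 is the
    grand total, the remaining entries are the effect contrasts."""
    n = len(responses)
    k = n.bit_length() - 1
    assert n == 2 ** k, "length must be a power of two"
    col = list(responses)
    for _ in range(k):
        # One pass: sums of adjacent pairs, then differences of adjacent pairs
        sums = [col[i] + col[i + 1] for i in range(0, n, 2)]
        diffs = [col[i + 1] - col[i] for i in range(0, n, 2)]
        col = sums + diffs
    return col
```

For a 2^2 design with responses (1)=10, a=14, b=12, ab=18, the passes give [54, 10, 6, 2]: grand total 54, then the A, B, and AB contrasts.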

  2. The exact probability distribution of the rank product statistics for replicated experiments.

    PubMed

    Eisinga, Rob; Breitling, Rainer; Heskes, Tom

    2013-03-18

    The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
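
    The rank product statistic itself (though not its exact distribution, which is the paper's contribution) can be sketched as below; a permutation approximation stands in for the exact tail probability, and the toy data, seed, and ranking direction (rank 1 = most up-regulated) are illustrative assumptions.

```python
import random

def rank_products(data):
    """Rank product statistic for each gene across k replicates.
    `data` is a list of k replicate lists (e.g. fold changes); within
    each replicate, rank 1 goes to the largest value."""
    k, n = len(data), len(data[0])
    rp = [1.0] * n
    for rep in data:
        order = sorted(range(n), key=lambda g: -rep[g])
        for rank, g in enumerate(order, start=1):
            rp[g] *= rank
    return [r ** (1.0 / k) for r in rp]  # geometric mean of ranks

def permutation_pvalue(data, gene, n_perm=2000, seed=1):
    """Permutation estimate of P(RP <= observed) under the null of
    independent uniform ranks in each replicate."""
    rng = random.Random(seed)
    k, n = len(data), len(data[0])
    observed = rank_products(data)[gene]
    hits = 0
    for _ in range(n_perm):
        prod = 1.0
        for _ in range(k):
            prod *= rng.randint(1, n)  # a random rank per replicate
        if prod ** (1.0 / k) <= observed:
            hits += 1
    return hits / n_perm

# Toy example: gene 0 is top-ranked in every replicate
data = [[5.0, 1.0, 2.0], [6.0, 2.0, 1.0], [4.0, 0.0, 3.0]]
rp = rank_products(data)
p0 = permutation_pvalue(data, gene=0)
```

As the abstract notes, such sampling-based approximations are imperfect for small tail probabilities, which is exactly the regime the exact distribution addresses.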

  3. Statistical Machine Learning for Structured and High Dimensional Data

    DTIC Science & Technology

    2014-09-17

    AFRL-OSR-VA-TR-2014-0234. Statistical Machine Learning for Structured and High Dimensional Data. Larry Wasserman, Carnegie Mellon University. Final report, Dec 2009 - Aug 2014. Research in the area of resource-constrained statistical estimation. Keywords: machine learning, high-dimensional statistics.

  4. Hierarchical Bayesian modelling of gene expression time series across irregularly sampled replicates and clusters.

    PubMed

    Hensman, James; Lawrence, Neil D; Rattray, Magnus

    2013-08-20

    Time course data from microarrays and high-throughput sequencing experiments require simple, computationally efficient and powerful statistical models to extract meaningful biological signal, and for tasks such as data fusion and clustering. Existing methodologies fail to capture either the temporal or replicated nature of the experiments, and often impose constraints on the data collection process, such as regularly spaced samples, or similar sampling schema across replications. We propose hierarchical Gaussian processes as a general model of gene expression time-series, with application to a variety of problems. In particular, we illustrate the method's capacity for missing data imputation, data fusion and clustering.The method can impute data which is missing both systematically and at random: in a hold-out test on real data, performance is significantly better than commonly used imputation methods. The method's ability to model inter- and intra-cluster variance leads to more biologically meaningful clusters. The approach removes the necessity for evenly spaced samples, an advantage illustrated on a developmental Drosophila dataset with irregular replications. The hierarchical Gaussian process model provides an excellent statistical basis for several gene-expression time-series tasks. It has only a few additional parameters over a regular GP, has negligible additional complexity, is easily implemented and can be integrated into several existing algorithms. Our experiments were implemented in python, and are available from the authors' website: http://staffwww.dcs.shef.ac.uk/people/J.Hensman/.
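
    A minimal numpy sketch of the hierarchical idea is given below: a shared gene-level GP plus independent replicate-level GPs, so observations in the same replicate share both covariance terms while observations in different replicates share only the gene-level term. The squared-exponential kernels, their parameters, and the sampling times are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rbf(t1, t2, variance, lengthscale):
    """Squared-exponential covariance between two sets of time points."""
    d = t1[:, None] - t2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

# Irregular, replicate-specific sampling times: the setting the model targets
t_rep1 = np.array([0.0, 1.0, 2.5, 4.0])
t_rep2 = np.array([0.5, 1.5, 3.0, 3.5, 5.0])
t = np.concatenate([t_rep1, t_rep2])
rep = np.array([0] * len(t_rep1) + [1] * len(t_rep2))

# Hierarchical covariance: shared gene-level term for all points,
# replicate-level term only between points in the same replicate.
K_gene = rbf(t, t, variance=1.0, lengthscale=1.5)
K_rep = rbf(t, t, variance=0.3, lengthscale=1.0) * (rep[:, None] == rep[None, :])
K = K_gene + K_rep + 1e-8 * np.eye(len(t))  # jitter for numerical stability

rng = np.random.default_rng(0)
sample = rng.multivariate_normal(np.zeros(len(t)), K)  # one draw from the prior
```

Cross-replicate covariance reduces to the gene-level kernel alone, which is what ties replicates to a common underlying expression profile.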

  5. Modelling solute dispersion in periodic heterogeneous porous media: Model benchmarking against intermediate scale experiments

    NASA Astrophysics Data System (ADS)

    Majdalani, Samer; Guinot, Vincent; Delenne, Carole; Gebran, Hicham

    2018-06-01

    This paper is devoted to theoretical and experimental investigations of solute dispersion in heterogeneous porous media. Dispersion in heterogeneous porous media has been reported to be scale-dependent, a likely indication that the proposed dispersion models are incompletely formulated. A high quality experimental data set of breakthrough curves in periodic model heterogeneous porous media is presented. In contrast with most previously published experiments, the present experiments involve numerous replicates. This allows the statistical variability of experimental data to be accounted for. Several models are benchmarked against the data set: the Fickian-based advection-dispersion, mobile-immobile, multirate, multiple region advection dispersion models, and a newly proposed transport model based on pure advection. A salient property of the latter model is that its solutions exhibit a ballistic behaviour for small times, while tending to the Fickian behaviour for large time scales. Model performance is assessed using a novel objective function accounting for the statistical variability of the experimental data set, while putting equal emphasis on both small and large time scale behaviours. Besides being as accurate as the other models, the new purely advective model has the advantages that (i) it does not exhibit the undesirable effects associated with the usual Fickian operator (namely the infinite solute front propagation speed), and (ii) it allows dispersive transport to be simulated on every heterogeneity scale using scale-independent parameters.

  6. Crackling to periodic transition in a granular stick-slip experiment

    NASA Astrophysics Data System (ADS)

    Abed Zadeh, Aghil; Barés, Jonathan; Behringer, Robert

    We perform a stick-slip experiment to characterize avalanches in time and space for granular materials. In our experiment, a constant speed stage pulls a slider which rests on a vertical bed of circular photo-elastic particles in a 2D system. The stage is connected to the slider by a spring. We measure the force on the spring by a force sensor attached to the spring. We study the avalanche size statistics and other seismicity laws of slip avalanches. Using the power spectrum of the force signal and avalanche statistics, we analyze the effect of the loading speed and of the spring stiffness, and we capture a transition from a crackling to a periodic regime by changing these parameters. From a more local point of view, using a high speed camera and the photo-elastic properties of our particles, we characterize the local stress change and flow of particles during slip avalanches. By image processing, we detect local avalanches as connected components in space and time, and we study the avalanche size probability density functions (PDFs). The PDFs of avalanches obey power laws both at global and local scales, but with different exponents. We try to understand the correlation of local avalanches in space and the way they coarse-grain into the global avalanches. NSF Grant DMR-1206351, NASA Grant NNX15AD38G, and the William M. Keck Foundation.

  7. An electron fixed target experiment to search for a new vector boson A' decaying to e+e-

    DOE PAGES

    Essig, Rouven; Schuster, Philip; Toro, Natalia; ...

    2011-02-02

    We describe an experiment to search for a new vector boson A' with weak coupling α' > 6 × 10⁻⁸ α to electrons (α' = e′²/4π) in the mass range 65 MeV < m_A' < 550 MeV. New vector bosons with such small couplings arise naturally from a small kinetic mixing of the "dark photon" A' with the photon -- one of the very few ways in which new forces can couple to the Standard Model -- and have received considerable attention as an explanation of various dark-matter-related anomalies. A' bosons are produced by radiation off an electron beam, and could appear as narrow resonances with small production cross-section in the trident e+e- spectrum. We summarize the experimental approach described in a proposal submitted to Jefferson Laboratory's PAC35, PR-10-009. This experiment, the A' Experiment (APEX), uses the electron beam of the Continuous Electron Beam Accelerator Facility (CEBAF) at Jefferson Laboratory at energies of ~1-4 GeV incident on 0.5-10% radiation length tungsten wire-mesh targets, and measures the resulting e+e- pairs to search for the A' using the High Resolution Spectrometers and the septum magnet in Hall A. With a ~1 month run, APEX will achieve very good sensitivity because the statistics of e+e- pairs will be ~10,000 times larger in the explored mass range than in any previous search for the A' boson. These statistics and the excellent mass resolution of the spectrometers allow sensitivity to α'/α one to three orders of magnitude below current limits, in a region of parameter space of great theoretical and phenomenological interest. Similar experiments could also be performed at other facilities, such as the Mainz Microtron.

  8. [Experience, prevalence and severity of dental caries and its association with nutritional status in Mexican infants 17-47 months].

    PubMed

    Zúñiga-Manríquez, Ana Gabriela; Medina-Solís, Carlo Eduardo; Lara-Carrillo, Edith; Márquez-Corona, María de Lourdes; Robles-Bermeo, Norma Leticia; Scougall-Vilchis, Rogelio José; Maupomé, Gerardo

    2013-01-01

    To determine the experience, prevalence and severity of dental caries and its relationship with nutritional status in nursery infants 17 to 47 months of age. A cross-sectional study was performed in 152 infants 17 to 47 months of age attending one of five day care centers in the city of Pachuca, Hidalgo. Clinical examinations were performed using the methods recommended by the World Health Organization for epidemiologic studies on dental caries. We calculated the caries index (dmft), the significant caries index (SiC) as well as the treatment needs index (TNI) and the care index (CI). Nutritional status was determined using weight and height for age, according to the Federico Gómez classification. Nonparametric tests were used in the statistical analysis. Mean age was 2.52 ± 0.76 years; 51.3% were boys. With regard to nutritional status, 19.1% were classified as malnourished and 19.1% were overweight/obese. The dmft index was 1.53 ± 2.52. The SiC index was 4.14, the TNI 86.3% and the CI 13.7%. Caries prevalence was 48.0%. It was observed that 33.5% of children had 1 to 3 teeth with caries experience and 14.5% had 4 or more teeth affected. Statistically significant differences for tooth decay were identified (p < 0.05) by age, height and weight but not (p > 0.05) by sex and nutritional status. This study shows that nearly half of the children examined had caries experience. High treatment needs for dental caries were observed. A correlation was found between dmft index and age, weight and height. No association was identified between experience, prevalence and severity of dental caries and nutritional status of infants. It appears necessary to improve oral health preventive measures in these infants.

  9. Optimized Design and Analysis of Sparse-Sampling fMRI Experiments

    PubMed Central

    Perrachione, Tyler K.; Ghosh, Satrajit S.

    2013-01-01

    Sparse-sampling is an important methodological advance in functional magnetic resonance imaging (fMRI), in which silent delays are introduced between MR volume acquisitions, allowing for the presentation of auditory stimuli without contamination by acoustic scanner noise and for overt vocal responses without motion-induced artifacts in the functional time series. As such, the sparse-sampling technique has become a mainstay of principled fMRI research into the cognitive and systems neuroscience of speech, language, hearing, and music. Despite being in use for over a decade, there has been little systematic investigation of the acquisition parameters, experimental design considerations, and statistical analysis approaches that bear on the results and interpretation of sparse-sampling fMRI experiments. In this report, we examined how design and analysis choices related to the duration of repetition time (TR) delay (an acquisition parameter), stimulation rate (an experimental design parameter), and model basis function (an analysis parameter) act independently and interactively to affect the neural activation profiles observed in fMRI. First, we conducted a series of computational simulations to explore the parameter space of sparse design and analysis with respect to these variables; second, we validated the results of these simulations in a series of sparse-sampling fMRI experiments. Overall, these experiments suggest the employment of three methodological approaches that can, in many situations, substantially improve the detection of neurophysiological response in sparse fMRI: (1) Sparse analyses should utilize a physiologically informed model that incorporates hemodynamic response convolution to reduce model error. (2) The design of sparse fMRI experiments should maintain a high rate of stimulus presentation to maximize effect size. 
    (3) TR delays of short to intermediate length can be used between acquisitions of sparse-sampled functional image volumes to increase the number of samples and improve statistical power. PMID:23616742
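
The interplay of acquisition and design parameters described in this record can be illustrated with a minimal simulation. The sketch below is not the authors' code: the gamma-variate response function, the stimulation rate, and the TR values are illustrative assumptions standing in for a physiologically informed hemodynamic model. It shows how a rapid stimulus train, convolved with a hemodynamic response and read out only at sparse acquisition times, yields an accumulated response at each volume.

```python
import math

def hrf(t, shape=6.0, scale=1.0):
    """Illustrative gamma-variate hemodynamic response (not the canonical
    SPM double-gamma); peaks a few seconds after stimulus onset."""
    if t <= 0:
        return 0.0
    return (t ** (shape - 1)) * math.exp(-t / scale) / (math.gamma(shape) * scale ** shape)

def predicted_bold(stim_onsets, sample_times):
    """Convolve a train of brief stimuli with the HRF, then read the result
    only at the sparse acquisition times (one volume per TR)."""
    return [sum(hrf(t - onset) for onset in stim_onsets) for t in sample_times]

# A high stimulation rate (one stimulus every 2 s) sampled with a sparse
# TR of 10 s: each acquisition lands on an accumulated, near-peak response.
onsets = [2.0 * k for k in range(20)]        # stimuli at 0, 2, 4, ... s
tr = 10.0
samples = [tr * (k + 1) for k in range(4)]   # volumes at 10, 20, 30, 40 s
signal = predicted_bold(onsets, samples)
```

Varying `tr` and the onset spacing in such a simulation reproduces the qualitative trade-off the authors examine: faster stimulation raises the sampled response amplitude, while shorter TR delays yield more samples per run.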

  10. Optimized design and analysis of sparse-sampling FMRI experiments.

    PubMed

    Perrachione, Tyler K; Ghosh, Satrajit S

    2013-01-01

    Sparse-sampling is an important methodological advance in functional magnetic resonance imaging (fMRI), in which silent delays are introduced between MR volume acquisitions, allowing for the presentation of auditory stimuli without contamination by acoustic scanner noise and for overt vocal responses without motion-induced artifacts in the functional time series. As such, the sparse-sampling technique has become a mainstay of principled fMRI research into the cognitive and systems neuroscience of speech, language, hearing, and music. Despite being in use for over a decade, there has been little systematic investigation of the acquisition parameters, experimental design considerations, and statistical analysis approaches that bear on the results and interpretation of sparse-sampling fMRI experiments. In this report, we examined how design and analysis choices related to the duration of repetition time (TR) delay (an acquisition parameter), stimulation rate (an experimental design parameter), and model basis function (an analysis parameter) act independently and interactively to affect the neural activation profiles observed in fMRI. First, we conducted a series of computational simulations to explore the parameter space of sparse design and analysis with respect to these variables; second, we validated the results of these simulations in a series of sparse-sampling fMRI experiments. Overall, these experiments suggest the employment of three methodological approaches that can, in many situations, substantially improve the detection of neurophysiological response in sparse fMRI: (1) Sparse analyses should utilize a physiologically informed model that incorporates hemodynamic response convolution to reduce model error. (2) The design of sparse fMRI experiments should maintain a high rate of stimulus presentation to maximize effect size. 
    (3) TR delays of short to intermediate length can be used between acquisitions of sparse-sampled functional image volumes to increase the number of samples and improve statistical power.

  11. Demonstration of improved sensitivity of echo interferometers to gravitational acceleration

    NASA Astrophysics Data System (ADS)

    Mok, C.; Barrett, B.; Carew, A.; Berthiaume, R.; Beattie, S.; Kumarakrishnan, A.

    2013-08-01

    We have developed two configurations of an echo interferometer that rely on standing-wave excitation of a laser-cooled sample of rubidium atoms. Both configurations can be used to measure acceleration a along the axis of excitation. For a two-pulse configuration, the signal from the interferometer is modulated at the recoil frequency and exhibits a sinusoidal frequency chirp as a function of pulse spacing. In comparison, for a three-pulse stimulated-echo configuration, the signal is observed without recoil modulation and exhibits a modulation at a single frequency as a function of pulse spacing. The three-pulse configuration is less sensitive to effects of vibrations and magnetic field curvature, leading to a longer experimental time scale. For both configurations of the atom interferometer (AI), we show that a measurement of acceleration with a statistical precision of 0.5% can be realized by analyzing the shape of the echo envelope that has a temporal duration of a few microseconds. Using the two-pulse AI, we obtain measurements of acceleration that are statistically precise to 6 parts per million (ppm) on a 25 ms time scale. In comparison, using the three-pulse AI, we obtain measurements of acceleration that are statistically precise to 0.4 ppm on a time scale of 50 ms. A further statistical enhancement is achieved by analyzing the data across the echo envelope so that the statistical error is reduced to 75 parts per billion (ppb). The inhomogeneous field of a magnetized vacuum chamber limited the experimental time scale and resulted in prominent systematic effects. Extended time scales and improved signal-to-noise ratio observed in recent echo experiments using a nonmagnetic vacuum chamber suggest that echo techniques are suitable for a high-precision measurement of gravitational acceleration g. We discuss methods for reducing systematic effects and improving the signal-to-noise ratio. 
    Simulations of both AI configurations with a time scale of 300 ms suggest that an optimized experiment with improved vibration isolation and atoms selected in the m_F = 0 state can result in measurements of g statistically precise to 0.3 ppb for the two-pulse AI and 0.6 ppb for the three-pulse AI.

  12. Statistical learning of movement.

    PubMed

    Ongchoco, Joan Danielle Khonghun; Uddenberg, Stefan; Chun, Marvin M

    2016-12-01

    The environment is dynamic, but objects move in predictable and characteristic ways, whether they are a dancer in motion, or a bee buzzing around in flight. Sequences of movement are composed of simpler motion-trajectory elements chained together. But how do we know where one trajectory element ends and another begins, much like we parse words from continuous streams of speech? As a novel test of statistical learning, we explored the ability to parse continuous movement sequences into simpler element trajectories. Across four experiments, we showed that people can robustly parse such sequences from a continuous stream of trajectories under increasingly stringent tests of segmentation ability and statistical learning. Observers viewed a single dot as it moved along simple sequences of paths, and were later able to discriminate these sequences from novel and partial ones shown at test. Observers demonstrated this ability when there were potentially helpful trajectory-segmentation cues such as a common origin for all movements (Experiment 1); when the dot's motions were entirely continuous and unconstrained (Experiment 2); when sequences were tested against partial sequences as a more stringent test of statistical learning (Experiment 3); and finally, even when the element trajectories were in fact pairs of trajectories, so that abrupt directional changes in the dot's motion could no longer signal inter-trajectory boundaries (Experiment 4). These results suggest that observers can automatically extract regularities in movement - an ability that may underpin our capacity to learn more complex biological motions, as in sport or dance.

  13. Redefining the lower statistical limit in x-ray phase-contrast imaging

    NASA Astrophysics Data System (ADS)

    Marschner, M.; Birnbacher, L.; Willner, M.; Chabior, M.; Fehringer, A.; Herzen, J.; Noël, P. B.; Pfeiffer, F.

    2015-03-01

    Phase-contrast x-ray computed tomography (PCCT) is currently investigated and developed as a potentially very interesting extension of conventional CT, because it promises to provide high soft-tissue contrast for weakly absorbing samples. For data acquisition several images at different grating positions are combined to obtain a phase-contrast projection. For short exposure times, which are necessary for lower radiation dose, the photon counts in a single stepping position are very low. In this case, the currently used phase-retrieval does not provide reliable results for some pixels. This uncertainty results in statistical phase wrapping, which leads to a higher standard deviation in the phase-contrast projections than theoretically expected. For even lower statistics, the phase retrieval breaks down completely and the phase information is lost. New measurement procedures rely on a linear approximation of the sinusoidal phase stepping curve around the zero crossings. In this case only two images are acquired to obtain the phase-contrast projection. The approximation is only valid for small phase values. However, typically nearly all pixels are within this regime due to the differential nature of the signal. We examine the statistical properties of a linear approximation method and illustrate by simulation and experiment that the lower statistical limit can be redefined using this method. That means that the phase signal can be retrieved even with very low photon counts and statistical phase wrapping can be avoided. This is an important step towards enhanced image quality in PCCT with very low photon counts.
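
The two retrieval schemes contrasted in this record can be sketched numerically. The snippet below is an illustrative reconstruction, not the authors' implementation: the stepping-curve parameters (`a0`, `a1`) and the choice of sampling at the zero crossings are assumptions made for demonstration. It compares the standard multi-step Fourier retrieval with the two-image linear approximation, which is valid for the small phase values typical of differential signals.

```python
import math

def stepping_intensity(x, a0=100.0, a1=40.0, phi=0.05):
    """Sinusoidal phase-stepping curve I(x) = a0 + a1*cos(x + phi)."""
    return a0 + a1 * math.cos(x + phi)

def phase_full(intensities):
    """Standard retrieval from N equidistant grating steps via the first
    Fourier coefficient: phi = atan2(-S, C)."""
    n = len(intensities)
    s = sum(v * math.sin(2 * math.pi * k / n) for k, v in enumerate(intensities))
    c = sum(v * math.cos(2 * math.pi * k / n) for k, v in enumerate(intensities))
    return math.atan2(-s, c)

def phase_linear(i_up, i_down, a1):
    """Two-image approximation: sample at the two zero crossings of the
    stepping curve (x = pi/2 and 3*pi/2), where it is steepest and locally
    linear, so phi ~ (I_down - I_up) / (2*a1) for small phi."""
    return (i_down - i_up) / (2.0 * a1)

phi_true = 0.05
full = phase_full([stepping_intensity(2 * math.pi * k / 8) for k in range(8)])
lin = phase_linear(stepping_intensity(math.pi / 2),
                   stepping_intensity(3 * math.pi / 2), a1=40.0)
```

For a noiseless sinusoid the Fourier retrieval is exact, while the linear estimate deviates only at order phi³; the practical advantage of the latter, as the abstract notes, is that it needs two images instead of a full stepping scan and has no arctangent whose output can wrap at low photon counts.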

  14. Age and experience shape developmental changes in the neural basis of language-related learning.

    PubMed

    McNealy, Kristin; Mazziotta, John C; Dapretto, Mirella

    2011-11-01

    Very little is known about the neural underpinnings of language learning across the lifespan and how these might be modified by maturational and experiential factors. Building on behavioral research highlighting the importance of early word segmentation (i.e. the detection of word boundaries in continuous speech) for subsequent language learning, here we characterize developmental changes in brain activity as this process occurs online, using data collected in a mixed cross-sectional and longitudinal design. One hundred and fifty-six participants, ranging from age 5 to adulthood, underwent functional magnetic resonance imaging (fMRI) while listening to three novel streams of continuous speech, which contained either strong statistical regularities, strong statistical regularities and speech cues, or weak statistical regularities providing minimal cues to word boundaries. All age groups displayed significant signal increases over time in temporal cortices for the streams with high statistical regularities; however, we observed a significant right-to-left shift in the laterality of these learning-related increases with age. Interestingly, only the 5- to 10-year-old children displayed significant signal increases for the stream with low statistical regularities, suggesting an age-related decrease in sensitivity to more subtle statistical cues. Further, in a sample of 78 10-year-olds, we examined the impact of proficiency in a second language and level of pubertal development on learning-related signal increases, showing that the brain regions involved in language learning are influenced by both experiential and maturational factors. 2011 Blackwell Publishing Ltd.

  15. Optimizing ELISAs for precision and robustness using laboratory automation and statistical design of experiments.

    PubMed

    Joelsson, Daniel; Moravec, Phil; Troutman, Matthew; Pigeon, Joseph; DePhillips, Pete

    2008-08-20

    Transferring manual ELISAs to automated platforms requires optimizing the assays for each particular robotic platform. These optimization experiments are often time consuming and difficult to perform using a traditional one-factor-at-a-time strategy. In this manuscript we describe the development of an automated process using statistical design of experiments (DOE) to quickly optimize immunoassays for precision and robustness on the Tecan EVO liquid handler. By using fractional factorials and a split-plot design, five incubation time variables and four reagent concentration variables can be optimized in a short period of time.
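
A fractional factorial of the kind mentioned above can be generated in a few lines. This is a generic sketch, not the authors' design: the number of factors and the generator `E = ABCD` are illustrative assumptions. Extra factor columns are built as products of base columns, which is what lets five two-level factors be screened in 16 runs instead of 32.

```python
from itertools import product

def column_product(base, indices):
    """Product of the selected base columns for one run (levels are -1/+1)."""
    p = 1
    for i in indices:
        p *= base[i]
    return p

def fractional_factorial(n_base, generators):
    """Two-level fractional factorial: a full factorial in the base factors,
    plus generated columns (e.g. E = ABCD) aliased onto interactions."""
    runs = []
    for base in product((-1, 1), repeat=n_base):
        extra = tuple(column_product(base, g) for g in generators)
        runs.append(base + extra)
    return runs

# 2^(5-1) half-fraction for five incubation-time factors: 16 runs,
# fifth column generated as E = ABCD (indices 0..3 of the base columns).
design = fractional_factorial(4, [(0, 1, 2, 3)])
```

Every column of the resulting design matrix is balanced (equal numbers of -1 and +1), which is the property that keeps main-effect estimates uncorrelated and makes such designs far more time-efficient on a liquid handler than one-factor-at-a-time optimization.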

  16. Artificial Intelligence Approach to Support Statistical Quality Control Teaching

    ERIC Educational Resources Information Center

    Reis, Marcelo Menezes; Paladini, Edson Pacheco; Khator, Suresh; Sommer, Willy Arno

    2006-01-01

    Statistical quality control--SQC (consisting of Statistical Process Control, Process Capability Studies, Acceptance Sampling and Design of Experiments) is a very important tool to obtain, maintain and improve the Quality level of goods and services produced by an organization. Despite its importance, and the fact that it is taught in technical and…

  17. Developing Young Children's Emergent Inferential Practices in Statistics

    ERIC Educational Resources Information Center

    Makar, Katie

    2016-01-01

    Informal statistical inference has now been researched at all levels of schooling and initial tertiary study. Work in informal statistical inference is least understood in the early years, where children have had little if any exposure to data handling. A qualitative study in Australia was carried out through a series of teaching experiments with…

  18. Forest statistics for New Hampshire

    Treesearch

    Thomas S. Frieswyk; Anne M. Malley

    1985-01-01

    This is a statistical report on the fourth forest survey of New Hampshire conducted in 1982-83 by the Forest Inventory and Analysis Unit, Northeastern Forest Experiment Station. Statistics for forest area, numbers of trees, timber volume, tree biomass, and timber products output are displayed at the state, unit, and county levels. The current inventory indicates that...

  19. Caesarean section on demand: influence of personal birth experience and working environment on attitude of German gynaecologists.

    PubMed

    Faas-Fehervary, Patricia; Schwarz, Kai; Bauer, Lelia; Melchert, Frank

    2005-10-01

    We performed a survey among German obstetricians and gynecologists in order to evaluate the influence of biographic data, working environment and personal birth experience on the attitude towards Cesarean section on demand. All 2106 board-certified gynecologists in Baden-Württemberg received an anonymous questionnaire in 2002-2003 concerning attitude towards C-section on demand, biographical data, personal birth experience and working environment. Seven hundred and nineteen questionnaires were returned and entered into statistical analysis. C-section on demand was approved of in general by 59% of all participants, with large, statistically significant variations according to age, personal birth experience and field of work. When asked for their preferred mode of delivery for themselves or their partner after a low-risk pregnancy, 90% of the responding gynecologists opted for vaginal delivery. This preference depended to a statistically significant degree on parenthood, personal birth experience and working environment. Biographical data, personal birth experience and working environment influence the attitude towards elective Cesarean section. Although 90% would choose vaginal delivery for themselves or their partner as best medical practice, 59% of the physicians approve of the general availability of C-section on demand. This shows that not only best medical practice, but also patient autonomy and forensic aspects, seem to play an important role.

  20. Assessing Statistical Competencies in Clinical and Translational Science Education: One Size Does Not Fit All

    PubMed Central

    Lindsell, Christopher J.; Welty, Leah J.; Mazumdar, Madhu; Thurston, Sally W.; Rahbar, Mohammad H.; Carter, Rickey E.; Pollock, Bradley H.; Cucchiara, Andrew J.; Kopras, Elizabeth J.; Jovanovic, Borko D.; Enders, Felicity T.

    2014-01-01

    Introduction: Statistics is an essential training component for a career in clinical and translational science (CTS). Given the increasing complexity of statistics, learners may have difficulty selecting appropriate courses. Our question was: what depth of statistical knowledge do different CTS learners require? Methods: For three types of CTS learners (principal investigator, co-investigator, informed reader of the literature), each with different backgrounds in research (no previous research experience, reader of the research literature, previous research experience), 18 experts in biostatistics, epidemiology, and research design proposed levels for 21 statistical competencies. Results: Statistical competencies were categorized as fundamental, intermediate, or specialized. CTS learners who intend to become independent principal investigators require more specialized training, while those intending to become informed consumers of the medical literature require more fundamental education. For most competencies, less training was proposed for those with more research background. Discussion: When selecting statistical coursework, the learner's research background and career goal should guide the decision. Some statistical competencies are considered to be more important than others. Baseline knowledge assessments may help learners identify appropriate coursework. Conclusion: Rather than one size fits all, tailoring education to baseline knowledge, learner background, and future goals increases learning potential while minimizing classroom time. PMID:25212569

  1. Physics through the 1990s: Elementary-particle physics

    NASA Astrophysics Data System (ADS)

    The volume begins with a non-mathematical discussion of the motivation behind, and basic ideas of, elementary-particle physics theory and experiment. The progress over the past two decades with the quark model and unification of the electromagnetic and weak interactions is reviewed. Existing theoretical problems in the field, such as the origin of mass and the unification of the fundamental forces, are detailed, along with experimental programs to test the new theories. Accelerators, instrumentation, and detectors are described for both current and future facilities. Interactions with other areas of both theoretical and applied physics are presented. The sociology of the field is examined regarding the education of graduate students, the organization necessary in large-scale experiments, and the decision-making process involved in high-cost experiments. Finally, conclusions and recommendations for maintaining US excellence in theory and experiment are given. Appendices list both current and planned accelerators, and present statistical data on the US elementary-particle physics program. A glossary is included.

  2. Physics through the 1990s: elementary-particle physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1986-01-01

    The volume begins with a non-mathematical discussion of the motivation behind, and basic ideas of, elementary-particle physics theory and experiment. The progress over the past two decades with the quark model and unification of the electromagnetic and weak interactions is reviewed. Existing theoretical problems in the field, such as the origin of mass and the unification of the fundamental forces, are detailed, along with experimental programs to test the new theories. Accelerators, instrumentation, and detectors are described for both current and future facilities. Interactions with other areas of both theoretical and applied physics are presented. The sociology of the field is examined regarding the education of graduate students, the organization necessary in large-scale experiments, and the decision-making process involved in high-cost experiments. Finally, conclusions and recommendations for maintaining US excellence in theory and experiment are given. Appendices list both current and planned accelerators, and present statistical data on the US elementary-particle physics program. A glossary is included.

  3. Physics through the 1990s: Elementary-particle physics

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The volume begins with a non-mathematical discussion of the motivation behind, and basic ideas of, elementary-particle physics theory and experiment. The progress over the past two decades with the quark model and unification of the electromagnetic and weak interactions is reviewed. Existing theoretical problems in the field, such as the origin of mass and the unification of the fundamental forces, are detailed, along with experimental programs to test the new theories. Accelerators, instrumentation, and detectors are described for both current and future facilities. Interactions with other areas of both theoretical and applied physics are presented. The sociology of the field is examined regarding the education of graduate students, the organization necessary in large-scale experiments, and the decision-making process involved in high-cost experiments. Finally, conclusions and recommendations for maintaining US excellence in theory and experiment are given. Appendices list both current and planned accelerators, and present statistical data on the US elementary-particle physics program. A glossary is included.

  4. Spatially Pooled Contrast Responses Predict Neural and Perceptual Similarity of Naturalistic Image Categories

    PubMed Central

    Groen, Iris I. A.; Ghebreab, Sennay; Lamme, Victor A. F.; Scholte, H. Steven

    2012-01-01

    The visual world is complex and continuously changing. Yet, our brain transforms patterns of light falling on our retina into a coherent percept within a few hundred milliseconds. Possibly, low-level neural responses already carry substantial information to facilitate rapid characterization of the visual input. Here, we computationally estimated low-level contrast responses to computer-generated naturalistic images, and tested whether spatial pooling of these responses could predict image similarity at the neural and behavioral level. Using EEG, we show that statistics derived from pooled responses explain a large amount of variance between single-image evoked potentials (ERPs) in individual subjects. Dissimilarity analysis on multi-electrode ERPs demonstrated that large differences between images in pooled response statistics are predictive of more dissimilar patterns of evoked activity, whereas images with little difference in statistics give rise to highly similar evoked activity patterns. In a separate behavioral experiment, images with large differences in statistics were judged as different categories, whereas images with little differences were confused. These findings suggest that statistics derived from low-level contrast responses can be extracted in early visual processing and can be relevant for rapid judgment of visual similarity. We compared our results with two other, well-known contrast statistics: Fourier power spectra and higher-order properties of contrast distributions (skewness and kurtosis). Interestingly, whereas these statistics allow for accurate image categorization, they do not predict ERP response patterns or behavioral categorization confusions. These converging computational, neural and behavioral results suggest that statistics of pooled contrast responses contain information that corresponds with perceived visual similarity in a rapid, low-level categorization task. PMID:23093921

  5. Factors that predict the use or non-use of virtual dissection by high school biology teachers

    NASA Astrophysics Data System (ADS)

    Cockerham, William

    2001-07-01

    With the advent of computers into scholastic classrooms, virtual dissection has become a potential educational tool in high school biology lab settings. Utilizing non-experimental survey research methodology, this study attempted to identify factors that may influence high school biology teachers to use or not to use a virtual dissection. A 75-item research survey instrument consisting of both demographic background and Likert style questions was completed by 215 high school members of the National Association of Biology Teachers. The survey responses provided data to answer the research questions concerning the relationship between the likelihood of a high school biology teacher using a virtual dissection and a number of independent variables from the following three categories: (a) demographics, (b) attitude and experience, and (c) resources and support. These data also allowed for the determination of a demographic profile of the sample population. The demographic profile showed the sample population of high school biology teachers to be two-thirds female, mature, highly educated and very experienced. Analysis of variance and Pearson product moment correlational statistics were used to determine if there was a relationship between high school biology teachers' likelihood to use a virtual dissection and the independent variables. None of the demographic or resource and support independent variables demonstrated a strong relationship to the dependent variable of teachers' likelihood to use a virtual dissection. Three of the attitude and experience independent variables showed a statistically significant (p < .05) relationship to teachers' likelihood to use a virtual dissection: attitude toward virtual dissection, previous use of a virtual dissection and intention to use a real animal dissection. These findings may indicate that teachers are using virtual dissection as a supplement rather than a substitute. 
    It appears that those concerned with promoting virtual dissection in high school biology classrooms will have to develop simulations that are more compelling to the teachers. Additionally, if science teacher organizations want to reduce the controversy surrounding dissection, they may need to re-visit their positions on the importance of real animal dissection.

  6. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    DOE PAGES

    McDonnell, J. D.; Schunck, N.; Higdon, D.; ...

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. In addition, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
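
The propagation step described in this record, from a posterior over model parameters to uncertainties on predicted observables, can be illustrated on a toy problem. The sketch below is a deliberately simplified stand-in: a one-parameter linear model on a grid with a flat prior replaces the paper's Skyrme functional and Gaussian-process emulator, and all data values are invented for demonstration.

```python
import math

def grid_posterior(data, model, thetas, sigma=1.0):
    """Posterior on a parameter grid with a flat prior: proportional to the
    Gaussian likelihood exp(-chi^2 / 2), normalized over the grid."""
    logl = [-0.5 * sum((y - model(x, th)) ** 2 / sigma ** 2 for x, y in data)
            for th in thetas]
    m = max(logl)                       # subtract max for numerical stability
    w = [math.exp(v - m) for v in logl]
    z = sum(w)
    return [wi / z for wi in w]

def propagate(thetas, post, predict):
    """Propagate the posterior to a derived observable: posterior mean and
    variance of predict(theta)."""
    mean = sum(p * predict(t) for t, p in zip(thetas, post))
    var = sum(p * (predict(t) - mean) ** 2 for t, p in zip(thetas, post))
    return mean, var

# Toy linear model y = theta * x fitted to three invented pseudo-data points;
# the "observable" is an extrapolated prediction at x = 10.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]
thetas = [1.0 + 0.01 * k for k in range(201)]      # grid on [1, 3]
post = grid_posterior(data, lambda x, th: th * x, thetas)
pred_mean, pred_var = propagate(thetas, post, lambda th: 10.0 * th)
```

The same two-step pattern, posterior construction followed by pushing samples or weights through a prediction function, is what the full Bayesian machinery performs at scale for masses, driplines, and fission barriers.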

  7. Poisson-process generalization for the trading waiting-time distribution in a double-auction mechanism

    NASA Astrophysics Data System (ADS)

    Cincotti, Silvano; Ponta, Linda; Raberto, Marco; Scalas, Enrico

    2005-05-01

    In this paper, empirical analyses and computational experiments are presented on high-frequency data for a double-auction (book) market. The main objective of the paper is to generalize the order waiting-time process so as to properly model this empirical evidence. The empirical study is performed on the best-bid and best-ask data of 7 U.S. financial markets, for 30-stock time series. In particular, statistical properties of trading waiting times have been analyzed, and the quality of fits is evaluated by suitable statistical tests, i.e., by comparing empirical distributions with theoretical models. Starting from the statistical studies on real data, attention has been focused on the reproducibility of such results in an artificial market. The computational experiments have been performed within the Genoa Artificial Stock Market. In the market model, heterogeneous agents trade one risky asset in exchange for cash. Agents have zero intelligence and issue random limit or market orders depending on their budget constraints. The price is cleared by means of a limit order book. The order generation is modelled with a renewal process. Based on empirical trading estimation, the distribution of waiting times between two consecutive orders is modelled by a mixture of exponential processes. Results show that the empirical waiting-time distribution can be considered a generalization of a Poisson process. Moreover, the renewal process can approximate the real data, and its implementation in the artificial stock market can reproduce the trading activity in a realistic way.
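
The paper's key object — a waiting-time law that generalizes the exponential distribution of a Poisson process via a mixture of exponentials — is easy to illustrate. A minimal numpy sketch (the weights and rates below are made up, not fitted to the paper's data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative two-component mixture of exponentials for trading waits:
# with probability w draw from a "fast" regime, otherwise a "slow" one.
w, lam_fast, lam_slow = 0.7, 2.0, 0.2   # assumed parameters
n = 100_000

regime = rng.random(n) < w
waits = np.where(regime,
                 rng.exponential(1.0 / lam_fast, n),
                 rng.exponential(1.0 / lam_slow, n))

# A pure Poisson process has exponential waits with coefficient of
# variation CV = 1; a mixture is over-dispersed (CV > 1), one simple
# statistical signature distinguishing it from the Poisson case.
cv = waits.std() / waits.mean()
print(f"mean wait = {waits.mean():.3f}, CV = {cv:.2f}")
```

Comparing the empirical CV (and the full survival function) of real inter-order times against the exponential benchmark is the kind of test the abstract refers to.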

  8. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonnell, J. D.; Schunck, N.; Higdon, D.

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  9. Global 3ν oscillation analysis: Status of unknown parameters and future systematic challenges for ORCA and PINGU

    NASA Astrophysics Data System (ADS)

    Capozzi, Francesco; Lisi, Eligio; Marrone, Antonio

    2016-04-01

    Within the standard 3ν oscillation framework, we illustrate the status of currently unknown oscillation parameters: the θ23 octant, the mass hierarchy (normal or inverted), and the possible CP-violating phase δ, as derived by a (preliminary) global analysis of oscillation data available in 2015. We then discuss some challenges that will be faced by future, high-statistics analyses of spectral data, starting with one-dimensional energy spectra in reactor experiments, and concluding with two-dimensional energy-angle spectra in large-volume atmospheric experiments. It is shown that systematic uncertainties in the spectral shapes can noticeably affect the prospective sensitivities to unknown oscillation parameters, in particular to the mass hierarchy.

  10. Self-tuning digital Mössbauer detection system

    NASA Astrophysics Data System (ADS)

    Veiga, A.; Grunfeld, C. M.; Pasquevich, G. A.; Mendoza Zélis, P.; Martínez, N.; Sánchez, F. H.

    2014-01-01

    Long term gamma spectroscopy experiments involving single-channel analyzer equipment depend upon thermal stability of the detector and its associated high-voltage supply. Assuming constant discrimination levels, a drift in the detector gain impacts the output rate, producing an effect on the output spectrum. In some cases (e.g. single-energy resonant absorption experiments) data of interest can be completely lost. We present a digital self-adapting discrimination strategy that tracks emission line shifts using statistical measurements on a predefined region-of-interest of the spectrum. It is developed in the form of a synthesizable module that can be intercalated in the digital processing chain. It requires a moderate to small amount of digital resources and can be easily activated and deactivated.
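
The self-adapting discrimination idea — measure the statistical centroid of the emission line inside a region of interest, then re-center the discrimination window on it — can be sketched in software. The original is a synthesizable digital module; this is only an algorithmic illustration, and all numbers (line position, width, drift, counts) are invented:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy pulse-height spectrum: a Gaussian emission line whose position
# drifts slowly with detector gain.  After each acquisition block, the
# ROI window is re-centred on the measured centroid.
channels = np.arange(1024)

def acquire(center, counts=20000):
    hits = rng.normal(center, 15.0, size=counts)
    spec, _ = np.histogram(hits, bins=np.arange(1025))
    return spec

lo, hi = 450, 550                                    # initial ROI
for block, drift in enumerate([500, 505, 512, 520]):  # simulated gain drift
    spec = acquire(drift)
    roi = spec[lo:hi]
    centroid = (channels[lo:hi] * roi).sum() / roi.sum()
    half = (hi - lo) // 2
    lo, hi = int(centroid) - half, int(centroid) + half
    print(f"block {block}: centroid {centroid:6.1f}, window [{lo}, {hi}]")
```

With fixed discrimination levels the drifting line would slide out of the window and the count rate would drop; the recentering keeps the ROI locked on the line, which is the failure mode the abstract describes for single-energy resonant absorption runs.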

  11. ITA, a portable program for the interactive analysis of data from tracer experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wootton, R.; Ashley, K.

    ITA is a portable program for analyzing data from tracer experiments, most of the mathematical and graphical work being carried out by subroutines from the NAG and DASL libraries. The program can be used in batch or interactive mode, commands being typed in an English-like language, in free format. Data can be entered from a terminal keyboard or read from a file, and can be validated by printing or plotting them. Erroneous values can be corrected by appropriate editing. Analysis can involve elementary statistics, multiple-isotope crossover corrections, convolution or deconvolution, polyexponential curve-fitting, spline interpolation and/or compartmental analysis. On those installations with the appropriate hardware, high-resolution graphs can be drawn.

  12. Guidelines for the welfare and use of animals in cancer research

    PubMed Central

    Workman, P; Aboagye, E O; Balkwill, F; Balmain, A; Bruder, G; Chaplin, D J; Double, J A; Everitt, J; Farningham, D A H; Glennie, M J; Kelland, L R; Robinson, V; Stratford, I J; Tozer, G M; Watson, S; Wedge, S R; Eccles, S A

    2010-01-01

    Animal experiments remain essential to understand the fundamental mechanisms underpinning malignancy and to discover improved methods to prevent, diagnose and treat cancer. Excellent standards of animal care are fully consistent with the conduct of high quality cancer research. Here we provide updated guidelines on the welfare and use of animals in cancer research. All experiments should incorporate the 3Rs: replacement, reduction and refinement. Focusing on animal welfare, we present recommendations on all aspects of cancer research, including: study design, statistics and pilot studies; choice of tumour models (e.g., genetically engineered, orthotopic and metastatic); therapy (including drugs and radiation); imaging (covering techniques, anaesthesia and restraint); humane endpoints (including tumour burden and site); and publication of best practice. PMID:20502460

  13. Supervised Classification Techniques for Hyperspectral Data

    NASA Technical Reports Server (NTRS)

    Jimenez, Luis O.

    1997-01-01

    The recent development of more sophisticated remote sensing systems enables the measurement of radiation in many more spectral intervals than previously possible. An example of this technology is the AVIRIS system, which collects image data in 220 bands. The increased dimensionality of such hyperspectral data provides a challenge to current techniques for analyzing such data. Human experience in three-dimensional space tends to mislead one's intuition about geometrical and statistical properties in high-dimensional space, properties which must guide our choices in the data analysis process. In this paper, high-dimensional space properties are discussed along with their implications for high-dimensional data analysis, in order to illuminate the next steps that need to be taken for the next generation of hyperspectral data classifiers.

  14. Status of the KATRIN experiment and prospects to search for keV-mass sterile neutrinos in tritium β-decay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mertens, Susanne

    In this contribution the current status and future perspectives of the Karlsruhe Tritium Neutrino (KATRIN) Experiment are presented. The prime goal of this single β-decay experiment is to probe the absolute neutrino mass scale with a sensitivity of 200 meV (90% CL). We discuss first results of the recent main spectrometer commissioning measurements, successfully verifying the spectrometer’s basic vacuum, transmission and background properties. We also discuss the prospects of making use of the KATRIN tritium source, to search for sterile neutrinos in the multi-keV mass range constituting a classical candidate for Warm Dark Matter. Due to the very high source luminosity, a statistical sensitivity down to active-sterile mixing angles of sin² θ < 1 · 10⁻⁷ (90% CL) could be reached.

  15. Status of the KATRIN experiment and prospects to search for keV-mass sterile neutrinos in tritium β-decay

    DOE PAGES

    Mertens, Susanne

    2015-03-24

    In this contribution the current status and future perspectives of the Karlsruhe Tritium Neutrino (KATRIN) Experiment are presented. The prime goal of this single β-decay experiment is to probe the absolute neutrino mass scale with a sensitivity of 200 meV (90% CL). We discuss first results of the recent main spectrometer commissioning measurements, successfully verifying the spectrometer’s basic vacuum, transmission and background properties. We also discuss the prospects of making use of the KATRIN tritium source, to search for sterile neutrinos in the multi-keV mass range constituting a classical candidate for Warm Dark Matter. Due to the very high source luminosity, a statistical sensitivity down to active-sterile mixing angles of sin² θ < 1 · 10⁻⁷ (90% CL) could be reached.

  16. Exploring students’ adaptive reasoning skills and van Hiele levels of geometric thinking: a case study in geometry

    NASA Astrophysics Data System (ADS)

    Rizki, H. T. N.; Frentika, D.; Wijaya, A.

    2018-03-01

    This study aims to explore junior high school students’ adaptive reasoning and Van Hiele levels of geometric thinking. The present study was a quasi-experiment with a non-equivalent control group design. The participants were 34 seventh graders and 35 eighth graders in the experiment classes, and 34 seventh graders and 34 eighth graders in the control classes. The students in the experiment classes learned geometry with the Knisley mathematical learning model. The data were analyzed quantitatively using inferential statistics. The results of the data analysis show an improvement in adaptive reasoning skills in both grade seven and grade eight. An improvement was also found in the Van Hiele level of geometric thinking. These results indicate the positive impact of the Knisley learning model on students’ adaptive reasoning skills and Van Hiele levels of geometric thinking.

  17. Probing quantum and classical turbulence analogy in von Kármán liquid helium, nitrogen, and water experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saint-Michel, B.; Aix Marseille Université, CNRS, Centrale Marseille, IRPHE UMR 7342, 13384 Marseille; Herbert, E.

    2014-12-15

    We report measurements of the dissipation in the Superfluid helium high REynolds number von Kármán flow experiment for different forcing conditions. Statistically steady flows are reached; they display a hysteretic behavior similar to what has been observed in a 1:4 scale water experiment. Our macroscopic measurements indicate no noticeable difference between classical and superfluid flows, thereby providing evidence of the same dissipation scaling laws in the two phases. A detailed study of the evolution of the hysteresis cycle with the Reynolds number supports the idea that the stability of the steady states of classical turbulence in this closed flow is partly governed by the dissipative scales. It also supports the idea that the normal and the superfluid components at these temperatures (1.6 K) are locked down to the dissipative length scale.

  18. Bayesian component separation: The Planck experience

    NASA Astrophysics Data System (ADS)

    Wehus, Ingunn Kathrine; Eriksen, Hans Kristian

    2018-05-01

    Bayesian component separation techniques have played a central role in the data reduction process of Planck. The most important strength of this approach is its global nature, in which a parametric and physical model is fitted to the data. Such physical modeling allows the user to constrain very general data models, and jointly probe cosmological, astrophysical and instrumental parameters. This approach also supports statistically robust goodness-of-fit tests in terms of data-minus-model residual maps, which are essential for identifying residual systematic effects in the data. The main challenges are high code complexity and computational cost. Whether or not these costs are justified for a given experiment depends on its final uncertainty budget. We therefore predict that the importance of Bayesian component separation techniques is likely to increase with time for intensity mapping experiments, similar to what has happened in the CMB field, as observational techniques mature, and their overall sensitivity improves.

  19. Polychromatic wave-optics models for image-plane speckle. 2. Unresolved objects.

    PubMed

    Van Zandt, Noah R; Spencer, Mark F; Steinbock, Michael J; Anderson, Brian M; Hyde, Milo W; Fiorino, Steven T

    2018-05-20

    Polychromatic laser light can reduce speckle noise in many wavefront-sensing and imaging applications. To help quantify the achievable reduction in speckle noise, this study investigates the accuracy of three polychromatic wave-optics models under the specific conditions of an unresolved object. Because existing theory assumes a well-resolved object, laboratory experiments are used to evaluate model accuracy. The three models use Monte-Carlo averaging, depth slicing, and spectral slicing, respectively, to simulate the laser-object interaction. The experiments involve spoiling the temporal coherence of laser light via a fiber-based, electro-optic modulator. After the light scatters off of the rough object, speckle statistics are measured. The Monte-Carlo method is found to be highly inaccurate, while depth-slicing error peaks at 7.8% but is generally much lower in comparison. The spectral-slicing method is the most accurate, always producing results within the error bounds of the experiment.
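
The speckle-noise reduction that motivates this study follows from a standard statistical fact: fully developed single-wavelength speckle has intensity contrast 1, and summing N independent spectral components reduces the contrast roughly as 1/sqrt(N). A minimal Monte Carlo check of that scaling (this is textbook speckle statistics, not a reproduction of the paper's three wave-optics models):

```python
import numpy as np

rng = np.random.default_rng(6)

# Fully developed speckle: single-component intensity is exponentially
# distributed, so contrast C = sigma_I / <I> = 1.  Adding N independent
# spectral components (the effect of spoiled temporal coherence) lowers
# the contrast approximately as 1/sqrt(N).
npix = 200_000

def contrast(n_components):
    I = rng.exponential(1.0, size=(n_components, npix)).sum(axis=0)
    return I.std() / I.mean()

for n in (1, 4, 16):
    print(f"N = {n:2d}: contrast = {contrast(n):.3f}  "
          f"(1/sqrt(N) = {1 / np.sqrt(n):.3f})")
```

For an unresolved object the independence assumption between spectral components breaks down, which is exactly why the paper has to test its depth-slicing and spectral-slicing models against laboratory data rather than this simple theory.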

  20. Millimeter wavelength propagation studies

    NASA Technical Reports Server (NTRS)

    Hodge, D. B.

    1974-01-01

    The investigations conducted for the Millimeter Wavelength Propagation Studies during the period December, 1966, to June 1974 are reported. These efforts included the preparation for the ATS-5 Millimeter Wavelength Propagation Experiment and the subsequent data acquisition and data analysis. The emphasis of the OSU participation in this experiment was placed on the determination of reliability improvement resulting from the use of space diversity on a millimeter wavelength earth-space communication link. Related measurements included the determination of the correlation between radiometric temperature and attenuation along the earth-space propagation path. Along with this experimental effort a theoretical model was developed for the prediction of attenuation statistics on single and spatially separated earth space propagation paths. A High Resolution Radar/Radiometer System and Low Resolution Radar System were developed and implemented for the study of intense rain cells in preparation for the ATS-6 Millimeter Wavelength Propagation Experiment.

  1. Proceedings of the Conference on the Design of Experiments (23rd) S

    DTIC Science & Technology

    1978-07-01

    of Statistics, Carnegie-Mellon University. * [12] Duran, B. S. (1976). A survey of nonparametric tests for scale. Communications in Statistics A5, 1287...the twenty-third Design of Experiments Conference was the U. S. Army Combat Development Experimentation Command, Fort Ord, California. Excellent...Availability Prof. G. E. P. Box Time Series Modelling University of Wisconsin Dr. Churchill Eisenhart was recipient this year of the Samuel S. Wilks Memorial

  2. Optimal allocation of testing resources for statistical simulations

    NASA Astrophysics Data System (ADS)

    Quintana, Carolina; Millwater, Harry R.; Singh, Gulshan; Golden, Patrick

    2015-07-01

    Statistical estimates from simulation involve uncertainty caused by the variability in the input random variables due to limited data. Allocating resources to obtain more experimental data of the input variables to better characterize their probability distributions can reduce the variance of statistical estimates. The methodology proposed determines the optimal number of additional experiments required to minimize the variance of the output moments given single or multiple constraints. The method uses multivariate t-distribution and Wishart distribution to generate realizations of the population mean and covariance of the input variables, respectively, given an amount of available data. This method handles independent and correlated random variables. A particle swarm method is used for the optimization. The optimal number of additional experiments per variable depends on the number and variance of the initial data, the influence of the variable in the output function and the cost of each additional experiment. The methodology is demonstrated using a fretting fatigue example.
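
The core mechanism — draw realizations of the population mean from a multivariate t-distribution and of the population covariance from a Wishart distribution, given n observed data points, then watch the spread of the output statistics shrink as n grows — can be sketched directly with scipy. The degrees-of-freedom and scale choices below are common conventions, assumed for illustration rather than taken from the paper:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Illustrative two-input problem; the output is linear, g(x) = x1 + 2*x2,
# so E[g] depends on the input mean and Var[g] on the input covariance.
true_mean = np.array([1.0, 0.5])
true_cov = np.array([[1.0, 0.3], [0.3, 0.5]])
a = np.array([1.0, 2.0])

def estimate_spread(n_data, n_real=2000):
    """Spread of the output moments induced by limited input data."""
    data = rng.multivariate_normal(true_mean, true_cov, size=n_data)
    xbar, S = data.mean(axis=0), np.cov(data.T)
    # Realizations of the population mean (multivariate t) ...
    mus = stats.multivariate_t.rvs(loc=xbar, shape=S / n_data,
                                   df=n_data - len(xbar),
                                   size=n_real, random_state=rng)
    # ... and of the population covariance (Wishart).
    covs = stats.wishart.rvs(df=n_data - 1, scale=S / (n_data - 1),
                             size=n_real, random_state=rng)
    g_mean = mus @ a                              # realizations of E[g]
    g_var = np.einsum('i,nij,j->n', a, covs, a)   # realizations of Var[g]
    return g_mean.var(), g_var.var()

# More input data -> smaller variance of the statistical estimates,
# which is what the optimal-allocation step trades off against cost.
v10, v100 = estimate_spread(10), estimate_spread(100)
print("n=10 :", v10)
print("n=100:", v100)
```

The optimization layer described in the abstract then chooses, per input variable, how many extra experiments buy the largest reduction in this spread per unit cost.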

  3. A database application for pre-processing, storage and comparison of mass spectra derived from patients and controls

    PubMed Central

    Titulaer, Mark K; Siccama, Ivar; Dekker, Lennard J; van Rijswijk, Angelique LCT; Heeren, Ron MA; Sillevis Smitt, Peter A; Luider, Theo M

    2006-01-01

    Background Statistical comparison of peptide profiles in biomarker discovery requires fast, user-friendly software for high-throughput data analysis. Important features are flexibility in changing input variables and statistical analysis of peptides that are differentially expressed between patient and control groups. In addition, integrating the mass spectrometry data with the results of other experiments, such as microarray analysis, and with information from other databases requires central storage of the profile matrix, where protein id's can be added to peptide masses of interest. Results A new database application is presented to detect and identify significantly differentially expressed peptides in peptide profiles obtained from body fluids of patient and control groups. The presented modular software is capable of central storage of mass spectra and results in fast analysis. The software architecture consists of 4 pillars: 1) a Graphical User Interface written in Java, 2) a MySQL database, which contains all metadata, such as experiment numbers and sample codes, 3) an FTP (File Transport Protocol) server to store all raw mass spectrometry files and processed data, and 4) the software package R, which is used for modular statistical calculations, such as the Wilcoxon-Mann-Whitney rank-sum test. Statistical analysis by the Wilcoxon-Mann-Whitney test in R demonstrates that peptide profiles of two patient groups, 1) breast cancer patients with leptomeningeal metastases and 2) prostate cancer patients in end-stage disease, can be distinguished from those of control groups. Conclusion The database application is capable of distinguishing patient Matrix-Assisted Laser Desorption Ionization (MALDI-TOF) peptide profiles from those of control groups using large datasets. 
The modular architecture makes it possible to adapt the application to also handle large datasets from MS/MS and Fourier Transform Ion Cyclotron Resonance (FT-ICR) mass spectrometry experiments. It is expected that the higher resolution and mass accuracy of FT-ICR mass spectrometry will prevent the clustering of peaks of different peptides and allow the identification of differentially expressed proteins from the peptide profiles. PMID:16953879
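
The per-peptide statistical comparison the application delegates to R can be reproduced with scipy's equivalent of the Wilcoxon-Mann-Whitney rank-sum test. The intensity values below are synthetic, purely to show the call:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical peak intensities for one peptide mass in two groups
# (synthetic lognormal data; the application runs this test per peptide).
controls = rng.lognormal(mean=1.0, sigma=0.4, size=30)
patients = rng.lognormal(mean=1.5, sigma=0.4, size=30)

# Wilcoxon-Mann-Whitney rank-sum test; scipy.stats.mannwhitneyu is the
# Python counterpart of the R routine used in the paper's back end.
u, p = stats.mannwhitneyu(patients, controls, alternative='two-sided')
print(f"U = {u:.0f}, p = {p:.2e}")
```

In a real profile matrix this test is repeated across thousands of peptide masses, so the resulting p-values would additionally need a multiple-testing correction before peptides are flagged as differentially expressed.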

  4. A database application for pre-processing, storage and comparison of mass spectra derived from patients and controls.

    PubMed

    Titulaer, Mark K; Siccama, Ivar; Dekker, Lennard J; van Rijswijk, Angelique L C T; Heeren, Ron M A; Sillevis Smitt, Peter A; Luider, Theo M

    2006-09-05

    Statistical comparison of peptide profiles in biomarker discovery requires fast, user-friendly software for high-throughput data analysis. Important features are flexibility in changing input variables and statistical analysis of peptides that are differentially expressed between patient and control groups. In addition, integrating the mass spectrometry data with the results of other experiments, such as microarray analysis, and with information from other databases requires central storage of the profile matrix, where protein id's can be added to peptide masses of interest. A new database application is presented to detect and identify significantly differentially expressed peptides in peptide profiles obtained from body fluids of patient and control groups. The presented modular software is capable of central storage of mass spectra and results in fast analysis. The software architecture consists of 4 pillars: 1) a Graphical User Interface written in Java, 2) a MySQL database, which contains all metadata, such as experiment numbers and sample codes, 3) an FTP (File Transport Protocol) server to store all raw mass spectrometry files and processed data, and 4) the software package R, which is used for modular statistical calculations, such as the Wilcoxon-Mann-Whitney rank-sum test. Statistical analysis by the Wilcoxon-Mann-Whitney test in R demonstrates that peptide profiles of two patient groups, 1) breast cancer patients with leptomeningeal metastases and 2) prostate cancer patients in end-stage disease, can be distinguished from those of control groups. The database application is capable of distinguishing patient Matrix-Assisted Laser Desorption Ionization (MALDI-TOF) peptide profiles from those of control groups using large datasets. The modular architecture makes it possible to adapt the application to also handle large datasets from MS/MS and Fourier Transform Ion Cyclotron Resonance (FT-ICR) mass spectrometry experiments. 
It is expected that the higher resolution and mass accuracy of FT-ICR mass spectrometry will prevent the clustering of peaks of different peptides and allow the identification of differentially expressed proteins from the peptide profiles.

  5. Patterns of shading tolerance determined from experimental ...

    EPA Pesticide Factsheets

    An extensive review of the experimental literature on seagrass shading evaluated the relationship between experimental light reductions, duration of experiment, and seagrass response metrics to determine whether there were consistent statistical patterns. There were highly significant linear relationships of both percent biomass and percent shoot-density reduction versus percent light reduction (versus controls), although unexplained variation in the data was high. Duration of exposure affected the extent of response for both metrics, but was more clearly a factor in the biomass response. Both biomass and shoot density showed linear responses to duration of light reduction for treatments 60%. Unexplained variation was again high, and greater for shoot density than biomass. With few exceptions, regressions of both biomass and shoot density on light reduction for individual species and for genera were statistically significant, but also tended to show high degrees of variability in the data. Multivariate regressions that included both percent light reduction and duration of reduction as predictors increased the percentage of variation explained in almost every case. Analysis of response data by seagrass life-history category (Colonizing, Opportunistic, Persistent) did not yield clearly separate response relationships in most cases. Biomass tended to show somewhat less variation in response to light reduction than shoot density and, of the two, may be the preferred metric.

  6. On Time Performance Pressure

    NASA Technical Reports Server (NTRS)

    Connell, Linda; Wichner, David; Jakey, Abegael

    2013-01-01

    Within many operations, the pressures for on-time performance are high. Each month, on-time statistics are reported to the Department of Transportation and made public. There is a natural tendency for employees under pressure to do their best to meet these objectives. As a result, pressure to get the job done within the allotted time may cause personnel to deviate from procedures and policies. Additionally, inadequate or unavailable resources may drive employees to work around standard processes that are seen as barriers. However, bypassing practices to enable on-time performance may affect more than the statistics. ASRS reports often highlight on-time performance pressures which may result in impact across all workgroups in an attempt to achieve on-time performance. Reporters often provide in-depth insights into their experiences which can be used by industry to identify and focus on the implementation of systemic fixes.

  7. Guenter Tulip Filter Retrieval Experience: Predictors of Successful Retrieval

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turba, Ulku Cenk, E-mail: uct5d@virginia.edu; Arslan, Bulent, E-mail: ba6e@virginia.edu; Meuse, Michael, E-mail: mm5tz@virginia.edu

    We report our experience with Guenter Tulip filter placement indications, retrievals, and procedural problems, with emphasis on alternative retrieval techniques. We have identified 92 consecutive patients in whom a Guenter Tulip filter was placed and filter removal attempted. We recorded patient demographic information, filter placement and retrieval indications, procedures, standard and nonstandard filter retrieval techniques, complications, and clinical outcomes. The mean time to retrieval for those who experienced filter strut penetration was statistically significant [F(1,90) = 8.55, p = 0.004]. Filter strut(s) IVC penetration and successful retrieval were found to be statistically significant (p = 0.043). The filter hook-IVC relationship correlated with successful retrieval. A modified guidewire loop technique was applied in 8 of 10 cases where the hook appeared to penetrate the IVC wall and could not be engaged with a loop snare catheter, providing additional technical success in 6 of 8 (75%). Therefore, the total filter retrieval success increased from 88 to 95%. In conclusion, the Guenter Tulip filter has high successful retrieval rates with low rates of complication. Additional maneuvers such as a guidewire loop method can be used to improve retrieval success rates when the filter hook is endothelialized.

  8. Direct simulation of a self-similar plane wake

    NASA Technical Reports Server (NTRS)

    Moser, Robert D.; Rogers, Michael M.

    1994-01-01

    Direct simulations of two time-developing turbulent wakes have been performed. Initial conditions for the simulations were obtained from two realizations of a direct simulation of a turbulent boundary layer at momentum thickness Reynolds number 670. In addition, extra two dimensional disturbances were added in one of the cases to mimic two dimensional forcing. The unforced wake is allowed to evolve long enough to attain self similarity. The mass-flux Reynolds number (equivalent to the momentum thickness Reynolds number in spatially developing wakes) is 2000, which is high enough for a short k(exp -5/3) range to be evident in the streamwise one dimensional velocity spectrum. Several turbulence statistics have been computed by averaging in space and over the self-similar period in time. The growth rate in the unforced flow is low compared to experiments, but when this growth-rate difference is accounted for, the statistics of the unforced case are in reasonable agreement with experiments. However, the forced case is significantly different. The growth rate, turbulence Reynolds number, and turbulence intensities are as much as ten times larger in the forced case. In addition, the forced flow exhibits large-scale structures similar to those observed in transitional wakes, while the unforced flow does not.

  9. Integrated framework for developing search and discrimination metrics

    NASA Astrophysics Data System (ADS)

    Copeland, Anthony C.; Trivedi, Mohan M.

    1997-06-01

    This paper presents an experimental framework for evaluating target signature metrics as models of human visual search and discrimination. This framework is based on a prototype eye tracking testbed, the Integrated Testbed for Eye Movement Studies (ITEMS). ITEMS determines an observer's visual fixation point while he studies a displayed image scene, by processing video of the observer's eye. The utility of this framework is illustrated with an experiment using gray-scale images of outdoor scenes that contain randomly placed targets. Each target is a square region of a specific size containing pixel values from another image of an outdoor scene. The real-world analogy of this experiment is that of a military observer looking upon the sensed image of a static scene to find camouflaged enemy targets that are reported to be in the area. ITEMS provides the data necessary to compute various statistics for each target to describe how easily the observers located it, including the likelihood the target was fixated or identified and the time required to do so. The computed values of several target signature metrics are compared to these statistics, and a second-order metric based on a model of image texture was found to be the most highly correlated.

  10. Reducing the standard deviation in multiple-assay experiments where the variation matters but the absolute value does not.

    PubMed

    Echenique-Robba, Pablo; Nelo-Bazán, María Alejandra; Carrodeguas, José A

    2013-01-01

    When the value of a quantity x for a number of systems (cells, molecules, people, chunks of metal, DNA vectors, and so on) is measured and the aim is to replicate the whole set again for different trials or assays, despite the efforts for a near-equal design, scientists might often obtain quite different measurements. As a consequence, some systems' averages present standard deviations that are too large to render statistically significant results. This work presents a novel correction method of very low mathematical and numerical complexity that can reduce the standard deviation of such results and increase their statistical significance. Two conditions are to be met: the inter-system variations of x matter while its absolute value does not, and a similar tendency in the values of x must be present in the different assays (in other words, the results corresponding to different assays must present a high linear correlation). We demonstrate the improvements this method offers with a cell biology experiment, but it can definitely be applied to any problem that conforms to the described structure and requirements, in any quantitative scientific field that deals with data subject to uncertainty.
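
The situation described — inter-system variation matters, absolute level does not, and assays are highly correlated — admits a very simple correction in the same spirit: remove each assay's own offset before computing per-system statistics. This sketch uses plain mean-centering on synthetic data; it illustrates the idea, not necessarily the paper's exact correction formula:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic data: 5 systems measured in 4 assays.  Each assay carries a
# large additive offset (batch effect); only inter-system variation matters.
true_profile = np.array([0.0, 1.0, 2.0, 1.0, 0.5])
offsets = rng.normal(0.0, 5.0, size=4)              # per-assay offsets
noise = rng.normal(0.0, 0.2, size=(4, 5))
x = true_profile[None, :] + offsets[:, None] + noise

# Naive per-system std across assays is dominated by the assay offsets.
naive_std = x.std(axis=0).mean()

# Correction: subtract each assay's own mean, which is legitimate
# precisely because the absolute level carries no meaning here.
x_corr = x - x.mean(axis=1, keepdims=True)
corrected_std = x_corr.std(axis=0).mean()

print(f"naive std     = {naive_std:.2f}")
print(f"corrected std = {corrected_std:.2f}")
```

After centering, the per-system averages retain the inter-system differences (the quantity of interest) while the assay-to-assay scatter collapses to the measurement noise, which is what restores statistical significance.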

  11. The effects of pediatric community simulation experience on the self-confidence and satisfaction of baccalaureate nursing students: A quasi-experimental study.

    PubMed

    Lubbers, Jaclynn; Rossman, Carol

    2016-04-01

    Simulation in nursing education is a means to transform student learning and respond to decreasing clinical site availability. This study proposed an innovative simulation experience in which students completed community-based clinical hours with simulation scenarios. The purpose of this study was to determine the effects of a pediatric community simulation experience on the self-confidence of nursing students. Bandura's (1977) Self-Efficacy Theory and Jeffries' (2005) Nursing Education Simulation Framework were used. This quasi-experimental study collected data using a pre-test and post-test tool. The setting was a private, liberal arts college in the Midwestern United States. A convenience sample of fifty-four baccalaureate nursing students participated. The sample was predominantly female, with very little exposure to simulation prior to this study. The participants completed a 16-item self-confidence instrument developed for this study, which measured students' self-confidence in pediatric community nursing knowledge, skill, communication, and documentation. The overall study showed statistically significant results (t=20.70, p<0.001), with statistically significant results within each of the eight 4-item sub-scales (p<0.001). Students also reported a high level of satisfaction with their simulation experience. The data demonstrate that students who took the Pediatric Community Based Simulation course reported higher self-confidence after the course than before it. Higher self-confidence scores for simulation participants have been shown to increase quality of care for patients. Copyright © 2016 Elsevier Ltd. All rights reserved.
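The pre-test/post-test comparison reported above corresponds to a paired t-test. A minimal sketch with invented scores (the baseline mean, gain, and spread are assumptions for illustration, not the study's data) might look like:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 54  # sample size reported in the study

# Hypothetical pre/post self-confidence totals on a 16-item instrument
# (higher = more confident); the simulated gain mimics a course effect.
pre = rng.normal(45, 8, n)
post = pre + rng.normal(12, 4, n)

t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")
```

A paired test is the right choice here because each student contributes both a pre and a post score, so within-student variability cancels out.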

  12. The Effect of Alcohol on Emotional Inertia: A Test of Alcohol Myopia

    PubMed Central

    Fairbairn, Catharine E.; Sayette, Michael A.

    2017-01-01

    Alcohol Myopia (AM) has emerged as one of the most widely-researched theories of alcohol’s effects on emotional experience. Given this theory’s popularity it is notable that a central tenet of AM has not been tested—namely, that alcohol creates a myopic focus on the present moment, limiting the extent to which the present is permeated by emotions derived from prior experience. We aimed to test the impact of alcohol on moment-to-moment fluctuations in affect, applying advances in emotion assessment and statistical analysis to test this aspect of AM without drawing the attention of participants to their own emotional experiences. We measured emotional fluctuations using autocorrelation, a statistic borrowed from time-series analysis measuring the correlation between successive observations in time. High emotion autocorrelation is termed “emotional inertia” and linked to negative mood outcomes. Seven-hundred-twenty social drinkers consumed alcohol, placebo, or control beverages in groups of three over a 36-min group formation task. We indexed affect using the Duchenne smile, recorded continuously during the interaction (34.9 million video frames) according to Paul Ekman’s Facial Action Coding System. Autocorrelation of Duchenne smiling emerged as the most consistent predictor of self-reported mood and social bonding when compared with Duchenne smiling mean, standard deviation, and linear trend. Alcohol reduced affective autocorrelation, and autocorrelation mediated the link between alcohol and self-reported mood and social outcomes. Findings suggest that alcohol enhances our ability to freely enjoy the present moment untethered by past experience and highlight the importance of emotion dynamics in research examining affective correlates of psychopathology. PMID:24016015
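The "emotional inertia" statistic described above is the lag-1 autocorrelation of an affect time series. A minimal sketch using simulated AR(1) series in place of real smile-intensity data (the series and their persistence parameters are invented for illustration):

```python
import numpy as np

def lag1_autocorrelation(x):
    """Correlation between successive observations of a time series;
    high values indicate 'emotional inertia'."""
    x = np.asarray(x, dtype=float)
    return np.corrcoef(x[:-1], x[1:])[0, 1]

rng = np.random.default_rng(2)

def ar1(phi, n=1000):
    """Simulated affect series with persistence phi (AR(1) process)."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

# High-inertia vs. low-inertia series: the same statistic separates them.
high_inertia = lag1_autocorrelation(ar1(0.9))
low_inertia = lag1_autocorrelation(ar1(0.1))
print(high_inertia, low_inertia)
```

In the study, the series would be frame-by-frame Duchenne smile intensity, and the hypothesis is that alcohol lowers this autocorrelation.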

  13. Persistence of exponential bed thickness distributions in the stratigraphic record: Experiments and theory

    NASA Astrophysics Data System (ADS)

    Straub, K. M.; Ganti, V. K.; Paola, C.; Foufoula-Georgiou, E.

    2010-12-01

    Stratigraphy preserved in alluvial basins houses the most complete record of the information necessary to reconstruct past environmental conditions. Indeed, the character of the sedimentary record is inextricably related to the surface processes that formed it. In this presentation we explore how the signals of surface processes are recorded in stratigraphy through the use of physical and numerical experiments. We focus on linking surface processes to stratigraphy in 1D by relating the probability distributions of the processes that govern the evolution of depositional systems to the probability distribution of preserved bed thicknesses. In this study we define a bed as a package of sediment bounded above and below by erosional surfaces. In a companion presentation we document heavy-tailed statistics of erosion and deposition from high-resolution temporal elevation data recorded during a controlled physical experiment. However, the heavy tails in the magnitudes of erosional and depositional events are not preserved in the experimental stratigraphy. Similar to many bed thickness distributions reported in field studies, we find that an exponential distribution adequately describes the thicknesses of beds preserved in our experiment. We explore the generation of exponential bed thickness distributions from heavy-tailed surface statistics using 1D numerical models. These models indicate that when the full distribution of elevation fluctuations (both erosional and depositional events) is symmetrical, the resulting distribution of bed thicknesses is exponential in form. Finally, we illustrate that a predictable relationship exists between the coefficient of variation of surface elevation fluctuations and the scale parameter of the resulting exponential distribution of bed thicknesses.
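A toy version of such a 1D model can illustrate how a stratigraphic filter discards surface fluctuations. The sketch below assumes a specific filter, that only material never eroded below at a later time survives (a "running minimum from the right"), which is one common idealization and not necessarily the authors' model; the increment distribution is likewise an assumption:

```python
import numpy as np

rng = np.random.default_rng(3)

# Surface elevation as a 1D random walk with symmetric, heavy-tailed
# increments (Student's t): erosional (negative) and depositional
# (positive) events drawn from the same distribution.
increments = rng.standard_t(df=1.5, size=100_000)
elevation = np.cumsum(increments)

# Stratigraphic filter: the preserved column is the running minimum
# of elevation taken from the right (later erosion removes everything above it).
preserved = np.minimum.accumulate(elevation[::-1])[::-1]

# Beds: packages of preserved deposition bounded by hiatuses
# (flat stretches of the preserved record, i.e. erosional surfaces).
beds, current = [], 0.0
for step in np.diff(preserved):
    if step > 0:
        current += step
    elif current > 0:        # a hiatus closes the current bed
        beds.append(current)
        current = 0.0
if current > 0:
    beds.append(current)

beds = np.array(beds)
print(len(beds), beds.mean())
```

The histogram of `beds` can then be compared against an exponential fit, paralleling the comparison the authors make between surface-fluctuation statistics and preserved bed thicknesses.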

  14. Factors associated with student learning processes in primary health care units: a questionnaire study.

    PubMed

    Bos, Elisabeth; Alinaghizadeh, Hassan; Saarikoski, Mikko; Kaila, Päivi

    2015-01-01

    Clinical placement plays a key role in education intended to develop nursing and caregiving skills. Studies of nursing students' clinical learning experiences show that these dimensions affect learning processes: (i) supervisory relationship, (ii) pedagogical atmosphere, (iii) management leadership style, (iv) premises of nursing care on the ward, and (v) nursing teachers' roles. Few empirical studies address the probability of an association between these dimensions and factors such as student (a) motivation, (b) satisfaction with clinical placement, and (c) experiences with professional role models. The study aimed to investigate factors associated with the five dimensions in clinical learning environments within primary health care units. The Swedish version of Clinical Learning Environment, Supervision and Teacher, a validated evaluation scale, was administered to 356 graduating nursing students after four or five weeks' clinical placement in primary health care units. Response rate was 84%. Multivariate analysis of variance was used to determine whether the five dimensions are associated with factors a, b, and c above. The analysis revealed a statistically significant association between the five dimensions and two factors: students' motivation and experiences with professional role models. The satisfaction factor had a statistically significant association (with a high effect size) with all dimensions, which clearly indicates that students experienced satisfaction. These questionnaire results show that a good clinical learning experience constitutes a complex whole (totality) that involves several interacting factors. Supervisory relationship and pedagogical atmosphere particularly influenced students' satisfaction and motivation. These results provide valuable decision-support material for clinical education planning, implementation, and management. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. FLORIDA TOWER FOOTPRINT EXPERIMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    WATSON,T.B.; DIETZ, R.N.; WILKE, R.

    2007-01-01

    The Florida Footprint experiments were a series of field programs in which perfluorocarbon tracers were released in different configurations centered on a flux tower to generate a data set that can be used to test transport and dispersion models. These models are used to determine the sources of the CO2 that causes the fluxes measured at eddy covariance towers. Experiments were conducted in a managed slash pine forest, 10 km northeast of Gainesville, Florida, in 2002, 2004, and 2006, in atmospheric conditions that ranged from well mixed to very stable, including the transition period between convective conditions at midday and stable conditions after sunset. There were a total of 15 experiments. The characteristics of the PFTs, details of sampling and analysis methods, quality control measures, and analytical statistics including confidence limits are presented. Details of the field programs including tracer release rates, tracer source configurations, and configuration of the samplers are discussed. The result of this experiment is a high quality, well documented tracer and meteorological data set that can be used to improve and validate canopy dispersion models.

  16. Influence of operator experience on canal preparation time when using the rotary Ni-Ti ProFile system in simulated curved canals.

    PubMed

    Mesgouez, C; Rilliard, F; Matossian, L; Nassiri, K; Mandel, E

    2003-03-01

    The aim of this study was to determine the influence of operator experience on the time needed for canal preparation when using a rotary nickel-titanium (Ni-Ti) system. A total of 100 simulated curved canals in resin blocks were used. Four operators prepared 25 canals each. The operators included practitioners with prior experience of the preparation technique and practitioners with no experience. The working length for each instrument was precisely predetermined. All canals were instrumented with rotary Ni-Ti ProFile Variable Taper Series 29 engine-driven instruments using a high-torque handpiece (Maillefer, Ballaigues, Switzerland). The time taken to prepare each canal was recorded. Differences between the operators were analysed using Student's t-test and the Kruskal-Wallis and Dunn nonparametric tests. Comparison of canal preparation times demonstrated a statistically significant difference between the four operators (P < 0.001). In the inexperienced group, a significant linear regression between canal number and preparation time occurred. Time required for canal preparation was inversely related to operator experience.
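The four-operator comparison above is the setting for a Kruskal-Wallis test. A minimal sketch with hypothetical preparation times (the means and spreads are invented; only the design, 25 canals per operator, follows the study) might look like:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Hypothetical preparation times (seconds) for 25 canals per operator;
# experienced operators are simulated as faster.
experienced_1 = rng.normal(180, 30, 25)
experienced_2 = rng.normal(200, 30, 25)
novice_1 = rng.normal(320, 60, 25)
novice_2 = rng.normal(380, 60, 25)

h_stat, p_value = stats.kruskal(experienced_1, experienced_2,
                                novice_1, novice_2)
print(f"H = {h_stat:.1f}, p = {p_value:.3g}")
```

The rank-based Kruskal-Wallis test is a reasonable choice here because preparation times are often skewed and the groups are small.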

  17. Postgraduate students experience in research supervision

    NASA Astrophysics Data System (ADS)

    Mohamed, Hazura; Judi, Hairulliza Mohamad; Mohammad, Rofizah

    2017-04-01

    The success and quality of postgraduate education depend largely on the effective and efficient supervision of postgraduate students. The role of the supervisor becomes more challenging with rising expectations that supervision will produce high-quality graduates. The main objective of this study was to examine the experiences of postgraduate students with supervisory services over the duration of their studies. It also examines whether supervisory experience varies with demographic variables such as level of study and nationality. This study uses a quantitative approach in the form of a survey. Questionnaires were distributed to 96 postgraduate students of the Faculty of Information Science and Technology, Universiti Kebangsaan Malaysia. Data collected were analyzed using the Statistical Package for the Social Sciences (SPSS 23.0) to obtain frequencies, means, and standard deviations. T-tests were used to test for differences in supervisory experience between demographic groups. Overall, the findings showed that postgraduate students gave positive responses to the supervisory services. However, there were differences in supervisory experience based on level of study and nationality. It is hoped that the parties involved will use these results to provide better support and improve the quality of supervision.
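The group comparisons described above correspond to independent-samples t-tests. A minimal sketch with invented 5-point scale scores (the group labels, sizes, and means are assumptions for illustration, not the survey's data) might look like:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Hypothetical supervisory-experience scores (1-5 scale) for two
# demographic groups out of the 96 respondents.
masters = rng.normal(4.2, 0.5, 60)
doctoral = rng.normal(3.5, 0.5, 36)

t_stat, p_value = stats.ttest_ind(masters, doctoral)
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")
```

With a significant result, the conclusion would parallel the study's: supervisory experience differs between the two levels of study.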

  18. The APEX experiment at JLab. Searching for the vector boson A' decaying to e+e-

    NASA Astrophysics Data System (ADS)

    Franklin, Gregg B.

    2017-04-01

    Jefferson Lab's A' Experiment (APEX) will search for a new vector boson, the A', in the mass range 65 MeV < mA' < 550 MeV, with sensitivity to an A' coupling to electrons of α' > 6 × 10⁻⁸ α, where α = e²/4π. New vector bosons with such small couplings arise naturally from a small kinetic mixing of the "dark photon", A', with the photon — one of the very few ways in which new forces can couple to the Standard Model — and have received considerable attention as an explanation of various dark-matter related anomalies. In this experiment, A' bosons produced by radiation off an electron beam could appear as narrow resonances with small production cross-sections in the e+e- invariant mass distribution. The two Jefferson Lab HRS spectrometers will provide a reconstructed invariant-mass resolution for the A' of δM/M < 0.5%. With a 33-day run, the experiment will achieve high sensitivity by taking advantage of this mass resolution and of e+e- statistics that will be orders of magnitude larger than in previous searches for the A' boson in this mass range. This paper will review the key concepts of the experiment and the status of the preparations for running APEX. The results of a completed pilot run will be presented.
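The quantity reconstructed per event is the e+e- invariant mass, M² = E² − |p|² for the summed four-momentum (natural units, c = 1). A minimal sketch with an invented kinematic configuration (a ~0.2 GeV resonance boosted along the beam axis; the numbers are illustrative, not APEX kinematics):

```python
import numpy as np

def invariant_mass(p1, p2):
    """Invariant mass of a two-particle system from (E, px, py, pz)
    four-vectors in GeV; natural units with c = 1."""
    E, px, py, pz = p1 + p2
    return np.sqrt(E**2 - px**2 - py**2 - pz**2)

# Hypothetical e+ and e- four-momenta (GeV) from the decay of a
# 0.2 GeV parent moving along z with |p| = 2 GeV; lepton masses
# are negligible at these energies.
p_electron = np.array([1.0049875,  0.1, 0.0, 1.0])
p_positron = np.array([1.0049875, -0.1, 0.0, 1.0])

m = invariant_mass(p_electron, p_positron)
print(m)  # ≈ 0.2 GeV, recovering the parent mass
```

In the search, a narrow A' would show up as an excess of events clustered at one value of this reconstructed mass, with width set by the δM/M < 0.5% resolution.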

  19. Authorization of Animal Experiments Is Based on Confidence Rather than Evidence of Scientific Rigor

    PubMed Central

    Nathues, Christina; Würbel, Hanno

    2016-01-01

    Accumulating evidence indicates high risk of bias in preclinical animal research, questioning the scientific validity and reproducibility of published research findings. Systematic reviews found low rates of reporting of measures against risks of bias in the published literature (e.g., randomization, blinding, sample size calculation) and a correlation between low reporting rates and inflated treatment effects. That most animal research undergoes peer review or ethical review would offer the possibility to detect risks of bias at an earlier stage, before the research has been conducted. For example, in Switzerland, animal experiments are licensed based on a detailed description of the study protocol and a harm–benefit analysis. We therefore screened applications for animal experiments submitted to Swiss authorities (n = 1,277) for the rates at which the use of seven basic measures against bias (allocation concealment, blinding, randomization, sample size calculation, inclusion/exclusion criteria, primary outcome variable, and statistical analysis plan) were described and compared them with the reporting rates of the same measures in a representative sub-sample of publications (n = 50) resulting from studies described in these applications. Measures against bias were described at very low rates, ranging on average from 2.4% for statistical analysis plan to 19% for primary outcome variable in applications for animal experiments, and from 0.0% for sample size calculation to 34% for statistical analysis plan in publications from these experiments. Calculating an internal validity score (IVS) based on the proportion of the seven measures against bias, we found a weak positive correlation between the IVS of applications and that of publications (Spearman’s rho = 0.34, p = 0.014), indicating that the rates of description of these measures in applications partly predict their rates of reporting in publications. 
These results indicate that the authorities licensing animal experiments are lacking important information about experimental conduct that determines the scientific validity of the findings, which may be critical for the weight attributed to the benefit of the research in the harm–benefit analysis. Similar to manuscripts getting accepted for publication despite poor reporting of measures against bias, applications for animal experiments may often be approved based on implicit confidence rather than explicit evidence of scientific rigor. Our findings shed serious doubt on the current authorization procedure for animal experiments, as well as the peer-review process for scientific publications, which in the long run may undermine the credibility of research. Developing existing authorization procedures that are already in place in many countries towards a preregistration system for animal research is one promising way to reform the system. This would not only benefit the scientific validity of findings from animal experiments but also help to avoid unnecessary harm to animals for inconclusive research. PMID:27911892
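The internal validity score and the application-publication correlation described above are simple to operationalize. A sketch with hypothetical screening data (the dictionary layout and the synthetic scores are assumptions for illustration; only the seven measures and the IVS definition come from the abstract):

```python
import numpy as np
from scipy import stats

# The seven basic measures against bias screened for in each document.
MEASURES = ["allocation_concealment", "blinding", "randomization",
            "sample_size_calculation", "inclusion_exclusion_criteria",
            "primary_outcome_variable", "statistical_analysis_plan"]

def internal_validity_score(described):
    """IVS: proportion of the seven measures described in a document."""
    return sum(described[m] for m in MEASURES) / len(MEASURES)

# Hypothetical screening result for one application.
application = {m: False for m in MEASURES}
application["primary_outcome_variable"] = True
application["statistical_analysis_plan"] = True
print(internal_validity_score(application))  # 2/7

# Correlating IVS of applications with IVS of the matching publications
# (synthetic paired scores for illustration):
rng = np.random.default_rng(6)
ivs_applications = rng.integers(0, 8, 50) / 7
ivs_publications = np.clip(ivs_applications + rng.normal(0, 0.3, 50), 0, 1)
rho, p = stats.spearmanr(ivs_applications, ivs_publications)
print(f"rho = {rho:.2f}, p = {p:.3g}")
```

Spearman's rank correlation is appropriate here because the IVS takes only eight discrete values and no linear relationship is assumed.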

Top