Sample records for statistically based experimental

  1. An experimental validation of a statistical-based damage detection approach.

    DOT National Transportation Integrated Search

    2011-01-01

    In this work, a previously developed, statistics-based damage-detection approach was validated for its ability to autonomously detect damage in bridges. The damage-detection approach uses statistical differences between the actual and predicted beha...

  2. Effects of Web-Based Instruction on Math Anxiety, the Sense of Mastery, and Global Self-Esteem: A Quasi-Experimental Study of Undergraduate Statistics Students

    ERIC Educational Resources Information Center

    Van Gundy, Karen; Morton, Beth A.; Liu, Hope Q.; Kline, Jennifer

    2006-01-01

    To explore the effects of web-based instruction (WBI) on math anxiety, the sense of mastery, and global self-esteem, we use quasi-experimental data from undergraduate statistics students in classes assigned to three study conditions, each with varied access to, and incentive for, the use of online technologies. Results suggest that when statistics…

  3. Instruction of Statistics via Computer-Based Tools: Effects on Statistics' Anxiety, Attitude, and Achievement

    ERIC Educational Resources Information Center

    Ciftci, S. Koza; Karadag, Engin; Akdal, Pinar

    2014-01-01

    The purpose of this study was to determine the effect of statistics instruction using computer-based tools on statistics anxiety, attitude, and achievement. This study was designed as quasi-experimental research and the pattern used was a matched pre-test/post-test control group design. Data were collected using three scales: a Statistics…

  4. Statistical tools for transgene copy number estimation based on real-time PCR.

    PubMed

    Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal

    2007-11-01

    As compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR based transgene copy number estimation tends to be ambiguous and subjective, stemming from the lack of proper statistical analysis and data quality control needed to render a reliable estimation of copy number with a prediction value. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advancements into real-time PCR based transgene copy number determination. Three experimental designs and four data-quality-control integrated statistical models are presented. In the first method, external calibration curves are established for the transgene based on serially diluted templates. The Ct numbers from a control transgenic event and a putative transgenic event are compared to derive the transgene copy number or zygosity estimation. Simple linear regression and two-group t-test procedures were combined to model the data from this design. In the second experimental design, standard curves were generated for both an internal reference gene and the transgene, and the copy number of the transgene was compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, transgene copy number is compared with the reference gene without a standard curve, based directly on fluorescence data. Two different multiple regression models were proposed to analyze the data based on two different approaches to amplification efficiency integration. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination. These statistical methods allow real-time PCR-based transgene copy number estimation to be more reliable and precise. Proper confidence intervals are necessary for unambiguous prediction of transgene copy number. The four different statistical methods are compared for their advantages and disadvantages. Moreover, the statistical methods can also be applied to other real-time PCR-based quantification assays, including transfection efficiency analysis and pathogen quantification.
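
    As an illustration of the first (external calibration curve) design described above, a minimal Python sketch might look like the following; the dilution series, Ct values, and the single-copy control comparison are hypothetical stand-ins, not data from the paper.

    ```python
    import numpy as np
    from scipy import stats

    # Serial dilutions (relative template amount) and measured Ct values (hypothetical)
    dilution = np.array([1.0, 0.1, 0.01, 0.001, 0.0001])
    ct = np.array([18.1, 21.5, 24.8, 28.2, 31.6])

    # Simple linear regression of Ct on log10(template), as in the first design
    slope, intercept, r, p, se = stats.linregress(np.log10(dilution), ct)
    efficiency = 10 ** (-1.0 / slope) - 1  # amplification efficiency from the slope
    print(f"slope={slope:.2f}, R^2={r**2:.3f}, efficiency={efficiency:.2%}")

    # Efficiency-adjusted delta-Ct between a single-copy control and a putative event
    ct_control, ct_putative = 24.9, 23.9  # hypothetical mean Ct values
    copy_ratio = (1 + efficiency) ** (ct_control - ct_putative)
    print(f"estimated copy number relative to control: {copy_ratio:.1f}")
    ```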

  5. Comparison between project-based learning and discovery learning toward students' metacognitive strategies on global warming concept

    NASA Astrophysics Data System (ADS)

    Tumewu, Widya Anjelia; Wulan, Ana Ratna; Sanjaya, Yayan

    2017-05-01

    The purpose of this study was to compare the effectiveness of learning using project-based learning (PjBL) and discovery learning (DL) on students' metacognitive strategies on the global warming concept. A quasi-experimental research design with a matching-only pretest-posttest control group design was used in this study. The subjects were students of two 7th-grade classes at a junior high school in Bandung City, West Java, in the 2015/2016 academic year. The study was conducted on two experimental classes: the project-based learning treatment was applied in experimental class I and the discovery learning treatment in experimental class II. The data were collected through a questionnaire on students' metacognitive strategies. The statistical analysis showed statistically significant differences in students' metacognitive strategies between project-based learning and discovery learning.

  6. A Laboratory Experiment, Based on the Maillard Reaction, Conducted as a Project in Introductory Statistics

    ERIC Educational Resources Information Center

    Kravchuk, Olena; Elliott, Antony; Bhandari, Bhesh

    2005-01-01

    A simple laboratory experiment, based on the Maillard reaction, served as a project in Introductory Statistics for undergraduates in Food Science and Technology. By using the principles of randomization and replication and reflecting on the sources of variation in the experimental data, students reinforced the statistical concepts and techniques…

  7. Integrating Statistical Mechanics with Experimental Data from the Rotational-Vibrational Spectrum of HCl into the Physical Chemistry Laboratory

    ERIC Educational Resources Information Center

    Findley, Bret R.; Mylon, Steven E.

    2008-01-01

    We introduce a computer exercise that bridges spectroscopy and thermodynamics using statistical mechanics and the experimental data taken from the commonly used laboratory exercise involving the rotational-vibrational spectrum of HCl. Based on the results from the analysis of their HCl spectrum, students calculate bulk thermodynamic properties…

  8. New approach in the quantum statistical parton distribution

    NASA Astrophysics Data System (ADS)

    Sohaily, Sozha; Vaziri (Khamedi), Mohammad

    2017-12-01

    An attempt to find simple parton distribution functions (PDFs) based on a quantum statistical approach is presented. The PDFs described by the statistical model have very interesting physical properties which help in understanding the structure of partons. The longitudinal portion of the distribution functions is given by applying the maximum entropy principle. An interesting and simple approach to determining the statistical variables exactly, without fitting and fixing parameters, is surveyed. Analytic expressions for the x-dependent PDFs are obtained in the whole x region [0, 1], and the computed distributions are consistent with experimental observations. The agreement with experimental data provides a robust confirmation of our simple statistical model.

  9. The Effect of Project Based Learning on the Statistical Literacy Levels of Student 8th Grade

    ERIC Educational Resources Information Center

    Koparan, Timur; Güven, Bülent

    2014-01-01

    This study examines the effect of project based learning on 8th grade students' statistical literacy levels. A performance test was developed for this aim. A quasi-experimental research model was used in this study. In this context, statistics was taught with the traditional method in the control group and was taught using project based…

  10. The Effect on the 8th Grade Students' Attitude towards Statistics of Project Based Learning

    ERIC Educational Resources Information Center

    Koparan, Timur; Güven, Bülent

    2014-01-01

    This study investigates the effect of the project based learning approach on 8th grade students' attitude towards statistics. With this aim, an attitude scale towards statistics was developed. A quasi-experimental research model was used in this study. Following this model, in the control group the traditional method was applied to teach statistics…

  11. Learning Axes and Bridging Tools in a Technology-Based Design for Statistics

    ERIC Educational Resources Information Center

    Abrahamson, Dor; Wilensky, Uri

    2007-01-01

    We introduce a design-based research framework, "learning axes and bridging tools," and demonstrate its application in the preparation and study of an implementation of a middle-school experimental computer-based unit on probability and statistics, "ProbLab" (Probability Laboratory, Abrahamson and Wilensky 2002 [Abrahamson, D., & Wilensky, U.…

  12. Mass detection, localization and estimation for wind turbine blades based on statistical pattern recognition

    NASA Astrophysics Data System (ADS)

    Colone, L.; Hovgaard, M. K.; Glavind, L.; Brincker, R.

    2018-07-01

    A method for mass change detection on wind turbine blades using natural frequencies is presented. The approach is based on two statistical tests. The first test decides if there is a significant mass change and the second test is a statistical group classification based on Linear Discriminant Analysis. The frequencies are identified by means of Operational Modal Analysis using natural excitation. Based on the assumption of Gaussianity of the frequencies, a multi-class statistical model is developed by combining finite element model sensitivities in 10 classes of change location on the blade, the smallest area being 1/5 of the span. The method is experimentally validated for a full scale wind turbine blade in a test setup and loaded by natural wind. Mass change from natural causes was imitated with sand bags and the algorithm was observed to perform well with an experimental detection rate of 1, localization rate of 0.88 and mass estimation rate of 0.72.
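
    The two-stage scheme above (a significance test on the frequencies, then Linear Discriminant Analysis over classes of change location) can be sketched minimally as follows; the frequencies, class structure, and thresholds are synthetic assumptions, not the blade data.

    ```python
    import numpy as np
    from scipy import stats
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)

    # Healthy-state natural frequencies (Hz) for three modes, assumed Gaussian
    baseline = rng.normal([1.20, 3.50, 7.10], 0.01, size=(50, 3))
    new_obs = np.array([1.18, 3.47, 7.05])  # frequencies after a possible mass change

    # Test 1: flag a significant frequency change (two-sided z-test per mode, alpha = 1%)
    z = (new_obs - baseline.mean(axis=0)) / baseline.std(axis=0, ddof=1)
    print("mass change detected:", bool(np.any(np.abs(z) > stats.norm.ppf(0.995))))

    # Test 2: localize the change by LDA over location classes (3 synthetic classes)
    class_means = ([1.18, 3.47, 7.05], [1.19, 3.50, 7.00], [1.15, 3.40, 7.08])
    X = np.vstack([rng.normal(mu, 0.01, size=(50, 3)) for mu in class_means])
    y = np.repeat([0, 1, 2], 50)
    lda = LinearDiscriminantAnalysis().fit(X, y)
    print("predicted location class:", lda.predict(new_obs.reshape(1, -1))[0])
    ```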

  13. Design of a factorial experiment with randomization restrictions to assess medical device performance on vascular tissue.

    PubMed

    Diestelkamp, Wiebke S; Krane, Carissa M; Pinnell, Margaret F

    2011-05-20

    Energy-based surgical scalpels are designed to efficiently transect and seal blood vessels using thermal energy to promote protein denaturation and coagulation. Assessment and design improvement of ultrasonic scalpel performance relies on both in vivo and ex vivo testing. The objective of this work was to design and implement a robust, experimental test matrix with randomization restrictions and predictive statistical power, which allowed for identification of those experimental variables that may affect the quality of the seal obtained ex vivo. The design of the experiment included three factors: temperature (two levels); the type of solution used to perfuse the artery during transection (three types); and artery type (two types) resulting in a total of twelve possible treatment combinations. Burst pressures of porcine carotid and renal arteries sealed ex vivo were assigned as the response variable. The experimental test matrix was designed and carried out as a split-plot experiment in order to assess the contributions of several variables and their interactions while accounting for randomization restrictions present in the experimental setup. The statistical software package SAS was utilized and PROC MIXED was used to account for the randomization restrictions in the split-plot design. The combination of temperature, solution, and vessel type had a statistically significant impact on seal quality. The design and implementation of a split-plot experimental test-matrix provided a mechanism for addressing the existing technical randomization restrictions of ex vivo ultrasonic scalpel performance testing, while preserving the ability to examine the potential effects of independent factors or variables. This method for generating the experimental design and the statistical analyses of the resulting data are adaptable to a wide variety of experimental problems involving large-scale tissue-based studies of medical or experimental device efficacy and performance.
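
    The paper analyzes its split-plot design in SAS PROC MIXED; a rough Python analog, fitting a linear mixed model with the whole plot as a random effect so the randomization restriction is respected, might be sketched as below. The factor layout and burst pressures are placeholders, not the study's data.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    df = pd.DataFrame({
        "temperature": np.repeat(["low", "high"], 36),
        "solution": np.tile(np.repeat(["A", "B", "C"], 12), 2),
        "artery": np.tile(["carotid", "renal"], 36),
        # Whole plots encode the randomization restriction of the split-plot setup
        "whole_plot": np.repeat([f"wp{i}" for i in range(12)], 6),
    })
    df["burst_pressure"] = rng.normal(300, 40, len(df))  # response (mmHg), synthetic

    # Fixed factorial effects plus a random whole-plot intercept
    model = smf.mixedlm("burst_pressure ~ temperature * solution * artery",
                        data=df, groups=df["whole_plot"]).fit()
    print(model.summary())
    ```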

  14. A Web-Based Learning Tool Improves Student Performance in Statistics: A Randomized Masked Trial

    ERIC Educational Resources Information Center

    Gonzalez, Jose A.; Jover, Lluis; Cobo, Erik; Munoz, Pilar

    2010-01-01

    Background: e-status is a web-based tool able to generate different statistical exercises and to provide immediate feedback to students' answers. Although the use of Information and Communication Technologies (ICTs) is becoming widespread in undergraduate education, there are few experimental studies evaluating its effects on learning. Method: All…

  15. Effectiveness of Project Based Learning in Statistics for Lower Secondary Schools

    ERIC Educational Resources Information Center

    Siswono, Tatag Yuli Eko; Hartono, Sugi; Kohar, Ahmad Wachidul

    2018-01-01

    Purpose: This study aimed at investigating the effectiveness of implementing Project Based Learning (PBL) on the topic of statistics at a lower secondary school in Surabaya city, Indonesia, indicated by examining student learning outcomes, student responses, and student activity. Research Methods: A quasi experimental method was conducted over two…

  16. Decision Support Systems: Applications in Statistics and Hypothesis Testing.

    ERIC Educational Resources Information Center

    Olsen, Christopher R.; Bozeman, William C.

    1988-01-01

    Discussion of the selection of appropriate statistical procedures by educators highlights a study conducted to investigate the effectiveness of decision aids in facilitating the use of appropriate statistics. Experimental groups and a control group using a printed flow chart, a computer-based decision aid, and a standard text are described. (11…

  17. Brownian motion properties of optoelectronic random bit generators based on laser chaos.

    PubMed

    Li, Pu; Yi, Xiaogang; Liu, Xianglian; Wang, Yuncai; Wang, Yongge

    2016-07-11

    The nondeterministic properties of an optoelectronic random bit generator (RBG) based on laser chaos are experimentally analyzed from two aspects: the central limit theorem and the law of the iterated logarithm. The random bits are extracted from an optical feedback chaotic laser diode using a multi-bit extraction technique in the electrical domain. Our experimental results demonstrate that the generated random bits have no statistical distance from Brownian motion and, in addition, pass the state-of-the-art industry-benchmark statistical test suite (NIST SP800-22). Together these provide mathematically provable evidence that an ultrafast random bit generator based on laser chaos can be used as a nondeterministic random bit source.
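
    The two checks named above can be illustrated with a minimal sketch: map the bits to a +/-1 random walk, look at the central-limit scaling of the partial sums, and compare the walk against the law-of-the-iterated-logarithm envelope. An ordinary PRNG stands in here for the laser-chaos bit stream.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    bits = rng.integers(0, 2, size=1_000_000)
    walk = np.cumsum(2 * bits - 1)           # Brownian-motion-like partial sums S_n
    n = np.arange(1, bits.size + 1)

    # CLT check: S_n / sqrt(n) should behave like a standard normal at large n
    print("S_n/sqrt(n) at n=10^6:", walk[-1] / np.sqrt(n[-1]))

    # LIL check: limsup |S_n| / sqrt(2 n log log n) = 1 a.s., so excursions
    # beyond the envelope should be rare
    m = n > 10
    inside = np.abs(walk[m]) <= np.sqrt(2 * n[m] * np.log(np.log(n[m])))
    print("fraction of steps inside the LIL envelope:", inside.mean())
    ```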

  18. Set statistics in conductive bridge random access memory device with Cu/HfO₂/Pt structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Meiyun; Long, Shibing, E-mail: longshibing@ime.ac.cn; Wang, Guoming

    2014-11-10

    The switching parameter variation of resistive switching memory is one of the most important challenges in its application. In this letter, we have studied the set statistics of conductive bridge random access memory with a Cu/HfO₂/Pt structure. The experimental distributions of the set parameters in several off-resistance ranges are shown to nicely fit a Weibull model. The Weibull slopes of the set voltage and current increase and decrease logarithmically with off resistance, respectively. This experimental behavior is perfectly captured by a Monte Carlo simulator based on the cell-based set voltage statistics model and the Quantum Point Contact electron transport model. Our work provides indications for the improvement of the switching uniformity.

  19. Correlation of RNA secondary structure statistics with thermodynamic stability and applications to folding.

    PubMed

    Wu, Johnny C; Gardner, David P; Ozer, Stuart; Gutell, Robin R; Ren, Pengyu

    2009-08-28

    The accurate prediction of the secondary and tertiary structure of an RNA with different folding algorithms is dependent on several factors, including the energy functions. However, an RNA higher-order structure cannot be predicted accurately from its sequence based on a limited set of energy parameters. The inter- and intramolecular forces between this RNA and other small molecules and macromolecules, in addition to other factors in the cell such as pH, ionic strength, and temperature, influence the complex dynamics associated with the transition of a single-stranded RNA to its secondary and tertiary structure. Since all of the factors that affect the formation of an RNA's 3D structure cannot be determined experimentally, statistically derived potential energy has been used previously in the prediction of protein structure. In the current work, we evaluate the statistical free energy of various secondary structure motifs, including base-pair stacks, hairpin loops, and internal loops, using their statistical frequency obtained from the comparative analysis of more than 50,000 RNA sequences stored in the RNA Comparative Analysis Database (rCAD) at the Comparative RNA Web (CRW) Site. Statistical energy was computed from the structural statistics for several datasets. While the statistical energy for a base-pair stack correlates with experimentally derived free energy values, suggesting a Boltzmann-like distribution, variation is observed between different molecules and their location on the phylogenetic tree of life. Our statistical energy values calculated for several structural elements were utilized in the Mfold RNA-folding algorithm. The combined statistical energy values for base-pair stacks, hairpins and internal loop flanks result in a significant improvement in the accuracy of secondary structure prediction; the hairpin flanks contribute the most.
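
    The Boltzmann-style conversion from observed motif frequency to statistical energy can be sketched as follows; the stack counts and the uniform reference frequency are hypothetical, not rCAD statistics.

    ```python
    import numpy as np

    RT = 0.616  # kcal/mol at 37 C (310 K)
    # Hypothetical counts of base-pair stack motifs from a comparative alignment
    counts = {"GC/CG": 120_000, "AU/UA": 60_000, "GU/UG": 15_000}
    total = sum(counts.values())
    f_ref = 1.0 / len(counts)  # uniform reference frequency

    # Statistical (knowledge-based) energy: dG = -RT ln(f_observed / f_reference)
    for stack, c in counts.items():
        dG = -RT * np.log((c / total) / f_ref)
        print(f"{stack}: statistical energy = {dG:+.2f} kcal/mol")
    ```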

  20. Design of a factorial experiment with randomization restrictions to assess medical device performance on vascular tissue

    PubMed Central

    2011-01-01

    Background Energy-based surgical scalpels are designed to efficiently transect and seal blood vessels using thermal energy to promote protein denaturation and coagulation. Assessment and design improvement of ultrasonic scalpel performance relies on both in vivo and ex vivo testing. The objective of this work was to design and implement a robust, experimental test matrix with randomization restrictions and predictive statistical power, which allowed for identification of those experimental variables that may affect the quality of the seal obtained ex vivo. Methods The design of the experiment included three factors: temperature (two levels); the type of solution used to perfuse the artery during transection (three types); and artery type (two types) resulting in a total of twelve possible treatment combinations. Burst pressures of porcine carotid and renal arteries sealed ex vivo were assigned as the response variable. Results The experimental test matrix was designed and carried out as a split-plot experiment in order to assess the contributions of several variables and their interactions while accounting for randomization restrictions present in the experimental setup. The statistical software package SAS was utilized and PROC MIXED was used to account for the randomization restrictions in the split-plot design. The combination of temperature, solution, and vessel type had a statistically significant impact on seal quality. Conclusions The design and implementation of a split-plot experimental test-matrix provided a mechanism for addressing the existing technical randomization restrictions of ex vivo ultrasonic scalpel performance testing, while preserving the ability to examine the potential effects of independent factors or variables. This method for generating the experimental design and the statistical analyses of the resulting data are adaptable to a wide variety of experimental problems involving large-scale tissue-based studies of medical or experimental device efficacy and performance. PMID:21599963

  1. Ice Water Classification Using Statistical Distribution Based Conditional Random Fields in RADARSAT-2 Dual Polarization Imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Li, F.; Zhang, S.; Hao, W.; Zhu, T.; Yuan, L.; Xiao, F.

    2017-09-01

    In this paper, a Statistical Distribution based Conditional Random Fields (STA-CRF) algorithm is exploited to improve marginal ice-water classification. Pixel-level ice concentration is presented for comparison of the CRF-based methods. Furthermore, in order to explore the most effective statistical distribution model to integrate into STA-CRF, five statistical distribution models are investigated. The STA-CRF methods are tested on two scenes around Prydz Bay and the Adélie Depression, which contain a variety of ice types during the melt season. Experimental results indicate that the proposed method resolves the sea ice edge well in the Marginal Ice Zone (MIZ) and shows a robust distinction between ice and water.

  2. Predicting Acquisition of Learning Outcomes: A Comparison of Traditional and Activity-Based Instruction in an Introductory Statistics Course.

    ERIC Educational Resources Information Center

    Geske, Jenenne A.; Mickelson, William T.; Bandalos, Deborah L.; Jonson, Jessica; Smith, Russell W.

    The bulk of experimental research related to reforms in the teaching of statistics concentrates on the effects of alternative teaching methods on statistics achievement. This study expands on that research by including an examination of the effects of instructor and the interaction between instructor and method on achievement as well as attitudes,…

  3. STATWIZ - AN ELECTRONIC STATISTICAL TOOL (ABSTRACT)

    EPA Science Inventory

    StatWiz is a web-based, interactive, and dynamic statistical tool for researchers. It will allow researchers to input information and/or data and then receive experimental design options, or outputs from data analysis. StatWiz is envisioned as an expert system that will walk rese...

  4. Physical concepts in the development of constitutive equations

    NASA Technical Reports Server (NTRS)

    Cassenti, B. N.

    1985-01-01

    Proposed viscoplastic material models include in their formulation observed material response but do not generally incorporate principles from thermodynamics, statistical mechanics, and quantum mechanics. Numerous hypotheses have been made for material response based on first principles, and many of these hypotheses have been tested experimentally. The proposed viscoplastic theories must be checked against these hypotheses and their experimental basis. The physics of thermodynamics, statistical mechanics and quantum mechanics, and the effects of defects, are reviewed for their application to the development of constitutive laws.

  5. Enhancing efficiency and quality of statistical estimation of immunogenicity assay cut points through standardization and automation.

    PubMed

    Su, Cheng; Zhou, Lei; Hu, Zheng; Weng, Winnie; Subramani, Jayanthi; Tadkod, Vineet; Hamilton, Kortney; Bautista, Ami; Wu, Yu; Chirmule, Narendra; Zhong, Zhandong Don

    2015-10-01

    Biotherapeutics can elicit immune responses, which can alter the exposure, safety, and efficacy of the therapeutics. A well-designed and robust bioanalytical method is critical for the detection and characterization of relevant anti-drug antibody (ADA) and the success of an immunogenicity study. As a fundamental criterion in immunogenicity testing, assay cut points need to be statistically established with a risk-based approach to reduce subjectivity. This manuscript describes the development of a validated, web-based, multi-tier customized assay statistical tool (CAST) for assessing cut points of ADA assays. The tool provides an intuitive web interface that allows users to import experimental data generated from a standardized experimental design, select the assay factors, run the standardized analysis algorithms, and generate tables, figures, and listings (TFL). It allows bioanalytical scientists to perform complex statistical analysis at a click of the button to produce reliable assay parameters in support of immunogenicity studies. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. A model for generating Surface EMG signal of m. Tibialis Anterior.

    PubMed

    Siddiqi, Ariba; Kumar, Dinesh; Arjunan, Sridhar P

    2014-01-01

    A model that simulates the surface electromyogram (sEMG) signal of m. Tibialis Anterior has been developed and tested. It has a firing rate equation based on experimental findings and a recruitment threshold based on an observed statistical distribution. Importantly, it considers both slow and fast motor unit types, distinguished by their conduction velocity. The model assumes that the deeper unipennate half of the muscle does not contribute significantly to the potential induced on the surface of the muscle, and approximates the muscle as having a parallel structure. The model was validated by comparing the simulated and the experimental sEMG signal recordings. Experiments were conducted on eight subjects who performed isometric dorsiflexion at 10, 20, 30, 50, 75, and 100% maximal voluntary contraction. The normalized root mean square and median frequency of the experimental and simulated EMG signals were computed and the slopes of their linearity with force were statistically analyzed. The gradients were found to be similar (p>0.05) for the experimental and simulated sEMG signals, validating the proposed model.

  7. [Triple-type theory of statistics and its application in the scientific research of biomedicine].

    PubMed

    Hu, Liang-ping; Liu, Hui-gang

    2005-07-20

    To point out the crux of why so many people fail to grasp statistics, and to bring forth a "triple-type theory of statistics" to solve the problem in a creative way. Based on long experience in teaching and research in statistics, the triple-type theory was raised and clarified. Examples are provided to demonstrate that the three types, i.e., the expressive type, the prototype and the standardized type, are the essentials for people to apply statistics rationally both in theory and practice; moreover, it is demonstrated by instances that the three types are correlated with each other. The theory can help people see the essence when interpreting and analyzing the problems of experimental designs and statistical analyses in medical research work. Investigations reveal that for some questions the three types are mutually identical; for some questions the prototype is also the standardized type; and for some others the three types are distinct from each other. It is shown that in some multifactor experimental research, no standardized type corresponding to the prototype exists at all, because the researchers have committed the mistake of "incomplete control" in setting up experimental groups. This is a problem which should be solved by the concept and method of "division". Once the triple type for each question is clarified, a proper experimental design and statistical method can be arrived at easily. The triple-type theory of statistics can help people avoid committing statistical mistakes, or at least decrease the misuse rate dramatically, and improve the quality, level and speed of biomedical research in the process of applying statistics. It can also help improve the quality of statistical textbooks and the teaching of statistics, and it demonstrates how to advance biomedical statistics.

  8. A Weibull statistics-based lignocellulose saccharification model and a built-in parameter accurately predict lignocellulose hydrolysis performance.

    PubMed

    Wang, Mingyu; Han, Lijuan; Liu, Shasha; Zhao, Xuebing; Yang, Jinghua; Loh, Soh Kheang; Sun, Xiaomin; Zhang, Chenxi; Fang, Xu

    2015-09-01

    Renewable energy from lignocellulosic biomass has been deemed an alternative to depleting fossil fuels. In order to improve this technology, we aim to develop robust mathematical models for the enzymatic lignocellulose degradation process. By analyzing 96 groups of previously published and newly obtained lignocellulose saccharification results and fitting them to the Weibull distribution, we discovered that Weibull statistics can accurately predict lignocellulose saccharification data, regardless of the type of substrates, enzymes and saccharification conditions. A mathematical model for enzymatic lignocellulose degradation was subsequently constructed based on Weibull statistics. Further analysis of the mathematical structure of the model and experimental saccharification data showed the significance of the two parameters in this model. In particular, the λ value, defined as the characteristic time, represents the overall performance of the saccharification system. This suggestion was further supported by statistical analysis of experimental saccharification data and analysis of the glucose production levels when λ and n values change. In conclusion, the constructed Weibull statistics-based model can accurately predict lignocellulose hydrolysis behavior and we can use the λ parameter to assess the overall performance of enzymatic lignocellulose degradation. Advantages and potential applications of the model and the λ value in saccharification performance assessment were discussed. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
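
    Fitting a saccharification time course to a Weibull form and reading off the characteristic time λ might look like this minimal sketch; the time points and yields are synthetic, not the paper's 96 datasets.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def weibull_yield(t, y_max, lam, n):
        # Weibull-shaped saccharification curve: Y(t) = Y_max * (1 - exp(-(t/lam)^n))
        return y_max * (1.0 - np.exp(-(t / lam) ** n))

    t = np.array([2, 4, 8, 12, 24, 48, 72], dtype=float)      # hours
    y = np.array([8.0, 14.5, 24.0, 30.5, 42.0, 51.0, 54.0])   # glucose yield (%)

    popt, pcov = curve_fit(weibull_yield, t, y, p0=(60.0, 20.0, 1.0))
    y_max, lam, n = popt
    print(f"Y_max={y_max:.1f}%, lambda={lam:.1f} h (characteristic time), n={n:.2f}")
    ```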

  9. Structural damage detection based on stochastic subspace identification and statistical pattern recognition: II. Experimental validation under varying temperature

    NASA Astrophysics Data System (ADS)

    Lin, Y. Q.; Ren, W. X.; Fang, S. E.

    2011-11-01

    Although most vibration-based damage detection methods can acquire satisfactory verification on analytical or numerical structures, most of them may encounter problems when applied to real-world structures under varying environments. The damage detection methods that directly extract damage features from the periodically sampled dynamic time history response measurements are desirable but relevant research and field application verification are still lacking. In this second part of a two-part paper, the robustness and performance of the statistics-based damage index using the forward innovation model by stochastic subspace identification of a vibrating structure proposed in the first part have been investigated against two prestressed reinforced concrete (RC) beams tested in the laboratory and a full-scale RC arch bridge tested in the field under varying environments. Experimental verification is focused on temperature effects. It is demonstrated that the proposed statistics-based damage index is insensitive to temperature variations but sensitive to the structural deterioration or state alteration. This makes it possible to detect the structural damage for the real-scale structures experiencing ambient excitations and varying environmental conditions.

  10. Robust Combining of Disparate Classifiers Through Order Statistics

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Ghosh, Joydeep

    2001-01-01

    Integrating the outputs of multiple classifiers via combiners or meta-learners has led to substantial improvements in several difficult pattern recognition problems. In this article we investigate a family of combiners based on order statistics, for robust handling of situations where there are large discrepancies in performance of individual classifiers. Based on a mathematical modeling of how the decision boundaries are affected by order statistic combiners, we derive expressions for the reductions in error expected when simple output combination methods based on the median, the maximum and, in general, the ith order statistic, are used. Furthermore, we analyze the trim and spread combiners, both based on linear combinations of the ordered classifier outputs, and show that in the presence of uneven classifier performance, they often provide substantial gains over both linear and simple order statistics combiners. Experimental results on both real world data and standard public domain data sets corroborate these findings.
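
    A minimal sketch of the order-statistics combiners discussed above (median, maximum, and a trimmed mean over ordered classifier outputs) follows; the posterior matrix is synthetic and includes one deliberately wrong classifier to show the robustness argument.

    ```python
    import numpy as np

    # outputs[i, k] = classifier i's posterior for class k (5 classifiers, 3 classes)
    outputs = np.array([[0.70, 0.20, 0.10],
                        [0.60, 0.30, 0.10],
                        [0.10, 0.80, 0.10],   # a badly wrong classifier
                        [0.65, 0.25, 0.10],
                        [0.55, 0.35, 0.10]])

    median_comb = np.median(outputs, axis=0)
    max_comb = np.max(outputs, axis=0)
    # Trim combiner: drop the smallest and largest output per class, average the rest
    trim_comb = np.mean(np.sort(outputs, axis=0)[1:-1], axis=0)

    for name, scores in [("median", median_comb), ("max", max_comb), ("trim", trim_comb)]:
        print(f"{name:>6}: class {np.argmax(scores)} with scores {np.round(scores, 3)}")
    ```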

  11. A clinical investigation of the efficacy of a dentifrice containing 1.5% arginine and 1450 ppm fluoride, as sodium monofluorophosphate in a calcium base, on primary root caries.

    PubMed

    Hu, D Y; Yin, W; Li, X; Feng, Y; Zhang, Y P; Cummins, D; Mateo, L R; Ellwood, R P

    2013-01-01

    The purpose of this six-month study was to assess the ability of a new dentifrice containing 1.5% arginine, an insoluble calcium compound, and 1450 ppm fluoride, as sodium monofluorophosphate, to arrest and reverse primary root caries lesions in adults. Three test groups used dentifrices which contained either: 1) 1.5% arginine and 1450 ppm fluoride as sodium monofluorophosphate in a calcium base (experimental); 2) 1450 ppm fluoride as sodium fluoride in a silica base (positive control); or 3) no fluoride in a calcium base (negative control). The study participants were residents of the city of Chengdu, Sichuan Province, China. In order to take part, subjects had to have at least one non-cavitated primary root caries lesion. A total of 412 subjects completed the study. They were aged from 50 to 70 years (mean age 64 +/- 4.1 years) and 53.6% were female. Efficacy for arresting and reversal of primary root caries was assessed by clinical hardness measures and through the use of the Electrical Caries Monitor. After three months of product use, clinical hardness measures showed that 27.7%, 24.6%, and 13.1% of lesions had improved in the experimental, positive, and negative control groups, respectively, and 0.7%, 4.5%, and 16.8% had become worse, respectively. The differences in the distribution of lesion change between the negative control group and both the experimental (p < 0.001) and positive control (p = 0.001) groups were statistically significant. The Electrical Caries Monitor was also used as an objective measure of lesion severity. The end values increased from baseline to the three-month examinations, but none of the differences between the groups attained statistical significance. After six months, clinical hardness measures showed that only one lesion (0.7%), in the experimental group, was worse than at the baseline examination, compared to 9.0% and 18.2% in the positive and negative control groups, respectively. In addition, 61.7%, 56.0%, and 27.0%, respectively, showed improvement for the three groups. The differences in the distribution of lesion change scores between the negative control group and both the experimental (p < 0.001) and positive control (p < 0.001) groups were statistically significant, as was the difference between the experimental group and the positive control (p = 0.006). The Electrical Caries Monitor end values for the experimental, positive, and negative control groups at the six-month examination were 7.9 MΩ, 1.9 MΩ, and 387 kΩ, respectively. The differences between the negative control group and both the experimental (p < 0.001) and positive control (p < 0.001) groups were statistically significant. The difference between the experimental and positive control groups was also statistically significant (p = 0.03). It is concluded that the new toothpaste containing 1.5% arginine and 1450 ppm fluoride, as sodium monofluorophosphate in a calcium base, provided greater anticaries benefits than a conventional toothpaste containing 1450 ppm fluoride. Both fluoride toothpastes demonstrated greater benefits than the non-fluoride toothpaste.

  12. D-OPTIMAL EXPERIMENTAL DESIGNS TO TEST FOR DEPARTURE FROM ADDITIVITY IN A FIXED-RATIO MIXTURE RAY.

    EPA Science Inventory

    Traditional factorial designs for evaluating interactions among chemicals in a mixture are prohibitive when the number of chemicals is large. However, recent advances in statistically-based experimental design have made it easier to evaluate interactions involving many chemicals...

  13. The effect on prospective teachers of the learning environment supported by dynamic statistics software

    NASA Astrophysics Data System (ADS)

    Koparan, Timur

    2016-02-01

    In this study, the effect of a learning environment supported by dynamic statistics software on the achievement and attitudes of prospective teachers is examined. To this end, an achievement test, an attitude scale for statistics, and interviews were used as data collection tools. The achievement test comprises 8 problems based on statistical data, and the attitude scale comprises 13 Likert-type items. The study was carried out in the fall semester of the 2014-2015 academic year at a university in Turkey. The study, which employed the pre-test/post-test control group design of the quasi-experimental research method, was carried out on a group of 80 prospective teachers, 40 in the control group and 40 in the experimental group. Both groups had four hours of classes on descriptive statistics. The classes with the control group were carried out through traditional methods, while dynamic statistics software was used in the experimental group. Five prospective teachers from the experimental group were interviewed clinically after the application for a deeper examination of their views about it. The qualitative data gained are presented under various themes. At the end of the study, it was found that there is a significant difference in favour of the experimental group in terms of achievement and attitudes, and that the prospective teachers take an affirmative approach to the use of dynamic software and see it as an effective tool to enrich maths classes. In accordance with the findings of the study, it is suggested that dynamic software, which offers unique opportunities, be used in classes by teachers and students.

  14. Cation Selectivity in Biological Cation Channels Using Experimental Structural Information and Statistical Mechanical Simulation.

    PubMed

    Finnerty, Justin John; Peyser, Alexander; Carloni, Paolo

    2015-01-01

    Cation selective channels constitute the gate for ion currents through the cell membrane. Here we present an improved statistical mechanical model based on atomistic structural information, cation hydration state and without tuned parameters that reproduces the selectivity of biological Na+ and Ca2+ ion channels. The importance of the inclusion of step-wise cation hydration in these results confirms the essential role partial dehydration plays in the bacterial Na+ channels. The model, proven reliable against experimental data, could be straightforwardly used for designing Na+ and Ca2+ selective nanopores.

  15. Improving UWB-Based Localization in IoT Scenarios with Statistical Models of Distance Error.

    PubMed

    Monica, Stefania; Ferrari, Gianluigi

    2018-05-17

    Interest in the Internet of Things (IoT) is rapidly increasing, as the number of connected devices is exponentially growing. One of the application scenarios envisaged for IoT technologies involves indoor localization and context awareness. In this paper, we focus on a localization approach that relies on a particular type of communication technology, namely Ultra Wide Band (UWB). UWB technology is an attractive choice for indoor localization, owing to its high accuracy. Since localization algorithms typically rely on estimated inter-node distances, the goal of this paper is to evaluate the improvement brought by a simple (linear) statistical model of the distance error. On the basis of an extensive experimental measurement campaign, we propose a general analytical framework, based on a Least Square (LS) method, to derive a novel statistical model for the range estimation error between a pair of UWB nodes. The proposed statistical model is then applied to improve the performance of a few illustrative localization algorithms in various realistic scenarios. The obtained experimental results show that the use of the proposed statistical model improves the accuracy of the considered localization algorithms with a reduction of the localization error up to 66%.
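
    The linear error model described above can be sketched with an ordinary least-squares calibration: regress measured range on true range, then invert the fit to correct new measurements. The range pairs below are illustrative stand-ins for a UWB measurement campaign.

    ```python
    import numpy as np

    true_d = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 8.0, 10.0])       # meters
    meas_d = np.array([1.12, 2.18, 3.25, 4.31, 5.42, 6.45, 8.61, 10.72])

    # Fit meas = a*true + b by least squares, then correct via true_hat = (meas - b)/a
    a, b = np.polyfit(true_d, meas_d, 1)
    corrected = (meas_d - b) / a
    print(f"a={a:.3f}, b={b:.3f} m")
    print("range RMS error before:", np.sqrt(np.mean((meas_d - true_d) ** 2)).round(3))
    print("range RMS error after :", np.sqrt(np.mean((corrected - true_d) ** 2)).round(3))
    ```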

  16. Interpretation of the results of statistical measurements. [search for basic probability model

    NASA Technical Reports Server (NTRS)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional, which defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.

  17. Optimal Experimental Design for Model Discrimination

    ERIC Educational Resources Information Center

    Myung, Jay I.; Pitt, Mark A.

    2009-01-01

    Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it…

  18. The problem is not just sample size: the consequences of low base rates in policing experiments in smaller cities.

    PubMed

    Hinkle, Joshua C; Weisburd, David; Famega, Christine; Ready, Justin

    2013-01-01

    Hot spots policing is one of the most influential police innovations, with a strong body of experimental research showing it to be effective in reducing crime and disorder. However, most studies have been conducted in major cities, and we thus know little about whether it is effective in smaller cities, which account for a majority of police agencies. The lack of experimental studies in smaller cities is likely in part due to challenges designing statistically powerful tests in such contexts. The current article explores the challenges of statistical power and "noise" resulting from low base rates of crime in smaller cities and provides suggestions for future evaluations to overcome these limitations. Data from a randomized experimental evaluation of broken windows policing in hot spots are used to illustrate the challenges that low base rates present for evaluating hot spots policing programs in smaller cities. Analyses show low base rates make it difficult to detect treatment effects. Very large effect sizes would be required to reach sufficient power, and random fluctuations around low base rates make detecting treatment effects difficult, irrespective of power, by masking differences between treatment and control groups. Low base rates present strong challenges to researchers attempting to evaluate hot spots policing in smaller cities. As such, base rates must be taken directly into account when designing experimental evaluations. The article offers suggestions for researchers attempting to expand the examination of hot spots policing and other microplace-based interventions to smaller jurisdictions.
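
    The base-rate problem above can be made concrete with a standard two-proportion power calculation: as the baseline event rate per hot spot falls, even a large relative reduction becomes a small absolute effect and the required sample size explodes. The rates and the assumed 50% relative reduction below are illustrative.

    ```python
    import numpy as np
    from statsmodels.stats.proportion import proportion_effectsize
    from statsmodels.stats.power import NormalIndPower

    power_solver = NormalIndPower()
    for p_control in (0.30, 0.10, 0.02):       # baseline event rate per hot spot
        p_treat = p_control * 0.5              # an assumed 50% relative reduction
        es = proportion_effectsize(p_control, p_treat)  # Cohen's h
        n = power_solver.solve_power(effect_size=es, alpha=0.05, power=0.8,
                                     alternative="two-sided")
        print(f"base rate {p_control:.2f}: about {np.ceil(n):.0f} units per arm needed")
    ```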

  19. The Development and Demonstration of Multiple Regression Models for Operant Conditioning Questions.

    ERIC Educational Resources Information Center

    Fanning, Fred; Newman, Isadore

    Based on the assumption that inferential statistics can make the operant conditioner more sensitive to possible significant relationships, regression models were developed to test the statistical significance between slopes and Y intercepts of the experimental and control group subjects. These results were then compared to the traditional operant…

  20. Statistical design of quantitative mass spectrometry-based proteomic experiments.

    PubMed

    Oberg, Ann L; Vitek, Olga

    2009-05-01

    We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations, and using real-data examples of profiling of disease.
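
    A blocked ANOVA layout of the kind the review advocates might be sketched as follows: model log-intensity with a fixed group effect plus a block term for the mass-spectrometry run, so run-to-run bias is separated from the group contrast. The group/run layout and intensities are simulated, not from the review.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(7)
    df = pd.DataFrame({
        "group": np.tile(["disease", "control"], 12),
        "run": np.repeat([f"run{i}" for i in range(6)], 4),   # blocking factor
    })
    # Simulate a group effect plus a per-run bias that blocking should absorb
    run_bias = df["run"].map({f"run{i}": b for i, b in enumerate(rng.normal(0, 0.3, 6))})
    df["log_intensity"] = (10 + 0.5 * (df["group"] == "disease")
                           + run_bias + rng.normal(0, 0.2, len(df)))

    fit = smf.ols("log_intensity ~ C(group) + C(run)", data=df).fit()
    print(anova_lm(fit, typ=2))
    ```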

  1. Nonlinear wave chaos: statistics of second harmonic fields.

    PubMed

    Zhou, Min; Ott, Edward; Antonsen, Thomas M; Anlage, Steven M

    2017-10-01

    Concepts from the field of wave chaos have been shown to successfully predict the statistical properties of linear electromagnetic fields in electrically large enclosures. The Random Coupling Model (RCM) describes these properties by incorporating both universal features described by Random Matrix Theory and the system-specific features of particular system realizations. In an effort to extend this approach to the nonlinear domain, we add an active nonlinear frequency-doubling circuit to an otherwise linear wave chaotic system, and we measure the statistical properties of the resulting second harmonic fields. We develop an RCM-based model of this system as two linear chaotic cavities coupled by means of a nonlinear transfer function. The harmonic field strengths are predicted to be the product of two statistical quantities and the nonlinearity characteristics. Statistical results from measurement-based calculation, RCM-based simulation, and direct experimental measurements are compared and show good agreement over many decades of power.

  2. Experimental Mathematics and Computational Statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, David H.; Borwein, Jonathan M.

    2009-04-30

    The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include both applications of experimental mathematics in statistics and statistical methods applied to computational mathematics.

  3. Quasi-experimental Studies in the Fields of Infection Control and Antibiotic Resistance, Ten Years Later: A Systematic Review.

    PubMed

    Alsaggaf, Rotana; O'Hara, Lyndsay M; Stafford, Kristen A; Leekha, Surbhi; Harris, Anthony D

    2018-02-01

    OBJECTIVE A systematic review of quasi-experimental studies in the field of infectious diseases was published in 2005. The aim of this study was to assess improvements in the design and reporting of quasi-experiments 10 years after the initial review. We also aimed to report the statistical methods used to analyze quasi-experimental data. DESIGN Systematic review of articles published from January 1, 2013, to December 31, 2014, in 4 major infectious disease journals. METHODS Quasi-experimental studies focused on infection control and antibiotic resistance were identified and classified based on 4 criteria: (1) type of quasi-experimental design used, (2) justification of the use of the design, (3) use of correct nomenclature to describe the design, and (4) statistical methods used. RESULTS Of 2,600 articles, 173 (7%) featured a quasi-experimental design, compared to 73 of 2,320 articles (3%) in the previous review (P<.01). Moreover, 21 articles (12%) utilized a study design with a control group; 6 (3.5%) justified the use of a quasi-experimental design; and 68 (39%) identified their design using the correct nomenclature. In addition, 2-group statistical tests were used in 75 studies (43%); 58 studies (34%) used standard regression analysis; 18 (10%) used segmented regression analysis; 7 (4%) used standard time-series analysis; 5 (3%) used segmented time-series analysis; and 10 (6%) did not utilize statistical methods for comparisons. CONCLUSIONS While some progress occurred over the decade, it is crucial to continue improving the design and reporting of quasi-experimental studies in the fields of infection control and antibiotic resistance to better evaluate the effectiveness of important interventions. Infect Control Hosp Epidemiol 2018;39:170-176.

  4. Nonlinear Curve-Fitting Program

    NASA Technical Reports Server (NTRS)

    Everhart, Joel L.; Badavi, Forooz F.

    1989-01-01

    A nonlinear optimization algorithm helps in finding the best-fit curve. The Nonlinear Curve Fitting Program, NLINEAR, is an interactive curve-fitting routine based on a description of the quadratic expansion of the χ² statistic. It utilizes a nonlinear optimization algorithm to calculate the best statistically weighted values of the parameters of the fitting function so that the χ² statistic is minimized. It provides the user with such statistical information as goodness of fit and estimated values of the parameters producing the highest degree of correlation between the experimental data and the mathematical model. Written in FORTRAN 77.
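
    NLINEAR itself is FORTRAN 77; the same idea, minimizing a statistically weighted χ² for a nonlinear model, can be sketched in Python as below. The model form, data, and per-point uncertainties are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def model(x, a, b, c):
        return a * np.exp(-b * x) + c

    x = np.linspace(0, 5, 20)
    rng = np.random.default_rng(3)
    sigma = np.full_like(x, 0.05)                  # per-point uncertainties
    y = model(x, 2.0, 1.3, 0.5) + rng.normal(0, sigma)

    # Weighted nonlinear least squares: minimizes sum(((y - model)/sigma)^2)
    popt, pcov = curve_fit(model, x, y, p0=(1, 1, 0), sigma=sigma, absolute_sigma=True)
    chi2 = np.sum(((y - model(x, *popt)) / sigma) ** 2)  # the minimized chi-square
    print("parameters:", np.round(popt, 3))
    print("chi2/dof:", chi2 / (x.size - len(popt)))
    print("parameter std errors:", np.round(np.sqrt(np.diag(pcov)), 3))
    ```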

  5. Role of hydrogen in volatile behaviour of defects in SiO2-based electronic devices

    NASA Astrophysics Data System (ADS)

    Wimmer, Yannick; El-Sayed, Al-Moatasem; Gös, Wolfgang; Grasser, Tibor; Shluger, Alexander L.

    2016-06-01

    Charge capture and emission by point defects in gate oxides of metal-oxide-semiconductor field-effect transistors (MOSFETs) strongly affect reliability and performance of electronic devices. Recent advances in experimental techniques used for probing defect properties have led to new insights into their characteristics. In particular, these experimental data show a repeated dis- and reappearance (the so-called volatility) of the defect-related signals. We use multiscale modelling to explain the charge capture and emission as well as defect volatility in amorphous SiO2 gate dielectrics. We first briefly discuss the recent experimental results and use a multiphonon charge capture model to describe the charge-trapping behaviour of defects in silicon-based MOSFETs. We then link this model to ab initio calculations that investigate the three most promising defect candidates. Statistical distributions of defect characteristics obtained from ab initio calculations in amorphous SiO2 are compared with the experimentally measured statistical properties of charge traps. This allows us to suggest an atomistic mechanism to explain the experimentally observed volatile behaviour of defects. We conclude that the hydroxyl-E' centre is a promising candidate to explain all the observed features, including defect volatility.

  6. The Structure of Research Methodology Competency in Higher Education and the Role of Teaching Teams and Course Temporal Distance

    ERIC Educational Resources Information Center

    Schweizer, Karl; Steinwascher, Merle; Moosbrugger, Helfried; Reiss, Siegbert

    2011-01-01

    The development of research methodology competency is a major aim of the psychology curriculum at universities. Usually, three courses concentrating on basic statistics, advanced statistics and experimental methods, respectively, serve the achievement of this aim. However, this traditional curriculum-based course structure gives rise to the…

  7. Statistical analysis and application of quasi experiments to antimicrobial resistance intervention studies.

    PubMed

    Shardell, Michelle; Harris, Anthony D; El-Kamary, Samer S; Furuno, Jon P; Miller, Ram R; Perencevich, Eli N

    2007-10-01

    Quasi-experimental study designs are frequently used to assess interventions that aim to limit the emergence of antimicrobial-resistant pathogens. However, previous studies using these designs have often used suboptimal statistical methods, which may result in researchers making spurious conclusions. Methods used to analyze quasi-experimental data include 2-group tests, regression analysis, and time-series analysis, and they all have specific assumptions, data requirements, strengths, and limitations. An example of a hospital-based intervention to reduce methicillin-resistant Staphylococcus aureus infection rates and reduce overall length of stay is used to explore these methods.
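
    Segmented regression, one of the methods named above, can be sketched as an interrupted time-series model with a level change and a slope change at the month the intervention starts. The monthly rates and intervention point below are simulated, not hospital data.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(5)
    months = np.arange(48)
    post = (months >= 24).astype(int)              # intervention begins at month 24
    # Simulated infection rate: baseline trend, then a level drop and slope change
    rate = (10 + 0.05 * months - 2.0 * post
            - 0.10 * post * (months - 24) + rng.normal(0, 0.5, 48))

    df = pd.DataFrame({"t": months, "post": post,
                       "t_after": post * (months - 24), "rate": rate})
    fit = smf.ols("rate ~ t + post + t_after", data=df).fit()
    print(fit.params.round(3))   # baseline slope, level change, slope change
    ```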

  8. Cation Selectivity in Biological Cation Channels Using Experimental Structural Information and Statistical Mechanical Simulation

    PubMed Central

    Finnerty, Justin John

    2015-01-01

    Cation selective channels constitute the gate for ion currents through the cell membrane. Here we present an improved statistical mechanical model based on atomistic structural information, cation hydration state and without tuned parameters that reproduces the selectivity of biological Na+ and Ca2+ ion channels. The importance of the inclusion of step-wise cation hydration in these results confirms the essential role partial dehydration plays in the bacterial Na+ channels. The model, proven reliable against experimental data, could be straightforwardly used for designing Na+ and Ca2+ selective nanopores. PMID:26460827

  9. Comparisons of non-Gaussian statistical models in DNA methylation analysis.

    PubMed

    Ma, Zhanyu; Teschendorff, Andrew E; Yu, Hong; Taghia, Jalil; Guo, Jun

    2014-06-16

    As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance.
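
    The bounded-support point above can be illustrated by fitting a beta distribution, one natural non-Gaussian choice for data on (0, 1), and comparing it against a Gaussian fit; this is a sketch on simulated methylation beta-values, not the authors' exact models.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(11)
    beta_values = rng.beta(a=0.8, b=4.0, size=2000)   # skewed toward low methylation

    # Beta fit respects the (0, 1) support; loc/scale fixed to the unit interval
    a_hat, b_hat, loc, scale = stats.beta.fit(beta_values, floc=0, fscale=1)
    mu, sd = beta_values.mean(), beta_values.std()

    # Compare log-likelihoods of the bounded beta fit vs an unbounded Gaussian fit
    ll_beta = stats.beta.logpdf(beta_values, a_hat, b_hat).sum()
    ll_norm = stats.norm.logpdf(beta_values, mu, sd).sum()
    print(f"beta fit a={a_hat:.2f}, b={b_hat:.2f}; "
          f"logL beta={ll_beta:.0f} vs normal={ll_norm:.0f}")
    ```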

  10. Comparisons of Non-Gaussian Statistical Models in DNA Methylation Analysis

    PubMed Central

    Ma, Zhanyu; Teschendorff, Andrew E.; Yu, Hong; Taghia, Jalil; Guo, Jun

    2014-01-01

    As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance. PMID:24937687

  11. Tips and Tricks for Successful Application of Statistical Methods to Biological Data.

    PubMed

    Schlenker, Evelyn

    2016-01-01

    This chapter discusses experimental design and the use of statistics to describe characteristics of data (descriptive statistics) and inferential statistics that test the hypothesis posed by the investigator. Inferential statistics, based on probability distributions, depend upon the type and distribution of the data. For data that are continuous, randomly and independently selected, and normally distributed, more powerful parametric tests such as Student's t-test and analysis of variance (ANOVA) can be used. For non-normally distributed or skewed data, transformation of the data (using logarithms) may normalize the data, allowing use of parametric tests. Alternatively, with skewed data, nonparametric tests can be utilized, some of which rely on data that are ranked prior to statistical analysis. Experimental designs and analyses need to balance between committing Type I errors (false positives) and Type II errors (false negatives). For a variety of clinical studies that determine risk or benefit, relative risk ratios (random clinical trials and cohort studies) or odds ratios (case-control studies) are utilized. Although both use 2 × 2 tables, their premise and calculations differ. Finally, special statistical methods are applied to microarray and proteomics data, since the large number of genes or proteins evaluated increases the likelihood of false discoveries. Additional studies in separate samples are used to verify microarray and proteomic data. Examples in this chapter and references are available to help continued investigation of experimental designs and appropriate data analysis.

  12. A Hybrid Template-Based Composite Classification System

    DTIC Science & Technology

    2009-02-01

    Hybrid Classifier: Forced Decision; Forced Decision Experimental Results; Test for Statistical Significance; Test for Statistical Significance: NDEC Option; Implementing the Hybrid Classifier with OOL Targets ... complementary in nature. Complementary classifiers are observed by finding an optimal method for partitioning the problem space. For example, the ...

  13. Application of Turchin's method of statistical regularization

    NASA Astrophysics Data System (ADS)

    Zelenyi, Mikhail; Poliakova, Mariia; Nozik, Alexander; Khudyakov, Alexey

    2018-04-01

    During analysis of experimental data, one usually needs to restore a signal after it has been convoluted with some kind of apparatus function. According to Hadamard's definition this problem is ill-posed and requires regularization to provide sensible results. In this article we describe an implementation of Turchin's method of statistical regularization based on the Bayesian approach to the regularization strategy.
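
    To see why regularization matters for such deconvolution problems, a minimal Tikhonov-style sketch with a smoothness prior is given below; this is a generic illustration with a synthetic Gaussian apparatus function, not the implementation described in the article:

      import numpy as np

      # Discrete convolution model f = K @ phi + noise with a Gaussian apparatus function.
      n = 100
      x = np.linspace(0.0, 1.0, n)
      K = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * 0.03 ** 2))
      phi_true = np.exp(-((x - 0.5) ** 2) / 0.005)
      f = K @ phi_true + 0.01 * np.random.default_rng(1).standard_normal(n)

      # Naive inversion of the ill-posed problem amplifies noise; a smoothness
      # prior (here a second-difference penalty) stabilizes the solution.
      D = np.diff(np.eye(n), 2, axis=0)    # second-difference operator
      alpha = 1e-3                         # regularization strength
      phi_reg = np.linalg.solve(K.T @ K + alpha * D.T @ D, K.T @ f)
      print(np.linalg.norm(phi_reg - phi_true) / np.linalg.norm(phi_true))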

  14. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning. PMID:21451741

  15. Detached Eddy Simulation of Flap Side-Edge Flow

    NASA Technical Reports Server (NTRS)

    Balakrishnan, Shankar K.; Shariff, Karim R.

    2016-01-01

    Detached Eddy Simulation (DES) of flap side-edge flow was performed with a wing and half-span flap configuration used in previous experimental and numerical studies. The focus of the study is the unsteady flow features responsible for the production of far-field noise. The simulation was performed at a Reynolds number (based on the main wing chord) of 3.7 million. Reynolds Averaged Navier-Stokes (RANS) simulations were performed as a precursor to the DES. The results of these precursor simulations match previous experimental and RANS results closely. Although the present DES simulations have not yet reached statistical stationarity, some unsteady features of the developing flap side-edge flowfield are presented. In the final paper it is expected that statistically stationary results will be presented, including comparisons of surface pressure spectra with experimental data.

  16. Identification of natural images and computer-generated graphics based on statistical and textural features.

    PubMed

    Peng, Fei; Li, Jiao-ting; Long, Min

    2015-03-01

    To discriminate the acquisition pipelines of digital images, a novel scheme for the identification of natural images and computer-generated graphics is proposed based on statistical and textural features. First, the differences between them are investigated from the view of statistics and texture, and a 31-dimensional feature vector is acquired for identification. Then, LIBSVM is used for the classification. Finally, the experimental results are presented. The results show that it can achieve an identification accuracy of 97.89% for computer-generated graphics, and an identification accuracy of 97.75% for natural images. The analyses also demonstrate that the proposed method has excellent performance compared with some existing methods based only on statistical features or other features. The method has a great potential to be implemented for the identification of natural images and computer-generated graphics. © 2014 American Academy of Forensic Sciences.
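
    A hedged sketch of the classification stage follows: the record names LIBSVM and a 31-dimensional feature vector, so the snippet substitutes scikit-learn's SVC and synthetic stand-in features, since the actual feature extraction is not reproduced here:

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      # Stand-ins for 31-dimensional statistical/textural feature vectors.
      X_natural = rng.normal(0.0, 1.0, size=(500, 31))
      X_graphics = rng.normal(0.4, 1.0, size=(500, 31))
      X = np.vstack([X_natural, X_graphics])
      y = np.array([0] * 500 + [1] * 500)   # 0 = natural image, 1 = computer-generated

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
      print("accuracy:", clf.score(X_te, y_te))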

  17. A new statistical method for transfer coefficient calculations in the framework of the general multiple-compartment model of transport for radionuclides in biological systems.

    PubMed

    Garcia, F; Arruda-Neto, J D; Manso, M V; Helene, O M; Vanin, V R; Rodriguez, O; Mesa, J; Likhachev, V P; Filho, J W; Deppman, A; Perez, G; Guzman, F; de Camargo, S P

    1999-10-01

    A new and simple statistical procedure (STATFLUX) for the calculation of transfer coefficients of radionuclide transport to animals and plants is proposed. The method is based on the general multiple-compartment model, which uses a system of linear equations involving geometrical volume considerations. By using experimentally available curves of radionuclide concentrations versus time for each animal compartment (organ), flow parameters were estimated by employing a least-squares procedure, whose consistency was tested. Some numerical results are presented in order to compare the STATFLUX transfer coefficients with those from other works and with experimental data.
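
    The core of such a compartment-model fit, estimating transfer coefficients from concentration-versus-time curves by least squares, can be sketched for a hypothetical two-compartment chain (the rates k12 and k20 and the function names are illustrative, not the STATFLUX code):

      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.optimize import curve_fit

      def organ_curve(t, k12, k20):
          # Two-compartment chain: intake compartment -> organ -> excretion.
          rhs = lambda s, c: [-k12 * c[0], k12 * c[0] - k20 * c[1]]
          sol = solve_ivp(rhs, (0.0, t.max()), [1.0, 0.0], t_eval=t)
          return sol.y[1]                  # concentration-versus-time curve of the organ

      t_obs = np.linspace(0.0, 10.0, 25)
      c_obs = organ_curve(t_obs, 0.8, 0.3)
      c_obs = c_obs + 0.01 * np.random.default_rng(2).standard_normal(t_obs.size)

      # Least-squares estimates of the transfer coefficients and their standard errors.
      (k12_hat, k20_hat), cov = curve_fit(organ_curve, t_obs, c_obs, p0=[0.5, 0.5])
      print(k12_hat, k20_hat, np.sqrt(np.diag(cov)))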

  18. Polypropylene Production Optimization in Fluidized Bed Catalytic Reactor (FBCR): Statistical Modeling and Pilot Scale Experimental Validation

    PubMed Central

    Khan, Mohammad Jakir Hossain; Hussain, Mohd Azlan; Mujtaba, Iqbal Mohammed

    2014-01-01

    Polypropylene is one type of plastic that is widely used in our everyday life. This study focuses on the identification and justification of the optimum process parameters for polypropylene production in a novel pilot-plant-based fluidized bed reactor. This first-of-its-kind statistical modeling with experimental validation for the process parameters of polypropylene production was conducted by applying the ANOVA (analysis of variance) method to Response Surface Methodology (RSM). Three important process variables, i.e., reaction temperature, system pressure and hydrogen percentage, were considered as the important input factors for polypropylene production in the analysis performed. In order to examine the effect of process parameters and their interactions, the ANOVA method was utilized among a range of other statistical diagnostic tools such as the correlation between actual and predicted values, the residuals and predicted response, outlier t plot, 3D response surface and contour analysis plots. The statistical analysis showed that the proposed quadratic model had a good fit with the experimental results. At optimum conditions, with a temperature of 75°C, system pressure of 25 bar and hydrogen percentage of 2%, the highest polypropylene production obtained is 5.82% per pass. Hence it is concluded that the developed experimental design and proposed model can be successfully employed with over a 95% confidence level for optimum polypropylene production in a fluidized bed catalytic reactor (FBCR). PMID:28788576
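
    The RSM step, fitting a quadratic model in reaction temperature, system pressure and hydrogen percentage and checking its fit, might look roughly as follows (toy response data; only the factor names come from the abstract):

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.preprocessing import PolynomialFeatures

      rng = np.random.default_rng(3)
      # Columns: temperature (degC), pressure (bar), hydrogen (%), from a designed experiment.
      X = rng.uniform([60, 15, 1], [90, 35, 3], size=(20, 3))
      y = (5.8 - 0.002 * (X[:, 0] - 75) ** 2 - 0.004 * (X[:, 1] - 25) ** 2
           - 0.5 * (X[:, 2] - 2) ** 2 + 0.05 * rng.standard_normal(20))   # toy yield per pass

      quad = PolynomialFeatures(degree=2, include_bias=False)   # full quadratic model
      model = LinearRegression().fit(quad.fit_transform(X), y)
      print("R^2:", model.score(quad.transform(X), y))          # goodness-of-fit check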

  19. Précis of statistical significance: rationale, validity, and utility.

    PubMed

    Chow, S L

    1998-04-01

    The null-hypothesis significance-test procedure (NHSTP) is defended in the context of the theory-corroboration experiment, as well as the following contrasts: (a) substantive hypotheses versus statistical hypotheses, (b) theory corroboration versus statistical hypothesis testing, (c) theoretical inference versus statistical decision, (d) experiments versus nonexperimental studies, and (e) theory corroboration versus treatment assessment. The null hypothesis can be true because it is the hypothesis that errors are randomly distributed in data. Moreover, the null hypothesis is never used as a categorical proposition. Statistical significance means only that chance influences can be excluded as an explanation of data; it does not identify the nonchance factor responsible. The experimental conclusion is drawn with the inductive principle underlying the experimental design. A chain of deductive arguments gives rise to the theoretical conclusion via the experimental conclusion. The anomalous relationship between statistical significance and the effect size often used to criticize NHSTP is more apparent than real. The absolute size of the effect is not an index of evidential support for the substantive hypothesis. Nor is the effect size, by itself, informative as to the practical importance of the research result. Being a conditional probability, statistical power cannot be the a priori probability of statistical significance. The validity of statistical power is debatable because statistical significance is determined with a single sampling distribution of the test statistic based on H0, whereas it takes two distributions to represent statistical power or effect size. Sample size should not be determined in the mechanical manner envisaged in power analysis. It is inappropriate to criticize NHSTP for nonstatistical reasons. At the same time, neither effect size, nor confidence interval estimate, nor posterior probability can be used to exclude chance as an explanation of data. Neither can any of them fulfill the nonstatistical functions expected of them by critics.

  20. Experimental Test of Heisenberg's Measurement Uncertainty Relation Based on Statistical Distances

    NASA Astrophysics Data System (ADS)

    Ma, Wenchao; Ma, Zhihao; Wang, Hengyan; Chen, Zhihua; Liu, Ying; Kong, Fei; Li, Zhaokai; Peng, Xinhua; Shi, Mingjun; Shi, Fazhan; Fei, Shao-Ming; Du, Jiangfeng

    2016-04-01

    Incompatible observables can be approximated by compatible observables in joint measurement or measured sequentially, with constrained accuracy as implied by Heisenberg's original formulation of the uncertainty principle. Recently, Busch, Lahti, and Werner proposed inaccuracy trade-off relations based on statistical distances between probability distributions of measurement outcomes [P. Busch et al., Phys. Rev. Lett. 111, 160405 (2013); P. Busch et al., Phys. Rev. A 89, 012129 (2014)]. Here we reformulate their theoretical framework, derive an improved relation for qubit measurement, and perform an experimental test on a spin system. The relation reveals that the worst-case inaccuracy is tightly bounded from below by the incompatibility of target observables, and is verified by the experiment employing joint measurement in which two compatible observables designed to approximate two incompatible observables on one qubit are measured simultaneously.

  1. Experimental Test of Heisenberg's Measurement Uncertainty Relation Based on Statistical Distances.

    PubMed

    Ma, Wenchao; Ma, Zhihao; Wang, Hengyan; Chen, Zhihua; Liu, Ying; Kong, Fei; Li, Zhaokai; Peng, Xinhua; Shi, Mingjun; Shi, Fazhan; Fei, Shao-Ming; Du, Jiangfeng

    2016-04-22

    Incompatible observables can be approximated by compatible observables in joint measurement or measured sequentially, with constrained accuracy as implied by Heisenberg's original formulation of the uncertainty principle. Recently, Busch, Lahti, and Werner proposed inaccuracy trade-off relations based on statistical distances between probability distributions of measurement outcomes [P. Busch et al., Phys. Rev. Lett. 111, 160405 (2013); P. Busch et al., Phys. Rev. A 89, 012129 (2014)]. Here we reformulate their theoretical framework, derive an improved relation for qubit measurement, and perform an experimental test on a spin system. The relation reveals that the worst-case inaccuracy is tightly bounded from below by the incompatibility of target observables, and is verified by the experiment employing joint measurement in which two compatible observables designed to approximate two incompatible observables on one qubit are measured simultaneously.

  2. Parametric study of the dynamic JWL-EOS for detonation products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urtiew, P.A.; Hayes, B.

    1990-03-01

    The JWL equation of state describing the adiabatic expansion of detonation products is revisited to complete the description of the principal eigenvalue, to reset the secondary eigenvalue to produce a well-behaved adiabatic gamma profile, and to normalize the characteristic equation of state in terms of conventional parameters having a clear experimental interpretation. This is accomplished by interjecting a dynamic flow condition concerning the value of the relative specific volume when the particle velocity of the detonation products is zero. In addition, a set of generic parameters based on the statistical distribution of the primary explosives making up the available data base is presented. Unlike theoretical and statistical mechanical models, the adiabatic gamma function for these materials is seen to have a positive initial slope in accord with experimental findings. 10 refs., 4 figs.

  3. Hybrid statistics-simulations based method for atom-counting from ADF STEM images.

    PubMed

    De Wael, Annelies; De Backer, Annick; Jones, Lewys; Nellist, Peter D; Van Aert, Sandra

    2017-06-01

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials. Copyright © 2017 Elsevier B.V. All rights reserved.
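
    The statistics-based ingredient of such atom-counting is typically a mixture-model fit to the scattering cross-sections of atomic columns, with the number of components chosen by an information criterion; a generic sketch with synthetic data, using BIC where this literature uses an ICL-style criterion:

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(4)
      # Synthetic scattering cross-sections for columns containing 1, 2 or 3 atoms.
      scs = np.concatenate([rng.normal(1.0, 0.05, 80),
                            rng.normal(2.1, 0.07, 60),
                            rng.normal(3.3, 0.09, 40)]).reshape(-1, 1)

      fits = [GaussianMixture(n_components=k, random_state=0).fit(scs) for k in range(1, 7)]
      best = min(fits, key=lambda m: m.bic(scs))   # pick the model order by BIC
      print("selected number of components:", best.n_components)
      print("column classes:", np.bincount(best.predict(scs)))   # one class per atom count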

  4. Forecasting runout of rock and debris avalanches

    USGS Publications Warehouse

    Iverson, Richard M.; Evans, S.G.; Mugnozza, G.S.; Strom, A.; Hermanns, R.L.

    2006-01-01

    Physically based mathematical models and statistically based empirical equations each may provide useful means of forecasting runout of rock and debris avalanches. This paper compares the foundations, strengths, and limitations of a physically based model and a statistically based forecasting method, both of which were developed to predict runout across three-dimensional topography. The chief advantage of the physically based model results from its ties to physical conservation laws and well-tested axioms of soil and rock mechanics, such as the Coulomb friction rule and effective-stress principle. The output of this model provides detailed information about the dynamics of avalanche runout, at the expense of high demands for accurate input data, numerical computation, and experimental testing. In comparison, the statistical method requires relatively modest computation and no input data except identification of prospective avalanche source areas and a range of postulated avalanche volumes. Like the physically based model, the statistical method yields maps of predicted runout, but it provides no information on runout dynamics. Although the two methods differ significantly in their structure and objectives, insights gained from one method can aid refinement of the other.

  5. Experimental design and quantitative analysis of microbial community multiomics.

    PubMed

    Mallick, Himel; Ma, Siyuan; Franzosa, Eric A; Vatanen, Tommi; Morgan, Xochitl C; Huttenhower, Curtis

    2017-11-30

    Studies of the microbiome have become increasingly sophisticated, and multiple sequence-based, molecular methods as well as culture-based methods exist for population-scale microbiome profiles. To link the resulting host and microbial data types to human health, several experimental design considerations, data analysis challenges, and statistical epidemiological approaches must be addressed. Here, we survey current best practices for experimental design in microbiome molecular epidemiology, including technologies for generating, analyzing, and integrating microbiome multiomics data. We highlight studies that have identified molecular bioactives that influence human health, and we suggest steps for scaling translational microbiome research to high-throughput target discovery across large populations.

  6. Analysis of uncertainties and convergence of the statistical quantities in turbulent wall-bounded flows by means of a physically based criterion

    NASA Astrophysics Data System (ADS)

    Andrade, João Rodrigo; Martins, Ramon Silva; Thompson, Roney Leon; Mompean, Gilmar; da Silveira Neto, Aristeu

    2018-04-01

    The present paper provides an analysis of the statistical uncertainties associated with direct numerical simulation (DNS) results and experimental data for turbulent channel and pipe flows, showing a new physically based quantification of these errors, to improve the determination of the statistical deviations between DNSs and experiments. The analysis is carried out using a recently proposed criterion by Thompson et al. ["A methodology to evaluate statistical errors in DNS data of plane channel flows," Comput. Fluids 130, 1-7 (2016)] for fully turbulent plane channel flows, where the mean velocity error is estimated by considering the Reynolds stress tensor, and using the balance of the mean force equation. It also presents how the residual error evolves in time for a DNS of a plane channel flow, and the influence of the Reynolds number on its convergence rate. The root mean square of the residual error is shown in order to capture a single quantitative value of the error associated with the dimensionless averaging time. The evolution in time of the error norm is compared with the final error provided by DNS data of similar Reynolds numbers available in the literature. A direct consequence of this approach is that it was possible to compare different numerical results and experimental data, providing an improved understanding of the convergence of the statistical quantities in turbulent wall-bounded flows.
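
    For context, the mean force balance invoked by the criterion is the statement that in fully developed plane channel flow the total shear stress varies linearly across the channel; in the usual notation,

      \nu \frac{dU}{dy} - \langle uv \rangle = u_\tau^2 \left( 1 - \frac{y}{h} \right),

    where U is the mean velocity, \langle uv \rangle the Reynolds shear stress, u_\tau the friction velocity and h the channel half-height. The residual error is then the departure of the averaged data from this balance.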

  7. Prediction/discussion-based learning cycle versus conceptual change text: comparative effects on students' understanding of genetics

    NASA Astrophysics Data System (ADS)

    khawaldeh, Salem A. Al

    2013-07-01

    Background and purpose: The purpose of this study was to investigate the comparative effects of a prediction/discussion-based learning cycle (HPD-LC), conceptual change text (CCT) and traditional instruction on 10th grade students' understanding of genetics concepts. Sample: Participants were 112 male 10th-grade students in three classes of the same school located in an urban area. The three classes, taught by the same biology teacher, were randomly assigned as a prediction/discussion-based learning cycle class (n = 39), a conceptual change text class (n = 37) and a traditional class (n = 36). Design and method: A quasi-experimental research design of pre-test-post-test non-equivalent control group was adopted. Participants completed the Genetics Concept Test as pre-test and post-test, to examine the effects of instructional strategies on their genetics understanding. Pre-test scores and Test of Logical Thinking scores were used as covariates. Results: The analysis of covariance showed a statistically significant difference between the experimental and control groups in favor of the experimental groups after treatment. However, no statistically significant difference between the experimental groups (HPD-LC versus CCT instruction) was found. Conclusions: Overall, the findings of this study support the use of the prediction/discussion-based learning cycle and conceptual change text in both research and teaching. The findings may be useful for improving classroom practices in teaching science concepts and for the development of suitable materials promoting students' understanding of science.

  8. Slow crack growth: Models and experiments

    NASA Astrophysics Data System (ADS)

    Santucci, S.; Vanel, L.; Ciliberto, S.

    2007-07-01

    The properties of slow crack growth in brittle materials are analyzed both theoretically and experimentally. We propose a model based on a thermally activated rupture process. Considering a 2D spring network subjected to an external load and to thermal noise, we show that a preexisting crack in the network may slowly grow because of stress fluctuations. An analytical solution is found for the evolution of the crack length as a function of time, the time to rupture and the statistics of the crack jumps. These theoretical predictions are verified by studying experimentally the subcritical growth of a single crack in thin sheets of paper. A good agreement between the theoretical predictions and the experimental results is found. In particular, our model suggests that the statistical stress fluctuations trigger rupture events at a nanometric scale corresponding to the diameter of cellulose microfibrils.

  9. Statistical analysis of experimental multifragmentation events in 64Zn+112Sn at 40 MeV/nucleon

    NASA Astrophysics Data System (ADS)

    Lin, W.; Zheng, H.; Ren, P.; Liu, X.; Huang, M.; Wada, R.; Chen, Z.; Wang, J.; Xiao, G. Q.; Qu, G.

    2018-04-01

    A statistical multifragmentation model (SMM) is applied to the experimentally observed multifragmentation events in an intermediate heavy-ion reaction. Using the temperature and symmetry energy extracted from the isobaric yield ratio (IYR) method based on the modified Fisher model (MFM), SMM is applied to the reaction 64Zn+112Sn at 40 MeV/nucleon. The experimental isotope and mass distributions of the primary reconstructed fragments are compared with the SMM results without an afterburner, and they are well reproduced. The extracted temperature T and symmetry energy coefficient asym from SMM-simulated events, using the IYR method, are also consistent with those from the experiment. These results strongly suggest that in the multifragmentation process there is a freezeout volume, in which thermal and chemical equilibrium is established before or at the time of intermediate-mass fragment emission.

  10. A Unified Statistical Rain-Attenuation Model for Communication Link Fade Predictions and Optimal Stochastic Fade Control Design Using a Location-Dependent Rain-Statistic Database

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1990-01-01

    A static and dynamic rain-attenuation model is presented which describes the statistics of attenuation on an arbitrarily specified satellite link for any location for which there are long-term rainfall statistics. The model may be used in the design of optimal stochastic control algorithms to mitigate the effects of attenuation and maintain link reliability. A rain-statistics data base is compiled, which makes it possible to apply the model to any location in the continental U.S. with a resolution of 0.5 degrees in latitude and longitude. The model predictions are compared with experimental observations, showing good agreement.

  11. Study on Hybrid Image Search Technology Based on Texts and Contents

    NASA Astrophysics Data System (ADS)

    Wang, H. T.; Ma, F. L.; Yan, C.; Pan, H.

    2018-05-01

    Text-based and content-based image search were studied first. A text-based image feature extraction method was put forward that integrates statistical and topic features, addressing the limitation of extracting keywords only from the statistical features of words. A search-by-image method based on multi-feature fusion was then put forward, addressing the imprecision of content-based image search that relies on a single feature. Because the text-based and content-based methods differ and are difficult to fuse directly, a layered searching method was proposed that relies primarily on text-based image search and secondarily on content-based image search. The feasibility and effectiveness of the hybrid search algorithm were experimentally verified.

  12. Application of a statistical design to the optimization of parameters and culture medium for alpha-amylase production by Aspergillus oryzae CBS 819.72 grown on gruel (wheat grinding by-product).

    PubMed

    Kammoun, Radhouane; Naili, Belgacem; Bejar, Samir

    2008-09-01

    The production optimization of alpha-amylase (E.C.3.2.1.1) from the Aspergillus oryzae CBS 819.72 fungus, using a by-product of wheat grinding (gruel) as sole carbon source, was performed with statistical methodology based on three experimental designs. The optimisation of temperature, agitation and inoculum size was attempted using a Box-Behnken design under the response surface methodology. The screening of nineteen nutrients for their influence on alpha-amylase production was achieved using a Plackett-Burman design. KH(2)PO(4), urea, glycerol, (NH(4))(2)SO(4), CoCl(2), casein hydrolysate, soybean meal hydrolysate and MgSO(4) were selected based on their positive influence on enzyme formation. The optimized nutrient concentrations were obtained using a Taguchi experimental design, and the analysis of the data predicted a theoretical increase in alpha-amylase expression of 73.2% (from 40.1 to 151.1 U/ml). These conditions were validated experimentally and revealed an enhanced alpha-amylase yield of 72.7%.
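
    For the screening stage, a two-level Plackett-Burman-type design can be built from a Hadamard matrix and main effects estimated by simple contrasts; a generic sketch (eight runs and seven hypothetical factors, not the authors' nineteen-nutrient design):

      import numpy as np
      from scipy.linalg import hadamard

      H = hadamard(8)            # 8-run, two-level orthogonal array
      design = H[:, 1:]          # 7 factor columns at levels +1 / -1

      rng = np.random.default_rng(5)
      true_effects = np.array([3.0, 0.0, 1.5, 0.0, 0.0, 0.0, 0.0])   # two active factors
      y = 40 + design @ true_effects + rng.normal(0, 0.5, 8)         # measured responses

      # Main effect of a factor = mean(response at +1) - mean(response at -1);
      # orthogonality of the design makes this a simple contrast.
      effects = design.T @ y / (len(y) / 2)
      print(np.round(effects, 2))   # large |effect| flags factors worth optimizing further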

  13. The emergence of modern statistics in agricultural science: analysis of variance, experimental design and the reshaping of research at Rothamsted Experimental Station, 1919-1933.

    PubMed

    Parolini, Giuditta

    2015-01-01

    During the twentieth century statistical methods have transformed research in the experimental and social sciences. Qualitative evidence has largely been replaced by quantitative results and the tools of statistical inference have helped foster a new ideal of objectivity in scientific knowledge. The paper will investigate this transformation by considering the genesis of analysis of variance and experimental design, statistical methods nowadays taught in every elementary course of statistics for the experimental and social sciences. These methods were developed by the mathematician and geneticist R. A. Fisher during the 1920s, while he was working at Rothamsted Experimental Station, where agricultural research was in turn reshaped by Fisher's methods. Analysis of variance and experimental design required new practices and instruments in field and laboratory research, and imposed a redistribution of expertise among statisticians, experimental scientists and the farm staff. On the other hand the use of statistical methods in agricultural science called for a systematization of information management and made computing an activity integral to the experimental research done at Rothamsted, permanently integrating the statisticians' tools and expertise into the station research programme. Fisher's statistical methods did not remain confined within agricultural research and by the end of the 1950s they had come to stay in psychology, sociology, education, chemistry, medicine, engineering, economics, quality control, just to mention a few of the disciplines which adopted them.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Labby, Z.

    Physicists are often expected to have a solid grounding in experimental design and statistical analysis, sometimes filling in when biostatisticians or other experts are not available for consultation. Unfortunately, graduate education on these topics is seldom emphasized and few opportunities for continuing education exist. Clinical physicists incorporate new technology and methods into their practice based on published literature. A poor understanding of experimental design and analysis could result in inappropriate use of new techniques. Clinical physicists also improve current practice through quality initiatives that require sound experimental design and analysis. Academic physicists with a poor understanding of design and analysis may produce ambiguous (or misleading) results. This can result in unnecessary rewrites, publication rejection, and experimental redesign (wasting time, money, and effort). This symposium will provide a practical review of error and uncertainty, common study designs, and statistical tests. Instruction will primarily focus on practical implementation through examples and answer questions such as: where would you typically apply the test/design and where is the test/design typically misapplied (i.e., common pitfalls)? An analysis of error and uncertainty will also be explored using biological studies and associated modeling as a specific use case. Learning Objectives: (1) understand common experimental testing and clinical trial designs, what questions they can answer, and how to interpret the results; (2) determine where specific statistical tests are appropriate and identify common pitfalls; (3) understand how uncertainty and error are addressed in biological testing and associated biological modeling.

  15. Experimental validation of a distribution theory based analysis of the effect of manufacturing tolerances on permanent magnet synchronous machines

    NASA Astrophysics Data System (ADS)

    Boscaino, V.; Cipriani, G.; Di Dio, V.; Corpora, M.; Curto, D.; Franzitta, V.; Trapanese, M.

    2017-05-01

    An experimental study on the effect of permanent magnet tolerances on the performances of a Tubular Linear Ferrite Motor is presented in this paper. The performances that have been investigated are: cogging force, end effect cogging force and generated thrust. It is demonstrated that: 1) the statistical variability of the magnets introduces harmonics in the spectrum of the cogging force; 2) the value of the end effect cogging force is directly linked to the values of the remanence field of the external magnets placed on the slider; 3) the generated thrust and its statistical distribution depend on the remanence field of the magnets placed on the translator.

  16. Application of Statistical Design for the Production of Cellulase by Trichoderma reesei Using Mango Peel.

    PubMed

    Saravanan, P; Muthuvelayudham, R; Viruthagiri, T

    2012-01-01

    Optimization of the culture medium for cellulase production using Trichoderma reesei was carried out. The optimization of cellulase production using mango peel as substrate was performed with statistical methodology based on experimental designs. The screening of nine nutrients for their influence on cellulase production was achieved using a Plackett-Burman design. Avicel, soybean cake flour, KH(2)PO(4), and CoCl(2)·6H(2)O were selected based on their positive influence on cellulase production. The composition of the selected components was optimized using Response Surface Methodology (RSM). The optimum conditions are as follows: Avicel: 25.30 g/L, soybean cake flour: 23.53 g/L, KH(2)PO(4): 4.90 g/L, and CoCl(2)·6H(2)O: 0.95 g/L. These conditions were validated experimentally and revealed an enhanced cellulase activity of 7.8 IU/mL.

  17. Experimental econophysics: Complexity, self-organization, and emergent properties

    NASA Astrophysics Data System (ADS)

    Huang, J. P.

    2015-03-01

    Experimental econophysics is concerned with statistical physics of humans in the laboratory, and it is based on controlled human experiments developed by physicists to study some problems related to economics or finance. It relies on controlled human experiments in the laboratory together with agent-based modeling (for computer simulations and/or analytical theory), with an attempt to reveal the general cause-effect relationship between specific conditions and emergent properties of real economic/financial markets (a kind of complex adaptive systems). Here I review the latest progress in the field, namely, stylized facts, herd behavior, contrarian behavior, spontaneous cooperation, partial information, and risk management. Also, I highlight the connections between such progress and other topics of traditional statistical physics. The main theme of the review is to show diverse emergent properties of the laboratory markets, originating from self-organization due to the nonlinear interactions among heterogeneous humans or agents (complexity).

  18. Evaluating the quality of a cell counting measurement process via a dilution series experimental design.

    PubMed

    Sarkar, Sumona; Lund, Steven P; Vyzasatya, Ravi; Vanguri, Padmavathy; Elliott, John T; Plant, Anne L; Lin-Gibson, Sheng

    2017-12-01

    Cell counting measurements are critical in the research, development and manufacturing of cell-based products, yet determining cell quantity with accuracy and precision remains a challenge. Validating and evaluating a cell counting measurement process can be difficult because of the lack of appropriate reference material. Here we describe an experimental design and statistical analysis approach to evaluate the quality of a cell counting measurement process in the absence of appropriate reference materials or reference methods. The experimental design is based on a dilution series study with replicate samples and observations as well as measurement process controls. The statistical analysis evaluates the precision and proportionality of the cell counting measurement process and can be used to compare the quality of two or more counting methods. As an illustration of this approach, cell counting measurement processes (automated and manual methods) were compared for a human mesenchymal stromal cell (hMSC) preparation. For the hMSC preparation investigated, results indicated that the automated method performed better than the manual counting methods in terms of precision and proportionality. By conducting well controlled dilution series experimental designs coupled with appropriate statistical analysis, quantitative indicators of repeatability and proportionality can be calculated to provide an assessment of cell counting measurement quality. This approach does not rely on the use of a reference material or comparison to "gold standard" methods known to have limited assurance of accuracy and precision. The approach presented here may help the selection, optimization, and/or validation of a cell counting measurement process. Published by Elsevier Inc.
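
    The proportionality check at the heart of this design reduces to regressing observed counts on the dilution fraction with zero intercept and inspecting the fit and the within-level spread; a hedged sketch with simulated counts (the paper's exact quality metrics are defined in the text itself):

      import numpy as np

      rng = np.random.default_rng(6)
      dilution = np.repeat([1.0, 0.75, 0.5, 0.25, 0.125], 4)   # target fractions, 4 replicates each
      counts = rng.poisson(2.0e6 * dilution).astype(float)      # observed cell counts

      # Proportional model: counts = slope * dilution, zero intercept.
      slope = (dilution @ counts) / (dilution @ dilution)
      ss_res = ((counts - slope * dilution) ** 2).sum()
      ss_tot = ((counts - counts.mean()) ** 2).sum()
      print("slope:", slope, "R^2:", 1 - ss_res / ss_tot)

      # Precision: coefficient of variation within each dilution level.
      for d in np.unique(dilution):
          c = counts[dilution == d]
          print(d, "CV%:", 100 * c.std(ddof=1) / c.mean())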

  19. Statistical analysis of modeling error in structural dynamic systems

    NASA Technical Reports Server (NTRS)

    Hasselman, T. K.; Chrostowski, J. D.

    1990-01-01

    The paper presents a generic statistical model of the (total) modeling error for conventional space structures in their launch configuration. Modeling error is defined as the difference between analytical prediction and experimental measurement. It is represented by the differences between predicted and measured real eigenvalues and eigenvectors. Comparisons are made between pre-test and post-test models. Total modeling error is then subdivided into measurement error, experimental error and 'pure' modeling error, and comparisons made between measurement error and total modeling error. The generic statistical model presented in this paper is based on the first four global (primary structure) modes of four different structures belonging to the generic category of Conventional Space Structures (specifically excluding large truss-type space structures). As such, it may be used to evaluate the uncertainty of predicted mode shapes and frequencies, sinusoidal response, or the transient response of other structures belonging to the same generic category.

  20. Experimental observations of Lagrangian sand grain kinematics under bedload transport: statistical description of the step and rest regimes

    NASA Astrophysics Data System (ADS)

    Guala, M.; Liu, M.

    2017-12-01

    The kinematics of sediment particles is investigated by non-intrusive imaging methods to provide a statistical description of bedload transport in conditions near the threshold of motion. In particular, we focus on the cyclic transition between motion and rest regimes to quantify the waiting time statistics inferred to be responsible for anomalous diffusion, and so far elusive. Despite obvious limitations in the spatio-temporal domain of the observations, we are able to identify the probability distributions of the particle step time and length, velocity, acceleration, waiting time, and thus distinguish which quantities exhibit well converged mean values, based on the thickness of their respective tails. The experimental results shown here for four different transport conditions highlight the importance of the waiting time distribution and represent a benchmark dataset for the stochastic modeling of bedload transport.

  1. Wet scrubbing of biomass producer gas tars using vegetable oil

    NASA Astrophysics Data System (ADS)

    Bhoi, Prakashbhai Ramabhai

    The overall aims of this research study were to generate novel design data and to develop an equilibrium stage-based thermodynamic model of a vegetable oil based wet scrubbing system for the removal of model tar compounds (benzene, toluene and ethylbenzene) found in biomass producer gas. The specific objectives were to design, fabricate and evaluate a vegetable oil based wet scrubbing system and to optimize the design and operating variables, i.e., packed bed height, vegetable oil type, solvent temperature, and solvent flow rate. The experimental wet packed bed scrubbing system includes a liquid distributor specifically designed to distribute a highly viscous vegetable oil uniformly and a mixing section, which was designed to generate a desired concentration of tar compounds in a simulated air stream. A method and calibration protocol of gas chromatography/mass spectroscopy was developed to quantify tar compounds. Experimental data were analyzed statistically using the analysis of variance (ANOVA) procedure. Statistical analysis showed that both soybean and canola oils are potential solvents, providing comparable removal efficiency of tar compounds. The experimental height equivalent to a theoretical plate (HETP) was determined as 0.11 m for the vegetable oil based scrubbing system. Packed bed height and solvent temperature had a highly significant (p < 0.05) effect on the removal of model tar compounds. The packing specific constants, Ch and CP,0, for the Billet and Schultes pressure drop correlation were determined as 2.52 and 2.93, respectively. The equilibrium stage based thermodynamic model predicted the removal efficiency of model tar compounds within 1-6%, 1-4% and 1-2% of the experimental data for benzene, toluene and ethylbenzene, respectively, at a solvent temperature of 30 °C. The NRTL-PR property model and UNIFAC for estimating binary interaction parameters are recommended for modeling absorption of tar compounds in vegetable oils. Bench scale experimental data from the wet scrubbing system would be useful in the design and operation of a pilot scale vegetable oil based system. The process model, validated using experimental data, would be a key design tool for the design and optimization of a pilot scale vegetable oil based system.

  2. Predicting adsorptive removal of chlorophenol from aqueous solution using artificial intelligence based modeling approaches.

    PubMed

    Singh, Kunwar P; Gupta, Shikha; Ojha, Priyanka; Rai, Premanjali

    2013-04-01

    The research aims to develop an artificial intelligence (AI)-based model to predict the adsorptive removal of 2-chlorophenol (CP) in aqueous solution by coconut shell carbon (CSC) using four operational variables (pH of solution, adsorbate concentration, temperature, and contact time), and to investigate their effects on the adsorption process. Accordingly, based on a factorial design, 640 batch experiments were conducted. Nonlinearities in the experimental data were checked using Brock-Dechert-Scheinkman (BDS) statistics. Five nonlinear models were constructed to predict the adsorptive removal of CP in aqueous solution by CSC using the four variables as input. Performances of the constructed models were evaluated and compared using statistical criteria. BDS statistics revealed strong nonlinearity in the experimental data. Performance of all the models constructed here was satisfactory. Radial basis function network (RBFN) and multilayer perceptron network (MLPN) models performed better than the generalized regression neural network, support vector machine, and gene expression programming models. Sensitivity analysis revealed that contact time had the highest effect on adsorption, followed by solution pH, temperature, and CP concentration. The study concluded that all the models constructed here were capable of capturing the nonlinearity in the data. The better generalization and predictive performance of the RBFN and MLPN models suggested that these can be used to predict the adsorption of CP in aqueous solution using CSC.
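
    As a sketch of the best-performing model class, an MLP regressor trained on the four operational variables (the RBFN variant is analogous); the data here are synthetic stand-ins, not the study's 640 batch experiments:

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(7)
      # Stand-in inputs: pH, CP concentration (mg/L), temperature (degC), contact time (min).
      X = rng.uniform([2, 10, 15, 5], [10, 100, 45, 180], size=(640, 4))
      y = (60 - 2.5 * (X[:, 0] - 6) ** 2 + 0.15 * X[:, 3]
           - 0.1 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 2, 640))   # toy removal (%)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      mlp = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0))
      mlp.fit(X_tr, y_tr)
      print("held-out R^2:", mlp.score(X_te, y_te))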

  3. Segmenting lung fields in serial chest radiographs using both population-based and patient-specific shape statistics.

    PubMed

    Shi, Y; Qi, F; Xue, Z; Chen, L; Ito, K; Matsuo, H; Shen, D

    2008-04-01

    This paper presents a new deformable model using both population-based and patient-specific shape statistics to segment lung fields from serial chest radiographs. There are two novelties in the proposed deformable model. First, a modified scale-invariant feature transform (SIFT) local descriptor, which is more distinctive than general intensity and gradient features, is used to characterize the image features in the vicinity of each pixel. Second, the deformable contour is constrained by both population-based and patient-specific shape statistics, and it yields more robust and accurate segmentation of lung fields for serial chest radiographs. In particular, for segmenting the initial time-point images, the population-based shape statistics is used to constrain the deformable contour; as more subsequent images of the same patient are acquired, the patient-specific shape statistics collected online from the previous segmentation results gradually takes on a greater role. Thus, the patient-specific shape statistics is updated each time a new segmentation result is obtained, and it is further used to refine the segmentation results of all the available time-point images. Experimental results show that the proposed method is more robust and accurate than other active shape models in segmenting the lung fields from serial chest radiographs.

  4. CSMP Mathematics for the Intermediate Grades Part II, Teacher's Guide. The Languages of Strings and Arrows. Geometry and Measurement. Probability and Statistics. Experimental Version.

    ERIC Educational Resources Information Center

    CEMREL, Inc., St. Ann, MO.

    This guide represents the final experimental version of a pilot project which was conducted in the United States between 1973 and 1976. The ideas and the manner of presentation are based on the works of Georges and Frederique Papy. They are recognized for having introduced colored arrow drawings ("papygrams") and models of our numeration…

  5. CSMP Mathematics for the Intermediate Grades Part III, Teacher's Guide. The Languages of Strings and Arrows. Geometry and Measurement. Probability and Statistics. Experimental Version.

    ERIC Educational Resources Information Center

    CEMREL, Inc., St. Ann, MO.

    This guide represents the final experimental version of a pilot project which was conducted in the United States between 1973 and 1976. The ideas and the manner of presentation are based on the works of Georges and Frederique Papy. They are recognized for having introduced colored arrow drawings ("papygrams") and models of our numeration…

  6. Commentary.

    ERIC Educational Resources Information Center

    Scarr, Sandra

    1995-01-01

    Argues that Gottlieb rejects population sampling and statistical analyses of distributions as he proposes that his experimental brand of mechanistic science is the only legitimate approach to developmental research. Maintains that Gottlieb exaggerates developmental uncertainty, based on his own research with extreme environmental manipulations.…

  7. The effect of project-based learning on students' statistical literacy levels for data representation

    NASA Astrophysics Data System (ADS)

    Koparan, Timur; Güven, Bülent

    2015-07-01

    The point of this study is to define the effect of a project-based learning approach on 8th grade secondary-school students' statistical literacy levels for data representation. To achieve this goal, a test consisting of 12 open-ended questions was developed in accordance with the views of experts. Seventy 8th grade secondary-school students, 35 in the experimental group and 35 in the control group, took this test twice, once before the application and once after the application. All raw scores were converted into linear measures using the Winsteps 3.72 Rasch modelling program, and t-tests and an ANCOVA analysis were carried out on the linear measures. Based on the findings, it was concluded that the project-based learning approach increases students' level of statistical literacy for data representation. Students' levels of statistical literacy before and after the application were shown through the obtained person-item maps.

  8. Evaporation residue cross-section measurements for 48Ti-induced reactions

    NASA Astrophysics Data System (ADS)

    Sharma, Priya; Behera, B. R.; Mahajan, Ruchi; Thakur, Meenu; Kaur, Gurpreet; Kapoor, Kushal; Rani, Kavita; Madhavan, N.; Nath, S.; Gehlot, J.; Dubey, R.; Mazumdar, I.; Patel, S. M.; Dhibar, M.; Hosamani, M. M.; Khushboo, Kumar, Neeraj; Shamlath, A.; Mohanto, G.; Pal, Santanu

    2017-09-01

    Background: A significant research effort is currently aimed at understanding the synthesis of heavy elements. For this purpose, heavy-ion-induced fusion reactions are used, and various experimental observations have indicated the influence of shell and deformation effects in the compound nucleus (CN) formation. There is a need to understand these two effects. Purpose: To investigate the effect of proton shell closure and deformation through the comparison of evaporation residue (ER) cross sections for systems involving heavy compound nuclei around the ZCN=82 region. Methods: A systematic study of ER cross-section measurements was carried out for the 48Ti+142,150Nd and 48Ti+144Sm systems in the energy range of 140-205 MeV. The measurement was performed using the gas-filled mode of the hybrid recoil mass analyzer at the Inter University Accelerator Centre (IUAC), New Delhi. Theoretical calculations based on a statistical model were carried out incorporating an adjustable barrier scaling factor to fit the experimental ER cross section. Coupled-channel calculations were also performed using the ccfull code to obtain the spin distribution of the CN, which was used as an input in the calculations. Results: Experimental ER cross sections for 48Ti+142,150Nd were found to be considerably smaller than the statistical model predictions, whereas experimental and statistical model predictions for 48Ti+144Sm were of comparable magnitudes. Conclusion: Though comparison of experimental ER cross sections with statistical model predictions indicates considerable non-compound-nuclear processes for the 48Ti+142,150Nd reactions, no such evidence is found for the 48Ti+144Sm system. Further investigations are required to understand the difference in fusion probabilities of the 48Ti+142Nd and 48Ti+144Sm systems.

  9. Free Energy Computations by Minimization of Kullback-Leibler Divergence: An Efficient Adaptive Biasing Potential Method for Sparse Representations

    DTIC Science & Technology

    2011-10-14

    landscapes. It is motivated by statistical learning arguments and unifies the tasks of biasing the molecular dynamics to escape free energy wells and estimating the free energy...experimentally, to characterize global changes as well as investigate relative stabilities. In most applications, a brute-force computation based on

  10. Transfer of SIMNET Training in the Armor Officer Basic Course

    DTIC Science & Technology

    1991-01-01

    group correctly performed more tasks in the posttest, but the difference was not statistically significant for these small samples. Gains from pretest...to posttest were not compared statistically, but the field-trained group showed little average gain. Based on these results and other supporting data...that serve as a control group, and (b) SIMNET classes after the change that serve as a treatment group. The comparison is termed quasi-experimental

  11. Jet Measurements for Development of Jet Noise Prediction Tools

    NASA Technical Reports Server (NTRS)

    Bridges, James E.

    2006-01-01

    The primary focus of my presentation is the development of the jet noise prediction code JeNo with most examples coming from the experimental work that drove the theoretical development and validation. JeNo is a statistical jet noise prediction code, based upon the Lilley acoustic analogy. Our approach uses time-average 2-D or 3-D mean and turbulent statistics of the flow as input. The output is source distributions and spectral directivity.

  12. Learning physics: A comparative analysis between instructional design methods

    NASA Astrophysics Data System (ADS)

    Mathew, Easow

    The purpose of this research was to determine whether there were differences in academic performance between students who participated in traditional versus collaborative problem-based learning (PBL) instructional design approaches to physics curricula. This study utilized a quantitative quasi-experimental design methodology to determine the significance of differences in pre- and posttest introductory physics exam performance between students who participated in traditional (i.e., control group) versus collaborative problem solving (PBL) instructional design (i.e., experimental group) approaches to physics curricula over a college semester in 2008. There were 42 student participants (N = 42) enrolled in an introductory physics course at the research site in the Spring 2008 semester who agreed to participate in this study after reading and signing informed consent documents. A total of 22 participants were assigned to the experimental group (n = 22), who participated in a PBL-based teaching methodology along with traditional lecture methods. The other 20 students were assigned to the control group (n = 20), who participated in the traditional lecture teaching methodology. Both courses were taught by experienced professors with doctoral-level qualifications. The results indicated statistically significant differences (p < .01) in academic performance between students who participated in traditional (i.e., lower physics posttest scores and lower differences between pre- and posttest scores) versus collaborative (i.e., higher physics posttest scores and higher differences between pre- and posttest scores) instructional design approaches to physics curricula. Despite some slight differences in control group and experimental group demographic characteristics (gender, ethnicity, and age), there was a statistically significant (p = .04) difference in the control group, where female average academic improvement was much higher (~63%) than male average academic improvement, which may indicate that traditional teaching methods are more effective for females; no significant difference between male and female participants was noted in the experimental group. There was a statistically significant and negative relationship (r = -.61, p = .01) between age and physics pretest scores in the control group. No statistical analyses yielded significantly different average academic performance values in either group as delineated by ethnicity.

  13. Asymptotic formulae for likelihood-based tests of new physics

    NASA Astrophysics Data System (ADS)

    Cowan, Glen; Cranmer, Kyle; Gross, Eilam; Vitells, Ofer

    2011-02-01

    We describe likelihood-based statistical tests for use in high energy physics for the discovery of new phenomena and for construction of confidence intervals on model parameters. We focus on the properties of the test procedures that allow one to account for systematic uncertainties. Explicit formulae for the asymptotic distributions of test statistics are derived using results of Wilks and Wald. We motivate and justify the use of a representative data set, called the "Asimov data set", which provides a simple method to obtain the median experimental sensitivity of a search or measurement as well as fluctuations about this expectation.
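
    In the notation of this framework, the profile likelihood ratio, the discovery test statistic q_0, and the Asimov estimate of the median discovery significance read

      \lambda(\mu) = \frac{L(\mu, \hat{\hat{\theta}}(\mu))}{L(\hat{\mu}, \hat{\theta})}, \qquad
      q_0 = \begin{cases} -2 \ln \lambda(0), & \hat{\mu} \ge 0, \\ 0, & \hat{\mu} < 0, \end{cases} \qquad
      \mathrm{med}[Z_0] \approx \sqrt{q_{0,\mathrm{A}}},

    where \hat{\hat{\theta}}(\mu) denotes the nuisance parameters profiled at fixed signal strength \mu, and q_{0,\mathrm{A}} is the test statistic evaluated on the Asimov data set.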

  14. Logo image clustering based on advanced statistics

    NASA Astrophysics Data System (ADS)

    Wei, Yi; Kamel, Mohamed; He, Yiwei

    2007-11-01

    In recent years, there has been a growing interest in the research of image content description techniques. Among those, image clustering is one of the most frequently discussed topics. Similar to image recognition, image clustering is also a high-level representation technique. However, it focuses on coarse categorization rather than accurate recognition. Based on wavelet transform (WT) and advanced statistics, the authors propose a novel approach that divides variously shaped logo images into groups according to the external boundary of each logo image. Experimental results show that the presented method is accurate, fast and insensitive to defects.

  15. Markov model plus k-word distributions: a synergy that produces novel statistical measures for sequence comparison.

    PubMed

    Dai, Qi; Yang, Yanchun; Wang, Tianming

    2008-10-15

    Many proposed statistical measures can efficiently compare biological sequences to further infer their structures, functions and evolutionary information. They are related in spirit because all the ideas for sequence comparison try to use the information on the k-word distributions, a Markov model or both. Motivated by adding k-word distributions to a Markov model directly, we investigated two novel statistical measures for sequence comparison, called wre.k.r and S2.k.r. The proposed measures were tested by similarity search, evaluation on functionally related regulatory sequences and phylogenetic analysis. This offers a systematic and quantitative experimental assessment of our measures. Moreover, we compared our achievements with those based on alignment or alignment-free methods. We grouped our experiments into two sets. The first one, performed via ROC (receiver operating curve) analysis, aims at assessing the intrinsic ability of our statistical measures to search for similar sequences from a database and discriminate functionally related regulatory sequences from unrelated sequences. The second one aims at assessing how well our statistical measures are suited for phylogenetic analysis. The experimental assessment demonstrates that our similarity measures, which incorporate k-word distributions into a Markov model, are more efficient.
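
    The flavor of such measures, comparing observed k-word counts with counts expected under a low-order Markov model, can be sketched generically; the distance below is an illustrative centered statistic in the D2*-family, not the exact wre.k.r or S2.k.r definitions:

      from itertools import product

      def counts(seq, k):
          c = {"".join(p): 0 for p in product("ACGT", repeat=k)}
          for i in range(len(seq) - k + 1):
              c[seq[i:i + k]] += 1
          return c

      def expected_markov1(seq, k):
          # Expected k-word counts under a first-order Markov chain (chain rule
          # on dinucleotide counts): E[w] = N(w1w2) * prod_i N(wi wi+1) / N(wi).
          c1, c2 = counts(seq, 1), counts(seq, 2)
          exp = {}
          for w in counts(seq, k):
              e = float(c2[w[0:2]])
              for i in range(1, k - 1):
                  e *= c2[w[i:i + 2]] / max(c1[w[i]], 1)
              exp[w] = e
          return exp

      def centered_similarity(seq_a, seq_b, k=3):
          # Correlate deviations of observed from Markov-expected counts.
          oa, ea = counts(seq_a, k), expected_markov1(seq_a, k)
          ob, eb = counts(seq_b, k), expected_markov1(seq_b, k)
          s = 0.0
          for w in oa:
              denom = (max(ea[w], 1e-9) * max(eb[w], 1e-9)) ** 0.5
              s += (oa[w] - ea[w]) * (ob[w] - eb[w]) / denom
          return s

      print(centered_similarity("ACGTACGTTGCA" * 10, "ACGTTGCAACGT" * 10))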

  16. Effectiveness of an integrated adventure-based training and health education program in promoting regular physical activity among childhood cancer survivors.

    PubMed

    Li, H C William; Chung, Oi Kwan Joyce; Ho, Ka Yan; Chiu, Sau Ying; Lopez, Violeta

    2013-11-01

    There is growing concern about declining levels of physical activity in childhood cancer survivors. This study aimed to examine the effectiveness of an integrated adventure-based training and health education program in promoting changes in exercise behavior and enhancing the physical activity levels, self-efficacy, and quality of life of Hong Kong Chinese childhood cancer survivors. A randomized controlled trial with a two-group pretest and repeated post-test, between-subjects design was conducted with 71 childhood cancer survivors (9- to 16-year-olds). Participants in the experimental group joined a 4-day integrated adventure-based training and health education program. Control group participants received the same amount of time and attention as the experimental group, but not in such a way as to have any specific effect on the outcome measures. Participants' exercise behavior changes, levels of physical activity, self-efficacy, and quality of life were assessed at the time of recruitment and at 3, 6, and 9 months after starting the intervention. Participants in the experimental group reported statistically significant differences in physical activity stages of change (p < 0.001) and higher levels of physical activity (p < 0.001) and self-efficacy (p = 0.04) than those in the control group. Besides, there were statistically significant mean differences (p < 0.001) in the physical activity levels (-2.6), self-efficacy (-2.0), and quality of life (-4.3) of participants in the experimental group from baseline to 9 months after starting the intervention. The integrated adventure-based training and health education program was found to be effective in promoting regular physical activity among childhood cancer survivors. Copyright © 2013 John Wiley & Sons, Ltd.

  17. A school-based health promotion program for stressed nursing students in Taiwan.

    PubMed

    Hsieh, Pei-Lin

    2011-09-01

    Nursing students face both clinical and academic stress. Extensive theoretical and research literature suggests that peer support and regular exercise are critically important and can efficiently manage stress for nursing students. The purpose of this study was to investigate the effect of a school-based health promotion program combining a group physical activity intervention with a peer support program for stressed nursing students. This study used a quasi-experimental design, collecting data from a stress questionnaire, a semistructured questionnaire, and group discussions. Participants included 77 nursing students at an institute of technology in northern Taiwan, selected on the basis of assessment results indicating moderate or severe levels of stress. Participants were randomly assigned to experimental (n = 37) and control (n = 40) groups; program duration was 16 weeks. All participants in the experimental group took part in a group physical activity for 30 minutes three times a week. Eight weeks later, the researcher invited each group to discuss their feelings and stress-coping strategies. Both groups completed pretest and posttest stress questionnaires. Quantitative data were analyzed using the SPSS 14.0 Statistical Package for Windows, and qualitative data from each group discussion were analyzed using content analysis. Results revealed a statistically significant decrease in stress level in the experimental group, and posttest stress levels differed significantly between the experimental and control groups. The results suggest that students who participated in the intervention had less stress after the intervention than those in the control group, and those in the experimental group held positive views of peer support and physical activity. These results confirm the efficacy of school-based health promotion programs in reducing stress in nursing students. The findings may provide educators with information to assist them in developing effective health promotion programs to manage stress for their students, and can also help students develop personal coping strategies through physical activity and peer support.

  18. Toward statistical modeling of saccadic eye-movement and visual saliency.

    PubMed

    Sun, Xiaoshuai; Yao, Hongxun; Ji, Rongrong; Liu, Xian-Ming

    2014-11-01

    In this paper, we present a unified statistical framework for modeling both saccadic eye movements and visual saliency. By analyzing the statistical properties of human eye fixations on natural images, we found that human attention is sparsely distributed and usually deployed to locations with abundant structural information. These observations inspired us to model saccadic behavior and visual saliency based on super-Gaussian component (SGC) analysis. Our model sequentially obtains SGCs using projection pursuit and generates eye movements by selecting the location with the maximum SGC response. Beyond simulating human saccadic behavior, we also demonstrate the method's effectiveness and robustness over state-of-the-art approaches through extensive experiments on synthetic patterns and human eye fixation benchmarks. Several key issues in saliency modeling research, such as individual differences and the effects of scale and blur, are also explored. Based on extensive qualitative and quantitative experimental results, we show the promising potential of statistical approaches for human behavior research.
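
    A crude sketch of the core mechanism, assuming patches sampled from a grayscale image and projection pursuit by random search (the paper's actual optimisation is not reproduced here):

    ```python
    import numpy as np

    def extract_patches(img, size=8, n=2000, seed=0):
        """Sample n mean-removed square patches from a 2-D image."""
        rng = np.random.default_rng(seed)
        ys = rng.integers(0, img.shape[0] - size, n)
        xs = rng.integers(0, img.shape[1] - size, n)
        patches = np.stack([img[y:y + size, x:x + size].ravel()
                            for y, x in zip(ys, xs)])
        return patches - patches.mean(axis=1, keepdims=True)

    def super_gaussian_component(patches, n_candidates=500, seed=1):
        """Projection pursuit by random search: keep the unit direction
        whose projections are most super-Gaussian (highest kurtosis)."""
        rng = np.random.default_rng(seed)
        best_w, best_kurt = None, -np.inf
        for _ in range(n_candidates):
            w = rng.standard_normal(patches.shape[1])
            w /= np.linalg.norm(w)
            z = patches @ w
            kurt = np.mean(z ** 4) / np.mean(z ** 2) ** 2 - 3.0
            if kurt > best_kurt:
                best_w, best_kurt = w, kurt
        return best_w

    # The next simulated fixation would be the image location whose patch
    # gives the maximum absolute response |patch . w| to the component w.
    ```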

  19. Ultrasound-enhanced bioscouring of greige cotton: regression analysis of process factors

    USDA-ARS?s Scientific Manuscript database

    Ultrasound-enhanced bioscouring process factors for greige cotton fabric are examined using a custom experimental design based on statistical principles. An equation is presented which predicts bioscouring performance based upon percent reflectance values obtained from UV-Vis measurements of rutheniu...

  20. What Is Design-Based Causal Inference for RCTs and Why Should I Use It? NCEE 2017-4025

    ERIC Educational Resources Information Center

    Schochet, Peter Z.

    2017-01-01

    Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies. The impact estimators are derived using the building blocks of experimental designs with minimal assumptions, and have good statistical properties. The methods apply to randomized controlled trials (RCTs) and…

  1. Data-Mining Techniques in Detecting Factors Linked to Academic Achievement

    ERIC Educational Resources Information Center

    Martínez Abad, Fernando; Chaparro Caso López, Alicia A.

    2017-01-01

    In light of the emergence of statistical analysis techniques based on data mining in education sciences, and the potential they offer to detect non-trivial information in large databases, this paper presents a procedure used to detect factors linked to academic achievement in large-scale assessments. The study is based on a non-experimental,…

  2. Experimental study on the statistic characteristics of a 3x3 RF MIMO channel over a single conventional multimode fiber.

    PubMed

    Lei, Yi; Li, Jianqiang; Wu, Rui; Fan, Yuting; Fu, Songnian; Yin, Feifei; Dai, Yitang; Xu, Kun

    2017-06-01

    Based on the observed random fluctuation of the speckle pattern across the multimode fiber (MMF) facet and of the received optical power distribution across three output ports, we experimentally investigate the statistical characteristics of a 3×3 radio frequency multiple-input multiple-output (MIMO) channel enabled by mode division multiplexing in a conventional 50 µm MMF, using non-mode-selective three-dimensional waveguide photonic lanterns as mode multiplexer and demultiplexer. The impacts of mode coupling on the MIMO channel coefficients, channel matrix, and channel capacity are analyzed over different fiber lengths. The results indicate that spatial multiplexing benefits from greater fiber lengths with stronger mode coupling, despite the higher optical loss.

  3. Towards Principled Experimental Study of Autonomous Mobile Robots

    NASA Technical Reports Server (NTRS)

    Gat, Erann

    1995-01-01

    We review the current state of research in autonomous mobile robots and conclude that there is an inadequate basis for predicting the reliability and behavior of robots operating in unengineered environments. We present a new approach to the study of autonomous mobile robot performance based on formal statistical analysis of independently reproducible experiments conducted on real robots, in which simulators serve as models rather than experimental surrogates. We demonstrate three new results: (1) two commonly used performance metrics (time and distance) are not as well correlated as is often tacitly assumed; (2) the probability distributions of these performance metrics are exponential rather than normal; and (3) a modular, object-oriented simulation accurately predicts the behavior of the real robot in a statistically significant manner.
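
    The distributional claim lends itself to a quick check of the kind described; the sketch below (with placeholder data) correlates the two metrics and compares exponential and normal fits. Since the distribution parameters are estimated from the data, the KS p-values are only indicative:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    time = rng.exponential(scale=30.0, size=100)      # placeholder trial data
    distance = rng.exponential(scale=12.0, size=100)

    r, p = stats.pearsonr(time, distance)
    print(f"time/distance correlation r={r:.2f} (p={p:.3f})")

    for name, x in [("time", time), ("distance", distance)]:
        ks_exp = stats.kstest(x, "expon", args=(0, x.mean()))
        ks_norm = stats.kstest(x, "norm", args=(x.mean(), x.std()))
        print(name, "exponential KS p =", round(ks_exp.pvalue, 3),
              "| normal KS p =", round(ks_norm.pvalue, 3))
    ```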

  4. Counting statistics of chaotic resonances at optical frequencies: Theory and experiments

    NASA Astrophysics Data System (ADS)

    Lippolis, Domenico; Wang, Li; Xiao, Yun-Feng

    2017-07-01

    A deformed dielectric microcavity is used as an experimental platform for the analysis of the statistics of chaotic resonances, in the perspective of testing fractal Weyl laws at optical frequencies. In order to surmount the difficulties that arise from reading strongly overlapping spectra, we exploit the mixed nature of the phase space at hand, and only count the high-Q whispering-gallery modes (WGMs) directly. That enables us to draw statistical information on the more lossy chaotic resonances, coupled to the high-Q regular modes via dynamical tunneling. Three different models [classical, Random-Matrix-Theory (RMT) based, semiclassical] to interpret the experimental data are discussed. On the basis of least-squares analysis, theoretical estimates of Ehrenfest time, and independent measurements, we find that a semiclassically modified RMT-based expression best describes the experiment in all its realizations, particularly when the resonator is coupled to visible light, while RMT alone still works quite well in the infrared. In this work we reexamine and substantially extend the results of a short paper published earlier [L. Wang et al., Phys. Rev. E 93, 040201(R) (2016), 10.1103/PhysRevE.93.040201].

  5. Assessment of Current Jet Noise Prediction Capabilities

    NASA Technical Reports Server (NTRS)

    Hunter, Craig A.; Bridges, James E.; Khavaran, Abbas

    2008-01-01

    An assessment was made of the capability of jet noise prediction codes over a broad range of jet flows, with the objective of quantifying current capabilities and identifying areas requiring future research investment. Three separate codes in NASA's possession, representative of two classes of jet noise prediction codes, were evaluated: one empirical and two statistical. The empirical code is the Stone Jet Noise Module (ST2JET) contained within the ANOPP aircraft noise prediction code. It is well documented and represents the state of the art in semi-empirical acoustic prediction codes, where virtual sources are attributed to various aspects of noise generation in each jet; these sources, in combination, predict the spectral directivity of a jet plume. A total of 258 jet noise cases were examined with the ST2JET code, each run requiring only fractions of a second to complete. Two statistical jet noise prediction codes were also evaluated, JeNo v1 and Jet3D. Fewer cases were run for the statistical prediction methods because they require substantially more resources, typically a Reynolds-averaged Navier-Stokes solution of the jet, volume integration of the source statistical models over the entire plume, and a numerical solution of the governing propagation equation within the jet. In the evaluation process, substantial justification was given for the experimental datasets used. In the end, none of the current codes can predict jet noise within experimental uncertainty. The empirical code came within 2 dB on a 1/3-octave spectral basis for a wide range of flows. The statistical code Jet3D was within experimental uncertainty at broadside angles for hot supersonic jets, but errors in peak frequency and amplitude put it outside experimental uncertainty at cooler, lower-speed conditions; Jet3D also did not predict changes in directivity at downstream angles. The statistical code JeNo v1 was within experimental uncertainty in predicting noise from cold subsonic jets at all angles, but did not predict changes with heating of the jet and did not account for directivity changes at supersonic conditions. The shortcomings addressed here give direction for future work relevant to statistical-based prediction methods. A full report will be released as a chapter in a NASA publication assessing the state of the art in aircraft noise prediction.

  6. Sensitivity analysis of navy aviation readiness based sparing model

    DTIC Science & Technology

    2017-09-01

    ...variability. Figure 4 lays out the four steps of the methodology, starting in the upper left-hand... as a function of changes in key inputs. We develop NAVARM Experimental Designs (NED), a computational tool created by applying a state-of-the-art... experimental design to the NAVARM model. Statistical analysis of the resulting data identifies the most influential cost factors. Those are, in order of...

  7. Experimental evaluation of LED-based solar blind NLOS communication links.

    PubMed

    Chen, Gang; Abou-Galala, Feras; Xu, Zhengyuan; Sadler, Brian M

    2008-09-15

    Experimental results are reported demonstrating non-line of sight short-range ultraviolet communication link losses, and performance of photon counting detectors, operating in the solar blind spectrum regime. We employ light emitting diodes with divergent beams, a solar blind filter, and a wide field-of-view detector. Signal and noise statistics are characterized, and receiver performance is demonstrated. The effects of transmitter and receiver elevation angles, separation distance, and path loss are included.

  8. Cloud-based solution to identify statistically significant MS peaks differentiating sample categories.

    PubMed

    Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B

    2013-03-23

    Mass spectrometry (MS) has evolved to become the primary high-throughput tool for proteomics-based biomarker discovery. Multiple challenges in protein MS data analysis remain: managing large-scale and complex data sets; identifying and indexing MS peaks; and performing high-dimensional differential peak analysis with false discovery rate (FDR) control over the concurrent statistical tests. "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets and identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution that provides experimental biologists easy access to "cloud" computing capabilities for analyzing MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. The web application supports online uploading and analysis of large-scale MS data through a simple user interface. This bioinformatic tool will facilitate the discovery of potential protein biomarkers using MS.
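
    A minimal sketch of the differential-analysis step with FDR control, assuming peak intensity matrices for two sample categories (function names are illustrative; the portal's actual pipeline is not reproduced here):

    ```python
    import numpy as np
    from scipy.stats import mannwhitneyu
    from statsmodels.stats.multitest import multipletests

    def significant_peaks(intensity_a, intensity_b, alpha=0.05):
        """intensity_a/b: (n_samples, n_peaks) MS peak intensity matrices.
        Per-peak rank-sum tests, then Benjamini-Hochberg FDR control."""
        pvals = np.array([
            mannwhitneyu(intensity_a[:, j], intensity_b[:, j],
                         alternative="two-sided").pvalue
            for j in range(intensity_a.shape[1])
        ])
        reject, qvals, _, _ = multipletests(pvals, alpha=alpha,
                                            method="fdr_bh")
        return np.flatnonzero(reject), qvals
    ```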

  9. Modelling spruce bark beetle infestation probability

    Treesearch

    Paulius Zolubas; Jose Negron; A. Steven Munson

    2009-01-01

    A spruce bark beetle (Ips typographus L.) risk model, based on pure Norway spruce (Picea abies Karst.) stand characteristics in experimental and control plots, was developed using the classification and regression tree statistical technique under endemic pest population density. The most significant variable in spruce bark beetle...

  10. Discovery and validation of gene classifiers for endocrine-disrupting chemicals in zebrafish (Danio rerio)

    EPA Science Inventory

    Development and application of transcriptomics-based gene classifiers for ecotoxicological applications lag far behind those of human biomedical science. Many such classifiers discovered thus far lack rigorous statistical and experimental validation, with their stability and rel...

  11. Complete PMD compensation in 40-Gbit/s optical transmission system

    NASA Astrophysics Data System (ADS)

    Luo, Rui; Li, Tangjun; Wang, Muguang; Cui, Jie; Jian, Shuisheng

    2004-04-01

    In this paper, we successfully demonstrated automatic PMD compensation in 40-Gbit/s NRZ transmission for the first time. Using a PMD monitor based on the 20-GHz intensity component extracted from the received 40-Gbit/s NRZ baseband signal, we accomplished feedback control of an optical PMD compensator consisting of a polarization controller and a polarization-maintaining fiber. We also report a statistical assessment of the adaptive optical PMD compensator at 40 Gbit/s: the mitigator was experimentally tested under many PMD conditions (not limited to first order) covering Maxwellian-like PMD statistics, and the experimental results, including bit error rate measurements, compare successfully with theory, demonstrating the compensator's efficiency at 40 Gbit/s. Furthermore, this letter introduces a two-stage PMD compensator. Our experimental results show that the two-stage compensator can be used for PMD compensation in a 40-Gbit/s OTDM system over 60 km of high-PMD fiber: the first-order PMD was at most 274 ps before compensation and less than 7 ps after compensation. At the same time, the tunable FBG provides dispersion compensation.

  12. Predicting Binding Free Energy Change Caused by Point Mutations with Knowledge-Modified MM/PBSA Method.

    PubMed

    Petukh, Marharyta; Li, Minghui; Alexov, Emil

    2015-07-01

    A new methodology termed Single Amino Acid Mutation based change in Binding free Energy (SAAMBE) was developed to predict the changes in binding free energy caused by mutations. The method utilizes 3D structures of the corresponding protein-protein complexes and takes advantage of both sequence- and structure-based approaches. The method has two components: a MM/PBSA-based component and an additional set of statistical terms derived from a statistical investigation of the physico-chemical properties of protein complexes. While the approach is a rigid-body approach and does not explicitly consider plausible conformational changes caused by binding, the effect of conformational changes on electrostatics, including changes away from the binding interface, is mimicked with amino-acid-specific dielectric constants. This provides a significant improvement in SAAMBE predictions, as indicated by a better match against experimentally determined binding free energy changes for over 1300 mutations in 43 proteins. The final benchmarking showed very good agreement with experimental data (correlation coefficient 0.624), while the algorithm is fast enough to allow for large-scale calculations (on average less than a minute per mutation).

  13. Genetic programming based models in plant tissue culture: An addendum to traditional statistical approach.

    PubMed

    Mridula, Meenu R; Nair, Ashalatha S; Kumar, K Satheesh

    2018-02-01

    In this paper, we compared the efficacy of an observation-based modelling approach using a genetic algorithm with regular statistical analysis as an alternative methodology in plant research. Preliminary experimental data on in vitro rooting were used for this study, with the aim of understanding the effect of charcoal and naphthalene acetic acid (NAA) on successful rooting and of optimizing the two variables for maximum effect. Both observation-based modelling and the traditional approach identified NAA as a critical factor in rooting of the plantlets under the experimental conditions employed. Symbolic regression analysis using the software deployed here optimised the treatments studied and successfully identified the complex non-linear interaction among the variables from minimal preliminary data. The presence of charcoal in the culture medium has a significant impact on root generation by reducing basal callus mass formation. Such an approach is advantageous for establishing in vitro culture protocols, as these models have significant potential for saving time and expenditure in plant tissue culture laboratories, and further reduce the need for a specialised background.

  14. The effect of a Web-based education programme (WBEP) on disease severity, quality of life and mothers' self-efficacy in children with atopic dermatitis.

    PubMed

    Son, Hae Kyoung; Lim, Jiyoung

    2014-10-01

    To develop and evaluate the effects of a web-based education programme in early childhood for children with atopic dermatitis. The prevalence rate of atopic dermatitis is highest in early childhood. A holistic approach is urgently needed for young children with respect to disease severity, quality of life and management, particularly parental knowledge about atopic dermatitis and adherence to treatment. A quasi-experimental study design was used. A total of 40 mother-child dyads participated in the study from 1 July-30 November 2011 in Korea. All children were under 3 years of age. The programme was based on the Network-Based Instructional System Design model, which consists of five phases: analysis, design, development, implementation and evaluation. The experimental group participated in the programme for 2 weeks. Participants took part in a learning session during the first week and then conducted the practice session at home during the second week. Participant knowledge and compliance were evaluated through online quizzes and self-checklists. Statistical analyses (chi-square test and t-test) were performed using the Statistical Analysis System, Version 9.13. There was a significant improvement in disease severity, quality of life and mothers' self-efficacy in the experimental group; thus, the web-based education programme was effective. The web-based education programme as an advanced intervention may be useful in providing basic data for future atopic dermatitis-related studies. Moreover, the programme may serve as a nursing educational intervention tool for clinical nursing practices. © 2014 John Wiley & Sons Ltd.

  15. On the radiated EMI current extraction of dc transmission line based on corona current statistical measurements

    NASA Astrophysics Data System (ADS)

    Yi, Yong; Chen, Zhengying; Wang, Liming

    2018-05-01

    Corona discharge on DC transmission lines is the main source of the radiated electromagnetic interference (EMI) field in the vicinity of the lines. A joint time-frequency analysis technique is proposed to extract the radiated EMI current (excitation current) of DC corona based on statistical measurements of the corona current. A reduced-scale experimental platform was set up to measure the statistical distributions of the current waveform parameters of an aluminum conductor, steel reinforced. Based on the measured results, the peak, root-mean-square, and average values of the 0.5 MHz radiated EMI current at 9 kHz and 200 Hz bandwidths were calculated with the proposed technique and validated against the conventional excitation function method. Radio interference (RI) was calculated from the radiated EMI current, and a wire-to-plate platform was built to validate the RI computation results. The cause of the residual deviation between computations and measurements is analyzed in detail.

  16. Scene-based nonuniformity correction with reduced ghosting using a gated LMS algorithm.

    PubMed

    Hardie, Russell C; Baxley, Frank; Brys, Brandon; Hytla, Patrick

    2009-08-17

    In this paper, we present a scene-based nonuniformity correction (NUC) method using a modified adaptive least mean square (LMS) algorithm with a novel gating operation on the updates. The gating is designed to significantly reduce the ghosting artifacts produced by many scene-based NUC algorithms by halting updates when temporal variation is lacking. We define the algorithm and present a number of experimental results to demonstrate the efficacy of the proposed method in comparison to several previously published methods, including other LMS and constant-statistics based methods. The experimental results include simulated imagery and a real infrared image sequence. We show that the proposed method significantly reduces ghosting artifacts, at the cost of a slightly longer convergence time. (c) 2009 Optical Society of America
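
    A minimal sketch of a gated LMS update of this flavor, assuming a per-pixel linear correction model; the gate threshold, step size, and desired-image estimate are illustrative choices, not the paper's exact algorithm:

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def gated_lms_update(raw, prev_raw, gain, offset,
                         mu=1e-6, gate_thresh=2.0):
        """One frame of gated-LMS NUC: corrected = gain * raw + offset."""
        corrected = gain * raw + offset
        # Desired output: a local spatial average of the corrected frame,
        # as in classical LMS-based NUC schemes.
        desired = uniform_filter(corrected, size=5)
        error = desired - corrected
        # Gating: only update pixels showing enough temporal variation,
        # which is what suppresses ghosting on static scene regions.
        gate = np.abs(raw - prev_raw) > gate_thresh
        gain = gain + mu * error * raw * gate
        offset = offset + mu * error * gate
        return corrected, gain, offset
    ```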

  17. Objective research of auscultation signals in Traditional Chinese Medicine based on wavelet packet energy and support vector machine.

    PubMed

    Yan, Jianjun; Shen, Xiaojing; Wang, Yiqin; Li, Fufeng; Xia, Chunming; Guo, Rui; Chen, Chunfeng; Shen, Qingwei

    2010-01-01

    This study utilises the Wavelet Packet Transform (WPT) and the Support Vector Machine (SVM) algorithm to provide objective, quantitative analysis of auscultation signals in Traditional Chinese Medicine (TCM) diagnosis. First, Wavelet Packet Decomposition (WPD) at level 6 was employed to split the auscultation signals into finer frequency bands. Statistical analysis was then performed on the Wavelet Packet Energy (WPE) features extracted from the WPD coefficients. Furthermore, SVM-based pattern recognition was used to distinguish the statistical feature values of the sample groups. The experimental results showed high classification accuracies.
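
    A small sketch of the described feature pipeline using PyWavelets and scikit-learn (the wavelet choice, normalisation, and kernel are assumptions; signals must be long enough to support a level-6 decomposition):

    ```python
    import numpy as np
    import pywt
    from sklearn.svm import SVC

    def wavelet_packet_energy(signal, wavelet="db4", level=6):
        """Level-6 WPD, then energy per terminal sub-band (2**level bands)."""
        wp = pywt.WaveletPacket(data=signal, wavelet=wavelet,
                                mode="symmetric", maxlevel=level)
        nodes = wp.get_level(level, order="freq")
        energy = np.array([np.sum(node.data ** 2) for node in nodes])
        return energy / energy.sum()  # normalised WPE feature vector

    def train_classifier(signals, labels):
        X = np.array([wavelet_packet_energy(s) for s in signals])
        return SVC(kernel="rbf").fit(X, labels)
    ```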

  18. Application of Statistical Design for the Production of Cellulase by Trichoderma reesei Using Mango Peel

    PubMed Central

    Saravanan, P.; Muthuvelayudham, R.; Viruthagiri, T.

    2012-01-01

    Optimization of the culture medium for cellulase production using Trichoderma reesei was carried out. The optimization of cellulase production using mango peel as substrate was performed with a statistical methodology based on experimental designs. The screening of nine nutrients for their influence on cellulase production was achieved using a Plackett-Burman design. Avicel, soybean cake flour, KH2PO4, and CoCl2·6H2O were selected based on their positive influence on cellulase production. The composition of the selected components was optimized using Response Surface Methodology (RSM). The optimum conditions are as follows: Avicel, 25.30 g/L; soybean cake flour, 23.53 g/L; KH2PO4, 4.90 g/L; and CoCl2·6H2O, 0.95 g/L. These conditions were validated experimentally and revealed an enhanced cellulase activity of 7.8 IU/mL. PMID:23304453
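
    For illustration, the RSM step can be sketched as fitting a quadratic response surface over coded factor levels and solving for the stationary point; the code below does this for two factors with placeholder data (not the study's measurements):

    ```python
    import numpy as np

    def fit_quadratic_surface(x1, x2, y):
        """y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2"""
        X = np.column_stack([np.ones_like(x1), x1, x2,
                             x1 ** 2, x2 ** 2, x1 * x2])
        b, *_ = np.linalg.lstsq(X, y, rcond=None)
        return b

    def stationary_point(b):
        # Solve grad = 0:  [2*b3  b5 ] [x1]   [-b1]
        #                  [ b5  2*b4] [x2] = [-b2]
        # (check second-order terms to see if it is a maximum)
        A = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
        return np.linalg.solve(A, -b[1:3])
    ```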

  19. Data-driven sensitivity inference for Thomson scattering electron density measurement systems.

    PubMed

    Fujii, Keisuke; Yamada, Ichihiro; Hasuo, Masahiro

    2017-01-01

    We developed a method to infer the calibration parameters of multichannel measurement systems, such as channel-to-channel variations in sensitivity and noise amplitude, from experimental data. We regard such uncertainties in the calibration parameters as dependent noise. The statistical properties of the dependent noise and those of the latent functions were modeled and implemented in a Gaussian process kernel, and both sets of parameters were inferred from the data based on their statistical difference. We applied this method to the Thomson scattering electron density measurement system of the Large Helical Device plasma, which is equipped with 141 spatial channels. From 210 sets of experimental data, we evaluated the correction factor for the sensitivity and the noise amplitude of each channel. The correction factor varies by ≈10%, while the random noise amplitude is ≈2%; i.e., the measurement accuracy increases by a factor of 5 after this sensitivity correction. An improvement in the certainty of spatial derivative inference was also demonstrated.

  20. Modelling solute dispersion in periodic heterogeneous porous media: Model benchmarking against intermediate scale experiments

    NASA Astrophysics Data System (ADS)

    Majdalani, Samer; Guinot, Vincent; Delenne, Carole; Gebran, Hicham

    2018-06-01

    This paper is devoted to theoretical and experimental investigations of solute dispersion in heterogeneous porous media. Dispersion in heterogeneous porous media has been reported to be scale-dependent, a likely indication that the proposed dispersion models are incompletely formulated. A high-quality experimental data set of breakthrough curves in periodic model heterogeneous porous media is presented. In contrast with most previously published experiments, the present experiments involve numerous replicates, which allows the statistical variability of the experimental data to be accounted for. Several models are benchmarked against the data set: the Fickian-based advection-dispersion, mobile-immobile, multirate, and multiple-region advection-dispersion models, and a newly proposed transport model based on pure advection. A salient property of the latter model is that its solutions exhibit ballistic behaviour at small times while tending to Fickian behaviour at large time scales. Model performance is assessed using a novel objective function that accounts for the statistical variability of the experimental data set while putting equal emphasis on both small and large time scale behaviours. Besides being as accurate as the other models, the new purely advective model has the advantages that (i) it does not exhibit the undesirable effects associated with the usual Fickian operator (namely the infinite solute front propagation speed), and (ii) it allows dispersive transport to be simulated on every heterogeneity scale using scale-independent parameters.

  1. Nonelastic nuclear reactions and accompanying gamma radiation

    NASA Technical Reports Server (NTRS)

    Snow, R.; Rosner, H. R.; George, M. C.; Hayes, J. D.

    1971-01-01

    Several aspects of nonelastic nuclear reactions which proceed through the formation of a compound nucleus are dealt with. The full statistical model and the partial statistical model are described, and computer programs based on these models are presented along with operating instructions and input and output for sample problems. A theoretical development is presented of the expression for the reaction cross section in the hybrid case, which combines the continuum aspects of the full statistical model with the discrete-level aspects of the partial statistical model. Cross sections for level excitation and gamma production by neutron inelastic scattering from the nuclei Al-27, Fe-56, Si-28, and Pb-208 are calculated and compared with available experimental data.

  2. Neural network approaches versus statistical methods in classification of multisource remote sensing data

    NASA Technical Reports Server (NTRS)

    Benediktsson, Jon A.; Swain, Philip H.; Ersoy, Okan K.

    1990-01-01

    Neural network learning procedures and statistical classification methods are applied and compared empirically in the classification of multisource remote sensing and geographic data. Statistical multisource classification by means of a method based on Bayesian classification theory is also investigated and modified. The modifications permit control of the influence of the data sources involved in the classification process. Reliability measures are introduced to rank the quality of the data sources, and the data sources are then weighted according to these rankings in the statistical multisource classification. Four data sources are used in the experiments: Landsat MSS data and three forms of topographic data (elevation, slope, and aspect). Experimental results show that the two approaches have unique advantages and disadvantages in this classification application.

  3. A standard for test reliability in group research.

    PubMed

    Ellis, Jules L

    2013-03-01

    Many authors adhere to the rule that test reliabilities should be at least .70 or .80 in group research. This article introduces a new standard by which reliabilities can be evaluated. This standard is based on the costs or time of the experiment and of administering the test. For example, if test administration costs are 7% of the total experimental costs, the efficient value of the reliability is .93. If the actual reliability of a test is equal to this efficient reliability, the test size maximizes the statistical power of the experiment, given the costs. As a standard in experimental research, it is proposed that the reliability of the dependent variable be close to the efficient reliability. Adhering to this standard will enhance the statistical power and reduce the costs of experiments.
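
    A compact restatement of the numerical example, under the reading (consistent with the figures above, and an interpretation rather than the article's stated derivation) that the efficient reliability equals one minus the test's share of total experimental costs:

    ```latex
    c = \frac{C_{\text{test}}}{C_{\text{total}}} = 0.07
    \quad\Longrightarrow\quad
    \rho_{\text{eff}} = 1 - c = 1 - 0.07 = 0.93
    ```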

  4. Projecting Range Limits with Coupled Thermal Tolerance - Climate Change Models: An Example Based on Gray Snapper (Lutjanus griseus) along the U.S. East Coast

    PubMed Central

    Hare, Jonathan A.; Wuenschel, Mark J.; Kimball, Matthew E.

    2012-01-01

    We couple a species range limit hypothesis with the output of an ensemble of general circulation models to project the poleward range limit of gray snapper. Using laboratory-derived thermal limits and statistical downscaling from IPCC AR4 general circulation models, we project that gray snapper will shift northwards; the magnitude of this shift is dependent on the magnitude of climate change. We also evaluate the uncertainty in our projection and find that statistical uncertainty associated with the experimentally-derived thermal limits is the largest contributor (∼ 65%) to overall quantified uncertainty. This finding argues for more experimental work aimed at understanding and parameterizing the effects of climate change and variability on marine species. PMID:23284974

  5. A practical approach for the scale-up of roller compaction process.

    PubMed

    Shi, Weixian; Sprockel, Omar L

    2016-09-01

    An alternative approach to the scale-up of ribbon formation during roller compaction was investigated, requiring only one batch at the commercial scale to set the operational conditions. The scale-up of ribbon formation was based on a probability method, which was sufficient to describe the mechanism of ribbon formation at both scales. In this method, a statistical relationship between roller compaction parameters and ribbon attributes (thickness and density) was first defined by DoE using a pilot Alexanderwerk WP120 roller compactor. While milling speed was included in the design, it had no practical effect on granule properties within the study range despite its statistical significance. The statistical relationship was then adapted to a commercial Alexanderwerk WP200 roller compactor with one experimental run, which served to calibrate the statistical model parameters. The proposed transfer method was then confirmed by conducting a mapping study on the Alexanderwerk WP200 using a factorial DoE, which showed a match between the predictions and the verification experiments. The study demonstrates the applicability of the roller compaction transfer method using a statistical model from the development scale calibrated with one experimental point at the commercial scale. Copyright © 2016 Elsevier B.V. All rights reserved.
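
    The transfer idea can be sketched as fitting a pilot-scale model and re-anchoring its intercept with the single commercial-scale run; the factor names and the linear form below are illustrative assumptions, not the study's actual model:

    ```python
    import numpy as np

    def fit_pilot_model(roll_force, roll_gap, density):
        """Least-squares fit of ribbon density at pilot scale."""
        X = np.column_stack([np.ones_like(roll_force), roll_force, roll_gap])
        beta, *_ = np.linalg.lstsq(X, density, rcond=None)
        return beta

    def calibrate_to_commercial(beta, force_c, gap_c, density_c):
        """Shift the intercept so the pilot model reproduces the one
        commercial-scale calibration run."""
        predicted = beta[0] + beta[1] * force_c + beta[2] * gap_c
        beta_cal = beta.copy()
        beta_cal[0] += density_c - predicted
        return beta_cal
    ```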

  6. Modelling of Batch Lactic Acid Fermentation in the Presence of Anionic Clay

    PubMed Central

    Jinescu, Cosmin; Aruş, Vasilica Alisa; Nistor, Ileana Denisa

    2014-01-01

    Batch fermentation of milk inoculated with lactic acid bacteria was conducted in the presence of hydrotalcite-type anionic clay under static and ultrasonic conditions. An experimental study of the effect of fermentation temperature (t=38–43 °C), clay/milk ratio (R=1–7.5 g/L) and ultrasonic field (ν=0 and 35 kHz) on process dynamics was performed. A mathematical model was selected to describe the fermentation process kinetics and its parameters were estimated based on experimental data. A good agreement between the experimental and simulated results was achieved. Consequently, the model can be employed to predict the dynamics of batch lactic acid fermentation with values of the process variables in the studied ranges. A statistical analysis of the data based on a 2^3 factorial experiment was performed in order to express experimental and model-regressed process responses as functions of the t, R and ν factors. PMID:27904318

  7. Illuminating Tradespace Decisions Using Efficient Experimental Space-Filling Designs for the Engineered Resilient System Architecture

    DTIC Science & Technology

    2015-06-30

    ...Building Statistical Metamodels using Simulation Experimental Designs... ...system design drivers across several different domain models, our methodology uses statistical metamodeling to approximate the simulations' behavior... We build metamodels using a number of statistical methods that include stepwise regression, boosted trees, neural nets, and bootstrap forest...

  9. Anti-adhesive effect of poloxamer-based thermo-sensitive sol-gel in rabbit laminectomy model.

    PubMed

    Shin, Sung Joon; Lee, Jae Hyup; So, Jungwon; Min, Kyungdan

    2016-11-01

    A poloxamer-based thermo-sensitive sol-gel has been developed to reduce the incidence of postoperative scar formation at the laminectomy site. The purpose of this study was to evaluate the anti-adhesive effect of the poloxamer-based thermo-sensitive sol-gel compared to a hyaluronate-based solution after laminectomy, using a rabbit model. A thermo-sensitive anti-adhesive with sol-gel transition properties was manufactured from a physical mixture of Poloxamer 188/407, chitosan, and gelatin, and its viscosity was assessed at different temperatures. 72 adult New Zealand rabbits underwent lumbar laminectomy and were randomly divided into experimental (treated with the newly developed agent), positive control (treated with the hyaluronate-based solution), and negative control groups. Each group was subdivided into 1- and 4-week subgroups. Gross and histological evaluations were performed to assess the extent of epidural adhesion. The experimental agent showed significantly higher viscosity than the positive control agent, with a significant increase in viscosity as temperature increased. Gross evaluation showed no statistically significant differences between the 1- and 4-week subgroups, whereas histologic evaluation showed significant differences in both. Although the 4-week histologic results of the experimental and positive control subgroups showed no significant difference, both subgroups showed a higher proportion of adhesion below 50 % than the negative control subgroup. The new poloxamer-based thermo-sensitive agent showed superior efficacy over the hyaluronate-based agent at 1 week postoperatively; at 4 weeks, there were no statistically significant differences between the two agents, although both showed efficacy over the sham group.

  10. No-reference image quality assessment based on natural scene statistics and gradient magnitude similarity

    NASA Astrophysics Data System (ADS)

    Jia, Huizhen; Sun, Quansen; Ji, Zexuan; Wang, Tonghan; Chen, Qiang

    2014-11-01

    The goal of no-reference/blind image quality assessment (NR-IQA) is to devise a perceptual model that can accurately predict the quality of a distorted image in line with human opinion, in which feature extraction is an important issue. However, the features used in state-of-the-art "general purpose" NR-IQA algorithms are usually either natural scene statistics (NSS) based or perceptually relevant, which limits the performance of these models. To further improve NR-IQA performance, we propose a general-purpose NR-IQA algorithm that combines NSS-based features with perceptually relevant features, extracting features in both the spatial and gradient domains. In the spatial domain, we extract point-wise statistics of single pixel values, characterized by a generalized Gaussian distribution model, to form the underlying features. In the gradient domain, statistical features based on neighboring gradient magnitude similarity are extracted. A mapping from features to quality scores is then learned using support vector regression. Experimental results on benchmark image databases demonstrate that the proposed algorithm correlates highly with human judgments of quality and yields significant performance improvements over state-of-the-art methods.
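
    A rough sketch of the two feature families feeding a support vector regressor; the moment features, similarity constant, and horizontal-neighbor choice are simplifications of the paper's feature set, not its exact definition:

    ```python
    import numpy as np
    from scipy import ndimage, stats
    from sklearn.svm import SVR

    def gradient_magnitude(img):
        gx = ndimage.sobel(img, axis=1, output=float)
        gy = ndimage.sobel(img, axis=0, output=float)
        return np.hypot(gx, gy)

    def iqa_features(img, c=0.005):
        g = gradient_magnitude(img)
        # Similarity between horizontally neighboring gradient magnitudes.
        g1, g2 = g[:, :-1], g[:, 1:]
        gms = (2 * g1 * g2 + c) / (g1 ** 2 + g2 ** 2 + c)
        pix = img.ravel()
        # Spatial-domain moments stand in for the fitted GGD parameters.
        return np.array([pix.mean(), pix.std(),
                         stats.skew(pix), stats.kurtosis(pix),
                         gms.mean(), gms.std()])

    def train_quality_model(images, mos_scores):
        X = np.array([iqa_features(im) for im in images])
        return SVR(kernel="rbf").fit(X, mos_scores)
    ```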

  11. Optimal Experimental Design for Model Discrimination

    PubMed Central

    Myung, Jay I.; Pitt, Mark A.

    2009-01-01

    Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it possible to determine these values, and thereby identify an optimal experimental design. After describing the method, it is demonstrated in two content areas in cognitive psychology in which models are highly competitive: retention (i.e., forgetting) and categorization. The optimal design is compared with the quality of designs used in the literature. The findings demonstrate that design optimization has the potential to increase the informativeness of the experimental method. PMID:19618983

  12. Using entropy measures to characterize human locomotion.

    PubMed

    Leverick, Graham; Szturm, Tony; Wu, Christine Q

    2014-12-01

    Entropy measures have been widely used to quantify the complexity of theoretical and experimental dynamical systems. In this paper, the value of using entropy measures to characterize human locomotion is demonstrated based on their construct validity, predictive validity in a simple model of human walking and convergent validity in an experimental study. Results show that four of the five considered entropy measures increase meaningfully with the increased probability of falling in a simple passive bipedal walker model. The same four entropy measures also experienced statistically significant increases in response to increasing age and gait impairment caused by cognitive interference in an experimental study. Of the considered entropy measures, the proposed quantized dynamical entropy (QDE) and quantization-based approximation of sample entropy (QASE) offered the best combination of sensitivity to changes in gait dynamics and computational efficiency. Based on these results, entropy appears to be a viable candidate for assessing the stability of human locomotion.
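
    For reference, a compact implementation of sample entropy, a representative member of the family of measures considered (the paper's QDE and QASE variants are not reproduced here):

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r=0.2):
        """SampEn(m, r) of a 1-D series; r is scaled by the series' std.
        Returns -ln(A/B), where B and A count template matches of length
        m and m+1 within tolerance (Chebyshev distance, self-matches
        excluded). Minor implementation variants exist in the literature."""
        x = np.asarray(x, dtype=float)
        tol = r * x.std()

        def count_matches(dim):
            templates = np.array([x[i:i + dim] for i in range(len(x) - dim)])
            count = 0
            for i in range(len(templates)):
                d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                count += np.sum(d <= tol)
            return count

        b, a = count_matches(m), count_matches(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf
    ```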

  13. Statistical validation and an empirical model of hydrogen production enhancement found by utilizing passive flow disturbance in the steam-reformation process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Paul A.; Liao, Chang-hsien

    2007-11-15

    A passive flow disturbance has been proven to enhance the conversion of fuel in a methanol-steam reformer. This study presents a statistical validation of the experiment based on a standard 2^k factorial experimental design and the resulting empirical model of the enhanced hydrogen-producing process. A factorial experimental design was used to statistically analyze the effects and interactions of various input factors in the experiment. Three input factors, the number of flow disturbers, catalyst size, and reactant flow rate, were investigated for their effects on fuel conversion in the steam-reformation process. Based on the experimental results, an empirical model was developed and further evaluated with an uncertainty analysis and interior point data. (author)
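
    A 2^k factorial analysis of this kind can be sketched in a few lines: build the coded design matrix with all interactions and estimate effects by least squares. The factor labels follow the study (flow disturbers, catalyst size, reactant flow rate), but the response values are placeholders:

    ```python
    import numpy as np
    from itertools import product

    levels = np.array(list(product([-1, 1], repeat=3)))  # 2^3 coded runs
    A, B, C = levels.T                                   # main factors
    X = np.column_stack([np.ones(8), A, B, C, A*B, A*C, B*C, A*B*C])

    y = np.array([62, 65, 70, 74, 63, 66, 72, 77])       # fuel conversion, %
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    effects = 2 * beta[1:]  # for +/-1 coding, effect = 2 * coefficient
    for name, e in zip(["A", "B", "C", "AB", "AC", "BC", "ABC"], effects):
        print(f"effect {name}: {e:+.2f}")
    ```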

  14. In pursuit of a science of agriculture: the role of statistics in field experiments.

    PubMed

    Parolini, Giuditta

    2015-09-01

    Since the beginning of the twentieth century statistics has reshaped the experimental cultures of agricultural research taking part in the subtle dialectic between the epistemic and the material that is proper to experimental systems. This transformation has become especially relevant in field trials and the paper will examine the British agricultural institution, Rothamsted Experimental Station, where statistical methods nowadays popular in the planning and analysis of field experiments were developed in the 1920s. At Rothamsted statistics promoted randomisation over systematic arrangements, factorisation over one-question trials, and emphasised the importance of the experimental error in assessing field trials. These changes in methodology transformed also the material culture of agricultural science, and a new body, the Field Plots Committee, was created to manage the field research of the agricultural institution. Although successful, the vision of field experimentation proposed by the Rothamsted statisticians was not unproblematic. Experimental scientists closely linked to the farming community questioned it in favour of a field research that could be more easily understood by farmers. The clash between the two agendas reveals how the role attributed to statistics in field experimentation defined different pursuits of agricultural research, alternately conceived of as a scientists' science or as a farmers' science.

  15. Use of simulation-based learning in undergraduate nurse education: An umbrella systematic review.

    PubMed

    Cant, Robyn P; Cooper, Simon J

    2017-02-01

    To conduct a systematic review to appraise and review evidence on the impact of simulation-based education for undergraduate/pre-licensure nursing students, using existing reviews of the literature. An umbrella review (review of reviews). Cumulative Index of Nursing and Allied Health Literature (CINAHL Plus), PubMed, and Google Scholar. Reviews of literature conducted between 2010 and 2015 regarding simulation-based education for pre-licensure nursing students. The Joanna Briggs Institute methodology for the conduct of an umbrella review was used to inform the review process. Twenty-five systematic reviews of literature were included, of which 14 were recent (2013-2015). Most described the level of evidence of the component studies as a mix of experimental and quasi-experimental designs. The reviews measured around 14 different main outcome variables, limiting the number of primary studies that each individual review could pool to appraise. Many reviews agreed on the key learning outcome of knowledge acquisition, although no overall quantitative effect was derived. Three of four high-quality reviews found that simulation supported psychomotor development; the fourth found too few high-quality studies to make a statistical comparison. Simulation statistically improved self-efficacy in pretest-posttest studies, and in experimental designs self-efficacy was superior to that of other teaching methods; lower-level research designs limited further comparison. The reviews commonly reported strong student satisfaction with simulation education, and some reported improved confidence and/or critical thinking. This umbrella review took a global view of 25 reviews of simulation research in nursing education, comprising over 700 primary studies. To discern overall outcomes across reviews, statistical comparison of quantitative results (effect size) must be the key comparator. Simulation-based education contributes to students' learning in a number of ways when integrated into pre-licensure nursing curricula. Overall, the use of a constellation of instruments and a lack of high-quality study designs mean that there are still gaps in the evidence of effects that need to be addressed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. The microdopant effects of surfactant elements on structure-phase transitions during the rapid quenched crystallization of Fe-C-based melts

    NASA Astrophysics Data System (ADS)

    Polukhin, V. A.; Belyakova, R. M.; Rigmant, L. K.

    2008-02-01

    The nature of the microdopant effects of surfactant Te and H2 reagents on structure-phase transitions in rapidly quenched and crystallized eutectic Fe-C-based melts was studied by experimental and computational methods. Based on the results of statistical-geometrical analysis, new information was obtained about structural changes in multi-scale systems, from the meso- to the nano-scale.

  17. Effect of an empowerment-based nutrition promotion program on food consumption and serum lipid levels in hyperlipidemic Thai elderly.

    PubMed

    Boonyasopun, Umaporn; Aree, Patcharaporn; Avant, Kay C

    2008-06-01

    This quasi-experimental study examined the effects of an empowerment-based nutrition promotion program on food consumption and serum lipid levels among hyperlipidemic Thai elderly. Fifty-six experimental subjects received the program; 48 control subjects maintained their habitual lifestyle. The statistical methods used were the t-test, Z-test, and χ²/Fisher's exact test. After the program, consumption of diets high in saturated fat, cholesterol, and simple sugar was significantly lower in the experimental group than in the control group. The percentage change in serum total cholesterol of the experimental subjects was significantly greater than that of the control subjects, and the number of experimental subjects who changed from hyperlipidemia to normolipidemia increased significantly compared with the control subjects. The implementation of this program was related to an improvement in food consumption and serum lipid levels among hyperlipidemic Thai elderly and, therefore, has implications for practice.

  18. Data Processing System (DPS) software with experimental design, statistical analysis and data mining developed for use in entomological research.

    PubMed

    Tang, Qi-Yi; Zhang, Chuan-Xi

    2013-04-01

    A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.

  19. Comparison of Internet-based and paper-based questionnaires in Taiwan using multisample invariance approach.

    PubMed

    Yu, Sen-Chi; Yu, Min-Ning

    2007-08-01

    This study examines whether an Internet-based questionnaire is psychometrically equivalent to its paper-based counterpart. A random sample of 2,400 teachers in Taiwan was divided into experimental and control groups. The experimental group was invited to complete the electronic form of the Chinese version of the Center for Epidemiologic Studies Depression Scale (CES-D) placed on the Internet, whereas the control group was invited to complete the paper-based CES-D, which they received by mail. The multisample invariance approach, derived from structural equation modeling (SEM), was applied to analyze the collected data. The analytical results show that the two groups have equivalent factor structures in the CES-D; that is, the items of the CES-D function equivalently in the two groups. A test of the equality of latent means was then performed. The latent means of "depressed mood," "positive affect," and "interpersonal problems" in the CES-D are not significantly different between the two groups. The difference in the "somatic symptoms" latent means is statistically significant at alpha = 0.01; however, Cohen's d indicates that this difference does not correspond to a meaningful effect size in practice. Both CES-D questionnaires exhibit equal validity, reliability, and factor structures, and differ little in latent means. Therefore, the Internet-based questionnaire represents a promising alternative to the paper-based questionnaire.

  20. Grain-Size Based Additivity Models for Scaling Multi-rate Uranyl Surface Complexation in Subsurface Sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.

    This study statistically analyzed a grain-size based additivity model that has been proposed to scale reaction rates and parameters from the laboratory to the field. The additivity model assumes that reaction properties in a sediment, including surface area, reactive site concentration, reaction rate, and extent, can be predicted from the field-scale grain size distribution by linearly adding the reaction properties of the individual grain size fractions. This study focused on the statistical analysis of the additivity model with respect to reaction rate constants, using multi-rate uranyl (U(VI)) surface complexation reactions in a contaminated sediment as an example. Experimental data on rate-limited U(VI) desorption in a stirred flow-cell reactor were used to estimate the statistical properties of the multi-rate parameters for the individual grain size fractions. These statistical properties were then used to analyze the additivity model's predictions of rate-limited U(VI) desorption in the composite sediment and to evaluate the relative importance of individual grain size fractions to the overall U(VI) desorption. The results indicated that the additivity model provided a good prediction of U(VI) desorption in the composite sediment. However, the rate constants were not directly scalable using the additivity model; U(VI) desorption in the individual grain size fractions had to be simulated in order to apply it. An approximate additivity model for directly scaling the rate constants was subsequently proposed and evaluated, and was found to predict the experimental results well within statistical uncertainty. The study also found that a gravel size fraction (2–8 mm), which is often ignored in modeling U(VI) sorption and desorption, is statistically significant to U(VI) desorption in the sediment.
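
    The additivity idea itself is compact: a composite-sediment property is the mass-fraction-weighted sum of the fraction-wise properties, and rate-limited desorption is summed over fractions rather than obtained by averaging rate constants. The sketch below uses placeholder values and a simple first-order desorption form, not the study's multi-rate model:

    ```python
    import numpy as np

    mass_fraction = np.array([0.15, 0.35, 0.30, 0.20])  # per size fraction
    site_conc = np.array([4.0, 2.2, 0.9, 0.3])          # reactive sites per g

    # Linearly additive property (e.g. reactive site concentration):
    composite_sites = np.dot(mass_fraction, site_conc)

    # Rate-limited behavior: sum the fraction-wise desorbed amounts over
    # time; the approximate model for scaling the rate constants
    # themselves is not reproduced here.
    k = np.array([0.05, 0.02, 0.01, 0.004])             # 1/h, per fraction
    t = np.linspace(0, 100, 201)
    desorbed = sum(f * s * (1 - np.exp(-ki * t))
                   for f, s, ki in zip(mass_fraction, site_conc, k))
    ```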

  1. DETECTORS AND EXPERIMENTAL METHODS: Heuristic approach for peak regions estimation in gamma-ray spectra measured by a NaI detector

    NASA Astrophysics Data System (ADS)

    Zhu, Meng-Hua; Liu, Liang-Gang; You, Zhong; Xu, Ao-Ao

    2009-03-01

    In this paper, a heuristic approach based on Slavic's peak searching method is employed to estimate the width of peak regions for background removal. Synthetic and experimental data are used to test the method. With the peak regions estimated by the proposed method over the whole spectrum, we find it simple and effective enough to be used together with the Statistics-sensitive Nonlinear Iterative Peak-Clipping (SNIP) method.
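
    For context, the peak-clipping method this heuristic is paired with can be sketched as follows: a simplified SNIP-style background estimator that iteratively clips each channel to the average of its neighbours at growing window widths. The log-log transform and window schedule are common choices and vary between implementations:

    ```python
    import numpy as np

    def snip_background(spectrum, max_window=20):
        """Estimate the smooth background under the peaks of a spectrum."""
        # Compress the dynamic range (simplified LLS-style transform).
        v = np.log(np.log(np.asarray(spectrum, float) + 1.0) + 1.0)
        n = len(v)
        for m in range(1, max_window + 1):
            clipped = v.copy()
            for i in range(m, n - m):
                clipped[i] = min(v[i], 0.5 * (v[i - m] + v[i + m]))
            v = clipped
        # Invert the transform to return to count space.
        return np.exp(np.exp(v) - 1.0) - 1.0
    ```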

  2. Structure-guided statistical textural distinctiveness for salient region detection in natural images.

    PubMed

    Scharfenberger, Christian; Wong, Alexander; Clausi, David A

    2015-01-01

    We propose a simple yet effective structure-guided statistical textural distinctiveness approach to salient region detection. Our method uses a multilayer approach to analyze the structural and textural characteristics of natural images as important features for salient region detection from a scale point of view. To represent the structural characteristics, we abstract the image using structured image elements and extract rotational-invariant neighborhood-based textural representations to characterize each element by an individual texture pattern. We then learn a set of representative texture atoms for sparse texture modeling and construct a statistical textural distinctiveness matrix to determine the distinctiveness between all representative texture atom pairs in each layer. Finally, we determine saliency maps for each layer based on the occurrence probability of the texture atoms and their respective statistical textural distinctiveness and fuse them to compute a final saliency map. Experimental results using four public data sets and a variety of performance evaluation metrics show that our approach provides promising results when compared with existing salient region detection approaches.

  3. Deciphering the Landauer-Büttiker Transmission Function from Single Molecule Break Junction Experiments

    NASA Astrophysics Data System (ADS)

    Reuter, Matthew; Tschudi, Stephen

    When investigating the electrical response properties of molecules, experiments often measure conductance, whereas computation predicts transmission probabilities. Although Landauer-Büttiker theory relates the two in the limit of coherent scattering through the molecule, a direct comparison between experiment and computation can still be difficult: experimental data (specifically from break junctions) are statistical, while computational results are deterministic. Many studies compare the most probable experimental conductance with computation, but such an analysis discards almost all of the experimental statistics. In this work we develop tools to decipher the Landauer-Büttiker transmission function directly from experimental statistics, and then apply them to enable a fairer comparison between experimental and computational results.

  4. Is Statistical Learning Constrained by Lower Level Perceptual Organization?

    PubMed Central

    Emberson, Lauren L.; Liu, Ran; Zevin, Jason D.

    2013-01-01

    In order for statistical information to aid in complex developmental processes such as language acquisition, learning from higher-order statistics (e.g. across successive syllables in a speech stream to support segmentation) must be possible while perceptual abilities (e.g. speech categorization) are still developing. The current study examines how perceptual organization interacts with statistical learning. Adult participants were presented with multiple exemplars from novel, complex sound categories designed to reflect some of the spectral complexity and variability of speech. These categories were organized into sequential pairs and presented such that higher-order statistics, defined based on sound categories, could support stream segmentation. Perceptual similarity judgments and multi-dimensional scaling revealed that participants only perceived three perceptual clusters of sounds and thus did not distinguish the four experimenter-defined categories, creating a tension between lower level perceptual organization and higher-order statistical information. We examined whether the resulting pattern of learning is more consistent with statistical learning being “bottom-up,” constrained by the lower levels of organization, or “top-down,” such that higher-order statistical information of the stimulus stream takes priority over the perceptual organization, and perhaps influences perceptual organization. We consistently find evidence that learning is constrained by perceptual organization. Moreover, participants generalize their learning to novel sounds that occupy a similar perceptual space, suggesting that statistical learning occurs based on regions of or clusters in perceptual space. Overall, these results reveal a constraint on learning of sound sequences, such that statistical information is determined based on lower level organization. These findings have important implications for the role of statistical learning in language acquisition. PMID:23618755

  5. Experimental toxicology: Issues of statistics, experimental design, and replication.

    PubMed

    Briner, Wayne; Kirwan, Jeral

    2017-01-01

    The difficulty of replicating experiments has drawn considerable attention. Issues with replication occur for a variety of reasons ranging from experimental design to laboratory errors to inappropriate statistical analysis. Here we review a variety of guidelines for statistical analysis, design, and execution of experiments in toxicology. In general, replication can be improved by using hypothesis driven experiments with adequate sample sizes, randomization, and blind data collection techniques.
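
    As a concrete illustration of the "adequate sample sizes" recommendation, a minimal power calculation in Python (effect size, power, and alpha are hypothetical placeholders, not values from the review):

```python
from statsmodels.stats.power import TTestIndPower

# Per-group sample size needed to detect a medium effect (Cohen's d = 0.5)
# with 80% power at alpha = 0.05 in a two-sided two-sample t-test.
n_per_group = TTestIndPower().solve_power(effect_size=0.5, power=0.8, alpha=0.05)
print(f"~{n_per_group:.0f} animals per group")  # about 64
```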

  6. The in vitro antimicrobial activity of natural infant fluoride-free toothpastes on oral micro-organisms.

    PubMed

    Carvalho, Fabíola G; Negrini, Thais De Cássia; Sacramento, Luis Victor S; Hebling, Josimeri; Spolidorio, Denise M P; Duque, Cristiane

    2011-01-01

    The objective of this study was to evaluate the antimicrobial activity of six toothpastes for infants: 3 fluoride-free experimental toothpastes (cashew-based, mango-based, and without plant extract and fluoride) compared with 2 commercially available fluoride-free toothpastes and 1 fluoridated toothpaste. The six toothpastes evaluated were: (1) experimental cashew-based toothpaste; (2) experimental mango-based toothpaste; (3) experimental toothpaste without plant extract and fluoride (negative control); (4) First Teeth brand toothpaste; (5) Weleda brand toothpaste; and (6) Tandy brand toothpaste (positive control). The antimicrobial activity was recorded against Streptococcus mutans, Streptococcus sobrinus, Lactobacillus acidophilus, and Candida albicans using the agar plate diffusion test. First Teeth, Weleda, the mango-based toothpaste, and the toothpaste without plant extract presented no antimicrobial effect against any of the tested micro-organisms. The cashew toothpaste had antimicrobial activity against S. mutans, S. sobrinus, and L. acidophilus, but showed no antimicrobial activity against C. albicans. There was no statistically significant difference between the inhibition halos of the cashew and Tandy toothpastes against S. mutans and L. acidophilus. The cashew fluoride-free toothpaste had inhibitory activity against Streptococcus mutans and Lactobacillus acidophilus, and these results were similar to those obtained for the fluoridated toothpaste.

  7. Experimental Investigations of Non-Stationary Properties In Radiometer Receivers Using Measurements of Multiple Calibration References

    NASA Technical Reports Server (NTRS)

    Racette, Paul; Lang, Roger; Zhang, Zhao-Nan; Zacharias, David; Krebs, Carolyn A. (Technical Monitor)

    2002-01-01

    Radiometers must be periodically calibrated because the receiver response fluctuates. Many techniques exist to correct for the time varying response of a radiometer receiver. An analytical technique has been developed that uses generalized least squares regression (LSR) to predict the performance of a wide variety of calibration algorithms. The total measurement uncertainty, including the uncertainty of the calibration, can be computed using LSR. The uncertainties of the calibration samples used in the regression are based upon treating the receiver fluctuations as non-stationary processes. Signals originating from the different sources of emission are treated as simultaneously existing random processes. Thus, the radiometer output is a series of samples obtained from these random processes. The samples are treated as random variables, but because the underlying processes are non-stationary, the statistics of the samples are treated as non-stationary. The statistics of the calibration samples depend upon the time for which the samples are to be applied. The statistics of the random variables are equated to the mean statistics of the non-stationary processes over the interval defined by the time of the calibration sample and the time when it is applied. This analysis opens the opportunity for experimental investigation into the underlying properties of receiver non-stationarity through the use of multiple calibration references. In this presentation we will discuss the application of LSR to the analysis of various calibration algorithms, requirements for experimental verification of the theory, and preliminary results from analyzing experimental measurements.

  8. Statistical dielectronic recombination rates for multielectron ions in plasma

    NASA Astrophysics Data System (ADS)

    Demura, A. V.; Leont'iev, D. S.; Lisitsa, V. S.; Shurygin, V. A.

    2017-10-01

    We describe a general analytic derivation of the dielectronic recombination (DR) rate coefficient for multielectron ions in a plasma based on the statistical theory of the atom, in terms of the spatial distribution of the atomic electron density. The dielectronic recombination rates for complex multielectron tungsten ions are calculated numerically over a wide range of plasma temperatures, which is important for modern nuclear fusion studies. The results of the statistical theory are compared with data obtained using the level-by-level codes ADPAK, FAC, and HULLAC, and with experimental results. We consider different statistical DR models based on the Thomas-Fermi distribution, viz., integral and differential with respect to the orbital angular momenta of the ion core and the trapped electron, as well as the Rost model, which is an analog of the Frank-Condon model as applied to atomic structures. In view of its universality and relative simplicity, the statistical approach can be used for express estimates of the dielectronic recombination rate coefficients in complex calculations of the parameters of thermonuclear plasmas. The statistical methods also provide the dielectronic recombination rates with much smaller computational expenditure than the available level-by-level codes.

  9. Assessing effects of a semi-customized experimental cervical pillow on symptomatic adults with chronic neck pain with and without headache

    PubMed Central

    Erfanian, Parham; Tenzif, Siamak; Guerriero, Rocco C

    2004-01-01

    Objective To determine the effects of a semi-customized experimental cervical pillow on symptomatic adults with chronic neck pain (with and without headache) during a four-week study. Design A randomized controlled trial. Sample size Thirty-six adults were recruited for the trial and randomly assigned to experimental or non-experimental groups of 17 and 19 participants respectively. Subjects Adults with chronic biomechanical neck pain who were recruited from the Canadian Memorial Chiropractic College (CMCC) Walk-in Clinic. Outcome measures Subjective findings were assessed using a mail-in self-report daily pain diary and the CMCC Neck Disability Index (NDI). Statistical analysis Using repeated-measures analysis of variance, weekly NDI scores and average weekly AM and PM pain scores were compared between the experimental and non-experimental groups throughout the study. Results The experimental group had statistically significantly lower NDI scores (p < 0.05) than the non-experimental group. The average weekly AM scores were also statistically significantly lower (p < 0.05) in the experimental group. The PM scores in the experimental group were lower than those of the non-experimental group, but the difference was not statistically significant. Conclusions The study results show that, compared to conventional pillows, this experimental semi-customized cervical pillow was effective in reducing low-level neck pain intensity, especially in the morning following its use, over the 4-week study. PMID:17549216

  10. Vertical Enhancement of Second-Year Psychology Research

    ERIC Educational Resources Information Center

    Morys-Carter, Wakefield L.; Paltoglou, Aspasia E.; Davies, Emma L.

    2015-01-01

    Statistics and Research Methods modules are often unpopular with psychology students; however, at Oxford Brookes University the seminar component of the second-year research methods module tends to get very positive feedback. Over half of the seminars work towards the submission of a research-based experimental lab report. This article introduces…

  11. The impact of inquiry-based learning on the critical thinking dispositions of pre-service science teachers

    NASA Astrophysics Data System (ADS)

    Arsal, Zeki

    2017-07-01

    In this study, the impact of inquiry-based learning on pre-service teachers' critical thinking dispositions was investigated. The sample comprised 56 pre-service teachers in the science teacher education programme at a public university in northern Turkey. A quasi-experimental design with an experimental and a control group was applied to find out the impact of inquiry-based learning on the critical thinking dispositions of the pre-service teachers in the teacher education programme. The results showed that the pre-service teachers in the experimental group did not show statistically significant greater progress in terms of critical thinking dispositions than those in the control group. Teacher educators who are responsible for pedagogical courses in the teacher education programme should consider that inquiry-based learning may not be an effective method for improving pre-service teachers' critical thinking dispositions. The results are discussed in relation to their potential impact on science teacher education and implications for future research.

  12. Promoting students' conceptual understanding using STEM-based e-book

    NASA Astrophysics Data System (ADS)

    Komarudin, U.; Rustaman, N. Y.; Hasanah, L.

    2017-05-01

    This study aims to examine the effect of a Science, Technology, Engineering, and Mathematics (STEM) based e-book in promoting students' conceptual understanding of the lever system in the human body. The e-book used was published by the National Ministry of Science Education. The research was conducted as a quasi-experiment with a pretest and posttest design. The subjects consisted of two classes of 8th grade junior high school students in Pangkalpinang, Indonesia, divided into an experimental group (n=34) and a control group (n=32). The students in the experimental group were taught with the STEM-based e-book, while the control group learned with a non-STEM-based e-book. Data were collected with a pretest and posttest instrument; the pretest and posttest were scored and then analyzed using descriptive statistics and an independent t-test. The independent-samples t-test shows no significant differences in students' pretest scores between the control and experimental groups. However, there were significant differences in students' posttest scores and N-gain scores between the control and experimental groups (sig = 0.000, p < 0.005). The N-gain analysis shows higher performance of students in the experimental group (mean = 66.03) compared to the control group (mean = 47.66) in answering conceptual understanding questions. Based on the results, it can be concluded that the STEM-based e-book has a positive impact in promoting students' understanding of the lever system in the human body. Therefore, this learning approach has potential as an alternative to trigger the enhancement of students' understanding in science.
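
    For readers unfamiliar with the N-gain statistic used above, a minimal sketch of the computation (the score vectors are hypothetical; the paper reports only group-level means):

```python
import numpy as np
from scipy import stats

pre_exp = np.array([40.0, 55.0, 35.0, 50.0])    # hypothetical percent scores
post_exp = np.array([80.0, 85.0, 75.0, 82.0])
pre_ctl = np.array([42.0, 50.0, 38.0, 48.0])
post_ctl = np.array([65.0, 70.0, 60.0, 68.0])

def n_gain(pre, post):
    """Hake's normalized gain: fraction of the possible improvement achieved."""
    return (post - pre) / (100.0 - pre)

# Independent t-test on the N-gain scores, mirroring the study's analysis plan.
t, p = stats.ttest_ind(n_gain(pre_exp, post_exp), n_gain(pre_ctl, post_ctl))
print(f"t = {t:.2f}, p = {p:.4f}")
```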

  13. Cluster size statistic and cluster mass statistic: two novel methods for identifying changes in functional connectivity between groups or conditions.

    PubMed

    Ing, Alex; Schwarzbauer, Christian

    2014-01-01

    Functional connectivity has become an increasingly important area of research in recent years. At a typical spatial resolution, approximately 300 million connections link each voxel in the brain with every other. This pattern of connectivity is known as the functional connectome. Connectivity is often compared between experimental groups and conditions. Standard methods used to control the type 1 error rate are likely to be insensitive when comparisons are carried out across the whole connectome, due to the huge number of statistical tests involved. To address this problem, two new cluster based methods, the cluster size statistic (CSS) and cluster mass statistic (CMS), are introduced to control the family wise error rate across all connectivity values. These methods operate within a statistical framework similar to the cluster based methods used in conventional task based fMRI. Both methods are data driven, permutation based and require minimal statistical assumptions. Here, the performance of each procedure is evaluated in a receiver operating characteristic (ROC) analysis, utilising a simulated dataset. The relative sensitivity of each method is also tested on real data: BOLD (blood oxygen level dependent) fMRI scans were carried out on twelve subjects under normal conditions and during the hypercapnic state (induced through the inhalation of 6% CO2 in 21% O2 and 73% N2). Both CSS and CMS detected significant changes in connectivity between normal and hypercapnic states. A family wise error correction carried out at the individual connection level exhibited no significant changes in connectivity.
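
    The paper defines clusters over the connectome; as a simplified, hypothetical illustration of the cluster-mass permutation logic in one dimension (real connectivity clusters are defined by adjacency between connections, not by a linear ordering):

```python
import numpy as np

def max_cluster_mass(tvals, thresh):
    """Summed statistic of the largest run of supra-threshold values."""
    best = cur = 0.0
    for t in tvals:
        cur = cur + t if t > thresh else 0.0
        best = max(best, cur)
    return best

rng = np.random.default_rng(0)
a = rng.normal(size=(20, 300))   # group A: 20 subjects x 300 connections
b = rng.normal(size=(20, 300))   # group B

def tmap(x, y):
    return (x.mean(0) - y.mean(0)) / np.sqrt(x.var(0, ddof=1) / 20 + y.var(0, ddof=1) / 20)

observed = max_cluster_mass(tmap(a, b), 2.0)

# Permutation null: shuffle group labels, recompute the maximum cluster mass;
# comparing against this null controls the family-wise error rate.
pooled, null = np.vstack([a, b]), []
for _ in range(999):
    idx = rng.permutation(40)
    null.append(max_cluster_mass(tmap(pooled[idx[:20]], pooled[idx[20:]]), 2.0))
p = (1 + sum(m >= observed for m in null)) / (1 + len(null))
```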

  15. Fault-tolerant Greenberger-Horne-Zeilinger paradox based on non-Abelian anyons.

    PubMed

    Deng, Dong-Ling; Wu, Chunfeng; Chen, Jing-Ling; Oh, C H

    2010-08-06

    We propose a scheme to test the Greenberger-Horne-Zeilinger paradox based on braidings of non-Abelian anyons, which are exotic quasiparticle excitations of topological states of matter. Because topological ordered states are robust against local perturbations, this scheme is in some sense "fault-tolerant" and might close the detection inefficiency loophole problem in previous experimental tests of the Greenberger-Horne-Zeilinger paradox. In turn, the construction of the Greenberger-Horne-Zeilinger paradox reveals the nonlocal property of non-Abelian anyons. Our results indicate that the non-Abelian fractional statistics is a pure quantum effect and cannot be described by local realistic theories. Finally, we present a possible experimental implementation of the scheme based on the anyonic interferometry technologies.

  16. Using experimental data to test an n -body dynamical model coupled with an energy-based clusterization algorithm at low incident energies

    NASA Astrophysics Data System (ADS)

    Kumar, Rohit; Puri, Rajeev K.

    2018-03-01

    Employing the quantum molecular dynamics (QMD) approach for nucleus-nucleus collisions, we test the predictive power of an energy-based clusterization algorithm, the simulated annealing clusterization algorithm (SACA), to describe the experimental data on charge distributions and various event-by-event correlations among fragments. The calculations are constrained to the Fermi-energy domain and/or mildly excited nuclear matter. Our detailed study spans different system masses and system-mass asymmetries of the colliding partners, and shows the importance of the energy-based clusterization algorithm for understanding multifragmentation. The present calculations are also compared with other available calculations, which use one-body models, statistical models, and/or hybrid models.

  17. Experimental design and data analysis of Ago-RIP-Seq experiments for the identification of microRNA targets.

    PubMed

    Tichy, Diana; Pickl, Julia Maria Anna; Benner, Axel; Sültmann, Holger

    2017-03-31

    The identification of microRNA (miRNA) target genes is crucial for understanding miRNA function. Many methods for genome-wide miRNA target identification have been developed in recent years; however, they have several limitations, including the dependence on low-confidence prediction programs and artificial miRNA manipulations. Ago-RNA immunoprecipitation combined with high-throughput sequencing (Ago-RIP-Seq) is a promising alternative. However, appropriate statistical data analysis algorithms taking into account the experimental design and the inherent noise of such experiments are largely lacking. Here, we investigate the experimental design for Ago-RIP-Seq and examine biostatistical methods to identify de novo miRNA target genes. The statistical approaches considered are either based on a negative binomial model fit to the read count data or applied to transformed data using a normal distribution-based generalized linear model. We compare them by a real data simulation study using plasmode data sets and evaluate the suitability of the approaches to detect true miRNA targets by sensitivity and false discovery rates. Our results suggest that simple approaches like linear regression models on (appropriately) transformed read count data are preferable.
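
    A minimal sketch of the kind of "simple approach" the authors favour, i.e. an ordinary linear model on log-transformed read counts (all counts and group sizes are simulated placeholders):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
ip = rng.negative_binomial(5, 0.3, size=6)     # hypothetical IP read counts
ctrl = rng.negative_binomial(5, 0.5, size=6)   # hypothetical control counts
counts = np.concatenate([ip, ctrl])
group = np.repeat([1, 0], 6)                   # 1 = IP library, 0 = control

# Linear model on log2-transformed counts; the pseudo-count avoids log(0).
y = np.log2(counts + 1)
X = sm.add_constant(group)
fit = sm.OLS(y, X).fit()
print(fit.params[1], fit.pvalues[1])           # enrichment estimate and p-value
```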

  18. Characterizing Protease Specificity: How Many Substrates Do We Need?

    PubMed Central

    Schauperl, Michael; Fuchs, Julian E.; Waldner, Birgit J.; Huber, Roland G.; Kramer, Christian; Liedl, Klaus R.

    2015-01-01

    Calculation of cleavage entropies allows one to quantify, map, and compare protease substrate specificity by an information entropy based approach. The metric intrinsically depends on the number of experimentally determined substrates (data points). Thus a statistical analysis of its numerical stability is crucial to estimate the systematic error made by estimating specificity based on a limited number of substrates. In this contribution, we show the mathematical basis for estimating the uncertainty in cleavage entropies. Sets of cleavage entropies are calculated using experimental cleavage data and modeled extreme cases. By analyzing the underlying mathematics and applying statistical tools, a linear dependence of the metric with respect to 1/n was found. This allows us to extrapolate the values to an infinite number of samples and to estimate the errors. Analyzing the errors, a minimum number of 30 substrates was found to be necessary to characterize substrate specificity, in terms of amino acid variability, for a protease (S4-S4') with an uncertainty of 5 percent. Therefore, we encourage experimental researchers in the protease field to record specificity profiles of novel proteases aiming to identify at least 30 peptide substrates of maximum sequence diversity. We expect a full characterization of protease specificity to be helpful for rationalizing the biological functions of proteases and for assisting rational drug design. PMID:26559682
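
    The 1/n extrapolation described above can be reproduced in a few lines; the entropy values below are invented for illustration, and only the fitting logic reflects the stated procedure:

```python
import numpy as np

n = np.array([10, 15, 20, 30, 50, 80])              # substrates per subsample
H = np.array([2.10, 2.25, 2.33, 2.41, 2.47, 2.51])  # hypothetical entropies

# Linear in 1/n: H(n) = H_inf + b/n, so the intercept of a fit against 1/n
# extrapolates the cleavage entropy to an infinite substrate set.
slope, intercept = np.polyfit(1.0 / n, H, 1)
print(f"extrapolated H_inf ~ {intercept:.2f}, finite-n coefficient b = {slope:.2f}")
```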

  19. A Robust Adaptive Autonomous Approach to Optimal Experimental Design

    NASA Astrophysics Data System (ADS)

    Gu, Hairong

    Experimentation is the fundamental tool of scientific inquiry for understanding the laws governing nature and human behavior. Many complex real-world experimental scenarios, particularly in quest of prediction accuracy, encounter difficulties conducting experiments using existing experimental procedures, for the following two reasons. First, the existing experimental procedures require a parametric model to serve as the proxy of the latent data structure or data-generating mechanism at the beginning of an experiment. However, for the experimental scenarios of concern, a sound model is often unavailable before an experiment. Second, those experimental scenarios usually contain a large number of design variables, which potentially leads to a lengthy and costly data collection cycle. Moreover, the existing experimental procedures are unable to optimize large-scale experiments so as to minimize the experimental length and cost. Facing these two challenges, the aim of the present study is to develop a new experimental procedure that allows an experiment to be conducted without the assumption of a parametric model while still achieving satisfactory prediction, and that optimizes experimental designs to improve the efficiency of an experiment. The new experimental procedure developed in the present study is named the robust adaptive autonomous system (RAAS). RAAS is a procedure for sequential experiments composed of multiple experimental trials, which performs function estimation, variable selection, reverse prediction, and design optimization on each trial. Directly addressing the challenges in the experimental scenarios of concern, function estimation and variable selection are performed by data-driven modeling methods to generate a predictive model from data collected during the course of an experiment, thus exempting the requirement of a parametric model at the beginning of an experiment; design optimization selects experimental designs on the fly during an experiment based on their usefulness, so that the fewest designs are needed to reach useful inferential conclusions. Technically, function estimation is realized by Bayesian P-splines, variable selection by a Bayesian spike-and-slab prior, reverse prediction by grid search, and design optimization by the concepts of active learning. The present study demonstrated that RAAS achieves statistical robustness by making accurate predictions without the assumption of a parametric model serving as the proxy of the latent data structure, whereas the existing procedures can draw poor statistical inferences if a misspecified model is assumed; RAAS also achieves inferential efficiency by taking fewer designs to acquire useful statistical inferences than non-optimal procedures. Thus, RAAS is expected to be a principled solution for real-world experimental scenarios pursuing robust prediction and efficient experimentation.

  20. Three-dimensional holoscopic image coding scheme using high-efficiency video coding with kernel-based minimum mean-square-error estimation

    NASA Astrophysics Data System (ADS)

    Liu, Deyang; An, Ping; Ma, Ran; Yang, Chao; Shen, Liquan; Li, Kai

    2016-07-01

    Three-dimensional (3-D) holoscopic imaging, also known as integral imaging, light field imaging, or plenoptic imaging, can provide natural and fatigue-free 3-D visualization. However, a large amount of data is required to represent the 3-D holoscopic content. Therefore, efficient coding schemes for this particular type of image are needed. A 3-D holoscopic image coding scheme with kernel-based minimum mean-square-error (MMSE) estimation is proposed. In the proposed scheme, the coding block is predicted by an MMSE estimator under statistical modeling. In order to obtain the signal's statistical behavior, kernel density estimation (KDE) is utilized to estimate the probability density function of the statistical model. As bandwidth estimation (BE) is a key issue in the KDE problem, we also propose a BE method based on the kernel trick. The experimental results demonstrate that the proposed scheme achieves better rate-distortion performance and better visual rendering quality.
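
    The paper's bandwidth estimator is kernel-trick based and not part of public libraries; as a stand-in, a minimal KDE sketch using SciPy with Silverman's rule-of-thumb bandwidth shows where such an estimator would plug in (the residual data are simulated):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
residuals = rng.normal(0, 4, size=500)   # hypothetical prediction residuals

# KDE of the residual distribution; bw_method is the bandwidth choice that
# a dedicated BE method would replace.
kde = gaussian_kde(residuals, bw_method="silverman")
grid = np.linspace(-15, 15, 7)
print(kde(grid))                          # estimated density on the grid
```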

  1. Adaptive convergence nonuniformity correction algorithm.

    PubMed

    Qian, Weixian; Chen, Qian; Bai, Junqi; Gu, Guohua

    2011-01-01

    Nowadays, convergence and ghosting artifacts are common problems in scene-based nonuniformity correction (NUC) algorithms. In this study, we introduce the idea of space frequency to scene-based NUC. A convergence speed factor is then presented, which can adaptively change the convergence speed in response to changes in the scene dynamic range. In effect, the role of the convergence speed factor is to decrease the standard deviation of the statistical data. The space relativity characteristic of the nonuniformity was established from extensive experimental statistical data and was used to correct the convergence speed factor, making it more stable. Finally, real and simulated infrared image sequences were applied to demonstrate the positive effect of our algorithm.

  2. Removal of EMG and ECG artifacts from EEG based on wavelet transform and ICA.

    PubMed

    Zhou, Weidong; Gotman, Jean

    2004-01-01

    In this study, the methods of wavelet threshold de-noising and independent component analysis (ICA) are introduced. ICA is a signal processing technique based on higher-order statistics that is used to separate independent components from measurements. The extended ICA algorithm does not need to calculate the higher-order statistics explicitly, converges fast, and can be used to separate sub-Gaussian and super-Gaussian sources. A pre-whitening procedure is performed to de-correlate the mixed signals before extracting sources. The experimental results indicate that electromyogram (EMG) and electrocardiogram (ECG) artifacts in the electroencephalogram (EEG) can be removed by a combination of wavelet threshold de-noising and ICA.
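
    A minimal sketch of the combination described, using PyWavelets and scikit-learn on synthetic two-channel data (the signals, wavelet, and threshold rule are illustrative assumptions, not the paper's settings):

```python
import numpy as np
import pywt
from sklearn.decomposition import FastICA

def wavelet_denoise(x, wavelet="db4", level=4):
    """Soft-threshold the detail coefficients (universal threshold)."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise scale estimate
    thr = sigma * np.sqrt(2.0 * np.log(x.size))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: x.size]

rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0, 1024)
brain = np.sin(2 * np.pi * 10 * t)                      # stand-in EEG rhythm
heart = np.sign(np.sin(2 * np.pi * 1.2 * t))            # stand-in ECG artifact
X = np.column_stack([brain + 0.5 * heart, 0.7 * brain + heart])
X += 0.05 * rng.normal(size=X.shape)

# Denoise each channel, then unmix; artifact components in the ICA output
# can be zeroed before reconstructing the cleaned EEG.
Xd = np.apply_along_axis(wavelet_denoise, 0, X)
sources = FastICA(n_components=2, random_state=0).fit_transform(Xd)
```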

  3. Experimental comparisons of hypothesis test and moving average based combustion phase controllers.

    PubMed

    Gao, Jinwu; Wu, Yuhu; Shen, Tielong

    2016-11-01

    For engine control, combustion phase is the most effective and direct parameter for improving fuel efficiency. In this paper, a statistical control strategy based on a hypothesis-test criterion is discussed. Taking the location of peak pressure (LPP) as the combustion phase indicator, a statistical model of LPP is first proposed, and the controller design method is then discussed on the basis of both Z and T tests. For comparison, a moving-average based control strategy is also presented and implemented in this study. Experiments on a spark ignition gasoline engine at various operating conditions show that the hypothesis test based controller is able to regulate LPP close to the set point while maintaining rapid transient response, and the variance of LPP is also well constrained.
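
    The gist of the hypothesis-test controller can be sketched in a few lines (the LPP samples, set point, and significance level below are hypothetical):

```python
import numpy as np
from scipy import stats

lpp = np.array([9.1, 10.4, 8.7, 9.8, 11.0, 9.5, 10.1, 9.9])  # deg ATDC, last N cycles
target = 10.0

# Act only when the deviation from the set point is statistically significant,
# instead of chasing every noisy cycle-to-cycle fluctuation.
t_stat, p = stats.ttest_1samp(lpp, target)
if p < 0.05:
    correction = target - lpp.mean()   # feed into the spark-advance actuator
```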

  4. A Realistic Experimental Design and Statistical Analysis Project

    ERIC Educational Resources Information Center

    Muske, Kenneth R.; Myers, John A.

    2007-01-01

    A realistic applied chemical engineering experimental design and statistical analysis project is documented in this article. This project has been implemented as part of the professional development and applied statistics courses at Villanova University over the past five years. The novel aspects of this project are that the students are given a…

  5. Statistical Modeling of Single Target Cell Encapsulation

    PubMed Central

    Moon, SangJun; Ceyhan, Elvan; Gurkan, Umut Atakan; Demirci, Utkan

    2011-01-01

    High-throughput drop-on-demand systems for the separation and encapsulation of individual target cells from heterogeneous mixtures of multiple cell types are an emerging method in biotechnology with broad applications in tissue engineering and regenerative medicine, genomics, and cryobiology. However, cell encapsulation in droplets is a random process that is hard to control. Statistical models can provide an understanding of the underlying processes, enable estimation of the relevant parameters, and allow reliable and repeatable control over the encapsulation of cells in droplets during the isolation process with a high confidence level. We have modeled and experimentally verified a microdroplet-based cell encapsulation process for various combinations of cell loading and target cell concentrations. Here, we explain theoretically and validate experimentally a model to isolate and pattern single target cells from heterogeneous mixtures without using complex peripheral systems. PMID:21814548
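
    The abstract does not state its equations, but random encapsulation is commonly modelled as a Poisson process; a minimal sketch under that assumption (loading and target fraction are hypothetical):

```python
from scipy import stats

lam = 0.3        # hypothetical mean cells per droplet (loading)
f_target = 0.1   # hypothetical fraction of target cells in the mixture

# Probability that a droplet holds exactly one cell, and that the one
# encapsulated cell is of the target type.
p_one = stats.poisson.pmf(1, lam)
print(f"P(single cell) = {p_one:.3f}, P(single target) = {p_one * f_target:.4f}")
```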

  6. Predicting the stochastic guiding of kinesin-driven microtubules in microfabricated tracks: a statistical-mechanics-based modeling approach.

    PubMed

    Lin, Chih-Tin; Meyhofer, Edgar; Kurabayashi, Katsuo

    2010-01-01

    Directional control of microtubule shuttles via microfabricated tracks is key to the development of controlled nanoscale mass transport by kinesin motor molecules. Here we develop and test a model to quantitatively predict the stochastic behavior of microtubule guiding when they mechanically collide with the sidewalls of lithographically patterned tracks. By taking into account appropriate probability distributions of microscopic states of the microtubule system, the model allows us to theoretically analyze the roles of collision conditions and kinesin surface densities in determining how the motion of microtubule shuttles is controlled. In addition, we experimentally observe the statistics of microtubule collision events and compare our theoretical prediction with experimental data to validate our model. The model will direct the design of future hybrid nanotechnology devices that integrate nanoscale transport systems powered by kinesin-driven molecular shuttles.

  7. An experimental comparison of various methods of nearfield acoustic holography

    DOE PAGES

    Chelliah, Kanthasamy; Raman, Ganesh; Muehleisen, Ralph T.

    2017-05-19

    An experimental comparison of four different methods of nearfield acoustic holography (NAH) is presented in this study for planar acoustic sources. The four NAH methods considered are based on: (1) the spatial Fourier transform, (2) the equivalent sources model, (3) boundary element methods, and (4) statistically optimized NAH. Two-dimensional measurements were obtained at different distances in front of a tonal sound source, and the NAH methods were used to reconstruct the sound field at the source surface. The reconstructed particle velocity and acoustic pressure fields presented in this study showed that the equivalent sources model based algorithm, along with Tikhonov regularization, provided the best localization of the sources. Reconstruction errors were found to be smaller for the equivalent sources model based algorithm and the statistically optimized NAH algorithm. The effect of hologram distance on the performance of the various algorithms is discussed in detail, and the computational time required by each algorithm is also compared. Four different regularization parameter choice methods were compared: the L-curve method provided more accurate reconstructions than generalized cross validation and the Morozov discrepancy principle, while the performance of fixed-parameter regularization was comparable to that of the L-curve method.
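
    A minimal sketch of the Tikhonov step and the L-curve tabulation on a synthetic problem (the propagation matrix here is a random stand-in for the actual NAH operator):

```python
import numpy as np

def tikhonov(G, b, alpha):
    """Closed-form minimizer of ||G x - b||^2 + alpha^2 ||x||^2."""
    n = G.shape[1]
    return np.linalg.solve(G.conj().T @ G + alpha**2 * np.eye(n), G.conj().T @ b)

rng = np.random.default_rng(0)
G = rng.normal(size=(60, 40))                              # stand-in propagation matrix
b = G @ rng.normal(size=40) + 0.05 * rng.normal(size=60)   # noisy hologram data

# L-curve: tabulate residual norm vs. solution norm over alphas and pick the
# corner; generalized cross validation or the discrepancy principle would
# choose alpha differently.
for alpha in (1e-3, 1e-2, 1e-1, 1.0):
    x = tikhonov(G, b, alpha)
    print(alpha, np.linalg.norm(G @ x - b), np.linalg.norm(x))
```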

  9. The same analysis approach: Practical protection against the pitfalls of novel neuroimaging analysis methods.

    PubMed

    Görgen, Kai; Hebart, Martin N; Allefeld, Carsten; Haynes, John-Dylan

    2017-12-27

    Standard neuroimaging data analysis based on traditional principles of experimental design, modelling, and statistical inference is increasingly complemented by novel analysis methods, driven, e.g., by machine learning. While these novel approaches provide new insights into neuroimaging data, they often have unexpected properties, generating a growing literature on possible pitfalls. We propose to meet this challenge by adopting a habit of systematic testing of experimental design, analysis procedures, and statistical inference. Specifically, we suggest applying the analysis method used for the experimental data also to aspects of the experimental design, simulated confounds, simulated null data, and control data. We stress the importance of keeping the analysis method the same in the main and test analyses, because only in this way can possible confounds and unexpected properties be reliably detected and avoided. We describe and discuss this Same Analysis Approach in detail and demonstrate it in two worked examples using multivariate decoding. With these examples, we reveal two sources of error: a mismatch between counterbalancing (crossover designs) and cross-validation, which leads to systematic below-chance accuracies, and linear decoding of a nonlinear effect, namely a difference in variance.
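
    The null-data check at the heart of the approach is easy to sketch: run the very same decoding pipeline on data containing no signal and verify that accuracies scatter around chance (the classifier and data shapes here are arbitrary choices, not the paper's):

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 100))   # simulated null data: no signal anywhere
y = np.tile([0, 1], 20)          # balanced class labels

# Same analysis as the main decoding; systematic deviation from 0.5 on
# null data flags a confound in the design or the pipeline.
acc = cross_val_score(SVC(kernel="linear"), X, y, cv=5)
print(acc.mean())
```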

  10. Comparing the MRI-based Goutallier Classification to an experimental quantitative MR spectroscopic fat measurement of the supraspinatus muscle.

    PubMed

    Gilbert, Fabian; Böhm, Dirk; Eden, Lars; Schmalzl, Jonas; Meffert, Rainer H; Köstler, Herbert; Weng, Andreas M; Ziegler, Dirk

    2016-08-22

    The Goutallier Classification is a semi-quantitative classification system for determining the amount of fatty degeneration in rotator cuff muscles. Although initially proposed for axial computed tomography scans, it is currently applied to magnetic resonance imaging (MRI) scans. The role for its clinical use is controversial, as the reliability of the classification has been shown to be inconsistent. The purpose of this study was to compare the semi-quantitative MRI-based Goutallier Classification applied by 5 different raters to an experimental quantitative MR spectroscopic fat measurement, in order to determine the correlation between this classification system and the true extent of fatty degeneration shown by spectroscopy. MRI scans of 42 patients with rotator cuff tears were examined by 5 shoulder surgeons and were graded according to the MRI-based Goutallier Classification proposed by Fuchs et al. Additionally, the fat/water ratio was measured with MR spectroscopy using the experimental SPLASH technique. The semi-quantitative grading according to the Goutallier Classification was statistically correlated with the quantitatively measured fat/water ratio using Spearman's rank correlation. Statistical analysis of the data revealed only fair correlation between the Goutallier Classification system and the quantitative fat/water ratio, with R = 0.35 (p < 0.05). After dichotomizing the scale, the correlation was 0.72. The interobserver and intraobserver reliabilities were substantial, with R = 0.62 and R = 0.74 (p < 0.01). The correlation between the semi-quantitative MRI-based Goutallier Classification system and MR spectroscopic fat measurement is weak. As an adequate estimation of fatty degeneration based on standard MRI may not be possible, quantitative methods need to be considered in order to increase diagnostic safety and thus provide patients with ideal care with regard to the amount of fatty degeneration. Spectroscopic MR measurement may increase the accuracy of the Goutallier classification and thus improve the prediction of clinical results after rotator cuff repair. However, these techniques are currently only available in an experimental setting.

  11. Effects of Topic Simvastatin for the Treatment of Chronic Vascular Cutaneous Ulcers: A Pilot Study.

    PubMed

    Raposio, Edoardo; Libondi, Guido; Bertozzi, Nicolò; Grignaffini, Eugenio; Grieco, Michele P

    2015-12-01

    Recent research suggests that statins might be useful in the process of wound healing, playing a positive immune-modulatory role, improving microvascular function, and reducing oxidative stress. The aim of this pilot study was to evaluate the efficacy of topical application of a simvastatin-based cream in the treatment of chronic vascular cutaneous ulcers, comparing this type of treatment to a collagen-based dressing proven to be effective for ulcer treatment. A total of 20 ulcers were studied in 2 groups of randomly chosen patients over a period of one month. In the first group a 0.5% simvastatin-based cream was topically administered, while the second group (control) was treated with an absorbable type I bovine collagen-based medication. Each week, wound healing progress was observed in both groups and the ulcers were photographed. The wound healing rate was calculated by considering the absolute change in area and by the formula healing ratio (%) = [(Area_0 - Area_t4) / Area_0] × 100, both sets of data being related to the days comprised in the study in order to calculate the healing rate per day. Statistical analysis was performed by Student's t-test, with the study endpoint being the time-course change of ulcer areas. At the end of the study, when considering the absolute change in area, the experimental group appeared to heal better and faster than the control group, although the differences between the groups were not statistically significant. Conversely, the rates of wound healing in the experimental and control groups were 46.88% and 64% respectively, a statistically significant difference (P < 0.05). In conclusion, topical application of a simvastatin-based cream proved to be well-tolerated but not effective in the management of vascular leg ulcers over a 4-week period.
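
    For clarity, the healing-ratio formula evaluated with hypothetical areas:

```python
area_0, area_t4 = 6.2, 3.5   # hypothetical ulcer areas (cm^2) at weeks 0 and 4
healing_ratio = (area_0 - area_t4) / area_0 * 100   # about 43.5 %
rate_per_day = healing_ratio / 28                   # percent per day over 4 weeks
print(f"healing ratio = {healing_ratio:.1f}%, {rate_per_day:.2f}%/day")
```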

  12. Unified risk analysis of fatigue failure in ductile alloy components during all three stages of fatigue crack evolution process.

    PubMed

    Patankar, Ravindra

    2003-10-01

    The statistical fatigue life of a ductile alloy specimen is traditionally divided into three stages: crack nucleation, small crack growth, and large crack growth. Crack nucleation and small crack growth show a wide variation and hence a big spread on the cycles versus crack length graph; large crack growth shows relatively less variation. Therefore, different models are fitted to the different stages of the fatigue evolution process, thus treating different stages as different phenomena. With these independent models, it is impossible to predict one phenomenon based on information available about the other. Experimentally, it is easier to carry out crack length measurements of large cracks than of nucleating and small cracks. Thus, it is easier to collect statistical data for large crack growth compared to the painstaking effort it would take to collect statistical data for crack nucleation and small crack growth. This article presents a fracture mechanics-based stochastic model of fatigue crack growth in ductile alloys that are commonly encountered in mechanical structures and machine components. The model has been validated by Ray (1998) for crack propagation against various statistical fatigue data. Based on the model, this article proposes a technique to predict statistical information on fatigue crack nucleation and small crack growth properties using the statistical properties of large crack growth under constant amplitude stress excitation, which can be obtained via experiments.

  13. The landscape of W± and Z bosons produced in pp collisions up to LHC energies

    NASA Astrophysics Data System (ADS)

    Basso, Eduardo; Bourrely, Claude; Pasechnik, Roman; Soffer, Jacques

    2017-10-01

    We consider a selection of recent experimental results on electroweak W±, Z gauge boson production in pp collisions at BNL RHIC and CERN LHC energies in comparison to predictions of perturbative QCD calculations based on different sets of NLO parton distribution functions, including the statistical PDF model known from fits to DIS data. We show that the current statistical PDF parametrization (fitted to DIS data only) underestimates the LHC data on W±, Z gauge boson production cross sections at NLO by about 20%. This suggests a need to refit the parameters of the statistical PDF including the latest LHC data.

  14. Research on Time Selection of Mass Sports in Tibetan Areas Plateau of Gansu Province Based on Environmental Science

    NASA Astrophysics Data System (ADS)

    Gao, Jike

    2018-01-01

    Using the methods of literature review, instrument measurement, questionnaires, and mathematical statistics, this paper analyzed the current situation of mass sports in the Tibetan plateau areas of Gansu Province. Taking experimentally measured air pollutant and meteorological index data from the Tibetan areas of Gansu Province as the foundation, and checking these data against the relevant national standards and exercise science, the statistical analysis is intended to provide scientific methods and appropriate times for people of the Gansu Tibetan plateau to participate in physical exercise.

  15. A computational visual saliency model based on statistics and machine learning.

    PubMed

    Lin, Ru-Je; Lin, Wei-Song

    2014-08-01

    Identifying the type of stimuli that attracts human visual attention has been an appealing topic for scientists for many years. In particular, marking the salient regions in images is useful for both psychologists and many computer vision applications. In this paper, we propose a computational approach for producing saliency maps using statistics and machine learning methods. Based on four assumptions, three properties (Feature-Prior, Position-Prior, and Feature-Distribution) can be derived and combined by a simple intersection operation to obtain a saliency map. These properties are implemented by a similarity computation, support vector regression (SVR) technique, statistical analysis of training samples, and information theory using low-level features. This technique is able to learn the preferences of human visual behavior while simultaneously considering feature uniqueness. Experimental results show that our approach performs better in predicting human visual attention regions than 12 other models in two test databases.

  16. A statistically derived index for classifying East Coast fever reactions in cattle challenged with Theileria parva under experimental conditions.

    PubMed

    Rowlands, G J; Musoke, A J; Morzaria, S P; Nagda, S M; Ballingall, K T; McKeever, D J

    2000-04-01

    A statistically derived disease reaction index based on parasitological, clinical, and haematological measurements observed in 309 Boran cattle aged 5 to 8 months following laboratory challenge with Theileria parva is described. Principal component analysis was applied to 13 measures, including the first appearance of schizonts, first appearance of piroplasms, and first occurrence of pyrexia, together with the duration and severity of these symptoms, and white blood cell count. The first principal component, which was based on approximately equal contributions of the 13 variables, provided the definition for the disease reaction index, defined on a scale of 0-10. As well as providing a more objective measure of the severity of the reaction, the continuous nature of the index score enables more powerful statistical analysis of the data than was previously possible with the clinically derived categories of non-, mild, moderate, and severe reactions.
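
    A minimal sketch of deriving such an index, assuming standardized measurements and min-max rescaling of the first principal component (the data matrix is random, standing in for the 13 measured variables):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(309, 13))   # stand-in: 309 animals x 13 measurements

# First principal component of the standardized data, rescaled to 0-10.
scores = PCA(n_components=1).fit_transform(StandardScaler().fit_transform(X)).ravel()
index = 10 * (scores - scores.min()) / (scores.max() - scores.min())
```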

  17. An Improved Rank Correlation Effect Size Statistic for Single-Case Designs: Baseline Corrected Tau.

    PubMed

    Tarlow, Kevin R

    2017-07-01

    Measuring treatment effects when an individual's pretreatment performance is improving poses a challenge for single-case experimental designs. It may be difficult to determine whether improvement is due to the treatment or to a preexisting baseline trend. Tau-U is a popular single-case effect size statistic that purports to control for baseline trend. However, despite its strengths, Tau-U has substantial limitations: its values are inflated and not bound between -1 and +1, it cannot be visually graphed, and its relatively weak method of trend control leads to unacceptable levels of Type I error wherein ineffective treatments appear effective. An improved effect size statistic based on rank correlation and robust regression, Baseline Corrected Tau, is proposed and field-tested with both published and simulated single-case time series. A web-based calculator for Baseline Corrected Tau is also introduced for use by single-case investigators.
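
    A plausible reading of the procedure, combining a robust (Theil-Sen) baseline trend fit with Kendall's tau on the detrended series (the data are invented; consult the paper or its web calculator for the authoritative algorithm):

```python
import numpy as np
from scipy import stats

baseline = np.array([3.0, 4.0, 4.0, 5.0, 6.0])     # hypothetical phase-A data
treatment = np.array([8.0, 9.0, 9.0, 10.0, 11.0])  # hypothetical phase-B data
y = np.concatenate([baseline, treatment])
time = np.arange(y.size)
phase = np.repeat([0, 1], 5)

# 1) Robust trend fit to the baseline only; 2) detrend the whole series;
# 3) Kendall's tau between phase and the detrended data is the effect size.
slope, intercept, _, _ = stats.theilslopes(baseline, time[:5])
corrected = y - (intercept + slope * time)
tau, p = stats.kendalltau(phase, corrected)
```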

  18. Effects of a work-based critical reflection program for novice nurses.

    PubMed

    Kim, Yeon Hee; Min, Ja; Kim, Soon Hee; Shin, Sujin

    2018-02-27

    Critical reflection is effective in improving students' communication abilities and confidence. The aim of this study was to evaluate the effectiveness of a work-based critical reflection program for enhancing novice nurses' clinical critical-thinking abilities, communication competency, and job performance. The present study used a quasi-experimental design. From October 2014 to August 2015, we collected data from 44 novice nurses working in an advanced general hospital in S city in Korea. Nurses in the experimental group participated in a critical reflection program for six months. The outcome variables were clinical critical-thinking skills, communication abilities, and job performance. A non-parametric Mann-Whitney U-test and a Wilcoxon rank sum test were selected to evaluate differences in mean ranks and to assess the null hypothesis that the medians were equal across the groups. The results showed that the clinical critical-thinking skills of those in the experimental group improved significantly (p = 0.003), and the difference in mean ranks of communication ability between the two groups was statistically significant (p = 0.028). Job performance improved significantly in both the experimental group and the control group, so there was no statistically significant difference between groups (p = 0.294). We therefore suggest that a critical reflection program be considered an essential tool for improving critical thinking and communication abilities among novice nurses, who need to adapt to the clinical environment as quickly as possible. Further, we suggest conducting research into critical reflection programs among larger and more diverse samples.

  19. Digital Image Analysis of Yeast Single Cells Growing in Two Different Oxygen Concentrations to Analyze the Population Growth and to Assist Individual-Based Modeling.

    PubMed

    Ginovart, Marta; Carbó, Rosa; Blanco, Mónica; Portell, Xavier

    2017-01-01

    Nowadays control of the growth of Saccharomyces to obtain biomass or cellular wall components is crucial for specific industrial applications. The general aim of this contribution is to deal with experimental data obtained from yeast cells and from yeast cultures to attempt the integration of the two levels of information, individual and population, to progress in the control of yeast biotechnological processes by means of the overall analysis of this set of experimental data, and to assist in the improvement of an individual-based model, namely, INDISIM-Saccha. Populations of S. cerevisiae growing in liquid batch culture, in aerobic and microaerophilic conditions, were studied. A set of digital images was taken during the population growth, and a protocol for the treatment and analyses of the images obtained was established. The piecewise linear model of Buchanan was adjusted to the temporal evolutions of the yeast populations to determine the kinetic parameters and changes of growth phases. In parallel, for all the yeast cells analyzed, values of direct morphological parameters, such as area, perimeter, major diameter, minor diameter, and derived ones, such as circularity and elongation, were obtained. Graphical and numerical methods from descriptive statistics were applied to these data to characterize the growth phases and the budding state of the yeast cells in both experimental conditions, and inferential statistical methods were used to compare the diverse groups of data achieved. Oxidative metabolism of yeast in a medium with oxygen available and low initial sugar concentration can be taken into account in order to obtain a greater number of cells or larger cells. Morphological parameters were analyzed statistically to identify which were the most useful for the discrimination of the different states, according to budding and/or growth phase, in aerobic and microaerophilic conditions. The use of the experimental data for subsequent modeling work was then discussed and compared to simulation results generated with INDISIM-Saccha, which allowed us to advance in the development of this yeast model, and illustrated the utility of data at different levels of observation and the needs and logic behind the development of a microbial individual-based model.

  1. The use of statistical tools in field testing of putative effects of genetically modified plants on nontarget organisms.

    PubMed

    Semenov, Alexander V; Elsas, Jan Dirk; Glandorf, Debora C M; Schilthuizen, Menno; Boer, Willem F

    2013-08-01

    To fulfill existing guidelines, applicants that aim to place their genetically modified (GM) insect-resistant crop plants on the market are required to provide data from field experiments that address the potential impacts of the GM plants on nontarget organisms (NTO's). Such data may be based on varied experimental designs. The recent EFSA guidance document for environmental risk assessment (2010) does not provide clear and structured suggestions that address the statistics of field trials on effects on NTO's. This review examines existing practices in GM plant field testing such as the way of randomization, replication, and pseudoreplication. Emphasis is placed on the importance of design features used for the field trials in which effects on NTO's are assessed. The importance of statistical power and the positive and negative aspects of various statistical models are discussed. Equivalence and difference testing are compared, and the importance of checking the distribution of experimental data is stressed to decide on the selection of the proper statistical model. While for continuous data (e.g., pH and temperature) classical statistical approaches - for example, analysis of variance (ANOVA) - are appropriate, for discontinuous data (counts) only generalized linear models (GLM) are shown to be efficient. There is no golden rule as to which statistical test is the most appropriate for any experimental situation. In particular, in experiments in which block designs are used and covariates play a role GLMs should be used. Generic advice is offered that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in this testing. The combination of decision trees and a checklist for field trials, which are provided, will help in the interpretation of the statistical analyses of field trials and to assess whether such analyses were correctly applied. We offer generic advice to risk assessors and applicants that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in field testing.
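
    A minimal sketch of the count-data recommendation: a Poisson GLM with a block term, of the kind the review prefers over ANOVA for NTO counts (the data frame is simulated purely for illustration):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "count": rng.poisson([4] * 20 + [6] * 20),   # NTO counts per plot
    "crop": ["GM"] * 20 + ["nonGM"] * 20,
    "block": list(range(4)) * 10,                # randomized block design
})

# GLM with Poisson family and a block covariate; ANOVA on raw counts would
# violate normality and homoscedasticity assumptions for count data.
fit = smf.glm("count ~ crop + C(block)", data=df,
              family=sm.families.Poisson()).fit()
print(fit.summary())
```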

  2. Effects of far-infrared irradiation on myofascial neck pain: a randomized, double-blind, placebo-controlled pilot study.

    PubMed

    Lai, Chien-Hung; Leung, Ting-Kai; Peng, Chih-Wei; Chang, Kwang-Hwa; Lai, Ming-Jun; Lai, Wen-Fu; Chen, Shih-Ching

    2014-02-01

    The objective of this study was to determine the relative efficacy of irradiation using a device containing a far-infrared emitting ceramic powder (cFIR) for the management of chronic myofascial neck pain compared with a control treatment. This was a randomized, double-blind, placebo-controlled pilot study. The study comprised 48 patients with chronic, myofascial neck pain. Patients were randomly assigned to the experimental group or the control (sham-treatment) group. The patients in the experimental group wore a cFIR neck device for 1 week, and the control group wore an inert neck device for 1 week. Quantitative measurements based on a visual analogue scale (VAS) scoring of pain, a sleep quality assessment, pressure-pain threshold (PPT) testing, muscle tone and compliance analysis, and skin temperature analysis were obtained. Both the experimental and control groups demonstrated significant improvement in pain scores. However, no statistically significant difference in the pain scores was observed between the experimental and control groups. Significant decreases in muscle stiffness in the upper regions of the trapezius muscles were reported in the experimental group after 1 week of treatment. Short-term treatment using the cFIR neck device partly reduced muscle stiffness. Although the differences in the VAS and PPT scores for the experimental and control groups were not statistically significant, the improvement in muscle stiffness in the experimental group warrants further investigation of the long-term effects of cFIR treatment for pain management.

  3. "Using Power Tables to Compute Statistical Power in Multilevel Experimental Designs"

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2009-01-01

    Power computations for one-level experimental designs that assume simple random samples are greatly facilitated by power tables such as those presented in Cohen's book about statistical power analysis. However, in education and the social sciences experimental designs have naturally nested structures and multilevel models are needed to compute the…
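
    The record is truncated, but its subject admits a small worked example. The sketch below computes power for a balanced two-level (cluster-randomized) design from the standard noncentral-t formulation; this is a generic textbook calculation, not a reproduction of Konstantopoulos's tables, and all parameter values are illustrative assumptions.

```python
# Hedged sketch: power for a balanced two-level cluster-randomized design.
# All parameter values below are illustrative assumptions.
import math
from scipy import stats

d = 0.5      # standardized effect size
J = 20       # clusters per treatment arm
n = 25       # individuals per cluster
rho = 0.10   # intraclass correlation
alpha = 0.05

deff = 1 + (n - 1) * rho                 # design effect
ncp = d * math.sqrt(J * n / (2 * deff))  # noncentrality parameter
df = 2 * J - 2                           # cluster-level degrees of freedom
t_crit = stats.t.ppf(1 - alpha / 2, df)
power = 1 - stats.nct.cdf(t_crit, df, ncp) + stats.nct.cdf(-t_crit, df, ncp)
print(f"power ≈ {power:.3f}")
```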

  4. Is There a Critical Distance for Fickian Transport? - a Statistical Approach to Sub-Fickian Transport Modelling in Porous Media

    NASA Astrophysics Data System (ADS)

    Most, S.; Nowak, W.; Bijeljic, B.

    2014-12-01

Transport processes in porous media are frequently simulated as particle movement. This process can be formulated as a stochastic process of particle position increments. At the pore scale, the geometry and micro-heterogeneities prohibit the commonly made assumption of independent and normally distributed increments to represent dispersion. Many recent particle methods seek to loosen this assumption. Recent experimental data suggest that the need to generalize persists, because particle increments show statistical dependency beyond linear correlation and over many time steps. The goal of this work is to better understand the validity regions of commonly made assumptions. We investigate after what transport distances we can observe: (1) a statistical dependence between increments, modelled as an order-k Markov process, that reduces to order 1 - this would be the Markovian distance for the process, where the validity of yet-unexplored non-Gaussian-but-Markovian random walks would start; (2) a bivariate statistical dependence that simplifies to a multi-Gaussian dependence based on simple linear correlation (validity of correlated PTRW); (3) a complete absence of statistical dependence (validity of classical PTRW/CTRW). The approach is to derive a statistical model for pore-scale transport from a powerful experimental data set via copula analysis. The model is formulated as a non-Gaussian, mutually dependent Markov process of higher order, which allows us to investigate the validity ranges of simpler models.

  5. Statistical Methodologies to Integrate Experimental and Computational Research

    NASA Technical Reports Server (NTRS)

    Parker, P. A.; Johnson, R. T.; Montgomery, D. C.

    2008-01-01

    Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with statistical sciences to gain deeper insights into experimental phenomenon and code development performance; supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.

  6. Measurements of experimental precision for trials with cowpea (Vigna unguiculata L. Walp.) genotypes.

    PubMed

    Teodoro, P E; Torres, F E; Santos, A D; Corrêa, A M; Nascimento, M; Barroso, L M A; Ceccon, G

    2016-05-09

    The aim of this study was to evaluate the suitability of statistics as experimental precision degree measures for trials with cowpea (Vigna unguiculata L. Walp.) genotypes. Cowpea genotype yields were evaluated in 29 trials conducted in Brazil between 2005 and 2012. The genotypes were evaluated with a randomized block design with four replications. Ten statistics that were estimated for each trial were compared using descriptive statistics, Pearson correlations, and path analysis. According to the class limits established, selective accuracy and F-test values for genotype, heritability, and the coefficient of determination adequately estimated the degree of experimental precision. Using these statistics, 86.21% of the trials had adequate experimental precision. Selective accuracy and the F-test values for genotype, heritability, and the coefficient of determination were directly related to each other, and were more suitable than the coefficient of variation and the least significant difference (by the Tukey test) to evaluate experimental precision in trials with cowpea genotypes.
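
    As a worked illustration of the statistics the study recommends, the snippet below computes selective accuracy and heritability from a genotype F value. The relation SA = sqrt(1 - 1/F) is the definition commonly used in this trial literature, but readers should confirm the exact definitions against the paper; the F value here is an assumed example.

```python
# Hedged sketch: precision statistics derived from a genotype F-test value.
# SA = sqrt(1 - 1/F) is assumed as the selective-accuracy definition; the
# F value is illustrative, not from the study.
import math

F = 4.2                               # F-test value for the genotype effect
h2 = 1 - 1 / F                        # heritability on an entry-mean basis
sa = math.sqrt(h2) if F > 1 else 0.0  # selective accuracy
print(f"selective accuracy = {sa:.2f}, heritability = {h2:.2f}")
```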

  7. A Case Study on Teaching the Topic "Experimental Unit" and How It Is Presented in Advanced Placement Statistics Textbooks

    ERIC Educational Resources Information Center

    Perrett, Jamis J.

    2012-01-01

    This article demonstrates how textbooks differ in their description of the term "experimental unit". Advanced Placement Statistics teachers and students are often limited in their statistical knowledge by the information presented in their classroom textbook. Definitions and descriptions differ among textbooks as well as among different…

  8. A statistical approach for segregating cognitive task stages from multivariate fMRI BOLD time series.

    PubMed

    Demanuele, Charmaine; Bähner, Florian; Plichta, Michael M; Kirsch, Peter; Tost, Heike; Meyer-Lindenberg, Andreas; Durstewitz, Daniel

    2015-01-01

Multivariate pattern analysis can reveal new information from neuroimaging data to illuminate human cognition and its disturbances. Here, we develop a methodological approach, based on multivariate statistical/machine learning and time series analysis, to discern cognitive processing stages from functional magnetic resonance imaging (fMRI) blood oxygenation level dependent (BOLD) time series. We apply this method to data recorded from a group of healthy adults whilst performing a virtual reality version of the delayed win-shift radial arm maze (RAM) task. This task has been frequently used to study working memory and decision making in rodents. Using linear classifiers and multivariate test statistics in conjunction with time series bootstraps, we show that different cognitive stages of the task, as defined by the experimenter, namely, the encoding/retrieval, choice, reward and delay stages, can be statistically discriminated from the BOLD time series in brain areas relevant for decision making and working memory. Discrimination of these task stages was significantly reduced during poor behavioral performance in dorsolateral prefrontal cortex (DLPFC), but not in the primary visual cortex (V1). Experimenter-defined dissection of time series into class labels based on task structure was confirmed by an unsupervised, bottom-up approach based on Hidden Markov Models. Furthermore, we show that different groupings of recorded time points into cognitive event classes can be used to test hypotheses about the specific cognitive role of a given brain region during task execution. We found that while the DLPFC strongly differentiated between task stages associated with different memory loads but not between different visual-spatial aspects, the reverse was true for V1. Our methodology illustrates how different aspects of cognitive information processing during one and the same task can be separated and attributed to specific brain regions based on information contained in multivariate patterns of voxel activity.

  9. The Effectiveness of the Harm Reduction Group Therapy Based on Bandura's Self-Efficacy Theory on Risky Behaviors of Drug-Dependent Sex Worker Women.

    PubMed

    Rabani-Bavojdan, Marjan; Rabani-Bavojdan, Mozhgan; Rajabizadeh, Ghodratollah; Kaviani, Nahid; Bahramnejad, Ali; Ghaffari, Zohreh; Shafiei-Bafti, Mehdi

    2017-07-01

The aim of this study was to investigate the effectiveness of the harm reduction group therapy based on Bandura's self-efficacy theory on risky behaviors of sex workers in Kerman, Iran. A quasi-experimental two-group design (a random selection with pre-test and post-test) was used. A risky behaviors questionnaire was used to collect the data. The sample, consisting of 56 subjects, was selected among sex workers referring to drop-in centers in Kerman and randomly allocated to experimental and control groups. The intervention was carried out during 12 sessions, and the post-test was performed one month and two weeks after the completion of the sessions. The results were analyzed statistically. With the harm reduction intervention based on Bandura's self-efficacy theory, the risky behaviors of the experimental group, including injection behavior, sexual behavior, violence, and damage to the skin, were significantly reduced from pre-test to post-test (P < 0.010). The harm reduction group therapy based on Bandura's self-efficacy theory can reduce the risky behaviors of sex workers.

  10. Leveraging the Power of Music to Improve Science Education

    ERIC Educational Resources Information Center

    Crowther, Gregory J.; McFadden, Tom; Fleming, Jean S.; Davis, Katie

    2016-01-01

    We assessed the impact of music videos with science-based lyrics on content knowledge and attitudes in a three-part experimental research study of over 1000 participants (mostly K-12 students). In Study A, 13 of 15 music videos were followed by statistically significant improvements on questions about material covered in the videos, while…

  11. Electronic contributions to the sigma(p) parameter of the Hammett equation.

    PubMed

    Domingo, Luis R; Pérez, Patricia; Contreras, Renato

    2003-07-25

    A statistical procedure to obtain the intrinsic electronic contributions to the Hammett substituent constant sigma(p) is reported. The method is based on the comparison between the experimental sigma(p) values and the electronic electrophilicity index omega evaluated for a series of 42 functional groups commonly present in organic compounds.

  12. Demonstration of Wavelet Techniques in the Spectral Analysis of Bypass Transition Data

    NASA Technical Reports Server (NTRS)

    Lewalle, Jacques; Ashpis, David E.; Sohn, Ki-Hyeon

    1997-01-01

    A number of wavelet-based techniques for the analysis of experimental data are developed and illustrated. A multiscale analysis based on the Mexican hat wavelet is demonstrated as a tool for acquiring physical and quantitative information not obtainable by standard signal analysis methods. Experimental data for the analysis came from simultaneous hot-wire velocity traces in a bypass transition of the boundary layer on a heated flat plate. A pair of traces (two components of velocity) at one location was excerpted. A number of ensemble and conditional statistics related to dominant time scales for energy and momentum transport were calculated. The analysis revealed a lack of energy-dominant time scales inside turbulent spots but identified transport-dominant scales inside spots that account for the largest part of the Reynolds stress. Momentum transport was much more intermittent than were energetic fluctuations. This work is the first step in a continuing study of the spatial evolution of these scale-related statistics, the goal being to apply the multiscale analysis results to improve the modeling of transitional and turbulent industrial flows.

  13. Statistical exchange-coupling errors and the practicality of scalable silicon donor qubits

    NASA Astrophysics Data System (ADS)

    Song, Yang; Das Sarma, S.

    2016-12-01

Recent experimental efforts have led to considerable interest in donor-based localized electron spins in Si as viable qubits for a scalable silicon quantum computer. With the use of isotopically purified 28Si and the realization of extremely long spin coherence time in single-donor electrons, the recent experimental focus is on two coupled donors with the eventual goal of a scaled-up quantum circuit. Motivated by this development, we simulate the statistical distribution of the exchange coupling J between a pair of donors under realistic donor placement straggles, and quantify the errors relative to the intended J value. With J values in a broad range of donor-pair separation (5 < |R| < 60 nm), we work out various cases systematically, for a target donor separation R0 along the [001], [110] and [111] Si crystallographic directions, with |R0| = 10, 20 or 30 nm and standard deviation σR = 1, 2, 5 or 10 nm. Our extensive theoretical results demonstrate the great challenge for a prescribed J gate even with just a donor pair, a first step for any scalable Si-donor-based quantum computer.

  14. Salient target detection based on pseudo-Wigner-Ville distribution and Rényi entropy.

    PubMed

    Xu, Yuannan; Zhao, Yuan; Jin, Chenfei; Qu, Zengfeng; Liu, Liping; Sun, Xiudong

    2010-02-15

We present what we believe to be a novel method based on the pseudo-Wigner-Ville distribution (PWVD) and Rényi entropy for salient target detection. Building on a study of the statistical properties of Rényi entropy computed via the PWVD, a residual entropy-based saliency map of an input image can be obtained. From the saliency map, target detection is completed by simple and convenient threshold segmentation. Experimental results demonstrate that the proposed method can detect targets effectively in complex ground scenes.

  15. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bihn T. Pham; Jeffrey J. Einerson

    2010-06-01

This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e., gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.
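
    Of the three techniques named, control charting is the simplest to sketch. The toy example below flags thermocouple readings that fall outside 3-sigma Shewhart limits estimated from an in-control baseline window; the readings, limits, and simulated drift are synthetic assumptions, not AGR data.

```python
# Hedged sketch of control charting: flag readings outside 3-sigma Shewhart
# limits computed from an assumed in-control baseline. Data are synthetic.
import numpy as np

rng = np.random.default_rng(8)
readings = rng.normal(1000.0, 5.0, size=200)   # daily TC temperatures (C)
readings[150:] += 25.0                         # simulated drift/failure onset

mu, sd = readings[:100].mean(), readings[:100].std()  # in-control baseline
ucl, lcl = mu + 3 * sd, mu - 3 * sd                   # control limits
alarms = np.where((readings > ucl) | (readings < lcl))[0]
print("first out-of-control sample:", alarms[0] if alarms.size else None)
```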

  16. Statistical and dynamical modeling of heavy-ion fusion-fission reactions

    NASA Astrophysics Data System (ADS)

    Eslamizadeh, H.; Razazzadeh, H.

    2018-02-01

A modified statistical model and a four-dimensional dynamical model based on Langevin equations have been used to simulate the fission process of the excited compound nuclei 207At and 216Ra produced in the fusion 19F + 188Os and 19F + 197Au reactions. The evaporation residue cross section, the fission cross section, the pre-scission neutron, proton and alpha multiplicities and the anisotropy of the fission fragments' angular distribution have been calculated for the excited compound nuclei 207At and 216Ra. In the modified statistical model the effects of spin K about the symmetry axis and temperature have been considered in calculations of the fission widths and the potential energy surfaces. It was shown that the modified statistical model can reproduce the above-mentioned experimental data by using appropriate values of the temperature coefficient of the effective potential equal to λ = 0.0180 ± 0.0055, 0.0080 ± 0.0030 MeV^-2 and the scaling factor of the fission barrier height equal to rs = 1.0015 ± 0.0025, 1.0040 ± 0.0020 for the compound nuclei 207At and 216Ra, respectively. Three collective shape coordinates plus the projection of the total spin of the compound nucleus on the symmetry axis, K, were considered in the four-dimensional dynamical model. In the dynamical calculations, dissipation was generated through the chaos-weighted wall and window friction formula. Comparison of the theoretical results with the experimental data showed that the two models make it possible to reproduce satisfactorily the above-mentioned experimental data for the excited compound nuclei 207At and 216Ra.

  17. Development and Validation of a Statistical Shape Modeling-Based Finite Element Model of the Cervical Spine Under Low-Level Multiple Direction Loading Conditions

    PubMed Central

    Bredbenner, Todd L.; Eliason, Travis D.; Francis, W. Loren; McFarland, John M.; Merkle, Andrew C.; Nicolella, Daniel P.

    2014-01-01

    Cervical spinal injuries are a significant concern in all trauma injuries. Recent military conflicts have demonstrated the substantial risk of spinal injury for the modern warfighter. Finite element models used to investigate injury mechanisms often fail to examine the effects of variation in geometry or material properties on mechanical behavior. The goals of this study were to model geometric variation for a set of cervical spines, to extend this model to a parametric finite element model, and, as a first step, to validate the parametric model against experimental data for low-loading conditions. Individual finite element models were created using cervical spine (C3–T1) computed tomography data for five male cadavers. Statistical shape modeling (SSM) was used to generate a parametric finite element model incorporating variability of spine geometry, and soft-tissue material property variation was also included. The probabilistic loading response of the parametric model was determined under flexion-extension, axial rotation, and lateral bending and validated by comparison to experimental data. Based on qualitative and quantitative comparison of the experimental loading response and model simulations, we suggest that the model performs adequately under relatively low-level loading conditions in multiple loading directions. In conclusion, SSM methods coupled with finite element analyses within a probabilistic framework, along with the ability to statistically validate the overall model performance, provide innovative and important steps toward describing the differences in vertebral morphology, spinal curvature, and variation in material properties. We suggest that these methods, with additional investigation and validation under injurious loading conditions, will lead to understanding and mitigating the risks of injury in the spine and other musculoskeletal structures. PMID:25506051

[Dilemma of null hypothesis in ecological hypothesis's experiment test].

    PubMed

    Li, Ji

    2016-06-01

Experimental testing is one of the major ways of testing ecological hypotheses, though there are many arguments over the null hypothesis. Quinn and Dunham (1983) analyzed the hypothesis deduction model of Platt (1964) and stated that there is no null hypothesis in ecology that can be strictly tested by experiments. Fisher's falsificationism and Neyman-Pearson (N-P) non-decisivity prevent the statistical null hypothesis from being strictly tested. Moreover, since the null hypothesis H0 (α=1, β=0) and the alternative hypothesis H1' (α'=1, β'=0) in ecological processes differ from those in classical physics, the ecological null hypothesis cannot be strictly tested experimentally either. These dilemmas of the null hypothesis can be relieved via reduction of the P value, careful selection of the null hypothesis, non-centralization of the non-null hypothesis, and two-tailed testing. However, statistical null hypothesis significance testing (NHST) should not be equated with the logical test of causality in ecological hypotheses. Hence, findings and conclusions of methodological studies and experimental tests based on NHST are not always logically reliable.

  19. A Primer on Bayesian Analysis for Experimental Psychopathologists

    PubMed Central

    Krypotos, Angelos-Miltiadis; Blanken, Tessa F.; Arnaudova, Inna; Matzke, Dora; Beckers, Tom

    2016-01-01

    The principal goals of experimental psychopathology (EPP) research are to offer insights into the pathogenic mechanisms of mental disorders and to provide a stable ground for the development of clinical interventions. The main message of the present article is that those goals are better served by the adoption of Bayesian statistics than by the continued use of null-hypothesis significance testing (NHST). In the first part of the article we list the main disadvantages of NHST and explain why those disadvantages limit the conclusions that can be drawn from EPP research. Next, we highlight the advantages of Bayesian statistics. To illustrate, we then pit NHST and Bayesian analysis against each other using an experimental data set from our lab. Finally, we discuss some challenges when adopting Bayesian statistics. We hope that the present article will encourage experimental psychopathologists to embrace Bayesian statistics, which could strengthen the conclusions drawn from EPP research. PMID:28748068

  20. Standardized seawater rearing of chinook salmon smolts to evaluate hatchery practices showed low statistical power

    USGS Publications Warehouse

    Palmisano, Aldo N.; Elder, N.E.

    2001-01-01

We examined, under standardized conditions, seawater survival of chinook salmon Oncorhynchus tshawytscha at the smolt stage to evaluate the experimental hatchery practices applied to their rearing. The experimental rearing practices included rearing fish at different densities; attempting to control bacterial kidney disease with broodstock segregation, erythromycin injection, and an experimental diet; rearing fish on different water sources; and freeze branding the fish. After application of experimental rearing practices in hatcheries, smolts were transported to a rearing facility for about 2-3 months of seawater rearing. Of 16 experiments, 4 yielded statistically significant differences in seawater survival. In general we found that high variability among replicates, plus the low numbers of replicates available, resulted in low statistical power. We recommend including four or five replicates and using α = 0.10 in 1-tailed tests of hatchery experiments to try to increase the statistical power to 0.80.
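
    A rough sense of the power problem the authors describe can be had with a standard two-sample power calculation. The sketch below uses statsmodels' TTestIndPower with the recommended one-tailed α = 0.10; the effect size and replicate counts are illustrative assumptions, not values from the study.

```python
# Hedged sketch: how replicate number and a one-tailed alpha of 0.10 affect
# power in a two-group comparison. Effect size is an illustrative assumption.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
for reps in (2, 3, 4, 5):
    power = analysis.power(effect_size=1.5, nobs1=reps, alpha=0.10,
                           alternative="larger")
    print(f"{reps} replicates per group: power = {power:.2f}")
```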

  1. For the Love of Statistics: Appreciating and Learning to Apply Experimental Analysis and Statistics through Computer Programming Activities

    ERIC Educational Resources Information Center

    Mascaró, Maite; Sacristán, Ana Isabel; Rufino, Marta M.

    2016-01-01

    For the past 4 years, we have been involved in a project that aims to enhance the teaching and learning of experimental analysis and statistics, of environmental and biological sciences students, through computational programming activities (using R code). In this project, through an iterative design, we have developed sequences of R-code-based…

  2. The Nonsubsampled Contourlet Transform Based Statistical Medical Image Fusion Using Generalized Gaussian Density

    PubMed Central

    Yang, Guocheng; Li, Meiling; Chen, Leiting; Yu, Jie

    2015-01-01

We propose a novel medical image fusion scheme based on the statistical dependencies between coefficients in the nonsubsampled contourlet transform (NSCT) domain, in which the probability density function of the NSCT coefficients is concisely fitted using a generalized Gaussian density (GGD), and the similarity of two subbands is computed by the Jensen-Shannon divergence of two GGDs. To preserve more useful information from the source images, new fusion rules are developed to combine subbands of different frequencies: the low-frequency subbands are fused using two activity measures based on the regional standard deviation and Shannon entropy, and the high-frequency subbands are merged via weight maps determined by the saliency values of pixels. The experimental results demonstrate that the proposed method significantly outperforms conventional NSCT-based medical image fusion approaches in both visual perception and evaluation indices. PMID:26557871
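
    Two ingredients of the scheme, GGD fitting and the Jensen-Shannon comparison of two fitted densities, can be sketched with SciPy's generalized normal distribution (gennorm). The subband coefficients below are synthetic stand-ins for NSCT coefficients, not output of an NSCT decomposition.

```python
# Hedged sketch: fit generalized Gaussian densities to two coefficient sets
# and compare them with Jensen-Shannon divergence. Data are synthetic.
import numpy as np
from scipy.stats import gennorm
from scipy.spatial.distance import jensenshannon

rng = np.random.default_rng(7)
band_a = rng.laplace(scale=1.0, size=4000)   # stand-ins for NSCT subbands
band_b = rng.normal(scale=1.2, size=4000)

beta_a, loc_a, scale_a = gennorm.fit(band_a)
beta_b, loc_b, scale_b = gennorm.fit(band_b)

x = np.linspace(-8, 8, 512)
p = gennorm.pdf(x, beta_a, loc_a, scale_a)
q = gennorm.pdf(x, beta_b, loc_b, scale_b)
jsd = jensenshannon(p, q) ** 2               # squared distance = divergence
print(f"shape params: {beta_a:.2f}, {beta_b:.2f}; JS divergence = {jsd:.4f}")
```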

  3. Tailored Algorithm for Sensitivity Enhancement of Gas Concentration Sensors Based on Tunable Laser Absorption Spectroscopy.

    PubMed

    Vargas-Rodriguez, Everardo; Guzman-Chavez, Ana Dinora; Baeza-Serrato, Roberto

    2018-06-04

In this work, a novel tailored algorithm to enhance the overall sensitivity of gas concentration sensors based on the Direct Absorption Tunable Laser Absorption Spectroscopy (DA-ATLAS) method is presented. By using this algorithm, the sensor sensitivity can be custom-designed to be quasi-constant over a much larger dynamic range compared with that obtained by typical methods based on a single statistical feature of the sensor signal output (peak amplitude, area under the curve, mean or RMS). Additionally, it is shown that with our algorithm, an optimal function can be tailored to get a quasi-linear relationship between the concentration and some specific statistical features over a wider dynamic range. In order to test the viability of our algorithm, a basic C2H2 sensor based on DA-ATLAS was implemented, and its experimental measurements support the simulated results provided by our algorithm.

  4. A statistical parts-based appearance model of inter-subject variability.

    PubMed

    Toews, Matthew; Collins, D Louis; Arbel, Tal

    2006-01-01

    In this article, we present a general statistical parts-based model for representing the appearance of an image set, applied to the problem of inter-subject MR brain image matching. In contrast with global image representations such as active appearance models, the parts-based model consists of a collection of localized image parts whose appearance, geometry and occurrence frequency are quantified statistically. The parts-based approach explicitly addresses the case where one-to-one correspondence does not exist between subjects due to anatomical differences, as parts are not expected to occur in all subjects. The model can be learned automatically, discovering structures that appear with statistical regularity in a large set of subject images, and can be robustly fit to new images, all in the presence of significant inter-subject variability. As parts are derived from generic scale-invariant features, the framework can be applied in a wide variety of image contexts, in order to study the commonality of anatomical parts or to group subjects according to the parts they share. Experimentation shows that a parts-based model can be learned from a large set of MR brain images, and used to determine parts that are common within the group of subjects. Preliminary results indicate that the model can be used to automatically identify distinctive features for inter-subject image registration despite large changes in appearance.

  5. Examining the Stationarity Assumption for Statistically Downscaled Climate Projections of Precipitation

    NASA Astrophysics Data System (ADS)

    Wootten, A.; Dixon, K. W.; Lanzante, J. R.; Mcpherson, R. A.

    2017-12-01

Empirical statistical downscaling (ESD) approaches attempt to refine global climate model (GCM) information via statistical relationships between observations and GCM simulations. The aim of such downscaling efforts is to create added-value climate projections by adding finer spatial detail and reducing biases. The results of statistical downscaling exercises are often used in impact assessments under the assumption that past performance provides an indicator of future results. Given prior research describing the danger of this assumption with regard to temperature, this study expands the perfect model experimental design from previous case studies to test the stationarity assumption with respect to precipitation. Assuming stationarity implies that the performance of ESD methods is similar between the future projections and the historical training. Case study results from four quantile-mapping based ESD methods demonstrate violations of the stationarity assumption for both the central tendency and extremes of precipitation. These violations vary geographically and seasonally. For the four ESD methods tested, the greatest challenges for downscaling of daily total precipitation projections occur in regions with limited precipitation and for extremes of precipitation along Southeast coastal regions. We conclude with a discussion of future expansion of the perfect model experimental design and the implications for improving ESD methods and providing guidance on the use of ESD techniques for impact assessments and decision-support.
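
    Since the case study concerns quantile-mapping-based ESD methods, a minimal empirical quantile mapping sketch may help fix ideas: projected values are mapped through the training-period CDFs, F_obs^{-1}(F_model(x)). The gamma-distributed series below are synthetic stand-ins for precipitation, not the study's data.

```python
# Hedged sketch of empirical quantile mapping, the family of ESD methods
# whose stationarity the study tests. All series are synthetic.
import numpy as np

rng = np.random.default_rng(0)
obs_cal = rng.gamma(shape=2.0, scale=3.0, size=5000)   # "observations"
gcm_cal = rng.gamma(shape=2.0, scale=4.5, size=5000)   # biased "GCM"
gcm_fut = rng.gamma(shape=2.0, scale=5.0, size=1000)   # projection to correct

def quantile_map(x, model_cal, obs_cal):
    """Map x through the training-period CDFs: F_obs^-1(F_model(x))."""
    # empirical CDF position of x within the model training distribution
    p = np.searchsorted(np.sort(model_cal), x) / len(model_cal)
    p = np.clip(p, 1e-6, 1 - 1e-6)
    return np.quantile(obs_cal, p)

corrected = quantile_map(gcm_fut, gcm_cal, obs_cal)
print(corrected.mean(), gcm_fut.mean(), obs_cal.mean())
```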

  6. The role of causal criteria in causal inferences: Bradford Hill's "aspects of association".

    PubMed

    Ward, Andrew C

    2009-06-17

As noted by Wesley Salmon and many others, causal concepts are ubiquitous in every branch of theoretical science, in the practical disciplines and in everyday life. In the theoretical and practical sciences especially, people often base claims about causal relations on applications of statistical methods to data. However, the source and type of data place important constraints on the choice of statistical methods as well as on the warrant attributed to the causal claims based on the use of such methods. For example, much of the data used by people interested in making causal claims come from non-experimental, observational studies in which random allocations to treatment and control groups are not present. Thus, one of the most important problems in the social and health sciences concerns making justified causal inferences using non-experimental, observational data. In this paper, I examine one method of justifying such inferences that is especially widespread in epidemiology and the health sciences generally - the use of causal criteria. I argue that while the use of causal criteria is not appropriate for either deductive or inductive inferences, they do have an important role to play in inferences to the best explanation. As such, causal criteria, exemplified by what Bradford Hill referred to as "aspects of [statistical] associations", have an indispensable part to play in the goal of making justified causal claims.

  7. The role of causal criteria in causal inferences: Bradford Hill's "aspects of association"

    PubMed Central

    Ward, Andrew C

    2009-01-01

As noted by Wesley Salmon and many others, causal concepts are ubiquitous in every branch of theoretical science, in the practical disciplines and in everyday life. In the theoretical and practical sciences especially, people often base claims about causal relations on applications of statistical methods to data. However, the source and type of data place important constraints on the choice of statistical methods as well as on the warrant attributed to the causal claims based on the use of such methods. For example, much of the data used by people interested in making causal claims come from non-experimental, observational studies in which random allocations to treatment and control groups are not present. Thus, one of the most important problems in the social and health sciences concerns making justified causal inferences using non-experimental, observational data. In this paper, I examine one method of justifying such inferences that is especially widespread in epidemiology and the health sciences generally – the use of causal criteria. I argue that while the use of causal criteria is not appropriate for either deductive or inductive inferences, they do have an important role to play in inferences to the best explanation. As such, causal criteria, exemplified by what Bradford Hill referred to as "aspects of [statistical] associations", have an indispensable part to play in the goal of making justified causal claims. PMID:19534788

  8. Proteomic Workflows for Biomarker Identification Using Mass Spectrometry — Technical and Statistical Considerations during Initial Discovery

    PubMed Central

    Orton, Dennis J.; Doucette, Alan A.

    2013-01-01

    Identification of biomarkers capable of differentiating between pathophysiological states of an individual is a laudable goal in the field of proteomics. Protein biomarker discovery generally employs high throughput sample characterization by mass spectrometry (MS), being capable of identifying and quantifying thousands of proteins per sample. While MS-based technologies have rapidly matured, the identification of truly informative biomarkers remains elusive, with only a handful of clinically applicable tests stemming from proteomic workflows. This underlying lack of progress is attributed in large part to erroneous experimental design, biased sample handling, as well as improper statistical analysis of the resulting data. This review will discuss in detail the importance of experimental design and provide some insight into the overall workflow required for biomarker identification experiments. Proper balance between the degree of biological vs. technical replication is required for confident biomarker identification. PMID:28250400

  9. Interconnect fatigue design for terrestrial photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1982-01-01

The results of a comprehensive investigation of interconnect fatigue that has led to the definition of useful reliability-design and life-prediction algorithms are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To fill this shortcoming the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter) which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous--i.e., quantitative--interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.

  10. Interconnect fatigue design for terrestrial photovoltaic modules

    NASA Astrophysics Data System (ADS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1982-03-01

The results of a comprehensive investigation of interconnect fatigue that has led to the definition of useful reliability-design and life-prediction algorithms are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To fill this shortcoming the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter) which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous--i.e., quantitative--interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.

  11. Impact of ontology evolution on functional analyses.

    PubMed

    Groß, Anika; Hartung, Michael; Prüfer, Kay; Kelso, Janet; Rahm, Erhard

    2012-10-15

    Ontologies are used in the annotation and analysis of biological data. As knowledge accumulates, ontologies and annotation undergo constant modifications to reflect this new knowledge. These modifications may influence the results of statistical applications such as functional enrichment analyses that describe experimental data in terms of ontological groupings. Here, we investigate to what degree modifications of the Gene Ontology (GO) impact these statistical analyses for both experimental and simulated data. The analysis is based on new measures for the stability of result sets and considers different ontology and annotation changes. Our results show that past changes in the GO are non-uniformly distributed over different branches of the ontology. Considering the semantic relatedness of significant categories in analysis results allows a more realistic stability assessment for functional enrichment studies. We observe that the results of term-enrichment analyses tend to be surprisingly stable despite changes in ontology and annotation.

  12. Experimental study on Statistical Damage Detection of RC Structures based on Wavelet Packet Analysis

    NASA Astrophysics Data System (ADS)

    Zhu, X. Q.; Law, S. S.; Jayawardhan, M.

    2011-07-01

A novel damage indicator based on wavelet packet transform is developed in this study for structural health monitoring. The response signal of a structure under an impact load is normalized and then decomposed into wavelet packet components. Energies of these wavelet packet components are then calculated to obtain the energy distribution. A statistical indicator is developed to describe the damage extent of the structure. This approach is applied to the test results from simply supported reinforced concrete beams in the laboratory. Cases with single damage are created by static loading, and accelerations of the structure under impact loads are analyzed. Results show that the method can be used for damage monitoring and assessment of the structure.
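
    A minimal version of the energy-distribution computation can be sketched with PyWavelets: decompose a response signal into wavelet packet components, normalize the component energies, and compare the distribution against a baseline. The chi-square-like indicator below is an assumed stand-in for the paper's statistical indicator, and the signals are synthetic.

```python
# Hedged sketch: wavelet-packet energy distribution and a simple
# relative-energy damage indicator. Requires PyWavelets; signals synthetic.
import numpy as np
import pywt

def wp_energy_distribution(signal, wavelet="db4", level=4):
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    energies = np.array([np.sum(node.data ** 2)
                         for node in wp.get_level(level, order="natural")])
    return energies / energies.sum()     # normalized energy distribution

t = np.linspace(0, 1, 1024)
baseline = np.sin(2 * np.pi * 50 * t)
damaged = np.sin(2 * np.pi * 47 * t) + 0.1 * np.sin(2 * np.pi * 180 * t)

e0 = wp_energy_distribution(baseline)
e1 = wp_energy_distribution(damaged)
indicator = np.sum((e1 - e0) ** 2 / e0)  # chi-square-like deviation measure
print(f"damage indicator = {indicator:.4f}")
```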

  13. Texture metric that predicts target detection performance

    NASA Astrophysics Data System (ADS)

    Culpepper, Joanne B.

    2015-12-01

    Two texture metrics based on gray level co-occurrence error (GLCE) are used to predict probability of detection and mean search time. The two texture metrics are local clutter metrics and are based on the statistics of GLCE probability distributions. The degree of correlation between various clutter metrics and the target detection performance of the nine military vehicles in complex natural scenes found in the Search_2 dataset are presented. Comparison is also made between four other common clutter metrics found in the literature: root sum of squares, Doyle, statistical variance, and target structure similarity. The experimental results show that the GLCE energy metric is a better predictor of target detection performance when searching for targets in natural scenes than the other clutter metrics studied.
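
    The GLCE statistics themselves are not reproduced here, but the co-occurrence machinery they build on is available in scikit-image. The sketch below computes the closely related GLCM energy feature on a synthetic image (note that graycomatrix is spelled greycomatrix in scikit-image versions before 0.19).

```python
# Hedged sketch: gray-level co-occurrence (GLCM) energy, the feature family
# underlying the paper's clutter metrics. Image is synthetic.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(2)
image = rng.integers(0, 16, size=(64, 64), dtype=np.uint8)  # 16 gray levels

glcm = graycomatrix(image, distances=[1], angles=[0, np.pi / 2],
                    levels=16, symmetric=True, normed=True)
energy = graycoprops(glcm, "energy").mean()
print(f"GLCM energy = {energy:.4f}")   # higher energy -> more uniform texture
```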

  14. Automatic identification of bullet signatures based on consecutive matching striae (CMS) criteria.

    PubMed

    Chu, Wei; Thompson, Robert M; Song, John; Vorburger, Theodore V

    2013-09-10

    The consecutive matching striae (CMS) numeric criteria for firearm and toolmark identifications have been widely accepted by forensic examiners, although there have been questions concerning its observer subjectivity and limited statistical support. In this paper, based on signal processing and extraction, a model for the automatic and objective counting of CMS is proposed. The position and shape information of the striae on the bullet land is represented by a feature profile, which is used for determining the CMS number automatically. Rapid counting of CMS number provides a basis for ballistics correlations with large databases and further statistical and probability analysis. Experimental results in this report using bullets fired from ten consecutively manufactured barrels support this developed model. Published by Elsevier Ireland Ltd.

  15. Identification of microRNAs with regulatory potential using a matched microRNA-mRNA time-course data.

    PubMed

    Jayaswal, Vivek; Lutherborrow, Mark; Ma, David D F; Hwa Yang, Yee

    2009-05-01

    Over the past decade, a class of small RNA molecules called microRNAs (miRNAs) has been shown to regulate gene expression at the post-transcription stage. While early work focused on the identification of miRNAs using a combination of experimental and computational techniques, subsequent studies have focused on identification of miRNA-target mRNA pairs as each miRNA can have hundreds of mRNA targets. The experimental validation of some miRNAs as oncogenic has provided further motivation for research in this area. In this article we propose an odds-ratio (OR) statistic for identification of regulatory miRNAs. It is based on integrative analysis of matched miRNA and mRNA time-course microarray data. The OR-statistic was used for (i) identification of miRNAs with regulatory potential, (ii) identification of miRNA-target mRNA pairs and (iii) identification of time lags between changes in miRNA expression and those of its target mRNAs. We applied the OR-statistic to a cancer data set and identified a small set of miRNAs that were negatively correlated to mRNAs. A literature survey revealed that some of the miRNAs that were predicted to be regulatory, were indeed oncogenic or tumor suppressors. Finally, some of the predicted miRNA targets have been shown to be experimentally valid.

  16. Comparative evaluation of guided tissue regeneration with use of collagen-based barrier freeze-dried dura mater allograft for mandibular class 2 furcation defects (a comparative controlled clinical study).

    PubMed

    Patel, Sandeep; Kubavat, Ajay; Ruparelia, Brijesh; Agarwal, Arvind; Panda, Anup

    2012-01-01

The aim of periodontal surgery is complete regeneration. The present study was designed to evaluate and compare clinically the soft tissue changes, in the form of probing pocket depth, gingival shrinkage, and attachment level, and the hard tissue changes, in the form of horizontal and vertical bone level, using resorbable membranes. Twelve subjects with bilateral class 2 furcation defects were selected. After initial phase one treatment, open debridement was performed in the control site while a freeze-dried dura mater allograft was used in the experimental site. Soft and hard tissue parameters were registered intrasurgically. A nine-month reentry ensured better understanding and evaluation of the final outcome of the study. Guided tissue regeneration is a predictable treatment modality for class 2 furcation defects. There was a statistically significant reduction in pocket depth compared to the control (p < 0.01). There was a statistically significant increase in periodontal attachment level within both control and experimental sites, with the experimental sites showing better results (p < 0.01). For the hard tissue parameters, a significant defect fill resulted in the experimental group, while in the control group a less significant defect fill was found in the horizontal direction and a nonsignificant defect fill in the vertical direction. The results showed statistically significant improvement in soft and hard tissue parameters and less gingival shrinkage in the experimental sites compared to the control sites. The use of FDDMA in furcation defects helps us to achieve predictable results. This cross-linked collagen membrane has better handling properties, ease of procurement, and economic viability, making it a logical material to be used in regenerative surgeries.

  17. Damages detection in cylindrical metallic specimens by means of statistical baseline models and updated daily temperature profiles

    NASA Astrophysics Data System (ADS)

    Villamizar-Mejia, Rodolfo; Mujica-Delgado, Luis-Eduardo; Ruiz-Ordóñez, Magda-Liliana; Camacho-Navarro, Jhonatan; Moreno-Beltrán, Gustavo

    2017-05-01

In previous works, damage detection in metallic specimens exposed to temperature changes has been achieved by using a statistical baseline model based on Principal Component Analysis (PCA) and the piezodiagnostics principle, taking the temperature effect into account by augmenting the baseline model or by using several baseline models according to the current temperature. In this paper a new approach is presented, where damage detection is based on a new index that combines the Q and T2 statistical indices with current temperature measurements. Experimental tests were performed on a carbon-steel pipe of 1 m length and 1.5 inches diameter, instrumented with piezodevices acting as actuators or sensors. A PCA baseline model was obtained at a temperature of 21 °C, and the T2 and Q statistical indices were then obtained for a 24 h temperature profile. Mass added at different points of the pipe between sensor and actuator was used as damage. By using the combined index, the temperature contribution can be separated, and a better differentiation of damaged from undamaged cases can be obtained graphically.
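
    The Q and T2 indices that the combined index builds on can be sketched from a PCA baseline model: T2 measures deviation inside the retained principal subspace, and Q (the squared prediction error) measures the residual outside it. The baseline data below are synthetic, and the paper's temperature-combined index itself is not reproduced.

```python
# Hedged sketch: Hotelling's T^2 and Q (SPE) indices from a PCA baseline.
# Baseline matrix X is synthetic; rows are healthy-state feature vectors.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))              # baseline observations
mu, sigma = X.mean(0), X.std(0)
Xs = (X - mu) / sigma

# PCA via SVD, retaining k components
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
k = 3
P = Vt[:k].T                                # loadings
lam = (s[:k] ** 2) / (len(X) - 1)           # retained eigenvalues

def t2_q(x):
    xs = (x - mu) / sigma
    scores = xs @ P
    t2 = np.sum(scores ** 2 / lam)          # T^2 inside the model space
    resid = xs - scores @ P.T
    q = np.sum(resid ** 2)                  # Q index: residual (SPE)
    return t2, q

print(t2_q(rng.normal(size=10)))            # healthy-like sample
print(t2_q(rng.normal(loc=3.0, size=10)))   # shifted sample, larger indices
```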

  18. Comparison of algebraic and analytical approaches to the formulation of the statistical model-based reconstruction problem for X-ray computed tomography.

    PubMed

    Cierniak, Robert; Lorent, Anna

    2016-09-01

The main aim of this paper is to investigate properties of our originally formulated statistical model-based iterative approach to the problem of image reconstruction from projections, in particular properties related to its conditioning, and in this manner to demonstrate the superiority of this approach over those recently used by other authors. The reconstruction algorithm based on this conception uses maximum likelihood estimation with an objective adjusted to the probability distribution of measured signals obtained from an X-ray computed tomography system with parallel beam geometry. The analysis and experimental results presented here show that our analytical approach outperforms the referential algebraic methodology which is explored widely in the literature and exploited in various commercial implementations. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Measurement of optical intensity fluctuation over an 11.8 km turbulent path.

    PubMed

    Jiang, Yijun; Ma, Jing; Tan, Liying; Yu, Siyuan; Du, Wenhe

    2008-05-12

An 11.8 km optical link is established to examine the intensity fluctuation of a laser beam transmitted through atmospheric turbulence. The probability density function, fade statistics, and high-frequency spectrum are investigated based on the analysis of experimental data collected in each season of a year, covering both weak and strong fluctuation cases. Finally, the daily variation curve of the scintillation index is given and compared with the variation of the refractive-index structure parameter Cn^2, which is calculated from the experimental data on angle of arrival. This work provides experimental results that are helpful to atmospheric propagation research and free-space optical communication system design.
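
    The scintillation index reported here has a simple sample estimate, SI = <I^2>/<I>^2 - 1. The sketch below applies it to a synthetic lognormal intensity series of the kind expected under weak turbulence; the parameters are illustrative assumptions.

```python
# Hedged sketch: sample scintillation index from received-intensity data.
# The intensity series is synthetic (lognormal, as in weak turbulence).
import numpy as np

rng = np.random.default_rng(5)
I = rng.lognormal(mean=0.0, sigma=0.3, size=100_000)  # received intensity

si = I.var() / I.mean() ** 2     # scintillation index: <I^2>/<I>^2 - 1
print(f"scintillation index = {si:.3f}")
```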

  20. Characterisation of the LMS propagation channel at L- and S-bands: Narrowband experimental data and channel modelling

    NASA Technical Reports Server (NTRS)

    Sforza, Mario; Buonomo, Sergio

    1993-01-01

During the period 1983-1992 the European Space Agency (ESA) carried out several experimental campaigns to investigate the propagation impairments of the Land Mobile Satellite (LMS) communication channel. A substantial amount of data covering quite a large range of elevation angles, environments, and frequencies was obtained. Results from the data analyses are currently used for system planning and design applications within the framework of the future ESA LMS projects. This comprehensive experimental data base is also presently utilized for channel modeling purposes, and preliminary results are given. Cumulative Distribution Functions (CDF) and Duration of Fades (DoF) statistics at different elevation angles and environments were also included.

  1. Theory of atomic spectral emission intensity

    NASA Astrophysics Data System (ADS)

    Yngström, Sten

    1994-07-01

    The theoretical derivation of a new spectral line intensity formula for atomic radiative emission is presented. The theory is based on first principles of quantum physics, electrodynamics, and statistical physics. Quantum rules lead to revision of the conventional principle of local thermal equilibrium of matter and radiation. Study of electrodynamics suggests absence of spectral emission from fractions of the numbers of atoms and ions in a plasma due to radiative inhibition caused by electromagnetic force fields. Statistical probability methods are extended by the statement: A macroscopic physical system develops in the most probable of all conceivable ways consistent with the constraining conditions for the system. The crucial role of statistical physics in transforming quantum logic into common sense logic is stressed. The theory is strongly supported by experimental evidence.

  2. No-Reference Video Quality Assessment Based on Statistical Analysis in 3D-DCT Domain.

    PubMed

    Li, Xuelong; Guo, Qun; Lu, Xiaoqiang

    2016-05-13

It is an important task to design models for universal no-reference video quality assessment (NR-VQA) in multiple video processing and computer vision applications. However, most existing NR-VQA metrics are designed for specific distortion types, which are often not known in practical applications. A further deficiency is that the spatial and temporal information of videos is rarely considered simultaneously. In this paper, we propose a new NR-VQA metric based on the spatiotemporal natural video statistics (NVS) in the 3D discrete cosine transform (3D-DCT) domain. In the proposed method, a set of features is first extracted based on the statistical analysis of 3D-DCT coefficients to characterize the spatiotemporal statistics of videos in different views. These features are then used to predict the perceived video quality via an efficient linear support vector regression (SVR) model. The contributions of this paper are: 1) we explore the spatiotemporal statistics of videos in the 3D-DCT domain, which has an inherent spatiotemporal encoding advantage over other widely used 2D transformations; 2) we extract a small set of simple but effective statistical features for video visual quality prediction; 3) the proposed method is universal for multiple types of distortions and robust to different databases. The proposed method is tested on four widely used video databases. Extensive experimental results demonstrate that the proposed method is competitive with the state-of-the-art NR-VQA metrics and the top-performing FR-VQA and RR-VQA metrics.
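
    The shape of the pipeline (3D-DCT statistics as features for an SVR) can be sketched as follows; the feature set is a deliberately reduced stand-in for the paper's NVS features, and the clips and quality scores are synthetic.

```python
# Hedged sketch of the pipeline shape: 3D-DCT coefficient statistics per
# clip, regressed onto quality scores with a linear SVR. Data synthetic.
import numpy as np
from scipy.fft import dctn
from scipy.stats import kurtosis
from sklearn.svm import LinearSVR

rng = np.random.default_rng(6)

def clip_features(clip):
    """Simple statistics of non-DC 3D-DCT coefficients of one clip."""
    c = dctn(clip, norm="ortho").ravel()[1:]   # drop the DC coefficient
    return [np.mean(np.abs(c)), np.std(c), kurtosis(c)]

clips = rng.normal(size=(40, 8, 32, 32))       # 40 clips: 8 frames of 32x32
X = np.array([clip_features(c) for c in clips])
y = rng.uniform(0, 100, size=40)               # synthetic quality scores
model = LinearSVR(max_iter=10000).fit(X, y)
print(model.predict(X[:3]))
```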

  3. Statistical power and optimal design in experiments in which samples of participants respond to samples of stimuli.

    PubMed

    Westfall, Jacob; Kenny, David A; Judd, Charles M

    2014-10-01

    Researchers designing experiments in which a sample of participants responds to a sample of stimuli are faced with difficult questions about optimal study design. The conventional procedures of statistical power analysis fail to provide appropriate answers to these questions because they are based on statistical models in which stimuli are not assumed to be a source of random variation in the data, models that are inappropriate for experiments involving crossed random factors of participants and stimuli. In this article, we present new methods of power analysis for designs with crossed random factors, and we give detailed, practical guidance to psychology researchers planning experiments in which a sample of participants responds to a sample of stimuli. We extensively examine 5 commonly used experimental designs, describe how to estimate statistical power in each, and provide power analysis results based on a reasonable set of default parameter values. We then develop general conclusions and formulate rules of thumb concerning the optimal design of experiments in which a sample of participants responds to a sample of stimuli. We show that in crossed designs, statistical power typically does not approach unity as the number of participants goes to infinity but instead approaches a maximum attainable power value that is possibly small, depending on the stimulus sample. We also consider the statistical merits of designs involving multiple stimulus blocks. Finally, we provide a simple and flexible Web-based power application to aid researchers in planning studies with samples of stimuli.
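
    The plateau phenomenon described here is easy to reproduce by Monte Carlo. The sketch below assumes a stimuli-within-condition design analyzed on by-stimulus means (one of several designs the article treats, chosen here for brevity); as the number of participants grows, power levels off at a value set by the stimulus sample.

```python
# Hedged Monte Carlo sketch: with stimuli treated as random, power plateaus
# as participants increase. Design and all parameter values are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def sim_power(n_participants, n_stim_per_cond=8, d=0.5,
              sd_stim=0.3, sd_resid=1.0, n_sim=1000, alpha=0.05):
    hits = 0
    for _ in range(n_sim):
        means = []
        for cond in (0, 1):
            stim_fx = rng.normal(0, sd_stim, n_stim_per_cond) + d * cond
            resid = rng.normal(0, sd_resid, (n_participants, n_stim_per_cond))
            means.append(stim_fx + resid.mean(axis=0))  # by-stimulus means
        _, p = stats.ttest_ind(means[1], means[0])
        hits += p < alpha
    return hits / n_sim

for n in (10, 50, 500):
    print(n, "participants: power ≈", sim_power(n))
```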

  4. PDF-based heterogeneous multiscale filtration model.

    PubMed

    Gong, Jian; Rutland, Christopher J

    2015-04-21

    Motivated by modeling of gasoline particulate filters (GPFs), a probability density function (PDF) based heterogeneous multiscale filtration (HMF) model is developed to calculate filtration efficiency of clean particulate filters. A new methodology based on statistical theory and classic filtration theory is developed in the HMF model. Based on the analysis of experimental porosimetry data, a pore size probability density function is introduced to represent heterogeneity and multiscale characteristics of the porous wall. The filtration efficiency of a filter can be calculated as the sum of the contributions of individual collectors. The resulting HMF model overcomes the limitations of classic mean filtration models which rely on tuning of the mean collector size. Sensitivity analysis shows that the HMF model recovers the classical mean model when the pore size variance is very small. The HMF model is validated by fundamental filtration experimental data from different scales of filter samples. The model shows a good agreement with experimental data at various operating conditions. The effects of the microstructure of filters on filtration efficiency as well as the most penetrating particle size are correctly predicted by the model.
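
    The core idea, overall efficiency as a pore-size-PDF-weighted aggregate of single-collector efficiencies, can be sketched numerically. Both the lognormal pore-size density and the toy collector-efficiency law below are illustrative assumptions, not the paper's calibrated model.

```python
# Hedged sketch of the HMF idea: integrate single-collector efficiency
# against an assumed pore-size PDF. All functional forms are illustrative.
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

pore = stats.lognorm(s=0.5, scale=15e-6)   # assumed pore-size PDF (metres)
d = np.linspace(1e-6, 80e-6, 2000)         # pore/collector diameters

def collector_efficiency(dc, dp=100e-9):
    """Toy single-collector law: smaller pores capture 100 nm particles better."""
    return np.clip(2.0 * (dp / dc) ** 0.5, 0.0, 1.0)

# Overall efficiency as the PDF-weighted aggregate of collector efficiencies
eta = trapezoid(collector_efficiency(d) * pore.pdf(d), d)
print(f"overall clean-filter collector efficiency ≈ {eta:.3f}")
```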

  5. Improving the Effectiveness of Fundraising Messages: The Impact of Charity Goal Attainment, Message Framing, and Evidence on Persuasion

    ERIC Educational Resources Information Center

    Das, Enny; Kerkhof, Peter; Kuiper, Joyce

    2008-01-01

    This experimental study assessed the effectiveness of fundraising messages. Based on recent findings regarding the effects of message framing and evidence, effective fundraising messages should combine abstract, statistical information with a negative message frame and anecdotal evidence with a positive message frame. In addition, building on…

  6. Teaching Efficacy in the Classroom: Skill Based Training for Teachers' Empowerment

    ERIC Educational Resources Information Center

    Karimzadeh, Mansoureh; Salehi, Hadi; Embi, Mohamed Amin; Nasiri, Mehdi; Shojaee, Mohammad

    2014-01-01

    This study aims to use an experimental research design to enhance teaching efficacy by social-emotional skills training in teachers. The statistical sample comprised of 68 elementary teachers (grades 4 and 5) with at least 10 years teaching experience and a bachelor's degree who were randomly assigned into control (18 female, 16 male) and…

  7. Central Tire Inflation: Demonstration Tests in the South

    Treesearch

    R.B. Rummer; C. Ashmore; D.L. Sirois; C.L. Rawlins

    1990-01-01

    Tests of prototype Central Tire Inflation (CTI) systems were conducted to quantify CTI performance, road wear, and truck vibration. The CTI systems were tested in both experimental and operational settings. Changes in the road surface that occurred during the tests could not be statistically attributed to reduced tire pressure. Vibration at the seat base, however,...

  8. On the combined gradient-stochastic plasticity model: Application to Mo-micropillar compression

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Konstantinidis, A. A., E-mail: akonsta@civil.auth.gr; Zhang, X., E-mail: zhangxu26@126.com; Aifantis, E. C., E-mail: mom@mom.gen.auth.gr

    2015-02-17

    A formulation for addressing heterogeneous material deformation is proposed. It is based on the use of a stochasticity-enhanced gradient plasticity model implemented through a cellular automaton. The specific application is on Mo-micropillar compression, for which the irregularities of the strain bursts observed have been experimentally measured and theoretically interpreted through Tsallis' q-statistics.

  9. I Remember You: Independence and the Binomial Model

    ERIC Educational Resources Information Center

    Levine, Douglas W.; Rockhill, Beverly

    2006-01-01

    We focus on the problem of ignoring statistical independence. A binomial experiment is used to determine whether judges could match, based on looks alone, dogs to their owners. The experimental design introduces dependencies such that the probability of a given judge correctly matching a dog and an owner changes from trial to trial. We show how…

  10. A Step-by-Step Guide to Propensity Score Matching in R

    ERIC Educational Resources Information Center

    Randolph, Justus J.; Falbe, Kristina; Manuel, Austin Kureethara; Balloun, Joseph L.

    2014-01-01

    Propensity score matching is a statistical technique in which a treatment case is matched with one or more control cases based on each case's propensity score. This matching can help strengthen causal arguments in quasi-experimental and observational studies by reducing selection bias. In this article we concentrate on how to conduct propensity…
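
    The article works in R; a minimal Python analogue of the core procedure (greedy 1:1 nearest-neighbour matching, with replacement, on a logistic-regression propensity score) might look as follows, with all variable names hypothetical.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.neighbors import NearestNeighbors

      def propensity_match(X, treated):
          # X: covariate matrix; treated: boolean array marking treatment cases.
          ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
          controls = np.flatnonzero(~treated)
          nn = NearestNeighbors(n_neighbors=1).fit(ps[controls][:, None])
          _, j = nn.kneighbors(ps[treated][:, None])
          return np.flatnonzero(treated), controls[j.ravel()], ps

      # treated_rows, matched_controls, ps = propensity_match(X, t)
      # Outcomes are then compared across the matched pairs; covariate balance
      # should be checked (e.g., standardized mean differences) before inference.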

  11. An Exploratory Data Analysis System for Support in Medical Decision-Making

    PubMed Central

    Copeland, J. A.; Hamel, B.; Bourne, J. R.

    1979-01-01

    An experimental system was developed to allow retrieval and analysis of data collected during a study of neurobehavioral correlates of renal disease. After retrieving data organized in a relational data base, simple bivariate statistics of parametric and nonparametric nature could be conducted. An “exploratory” mode in which the system provided guidance in selection of appropriate statistical analyses was also available to the user. The system traversed a decision tree using the inherent qualities of the data (e.g., the identity and number of patients, tests, and time epochs) to search for the appropriate analyses to employ.

  12. Full statistical mode reconstruction of a light field via a photon-number-resolved measurement

    NASA Astrophysics Data System (ADS)

    Burenkov, I. A.; Sharma, A. K.; Gerrits, T.; Harder, G.; Bartley, T. J.; Silberhorn, C.; Goldschmidt, E. A.; Polyakov, S. V.

    2017-05-01

    We present a method to reconstruct the complete statistical mode structure and optical losses of multimode conjugated optical fields using an experimentally measured joint photon-number probability distribution. We demonstrate that this method evaluates classical and nonclassical properties using a single measurement technique and is well suited for quantum mesoscopic state characterization. We obtain a nearly perfect reconstruction of a field comprised of up to ten modes based on a minimal set of assumptions. To show the utility of this method, we use it to reconstruct the mode structure of an unknown bright parametric down-conversion source.

  13. What can we learn from noise? — Mesoscopic nonequilibrium statistical physics —

    PubMed Central

    KOBAYASHI, Kensuke

    2016-01-01

    Mesoscopic systems — small electric circuits working in the quantum regime — offer us a unique experimental stage to explore quantum transport in a tunable and precise way. The purpose of this Review is to show how they can contribute to statistical physics. We introduce the significance of fluctuation, or equivalently noise, as noise measurement enables us to address the fundamental aspects of a physical system. The significance of the fluctuation theorem (FT) in statistical physics is noted. We explain what information can be deduced from current noise measurements in mesoscopic systems. As an important application of noise measurement to statistical physics, we describe our experimental work on the current and current noise in an electron interferometer, which is the first experimental test of the FT in the quantum regime. Our attempt will shed new light on the research field of mesoscopic quantum statistical physics. PMID:27477456

  14. What can we learn from noise? - Mesoscopic nonequilibrium statistical physics.

    PubMed

    Kobayashi, Kensuke

    2016-01-01

    Mesoscopic systems - small electric circuits working in the quantum regime - offer us a unique experimental stage to explore quantum transport in a tunable and precise way. The purpose of this Review is to show how they can contribute to statistical physics. We introduce the significance of fluctuation, or equivalently noise, as noise measurement enables us to address the fundamental aspects of a physical system. The significance of the fluctuation theorem (FT) in statistical physics is noted. We explain what information can be deduced from current noise measurements in mesoscopic systems. As an important application of noise measurement to statistical physics, we describe our experimental work on the current and current noise in an electron interferometer, which is the first experimental test of the FT in the quantum regime. Our attempt will shed new light on the research field of mesoscopic quantum statistical physics.

  15. A phenomenological biological dose model for proton therapy based on linear energy transfer spectra.

    PubMed

    Rørvik, Eivind; Thörnqvist, Sara; Stokkevåg, Camilla H; Dahle, Tordis J; Fjaera, Lars Fredrik; Ytre-Hauge, Kristian S

    2017-06-01

    The relative biological effectiveness (RBE) of protons varies with the radiation quality, quantified by the linear energy transfer (LET). Most phenomenological models employ a linear dependency of the dose-averaged LET (LETd) to calculate the biological dose. However, several experiments have indicated a possible non-linear trend. Our aim was to investigate if biological dose models including non-linear LET dependencies should be considered, by introducing a LET-spectrum-based dose model. The RBE-LET relationship was investigated by fitting of polynomials from 1st to 5th degree to a database of 85 data points from aerobic in vitro experiments. We included both unweighted and weighted regression, the latter taking into account experimental uncertainties. Statistical testing was performed to decide whether higher degree polynomials provided better fits to the data as compared to lower degrees. The newly developed models were compared to three published LETd-based models for a simulated spread-out Bragg peak (SOBP) scenario. The statistical analysis of the weighted regression analysis favored a non-linear RBE-LET relationship, with the quartic polynomial found to best represent the experimental data (P = 0.010). The results of the unweighted regression analysis were on the borderline of statistical significance for non-linear functions (P = 0.053), and with the current database a linear dependency could not be rejected. For the SOBP scenario, the weighted non-linear model estimated a similar mean RBE value (1.14) compared to the three established models (1.13-1.17). The unweighted model calculated a considerably higher RBE value (1.22). The analysis indicated that non-linear models could give a better representation of the RBE-LET relationship. However, this is not decisive, as inclusion of the experimental uncertainties in the regression analysis had a significant impact on the determination and ranking of the models. As differences between the models were observed for the SOBP scenario, both non-linear LET-spectrum-based and linear LETd-based models should be further evaluated in clinically realistic scenarios. © 2017 American Association of Physicists in Medicine.
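
    The model-selection step, weighted polynomial fits of RBE against LET compared across degrees, can be sketched as follows in Python. The input arrays stand in for the 85-point database, and reduced chi-square is used here in place of the article's formal significance tests.

      import numpy as np

      def compare_polynomial_degrees(let_d, rbe, sigma, max_deg=5):
          # Weighted least-squares fits of RBE vs LET for degrees 1..5;
          # lower reduced chi-square suggests a better-supported degree.
          for deg in range(1, max_deg + 1):
              coef = np.polyfit(let_d, rbe, deg, w=1.0 / sigma)
              resid = (rbe - np.polyval(coef, let_d)) / sigma
              chi2_dof = np.sum(resid ** 2) / (len(rbe) - deg - 1)
              print(f"degree {deg}: reduced chi-square = {chi2_dof:.3f}")

      # compare_polynomial_degrees(let_values, rbe_values, rbe_uncertainties)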

  16. Effect of music care on depression and behavioral problems in elderly people with dementia in Taiwan: a quasi-experimental, longitudinal study.

    PubMed

    Wang, Su-Chin; Yu, Ching-Len; Chang, Su-Hsien

    2017-02-01

    The purpose was to examine the effectiveness of music care on cognitive function, depression, and behavioral problems among elderly people with dementia in long-term care facilities in Taiwan. The study had a quasi-experimental, longitudinal research design and used two groups of subjects. Subjects were not randomly assigned to the experimental group (n = 90) or comparison group (n = 56). Based on Bandura's social cognition theory, subjects in the experimental group received Kagayashiki music care (KMC) twice per week for 24 weeks. Subjects in the comparison group were provided with activities as usual. Using the baseline score of the Clifton Assessment Procedures for the Elderly Behavior Rating Scale and time attending KMC activities as covariates, the two groups of subjects had statistically significant differences on the Mini-Mental State Examination (MMSE). Likewise, using the baseline score of the Cornell Scale for Depression in Dementia and the baseline MMSE as covariates, the two groups had statistically significant differences on the Clifton Assessment Procedures for the Elderly Behavior Rating Scale. These findings provide information for staff caregivers in long-term care facilities to develop a non-invasive care model for elderly people with dementia to deal with depression, anxiety, and behavioral problems.

  17. Spontaneous emergence of rogue waves in partially coherent waves: A quantitative experimental comparison between hydrodynamics and optics

    NASA Astrophysics Data System (ADS)

    El Koussaifi, R.; Tikan, A.; Toffoli, A.; Randoux, S.; Suret, P.; Onorato, M.

    2018-01-01

    Rogue waves are extreme and rare fluctuations of the wave field that have been discussed in many physical systems. Their presence substantially influences the statistical properties of a partially coherent wave field, i.e., a wave field characterized by a finite band spectrum with random Fourier phases. Their understanding is fundamental for the design of ships and offshore platforms. In many meteorological conditions waves in the ocean are characterized by the so-called Joint North Sea Wave Project (JONSWAP) spectrum. Here we compare two unique experimental results: the first one has been performed in a 270 m wave tank and the other in optical fibers. In both cases, waves characterized by a JONSWAP spectrum and random Fourier phases have been launched at the input of the experimental device. The quantitative comparison, based on an appropriate scaling of the two experiments, shows a very good agreement between the statistics in hydrodynamics and optics. Spontaneous emergence of heavy tails in the probability density function of the wave amplitude is observed in both systems. The results demonstrate the universal features of rogue waves and provide a fundamental and explicit bridge between two important fields of research. Numerical simulations are also compared with experimental results.

  18. Spontaneous emergence of rogue waves in partially coherent waves: A quantitative experimental comparison between hydrodynamics and optics.

    PubMed

    El Koussaifi, R; Tikan, A; Toffoli, A; Randoux, S; Suret, P; Onorato, M

    2018-01-01

    Rogue waves are extreme and rare fluctuations of the wave field that have been discussed in many physical systems. Their presence substantially influences the statistical properties of a partially coherent wave field, i.e., a wave field characterized by a finite band spectrum with random Fourier phases. Their understanding is fundamental for the design of ships and offshore platforms. In many meteorological conditions waves in the ocean are characterized by the so-called Joint North Sea Wave Project (JONSWAP) spectrum. Here we compare two unique experimental results: the first one has been performed in a 270 m wave tank and the other in optical fibers. In both cases, waves characterized by a JONSWAP spectrum and random Fourier phases have been launched at the input of the experimental device. The quantitative comparison, based on an appropriate scaling of the two experiments, shows a very good agreement between the statistics in hydrodynamics and optics. Spontaneous emergence of heavy tails in the probability density function of the wave amplitude is observed in both systems. The results demonstrate the universal features of rogue waves and provide a fundamental and explicit bridge between two important fields of research. Numerical simulations are also compared with experimental results.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webb-Robertson, Bobbie-Jo M.

    Accurate identification of peptides is a current challenge in mass spectrometry (MS) based proteomics. The standard approach uses a search routine to compare tandem mass spectra to a database of peptides associated with the target organism. These database search routines yield multiple metrics associated with the quality of the mapping of the experimental spectrum to the theoretical spectrum of a peptide. The structure of these results makes separating correct from false identifications difficult and has created a false identification problem. Statistical confidence scores are an approach to battle this false positive problem that has led to significant improvements in peptide identification. We have shown that machine learning, specifically the support vector machine (SVM), is an effective approach to separating true peptide identifications from false ones. The SVM-based peptide statistical scoring method transforms a peptide into a vector representation based on database search metrics to train and validate the SVM. In practice, following the database search routine, a peptide is encoded in its vector representation and the SVM generates a single statistical score that is then used to classify its presence or absence in the sample.
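
    A minimal sketch of the scoring idea in Python/scikit-learn, with the metric names and decoy-based labels as assumptions: train an SVM on database-search metrics and use its signed decision value as the single statistical score.

      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      # X: one row of database-search metrics per peptide-spectrum match;
      # y: 1 for confident identifications, 0 for decoy matches (assumed labels).
      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
      # clf.fit(X_train, y_train)
      # One statistical score per match: signed distance from the SVM boundary.
      # scores = clf.decision_function(X_new)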

  20. BagMOOV: A novel ensemble for heart disease prediction bootstrap aggregation with multi-objective optimized voting.

    PubMed

    Bashir, Saba; Qamar, Usman; Khan, Farhan Hassan

    2015-06-01

    Conventional clinical decision support systems are based on individual classifiers or simple combinations of these classifiers, which tend to show moderate performance. This research paper presents a novel classifier ensemble framework based on an enhanced bagging approach with a multi-objective weighted voting scheme for the prediction and analysis of heart disease. The proposed model overcomes the limitations of conventional approaches by utilizing an ensemble of five heterogeneous classifiers: naïve Bayes, linear regression, quadratic discriminant analysis, an instance-based learner, and support vector machines. Five different datasets, obtained from publicly available data repositories, are used for experimentation, evaluation, and validation. The effectiveness of the proposed ensemble is investigated by comparison of its results with several classifiers. Prediction results of the proposed ensemble model are assessed by tenfold cross validation and ANOVA statistics. The experimental evaluation shows that the proposed framework deals with all types of attributes and achieved a high diagnosis accuracy of 84.16%, sensitivity of 93.29%, specificity of 96.70%, and f-measure of 82.15%. An F-ratio higher than the F-critical value and p-values less than 0.05 at the 95% confidence level indicate that the results are statistically significant for most of the datasets.
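
    A rough scikit-learn analogue of the ensemble structure (bagged heterogeneous base learners combined by weighted soft voting) is sketched below. Uniform weights stand in for the paper's multi-objective optimized weights, and logistic regression and k-NN stand in for its linear-regression and instance-based learners; BaggingClassifier's estimator argument assumes scikit-learn 1.2 or later.

      from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
      from sklearn.ensemble import BaggingClassifier, VotingClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.naive_bayes import GaussianNB
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.svm import SVC

      base = [("nb", GaussianNB()),
              ("lr", LogisticRegression(max_iter=1000)),
              ("qda", QuadraticDiscriminantAnalysis()),
              ("knn", KNeighborsClassifier()),
              ("svm", SVC(probability=True))]
      ensemble = VotingClassifier(
          estimators=[(name, BaggingClassifier(estimator=c, n_estimators=10))
                      for name, c in base],
          voting="soft",
          weights=[1, 1, 1, 1, 1])   # placeholders for the optimized weights
      # ensemble.fit(X_train, y_train); y_pred = ensemble.predict(X_test)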

  1. Retrieval of Atmospheric Particulate Matter Using Satellite Data Over Central and Eastern China

    NASA Astrophysics Data System (ADS)

    Chen, G. L.; Guang, J.; Li, Y.; Che, Y. H.; Gong, S. Q.

    2018-04-01

    Fine particulate matter (PM2.5) comprises particles with diameters less than or equal to 2.5 μm. Over the past few decades, regional air pollution composed of PM2.5 has frequently occurred over Central and Eastern China. In order to estimate the concentration, distribution, and other properties of PM2.5, retrieval models built by establishing the relationship between aerosol optical depth (AOD) and PM2.5 have been widely used in many studies, including experimental models based on statistical analysis and physical models with a defined physical mechanism. The statistical experimental models cannot be extended to other areas or historical periods because of their dependence on ground-based observations and necessary auxiliary data, which limits their further application. In this paper, a physically based model is applied to estimate the concentration of PM2.5 over Central and Eastern China from 2007 to 2016. Ground-based PM2.5 measurements were used as reference data to validate the retrieval results. The annual variation and distribution of PM2.5 concentration in Central and Eastern China were then analysed. Results show that the annual average PM2.5 gradually increased and then decreased during 2007-2016, with the highest value in 2011.

  2. K-nearest neighbors based methods for identification of different gear crack levels under different motor speeds and loads: Revisited

    NASA Astrophysics Data System (ADS)

    Wang, Dong

    2016-03-01

    Gears are the most commonly used components in mechanical transmission systems. Their failures may cause transmission system breakdown and result in economic loss. Identification of different gear crack levels is important to prevent unexpected gear failure, because gear cracks lead to gear tooth breakage. Signal-processing-based methods mainly require expertise to interpret gear fault signatures, which is usually not easy for ordinary users. In order to automatically identify different gear crack levels, intelligent gear crack identification methods should be developed. Previous case studies experimentally proved that K-nearest neighbors based methods exhibit high prediction accuracies for identification of 3 different gear crack levels under different motor speeds and loads. In this short communication, to further enhance the prediction accuracies of existing K-nearest neighbors based methods and to extend identification from 3 to 5 different gear crack levels, redundant statistical features are constructed using the Daubechies 44 (db44) binary wavelet packet transform at different wavelet decomposition levels, prior to the use of a K-nearest neighbors method. The dimensionality of the redundant statistical features is 620, which provides richer gear fault signatures. Since many of these statistical features are redundant and highly correlated with each other, dimensionality reduction is conducted to obtain new significant statistical features. Finally, the K-nearest neighbors method is used to identify 5 different gear crack levels under different motor speeds and loads. A case study including 3 experiments is investigated to demonstrate that the developed method provides higher prediction accuracies than the existing K-nearest neighbors based methods for recognizing different gear crack levels under different motor speeds and loads. Based on the new significant statistical features, some other popular statistical models, including linear discriminant analysis, quadratic discriminant analysis, classification and regression tree and naive Bayes classifier, are compared with the developed method. The results show that the developed method has the highest prediction accuracies among these statistical models. Additionally, selection of the number of new significant features and parameter selection for K-nearest neighbors are thoroughly investigated.
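
    The feature-construction step can be sketched in Python as follows. PyWavelets does not ship db44, so db4 stands in for it here, and the per-node statistics, decomposition level, and PCA dimensionality are illustrative choices rather than the paper's exact 620-feature construction.

      import numpy as np
      import pywt
      from sklearn.decomposition import PCA
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      def wp_statistical_features(signal, wavelet="db4", level=4):
          # Simple statistics from every wavelet-packet node at one level.
          wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
          feats = []
          for node in wp.get_level(level, order="natural"):
              c = node.data
              feats += [c.std(), np.sqrt(np.mean(c ** 2)), np.abs(c).max(),
                        np.mean(c ** 4) / (np.var(c) ** 2 + 1e-12)]
          return np.asarray(feats)

      # X = np.vstack([wp_statistical_features(s) for s in vibration_signals])
      clf = make_pipeline(StandardScaler(), PCA(n_components=20),
                          KNeighborsClassifier(n_neighbors=5))
      # clf.fit(X, crack_level_labels)   # 5 crack-level classes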

  3. Assessing the significance of pedobarographic signals using random field theory.

    PubMed

    Pataky, Todd C

    2008-08-07

    Traditional pedobarographic statistical analyses are conducted over discrete regions. Recent studies have demonstrated that regionalization can corrupt pedobarographic field data through conflation when arbitrary dividing lines inappropriately delineate smooth field processes. An alternative is to register images such that homologous structures optimally overlap and then conduct statistical tests at each pixel to generate statistical parametric maps (SPMs). The significance of SPM processes may be assessed within the framework of random field theory (RFT). RFT is ideally suited to pedobarographic image analysis because its fundamental data unit is a lattice sampling of a smooth and continuous spatial field. To correct for the vast number of multiple comparisons inherent in such data, recent pedobarographic studies have employed a Bonferroni correction to retain a constant family-wise error rate. This approach unfortunately neglects the spatial correlation of neighbouring pixels, so provides an overly conservative (albeit valid) statistical threshold. RFT generally relaxes the threshold depending on field smoothness and on the geometry of the search area, but it also provides a framework for assigning p values to suprathreshold clusters based on their spatial extent. The current paper provides an overview of basic RFT concepts and uses simulated and experimental data to validate both RFT-relevant field smoothness estimations and RFT predictions regarding the topological characteristics of random pedobarographic fields. Finally, previously published experimental data are re-analysed using RFT inference procedures to demonstrate how RFT yields easily understandable statistical results that may be incorporated into routine clinical and laboratory analyses.

  4. A Range Finding Protocol to Support Design for Transcriptomics Experimentation: Examples of In-Vitro and In-Vivo Murine UV Exposure

    PubMed Central

    van Oostrom, Conny T.; Jonker, Martijs J.; de Jong, Mark; Dekker, Rob J.; Rauwerda, Han; Ensink, Wim A.; de Vries, Annemieke; Breit, Timo M.

    2014-01-01

    In transcriptomics research, design for experimentation by carefully considering biological, technological, practical and statistical aspects is very important, because the experimental design space is essentially limitless. Usually, the ranges of variable biological parameters of the design space are based on common practices and in turn on phenotypic endpoints. However, specific sub-cellular processes might only be partially reflected by phenotypic endpoints or outside the associated parameter range. Here, we provide a generic protocol for range finding in design for transcriptomics experimentation based on small-scale gene-expression experiments to help in the search for the right location in the design space by analyzing the activity of already known genes of relevant molecular mechanisms. Two examples illustrate the applicability: in-vitro UV-C exposure of mouse embryonic fibroblasts and in-vivo UV-B exposure of mouse skin. Our pragmatic approach is based on: framing a specific biological question and associated gene-set, performing a wide-ranged experiment without replication, eliminating potentially non-relevant genes, and determining the experimental ‘sweet spot’ by gene-set enrichment plus dose-response correlation analysis. Examination of many cellular processes that are related to UV response, such as DNA repair and cell-cycle arrest, revealed that basically each cellular (sub-) process is active at its own specific spot(s) in the experimental design space. Hence, the use of range finding, based on an affordable protocol like this, enables researchers to conveniently identify the ‘sweet spot’ for their cellular process of interest in an experimental design space and might have far-reaching implications for experimental standardization. PMID:24823911

  5. Simulation of spatially evolving turbulence and the applicability of Taylor's hypothesis in compressible flow

    NASA Technical Reports Server (NTRS)

    Lee, Sangsan; Lele, Sanjiva K.; Moin, Parviz

    1992-01-01

    For the numerical simulation of inhomogeneous turbulent flows, a method is developed for generating stochastic inflow boundary conditions with a prescribed power spectrum. Turbulence statistics from spatial simulations using this method with a low fluctuation Mach number are in excellent agreement with the experimental data, which validates the procedure. Turbulence statistics from spatial simulations are also compared to those from temporal simulations using Taylor's hypothesis. Statistics such as turbulence intensity, vorticity, and velocity derivative skewness compare favorably with the temporal simulation. However, the statistics of dilatation show a significant departure from those obtained in the temporal simulation. To directly check the applicability of Taylor's hypothesis, space-time correlations of fluctuations in velocity, vorticity, and dilatation are investigated. Convection velocities based on vorticity and velocity fluctuations are computed as functions of the spatial and temporal separations. The profile of the space-time correlation of dilatation fluctuations is explained via a wave propagation model.

  6. Steganalysis based on reducing the differences of image statistical characteristics

    NASA Astrophysics Data System (ADS)

    Wang, Ran; Niu, Shaozhang; Ping, Xijian; Zhang, Tao

    2018-04-01

    Compared with the embedding process, image content makes a more significant impact on the differences of image statistical characteristics. This makes image steganalysis a classification problem with larger within-class scatter distances and smaller between-class scatter distances. As a result, the steganalysis features become inseparable owing to the differences of image statistical characteristics. In this paper, a new steganalysis framework which can reduce the differences of image statistical characteristics caused by various contents and processing methods is proposed. The given images are segmented into several sub-images according to texture complexity. Steganalysis features are separately extracted from each subset with the same or close texture complexity to build a classifier. The final steganalysis result is obtained through a weighted fusion process. The theoretical analysis and experimental results demonstrate the validity of the framework.

  7. Validating Coherence Measurements Using Aligned and Unaligned Coherence Functions

    NASA Technical Reports Server (NTRS)

    Miles, Jeffrey Hilton

    2006-01-01

    This paper describes a novel approach based on coherence functions and statistical theory for sensor validation in a harsh environment. By using aligned and unaligned coherence functions together with statistical theory, one can test for sensor degradation, total sensor failure, or changes in the signal. This diagnostic approach and the data processing methodology discussed provide a single number that conveys this information. This number, calculated with standard statistical procedures for comparing the means of two distributions, is compared with results obtained using Yuen's robust statistical method to create confidence intervals. Examination of experimental data from Kulite pressure transducers mounted in a Pratt & Whitney PW4098 combustor, using spectrum analysis methods on aligned and unaligned time histories, has verified the effectiveness of the proposed method. All the procedures produce good results, demonstrating the robustness of the technique.
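
    A minimal sketch of the aligned/unaligned comparison in Python/SciPy, with the sampling rate, segment length, and misalignment shift as assumptions: a healthy pair of transducers should show aligned coherence well above the deliberately misaligned baseline, and degradation pulls the two values together.

      import numpy as np
      from scipy.signal import coherence

      fs = 10_000          # Hz; assumed sampling rate

      def mean_coherence(x, y, shift=0):
          # Magnitude-squared coherence; a nonzero shift misaligns y to
          # estimate the "unaligned" noise-floor coherence.
          f, cxy = coherence(x, np.roll(y, shift), fs=fs, nperseg=1024)
          return cxy.mean()

      # aligned   = mean_coherence(p1, p2)
      # unaligned = mean_coherence(p1, p2, shift=fs)   # one-second misalignment
      # A large aligned-minus-unaligned gap indicates healthy, coherent sensors.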

  8. On the Spike Train Variability Characterized by Variance-to-Mean Power Relationship.

    PubMed

    Koyama, Shinsuke

    2015-07-01

    We propose a statistical method for modeling the non-Poisson variability of spike trains observed in a wide range of brain regions. Central to our approach is the assumption that the variance and the mean of interspike intervals are related by a power function characterized by two parameters: the scale factor and exponent. It is shown that this single assumption allows the variability of spike trains to have an arbitrary scale and various dependencies on the firing rate in the spike count statistics, as well as in the interval statistics, depending on the two parameters of the power function. We also propose a statistical model for spike trains that exhibits the variance-to-mean power relationship. Based on this, a maximum likelihood method is developed for inferring the parameters from rate-modulated spike trains. The proposed method is illustrated on simulated and experimental spike trains.
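
    Estimating the two parameters of the power relation from data reduces to a log-log regression, sketched here in Python; the grouping of interspike intervals by firing-rate condition is an assumption.

      import numpy as np

      def fit_variance_mean_power(isi_groups):
          # Fit var = a * mean**b across groups of interspike intervals
          # (one group per firing-rate condition) by log-log regression.
          means = np.array([np.mean(g) for g in isi_groups])
          varis = np.array([np.var(g, ddof=1) for g in isi_groups])
          b, log_a = np.polyfit(np.log(means), np.log(varis), 1)
          return np.exp(log_a), b     # scale factor, exponent

      # Exponential (Poisson-like) intervals give a = 1, b = 2; departures in
      # either parameter quantify non-Poisson spike-train variability.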

  9. Effect of experimental design on the prediction performance of calibration models based on near-infrared spectroscopy for pharmaceutical applications.

    PubMed

    Bondi, Robert W; Igne, Benoît; Drennen, James K; Anderson, Carl A

    2012-12-01

    Near-infrared spectroscopy (NIRS) is a valuable tool in the pharmaceutical industry, presenting opportunities for online analyses to achieve real-time assessment of intermediates and finished dosage forms. The purpose of this work was to investigate the effect of experimental designs on prediction performance of quantitative models based on NIRS using a five-component formulation as a model system. The following experimental designs were evaluated: five-level, full factorial (5-L FF); three-level, full factorial (3-L FF); central composite; I-optimal; and D-optimal. The factors for all designs were acetaminophen content and the ratio of microcrystalline cellulose to lactose monohydrate. Other constituents included croscarmellose sodium and magnesium stearate (content remained constant). Partial least squares-based models were generated using data from individual experimental designs that related acetaminophen content to spectral data. The effect of each experimental design was evaluated by determining the statistical significance of the difference in bias and standard error of the prediction for that model's prediction performance. The calibration model derived from the I-optimal design had similar prediction performance as did the model derived from the 5-L FF design, despite containing 16 fewer design points. It also outperformed all other models estimated from designs with similar or fewer numbers of samples. This suggested that experimental-design selection for calibration-model development is critical, and optimum performance can be achieved with efficient experimental designs (i.e., optimal designs).

  10. Standard deviation and standard error of the mean.

    PubMed

    Lee, Dong Kyu; In, Junyong; Lee, Sangseok

    2015-06-01

    In most clinical and experimental studies, the standard deviation (SD) and the estimated standard error of the mean (SEM) are used to present the characteristics of sample data and to explain statistical analysis results. However, some authors occasionally muddle the distinctive usage between the SD and SEM in medical literature. Because the process of calculating the SD and SEM includes different statistical inferences, each of them has its own meaning. SD is the dispersion of data in a normal distribution. In other words, SD indicates how accurately the mean represents sample data. However, the meaning of SEM includes statistical inference based on the sampling distribution. SEM is the SD of the theoretical distribution of the sample means (the sampling distribution). While either SD or SEM can be applied to describe data and statistical results, one should be aware of reasonable methods with which to use SD and SEM. We aim to elucidate the distinctions between SD and SEM and to provide proper usage guidelines for both, which summarize data and describe statistical results.

  11. Standard deviation and standard error of the mean

    PubMed Central

    In, Junyong; Lee, Sangseok

    2015-01-01

    In most clinical and experimental studies, the standard deviation (SD) and the estimated standard error of the mean (SEM) are used to present the characteristics of sample data and to explain statistical analysis results. However, some authors occasionally muddle the distinctive usage between the SD and SEM in medical literature. Because the process of calculating the SD and SEM includes different statistical inferences, each of them has its own meaning. SD is the dispersion of data in a normal distribution. In other words, SD indicates how accurately the mean represents sample data. However, the meaning of SEM includes statistical inference based on the sampling distribution. SEM is the SD of the theoretical distribution of the sample means (the sampling distribution). While either SD or SEM can be applied to describe data and statistical results, one should be aware of reasonable methods with which to use SD and SEM. We aim to elucidate the distinctions between SD and SEM and to provide proper usage guidelines for both, which summarize data and describe statistical results. PMID:26045923
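
    The distinction reduces to one line of arithmetic each, shown here with a hypothetical sample:

      import numpy as np
      from scipy import stats

      x = np.array([4.2, 5.1, 4.8, 5.5, 4.9, 5.2])    # hypothetical sample
      sd = x.std(ddof=1)              # dispersion of the data themselves
      sem = sd / np.sqrt(len(x))      # SD of the sampling distribution of the mean
      assert np.isclose(sem, stats.sem(x))
      print(f"mean = {x.mean():.2f}, SD = {sd:.2f}, SEM = {sem:.2f}")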

  12. Robust approaches to quantification of margin and uncertainty for sparse data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hund, Lauren; Schroeder, Benjamin B.; Rumsey, Kelin

    Characterizing the tails of probability distributions plays a key role in quantification of margins and uncertainties (QMU), where the goal is characterization of low-probability, high-consequence events based on continuous measures of performance. When data are collected using physical experimentation, probability distributions are typically fit using statistical methods based on the collected data, and these parametric distributional assumptions are often used to extrapolate about the extreme tail behavior of the underlying probability distribution. In this project, we characterize the risk associated with such tail extrapolation. Specifically, we conducted a scaling study to demonstrate the large magnitude of the risk; then, we developed new methods for communicating risk associated with tail extrapolation from unvalidated statistical models; lastly, we proposed a Bayesian data-integration framework to mitigate tail extrapolation risk through integrating additional information. We conclude that decision-making using QMU is a complex process that cannot be achieved using statistical analyses alone.

  13. Frequent statistics of link-layer bit stream data based on AC-IM algorithm

    NASA Astrophysics Data System (ADS)

    Cao, Chenghong; Lei, Yingke; Xu, Yiming

    2017-08-01

    At present, there are many studies on data processing using classical pattern matching and its improved algorithms, but few on frequency statistics of link-layer bit-stream data. This paper adopts a frequency-statistics method for link-layer bit-stream data based on the AC-IM algorithm, since classical multi-pattern matching algorithms such as the AC algorithm have high computational complexity and low efficiency and cannot be applied directly to binary bit streams. The method's maximum jump distance over the pattern tree is the length of the shortest pattern string plus 3, without missing any matches. This paper first presents a theoretical analysis of the algorithm's construction; experimental results then show that the algorithm adapts to the binary bit-stream environment and extracts frequent sequences more accurately, with an obvious effect. Meanwhile, compared with the classical AC algorithm and other improved algorithms, the AC-IM algorithm has a larger maximum jump distance and is less time-consuming.

  14. Nonadditive entropy Sq and nonextensive statistical mechanics: Applications in geophysics and elsewhere

    NASA Astrophysics Data System (ADS)

    Tsallis, Constantino

    2012-06-01

    The celebrated Boltzmann-Gibbs (BG) entropy, S_BG = -k Σ_i p_i ln p_i, and its associated statistical mechanics are essentially based on hypotheses such as ergodicity, i.e., that ensemble averages coincide with time averages. This dynamical simplification occurs in classical systems (and quantum counterparts) whose microscopic evolution is governed by a positive largest Lyapunov exponent (LLE). Under such circumstances, relevant microscopic variables behave, from the probabilistic viewpoint, as (nearly) independent. Many phenomena exist, however, in natural, artificial and social systems (geophysics, astrophysics, biophysics, economics, and others) that violate ergodicity. To cover a (possibly) wide class of such systems, a generalization (nonextensive statistical mechanics) of the BG theory was proposed in 1988. This theory is based on nonadditive entropies such as S_q = k (1 - Σ_i p_i^q)/(q - 1), which recovers the BG form at q = 1 (S_1 = S_BG). Here we comment on some central aspects of this theory, and briefly review typical predictions, verifications and applications in geophysics and elsewhere, as illustrated through theoretical, experimental, observational, and computational results.
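
    The nonadditive entropy is straightforward to compute; a small Python sketch showing that S_q recovers the BG form as q approaches 1:

      import numpy as np

      def tsallis_entropy(p, q, k=1.0):
          # Nonadditive entropy S_q; the q -> 1 limit is the BG entropy.
          p = np.asarray(p, dtype=float)
          p = p[p > 0]
          if np.isclose(q, 1.0):
              return -k * np.sum(p * np.log(p))          # S_BG
          return k * (1.0 - np.sum(p ** q)) / (q - 1.0)

      p = np.full(4, 0.25)
      print(tsallis_entropy(p, 1.0))   # ln 4 ~ 1.386 (BG limit)
      print(tsallis_entropy(p, 2.0))   # 0.75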

  15. Multivariate statistical model for 3D image segmentation with application to medical images.

    PubMed

    John, Nigel M; Kabuka, Mansur R; Ibrahim, Mohamed O

    2003-12-01

    In this article we describe a statistical model that was developed to segment brain magnetic resonance images. The statistical segmentation algorithm was applied after a pre-processing stage involving the use of a 3D anisotropic filter along with histogram equalization techniques. The segmentation algorithm makes use of prior knowledge and a probability-based multivariate model designed to semi-automate the process of segmentation. The algorithm was applied to images obtained from the Center for Morphometric Analysis at Massachusetts General Hospital as part of the Internet Brain Segmentation Repository (IBSR). The developed algorithm showed improved accuracy over the k-means, adaptive maximum a posteriori (MAP), biased MAP, and other algorithms. Experimental results showing the segmentation and the results of comparisons with other algorithms are provided. Results are based on an overlap criterion against expertly segmented images from the IBSR. The algorithm produced an average of approximately 80% overlap with the expertly segmented images (compared with 85% for manual segmentation and 55% for other algorithms).

  16. Statistical process control using optimized neural networks: a case study.

    PubMed

    Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid

    2014-09-01

    The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart demonstrates that the process has altered by generating an out-of-control signal. This study investigates the design of an accurate system for control chart pattern (CCP) recognition in two respects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape features and statistical features is proposed as the efficient characteristics of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, probabilistic neural network, and radial basis function network, are investigated. Based on an experimental study, the best classifier is chosen to recognize the CCPs. Second, a hybrid heuristic recognition system is introduced based on the cuckoo optimization algorithm (COA) to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  17. Statistical optimization of process parameters for lipase-catalyzed synthesis of triethanolamine-based esterquats using response surface methodology in 2-liter bioreactor.

    PubMed

    Masoumi, Hamid Reza Fard; Basri, Mahiran; Kassim, Anuar; Abdullah, Dzulkefly Kuang; Abdollahi, Yadollah; Abd Gani, Siti Salwa; Rezaee, Malahat

    2013-01-01

    Lipase-catalyzed production of a triethanolamine-based esterquat by esterification of oleic acid (OA) with triethanolamine (TEA) in n-hexane was performed in a 2 L stirred-tank reactor. A set of experiments was designed by central composite design for process modeling and statistical evaluation of the findings. Five independent process variables, including enzyme amount, reaction time, reaction temperature, substrate molar ratio of OA to TEA, and agitation speed, were studied under the given conditions designed by Design Expert software. Experimental data were tested for normality before the data processing stage, and skewness and kurtosis indices were determined. The mathematical model developed was found to be adequate and statistically accurate in predicting the optimum conversion of product. Response surface methodology with central composite design gave the best performance in this study, and the methodology as a whole has proven adequate for the design and optimization of the enzymatic process.
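
    Once the central composite design has been run, the fitted model is an ordinary second-order response surface. A least-squares sketch over coded factors (Python; the data layout is hypothetical):

      import numpy as np

      def fit_quadratic_response_surface(X, y):
          # y ~ b0 + sum(b_i x_i) + sum(b_ii x_i^2) + sum(b_ij x_i x_j),
          # fitted by ordinary least squares over coded design factors.
          n, k = X.shape
          cols = [np.ones(n)]
          cols += [X[:, i] for i in range(k)]
          cols += [X[:, i] ** 2 for i in range(k)]
          cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
          A = np.column_stack(cols)
          beta, *_ = np.linalg.lstsq(A, y, rcond=None)
          return beta

      # X: coded settings of the five factors from the design matrix;
      # y: measured conversion. The fitted surface is then optimized numerically.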

  18. A statistical approach to the brittle fracture of a multi-phase solid

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Lua, Y. I.; Belytschko, T.

    1991-01-01

    A stochastic damage model is proposed to quantify the inherent statistical distribution of the fracture toughness of a brittle, multi-phase solid. The model, based on the macrocrack-microcrack interaction, incorporates uncertainties in the locations and orientations of microcracks. Due to the high concentration of microcracks near the macro-tip, a higher order analysis based on traction boundary integral equations is formulated first for an arbitrary array of cracks. The effects of uncertainties in locations and orientations of microcracks at a macro-tip are analyzed quantitatively by using the boundary integral equation method in conjunction with computer simulation of the random microcrack array. The short-range interactions resulting from the surrounding microcracks closest to the main crack tip are investigated. The effects of the microcrack density parameter are also explored in the present study. The validity of the present model is demonstrated by comparing its statistical output with the Neville distribution function, which gives correct fits to sets of experimental data from multi-phase solids.

  19. Modeling the milling tool wear by using an evolutionary SVM-based model from milling runs experimental data

    NASA Astrophysics Data System (ADS)

    Nieto, Paulino José García; García-Gonzalo, Esperanza; Vilán, José Antonio Vilán; Robleda, Abraham Segade

    2015-12-01

    The main aim of this research work is to build a new practical hybrid regression model to predict milling tool wear in a regular cut as well as entry cut and exit cut of a milling tool. The model was based on Particle Swarm Optimization (PSO) in combination with support vector machines (SVMs). This optimization mechanism involved kernel parameter setting in the SVM training procedure, which significantly influences the regression accuracy. Bearing this in mind, a PSO-SVM-based model, grounded in statistical learning theory, was successfully used here to predict the milling tool flank wear (output variable) as a function of the following input variables: the time duration of the experiment, depth of cut, feed, type of material, etc. To accomplish the objective of this study, the experimental dataset represents experiments from runs on a milling machine under various operating conditions. In this way, data sampled by three different types of sensors (acoustic emission sensor, vibration sensor and current sensor) were acquired at several positions. A second aim is to determine the factors with the greatest bearing on milling tool flank wear with a view to proposing improvements to the milling machine. Firstly, this hybrid PSO-SVM-based regression model captures the main perception of statistical learning theory in order to obtain a good prediction of the dependence of the flank wear (output variable) on the input variables (time, depth of cut, feed, etc.). Indeed, regression with optimal hyperparameters was performed and a determination coefficient of 0.95 was obtained. The agreement of this model with the experimental data confirmed its good performance. Secondly, the main advantages of this PSO-SVM-based model are its capacity to produce a simple, easy-to-interpret model, its ability to estimate the contributions of the input variables, and its computational efficiency. Finally, the main conclusions of this study are presented.
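
    The PSO-SVM coupling can be sketched compactly in Python/scikit-learn: particles move through (log C, log gamma) space and are scored by cross-validated R^2. The swarm size, inertia, and acceleration constants below are generic defaults, not the paper's settings.

      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVR

      rng = np.random.default_rng(1)

      def pso_svr(X, y, n_particles=12, n_iter=25):
          # Particles search (log10 C, log10 gamma); fitness = CV R^2.
          lo, hi = np.array([-2.0, -4.0]), np.array([3.0, 1.0])
          pos = rng.uniform(lo, hi, (n_particles, 2))
          vel = np.zeros_like(pos)

          def fitness(p):
              return cross_val_score(SVR(C=10 ** p[0], gamma=10 ** p[1]),
                                     X, y, cv=5, scoring="r2").mean()

          pbest = pos.copy()
          pbest_val = np.array([fitness(p) for p in pos])
          gbest = pbest[pbest_val.argmax()].copy()
          for _ in range(n_iter):
              r1, r2 = rng.random((2, n_particles, 1))
              vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
              pos = np.clip(pos + vel, lo, hi)
              vals = np.array([fitness(p) for p in pos])
              better = vals > pbest_val
              pbest[better], pbest_val[better] = pos[better], vals[better]
              gbest = pbest[pbest_val.argmax()].copy()
          return SVR(C=10 ** gbest[0], gamma=10 ** gbest[1]).fit(X, y)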

  20. Studies of concentration and temperature dependences of precipitation kinetics in iron-copper alloys using kinetic Monte Carlo and stochastic statistical simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khromov, K. Yu.; Vaks, V. G., E-mail: vaks@mbslab.kiae.ru; Zhuravlev, I. A.

    2013-02-15

    The previously developed ab initio model and the kinetic Monte Carlo method (KMCM) are used to simulate precipitation in a number of iron-copper alloys with different copper concentrations x and temperatures T. The same simulations are also made using an improved version of the previously suggested stochastic statistical method (SSM). The results obtained enable us to make a number of general conclusions about the dependences of the decomposition kinetics in Fe-Cu alloys on x and T. We also show that the SSM usually describes the precipitation kinetics in good agreement with the KMCM, and using the SSM in conjunction with the KMCM allows extending the KMC simulations to longer evolution times. The results of simulations seem to agree with available experimental data for Fe-Cu alloys within the statistical errors of the simulations and the scatter of experimental results. Comparison of simulation results with experiments for some multicomponent Fe-Cu-based alloys allows making certain conclusions about the influence of alloying elements in these alloys on the precipitation kinetics at different stages of evolution.

  1. Experimental and environmental factors affect spurious detection of ecological thresholds

    USGS Publications Warehouse

    Daily, Jonathan P.; Hitt, Nathaniel P.; Smith, David; Snyder, Craig D.

    2012-01-01

    Threshold detection methods are increasingly popular for assessing nonlinear responses to environmental change, but their statistical performance remains poorly understood. We simulated linear change in stream benthic macroinvertebrate communities and evaluated the performance of commonly used threshold detection methods based on model fitting (piecewise quantile regression [PQR]), data partitioning (nonparametric change point analysis [NCPA]), and a hybrid approach (significant zero crossings [SiZer]). We demonstrated that false detection of ecological thresholds (type I errors) and inferences on threshold locations are influenced by sample size, rate of linear change, and frequency of observations across the environmental gradient (i.e., sample-environment distribution, SED). However, the relative importance of these factors varied among statistical methods and between inference types. False detection rates were influenced primarily by user-selected parameters for PQR (τ) and SiZer (bandwidth) and secondarily by sample size (for PQR) and SED (for SiZer). In contrast, the location of reported thresholds was influenced primarily by SED. Bootstrapped confidence intervals for NCPA threshold locations revealed strong correspondence to SED. We conclude that the choice of statistical methods for threshold detection should be matched to experimental and environmental constraints to minimize false detection rates and avoid spurious inferences regarding threshold location.
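
    A bare-bones breakpoint scan in Python conveys the mechanics that PQR, NCPA, and SiZer formalize, and also why spurious detections arise: on truly linear data the scan still returns some breakpoint, so its significance must be tested separately (e.g., against a single-line fit or by permutation).

      import numpy as np

      def best_breakpoint(x, y, min_pts=5):
          # Exhaustive scan for the two-segment linear fit minimizing total SSE.
          order = np.argsort(x)
          x, y = x[order], y[order]
          best_sse, best_x = np.inf, None
          for i in range(min_pts, len(x) - min_pts):
              sse = 0.0
              for xs, ys in ((x[:i], y[:i]), (x[i:], y[i:])):
                  c = np.polyfit(xs, ys, 1)
                  sse += np.sum((ys - np.polyval(c, xs)) ** 2)
              if sse < best_sse:
                  best_sse, best_x = sse, x[i]
          return best_x   # candidate threshold location (always returned!)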

  2. Bounding the Set of Classical Correlations of a Many-Body System

    NASA Astrophysics Data System (ADS)

    Fadel, Matteo; Tura, Jordi

    2017-12-01

    We present a method to certify the presence of Bell correlations in experimentally observed statistics, and to obtain new Bell inequalities. Our approach is based on relaxing the conditions defining the set of correlations obeying a local hidden variable model, yielding a convergent hierarchy of semidefinite programs (SDPs). Because the size of these SDPs is independent of the number of parties involved, this technique allows us to characterize correlations in many-body systems. As an example, we illustrate our method with the experimental data presented in Science 352, 441 (2016), 10.1126/science.aad8665.

  3. Adhesive retention of experimental fiber-reinforced composite, orthodontic acrylic resin, and aliphatic urethane acrylate to silicone elastomer for maxillofacial prostheses.

    PubMed

    Kosor, Begüm Yerci; Artunç, Celal; Şahan, Heval

    2015-07-01

    A key factor of an implant-retained facial prosthesis is the success of the bonding between the substructure and the silicone elastomer. Little has been reported on the bonding of fiber-reinforced composite (FRC) to silicone elastomers. Experimental FRC could be a solution for facial prostheses supported by light-activated aliphatic urethane acrylate, orthodontic acrylic resin, or commercially available FRCs. The purpose of this study was to evaluate the bonding of an experimental FRC, orthodontic acrylic resin, and light-activated aliphatic urethane acrylate to a commercially available high-temperature-vulcanizing silicone elastomer. Shear and 180-degree peel bond strengths of 3 different substructures (experimental FRC, orthodontic acrylic resin, light-activated aliphatic urethane acrylate) (n=15) to a high-temperature-vulcanizing maxillofacial silicone elastomer (M511) with a primer (G611) were assessed after 200 hours of accelerated artificial light-aging. The specimens were tested in a universal testing machine at a cross-head speed of 10 mm/min. Data were collected and statistically analyzed by 1-way ANOVA, followed by the Bonferroni correction and the Dunnett post hoc test (α=.05). Modes of failure were visually determined, categorized as adhesive, cohesive, or mixed, and statistically analyzed with the chi-squared goodness-of-fit test (α=.05). When the mean shear bond strength values were evaluated statistically, no difference was found among the experimental FRC, aliphatic urethane acrylate, and orthodontic acrylic resin subgroups (P>.05). The mean peel bond strengths of the experimental FRC and aliphatic urethane acrylate were not statistically different (P>.05). The mean peel bond strength of the orthodontic acrylic resin subgroup was statistically lower (P<.05). Shear test failure types were statistically different (P<.05), whereas 180-degree peel test failure types were not (P>.05). Shear forces predominantly produced cohesive failure (64.4%), whereas peel forces predominantly produced adhesive failure (93.3%). The mean shear bond strengths of the experimental FRC and aliphatic urethane acrylate groups were not statistically different (P>.05). The mean 180-degree peel strength of the orthodontic acrylic resin group was lower (P<.05). Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  4. A Mechanical Power Flow Capability for the Finite Element Code NASTRAN

    DTIC Science & Technology

    1989-07-01

    Approaches considered include experimental methods, statistical energy analysis (SEA), the finite element method, and a finite element analogy using heat conduction equations. The weights and inertias of transducers attached to an experimental structure may produce accuracy problems. (Fragmentary DTIC snippet; among the recoverable citations: Lyon, R.L., Statistical Energy Analysis of Dynamical Systems, The M.I.T. Press, 1975.)

  5. A Laboratory Course for Teaching Laboratory Techniques, Experimental Design, Statistical Analysis, and Peer Review Process to Undergraduate Science Students

    ERIC Educational Resources Information Center

    Gliddon, C. M.; Rosengren, R. J.

    2012-01-01

    This article describes a 13-week laboratory course called Human Toxicology taught at the University of Otago, New Zealand. This course used a guided inquiry based laboratory coupled with formative assessment and collaborative learning to develop in undergraduate students the skills of problem solving/critical thinking, data interpretation and…

  6. Fire and Smoke Model Evaluation Experiment (FASMEE): Modeling gaps and data needs

    Treesearch

    Yongqiang Liu; Adam Kochanski; Kirk Baker; Ruddy Mell; Rodman Linn; Ronan Paugam; Jan Mandel; Aime Fournier; Mary Ann Jenkins; Scott Goodrick; Gary Achtemeier; Andrew Hudak; Matthew Dickson; Brian Potter; Craig Clements; Shawn Urbanski; Roger Ottmar; Narasimhan Larkin; Timothy Brown; Nancy French; Susan Prichard; Adam Watts; Derek McNamara

    2017-01-01

    Fire and smoke models are numerical tools for simulating fire behavior, smoke dynamics, and air quality impacts of wildland fires. Fire models are developed based on the fundamental chemistry and physics of combustion and fire spread or statistical analysis of experimental data (Sullivan 2009). They provide information on fire spread and fuel consumption for safe and...

  7. Deriving Signatures of In Vivo Toxicity Using Both Efficacy and Potency Information from In Vitro Assays: Evaluating Model Performance as a Function of Increasing Variability in Experimental Data

    EPA Science Inventory

    The US EPA ToxCast program aims to develop methods for mechanistically-based chemical prioritization using a suite of high throughput, in vitro assays that probe relevant biological pathways, and coupling them with statistical and machine learning methods that produce predictive ...

  8. Experimental research on mathematical modelling and unconventional control of clinker kiln in cement plants

    NASA Astrophysics Data System (ADS)

    Rusu-Anghel, S.

    2017-01-01

    Analytical modeling of the cement manufacturing process is difficult because of its complexity and has not yielded sufficiently precise mathematical models. In this paper, based on a statistical model of the process and on the knowledge of human experts, a fuzzy system for automatic control of the clinkering process was designed.

  9. Digital Image Analysis of Yeast Single Cells Growing in Two Different Oxygen Concentrations to Analyze the Population Growth and to Assist Individual-Based Modeling

    PubMed Central

    Ginovart, Marta; Carbó, Rosa; Blanco, Mónica; Portell, Xavier

    2018-01-01

    Nowadays control of the growth of Saccharomyces to obtain biomass or cellular wall components is crucial for specific industrial applications. The general aim of this contribution is to deal with experimental data obtained from yeast cells and from yeast cultures to attempt the integration of the two levels of information, individual and population, to progress in the control of yeast biotechnological processes by means of the overall analysis of this set of experimental data, and to assist in the improvement of an individual-based model, namely, INDISIM-Saccha. Populations of S. cerevisiae growing in liquid batch culture, in aerobic and microaerophilic conditions, were studied. A set of digital images was taken during the population growth, and a protocol for the treatment and analyses of the images obtained was established. The piecewise linear model of Buchanan was adjusted to the temporal evolutions of the yeast populations to determine the kinetic parameters and changes of growth phases. In parallel, for all the yeast cells analyzed, values of direct morphological parameters, such as area, perimeter, major diameter, minor diameter, and derived ones, such as circularity and elongation, were obtained. Graphical and numerical methods from descriptive statistics were applied to these data to characterize the growth phases and the budding state of the yeast cells in both experimental conditions, and inferential statistical methods were used to compare the diverse groups of data achieved. Oxidative metabolism of yeast in a medium with oxygen available and low initial sugar concentration can be taken into account in order to obtain a greater number of cells or larger cells. Morphological parameters were analyzed statistically to identify which were the most useful for the discrimination of the different states, according to budding and/or growth phase, in aerobic and microaerophilic conditions. The use of the experimental data for subsequent modeling work was then discussed and compared to simulation results generated with INDISIM-Saccha, which allowed us to advance in the development of this yeast model, and illustrated the utility of data at different levels of observation and the needs and logic behind the development of a microbial individual-based model. PMID:29354112

  10. Development of a Stochastically-driven, Forward Predictive Performance Model for PEMFCs

    NASA Astrophysics Data System (ADS)

    Harvey, David Benjamin Paul

    A one-dimensional multi-scale coupled, transient, and mechanistic performance model for a PEMFC membrane electrode assembly has been developed. The model explicitly includes each of the 5 layers within a membrane electrode assembly and solves for the transport of charge, heat, mass, species, dissolved water, and liquid water. Key features of the model include the use of a multi-step implementation of the HOR reaction on the anode, agglomerate catalyst sub-models for both the anode and cathode catalyst layers, a unique approach that links the composition of the catalyst layer to key properties within the agglomerate model and the implementation of a stochastic input-based approach for component material properties. The model employs a new methodology for validation using statistically varying input parameters and statistically-based experimental performance data; this model represents the first stochastic input driven unit cell performance model. The stochastic input driven performance model was used to identify optimal ionomer content within the cathode catalyst layer, demonstrate the role of material variation in potential low performing MEA materials, provide explanation for the performance of low-Pt loaded MEAs, and investigate the validity of transient-sweep experimental diagnostic methods.

  11. Exploiting Complexity Information for Brain Activation Detection

    PubMed Central

    Zhang, Yan; Liang, Jiali; Lin, Qiang; Hu, Zhenghui

    2016-01-01

    We present a complexity-based approach for the analysis of fMRI time series, in which sample entropy (SampEn) is introduced as a quantification of voxel complexity. The hypothesis is that voxel complexity is modulated by pertinent cognitive tasks and therefore changes across experimental paradigms. We calculate the complexity of sequential fMRI data for each voxel in two distinct experimental paradigms and use a nonparametric statistical strategy, the Wilcoxon signed rank test, to evaluate the difference in complexity between them. The results are compared with the well-known general linear model based Statistical Parametric Mapping package (SPM12), and a marked difference is observed. This is because the SampEn method detects brain complexity changes between the two experimental conditions, and, being data-driven, SampEn evaluates only the complexity of the specific sequential fMRI data. Moreover, larger and smaller SampEn values carry different meanings, and the neutral-blank design produces higher predictability than the threat-neutral one. Complexity information can be considered a complement to existing fMRI analysis strategies, and it may help improve the understanding of human brain function from a different perspective. PMID:27045838
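
    The voxel statistic at the heart of this approach can be computed directly; below is a plain NumPy sketch of sample entropy using the common tolerance choice r = 0.2 SD and Chebyshev distance. It is a simplified illustration, not the authors' pipeline.

```python
# Sketch: simplified sample entropy for one time series.
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r) = -ln(A/B): B counts template-pair matches of length m,
    A counts matches of length m + 1 (Chebyshev distance, no self-matches)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()                      # common tolerance choice
    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        count = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count
    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(0)
print(sample_entropy(rng.standard_normal(200)))          # noise: high complexity
print(sample_entropy(np.sin(np.linspace(0, 20, 200))))   # regular: low SampEn
```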

  12. Fault detection, isolation, and diagnosis of self-validating multifunctional sensors.

    PubMed

    Yang, Jing-Li; Chen, Yin-Sheng; Zhang, Li-Li; Sun, Zhen

    2016-06-01

    A novel fault detection, isolation, and diagnosis (FDID) strategy for self-validating multifunctional sensors is presented in this paper. The sparse non-negative matrix factorization-based method can effectively detect faults by using the squared prediction error (SPE) statistic, and variable contribution plots based on the SPE statistic can help to locate and isolate the faulty sensitive units. Complete ensemble empirical mode decomposition is employed to decompose the fault signals into a series of intrinsic mode functions (IMFs) and a residual. The sample entropy (SampEn)-weighted energy values of each IMF and of the residual are estimated to represent the characteristics of the fault signals. A multi-class support vector machine is introduced to identify the fault mode and thereby diagnose the status of the faulty sensitive units. The performance of the proposed strategy is compared with other fault detection strategies, such as principal component analysis and independent component analysis, and with fault diagnosis strategies such as empirical mode decomposition coupled with a support vector machine. The proposed strategy is fully evaluated in a real self-validating multifunctional sensor experimental system, and the experimental results demonstrate that the proposed strategy provides an excellent solution to the FDID research topic for self-validating multifunctional sensors.
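
    To make the detection step concrete, the following is a hedged sketch of the SPE (Q) statistic and its per-variable contribution plot, built on scikit-learn's ordinary NMF as a stand-in for the paper's sparse non-negative matrix factorization; the training data and the injected fault are synthetic.

```python
# Sketch: SPE-based fault detection with per-variable contributions.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
X_train = np.abs(rng.standard_normal((200, 8)))          # normal-operation data
model = NMF(n_components=3, init="nndsvda", max_iter=500).fit(X_train)

def spe_and_contributions(x):
    w = model.transform(x)
    residual = x - w @ model.components_                 # reconstruction error
    return np.sum(residual**2, axis=1), residual**2      # SPE, contributions

x_fault = np.abs(rng.standard_normal((1, 8)))
x_fault[0, 5] += 4.0                                     # fault injected on unit 5
spe, contrib = spe_and_contributions(x_fault)
print(spe[0], contrib.argmax())                          # large SPE; unit 5 isolated
```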

  13. Photoconductivity enhancement and charge transport properties in ruthenium-containing block copolymer/carbon nanotube hybrids.

    PubMed

    Lo, Kin Cheung; Hau, King In; Chan, Wai Kin

    2018-04-05

    Functional polymer/carbon nanotube (CNT) hybrid materials can serve as a good model for light harvesting systems based on CNTs. This paper presents the synthesis of block copolymer/CNT hybrids and the characterization of their photocurrent responses by both experimental and computational approaches. A series of functional diblock copolymers was synthesized by reversible addition-fragmentation chain transfer polymerization for the dispersion and functionalization of CNTs. The block copolymers contain photosensitizing ruthenium complexes and modified pyrene-based anchoring units. The photocurrent responses of the polymer/CNT hybrids were measured by photoconductive atomic force microscopy (PCAFM), and the experimental data were analyzed with rigorous statistical models. The difference in photocurrent response among the hybrids was correlated with the conformations of the hybrids, elucidated by molecular dynamics simulations, and with the electronic properties of the polymers. The photoresponse of the block copolymer/CNT hybrids can be enhanced by introducing an electron-accepting block between the photosensitizing block and the CNT. We have demonstrated that the application of a rigorous statistical methodology can unravel the charge transport properties of these hybrid materials and provide general guidelines for the design of molecular light harvesting systems.

  14. On the use of attractor dimension as a feature in structural health monitoring

    USGS Publications Warehouse

    Nichols, J.M.; Virgin, L.N.; Todd, M.D.; Nichols, J.D.

    2003-01-01

    Recent works in the vibration-based structural health monitoring community have emphasised the use of correlation dimension as a discriminating statistic in separating a damaged from an undamaged response. This paper explores the utility of attractor dimension as a 'feature' and offers some comparisons between different metrics reflecting dimension. The focus is on evaluating the performance of two different measures of dimension as damage indicators in a structural health monitoring context. Results indicate that the correlation dimension is probably a poor choice of statistic for the purpose of signal discrimination. Other measures of dimension may be used for the same purposes with a higher degree of statistical reliability. The question of competing methodologies is placed in a hypothesis testing framework and answered with experimental data taken from a cantilevered beam.
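
    For readers unfamiliar with the statistic under scrutiny, the following sketch estimates correlation dimension via the Grassberger-Procaccia correlation sum on a delay-embedded series; the embedding parameters and radius grid are illustrative choices, not the authors' settings.

```python
# Sketch: Grassberger-Procaccia correlation dimension of a delay embedding.
import numpy as np
from scipy.spatial.distance import pdist

def correlation_dimension(x, dim=4, tau=2):
    """Slope of log C(r) vs log r, where C(r) is the correlation sum."""
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    d = pdist(emb, metric="chebyshev")
    radii = np.logspace(np.log10(np.percentile(d, 1)), np.log10(d.max()), 20)[2:-2]
    c = np.array([np.mean(d < r) for r in radii])    # correlation sum C(r)
    mask = c > 0
    return np.polyfit(np.log(radii[mask]), np.log(c[mask]), 1)[0]

t = np.linspace(0, 100, 2000)
print(correlation_dimension(np.sin(t)))   # a limit cycle should give roughly 1
```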

  15. Classification image analysis: estimation and statistical inference for two-alternative forced-choice experiments

    NASA Technical Reports Server (NTRS)

    Abbey, Craig K.; Eckstein, Miguel P.

    2002-01-01

    We consider estimation and statistical hypothesis testing on classification images obtained from the two-alternative forced-choice experimental paradigm. We begin with a probabilistic model of task performance for simple forced-choice detection and discrimination tasks. Particular attention is paid to general linear filter models because these models lead to a direct interpretation of the classification image as an estimate of the filter weights. We then describe an estimation procedure for obtaining classification images from observer data. A number of statistical tests are presented for testing various hypotheses about classification images, based on a more compact set of features derived from them. As an example of how the methods we describe can be used, we present a case study investigating detection of a Gaussian bump profile.
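
    Under the general linear filter model just described, a classification image can be estimated by regressing trial-by-trial responses on the trial noise fields; the sketch below does so with logistic regression on synthetic 2AFC data. The setup (one noise-difference vector per trial, signal fixed in the first interval) is an assumed simplification for illustration, not the authors' estimator.

```python
# Sketch: recover a linear observer's template from simulated 2AFC trials.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n_trials, n_pix = 4000, 64
true_filter = np.exp(-((np.arange(n_pix) - 32.0) ** 2) / 50.0)   # Gaussian bump

noise_diff = rng.standard_normal((n_trials, n_pix))  # noise(int. 1) - noise(int. 2)
signal = 0.5 * true_filter                           # signal shown in interval 1
decision = (noise_diff @ true_filter + signal @ true_filter
            + rng.standard_normal(n_trials))         # plus internal noise
resp = (decision > 0).astype(int)                    # 1 = chose interval 1

ci = LogisticRegression(max_iter=1000).fit(noise_diff, resp).coef_.ravel()
print(np.corrcoef(ci, true_filter)[0, 1])            # estimate tracks the template
```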

  16. Noise removing in encrypted color images by statistical analysis

    NASA Astrophysics Data System (ADS)

    Islam, N.; Puech, W.

    2012-03-01

    Cryptographic techniques are used to secure confidential data from unauthorized access, but these techniques are very sensitive to noise: a single bit change in encrypted data can have a catastrophic impact on the decrypted data. This paper addresses the problem of removing bit errors in visual data encrypted with the AES algorithm in CBC mode. In order to remove the noise, a method is proposed that is based on the statistical analysis of each block during decryption. The proposed method exploits local statistics of the visual data and the confusion/diffusion properties of the encryption algorithm to remove the errors. Experimental results show that the proposed method can be used at the receiving end as a possible solution for noise removal from visual data in the encrypted domain.

  17. Microscopic saw mark analysis: an empirical approach.

    PubMed

    Love, Jennifer C; Derrick, Sharon M; Wiersema, Jason M; Peters, Charles

    2015-01-01

    Microscopic saw mark analysis is a well published and generally accepted qualitative analytical method. However, little research has focused on identifying and mitigating potential sources of error associated with the method. The presented study proposes the use of classification trees and random forest classifiers as an optimal, statistically sound approach to mitigating variability and outcome error in microscopic saw mark analysis. The statistical model was applied to 58 experimental saw marks created with four types of saws. The saw marks were made in fresh human femurs obtained through anatomical gift and were analyzed using a Keyence digital microscope. The statistical approach weighted the variables by discriminatory value and produced decision trees with an associated outcome error rate of 8.62-17.82%. © 2014 American Academy of Forensic Sciences.
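
    A minimal sketch of the classifier side of this approach, using scikit-learn's random forest with a cross-validated outcome error rate; the 58-by-6 feature matrix below is a synthetic placeholder for the scored mark characteristics, which the abstract does not enumerate.

```python
# Sketch: random-forest classification of saw type with cross-validated error.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(58, 6))            # placeholder mark features
y = rng.integers(0, 4, size=58)         # four saw types

forest = RandomForestClassifier(n_estimators=500, random_state=0)
acc = cross_val_score(forest, X, y, cv=5)
print("outcome error rate:", 1 - acc.mean())

forest.fit(X, y)
print("feature weights:", forest.feature_importances_)   # discriminatory value
```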

  18. Identifying Novel Phenotypes of Vulnerability and Resistance to Activity-Based Anorexia in Adolescent Female Rats

    PubMed Central

    Barbarich-Marsteller, Nicole C.; Underwood, Mark D.; Foltin, Richard W.; Myers, Michael M.; Walsh, B. Timothy; Barrett, Jeffrey S.; Marsteller, Douglas A.

    2018-01-01

    Objective: Activity-based anorexia is a translational rodent model that results in severe weight loss, hyperactivity, and voluntary self-starvation. The goal of our investigation was to identify vulnerable and resistant phenotypes of activity-based anorexia in adolescent female rats. Method: Sprague-Dawley rats were maintained under conditions of restricted access to food (N = 64; or unlimited access, N = 16) until experimental exit, predefined as a target weight loss of 30–35% or meeting predefined criteria for animal health. Nonlinear mixed effects statistical modeling was used to describe wheel running behavior, time to event analysis was used to assess experimental exit, and a regressive partitioning algorithm was used to classify phenotypes. Results: Objective criteria were identified for distinguishing novel phenotypes of activity-based anorexia, including a vulnerable phenotype that conferred maximal hyperactivity, minimal food intake, and the shortest time to experimental exit, and a resistant phenotype that conferred minimal activity and the longest time to experimental exit. Discussion: The identification of objective criteria for defining vulnerable and resistant phenotypes of activity-based anorexia in adolescent female rats provides an important framework for studying the neural mechanisms that promote vulnerability to or protection against the development of self-starvation and hyperactivity during adolescence. Ultimately, future studies using these novel phenotypes may provide important translational insights into the mechanisms that promote these maladaptive behaviors characteristic of anorexia nervosa. PMID:23853140

  19. Identifying novel phenotypes of vulnerability and resistance to activity-based anorexia in adolescent female rats.

    PubMed

    Barbarich-Marsteller, Nicole C; Underwood, Mark D; Foltin, Richard W; Myers, Michael M; Walsh, B Timothy; Barrett, Jeffrey S; Marsteller, Douglas A

    2013-11-01

    Activity-based anorexia is a translational rodent model that results in severe weight loss, hyperactivity, and voluntary self-starvation. The goal of our investigation was to identify vulnerable and resistant phenotypes of activity-based anorexia in adolescent female rats. Sprague-Dawley rats were maintained under conditions of restricted access to food (N = 64; or unlimited access, N = 16) until experimental exit, predefined as a target weight loss of 30-35% or meeting predefined criteria for animal health. Nonlinear mixed effects statistical modeling was used to describe wheel running behavior, time to event analysis was used to assess experimental exit, and a regressive partitioning algorithm was used to classify phenotypes. Objective criteria were identified for distinguishing novel phenotypes of activity-based anorexia, including a vulnerable phenotype that conferred maximal hyperactivity, minimal food intake, and the shortest time to experimental exit, and a resistant phenotype that conferred minimal activity and the longest time to experimental exit. The identification of objective criteria for defining vulnerable and resistant phenotypes of activity-based anorexia in adolescent female rats provides an important framework for studying the neural mechanisms that promote vulnerability to or protection against the development of self-starvation and hyperactivity during adolescence. Ultimately, future studies using these novel phenotypes may provide important translational insights into the mechanisms that promote these maladaptive behaviors characteristic of anorexia nervosa. Copyright © 2013 Wiley Periodicals, Inc.

  20. A theoretical framework for analyzing coupled neuronal networks: Application to the olfactory system.

    PubMed

    Barreiro, Andrea K; Gautam, Shree Hari; Shew, Woodrow L; Ly, Cheng

    2017-10-01

    Determining how synaptic coupling within and between regions is modulated during sensory processing is an important topic in neuroscience. Electrophysiological recordings provide detailed information about neural spiking but have traditionally been confined to a particular region or layer of cortex. Here we develop new theoretical methods to study interactions between and within two brain regions, based on experimental measurements of spiking activity simultaneously recorded from the two regions. By systematically comparing experimentally-obtained spiking statistics to (efficiently computed) model spike rate statistics, we identify regions in model parameter space that are consistent with the experimental data. We apply our new technique to dual micro-electrode array in vivo recordings from two distinct regions: the olfactory bulb (OB) and the anterior piriform cortex (PC). Our analysis predicts that: i) inhibition within the afferent region (OB) has to be weaker than the inhibition within PC, ii) excitation from PC to OB is generally stronger than excitation from OB to PC, and iii) excitation from PC to OB and inhibition within PC both have to be relatively strong compared to presynaptic inputs from OB. These predictions are validated in a spiking neural network model of the OB-PC pathway that satisfies the many constraints from our experimental data. We find that when the derived relationships are violated, the spiking statistics no longer satisfy the constraints from the data. In principle this modeling framework can be adapted to other systems and used to investigate relationships between other neural attributes besides network connection strengths. Thus, this work can serve as a guide to further investigations into the relationships of various neural attributes within and across different regions during sensory processing.

  1. Density profiles in the Scrape-Off Layer interpreted through filament dynamics

    NASA Astrophysics Data System (ADS)

    Militello, Fulvio

    2017-10-01

    We developed a new theoretical framework to clarify the relation between radial Scrape-Off Layer density profiles and the fluctuations that generate them. The framework provides an interpretation of the experimental features of the profiles and of the turbulence statistics on the basis of simple properties of the filaments, such as their radial motion and their draining towards the divertor. L-mode and inter-ELM filaments are described as a Poisson process in which each event is independent and modelled with a wave function of amplitude and width statistically distributed according to experimental observations and evolving according to fluid equations. We will rigorously show that radially accelerating filaments, less efficient parallel exhaust and also a statistical distribution of their radial velocity can contribute to induce flatter profiles in the far SOL and therefore enhance plasma-wall interactions. A quite general result of our analysis is the resiliency of this non-exponential nature of the profiles and the increase of the relative fluctuation amplitude towards the wall, as experimentally observed. According to the framework, profile broadening at high fueling rates can be caused by interactions with neutrals (e.g. charge exchange) in the divertor or by a significant radial acceleration of the filaments. The framework assumptions were tested with 3D numerical simulations of seeded SOL filaments based on a two fluid model. In particular, filaments interact through the electrostatic field they generate only when they are in close proximity (separation comparable to their width in the drift plane), thus justifying our independence hypothesis. In addition, we will discuss how isolated filament motion responds to variations in the plasma conditions, and specifically divertor conditions. Finally, using the theoretical framework we will reproduce and interpret experimental results obtained on JET, MAST and HL-2A.
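
    The core statistical assumption here (filaments as a Poisson process of independent events, each with a statistically distributed amplitude and width) can be illustrated with a simple shot-noise synthesis; every distribution and parameter in the sketch below is a hypothetical stand-in rather than a value from the paper.

```python
# Shot-noise illustration of the filament framework's independence assumption.
import numpy as np

rng = np.random.default_rng(4)
T, rate = 1000.0, 0.5                      # record length, arrival rate (assumed)
n_events = rng.poisson(rate * T)
t0 = rng.uniform(0.0, T, n_events)         # Poisson arrivals: independent events
amp = rng.exponential(1.0, n_events)       # amplitude distribution (assumed)
width = rng.gamma(2.0, 1.0, n_events)      # width distribution (assumed)

t = np.linspace(0.0, T, 20000)
signal = sum(a * np.exp(-(((t - c) / w) ** 2)) for a, c, w in zip(amp, t0, width))
print("relative fluctuation amplitude:", signal.std() / signal.mean())
```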

  2. A common base method for analysis of qPCR data and the application of simple blocking in qPCR experiments.

    PubMed

    Ganger, Michael T; Dietz, Geoffrey D; Ewing, Sarah J

    2017-12-01

    qPCR has established itself as the technique of choice for the quantification of gene expression. Procedures for conducting qPCR have received significant attention; however, more rigorous approaches to the statistical analysis of qPCR data are needed. Here we develop a mathematical model, termed the Common Base Method, for the analysis of qPCR data based on threshold cycle values (Cq) and reaction efficiencies (E). The Common Base Method keeps all calculations in the log scale as long as possible by working with log10(E) · Cq, which we call the efficiency-weighted Cq value; subsequent statistical analyses are then applied in the log scale. We show how efficiency-weighted Cq values may be analyzed using a simple paired or unpaired experimental design, and we develop blocking methods to help reduce unexplained variation. The Common Base Method has several advantages. It allows for the incorporation of well-specific efficiencies and multiple reference genes. The method does not necessitate the pairing of samples that must be performed under traditional analysis methods in order to calculate relative expression ratios. Our method is also simple enough to be implemented in any spreadsheet or statistical software without additional scripts or proprietary components.
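
    A minimal sketch of the method's central quantity, with hypothetical efficiencies and Cq values: compute efficiency-weighted Cq values, normalize by a reference gene while staying in log scale, and compare arms with an unpaired t-test.

```python
# Sketch: efficiency-weighted Cq analysis in log10 scale (values hypothetical).
import numpy as np
from scipy import stats

def weighted_cq(E, cq):
    return np.log10(E) * np.asarray(cq)     # efficiency-weighted Cq, log10 scale

# Target minus reference per sample (three replicates per arm, made-up numbers):
ctrl = weighted_cq(1.95, [21.3, 21.6, 21.1]) - weighted_cq(1.90, [17.2, 17.4, 17.0])
trt = weighted_cq(1.95, [19.0, 19.4, 18.8]) - weighted_cq(1.90, [17.1, 17.5, 17.2])

t_stat, p = stats.ttest_ind(ctrl, trt)
log10_ratio = ctrl.mean() - trt.mean()      # log10 expression ratio, trt vs ctrl
print(f"fold change ~ {10**log10_ratio:.2f}, p = {p:.4f}")
```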

  3. Linear and Order Statistics Combiners for Pattern Classification

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Ghosh, Joydeep; Lau, Sonie (Technical Monitor)

    2001-01-01

    Several researchers have experimentally shown that substantial improvements can be obtained in difficult pattern recognition problems by combining or integrating the outputs of multiple classifiers. This chapter provides an analytical framework to quantify the improvements in classification results due to combining. The results apply to both linear combiners and order statistics combiners. We first show that, to a first-order approximation, the error rate obtained over and above the Bayes error rate is directly proportional to the variance of the actual decision boundaries around the Bayes optimum boundary. Combining classifiers in output space reduces this variance, and hence reduces the 'added' error. If N unbiased classifiers are combined by simple averaging, the added error rate can be reduced by a factor of N if the individual errors in approximating the decision boundaries are uncorrelated. Expressions are then derived for linear combiners which are biased or correlated, and the effect of output correlations on ensemble performance is quantified. For order statistics based non-linear combiners, we derive expressions that indicate how much the median, the maximum, and in general the i-th order statistic can improve classifier performance. The analysis presented here facilitates the understanding of the relationships among error rates, classifier boundary distributions, and combining in output space. Experimental results on several public domain data sets are provided to illustrate the benefits of combining and to support the analytical results.
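
    The factor-of-N claim is easy to check numerically; the toy simulation below compares the variance of a single unbiased boundary estimate with that of an average of N uncorrelated ones.

```python
# Toy check: averaging N unbiased, uncorrelated estimates cuts variance ~N-fold.
import numpy as np

rng = np.random.default_rng(5)
n_trials, N = 10000, 8
single = rng.normal(0.0, 1.0, size=n_trials)                      # one classifier
averaged = rng.normal(0.0, 1.0, size=(n_trials, N)).mean(axis=1)  # N combined
print(single.var() / averaged.var())    # close to N when errors are uncorrelated
```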

  4. AGR-1 Thermocouple Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeff Einerson

    2012-05-01

    This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, the AGR-1 test configuration and test procedure, an overview of AGR-1 measured data, and an overview of the physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of the statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties arising from the combined analysis is identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in the presentation of AGR-1 measured data (Chapter 2) and in the interpretation of simulation results (Chapter 3). The statistics-based, simulation-aided experimental control procedure for future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters, accounting for possible changes in both physical and thermal conditions and in instrument performance.

  5. Electromagnetic wave scattering from rough terrain

    NASA Astrophysics Data System (ADS)

    Papa, R. J.; Lennon, J. F.; Taylor, R. L.

    1980-09-01

    This report presents two aspects of a program designed to calculate electromagnetic scattering from rough terrain: (1) the use of statistical estimation techniques to determine topographic parameters and (2) the results of a single-roughness-scale scattering calculation based on those parameters, including comparison with experimental data. In the statistical part of the present calculation, digitized topographic maps are used to generate data bases for the required scattering cells. The application of estimation theory to the data leads to the specification of statistical parameters for each cell. The estimated parameters are then used in a hypothesis test to decide on a probability density function (PDF) that represents the height distribution in the cell. Initially, the formulation uses a single observation of the multivariate data. A subsequent approach involves multiple observations of the heights on a bivariate basis, and further refinements are being considered. The electromagnetic scattering analysis, the second topic, calculates the amount of specular and diffuse multipath power reaching a monopulse receiver from a pulsed beacon positioned over a rough Earth. The program allows for spatial inhomogeneities and multiple specular reflection points. The analysis of shadowing by the rough surface has been extended to the case where the surface heights are distributed exponentially. The calculated loss of boresight pointing accuracy attributable to diffuse multipath is then compared with the experimental results. The extent of the specular region, the use of localized height variations, and the effect of the azimuthal variation in power pattern are all assessed.

  6. From statistical proofs of the Kochen-Specker theorem to noise-robust noncontextuality inequalities

    NASA Astrophysics Data System (ADS)

    Kunjwal, Ravi; Spekkens, Robert W.

    2018-05-01

    The Kochen-Specker theorem rules out models of quantum theory wherein projective measurements are assigned outcomes deterministically and independently of context. This notion of noncontextuality is not applicable to experimental measurements because these are never free of noise and thus never truly projective. For nonprojective measurements, therefore, one must drop the requirement that an outcome be assigned deterministically in the model and merely require that it be assigned a distribution over outcomes in a manner that is context-independent. By demanding context independence in the representation of preparations as well, one obtains a generalized principle of noncontextuality that also supports a quantum no-go theorem. Several recent works have shown how to derive inequalities on experimental data which, if violated, demonstrate the impossibility of finding a generalized-noncontextual model of this data. That is, these inequalities do not presume quantum theory and, in particular, they make sense without requiring an operational analog of the quantum notion of projectiveness. We here describe a technique for deriving such inequalities starting from arbitrary proofs of the Kochen-Specker theorem. It extends significantly previous techniques that worked only for logical proofs, which are based on sets of projective measurements that fail to admit of any deterministic noncontextual assignment, to the case of statistical proofs, which are based on sets of projective measurements that do admit of some deterministic noncontextual assignments, but not enough to explain the quantum statistics.

  7. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar, K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: the design phase, the prototype phase, and the operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by a discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.

  8. Image encryption based on a delayed fractional-order chaotic logistic system

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Huang, Xia; Li, Ning; Song, Xiao-Na

    2012-05-01

    A new image encryption scheme is proposed based on a delayed fractional-order chaotic logistic system. In the process of generating a key stream, the time-varying delay and fractional derivative are embedded in the proposed scheme to improve the security. Such a scheme is described in detail with security analyses including correlation analysis, information entropy analysis, run statistic analysis, mean-variance gray value analysis, and key sensitivity analysis. Experimental results show that the newly proposed image encryption scheme possesses high security.

  9. Bubbles and denaturation in DNA

    NASA Astrophysics Data System (ADS)

    van Erp, T. S.; Cuesta-López, S.; Peyrard, M.

    2006-08-01

    The local opening of DNA is an intriguing phenomenon from a statistical-physics point of view, but it is also essential for the biological function of DNA. For instance, the transcription and replication of our genetic code cannot take place without the unwinding of the DNA double helix. Although these biological processes are driven by proteins, there might well be a relation between these biological openings and the spontaneous bubble formation due to thermal fluctuations. Mesoscopic models, like the Peyrard-Bishop-Dauxois (PBD) model, have fairly accurately reproduced some experimental denaturation curves and the sharp phase transition in the thermodynamic limit. It is, hence, tempting to see whether these models could be used to predict the biological activity of DNA. In a previous study, we introduced a method that allows one to obtain very accurate results on this subject, which showed that some previous claims in this direction, based on molecular-dynamics studies, were premature. This could either imply that the present PBD model should be improved or that biological activity can only be predicted in a more complex framework that involves interactions with proteins and superhelical stresses. In this article, we give a detailed description of the statistical method introduced before. Moreover, for several DNA sequences, we give a thorough analysis of the bubble statistics as a function of position and bubble size, and of the so-called l-denaturation curves that can be measured experimentally. These show that some important experimental observations are missing in the present model. We discuss how the present model could be improved.

  10. Inferring gene regression networks with model trees

    PubMed Central

    2010-01-01

    Background: Novel strategies are required in order to handle the huge amount of data produced by microarray technologies. To infer gene regulatory networks, the first step is to find direct regulatory relationships between genes by building the so-called gene co-expression networks. They are typically generated using correlation statistics as pairwise similarity measures. Correlation-based methods are very useful for determining whether two genes have a strong global similarity, but they do not detect local similarities. Results: We propose model trees as a method to identify gene interaction networks. While correlation-based methods analyze each pair of genes, in our approach we generate a single regression tree for each gene from the remaining genes. Finally, a graph of all the relationships among output and input genes is built, taking into account whether each pair of genes is statistically significant. For this reason we apply a statistical procedure to control the false discovery rate. The performance of our approach, named REGNET, is experimentally tested on two well-known data sets: a Saccharomyces cerevisiae data set and an E. coli data set. First, the biological coherence of the results is tested. Second, the E. coli transcriptional network (in the Regulon database) is used as a control to compare the results with those of a correlation-based method. This experiment shows that REGNET performs more accurately at detecting true gene associations than the Pearson and Spearman zeroth- and first-order correlation-based methods. Conclusions: REGNET generates gene association networks from gene expression data, and differs from correlation-based methods in that the relationship between one gene and others is calculated simultaneously. Model trees are very useful techniques for estimating the numerical values of the target genes by linear regression functions. They are often more precise than linear regression models because they can fit different linear regressions to separate areas of the search space, favoring localized similarities over a more global similarity. Furthermore, the experimental results show the good performance of REGNET. PMID:20950452
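
    The per-gene strategy can be sketched as follows; scikit-learn's DecisionTreeRegressor stands in for a true model tree (which would fit linear regressions in its leaves), and a simple importance threshold stands in for the paper's false discovery rate control. The data and the planted regulators are synthetic.

```python
# Sketch: one regression tree per target gene; read candidate regulators off
# the tree's feature importances.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(6)
n_samples, n_genes = 100, 20
expr = rng.standard_normal((n_samples, n_genes))
expr[:, 0] = 0.8 * expr[:, 3] - 0.5 * expr[:, 7] + 0.1 * rng.standard_normal(n_samples)

edges = []
for target in range(n_genes):
    inputs = np.delete(np.arange(n_genes), target)
    tree = DecisionTreeRegressor(max_depth=3, random_state=0)
    tree.fit(expr[:, inputs], expr[:, target])
    for gene, imp in zip(inputs, tree.feature_importances_):
        if imp > 0.1:                       # stand-in for FDR-controlled selection
            edges.append((gene, target, round(imp, 2)))

print(sorted(edges, key=lambda e: -e[2])[:4])   # planted edges 3->0, 7->0 rank high
```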

  11. Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design

    ERIC Educational Resources Information Center

    Wagler, Amy; Wagler, Ron

    2014-01-01

    Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…

  12. Experimental test of quantum nonlocality in three-photon Greenberger-Horne-Zeilinger entanglement

    PubMed

    Pan; Bouwmeester; Daniell; Weinfurter; Zeilinger

    2000-02-03

    Bell's theorem states that certain statistical correlations predicted by quantum physics for measurements on two-particle systems cannot be understood within a realistic picture based on local properties of each individual particle-even if the two particles are separated by large distances. Einstein, Podolsky and Rosen first recognized the fundamental significance of these quantum correlations (termed 'entanglement' by Schrodinger) and the two-particle quantum predictions have found ever-increasing experimental support. A more striking conflict between quantum mechanical and local realistic predictions (for perfect correlations) has been discovered; but experimental verification has been difficult, as it requires entanglement between at least three particles. Here we report experimental confirmation of this conflict, using our recently developed method to observe three-photon entanglement, or 'Greenberger-Horne-Zeilinger' (GHZ) states. The results of three specific experiments, involving measurements of polarization correlations between three photons, lead to predictions for a fourth experiment; quantum physical predictions are mutually contradictory with expectations based on local realism. We find the results of the fourth experiment to be in agreement with the quantum prediction and in striking conflict with local realism.

  13. Impact of Temperature and Non-Gaussian Statistics on Electron Transfer in Donor-Bridge-Acceptor Molecules.

    PubMed

    Waskasi, Morteza M; Newton, Marshall D; Matyushov, Dmitry V

    2017-03-30

    A combination of experimental data and theoretical analysis provides evidence of a bell-shaped kinetics of electron transfer in the Arrhenius coordinates ln k vs 1/T. This kinetic law is a temperature analogue of the familiar Marcus bell-shaped dependence based on ln k vs the reaction free energy. These results were obtained for reactions of intramolecular charge shift between the donor and acceptor separated by a rigid spacer studied experimentally by Miller and co-workers. The non-Arrhenius kinetic law is a direct consequence of the solvent reorganization energy and reaction driving force changing approximately as hyperbolic functions with temperature. The reorganization energy decreases and the driving force increases when temperature is increased. The point of equality between them marks the maximum of the activationless reaction rate. Reaching the consistency between the kinetic and thermodynamic experimental data requires the non-Gaussian statistics of the donor-acceptor energy gap described by the Q-model of electron transfer. The theoretical formalism combines the vibrational envelope of quantum vibronic transitions with the Q-model describing the classical component of the Franck-Condon factor and a microscopic solvation model of the solvent reorganization energy and the reaction free energy.

  14. Flexible statistical modelling detects clinical functional magnetic resonance imaging activation in partially compliant subjects.

    PubMed

    Waites, Anthony B; Mannfolk, Peter; Shaw, Marnie E; Olsrud, Johan; Jackson, Graeme D

    2007-02-01

    Clinical functional magnetic resonance imaging (fMRI) occasionally fails to detect significant activation, often due to variability in task performance. The present study seeks to test whether a more flexible statistical analysis can better detect activation, by accounting for variance associated with variable compliance to the task over time. Experimental results and simulated data both confirm that even at 80% compliance to the task, such a flexible model outperforms standard statistical analysis when assessed using the extent of activation (experimental data), goodness of fit (experimental data), and area under the operator characteristic curve (simulated data). Furthermore, retrospective examination of 14 clinical fMRI examinations reveals that in patients where the standard statistical approach yields activation, there is a measurable gain in model performance in adopting the flexible statistical model, with little or no penalty in lost sensitivity. This indicates that a flexible model should be considered, particularly for clinical patients who may have difficulty complying fully with the study task.

  15. Statistical reporting inconsistencies in experimental philosophy

    PubMed Central

    Colombo, Matteo; Duev, Georgi; Nuijten, Michèle B.; Sprenger, Jan

    2018-01-01

    Experimental philosophy (x-phi) is a young field of research in the intersection of philosophy and psychology. It aims to make progress on philosophical questions by using experimental methods traditionally associated with the psychological and behavioral sciences, such as null hypothesis significance testing (NHST). Motivated by recent discussions about a methodological crisis in the behavioral sciences, questions have been raised about the methodological standards of x-phi. Here, we focus on one aspect of this question, namely the rate of inconsistencies in statistical reporting. Previous research has examined the extent to which published articles in psychology and other behavioral sciences present statistical inconsistencies in reporting the results of NHST. In this study, we used the R package statcheck to detect statistical inconsistencies in x-phi, and compared rates of inconsistencies in psychology and philosophy. We found that rates of inconsistencies in x-phi are lower than in the psychological and behavioral sciences. From the point of view of statistical reporting consistency, x-phi seems to do no worse, and perhaps even better, than psychological science. PMID:29649220
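
    The kind of inconsistency such tools detect can be reproduced in a few lines: recompute the p-value implied by a reported test statistic and its degrees of freedom, and flag mismatches. The Python analogue below is illustrative, not the statcheck package itself.

```python
# Sketch: flag a reported t-test whose p-value does not match its statistic.
from scipy import stats

def check_t_report(t_value, df, reported_p, two_tailed=True, tol=0.01):
    p = stats.t.sf(abs(t_value), df) * (2 if two_tailed else 1)
    return {"recomputed_p": round(p, 4), "consistent": abs(p - reported_p) <= tol}

print(check_t_report(2.31, 28, reported_p=0.028))   # consistent report
print(check_t_report(2.31, 28, reported_p=0.003))   # flagged as inconsistent
```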

  16. Sample size determinations for group-based randomized clinical trials with different levels of data hierarchy between experimental and control arms.

    PubMed

    Heo, Moonseong; Litwin, Alain H; Blackstock, Oni; Kim, Namhee; Arnsten, Julia H

    2017-02-01

    We derived sample size formulae for detecting main effects in group-based randomized clinical trials with different levels of data hierarchy between experimental and control arms. Such designs are necessary when experimental interventions need to be administered to groups of subjects whereas control conditions need to be administered to individual subjects. This type of trial, often referred to as a partially nested or partially clustered design, has been implemented for management of chronic diseases such as diabetes and is beginning to emerge more commonly in wider clinical settings. Depending on the research setting, the level of hierarchy of data structure for the experimental arm can be three or two, whereas that for the control arm is two or one. Such different levels of data hierarchy assume correlation structures of outcomes that are different between arms, regardless of whether research settings require two or three level data structure for the experimental arm. Therefore, the different correlations should be taken into account for statistical modeling and for sample size determinations. To this end, we considered mixed-effects linear models with different correlation structures between experimental and control arms to theoretically derive and empirically validate the sample size formulae with simulation studies.
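
    The abstract does not reproduce the derived formulae; as a hedged illustration of the logic that drives such derivations, the standard design-effect inflation for the clustered arm is sketched below (this is the textbook result, not the paper's exact expression).

```latex
% Illustrative only. With m subjects per therapy group and intraclass
% correlation \rho, within-group correlation in the experimental arm inflates
% the variance of the arm mean by the design effect
\[
  \mathrm{DEFF} = 1 + (m - 1)\,\rho ,
\]
% so a per-arm sample size n from an individually randomized comparison becomes
\[
  n_{\mathrm{exp}} = n \left[ 1 + (m - 1)\,\rho \right]
\]
% for the clustered (experimental) arm, while the unclustered control arm,
% having one less level of hierarchy, keeps approximately the unadjusted n.
```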

  17. Modelling of the UV Index on vertical and 40° tilted planes for different orientations.

    PubMed

    Serrano, D; Marín, M J; Utrillas, M P; Tena, F; Martínez-Lozano, J A

    2012-02-01

    In this study, estimated data of the UV Index on vertical planes are presented for the latitude of Valencia, Spain. For that purpose, UVER values were generated on vertical planes by means of four different geometrical models (isotropic, Perez, Gueymard, and Muneer), based on experimentally measured values of the global horizontal UVER and the diffuse horizontal UVER. The UVER values obtained by any of the models overestimate the experimental values for all orientations, with the exception of the Perez model for the East plane. The results show values of the MAD parameter (Mean Absolute Deviation) between 10% and 25%, the Perez model being the one with the lowest MAD at all levels. As for the RMSD parameter (Root Mean Square Deviation), the results show values between 17% and 32%, and again the Perez model provides the best results in all vertical planes. The difference between the estimated and experimental UV Index, for vertical and 40° tilted planes, was also calculated. 40° is an angle close to the latitude of Burjassot, Valencia (39.5°), which, according to various studies, is the optimum angle for capturing maximum radiation on tilted planes. We conclude that the models provide a good estimate of the UV Index, as they coincide with or differ by at most one unit from the experimental values in 99% of cases, and this holds for all orientations. Finally, we examined the relation between the UV Index on vertical and 40° tilted planes, both experimental and estimated by the Perez model, and the experimental UV Index on a horizontal plane at 12 GMT. Based on the results, we conclude that it is possible to estimate with good approximation the UV Index on vertical and 40° tilted planes in different directions from the experimental horizontal UVI value, thus justifying the interest of this study. This journal is © The Royal Society of Chemistry and Owner Societies 2012

  18. Computational Analysis for Rocket-Based Combined-Cycle Systems During Rocket-Only Operation

    NASA Technical Reports Server (NTRS)

    Steffen, C. J., Jr.; Smith, T. D.; Yungster, S.; Keller, D. J.

    2000-01-01

    A series of Reynolds-averaged Navier-Stokes calculations was employed to study the performance of rocket-based combined-cycle systems operating in all-rocket mode. This parametric series of calculations was executed within a statistical framework, commonly known as design of experiments. The parametric design space included four geometric and two flowfield variables, set at three levels each, for a total of 729 possible combinations. A D-optimal design strategy was selected; it required that only 36 separate computational fluid dynamics (CFD) solutions be performed to develop a full response surface model, which quantified the linear, bilinear, and curvilinear effects of the six experimental variables. The axisymmetric Reynolds-averaged Navier-Stokes simulations were executed with the NPARC v3.0 code. The response used in the statistical analysis was created from Isp efficiency data integrated from the 36 CFD simulations. The influence of turbulence modeling was analyzed by using both one- and two-equation models. Careful attention was also given to quantifying the influence of mesh dependence, iterative convergence, and artificial viscosity upon the resulting statistical model. Thirteen statistically significant effects were observed to have an influence on rocket-based combined-cycle nozzle performance. It was apparent that the free-expansion process, directly downstream of the rocket nozzle, can influence the Isp efficiency. Numerical schlieren images and particle traces were used to further understand the physical phenomena behind several of the statistically significant results.
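
    A sketch of the response-surface step under assumed data: fit a full quadratic model (linear, bilinear, and curvilinear terms) to Isp-efficiency responses from 36 runs over six coded variables. The planted coefficients below are placeholders, not the study's results.

```python
# Sketch: quadratic response surface over a 36-run design (synthetic response).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(7)
X = rng.uniform(-1, 1, size=(36, 6))        # six coded design variables, 36 runs
isp_eff = (0.9 + 0.03 * X[:, 0] - 0.02 * X[:, 1] * X[:, 2]
           - 0.01 * X[:, 3] ** 2 + rng.normal(0, 0.002, 36))

quad = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(quad.fit_transform(X), isp_eff)
names = quad.get_feature_names_out([f"x{i}" for i in range(6)])
big = {n: round(c, 4) for n, c in zip(names, model.coef_) if abs(c) > 0.005}
print(big)                                  # recovers the planted effects
```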

  19. Study of deformation evolution during failure of rock specimens using laser-based vibration measurements

    NASA Astrophysics Data System (ADS)

    Smolin, I. Yu.; Kulkov, A. S.; Makarov, P. V.; Tunda, V. A.; Krasnoveikin, V. A.; Eremin, M. O.; Bakeev, R. A.

    2017-12-01

    The aim of this paper is to analyze experimental data on the dynamic response of a marble specimen under uniaxial compression, using methods of mathematical statistics. The evolution of the lateral surface velocity, obtained with a laser Doppler vibrometer, provides the data for the analysis. The registered data were treated as a time series reflecting the deformation evolution of the specimen loaded up to failure. The revealed changes in statistical parameters were considered precursors of failure. It is shown that before failure the deformation response is autocorrelated and reflects states of dynamic chaos and self-organized criticality.

  20. Gaussian process based modeling and experimental design for sensor calibration in drifting environments

    PubMed Central

    Geng, Zongyu; Yang, Feng; Chen, Xi; Wu, Nianqiang

    2016-01-01

    It remains a challenge to accurately calibrate a sensor subject to environmental drift. The calibration task for such a sensor is to quantify the relationship between the sensor’s response and its exposure condition, which is specified by not only the analyte concentration but also the environmental factors such as temperature and humidity. This work developed a Gaussian Process (GP)-based procedure for the efficient calibration of sensors in drifting environments. Adopted as the calibration model, GP is not only able to capture the possibly nonlinear relationship between the sensor responses and the various exposure-condition factors, but also able to provide valid statistical inference for uncertainty quantification of the target estimates (e.g., the estimated analyte concentration of an unknown environment). Built on GP’s inference ability, an experimental design method was developed to achieve efficient sampling of calibration data in a batch sequential manner. The resulting calibration procedure, which integrates the GP-based modeling and experimental design, was applied on a simulated chemiresistor sensor to demonstrate its effectiveness and its efficiency over the traditional method. PMID:26924894
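
    A minimal sketch of the calibration model described, with an assumed linear ground truth and illustrative kernel settings: a GP over (concentration, temperature, humidity) whose predictive mean and standard deviation supply the uncertainty quantification.

```python
# Sketch: GP calibration over exposure-condition factors (synthetic data).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(8)
X = rng.uniform([0, 15, 20], [10, 35, 80], size=(60, 3))   # conc, temp, humidity
y = 2.0 * X[:, 0] + 0.1 * X[:, 1] - 0.02 * X[:, 2] + rng.normal(0, 0.1, 60)

gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=[5.0, 10.0, 30.0]) + WhiteKernel(noise_level=0.01),
    normalize_y=True,
).fit(X, y)

mean, std = gp.predict([[4.0, 25.0, 50.0]], return_std=True)
print(f"predicted response = {mean[0]:.2f} +/- {1.96 * std[0]:.2f}")
```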

  1. On the appropriateness of applying chi-square distribution based confidence intervals to spectral estimates of helicopter flyover data

    NASA Technical Reports Server (NTRS)

    Rutledge, Charles K.

    1988-01-01

    The validity of applying chi-square based confidence intervals to far-field acoustic flyover spectral estimates was investigated. Simulated data, using a Kendall series, and experimental acoustic data from the NASA/McDonnell Douglas 500E acoustics test were analyzed. Statistical significance tests were performed to determine the equality of the distributions of the simulated and experimental data relative to theoretical chi-square distributions. Bias and uncertainty errors associated with the spectral estimates were easily identified from the data sets. This resulted in a model relating the uncertainty and bias errors to the estimates, which aided in determining the appropriateness of the chi-square distribution based confidence intervals. Such confidence intervals were appropriate for non-tonally associated frequencies of the experimental data but were inappropriate for tonally associated estimate distributions. The inappropriateness at the tonally associated frequencies was indicated by the presence of bias error and the nonconformity of the distributions to the theoretical chi-square distribution. A technique for determining appropriate confidence intervals at the tonally associated frequencies was suggested.
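
    The interval in question has a standard closed form: if a spectral estimate has n equivalent degrees of freedom, then n times the estimate divided by the true spectrum is chi-square distributed with n degrees of freedom, which yields the interval computed below.

```python
# Chi-square confidence interval for a spectral (PSD) estimate.
from scipy import stats

def chi2_psd_ci(s_hat, dof, alpha=0.05):
    lo = dof * s_hat / stats.chi2.ppf(1 - alpha / 2, dof)
    hi = dof * s_hat / stats.chi2.ppf(alpha / 2, dof)
    return lo, hi

print(chi2_psd_ci(s_hat=1.0, dof=32))   # e.g., 16 averaged segments -> 32 dof
```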

  2. Skull Base Cerebrospinal Fluid Leakage Control with a Fibrin-Based Composite Tissue Adhesive

    PubMed Central

    Rock, Jack P.; Sierra, David H.; Castro-Moure, Frederico; Jiang, Feng

    1996-01-01

    Cerebrospinal fluid (CSF) leaks can be responsible for significant patient morbidity and mortality. While the majority of leaks induced after head trauma will seal without intervention, spontaneous or surgically-induced leaks often require operative repair. Many modifications on standard surgical technique are available for repair of CSF fistulae, but none assures adequate closure. We have studied the efficacy of a novel fibrin-based composite tissue adhesive (CTA) for closure of experimentally-induced CSF leaks in rats. Fistulae were created in two groups of animals. Two weeks after creation of the leaks, the animals were sacrificed and analyzed for persistence of leak. A 58% leakage rate was noted in the control group (n = 12), and no leaks were noted in the experimental group closed after application of CTA to the surgical defect followed by skin closure (n = 11). Comparing the control group to the experimental group, results were statistically significant (p = 0.015). These data suggest that CTA may be effective as an adjunct for the closure of CSF fistulae. PMID:17170969

  3. Metallurgical characterization of experimental Ag-based soldering alloys.

    PubMed

    Ntasi, Argyro; Al Jabbari, Youssef S; Silikas, Nick; Al Taweel, Sara M; Zinelis, Spiros

    2014-10-01

    To characterize the microstructure, hardness and thermal properties of experimental Ag-based soldering alloys for dental applications. Ag12Ga (AgGa) and Ag10Ga5Sn (AgGaSn) were fabricated by induction melting. Six samples were prepared for each alloy, and their microstructure, hardness and melting range were determined by scanning electron microscopy, energy dispersive X-ray (EDX) microanalysis, X-ray diffraction (XRD), Vickers hardness testing and differential scanning calorimetry (DSC). Both alloys demonstrated a gross dendritic microstructure, while according to the XRD results both materials consisted predominantly of an Ag-rich face-centered cubic phase. The hardness of AgGa (61 ± 2) was statistically lower than that of AgGaSn (84 ± 2), while the alloys tested showed similar melting ranges of 627-762 °C for AgGa and 631-756 °C for AgGaSn. The experimental alloys tested demonstrated similar microstructures and melting ranges. Ga and Sn might be used as alternatives to Cu and Zn to modify selected properties of Ag-based soldering alloys.

  4. Metallurgical characterization of experimental Ag-based soldering alloys

    PubMed Central

    Ntasi, Argyro; Al Jabbari, Youssef S.; Silikas, Nick; Al Taweel, Sara M.; Zinelis, Spiros

    2014-01-01

    Aim: To characterize the microstructure, hardness and thermal properties of experimental Ag-based soldering alloys for dental applications. Materials and methods: Ag12Ga (AgGa) and Ag10Ga5Sn (AgGaSn) were fabricated by induction melting. Six samples were prepared for each alloy, and their microstructure, hardness and melting range were determined by scanning electron microscopy, energy dispersive X-ray (EDX) microanalysis, X-ray diffraction (XRD), Vickers hardness testing and differential scanning calorimetry (DSC). Results: Both alloys demonstrated a gross dendritic microstructure, while according to the XRD results both materials consisted predominantly of an Ag-rich face-centered cubic phase. The hardness of AgGa (61 ± 2) was statistically lower than that of AgGaSn (84 ± 2), while the alloys tested showed similar melting ranges of 627–762 °C for AgGa and 631–756 °C for AgGaSn. Conclusion: The experimental alloys tested demonstrated similar microstructures and melting ranges. Ga and Sn might be used as alternatives to Cu and Zn to modify selected properties of Ag-based soldering alloys. PMID:25382945

  5. Built-up Areas Extraction in High Resolution SAR Imagery based on the method of Multiple Feature Weighted Fusion

    NASA Astrophysics Data System (ADS)

    Liu, X.; Zhang, J. X.; Zhao, Z.; Ma, A. D.

    2015-06-01

    Synthetic aperture radar is used ever more widely in remote sensing because of its day-and-night, all-weather operation, and feature extraction from high resolution SAR images has become a topic of intense interest. In particular, with the continuous improvement of airborne SAR image resolution, image texture information has become more abundant, which is of great significance for classification and extraction. In this paper, a novel method for built-up area extraction using both statistical and structural features is proposed, based on the texture characteristics of built-up areas. First, statistical texture features and structural features are extracted by the classical gray level co-occurrence matrix method and the variogram function method, respectively, taking direction information into account. Next, feature weights are calculated according to the Bhattacharyya distance. Then all features are fused by weighting. Finally, the fused image is classified with the K-means method, and the built-up areas are extracted after a post-classification process. The proposed method has been tested on domestic airborne P-band polarimetric SAR images; in parallel, two groups of experiments based on the statistical-texture method alone and the structural-texture method alone were carried out for comparison. On the basis of qualitative analysis, a quantitative analysis based on manually selected built-up areas was performed: in the relatively simple test area the detection rate is more than 90%, and in the relatively complex test area the detection rate is also higher than that of the other two methods. The results for the study area show that this method can effectively and accurately extract built-up areas in high resolution airborne SAR imagery.
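
    A hedged sketch of the statistical-texture half of the pipeline: gray level co-occurrence matrix features per patch via scikit-image (the newer graycomatrix spelling is assumed) and Bhattacharyya-distance feature weights between built-up and non-built-up training patches, under a univariate Gaussian assumption; the patches themselves are synthetic.

```python
# Sketch: GLCM texture features plus Bhattacharyya-distance feature weighting.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(patch):
    # GLCM at distance 1 in two directions; average each property over directions.
    glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return np.array([graycoprops(glcm, p).mean()
                     for p in ("contrast", "homogeneity", "energy", "correlation")])

def bhattacharyya_weight(f1, f2):
    # Bhattacharyya distance between two univariate Gaussians fitted per feature.
    m1, v1 = f1.mean(), f1.var() + 1e-12
    m2, v2 = f2.mean(), f2.var() + 1e-12
    return (0.25 * np.log(0.25 * (v1 / v2 + v2 / v1 + 2))
            + 0.25 * (m1 - m2) ** 2 / (v1 + v2))

rng = np.random.default_rng(9)
builtup = np.stack([glcm_features(rng.integers(96, 256, (32, 32), dtype=np.uint8))
                    for _ in range(10)])
other = np.stack([glcm_features(rng.integers(0, 160, (32, 32), dtype=np.uint8))
                  for _ in range(10)])

weights = np.array([bhattacharyya_weight(builtup[:, i], other[:, i])
                    for i in range(builtup.shape[1])])
print(weights / weights.sum())   # normalized fusion weights per texture feature
```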

  6. Observers Exploit Stochastic Models of Sensory Change to Help Judge the Passage of Time

    PubMed Central

    Ahrens, Misha B.; Sahani, Maneesh

    2011-01-01

    Summary: Sensory stimulation can systematically bias the perceived passage of time [1–5], but why and how this happens is mysterious. In this report, we provide evidence that such biases may ultimately derive from an innate and adaptive use of stochastically evolving dynamic stimuli to help refine estimates derived from internal timekeeping mechanisms [6–15]. A simplified statistical model based on probabilistic expectations of stimulus change derived from the second-order temporal statistics of the natural environment [16, 17] makes three predictions. First, random noise-like stimuli whose statistics violate natural expectations should induce timing bias. Second, a previously unexplored obverse of this effect is that similar noise stimuli with natural statistics should reduce the variability of timing estimates. Finally, this reduction in variability should scale with the interval being timed, so as to preserve the overall Weber law of interval timing. All three predictions are borne out experimentally. Thus, in the context of our novel theoretical framework, these results suggest that observers routinely rely on sensory input to augment their sense of the passage of time, through a process of Bayesian inference based on expectations of change in the natural environment. PMID:21256018

  7. A Statistics-based Platform for Quantitative N-terminome Analysis and Identification of Protease Cleavage Products*

    PubMed Central

    auf dem Keller, Ulrich; Prudova, Anna; Gioia, Magda; Butler, Georgina S.; Overall, Christopher M.

    2010-01-01

    Terminal amine isotopic labeling of substrates (TAILS), our recently introduced platform for quantitative N-terminome analysis, enables wide dynamic range identification of original mature protein N-termini and protease cleavage products. Modifying TAILS by use of isobaric tag for relative and absolute quantification (iTRAQ)-like labels for quantification together with a robust statistical classifier derived from experimental protease cleavage data, we report reliable and statistically valid identification of proteolytic events in complex biological systems in MS2 mode. The statistical classifier is supported by a novel parameter evaluating ion intensity-dependent quantification confidences of single peptide quantifications, the quantification confidence factor (QCF). Furthermore, the isoform assignment score (IAS) is introduced, a new scoring system for the evaluation of single peptide-to-protein assignments based on high confidence protein identifications in the same sample prior to negative selection enrichment of N-terminal peptides. By these approaches, we identified and validated, in addition to known substrates, low abundance novel bioactive MMP-2 targets including the plasminogen receptor S100A10 (p11) and the proinflammatory cytokine proEMAP/p43 that were previously undescribed. PMID:20305283

  8. The Effectiveness of Mindfulness-Based Cognitive Therapy on Iranian Female Adolescents Suffering From Social Anxiety

    PubMed Central

    Ebrahiminejad, Shima; Poursharifi, Hamid; Bakhshiour Roodsari, Abbas; Zeinodini, Zahra; Noorbakhsh, Simasadat

    2016-01-01

    Background Social anxiety is one of the most common psychological disorders among children and adolescents, and it has profound effects on their psychological state and academic achievement. Objectives The aim of this study was to determine the effectiveness of mindfulness-based cognitive therapy (MBCT) in diminishing social anxiety disorder symptoms and improving the self-esteem of female adolescents suffering from social anxiety. Patients and Methods A quasi-experimental study was conducted on 30 female students diagnosed with social anxiety. From the population of female students studying in Tehran’s high schools in the 2013 - 2014 academic year, 30 students fulfilling the DSM-5 criteria were selected using convenience sampling and were randomly assigned to control and experimental groups. The experimental group received eight sessions of MBCT; the control group received no treatment. All participants completed the social phobia inventory (SPIN) and the Rosenberg self-esteem scale (RSES) as pre- and post-treatment tests. Results For the experimental group, there was a statistically reliable difference between the pre- and post-treatment mean scores on the SPIN (t (11) = 5.246, P = 0.000) and the RSES (t (11) = -2.326, P = 0.040). The results of the control group, on the other hand, failed to reveal a statistically reliable difference between the pre- and post-treatment mean scores on the SPIN (t (12) = 1.089, P = 0.297) and the RSES (t (12) = 1.089, P = 0.000). Conclusions The results indicate that MBCT is effective in both improving self-esteem and decreasing social anxiety, in accordance with prior studies performed on adolescents. PMID:28191335

  9. A comparison of problem-based learning and conventional teaching in nursing ethics education.

    PubMed

    Lin, Chiou-Fen; Lu, Meei-Shiow; Chung, Chun-Chih; Yang, Che-Ming

    2010-05-01

    The aim of this study was to compare the learning effectiveness of peer tutored problem-based learning and conventional teaching of nursing ethics in Taiwan. The study adopted an experimental design. The peer tutored problem-based learning method was applied to an experimental group and the conventional teaching method to a control group. The study sample consisted of 142 senior nursing students who were randomly assigned to the two groups. All the students were tested for their nursing ethical discrimination ability both before and after the educational intervention. A learning satisfaction survey was also administered to both groups at the end of each course. After the intervention, both groups showed a significant increase in ethical discrimination ability. There was a statistically significant difference between the ethical discrimination scores of the two groups (P < 0.05), with the experimental group on average scoring higher than the control group. There were significant differences in satisfaction with self-motivated learning and critical thinking between the groups. Peer tutored problem-based learning and lecture-type conventional teaching were both effective for nursing ethics education, but problem-based learning was shown to be more effective. Peer tutored problem-based learning has the potential to enhance the efficacy of teaching nursing ethics in situations in which there are personnel and resource constraints.

  10. Comparison of LIDAR system performance for alternative single-mode receiver architectures: modeling and experimental validation

    NASA Astrophysics Data System (ADS)

    Toliver, Paul; Ozdur, Ibrahim; Agarwal, Anjali; Woodward, T. K.

    2013-05-01

    In this paper, we describe a detailed performance comparison of alternative single-pixel, single-mode LIDAR architectures including (i) linear-mode APD-based direct-detection, (ii) optically-preamplified PIN receiver, (iii) PIN-based coherent-detection, and (iv) Geiger-mode single-photon-APD counting. Such a comparison is useful when considering next-generation LIDAR on a chip, which would allow one to leverage extensive waveguide-based structures and processing elements developed for telecom and apply them to small form-factor sensing applications. Models of four LIDAR transmit and receive systems are described in detail, which include not only the dominant sources of receiver noise commonly assumed in each of the four detection limits, but also additional noise terms present in realistic implementations. These receiver models are validated through the analysis of detection statistics collected from an experimental LIDAR testbed. The receiver is reconfigurable into four modes of operation, while transmit waveforms and channel characteristics are held constant. The use of a diffuse hard target highlights the importance of including speckle noise terms in the overall system analysis. All measurements are done at 1550 nm, which offers multiple system advantages including less stringent eye safety requirements and compatibility with available telecom components, optical amplification, and photonic integration. Ultimately, the experimentally-validated detection statistics can be used as part of an end-to-end system model for projecting rate, range, and resolution performance limits and tradeoffs of alternative integrated LIDAR architectures.

  11. Observational studies using propensity score analysis underestimated the effect sizes in critical care medicine.

    PubMed

    Zhang, Zhongheng; Ni, Hongying; Xu, Xiao

    2014-08-01

    Propensity score (PS) analysis has been increasingly used in critical care medicine; however, its validity has not been systematically investigated. The present study aimed to compare effect sizes in PS-based observational studies vs. randomized controlled trials (RCTs) (or meta-analyses of RCTs). Critical care observational studies using PS were systematically searched in PubMed from inception to April 2013. Identified PS-based studies were matched to one or more RCTs in terms of population, intervention, comparison, and outcome. The effect sizes of experimental treatments were compared for PS-based studies vs. RCTs (or meta-analyses of RCTs) with a sign test. Furthermore, the ratio of odds ratios (ROR) was calculated from the interaction term of treatment × study type in a logistic regression model. A ROR < 1 indicates greater benefit for the experimental treatment in RCTs compared with PS-based studies. RORs of each comparison were pooled using a meta-analytic approach with a random-effects model. A total of 20 PS-based studies were identified and matched to RCTs. Twelve of the 20 comparisons showed a greater beneficial effect for the experimental treatment in RCTs than in PS-based studies (sign test P = 0.503). The difference was statistically significant in four comparisons. The ROR could be calculated for 13 comparisons, of which four showed a significantly greater beneficial effect for the experimental treatment in RCTs. The pooled ROR was 0.71 (95% CI: 0.63, 0.79; P = 0.002), suggesting that RCTs (or meta-analyses of RCTs) were more likely to report a beneficial effect for the experimental treatment than PS-based studies. The result remained unchanged in sensitivity analysis and meta-regression. In the critical care literature, PS-based observational studies are likely to report a less beneficial effect of experimental treatment compared with RCTs (or meta-analyses of RCTs). Copyright © 2014 Elsevier Inc. All rights reserved.
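
    A minimal sketch of the interaction-term ROR computation described above, assuming patient-level data pooled across a matched PS study and RCT; the column names (outcome, treat, rct) are hypothetical, and the study's actual model may include additional terms.

    ```python
    import numpy as np
    import statsmodels.formula.api as smf

    def ratio_of_odds_ratios(df):
        """df: one row per patient with columns outcome (0/1), treat (0/1),
        rct (0 = PS-based study, 1 = RCT)."""
        fit = smf.logit("outcome ~ treat * rct", data=df).fit(disp=0)
        beta = fit.params["treat:rct"]          # log ROR = log(OR_RCT / OR_PS)
        lo, hi = fit.conf_int().loc["treat:rct"]
        return np.exp([beta, lo, hi])           # ROR < 1: larger treatment benefit in RCTs
    ```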

  12. rf streak camera based ultrafast relativistic electron diffraction.

    PubMed

    Musumeci, P; Moody, J T; Scoby, C M; Gutierrez, M S; Tran, T

    2009-01-01

    We theoretically and experimentally investigate the possibility of using an rf streak camera to time-resolve, in a single shot, structural changes at the sub-100 fs time scale via relativistic electron diffraction. We experimentally tested this novel concept at the UCLA Pegasus rf photoinjector. Time-resolved diffraction patterns from a thin Al foil are recorded. Averaging over 50 shots is required to obtain statistics sufficient to uncover a variation of the diffraction patterns in time. In the absence of an external pump laser, this variation is explained by the energy chirp on the beam out of the electron gun. With further improvements to the electron source, rf streak camera based ultrafast electron diffraction has the potential to yield truly single-shot measurements of ultrafast processes.

  13. Effectiveness of a Technology-Based Intervention to Teach Evidence-Based Practice: The EBR Tool.

    PubMed

    Long, JoAnn D; Gannaway, Paula; Ford, Cindy; Doumit, Rita; Zeeni, Nadine; Sukkarieh-Haraty, Ola; Milane, Aline; Byers, Beverly; Harrison, LaNell; Hatch, Daniel; Brown, Justin; Proper, Sharlan; White, Patricia; Song, Huaxin

    2016-02-01

    As the world becomes increasingly digital, advances in technology have changed how students access evidence-based information. Research suggests that students overestimate their ability to locate quality online research and lack the skills needed to evaluate the scientific literature. Clinical nurses report relying on personal experience to answer clinical questions rather than searching evidence-based sources. To address the problem, a web-based, evidence-based research (EBR) tool that is usable from a computer, smartphone, or iPad was developed and tested. The purpose of the EBR tool is to guide students through the basic steps needed to locate and critically appraise the online scientific literature while linking users to quality electronic resources to support evidence-based practice (EBP). Testing of the tool took place in a mixed-method design comprising a quasi-experimental study and a two-population randomized controlled trial (RCT) at a U.S. and a Middle Eastern university. A statistically significant improvement in overall research skills was found in the quasi-experimental nursing student group and the RCT nutrition student group using the EBR tool. A statistically significant proportional difference was found in the RCT nutrition and PharmD intervention groups in participants' ability to judge the credibility of online source materials compared with controls. The majority of participants could correctly apply PICOTS to a case study when using the tool. The data from this preliminary study suggest that the EBR tool enhanced students' overall research skills and selected EBP skills while generating data for assessment of learning outcomes. The EBR tool places evidence-based resources at the fingertips of users by addressing some of the most commonly cited barriers to research utilization while exposing users to information and online literacy standards of practice, meeting a growing need within nursing curricula. © 2016 Sigma Theta Tau International.

  14. Matter-wave diffraction approaching limits predicted by Feynman path integrals for multipath interference

    NASA Astrophysics Data System (ADS)

    Barnea, A. Ronny; Cheshnovsky, Ori; Even, Uzi

    2018-02-01

    Interference experiments have been paramount in our understanding of quantum mechanics and are frequently the basis of testing the superposition principle in the framework of quantum theory. In recent years, several studies have challenged the nature of wave-function interference from the perspective of Born's rule—namely, the manifestation of so-called high-order interference terms in a superposition generated by diffraction of the wave functions. Here we present an experimental test of multipath interference in the diffraction of metastable helium atoms, with large-number counting statistics, comparable to photon-based experiments. We use a variation of the original triple-slit experiment and accurate single-event counting techniques to provide a new experimental bound of 2.9 × 10⁻⁵ on the statistical deviation from the commonly approximated null third-order interference term in Born's rule for matter waves. Our value is on the order of the maximal contribution predicted for multipath trajectories by Feynman path integrals.

  15. Deterministic optical polarisation in nitride quantum dots at thermoelectrically cooled temperatures.

    PubMed

    Wang, Tong; Puchtler, Tim J; Patra, Saroj K; Zhu, Tongtong; Jarman, John C; Oliver, Rachel A; Schulz, Stefan; Taylor, Robert A

    2017-09-21

    We report the successful realisation of intrinsic optical polarisation control by growth, in solid-state quantum dots in the thermoelectrically cooled temperature regime (≥200 K), using a non-polar InGaN system. With statistically significant experimental data from cryogenic to high temperatures, we show that the average polarisation degree of such a system remains constant at around 0.90 below 100 K and decreases very slowly at higher temperatures, reaching 0.77 at 200 K, with an unchanged polarisation axis determined by the material crystallography. A combination of Fermi-Dirac statistics and k·p theory with consideration of quantum dot anisotropy allows us to elucidate the origin of the robust, almost temperature-insensitive polarisation properties of this system from a fundamental perspective, producing results in very good agreement with the experimental findings. This work demonstrates that optical polarisation control can be achieved in solid-state quantum dots at thermoelectrically cooled temperatures, thereby opening the possibility of polarisation-based quantum dot applications in on-chip conditions.

  16. On vital aid: the why, what and how of validation

    PubMed Central

    Kleywegt, Gerard J.

    2009-01-01

    Limitations to the data and subjectivity in the structure-determination process may cause errors in macromolecular crystal structures. Appropriate validation techniques may be used to reveal problems in structures, ideally before they are analysed, published or deposited. Additionally, such techniques may be used a posteriori to assess the (relative) merits of a model by potential users. Weak validation methods and statistics assess how well a model reproduces the information that was used in its construction (i.e. experimental data and prior knowledge). Strong methods and statistics, on the other hand, test how well a model predicts data or information that were not used in the structure-determination process. These may be data that were excluded from the process on purpose, general knowledge about macromolecular structure, information about the biological role and biochemical activity of the molecule under study or its mutants or complexes and predictions that are based on the model and that can be tested experimentally. PMID:19171968

  17. A Cosmetic Content-Based Nutrition Education Program Improves Fruit and Vegetable Consumption Among Grade 11 Thai Students.

    PubMed

    Somsri, Pattraporn; Satheannoppakao, Warapone; Tipayamongkholgul, Mathuros; Vatanasomboon, Paranee; Kasemsup, Rachada

    2016-03-01

    To examine and compare the effectiveness of a cosmetic content-based nutrition education (CCBNEd) program and a health content-based nutrition education (HCBNEd) program in promoting fruit and vegetable (F&V) consumption. Quasi-experimental. Three secondary schools in Nonthaburi, Thailand. Three classes of students were randomly assigned to 3 study groups: experimental group 1 (n = 41) participated in the CCBNEd program, experimental group 2 (n = 35) experienced the HCBNEd program, and a comparison group (n = 37) did not participate in a program. All groups received F&V information. Data were collected between July and September, 2013. Knowledge about F&V, attitude toward F&V consumption, and the amount and variety of F&V consumed were measured at baseline, posttest, and follow-up. Nonparametric statistics were used to compare the programs' effectiveness. At posttest, experimental group 1 showed significantly increased knowledge scores, attitude scores, and amount and variety of F&V consumed compared with baseline (P < .001). These positive changes were maintained through follow-up. In experimental group 2, knowledge and attitude scores increased (P < .001) at posttest and then decreased at follow-up, whereas the comparison group changed positively only in knowledge. The CCBNEd program was most effective at increasing F&V consumption. Copyright © 2016 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  18. Statistical analysis of nonmonotonic dose-response relationships: research design and analysis of nasal cell proliferation in rats exposed to formaldehyde.

    PubMed

    Gaylor, David W; Lutz, Werner K; Conolly, Rory B

    2004-01-01

    Statistical analyses of nonmonotonic dose-response curves are proposed, experimental designs to detect low-dose effects of J-shaped curves are suggested, and sample sizes are provided. For quantal data such as cancer incidence rates, much larger numbers of animals are required than for continuous data such as biomarker measurements. For example, 155 animals per dose group are required to have at least an 80% chance of detecting a decrease from a 20% incidence in controls to an incidence of 10% at a low dose. For a continuous measurement, only 14 animals per group are required to have at least an 80% chance of detecting a change of the mean by one standard deviation of the control group. Experimental designs based on three dose groups plus controls are discussed to detect nonmonotonicity or to estimate the zero equivalent dose (ZED), i.e., the dose that produces a response equal to the average response in the controls. Cell proliferation data from the nasal respiratory epithelium of rats exposed to formaldehyde by inhalation are used to illustrate the statistical procedures. Statistically significant departures from a monotonic dose response were obtained for time-weighted average labeling indices, with an estimated ZED at a formaldehyde dose of 5.4 ppm and a lower 95% confidence limit of 2.7 ppm. It is concluded that demonstration of a statistically significant biphasic dose-response curve, together with estimation of the resulting ZED, could serve as a point of departure in establishing a reference dose for low-dose risk assessment.
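
    For orientation, a standard two-proportion power calculation of the kind quoted above can be reproduced in a few lines; the abstract's figure of 155 animals per group rests on its own (unstated) test and sidedness assumptions, so the number produced below need not match it exactly.

    ```python
    from statsmodels.stats.proportion import proportion_effectsize
    from statsmodels.stats.power import NormalIndPower

    # Detect a drop from 20% incidence (controls) to 10% (low dose),
    # 80% power, two-sided alpha = 0.05, equal group sizes.
    h = proportion_effectsize(0.20, 0.10)                 # Cohen's h
    n = NormalIndPower().solve_power(effect_size=h, power=0.80, alpha=0.05)
    print(round(n), "animals per group")                  # ~98 under these assumptions
    ```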

  19. PCA as a practical indicator of OPLS-DA model reliability.

    PubMed

    Worley, Bradley; Powers, Robert

    Principal Component Analysis (PCA) and Orthogonal Projections to Latent Structures Discriminant Analysis (OPLS-DA) are powerful statistical modeling tools that provide insights into separations between experimental groups based on high-dimensional spectral measurements from NMR, MS or other analytical instrumentation. However, when used without validation, these tools may lead investigators to statistically unreliable conclusions. This danger is especially real for Partial Least Squares (PLS) and OPLS, which aggressively force separations between experimental groups. As a result, OPLS-DA is often used as an alternative method when PCA fails to expose group separation, but this practice is highly dangerous. Without rigorous validation, OPLS-DA can easily yield statistically unreliable group separation. A Monte Carlo analysis of PCA group separations and OPLS-DA cross-validation metrics was performed on NMR datasets with statistically significant separations in scores-space. A linearly increasing amount of Gaussian noise was added to each data matrix followed by the construction and validation of PCA and OPLS-DA models. With increasing added noise, the PCA scores-space distance between groups rapidly decreased and the OPLS-DA cross-validation statistics simultaneously deteriorated. A decrease in correlation between the estimated loadings (added noise) and the true (original) loadings was also observed. While the validity of the OPLS-DA model diminished with increasing added noise, the group separation in scores-space remained basically unaffected. Supported by the results of Monte Carlo analyses of PCA group separations and OPLS-DA cross-validation metrics, we provide practical guidelines and cross-validatory recommendations for reliable inference from PCA and OPLS-DA models.
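
    The Monte Carlo procedure is straightforward to reproduce in outline. The sketch below assumes a data matrix X with binary group labels y and measures group separation as the distance between centroids in two-component PCA scores space; the noise grid and replicate count are placeholders, not the study's actual settings.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)

    def scores_separation(X, y, noise_sd):
        """Distance between group centroids in 2-component PCA scores space."""
        Xn = X + rng.normal(scale=noise_sd, size=X.shape)
        scores = PCA(n_components=2).fit_transform(Xn)
        return np.linalg.norm(scores[y == 0].mean(axis=0) - scores[y == 1].mean(axis=0))

    # for sd in np.linspace(0.0, 5.0, 11):                 # noise grid (placeholder)
    #     print(sd, np.mean([scores_separation(X, y, sd) for _ in range(100)]))
    ```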

  20. Does Training in Table Creation Enhance Table Interpretation? A Quasi-Experimental Study with Follow-Up

    ERIC Educational Resources Information Center

    Karazsia, Bryan T.; Wong, Kendal

    2016-01-01

    Quantitative and statistical literacy are core domains in the undergraduate psychology curriculum. An important component of such literacy includes interpretation of visual aids, such as tables containing results from statistical analyses. This article presents results of a quasi-experimental study with longitudinal follow-up that tested the…

  1. Fracture resistance of retreated roots using different retreatment systems.

    PubMed

    Er, Kursat; Tasdemir, Tamer; Siso, Seyda Herguner; Celik, Davut; Cora, Sabri

    2011-08-01

    This study was designed to evaluate the fracture resistance of roots retreated using different rotary retreatment systems. Forty-eight freshly extracted human canine teeth with single straight root canals were sequentially instrumented from size 30 to size 55 using K-files with a step-back technique. The teeth were randomly divided into three experimental groups and one control group of 12 specimens each. In the experimental groups, the root canals were filled using cold lateral compaction of gutta-percha and AH Plus sealer (Dentsply Detrey, Konstanz, Germany). Removal of gutta-percha was performed with the following devices and techniques: ProTaper Universal (Dentsply Maillefer, Ballaigues, Switzerland), R-Endo (Micro-Mega, Besançon, France), and Mtwo (Sweden & Martina, Padova, Italy) rotary retreatment systems. Control group specimens were only instrumented, not filled or retreated. The specimens were then mounted in copper rings filled with self-curing polymethylmethacrylate resin, and the force required to cause vertical root fracture was measured using a universal testing device. The fracture forces of the roots were recorded and the results in the various groups were compared. Statistical analysis was accomplished by one-way ANOVA and post hoc Tukey tests. There were statistically significant differences between the control and experimental groups (P<.05). However, there were no significant differences among the experimental groups. Based on the results, all rotary retreatment techniques used in this in vitro study produced similar root weakness.

  2. Proceedings of the NASTRAN (Tradename) Users’ Colloquium (17th) Held in San Antonio, Texas on 24-28 April 1989

    DTIC Science & Technology

    1989-03-01

    statistical energy analysis, the finite element method, and the power flow method. Experimental solutions are the most common in the literature. The authors of...to the added weights and inertias of the transducers attached to an experimental structure. Statistical energy analysis (SEA) is a computational method...Analysis and Diagnosis," Journal of Sound and Vibration, Vol. 115, No. 3, pp. 405-422 (1987). 8. Lyon, R.L., Statistical Energy Analysis of Dynamical Systems

  3. Statistical analysis and digital processing of the Mössbauer spectra

    NASA Astrophysics Data System (ADS)

    Prochazka, Roman; Tucek, Pavel; Tucek, Jiri; Marek, Jaroslav; Mashlan, Miroslav; Pechousek, Jiri

    2010-02-01

    This work focuses on the use of statistical methods and the development of filtration procedures for signal processing in Mössbauer spectroscopy. Statistical tools for noise filtering of measured spectra are used in many scientific areas; here, the use of a purely statistical approach to filtering accumulated Mössbauer spectra is described. In Mössbauer spectroscopy, the noise can be treated as a Poisson process, well approximated by a Gaussian distribution for large numbers of counts. This noise is a superposition of non-resonant photon counting, electronic noise (from the γ-ray detection and discrimination units), and velocity-system imperfections characterized by velocity nonlinearities. A noise-reducing process using a newly designed statistical filter procedure is described. This mathematical procedure improves the signal-to-noise ratio and thus makes it easier to determine the hyperfine parameters of the given Mössbauer spectra. The filter procedure is based on a periodogram method that makes it possible to identify the statistically significant components in the spectral domain. The significance level for these components is then feedback-controlled using the results of a correlation coefficient test. The theoretical correlation coefficient level corresponding to the spectrum resolution is estimated. The correlation coefficient test is based on a comparison of the theoretical and experimental correlation coefficients obtained by the Spearman method. The correctness of this solution was analyzed by a series of statistical tests and confirmed on many spectra measured with increasing statistical quality for a given sample (absorber). The effect of this filter procedure depends on the signal-to-noise ratio, and the applicability of the method is subject to binding conditions.
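
    The core of the procedure can be sketched compactly. The function below keeps only Fourier components whose periodogram power clears a threshold and reconstructs the spectrum; the paper tunes its significance level through a feedback-controlled Spearman correlation test, for which a fixed multiple of the mean power stands in here as a simplification.

    ```python
    import numpy as np

    def periodogram_filter(counts, k=3.0):
        """Keep Fourier components whose periodogram power exceeds k times the mean."""
        F = np.fft.rfft(counts)
        power = np.abs(F) ** 2
        keep = power > k * power.mean()       # crude stand-in for the significance test
        keep[0] = True                        # always retain the DC (baseline) term
        return np.fft.irfft(F * keep, n=len(counts))

    # smoothed = periodogram_filter(measured_spectrum)   # measured_spectrum: count array
    ```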

  4. The Usual and the Unusual: Solving Remote Associates Test Tasks Using Simple Statistical Natural Language Processing Based on Language Use

    ERIC Educational Resources Information Center

    Klein, Ariel; Badia, Toni

    2015-01-01

    In this study we show how complex creative relations can arise from fairly frequent semantic relations observed in everyday language. By doing this, we reflect on some key cognitive aspects of linguistic and general creativity. In our experimentation, we automated the process of solving a battery of Remote Associates Test tasks. By applying…

  5. Capturing the Interaction Potential of Amyloidogenic Proteins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Javid, Nadeem; Vogtt, Karsten; Winter, Roland

    2007-07-13

    Experimentally derived static structure factors obtained for the aggregation-prone protein insulin were analyzed with a statistical mechanical model based on the Derjaguin-Landau-Verwey-Overbeek potential. The data reveal that the protein self-assembles into equilibrium clusters already at low concentrations. Furthermore, striking differences regarding interaction forces between aggregation-prone proteins such as insulin in the preaggregated regime and natively stable globular proteins are found.

  6. Determination of heat capacity of ionic liquid based nanofluids using group method of data handling technique

    NASA Astrophysics Data System (ADS)

    Sadi, Maryam

    2018-01-01

    In this study, a group method of data handling (GMDH) model has been successfully developed to predict the heat capacity of ionic liquid based nanofluids, taking the reduced temperature, acentric factor and molecular weight of the ionic liquids, and the nanoparticle concentration as input parameters. For modeling, 528 experimental data points extracted from the literature were divided into training and testing subsets; the training set was used to estimate the model coefficients and the testing set was used for model validation. The ability and accuracy of the developed model have been evaluated by comparing the model predictions with experimental values using different statistical parameters such as the coefficient of determination, mean square error and mean absolute percentage error. The mean absolute percentage errors of the developed model for the training and testing sets are 1.38% and 1.66%, respectively, which indicates excellent agreement between the model predictions and the experimental data. The results estimated by the developed GMDH model also exhibit higher accuracy than the available theoretical correlations.
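
    The validation statistics quoted (coefficient of determination, mean square error, mean absolute percentage error) are standard and easy to compute; a minimal helper is sketched below, with illustrative names.

    ```python
    import numpy as np

    def validation_stats(y_true, y_pred):
        """R^2, MSE and MAPE between experimental values and model predictions."""
        y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
        mse = np.mean((y_true - y_pred) ** 2)
        mape = 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))
        r2 = 1.0 - np.sum((y_true - y_pred) ** 2) / np.sum((y_true - y_true.mean()) ** 2)
        return {"R2": r2, "MSE": mse, "MAPE_%": mape}
    ```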

  7. Theory and simulations of covariance mapping in multiple dimensions for data analysis in high-event-rate experiments

    NASA Astrophysics Data System (ADS)

    Zhaunerchyk, V.; Frasinski, L. J.; Eland, J. H. D.; Feifel, R.

    2014-05-01

    Multidimensional covariance analysis and its validity for correlating processes leading to multiple products are investigated from a theoretical point of view. The need to correct for false correlations induced by experimental parameters which fluctuate from shot to shot, such as the intensity of self-amplified spontaneous emission x-ray free-electron laser pulses, is emphasized. Threefold covariance analysis based on a simple extension of the two-variable formulation is shown to be valid for variables exhibiting Poisson statistics. In this case, false correlations arising from fluctuations in an unstable experimental parameter that scale linearly with the signals can be eliminated by threefold partial covariance analysis, as defined here. Fourfold covariance based on the same simple extension is found to be invalid in general. Where fluctuations in an unstable parameter induce nonlinear signal variations, a technique of contingent covariance analysis is proposed here to suppress false correlations. In this paper we also show a method to eliminate false correlations associated with fluctuations of several unstable experimental parameters.
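
    For reference, the two-variable building block that the threefold analysis extends is the partial covariance pcov(X, Y; I) = cov(X, Y) - cov(X, I) cov(I, Y) / var(I), which can be computed over shot-resolved data as sketched below; the array shapes and the choice of monitored parameter I are assumptions.

    ```python
    import numpy as np

    def partial_covariance_map(X, Y, I):
        """X: (shots, nx), Y: (shots, ny) signal channels; I: (shots,) monitor."""
        def cov(a, b):
            return (a - a.mean(axis=0)).T @ (b - b.mean(axis=0)) / (len(a) - 1)
        I = I[:, None]
        return cov(X, Y) - cov(X, I) @ cov(I, Y) / I.var(ddof=1)
    ```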

  8. A meta-analysis of experimental studies of diversion programs for juvenile offenders.

    PubMed

    Schwalbe, Craig S; Gearing, Robin E; MacKenzie, Michael J; Brewer, Kathryne B; Ibrahim, Rawan

    2012-02-01

    Research to establish an evidence base for the treatment of conduct problems and delinquency in adolescence is well established; however, an evidence base for interventions with offenders who are diverted from the juvenile justice system has yet to be synthesized. The purpose of this study was to conduct a meta-analysis of experimental studies testing juvenile diversion programs and to examine the moderating effects of program type and implementation quality. A literature search using the PsycINFO, Web of Science, and National Criminal Justice Reference Service databases and research institute websites yielded 28 eligible studies involving 57 experimental comparisons and 19,301 youths. Recidivism was the most common outcome reported across all studies. Overall, the effect of diversion programs on recidivism was non-significant (k=45, OR=0.83, 95%CI=0.43-1.58). Of the five program types identified, including case management (k=18, OR=0.78), individual treatment (k=11, OR=0.83), family treatment (k=4, OR=0.57), youth court (k=6, OR=0.93), and restorative justice (k=6, OR=0.87), only family treatment led to a statistically significant reduction in recidivism. Restorative justice studies that were implemented with the active involvement of researchers led to statistically significant reductions in recidivism (k=3, OR=0.69). Other outcomes, including frequency of offending, truancy, and psychosocial problems, were reported infrequently and were not subjected to meta-analysis. High levels of heterogeneity characterize diversion research. The results of this study recommend against implementation of programs limited to case management and highlight the promise of family interventions and restorative justice. Copyright © 2011 Elsevier Ltd. All rights reserved.
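
    Pooled odds ratios of the kind reported here are typically obtained by inverse-variance weighting under a random-effects model; a hedged DerSimonian-Laird sketch follows, taking per-study ORs and 95% CIs as input (the study's actual software and estimator may differ).

    ```python
    import numpy as np

    def pool_random_effects(or_values, ci_low, ci_high):
        """DerSimonian-Laird pooling of odds ratios given per-study 95% CIs."""
        y = np.log(or_values)
        se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
        w = 1.0 / se**2                                        # fixed-effect weights
        q = np.sum(w * (y - np.sum(w * y) / w.sum()) ** 2)     # Cochran's Q
        c = w.sum() - np.sum(w**2) / w.sum()
        tau2 = max(0.0, (q - (len(y) - 1)) / c)                # between-study variance
        w_re = 1.0 / (se**2 + tau2)
        mu = np.sum(w_re * y) / w_re.sum()
        se_mu = np.sqrt(1.0 / w_re.sum())
        return np.exp([mu, mu - 1.96 * se_mu, mu + 1.96 * se_mu])   # OR and 95% CI
    ```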

  9. A comparative study of two statistical approaches for the analysis of real seismicity sequences and synthetic seismicity generated by a stick-slip experimental model

    NASA Astrophysics Data System (ADS)

    Flores-Marquez, Leticia Elsa; Ramirez Rojaz, Alejandro; Telesca, Luciano

    2015-04-01

    Two statistical approaches are applied to two different types of data sets: the seismicity generated by subduction processes off the southern Pacific coast of Mexico between 2005 and 2012, and synthetic seismic data generated by a stick-slip experimental model. The statistical methods used in the present study are the visibility graph, to investigate the time dynamics of the series, and the scaled probability density function in the natural time domain, to investigate the critical order of the system. The purpose of this comparison is to show the similarities between the dynamical behaviors of both types of data sets from the point of view of critical systems. The observed behaviors allow us to conclude that the experimental setup globally reproduces the behavior seen when the same statistical approaches are applied to the seismicity of the subduction zone. The present study was supported by the Bilateral Project Italy-Mexico "Experimental stick-slip models of tectonic faults: innovative statistical approaches applied to synthetic seismic sequences", jointly funded by MAECI (Italy) and AMEXCID (Mexico) in the framework of the Bilateral Agreement for Scientific and Technological Cooperation PE 2014-2016.
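
    Of the two methods, the natural visibility graph is the simpler to state: two samples are connected if the straight line between them clears every intermediate sample. A minimal O(n^2) construction is sketched below; using it on event magnitudes is an illustrative choice, not necessarily the study's exact pipeline.

    ```python
    import numpy as np

    def visibility_edges(x):
        """Natural visibility graph: connect samples (a, b) if every intermediate
        sample lies strictly below the straight line joining them."""
        x = np.asarray(x, dtype=float)
        n, edges = len(x), []
        for a in range(n):
            for b in range(a + 1, n):
                t = np.arange(a + 1, b)
                if np.all(x[t] < x[a] + (x[b] - x[a]) * (t - a) / (b - a)):
                    edges.append((a, b))
        return edges

    # the degree distribution of visibility_edges(magnitudes) then characterizes
    # the time dynamics of the event series
    ```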

  10. Experimental scattershot boson sampling

    PubMed Central

    Bentivegna, Marco; Spagnolo, Nicolò; Vitelli, Chiara; Flamini, Fulvio; Viggianiello, Niko; Latmiral, Ludovico; Mataloni, Paolo; Brod, Daniel J.; Galvão, Ernesto F.; Crespi, Andrea; Ramponi, Roberta; Osellame, Roberto; Sciarrino, Fabio

    2015-01-01

    Boson sampling is a computational task strongly believed to be hard for classical computers, but efficiently solvable by orchestrated bosonic interference in a specialized quantum computer. Current experimental schemes, however, are still insufficient for a convincing demonstration of the advantage of quantum over classical computation. A new variation of this task, scattershot boson sampling, leads to an exponential increase in speed of the quantum device, using a larger number of photon sources based on parametric down-conversion. This is achieved by having multiple heralded single photons being sent, shot by shot, into different random input ports of the interferometer. We report the first scattershot boson sampling experiments, where six different photon-pair sources are coupled to integrated photonic circuits. We use recently proposed statistical tools to analyze our experimental data, providing strong evidence that our photonic quantum simulator works as expected. This approach represents an important leap toward a convincing experimental demonstration of the quantum computational supremacy. PMID:26601164

  11. Experimental scattershot boson sampling.

    PubMed

    Bentivegna, Marco; Spagnolo, Nicolò; Vitelli, Chiara; Flamini, Fulvio; Viggianiello, Niko; Latmiral, Ludovico; Mataloni, Paolo; Brod, Daniel J; Galvão, Ernesto F; Crespi, Andrea; Ramponi, Roberta; Osellame, Roberto; Sciarrino, Fabio

    2015-04-01

    Boson sampling is a computational task strongly believed to be hard for classical computers, but efficiently solvable by orchestrated bosonic interference in a specialized quantum computer. Current experimental schemes, however, are still insufficient for a convincing demonstration of the advantage of quantum over classical computation. A new variation of this task, scattershot boson sampling, leads to an exponential increase in speed of the quantum device, using a larger number of photon sources based on parametric down-conversion. This is achieved by having multiple heralded single photons being sent, shot by shot, into different random input ports of the interferometer. We report the first scattershot boson sampling experiments, where six different photon-pair sources are coupled to integrated photonic circuits. We use recently proposed statistical tools to analyze our experimental data, providing strong evidence that our photonic quantum simulator works as expected. This approach represents an important leap toward a convincing experimental demonstration of the quantum computational supremacy.

  12. Pathway Towards Fluency: Using 'disaggregate instruction' to promote science literacy

    NASA Astrophysics Data System (ADS)

    Brown, Bryan A.; Ryoo, Kihyun; Rodriguez, Jamie

    2010-07-01

    This study examines the impact of Disaggregate Instruction on students' science learning. Disaggregate Instruction is the idea that science teaching and learning can be separated into conceptual and discursive components. Using randomly assigned experimental and control groups, 49 fifth-grade students received web-based science lessons on photosynthesis using our experimental approach. We supplemented quantitative statistical comparisons of students' performance on pre- and post-test questions (multiple choice and short answer) with a qualitative analysis of students' post-test interviews. The results revealed that students in the experimental group outscored their control group counterparts across all measures. In addition, students taught using the experimental method demonstrated an improved ability to write using scientific language as well as an improved ability to provide oral explanations using scientific language. This study has important implications for how science educators can prepare teachers to teach diverse student populations.

  13. Learning Quantitative Sequence-Function Relationships from Massively Parallel Experiments

    NASA Astrophysics Data System (ADS)

    Atwal, Gurinder S.; Kinney, Justin B.

    2016-03-01

    A fundamental aspect of biological information processing is the ubiquity of sequence-function relationships—functions that map the sequence of DNA, RNA, or protein to a biochemically relevant activity. Most sequence-function relationships in biology are quantitative, but only recently have experimental techniques for effectively measuring these relationships been developed. The advent of such "massively parallel" experiments presents an exciting opportunity for the concepts and methods of statistical physics to inform the study of biological systems. After reviewing these recent experimental advances, we focus on the problem of how to infer parametric models of sequence-function relationships from the data produced by these experiments. Specifically, we retrace and extend recent theoretical work showing that inference based on mutual information, not the standard likelihood-based approach, is often necessary for accurately learning the parameters of these models. Closely connected with this result is the emergence of "diffeomorphic modes"—directions in parameter space that are far less constrained by data than likelihood-based inference would suggest. Analogous to Goldstone modes in physics, diffeomorphic modes arise from an arbitrarily broken symmetry of the inference problem. An analytically tractable model of a massively parallel experiment is then described, providing an explicit demonstration of these fundamental aspects of statistical inference. This paper concludes with an outlook on the theoretical and computational challenges currently facing studies of quantitative sequence-function relationships.

  14. Development of superalloys by powder metallurgy for use at 1000 - 1400 F

    NASA Technical Reports Server (NTRS)

    Calhoun, C. D.

    1971-01-01

    Consolidated powders of four nickel-base superalloys were studied for potential application as compressor and turbine discs in jet engines. All of the alloys were based on the Rene' 95 chemistry. Three of these had variations in carbon and Al2O3 contents, and the fourth alloy was chemically modified to a higher volume fraction. The Al2O3 was added by preoxidation of the powders prior to extrusion. Various levels of four experimental factors, (1) alloy composition, (2) grain size, (3) thermomechanical processing, and (4) room temperature deformation plus final age, were evaluated by tensile and stress rupture testing at 1200 F. Various levels of the four factors were assumed in order to construct the statistically designed experiment, but the actual levels investigated were established in preliminary studies that preceded the statistical process development study.

  15. Mathematical models of cytotoxic effects in endpoint tumor cell line assays: critical assessment of the application of a single parametric value as a standard criterion to quantify the dose-response effects and new unexplored proposal formats.

    PubMed

    Calhelha, Ricardo C; Martínez, Mireia A; Prieto, M A; Ferreira, Isabel C F R

    2017-10-23

    The development of convenient tools for describing and quantifying the effects of standard and novel therapeutic agents is essential for the research community to perform more precise evaluations. Although mathematical models and quantification criteria have been exchanged between different fields of study over the last decade, relevant methodologies still lack proper mathematical descriptions and standard criteria to quantify their responses. As a consequence, part of the relevant information that could be drawn from the experimental results, and the quantification of its statistical reliability, is lost. Despite its relevance, there is no standard form for in vitro endpoint tumor cell line assays (TCLA) that enables evaluation of the cytotoxic dose-response effects of anti-tumor drugs. Analyzing all the specific problems associated with the diverse nature of the available TCLA is unfeasible; however, since most TCLA share the same main objectives and similar operative requirements, we have chosen the sulforhodamine B (SRB) colorimetric assay for cytotoxicity screening of tumor cell lines as an experimental case study. In this work, the common biological and practical non-linear dose-response mathematical models are tested against experimental data and, following several statistical analyses, the model based on the Weibull distribution is confirmed as a convenient approximation for testing the cytotoxic effectiveness of anti-tumor compounds. The advantages and disadvantages of the different parametric criteria derived from the model, which enable quantification of the dose-response drug effects, are then discussed extensively, and a model and standard criteria for straightforward comparisons between different compounds are established. The advantages include simple application, parametric estimates that characterize the response as standard criteria, economization of experimental effort, and rigorous comparisons among the effects of different compounds and experimental approaches. In all experimental data fitted, the calculated parameters were always statistically significant, the equations proved to be consistent, and the correlation coefficient of determination was, in most cases, higher than 0.98.
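
    A Weibull-type dose-response curve of the kind favored here can be fitted by nonlinear least squares in a few lines. The sketch below assumes the common parameterization R(d) = K(1 - exp(-(d/a)^b)), with asymptote K, scale a and shape b; the paper's exact parameterization and fitting procedure may differ.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def weibull_response(dose, K, a, b):
        """Weibull dose-response: asymptote K, scale a, shape b (assumed form)."""
        return K * (1.0 - np.exp(-((dose / a) ** b)))

    # doses, responses: experimental arrays from the SRB assay
    # params, cov = curve_fit(weibull_response, doses, responses,
    #                         p0=(1.0, np.median(doses), 1.0))
    # perr = np.sqrt(np.diag(cov))   # parameter standard errors for significance checks
    ```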

  16. Realistic thermodynamic and statistical-mechanical measures for neural synchronization.

    PubMed

    Kim, Sang-Yoon; Lim, Woochang

    2014-04-15

    Synchronized brain rhythms, associated with diverse cognitive functions, have been observed in electrical recordings of brain activity. Neural synchronization may be well described by using the population-averaged global potential VG in computational neuroscience. The time-averaged fluctuation of VG plays the role of a "thermodynamic" order parameter O used for describing the synchrony-asynchrony transition in neural systems. Population spike synchronization may be well visualized in the raster plot of neural spikes. The degree of neural synchronization seen in the raster plot is well measured in terms of a "statistical-mechanical" spike-based measure Ms introduced by considering the occupation and the pacing patterns of spikes. The global potential VG is also used to give a reference global cycle for the calculation of Ms. Hence, VG becomes an important collective quantity because it is associated with the calculation of both O and Ms. However, it is practically difficult to obtain VG directly in real experiments. To overcome this difficulty, we employ, instead of VG, the instantaneous population spike rate (IPSR), which can be obtained in experiments, and develop realistic thermodynamic and statistical-mechanical measures, based on the IPSR, for practical characterization of neural synchronization in both computational and experimental neuroscience. In particular, more accurate characterization of weak sparse spike synchronization can be achieved in terms of the realistic statistical-mechanical IPSR-based measure than with the conventional measure based on VG. Copyright © 2014. Published by Elsevier B.V.
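
    A practical IPSR estimate can be obtained by kernel smoothing of the pooled spike train; a minimal sketch follows, in which the bin size and Gaussian kernel width are illustrative choices rather than the paper's settings.

    ```python
    import numpy as np

    def ipsr(spike_times, n_neurons, t_max, dt=0.1, sigma=2.0):
        """Gaussian-kernel estimate of the IPSR from pooled spike times (ms)."""
        edges = np.arange(0.0, t_max + dt, dt)
        counts, _ = np.histogram(spike_times, bins=edges)
        tk = np.arange(-4 * sigma, 4 * sigma + dt, dt)
        kernel = np.exp(-tk**2 / (2 * sigma**2))
        kernel /= kernel.sum()                          # normalized smoothing weights
        rate = np.convolve(counts / (n_neurons * dt), kernel, mode="same")
        return edges[:-1], rate                         # per-neuron rate in spikes/ms
    ```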

  17. The Influence of Roughness on Gear Surface Fatigue

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy

    2005-01-01

    Gear working surfaces are subjected to repeated rolling and sliding contacts, and designs often require loads sufficient to cause eventual fatigue of the surface. This research provides experimental data and analytical tools to further the understanding of the causal relationship of gear surface roughness to surface fatigue. The research included evaluations and developments of statistical tools for gear fatigue data, experimental evaluation of the surface fatigue lives of superfinished gears with a near-mirror quality, and evaluations of the experiments by analytical methods and surface inspections. Alternative statistical methods were evaluated using Monte Carlo studies, leading to a final recommendation to describe gear fatigue data using a Weibull distribution, maximum likelihood estimates of shape and scale parameters, and a presumed zero-valued location parameter. A new method was developed for comparing two datasets by extending the current methods of likelihood-ratio-based statistics. The surface fatigue lives of superfinished gears were evaluated by carefully controlled experiments, and it is shown conclusively that superfinishing of gears can provide significantly greater lives relative to ground gears. The measured life improvement was approximately a factor of five. To assist with application of this finding to products, the experimental condition was evaluated. The fatigue life results were expressed in terms of specific film thickness and shown to be consistent with bearing data. Elastohydrodynamic and stress analyses were completed to relate the stress condition to fatigue. Smooth-surface models do not adequately explain the improved fatigue lives. Based on analyses using a rough surface model, it is concluded that the improved fatigue lives of superfinished gears are due to a reduced rate of near-surface micropitting fatigue processes, not due to any reduced rate of spalling (sub-surface) fatigue processes. To complete the evaluations, surface inspections were performed. The surface topographies of the ground gears changed substantially due to running, but the topographies of the superfinished gears were essentially unchanged with running.
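
    The recommended fatigue-life description (two-parameter Weibull, maximum-likelihood shape and scale, zero location) and a two-dataset comparison in the spirit of the likelihood-ratio statistics mentioned above can be sketched as follows; the paper's extended method differs in detail.

    ```python
    import numpy as np
    from scipy.stats import weibull_min, chi2

    def fit_weibull(lives):
        """ML estimates of shape and scale with the location fixed at zero."""
        shape, _, scale = weibull_min.fit(lives, floc=0)
        return shape, scale

    def lr_test_same_population(a, b):
        """Likelihood-ratio test that two fatigue-life datasets share one Weibull."""
        def ll(x, shape, scale):
            return weibull_min.logpdf(x, shape, loc=0, scale=scale).sum()
        pooled = np.concatenate([a, b])
        stat = 2 * (ll(a, *fit_weibull(a)) + ll(b, *fit_weibull(b))
                    - ll(pooled, *fit_weibull(pooled)))
        return stat, chi2.sf(stat, df=2)   # 2 extra free parameters in the separate fit
    ```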

  18. Statistical model specification and power: recommendations on the use of test-qualified pooling in analysis of experimental data

    PubMed Central

    Colegrave, Nick

    2017-01-01

    A common approach to the analysis of experimental data across much of the biological sciences is test-qualified pooling. Here non-significant terms are dropped from a statistical model, effectively pooling the variation associated with each removed term with the error term used to test hypotheses (or estimate effect sizes). This pooling is only carried out if statistical testing on the basis of applying that data to a previous more complicated model provides motivation for this model simplification; hence the pooling is test-qualified. In pooling, the researcher increases the degrees of freedom of the error term with the aim of increasing statistical power to test their hypotheses of interest. Despite this approach being widely adopted and explicitly recommended by some of the most widely cited statistical textbooks aimed at biologists, here we argue that (except in highly specialized circumstances that we can identify) the hoped-for improvement in statistical power will be small or non-existent, and there is likely to be much reduced reliability of the statistical procedures through deviation of type I error rates from nominal levels. We thus call for greatly reduced use of test-qualified pooling across experimental biology, more careful justification of any use that continues, and a different philosophy for initial selection of statistical models in the light of this change in procedure. PMID:28330912

  19. Towards random matrix model of breaking the time-reversal invariance of elastic waves in chaotic cavities by feedback

    NASA Astrophysics Data System (ADS)

    Antoniuk, Oleg; Sprik, Rudolf

    2010-03-01

    We developed a random matrix model to describe the statistics of resonances in an acoustic cavity with broken time-reversal invariance. Time-reversal invariance breaking is achieved by connecting an amplified feedback loop between two transducers on the surface of the cavity. The model is based on the approach of [1], which describes the time-reversal properties of the cavity without a feedback loop. The statistics of the eigenvalues (nearest-neighbor resonance spacing distributions and spectral rigidity) have been calculated and compared to the statistics obtained from our experimental data. Experiments have been performed on an aluminum block of chaotic shape confining ultrasound waves. [1] Carsten Draeger and Mathias Fink, One-channel time-reversal in chaotic cavities: Theoretical limits, Journal of the Acoustical Society of America, vol. 105, Nr. 2, pp. 611-617 (1999)

  20. Numerical investigation of turbulent channel flow

    NASA Technical Reports Server (NTRS)

    Moin, P.; Kim, J.

    1981-01-01

    Fully developed turbulent channel flow was simulated numerically at a Reynolds number of 13800, based on the centerline velocity and channel half-width. The large-scale flow field was obtained by directly integrating the filtered, three-dimensional, time-dependent Navier-Stokes equations. The small-scale field motions were simulated through an eddy viscosity model. The calculations were carried out on the ILLIAC IV computer with up to 516,096 grid points. The computed flow field was used to study the statistical properties of the flow as well as its time-dependent features. The agreement of the computed mean velocity profile, turbulence statistics, and detailed flow structures with experimental data is good. The resolvable portion of the statistical correlations appearing in the Reynolds stress equations is calculated. Particular attention is given to the examination of the flow structure in the vicinity of the wall.

  1. A new concept of a unified parameter management, experiment control, and data analysis in fMRI: application to real-time fMRI at 3T and 7T.

    PubMed

    Hollmann, M; Mönch, T; Mulla-Osman, S; Tempelmann, C; Stadler, J; Bernarding, J

    2008-10-30

    In functional MRI (fMRI), complex experiments and applications require increasingly complex parameter handling, as the experimental setup usually consists of separate software and hardware systems. Advanced real-time applications such as neurofeedback-based training or brain computer interfaces (BCIs) may even require adaptive changes of the paradigms and experimental setup during the measurement. This would be facilitated by automated management of the overall workflow and control of the communication between all experimental components. We realized a concept based on an XML software framework called Experiment Description Language (EDL). All parameters relevant for real-time data acquisition, real-time fMRI (rtfMRI) statistical data analysis, stimulus presentation, and activation processing are stored in one central EDL file and processed during the experiment. A usability study comparing the central EDL parameter management with traditional approaches showed an improvement in the overall experimental handling. Based on this concept, a feasibility study realizing a dynamic rtfMRI-based brain computer interface showed that the developed system, in combination with EDL, was able to reliably detect and evaluate activation patterns in real time. The implementation of centrally controlled communication between the subsystems involved in the rtfMRI experiments reduced potential inconsistencies and will open new applications for adaptive BCIs.

  2. RAId_DbS: Peptide Identification using Database Searches with Realistic Statistics

    PubMed Central

    Alves, Gelio; Ogurtsov, Aleksey Y; Yu, Yi-Kuo

    2007-01-01

    Background The key to mass-spectrometry-based proteomics is peptide identification. A major challenge in peptide identification is to obtain realistic E-values when assigning statistical significance to candidate peptides. Results Using a simple scoring scheme, we propose a database search method with theoretically characterized statistics. Taking into account possible skewness in the random variable distribution and the effect of finite sampling, we provide a theoretical derivation for the tail of the score distribution. For every experimental spectrum examined, we collect the scores of peptides in the database and find good agreement between the collected score statistics and our theoretical distribution. Using Student's t-tests, we quantify the degree of agreement between the theoretical distribution and the collected score statistics. The t-tests may be used to measure the reliability of the reported statistics. When combined with the P-value reported for a peptide hit using a score distribution model, this new measure prevents exaggerated statistics. Another feature of RAId_DbS is its capability of detecting multiple co-eluted peptides. The peptide identification performance and statistical accuracy of RAId_DbS are assessed and compared with several other search tools. The executables and data related to RAId_DbS are freely available upon request. PMID:17961253

  3. Statistical methods and neural network approaches for classification of data from multiple sources

    NASA Technical Reports Server (NTRS)

    Benediktsson, Jon Atli; Swain, Philip H.

    1990-01-01

    Statistical methods for the classification of data from multiple data sources are investigated and compared to neural network models. A general problem with using conventional multivariate statistical approaches to classify data of multiple types is that a multivariate distribution cannot be assumed for the classes in the data sources. Another common problem is that the data sources are not equally reliable, so they need to be weighted according to their reliability, yet most statistical classification methods offer no mechanism for this. This research focuses on statistical methods which can overcome these problems: a method of statistical multisource analysis and consensus theory. Reliability measures for weighting the data sources in these methods are suggested and investigated. Secondly, this research focuses on neural network models. Neural networks are distribution-free, since no prior knowledge of the statistical distribution of the data is needed; this is an obvious advantage over most statistical classification methods. Neural networks also automatically handle the question of how much weight each data source should have. On the other hand, their training process is iterative and can take a very long time. Methods to speed up the training procedure are introduced and investigated. Experimental classification results using both neural network models and statistical methods are given, and the approaches are compared on the basis of these results.

  4. Bayesian Treed Calibration: An Application to Carbon Capture With AX Sorbent

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Konomi, Bledar A.; Karagiannis, Georgios; Lai, Kevin

    2017-01-02

    In cases where field or experimental measurements are not available, computer models can stand in for real physical or engineering systems to reproduce their outcomes. They are usually calibrated in light of experimental data to create a better representation of the real system. Statistical methods based on Gaussian processes for calibration and prediction have been especially important when the computer models are expensive and experimental data are limited. In this paper, we develop Bayesian treed calibration (BTC) as an extension of standard Gaussian process calibration methods to deal with non-stationary computer models and/or their discrepancy from the field (or experimental) data. Our proposed method partitions both the calibration and observable input space, based on a binary tree partitioning, into sub-regions where existing model calibration methods can be applied to connect a computer model with the real system. The estimation of the parameters in the proposed model is carried out using Markov chain Monte Carlo (MCMC) computational techniques, and different strategies have been applied to improve mixing. We illustrate our method in two artificial examples and a real application that concerns the capture of carbon dioxide with AX amine-based sorbents. The source code and the examples analyzed in this paper are available as part of the supplementary materials.

  5. A comprehensive toxicological evaluation of three adhesives using experimental cigarettes.

    PubMed

    Coggins, Christopher R E; Jerome, Ann M; Lilly, Patrick D; McKinney, Willie J; Oldham, Michael J

    2013-01-01

    Adhesives are used in several different manufacturing operations in the production of cigarettes. The use of new, "high-speed-manufacture" adhesives (e.g. vinyl acetate based) could affect the smoke chemistry and toxicology of cigarettes, compared with older "low-speed-manufacture" adhesives (e.g. starch based). This study was conducted to determine whether the inclusion of different levels of three adhesives (ethylene vinyl acetate, polyvinyl acetate and starch) in experimental cigarettes results in different smoke chemistry and toxicological responses in in vitro and in vivo assays. A battery of tests (analytical chemistry, in vitro and in vivo assays) was used to compare the chemistry and toxicology of smoke from experimental cigarettes made with different combinations of the three adhesives. Varying levels of the different side-seam adhesives, as well as the transfer of adhesives from packaging materials, were tested. There were differences in some mainstream cigarette smoke constituents as a function of the level of adhesive added to experimental cigarettes and between the tested adhesives. None of these differences translated into statistically significant differences in the in vitro or in vivo assays. The use of newer "high-speed-manufacture" vinyl acetate-based adhesives in cigarettes does not produce toxicological profiles that prevent the adhesives from replacing the older "low-speed-manufacture" adhesives (such as starch).

  6. Optimal statistical damage detection and classification in an experimental wind turbine blade using minimum instrumentation

    NASA Astrophysics Data System (ADS)

    Hoell, Simon; Omenzetter, Piotr

    2017-04-01

    The increasing demand for carbon-neutral energy in a challenging economic environment is a driving factor for erecting ever larger wind turbines in harsh environments using novel wind turbine blade (WTB) designs characterized by high flexibility and lower buckling capacity. To counteract the resulting increase in operation and maintenance costs, efficient structural health monitoring systems can be employed to prevent dramatic failures and to schedule maintenance actions according to the true structural state. This paper presents a novel methodology for classifying structural damage using vibrational responses from a single sensor. The method is based on statistical classification using Bayes' theorem and an advanced statistic, which allows controlling the performance by varying the number of samples representing the current state. This is done for multivariate damage-sensitive features (DSFs), defined as partial autocorrelation coefficients (PACCs) estimated from vibrational responses and principal component analysis scores computed from the PACCs. Additionally, optimal DSFs are composed not only for damage classification but also for damage detection based on binary statistical hypothesis testing, where feature selections are found with a fast forward procedure. The method is applied to laboratory experiments with a small-scale WTB with wind-like excitation and non-destructive damage scenarios. The obtained results demonstrate the advantages of the proposed procedure and are promising for future applications of vibration-based structural health monitoring of WTBs.
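
    A minimal sketch of the feature-extraction step described above, assuming the statsmodels and scikit-learn libraries; the lag count, record lengths, and number of principal components are illustrative choices, not the paper's settings.

      import numpy as np
      from statsmodels.tsa.stattools import pacf
      from sklearn.decomposition import PCA

      def pacc_features(response, n_lags=20):
          """Partial autocorrelation coefficients of one vibration response record."""
          return pacf(response, nlags=n_lags)[1:]   # drop the trivial lag-0 term

      # Stack PACC vectors from many response records, then project to PC scores
      records = [np.random.default_rng(i).standard_normal(2048) for i in range(50)]
      X = np.array([pacc_features(r) for r in records])
      scores = PCA(n_components=5).fit_transform(X)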

  7. Using simulation to interpret experimental data in terms of protein conformational ensembles.

    PubMed

    Allison, Jane R

    2017-04-01

    In their biological environment, proteins are dynamic molecules, necessitating an ensemble structural description. Molecular dynamics simulations and solution-state experiments provide complementary information in the form of atomically detailed coordinates and averages or distributions of structural properties or related quantities. Recently, increases in the temporal and spatial scale of conformational sampling, and comparison of the more diverse conformational ensembles thus generated, have revealed the importance of sampling rare events. Excitingly, new methods based on maximum entropy and Bayesian inference promise to provide a statistically sound mechanism for combining experimental data with molecular dynamics simulations. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Designing biomedical proteomics experiments: state-of-the-art and future perspectives.

    PubMed

    Maes, Evelyne; Kelchtermans, Pieter; Bittremieux, Wout; De Grave, Kurt; Degroeve, Sven; Hooyberghs, Jef; Mertens, Inge; Baggerman, Geert; Ramon, Jan; Laukens, Kris; Martens, Lennart; Valkenborg, Dirk

    2016-05-01

    With the current expanded technical capabilities to perform mass-spectrometry-based biomedical proteomics experiments, an improved focus on the design of experiments is crucial. As it is clear that ignoring the importance of good design leads to an inflated rate of false discoveries that would poison the results, more and more tools are being developed to help researchers design proteomics experiments. In this review, we apply statistical thinking to go through the entire proteomics workflow for biomarker discovery and validation, and relate the considerations that should be made at the level of hypothesis building, technology selection, experimental design, and the optimization of experimental parameters.

  9. The influence of the design matrix on treatment effect estimates in the quantitative analyses of single-subject experimental design research.

    PubMed

    Moeyaert, Mariola; Ugille, Maaike; Ferron, John M; Beretvas, S Natasha; Van den Noortgate, Wim

    2014-09-01

    The quantitative methods for analyzing single-subject experimental data have expanded during the last decade, including the use of regression models to statistically analyze the data, but many questions remain. One question is how to specify predictors in a regression model to account for the specifics of the design and to estimate the effect size of interest. These quantitative effect sizes are used in retrospective analyses and allow synthesis of single-subject experimental study results, which is informative for evidence-based decision making, research and theory building, and policy discussions. We discuss different design matrices that can be used for the most common single-subject experimental designs (SSEDs), namely multiple-baseline designs, reversal designs, and alternating treatment designs, and provide empirical illustrations. The purpose of this article is to guide single-subject experimental data analysts interested in analyzing and meta-analyzing SSED data. © The Author(s) 2014.
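
    For instance, one common design-matrix specification for a single-case AB phase design regresses the outcome on a baseline trend, a level-change dummy, and a slope-change term. The sketch below is generic; the choice of predictors is exactly the modelling decision the article discusses, and these columns are only one of several defensible options.

      import numpy as np

      def ab_design_matrix(n_obs, t_intervention):
          """Columns: intercept, time, phase dummy, time-since-intervention x phase."""
          t = np.arange(n_obs)
          phase = (t >= t_intervention).astype(float)
          return np.column_stack([np.ones(n_obs), t, phase,
                                  (t - t_intervention) * phase])

      X = ab_design_matrix(20, 10)
      # beta = np.linalg.lstsq(X, y, rcond=None)[0] then gives baseline trend,
      # level-change, and slope-change estimates for outcome vector y.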

  10. A framework for grouping nanoparticles based on their measurable characteristics.

    PubMed

    Sayes, Christie M; Smith, P Alex; Ivanov, Ivan V

    2013-01-01

    There is a need to take a broader look at nanotoxicological studies. Eventually, the field will demand that some generalizations be made. To begin to address this issue, we posed a question: are metal colloids on the nanometer-size scale a homogeneous group? In general, most people can agree that the physicochemical properties of nanomaterials can be linked and related to their induced toxicological responses. The focus of this study was to determine how a set of selected physicochemical properties of five specific metal-based colloidal materials on the nanometer-size scale - silver, copper, nickel, iron, and zinc - could be used as nanodescriptors that facilitate the grouping of these metal-based colloids. The example of the framework pipeline processing provided in this paper shows the utility of specific statistical and pattern recognition techniques in grouping nanoparticles based on experimental data about their physicochemical properties. Interestingly, the results of the analyses suggest that a seemingly homogeneous group of nanoparticles could be separated into sub-groups depending on interdependencies observed in their nanodescriptors. These particles represent an important category of nanomaterials that are currently mass produced. Each has been reputed to induce toxicological and/or cytotoxicological effects. Here, we propose an experimental methodology coupled with mathematical and statistical modeling that can serve as a prototype for a rigorous framework that aids in the ability to group nanomaterials together and to facilitate the subsequent analysis of trends in data based on quantitative modeling of nanoparticle-specific structure-activity relationships. The computational part of the proposed framework is rather general and can be applied to other groups of nanomaterials as well.

  11. The relevance of a rules-based maize marketing policy: an experimental case study of Zambia.

    PubMed

    Abbink, Klaus; Jayne, Thomas S; Moller, Lars C

    2011-01-01

    Strategic interaction between public and private actors is increasingly recognised as an important determinant of agricultural market performance in Africa and elsewhere. Trust and consultation tend to positively affect private activity, while uncertainty about government behaviour impedes it. This paper reports on a laboratory experiment based on a stylised model of the Zambian maize market. The experiment facilitates a comparison between discretionary interventionism and a rules-based policy in which the government pre-commits itself to a future course of action. A simple pre-commitment rule can, in theory, overcome the prevailing strategic dilemma by encouraging private sector participation. Although this result is also borne out in the economic experiment, the improvement in private sector activity is surprisingly small and not statistically significant due to irrationally cautious choices by experimental governments. Encouragingly, a rules-based policy promotes a much more stable market outcome, thereby substantially reducing the risk of severe food shortages. These results underscore the importance of predictable and transparent rules for the state's involvement in agricultural markets.

  12. Laser-Based Trespassing Prediction in Restrictive Environments: A Linear Approach

    PubMed Central

    Cheein, Fernando Auat; Scaglia, Gustavo

    2012-01-01

    Stationary range laser sensors are extensively used in risky environments for intruder monitoring, restricted-space violation detection, and workspace determination. In this work we present a linear approach for predicting the presence of moving agents before they trespass a laser-based restricted space. Our approach is based on a Taylor series expansion of the detected objects' movements, which makes our proposal suitable for embedded applications. In the experimental results presented herein (carried out in different scenarios), our proposal shows 100% effectiveness in predicting trespassing situations. Several implementation results and statistical analyses showing the performance of our proposal are included in this work.
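
    The first-order (constant-velocity) case of such a Taylor-series prediction can be sketched as follows; the time step, prediction horizon, and rectangular zone are illustrative assumptions, not the paper's configuration.

      import numpy as np

      def predict_position(p_prev, p_curr, dt, horizon):
          """First-order Taylor extrapolation: p(t + h) ~ p(t) + p'(t) * h."""
          velocity = (np.asarray(p_curr) - np.asarray(p_prev)) / dt
          return np.asarray(p_curr) + velocity * horizon

      def will_trespass(p_pred, zone_min, zone_max):
          """True if the predicted point falls inside a rectangular restricted zone."""
          return bool(np.all(p_pred >= zone_min) and np.all(p_pred <= zone_max))

      p_next = predict_position([0.0, 2.0], [0.3, 1.6], dt=0.1, horizon=0.5)
      alarm = will_trespass(p_next, zone_min=[1.0, -1.0], zone_max=[3.0, 1.0])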

  13. Using a Discussion about Scientific Controversy to Teach Central Concepts in Experimental Design

    ERIC Educational Resources Information Center

    Bennett, Kimberley Ann

    2015-01-01

    Students may need explicit training in informal statistical reasoning in order to design experiments or use formal statistical tests effectively. By using scientific scandals and media misinterpretation, we can explore the need for good experimental design in an informal way. This article describes the use of a paper that reviews the measles mumps…

  14. Extreme event statistics in a drifting Markov chain

    NASA Astrophysics Data System (ADS)

    Kindermann, Farina; Hohmann, Michael; Lausch, Tobias; Mayer, Daniel; Schmidt, Felix; Widera, Artur

    2017-07-01

    We analyze extreme event statistics of experimentally realized Markov chains with various drifts. Our Markov chains are individual trajectories of a single atom diffusing in a one-dimensional periodic potential. Based on more than 500 individual atomic traces we verify the applicability of the Sparre Andersen theorem to our system despite the presence of a drift. We present detailed analysis of four different rare-event statistics for our system: the distributions of extreme values, of record values, of extreme value occurrence in the chain, and of the number of records in the chain. We observe that, for our data, the shape of the extreme event distributions is dominated by the underlying exponential distance distribution extracted from the atomic traces. Furthermore, we find that even small drifts influence the statistics of extreme events and record values, which is supported by numerical simulations, and we identify cases in which the drift can be determined without information about the underlying random variable distributions. Our results facilitate the use of extreme event statistics as a signal for small drifts in correlated trajectories.
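
    A sketch of one of the record statistics analyzed: counting upper records along a trajectory and comparing drifted against driftless walks. The drift value and trajectory length below are illustrative, not the experimental parameters.

      import numpy as np

      def count_records(x):
          """Number of upper records along a trajectory (the first point counts)."""
          records, running_max = 0, -np.inf
          for v in x:
              if v > running_max:
                  records, running_max = records + 1, v
          return records

      rng = np.random.default_rng(0)
      n = 500
      driftless = np.cumsum(rng.standard_normal(n))
      drifted = np.cumsum(rng.standard_normal(n) + 0.05)
      # For a symmetric random walk the mean record count grows like sqrt(4n/pi);
      # even a small drift (here +0.05 per step) systematically inflates it.
      print(count_records(driftless), np.sqrt(4 * n / np.pi), count_records(drifted))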

  15. Exploring complex dynamics in multi agent-based intelligent systems: Theoretical and experimental approaches using the Multi Agent-based Behavioral Economic Landscape (MABEL) model

    NASA Astrophysics Data System (ADS)

    Alexandridis, Konstantinos T.

    This dissertation adopts a holistic and detailed approach to modeling spatially explicit agent-based artificial intelligent systems, using the Multi Agent-based Behavioral Economic Landscape (MABEL) model. The research questions addressed stem from the need to understand and analyze real-world patterns and dynamics of land use change from a coupled human-environmental systems perspective. The dissertation describes the systemic, mathematical, statistical, socio-economic and spatial dynamics of the MABEL modeling framework, and provides a wide array of cross-disciplinary modeling applications within the research, decision-making and policy domains. It establishes the symbolic properties of the MABEL model as a Markov decision process, analyzes the decision-theoretic utility and optimization attributes of agents towards comprising statistically and spatially optimal policies and actions, and explores the probabilistic character of the agents' decision-making and inference mechanisms via the use of Bayesian belief and decision networks. It develops and describes a Monte Carlo methodology for experimental replications of agents' decisions regarding complex spatial parcel acquisition and learning. Recognizing the gap in spatially explicit accuracy assessment techniques for complex spatial models, it proposes an ensemble of statistical tools designed to address this problem, including advanced information assessment techniques such as the receiver operating characteristic curve, the impurity entropy and Gini functions, and Bayesian classification functions. The theoretical foundation for modular Bayesian inference in spatially explicit multi-agent artificial intelligent systems, and the ensembles of cognitive and scenario assessment modular tools built for the MABEL model, are provided. The dissertation emphasizes modularity and robustness as valuable qualitative modeling attributes, and examines the role of robust intelligent modeling as a tool for improving policy decisions related to land use change. Finally, the major contributions to science are presented along with valuable directions for future research.

  16. Perception-based road hazard identification with Internet support.

    PubMed

    Tarko, Andrew P; DeSalle, Brian R

    2003-01-01

    One of the most important tasks faced by highway agencies is identifying road hazards. Agencies use crash statistics to detect road intersections and segments where the frequency of crashes is excessive. With the crash-based method, a dangerous intersection or segment can be pointed out only after a sufficient number of crashes occur. A more proactive method is needed, and motorist complaints may be able to assist agencies in detecting road hazards before crashes occur. This paper investigates the quality of safety information reported by motorists and the effectiveness of hazard identification based on motorist reports, which were collected with an experimental Internet website. It demonstrates that the intersections pointed out by motorists tended to have more crashes than other intersections. The safety information collected through the website was comparable to 2-3 months of crash data. It was concluded that although the Internet-based method could not substitute for the traditional crash-based methods, its joint use with crash statistics might be useful in detecting new hazards where crash data had been collected for a short time.

  17. Statistical properties of the anomalous scaling exponent estimator based on time-averaged mean-square displacement

    NASA Astrophysics Data System (ADS)

    Sikora, Grzegorz; Teuerle, Marek; Wyłomańska, Agnieszka; Grebenkov, Denis

    2017-08-01

    The most common way of estimating the anomalous scaling exponent from single-particle trajectories consists of a linear fit of the dependence of the time-averaged mean-square displacement on the lag time on a log-log scale. We investigate the statistical properties of this estimator in the case of fractional Brownian motion (FBM). We determine the mean value, the variance, and the distribution of the estimator. Our theoretical results are confirmed by Monte Carlo simulations. In the limit of long trajectories, the estimator is shown to be asymptotically unbiased, consistent, and with vanishing variance. These properties ensure an accurate estimation of the scaling exponent even from a single (long enough) trajectory. As a consequence, we prove that the usual way to estimate the diffusion exponent of FBM is correct from the statistical point of view. Moreover, the knowledge of the estimator distribution is the first step toward new statistical tests of FBM and toward a more reliable interpretation of the experimental histograms of scaling exponents in microbiology.
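
    The estimator under study has a simple generic form; in this minimal sketch, the trajectory array, sampling step, and lag range are illustrative inputs.

      import numpy as np

      def estimate_alpha(traj, dt, max_lag):
          """Anomalous exponent from a log-log linear fit of the time-averaged MSD."""
          lags = np.arange(1, max_lag + 1)
          tamsd = np.array([np.mean((traj[k:] - traj[:-k]) ** 2) for k in lags])
          slope, _intercept = np.polyfit(np.log(lags * dt), np.log(tamsd), 1)
          return slope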

  18. Blind image quality assessment based on aesthetic and statistical quality-aware features

    NASA Astrophysics Data System (ADS)

    Jenadeleh, Mohsen; Masaeli, Mohammad Masood; Moghaddam, Mohsen Ebrahimi

    2017-07-01

    The main goal of image quality assessment (IQA) methods is the emulation of human perceptual image quality judgments. Therefore, the correlation between the objective scores of these methods and human perceptual scores is considered their performance metric. Human judgment of image quality implicitly includes many factors, such as aesthetics, semantics, context, and various types of visual distortions. The main idea of this paper is to use a host of features that are commonly employed in image aesthetics assessment in order to improve the accuracy of blind image quality assessment (BIQA) methods. We propose an approach that enriches the features of BIQA methods by integrating a host of aesthetic image features with natural image statistics features derived from multiple domains. The proposed features have been used to augment five different state-of-the-art BIQA methods, which use natural scene statistics features. Experiments were performed on seven benchmark image quality databases. The experimental results showed significant improvement in the accuracy of the methods.

  19. Effects of momentary self-monitoring on empowerment in a randomized controlled trial in patients with depression.

    PubMed

    Simons, C J P; Hartmann, J A; Kramer, I; Menne-Lothmann, C; Höhn, P; van Bemmel, A L; Myin-Germeys, I; Delespaul, P; van Os, J; Wichers, M

    2015-11-01

    Interventions based on the experience sampling method (ESM) are ideally suited to provide insight into personal, contextualized affective patterns in the flow of daily life. Recently, we showed that an ESM intervention focusing on positive affect was associated with a decrease in symptoms in patients with depression. The aim of the present study was to examine whether the ESM intervention increased patient empowerment. Depressed out-patients (n=102) receiving psychopharmacological treatment participated in a randomized controlled trial with three arms: (i) an experimental group receiving six weeks of ESM self-monitoring combined with weekly feedback sessions, (ii) a pseudo-experimental group participating in six weeks of ESM self-monitoring without feedback, and (iii) a control group (treatment as usual only). Patients were recruited in the Netherlands between January 2010 and February 2012. Self-report empowerment scores were obtained pre- and post-intervention. There was an effect of group×assessment period, indicating that the experimental (B=7.26, P=0.061, d=0.44, statistically imprecise) and pseudo-experimental groups (B=11.19, P=0.003, d=0.76) increased more in reported empowerment than the control group. In the pseudo-experimental group, 29% of the participants showed a statistically reliable increase in empowerment score and 0% a reliable decrease, compared to 17% reliable increase and 21% reliable decrease in the control group. The experimental group showed 19% reliable increase and 4% reliable decrease. These findings tentatively suggest that self-monitoring to complement standard antidepressant treatment may increase patients' feelings of empowerment. Further research is necessary to investigate long-term empowering effects of self-monitoring in combination with person-tailored feedback. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  20. Proper assessment of the JFK assassination bullet lead evidence from metallurgical and statistical perspectives.

    PubMed

    Randich, Erik; Grant, Patrick M

    2006-07-01

    The bullet evidence in the JFK assassination investigation was reexamined from metallurgical and statistical standpoints. The questioned specimens are comprised of soft lead, possibly from full-metal-jacketed Mannlicher-Carcano (MC), 6.5-mm ammunition. During lead refining, contaminant elements are removed to specified levels for a desired alloy or composition. Microsegregation of trace and minor elements during lead casting and processing can account for the experimental variabilities measured in various evidentiary and comparison samples by laboratory analysts. Thus, elevated concentrations of antimony and copper at crystallographic grain boundaries, the widely varying sizes of grains in MC bullet lead, and the 5-60 mg bullet samples analyzed for assassination intelligence effectively resulted in operational sampling error for the analyses. This deficiency was not considered in the original data interpretation and resulted in an invalid conclusion in favor of the single-bullet theory of the assassination. Alternate statistical calculations, based on the historic analytical data, incorporating weighted averaging and propagation of experimental uncertainties also considerably weaken support for the single-bullet theory. In effect, this assessment of the material composition of the lead specimens from the assassination concludes that the extant evidence is consistent with any number between two and five rounds fired in Dealey Plaza during the shooting.

  2. Statistical analysis of the effect of temperature and inlet humidities on the parameters of a semiempirical model of the internal resistance of a polymer electrolyte membrane fuel cell

    NASA Astrophysics Data System (ADS)

    Giner-Sanz, J. J.; Ortega, E. M.; Pérez-Herranz, V.

    2018-03-01

    The internal resistance of a PEM fuel cell depends on the operation conditions and on the current delivered by the cell. This work's goal is to obtain a semiempirical model able to reproduce the effect of the operation current on the internal resistance of an individual cell of a commercial PEM fuel cell stack; and to perform a statistical analysis in order to study the effect of the operation temperature and the inlet humidities on the parameters of the model. First, the internal resistance of the individual fuel cell operating in different operation conditions was experimentally measured for different DC currents, using the high frequency intercept of the impedance spectra. Then, a semiempirical model based on Springer and co-workers' model was proposed. This model is able to successfully reproduce the experimental trends. Subsequently, the curves of resistance versus DC current obtained for different operation conditions were fitted to the semiempirical model, and an analysis of variance (ANOVA) was performed in order to determine which factors have a statistically significant effect on each model parameter. Finally, a response surface method was applied in order to obtain a regression model.

  3. Time-domain methods for diffusive transport in soft matter

    PubMed Central

    Fricks, John; Yao, Lingxing; Elston, Timothy C.; Forest, M. Gregory

    2015-01-01

    Passive microrheology [12] utilizes measurements of noisy, entropic fluctuations (i.e., diffusive properties) of micron-scale spheres in soft matter to infer bulk frequency-dependent loss and storage moduli. Here, we are concerned exclusively with diffusion of Brownian particles in viscoelastic media, for which the Mason-Weitz theoretical-experimental protocol is ideal, and the more challenging inference of bulk viscoelastic moduli is decoupled. The diffusive theory begins with a generalized Langevin equation (GLE) with a memory drag law specified by a kernel [7, 16, 22, 23]. We start with a discrete formulation of the GLE as an autoregressive stochastic process governing microbead paths measured by particle tracking. For the inverse problem (recovery of the memory kernel from experimental data) we apply time series analysis (maximum likelihood estimators via the Kalman filter) directly to bead position data, an alternative to formulas based on mean-squared displacement statistics in frequency space. For direct modeling, we present statistically exact GLE algorithms for individual particle paths as well as statistical correlations for displacement and velocity. Our time-domain methods rest upon a generalization of well-known results for a single-mode exponential kernel [1, 7, 22, 23] to an arbitrary M-mode exponential series, for which the GLE is transformed to a vector Ornstein-Uhlenbeck process. PMID:26412904

  4. Hydrostatic paradox: experimental verification of pressure equilibrium

    NASA Astrophysics Data System (ADS)

    Kodejška, Č.; Ganci, S.; Říha, J.; Sedláčková, H.

    2017-11-01

    This work is focused on the experimental verification of the balance between the atmospheric pressure acting on a sheet of paper, which closes a cylinder completely or partially filled with water from below, and the hydrostatic pressure of the water column acting against the atmospheric pressure. The paper first presents a theoretical analysis of the problem, based on the equation for an isothermal process and on the equality of pressures inside and outside the cylinder. The measured values confirm the theoretically predicted quadratic dependence of the air pressure inside the cylinder on the level of the liquid in the cylinder: the maximum change in the volume of air within the cylinder occurs when the height of the water column L is one half of the total height of the vessel H. The measurements were made for different diameters of the cylinder and with plates made of different materials located at the bottom of the cylinder to prevent liquid from flowing out. The measured values were subjected to statistical analysis, which demonstrated the validity of the null hypothesis, i.e. that the measured values are not statistically significantly different from the theoretically calculated ones at the statistical significance level α = 0.05.
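
    A minimal sketch of the balance behind the quadratic dependence, for a cylinder of height H holding a water column of height L, trapped-air pressure p_in, atmospheric pressure p_0, water density rho, and sheet displacement Delta h (isothermal ideal gas assumed, sheet weight neglected):

      p_{\text{in}} + \rho g L = p_0 \quad \text{(pressure balance at the sheet)}, \qquad
      p_0\,(H - L) = p_{\text{in}}\,(H - L + \Delta h) \quad \text{(isothermal trapped air)}

      \Rightarrow\; \Delta h = (H - L)\,\frac{\rho g L}{p_0 - \rho g L}
      \approx \frac{\rho g}{p_0}\, L\,(H - L),

    which is quadratic in L and largest at L = H/2, matching the reported maximum.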

  5. Use of response surface methodology in a fed-batch process for optimization of tricarboxylic acid cycle intermediates to achieve high levels of canthaxanthin from Dietzia natronolimnaea HS-1.

    PubMed

    Nasri Nasrabadi, Mohammad Reza; Razavi, Seyed Hadi

    2010-04-01

    In this work, we applied statistical experimental design to a fed-batch process for optimization of tricarboxylic acid (TCA) cycle intermediates in order to achieve high-level production of canthaxanthin from Dietzia natronolimnaea HS-1 cultured in beet molasses. A fractional factorial design (screening test) was first conducted on five TCA cycle intermediates. Of the five intermediates investigated via screening tests, alpha-ketoglutarate, oxaloacetate and succinate were selected based on their statistically significant (P<0.05) and positive effects on canthaxanthin production. These significant factors were optimized by means of response surface methodology (RSM) in order to achieve high-level production of canthaxanthin. The experimental results of the RSM were fitted with a second-order polynomial equation by means of a multiple regression technique to identify the relationship between canthaxanthin production and the three TCA cycle intermediates. By means of this statistical design under a fed-batch process, the optimum conditions required to achieve the highest level of canthaxanthin (13,172 ± 25 µg l⁻¹) were determined as follows: alpha-ketoglutarate, 9.69 mM; oxaloacetate, 8.68 mM; succinate, 8.51 mM. Copyright 2009 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  6. The relation between statistical power and inference in fMRI

    PubMed Central

    Wager, Tor D.; Yarkoni, Tal

    2017-01-01

    Statistically underpowered studies can result in experimental failure even when all other experimental considerations have been addressed impeccably. In fMRI, the combination of a large number of dependent variables, a relatively small number of observations (subjects), and a need to correct for multiple comparisons can decrease statistical power dramatically. This problem has been clearly addressed yet remains controversial, especially with regard to the expected effect sizes in fMRI, and especially for between-subjects effects such as group comparisons and brain-behavior correlations. We aimed to clarify the power problem by considering and contrasting two simulated scenarios of such possible brain-behavior correlations: weak diffuse effects and strong localized effects. Sampling from these scenarios shows that, particularly in the weak diffuse scenario, common sample sizes (n = 20-30) display extremely low statistical power, poorly represent the actual effects in the full sample, and show large variation on subsequent replications. Empirical data from the Human Connectome Project resemble the weak diffuse scenario much more than the strong localized scenario, which underscores the extent of the power problem for many studies. Possible solutions to the power problem include increasing the sample size, using less stringent thresholds, or focusing on a region of interest. However, these approaches are not always feasible and some have major drawbacks. The most prominent solutions that may help address the power problem include model-based (multivariate) prediction methods and meta-analyses with related synthesis-oriented approaches. PMID:29155843
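
    In the same spirit as the simulations described, a minimal Monte Carlo power check for a single brain-behavior correlation; the effect size, map-wise threshold, and sample size below are illustrative, not the paper's scenarios.

      import numpy as np
      from scipy import stats

      def power_corr(n, r_true, alpha=0.001, n_sims=5000, seed=0):
          """Monte Carlo power for a two-sided test of a Pearson correlation."""
          rng = np.random.default_rng(seed)
          rejections = 0
          for _ in range(n_sims):
              x = rng.standard_normal(n)
              y = r_true * x + np.sqrt(1.0 - r_true ** 2) * rng.standard_normal(n)
              r, p = stats.pearsonr(x, y)
              rejections += (p < alpha)
          return rejections / n_sims

      # e.g. power_corr(25, 0.3) is typically well below 0.1 at this threshold,
      # illustrating how map-wise correction starves small samples of power.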

  7. All-atom four-body knowledge-based statistical potential to distinguish native tertiary RNA structures from nonnative folds.

    PubMed

    Masso, Majid

    2018-09-14

    Scientific breakthroughs in recent decades have uncovered the capability of RNA molecules to fulfill a wide array of structural, functional, and regulatory roles in living cells, leading to a concomitantly significant increase in both the number and diversity of experimentally determined RNA three-dimensional (3D) structures. Atomic coordinates from a representative training set of solved RNA structures, displaying low sequence and structure similarity, facilitate derivation of knowledge-based energy functions. Here we develop an all-atom four-body statistical potential and evaluate its capacity to distinguish native RNA 3D structures from nonnative folds based on calculated free energy scores. Atomic four-body nearest-neighbors are objectively identified by their occurrence as tetrahedral vertices in the Delaunay tessellations of RNA structures, and rates of atomic quadruplet interactions expected by chance are obtained from a multinomial reference distribution. Our four-body energy function, referred to as RAMP (ribonucleic acids multibody potential), is subsequently derived by applying the inverted Boltzmann principle to the frequency data, yielding an energy score for each type of atomic quadruplet interaction. Several well-known benchmark datasets reveal that RAMP is comparable with, and often outperforms, existing knowledge- and physics-based energy functions. To the best of our knowledge, this is the first study detailing an RNA tertiary structure-based multibody statistical potential and its comparative evaluation. Copyright © 2018 Elsevier Ltd. All rights reserved.
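
    The inverted Boltzmann step mentioned above has a simple generic form, sketched here in units of kT; RAMP's Delaunay tessellation bookkeeping and multinomial reference distribution are not reproduced, and the counts below are invented for illustration.

      import numpy as np

      def inverted_boltzmann(observed_counts, expected_probs):
          """Energy score per atomic-quadruplet type: E_i = -ln(f_obs,i / f_exp,i)."""
          f_obs = np.asarray(observed_counts, dtype=float)
          f_obs /= f_obs.sum()
          return -np.log(f_obs / np.asarray(expected_probs))

      # e.g. a quadruplet type seen twice as often as chance scores -ln(2) ~ -0.69 kT
      energies = inverted_boltzmann([200, 100, 700], [0.1, 0.1, 0.8])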

  8. A randomized evaluation of a computer-based physician's workstation: design considerations and baseline results.

    PubMed Central

    Rotman, B. L.; Sullivan, A. N.; McDonald, T.; DeSmedt, P.; Goodnature, D.; Higgins, M.; Suermondt, H. J.; Young, C. Y.; Owens, D. K.

    1995-01-01

    We are performing a randomized, controlled trial of a Physician's Workstation (PWS), an ambulatory care information system developed for use in the General Medical Clinic (GMC) of the Palo Alto VA. Goals for the project include selecting appropriate outcome variables and developing a statistically powerful experimental design with a limited number of subjects. As PWS provides real-time drug-ordering advice, we retrospectively examined drug costs and drug-drug interactions in order to select outcome variables sensitive to our short-term intervention as well as to estimate the statistical efficiency of alternative design possibilities. Drug cost data revealed a mean daily cost per physician per patient of 99.3 ± 13.4 cents, with a range from $0.77 to $1.37. The rate of major interactions per prescription for each physician was 2.9% ± 1%, with a range from 1.5% to 4.8%. Based on these baseline analyses, we selected a two-period parallel design for the evaluation, which maximized statistical power while minimizing sources of bias. PMID:8563376

  9. Statistical estimation of femur micro-architecture using optimal shape and density predictors.

    PubMed

    Lekadir, Karim; Hazrati-Marangalou, Javad; Hoogendoorn, Corné; Taylor, Zeike; van Rietbergen, Bert; Frangi, Alejandro F

    2015-02-26

    The personalization of trabecular micro-architecture has been recently shown to be important in patient-specific biomechanical models of the femur. However, high-resolution in vivo imaging of bone micro-architecture using existing modalities is still infeasible in practice due to the associated acquisition times, costs, and X-ray radiation exposure. In this study, we describe a statistical approach for the prediction of the femur micro-architecture based on the more easily extracted subject-specific bone shape and mineral density information. To this end, a training sample of ex vivo micro-CT images is used to learn the existing statistical relationships within the low and high resolution image data. More specifically, optimal bone shape and mineral density features are selected based on their predictive power and used within a partial least square regression model to estimate the unknown trabecular micro-architecture within the anatomical models of new subjects. The experimental results demonstrate the accuracy of the proposed approach, with average errors of 0.07 for both the degree of anisotropy and tensor norms. Copyright © 2015 Elsevier Ltd. All rights reserved.
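
    The regression step can be sketched with scikit-learn's partial least squares implementation; the array shapes and random placeholder data below are purely illustrative stand-ins for the shape/density predictors and micro-architecture descriptors.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(0)
      X = rng.standard_normal((40, 200))   # shape + density predictors per subject
      Y = rng.standard_normal((40, 6))     # micro-architecture descriptors (e.g. anisotropy)

      pls = PLSRegression(n_components=5).fit(X, Y)
      Y_pred = pls.predict(X[:1])          # estimate micro-architecture for a new subject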

  10. Research on the correlation between corona current spectrum and audible noise spectrum of HVDC transmission line

    NASA Astrophysics Data System (ADS)

    Liu, Yingyi; Zhou, Lijuan; Liu, Yuanqing; Yuan, Haiwen; Ji, Liang

    2017-11-01

    Audible noise is closely related to corona current on a high-voltage direct current (HVDC) transmission line. In this paper, we simultaneously measured a large number of audible noise and corona current waveforms at the world's largest outdoor HVDC corona cage. By analyzing the experimental data, statistical regularities relating the corona current spectrum and the audible noise spectrum were obtained. Furthermore, the generation mechanism of audible noise was analyzed theoretically, and a mathematical expression relating the audible noise spectrum to the corona current spectrum, valid for all of the measuring points in space, was established based on electro-acoustic conversion theory. Finally, combined with the obtained mathematical relation, the underlying reasons for the statistical regularities appearing in the measured corona current and audible noise data were explained. The results of this paper not only present the statistical association between the corona current spectrum and the audible noise spectrum on an HVDC transmission line, but also reveal the inherent reasons for these associated rules.

  11. The Impact of Team-Based Learning on Nervous System Examination Knowledge of Nursing Students.

    PubMed

    Hemmati Maslakpak, Masomeh; Parizad, Naser; Zareie, Farzad

    2015-12-01

    Team-based learning is one of the active learning approaches, in which independent learning is combined with small-group discussion in class. This study aimed to determine the impact of team-based learning on nursing students' knowledge of nervous system examination. This quasi-experimental study was conducted on 3rd-year nursing students: 5th-semester students formed the intervention group and 6th-semester students the control group. The team-based learning method and the traditional lecture method were used to teach nervous system examination to the intervention and control groups, respectively. The data were collected by a 40-question test (multiple choice, matching, gap-filling and descriptive questions) before and after the intervention in both groups. An Individual Readiness Assurance Test (RAT) and a Group Readiness Assurance Test (GRAT) were used to collect data in the intervention group. The collected data were analyzed with SPSS ver. 13 using descriptive and inferential statistical tests. In the team-based learning group, the mean (standard deviation) score increased from 13.39 (4.52) before the intervention to 31.07 (3.20) after the intervention, and this increase was statistically significant. Also, there was a statistically significant difference between the RAT and GRAT scores in the team-based learning group. Using the team-based learning approach resulted in much better improvement and retention of nervous system examination knowledge compared to the traditional lecture method; therefore, this method could be efficiently used as an effective educational approach in nursing education.

  12. A statistical model describing combined irreversible electroporation and electroporation-induced blood-brain barrier disruption.

    PubMed

    Sharabi, Shirley; Kos, Bor; Last, David; Guez, David; Daniels, Dianne; Harnof, Sagi; Mardor, Yael; Miklavcic, Damijan

    2016-03-01

    Electroporation-based therapies such as electrochemotherapy (ECT) and irreversible electroporation (IRE) are emerging as promising tools for treatment of tumors. When applied to the brain, electroporation can also induce transient blood-brain barrier (BBB) disruption in volumes extending beyond the IRE volume, thus enabling efficient drug penetration. The main objective of this study was to develop a statistical model predicting cell death and BBB disruption induced by electroporation; such a model can be used for individual treatment planning. Cell death and BBB disruption models were developed based on the Peleg-Fermi model in combination with numerical models of the electric field. The model calculates the electric field thresholds for cell kill and BBB disruption and describes their dependence on the number of treatment pulses. The model was validated using in vivo experimental data consisting of MRIs of rat brains after electroporation treatments. Linear regression analysis confirmed that the model described the IRE and BBB disruption volumes as a function of the number of treatment pulses (r(2) = 0.79, p < 0.008 and r(2) = 0.91, p < 0.001, respectively). The results showed a strong plateau effect as the pulse number increased. The ratio between the complete cell death and no cell death thresholds was relatively narrow (0.88-0.91) even for small numbers of pulses and depended only weakly on the number of pulses; for BBB disruption, the ratio increased with the number of pulses. BBB disruption radii were on average 67% ± 11% larger than the corresponding IRE radii. The statistical model can be used to describe the dependence of treatment effects on the number of pulses independent of the experimental setup.
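
    The Peleg-Fermi dependence on field strength and pulse number has the generic shape sketched below; the parameter values and the exponential pulse-number law are illustrative assumptions, not the paper's fitted model.

      import numpy as np

      def peleg_fermi_survival(E, n_pulses, Ec_ref, k, A):
          """Surviving cell fraction S = 1 / (1 + exp((E - Ec(n)) / A)), with the
          critical field Ec(n) = Ec_ref * exp(-k * n) decreasing with pulse number."""
          Ec = Ec_ref * np.exp(-k * n_pulses)
          return 1.0 / (1.0 + np.exp((E - Ec) / A))

      # e.g. survival at 600 V/cm after 10 vs 90 pulses (illustrative parameters);
      # the gap shrinks at high pulse counts, echoing the reported plateau effect.
      s10 = peleg_fermi_survival(600.0, 10, Ec_ref=800.0, k=0.01, A=50.0)
      s90 = peleg_fermi_survival(600.0, 90, Ec_ref=800.0, k=0.01, A=50.0)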

  13. VoroMQA: Assessment of protein structure quality using interatomic contact areas.

    PubMed

    Olechnovič, Kliment; Venclovas, Česlovas

    2017-06-01

    In the absence of experimentally determined protein structure many biological questions can be addressed using computational structural models. However, the utility of protein structural models depends on their quality. Therefore, the estimation of the quality of predicted structures is an important problem. One of the approaches to this problem is the use of knowledge-based statistical potentials. Such methods typically rely on the statistics of distances and angles of residue-residue or atom-atom interactions collected from experimentally determined structures. Here, we present VoroMQA (Voronoi tessellation-based Model Quality Assessment), a new method for the estimation of protein structure quality. Our method combines the idea of statistical potentials with the use of interatomic contact areas instead of distances. Contact areas, derived using Voronoi tessellation of protein structure, are used to describe and seamlessly integrate both explicit interactions between protein atoms and implicit interactions of protein atoms with solvent. VoroMQA produces scores at atomic, residue, and global levels, all in the fixed range from 0 to 1. The method was tested on the CASP data and compared to several other single-model quality assessment methods. VoroMQA showed strong performance in the recognition of the native structure and in the structural model selection tests, thus demonstrating the efficacy of interatomic contact areas in estimating protein structure quality. The software implementation of VoroMQA is freely available as a standalone application and as a web server at http://bioinformatics.lt/software/voromqa. Proteins 2017; 85:1131-1145. © 2017 Wiley Periodicals, Inc.

  14. Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.

    2008-01-01

    Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than actual differences, if any, that exist between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075 even though AL7075 had a fatigue life 30 percent greater than AL6061.
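
    A minimal sketch of the kind of Monte Carlo experiment described, assuming a two-parameter Weibull life distribution; the shape, scale, and sample sizes are illustrative, not the report's fitted values.

      import numpy as np

      def l10_spread(shape, scale, n_specimens, n_trials=2000, seed=0):
          """5th-95th percentile spread of L10 estimates over repeated test campaigns."""
          rng = np.random.default_rng(seed)
          estimates = [np.quantile(scale * rng.weibull(shape, n_specimens), 0.10)
                       for _ in range(n_trials)]
          return np.quantile(estimates, [0.05, 0.95])

      # The spread shrinks as the test population grows, e.g. 10 vs 35 specimens:
      print(l10_spread(1.5, 1000.0, 10), l10_spread(1.5, 1000.0, 35))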

  16. Experimental design matters for statistical analysis: how to handle blocking.

    PubMed

    Jensen, Signe M; Schaarschmidt, Frank; Onofri, Andrea; Ritz, Christian

    2018-03-01

    Nowadays, evaluation of the effects of pesticides often relies on experimental designs that involve multiple concentrations of the pesticide of interest or multiple pesticides at specific comparable concentrations and, possibly, secondary factors of interest. Unfortunately, the experimental design is often more or less neglected when analysing data. Two data examples were analysed using different modelling strategies. First, in a randomized complete block design, mean heights of maize treated with a herbicide and one of several adjuvants were compared. Second, translocation of an insecticide applied to maize as a seed treatment was evaluated using incomplete data from an unbalanced design with several layers of hierarchical sampling. Extensive simulations were carried out to further substantiate the effects of different modelling strategies. It was shown that results from suboptimal approaches (two-sample t-tests and ordinary ANOVA assuming independent observations) may be both quantitatively and qualitatively different from the results obtained using an appropriate linear mixed model. The simulations demonstrated that the different approaches may lead to differences in coverage percentages of confidence intervals and type 1 error rates, confirming that misleading conclusions can easily happen when an inappropriate statistical approach is chosen. To ensure that experimental data are summarized appropriately, avoiding misleading conclusions, the experimental design should duly be reflected in the choice of statistical approaches and models. We recommend that author guidelines should explicitly point out that authors need to indicate how the statistical analysis reflects the experimental design. © 2017 Society of Chemical Industry.
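
    A minimal sketch of the recommended strategy for the first example: a linear mixed model with a random block intercept, fitted here with the statsmodels library; the data below are simulated placeholders, not the paper's maize measurements.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      blocks = np.repeat(np.arange(6), 4)                    # 6 complete blocks
      treatment = np.tile(["A", "B", "C", "D"], 6)           # 4 treatments per block
      height = (100.0 + np.repeat(rng.normal(0, 3, 6), 4)    # random block effects
                + np.where(treatment == "B", 4.0, 0.0)       # one treatment effect
                + rng.normal(0, 2, 24))                      # residual error
      df = pd.DataFrame({"height": height, "treatment": treatment, "block": blocks})

      # The random intercept per block reflects the randomized complete block design
      fit = smf.mixedlm("height ~ treatment", df, groups=df["block"]).fit()
      print(fit.summary())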

  17. An Independent Filter for Gene Set Testing Based on Spectral Enrichment.

    PubMed

    Frost, H Robert; Li, Zhigang; Asselbergs, Folkert W; Moore, Jason H

    2015-01-01

    Gene set testing has become an indispensable tool for the analysis of high-dimensional genomic data. An important motivation for testing gene sets, rather than individual genomic variables, is to improve statistical power by reducing the number of tested hypotheses. Given the dramatic growth in common gene set collections, however, testing is often performed with nearly as many gene sets as underlying genomic variables. To address the challenge to statistical power posed by large gene set collections, we have developed spectral gene set filtering (SGSF), a novel technique for independent filtering of gene set collections prior to gene set testing. The SGSF method uses as a filter statistic the p-value measuring the statistical significance of the association between each gene set and the sample principal components (PCs), taking into account the significance of the associated eigenvalues. Because this filter statistic is independent of standard gene set test statistics under the null hypothesis but dependent under the alternative, the proportion of enriched gene sets is increased without impacting the type I error rate. As shown using simulated and real gene expression data, the SGSF algorithm accurately filters gene sets unrelated to the experimental outcome resulting in significantly increased gene set testing power.

  18. Indices of fiber biopersistence and carcinogen classification for synthetic vitreous fibers (SVFs).

    PubMed

    Maxim, L Daniel; Boymel, Paul; Chase, Gerald R; Bernstein, David M

    2002-06-01

    It is generally accepted that the biopersistence of a synthetic vitreous fiber (SVF) is an important determinant of its biological activity. Experimental protocols have been developed to measure the biopersistence of an SVF from short-term inhalation experiments with rats. Clearance kinetics of long (>20 µm) fibers (those believed to have the greatest biological activity) have been approximated by one- or two-pool models. Several measures or indices of biopersistence have been proposed in the literature, of which three, the weighted half-time (WT1/2), the time required to clear 90% of long fibers (T0.9), and the so-called slow-phase half-time (T2), have been investigated in some detail. This paper considers both one- and two-pool models for long fiber clearance, characterizes the properties of these candidate indices of fiber biopersistence, identifies measures with potentially superior statistical properties, suggests possible cutoff values based on the relation between biopersistence and the outcome of chronic bioassays, and offers comments on the selection of efficient experimental designs. This analysis concludes that WT1/2 and T0.9 are highly correlated, are efficient predictors of the outcome of chronic bioassays, and have reasonable statistical properties. T2, although perhaps attractive in principle, suffers from some statistical shortcomings when estimated using present experimental protocols. The WT1/2 is shown to be directly proportional to the cumulative exposure (fiber days) after the cessation of exposure and also to the mean residence time of these fibers in the lung. Copyright 2002 Elsevier Science (USA).

  19. Modeling noisy resonant system response

    NASA Astrophysics Data System (ADS)

    Weber, Patrick Thomas; Walrath, David Edwin

    2017-02-01

    In this paper, a theory-based model replicating empirical acoustic resonant signals is presented and studied to understand the sources of noise present in acoustic signals. Statistical properties of empirical signals are quantified, and a noise amplitude parameter, which models frequency- and amplitude-based noise, is defined and characterized. This theory-driven model isolates each phenomenon and allows its parameters to be studied independently. Using seven independent degrees of freedom, the model accurately reproduces the qualitative and quantitative properties measured from laboratory data, and results demonstrating this agreement are presented.

  20. An examination of student attitudes and understanding of exponential functions using interactive instructional multimedia

    NASA Astrophysics Data System (ADS)

    Singleton, Cynthia M.

    The purpose of this study was to examine students' attitudes toward, and understanding of, exponential functions using InterAct Math, a mathematics tutorial software package. The researcher used a convenience sample of 78 students from two intact pre-calculus classes: 41 students in the experimental group and 37 in the control group. The two groups were exposed to the same curriculum content taught by the same instructor, the researcher. The experimental group used the tutorial software as an integral part of the instructional delivery; the control group received traditional instruction without the educational technology. Data were collected over two weeks using a mixed methodology to address the major research questions: (1) Is there a statistically significant difference in mean achievement test scores between the experimental and control groups? (2) Is there a statistically significant difference in students' attitudes toward learning mathematics between the experimental and control groups? The researcher utilized paired t-tests and independent t-tests to evaluate the effectiveness of the intervention and to establish whether there was a significant difference between the groups. Analyses of the quantitative data established that students who received the InterAct Math tutorial (experimental group) did not perform better than the control group on exponential functions, graphs and applications. However, the Aiken-Dreger Mathematics Attitude Scale revealed that, while students in both groups started with similar attitudes about mathematics and the integration of technology, their attitudes were significantly different at the conclusion of the study: the experimental group's fear of mathematics was reduced and their enjoyment of the subject matter increased, whereas no significant attitude change was reported for the control group. Qualitative data obtained from unstructured interviews of the treatment group supported these findings and attributed the change in attitudes to the use of the InterAct software in the instructional delivery of the course. The researcher concluded that the study did not provide evidence that InterAct Math could be credited with producing better learning outcomes, but that the tutorial software is an effective tool for promoting positive change in students' attitudes toward learning mathematics.

  1. The impact of the Virtual Ophthalmology Clinic on medical students' learning: a randomised controlled trial

    PubMed Central

    Succar, T; Zebington, G; Billson, F; Byth, K; Barrie, S; McCluskey, P; Grigg, J

    2013-01-01

    Aim The Virtual Ophthalmology Clinic (VOC) is an interactive web-based teaching module, with special emphasis on history taking and clinical reasoning skills. The purpose of this study was to determine the impact of VOC on medical students' learning. Methods A randomised controlled trial (RCT) was conducted with medical students from the University of Sydney (n=188) who were randomly assigned to either an experimental (n=93) or a control group (n=95). A pre- and post-test and a student satisfaction questionnaire were administered. Twelve months later, a follow-up test was conducted to assess graduates' long-term retention. Results There was a statistically significant (P<0.001) within-subject improvement from pre- to post-rotation in the number of correctly answered questions for both the control and experimental groups (mean improvement for control 10%, 95% CI 1.3–2.6, and for experimental 17.5%, 95% CI 3.0–4.0). The improvement was significantly greater in the experimental group (mean difference in improvement between groups 7.5%, 95% CI 0.8–2.3, P<0.001). At 12-month follow-up testing, the experimental group scored on average 1.6 points (8%) (95% CI 0.4 to 2.7, P=0.007) higher than the controls. Conclusion On the basis of a statistically significant improvement in academic performance and highly positive student feedback, the implementation of VOC may provide a means to address challenges to ophthalmic learning outcomes in an already crowded medical curriculum. PMID:23867718

  2. Statistical correlation analysis for comparing vibration data from test and analysis

    NASA Technical Reports Server (NTRS)

    Butler, T. G.; Strang, R. F.; Purves, L. R.; Hershfeld, D. J.

    1986-01-01

    A theory was developed to compare vibration modes obtained by NASTRAN analysis with those obtained experimentally. Because many more analytical modes can be obtained than experimental modes, the analytical set was treated as expansion functions for putting both sources in comparative form. Dimensional symmetry was developed for three general cases: a nonsymmetric whole model compared with a nonsymmetric whole structural test; a symmetric analytical portion compared with a symmetric experimental portion; and a symmetric analytical portion compared with a whole experimental test. The theory was coded and a statistical correlation program was installed as a utility. The theory was established with small classical structures.
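
    The expansion-function idea can be sketched generically: express each measured mode as a least-squares combination of the analytical modes and correlate the measured mode with its reconstruction. This is an illustration of the concept only; the report's exact statistic may differ, and all arrays below are invented.

```python
import numpy as np

def expand_and_correlate(phi_analysis, phi_test):
    """Expand test modes in the analytical mode basis and correlate.

    phi_analysis: (ndof, n_analytical) analytical mode shapes;
    phi_test: (ndof, n_test) measured mode shapes. Each test mode is
    expressed as a least-squares combination of analytical modes, and
    the correlation between each test mode and its reconstruction is
    returned.
    """
    coeffs, *_ = np.linalg.lstsq(phi_analysis, phi_test, rcond=None)
    recon = phi_analysis @ coeffs
    num = np.sum(recon * phi_test, axis=0)
    den = np.linalg.norm(recon, axis=0) * np.linalg.norm(phi_test, axis=0)
    return num / den   # 1.0 means the test mode lies in the analytical span

rng = np.random.default_rng(8)
phi_a = rng.standard_normal((100, 20))                       # hypothetical analytical modes
phi_e = phi_a[:, :3] + 0.1 * rng.standard_normal((100, 3))   # noisy "experimental" modes
print(expand_and_correlate(phi_a, phi_e))
```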

  3. Cognitive Stimulation of Elderly Residents in Social Protection Centers in Cartagena, 2014.

    PubMed

    Melguizo Herrera, Estela; Bertel De La Hoz, Anyel; Paternina Osorio, Diego; Felfle Fuentes, Yurani; Porto Osorio, Leidy

    To determine the effectiveness of a cognitive stimulation program for elderly residents of Social Protection Centers in Cartagena, 2014. Quasi-experimental study with pre- and post-tests in control and experimental groups. A sample of 37 elderly residents of Social Protection Centers participated: 23 in the experimental group and 14 in the control group. A survey and a mental evaluation test (Pfeiffer) were applied. The experimental group participated in 10 sessions of cognitive stimulation. The paired t-test showed statistically significant differences in the Pfeiffer test, pre- versus post-intervention, in the experimental group (P=.0005). The unpaired t-test showed statistically significant differences in Pfeiffer test results between the experimental and control groups (P=.0450). Principal component analysis showed that the most interrelated variables were age, diseases, number of errors, and test results, which were grouped around the disease variable with a negative association. The intervention demonstrated a statistically significant improvement in the cognitive functionality of the elderly. Nursing can lead this type of intervention, which should be studied further to strengthen and clarify these results. Copyright © 2016 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.

  4. On the Reporting of Experimental and Control Therapies in Stroke Rehabilitation Trials: A Systematic Review.

    PubMed

    Lohse, Keith R; Pathania, Anupriya; Wegman, Rebecca; Boyd, Lara A; Lang, Catherine E

    2018-03-01

    To use the Centralized Open-Access Rehabilitation database for Stroke to explore reporting of both experimental and control interventions in randomized controlled trials for stroke rehabilitation (including upper and lower extremity therapies). The Centralized Open-Access Rehabilitation database for Stroke was created from a search of MEDLINE, Embase, Cochrane Central Register of Controlled Trials, Cochrane Database of Systematic Reviews, and Cumulative Index of Nursing and Allied Health from the earliest available date to May 31, 2014. A total of 2892 titles were reduced to 514 that were screened by full text. This screening left 215 randomized controlled trials in the database (489 independent groups representing 12,847 patients). Using a mixture of qualitative and quantitative methods, we performed a text-based analysis of how the procedures of experimental and control therapies were described. Experimental and control groups were rated by 2 independent coders according to the Template for Intervention Description and Replication criteria. Linear mixed-effects regression with a random effect of study (groups nested within studies) showed that experimental groups had significantly more words in their procedure descriptions (mean, 271.8 words) than did control groups (mean, 154.8 words) (P<.001). Experimental groups also had significantly more references in their procedure descriptions (mean, 1.60 references) than did control groups (mean, 0.82 references) (P<.001), and scored significantly higher on the total Template for Intervention Description and Replication checklist (mean score, 7.43 points) than did control groups (mean score, 5.23 points) (P<.001). Control treatments in stroke motor rehabilitation trials are underdescribed relative to experimental treatments. These poor descriptions are especially problematic for "conventional" therapy control groups. Poor reporting is a threat to the internal validity and generalizability of clinical trial results. We recommend authors use preregistered protocols and established reporting criteria to improve transparency. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
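
    The analysis described (linear mixed-effects regression with groups nested within studies) can be reproduced in outline with statsmodels; the data frame below is entirely hypothetical:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_studies = 40
df = pd.DataFrame({
    "study": np.repeat(np.arange(n_studies), 2),   # two arms nested in each study
    "arm": ["experimental", "control"] * n_studies,
})
df["words"] = np.where(df["arm"] == "experimental", 270, 155) \
              + rng.normal(0, 40, len(df))         # hypothetical word counts

# Random intercept for study; fixed effect of arm on description length.
fit = smf.mixedlm("words ~ arm", df, groups=df["study"]).fit()
print(fit.summary())
```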

  5. CSMP Mathematics for the Intermediate Grades Part IV, Teacher's Guide. The Languages of Strings and Arrows. Geometry and Measurement. Probability and Statistics.

    ERIC Educational Resources Information Center

    CEMREL, Inc., St. Ann, MO.

    This guide represents the final experimental version of a pilot project which was conducted in the United States between 1973 and 1976. The ideas and the manner of presentation are based on the works of Georges and Frederique Papy. They are recognized for having introduced colored arrow drawings ("papygrams") and models of our numeration…

  6. A statistical model of aggregate fragmentation

    NASA Astrophysics Data System (ADS)

    Spahn, F.; Vieira Neto, E.; Guimarães, A. H. F.; Gorban, A. N.; Brilliantov, N. V.

    2014-01-01

    A statistical model of fragmentation of aggregates is proposed, based on the stochastic propagation of cracks through the body. The propagation rules are formulated on a lattice and mimic two important features of the process: a crack moves against the stress gradient while dissipating energy during its growth. We perform numerical simulations of the model on a two-dimensional lattice and reveal that the mass distribution for small- and intermediate-size fragments obeys a power law, F(m) ∝ m^(-3/2), in agreement with experimental observations. We develop an analytical theory which explains the detected power law and demonstrate that the overall fragment mass distribution in our model agrees qualitatively with that observed in experiments.
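
    The reported exponent can be checked numerically. As a stand-in for the lattice simulation itself, the sketch below draws fragment masses from F(m) ∝ m^(-3/2) by inverse-transform sampling and recovers the slope from a log-log histogram (all bounds and sample sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
# Draw masses from a power law with density ~ m^(-alpha) on [m_min, m_max].
alpha, m_min, m_max = 1.5, 1.0, 1.0e4
u = rng.random(100_000)
a = m_min ** (1 - alpha)
b = m_max ** (1 - alpha)
m = (a + u * (b - a)) ** (1.0 / (1 - alpha))   # inverse-transform sampling

# Estimate the exponent from the slope of a log-log histogram.
bins = np.logspace(np.log10(m_min), np.log10(m_max), 30)
hist, edges = np.histogram(m, bins=bins, density=True)
centers = np.sqrt(edges[:-1] * edges[1:])
mask = hist > 0
slope, _ = np.polyfit(np.log(centers[mask]), np.log(hist[mask]), 1)
print(f"fitted exponent: {slope:.2f}  (expected ~ -1.5)")
```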

  7. Robust matching for voice recognition

    NASA Astrophysics Data System (ADS)

    Higgins, Alan; Bahler, L.; Porter, J.; Blais, P.

    1994-10-01

    This paper describes an automated method of comparing a voice sample of an unknown individual with samples from known speakers in order to establish or verify the individual's identity. The method is based on a statistical pattern matching approach that employs a simple training procedure, requires no human intervention (transcription, word or phonetic marking, etc.), and makes no assumptions regarding the expected form of the statistical distributions of the observations. The content of the speech material (vocabulary, grammar, etc.) is not assumed to be constrained in any way. An algorithm is described which incorporates frame pruning and channel equalization processes designed to achieve robust performance with reasonable computational resources. An experimental implementation demonstrating the feasibility of the concept is described.
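
    The paper does not publish its algorithm, but the two named ingredients can be sketched generically: channel equalization is approximated here by feature mean subtraction, and frame pruning by keeping only the best-matching fraction of frames. The keep_frac parameter and the Euclidean nearest-frame distance are assumptions, not the authors' exact procedures.

```python
import numpy as np

def match_score(test_frames, ref_frames, keep_frac=0.75):
    """Score a test utterance against a reference speaker model.

    test_frames, ref_frames: (n_frames, n_features) arrays of, e.g.,
    cepstral features. Lower scores indicate a better match.
    """
    test = test_frames - test_frames.mean(axis=0)   # crude channel equalization
    ref = ref_frames - ref_frames.mean(axis=0)
    # Distance from each test frame to its nearest reference frame.
    d = np.linalg.norm(test[:, None, :] - ref[None, :, :], axis=2).min(axis=1)
    d.sort()
    kept = d[: int(keep_frac * len(d))]             # frame pruning
    return kept.mean()

rng = np.random.default_rng(7)
print(match_score(rng.standard_normal((200, 12)), rng.standard_normal((500, 12))))
```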

  8. Mindfulness training for reducing anger, anxiety, and depression in fibromyalgia patients.

    PubMed

    Amutio, Alberto; Franco, Clemente; Pérez-Fuentes, María de Carmen; Gázquez, José J; Mercader, Isabel

    2014-01-01

    Fibromyalgia is a disabling syndrome. Results obtained with different therapies are very limited to date. The goal of this study was to verify whether the application of a mindfulness-based training program was effective in modifying anger, anxiety, and depression levels in a group of women diagnosed with fibromyalgia. This study is an experimental trial that employed a waiting list control group. Measures were taken at three different times: pretest, posttest, and follow-up. The statistical analyses revealed a significant reduction of anger (trait) levels, internal expression of anger, state anxiety, and depression in the experimental group as compared to the control group, as well as a significant increase in internal control of anger. It can be concluded that the mindfulness-based treatment was effective after 7 weeks. These results were maintained 3 months after the end of the intervention.

  9. Effectiveness of a Computer-Based Training Program of Attention and Memory in Patients with Acquired Brain Damage

    PubMed Central

    Fernandez, Elizabeth; Bergado Rosado, Jorge A.; Rodriguez Perez, Daymi; Salazar Santana, Sonia; Torres Aguilar, Maydane; Bringas, Maria Luisa

    2017-01-01

    Many training programs have been designed using modern software to restore the impaired cognitive functions in patients with acquired brain damage (ABD). The objective of this study was to evaluate the effectiveness of a computer-based training program of attention and memory in patients with ABD, using a two-armed parallel group design, where the experimental group (n = 50) received cognitive stimulation using RehaCom software, and the control group (n = 30) received standard (non-computerized) cognitive stimulation for eight weeks. To assess possible cognitive changes after the treatment, a pre-post experimental design was employed using the following neuropsychological tests: Wechsler Memory Scale (WMS) and Trail Making Test A and B. The effectiveness of the training procedure was statistically significant (p < 0.05) when performance on these scales was compared before and after the training period, within each patient and between the two groups. The training group showed statistically significant (p < 0.001) changes in focused attention (Trail A), two subtests (digit span and logical memory), and the overall score of the WMS. Finally, we discuss the advantages of computerized training rehabilitation and further directions of this line of work. PMID:29301194

  10. Review of genotoxicity biomonitoring studies of glyphosate-based formulations

    PubMed Central

    Kier, Larry D.

    2015-01-01

    Abstract Human and environmental genotoxicity biomonitoring studies involving exposure to glyphosate-based formulations (GBFs) were reviewed to complement an earlier review of experimental genotoxicity studies of glyphosate and GBFs. The environmental and most of the human biomonitoring studies were not informative because there was either a very low frequency of GBF exposure or exposure to a large number of pesticides without analysis of specific pesticide effects. One pesticide sprayer biomonitoring study indicated there was not a statistically significant relationship between frequency of GBF exposure reported for the last spraying season and oxidative DNA damage. There were three studies of human populations in regions of GBF aerial spraying. One study found increases for the cytokinesis-block micronucleus endpoint but these increases did not show statistically significant associations with self-reported spray exposure and were not consistent with application rates. A second study found increases for the blood cell comet endpoint at high exposures causing toxicity. However, a follow-up to this study 2 years after spraying did not indicate chromosomal effects. The results of the biomonitoring studies do not contradict an earlier conclusion derived from experimental genotoxicity studies that typical GBFs do not appear to present significant genotoxic risk under normal conditions of human or environmental exposures. PMID:25687244

  11. Two-level QSAR network (2L-QSAR) for peptide inhibitor design based on amino acid properties and sequence positions.

    PubMed

    Du, Q S; Ma, Y; Xie, N Z; Huang, R B

    2014-01-01

    In peptide inhibitor design, the enormous variety of possible peptide sequences is a central concern. Building on the fast accumulation of peptide experimental data and databases, a statistical method is suggested for peptide inhibitor design. In the two-level peptide prediction network (2L-QSAR), one level is the physicochemical properties of amino acids and the other is the peptide sequence position. The activity contributions of amino acids are functions of physicochemical properties and sequence positions. In the prediction equation, two weight coefficient sets {ak} and {bl} are assigned to the physicochemical properties and to the sequence positions, respectively. After the two coefficient sets are optimized against the experimental data of known peptide inhibitors using the iterative double least square (IDLS) procedure, the coefficients are used to evaluate the bioactivities of newly designed peptide inhibitors. The two-level prediction network can be applied to peptide inhibitor design aimed at different target proteins, or at different positions of a protein. A notable advantage of the two-level statistical algorithm is that no host protein structural information is needed. It may also provide useful insight into amino acid properties and the roles of sequence positions.
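
    The alternating structure of the IDLS procedure follows directly from the description: with {bl} fixed the model is linear in {ak}, and vice versa. The sketch below is a generic reconstruction of that scheme (array shapes, iteration count, and the random demo data are assumptions, not the authors' code):

```python
import numpy as np

def idls(X, y, n_iter=50):
    """Iterative double least squares for a two-level model.

    X: (n_peptides, L_positions, K_properties) property values of the
    residue occupying each position; y: measured activities. Returns
    property weights a (K,) and position weights b (L,).
    """
    n, L, K = X.shape
    a = np.ones(K)
    b = np.ones(L)
    for _ in range(n_iter):
        # Fix b, solve for a:  y_i = sum_k a_k * (sum_l b_l * x_ilk)
        Fa = np.einsum("l,ilk->ik", b, X)
        a, *_ = np.linalg.lstsq(Fa, y, rcond=None)
        # Fix a, solve for b:  y_i = sum_l b_l * (sum_k a_k * x_ilk)
        Fb = np.einsum("k,ilk->il", a, X)
        b, *_ = np.linalg.lstsq(Fb, y, rcond=None)
    return a, b

rng = np.random.default_rng(0)
X = rng.random((40, 6, 5))   # 40 peptides, 6 positions, 5 properties
y = rng.random(40)
a, b = idls(X, y)
```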

  12. Genetic algorithm for the optimization of features and neural networks in ECG signals classification

    NASA Astrophysics Data System (ADS)

    Li, Hongqiang; Yuan, Danyang; Ma, Xiangdong; Cui, Dianyin; Cao, Lu

    2017-01-01

    Feature extraction and classification of electrocardiogram (ECG) signals are necessary for the automatic diagnosis of cardiac diseases. In this study, a novel method based on genetic algorithm-back propagation neural network (GA-BPNN) for classifying ECG signals with feature extraction using wavelet packet decomposition (WPD) is proposed. WPD combined with the statistical method is utilized to extract the effective features of ECG signals. The statistical features of the wavelet packet coefficients are calculated as the feature sets. GA is employed to decrease the dimensions of the feature sets and to optimize the weights and biases of the back propagation neural network (BPNN). Thereafter, the optimized BPNN classifier is applied to classify six types of ECG signals. In addition, an experimental platform is constructed for ECG signal acquisition to supply the ECG data for verifying the effectiveness of the proposed method. The GA-BPNN method with the MIT-BIH arrhythmia database achieved a dimension reduction of nearly 50% and produced good classification results with an accuracy of 97.78%. The experimental results based on the established acquisition platform indicated that the GA-BPNN method achieved a high classification accuracy of 99.33% and could be efficiently applied in the automatic identification of cardiac arrhythmias.
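
    One plausible concrete reading of the feature-extraction step, sketched with PyWavelets: decompose each beat into wavelet packet coefficients and compute simple statistics per terminal node. The wavelet choice, decomposition level, and the three statistics are assumptions, not the paper's exact settings.

```python
import numpy as np
import pywt

def wpd_statistical_features(signal, wavelet="db4", level=3):
    """Statistical features from wavelet packet coefficients.

    Returns the mean, standard deviation, and energy of the
    coefficients in each terminal node of the decomposition.
    """
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    features = []
    for node in wp.get_level(level, order="natural"):
        c = np.asarray(node.data)
        features.extend([c.mean(), c.std(), np.sum(c ** 2)])
    return np.array(features)

beat = np.random.default_rng(3).standard_normal(256)   # stand-in for an ECG beat
print(wpd_statistical_features(beat).shape)            # 8 nodes x 3 stats -> (24,)
```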

  13. Experimental statistics for biological sciences.

    PubMed

    Bang, Heejung; Davidian, Marie

    2010-01-01

    In this chapter, we cover basic and fundamental principles and methods in statistics - from "What are Data and Statistics?" to "ANOVA and linear regression" - which are the basis of any statistical thinking and undertaking. Readers can easily find the selected topics in most introductory statistics textbooks, but we have tried to assemble and structure them in a succinct and reader-friendly manner in a stand-alone chapter. This text has long been used in real classroom settings for both undergraduate and graduate students who may or may not major in statistical sciences. We hope that from this chapter readers will understand the key statistical concepts and terminologies, how to design a study (experimental or observational), how to analyze the data (e.g., describe the data and/or estimate the parameter(s) and make inference), and how to interpret the results. This text is most useful as supplemental material while readers take their own statistics courses, or as a self-teaching guide paired with the manual of any statistical software.

  14. Discrepancies between conformational distributions of a polyalanine peptide in solution obtained from molecular dynamics force fields and amide I' band profiles.

    PubMed

    Verbaro, Daniel; Ghosh, Indrajit; Nau, Werner M; Schweitzer-Stenner, Reinhard

    2010-12-30

    Structural preferences in the unfolded state of peptides determined by molecular dynamics still contradict experimental data. A remedy in this regard has been suggested by MD simulations with an optimized Amber force field ff03* (Best, R.; Hummer, G. J. Phys. Chem. B 2009, 113, 9004-9015). The simulations yielded a statistical coil distribution for alanine which is at variance with recent experimental results. To check the validity of this distribution, we investigated the peptide H-A(5)W-OH, which, with the exception of the additional terminal tryptophan, is analogous to the peptide used to optimize the force field ff03*. Electronic circular dichroism, vibrational circular dichroism, and infrared spectroscopy, as well as J-coupling constants obtained from NMR experiments, were used to derive the peptide's conformational ensemble. Additionally, Förster resonance energy transfer between the terminal chromophores of the fluorescently labeled peptide analogue H-Dbo-A(5)W-OH was used to determine its average length, from which the end-to-end distance of the unlabeled peptide was estimated. Qualitatively, the experimental (3)J(H(N),C(α)), VCD, and ECD indicated a preference of alanine for polyproline II-like conformations. The experimental (3)J(H(N),C(α)) for A(5)W closely resembles the constants obtained for A(5). In order to quantitatively relate the conformational distribution of A(5) obtained with the optimized Amber ff03* force field to experimental data, the former was used to derive a distribution function which expressed the conformational ensemble as a mixture of polyproline II, β-strand, helical, and turn conformations. This model was found to satisfactorily reproduce all experimental J-coupling constants. We employed the model to calculate the amide I' profiles of the IR and vibrational circular dichroism spectra of A(5)W, as well as the distance between the two terminal peptide carbonyls. This led to an underestimated negative VCD couplet and an overestimated distance between terminal carbonyl groups. In order to more accurately account for the experimental data, we changed the distribution parameters based on results recently obtained for alanine-based tripeptides. The final model, which satisfactorily reproduced the amide I' profiles, J-coupling constants, and the end-to-end distance of A(5)W, reinforces alanine's high structural preference for polyproline II. Our results suggest that MD-derived distributions suggesting a statistical coil-like distribution for alanine are still based on insufficiently accurate force fields.

  15. Compression-based classification of biological sequences and structures via the Universal Similarity Metric: experimental assessment.

    PubMed

    Ferragina, Paolo; Giancarlo, Raffaele; Greco, Valentina; Manzini, Giovanni; Valiente, Gabriel

    2007-07-13

    Similarity of sequences is a key mathematical notion for Classification and Phylogenetic studies in Biology. It is currently primarily handled using alignments. However, the alignment methods seem inadequate for post-genomic studies since they do not scale well with data set size and they seem to be confined only to genomic and proteomic sequences. Therefore, alignment-free similarity measures are actively pursued. Among those, USM (Universal Similarity Metric) has gained prominence. It is based on the deep theory of Kolmogorov Complexity and universality is its most novel striking feature. Since it can only be approximated via data compression, USM is a methodology rather than a formula quantifying the similarity of two strings. Three approximations of USM are available, namely UCD (Universal Compression Dissimilarity), NCD (Normalized Compression Dissimilarity) and CD (Compression Dissimilarity). Their applicability and robustness are tested on various data sets yielding a first massive quantitative estimate that the USM methodology and its approximations are of value. Despite the rich theory developed around USM, its experimental assessment has limitations: only a few data compressors have been tested in conjunction with USM and mostly at a qualitative level, no comparison among UCD, NCD and CD is available and no comparison of USM with existing methods, both based on alignments and not, seems to be available. We experimentally test the USM methodology by using 25 compressors, all three of its known approximations and six data sets of relevance to Molecular Biology. This offers the first systematic and quantitative experimental assessment of this methodology, that naturally complements the many theoretical and the preliminary experimental results available. Moreover, we compare the USM methodology both with methods based on alignments and not. We may group our experiments into two sets. The first one, performed via ROC (Receiver Operating Curve) analysis, aims at assessing the intrinsic ability of the methodology to discriminate and classify biological sequences and structures. A second set of experiments aims at assessing how well two commonly available classification algorithms, UPGMA (Unweighted Pair Group Method with Arithmetic Mean) and NJ (Neighbor Joining), can use the methodology to perform their task, their performance being evaluated against gold standards and with the use of well known statistical indexes, i.e., the F-measure and the partition distance. Based on the experiments, several conclusions can be drawn and, from them, novel valuable guidelines for the use of USM on biological data. The main ones are reported next. UCD and NCD are indistinguishable, i.e., they yield nearly the same values of the statistical indexes we have used, across experiments and data sets, while CD is almost always worse than both. UPGMA seems to yield better classification results with respect to NJ, i.e., better values of the statistical indexes (10% difference or above), on a substantial fraction of experiments, compressors and USM approximation choices. The compression program PPMd, based on PPM (Prediction by Partial Matching), for generic data and Gencompress for DNA, are the best performers among the compression algorithms we have used, although the difference in performance, as measured by statistical indexes, between them and the other algorithms depends critically on the data set and may not be as large as expected. PPMd used with UCD or NCD and UPGMA, on sequence data is very close, although worse, in performance with the alignment methods (less than 2% difference on the F-measure). Yet, it scales well with data set size and it can work on data other than sequences. In summary, our quantitative analysis naturally complements the rich theory behind USM and supports the conclusion that the methodology is worth using because of its robustness, flexibility, scalability, and competitiveness with existing techniques. In particular, the methodology applies to all biological data in textual format. The software and data sets are available under the GNU GPL at the supplementary material web page.
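
    The USM approximations themselves are one-line formulas once a compressor is fixed. A sketch of NCD using zlib (the paper evaluates 25 compressors such as PPMd and Gencompress, so zlib here is purely illustrative):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Dissimilarity: smaller means more similar."""
    cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

print(ncd(b"ACGTACGT" * 100, b"ACGTACGA" * 100))   # similar strings give a small value
```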

  16. Compression-based classification of biological sequences and structures via the Universal Similarity Metric: experimental assessment

    PubMed Central

    Ferragina, Paolo; Giancarlo, Raffaele; Greco, Valentina; Manzini, Giovanni; Valiente, Gabriel

    2007-01-01

    Background Similarity of sequences is a key mathematical notion for Classification and Phylogenetic studies in Biology. It is currently primarily handled using alignments. However, the alignment methods seem inadequate for post-genomic studies since they do not scale well with data set size and they seem to be confined only to genomic and proteomic sequences. Therefore, alignment-free similarity measures are actively pursued. Among those, USM (Universal Similarity Metric) has gained prominence. It is based on the deep theory of Kolmogorov Complexity and universality is its most novel striking feature. Since it can only be approximated via data compression, USM is a methodology rather than a formula quantifying the similarity of two strings. Three approximations of USM are available, namely UCD (Universal Compression Dissimilarity), NCD (Normalized Compression Dissimilarity) and CD (Compression Dissimilarity). Their applicability and robustness are tested on various data sets yielding a first massive quantitative estimate that the USM methodology and its approximations are of value. Despite the rich theory developed around USM, its experimental assessment has limitations: only a few data compressors have been tested in conjunction with USM and mostly at a qualitative level, no comparison among UCD, NCD and CD is available and no comparison of USM with existing methods, both based on alignments and not, seems to be available. Results We experimentally test the USM methodology by using 25 compressors, all three of its known approximations and six data sets of relevance to Molecular Biology. This offers the first systematic and quantitative experimental assessment of this methodology, that naturally complements the many theoretical and the preliminary experimental results available. Moreover, we compare the USM methodology both with methods based on alignments and not. We may group our experiments into two sets. The first one, performed via ROC (Receiver Operating Curve) analysis, aims at assessing the intrinsic ability of the methodology to discriminate and classify biological sequences and structures. A second set of experiments aims at assessing how well two commonly available classification algorithms, UPGMA (Unweighted Pair Group Method with Arithmetic Mean) and NJ (Neighbor Joining), can use the methodology to perform their task, their performance being evaluated against gold standards and with the use of well known statistical indexes, i.e., the F-measure and the partition distance. Based on the experiments, several conclusions can be drawn and, from them, novel valuable guidelines for the use of USM on biological data. The main ones are reported next. Conclusion UCD and NCD are indistinguishable, i.e., they yield nearly the same values of the statistical indexes we have used, across experiments and data sets, while CD is almost always worse than both. UPGMA seems to yield better classification results with respect to NJ, i.e., better values of the statistical indexes (10% difference or above), on a substantial fraction of experiments, compressors and USM approximation choices. The compression program PPMd, based on PPM (Prediction by Partial Matching), for generic data and Gencompress for DNA, are the best performers among the compression algorithms we have used, although the difference in performance, as measured by statistical indexes, between them and the other algorithms depends critically on the data set and may not be as large as expected. PPMd used with UCD or NCD and UPGMA, on sequence data is very close, although worse, in performance with the alignment methods (less than 2% difference on the F-measure). Yet, it scales well with data set size and it can work on data other than sequences. In summary, our quantitative analysis naturally complements the rich theory behind USM and supports the conclusion that the methodology is worth using because of its robustness, flexibility, scalability, and competitiveness with existing techniques. In particular, the methodology applies to all biological data in textual format. The software and data sets are available under the GNU GPL at the supplementary material web page. PMID:17629909

  17. Point-by-point compositional analysis for atom probe tomography.

    PubMed

    Stephenson, Leigh T; Ceguerra, Anna V; Li, Tong; Rojhirunsakool, Tanaporn; Nag, Soumya; Banerjee, Rajarshi; Cairney, Julie M; Ringer, Simon P

    2014-01-01

    This new alternate approach to data processing for analyses that traditionally employed grid-based counting methods is necessary because it removes a user-imposed coordinate system that not only limits an analysis but also may introduce errors. We have modified the widely used "binomial" analysis for APT data by replacing grid-based counting with coordinate-independent nearest-neighbour identification, improving the measurements and the statistics obtained, allowing quantitative analysis of smaller datasets and datasets from non-dilute solid solutions. It also allows better visualisation of compositional fluctuations in the data. Our modifications include: (1) using spherical k-atom blocks identified by each detected atom's first k nearest neighbours; (2) 3D data visualisation of block composition and nearest-neighbour anisotropy; and (3) using z-statistics to directly compare experimental and expected composition curves. Similar modifications may be made to other grid-based counting analyses (contingency table, Langer-Bar-on-Miller, sinusoidal model) and could be instrumental in developing novel data visualisation options.
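
    The grid-free block construction can be sketched with a k-d tree: for every detected atom, take its k nearest neighbours as the block and record the solute fraction. The value of k, the species split, and the uniform random positions below are assumptions for illustration only.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_block_composition(positions, is_solute, k=50):
    """Composition of the k-nearest-neighbour block around each atom.

    positions: (n, 3) detected atom coordinates; is_solute: boolean
    array marking the species of interest. Returns, per atom, the
    solute fraction among its k nearest neighbours - a grid-free
    analogue of block counting.
    """
    tree = cKDTree(positions)
    # k+1 because the first neighbour returned is the atom itself.
    _, idx = tree.query(positions, k=k + 1)
    return is_solute[idx[:, 1:]].mean(axis=1)

rng = np.random.default_rng(4)
pos = rng.random((10_000, 3))
solute = rng.random(10_000) < 0.1   # hypothetical 10 at.% solute
comp = knn_block_composition(pos, solute)
```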

  18. Statistics and bioinformatics in nutritional sciences: analysis of complex data in the era of systems biology⋆

    PubMed Central

    Fu, Wenjiang J.; Stromberg, Arnold J.; Viele, Kert; Carroll, Raymond J.; Wu, Guoyao

    2009-01-01

    Over the past two decades, there have been revolutionary developments in life science technologies characterized by high throughput, high efficiency, and rapid computation. Nutritionists now have advanced methodologies for the analysis of DNA, RNA, proteins, and low-molecular-weight metabolites, as well as access to bioinformatics databases. Statistics, which can be defined as the process of making scientific inferences from data that contain variability, has historically played an integral role in advancing nutritional sciences. Currently, in the era of systems biology, statistics has become an increasingly important tool to quantitatively analyze information about biological macromolecules. This article describes general terms used in the statistical analysis of large, complex experimental data. These terms include experimental design, power analysis, sample size calculation, and experimental errors (type I and II errors) for nutritional studies at the population, tissue, cellular, and molecular levels. In addition, we highlight various sources of experimental variation in studies involving microarray gene expression, real-time polymerase chain reaction, proteomics, and other bioinformatics technologies. Moreover, we provide guidelines for nutritionists and other biomedical scientists to plan and conduct studies and to analyze the complex data. Appropriate statistical analyses are expected to make an important contribution to solving major nutrition-associated problems in humans and animals (including obesity, diabetes, cardiovascular disease, cancer, ageing, and intrauterine fetal retardation). PMID:20233650
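
    As a concrete example of the power-analysis and sample-size terms mentioned here, statsmodels can solve for the group size needed to detect a given effect; the effect size, alpha, and power below are conventional illustrative choices, not values from the article.

```python
from statsmodels.stats.power import TTestIndPower

# Sample size per group to detect a medium effect (Cohen's d = 0.5)
# with 80% power at a two-sided alpha of 0.05.
n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05,
                                          power=0.8, alternative="two-sided")
print(round(n_per_group))   # ~64 per group
```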

  19. Beneficial role of green plantain [Musa paradisiaca] in the management of persistent diarrhea: a prospective randomized trial.

    PubMed

    Alvarez-Acosta, Thais; León, Cira; Acosta-González, Salvador; Parra-Soto, Haydeé; Cluet-Rodriguez, Isabel; Rossell, Maria Rosario; Colina-Chourio, José A

    2009-04-01

    To evaluate the beneficial effects of a green plantain-based diet on stool volume, stool frequency, and weight gain, as compared with a traditional yogurt-based diet, in children with persistent diarrhea. In a prospective, in-hospital controlled trial, two different treatments were administered to a sample of 80 children of both sexes, aged 1 to 28 months, who had experienced 14 days or more of persistent diarrhea. The sample was divided into two isocaloric (100 kcal/kg/d) diet groups, experimental and control, of 40 patients each. The experimental group was randomly given a one-week treatment consisting of a diet based on 50 g/L of cooked green plantain. The control group was fed a yogurt-based diet. The two groups were not statistically different at admission. Pathogens were isolated from stools in 21.2% and 25% of patients in the experimental and control groups, respectively; Aeromonas hydrophila and Shigella flexneri were the most frequently found bacteria. The experimental group fed the green plantain diet had a significantly better response in reduced stool output and improved stool consistency (p < 0.002), stool weight, and diarrhea duration (p < 0.001), and in increased daily body weight gain (p < 0.001), than the yogurt-based diet group. The average duration of diarrhea in the plantain-based diet group was 18 hours shorter (p < 0.005), and this diet also had a lower cost (p < 0.005). Our results support the benefits of green plantain in the dietary management of persistent diarrhea in hospitalized children, with respect to diarrheal duration, weight gain, and costs.

  20. Comparative efficacy of two battery-powered toothbrushes on dental plaque removal.

    PubMed

    Ruhlman, C Douglas; Bartizek, Robert D; Biesbrock, Aaron R

    2002-01-01

    A number of clinical studies have consistently demonstrated that power toothbrushes deliver superior plaque removal compared to manual toothbrushes. Recently, a new power toothbrush (Crest SpinBrush) has been marketed with a design that fundamentally differs from other marketed power toothbrushes. Other power toothbrushes feature a small, round head designed to oscillate for enhanced cleaning between the teeth and below the gumline. The new power toothbrush incorporates a similar round oscillating head in conjunction with fixed bristles, which allows the user to brush with optimal manual brushing technique. The objective of this randomized, examiner-blind, parallel-design study was to compare the plaque removal efficacy of a positive control power toothbrush (Colgate Actibrush) to an experimental toothbrush (Crest SpinBrush) following a single use among 59 subjects. Baseline plaque scores were 1.64 and 1.40 for the experimental and control toothbrush treatment groups, respectively. With regard to all surfaces examined, the experimental toothbrush delivered an adjusted (via analysis of covariance) mean difference between baseline and post-brushing plaque scores of 0.47, while the control toothbrush delivered an adjusted mean difference of 0.33. On average, the difference between toothbrushes was statistically significant (p = 0.013). Because the covariate slope for the experimental group was statistically significantly greater (p = 0.001) than the slope for the control group, a separate slope model was used. Further analysis demonstrated that the experimental group had statistically significantly greater plaque removal than the control group for baseline plaque scores above 1.43. With respect to buccal surfaces, using a separate slope analysis of covariance, the experimental toothbrush delivered an adjusted mean difference between baseline and post-brushing plaque scores of 0.61, while the control toothbrush delivered an adjusted mean difference of 0.39. This difference between toothbrushes was also statistically significant (p = 0.002). On average, the results on lingual surfaces demonstrated similar directional scores favoring the experimental toothbrush; however, these results did not achieve statistical significance. In conclusion, the experimental Crest SpinBrush, with its novel fixed and oscillating bristle design, was found to be more effective than the positive control Colgate Actibrush, which is designed with a small round oscillating cluster of bristles.
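
    The separate-slope analysis of covariance used in the study corresponds to adding a baseline-by-group interaction to the regression model. A hypothetical sketch with statsmodels (all numbers are simulated, not the trial's data):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 59
df = pd.DataFrame({
    "group": ["experimental"] * 30 + ["control"] * 29,
    "baseline": np.r_[rng.normal(1.64, 0.3, 30), rng.normal(1.40, 0.3, 29)],
})
df["reduction"] = 0.3 * df["baseline"] + 0.1 * (df["group"] == "experimental") \
                  + rng.normal(0, 0.1, n)

# Separate-slopes ANCOVA: the baseline x group interaction lets each
# group keep its own covariate slope.
fit = smf.ols("reduction ~ baseline * group", data=df).fit()
print(fit.summary().tables[1])
```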

  1. Effect of sports vision exercise on visual perception and reading performance in 7- to 10-year-old developmental dyslexic children.

    PubMed

    Badami, Rokhsareh; Mahmoudi, Sahar; Baluch, Bahman

    2016-12-01

    The present study aimed to identify, for the first time, the influence of sports vision exercises on the fundamental motor skills and cognitive skills of 7- to 10-year-old developmental dyslexic Persian children. A pretest-posttest quasi-experimental study was conducted. The statistical population of this study comprised 7- to 10-year-old dyslexic children referred to two learning disorder centres in the city of Isfahan. Twenty-two of these children were selected using available and purposive sampling from the statistical population and were randomly assigned to experimental and control groups. The former participated in sports vision exercise courses for 12 weeks (three 1-hour sessions per week), while the latter continued their routine daily activities during this period. Before the beginning and at the end of the exercise programme, Gardner's visual perception test (revised) and Dehkhoda's reading skills test were administered to both groups. The results showed that the sports vision exercises increased motor skills, visual perceptual skills, and reading skills in developmental dyslexic children. Based on these results, it was concluded that sports vision exercises can be used to develop the fundamental and cognitive skills of developmental dyslexic children.

  2. Impact of Temperature and Non-Gaussian Statistics on Electron Transfer in Donor–Bridge–Acceptor Molecules

    DOE PAGES

    Waskasi, Morteza M.; Newton, Marshall D.; Matyushov, Dmitry V.

    2017-03-16

    A combination of experimental data and theoretical analysis provides evidence of a bell-shaped kinetics of electron transfer in the Arrhenius coordinates ln k vs 1/T. This kinetic law is a temperature analog of the familiar Marcus bell-shaped dependence of ln k vs the reaction free energy. These results were obtained for reactions of intramolecular charge shift between the donor and acceptor separated by a rigid spacer, studied experimentally by Miller and co-workers. The non-Arrhenius kinetic law is a direct consequence of the solvent reorganization energy and reaction driving force changing approximately as hyperbolic functions with temperature. The reorganization energy decreases and the driving force increases when temperature is increased. The point of equality between them marks the maximum of the activationless reaction rate. Reaching consistency between the kinetic and thermodynamic experimental data requires the non-Gaussian statistics of the donor-acceptor energy gap described by the Q-model of electron transfer. Furthermore, the theoretical formalism combines the vibrational envelope of quantum vibronic transitions with the Q-model describing the classical component of the Franck-Condon factor and a microscopic solvation model of the solvent reorganization energy and the reaction free energy.
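
    The mechanism can be made concrete with the classical Marcus expression k(T) ∝ (4πλk_BT)^(-1/2) exp[-(ΔG + λ)²/(4λk_BT)]: if λ(T) falls and the driving force -ΔG(T) rises, ln k vs 1/T passes through a maximum where they cross. The specific hyperbolic temperature laws below are invented for illustration; they are not the paper's fitted functions.

```python
import numpy as np

kB = 8.617e-5   # Boltzmann constant, eV/K

def marcus_rate(T, lam, dG):
    """Classical Marcus rate (arbitrary prefactor), energies in eV."""
    return (4 * np.pi * lam * kB * T) ** -0.5 \
        * np.exp(-(dG + lam) ** 2 / (4 * lam * kB * T))

T = np.linspace(150, 400, 500)
lam = 0.4 + 60.0 / T      # hypothetical: reorganization energy falls with T
dG = -(0.9 - 40.0 / T)    # hypothetical: driving force (-dG) rises with T
lnk = np.log(marcus_rate(T, lam, dG))
print(f"rate maximum near T = {T[np.argmax(lnk)]:.0f} K "
      f"(lam = -dG crossing at T = 200 K by construction)")
```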

  3. Consistent integration of experimental and ab initio data into molecular and coarse-grained models

    NASA Astrophysics Data System (ADS)

    Vlcek, Lukas

    As computer simulations are increasingly used to complement or replace experiments, highly accurate descriptions of physical systems at different time and length scales are required to achieve realistic predictions. The questions of how to objectively measure model quality in relation to reference experimental or ab initio data, and how to transition seamlessly between different levels of resolution are therefore of prime interest. To address these issues, we use the concept of statistical distance to define a measure of similarity between statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the systems' measurable properties. Through systematic coarse-graining, we arrive at appropriate expressions for optimization loss functions consistently incorporating microscopic ab initio data as well as macroscopic experimental data. The design of coarse-grained and multiscale models is then based on factoring the model system partition function into terms describing the system at different resolution levels. The optimization algorithm takes advantage of thermodynamic perturbation expressions for fast exploration of the model parameter space, enabling us to scan millions of parameter combinations per hour on a single CPU. The robustness and generality of the new model optimization framework and its efficient implementation are illustrated on selected examples including aqueous solutions, magnetic systems, and metal alloys.
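
    For discrete distributions, the statistical distance referred to here is commonly realized as the Bhattacharyya angle, s(p, q) = arccos Σ_i √(p_i q_i). A minimal sketch, treating p and q as histograms of some fluctuating observable in the model and in the target (the histograms below are invented):

```python
import numpy as np

def statistical_distance(p, q):
    """Bhattacharyya angle between two discrete distributions.

    p and q are, e.g., histograms of an observable sampled in a model
    system and in its reference target; zero means indistinguishable.
    """
    p = np.asarray(p, float)
    q = np.asarray(q, float)
    p /= p.sum()
    q /= q.sum()
    return np.arccos(np.clip(np.sqrt(p * q).sum(), 0.0, 1.0))

print(statistical_distance([0.5, 0.3, 0.2], [0.4, 0.4, 0.2]))
```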

  4. Educational Intervention on Undergraduate Cancer Awareness and Self-Directed Learning.

    PubMed

    Hwang, Lih-Lian

    2018-06-01

    Traditional lecture-based learning (LBL) can increase cancer awareness in undergraduates. However, because of the rapidly changing knowledge base in medicine, undergraduates must develop the skills required for lifelong self-directed learning (SDL). Problem-based learning (PBL) has been suggested as an SDL approach. This study used a nonequivalent control group pretest-posttest design to compare PBL and LBL for their effectiveness in increasing cancer awareness and SDL among undergraduates not majoring in medicine or nursing, in a health-related general education course. Experimental groups 1 and 2 were instructed using PBL, while the control group was instructed using LBL. Cancer educational programs were offered to experimental group 1 and the control group but not to experimental group 2. Among the 325 undergraduates who completed a questionnaire regarding cancer awareness and SDL in the pretest, 223 completed the 12-week follow-up survey of the posttest. Cancer awareness significantly improved between the pretest and posttest in the control group (P < 0.001). No significant difference in cancer awareness improvement was observed between experimental group 1 and the control group (P = 0.934). Cancer awareness improvement in experimental group 2 was significantly less than in the control group (P = 0.010). No statistically significant change in SDL was observed in the control group during the study (P = 0.897). However, the SDL of experimental groups 1 and 2 improved significantly more than that of the control group (P = 0.049 and 0.023, respectively). Therefore, PBL is an effective method of increasing cancer awareness and SDL in undergraduates.

  5. Combined Experimental and Numerical Simulations of Thermal Barrier Coated Turbine Blades Erosion

    NASA Technical Reports Server (NTRS)

    Hamed, Awate; Tabakoff, Widen; Swar, Rohan; Shin, Dongyun; Woggon, Nthanial; Miller, Robert

    2013-01-01

    A combined experimental and computational study was conducted to investigate the erosion of thermal barrier coated (TBC) blade surfaces by alumina particles ingestion in a single stage turbine. In the experimental investigation, tests of particle surface interactions were performed in specially designed tunnels to determine the erosion rates and particle restitution characteristics under different impact conditions. The experimental results show that the erosion rates increase with increased impingement angle, impact velocity and temperature. In the computational simulations, an Euler-Lagrangian two stage approach is used in obtaining numerical solutions to the three-dimensional compressible Reynolds Averaged Navier-Stokes equations and the particles equations of motion in each blade passage reference frame. User defined functions (UDF) were developed to represent experimentally-based correlations for particle surface interaction models which were employed in the three-dimensional particle trajectory simulations to determine the particle rebound characteristics after each surface impact. The experimentally based erosion UDF model was used to predict the TBC erosion rates on the turbine blade surfaces based on the computed statistical data of the particles impact locations, velocities and angles relative to the blade surface. Computational results are presented for the predicted TBC blade erosion in a single stage commercial APU turbine, for a NASA designed automotive turbine, and for the NASA turbine scaled for modern rotorcraft operating conditions. The erosion patterns in the turbines are discussed for uniform particle ingestion and for particle ingestion concentrated in the inner and outer 5 percent of the stator blade span representing the flow cooling the combustor liner.

  6. Statistical analysis of early failures in electromigration

    NASA Astrophysics Data System (ADS)

    Gall, M.; Capasso, C.; Jawarani, D.; Hernandez, R.; Kawasaki, H.; Ho, P. S.

    2001-07-01

    The detection of early failures in electromigration (EM) and the complicated statistical nature of this important reliability phenomenon have been difficult issues to treat in the past. A satisfactory experimental approach for the detection and the statistical analysis of early failures has not yet been established. This is mainly due to the rare occurrence of early failures and difficulties in testing of large sample populations. Furthermore, experimental data on the EM behavior as a function of varying number of failure links are scarce. In this study, a technique utilizing large interconnect arrays in conjunction with the well-known Wheatstone Bridge is presented. Three types of structures with a varying number of Ti/TiN/Al(Cu)/TiN-based interconnects were used, starting from a small unit of five lines in parallel. A serial arrangement of this unit enabled testing of interconnect arrays encompassing 480 possible failure links. In addition, a Wheatstone Bridge-type wiring using four large arrays in each device enabled simultaneous testing of 1920 interconnects. In conjunction with a statistical deconvolution to the single interconnect level, the results indicate that the electromigration failure mechanism studied here follows perfect lognormal behavior down to the four sigma level. The statistical deconvolution procedure is described in detail. Over a temperature range from 155 to 200 °C, a total of more than 75 000 interconnects were tested. None of the samples have shown an indication of early, or alternate, failure mechanisms. The activation energy of the EM mechanism studied here, namely the Cu incubation time, was determined to be Q=1.08±0.05 eV. We surmise that interface diffusion of Cu along the Al(Cu) sidewalls and along the top and bottom refractory layers, coupled with grain boundary diffusion within the interconnects, constitutes the Cu incubation mechanism.
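
    The deconvolution from array level to single-interconnect level rests on weakest-link statistics: for N nominally identical links in series, F_array(t) = 1 - [1 - F_1(t)]^N, so F_1 = 1 - [1 - F_array]^(1/N). The sketch below inverts an empirical array CDF and checks lognormality on synthetic data; it is a generic reconstruction, not the authors' exact procedure.

```python
import numpy as np
from scipy import stats

def deconvolve_to_single_line(array_fail_times, n_links):
    """Weakest-link deconvolution of array failure times.

    Returns (t, F_1) pairs sampling the single-interconnect CDF,
    obtained by inverting the empirical array CDF.
    """
    t = np.sort(array_fail_times)
    F_array = (np.arange(1, len(t) + 1) - 0.5) / len(t)   # plotting positions
    F_1 = 1.0 - (1.0 - F_array) ** (1.0 / n_links)
    return t, F_1

# Check on synthetic lognormal single-line lifetimes, arrays of 480 links.
rng = np.random.default_rng(6)
single = rng.lognormal(mean=5.0, sigma=0.4, size=(1000, 480))
t, F1 = deconvolve_to_single_line(single.min(axis=1), n_links=480)
z = stats.norm.ppf(F1)                       # probe the far-left (multi-sigma) tail
sigma_fit = np.polyfit(z, np.log(t), 1)[0]   # lognormal: ln t = mu + sigma * z
print(f"recovered lognormal sigma ~ {sigma_fit:.2f} (true 0.4)")
```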

  7. Computer aided weld defect delineation using statistical parametric active contours in radiographic inspection.

    PubMed

    Goumeidane, Aicha Baya; Nacereddine, Nafaa; Khamadja, Mohammed

    2015-01-01

    Precise knowledge of a defect's shape is decisive for the analysis step in automatic radiographic inspection. Image segmentation is carried out on radiographic images to extract defect indications. This paper deals with weld defect delineation in radiographic images. The proposed method is based on a new statistics-based explicit active contour. An association of local and global modeling of the image pixel intensities is used to push the model to the desired boundaries. Furthermore, other strategies, such as selecting a band around the active contour curve, are proposed to accelerate its evolution and make the convergence speed depend only on the defect size. The experimental results are very promising, since experiments on synthetic and radiographic images show the ability of the proposed model to extract a piecewise-homogeneous object from a very inhomogeneous background, even in low-quality images.

  8. A LES-based Eulerian-Lagrangian approach to predict the dynamics of bubble plumes

    NASA Astrophysics Data System (ADS)

    Fraga, Bruño; Stoesser, Thorsten; Lai, Chris C. K.; Socolofsky, Scott A.

    2016-01-01

    An approach for Eulerian-Lagrangian large-eddy simulation of bubble plume dynamics is presented and its performance evaluated. The main numerical novelties consist in defining the gas-liquid coupling based on the bubble size to mesh resolution ratio (Dp/Δx) and the interpolation between Eulerian and Lagrangian frameworks through the use of delta functions. The model's performance is thoroughly validated for a bubble plume in a cubic tank in initially quiescent water using experimental data obtained from high-resolution ADV and PIV measurements. The predicted time-averaged velocities and second-order statistics show good agreement with the measurements, including the reproduction of the anisotropic nature of the plume's turbulence. Further, the predicted Eulerian and Lagrangian velocity fields, second-order turbulence statistics and interfacial gas-liquid forces are quantified and discussed as well as the visualization of the time-averaged primary and secondary flow structure in the tank.

  9. Predicting perceptual quality of images in realistic scenario using deep filter banks

    NASA Astrophysics Data System (ADS)

    Zhang, Weixia; Yan, Jia; Hu, Shiyong; Ma, Yang; Deng, Dexiang

    2018-03-01

    Classical image perceptual quality assessment models usually resort to natural scene statistics methods, which are based on the assumption that certain reliable statistical regularities hold on undistorted images and are corrupted by introduced distortions. However, these models usually fail to accurately predict the degradation severity of images in realistic scenarios, since complex, multiple, and interactive authentic distortions usually appear on them. We propose a quality prediction model based on a convolutional neural network. Quality-aware features extracted from the filter banks of multiple convolutional layers are aggregated into the image representation. Furthermore, an easy-to-implement and effective feature selection strategy is used to further refine the image representation, and finally a linear support vector regression model is trained to map image representations onto images' subjective perceptual quality scores. The experimental results on benchmark databases demonstrate the effectiveness and generalizability of the proposed model.

  10. Nonlinear Wave Chaos and the Random Coupling Model

    NASA Astrophysics Data System (ADS)

    Zhou, Min; Ott, Edward; Antonsen, Thomas M.; Anlage, Steven

    The Random Coupling Model (RCM) has been shown to successfully predict the statistical properties of linear wave chaotic cavities in the highly over-moded regime. It is of interest to extend the RCM to strongly nonlinear systems. To introduce nonlinearity, an active nonlinear circuit is connected to two ports of the wave chaotic 1/4-bowtie cavity. The active nonlinear circuit consists of a frequency multiplier, an amplifier, and several passive filters. It acts to double the input frequency in the range from 3.5 GHz to 5 GHz, and operates for microwaves going in only one direction. Measurements are taken between two additional ports of the cavity, and we measure the statistics of the second harmonic voltage over an ensemble of realizations of the scattering system. We developed an RCM-based model of this system as two chaotic cavities coupled by means of a nonlinear transfer function. The harmonics received at the output are predicted to be the product of three statistical quantities that describe the three elements correspondingly. Statistical results from simulation, RCM-based modeling, and direct experimental measurements will be compared. This work was supported by ONR under Grant No. N000141512134, AFOSR under COE Grant No. FA9550-15-1-0171, and the Maryland Center for Nanophysics and Advanced Materials.

  11. An improved swarm optimization for parameter estimation and biological model selection.

    PubMed

    Abdullah, Afnizanfaizal; Deris, Safaai; Mohamad, Mohd Saberi; Anwar, Sohail

    2013-01-01

    One of the key aspects of computational systems biology is the investigation of the dynamic biological processes within cells. Computational models are often required to elucidate the mechanisms and principles driving the processes because of their nonlinearity and complexity. The models usually incorporate a set of parameters that signify the physical properties of the actual biological systems. In most cases, these parameters are estimated by fitting the model outputs to the corresponding experimental data. However, this is a challenging task because the available experimental data are frequently noisy and incomplete. In this paper, a new hybrid optimization method is proposed to estimate these parameters from noisy and incomplete experimental data. The proposed method, called Swarm-based Chemical Reaction Optimization, integrates the evolutionary searching strategy employed by Chemical Reaction Optimization into the neighbourhood searching strategy of the Firefly Algorithm. The effectiveness of the method was evaluated using a simulated nonlinear model and two biological models: synthetic transcriptional oscillators and extracellular protease production models. The results showed that the accuracy and computational speed of the proposed method were better than those of the existing Differential Evolution, Firefly Algorithm, and Chemical Reaction Optimization methods. The reliability of the estimated parameters was statistically validated, which suggests that the model outputs produced by these parameters were valid even when noisy and incomplete experimental data were used. Additionally, the Akaike Information Criterion was employed to evaluate model selection, highlighting the capability of the proposed method to choose a plausible model based on the experimental data. In conclusion, this paper presents the effectiveness of the proposed method for parameter estimation and model selection problems using noisy and incomplete experimental data. It is hoped that this study provides new insight into developing more accurate and reliable biological models based on limited and low-quality experimental data.

  12. Anyonic braiding in optical lattices

    PubMed Central

    Zhang, Chuanwei; Scarola, V. W.; Tewari, Sumanta; Das Sarma, S.

    2007-01-01

    Topological quantum states of matter, both Abelian and non-Abelian, are characterized by excitations whose wavefunctions undergo nontrivial statistical transformations as one excitation is moved (braided) around another. Topological quantum computation proposes to use the topological protection and the braiding statistics of a non-Abelian topological state to perform quantum computation. The enormous technological prospect of topological quantum computation provides new motivation for experimentally observing a topological state. Here, we explicitly work out a realistic experimental scheme to create and braid the Abelian topological excitations in the Kitaev model built on a tunable robust system, a cold atom optical lattice. We also demonstrate how to detect the key feature of these excitations: their braiding statistics. Observation of this statistics would directly establish the existence of anyons, quantum particles that are neither fermions nor bosons. In addition to establishing topological matter, the experimental scheme we develop here can also be adapted to a non-Abelian topological state, supported by the same Kitaev model but in a different parameter regime, to eventually build topologically protected quantum gates. PMID:18000038

  13. US Geological Survey nutrient preservation experiment : experimental design, statistical analysis, and interpretation of analytical results

    USGS Publications Warehouse

    Patton, Charles J.; Gilroy, Edward J.

    1999-01-01

    Data on which this report is based, including nutrient concentrations in synthetic reference samples determined concurrently with those in real samples, are extensive (greater than 20,000 determinations) and have been published separately. In addition to confirming the well-documented instability of nitrite in acidified samples, this study also demonstrates that when biota are removed from samples at collection sites by 0.45-micrometer membrane filtration, subsequent preservation with sulfuric acid or mercury (II) provides no statistically significant improvement in nutrient concentration stability during storage at 4 degrees Celsius for 30 days. Biocide preservation had no statistically significant effect on the 30-day stability of phosphorus concentrations in whole-water splits from any of the 15 stations, but did stabilize Kjeldahl nitrogen concentrations in whole-water splits from three data-collection stations where ammonium accounted for at least half of the measured Kjeldahl nitrogen.

  14. Relative fundamental frequency during vocal onset and offset in older speakers with and without Parkinson's disease.

    PubMed

    Stepp, Cara E

    2013-03-01

    The relative fundamental frequency (RFF) surrounding production of a voiceless consonant has previously been shown to be lower in speakers with hypokinetic dysarthria and Parkinson's disease (PD) relative to age- and sex-matched controls. Here RFF was calculated in 32 speakers with PD without overt hypokinetic dysarthria and 32 age- and sex-matched controls to better understand the relationships between RFF and PD progression, medication status, and sex. Results showed that RFF was statistically significantly lower in individuals with PD compared with healthy age-matched controls, and statistically significantly lower in individuals diagnosed at least 5 years prior to experimentation relative to individuals recorded less than 5 years past diagnosis. Contrary to previous trends, no effect of medication was found. However, a statistically significant effect of sex on offset RFF was shown, with lower values in males relative to females. Future work examining the physiological bases of RFF is warranted.
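
    The abstract does not spell out the computation; a common convention in the RFF literature expresses each vocal cycle's f0 in semitones relative to a steady-state reference cycle, i.e., 12*log2(f0/f0_ref) = 39.86*log10(f0/f0_ref). A minimal sketch, assuming the per-cycle f0 values around the voiceless consonant have already been extracted (the values below are hypothetical):

        import numpy as np

        # Hypothetical per-cycle f0 values (Hz) at voicing offset, nearest the consonant last
        f0_offset_cycles = np.array([182.0, 181.5, 180.2, 178.9, 176.0])
        f0_reference = f0_offset_cycles[0]   # first offset cycle taken as the steady state

        # RFF in semitones; 39.86 = 12 / log10(2)
        rff_semitones = 39.86 * np.log10(f0_offset_cycles / f0_reference)
        print(rff_semitones)   # 0 for the reference cycle; negative values indicate an f0 drop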

  15. Rigorous force field optimization principles based on statistical distance minimization

    DOE PAGES

    Vlcek, Lukas; Chialvo, Ariel A.

    2015-10-12

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. Here we exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
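
    The abstract does not reproduce the distance itself; in the statistical-distance literature, the standard measure of distinguishability between two discrete probability distributions p and q is the Bhattacharyya angle below (the paper's exact working definition may differ):

        s(p, q) = \arccos\!\left( \sum_i \sqrt{p_i \, q_i} \right)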

  16. Towards a new tool for the evaluation of the quality of ultrasound compressed images.

    PubMed

    Delgorge, Cécile; Rosenberger, Christophe; Poisson, Gérard; Vieyres, Pierre

    2006-11-01

    This paper presents a new tool for the evaluation of ultrasound image compression. The goal is to measure image quality as easily as with a statistical criterion, and with the same reliability as medical assessment. An initial experiment involving medical experts provides our reference value for the comparison of evaluation criteria. Twenty-one statistical criteria are selected from the literature. A cumulative absolute similarity measure is defined as a distance between the criterion to evaluate and the reference value. A first fusion method based on a linear combination of criteria is proposed to improve the results obtained by each of them separately. The second proposed approach combines different statistical criteria and uses the medical assessment in a training phase with a support vector machine. Experimental results are given and show the benefit of fusion.

  17. Statistical models for predicting pair dispersion and particle clustering in isotropic turbulence and their applications

    NASA Astrophysics Data System (ADS)

    Zaichik, Leonid I.; Alipchenkov, Vladimir M.

    2009-10-01

    The purpose of this paper is twofold: (i) to advance and extend the statistical two-point models of pair dispersion and particle clustering in isotropic turbulence that were previously proposed by Zaichik and Alipchenkov (2003 Phys. Fluids 15 1776-87; 2007 Phys. Fluids 19 113308) and (ii) to present some applications of these models. The models developed are based on a kinetic equation for the two-point probability density function of the relative velocity distribution of two particles. These models predict the pair relative velocity statistics and the preferential accumulation of heavy particles in stationary and decaying homogeneous isotropic turbulent flows. Moreover, the models are applied to predict the effect of particle clustering on turbulent collisions, sedimentation and intensity of microwave radiation, as well as to calculate the mean filtered subgrid stress of the particulate phase. Model predictions are compared with direct numerical simulations and experimental measurements.

  18. Study/experimental/research design: much more than statistics.

    PubMed

    Knight, Kenneth L

    2010-01-01

    The purpose of study, experimental, or research design in scientific manuscripts has changed significantly over the years. It has evolved from an explanation of the design of the experiment (ie, data gathering or acquisition) to an explanation of the statistical analysis. This practice makes "Methods" sections hard to read and understand. To clarify the difference between study design and statistical analysis, to show the advantages of a properly written study design on article comprehension, and to encourage authors to correctly describe study designs. The role of study design is explored from the introduction of the concept by Fisher through modern-day scientists and the AMA Manual of Style. At one time, when experiments were simpler, the study design and statistical design were identical or very similar. With the complex research that is common today, which often includes manipulating variables to create new variables and the multiple (and different) analyses of a single data set, data collection is very different from statistical design. Thus, both a study design and a statistical design are necessary. Scientific manuscripts will be much easier to read and comprehend. A proper experimental design serves as a road map to the study methods, helping readers to understand more clearly how the data were obtained and, therefore, assisting them in properly analyzing the results.

  19. A general model-based design of experiments approach to achieve practical identifiability of pharmacokinetic and pharmacodynamic models.

    PubMed

    Galvanin, Federico; Ballan, Carlo C; Barolo, Massimiliano; Bezzo, Fabrizio

    2013-08-01

    The use of pharmacokinetic (PK) and pharmacodynamic (PD) models is a common and widespread practice in the preliminary stages of drug development. However, PK-PD models may be affected by structural identifiability issues intrinsically related to their mathematical formulation. A preliminary structural identifiability analysis is usually carried out to check if the set of model parameters can be uniquely determined from experimental observations under the ideal assumptions of noise-free data and no model uncertainty. However, even for structurally identifiable models, real-life experimental conditions and model uncertainty may strongly affect the practical possibility to estimate the model parameters in a statistically sound way. A systematic procedure coupling the numerical assessment of structural identifiability with advanced model-based design of experiments formulations is presented in this paper. The objective is to propose a general approach to design experiments in an optimal way, detecting a proper set of experimental settings that ensure the practical identifiability of PK-PD models. Two simulated case studies based on in vitro bacterial growth and killing models are presented to demonstrate the applicability and generality of the methodology to tackle model identifiability issues effectively, through the design of feasible and highly informative experiments.
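
    A minimal numerical sketch of the identifiability check that model-based design of experiments builds on: approximate the output sensitivities to the parameters by finite differences, form the Fisher information matrix, and score a candidate design by its determinant (D-optimality). The one-compartment model, parameter values and sampling times below are hypothetical, not the paper's in vitro bacterial models.

        import numpy as np

        def response(theta, t):
            # Hypothetical one-compartment PK model: C(t) = (D/V) * exp(-k t)
            k, V = theta
            return (100.0 / V) * np.exp(-k * t)

        def fim(theta, t, sigma=0.1, h=1e-6):
            # S[i, j] = d response(t_i) / d theta_j, by central differences
            S = np.empty((t.size, len(theta)))
            for j in range(len(theta)):
                up, dn = np.array(theta, float), np.array(theta, float)
                up[j] += h
                dn[j] -= h
                S[:, j] = (response(up, t) - response(dn, t)) / (2 * h)
            return S.T @ S / sigma**2

        t_design = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # candidate sampling times
        F = fim((0.4, 10.0), t_design)
        print(np.linalg.det(F))   # D-optimality: larger det(FIM) -> better-conditioned estimation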

  20. Gearbox damage identification and quantification using stochastic resonance

    NASA Astrophysics Data System (ADS)

    Mba, Clement U.; Marchesiello, Stefano; Fasana, Alessandro; Garibaldi, Luigi

    2018-03-01

    Amongst the many new tools used for vibration-based mechanical fault diagnosis in rotating machinery, stochastic resonance (SR) has been shown to be able to identify as well as quantify gearbox damage via numerical simulations. To validate the numerical simulation results that were obtained in a previous work by the authors, SR is applied in the present study to data from an experimental gearbox that is representative of an industrial gearbox. Both spur and helical gears are used in the gearbox setup. While the results of the direct application of SR to experimental data do not exactly corroborate the numerical simulation results, applying SR to experimental data in pre-processed form is shown to be quite effective. In addition, it is demonstrated that traditional statistical techniques used for gearbox diagnosis can be used as a reference to check how well SR performs.
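
    The paper's SR processing chain is not given in the abstract; the sketch below is the canonical overdamped bistable system used in stochastic resonance studies, dx/dt = a*x - b*x**3 + s(t) + noise, integrated with Euler-Maruyama. A weak periodic "fault signature" buried in noise is amplified near an optimal noise intensity; the parameters are illustrative, not those of the gearbox study.

        import numpy as np

        a, b = 1.0, 1.0                # bistable potential parameters
        dt, n = 1e-3, 200_000
        t = np.arange(n) * dt
        signal = 0.1 * np.sin(2 * np.pi * 0.5 * t)   # weak periodic component at 0.5 Hz
        D = 0.3                                       # noise intensity (the tunable quantity)

        x = np.zeros(n)
        rng = np.random.default_rng(0)
        for i in range(n - 1):
            drift = a * x[i] - b * x[i] ** 3 + signal[i]
            x[i + 1] = x[i] + drift * dt + np.sqrt(2 * D * dt) * rng.standard_normal()
        # A spectral peak at 0.5 Hz in x indicates resonance-enhanced detection of the weak signal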

  1. Fracture Resistance of Retreated Roots Using Different Retreatment Systems

    PubMed Central

    Er, Kursat; Tasdemir, Tamer; Siso, Seyda Herguner; Celik, Davut; Cora, Sabri

    2011-01-01

    Objectives: This study was designed to evaluate the fracture resistance of retreated roots using different rotary retreatment systems. Methods: Forty-eight freshly extracted human canine teeth with single straight root canals were instrumented sequentially from size 30 to size 55 using K-files with a step-back technique. The teeth were randomly divided into three experimental groups and one control group of 12 specimens each. In the experimental groups, the root canals were filled using cold lateral compaction of gutta-percha and AH Plus (Dentsply Detrey, Konstanz, Germany) sealer. Removal of gutta-percha was performed with the following devices and techniques: ProTaper Universal (Dentsply Maillefer, Ballaigues, Switzerland), R-Endo (Micro-Mega, Besançon, France), and Mtwo (Sweden & Martina, Padova, Italy) rotary retreatment systems. Control group specimens were only instrumented, not filled or retreated. The specimens were then mounted in copper rings filled with a self-curing polymethylmethacrylate resin, and the force required to cause vertical root fracture was measured using a universal testing device. The fracture force of the roots was recorded and the results in the various groups were compared. Statistical analysis was accomplished by one-way ANOVA and post hoc Tukey tests. Results: There were statistically significant differences between the control and experimental groups (P<.05). However, there were no significant differences among the experimental groups. Conclusions: Based on the results, all rotary retreatment techniques used in this in vitro study produced similar root weakness. PMID:21912497

  2. Preventing Child Sexual Abuse: Body Safety Training for Young Children in Turkey.

    PubMed

    Citak Tunc, Gulseren; Gorak, Gulay; Ozyazicioglu, Nurcan; Ak, Bedriye; Isil, Ozlem; Vural, Pinar

    2018-01-01

    The "Body Safety Training Program" is an education program aimed at ensuring children are informed about their body and acquire self-protection skills. In this study, a total of 83 preschoolers were divided into experimental and control groups; based on a power analysis, 40 children comprised the experimental group, while 43 children comprised the control group. The "Body Safety Training Programme" was translated into Turkish and content validity was determined regarding the language and cultural appropriateness. The "What If Situations Test" (WIST) was administered to both groups before and after the training. Mann-Whitney U Test, Kruskal-Wallis Variance Analysis, and the Wilcoxon Signed Ranks Test were used to compare between the groups and the Spearman correlation analysis was used to determine the strength of the relationship between the dependent and independent variable. The differences between the pretest and posttest scores for the subscales (appropriate recognition, inappropriate recognition, say, do, tell, and reporting skills), and the personal safety questionnaire (PSQ) score means for the children in the experimental group were found to be statistically significant (p < .001). The posttest-pretest difference score means of the experimental group children for WIST saying, doing, telling and reporting, total skills, and PSQ were found to be statistically significant as compared to that of the control group (p < .05). The "Body Safety Training programme" is effective in increasing the child sexual abuse prevention and self-protection skills in Turkish young children.

  3. Final Report, DOE Early Career Award: Predictive modeling of complex physical systems: new tools for statistical inference, uncertainty quantification, and experimental design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef

    Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in the support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling feasible in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale sequential data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data. Demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.

  4. [Effects of a Mobile Web-based Pregnancy Health Care Educational Program for Mothers at an Advanced Maternal Age].

    PubMed

    Wang, Hee Jung; Kim, Il Ok

    2015-06-01

    This study was conducted to develop a mobile web-based pregnancy health care educational program for mothers of advanced maternal age (AMA) and to verify the effects of the program on pregnancy health care. The program was developed using a web-based teaching-learning system design model and composed of 10 subject areas. This research was a quasi-experimental study using a non-equivalent control group pretest-posttest time series design, and data were collected from April 2 to May 3, 2014. To verify the effects of the program, it was used for 2 weeks with 30 AMA mothers (experimental group); for the control group, a classroom education booklet for pregnant women was used with 31 AMA mothers. The experimental group that participated in the program had statistically significantly higher scores for knowledge (t=3.76, p<.001), self-efficacy (t=8.54, p<.001), and practice behavior (t=4.88, p<.001) of pregnancy health care, compared to the control group. The results indicate that a mobile web-based pregnancy health care educational program is effective in meeting the needs of AMA mothers and can be used as a prenatal educational program and an appropriate educational medium for these mothers.

  5. The Effect of Student-Driven Projects on the Development of Statistical Reasoning

    ERIC Educational Resources Information Center

    Sovak, Melissa M.

    2010-01-01

    Research has shown that even if students pass a standard introductory statistics course, they often still lack the ability to reason statistically. Many instructional techniques for enhancing the development of statistical reasoning have been discussed, although there is often little to no experimental evidence that they produce effective results…

  6. Facilitating Student Experimentation with Statistical Concepts.

    ERIC Educational Resources Information Center

    Smith, Patricia K.

    2002-01-01

    Offers a Web page with seven Java applets allowing students to experiment with key concepts in an introductory statistics course. Indicates the applets can be used in three ways: to place links to the applets, to create in-class demonstrations of statistical concepts, and to lead students through experiments and discover statistical relationships.…

  7. Evaluation of in silico tools to predict the skin sensitization potential of chemicals.

    PubMed

    Verheyen, G R; Braeken, E; Van Deun, K; Van Miert, S

    2017-01-01

    Public domain and commercial in silico tools were compared for their performance in predicting the skin sensitization potential of chemicals. The packages were either statistics-based (Vega, CASE Ultra) or rule-based (OECD Toolbox, Toxtree, Derek Nexus). In practice, several of these in silico tools are used in gap filling and read-across, but here their use was limited to making predictions based on the presence or absence of structural features associated with sensitization. The top 400 ranking substances of the ATSDR 2011 Priority List of Hazardous Substances were selected as a starting point. Experimental information was identified for 160 chemically diverse substances (82 positive and 78 negative). The prediction for skin sensitization potential was compared with the experimental data. Rule-based tools perform slightly better, with accuracies ranging from 0.6 (OECD Toolbox) to 0.78 (Derek Nexus), compared with statistical tools that had accuracies ranging from 0.48 (Vega) to 0.73 (CASE Ultra - LLNA weak model). Combining models increased the performance, with positive and negative predictive values up to 80% and 84%, respectively. However, the number of substances that were predicted positive or negative for skin sensitization in both models was low. Adding more substances to the dataset will increase the confidence in the conclusions reached. The insights obtained in this evaluation are incorporated in a web database www.asopus.weebly.com that provides potential end users with context for the scope and performance of different in silico tools with respect to a common dataset of curated skin sensitization data.
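
    For reference, the performance measures quoted above can be computed from a binary confusion matrix as in the sketch below; the counts are hypothetical, not the study's 160-substance dataset.

        def predictive_values(tp, fp, tn, fn):
            # Accuracy, positive predictive value and negative predictive value
            accuracy = (tp + tn) / (tp + fp + tn + fn)
            ppv = tp / (tp + fp)   # fraction of predicted sensitizers that are real
            npv = tn / (tn + fn)   # fraction of predicted non-sensitizers that are real
            return accuracy, ppv, npv

        print(predictive_values(tp=60, fp=15, tn=65, fn=20))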

  8. Statistical issues in quality control of proteomic analyses: good experimental design and planning.

    PubMed

    Cairns, David A

    2011-03-01

    Quality control is becoming increasingly important in proteomic investigations as experiments become more multivariate and quantitative. Quality control applies to all stages of an investigation and statistics can play a key role. In this review, the role of statistical ideas in the design and planning of an investigation is described. This involves the design of unbiased experiments using key concepts from statistical experimental design, the understanding of the biological and analytical variation in a system using variance components analysis and the determination of a required sample size to perform a statistically powerful investigation. These concepts are described through simple examples and an example data set from a 2-D DIGE pilot experiment. Each of these concepts can prove useful in producing better and more reproducible data. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
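
    As an illustration of the sample-size step described above, a two-sample t-test power calculation with statsmodels; the effect size and targets are illustrative rather than taken from the 2-D DIGE example.

        from statsmodels.stats.power import TTestIndPower

        n_per_group = TTestIndPower().solve_power(
            effect_size=0.8,   # standardized (Cohen's d) difference to detect
            alpha=0.05,        # two-sided significance level
            power=0.8,         # desired probability of detecting the effect
        )
        print(round(n_per_group, 1))   # roughly 26 samples per group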

  9. Mapping cell populations in flow cytometry data for cross‐sample comparison using the Friedman–Rafsky test statistic as a distance measure

    PubMed Central

    Hsiao, Chiaowen; Liu, Mengya; Stanton, Rick; McGee, Monnie; Qian, Yu

    2015-01-01

    Abstract Flow cytometry (FCM) is a fluorescence‐based single‐cell experimental technology that is routinely applied in biomedical research for identifying cellular biomarkers of normal physiological responses and abnormal disease states. While many computational methods have been developed that focus on identifying cell populations in individual FCM samples, very few have addressed how the identified cell populations can be matched across samples for comparative analysis. This article presents FlowMap‐FR, a novel method for cell population mapping across FCM samples. FlowMap‐FR is based on the Friedman–Rafsky nonparametric test statistic (FR statistic), which quantifies the equivalence of multivariate distributions. As applied to FCM data by FlowMap‐FR, the FR statistic objectively quantifies the similarity between cell populations based on the shapes, sizes, and positions of fluorescence data distributions in the multidimensional feature space. To test and evaluate the performance of FlowMap‐FR, we simulated the kinds of biological and technical sample variations that are commonly observed in FCM data. The results show that FlowMap‐FR is able to effectively identify equivalent cell populations between samples under scenarios of proportion differences and modest position shifts. As a statistical test, FlowMap‐FR can be used to determine whether the expression of a cellular marker is statistically different between two cell populations, suggesting candidates for new cellular phenotypes by providing an objective statistical measure. In addition, FlowMap‐FR can indicate situations in which inappropriate splitting or merging of cell populations has occurred during gating procedures. We compared the FR statistic with the symmetric version of Kullback–Leibler divergence measure used in a previous population matching method with both simulated and real data. The FR statistic outperforms the symmetric version of KL‐distance in distinguishing equivalent from nonequivalent cell populations. FlowMap‐FR was also employed as a distance metric to match cell populations delineated by manual gating across 30 FCM samples from a benchmark FlowCAP data set. An F‐measure of 0.88 was obtained, indicating high precision and recall of the FR‐based population matching results. FlowMap‐FR has been implemented as a standalone R/Bioconductor package so that it can be easily incorporated into current FCM data analytical workflows. © 2015 International Society for Advancement of Cytometry PMID:26274018
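
    A minimal sketch of the Friedman-Rafsky construction on which FlowMap-FR is built (the textbook form, not the package's implementation): pool the two samples, build a Euclidean minimum spanning tree, and count edges that join points from different samples; few cross-sample edges relative to the sample sizes indicate nonequivalent distributions. The data below are synthetic, not FCM events.

        import numpy as np
        from scipy.spatial.distance import cdist
        from scipy.sparse.csgraph import minimum_spanning_tree

        rng = np.random.default_rng(0)
        x = rng.normal(0.0, 1.0, size=(100, 3))   # sample 1
        y = rng.normal(0.5, 1.0, size=(100, 3))   # sample 2, shifted
        pooled = np.vstack([x, y])
        labels = np.array([0] * len(x) + [1] * len(y))

        # MST over the pooled pairwise-distance graph, then count cross-sample edges
        mst = minimum_spanning_tree(cdist(pooled, pooled)).tocoo()
        cross_edges = np.sum(labels[mst.row] != labels[mst.col])
        print(cross_edges)   # small counts suggest the two populations differ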

  10. Mapping cell populations in flow cytometry data for cross-sample comparison using the Friedman-Rafsky test statistic as a distance measure.

    PubMed

    Hsiao, Chiaowen; Liu, Mengya; Stanton, Rick; McGee, Monnie; Qian, Yu; Scheuermann, Richard H

    2016-01-01

    Flow cytometry (FCM) is a fluorescence-based single-cell experimental technology that is routinely applied in biomedical research for identifying cellular biomarkers of normal physiological responses and abnormal disease states. While many computational methods have been developed that focus on identifying cell populations in individual FCM samples, very few have addressed how the identified cell populations can be matched across samples for comparative analysis. This article presents FlowMap-FR, a novel method for cell population mapping across FCM samples. FlowMap-FR is based on the Friedman-Rafsky nonparametric test statistic (FR statistic), which quantifies the equivalence of multivariate distributions. As applied to FCM data by FlowMap-FR, the FR statistic objectively quantifies the similarity between cell populations based on the shapes, sizes, and positions of fluorescence data distributions in the multidimensional feature space. To test and evaluate the performance of FlowMap-FR, we simulated the kinds of biological and technical sample variations that are commonly observed in FCM data. The results show that FlowMap-FR is able to effectively identify equivalent cell populations between samples under scenarios of proportion differences and modest position shifts. As a statistical test, FlowMap-FR can be used to determine whether the expression of a cellular marker is statistically different between two cell populations, suggesting candidates for new cellular phenotypes by providing an objective statistical measure. In addition, FlowMap-FR can indicate situations in which inappropriate splitting or merging of cell populations has occurred during gating procedures. We compared the FR statistic with the symmetric version of Kullback-Leibler divergence measure used in a previous population matching method with both simulated and real data. The FR statistic outperforms the symmetric version of KL-distance in distinguishing equivalent from nonequivalent cell populations. FlowMap-FR was also employed as a distance metric to match cell populations delineated by manual gating across 30 FCM samples from a benchmark FlowCAP data set. An F-measure of 0.88 was obtained, indicating high precision and recall of the FR-based population matching results. FlowMap-FR has been implemented as a standalone R/Bioconductor package so that it can be easily incorporated into current FCM data analytical workflows. © The Authors. Published by Wiley Periodicals, Inc. on behalf of ISAC.

  11. Statistical Analysis for Collision-free Boson Sampling.

    PubMed

    Huang, He-Liang; Zhong, Han-Sen; Li, Tan; Li, Feng-Guang; Fu, Xiang-Qun; Zhang, Shuo; Wang, Xiang; Bao, Wan-Su

    2017-11-10

    Boson sampling is strongly believed to be intractable for classical computers but solvable with photons in linear optics, and it has attracted widespread attention as a rapid way to demonstrate quantum supremacy. However, because its solution is mathematically unverifiable, certifying the experimental results is a major difficulty in boson sampling experiments. Here, we develop a statistical analysis scheme to experimentally certify collision-free boson sampling. Numerical simulations are performed to show the feasibility and practicability of our scheme, and the effects of realistic experimental conditions are also considered, demonstrating that the proposed scheme is experimentally friendly. Moreover, our broad approach is expected to be generally applicable to investigating multi-particle coherent dynamics beyond boson sampling.

  12. Statistical performance and information content of time lag analysis and redundancy analysis in time series modeling.

    PubMed

    Angeler, David G; Viedma, Olga; Moreno, José M

    2009-11-01

    Time lag analysis (TLA) is a distance-based approach used to study temporal dynamics of ecological communities by measuring community dissimilarity over increasing time lags. Despite its increased use in recent years, its performance in comparison with other more direct methods (i.e., canonical ordination) has not been evaluated. This study fills this gap using extensive simulations and real data sets from experimental temporary ponds (true zooplankton communities) and landscape studies (landscape categories as pseudo-communities) that differ in community structure and anthropogenic stress history. Modeling time with a principal coordinate of neighborhood matrices (PCNM) approach, the canonical ordination technique (redundancy analysis; RDA) consistently outperformed the other statistical tests (i.e., TLA, Mantel test, and RDA based on linear time trends) using all real data. In addition, the RDA-PCNM revealed different patterns of temporal change, and the strength of each individual time pattern, in terms of adjusted variance explained, could be evaluated. It also identified species contributions to these patterns of temporal change. This additional information is not provided by distance-based methods. The simulation study revealed better Type I error properties of the canonical ordination techniques compared with the distance-based approaches when no deterministic component of change was imposed on the communities. The simulation also revealed that strong emphasis on uniform deterministic change and low variability at other temporal scales is needed to decrease the statistical power of the RDA-PCNM approach relative to the other methods. Based on the statistical performance of, and information content provided by, RDA-PCNM models, this technique serves ecologists as a powerful tool for modeling temporal change of ecological (pseudo-)communities.
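
    A minimal sketch of the core of time lag analysis: community dissimilarity regressed on square-root-transformed time lag, where a positive slope indicates directional temporal change. The community matrix below is synthetic, and Bray-Curtis is used as one common dissimilarity choice; the study's data and distance measure may differ.

        import numpy as np
        from scipy.spatial.distance import braycurtis
        from scipy.stats import linregress

        rng = np.random.default_rng(1)
        years, species = 12, 8
        # Synthetic community with a weak directional trend over time
        community = np.abs(rng.normal(10, 2, (years, species)) + np.arange(years)[:, None])

        lags, dissim = [], []
        for lag in range(1, years):
            for t0 in range(years - lag):
                lags.append(np.sqrt(lag))
                dissim.append(braycurtis(community[t0], community[t0 + lag]))

        print(linregress(lags, dissim).slope)   # slope > 0 -> directional change over time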

  13. Ceramic processing: Experimental design and optimization

    NASA Technical Reports Server (NTRS)

    Weiser, Martin W.; Lauben, David N.; Madrid, Philip

    1992-01-01

    The objectives of this paper are to: (1) gain insight into the processing of ceramics and how green processing can affect the properties of ceramics; (2) investigate the technique of slip casting; (3) learn how heat treatment and temperature contribute to density and strength, and how under- and over-firing affect ceramic properties; (4) experience some of the problems inherent in testing brittle materials and learn about the statistical nature of the strength of ceramics; (5) investigate orthogonal arrays as tools to examine the effect of many experimental parameters using a minimum number of experiments; (6) recognize appropriate uses for clay-based ceramics; and (7) measure several different properties important to ceramic use and optimize them for a given application.
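
    As a small illustration of the orthogonal arrays in objective (5): the two-level L4 array accommodates three factors in four runs while keeping every factor level, and every pair of levels across any two columns, equally represented. The factor assignment is hypothetical; the balance check below verifies the orthogonality property.

        import numpy as np

        L4 = np.array([[0, 0, 0],
                       [0, 1, 1],
                       [1, 0, 1],
                       [1, 1, 0]])   # rows = experimental runs, columns = factors

        # Any two columns together contain each of the four level pairs exactly once
        for i in range(3):
            for j in range(i + 1, 3):
                pairs = {(a, b) for a, b in zip(L4[:, i], L4[:, j])}
                assert len(pairs) == 4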

  14. Hiring a Gay Man, Taking a Risk?: A Lab Experiment on Employment Discrimination and Risk Aversion.

    PubMed

    Baert, Stijn

    2018-01-01

    We investigate risk aversion as a driver of labor market discrimination against homosexual men. We show that more hiring discrimination by more risk-averse employers is consistent with taste-based and statistical discrimination. To test this hypothesis we conduct a scenario experiment in which experimental employers take a fictitious hiring decision concerning a heterosexual or homosexual male job candidate. In addition, participants are surveyed on their risk aversion and other characteristics that might correlate with this risk aversion. Analysis of the (post-)experimental data confirms our hypothesis. The likelihood of a beneficial hiring decision for homosexual male candidates decreases by 31.7% when employers are a standard deviation more risk-averse.

  15. [Effects of Self-directed Feedback Practice using Smartphone Videos on Basic Nursing Skills, Confidence in Performance and Learning Satisfaction].

    PubMed

    Lee, Seul Gi; Shin, Yun Hee

    2016-04-01

    This study was done to verify the effects of self-directed feedback practice using smartphone videos on nursing students' basic nursing skills, confidence in performance, and learning satisfaction. An experimental study with a post-test-only control group design was used. Twenty-nine students were assigned to the experimental group and 29 to the control group. The experimental treatment was exchanging feedback on deficiencies through smartphone-recorded videos of the nursing practice process taken by peers during self-directed practice. Basic nursing skills scores were higher for all items in the experimental group compared to the control group, and the differences were statistically significant ["Measuring vital signs" (t=-2.10, p=.039); "Wearing protective equipment when entering and exiting the quarantine room and the management of waste materials" (t=-4.74, p<.001); "Gavage tube feeding" (t=-2.70, p=.009)]. Confidence in performance was higher in the experimental group compared to the control group, but the differences were not statistically significant. However, after the complete practice, there was a statistically significant difference in overall performance confidence (t=-3.07, p=.003). Learning satisfaction was higher in the experimental group compared to the control group, but the difference was not statistically significant (t=-1.67, p=.100). Results of this study indicate that self-directed feedback practice using smartphone videos can improve basic nursing skills. The significance is that it can help nursing students gain confidence in their nursing skills through improvement of basic nursing skills and performance of quality care, thus providing patients with safer care.

  16. Multifractal modeling, segmentation, prediction, and statistical validation of posterior fossa tumors

    NASA Astrophysics Data System (ADS)

    Islam, Atiq; Iftekharuddin, Khan M.; Ogg, Robert J.; Laningham, Fred H.; Sivakumar, Bhuvaneswari

    2008-03-01

    In this paper, we characterize the tumor texture in pediatric brain magnetic resonance images (MRIs) and exploit these features for automatic segmentation of posterior fossa (PF) tumors. We focus on PF tumors because of their prevalence in pediatric patients. Due to their varying appearance in MRI, we propose to model the tumor texture with a multifractal process, such as multifractional Brownian motion (mBm). In mBm, the time-varying Hölder exponent provides flexibility in modeling irregular tumor texture. We develop a detailed mathematical framework for mBm in two dimensions and propose a novel algorithm to estimate the multifractal structure of tissue texture in brain MRI based on wavelet coefficients. This wavelet-based multifractal feature, along with MR image intensity and a regular fractal feature obtained using our existing piecewise-triangular-prism-surface-area (PTPSA) method, are fused in segmenting PF tumor and non-tumor regions in brain T1, T2, and FLAIR MR images. We also demonstrate a non-patient-specific automated tumor prediction scheme based on these image features. We experimentally show the tumor discriminating power of our novel multifractal texture, along with intensity and fractal features, in automated tumor segmentation and statistical prediction. To evaluate the performance of our tumor prediction scheme, we obtain ROCs and demonstrate how sharply the curves reach a specificity of 1.0 while sacrificing minimal sensitivity. Experimental results show the effectiveness of our proposed techniques in automatic detection of PF tumors in pediatric MRIs.

  17. Facile Fabrication of 100% Bio-Based and Degradable Ternary Cellulose/PHBV/PLA Composites

    PubMed Central

    Wang, Jinwu

    2018-01-01

    Modifying bio-based degradable polymers such as polylactide (PLA) and poly(hydroxybutyrate-co-hydroxyvalerate) (PHBV) with non-degradable agents will compromise the 100% degradability of their resultant composites. This work developed a facile and solvent-free route to fabricate 100% bio-based and degradable ternary cellulose/PHBV/PLA composite materials. The effects of ball milling on the physicochemical properties of pulp cellulose fibers, and of the ball-milled cellulose particles on the morphology and mechanical properties of PHBV/PLA blends, were investigated experimentally and statistically. The results showed that longer ball-milling times resulted in smaller particle sizes and lower crystallinity by way of mechanical disintegration. Filling PHBV/PLA blends with the ball-milled celluloses dramatically increased the stiffness at all levels of particle size and filling content, and improved their elongation at break and fracture work at certain levels of particle size and filling content. It was also found that a high filling content of the ball-milled cellulose particles was detrimental to the mechanical properties of the resultant composite materials. The ternary cellulose/PHBV/PLA composite materials have some potential applications, such as in packaging materials and automobile interior decoration parts. Furthermore, filling content contributes more to the variation of their mechanical properties than particle size does. Statistical analysis combined with experimental tests provides a new pathway to quantitatively evaluate the effects of multiple variables on a specific property and to identify the dominant one for the resultant composite materials. PMID:29495315

  18. Cost-Efficient and Multi-Functional Secure Aggregation in Large Scale Distributed Application

    PubMed Central

    Zhang, Ping; Li, Wenjun; Sun, Hua

    2016-01-01

    Secure aggregation is an essential component of modern distributed applications and data mining platforms. Aggregated statistical results are typically adopted in constructing a data cube for data analysis at multiple abstraction levels in data warehouse platforms. Generating different types of statistical results efficiently at the same time (referred to as multi-functional support) is a fundamental requirement in practice. However, most of the existing schemes support a very limited number of statistics. Securely obtaining typical statistical results simultaneously in a distributed system, without recovering the original data, is still an open problem. In this paper, we present SEDAR, a SEcure Data Aggregation scheme under the Range segmentation model. The range segmentation model is proposed to reduce the communication cost by capturing the data characteristics, and different ranges use different aggregation strategies. For raw data in the dominant range, SEDAR encodes them into well-defined vectors to provide value preservation and order preservation, and thus provides the basis for multi-functional aggregation. A homomorphic encryption scheme is used to achieve data privacy. We also present two enhanced versions: the first is a Random-based SEDAR (REDAR), and the second is a Compression-based SEDAR (CEDAR). Both can significantly reduce communication cost, at the expense of lower security and lower accuracy, respectively. Experimental evaluations, based on six different scenes of real data, show that all of them have excellent performance on cost and accuracy. PMID:27551747
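
    SEDAR's range-segmentation encoding is not reproduced in the abstract; the sketch below only illustrates the homomorphic-encryption building block it relies on, using the python-paillier package ("phe", an assumed dependency): the aggregator sums ciphertexts without ever seeing the raw node data.

        from phe import paillier

        public_key, private_key = paillier.generate_paillier_keypair()

        readings = [42, 17, 93, 8]                        # raw node data (hypothetical)
        ciphertexts = [public_key.encrypt(r) for r in readings]

        # Additive homomorphism: addition on ciphertexts equals addition on plaintexts
        encrypted_sum = ciphertexts[0]
        for c in ciphertexts[1:]:
            encrypted_sum = encrypted_sum + c

        print(private_key.decrypt(encrypted_sum))         # 160, decrypted only at the sink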

  19. FORESEE: Fully Outsourced secuRe gEnome Study basEd on homomorphic Encryption

    PubMed Central

    2015-01-01

    Background The increasing availability of genome data motivates massive research studies in personalized treatment and precision medicine. Public cloud services provide a flexible way to mitigate the storage and computation burden of conducting genome-wide association studies (GWAS). However, data privacy is a major concern when sharing sensitive information in a cloud environment. Methods We presented a novel framework (FORESEE: Fully Outsourced secuRe gEnome Study basEd on homomorphic Encryption) to fully outsource GWAS (i.e., chi-square statistic computation) using homomorphic encryption. The proposed framework enables secure divisions over encrypted data. We introduced two division protocols (i.e., secure errorless division and secure approximation division) with a trade-off between complexity and accuracy in computing chi-square statistics. Results The proposed framework was evaluated for the task of chi-square statistic computation with two case-control datasets from the 2015 iDASH genome privacy protection challenge. Experimental results show that the performance of FORESEE can be significantly improved through algorithmic optimization and parallel computation. Remarkably, the secure approximation division provides a significant performance gain without missing any significant SNPs in the chi-square association test using the aforementioned datasets. Conclusions Unlike many existing HME-based studies, in which final results need to be computed by the data owner due to the lack of a secure division operation, the proposed FORESEE framework supports complete outsourcing to the cloud and outputs the final encrypted chi-square statistics. PMID:26733391
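
    For orientation, the sketch below computes the plaintext chi-square statistic that FORESEE evaluates under encryption, for a single SNP's case/control allele-count table; the counts are hypothetical and the encrypted protocol itself is not reproduced.

        import numpy as np
        from scipy.stats import chi2_contingency

        #                  allele A   allele a
        table = np.array([[230, 170],     # cases
                          [190, 210]])    # controls

        chi2, p, dof, expected = chi2_contingency(table, correction=False)
        print(chi2, p)   # large chi2 / small p -> candidate association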

  20. Cost-Efficient and Multi-Functional Secure Aggregation in Large Scale Distributed Application.

    PubMed

    Zhang, Ping; Li, Wenjun; Sun, Hua

    2016-01-01

    Secure aggregation is an essential component of modern distributed applications and data mining platforms. Aggregated statistical results are typically adopted in constructing a data cube for data analysis at multiple abstraction levels in data warehouse platforms. Generating different types of statistical results efficiently at the same time (referred to as multi-functional support) is a fundamental requirement in practice. However, most of the existing schemes support a very limited number of statistics. Securely obtaining typical statistical results simultaneously in a distributed system, without recovering the original data, is still an open problem. In this paper, we present SEDAR, a SEcure Data Aggregation scheme under the Range segmentation model. The range segmentation model is proposed to reduce the communication cost by capturing the data characteristics, and different ranges use different aggregation strategies. For raw data in the dominant range, SEDAR encodes them into well-defined vectors to provide value preservation and order preservation, and thus provides the basis for multi-functional aggregation. A homomorphic encryption scheme is used to achieve data privacy. We also present two enhanced versions: the first is a Random-based SEDAR (REDAR), and the second is a Compression-based SEDAR (CEDAR). Both can significantly reduce communication cost, at the expense of lower security and lower accuracy, respectively. Experimental evaluations, based on six different scenes of real data, show that all of them have excellent performance on cost and accuracy.

  1. FORESEE: Fully Outsourced secuRe gEnome Study basEd on homomorphic Encryption.

    PubMed

    Zhang, Yuchen; Dai, Wenrui; Jiang, Xiaoqian; Xiong, Hongkai; Wang, Shuang

    2015-01-01

    The increasing availability of genome data motivates massive research studies in personalized treatment and precision medicine. Public cloud services provide a flexible way to mitigate the storage and computation burden of conducting genome-wide association studies (GWAS). However, data privacy is a major concern when sharing sensitive information in a cloud environment. We presented a novel framework (FORESEE: Fully Outsourced secuRe gEnome Study basEd on homomorphic Encryption) to fully outsource GWAS (i.e., chi-square statistic computation) using homomorphic encryption. The proposed framework enables secure divisions over encrypted data. We introduced two division protocols (i.e., secure errorless division and secure approximation division) with a trade-off between complexity and accuracy in computing chi-square statistics. The proposed framework was evaluated for the task of chi-square statistic computation with two case-control datasets from the 2015 iDASH genome privacy protection challenge. Experimental results show that the performance of FORESEE can be significantly improved through algorithmic optimization and parallel computation. Remarkably, the secure approximation division provides a significant performance gain without missing any significant SNPs in the chi-square association test using the aforementioned datasets. Unlike many existing HME-based studies, in which final results need to be computed by the data owner due to the lack of a secure division operation, the proposed FORESEE framework supports complete outsourcing to the cloud and outputs the final encrypted chi-square statistics.

  2. Microgravity experiments on vibrated granular gases in a dilute regime: non-classical statistics

    NASA Astrophysics Data System (ADS)

    Leconte, M.; Garrabos, Y.; Falcon, E.; Lecoutre-Chabot, C.; Palencia, F.; Évesque, P.; Beysens, D.

    2006-07-01

    We report on an experimental study of a dilute gas of steel spheres colliding inelastically and excited by a piston performing sinusoidal vibration, in low gravity. Using an improved experimental apparatus, we present results concerning the collision statistics of particles on a wall of the container. We also propose a simple model in which the non-classical statistics obtained from our data are attributed to the boundary condition playing the role of a 'velostat' instead of a thermostat. The significant differences from the kinetic theory of ordinary gases are related to the inelasticity of collisions.

  3. Nakagami-based total variation method for speckle reduction in thyroid ultrasound images.

    PubMed

    Koundal, Deepika; Gupta, Savita; Singh, Sukhwinder

    2016-02-01

    A good statistical model is necessary for the reduction in speckle noise. The Nakagami model is more general than the Rayleigh distribution for statistical modeling of speckle in ultrasound images. In this article, the Nakagami-based noise removal method is presented to enhance thyroid ultrasound images and to improve clinical diagnosis. The statistics of log-compressed image are derived from the Nakagami distribution following a maximum a posteriori estimation framework. The minimization problem is solved by optimizing an augmented Lagrange and Chambolle's projection method. The proposed method is evaluated on both artificial speckle-simulated and real ultrasound images. The experimental findings reveal the superiority of the proposed method both quantitatively and qualitatively in comparison with other speckle reduction methods reported in the literature. The proposed method yields an average signal-to-noise ratio gain of more than 2.16 dB over the non-convex regularizer-based speckle noise removal method, 3.83 dB over the Aubert-Aujol model, 1.71 dB over the Shi-Osher model and 3.21 dB over the Rudin-Lions-Osher model on speckle-simulated synthetic images. Furthermore, visual evaluation of the despeckled images shows that the proposed method suppresses speckle noise well while preserving the textures and fine details. © IMechE 2015.
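
    For reference, the Nakagami density for envelope amplitude x >= 0, with shape m and spread Omega, takes the standard textbook form below; the article works with the statistics of the log-compressed image derived from it.

        f(x; m, \Omega) = \frac{2\,m^{m}}{\Gamma(m)\,\Omega^{m}}\, x^{2m-1} \exp\!\left(-\frac{m}{\Omega}\,x^{2}\right), \qquad x \ge 0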

  4. Poisson Statistics of Combinatorial Library Sampling Predict False Discovery Rates of Screening

    PubMed Central

    2017-01-01

    Microfluidic droplet-based screening of DNA-encoded one-bead-one-compound combinatorial libraries is a miniaturized, potentially widely distributable approach to small molecule discovery. In these screens, a microfluidic circuit distributes library beads into droplets of activity assay reagent, photochemically cleaves the compound from the bead, then incubates and sorts the droplets based on assay result for subsequent DNA sequencing-based hit compound structure elucidation. Pilot experimental studies revealed that Poisson statistics describe nearly all aspects of such screens, prompting the development of simulations to understand system behavior. Monte Carlo screening simulation data showed that increasing mean library sampling (ε), mean droplet occupancy, or library hit rate all increase the false discovery rate (FDR). Compounds identified as hits on k > 1 beads (the replicate k class) were much more likely to be authentic hits than singletons (k = 1), in agreement with previous findings. Here, we explain this observation by deriving an equation for authenticity, which reduces to the product of a library sampling bias term (exponential in k) and a sampling saturation term (exponential in ε) setting a threshold that the k-dependent bias must overcome. The equation thus quantitatively describes why each hit structure’s FDR is based on its k class, and further predicts the feasibility of intentionally populating droplets with multiple library beads, assaying the micromixtures for function, and identifying the active members by statistical deconvolution. PMID:28682059
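
    A minimal sketch of the Poisson description of library sampling: the probability that a given compound is sampled on k beads when the mean per-compound sampling depth is eps. The depth value is illustrative, not the study's.

        from scipy.stats import poisson

        eps = 2.0                                   # mean library sampling depth (epsilon)
        for k in range(5):
            print(k, poisson.pmf(k, eps))           # P(compound appears on exactly k beads)

        # P(k >= 2): the replicate classes that, per the abstract, dominate authentic hits
        print(1 - poisson.cdf(1, eps))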

  5. Application of real rock pore-throat statistics to a regular pore network model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rakibul, M.; Sarker, H.; McIntyre, D.

    2011-01-01

    This work reports the application of real rock statistical data to a previously developed regular pore network model in an attempt to produce an accurate simulation tool with low computational overhead. A core plug from the St. Peter Sandstone formation in Indiana was scanned with a high-resolution micro CT scanner. The pore-throat statistics of the three-dimensional reconstructed rock were extracted and the distribution of the pore-throat sizes was applied to the regular pore network model. In order to keep the equivalent model regular, only the throat area or the throat radius was varied. Ten realizations of randomly distributed throat sizes were generated to simulate the drainage process, and relative permeability was calculated and compared with the experimentally determined values of the original rock sample. The numerical and experimental procedures are explained in detail and the performance of the model in relation to the experimental data is discussed and analyzed. Petrophysical properties such as relative permeability are important in many applied fields such as production of petroleum fluids, enhanced oil recovery, carbon dioxide sequestration, and ground water flow. Relative permeability data are used for a wide range of conventional reservoir engineering calculations and in numerical reservoir simulation. Two-phase oil-water relative permeability data were generated on the same core plug by both the pore network model and the experimental procedure. The shape and size of the relative permeability curves were compared and analyzed; a good match was observed for the wetting-phase relative permeability, but the non-wetting-phase simulation results deviated from the experimental ones. Efforts to determine petrophysical properties of rocks using numerical techniques aim to eliminate the need for routine core analysis, which can be time consuming and expensive, so a numerical technique is expected to be fast and to produce reliable results. In applied engineering, a quick result with reasonable accuracy is sometimes preferable to a more time-consuming one. The present work is an effort to check the accuracy and validity of a previously developed pore network model for obtaining important petrophysical properties of rocks based on cutting-sized sample data.

  6. Stated Choice design comparison in a developing country: recall and attribute nonattendance

    PubMed Central

    2014-01-01

    Background Experimental designs constitute a vital component of all Stated Choice (aka discrete choice experiment) studies. However, there exists limited empirical evaluation of the statistical benefits of Stated Choice (SC) experimental designs that employ non-zero prior estimates in constructing non-orthogonal constrained designs. This paper statistically compares the performance of contrasting SC experimental designs. In so doing, the effect of respondent literacy on patterns of Attribute non-Attendance (ANA) across fractional factorial orthogonal and efficient designs is also evaluated. The study uses a ‘real’ SC design to model consumer choice of primary health care providers in rural north India. A total of 623 respondents were sampled across four villages in Uttar Pradesh, India. Methods Comparison of orthogonal and efficient SC experimental designs is based on several measures. Appropriate comparison of each design’s respective efficiency measure is made using D-error results. Standardised Akaike Information Criteria are compared between designs and across recall periods. Comparisons control for stated and inferred ANA. Coefficient and standard error estimates are also compared. Results The added complexity of the efficient SC design, theorised elsewhere, is reflected in higher estimated amounts of ANA among illiterate respondents. However, controlling for ANA using stated and inferred methods consistently shows that the efficient design performs statistically better. Modelling SC data from the orthogonal and efficient design shows that model-fit of the efficient design outperform the orthogonal design when using a 14-day recall period. The performance of the orthogonal design, with respect to standardised AIC model-fit, is better when longer recall periods of 30-days, 6-months and 12-months are used. Conclusions The effect of the efficient design’s cognitive demand is apparent among literate and illiterate respondents, although, more pronounced among illiterate respondents. This study empirically confirms that relaxing the orthogonality constraint of SC experimental designs increases the information collected in choice tasks, subject to the accuracy of the non-zero priors in the design and the correct specification of a ‘real’ SC recall period. PMID:25386388
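
    A minimal sketch of the Akaike Information Criterion comparison used above; the log-likelihoods and parameter counts are placeholders, not the study's estimates. The design whose fitted model has the lower AIC is preferred at a given recall period.

        def aic(log_likelihood, n_params):
            # AIC = 2k - 2 ln L; lower is better
            return 2 * n_params - 2 * log_likelihood

        print(aic(log_likelihood=-512.3, n_params=9))   # e.g., orthogonal-design model
        print(aic(log_likelihood=-498.7, n_params=9))   # e.g., efficient-design model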

  7. Application of real rock pore-throat statistics to a regular pore network model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarker, M.R.; McIntyre, D.; Ferer, M.

    2011-01-01

    This work reports the application of real rock statistical data to a previously developed regular pore network model in an attempt to produce an accurate simulation tool with low computational overhead. A core plug from the St. Peter Sandstone formation in Indiana was scanned with a high-resolution micro CT scanner. The pore-throat statistics of the three-dimensional reconstructed rock were extracted and the distribution of the pore-throat sizes was applied to the regular pore network model. In order to keep the equivalent model regular, only the throat area or the throat radius was varied. Ten realizations of randomly distributed throat sizes were generated to simulate the drainage process, and relative permeability was calculated and compared with the experimentally determined values of the original rock sample. The numerical and experimental procedures are explained in detail and the performance of the model in relation to the experimental data is discussed and analyzed. Petrophysical properties such as relative permeability are important in many applied fields such as production of petroleum fluids, enhanced oil recovery, carbon dioxide sequestration, and ground water flow. Relative permeability data are used for a wide range of conventional reservoir engineering calculations and in numerical reservoir simulation. Two-phase oil-water relative permeability data were generated on the same core plug by both the pore network model and the experimental procedure. The shape and size of the relative permeability curves were compared and analyzed; a good match was observed for the wetting-phase relative permeability, but the non-wetting-phase simulation results deviated from the experimental ones. Efforts to determine petrophysical properties of rocks using numerical techniques aim to eliminate the need for routine core analysis, which can be time consuming and expensive, so a numerical technique is expected to be fast and to produce reliable results. In applied engineering, a quick result with reasonable accuracy is sometimes preferable to a more time-consuming one. The present work is an effort to check the accuracy and validity of a previously developed pore network model for obtaining important petrophysical properties of rocks based on cutting-sized sample data.

  8. Statistical process control in nursing research.

    PubMed

    Polit, Denise F; Chaboyer, Wendy

    2012-02-01

    In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
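
    A minimal sketch of the individuals (Shewhart) control chart that SPC builds on: centerline at the mean, control limits at plus or minus 3 sigma, with sigma estimated from the mean moving range using the standard d2 = 1.128 divisor. The data are simulated, not from the nursing study.

        import numpy as np

        rng = np.random.default_rng(2)
        obs = rng.normal(50, 4, 30)                    # e.g., weekly process measurements

        center = obs.mean()
        sigma_hat = np.mean(np.abs(np.diff(obs))) / 1.128   # moving-range estimate of sigma
        ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

        # Points beyond the limits signal special cause (non-random) variation
        special_cause = (obs > ucl) | (obs < lcl)
        print(center, lcl, ucl, special_cause.any())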

  9. Statistical properties of kinetic and total energy densities in reverberant spaces.

    PubMed

    Jacobsen, Finn; Molares, Alfonso Rodríguez

    2010-04-01

    Many acoustical measurements, e.g., measurement of sound power and transmission loss, rely on determining the total sound energy in a reverberation room. The total energy is usually approximated by measuring the mean-square pressure (i.e., the potential energy density) at a number of discrete positions. The idea of measuring the total energy density instead of the potential energy density on the assumption that the former quantity varies less with position than the latter goes back to the 1930s. However, the phenomenon was not analyzed until the late 1970s and then only for the region of high modal overlap, and this analysis has never been published. Moreover, until fairly recently, measurement of the total sound energy density required an elaborate experimental arrangement based on finite-difference approximations using at least four amplitude and phase matched pressure microphones. With the advent of a three-dimensional particle velocity transducer, it has become somewhat easier to measure total rather than only potential energy density in a sound field. This paper examines the ensemble statistics of kinetic and total sound energy densities in reverberant enclosures theoretically, experimentally, and numerically.

  10. [Effectiveness of the Military Mental Health Promotion Program].

    PubMed

    Woo, Chung Hee; Kim, Sun Ah

    2014-12-01

    This study was done to evaluate the Military Mental Health Promotion Program, an email-based cognitive behavioral intervention. The research design was a quasi-experimental study with a non-equivalent control group pretest-posttest design. Participants were 32 soldiers who agreed to participate in the program. Data were collected at three different times from January 2012 to March 2012: pre-test, post-test, and a one-month follow-up test. The data were statistically analyzed using SPSS 18.0, and the effectiveness of the program was tested by repeated measures ANOVA. The first hypothesis, that the level of depression in the experimental group would decrease compared to the control group, was not supported, as the group-time interaction was not statistically significant (F=2.19, p=.121). The second and third hypotheses, related to anxiety and self-esteem, were supported by significant group-time interactions (F=7.41, p=.001 and F=11.67, p<.001, respectively). Results indicate that the program is effective in improving soldiers' mental health status in the areas of anxiety and self-esteem.
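
    As a hedged illustration of testing such a group-time interaction (here via a mixed model with a random intercept per subject, an alternative to the repeated measures ANOVA used in the study), assuming long-format data with hypothetical column names:

      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical long-format data with columns 'subject', 'group'
      # (experimental/control), 'time' (pre/post/followup) and 'anxiety';
      # the per-subject random intercept accounts for repeated measures.
      df = pd.read_csv("mental_health_scores.csv")   # hypothetical file

      model = smf.mixedlm("anxiety ~ C(group) * C(time)", df, groups="subject")
      result = model.fit()
      print(result.summary())   # the C(group):C(time) terms carry the interaction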

  11. General correlation for prediction of critical heat flux ratio in water cooled channels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pernica, R.; Cizek, J.

    1995-09-01

    The paper presents a general empirical Critical Heat Flux Ratio (CHFR) correlation that is valid for vertical water upflow through tubes, internally heated concentric annuli, and rod bundle geometries with both wide and very tight square and triangular rod lattices. The proposed general PG correlation directly predicts the CHFR, accommodates axially and radially non-uniform heating, and is valid over a wider range of thermal-hydraulic conditions than previously published critical heat flux correlations. The PG correlation was developed using the Czech critical heat flux data bank, which includes more than 9500 experimental data points on tubes, 7600 on rod bundles, and 713 on internally heated concentric annuli. The accuracy of the CHFR prediction, statistically assessed by the constant dryout conditions approach, is characterized by a mean value near 1.00 and a standard deviation below 0.06. Moreover, a subchannel form of the PG correlation was statistically verified on Westinghouse and Combustion Engineering rod bundle databases, i.e., more than 7000 experimental CHF points from the Columbia University data bank.
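
    The quoted accuracy metrics amount to simple statistics of the measured-to-predicted CHF ratio; a sketch with hypothetical data files:

      import numpy as np

      # Sketch of the quoted accuracy metrics: under a constant-dryout assessment,
      # the ratio of measured to predicted critical heat flux should have a mean
      # near 1.00 and a small standard deviation over the validation data bank.
      measured = np.loadtxt("chf_measured.dat")     # hypothetical data files
      predicted = np.loadtxt("chf_predicted.dat")

      ratio = measured / predicted
      print(f"mean M/P = {ratio.mean():.3f}, std = {ratio.std(ddof=1):.3f}")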

  12. Effect of the use of combination uridine triphosphate, cytidine monophosphate, and hydroxycobalamin on the recovery of neurosensory disturbance after bilateral sagittal split osteotomy: a randomized, double-blind trial.

    PubMed

    Vieira, C L; Vasconcelos, B C do E; Leão, J C; Laureano Filho, J R

    2016-02-01

    The change in neurosensory lesions that develop after bilateral sagittal split osteotomy (BSSO) was explored, and the influence of the application of combination uridine triphosphate (UTP), cytidine monophosphate (CMP), and hydroxycobalamin (vitamin B12) on patient outcomes was assessed. This was a randomized, controlled, double-blind trial. The study sample comprised 12 patients, each evaluated on both sides (thus 24 sides). All patients fulfilled defined selection criteria. Changes in the lesions were measured both subjectively and objectively. The sample was divided into two patient groups: an experimental group receiving medication and a control group receiving placebo. The statistical analysis was performed using SPSS software. Lesions in both groups improved and no statistically significant difference between the groups was observed at any time. 'Severe' injuries in the experimental group were more likely to exhibit a significant improvement after 6 months. Based on the results of the present study, it is concluded that the combination UTP, CMP, and hydroxycobalamin did not influence recovery from neurosensory disorders. Copyright © 2015. Published by Elsevier Ltd.

  13. Photon statistics and speckle visibility spectroscopy with partially coherent X-rays.

    PubMed

    Li, Luxi; Kwaśniewski, Paweł; Orsi, Davide; Wiegart, Lutz; Cristofolini, Luigi; Caronna, Chiara; Fluerasu, Andrei

    2014-11-01

    A new approach is proposed for measuring structural dynamics in materials from multi-speckle scattering patterns obtained with partially coherent X-rays. Coherent X-ray scattering is already widely used at high-brightness synchrotron light sources to measure dynamics using X-ray photon correlation spectroscopy, but in many situations this experimental approach based on recording long series of images (i.e. movies) is either not adequate or not practical. Following the development of visible-light speckle visibility spectroscopy, the dynamic information is obtained instead by analyzing the photon statistics and calculating the speckle contrast in single scattering patterns. This quantity, also referred to as the speckle visibility, is determined by the properties of the partially coherent beam and other experimental parameters, as well as the internal motions in the sample (dynamics). As a case study, Brownian dynamics in a low-density colloidal suspension is measured and excellent agreement is found between correlation functions measured by X-ray photon correlation spectroscopy and the decay in speckle visibility with integration time obtained from the analysis presented here.
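
    The central quantity is the speckle contrast of a single pattern; a minimal estimator follows, including the shot-noise correction commonly applied to photon-counting data (the correction form is a standard assumption, not taken from the paper):

      import numpy as np

      # Speckle visibility in a single scattering pattern: the contrast
      # beta = var(I)/<I>^2 decreases as internal dynamics blur the speckle
      # during the integration time.
      def speckle_contrast(image):
          I = np.asarray(image, dtype=float)
          return I.var() / I.mean() ** 2

      # For photon-counting data with negative-binomial statistics,
      # var(k) = <k> + beta*<k>^2, so the shot-noise term is subtracted:
      def speckle_contrast_counts(counts):
          k = np.asarray(counts, dtype=float)
          return (k.var() - k.mean()) / k.mean() ** 2

      rng = np.random.default_rng(0)
      I = rng.exponential(1.0, size=(256, 256))   # fully developed speckle: beta ~ 1
      print(speckle_contrast(I))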

  14. Simultaneous assessment of phase chemistry, phase abundance and bulk chemistry with statistical electron probe micro-analyses: Application to cement clinkers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef, E-mail: ulm@mit.edu

    2014-01-15

    According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. -- Highlights: •A new method of clinker characterization •Combination of electron probe technique with cluster analysis •Simultaneous assessment of phase abundance, composition and bulk chemistry •Experimental validation performed on industrial clinkers.
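
    A hedged sketch of the statistical interpretation step: clustering grid spot analyses into phases, so that cluster centroids give phase chemistry, cluster sizes give phase abundance, and their weighted sum recovers bulk chemistry. The file name, data layout, and the choice of four clinker phases are assumptions of this example:

      import numpy as np
      from sklearn.cluster import KMeans

      # Hypothetical input: rows = microprobe spots, columns = oxide wt%.
      X = np.loadtxt("epma_grid_spots.csv", delimiter=",")

      km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
      abundance = np.bincount(km.labels_) / len(km.labels_)   # phase fractions
      phase_chem = km.cluster_centers_                        # per-phase composition
      bulk_chem = abundance @ phase_chem                      # reconstructed bulk chemistry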

  15. Deformation behavior of HCP titanium alloy: Experiment and Crystal plasticity modeling

    DOE PAGES

    Wronski, M.; Arul Kumar, Mariyappan; Capolungo, Laurent; ...

    2018-03-02

    The deformation behavior of commercially pure titanium is studied using experiments and a crystal plasticity model. Compression tests along the rolling, transverse, and normal directions, and tensile tests along the rolling and transverse directions, are performed at room temperature to study the activation of slip and twinning in hexagonal close-packed titanium. A detailed EBSD-based statistical analysis of the microstructure is performed to develop statistics of both {10-12} tensile and {11-22} compression twins. A simple Monte Carlo (MC) twin variant selection criterion is proposed within the framework of the visco-plastic self-consistent (VPSC) model, with a dislocation density (DD) based law used to describe dislocation hardening. In the model, plasticity is accommodated by prismatic, basal, and pyramidal slip modes, and by {10-12} tensile and {11-22} compression twinning modes. The VPSC-MC model successfully captures the experimentally observed activation of low Schmid factor twin variants for both tensile and compression twin modes. The model also predicts macroscopic stress-strain response, texture evolution, and twin volume fractions that are in agreement with experimental observations.
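
    A hedged sketch of what a Monte Carlo twin variant selection step can look like: drawing a variant with probability that increases with its resolved shear stress, instead of always taking the highest Schmid factor. The weighting form and all numbers are assumptions, not the paper's calibrated criterion:

      import numpy as np

      rng = np.random.default_rng(1)

      def pick_variant(rss, tau_c, sharpness=20.0):
          """rss: resolved shear stress on each twin variant; tau_c: CRSS.
          Stochastic selection lets low-Schmid-factor variants appear with
          realistic (low) frequency."""
          w = np.exp(sharpness * (rss / tau_c - 1.0))   # assumed weighting form
          return rng.choice(len(rss), p=w / w.sum())

      rss = np.array([41.0, 55.0, 38.0, 52.0, 30.0, 47.0])   # MPa, illustrative
      print(pick_variant(rss, tau_c=50.0))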

  16. Bayesian Estimation of Thermonuclear Reaction Rates for Deuterium+Deuterium Reactions

    NASA Astrophysics Data System (ADS)

    Gómez Iñesta, Á.; Iliadis, C.; Coc, A.

    2017-11-01

    The study of d+d reactions is of major interest since their reaction rates affect the predicted abundances of D, 3He, and 7Li. In particular, recent measurements of primordial D/H ratios call for reduced uncertainties in the theoretical abundances predicted by Big Bang nucleosynthesis (BBN). Different authors have studied reactions involved in BBN by incorporating new experimental data and a careful treatment of systematic and probabilistic uncertainties. To analyze the experimental data, Coc et al. used results of ab initio models for the theoretical calculation of the energy dependence of S-factors in conjunction with traditional statistical methods based on χ2 minimization. Bayesian methods have now spread to many scientific fields and provide numerous advantages in data analysis. Astrophysical S-factors and reaction rates using Bayesian statistics were calculated by Iliadis et al. Here we present a similar analysis for two d+d reactions, d(d, n)3He and d(d, p)3H, which translates into a total decrease of 0.16% in the predicted D/H value.

  17. Deformation behavior of HCP titanium alloy: Experiment and Crystal plasticity modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wronski, M.; Arul Kumar, Mariyappan; Capolungo, Laurent

    The deformation behavior of commercially pure titanium is studied using experiments and a crystal plasticity model. Compression tests along the rolling, transverse, and normal directions, and tensile tests along the rolling and transverse directions, are performed at room temperature to study the activation of slip and twinning in hexagonal close-packed titanium. A detailed EBSD-based statistical analysis of the microstructure is performed to develop statistics of both {10-12} tensile and {11-22} compression twins. A simple Monte Carlo (MC) twin variant selection criterion is proposed within the framework of the visco-plastic self-consistent (VPSC) model, with a dislocation density (DD) based law used to describe dislocation hardening. In the model, plasticity is accommodated by prismatic, basal, and pyramidal slip modes, and by {10-12} tensile and {11-22} compression twinning modes. The VPSC-MC model successfully captures the experimentally observed activation of low Schmid factor twin variants for both tensile and compression twin modes. The model also predicts macroscopic stress-strain response, texture evolution, and twin volume fractions that are in agreement with experimental observations.

  18. Prediction of heat capacity of amine solutions using artificial neural network and thermodynamic models for CO2 capture processes

    NASA Astrophysics Data System (ADS)

    Afkhamipour, Morteza; Mofarahi, Masoud; Borhani, Tohid Nejad Ghaffar; Zanganeh, Masoud

    2018-03-01

    In this study, artificial neural network (ANN) and thermodynamic models were developed for prediction of the heat capacity (C_P) of amine-based solvents. For the ANN model, independent variables such as amine concentration, temperature, molecular weight and CO2 loading were selected as the inputs of the model. The significance of the input variables of the ANN model on the C_P values was investigated statistically by analyzing the correlation matrix. A thermodynamic model based on the Redlich-Kister equation was used to correlate the excess molar heat capacity (C_P^E) data as a function of temperature. In addition, the effects of temperature and CO2 loading at different concentrations of conventional amines on the C_P values were investigated. Both models were validated against experimental data, and good agreement was obtained between the two models and the experimental C_P data collected from the literature. The AARD between the ANN model results and the experimental C_P data for the 47 amine-based solvent systems studied was 4.3%. For conventional amines, the AARD for the ANN model and the thermodynamic model in comparison with experimental data were 0.59% and 0.57%, respectively. The results show that both the ANN and Redlich-Kister models can be used as practical tools for the simulation and design of CO2 removal processes using amine solutions.
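
    A minimal sketch of the Redlich-Kister correlation step at a fixed temperature, fitting C_P^E(x) = x(1-x) * sum_k A_k (1-2x)^k by least squares; the data points below are illustrative:

      import numpy as np

      def fit_redlich_kister(x, cpe, order=3):
          """Least-squares Redlich-Kister coefficients A_0..A_order for a
          binary mixture: C_P^E(x) = x*(1-x) * sum_k A_k * (1-2x)**k."""
          basis = np.column_stack([x * (1 - x) * (1 - 2 * x) ** k
                                   for k in range(order + 1)])
          coeffs, *_ = np.linalg.lstsq(basis, cpe, rcond=None)
          return coeffs

      x = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9])   # mole fraction
      cpe = np.array([2.1, 3.9, 5.0, 5.6, 5.8, 5.5, 4.7, 3.4, 1.9]) # J/(mol K), illustrative
      print(fit_redlich_kister(x, cpe))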

  19. Mobile phone-based clinical guidance for rural health providers in India.

    PubMed

    Gautham, Meenakshi; Iyengar, M Sriram; Johnson, Craig W

    2015-12-01

    There are few tried and tested mobile technology applications to enhance and standardize the quality of health care by frontline rural health providers in low-resource settings. We developed a media-rich, mobile phone-based clinical guidance system for management of fevers, diarrhoeas and respiratory problems by rural health providers. Using a randomized control design, we field tested this application with 16 rural health providers and 128 patients at two rural/tribal sites in Tamil Nadu, Southern India. Protocol compliance for both groups, phone usability, acceptability and patient feedback for the experimental group were evaluated. Linear mixed-model analyses showed statistically significant improvements in protocol compliance in the experimental group. Usability and acceptability among patients and rural health providers were very high. Our results indicate that mobile phone-based, media-rich procedural guidance applications have significant potential for achieving consistently standardized quality of care by diverse frontline rural health providers, with patient acceptance. © The Author(s) 2014.

  20. Single-Case Experimental Designs to Evaluate Novel Technology-Based Health Interventions

    PubMed Central

    Cassidy, Rachel N; Raiff, Bethany R

    2013-01-01

    Technology-based interventions to promote health are expanding rapidly. Assessing the preliminary efficacy of these interventions can be achieved by employing single-case experiments (sometimes referred to as n-of-1 studies). Although single-case experiments are often misunderstood, they offer excellent solutions to address the challenges associated with testing new technology-based interventions. This paper provides an introduction to single-case techniques and highlights advances in developing and evaluating single-case experiments, which help ensure that treatment outcomes are reliable, replicable, and generalizable. These advances include quality control standards, heuristics to guide visual analysis of time-series data, effect size calculations, and statistical analyses. They also include experimental designs to isolate the active elements in a treatment package and to assess the mechanisms of behavior change. The paper concludes with a discussion of issues related to the generality of findings derived from single-case research and how generality can be established through replication and through analysis of behavioral mechanisms. PMID:23399668
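
    One widely used single-case effect size is the non-overlap of all pairs (NAP), chosen here purely as an illustration (the paper surveys several such measures):

      import numpy as np

      # NAP: the probability that a randomly chosen treatment-phase point
      # improves on a randomly chosen baseline-phase point (ties count half).
      def nap(baseline, treatment):
          a = np.asarray(baseline)[:, None]
          b = np.asarray(treatment)[None, :]
          return ((b > a).sum() + 0.5 * (b == a).sum()) / (a.size * b.size)

      print(nap([3, 4, 3, 5], [6, 7, 5, 8, 7]))   # -> 0.975; 1.0 = complete non-overlap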

  1. Double-path acquisition of pulse wave transit time and heartbeat using self-mixing interferometry

    NASA Astrophysics Data System (ADS)

    Wei, Yingbin; Huang, Wencai; Wei, Zheng; Zhang, Jie; An, Tong; Wang, Xiulin; Xu, Huizhen

    2017-06-01

    We present a technique based on self-mixing interferometry for acquiring the pulse wave transit time (PWTT) and heartbeat. A signal processing method based on the Continuous Wavelet Transform and Hilbert Transform is applied to extract potentially useful information in the self-mixing interference (SMI) signal, including PWTT and heartbeat. Then, some cardiovascular characteristics of the human body are easily acquired without retrieving the SMI signal by complicated algorithms. Experimentally, the PWTT is measured on the finger and the toe of the human body using double-path self-mixing interferometry. Experimental statistical data show the relation between the PWTT and blood pressure, which can be used to estimate the systolic pressure value by fitting. Moreover, the measured heartbeat shows good agreement with that obtained by a photoplethysmography sensor. The method that we demonstrate, which is based on self-mixing interferometry with significant advantages of simplicity, compactness and non-invasiveness, effectively illustrates the viability of the SMI technique for measuring other cardiovascular signals.
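
    A hedged sketch of the timing step only: envelope detection on the two demodulated pulse waveforms and peak-to-peak lags for the PWTT. The signal files, names, and sampling rate are assumptions; the paper's CWT-based demodulation of the raw SMI signal is not reproduced here:

      import numpy as np
      from scipy.signal import hilbert, find_peaks

      fs = 1000.0                              # Hz, assumed sampling rate
      finger = np.load("smi_finger.npy")       # hypothetical demodulated signals
      toe = np.load("smi_toe.npy")

      env_f = np.abs(hilbert(finger))          # Hilbert envelopes
      env_t = np.abs(hilbert(toe))

      pk_f, _ = find_peaks(env_f, distance=int(0.5 * fs))   # >= 0.5 s between beats
      pk_t, _ = find_peaks(env_t, distance=int(0.5 * fs))

      n = min(len(pk_f), len(pk_t))
      pwtt = np.median((pk_t[:n] - pk_f[:n]) / fs)          # seconds, finger-to-toe lag
      heart_rate = 60.0 * fs / np.median(np.diff(pk_f))     # beats per minute
      print(pwtt, heart_rate)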

  2. A System Computational Model of Implicit Emotional Learning

    PubMed Central

    Puviani, Luca; Rama, Sidita

    2016-01-01

    Nowadays, the experimental study of emotional learning is commonly based on classical conditioning paradigms and models, which have been thoroughly investigated in the last century. Unfortunately, models based on classical conditioning are unable to explain or predict important psychophysiological phenomena, such as the failure of the extinction of emotional responses in certain circumstances (for instance, those observed in evaluative conditioning, in post-traumatic stress disorders and in panic attacks). In this manuscript, starting from the experimental results available from the literature, a computational model of implicit emotional learning based both on prediction-error computation and on statistical inference is developed. The model quantitatively predicts (a) the occurrence of evaluative conditioning, (b) the dynamics and the resistance-to-extinction of the traumatic emotional responses, (c) the mathematical relation between classical conditioning and unconditioned stimulus revaluation. Moreover, we discuss how the derived computational model can lead to the development of new animal models for resistant-to-extinction emotional reactions and novel methodologies for emotion modulation. PMID:27378898
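
    For orientation, the classical prediction-error learning rule that such models generalize (Rescorla-Wagner); this is an illustrative baseline, not the authors' full inference model:

      import numpy as np

      # Delta rule: the associative strength V is nudged on each trial by the
      # error between the received US magnitude and the current prediction.
      def rescorla_wagner(us, alpha=0.1, v0=0.0):
          v = v0
          trace = []
          for r in us:                 # r: US magnitude on each trial (0 = omitted)
              v += alpha * (r - v)     # prediction error drives learning
              trace.append(v)
          return np.array(trace)

      acquisition = rescorla_wagner(np.ones(30))                       # conditioning
      extinction = rescorla_wagner(np.zeros(30), v0=acquisition[-1])   # extinction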

  3. A System Computational Model of Implicit Emotional Learning.

    PubMed

    Puviani, Luca; Rama, Sidita

    2016-01-01

    Nowadays, the experimental study of emotional learning is commonly based on classical conditioning paradigms and models, which have been thoroughly investigated in the last century. Unfortunately, models based on classical conditioning are unable to explain or predict important psychophysiological phenomena, such as the failure of the extinction of emotional responses in certain circumstances (for instance, those observed in evaluative conditioning, in post-traumatic stress disorders and in panic attacks). In this manuscript, starting from the experimental results available from the literature, a computational model of implicit emotional learning based both on prediction-error computation and on statistical inference is developed. The model quantitatively predicts (a) the occurrence of evaluative conditioning, (b) the dynamics and the resistance-to-extinction of the traumatic emotional responses, (c) the mathematical relation between classical conditioning and unconditioned stimulus revaluation. Moreover, we discuss how the derived computational model can lead to the development of new animal models for resistant-to-extinction emotional reactions and novel methodologies for emotion modulation.

  4. The Effects of a Web-Based Nursing Process Documentation Program on Stress and Anxiety of Nursing Students in South Korea.

    PubMed

    Lee, Eunjoo; Noh, Hyun Kyung

    2016-01-01

    To examine the effects of a web-based nursing process documentation system on the stress and anxiety of nursing students during their clinical practice. A quasi-experimental design was employed. The experimental group (n = 110) used a web-based nursing process documentation program for their case reports as part of assignments for a clinical practicum, whereas the control group (n = 106) used traditional paper-based case reports. Stress and anxiety levels were measured with a numeric rating scale before, 2 weeks after, and 4 weeks after using the web-based nursing process documentation program during a clinical practicum. The data were analyzed using descriptive statistics, t tests, chi-square tests, and repeated-measures analyses of variance. Nursing students who used the web-based nursing process documentation program showed significantly lower levels of stress and anxiety than the control group. A web-based nursing process documentation program could be used to reduce the stress and anxiety of nursing students during clinical practicum, which ultimately would benefit nursing students by increasing satisfaction with and effectiveness of clinical practicum. © 2015 NANDA International, Inc.

  5. Roles of frequency, attitudes, and multiple intelligence modality surrounding Electricity Content-Based Reader's Theatre

    NASA Astrophysics Data System (ADS)

    Hosier, Julie Winchester

    Integration of subjects is something elementary teachers must do to ensure required objectives are covered. Science-based Reader's Theatre is one way to weave reading into science. This study examined the roles of frequency, attitudes, and Multiple Intelligence modalities surrounding Electricity Content-Based Reader's Theatre. The study used a quasi-experimental, repeated-measures ANOVA design with time as a factor. A convenience sample of two fifth-grade classrooms participated in the study for eighteen weeks. Five Electricity Achievement Tests were given throughout the study to assess students' growth. A Student Reader's Theatre Attitudinal Survey revealed students' attitudes before and after the Electricity Content-Based Reader's Theatre treatment. The Multiple Intelligence Inventory for Kids (Faris, 2007) examined whether Multiple Intelligence modality played a role in achievement on Electricity Test 4, the post-treatment test. Analysis using repeated-measures ANOVA and an independent t-test found that students in the experimental group, which practiced its student-created Electricity Content-Based Reader's Theatre skits ten times versus two times for the control group, did significantly better on Electricity Achievement Test 4, t(76) = 3.018, p = 0.003. Dependent t-tests did not find statistically significant differences between students' attitudes about Electricity Content-Based Reader's Theatre before and after treatment. A Kruskal-Wallis test found no statistically significant difference between the mean ranks of the various Multiple Intelligence modality scores (χ² = 5.57, df = 2, p = .062). Qualitative data do, however, indicate students had strong positive feelings about Electricity Content-Based Reader's Theatre after treatment. Students indicated it to be motivating, confidence-building, and a fun way to learn about science; however, they disliked writing their own scripts. Examining frequency, attitudes, and Multiple Intelligence modalities led to the conclusion that frequency had the greatest impact on the success of Electricity Content-Based Reader's Theatre. The participating teachers, students, and researcher found integrating science and reading through Electricity Content-Based Reader's Theatre beneficial.

  6. Simplified Approach to Predicting Rough Surface Transition

    NASA Technical Reports Server (NTRS)

    Boyle, Robert J.; Stripf, Matthias

    2009-01-01

    Turbine vane heat transfer predictions are given for smooth and rough vanes where the experimental data show transition moving forward on the vane as the surface roughness physical height increases. Consistent with smooth vane heat transfer, the transition moves forward for a fixed roughness height as the Reynolds number increases. Comparisons are presented with published experimental data. Some of the data are for a regular roughness geometry with a range of roughness heights, Reynolds numbers, and inlet turbulence intensities. The approach taken in this analysis is to treat the roughness in a statistical sense, consistent with what would be obtained from blades measured after exposure to actual engine environments. An approach is given to determine the equivalent sand grain roughness from the statistics of the regular geometry. This approach is guided by the experimental data. A roughness transition criterion is developed, and comparisons are made with experimental data over the entire range of experimental test conditions. Additional comparisons are made with experimental heat transfer data, where the roughness geometries are both regular as well as statistical. Using the developed analysis, heat transfer calculations are presented for the second stage vane of a high pressure turbine at hypothetical engine conditions.

  7. Experimental design and statistical methods for improved hit detection in high-throughput screening.

    PubMed

    Malo, Nathalie; Hanley, James A; Carlile, Graeme; Liu, Jing; Pelletier, Jerry; Thomas, David; Nadon, Robert

    2010-09-01

    Identification of active compounds in high-throughput screening (HTS) contexts can be substantially improved by applying classical experimental design and statistical inference principles to all phases of HTS studies. The authors present both experimental and simulated data to illustrate how true-positive rates can be maximized without increasing false-positive rates by the following analytical process. First, the use of robust data preprocessing methods reduces unwanted variation by removing row, column, and plate biases. Second, replicate measurements allow estimation of the magnitude of the remaining random error and the use of formal statistical models to benchmark putative hits relative to what is expected by chance. Receiver Operating Characteristic (ROC) analyses revealed superior power for data preprocessed by a trimmed-mean polish method combined with the RVM t-test, particularly for small- to moderate-sized biological hits.
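
    A hedged sketch of the preprocessing step: a trimmed-mean polish that sweeps row and column biases out of a plate of log-signals before hit calling (the RVM t-test itself is not reproduced; the trimming fraction and plate size are assumptions):

      import numpy as np
      from scipy.stats import trim_mean

      def trimmed_mean_polish(plate, proportiontocut=0.1, n_iter=10):
          """Iteratively remove row and column effects, estimated by trimmed
          means, from a plate of log-transformed signals."""
          z = plate.astype(float).copy()
          for _ in range(n_iter):
              z -= trim_mean(z, proportiontocut, axis=1)[:, None]   # row effects
              z -= trim_mean(z, proportiontocut, axis=0)[None, :]   # column effects
          return z

      plate = np.random.default_rng(2).normal(size=(16, 24))   # 384-well layout
      plate[0, :] += 1.5                       # simulated row bias
      residuals = trimmed_mean_polish(plate)   # bias-corrected values for hit calling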

  8. Independent component analysis for the extraction of reliable protein signal profiles from MALDI-TOF mass spectra.

    PubMed

    Mantini, Dante; Petrucci, Francesca; Del Boccio, Piero; Pieragostino, Damiana; Di Nicola, Marta; Lugaresi, Alessandra; Federici, Giorgio; Sacchetta, Paolo; Di Ilio, Carmine; Urbani, Andrea

    2008-01-01

    Independent component analysis (ICA) is a signal processing technique that can be utilized to recover independent signals from a set of their linear mixtures. We propose ICA for the analysis of signals obtained from large proteomics investigations such as clinical multi-subject studies based on MALDI-TOF MS profiling. The method is validated on simulated and experimental data for demonstrating its capability of correctly extracting protein profiles from MALDI-TOF mass spectra. The comparison on peak detection with an open-source and two commercial methods shows its superior reliability in reducing the false discovery rate of protein peak masses. Moreover, the integration of ICA and statistical tests for detecting the differences in peak intensities between experimental groups allows the identification of protein peaks that could be indicators of a diseased state. This data-driven approach is shown to be a promising tool for biomarker-discovery studies based on MALDI-TOF MS technology. The MATLAB implementation of the method described in the article and both simulated and experimental data are freely available at http://www.unich.it/proteomica/bioinf/.
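
    A minimal sketch of the decomposition step using scikit-learn's FastICA (the original implementation is MATLAB); the data file and the number of components are assumptions:

      import numpy as np
      from sklearn.decomposition import FastICA

      # Hypothetical input: rows are MALDI-TOF spectra (subjects), columns are
      # m/z bins; ICA recovers statistically independent source profiles from
      # which protein peak profiles can be read off.
      X = np.load("maldi_spectra.npy")        # shape (n_spectra, n_bins)

      ica = FastICA(n_components=10, random_state=0, max_iter=1000)
      sources = ica.fit_transform(X.T)        # (n_bins, n_components) source profiles
      mixing = ica.mixing_                    # per-spectrum loadings for group tests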

  9. Determination of Membrane-Insertion Free Energies by Molecular Dynamics Simulations

    PubMed Central

    Gumbart, James; Roux, Benoît

    2012-01-01

    The accurate prediction of membrane-insertion probability for arbitrary protein sequences is a critical challenge to identifying membrane proteins and determining their folded structures. Although algorithms based on sequence statistics have had moderate success, a complete understanding of the energetic factors that drive the insertion of membrane proteins is essential to thoroughly meeting this challenge. In the last few years, numerous attempts to define a free-energy scale for amino-acid insertion have been made, yet disagreement between most experimental and theoretical scales persists. However, for a recently resolved water-to-bilayer scale, it is found that molecular dynamics simulations that carefully mimic the conditions of the experiment can reproduce experimental free energies, even when using the same force field as previous computational studies that were cited as evidence of this disagreement. Therefore, it is suggested that experimental and simulation-based scales can both be accurate and that discrepancies stem from disparities in the microscopic processes being considered rather than methodological errors. Furthermore, these disparities make the development of a single universally applicable membrane-insertion free energy scale difficult. PMID:22385850

  10. Temporal complexity in emission from Anderson localized lasers

    NASA Astrophysics Data System (ADS)

    Kumar, Randhir; Balasubrahmaniyam, M.; Alee, K. Shadak; Mujumdar, Sushil

    2017-12-01

    Anderson localization lasers exploit resonant cavities formed due to structural disorder. The inherent randomness in the structure of these cavities realizes a probability distribution in all cavity parameters such as quality factors, mode volumes, mode structures, and so on, implying resultant statistical fluctuations in the temporal behavior. Here we provide direct experimental measurements of temporal width distributions of Anderson localization lasing pulses in intrinsically and extrinsically disordered coupled-microresonator arrays. We first illustrate signature exponential decays in the spatial intensity distributions of the lasing modes that quantify their localized character, and then measure the temporal width distributions of the pulsed emission over several configurations. We observe a dependence of temporal widths on the disorder strength, wherein the widths show a single-peaked, left-skewed distribution in extrinsic disorder and a dual-peaked distribution in intrinsic disorder. We propose a model based on coupled rate equations for an emitter and an Anderson cavity with a random mode structure, which gives excellent quantitative and qualitative agreement with the experimental observations. The experimental and theoretical analyses bring to the fore the temporal complexity in Anderson-localization-based lasing systems.

  11. The IUPAC aqueous and non-aqueous experimental pKa data repositories of organic acids and bases.

    PubMed

    Slater, Anthony Michael

    2014-10-01

    Accurate and well-curated experimental pKa data of organic acids and bases in both aqueous and non-aqueous media are invaluable in many areas of chemical research, including pharmaceutical, agrochemical, specialty chemical and property prediction research. In pharmaceutical research, pKa data are relevant in ligand design, protein binding, absorption, distribution, metabolism, elimination as well as solubility and dissolution rate. The pKa data compilations of the International Union of Pure and Applied Chemistry, originally in book form, have been carefully converted into computer-readable form, with value being added in the process, in the form of ionisation assignments and tautomer enumeration. These compilations offer a broad range of chemistry in both aqueous and non-aqueous media and the experimental conditions and original reference for all pKa determinations are supplied. The statistics for these compilations are presented and the utility of the computer-readable form of these compilations is examined in comparison to other pKa compilations. Finally, information is provided about how to access these databases.

  12. The IUPAC aqueous and non-aqueous experimental pKa data repositories of organic acids and bases

    NASA Astrophysics Data System (ADS)

    Slater, Anthony Michael

    2014-10-01

    Accurate and well-curated experimental pKa data of organic acids and bases in both aqueous and non-aqueous media are invaluable in many areas of chemical research, including pharmaceutical, agrochemical, specialty chemical and property prediction research. In pharmaceutical research, pKa data are relevant in ligand design, protein binding, absorption, distribution, metabolism, elimination as well as solubility and dissolution rate. The pKa data compilations of the International Union of Pure and Applied Chemistry, originally in book form, have been carefully converted into computer-readable form, with value being added in the process, in the form of ionisation assignments and tautomer enumeration. These compilations offer a broad range of chemistry in both aqueous and non-aqueous media and the experimental conditions and original reference for all pKa determinations are supplied. The statistics for these compilations are presented and the utility of the computer-readable form of these compilations is examined in comparison to other pKa compilations. Finally, information is provided about how to access these databases.

  13. A smartphone application to educate undergraduate nursing students about providing care for infant airway obstruction.

    PubMed

    Kim, Shin-Jeong; Shin, Hyewon; Lee, Jungeun; Kang, SoRa; Bartlett, Robin

    2017-01-01

    This study had two aims: (a) to develop a smartphone-based application and (b) to evaluate the effectiveness of the application by measuring nursing students' knowledge, skills, and confidence in simulated performance when providing care for infant airway obstruction. We conducted a randomized trial using a pre- and post-test design at a university in Korea. Seventy-three junior nursing students participated. A smartphone-based app using a video was developed for the experimental group, and one-time lecture-based education was designed for the control group. We provided the app and information about its use to the experimental group, and we encouraged its use. We provided classroom instruction to the control group. Then, learning outcomes were evaluated. The smartphone-based education group showed significantly higher scores on skills (t=4.774, p<0.001) and confidence in performance (t=2.888, p=0.005) than the control group. The scores on knowledge (t=0.886, p=0.379) and satisfaction with the learning method (t=0.168, p=0.867) for the experimental group were higher than for the control group, but the differences were not statistically significant. This study suggests that smartphone-based education may be an effective method to use in nursing education related to teaching infant airway obstruction. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Mindfulness training for reducing anger, anxiety, and depression in fibromyalgia patients

    PubMed Central

    Amutio, Alberto; Franco, Clemente; Pérez-Fuentes, María de Carmen; Gázquez, José J.; Mercader, Isabel

    2015-01-01

    Fibromyalgia is a disabling syndrome. Results obtained with different therapies are very limited to date. The goal of this study was to verify whether the application of a mindfulness-based training program was effective in modifying anger, anxiety, and depression levels in a group of women diagnosed with fibromyalgia. This study is an experimental trial that employed a waiting list control group. Measures were taken at three different times: pretest, posttest, and follow-up. The statistical analyses revealed a significant reduction of anger (trait) levels, internal expression of anger, state anxiety, and depression in the experimental group as compared to the control group, as well as a significant increase in internal control of anger. It can be concluded that the mindfulness-based treatment was effective after 7 weeks. These results were maintained 3 months after the end of the intervention. PMID:25628591

  15. Assessing noninferiority in a three-arm trial using the Bayesian approach.

    PubMed

    Ghosh, Pulak; Nathoo, Farouk; Gönen, Mithat; Tiwari, Ram C

    2011-07-10

    Non-inferiority trials, which aim to demonstrate that a test product is not worse than a competitor by more than a pre-specified small amount, are of great importance to the pharmaceutical community. As a result, methodology for designing and analyzing such trials is required, and developing new methods for such analysis is an important area of statistical research. The three-arm trial consists of a placebo, a reference and an experimental treatment, and simultaneously tests the superiority of the reference over the placebo along with comparing this reference to an experimental treatment. In this paper, we consider the analysis of non-inferiority trials using Bayesian methods which incorporate both parametric as well as semi-parametric models. The resulting testing approach is both flexible and robust. The benefit of the proposed Bayesian methods is assessed via simulation, based on a study examining home-based blood pressure interventions. Copyright © 2011 John Wiley & Sons, Ltd.
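
    A much-simplified conjugate sketch of the Bayesian logic for a binary endpoint (flat Beta priors, illustrative counts and margin), not the parametric/semi-parametric models of the paper:

      import numpy as np

      # Posterior response rates for the three arms under Beta(1,1) priors,
      # then Monte Carlo estimates of the posterior probabilities of
      # reference-over-placebo superiority and experimental non-inferiority.
      rng = np.random.default_rng(3)
      n = 100_000
      delta = 0.10                                      # non-inferiority margin

      post = {arm: rng.beta(1 + s, 1 + f, n)            # successes s, failures f
              for arm, (s, f) in {"placebo": (30, 70),
                                  "reference": (55, 45),
                                  "experimental": (50, 50)}.items()}

      p_assay = (post["reference"] > post["placebo"]).mean()
      p_noninf = (post["experimental"] > post["reference"] - delta).mean()
      print(p_assay, p_noninf)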

  16. ABS-FishCount: An Agent-Based Simulator of Underwater Sensors for Measuring the Amount of Fish.

    PubMed

    García-Magariño, Iván; Lacuesta, Raquel; Lloret, Jaime

    2017-11-13

    Underwater sensors provide one of the possibilities to explore oceans, seas, rivers, fish farms and dams, which all together cover most of our planet's area. Simulators can be helpful to test and discover some possible strategies before implementing these in real underwater sensors. This speeds up the development of research theories so that these can be implemented later. In this context, the current work presents an agent-based simulator for defining and testing strategies for measuring the amount of fish by means of underwater sensors. The current approach is illustrated with the definition and assessment of two strategies for measuring fish. One of these two corresponds to a simple control mechanism, while the other is an experimental strategy and includes an implicit coordination mechanism. The experimental strategy showed a statistically significant improvement over the control one in the reduction of errors with a large Cohen's d effect size of 2.55.

  17. Experimental bit commitment based on quantum communication and special relativity.

    PubMed

    Lunghi, T; Kaniewski, J; Bussières, F; Houlmann, R; Tomamichel, M; Kent, A; Gisin, N; Wehner, S; Zbinden, H

    2013-11-01

    Bit commitment is a fundamental cryptographic primitive in which Bob wishes to commit a secret bit to Alice. Perfectly secure bit commitment between two mistrustful parties is impossible through asynchronous exchange of quantum information. Perfect security is, however, possible when Alice and Bob split into several agents exchanging classical and quantum information at times and locations suitably chosen to satisfy specific relativistic constraints. Here we report on an implementation of a bit commitment protocol using quantum communication and special relativity. Our protocol is based on [A. Kent, Phys. Rev. Lett. 109, 130501 (2012)] and has the advantage that it is practically feasible with arbitrarily large separations between the agents in order to maximize the commitment time. By positioning agents in Geneva and Singapore, we obtain a commitment time of 15 ms. A security analysis considering experimental imperfections and finite statistics is presented.

  18. Effects of special composite stretching on the swing of amateur golf players

    PubMed Central

    Lee, Joong-chul; Lee, Sung-wan; Yeo, Yun-ghi; Park, Gi Duck

    2015-01-01

    [Purpose] The study investigated stretching for a safer golf swing, in comparison with current stretching methods for proper swings, in order to examine the effects of stretching exercises on the golf swing. [Subjects] The subjects were 20 amateur golf club members who were divided into two groups: an experimental group which performed stretching, and a control group which did not. The subjects had no bone deformity, muscle weakness, muscle soreness, or neurological problems. [Methods] A swing analyzer and a ROM measuring instrument were used as the measuring tools. The swing analyzer was a GS400-golf hit ball analyzer (Korea) and the ROM measuring instrument was a goniometer (Korea). [Results] The experimental group showed a statistically significant improvement in driving distance. After the special stretching training for golf, a statistically significant difference in hit-ball direction deviation after swings was found between the groups, with the experimental group showing statistically significant decreases in hit-ball direction deviation. Statistically significant differences in hit-ball speed were also found between the groups, with the experimental group showing significant increases in hit-ball speed. [Conclusion] To examine the effects of a special stretching program for golf on golf swing-related factors, 20 male amateur golf club members performed a 12-week stretching training program. After the golf stretching training, statistically significant differences were found between the groups in hit-ball driving distance, direction deviation, deflection distance, and speed. PMID:25995553

  19. Effects of special composite stretching on the swing of amateur golf players.

    PubMed

    Lee, Joong-Chul; Lee, Sung-Wan; Yeo, Yun-Ghi; Park, Gi Duck

    2015-04-01

    [Purpose] The study investigated stretching for a safer golf swing, in comparison with current stretching methods for proper swings, in order to examine the effects of stretching exercises on the golf swing. [Subjects] The subjects were 20 amateur golf club members who were divided into two groups: an experimental group which performed stretching, and a control group which did not. The subjects had no bone deformity, muscle weakness, muscle soreness, or neurological problems. [Methods] A swing analyzer and a ROM measuring instrument were used as the measuring tools. The swing analyzer was a GS400-golf hit ball analyzer (Korea) and the ROM measuring instrument was a goniometer (Korea). [Results] The experimental group showed a statistically significant improvement in driving distance. After the special stretching training for golf, a statistically significant difference in hit-ball direction deviation after swings was found between the groups, with the experimental group showing statistically significant decreases in hit-ball direction deviation. Statistically significant differences in hit-ball speed were also found between the groups, with the experimental group showing significant increases in hit-ball speed. [Conclusion] To examine the effects of a special stretching program for golf on golf swing-related factors, 20 male amateur golf club members performed a 12-week stretching training program. After the golf stretching training, statistically significant differences were found between the groups in hit-ball driving distance, direction deviation, deflection distance, and speed.

  20. Using iterative cluster merging with improved gap statistics to perform online phenotype discovery in the context of high-throughput RNAi screens

    PubMed Central

    Yin, Zheng; Zhou, Xiaobo; Bakal, Chris; Li, Fuhai; Sun, Youxian; Perrimon, Norbert; Wong, Stephen TC

    2008-01-01

    Background The recent emergence of high-throughput automated image acquisition technologies has forever changed how cell biologists collect and analyze data. Historically, the interpretation of cellular phenotypes in different experimental conditions has been dependent upon the expert opinions of well-trained biologists. Such qualitative analysis is particularly effective in detecting subtle, but important, deviations in phenotypes. However, while the rapid and continuing development of automated microscope-based technologies now facilitates the acquisition of trillions of cells in thousands of diverse experimental conditions, such as in the context of RNA interference (RNAi) or small-molecule screens, the massive size of these datasets precludes human analysis. Thus, the development of automated methods which aim to identify novel and biologically relevant phenotypes online is one of the major challenges in high-throughput image-based screening. Ideally, phenotype discovery methods should be designed to utilize prior/existing information and tackle three challenging tasks, i.e. restoring pre-defined biologically meaningful phenotypes, differentiating novel phenotypes from known ones, and distinguishing novel phenotypes from each other. Arbitrarily extracted information causes biased analysis, while combining the complete existing datasets with each new image is intractable in high-throughput screens. Results Here we present the design and implementation of a novel and robust online phenotype discovery method with broad applicability that can be used in diverse experimental contexts, especially high-throughput RNAi screens. This method features phenotype modelling and iterative cluster merging using improved gap statistics. A Gaussian Mixture Model (GMM) is employed to estimate the distribution of each existing phenotype and is then used as the reference distribution in the gap statistics. The method is broadly applicable to a number of different types of image-based datasets derived from a wide spectrum of experimental conditions and is suitable for adaptively processing new images which are continuously added to existing datasets. Validations were carried out on different datasets, including a published RNAi screen using Drosophila embryos [Additional files 1, 2], a dataset for cell cycle phase identification using HeLa cells [Additional files 1, 3, 4], and a synthetic dataset using polygons; our method tackled the three aforementioned tasks effectively with an accuracy range of 85%–90%. When our method was implemented in the context of a Drosophila genome-scale RNAi image-based screen of cultured cells aimed at identifying the contribution of individual genes towards the regulation of cell shape, it efficiently discovered meaningful new phenotypes and provided novel biological insight. We also propose a two-step procedure to modify a novelty detection method based on one-class SVM, so that it can be used for online phenotype discovery. Under different conditions, we compared the SVM-based method with our method using various datasets, and our method consistently outperformed the SVM-based method in at least two of the three tasks by 2% to 5%. These results demonstrate that our method can be used to better identify novel phenotypes in image-based datasets from a wide range of conditions and organisms. Conclusion We demonstrate that our method can detect various novel phenotypes effectively in complex datasets. Experimental results also confirm that our method performs consistently under different orders of image input, variations in starting conditions (including the number and composition of existing phenotypes), and datasets from different screens. Our findings indicate that the proposed method is suitable for online phenotype discovery in diverse high-throughput image-based genetic and chemical screens. PMID:18534020
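
    A minimal version of the underlying gap statistic (Tibshirani-style, against a uniform reference), the quantity the paper improves for iterative cluster merging; the data here are synthetic:

      import numpy as np
      from sklearn.cluster import KMeans

      def gap_statistic(X, k, n_ref=10, seed=0):
          """Compare log within-cluster dispersion to its expectation under a
          uniform reference distribution; a larger gap favors that k."""
          rng = np.random.default_rng(seed)
          log_wk = np.log(KMeans(k, n_init=10, random_state=seed).fit(X).inertia_)
          lo, hi = X.min(axis=0), X.max(axis=0)
          ref = []
          for _ in range(n_ref):
              Xr = rng.uniform(lo, hi, size=X.shape)    # uniform reference data
              ref.append(np.log(KMeans(k, n_init=10).fit(Xr).inertia_))
          return np.mean(ref) - log_wk

      X = np.random.default_rng(1).normal(size=(300, 5))   # synthetic features
      print([round(gap_statistic(X, k), 3) for k in (2, 3, 4)])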

  1. The Effect of Visual of a Courseware towards Pre-University Students' Learning in Literature

    NASA Astrophysics Data System (ADS)

    Masri, Mazyrah; Wan Ahmad, Wan Fatimah; Nordin, Shahrina Md.; Sulaiman, Suziah

    This paper highlights the effect of the visuals in a multimedia courseware, Black Cat Courseware (BC-C), developed for learning literature at the pre-university level at Universiti Teknologi PETRONAS (UTP). The contents of the courseware are based on the Black Cat story, which is covered in an English course at the university. The objective of this paper is to evaluate the usability and effectiveness of BC-C. A total of sixty foundation students were involved in the study. A quasi-experimental design was employed with two groups: an experimental group and a control group. The experimental group interacted with BC-C as part of the learning activities, while the control group used conventional learning methods. The results indicate that the experimental group achieved statistically significant improvement over the control group in understanding the Black Cat story. The results also suggest that visuals improve students' performance in literature learning at the pre-university level.

  2. Sequential experimental design based generalised ANOVA

    NASA Astrophysics Data System (ADS)

    Chakraborty, Souvik; Chowdhury, Rajib

    2016-07-01

    Over the last decade, surrogate modelling techniques have gained wide popularity in the fields of uncertainty quantification, optimization, model exploration and sensitivity analysis. This approach relies on experimental design to generate training points and on regression/interpolation to generate the surrogate. In this work, it is argued that conventional experimental design may render a surrogate model inefficient. In order to address this issue, this paper presents a novel distribution adaptive sequential experimental design (DA-SED). The proposed DA-SED has been coupled with a variant of generalised analysis of variance (G-ANOVA), developed by representing the component function using the generalised polynomial chaos expansion. Moreover, generalised analytical expressions for calculating the first two statistical moments of the response, which are utilized in predicting the probability of failure, have also been developed. The proposed approach has been utilized in predicting the probability of failure for three structural mechanics problems. It is observed that the proposed approach yields accurate and computationally efficient estimates of the failure probability.
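
    As a hedged illustration of the final step only: once a surrogate supplies the first two moments of a limit-state response g(X), a first-order estimate of the failure probability follows from the reliability index. The Gaussian approximation and the g < 0 failure convention are assumptions of this example, not the paper's full procedure:

      from scipy.stats import norm

      def failure_probability(mean_g, std_g):
          """First-order estimate: beta = mean(g)/std(g), Pf ~ Phi(-beta),
          assuming failure corresponds to g < 0 and near-Gaussian response."""
          beta = mean_g / std_g
          return norm.cdf(-beta)

      print(failure_probability(mean_g=3.2, std_g=1.1))   # illustrative moments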

  3. Sequential experimental design based generalised ANOVA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakraborty, Souvik, E-mail: csouvik41@gmail.com; Chowdhury, Rajib, E-mail: rajibfce@iitr.ac.in

    Over the last decade, surrogate modelling techniques have gained wide popularity in the fields of uncertainty quantification, optimization, model exploration and sensitivity analysis. This approach relies on experimental design to generate training points and on regression/interpolation to generate the surrogate. In this work, it is argued that conventional experimental design may render a surrogate model inefficient. In order to address this issue, this paper presents a novel distribution adaptive sequential experimental design (DA-SED). The proposed DA-SED has been coupled with a variant of generalised analysis of variance (G-ANOVA), developed by representing the component function using the generalised polynomial chaos expansion. Moreover, generalised analytical expressions for calculating the first two statistical moments of the response, which are utilized in predicting the probability of failure, have also been developed. The proposed approach has been utilized in predicting the probability of failure for three structural mechanics problems. It is observed that the proposed approach yields accurate and computationally efficient estimates of the failure probability.

  4. Using Bayes' theorem for free energy calculations

    NASA Astrophysics Data System (ADS)

    Rogers, David M.

    Statistical mechanics is fundamentally based on calculating the probabilities of molecular-scale events. Although Bayes' theorem has generally been recognized as providing key guiding principles for the setup and analysis of statistical experiments [83], classical frequentist models still predominate in the world of computational experimentation. As a starting point for widespread application of Bayesian methods in statistical mechanics, we investigate the central quantity of free energies from this perspective. This dissertation thus reviews the basics of Bayes' view of probability theory and the maximum entropy formulation of statistical mechanics before providing examples of its application to several advanced research areas. We first apply Bayes' theorem to a multinomial counting problem in order to determine inner shell and hard sphere solvation free energy components of Quasi-Chemical Theory [140]. We proceed to consider the general problem of free energy calculations from samples of interaction energy distributions. From there, we turn to spline-based estimation of the potential of mean force [142], and empirical modeling of observed dynamics using integrator matching. The results of this research are expected to advance the state of the art in coarse-graining methods, as they allow a systematic connection from high-resolution (atomic) to low-resolution (coarse) structure and dynamics. In total, our work on these problems constitutes a critical starting point for further application of Bayes' theorem in all areas of statistical mechanics. It is hoped that the understanding so gained will allow for improvements in comparisons between theory and experiment.
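
    A small example of the kind of free energy estimate the dissertation revisits: the Zwanzig exponential-averaging identity applied to sampled interaction-energy changes (the samples here are synthetic Gaussian draws):

      import numpy as np
      from scipy.special import logsumexp

      # Zwanzig identity: dF = -kT * ln <exp(-dU/kT)>_0, estimated from samples
      # of the interaction-energy change dU; logsumexp keeps it numerically stable.
      def zwanzig_free_energy(dU, kT=0.596):            # kT in kcal/mol near 300 K
          dU = np.asarray(dU) / kT
          return -kT * (logsumexp(-dU) - np.log(dU.size))

      dU = np.random.default_rng(4).normal(2.0, 1.0, 5000)   # illustrative samples
      print(zwanzig_free_energy(dU))   # analytic Gaussian result: mu - var/(2*kT)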

  5. Effects of exercise on fatigue, sleep, and performance: a randomized trial.

    PubMed

    Coleman, Elizabeth Ann; Goodwin, Julia A; Kennedy, Robert; Coon, Sharon K; Richards, Kathy; Enderlin, Carol; Stewart, Carol B; McNatt, Paula; Lockhart, Kim; Anaissie, Elias J

    2012-09-01

    To compare usual care with a home-based individualized exercise program (HBIEP) in patients receiving intensive treatment for multiple myeloma (MM) and epoetin alfa therapy. Randomized trial with repeated measures of two groups (one experimental and one control) and an approximate 15-week experimental period. Outpatient setting of the Myeloma Institute for Research and Therapy at the Rockefeller Cancer Center at the University of Arkansas for Medical Sciences. 187 patients with newly diagnosed MM enrolled in a separate study evaluating effectiveness of the Total Therapy regimen, with or without thalidomide. Measurements included the Profile of Mood States fatigue scale, Functional Assessment of Cancer Therapy-Fatigue, ActiGraph® recordings, 6-Minute Walk Test, and hemoglobin levels at baseline and before and after stem cell collection. Descriptive statistics were used to compare demographics and treatment effects, and repeated measures analysis of variance was used to determine effects of the HBIEP. Fatigue, nighttime sleep, and performance (aerobic capacity) served as dependent or outcome measures, and the HBIEP combining strength building and aerobic exercise served as the independent variable. Both groups were equivalent for age, gender, race, receipt of thalidomide, hemoglobin levels, and type of treatment regimen for MM. No statistically significant differences existed among the experimental and control groups for fatigue, sleep, or performance (aerobic capacity). Statistically significant differences (p < 0.05) were found in each of the study outcomes for all patients as treatment progressed and patients experienced more fatigue and poorer nighttime sleep and performance (aerobic capacity). The effect of exercise on decreasing fatigue, improving sleep, and improving performance (aerobic capacity) appeared to be minimal. Exercise is safe and has physiologic benefits for patients undergoing MM treatment; exercise combined with epoetin alfa helped alleviate anemia.

  6. Review of research designs and statistical methods employed in dental postgraduate dissertations.

    PubMed

    Shirahatti, Ravi V; Hegde-Shetiya, Sahana

    2015-01-01

    There is a need to evaluate the quality of postgraduate dissertations in dentistry submitted to the university in light of international standards of reporting. We conducted the review with the objective of documenting the use of sampling methods, measurement standardization, blinding, methods to eliminate bias, appropriate use of statistical tests, and appropriate data presentation in postgraduate dental research, and of suggesting and recommending modifications. The public-access database of dissertations from Rajiv Gandhi University of Health Sciences was reviewed. Three hundred and thirty-three eligible dissertations underwent preliminary evaluation, followed by detailed evaluation of 10% of randomly selected dissertations. The dissertations were assessed based on international reporting guidelines such as Strengthening the Reporting of Observational Studies in Epidemiology (STROBE), Consolidated Standards of Reporting Trials (CONSORT), and other scholarly resources. The data were compiled using MS Excel and SPSS 10.0. Numbers and percentages were used to describe the data. "In vitro" studies were the most common type of research (39%), followed by observational (32%) and experimental (29%) studies. Conservative dentistry (92%) and prosthodontics (75%) reported high proportions of in vitro research, while oral surgery (80%) and periodontics (67%) conducted experimental studies as the major share of their research. Lacunae in the studies included observational studies not following random sampling (70%), experimental studies not following random allocation (75%), failure to report blinding, confounding variables, and calibration of measurements, misrepresentation of data through inappropriate presentation, errors in reporting probability values, and failure to report confidence intervals. A few studies showed grossly inappropriate choices of statistical tests, and many studies needed additional tests. Overall, the observations indicated the need to comply with standard guidelines for reporting research.

  7. Exploring the statistical and clinical impact of two interim analyses on the Phase II design with option for direct assignment.

    PubMed

    An, Ming-Wen; Mandrekar, Sumithra J; Edelman, Martin J; Sargent, Daniel J

    2014-07-01

    The primary goal of Phase II clinical trials is to better understand a treatment's safety and efficacy in order to inform a Phase III go/no-go decision. Many Phase II designs have been proposed, incorporating randomization, interim analyses, adaptation, and patient selection. The Phase II design with an option for direct assignment (i.e., stop randomization and assign all patients to the experimental arm based on a single interim analysis (IA) at 50% accrual) was recently proposed [An et al., 2012]. We discuss this design in the context of existing designs and extend it from a single-IA to a two-IA design. We compared the statistical properties and clinical relevance of the direct assignment design with two IAs (DAD-2) versus a balanced randomized design with two IAs (BRD-2) and a direct assignment design with one IA (DAD-1), over a range of response rate ratios (2.0-3.0). The DAD-2 has minimal loss in power (<2.2%) and minimal increase in type I error rate (<1.6%) compared to the BRD-2. As many as 80% more patients were treated with experimental vs. control in the DAD-2 than in the BRD-2 (experimental vs. control ratio: 1.8 vs. 1.0), and as many as 64% more in the DAD-2 than in the DAD-1 (1.8 vs. 1.1). We illustrate the DAD-2 using a case study in lung cancer. In the spectrum of Phase II designs, the direct assignment design, especially with two IAs, provides a middle ground with desirable statistical properties and likely appeal to both clinicians and patients. Copyright © 2014 Elsevier Inc. All rights reserved.
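
    A rough Monte Carlo sketch of the single-IA direct assignment idea is given below. It is a deliberate simplification (binary responses, a pooled one-sided z-test, an informal go/no-go rule at the interim analysis), not the design or the operating characteristics reported by An et al.:

    ```python
    import numpy as np
    from scipy import stats

    def simulate_dad1(p_ctrl, p_exp, n_total=80, alpha=0.10, n_sims=20_000, seed=2):
        """Crude sketch of a direct-assignment design with one interim
        analysis (IA) at 50% accrual: if the experimental arm is not worse
        at the IA, stop randomizing and assign all remaining patients to it."""
        rng = np.random.default_rng(seed)
        rejections = 0
        n_half = n_total // 2
        for _ in range(n_sims):
            # Stage 1: 1:1 randomization.
            n_exp = n_ctrl = n_half // 2
            x_exp = rng.binomial(n_exp, p_exp)
            x_ctrl = rng.binomial(n_ctrl, p_ctrl)
            # IA: direct assignment if the experimental rate is at least as good.
            if x_exp / n_exp >= x_ctrl / n_ctrl:
                x_exp += rng.binomial(n_half, p_exp)
                n_exp += n_half
            else:  # otherwise continue 1:1 randomization
                x_exp += rng.binomial(n_half // 2, p_exp)
                x_ctrl += rng.binomial(n_half // 2, p_ctrl)
                n_exp += n_half // 2
                n_ctrl += n_half // 2
            # Final analysis: one-sided pooled z-test for a higher response rate.
            pooled = (x_exp + x_ctrl) / (n_exp + n_ctrl)
            se = np.sqrt(pooled * (1 - pooled) * (1 / n_exp + 1 / n_ctrl))
            if se > 0 and (x_exp / n_exp - x_ctrl / n_ctrl) / se > stats.norm.ppf(1 - alpha):
                rejections += 1
        return rejections / n_sims

    print("T1ER :", simulate_dad1(p_ctrl=0.2, p_exp=0.2))  # null case: near alpha
    print("power:", simulate_dad1(p_ctrl=0.2, p_exp=0.4))  # response rate ratio 2.0
    ```

    Under the null both arms share one response rate, so the first call approximates the type I error rate; the second approximates power at a response rate ratio of 2.0.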

  8. Statistical modelling of networked human-automation performance using working memory capacity.

    PubMed

    Ahmed, Nisar; de Visser, Ewart; Shaw, Tyler; Mohamed-Ameen, Amira; Campbell, Mark; Parasuraman, Raja

    2014-01-01

    This study examines the challenging problem of modelling the interaction between individual attentional limitations and decision-making performance in networked human-automation system tasks. Analysis of real experimental data from a task involving networked supervision of multiple unmanned aerial vehicles by human participants shows that both task load and network message quality affect performance, but that these effects are modulated by individual differences in working memory (WM) capacity. These insights were used to assess three statistical approaches for modelling and making predictions with real experimental networked supervisory performance data: classical linear regression, non-parametric Gaussian processes, and probabilistic Bayesian networks. It is shown that each of these approaches can help designers of networked human-automation systems cope with various uncertainties and accommodate future users by linking expected operating conditions and performance from real experimental data to observable cognitive traits like WM capacity. Practitioner Summary: Working memory (WM) capacity helps account for inter-individual variability in operator performance in networked unmanned aerial vehicle supervisory tasks. This is useful for reliable performance prediction near experimental conditions via linear models; robust statistical prediction beyond experimental conditions via Gaussian process models; and probabilistic inference about unknown task conditions and WM capacities via Bayesian network models.
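
    The contrast between the first two modelling approaches can be sketched as follows. The data-generating model, variable ranges, and kernel choice are illustrative assumptions, with scikit-learn standing in for whatever software the study actually used:

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel
    from sklearn.linear_model import LinearRegression

    # Hypothetical data: decision accuracy as a function of working-memory
    # capacity score and task load (number of supervised vehicles).
    rng = np.random.default_rng(3)
    wm = rng.uniform(2, 8, 200)
    load = rng.integers(1, 6, 200)
    accuracy = 0.9 - 0.04 * load + 0.02 * wm * load / 5 + rng.normal(0, 0.03, 200)
    X = np.column_stack([wm, load])

    linear = LinearRegression().fit(X, accuracy)
    gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                  normalize_y=True).fit(X, accuracy)

    # The GP returns a predictive standard deviation, which widens away from
    # the training data -- useful when extrapolating beyond tested conditions.
    X_new = np.array([[9.0, 7.0]])   # outside the experimental range
    mean, std = gp.predict(X_new, return_std=True)
    print("linear:", linear.predict(X_new), "GP:", mean, "+/-", std)
    ```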

  9. Parameter discovery in stochastic biological models using simulated annealing and statistical model checking.

    PubMed

    Hussain, Faraz; Jha, Sumit K; Jha, Susmit; Langmead, Christopher J

    2014-01-01

    Stochastic models are increasingly used to study the behaviour of biochemical systems. While the structure of such models is often readily available from first principles, unknown quantitative features are incorporated into the model as parameters. Algorithmic discovery of parameter values from experimentally observed facts remains a challenge for the computational systems biology community. We present a new parameter discovery algorithm that uses simulated annealing, sequential hypothesis testing, and statistical model checking to learn the parameters in a stochastic model. We apply our technique to a model of glucose and insulin metabolism used for in silico validation of artificial pancreata and demonstrate its effectiveness by developing a parallel CUDA-based implementation for parameter synthesis in this model.
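
    A minimal sketch of the algorithm's core loop appears below: simulated annealing proposes parameter values, and a Monte Carlo estimate of the probability that simulated trajectories satisfy a property stands in for statistical model checking. The toy glucose dynamics and all constants are assumptions, not the authors' model:

    ```python
    import math
    import random

    def satisfies_property(params, n_runs=200):
        """Statistical model checking stand-in: estimate by Monte Carlo the
        probability that stochastic trajectories keep glucose within bounds.
        (Toy dynamics; a real model would simulate the biochemical system.)"""
        k_uptake, k_release = params
        ok = 0
        for _ in range(n_runs):
            glucose = 120.0
            for _ in range(100):
                glucose += k_release - k_uptake * glucose * 0.01
                glucose += random.gauss(0, 1.0)          # intrinsic noise
            ok += 70.0 <= glucose <= 180.0
        return ok / n_runs

    def anneal(target=0.95, steps=300, t0=1.0):
        """Simulated annealing over parameters, scoring each candidate by
        the statistically checked probability of satisfying the property."""
        params = [1.0, 1.0]
        score = satisfies_property(params)
        best, best_score = list(params), score
        for step in range(steps):
            temp = t0 * (1 - step / steps) + 1e-3
            cand = [max(1e-3, p + random.gauss(0, 0.1)) for p in params]
            cand_score = satisfies_property(cand)
            # Accept uphill moves; accept downhill with Boltzmann probability.
            if (cand_score > score
                    or random.random() < math.exp((cand_score - score) / temp)):
                params, score = cand, cand_score
                if score > best_score:
                    best, best_score = list(params), score
            if best_score >= target:
                break
        return best, best_score

    random.seed(4)
    print(anneal())
    ```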

  10. A quantitative link between microplastic instability and macroscopic deformation behaviors in metallic glasses

    NASA Astrophysics Data System (ADS)

    Wu, Y.; Chen, G. L.; Hui, X. D.; Liu, C. T.; Lin, Y.; Shang, X. C.; Lu, Z. P.

    2009-10-01

    Based on the mechanical instability of individual shear transformation zones (STZs), a quantitative link between microplastic instability and the macroscopic deformation behavior of metallic glasses was proposed. Our analysis confirms that macroscopic metallic glasses comprise a statistical distribution of STZ embryos with distributed activation energies, and that the microplastic instability of all the individual STZs dictates the macroscopic deformation behavior of amorphous solids. The statistical model presented in this paper successfully reproduces experimentally determined macroscopic stress-strain curves and can readily be used to predict strain-rate effects on the macroscopic response, given the material parameters at a certain strain rate, offering new insights into the actual deformation mechanism in amorphous solids.

  11. Angular filter refractometry analysis using simulated annealing.

    PubMed

    Angland, P; Haberberger, D; Ivancic, S T; Froula, D H

    2017-10-01

    Angular filter refractometry (AFR) is a novel technique used to characterize the density profiles of laser-produced, long-scale-length plasmas [Haberberger et al., Phys. Plasmas 21, 056304 (2014)]. A new method of analysis for AFR images was developed using an annealing algorithm to iteratively converge upon a solution. A synthetic AFR image is constructed from a user-defined density profile described by eight parameters, and the algorithm systematically alters the parameters until the comparison with the measured image is optimized. The optimization and statistical uncertainty calculation are based on minimization of the χ² test statistic. The algorithm was successfully applied to experimental data of plasma expanding from a flat, laser-irradiated target, resulting in an average uncertainty in the density profile of 5%-20% in the region of interest.
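
    A reduced sketch of this χ²-minimizing annealing fit is shown below, using SciPy's dual_annealing on a 1D four-parameter profile rather than the paper's eight-parameter synthetic image; the profile shape, noise level, and bounds are all assumptions:

    ```python
    import numpy as np
    from scipy.optimize import dual_annealing

    # Hypothetical 1D stand-in for the AFR fit: a two-scale exponential
    # density profile with 4 parameters (the real analysis uses 8).
    def density(x, params):
        n0, L0, n1, L1 = params
        return n0 * np.exp(-x / L0) + n1 * np.exp(-x / L1)

    x = np.linspace(0.0, 1.0, 200)
    true = (1.0, 0.15, 0.2, 0.6)
    rng = np.random.default_rng(5)
    sigma = 0.02
    measured = density(x, true) + rng.normal(0, sigma, x.size)

    def chi2(params):
        # chi^2 statistic comparing the synthetic and measured profiles
        return np.sum((density(x, params) - measured) ** 2 / sigma ** 2)

    bounds = [(0.1, 2.0), (0.01, 1.0), (0.01, 1.0), (0.1, 2.0)]
    result = dual_annealing(chi2, bounds, seed=6)
    print("fit:", result.x, "chi2:", result.fun)
    ```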

  12. A statistical model of brittle fracture by transgranular cleavage

    NASA Astrophysics Data System (ADS)

    Lin, Tsann; Evans, A. G.; Ritchie, R. O.

    A model for brittle fracture by transgranular cleavage cracking is presented, based on the application of weakest-link statistics to the critical microstructural fracture mechanisms. The model permits prediction of the macroscopic fracture toughness, KIc, in single-phase microstructures containing a known distribution of particles, and defines the critical distance from the crack tip at which the initial cracking event is most probable. The model is developed for unstable fracture ahead of a sharp crack, considering both linear elastic and nonlinear elastic ("elastic/plastic") crack-tip stress fields. Predictions are evaluated by comparison with experimental results on the low-temperature flow and fracture behavior of a low-carbon mild steel with a simple ferrite/grain-boundary-carbide microstructure.
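
    The weakest-link idea can be sketched numerically: sample a particle-size distribution, assign each particle a Griffith-type strength, and take the specimen strength as the minimum over its particles. All distributions and constants below are illustrative assumptions, not the calibration of this model:

    ```python
    import numpy as np

    # Weakest-link sketch: a specimen fails when the weakest of its N
    # carbide particles cracks. Particle strength follows a Griffith-type
    # size scaling, sigma_f ~ d**-0.5, over a distribution of sizes d.
    rng = np.random.default_rng(7)
    n_specimens, n_particles = 10_000, 500
    d = rng.lognormal(mean=np.log(0.5), sigma=0.4,
                      size=(n_specimens, n_particles))   # particle size, um
    sigma_f = 2000.0 / np.sqrt(d)   # MPa; scaling constant assumed

    # Weakest link: specimen cleavage strength = minimum local strength.
    strength = sigma_f.min(axis=1)
    print("median cleavage strength: %.0f MPa" % np.median(strength))

    # Survival follows the weakest-link form P_s(sigma) = [1 - F(sigma)]**N
    # for a single-particle strength CDF F; here estimated empirically.
    for s in (1800, 2000, 2200):
        print(f"P(survive {s} MPa) = {(strength > s).mean():.3f}")
    ```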

  13. Experimental Demonstration of the Einstein-Podolsky-Rosen Steering Game Based on the All-Versus-Nothing Proof

    NASA Astrophysics Data System (ADS)

    Sun, Kai; Xu, Jin-Shi; Ye, Xiang-Jun; Wu, Yu-Chun; Chen, Jing-Ling; Li, Chuan-Feng; Guo, Guang-Can

    2014-10-01

    Einstein-Podolsky-Rosen (EPR) steering, a generalization of the original concept of "steering" proposed by Schrödinger, describes the ability of one system to nonlocally affect another system's states through local measurements. Some experimental efforts to test EPR steering in terms of inequalities have been made, but these usually require many measurement settings. By analogy with the "all-versus-nothing" (AVN) proof of Bell's theorem without inequalities, testing steerability without inequalities is both stronger and less resource-intensive. Moreover, the practical meaning of steering implies that it should also be possible to store the state information on the side to be steered, a result that had not yet been experimentally demonstrated. Using a recent AVN criterion for two-qubit entangled states, we experimentally implement a practical steering game using quantum memory. Furthermore, we develop a theoretical method to deal with noise and finite measurement statistics within the AVN framework and apply it to analyze the experimental data. Our results clearly show the utility of the AVN criterion for testing steerability and provide a particularly strong perspective for understanding EPR steering.

  14. The power and promise of RNA-seq in ecology and evolution.

    PubMed

    Todd, Erica V; Black, Michael A; Gemmell, Neil J

    2016-03-01

    Reference is regularly made to the power of new genomic sequencing approaches. Using powerful technology, however, is not the same as having the necessary power to address a research question with statistical robustness. In the rush to adopt new and improved genomic research methods, limitations of technology and experimental design may be initially neglected. Here, we review these issues with regard to RNA sequencing (RNA-seq). RNA-seq adds large-scale transcriptomics to the toolkit of ecological and evolutionary biologists, enabling differential gene expression (DE) studies in nonmodel species without the need for prior genomic resources. High biological variance is typical of field-based gene expression studies and means that larger sample sizes are often needed to achieve the same degree of statistical power as clinical studies based on data from cell lines or inbred animal models. Sequencing costs have plummeted, yet RNA-seq studies still underutilize biological replication. Finite research budgets force a trade-off between sequencing effort and replication in RNA-seq experimental design. However, clear guidelines for negotiating this trade-off, while taking into account study-specific factors affecting power, are currently lacking. Study designs that prioritize sequencing depth over replication fail to capitalize on the power of RNA-seq technology for DE inference. Significant recent research effort has gone into developing statistical frameworks and software tools for power analysis and sample size calculation in the context of RNA-seq DE analysis. We synthesize progress in this area and derive an accessible rule-of-thumb guide for designing powerful RNA-seq experiments relevant in eco-evolutionary and clinical settings alike. © 2016 John Wiley & Sons Ltd.
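
    A simulation-based version of such a power calculation can be sketched directly. The negative binomial mean, dispersion, fold change, and the use of a Welch t-test on log counts are simplifying assumptions for illustration, not the rule of thumb synthesized by the authors:

    ```python
    import numpy as np
    from scipy import stats

    def nb_counts(mean, dispersion, size, rng):
        # numpy parameterizes NB by (n, p); convert from mean/dispersion,
        # where variance = mean + dispersion * mean**2.
        n = 1.0 / dispersion
        return rng.negative_binomial(n, n / (n + mean), size)

    def power(n_reps, fold_change=2.0, mean=100.0, dispersion=0.4,
              alpha=0.05, n_sims=2000, seed=8):
        """Monte Carlo power to detect one DE gene with a Welch t-test on
        log counts -- a crude stand-in for a full RNA-seq DE pipeline."""
        rng = np.random.default_rng(seed)
        hits = 0
        for _ in range(n_sims):
            a = np.log1p(nb_counts(mean, dispersion, n_reps, rng))
            b = np.log1p(nb_counts(mean * fold_change, dispersion, n_reps, rng))
            hits += stats.ttest_ind(a, b, equal_var=False).pvalue < alpha
        return hits / n_sims

    # High dispersion (0.4) mimics the large biological variance typical of
    # field studies; power grows with replication, not sequencing depth.
    for n in (3, 5, 8, 12):
        print(f"n = {n:2d} replicates/group -> power ~ {power(n):.2f}")
    ```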

  15. A statistical model describing combined irreversible electroporation and electroporation-induced blood-brain barrier disruption

    PubMed Central

    Sharabi, Shirley; Kos, Bor; Last, David; Guez, David; Daniels, Dianne; Harnof, Sagi; Miklavcic, Damijan

    2016-01-01

    Background: Electroporation-based therapies such as electrochemotherapy (ECT) and irreversible electroporation (IRE) are emerging as promising tools for treatment of tumors. When applied to the brain, electroporation can also induce transient blood-brain barrier (BBB) disruption in volumes extending beyond IRE, thus enabling efficient drug penetration. The main objective of this study was to develop a statistical model predicting cell death and BBB disruption induced by electroporation; this model can be used for individual treatment planning. Material and methods: Cell death and BBB disruption models were developed based on the Peleg-Fermi model in combination with numerical models of the electric field. The model calculates the electric field thresholds for cell kill and BBB disruption and describes their dependence on the number of treatment pulses. The model was validated using in vivo experimental data consisting of MRIs of rat brains following electroporation treatments. Results: Linear regression analysis confirmed that the model described the IRE and BBB disruption volumes as a function of the number of treatment pulses (r² = 0.79, p < 0.008; r² = 0.91, p < 0.001). The results showed a strong plateau effect as the pulse number increased. The ratio between the complete-cell-death and no-cell-death thresholds was relatively narrow (0.88-0.91) even for small numbers of pulses and depended weakly on the number of pulses; for BBB disruption, the ratio increased with the number of pulses. BBB disruption radii were on average 67% ± 11% larger than IRE volumes. Conclusions: The statistical model can be used to describe the dependence of treatment effects on the number of pulses independent of the experimental setup. PMID:27069447
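
    The Peleg-Fermi survival curve at the heart of such models has a simple sigmoid form, S(E) = 1/(1 + exp((E - Ec(n))/A(n))), with the threshold Ec and width A commonly taken to decay exponentially with pulse number n. The sketch below uses illustrative constants, not the fitted values from this study:

    ```python
    import numpy as np

    def peleg_fermi(E, n_pulses, Ec0=600.0, k1=0.03, A0=150.0, k2=0.06):
        """Peleg-Fermi survival: S(E) = 1 / (1 + exp((E - Ec(n)) / A(n))).
        Constants are illustrative assumptions, not this study's fit."""
        Ec = Ec0 * np.exp(-k1 * n_pulses)   # V/cm threshold drops with n
        A = A0 * np.exp(-k2 * n_pulses)     # transition sharpens with n
        return 1.0 / (1.0 + np.exp((E - Ec) / A))

    # Survival vs field strength for increasing pulse numbers: the curves
    # shift and sharpen, then change little -- the plateau effect above.
    E = np.linspace(0, 1500, 4)
    for n in (10, 50, 90):
        print(f"n = {n:2d} pulses, survival at E = {E.astype(int)}: "
              f"{np.round(peleg_fermi(E, n), 3)}")
    ```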

  16. Advanced Gear Alloys for Ultra High Strength Applications

    NASA Technical Reports Server (NTRS)

    Shen, Tony; Krantz, Timothy; Sebastian, Jason

    2011-01-01

    Single-tooth bending fatigue (STBF) test data for UHS Ferrium C61 and C64 alloys are presented in comparison with historical test data for conventional gear steels (9310 and Pyrowear 53), using comparable statistical analysis methods. Pitting and scoring tests of C61 and C64 are works in progress. Boeing's statistical analysis of the STBF test data for the four gear steels (C61, C64, 9310, and Pyrowear 53) indicates that the UHS grades exhibit increases in fatigue strength in the low cycle fatigue (LCF) regime. In the high cycle fatigue (HCF) regime, the UHS steels exhibit better mean fatigue strength endurance limit behavior (particularly compared to Pyrowear 53). However, due to considerable scatter in the UHS test data, the anticipated overall benefits of the UHS grades in bending fatigue have not been fully demonstrated. Based on all the test data and on Boeing's analysis, C61 has been selected by Boeing as the gear steel for the final ERDS demonstrator test gearboxes. In terms of potential follow-up work, detailed physics-based micromechanical analysis and modeling of the fatigue data would allow for a better understanding of the causes of the experimental scatter and of the transition from high-stress LCF (surface-dominated) to low-stress HCF (subsurface-dominated) fatigue failure. Additional STBF test data and failure analysis work, particularly in the HCF regime and around the endurance limit stress, could allow for better statistical confidence and could reduce the observed effects of experimental test scatter. Finally, the need for further optimization of the residual compressive stress profiles of the UHS steels (resulting from carburization and peening) is noted, particularly for the higher-hardness C64 material.

  17. Statistical colour models: an automated digital image analysis method for quantification of histological biomarkers.

    PubMed

    Shu, Jie; Dolman, G E; Duan, Jiang; Qiu, Guoping; Ilyas, Mohammad

    2016-04-27

    Colour is the most important feature used in quantitative immunohistochemistry (IHC) image analysis; IHC is used to provide information relating to aetiology and to confirm malignancy. Statistical modelling is a technique widely used for colour detection in computer vision. We have developed a statistical model of colour detection applicable to the detection of stain colour in digital IHC images. The model was first trained on a large set of colour pixels collected semi-automatically. To speed up the training and detection processes, we removed the luminance (Y) channel of the YCbCr colour space and chose 128 histogram bins, which was found to be the optimal number. A maximum likelihood classifier is used to automatically classify pixels in digital slides as positively or negatively stained. The model-based tool was developed within ImageJ to quantify targets identified using IHC and histochemistry. The evaluation compared the computer model with human assessment. Several large datasets were prepared and obtained from human oesophageal cancer, colon cancer, and liver cirrhosis with different colour stains. Experimental results demonstrated that the model-based tool achieves more accurate results than colour deconvolution and the CMYK model in the detection of brown colour, and is comparable to colour deconvolution in the detection of pink colour. We have also demonstrated that the proposed model has little inter-dataset variation. A robust and effective statistical model is introduced in this paper. The model-based interactive tool in ImageJ, which can create a visual representation of the statistical model and detect a specified colour automatically, is easy to use and available freely at http://rsb.info.nih.gov/ij/plugins/ihc-toolbox/index.html . Testing of the tool by different users showed only minor inter-observer variations in results.
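
    The described pipeline (drop the Y channel, bin Cb/Cr, classify by maximum likelihood) can be sketched compactly. The toy training colours and the Laplace smoothing below are assumptions, not the paper's training data or exact estimator:

    ```python
    import numpy as np

    BINS = 128  # the paper reports 128 histogram bins as optimal

    def rgb_to_cbcr(rgb):
        """Float RGB (0..1) to Cb/Cr, discarding luminance Y as the paper
        does for illumination robustness and speed."""
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        cb = 0.5 - 0.168736 * r - 0.331264 * g + 0.5 * b
        cr = 0.5 + 0.5 * r - 0.418688 * g - 0.081312 * b
        return np.stack([cb, cr], axis=-1)

    def fit_histogram(pixels_rgb):
        """2D Cb-Cr histogram as a likelihood model p(colour | class)."""
        cbcr = rgb_to_cbcr(pixels_rgb)
        hist, _, _ = np.histogram2d(cbcr[:, 0], cbcr[:, 1],
                                    bins=BINS, range=[[0, 1], [0, 1]])
        return (hist + 1) / (hist.sum() + BINS * BINS)  # Laplace smoothing

    def classify(pixels_rgb, hist_pos, hist_neg):
        """Maximum-likelihood decision: stained if p(c|pos) > p(c|neg)."""
        idx = np.clip((rgb_to_cbcr(pixels_rgb) * BINS).astype(int), 0, BINS - 1)
        return hist_pos[idx[:, 0], idx[:, 1]] > hist_neg[idx[:, 0], idx[:, 1]]

    # Toy training data: brownish DAB-stained vs. bluish haematoxylin pixels.
    rng = np.random.default_rng(9)
    brown = np.clip(rng.normal([0.55, 0.35, 0.20], 0.05, (5000, 3)), 0, 1)
    blue = np.clip(rng.normal([0.35, 0.40, 0.65], 0.05, (5000, 3)), 0, 1)
    h_pos, h_neg = fit_histogram(brown), fit_histogram(blue)
    print("brown pixels classified positive:",
          classify(brown[:100], h_pos, h_neg).mean())
    ```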

  18. Teleportation-based quantum information processing with Majorana zero modes

    DOE PAGES

    Vijay, Sagar; Fu, Liang

    2016-12-29

    In this work, we present a measurement-based scheme for performing braiding operations on Majorana zero modes in mesoscopic superconductor islands and for detecting their non-Abelian statistics without moving or hybridizing them. In our scheme for “braiding without braiding”, the topological qubit encoded in any pair of well-separated Majorana zero modes is read out from the transmission phase shift in electron teleportation through the island in the Coulomb-blockade regime. Finally, we propose experimental setups to measure the teleportation phase shift via conductance in an electron interferometer or persistent current in a closed loop.

  19. A consensual neural network

    NASA Technical Reports Server (NTRS)

    Benediktsson, J. A.; Ersoy, O. K.; Swain, P. H.

    1991-01-01

    A neural network architecture called a consensual neural network (CNN) is proposed for the classification of data from multiple sources, and its relation to hierarchical and ensemble neural networks is discussed. The CNN is based on statistical consensus theory and uses nonlinearly transformed input data. The input data are transformed several times, and the different transformed versions are applied as if they were independent inputs. The independent inputs are classified using stage neural networks, and the outputs from the stage networks are then weighted and combined to make a decision. Experimental results based on remote-sensing data and geographic data are given.
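
    A minimal sketch of the consensual idea follows, with hypothetical nonlinear transforms and training-accuracy weights standing in for the paper's specific choices:

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Consensus over several nonlinear transforms of the same input: each
    # transformed copy is treated as an independent source with its own
    # stage network; stage outputs are combined by accuracy-weighted voting.
    transforms = [np.tanh, np.sin, lambda z: np.sign(z) * np.sqrt(np.abs(z))]

    X, y = make_classification(n_samples=1500, n_features=12, random_state=10)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=10)

    stages, weights = [], []
    for f in transforms:
        net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=800,
                            random_state=10).fit(f(X_tr), y_tr)
        stages.append(net)
        weights.append(net.score(f(X_tr), y_tr))  # training accuracy as weight

    # Weighted combination of stage-network class probabilities.
    proba = sum(w * net.predict_proba(f(X_te))
                for w, net, f in zip(weights, stages, transforms))
    consensus = proba.argmax(axis=1)
    print("consensus accuracy:", (consensus == y_te).mean())
    ```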
