Sample records for I2 statistic results

  1. The heterogeneity statistic I(2) can be biased in small meta-analyses.

    PubMed

    von Hippel, Paul T

    2015-04-14

    Estimated effects vary across studies, partly because of random sampling error and partly because of heterogeneity. In meta-analysis, the fraction of variance that is due to heterogeneity is estimated by the statistic I(2). We calculate the bias of I(2), focusing on the situation where the number of studies in the meta-analysis is small. Small meta-analyses are common; in the Cochrane Library, the median number of studies per meta-analysis is 7 or fewer. We use Mathematica software to calculate the expectation and bias of I(2). I(2) has a substantial bias when the number of studies is small. The bias is positive when the true fraction of heterogeneity is small, but the bias is typically negative when the true fraction of heterogeneity is large. For example, with 7 studies and no true heterogeneity, I(2) will overestimate heterogeneity by an average of 12 percentage points, but with 7 studies and 80 percent true heterogeneity, I(2) can underestimate heterogeneity by an average of 28 percentage points. Biases of 12-28 percentage points are not trivial when one considers that, in the Cochrane Library, the median I(2) estimate is 21 percent. The point estimate I(2) should be interpreted cautiously when a meta-analysis has few studies. In small meta-analyses, confidence intervals should supplement or replace the biased point estimate I(2).
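
    The small-sample bias described above is easy to reproduce numerically. The following Python sketch (illustrative only; the paper itself used Mathematica) computes I2 = max(0, (Q − (k − 1))/Q) from Cochran's Q and simulates many 7-study meta-analyses with no true heterogeneity; the within-study variances and simulation settings are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def i_squared(effects, variances):
    """Higgins' I^2 (in percent) from Cochran's Q with fixed-effect weights."""
    w = 1.0 / variances
    mean = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - mean) ** 2)
    k = len(effects)
    return max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0

# Simulate many 7-study meta-analyses with NO true heterogeneity:
k, n_sim = 7, 5000
variances = np.full(k, 0.04)  # equal within-study variances (assumed)
i2 = [i_squared(rng.normal(0.0, np.sqrt(variances)), variances)
      for _ in range(n_sim)]
print(round(float(np.mean(i2)), 1))  # mean I^2 is well above 0 despite no heterogeneity
```

With these settings the simulated mean lands near the roughly 12-percentage-point overestimate the abstract reports for 7 studies and no true heterogeneity.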

  2. Primer of statistics in dental research: part I.

    PubMed

    Shintani, Ayumi

    2014-01-01

    Statistics play essential roles in evidence-based dentistry (EBD) practice and research, ranging widely from formulating scientific questions, designing studies, and collecting and analyzing data to interpreting, reporting, and presenting study findings. Mastering statistical concepts appears to be an unreachable goal for many dental researchers, in part because statistical authorities have difficulty explaining statistical principles to health researchers without invoking complex mathematical concepts. This series of two articles aims to introduce dental researchers to nine essential topics in statistics for conducting EBD, with intuitive examples. Part I of the series covers the first five topics: (1) statistical graphs, (2) how to deal with outliers, (3) p-values and confidence intervals, (4) testing equivalence, and (5) multiplicity adjustment. Part II will follow, covering the remaining topics: (6) selecting the proper statistical test, (7) repeated-measures analysis, (8) epidemiological considerations for causal association, and (9) analysis of agreement. Copyright © 2014. Published by Elsevier Ltd.

  3. Parameter Estimation in Astronomy with Poisson-Distributed Data. I. The χ²_γ Statistic

    NASA Technical Reports Server (NTRS)

    Mighell, Kenneth J.

    1999-01-01

    Applying the standard weighted-mean formula, [Σᵢ nᵢσᵢ⁻²]/[Σᵢ σᵢ⁻²], to determine the weighted mean of data nᵢ drawn from a Poisson distribution will, on average, underestimate the true mean by ≈1 for all true mean values larger than ≈3 when the common assumption is made that the error of the i-th observation is σᵢ = max(√nᵢ, 1). This small but statistically significant offset explains the long-known observation that chi-square minimization techniques which use the modified Neyman's χ² statistic, χ²_N ≡ Σᵢ (nᵢ − yᵢ)²/max(nᵢ, 1), to compare Poisson-distributed data with model values yᵢ will typically predict a total number of counts that underestimates the true total by about 1 count per bin. Based on my finding that the weighted mean of data drawn from a Poisson distribution can be determined using the formula [Σᵢ (nᵢ + min(nᵢ, 1))(nᵢ + 1)⁻¹]/[Σᵢ (nᵢ + 1)⁻¹], I propose that a new χ² statistic, χ²_γ ≡ Σᵢ (nᵢ + min(nᵢ, 1) − yᵢ)²/(nᵢ + 1), should always be used to analyze Poisson-distributed data in preference to the modified Neyman's χ² statistic. I demonstrate the power and usefulness of χ²_γ minimization by using two statistical fitting techniques and five χ² statistics to analyze simulated X-ray power-law 15-channel spectra with large and small counts per bin. I show that χ²_γ minimization with the Levenberg-Marquardt or Powell's method can produce excellent results (mean slope errors ≲3%) with spectra having as few as 25 total counts.
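
    The bias of the Neyman-style weighted mean, and the corrected weighted mean this abstract proposes, can be checked numerically. For a constant model, minimizing each statistic reduces to a closed-form weighted mean. This Python sketch is an illustration (the sample size and true mean are arbitrary assumptions), not code from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
true_mean = 10.0
n = rng.poisson(true_mean, size=200_000)

# Minimizer of the modified Neyman chi^2 for a constant model:
# weights 1/max(n_i, 1), i.e. the "standard" weighted mean in the abstract.
w_n = 1.0 / np.maximum(n, 1)
neyman_mean = np.sum(w_n * n) / np.sum(w_n)

# Minimizer of chi^2_gamma: the corrected weighted mean proposed in the abstract.
w_g = 1.0 / (n + 1.0)
gamma_mean = np.sum(w_g * (n + np.minimum(n, 1))) / np.sum(w_g)

print(neyman_mean)  # biased low by about 1 count, as the abstract states
print(gamma_mean)   # close to the true mean of 10
```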

  4. Explanation of Two Anomalous Results in Statistical Mediation Analysis.

    PubMed

    Fritz, Matthew S; Taylor, Aaron B; Mackinnon, David P

    2012-01-01

    Previous studies of different methods of testing mediation models have consistently found two anomalous results. The first result is elevated Type I error rates for the bias-corrected and accelerated bias-corrected bootstrap tests not found in nonresampling tests or in resampling tests that did not include a bias correction. This is of special concern as the bias-corrected bootstrap is often recommended and used due to its higher statistical power compared with other tests. The second result is statistical power reaching an asymptote far below 1.0 and in some conditions even declining slightly as the size of the relationship between X and M, a, increased. Two computer simulations were conducted to examine these findings in greater detail. Results from the first simulation found that the increased Type I error rates for the bias-corrected and accelerated bias-corrected bootstrap are a function of an interaction between the size of the individual paths making up the mediated effect and the sample size, such that elevated Type I error rates occur when the sample size is small and the effect size of the nonzero path is medium or larger. Results from the second simulation found that stagnation and decreases in statistical power as a function of the effect size of the a path occurred primarily when the path between M and Y, b, was small. Two empirical mediation examples are provided using data from a steroid prevention and health promotion program aimed at high school football players (Athletes Training and Learning to Avoid Steroids; Goldberg et al., 1996), one to illustrate a possible Type I error for the bias-corrected bootstrap test and a second to illustrate a loss in power related to the size of a. Implications of these findings are discussed.

  5. Vitamin E and the risk of pneumonia: using the I2 statistic to quantify heterogeneity within a controlled trial.

    PubMed

    Hemilä, Harri

    2016-11-01

    Analyses in nutritional epidemiology usually assume a uniform effect of a nutrient. Previously, four subgroups of the Alpha-Tocopherol, Beta-Carotene Cancer Prevention (ATBC) Study of Finnish male smokers aged 50-69 years were identified in which vitamin E supplementation either significantly increased or decreased the risk of pneumonia. The purpose of the present study was to quantify the level of true heterogeneity in the effect of vitamin E on pneumonia incidence using the I2 statistic. The I2 value estimates the percentage of total variation across studies that is explained by true differences in the treatment effect rather than by chance, with a range from 0 to 100 %. The I2 statistic for the effect of vitamin E supplementation on pneumonia risk for five subgroups of the ATBC population was 89 % (95 % CI 78, 95 %), indicating that essentially all of the heterogeneity was true variation in the vitamin E effect rather than chance variation. The I2 statistic for heterogeneity in vitamin E effects on pneumonia risk was 92 % (95 % CI 80, 97 %) for three other ATBC subgroups defined by smoking level and leisure-time exercise level. Vitamin E decreased pneumonia risk by 69 % among participants who had the least exposure to smoking and exercised during leisure time (7·6 % of the ATBC participants), and vitamin E increased pneumonia risk by 68 % among those who had the highest exposure to smoking and did not exercise (22 % of the ATBC participants). These findings refute the assumption of a uniform effect of vitamin E supplementation on the risk of pneumonia.

  6. Interpretation of statistical results.

    PubMed

    García Garmendia, J L; Maroto Monserrat, F

    2018-02-21

    The appropriate interpretation of statistical results is crucial to understanding advances in medical science. Statistical tools allow us to transform the uncertainty and apparent chaos in nature into measurable parameters that are applicable to our clinical practice. Understanding the meaning and actual scope of these instruments is essential for researchers, for funders of research, and for professionals who require continual updating based on good evidence to support decision making. Various aspects of study design, results, and statistical analysis are reviewed, aiming to facilitate their comprehension from the basics to concepts that are common but not well understood, and offering a constructive, non-exhaustive, but realistic view. Copyright © 2018 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.

  7. Statistical gamma-ray decay studies at iThemba LABS

    NASA Astrophysics Data System (ADS)

    Wiedeking, M.; Bernstein, L. A.; Bleuel, D. L.; Brits, C. P.; Sowazi, K.; Görgen, A.; Goldblum, B. L.; Guttormsen, M.; Kheswa, B. V.; Larsen, A. C.; Majola, S. N. T.; Malatji, K. L.; Negi, D.; Nogwanya, T.; Siem, S.; Zikhali, B. R.

    2017-09-01

    A program to study the γ-ray decay from the region of high-level density has been established at iThemba LABS, where a high-resolution gamma-ray detector array is used in conjunction with silicon particle-telescopes. Results from two recent projects are presented: 1) The 74Ge(α,α'γ) reaction was used to investigate the Pygmy Dipole Resonance. The results were compared to (γ,γ') data and indicate that the dipole states split into mixed isospin and relatively pure isovector excitations. 2) Data from the 95Mo(d,p) reaction were used to develop a novel method for the determination of spins for low-lying discrete levels utilizing statistical γ-ray decay in the vicinity of the neutron separation energy. These results provide insight into the competition of (γ,n) and (γ,γ') reactions and highlight the need to correct for angular momentum barrier effects.

  8. Reporting Statistical Results in Medical Journals

    PubMed Central

    Arifin, Wan Nor; Sarimah, Abdullah; Norsa’adah, Bachok; Najib Majdi, Yaacob; Siti-Azrin, Ab Hamid; Kamarul Imran, Musa; Aniza, Abd Aziz; Naing, Lin

    2016-01-01

    Statistical editors of the Malaysian Journal of Medical Sciences (MJMS) must go through many submitted manuscripts, focusing on the statistical aspect of the manuscripts. However, the editors notice myriad styles of reporting the statistical results, which are not standardised among the authors. This could be due to the lack of clear written instructions on reporting statistics in the guidelines for authors. The aim of this editorial is to briefly outline reporting methods for several important and common statistical results. It will also address a number of common mistakes made by the authors. The editorial will serve as a guideline for authors aiming to publish in the MJMS as well as in other medical journals. PMID:27904419

  9. Ideas for Effective Communication of Statistical Results

    DOE PAGES

    Anderson-Cook, Christine M.

    2015-03-01

    Effective presentation of statistical results to those with less statistical training, including managers and decision-makers, requires planning, anticipation, and thoughtful delivery. Here are several recommendations for effectively presenting statistical results.

  10. 30 CFR 250.192 - What reports and statistics must I submit relating to a hurricane, earthquake, or other natural...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    30 Mineral Resources 2, 2010-07-01. What reports and statistics must I submit relating to a hurricane, earthquake, or other natural occurrence? (a) You must submit evacuation statistics to the Regional Supervisor for a natural occurrence, such as a hurricane, a...

  11. Some new results on the statistics of radio wave scintillation. I - Empirical evidence for Gaussian statistics

    NASA Technical Reports Server (NTRS)

    Rino, C. L.; Livingston, R. C.; Whitney, H. E.

    1976-01-01

    This paper presents an analysis of ionospheric scintillation data which shows that the underlying statistical structure of the signal can be accurately modeled by the additive complex Gaussian perturbation predicted by the Born approximation in conjunction with an application of the central limit theorem. By making use of this fact, it is possible to estimate the in-phase, phase quadrature, and cophased scattered power by curve fitting to measured intensity histograms. By using this procedure, it is found that typically more than 80% of the scattered power is in phase quadrature with the undeviated signal component. Thus, the signal is modeled by a Gaussian, but highly non-Rician process. From simultaneous UHF and VHF data, only a weak dependence of this statistical structure on changes in the Fresnel radius is deduced. The signal variance is found to have a nonquadratic wavelength dependence. It is hypothesized that this latter effect is a subtle manifestation of locally homogeneous irregularity structures, a mathematical model proposed by Kolmogorov (1941) in his early studies of incompressible fluid turbulence.

  12. Effect of Internet-Based Cognitive Apprenticeship Model (i-CAM) on Statistics Learning among Postgraduate Students.

    PubMed

    Saadati, Farzaneh; Ahmad Tarmizi, Rohani; Mohd Ayub, Ahmad Fauzi; Abu Bakar, Kamariah

    2015-01-01

    Because students' ability to use statistics, which is mathematical in nature, is one of the concerns of educators, embedding within an e-learning system the pedagogical characteristics of learning is 'value added' because it facilitates the conventional method of learning mathematics. Many researchers emphasize the effectiveness of cognitive apprenticeship in learning and problem solving in the workplace. In a cognitive apprenticeship learning model, skills are learned within a community of practitioners through observation of modelling and then practice plus coaching. This study utilized an internet-based Cognitive Apprenticeship Model (i-CAM) in three phases and evaluated its effectiveness for improving statistics problem-solving performance among postgraduate students. The results showed that, when compared to the conventional mathematics learning model, the i-CAM could significantly promote students' problem-solving performance at the end of each phase. In addition, the combined differences in students' test scores were statistically significant after controlling for the pre-test scores. The findings conveyed in this paper confirm the considerable value of i-CAM in the improvement of statistics learning for non-specialized postgraduate students.

  13. Statistical Analysis of NAS Parallel Benchmarks and LINPACK Results

    NASA Technical Reports Server (NTRS)

    Meuer, Hans-Werner; Simon, Horst D.; Strohmeier, Erich; Lasinski, T. A. (Technical Monitor)

    1994-01-01

    In the last three years extensive performance data have been reported for parallel machines, both based on the NAS Parallel Benchmarks and on LINPACK. In this study we have used the reported benchmark results and performed a number of statistical experiments using factor, cluster, and regression analyses. In addition to the performance results of LINPACK and the eight NAS parallel benchmarks, we have also included the peak performance of the machine and the LINPACK n and n_(1/2) values. Some of the results and observations can be summarized as follows: 1) All benchmarks are strongly correlated with peak performance. 2) LINPACK and EP each have a unique signature. 3) The remaining NPB can be grouped into three groups as follows: (CG and IS), (LU and SP), and (MG, FT, and BT). Hence three (or four with EP) benchmarks are sufficient to characterize the overall NPB performance. Our poster presentation will follow a standard poster format and will present the data of our statistical analysis in detail.

  14. Effect of Internet-Based Cognitive Apprenticeship Model (i-CAM) on Statistics Learning among Postgraduate Students

    PubMed Central

    Saadati, Farzaneh; Ahmad Tarmizi, Rohani

    2015-01-01

    Because students’ ability to use statistics, which is mathematical in nature, is one of the concerns of educators, embedding within an e-learning system the pedagogical characteristics of learning is ‘value added’ because it facilitates the conventional method of learning mathematics. Many researchers emphasize the effectiveness of cognitive apprenticeship in learning and problem solving in the workplace. In a cognitive apprenticeship learning model, skills are learned within a community of practitioners through observation of modelling and then practice plus coaching. This study utilized an internet-based Cognitive Apprenticeship Model (i-CAM) in three phases and evaluated its effectiveness for improving statistics problem-solving performance among postgraduate students. The results showed that, when compared to the conventional mathematics learning model, the i-CAM could significantly promote students’ problem-solving performance at the end of each phase. In addition, the combined differences in students' test scores were statistically significant after controlling for the pre-test scores. The findings conveyed in this paper confirm the considerable value of i-CAM in the improvement of statistics learning for non-specialized postgraduate students. PMID:26132553

  15. Statistical literacy and sample survey results

    NASA Astrophysics Data System (ADS)

    McAlevey, Lynn; Sullivan, Charles

    2010-10-01

    Sample surveys are widely used in the social sciences and business. The news media almost daily quote from them, yet they are widely misused. Using students with prior managerial experience embarking on an MBA course, we show that common sample survey results are misunderstood even by those managers who have previously done a statistics course. In general, they fare no better than managers who have never studied statistics. There are implications for teaching, especially in business schools, as well as for consulting.

  16. Statistical mechanics of economics I

    NASA Astrophysics Data System (ADS)

    Kusmartsev, F. V.

    2011-02-01

    We show that statistical mechanics is useful in the description of financial crises and economics. Taking a large number of instant snapshots of a market over an interval of time, we construct their ensembles and study their statistical inference. This results in a probability description of the market and gives capital, money, income, wealth, and debt distributions, which in most cases take the form of the Bose-Einstein distribution. In addition, statistical mechanics provides the main market equations and laws which govern the correlations between the amount of money, debt, product, prices, and the number of retailers. We applied the relations found to a study of the evolution of the economy of the USA between 1996 and 2008 and observe that over that time the income of the majority of the population is well described by the Bose-Einstein distribution, whose parameters differ for each year. Each financial crisis corresponds to a peak in the absolute activity coefficient. The analysis correctly indicates past crises and predicts future ones.

  17. (123)I-BZA2 as a melanin-targeted radiotracer for the identification of melanoma metastases: results and perspectives of a multicenter phase III clinical trial.

    PubMed

    Cachin, Florent; Miot-Noirault, Elisabeth; Gillet, Brigitte; Isnardi, Vanina; Labeille, Bruno; Payoux, Pierre; Meyer, Nicolas; Cammilleri, Serge; Gaudy, Caroline; Razzouk-Cadet, Micheline; Lacour, Jean Philippe; Granel-Brocard, Florence; Tychyj, Christelle; Benbouzid, Fathalah; Grange, Jean Daniel; Baulieu, Françoise; Kelly, Antony; Merlin, Charles; Mestas, Danielle; Gachon, Françoise; Chezal, Jean Michel; Degoul, Françoise; D'Incan, Michel

    2014-01-01

    Our group has developed a new radiopharmaceutical, (123)I-N-(2-diethylaminoethyl)-2-iodobenzamide ((123)I-BZA2), a benzamide derivative able to bind to melanin pigment in melanoma cells. In a prospective, multicenter phase III clinical study, the value of (18)F-FDG PET/CT and (123)I-BZA2 scintigraphy was compared for melanoma staging. Patients with a past history of cutaneous or ocular melanoma were included from 8 hospitals. (18)F-FDG imaging was performed according to a standard PET protocol. Whole-body, static planar, and SPECT/CT (if available) images were acquired 4 h after injection of a 2 MBq/kg dose of (123)I-BZA2. (18)F-FDG and (123)I-BZA2 sensitivity and specificity for the diagnosis of melanoma metastasis were calculated and compared on both a lesion basis and a patient basis. True-positive and true-negative lesion status was determined after 6 mo of clinical follow-up or according to lesion biopsies (if available). Melanin content in biopsies was evaluated with the standard Fontana-Masson silver method and was correlated with (123)I-BZA2 uptake. Based on statistical analysis, the required number of inclusions was estimated at 186. In all, 87 patients were enrolled from 2008 to 2010. Of these, 45 (52%) had metastases. A total of 338 imaging abnormalities were analyzed; 86 lesions were considered metastases, and 20 of 25 lesion biopsies found melanoma metastases. In a patient-based analysis, the sensitivity of (18)F-FDG for diagnosis of melanoma metastases was higher than that of (123)I-BZA2, at 87% and 39%, respectively (P < 0.05). For specificity, (18)F-FDG and (123)I-BZA2 were not statistically different, at 78% and 94%, respectively. In a lesion-based analysis, the sensitivity of (18)F-FDG was statistically higher than that of (123)I-BZA2 (80% vs. 23%, P < 0.05). The specificity of (18)F-FDG was lower than that of (123)I-BZA2 (54% vs. 86%, P < 0.05). According to biopsy analysis, only 9 of 20 metastatic lesions (45%) were pigmented with high melanin

  18. A new ionospheric storm scale based on TEC and foF2 statistics

    NASA Astrophysics Data System (ADS)

    Nishioka, Michi; Tsugawa, Takuya; Jin, Hidekatsu; Ishii, Mamoru

    2017-01-01

    In this paper, we propose the I-scale, a new ionospheric storm scale for general users in various regions of the world. With the I-scale, ionospheric storms can be classified for any season, local time, and location. Since the ionospheric condition depends strongly on many factors, such as solar irradiance, energy input from the magnetosphere, and lower atmospheric activity, it had been difficult to scale ionospheric storms, which are mainly caused by solar and geomagnetic activities. In this study, statistical analysis was carried out for total electron content (TEC) and the F2 layer critical frequency (foF2) in Japan for 18 years, from 1997 to 2014. Seasonal, local time, and latitudinal dependences of TEC and foF2 variabilities are excluded by normalizing each percentage variation using their statistical standard deviations. The I-scale is defined by setting thresholds on the normalized values, yielding seven categories: I0, IP1, IP2, IP3, IN1, IN2, and IN3. I0 represents a quiet state, and IP1 (IN1), IP2 (IN2), and IP3 (IN3) represent moderate, strong, and severe positive (negative) storms, respectively. The proposed I-scale can be used for other locations, such as polar and equatorial regions, and can serve as a standardized scale to help users assess the impact of space weather on their systems.
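
    As a rough illustration of how such a classification works, the sketch below maps a normalized deviation (in units of the statistical standard deviation) to the seven I-scale categories. The numeric thresholds used here are invented placeholders for illustration; the paper's actual thresholds are not given in the abstract.

```python
def i_scale(z):
    """Map a normalized TEC/foF2 deviation z (in standard deviations) to an
    I-scale-style category. Threshold values are hypothetical, not from the paper."""
    sign = "P" if z > 0 else "N"
    a = abs(z)
    if a < 1.0:                 # hypothetical quiet-state threshold
        return "I0"
    level = 1 if a < 2.0 else 2 if a < 3.0 else 3  # moderate / strong / severe
    return f"I{sign}{level}"

print([i_scale(z) for z in (-3.5, -1.2, 0.3, 1.5, 2.7)])
# → ['IN3', 'IN1', 'I0', 'IP1', 'IP2']
```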

  19. SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iliopoulos, AS; Sun, X; Floros, D

    Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or unsupervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is materialized by efficient algorithms and implemented in parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions. Conclusion: Local image statistics can be incorporated in filtering operations to equip them with spatial adaptivity to
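
    The multi-scale local-statistics idea can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' implementation: the window sizes, synthetic image, and spatially variant noise model are all assumptions.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def local_stats(img, size):
    """Local mean and standard deviation over size x size windows ('valid' region)."""
    win = sliding_window_view(img, (size, size))
    return win.mean(axis=(-2, -1)), win.std(axis=(-2, -1))

rng = np.random.default_rng(2)
# Synthetic image: flat signal with noise whose amplitude grows left to right,
# mimicking spatially variant noise (e.g., from X-ray scattering).
h, w = 64, 64
sigma = np.linspace(0.1, 2.0, w)
img = 5.0 + rng.normal(0.0, 1.0, (h, w)) * sigma

# Pyramid of local statistics at several spatial scales
pyramid = {s: local_stats(img, s) for s in (3, 9, 27)}
_, std9 = pyramid[9]
print(std9[:, :5].mean() < std9[:, -5:].mean())  # local noise estimate rises with x → True
```

Inter-scale differences between these maps (e.g., `pyramid[3]` vs. `pyramid[27]`) would then indicate the spatial scope over which noise statistics change, which is the quantity the abstract uses to drive filter adaptation.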

  20. STATISTICAL ANALYSIS OF TANK 18F FLOOR SAMPLE RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 18F as per the statistical sampling plan developed by Shine [1]. Samples from eight locations have been obtained from the tank floor, and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL [2]. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results [3] to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 18F. The uncertainty is quantified in this report by an upper 95% confidence limit (UCL95%) on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
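
    A UCL computed from the number of samples, the average, and the standard deviation typically takes the standard one-sided form UCL95% = x̄ + t(0.95, n−1) · s/√n. The sketch below applies that form to six hypothetical concentration values; the actual Tank 18F data and the report's exact formula are not reproduced here.

```python
import math
import statistics

def ucl95(samples):
    """One-sided upper 95% confidence limit on the mean: x̄ + t(0.95, n−1)·s/√n."""
    n = len(samples)
    t_095 = {4: 2.132, 5: 2.015, 6: 1.943}[n - 1]  # one-sided Student-t quantiles
    mean = statistics.mean(samples)
    s = statistics.stdev(samples)
    return mean + t_095 * s / math.sqrt(n)

# Hypothetical concentrations from six scrape samples (illustrative units)
conc = [1.2, 1.5, 1.1, 1.4, 1.3, 1.6]
print(round(ucl95(conc), 3))  # → 1.504
```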

  1. Statistically Modeling I-V Characteristics of CNT-FET with LASSO

    NASA Astrophysics Data System (ADS)

    Ma, Dongsheng; Ye, Zuochang; Wang, Yan

    2017-08-01

    With the advent of the internet of things (IoT), the need to study new materials and devices for various applications is increasing. Traditionally we build compact models for transistors on the basis of physics, but physical models are expensive and need a very long time to adjust for non-ideal effects. As the vision for the application of many novel devices is not certain, or the manufacturing process is not mature, deriving generalized, accurate physical models for such devices is very strenuous, whereas statistical modeling is becoming a potential alternative because of its data-oriented nature and fast implementation. In this paper, one classical statistical regression method, LASSO, is used to model the I-V characteristics of a CNT-FET, and a pseudo-PMOS inverter simulation based on the trained model is implemented in Cadence. The normalized relative mean square prediction error of the trained model versus experimental sample data, together with the simulation results, shows that the model is acceptable for digital circuit static simulation. Such a modeling methodology can be extended to general devices.
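
    The approach in the abstract can be illustrated with a minimal LASSO fit of I-V data. Since the paper's data and feature set are not given, the sketch below uses a synthetic I-V surface and a plain ISTA (iterative soft-thresholding) solver rather than any particular library; the feature names and regularization weight are assumptions.

```python
import numpy as np

def lasso_ista(X, y, lam, n_iter=5000):
    """LASSO via iterative soft-thresholding: min ||Xw − y||²/(2n) + lam·||w||₁."""
    n, d = X.shape
    w = np.zeros(d)
    step = 1.0 / np.linalg.eigvalsh(X.T @ X / n).max()  # 1/L, L = Lipschitz constant
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n
        w = w - step * grad
        w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)  # soft threshold
    return w

# Synthetic I-V data: drain current depends on only two of the candidate terms
rng = np.random.default_rng(3)
vgs = rng.uniform(0.0, 1.0, 400)
vds = rng.uniform(0.0, 1.0, 400)
i_d = 2.0 * vgs * vds - 1.0 * vds**2 + rng.normal(0.0, 0.01, 400)

# Candidate polynomial features; LASSO should keep the fit sparse
feats = {"vgs": vgs, "vds": vds, "vgs*vds": vgs * vds,
         "vgs^2": vgs**2, "vds^2": vds**2, "vgs^2*vds": vgs**2 * vds}
X = np.column_stack(list(feats.values()))
w = lasso_ista(X, i_d, lam=0.005)
print(dict(zip(feats, np.round(w, 2))))  # large weights only on the true terms
```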

  2. Interpreting “statistical hypothesis testing” results in clinical research

    PubMed Central

    Sarmukaddam, Sanjeev B.

    2012-01-01

    The difference between clinical significance and statistical significance should be kept in mind while interpreting “statistical hypothesis testing” results in clinical research. This fact is already known to many, but it is pointed out here again because the philosophy of “statistical hypothesis testing” is sometimes unnecessarily criticized, mainly due to failure to consider this distinction. Randomized controlled trials are similarly, and wrongly, criticized. That a scientific method may not be applicable in some particular situation does not mean that the method is useless. Also remember that “statistical hypothesis testing” is not for decision making, and the field of “decision analysis” is very much an integral part of the science of statistics. It is not correct to say that “confidence intervals have nothing to do with confidence” unless one understands the meaning of the word “confidence” as used in the context of a confidence interval. Interpretation of the results of every study should always consider all possible alternative explanations, such as chance, bias, and confounding. Statistical tests in inferential statistics are, in general, designed to answer the question “How likely is it that the difference found in the random sample(s) is due to chance?”; therefore, relying only on statistical significance in making clinical decisions should be avoided. PMID:22707861

  3. Statistical Literacy and Sample Survey Results

    ERIC Educational Resources Information Center

    McAlevey, Lynn; Sullivan, Charles

    2010-01-01

    Sample surveys are widely used in the social sciences and business. The news media almost daily quote from them, yet they are widely misused. Using students with prior managerial experience embarking on an MBA course, we show that common sample survey results are misunderstood even by those managers who have previously done a statistics course. In…

  4. Statistics at a glance.

    PubMed

    Ector, Hugo

    2010-12-01

    I still remember my first book on statistics: "Elementary Statistics with Applications in Medicine and the Biological Sciences" by Frederick E. Croxton. For me, it was the start of a pursuit of understanding statistics in daily life and in medical practice. It was the first volume in a long row of books. In his introduction, Croxton claims that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to "P < 0.05 = ok". I do not blame my colleagues who omit the paragraph on statistical methods. They have never had the opportunity to learn concise and clear descriptions of the key features. I have experienced how some authors can describe difficult methods in well understandable language; others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This conviction has resulted in an annual seminar of 90 minutes. This tutorial is the summary of that seminar: a summary and a transcription of the best pages I have found.

  5. The estimation of the measurement results with using statistical methods

    NASA Astrophysics Data System (ADS)

    Velychko, O.; Gordiyenko, T.

    2015-02-01

    A number of international standards and guides describe various statistical methods that can be applied to the management, control, and improvement of processes, and to the analysis of technical measurement results. This paper analyzes international standards and guides on statistical methods for the estimation of measurement results and gives recommendations for their application in laboratories. To support the analysis of these standards and guides, cause-and-effect (Ishikawa) diagrams concerning the application of statistical methods to the estimation of measurement results were constructed.

  6. YORP effect on real objects. I. Statistical properties

    NASA Astrophysics Data System (ADS)

    Micheli, M.; Paolicchi, P.

    2008-10-01

    Context: The intensity of the YORP (Yarkovsky, O'Keefe, Radzievskii, and Paddack) effect and its ability to affect the rotational properties of asteroids depend mainly on the size of the body and on its shape. At present, we have a database of about 30 well-defined shapes of real minor bodies (most of them asteroids, but also planetary satellites and cometary nuclei). Aims: In this paper we perform a statistical analysis of how the YORP effect depends on the shape. Methods: We used the Rubincam approximation (i.e. neglecting the effects of a finite thermal conductivity). Results: We show that, among real bodies, the distribution of the YORP types, according to the classification of Vokrouhlický and Čapek, is significantly different from the one obtained in the same paper from theoretical modeling of shapes. A new “type” also comes out. Moreover, we show that the types are strongly correlated with the intensity of the YORP effect (when normalized to eliminate the dependence on the size, and thus only related to the shape).

  7. Influence of Adsorption Orientation on the Statistical Mechanics Model of Type I Antifreeze Protein: The Thermal Hysteresis Temperature.

    PubMed

    Li, Li-Fen; Liang, Xi-Xia

    2017-10-19

    The antifreeze activity of type I antifreeze proteins (AFPIs) is studied on the basis of statistical mechanics theory, taking the AFP's adsorption orientation into account. The thermal hysteresis temperatures are calculated by determining the system Gibbs function as well as the AFP molecule coverage rate on the ice-crystal surface. Numerical results for the thermal hysteresis temperatures of AFP9, HPLC-6, and AAAA2kE are obtained both with and without inclusion of the adsorption orientation. The results show that the influence of the adsorption orientation on the thermal hysteresis temperature cannot be neglected, and the theoretical results agree reasonably well with the experimental data.

  8. Implementing statistical equating for MRCP(UK) Parts 1 and 2.

    PubMed

    McManus, I C; Chis, Liliana; Fox, Ray; Waller, Derek; Tang, Peter

    2014-09-26

    In 2008 and 2010, the MRCP(UK) exam changed the standard-setting of its Part 1 and Part 2 examinations from a hybrid Angoff/Hofstee method to statistical equating using Item Response Theory, the reference group being UK graduates. The present paper considers the implementation of the change, the question of whether the pass rate increased amongst non-UK candidates, any possible role of Differential Item Functioning (DIF), and changes in examination predictive validity after the change. Data from the MRCP(UK) Part 1 exam from 2003 to 2013 and the Part 2 exam from 2005 to 2013 were analysed. Inspection suggested that Part 1 pass rates were stable after the introduction of statistical equating, but showed greater annual variation, probably due to stronger candidates taking the examination earlier. Pass rates seemed to have increased among non-UK graduates after equating was introduced, but this increase was not associated with any changes in DIF after statistical equating. Statistical modelling of the pass rates for non-UK graduates found that pass rates, in both Part 1 and Part 2, were increasing year on year, with the changes probably beginning before the introduction of equating. The predictive validity of Part 1 for Part 2 was higher with statistical equating than with the previous hybrid Angoff/Hofstee method, confirming the utility of IRT-based statistical equating. Statistical equating was successfully introduced into the MRCP(UK) Part 1 and Part 2 written examinations, resulting in higher predictive validity than the previous Angoff/Hofstee standard setting. Concerns about an artefactual increase in pass rates for non-UK candidates after equating were shown not to be well-founded. Most likely the changes resulted from a genuine increase in candidate ability, albeit for reasons which remain unclear, coupled with a cognitive illusion giving the impression of a step-change immediately after equating began. Statistical equating provides a robust standard-setting method, with a better

  9. Statistics of time delay and scattering correlation functions in chaotic systems. I. Random matrix theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Novaes, Marcel

    2015-06-15

    We consider the statistics of time delay in a chaotic cavity having M open channels, in the absence of time-reversal invariance. In the random matrix theory approach, we compute the average value of polynomial functions of the time delay matrix Q = −iħ S†dS/dE, where S is the scattering matrix. Our results do not assume M to be large. In a companion paper, we develop a semiclassical approximation to S-matrix correlation functions, from which the statistics of Q can also be derived. Together, these papers contribute to establishing the conjectured equivalence between the random matrix and the semiclassical approaches.

  10. Standards-Based Procedural Phenotyping: The Arden Syntax on i2b2.

    PubMed

    Mate, Sebastian; Castellanos, Ixchel; Ganslandt, Thomas; Prokosch, Hans-Ulrich; Kraus, Stefan

    2017-01-01

    Phenotyping, or the identification of patient cohorts, is a recurring challenge in medical informatics. While there are open source tools such as i2b2 that address this problem by providing user-friendly querying interfaces, these platforms lack the semantic expressiveness to model complex phenotyping algorithms. The Arden Syntax provides procedural programming language constructs designed specifically for medical decision support and knowledge transfer. In this work, we investigate how language constructs of the Arden Syntax can be used for generic phenotyping. We implemented a prototypical tool to integrate i2b2 with an open source Arden execution environment. To demonstrate the applicability of our approach, we used the tool together with an Arden-based phenotyping algorithm to derive statistics about ICU-acquired hypernatremia. Finally, we discuss how the combination of i2b2's user-friendly cohort pre-selection and Arden's procedural expressiveness could benefit phenotyping.

  11. Characterizing the D2 statistic: word matches in biological sequences.

    PubMed

    Forêt, Sylvain; Wilson, Susan R; Burden, Conrad J

    2009-01-01

    Word matches are often used in sequence comparison methods, either as a measure of sequence similarity or in the first search steps of algorithms such as BLAST or BLAT. The D2 statistic is the number of matches of words of k letters between two sequences. Recent advances have been made in the characterization of this statistic and in the approximation of its distribution. Here, these results are extended to the case of approximate word matches. We compute the exact value of the variance of the D2 statistic for the case of a uniform letter distribution, and introduce a method to provide accurate approximations of the variance in the remaining cases. This enables the distribution of D2 to be approximated for typical situations arising in biological research. We apply these results to the identification of cis-regulatory modules, and show that this method detects such sequences with a high accuracy. The ability to approximate the distribution of D2 for both exact and approximate word matches will enable the use of this statistic in a more precise manner for sequence comparison, database searches, and identification of transcription factor binding sites.
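    The D2 count itself is simple to compute for exact word matches. Below is a minimal sketch (exact matches only, not the approximate-match extension the paper develops):

```python
from collections import Counter

def d2(seq_a, seq_b, k):
    """D2 statistic: number of pairs of positions whose k-letter words
    match, i.e. the sum over shared k-mers of (count in A) * (count in B)."""
    count_a = Counter(seq_a[i:i + k] for i in range(len(seq_a) - k + 1))
    count_b = Counter(seq_b[i:i + k] for i in range(len(seq_b) - k + 1))
    return sum(n * count_b[w] for w, n in count_a.items() if w in count_b)

# 2-mers of ACGT: {AC, CG, GT}; of ACGA: {AC, CG, GA}; shared pairs: 2
print(d2("ACGT", "ACGA", 2))  # → 2
```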

  12. The (mis)reporting of statistical results in psychology journals.

    PubMed

    Bakker, Marjan; Wicherts, Jelte M

    2011-09-01

    In order to study the prevalence, nature (direction), and causes of reporting errors in psychology, we checked the consistency of reported test statistics, degrees of freedom, and p values in a random sample of high- and low-impact psychology journals. In a second study, we established the generality of reporting errors in a random sample of recent psychological articles. Our results, on the basis of 281 articles, indicate that around 18% of statistical results in the psychological literature are incorrectly reported. Inconsistencies were more common in low-impact journals than in high-impact journals. Moreover, around 15% of the articles contained at least one statistical conclusion that proved, upon recalculation, to be incorrect; that is, recalculation rendered the previously significant result insignificant, or vice versa. These errors were often in line with researchers' expectations. We classified the most common errors and contacted authors to shed light on the origins of the errors.
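    The consistency check at the heart of this study can be sketched for the simple case of a reported z statistic (the authors checked t, F, and chi-square results; a z test keeps the illustration within the standard library):

```python
import math

def p_from_z(z):
    """Two-sided p-value for a z statistic under the standard normal."""
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def is_consistent(z, reported_p, decimals=2):
    """Does the reported (rounded) p-value match the recomputed one?"""
    return round(p_from_z(z), decimals) == round(reported_p, decimals)

print(is_consistent(2.10, 0.036, decimals=3))  # → True  (recomputed p ≈ .0357)
print(is_consistent(1.20, 0.03))               # → False (recomputed p ≈ .23)
```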

  13. STATISTICAL ANALYSIS OF TANK 19F FLOOR SAMPLE RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 19F as per the statistical sampling plan developed by Harris and Shine. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 19F. The uncertainty is quantified in this report by an UCL95% on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
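    A standard one-sided t-based upper confidence limit of the kind described above (mean plus a t quantile times the standard error) can be sketched as follows; the sample values are hypothetical, and this is not claimed to reproduce the SRNL procedure:

```python
import math
import statistics

T95_DF5 = 2.015  # one-sided 95% Student-t critical value, 5 degrees of freedom

def ucl95(results):
    """One-sided upper 95% confidence limit on the mean:
    UCL95% = mean + t * s / sqrt(n). Valid here for n = 6 only,
    since the t critical value above is hard-coded for 5 df."""
    n = len(results)
    assert n == 6
    return (statistics.mean(results)
            + T95_DF5 * statistics.stdev(results) / math.sqrt(n))

# hypothetical analyte concentrations from six scrape samples (mg/kg)
print(round(ucl95([12.1, 11.8, 12.5, 12.0, 11.9, 12.3]), 2))  # → 12.31
```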

  14. iCFD: Interpreted Computational Fluid Dynamics - Degeneration of CFD to one-dimensional advection-dispersion models using statistical experimental design - The secondary clarifier.

    PubMed

    Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat; Plósz, Benedek Gy

    2015-10-15

    The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models - computationally light tools, used e.g., as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method - presented in a straightforward and transparent way - is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor screening study and system understanding, 50 different sets of design and flow conditions are selected using Latin Hypercube Sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. In the framework, to degenerate the 2-D model structure, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter then are identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure is carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii

  15. Statistical limitations in functional neuroimaging. I. Non-inferential methods and statistical models.

    PubMed Central

    Petersson, K M; Nichols, T E; Poline, J B; Holmes, A P

    1999-01-01

    Functional neuroimaging (FNI) provides experimental access to the intact living brain making it possible to study higher cognitive functions in humans. In this review and in a companion paper in this issue, we discuss some common methods used to analyse FNI data. The emphasis in both papers is on assumptions and limitations of the methods reviewed. There are several methods available to analyse FNI data indicating that none is optimal for all purposes. In order to make optimal use of the methods available it is important to know the limits of applicability. For the interpretation of FNI results it is also important to take into account the assumptions, approximations and inherent limitations of the methods used. This paper gives a brief overview over some non-inferential descriptive methods and common statistical models used in FNI. Issues relating to the complex problem of model selection are discussed. In general, proper model selection is a necessary prerequisite for the validity of the subsequent statistical inference. The non-inferential section describes methods that, combined with inspection of parameter estimates and other simple measures, can aid in the process of model selection and verification of assumptions. The section on statistical models covers approaches to global normalization and some aspects of univariate, multivariate, and Bayesian models. Finally, approaches to functional connectivity and effective connectivity are discussed. In the companion paper we review issues related to signal detection and statistical inference. PMID:10466149

  16. Assessing the suitability of summary data for two-sample Mendelian randomization analyses using MR-Egger regression: the role of the I2 statistic.

    PubMed

    Bowden, Jack; Del Greco M, Fabiola; Minelli, Cosetta; Davey Smith, George; Sheehan, Nuala A; Thompson, John R

    2016-12-01

    MR-Egger regression has recently been proposed as a method for Mendelian randomization (MR) analyses incorporating summary data estimates of causal effect from multiple individual variants, which is robust to invalid instruments. It can be used to test for directional pleiotropy and provides an estimate of the causal effect adjusted for its presence. MR-Egger regression provides a useful additional sensitivity analysis to the standard inverse variance weighted (IVW) approach that assumes all variants are valid instruments. Both methods use weights that consider the single nucleotide polymorphism (SNP)-exposure associations to be known, rather than estimated. We call this the 'NO Measurement Error' (NOME) assumption. Causal effect estimates from the IVW approach exhibit weak instrument bias whenever the genetic variants utilized violate the NOME assumption, which can be reliably measured using the F-statistic. The effect of NOME violation on MR-Egger regression has yet to be studied. An adaptation of the I2 statistic from the field of meta-analysis is proposed to quantify the strength of NOME violation for MR-Egger. It lies between 0 and 1, and indicates the expected relative bias (or dilution) of the MR-Egger causal estimate in the two-sample MR context. We call it I(2)GX. The method of simulation extrapolation is also explored to counteract the dilution. Their joint utility is evaluated using simulated data and applied to a real MR example. In simulated two-sample MR analyses we show that, when a causal effect exists, the MR-Egger estimate of causal effect is biased towards the null when NOME is violated, and the stronger the violation (as indicated by lower values of I(2)GX), the stronger the dilution. When additionally all genetic variants are valid instruments, the type I error rate of the MR-Egger test for pleiotropy is inflated and the causal effect underestimated. Simulation extrapolation is shown to substantially mitigate these adverse effects. We
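    For reference, the meta-analysis I2 statistic that the paper adapts is computed from Cochran's Q. The sketch below shows the standard form (the paper's I(2)GX variant is instead built from the SNP-exposure summary statistics):

```python
def cochran_q(estimates, variances):
    """Cochran's Q: inverse-variance-weighted squared deviations
    of study estimates from the pooled estimate."""
    w = [1.0 / v for v in variances]
    pooled = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)
    return sum(wi * (e - pooled) ** 2 for wi, e in zip(w, estimates))

def i_squared(estimates, variances):
    """I2 = max(0, (Q - df) / Q): the fraction of variance across
    estimates attributed to heterogeneity rather than sampling error."""
    q = cochran_q(estimates, variances)
    df = len(estimates) - 1
    return max(0.0, (q - df) / q) if q > 0 else 0.0

# three study estimates with equal variance: Q = 8 on 2 df, so I2 = 0.75
print(round(i_squared([0.1, 0.3, 0.5], [0.01, 0.01, 0.01]), 6))  # → 0.75
```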

  17. Pisces did not have increased heart failure: data-driven comparisons of binary proportions between levels of a categorical variable can result in incorrect statistical significance levels.

    PubMed

    Austin, Peter C; Goldwasser, Meredith A

    2008-03-01

    We examined the impact on statistical inference when a chi(2) test is used to compare the proportion of successes in the level of a categorical variable that has the highest observed proportion of successes with the proportion of successes in all other levels of the categorical variable combined. Monte Carlo simulations and a case study examining the association between astrological sign and hospitalization for heart failure. A standard chi(2) test results in an inflation of the type I error rate, with the type I error rate increasing as the number of levels of the categorical variable increases. Using a standard chi(2) test, the hospitalization rate for Pisces was statistically significantly different from that of the other 11 astrological signs combined (P=0.026). After accounting for the fact that the selection of Pisces was based on it having the highest observed proportion of heart failure hospitalizations, subjects born under the sign of Pisces no longer had a significantly higher rate of heart failure hospitalization compared to the other residents of Ontario (P=0.152). Post hoc comparisons of the proportions of successes across different levels of a categorical variable can result in incorrect inferences.
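    The inflation described above is easy to reproduce by simulation. The sketch below (illustrative parameters, not the paper's simulation design) generates 12 categories with identical true rates, always tests the category with the highest observed rate against the rest, and counts rejections of a nominal 5% test:

```python
import math
import random

def two_prop_z(s1, n1, s2, n2):
    """z statistic for comparing two proportions (pooled standard error)."""
    p1, p2 = s1 / n1, s2 / n2
    p = (s1 + s2) / (n1 + n2)
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

random.seed(0)
K, N, RATE, SIMS = 12, 100, 0.10, 1000   # 12 "signs", all with the same true rate
rejections = 0
for _ in range(SIMS):
    successes = [sum(random.random() < RATE for _ in range(N)) for _ in range(K)]
    top = max(range(K), key=lambda j: successes[j])   # data-driven pick
    rest = sum(successes) - successes[top]
    if abs(two_prop_z(successes[top], N, rest, N * (K - 1))) > 1.96:
        rejections += 1
print(f"empirical type I error: {rejections / SIMS:.3f}")  # well above the nominal 0.05
```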

  18. Dynamics of the F(-) + CH3I → HF + CH2I(-) Proton Transfer Reaction.

    PubMed

    Zhang, Jiaxu; Xie, Jing; Hase, William L

    2015-12-17

    Direct chemical dynamics simulations, at collision energies Erel of 0.32 and 1.53 eV, were performed to obtain an atomistic understanding of the F(-) + CH3I reaction dynamics. There is only the F(-) + CH3I → CH3F + I(-) bimolecular nucleophilic substitution SN2 product channel at 0.32 eV. Increasing Erel to 1.53 eV opens the endothermic F(-) + CH3I → HF + CH2I(-) proton transfer reaction, which is less competitive than the SN2 reaction. The simulations reveal proton transfer occurs by two direct atomic-level mechanisms, rebound and stripping, and indirect mechanisms, involving formation of the F(-)···HCH2I complex and the roundabout. For the indirect trajectories all of the CH2I(-) is formed with zero-point energy (ZPE), while for the direct trajectories 50% form CH2I(-) without ZPE. Without a ZPE constraint for CH2I(-), the reaction cross sections for the rebound, stripping, and indirect mechanisms are 0.2 ± 0.1, 1.2 ± 0.4, and 0.7 ± 0.2 Å(2), respectively. Discarding trajectories that do not form CH2I(-) with ZPE reduces the rebound and stripping cross sections to 0.1 ± 0.1 and 0.7 ± 0.5 Å(2). The HF product is formed rotationally and vibrationally unexcited. The average value of J is 2.6 and with histogram binning n = 0. CH2I(-) is formed rotationally excited. The partitioning between CH2I(-) vibration and HF + CH2I(-) relative translation energy depends on the treatment of CH2I(-) ZPE. Without a CH2I(-) ZPE constraint the energy partitioning is primarily to relative translation with little CH2I(-) vibration. With a ZPE constraint, energy partitioning to CH2I(-) rotation, CH2I(-) vibration, and relative translation are statistically the same. The overall F(-) + CH3I rate constant at Erel of both 0.32 and 1.53 eV is in good agreement with experiment and negligibly affected by the treatment of CH2I(-) ZPE, since the SN2 reaction is the major contributor to the total reaction rate constant. The potential energy surface and reaction dynamics for F

  19. Statistical Aspects of North Atlantic Basin Tropical Cyclones During the Weather Satellite Era, 1960-2013. Part 2

    NASA Technical Reports Server (NTRS)

    Wilson, Robert M.

    2014-01-01

    This Technical Publication (TP) is part 2 of a two-part study of the North Atlantic basin tropical cyclones that occurred during the weather satellite era, 1960-2013. In particular, this TP examines the inferred statistical relationships between 25 tropical cyclone parameters and 9 specific climate-related factors, including the (1) Oceanic Niño Index (ONI), (2) Southern Oscillation Index (SOI), (3) Atlantic Multidecadal Oscillation (AMO) index, (4) Quasi-Biennial Oscillation (QBO) index, (5) North Atlantic Oscillation (NAO) index of the Climate Prediction Center (CPC), (6) NAO index of the Climate Research Unit (CRU), (7) Armagh surface air temperature (ASAT), (8) Global Land-Ocean Temperature Index (GLOTI), and (9) Mauna Loa carbon dioxide (CO2) (MLCO2) index. Part 1 of this two-part study examined the statistical aspects of the 25 tropical cyclone parameters (e.g., frequencies, peak wind speed (PWS), accumulated cyclone energy (ACE), etc.) and provided the results of statistical testing (i.e., runs-testing, the t-statistic for independent samples, and Poisson distributions). Also, the study gave predictions for the frequencies of the number of tropical cyclones (NTC), number of hurricanes (NH), number of major hurricanes (NMH), and number of United States land-falling hurricanes (NUSLFH) expected for the 2014 season, based on the statistics of the overall interval 1960-2013, the subinterval 1995-2013, and whether the year 2014 would be either an El Niño year (ENY) or a non-El Niño year (NENY).

  20. Nature of Driving Force for Protein Folding-- A Result From Analyzing the Statistical Potential

    NASA Astrophysics Data System (ADS)

    Li, Hao; Tang, Chao; Wingreen, Ned S.

    1998-03-01

    In a statistical approach to protein structure analysis, Miyazawa and Jernigan (MJ) derived a 20 × 20 matrix of inter-residue contact energies between different types of amino acids. Using the method of eigenvalue decomposition, we find that the MJ matrix can be accurately reconstructed from its first two principal component vectors as M_ij = C_0 + C_1(q_i + q_j) + C_2 q_i q_j, with constant C's and 20 q values associated with the 20 amino acids. This regularity is due to hydrophobic interactions and a force of demixing, the latter obeying Hildebrand's solubility theory of simple liquids.
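    The low-rank structure implied by that functional form can be checked numerically. The sketch below uses hypothetical C constants and q values (not the actual MJ energies) and confirms that a matrix of this form, after subtracting the constant, has rank 2:

```python
import numpy as np

# hypothetical C's and q's, for illustration only
rng = np.random.default_rng(42)
q = rng.uniform(-1.0, 1.0, size=20)      # one q value per amino acid
C0, C1, C2 = -2.0, 1.5, 3.0
M = C0 + C1 * (q[:, None] + q[None, :]) + C2 * np.outer(q, q)

# After removing the constant C0, column j of M is
# C1*q + q_j*(C1*1 + C2*q): a combination of just two vectors,
# so the matrix is captured by two principal components (rank 2).
print(int(np.linalg.matrix_rank(M - C0)))  # → 2
```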

  1. Experimental cosmic statistics - I. Variance

    NASA Astrophysics Data System (ADS)

    Colombi, Stéphane; Szapudi, István; Jenkins, Adrian; Colberg, Jörg

    2000-04-01

    Counts-in-cells are measured in the τCDM Virgo Hubble Volume simulation. This large N-body experiment has 10^9 particles in a cubic box of size 2000 h^-1 Mpc. The unprecedented combination of size and resolution allows, for the first time, a realistic numerical analysis of the cosmic errors and cosmic correlations of statistics related to counts-in-cells measurements, such as the probability distribution function PN itself, its factorial moments Fk and the related cumulants ψ and SNs. These statistics are extracted from the whole simulation cube, as well as from 4096 subcubes of size 125 h^-1 Mpc, each representing a virtual random realization of the local universe. The measurements and their scatter over the subvolumes are compared to the theoretical predictions of Colombi, Bouchet & Schaeffer for P0, and of Szapudi & Colombi and Szapudi, Colombi & Bernardeau for the factorial moments and the cumulants. The general behaviour of experimental variance and cross-correlations as functions of scale and order is well described by theoretical predictions, with a few per cent accuracy in the weakly non-linear regime for the cosmic error on factorial moments. On highly non-linear scales, however, all variants of the hierarchical model used by SC and SCB to describe clustering appear to become increasingly approximate, which leads to a slight overestimation of the error, by about a factor of two in the worst case. Because of the needed supplementary perturbative approach, the theory is less accurate for non-linear estimators, such as cumulants, than for factorial moments. The cosmic bias is evaluated as well, and, in agreement with SCB, is found to be insignificant compared with the cosmic variance in all regimes investigated. While higher order statistics were previously evaluated in several simulations, this work presents textbook quality measurements of SNs, 3<=N<=10, in an unprecedented dynamic range of 0.05 <~ ψ <~ 50. In the weakly non-linear regime the results confirm
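    The factorial moments mentioned above have a simple estimator from a list of cell counts. A minimal sketch with toy counts (not the paper's estimator machinery or its error analysis):

```python
import math

def factorial_moments(counts, kmax):
    """Factorial moments F_k = <N(N-1)...(N-k+1)> of a list of cell counts.
    math.perm(c, k) is the falling factorial c!/(c-k)! (0 when k > c)."""
    ncells = len(counts)
    return [sum(math.perm(c, k) for c in counts) / ncells
            for k in range(1, kmax + 1)]

# toy counts-in-cells: F_1 is the mean count, F_2 = <N(N-1)>
print(factorial_moments([0, 1, 2, 3], 2))  # → [1.5, 2.0]
```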

  2. Planck 2015 results. XVI. Isotropy and statistics of the CMB

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Akrami, Y.; Aluri, P. K.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartolo, N.; Basak, S.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Casaponsa, B.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, H. C.; Christensen, P. R.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Contreras, D.; Couchot, F.; Coulais, A.; Crill, B. P.; Cruz, M.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Désert, F.-X.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Fantaye, Y.; Fergusson, J.; Fernandez-Cobos, R.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Frejsel, A.; Frolov, A.; Galeotta, S.; Galli, S.; Ganga, K.; Gauthier, C.; Ghosh, T.; Giard, M.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Hanson, D.; Harrison, D. L.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huang, Z.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kim, J.; Kisner, T. S.; Knoche, J.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Leonardi, R.; Lesgourgues, J.; Levrier, F.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Liu, H.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. 
F.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Marinucci, D.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; McGehee, P.; Meinhold, P. R.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mikkelsen, K.; Mitra, S.; Miville-Deschênes, M.-A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Pant, N.; Paoletti, D.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Perdereau, O.; Perotto, L.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Popa, L.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Rotti, A.; Roudier, G.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Savelainen, M.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Souradeep, T.; Spencer, L. D.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Trombetti, T.; Tucci, M.; Tuovinen, J.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zibin, J. P.; Zonca, A.

    2016-09-01

    We test the statistical isotropy and Gaussianity of the cosmic microwave background (CMB) anisotropies using observations made by the Planck satellite. Our results are based mainly on the full Planck mission for temperature, but also include some polarization measurements. In particular, we consider the CMB anisotropy maps derived from the multi-frequency Planck data by several component-separation methods. For the temperature anisotropies, we find excellent agreement between results based on these sky maps over both a very large fraction of the sky and a broad range of angular scales, establishing that potential foreground residuals do not affect our studies. Tests of skewness, kurtosis, multi-normality, N-point functions, and Minkowski functionals indicate consistency with Gaussianity, while a power deficit at large angular scales is manifested in several ways, for example low map variance. The results of a peak statistics analysis are consistent with the expectations of a Gaussian random field. The "Cold Spot" is detected with several methods, including map kurtosis, peak statistics, and mean temperature profile. We thoroughly probe the large-scale dipolar power asymmetry, detecting it with several independent tests, and address the subject of a posteriori correction. Tests of directionality suggest the presence of angular clustering from large to small scales, but at a significance that is dependent on the details of the approach. We perform the first examination of polarization data, finding the morphology of stacked peaks to be consistent with the expectations of statistically isotropic simulations. Where they overlap, these results are consistent with the Planck 2013 analysis based on the nominal mission data and provide our most thorough view of the statistics of the CMB fluctuations to date.

  3. Optical population of iodine molecule ion-pair states via MI2 vdW complexes, M = I2, Xe, of valence states correlating with the third, I(2P1/2) + I(2P1/2), dissociation limit

    NASA Astrophysics Data System (ADS)

    Lukashov, S. S.; Poretsky, S. A.; Pravilov, A. M.; Khadikova, E. I.; Shevchenko, E. V.

    2010-10-01

    The first results of measurements and analysis of excitation spectra of the λlum = 3250 Å luminescence corresponding to the I2(D0u+ → X0g+) transition, as well as luminescence at λlum = 3400 Å, where I2(D'2g → A'2u and/or β1g → A1u) transitions occur, observed after three-step, λ1 + λf + λ1, λ1 = 5321-5508.2 Å, λf = 10644.0 Å, laser excitation of pure iodine vapour and I2 + Xe mixtures at room temperature via MI2 vdW complexes, M = I2, Xe, of the I2(0g+, 1u(bb)) valence states correlating with the third, I(2P1/2) + I(2P1/2) (I2(bb)), dissociation limit are presented. Luminescence spectra in the λlum = 2200-3500 Å spectral range are also analyzed. Strong luminescence from the I2(D) and, probably, I2(D' and β) states is observed. We discuss three alternative mechanisms of optical population of the IP state. In our opinion, the mechanism including the MI2 complexes is the most probable.

  4. An entropy-based statistic for genomewide association studies.

    PubMed

    Zhao, Jinying; Boerwinkle, Eric; Xiong, Momiao

    2005-07-01

    Efficient genotyping methods and the availability of a large collection of single-nucleotide polymorphisms provide valuable tools for genetic studies of human disease. The standard chi2 statistic for case-control studies, which uses a linear function of allele frequencies, has limited power when the number of marker loci is large. We introduce a novel test statistic for genetic association studies that uses Shannon entropy and a nonlinear function of allele frequencies to amplify the differences in allele and haplotype frequencies to maintain statistical power with large numbers of marker loci. We investigate the relationship between the entropy-based test statistic and the standard chi2 statistic and show that, in most cases, the power of the entropy-based statistic is greater than that of the standard chi2 statistic. The distribution of the entropy-based statistic and the type I error rates are validated using simulation studies. Finally, we apply the new entropy-based test statistic to two real data sets, one for the COMT gene and schizophrenia and one for the MMP-2 gene and esophageal carcinoma, to evaluate the performance of the new method for genetic association studies. The results show that the entropy-based statistic obtained smaller P values than did the standard chi2 statistic.
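The contrast the abstract draws between a linear (chi2) and a nonlinear (entropy-based) function of allele frequencies can be illustrated with a small sketch. The likelihood-ratio G statistic below is one standard entropy-based alternative, not the authors' exact statistic (which is defined in the full paper), and the allele counts are hypothetical:

```python
import math

def chi2_stat(case, ctrl):
    """Pearson chi-square for case vs. control allele counts at one locus."""
    n_case, n_ctrl = sum(case), sum(ctrl)
    n = n_case + n_ctrl
    stat = 0.0
    for a, b in zip(case, ctrl):
        tot = a + b
        for obs, rowsum in ((a, n_case), (b, n_ctrl)):
            exp = rowsum * tot / n
            stat += (obs - exp) ** 2 / exp
    return stat

def g_stat(case, ctrl):
    """Likelihood-ratio (G) statistic: built from entropy-like obs*log(obs/exp) terms."""
    n_case, n_ctrl = sum(case), sum(ctrl)
    n = n_case + n_ctrl
    stat = 0.0
    for a, b in zip(case, ctrl):
        tot = a + b
        for obs, rowsum in ((a, n_case), (b, n_ctrl)):
            exp = rowsum * tot / n
            if obs > 0:
                stat += 2.0 * obs * math.log(obs / exp)
    return stat

# Hypothetical counts of alleles (A, a) in cases vs. controls at one SNP
case_counts = [60, 140]
ctrl_counts = [40, 160]
chi2_val = chi2_stat(case_counts, ctrl_counts)
g_val = g_stat(case_counts, ctrl_counts)
```

Both statistics are asymptotically chi-square distributed, but the logarithmic weighting in the G statistic responds to frequency differences nonlinearly, which is the general idea the abstract invokes.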

  5. CARDIO-i2b2: integrating arrhythmogenic disease data in i2b2.

    PubMed

    Segagni, Daniele; Tibollo, Valentina; Dagliati, Arianna; Napolitano, Carlo; G Priori, Silvia; Bellazzi, Riccardo

    2012-01-01

    The CARDIO-i2b2 project is an initiative to customize the i2b2 bioinformatics tool with the aim of integrating clinical and research data in order to support translational research in cardiology. In this work we describe the implementation and the customization of i2b2 to manage the data of arrhythmogenic disease patients collected at the Fondazione Salvatore Maugeri of Pavia in a joint project with the NYU Langone Medical Center (New York, USA). The i2b2 clinical research chart data warehouse is populated with the data obtained from the research database called TRIAD. The research infrastructure is extended by the development of new plug-ins for the i2b2 web client application able to properly select and export phenotypic data and to perform data analysis.

  6. Prediction and optimization of the laccase-mediated synthesis of the antimicrobial compound iodine (I2).

    PubMed

    Schubert, M; Fey, A; Ihssen, J; Civardi, C; Schwarze, F W M R; Mourad, S

    2015-01-10

    An artificial neural network (ANN) and genetic algorithm (GA) were applied to improve the laccase-mediated oxidation of iodide (I(-)) to elemental iodine (I2). Biosynthesis of iodine (I2) was studied with a 5-level-4-factor central composite design (CCD). The generated ANN was evaluated with several statistical indices and gave better results than a classical quadratic response surface (RS) model. Sensitivity analysis determined the relative significance of the model input parameters, ranking them in order of importance (pH > laccase > mediator > iodide). ANN-GA methodology was used to optimize the input space of the neural network model to find optimal settings for the laccase-mediated synthesis of iodine. The ANN-GA optimized parameters resulted in a 9.9% increase in the conversion rate. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Planck 2015 results: XVI. Isotropy and statistics of the CMB

    DOE PAGES

    Ade, P. A. R.; Aghanim, N.; Akrami, Y.; ...

    2016-09-20

    In this paper, we test the statistical isotropy and Gaussianity of the cosmic microwave background (CMB) anisotropies using observations made by the Planck satellite. Our results are based mainly on the full Planck mission for temperature, but also include some polarization measurements. In particular, we consider the CMB anisotropy maps derived from the multi-frequency Planck data by several component-separation methods. For the temperature anisotropies, we find excellent agreement between results based on these sky maps over both a very large fraction of the sky and a broad range of angular scales, establishing that potential foreground residuals do not affect our studies. Tests of skewness, kurtosis, multi-normality, N-point functions, and Minkowski functionals indicate consistency with Gaussianity, while a power deficit at large angular scales is manifested in several ways, for example low map variance. The results of a peak statistics analysis are consistent with the expectations of a Gaussian random field. The “Cold Spot” is detected with several methods, including map kurtosis, peak statistics, and mean temperature profile. We thoroughly probe the large-scale dipolar power asymmetry, detecting it with several independent tests, and address the subject of a posteriori correction. Tests of directionality suggest the presence of angular clustering from large to small scales, but at a significance that is dependent on the details of the approach. We perform the first examination of polarization data, finding the morphology of stacked peaks to be consistent with the expectations of statistically isotropic simulations. Finally, where they overlap, these results are consistent with the Planck 2013 analysis based on the nominal mission data and provide our most thorough view of the statistics of the CMB fluctuations to date.
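A minimal, illustrative version of the one-point (skewness/kurtosis) tests mentioned above can be sketched in Python. The "map" here is a simulated Gaussian sample, not Planck data, and real analyses compare the observed moments against full ensembles of statistically isotropic simulations rather than against the asymptotic Gaussian values used here:

```python
import math
import random

def skew_kurt(x):
    """Sample skewness and excess kurtosis of a 1-D sample of pixel values."""
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n
    m3 = sum((v - mean) ** 3 for v in x) / n
    m4 = sum((v - mean) ** 4 for v in x) / n
    return m3 / m2 ** 1.5, m4 / m2 ** 2 - 3.0

# Simulated "map": pixels drawn from a Gaussian field have skewness ~ 0
# and excess kurtosis ~ 0, up to sampling scatter of order sqrt(6/n), sqrt(24/n).
random.seed(0)
gaussian_map = [random.gauss(0.0, 1.0) for _ in range(20000)]
s, k = skew_kurt(gaussian_map)
```

A non-Gaussian feature such as the Cold Spot would show up as moments lying outside the scatter of the simulated Gaussian ensemble.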

  8. Reserve Manpower Statistics, 1 January - 31 March 1986.

    DTIC Science & Technology

    1986-03-31

    This is the first issue of Reserve Manpower Statistics, a quarterly publication based upon data from the Reserve Components Common Personnel Data System. Department of Defense, Reserve Manpower Statistics, March 31, 1986.

  9. Cycom 977-2 Composite Material: Impact Test Results (workshop presentation)

    NASA Technical Reports Server (NTRS)

    Engle, Carl; Herald, Stephen; Watkins, Casey

    2005-01-01

    Contents include the following: Ambient (13A) tests of Cycom 977-2 impact characteristics by the Bruceton statistical method at MSFC and WSTF. Repeat (13A) tests of tested Cycom from phase I at MSFC to expand the testing statistical database. Conduct high-pressure tests (13B) in liquid oxygen (LOX) and GOX at MSFC and WSTF to determine Cycom reaction characteristics and batch effect. Conduct expanded ambient (13A) LOX tests at MSFC and high-pressure (13B) testing to determine pressure effects in LOX. Expand the 13B GOX database.

  10. Evaluation of statistical designs in phase I expansion cohorts: the Dana-Farber/Harvard Cancer Center experience.

    PubMed

    Dahlberg, Suzanne E; Shapiro, Geoffrey I; Clark, Jeffrey W; Johnson, Bruce E

    2014-07-01

    Phase I trials have traditionally been designed to assess toxicity and establish phase II doses with dose-finding studies and expansion cohorts, but they frequently exceed the traditional sample size to further assess endpoints in specific patient subsets. The scientific objectives of phase I expansion cohorts and their evolving role in the current era of targeted therapies have yet to be systematically examined. Adult therapeutic phase I trials opened within Dana-Farber/Harvard Cancer Center (DF/HCC) from 1988 to 2012 were identified for sample size details. Statistical designs and study objectives of those submitted in 2011 were reviewed for expansion cohort details. Five hundred twenty-two adult therapeutic phase I trials were identified during the 25 years. The average sample size of a phase I study increased from 33.8 patients to 73.1 patients over that time. The proportion of trials with planned enrollment of 50 or fewer patients dropped from 93.0% during the time period 1988 to 1992 to 46.0% between 2008 and 2012; at the same time, the proportion of trials enrolling 51 to 100 patients and more than 100 patients increased from 5.3% and 1.8%, respectively, to 40.5% and 13.5% (χ2 test, two-sided P < .001). Sixteen of the 60 trials (26.7%) in 2011 enrolled patients to three or more sub-cohorts in the expansion phase. Sixty percent of studies provided no statistical justification of the sample size, although 91.7% of trials stated response as an objective. Our data suggest that phase I studies have dramatically changed in size and scientific scope within the last decade. Additional studies addressing the implications of this trend on research processes, ethical concerns, and resource burden are needed. © The Author 2014. Published by Oxford University Press. All rights reserved.
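The χ2 comparison of enrollment distributions across eras can be illustrated with a contingency-table sketch. The counts below are hypothetical, shaped after the percentages quoted in the abstract rather than taken from the DF/HCC data:

```python
def chi2_contingency(table):
    """Pearson chi-square statistic and degrees of freedom for an r x c table."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    n = sum(rows)
    stat = sum(
        (table[i][j] - rows[i] * cols[j] / n) ** 2 / (rows[i] * cols[j] / n)
        for i in range(len(rows))
        for j in range(len(cols))
    )
    dof = (len(rows) - 1) * (len(cols) - 1)
    return stat, dof

# Hypothetical trial counts by planned enrollment (<=50, 51-100, >100)
# for the two eras, loosely matching the quoted percentages.
era_1988_1992 = [53, 3, 1]
era_2008_2012 = [63, 55, 18]
stat, dof = chi2_contingency([era_1988_1992, era_2008_2012])
```

With 2 degrees of freedom, a statistic this large corresponds to P well below .001, consistent with the comparison reported in the abstract.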

  11. Study designs, use of statistical tests, and statistical analysis software choice in 2015: Results from two Pakistani monthly Medline indexed journals.

    PubMed

    Shaikh, Masood Ali

    2017-09-01

    Assessment of research articles in terms of the study designs used, statistical tests applied, and statistical analysis programmes employed helps determine the research activity profile and trends in the country. In this descriptive study, all original articles published by the Journal of Pakistan Medical Association (JPMA) and the Journal of the College of Physicians and Surgeons Pakistan (JCPSP) in the year 2015 were reviewed in terms of study designs used, application of statistical tests, and use of statistical analysis programmes. JPMA and JCPSP published 192 and 128 original articles, respectively, in the year 2015. Results of this study indicate that the cross-sectional design, bivariate inferential analysis comparing two variables/groups, and the SPSS package were, respectively, the most common study design, inferential statistical analysis, and statistical analysis software programme. These results echo the previously published assessment of these two journals for the year 2014.

  12. Enhancements to the TOUGH2 Simulator as Implemented in iTOUGH2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finsterle, Stefan

    iTOUGH2 is a program for parameter estimation, sensitivity analysis, and uncertainty propagation analysis. It is based on the TOUGH2 simulator for non-isothermal multiphase, multicomponent flow and transport in fractured and porous media [Pruess, 1987, 1991, 2005, 2011; Falta et al., 1995; Pruess et al., 1999, 2002, 2012; Doughty, 2013]. The core of iTOUGH2 contains slightly modified versions of TOUGH2 modules. Most code modifications are editorial and do not affect the simulation results. As a result, standard TOUGH2 input files can be used in iTOUGH2, and identical results are obtained if iTOUGH2 is run in forward mode. However, a number of modifications have been made as described in this report. They enhance the functionality, flexibility, and ease-of-use of the forward simulator. This report complements the iTOUGH2 User's Guide, the iTOUGH2 Command Reference, and the collection of tutorial examples in iTOUGH2 Sample Problems.

  13. Kepler Planet Detection Metrics: Statistical Bootstrap Test

    NASA Technical Reports Server (NTRS)

    Jenkins, Jon M.; Burke, Christopher J.

    2016-01-01

    This document describes the data produced by the Statistical Bootstrap Test over the final three Threshold Crossing Event (TCE) deliveries to NExScI: SOC 9.1 (Q1-Q16; Tenenbaum et al. 2014), SOC 9.2 (Q1-Q17), aka DR24 (Seader et al. 2015), and SOC 9.3 (Q1-Q17), aka DR25 (Twicken et al. 2016). The last few years have seen significant improvements in the SOC science data processing pipeline, leading to higher quality light curves and more sensitive transit searches. The statistical bootstrap analysis results presented here and the numerical results archived at NASA's Exoplanet Science Institute (NExScI) bear witness to these software improvements. This document attempts to introduce and describe the main features and differences between these three data sets as a consequence of the software changes.

  14. Optical population of iodine molecule ion-pair states via valence states correlating with the third, I(2P1/2) + I(2P1/2), dissociation limit and their MI2 vdW complexes, M = I2, Xe

    NASA Astrophysics Data System (ADS)

    Lukashov, S. S.; Poretsky, S. A.; Pravilov, A. M.; Khadikova, E. I.; Shevchenko, E. V.

    2010-10-01

    The first results of measurements and analysis of excitation spectra of the I2(D0u+ → X0g+) and I2(D0u+ → X0g+ and/or β1g → A1u) luminescence, observed after three-step, λ1 + λf + λ1, λ1 = 5508-5530 Å, λf = 10644.0 Å, laser excitation of pure iodine vapour and I2 + Xe mixtures at room temperature via bound parts of the I2(0g+, 1u(bb)) valence states correlating with the third, I(2P1/2) + I(2P1/2), dissociation limit and their MI2 vdW complexes, M = I2, Xe, are presented. Luminescence spectra in the λlum = 2200-5000 Å spectral range are also analyzed. Strong luminescence from the I2(D, γ, D', and/or β) states is observed, though the latter two may be populated in optical transitions in a free iodine molecule if hyperfine coupling of the I2(0g+ and 1u(bb)) state rovibronic levels occurs. We discuss possible mechanisms of optical population of the IP state.

  15. Combined Interactions with I1-, I2-Imidazoline Binding Sites and α2-Adrenoceptors To Manage Opioid Addiction

    PubMed Central

    2016-01-01

    Tolerance and dependence associated with chronic opioid exposure result from molecular, cellular, and neural network adaptations. Such adaptations concern opioid and nonopioid systems, including α2-adrenoceptors (α2-ARs) and I1- and I2-imidazoline binding sites (IBS). Agmatine, one of the hypothesized endogenous ligands of IBS, targeting several systems including α2-ARs and IBS, proved to be able to regulate opioid-induced analgesia and to attenuate the development of tolerance and dependence. Interested in the complex pharmacological profile of agmatine and considering the nature of its targets, we evaluated two series of imidazolines, rationally designed to simultaneously interact with I1-/I2-IBS or I1-/I2-IBS/α2-ARs. The compounds showing the highest affinities for I1-/I2-IBS or I1-/I2-IBS/α2-ARs have been selected for their in vivo evaluation on opiate withdrawal syndrome. Interestingly, 9, displaying I1-/I2-IBS/α2-ARs interaction profile, appears more effective in reducing expression and acquisition of morphine dependence and, therefore, might be considered a promising tool in managing opioid addiction. PMID:27774136

  16. Digital Assays Part I: Partitioning Statistics and Digital PCR.

    PubMed

    Basu, Amar S

    2017-08-01

    A digital assay is one in which the sample is partitioned into many small containers such that each partition contains a discrete number of biological entities (0, 1, 2, 3, …). A powerful technique in the biologist's toolkit, digital assays bring a new level of precision in quantifying nucleic acids, measuring proteins and their enzymatic activity, and probing single-cell genotypes and phenotypes. Part I of this review begins with the benefits and Poisson statistics of partitioning, including sources of error. The remainder focuses on digital PCR (dPCR) for quantification of nucleic acids. We discuss five commercial instruments that partition samples into physically isolated chambers (cdPCR) or droplet emulsions (ddPCR). We compare the strengths of dPCR (absolute quantitation, precision, and ability to detect rare or mutant targets) with those of its predecessor, quantitative real-time PCR (dynamic range, larger sample volumes, and throughput). Lastly, we describe several promising applications of dPCR, including copy number variation, quantitation of circulating tumor DNA and viral load, RNA/miRNA quantitation with reverse transcription dPCR, and library preparation for next-generation sequencing. This review is intended to give a broad perspective to scientists interested in adopting digital assays into their workflows. Part II focuses on digital protein and cell assays.
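The Poisson statistics of partitioning that underlie absolute quantitation in dPCR follow a standard correction: because a partition can hold 0, 1, 2, … targets, the fraction of negative partitions estimates exp(-λ), so the mean copies per partition is λ = -ln(f_neg). A minimal sketch (the droplet counts are illustrative):

```python
import math

def mean_copies_per_partition(n_partitions, n_negative):
    """Poisson estimate of the mean number of targets per partition.

    With random partitioning, the count per partition is Poisson(lambda),
    so P(empty) = exp(-lambda) and lambda = -ln(fraction negative).
    """
    f_neg = n_negative / n_partitions
    return -math.log(f_neg)

def total_copies(n_partitions, n_negative):
    """Estimated total number of target molecules across all partitions."""
    return mean_copies_per_partition(n_partitions, n_negative) * n_partitions

# A ddPCR-style run of 20,000 droplets with 14,000 negative droplets:
lam = mean_copies_per_partition(20000, 14000)
```

Dividing the total-copy estimate by the partitioned sample volume then yields an absolute concentration without a standard curve, which is the core advantage over qPCR noted in the review.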

  17. 7 CFR 2.68 - Administrator, National Agricultural Statistics Service.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 1 2010-01-01 2010-01-01 false Administrator, National Agricultural Statistics... Statistics Service. (a) Delegations. Pursuant to § 2.21 (a)(3) and (a)(8), subject to reservations in § 2.21..., Education, and Economics to the Administrator, National Agricultural Statistics Service: (1) Prepare crop...

  18. Modification of emission photon statistics from single quantum dots using metal/SiO2 core/shell nanostructures.

    PubMed

    Naiki, Hiroyuki; Oikawa, Hidetoshi; Masuo, Sadahiro

    2017-04-12

    Control of the emission photon statistics, i.e., single-photon and multi-photon emission, of isolated QDs is required for tailoring optoelectronic applications. In this article, we demonstrate that the emission photon statistics can be modified by controlling the spectral overlap of the QDs with the localized surface plasmon resonance (LSPR) of the metal nanoparticle (metal NP) and the distance between the QD and the metal NP. Moreover, the contributions to the modification of the emission photon statistics, namely the excitation and emission enhancements and the quenching generated by the spectral overlap and the distance, are elucidated. By fabricating well-defined SiO2-coated AgNPs and AuNPs (metal/SiO2), the spectral overlap, determined by the metal species (Ag or Au), and the distance, set by the thickness of the SiO2 shell, are controlled. The probability of single-photon emission of a single QD was increased by the enhancement of the excitation rate via adjusting the distance using Ag/SiO2, while single-photon emission was converted to multi-photon emission by exciton quenching at a short distance and a small spectral overlap. By contrast, the probability of multi-photon emission was increased by enhancement of the multi-photon emission rate and the quenching via the spectral overlap using Au/SiO2. These results provide a fundamental basis for controlling the emission photon statistics of single QDs via the spectral overlap and the distance, and for understanding the interaction between plasmonic nanostructures and single-QD systems.

  19. Exact statistical results for binary mixing and reaction in variable density turbulence

    NASA Astrophysics Data System (ADS)

    Ristorcelli, J. R.

    2017-02-01

    We report a number of rigorous statistical results on binary active scalar mixing in variable density turbulence. The study is motivated by mixing between pure fluids with very different densities and whose density intensity is of order unity. Our primary focus is the derivation of exact mathematical results for mixing in variable density turbulence, and we point out the potential fields of application of the results. A binary one-step reaction is invoked to derive a metric to assess the state of mixing. The mean reaction rate in variable density turbulent mixing can be expressed, in closed form, using the first order Favre mean variables and the Reynolds averaged density variance, ⟨ρ²⟩. We show that the normalized density variance ⟨ρ²⟩ reflects the reduction of the reaction due to mixing and is a mix metric. The result is mathematically rigorous. The result is the variable density analog of the normalized mass fraction variance ⟨c²⟩ used in constant density turbulent mixing. As a consequence, we demonstrate that use of the analogous normalized Favre variance of the mass fraction, c̃″², as a mix metric is not theoretically justified in variable density turbulence. We additionally derive expressions relating various second order moments of the mass fraction, specific volume, and density fields. The central role of the density-specific volume covariance ⟨ρv⟩ is highlighted; it is a key quantity with considerable dynamical significance linking various second order statistics. For laboratory experiments, we have developed exact relations between the Reynolds scalar variance ⟨c²⟩, its Favre analog c̃″², and various second moments including ⟨ρv⟩. For moment closure models that evolve ⟨ρv⟩ and not ⟨ρ²⟩, we provide a novel expression for ⟨ρ²⟩ in terms of a rational function of ⟨ρv⟩ that avoids recourse to Taylor series methods (which do not converge for large density differences). We have derived
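The distinction between Reynolds and Favre statistics drawn above rests on two averaging conventions. As a reminder, the standard definitions (textbook conventions, not taken from the paper itself) are:

```latex
\overline{f} \;\;\text{(Reynolds average)}, \qquad
\widetilde{f} \;=\; \frac{\overline{\rho f}}{\overline{\rho}} \;\;\text{(Favre, density-weighted average)},
\qquad
f' \;=\; f - \overline{f}, \qquad f'' \;=\; f - \widetilde{f},
\qquad
\overline{f''} \;=\; \overline{f} - \widetilde{f} \;=\; -\,\frac{\overline{\rho' f'}}{\overline{\rho}}.
```

The last identity shows that Favre and Reynolds statistics coincide only when the density-scalar correlation vanishes, which is why the Reynolds variance ⟨c²⟩ and its Favre analog can behave as different mix metrics in variable density turbulence.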

  20. Non-statistical effects in bond fission reactions of 1,2-difluoroethane

    NASA Astrophysics Data System (ADS)

    Schranz, Harold W.; Raff, Lionel M.; Thompson, Donald L.

    1991-08-01

    A microcanonical, classical variational transition-state theory based on the use of the efficient microcanonical sampling (EMS) procedure is applied to simple bond fission in 1,2-difluoroethane. Comparison is made with results of trajectory calculations performed on the same global potential-energy surface. Agreement between the statistical theory and trajectory results for C-C, C-F, and C-H bond fissions is poor, with differences as large as a factor of 125. Most importantly, at the lower energy studied, 6.0 eV, the statistical calculations predict considerably slower rates than those computed from trajectories. We conclude from these results that the statistical assumptions inherent in the transition-state theory method are not valid for 1,2-difluoroethane in spite of the fact that the total intramolecular energy transfer rate out of C-H and C-C normal and local modes is large relative to the bond fission rates. The IVR rate is not globally rapid and the trajectories do not access all of the energetically available phase space uniformly on the timescale of the reactions.

  1. Statistical Optics

    NASA Astrophysics Data System (ADS)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I RIchard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  2. Infants' statistical learning: 2- and 5-month-olds' segmentation of continuous visual sequences.

    PubMed

    Slone, Lauren Krogh; Johnson, Scott P

    2015-05-01

    Past research suggests that infants have powerful statistical learning abilities; however, studies of infants' visual statistical learning offer differing accounts of the developmental trajectory of and constraints on this learning. To elucidate this issue, the current study tested the hypothesis that young infants' segmentation of visual sequences depends on redundant statistical cues to segmentation. A sample of 20 2-month-olds and 20 5-month-olds observed a continuous sequence of looming shapes in which unit boundaries were defined by both transitional probability and co-occurrence frequency. Following habituation, only 5-month-olds showed evidence of statistically segmenting the sequence, looking longer to a statistically improbable shape pair than to a probable pair. These results reaffirm the power of statistical learning in infants as young as 5 months but also suggest considerable development of statistical segmentation ability between 2 and 5 months of age. Moreover, the results do not support the idea that infants' ability to segment visual sequences based on transitional probabilities and/or co-occurrence frequencies is functional at the onset of visual experience, as has been suggested previously. Rather, this type of statistical segmentation appears to be constrained by the developmental state of the learner. Factors contributing to the development of statistical segmentation ability during early infancy, including memory and attention, are discussed. Copyright © 2015 Elsevier Inc. All rights reserved.
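The transitional-probability cue described above can be made concrete with a small sketch. The shape stream below is hypothetical, built from two two-shape "units" (AB and CD) in the style of infant statistical-learning designs, so that within-unit transitional probability is high and between-unit probability is lower:

```python
from collections import Counter

def transitional_probabilities(seq):
    """P(next | current) for each adjacent pair in a sequence of symbols."""
    pair_counts = Counter(zip(seq, seq[1:]))
    first_counts = Counter(seq[:-1])
    return {
        (a, b): count / first_counts[a]
        for (a, b), count in pair_counts.items()
    }

# Hypothetical looming-shape stream: units AB and CD concatenated in
# varying order, mimicking a habituation sequence.
stream = list("ABCDABABCDCDABCD")
tp = transitional_probabilities(stream)
```

In this stream the within-unit transitions (A→B, C→D) have probability 1.0 while between-unit transitions are lower, which is exactly the statistical boundary cue a segmenting learner could exploit.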

  3. The Application of Statistics Education Research in My Classroom

    ERIC Educational Resources Information Center

    Jordan, Joy

    2007-01-01

    A collaborative, statistics education research project (Lovett, 2001) is discussed. Some results of the project were applied in the computer lab sessions of my elementary statistics course. I detail the process of applying these research results, as well as the use of knowledge surveys. Furthermore, I give general suggestions to teachers who want…

  4. Statistical equilibrium in cometary C2. II - Swan/Phillips band ratios

    NASA Technical Reports Server (NTRS)

    Swamy, K. S. K.; Odell, C. R.

    1979-01-01

    Statistical equilibrium calculations have been made for both the triplet and ground state singlets for C2 in comets, using the exchange rate as a free parameter. The predictions of the results are consistent with optical observations and may be tested definitively by accurate observations of the Phillips and Swan band ratios. Comparison with the one reported observation indicates compatibility with a low exchange rate and resonance fluorescence statistical equilibrium.

  5. Scintillation properties of Eu2+-doped KBa2I5 and K2BaI4

    DOE PAGES

    Stand, L.; Zhuravleva, M.; Chakoumakos, Bryan C.; ...

    2015-09-25

    We report two new ternary metal halide scintillators, KBa2I5 and K2BaI4, activated with divalent europium. Single crystal X-ray diffraction measurements confirmed that KBa2I5 has a monoclinic structure (P21/c) and that K2BaI4 has a rhombohedral structure (R3c). Differential scanning calorimetry showed singular melting and crystallization points, making these compounds viable candidates for melt growth. We grew 13 mm diameter single crystals of KBa2I5:Eu2+ and K2BaI4:Eu2+ in evacuated quartz ampoules via the vertical Bridgman technique. The optimal Eu2+ concentration was 4% for KBa2I5 and 7% for K2BaI4. The X-ray excited emissions at 444 nm for KBa2I5:Eu 4% and 448 nm for K2BaI4:Eu 7% arise from the 5d-4f radiative transition in Eu2+. KBa2I5:Eu 4% has a light yield of 90,000 photons/MeV, with an energy resolution of 2.4%, and K2BaI4:Eu 7% has a light yield of 63,000 ph/MeV, with an energy resolution of 2.9% at 662 keV. Both crystals have an excellent proportional response to a wide range of gamma-ray energies.

  6. Why Wait? The Influence of Academic Self-Regulation, Intrinsic Motivation, and Statistics Anxiety on Procrastination in Online Statistics

    ERIC Educational Resources Information Center

    Dunn, Karee

    2014-01-01

    Online graduate education programs are expanding rapidly. Many of these programs require a statistics course, resulting in an increasing need for online statistics courses. The study reported here grew from experiences teaching online, graduate statistics courses. In seeking answers on how to improve this class, I discovered that research has yet…

  7. Recommendations for describing statistical studies and results in general readership science and engineering journals.

    PubMed

    Gardenier, John S

    2012-12-01

    This paper recommends how authors of statistical studies can communicate to general audiences fully, clearly, and comfortably. The studies may use statistical methods to explore issues in science, engineering, and society or they may address issues in statistics specifically. In either case, readers without explicit statistical training should have no problem understanding the issues, the methods, or the results at a non-technical level. The arguments for those results should be clear, logical, and persuasive. This paper also provides advice for editors of general journals on selecting high quality statistical articles without the need for exceptional work or expense. Finally, readers are also advised to watch out for some common errors or misuses of statistics that can be detected without a technical statistical background.

  8. Why am I not disabled? Making state subjects, making statistics in post-Mao China.

    PubMed

    Kohrman, Matthew

    2003-03-01

    In this article I examine how and why disability was defined and statistically quantified by China's party-state in the late 1980s. I describe the unfolding of a particular epidemiological undertaking--China's 1987 National Sample Survey of Disabled Persons--as well as the ways the survey was an extension of what Ian Hacking has called modernity's "avalanche of numbers." I argue that, to a large degree, what fueled and shaped the 1987 survey's codification and quantification of disability was how Chinese officials were incited to shape their own identities as they negotiated an array of social, political, and ethical forces, which were at once national and transnational in orientation.

  9. Business Statistics Education: Content and Software in Undergraduate Business Statistics Courses.

    ERIC Educational Resources Information Center

    Tabatabai, Manouchehr; Gamble, Ralph

    1997-01-01

    Survey responses from 204 of 500 business schools identified most often topics in business statistics I and II courses. The most popular software at both levels was Minitab. Most schools required both statistics I and II. (SK)

  10. Role of the VDR Bsm I and Apa I polymorphisms in the risk of colorectal cancer in Kashmir.

    PubMed

    Rasool, Sabha; Kadla, Showkat A; Rasool, Vamiq; Qazi, Falak; Khan, Tanzeela; Shah, Nisar A; Ganai, Bashir A

    2014-01-01

    A case-control study aiming to evaluate the relationship between Bsm I and Apa I restriction fragment gene polymorphisms and colorectal cancer (CRC) was carried out in Kashmir, including a total of 368 subjects (180 cases and 188 controls). DNA samples extracted from the blood of the subjects were analyzed for 3' untranslated region (3' UTR) Apa I and Bsm I polymorphisms using restriction fragment length polymorphism-polymerase chain reaction (RFLP-PCR). A statistically significant 2.7-fold increased risk was observed in individuals found homozygous for the presence of the 'b' allele, in comparison to subjects homozygous for the 'B' allele (odds ratio (OR) 2.7, 95% confidence interval (CI) 1.49-4.86 (Bsm I)), and a statistically insignificant 2-fold increased risk was found among individuals with the 'aa' genotype, as compared to subjects with the 'AA' genotype (OR 2.017, 95% CI 0.86-4.7). Our study also yielded statistically significant results when the Apa I polymorphism was stratified by age (≤ 50 years) and dwelling area (rural area), and the Bsm I polymorphism by gender (male gender), suggesting a possible role of Apa I and Bsm I polymorphisms in the etiology of CRC in Kashmir. We conclude that Apa I and Bsm I single-nucleotide polymorphisms (SNPs) in the vitamin D receptor gene (VDR) might be associated with susceptibility to CRC among Kashmiris. © 2014 S. Karger GmbH, Freiburg.
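The genotype comparisons above rest on standard 2x2 contingency-table arithmetic. As a rough illustration of how an odds ratio and its Wald 95% confidence interval are obtained (the counts below are hypothetical, not the study's data), a minimal sketch:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts for, e.g., 'bb' vs 'BB' genotype carriers
or_, lo, hi = odds_ratio_ci(40, 20, 60, 80)
```

A CI excluding 1 corresponds to the "statistically significant" calls in the abstract; one straddling 1 (as for the 'aa' vs 'AA' comparison) does not.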

  11. Agile Text Mining for the 2014 i2b2/UTHealth Cardiac Risk Factors Challenge

    PubMed Central

    Cormack, James; Nath, Chinmoy; Milward, David; Raja, Kalpana; Jonnalagadda, Siddhartha R

    2016-01-01

    This paper describes the use of an agile text mining platform (Linguamatics’ Interactive Information Extraction Platform, I2E) to extract document-level cardiac risk factors in patient records as defined in the i2b2/UTHealth 2014 Challenge. The approach uses a data-driven rule-based methodology with the addition of a simple supervised classifier. We demonstrate that agile text mining allows for rapid optimization of extraction strategies, while post-processing can leverage annotation guidelines, corpus statistics and logic inferred from the gold standard data. We also show how data imbalance in a training set affects performance. Evaluation of this approach on the test data gave an F-Score of 91.7%, one percent behind the top performing system. PMID:26209007

  12. Dissolution curve comparisons through the F(2) parameter, a Bayesian extension of the f(2) statistic.

    PubMed

    Novick, Steven; Shen, Yan; Yang, Harry; Peterson, John; LeBlond, Dave; Altan, Stan

    2015-01-01

    Dissolution (or in vitro release) studies constitute an important aspect of pharmaceutical drug development. One important use of such studies is justifying a biowaiver for post-approval changes, which requires establishing equivalence between the new and old products. We propose a statistically rigorous modeling approach for this purpose based on the estimation of what we refer to as the F2 parameter, an extension of the commonly used f2 statistic. A Bayesian test procedure is proposed in relation to a set of composite hypotheses that capture the similarity requirement on the absolute mean differences between test and reference dissolution profiles. Several examples are provided to illustrate the application. Results of our simulation study comparing the performance of f2 and the proposed method show that our Bayesian approach is comparable to, or in many cases superior to, the f2 statistic as a decision rule. Further useful extensions of the method, such as the use of continuous-time dissolution modeling, are considered.
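For context, the classical f2 similarity factor that the F2 parameter extends is a log-transformed mean squared difference between the reference and test dissolution profiles; f2 >= 50 is the conventional similarity criterion. A minimal sketch (the profiles are invented numbers, not data from the paper):

```python
import math

def f2_statistic(ref, test):
    """Classical f2 similarity factor between two dissolution profiles
    (percent dissolved at matched time points). Identical profiles
    give f2 = 100; f2 >= 50 is the usual similarity criterion."""
    n = len(ref)
    msd = sum((r - t) ** 2 for r, t in zip(ref, test)) / n  # mean squared difference
    return 50.0 * math.log10(100.0 / math.sqrt(1.0 + msd))

# Hypothetical profiles: percent dissolved at 10, 20, 30, 45 min
f2 = f2_statistic([20, 40, 60, 80], [18, 38, 61, 79])
```

The Bayesian F2 parameter of the abstract replaces this point statistic with a model-based estimate; the sketch only shows the quantity being extended.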

  13. RELAP5-3D Results for Phase I (Exercise 2) of the OECD/NEA MHTGR-350 MW Benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhard Strydom

    2012-06-01

    The coupling of the PHISICS code suite to the thermal hydraulics system code RELAP5-3D has recently been initiated at the Idaho National Laboratory (INL) to provide a fully coupled prismatic Very High Temperature Reactor (VHTR) system modeling capability as part of the NGNP methods development program. The PHISICS code consists of three modules: INSTANT (performing 3D nodal transport core calculations), MRTAU (depletion and decay heat generation) and a perturbation/mixer module. As part of the verification and validation activities, steady state results have been obtained for Exercise 2 of Phase I of the newly-defined OECD/NEA MHTGR-350 MW Benchmark. This exercise requires participants to calculate a steady-state solution for an End of Equilibrium Cycle 350 MW Modular High Temperature Reactor (MHTGR), using the provided geometry, material, and coolant bypass flow description. The paper provides an overview of the MHTGR Benchmark and presents typical steady state results (e.g. solid and gas temperatures, thermal conductivities) for Phase I Exercise 2. Preliminary results are also provided for the early test phase of Exercise 3 using a two-group cross-section library and the RELAP5-3D model developed for Exercise 2.

  14. RELAP5-3D results for phase I (Exercise 2) of the OECD/NEA MHTGR-350 MW benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strydom, G.; Epiney, A. S.

    2012-07-01

    The coupling of the PHISICS code suite to the thermal hydraulics system code RELAP5-3D has recently been initiated at the Idaho National Laboratory (INL) to provide a fully coupled prismatic Very High Temperature Reactor (VHTR) system modeling capability as part of the NGNP methods development program. The PHISICS code consists of three modules: INSTANT (performing 3D nodal transport core calculations), MRTAU (depletion and decay heat generation) and a perturbation/mixer module. As part of the verification and validation activities, steady state results have been obtained for Exercise 2 of Phase I of the newly-defined OECD/NEA MHTGR-350 MW Benchmark. This exercise requires participants to calculate a steady-state solution for an End of Equilibrium Cycle 350 MW Modular High Temperature Reactor (MHTGR), using the provided geometry, material, and coolant bypass flow description. The paper provides an overview of the MHTGR Benchmark and presents typical steady state results (e.g. solid and gas temperatures, thermal conductivities) for Phase I Exercise 2. Preliminary results are also provided for the early test phase of Exercise 3 using a two-group cross-section library and the RELAP5-3D model developed for Exercise 2. (authors)

  15. In situ reaction mechanism studies on the Ti(NMe2)2(OiPr)2-D2O and Ti(OiPr)3[MeC(NiPr)2]-D2O atomic layer deposition processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tomczak, Yoann, E-mail: yoann.tomczak@helsinki.fi; Knapas, Kjell; Leskelä, Markku

    2014-01-15

    Reaction mechanisms in the Ti(NMe2)2(OiPr)2-D2O and Ti(OiPr)3[MeC(NiPr)2] [also written Ti(OiPr)3(NiPr-Me-amd)]-D2O atomic layer deposition processes were studied in situ with quartz crystal microbalance (QCM) and quadrupole mass spectrometry (QMS) at 275 °C. For the Ti(NMe2)2(OiPr)2-D2O process, both QCM and QMS results indicated adsorption of the Ti(NMe2)2(OiPr)2 molecule through an exchange of at least one of its -NMe2 ligands with surface hydroxyl groups. Regarding the Ti(OiPr)3(NiPr-Me-amd)-D2O process, a mismatch between the QCM and QMS results revealed more complex reactions: decomposition of the MeC(NiPr)2 (NiPr-Me-amd) ligand is suggested by the shape of the QCM data and by the intensity of the QMS signals belonging to fragments of this ligand. A simple calculation model associating the growth rate per cycle of a crystalline film with the surface area occupied by the ligands remaining after saturation was also used to support the decomposition of the ligand. The observed high growth rate is incompatible with the whole ligand remaining on the surface.

  16. Thermal annealing and pressure effects on BaFe2-xCoxAs2 single crystals.

    PubMed

    Shin, Dongwon; Jung, Soon-Gil; Prathiba, G; Seo, Soonbeom; Choi, Ki-Young; Kim, Kee Hoon; Park, Tuson

    2017-11-26

    We investigate the pressure and thermal annealing effects on BaFe2-xCoxAs2 (Co-Ba122) single crystals with x = 0.1 and 0.17 via electrical transport measurements. The thermal annealing treatment not only enhances the superconducting transition temperature (Tc) from 9.6 to 12.7 K for x = 0.1 and from 18.1 to 21.0 K for x = 0.17, but also increases the antiferromagnetic transition temperature (TN). Simultaneous enhancement of Tc and TN by the thermal annealing treatment indicates that thermal annealing could substantially improve the quality of the Co-doped Ba122 samples. Interestingly, Tc of the Co-Ba122 compounds shows a scaling behavior with a linear dependence on the resistivity value at 290 K, irrespective of tuning parameters such as chemical doping, pressure, and thermal annealing. These results not only provide an effective way to access the intrinsic properties of the BaFe2As2 system, but may also shed light on the design of new materials with higher superconducting transition temperatures. © 2017 IOP Publishing Ltd.

  17. Agile text mining for the 2014 i2b2/UTHealth Cardiac risk factors challenge.

    PubMed

    Cormack, James; Nath, Chinmoy; Milward, David; Raja, Kalpana; Jonnalagadda, Siddhartha R

    2015-12-01

    This paper describes the use of an agile text mining platform (Linguamatics' Interactive Information Extraction Platform, I2E) to extract document-level cardiac risk factors in patient records as defined in the i2b2/UTHealth 2014 challenge. The approach uses a data-driven rule-based methodology with the addition of a simple supervised classifier. We demonstrate that agile text mining allows for rapid optimization of extraction strategies, while post-processing can leverage annotation guidelines, corpus statistics and logic inferred from the gold standard data. We also show how data imbalance in a training set affects performance. Evaluation of this approach on the test data gave an F-Score of 91.7%, one percent behind the top performing system. Copyright © 2015 Elsevier Inc. All rights reserved.
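The F-Score reported here is the harmonic mean of precision and recall over the extracted risk-factor annotations. A minimal computation (the counts are hypothetical, chosen only to illustrate the arithmetic):

```python
def f_score(tp, fp, fn):
    """F1 score: harmonic mean of precision and recall,
    algebraically equal to 2*tp / (2*tp + fp + fn)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# hypothetical extraction counts (true positives, false positives, false negatives)
f1 = f_score(tp=917, fp=80, fn=86)
```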

  18. Detecting trends in raptor counts: power and type I error rates of various statistical tests

    USGS Publications Warehouse

    Hatfield, J.S.; Gould, W.R.; Hoover, B.A.; Fuller, M.R.; Lindquist, E.L.

    1996-01-01

    We conducted simulations that estimated power and type I error rates of statistical tests for detecting trends in raptor population count data collected from a single monitoring site. Results of the simulations were used to help analyze count data of bald eagles (Haliaeetus leucocephalus) from 7 national forests in Michigan, Minnesota, and Wisconsin during 1980-1989. Seven statistical tests were evaluated, including simple linear regression on the log scale and linear regression with a permutation test. Using 1,000 replications each, we simulated n = 10 and n = 50 years of count data and trends ranging from -5 to 5% change/year. We evaluated the tests at 3 critical levels (alpha = 0.01, 0.05, and 0.10) for both upper- and lower-tailed tests. Exponential count data were simulated by adding sampling error with a coefficient of variation of 40% from either a log-normal or autocorrelated log-normal distribution. Not surprisingly, tests performed with 50 years of data were much more powerful than tests with 10 years of data. Positive autocorrelation inflated alpha-levels upward from their nominal levels, making the tests less conservative and more likely to reject the null hypothesis of no trend. Of the tests studied, Cox and Stuart's test and Pollard's test clearly had lower power than the others. Surprisingly, the linear regression t-test, Collins' linear regression permutation test, and the nonparametric Lehmann's and Mann's tests all had similar power in our simulations. Analyses of the count data suggested that bald eagles had increasing trends on at least 2 of the 7 national forests during 1980-1989.
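A stripped-down version of this kind of power simulation (log-normal counts, upper-tailed linear-regression t-test on the log scale) can be written with the standard library alone. The parameter choices below are illustrative, not the paper's exact design, and autocorrelation is omitted:

```python
import math
import random

def trend_ttest_power(annual_change, n_years=10, cv=0.40, reps=500, seed=42):
    """Monte Carlo power of an upper-tailed regression t-test for an
    exponential trend (proportional change/year) in log counts.
    t_crit below is t(0.95, df=8) and matches n_years=10 only."""
    rng = random.Random(seed)
    sigma = math.sqrt(math.log(1.0 + cv * cv))  # log-scale sd for the given CV
    years = list(range(n_years))
    xbar = sum(years) / n_years
    sxx = sum((x - xbar) ** 2 for x in years)
    t_crit = 1.860  # upper 5% point of t with df = n_years - 2 = 8
    rejections = 0
    for _ in range(reps):
        y = [math.log(1.0 + annual_change) * x + rng.gauss(0.0, sigma)
             for x in years]
        ybar = sum(y) / n_years
        b = sum((x - xbar) * (yi - ybar) for x, yi in zip(years, y)) / sxx
        resid = [yi - ybar - b * (x - xbar) for x, yi in zip(years, y)]
        s2 = sum(r * r for r in resid) / (n_years - 2)  # residual variance
        if b / math.sqrt(s2 / sxx) > t_crit:
            rejections += 1
    return rejections / reps
```

With only 10 years and a 40% CV, even a 5%/year trend yields modest power, consistent with the abstract's contrast between the 10-year and 50-year scenarios.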

  19. The Statistic Results of the ISUAL Lightning Survey

    NASA Astrophysics Data System (ADS)

    Chuang, Chia-Wen; Bing-Chih Chen, Alfred; Liu, Tie-Yue; Lin, Shin-Fa; Su, Han-Tzong; Hsu, Rue-Ron

    2017-04-01

    The ISUAL (Imager for Sprites and Upper Atmospheric Lightning) instrument onboard FORMOSAT-2 is the first science payload dedicated to the study of lightning-induced transient luminous events (TLEs). Transient events, including TLEs and lightning, were recorded simultaneously by the intensified imager, the spectrophotometer (SP), and the array photometer (AP) whenever the light variation observed by the SP exceeded a programmed threshold. ISUAL therefore surveys not only TLEs but also lightning globally, with good spatial, temporal and spectral resolution. Over the past 12 years (2004-2016), approximately 300,000 transient events were registered, of which only 42,000 were classified as TLEs. Since the main mission objective is to explore the distribution and characteristics of TLEs, the remaining transient events, mainly lightning, can serve as a long-term global lightning survey. This huge number of events cannot be processed manually as the TLEs were, so a data pipeline was developed to scan for lightning patterns and to derive their geolocations with an efficient algorithm. The 12-year statistical results, including occurrence rates, global distribution, seasonal variation, and a comparison with the LIS/OTD survey, are presented in this report.

  20. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates calculates the statistical value that corresponds to a cumulative probability value, given the sample mean and standard deviation of the normal distribution. Normal Distribution from Two Data Points extends and generates a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA): one with no replication, and one generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA curve-fits data to the linear equation y = f(x) and performs an ANOVA to check its significance.
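Several of these spreadsheet calculations are now one-liners in Python's standard library; for instance, the Normal Distribution Estimates step (the value corresponding to a given cumulative probability, from a sample mean and standard deviation) can be sketched as follows, with made-up data values:

```python
from statistics import NormalDist, mean, stdev

data = [9.1, 10.2, 9.8, 10.5, 9.6, 10.1]  # hypothetical measurements
mu, sd = mean(data), stdev(data)          # descriptive statistics

# value corresponding to a cumulative probability of 0.975
# under the normal distribution fitted to the sample
x975 = NormalDist(mu, sd).inv_cdf(0.975)
```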

  1. Admixture, Population Structure, and F-Statistics.

    PubMed

    Peter, Benjamin M

    2016-04-01

    Many questions about human genetic history can be addressed by examining the patterns of shared genetic variation between sets of populations. A useful methodological framework for this purpose is F-statistics, which measure shared genetic drift between sets of two, three, and four populations and can be used to test simple and complex hypotheses about admixture between populations. This article provides context from phylogenetic and population genetic theory. I review how F-statistics can be interpreted as branch lengths or paths and derive new interpretations using coalescent theory. I further show that the admixture tests can be interpreted as testing general properties of phylogenies, allowing some of these ideas and applications to be extended to arbitrary phylogenetic trees. The new results are used to investigate the behavior of the statistics under different models of population structure and to show how population substructure complicates inference. The results lead to simplified estimators in many cases, and I recommend replacing F3 with the average number of pairwise differences for estimating population divergence. Copyright © 2016 by the Genetics Society of America.
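The admixture test built on F3 reduces to simple allele-frequency arithmetic: F3(X; A, B) is the average over loci of (pX - pA)(pX - pB), and a significantly negative value signals that X is admixed between populations related to A and B. A naive sketch (toy frequencies, and ignoring the finite-sample bias correction a real estimator needs):

```python
def f3(px, pa, pb):
    """Naive F3(X; A, B): mean over loci of (px - pa)(px - pb).
    px, pa, pb are per-locus allele frequencies for populations
    X, A, B. No sample-size bias correction is applied."""
    return sum((x - a) * (x - b) for x, a, b in zip(px, pa, pb)) / len(px)

# Toy example: X's frequency sits between A's and B's at every locus,
# as expected for an admixed population, so F3 comes out negative.
val = f3([0.5, 0.4], [0.3, 0.2], [0.7, 0.6])
```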

  2. Statistical controversies in clinical research: requiem for the 3 + 3 design for phase I trials.

    PubMed

    Paoletti, X; Ezzalfani, M; Le Tourneau, C

    2015-09-01

    More than 95% of published phase I trials have used the 3 + 3 design to identify the dose to be recommended for phase II trials. However, the statistical community agrees on the limitations of the 3 + 3 design compared with model-based approaches. Moreover, the mechanisms of action of targeted agents strongly challenge the hypothesis that the maximum tolerated dose constitutes the optimal dose, and more outcomes, including clinical and biological activity, increasingly need to be taken into account to identify the optimal dose. We review key elements from clinical publications and from the statistical literature to show that the 3 + 3 design lacks the necessary flexibility to address the challenges of targeted agents. The design issues raised by expansion cohorts, new definitions of dose-limiting toxicity and trials of combinations are not easily addressed by the 3 + 3 design or its extensions. Alternative statistical proposals have been developed to make better use of the complex data generated by phase I trials. Their application requires a close collaboration between all actors of early-phase clinical trials. © The Author 2015. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.
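The rule set being critiqued is mechanical enough to simulate in a few lines, which is partly why its operating characteristics are so well documented. A simplified sketch of one 3 + 3 escalation (true toxicity probabilities are hypothetical, and real protocols add de-escalation and stopping refinements):

```python
import random

def three_plus_three(tox_probs, seed=1):
    """Simulate one simplified 3+3 dose-escalation trial.
    tox_probs[i] = assumed true probability of a dose-limiting
    toxicity (DLT) at dose level i. Returns the recommended dose
    level index, or None if even the lowest dose is too toxic."""
    rng = random.Random(seed)
    level = 0
    while True:
        dlts = sum(rng.random() < tox_probs[level] for _ in range(3))
        if dlts == 0:                      # 0/3 DLTs: escalate
            if level == len(tox_probs) - 1:
                return level
            level += 1
        elif dlts == 1:                    # 1/3: treat 3 more at same dose
            dlts += sum(rng.random() < tox_probs[level] for _ in range(3))
            if dlts <= 1:                  # <=1/6: escalate
                if level == len(tox_probs) - 1:
                    return level
                level += 1
            else:                          # >=2/6: MTD is the previous dose
                return level - 1 if level > 0 else None
        else:                              # >=2/3: MTD is the previous dose
            return level - 1 if level > 0 else None
```

Note how the recommendation depends only on small cohorts of 3-6 patients per dose, which is exactly the inflexibility the abstract argues against.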

  3. BrightStat.com: free statistics online.

    PubMed

    Stricker, Daniel

    2008-10-01

    Powerful software for statistical analysis is expensive. Here I present BrightStat, statistical software running on the Internet that is free of charge. BrightStat's goals and its main capabilities and functionalities are outlined. Three different sample runs, a Friedman test, a chi-square test, and a step-wise multiple regression, are presented. The results obtained by BrightStat are compared with results computed by SPSS, one of the global leaders in statistical software, and by VassarStats, a collection of scripts for data analysis running on the Internet. Elementary statistics is an inherent part of academic education, and BrightStat is an alternative to commercial products.
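One of the sample runs mentioned, the Friedman test, reduces to ranking each subject's scores across conditions and summing the rank columns. A minimal stdlib sketch (invented data, and no handling of tied ranks):

```python
def friedman_statistic(rows):
    """Friedman chi-square statistic for a subjects-by-treatments table.
    Each row is one subject's scores across k treatments; ranks are
    assigned within each row (1 = smallest). Assumes no ties."""
    n, k = len(rows), len(rows[0])
    rank_sums = [0] * k
    for row in rows:
        order = sorted(range(k), key=lambda j: row[j])  # columns by value
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank
    return (12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums)
            - 3.0 * n * (k + 1))

# 4 subjects x 3 treatments, perfectly ordered -> maximal statistic
stat = friedman_statistic([[1, 2, 3], [2, 4, 6], [1, 3, 5], [0, 1, 2]])
```

The statistic is referred to a chi-square distribution with k - 1 degrees of freedom for moderate n; packages like SPSS or BrightStat add tie corrections and exact p-values.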

  4. Proof of the Spin Statistics Connection 2: Relativistic Theory

    NASA Astrophysics Data System (ADS)

    Santamato, Enrico; De Martini, Francesco

    2017-12-01

    The traditional standard theory of quantum mechanics is unable to solve the spin-statistics problem, i.e. to justify the utterly important "Pauli Exclusion Principle", except by adopting the complex standard relativistic quantum field theory. In a recent paper (Santamato and De Martini in Found Phys 45(7):858-873, 2015) we presented a proof of the spin-statistics connection in the nonrelativistic approximation on the basis of the "Conformal Quantum Geometrodynamics". In the present paper, by the same theory, the proof of the spin-statistics theorem is extended to the relativistic domain in the general scenario of curved spacetime. The relativistic approach allows us to formulate a manifestly step-by-step Weyl gauge invariant theory and to emphasize some fundamental aspects of group theory in the demonstration. No relativistic quantum field operators are used, and the particle exchange properties are drawn from the conservation of the intrinsic helicity of elementary particles. It is therefore this property, not considered in standard quantum mechanics, which determines the correct spin-statistics connection observed in Nature (Santamato and De Martini in Found Phys 45(7):858-873, 2015). The present proof of the spin-statistics theorem is simpler than the one presented in Santamato and De Martini (Found Phys 45(7):858-873, 2015), because it is based on symmetry group considerations only, without having recourse to frames attached to the particles. Second quantization and anticommuting operators are not necessary.

  5. ``Models'' CAVEAT EMPTOR!!!: ``Toy Models Too-Often Yield Toy-Results''!!!: Statistics, Polls, Politics, Economics, Elections!!!: GRAPH/Network-Physics: ``Equal-Distribution for All'' TRUMP-ED BEC ``Winner-Take-All'' ``Doctor Livingston I Presume?''

    NASA Astrophysics Data System (ADS)

    Preibus-Norquist, R. N. C.-Grover; Bush-Romney, G. W.-Willard-Mitt; Dimon, J. P.; Adelson-Koch, Sheldon-Charles-David-Sheldon; Krugman-Axelrod, Paul-David; Siegel, Edward Carl-Ludwig; D. N. C./O. F. P./''47''%/50% Collaboration; R. N. C./G. O. P./''53''%/49% Collaboration; Nyt/Wp/Cnn/Msnbc/Pbs/Npr/Ft Collaboration; Ftn/Fnc/Fox/Wsj/Fbn Collaboration; Lb/Jpmc/Bs/Boa/Ml/Wamu/S&P/Fitch/Moodys/Nmis Collaboration

    2013-03-01

    ``Models''? CAVEAT EMPTOR!!!: ``Toy Models Too-Often Yield Toy-Results''!!!: Goldenfeld[``The Role of Models in Physics'', in Lects.on Phase-Transitions & R.-G.(92)-p.32-33!!!]: statistics(Silver{[NYTimes; Bensinger, ``Math-Geerks Clearly-Defeated Pundits'', LATimes, (11/9/12)])}, polls, politics, economics, elections!!!: GRAPH/network/net/...-PHYSICS Barabasi-Albert[RMP (02)] (r,t)-space VERSUS(???) [Where's the Inverse/ Dual/Integral-Transform???] (Benjamin)Franklin(1795)-Fourier(1795; 1897;1822)-Laplace(1850)-Mellin (1902) Brillouin(1922)-...(k,)-space, {Hubbard [The World According to Wavelets,Peters (96)-p.14!!!/p.246: refs.-F2!!!]},and then (2) Albert-Barabasi[]Bose-Einstein quantum-statistics(BEQS) Bose-Einstein CONDENSATION (BEC) versus Bianconi[pvt.-comm.; arXiv:cond-mat/0204506; ...] -Barabasi [???] Fermi-Dirac

  6. Quantifying economic fluctuations by adapting methods of statistical physics

    NASA Astrophysics Data System (ADS)

    Plerou, Vasiliki

    2001-09-01

    The first focus of this thesis is the investigation of cross-correlations between the price fluctuations of different stocks using the conceptual framework of random matrix theory (RMT), developed in physics to describe the statistical properties of energy-level spectra of complex nuclei. RMT makes predictions for the statistical properties of matrices that are universal, i.e., do not depend on the interactions between the elements comprising the system. In physical systems, deviations from the predictions of RMT provide clues regarding the mechanisms controlling the dynamics of a given system, so this framework is of potential value if applied to economic systems. This thesis compares the statistics of the cross-correlation matrix C (whose elements Cij are the correlation coefficients of the price fluctuations of stocks i and j) against the "null hypothesis" of a random matrix having the same symmetry properties. It is shown that comparison of the eigenvalue statistics of C with RMT results can be used to distinguish random and non-random parts of C. The non-random part of C, which deviates from RMT results, provides information regarding genuine cross-correlations between stocks. The interpretations and potential practical utility of these deviations are also investigated. The second focus is the characterization of the dynamics of stock price fluctuations. The statistical properties of the changes GΔt in price over a time interval Δt are quantified, and the statistical relation between GΔt and the trading activity, measured by the number of transactions NΔt in the interval Δt, is investigated. The statistical properties of the volatility, i.e., the time-dependent standard deviation of price fluctuations, are related to two microscopic quantities: NΔt and the variance W²Δt of the price changes for all transactions in the interval Δt. In addition, the statistical relationship between GΔt and the number of

  7. Statistical inference from multiple iTRAQ experiments without using common reference standards.

    PubMed

    Herbrich, Shelley M; Cole, Robert N; West, Keith P; Schulze, Kerry; Yager, James D; Groopman, John D; Christian, Parul; Wu, Lee; O'Meally, Robert N; May, Damon H; McIntosh, Martin W; Ruczinski, Ingo

    2013-02-01

    Isobaric tags for relative and absolute quantitation (iTRAQ) is a prominent mass spectrometry technology for protein identification and quantification that is capable of analyzing multiple samples in a single experiment. Frequently, iTRAQ experiments are carried out using an aliquot from a pool of all samples, or "masterpool", in one of the channels as a reference sample standard to estimate protein relative abundances in the biological samples and to combine abundance estimates from multiple experiments. In this manuscript, we show that using a masterpool is counterproductive. We obtain more precise estimates of protein relative abundance by using the available biological data instead of the masterpool and do not need to occupy a channel that could otherwise be used for another biological sample. In addition, we introduce a simple statistical method to associate proteomic data from multiple iTRAQ experiments with a numeric response and show that this approach is more powerful than the conventionally employed masterpool-based approach. We illustrate our methods using data from four replicate iTRAQ experiments on aliquots of the same pool of plasma samples and from a 406-sample project designed to identify plasma proteins that covary with nutrient concentrations in chronically undernourished children from South Asia.

  8. 30 CFR 250.192 - What reports and statistics must I submit relating to a hurricane, earthquake, or other natural...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... relating to a hurricane, earthquake, or other natural occurrence? 250.192 Section 250.192 Mineral Resources... What reports and statistics must I submit relating to a hurricane, earthquake, or other natural..., such as a hurricane, a tropical storm, or an earthquake. Statistics include facilities and rigs...

  9. 30 CFR 250.192 - What reports and statistics must I submit relating to a hurricane, earthquake, or other natural...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... relating to a hurricane, earthquake, or other natural occurrence? 250.192 Section 250.192 Mineral Resources... What reports and statistics must I submit relating to a hurricane, earthquake, or other natural..., such as a hurricane, a tropical storm, or an earthquake. Statistics include facilities and rigs...

  10. 30 CFR 250.192 - What reports and statistics must I submit relating to a hurricane, earthquake, or other natural...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... relating to a hurricane, earthquake, or other natural occurrence? 250.192 Section 250.192 Mineral Resources... What reports and statistics must I submit relating to a hurricane, earthquake, or other natural..., such as a hurricane, a tropical storm, or an earthquake. Statistics include facilities and rigs...

  11. Mammalian Vestibular Macular Synaptic Plasticity: Results from SLS-2 Spaceflight

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.D.

    1994-01-01

    The effects of exposure to microgravity were studied in rat utricular maculas collected inflight (IF, day 13), post-flight on the day of orbiter landing (day 14, R+0) and after 14 days (R+ML). Controls were collected at corresponding times. The objectives were 1) to learn whether hair cell ribbon synapse counts would be higher in tissues collected in space than in tissues collected postflight during or after readaptation to Earth's gravity; and 2) to compare results with those of SLS-1. Maculas were fixed by immersion, micro-dissected, dehydrated and prepared for ultrastructural study by usual methods. Synapses were counted in 100 serial sections 150 nm thick and were located to specific hair cells in montages of every 7th section. Counts were analyzed for statistical significance using analysis of variance. Results in maculas of IF dissected rats, one 13-day control (IFC), and one R+0 rat have been analyzed. Study of an R+ML macula is nearly completed. For type I cells, the IF mean is 2.3 +/- 1.6; the IFC mean is 1.6 +/- 1.0; the R+0 mean is 2.3 +/- 1.6. For type II cells, the IF mean is 11.4 +/- 17.1; the IFC mean is 5.5 +/- 3.5; the R+0 mean is 10.1 +/- 7.4. The difference between IF and IFC means for type I cells is statistically significant (p < 0.0464). For type II cells, comparing IF to IFC means, p < 0.0003; and comparing IFC to R+0 means, p < 0.0139. Shifts toward spheres (p < 0.0001) and pairs (p < 0.0139) were significant in type II cells of IF rats. The results largely replicate findings from SLS-1 and indicate that spaceflight affects synaptic number, form and distribution, particularly in type II hair cells. The increases in synaptic number and in sphere-like ribbons are interpreted to improve synaptic efficacy, to help return afferent discharges to a more normal state. Findings indicate that a great capacity for synaptic plasticity exists in mammalian gravity sensors, and that this plasticity is more dominant in the local circuitry. The

  12. Probability distributions of linear statistics in chaotic cavities and associated phase transitions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vivo, Pierpaolo; Majumdar, Satya N.; Bohigas, Oriol

    2010-03-01

    We establish large deviation formulas for linear statistics on the N transmission eigenvalues (T_i) of a chaotic cavity, in the framework of random matrix theory. Given any linear statistic of interest A = Σ_{i=1}^{N} a(T_i), the probability distribution P_A(A, N) of A generically satisfies the large deviation formula lim_{N→∞} [−2 log P_A(Nx, N) / (βN²)] = Ψ_A(x), where Ψ_A(x) is a rate function that we compute explicitly in many cases (conductance, shot noise, and moments) and β corresponds to different symmetry classes. Using these large deviation expressions, it is possible to recover known results easily and to produce new formulas, such as a closed-form expression for v(n) = lim_{N→∞} var(T_n) (where T_n = Σ_i T_i^n) for arbitrary integer n. The universal limit v* = lim_{n→∞} v(n) = 1/(2πβ) is also computed exactly. The distributions display a central Gaussian region flanked on both sides by non-Gaussian tails. At the junction of the two regimes, weakly nonanalytical points appear, a direct consequence of phase transitions in an associated Coulomb gas problem. Numerical checks are also provided, which are in full agreement with our asymptotic results in both real and Laplace space even for moderately small N. Part of the results have been announced by Vivo et al. [Phys. Rev. Lett. 101, 216809 (2008)].

  13. An R2 statistic for fixed effects in the linear mixed model.

    PubMed

    Edwards, Lloyd J; Muller, Keith E; Wolfinger, Russell D; Qaqish, Bahjat F; Schabenberger, Oliver

    2008-12-20

    Statisticians most often use the linear mixed model to analyze Gaussian longitudinal data. The value and familiarity of the R(2) statistic in the linear univariate model naturally creates great interest in extending it to the linear mixed model. We define and describe how to compute a model R(2) statistic for the linear mixed model by using only a single model. The proposed R(2) statistic measures multivariate association between the repeated outcomes and the fixed effects in the linear mixed model. The R(2) statistic arises as a 1-1 function of an appropriate F statistic for testing all fixed effects (except typically the intercept) in a full model. The statistic compares the full model with a null model with all fixed effects deleted (except typically the intercept) while retaining exactly the same covariance structure. Furthermore, the R(2) statistic leads immediately to a natural definition of a partial R(2) statistic. A mixed model in which ethnicity gives a very small p-value as a longitudinal predictor of blood pressure (BP) compellingly illustrates the value of the statistic. In sharp contrast to the extreme p-value, a very small R(2), a measure of statistical and scientific importance, indicates that ethnicity has an almost negligible association with the repeated BP outcomes for the study.
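The "1-1 function of an appropriate F statistic" has the same shape as in univariate regression: with q numerator and ν denominator degrees of freedom, R² = qF / (qF + ν). A sketch of that mapping and its inverse follows; the degree-of-freedom values are arbitrary, and this is the generic fixed-df form, not the paper's full mixed-model derivation:

```python
def r2_from_f(f, q, nu):
    """Map an F statistic with (q, nu) degrees of freedom to R^2
    via the generic univariate-style relation R^2 = qF / (qF + nu)."""
    return q * f / (q * f + nu)

def f_from_r2(r2, q, nu):
    """Inverse mapping: F = (R^2 / q) / ((1 - R^2) / nu)."""
    return (r2 / q) / ((1.0 - r2) / nu)

# e.g. testing q = 2 fixed effects with nu = 20 denominator df
r2 = r2_from_f(5.0, 2, 20)
```

The mapping makes the abstract's point concrete: a large F (tiny p-value) with a large ν can still correspond to a small R², i.e. statistical significance without practical association.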

  14. Drug-drug interaction predictions with PBPK models and optimal multiresponse sampling time designs: application to midazolam and a phase I compound. Part 2: clinical trial results

    PubMed Central

    Chenel, Marylore; Bouzom, François; Cazade, Fanny; Ogungbenro, Kayode; Aarons, Leon; Mentré, France

    2008-01-01

    Purpose: To compare results of population PK analyses obtained with a full empirical design (FD) and an optimal sparse design (MD) in a Drug-Drug Interaction (DDI) study aiming to evaluate the potential CYP3A4 inhibitory effect of a drug in development, SX, on a reference substrate, midazolam (MDZ). A secondary aim was to evaluate the interaction of SX on MDZ in the in vivo study. Methods: To compare designs, real data were analysed by population PK modelling using either FD or MD with NONMEM FOCEI for SX, and with NONMEM FOCEI and MONOLIX SAEM for MDZ. When applicable, a Wald test was performed to compare model parameter estimates, such as apparent clearance (CL/F), across designs. To conclude on the potential interaction of SX on MDZ PK, a paired Student t-test was applied to compare the individual PK parameters (i.e. log(AUC) and log(Cmax)) obtained either by a non-compartmental approach (NCA) using FD or from empirical Bayes estimates (EBE) obtained after fitting the model separately on each treatment group using either FD or MD. Results: For SX, whatever the design, CL/F was well estimated and no statistical differences were found between CL/F estimates obtained with FD (CL/F = 8.2 L/h) and MD (CL/F = 8.2 L/h). For MDZ, only MONOLIX was able to estimate CL/F and to provide its standard error of estimation with MD. With MONOLIX, whatever the design and the administration setting, MDZ CL/F was well estimated and there were no statistical differences between CL/F estimates obtained with FD (72 L/h and 40 L/h for MDZ alone and for MDZ with SX, respectively) and MD (77 L/h and 45 L/h for MDZ alone and for MDZ with SX, respectively). Whatever the approach, NCA or population PK modelling, and for the latter, whatever the design, MD or FD, comparison tests showed a statistical difference (p<0.0001) between individual MDZ log(AUC) obtained after MDZ administered alone and co-administered with SX. Regarding Cmax, there was a

  15. Errors in statistical decision making Chapter 2 in Applied Statistics in Agricultural, Biological, and Environmental Sciences

    USDA-ARS?s Scientific Manuscript database

    Agronomic and Environmental research experiments result in data that are analyzed using statistical methods. These data are unavoidably accompanied by uncertainty. Decisions about hypotheses, based on statistical analyses of these data are therefore subject to error. This error is of three types,...

  16. 30 CFR 250.192 - What reports and statistics must I submit relating to a hurricane, earthquake, or other natural...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... relating to a hurricane, earthquake, or other natural occurrence? 250.192 Section 250.192 Mineral Resources... Requirements § 250.192 What reports and statistics must I submit relating to a hurricane, earthquake, or other... occurrence, such as a hurricane, a tropical storm, or an earthquake. Statistics include facilities and rigs...

  17. Problematizing Statistical Literacy: An Intersection of Critical and Statistical Literacies

    ERIC Educational Resources Information Center

    Weiland, Travis

    2017-01-01

    In this paper, I problematize traditional notions of statistical literacy by juxtaposing it with critical literacy. At the school level, statistical literacy is vitally important for students who are preparing to become citizens in modern societies that are increasingly shaped and driven by data-based arguments. The teaching of statistics, which is…

  18. Noise level and MPEG-2 encoder statistics

    NASA Astrophysics Data System (ADS)

    Lee, Jungwoo

    1997-01-01

    Most software in the movie and broadcasting industries is still in analog film or tape format, which typically contains random noise originating from film, CCD cameras, and tape recording. The performance of an MPEG-2 encoder may be significantly degraded by this noise. It is also affected by the scene type, which includes spatial and temporal activity. The statistical properties of noise originating from cameras and tape players are analyzed, and models for the two types of noise are developed. The relationships between the noise, the scene type, and encoder statistics for a number of MPEG-2 parameters such as motion vector magnitude, prediction error, and quantizer scale are discussed. This analysis is intended to be a tool for designing robust MPEG encoding algorithms such as preprocessing and rate control.

  19. Sb2Te3 and Its Superlattices: Optimization by Statistical Design.

    PubMed

    Behera, Jitendra K; Zhou, Xilin; Ranjan, Alok; Simpson, Robert E

    2018-05-02

    The objective of this work is to demonstrate the usefulness of fractional factorial design for optimizing the crystal quality of chalcogenide van der Waals (vdW) crystals. We statistically analyze the growth parameters of highly c-axis-oriented Sb2Te3 crystals and Sb2Te3-GeTe phase-change vdW heterostructured superlattices. The statistical significance of the growth parameters of temperature, pressure, power, buffer materials, and buffer layer thickness was found by fractional factorial design and response surface analysis. Temperature, pressure, power, and their second-order interactions are the major factors that significantly influence the quality of the crystals. Additionally, using tungsten rather than molybdenum as a buffer layer significantly enhances the crystal quality. Fractional factorial design minimizes the number of experiments necessary to find the optimal growth conditions, resulting in an order-of-magnitude improvement in the crystal quality. We highlight that statistical design-of-experiments methods, which are more commonly used in product design, should be considered more broadly by those designing and optimizing materials.
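    The run-saving idea behind a fractional factorial design can be sketched generically: in a two-level half fraction, one factor's column is set by the defining relation (the product of the other columns), halving the number of experiments. This is a schematic illustration, not the authors' actual design matrix:

```python
from itertools import product

def half_fraction(k: int) -> list:
    """2^(k-1) two-level fractional factorial design for k factors.

    The first k-1 factor columns take every +/-1 combination; the k-th
    column is confounded with their product (the defining relation),
    so only half of the full 2^k runs are required.
    """
    runs = []
    for levels in product((-1, 1), repeat=k - 1):
        confounded = 1
        for v in levels:
            confounded *= v
        runs.append(levels + (confounded,))
    return runs

# Five factors (e.g. temperature, pressure, power, buffer material,
# buffer thickness) need only 16 runs instead of 32.
design = half_fraction(5)
print(len(design))  # 16
```

    The cost of the halved run count is that the k-th factor's main effect is aliased with the (k-1)-factor interaction, which is usually negligible.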

  20. I Cannot Read My Statistics Textbook: The Relationship between Reading Ability and Statistics Anxiety

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.

    2007-01-01

    Although several antecedents of statistics anxiety have been identified, many of these factors are relatively immutable (e.g., gender) and, at best, identify students who are at risk for debilitative levels of statistics anxiety, thereby having only minimal implications for intervention. Furthermore, the few interventions that have been designed…

  1. Genetic Ablation of Calcium-independent Phospholipase A2γ (iPLA2γ) Attenuates Calcium-induced Opening of the Mitochondrial Permeability Transition Pore and Resultant Cytochrome c Release*

    PubMed Central

    Moon, Sung Ho; Jenkins, Christopher M.; Kiebish, Michael A.; Sims, Harold F.; Mancuso, David J.; Gross, Richard W.

    2012-01-01

    Herein, we demonstrate that calcium-independent phospholipase A2γ (iPLA2γ) is a critical mechanistic participant in the calcium-induced opening of the mitochondrial permeability transition pore (mPTP). Liver mitochondria from iPLA2γ−/− mice were markedly resistant to calcium-induced swelling in the presence or absence of phosphate in comparison with wild-type littermates. Furthermore, the iPLA2γ enantioselective inhibitor (R)-(E)-6-(bromomethylene)-3-(1-naphthalenyl)-2H-tetrahydropyran-2-one ((R)-BEL) was markedly more potent than (S)-BEL in inhibiting mPTP opening in mitochondria from wild-type liver in comparison with hepatic mitochondria from iPLA2γ−/− mice. Intriguingly, low micromolar concentrations of long chain fatty acyl-CoAs and the non-hydrolyzable thioether analog of palmitoyl-CoA markedly accelerated Ca2+-induced mPTP opening in liver mitochondria from wild-type mice. The addition of l-carnitine enabled the metabolic channeling of acyl-CoA through carnitine palmitoyltransferases (CPT-1/2) and attenuated the palmitoyl-CoA-mediated amplification of calcium-induced mPTP opening. In contrast, mitochondria from iPLA2γ−/− mice were insensitive to fatty acyl-CoA-mediated augmentation of calcium-induced mPTP opening. Moreover, mitochondria from iPLA2γ−/− mouse liver were resistant to Ca2+/t-butyl hydroperoxide-induced mPTP opening in comparison with wild-type littermates. In support of these findings, cytochrome c release from iPLA2γ−/− mitochondria was dramatically decreased in response to calcium in the presence or absence of either t-butyl hydroperoxide or phenylarsine oxide in comparison with wild-type littermates. Collectively, these results identify iPLA2γ as an important mechanistic component of the mPTP, define its downstream products as potent regulators of mPTP opening, and demonstrate the integrated roles of mitochondrial bioenergetics and lipidomic flux in modulating mPTP opening promoting the activation of necrotic and

  2. Statistical Modeling of Zr/Hf Extraction using TBP-D2EHPA Mixtures

    NASA Astrophysics Data System (ADS)

    Rezaeinejhad Jirandehi, Vahid; Haghshenas Fatmehsari, Davoud; Firoozi, Sadegh; Taghizadeh, Mohammad; Keshavarz Alamdari, Eskandar

    2012-12-01

    In the present work, response surface methodology was employed to study and predict Zr/Hf extraction curves in a solvent extraction system using D2EHPA-TBP mixtures. The effects of changes in temperature, nitric acid concentration, and the TBP/D2EHPA ratio (T/D) on Zr/Hf extraction/separation were studied using central composite design. The results showed a statistically significant effect of T/D, nitric acid concentration, and temperature on the extraction percentages of Zr and Hf. In the case of Zr, a statistically significant interaction was found between T/D and nitric acid, whereas for Hf, the interaction terms between temperature and T/D and between temperature and nitric acid were both significant. Additionally, the extraction curves were successfully predicted by the developed statistical regression equations; this approach is faster and more economical than obtaining the curves experimentally.

  3. A Meta-Meta-Analysis: Empirical Review of Statistical Power, Type I Error Rates, Effect Sizes, and Model Selection of Meta-Analyses Published in Psychology

    ERIC Educational Resources Information Center

    Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.

    2010-01-01

    This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…

  4. Analysis of statistical misconception in terms of statistical reasoning

    NASA Astrophysics Data System (ADS)

    Maryati, I.; Priatna, N.

    2018-05-01

    Reasoning skill is needed by everyone in the globalization era, because every person has to be able to manage and use information that can now easily be obtained from all over the world. Statistical reasoning skill is the ability to collect, group, process, and interpret information and to draw conclusions from it. This skill can be developed at various levels of education. However, the skill remains low, because many people, students included, assume that statistics is just the ability to count and use formulas. Students also still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample was 32 students of a mathematics education department who had taken the descriptive statistics course. The mean score on the misconception test was 49.7 with a standard deviation of 10.6, whereas the mean score on the statistical reasoning skill test was 51.8 with a standard deviation of 8.5. Taking 65 as the minimum score for standard achievement of course competence, the students' mean scores fall below that standard. The misconception results emphasize which subtopics should be reconsidered. Based on the assessment, students' misconceptions occur in: 1) writing mathematical sentences and symbols correctly, 2) understanding basic definitions, and 3) determining which concept to use in solving a problem. Statistical reasoning skill was assessed with respect to reasoning from: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.

  5. Exceedance statistics of accelerations resulting from thruster firings on the Apollo-Soyuz mission

    NASA Technical Reports Server (NTRS)

    Fichtl, G. H.; Holland, R. L.

    1981-01-01

    Spacecraft acceleration resulting from firings of vernier control system thrusters is an important consideration in the design, planning, execution and post-flight analysis of laboratory experiments in space. In particular, scientists and technologists involved with the development of experiments to be performed in space in many instances required statistical information on the magnitude and rate of occurrence of spacecraft accelerations. Typically, these accelerations are stochastic in nature, so that it is useful to characterize these accelerations in statistical terms. Statistics of spacecraft accelerations are summarized.
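    Exceedance statistics of the kind summarized above can be estimated empirically as the count and mean rate of acceleration magnitudes exceeding a threshold over the record duration. This is a schematic sketch with made-up numbers, not the mission's actual data processing:

```python
def exceedance_stats(times, magnitudes, threshold):
    """Count samples whose acceleration magnitude exceeds `threshold`
    and estimate their mean rate of occurrence (exceedances per unit
    time over the span of the record)."""
    count = sum(1 for m in magnitudes if m > threshold)
    duration = max(times) - min(times)
    rate = count / duration if duration > 0 else float("nan")
    return count, rate

# Six sampled firings over a 10 s record; three exceed 4 acceleration units.
count, rate = exceedance_stats([0, 2, 4, 6, 8, 10], [1, 5, 2, 7, 3, 6], 4)
print(count, rate)  # 3 0.3
```

    Sweeping the threshold over a grid turns this into an empirical exceedance curve of the kind experimenters would consult when setting acceleration tolerances.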

  6. Exceedance statistics of accelerations resulting from thruster firings on the Apollo-Soyuz mission

    NASA Technical Reports Server (NTRS)

    Fichtl, G. H.; Holland, R. L.

    1983-01-01

    Spacecraft acceleration resulting from firings of vernier control system thrusters is an important consideration in the design, planning, execution and post-flight analysis of laboratory experiments in space. In particular, scientists and technologists involved with the development of experiments to be performed in space in many instances required statistical information on the magnitude and rate of occurrence of spacecraft accelerations. Typically, these accelerations are stochastic in nature, so that it is useful to characterize these accelerations in statistical terms. Statistics of spacecraft accelerations are summarized. Previously announced in STAR as N82-12127

  7. 5-HT2A receptors in the feline brain: 123I-5-I-R91150 kinetics and the influence of ketamine measured with micro-SPECT.

    PubMed

    Waelbers, Tim; Polis, Ingeborgh; Vermeire, Simon; Dobbeleir, André; Eersels, Jos; De Spiegeleer, Bart; Audenaert, Kurt; Slegers, Guido; Peremans, Kathelijne

    2013-08-01

    Subanesthetic doses of ketamine can be used as a rapid-acting antidepressant in patients with treatment-resistant depression. Therefore, the brain kinetics of (123)I-5-I-R91150 (4-amino-N-[1-[3-(4-fluorophenyl)propyl]-4-methylpiperidin-4-yl]-5-iodo-2-methoxybenzamide) and the influence of ketamine on the postsynaptic serotonin-2A receptor (5-hydroxytryptamine-2A, or 5-HT2A) status were investigated in cats using micro-SPECT. This study was conducted on 6 cats using the radioligand (123)I-5-I-R91150, a 5-HT2A receptor antagonist, as the imaging probe. Anesthesia was induced and maintained with a continuous-rate infusion of propofol (8.4 ± 1.2 mg kg(-1) followed by 0.22 mg kg(-1) min(-1)) 75 min after tracer administration, and acquisition of the first image began 15 min after induction of anesthesia. After this first acquisition, propofol (0.22 mg kg(-1) min(-1)) was combined with ketamine (5 mg kg(-1) followed by 0.023 mg kg(-1) min(-1)), and the second acquisition began 15 min later. Semiquantification, with the cerebellum as a reference region, was performed to calculate the 5-HT2A receptor binding indices (parameter for available receptor density) in the frontal and temporal cortices. The binding indices were analyzed with Wilcoxon signed ranks statistics. The addition of ketamine to the propofol continuous-rate infusion resulted in decreased binding indices in the right frontal cortex (1.25 ± 0.22 vs. 1.45 ± 0.16; P = 0.028), left frontal cortex (1.34 ± 0.15 vs. 1.49 ± 0.10; P = 0.028), right temporal cortex (1.30 ± 0.17 vs. 1.45 ± 0.09; P = 0.046), and left temporal cortex (1.41 ± 0.20 vs. 1.52 ± 0.20; P = 0.046). This study showed that cats can be used as an animal model for studying alterations of the 5-HT2A receptor status with (123)I-5-I-R91150 micro-SPECT. Furthermore, an interaction between ketamine and the 5-HT2A receptors resulting in decreased binding of (123)I-5-I-R91150 in the frontal and temporal cortices was demonstrated. Whether the

  8. DYNAMIC STABILITY OF THE SOLAR SYSTEM: STATISTICALLY INCONCLUSIVE RESULTS FROM ENSEMBLE INTEGRATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeebe, Richard E., E-mail: zeebe@soest.hawaii.edu

    Due to the chaotic nature of the solar system, the question of its long-term stability can only be answered in a statistical sense, for instance, based on numerical ensemble integrations of nearby orbits. Destabilization of the inner planets, leading to close encounters and/or collisions, can be initiated through a large increase in Mercury's eccentricity, with a currently assumed likelihood of ∼1%. However, little is known at present about the robustness of this number. Here I report ensemble integrations of the full equations of motion of the eight planets and Pluto over 5 Gyr, including contributions from general relativity. The results show that different numerical algorithms lead to statistically different results for the evolution of Mercury's eccentricity (e_M). For instance, starting at present initial conditions (e_M ≃ 0.21), Mercury's maximum eccentricity achieved over 5 Gyr is, on average, significantly higher in symplectic ensemble integrations using heliocentric rather than Jacobi coordinates and stricter error control. In contrast, starting at a possible future configuration (e_M ≃ 0.53), Mercury's maximum eccentricity achieved over the subsequent 500 Myr is, on average, significantly lower using heliocentric rather than Jacobi coordinates. For example, the probability for e_M to increase beyond 0.53 over 500 Myr is >90% (Jacobi) versus only 40%-55% (heliocentric). This poses a dilemma because the physical evolution of the real system—and its probabilistic behavior—cannot depend on the coordinate system or the numerical algorithm chosen to describe it. Some tests of the numerical algorithms suggest that symplectic integrators using heliocentric coordinates underestimate the odds for destabilization of Mercury's orbit at high initial e_M.

  9. Results on Neutrinoless Double-Beta Decay from Gerda Phase I

    NASA Astrophysics Data System (ADS)

    Macolino, Carla

    2014-12-01

    The GERmanium Detector Array, GERDA, is designed to search for neutrinoless double-beta (0νββ) decay of 76Ge and is installed in the Laboratori Nazionali del Gran Sasso (LNGS) of INFN, Italy. In this review, the detection principle and detector setup of GERDA are described, and the main physics results of GERDA Phase I are discussed. They include the measurement of the half-life of 2νββ decay, the background decomposition of the energy spectrum, and the techniques for discriminating the background based on the pulse shape of the signal. In the last part of this review, the estimated limit on the half-life of 0νββ decay (T(1/2)(0ν) > 2.1 × 10(25) yr at 90% C.L.) and the comparison with previous results are discussed. GERDA data from Phase I strongly disfavor the recent claim of 0νββ discovery based on data from the Heidelberg-Moscow experiment.

  10. A la Recherche du Temps Perdu: extracting temporal relations from medical text in the 2012 i2b2 NLP challenge.

    PubMed

    Cherry, Colin; Zhu, Xiaodan; Martin, Joel; de Bruijn, Berry

    2013-01-01

    An analysis of the timing of events is critical for a deeper understanding of the course of events within a patient record. The 2012 i2b2 NLP challenge focused on the extraction of temporal relationships between concepts within textual hospital discharge summaries. The team from the National Research Council Canada (NRC) submitted three system runs to the second track of the challenge: typifying the time-relationship between pre-annotated entities. The NRC system was designed around four specialist modules containing statistical machine learning classifiers. Each specialist targeted distinct sets of relationships: local relationships, 'sectime'-type relationships, non-local overlap-type relationships, and non-local causal relationships. The best NRC submission achieved a precision of 0.7499, a recall of 0.6431, and an F1 score of 0.6924, resulting in a statistical tie for first place. Post hoc improvements led to a precision of 0.7537, a recall of 0.6455, and an F1 score of 0.6954, giving the highest scores reported on this task to date. Methods for general relation extraction extended well to temporal relations, and gave top-ranked state-of-the-art results. Careful ordering of predictions within result sets proved critical to this success.
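    The scores quoted above are internally consistent: F1 is the harmonic mean of precision and recall, which can be checked directly (a quick sanity check, not part of the NRC system itself):

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Reproduces the submission and post hoc scores quoted in the abstract.
print(round(f1_score(0.7499, 0.6431), 4))  # 0.6924
print(round(f1_score(0.7537, 0.6455), 4))  # 0.6954
```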

  11. High efficiency CsI(Tl)/HgI{sub 2} gamma ray spectrometers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Y.J.; Patt, B.E.; Iwanczyk, J.S.

    CsI(Tl)/HgI2 gamma-ray spectrometers have been constructed using 0.5 inch diameter detectors, which show excellent energy resolution: 4.58% FWHM for 662 keV 137Cs gamma-ray photons. Further efforts have focused on optimizing larger (≥1 inch diameter) detector structures and improving low-noise electronics. To take full advantage of scintillation detectors for high-energy gamma rays, larger scintillators are always preferred for their higher detection efficiencies. However, the larger capacitance and higher dark current caused by the larger size of the detector can result in worse FWHM resolution. Also, the increased probability of including nonuniformities in larger pieces of crystal makes it more difficult to obtain the high resolution achievable with small detectors. Thus, for very large volume scintillators, it may be necessary to employ a photodiode (PD) with a sensitive area smaller than the cross-section of the scintillator. Monte Carlo simulations of the light collection for various tapered scintillator/PD configurations were performed to find the geometries giving the best light collection. Based on the simulation results, scintillators with the most favorable geometry, the conical frustum, were fabricated and evaluated. The response of a large conical frustum (2 inch top, 1 inch bottom, 2 inch high) CsI(Tl) scintillator coupled with a 1 inch HgI2 PD was measured. The energy resolution of the 662 keV peak was 5.57%. The spectrum shows much higher detection efficiency than those from smaller scintillators, i.e., a much higher peak-to-Compton ratio.

  12. Quantum state-to-state dynamics for the quenching process of Br(2P1/2) + H2(v(i) = 0, 1, j(i) = 0).

    PubMed

    Xie, Changjian; Jiang, Bin; Xie, Daiqian; Sun, Zhigang

    2012-03-21

    Quantum state-to-state dynamics for the quenching process Br(2P1/2) + H2(v(i) = 0, 1, j(i) = 0) → Br(2P3/2) + H2(v(f), j(f)) has been studied based on a two-state model using the recent coupled potential energy surfaces. It was found that the quenching probabilities have some oscillatory structures due to the interference of flux reflected by the repulsive potential in the Br(2P1/2) + H2 and Br(2P3/2) + H2 channels during the near-resonant electronic-to-vibrational energy transfer process. The final-vibrational-state-resolved integral cross sections were found to be dominated by the quenching process Br(2P1/2) + H2(v) → Br(2P3/2) + H2(v+1), and the nonadiabatic reaction probabilities for Br(2P1/2) + H2(v = 0, 1, j(i) = 0) are quite small, consistent with previous theoretical and experimental results. Our calculated total quenching rate constant for Br(2P1/2) + H2(v(i) = 0, j(i) = 0) at room temperature is in good agreement with the available experimental data. © 2012 American Institute of Physics

  13. CCR2-64I polymorphism is not associated with altered CCR5 expression or coreceptor function.

    PubMed

    Mariani, R; Wong, S; Mulder, L C; Wilkinson, D A; Reinhart, A L; LaRosa, G; Nibbs, R; O'Brien, T R; Michael, N L; Connor, R I; Macdonald, M; Busch, M; Koup, R A; Landau, N R

    1999-03-01

    A polymorphism in the gene encoding CCR2 is associated with a delay in progression to AIDS in human immunodeficiency virus (HIV)-infected individuals. The polymorphism, CCR2-64I, changes valine 64 of CCR2 to isoleucine. However, it is not clear whether the effect on AIDS progression results from the amino acid change or whether the polymorphism marks a genetically linked, yet unidentified mutation that mediates the effect. Because the gene encoding CCR5, the major coreceptor for HIV type 1 primary isolates, lies 15 kb 3' to CCR2, linked mutations in the CCR5 promoter or other regulatory sequences could explain the association of CCR2-64I with slowed AIDS pathogenesis. Here, we show that CCR2-64I is efficiently expressed on the cell surface but does not have dominant negative activity on CCR5 coreceptor function. A panel of peripheral blood mononuclear cells (PBMC) from uninfected donors representing the various CCR5/CCR2 genotypes was assembled. Activated primary CD4(+) T cells of CCR2 64I/64I donors expressed cell surface CCR5 at levels comparable to those of CCR2 +/+ donors. A slight reduction in CCR5 expression was noted, although this was not statistically significant. CCR5 and CCR2 mRNA levels were nearly identical for each of the donor PBMC, regardless of genotype. Cell surface CCR5 and CCR2 levels were more variable than mRNA transcript levels, suggesting that an alternative mechanism may influence CCR5 cell surface levels. CCR2-64I is linked to the CCR5 promoter polymorphisms 208G, 303A, 627C, and 676A; however, in transfected promoter reporter constructs, these did not affect transcriptional activity. Taken together, these findings suggest that CCR2-64I does not act by influencing CCR5 transcription or mRNA levels.

  14. Spectral theory of extreme statistics in birth-death systems

    NASA Astrophysics Data System (ADS)

    Meerson, Baruch

    2008-03-01

    Statistics of rare events, or large deviations, in chemical reactions and systems of birth-death type have attracted a great deal of interest in many areas of science including cell biochemistry, astrochemistry, epidemiology, population biology, etc. Large deviations become of vital importance when the discrete (non-continuum) nature of a population of "particles" (molecules, bacteria, cells, animals or even humans) and the stochastic character of interactions can drive the population to extinction. I will briefly review the novel spectral method [1-3] for calculating the extreme statistics of a broad class of birth-death processes and reactions involving a single species. The spectral method combines the probability generating function formalism with the Sturm-Liouville theory of linear differential operators. It involves a controlled perturbative treatment based on a natural large parameter of the problem: the average number of particles/individuals in a stationary or metastable state. For extinction (first passage) problems the method yields accurate results for the extinction statistics and for the quasi-stationary probability distribution, including the tails, of metastable states. I will demonstrate the power of the method on the example of a branching and annihilation reaction, A → 2A, 2A → ∅, representative of a rather general class of processes. [1] M. Assaf and B. Meerson, Phys. Rev. Lett. 97, 200602 (2006). [2] M. Assaf and B. Meerson, Phys. Rev. E 74, 041115 (2006). [3] M. Assaf and B. Meerson, Phys. Rev. E 75, 031122 (2007).

  15. Adding statistical regularity results in a global slowdown in visual search.

    PubMed

    Vaskevich, Anna; Luria, Roy

    2018-05-01

    Current statistical learning theories predict that embedding implicit regularities within a task should further improve online performance, beyond general practice. We challenged this assumption by contrasting performance in a visual search task containing either a consistent-mapping (regularity) condition, a random-mapping condition, or both conditions, mixed. Surprisingly, performance in a random visual search, without any regularity, was better than performance in a mixed design search that contained a beneficial regularity. This result was replicated using different stimuli and different regularities, suggesting that mixing consistent and random conditions leads to an overall slowing down of performance. Relying on the predictive-processing framework, we suggest that this global detrimental effect depends on the validity of the regularity: when its predictive value is low, as it is in the case of a mixed design, reliance on all prior information is reduced, resulting in a general slowdown. Our results suggest that our cognitive system does not maximize speed, but rather continues to gather and implement statistical information at the expense of a possible slowdown in performance. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Explanation of Two Anomalous Results in Statistical Mediation Analysis

    ERIC Educational Resources Information Center

    Fritz, Matthew S.; Taylor, Aaron B.; MacKinnon, David P.

    2012-01-01

    Previous studies of different methods of testing mediation models have consistently found two anomalous results. The first result is elevated Type I error rates for the bias-corrected and accelerated bias-corrected bootstrap tests not found in nonresampling tests or in resampling tests that did not include a bias correction. This is of special…

  17. Results of the Verification of the Statistical Distribution Model of Microseismicity Emission Characteristics

    NASA Astrophysics Data System (ADS)

    Cianciara, Aleksander

    2016-09-01

    The paper presents the results of research aimed at verifying the hypothesis that the Weibull distribution is an appropriate statistical model of microseismic emission characteristics, namely the energy of phenomena and the inter-event time. It is understood that the emission under consideration is induced by natural rock mass fracturing. Because the recorded emission contains noise, it is subjected to appropriate filtering. The study was conducted using statistical verification of the null hypothesis that the Weibull distribution fits the empirical cumulative distribution function. Because the model describing the cumulative distribution function is given in analytical form, its verification can be performed using the Kolmogorov-Smirnov goodness-of-fit test. Interpretation by means of probabilistic methods requires specifying a correct model of the statistical distribution of the data, because these methods use not the measurement data directly but their statistical distributions, for example in methods based on hazard analysis or on maximum-value statistics.
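    The verification step described above, comparing an empirical cumulative distribution function against a Weibull model via the Kolmogorov-Smirnov statistic, can be sketched as follows. The shape and scale parameters here are assumed inputs for illustration, not the paper's fitted values:

```python
import math

def weibull_cdf(x: float, shape: float, scale: float) -> float:
    """CDF of the two-parameter Weibull distribution."""
    return 1.0 - math.exp(-((x / scale) ** shape))

def ks_statistic(data, shape, scale) -> float:
    """Kolmogorov-Smirnov distance between the empirical CDF of `data`
    and a Weibull(shape, scale) model; small values mean a good fit."""
    xs = sorted(data)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = weibull_cdf(x, shape, scale)
        d = max(d, (i + 1) / n - f, f - i / n)
    return d

# Synthetic inter-event times placed at exact Weibull(1.5, 2.0) quantiles
# fit that model almost perfectly, so D sits at its 1/(2n) floor.
n = 100
sample = [2.0 * (-math.log(1 - (i + 0.5) / n)) ** (1 / 1.5) for i in range(n)]
print(ks_statistic(sample, 1.5, 2.0) < 0.006)  # True
```

    In practice the observed D is compared against the Kolmogorov-Smirnov critical value for the chosen significance level to accept or reject the Weibull hypothesis.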

  18. The European I-MOVE Multicentre 2013-2014 Case-Control Study. Homogeneous moderate influenza vaccine effectiveness against A(H1N1)pdm09 and heterogenous results by country against A(H3N2).

    PubMed

    Valenciano, Marta; Kissling, Esther; Reuss, Annicka; Jiménez-Jorge, Silvia; Horváth, Judit K; Donnell, Joan M O; Pitigoi, Daniela; Machado, Ausenda; Pozo, Francisco

    2015-06-04

    In the first five I-MOVE (Influenza Monitoring Vaccine Effectiveness in Europe) influenza seasons, vaccine effectiveness (VE) results were relatively homogeneous among participating study sites. In 2013-2014, we undertook a multicentre case-control study based on sentinel practitioner surveillance networks in six European Union (EU) countries to measure 2013-2014 influenza VE against medically attended influenza-like illness (ILI) laboratory-confirmed as influenza. Influenza A(H3N2) and A(H1N1)pdm09 viruses co-circulated during the season. Practitioners systematically selected ILI patients to swab within eight days of symptom onset. We compared cases (ILI patients testing positive for influenza A(H3N2) or A(H1N1)pdm09) to influenza-negative patients. We calculated VE for the two influenza A subtypes and adjusted for potential confounders. We assessed heterogeneity between sites using the I(2) index and Cochran's Q test. If the I(2) was <50%, we estimated pooled VE as (1 minus the OR)×100 using a one-stage model with study site as a fixed effect. If the I(2) was >49%, we used a two-stage random-effects model. We included 531 cases and 1712 controls in the A(H1N1)pdm09 analysis and 623 cases and 1920 controls in the A(H3N2) analysis. For A(H1N1)pdm09, the Q test (p=0.695) and the I(2) index (0%) suggested no heterogeneity of adjusted VE between study sites. Using a one-stage model, the overall pooled adjusted VE against influenza A(H1N1)pdm09 was 47.5% (95% CI: 16.4 to 67.0). For A(H3N2), the I(2) was 51.5% (p=0.067). Using a two-stage model for the pooled analysis, the adjusted VE against A(H3N2) was 29.7% (95% CI: -34.4 to 63.2). The results suggest a moderate 2013-2014 influenza VE against A(H1N1)pdm09 and a low VE against A(H3N2). The A(H3N2) estimates were heterogeneous among study sites. Larger sample sizes by study site are needed to prevent statistical heterogeneity, decrease variability and allow for two-stage pooled VE for all subgroup analyses. Copyright © 2015 The Authors
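    The pooling rule this abstract describes (Cochran's Q, the I(2) index, and VE = (1 minus the OR) × 100) can be sketched with invented per-site estimates; none of the numbers below are the study's data.

```python
import numpy as np

# Invented per-site log odds ratios and standard errors (six sites).
log_or = np.array([-0.75, -0.55, -0.70, -0.50, -0.65, -0.60])
se = np.array([0.30, 0.25, 0.35, 0.28, 0.32, 0.27])

w = 1.0 / se**2                              # inverse-variance weights
pooled = np.sum(w * log_or) / np.sum(w)      # fixed-effect pooled log OR
Q = np.sum(w * (log_or - pooled) ** 2)       # Cochran's Q statistic
df = len(log_or) - 1
I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0   # I2 index, truncated at 0

ve = (1.0 - np.exp(pooled)) * 100            # VE (%) = (1 minus OR) x 100
print(f"Q = {Q:.2f} (df = {df}), I2 = {I2:.0f}%, pooled VE = {ve:.1f}%")
```

    With these invented inputs Q falls below its degrees of freedom, so I(2) truncates to 0% and the one-stage (fixed-effect) pooling path would be taken, analogous to the homogeneous A(H1N1)pdm09 arm.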

  19. The autoinhibitory CARD2-Hel2i Interface of RIG-I governs RNA selection.

    PubMed

    Ramanathan, Anand; Devarkar, Swapnil C; Jiang, Fuguo; Miller, Matthew T; Khan, Abdul G; Marcotrigiano, Joseph; Patel, Smita S

    2016-01-29

    RIG-I (Retinoic Acid Inducible Gene-I) is a cytosolic innate immune receptor that detects atypical features in viral RNAs as foreign to initiate a Type I interferon signaling response. RIG-I is present in an autoinhibited state in the cytoplasm and activated by blunt-ended double-stranded (ds)RNAs carrying a 5' triphosphate (ppp) moiety. These features found in many pathogenic RNAs are absent in cellular RNAs due to post-transcriptional modifications of RNA ends. Although RIG-I is structurally well characterized, the mechanistic basis for RIG-I's remarkable ability to discriminate between cellular and pathogenic RNAs is not completely understood. We show that RIG-I's selectivity for blunt-ended 5'-ppp dsRNAs is ≈3000 times higher than for non-blunt-ended dsRNAs commonly found in cellular RNAs. Discrimination occurs at multiple stages, and signaling RNAs have high affinity and a high ATPase turnover rate, and thus a high k(ATPase)/K(d) ratio. We show that RIG-I uses its autoinhibitory CARD2-Hel2i (second CARD-helicase insertion domain) interface as a barrier to select against non-blunt-ended dsRNAs. Accordingly, deletion of CARDs or point mutations in the CARD2-Hel2i interface decreases the selectivity from ≈3000 to 150 and 750, respectively. We propose that the CARD2-Hel2i interface is a 'gate' that prevents cellular RNAs from generating productive complexes that can signal. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. Improving esthetic results in benign parotid surgery: statistical evaluation of facelift approach, sternocleidomastoid flap, and superficial musculoaponeurotic system flap application.

    PubMed

    Bianchi, Bernardo; Ferri, Andrea; Ferrari, Silvano; Copelli, Chiara; Sesenna, Enrico

    2011-04-01

    The purpose of this article was to analyze the efficacy of the facelift incision, sternocleidomastoid muscle flap, and superficial musculoaponeurotic system flap for improving the esthetic results in patients undergoing partial parotidectomy for benign parotid tumor resection. The usefulness of partial parotidectomy is discussed, and a statistical evaluation of the esthetic results was performed. From January 1, 1996, to January 1, 2007, 274 patients treated for benign parotid tumors were studied. Of these, 172 underwent partial parotidectomy. The 172 patients were divided into 4 groups: partial parotidectomy with classic or modified Blair incision without reconstruction (group 1), partial parotidectomy with facelift incision and without reconstruction (group 2), partial parotidectomy with facelift incision associated with sternocleidomastoid muscle flap (group 3), and partial parotidectomy with facelift incision associated with superficial musculoaponeurotic system flap (group 4). Patients were considered, after a follow-up of at least 18 months, for functional and esthetic evaluation. The functional outcome was assessed considering facial nerve function, Frey syndrome, and recurrence. The esthetic evaluation was performed by inviting the patients and a blinded panel of 1 surgeon and 2 secretaries of the department to give a score of 1 to 10 to assess the final cosmetic outcome. The statistical analysis was performed using the Mann-Whitney U test for nonparametric data to compare the different group results. P less than .05 was considered significant. No recurrence developed in any of the 4 groups or in any of the 274 patients during the follow-up period. The statistical analysis, comparing group 1 with the other groups, revealed a highly significant statistical difference (P < .0001) for all groups. Also, when group 2 was compared with groups 3 and 4, the difference was highly statistically significant (P = .0018 for group 3 and P = .0005 for

  1. Exploratory Visual Analysis of Statistical Results from Microarray Experiments Comparing High and Low Grade Glioma

    PubMed Central

    Reif, David M.; Israel, Mark A.; Moore, Jason H.

    2007-01-01

    The biological interpretation of gene expression microarray results is a daunting challenge. For complex diseases such as cancer, wherein the body of published research is extensive, the incorporation of expert knowledge provides a useful analytical framework. We have previously developed the Exploratory Visual Analysis (EVA) software for exploring data analysis results in the context of annotation information about each gene, as well as biologically relevant groups of genes. We present EVA as a flexible combination of statistics and biological annotation that provides a straightforward visual interface for the interpretation of microarray analyses of gene expression in the most commonly occurring class of brain tumors, glioma. We demonstrate the utility of EVA for the biological interpretation of statistical results by analyzing publicly available gene expression profiles of two important glial tumors. The results of a statistical comparison between 21 malignant, high-grade glioblastoma multiforme (GBM) tumors and 19 indolent, low-grade pilocytic astrocytomas were analyzed using EVA. By using EVA to examine the results of a relatively simple statistical analysis, we were able to identify tumor class-specific gene expression patterns having both statistical and biological significance. Our interactive analysis highlighted the potential importance of genes involved in cell cycle progression, proliferation, signaling, adhesion, migration, motility, and structure, as well as candidate gene loci on a region of Chromosome 7 that has been implicated in glioma. Because EVA does not require statistical or computational expertise and has the flexibility to accommodate any type of statistical analysis, we anticipate EVA will prove a useful addition to the repertoire of computational methods used for microarray data analysis. EVA is available at no charge to academic users and can be found at http://www.epistasis.org. PMID:19390666

  2. Evaluating statistical consistency in the ocean model component of the Community Earth System Model (pyCECT v2.0)

    NASA Astrophysics Data System (ADS)

    Baker, Allison H.; Hu, Yong; Hammerling, Dorit M.; Tseng, Yu-heng; Xu, Haiying; Huang, Xiaomeng; Bryan, Frank O.; Yang, Guangwen

    2016-07-01

    The Parallel Ocean Program (POP), the ocean model component of the Community Earth System Model (CESM), is widely used in climate research. Most current work in CESM-POP focuses on improving the model's efficiency or accuracy, such as improving numerical methods, advancing parameterization, porting to new architectures, or increasing parallelism. Since ocean dynamics are chaotic in nature, achieving bit-for-bit (BFB) identical results in ocean solutions cannot be guaranteed for even tiny code modifications, and determining whether modifications are admissible (i.e., statistically consistent with the original results) is non-trivial. In recent work, an ensemble-based statistical approach was shown to work well for software verification (i.e., quality assurance) on atmospheric model data. The general idea of the ensemble-based statistical consistency testing is to use a qualitative measurement of the variability of the ensemble of simulations as a metric with which to compare future simulations and make a determination of statistical distinguishability. The capability to determine consistency without BFB results boosts model confidence and provides the flexibility needed, for example, for more aggressive code optimizations and the use of heterogeneous execution environments. Since ocean and atmosphere models have differing characteristics in terms of dynamics, spatial variability, and timescales, we present a new statistical method to evaluate ocean model simulation data that requires the evaluation of ensemble means and deviations in a spatial manner. In particular, the statistical distribution from an ensemble of CESM-POP simulations is used to determine the standard score of any new model solution at each grid point. Then the percentage of points that have scores greater than a specified threshold indicates whether the new model simulation is statistically distinguishable from the ensemble simulations. Both ensemble size and composition are important. Our
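    A minimal numpy sketch of the per-grid-point standard-score test described above, using synthetic stand-ins for the CESM-POP fields (the field sizes, score threshold, and Gaussian data are all assumptions for illustration):

```python
import numpy as np

# Synthetic stand-ins: 30 ensemble members of a 40 x 60 "ocean" field.
rng = np.random.default_rng(0)
ensemble = rng.normal(loc=15.0, scale=0.5, size=(30, 40, 60))
new_run = rng.normal(loc=15.0, scale=0.5, size=(40, 60))   # candidate simulation

mu = ensemble.mean(axis=0)
sigma = ensemble.std(axis=0, ddof=1)
z = (new_run - mu) / sigma                  # standard score at each grid point

threshold = 2.0                             # assumed score threshold
fail_fraction = float(np.mean(np.abs(z) > threshold))
print(f"fraction of grid points with |z| > {threshold}: {fail_fraction:.3f}")
```

    A consistency decision would then compare `fail_fraction` against a calibrated acceptance level; here the candidate run is drawn from the same distribution as the ensemble, so the fraction stays near the nominal tail probability.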

  3. The Statistical Fermi Paradox

    NASA Astrophysics Data System (ADS)

    Maccone, C.

    This paper provides the statistical generalization of the Fermi paradox. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable planets for man (1964). The statistical generalization of the original and by now too simplistic Dole equation is provided by replacing a product of ten positive numbers by the product of ten positive random variables. This is denoted the SEH, an acronym standing for “Statistical Equation for Habitables”. The proof in this paper is based on the Central Limit Theorem (CLT) of Statistics, stating that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log-normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH it is shown that the (average) distance between any two nearby habitable planets in the Galaxy is inversely proportional to the cubic root of NHab. This distance is denoted by new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in
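    The CLT argument behind the SEH can be illustrated in a few lines: take the product of ten independent positive random variables (here uniform, an arbitrary choice standing in for the astrobiological factors) and check that the logarithm of the product is close to Gaussian, which is what makes the product approximately log-normal.

```python
import numpy as np

# Ten independent positive factors per trial; uniform is an arbitrary choice.
rng = np.random.default_rng(1)
factors = rng.uniform(0.1, 1.0, size=(100_000, 10))
n_hab = factors.prod(axis=1)                # analogue of the random NHab

# If n_hab is approximately log-normal, log(n_hab) is approximately Gaussian,
# so its sample skewness should be close to zero.
log_n = np.log(n_hab)
skew = float(np.mean(((log_n - log_n.mean()) / log_n.std()) ** 3))
print(f"sample skewness of log(product) = {skew:.3f}")
```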

  4. STATISTICAL ANALYSIS OF TANK 5 FLOOR SAMPLE RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shine, E.

    2012-03-14

    Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, radionuclide, inorganic, and anion concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs. The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements above

  5. Statistical Analysis of Tank 5 Floor Sample Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shine, E. P.

    2013-01-31

    Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, and the radionuclide, elemental, and chemical concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs. The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some

  6. Statistical Analysis Of Tank 5 Floor Sample Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shine, E. P.

    2012-08-01

    Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, and the radionuclide, elemental, and chemical concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs. The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some
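    For analytes with all measurements above their MDCs, the mean-and-UCL95 computation these reports describe reduces to a one-sided t-based upper confidence limit on the mean (assuming approximate normality, one of the cases covered by the cited EPA guidance). The triplicate concentrations below are hypothetical:

```python
import math
from statistics import mean, stdev

# Hypothetical triplicate concentrations for one analyte (all above its MDC).
measurements = [0.42, 0.47, 0.44]
n = len(measurements)
xbar = mean(measurements)
s = stdev(measurements)                      # sample standard deviation
t_95 = 2.920                                 # one-sided 95% Student t quantile, df = 2

ucl95 = xbar + t_95 * s / math.sqrt(n)       # upper 95% confidence bound on the mean
print(f"mean = {xbar:.3f}, UCL95 = {ucl95:.3f}")
```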

  7. Latest Results From the QuakeFinder Statistical Analysis Framework

    NASA Astrophysics Data System (ADS)

    Kappler, K. N.; MacLean, L. S.; Schneider, D.; Bleier, T.

    2017-12-01

    be analysed using the Receiver Operating Characteristic test. In this presentation we give a status report of our latest results, largely focussed on reproducibility of results, robust statistics in the presence of missing data, and exploring optimization landscapes in our parameter space.

  8. The Effect Size Statistic: Overview of Various Choices.

    ERIC Educational Resources Information Center

    Mahadevan, Lakshmi

    Over the years, methodologists have been recommending that researchers use magnitude of effect estimates in result interpretation to highlight the distinction between statistical and practical significance (cf. R. Kirk, 1996). A magnitude of effect statistic (i.e., effect size) tells to what degree the dependent variable can be controlled,…

  9. The Use of Meta-Analytic Statistical Significance Testing

    ERIC Educational Resources Information Center

    Polanin, Joshua R.; Pigott, Terri D.

    2015-01-01

    Meta-analysis multiplicity, the concept of conducting multiple tests of statistical significance within one review, is an underdeveloped literature. We address this issue by considering how Type I errors can impact meta-analytic results, suggest how statistical power may be affected through the use of multiplicity corrections, and propose how…

  10. Land use statistics for West Virginia, Part I

    USGS Publications Warehouse

    Erwin, Robert B.

    1979-01-01

    The West Virginia Geological and Economic Survey and the United States Geological Survey have completed a cooperative program to provide land-use and land-cover maps and data for the State. This program begins to satisfy a longstanding need for a consistent level of detail, standardization in categorization, and scale of compilation for land-use and land-cover maps and data. The statistical information contained in this Bulletin provides land-use acreage tabulations for the first 20 counties that have been completed. Statistics are being compiled for the remaining counties and will be published shortly. This information has been derived from the recently completed Land-Use Map of West Virginia (on open file at the West Virginia Geological and Economic Survey - Environmental Section). In addition to land-use acreage, we have also included land-use percent. All statistics throughout this Bulletin are in the same format for ease of comparison.

  11. Establishment of a Method for Measuring Antioxidant Capacity in Urine, Based on Oxidation Reduction Potential and Redox Couple I2/KI

    PubMed Central

    Cao, Tinghui; He, Min; Bai, Tianyu

    2016-01-01

    Objectives. To establish a new method for determination of the antioxidant capacity of human urine based on the redox couple I2/KI and to evaluate the redox status of healthy and diseased individuals. Methods. The method was based on the linear relationship between oxidation reduction potential (ORP) and the logarithm of the concentration ratio of I2/KI. The ORP of a solution with a known concentration ratio of I2/KI will change when reacted with urine. To determine the accuracy of the method, both vitamin C and urine were reacted separately with the I2/KI solution. The new method was compared with the traditional method of iodine titration and then used to measure the antioxidant capacity of urine samples from 30 diabetic patients and 30 healthy subjects. Results. A linear relationship was found between the logarithm of the concentration ratio of I2/KI and ORP (R² = 0.998). Both vitamin C and urine concentration showed a linear relationship with ORP (R² = 0.994 and 0.986, respectively). The precision of the method was in the acceptable range, and the results of the two methods had a linear correlation (R² = 0.987). Differences in ORP values between the diabetic group and the control group were statistically significant (P < 0.05). Conclusions. A new method for measuring the antioxidant capacity of clinical urine has been established. PMID:28115919
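    The calibration idea underlying the method (ORP varying linearly with the log of the I2/KI concentration ratio, Nernst-type behaviour) can be sketched with an ordinary least-squares fit. The data points below are invented, not the paper's measurements:

```python
import numpy as np

# Hypothetical calibration points: ORP (mV) vs log10 of the I2/KI ratio.
log_ratio = np.array([-2.0, -1.5, -1.0, -0.5, 0.0])
orp_mv = np.array([385.0, 400.2, 414.8, 429.9, 445.1])

slope, intercept = np.polyfit(log_ratio, orp_mv, 1)   # least-squares line
pred = slope * log_ratio + intercept
ss_res = np.sum((orp_mv - pred) ** 2)
ss_tot = np.sum((orp_mv - orp_mv.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot                            # coefficient of determination
print(f"slope = {slope:.1f} mV per decade, R^2 = {r2:.4f}")
```

    An unknown sample's antioxidant capacity would then be read off the fitted line from its measured ORP shift.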

  12. Ambiguities and conflicting results: the limitations of the kappa statistic in establishing the interrater reliability of the Irish nursing minimum data set for mental health: a discussion paper.

    PubMed

    Morris, Roisin; MacNeela, Padraig; Scott, Anne; Treacy, Pearl; Hyde, Abbey; O'Brien, Julian; Lehwaldt, Daniella; Byrne, Anne; Drennan, Jonathan

    2008-04-01

    In a study to establish the interrater reliability of the Irish Nursing Minimum Data Set (I-NMDS) for mental health, difficulties relating to the choice of reliability test statistic were encountered. The objective of this paper is to highlight the difficulties associated with testing interrater reliability for an ordinal scale using a relatively homogeneous sample and the recommended kw statistic. One pair of mental health nurses completed the I-NMDS for mental health for a total of 30 clients attending a mental health day centre over a two-week period. Data were analysed using the kw and percentage agreement statistics. A total of 34 of the 38 I-NMDS for mental health variables with lower than acceptable levels of kw reliability scores achieved acceptable levels of reliability according to their percentage agreement scores. The study findings implied that, owing to the homogeneity of the sample, low variability within the data resulted in the 'base rate problem' associated with the use of the kw statistic. Conclusions point to the interpretation of kw in tandem with percentage agreement scores. Suggestions that kw scores were low due to chance agreement, and that one should strive to use a study sample with known variability, are queried.
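    The "base rate problem" the paper describes is easy to reproduce: with a homogeneous sample, two raters can agree almost all of the time yet kappa stays near zero, because expected chance agreement is itself very high. The sketch below uses unweighted Cohen's kappa on invented binary ratings (the paper's kw is the weighted variant, but the base-rate effect is the same):

```python
# 30 clients rated by two raters on a binary item; almost all ratings are 0
# because the sample is homogeneous. The rating pairs are invented.
ratings = [(0, 0)] * 28 + [(0, 1), (1, 0)]
n = len(ratings)

p_obs = sum(a == b for a, b in ratings) / n          # percentage agreement
p_a1 = sum(a for a, _ in ratings) / n                # rater A marginal for "1"
p_b1 = sum(b for _, b in ratings) / n                # rater B marginal for "1"
p_exp = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)        # expected chance agreement

kappa = (p_obs - p_exp) / (1 - p_exp)                # Cohen's kappa
print(f"percent agreement = {p_obs:.1%}, kappa = {kappa:.2f}")
```

    Here the raters agree on 28 of 30 clients (over 93%), yet kappa comes out slightly negative, because chance agreement under the near-degenerate marginals is about 94%.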

  13. Synaptic Plasticity In Mammalian Gravity Sensors: Preliminary Results From SLS-2

    NASA Technical Reports Server (NTRS)

    Ross, M. D.; Hargens, Alan R. (Technical Monitor)

    1996-01-01

    Sensory conflict is the prevalent theoretical explanation for space adaptation syndrome. This ultrastructural study tests the hypothesis that peripheral gravity sensors (maculae) play a role. Results were obtained from the medial part of utricular maculae of adult rats exposed to microgravity for 14 days, and from controls. Means and statistical significance of synapse counts were calculated using SUPERANOVA™ and Scheffe's procedure for post-hoc comparisons. Preliminary findings are from 2 sets of 100 serial sections for each dataset. Synapses were doubled numerically in type II hair cells of utricular maculae collected on day 13 inflight compared to controls (11.4 +/- 7.1 vs. 5.3 +/- 3.8; p < 0.0001). Flight mean synaptic number declined rapidly postflight and became comparable to means of controls. Synapses also increased numerically in type I cells inflight (2.4 +/- 1.6 vs. 1.7 +/- 1.0; p < 0.0341). Postflight there were no significant differences in counts. Results concerning shifts in ribbon type and distribution also largely replicate previous findings from flight studies. Results indicate that mammalian maculae are adaptive endorgans that retain the property of synaptic plasticity into the adult stage. Macular plasticity has clinical implications for balance disorders of peripheral origin.

  14. CH3NH3PbI3 and HC(NH2)2PbI3 Powders Synthesized from Low-Grade PbI2: Single Precursor for High-Efficiency Perovskite Solar Cells.

    PubMed

    Zhang, Yong; Kim, Seul-Gi; Lee, Do-Kyoung; Park, Nam-Gyu

    2018-05-09

    -purity PbI2. The smaller hysteresis was indicative of fewer defects in the resulting FAPbI3 film prepared by using the δ-FAPbI3 powder. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Strategies for maintaining patient privacy in i2b2.

    PubMed

    Murphy, Shawn N; Gainer, Vivian; Mendis, Michael; Churchill, Susanne; Kohane, Isaac

    2011-12-01

    The re-use of patient data from electronic healthcare record systems can provide tremendous benefits for clinical research, but measures to protect patient privacy while utilizing these records have many challenges. Some of these challenges arise from a misperception that the problem should be solved technically when actually the problem needs a holistic solution. The authors' experience with Informatics for Integrating Biology and the Bedside (i2b2) use cases indicates that the privacy of the patient should be considered on three fronts: technical de-identification of the data, trust in the researcher and the research, and the security of the underlying technical platforms. The security structure of i2b2 is implemented based on consideration of all three fronts. It has been supported with several use cases across the USA, resulting in five privacy categories of users that serve to protect the data while supporting the use cases. The i2b2 architecture is designed to provide consistency and faithfully implement these user privacy categories. These privacy categories help reflect the policy of both the Health Insurance Portability and Accountability Act and the provisions of the National Research Act of 1974, as embodied by current institutional review boards. By implementing a holistic approach to patient privacy solutions, i2b2 is able to help close the gap between principle and practice.

  16. Drug-drug interaction predictions with PBPK models and optimal multiresponse sampling time designs: application to midazolam and a phase I compound. Part 2: clinical trial results.

    PubMed

    Chenel, Marylore; Bouzom, François; Cazade, Fanny; Ogungbenro, Kayode; Aarons, Leon; Mentré, France

    2008-12-01

    The aim was to compare results of population PK analyses obtained with a full empirical design (FD) and an optimal sparse design (MD) in a Drug-Drug Interaction (DDI) study aiming to evaluate the potential CYP3A4 inhibitory effect of a drug in development, SX, on a reference substrate, midazolam (MDZ). A secondary aim was to evaluate the interaction of SX on MDZ in the in vivo study. Methods: To compare designs, real data were analysed by the population PK modelling technique using either FD or MD with NONMEM FOCEI for SX and with NONMEM FOCEI and MONOLIX SAEM for MDZ. When applicable, a Wald test was performed to compare model parameter estimates, such as apparent clearance (CL/F), across designs. To conclude on the potential interaction of SX on MDZ PK, a paired Student's t-test was applied to compare the individual PK parameters (i.e. log(AUC) and log(C(max))) obtained either by a non-compartmental approach (NCA) using FD or from empirical Bayes estimates (EBE) obtained after fitting the model separately on each treatment group using either FD or MD. For SX, whatever the design, CL/F was well estimated and no statistical differences were found between CL/F estimated values obtained with FD (CL/F = 8.2 l/h) and MD (CL/F = 8.2 l/h). For MDZ, only MONOLIX was able to estimate CL/F and to provide its standard error of estimation with MD. With MONOLIX, whatever the design and the administration setting, MDZ CL/F was well estimated and there were no statistical differences between CL/F estimated values obtained with FD (72 l/h and 40 l/h for MDZ alone and for MDZ with SX, respectively) and MD (77 l/h and 45 l/h for MDZ alone and for MDZ with SX, respectively). Whatever the approach, NCA or population PK modelling, and for the latter approach, whatever the design, MD or FD, comparison tests showed that there was a statistical difference (P < 0.0001) between individual MDZ log(AUC) obtained after MDZ administration alone and co-administered with SX. Regarding C(max), there was a statistical

  17. Temperature and Pressure Dependences of the Reactions of Fe+ with Methyl Halides CH3X (X = Cl, Br, I): Experiments and Kinetic Modeling Results.

    PubMed

    Ard, Shaun G; Shuman, Nicholas S; Martinez, Oscar; Keyes, Nicholas R; Viggiano, Albert A; Guo, Hua; Troe, Jürgen

    2017-06-01

    The pressure and temperature dependences of the reactions of Fe+ with methyl halides CH3X (X = Cl, Br, I) in He were measured in a selected ion flow tube over the ranges 0.4 to 1.2 Torr and 300-600 K. FeX+ was observed for all three halides, and FeCH3+ was observed for the CH3I reaction. FeCH3X+ adducts (for all X) were detected in all reactions. The results were interpreted assuming two-state reactivity with spin-inversions between sextet and quartet potentials. Kinetic modeling allowed for a quantitative representation of the experiments and for extrapolation to conditions outside the experimentally accessible range. The modeling required quantum-chemical calculations of molecular parameters and detailed accounting of angular momentum effects. The results show that the FeX+ products come via an insertion mechanism, while FeCH3+ can be produced from either insertion or SN2 mechanisms, though we conclude the latter is unlikely at thermal energies. Statistical modeling cannot reproduce the competition between the bimolecular pathways in the CH3I reaction, indicating that some more direct process must be important.

  18. Interactive binocular treatment (I-BiT) for amblyopia: results of a pilot study of 3D shutter glasses system.

    PubMed

    Herbison, N; Cobb, S; Gregson, R; Ash, I; Eastgate, R; Purdy, J; Hepburn, T; MacKeith, D; Foss, A

    2013-09-01

    A computer-based interactive binocular treatment system (I-BiT) for amblyopia has been developed, which utilises commercially available 3D 'shutter glasses'. The purpose of this pilot study was to report the effect of treatment on visual acuity (VA) in children with amblyopia. Thirty minutes of I-BiT treatment was given once weekly for 6 weeks. Treatment sessions consisted of playing a computer game and watching a DVD through the I-BiT system. VA was assessed at baseline, mid-treatment, at the end of treatment, and at 4 weeks post treatment. Standard summary statistics and an exploratory one-way analysis of variance (ANOVA) were performed. Ten patients were enrolled with strabismic, anisometropic, or mixed amblyopia. The mean age was 5.4 years. Nine patients (90%) completed the full course of I-BiT treatment with a mean improvement of 0.18 (SD=0.143). Six out of nine patients (67%) who completed the treatment showed a clinically significant improvement of 0.125 LogMAR units or more at follow-up. The exploratory one-way ANOVA showed an overall effect over time (F=7.95, P=0.01). No adverse effects were reported. This small, uncontrolled study has shown VA gains with 3 hours of I-BiT treatment. Although it is recognised that this pilot study had significant limitations (it was unblinded, uncontrolled, and too small to permit formal statistical analysis), these results suggest that further investigation of I-BiT treatment is worthwhile.

  19. SMART-on-FHIR implemented over i2b2

    PubMed Central

    Mandel, Joshua C; Klann, Jeffery G; Wattanasin, Nich; Mendis, Michael; Chute, Christopher G; Mandl, Kenneth D; Murphy, Shawn N

    2017-01-01

    We have developed an interface to serve patient data from Informatics for Integrating Biology and the Bedside (i2b2) repositories in the Fast Healthcare Interoperability Resources (FHIR) format, referred to as a SMART-on-FHIR cell. The cell serves FHIR resources on a per-patient basis, and supports the “substitutable” modular third-party applications (SMART) OAuth2 specification for authorization of client applications. It is implemented as an i2b2 server plug-in, consisting of 6 modules: authentication, REST, i2b2-to-FHIR converter, resource enrichment, query engine, and cache. The source code is freely available as open source. We tested the cell by accessing resources from a test i2b2 installation, demonstrating that a SMART app can be launched from the cell that accesses patient data stored in i2b2. We successfully retrieved demographics, medications, labs, and diagnoses for test patients. The SMART-on-FHIR cell will enable i2b2 sites to provide simplified but secure data access in FHIR format, and will spur innovation and interoperability. Further, it transforms i2b2 into an apps platform. PMID:27274012
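As a rough illustration of the per-patient, token-authorized access the cell provides, the snippet below constructs (without sending) a FHIR read request. The base URL, patient id, and token are invented placeholders; a real SMART-on-FHIR client would first obtain the access token through the OAuth2 authorization flow the cell supports.

```python
import urllib.request

def build_fhir_request(base_url, patient_id, access_token):
    """Construct, but do not send, an authorized FHIR read for a Patient resource."""
    url = f"{base_url}/Patient/{patient_id}"
    req = urllib.request.Request(url)
    # SMART apps present the OAuth2 access token as a Bearer header
    req.add_header("Authorization", f"Bearer {access_token}")
    req.add_header("Accept", "application/fhir+json")
    return req

# Hypothetical endpoint and credentials, for illustration only
req = build_fhir_request("https://example.org/fhir", "12345", "TOKEN")
```

Sending the request with `urllib.request.urlopen(req)` would return the Patient resource as FHIR JSON, assuming the server accepts the token.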

  20. Evaluation of data generated by statistically oriented end-result specifications : final report.

    DOT National Transportation Integrated Search

    1979-01-01

    This report is concerned with the review of data generated on projects let under the statistically oriented end-result specifications (ERS) on asphaltic concrete and Portland cement concrete. The review covered analysis of data for determination of p...

  1. Sharing brain mapping statistical results with the neuroimaging data model

    PubMed Central

    Maumet, Camille; Auer, Tibor; Bowring, Alexander; Chen, Gang; Das, Samir; Flandin, Guillaume; Ghosh, Satrajit; Glatard, Tristan; Gorgolewski, Krzysztof J.; Helmer, Karl G.; Jenkinson, Mark; Keator, David B.; Nichols, B. Nolan; Poline, Jean-Baptiste; Reynolds, Richard; Sochat, Vanessa; Turner, Jessica; Nichols, Thomas E.

    2016-01-01

    Only a tiny fraction of the data and metadata produced by an fMRI study is finally conveyed to the community. This lack of transparency not only hinders the reproducibility of neuroimaging results but also impairs future meta-analyses. In this work, we introduce NIDM-Results, a format specification providing a machine-readable description of neuroimaging statistical results along with key image data summarising the experiment. NIDM-Results provides a unified representation of mass univariate analyses including a level of detail consistent with available best practices. This standardized representation allows authors to relay methods and results in a platform-independent regularized format that is not tied to a particular neuroimaging software package. Tools are available to export NIDM-Result graphs and associated files from the widely used SPM and FSL software packages, and the NeuroVault repository can import NIDM-Results archives. The specification is publicly available at: http://nidm.nidash.org/specs/nidm-results.html. PMID:27922621

  2. Exploring the Replicability of a Study's Results: Bootstrap Statistics for the Multivariate Case.

    ERIC Educational Resources Information Center

    Thompson, Bruce

    Conventional statistical significance tests do not inform the researcher regarding the likelihood that results will replicate. One strategy for evaluating result replication is to use a "bootstrap" resampling of a study's data so that the stability of results across numerous configurations of the subjects can be explored. This paper…
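The resampling strategy described above can be sketched as a percentile bootstrap; the data vector below is invented for illustration.

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap: resample with replacement, recompute the statistic,
    and read the CI off the sorted replicate distribution."""
    rng = random.Random(seed)
    reps = sorted(stat(rng.choices(data, k=len(data))) for _ in range(n_boot))
    lo = reps[int(alpha / 2 * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

scores = [52, 61, 47, 70, 58, 66, 55, 49, 63, 60]
lo, hi = bootstrap_ci(scores)
```

A narrow spread of the replicate statistics across resampled configurations of the subjects is the stability evidence the paper has in mind; a wide spread warns that the original result may not replicate.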

  3. Report for Florida Community Colleges, 1983-1984. Part I: Statistical Tables.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee. Div. of Community Colleges.

    Statistical data are presented on student enrollments, academic programs, personnel and salaries, and finances for the Florida community colleges for 1983-84. A series of tables provide data on: (1) opening fall enrollment by class, program and student status; (2) fall enrollment headcount by age groups; (3) annual program headcount enrollment;…

  4. The IAEA coordinated research programme on HTGR uncertainty analysis: Phase I status and Ex. I-1 prismatic reference results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bostelmann, Friederike; Strydom, Gerhard; Reitsma, Frederik

    The quantification of uncertainties in design and safety analysis of reactors is today not only broadly accepted, but in many cases has become the preferred way to replace traditional conservative analysis for safety and licensing purposes. The use of a more fundamental methodology is also consistent with the reliable high-fidelity physics models and robust, efficient, and accurate codes available today. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied, in contrast to the historical approach, in which sensitivity analyses were performed and uncertainties then determined by a simplified statistical combination of a few important input parameters. New methodologies are currently under development in the OECD/NEA Light Water Reactor (LWR) Uncertainty Analysis in Best-Estimate Modelling (UAM) benchmark activity. High Temperature Gas-cooled Reactor (HTGR) designs require specific treatment of the double heterogeneous fuel design and large graphite quantities at high temperatures. The IAEA has therefore launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modelling (UAM) in 2013 to study uncertainty propagation specifically in the HTGR analysis chain. Two benchmark problems are defined, with the prismatic design represented by the General Atomics (GA) MHTGR-350 and a 250 MW modular pebble bed design similar to the Chinese HTR-PM. Work has started on the first phase and the current CRP status is reported in the paper. A comparison of the Serpent and SCALE/KENO-VI reference Monte Carlo results for Ex. I-1 of the MHTGR-350 design is also included. It was observed that the SCALE/KENO-VI Continuous Energy (CE) k∞ values were 395 pcm (Ex. I-1a) to 803 pcm (Ex. I-1b) higher than the respective Serpent lattice calculations, and that within the set of SCALE results, the KENO-VI 238-group Multi-Group (MG) k∞ values were up to 800 pcm lower than the KENO-VI CE values. The use of the…
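The k∞ discrepancies above are quoted in pcm (1 pcm = 1e-5 in reactivity). A minimal sketch of one common convention for expressing the difference between two multiplication factors as a reactivity difference, with illustrative inputs rather than the benchmark's values:

```python
def reactivity_diff_pcm(k1, k2):
    """Reactivity difference between two multiplication factors, in pcm,
    using the standard rho = (k - 1)/k convention."""
    rho1 = (k1 - 1.0) / k1
    rho2 = (k2 - 1.0) / k2
    return (rho1 - rho2) * 1e5

# e.g. one code predicts k-inf = 1.005 against a reference of 1.000
delta_pcm = reactivity_diff_pcm(1.005, 1.000)
```

Conventions differ slightly between codes (some quote a plain Δk/k or Δk × 1e5), so the formula used should always be stated alongside quoted pcm differences.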

  5. The IAEA coordinated research programme on HTGR uncertainty analysis: Phase I status and Ex. I-1 prismatic reference results

    DOE PAGES

    Bostelmann, Friederike; Strydom, Gerhard; Reitsma, Frederik; ...

    2016-01-11

    The quantification of uncertainties in design and safety analysis of reactors is today not only broadly accepted, but in many cases has become the preferred way to replace traditional conservative analysis for safety and licensing purposes. The use of a more fundamental methodology is also consistent with the reliable high-fidelity physics models and robust, efficient, and accurate codes available today. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied, in contrast to the historical approach, in which sensitivity analyses were performed and uncertainties then determined by a simplified statistical combination of a few important input parameters. New methodologies are currently under development in the OECD/NEA Light Water Reactor (LWR) Uncertainty Analysis in Best-Estimate Modelling (UAM) benchmark activity. High Temperature Gas-cooled Reactor (HTGR) designs require specific treatment of the double heterogeneous fuel design and large graphite quantities at high temperatures. The IAEA has therefore launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modelling (UAM) in 2013 to study uncertainty propagation specifically in the HTGR analysis chain. Two benchmark problems are defined, with the prismatic design represented by the General Atomics (GA) MHTGR-350 and a 250 MW modular pebble bed design similar to the Chinese HTR-PM. Work has started on the first phase and the current CRP status is reported in the paper. A comparison of the Serpent and SCALE/KENO-VI reference Monte Carlo results for Ex. I-1 of the MHTGR-350 design is also included. It was observed that the SCALE/KENO-VI Continuous Energy (CE) k∞ values were 395 pcm (Ex. I-1a) to 803 pcm (Ex. I-1b) higher than the respective Serpent lattice calculations, and that within the set of SCALE results, the KENO-VI 238-group Multi-Group (MG) k∞ values were up to 800 pcm lower than the KENO-VI CE values. The use of the…

  6. Statistics for NAEG: past efforts, new results, and future plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, R.O.; Simpson, J.C.; Kinnison, R.R.

    A brief review of Nevada Applied Ecology Group (NAEG) objectives is followed by a summary of past statistical analyses conducted by Pacific Northwest Laboratory for the NAEG. Estimates of spatial pattern of radionuclides and other statistical analyses at NS's 201, 219 and 221 are reviewed as background for new analyses presented in this paper. Suggested NAEG activities and statistical analyses needed for the projected termination date of NAEG studies in March 1986 are given.

  7. 42 CFR 402.109 - Statistical sampling.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 2 2011-10-01 2011-10-01 false Statistical sampling. 402.109 Section 402.109... Statistical sampling. (a) Purpose. CMS or OIG may introduce the results of a statistical sampling study to... or caused to be presented. (b) Prima facie evidence. The results of the statistical sampling study...

  8. 42 CFR 402.109 - Statistical sampling.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Statistical sampling. 402.109 Section 402.109... Statistical sampling. (a) Purpose. CMS or OIG may introduce the results of a statistical sampling study to... or caused to be presented. (b) Prima facie evidence. The results of the statistical sampling study...

  9. Photophysics of indole-2-carboxylic acid (I2C) and indole-5-carboxylic acid (I5C): Heavy atom effect

    NASA Astrophysics Data System (ADS)

    Kowalska-Baron, Agnieszka; Gałęcki, Krystian; Wysocki, Stanisław

    2013-12-01

    In this study, the effect of carboxylic group substitution at the 2- and 5-positions of the indole ring on the photophysics of the parent indole chromophore has been examined. The photophysical parameters crucial to the triplet-state decay mechanism of aqueous indole-2-carboxylic acid (I2C) and indole-5-carboxylic acid (I5C) have been determined by applying our previously proposed methodology based on the heavy atom effect and on fluorescence and phosphorescence decay kinetics [Kowalska-Baron et al., 2012]. The determined time-resolved phosphorescence spectra of I2C and I5C are red-shifted as compared to that of the parent indole. This red-shift was especially evident in the case of I2C and may indicate the possibility of hydrogen-bonded complex formation incorporating the carbonyl C=O, the NH group of I2C and, possibly, surrounding water molecules. The possibility of an excited-state charge transfer process and subsequent electronic charge redistribution in such a hydrogen-bonded complex may also be postulated. The resulting stabilization of the I2C triplet state is manifested by its relatively long phosphorescence lifetime in aqueous solution (912 μs). The relatively short phosphorescence lifetime of I5C (56 μs) may be the consequence of more effective ground-state quenching of the I5C triplet state. This hypothesis may be strengthened by the significantly larger value of the determined rate constant of I5C triplet-state quenching by its ground state (4.4 × 10^8 M^-1 s^-1) as compared to that for indole (6.8 × 10^7 M^-1 s^-1) and I2C (2.3 × 10^7 M^-1 s^-1). The determined bimolecular rate constants for triplet-state quenching by iodide, kq(T1), are equal to 1 × 10^4 M^-1 s^-1, 6 × 10^3 M^-1 s^-1, and 2.7 × 10^4 M^-1 s^-1 for indole, I2C and I5C, respectively. In order to obtain a better insight into iodide quenching of the I2C and I5C triplet states in aqueous solution, the temperature dependence of the bimolecular rate constants for iodide quenching of the triplet states has been expressed in…
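Bimolecular quenching constants of the kind quoted above follow from Stern-Volmer kinetics for lifetimes, 1/τ = 1/τ0 + kq·[Q]. The sketch below uses that relation with illustrative numbers, not the paper's measurements.

```python
def kq_from_lifetimes(tau0_s, tau_s, conc_M):
    """Bimolecular quenching constant (M^-1 s^-1) from the unquenched lifetime
    tau0, the quenched lifetime tau, and the quencher concentration [Q],
    via Stern-Volmer kinetics: 1/tau = 1/tau0 + kq*[Q]."""
    return (1.0 / tau_s - 1.0 / tau0_s) / conc_M

# Illustrative only: a 912-us triplet lifetime halved by 0.1 M quencher
kq = kq_from_lifetimes(912e-6, 456e-6, 0.1)
```

With these invented inputs kq comes out near 1 × 10^4 M^-1 s^-1, the order of magnitude reported above for indole; in practice kq is extracted from the slope of 1/τ against [Q] over several concentrations rather than from a single pair of lifetimes.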

  10. Crystallographic isomorphism in the structural type of α-HgI2 by example of KHgI3 · H2O, β-Ag2HgI4, and β-Cu2HgI4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borisov, S. V., E-mail: borisov@niic.nsc.ru; Magarill, S. A.; Pervukhina, N. V.

    The structure of KHgI3 · H2O is assigned to the family of crystal structures having the three-layer cubic packing of iodine anions with cations in the tetrahedral voids (the structures of α-HgI2, β-Ag2HgI4, and β-Cu2HgI4 among them). Crystallographic analysis shows that the nodes of the three-layer close packing are populated by iodine anions and K cations in the ratio 3/4 : 1/4. Transformation of the structure of α-HgI2 into the structure of KHgI3 · H2O can be formally represented as the replacement of (HgI)n+ fragments by (KH2O)n+ fragments: (Hg2I4) − (HgI)+ + (KH2O)+ = KHgI3 · H2O. Perforated layers of vertex-sharing HgI4 tetrahedra break down into parallel isolated chains. Channels formed in place of I–Hg–I–Hg– fragments are occupied by –H2O–K–H2O–K–H2O– chains weakly bound to neighbors.

  11. Statistics 101 for Radiologists.

    PubMed

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
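Several of the test-efficacy measures reviewed in this record reduce to arithmetic on a 2×2 table; a minimal sketch with invented counts:

```python
def diagnostic_stats(tp, fp, fn, tn):
    """Sensitivity, specificity, and likelihood ratios from a 2x2 table of
    true/false positives and negatives."""
    sens = tp / (tp + fn)            # P(test+ | disease)
    spec = tn / (tn + fp)            # P(test- | no disease)
    lr_pos = sens / (1 - spec)       # how much a positive result raises the odds
    lr_neg = (1 - sens) / spec       # how much a negative result lowers the odds
    return sens, spec, lr_pos, lr_neg

# Hypothetical table: 100 diseased, 100 non-diseased patients
sens, spec, lr_pos, lr_neg = diagnostic_stats(tp=90, fp=20, fn=10, tn=80)
```

Multiplying a patient's pre-test odds by LR+ (after a positive result) or LR− (after a negative result) gives the post-test odds, which is the practical use of likelihood ratios at the bedside.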

  12. Gold(I) Complexes of the Geminal Phosphinoborane tBu2PCH2BPh2

    PubMed Central

    2018-01-01

    In this work, we explored the coordination properties of the geminal phosphinoborane tBu2PCH2BPh2 (2) toward different gold(I) precursors. The reaction of 2 with an equimolar amount of the sulfur-based complex (Me2S)AuCl resulted in displacement of the SMe2 ligand and formation of linear phosphine gold(I) chloride 3. Using an excess of ligand 2, bisligated complex 4 was formed and showed dynamic behavior at room temperature. Changing the gold(I) metal precursor to the phosphorus-based complex, (Ph3P)AuCl impacted the coordination behavior of ligand 2. Namely, the reaction of ligand 2 with (Ph3P)AuCl led to the heterolytic cleavage of the gold–chloride bond, which is favored over PPh3 ligand displacement. To the best of our knowledge, 2 is the first example of a P/B-ambiphilic ligand capable of cleaving the gold–chloride bond. The coordination chemistry of 2 was further analyzed by density functional theory calculations. PMID:29732451

  13. Gold(I) Complexes of the Geminal Phosphinoborane tBu2PCH2BPh2.

    PubMed

    Boom, Devin H A; Ehlers, Andreas W; Nieger, Martin; Devillard, Marc; Bouhadir, Ghenwa; Bourissou, Didier; Slootweg, J Chris

    2018-04-30

    In this work, we explored the coordination properties of the geminal phosphinoborane tBu2PCH2BPh2 (2) toward different gold(I) precursors. The reaction of 2 with an equimolar amount of the sulfur-based complex (Me2S)AuCl resulted in displacement of the SMe2 ligand and formation of linear phosphine gold(I) chloride 3. Using an excess of ligand 2, bisligated complex 4 was formed and showed dynamic behavior at room temperature. Changing the gold(I) metal precursor to the phosphorus-based complex (Ph3P)AuCl impacted the coordination behavior of ligand 2. Namely, the reaction of ligand 2 with (Ph3P)AuCl led to the heterolytic cleavage of the gold-chloride bond, which is favored over PPh3 ligand displacement. To the best of our knowledge, 2 is the first example of a P/B-ambiphilic ligand capable of cleaving the gold-chloride bond. The coordination chemistry of 2 was further analyzed by density functional theory calculations.

  14. Crop identification technology assessment for remote sensing. (CITARS) Volume 9: Statistical analysis of results

    NASA Technical Reports Server (NTRS)

    Davis, B. J.; Feiveson, A. H.

    1975-01-01

    Results of CITARS data processing are presented in raw form. Tables of descriptive statistics are given, along with descriptions and results of inferential analyses. The inferential results are organized by the questions that CITARS was designed to answer.

  15. Empirical Correction to the Likelihood Ratio Statistic for Structural Equation Modeling with Many Variables.

    PubMed

    Yuan, Ke-Hai; Tian, Yubin; Yanagihara, Hirokazu

    2015-06-01

    Survey data typically contain many variables. Structural equation modeling (SEM) is commonly used in analyzing such data. The most widely used statistic for evaluating the adequacy of an SEM model is T_ML, a slight modification to the likelihood ratio statistic. Under the normality assumption, T_ML approximately follows a chi-square distribution when the number of observations (N) is large and the number of items or variables (p) is small. However, in practice, p can be rather large while N is always limited due to not having enough participants. Even with a relatively large N, empirical results show that T_ML rejects the correct model too often when p is not too small. Various corrections to T_ML have been proposed, but they are mostly heuristic. Following the principle of the Bartlett correction, this paper proposes an empirical approach to correct T_ML so that the mean of the resulting statistic approximately equals the degrees of freedom of the nominal chi-square distribution. Results show that the empirically corrected statistics follow the nominal chi-square distribution much more closely than previously proposed corrections to T_ML, and they control type I errors reasonably well whenever N ≥ max(50, 2p). The formulations of the empirically corrected statistics are further used to predict type I errors of T_ML as reported in the literature, and they perform well.
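The core of a Bartlett-type mean correction, rescaling a statistic so its mean matches the nominal degrees of freedom, can be sketched as a toy Monte Carlo. This is only an illustration of the principle, not the paper's empirical procedure.

```python
import random

def empirical_mean_correction(t_values, df):
    """Rescale simulated test statistics so their mean equals the nominal df,
    in the spirit of a Bartlett correction."""
    mean_t = sum(t_values) / len(t_values)
    c = df / mean_t
    return [c * t for t in t_values], c

rng = random.Random(0)
df = 10
# Simulate an inflated statistic: 1.2 x chi-square(df) draws
# (a chi-square(df) variate is Gamma(df/2, scale=2))
t_sim = [1.2 * rng.gammavariate(df / 2, 2.0) for _ in range(5000)]
t_corr, c = empirical_mean_correction(t_sim, df)
```

After rescaling, the mean of the corrected statistics equals df by construction, so their distribution sits much closer to the nominal chi-square than the inflated originals.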

  16. Quantifying, displaying and accounting for heterogeneity in the meta-analysis of RCTs using standard and generalised Q statistics

    PubMed Central

    2011-01-01

    Background Clinical researchers have often preferred to use a fixed effects model for the primary interpretation of a meta-analysis. Heterogeneity is usually assessed via the well known Q and I2 statistics, along with the random effects estimate they imply. In recent years, alternative methods for quantifying heterogeneity have been proposed that are based on a 'generalised' Q statistic. Methods We review 18 IPD meta-analyses of RCTs into treatments for cancer, in order to quantify the amount of heterogeneity present and also to discuss practical methods for explaining heterogeneity. Results Differing results were obtained when the standard Q and I2 statistics were used to test for the presence of heterogeneity. The two meta-analyses with the largest amount of heterogeneity were investigated further, and on inspection the straightforward application of a random effects model was not deemed appropriate. Compared to the standard Q statistic, the generalised Q statistic provided a more accurate platform for estimating the amount of heterogeneity in the 18 meta-analyses. Conclusions Explaining heterogeneity via the pre-specification of trial subgroups, graphical diagnostic tools and sensitivity analyses produced a more desirable outcome than an automatic application of the random effects model. Generalised Q statistic methods for quantifying and adjusting for heterogeneity should be incorporated as standard into statistical software. Software is provided to help achieve this aim. PMID:21473747
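The standard Q and I2 statistics discussed above can be computed directly from the study effects and their variances; a minimal fixed-effect sketch with invented inputs:

```python
def cochran_q_and_i2(effects, variances):
    """Cochran's Q and the I2 statistic (percent of variation attributed
    to heterogeneity) for an inverse-variance fixed-effect meta-analysis."""
    w = [1.0 / v for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - pooled) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i2

# Three hypothetical studies: effect sizes and within-study variances
q, i2 = cochran_q_and_i2([0.1, 0.5, 0.9], [0.04, 0.04, 0.04])
```

As the head of this record collection notes, with few studies the point estimate I2 is biased, so it should be reported with a confidence interval rather than alone.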

  17. Exploring the Photovoltaic Performance of All-Inorganic Ag2PbI4/PbI2 Blends.

    PubMed

    Frolova, Lyubov A; Anokhin, Denis V; Piryazev, Alexey A; Luchkin, Sergey Yu; Dremova, Nadezhda N; Troshin, Pavel A

    2017-04-06

    We present an all-inorganic photoactive material composed of Ag2PbI4 and PbI2, which shows unexpectedly good photovoltaic performance in planar junction solar cells delivering external quantum efficiencies of ∼60% and light power conversion efficiencies of ∼3.9%. The revealed characteristics are among the best reported to date for metal halides with nonperovskite crystal structure. Most importantly, the obtained results suggest a possibility of reaching high photovoltaic efficiencies for binary and, probably, also ternary blends of different inorganic semiconductor materials. This approach, resembling the bulk heterojunction concept guiding the development of organic photovoltaics for two decades, opens wide opportunities for rational design of novel inorganic and hybrid materials for efficient and sustainable photovoltaic technologies.

  18. Statistical models to predict type 2 diabetes remission after bariatric surgery.

    PubMed

    Ramos-Levi, Ana M; Matia, Pilar; Cabrerizo, Lucio; Barabash, Ana; Sanchez-Pernaute, Andres; Calle-Pascual, Alfonso L; Torres, Antonio J; Rubio, Miguel A

    2014-09-01

    Type 2 diabetes (T2D) remission may be achieved after bariatric surgery (BS), but rates vary according to patients' baseline characteristics. The present study evaluates the relevance of several preoperative factors and develops statistical models to predict T2D remission 1 year after BS. We retrospectively studied 141 patients (57.4% women), with a preoperative diagnosis of T2D, who underwent BS in a single center (2006-2011). Anthropometric and glucose metabolism parameters before surgery and at 1-year follow-up were recorded. Remission of T2D was defined according to consensus criteria: HbA1c <6%, fasting glucose (FG) <100 mg/dL, absence of pharmacologic treatment. The influence of several preoperative factors was explored and different statistical models to predict T2D remission were built using logistic regression analysis. Three preoperative characteristics considered individually were identified as the most powerful predictors of T2D remission: C-peptide (R2 = 0.249; odds ratio [OR] 1.652, 95% confidence interval [CI] 1.181-2.309; P = 0.003), T2D duration (R2 = 0.197; OR 0.869, 95% CI 0.808-0.935; P < 0.001), and previous insulin therapy (R2 = 0.165; OR 4.670, 95% CI 2.257-9.665; P < 0.001). High C-peptide levels, a shorter duration of T2D, and the absence of insulin therapy favored remission. Different multivariate logistic regression models were designed. When considering sex, T2D duration, and insulin treatment, remission was correctly predicted in 72.4% of cases. The model that included age, FG, and C-peptide levels resulted in 83.7% correct classifications. When sex, FG, C-peptide, insulin treatment, and percentage weight loss were considered, correct classification of T2D remission was achieved in 95.9% of cases. Preoperative characteristics determine T2D remission rates after BS to different extents. The use of statistical models may help clinicians reliably predict T2D remission rates after BS. © 2014 Ruijin Hospital
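A fitted logistic model of the kind described turns preoperative characteristics into a remission probability via the inverse logit. The coefficients below are illustrative placeholders whose signs merely follow the reported directions of effect (C-peptide helps; longer duration and insulin therapy hurt); they are not the published fit.

```python
import math

def remission_probability(c_peptide, t2d_years, on_insulin, coefs):
    """Predicted probability from a logistic regression model:
    p = 1 / (1 + exp(-(b0 + b1*x1 + b2*x2 + b3*x3)))."""
    b0, b_cpep, b_dur, b_ins = coefs
    z = b0 + b_cpep * c_peptide + b_dur * t2d_years + b_ins * (1 if on_insulin else 0)
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients, chosen only for their signs
coefs = (0.5, 0.50, -0.14, -1.54)
p_good = remission_probability(c_peptide=4.0, t2d_years=2, on_insulin=False, coefs=coefs)
p_poor = remission_probability(c_peptide=1.0, t2d_years=15, on_insulin=True, coefs=coefs)
```

A favorable profile (high C-peptide, short disease duration, no insulin) yields a high predicted probability, and an unfavorable one a low probability, mirroring the classification behaviour the study reports.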

  19. Statistical Mechanics of Turbulent Dynamos

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    2014-01-01

    Incompressible magnetohydrodynamic (MHD) turbulence and magnetic dynamos, which occur in magnetofluids with large fluid and magnetic Reynolds numbers, will be discussed. When Reynolds numbers are large and energy decays slowly, the distribution of energy with respect to length scale becomes quasi-stationary and MHD turbulence can be described statistically. In the limit of infinite Reynolds numbers, viscosity and resistivity become zero and if these values are used in the MHD equations ab initio, a model system called ideal MHD turbulence results. This model system is typically confined in simple geometries with some form of homogeneous boundary conditions, allowing for velocity and magnetic field to be represented by orthogonal function expansions. One advantage to this is that the coefficients of the expansions form a set of nonlinearly interacting variables whose behavior can be described by equilibrium statistical mechanics, i.e., by a canonical ensemble theory based on the global invariants (energy, cross helicity and magnetic helicity) of ideal MHD turbulence. Another advantage is that truncated expansions provide a finite dynamical system whose time evolution can be numerically simulated to test the predictions of the associated statistical mechanics. If ensemble predictions are the same as time averages, then the system is said to be ergodic; if not, the system is nonergodic. Although it had been implicitly assumed in the early days of ideal MHD statistical theory development that these finite dynamical systems were ergodic, numerical simulations provided sufficient evidence that they were, in fact, nonergodic. Specifically, while canonical ensemble theory predicted that expansion coefficients would be (i) zero-mean random variables with (ii) energy that decreased with length scale, it was found that although (ii) was correct, (i) was not and the expected ergodicity was broken. The exact cause of this broken ergodicity was explained, after much

  20. Crystal Structure of AgBi2I7 Thin Films.

    PubMed

    Xiao, Zewen; Meng, Weiwei; Mitzi, David B; Yan, Yanfa

    2016-10-06

    Synthesis of cubic-phase AgBi2I7 iodobismuthate thin films and fabrication of air-stable Pb-free solar cells using the AgBi2I7 absorber have recently been reported. On the basis of X-ray diffraction (XRD) analysis and nominal composition, it was suggested that the synthesized films have a cubic ThZr2H7 crystal structure with AgBi2I7 stoichiometry. Through careful examination of the proposed structure and computational evaluation of the phase stability and bandgap, we find that the reported "AgBi2I7" films cannot be forming with the ThZr2H7-type structure, but rather more likely adopt an Ag-deficient AgBiI4 type. Both the experimental X-ray diffraction pattern and bandgap can be better explained by the AgBiI4 structure. Additionally, the proposed AgBiI4 structure, with octahedral bismuth coordination, removes unphysically short Bi-I bonding within the [BiI8] hexahedra of the ThZr2H7 model. Our results provide critical insights for assessing the photovoltaic properties of AgBi2I7 iodobismuthate materials.

  1. Fragile entanglement statistics

    NASA Astrophysics Data System (ADS)

    Brody, Dorje C.; Hughston, Lane P.; Meier, David M.

    2015-10-01

    If X and Y are independent, Y and Z are independent, and so are X and Z, one might be tempted to conclude that X, Y, and Z are independent. But it has long been known in classical probability theory that, intuitive as it may seem, this is not true in general. In quantum mechanics one can ask whether analogous statistics can emerge for configurations of particles in certain types of entangled states. The explicit construction of such states, along with the specification of suitable sets of observables that have the purported statistical properties, is not entirely straightforward. We show that an example of such a configuration arises in the case of an N-particle GHZ state, and we are able to identify a family of observables with the property that the associated measurement outcomes are independent for any choice of 2, 3, ..., N-1 of the particles, even though the measurement outcomes for all N particles are not independent. Although such states are highly entangled, the entanglement turns out to be 'fragile', i.e. the associated density matrix has the property that if one traces out the freedom associated with even a single particle, the resulting reduced density matrix is separable.
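The classical probability fact the abstract opens with, pairwise independence without mutual independence, can be checked by enumeration with the textbook example of two fair bits and their XOR:

```python
from itertools import product

# Joint distribution of fair bits X, Y and Z = X XOR Y: each (x, y) has prob 1/4
dist = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

def marginal(indices, values):
    """P(variables at the given positions take the given values)."""
    return sum(p for xyz, p in dist.items()
               if all(xyz[i] == v for i, v in zip(indices, values)))

# Every pair is independent: P(a, b) = P(a) * P(b) for all pairs and values
pairwise_ok = all(
    abs(marginal((i, j), (a, b)) - marginal((i,), (a,)) * marginal((j,), (b,))) < 1e-12
    for i, j in [(0, 1), (0, 2), (1, 2)] for a, b in product((0, 1), repeat=2))

# ...but the triple is not: P(X=0, Y=0, Z=0) = 1/4, not the 1/8 independence predicts
triple_gap = marginal((0, 1, 2), (0, 0, 0)) - 0.5 ** 3
```

The GHZ-state construction in this record is a quantum analogue of exactly this pattern, with N-particle measurement outcomes replacing the three classical bits.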

  2. Relative quantum yield of I*(2P1/2) in the tunable laser UV photodissociation of i-C3F7I and n-C3F7I - Effect of temperature and exciplex emission

    NASA Technical Reports Server (NTRS)

    Smedley, J. E.; Leone, S. R.

    1983-01-01

    Wavelength-specific relative quantum yields of metastable I from pulsed laser photodissociation of i-C3F7I and n-C3F7I in the range 265-336 nm are determined by measuring the time-resolved infrared emission from the atomic I(2P1/2 → 2P3/2) transition. It is shown that although this yield appears to be unity from 265 to 298 nm, it decreases dramatically at longer wavelengths. Values are also reported for the enhancement of emission from metastable I due to exciplex formation at several temperatures. The exciplex emission increases linearly with parent gas pressure, but decreases with increasing temperature. Absorption spectra of i- and n-C3F7I between 303 and 497 K are presented, and the effect of temperature on the quantum yields at selected wavelengths greater than 300 nm, where increasing the temperature enhances the absorption considerably, is given. The results are discussed in regard to the development of solar-pumped iodine lasers.

  3. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-04-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
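
    The first three of the pattern tests listed above can be sketched with standard scipy routines. The synthetic input factor, the nonlinear response, and the bin boundaries below are illustrative assumptions, not the two-phase flow model of the study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200)                    # sampled input factor
y = np.exp(3.0 * x) + rng.normal(0.0, 1.0, 200)   # monotonic, nonlinear response

# (1) linear relationship: Pearson correlation coefficient
r, p_r = stats.pearsonr(x, y)
# (2) monotonic relationship: Spearman rank correlation coefficient
rho, p_rho = stats.spearmanr(x, y)
# (3) trend in central tendency: Kruskal-Wallis test across bins of x
bin_id = np.digitize(x, [0.25, 0.5, 0.75])
groups = [y[bin_id == b] for b in range(4)]
h, p_h = stats.kruskal(*groups)
```

For a monotonic but nonlinear response like this, the rank-based statistics tend to flag the relationship at least as strongly as the linear correlation, which is the motivation for the increasingly complex sequence of tests.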

  4. Derivation and Applicability of Asymptotic Results for Multiple Subtests Person-Fit Statistics

    PubMed Central

    Albers, Casper J.; Meijer, Rob R.; Tendeiro, Jorge N.

    2016-01-01

    In high-stakes testing, it is important to check the validity of individual test scores. Although a test may, in general, result in valid test scores for most test takers, for some test takers, test scores may not provide a good description of a test taker’s proficiency level. Person-fit statistics have been proposed to check the validity of individual test scores. In this study, the theoretical asymptotic sampling distribution of two person-fit statistics that can be used for tests that consist of multiple subtests is first discussed. Second, a simulation study was conducted to investigate the applicability of this asymptotic theory for tests of finite length, varying the correlation between subtests and the number of items per subtest. The authors showed that these distributions provide reasonable approximations, even for tests consisting of subtests of only 10 items each. These results have practical value because researchers do not have to rely on extensive simulation studies to simulate sampling distributions. PMID:29881053
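
    To illustrate the kind of statistic involved, here is a sketch of the classical standardized log-likelihood person-fit statistic l_z for a single test; the multiple-subtest statistics of the study extend this idea, and the item success probabilities used in the usage example are hypothetical:

```python
import numpy as np

def lz(responses, p):
    """Standardized log-likelihood person-fit statistic l_z.

    responses : array of 0/1 item scores for one test taker.
    p         : model-implied success probabilities for the same items.
    Strongly negative values flag aberrant (misfitting) response patterns.
    """
    responses = np.asarray(responses, float)
    p = np.asarray(p, float)
    l0 = np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))
    expect = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))
    var = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)
    return (l0 - expect) / np.sqrt(var)
```

A Guttman-consistent pattern (easy items right, hard items wrong) yields an l_z near or above zero, while the reversed pattern is pushed far into the negative tail.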

  5. Reporting of statistically significant results at ClinicalTrials.gov for completed superiority randomized controlled trials.

    PubMed

    Dechartres, Agnes; Bond, Elizabeth G; Scheer, Jordan; Riveros, Carolina; Atal, Ignacio; Ravaud, Philippe

    2016-11-30

    Publication bias and other reporting biases have been well documented for journal articles, but no study has evaluated the nature of results posted at ClinicalTrials.gov. We aimed to assess how many randomized controlled trials (RCTs) with results posted at ClinicalTrials.gov report statistically significant results and whether the proportion of trials with significant results differs when no treatment effect estimate or p-value is posted. We searched ClinicalTrials.gov in June 2015 for all studies with results posted. We included completed RCTs with a superiority hypothesis and considered results for the first primary outcome with results posted. For each trial, we assessed whether a treatment effect estimate and/or p-value was reported at ClinicalTrials.gov and, if so, whether results were statistically significant. If no treatment effect estimate or p-value was reported, we calculated the treatment effect and corresponding p-value using results per arm posted at ClinicalTrials.gov when sufficient data were reported. From the 17,536 studies with results posted at ClinicalTrials.gov, we identified 2823 completed phase 3 or 4 randomized trials with a superiority hypothesis. Of these, 1400 (50%) reported a treatment effect estimate and/or p-value. Results were statistically significant for 844 trials (60%), with a median p-value of 0.01 (Q1-Q3: 0.001-0.26). For the 1423 trials with no treatment effect estimate or p-value posted, we could calculate the treatment effect and corresponding p-value using results reported per arm for 929 (65%). For 494 trials (35%), p-values could not be calculated, mainly because of insufficient reporting, censored data, or repeated measurements over time. For the 929 trials for which we could calculate p-values, we found statistically significant results for 342 (37%), with a median p-value of 0.19 (Q1-Q3: 0.005-0.59). 
Half of the trials with results posted at ClinicalTrials.gov reported a treatment effect estimate and/or p-value, with significant
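
    The recalculation of a p-value from per-arm results can be sketched as a simple two-proportion z-test under a normal approximation (one of several methods such an analysis could use; the event counts below are hypothetical):

```python
import math

def two_proportion_test(e1, n1, e2, n2):
    """Risk difference and two-sided p-value from per-arm event counts,
    using a pooled-variance normal approximation."""
    p1, p2 = e1 / n1, e2 / n2
    pooled = (e1 + e2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p1 - p2, p_value
```

For example, 30/100 events versus 15/100 events gives a risk difference of 0.15 and a p-value below 0.05, while identical arms give a p-value of 1.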

  6. The new method of real-time detection of 129I2, 129I127I, 127I2 and NO2 in gases using tunable diode laser operating in the range of 632-637 nm

    NASA Astrophysics Data System (ADS)

    Kireev, S. V.; Shnyrev, S. L.

    2018-02-01

    This paper develops a new selective real-time method for detecting 129I2, 129I127I, 127I2 and NO2 in gases. Concentrations of molecular iodine are measured via fluorescence excited by the radiation of a tunable diode laser, operating in the red spectral region (632-637 nm), at two or three wavelengths corresponding to the centers of the absorption lines of 129I2, 129I127I and 127I2. Detection of NO2 is performed by measuring the intensity of the tunable diode laser radiation transmitted through the measuring cell. The boundary ratios of the iodine molecule concentrations, measured simultaneously, are about 10^-6. The sensitivity of nitrogen dioxide detection is 10^16 cm^-3.
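
    The general idea of recovering several species' concentrations from absorbance measured at several wavelengths can be sketched as a linear inversion of the Beer-Lambert law. The cross sections, path length, and concentrations below are illustrative placeholders, not the measured values of this work:

```python
import numpy as np

# Hypothetical absorption cross sections (cm^2) of three isotopologues
# at three probe wavelengths (rows = wavelengths, columns = species).
sigma = np.array([[3.0e-18, 0.4e-18, 0.1e-18],
                  [0.5e-18, 2.5e-18, 0.3e-18],
                  [0.2e-18, 0.6e-18, 2.8e-18]])
L_path = 100.0                                # absorption path length, cm

true_N = np.array([2.0e12, 1.0e12, 3.0e12])   # number densities, cm^-3
# Beer-Lambert: A_i = L * sum_j sigma_ij * N_j
absorbance = sigma @ true_N * L_path

# Recover the concentrations by solving the linear system.
N_est = np.linalg.solve(sigma * L_path, absorbance)
```

As long as the cross-section matrix is well conditioned (i.e., the chosen lines discriminate between the species), the inversion reproduces the concentrations exactly in this noise-free sketch.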

  7. Statistical Inference for Data Adaptive Target Parameters.

    PubMed

    Hubbard, Alan E; Kherad-Pajouh, Sara; van der Laan, Mark J

    2016-05-01

    Suppose one observes n i.i.d. copies of a random variable with a probability distribution that is known to be an element of a particular statistical model. In order to define our statistical target we partition the sample into V equal-size subsamples, and use this partitioning to define V splits into an estimation sample (one of the V subsamples) and a corresponding complementary parameter-generating sample. For each of the V parameter-generating samples, we apply an algorithm that maps the sample to a statistical target parameter. We define our sample-split data adaptive statistical target parameter as the average of these V sample-specific target parameters. We present an estimator (and corresponding central limit theorem) of this type of data adaptive target parameter. This general methodology for generating data adaptive target parameters is demonstrated with a number of practical examples that highlight new opportunities for statistical learning from data. This new framework provides a rigorous statistical methodology for both exploratory and confirmatory analysis within the same data. Given that more research is becoming "data-driven", the theory developed within this paper provides a new impetus for a greater involvement of statistical inference in problems that are being increasingly addressed by clever, yet ad hoc pattern-finding methods. To suggest such potential, and to verify the predictions of the theory, extensive simulation studies, along with a data analysis based on adaptively determined intervention rules, are shown and give insight into how to structure such an approach. The results show that the data adaptive target parameter approach provides a general framework and resulting methodology for data-driven science.
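
    The splitting scheme can be sketched concretely. In this toy version (an assumed illustration, not the authors' estimator), the parameter-generating sample adaptively selects the strongest predictor, the estimation sample estimates its regression coefficient, and the V fold-specific estimates are averaged:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, V = 500, 5, 5
X = rng.normal(size=(n, d))
y = 0.8 * X[:, 2] + rng.normal(size=n)   # only variable 2 truly matters

folds = np.array_split(rng.permutation(n), V)
estimates = []
for v in range(V):
    est_idx = folds[v]                               # estimation sample
    gen_idx = np.concatenate([folds[u] for u in range(V) if u != v])
    # parameter-generating sample: adaptively choose the strongest predictor
    corrs = [abs(np.corrcoef(X[gen_idx, j], y[gen_idx])[0, 1]) for j in range(d)]
    j_star = int(np.argmax(corrs))
    # estimation sample: univariate regression coefficient for that choice
    xe, ye = X[est_idx, j_star], y[est_idx]
    estimates.append(np.cov(xe, ye)[0, 1] / np.var(xe, ddof=1))

psi_hat = float(np.mean(estimates))   # sample-split data-adaptive estimate
```

Because the parameter is chosen on one part of the data and estimated on a disjoint part, the averaged estimate remains honestly interpretable despite the adaptive selection step.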

  8. How to interpret the results of medical time series data analysis: Classical statistical approaches versus dynamic Bayesian network modeling.

    PubMed

    Onisko, Agnieszka; Druzdzel, Marek J; Austin, R Marshall

    2016-01-01

    Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well recognized in the analysis of medical data. The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. This paper offers a comparison of two approaches to the analysis of medical time series data: (1) classical statistical methods, such as the Kaplan-Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. The main outcomes of our comparison are cervical cancer risk assessments produced by the three models (Kaplan-Meier, Cox regression, and the dynamic Bayesian network). Our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Our study shows that the Bayesian approach (1) is much more flexible in terms of modeling effort, and (2) offers individualized risk assessment, which is more cumbersome to obtain with classical statistical approaches.
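
    A minimal sketch of the Kaplan-Meier estimator mentioned above (product-limit form on toy data, ignoring tie-handling refinements):

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.

    times  : follow-up times.
    events : 1 = event observed, 0 = censored.
    Returns a list of (time, survival probability) pairs.
    """
    order = np.argsort(times)
    times = np.asarray(times, float)[order]
    events = np.asarray(events, int)[order]
    n = len(times)
    s = 1.0
    curve = []
    for i, (t, e) in enumerate(zip(times, events)):
        if e == 1:
            s *= 1.0 - 1.0 / (n - i)   # n - i subjects still at risk at time t
        curve.append((t, s))
    return curve
```

Censored observations reduce the risk set without dropping the survival curve, which is the defining feature that separates this estimator from a naive empirical fraction.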

  9. Photophysics of indole-2-carboxylic acid (I2C) and indole-5-carboxylic acid (I5C): heavy atom effect.

    PubMed

    Kowalska-Baron, Agnieszka; Gałęcki, Krystian; Wysocki, Stanisław

    2013-12-01

    In this study the effect of carboxylic group substitution at the 2- and 5-positions of the indole ring on the photophysics of the parent indole chromophore has been investigated. The photophysical parameters crucial in the triplet state decay mechanism of aqueous indole-2-carboxylic acid (I2C) and indole-5-carboxylic acid (I5C) have been determined by applying our previously proposed methodology based on the heavy atom effect and fluorescence and phosphorescence decay kinetics [Kowalska-Baron et al., 2012]. The determined time-resolved phosphorescence spectra of I2C and I5C are red-shifted as compared to that of the parent indole. This red-shift was especially evident in the case of I2C and may indicate the possibility of hydrogen-bonded complex formation incorporating the carbonyl CO, the NH group of I2C and, possibly, surrounding water molecules. The possibility of an excited-state charge transfer process and the subsequent electronic charge redistribution in such a hydrogen-bonded complex may also be postulated. The resulting stabilization of the I2C triplet state is manifested by its relatively long phosphorescence lifetime in aqueous solution (912 μs). The relatively short phosphorescence lifetime of I5C (56 μs) may be the consequence of more effective ground-state quenching of the I5C triplet state. This hypothesis may be strengthened by the significantly larger value of the determined rate constant of I5C triplet state quenching by its ground state (4.4 × 10^8 M^-1 s^-1) as compared to that for indole (6.8 × 10^7 M^-1 s^-1) and I2C (2.3 × 10^7 M^-1 s^-1). The determined bimolecular rate constant for triplet state quenching by iodide [Formula: see text] is equal to 1 × 10^4 M^-1 s^-1, 6 × 10^3 M^-1 s^-1 and 2.7 × 10^4 M^-1 s^-1 for indole, I2C and I5C, respectively. In order to obtain a better insight into iodide quenching of I2C and I5C triplet states in aqueous solution, the temperature dependence of the bimolecular rate constants for iodide quenching of the
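
    Bimolecular quenching constants of this kind are typically extracted from lifetime measurements via a Stern-Volmer-type relation; a minimal sketch (the lifetime and concentration values in the usage example are hypothetical, not data from this study):

```python
def quenching_rate_constant(tau0, tau, conc):
    """Bimolecular quenching rate constant k_q (M^-1 s^-1) from the
    unquenched lifetime tau0 (s) and the lifetime tau (s) measured at
    quencher concentration conc (M): 1/tau = 1/tau0 + k_q * [Q]."""
    return (1.0 / tau - 1.0 / tau0) / conc
```

For instance, a lifetime halved from 1 ms to 0.5 ms by a 10^-4 M quencher corresponds to k_q = 10^7 M^-1 s^-1.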

  10. Immersive Input Display Device (I2D2) for tactical information viewing

    NASA Astrophysics Data System (ADS)

    Tremper, David E.; Burnett, Kevin P.; Malloy, Andrew R.; Wert, Robert

    2006-05-01

    Daylight readability of hand-held displays has been an ongoing issue for both commercial and military applications. In an effort to reduce the effects of ambient light on the readability of military displays, the Naval Research Laboratory (NRL) began investigating and developing advanced hand-held displays. Analysis and research of display technologies, with consideration for vulnerability to environmental conditions, resulted in the complete design and fabrication of the hand-held Immersive Input Display Device (I2D2) monocular. The I2D2 combines an Organic Light Emitting Diode (OLED) SVGA+ micro-display developed by eMagin Corporation with an optics configuration inside a cylindrical housing. A rubber pressure-eyecup allows viewability only when the eyecup is depressed, preventing light from either entering or leaving the device. This feature allows the I2D2 to be used during the day without ambient light affecting readability. It simultaneously controls light leakage, effectively eliminating the illumination of the user in the dark and thus preserving the user's tactical position. This paper will examine the characteristics and introduce the design of the I2D2.

  11. Number statistics for β-ensembles of random matrices: Applications to trapped fermions at zero temperature.

    PubMed

    Marino, Ricardo; Majumdar, Satya N; Schehr, Grégory; Vivo, Pierpaolo

    2016-09-01

    Let P_{β}^{(V)}(N_{I}) be the probability that an N×N β-ensemble of random matrices with confining potential V(x) has N_{I} eigenvalues inside an interval I=[a,b] on the real line. We introduce a general formalism, based on the Coulomb gas technique and the resolvent method, to compute analytically P_{β}^{(V)}(N_{I}) for large N. We show that this probability scales for large N as P_{β}^{(V)}(N_{I})≈exp[-βN^{2}ψ^{(V)}(N_{I}/N)], where β is the Dyson index of the ensemble. The rate function ψ^{(V)}(k_{I}), independent of β, is computed in terms of single integrals that can be easily evaluated numerically. The general formalism is then applied to the classical β-Gaussian (I=[-L,L]), β-Wishart (I=[1,L]), and β-Cauchy (I=[-L,L]) ensembles. Expanding the rate function around its minimum, we find that generically the number variance var(N_{I}) exhibits a nonmonotonic behavior as a function of the size of the interval, with a maximum that can be precisely characterized. These analytical results, corroborated by numerical simulations, provide the full counting statistics of many systems where random matrix models apply. In particular, we present results for the full counting statistics of zero-temperature one-dimensional spinless fermions in a harmonic trap.
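
    The counting statistics can also be probed directly by Monte Carlo for the β = 1 (Gaussian orthogonal) case; a sketch with illustrative matrix size and interval choices (with this normalization the bulk lies roughly in [-√2, √2]):

```python
import numpy as np

rng = np.random.default_rng(2)
N, trials = 60, 200

def goe_eigenvalues(n):
    """Eigenvalues of one GOE sample, scaled by sqrt(n)."""
    a = rng.normal(size=(n, n))
    return np.linalg.eigvalsh((a + a.T) / 2.0) / np.sqrt(n)

# variance of the number of eigenvalues in I = [-L, L] for growing L
var_NI = []
for L in (0.3, 0.8, 2.0):
    counts = [int(np.sum(np.abs(goe_eigenvalues(N)) <= L))
              for _ in range(trials)]
    var_NI.append(float(np.var(counts)))
```

Once the interval engulfs the whole spectrum the count is essentially deterministic (variance near zero), whereas an interval cutting through the bulk shows genuine fluctuations, consistent with the nonmonotonic behavior of var(N_I) described above.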

  12. MICROARRAY DATA ANALYSIS USING MULTIPLE STATISTICAL MODELS

    EPA Science Inventory

    Microarray Data Analysis Using Multiple Statistical Models

    Wenjun Bao1, Judith E. Schmid1, Amber K. Goetz1, Ming Ouyang2, William J. Welsh2,Andrew I. Brooks3,4, ChiYi Chu3,Mitsunori Ogihara3,4, Yinhe Cheng5, David J. Dix1. 1National Health and Environmental Effects Researc...

  13. MICROWAVE SPECTRA AND GEOMETRIES OF C2H2···AgI AND C2H4···AgI

    NASA Astrophysics Data System (ADS)

    Stephens, Susanna L.; Tew, David Peter; Walker, Nick; Legon, Anthony

    2015-06-01

    A chirped-pulse Fourier transform microwave spectrometer has been used to measure the microwave spectra of both C2H2···AgI and C2H4···AgI. These complexes are generated via laser ablation at 532 nm of a silver surface in the presence of CF3I and either C2H2 or C2H4 and argon, and are stabilized by a supersonic expansion. Rotational constants (A0, B0, C0) and centrifugal distortion constants (ΔJ and ΔJK) of each molecule have been determined, as well as the nuclear electric quadrupole coupling constants of the iodine atom (χaa(I) and χbb-χcc(I)). The spectrum of each molecule is consistent with a C2v structure in which the metal atom interacts with the π-orbital of the ethene or ethyne molecule. Isotopic substitutions of atoms within the C2H2 or C2H4 subunits are in progress and, in conjunction with high-level ab initio calculations, will allow for accurate determination of the geometry of each molecule. These two complexes are put in the context of the recently studied H2S···AgI, OC···AgI, H3N···AgI and (CH3)3N···AgI. S.Z. Riaz, S.L. Stephens, W. Mizukami, D.P. Tew, N.R. Walker, A.C. Legon, Chem. Phys. Lett., 531, 1-12 (2012); S.L. Stephens, W. Mizukami, D.P. Tew, N.R. Walker, A.C. Legon, J. Chem. Phys., 136(6), 064306 (2012); D.M. Bittner, D.P. Zaleski, S.L. Stephens, N.R. Walker, A.C. Legon, study in progress.

  14. Randomized Clinical Trial of a Self-Adhering Flowable Composite for Class I Restorations: 2-Year Results.

    PubMed

    Sabbagh, J; Dagher, S; El Osta, N; Souhaid, P

    2017-01-01

    Objectives. To compare the clinical performance of a self-adhering flowable resin composite and a conventional flowable composite with a self-etch bonding system on permanent molars. The influence of using rubber dam versus cotton roll isolation was also investigated. Materials and Methods. Patients aged between 6 and 12 years and presenting at least two permanent molars in need of small class I restorations were selected. Thirty-four pairs of restorations were randomly placed by the same operator. Fifteen patients were treated under rubber dam and nineteen using cotton roll isolation and a saliva ejector. Restorations were evaluated according to the modified USPHS criteria at baseline, 6 months, and 1 and 2 years by two independent evaluators. Results. All patients attended the two-year recall. For all measured variables, there was no significant difference between rubber dam and cotton roll isolation after 2 years of restoration with Premise Flowable or Vertise Flow (p > 0.05). The percentage of restorations scored alpha decreased significantly over time with Premise Flowable and Vertise Flow for marginal adaptation, surface texture, and marginal discoloration, while it did not vary significantly for color matching. After 2 years, Vertise Flow showed behaviour similar to that of Premise Flowable used with a self-adhesive resin system.

  15. Results for Phase I of the IAEA Coordinated Research Program on HTGR Uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strydom, Gerhard; Bostelmann, Friederike; Yoon, Su Jong

    2015-01-01

    The quantification of uncertainties in the design and safety analysis of reactors is today not only broadly accepted, but in many cases has become the preferred way to replace traditional conservative analysis for safety and licensing purposes. The use of a more fundamental methodology is also consistent with the reliable high-fidelity physics models and robust, efficient, and accurate codes available today. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied. High Temperature Gas-cooled Reactors (HTGRs) have their own peculiarities: coated particle design, large graphite quantities, different materials, and high temperatures that impose additional simulation requirements. The IAEA therefore launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modeling (UAM) in 2013 to study uncertainty propagation specifically in the HTGR analysis chain. Two benchmark problems are defined, with the prismatic design represented by the General Atomics (GA) MHTGR-350 and a 250 MW modular pebble bed design similar to the HTR-PM (INET, China). This report summarizes the contributions of the HTGR Methods Simulation group at Idaho National Laboratory (INL) up to this point of the CRP. The activities at INL have been focused so far on creating the problem specifications for the prismatic design, as well as providing reference solutions for the exercises defined for Phase I. An overview is provided of the HTGR UAM objectives and scope, and the detailed specifications for Exercises I-1, I-2, I-3 and I-4 are also included here for completeness. The main focus of the report is the compilation and discussion of reference results for Phase I (i.e. for input parameters at their nominal or best-estimate values), which is defined as the first step of the uncertainty quantification process. These reference results can be used by other CRP participants for comparison with other codes or their own

  16. Latitude Dependence of Low-Altitude O+ Ion Upflow: Statistical Results From FAST Observations

    NASA Astrophysics Data System (ADS)

    Zhao, K.; Chen, K. W.; Jiang, Y.; Chen, W. J.; Huang, L. F.; Fu, S.

    2017-09-01

    We introduce a statistical model to explain the latitudinal dependence of the occurrence rate and energy flux of ionospheric escaping ions, taking advantage of advances in the spatial coverage and accuracy of FAST observations. We use a weighted piecewise Gaussian function to fit the dependence, because two probability peaks are located in the dayside polar cusp source region and the nightside auroral oval zone source region. The statistical results show that (1) the Gaussian mixture model suitably describes the dayside polar cusp upflows and the dayside and nightside auroral oval zone upflows. (2) The magnetic latitudes of the ionospheric upflow source regions expand toward the magnetic equator as Kp increases, from 81° magnetic latitude (MLAT) (cusp upflows) and 63° MLAT (auroral oval upflows) during quiet times to 76° MLAT and 61° MLAT, respectively. (3) The dayside polar cusp region provides only 3-5% of O+ upflows among all the source regions, which include the dayside auroral oval zone, dayside polar cusp, nightside auroral oval zone, and even the polar cap. However, observations show that more than 70% of upflows occur in the auroral oval zone and that the occurrence probability increases at altitudes of 3500-4200 km, which is considered to be the lower altitude boundary of ion beams. This observed result suggests that soft electron precipitation and transverse wave heating are the most efficient ion energization/acceleration mechanisms at the altitudes of the FAST orbit, and that parallel acceleration caused by field-aligned potential drops becomes effective above that altitude.
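
    A two-peaked latitude distribution of this kind can be fitted with a two-component Gaussian mixture; a minimal EM sketch on synthetic data (the means, widths, and mixing fraction are illustrative stand-ins for the two source regions, not FAST values):

```python
import numpy as np

rng = np.random.default_rng(3)
# synthetic "latitude" sample: two source regions (illustrative values only)
lat = np.concatenate([rng.normal(63.0, 2.0, 700),
                      rng.normal(78.0, 1.5, 300)])

# EM for a two-component one-dimensional Gaussian mixture
w = np.array([0.5, 0.5])          # mixing weights
mu = np.array([60.0, 80.0])       # initial component means
sd = np.array([5.0, 5.0])         # initial component widths
for _ in range(200):
    # E-step: component responsibilities for each observation
    dens = (w * np.exp(-0.5 * ((lat[:, None] - mu) / sd) ** 2)
            / (sd * np.sqrt(2 * np.pi)))
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: update weights, means, and standard deviations
    nk = r.sum(axis=0)
    w = nk / len(lat)
    mu = (r * lat[:, None]).sum(axis=0) / nk
    sd = np.sqrt((r * (lat[:, None] - mu) ** 2).sum(axis=0) / nk)
```

The recovered means land on the two injected peaks and the weights on the injected 70/30 split, which is the sense in which a mixture model "suitably describes" a two-source latitude distribution.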

  17. Spectral and thermal studies of MgI2·8H2O

    NASA Astrophysics Data System (ADS)

    Koleva, Violeta; Stefov, Viktor; Najdoski, Metodija; Ilievski, Zlatko; Cahil, Adnan

    2017-10-01

    In the present contribution special attention is paid to the spectroscopic and thermal characterization of MgI2·8H2O, which is the stable hydrated form at room temperature. The infrared spectra of MgI2·8H2O and its deuterated analogues, recorded at room and liquid nitrogen temperature, are presented and interpreted. In the low-temperature difference infrared spectrum of the slightly deuterated analogue (≈5% D), at least four bands are found out of the expected five (at 2595, 2550, 2538 and 2495 cm-1) as a result of the uncoupled O-D oscillators in the isotopically isolated HOD molecules. Multiple bands are observed in the water bending region, and only two bands of the HOH librational modes are found. For a more precise description of the processes occurring upon heating of MgI2·8H2O we have applied a simultaneous TG/DTA/mass spectrometry technique, identifying the gases evolved during the thermal transformations. We have established that the thermal decomposition of MgI2·8H2O is a complex process that takes place in two main stages. In the first stage (between 120 and 275 °C) the salt undergoes a partial stepwise dehydration to MgI2·2H2O followed by a hydrolytic decomposition with formation of the magnesium hydroxyiodide Mg(OH)1.44I0.56, accompanied by simultaneous release of H2O and HI. In the second stage Mg(OH)1.44I0.56 is completely decomposed to MgO with elimination of gaseous H2O, HI, I2 and H2. Infrared spectra of annealed samples heated between 190 and 270 °C confirmed the formation of the magnesium hydroxyiodide.

  18. TRACX2: a connectionist autoencoder using graded chunks to model infant visual statistical learning.

    PubMed

    Mareschal, Denis; French, Robert M

    2017-01-05

    Even newborn infants are able to extract structure from a stream of sensory inputs; yet how this is achieved remains largely a mystery. We present a connectionist autoencoder model, TRACX2, that learns to extract sequence structure by gradually constructing chunks, storing these chunks in a distributed manner across its synaptic weights and recognizing these chunks when they re-occur in the input stream. Chunks are graded rather than all-or-nothing in nature. As chunks are learnt their component parts become more and more tightly bound together. TRACX2 successfully models the data from five experiments from the infant visual statistical learning literature, including tasks involving forward and backward transitional probabilities, low-salience embedded chunk items, part-sequences and illusory items. The model also captures performance differences across ages through the tuning of a single learning-rate parameter. These results suggest that infant statistical learning is underpinned by the same domain-general learning mechanism that operates in auditory statistical learning and, potentially, in adult artificial grammar learning. This article is part of the themed issue 'New frontiers for statistical learning in the cognitive sciences'. © 2016 The Author(s).

  19. TRACX2: a connectionist autoencoder using graded chunks to model infant visual statistical learning

    PubMed Central

    French, Robert M.

    2017-01-01

    Even newborn infants are able to extract structure from a stream of sensory inputs; yet how this is achieved remains largely a mystery. We present a connectionist autoencoder model, TRACX2, that learns to extract sequence structure by gradually constructing chunks, storing these chunks in a distributed manner across its synaptic weights and recognizing these chunks when they re-occur in the input stream. Chunks are graded rather than all-or-nothing in nature. As chunks are learnt their component parts become more and more tightly bound together. TRACX2 successfully models the data from five experiments from the infant visual statistical learning literature, including tasks involving forward and backward transitional probabilities, low-salience embedded chunk items, part-sequences and illusory items. The model also captures performance differences across ages through the tuning of a single learning-rate parameter. These results suggest that infant statistical learning is underpinned by the same domain-general learning mechanism that operates in auditory statistical learning and, potentially, in adult artificial grammar learning. This article is part of the themed issue ‘New frontiers for statistical learning in the cognitive sciences’. PMID:27872375

  20. Basics of meta-analysis: I2 is not an absolute measure of heterogeneity.

    PubMed

    Borenstein, Michael; Higgins, Julian P T; Hedges, Larry V; Rothstein, Hannah R

    2017-03-01

    When we speak about heterogeneity in a meta-analysis, our intent is usually to understand the substantive implications of the heterogeneity. If an intervention yields a mean effect size of 50 points, we want to know if the effect size in different populations varies from 40 to 60, or from 10 to 90, because this speaks to the potential utility of the intervention. While there is a common belief that the I2 statistic provides this information, it actually does not. In this example, if we are told that I2 is 50%, we have no way of knowing if the effects range from 40 to 60, or from 10 to 90, or across some other range. Rather, if we want to communicate the predicted range of effects, then we should simply report this range. This gives readers the information they think is being captured by I2 and does so in a way that is concise and unambiguous. Copyright © 2017 John Wiley & Sons, Ltd.
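
    The point can be made numerically using the common heuristic I2 ≈ tau^2 / (tau^2 + s^2), where tau^2 is the between-study variance and s^2 a typical within-study variance; the variance values below are illustrative:

```python
import math

def i_squared(tau2, s2):
    """Heuristic I^2: share of total variance that is between-study
    variance, given a typical within-study (sampling) variance s2."""
    return tau2 / (tau2 + s2)

tau2 = 0.04                            # true-effect spread: sd = 0.2
i2_precise = i_squared(tau2, 0.01)     # large, precise studies  -> I^2 = 0.80
i2_imprecise = i_squared(tau2, 0.16)   # small, imprecise studies -> I^2 = 0.20

# The substantive spread of true effects is identical in both cases:
mu = 0.5
pred_lo = mu - 1.96 * math.sqrt(tau2)
pred_hi = mu + 1.96 * math.sqrt(tau2)
```

Two meta-analyses with exactly the same range of true effects report I2 = 80% versus 20% purely because their studies differ in precision, which is why reporting the predicted range itself is more informative than reporting I2.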

  1. Statistical Inference for Quality-Adjusted Survival Time

    DTIC Science & Technology

    2003-08-01

    survival functions of QAL. If an influence function for a test statistic exists for complete data case, denoted as ’i, then a test statistic for...the survival function for the censoring variable. Zhao and Tsiatis (2001) proposed a test statistic where O is the influence function of the general...to 1 everywhere until a subject’s death. We have considered other forms of test statistics. One option is to use an influence function 0i that is

  2. Functional brain networks for learning predictive statistics.

    PubMed

    Giorgio, Joseph; Karlaftis, Vasilis M; Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew; Kourtzi, Zoe

    2017-08-18

    Making predictions about future events relies on interpreting streams of information that may initially appear incomprehensible. This skill relies on extracting regular patterns in space and time by mere exposure to the environment (i.e., without explicit feedback). Yet, we know little about the functional brain networks that mediate this type of statistical learning. Here, we test whether changes in the processing and connectivity of functional brain networks due to training relate to our ability to learn temporal regularities. By combining behavioral training and functional brain connectivity analysis, we demonstrate that individuals adapt to the environment's statistics as they change over time from simple repetition to probabilistic combinations. Further, we show that individual learning of temporal structures relates to decision strategy. Our fMRI results demonstrate that learning-dependent changes in fMRI activation within and functional connectivity between brain networks relate to individual variability in strategy. In particular, extracting the exact sequence statistics (i.e., matching) relates to changes in brain networks known to be involved in memory and stimulus-response associations, while selecting the most probable outcomes in a given context (i.e., maximizing) relates to changes in frontal and striatal networks. Thus, our findings provide evidence that dissociable brain networks mediate individual ability in learning behaviorally-relevant statistics. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
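
    The matching-versus-maximizing distinction can be illustrated with a small simulation (the conditional outcome probability is an assumed value, not a parameter of the study):

```python
import numpy as np

rng = np.random.default_rng(4)
p_A = 0.8              # assumed P(outcome A | context), for illustration
n = 20000
outcome_is_A = rng.random(n) < p_A

# maximizing: always predict the most probable outcome (A)
acc_maximize = float(np.mean(outcome_is_A))
# matching: predict A with probability p_A (probability matching)
predict_A = rng.random(n) < p_A
acc_match = float(np.mean(predict_A == outcome_is_A))
```

With p = 0.8, maximizing is correct about 80% of the time while matching is correct about p^2 + (1-p)^2 = 68% of the time, which is why the two strategies are behaviorally dissociable.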

  3. Statistics Refresher for Molecular Imaging Technologists, Part 2: Accuracy of Interpretation, Significance, and Variance.

    PubMed

    Farrell, Mary Beth

    2018-06-01

    This article is the second part of a continuing education series reviewing basic statistics that nuclear medicine and molecular imaging technologists should understand. In this article, the statistics for evaluating interpretation accuracy, significance, and variance are discussed. Throughout the article, actual statistics are pulled from the published literature. We begin by explaining 2 methods for quantifying interpretive accuracy: interreader and intrareader reliability. Agreement among readers can be expressed simply as a percentage. However, the Cohen κ-statistic is a more robust measure of agreement that accounts for chance. The higher the κ-statistic is, the higher is the agreement between readers. When 3 or more readers are being compared, the Fleiss κ-statistic is used. Significance testing determines whether the difference between 2 conditions or interventions is meaningful. Statistical significance is usually expressed using a number called a probability (P) value. Calculation of P values is beyond the scope of this review. However, knowing how to interpret P values is important for understanding the scientific literature. Generally, a P value of less than 0.05 is considered significant and indicates that the results of the experiment are due to more than just chance. Variance, standard deviation (SD), confidence interval, and standard error (SE) explain the dispersion of data around a mean of a sample drawn from a population. SD is commonly reported in the literature. A small SD indicates that there is not much variation in the sample data. Many biologic measurements fall into what is referred to as a normal distribution taking the shape of a bell curve. In a normal distribution, 68% of the data will fall within 1 SD, 95% will fall within 2 SDs, and 99.7% will fall within 3 SDs. Confidence interval defines the range of possible values within which the population parameter is likely to lie and gives an idea of the precision of the statistic being
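
    A minimal sketch of the Cohen κ-statistic described above, computed from two readers' categorical ratings (the toy ratings in the usage example are hypothetical):

```python
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa: agreement between two readers, corrected for the
    agreement expected by chance alone."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    p_obs = np.mean(r1 == r2)                     # observed agreement
    cats = np.union1d(r1, r2)
    p_chance = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)
    return (p_obs - p_chance) / (1.0 - p_chance)
```

Perfect agreement gives κ = 1, while agreement no better than chance gives κ = 0, which is why κ is preferred over a raw percentage of agreement.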

  4. Statistical characterization of the standard map

    NASA Astrophysics Data System (ADS)

    Ruiz, Guiomar; Tirnakli, Ugur; Borges, Ernesto P.; Tsallis, Constantino

    2017-06-01

    The standard map, a paradigmatic conservative system in the (x, p) phase space, has recently been shown (Tirnakli and Borges (2016 Sci. Rep. 6 23644)) to exhibit interesting statistical behaviors directly related to the value of the standard map external parameter K. A comprehensive statistical numerical description is achieved in the present paper. More precisely, for large values of K (e.g. K = 10), where the Lyapunov exponents are neatly positive over virtually the entire phase space consistently with Boltzmann-Gibbs (BG) statistics, we verify that the q-generalized indices related to the entropy production q_ent, the sensitivity to initial conditions q_sen, the distribution of a time-averaged (over successive iterations) phase-space coordinate q_stat, and the relaxation to the equilibrium final state q_rel, collapse onto a fixed point, i.e. q_ent = q_sen = q_stat = q_rel = 1. In remarkable contrast, for small values of K (e.g. K = 0.2), where the Lyapunov exponents are virtually zero over the entire phase space, we verify q_ent = q_sen = 0, q_stat ≃ 1.935, and q_rel ≃ 1.4. The situation corresponding to intermediate values of K, where both stable orbits and a chaotic sea are present, is discussed as well. The present results transparently illustrate when BG behavior and/or q-statistical behavior are observed.
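
    The contrast between the two regimes described above can be checked numerically. The sketch below estimates the largest Lyapunov exponent of the standard map by the usual two-orbit renormalization method; the initial condition, perturbation size, and iteration count are illustrative choices, not values from the paper:

```python
import math

TWO_PI = 2.0 * math.pi

def standard_map(x, p, K):
    """One iteration: p' = p + K sin(x), x' = x + p' (both mod 2*pi)."""
    p = (p + K * math.sin(x)) % TWO_PI
    x = (x + p) % TWO_PI
    return x, p

def lyapunov(K, x0=0.5, p0=0.3, n=20000, d0=1e-9):
    """Largest Lyapunov exponent via two nearby orbits, renormalized each step."""
    x, p = x0, p0
    x2, p2 = x0 + d0, p0          # shadow orbit, displaced by d0
    total = 0.0
    for _ in range(n):
        x, p = standard_map(x, p, K)
        x2, p2 = standard_map(x2, p2, K)
        # separation, accounting for the torus topology
        dx = (x2 - x + math.pi) % TWO_PI - math.pi
        dp = (p2 - p + math.pi) % TWO_PI - math.pi
        d = math.hypot(dx, dp)
        total += math.log(d / d0)
        # rescale the shadow orbit back to distance d0 from the reference
        x2 = (x + dx * d0 / d) % TWO_PI
        p2 = (p + dp * d0 / d) % TWO_PI
    return total / n

print(lyapunov(10.0))   # clearly positive: strong chaos
print(lyapunov(0.2))    # near zero: mostly regular motion
```

    For K = 10 the estimate is close to the theoretical ln(K/2) ≈ 1.6, while for K = 0.2 it hovers near zero, matching the two regimes discussed in the abstract.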

  5. Implication of correlations among some common stability statistics - a Monte Carlo simulations.

    PubMed

    Piepho, H P

    1995-03-01

    Stability analysis of multilocation trials is often based on a mixed two-way model. Two stability measures in frequent use are the environmental variance (S_i^2) and the ecovalence (W_i). Under the two-way model the rank orders of the expected values of these two statistics are identical for a given set of genotypes. By contrast, empirical rank correlations among these measures are consistently low. This suggests that the two-way mixed model may not be appropriate for describing real data. To check this hypothesis, a Monte Carlo simulation was conducted. It revealed that the low empirical rank correlation between S_i^2 and W_i is most likely due to sampling errors. It is concluded that the observed low rank correlation does not invalidate the two-way model. The paper also discusses tests for homogeneity of S_i^2 as well as implications of the two-way model for the classification of stability statistics.
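
    The two stability measures and their sampling variability can be illustrated with a small simulation in the spirit of (but not reproducing) the paper's Monte Carlo study; the dimensions and variance components below are arbitrary assumptions:

```python
import random
import statistics

random.seed(1)

G, E = 20, 8  # genotypes x environments (illustrative sizes)

def simulate_trial():
    """Two-way data y[i][j] = mu + g_i + e_j + interaction noise."""
    g = [random.gauss(0, 1.0) for _ in range(G)]
    e = [random.gauss(0, 1.0) for _ in range(E)]
    # heterogeneous interaction SDs give genotypes genuinely different stability
    tau = [random.uniform(0.2, 1.5) for _ in range(G)]
    return [[10.0 + g[i] + e[j] + random.gauss(0, tau[i]) for j in range(E)]
            for i in range(G)]

def stability_stats(y):
    """Environmental variance S_i^2 and ecovalence W_i for each genotype."""
    row = [statistics.mean(y[i]) for i in range(G)]
    col = [statistics.mean(y[i][j] for i in range(G)) for j in range(E)]
    grand = statistics.mean(row)
    s2 = [statistics.variance(y[i]) for i in range(G)]
    w = [sum((y[i][j] - row[i] - col[j] + grand) ** 2 for j in range(E))
         for i in range(G)]
    return s2, w

def spearman(a, b):
    """Spearman rank correlation (ties are not expected for continuous data)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda k: v[k])
        r = [0] * len(v)
        for rank, k in enumerate(order):
            r[k] = rank
        return r
    ra, rb = ranks(a), ranks(b)
    ma, mb = statistics.mean(ra), statistics.mean(rb)
    num = sum((u - ma) * (v - mb) for u, v in zip(ra, rb))
    den = (sum((u - ma) ** 2 for u in ra) * sum((v - mb) ** 2 for v in rb)) ** 0.5
    return num / den

rs = [spearman(*stability_stats(simulate_trial())) for _ in range(200)]
print(f"mean rank correlation S_i^2 vs W_i: {statistics.mean(rs):.2f}")
```

    With only a handful of environments per trial, the per-trial rank correlations scatter well below 1 even though both statistics track the same underlying stability, illustrating how sampling error alone can depress the empirical correlation.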

  6. Systolic [Ca2+]i regulates diastolic levels in rat ventricular myocytes

    PubMed Central

    Sankaranarayanan, Rajiv; Kistamás, Kornél; Greensmith, David J.; Venetucci, Luigi A.

    2017-01-01

    Key points For the heart to function as a pump, intracellular calcium concentration ([Ca2+]i) must increase during systole to activate contraction and then fall, during diastole, to allow the myofilaments to relax and the heart to refill with blood. The present study investigates the control of diastolic [Ca2+]i in rat ventricular myocytes. We show that diastolic [Ca2+]i is increased by manoeuvres that decrease sarcoplasmic reticulum function. This is accompanied by a decrease of systolic [Ca2+]i such that the time‐averaged [Ca2+]i remains constant. We report that diastolic [Ca2+]i is controlled by the balance between Ca2+ entry and Ca2+ efflux during systole. The results of the present study identify a novel mechanism by which changes of the amplitude of the systolic Ca transient control diastolic [Ca2+]i. Abstract The intracellular Ca concentration ([Ca2+]i) must be sufficiently low in diastole so that the ventricle is relaxed and can refill with blood. Interference with this will impair relaxation. The factors responsible for regulation of diastolic [Ca2+]i, in particular the relative roles of the sarcoplasmic reticulum (SR) and surface membrane, are unclear. We investigated the effects on diastolic [Ca2+]i that result from the changes of Ca cycling known to occur in heart failure. Experiments were performed using Fluo‐3 in voltage-clamped rat ventricular myocytes. Increasing stimulation frequency increased diastolic [Ca2+]i. This increase of [Ca2+]i was larger when SR function was impaired either by making the ryanodine receptor leaky (with caffeine or ryanodine) or by decreasing sarco/endoplasmic reticulum Ca‐ATPase activity with thapsigargin. The increase of diastolic [Ca2+]i produced by interfering with the SR was accompanied by a decrease of the amplitude of the systolic Ca transient, such that there was no change of time‐averaged [Ca2+]i. Time‐averaged [Ca2+]i was increased by β‐adrenergic stimulation with isoprenaline and increased in a saturating

  7. A note on generalized Genome Scan Meta-Analysis statistics

    PubMed Central

    Koziol, James A; Feng, Anne C

    2005-01-01

    Background Wise et al. introduced a rank-based statistical technique for meta-analysis of genome scans, the Genome Scan Meta-Analysis (GSMA) method. Levinson et al. recently described two generalizations of the GSMA statistic: (i) a weighted version of the GSMA statistic, so that different studies can be ascribed different weights for analysis; and (ii) an order statistic approach, reflecting the fact that a GSMA statistic can be computed for each chromosomal region or bin width across the various genome scan studies. Results We provide an Edgeworth approximation to the null distribution of the weighted GSMA statistic, examine the limiting distribution of the GSMA statistics under the order statistic formulation, and quantify the relevance of the pairwise correlations of the GSMA statistics across different bins on this limiting distribution. We also remark on aggregate criteria and multiple testing for determining significance of GSMA results. Conclusion Theoretical considerations detailed herein can lead to clarification and simplification of testing criteria for generalizations of the GSMA statistic. PMID:15717930

  8. iTOUGH2 Universal Optimization Using the PEST Protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finsterle, S.A.

    2010-07-01

    iTOUGH2 (http://www-esd.lbl.gov/iTOUGH2) is a computer program for parameter estimation, sensitivity analysis, and uncertainty propagation analysis [Finsterle, 2007a, b, c]. iTOUGH2 contains a number of local and global minimization algorithms for automatic calibration of a model against measured data, or for the solution of other, more general optimization problems (see, for example, Finsterle [2005]). A detailed residual and estimation uncertainty analysis is conducted to assess the inversion results. Moreover, iTOUGH2 can be used to perform a formal sensitivity analysis, or to conduct Monte Carlo simulations for the examination of prediction uncertainties. iTOUGH2's capabilities are continually enhanced. As the name implies, iTOUGH2 is developed for use in conjunction with the TOUGH2 forward simulator for nonisothermal multiphase flow in porous and fractured media [Pruess, 1991]. However, iTOUGH2 provides FORTRAN interfaces for the estimation of user-specified parameters (see subroutine USERPAR) based on user-specified observations (see subroutine USEROBS). These user interfaces can be invoked to add new parameter or observation types to the standard set provided in iTOUGH2. They can also be linked to non-TOUGH2 models, i.e., iTOUGH2 can be used as a universal optimization code, similar to other model-independent, nonlinear parameter estimation packages such as PEST [Doherty, 2008] or UCODE [Poeter and Hill, 1998]. However, to make iTOUGH2's optimization capabilities available for use with an external code, the user is required to write some FORTRAN code that provides the link between the iTOUGH2 parameter vector and the input parameters of the external code, and between the output variables of the external code and the iTOUGH2 observation vector. While allowing for maximum flexibility, the coding requirement of this approach limits its applicability to those users with FORTRAN coding knowledge. To make iTOUGH2 capabilities accessible to many

  9. Development of an ultra-compact CsI/HgI2 gamma-ray scintillation spectrometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patt, B.E.; Wang, Y.J.; Iwanczyk, J.S.

    A novel semiconductor photodetector has been developed which utilizes large mercuric iodide photodetectors coupled to highly optimized CsI(Tl) scintillators for gamma-ray spectroscopy. With this new detector technology the authors have achieved energy resolution superior to that of any other scintillation detector. Furthermore, gamma probes based on the new HgI2/CsI(Tl) detector can be highly miniaturized, offering improved portability. A 1/2-inch diameter HgI2 photodetector coupled with a 1/2-inch diameter by 1/2-inch high right-rectangular scintillator produced energy resolution of 4.58% FWHM for 137Cs (662 keV). This is perhaps the best result ever reported for room-temperature scintillation spectroscopy. Evaluation of a prototype device with similar performance has been conducted at Los Alamos using Pu and U standard samples. Recently, Monte Carlo simulations have been performed for co-optimization of the gamma-collection efficiency and light-collection efficiency of the scintillator/photodetector pairs, resulting in a new tapered scintillator geometry. Energy resolution of 5.69% FWHM at 662 keV was obtained for a 1-inch diameter photodetector coupled to a two-inch long conical CsI(Tl) scintillator, with dimensions: 1-inch diameter at the top tapered to 2-inch diameter at the bottom. The long-term stability of the technology has been verified. Current efforts to optimize the detectors for specific applications in safeguards and in materials control and accountability are discussed.

  10. iTOUGH2 v7.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    FINSTERLE, STEFAN; JUNG, YOOJIN; KOWALSKY, MICHAEL

    2016-09-15

    iTOUGH2 (inverse TOUGH2) provides inverse modeling capabilities for TOUGH2, a simulator for multi-dimensional, multi-phase, multi-component, non-isothermal flow and transport in fractured porous media. iTOUGH2 performs sensitivity analyses, data-worth analyses, parameter estimation, and uncertainty propagation analyses in geosciences, reservoir engineering, and other application areas. iTOUGH2 supports a number of different combinations of fluids and components (equation-of-state (EOS) modules). In addition, the optimization routines implemented in iTOUGH2 can also be used for sensitivity analysis, automatic model calibration, and uncertainty quantification of any external code that uses text-based input and output files using the PEST protocol. iTOUGH2 solves the inverse problem by minimizing a non-linear objective function of the weighted differences between model output and the corresponding observations. Multiple minimization algorithms (derivative-free, gradient-based, and second-order; local and global) are available. iTOUGH2 also performs Latin Hypercube Monte Carlo simulations for uncertainty propagation analyses. A detailed residual and error analysis is provided. This upgrade includes (a) global sensitivity analysis methods, (b) dynamic memory allocation, (c) additional input features and output analyses, (d) increased forward simulation capabilities, (e) parallel execution on multicore PCs and Linux clusters, and (f) bug fixes. More details can be found at http://esd.lbl.gov/iTOUGH2.

  11. Toward Complete Statistics of Massive Binary Stars: Penultimate Results from the Cygnus OB2 Radial Velocity Survey

    NASA Astrophysics Data System (ADS)

    Kobulnicky, Henry A.; Kiminki, Daniel C.; Lundquist, Michael J.; Burke, Jamison; Chapman, James; Keller, Erica; Lester, Kathryn; Rolen, Emily K.; Topel, Eric; Bhattacharjee, Anirban; Smullen, Rachel A.; Vargas Álvarez, Carlos A.; Runnoe, Jessie C.; Dale, Daniel A.; Brotherton, Michael M.

    2014-08-01

    We analyze orbital solutions for 48 massive multiple-star systems in the Cygnus OB2 association, 23 of which are newly presented here, to find that the observed distribution of orbital periods is approximately uniform in log P for P < 45 days, but it is not scale-free. Inflections in the cumulative distribution near 6 days, 14 days, and 45 days suggest key physical scales of ≃0.2, ≃0.4, and ≃1 AU where yet-to-be-identified phenomena create distinct features. No single power law provides a statistically compelling prescription, but if features are ignored, a power law with exponent β ≃ -0.22 provides a crude approximation over P = 1.4-2000 days, as does a piecewise linear function with a break near 45 days. The cumulative period distribution flattens at P > 45 days, even after correction for completeness, indicating either a lower binary fraction or a shift toward low-mass companions. A high degree of similarity (91% likelihood) between the Cyg OB2 period distribution and that of other surveys suggests that the binary properties at P ≲ 25 days are determined by local physics of disk/clump fragmentation and are relatively insensitive to environmental and evolutionary factors. Fully 30% of the unbiased parent sample consists of binaries with period P < 45 days. Completeness corrections imply a binary fraction near 55% for P < 5000 days. The observed distribution of mass ratios 0.2 < q < 1 is consistent with uniform, while the observed distribution of eccentricities 0.1 < e < 0.6 is consistent with uniform plus an excess of e ≃ 0 systems. We identify six stars, all supergiants, that exhibit aperiodic velocity variations of ~30 km s^-1 attributed to atmospheric fluctuations.

  12. Bridgman-Stockbarger growth of SrI2:Eu2+ single crystal

    NASA Astrophysics Data System (ADS)

    Raja, A.; Daniel, D. Joseph; Ramasamy, P.; Singh, S. G.; Sen, S.; Gadkari, S. C.

    2018-05-01

    Strontium iodide (SrI2) doped with europium iodide (EuI2) was purified by a zone-refinement process. A europium-doped strontium iodide (SrI2:Eu2+) single crystal was grown by a modified vertical Bridgman-Stockbarger technique. Photoluminescence excitation (PLE) and emission (PL) spectra were measured for the Eu2+-doped SrI2 crystal. A sharp emission was recorded at 432 nm. Scintillation properties of the SrI2:Eu2+ crystal were evaluated with a gamma-ray spectrometer using a 137Cs gamma source.

  13. [125I]2-(2-chloro-4-iodo-phenylamino)-5-methyl-pyrroline (LNP 911), a high-affinity radioligand selective for I1 imidazoline receptors.

    PubMed

    Greney, Hugues; Urosevic, Dragan; Schann, Stephan; Dupuy, Laurence; Bruban, Véronique; Ehrhardt, Jean-Daniel; Bousquet, Pascal; Dontenwill, Monique

    2002-07-01

    The I1 subtype of imidazoline receptors (I1R) is a plasma membrane protein that is involved in diverse physiological functions. Available radioligands used so far to characterize the I1R were able to bind with similar affinities to alpha2-adrenergic receptors (alpha2-ARs) and to I1R. This feature was a major drawback for an adequate characterization of this receptor subtype. New imidazoline analogs were therefore synthesized, and the present study describes one of these compounds, 2-(2-chloro-4-iodo-phenylamino)-5-methyl-pyrroline (LNP 911), which shows high affinity and selectivity for the I1R. LNP 911 was radioiodinated and its binding properties characterized in different membrane preparations. Saturation experiments with [125I]LNP 911 revealed a single high-affinity binding site in PC-12 cell membranes (K(D) = 1.4 nM; B(max) = 398 fmol/mg protein) with low nonspecific binding. [125I]LNP 911 specific binding was inhibited by various imidazolines and analogs but was insensitive to guanosine-5'-O-(3-thio)triphosphate. The rank order of potency of some competing ligands [LNP 911, PIC, rilmenidine, 4-chloro-2-(imidazolin-2-ylamino)-isoindoline (BDF 6143), lofexidine, and clonidine] was consistent with the definition of [125I]LNP 911 binding sites as I1R. However, other high-affinity I1R ligands (moxonidine, efaroxan, and benazoline) exhibited low affinities for these binding sites in standard binding assays. In contrast, when [125I]LNP 911 was preincubated at 4 degrees C, competition curves of moxonidine became biphasic. In this case, moxonidine exhibited similarly high affinities for [125I]LNP 911 binding sites as for I1R defined with [125I]PIC. Moxonidine also proved able to accelerate the dissociation of [125I]LNP 911 from its binding sites. These results suggest the existence of an allosteric modulation at the level of the I1R, which seems to be corroborated by the dose-dependent enhancement by LNP 911 of the agonist effects on the adenylate cyclase pathway

  14. Long-term results from an urban CO2 monitoring network

    NASA Astrophysics Data System (ADS)

    Ehleringer, J.; Pataki, D. E.; Lai, C.; Schauer, A.

    2009-12-01

    High-precision atmospheric CO2 has been monitored at several locations throughout the Salt Lake Valley metropolitan region of northern Utah over the past nine years. Many parts of this semi-arid grassland have transitioned into dense urban forests, supported entirely by extensive homeowner irrigation practices. Diurnal changes in fossil-fuel energy use and photosynthesis-respiration processes have resulted in significant spatial and temporal variations in atmospheric CO2. Here we present an analysis of the long-term patterns and trends in midday and nighttime CO2 values for four sites: a midvalley residential neighborhood, a midvalley non-residential neighborhood, an undeveloped valley-edge area transitioning from agriculture, and a developed valley-edge neighborhood with mixed residential and commercial activities; the neighborhoods span an elevation gradient within the valley of ~100 m. Patterns in CO2 concentrations among neighborhoods were examined relative to each other and relative to the NOAA background station, a desert site in Wendover, Utah. Four specific analyses are considered. First, we present a statistical analysis of weekday versus weekend CO2 patterns in the winter, spring, summer, and fall seasons. Second, we present a statistical analysis of the influences of high-pressure systems on the elevation of atmospheric CO2 above background levels in the winter versus summer seasons. Third, we present an analysis of the nighttime CO2 values through the year, relating these patterns to observed changes in the carbon isotope ratios of atmospheric CO2. Lastly, we examine the rate of increase in midday urban CO2 over time relative to regional and global CO2 averages to determine if the amplification of urban energy use is statistically detectable from atmospheric trace gas measurements over the past decade. These results show two important patterns. First, there is a strong weekday-weekend effect of vehicle emissions in contrast to the temperature

  15. Statistical results on restorative dentistry experiments: effect of the interaction between main variables

    PubMed Central

    CAVALCANTI, Andrea Nóbrega; MARCHI, Giselle Maria; AMBROSANO, Gláucia Maria Bovi

    2010-01-01

    Statistical analysis interpretation is a critical field in scientific research. When more than one main variable is studied in a research project, the effect of the interaction between those variables is fundamental to the discussion of the experiments. However, doubts can arise when the p-value of the interaction is greater than the significance level. Objective To determine the most adequate interpretation for factorial experiments with p-values of the interaction slightly higher than the significance level. Materials and methods The p-values of the interactions found in two restorative dentistry experiments (0.053 and 0.068) were interpreted in two distinct ways: considering the interaction as not significant and as significant. Results Different findings were observed between the two analyses, and the study results became more coherent when the interaction was treated as significant. Conclusion The p-value of the interaction between main variables must be analyzed with caution because it can change the outcomes of research studies. Researchers are strongly advised to interpret the results of their statistical analyses carefully in order to discuss the findings of their experiments properly. PMID:20857003

  16. A Fast Healthcare Interoperability Resources (FHIR) layer implemented over i2b2.

    PubMed

    Boussadi, Abdelali; Zapletal, Eric

    2017-08-14

    Standards and technical specifications have been developed to define how the information contained in Electronic Health Records (EHRs) should be structured, semantically described, and communicated. Current trends rely on differentiating the representation of data instances from the definition of clinical information models. The dual model approach, which combines a reference model (RM) and a clinical information model (CIM), sets this software design pattern in practice. The most recent initiative, proposed by HL7, is called Fast Healthcare Interoperability Resources (FHIR). The aim of our study was to investigate the feasibility of applying the FHIR standard to modeling and exposing EHR data of the Georges Pompidou European Hospital (HEGP) Integrating Biology and the Bedside (i2b2) clinical data warehouse (CDW). We implemented a FHIR server over i2b2 to expose EHR data in relation with five FHIR resources: DiagnosticReport, MedicationOrder, Patient, Encounter, and Medication. The architecture of the server combines a Data Access Object design pattern and FHIR resource providers, implemented using the Java HAPI FHIR API. Two types of queries were tested: Query type #1 requests the server to display DiagnosticReport resources for which the diagnosis code is equal to a given ICD-10 code. A total of 80 DiagnosticReport resources, corresponding to 36 patients, were displayed. Query type #2 requests the server to display MedicationOrder resources for which the FHIR Medication identification code is equal to a given code expressed in a French coding system. A total of 503 MedicationOrder resources, corresponding to 290 patients, were displayed. Results were validated by manually comparing the results of each request to the results displayed by an ad-hoc SQL query. We showed the feasibility of implementing a Java layer over the i2b2 database model to expose data of the CDW as a set of FHIR resources.
An important part of this work was the structural and semantic mapping between the
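
    The two query types described above map naturally onto standard FHIR REST searches. The sketch below builds such search URLs; the endpoint and codes are placeholders for illustration, not the HEGP server's actual values:

```python
# Hypothetical FHIR REST searches mirroring the two query types in the study.
# The base URL and the codes are placeholders, not real server values.
from urllib.parse import urlencode

base = "https://fhir.example.org/baseDstu2"  # hypothetical FHIR endpoint

# Query type #1: DiagnosticReport resources filtered by an ICD-10 diagnosis code
q1 = f"{base}/DiagnosticReport?" + urlencode(
    {"code": "http://hl7.org/fhir/sid/icd-10|I50.9"})

# Query type #2: MedicationOrder resources filtered by a medication code
# (chained search through the referenced Medication resource)
q2 = f"{base}/MedicationOrder?" + urlencode(
    {"medication.code": "urn:example:french-drug-codes|3400891234567"})

print(q1)
print(q2)
```

    A client issues these as HTTP GET requests and receives a Bundle of matching resources; the server described in the abstract resolves such searches against the underlying i2b2 tables.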

  17. Chemical and Electrochemical Asymmetric Dihydroxylation of Olefins in I(2)-K(2)CO(3)-K(2)OsO(2)(OH)(4) and I(2)-K(3)PO(4)/K(2)HPO(4)-K(2)OsO(2)(OH)(4) Systems with Sharpless' Ligand.

    PubMed

    Torii, Sigeru; Liu, Ping; Bhuvaneswari, Narayanaswamy; Amatore, Christian; Jutand, Anny

    1996-05-03

    Iodine-assisted chemical and electrochemical asymmetric dihydroxylation of various olefins in I(2)-K(2)CO(3)-K(2)OsO(2)(OH)(4) and I(2)-K(3)PO(4)/K(2)HPO(4)-K(2)OsO(2)(OH)(4) systems with Sharpless' ligand provided the optically active glycols in excellent isolated yields and high enantiomeric excesses. Iodine (I(2)) was used stoichiometrically for the chemical dihydroxylation, and good results were obtained with nonconjugated olefins, in contrast to the case of potassium ferricyanide as a co-oxidant. Having proven effective as a stoichiometric co-oxidant, I(2) also serves as an oxidizing mediator in electrolysis systems. Iodine-assisted asymmetric electro-dihydroxylation of olefins in either a t-BuOH/H(2)O(1/1)-K(2)CO(3)/(DHQD)(2)PHAL-(Pt) or t-BuOH/H(2)O(1/1)-K(3)PO(4)/K(2)HPO(4)/(DHQD)(2)PHAL-(Pt) system in the presence of potassium osmate in an undivided cell was investigated in detail. Irrespective of the substitution pattern, all the olefins afforded the diols in high yields and excellent enantiomeric excesses. A plausible mechanism is discussed on the basis of cyclic voltammograms as well as experimental observations.

  18. Reaction of O2+(X 2Πg) with H2, D2, and HD - Guided ion beam studies, MO correlations, and statistical theory calculations

    NASA Technical Reports Server (NTRS)

    Weber, M. E.; Dalleska, N. F.; Tjelta, B. L.; Fisher, E. R.; Armentrout, P. B.

    1993-01-01

    Guided ion-beam mass spectrometry is used to examine the reactions of vibrationally cold ground-state O2+(X 2Πg) with H2, D2, and HD. The energy dependence of the absolute integral cross sections from thermal energy to over 4 eV is measured in the center-of-mass frame of reference. Results are also presented for internally excited O2+ ions reacting with D2 and HD. These results are consistent with the dominant state being the a 4Πu electronic state. The experimental excitation functions are analyzed in detail and interpreted by extending the molecular orbital correlation arguments of Mahan (1971) and by comparison with results of statistical phase space theory and with a theory that predicts a tight transition state.

  19. Statistics of a neuron model driven by asymmetric colored noise.

    PubMed

    Müller-Hansen, Finn; Droste, Felix; Lindner, Benjamin

    2015-02-01

    Irregular firing of neurons can be modeled as a stochastic process. Here we study the perfect integrate-and-fire neuron driven by dichotomous noise, a Markovian process that jumps between two states (i.e., possesses a non-Gaussian statistics) and exhibits nonvanishing temporal correlations (i.e., represents a colored noise). Specifically, we consider asymmetric dichotomous noise with two different transition rates. Using a first-passage-time formulation, we derive exact expressions for the probability density and the serial correlation coefficient of the interspike interval (time interval between two subsequent neural action potentials) and the power spectrum of the spike train. Furthermore, we extend the model by including additional Gaussian white noise, and we give approximations for the interspike interval (ISI) statistics in this case. Numerical simulations are used to validate the exact analytical results for pure dichotomous noise, and to test the approximations of the ISI statistics when Gaussian white noise is included. The results may help to understand how correlations and asymmetry of noise and signals in nerve cells shape neuronal firing statistics.
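
    The model summarized above is straightforward to simulate. The sketch below drives a perfect integrate-and-fire neuron with asymmetric dichotomous (telegraph) noise and collects interspike-interval statistics; all parameter values are illustrative assumptions, not taken from the paper:

```python
import math
import random

random.seed(0)

# Perfect integrate-and-fire: dv/dt = mu + eta(t); spike and reset at v = 1.
# eta(t) is asymmetric dichotomous (telegraph) noise jumping between
# eta_plus and eta_minus, with different exit rates for the two states.
mu, v_th = 1.0, 1.0
eta_plus, eta_minus = 0.8, -0.4   # asymmetric noise amplitudes
k_plus, k_minus = 2.0, 0.5        # rates to leave the "+" and "-" states
dt, t_max = 1e-3, 2000.0

v, state = 0.0, +1
isis, last_spike, t = [], 0.0, 0.0
while t < t_max:
    eta = eta_plus if state > 0 else eta_minus
    v += (mu + eta) * dt
    # state switches are Poisson events with the current state's exit rate
    rate = k_plus if state > 0 else k_minus
    if random.random() < rate * dt:
        state = -state
    if v >= v_th:                 # threshold crossing: record ISI and reset
        isis.append(t - last_spike)
        last_spike, v = t, 0.0
    t += dt

mean_isi = sum(isis) / len(isis)
var_isi = sum((x - mean_isi) ** 2 for x in isis) / len(isis)
cv = math.sqrt(var_isi) / mean_isi
print(f"{len(isis)} ISIs, mean = {mean_isi:.3f}, CV = {cv:.3f}")
```

    Because the drift mu + eta stays positive in both noise states, the neuron fires repeatedly; the ISI mean and coefficient of variation computed here are the quantities for which the paper derives exact expressions.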

  20. 7 CFR 2.68 - Administrator, National Agricultural Statistics Service.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... committees concerned with agricultural science, education, and development activities, including library and... Under Secretary for Research, Education, and Economics § 2.68 Administrator, National Agricultural..., Education, and Economics to the Administrator, National Agricultural Statistics Service: (1) Prepare crop...

  1. 7 CFR 2.68 - Administrator, National Agricultural Statistics Service.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... committees concerned with agricultural science, education, and development activities, including library and... Under Secretary for Research, Education, and Economics § 2.68 Administrator, National Agricultural..., Education, and Economics to the Administrator, National Agricultural Statistics Service: (1) Prepare crop...

  2. Evaluation of statistical treatments of left-censored environmental data using coincident uncensored data sets: I. Summary statistics

    USGS Publications Warehouse

    Antweiler, Ronald C.; Taylor, Howard E.

    2008-01-01

    The main classes of statistical treatment of below-detection limit (left-censored) environmental data for the determination of basic statistics that have been used in the literature are substitution methods, maximum likelihood, regression on order statistics (ROS), and nonparametric techniques. These treatments, along with using all instrument-generated data (even those below detection), were evaluated by examining data sets in which the true values of the censored data were known. It was found that for data sets with less than 70% censored data, the best technique overall for determination of summary statistics was the nonparametric Kaplan-Meier technique. ROS and the two substitution methods of assigning one-half the detection limit value to censored data or assigning a random number between zero and the detection limit to censored data were adequate alternatives. The use of these two substitution methods, however, requires a thorough understanding of how the laboratory censored the data. The technique of employing all instrument-generated data - including numbers below the detection limit - was found to be less adequate than the above techniques. At high degrees of censoring (greater than 70% censored data), no technique provided good estimates of summary statistics. Maximum likelihood techniques were found to be far inferior to all other treatments except substituting zero or the detection limit value to censored data.
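
    The substitution treatments compared in this study can be stated concretely. A minimal sketch with hypothetical lognormal concentration data and an assumed detection limit of 1.0 (not the paper's data sets):

```python
import random
import statistics

random.seed(42)

DL = 1.0  # hypothetical detection limit

# Simulated "true" concentrations; values below DL are reported as censored.
true = [random.lognormvariate(0.0, 1.0) for _ in range(200)]
reported = [v if v >= DL else None for v in true]   # None marks a "<DL" result

def substitute(data, fill):
    """Replace censored entries with a fixed fill value."""
    return [fill if v is None else v for v in data]

n_cens = sum(v is None for v in reported)
mean_zero = statistics.mean(substitute(reported, 0.0))
mean_half = statistics.mean(substitute(reported, DL / 2))
mean_dl = statistics.mean(substitute(reported, DL))
mean_true = statistics.mean(true)

print(f"{n_cens}/{len(true)} values censored below DL = {DL}")
print(f"substitute 0:    mean = {mean_zero:.3f}")
print(f"substitute DL/2: mean = {mean_half:.3f}")
print(f"substitute DL:   mean = {mean_dl:.3f}")
print(f"true mean:             {mean_true:.3f}")
```

    The three substituted means bracket each other by construction (zero-fill lowest, DL-fill highest); which one lands closest to the true mean depends on the censoring fraction and the underlying distribution, which is why the study evaluates the treatments against data sets where the censored values are known.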

  3. 2011 statistical abstract of the United States

    USGS Publications Warehouse

    Krisanda, Joseph M.

    2011-01-01

    The Statistical Abstract of the United States, published since 1878, is the authoritative and comprehensive summary of statistics on the social, political, and economic organization of the United States.


    Use the Abstract as a convenient volume for statistical reference, and as a guide to sources of more information both in print and on the Web.


    Sources of data include the Census Bureau, Bureau of Labor Statistics, Bureau of Economic Analysis, and many other Federal agencies and private organizations.

  4. Statistical properties of MHD fluctuations associated with high speed streams from HELIOS 2 observations

    NASA Technical Reports Server (NTRS)

    Bavassano, B.; Dobrowolny, H.; Fanfoni, G.; Mariani, F.; Ness, N. F.

    1981-01-01

    Helios 2 magnetic data were used to obtain several statistical properties of MHD fluctuations associated with the trailing edge of a given stream observed in different solar rotations. Eigenvalues and eigenvectors of the variance matrix, total power, and degree of compressibility of the fluctuations were derived and discussed both as a function of distance from the Sun and as a function of the frequency range included in the sample. The results obtained add new information to the picture of MHD turbulence in the solar wind. In particular, a dependence of the radial gradients of various statistical quantities on the frequency range is obtained.

  5. IGESS: a statistical approach to integrating individual-level genotype data and summary statistics in genome-wide association studies.

    PubMed

    Dai, Mingwei; Ming, Jingsi; Cai, Mingxuan; Liu, Jin; Yang, Can; Wan, Xiang; Xu, Zongben

    2017-09-15

    Results from genome-wide association studies (GWAS) suggest that a complex phenotype is often affected by many variants with small effects, known as 'polygenicity'. Tens of thousands of samples are often required to ensure statistical power of identifying these variants with small effects. However, it is often the case that a research group can only get approval for access to individual-level genotype data with a limited sample size (e.g. a few hundreds or thousands). Meanwhile, summary statistics generated using single-variant-based analysis are becoming publicly available. The sample sizes associated with the summary statistics datasets are usually quite large. How to make the most efficient use of existing abundant data resources largely remains an open question. In this study, we propose a statistical approach, IGESS, to increasing statistical power of identifying risk variants and improving accuracy of risk prediction by integrating individual-level genotype data and summary statistics. An efficient algorithm based on variational inference is developed to handle the genome-wide analysis. Through comprehensive simulation studies, we demonstrated the advantages of IGESS over the methods which take either individual-level data or summary statistics data as input. We applied IGESS to perform integrative analysis of Crohn's disease from WTCCC and summary statistics from other studies. IGESS was able to significantly increase the statistical power of identifying risk variants and improve the risk prediction accuracy from 63.2% (±0.4%) to 69.4% (±0.1%) using about 240 000 variants. The IGESS software is available at https://github.com/daviddaigithub/IGESS . zbxu@xjtu.edu.cn or xwan@comp.hkbu.edu.hk or eeyang@hkbu.edu.hk. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  6. Statistical Analysis of Compressive and Flexural Test Results on the Sustainable Adobe Reinforced with Steel Wire Mesh

    NASA Astrophysics Data System (ADS)

    Jokhio, Gul A.; Syed Mohsin, Sharifah M.; Gul, Yasmeen

    2018-04-01

It has been established that Adobe, in addition to being sustainable and economical, provides better indoor air quality without the extensive energy expenditure of modern synthetic materials. The material, however, suffers from weak structural behaviour when subjected to adverse loading conditions. A wide range of mechanical properties has been reported in the literature owing to a lack of research and standardization. The present paper presents a statistical analysis of the results obtained through compressive and flexural tests on Adobe samples. Adobe specimens with and without wire mesh reinforcement were tested and the results reported. It has been found that the compressive strength of Adobe increases by about 43% after adding a single layer of wire mesh reinforcement, and this increase is statistically significant. The flexural response of Adobe also improved with the addition of wire mesh reinforcement; however, the statistical significance of this improvement could not be established.
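The kind of significance claim made above for the strength gain can be illustrated with a permutation test, which makes no normality assumption and suits small specimen counts. The specimen strengths below are invented for illustration, not the paper's data:

```python
import random

random.seed(1)

# Hypothetical compressive strengths in MPa (not the paper's data): seven
# plain specimens and seven reinforced with a single layer of wire mesh.
plain      = [1.10, 1.25, 0.95, 1.30, 1.05, 1.20, 1.15]
reinforced = [1.60, 1.75, 1.55, 1.80, 1.50, 1.70, 1.65]

observed = sum(reinforced) / len(reinforced) - sum(plain) / len(plain)

# Permutation test: shuffle the pooled values and count how often a random
# relabelling produces a mean difference at least as large as observed.
pooled = plain + reinforced
n_perm, extreme = 10000, 0
for _ in range(n_perm):
    random.shuffle(pooled)
    a, b = pooled[:len(plain)], pooled[len(plain):]
    if sum(b) / len(b) - sum(a) / len(a) >= observed:
        extreme += 1

p_value = extreme / n_perm
print(f"mean gain {observed:.2f} MPa, one-sided permutation p = {p_value:.4f}")
```

With clearly separated groups like these, essentially only the original labelling reaches the observed difference, so the p-value is tiny; overlapping groups would push it up.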

  7. Intramolecular vibrational redistribution of CH2I2 dissolved in supercritical Xe

    NASA Astrophysics Data System (ADS)

    Sekiguchi, K.; Shimojima, A.; Kajimoto, O.

    2003-03-01

    Intramolecular vibrational energy redistribution (IVR) of CH2I2 in supercritical Xe has been studied. The first overtone of the C-H stretching mode was excited with a near-infrared laser pulse and the transient UV absorption near 390 nm was monitored. The signals showed a rise-and-decay profile, which gave the IVR and VET (intermolecular vibrational energy transfer) rates, respectively. The solvent density dependence of each rate was obtained by tuning the pressure at a constant temperature. The IVR rate in supercritical Xe increased with increasing solvent density and asymptotically reached a limiting value. This result suggests that the IVR process of CH2I2 in the condensed phase is a solvent-assisted process.

  8. QUANTITATIVE IMAGING AND STATISTICAL ANALYSIS OF FLUORESCENCE IN SITU HYBRIDIZATION (FISH) OF AUREOBASIDIUM PULLULANS. (R823845)

    EPA Science Inventory

    Image and multifactorial statistical analyses were used to evaluate the intensity of fluorescence signal from cells of three strains of A. pullulans and one strain of Rhodosporidium toruloides, as an outgroup, hybridized with either a universal o...

  9. An Application of M[subscript 2] Statistic to Evaluate the Fit of Cognitive Diagnostic Models

    ERIC Educational Resources Information Center

    Liu, Yanlou; Tian, Wei; Xin, Tao

    2016-01-01

    The fit of cognitive diagnostic models (CDMs) to response data needs to be evaluated, since CDMs might yield misleading results when they do not fit the data well. Limited-information statistic M[subscript 2] and the associated root mean square error of approximation (RMSEA[subscript 2]) in item factor analysis were extended to evaluate the fit of…

  10. Using the Descriptive Bootstrap to Evaluate Result Replicability (Because Statistical Significance Doesn't)

    ERIC Educational Resources Information Center

    Spinella, Sarah

    2011-01-01

    As result replicability is essential to science but difficult to establish through external replication, the present paper notes the insufficiency of null hypothesis statistical significance testing (NHSST) and explains the bootstrap as a plausible alternative, with a heuristic example to illustrate the bootstrap method. The bootstrap relies on…
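The resampling idea the abstract refers to can be sketched with hypothetical data: draw many resamples with replacement, recompute the statistic each time, and read a percentile interval off the resulting distribution to see how much the result would vary on replication. The sample values below are invented for illustration:

```python
import random

random.seed(0)

# Hypothetical sample of 20 observed effect sizes (invented for illustration).
sample = [2.1, 3.4, 1.8, 2.9, 3.1, 2.5, 2.2, 3.0, 1.9, 2.7,
          2.4, 3.3, 2.0, 2.8, 2.6, 3.2, 1.7, 2.3, 3.5, 2.9]

def bootstrap_means(data, n_boot=5000):
    """Resample with replacement and record each resample's mean."""
    n = len(data)
    return [sum(random.choices(data, k=n)) / n for _ in range(n_boot)]

means = sorted(bootstrap_means(sample))
# 95% percentile interval: the 2.5th and 97.5th percentiles of the bootstrap
# distribution describe the replicability of the sample mean.
lo = means[int(0.025 * len(means))]
hi = means[int(0.975 * len(means))]
print(f"bootstrap 95% interval for the mean: [{lo:.2f}, {hi:.2f}]")
```

The same loop works for any statistic (median, correlation, regression slope) by swapping the function applied to each resample.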

  11. D- and L-[123I]-2-I-phenylalanine show a long tumour retention compared with D- and L-[123I]-2-I-tyrosine in R1M rhabdomyosarcoma tumour-bearing Wag/Rij rats.

    PubMed

    Bauwens, Matthias; Lahoutte, Tony; Kersemans, Ken; Caveliers, Vicky; Bossuyt, Axel; Mertens, John

    2007-07-01

    The aim of this study was the comparison of the tumour uptake and the long-term retention of [(123)I]-2-I-L-phenylalanine and [(123)I]-2-I-D-phenylalanine with those of [(123)I]-2-I-L-tyrosine and [(123)I]-2-I-D-tyrosine in R1M rhabdomyosarcoma tumour-bearing rats. The biodistribution of the radioactivity as a function of time in R1M tumour-bearing rats was measured by planar gamma camera imaging (dynamic and static). Where dissection was applied, the activity in the tumours and tissues of interest was measured by gamma counting. [(123)I]-2-iodo-L-phenylalanine, [(123)I]-2-iodo-D-phenylalanine, [(123)I]-2-I-L-tyrosine and [(123)I]-2-I-D-tyrosine showed a considerable tumour uptake, reaching a maximum between 10 and 30 min. At 30 min p.i. the differential uptake ratio values of this uptake were, respectively, 2.1, 2.3, 2.5 and 1.7. The activity in the tumour was shown to be related to tumour cell uptake and not to an increased blood pool activity. All the tracers showed a clearance from the blood to the bladder without renal retention. At longer times both L- and D-[(123)I]-2-I-tyrosine were largely cleared from the tumours and the body. [(123)I]-2-I-L-Phe and [(123)I]-2-I-D-Phe showed a considerable and equal retention in the tumours: as compared with 0.5 h, 91% at 24 h and 80% at 48 h. This was related to the longer retention of activity in the blood pool noticed for these compounds (81% at 24 h and 65% at 48 h). The tumour-to-background ratio increased by 25% at those longer times. At short times all the tracers were taken up to a considerable extent in the tumours. In the R1M-bearing Wag/Rij rat model only [(123)I]-2-I-L-phenylalanine and [(123)I]-2-I-D-phenylalanine showed an especially high retention at long times, without any significant difference between the enantiomers. Copyright 2007 John Wiley & Sons, Ltd.

  12. Number of Streptococcus mutans and Lactobacillus in saliva versus the status of cigarette smoking, considering duration of smoking and number of cigarettes smoked daily.

    PubMed

    Nakonieczna-Rudnicka, Marta; Bachanek, Teresa

    2017-09-21

    A large number of colonies of Streptococcus mutans (SM) and Lactobacillus (LB) cariogenic bacteria in the saliva indicates a high risk of dental caries development. Cotinine is a biomarker of exposure to tobacco smoke. The aim of the study was to assess the number of Streptococcus mutans and Lactobacillus in the saliva of non-smokers and smokers, considering the duration of smoking and the number of cigarettes smoked daily. The number of SM and LB was analysed in relation to the frequency of oral health check-ups. The investigated group comprised 124 people aged 20-54; 58 (46.8%) reported cigarette smoking, while 66 (53.2%) reported they had never smoked cigarettes and had never attempted to smoke. Cotinine concentration in the saliva was assayed using the Cotinine test (Calbiotech), and the number of SM and LB with the use of the CRT bacteria test (Ivoclar Vivadent, Liechtenstein). Statistical analysis was conducted using chi-square and Mann-Whitney tests. Test values of p<0.05 were considered statistically significant. No significant correlation was found between the number of SM and LB and smoking status, the number of cigarettes smoked daily, or the duration of cigarette smoking. Smokers who reported having dental check-ups at least once a year significantly more frequently had a small number of LB compared with people who had check-ups less frequently than once a year. The number of SM and LB in saliva does not depend on smoking status, the number of cigarettes smoked daily, or the duration of smoking.

  13. SIMTBED: A Graphical Test Bed for Analyzing and Reporting the Results of a Statistical Simulation Experiment.

    DTIC Science & Technology

    1983-09-01

  14. Ice Water Classification Using Statistical Distribution Based Conditional Random Fields in RADARSAT-2 Dual Polarization Imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Li, F.; Zhang, S.; Hao, W.; Zhu, T.; Yuan, L.; Xiao, F.

    2017-09-01

    In this paper, a Statistical Distribution based Conditional Random Fields (STA-CRF) algorithm is exploited to improve marginal ice-water classification. Pixel-level ice concentration is presented as a comparison among the CRF-based methods. Furthermore, in order to explore the most effective statistical distribution model to integrate into STA-CRF, five statistical distribution models are investigated. The STA-CRF methods are tested on two scenes around Prydz Bay and Adélie Depression, which contain a variety of ice types during the melt season. Experimental results indicate that the proposed method can resolve the sea ice edge well in the Marginal Ice Zone (MIZ) and shows a robust distinction between ice and water.

  15. Quenching of I(2P1/2) by NO2, N2O4, and N2O.

    PubMed

    Kabir, Md Humayun; Azyazov, Valeriy N; Heaven, Michael C

    2007-10-11

    Quenching of excited iodine atoms (I(5p5, 2P1/2)) by nitrogen oxides is a process of relevance to discharge-driven oxygen-iodine lasers. Rate constants at ambient and elevated temperatures (293-380 K) for quenching of I(2P1/2) atoms by NO2, N2O4, and N2O have been measured using time-resolved I(2P1/2) --> I(2P3/2) 1315 nm emission. The excited atoms were generated by pulsed laser photodissociation of CF3I at 248 nm. The rate constants for I(2P1/2) quenching by NO2 and N2O were found to be independent of temperature over the range examined, with average values of (2.9 +/- 0.3) x 10(-15) and (1.4 +/- 0.1) x 10(-15) cm3 s(-1), respectively. The rate constant for quenching of I(2P1/2) by N2O4 was found to be (3.5 +/- 0.5) x 10(-13) cm3 s(-1) at ambient temperature.
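The time-resolved emission analysis behind such rate constants can be sketched as a pseudo-first-order decay fit: at a fixed quencher density [Q], the excited-state signal decays at k_obs = k_q·[Q], and the slope of ln I(t) versus t recovers k_obs. The trace below is synthetic, using the NO2 rate constant quoted above with a hypothetical quencher density:

```python
import math
import random

random.seed(3)

# Pseudo-first-order sketch: k_obs = k_q * [Q].  Hypothetical [NO2] = 1e17
# cm^-3 combined with the NO2 rate constant quoted in the abstract.
k_true = 2.9e-15 * 1e17              # s^-1 (= 290 s^-1)
ts = [i * 1e-4 for i in range(50)]   # 0 to 4.9 ms
signal = [math.exp(-k_true * t) * (1 + 0.01 * random.gauss(0, 1)) for t in ts]

# Linear least-squares fit of ln I(t) vs t; the slope is -k_obs.
ys = [math.log(s) for s in signal]
n = len(ts)
tbar, ybar = sum(ts) / n, sum(ys) / n
slope = (sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys))
         / sum((t - tbar) ** 2 for t in ts))
print(f"fitted k_obs ~ {-slope:.0f} s^-1 (true {k_true:.0f})")
```

Repeating the fit at several quencher densities and plotting k_obs against [Q] gives k_q as the slope (a Stern-Volmer-type analysis).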

  16. Beer Law Constants and Vapor Pressures of HgI2 over HgI2(s,l)

    NASA Technical Reports Server (NTRS)

    Su, Ching-Hua; Zhu, Shen; Ramachandran, N.; Burger, A.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    The optical absorption spectra of the vapor phase over HgI2(s,l) were measured for wavelengths between 200 and 600 nm. The spectra show that the sample sublimed congruently into HgI2, with no Hg or I2 absorption spectrum observed. The Beer's law constants for 15 wavelengths between 200 and 440 nm were determined. From these constants the vapor pressure of HgI2, P, was established as a function of temperature for the liquid and the solid Beta-phase. The expressions correspond to enthalpies of vaporization and sublimation of 15.30 and 20.17 kcal/mol, respectively, for the liquid and the Beta-phase HgI2. The difference between the enthalpies gives an enthalpy of fusion of 4.87 kcal/mol, and the intersection of the two expressions gives a melting point of 537 K.
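The thermochemical bookkeeping in that last sentence can be checked directly: by Hess's law the fusion enthalpy is the sublimation minus the vaporization enthalpy, and with ln P = A − ΔH/(RT) for each phase the two vapor-pressure curves cross at the melting point. The offset constant below is back-computed from the reported 537 K, since the paper's fitted A values are not in the abstract:

```python
# Arithmetic check of the thermochemistry quoted above (values in kcal/mol).
dH_vap = 15.30   # vaporization, liquid HgI2
dH_sub = 20.17   # sublimation, solid (Beta-phase) HgI2

# Hess's law at the melting point: sublimation = fusion + vaporization.
dH_fus = dH_sub - dH_vap
print(f"enthalpy of fusion: {dH_fus:.2f} kcal/mol")  # matches the 4.87 quoted

# With ln P = A - dH/(R T) per phase, equal pressures give
#   T_melt = (dH_sub - dH_vap) / (R * (A_sub - A_vap)).
# dA below is back-computed from the reported 537 K melting point.
R = 1.987204e-3            # gas constant, kcal/(mol K)
dA = dH_fus / (R * 537.0)  # implied A_sub - A_vap
T_melt = dH_fus / (R * dA)
print(f"recovered melting point: {T_melt:.0f} K")
```

The round trip confirms the reported numbers are internally consistent: 20.17 − 15.30 = 4.87 kcal/mol.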

  17. Optimizing α for better statistical decisions: a case study involving the pace-of-life syndrome hypothesis: optimal α levels set to minimize Type I and II errors frequently result in different conclusions from those using α = 0.05.

    PubMed

    Mudge, Joseph F; Penny, Faith M; Houlahan, Jeff E

    2012-12-01

    Setting optimal significance levels that minimize Type I and Type II errors allows for more transparent and well-considered statistical decision making compared to the traditional α = 0.05 significance level. We use the optimal α approach to re-assess conclusions reached by three recently published tests of the pace-of-life syndrome hypothesis, which attempts to unify occurrences of different physiological, behavioral, and life history characteristics under one theory, over different scales of biological organization. While some of the conclusions reached using optimal α were consistent with those previously reported using the traditional α = 0.05 threshold, opposing conclusions were also frequently reached. The optimal α approach reduced probabilities of Type I and Type II errors, and ensured that statistical significance was associated with biological relevance. Biologists should seriously consider their choice of α when conducting null hypothesis significance tests, as there are serious disadvantages to consistent reliance on the traditional but arbitrary α = 0.05 significance level. Copyright © 2012 WILEY Periodicals, Inc.
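The optimal-α idea can be sketched for the simplest case, a one-sided z-test with a known standardized effect size: scan α, compute the paired Type II rate β, and pick the α minimizing their average. Equal error costs, the effect size, and the sample size below are hypothetical choices (the authors also allow unequal weighting):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def inv_norm_cdf(p):
    """Invert the normal CDF by bisection (the stdlib has no ppf)."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def error_rates(alpha, effect, n):
    """Type I and II error rates for a one-sided z-test of H0: mu = 0
    against a true standardized effect `effect`, with n observations."""
    z_crit = inv_norm_cdf(1.0 - alpha)
    beta = norm_cdf(z_crit - effect * math.sqrt(n))
    return alpha, beta

# Scan alpha for the value minimizing the average of the two error rates.
best_mean_err, best_alpha = min(
    (sum(error_rates(a / 1000, effect=0.5, n=20)) / 2, a / 1000)
    for a in range(1, 500))
print(f"optimal alpha ~ {best_alpha:.3f}, mean error rate {best_mean_err:.3f}")
```

With these hypothetical numbers the optimal α lands well above 0.05, exactly the kind of divergence from the traditional threshold the abstract describes; a larger sample or effect size pushes it back down.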

  18. Explorations in statistics: hypothesis tests and P values.

    PubMed

    Curran-Everett, Douglas

    2009-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This second installment of Explorations in Statistics delves into test statistics and P values, two concepts fundamental to the test of a scientific null hypothesis. The essence of a test statistic is that it compares what we observe in the experiment to what we expect to see if the null hypothesis is true. The P value associated with the magnitude of that test statistic answers this question: if the null hypothesis is true, what proportion of possible values of the test statistic are at least as extreme as the one I got? Although statisticians continue to stress the limitations of hypothesis tests, there are two realities we must acknowledge: hypothesis tests are ingrained within science, and the simple test of a null hypothesis can be useful. As a result, it behooves us to explore the notions of hypothesis tests, test statistics, and P values.
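The question the abstract poses about P values can be answered by brute force: simulate the test statistic many times under the null hypothesis and count the proportion at least as extreme as the one observed. All numbers below are invented for illustration:

```python
import random

random.seed(2)

# Hypothetical experiment: n = 15 measurements with observed sample mean 0.6,
# under a null hypothesis of population mean 0 and SD 1 (values invented).
n, observed_mean = 15, 0.6

# Simulate the test statistic (the sample mean) many times under the null.
null_means = [sum(random.gauss(0.0, 1.0) for _ in range(n)) / n
              for _ in range(20000)]

# P value: the proportion of null statistics at least as extreme as the one
# observed (two-sided here).
p = sum(abs(m) >= abs(observed_mean) for m in null_means) / len(null_means)
print(f"simulated two-sided P value: {p:.4f}")
```

The simulated value approximates the analytic normal-theory P value; the simulation makes the "proportion at least as extreme" definition concrete.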

  19. Conformational Isomerism of trans-[Pt(NH2C6H11)2I2] and the Classical Wernerian Chemistry of [Pt(NH2C6H11)4]X2 (X = Cl, Br, I)1

    PubMed Central

    Johnstone, Timothy C.; Lippard, Stephen J.

    2012-01-01

    X-ray crystallographic analysis of the compound trans-[Pt(NH2C6H11)2I2] revealed the presence of two distinct conformers within one crystal lattice. This compound was studied by variable temperature NMR spectroscopy to investigate the dynamic interconversion between these isomers. The results of this investigation were interpreted using physical (CPK) and computational (molecular mechanics and density functional theory) models. The conversion of the salts [Pt(NH2C6H11)4]X2 into trans-[Pt(NH2C6H11)2X2] (X = Cl, Br, I) was also studied and is discussed here with an emphasis on parallels to the work of Alfred Werner. PMID:23554544

  20. Cosmological Constraints from Fourier Phase Statistics

    NASA Astrophysics Data System (ADS)

    Ali, Kamran; Obreschkow, Danail; Howlett, Cullan; Bonvin, Camille; Llinares, Claudio; Oliveira Franco, Felipe; Power, Chris

    2018-06-01

    Most statistical inference from cosmic large-scale structure relies on two-point statistics, i.e. on the galaxy-galaxy correlation function (2PCF) or the power spectrum. These statistics capture the full information encoded in the Fourier amplitudes of the galaxy density field but do not describe the Fourier phases of the field. Here, we quantify the information contained in the line correlation function (LCF), a three-point Fourier phase correlation function. Using cosmological simulations, we estimate the Fisher information (at redshift z = 0) of the 2PCF, LCF and their combination, regarding the cosmological parameters of the standard ΛCDM model, as well as a Warm Dark Matter (WDM) model and the f(R) and Symmetron modified gravity models. The galaxy bias is accounted for at the level of a linear bias. The relative information of the 2PCF and the LCF depends on the survey volume, sampling density (shot noise) and the bias uncertainty. For a volume of 1 h^{-3} Gpc^3, sampled with points of mean density n̄ = 2 × 10^{-3} h^3 Mpc^{-3} and a bias uncertainty of 13%, the LCF improves the parameter constraints by about 20% in the ΛCDM cosmology and potentially even more in alternative models. Finally, since a linear bias only affects the Fourier amplitudes (2PCF), but not the phases (LCF), the combination of the 2PCF and the LCF can be used to break the degeneracy between the linear bias and σ8, present in 2-point statistics.

  1. Integrating Multimodal Radiation Therapy Data into i2b2.

    PubMed

    Zapletal, Eric; Bibault, Jean-Emmanuel; Giraud, Philippe; Burgun, Anita

    2018-04-01

    Clinical data warehouses are now widely used to foster clinical and translational research, and the Informatics for Integrating Biology and the Bedside (i2b2) platform has become a de facto standard for storing clinical data in many projects. However, to design predictive models and assist in personalized treatment planning in cancer or radiation oncology, all available patient data need to be integrated into i2b2, including radiation therapy data that are currently not addressed in many existing i2b2 sites. To use radiation therapy data in projects related to rectal cancer patients, we assessed the feasibility of integrating radiation oncology data into the i2b2 platform. The Georges Pompidou European Hospital, a hospital from the Assistance Publique - Hôpitaux de Paris group, has developed an i2b2-based clinical data warehouse of various structured and unstructured clinical data for research since 2008. To store and reuse various radiation therapy data (dose details, activities scheduling, and dose-volume histogram (DVH) curves) in this repository, we first extracted raw data by using some reverse engineering techniques and a vendor's application programming interface. Then, we implemented a hybrid storage approach by combining the standard i2b2 "Entity-Attribute-Value" storage mechanism with a "JavaScript Object Notation (JSON) document-based" storage mechanism without modifying the i2b2 core tables. Validation was performed using (1) the Business Objects framework for replicating vendor's application screens showing dose details and activities scheduling data and (2) the R software for displaying the DVH curves. We developed a pipeline to integrate the radiation therapy data into the Georges Pompidou European Hospital i2b2 instance and evaluated it on a cohort of 262 patients. We were able to use the radiation therapy data on a preliminary use case by fetching the DVH curve data from the clinical data warehouse and displaying them in an R chart.
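The hybrid EAV-plus-JSON idea described above can be sketched in miniature: keep a conventional fact row and carry the bulky DVH curve as a JSON document in the observation blob. The column names below follow i2b2's observation_fact convention, but the concept code and curve values are invented for illustration:

```python
import json

# Hypothetical sketch of hybrid storage: an entity-attribute-value fact row
# whose blob field carries a JSON-encoded DVH curve (all values invented).
dvh_curve = {"structure": "rectum",
             "dose_gy":    [0, 10, 20, 30, 40],
             "volume_pct": [100, 92, 60, 15, 2]}

observation_fact = {
    "patient_num": 42,                          # entity
    "concept_cd": "RT:DVH",                     # attribute (hypothetical code)
    "nval_num": None,                           # scalar value slot, unused here
    "observation_blob": json.dumps(dvh_curve),  # JSON document as the value
}

# A downstream consumer (e.g. an R or Python plotting step) decodes the blob.
curve = json.loads(observation_fact["observation_blob"])
print(curve["structure"], "curve with", len(curve["dose_gy"]), "points")
```

Because the JSON rides inside an ordinary fact row, queries on the standard columns keep working while the curve payload stays intact, which is the point of not modifying the core tables.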

  2. A STATISTICAL SURVEY OF DIOXIN-LIKE COMPOUNDS IN ...

    EPA Pesticide Factsheets

    The USEPA and the USDA completed the first statistically designed survey of the occurrence and concentration of dibenzo-p-dioxins (CDDs), dibenzofurans (CDFs), and coplanar polychlorinated biphenyls (PCBs) in the fat of beef animals raised for human consumption in the United States. Back fat was sampled from 63 carcasses at federally inspected slaughter establishments nationwide. The sample design called for sampling beef animal classes in proportion to national annual slaughter statistics. All samples were analyzed using a modification of EPA method 1613, using isotope dilution, High Resolution GC/MS to determine the rate of occurrence of 2,3,7,8-substituted CDDs/CDFs/PCBs. The method detection limits ranged from 0.05 ng/kg for TCDD to 3 ng/kg for OCDD. The results of this survey showed a mean concentration (reported as I-TEQ, lipid adjusted) in U.S. beef animals of 0.35 ng/kg and 0.89 ng/kg for CDD/CDF TEQs when either non-detects are treated as 0 value or assigned a value of 1/2 the detection limit, respectively, and 0.51 ng/kg for coplanar PCB TEQs at both non-detect equal 0 and 1/2 detection limit.
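The two mean values per analyte arise from the two non-detect conventions named above: a sample below the detection limit contributes either zero or half its detection limit. A minimal sketch on invented concentrations (the survey's actual congener data are not reproduced here):

```python
# Invented concentrations in ng/kg; None marks a non-detect, paired with its
# detection limit.  Illustrates the ND=0 vs ND=DL/2 bracketing convention.
samples = [
    (0.40, None),   # detected at 0.40 ng/kg
    (None, 0.10),   # non-detect, detection limit 0.10 ng/kg
    (0.85, None),
    (None, 0.30),
    (0.25, None),
]

def mean_conc(data, nd_fraction):
    """Mean with each non-detect replaced by nd_fraction * detection limit."""
    values = [c if c is not None else nd_fraction * dl for c, dl in data]
    return sum(values) / len(values)

lower = mean_conc(samples, 0.0)   # non-detects counted as zero
upper = mean_conc(samples, 0.5)   # non-detects at half the detection limit
print(f"mean: {lower:.3f} (ND=0) to {upper:.3f} (ND=DL/2) ng/kg")
```

The gap between the two means widens as more samples fall below detection, which is why surveys with many non-detects report both bounds.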

  3. An i2b2-based, generalizable, open source, self-scaling chronic disease registry

    PubMed Central

    Quan, Justin; Ortiz, David M; Bousvaros, Athos; Ilowite, Norman T; Inman, Christi J; Marsolo, Keith; McMurry, Andrew J; Sandborg, Christy I; Schanberg, Laura E; Wallace, Carol A; Warren, Robert W; Weber, Griffin M; Mandl, Kenneth D

    2013-01-01

    Objective Registries are a well-established mechanism for obtaining high quality, disease-specific data, but are often highly project-specific in their design, implementation, and policies for data use. In contrast to the conventional model of centralized data contribution, warehousing, and control, we design a self-scaling registry technology for collaborative data sharing, based upon the widely adopted Informatics for Integrating Biology and the Bedside (i2b2) data warehousing framework and the Shared Health Research Information Network (SHRINE) peer-to-peer networking software. Materials and methods Focusing our design around creation of a scalable solution for collaboration within multi-site disease registries, we leverage the i2b2 and SHRINE open source software to create a modular, ontology-based, federated infrastructure that provides research investigators full ownership and access to their contributed data while supporting permissioned yet robust data sharing. We accomplish these objectives via web services supporting peer-group overlays, group-aware data aggregation, and administrative functions. Results The 56-site Childhood Arthritis & Rheumatology Research Alliance (CARRA) Registry and 3-site Harvard Inflammatory Bowel Diseases Longitudinal Data Repository now utilize i2b2 self-scaling registry technology (i2b2-SSR). This platform, extensible to federation of multiple projects within and between research networks, encompasses >6000 subjects at sites throughout the USA. Discussion We utilize the i2b2-SSR platform to minimize technical barriers to collaboration while enabling fine-grained control over data sharing. Conclusions The implementation of i2b2-SSR for the multi-site, multi-stakeholder CARRA Registry has established a digital infrastructure for community-driven research data sharing in pediatric rheumatology in the USA. We envision i2b2-SSR as a scalable, reusable solution facilitating interdisciplinary research across diseases. PMID:22733975

  4. Common pitfalls in statistical analysis: Clinical versus statistical significance

    PubMed Central

    Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc

    2015-01-01

    In clinical research, study results which are statistically significant are often interpreted as being clinically important. While statistical significance indicates the reliability of the study results, clinical significance reflects its impact on clinical practice. The third article in this series exploring pitfalls in statistical analysis clarifies the importance of differentiating between statistical significance and clinical significance. PMID:26229754

  5. Statistical transmutation in doped quantum dimer models.

    PubMed

    Lamas, C A; Ralko, A; Cabra, D C; Poilblanc, D; Pujol, P

    2012-07-06

    We prove a "statistical transmutation" symmetry of doped quantum dimer models on the square, triangular, and kagome lattices: the energy spectrum is invariant under a simultaneous change of statistics (i.e., bosonic into fermionic or vice versa) of the holes and of the signs of all the dimer resonance loops. This exact transformation enables us to define the duality equivalence between doped quantum dimer Hamiltonians and provides the analytic framework to analyze dynamical statistical transmutations. We investigate numerically the doping of the triangular quantum dimer model with special focus on the topological Z(2) dimer liquid. Doping leads to four (instead of two for the square lattice) inequivalent families of Hamiltonians. Competition between phase separation, superfluidity, supersolidity, and fermionic phases is investigated in the four families.

  6. Statistics of the Work done in a Quantum Quench

    NASA Astrophysics Data System (ADS)

    Silva, Alessandro

    2009-03-01

    The quantum quench, i.e. a rapid change in time of a control parameter of a quantum system, is the simplest paradigm of a non-equilibrium process, completely analogous to a standard thermodynamic transformation. The dynamics following a quantum quench is particularly interesting in strongly correlated quantum systems, most prominently when the quench is performed across a quantum critical point. In this talk I will present a way to characterize the physics of quantum quenches by looking at the statistics of a basic thermodynamic variable: the work done on the system by changing its parameters [1]. I will first elucidate the relation between the probability distribution of the work, quantum Jarzynski equalities, and the Loschmidt echo, a quantity that usually emerges in the context of dephasing. Using this connection, I will then characterize the statistics of the work done on a quantum Ising chain by quenching the transverse field locally or globally. I will then show that for global quenches the presence of a quantum critical point results in singularities of the moments of the distribution, while, for local quenches starting at criticality, the probability distribution itself displays an interesting edge singularity. The results of a similar analysis for other systems will be discussed. [1] A. Silva, Phys. Rev. Lett. 101, 120603 (2008).

  7. GTF2I mutation frequently occurs in more indolent thymic epithelial tumors and predicts better prognosis.

    PubMed

    Feng, Yanfen; Lei, Yiyan; Wu, Xiaoyan; Huang, Yuhua; Rao, Huilan; Zhang, Yu; Wang, Fang

    2017-08-01

    A missense mutation in GTF2I was previously identified in thymic epithelial tumor (TET). However, the clinicopathological relevance of GTF2I mutation has not been illustrated. We studied the prognostic importance of GTF2I mutation as well as its relation to histological subtypes in a large number of TETs. TET samples from 296 patients with clinical and follow-up data were collected, and histological subtypes were classified. Analysis of the GTF2I (chromosome 7 c.74146970T>A) mutation was undertaken by using quantitative real-time polymerase chain reaction (qPCR) and direct sequencing. The association of GTF2I mutation with clinicopathological features as well as prognosis was analyzed. One hundred twenty-four out of 296 (41.9%) patients harbored the GTF2I mutation (chromosome 7 c.74146970T>A). GTF2I mutation was observed in 20 (87.0%) cases of type A thymoma and 70 (78.7%) of type AB thymoma, and the frequency decreased with the degree of histological subtype aggressiveness, with the lowest rate in thymic carcinoma (7.7%). The difference in GTF2I mutation distribution across histological subtypes was statistically significant (p<0.001). The GTF2I mutation was found more frequently in patients with early Masaoka stage (I-II, n=112, 90.3%) than in those with advanced stage (III-IV) disease (n=12, 9.6%, p<0.001). However, only histological subtype significantly predicted the presence of the GTF2I mutation in patients with TETs. The presence of the GTF2I mutation correlated with better prognosis (90.0% compared to 72.0% 5-year survival, and 86% compared to 56% 10-year survival, respectively; log-rank p=0.001). Moreover, it was an independent prognostic factor [hazard ratio (HR), 0.35; 95% confidence interval (CI), 0.15-0.81; p=0.014]. The frequency of the GTF2I mutation is higher in more indolent TETs, and correlates with better prognosis. Further studies are required to elucidate the role of the GTF2I mutation in TETs and its clinical application. Copyright © 2017

  8. Amyloid Precursor-like Protein 2 Association with HLA Class I Molecules

    PubMed Central

    Tuli, Amit; Sharma, Mahak; Wang, Xiaojian; Simone, Laura C.; Capek, Haley L.; Cate, Steven; Hildebrand, William H.; Naslavsky, Naava; Caplan, Steve; Solheim, Joyce C.

    2009-01-01

    Amyloid precursor-like protein 2 (APLP2) is a ubiquitously expressed protein. The previously demonstrated functions for APLP2 include binding to the mouse major histocompatibility complex (MHC) class I molecule H-2Kd and downregulating its cell surface expression. In this study, we have investigated the interaction of APLP2 with the human leukocyte antigen (HLA) class I molecule in human tumor cell lines. APLP2 was readily detected in pancreatic, breast, and prostate tumor lines, although it was found only in very low amounts in lymphoma cell lines. In a pancreatic tumor cell line, HLA class I was extensively co-localized with APLP2 in vesicular compartments following endocytosis of HLA class I molecules. In pancreatic, breast, and prostate tumor lines, APLP2 was bound to the HLA class I molecule. APLP2 was found to bind to HLA-A24, and more strongly to HLA-A2. Increased expression of APLP2 resulted in reduced surface expression of HLA-A2 and HLA-A24. Overall, these studies demonstrate that APLP2 binds to the HLA class I molecule, co-localizes with it in intracellular vesicles, and reduces the level of HLA class I molecule cell surface expression. PMID:19184004

  9. Statistically significant performance results of a mine detector and fusion algorithm from an x-band high-resolution SAR

    NASA Astrophysics Data System (ADS)

    Williams, Arnold C.; Pachowicz, Peter W.

    2004-09-01

    Current mine detection research indicates that no single sensor or single look from a sensor will detect mines/minefields in a real-time manner at a performance level suitable for a forward maneuver unit. Hence, the integrated development of detectors and fusion algorithms is of primary importance. A problem in this development process has been the evaluation of these algorithms with relatively small data sets, leading to anecdotal and frequently overtrained results. These anecdotal results are often unreliable and conflicting among various sensors and algorithms. Consequently, the physical phenomena that ought to be exploited and the performance benefits of this exploitation are often ambiguous. The Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate has collected large amounts of multisensor data such that statistically significant evaluations of detection and fusion algorithms can be obtained. Even with these large data sets, care must be taken in algorithm design and data processing to achieve statistically significant performance results for combined detectors and fusion algorithms. This paper discusses statistically significant detection and combined multilook fusion results for the Ellipse Detector (ED) and the Piecewise Level Fusion Algorithm (PLFA). These statistically significant performance results are characterized by ROC curves that have been obtained by processing this multilook data from the high-resolution Veridian X-band radar. We discuss the implications of these results for mine detection and the importance of statistical significance, sample size, ground truth, and algorithm design in performance evaluation.
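An ROC curve of the kind used above summarizes a detector by sweeping a threshold over its scores; its area equals the Mann-Whitney probability that a random target score outranks a random clutter score. A minimal sketch on hypothetical detector scores (not the paper's data):

```python
# Hypothetical detector scores for mine (signal) and clutter returns.  The
# area under the ROC curve equals the probability that a randomly chosen
# signal score exceeds a randomly chosen clutter score (Mann-Whitney form).
signal  = [0.90, 0.80, 0.75, 0.60, 0.55]
clutter = [0.70, 0.50, 0.40, 0.35, 0.20]

wins = sum((s > c) + 0.5 * (s == c) for s in signal for c in clutter)
auc = wins / (len(signal) * len(clutter))
print(f"AUC = {auc:.2f}")
```

Statistical significance of a ROC comparison then hinges on the number of independent target and clutter samples behind each curve, which is the sample-size point the abstract stresses.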

  10. 25 CFR 502.2 - Class I gaming.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 2 2011-04-01 2011-04-01 false Class I gaming. 502.2 Section 502.2 Indians NATIONAL INDIAN GAMING COMMISSION, DEPARTMENT OF THE INTERIOR GENERAL PROVISIONS DEFINITIONS OF THIS CHAPTER § 502.2 Class I gaming. Class I gaming means: (a) Social games played solely for prizes of minimal value...

  11. 25 CFR 502.2 - Class I gaming.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 2 2010-04-01 2010-04-01 false Class I gaming. 502.2 Section 502.2 Indians NATIONAL INDIAN GAMING COMMISSION, DEPARTMENT OF THE INTERIOR GENERAL PROVISIONS DEFINITIONS OF THIS CHAPTER § 502.2 Class I gaming. Class I gaming means: (a) Social games played solely for prizes of minimal value...

  12. 25 CFR 502.2 - Class I gaming.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 25 Indians 2 2012-04-01 2012-04-01 false Class I gaming. 502.2 Section 502.2 Indians NATIONAL INDIAN GAMING COMMISSION, DEPARTMENT OF THE INTERIOR GENERAL PROVISIONS DEFINITIONS OF THIS CHAPTER § 502.2 Class I gaming. Class I gaming means: (a) Social games played solely for prizes of minimal value...

  13. 25 CFR 502.2 - Class I gaming.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 2 2013-04-01 2013-04-01 false Class I gaming. 502.2 Section 502.2 Indians NATIONAL INDIAN GAMING COMMISSION, DEPARTMENT OF THE INTERIOR GENERAL PROVISIONS DEFINITIONS OF THIS CHAPTER § 502.2 Class I gaming. Class I gaming means: (a) Social games played solely for prizes of minimal value...

  14. 25 CFR 502.2 - Class I gaming.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 2 2014-04-01 2014-04-01 false Class I gaming. 502.2 Section 502.2 Indians NATIONAL INDIAN GAMING COMMISSION, DEPARTMENT OF THE INTERIOR GENERAL PROVISIONS DEFINITIONS OF THIS CHAPTER § 502.2 Class I gaming. Class I gaming means: (a) Social games played solely for prizes of minimal value...

  15. Defect Engineering in SrI2:Eu2+ Single Crystal Scintillators

    DOE PAGES

    Wu, Yuntao; Boatner, Lynn A.; Lindsey, Adam C.; ...

    2015-06-23

    Eu2+-activated strontium iodide is an excellent single crystal scintillator used for gamma-ray detection, and significant effort is currently focused on the development of large-scale crystal growth techniques. A new approach of molten-salt pumping, or so-called melt aging, was recently applied to optimize the crystal quality and scintillation performance. Nevertheless, a detailed understanding of the underlying mechanism of this technique is still lacking. The main purpose of this paper is to conduct an in-depth study of the interplay between microstructure, trap centers and scintillation efficiency after melt aging treatment. Three SrI2:2 mol% Eu2+ single crystals with 16 mm diameter were grown using the Bridgman method under identical growth conditions with the exception of the melt aging time (0, 24 and 72 hours). Using energy-dispersive X-ray spectroscopy, it is found that the matrix composition of the finished crystal after melt aging treatment approaches the stoichiometric composition. The mechanism responsible for the formation of secondary phase inclusions in melt-aged SrI2:Eu2+ is discussed. Simultaneous improvement in light yield, energy resolution, scintillation decay time and afterglow is achieved in melt-aged SrI2:Eu2+. The correlation between performance improvement and defect structure is addressed. The results of this paper lead to a better understanding of the effects of defect engineering in the control and optimization of metal halide scintillators using the melt aging technique.

  16. Statistical Physics of Adaptation

    DTIC Science & Technology

    2016-08-23

    Statistical Physics of Adaptation Nikolay Perunov, Robert A. Marsland, and Jeremy L. England, Department of Physics, Physics of Living Systems Group...Subject Areas: Biological Physics, Complex Systems, Statistical Physics I. INTRODUCTION It has long been understood that nonequilibrium driving can...equilibrium may appear to have been specially selected for physical properties connected to their ability to absorb work from the particular driving environment

  17. Assessment of trace elements levels in patients with Type 2 diabetes using multivariate statistical analysis.

    PubMed

    Badran, M; Morsy, R; Soliman, H; Elnimr, T

    2016-01-01

    The metabolism of trace elements has been reported to play specific roles in the pathogenesis and progression of diabetes mellitus. Given the continuous increase in the population of patients with Type 2 diabetes (T2D), this study aims to assess the levels and inter-relationships of fasting blood glucose (FBG) and serum trace elements in Type 2 diabetic patients. This study was conducted on 40 Egyptian Type 2 diabetic patients and 36 healthy volunteers (Hospital of Tanta University, Tanta, Egypt). The blood serum was digested and then used to determine the levels of 24 trace elements using inductively coupled plasma mass spectrometry (ICP-MS). Multivariate statistical analyses based on correlation coefficients, cluster analysis (CA) and principal component analysis (PCA) were used to analyze the data. The results exhibited significant changes in FBG and in the levels of eight trace elements (Zn, Cu, Se, Fe, Mn, Cr, Mg and As) in the blood serum of Type 2 diabetic patients relative to healthy controls. The multivariate statistical techniques were effective in reducing the experimental variables and grouped the trace elements in patients into three clusters. The application of PCA revealed a distinct difference in the associations and clustering patterns of trace elements between the control and patient groups, in particular for Mg, Fe, Cu and Zn, which appeared to be the factors most closely related to Type 2 diabetes. Therefore, on the basis of this study, the contributors to trace element content in Type 2 diabetic patients can be determined and specified through correlation relationships and multivariate statistical analysis, which confirms that the alteration of some essential trace metals may play a role in the development of diabetes mellitus. Copyright © 2015 Elsevier GmbH. All rights reserved.
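For readers unfamiliar with the PCA step, here is a minimal sketch of extracting explained-variance fractions from a covariance matrix; the toy "concentration" matrix is invented for illustration, not patient data from the study.

```python
import numpy as np

# Toy matrix: 6 samples x 3 variables, where the first two variables are
# strongly correlated (illustrative values only, not trace-element data).
X = np.array([[1.0,  2.1, 0.5],
              [2.0,  3.9, 0.4],
              [3.0,  6.2, 0.6],
              [4.0,  8.1, 0.5],
              [5.0,  9.8, 0.7],
              [6.0, 12.1, 0.4]])

Xc = X - X.mean(axis=0)            # center each variable
C = np.cov(Xc, rowvar=False)       # sample covariance matrix
evals, evecs = np.linalg.eigh(C)   # eigh returns ascending eigenvalues
evals = evals[::-1]                # sort descending: PC1 first
explained = evals / evals.sum()    # fraction of variance per component

# The correlated pair dominates: PC1 carries almost all the variance.
print(explained[0])
```

In practice one would standardize each variable first when units differ, as they do for element concentrations.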

  18. A basic introduction to statistics for the orthopaedic surgeon.

    PubMed

    Bertrand, Catherine; Van Riet, Roger; Verstreken, Frederik; Michielsen, Jef

    2012-02-01

    Orthopaedic surgeons should review the orthopaedic literature in order to keep pace with the latest insights and practices. A good understanding of basic statistical principles is of crucial importance to the ability to read articles critically, to interpret results and to arrive at correct conclusions. This paper explains some of the key concepts in statistics, including hypothesis testing, Type I and Type II errors, testing of normality, sample size and p values.
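The Type I error concept mentioned above can be demonstrated by simulation. This is a hedged sketch using a toy two-sided z-test with known variance (not an example from the paper): when the null hypothesis is true, the test rejects about 5% of the time at α = 0.05.

```python
import random
from statistics import NormalDist

# Simulate many experiments in which the null hypothesis is TRUE and
# count how often a two-sided z-test (alpha = 0.05) falsely rejects.
random.seed(42)
alpha = 0.05
z_crit = NormalDist().inv_cdf(1 - alpha / 2)   # critical value, ~1.96

n, trials, rejections = 30, 4000, 0
for _ in range(trials):
    sample = [random.gauss(0.0, 1.0) for _ in range(n)]  # true mean is 0
    z = (sum(sample) / n) / (1.0 / n ** 0.5)             # known sigma = 1
    if abs(z) > z_crit:
        rejections += 1                                   # Type I error

print(rejections / trials)   # close to 0.05
```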

  19. Beer Law Constants and Vapor Pressures of HgI2 over HgI2(s,l)

    NASA Technical Reports Server (NTRS)

    Su, Ching-Hua; Zhu, Shen; Ramachandran, N.; Burger, A.

    2002-01-01

    Optical absorption spectra of the vapor phase over HgI2(s,l) were measured at sample temperatures between 349 and 610 K for wavelengths between 200 and 600 nm. The spectra show that the samples sublimed congruently into HgI2 without any observed Hg or I2 absorption spectra. The Beer's Law constants for 15 wavelengths between 200 and 440 nm were derived. From these constants the vapor pressure of HgI2, P, was found as a function of temperature for the liquid and the solid beta-phases: ln P(atm) = -7700/T(K) + 12.462 (liquid phase) and ln P(atm) = -10150/T(K) + 17.026 (beta-phase). These expressions correspond to enthalpies of vaporization and sublimation of 15.30 and 20.17 kcal/mole, respectively, for the liquid and the beta-phase HgI2. The difference between the enthalpies gives an enthalpy of fusion of 4.87 kcal/mole, and the intersection of the two expressions gives a melting point of 537 K.
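The reported numbers are internally consistent and easy to check, assuming the integrated Clausius-Clapeyron relation in which each slope equals -ΔH/R and the melting point is where the two ln P lines intersect:

```python
# Reproducing the abstract's derived quantities from the two expressions:
#   ln P(atm) = -7700/T  + 12.462   (liquid)
#   ln P(atm) = -10150/T + 17.026   (solid beta-phase)
R = 1.98720425  # gas constant in cal/(mol*K)

dH_vap = 7700 * R / 1000.0    # kcal/mol, from the liquid-phase slope
dH_sub = 10150 * R / 1000.0   # kcal/mol, from the beta-phase slope
dH_fus = dH_sub - dH_vap      # enthalpy of fusion

# Melting point: temperature at which the two lines give the same ln P.
T_melt = (10150 - 7700) / (17.026 - 12.462)

print(round(dH_vap, 2), round(dH_sub, 2), round(dH_fus, 2), round(T_melt))
# → 15.3 20.17 4.87 537
```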

  20. Prioritizing GWAS Results: A Review of Statistical Methods and Recommendations for Their Application

    PubMed Central

    Cantor, Rita M.; Lange, Kenneth; Sinsheimer, Janet S.

    2010-01-01

    Genome-wide association studies (GWAS) have rapidly become a standard method for disease gene discovery. A substantial number of recent GWAS indicate that for most disorders, only a few common variants are implicated and the associated SNPs explain only a small fraction of the genetic risk. This review is written from the viewpoint that findings from the GWAS provide preliminary genetic information that is available for additional analysis by statistical procedures that accumulate evidence, and that these secondary analyses are very likely to provide valuable information that will help prioritize the strongest constellations of results. We review and discuss three analytic methods to combine preliminary GWAS statistics to identify genes, alleles, and pathways for deeper investigations. Meta-analysis seeks to pool information from multiple GWAS to increase the chances of finding true positives among the false positives and provides a way to combine associations across GWAS, even when the original data are unavailable. Testing for epistasis within a single GWAS study can identify the stronger results that are revealed when genes interact. Pathway analysis of GWAS results is used to prioritize genes and pathways within a biological context. Following a GWAS, association results can be assigned to pathways and tested in aggregate with computational tools and pathway databases. Reviews of published methods with recommendations for their application are provided within the framework for each approach. PMID:20074509

  1. Understanding common statistical methods, Part I: descriptive methods, probability, and continuous data.

    PubMed

    Skinner, Carl G; Patel, Manish M; Thomas, Jerry D; Miller, Michael A

    2011-01-01

    Statistical methods are pervasive in medical research and general medical literature. Understanding general statistical concepts will enhance our ability to critically appraise the current literature and ultimately improve the delivery of patient care. This article intends to provide an overview of the common statistical methods relevant to medicine.
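As one concrete instance of the descriptive methods this series covers, here is a confidence interval for a mean using the normal approximation; the data are illustrative, not from the article, and a t-interval would be slightly wider at this sample size.

```python
from statistics import NormalDist, mean, stdev

# Illustrative sample only: a 95% CI for the mean, normal approximation.
data = [4.2, 5.1, 3.8, 4.9, 5.5, 4.0, 4.7, 5.2, 4.4, 4.8]
n = len(data)
m, s = mean(data), stdev(data)          # sample mean and SD
z = NormalDist().inv_cdf(0.975)         # ~1.96 for a 95% interval
half_width = z * s / n ** 0.5           # standard error times z

print((round(m - half_width, 2), round(m + half_width, 2)))
```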

  2. Multiple statistical tests: Lessons from a d20.

    PubMed

    Madan, Christopher R

    2016-01-01

    Statistical analyses are often conducted with α = .05. When multiple statistical tests are conducted, this procedure needs to be adjusted to compensate for the otherwise inflated Type I error. In tabletop gaming, it is sometimes desired to roll a 20-sided die (or 'd20') twice and take the greater outcome. Here I draw from probability theory and the case of a d20, where the probability of obtaining any specific outcome is 1/20, to determine the probability of obtaining a specific outcome (Type I error) at least once across repeated, independent statistical tests.
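The d20 argument reduces to the familiar familywise error formula 1 - (1 - α)^k for k independent tests; a quick sketch:

```python
# Rolling a d20 twice and keeping the better outcome is exactly the
# "at least one success in k tries" calculation behind inflated Type I
# error across k independent tests at level alpha.

def familywise_rate(alpha, k):
    """P(at least one false positive in k independent tests at level alpha)."""
    return 1 - (1 - alpha) ** k

# One roll: P(natural 20) = 1/20 = 0.05, the same as alpha = .05.
print(round(familywise_rate(1 / 20, 1), 2))   # → 0.05
# Two rolls, keep the better: P(at least one 20) = 1 - (19/20)^2.
print(round(familywise_rate(1 / 20, 2), 4))   # → 0.0975
# Twenty tests at alpha = .05 already inflate to roughly 64%.
print(round(familywise_rate(0.05, 20), 2))    # → 0.64
```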

  3. Strongly coupled fluid-particle flows in vertical channels. I. Reynolds-averaged two-phase turbulence statistics

    NASA Astrophysics Data System (ADS)

    Capecelatro, Jesse; Desjardins, Olivier; Fox, Rodney O.

    2016-03-01

    Simulations of strongly coupled (i.e., high-mass-loading) fluid-particle flows in vertical channels are performed with the purpose of understanding the fundamental physics of wall-bounded multiphase turbulence. The exact Reynolds-averaged (RA) equations for high-mass-loading suspensions are presented, and the unclosed terms that are retained in the context of fully developed channel flow are evaluated in an Eulerian-Lagrangian (EL) framework for the first time. A key distinction between the RA formulation presented in the current work and previous derivations of multiphase turbulence models is the partitioning of the particle velocity fluctuations into spatially correlated and uncorrelated components, used to define the components of the particle-phase turbulent kinetic energy (TKE) and granular temperature, respectively. The adaptive spatial filtering technique developed in our previous work for homogeneous flows [J. Capecelatro, O. Desjardins, and R. O. Fox, "Numerical study of collisional particle dynamics in cluster-induced turbulence," J. Fluid Mech. 747, R2 (2014)] is shown to accurately partition the particle velocity fluctuations at all distances from the wall. Strong segregation in the components of granular energy is observed, with the largest values of particle-phase TKE associated with clusters falling near the channel wall, while maximum granular temperature is observed at the center of the channel. The anisotropy of the Reynolds stresses both near the wall and far away is found to be a crucial component for understanding the distribution of the particle-phase volume fraction. In Part II of this paper, results from the EL simulations are used to validate a multiphase Reynolds-stress turbulence model that correctly predicts the wall-normal distribution of the two-phase turbulence statistics.

  4. Assessment of statistical education in Indonesia: Preliminary results and initiation to simulation-based inference

    NASA Astrophysics Data System (ADS)

    Saputra, K. V. I.; Cahyadi, L.; Sembiring, U. A.

    2018-01-01

    In this paper, we assess our traditional elementary statistics education and introduce elementary statistics with simulation-based inference. To assess our statistics class, we adapt the well-known CAOS (Comprehensive Assessment of Outcomes in Statistics) test, which serves as an external measure of students' basic statistical literacy and is generally accepted as such. We also introduce a new teaching method in the elementary statistics class: in contrast to the traditional course, we use a simulation-based inference method to conduct hypothesis testing. The literature has shown that this new teaching method works very well in increasing students' understanding of statistics.

  5. 49 CFR 1248.1 - Freight commodity statistics.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 9 2011-10-01 2011-10-01 false Freight commodity statistics. 1248.1 Section 1248... STATISTICS § 1248.1 Freight commodity statistics. All class I railroads, as described in § 1240.1 of this... statistics on the basis of the commodity codes named in § 1248.101. Carriers shall report quarterly on the...

  6. 49 CFR 1248.1 - Freight commodity statistics.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 9 2010-10-01 2010-10-01 false Freight commodity statistics. 1248.1 Section 1248... STATISTICS § 1248.1 Freight commodity statistics. All class I railroads, as described in § 1240.1 of this... statistics on the basis of the commodity codes named in § 1248.101. Carriers shall report quarterly on the...

  7. Assessing colour-dependent occupation statistics inferred from galaxy group catalogues

    NASA Astrophysics Data System (ADS)

    Campbell, Duncan; van den Bosch, Frank C.; Hearin, Andrew; Padmanabhan, Nikhil; Berlind, Andreas; Mo, H. J.; Tinker, Jeremy; Yang, Xiaohu

    2015-09-01

    We investigate the ability of current implementations of galaxy group finders to recover colour-dependent halo occupation statistics. To test the fidelity of group catalogue inferred statistics, we run three different group finders used in the literature over a mock that includes galaxy colours in a realistic manner. Overall, the resulting mock group catalogues are remarkably similar, and most colour-dependent statistics are recovered with reasonable accuracy. However, it is also clear that certain systematic errors arise as a consequence of correlated errors in group membership determination, central/satellite designation, and halo mass assignment. We introduce a new statistic, the halo transition probability (HTP), which captures the combined impact of all these errors. As a rule of thumb, errors tend to equalize the properties of distinct galaxy populations (i.e. red versus blue galaxies or centrals versus satellites), and to result in inferred occupation statistics that are more accurate for red galaxies than for blue galaxies. A statistic that is particularly poorly recovered from the group catalogues is the red fraction of central galaxies as a function of halo mass. Group finders do a good job in recovering galactic conformity, but also have a tendency to introduce weak conformity when none is present. We conclude that proper inference of colour-dependent statistics from group catalogues is best achieved using forward modelling (i.e. running group finders over mock data) or by implementing a correction scheme based on the HTP, as long as the latter is not too strongly model dependent.

  8. Numerical investigation of kinetic turbulence in relativistic pair plasmas - I. Turbulence statistics

    NASA Astrophysics Data System (ADS)

    Zhdankin, Vladimir; Uzdensky, Dmitri A.; Werner, Gregory R.; Begelman, Mitchell C.

    2018-02-01

    We describe results from particle-in-cell simulations of driven turbulence in collisionless, magnetized, relativistic pair plasma. This physical regime provides a simple setting for investigating the basic properties of kinetic turbulence and is relevant for high-energy astrophysical systems such as pulsar wind nebulae and astrophysical jets. In this paper, we investigate the statistics of turbulent fluctuations in simulations on lattices of up to 1024³ cells and containing up to 2 × 10¹¹ particles. Due to the absence of a cooling mechanism in our simulations, turbulent energy dissipation reduces the magnetization parameter to order unity within a few dynamical times, causing turbulent motions to become sub-relativistic. In the developed stage, our results agree with predictions from magnetohydrodynamic turbulence phenomenology at inertial-range scales, including a power-law magnetic energy spectrum with index near -5/3, scale-dependent anisotropy of fluctuations described by critical balance, lognormal distributions for particle density and internal energy density (related by a 4/3 adiabatic index, as predicted for an ultra-relativistic ideal gas), and the presence of intermittency. We also present possible signatures of a kinetic cascade by measuring power-law spectra for the magnetic, electric and density fluctuations at sub-Larmor scales.
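Spectral indices like the -5/3 reported above are typically measured as a log-log slope over the inertial range. A minimal sketch on synthetic data (not the simulation output described in the abstract):

```python
import math

# Recover a -5/3 spectral index from a synthetic power-law energy
# spectrum by least-squares fitting in log-log space.
ks = [2 ** i for i in range(1, 9)]       # wavenumbers in an "inertial range"
Es = [k ** (-5 / 3) for k in ks]         # E(k) ~ k^(-5/3), exact power law

xs = [math.log(k) for k in ks]
ys = [math.log(E) for E in Es]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
        / sum((x - xbar) ** 2 for x in xs)

print(round(slope, 3))   # → -1.667
```

On real simulation spectra the fit window matters: the slope should be measured only over scales well separated from both the forcing and dissipation ranges.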

  9. Docking studies on NSAID/COX-2 isozyme complexes using Contact Statistics analysis

    NASA Astrophysics Data System (ADS)

    Ermondi, Giuseppe; Caron, Giulia; Lawrence, Raelene; Longo, Dario

    2004-11-01

    The selective inhibition of COX-2 isozymes should lead to a new generation of NSAIDs with significantly reduced side effects; e.g. celecoxib (Celebrex®) and rofecoxib (Vioxx®). To obtain inhibitors with higher selectivity it has become essential to gain additional insight into the details of the interactions between COX isozymes and NSAIDs. Although X-ray structures of COX-2 complexed with a small number of ligands are available, experimental data are missing for two well-known selective COX-2 inhibitors (rofecoxib and nimesulide) and docking results reported are controversial. We use a combination of a traditional docking procedure with a new computational tool (Contact Statistics analysis) that identifies the best orientation among a number of solutions to shed some light on this topic.

  10. iTOUGH2 V6.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finsterle, Stefan A.

    2010-11-01

    iTOUGH2 (inverse TOUGH2) provides inverse modeling capabilities for TOUGH2, a simulator for multi-dimensional, multi-phase, multi-component, non-isothermal flow and transport in fractured porous media. It performs sensitivity analysis, parameter estimation, and uncertainty propagation analysis in geosciences, reservoir engineering, and other application areas. It supports a number of different combinations of fluids and components [equation-of-state (EOS) modules]. In addition, the optimization routines implemented in iTOUGH2 can also be used for sensitivity analysis, automatic model calibration, and uncertainty quantification of any external code that uses text-based input and output files. This link is achieved by means of the PEST application programming interface. iTOUGH2 solves the inverse problem by minimizing a non-linear objective function of the weighted differences between model output and the corresponding observations. Multiple minimization algorithms (derivative-free, gradient-based and second-order; local and global) are available. iTOUGH2 also performs Latin Hypercube Monte Carlo simulations for uncertainty propagation analysis. A detailed residual and error analysis is provided. This upgrade includes new EOS modules (specifically EOS7c, ECO2N and TMVOC), hysteretic relative permeability and capillary pressure functions, and the PEST API. More details can be found at http://esd.lbl.gov/iTOUGH2 and the publications cited there. Hardware requirements: multi-platform. Related/auxiliary software: PVM (if running in parallel).

  11. iPARTS2: an improved tool for pairwise alignment of RNA tertiary structures, version 2.

    PubMed

    Yang, Chung-Han; Shih, Cheng-Ting; Chen, Kun-Tze; Lee, Po-Han; Tsai, Ping-Han; Lin, Jian-Cheng; Yen, Ching-Yu; Lin, Tiao-Yin; Lu, Chin Lung

    2016-07-08

    Since its first release in 2010, iPARTS has become a valuable tool for globally or locally aligning two RNA 3D structures. It was implemented by a structural alphabet (SA)-based approach, which uses an SA of 23 letters to reduce RNA 3D structures into 1D sequences of SA letters and applies traditional sequence alignment to these SA-encoded sequences to determine their global or local similarity. In this version, we have re-implemented iPARTS into a new web server, iPARTS2, by constructing a totally new SA consisting of 92 elements, each carrying both base and backbone geometry information for a representative nucleotide. This SA is significantly different from the one used in iPARTS, because the latter consists of only 23 elements, each carrying only the backbone geometry information of a representative nucleotide. Our experimental results have shown that iPARTS2 outperforms its previous version iPARTS and also achieves better accuracy than other popular tools, such as SARA, SETTER and RASS, in RNA alignment quality and function prediction. iPARTS2 takes as input two RNA 3D structures in the PDB format and outputs their global or local alignments with graphical display. iPARTS2 is now available online at http://genome.cs.nthu.edu.tw/iPARTS2/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
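The SA-based approach reduces structure alignment to string alignment. Below is a generic Needleman-Wunsch global alignment score over letter sequences, sketching that final step; the letters and the match/mismatch/gap scoring are made up, not iPARTS2's 92-element alphabet or its actual scoring scheme.

```python
# Generic global alignment (Needleman-Wunsch) score over two strings.
# Once 3D structures are encoded as 1D sequences of alphabet letters,
# classic dynamic-programming alignment like this applies directly.

def global_align_score(a, b, match=1, mismatch=-1, gap=-1):
    rows, cols = len(a) + 1, len(b) + 1
    dp = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):          # aligning a prefix against nothing
        dp[i][0] = i * gap
    for j in range(1, cols):
        dp[0][j] = j * gap
    for i in range(1, rows):
        for j in range(1, cols):
            diag = dp[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            dp[i][j] = max(diag, dp[i - 1][j] + gap, dp[i][j - 1] + gap)
    return dp[-1][-1]

print(global_align_score("GATT", "GAT"))   # → 2  (three matches, one gap)
```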

  12. Ionization and excitation in cool giant stars. I - Hydrogen and helium

    NASA Technical Reports Server (NTRS)

    Luttermoser, Donald G.; Johnson, Hollis R.

    1992-01-01

    The influence that non-LTE radiative transfer has on the electron density, ionization equilibrium, and excitation equilibrium in model atmospheres representative of both oxygen-rich and carbon-rich red giant stars is demonstrated. The radiative transfer and statistical equilibrium equations are solved self-consistently for H, H(-), H2, He I, C I, C II, Na I, Mg I, Mg II, Ca I, and Ca II in a plane-parallel static medium. Calculations are made for both radiative-equilibrium model photospheres alone and model photospheres with attached chromospheric models as determined semiempirically with IUE spectra of g Her (M6 III) and TX Psc (C6, 2). The excitation and ionization results for hydrogen and helium are reported.

  13. Evaluation of dissolution profile similarity - Comparison between the f2, the multivariate statistical distance and the f2 bootstrapping methods.

    PubMed

    Paixão, Paulo; Gouveia, Luís F; Silva, Nuno; Morais, José A G

    2017-03-01

    A simulation study is presented, evaluating the performance of the f2, the model-independent multivariate statistical distance and the f2 bootstrap methods in their ability to conclude similarity between two dissolution profiles. Different dissolution profiles, based on the Noyes-Whitney equation and ranging from theoretical f2 values between 100 and 40, were simulated. Variability was introduced in the dissolution model parameters in increasing order, ranging from a situation complying with the European guidelines' requirements for the use of the f2 metric to several situations where the f2 metric could no longer be used. Results have shown that the f2 is an acceptable metric when used according to the regulatory requirements, but loses its applicability when variability increases. The multivariate statistical distance presented contradictory results in several of the simulation scenarios, which makes it an unreliable metric for dissolution profile comparisons. The bootstrap f2, although conservative in its conclusions, is a suitable alternative method. Overall, as variability increases, all of the discussed methods reveal problems that can only be solved by increasing the number of dosage form units used in the comparison, which is usually not practical or feasible. Additionally, experimental corrective measures may be undertaken in order to reduce the overall variability, particularly when it is shown that the variability is mainly due to the dissolution assessment rather than being intrinsic to the dosage form. Copyright © 2016. Published by Elsevier B.V.
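The f2 similarity factor itself is a one-line formula. A sketch using the standard regulatory definition, f2 = 50·log10(100 / sqrt(1 + (1/n)·Σ(R_t - T_t)²)); the profiles below are illustrative percent-dissolved values, not the simulated Noyes-Whitney profiles from the study.

```python
import math

# f2 similarity factor between a reference and a test dissolution profile,
# computed over n matched time points of percent dissolved.

def f2(reference, test):
    n = len(reference)
    msd = sum((r - t) ** 2 for r, t in zip(reference, test)) / n
    return 50 * math.log10(100 / math.sqrt(1 + msd))

ref = [25, 45, 65, 85]           # % dissolved at each time point
print(round(f2(ref, ref), 1))    # identical profiles → 100.0
shifted = [r - 8 for r in ref]   # uniform 8% offset, still "similar"
print(round(f2(ref, shifted), 1))
```

By convention f2 ≥ 50 (roughly a 10% average difference or less) is taken to indicate similarity, which is why the value for the shifted profile still falls on the "similar" side.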

  14. Neural Systems with Numerically Matched Input-Output Statistic: Isotonic Bivariate Statistical Modeling

    PubMed Central

    Fiori, Simone

    2007-01-01

    Bivariate statistical modeling from incomplete data is a useful statistical tool that allows one to discover the model underlying two data sets when the data in the two sets do not correspond in size or ordering. Such a situation may occur when the sizes of the two data sets do not match (i.e., there are “holes” in the data) or when the data sets have been acquired independently. Statistical modeling is also useful when the amount of available data is enough to show the relevant statistical features of the phenomenon underlying the data. We propose to tackle the problem of statistical modeling via a neural (nonlinear) system that is able to match its input-output statistic to the statistic of the available data sets. A key point of the new implementation proposed here is that it is based on look-up-table (LUT) neural systems, which guarantee a computationally advantageous way of implementing neural systems. A number of numerical experiments, performed on both synthetic and real-world data sets, illustrate the features of the proposed modeling procedure. PMID:18566641
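The general idea of matching an input-output statistic with a look-up table can be sketched with the simplest possible LUT: empirical quantile matching on equal-sized samples with distinct values. This illustrates the LUT concept only; it is not the paper's neural model.

```python
# LUT-style statistic matching: map each input value to the target
# sample's value of the same rank, so the mapped output reproduces the
# target sample's empirical distribution exactly.
# Assumes equal-sized samples with distinct values (illustration only).

def quantile_match_lut(source, target):
    """Build a LUT pairing rank-ordered source values with target values."""
    return dict(zip(sorted(source), sorted(target)))

source = [5, 1, 9, 3, 7]          # arbitrary input sample
target = [10, 20, 30, 40, 50]     # sample carrying the desired statistic
lut = quantile_match_lut(source, target)
mapped = [lut[x] for x in source]

print(mapped)                             # → [30, 10, 50, 20, 40]
print(sorted(mapped) == sorted(target))   # → True: target statistic matched
```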

  15. Signal Processing in Functional Near-Infrared Spectroscopy (fNIRS): Methodological Differences Lead to Different Statistical Results.

    PubMed

    Pfeifer, Mischa D; Scholkmann, Felix; Labruyère, Rob

    2017-01-01

    Even though research in the field of functional near-infrared spectroscopy (fNIRS) has been performed for more than 20 years, consensus on signal processing methods is still lacking. A significant knowledge gap exists between established researchers and those entering the field. One major issue regularly observed in publications from researchers new to the field is the failure to consider possible signal contamination by hemodynamic changes unrelated to neurovascular coupling (i.e., scalp blood flow and systemic blood flow). This might be due to the fact that these researchers use the signal processing methods provided by the manufacturers of their measurement device without an advanced understanding of the performed steps. The aim of the present study was to investigate how different signal processing approaches (including and excluding approaches that partially correct for the possible signal contamination) affect the results of a typical functional neuroimaging study performed with fNIRS. In particular, we evaluated one standard signal processing method provided by a commercial company and compared it to three customized approaches. We thereby investigated the influence of the chosen method on the statistical outcome of a clinical data set (task-evoked motor cortex activity). No short-channels were used in the present study and therefore two types of multi-channel corrections based on multiple long-channels were applied. The choice of the signal processing method had a considerable influence on the outcome of the study. While methods that ignored the contamination of the fNIRS signals by task-evoked physiological noise yielded several significant hemodynamic responses over the whole head, the statistical significance of these findings disappeared when accounting for part of the contamination using a multi-channel regression. 
We conclude that adopting signal processing methods that correct for physiological confounding effects might yield more realistic results.

  16. Fully Bayesian inference for structural MRI: application to segmentation and statistical analysis of T2-hypointensities.

    PubMed

    Schmidt, Paul; Schmid, Volker J; Gaser, Christian; Buck, Dorothea; Bührlen, Susanne; Förschler, Annette; Mühlau, Mark

    2013-01-01

    Aiming at iron-related T2-hypointensity, which is related to normal aging and neurodegenerative processes, we here present two practicable approaches, based on Bayesian inference, for preprocessing and statistical analysis of a complex set of structural MRI data. In particular, Markov Chain Monte Carlo methods were used to simulate posterior distributions. First, we developed a segmentation algorithm that uses outlier detection based on model checking techniques within a Bayesian mixture model. Second, we developed an analytical tool comprising a Bayesian regression model with smoothness priors (in the form of Gaussian Markov random fields), mitigating the necessity to smooth data prior to statistical analysis. For validation, we used simulated data and MRI data of 27 healthy controls (age: [Formula: see text]; range, [Formula: see text]). We first observed robust segmentation of both simulated T2-hypointensities and gray-matter regions known to be T2-hypointense. Second, simulated data and images of segmented T2-hypointensity were analyzed. We found not only robust identification of simulated effects but also a biologically plausible age-related increase of T2-hypointensity, primarily within the dentate nucleus and additionally within the globus pallidus, substantia nigra, and red nucleus. Our results indicate that fully Bayesian inference can successfully be applied for preprocessing and statistical analysis of structural MRI data.
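The "MCMC to simulate posterior distributions" step can be illustrated generically with a toy Metropolis sampler targeting a standard normal; this is a sketch of the algorithm family only, not the paper's mixture or regression models.

```python
import math
import random

# Minimal Metropolis sampler: random-walk proposals, accepted with
# probability min(1, exp(log-density difference)).
random.seed(1)

def log_density(x):
    return -0.5 * x * x          # unnormalized log of N(0, 1)

samples, x = [], 0.0
for _ in range(20000):
    proposal = x + random.gauss(0.0, 1.0)          # symmetric proposal
    delta = log_density(proposal) - log_density(x)
    if delta >= 0 or random.random() < math.exp(delta):
        x = proposal                               # accept the move
    samples.append(x)

burned = samples[2000:]                            # discard burn-in
print(round(sum(burned) / len(burned), 2))         # ≈ 0.0 (target mean)
```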

  17. Selected Streamflow Statistics and Regression Equations for Predicting Statistics at Stream Locations in Monroe County, Pennsylvania

    USGS Publications Warehouse

    Thompson, Ronald E.; Hoffman, Scott A.

    2006-01-01

    A suite of 28 streamflow statistics, ranging from extreme low to high flows, was computed for 17 continuous-record streamflow-gaging stations and predicted for 20 partial-record stations in Monroe County and contiguous counties in northeastern Pennsylvania. The predicted statistics for the partial-record stations were based on regression analyses relating intermittent flow measurements made at the partial-record stations, indexed to concurrent daily mean flows at continuous-record stations, during base-flow conditions. The same statistics also were predicted for 134 ungaged stream locations in Monroe County on the basis of regression analyses relating the statistics to GIS-determined basin characteristics for the continuous-record station drainage areas. The methodology for developing the regression equations used to predict the statistics was originally devised for estimating low-flow frequencies. This study and a companion study found that the methodology also has application potential for predicting intermediate- and high-flow statistics. The statistics included mean monthly flows, mean annual flow, 7-day low flows for three recurrence intervals, nine flow durations, mean annual base flow, and annual mean base flows for two recurrence intervals. Low standard errors of prediction and high coefficients of determination (R2) indicated good results in using the regression equations to predict the statistics. Regression equations for the larger flow statistics tended to have lower standard errors of prediction and higher coefficients of determination (R2) than equations for the smaller flow statistics. The report discusses the methodologies used in determining the statistics and the limitations of the statistics and the equations used to predict the statistics. Caution is indicated in using the predicted statistics for small drainage areas.
Study results constitute input needed by water-resource managers in Monroe County for planning purposes and evaluation.
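The regression idea here, predicting a flow statistic at an ungaged site from a basin characteristic, can be sketched as a log-log least-squares fit of the form ln(Q) = a + b·ln(A). The drainage areas and flows below are synthetic illustrations, not Monroe County data.

```python
import math

# Fit ln(flow) = a + b * ln(drainage area) and predict at an ungaged site.
areas = [10.0, 25.0, 60.0, 150.0]    # drainage area, mi^2 (synthetic)
flows = [18.0, 44.0, 101.0, 240.0]   # mean annual flow, ft^3/s (synthetic)

xs = [math.log(a) for a in areas]
ys = [math.log(q) for q in flows]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
    / sum((x - xbar) ** 2 for x in xs)
a = ybar - b * xbar

def predict(area):
    """Predicted flow statistic for an ungaged site of given drainage area."""
    return math.exp(a + b * math.log(area))

print(round(b, 2))            # exponent near 1: flow roughly scales with area
print(round(predict(100.0)))  # interpolated estimate within the fitted range
```

As the abstract cautions, such equations extrapolate poorly, so predictions for drainage areas outside the fitted range should be treated skeptically.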

  18. Nanocomposite Phosphor Consisting of CaI2:Eu2+ Single Nanocrystals Embedded in Crystalline SiO2.

    PubMed

    Daicho, Hisayoshi; Iwasaki, Takeshi; Shinomiya, Yu; Nakano, Akitoshi; Sawa, Hiroshi; Yamada, Wataru; Matsuishi, Satoru; Hosono, Hideo

    2017-11-29

    High luminescence efficiency is obtained in halide- and chalcogenide-based phosphors, but they are impractical because of their poor chemical durability. Here we report a halide-based nanocomposite phosphor with excellent luminescence efficiency and sufficient durability for practical use. Our approach was to disperse luminescent single nanocrystals of CaI2:Eu2+ in a chemically stable, translucent crystalline SiO2 matrix. Using this approach, we successfully prepared a nanocomposite phosphor by means of self-organization through a simple solid-state reaction. Single nanocrystals of 6H-polytype CaI2:Eu2+ with diameters of about 50 nm could be generated not only in a SiO2 amorphous powder but also in a SiO2 glass plate. The nanocomposite phosphor formed upon solidification of molten CaI2 left behind in the crystalline SiO2 that formed from the amorphous SiO2 under the influence of a CaI2 flux effect. The resulting nanocomposite phosphor emitted brilliant blue luminescence with an internal quantum efficiency of up to 98% upon 407 nm violet excitation. We used cathodoluminescence microscopy, scanning transmission electron microscopy, and Rietveld refinement of the X-ray diffraction patterns to confirm that the blue luminescence was generated only by the CaI2:Eu2+ single nanocrystals. The phosphor was chemically durable because the luminescent sites were embedded in the crystalline SiO2 matrix. The phosphor is suitable for use in near-ultraviolet light-emitting diodes. The concept for this nanocomposite phosphor can be expected to be effective for improving the practicality of poorly durable materials such as halides and chalcogenides.

  19. Applications of statistical physics to technology price evolution

    NASA Astrophysics Data System (ADS)

    McNerney, James

    Understanding how changing technology affects the prices of goods is a problem with both rich phenomenology and important policy consequences. Using methods from statistical physics, I model technology-driven price evolution. First, I examine a model for the price evolution of individual technologies. The price of a good often follows a power law when plotted against its cumulative production. This observation turns out to have significant consequences for technology policy aimed at mitigating climate change, where technologies are needed that achieve low carbon emissions at low cost. However, no theory adequately explains why technology prices follow power laws. To understand this behavior, I simplify an existing model that treats technologies as machines composed of interacting components. I find that the power law exponent of the price trajectory is inversely related to the number of interactions per component. I extend the model to allow for more realistic component interactions and make a testable prediction. Next, I conduct a case-study on the cost evolution of coal-fired electricity. I derive the cost in terms of various physical and economic components. The results suggest that commodities and technologies fall into distinct classes of price models, with commodities following martingales, and technologies following exponentials in time or power laws in cumulative production. I then examine the network of money flows between industries. This work is a precursor to studying the simultaneous evolution of multiple technologies. Economies resemble large machines, with different industries acting as interacting components with specialized functions. To begin studying the structure of these machines, I examine 20 economies with an emphasis on finding common features to serve as targets for statistical physics models. I find they share the same money flow and industry size distributions. I apply methods from statistical physics to show that industries

  20. Performance Monitoring System: Summary of Lock Statistics. Revision 1.

    DTIC Science & Technology

    1985-12-01


  1. Statistics of City School Systems, 1917-18. Bulletin, 1920, No. 24

    ERIC Educational Resources Information Center

    Bonner, H. R.

    1920-01-01

    This report presents the statistics of city public schools for the school year 1917-18. An attempt has been made for the first time to secure statistics from all cities which had a population of 2,500 or over in 1910. The cities have been divided into five groups: Group I, including all cities with a population of 100,000 and over; Group II, all…

  2. Statistical ensembles for money and debt

    NASA Astrophysics Data System (ADS)

    Viaggiu, Stefano; Lionetto, Andrea; Bargigli, Leonardo; Longo, Michele

    2012-10-01

    We build a statistical ensemble representation of two economic models describing respectively, in simplified terms, a payment system and a credit market. To this purpose we adopt the Boltzmann-Gibbs distribution where the role of the Hamiltonian is taken by the total money supply (i.e. including money created from debt) of a set of interacting economic agents. As a result, we can read the main thermodynamic quantities in terms of monetary ones. In particular, we define for the credit market model a work term which is related to the impact of monetary policy on credit creation. Furthermore, with our formalism we recover and extend some results concerning the temperature of an economic system, previously presented in the literature by considering only the monetary base as a conserved quantity. Finally, we study the statistical ensemble for the Pareto distribution.
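
    The Boltzmann-Gibbs money ensemble can be illustrated with a toy simulation (a sketch in the spirit of the abstract, not the authors' model): random pairwise exchanges that conserve a fixed total money supply relax toward an exponential distribution whose temperature is the average money per agent.

```python
import random

# Toy payment system: total money is conserved, and each transaction
# randomly repartitions the money held by a random pair of agents.
# The stationary distribution is Boltzmann-Gibbs (exponential).
def simulate_exchange(n_agents=1000, m0=100.0, steps=200_000, seed=1):
    rng = random.Random(seed)
    money = [m0] * n_agents
    for _ in range(steps):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i == j:
            continue
        pot = money[i] + money[j]
        share = rng.random() * pot   # random repartition of the pair's money
        money[i], money[j] = share, pot - share
    return money

money = simulate_exchange()
temperature = sum(money) / len(money)  # average money plays the role of kT
```

    For an exponential distribution, roughly 1 - 1/e (about 63%) of agents end up holding less than the average, which is easy to check on the simulated ensemble.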

  3. Unravelling the complex MRI pattern in glutaric aciduria type I using statistical models-a cohort study in 180 patients.

    PubMed

    Garbade, Sven F; Greenberg, Cheryl R; Demirkol, Mübeccel; Gökçay, Gülden; Ribes, Antonia; Campistol, Jaume; Burlina, Alberto B; Burgard, Peter; Kölker, Stefan

    2014-09-01

    Glutaric aciduria type I (GA-I) is a cerebral organic aciduria caused by inherited deficiency of glutaryl-CoA dehydrogenase and is characterized biochemically by an accumulation of putatively neurotoxic dicarboxylic metabolites. The majority of untreated patients develop a complex movement disorder with predominant dystonia between ages 3 and 36 months. Magnetic resonance imaging (MRI) studies have demonstrated striatal and extrastriatal abnormalities. The major aim of this study was to elucidate the complex neuroradiological pattern of patients with GA-I and to associate the MRI findings with the severity of the predominant neurological symptoms. In 180 patients, detailed information about the neurological presentation and brain region-specific MRI abnormalities was obtained via a standardized questionnaire. Patients with a movement disorder more often had MRI abnormalities in the putamen, caudate, cortex, ventricles and external CSF spaces than patients without or with only minor neurological symptoms. Putaminal MRI changes and strongly dilated ventricles were identified as the most reliable predictors of a movement disorder. In contrast, abnormalities in the globus pallidus were not clearly associated with a movement disorder. Caudate and putamen, as well as cortex, ventricles and external CSF spaces, clearly colocalized on a two-dimensional map, demonstrating statistical similarity and suggesting the same underlying pathomechanism. This study demonstrates that complex statistical methods are useful for deciphering the age-dependent and region-specific MRI patterns of rare neurometabolic diseases and are helpful in elucidating the clinical relevance of specific MRI findings.

  4. Annotating longitudinal clinical narratives for de-identification: The 2014 i2b2/UTHealth corpus.

    PubMed

    Stubbs, Amber; Uzuner, Özlem

    2015-12-01

    The 2014 i2b2/UTHealth natural language processing shared task featured a track focused on the de-identification of longitudinal medical records. For this track, we de-identified a set of 1304 longitudinal medical records describing 296 patients. This corpus was de-identified under a broad interpretation of the HIPAA guidelines using double annotation followed by arbitration, rounds of sanity checking, and proofreading. The average token-based F1 measure for the annotators compared to the gold standard was 0.927. The resulting annotations were used both to de-identify the data and to set the gold standard for the de-identification track of the 2014 i2b2/UTHealth shared task. All annotated private health information was replaced with realistic surrogates automatically and then read over and corrected manually. The resulting corpus is the first of its kind made available for de-identification research. This corpus was first used for the 2014 i2b2/UTHealth shared task, during which the systems achieved a mean F-measure of 0.872 and a maximum F-measure of 0.964 using entity-based micro-averaged evaluations. Copyright © 2015 Elsevier Inc. All rights reserved.
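
    A token-based F1 of the kind reported above can be sketched as follows; the token identifiers are illustrative stand-ins, not drawn from the i2b2/UTHealth corpus.

```python
# Sketch of token-level F1: compare the set of tokens one annotator
# flagged as protected health information against the gold standard.
def token_f1(gold_tokens, predicted_tokens):
    gold, pred = set(gold_tokens), set(predicted_tokens)
    if not gold or not pred:
        return 0.0
    tp = len(gold & pred)               # tokens both flagged
    precision = tp / len(pred)
    recall = tp / len(gold)
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Tokens identified by (record_id, token_index) pairs (hypothetical)
gold = {(1, 4), (1, 5), (2, 10), (3, 7)}
pred = {(1, 4), (1, 5), (2, 10), (3, 8)}
score = token_f1(gold, pred)  # tp=3, precision=recall=0.75, F1=0.75
```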

  5. Statistical Transmutation in Floquet Driven Optical Lattices.

    PubMed

    Sedrakyan, Tigran A; Galitski, Victor M; Kamenev, Alex

    2015-11-06

    We show that interacting bosons in a periodically driven two dimensional (2D) optical lattice may effectively exhibit fermionic statistics. The phenomenon is similar to the celebrated Tonks-Girardeau regime in 1D. The Floquet band of a driven lattice develops the moat shape, i.e., a minimum along a closed contour in the Brillouin zone. Such degeneracy of the kinetic energy favors fermionic quasiparticles. The statistical transmutation is achieved by the Chern-Simons flux attachment similar to the fractional quantum Hall case. We show that the velocity distribution of the released bosons is a sensitive probe of the fermionic nature of their stationary Floquet state.

  6. Sn2+-Stabilization in MASnI3 perovskites by superhalide incorporation.

    PubMed

    Xiang, Junxiang; Wang, Kan; Xiang, Bin; Cui, Xudong

    2018-03-28

    Sn-based hybrid halide perovskites are a potential solution to replace Pb and thereby reduce Pb toxicity in MAPbI3 perovskite-based solar cells. However, the instability of Sn2+ in an air atmosphere causes poor reproducibility of MASnI3, hindering steps towards this goal. In this paper, we propose a new type of organic metal-superhalide perovskite, MASnI2BH4 and MASnI2AlH4. Through first-principles calculations, our results reveal that the incorporation of BH4 and AlH4 superhalides can realize an impressive enhancement of the oxidation resistance of Sn2+ in MASnI3 perovskites because of the large electron transfer between Sn2+ and [BH4]-/[AlH4]-. Meanwhile, the high carrier mobility is preserved in these superhalide perovskites and only a slight decrease is observed in the optical absorption strength. Our studies provide a new path to attain highly stable performance and reproducibility of Sn-based perovskite solar cells.

  7. Statistical parsimony networks and species assemblages in cephalotrichid nemerteans (Nemertea).

    PubMed

    Chen, Haixia; Strand, Malin; Norenburg, Jon L; Sun, Shichun; Kajihara, Hiroshi; Chernyshev, Alexey V; Maslakova, Svetlana A; Sundberg, Per

    2010-09-21

    It has been suggested that statistical parsimony network analysis could be used to get an indication of species represented in a set of nucleotide data, and the approach has been used to discuss species boundaries in some taxa. Based on 635 base pairs of the mitochondrial protein-coding gene cytochrome c oxidase I (COI), we analyzed 152 nemertean specimens using statistical parsimony network analysis with the connection probability set to 95%. The analysis revealed 15 distinct networks together with seven singletons. Statistical parsimony yielded three networks supporting the species status of Cephalothrix rufifrons, C. major and C. spiralis as they currently have been delineated by morphological characters and geographical location. Many other networks contained haplotypes from nearby geographical locations. Cladistic structure by maximum likelihood analysis overall supported the network analysis, but indicated a false positive result where subnetworks should have been connected into one network/species. This probably is caused by undersampling of the intraspecific haplotype diversity. Statistical parsimony network analysis provides a rapid and useful tool for detecting possible undescribed/cryptic species among cephalotrichid nemerteans based on COI gene. It should be combined with phylogenetic analysis to get indications of false positive results, i.e., subnetworks that would have been connected with more extensive haplotype sampling.

  8. Fisher's method of combining dependent statistics using generalizations of the gamma distribution with applications to genetic pleiotropic associations.

    PubMed

    Li, Qizhai; Hu, Jiyuan; Ding, Juan; Zheng, Gang

    2014-04-01

    A classical approach to combining independent test statistics is Fisher's combination of p-values, which follows the chi-squared distribution. When the test statistics are dependent, the gamma distribution (GD) is commonly used for the Fisher's combination test (FCT). We propose to use two generalizations of the GD: the generalized and the exponentiated GDs. We study some properties of mis-using the GD for the FCT to combine dependent statistics when one of the two proposed distributions is true. Our results show that both generalizations have better control of type I error rates than the GD, which tends to have inflated type I error rates at more extreme tails. In practice, common model selection criteria (e.g. Akaike information criterion/Bayesian information criterion) can be used to help select a better distribution to use for the FCT. A simple strategy for applying the two generalizations of the GD in genome-wide association studies is discussed. Applications of the results to genetic pleiotropic associations are described, where multiple traits are tested for association with a single marker.
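
    For reference, the classical independent-case baseline named above (Fisher's combination, which follows a chi-squared distribution with 2k degrees of freedom for k p-values) needs no external stats library, because the chi-squared survival function has a closed form for even degrees of freedom. A sketch:

```python
import math

# Fisher's method for k INDEPENDENT p-values:
#   X = -2 * sum(ln p_i)  ~  chi-squared with df = 2k.
# For even df = 2k, the survival function is
#   P(X >= x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!
def fisher_combine(p_values):
    x = -2.0 * sum(math.log(p) for p in p_values)
    k = len(p_values)
    term, total = 1.0, 1.0
    for i in range(1, k):          # accumulate (x/2)^i / i! terms
        term *= (x / 2.0) / i
        total += term
    return math.exp(-x / 2.0) * total

combined = fisher_combine([0.1, 0.2])   # roughly 0.098
```

    When the statistics are dependent, the abstract's point is precisely that this chi-squared reference distribution is wrong, and a gamma-family distribution (GD, generalized GD, or exponentiated GD) should replace it.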

  9. Statistical Aspects of Ice Gouging on the Alaskan Shelf of the Beaufort Sea,

    DTIC Science & Technology

    1983-09-01


  10. Laser-diagnostic mapping of temperature and soot statistics in a 2-m diameter turbulent pool fire

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kearney, Sean P.; Grasser, Thomas W.

    We present spatial profiles of temperature and soot-volume-fraction statistics from a sooting 2-m base diameter turbulent pool fire, burning a 10%-toluene / 90%-methanol fuel mixture. Dual-pump coherent anti-Stokes Raman scattering and laser-induced incandescence are utilized to obtain radial profiles of temperature and soot probability density functions (pdf) as well as estimates of temperature/soot joint statistics at three vertical heights above the surface of the methanol/toluene fuel pool. Results are presented both in the fuel vapor-dome region at ¼ base diameter and in the actively burning region at ½ and ¾ diameters above the fuel surface. The spatial evolution of the soot and temperature pdfs is discussed and profiles of the temperature and soot mean and rms statistics are provided. Joint temperature/soot statistics are presented as spatially resolved conditional averages across the fire plume, and in terms of a joint pdf obtained by including measurements from multiple spatial locations.

  11. Laser-diagnostic mapping of temperature and soot statistics in a 2-m diameter turbulent pool fire

    DOE PAGES

    Kearney, Sean P.; Grasser, Thomas W.

    2017-08-10

    We present spatial profiles of temperature and soot-volume-fraction statistics from a sooting 2-m base diameter turbulent pool fire, burning a 10%-toluene / 90%-methanol fuel mixture. Dual-pump coherent anti-Stokes Raman scattering and laser-induced incandescence are utilized to obtain radial profiles of temperature and soot probability density functions (pdf) as well as estimates of temperature/soot joint statistics at three vertical heights above the surface of the methanol/toluene fuel pool. Results are presented both in the fuel vapor-dome region at ¼ base diameter and in the actively burning region at ½ and ¾ diameters above the fuel surface. The spatial evolution of the soot and temperature pdfs is discussed and profiles of the temperature and soot mean and rms statistics are provided. Joint temperature/soot statistics are presented as spatially resolved conditional averages across the fire plume, and in terms of a joint pdf obtained by including measurements from multiple spatial locations.

  12. Characterization of the hypothermic effects of imidazoline I2 receptor agonists in rats

    PubMed Central

    Thorn, David A; An, Xiao-Fei; Zhang, Yanan; Pigini, Maria; Li, Jun-Xu

    2012-01-01

    BACKGROUND AND PURPOSE Imidazoline I2 receptors have been implicated in several CNS disorders. Although several I2 receptor agonists have been described, no simple and sensitive in vivo bioassay is available for studying I2 receptor ligands. This study examined I2 receptor agonist-induced hypothermia as a functional in vivo assay of I2 receptor agonism. EXPERIMENTAL APPROACH Different groups of rats were used to examine the effects of I2 receptor agonists on rectal temperature and locomotion. The pharmacological mechanisms were investigated by combining I2 receptor ligands and different antagonists. KEY RESULTS All the selective I2 receptor agonists examined (2-BFI, diphenyzoline, phenyzoline, CR4056, tracizoline, BU224 and S22687, 3.2–56 mg·kg⁻¹, i.p.) dose-dependently and markedly decreased rectal temperature (hypothermia) in rats, with varied duration of action. The pharmacological mechanism of the observed hypothermia was studied by combining the I2 receptor agonists (2-BFI, BU224, tracizoline and diphenyzoline) with the imidazoline I2 receptor/α2-adrenoceptor antagonist idazoxan, the selective I1 receptor antagonist efaroxan, or the α2-adrenoceptor antagonist/5-HT1A receptor agonist yohimbine. Idazoxan, but not yohimbine or efaroxan, attenuated the hypothermic effects of 2-BFI, BU224, tracizoline and diphenyzoline, supporting the I2 receptor mechanism. In contrast, both idazoxan and yohimbine attenuated hypothermia induced by the α2-adrenoceptor agonist clonidine. Among all the I2 receptor agonists studied, only S22687 markedly increased locomotor activity in rats. CONCLUSIONS AND IMPLICATIONS Imidazoline I2 receptor agonists can produce hypothermic effects, which are primarily mediated by I2 receptors. These data suggest that I2 receptor agonist-induced hypothermia is a simple and sensitive in vivo assay for studying I2 receptor ligands. PMID:22324428

  13. Determination of errors in derived magnetic field directions in geosynchronous orbit: results from a statistical approach

    NASA Astrophysics Data System (ADS)

    Chen, Yue; Cunningham, Gregory; Henderson, Michael

    2016-09-01

    This study aims to statistically estimate the errors in local magnetic field directions that are derived from electron directional distributions measured by Los Alamos National Laboratory geosynchronous (LANL GEO) satellites. First, by comparing derived and measured magnetic field directions along the GEO orbit to those calculated from three selected empirical global magnetic field models (including a static Olson and Pfitzer 1977 quiet magnetic field model, a simple dynamic Tsyganenko 1989 model, and a sophisticated dynamic Tsyganenko 2001 storm model), it is shown that the errors in both derived and modeled directions are at least comparable. Second, using a newly developed proxy method as well as comparing results from empirical models, we are able to provide for the first time circumstantial evidence showing that derived magnetic field directions should statistically match the real magnetic directions better, with averaged errors < ~2°, than those from the three empirical models with averaged errors > ~5°. In addition, our results suggest that the errors in derived magnetic field directions do not depend much on magnetospheric activity, in contrast to the empirical field models. Finally, as applications of the above conclusions, we show examples of electron pitch angle distributions observed by LANL GEO and also take the derived magnetic field directions as the real ones so as to test the performance of empirical field models along the GEO orbits, with results suggesting dependence on solar cycles as well as satellite locations. This study demonstrates the validity and value of the method that infers local magnetic field directions from particle spin-resolved distributions.
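
    The averaged directional errors quoted above are angles between direction estimates. A sketch of that metric for two field-direction vectors, with illustrative values rather than LANL GEO data:

```python
import math

# Angular error (degrees) between two direction estimates given as
# 3-vectors, e.g. a derived field direction vs. a model prediction.
def angle_deg(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    c = max(-1.0, min(1.0, dot / (nu * nv)))  # clamp for roundoff
    return math.degrees(math.acos(c))

# Hypothetical example: a modeled direction tilted 5 degrees from the
# derived one
derived = (0.0, 0.0, 1.0)
modeled = (0.0, math.sin(math.radians(5)), math.cos(math.radians(5)))
err = angle_deg(derived, modeled)
```

    Averaging such per-sample angles over many orbits gives the < ~2° and > ~5° figures the abstract compares.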

  14. Determination of errors in derived magnetic field directions in geosynchronous orbit: results from a statistical approach

    DOE PAGES

    Chen, Yue; Cunningham, Gregory; Henderson, Michael

    2016-09-21

    Our study aims to statistically estimate the errors in local magnetic field directions that are derived from electron directional distributions measured by Los Alamos National Laboratory geosynchronous (LANL GEO) satellites. First, by comparing derived and measured magnetic field directions along the GEO orbit to those calculated from three selected empirical global magnetic field models (including a static Olson and Pfitzer 1977 quiet magnetic field model, a simple dynamic Tsyganenko 1989 model, and a sophisticated dynamic Tsyganenko 2001 storm model), it is shown that the errors in both derived and modeled directions are at least comparable. Furthermore, using a newly developed proxy method as well as comparing results from empirical models, we are able to provide for the first time circumstantial evidence showing that derived magnetic field directions should statistically match the real magnetic directions better, with averaged errors < ~2°, than those from the three empirical models with averaged errors > ~5°. In addition, our results suggest that the errors in derived magnetic field directions do not depend much on magnetospheric activity, in contrast to the empirical field models. Finally, as applications of the above conclusions, we show examples of electron pitch angle distributions observed by LANL GEO and also take the derived magnetic field directions as the real ones so as to test the performance of empirical field models along the GEO orbits, with results suggesting dependence on solar cycles as well as satellite locations. This study demonstrates the validity and value of the method that infers local magnetic field directions from particle spin-resolved distributions.

  15. Determination of errors in derived magnetic field directions in geosynchronous orbit: results from a statistical approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yue; Cunningham, Gregory; Henderson, Michael

    Our study aims to statistically estimate the errors in local magnetic field directions that are derived from electron directional distributions measured by Los Alamos National Laboratory geosynchronous (LANL GEO) satellites. First, by comparing derived and measured magnetic field directions along the GEO orbit to those calculated from three selected empirical global magnetic field models (including a static Olson and Pfitzer 1977 quiet magnetic field model, a simple dynamic Tsyganenko 1989 model, and a sophisticated dynamic Tsyganenko 2001 storm model), it is shown that the errors in both derived and modeled directions are at least comparable. Furthermore, using a newly developed proxy method as well as comparing results from empirical models, we are able to provide for the first time circumstantial evidence showing that derived magnetic field directions should statistically match the real magnetic directions better, with averaged errors < ~2°, than those from the three empirical models with averaged errors > ~5°. In addition, our results suggest that the errors in derived magnetic field directions do not depend much on magnetospheric activity, in contrast to the empirical field models. Finally, as applications of the above conclusions, we show examples of electron pitch angle distributions observed by LANL GEO and also take the derived magnetic field directions as the real ones so as to test the performance of empirical field models along the GEO orbits, with results suggesting dependence on solar cycles as well as satellite locations. This study demonstrates the validity and value of the method that infers local magnetic field directions from particle spin-resolved distributions.

  16. A Method of Relating General Circulation Model Simulated Climate to the Observed Local Climate. Part I: Seasonal Statistics.

    NASA Astrophysics Data System (ADS)

    Karl, Thomas R.; Wang, Wei-Chyung; Schlesinger, Michael E.; Knight, Richard W.; Portman, David

    1990-10-01

    Important surface observations such as the daily maximum and minimum temperature, daily precipitation, and cloud ceilings often have localized characteristics that are difficult to reproduce with the current resolution and physical parameterizations of state-of-the-art General Circulation climate Models (GCMs). Many of the difficulties can be partially attributed to mismatches in scale, local topography, regional geography, and boundary conditions between models and surface-based observations. Here, we present a method, called climatological projection by model statistics (CPMS), to relate GCM grid-point free-atmosphere statistics, the predictors, to these important local surface observations. The method can be viewed as a generalization of the model output statistics (MOS) and perfect prog (PP) procedures used in numerical weather prediction (NWP) models. It consists of the application of three statistical methods: 1) principal component analysis (PCA), 2) canonical correlation, and 3) inflated regression analysis. The PCA reduces the redundancy of the predictors. The canonical correlation is used to develop simultaneous relationships between linear combinations of the predictors, the canonical variables, and the surface-based observations. Finally, inflated regression is used to relate the important canonical variables to each of the surface-based observed variables. We demonstrate that even an early version of the Oregon State University two-level atmospheric GCM (with prescribed sea surface temperature) produces free-atmosphere statistics that can, when standardized using the model's internal means and variances (the MOS-like version of CPMS), closely approximate the observed local climate. When the model data are standardized by the observed free-atmosphere means and variances (the PP version of CPMS), however, the model does not reproduce the observed surface climate as well. Our results indicate that in the MOS-like version of CPMS the differences between
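
    Two of the three CPMS ingredients can be sketched numerically: PCA to reduce predictor redundancy, and inflated regression, in which the fitted values are rescaled so their variance matches the observed variance (ordinary least squares systematically under-disperses them). Canonical correlation is omitted for brevity, and the data are synthetic:

```python
import numpy as np

# Synthetic predictors with deliberately redundant columns, plus a
# noisy surface variable y (stand-ins for free-atmosphere statistics
# and a local observation; not from the OSU GCM).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 6))
X[:, 3:] = X[:, :3] + 0.1 * rng.standard_normal((200, 3))  # redundancy
y = X[:, 0] - 0.5 * X[:, 1] + rng.standard_normal(200)

# 1) PCA: project standardized predictors onto the leading components
Xs = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = Xs @ Vt[:3].T                      # keep 3 components

# 2) Least-squares regression of y on the component scores
A = np.column_stack([np.ones(len(y)), scores])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
yhat = A @ beta

# 3) Inflation: rescale anomalies so var(yhat_infl) == var(y)
infl = y.std() / yhat.std()
yhat_infl = y.mean() + infl * (yhat - yhat.mean())
```

    The inflation step trades a larger mean-squared error for a realistic variance, which matters when the downscaled series feeds climatological statistics rather than point forecasts.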

  17. Statistical Short-Range Guidance for Peak Wind Speed Forecasts on Kennedy Space Center/Cape Canaveral Air Force Station: Phase I Results

    NASA Technical Reports Server (NTRS)

    Lambert, Winifred C.; Merceret, Francis J. (Technical Monitor)

    2002-01-01

    This report describes the results of the AMU's (Applied Meteorology Unit) Short-Range Statistical Forecasting task for peak winds. The peak wind speeds are an important forecast element for the Space Shuttle and Expendable Launch Vehicle programs. The 45th Weather Squadron and the Spaceflight Meteorology Group indicate that peak winds are challenging to forecast. The Applied Meteorology Unit was tasked to develop tools that aid in short-range forecasts of peak winds at tower sites of operational interest. A 7-year record of wind tower data was used in the analysis. Hourly and directional climatologies by tower and month were developed to determine the seasonal behavior of the average and peak winds. In all climatologies, the average and peak wind speeds were highly variable in time, indicating that the development of a peak wind forecasting tool would be difficult. Probability density functions (PDFs) of peak wind speed were calculated to determine the distribution of peak speed with average speed. These provide forecasters with a means of determining the probability of meeting or exceeding a certain peak wind given an observed or forecast average speed. The climatologies and PDFs provide tools with which to make peak wind forecasts that are critical to safe operations.
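
    The PDF-based tool described above amounts to estimating P(peak >= threshold | average-speed bin) from paired observations. A sketch with synthetic stand-ins for tower data:

```python
# Empirical exceedance probability of peak wind, conditioned on bins of
# the observed (or forecast) average wind speed. Values are invented
# for illustration, not taken from KSC/CCAFS towers.
def exceedance_by_bin(avg, peak, bin_edges, threshold):
    probs = {}
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        peaks = [p for a, p in zip(avg, peak) if lo <= a < hi]
        if peaks:  # skip empty bins
            probs[(lo, hi)] = sum(p >= threshold for p in peaks) / len(peaks)
    return probs

avg = [8, 9, 10, 11, 12, 18, 19, 20, 21, 22]     # average speeds (kt)
peak = [12, 14, 13, 16, 15, 27, 24, 30, 29, 33]  # concurrent peaks (kt)
probs = exceedance_by_bin(avg, peak, [5, 15, 25], threshold=25)
```

    A forecaster with a predicted average speed then reads off the probability of exceeding the operational peak-wind threshold from the matching bin.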

  18. Fermi-Pasta-Ulam-Tsingou problems: Passage from Boltzmann to q-statistics

    NASA Astrophysics Data System (ADS)

    Bagchi, Debarshee; Tsallis, Constantino

    2018-02-01

    The Fermi-Pasta-Ulam (FPU) one-dimensional Hamiltonian includes a quartic term which guarantees ergodicity of the system in the thermodynamic limit. Consistently, the Boltzmann factor P(ε) ∝ e^(-βε) describes its equilibrium distribution of one-body energies, and its velocity distribution is Maxwellian, i.e., P(v) ∝ e^(-βv²/2). We consider here a generalized system where the quartic coupling constant between sites decays as 1/d_ij^α (α ≥ 0; d_ij = 1, 2, …). Through first-principles molecular dynamics we demonstrate that, for large α (above α ≃ 1), i.e., short-range interactions, Boltzmann statistics (based on the additive entropic functional S_B[P(z)] = -k ∫ dz P(z) ln P(z)) is verified. However, for small values of α (below α ≃ 1), i.e., long-range interactions, Boltzmann statistics dramatically fails and is replaced by q-statistics (based on the nonadditive entropic functional S_q[P(z)] = k(1 - ∫ dz [P(z)]^q)/(q - 1), with S_1 = S_B). Indeed, the one-body energy distribution is q-exponential, P(ε) ∝ e_qε^(-β_ε ε) ≡ [1 + (q_ε - 1)β_ε ε]^(-1/(q_ε - 1)) with q_ε > 1, and its velocity distribution is given by P(v) ∝ e_qv^(-β_v v²/2) with q_v > 1. Moreover, within small error bars, we verify q_ε = q_v = q, which decreases from an extrapolated value q ≃ 5/3 to q = 1 when α increases from zero to α ≃ 1, and remains q = 1 thereafter.
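
    The q-exponential appearing in these distributions, e_q(x) = [1 + (1 - q)x]^(1/(1-q)), reduces to the ordinary exponential as q -> 1. A small numerical check of that limit and of the heavier tail for q > 1:

```python
import math

# Tsallis q-exponential; for q -> 1 it converges to exp(x), and for
# q > 1 its tail at negative arguments decays as a power law instead
# of exponentially.
def q_exp(x, q):
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    if base <= 0.0:
        return 0.0  # Tsallis cutoff where the bracket goes negative
    return base ** (1.0 / (1.0 - q))

approx = q_exp(-2.0, 1.001)  # q just above 1
exact = math.exp(-2.0)
```

    With x = -βε this matches the abstract's form [1 + (q - 1)βε]^(-1/(q - 1)).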

  19. A statistical approach to selecting and confirming validation targets in -omics experiments

    PubMed Central

    2012-01-01

    Background Genomic technologies are, by their very nature, designed for hypothesis generation. In some cases, the hypotheses that are generated require that genome scientists confirm findings about specific genes or proteins. But one major advantage of high-throughput technology is that global genetic, genomic, transcriptomic, and proteomic behaviors can be observed. Manual confirmation of every statistically significant genomic result is prohibitively expensive. This has led researchers in genomics to adopt the strategy of confirming only a handful of the most statistically significant results, a small subset chosen for biological interest, or a small random subset. But there is no standard approach for selecting and quantitatively evaluating validation targets. Results Here we present a new statistical method and approach for statistically validating lists of significant results based on confirming only a small random sample. We apply our statistical method to show that the usual practice of confirming only the most statistically significant results does not statistically validate result lists. We analyze an extensively validated RNA-sequencing experiment to show that confirming a random subset can statistically validate entire lists of significant results. Finally, we analyze multiple publicly available microarray experiments to show that statistically validating random samples can both (i) provide evidence to confirm long gene lists and (ii) save thousands of dollars and hundreds of hours of labor over manual validation of each significant result. Conclusions For high-throughput -omics studies, statistical validation is a cost-effective and statistically valid approach to confirming lists of significant results. PMID:22738145
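
    The idea of statistically validating a whole list from a random subset can be sketched (as an illustration, not the authors' exact method) by putting a lower confidence bound on the list-wide confirmation rate, here via a Wilson score interval:

```python
import math

# Validate a random sample of n results from a list of significant
# hits; bound the list-wide true-confirmation rate from below with a
# Wilson score interval (z = 1.96 for ~95% confidence).
def wilson_lower(successes, n, z=1.96):
    p = successes / n
    denom = 1.0 + z * z / n
    center = p + z * z / (2 * n)
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return (center - half) / denom

# Hypothetical example: 28 of 30 randomly sampled hits confirm
lower = wilson_lower(28, 30)   # roughly 0.79
```

    Because the subset is a random sample of the list, the bound applies to the entire list, which is what lets a handful of bench validations stand in for validating every significant result.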

  20. Effect of non-normality on test statistics for one-way independent groups designs.

    PubMed

    Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R

    2012-02-01

    The data obtained from one-way independent groups designs are typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied either to the usual least squares estimators of central tendency and variability, or the Welch test with robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it to the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.
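
    The robust estimators named above — trimmed means and Winsorized variances — are straightforward to compute. A minimal sketch (plain Python; the 20% trimming proportion is a common choice in this literature, not a value from the abstract):

```python
def trimmed_mean(xs, prop=0.2):
    """Mean after dropping the lowest and highest `prop` fraction of values."""
    xs = sorted(xs)
    g = int(prop * len(xs))
    core = xs[g:len(xs) - g]
    return sum(core) / len(core)

def winsorized_variance(xs, prop=0.2):
    """Sample variance after replacing each trimmed tail with the
    nearest retained value (Winsorizing)."""
    xs = sorted(xs)
    g = int(prop * len(xs))
    w = [xs[g]] * g + xs[g:len(xs) - g] + [xs[-g - 1]] * g
    m = sum(w) / len(w)
    return sum((x - m) ** 2 for x in w) / (len(w) - 1)

data = [2, 3, 3, 4, 4, 5, 5, 6, 6, 40]  # one extreme outlier
print(trimmed_mean(data))               # 4.5, insensitive to the outlier
print(winsorized_variance(data))        # small, unlike the ordinary variance
```

The ordinary mean of this sample is 7.8, dragged upward by the single outlier; the trimmed mean stays near the bulk of the data, which is why Welch-type tests built on these estimators keep their Type I error control under non-normality.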

  1. 11.2 YIP Human-in-the-Loop Statistical Relational Learners

    DTIC Science & Technology

    2017-10-23

    learning formalisms including inverse reinforcement learning [4] and statistical relational learning [7, 5, 8]. We have also applied our algorithms in...one introduced for label preferences. 4 Figure 2: Active Advice Seeking for Inverse Reinforcement Learning. active advice seeking is in selecting the...learning tasks. 1.2.1 Sequential Decision-Making Our previous work on advice for inverse reinforcement learning (IRL) defined advice as action

  2. Performance Comparison of Two Gene Set Analysis Methods for Genome-wide Association Study Results: GSA-SNP vs i-GSEA4GWAS.

    PubMed

    Kwon, Ji-Sun; Kim, Jihye; Nam, Dougu; Kim, Sangsoo

    2012-06-01

    Gene set analysis (GSA) is useful in interpreting a genome-wide association study (GWAS) result in terms of biological mechanism. We compared the performance of two different GSA implementations that accept GWAS p-values of single nucleotide polymorphisms (SNPs) or gene-by-gene summaries thereof, GSA-SNP and i-GSEA4GWAS, under the same settings of inputs and parameters. GSA runs were made with two sets of p-values from a Korean type 2 diabetes mellitus GWAS study: 259,188 and 1,152,947 SNPs of the original and imputed genotype datasets, respectively. When Gene Ontology terms were used as gene sets, i-GSEA4GWAS produced 283 and 1,070 hits for the unimputed and imputed datasets, respectively. On the other hand, GSA-SNP reported 94 and 38 hits for the unimputed and imputed datasets, respectively. Similar trends, though to a lesser degree, were observed with Kyoto Encyclopedia of Genes and Genomes (KEGG) gene sets as well. The huge number of hits by i-GSEA4GWAS for the imputed dataset was probably an artifact due to the scaling step in the algorithm. The decrease in hits by GSA-SNP for the imputed dataset may be due to the fact that it relies on Z-statistics, which are sensitive to variations in the background level of associations. Judicious evaluation of the GSA outcomes, perhaps based on multiple programs, is recommended.

  3. Hyperparameterization of soil moisture statistical models for North America with Ensemble Learning Models (Elm)

    NASA Astrophysics Data System (ADS)

    Steinberg, P. D.; Brener, G.; Duffy, D.; Nearing, G. S.; Pelissier, C.

    2017-12-01

    Hyperparameterization of statistical models, i.e., automated model scoring and selection through methods such as evolutionary algorithms, grid searches, and randomized searches, can improve forecast model skill by reducing errors associated with model parameterization, model structure, and statistical properties of training data. Ensemble Learning Models (Elm), and the related Earthio package, provide a flexible interface for automating the selection of parameters and model structure for machine learning models common in climate science and land cover classification, offering convenient tools for loading NetCDF, HDF, Grib, or GeoTiff files, decomposition methods like PCA and manifold learning, and parallel training and prediction with unsupervised and supervised classification, clustering, and regression estimators. Continuum Analytics is using Elm to experiment with statistical soil moisture forecasting based on meteorological forcing data from NASA's North American Land Data Assimilation System (NLDAS). There, Elm uses the NSGA-2 multiobjective optimization algorithm to optimize statistical preprocessing of forcing data and improve goodness-of-fit for statistical models (i.e. feature engineering). This presentation will discuss Elm and its components, including dask (distributed task scheduling), xarray (data structures for n-dimensional arrays), and scikit-learn (statistical preprocessing, clustering, classification, regression), and it will show how NSGA-2 is being used to automate the selection of soil moisture forecast statistical models for North America.
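
    The randomized-search flavor of hyperparameterization described above can be sketched in a few lines. This toy version scores a one-parameter shrinkage "model" on held-out data; the model and parameter names are my own stand-ins, not Elm's API:

```python
import random

def fit_predict(xs_train, ys_train, xs_test, shrink):
    """Toy 1-D least-squares slope with a shrinkage hyperparameter."""
    sxy = sum(x * y for x, y in zip(xs_train, ys_train))
    sxx = sum(x * x for x in xs_train)
    slope = sxy / (sxx + shrink)
    return [slope * x for x in xs_test]

def mse(ys, yhat):
    return sum((a - b) ** 2 for a, b in zip(ys, yhat)) / len(ys)

random.seed(1)
xs = [i / 10 for i in range(40)]
ys = [2.0 * x + random.gauss(0, 0.3) for x in xs]  # noisy linear data
train_x, test_x = xs[:30], xs[30:]
train_y, test_y = ys[:30], ys[30:]

# Randomized search: sample candidate shrinkage values, keep the best scorer.
best = min((random.uniform(0, 5) for _ in range(50)),
           key=lambda s: mse(test_y, fit_predict(train_x, train_y, test_x, s)))
print("best shrinkage:", best)
```

NSGA-2, by contrast, searches for a whole Pareto front over several objectives at once rather than minimizing a single score, but the score-and-select loop is the same shape.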

  4. Statistical approach for selection of biologically informative genes.

    PubMed

    Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N

    2018-05-20

    Selection of informative genes from high dimensional gene expression data has emerged as an important research area in genomics. Many of the gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has been adjudged through post-selection classification accuracy computed by a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, i.e. Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biological sufficient criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique with 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes which are more biologically relevant. The proposed technique is also found to be quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that under the multiple criteria decision-making setup, the proposed technique is best for informative gene selection over the available alternatives. Based on the proposed approach, an R package, i.e. BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study will provide a practical guide to select statistical techniques for selecting informative genes.
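
    The maximum-relevance-minimum-redundancy idea behind Boot-MRMR can be illustrated with a tiny greedy selector. This is plain Python with correlation-based scores — my own simplification of the paper's composite measure — and the toy expression values are hypothetical:

```python
def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def mrmr_select(genes, target, k):
    """Greedily pick k genes maximizing |corr with target| (relevance)
    minus mean |corr with already-picked genes| (redundancy)."""
    selected = []
    remaining = list(genes)
    while remaining and len(selected) < k:
        def score(g):
            rel = abs(pearson(genes[g], target))
            red = (sum(abs(pearson(genes[g], genes[s])) for s in selected)
                   / len(selected)) if selected else 0.0
            return rel - red
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy data: g1 tracks the phenotype, g2 nearly duplicates g1, g3 is weakly related.
genes = {
    "g1": [1.0, 2.0, 3.0, 4.0, 5.0, 6.0],
    "g2": [1.1, 2.1, 2.9, 4.2, 5.0, 6.1],   # redundant with g1
    "g3": [3.0, 1.0, 4.0, 1.0, 5.0, 2.0],   # weakly related
}
target = [0.9, 2.1, 3.2, 3.9, 5.1, 6.0]
print(mrmr_select(genes, target, 2))  # picks g1, then skips redundant g2
```

A pure relevance ranking would pick g1 and g2 — two copies of the same signal; the redundancy penalty is what makes the second pick informative.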

  5. Strongly coupled fluid-particle flows in vertical channels. I. Reynolds-averaged two-phase turbulence statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Capecelatro, Jesse, E-mail: jcaps@illinois.edu; Desjardins, Olivier; Fox, Rodney O.

    Simulations of strongly coupled (i.e., high-mass-loading) fluid-particle flows in vertical channels are performed with the purpose of understanding the fundamental physics of wall-bounded multiphase turbulence. The exact Reynolds-averaged (RA) equations for high-mass-loading suspensions are presented, and the unclosed terms that are retained in the context of fully developed channel flow are evaluated in an Eulerian–Lagrangian (EL) framework for the first time. A key distinction between the RA formulation presented in the current work and previous derivations of multiphase turbulence models is the partitioning of the particle velocity fluctuations into spatially correlated and uncorrelated components, used to define the components of the particle-phase turbulent kinetic energy (TKE) and granular temperature, respectively. The adaptive spatial filtering technique developed in our previous work for homogeneous flows [J. Capecelatro, O. Desjardins, and R. O. Fox, “Numerical study of collisional particle dynamics in cluster-induced turbulence,” J. Fluid Mech. 747, R2 (2014)] is shown to accurately partition the particle velocity fluctuations at all distances from the wall. Strong segregation in the components of granular energy is observed, with the largest values of particle-phase TKE associated with clusters falling near the channel wall, while maximum granular temperature is observed at the center of the channel. The anisotropy of the Reynolds stresses both near the wall and far away is found to be a crucial component for understanding the distribution of the particle-phase volume fraction. In Part II of this paper, results from the EL simulations are used to validate a multiphase Reynolds-stress turbulence model that correctly predicts the wall-normal distribution of the two-phase turbulence statistics.

  6. Statistical context shapes stimulus-specific adaptation in human auditory cortex

    PubMed Central

    Henry, Molly J.; Fromboluti, Elisa Kim; McAuley, J. Devin

    2015-01-01

    Stimulus-specific adaptation is the phenomenon whereby neural response magnitude decreases with repeated stimulation. Inconsistencies between recent nonhuman animal recordings and computational modeling suggest dynamic influences on stimulus-specific adaptation. The present human electroencephalography (EEG) study investigates the potential role of statistical context in dynamically modulating stimulus-specific adaptation by examining the auditory cortex-generated N1 and P2 components. As in previous studies of stimulus-specific adaptation, listeners were presented with oddball sequences in which the presentation of a repeated tone was infrequently interrupted by rare spectral changes taking on three different magnitudes. Critically, the statistical context varied with respect to the probability of small versus large spectral changes within oddball sequences (half of the time a small change was most probable; in the other half a large change was most probable). We observed larger N1 and P2 amplitudes (i.e., release from adaptation) for all spectral changes in the small-change compared with the large-change statistical context. The increase in response magnitude also held for responses to tones presented with high probability, indicating that statistical adaptation can overrule stimulus probability per se in its influence on neural responses. Computational modeling showed that the degree of coadaptation in auditory cortex changed depending on the statistical context, which in turn affected stimulus-specific adaptation. Thus the present data demonstrate that stimulus-specific adaptation in human auditory cortex critically depends on statistical context. Finally, the present results challenge the implicit assumption of stationarity of neural response magnitudes that governs the practice of isolating established deviant-detection responses such as the mismatch negativity. PMID:25652920

  7. Statistical context shapes stimulus-specific adaptation in human auditory cortex.

    PubMed

    Herrmann, Björn; Henry, Molly J; Fromboluti, Elisa Kim; McAuley, J Devin; Obleser, Jonas

    2015-04-01

    Stimulus-specific adaptation is the phenomenon whereby neural response magnitude decreases with repeated stimulation. Inconsistencies between recent nonhuman animal recordings and computational modeling suggest dynamic influences on stimulus-specific adaptation. The present human electroencephalography (EEG) study investigates the potential role of statistical context in dynamically modulating stimulus-specific adaptation by examining the auditory cortex-generated N1 and P2 components. As in previous studies of stimulus-specific adaptation, listeners were presented with oddball sequences in which the presentation of a repeated tone was infrequently interrupted by rare spectral changes taking on three different magnitudes. Critically, the statistical context varied with respect to the probability of small versus large spectral changes within oddball sequences (half of the time a small change was most probable; in the other half a large change was most probable). We observed larger N1 and P2 amplitudes (i.e., release from adaptation) for all spectral changes in the small-change compared with the large-change statistical context. The increase in response magnitude also held for responses to tones presented with high probability, indicating that statistical adaptation can overrule stimulus probability per se in its influence on neural responses. Computational modeling showed that the degree of coadaptation in auditory cortex changed depending on the statistical context, which in turn affected stimulus-specific adaptation. Thus the present data demonstrate that stimulus-specific adaptation in human auditory cortex critically depends on statistical context. Finally, the present results challenge the implicit assumption of stationarity of neural response magnitudes that governs the practice of isolating established deviant-detection responses such as the mismatch negativity. Copyright © 2015 the American Physiological Society.

  8. Dexamethasone effects on (125I)albumin distribution in experimental RG-2 gliomas and adjacent brain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakagawa, H.; Groothuis, D.R.; Owens, E.S.

    1987-12-01

    A total of 72 RG-2 transplanted gliomas were studied in 58 rats at three time points (1, 30, 240 min) after intravenous injection of (125I)radioiodinated serum albumin ((125I)RISA). The animals were divided into two groups: a control group that received no treatment and a second group that was treated with five doses of 1.5 mg/kg of dexamethasone over 2.5 days. Local tissue concentrations of (125I)RISA were measured with quantitative autoradiography based on morphological features of the tumors and used to calculate the tissue distribution space. Two models were used to analyze the data. A two-compartment model yielded estimates of local blood-to-tissue influx constants (K1), lower limit extracellular volumes (Ve), and plasma vascular volumes (Vp) in different tumor regions. Treatment with dexamethasone consistently reduced the RISA distribution space in the RG-2 tumors; the reduction in Ve was statistically significant in almost all tumor regions: whole tumor Ve (mean +/- SE) was reduced from 0.14 +/- 0.02 ml g-1 in control animals to 0.08 +/- 0.01 ml g-1 in dexamethasone treated animals. K1 and Vp were also decreased in all tumor regions after treatment with dexamethasone (whole tumor K1 decreased from 2.36 +/- 0.89 to 0.83 +/- 0.29 microliter g-1 min-1 and Vp decreased slightly from 0.016 +/- 0.013 to 0.010 +/- 0.005 ml g-1 after dexamethasone treatment), but these changes were not statistically significant. A comparison of the tumor influx constants in control animals and the aqueous diffusion constants of two different size molecules (RISA and aminoisobutyric acid) suggests that the ''pores'' across RG-2 capillaries are large and may not restrict the free diffusion of RISA (estimated minimum pore diameter greater than 36 nm) and that the total pore area is approximately 6.2 X 10(-5) cm2 g-1 in RG-2 tumor tissue.

  9. Lattice study of light scalar tetraquarks with I=0,2,1/2,3/2: Are σ and κ tetraquarks?

    NASA Astrophysics Data System (ADS)

    Prelovsek, Sasa; Draper, Terrence; Lang, Christian B.; Limmer, Markus; Liu, Keh-Fei; Mathur, Nilmani; Mohler, Daniel

    2010-11-01

    We investigate whether the lightest scalar mesons σ and κ have a large tetraquark component q¯q¯qq, as is strongly supported by many phenomenological studies. A search for possible light tetraquark states with JPC=0++ and I=0,2,1/2,3/2 on the lattice is presented. We perform the two-flavor dynamical simulation with chirally improved quarks and the quenched simulation with overlap quarks, finding qualitative agreement between both results. The spectrum is determined using the generalized eigenvalue method with a number of tetraquark interpolators at the source and the sink, and we omit the disconnected contractions. The time dependence of the eigenvalues at the finite temporal extent of the lattice is explored also analytically. In all the channels, we unavoidably find the lowest scattering states π(k)π(-k) or K(k)π(-k) with back-to-back momentum k=0,2π/L,…. However, we find an additional light state in the I=0 and I=1/2 channels, which may be interpreted as the observed resonances σ and κ with a sizable tetraquark component. In the exotic repulsive channels I=2 and I=3/2, where no resonance is observed, we find no light state in addition to the scattering states.

  10. Mutual interference between statistical summary perception and statistical learning.

    PubMed

    Zhao, Jiaying; Ngo, Nhi; McKendrick, Ryan; Turk-Browne, Nicholas B

    2011-09-01

    The visual system is an efficient statistician, extracting statistical summaries over sets of objects (statistical summary perception) and statistical regularities among individual objects (statistical learning). Although these two kinds of statistical processing have been studied extensively in isolation, their relationship is not yet understood. We first examined how statistical summary perception influences statistical learning by manipulating the task that participants performed over sets of objects containing statistical regularities (Experiment 1). Participants who performed a summary task showed no statistical learning of the regularities, whereas those who performed control tasks showed robust learning. We then examined how statistical learning influences statistical summary perception by manipulating whether the sets being summarized contained regularities (Experiment 2) and whether such regularities had already been learned (Experiment 3). The accuracy of summary judgments improved when regularities were removed and when learning had occurred in advance. In sum, calculating summary statistics impeded statistical learning, and extracting statistical regularities impeded statistical summary perception. This mutual interference suggests that statistical summary perception and statistical learning are fundamentally related.

  11. Introducing 3D Visualization of Statistical Data in Education Using the i-Use Platform: Examples from Greece

    ERIC Educational Resources Information Center

    Rizou, Ourania; Klonari, Aikaterini

    2016-01-01

    In the 21st century, the age of information and technology, there is an increasing importance to statistical literacy for everyday life. In addition, education innovation and globalisation in the past decade in Europe has resulted in a new perceived complexity of reality that affected the curriculum and statistics education, with a shift from…

  12. DISSCO: direct imputation of summary statistics allowing covariates

    PubMed Central

    Xu, Zheng; Duan, Qing; Yan, Song; Chen, Wei; Li, Mingyao; Lange, Ethan; Li, Yun

    2015-01-01

    Background: Imputation of individual level genotypes at untyped markers using an external reference panel of genotyped or sequenced individuals has become standard practice in genetic association studies. Direct imputation of summary statistics can also be valuable, for example in meta-analyses where individual level genotype data are not available. Two methods (DIST and ImpG-Summary/LD), that assume a multivariate Gaussian distribution for the association summary statistics, have been proposed for imputing association summary statistics. However, both methods assume that the correlations between association summary statistics are the same as the correlations between the corresponding genotypes. This assumption can be violated in the presence of confounding covariates. Methods: We analytically show that in the absence of covariates, correlation among association summary statistics is indeed the same as that among the corresponding genotypes, thus serving as a theoretical justification for the recently proposed methods. We continue to prove that in the presence of covariates, correlation among association summary statistics becomes the partial correlation of the corresponding genotypes controlling for covariates. We therefore develop direct imputation of summary statistics allowing covariates (DISSCO). Results: We consider two real-life scenarios where the correlation and partial correlation likely make practical difference: (i) association studies in admixed populations; (ii) association studies in presence of other confounding covariate(s). Application of DISSCO to real datasets under both scenarios shows at least comparable, if not better, performance compared with existing correlation-based methods, particularly for lower frequency variants. For example, DISSCO can reduce the absolute deviation from the truth by 3.9–15.2% for variants with minor allele frequency <5%. Availability and implementation: http://www.unc.edu/∼yunmli/DISSCO. Contact: yunli
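
    The abstract's key analytic point — that with a confounding covariate the relevant quantity becomes a partial correlation of the genotypes controlling for that covariate — can be illustrated directly. A minimal sketch (plain Python; the toy genotype and admixture vectors are hypothetical, and this is the textbook first-order partial correlation formula, not DISSCO's implementation):

```python
def corr(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def partial_corr(a, b, c):
    """Correlation of a and b controlling for covariate c."""
    rab, rac, rbc = corr(a, b), corr(a, c), corr(b, c)
    return (rab - rac * rbc) / ((1 - rac**2) * (1 - rbc**2)) ** 0.5

g1 = [0, 1, 2, 1, 0, 2, 1, 0]                   # genotype at SNP 1
g2 = [0, 1, 2, 2, 0, 1, 1, 1]                   # genotype at SNP 2
pc = [0.1, 0.9, 0.8, 0.7, 0.2, 0.6, 0.5, 0.3]   # admixture covariate
print(corr(g1, g2), partial_corr(g1, g2, pc))
```

When both genotypes track the covariate (as in admixed samples), the raw correlation overstates the dependence that matters for summary-statistic imputation; the partial correlation removes that shared component.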

  13. Sampling Errors in Monthly Rainfall Totals for TRMM and SSM/I, Based on Statistics of Retrieved Rain Rates and Simple Models

    NASA Technical Reports Server (NTRS)

    Bell, Thomas L.; Kundu, Prasun K.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    Estimates from TRMM satellite data of monthly total rainfall over an area are subject to substantial sampling errors due to the limited number of visits to the area by the satellite during the month. Quantitative comparisons of TRMM averages with data collected by other satellites and by ground-based systems require some estimate of the size of this sampling error. A method of estimating this sampling error based on the actual statistics of the TRMM observations and on some modeling work has been developed. "Sampling error" in TRMM monthly averages is defined here relative to the monthly total a hypothetical satellite permanently stationed above the area would have reported. "Sampling error" therefore includes contributions from the random and systematic errors introduced by the satellite remote sensing system. As part of our long-term goal of providing error estimates for each grid point accessible to the TRMM instruments, sampling error estimates for TRMM based on rain retrievals from TRMM microwave (TMI) data are compared for different times of the year and different oceanic areas (to minimize changes in the statistics due to algorithmic differences over land and ocean). Changes in sampling error estimates caused by changes in rain statistics arising 1) from evolution of the official algorithms used to process the data and 2) from differences relative to other remote sensing systems, such as the Defense Meteorological Satellite Program (DMSP) Special Sensor Microwave/Imager (SSM/I), are analyzed.

  14. Statistical properties of solar granulation derived from the SOUP instrument on Spacelab 2

    NASA Technical Reports Server (NTRS)

    Title, A. M.; Tarbell, T. D.; Topka, K. P.; Ferguson, S. H.; Shine, R. A.

    1989-01-01

    Computer algorithms and statistical techniques were used to identify, measure, and quantify the properties of solar granulation derived from movies collected by the Solar Optical Universal Polarimeter on Spacelab 2. The results show that there is neither a typical solar granule nor a typical granule evolution. A granule's evolution is dependent on local magnetic flux density, its position with respect to the active region plage, its position in the mesogranulation pattern, and the evolution of granules in its immediate neighborhood.

  15. Ares I Scale Model Acoustic Test Liftoff Acoustic Results and Comparisons

    NASA Technical Reports Server (NTRS)

    Counter, Doug; Houston, Janice

    2011-01-01

    Conclusions: Ares I-X flight data validated the ASMAT LOA results. Ares I liftoff acoustic environments were verified with scale model test results. Results showed that data book environments were under-conservative for the frustum (Zone 5). Recommendations: Data book environments can be updated with scale model test and flight data. Subscale acoustic model testing is useful for future vehicle environment assessments.

  16. Photodissociation of CF2ICF2I in solid para-hydrogen: infrared spectra of anti- and gauche-˙C2F4I radicals.

    PubMed

    Haupa, Karolina Anna; Lim, Manho; Lee, Yuan-Pern

    2018-05-09

    The photolysis of 1,2-diiodotetrafluoroethane (CF2ICF2I) has served as a prototypical system in ultrafast reaction dynamics. Even though the intermediates, anti- and gauche-iodotetrafluoroethyl (˙C2F4I) radicals, have been characterized with electron diffraction and X-ray diffraction, their infrared spectra are unreported. We report the formation and infrared identification of these radical intermediates upon ultraviolet photodissociation of CF2ICF2I in solid para-hydrogen (p-H2) at 3.3 K. Lines at 1364.9/1358.5, 1283.2, 1177.1, 1162.2, 1126.8, 837.3, 658.0, 574.2, and 555.2 cm-1 are assigned to anti-˙C2F4I, and lines at 1325.9, 1259.7, 1143.4, 1063.4, 921.0, and 765.3 cm-1 to gauche-˙C2F4I. A secondary photodissociation leading to C2F4 was also observed. The assignments were derived according to behavior on secondary photolysis, comparison of the vibrational wavenumbers and the IR intensities of the observed lines with values predicted with the B3PW91/aug-cc-pVTZ-pp method. This spectral identification provides valuable information for future direct spectral probes of these important intermediates.

  17. An analysis of I/O efficient order-statistic-based techniques for noise power estimation in the HRMS sky survey's operational system

    NASA Technical Reports Server (NTRS)

    Zimmerman, G. A.; Olsen, E. T.

    1992-01-01

    Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
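
    The contrast drawn above — a full order-statistic estimator versus a simpler single-pass threshold-and-count device — can be sketched in a few lines (plain Python; the data and thresholds are illustrative, not HRMS values):

```python
def order_statistic_estimate(samples, frac=0.5):
    """Noise power from an order statistic (the median when frac = 0.5);
    requires sorting, i.e., full storage of the data."""
    s = sorted(samples)
    return s[int(frac * len(s))]

def threshold_and_count(samples, thresholds):
    """Single-pass approximation: count exceedances of fixed thresholds and
    return the highest threshold exceeded by at least half the data."""
    counts = [0] * len(thresholds)
    for x in samples:                 # one pass over the data
        for i, t in enumerate(thresholds):
            if x > t:
                counts[i] += 1
    half = len(samples) / 2
    best = 0.0
    for t, c in zip(thresholds, counts):
        if c >= half:
            best = max(best, t)
    return best

data = [0.8, 1.1, 0.9, 1.3, 1.0, 5.0, 1.2, 0.7]    # one strong 'signal' outlier
print(order_statistic_estimate(data))               # robust to the outlier
print(threshold_and_count(data, [0.5, 1.0, 2.0]))   # coarse single-pass version
```

As in the abstract, the single-pass version trades precision for I/O efficiency: its resolution is limited by the threshold grid, and running several threshold-and-count devices in parallel widens the usable dynamic range.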

  18. Synthesis and reaction of [[HC(CMeNAr)2]Mn]2 (Ar = 2,6-iPr2C6H3): the complex containing three-coordinate manganese(I) with a Mn-Mn bond exhibiting unusual magnetic properties and electronic structure.

    PubMed

    Chai, Jianfang; Zhu, Hongping; Stückl, A Claudia; Roesky, Herbert W; Magull, Jörg; Bencini, Alessandro; Caneschi, Andrea; Gatteschi, Dante

    2005-06-29

    This paper reports on the synthesis, X-ray structure, magnetic properties, and DFT calculations of [[HC(CMeNAr)2]Mn]2 (Ar = 2,6-iPr2C6H3) (2), the first complex with three-coordinate manganese(I). Reduction of the iodide [[HC(CMeNAr)2]Mn(mu-I)]2 (1) with Na/K in toluene afforded 2 as dark-red crystals. The molecule of 2 contains a Mn2(2+) core with a Mn-Mn bond. The magnetic investigations show a rare example of a high-spin manganese(I) complex with an antiferromagnetic interaction between the two Mn(I) centers. The DFT calculations indicate a strong s-s interaction of the two Mn(I) ions with the open shell configuration (3d54s1). This suggests that the magnetic behavior of 2 could be correctly described as the coupling between two S1 = S2 = 5/2 spin centers. The Mn-Mn bond energy is estimated at 44 kcal mol(-1) by first principle calculations with the B3LYP functional. The further oxidative reaction of 2 with KMnO4 or O2 resulted in the formation of manganese(III) oxide [[HC(CMeNAr)2]Mn(mu-O)]2 (3). Compound 3 shows an antiferromagnetic coupling between the two oxo-bridged manganese(III) centers by magnetic measurements.

  19. Statistical analogues of thermodynamic extremum principles

    NASA Astrophysics Data System (ADS)

    Ramshaw, John D.

    2018-05-01

    As shown by Jaynes, the canonical and grand canonical probability distributions of equilibrium statistical mechanics can be simply derived from the principle of maximum entropy, in which the statistical entropy S = −k_B ∑_i p_i log p_i is maximised subject to constraints on the mean values of the energy E and/or number of particles N in a system of fixed volume V. The Lagrange multipliers associated with those constraints are then found to be simply related to the temperature T and chemical potential μ. Here we show that the constrained maximisation of S is equivalent to, and can therefore be replaced by, the essentially unconstrained minimisation of the obvious statistical analogues of the Helmholtz free energy F = E − TS and the grand potential J = F − μN. Those minimisations are more easily performed than the maximisation of S because they formally eliminate the constraints on the mean values of E and N and their associated Lagrange multipliers. This procedure significantly simplifies the derivation of the canonical and grand canonical probability distributions, and shows that the well known extremum principles for the various thermodynamic potentials possess natural statistical analogues which are equivalent to the constrained maximisation of S.
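
    The extremum property described above can be checked numerically for a small system: among probability distributions over the energy levels, the canonical (Boltzmann) distribution gives the lowest value of F = E − TS, and that minimum equals −k_B T log Z. A sketch for a toy three-level system (plain Python; random trial distributions stand in for "all" distributions, and the level energies are arbitrary):

```python
import math, random

kB = 1.0
levels = [0.0, 1.0, 2.5]   # energy levels of a toy system
T = 1.3

def free_energy(p):
    E = sum(pi * e for pi, e in zip(p, levels))
    S = -kB * sum(pi * math.log(pi) for pi in p if pi > 0)
    return E - T * S

# Canonical distribution p_i proportional to exp(-e_i / (kB*T))
w = [math.exp(-e / (kB * T)) for e in levels]
Z = sum(w)
p_canon = [wi / Z for wi in w]
F_canon = free_energy(p_canon)

# No random trial distribution attains a lower F
random.seed(0)
for _ in range(1000):
    r = [random.random() for _ in levels]
    p = [ri / sum(r) for ri in r]
    assert free_energy(p) >= F_canon - 1e-12

print(F_canon, -kB * T * math.log(Z))   # the two agree
```

This is exactly the unconstrained minimisation the abstract describes: no Lagrange multiplier for the mean energy is needed, because T enters F directly.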

  20. Photodissociation of PbI2 in the ultraviolet: analysis of the A → X band of PbI

    NASA Astrophysics Data System (ADS)

    Rodriguez, G.; Herring, C. M.; Fraser, R. D.; Eden, J. G.

    1996-07-01

    Emission and absorption studies of lead monoiodide (PbI) have been carried out by photodissociation of PbI2 vapor at one of several wavelengths (193, 248, 266, 308, and 351 nm) in the ultraviolet. Strong emission on the A → X²Π₁/₂ band (14400-22800 cm⁻¹; 440 ≲ λ ≲ 695 nm) occurs when PbI2 is photodissociated at 248 or 266 nm. Also, absorption bands attributed to the B ← X²Π₁/₂,₃/₂, D ← X²Π₁/₂,₃/₂, and E ← X²Π₁/₂,₃/₂ transitions of the diatomic molecule have been observed at 290 and 380, 225 and 265, and 203 and 240 nm, respectively, as have emission bands peaking at 397.6, 531.6, 582.0, 595.1, 639.7, 685.9, and 707.9 nm that appear to arise from PbI2 itself. Analysis and computer simulations of the A → X²Π₁/₂ emission spectra have resulted in identifications for virtually all (>120) of the observed vibrational bandheads. Several spectroscopic constants for the A and the X²Π₁/₂ states of PbI have been determined: Te(A) = 20659 ± 130 cm⁻¹, ωe′ = 132.2 ± 1.0 cm⁻¹, ωe′xe′ = 1.91 ± 0.06 cm⁻¹, ωe″ = 160.3 ± 0.6 cm⁻¹, and ωe″xe″ = 0.24 ± 0.03 cm⁻¹. Also, the difference between the equilibrium internuclear separations for the X²Π₁/₂ and the A states has been determined to be ΔRe = 0.45 ± 0.05. The spontaneous-emission lifetime for the PbI(A) state and the rate constant for quenching of this state by PbI2 (in two-body collisions) have been measured to be 94.3 ± 8.8 ns and (4.3 ± 0.4) × 10⁻¹⁰ cm³ s⁻¹, respectively.

  1. Comparison of the Mahalanobis Distance and Pearson's χ2 Statistic as Measures of Similarity of Isotope Patterns

    NASA Astrophysics Data System (ADS)

    Zamanzad Ghavidel, Fatemeh; Claesen, Jürgen; Burzykowski, Tomasz; Valkenborg, Dirk

    2014-02-01

    To extract a genuine peptide signal from a mass spectrum, an observed series of peaks at a particular mass can be compared with the isotope distribution expected for a peptide of that mass. To decide whether the observed series of peaks is similar to the isotope distribution, a similarity measure is needed. In this short communication, we investigate whether the Mahalanobis distance could be an alternative measure for the commonly employed Pearson's χ2 statistic. We evaluate the performance of the two measures by using a controlled MALDI-TOF experiment. The results indicate that Pearson's χ2 statistic has better discriminatory performance than the Mahalanobis distance and is a more robust measure.
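    The two similarity measures compared above can be sketched as follows; `obs`, `theo`, and the covariance estimate are illustrative placeholders, not the paper's data:

```python
import numpy as np

def pearson_chi2(obs, theo):
    """Pearson's chi-squared distance between an observed peak series
    and the theoretical isotope distribution (both normalized to sum to 1)."""
    obs, theo = np.asarray(obs, float), np.asarray(theo, float)
    return float(np.sum((obs - theo) ** 2 / theo))

def mahalanobis(obs, theo, cov):
    """Mahalanobis distance of the observed pattern from the theoretical
    one under covariance matrix `cov` (e.g. estimated from replicate spectra)."""
    d = np.asarray(obs, float) - np.asarray(theo, float)
    return float(np.sqrt(d @ np.linalg.solve(cov, d)))

# Illustrative values (not from the paper):
theo = np.array([0.55, 0.30, 0.11, 0.04])   # expected isotope pattern
obs  = np.array([0.53, 0.31, 0.12, 0.04])   # observed, normalized peak intensities
cov  = 1e-4 * np.eye(4)                     # toy diagonal covariance estimate

chi2 = pearson_chi2(obs, theo)
dm   = mahalanobis(obs, theo, cov)
```

    With a diagonal covariance, the Mahalanobis distance reduces to a variance-weighted Euclidean distance; the paper's comparison concerns how the two measures behave when the covariance must be estimated from data.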

  2. Multinary I-III-VI2 and I2-II-IV-VI4 Semiconductor Nanostructures for Photocatalytic Applications.

    PubMed

    Regulacio, Michelle D; Han, Ming-Yong

    2016-03-15

    Semiconductor nanostructures that can effectively serve as light-responsive photocatalysts have been of considerable interest over the past decade. This is because their use in light-induced photocatalysis can potentially address some of the most serious environmental and energy-related concerns facing the world today. One important application is photocatalytic hydrogen production from water under solar radiation. It is regarded as a clean and sustainable approach to hydrogen fuel generation because it makes use of renewable resources (i.e., sunlight and water), does not involve fossil fuel consumption, and does not result in environmental pollution or greenhouse gas emission. Another notable application is the photocatalytic degradation of nonbiodegradable dyes, which offers an effective way of ridding industrial wastewater of toxic organic pollutants prior to its release into the environment. Metal oxide semiconductors (e.g., TiO2) are the most widely studied class of semiconductor photocatalysts. Their nanostructured forms have been reported to efficiently generate hydrogen from water and effectively degrade organic dyes under ultraviolet-light irradiation. However, the wide band gap characteristic of most metal oxides precludes absorption of light in the visible region, which makes up a considerable portion of the solar radiation spectrum. Meanwhile, nanostructures of cadmium chalcogenide semiconductors (e.g., CdS), with their relatively narrow band gap that can be easily adjusted through size control and alloying, have displayed immense potential as visible-light-responsive photocatalysts, but the intrinsic toxicity of cadmium poses potential risks to human health and the environment. In developing new nanostructured semiconductors for light-driven photocatalysis, it is important to choose a semiconducting material that has a high absorption coefficient over a wide spectral range and is safe for use in real-world settings. 
Among the most promising candidates...

  3. Machine learning Z2 quantum spin liquids with quasiparticle statistics

    NASA Astrophysics Data System (ADS)

    Zhang, Yi; Melko, Roger G.; Kim, Eun-Ah

    2017-12-01

    After decades of progress and effort, obtaining a phase diagram for a strongly correlated topological system still remains a challenge. Although in principle one could turn to Wilson loops and long-range entanglement, evaluating these nonlocal observables at many points in phase space can be prohibitively costly. With growing excitement over topological quantum computation comes the need for an efficient approach for obtaining topological phase diagrams. Here we turn to machine learning using quantum loop topography (QLT), a notion we have recently introduced. Specifically, we propose a construction of QLT that is sensitive to quasiparticle statistics. We then use mutual statistics between the spinons and visons to detect a Z2 quantum spin liquid in a multiparameter phase space. We successfully obtain the quantum phase boundary between the topological and trivial phases using a simple feed-forward neural network. Furthermore, we demonstrate advantages of our approach for the evaluation of phase diagrams relating to speed and storage. Such statistics-based machine learning of topological phases opens new efficient routes to studying topological phase diagrams in strongly correlated systems.
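    The final classification step described above can be sketched with a minimal feed-forward network on synthetic two-dimensional inputs; this is a toy stand-in for the QLT vectors, not the authors' actual data or architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for QLT feature vectors: two labeled Gaussian blobs (phase 0 / phase 1)
X0 = rng.normal(-1.0, 0.5, size=(200, 2))
X1 = rng.normal(+1.0, 0.5, size=(200, 2))
X = np.vstack([X0, X1])
y = np.concatenate([np.zeros(200), np.ones(200)])

# One-hidden-layer feed-forward network with sigmoid activations
W1 = rng.normal(0, 0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, size=(8, 1)); b2 = np.zeros(1)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(2000):                        # plain batch gradient descent
    h = sig(X @ W1 + b1)                     # hidden layer
    p = sig(h @ W2 + b2).ravel()             # predicted probability of phase 1
    g = (p - y) / len(y)                     # cross-entropy gradient w.r.t. output logits
    W2 -= lr * (h.T @ g[:, None]); b2 -= lr * g.sum()
    gh = g[:, None] * W2.T * h * (1 - h)     # backprop through the hidden layer
    W1 -= lr * (X.T @ gh);         b1 -= lr * gh.sum(axis=0)

p = sig(sig(X @ W1 + b1) @ W2 + b2).ravel()
acc = np.mean((p > 0.5) == (y == 1))         # training accuracy on the toy data
```

    In the paper the inputs are QLT vectors built from quasiparticle mutual statistics rather than raw configurations, which is what makes the simple network sufficient.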

  4. Excitation of O2(a¹Δg, b¹Σg⁺) and I(²P1/2) by energy transfer from I2(A, A′ ³Π1,2u) in solid rare gases

    NASA Astrophysics Data System (ADS)

    Böhling, R.; Becker, A. C.; Minaev, B. F.; Seranski, K.; Schurath, U.

    1990-04-01

    O2 a¹Δg, b¹Σg⁺ → X³Σg⁻ and I ²P1/2 → ²P3/2 fluorescence occurs in I2/O2-doped rare gas matrices when I2 is excited with visible laser light. O2(a¹Δg) and I(²P1/2) are populated independently by near-resonant energy transfer from the metastable triplet states of I2. The doublet splitting of the O2 a → X band, which peaks at 7879 and 7863 cm⁻¹ in argon, is interpreted as sensitized emission from O2 trapped in distinct nearest-neighbour positions of the donor ³I2. Annealing reverses the intensity of the doublet, showing that the sites can be interconverted. It is suggested that the a → X emission rate is enhanced by the sensitizer, causing a lifetime reduction of the a¹Δg state from 79 s in pure argon to 21 and 3 ± 1 s next to I2. The long-lived O2(a¹Δg) state is the precursor of I2-sensitized emission from O2(b¹Σg⁺). The lifetime of O2(b¹Σg⁺) is reduced from 24.5 ms in pure argon to 17 ± 1 ms in the presence of I2.

  5. Intrinsic radioactivity of KSr2I5:Eu2+

    NASA Astrophysics Data System (ADS)

    Rust, M.; Melcher, C.; Lukosi, E.

    2016-10-01

    A current need in nuclear security is an economical, yet high energy resolution (near 2%), scintillation detector suitable for gamma-ray spectroscopy. For current scintillators on the market, there is an inverse relationship between scintillator energy resolution and cost of production. A promising new scintillator, KSr2I5:Eu2+, under development at the University of Tennessee, has achieved an energy resolution of 2.4% at 662 keV at room temperature, with potential growth rates exceeding several millimeters per hour. However, the internal background due to its 40K content could present a hurdle for effective source detection/identification in nuclear security applications. As a first step in addressing this question, this paper reports on a computational investigation of the intrinsic differential pulse height spectrum (DPHS) generated by 40K within the KSr2I5:Eu2+ scintillator as a function of crystal geometry. It was found that the DPHS remains approximately a constant multiple of the negatron emission spectrum, with the height of the 1.46 MeV photopeak relative to the negatron spectrum increasing directly with volume. Further, peak pileup does not readily manifest itself for practical KSr2I5:Eu2+ volumes.

  6. Statistics in the pharmacy literature.

    PubMed

    Lee, Charlene M; Soin, Herpreet K; Einarson, Thomas R

    2004-09-01

    Research in statistical methods is essential for maintenance of high quality of the published literature. To update previous reports of the types and frequencies of statistical terms and procedures in research studies of selected professional pharmacy journals. We obtained all research articles published in 2001 in 6 journals: American Journal of Health-System Pharmacy, The Annals of Pharmacotherapy, Canadian Journal of Hospital Pharmacy, Formulary, Hospital Pharmacy, and Journal of the American Pharmaceutical Association. Two independent reviewers identified and recorded descriptive and inferential statistical terms/procedures found in the methods, results, and discussion sections of each article. Results were determined by tallying the total number of times, as well as the percentage, that each statistical term or procedure appeared in the articles. One hundred forty-four articles were included. Ninety-eight percent employed descriptive statistics; of these, 28% used only descriptive statistics. The most common descriptive statistical terms were percentage (90%), mean (74%), standard deviation (58%), and range (46%). Sixty-nine percent of the articles used inferential statistics, the most frequent being chi(2) (33%), Student's t-test (26%), Pearson's correlation coefficient r (18%), ANOVA (14%), and logistic regression (11%). Statistical terms and procedures were found in nearly all of the research articles published in pharmacy journals. Thus, pharmacy education should aim to provide current and future pharmacists with an understanding of the common statistical terms and procedures identified to facilitate the appropriate appraisal and consequential utilization of the information available in research articles.

  7. Vehicle-to-infrastructure (V2I) : message lexicon.

    DOT National Transportation Integrated Search

    2016-12-01

    To help with Vehicle-to-Infrastructure (V2I) deployments, a V2I Message Lexicon was developed that explains the relationships and concepts for V2I messages and identifies the ITS standards where they may be found. This lexicon document provides a bri...

  8. Results from phase I of the GERDA experiment

    NASA Astrophysics Data System (ADS)

    Wester, Thomas

    2015-10-01

    The GERmanium Detector Array Gerda at the Laboratori Nazionali del Gran Sasso of the INFN in Italy is an experiment dedicated to the search for the neutrinoless double beta (0νββ) decay in 76Ge. The experiment employs high purity germanium detectors enriched in 76Ge inside a 64 m³ cryostat filled with liquid argon. Gerda was planned in two phases of data taking with the goal of reaching a half-life sensitivity on the order of 10^26 yr. Phase I of Gerda ran from November 2011 until May 2013. With about 18 kg total detector mass, data with an exposure of 21.6 kg·yr were collected and a background index of 0.01 cts/(keV·kg·yr) was achieved in the region of interest. No signal was found for the 0νββ decay and a new limit of T_1/2 > 2.1 × 10^25 yr (90% C.L.) was obtained, strongly disfavoring the previous claim of observation. Furthermore, the 2νββ decay half-life of 76Ge was measured with unprecedented precision. Other results include new half-life limits on the order of 10^23 yr for Majoron-emitting double beta decay modes with spectral indices n = 1, 2, 3, 7 and new limits on the order of 10^23 yr for 2νββ decays to the first 3 excited states of 76Se. In Phase II, currently in preparation, the detector mass will be doubled while reducing the background index by a factor of 10.

  9. "Of Course I'm Communicating; I Lecture Every Day": Enhancing Teaching and Learning in Introductory Statistics. Scholarship of Teaching and Learning

    ERIC Educational Resources Information Center

    Wulff, Shaun S.; Wulff, Donald H.

    2004-01-01

    This article focuses on one instructor's evolution from formal lecturing to interactive teaching and learning in a statistics course. Student perception data are used to demonstrate the instructor's use of communication to align the content, students, and instructor throughout the course. Results indicate that the students learned, that…

  10. Localized infusion of IGF-I results in skeletal muscle hypertrophy in rats

    NASA Technical Reports Server (NTRS)

    Adams, G. R.; McCue, S. A.

    1998-01-01

    Insulin-like growth factor I (IGF-I) peptide levels have been shown to increase in overloaded skeletal muscles (G. R. Adams and F. Haddad. J. Appl. Physiol. 81: 2509-2516, 1996). In that study, the increase in IGF-I was found to precede measurable increases in muscle protein and was correlated with an increase in muscle DNA content. The present study was undertaken to test the hypothesis that direct IGF-I infusion would result in an increase in muscle DNA as well as in various measurements of muscle size. Either 0.9% saline or nonsystemic doses of IGF-I were infused directly into a non-weight-bearing muscle of rats, the tibialis anterior (TA), via a fenestrated catheter attached to a subcutaneous miniosmotic pump. Saline infusion had no effect on the mass, protein content, or DNA content of TA muscles. Local IGF-I infusion had no effect on body or heart weight. The absolute weight of the infused TA muscles was approximately 9% greater (P < 0.05) than that of the contralateral TA muscles. IGF-I infusion resulted in significant increases in the total protein and DNA content of TA muscles (P < 0.05). As a result of these coordinated changes, the DNA-to-protein ratio of the hypertrophied TA was similar to that of the contralateral muscles. These results suggest that IGF-I may be acting to directly stimulate processes such as protein synthesis and satellite cell proliferation, which result in skeletal muscle hypertrophy.

  11. New heterogeneous test statistics for the unbalanced fixed-effect nested design.

    PubMed

    Guo, Jiin-Huarng; Billard, L; Luh, Wei-Ming

    2011-05-01

    When the underlying variances are unknown or/and unequal, using the conventional F test is problematic in the two-factor hierarchical data structure. Prompted by the approximate test statistics (Welch and Alexander-Govern methods), the authors develop four new heterogeneous test statistics to test factor A and factor B nested within A for the unbalanced fixed-effect two-stage nested design under variance heterogeneity. The actual significance levels and statistical power of the test statistics were compared in a simulation study. The results show that the proposed procedures maintain better Type I error rate control and have greater statistical power than those obtained by the conventional F test in various conditions. Therefore, the proposed test statistics are recommended in terms of robustness and easy implementation. ©2010 The British Psychological Society.
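    The Welch-type approach the authors build on can be illustrated for the simpler one-way layout; the following is a sketch of Welch's heterogeneity-robust ANOVA, not the authors' nested-design statistics:

```python
import numpy as np
from scipy import stats

def welch_anova(groups):
    """Welch's one-way ANOVA, robust to unequal variances.
    Returns (F, df1, df2, p). `groups` is a list of samples."""
    k = len(groups)
    n = np.array([len(g) for g in groups], float)
    m = np.array([np.mean(g) for g in groups])
    v = np.array([np.var(g, ddof=1) for g in groups])
    w = n / v                                # precision weights
    mw = np.sum(w * m) / np.sum(w)           # variance-weighted grand mean
    A = np.sum(w * (m - mw) ** 2) / (k - 1)
    h = np.sum((1 - w / np.sum(w)) ** 2 / (n - 1))
    B = 1 + 2 * (k - 2) / (k ** 2 - 1) * h
    F = A / B
    df1, df2 = k - 1, (k ** 2 - 1) / (3 * h)
    return F, df1, df2, stats.f.sf(F, df1, df2)

# Toy groups with nearly equal means (illustrative only):
F, df1, df2, p = welch_anova([[1, 2, 3, 4, 5],
                              [1.1, 2.1, 3.1, 4.1, 5.1],
                              [0.9, 1.9, 2.9, 3.9, 4.9]])
```

    The nested-design statistics in the paper apply this same precision-weighting idea separately to factor A and to factor B nested within A.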

  12. Evaluation of Association of ADRA2A rs553668 and ACE I/D Gene Polymorphisms with Obesity Traits in the Setapak Population, Malaysia.

    PubMed

    Shunmugam, Vicneswari; Say, Yee-How

    2016-02-01

    α-adrenergic receptor 2A (ADRA2A) and angiotensin-converting enzyme (ACE) genes have been variably associated with obesity and its related phenotypes in different populations worldwide. This cross-sectional study aims to investigate the association of adrenergic receptor α2A (ADRA2A) rs553668 and angiotensin-converting enzyme (ACE) I/D single nucleotide polymorphisms (SNPs) with obesity traits (body mass index, BMI; waist-hip ratio, WHR; total body fat percentage, TBF) in a Malaysian population. Demographic and clinical variables were initially collected from 230 subjects via convenience sampling among residents and workers in Setapak, Malaysia, but in the end only 214 multi-ethnic Malaysians (99 males; 45 Malays, 116 ethnic Chinese, and 53 ethnic Indians) were available for statistical analysis. Genotyping was performed by polymerase chain reaction using DNA extracted from mouthwash samples. The overall minor allele frequencies (MAFs) for ADRA2A rs553668 and ACE I/D were 0.55 and 0.56, respectively. Allele distribution of ACE I/D was significantly associated with ethnicity and WHR class. Logistic regression analysis showed that subjects with the ACE II genotype and I allele were, respectively, 2.15 and 1.55 times more likely to be centrally obese, but when adjusted for age and ethnicity, this association was abolished. Covariate analysis controlling for age, gender, and ethnicity also showed similar results: subjects carrying the II genotype or I allele did not have significantly higher WHR. Combined genotype and allele analysis for ADRA2A rs553668 and ACE I/D showed that subjects with both the ADRA2A rs553668 GG and ACE I/D II genotypes had significantly lower WHR than other genotype combinations. The ACE II genotype might be a protective factor against central adiposity risk in the Malaysian population when in combination with the ADRA2A rs553668 GG genotype.

  13. Intermittency Statistics in the Expanding Solar Wind

    NASA Astrophysics Data System (ADS)

    Cuesta, M. E.; Parashar, T. N.; Matthaeus, W. H.

    2017-12-01

    The solar wind is observed to be turbulent. One of the open questions in solar wind research is how the turbulence evolves as the solar wind expands to great distances. Some studies have focused on evolution of the outer scale but not much has been done to understand how intermittency evolves in the expanding wind beyond 1 AU (see [1,2]). We use magnetic field data from Voyager I spacecraft from 1 to 10AU to study the evolution of statistics of magnetic discontinuities. We perform various statistical tests on these discontinuities and make connections to the physical processes occurring in the expanding wind.[1] Tsurutani, Bruce T., and Edward J. Smith. "Interplanetary discontinuities: Temporal variations and the radial gradient from 1 to 8.5 AU." Journal of Geophysical Research: Space Physics 84.A6 (1979): 2773-2787.[2] Greco, A., et al. "Evidence for nonlinear development of magnetohydrodynamic scale intermittency in the inner heliosphere." The Astrophysical Journal 749.2 (2012): 105.

  14. Chain Reaction Mechanism for I2 Dissociation in the O2(¹Δ)-I Atom Laser.

    DTIC Science & Technology

    1983-09-20

    The principal injected gases in this study were I2 (+Ar) and H2O (+Ar). We continue to use the method of flow replacement, whereby a pure Ar stream... conditions exist for small H2O densities. The identification of intermediate states in a kinetic mechanism by indirect methods is always unsatisfactory...

  15. I(2)(PP2A) regulates p53 and Akt correlatively and leads the neurons to abort apoptosis.

    PubMed

    Liu, Gong-Ping; Wei, Wei; Zhou, Xin; Zhang, Yao; Shi, Hai-Hong; Yin, Jun; Yao, Xiu-Qing; Peng, Cai-Xia; Hu, Juan; Wang, Qun; Li, Hong-Lian; Wang, Jian-Zhi

    2012-02-01

    A chronic neuron loss is the cardinal pathology in Alzheimer disease (AD), but it is still not understood why most neurons in AD brain do not accomplish apoptosis even though they are actually exposed to an environment with enriched proapoptotic factors. Protein phosphatase-2A inhibitor-2 (I(2)(PP2A)), an endogenous PP2A inhibitor, is significantly increased in AD brain, but the role of I(2)(PP2A) in AD-like neuron loss is elusive. Here, we show that I(2)(PP2A) regulates p53 and Akt correlatively. The mechanisms involve activated transcription and p38 MAPK activities. More importantly, we demonstrate that the simultaneous activation of Akt induced by I(2)(PP2A) counteracts the hyperactivated p53-induced cell apoptosis. Furthermore, I(2)(PP2A), p53 and Akt are all elevated in the brain of mouse model and AD patients. Our results suggest that the increased I(2)(PP2A) may trigger apoptosis by p53 upregulation, but due to simultaneous activation of Akt, the neurons are aborted from the apoptotic pathway. This finding contributes to the understanding of why most neurons in AD brain do not undergo apoptosis. Copyright © 2010. Published by Elsevier Inc.

  16. Reflection statistics of weakly disordered optical medium when its mean refractive index is different from an outside medium

    NASA Astrophysics Data System (ADS)

    Pradhan, Prabhakar; John Park, Daniel; Capoglu, Ilker; Subramanian, Hariharan; Damania, Dhwanil; Cherkezyan, Lusik; Taflove, Allen; Backman, Vadim

    2017-06-01

    Statistical properties of light waves reflected from a one-dimensional (1D) disordered optical medium [n(x) = n0 + dn(x), ⟨dn(x)⟩ = 0] have been well studied; however, most studies have focused on the situation where the mean refractive index of the optical medium matches the outside medium, i.e., n0 = nout = 1. Further, considering dn(x) as a Gaussian color-noise refractive-index medium with exponential spatial correlation decay length lc, and k as the incident wave vector, it has been shown that in the small correlation length limit, i.e., klc << 1, the mean reflection coefficient ⟨r⟩ and the standard deviation of r, σ(r), have the same value and follow the relation ⟨r⟩ = σ(r) ∝ ⟨dn²⟩lc. However, when the refractive index of the sample medium is different from the outside medium, the reflection statistics may have interesting features, which have not been well studied or understood. We studied the reflection statistics of a 1D weakly disordered optical medium with the mean background refractive index n0 different from the outside medium nout (≠ n0), to see the effect of the mismatch (i.e., the value of n0 - nout) on the reflection statistics. In the mismatched case, the results show that the mean reflection coefficient follows a form similar to that of the matched refractive-index case, i.e., ⟨r⟩ ∝ ⟨dn²⟩lc, with a linear increased shift due to the 1D uniform background reflection from a slab. However, σ(r) is shown to follow σ(r) ∝ (⟨dn²⟩lc)^(1/2), which differs from the matched case. This change in the std of r is attributed to interference between the mismatch-created, edge-mediated multiple scattering and the random scattering. Applications to light scattering from random layered media and biological cells are discussed.

  17. Statistics for Geography Teachers: Topics in Geography, Number 2.

    ERIC Educational Resources Information Center

    National Council for Geographic Education.

    This publication is designed to provide geography teachers with useful statistical information. It presents tables, maps, graphs, diagrams, and explanations of statistical data in 24 areas. The areas in which statistics are given are conversions, measurement, astronomy, time, daylight, twilight, latitude and longitude as distance, the relationship…

  18. Statistics of Smoothed Cosmic Fields in Perturbation Theory. I. Formulation and Useful Formulae in Second-Order Perturbation Theory

    NASA Astrophysics Data System (ADS)

    Matsubara, Takahiko

    2003-02-01

    We formulate a general method for perturbative evaluations of statistics of smoothed cosmic fields and provide useful formulae for application of the perturbation theory to various statistics. This formalism is an extensive generalization of the method used by Matsubara, who derived a weakly nonlinear formula of the genus statistic in a three-dimensional density field. After describing the general method, we apply the formalism to a series of statistics, including genus statistics, level-crossing statistics, Minkowski functionals, and a density extrema statistic, regardless of the dimensions in which each statistic is defined. The relation between the Minkowski functionals and other geometrical statistics is clarified. These statistics can be applied to several cosmic fields, including three-dimensional density field, three-dimensional velocity field, two-dimensional projected density field, and so forth. The results are detailed for second-order theory of the formalism. The effect of the bias is discussed. The statistics of smoothed cosmic fields as functions of rescaled threshold by volume fraction are discussed in the framework of second-order perturbation theory. In CDM-like models, their functional deviations from linear predictions plotted against the rescaled threshold are generally much smaller than that plotted against the direct threshold. There is still a slight meatball shift against rescaled threshold, which is characterized by asymmetry in depths of troughs in the genus curve. A theory-motivated asymmetry factor in the genus curve is proposed.

  19. Complete primary structure of rainbow trout type I collagen consisting of alpha1(I)alpha2(I)alpha3(I) heterotrimers.

    PubMed

    Saito, M; Takenouchi, Y; Kunisaki, N; Kimura, S

    2001-05-01

    The subunit compositions of skin and muscle type I collagens from rainbow trout were found to be alpha1(I)alpha2(I)alpha3(I) and [alpha1(I)](2)alpha2(I), respectively. The occurrence of alpha3(I) has been observed only for bonyfish. The skin collagen exhibited more susceptibility to both heat denaturation and MMP-13 digestion than the muscle counterpart; the former had a lower denaturation temperature by about 0.5 degrees C than the latter. The lower stability of skin collagen, however, is not due to the low levels of imino acids because the contents of Pro and Hyp were almost constant in both collagens. On the other hand, some cDNAs coding for the N-terminal and/or a part of triple-helical domains of proalpha(I) chains were cloned from the cDNA library of rainbow trout fibroblasts. These cDNAs together with the previously cloned collagen cDNAs gave information about the complete primary structure of type I procollagen. The main triple-helical domain of each proalpha(I) chain had 338 uninterrupted Gly-X-Y triplets consisting of 1014 amino acids and was unique in its high content of Gly-Gly doublets. In particular, the bonyfish-specific alpha(I) chain, proalpha3(I) was characterized by the small number of Gly-Pro-Pro triplets, 19, and the large number of Gly-Gly doublets, 38, in the triple-helical domain, compared to 23 and 22, respectively, for proalpha1(I). The small number of Gly-Pro-Pro and the large number of Gly-Gly in proalpha3(I) was assumed to partially loosen the triple-helical structure of skin collagen, leading to the lower stability of skin collagen mentioned above. Finally, phylogenetic analyses revealed that proalpha3(I) had diverged from proalpha1(I). This study is the first report of the complete primary structure of fish type I procollagen.

  20. Statistical methods for astronomical data with upper limits. I - Univariate distributions

    NASA Technical Reports Server (NTRS)

    Feigelson, E. D.; Nelson, P. I.

    1985-01-01

    The statistical treatment of univariate censored data is discussed. A heuristic derivation of the Kaplan-Meier maximum-likelihood estimator from first principles is presented which results in an expression amenable to analytic error analysis. Methods for comparing two or more censored samples are given along with simple computational examples, stressing the fact that most astronomical problems involve upper limits while the standard mathematical methods require lower limits. The application of univariate survival analysis to six data sets in the recent astrophysical literature is described, and various aspects of the use of survival analysis in astronomy, such as the limitations of various two-sample tests and the role of parametric modelling, are discussed.
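    The Kaplan-Meier product-limit estimator discussed above can be sketched for right-censored data; as the abstract notes, astronomical upper limits are usually transformed (e.g. by negating the variable) so that they become censoring in the direction the standard methods require. The data here are illustrative:

```python
import numpy as np

def kaplan_meier(times, observed):
    """Product-limit (Kaplan-Meier) survival estimator for right-censored data.
    times: event or censoring times; observed: 1 if detected, 0 if censored.
    Returns (distinct event times, S(t) just after each event time)."""
    times = np.asarray(times, float)
    observed = np.asarray(observed, int)
    event_times = np.unique(times[observed == 1])
    S, surv = 1.0, []
    for t in event_times:
        at_risk = np.sum(times >= t)                      # size of the risk set at t
        events = np.sum((times == t) & (observed == 1))   # detections at t
        S *= 1.0 - events / at_risk                       # product-limit update
        surv.append(S)
    return event_times, np.array(surv)

# Illustrative sample: 0 marks a censored point (an upper limit after flipping)
t, S = kaplan_meier([1, 2, 2, 3, 4, 5], [1, 1, 0, 1, 0, 1])
```

    Each factor (1 - events/at_risk) is the conditional probability of surviving past one event time, which is where the maximum-likelihood derivation mentioned in the abstract leads.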

  1. Combining statistical inference and decisions in ecology

    USGS Publications Warehouse

    Williams, Perry J.; Hooten, Mevin B.

    2016-01-01

    Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
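    The central role of loss functions can be illustrated with a minimal example that is not from the paper: under squared-error loss the Bayes point estimate is the posterior mean, while under absolute-error loss it is the posterior median. A sketch using posterior draws (the Gamma posterior is an arbitrary stand-in):

```python
import numpy as np

rng = np.random.default_rng(0)
posterior = rng.gamma(shape=3.0, scale=2.0, size=20_000)  # stand-in posterior draws

# Expected posterior loss of a candidate decision (point estimate) a:
risk_sq  = lambda a: np.mean((posterior - a) ** 2)   # squared-error loss
risk_abs = lambda a: np.mean(np.abs(posterior - a))  # absolute-error loss

# Choose the decision minimizing expected loss over a grid of candidates
grid = np.linspace(posterior.min(), posterior.max(), 401)
best_sq  = grid[np.argmin([risk_sq(a) for a in grid])]
best_abs = grid[np.argmin([risk_abs(a) for a in grid])]
```

    `best_sq` lands at the grid point nearest the posterior mean and `best_abs` near the posterior median, showing how the chosen loss function, not the posterior alone, determines the reported estimate.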

  2. Sn2+-Stabilization in MASnI3 perovskites by superhalide incorporation

    NASA Astrophysics Data System (ADS)

    Xiang, Junxiang; Wang, Kan; Xiang, Bin; Cui, Xudong

    2018-03-01

    Sn-based hybrid halide perovskites are a potential solution to replace Pb and thereby reduce Pb toxicity in MAPbI3 perovskite-based solar cells. However, the instability of Sn2+ in air atmosphere causes a poor reproducibility of MASnI3, hindering steps towards this goal. In this paper, we propose a new type of organic metal-superhalide perovskite of MASnI2BH4 and MASnI2AlH4. Through first-principles calculations, our results reveal that the incorporation of BH4 and AlH4 superhalides can realize an impressive enhancement of oxidation resistance of Sn2+ in MASnI3 perovskites because of the large electron transfer between Sn2+ and [BH4]-/[AlH4]-. Meanwhile, the high carrier mobility is preserved in these superhalide perovskites and only a slight decrease is observed in the optical absorption strength. Our studies provide a new path to attain highly stable performance and reproducibility of Sn-based perovskite solar cells.

  3. Descriptive Statistical Techniques for Librarians. 2nd Edition.

    ERIC Educational Resources Information Center

    Hafner, Arthur W.

    A thorough understanding of the uses and applications of statistical techniques is integral in gaining support for library funding or new initiatives. This resource is designed to help practitioners develop and manipulate descriptive statistical information in evaluating library services, tracking and controlling limited resources, and analyzing…

  4. Biennial Survey of Education in the United States, 1936-1938. Bulletin, 1940, No. 2. Chapter I: Statistical Summary of Education, 1937-38

    ERIC Educational Resources Information Center

    Foster, Emily M.

    1942-01-01

    The U.S. Office of Education is required by law to collect statistics to show the condition and progress of education. Statistics can be made available, on a national scale, to the extent that school administrators, principals, and college officials cooperate on a voluntary basis with the Office of Education in making the facts available. This…

  5. United States Air Force Statistical Digest, Fiscal Year 1952. Seventh Edition

    DTIC Science & Technology

    1953-01-01

    Officers and Enlisted are reported on a pay-scale basis because of the inclusion of SCARWAF. Source: Personnel Statistical Division, DCS/Comptroller... a person in a theater of operations who becomes a casualty as defined herein, as a result of an outside force or agent of the enemy, in the face of...

  6. Statistical analysis of time transfer data from Timation 2. [US Naval Observatory and Australia

    NASA Technical Reports Server (NTRS)

    Luck, J. M.; Morgan, P.

    1974-01-01

    Between July 1973 and January 1974, three time transfer experiments using the Timation 2 satellite were conducted to measure time differences between the U.S. Naval Observatory and Australia. Statistical tests showed that the results are unaffected by the satellite's position with respect to the sunrise/sunset line or by its closest approach azimuth at the Australian station. Further tests revealed that forward predictions of time scale differences, based on the measurements, can be made with high confidence.

  7. Measuring Classroom Management Expertise (CME) of Teachers: A Video-Based Assessment Approach and Statistical Results

    ERIC Educational Resources Information Center

    König, Johannes

    2015-01-01

    The study aims at developing and exploring a novel video-based assessment that captures classroom management expertise (CME) of teachers and for which statistical results are provided. CME measurement is conceptualized by using four video clips that refer to typical classroom management situations in which teachers are heavily challenged…

  8. Interpretation of the results of statistical measurements. [search for basic probability model

    NASA Technical Reports Server (NTRS)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional that defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.

  9. Statistics of Narrowband White Noise Derived from Clipped Broadband White Noise

    DTIC Science & Technology

    1992-02-01

    …D(lΔf) = Σ_{n=1}^{N} C(nΔt) e^{−i2πln/N} (7), with the inverse transform given by C(nΔt) = (1/N) Σ_{l=1}^{N} D(lΔf) e^{i2πln/N} (8). The validity of this transform pair can be established by means of the identity (1/N) Σ_{l=1}^{N} e^{i2π(n−k)l/N} = δ_{n,k+lN} (9). NARROWBAND STATISTICS: The discrete Fourier transform and inverse transform can be executed via the fast…
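
    The transform pair described in this abstract is the standard discrete Fourier transform with the 1/N factor on the inverse. A minimal numpy sketch (illustrative, not taken from the report) checks the round trip and the underlying orthogonality identity:

    ```python
    import numpy as np

    # Sketch of the DFT pair in the abstract: forward transform without a
    # 1/N factor, inverse transform carrying the 1/N, checked on random data.
    rng = np.random.default_rng(0)
    N = 256
    c = rng.standard_normal(N)      # stand-in for the clipped-noise samples

    D = np.fft.fft(c)               # D(l df) = sum_n c(n dt) e^{-i 2 pi l n / N}
    c_back = np.fft.ifft(D)         # c(n dt) = (1/N) sum_l D(l df) e^{+i 2 pi l n / N}
    ok_roundtrip = np.allclose(c, c_back.real)

    # Orthogonality identity behind the pair:
    # (1/N) sum_l e^{i 2 pi (n-k) l / N} equals 1 when n == k (mod N), else 0.
    l = np.arange(N)
    delta = np.mean(np.exp(2j * np.pi * (3 - 3) * l / N))   # n == k case
    off = np.mean(np.exp(2j * np.pi * (3 - 7) * l / N))     # n != k case
    ```

    numpy's `fft`/`ifft` place the 1/N on the inverse, matching the convention in equations (7) and (8).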

  10. Results from phase I of the GERDA experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wester, Thomas

    2015-10-28

    The GERmanium Detector Array (Gerda) at the Laboratori Nazionali del Gran Sasso of the INFN in Italy is an experiment dedicated to the search for the neutrinoless double beta (0νββ) decay in ⁷⁶Ge. The experiment employs high-purity germanium detectors enriched in ⁷⁶Ge inside a 64 m³ cryostat filled with liquid argon. Gerda was planned in two phases of data taking, with the goal of reaching a half-life sensitivity on the order of 10²⁶ yr. Phase I of Gerda ran from November 2011 until May 2013. With about 18 kg total detector mass, data with an exposure of 21.6 kg·yr were collected and a background index of 0.01 cts/(keV·kg·yr) was achieved in the region of interest. No signal was found for the 0νββ decay, and a new limit of T_1/2 > 2.1 · 10²⁵ yr (90% C.L.) was obtained, strongly disfavoring the previous claim of observation. Furthermore, the 2νββ decay half-life of ⁷⁶Ge was measured with unprecedented precision. Other results include new half-life limits on the order of 10²³ yr for Majoron-emitting double beta decay modes with spectral indices n = 1, 2, 3, 7 and new limits on the order of 10²³ yr for 2νββ decays to the first 3 excited states of ⁷⁶Se. In Phase II, currently in preparation, the detector mass will be doubled while the background index is reduced by a factor of 10.

  11. Hidden Statistics of Schroedinger Equation

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2011-01-01

    Work was carried out to determine the mathematical origin of randomness in quantum mechanics and to create a hidden statistics of the Schrödinger equation, i.e., to expose the transitional stochastic process as a "bridge" to the quantum world. The governing equations of hidden statistics would preserve such properties of quantum physics as superposition, entanglement, and direct-product decomposability while allowing one to measure its state variables using classical methods.

  12. Statistics of Low-Mass Companions to Stars: Implications for Their Origin

    NASA Technical Reports Server (NTRS)

    Stepinski, T. F.; Black, D. C.

    2001-01-01

    One of the more significant results from observational astronomy over the past few years has been the detection, primarily via radial velocity studies, of low-mass companions (LMCs) to solar-like stars. The commonly held interpretation of these is that the majority are "extrasolar planets" whereas the rest are brown dwarfs, the distinction made on the basis of an apparent discontinuity in the distribution of M sin i for LMCs as revealed by a histogram. We report here results from statistical analysis of M sin i, as well as of the orbital-element data for available LMCs, to test the assertion that the LMC population is heterogeneous. The outcome is mixed. Solely on the basis of the distribution of M sin i, a heterogeneous model is preferable. Overall, we find that a definitive statement asserting that the LMC population is heterogeneous is, at present, unjustified. In addition, we compare statistics of LMCs with a comparable sample of stellar binaries. We find a remarkable statistical similarity between these two populations. This similarity, coupled with marked populational dissimilarity between LMCs and acknowledged planets, motivates us to suggest a common-origin hypothesis for LMCs and stellar binaries as an alternative to the prevailing interpretation. We discuss the merits of such a hypothesis and indicate a possible scenario for the formation of LMCs.

  13. Nonadditive entropy Sq and nonextensive statistical mechanics: Applications in geophysics and elsewhere

    NASA Astrophysics Data System (ADS)

    Tsallis, Constantino

    2012-06-01

    The celebrated Boltzmann-Gibbs (BG) entropy, S_BG = −k Σ_i p_i ln p_i, and associated statistical mechanics are essentially based on hypotheses such as ergodicity, i.e., when ensemble averages coincide with time averages. This dynamical simplification occurs in classical systems (and quantum counterparts) whose microscopic evolution is governed by a positive largest Lyapunov exponent (LLE). Under such circumstances, relevant microscopic variables behave, from the probabilistic viewpoint, as (nearly) independent. Many phenomena exist, however, in natural, artificial and social systems (geophysics, astrophysics, biophysics, economics, and others) that violate ergodicity. To cover a (possibly) wide class of such systems, a generalization (nonextensive statistical mechanics) of the BG theory was proposed in 1988. This theory is based on nonadditive entropies such as S_q = k (1 − Σ_i p_i^q)/(q − 1), with S_1 = S_BG. Here we comment on some central aspects of this theory and briefly review typical predictions, verifications and applications in geophysics and elsewhere, as illustrated through theoretical, experimental, observational, and computational results.
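
    As a quick numerical illustration of the nonadditive entropy defined in this abstract (a sketch with an arbitrary example distribution), S_q reduces to the BG entropy in the limit q → 1:

    ```python
    import numpy as np

    def tsallis_entropy(p, q, k=1.0):
        """S_q = k (1 - sum_i p_i^q) / (q - 1); the q -> 1 limit is the
        Boltzmann-Gibbs entropy S_BG = -k sum_i p_i ln p_i."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                          # 0 ln 0 := 0 convention
        if np.isclose(q, 1.0):
            return -k * np.sum(p * np.log(p))
        return k * (1.0 - np.sum(p ** q)) / (q - 1.0)

    p = [0.5, 0.25, 0.25]
    s_bg = tsallis_entropy(p, 1.0)        # BG value for this distribution
    s_near1 = tsallis_entropy(p, 1.001)   # approaches the BG value
    s_2 = tsallis_entropy(p, 2.0)         # = 1 - sum p_i^2 = 0.625
    ```

    The q = 2 case reduces to one minus the distribution's "purity," which makes the nonadditivity easy to verify by hand.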

  14. Robust inference from multiple test statistics via permutations: a better alternative to the single test statistic approach for randomized trials.

    PubMed

    Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie

    2013-01-01

    Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejection of the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is modest. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value rather than a single statistic, particularly when that single statistic is the logrank test, because of the cost and complexity of many survival trials. Copyright © 2013 John Wiley & Sons, Ltd.
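
    The minimum p-value idea can be sketched in a few lines: compute a permutation p-value for each candidate statistic, then refer the smallest of them to the permutation distribution of that minimum. The candidate statistics and data below are illustrative choices, not the paper's implementation:

    ```python
    import numpy as np

    def min_p_permutation_test(x, y, stats, n_perm=999, seed=0):
        """Two-sample test based on the minimum p-value over several
        candidate statistics, with the critical value taken from the
        permutation distribution of that minimum."""
        rng = np.random.default_rng(seed)
        pooled = np.concatenate([x, y])
        n = len(x)

        def evaluate(a, b):
            return np.array([abs(s(a, b)) for s in stats])

        obs = evaluate(x, y)
        perm = np.empty((n_perm, len(stats)))
        for i in range(n_perm):
            idx = rng.permutation(pooled.size)
            perm[i] = evaluate(pooled[idx[:n]], pooled[idx[n:]])

        # Permutation p-value of each observed statistic ...
        p_obs = (1 + (perm >= obs).sum(axis=0)) / (n_perm + 1)
        # ... and of each permuted statistic, giving the null of the minimum.
        p_null = (perm[:, None, :] >= perm[None, :, :]).sum(axis=0) / n_perm
        min_p_null = p_null.min(axis=1)
        return (1 + (min_p_null <= p_obs.min()).sum()) / (n_perm + 1)

    # Hypothetical data with a clear location shift; mean and median
    # differences serve as the candidate statistics.
    x = np.arange(10.0) + 5.0
    y = np.arange(10.0)
    p = min_p_permutation_test(x, y, [lambda a, b: a.mean() - b.mean(),
                                      lambda a, b: np.median(a) - np.median(b)])
    ```

    Because the minimum is referred to its own permutation distribution, no explicit multiplicity correction is needed even though several statistics are examined.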

  15. iTOUGH2: A multiphysics simulation-optimization framework for analyzing subsurface systems

    NASA Astrophysics Data System (ADS)

    Finsterle, S.; Commer, M.; Edmiston, J. K.; Jung, Y.; Kowalsky, M. B.; Pau, G. S. H.; Wainwright, H. M.; Zhang, Y.

    2017-11-01

    iTOUGH2 is a simulation-optimization framework for the TOUGH suite of nonisothermal multiphase flow models and related simulators of geophysical, geochemical, and geomechanical processes. After appropriate parameterization of subsurface structures and their properties, iTOUGH2 runs simulations for multiple parameter sets and analyzes the resulting output for parameter estimation through automatic model calibration, local and global sensitivity analyses, data-worth analyses, and uncertainty propagation analyses. Development of iTOUGH2 is driven by scientific challenges and user needs, with new capabilities continually added to both the forward simulator and the optimization framework. This review article provides a summary description of methods and features implemented in iTOUGH2, and discusses the usefulness and limitations of an integrated simulation-optimization workflow in support of the characterization and analysis of complex multiphysics subsurface systems.

  16. North Carolina Migrant Education Program. 1971 Project Evaluation Reports, Vol. I.

    ERIC Educational Resources Information Center

    North Carolina State Dept. of Public Instruction, Raleigh.

    Evaluation reports for 10 of the 23 1971 Summer Migrant Projects in North Carolina are presented in Volume I of this compilation. Each report contains the following information: (1) descriptive statistics and results of student achievement; (2) description of the project as obtained from site team reports and other available information; and (3)…

  17. CD147 reinforces [Ca2+]i oscillations and promotes oncogenic progression in hepatocellular carcinoma.

    PubMed

    Tang, Juan; Guo, Yun-Shan; Yu, Xiao-Ling; Huang, Wan; Zheng, Ming; Zhou, Ying-Hui; Nan, Gang; Wang, Jian-Chao; Yang, Hai-Jiao; Yu, Jing-Min; Jiang, Jian-Li; Chen, Zhi-Nan

    2015-10-27

    Oscillations in intracellular Ca2+ concentrations ([Ca2+]i) mediate various cellular functions. Although it is known that [Ca2+]i oscillations are susceptible to dysregulation in tumors, the tumor-specific regulators of [Ca2+]i oscillations are poorly characterized. We discovered that CD147 promotes hepatocellular carcinoma (HCC) metastasis and proliferation by enhancing the amplitude and frequency of [Ca2+]i oscillations in HCC cells. CD147 activates two distinct signaling pathways to regulate [Ca2+]i oscillations. By activating the FAK-Src-IP3R1 signaling pathway, CD147 promotes Ca2+ release from the endoplasmic reticulum (ER) and enhances the amplitude of [Ca2+]i oscillations. Furthermore, CD147 accelerates ER Ca2+ refilling and enhances the frequency of [Ca2+]i oscillations through activation of the CaMKP-PAK1-PP2A-PLB-SERCA signaling pathway. In addition, CD147-promoted ER Ca2+ release and refilling are tightly regulated by changing [Ca2+]i: CD147 may activate the IP3R1 channel under low [Ca2+]i conditions and the SERCA pump under high [Ca2+]i conditions. CD147 deletion suppresses HCC tumorigenesis and increases the survival rate of liver-specific CD147 knockout mice by regulating [Ca2+]i oscillations in vivo. Together, these results reveal that CD147 functions as a critical regulator of ER-dependent [Ca2+]i oscillations to promote oncogenic progression in HCC.

  18. CD147 reinforces [Ca2+]i oscillations and promotes oncogenic progression in hepatocellular carcinoma

    PubMed Central

    Zheng, Ming; Zhou, Ying-Hui; Nan, Gang; Wang, Jian-Chao; Yang, Hai-Jiao; Yu, Jing-Min; Jiang, Jian-Li; Chen, Zhi-Nan

    2015-01-01

    Oscillations in intracellular Ca2+ concentrations ([Ca2+]i) mediate various cellular functions. Although it is known that [Ca2+]i oscillations are susceptible to dysregulation in tumors, the tumor-specific regulators of [Ca2+]i oscillations are poorly characterized. We discovered that CD147 promotes hepatocellular carcinoma (HCC) metastasis and proliferation by enhancing the amplitude and frequency of [Ca2+]i oscillations in HCC cells. CD147 activates two distinct signaling pathways to regulate [Ca2+]i oscillations. By activating the FAK-Src-IP3R1 signaling pathway, CD147 promotes Ca2+ release from the endoplasmic reticulum (ER) and enhances the amplitude of [Ca2+]i oscillations. Furthermore, CD147 accelerates ER Ca2+ refilling and enhances the frequency of [Ca2+]i oscillations through activation of the CaMKP-PAK1-PP2A-PLB-SERCA signaling pathway. In addition, CD147-promoted ER Ca2+ release and refilling are tightly regulated by changing [Ca2+]i: CD147 may activate the IP3R1 channel under low [Ca2+]i conditions and the SERCA pump under high [Ca2+]i conditions. CD147 deletion suppresses HCC tumorigenesis and increases the survival rate of liver-specific CD147 knockout mice by regulating [Ca2+]i oscillations in vivo. Together, these results reveal that CD147 functions as a critical regulator of ER-dependent [Ca2+]i oscillations to promote oncogenic progression in HCC. PMID:26498680

  19. iRhom2 deficiency relieves TNF-α associated hepatic dyslipidemia in long-term PM2.5-exposed mice.

    PubMed

    Ge, Chen-Xu; Qin, Yu-Ting; Lou, De-Shuai; Li, Qiang; Li, Yuan-Yuan; Wang, Zhong-Ming; Yang, Wei-Wei; Wang, Ming; Liu, Nan; Wang, Zhen; Zhang, Peng-Xing; Tu, Yan-Yang; Tan, Jun; Xu, Min-Xuan

    2017-12-02

    Accumulating research has reported that particulate matter (PM2.5) is a risk factor for developing various diseases, including metabolic syndrome. Recently, inactive rhomboid protein 2 (iRhom2) was considered a necessary modulator for shedding of tumor necrosis factor-α (TNF-α) in immune cells. TNF-α, a major pro-inflammatory cytokine, has been linked to the pathogenesis of various diseases, including dyslipidemia. Here, wild-type (WT) and iRhom2-knockout (iRhom2 -/- ) mice were used to investigate the effects of iRhom2 on PM2.5-induced hepatic dyslipidemia. Hepatic histology, inflammatory response, glucose tolerance, serum parameters and gene expression were analyzed. We found that long-term inhalation of PM2.5 resulted in hepatic steatosis, and a significant up-regulation of iRhom2 in liver tissues was observed, accompanied by elevated expression of TNF-α, TNF-α converting enzyme (TACE), TNF-α receptor 2 (TNFR2) and various inflammatory cytokines. Additionally, PM2.5 treatment caused TG and TC accumulation in serum and liver, probably attributable to changes in genes modulating lipid metabolism. Intriguingly, hepatic injury and dyslipidemia were attenuated by iRhom2 -/- in mice challenged with PM2.5. In vitro, iRhom2 knockdown reduced TNF-α expression and its associated inflammatory cytokines in Kupffer cells, implying that liver-resident macrophages play an important role in regulating hepatic inflammation and lipid metabolism in cells treated with PM2.5. These findings indicate that long-term PM2.5 exposure causes hepatic steatosis and dyslipidemia by triggering inflammation that is, at least partly, dependent on the iRhom2/TNF-α pathway in liver-resident macrophages. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Multi-Reader ROC studies with Split-Plot Designs: A Comparison of Statistical Methods

    PubMed Central

    Obuchowski, Nancy A.; Gallas, Brandon D.; Hillis, Stephen L.

    2012-01-01

    Rationale and Objectives Multi-reader imaging trials often use a factorial design, where study patients undergo testing with all imaging modalities and readers interpret the results of all tests for all patients. A drawback of the design is the large number of interpretations required of each reader. Split-plot designs have been proposed as an alternative, in which one or a subset of readers interprets all images of a sample of patients, while other readers interpret the images of other samples of patients. In this paper we compare three methods of analysis for the split-plot design. Materials and Methods Three statistical methods are presented: the Obuchowski-Rockette method modified for the split-plot design, a newly proposed marginal-mean ANOVA approach, and an extension of the three-sample U-statistic method. A simulation study using the Roe-Metz model was performed to compare the type I error rate, power and confidence interval coverage of the three test statistics. Results The type I error rates for all three methods are close to the nominal level but tend to be slightly conservative. The statistical power is nearly identical for the three methods. The coverage of 95% CIs falls close to the nominal coverage for small and large sample sizes. Conclusions The split-plot MRMC study design can be statistically efficient compared with the factorial design, reducing the number of interpretations required per reader. Three methods of analysis, shown to have nominal type I error rate, similar power, and nominal CI coverage, are available for this study design. PMID:23122570

  1. 17 CFR 240.17g-2 - Records to be made and retained by nationally recognized statistical rating organizations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... retained by nationally recognized statistical rating organizations. 240.17g-2 Section 240.17g-2 Commodity... Recognized Statistical Rating Organizations § 240.17g-2 Records to be made and retained by nationally recognized statistical rating organizations. (a) Records required to be made and retained. A nationally...

  2. Antinociceptive effects of imidazoline I2 receptor agonists in the formalin test in rats

    PubMed Central

    Thorn, David A; Qiu, Yanyan; Zhang, Yanan; Li, Jun-Xu

    2015-01-01

    The imidazoline I2 receptor is an emerging drug target for analgesics. This study extended previous studies by examining the antinociceptive effects of three I2 receptor agonists (2-BFI, BU224 and CR4056) in the formalin test. The receptor mechanisms and anatomical mediation of I2 receptor agonist-induced antinociception were also examined. Formalin-induced flinching responses (2%, 50 µl) were quantified after treatment with I2 receptor agonists alone or in combination with the I2 receptor antagonist idazoxan. Anatomical mediation was studied by administering 2-BFI locally into the plantar surface or into the right lateral ventricle via cannulae (i.c.v.). Locomotor activity was also examined after central (i.c.v.) administration of 2-BFI. 2-BFI (1–10 mg/kg, i.p.) and BU224 (1–10 mg/kg, i.p.) attenuated the spontaneous flinching response observed during 10 min (phase 1) and 20–60 min (phase 2) following formalin treatment, while CR4056 (1–32 mg/kg, i.p.) only decreased the phase 2 flinching response. The I2 receptor antagonist idazoxan attenuated the antinociceptive effects of 2-BFI and BU224 during phase 1, but not phase 2. Peripheral administration of 2-BFI (1–10 mg/kg, i.pl.) to the hindpaw of rats had no antinociceptive effect. In contrast, centrally delivered 2-BFI (10–100 µg, i.c.v.) dose-dependently attenuated phase 1 and phase 2 flinching at doses that did not reduce locomotor activity. Together, these data reveal differential antinociceptive effects of I2 receptor agonists and differential antagonism profiles by idazoxan, suggesting the involvement of different I2 receptor subtypes in reducing different phases of formalin-induced pain-like behaviors. In addition, the results suggest central mediation of I2 receptor agonist-induced antinociceptive actions. PMID:26599907

  3. Using iPad2 for a Graduate Practicum Course

    ERIC Educational Resources Information Center

    Sachs, Lindsey; Bull, Prince Hycy

    2012-01-01

    iPads and iPhones continue to impact academia, but the iPad2 provides features that could enhance teacher education programs. This paper addresses how eight graduate students and a faculty member used the iPad2 to support a graduate practicum course. Participants were asked to report how they used their iPad2 each week in the form of a written log and…

  4. Reactions of R(2)P-P(SiMe(3))Li with [(R'(3)P)(2)PtCl(2)]. A general and efficient entry to phosphanylphosphinidene complexes of platinum. Syntheses and structures of [(eta(2)-P=(i)Pr(2))Pt(p-Tol(3)P)(2)], [(eta(2)-P=(t)Bu(2))Pt(p-Tol(3)P)(2)], [{eta(2)-P=(N(i)Pr(2))(2)}Pt(p-Tol(3)P)(2)] and [{(Et(2)PhP)(2)Pt}(2)P(2)].

    PubMed

    Domańska-Babul, Wioleta; Chojnacki, Jaroslaw; Matern, Eberhard; Pikies, Jerzy

    2009-01-07

    The reactions of lithium derivatives of diphosphanes R(2)P-P(SiMe(3))Li (R = (t)Bu, (i)Pr, Et(2)N and (i)Pr(2)N) with [(R'(3)P)(2)PtCl(2)] (R'(3)P = Et(3)P, Et(2)PhP, EtPh(2)P and p-Tol(3)P) proceed in a facile manner to afford side-on bonded phosphanylphosphinidene complexes of platinum [(eta(2)-P=R(2))Pt(PR'(3))(2)]. The related reactions of Ph(2)P-P(SiMe(3))Li with [(R'(3)P)(2)PtCl(2)] did not yield [(eta(2)-P=PPh(2))Pt(PR'(3))(2)] and resulted mainly in the formation of [{(R'(3)P)(2)Pt}(2)P(2)], Ph(2)P-PLi-PPh(2), (Me(3)Si)(2)PLi and (Me(3)Si)(3)P. Crystallographic data are reported for the compounds [(eta(2)-P=R(2))Pt(p-Tol(3)P)(2)] (R = (t)Bu, (i)Pr, ((i)Pr(2)N)(2)P) and for [{(Et(2)PhP)(2)Pt}(2)P(2)].

  5. A statistical study on the F2 layer vertical variation during nighttime medium-scale traveling ionospheric disturbances

    NASA Astrophysics Data System (ADS)

    Ssessanga, Nicholas; Kim, Yong Ha; Jeong, Se-Heon

    2017-03-01

    A statistical study on the relationship between the perturbation component ΔTEC (total electron content) and the F2 layer peak height (hmF2) during nighttime medium-scale traveling ionospheric disturbances is presented. The results are obtained by using a time-dependent computerized ionospheric tomography (CIT) technique. This was realized by using slant total electron content observations from a dense Global Positioning System receiver network over Japan (with more than 1000 receivers), together with a multiplicative algebraic reconstruction technique. Reconstructions from CIT were validated by using ionosonde and occultation measurements. A total of 36 different time snapshots of the ionosphere, taken when medium-scale traveling ionospheric disturbances (MSTIDs) were evident, were analyzed. These were obtained from a data set covering the years 2011 to 2014. The reconstructed surface wavefronts of the ΔTEC and hmF2 structure were found to be aligned along the northwest-southeast direction. These results confirm that nighttime MSTIDs are driven by electrodynamic forces related to the Perkins instability, which explains the northwest-southeast wavefront alignment based on the F region electrodynamics. Furthermore, from the statistical analysis, hmF2 varied quasiperiodically in altitude with dominant peak-to-peak amplitudes between 10 and 40 km. In addition, ΔTEC and hmF2 were 60% anticorrelated.
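
    The reported 60% anticorrelation is an ordinary Pearson correlation between the two reconstructed series. A toy numpy check with synthetic antiphase series (amplitudes and noise levels are invented for illustration, not the CIT data):

    ```python
    import numpy as np

    # Synthetic stand-ins for the MSTID series: a quasi-periodic TEC
    # perturbation and an F2 peak height oscillating in antiphase.
    rng = np.random.default_rng(1)
    t = np.linspace(0.0, 4.0 * np.pi, 200)
    dtec = np.sin(t) + 0.5 * rng.standard_normal(t.size)                  # TECU
    hmf2 = 300.0 - 15.0 * np.sin(t) + 10.0 * rng.standard_normal(t.size)  # km

    # Pearson correlation; negative because the series are in antiphase.
    r = np.corrcoef(dtec, hmf2)[0, 1]
    ```

    With these (assumed) amplitudes and noise levels the expected correlation is roughly −0.6, comparable to the level reported in the abstract.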

  6. Statistical Analysis in Dental Research Papers.

    DTIC Science & Technology

    1983-08-08

    AD-A136 019. Statistical Analysis in Dental Research Papers. U.S. Army Institute of Dental Research, Washington DC. Lorton, Lewis. 8 Aug 1983. Unclassified. Type of report and period covered: submission of paper, Jan–Aug 1983. [Remainder of the scanned report-documentation page is unrecoverable.]

  7. Health Resources Statistics; Health Manpower and Health Facilities, 1968. Public Health Service Publication No. 1509.

    ERIC Educational Resources Information Center

    National Center for Health Statistics (DHEW/PHS), Hyattsville, MD.

    This report is a part of the program of the National Center for Health Statistics to provide current statistics as baseline data for the evaluation, planning, and administration of health programs. Part I presents data concerning the occupational fields: (1) administration, (2) anthropology and sociology, (3) data processing, (4) basic sciences,…

  8. Statistical aspects of the TNK-S2B trial of tenecteplase versus alteplase in acute ischemic stroke: an efficient, dose-adaptive, seamless phase II/III design.

    PubMed

    Levin, Bruce; Thompson, John L P; Chakraborty, Bibhas; Levy, Gilberto; MacArthur, Robert; Haley, E Clarke

    2011-08-01

    TNK-S2B, an innovative, randomized, seamless phase II/III trial of tenecteplase versus rt-PA for acute ischemic stroke, terminated for slow enrollment before regulatory approval of use of phase II patients in phase III. (1) To review the trial design and comprehensive type I error rate simulations and (2) to discuss issues raised during regulatory review, to facilitate future approval of similar designs. In phase II, an early (24-h) outcome and adaptive sequential procedure selected one of three tenecteplase doses for phase III comparison with rt-PA. Decision rules comparing this dose to rt-PA would cause stopping for futility at phase II end, or continuation to phase III. Phase III incorporated two co-primary hypotheses, allowing for a treatment effect at either end of the trichotomized Rankin scale. Assuming no early termination, four interim analyses and one final analysis of 1908 patients provided an experiment-wise type I error rate of <0.05. Over 1,000 distribution scenarios, each involving 40,000 replications, the maximum type I error in phase III was 0.038. Inflation from the dose selection was more than offset by the one-half continuity correction in the test statistics. Inflation from repeated interim analyses was more than offset by the reduction from the clinical stopping rules for futility at the first interim analysis. Design complexity and evolving regulatory requirements lengthened the review process. (1) The design was innovative and efficient. Per protocol, type I error was well controlled for the co-primary phase III hypothesis tests, and experiment-wise. (2a) Time must be allowed for communications with regulatory reviewers from first design stages. (2b) Adequate type I error control must be demonstrated. (2c) Greater clarity is needed on (i) whether this includes demonstration of type I error control if the protocol is violated and (ii) whether simulations of type I error control are acceptable. (2d) Regulatory agency concerns that protocols

  9. Differences in 5-HT2A and mGlu2 Receptor Expression Levels and Repressive Epigenetic Modifications at the 5-HT2A Promoter Region in the Roman Low- (RLA-I) and High- (RHA-I) Avoidance Rat Strains.

    PubMed

    Fomsgaard, Luna; Moreno, Jose L; de la Fuente Revenga, Mario; Brudek, Tomasz; Adamsen, Dea; Rio-Alamos, Cristobal; Saunders, Justin; Klein, Anders Bue; Oliveras, Ignasi; Cañete, Toni; Blazquez, Gloria; Tobeña, Adolf; Fernandez-Teruel, Albert; Gonzalez-Maeso, Javier; Aznar, Susana

    2018-03-01

    The serotonin 2A (5-HT 2A ) and metabotropic glutamate 2 (mGlu2) receptors regulate each other and are associated with schizophrenia. The Roman high- (RHA-I) and the Roman low- (RLA-I) avoidance rat strains present well-differentiated behavioral profiles, with the RHA-I strain emerging as a putative genetic rat model of schizophrenia-related features. The RHA-I strain shows increased 5-HT 2A and decreased mGlu2 receptor binding levels in prefrontal cortex (PFC). Here, we looked for differences in gene expression and transcriptional regulation of these receptors. The striatum (STR) was included in the analysis. 5-HT 2A , 5-HT 1A , and mGlu2 mRNA and [ 3 H]ketanserin binding levels were measured in brain homogenates. As expected, 5-HT 2A binding was significantly increased in PFC in the RHA-I rats, while no difference in binding was observed in STR. Surprisingly, 5-HT 2A gene expression was unchanged in PFC but significantly decreased in STR. mGlu2 receptor gene expression was significantly decreased in both PFC and STR. No differences were observed for the 5-HT 1A receptor. Chromatin immunoprecipitation assay revealed increased trimethylation of histone 3 at lysine 27 (H3K27me3) at the promoter region of the HTR2A gene in the STR. We further looked at the Akt/GSK3 signaling pathway, a downstream point of convergence of the serotonin and glutamate system, and found increased phosphorylation levels of GSK3β at tyrosine 216 and increased β-catenin levels in the PFC of the RHA-I rats. These results reveal region-specific regulation of the 5-HT 2A receptor in the RHA-I rats probably due to absence of mGlu2 receptor that may result in differential regulation of downstream pathways.

  10. New iGrav superconducting gravimeter: accuracy, drift and first results

    NASA Astrophysics Data System (ADS)

    Le Moigne, N.; Champollion, C.; Warburton, R. J.; Bayer, R.; Deville, S.; Doerflinger, E.; chery, J.; Vernant, P.; Boudin, F.; Collard, P.

    2011-12-01

    A GWR iGrav superconducting gravimeter has been installed in the Larzac karstic area (southern France near the Mediterranean Sea, elevation 800 m, karst thickness 200 m). Continuous sub-μGal gravity measurements are needed to study water storage and transfer in the non-saturated zone of the karstic area. The GWR iGrav is a new generation of superconducting gravimeter of reduced size (15 L Dewar) with simplified installation. First, the specifications of the iGrav site will be presented, then the drift behaviour and the data processing. The drift quickly decreases to less than 0.1 μGal per day, and only a few offsets are observed in the data. In order to assess the stability of the iGrav over a long time period, an FG5 gravimeter is used for bi-monthly absolute gravity measurements and for frequent calibrations. As a result, the iGrav allowed sub-μGal gravity monitoring only a few weeks after the beginning of the installation. After discussing the instrumental and data-processing points of view, preliminary results on local karstic water storage will be presented and interpreted by combining different geophysical data. Continuous gravity data allow the study of processes at different timescales, such as summer evapotranspiration or the heavy precipitation events characteristic of the Mediterranean autumn.

  11. Expression of IGF-I, IGF-I receptor and IGF binding proteins-1, -2, -3, -4 and -5 in human atherectomy specimens.

    PubMed

    Grant, M B; Wargovich, T J; Ellis, E A; Tarnuzzer, R; Caballero, S; Estes, K; Rossing, M; Spoerri, P E; Pepine, C

    1996-12-17

    The molecular and cellular processes that induce rapid atherosclerotic plaque progression in patients with unstable angina and initiate restenosis following coronary interventional procedures are uncertain. We examined primary (de novo) and restenotic lesions retrieved at the time of directional coronary atherectomy for expression of insulin-like growth factor-I (IGF-I), IGF-I receptor, and five IGF binding proteins (IGFBPs), IGFBP-1, IGFBP-2, IGFBP-3, IGFBP-4, and IGFBP-5, in smooth muscle cells (SMCs) using colloidal gold immunocytochemistry. IGF-I, its receptor and binding proteins were not detected in SMCs of normal coronary arteries. IGF-I localized primarily in synthetic smooth muscle cells (sSMCs) in both de novo and restenotic plaques. IGF-I receptor localized on sSMCs and their processes and colocalized with IGF-I. Although morphometric analysis of IGF-I and IGF-I receptor immunoreactivity in sSMCs of de novo and restenotic lesions showed comparable levels of IGF-I (3.2 +/- 1.0 and 2.9 +/- 0.9, respectively), IGF-I receptor was significantly higher in de novo lesions as compared to restenotic lesions (10.7 +/- 2.5 vs. 4.2 +/- 1.3, P < 0.05). IGFBP-1, IGFBP-2, IGFBP-3, IGFBP-4 and IGFBP-5 localized in the cytoplasm of sSMCs and in the extracellular matrix. Quantitative reverse transcription polymerase chain reaction (QRT-PCR) performed on de novo atherectomy specimens quantified mRNA levels for IGF-I, IGF-I receptor, IGFBP-1, IGFBP-2, IGFBP-4, and IGFBP-5, and detected mRNA for IGFBP-3. The expression of IGF-I, IGF-I receptor, and IGFBPs in atherectomy plaques suggests that the development of coronary obstructive lesions may be a result of changes in the IGF system.

  12. TID Test Results for 4th Generation iPad(TradeMark)

    NASA Technical Reports Server (NTRS)

    Guertin, S. M.; Allen, G. R.; McClure, S. S.; LaBel, K. A.

    2013-01-01

    TID testing of 4th generation iPads is reported. Of iPad subsystems, results indicate that the charging circuitry and display drivers fail at lowest TID levels. Details of construction are investigated for additional testing of components.

  13. The longevity of statistical learning: When infant memory decays, isolated words come to the rescue.

    PubMed

    Karaman, Ferhat; Hay, Jessica F

    2018-02-01

    Research over the past 2 decades has demonstrated that infants are equipped with remarkable computational abilities that allow them to find words in continuous speech. Infants can encode information about the transitional probability (TP) between syllables to segment words from artificial and natural languages. Because previous research has tested infants immediately after familiarization, infants' ability to retain sequential statistics beyond the immediate familiarization context remains unknown. Here, we examine infants' memory for statistically defined words 10 min after familiarization with an Italian corpus. Eight-month-old English-learning infants were familiarized with Italian sentences that contained 4 embedded target words: 2 words had high internal TP (HTP, TP = 1.0) and 2 had low TP (LTP, TP = .33). They were then tested on their ability to discriminate HTP from LTP words using the Headturn Preference Procedure. When tested after a 10-min delay, infants failed to discriminate HTP from LTP words, suggesting that memory for statistical information likely decays over even short delays (Experiment 1). Experiments 2-4 were designed to test whether experience with isolated words selectively reinforces memory for statistically defined (i.e., HTP) words. When 8-month-olds were given additional experience with isolated tokens of both HTP and LTP words immediately after familiarization, they looked significantly longer on HTP than LTP test trials 10 min later. Although initial representations of statistically defined words may be fragile, our results suggest that experience with isolated words may reinforce the output of statistical learning by helping infants create more robust memories for words with strong versus weak co-occurrence statistics. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
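
The segmentation cue in this record is the forward transitional probability TP(A→B) = freq(AB) / freq(A) between adjacent syllables. A minimal sketch of that computation on a toy syllable stream (the syllables and counts are illustrative, not from the study's Italian corpus):

```python
from collections import Counter

def transitional_probabilities(syllables):
    """Forward TP(A -> B) = frequency of the pair AB / frequency of A."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {pair: n / first_counts[pair[0]] for pair, n in pair_counts.items()}

# Toy stream: "bi da" always co-occurs (word-internal, high TP),
# while "da ku" occurs only sometimes (word boundary, lower TP).
stream = ["bi", "da", "ku", "bi", "da", "po", "bi", "da", "ku"]
tps = transitional_probabilities(stream)
```

High-TP pairs like ("bi", "da") come out at 1.0, while boundary-spanning pairs score lower, which is the contrast the HTP/LTP test items exploit.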

  14. Simulation studies of the Cl- + CH3I SN2 nucleophilic substitution reaction: Comparison with ion imaging experiments

    NASA Astrophysics Data System (ADS)

    Zhang, Jiaxu; Lourderaj, Upakarasamy; Sun, Rui; Mikosch, Jochen; Wester, Roland; Hase, William L.

    2013-03-01

    In the previous work of Mikosch et al. [Science 319, 183 (2008)], 10.1126/science.1150238, ion imaging experiments were used to study the Cl- + CH3I → ClCH3 + I- reaction at collision energies Erel of 0.39, 0.76, 1.07, and 1.9 eV. For the work reported here MP2(fc)/ECP/d direct dynamics simulations were performed to obtain an atomistic understanding of the experiments. There is good agreement with the experimental product energy and scattering angle distributions for the highest three Erel, and at these energies 80% or more of the reaction is direct, primarily occurring by a rebound mechanism with backward scattering. At 0.76 eV there is a small indirect component, with isotropic scattering, involving formation of the pre- and post-reaction complexes. All of the reaction is direct at 1.07 eV. Increasing Erel to 1.9 eV opens up a new indirect pathway, the roundabout mechanism. The product energy is primarily partitioned into relative translation for the direct reactions, but to CH3Cl internal energy for the indirect reactions. The roundabout mechanism transfers substantial energy to CH3Cl rotation. At Erel = 0.39 eV both the experimental product energy partitioning and scattering are statistical, suggesting the reaction is primarily indirect with formation of the pre- and post-reaction complexes. However, neither MP2 nor BhandH/ECP/d simulations agree with experiment and, instead, give reaction dominated by direct processes as found for the higher collision energies. Decreasing the simulation Erel to 0.20 eV results in product energy partitioning and scattering which agree with the 0.39 eV experiment. The sharp transition from a dominant direct to indirect reaction as Erel is lowered from 0.39 to 0.20 eV is striking. The lack of agreement between the simulations and experiment for Erel = 0.39 eV may result from a distribution of collision energies in the experiment and/or a shortcoming in both the MP2 and BhandH simulations. Increasing the reactant rotational

  15. I = 1 and I = 2 π-π scattering phase shifts from Nf = 2 + 1 lattice QCD

    NASA Astrophysics Data System (ADS)

    Bulava, John; Fahy, Brendan; Hörz, Ben; Juge, Keisuke J.; Morningstar, Colin; Wong, Chik Him

    2016-09-01

    The I = 1 p-wave and I = 2 s-wave elastic π-π scattering amplitudes are calculated from a first-principles lattice QCD simulation using a single ensemble of gauge field configurations with Nf = 2 + 1 dynamical flavors of anisotropic clover-improved Wilson fermions. This ensemble has a large spatial volume V =(3.7 fm)3, pion mass mπ = 230 MeV, and spatial lattice spacing as = 0.11 fm. Calculation of the necessary temporal correlation matrices is efficiently performed using the stochastic LapH method, while the large volume enables an improved energy resolution compared to previous work. For this single ensemble we obtain mρ /mπ = 3.350 (24), gρππ = 5.99 (26), and a clear signal for the I = 2 s-wave. The success of the stochastic LapH method in this proof-of-principle large-volume calculation paves the way for quantitative study of the lattice spacing effects and quark mass dependence of scattering amplitudes using state-of-the-art ensembles.

  16. Interactive Cohort Identification of Sleep Disorder Patients Using Natural Language Processing and i2b2

    PubMed Central

    Chen, W.; Kowatch, R.; Lin, S.; Splaingard, M.

    2015-01-01

    Summary Nationwide Children’s Hospital established an i2b2 (Informatics for Integrating Biology & the Bedside) application for sleep disorder cohort identification. Discrete data were gleaned from semi-structured sleep study reports. The system was shown to work more efficiently than the traditional manual chart review method, and it also enabled searching capabilities that were previously not possible. Objective We report on the development and implementation of the sleep disorder i2b2 cohort identification system using natural language processing of semi-structured documents. Methods We developed a natural language processing approach to automatically parse concepts and their values from semi-structured sleep study documents. Two parsers were developed: a regular expression parser for extracting numeric concepts and an NLP-based tree parser for extracting textual concepts. Concepts were further organized into i2b2 ontologies based on document structures and in-domain knowledge. Results 26,550 concepts were extracted, with 99% being textual concepts. 1.01 million facts were extracted from sleep study documents, such as demographic information, sleep study lab results, medications, procedures, and diagnoses, among others. The average accuracy of terminology parsing was over 83% when compared with annotations by experts. The system is capable of capturing both standard and non-standard terminologies. The time for cohort identification has been reduced significantly, from a few weeks to a few seconds. Conclusion Natural language processing was shown to be powerful for quickly converting large amounts of semi-structured or unstructured clinical data into discrete concepts, which, in combination with intuitive domain-specific ontologies, allows fast and effective interactive cohort identification through the i2b2 platform for research and clinical use. PMID:26171080

  17. Statistical science: a grammar for research.

    PubMed

    Cox, David R

    2017-06-01

    I greatly appreciate the invitation to give this lecture with its century-long history. The title is a warning that the lecture is rather discursive and not highly focused or technical. The theme is simple: statistical thinking provides a unifying set of general ideas and specific methods relevant whenever appreciable natural variation is present. To be most fruitful, these ideas should merge seamlessly with subject-matter considerations. By contrast, there is sometimes a temptation to regard formal statistical analysis as a ritual to be added after the serious work has been done, a ritual to satisfy convention, referees, and regulatory agencies. I want implicitly to refute that idea.

  18. A new statistical methodology predicting chip failure probability considering electromigration

    NASA Astrophysics Data System (ADS)

    Sun, Ted

    In this research thesis, we present a new approach to analyzing chip reliability subject to electromigration (EM); the fundamental causes of EM and its manifestation in different materials are presented in this thesis. This new approach utilizes the statistical nature of EM failure in order to assess overall EM risk. It includes within-die temperature variations from the chip's temperature map, extracted by an Electronic Design Automation (EDA) tool, to estimate the failure probability of a design. Both the power estimation and the thermal analysis are performed in the EDA flow. We first used the traditional EM approach to analyze the design, which involves 6 metal and 5 via layers, with a single temperature across the entire chip. Next, we used the same traditional approach but with a realistic temperature map. The traditional EM analysis approach, the approach coupled with a temperature map, and a comparison between the two sets of results are presented in this research. The comparison confirms that using a temperature map yields a less pessimistic estimation of the chip's EM risk. Finally, we employed the statistical methodology we developed, considering a temperature map and different use-condition voltages and frequencies, to estimate the overall failure probability of the chip. The statistical model accounts for scaling through the traditional Black equation and four major use conditions. The results of this statistical analysis confirm that the chip-level failure probability is higher i) at higher use-condition frequencies for all use-condition voltages, and ii) when a single temperature instead of a temperature map across the chip is considered. In this thesis, I start with an overall review of current design types, common flows, and the necessary verification and reliability-checking steps used in the IC design industry
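
The thesis's lifetime model is the traditional Black equation, MTTF = A · j^(−n) · exp(Ea / (kB·T)), so a hotter spot on the temperature map directly shortens expected lifetime. A sketch of that temperature sensitivity (the prefactor A, current-density exponent n, and activation energy Ea below are illustrative placeholders, not values from the thesis):

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def black_mttf(j, temp_k, a=1.0, n=2.0, ea=0.9):
    """Black's equation: MTTF = A * j**(-n) * exp(Ea / (k_B * T)).
    a, n, ea are illustrative placeholders, not fitted values."""
    return a * j ** (-n) * math.exp(ea / (K_B * temp_k))

# A 50 K hot spot on the temperature map sharply shortens the expected EM lifetime,
# which is why a single chip-wide temperature overstates risk for cooler regions.
ratio = black_mttf(1e6, 400.0) / black_mttf(1e6, 350.0)
```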

  19. Receptor arrays optimized for natural odor statistics.

    PubMed

    Zwicker, David; Murugan, Arvind; Brenner, Michael P

    2016-05-17

    Natural odors typically consist of many molecules at different concentrations. It is unclear how the numerous odorant molecules and their possible mixtures are discriminated by relatively few olfactory receptors. Using an information theoretic model, we show that a receptor array is optimal for this task if it achieves two possibly conflicting goals: (i) Each receptor should respond to half of all odors and (ii) the response of different receptors should be uncorrelated when averaged over odors presented with natural statistics. We use these design principles to predict statistics of the affinities between receptors and odorant molecules for a broad class of odor statistics. We also show that optimal receptor arrays can be tuned to either resolve concentrations well or distinguish mixtures reliably. Finally, we use our results to predict properties of experimentally measured receptor arrays. Our work can thus be used to better understand natural olfaction, and it also suggests ways to improve artificial sensor arrays.

  20. Hispanics/Latinos & Cardiovascular Disease: Statistical Fact Sheet

    MedlinePlus

    Statistical Fact Sheet 2013 Update Hispanics/Latinos & Cardiovascular Diseases Cardiovascular Disease (CVD) (ICD/10 codes I00-I99, Q20-Q28) (ICD/9 codes 390-459, 745-747)  Among Mexican-American adults age 20 ...

  1. A knowledge-based T2-statistic to perform pathway analysis for quantitative proteomic data

    PubMed Central

    Chen, Yi-Hau

    2017-01-01

    Approaches to identify significant pathways from high-throughput quantitative data have been developed in recent years. Still, the analysis of proteomic data remains difficult because of limited sample size. This limitation has also led to the common practice of using a competitive null, which fundamentally treats genes or proteins as independent units. The independence assumption ignores the associations among biomolecules with similar functions or cellular localization, as well as the interactions among them manifested as changes in expression ratios. Consequently, these methods often underestimate the associations among biomolecules and cause false positives in practice. Some studies incorporate the sample covariance matrix into the calculation to address this issue. However, the sample covariance may not be a precise estimate if the sample size is very limited, which is usually the case for data produced by mass spectrometry. In this study, we introduce a multivariate test under a self-contained null to perform pathway analysis for quantitative proteomic data. The covariance matrix used in the test statistic is constructed from the confidence scores retrieved from the STRING database or the HitPredict database. We also design an integrating procedure to retain pathways with sufficient evidence as a pathway group. The performance of the proposed T2-statistic is demonstrated using five published experimental datasets: the T-cell activation, the cAMP/PKA signaling, the myoblast differentiation, and the effect of dasatinib on the BCR-ABL pathway are proteomic datasets produced by mass spectrometry; the protective effect of myocilin via the MAPK signaling pathway is a gene expression dataset of limited sample size. Compared with other popular statistics, the proposed T2-statistic yields more accurate descriptions in agreement with the discussion of the original publications.
We implemented the T2-statistic into an R package T2GA, which is available at https
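
The statistic described is a Hotelling-style T² in which the covariance matrix is supplied from external knowledge rather than estimated from the small sample. A generic sketch of that idea (the data are toy values, the off-diagonal 0.5 stands in for a knowledge-based association score, and this is not the T2GA implementation):

```python
import numpy as np

def t2_statistic(X, mu0, sigma):
    """Hotelling-style T^2 = n * (xbar - mu0)' Sigma^{-1} (xbar - mu0),
    with Sigma supplied externally (e.g., built from interaction confidence
    scores) instead of estimated from the limited sample."""
    X = np.asarray(X, dtype=float)
    n = X.shape[0]
    diff = X.mean(axis=0) - np.asarray(mu0, dtype=float)
    return float(n * diff @ np.linalg.solve(sigma, diff))

# Toy data: 4 samples of 2 protein log-ratios tested against a zero-mean null.
X = [[0.2, 0.1], [0.3, 0.2], [0.1, 0.0], [0.4, 0.3]]
sigma = np.array([[1.0, 0.5], [0.5, 1.0]])  # knowledge-based, not sample, covariance
t2 = t2_statistic(X, [0.0, 0.0], sigma)
```

With only four samples the sample covariance would be badly conditioned, which is exactly the situation the knowledge-based Σ is meant to avoid.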

  2. A knowledge-based T2-statistic to perform pathway analysis for quantitative proteomic data.

    PubMed

    Lai, En-Yu; Chen, Yi-Hau; Wu, Kun-Pin

    2017-06-01

    Approaches to identify significant pathways from high-throughput quantitative data have been developed in recent years. Still, the analysis of proteomic data remains difficult because of limited sample size. This limitation has also led to the common practice of using a competitive null, which fundamentally treats genes or proteins as independent units. The independence assumption ignores the associations among biomolecules with similar functions or cellular localization, as well as the interactions among them manifested as changes in expression ratios. Consequently, these methods often underestimate the associations among biomolecules and cause false positives in practice. Some studies incorporate the sample covariance matrix into the calculation to address this issue. However, the sample covariance may not be a precise estimate if the sample size is very limited, which is usually the case for data produced by mass spectrometry. In this study, we introduce a multivariate test under a self-contained null to perform pathway analysis for quantitative proteomic data. The covariance matrix used in the test statistic is constructed from the confidence scores retrieved from the STRING database or the HitPredict database. We also design an integrating procedure to retain pathways with sufficient evidence as a pathway group. The performance of the proposed T2-statistic is demonstrated using five published experimental datasets: the T-cell activation, the cAMP/PKA signaling, the myoblast differentiation, and the effect of dasatinib on the BCR-ABL pathway are proteomic datasets produced by mass spectrometry; the protective effect of myocilin via the MAPK signaling pathway is a gene expression dataset of limited sample size. Compared with other popular statistics, the proposed T2-statistic yields more accurate descriptions in agreement with the discussion of the original publications.
We implemented the T2-statistic into an R package T2GA, which is available at https

  3. Statistical Reference Datasets

    National Institute of Standards and Technology Data Gateway

    Statistical Reference Datasets (Web, free access)   The Statistical Reference Datasets project is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.

  4. Spectroscopic constants and potential energy curve of the iodine weakly bound 1u state correlating with the I(2P1/2) + I(2P1/2) dissociation limit

    NASA Astrophysics Data System (ADS)

    Akopyan, M. E.; Baturo, V. V.; Lukashov, S. S.; Poretsky, S. A.; Pravilov, A. M.

    2015-01-01

    The stepwise three-step, three-color laser population of the I2(β1g, νβ, Jβ) rovibronic states via the B0u+, νB, JB rovibronic states and rovibronic levels of the 1u(bb) and 0g+(bb) states mixed by hyperfine interaction is used for determination of the rovibronic level energies of the weakly bound I2(1u(bb)) state. Dunham coefficients of the state, Yi0 (i = 0-3), Yi1 (i = 0-2), Y02 and Y12 for the ν1u = 1-5, 8, 10, 15 and J1u ≈ 9-87 ranges, the dissociation energy of the state, De, the equilibrium I-I distance, Re, as well as the potential energy curve are determined. There are aperiodicities in the excitation spectrum corresponding to the β, νβ = 23, Jβ ← 1u(bb), ν1u = 4, 5, J1u progressions in the I2 + Rg (Rg = He, Ar) mixture, namely, a great number of lines which do not coincide with the R or P line progressions. Their positions conflict with the ΔJ-even selection rule. Furthermore, they do not correspond to the ΔJ-odd progression.

  5. Association of vitamin D receptor BsmI, TaqI, FokI, and ApaI polymorphisms with susceptibility of chronic periodontitis: A systematic review and meta-analysis based on 38 case–control studies

    PubMed Central

    Mashhadiabbas, Fatemeh; Neamatzadeh, Hossein; Nasiri, Rezvan; Foroughi, Elnaz; Farahnak, Soudabeh; Piroozmand, Parisa; Mazaheri, Mahta; Zare-Shehneh, Masoud

    2018-01-01

    Background: There has been increasing interest in the study of the association between Vitamin D receptor (VDR) gene polymorphisms and risk of chronic periodontitis. However, the results remain inconclusive. To better understand the roles of VDR polymorphisms (BsmI, TaqI, FokI, and ApaI) in chronic periodontitis susceptibility, we conducted this systematic review and meta-analysis. Materials and Methods: The PubMed, Google Scholar, and Web of Science databases were systematically searched to identify all eligible studies on VDR polymorphisms and risk of chronic periodontitis up to April 2017. The odds ratio (OR) and 95% confidence interval (CI) were used to evaluate the associations between VDR polymorphisms and chronic periodontitis risk. All statistical analyses were performed with the Comprehensive Meta-Analysis software. All P values were two-tailed with a significance level of 0.05. Results: A total of 38 case–control studies in 19 publications met our inclusion criteria. There were ten studies with 866 chronic periodontitis cases and 786 controls for BsmI, 16 studies with 1570 cases and 1676 controls for TaqI, five studies with 374 cases and 382 controls for FokI, and seven studies with 632 cases and 604 controls for ApaI. Overall, no significant association was observed between the VDR gene BsmI, TaqI, FokI, and ApaI polymorphisms and risk of chronic periodontitis in any genetic model. Subgroup analysis stratified by ethnicity suggested a significant association between the BsmI polymorphism and chronic periodontitis risk in the Caucasian subgroup under the allele model (A vs. G: OR = 1.747, 95% CI = 1.099–2.778, P = 0.018). Further, no significant associations were observed when stratified by Hardy–Weinberg equilibrium status for BsmI, TaqI, and ApaI. Conclusion: Our results suggest that BsmI, TaqI, FokI, and ApaI polymorphisms in the VDR gene might not be associated with risk of
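
The pooled effect measure reported above is an odds ratio with a 95% confidence interval on the log scale. A minimal sketch of the standard 2×2-table computation (the counts are illustrative only, not from the meta-analysis):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% CI from a 2x2 table:
    a/c = cases with/without the risk allele, b/d = controls likewise.
    CI is computed on the log scale with SE = sqrt(1/a + 1/b + 1/c + 1/d)."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Illustrative allele counts; a CI that straddles 1.0 means no significant association.
or_, lo, hi = odds_ratio_ci(40, 60, 30, 70)
```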

  6. Combining statistical inference and decisions in ecology.

    PubMed

    Williams, Perry J; Hooten, Mevin B

    2016-09-01

    Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods, including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem. © 2016 by the Ecological Society of America.
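
Central to the SDT framework above is choosing the action that minimizes Bayes expected loss, and the choice of loss function changes the optimal action. A small Monte Carlo sketch (with a hypothetical gamma posterior, not an example from the paper) showing that squared-error loss selects roughly the posterior mean while absolute loss selects roughly the posterior median:

```python
import numpy as np

# Hypothetical posterior over an ecological parameter, represented by draws.
rng = np.random.default_rng(0)
draws = rng.gamma(2.0, 1.5, size=20_000)

def expected_loss(action, draws, loss):
    """Bayes expected loss of an action, approximated over posterior draws."""
    return loss(draws - action).mean()

# Search a grid of candidate actions under two loss functions.
grid = np.linspace(0.0, 10.0, 401)
best_sq = grid[np.argmin([expected_loss(a, draws, lambda e: e ** 2) for a in grid])]
best_abs = grid[np.argmin([expected_loss(a, draws, np.abs) for a in grid])]
# best_sq lands near draws.mean(); best_abs near np.median(draws).
```

The same machinery extends to asymmetric management losses (e.g., penalizing overburning a grassland more than underburning), which is where SDT departs from plain point estimation.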

  7. Failure Analysis by Statistical Techniques (FAST). Volume 1. User’s Manual

    DTIC Science & Technology

    1974-10-31

    REPORT NUMBER DNA 3336F-1. TITLE: FAILURE ANALYSIS BY STATISTICAL TECHNIQUES (FAST), Volume I, User’s...SS2), and a facility (SS7). The other three diagrams break down the three critical subsystems. The median probability of survival of the

  8. Gαi2- and Gαi3-Deficient Mice Display Opposite Severity of Myocardial Ischemia Reperfusion Injury

    PubMed Central

    Köhler, David; Devanathan, Vasudharani; Bernardo de Oliveira Franz, Claudia; Eldh, Therese; Novakovic, Ana; Roth, Judith M.; Granja, Tiago; Birnbaumer, Lutz; Rosenberger, Peter; Beer-Hammer, Sandra; Nürnberg, Bernd

    2014-01-01

    G-protein-coupled receptors (GPCRs) are the most abundant receptors in the heart and therefore are common targets for cardiovascular therapeutics. Activated GPCRs transduce their signals via heterotrimeric G-proteins. The four major families of G-proteins identified so far are specified through their α-subunit: Gαi, Gαs, Gαq, and Gα12/13. Gαi-proteins have been reported to protect hearts from ischemia reperfusion injury. However, the individual impact of Gαi2 or Gαi3 on myocardial ischemia injury has not yet been clarified. Here, we first investigated expression of Gαi2 and Gαi3 at the transcriptional level by quantitative PCR and at the protein level by immunoblot analysis as well as by immunofluorescence in cardiac tissues of wild-type, Gαi2-, and Gαi3-deficient mice. Gαi2 was expressed at higher levels than Gαi3 in murine hearts, and irrespective of the isoform being knocked out we observed an upregulation of the remaining Gαi-protein. Myocardial ischemia promptly regulated cardiac mRNA levels and, with a slight delay, protein levels of both Gαi2 and Gαi3, indicating important roles for both Gαi isoforms. Furthermore, ischemia reperfusion injury in Gαi2- and Gαi3-deficient mice exhibited opposite outcomes. Whereas the absence of Gαi2 significantly increased the infarct size in the heart, the absence of Gαi3, with the concomitant upregulation of Gαi2, dramatically reduced cardiac infarction. In conclusion, we demonstrate for the first time that the genetic ablation of Gαi proteins has protective or deleterious effects on cardiac ischemia reperfusion injury depending on the isoform being absent. PMID:24858945

  9. AgPO2F2 and Ag9(PO2F2)14: the first Ag(I) and Ag(I)/Ag(II) difluorophosphates with complex crystal structures.

    PubMed

    Malinowski, Przemysław J; Kurzydłowski, Dominik; Grochala, Wojciech

    2015-12-07

    The reaction of AgF2 with P2O3F4 yields a mixed-valence Ag(I)/Ag(II) difluorophosphate salt with Ag9(PO2F2)14 stoichiometry - the first Ag(II)-PO2F2 system known. This highly moisture-sensitive brown solid is thermally stable up to 120 °C, which points to the feasibility of further extending the chemistry of Ag(II)-PO2F2 systems. The crystal structure shows a very complex bonding pattern, comprising polymeric Ag(PO2F2)14(4-) anions and two types of Ag(I) cations. One particular Ag(II) site present in the crystal structure of Ag9(PO2F2)14 is the first known example of square-pyramidal penta-coordinated Ag(II) in an oxo-ligand environment. Ag(I)PO2F2 - the product of the thermal decomposition of Ag9(PO2F2)14 - has also been characterized by thermal analysis, IR spectroscopy, and X-ray powder diffraction. It, too, has a complicated crystal structure, which consists of infinite 1D [Ag(I)O4/2] chains linked into more complex 3D structures via OPO bridges. The PO2F2(-) anions bind to cations in both compounds as bidentate oxo-ligands. The terminal F atoms tend to point into the van der Waals cavities in the crystal structures of both compounds. All important structural details of both title compounds were corroborated by DFT calculations.

  10. The development of form two mathematics i-Think module (Mi-T2)

    NASA Astrophysics Data System (ADS)

    Yao, Foo Jing; Abdullah, Mohd Faizal Nizam Lee; Tien, Lee Tien

    2017-05-01

    This study aims to develop a training module, i-THINK Mathematics Form Two (Mi-T2), to increase the higher-order thinking skills of students. The Mi-T2 training module was built based on the Sidek Module Development Model (2001). Constructivist learning theory, cognitive learning theory, the i-THINK map, and higher-order thinking skills were the building blocks of the module development. In this study, the researcher determined the validity and reliability of the Mi-T2 module. The design used in this study was a descriptive study. To determine the need for the Mi-T2 module, questionnaires and a literature review were used to collect data. Once the need for the module was established, the module was built and a pilot study was conducted to test its reliability. The pilot study was conducted at a secondary school in North Kinta, Perak. A Form Two class was selected as the study sample through clustered random sampling. The pilot study was conducted for two months and one topic was studied. The Mi-T2 module was evaluated by five expert panels to determine the content validity of the module. The instruments used in the study were questionnaires about the necessity of the Mi-T2 module for guidance, questionnaires about the validity of the module, and questionnaires concerning the reliability of the module. Statistical analysis was conducted to determine the validity and reliability coefficients of the Mi-T2 module. The content validity of the Mi-T2 module was determined by Cohen's (1968) kappa agreement coefficient, and the reliability of the Mi-T2 module was determined by Cronbach's alpha. The content validity of the Mi-T2 module was 0.89 and the Cronbach's alpha of the Mi-T2 module was 0.911.
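
The content-validity figure above is a kappa agreement coefficient between expert raters. A sketch of the unweighted Cohen's kappa for two raters (toy ratings, not the study's panel data; note the study cites the 1968 variant, which is the weighted generalization):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa: (observed agreement - chance agreement) / (1 - chance)."""
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    p_chance = sum(count_a[c] * count_b[c] for c in count_a) / n ** 2
    return (p_obs - p_chance) / (1 - p_chance)

# Toy panel ratings of 8 module items as "valid" (1) or "not valid" (0).
kappa = cohens_kappa([1, 1, 1, 0, 1, 0, 1, 1], [1, 1, 1, 0, 0, 0, 1, 1])
```

Kappa discounts the agreement expected by chance, which is why it is preferred over raw percent agreement for validity panels.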

  11. DA-6034 Induces [Ca(2+)]i Increase in Epithelial Cells.

    PubMed

    Yang, Yu-Mi; Park, Soonhong; Ji, Hyewon; Kim, Tae-Im; Kim, Eung Kweon; Kang, Kyung Koo; Shin, Dong Min

    2014-04-01

    DA-6034, a derivative of the flavonoid eupatilin, has shown potent protective effects on the gastric mucosa and induces increases in fluid and glycoprotein secretion in human and rat corneal and conjunctival cells, suggesting that it might be considered as a drug for the treatment of dry eye. However, whether DA-6034 induces Ca(2+) signaling, and its underlying mechanism in epithelial cells, is not known. In the present study, we investigated the mechanisms of action of DA-6034 in the Ca(2+) signaling pathways of epithelial cells (conjunctival and corneal cells) from human donor eyes and mouse salivary gland epithelial cells. DA-6034 activated Ca(2+)-activated Cl(-) channels (CaCCs) and increased intracellular calcium concentrations ([Ca(2+)]i) in primary cultured human conjunctival cells. DA-6034 also increased [Ca(2+)]i in mouse salivary gland cells and human corneal epithelial cells. The DA-6034-induced [Ca(2+)]i increase depended on Ca(2+) entry from the extracellular space and Ca(2+) release from internal Ca(2+) stores. Interestingly, these effects of DA-6034 were related to ryanodine receptors (RyRs) but not to the phospholipase C/inositol 1,4,5-trisphosphate (IP3) pathway or lysosomal Ca(2+) stores. These results suggest that DA-6034 induces Ca(2+) signaling via extracellular Ca(2+) entry and RyR-sensitive Ca(2+) release from internal Ca(2+) stores in epithelial cells.

  12. Patients and Medical Statistics

    PubMed Central

    Woloshin, Steven; Schwartz, Lisa M; Welch, H Gilbert

    2005-01-01

    BACKGROUND People are increasingly presented with medical statistics. There are no existing measures to assess their level of interest or confidence in using medical statistics. OBJECTIVE To develop 2 new measures, the STAT-interest and STAT-confidence scales, and assess their reliability and validity. DESIGN Survey with retest after approximately 2 weeks. SUBJECTS Two hundred and twenty-four people were recruited from advertisements in local newspapers, an outpatient clinic waiting area, and a hospital open house. MEASURES We developed and revised 5 items on interest in medical statistics and 3 on confidence understanding statistics. RESULTS Study participants were mostly college graduates (52%); 25% had a high school education or less. The mean age was 53 (range 20 to 84) years. Most paid attention to medical statistics (6% paid no attention). The mean (SD) STAT-interest score was 68 (17) and ranged from 15 to 100. Confidence in using statistics was also high: the mean (SD) STAT-confidence score was 65 (19) and ranged from 11 to 100. STAT-interest and STAT-confidence scores were moderately correlated (r=.36, P<.001). Both scales demonstrated good test–retest repeatability (r=.60, .62, respectively), internal consistency reliability (Cronbach's α=0.70 and 0.78), and usability (individual item nonresponse ranged from 0% to 1.3%). Scale scores correlated only weakly with scores on a medical data interpretation test (r=.15 and .26, respectively). CONCLUSION The STAT-interest and STAT-confidence scales are usable and reliable. Interest and confidence were only weakly related to the ability to actually use data. PMID:16307623
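
The internal-consistency figures above (Cronbach's α = 0.70 and 0.78) follow the standard formula α = k/(k−1) · (1 − Σ item variances / variance of total scores). A minimal sketch of that computation (toy scores, not the study's data):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Toy matrix: 4 respondents x 3 items answering consistently -> alpha near 1.
toy = [[4, 5, 4], [2, 2, 3], [5, 5, 5], [1, 2, 1]]
alpha = cronbach_alpha(toy)
```

When items move together across respondents the total-score variance dominates the item variances and α approaches 1; uncorrelated items drive it toward 0.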

  13. Quality control analysis : part I : asphaltic concrete.

    DOT National Transportation Integrated Search

    1964-11-01

    This report deals with the statistical evaluation of results from several hot-mix plants to determine the pattern of variability with respect to bituminous hot-mix characteristics. : Individual test results when subjected to frequency distribution i...

  14. Comparative study of nondoped and Eu-doped SrI2 scintillator

    NASA Astrophysics Data System (ADS)

    Yanagida, Takayuki; Koshimizu, Masanori; Okada, Go; Kojima, Takahiro; Osada, Junya; Kawaguchi, Noriaki

    2016-11-01

    Optical and scintillation properties of nondoped and Eu 3%-doped SrI2 crystals grown by the vertical Bridgman method were investigated. The Eu-doped crystal showed an intense single-band emission at 430 nm due to the Eu2+ 5d-4f transitions in both photoluminescence and scintillation, while the nondoped crystal had a complex spectral shape. The latter emission consists mainly of four bands: 360 nm, 540 nm, 410 nm and 430 nm. The origins of the 360 nm and 540 nm bands were self-trapped excitons and an unexpected impurity, respectively. The 410 and 430 nm lines were ascribed to F centers at different I sites. Under 137Cs γ-ray irradiation, both crystals showed a clear photoabsorption peak. The scintillation light yields of the nondoped and Eu-doped SrI2 were 33,000 ph/MeV and 82,000 ph/MeV, respectively. The energy resolution at 662 keV of the Eu-doped SrI2 was 4%, while that of the nondoped SrI2 was 8%.

  15. Construction Norms and Regulations. Part III. Section I. Chapter 1. Part II. Section I. Chapter 2 (Stroitel’nyye Normy i Pravila. Chast’ III. Razdel I. Glava 1. Chast’ II. Razdel I. Glava 2),

    DTIC Science & Technology

    1981-06-26

    ...of the Ministry of Inland Water Transportation of the RSFSR with participation of the Central Scientific Research Institute and the Construction... "Reinforced-Concrete Articles. General Indicators" I-V.5.2-62, "Reinforced-Concrete Articles for Structures" and the regulations of this chapter which supplement them... Department of Scientific and Technical Literature at SOYUZKNIGA. Construction Norms and Regulations. I. Pal'chikov, S. P. Antonov, Editors.

  16. A comparison of different statistical methods analyzing hypoglycemia data using bootstrap simulations.

    PubMed

    Jiang, Honghua; Ni, Xiao; Huster, William; Heilmann, Cory

    2015-01-01

    Hypoglycemia has long been recognized as a major barrier to achieving normoglycemia with intensive diabetic therapies, and it is a common safety concern for patients with diabetes. It is therefore important to apply appropriate statistical methods when analyzing hypoglycemia data. Here, we carried out bootstrap simulations to investigate the performance of four commonly used statistical models (Poisson, negative binomial, analysis of covariance [ANCOVA], and rank ANCOVA) based on data from a diabetes clinical trial. The zero-inflated Poisson (ZIP) and zero-inflated negative binomial (ZINB) models were also evaluated. Simulation results showed that the Poisson model inflated type I error, while the negative binomial model was overly conservative. After adjusting for dispersion, however, both the Poisson and negative binomial models yielded only slightly inflated type I errors, close to the nominal level, with reasonable power. The ANCOVA model provided reasonable control of type I error. The rank ANCOVA model was associated with the greatest power while also reasonably controlling type I error. Inflated type I error was observed with the ZIP and ZINB models.
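    The Poisson model's type I error inflation stems from overdispersion: hypoglycemia counts typically have variance well above their mean, violating the Poisson equal-mean-variance assumption. A minimal sketch (illustrative parameters, not the trial's data) that generates overdispersed counts via a Poisson-gamma mixture, i.e. a negative binomial:

```python
import math
import random
import statistics

random.seed(42)

def simulate_counts(n, mean_rate, dispersion):
    """Overdispersed event counts via a Poisson-gamma mixture
    (equivalent to a negative binomial): each subject gets a
    gamma-distributed rate, then a Poisson count at that rate."""
    counts = []
    for _ in range(n):
        rate = random.gammavariate(dispersion, mean_rate / dispersion)
        # Poisson draw by CDF inversion (adequate for small rates)
        k, u = 0, random.random()
        term = math.exp(-rate)
        cum = term
        while u > cum:
            k += 1
            term *= rate / k
            cum += term
        counts.append(k)
    return counts

counts = simulate_counts(2000, mean_rate=3.0, dispersion=0.8)
# A Poisson model assumes variance == mean; here the variance is
# several times the mean, which is what inflates Poisson type I error.
print(round(statistics.mean(counts), 2), round(statistics.variance(counts), 2))
```

    Fitting a dispersion parameter (quasi-Poisson or negative binomial) absorbs the extra variance, which is why the dispersion-adjusted models in the abstract behave close to nominal.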

  17. Statistics and Physical Oceanography

    DTIC Science & Technology

    1993-01-01

    ...1987; Institute of Mathematical Statistics, 1988; NRC, 1990a; see also Goel et al., 1990; Gnanadesikan, 1990; Hoadley and Kettenring, 1990), together... Fukumori, I., J. Benveniste, C. Wunsch, and D. B. Haidvogel. 1993. Assimilation of sea surface... Gnanadesikan, R. 1990. Looking ahead: Cross...

  18. Loop Braiding Statistics and Interacting Fermionic Symmetry-Protected Topological Phases in Three Dimensions

    NASA Astrophysics Data System (ADS)

    Cheng, Meng; Tantivasadakarn, Nathanan; Wang, Chenjie

    2018-01-01

    We study Abelian braiding statistics of loop excitations in three-dimensional gauge theories with fermionic particles and the closely related problem of classifying 3D fermionic symmetry-protected topological (FSPT) phases with unitary symmetries. It is known that the two problems are related by turning FSPT phases into gauge theories through gauging the global symmetry of the former. We show that there exist certain types of Abelian loop braiding statistics that are allowed only in the presence of fermionic particles, which correspond to 3D "intrinsic" FSPT phases, i.e., those that do not stem from bosonic SPT phases. While such intrinsic FSPT phases are ubiquitous in 2D systems and in 3D systems with antiunitary symmetries, their existence in 3D systems with unitary symmetries was not confirmed previously due to the fact that strong interaction is necessary to realize them. We show that the simplest unitary symmetry to support 3D intrinsic FSPT phases is Z2×Z4. To establish the results, we first derive a complete set of physical constraints on Abelian loop braiding statistics. Solving the constraints, we obtain all possible Abelian loop braiding statistics in 3D gauge theories, including those that correspond to intrinsic FSPT phases. Then, we construct exactly soluble state-sum models to realize the loop braiding statistics. These state-sum models generalize the well-known Crane-Yetter and Dijkgraaf-Witten models.

  19. Vibrational relaxation of I2 in complexing solvents: The role of solvent-solute attractive forces

    NASA Astrophysics Data System (ADS)

    Shiang, Joseph J.; Liu, Hongjun; Sension, Roseanne J.

    1998-12-01

    Femtosecond transient absorption studies of I2-arene complexes, with arene = hexamethylbenzene (HMB), mesitylene (MST), or m-xylene (mX), are used to investigate the effect of solvent-solute attractive forces upon the rate of vibrational relaxation in solution. Comparison of measurements on I2-MST complexes in neat mesitylene with I2-MST complexes diluted in carbon tetrachloride demonstrates that binary solvent-solute attractive forces control the rate of vibrational relaxation in this prototypical model of diatomic vibrational relaxation. The data obtained for different arenes demonstrate that the rate of I2 relaxation increases with the magnitude of the I2-arene attractive interaction: I2-HMB relaxes much faster than I2 in MST or mX. The results of these experiments are discussed in terms of both isolated binary collision and instantaneous normal mode models for vibrational relaxation.

  20. Variability-aware compact modeling and statistical circuit validation on SRAM test array

    NASA Astrophysics Data System (ADS)

    Qiao, Ying; Spanos, Costas J.

    2016-03-01

    Variability modeling at the compact transistor model level can enable statistically optimized designs in view of limitations imposed by the fabrication technology. In this work we propose a variability-aware compact model characterization methodology based on stepwise parameter selection. Transistor I-V measurements are obtained from a bit-transistor-accessible SRAM test array fabricated using a collaborating foundry's 28 nm FDSOI technology. Our in-house customized Monte Carlo simulation bench can incorporate these statistical compact models, and simulation results on SRAM writability performance closely match measurements in distribution estimation. Our proposed statistical compact model parameter extraction methodology also has the potential to predict non-Gaussian behavior in statistical circuit performance through mixtures of Gaussian distributions.
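    The closing idea, capturing non-Gaussian parameter statistics with mixtures of Gaussians, can be sketched by sampling a device parameter from a two-component mixture inside a Monte Carlo bench. The component weights, means, and sigmas below are invented for illustration, not values from the paper:

```python
import random
import statistics

random.seed(3)

def sample_vth(n):
    """Draw n threshold-voltage samples from a two-component
    Gaussian mixture: a simple stand-in for a non-Gaussian
    device-parameter distribution."""
    out = []
    for _ in range(n):
        if random.random() < 0.7:
            out.append(random.gauss(0.45, 0.02))  # main population
        else:
            out.append(random.gauss(0.50, 0.04))  # skewing tail population
    return out

vth = sample_vth(50_000)
# The mixture is right-skewed: the mean sits between the two
# component centers, weighted by the mixing proportions.
print(round(statistics.mean(vth), 3))
```

    Feeding such mixture draws into each Monte Carlo circuit-simulation trial is what lets the bench reproduce non-Gaussian tails in circuit-level metrics like SRAM writability.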

  1. Type-I interband cascade lasers near 3.2 μm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Yuchao; Li, Lu; Yang, Rui Q., E-mail: Rui.Q.Yang@ou.edu

    2015-01-26

    Interband cascade (IC) lasers have been demonstrated based on type-I InGaAsSb/AlAsSb quantum well (QW) active regions. These type-I IC lasers are composed of 6 cascade stages and InAs/AlSb superlattice cladding layers. In contrast to the use of quinary AlGaInAsSb barriers for the active region in previous type-I QW lasers, the type-I QW active region in each stage is sandwiched between a digitally graded multiple InAs/AlSb QW electron injector and a GaSb/AlSb QW hole injector. The fabricated type-I IC lasers were able to operate in continuous wave and pulsed modes at temperatures up to 306 and 365 K, respectively. The threshold current densities of broad-area lasers were around 300 A/cm{sup 2} at 300 K with a lasing wavelength near 3.2 μm. The implications and prospects of these initial results are discussed.

  2. Application of pedagogy reflective in statistical methods course and practicum statistical methods

    NASA Astrophysics Data System (ADS)

    Julie, Hongki

    2017-08-01

    The subjects Elementary Statistics, Statistical Methods, and Statistical Methods Practicum aim to equip students of Mathematics Education with descriptive and inferential statistics. Understanding descriptive and inferential statistics is important for students in the Mathematics Education Department, especially those whose final project involves quantitative research. In quantitative research, students are required to present and describe quantitative data in an appropriate manner, to draw conclusions from their data, and to relate the independent and dependent variables defined in their research. In practice, when students carried out final projects involving quantitative research, it was still not rare to find them making mistakes in drawing conclusions and in choosing the hypothesis-testing procedure; as a result, they reached incorrect conclusions. This is a fatal mistake for anyone doing quantitative research. The implementation of reflective pedagogy in the teaching and learning process of the Statistical Methods and Statistical Methods Practicum courses yielded the following results: 1. Twenty-two students passed the course and one student did not. 2. The most common grade was A, achieved by 18 students. 3. According to all students, they were able to develop their critical stance and build care for each other through the learning process in this course. 4. All students agreed that, through the learning process they underwent in the course, they could build care for one another.

  3. 40 CFR Appendix I to Subpart T - Sample Graphical Summary of NTE Emission Results

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Emission Results I Appendix I to Subpart T Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Appendix I to Subpart T—Sample Graphical Summary of NTE Emission Results The following figure shows an example of a graphical summary of NTE emission results: ER14JN05.002 ...

  4. A statistical mechanical model of economics

    NASA Astrophysics Data System (ADS)

    Lubbers, Nicholas Edward Williams

    Statistical mechanics pursues low-dimensional descriptions of systems with a very large number of degrees of freedom. I explore this theme in two contexts. The main body of this dissertation explores and extends the Yard Sale Model (YSM) of economic transactions using a combination of simulations and theory. The YSM is a simple interacting model for wealth distributions which has the potential to explain the empirical observation of Pareto distributions of wealth. I develop the link between wealth condensation and the breakdown of ergodicity due to nonlinear diffusion effects which are analogous to the geometric random walk. Using this, I develop a deterministic effective theory of wealth transfer in the YSM that is useful for explaining many quantitative results. I introduce various forms of growth to the model, paying attention to the effect of growth on wealth condensation, inequality, and ergodicity. Arithmetic growth is found to partially break condensation, and geometric growth is found to completely break condensation. Further generalizations of geometric growth with growth inequality show that the system is divided into two phases by a tipping point in the inequality parameter. The tipping point marks the line between systems which are ergodic and systems which exhibit wealth condensation. I explore generalizations of the YSM transaction scheme to arbitrary betting functions to develop notions of universality in YSM-like models. I find that wealth condensation is universal to a large class of models which can be divided into two phases. The first exhibits slow, power-law condensation dynamics, and the second exhibits fast, finite-time condensation dynamics. I find that the YSM, which exhibits exponential dynamics, is the critical, self-similar model which marks the dividing line between the two phases. The final chapter develops a low-dimensional approach to materials microstructure quantification. Modern materials design harnesses complex
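    The YSM's condensation effect is easy to reproduce in a few lines. The sketch below is a minimal fair-coin version without growth; the agent count, step count, and stake fraction are arbitrary choices, not values from the dissertation:

```python
import random

random.seed(1)

def yard_sale(n_agents=200, steps=200_000, frac=0.1):
    """Minimal Yard Sale Model: each transaction pairs two random
    agents, stakes a fixed fraction of the poorer agent's wealth,
    and a fair coin flip decides who wins the stake."""
    w = [1.0] * n_agents
    for _ in range(steps):
        i, j = random.sample(range(n_agents), 2)
        stake = frac * min(w[i], w[j])
        if random.random() < 0.5:
            w[i] += stake
            w[j] -= stake
        else:
            w[i] -= stake
            w[j] += stake
    return w

w = sorted(yard_sale(), reverse=True)
top_decile_share = sum(w[:20]) / sum(w)  # richest 10% of 200 agents
print(round(top_decile_share, 3))
```

    Because the stake is a fraction of the poorer agent's wealth, the dynamics are multiplicative, the geometric-random-walk analogy above; with no growth term, wealth steadily concentrates in a shrinking minority.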

  5. Cup Implant Planning Based on 2-D/3-D Radiographic Pelvis Reconstruction-First Clinical Results.

    PubMed

    Schumann, Steffen; Sato, Yoshinobu; Nakanishi, Yuki; Yokota, Futoshi; Takao, Masaki; Sugano, Nobuhiko; Zheng, Guoyan

    2015-11-01

    In the following, we will present a newly developed X-ray calibration phantom and its integration for 2-D/3-D pelvis reconstruction and subsequent automatic cup planning. Two different planning strategies were applied and evaluated with clinical data. Two different cup planning methods were investigated: The first planning strategy is based on a combined pelvis and cup statistical atlas. Thereby, the pelvis part of the combined atlas is matched to the reconstructed pelvis model, resulting in an optimized cup planning. The second planning strategy analyzes the morphology of the reconstructed pelvis model to determine the best fitting cup implant. The first planning strategy was compared to 3-D CT-based planning. Digitally reconstructed radiographs of THA patients with differently severe pathologies were used to evaluate the accuracy of predicting the cup size and position. Within a discrepancy of one cup size, the size was correctly identified in 100% of the cases for Crowe type I datasets and in 77.8% of the cases for Crowe type II, III, and IV datasets. The second planning strategy was analyzed with respect to the eventually implanted cup size. In seven patients, the estimated cup diameter was correct within one cup size, while the estimation for the remaining five patients differed by two cup sizes. While both planning strategies showed the same prediction rate with a discrepancy of one cup size (87.5%), the prediction of the exact cup size was increased for the statistical atlas-based strategy (56%) in contrast to the anatomically driven approach (37.5%). The proposed approach demonstrated the clinical validity of using 2-D/3-D reconstruction technique for cup planning.

  6. Lexical statistics of competition in L2 versus L1 listening

    NASA Astrophysics Data System (ADS)

    Cutler, Anne

    2005-09-01

    Spoken-word recognition involves multiple activation of alternative word candidates and competition between these alternatives. Phonemic confusions in L2 listening increase the number of potentially active words, thus slowing word recognition by adding competitors. This study used a 70,000-word English lexicon backed by frequency statistics from a 17,900,000-word corpus to assess the competition increase resulting from two representative phonemic confusions, one vocalic (ae/E) and one consonantal (r/l), in L2 versus L1 listening. The first analysis involved word embedding. Embedded words (cat in cattle, rib in ribbon) cause competition, which phonemic confusion can increase (cat in kettle, rib in liberty). The average increase in the number of embedded words was 59.6 and 48.3 for the two confusions, respectively. A second analysis concerned temporary ambiguity. Even when no embeddings are present, multiple alternatives are possible: para- can become parrot, paradise, etc., but also pallet, palace given /r/-/l/ confusion. Phoneme confusions (vowel or consonant) in first or second position in the word approximately doubled the number of activated candidates; confusions later in the word increased activation by on average 53 and 42. Overall, these phonemic confusions significantly increase competition for L2 compared with L1 listeners.
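    The embedding analysis can be sketched with plain strings as a toy orthographic stand-in for the study's phonemic lexicon. The lexicon below is invented, apart from reusing the abstract's rib/liberty example:

```python
def embedded_words(carrier, lexicon):
    """Lexicon words occurring as contiguous substrings of the
    carrier word: the embedding competitors."""
    return {w for w in lexicon if w != carrier and w in carrier}

def with_confusion(carrier, lexicon, a, b):
    """Embeddings that become available when segments a and b are
    confusable: also search carrier variants with a/b swapped."""
    variants = {carrier, carrier.replace(a, b), carrier.replace(b, a)}
    found = set()
    for v in variants:
        found |= embedded_words(v, lexicon)
    return found

# Toy orthographic lexicon (the study used a 70,000-word phonemic one).
lexicon = {"rib", "be", "bet", "it"}
print(sorted(embedded_words("ribbon", lexicon)))             # ['rib']
print(sorted(with_confusion("liberty", lexicon, "r", "l")))  # ['be', 'rib']
```

    The r/l swap makes "rib" an embedding competitor of "liberty" that does not exist for an L1 listener, which is exactly the competition increase the counts above quantify.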

  7. 40 CFR Appendix I to Subpart T - Sample Graphical Summary of NTE Emission Results

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Emission Results I Appendix I to Subpart T Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... (CONTINUED) Manufacturer-Run In-Use Testing Program for Heavy-Duty Diesel Engines Pt. 86, Subpt. T, App. I Appendix I to Subpart T—Sample Graphical Summary of NTE Emission Results The following figure shows an...

  8. 40 CFR Appendix I to Subpart T - Sample Graphical Summary of NTE Emission Results

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Emission Results I Appendix I to Subpart T Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... (CONTINUED) Manufacturer-Run In-Use Testing Program for Heavy-Duty Diesel Engines Pt. 86, Subpt. T, App. I Appendix I to Subpart T—Sample Graphical Summary of NTE Emission Results The following figure shows an...

  9. Statistical Package User’s Guide.

    DTIC Science & Technology

    1980-08-01

    C. STACH: Nonparametric Descriptive Statistics... D. CHIRA: Coefficient of Concordance... Test Data: This program was tested using data from John Neter and William Wasserman, Applied Linear Statistical Models: Regression... length of data file; new file name (not same as raw data file)... Printout as optioned for only. Comments: Ranked data are used for program CHIRA.

  10. Active Learning with Statistical Models.

    DTIC Science & Technology

    1995-01-01

    Active Learning with Statistical Models. David A. Cohn, Zoubin Ghahramani, and Michael I. Jordan. MIT Artificial Intelligence Laboratory, Center for Biological and Computational Learning, Department of Brain and Cognitive Sciences. A.I. Memo No. 1522, C.B.C.L. Paper No. 110, January 9, 1995. Grants: ASC-9217041, NSF CDA-9309300. Subject terms: AI, MIT, artificial intelligence, active learning, queries, locally weighted regression, LOESS, mixtures of Gaussians.

  11. Statistics Clinic

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  12. Statistical microeconomics and commodity prices: theory and empirical results.

    PubMed

    Baaquie, Belal E

    2016-01-13

    A review is made of the statistical generalization of microeconomics by Baaquie (Baaquie 2013 Phys. A 392, 4400-4416. (doi:10.1016/j.physa.2013.05.008)), where the market price of every traded commodity, at each instant of time, is considered to be an independent random variable. The dynamics of commodity market prices is given by the unequal time correlation function and is modelled by the Feynman path integral based on an action functional. The correlation functions of the model are defined using the path integral. The existence of the action functional for commodity prices that was postulated to exist in Baaquie (Baaquie 2013 Phys. A 392, 4400-4416. (doi:10.1016/j.physa.2013.05.008)) has been empirically ascertained in Baaquie et al. (Baaquie et al. 2015 Phys. A 428, 19-37. (doi:10.1016/j.physa.2015.02.030)). The model's action functionals for different commodities have been empirically determined and calibrated against the unequal time correlation functions of market commodity prices using a perturbation expansion (Baaquie et al. 2015 Phys. A 428, 19-37. (doi:10.1016/j.physa.2015.02.030)). Nine commodities drawn from the energy, metal and grain sectors are empirically studied, and their auto-correlation for up to 300 days is described by the model to an accuracy of R(2) > 0.90, using only six parameters. © 2015 The Author(s).

  13. Cyclopentadienylniobium and -molybdenum phosphorodithioate complexes. X-ray crystal structures of CpNbCl sub 3 (S sub 2 P(OPr sup i ) sub 2 ), CpNbCl(. mu. -Cl) sub 2 Nb(S sub 2 P(OPr sup i ) sub 2 )Cp, and cis-Cp prime Mo(CO) sub 2 (S sub 2 P(OPr sup i ) sub 2 )

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodward, S.; Riaz, U.; Curtis, M.D.

    1990-10-01

    Reaction of CpNbCl{sub 4} (Cp = {eta}-C{sub 5}H{sub 5}) with (Pr{sup i}O){sub 2}P(S)(SH) in the presence of NEt{sub 3} yields CpNbCl{sub 3}(S{sub 2}P(OPr{sup i}){sub 2}) (1). Reduction of 1 with Na/Hg affords the Nb-Nb-bonded complex CpNbCl({mu}-Cl){sub 2}Nb(S{sub 2}P(OR){sub 2})Cp (2). In refluxing toluene, (Pr{sup i}O){sub 2}P(S)(SH) with (Cp{prime}Mo(CO){sub 3}){sub 2} (Cp{prime} = {eta}-C{sub 5}H{sub 4}Me) gives cis-Cp{prime}Mo(CO){sub 2}(S{sub 2}P(OPr{sup i}){sub 2}) (3). Oxidation of 3 with I{sub 2} affords Cp{prime}MoI{sub 2}(CO)(S{sub 2}P(OPr{sup i}){sub 2}) (4). The crystal structures of 1-3 are compared. For 1, triclinic, P{bar 1}, a = 7.122 (3) {angstrom}, b = 11.365 (4) {angstrom}, c = 12.532 (4) {angstrom}, {alpha} = 77.38 (3){degree}, {beta} = 89.08 (3){degree}, {gamma} = 72.87 (3){degree}, V = 944.5 (8) {angstrom}{sup 3}. For 2, triclinic, P{bar 1}, a = 7.251 (3) {angstrom}, b = 12.386 (5) {angstrom}, c = 13.988 (5) {angstrom}, {alpha} = 102.66 (3){degree}, {beta} = 103.56 (3){degree}, {gamma} = 94.66 (3){degree}, V = 1180.0 (8) {angstrom}{sup 3}, Z = 2. For 3, orthorhombic, Pbca, a = 12.703 (3) {angstrom}, b = 16.707 (4) {angstrom}, c = 18.398 (4) {angstrom}, V = 3904.4 (17) {angstrom}{sup 3}, Z = 8.

  14. Reasoning about Informal Statistical Inference: One Statistician's View

    ERIC Educational Resources Information Center

    Rossman, Allan J.

    2008-01-01

    This paper identifies key concepts and issues associated with the reasoning of informal statistical inference. I focus on key ideas of inference that I think all students should learn, including at secondary level as well as tertiary. I argue that a fundamental component of inference is to go beyond the data at hand, and I propose that statistical…

  15. EVALUATION OF THE I-STAT PORTABLE CLINICAL ANALYZER FOR MEASUREMENT OF IONIZED CALCIUM AND SELECTED BLOOD CHEMISTRY VALUES IN ASIAN ELEPHANTS (ELEPHAS MAXIMUS).

    PubMed

    Tarbert, Danielle K; Behling-Kelly, Erica; Priest, Heather; Childs-Sanford, Sara

    2017-06-01

    The i-STAT® portable clinical analyzer (PCA) provides patient-side results for hematologic, biochemical, and blood gas values when immediate results are desired. This analyzer is commonly used in nondomestic animals; however, validation of this method in comparison with traditional benchtop methods should be performed for each species. In this study, the i-STAT PCA was compared with the Radiometer ABL 800 Flex benchtop analyzer using 24 heparinized whole blood samples obtained from healthy E. maximus. In addition, the effect of sample storage was evaluated on the i-STAT PCA. Analytes evaluated were hydrogen ion concentration (pH), glucose, potassium (K+), sodium (Na+), bicarbonate (HCO3-), total carbon dioxide (TCO2), partial pressure of carbon dioxide (PCO2), and ionized calcium (iCa2+). Statistical analysis using correlation coefficients, Passing-Bablok regression analysis, and Bland-Altman plots found good agreement between results from samples run immediately after phlebotomy and 4 hr postsampling on the i-STAT PCA, with the exception of K+, which is known to change with sample storage. Comparison of the results from the two analyzers at 4 hr postsampling found very strong or strong correlation in all values except K+, with statistically significant bias in all values except glucose and PCO2. Despite bias, mean differences assessed via Bland-Altman plots were clinically acceptable for all analytes excluding K+. Within the reference range for iCa2+, the values obtained by the i-STAT PCA and Radiometer ABL 800 Flex were close; however, in light of the constant and proportionate biases detected, overestimation at higher values and underestimation at lower values of iCa2+ by the i-STAT PCA would be of potential concern. This study supports the use of the i-STAT PCA for the evaluation of these analytes, with the exception of K+, in the Asian elephant.

  16. Statistical regularities in the rank-citation profile of scientists.

    PubMed

    Petersen, Alexander M; Stanley, H Eugene; Succi, Sauro

    2011-01-01

    Recent science of science research shows that scientific impact measures for journals and individual articles have quantifiable regularities across both time and discipline. However, little is known about the scientific impact distribution at the scale of an individual scientist. We analyze the aggregate production and impact using the rank-citation profile c(i)(r) of 200 distinguished professors and 100 assistant professors. For the entire range of paper rank r, we fit each c(i)(r) to a common distribution function. Since two scientists with equivalent Hirsch h-index can have significantly different c(i)(r) profiles, our results demonstrate the utility of the β(i) scaling parameter in conjunction with h(i) for quantifying individual publication impact. We show that the total number of citations C(i) tallied from a scientist's N(i) papers scales as [Formula: see text]. Such statistical regularities in the input-output patterns of scientists can be used as benchmarks for theoretical models of career progress.
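    The observation that two scientists with the same h-index can have very different rank-citation profiles is easy to make concrete. A minimal h-index implementation with two invented citation profiles:

```python
def h_index(citations):
    """Hirsch h-index: the largest h such that the scientist has
    h papers with at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(ranked, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Two hypothetical profiles with the same h-index but very different
# tails, illustrating why h alone does not fix the full c_i(r) curve.
a = [25, 20, 15, 10, 5, 5, 4, 1]
b = [100, 60, 30, 10, 5, 2, 1, 0]
print(h_index(a), h_index(b), sum(a), sum(b))  # 5 5 85 208
```

    Both profiles give h = 5, yet total citations differ by more than a factor of two, which is the gap the β_i scaling parameter above is meant to capture.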

  17. Willingness to Share Research Data Is Related to the Strength of the Evidence and the Quality of Reporting of Statistical Results

    PubMed Central

    Wicherts, Jelte M.; Bakker, Marjan; Molenaar, Dylan

    2011-01-01

    Background The widespread reluctance to share published research data is often hypothesized to be due to the authors' fear that reanalysis may expose errors in their work or may produce conclusions that contradict their own. However, these hypotheses have not previously been studied systematically. Methods and Findings We related the reluctance to share research data for reanalysis to 1148 statistically significant results reported in 49 papers published in two major psychology journals. We found the reluctance to share data to be associated with weaker evidence (against the null hypothesis of no effect) and a higher prevalence of apparent errors in the reporting of statistical results. The unwillingness to share data was particularly clear when reporting errors had a bearing on statistical significance. Conclusions Our findings on the basis of psychological papers suggest that statistical results are particularly hard to verify when reanalysis is more likely to lead to contrasting conclusions. This highlights the importance of establishing mandatory data archiving policies. PMID:22073203

  18. Statistics & Input-Output Measures for School Libraries in Colorado, 2002.

    ERIC Educational Resources Information Center

    Colorado State Library, Denver.

    This document presents statistics and input-output measures for K-12 school libraries in Colorado for 2002. Data are presented by type and size of school, i.e., high schools (six categories ranging from 2,000 and over to under 300), junior high/middle schools (five categories ranging from 1,000-1,999 to under 300), elementary schools (four…

  19. Statistical analysis and interpretation of prenatal diagnostic imaging studies, Part 2: descriptive and inferential statistical methods.

    PubMed

    Tuuli, Methodius G; Odibo, Anthony O

    2011-08-01

    The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.

  20. THE POSTERIOR DISTRIBUTION OF sin(i) VALUES FOR EXOPLANETS WITH M{sub T} sin(i) DETERMINED FROM RADIAL VELOCITY DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ho, Shirley; Turner, Edwin L., E-mail: cwho@lbl.gov

    2011-09-20

    Radial velocity (RV) observations of an exoplanet system giving a value of M{sub T} sin(i) condition (i.e., give information about) not only the planet's true mass M{sub T} but also the value of sin(i) for that system (where i is the orbital inclination angle). Thus, the value of sin(i) for a system with any particular observed value of M{sub T} sin(i) cannot be assumed to be drawn randomly from a distribution corresponding to an isotropic i distribution, i.e., the presumptive prior distribution. Rather, the posterior distribution from which it is drawn depends on the intrinsic distribution of M{sub T} for the exoplanet population being studied. We give a simple Bayesian derivation of this relationship and apply it to several 'toy models' for the intrinsic distribution of M{sub T}, on which we have significant information from available RV data in some mass ranges but little or none in others. The results show that the effect can be an important one. For example, even for simple power-law distributions of M{sub T}, the median value of sin(i) in an observed RV sample can vary between 0.860 and 0.023 (as compared to the 0.866 value for an isotropic i distribution) for indices of the power law in the range between -2 and +1, respectively. Over the same range of indices, the 95% confidence interval on M{sub T} varies from 1.0001-2.405 ({alpha} = -2) to 1.13-94.34 ({alpha} = +1) times larger than M{sub T} sin(i) due to sin(i) uncertainty alone. More complex, but still simple and plausible, distributions of M{sub T} yield more complicated and somewhat unintuitive posterior sin(i) distributions. In particular, if the M{sub T} distribution contains any characteristic mass scale M{sub c}, the posterior sin(i) distribution will depend on the ratio of M{sub T} sin(i) to M{sub c}, often in a non-trivial way. Our qualitative conclusion is that RV studies of exoplanets, both individual objects and statistical samples, should regard the sin(i) factor as more than a
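    The Bayesian point in this abstract can be illustrated with a toy Monte Carlo. This is not the paper's calculation (which conditions on the observed M_T sin(i) sample and uses calibrated mass ranges); the observed value, mass cutoffs, and power-law form below are arbitrary illustrative assumptions:

```python
import math
import random

random.seed(7)

def posterior_median_sin_i(m_obs, alpha, m_min=0.1, m_max=100.0, n=200_000):
    """Monte Carlo sketch: draw inclinations isotropically (cos i
    uniform on [0, 1]), then weight each draw by the prior density of
    the true mass M_T = m_obs/sin(i) it implies, p(M_T) ~ M_T**alpha,
    times the 1/sin(i) Jacobian from M_T*sin(i) to M_T. Returns the
    weighted median of sin(i)."""
    sins, weights = [], []
    for _ in range(n):
        cos_i = random.random()
        s = math.sqrt(1.0 - cos_i * cos_i)
        m_true = m_obs / s
        if m_min <= m_true <= m_max:
            sins.append(s)
            weights.append(m_true ** alpha / s)
    order = sorted(range(len(sins)), key=lambda k: sins[k])
    half, acc = sum(weights) / 2.0, 0.0
    for k in order:
        acc += weights[k]
        if acc >= half:
            return sins[k]

# A steeply falling mass function keeps the posterior median sin(i)
# high; a rising one drags it far below the isotropic median 0.866.
print(posterior_median_sin_i(1.0, alpha=-2), posterior_median_sin_i(1.0, alpha=1))
```

    The qualitative behavior matches the abstract: the more probability the mass prior puts on large M_T, the more a given M_T sin(i) measurement favors small sin(i).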

  1. Statistical description of non-Gaussian samples in the F2 layer of the ionosphere during heliogeophysical disturbances

    NASA Astrophysics Data System (ADS)

    Sergeenko, N. P.

    2017-11-01

    An adequate statistical method should be developed in order to predict probabilistically the range of ionospheric parameters; this problem is solved in this paper. Time series of the critical frequency of the F2 layer, foF2(t), were subjected to statistical processing. For the obtained samples {δfoF2}, statistical distributions and invariants up to the fourth order are calculated. The analysis shows that during disturbances the distributions differ from the Gaussian law: at sufficiently small probability levels, there are arbitrarily large deviations from the model of a normal process. Therefore, an attempt is made to describe the statistical samples {δfoF2} with a Poisson model. For the studied samples, an exponential characteristic function is selected under the assumption that the time series are a superposition of some deterministic and random processes. Using the Fourier transform, the characteristic function is transformed into a nonholomorphic, excessive-asymmetric probability-density function. The statistical distributions of the samples {δfoF2} calculated for the disturbed periods are compared with the obtained model distribution function. According to the Kolmogorov criterion, the probabilities of coincidence of the a posteriori distributions with the theoretical ones are P ≈ 0.7-0.9. The analysis supports the applicability of a model based on a Poisson random process for the statistical description and probabilistic estimation of the variations {δfoF2} during heliogeophysical disturbances.

  2. SRS 2010 Vegetation Inventory GeoStatistical Mapping Results for Custom Reaction Intensity and Total Dead Fuels.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, Lloyd A.; Parresol, Bernard

    This report of the geostatistical analysis results for the fire fuels response variables, custom reaction intensity and total dead fuels, is part of the SRS 2010 vegetation inventory project. For a detailed description of the project, theory, and background, including sample design, methods, and results, please refer to the USDA Forest Service Savannah River Site internal report "SRS 2010 Vegetation Inventory GeoStatistical Mapping Report" (Edwards & Parresol 2013).

  3. Statistical learning and probabilistic prediction in music cognition: mechanisms of stylistic enculturation.

    PubMed

    Pearce, Marcus T

    2018-05-11

    Music perception depends on internal psychological models derived through exposure to a musical culture. It is hypothesized that this musical enculturation depends on two cognitive processes: (1) statistical learning, in which listeners acquire internal cognitive models of statistical regularities present in the music to which they are exposed; and (2) probabilistic prediction based on these learned models that enables listeners to organize and process their mental representations of music. To corroborate these hypotheses, I review research that uses a computational model of probabilistic prediction based on statistical learning (the information dynamics of music (IDyOM) model) to simulate data from empirical studies of human listeners. The results show that a broad range of psychological processes involved in music perception (expectation, emotion, memory, similarity, segmentation, and meter) can be understood in terms of a single, underlying process of probabilistic prediction using learned statistical models. Furthermore, IDyOM simulations of listeners from different musical cultures demonstrate that statistical learning can plausibly predict causal effects of differential cultural exposure to musical styles, providing a quantitative model of cultural distance. Understanding the neural basis of musical enculturation will benefit from close coordination between empirical neuroimaging and computational modeling of underlying mechanisms, as outlined here. © 2018 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.
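    The learn-then-predict loop described above can be sketched in miniature. This is not IDyOM itself (which mixes variable-order models over multiple viewpoints); a plain bigram model with add-one smoothing stands in for it, and the toy pitch corpus is an invented assumption:

    ```python
    from collections import defaultdict
    import math

    # Toy "musical culture": short pitch sequences the model is enculturated on.
    corpus = ["CDEFG", "CDECD", "EFGFE", "CDEFG"]

    # Statistical learning: count note-to-note transitions in the corpus.
    counts = defaultdict(lambda: defaultdict(int))
    for melody in corpus:
        for a, b in zip(melody, melody[1:]):
            counts[a][b] += 1

    def surprisal(melody):
        """Mean information content (-log2 P) of each note given its predecessor,
        with add-one smoothing over the pitch alphabet."""
        alphabet = set("".join(corpus))
        total = 0.0
        for a, b in zip(melody, melody[1:]):
            n_ab = counts[a][b] + 1
            n_a = sum(counts[a].values()) + len(alphabet)
            total += -math.log2(n_ab / n_a)
        return total / (len(melody) - 1)

    # Probabilistic prediction: familiar patterns carry less surprisal.
    print("familiar pattern:  ", round(surprisal("CDEFG"), 2), "bits/note")
    print("unfamiliar pattern:", round(surprisal("GECFD"), 2), "bits/note")
    ```

    The enculturated model assigns lower surprisal to style-consistent continuations, which is the quantity IDyOM uses to simulate expectation and, across corpora, cultural distance.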

  4. Statistical methods for quantitative mass spectrometry proteomic experiments with labeling.

    PubMed

    Oberg, Ann L; Mahoney, Douglas W

    2012-01-01

    Mass spectrometry utilizing labeling allows multiple specimens to be subjected to mass spectrometry simultaneously. As a result, between-experiment variability is reduced. Here we describe the use of fundamental concepts of statistical experimental design in the labeling framework in order to minimize variability and avoid biases. We demonstrate how to export data in the format that is most efficient for statistical analysis, how to assess the need for normalization, perform normalization, and check whether it worked. We describe how to build a model explaining the observed values and test for differential protein abundance, along with descriptive statistics and measures of reliability of the findings. Concepts are illustrated through three case studies utilizing the iTRAQ 4-plex labeling protocol.
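    The assess/normalize/check cycle mentioned above can be sketched with median centering, one common normalization for labeled channels. The abundance matrix, channel count, and bias magnitude below are illustrative assumptions, not data from the case studies:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Toy log2 abundance matrix: 100 proteins x 4 labeled channels, where one
    # channel carries a global loading offset.
    data = rng.normal(10.0, 1.0, size=(100, 4))
    data[:, 2] += 0.8  # systematic bias in channel 3

    # Assess: per-channel medians should agree if no normalization is needed.
    print("medians before:", np.round(np.median(data, axis=0), 2))

    # Normalize: median-center each channel to the overall median.
    norm = data - np.median(data, axis=0) + np.median(data)

    # Check: medians are now aligned across channels.
    print("medians after: ", np.round(np.median(norm, axis=0), 2))
    ```

    The same before/after check generalizes to boxplots or quantile comparisons when the bias is not a simple shift.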

  5. VizieR Online Data Catalog: NORAS II. I. First results (Bohringer+, 2017)

    NASA Astrophysics Data System (ADS)

    Bohringer, H.; Chon, G.; Retzlaff, J.; Trumper, J.; Meisenheimer, K.; Schartel, N.

    2017-08-01

    The NOrthern ROSAT All-Sky (NORAS) galaxy cluster survey project is based on the ROSAT All-Sky Survey (RASS; Trumper 1993Sci...260.1769T), which is the only full-sky survey conducted with an imaging X-ray telescope. We have already used RASS for the construction of the cluster catalogs of the NORAS I project. While NORAS I was, as a first step, focused on the identification of galaxy clusters among the RASS X-ray sources showing a significant extent, the complementary REFLEX I sample in the southern sky was strictly constructed as a flux-limited cluster sample. A major extension of the REFLEX I sample, which roughly doubles the number of clusters, REFLEX II (Bohringer et al. 2013, Cat. J/A+A/555/A30), was recently completed. It is by far the largest high-quality sample of X-ray-selected galaxy clusters. The NORAS II survey now reaches a flux limit of 1.8×10^-12 erg s^-1 cm^-2 in the 0.1-2.4 keV band. Redshifts have been obtained for all of the 860 clusters in the NORAS II catalog, except for 25 clusters for which observing campaigns are scheduled. Thus, with 3% of redshifts missing, we can already obtain a very good view of the properties of the NORAS II cluster sample and some first results. The NORAS II survey covers the sky region north of the equator outside the band of the Milky Way (|b_II| >= 20°). We also excise a region around the nearby Virgo cluster of galaxies that extends over several degrees on the sky, where the detection of background clusters is hampered by bright X-ray emission. This region is bounded in right ascension by R.A. = 185°-191.25° and in declination by decl. = 6°-15° (an area of ~53 deg²). With this excision, the survey area covers 4.18 steradians (13519 deg², a fraction of 32.7% of the sky). NORAS II is based on the RASS product RASS III (Voges et al. 1999, Cat. IX/10), which was also used for REFLEX II. The NORAS II survey was constructed in a way identical to REFLEX II, with a nominal flux limit of 1.8×10^-12 erg s^-1 cm^-2. (3 data files).

  6. Bayesian statistics and Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Koch, K. R.

    2018-03-01

    The Bayesian approach allows an intuitive way to derive the methods of statistics. Probability is defined as a measure of the plausibility of statements or propositions, and three rules are sufficient to obtain the laws of probability. If the statements refer to the numerical values of variables, the so-called random variables, univariate and multivariate distributions follow. They lead to point estimation, by which unknown quantities, i.e. unknown parameters, are computed from measurements. The unknown parameters are random variables; in traditional statistics, which is not founded on Bayes' theorem, they are fixed quantities. Bayesian statistics therefore recommends itself for Monte Carlo methods, which generate random variates from given distributions. Monte Carlo methods, of course, can also be applied in traditional statistics. The unknown parameters are introduced as functions of the measurements, and the Monte Carlo methods give the covariance matrix and the expectation of these functions. A confidence region is derived in which the unknown parameters are situated with a given probability. Following a method of traditional statistics, hypotheses are tested by determining whether a value for an unknown parameter lies inside or outside the confidence region. The error propagation of a random vector by Monte Carlo methods is presented as an application. If the random vector results from a nonlinearly transformed vector, its covariance matrix and its expectation follow from the Monte Carlo estimate. This saves a considerable number of derivatives to be computed and avoids errors of linearization, so the Monte Carlo method is efficient. If the functions of the measurements are given by a sum of two or more random vectors with different multivariate distributions, the resulting distribution is generally not known; the Monte Carlo methods are then needed to obtain the covariance matrix and the expectation of the sum.
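    The error-propagation application described above can be sketched numerically. The mean vector, covariance matrix, and nonlinear transform below are illustrative assumptions; the point is that the Monte Carlo moments agree with (and need no derivatives unlike) the Jacobian-based linearization:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Measurements: a random vector with an assumed mean and covariance.
    mu = np.array([2.0, 1.0])
    cov = np.array([[0.04, 0.01],
                    [0.01, 0.02]])

    # Nonlinear transform f(x, y) = [x*y, x/y].
    def f(v):
        x, y = v[..., 0], v[..., 1]
        return np.stack([x * y, x / y], axis=-1)

    # Monte Carlo propagation: sample, transform, take moments.
    samples = rng.multivariate_normal(mu, cov, size=200_000)
    z = f(samples)
    mc_mean = z.mean(axis=0)
    mc_cov = np.cov(z, rowvar=False)

    # First-order (linearized) propagation via the Jacobian at mu, for comparison.
    x0, y0 = mu
    J = np.array([[y0, x0],
                  [1 / y0, -x0 / y0**2]])
    lin_mean = f(mu)
    lin_cov = J @ cov @ J.T

    print("MC mean:", mc_mean, " linearized mean:", lin_mean)
    print("MC cov:\n", mc_cov, "\nlinearized cov:\n", lin_cov)
    ```

    For strongly nonlinear transforms the two answers diverge, and the Monte Carlo moments are the ones free of linearization error.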

  7. A statistical study of EMIC waves observed by Cluster: 2. Associated plasma conditions

    NASA Astrophysics Data System (ADS)

    Allen, R. C.; Zhang, J.-C.; Kistler, L. M.; Spence, H. E.; Lin, R.-L.; Klecker, B.; Dunlop, M. W.; André, M.; Jordanova, V. K.

    2016-07-01

    This is the second in a pair of papers discussing a statistical study of electromagnetic ion cyclotron (EMIC) waves detected during 10 years (2001-2010) of Cluster observations. In the first paper, an analysis of EMIC wave properties (i.e., wave power, polarization, normal angle, and wave propagation angle) is presented in both the magnetic latitude (MLAT)-distance as well as magnetic local time (MLT)-L frames. This paper focuses on the distribution of EMIC wave-associated plasma conditions as well as two EMIC wave generation proxies (the electron plasma frequency to gyrofrequency ratio proxy and the linear theory proxy) in these same frames. Based on the distributions of hot H+ anisotropy, electron and hot H+ density measurements, hot H+ parallel plasma beta, and the calculated wave generation proxies, three source regions of EMIC waves appear to exist: (1) the well-known overlap between cold plasmaspheric or plume populations with hot anisotropic ring current populations in the postnoon to dusk MLT region; (2) regions all along the dayside magnetosphere at high L shells related to dayside magnetospheric compression and drift shell splitting; and (3) off-equator regions possibly associated with the Shabansky orbits in the dayside magnetosphere.

  8. Meta-analysis of gene-level associations for rare variants based on single-variant statistics.

    PubMed

    Hu, Yi-Juan; Berndt, Sonja I; Gustafsson, Stefan; Ganna, Andrea; Hirschhorn, Joel; North, Kari E; Ingelsson, Erik; Lin, Dan-Yu

    2013-08-08

    Meta-analysis of genome-wide association studies (GWASs) has led to the discoveries of many common variants associated with complex human diseases. There is a growing recognition that identifying "causal" rare variants also requires large-scale meta-analysis. The fact that association tests with rare variants are performed at the gene level rather than at the variant level poses unprecedented challenges in the meta-analysis. First, different studies may adopt different gene-level tests, so the results are not compatible. Second, gene-level tests require multivariate statistics (i.e., components of the test statistic and their covariance matrix), which are difficult to obtain. To overcome these challenges, we propose to perform gene-level tests for rare variants by combining the results of single-variant analysis (i.e., p values of association tests and effect estimates) from participating studies. This simple strategy is possible because of an insight that multivariate statistics can be recovered from single-variant statistics, together with the correlation matrix of the single-variant test statistics, which can be estimated from one of the participating studies or from a publicly available database. We show both theoretically and numerically that the proposed meta-analysis approach provides accurate control of the type I error and is as powerful as joint analysis of individual participant data. This approach accommodates any disease phenotype and any study design and produces all commonly used gene-level tests. An application to the GWAS summary results of the Genetic Investigation of ANthropometric Traits (GIANT) consortium reveals rare and low-frequency variants associated with human height. The relevant software is freely available. Copyright © 2013 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
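    The key insight above, that a gene-level multivariate statistic can be recovered from single-variant statistics plus their correlation matrix, can be sketched with a simple quadratic-form test. The LD matrix, variant count, and replicate count are illustrative assumptions, and this generic chi-square combination stands in for the paper's full method:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)

    # Assumed correlation (LD) matrix among 3 variants.
    R = np.array([[1.0, 0.3, 0.1],
                  [0.3, 1.0, 0.2],
                  [0.1, 0.2, 1.0]])
    k = R.shape[0]
    Rinv = np.linalg.inv(R)

    # Under the null, single-variant z-statistics are ~ N(0, R).
    L = np.linalg.cholesky(R)
    n_rep = 50_000
    z = rng.standard_normal((n_rep, k)) @ L.T

    # Gene-level statistic recovered purely from single-variant z's and R:
    # Q = z' R^{-1} z ~ chi-square with k degrees of freedom under the null.
    Q = np.einsum("ij,jk,ik->i", z, Rinv, z)
    pvals = stats.chi2.sf(Q, df=k)

    type_I = np.mean(pvals < 0.05)
    print(f"empirical type I error at alpha = 0.05: {type_I:.4f}")
    ```

    The empirical type I error matches the nominal level, illustrating why an externally estimated correlation matrix suffices to reconstruct a calibrated multivariate test.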

  9. H I-to-H2 Transition Layers in the Star-forming Region W43

    NASA Astrophysics Data System (ADS)

    Bialy, Shmuel; Bihr, Simon; Beuther, Henrik; Henning, Thomas; Sternberg, Amiel

    2017-02-01

    The process of atomic-to-molecular (H I-to-H2) gas conversion is fundamental for molecular-cloud formation and star formation. 21 cm observations of the star-forming region W43 revealed extremely high H I column densities, of 120-180 M_⊙ pc^-2, a factor of 10-20 larger than predicted by H I-to-H2 transition theories. We analyze the observed H I with a theoretical model of the H I-to-H2 transition, and show that the discrepancy between theory and observation cannot be explained by the intense radiation in W43, nor be explained by variations of the assumed volume density or H2 formation rate coefficient. We show that the large observed H I columns are naturally explained by several (9-22) H I-to-H2 transition layers, superimposed along the sightlines of W43. We discuss other possible interpretations such as a non-steady-state scenario and inefficient dust absorption. The case of W43 suggests that H I thresholds reported in extragalactic observations are probably not associated with a single H I-to-H2 transition, but are rather a result of several transition layers (clouds) along the sightlines, beam-diluted with diffuse intercloud gas.
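    The layering argument reduces to simple arithmetic: if a single H I-to-H2 transition layer saturates at some column Σ_layer, the observed column requires roughly Σ_obs / Σ_layer superimposed layers. The per-layer value below is an assumed illustrative number, not the paper's model result:

    ```python
    # Back-of-the-envelope version of the multiple-transition-layer argument.
    sigma_layer = 8.0  # assumed HI column of one transition layer, M_sun / pc^2

    layers = {sigma_obs: sigma_obs / sigma_layer for sigma_obs in (120.0, 180.0)}
    for sigma_obs, n in layers.items():
        print(f"Sigma_obs = {sigma_obs:.0f} M_sun/pc^2  ->  ~{n:.0f} layers")
    ```

    With that assumed per-layer column, the observed 120-180 M_⊙ pc^-2 maps to of order 15-22 layers, the same order as the 9-22 quoted in the abstract.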

  10. Accounting for the differences in the structures and relative energies of the highly homoatomic np pi-np pi (n >= 3)-bonded S2I4^2+, the Se-I pi-bonded Se2I4^2+, and their higher-energy isomers by AIM, MO, NBO, and VB methodologies.

    PubMed

    Brownridge, Scott; Crawford, Margaret-Jane; Du, Hongbin; Harcourt, Richard D; Knapp, Carsten; Laitinen, Risto S; Passmore, Jack; Rautiainen, J Mikko; Suontamo, Reijo J; Valkonen, Jussi

    2007-02-05

    The bonding in the highly homoatomic np pi-np pi (n >= 3)-bonded S2I4^2+ (three sigma + two pi bonds), the Se-I pi-bonded Se2I4^2+ (four sigma + one pi bond), and their higher-energy isomers has been studied using modern DFT and ab initio calculations and theoretical analysis methods: atoms in molecules (AIM), molecular orbital (MO), natural bond orbital (NBO), and valence bond (VB) analyses, giving their relative energies, theoretical bond orders, and atomic charges. The aim of this work was to seek theory-based answers to four main questions: (1) Are the previously proposed simple pi*-pi* bonding models valid for S2I4^2+ and Se2I4^2+? (2) What accounts for the difference in the structures of S2I4^2+ and Se2I4^2+? (3) Why are the classically bonded isolobal P2I4 and As2I4 structures not adopted? (4) Is the high experimentally observed S-S bond order supported by theoretical bond orders, and how does it relate to high bond orders between other heavier main-group elements? The AIM analysis confirmed the high bond orders and established that the weak bonds observed in S2I4^2+ and Se2I4^2+ are real and that the bonding in these cations is covalent in nature. The full MO analysis confirmed that S2I4^2+ contains three sigma and two pi bonds, that the positive charge is essentially equally distributed over all atoms, that the bonding between S2 and two I2^+ units in S2I4^2+ is best described by two mutually perpendicular 4c2e pi*-pi* bonds, and that in Se2I4^2+, two SeI2^+ moieties are joined by a 6c2e pi*-pi* bond, both in agreement with previously suggested models. The VB treatment provided a complementary approach to the MO analysis and gave insight into how the formation of the weak bonds affects the other bonds. The NBO analysis and the calculated AIM charges showed that the minimization of the electrostatic repulsion between EI2^+ units (E = S, Se) and the delocalization of the positive charge are the main factors that explain why the nonclassical structures are favored for S2I4^2+ …

  11. REANALYSIS OF F-STATISTIC GRAVITATIONAL-WAVE SEARCHES WITH THE HIGHER CRITICISM STATISTIC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, M. F.; Melatos, A.; Delaigle, A.

    2013-04-01

    We propose a new method of gravitational-wave detection using a modified form of higher criticism, a statistical technique introduced by Donoho and Jin. Higher criticism is designed to detect a group of sparse, weak sources, none of which is strong enough to be reliably estimated or detected individually. We apply higher criticism as a second-pass method to synthetic F-statistic and C-statistic data for a monochromatic periodic source in a binary system and quantify the improvement relative to the first-pass methods. We find that higher criticism on C-statistic data is more sensitive by ~6% than the C-statistic alone under optimal conditions (i.e., binary orbit known exactly), and the relative advantage increases as the error in the orbital parameters increases. Higher criticism is robust even when the source is not monochromatic (e.g., phase-wandering in an accreting system). Applying higher criticism to a phase-wandering source over multiple time intervals gives a >~30% increase in detectability with few assumptions about the frequency evolution. By contrast, in all-sky searches for unknown periodic sources, which are dominated by the brightest source, second-pass higher criticism does not provide any benefits over a first-pass search.

  12. Statistical mechanics of few-particle systems: exact results for two useful models

    NASA Astrophysics Data System (ADS)

    Miranda, Enrique N.

    2017-11-01

    The statistical mechanics of small clusters (n ~ 10-50 elements) of harmonic oscillators and two-level systems is studied exactly, following the microcanonical, canonical and grand canonical formalisms. For clusters with several hundred particles, the results from the three formalisms coincide with those found in the thermodynamic limit. However, for clusters formed by a few tens of elements, the three ensembles yield different results. For a cluster with a few tens of harmonic oscillators, when the heat capacity per oscillator is evaluated within the canonical formalism, it reaches a limit value equal to k_B, as in the thermodynamic case, while within the microcanonical formalism the limit value is k_B(1 - 1/n). This difference could be measured experimentally. For a cluster with a few tens of two-level systems, the heat capacity evaluated within the canonical and microcanonical ensembles also presents differences that could be detected experimentally. Both the microcanonical and grand canonical formalisms show that the entropy is non-additive for systems this small, while the canonical ensemble reaches the opposite conclusion. These results suggest that the microcanonical ensemble is the most appropriate for dealing with systems of tens of particles.
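    The k_B(1 - 1/n) result for classical harmonic oscillators can be recovered numerically from the microcanonical definitions alone. This sketch assumes the classical density of states Ω(E) ~ E^(n-1), inverts T(E) by bisection, and differentiates E(T); it reproduces only the n-dependence discussed above, not the paper's full quantum treatment:

    ```python
    import numpy as np

    kB = 1.0  # units with the Boltzmann constant set to 1

    def microcanonical_C_per_osc(n, T=1.0, dT=1e-4):
        """Heat capacity per oscillator of n classical harmonic oscillators in the
        microcanonical ensemble: Omega(E) ~ E^(n-1) gives 1/T = dS/dE = (n-1)*kB/E;
        invert T(E) numerically and differentiate E(T)."""
        def E_of_T(temp):
            lo, hi = 1e-12, 1e12
            for _ in range(200):
                mid = 0.5 * (lo + hi)
                if (n - 1) * kB / mid > 1.0 / temp:  # E too small -> raise lower bound
                    lo = mid
                else:
                    hi = mid
            return mid
        return (E_of_T(T + dT) - E_of_T(T - dT)) / (2 * dT) / n

    for n in (10, 30, 300):
        c = microcanonical_C_per_osc(n)
        print(f"n = {n:>3}: C/(n*kB) = {c:.4f}   vs canonical 1, kB*(1 - 1/n) = {1 - 1/n:.4f}")
    ```

    The 1/n deficit relative to the canonical value k_B vanishes only as n grows, which is exactly the ensemble inequivalence the abstract highlights.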

  13. 124I-HuCC49deltaCH2 for TAG-72 antigen-directed positron emission tomography (PET) imaging of LS174T colon adenocarcinoma tumor implants in xenograft mice: preliminary results

    PubMed Central

    2010-01-01

    Background: 18F-fluorodeoxyglucose positron emission tomography (18F-FDG-PET) is widely used in diagnostic cancer imaging. However, the use of 18F-FDG in PET-based imaging is limited by its specificity and sensitivity. In contrast, anti-TAG (tumor-associated glycoprotein)-72 monoclonal antibodies are highly specific for binding to a variety of adenocarcinomas, including colorectal cancer. The aim of this preliminary study was to evaluate a complementarity-determining region (CDR)-grafted humanized CH2-domain-deleted anti-TAG-72 monoclonal antibody (HuCC49deltaCH2), radiolabeled with iodine-124 (124I), as an antigen-directed and cancer-specific targeting agent for PET-based imaging. Methods: HuCC49deltaCH2 was radiolabeled with 124I. Subcutaneous tumor implants of LS174T colon adenocarcinoma cells, which express TAG-72 antigen, were grown on athymic Nu/Nu nude mice as the xenograft model. Intravascular (i.v.) and intraperitoneal (i.p.) administration of 124I-HuCC49deltaCH2 was then evaluated in this xenograft mouse model at various time points from approximately 1 hour to 24 hours after injection using microPET imaging. This was compared to i.v. injection of 18F-FDG in the same xenograft mouse model using microPET imaging at 50 minutes after injection. Results: At approximately 1 hour after i.v. injection, 124I-HuCC49deltaCH2 was distributed within the systemic circulation, while at approximately 1 hour after i.p. injection, 124I-HuCC49deltaCH2 was distributed within the peritoneal cavity. At time points from 18 hours to 24 hours after i.v. and i.p. injection, 124I-HuCC49deltaCH2 demonstrated a significantly increased level of specific localization to LS174T tumor implants (p = 0.001) when compared to the 1 hour images. In contrast, at approximately 50 minutes after i.v. injection, 18F-FDG failed to demonstrate any increased level of specific localization to a LS174T tumor implant, but showed a propensity toward more nonspecific uptake within the heart and Harderian glands …

  14. An Online Course of Business Statistics: The Proportion of Successful Students

    ERIC Educational Resources Information Center

    Pena-Sanchez, Rolando

    2009-01-01

    This article describes the students' academic progress in an online course of business statistics through interactive software assignments and diverse educational homework, which help these students to build their own e-learning through basic competences, i.e., interpreting results and solving problems. Cross-tables were built for the categorical…

  15. Statistical Prediction of Sea Ice Concentration over Arctic

    NASA Astrophysics Data System (ADS)

    Kim, Jongho; Jeong, Jee-Hoon; Kim, Baek-Min

    2017-04-01

    In this study, a statistical method that predicts sea ice concentration (SIC) over the Arctic is developed. We first calculate the Season-reliant Empirical Orthogonal Functions (S-EOFs) of monthly Arctic SIC from Nimbus-7 SMMR and DMSP SSM/I-SSMIS passive microwave data, which contain the seasonal cycles (12 months long) of the dominant SIC anomaly patterns. The current SIC state index is then determined by projecting the observed SIC anomalies for the latest 12 months onto the S-EOFs. Assuming the current SIC anomalies follow the spatio-temporal evolution in the S-EOFs, we project the future (up to 12 months) SIC anomalies by multiplying the state index by the corresponding S-EOF and summing. The predictive skill is assessed by hindcast experiments initialized at all months for 1980-2010. When the skill of the statistical model is compared with that of NCEP CFS v2, the statistical model shows higher skill in predicting sea ice concentration and extent.
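    The EOF-and-project machinery above can be sketched with synthetic data. The field below (two known spatial patterns with annual and semiannual time coefficients) is an invented stand-in for the SIC anomalies, and plain EOFs via SVD stand in for the season-reliant S-EOFs:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic anomaly field: time x space, two patterns plus noise.
    t = np.arange(240)                     # 20 years of monthly data
    space = np.linspace(0, np.pi, 50)
    pattern1, pattern2 = np.sin(space), np.cos(space)
    pc1 = np.sin(2 * np.pi * t / 12)       # annual cycle
    pc2 = 0.5 * np.sin(2 * np.pi * t / 6)  # semiannual cycle
    X = (np.outer(pc1, pattern1) + np.outer(pc2, pattern2)
         + 0.05 * rng.standard_normal((240, 50)))

    # EOFs via SVD of the (time-mean-removed) anomaly matrix.
    U, s, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
    eofs = Vt[:2]                          # two leading spatial patterns

    # "Current state index": project the latest anomaly map onto the EOFs,
    # then reconstruct (a stand-in for extrapolating along the S-EOF evolution).
    latest = X[-1]
    index = eofs @ latest
    reconstruction = index @ eofs

    err = np.linalg.norm(reconstruction - latest) / np.linalg.norm(latest)
    print(f"relative reconstruction error with 2 EOFs: {err:.3f}")
    ```

    Because the leading EOFs capture the dominant patterns, a two-number state index summarizes the latest map well; the forecasting step then advances that index along the stored seasonal evolution instead of merely reconstructing.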

  16. Combinatorial interpretation of Haldane-Wu fractional exclusion statistics.

    PubMed

    Aringazin, A K; Mazhitov, M I

    2002-08-01

    Assuming that the maximal allowed number of identical particles in a state is an integer parameter, q, we derive the statistical weight and analyze the associated equation that defines the statistical distribution. The derived distribution covers the Fermi-Dirac and Bose-Einstein ones in the particular cases q = 1 and q → ∞ (n_i/q → 1), respectively. We show that the derived statistical weight provides a natural combinatorial interpretation of Haldane-Wu fractional exclusion statistics, and present exact solutions of the distribution equation.
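    The two limiting cases quoted above can be checked numerically for a Gentile-type occupation formula, i.e. the mean occupation of a single state whose partition sum is truncated at q particles. This closed form is a standard result used here for illustration, not necessarily the paper's exact distribution equation (x = (ε - μ)/kT):

    ```python
    import numpy as np

    def gentile_occupation(x, q):
        """Mean occupation when at most q identical particles fit in one state:
        <n> = 1/(e^x - 1) - (q+1)/(e^{(q+1)x} - 1), from Z = 1 + e^-x + ... + e^-qx."""
        with np.errstate(over="ignore"):  # e^{(q+1)x} may overflow to inf -> term = 0
            return 1.0 / np.expm1(x) - (q + 1.0) / np.expm1((q + 1.0) * x)

    x = np.linspace(0.1, 5, 50)
    fd = 1.0 / (np.exp(x) + 1.0)  # Fermi-Dirac
    be = 1.0 / np.expm1(x)        # Bose-Einstein

    print("max |gentile(q=1)   - Fermi-Dirac|:  ", np.max(np.abs(gentile_occupation(x, 1) - fd)))
    print("max |gentile(q=500) - Bose-Einstein|:", np.max(np.abs(gentile_occupation(x, 500) - be)))
    ```

    At q = 1 the formula collapses algebraically to 1/(e^x + 1), and for large q the correction term vanishes, recovering the two limits stated in the abstract.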

  17. Traffic crash statistics report, 2009

    DOT National Transportation Integrated Search

    2010-06-23

    This report is compiled from long form traffic crash reports submitted by state and local law enforcement agencies. The Department summarizes all the submitted information for this report. in general, the 2009 crash statistics show a positive trend i...

  18. Synthesis, Resistivity, and Thermal Properties of the Cubic Perovskite NH2CH=NH2SnI3 and Related Systems

    NASA Astrophysics Data System (ADS)

    Mitzi, D. B.; Liang, K.

    1997-12-01

    Combining concentrated hydriodic acid solutions of tin(II) iodide and formamidine acetate in an inert atmosphere results in the precipitation of a new conducting organic-inorganic compound, NH2CH=NH2SnI3, which at room temperature adopts a cubic perovskite structure. The lattice constant for NH2CH=NH2SnI3 is found to be a = 6.316(1) Å, which is approximately 1.2% larger than that for the isostructural compound CH3NH3SnI3. The electrical resistivity of a pressed pellet of the new compound exhibits semimetallic temperature dependence from 10 to 300 K, with evidence of a structural transition at approximately 75 K. NH2CH=NH2SnI3 begins to slowly decompose in an inert atmosphere at temperatures as low as 200°C, with bulk decomposition/melting occurring above 300°C. The properties of the formamidinium-based perovskite are compared with those of the related cubic (at room temperature) perovskite CH3NH3SnI3 and the mixed-cation system (CH3NH3)1-x(NH2CH=NH2)xSnI3.

  19. Effect of NaI/I2 mediators on properties of PEO/LiAlO2 based all-solid-state supercapacitors

    NASA Astrophysics Data System (ADS)

    Yin, Yijing; Zhou, Juanjuan; Mansour, Azzam N.; Zhou, Xiangyang

    NaI/I2 mediators and activated carbon were added into a poly(ethylene oxide) (PEO)/lithium aluminate (LiAlO2) electrolyte to fabricate composite electrodes. All-solid-state supercapacitors were fabricated using the as-prepared composite electrodes and a Nafion 117 membrane as a separator. Cyclic voltammetry, electrochemical impedance spectroscopy, and galvanostatic charge/discharge measurements were conducted to evaluate the electrochemical properties of the supercapacitors. With the addition of NaI/I2 mediators, the specific capacitance increased 27-fold, up to 150 F g^-1, and it increased with the concentration of mediators in the electrodes. The addition of mediators also reduced the electrode resistance and enabled a higher electron-transfer rate between mediators. The stability of the all-solid-state supercapacitor was tested over 2000 charge/discharge cycles.

  20. Should I Pack My Umbrella? Clinical versus Statistical Prediction of Mental Health Decisions

    ERIC Educational Resources Information Center

    Aegisdottir, Stefania; Spengler, Paul M.; White, Michael J.

    2006-01-01

    In this rejoinder, the authors respond to the insightful commentary of Strohmer and Arm, Chwalisz, and Hilton, Harris, and Rice about the meta-analysis on statistical versus clinical prediction techniques for mental health judgments. The authors address issues including the availability of statistical prediction techniques for real-life psychology…

  1. Recombinant insulin-like growth factor (IGF)-I treatment in short children with low IGF-I levels: first-year results from a randomized clinical trial.

    PubMed

    Midyett, L Kurt; Rogol, Alan D; Van Meter, Quentin L; Frane, James; Bright, George M

    2010-02-01

    Short stature in children may be associated with low IGF-I despite normal stimulated GH levels and without other causes. Our objective was to assess the safety and efficacy of recombinant human IGF-I (rhIGF-I) in short children with low IGF-I levels. This was a 1-yr, randomized, open-label trial (MS301) conducted at 30 U.S. pediatric endocrinology clinics. A total of 136 short, prepubertal subjects with low IGF-I (height and IGF-I SD scores < -2, stimulated GH >= 7 ng/ml) were enrolled; 124 completed the study, six withdrew for adverse events, and six withdrew for other reasons. rhIGF-I was administered subcutaneously, twice daily, using weight-based dosing (40, 80, or 120 microg/kg; n = 111), or subjects were observed (n = 25). First-year height velocity (cm/yr), height SD score, IGF-I, and adverse events were prespecified outcomes. First-year height velocities for subjects completing the trial were increased for the 80- and 120-microg/kg twice-daily groups vs. the untreated group (7.0 +/- 1.0, 7.9 +/- 1.4, and 5.2 +/- 1.0 cm/yr, respectively; all P < 0.0001) and for the 120- vs. 80-microg/kg group (P = 0.0002), and were inversely related to age. They were not predicted by GH stimulation or IGF-I generation test results and were not correlated with IGF-I antibody status. The most commonly reported adverse events of special interest during treatment were headache (38% of subjects), vomiting (25%), and hypoglycemia (14%). rhIGF-I treatment was associated with age- and dose-dependent increases in first-year height velocity. Adverse events during treatment were less common than in previous studies and were generally transient, easily managed, and without known sequelae.

  2. Stochastic Lanchester-type Combat Models I.

    DTIC Science & Technology

    1979-10-01

    Report AD-A092 898, Florida State University, Tallahassee, Dept. of Statistics: Stochastic Lanchester-Type Combat Models I, by L. Billard, October 1979. Prepared for the Naval Postgraduate School, Monterey, CA 93940; approved for public release, distribution unlimited. Only fragments of the text survive, noting that the results do not necessarily hold when the attrition rates become nonlinear in b and/or r, and that other combat models are briefly described in Section 4.

  3. Mourning dove hunting regulation strategy based on annual harvest statistics and banding data

    USGS Publications Warehouse

    Otis, D.L.

    2006-01-01

    Although managers should strive to base game bird harvest management strategies on mechanistic population models, the monitoring programs required to build and continuously update these models may not be in place. Alternatively, if estimates of total harvest and harvest rates are available, then population estimates derived from these harvest data can serve as the basis for hunting regulation decisions based on the population growth rates derived from these estimates. I present a statistically rigorous approach to regulation decision-making using a hypothesis-testing framework and an assumed set of 3 hunting regulation alternatives. I illustrate and evaluate the technique with historical data on the mid-continent mallard (Anas platyrhynchos) population. I evaluate the statistical properties of the hypothesis-testing framework using the best available data on mourning doves (Zenaida macroura). I use these results to discuss practical implementation of the technique as an interim harvest strategy for mourning doves until reliable mechanistic population models and associated monitoring programs are developed.

  4. 41 CFR 302-2.2 - May I relocate to my new official duty station before I receive a written travel authorization (TA)?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 4 2010-07-01 2010-07-01 false May I relocate to my new official duty station before I receive a written travel authorization (TA)? 302-2.2 Section 302-2.2 Public Contracts and Property Management Federal Travel Regulation System RELOCATION ALLOWANCES INTRODUCTION 2...

  5. Purification of HgI2 for nuclear detector fabrication

    DOEpatents

    Schieber, Michael M.

    1978-01-01

    A process for purification of mercuric iodide (HgI2) to be used as a source material for the growth of detector-quality crystals. The high-purity HgI2 raw material is produced by a combination of three stages: synthesis of HgI2 from Hg and I2, repeated sublimation, and zone refining.

  6. Association of vitamin D receptor BsmI, TaqI, FokI, and ApaI polymorphisms with susceptibility of chronic periodontitis: A systematic review and meta-analysis based on 38 case-control studies.

    PubMed

    Mashhadiabbas, Fatemeh; Neamatzadeh, Hossein; Nasiri, Rezvan; Foroughi, Elnaz; Farahnak, Soudabeh; Piroozmand, Parisa; Mazaheri, Mahta; Zare-Shehneh, Masoud

    2018-01-01

    There has been increasing interest in the study of the association between vitamin D receptor (VDR) gene polymorphisms and risk of chronic periodontitis. However, the results remain inconclusive. To better understand the roles of the VDR polymorphisms BsmI, TaqI, FokI, and ApaI in chronic periodontitis susceptibility, we conducted this systematic review and meta-analysis. The PubMed, Google Scholar, and Web of Science databases were systematically searched to identify all eligible studies on VDR polymorphisms and risk of chronic periodontitis up to April 2017. The odds ratio (OR) and 95% confidence interval (CI) were used to evaluate the associations between VDR polymorphisms and chronic periodontitis risk. All statistical analyses were performed with Comprehensive Meta-Analysis software. All P values were two-tailed, with a significance level of 0.05. In total, 38 case-control studies in 19 publications met our inclusion criteria: ten studies with 866 chronic periodontitis cases and 786 controls for BsmI, 16 studies with 1570 cases and 1676 controls for TaqI, five studies with 374 cases and 382 controls for FokI, and seven studies with 632 cases and 604 controls for ApaI. Overall, no significant association was observed between the VDR gene BsmI, TaqI, FokI, and ApaI polymorphisms and risk of chronic periodontitis in any genetic model. Subgroup analysis stratified by ethnicity suggested a significant association between the BsmI polymorphism and chronic periodontitis risk in the Caucasian subgroup under the allele model (A vs. G: OR = 1.747, 95% CI = 1.099-2.778, P = 0.018). Further, no significant associations were observed when stratified by Hardy-Weinberg equilibrium status for BsmI, TaqI, and ApaI. Our results suggest that the BsmI, TaqI, FokI, and ApaI polymorphisms in the VDR gene might not be associated with risk of chronic periodontitis in the overall population.
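    The inverse-variance pooling of per-study odds ratios behind results like "OR = 1.747, 95% CI = 1.099-2.778" can be sketched as follows. The per-study ORs and CIs below are invented illustrative numbers, not the studies from this meta-analysis, and a simple fixed-effect model stands in for whatever model the authors used:

    ```python
    import math

    # Hypothetical per-study (OR, CI_low, CI_high) triples.
    studies = [(1.9, 1.1, 3.3), (1.5, 0.9, 2.5), (2.1, 1.0, 4.4)]

    # Inverse-variance fixed-effect pooling on the log-OR scale.
    weights, logs = [], []
    for or_, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from the 95% CI
        weights.append(1.0 / se**2)
        logs.append(math.log(or_))

    w_sum = sum(weights)
    pooled_log = sum(w * l for w, l in zip(weights, logs)) / w_sum
    pooled_se = math.sqrt(1.0 / w_sum)

    pooled_or = math.exp(pooled_log)
    ci = (math.exp(pooled_log - 1.96 * pooled_se),
          math.exp(pooled_log + 1.96 * pooled_se))
    print(f"pooled OR = {pooled_or:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
    ```

    Pooling narrows the CI relative to any single study, which is how individually non-significant studies can yield a significant subgroup result; a random-effects model would widen the CI again when between-study heterogeneity is present.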

  7. A Statistical Analysis Plan to Support the Joint Forward Area Air Defense Test.

    DTIC Science & Technology

    1984-08-02

by establishing a specific significance level prior to performing the statistical test (traditionally α levels are set at .01 or .05). What is often...undesirable increase in β. For constant α levels, the power (1 - β) of a statistical test can be increased by increasing the sample size of the test. [Ref... (The remainder of this record is residue from a test-selection flowchart: perform a k-sample comparison (ANOVA) test on MOP "A" factor levels.)
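The record's point that power (1 - β) grows with sample size at a fixed α can be sketched for a one-sided one-sample z-test under the standard normal approximation; the effect size and α below are illustrative choices, not values from the plan:

```python
from statistics import NormalDist

def power_one_sided_z(n, effect_size, alpha=0.05):
    """Power (1 - beta) of a one-sided one-sample z-test that detects a
    mean shift of `effect_size` standard deviations at level alpha."""
    z_crit = NormalDist().inv_cdf(1 - alpha)
    # Under the alternative, the z statistic is shifted by effect_size * sqrt(n).
    return 1 - NormalDist().cdf(z_crit - effect_size * n ** 0.5)

for n in (10, 30, 100):
    print(n, round(power_one_sided_z(n, 0.3), 3))
```

Running the loop shows power rising monotonically with n for a fixed effect size of 0.3 standard deviations, which is exactly the tradeoff the plan describes.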

  8. i3b3: Infobuttons for i2b2 as a Mechanism for Investigating the Information Needs of Clinical Researchers.

    PubMed

    Kennell, Timothy; Dempsey, Donald M; Cimino, James J

    2016-01-01

The information needs of clinicians, as they interact with the EHR, are well studied. Clinical researchers also interact with the EHR and, while they might be expected to have some similar needs, the unique needs that arise from the nature of their work remain largely unstudied. For clinicians, infobuttons (context-aware hyperlinks) provide a mechanism for studying these information needs. Here we describe the integration of infobuttons into i2b2, a popular data warehouse commonly used by clinical researchers, via a plugin. A preliminary survey of i2b2 developers suggests a general interest in infobuttons for i2b2 and indicates a good likelihood of deployment, where they may be used as a tool for studying these needs in greater detail.

  9. ECAS Phase I fuel cell results. [Energy Conservation Alternatives Study

    NASA Technical Reports Server (NTRS)

    Warshay, M.

    1978-01-01

This paper summarizes and discusses the fuel cell system results of Phase I of the Energy Conversion Alternatives Study (ECAS). Ten advanced electric powerplant systems for central-station baseload generation using coal were studied by NASA in ECAS. Three types of low-temperature fuel cells (solid polymer electrolyte, SPE, aqueous alkaline, and phosphoric acid) and two types of high-temperature fuel cells (molten carbonate, MC, and zirconia solid electrolyte, SE) were studied. The results indicate that (1) overall efficiency increases with fuel cell temperature, and (2) scale-up in powerplant size can produce a significant reduction in cost of electricity (COE) only when it is accompanied by utilization of waste fuel cell heat through a steam bottoming cycle and/or integration with a gasifier. For low-temperature fuel cell systems, the use of hydrogen results in the highest efficiency and lowest COE. In spite of their higher efficiencies, integrated SE systems have higher projected COEs than integrated MC systems because of higher fuel cell replacement costs. Present data indicate that life can be projected to over 30,000 hr for MC fuel cells, but data are not yet sufficient to similarly project SE fuel cell life expectancy.

  10. PCGF2 negatively regulates arsenic trioxide-induced PML-RARA protein degradation via UBE2I inhibition in NB4 cells.

    PubMed

    Jo, Sungsin; Lee, Young Lim; Kim, Sojin; Lee, Hongki; Chung, Heekyoung

    2016-07-01

    Arsenic trioxide (ATO) is a therapeutic agent for acute promyelocytic leukemia (APL) which induces PML-RARA protein degradation via enhanced UBE2I-mediated sumoylation. PCGF2, a Polycomb group protein, has been suggested as an anti-SUMO E3 protein by inhibiting the sumoylation of UBE2I substrates, HSF2 and RANGAP1, via direct interaction. Thus, we hypothesized that PCGF2 might play a role in ATO-induced PML-RARA degradation by interacting with UBE2I. PCGF2 protein was down-regulated upon ATO treatment in human APL cell line, NB4. Knockdown of PCGF2 in NB4 cells, in the absence of ATO treatment, was sufficient to induce sumoylation-, ubiquitylation- and PML nuclear body-mediated degradation of PML-RARA protein. Moreover, overexpression of PCGF2 protected ATO-mediated degradation of ectopic and endogenous PML-RARA in 293T and NB4 cells, respectively. In 293T cells, UBE2I-mediated PML-RARA degradation was reduced upon PCGF2 co-expression. In addition, UBE2I-mediated sumoylation of PML-RARA was reduced upon PCGF2 co-expression and PCGF2-UBE2I interaction was confirmed by co-immunoprecipitation. Likewise, endogenous PCGF2-UBE2I interaction was detected by co-immunoprecipitation and immunofluorescence assays in NB4 cells. Intriguingly, upon ATO-treatment, such interaction was disrupted and UBE2I was co-immunoprecipitated or co-localized with its SUMO substrate, PML-RARA. Taken together, our results suggested a novel role of PCGF2 in ATO-mediated degradation of PML-RARA that PCGF2 might act as a negative regulator of UBE2I via direct interaction. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Parameterizing Phrase Based Statistical Machine Translation Models: An Analytic Study

    ERIC Educational Resources Information Center

    Cer, Daniel

    2011-01-01

    The goal of this dissertation is to determine the best way to train a statistical machine translation system. I first develop a state-of-the-art machine translation system called Phrasal and then use it to examine a wide variety of potential learning algorithms and optimization criteria and arrive at two very surprising results. First, despite the…

  12. Analytic H I-to-H2 Photodissociation Transition Profiles

    NASA Astrophysics Data System (ADS)

    Bialy, Shmuel; Sternberg, Amiel

    2016-05-01

We present a simple analytic procedure for generating atomic (H I) to molecular (H2) density profiles for optically thick hydrogen gas clouds illuminated by far-ultraviolet radiation fields. Our procedure is based on the analytic theory for the structure of one-dimensional H I/H2 photon-dominated regions, presented by Sternberg et al. Depth-dependent atomic and molecular density fractions may be computed for arbitrary gas density, far-ultraviolet field intensity, metallicity-dependent H2 formation rate coefficient, and dust absorption cross section in the Lyman-Werner photodissociation band. We use our procedure to generate a set of H I-to-H2 transition profiles for a wide range of conditions, from the weak- to strong-field limits, and from super-solar down to low metallicities. We show that if presented as functions of dust optical depth, the H I and H2 density profiles depend primarily on the dimensionless Sternberg "αG parameter" that determines the dust optical depth associated with the total photodissociated H I column. We derive a universal analytic formula for the H I-to-H2 transition points as a function of just αG. Our formula will be useful for interpreting emission-line observations of H I/H2 interfaces, for estimating star formation thresholds, and for sub-grid components in hydrodynamics simulations.

  13. Comparison of reporting phase I trial results in ClinicalTrials.gov and matched publications.

    PubMed

    Shepshelovich, D; Goldvaser, H; Wang, L; Abdul Razak, A R; Bedard, P L

    2017-12-01

Background: Data on completeness of reporting of phase I cancer clinical trials in publications are lacking. Methods: The ClinicalTrials.gov database was searched for completed adult phase I cancer trials with reported results. PubMed was searched for matching primary publications published prior to November 1, 2016. Reporting in primary publications was compared with the ClinicalTrials.gov database using a 28-point score (2 = complete; 1 = partial; 0 = no reporting) for 14 items related to study design, outcome measures and safety profile. Inconsistencies between primary publications and ClinicalTrials.gov were recorded. Linear regression was used to identify factors associated with incomplete reporting. Results: After a review of 583 trials in ClinicalTrials.gov, 163 matching primary publications were identified. Publications reported outcomes that did not appear in ClinicalTrials.gov in 25% of trials. Outcomes were upgraded, downgraded or omitted in publications in 47% of trials. The overall median reporting score was 23/28 (interquartile range 21-25). Items incompletely reported in >25% of publications were: inclusion criteria (29%), primary outcome definition (26%), secondary outcome definitions (53%), adverse events (71%), serious adverse events (80%) and dates of study start and database lock (91%). Higher reporting scores were associated with phase I (vs phase I/II) trials (p<0.001), multicenter trials (p<0.001) and publication in journals with lower impact factor (p=0.004). Conclusions: Reported results in primary publications for early phase cancer trials are frequently inconsistent or incomplete compared with ClinicalTrials.gov entries. ClinicalTrials.gov may provide more comprehensive data from new cancer drug trials.
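The 14-item, 28-point completeness score described above reduces to a simple sum of per-item grades. The item names and grades below are hypothetical, not taken from the study:

```python
# Hypothetical per-item grades for one publication:
# 2 = complete, 1 = partial, 0 = no reporting, for each of 14 items.
grades = {
    "inclusion_criteria": 2,
    "primary_outcome_definition": 1,
    "secondary_outcome_definitions": 1,
    "adverse_events": 1,
    "serious_adverse_events": 0,
    "study_dates": 0,
    # remaining eight illustrative items, all fully reported:
    **{f"item_{i}": 2 for i in range(7, 15)},
}
assert len(grades) == 14
score = sum(grades.values())  # 0..28; higher = more complete reporting
print(f"{score}/28")
```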

  14. Statistical analysis of effective singular values in matrix rank determination

    NASA Technical Reports Server (NTRS)

    Konstantinides, Konstantinos; Yao, Kung

    1988-01-01

A major problem in using SVD (singular-value decomposition) as a tool for determining the effective rank of a perturbed matrix is that of distinguishing between significantly small and significantly large singular values. To this end, confidence regions are derived for the perturbed singular values of matrices with noisy observation data. The analysis is based on the theory of perturbations of singular values and statistical significance tests. Threshold bounds for perturbations due to finite-precision and i.i.d. random models are evaluated. In random models, the threshold bounds depend on the dimension of the matrix, the noise variance, and a predefined statistical level of significance. The results are applied to the problem of determining the effective order of a linear autoregressive system from the approximate rank of a sample autocorrelation matrix. Various numerical examples illustrating the usefulness of these bounds and comparisons to other previously known approaches are given.
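The basic thresholding idea can be sketched numerically. The threshold rule below is a simple illustrative noise bound, not the paper's derived confidence regions, and all dimensions and noise levels are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a rank-2 signal matrix and perturb it with i.i.d. Gaussian noise.
m, n, sigma = 40, 30, 1e-3
signal = np.outer(rng.standard_normal(m), rng.standard_normal(n)) \
       + np.outer(rng.standard_normal(m), rng.standard_normal(n))
noisy = signal + sigma * rng.standard_normal((m, n))

s = np.linalg.svd(noisy, compute_uv=False)  # singular values, descending

# Illustrative threshold: singular values of a pure-noise matrix concentrate
# below roughly sigma * (sqrt(m) + sqrt(n)); a factor of 2 adds slack.
tau = 2 * sigma * (np.sqrt(m) + np.sqrt(n))
effective_rank = int(np.sum(s > tau))
print(effective_rank)
```

With the signal singular values far above the noise floor, counting singular values above the threshold recovers the true rank; the paper's contribution is replacing this ad hoc bound with statistically justified confidence regions.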

  15. Dissociative photoionization mechanism of methanol isotopologues (CH3OH, CD3OH, CH3OD and CD3OD) by iPEPICO: energetics, statistical and non-statistical kinetics and isotope effects.

    PubMed

    Borkar, Sampada; Sztáray, Bálint; Bodi, Andras

    2011-07-28

The dissociative photoionization of energy-selected methanol isotopologue (CH3OH, CD3OH, CH3OD and CD3OD) cations was investigated using imaging Photoelectron Photoion Coincidence (iPEPICO) spectroscopy. The first dissociation is an H/D-atom loss from the carbon, also confirmed by partial deuteration. Somewhat above 12 eV, a parallel H2-loss channel weakly asserts itself. At photon energies above 15 eV, in a hydrogen molecule loss consecutive to the first H-atom loss, the formation of CHO+/CDO+ dominates as opposed to COH+/COD+ formation. We see little evidence for H-atom scrambling in these processes. In the photon energy range corresponding to the B̃ and C̃ ion states, a hydroxyl radical loss appears, yielding CH3+/CD3+. Based on the branching ratios, statistical considerations and ab initio calculations, this process is confirmed to take place on the first electronically excited Ã ²A′ ion state. Uncharacteristically, internal conversion is outcompeted by unimolecular dissociation due to the apparently weak Renner-Teller-like coupling between the X̃ and the Ã ion states. The experimental 0 K appearance energies of the ions CH2OH+, CD2OH+, CH2OD+ and CD2OD+ are measured to be 11.646 ± 0.003 eV, 11.739 ± 0.003 eV, 11.642 ± 0.003 eV and 11.737 ± 0.003 eV, respectively. The value E0(CH2OH+) = 11.6454 ± 0.0017 eV was obtained based on the independently measured isotopologue results and calculated zero-point effects. The 0 K heat of formation of CH2OH+, protonated formaldehyde, was determined to be 717.7 ± 0.7 kJ mol-1. This yields a 0 K heat of formation of CH2OH of -11.1 ± 0.9 kJ mol-1 and an experimental 298 K proton affinity of formaldehyde of 711.6 ± 0.8 kJ mol-1. The reverse barrier to homonuclear H2-loss from CH3OH+ is determined to be 36 kJ mol-1, whereas for heteronuclear H2-loss from CH2OH+ it is found to be 210 kJ mol-1. This

  16. 40 CFR Appendix I to Subpart T of... - Sample Graphical Summary of NTE Emission Results

    Code of Federal Regulations, 2013 CFR

    2013-07-01

40 CFR Part 86, Subpart T, Appendix I—Sample Graphical Summary of NTE Emission Results. The following figure shows an example of a graphical summary of NTE emission results (figure ER14JN05.002).

  17. H I Structure and Topology of the Galaxy Revealed by the I-GALFA H I 21-cm Line Survey

    NASA Astrophysics Data System (ADS)

    Koo, Bon-Chul; Park, G.; Cho, W.; Gibson, S. J.; Kang, J.; Douglas, K. A.; Peek, J. E. G.; Korpela, E. J.; Heiles, C. E.

    2011-05-01

The I-GALFA survey, mapping all the H I in the inner Galactic disk visible to the Arecibo 305 m telescope within 10 degrees of the Galactic plane (longitudes of 32 to 77 degrees at b = 0), completed observations in 2009 September and will soon be made publicly available. The high (3.4 arcmin) resolution and tremendous sensitivity of the survey offer a great opportunity to observe the fine details of H I both in the inner and in the far outer Galaxy. The reduced H I column density maps show that the H I structure is highly filamentary and clumpy, pervaded by shell-like structures, vertical filaments, and small clumps. By inspecting individual maps, we have found 36 shell candidates of angular sizes ranging from 0.4 to 12 degrees, half of which appear to be expanding. In order to characterize the filamentary/clumpy morphology of the H I structure, we have carried out statistical analyses of selected areas representing the spiral arms in the inner and outer Galaxy. Genus statistics, which can distinguish "meatball" and "swiss-cheese" topologies, show that the H I topology is clump-like in most regions. The two-dimensional Fourier analysis further shows that the H I structures are filamentary and mainly parallel to the plane in the outer Galaxy. We also examine the level-crossing statistics, the results of which are described in detail in an accompanying poster by Park et al.

  18. Assessing risk factors for dental caries: a statistical modeling approach.

    PubMed

    Trottini, Mario; Bossù, Maurizio; Corridore, Denise; Ierardo, Gaetano; Luzzi, Valeria; Saccucci, Matteo; Polimeni, Antonella

    2015-01-01

The problem of identifying potential determinants and predictors of dental caries is of key importance in caries research and it has received considerable attention in the scientific literature. From the methodological side, a broad range of statistical models is currently available to analyze dental caries indices (DMFT, dmfs, etc.). These models have been applied in several studies to investigate the impact of different risk factors on the cumulative severity of dental caries experience. However, in most of the cases (i) these studies focus on a very specific subset of risk factors; and (ii) in the statistical modeling only few candidate models are considered and model selection is at best only marginally addressed. As a result, our understanding of the robustness of the statistical inferences with respect to the choice of the model is very limited; the richness of the set of statistical models available for analysis is only marginally exploited; and inferences could be biased due to the omission of potentially important confounding variables in the model's specification. In this paper we argue that these limitations can be overcome considering a general class of candidate models and carefully exploring the model space using standard model selection criteria and measures of global fit and predictive performance of the candidate models. Strengths and limitations of the proposed approach are illustrated with a real data set. In our illustration the model space contains more than 2.6 million models, which require inferences to be adjusted for 'optimism'.
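The model-space exploration the authors advocate can be sketched, in miniature, as an exhaustive AIC comparison over predictor subsets for an ordinary-least-squares fit. The synthetic data and variable names below are illustrative, not the paper's data or model class:

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
n_obs = 200

# Synthetic data: four candidate predictors, but only x0 and x2 matter.
X = rng.standard_normal((n_obs, 4))
y = 1.0 + 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.standard_normal(n_obs)

def aic_ols(Xs, y):
    """AIC of a Gaussian ordinary-least-squares fit with intercept."""
    Z = np.column_stack([np.ones(len(y)), Xs])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    sigma2 = resid @ resid / len(y)
    k = Z.shape[1] + 1  # regression coefficients plus the error variance
    loglik = -0.5 * len(y) * (np.log(2 * np.pi * sigma2) + 1)
    return 2 * k - 2 * loglik

subsets = [s for r in range(1, 5) for s in itertools.combinations(range(4), r)]
best = min(subsets, key=lambda s: aic_ols(X[:, list(s)], y))
print(best)
```

With 2.6 million candidate models, as in the paper's illustration, exhaustive enumeration like this is still feasible but slow; the same criterion can instead drive stepwise or stochastic searches over the model space.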

  19. Line identification studies using traditional techniques and wavelength coincidence statistics

    NASA Technical Reports Server (NTRS)

    Cowley, Charles R.; Adelman, Saul J.

    1990-01-01

Traditional line identification techniques result in the assignment of individual lines to an atomic or ionic species. These methods may be supplemented by wavelength coincidence statistics (WCS). The strengths and weaknesses of these methods are discussed using spectra of a number of normal and peculiar B and A stars that have been studied independently by both methods. The present results support the overall findings of some earlier studies. WCS would be most useful in a first survey, before traditional methods have been applied. WCS can quickly make a global search for all species and in this way may enable identification of an unexpected spectrum that could easily be omitted entirely from a traditional study. This is illustrated by O I. WCS is subject to the well-known weaknesses of any statistical technique; for example, a predictable number of spurious results is to be expected. The dangers of small-number statistics are illustrated. WCS is at its best relative to traditional methods in finding a line-rich atomic species that is only weakly present in a complicated stellar spectrum.
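The counting idea behind WCS can be sketched as matching observed stellar line wavelengths against a species' laboratory line list within a small tolerance, then comparing the hit count with what randomly placed wavelengths would give. All wavelengths below are made up for illustration:

```python
import random

def coincidences(observed, lab_lines, tol=0.05):
    """Count observed wavelengths lying within tol (same units) of any lab line."""
    return sum(any(abs(w - l) <= tol for l in lab_lines) for w in observed)

random.seed(3)
# Hypothetical laboratory line list for one species (wavelengths in angstroms).
lab_lines = sorted(random.uniform(4000.0, 5000.0) for _ in range(40))

# "Observed" stellar spectrum: half the lab lines with tiny measurement
# shifts, plus unrelated random lines.
observed = [l + random.gauss(0.0, 0.01) for l in lab_lines[:20]] \
         + [random.uniform(4000.0, 5000.0) for _ in range(20)]

hits = coincidences(observed, lab_lines)

# Null expectation: how many hits do purely random wavelengths produce?
null_hits = [
    coincidences([random.uniform(4000.0, 5000.0) for _ in range(40)], lab_lines)
    for _ in range(200)
]
mean_null = sum(null_hits) / len(null_hits)
print(hits, round(mean_null, 2))
```

A hit count far above the null mean flags the species as present; the "predictable number of spurious results" the abstract warns about is exactly the null distribution estimated in the last step.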

  20. New results on Class I methanol masers in nearby low-mass star formation regions

    NASA Astrophysics Data System (ADS)

    Kalenskii, S. V.; Kurtz, S.; Hofner, P.

We present a review of the properties of Class I methanol masers detected in low-mass star forming regions (LMSFRs). These masers, hereafter called LMMIs, are associated with postshock gas in the lobes of chemically active outflows in the LMSFRs NGC1333, NGC2023, HH25, and L1157. Flux densities of these masers are no higher than 18 Jy at 44 GHz and are lower in the other Class I lines, being much lower than those of strong Class I masers in regions of high-mass star formation. However, LMMI luminosities at 44 GHz match the relation "maser luminosity-protostar luminosity" established for high- and intermediate-mass protostars. No variability of LMMIs was found in 2004-2011. Radial velocities of most LMMIs are close to the systemic velocities of the associated regions. The only known exception is the maser detected at 36 GHz toward the blue lobe of the extra-high-velocity outflow in NGC2023, whose radial velocity is 3.5 km s-1 lower than the systemic velocity. Four masers, NGC1333I2A, NGC1333I4A, HH25MMS, and L1157, have been observed at 44 GHz with the VLA in the D configuration with an angular resolution of about 1.5 arcsec. All of them except NGC1333I2A have later been reobserved with the EVLA in the B configuration, which provides an angular resolution of about 0.2 arcsec at this frequency. The maser images consist of compact spots, unresolved or barely resolved even with the B configuration. The brightness temperatures of the strongest spots are hundreds of thousands of kelvins. Many spots consist of two spatial components and demonstrate double spectral lines. An interesting result is the detection of unresolved spots demonstrating broad (~3-5 km s-1) spectral lines. Their fluxes correspond to brightness temperatures of ~1000 K. Thus, in spite of the large linewidths, these objects could be weak masers. Probably, the broad lines detected in some sources at 44 GHz and in other Class I lines as a result of single-dish observations are also masers. We believe that turbulence plays

  1. 3D flow effects on measuring turbulence statistics using 2D PIV

    NASA Astrophysics Data System (ADS)

    Lee, Hoonsang; Hwang, Wontae

    2017-11-01

    Homogeneous & isotropic turbulence (HIT) with no mean flow is the simplest type of turbulent flow which can be used to study various phenomena. Although HIT is inherently three dimensional in nature, various turbulence statistics can be measured with 2D PIV utilizing various assumptions. In this study, the loss of tracer particle pairs due to out-of-plane motion, and the effect it has on statistics such as turbulence kinetic energy, dissipation rate, and velocity correlations is investigated. Synthetic PIV images created from HIT direct numerical simulation (DNS) data are utilized to quantify this effect. We estimate the out-of-plane error by adjusting parameters such as PIV time interval, interrogation window size, and particle size. This information can be utilized to optimize experimental parameters when examining 3D turbulence via 2D PIV. This research was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (No. 2017R1A2B4007372), and also by SNU new faculty Research Resettlement Fund.
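The simplest model of the out-of-plane pair loss studied above treats the retained fraction of tracer particles as linear in the out-of-plane displacement relative to the light-sheet thickness. The function and numbers below are an illustrative sketch under a uniform-seeding assumption, not the authors' method:

```python
def retained_fraction(dz, thickness):
    """Fraction of tracer particles still inside a light sheet of the given
    thickness after an out-of-plane displacement dz, assuming particles are
    seeded uniformly across the sheet."""
    return max(0.0, 1.0 - abs(dz) / thickness)

# Illustrative numbers: 0.5 m/s out-of-plane velocity, 200 microsecond
# PIV interval, 1 mm light sheet.
dz = 0.5 * 200e-6  # out-of-plane displacement in metres
print(retained_fraction(dz, 1e-3))
```

Shrinking the PIV time interval or thickening the sheet raises the retained fraction, which is the tradeoff the study quantifies against synthetic DNS-based images.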

  2. New concepts for HgI2 scintillator gamma ray spectroscopy

    NASA Technical Reports Server (NTRS)

    Iwanczyk, Jan S.

    1994-01-01

    The primary goals of this project are development of the technology for HgI2 photodetectors (PD's), development of a HgI2/scintillator gamma detector, development of electronics, and development of a prototype gamma spectrometer. Work on the HgI2 PD's involved HgI2 purification and crystal growth, detector surface and electrical contact studies, PD structure optimization, encapsulation and packaging, and testing. Work on the HgI2/scintillator gamma detector involved a study of the optical - mechanical coupling for the optimization of CsI(Tl)/HgI2 gamma ray detectors and determination of the relationship between resolution versus scintillator type and size. The development of the electronics focused on low noise amplification circuits using different preamp input FET's and the use of a coincidence technique to maximize the signal, minimize the noise contribution in the gamma spectra, and improve the overall system resolution.

  3. The Statistical Drake Equation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2010-12-01

    We provide the statistical generalization of the Drake equation. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov Form of the CLT, or the Lindeberg Form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that: The new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the LOGNORMAL distribution. Then, as a consequence, the mean value of this lognormal distribution is the ordinary N in the Drake equation. The standard deviation, mode, and all the moments of this lognormal N are also found. The seven factors in the ordinary Drake equation now become seven positive random variables. The probability distribution of each random variable may be ARBITRARY. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful, of course. An application of our statistical Drake equation then follows. The (average) DISTANCE between any two neighboring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. Then, in our approach, this distance becomes a new random variable. We derive the relevant probability density
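The lognormal conclusion can be checked with a quick Monte Carlo: the product of seven independent positive factors, each arbitrarily distributed, has an approximately normal logarithm by the CLT. The factor distributions below are arbitrary placeholders, not actual Drake-equation estimates:

```python
import numpy as np

rng = np.random.default_rng(42)
samples = 100_000

# Seven positive factors with deliberately different shapes; all of these
# distributions and parameters are placeholders, not astronomical estimates.
factors = [
    rng.uniform(1.0, 10.0, samples),
    rng.lognormal(0.0, 0.5, samples),
    rng.gamma(2.0, 1.0, samples),
    rng.uniform(0.1, 1.0, samples),
    rng.exponential(1.0, samples) + 0.1,  # shifted to stay positive
    rng.beta(2.0, 2.0, samples) + 0.05,
    rng.uniform(0.5, 2.0, samples),
]
N = np.prod(factors, axis=0)
log_n = np.log(N)

# If N is close to lognormal, log N is close to normal, so its standardized
# skewness should be near zero.
skew = float(np.mean(((log_n - log_n.mean()) / log_n.std()) ** 3))
print(round(skew, 3))
```

With only seven factors the Gaussian approximation to log N is imperfect (the skewness is small but nonzero), which matches the paper's framing of the lognormal law as a CLT-based approximation.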

  4. Addressing the statistical mechanics of planet orbits in the solar system

    NASA Astrophysics Data System (ADS)

    Mogavero, Federico

    2017-10-01

    The chaotic nature of planet dynamics in the solar system suggests the relevance of a statistical approach to planetary orbits. In such a statistical description, the time-dependent position and velocity of the planets are replaced by the probability density function (PDF) of their orbital elements. It is natural to set up this kind of approach in the framework of statistical mechanics. In the present paper, I focus on the collisionless excitation of eccentricities and inclinations via gravitational interactions in a planetary system. The future planet trajectories in the solar system constitute the prototype of this kind of dynamics. I thus address the statistical mechanics of the solar system planet orbits and try to reproduce the PDFs numerically constructed by Laskar (2008, Icarus, 196, 1). I show that the microcanonical ensemble of the Laplace-Lagrange theory accurately reproduces the statistics of the giant planet orbits. To model the inner planets I then investigate the ansatz of equiprobability in the phase space constrained by the secular integrals of motion. The eccentricity and inclination PDFs of Earth and Venus are reproduced with no free parameters. Within the limitations of a stationary model, the predictions also show a reasonable agreement with Mars PDFs and that of Mercury inclination. The eccentricity of Mercury demands in contrast a deeper analysis. I finally revisit the random walk approach of Laskar to the time dependence of the inner planet PDFs. Such a statistical theory could be combined with direct numerical simulations of planet trajectories in the context of planet formation, which is likely to be a chaotic process.

  5. Emerging understanding of the ΔI=1/2 rule from lattice QCD.

    PubMed

    Boyle, P A; Christ, N H; Garron, N; Goode, E J; Janowski, T; Lehner, C; Liu, Q; Lytle, A T; Sachrajda, C T; Soni, A; Zhang, D

    2013-04-12

    There has been much speculation as to the origin of the ΔI=1/2 rule (ReA0/ReA2≃22.5). We find that the two dominant contributions to the ΔI=3/2, K→ππ correlation functions have opposite signs, leading to a significant cancelation. This partial cancelation occurs in our computation of ReA2 with physical quark masses and kinematics (where we reproduce the experimental value of A2) and also for heavier pions at threshold. For ReA0, although we do not have results at physical kinematics, we do have results for pions at zero momentum with mπ≃420  MeV [ReA0/ReA2=9.1(2.1)] and mπ≃330  MeV [ReA0/ReA2=12.0(1.7)]. The contributions which partially cancel in ReA2 are also the largest ones in ReA0, but now they have the same sign and so enhance this amplitude. The emerging explanation of the ΔI=1/2 rule is a combination of the perturbative running to scales of O(2  GeV), a relative suppression of ReA2 through the cancelation of the two dominant contributions, and the corresponding enhancement of ReA0. QCD and electroweak penguin operators make only very small contributions at such scales.

  6. Lithography process for patterning HgI2 photonic devices

    DOEpatents

    Mescher, Mark J.; James, Ralph B.; Hermon, Haim

    2004-11-23

A photolithographic process forms patterns on HgI2 surfaces and defines metal sublimation masks and electrodes to substantially improve device performance by increasing the realizable design space. Techniques for smoothing HgI2 surfaces and for producing trenches in HgI2 are provided. A sublimation process is described which produces etched-trench devices with enhanced electron-transport-only behavior.

  7. United States Air Force Statistical Digest, Fiscal Year 1951. Sixth Edition

    DTIC Science & Technology

    1952-11-18

(This record is OCR residue from the digest's front matter and table of contents. Recoverable fragments: Military Air Transport Service squadron totals; Directorate of Statistical Services, DCS/Comptroller, USAF, Washington, DC; Department of the Air Force, Washington, 20 September 1948, Air Force Regulation No. 5-24; contents entries for aviation fuels and lubes, Part VIII Stockpiling, Part IX Industrial Reserve, Part X Transportation, and Part XI Research and Development.)

  8. Vitamin D receptor gene Alw I, Fok I, Apa I, and Taq I polymorphisms in patients with urinary stone.

    PubMed

    Seo, Ill Young; Kang, In-Hong; Chae, Soo-Cheon; Park, Seung Chol; Lee, Young-Jin; Yang, Yun Sik; Ryu, Soo Bang; Rim, Joung Sik

    2010-04-01

To evaluate vitamin D receptor (VDR) gene polymorphisms in Korean patients so as to identify candidate genes associated with urinary stones. Urinary stones are a multifactorial disease that involves various genetic factors. A control group of 535 healthy subjects and a group of 278 patients with urinary stones were evaluated. Of the 125 patients who provided stone samples, 102 had calcium stones on chemical analysis. The VDR gene Alw I, Fok I, Apa I, and Taq I polymorphisms were evaluated using polymerase chain reaction-restriction fragment length polymorphism analysis. Allelic and genotypic frequencies were calculated to identify associations in both groups. The haplotype frequencies of the VDR gene polymorphisms for multiple loci were also determined. For the VDR gene Alw I, Fok I, Apa I, and Taq I polymorphisms, there was no statistically significant difference between the patients with urinary stones and the healthy controls. There was also no statistically significant difference between the patients with calcium stones and the healthy controls. A novel haplotype (Ht 4; CTTT) was identified in 13.5% of the patients with urinary stones and in 8.3% of the controls (P = .001). The haplotype frequencies were significantly different between the patients with calcium stones and the controls (P = .004). The VDR gene Alw I, Fok I, Apa I, and Taq I polymorphisms do not seem to be candidate genetic markers for urinary stones in Korean patients. However, one novel haplotype of the VDR gene polymorphisms for multiple loci might be a candidate genetic marker. Copyright 2010 Elsevier Inc. All rights reserved.

  9. CYP2E1 Rsa I/Pst I polymorphism contributes to oral cancer susceptibility: a meta-analysis.

    PubMed

    Niu, Yuming; Hu, Yuanyuan; Wu, Mingyue; Jiang, Fei; Shen, Ming; Tang, Chunbo; Chen, Ning

    2012-01-01

Previous data on the association between the CYP2E1 Rsa I/Pst I polymorphism and oral cancer risk were controversial. To investigate this association, we performed a meta-analysis of English-language studies published through June 2010 that assessed the relationship between oral cancer and genotype. Twelve published case-control studies of 1259 patients with oral cancer and 2262 controls were included. Odds ratios (ORs) with 95% confidence intervals (CIs) were used to assess the strength of the association in codominant and dominant models. Overall, the pooled ORs indicated a significant association between the CYP2E1 Rsa I/Pst I polymorphism and oral cancer risk (for c1/c2 vs. c1/c1: OR=1.30, 95% CI=1.04-1.62, Pheterogeneity=0.57; for (c1/c2+c2/c2) vs. c1/c1: OR=1.32, 95% CI=1.07-1.64, Pheterogeneity=0.57, respectively). In subgroup analysis by race, the same significant risks were found among Asians (for c1/c2 vs. c1/c1: OR=1.41, 95% CI=1.05-1.91, Pheterogeneity=0.92; for (c1/c2+c2/c2) vs. c1/c1: OR=1.43, 95% CI=1.08-1.88, Pheterogeneity=0.97, respectively). In conclusion, this meta-analysis demonstrates that the CYP2E1 Rsa I/Pst I c2 allele may be a biomarker for oral cancer, especially among Asian populations.

  10. Processes and subdivisions in diogenites, a multivariate statistical analysis

    NASA Technical Reports Server (NTRS)

    Harriott, T. A.; Hewins, R. H.

    1984-01-01

    Multivariate statistical techniques used on diogenite orthopyroxene analyses show the relationships that occur within diogenites and the two orthopyroxenite components (class I and II) in the polymict diogenite Garland. Cluster analysis shows that only Peckelsheim is similar to Garland class I (Fe-rich) and the other diogenites resemble Garland class II. The unique diogenite Y 75032 may be related to type I by fractionation. Factor analysis confirms the subdivision and shows that Fe does not correlate with the weakly incompatible elements across the entire pyroxene composition range, indicating that igneous fractionation is not the process controlling total diogenite composition variation. The occurrence of two groups of diogenites is interpreted as the result of sampling or mixing of two main sequences of orthopyroxene cumulates with slightly different compositions.

  11. The Thurgood Marshall School of Law Empirical Findings: A Report of the Statistical Analysis of the February 2010 TMSL Texas Bar Results

    ERIC Educational Resources Information Center

    Kadhi, T.; Holley, D.; Rudley, D.; Garrison, P.; Green, T.

    2010-01-01

    The following report gives the statistical findings of the 2010 Thurgood Marshall School of Law (TMSL) Texas Bar results. This data was pre-existing and was given to the Evaluator by email from the Dean. Then, in-depth statistical analyses were run using the SPSS 17 to address the following questions: 1. What are the statistical descriptors of the…

  12. The Thurgood Marshall School of Law Empirical Findings: A Report of the Statistical Analysis of the July 2010 TMSL Texas Bar Results

    ERIC Educational Resources Information Center

    Kadhi, Tau; Holley, D.

    2010-01-01

    The following report gives the statistical findings of the July 2010 TMSL Bar results. Procedures: Data is pre-existing and was given to the Evaluator by email from the Registrar and Dean. Statistical analyses were run using SPSS 17 to address the following research questions: 1. What are the statistical descriptors of the July 2010 overall TMSL…

  13. Linear and Order Statistics Combiners for Pattern Classification

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Ghosh, Joydeep; Lau, Sonie (Technical Monitor)

    2001-01-01

    Several researchers have experimentally shown that substantial improvements can be obtained in difficult pattern recognition problems by combining or integrating the outputs of multiple classifiers. This chapter provides an analytical framework to quantify the improvements in classification results due to combining. The results apply to both linear combiners and order statistics combiners. We first show that, to a first-order approximation, the error rate obtained over and above the Bayes error rate is directly proportional to the variance of the actual decision boundaries around the Bayes optimum boundary. Combining classifiers in output space reduces this variance, and hence reduces the 'added' error. If N unbiased classifiers are combined by simple averaging, the added error rate can be reduced by a factor of N if the individual errors in approximating the decision boundaries are uncorrelated. Expressions are then derived for linear combiners which are biased or correlated, and the effect of output correlations on ensemble performance is quantified. For order statistics based non-linear combiners, we derive expressions that indicate how much the median, the maximum and, in general, the i-th order statistic can improve classifier performance. The analysis presented here facilitates the understanding of the relationships among error rates, classifier boundary distributions, and combining in output space. Experimental results on several public domain data sets are provided to illustrate the benefits of combining and to support the analytical results.
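
    The factor-of-N variance reduction claimed for averaging N unbiased, uncorrelated classifiers can be checked with a short Monte Carlo sketch; this is an illustration of the statistical principle, not the chapter's derivation.

```python
import random
import statistics

random.seed(0)  # deterministic for reproducibility

def averaged_estimate_variance(n_classifiers, trials=4000):
    """Variance of the average of n unbiased, uncorrelated, unit-variance
    boundary estimates, estimated by Monte Carlo."""
    averages = [
        statistics.mean(random.gauss(0.0, 1.0) for _ in range(n_classifiers))
        for _ in range(trials)
    ]
    return statistics.pvariance(averages)

v1 = averaged_estimate_variance(1)
v5 = averaged_estimate_variance(5)
print(v1 / v5)  # close to 5, i.e., a factor-of-N variance reduction
```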

  14. Statistical deprojection of galaxy pairs

    NASA Astrophysics Data System (ADS)

    Nottale, Laurent; Chamaraux, Pierre

    2018-06-01

    Aims: The purpose of the present paper is to provide methods of statistical analysis of the physical properties of galaxy pairs. We perform this study to apply it later to catalogs of isolated pairs of galaxies, especially two new catalogs we recently constructed that contain ≈1000 and ≈13 000 pairs, respectively. We are particularly interested in the dynamics of those pairs, including the determination of their masses. Methods: We could not compute the dynamical parameters directly since the necessary data are incomplete. Indeed, we only have at our disposal one component of the intervelocity between the members, namely along the line of sight, and two components of their interdistance, i.e., the projection on the sky-plane. Moreover, we know only one point of each galaxy orbit. Hence we need statistical methods to find the probability distribution of 3D interdistances and 3D intervelocities from their projections; we refer to these methods collectively as deprojection. Results: We proceed in two steps to determine and use the deprojection methods. First we derive the probability distributions expected for the various relevant projected quantities, namely intervelocity vz, interdistance rp, their ratio, and the product rp v_z^2, which is involved in mass determination. In a second step, we propose various methods of deprojection of those parameters based on the previous analysis. We start from a histogram of the projected data and we apply inversion formulae to obtain the deprojected distributions; lastly, we test the methods by numerical simulations, which also allow us to determine the uncertainties involved.
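
    The projection geometry that deprojection must invert can be illustrated with a short Monte Carlo sketch (this is not the paper's inversion formula): for an isotropically oriented pair, the mean ratio of projected to true separation is E[sin θ] = π/4.

```python
import math
import random

random.seed(1)

# For an isotropically oriented pair with true 3D separation r, the projected
# sky-plane separation is r_p = r*sin(theta), theta being the angle between
# the pair axis and the line of sight. Averaging r_p/r over isotropic
# orientations recovers the classic deprojection factor E[sin theta] = pi/4.
def mean_projection_factor(samples=200_000):
    total = 0.0
    for _ in range(samples):
        cos_t = random.uniform(-1.0, 1.0)          # isotropic: cos(theta) uniform
        total += math.sqrt(1.0 - cos_t * cos_t)    # sin(theta) = r_p / r
    return total / samples

factor = mean_projection_factor()
print(factor, math.pi / 4)
```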

  15. Use of statistical study methods for the analysis of the results of the imitation modeling of radiation transfer

    NASA Astrophysics Data System (ADS)

    Alekseenko, M. A.; Gendrina, I. Yu.

    2017-11-01

    Recently, owing to the abundance of observational data of various types in systems of vision through the atmosphere and the need to process these data, methods of statistical research such as correlation-regression analysis, time-series analysis, and analysis of variance have become relevant to the study of such systems. We have attempted to apply elements of correlation-regression analysis to the study, and subsequent prediction, of the patterns of radiation transfer in these systems, as well as to the construction of radiation models of the atmosphere. In this paper, we present some results of the statistical processing of numerical simulations of the characteristics of vision systems through the atmosphere, obtained with the help of a special software package.

  16. A statistical test to show negligible trend

    Treesearch

    Philip M. Dixon; Joseph H.K. Pechmann

    2005-01-01

    The usual statistical tests of trend are inappropriate for demonstrating the absence of trend. This is because failure to reject the null hypothesis of no trend does not prove that null hypothesis. The appropriate statistical method is based on an equivalence test. The null hypothesis is that the trend is not zero, i.e., outside an a priori specified equivalence region...
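
    A minimal sketch of such an equivalence approach for a regression slope, using two one-sided tests with a normal quantile in place of the exact t quantile, and invented count data; the Dixon and Pechmann procedure itself is not reproduced here.

```python
import math
import statistics

def slope_and_se(xs, ys):
    """Ordinary least-squares slope and its standard error."""
    n = len(xs)
    mx, my = statistics.mean(xs), statistics.mean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    a = my - b * mx
    resid_ss = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return b, math.sqrt(resid_ss / (n - 2) / sxx)

def negligible_trend(xs, ys, delta, z=1.645):
    """Two one-sided tests: declare the trend negligible only if the 90% CI
    for the slope lies entirely inside the equivalence region (-delta, delta).
    (z = 1.645 approximates the exact t quantile, for simplicity.)"""
    b, se = slope_and_se(xs, ys)
    return (b - z * se > -delta) and (b + z * se < delta)

years = list(range(10))
counts = [50, 52, 49, 51, 50, 48, 51, 50, 49, 51]   # hypothetical stable counts
print(negligible_trend(years, counts, delta=1.0))   # True: trend negligible
```

    Note the logic: failing to reject "slope = 0" proves nothing, whereas rejecting both one-sided hypotheses "slope ≤ -delta" and "slope ≥ +delta" positively demonstrates the trend is negligible for an a priori chosen delta.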

  17. Statistics and Discoveries at the LHC (2/4)

    ScienceCinema

    Cowan, Glen

    2018-04-26

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and the treatment of systematic uncertainties.

  18. Descriptive statistics.

    PubMed

    Nick, Todd G

    2007-01-01

    Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
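
    To make the location/spread distinction concrete, a small sketch with hypothetical measurements; note how a single outlier affects the mean more than the median.

```python
import statistics

# Hypothetical quantitative measurements; the last value is an outlier.
data = [2.1, 2.4, 2.4, 2.7, 3.0, 3.2, 3.5, 4.1, 4.4, 9.8]

q1, _, q3 = statistics.quantiles(data, n=4)          # quartiles
loc = {"mean": statistics.mean(data), "median": statistics.median(data)}
spread = {"sd": statistics.stdev(data), "iqr": q3 - q1}
print(loc, spread)  # the outlier pulls the mean above the median
```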

  19. [Analysis on 2011 quality control results on aerobic plate count of microbiology laboratories in China].

    PubMed

    Han, Haihong; Li, Ning; Li, Yepeng; Fu, Ping; Yu, Dongmin; Li, Zhigang; Du, Chunming; Guo, Yunchang

    2015-01-01

    To test the aerobic plate count examination capability of microbiology laboratories, to ensure the accuracy and comparability of quantitative bacterial examination results, and to improve the quality of monitoring. Aerobic plate count film samples at 4 different concentrations were prepared and labeled I, II, III, and IV. After homogeneity and stability tests, the samples were delivered to the monitoring institutions. The results for samples I, II, and III were log-transformed and evaluated with the Z-score method, using the robust average and standard deviation. The results for sample IV were evaluated as "satisfactory" when reported as < 10 CFU/piece, or as "not satisfactory" otherwise. The Pearson χ2 test was used to compare rates. A total of 309 monitoring institutions, 99.04% of the total number, reported their results. Of these, 271 institutions reported a satisfactory result, a satisfactory rate of 87.70%. There was no statistically significant difference in the satisfactory rates for samples I, II, and III, which were 81.52%, 88.30%, and 91.40%, respectively. The satisfactory rate for sample IV was 93.33%. There was also no statistically significant difference in satisfactory rates between provincial and municipal CDCs. The quality control program provided scientific evidence that the aerobic plate count capability of the laboratories meets the requirements of the monitoring tasks.
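
    The log-transform-then-robust-Z-score evaluation described above can be sketched as follows; the reported counts are invented, and the median/MAD estimators stand in for whatever robust average and standard deviation (e.g., an ISO 13528-style procedure) the program actually used.

```python
import math
import statistics

# Hypothetical counts (CFU/piece) reported by labs for one sample level;
# the sixth lab is an outlier.
reported = [9.5e4, 1.1e5, 1.0e5, 9.0e4, 1.2e5, 4.0e5, 1.05e5]

logs = [math.log10(v) for v in reported]             # log-transform first
center = statistics.median(logs)                     # robust centre
mad = statistics.median([abs(x - center) for x in logs])
spread = 1.4826 * mad                                # MAD scaled to ~SD (normal)

z_scores = [(x - center) / spread for x in logs]
satisfactory = [abs(z) <= 2 for z in z_scores]       # common |Z| <= 2 criterion
print([round(z, 2) for z in z_scores])
```

    Robust estimators matter here: a classical mean/SD computed on data that include the outlier would inflate the spread and could let a grossly discrepant lab pass.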

  20. Apical P2XR contribute to [Ca2+]i signaling and Isc in mouse renal MCD.

    PubMed

    Li, Liuzhe; Lynch, I Jeanette; Zheng, Wencui; Cash, Melanie N; Teng, Xueling; Wingo, Charles S; Verlander, Jill W; Xia, Shen-Ling

    2007-08-03

    We examined P2X receptor expression and distribution in the mouse collecting duct (CD) and their functional role in Ca(2+) signaling. Both P2X(1) and P2X(4) were detected by RT-PCR and Western blot. Immunohistochemistry demonstrated apical P2X(1) and P2X(4) immunoreactivity in principal cells in the outer medullary CD (OMCD) and inner medullary CD (IMCD). Luminal ATP induced an increase in Ca(2+) signaling in native medullary CD (MCD) as measured by fluorescence imaging. ATP also induced an increase in Ca(2+) signaling in MCD cells grown in primary culture but not in the presence of P2XR antagonist PPNDS. Short circuit current (I(sc)) measurement with mouse IMCD cells showed that P2XR agonist BzATP induced a larger I(sc) than did P2YR agonist UTP in the apical membrane. Our data reveal for the first time that P2X(1) and P2X(4) are cell-specific with prominent immunoreactivity in the apical area of MCD cells. The finding that P2XR blockade inhibits ATP-induced Ca(2+) signaling suggests that activation of P2XR is a key step in Ca(2+)-dependent purinergic signaling. The result that activation of P2XR produces large I(sc) indicates the necessity of P2XR in renal CD ion transport.

  1. iMatTOUGH: An open-source Matlab-based graphical user interface for pre- and post-processing of TOUGH2 and iTOUGH2 models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tran, Anh Phuong; Dafflon, Baptiste; Hubbard, Susan

    TOUGH2 and iTOUGH2 are powerful models that simulate heat and fluid flows in porous and fractured media, and perform parameter estimation, sensitivity analysis, and uncertainty propagation analysis. However, setting up the input files is not only tedious but error prone, and processing output files is time consuming. Here, we present an open-source Matlab-based tool (iMatTOUGH) that supports the generation of all necessary inputs for both TOUGH2 and iTOUGH2 and visualizes their outputs. The tool links the inputs of TOUGH2 and iTOUGH2, ensuring the two input files are consistent. It supports the generation of rectangular computational meshes, i.e., it automatically generates the elements and connections as well as their properties as required by TOUGH2. The tool also allows the specification of initial and time-dependent boundary conditions for better subsurface heat and water flow simulations. The effectiveness of the tool is illustrated by an example that uses TOUGH2 and iTOUGH2 to estimate soil hydrological and thermal properties from soil temperature data and simulate the heat and water flows at the Rifle site in Colorado.

  2. iMatTOUGH: An open-source Matlab-based graphical user interface for pre- and post-processing of TOUGH2 and iTOUGH2 models

    DOE PAGES

    Tran, Anh Phuong; Dafflon, Baptiste; Hubbard, Susan

    2016-04-01

    TOUGH2 and iTOUGH2 are powerful models that simulate heat and fluid flows in porous and fractured media, and perform parameter estimation, sensitivity analysis, and uncertainty propagation analysis. However, setting up the input files is not only tedious but error prone, and processing output files is time consuming. Here, we present an open-source Matlab-based tool (iMatTOUGH) that supports the generation of all necessary inputs for both TOUGH2 and iTOUGH2 and visualizes their outputs. The tool links the inputs of TOUGH2 and iTOUGH2, ensuring the two input files are consistent. It supports the generation of rectangular computational meshes, i.e., it automatically generates the elements and connections as well as their properties as required by TOUGH2. The tool also allows the specification of initial and time-dependent boundary conditions for better subsurface heat and water flow simulations. The effectiveness of the tool is illustrated by an example that uses TOUGH2 and iTOUGH2 to estimate soil hydrological and thermal properties from soil temperature data and simulate the heat and water flows at the Rifle site in Colorado.

  3. The non-equilibrium statistical mechanics of a simple geophysical fluid dynamics model

    NASA Astrophysics Data System (ADS)

    Verkley, Wim; Severijns, Camiel

    2014-05-01

    Lorenz [1] has devised a dynamical system that has proved to be very useful as a benchmark system in geophysical fluid dynamics. The system in its simplest form consists of a periodic array of variables that can be associated with an atmospheric field on a latitude circle. The system is driven by a constant forcing, is damped by linear friction and has a simple advection term that causes the model to behave chaotically if the forcing is large enough. Our aim is to predict the statistics of Lorenz' model on the basis of a given average value of its total energy - obtained from a numerical integration - and the assumption of statistical stationarity. Our method is the principle of maximum entropy [2] which in this case reads: the information entropy of the system's probability density function shall be maximal under the constraints of normalization, a given value of the average total energy and statistical stationarity. Statistical stationarity is incorporated approximately by using `stationarity constraints', i.e., by requiring that the average first and possibly higher-order time-derivatives of the energy are zero in the maximization of entropy. The analysis [3] reveals that, if the first stationarity constraint is used, the resulting probability density function rather accurately reproduces the statistics of the individual variables. If the second stationarity constraint is used as well, the correlations between the variables are also reproduced quite adequately. The method can be generalized straightforwardly and holds the promise of a viable non-equilibrium statistical mechanics of the forced-dissipative systems of geophysical fluid dynamics. [1] E.N. Lorenz, 1996: Predictability - A problem partly solved, in Proc. Seminar on Predictability (ECMWF, Reading, Berkshire, UK), Vol. 1, pp. 1-18. [2] E.T. Jaynes, 2003: Probability Theory - The Logic of Science (Cambridge University Press, Cambridge). [3] W.T.M. Verkley and C.A. Severijns, 2014: The maximum entropy
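
    The constrained entropy maximization described above can be illustrated, in a much simpler setting than the Lorenz model, by finding the maximum-entropy distribution over discrete energy levels with a prescribed mean energy: the solution is the Gibbs form p_i ∝ exp(-βE_i), with β fixed by the constraint. The energy levels and target mean below are arbitrary choices for illustration.

```python
import math

energies = [0.0, 1.0, 2.0, 3.0, 4.0]   # arbitrary discrete energy levels
target_mean = 1.2                       # prescribed average energy constraint

def gibbs(beta):
    """Maximum-entropy (Gibbs) distribution p_i ∝ exp(-beta * E_i)."""
    w = [math.exp(-beta * e) for e in energies]
    z = sum(w)                          # partition function (normalization)
    return [x / z for x in w]

def mean_energy(beta):
    return sum(p * e for p, e in zip(gibbs(beta), energies))

# Bisection for beta: mean energy decreases monotonically as beta grows.
lo_b, hi_b = -10.0, 10.0
for _ in range(100):
    mid = 0.5 * (lo_b + hi_b)
    if mean_energy(mid) > target_mean:
        lo_b = mid
    else:
        hi_b = mid
beta = 0.5 * (lo_b + hi_b)

p = gibbs(beta)
entropy = -sum(q * math.log(q) for q in p)
print(beta, entropy)
```

    The stationarity constraints used in the paper enter the same way: each additional constraint contributes another Lagrange multiplier to the exponent of the maximum-entropy density.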

  4. I + (H2O)2 → HI + (H2O)OH Forward and Reverse Reactions. CCSD(T) Studies Including Spin-Orbit Coupling.

    PubMed

    Wang, Hui; Li, Guoliang; Li, Qian-Shu; Xie, Yaoming; Schaefer, Henry F

    2016-03-03

    The potential energy profile for the atomic iodine plus water dimer reaction I + (H2O)2 → HI + (H2O)OH has been explored using the "Gold Standard" CCSD(T) method with quadruple-ζ correlation-consistent basis sets. The corresponding information for the reverse reaction HI + (H2O)OH → I + (H2O)2 is also derived. Both zero-point vibrational energies (ZPVEs) and spin-orbit (SO) coupling are considered, and these notably alter the classical energetics. On the basis of the CCSD(T)/cc-pVQZ-PP results, including ZPVE and SO coupling, the forward reaction is found to be endothermic by 47.4 kcal/mol, implying a significant exothermicity for the reverse reaction. The entrance complex I···(H2O)2 is bound by 1.8 kcal/mol, and this dissociation energy is significantly affected by SO coupling. The reaction barrier lies 45.1 kcal/mol higher than the reactants. The exit complex HI···(H2O)OH is bound by 3.0 kcal/mol relative to the asymptotic limit. At every level of theory, the reverse reaction HI + (H2O)OH → I + (H2O)2 proceeds without a barrier. Compared with the analogous water monomer reaction I + H2O → HI + OH, the additional water molecule reduces the relative energies of the entrance stationary point, transition state, and exit complex by 3-5 kcal/mol. The I + (H2O)2 reaction is related to the valence isoelectronic bromine and chlorine reactions but is distinctly different from the F + (H2O)2 system.

  5. Concept Maps in Introductory Statistics

    ERIC Educational Resources Information Center

    Witmer, Jeffrey A.

    2016-01-01

    Concept maps are tools for organizing thoughts on the main ideas in a course. I present an example of a concept map that was created through the work of students in an introductory class and discuss major topics in statistics and relationships among them.

  6. Macrophages From Irradiated Tumors Express Higher Levels of iNOS, Arginase-I and COX-2, and Promote Tumor Growth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsai, C.-S.; Graduate Institute of Clinical Medical Sciences, Chang Gung University, Taiwan; Chen, F.-H.

    2007-06-01

    Purpose: To investigate the effects of single and fractionated doses of radiation on tumors and tumor-associated macrophages (TAMs), and to elucidate the potential of TAMs to influence tumor growth. Methods and Materials: A murine prostate cell line, TRAMP-C1, was grown in C57Bl/6J mice to 4-mm tumor diameter and irradiated with either 25 Gy in a single dose, or 60 Gy in 15 fractions. The tumors were removed at the indicated times and assessed for a variety of markers related to TAM content, activation status, and function. Results: In tumors receiving a single radiation dose, arginase (Arg-I) and cyclooxygenase-2 (COX-2) mRNA expression increased as a small transient wave within 24 h and a larger persistent wave starting after 3 days. Inducible nitric oxide synthase (iNOS) mRNA was elevated only after 3 days and continued to increase up to 3 weeks. After fractionated irradiation, Arg-1 and COX-2 mRNA levels increased within 5 days, whereas iNOS was increased only after 10 fractions of irradiation had been given. Increased levels of Arg-I, COX-2, and, to a lesser extent, iNOS protein were found to associate with TAMs 1-2 weeks after tumor irradiation. The function of TAMs was compared by mixing them with TRAMP-C1 cells and injecting them into mice; TRAMP-C1 cells mixed with TAMs from irradiated tumors appeared earlier and grew significantly faster than those mixed with TAMs from unirradiated tumors or TRAMP-C1 alone. Conclusions: Tumor-associated macrophages in the postirradiated tumor microenvironment express higher levels of Arg-1, COX-2, and iNOS, and promote early tumor growth in vivo.

  7. Statistical characteristics of trajectories of diamagnetic unicellular organisms in a magnetic field.

    PubMed

    Gorobets, Yu I; Gorobets, O Yu

    2015-01-01

    A statistical model is proposed in this paper for describing the orientation of trajectories of unicellular diamagnetic organisms in a magnetic field. A statistical parameter, the effective energy, is calculated on the basis of this model. The resulting effective energy is a statistical characteristic of the trajectories of diamagnetic microorganisms in a magnetic field connected with their metabolism. The model is applicable when the energy of the thermal motion of the bacteria is negligible in comparison with their energy in a magnetic field and the bacteria manifest significant "active random movement", i.e., randomizing motion of a non-thermal nature, for example, movement by means of flagella. The energy of this randomizing active self-motion is characterized by a new statistical parameter for biological objects, which replaces the energy of randomizing thermal motion in the calculation of the statistical distribution. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Do infants retain the statistics of a statistical learning experience? Insights from a developmental cognitive neuroscience perspective

    PubMed Central

    2017-01-01

    Statistical structure abounds in language. Human infants show a striking capacity for using statistical learning (SL) to extract regularities in their linguistic environments, a process thought to bootstrap their knowledge of language. Critically, studies of SL test infants in the minutes immediately following familiarization, but long-term retention unfolds over hours and days, with almost no work investigating retention of SL. This creates a critical gap in the literature given that we know little about how single or multiple SL experiences translate into permanent knowledge. Furthermore, different memory systems with vastly different encoding and retention profiles emerge at different points in development, with the underlying memory system dictating the fidelity of the memory trace hours later. I describe the scant literature on retention of SL, the learning and retention properties of memory systems as they apply to SL, and the development of these memory systems. I propose that different memory systems support retention of SL in infant and adult learners, suggesting an explanation for the slow pace of natural language acquisition in infancy. I discuss the implications of developing memory systems for SL and suggest that we exercise caution in extrapolating from adult to infant properties of SL. This article is part of the themed issue ‘New frontiers for statistical learning in the cognitive sciences’. PMID:27872372

  9. Do infants retain the statistics of a statistical learning experience? Insights from a developmental cognitive neuroscience perspective.

    PubMed

    Gómez, Rebecca L

    2017-01-05

    Statistical structure abounds in language. Human infants show a striking capacity for using statistical learning (SL) to extract regularities in their linguistic environments, a process thought to bootstrap their knowledge of language. Critically, studies of SL test infants in the minutes immediately following familiarization, but long-term retention unfolds over hours and days, with almost no work investigating retention of SL. This creates a critical gap in the literature given that we know little about how single or multiple SL experiences translate into permanent knowledge. Furthermore, different memory systems with vastly different encoding and retention profiles emerge at different points in development, with the underlying memory system dictating the fidelity of the memory trace hours later. I describe the scant literature on retention of SL, the learning and retention properties of memory systems as they apply to SL, and the development of these memory systems. I propose that different memory systems support retention of SL in infant and adult learners, suggesting an explanation for the slow pace of natural language acquisition in infancy. I discuss the implications of developing memory systems for SL and suggest that we exercise caution in extrapolating from adult to infant properties of SL. This article is part of the themed issue 'New frontiers for statistical learning in the cognitive sciences'. © 2016 The Author(s).

  10. Community annotation experiment for ground truth generation for the i2b2 medication challenge

    PubMed Central

    Solti, Imre; Xia, Fei; Cadag, Eithon

    2010-01-01

    Objective Within the context of the Third i2b2 Workshop on Natural Language Processing Challenges for Clinical Records, the authors (also referred to as ‘the i2b2 medication challenge team’ or ‘the i2b2 team’ for short) organized a community annotation experiment. Design For this experiment, the authors released annotation guidelines and a small set of annotated discharge summaries. They asked the participants of the Third i2b2 Workshop to annotate 10 discharge summaries per person; each discharge summary was annotated by two annotators from two different teams, and a third annotator from a third team resolved disagreements. Measurements In order to evaluate the reliability of the annotations thus produced, the authors measured community inter-annotator agreement and compared it with the inter-annotator agreement of expert annotators when both the community and the expert annotators generated ground truth based on pooled system outputs. For this purpose, the pool consisted of the three most densely populated automatic annotations of each record. The authors also compared the community inter-annotator agreement with expert inter-annotator agreement when the experts annotated raw records without using the pool. Finally, they measured the quality of the community ground truth by comparing it with the expert ground truth. Results and conclusions The authors found that the community annotators achieved comparable inter-annotator agreement to expert annotators, regardless of whether the experts annotated from the pool. Furthermore, the ground truth generated by the community obtained F-measures above 0.90 against the ground truth of the experts, indicating the value of the community as a source of high-quality ground truth even on intricate and domain-specific annotation tasks. PMID:20819855

  11. Computation of statistical secondary structure of nucleic acids.

    PubMed Central

    Yamamoto, K; Kitamura, Y; Yoshikura, H

    1984-01-01

    This paper presents a computer analysis of the statistical secondary structure of nucleic acids. For a given single-stranded nucleic acid, we generated a "structure map" that included all the annealing structures in the sequence. The map was transformed into an "energy map" by rough approximation; here, the energy level of every pairing structure consisting of more than 2 successive nucleic acid pairs was calculated. Using the "energy map", the probability of occurrence of each annealed structure was computed, i.e., the structure was computed statistically. The basis of the computation was the 8-queen problem of the chess game. The validity of our computer programme was checked by computing the well-established tRNA structure. Successful application of this programme to small nuclear RNAs of various origins is demonstrated. PMID:6198622
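
    The 8-queen problem cited as the basis of the computation is a classic backtracking search. The sketch below shows that search pattern (counting non-attacking queen placements); the paper's mapping from queens to mutually compatible base-pairing structures is not reproduced here.

```python
def n_queens(n):
    """Count placements of n non-attacking queens by backtracking -- the same
    exhaustive, constraint-pruned search style used to enumerate compatible
    pairings."""
    solutions = 0
    cols, diag1, diag2 = set(), set(), set()

    def place(row):
        nonlocal solutions
        if row == n:
            solutions += 1
            return
        for col in range(n):
            if col in cols or row + col in diag1 or row - col in diag2:
                continue  # conflicts with an earlier queen: prune this branch
            cols.add(col); diag1.add(row + col); diag2.add(row - col)
            place(row + 1)
            cols.discard(col); diag1.discard(row + col); diag2.discard(row - col)

    place(0)
    return solutions

print(n_queens(8))  # the classic 8x8 board has 92 solutions
```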

  12. Significant genotype difference in the CYP2E1 PstI polymorphism of indigenous groups in Sabah, Malaysia with Asian and non-Asian populations.

    PubMed

    Goh, Lucky Poh Wah; Chong, Eric Tzyy Jiann; Chua, Kek Heng; Chuah, Jitt Aun; Lee, Ping-Chin

    2014-01-01

    CYP2E1 PstI polymorphism G-1259C (rs3813867) genotype distributions vary significantly among different populations and are associated both with diseases, such as cancer, and with adverse drug effects. To date, there have been limited reports of the genotype distributions and allele frequencies of this polymorphism in the three major indigenous ethnic groups (KadazanDusun, Bajau, and Rungus) in Sabah, also known as North Borneo. The aim of this study was to investigate the genotype distributions and allele frequencies of the CYP2E1 PstI polymorphism G-1259C in these three major indigenous peoples of Sabah. A total of 640 healthy individuals from the three dominant indigenous groups were recruited for this study. Polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP) analysis at the G-1259C polymorphic site of the CYP2E1 gene was performed using the Pst I restriction enzyme. Fragments were analyzed using agarose gel electrophoresis and confirmed by direct sequencing. Overall, the allele frequencies were 90.3% for the c1 allele and 9.7% for the c2 allele. The genotype frequencies for c1/c1, c1/c2, and c2/c2 were 80.9%, 18.8%, and 0.3%, respectively. A highly statistically significant difference (p<0.001) was observed in the genotype distributions between the indigenous groups in Sabah and all Asian and non-Asian populations. However, among these three indigenous groups, there was no statistically significant difference (p>0.001) in genotype distributions. The three major indigenous ethnic groups in Sabah show unique genotype distributions when compared with other populations. This finding indicates the importance of establishing the genotype distributions of the CYP2E1 PstI polymorphism in indigenous populations.

  13. Dealing with the Conflicting Results of Psycholinguistic Experiments: How to Resolve Them with the Help of Statistical Meta-analysis.

    PubMed

    Rákosi, Csilla

    2018-01-22

    This paper proposes the use of the tools of statistical meta-analysis as a method of conflict resolution with respect to experiments in cognitive linguistics. With the help of statistical meta-analysis, the effect size of similar experiments can be compared, a well-founded and robust synthesis of the experimental data can be achieved, and possible causes of any divergence(s) in the outcomes can be revealed. This application of statistical meta-analysis offers a novel method of how diverging evidence can be dealt with. The workability of this idea is exemplified by a case study dealing with a series of experiments conducted as non-exact replications of Thibodeau and Boroditsky (PLoS ONE 6(2):e16782, 2011. https://doi.org/10.1371/journal.pone.0016782 ).
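
    A common toolkit for such a synthesis is inverse-variance pooling with the DerSimonian-Laird between-study variance and the I² heterogeneity fraction; the effect sizes below are invented for illustration, not taken from the replications discussed above.

```python
# Hypothetical per-study effect sizes (e.g., log odds ratios) and their
# within-study variances -- illustrative numbers only.
effects = [0.4, 0.1, 0.6, 0.2]
variances = [0.04, 0.05, 0.04, 0.06]

w = [1.0 / v for v in variances]              # inverse-variance weights
mean_fe = sum(wi * y for wi, y in zip(w, effects)) / sum(w)

# Cochran's Q and the DerSimonian-Laird between-study variance tau^2
Q = sum(wi * (y - mean_fe) ** 2 for wi, y in zip(w, effects))
df = len(effects) - 1
C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - df) / C)

# I^2: fraction of total variation attributable to between-study heterogeneity
I2 = max(0.0, (Q - df) / Q)
print(round(mean_fe, 3), round(tau2, 4), round(I2, 3))
```

    Divergent replications show up here as a large Q relative to its degrees of freedom, hence a large tau² and I²; as the head note on the bias of I² warns, these heterogeneity estimates should be read cautiously when the number of studies is small.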

  14. An i2b2-based, generalizable, open source, self-scaling chronic disease registry.

    PubMed

    Natter, Marc D; Quan, Justin; Ortiz, David M; Bousvaros, Athos; Ilowite, Norman T; Inman, Christi J; Marsolo, Keith; McMurry, Andrew J; Sandborg, Christy I; Schanberg, Laura E; Wallace, Carol A; Warren, Robert W; Weber, Griffin M; Mandl, Kenneth D

    2013-01-01

    Registries are a well-established mechanism for obtaining high quality, disease-specific data, but are often highly project-specific in their design, implementation, and policies for data use. In contrast to the conventional model of centralized data contribution, warehousing, and control, we design a self-scaling registry technology for collaborative data sharing, based upon the widely adopted Integrating Biology & the Bedside (i2b2) data warehousing framework and the Shared Health Research Information Network (SHRINE) peer-to-peer networking software. Focusing our design around creation of a scalable solution for collaboration within multi-site disease registries, we leverage the i2b2 and SHRINE open source software to create a modular, ontology-based, federated infrastructure that provides research investigators full ownership and access to their contributed data while supporting permissioned yet robust data sharing. We accomplish these objectives via web services supporting peer-group overlays, group-aware data aggregation, and administrative functions. The 56-site Childhood Arthritis & Rheumatology Research Alliance (CARRA) Registry and 3-site Harvard Inflammatory Bowel Diseases Longitudinal Data Repository now utilize i2b2 self-scaling registry technology (i2b2-SSR). This platform, extensible to federation of multiple projects within and between research networks, encompasses >6000 subjects at sites throughout the USA. We utilize the i2b2-SSR platform to minimize technical barriers to collaboration while enabling fine-grained control over data sharing. The implementation of i2b2-SSR for the multi-site, multi-stakeholder CARRA Registry has established a digital infrastructure for community-driven research data sharing in pediatric rheumatology in the USA. We envision i2b2-SSR as a scalable, reusable solution facilitating interdisciplinary research across diseases.

  15. Proapoptotic signaling induced by RIG-I and MDA-5 results in type I interferon–independent apoptosis in human melanoma cells

    PubMed Central

    Besch, Robert; Poeck, Hendrik; Hohenauer, Tobias; Senft, Daniela; Häcker, Georg; Berking, Carola; Hornung, Veit; Endres, Stefan; Ruzicka, Thomas; Rothenfusser, Simon; Hartmann, Gunther

    2009-01-01

    The retinoic acid–inducible gene I (RIG-I) and melanoma differentiation–associated antigen 5 (MDA-5) helicases sense viral RNA in infected cells and initiate antiviral responses such as the production of type I IFNs. Here we have shown that RIG-I and MDA-5 also initiate a proapoptotic signaling pathway that is independent of type I IFNs. In human melanoma cells, this signaling pathway required the mitochondrial adapter Cardif (also known as IPS-1) and induced the proapoptotic BH3-only proteins Puma and Noxa. RIG-I– and MDA-5–initiated apoptosis required Noxa but was independent of the tumor suppressor p53. Triggering this pathway led to efficient activation of mitochondrial apoptosis, requiring caspase-9 and Apaf-1. Surprisingly, this proapoptotic signaling pathway was also active in nonmalignant cells, but these cells were much less sensitive to apoptosis than melanoma cells. Endogenous Bcl-xL rescued nonmalignant, but not melanoma, cells from RIG-I– and MDA-5–mediated apoptosis. In addition, we confirmed the results of the in vitro studies, demonstrating that RIG-I and MDA-5 ligands both reduced human tumor lung metastasis in immunodeficient NOD/SCID mice. These results identify an IFN-independent antiviral signaling pathway initiated by RIG-I and MDA-5 that activates proapoptotic signaling and, unless blocked by Bcl-xL, results in apoptosis. Given their combined immunostimulatory and proapoptotic activity, RIG-I and MDA-5 ligands have therapeutic potential, as they can overcome the characteristic resistance of melanoma cells to apoptosis. PMID:19620789

  16. Aftershock Energy Distribution by Statistical Mechanics Approach

    NASA Astrophysics Data System (ADS)

    Daminelli, R.; Marcellini, A.

    2015-12-01

    The aim of our work is to find the most probable distribution of the energy of aftershocks. We start from one of the fundamental principles of statistical mechanics, which, in the case of aftershock sequences, can be expressed as follows: the greater the number of different ways in which the energy of aftershocks can be arranged among the energy cells in phase space, the more probable the distribution. We assume that each cell in phase space has the same probability of being occupied, and that more than one cell in the phase space can have the same energy. Since seismic energy is proportional to products of different parameters, a number of different combinations of parameters can produce the same energy (e.g., different combinations of stress drop and fault area can release the same seismic energy). Let us assume that there are gi cells in the aftershock phase space characterised by the same released energy ɛi. We can therefore assume that Maxwell-Boltzmann statistics apply to aftershock sequences, with the proviso that the judgment on the validity of this hypothesis is the agreement with the data. The aftershock energy distribution can then be written as follows: n(ɛ) = A g(ɛ) exp(-βɛ), where n(ɛ) is the number of aftershocks with energy ɛ, and A and β are constants. Under the above hypothesis, we can assume that g(ɛ) is proportional to ɛ. We selected and analysed different aftershock sequences (data extracted from the earthquake catalogues of SCEC, INGV-CNT and other institutions) with a minimum retained magnitude ML=2 (in some cases ML=2.6) and a time window of 35 days. The results of our model are in agreement with the data, except in the very low energy band, where the model shows a moderate overestimation.
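    With g(ɛ) ∝ ɛ, the normalized distribution n(ɛ) ∝ ɛ exp(-βɛ) is a gamma density with shape 2 and rate β, for which the maximum-likelihood estimate of β from observed energies is simply 2 divided by the sample mean. A minimal numerical sketch of this observation (illustrative only, not the authors' code; the β value and sample size are made up):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Model from the abstract: n(eps) = A * g(eps) * exp(-beta * eps), with g(eps) ∝ eps.
    # Normalized, this is a Gamma(shape=2, rate=beta) density, whose maximum-likelihood
    # rate estimate from a sample is 2 / mean(eps).
    beta_true = 3.0
    energies = rng.gamma(shape=2.0, scale=1.0 / beta_true, size=200_000)

    beta_hat = 2.0 / energies.mean()
    print(f"estimated beta = {beta_hat:.3f} (true value {beta_true})")

    # Predicted relative counts over an energy grid, n(eps) ∝ eps * exp(-beta_hat * eps)
    eps = np.linspace(0.01, 3.0, 50)
    n_pred = eps * np.exp(-beta_hat * eps)
    n_pred /= n_pred.sum()
    ```

    Comparing such predicted bin counts against an observed aftershock catalogue is the kind of model-vs-data check the abstract describes.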

  17. The Extended Northern ROSAT Galaxy Cluster Survey (NORAS II). I. Survey Construction and First Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Böhringer, Hans; Chon, Gayoung; Trümper, Joachim

    As the largest, clearly defined building blocks of our universe, galaxy clusters are interesting astrophysical laboratories and important probes for cosmology. X-ray surveys for galaxy clusters provide one of the best ways to characterize the population of galaxy clusters. We provide a description of the construction of the NORAS II galaxy cluster survey based on X-ray data from the northern part of the ROSAT All-Sky Survey. NORAS II extends the NORAS survey down to a flux limit of 1.8 × 10⁻¹² erg s⁻¹ cm⁻² (0.1–2.4 keV), increasing the sample size by about a factor of two. The NORAS II cluster survey now reaches the same quality and depth as its counterpart, the southern REFLEX II survey, allowing us to combine the two complementary surveys. The paper provides information on the determination of the cluster X-ray parameters, the identification process of the X-ray sources, the statistics of the survey, and the construction of the survey selection function, which we provide in numerical format. Currently NORAS II contains 860 clusters with a median redshift of z = 0.102. We provide a number of statistical functions, including the log N–log S and the X-ray luminosity function, and compare these to the results from the complementary REFLEX II survey. Using the NORAS II sample to constrain the cosmological parameters σ₈ and Ωₘ yields results perfectly consistent with those of REFLEX II. Overall, the results show that the two hemisphere samples, NORAS II and REFLEX II, can be combined without problems into an all-sky sample, just excluding the zone of avoidance.

  18. Photovoltaic Properties of Two-Dimensional (CH3NH3)2Pb(SCN)2I2 Perovskite: A Combined Experimental and Density Functional Theory Study.

    PubMed

    Xiao, Zewen; Meng, Weiwei; Saparov, Bayrammurad; Duan, Hsin-Sheng; Wang, Changlei; Feng, Chunbao; Liao, Weiqiang; Ke, Weijun; Zhao, Dewei; Wang, Jianbo; Mitzi, David B; Yan, Yanfa

    2016-04-07

    We explore the photovoltaic-relevant properties of the 2D MA2Pb(SCN)2I2 (where MA = CH3NH3+) perovskite using a combination of materials synthesis, characterization, and density functional theory calculations, and determine electronic properties of MA2Pb(SCN)2I2 that are significantly different from those previously reported in the literature. The layered perovskite with mixed anions exhibits an indirect bandgap of ∼2.04 eV, with a slightly larger direct bandgap of ∼2.11 eV. The carriers (both electrons and holes) are also found to be confined within the 2D layers. Our results suggest that the 2D MA2Pb(SCN)2I2 perovskite may not be among the most promising absorbers for efficient single-junction solar cell applications; however, use as an absorber for the top cell of a tandem solar cell may still be a possibility if films are grown with the 2D layers aligned perpendicular to the substrates.

  19. Structural and superionic properties of Ag+-rich ternary phases within the AgI-MI2 systems

    NASA Astrophysics Data System (ADS)

    Hull, S.; Keen, D. A.; Berastegui, P.

    2002-12-01

    The effects of temperature on the crystal structure and ionic conductivity of the compounds Ag2CdI4, Ag2ZnI4 and Ag3SnI5 have been investigated by powder diffraction and impedance spectroscopy techniques. ε-Ag2CdI4 adopts a tetragonal crystal structure under ambient conditions, and abrupt increases in the ionic conductivity are observed at 407(2), 447(3) and 532(4) K, consistent with the sequence of transitions ε-Ag2CdI4 → β-Ag2CdI4 + β-AgI + CdI2 → α-AgI + CdI2 → α-Ag2CdI4. Hexagonal β-Ag2CdI4 is metastable at ambient temperature. The ambient-temperature β phase of Ag2ZnI4 is orthorhombic, and the structures of β-Ag2CdI4 and β-Ag2ZnI4 can, respectively, be considered as ordered derivatives of the wurtzite (β) and zincblende (γ) phases of AgI. On heating Ag2ZnI4, there is a 12-fold increase in ionic conductivity at 481(1) K and a further eightfold increase at 542(3) K. These changes result from decomposition of β-Ag2ZnI4 into α-AgI + ZnI2, followed by the appearance of superionic α-Ag2ZnI4 at the higher temperature. The hexagonal crystal structure of α-Ag2ZnI4 is a dynamically disordered counterpart of the β modification. Ag3SnI5 is only stable at temperatures in excess of 370(3) K and possesses a relatively high ionic conductivity (σ ≈ 0.19 Ω⁻¹ cm⁻¹ at 420 K) due to dynamic disorder of the Ag+ and Sn2+ within a cubic close-packed I⁻ sublattice. The implications of these findings for the wider issue of high ionic conductivity in AgI-MI2 compounds are discussed, with reference to recently published studies of Ag4PbI6 and Ag2HgI4 and new data for the temperature dependence of the ionic conductivity of the latter compound.

  20. [Evaluation of the results of clinical trials using a new non-statistical method].

    PubMed

    Zofková, I

    1994-04-04

    The author presents information on the possibilities and advantages associated with the application of a new non-statistical (gnostic) method for the evaluation of results in clinical trials. Among its other properties, the method is very robust, i.e., suited to the evaluation of small groups of highly scattered data, a situation frequently encountered in clinical research.

  1. Societal Statistics by virtue of the Statistical Drake Equation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2012-09-01

    number of Habitable Planets follows the lognormal distribution as well. But the Dole equation is described by the first four factors of the Drake equation. Thus, we may "divide" the 7-factor Drake equation by the 4-factor Dole equation, obtaining the probability distribution of the remaining three factors of the Drake equation, i.e. the probability distribution of the societal terms only. These we study in detail in this paper, achieving new statistical results about the societal aspects of SETI.
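    The "division" argument rests on a standard fact: a product (or ratio) of independent lognormal factors is itself lognormal, with the underlying normal means adding (or subtracting) and the variances adding. A minimal sketch of that fact (the factor parameters below are illustrative placeholders, not Maccone's fitted values):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 500_000

    # Seven illustrative lognormal Drake factors; the mu/sigma values are
    # invented for demonstration, not taken from the paper.
    mus = np.array([0.5, -0.2, 0.1, 0.3, -1.0, -0.5, 2.0])
    sigmas = np.array([0.3, 0.4, 0.2, 0.5, 0.6, 0.3, 0.7])

    factors = rng.lognormal(mean=mus[:, None], sigma=sigmas[:, None], size=(7, N))
    drake = factors.prod(axis=0)      # 7-factor Drake product
    dole = factors[:4].prod(axis=0)   # first four factors (the Dole equation)
    societal = drake / dole           # ratio = the last three ("societal") factors

    # ln(societal) should be Normal with mean = sum of the last three mus and
    # variance = sum of the last three sigma^2.
    logs = np.log(societal)
    print(logs.mean(), mus[4:].sum())
    print(logs.std(), np.sqrt((sigmas[4:] ** 2).sum()))
    ```

    The same algebra underlies treating the societal terms as an independent lognormal random variable in their own right.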

  2. GASP cloud- and particle-encounter statistics and their application to LFC aircraft studies. Volume 2: Appendixes

    NASA Technical Reports Server (NTRS)

    Jasperson, W. H.; Nastron, G. D.; Davis, R. E.; Holdeman, J. D.

    1984-01-01

    Summary studies are presented for the entire cloud observation archive from the NASA Global Atmospheric Sampling Program (GASP). Studies are also presented for GASP particle-concentration data gathered concurrently with the cloud observations. Clouds were encountered in about 15 percent of the data samples overall, but the probability of cloud encounter is shown to vary significantly with altitude, latitude, and distance from the tropopause. Several meteorological circulation features are apparent in the latitudinal distribution of cloud cover, and the cloud-encounter statistics are shown to be consistent with the classical mid-latitude cyclone model. Observations of clouds spaced more closely than 90 minutes are shown to be statistically dependent. The statistics for cloud and particle encounter are utilized to estimate the frequency of cloud encounter on long-range airline routes, and to assess the probability and extent of laminar flow loss due to cloud or particle encounter by aircraft utilizing laminar flow control (LFC). It is shown that the probability of extended cloud encounter is too low, of itself, to make LFC impractical. This report is presented in two volumes. Volume I contains the narrative, analysis, and conclusions. Volume II contains five supporting appendixes.

  3. Nature of Driving Force for Protein Folding: A Result From Analyzing the Statistical Potential

    NASA Astrophysics Data System (ADS)

    Li, Hao; Tang, Chao; Wingreen, Ned S.

    1997-07-01

    In a statistical approach to protein structure analysis, Miyazawa and Jernigan derived a 20×20 matrix of inter-residue contact energies between different types of amino acids. Using the method of eigenvalue decomposition, we find that the Miyazawa-Jernigan matrix can be accurately reconstructed from its first two principal component vectors as Mij = C0 + C1(qi + qj) + C2 qi qj, with constants C0, C1, C2 and 20 q values associated with the 20 amino acids. This regularity is due to hydrophobic interactions and a force of demixing, the latter obeying Hildebrand's solubility theory of simple liquids.
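    The form Mij = C0 + C1(qi + qj) + C2 qi qj means every column of M is a linear combination of the all-ones vector and q, so M has rank at most two and is recovered exactly from its two dominant eigenpairs. A sketch with synthetic constants (placeholders, not the fitted Miyazawa-Jernigan values, which are in the paper itself):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic stand-ins for the 20 amino-acid q values and the constants C.
    q = rng.normal(size=20)
    C0, C1, C2 = -2.0, 1.5, 3.0

    M = C0 + C1 * (q[:, None] + q[None, :]) + C2 * np.outer(q, q)

    # Every column lies in span{1, q}, so M has rank at most 2 ...
    print("rank:", np.linalg.matrix_rank(M))

    # ... and is reconstructed (to numerical precision) from the two
    # eigenpairs with the largest |eigenvalue|.
    w, v = np.linalg.eigh(M)
    idx = np.argsort(np.abs(w))[-2:]
    M_rec = (v[:, idx] * w[idx]) @ v[:, idx].T
    print("exact reconstruction:", np.allclose(M, M_rec))
    ```

    This is the sense in which the empirical 20×20 contact-energy matrix is "accurately reconstructed" from its first two principal components.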

  4. Student Achievement in Undergraduate Statistics: The Potential Value of Allowing Failure

    ERIC Educational Resources Information Center

    Ferrandino, Joseph A.

    2016-01-01

    This article details what resulted when I redesigned my undergraduate statistics course to allow failure as a learning strategy and focused on achievement rather than performance. A variety of within- and between-sample t-tests are utilized to determine the impact of unlimited test and quiz opportunities on student learning on both quizzes and…

  5. Clotrimazole and efaroxan inhibit red cell Gardos channel independently of imidazoline I1 and I2 binding sites.

    PubMed

    Coupry, I; Armsby, C C; Alper, S L; Brugnara, C; Parini, A

    1996-01-04

    In the present report, we investigated the potential involvement of imidazoline I1 and I2 binding sites in the inhibition of the Ca(2+)-activated K+ channel (Gardos channel) by clotrimazole in human red cells. Ca(2+)-activated 86Rb influx was inhibited by clotrimazole and efaroxan but not by the imidazoline binding site ligands clonidine, moxonidine, cirazoline and idazoxan (100 microM). Binding studies with [3H]idazoxan and [3H]p-aminoclonidine did not reveal the expression of I1 and I2 binding sites in erythrocytes. These data indicate that the effects of clotrimazole and efaroxan on the erythrocyte Ca(2+)-activated K+ channel may be mediated by a 'non-I1/non-I2' binding site.

  6. Use of iPhone technology in improving acetabular component position in total hip arthroplasty.

    PubMed

    Tay, Xiau Wei; Zhang, Benny Xu; Gayagay, George

    2017-09-01

    Improper acetabular cup positioning is associated with a high risk of complications after total hip arthroplasty. The aim of our study is to objectively compare 3 methods, namely (1) free hand, (2) alignment jig (Sputnik), and (3) iPhone application, to identify an easy, reproducible, and accurate method of improving acetabular cup placement. We designed a simple setup and carried out an experiment (see Method section). Using statistical analysis, the difference in inclination angles using the iPhone application compared with the freehand method was found to be statistically significant (F[2,51] = 4.17, P = .02) in the "untrained" group. No statistically significant difference was detected for the other groups. This suggests a potential role for iPhone applications in helping junior surgeons overcome the steep learning curve.

  7. [131I therapy in hyperthyroidism. Results of treatment from 1960-1974].

    PubMed

    Heinze, H G; Schenk, F

    1977-02-01

    488 patients with Graves' disease were treated with 131I between 1960 and 1974. 427 (87.5%) of these patients were re-examined several times (clinically, 131I uptake, PB127I, T4 (CPB-A), T3 uptake, and, since 1973, TRH test). The 131I was given as an individually calculated single-dose treatment, using 7,000-10,000 rd before 1965 and 6,000 rd thereafter. Two thirds of the patients became euthyroid after a single 131I dose. In 20% the treatment had to be repeated; these patients evidently show a different biological behaviour of their disease, since multiple treatments revealed a higher rate of failure (33-35%). There is no principal difference between the outcome after 131I therapy and after surgery concerning the rates of failure and relapse (3-4%) and of hypothyroidism. The early incidence of hypothyroidism is dose-dependent, as could be shown in patients treated with higher doses before 1965. The reduction of the irradiation dose to 6,000 rd was followed by a drop in hypothyroidism from 18% to 7%. The reasons for the late incidence of hypothyroidism are discussed. The incidence of hypothyroidism was calculated by three different methods (overall incidence, incidence within the observed interval after therapy, life-table method). All three methods gave different results; this has to be taken into account when comparing results after radioiodine as well as after surgery. Radioiodine therapy for hyperthyroidism offers a true alternative to surgery.

  8. Multi-reader ROC studies with split-plot designs: a comparison of statistical methods.

    PubMed

    Obuchowski, Nancy A; Gallas, Brandon D; Hillis, Stephen L

    2012-12-01

    Multireader imaging trials often use a factorial design, in which study patients undergo testing with all imaging modalities and readers interpret the results of all tests for all patients. A drawback of this design is the large number of interpretations required of each reader. Split-plot designs have been proposed as an alternative, in which one or a subset of readers interprets all images of a sample of patients, while other readers interpret the images of other samples of patients. In this paper, the authors compare three methods of analysis for the split-plot design. Three statistical methods are presented: the Obuchowski-Rockette method modified for the split-plot design, a newly proposed marginal-mean analysis-of-variance approach, and an extension of the three-sample U-statistic method. A simulation study using the Roe-Metz model was performed to compare the type I error rate, power, and confidence interval coverage of the three test statistics. The type I error rates for all three methods are close to the nominal level but tend to be slightly conservative. The statistical power is nearly identical for the three methods. The coverage of 95% confidence intervals falls close to the nominal coverage for small and large sample sizes. The split-plot multireader, multicase study design can be statistically efficient compared to the factorial design, reducing the number of interpretations required per reader. Three methods of analysis, shown to have nominal type I error rates, similar power, and nominal confidence interval coverage, are available for this study design. Copyright © 2012 AUR. All rights reserved.

  9. Statistics

    Cancer.gov

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  10. Genomic similarity and kernel methods I: advancements by building on mathematical and statistical foundations.

    PubMed

    Schaid, Daniel J

    2010-01-01

    Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1]. Copyright © 2010 S. Karger AG, Basel.
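    The positive-semidefiniteness requirement described above can be checked numerically: a valid kernel function, applied to all pairs of subjects, must yield a Gram matrix with no meaningfully negative eigenvalues. A minimal sketch using a Gaussian (RBF) kernel on random stand-in data (illustrative only; the paper's genomic kernels are not reproduced here):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Illustrative data: 30 subjects x 50 markers (random, for demonstration only).
    X = rng.normal(size=(30, 50))

    def rbf_kernel(X, gamma=0.02):
        """Gaussian kernel: K[i, j] = exp(-gamma * ||x_i - x_j||^2).
        Larger values mean the pair is more similar."""
        sq = np.sum(X**2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
        return np.exp(-gamma * np.clip(d2, 0.0, None))

    K = rbf_kernel(X)

    # A valid kernel must produce a positive semidefinite Gram matrix:
    eigvals = np.linalg.eigvalsh(K)
    print("min eigenvalue:", eigvals.min())  # >= 0 up to floating-point error
    ```

    Any candidate similarity measure failing this check (an eigenvalue substantially below zero) is not a kernel and cannot be plugged into the mixed-model and support-vector machinery the review surveys.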

  11. A statistical study of EMIC waves observed by Cluster: 2. Associated plasma conditions

    DOE PAGES

    Allen, R. C.; Zhang, J. -C.; Kistler, L. M.; ...

    2016-07-01

    This is the second in a pair of papers discussing a statistical study of electromagnetic ion cyclotron (EMIC) waves detected during 10 years (2001–2010) of Cluster observations. In the first paper, an analysis of EMIC wave properties (i.e., wave power, polarization, normal angle, and wave propagation angle) is presented in both the magnetic latitude (MLAT)-distance as well as magnetic local time (MLT)-L frames. In addition, this paper focuses on the distribution of EMIC wave-associated plasma conditions as well as two EMIC wave generation proxies (the electron plasma frequency to gyrofrequency ratio proxy and the linear theory proxy) in these same frames. Based on the distributions of hot H+ anisotropy, electron and hot H+ density measurements, hot H+ parallel plasma beta, and the calculated wave generation proxies, three source regions of EMIC waves appear to exist: (1) the well-known overlap between cold plasmaspheric or plume populations with hot anisotropic ring current populations in the postnoon to dusk MLT region; (2) regions all along the dayside magnetosphere at high L shells related to dayside magnetospheric compression and drift shell splitting; and (3) off-equator regions possibly associated with the Shabansky orbits in the dayside magnetosphere.

  12. A statistical study of EMIC waves observed by Cluster: 2. Associated plasma conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, R. C.; Zhang, J. -C.; Kistler, L. M.

    This is the second in a pair of papers discussing a statistical study of electromagnetic ion cyclotron (EMIC) waves detected during 10 years (2001–2010) of Cluster observations. In the first paper, an analysis of EMIC wave properties (i.e., wave power, polarization, normal angle, and wave propagation angle) is presented in both the magnetic latitude (MLAT)-distance as well as magnetic local time (MLT)-L frames. In addition, this paper focuses on the distribution of EMIC wave-associated plasma conditions as well as two EMIC wave generation proxies (the electron plasma frequency to gyrofrequency ratio proxy and the linear theory proxy) in these same frames. Based on the distributions of hot H+ anisotropy, electron and hot H+ density measurements, hot H+ parallel plasma beta, and the calculated wave generation proxies, three source regions of EMIC waves appear to exist: (1) the well-known overlap between cold plasmaspheric or plume populations with hot anisotropic ring current populations in the postnoon to dusk MLT region; (2) regions all along the dayside magnetosphere at high L shells related to dayside magnetospheric compression and drift shell splitting; and (3) off-equator regions possibly associated with the Shabansky orbits in the dayside magnetosphere.

  13. Statistical analysis of arthroplasty data

    PubMed Central

    2011-01-01

    It is envisaged that guidelines for statistical analysis and presentation of results will improve the quality and value of research. The Nordic Arthroplasty Register Association (NARA) has therefore developed guidelines for the statistical analysis of arthroplasty register data. The guidelines are divided into two parts: one with an introduction and a discussion of the background to the guidelines (Ranstam et al. 2011a, see pages x-y in this issue), and this one with a more technical statistical discussion of how specific problems can be handled. This second part contains (1) recommendations for the interpretation of methods used to calculate survival, (2) recommendations on how to deal with bilateral observations, and (3) a discussion of problems and pitfalls associated with analysis of factors that influence survival or comparisons between outcomes extracted from different hospitals. PMID:21619500

  14. Statistical Characterization of the Chandra Source Catalog

    NASA Astrophysics Data System (ADS)

    Primini, Francis A.; Houck, John C.; Davis, John E.; Nowak, Michael A.; Evans, Ian N.; Glotfelty, Kenny J.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G.; Grier, John D.; Hain, Roger M.; Hall, Diane M.; Harbo, Peter N.; He, Xiangqun Helen; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Plummer, David A.; Refsdal, Brian L.; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael S.; Van Stone, David W.; Winkelman, Sherry L.; Zografou, Panagoula

    2011-06-01

    The first release of the Chandra Source Catalog (CSC) contains ~95,000 X-ray sources in a total area of 0.75% of the entire sky, using data from ~3900 separate ACIS observations of a multitude of different types of X-ray sources. In order to maximize the scientific benefit of such a large, heterogeneous data set, careful characterization of the statistical properties of the catalog, i.e., completeness, sensitivity, false source rate, and accuracy of source properties, is required. Characterization efforts of other large Chandra catalogs, such as the ChaMP Point Source Catalog or the 2 Mega-second Deep Field Surveys, while informative, cannot serve this purpose, since the CSC analysis procedures are significantly different and the range of allowable data is much less restrictive. We describe here the characterization process for the CSC. This process includes both a comparison of real CSC results with those of other, deeper Chandra catalogs of the same targets and extensive simulations of blank-sky and point-source populations.

  15. 123/125I-labelled 2-iodo-L-phenylalanine and 2-iodo-D-phenylalanine: comparative uptake in various tumour types and biodistribution in mice.

    PubMed

    Kersemans, Veerle; Cornelissen, Bart; Kersemans, Ken; Bauwens, Matthias; Dierckx, Rudi A; De Spiegeleer, Bart; Mertens, John; Slegers, Guido

    2006-08-01

    In vitro in the R1M cell model and in vivo in the R1M tumour-bearing athymic model, both [(123)I]-2-iodo-L-phenylalanine and [(123)I]-2-iodo-D-phenylalanine have shown promising results as tumour diagnostic agents for SPECT. In order to compare these two amino acid analogues and to examine whether the observed characteristics could be generalised, both isomers were evaluated in various tumour models. Transport type characterisation in vitro in A549, A2058, C6, C32, Capan2, EF43fgf4, HT29 and R1M cells with [(123)I]-2-iodo-L-phenylalanine was performed using the method described by Shotwell et al. Subsequently, [(123)I]-2-iodo-L-phenylalanine and [(123)I]-2-iodo-D-phenylalanine tumour uptake and biodistribution were evaluated using dynamic planar imaging and/or dissection in A549, A2058, C6, C32, Capan2, EF43fgf4, HT29 and R1M inoculated athymic mice. Two-compartment blood modelling of the imaging results was performed. In vitro testing demonstrated that [(123)I]-2-iodo-L-phenylalanine was transported in all tumour cell lines by LAT1. In all tumour models, the two amino acid analogues showed the same general biodistribution characteristics: high and specific tumour uptake and renal tracer clearance. Two-compartment modelling revealed that the D-isomer showed a faster blood clearance together with a faster distribution to the peripheral compartment in comparison with [(123)I]-2-iodo-L-phenylalanine. [(123)I]-2-iodo-L-phenylalanine and its D-isomer are promising tumour diagnostic agents for dynamic planar imaging. They showed a high and similar uptake in all tested tumours. [(123)I]-2-iodo-D-phenylalanine showed better tracer characteristics concerning radiation dose to other organs.

  16. An evaluation of the quality of statistical design and analysis of published medical research: results from a systematic survey of general orthopaedic journals.

    PubMed

    Parsons, Nick R; Price, Charlotte L; Hiskens, Richard; Achten, Juul; Costa, Matthew L

    2012-04-25

    The application of statistics in research reported in trauma and orthopaedic surgery has become ever more important and complex. Despite the extensive use of statistical analysis, statistics remains a subject that is often not conceptually well understood, resulting in clear methodological flaws and inadequate reporting in many papers. A detailed statistical survey sampled 100 representative orthopaedic papers using a validated questionnaire that assessed the quality of the trial design and statistical analysis methods. The survey found evidence of failings in study design, statistical methodology and presentation of the results. Overall, in 17% (95% confidence interval: 10-26%) of the studies investigated the conclusions were not clearly justified by the results; in 39% (30-49%) of studies a different analysis should have been undertaken; and in 17% (10-26%) a different analysis could have made a difference to the overall conclusions. It is only through an improved dialogue between statistician, clinician, reviewer and journal editor that the failings in design methodology and analysis highlighted by this survey can be addressed.
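    The bracketed ranges are binomial confidence intervals for proportions observed in the sample of 100 papers; for example, 17/100 yields roughly 10-26%. A sketch using the Wilson score interval (the survey's exact interval method is not stated in the abstract, so this is illustrative):

    ```python
    import math

    def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
        """95% Wilson score confidence interval for a binomial proportion."""
        p = successes / n
        denom = 1.0 + z**2 / n
        center = (p + z**2 / (2 * n)) / denom
        half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
        return center - half, center + half

    lo, hi = wilson_ci(17, 100)
    print(f"17/100 -> 95% CI: {100*lo:.1f}%-{100*hi:.1f}%")  # close to the reported 10-26%
    ```

    The Wilson interval is asymmetric around the point estimate, which is why the reported intervals (e.g. 10-26% around 17%) are not centred on the observed proportion.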

  17. 40 CFR Appendix I to Subpart T of... - Sample Graphical Summary of NTE Emission Results

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    Appendix I to Subpart T of Part 86 (Protection of Environment)—Sample Graphical Summary of NTE Emission Results. The following figure shows an example of a graphical summary of NTE emission results: ER14JN05.002

  18. Diverse Reactions of Thiophenes, Selenophenes, and Tellurophenes with Strongly Oxidizing I(III) PhI(L)2 Reagents.

    PubMed

    Egalahewa, Sathsara; Albayer, Mohammad; Aprile, Antonino; Dutton, Jason L

    2017-02-06

    We report the outcomes of the reactions of aromatic group 16 thiophene, selenophene, and tellurophene rings with the I(III) oxidants PhI(OAc)(OTf) and [PhI(Pyr)2][OTf]2 (Pyr = pyridine). In all reactions, oxidative processes take place, with generation of PhI as the reduction product. However, with the exception of tellurophene with PhI(OAc)(OTf), +4 oxidation state complexes are not observed; rather, a variety of other processes occur. In general, where a C-H unit is available on the 5-membered ring, an electrophilic aromatic substitution reaction of either -IPh or pyridine onto the ring occurs. When all positions are blocked, reactions with PhI(OAc)(OTf) give acetic and triflic anhydride as the identifiable oxidative byproducts, while [PhI(Pyr)2][OTf]2 gives pyridine electrophilic aromatic substitution onto the peripheral rings. Qualitative mechanistic studies indicate that the presence of the oxidizable heteroatom is required for pyridine to act as an electrophile in a substantial manner.

  19. Paleontology and Darwin's Theory of Evolution: The Subversive Role of Statistics at the End of the 19th Century.

    PubMed

    Tamborini, Marco

    2015-11-01

    This paper examines the subversive role of statistics in paleontology at the end of the 19th and the beginning of the 20th centuries. In particular, I focus on German paleontology and its relationship with statistics. I argue that in paleontology the quantitative method was questioned and strongly limited by the first decade of the 20th century because, as its opponents noted, when the fossil record was treated statistically, it generated results openly in conflict with the Darwinian theory of evolution. Essentially, statistics questioned the gradual mode of evolution and the role of natural selection. The main objections to statistics were addressed during the meetings at the Kaiserlich-Königliche Geologische Reichsanstalt in Vienna in the 1880s. After introducing the statistical treatment of the fossil record, I use the works of Charles Léo Lesquereux (1806-1889), Joachim Barrande (1799-1883), and Henry Shaler Williams (1847-1918) to compare the objections raised in Vienna with how the statistical treatment of the data worked in practice. Furthermore, I discuss the criticisms of Melchior Neumayr (1845-1890), one of the leading German opponents of statistical paleontology, to show why, and to what extent, statistics were questioned in Vienna. The final part of this paper considers what paleontologists can derive from a statistical notion of data: the necessity of opening a discussion about the completeness and nature of the paleontological data. The Vienna discussion about which method paleontologists should follow offers an interesting case study for understanding the epistemic tensions within paleontology surrounding Darwin's theory, as well as the variety of non-Darwinian alternatives that emerged from the statistical treatment of the fossil record at the end of the 19th century.

  20. Envelope statistics of self-motion signals experienced by human subjects during everyday activities: Implications for vestibular processing.

    PubMed

    Carriot, Jérome; Jamali, Mohsen; Cullen, Kathleen E; Chacron, Maurice J

    2017-01-01

    There is accumulating evidence that the brain's neural coding strategies are constrained by natural stimulus statistics. Here we investigated the statistics of the time-varying envelope (i.e., a second-order stimulus attribute that is related to variance) of rotational and translational self-motion signals experienced by human subjects during everyday activities. We found that envelopes can reach large values across all six motion dimensions (~450 deg/s for rotations and ~4 G for translations). Unlike results obtained in other sensory modalities, the spectral power of envelope signals decreased slowly for low (<2 Hz) and more sharply for high (>2 Hz) temporal frequencies and thus was not well fit by a power law. We next compared the spectral properties of envelope signals resulting from active and passive self-motion, as well as those resulting from signals obtained when the subject is absent (i.e., external stimuli). Our data suggest that different mechanisms underlie the deviation from scale invariance in rotational and translational self-motion envelopes. Specifically, active self-motion and filtering by the human body cause deviation from scale invariance primarily for translational and rotational envelope signals, respectively. Finally, we used well-established models to predict the responses of peripheral vestibular afferents to natural envelope stimuli. We found that irregular afferents responded more strongly to envelopes than their regular counterparts. Our findings have important consequences for understanding the coding strategies used by the vestibular system to process natural second-order self-motion signals.