Sample records for the search term "statistical analysis proves"

  1. The Shock and Vibration Digest, Volume 14, Number 2, February 1982

    DTIC Science & Technology

    1982-02-01

    figurations. 75 4J DUCTS 82-424 (Also see No. 346) Coupling Loss Factors for Statistical Energy Analysis of Sound Transmission at Rectangular...waves, Sound waves, Wave propagation...structures by means of statistical energy analysis (SEA); coupling loss factors for the structure-borne sound...multilayered panels are discussed. Statistical energy analysis (SEA) has proved to be a promising...Experimental results of stiffened panels, damping tape

  2. Testing of Hypothesis in Equivalence and Non Inferiority Trials-A Concept.

    PubMed

    Juneja, Atul; Aggarwal, Abha R; Adhikari, Tulsi; Pandey, Arvind

    2016-04-01

    Establishing the appropriate hypothesis is one of the important steps in carrying out statistical tests/analysis. Its understanding is important for interpreting the results of statistical analysis. The current communication attempts to explain the concept of testing of hypothesis in non inferiority and equivalence trials, where the null hypothesis is just the reverse of what is set up for conventional superiority trials. It is similarly tested for rejection to establish the fact that the researcher is intending to prove. It is important to mention that equivalence or non inferiority cannot be proved by accepting the null hypothesis of no difference. Hence, establishing the appropriate statistical hypothesis is extremely important to arrive at a meaningful conclusion for the set objectives in research.
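The reversed hypothesis setup described in this abstract can be made concrete with a small numerical sketch. Everything below (function name, margin, standard error, example numbers) is illustrative and assumes a normally distributed treatment difference with known standard error; it is not taken from the paper.

```python
import math

def noninferiority_z(mean_new, mean_ref, se_diff, margin):
    """One-sided z-test for non-inferiority.

    H0: mean_new - mean_ref <= -margin  (new treatment is inferior)
    H1: mean_new - mean_ref >  -margin  (new treatment is non-inferior)
    Rejecting H0 establishes non-inferiority; merely failing to reject
    a conventional null of 'no difference' never does.
    """
    z = (mean_new - mean_ref + margin) / se_diff
    # one-sided p-value from the standard normal survival function
    p = 0.5 * math.erfc(z / math.sqrt(2))
    return z, p

# made-up numbers: slightly worse point estimate, margin of 1.0
z, p = noninferiority_z(mean_new=9.8, mean_ref=10.0, se_diff=0.3, margin=1.0)
print(round(z, 3), round(p, 4))
```

Note that the test statistic is shifted by the margin: the new treatment can have a slightly worse point estimate and still be declared non-inferior, which is exactly why the null must be set up as "inferior by at least the margin".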

  3. On the Utility of Content Analysis in Author Attribution: "The Federalist."

    ERIC Educational Resources Information Center

    Martindale, Colin; McKenzie, Dean

    1995-01-01

    Compares the success of lexical statistics, content analysis, and function words in determining the true author of "The Federalist." The function word approach proved most successful in attributing the papers to James Madison. Lexical statistics contributed nothing, while content analytic measures resulted in some success. (MJP)

  4. The Effects of Measurement Error on Statistical Models for Analyzing Change. Final Report.

    ERIC Educational Resources Information Center

    Dunivant, Noel

    The results of six major projects are discussed including a comprehensive mathematical and statistical analysis of the problems caused by errors of measurement in linear models for assessing change. In a general matrix representation of the problem, several new analytic results are proved concerning the parameters which affect bias in…

  5. The Importance of Proving the Null

    ERIC Educational Resources Information Center

    Gallistel, C. R.

    2009-01-01

    Null hypotheses are simple, precise, and theoretically important. Conventional statistical analysis cannot support them; Bayesian analysis can. The challenge in a Bayesian analysis is to formulate a suitably vague alternative, because the vaguer the alternative is (the more it spreads out the unit mass of prior probability), the more the null is…
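Gallistel's point, that conventional significance testing cannot support a null while a Bayesian analysis can, is often illustrated with a toy Bayes factor for a point null against a maximally vague alternative. The sketch below uses standard textbook assumptions (binomial data, uniform prior under the alternative); it is not the paper's own analysis.

```python
from math import comb

def bf01_binomial(k, n):
    """Bayes factor in favor of the point null H0: p = 0.5
    against a maximally vague alternative H1: p ~ Uniform(0, 1).

    BF01 = P(data | H0) / P(data | H1)
         = C(n,k) * 0.5**n / (1 / (n + 1))
    Values > 1 support the null -- something a conventional
    significance test can never do. The vaguer the alternative,
    the more the observed data favor a well-fitting null.
    """
    return comb(n, k) * 0.5 ** n * (n + 1)

# 52 heads in 100 flips: positive evidence *for* a fair coin
print(round(bf01_binomial(52, 100), 2))
```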

  6. Evaluation of adding item-response theory analysis for evaluation of the European Board of Ophthalmology Diploma examination.

    PubMed

    Mathysen, Danny G P; Aclimandos, Wagih; Roelant, Ella; Wouters, Kristien; Creuzot-Garcher, Catherine; Ringens, Peter J; Hawlina, Marko; Tassignon, Marie-José

    2013-11-01

    To investigate whether introduction of item-response theory (IRT) analysis, in parallel to the 'traditional' statistical analysis methods available for performance evaluation of multiple T/F items as used in the European Board of Ophthalmology Diploma (EBOD) examination, has proved beneficial, and secondly, to study whether the overall assessment performance of the current written part of EBOD is sufficiently high (KR-20 ≥ 0.90) to be kept as examination format in future EBOD editions. 'Traditional' analysis methods for individual MCQ item performance comprise P-statistics, Rit-statistics and item discrimination, while overall reliability is evaluated through KR-20 for multiple T/F items. The additional set of statistical analysis methods for the evaluation of EBOD comprises mainly IRT analysis. These analysis techniques are used to monitor whether the introduction of negative marking for incorrect answers (since EBOD 2010) has a positive influence on the statistical performance of EBOD as a whole and its individual test items in particular. Item-response theory analysis demonstrated that item performance parameters should not be evaluated individually, but should be related to one another. Before the introduction of negative marking, the overall EBOD reliability (KR-20) was good though with room for improvement (EBOD 2008: 0.81; EBOD 2009: 0.78). After the introduction of negative marking, the overall reliability of EBOD improved significantly (EBOD 2010: 0.92; EBOD 2011: 0.91; EBOD 2012: 0.91). Although many statistical performance parameters are available to evaluate individual items, our study demonstrates that the overall reliability assessment remains the only crucial parameter to be evaluated allowing comparison. While individual item performance analysis is worthwhile to undertake as secondary analysis, drawing final conclusions seems to be more difficult. Performance parameters need to be related, as shown by IRT analysis. Therefore, IRT analysis has proved beneficial for the statistical analysis of EBOD. Introduction of negative marking has led to a significant increase in reliability (KR-20 > 0.90), indicating that the current examination format can be kept for future EBOD examinations. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
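The KR-20 reliability coefficient that this record treats as the crucial overall parameter is straightforward to compute from a 0/1 response matrix. A minimal sketch (using the population-variance form of the total-score variance; conventions vary), with a tiny made-up response matrix:

```python
def kr20(items):
    """Kuder-Richardson formula 20 for dichotomous (True/False) items.

    items: list of examinee response rows, each a list of 0/1 scores.
    KR-20 = k/(k-1) * (1 - sum(p_j * q_j) / var(total scores))
    where p_j is the proportion answering item j correctly, q_j = 1 - p_j.
    """
    n = len(items)          # number of examinees
    k = len(items[0])       # number of items
    totals = [sum(row) for row in items]
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n  # population variance
    pq = 0.0
    for j in range(k):
        p = sum(row[j] for row in items) / n
        pq += p * (1 - p)
    return k / (k - 1) * (1 - pq / var_t)

# toy data: 5 examinees, 4 items (illustrative, not EBOD data)
rows = [[1, 1, 1, 1], [1, 1, 1, 0], [1, 1, 0, 0], [1, 0, 0, 0], [0, 0, 0, 0]]
print(round(kr20(rows), 3))  # -> 0.8
```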

  7. A Survey of Logic Formalisms to Support Mishap Analysis

    NASA Technical Reports Server (NTRS)

    Johnson, Chris; Holloway, C. M.

    2003-01-01

    Mishap investigations provide important information about adverse events and near miss incidents. They are intended to help avoid any recurrence of previous failures. Over time, they can also yield statistical information about incident frequencies that helps to detect patterns of failure and can validate risk assessments. However, the increasing complexity of many safety critical systems is posing new challenges for mishap analysis. Similarly, the recognition that many failures have complex, systemic causes has helped to widen the scope of many mishap investigations. These two factors have combined to pose new challenges for the analysis of adverse events. A new generation of formal and semi-formal techniques has been proposed to help investigators address these problems. We introduce the term mishap logics to collectively describe these notations that might be applied to support the analysis of mishaps. The proponents of these notations have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. These proofs can be used to reduce the bias that is often perceived to affect the interpretation of adverse events. Others have argued that one cannot use logic formalisms to prove causes in the same way that one might prove propositions or theorems. Such mechanisms cannot accurately capture the wealth of inductive, deductive and statistical forms of inference that investigators must use in their analysis of adverse events. This paper provides an overview of these mishap logics. It also identifies several additional classes of logic that might also be used to support mishap analysis.

  8. A new feedback image encryption scheme based on perturbation with dynamical compound chaotic sequence cipher generator

    NASA Astrophysics Data System (ADS)

    Tong, Xiaojun; Cui, Minggen; Wang, Zhu

    2009-07-01

    The design of a new compound two-dimensional chaotic function is presented, built from two one-dimensional chaotic functions that switch randomly; the design is used as a chaotic sequence generator whose chaotic behavior is proved by Devaney's definition of chaos. The properties of the compound chaotic functions are also proved rigorously. In order to improve robustness against differential cryptanalysis and produce an avalanche effect, a new feedback image encryption scheme is proposed using the new compound chaos, selecting one of the two one-dimensional chaotic functions randomly, and a new method of pixel permutation and substitution is designed in detail, with array rows and columns controlled randomly based on the compound chaos. The results of entropy analysis, difference analysis, statistical analysis, sequence randomness analysis, and cipher sensitivity analysis with respect to key and plaintext have proven that the compound chaotic sequence cipher can resist cryptanalytic, statistical and brute-force attacks; moreover, it accelerates encryption speed and achieves a higher level of security. By means of the dynamical compound chaos and perturbation technology, the paper addresses the problem of the low computational precision of one-dimensional chaotic functions.
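As a much-simplified illustration of a chaotic sequence cipher of this general kind, a single one-dimensional logistic map can drive an XOR keystream. This is a toy sketch only: one map rather than the paper's randomly switching compound 2-D scheme, no perturbation technology, and no claim of cryptographic security.

```python
def logistic_keystream(x0, r, n):
    """Keystream bytes from a logistic map x -> r*x*(1-x).

    Illustrative only: a real design (like the paper's) combines
    maps, perturbs the orbit, and permutes pixels as well.
    """
    x = x0
    out = []
    for _ in range(n):
        x = r * x * (1 - x)          # iterate the chaotic map
        out.append(int(x * 256) % 256)  # quantize the orbit to a byte
    return bytes(out)

def xor_cipher(data, key):
    """XOR the data with the keystream; applying it twice decrypts."""
    return bytes(a ^ b for a, b in zip(data, key))

msg = b"pixels"                      # stand-in for image pixel data
ks = logistic_keystream(0.3141592, 3.99, len(msg))
enc = xor_cipher(msg, ks)
assert xor_cipher(enc, ks) == msg    # decryption restores the plaintext
```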

  9. Random-Effects Meta-Analysis of Time-to-Event Data Using the Expectation-Maximisation Algorithm and Shrinkage Estimators

    ERIC Educational Resources Information Center

    Simmonds, Mark C.; Higgins, Julian P. T.; Stewart, Lesley A.

    2013-01-01

    Meta-analysis of time-to-event data has proved difficult in the past because consistent summary statistics often cannot be extracted from published results. The use of individual patient data allows for the re-analysis of each study in a consistent fashion and thus makes meta-analysis of time-to-event data feasible. Time-to-event data can be…

  10. [Statistical analysis of body and lung mass of animals subjected to a single experimental insufflation of soil dust and electro-energetic ashes].

    PubMed

    Matysiak, W; Królikowska-Prasał, I; Staszyc, J; Kifer, E; Romanowska-Sarlej, J

    1989-01-01

    The studies were performed on 44 white female Wistar rats which were intratracheally administered a suspension of soil dust and electro-energetic ashes. The electro-energetic ashes were collected from 6 different local heat and power generating plants, while the soil dust came from several random places in our country. A statistical analysis of the body and lung mass of the animals subjected to the single dust and ash insufflation was performed. The applied analysis variants proved statistically significant differences between the body and the lung mass. The observed differences are connected with the kinds of dust and ash used in the experiment.

  11. Network Motif Basis of Threshold Responses

    EPA Science Inventory

    There has been a long-running debate over the existence of thresholds for adverse effects. The difficulty stems from two fundamental challenges: (i) statistical analysis by itself cannot prove the existence of a threshold, i.e., a dose below which there is no effect; and (ii) the...

  12. General solution of the chemical master equation and modality of marginal distributions for hierarchic first-order reaction networks.

    PubMed

    Reis, Matthias; Kromer, Justus A; Klipp, Edda

    2018-01-20

    Multimodality is a phenomenon which complicates the analysis of statistical data based exclusively on mean and variance. Here, we present criteria for multimodality in hierarchic first-order reaction networks, consisting of catalytic and splitting reactions. Those networks are characterized by independent and dependent subnetworks. First, we prove the general solvability of the Chemical Master Equation (CME) for this type of reaction network and thereby extend the class of solvable CMEs. Our general solution is analytical in the sense that it allows for a detailed analysis of its statistical properties. Given Poisson/deterministic initial conditions, we then prove the independent species to be Poisson/binomially distributed, while the dependent species exhibit generalized Poisson/Khatri Type B distributions. Generalized Poisson/Khatri Type B distributions are multimodal for an appropriate choice of parameters. We illustrate our criteria for multimodality by several basic models, as well as the well-known two-stage transcription-translation network and Bateman's model from nuclear physics. For both examples, multimodality was previously not reported.

  13. Research on the raw data processing method of the hydropower construction project

    NASA Astrophysics Data System (ADS)

    Tian, Zhichao

    2018-01-01

    In this paper, based on the characteristics of quota data, the various mathematical-statistical analysis methods are compared and the improved Grubbs criterion is chosen to analyze the data; through this processing of the data, values for which the data processing method is not suitable are screened out. It is proved that this method can be applied to the processing of raw quota data. This paper provides a reference for reasonably determining effective quota analysis data.

  14. ASCS online fault detection and isolation based on an improved MPCA

    NASA Astrophysics Data System (ADS)

    Peng, Jianxin; Liu, Haiou; Hu, Yuhui; Xi, Junqiang; Chen, Huiyan

    2014-09-01

    Multi-way principal component analysis (MPCA) has received considerable attention and been widely used in process monitoring. A traditional MPCA algorithm unfolds multiple batches of historical data into a two-dimensional matrix and cuts the matrix along the time axis to form subspaces. However, low efficiency of subspaces and difficult fault isolation are the common disadvantages of the principal component model. This paper presents a new subspace construction method based on a kernel density estimation function that can effectively reduce the storage amount of the subspace information. The MPCA model and the knowledge base are built based on the new subspace. Then, fault detection and isolation with the squared prediction error (SPE) statistic and the Hotelling (T2) statistic are also realized in process monitoring. When a fault occurs, fault isolation based on the SPE statistic is achieved by residual contribution analysis of different variables. For fault isolation of subspaces based on the T2 statistic, the relationship between the statistic indicator and state variables is constructed, and the constraint conditions are presented to check the validity of fault isolation. Then, to improve the robustness of fault isolation to unexpected disturbances, a statistical method is adopted to set the relation between a single subspace and multiple subspaces to increase the correct rate of fault isolation. Finally, fault detection and isolation based on the improved MPCA is used to monitor the automatic shift control system (ASCS) to prove the correctness and effectiveness of the algorithm. The research proposes a new subspace construction method to reduce the required storage capacity and to improve the robustness of the principal component model, and sets the relationship between the state variables and fault detection indicators for fault isolation.
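The two monitoring statistics named in this record can be sketched with an ordinary (non-multi-way) PCA model. This toy version, on illustrative random data, shows only the mechanics of Hotelling's T2 and the SPE; it includes none of the paper's subspace construction, kernel density estimation, or fault-isolation logic.

```python
import numpy as np

def pca_monitor(train, x, n_components=2):
    """Minimal PCA-based process-monitoring sketch (not the paper's MPCA).

    Fits a principal-component model on normal-operation data `train`
    (rows = samples), then scores a new sample `x` with the two usual
    statistics: Hotelling's T2 in the retained subspace and the squared
    prediction error (SPE) in the residual subspace.
    """
    mu = train.mean(axis=0)
    Xc = train - mu
    # principal directions from the SVD of the centred data
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components].T                           # loadings
    lam = (s[:n_components] ** 2) / (len(train) - 1)  # component variances
    t = (x - mu) @ P                                  # scores of the new sample
    t2 = float(np.sum(t ** 2 / lam))                  # Hotelling's T2
    resid = (x - mu) - P @ t                          # residual part
    spe = float(resid @ resid)                        # squared prediction error
    return t2, spe

rng = np.random.default_rng(0)
train = rng.normal(size=(200, 4))                       # stand-in training batches
t2_ok, spe_ok = pca_monitor(train, train.mean(axis=0))  # in-control sample
t2_bad, spe_bad = pca_monitor(train, train.mean(axis=0) + 5.0)  # shifted sample
```

In practice both statistics are compared against control limits estimated from the training data; here the point is only that an in-control sample scores near zero while a shifted sample does not.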

  15. UNITY: Confronting Supernova Cosmology's Statistical and Systematic Uncertainties in a Unified Bayesian Framework

    NASA Astrophysics Data System (ADS)

    Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The

    2015-11-01

    While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.

  16. Autocorrelation and cross-correlation in time series of homicide and attempted homicide

    NASA Astrophysics Data System (ADS)

    Machado Filho, A.; da Silva, M. F.; Zebende, G. F.

    2014-04-01

    We propose in this paper to establish the relationship between homicides and attempted homicides by a non-stationary time-series analysis. This analysis will be carried out by Detrended Fluctuation Analysis (DFA), Detrended Cross-Correlation Analysis (DCCA), and the DCCA cross-correlation coefficient, ρ(n). Through this analysis we can identify a positive cross-correlation between homicides and attempted homicides. At the same time, looked at from the point of view of autocorrelation (DFA), the analysis can be more informative depending on the time scale: for short scales (days) we cannot identify autocorrelations, on the scale of weeks DFA presents anti-persistent behavior, and for long time scales (n > 90 days) DFA presents persistent behavior. Finally, the application of this new type of statistical analysis proved to be efficient and, in this sense, this paper can contribute to more accurate descriptive statistics of crime.
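The DFA fluctuation function underlying this record can be written in a few lines. This is a minimal sketch (non-overlapping boxes, linear detrending only); for white noise the scaling exponent, the slope of log F(n) versus log n, should come out near 0.5, with values above 0.5 indicating the persistent behavior and below 0.5 the anti-persistent behavior mentioned in the abstract.

```python
import numpy as np

def dfa_fluctuation(x, n):
    """Detrended Fluctuation Analysis: fluctuation F(n) at box size n.

    Integrates the mean-removed series, splits the profile into
    non-overlapping boxes of length n, removes a linear trend in each
    box, and returns the root-mean-square residual.
    """
    y = np.cumsum(x - np.mean(x))          # the profile
    n_boxes = len(y) // n
    t = np.arange(n)
    sq = 0.0
    for b in range(n_boxes):
        seg = y[b * n:(b + 1) * n]
        coef = np.polyfit(t, seg, 1)       # local linear trend
        sq += np.mean((seg - np.polyval(coef, t)) ** 2)
    return np.sqrt(sq / n_boxes)

rng = np.random.default_rng(1)
noise = rng.normal(size=4096)              # white noise: expect alpha ~ 0.5
ns = [8, 16, 32, 64, 128]
fs = [dfa_fluctuation(noise, n) for n in ns]
alpha = np.polyfit(np.log(ns), np.log(fs), 1)[0]
print(round(alpha, 2))
```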

  17. Analysis of cost regression and post-accident absence

    NASA Astrophysics Data System (ADS)

    Wojciech, Drozd

    2017-07-01

    The article presents issues related to the costs of work safety. It proves the thesis that economic aspects cannot be overlooked in effective management of occupational health and safety and that adequate expenditures on safety can bring tangible benefits to the company. Reliable analysis of this problem is essential for describing the problem of work safety. The article attempts to carry out such an analysis using the procedures of mathematical statistics [1, 2, 3].

  18. Controlling the joint local false discovery rate is more powerful than meta-analysis methods in joint analysis of summary statistics from multiple genome-wide association studies.

    PubMed

    Jiang, Wei; Yu, Weichuan

    2017-02-15

    In genome-wide association studies (GWASs) of common diseases/traits, we often analyze multiple GWASs with the same phenotype together to discover associated genetic variants with higher power. Since it is difficult to access data with detailed individual measurements, summary-statistics-based meta-analysis methods have become popular to jointly analyze datasets from multiple GWASs. In this paper, we propose a novel summary-statistics-based joint analysis method based on controlling the joint local false discovery rate (Jlfdr). We prove that our method is the most powerful summary-statistics-based joint analysis method when controlling the false discovery rate at a certain level. In particular, the Jlfdr-based method achieves higher power than commonly used meta-analysis methods when analyzing heterogeneous datasets from multiple GWASs. Simulation experiments demonstrate the superior power of our method over meta-analysis methods. Also, our method discovers more associations than meta-analysis methods from empirical datasets of four phenotypes. The R-package is available at: http://bioinformatics.ust.hk/Jlfdr.html . eeyu@ust.hk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
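For context on the FDR control this record builds on: the standard summary-statistics baseline is the Benjamini-Hochberg step-up procedure. The sketch below is that baseline, not the paper's Jlfdr method; the p-values are illustrative.

```python
def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up procedure for FDR control at level q.

    Sort the p-values, find the largest rank k with
    p_(k) <= q * k / m, and reject the k smallest p-values.
    Returns the indices (into the input list) of rejected hypotheses.
    """
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= q * rank / m:
            k = rank               # largest rank passing the step-up test
    return sorted(order[:k])

pv = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.216]
print(benjamini_hochberg(pv, q=0.05))  # -> [0, 1]
```

Note the step-up character: a p-value is rejected if any larger rank passes the test, so rejections are not determined one p-value at a time. Local-fdr approaches such as Jlfdr instead estimate the posterior probability that each individual test is null.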

  19. Statistical issues in quality control of proteomic analyses: good experimental design and planning.

    PubMed

    Cairns, David A

    2011-03-01

    Quality control is becoming increasingly important in proteomic investigations as experiments become more multivariate and quantitative. Quality control applies to all stages of an investigation and statistics can play a key role. In this review, the role of statistical ideas in the design and planning of an investigation is described. This involves the design of unbiased experiments using key concepts from statistical experimental design, the understanding of the biological and analytical variation in a system using variance components analysis and the determination of a required sample size to perform a statistically powerful investigation. These concepts are described through simple examples and an example data set from a 2-D DIGE pilot experiment. Each of these concepts can prove useful in producing better and more reproducible data. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Exploratory Visual Analysis of Statistical Results from Microarray Experiments Comparing High and Low Grade Glioma

    PubMed Central

    Reif, David M.; Israel, Mark A.; Moore, Jason H.

    2007-01-01

    The biological interpretation of gene expression microarray results is a daunting challenge. For complex diseases such as cancer, wherein the body of published research is extensive, the incorporation of expert knowledge provides a useful analytical framework. We have previously developed the Exploratory Visual Analysis (EVA) software for exploring data analysis results in the context of annotation information about each gene, as well as biologically relevant groups of genes. We present EVA as a flexible combination of statistics and biological annotation that provides a straightforward visual interface for the interpretation of microarray analyses of gene expression in the most commonly occurring class of brain tumors, glioma. We demonstrate the utility of EVA for the biological interpretation of statistical results by analyzing publicly available gene expression profiles of two important glial tumors. The results of a statistical comparison between 21 malignant, high-grade glioblastoma multiforme (GBM) tumors and 19 indolent, low-grade pilocytic astrocytomas were analyzed using EVA. By using EVA to examine the results of a relatively simple statistical analysis, we were able to identify tumor class-specific gene expression patterns having both statistical and biological significance. Our interactive analysis highlighted the potential importance of genes involved in cell cycle progression, proliferation, signaling, adhesion, migration, motility, and structure, as well as candidate gene loci on a region of Chromosome 7 that has been implicated in glioma. Because EVA does not require statistical or computational expertise and has the flexibility to accommodate any type of statistical analysis, we anticipate EVA will prove a useful addition to the repertoire of computational methods used for microarray data analysis. EVA is available at no charge to academic users and can be found at http://www.epistasis.org. PMID:19390666

  1. Vapor Pressure Data and Analysis for Selected Organophosphorus Compounds: DIBMP, DCMP, IMMP, IMPA, EMPA, and MPFA

    DTIC Science & Technology

    2017-04-01

    Methodology, Statistics, and Applications; CRDEC-TR-386; U.S. Army Chemical Research, Development and Engineering Center: Aberdeen Proving Ground...Approved for public release; distribution unlimited. 13. SUPPLEMENTARY NOTES 14. ABSTRACT: Recent work from our laboratory has focused on chemical ...vaporization Volatility Differential scanning calorimetry (DSC) Vapor saturation Boiling point Diisobutyl methylphosphonate (DIBMP), Chemical Abstracts

  2. Minorities & Women in the Health Fields: Applicants, Students, and Workers. Health Manpower References.

    ERIC Educational Resources Information Center

    Philpot, Wilbertine P.; Bernstein, Stuart

    A comprehensive look at the current and future supply of women and minorities in the health professions and in health professions schools is provided in this statistical report. Its data are more extensive than those presented in either of two earlier reports, hence, it can prove useful in assisting analysis of the composition of the nation's…

  3. Statistical analysis of NaOH pretreatment effects on sweet sorghum bagasse characteristics

    NASA Astrophysics Data System (ADS)

    Putri, Ary Mauliva Hada; Wahyuni, Eka Tri; Sudiyani, Yanni

    2017-01-01

    We analyze the behavior of sweet sorghum bagasse characteristics before and after NaOH pretreatment by statistical analysis. These characteristics include the percentages of lignocellulosic materials and the degree of crystallinity. We use the chi-square method to obtain the values of the fitted parameters, and then deploy Student's t-test to check whether they are significantly different from zero at the 99.73% confidence level (C.L.). We find, in the cases of hemicellulose and lignin, that their percentages decrease significantly after pretreatment. On the other hand, crystallinity does not show similar behavior, as the data prove that all fitted parameters in this case might be consistent with zero. Our statistical result is then cross-examined with the observations from X-ray diffraction (XRD) and Fourier Transform Infrared (FTIR) spectroscopy, showing good agreement. This result may indicate that the 10% NaOH pretreatment might not be sufficient to change the crystallinity index of the sweet sorghum bagasse.
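The significance check described here, whether a fitted parameter differs from zero at the 99.73% level, amounts to a roughly 3-sigma comparison. A crude sketch (a z-approximation to the t-test, with made-up numbers, not the authors' exact procedure):

```python
def differs_from_zero(estimate, std_err, n_sigma=3.0):
    """Check whether a fitted parameter is inconsistent with zero at
    roughly the 99.73% C.L.: compare |estimate| to n_sigma standard
    errors. Illustrative z-approximation only; exact levels depend on
    the degrees of freedom of the fit.
    """
    return abs(estimate) > n_sigma * std_err

# e.g. a fitted change in hemicellulose fraction of -8.1 +/- 1.2 (made-up)
print(differs_from_zero(-8.1, 1.2))  # -> True
```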

  4. The mediating effect of calling on the relationship between medical school students' academic burnout and empathy.

    PubMed

    Chae, Su Jin; Jeong, So Mi; Chung, Yoon-Sok

    2017-09-01

    This study is aimed at identifying the relationships between medical school students' academic burnout, empathy, and calling, and determining whether their calling has a mediating effect on the relationship between academic burnout and empathy. A mixed method study was conducted. One hundred twenty-seven medical students completed a survey. Scales measuring academic burnout, medical students' empathy, and calling were utilized. For statistical analysis, correlation analysis, descriptive statistics analysis, and hierarchical multiple regression analyses were conducted. For qualitative approach, eight medical students participated in a focus group interview. The study found that empathy has a statistically significant, negative correlation with academic burnout, while having a significant, positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. This result demonstrates that calling is a key variable that mediates the relationship between medical students' academic burnout and empathy. As such, this study provides baseline data for an education that could improve medical students' empathy skills.

  5. Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: a primer and applications.

    PubMed

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E

    2014-04-01

    This article presents a d-statistic for single-case designs that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between case variance to total variance (between case plus within case variance), how to compute power using a macro, and how to use the d to conduct a meta-analysis of studies using single-case designs in the free program R, including syntax in an appendix. This syntax includes how to read data, compute fixed and random effect average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and do various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
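The fixed-effect averaging step that the article's R syntax performs on per-study d statistics can be sketched with standard inverse-variance weights. This is the generic formula, not the authors' macros, and the numbers are illustrative.

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance-weighted fixed-effect meta-analysis.

    effects:   per-study effect sizes (e.g. d statistics)
    variances: their sampling variances
    Returns the pooled effect and its standard error; each study is
    weighted by the inverse of its variance, so precise studies count more.
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# three made-up single-case-design studies
d, se = fixed_effect_meta([0.4, 0.6, 0.5], [0.04, 0.09, 0.02])
print(round(d, 3), round(se, 3))
```

A random-effects version would add a between-study variance component to each study's variance before weighting, which is what the forest-plot and heterogeneity steps described in the abstract are probing.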

  6. Monitoring of bread cooling by statistical analysis of laser speckle patterns

    NASA Astrophysics Data System (ADS)

    Lyubenova, Tanya; Stoykova, Elena; Nacheva, Elena; Ivanov, Branimir; Panchev, Ivan; Sainov, Ventseslav

    2013-03-01

    The phenomenon of laser speckle can be used for detection and visualization of physical or biological activity in various objects (e.g. fruits, seeds, coatings) through statistical description of speckle dynamics. The paper presents the results of non-destructive monitoring of bread cooling by co-occurrence matrix and temporal structure function analysis of speckle patterns which have been recorded continuously within a few days. In total, 72960 and 39680 images were recorded and processed for two similar bread samples respectively. The experiments proved the expected steep decrease of activity related to the processes in the bread samples during the first several hours and revealed its oscillating character within the next few days. Characterization of activity over the bread sample surface was also obtained.

  7. An evaluation of intraoperative and postoperative outcomes of torsional mode versus longitudinal ultrasound mode phacoemulsification: a Meta-analysis.

    PubMed

    Leon, Pia; Umari, Ingrid; Mangogna, Alessandro; Zanei, Andrea; Tognetto, Daniele

    2016-01-01

    To evaluate and compare the intraoperative parameters and postoperative outcomes of torsional mode and longitudinal mode of phacoemulsification. Pertinent studies were identified by a computerized MEDLINE search from January 2002 to September 2013. The Meta-analysis is composed of two parts. In the first part the intraoperative parameters were considered: ultrasound time (UST) and cumulative dissipated energy (CDE). The intraoperative values were also distinctly considered for two categories (moderate and hard cataract group) depending on the nuclear opacity grade. In the second part of the study the postoperative outcomes as the best corrected visual acuity (BCVA) and the endothelial cell loss (ECL) were taken in consideration. The UST and CDE values proved statistically significant in support of torsional mode for both moderate and hard cataract group. The analysis of BCVA did not present statistically significant difference between the two surgical modalities. The ECL count was statistically significant in support of torsional mode (P<0.001). The Meta-analysis shows the superiority of the torsional mode for intraoperative parameters (UST, CDE) and postoperative ECL outcomes.

  8. An evaluation of intraoperative and postoperative outcomes of torsional mode versus longitudinal ultrasound mode phacoemulsification: a Meta-analysis

    PubMed Central

    Leon, Pia; Umari, Ingrid; Mangogna, Alessandro; Zanei, Andrea; Tognetto, Daniele

    2016-01-01

    AIM To evaluate and compare the intraoperative parameters and postoperative outcomes of torsional mode and longitudinal mode of phacoemulsification. METHODS Pertinent studies were identified by a computerized MEDLINE search from January 2002 to September 2013. The Meta-analysis is composed of two parts. In the first part the intraoperative parameters were considered: ultrasound time (UST) and cumulative dissipated energy (CDE). The intraoperative values were also distinctly considered for two categories (moderate and hard cataract group) depending on the nuclear opacity grade. In the second part of the study the postoperative outcomes as the best corrected visual acuity (BCVA) and the endothelial cell loss (ECL) were taken in consideration. RESULTS The UST and CDE values proved statistically significant in support of torsional mode for both moderate and hard cataract group. The analysis of BCVA did not present statistically significant difference between the two surgical modalities. The ECL count was statistically significant in support of torsional mode (P<0.001). CONCLUSION The Meta-analysis shows the superiority of the torsional mode for intraoperative parameters (UST, CDE) and postoperative ECL outcomes. PMID:27366694

  9. Evaluation of direct and indirect ethanol biomarkers using a likelihood ratio approach to identify chronic alcohol abusers for forensic purposes.

    PubMed

    Alladio, Eugenio; Martyna, Agnieszka; Salomone, Alberto; Pirro, Valentina; Vincenti, Marco; Zadora, Grzegorz

    2017-02-01

    The detection of direct ethanol metabolites, such as ethyl glucuronide (EtG) and fatty acid ethyl esters (FAEEs), in scalp hair is considered the optimal strategy to effectively recognize chronic alcohol misuse by means of specific cut-offs suggested by the Society of Hair Testing. However, several factors (e.g. hair treatments) may alter the correlation between alcohol intake and biomarker concentrations, possibly introducing bias into the interpretative process and conclusions. A total of 125 subjects with various drinking habits were subjected to blood and hair sampling to determine indirect (e.g. CDT) and direct alcohol biomarkers. The overall data were investigated using several multivariate statistical methods. A likelihood ratio (LR) approach was used for the first time to provide predictive models for the diagnosis of alcohol abuse, based on different combinations of direct and indirect alcohol biomarkers. LR strategies provide a more robust outcome than plain comparison with cut-off values, where tiny changes in the analytical results can lead to dramatic divergence in the way they are interpreted. An LR model combining EtG and FAEEs hair concentrations proved to discriminate non-chronic from chronic consumers with ideal correct classification rates, whereas the contribution of indirect biomarkers proved to be negligible. Optimal results were observed using a novel approach that associates LR methods with multivariate statistics. In particular, the combination of the LR approach with either Principal Component Analysis (PCA) or Linear Discriminant Analysis (LDA) proved successful in discriminating chronic from non-chronic alcohol drinkers. These LR models were subsequently tested on an independent dataset of 43 individuals, which confirmed their high efficiency. These models proved to be less prone to bias than EtG and FAEEs considered independently.
In conclusion, LR models may represent an efficient strategy to sustain the diagnosis of chronic alcohol consumption and provide a suitable gradation to support the judgment. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
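    The likelihood-ratio idea described above can be illustrated with a minimal sketch: an LR computed from two univariate Gaussian densities fitted to a hypothetical log-transformed hair-EtG concentration. The means and standard deviations below are invented for illustration and are not the study's estimates; the actual models combined EtG and FAEEs with multivariate statistics.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Univariate normal density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(x, mu_chronic, sd_chronic, mu_non, sd_non):
    """LR = p(evidence | chronic) / p(evidence | non-chronic):
    values > 1 support the chronic-abuse hypothesis, values < 1 the alternative."""
    return gaussian_pdf(x, mu_chronic, sd_chronic) / gaussian_pdf(x, mu_non, sd_non)

# Hypothetical log10 hair-EtG concentrations; all parameters are invented.
lr_high = likelihood_ratio(2.0, mu_chronic=2.1, sd_chronic=0.4, mu_non=0.8, sd_non=0.4)
lr_low = likelihood_ratio(0.7, mu_chronic=2.1, sd_chronic=0.4, mu_non=0.8, sd_non=0.4)
```

    Unlike a hard cut-off, the LR degrades gracefully: an observation near the boundary yields an LR near 1 rather than flipping the verdict.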

  10. Introduction of High Throughput Magnetic Resonance T2-Weighted Image Texture Analysis for WHO Grade 2 and 3 Gliomas.

    PubMed

    Kinoshita, Manabu; Sakai, Mio; Arita, Hideyuki; Shofuda, Tomoko; Chiba, Yasuyoshi; Kagawa, Naoki; Watanabe, Yoshiyuki; Hashimoto, Naoya; Fujimoto, Yasunori; Yoshimine, Toshiki; Nakanishi, Katsuyuki; Kanemura, Yonehiro

    2016-01-01

    Reports have suggested that tumor textures presented on T2-weighted images correlate with the genetic status of glioma. Therefore, development of an image analysis framework capable of objective and high throughput image texture analysis for large scale image data collection is needed. The current study aimed to address the development of such a framework by introducing two novel parameters for image textures on T2-weighted images, i.e., Shannon entropy and Prewitt filtering. Twenty-two WHO grade 2 and 28 grade 3 glioma patients whose pre-surgical MRI and IDH1 mutation status were available were enrolled. Heterogeneous lesions showed statistically higher Shannon entropy than homogeneous lesions (p = 0.006), and ROC curve analysis proved that Shannon entropy on T2WI was a reliable indicator for discrimination of homogeneous and heterogeneous lesions (p = 0.015, AUC = 0.73). Lesions with well-defined borders exhibited statistically higher Edge mean and Edge median values using Prewitt filtering than those with vague lesion borders (p = 0.0003 and p = 0.0005, respectively). ROC curve analysis also proved that both Edge mean and median values were promising indicators for discrimination of lesions with vague and well-defined borders, and both performed in a comparable manner (p = 0.0002, AUC = 0.81 and p < 0.0001, AUC = 0.83, respectively). Finally, IDH1 wild type gliomas showed statistically lower Shannon entropy on T2WI than IDH1 mutated gliomas (p = 0.007), but no difference was observed between IDH1 wild type and mutated gliomas in Edge median values using Prewitt filtering. The current study introduced two image metrics that reflect lesion texture described on T2WI. These two metrics were validated against readings of a neuro-radiologist who was blinded to the results. This observation will facilitate further use of this technique in future large scale image analysis of glioma.
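    The two texture metrics named in the abstract can be reproduced in miniature: Shannon entropy of the grey-level histogram and summary statistics of the Prewitt gradient magnitude. This is a generic sketch of those standard definitions, not the study's validated pipeline; histogram binning, ROI handling, and normalization are unspecified assumptions here.

```python
import numpy as np

def shannon_entropy(img, bins=256):
    """Shannon entropy (bits) of the grey-level histogram of an image region."""
    hist, _ = np.histogram(img, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def prewitt_edge_stats(img):
    """Mean and median Prewitt gradient magnitude over the image interior."""
    img = img.astype(float)
    kx = np.array([[1, 0, -1]] * 3)   # horizontal Prewitt kernel
    ky = kx.T                          # vertical Prewitt kernel
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):             # explicit 'valid' convolution, kept small for clarity
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(patch * kx)
            gy[i, j] = np.sum(patch * ky)
    mag = np.hypot(gx, gy)
    return mag.mean(), np.median(mag)

# A flat (homogeneous) region has zero entropy and zero edge response;
# a noisy (heterogeneous) region scores higher on both.
flat = np.full((16, 16), 100.0)
rng = np.random.default_rng(0)
noisy = rng.integers(0, 256, size=(16, 16)).astype(float)

e_flat = shannon_entropy(flat)
e_noisy = shannon_entropy(noisy)
edge_flat_mean, _ = prewitt_edge_stats(flat)
edge_noisy_mean, _ = prewitt_edge_stats(noisy)
```

    The direction of the contrast matches the abstract: heterogeneous texture raises entropy, and sharp borders raise the Prewitt edge statistics.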

  11. Time lapse microscopy observation of cellular structural changes and image analysis of drug treated cancer cells to characterize the cellular heterogeneity.

    PubMed

    Vaiyapuri, Periasamy S; Ali, Alshatwi A; Mohammad, Akbarsha A; Kandhavelu, Jeyalakshmi; Kandhavelu, Meenakshisundaram

    2015-01-01

    The effect of Calotropis gigantea latex (CGLX) on human mammary carcinoma cells is not well established. We present the results of this drug activity at the total-population and single-cell levels. CGLX inhibited the growth of MCF7 cancer cells at a low IC50 concentration (17 µL/mL). Microscopy of IC50 drug-treated cells at 24 hr confirmed the appearance of morphological characteristics of apoptotic and necrotic cells, associated with 70% DNA damage. FACS analysis confirmed 10% and 20% disruption of mitochondrial integrity at 24 and 48 h, respectively. Microscopic image analysis at the total-population level showed that MMP changes were statistically significant. The cell-to-cell variation was confirmed by functional heterogeneity analysis, which indicated that CGLX was able to induce apoptosis without the contribution of mitochondria. We conclude that CGLX inhibits cell proliferation, survival, and heterogeneity of pathways in human mammary carcinoma cells. © 2014 Wiley Periodicals, Inc.

  12. The application of satellite data in monitoring strip mines

    NASA Technical Reports Server (NTRS)

    Sharber, L. A.; Shahrokhi, F.

    1977-01-01

    Strip mines in the New River Drainage Basin of Tennessee were studied through the use of Landsat-1 imagery and aircraft photography. A multilevel analysis, involving conventional photo interpretation techniques, densitometric methods, multispectral analysis, and statistical testing, was applied to the data. The Landsat imagery proved adequate for monitoring large-scale change resulting from active mining and land-reclamation projects. However, the spatial resolution of the satellite imagery rendered it inadequate for assessing many of the smaller strip mines in the region, which may be as small as a few hectares.

  13. [Prosthodontic research design from the standpoint of statistical analysis: learning and knowing the research design].

    PubMed

    Tanoue, Naomi

    2007-10-01

    For any kind of research, the "Research Design" is the most important element. The design is used to structure the research and to show how all of the major parts of the research project fit together. It is necessary for all researchers to begin only after planning the research design: what the main theme is, what the background and references are, what kind of data is needed, and what kind of analysis is needed. This may seem a roundabout route, but, in fact, it is a shortcut. The research methods must be appropriate to the objectives of the study. Regarding hypothesis-testing research, which is the traditional style of research, a research design based on statistics is undoubtedly necessary, considering that such research basically proves a hypothesis with data and statistical theory. For a clinical trial, which is the clinical version of hypothesis-testing research, the statistical method must be specified in the trial plan. This report describes the basis of research design for a prosthodontics study.

  14. The mediating effect of calling on the relationship between medical school students’ academic burnout and empathy

    PubMed Central

    2017-01-01

    Purpose This study is aimed at identifying the relationships between medical school students’ academic burnout, empathy, and calling, and determining whether their calling has a mediating effect on the relationship between academic burnout and empathy. Methods A mixed method study was conducted. One hundred twenty-seven medical students completed a survey. Scales measuring academic burnout, medical students’ empathy, and calling were utilized. For statistical analysis, correlation analysis, descriptive statistics analysis, and hierarchical multiple regression analyses were conducted. For qualitative approach, eight medical students participated in a focus group interview. Results The study found that empathy has a statistically significant, negative correlation with academic burnout, while having a significant, positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. Conclusion This result demonstrates that calling is a key variable that mediates the relationship between medical students’ academic burnout and empathy. As such, this study provides baseline data for an education that could improve medical students’ empathy skills. PMID:28870019

  15. Management of constipation in palliative care patients undergoing opioid therapy: is polyethylene glycol an option?

    PubMed

    Wirz, Stefan; Klaschik, Eberhard

    2005-01-01

    This study assessed the efficacy of laxative use for treatment of constipation in patients receiving opioid therapy, with special attention to polyethylene glycol 3350/electrolyte solution (PEG-ES). Computerized data from 206 patients were analyzed using descriptive statistics. Subgroups were analyzed using confirmatory statistics. Constipation occurred in 42.7 percent of patients. Laxatives were administered to 74.3 percent of these patients using a standardized step scheme, with good results in 78.4 percent. As a therapy for constipation, the combined administration of PEG-ES, sodium picosulphate, and liquid paraffin proved most effective, although statistical analysis yielded no significance. Early use of PEG-ES using a step scheme holds promise for treatment of opioid-related constipation in palliative care patients, although further investigation is warranted.

  16. The battle against violence in U.S. hospitals: an analysis of the recent IAHSS Foundation's healthcare crime surveys.

    PubMed

    Vellani, Karim H

    2016-10-01

    In this article, the author analyzes the possible reasons for the reported drop in hospital violence in the 2016 IAHSS Crime Survey compared to previous surveys. He also reviews the one statistic that has remained constant in all the recent crime surveys and recommends an approach to violence prevention programs that may prove successful in reducing workplace violence and staff injuries.

  17. The effect of telehealth systems and satisfaction with health expenditure among patients with metabolic syndrome.

    PubMed

    Uei, Shu-Lin; Tsai, Chung-Hung; Kuo, Yu-Ming

    2016-04-29

    Telehealth cost analysis has become a crucial issue for governments in recent years. In this study, we examined cases of metabolic syndrome in Hualien County, Taiwan. This research adopted the framework proposed by Marchand to establish a study process. In addition, descriptive statistics, a t test, analysis of variance, and regression analysis were employed to analyze 100 questionnaires. The results of the t test revealed significant differences in medical health expenditure, number of clinical visits for medical treatment, average amount of time spent commuting to clinics, amount of time spent undergoing medical treatment, and average number of people accompanying patients to medical care facilities or assisting with other tasks in the past month, indicating that offering telehealth care services can reduce health expenditure. The statistical analysis results revealed that customer satisfaction has a positive effect on reducing health expenditure. Therefore, this study proves that telehealth care systems can effectively reduce health expenditure and directly improve customer satisfaction with medical treatment.

  18. Statistical framework for detection of genetically modified organisms based on Next Generation Sequencing.

    PubMed

    Willems, Sander; Fraiture, Marie-Alice; Deforce, Dieter; De Keersmaecker, Sigrid C J; De Loose, Marc; Ruttink, Tom; Herman, Philippe; Van Nieuwerburgh, Filip; Roosens, Nancy

    2016-02-01

    Because the number and diversity of genetically modified (GM) crops has significantly increased, their analysis based on real-time PCR (qPCR) methods is becoming increasingly complex and laborious. While several pioneers have already investigated Next Generation Sequencing (NGS) as an alternative to qPCR, its practical use has not been assessed for routine analysis. In this study a statistical framework was developed to predict the number of NGS reads needed to detect transgene sequences, to prove their integration into the host genome, and to identify the specific transgene event in a sample with known composition. This framework was validated by applying it to experimental data from food matrices composed of pure GM rice, processed GM rice (noodles), or a 10% GM/non-GM rice mixture, revealing some influential factors. Finally, the feasibility of NGS for routine analysis of GM crops was investigated by applying the framework to samples commonly encountered in routine analysis of GM crops. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
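    The paper's actual framework is not reproduced here, but a common back-of-the-envelope version of the same question, how many reads are needed to see a rare target sequence at least once, follows from a simple binomial model. The abundance figures below are purely illustrative assumptions.

```python
import math

def reads_for_detection(target_fraction, prob=0.99):
    """Smallest read count N with P(at least one read hits the target) >= prob,
    assuming each read independently hits with probability `target_fraction`:
    1 - (1 - f)**N >= prob  =>  N >= log(1 - prob) / log(1 - f)."""
    return math.ceil(math.log(1.0 - prob) / math.log(1.0 - target_fraction))

# Hypothetical abundances: a transgene at 0.1 % of sequenced reads versus
# a ten-fold more dilute sample (e.g. a GM/non-GM mixture).
n_pure = reads_for_detection(1e-3)
n_dilute = reads_for_detection(1e-4)
```

    The required depth scales roughly inversely with target abundance, which is why dilution in a mixture is the dominant cost factor in such a design.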

  19. Sparse approximation of currents for statistics on curves and surfaces.

    PubMed

    Durrleman, Stanley; Pennec, Xavier; Trouvé, Alain; Ayache, Nicholas

    2008-01-01

    Computing, processing, and visualizing statistics on shapes such as curves or surfaces is a real challenge, with many applications ranging from medical image analysis to computational geometry. Modelling such geometrical primitives with currents avoids feature-based approaches as well as point-correspondence methods. This framework has proved powerful for registering brain surfaces and for measuring geometrical invariants. However, while state-of-the-art methods perform pairwise registrations efficiently, new numerical schemes are required to process groupwise statistics, due to the increasing complexity as the size of the database grows. Statistics such as the mean and principal modes of a set of shapes often have a heavy and highly redundant representation. We therefore propose to find an adapted basis on which the mean and principal modes have a sparse decomposition. Besides the computational improvement, this sparse representation offers a way to visualize and interpret statistics on currents. Experiments show the relevance of the approach on 34 sets of 70 sulcal lines and on 50 sets of 10 meshes of deep brain structures.

  20. Texture analysis of pulmonary parenchyma in normal and emphysematous lung

    NASA Astrophysics Data System (ADS)

    Uppaluri, Renuka; Mitsa, Theophano; Hoffman, Eric A.; McLennan, Geoffrey; Sonka, Milan

    1996-04-01

    Tissue characterization using texture analysis is gaining increasing importance in medical imaging. We present a completely automated method for discriminating between normal and emphysematous regions in CT images. This method involves extracting seventeen features based on statistical, hybrid, and fractal texture models. The best subset of features is derived from the training set using the divergence technique. A minimum distance classifier is used to classify the samples into one of two classes: normal and emphysema. Sensitivity, specificity, and accuracy values achieved were 80% or greater in most cases, proving that texture analysis holds great promise for identifying emphysema.
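    A minimum distance classifier of the kind this abstract describes is simply a nearest-centroid rule in feature space. The sketch below, on invented two-feature toy data, shows the training (centroid computation) and classification steps; the study's seventeen texture features and divergence-based feature selection are not reproduced.

```python
import numpy as np

def fit_centroids(X, y):
    """'Training' a minimum distance classifier: compute each class centroid."""
    classes = np.unique(y)
    return classes, np.array([X[y == c].mean(axis=0) for c in classes])

def predict(X, classes, centroids):
    """Assign each sample to the class with the nearest centroid (Euclidean)."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return classes[np.argmin(d, axis=1)]

# Toy 2-feature data: 'normal' clustered near (0, 0), 'emphysema' near (5, 5).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.5, size=(20, 2)),
               rng.normal(5.0, 0.5, size=(20, 2))])
y = np.array([0] * 20 + [1] * 20)

classes, centroids = fit_centroids(X, y)
accuracy = (predict(X, classes, centroids) == y).mean()
```

    With well-separated clusters the rule is exact; on real texture features its performance depends entirely on how separable the selected feature subset is.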

  1. Automated spectral and timing analysis of AGNs

    NASA Astrophysics Data System (ADS)

    Munz, F.; Karas, V.; Guainazzi, M.

    2006-12-01

    We have developed an autonomous script that helps the user automate XMM-Newton data analysis for the purposes of extensive statistical investigations. We test this approach by examining X-ray spectra of bright AGNs pre-selected from the public database. The event lists extracted in this process were studied further by constructing their energy-resolved Fourier power-spectrum density. This analysis combines energy distributions, light-curves, and their power-spectra, and it proves useful for assessing the variability patterns present in the data. As another example, an automated search based on the XSPEC package was used to reveal the emission features in the 2-8 keV range.

  2. Statistical analysis for improving data precision in the SPME GC-MS analysis of blackberry (Rubus ulmifolius Schott) volatiles.

    PubMed

    D'Agostino, M F; Sanz, J; Martínez-Castro, I; Giuffrè, A M; Sicari, V; Soria, A C

    2014-07-01

    Statistical analysis has been used for the first time to evaluate the dispersion of quantitative data in the solid-phase microextraction (SPME) followed by gas chromatography-mass spectrometry (GC-MS) analysis of blackberry (Rubus ulmifolius Schott) volatiles, with the aim of improving their precision. Experimental and randomly simulated data were compared using different statistical parameters (correlation coefficients, Principal Component Analysis loadings and eigenvalues). Non-random factors were shown to contribute significantly to total dispersion; groups of volatile compounds could be associated with these factors. A significant improvement in precision was achieved when considering percent concentration ratios, rather than percent values, among those blackberry volatiles with a similar dispersion behavior. As a novelty over previous studies, and to complement this main objective, the presence of non-random dispersion trends in data from simple blackberry model systems was evidenced. Although the influence of the type of matrix on data precision was proved, a better understanding of the dispersion patterns in real samples could not be obtained from model systems. The approach used here was validated for the first time through the multicomponent characterization of Italian blackberries from different harvest years. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Statistical trend analysis and extreme distribution of significant wave height from 1958 to 1999 - an application to the Italian Seas

    NASA Astrophysics Data System (ADS)

    Martucci, G.; Carniel, S.; Chiggiato, J.; Sclavo, M.; Lionello, P.; Galati, M. B.

    2010-06-01

    The study is a statistical analysis of sea-state time series derived using the wave model WAM forced by the ERA-40 dataset in selected areas near the Italian coasts. For the period 1 January 1958 to 31 December 1999 the analysis yields: (i) the existence of a negative trend in the annual- and winter-averaged sea state heights; (ii) the existence of a turning point in the late 1980s in the annual-averaged trend of sea state heights at a site in the Northern Adriatic Sea; (iii) the overall absence of a significant trend in the annual-averaged mean durations of sea states over thresholds; (iv) the assessment of the extreme values on a time scale of a thousand years. The analysis uses two methods to obtain samples of extremes from the independent sea states: the r-largest annual maxima and the peak-over-threshold. The two methods show statistical differences in retrieving the return values and, more generally, in describing the significant wave field. The r-largest annual maxima method provides more reliable predictions of the extreme values, especially for small return periods (<100 years). Finally, the study statistically proves the existence of decadal negative trends in the significant wave heights, and by this it conveys useful information on the wave climatology of the Italian seas during the second half of the 20th century.
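    The two sampling schemes compared in the study can be sketched directly: the r-largest method keeps the r top observations per year, while peak-over-threshold keeps every exceedance of a fixed level. The wave heights below are toy numbers; in practice, the r-largest sample would then be fitted with a GEV-family model and the POT sample with a generalized Pareto distribution.

```python
def r_largest_annual_maxima(series_by_year, r=3):
    """For each year, keep the r largest observations (r-largest method)."""
    return {yr: sorted(vals, reverse=True)[:r] for yr, vals in series_by_year.items()}

def peaks_over_threshold(values, threshold):
    """Keep every exceedance of a fixed threshold (POT method)."""
    return [v for v in values if v > threshold]

# Toy significant-wave-height records (m), two "years" of observations.
series_by_year = {
    1998: [1.2, 3.4, 2.8, 4.1, 0.9, 3.9],
    1999: [2.0, 2.5, 5.2, 1.1, 3.0, 4.4],
}

r_samples = r_largest_annual_maxima(series_by_year, r=3)
all_values = [v for vals in series_by_year.values() for v in vals]
pot_samples = peaks_over_threshold(all_values, threshold=3.5)
```

    The two samples generally differ, which is one source of the statistical differences in return values the study reports; real applications also enforce declustering so that POT exceedances come from independent storms.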

  4. Cancer diagnosis by infrared spectroscopy: methodological aspects

    NASA Astrophysics Data System (ADS)

    Jackson, Michael; Kim, Keith; Tetteh, John; Mansfield, James R.; Dolenko, Brion; Somorjai, Raymond L.; Orr, F. W.; Watson, Peter H.; Mantsch, Henry H.

    1998-04-01

    IR spectroscopy is proving to be a powerful tool for the study and diagnosis of cancer. The application of IR spectroscopy to the analysis of cultured tumor cells and grading of breast cancer sections is outlined. Potential sources of error in spectral interpretation due to variations in sample histology and artifacts associated with sample storage and preparation are discussed. The application of statistical techniques to assess differences between spectra and to non-subjectively classify spectra is demonstrated.

  5. On Nonlinear Functionals of Random Spherical Eigenfunctions

    NASA Astrophysics Data System (ADS)

    Marinucci, Domenico; Wigman, Igor

    2014-05-01

    We prove central limit theorems and Stein-like bounds for the asymptotic behaviour of nonlinear functionals of spherical Gaussian eigenfunctions. Our investigation combines asymptotic analysis of higher order moments for Legendre polynomials and, in addition, recent results on Malliavin calculus and total variation bounds for Gaussian subordinated fields. We discuss applications to geometric functionals like the defect and invariant statistics, e.g., polyspectra of isotropic spherical random fields. Both of these have relevance for applications, especially in an astrophysical environment.

  6. Neural network representation and learning of mappings and their derivatives

    NASA Technical Reports Server (NTRS)

    White, Halbert; Hornik, Kurt; Stinchcombe, Maxwell; Gallant, A. Ronald

    1991-01-01

    Discussed here are recent theorems proving that artificial neural networks are capable of approximating an arbitrary mapping and its derivatives as accurately as desired. This fact forms the basis for further results establishing the learnability of the desired approximations, using results from non-parametric statistics. These results have potential applications in robotics, chaotic dynamics, control, and sensitivity analysis. An example involving learning the transfer function and its derivatives for a chaotic map is discussed.

  7. Bayesian selection of Markov models for symbol sequences: application to microsaccadic eye movements.

    PubMed

    Bettenbühl, Mario; Rusconi, Marco; Engbert, Ralf; Holschneider, Matthias

    2012-01-01

    Complex biological dynamics often generate sequences of discrete events which can be described as a Markov process. The order of the underlying Markovian stochastic process is fundamental for characterizing statistical dependencies within sequences. As an example for this class of biological systems, we investigate the Markov order of sequences of microsaccadic eye movements from human observers. We calculate the integrated likelihood of a given sequence for various orders of the Markov process and use this in a Bayesian framework for statistical inference on the Markov order. Our analysis shows that data from most participants are best explained by a first-order Markov process. This is compatible with recent findings of a statistical coupling of subsequent microsaccade orientations. Our method might prove to be useful for a broad class of biological systems.
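    The integrated likelihood the authors compute has a closed form when each context's transition probabilities are given a uniform Dirichlet prior. The sketch below applies that standard conjugate result to a synthetic two-symbol sequence; it illustrates the general method, not the paper's code, and the sticky-chain generator is invented.

```python
import math
import random
from collections import defaultdict

def markov_log_evidence(seq, order, alphabet_size):
    """Integrated (marginal) log-likelihood of `seq` under a Markov chain of the
    given order, with a uniform Dirichlet(1,...,1) prior on each context's
    transition probabilities (closed form via gamma functions)."""
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(order, len(seq)):
        counts[tuple(seq[i - order:i])][seq[i]] += 1
    logp = 0.0
    for trans in counts.values():
        n = sum(trans.values())
        logp += math.lgamma(alphabet_size) - math.lgamma(alphabet_size + n)
        for c in trans.values():
            logp += math.lgamma(1 + c)   # lgamma(1) == 0, so no prior term needed
    return logp

# A "sticky" two-state sequence: strong first-order dependence.
random.seed(42)
seq, state = [], 0
for _ in range(2000):
    if random.random() < 0.1:            # switch state with probability 0.1
        state = 1 - state
    seq.append(state)

ev0 = markov_log_evidence(seq, 0, 2)
ev1 = markov_log_evidence(seq, 1, 2)    # order 1 should win on these data
```

    Comparing the evidences (or their exponentiated ratio, a Bayes factor) across orders is the model-selection step; the marginalization automatically penalizes the extra parameters of higher orders.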

  8. An effect size filter improves the reproducibility in spectral counting-based comparative proteomics.

    PubMed

    Gregori, Josep; Villarreal, Laura; Sánchez, Alex; Baselga, José; Villanueva, Josep

    2013-12-16

    The microarray community has shown that the low reproducibility observed in gene expression-based biomarker discovery studies is partially due to relying solely on p-values to get the lists of differentially expressed genes. Their conclusions recommended complementing the p-value cutoff with the use of effect-size criteria. The aim of this work was to evaluate the influence of such an effect-size filter on spectral counting-based comparative proteomic analysis. The results proved that the filter increased the number of true positives and decreased the number of false positives and the false discovery rate of the dataset. These results were confirmed by simulation experiments where the effect size filter was used to evaluate systematically variable fractions of differentially expressed proteins. Our results suggest that relaxing the p-value cut-off followed by a post-test filter based on effect size and signal level thresholds can increase the reproducibility of statistical results obtained in comparative proteomic analysis. Based on our work, we recommend using a filter consisting of a minimum absolute log2 fold change of 0.8 and a minimum signal of 2-4 SpC on the most abundant condition for the general practice of comparative proteomics. The implementation of feature filtering approaches could improve proteomic biomarker discovery initiatives by increasing the reproducibility of the results obtained among independent laboratories and MS platforms. Quality control analysis of microarray-based gene expression studies pointed out that the low reproducibility observed in the lists of differentially expressed genes could be partially attributed to the fact that these lists are generated relying solely on p-values. Our study has established that the implementation of an effect size post-test filter improves the statistical results of spectral count-based quantitative proteomics. 
The results proved that the filter increased the number of true positives while decreasing the false positives and the false discovery rate of the datasets. The results presented here prove that a post-test filter applying reasonable effect size and signal level thresholds helps to increase the reproducibility of statistical results in comparative proteomic analysis. Furthermore, the implementation of feature filtering approaches could improve proteomic biomarker discovery initiatives by increasing the reproducibility of results obtained among independent laboratories and MS platforms. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 Elsevier B.V. All rights reserved.
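    The recommended post-test filter is concrete enough to sketch directly: a protein passes only if it is nominally significant, its absolute log2 fold change reaches 0.8, and the more abundant condition has enough spectral counts. The pseudocount and the example counts below are illustrative choices, not values from the paper.

```python
import math

def passes_filter(spc_a, spc_b, p_value, alpha=0.05,
                  min_log2_fc=0.8, min_signal=4.0):
    """Post-test filter in the spirit of the abstract's recommendation:
    keep a protein only if it is significant, |log2 fold change| >= 0.8,
    and the more abundant condition reaches `min_signal` spectral counts.
    A pseudocount of 0.5 (an assumption here) avoids log of zero."""
    fc = math.log2((spc_a + 0.5) / (spc_b + 0.5))
    signal = max(spc_a, spc_b)
    return p_value < alpha and abs(fc) >= min_log2_fc and signal >= min_signal

# Large, significant change on a well-measured protein: kept.
kept = passes_filter(spc_a=20.0, spc_b=6.0, p_value=0.01)
# Significant p-value but tiny fold change: filtered out.
dropped_fc = passes_filter(spc_a=10.0, spc_b=9.0, p_value=0.01)
# Big fold change but near-noise signal level: filtered out.
dropped_signal = passes_filter(spc_a=3.0, spc_b=0.0, p_value=0.01)
```

    The second and third cases are exactly the false-positive classes the filter targets: statistically significant but biologically negligible changes, and large ratios built on counts too low to trust.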

  9. Kolmogorov-Smirnov statistical test for analysis of ZAP-70 expression in B-CLL, compared with quantitative PCR and IgV(H) mutation status.

    PubMed

    Van Bockstaele, Femke; Janssens, Ann; Piette, Anne; Callewaert, Filip; Pede, Valerie; Offner, Fritz; Verhasselt, Bruno; Philippé, Jan

    2006-07-15

    ZAP-70 has been proposed as a surrogate marker for immunoglobulin heavy-chain variable region (IgV(H)) mutation status, which is known as a prognostic marker in B-cell chronic lymphocytic leukemia (CLL). The flow cytometric analysis of ZAP-70 suffers from difficulties in standardization and interpretation. We applied the Kolmogorov-Smirnov (KS) statistical test to make analysis more straightforward. We examined ZAP-70 expression by flow cytometry in 53 patients with CLL. Analysis was performed as initially described by Crespo et al. (New England J Med 2003; 348:1764-1775) and alternatively by application of the KS statistical test comparing T cells with B cells. Receiver-operating-characteristics (ROC)-curve analyses were performed to determine the optimal cut-off values for ZAP-70 measured by the two approaches. ZAP-70 protein expression was compared with ZAP-70 mRNA expression measured by a quantitative PCR (qPCR) and with the IgV(H) mutation status. Both flow cytometric analyses correlated well with the molecular technique and proved to be of equal value in predicting the IgV(H) mutation status. Applying the KS test is reproducible, simple, straightforward, and overcomes a number of difficulties encountered in the Crespo-method. The KS statistical test is an essential part of the software delivered with modern routine analytical flow cytometers and is well suited for analysis of ZAP-70 expression in CLL. (c) 2006 International Society for Analytical Cytology.
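    The two-sample KS comparison of T-cell and B-cell ZAP-70 distributions reduces to the maximum vertical distance between two empirical CDFs. The sketch below implements that D statistic on simulated fluorescence values (the population parameters are invented); routine analysis would use a library implementation such as scipy.stats.ks_2samp, which also supplies a p-value.

```python
import numpy as np

def ks_statistic(x, y):
    """Two-sample Kolmogorov-Smirnov D statistic: the maximum vertical
    distance between the two empirical CDFs, evaluated at all data points."""
    x, y = np.sort(x), np.sort(y)
    grid = np.concatenate([x, y])
    cdf_x = np.searchsorted(x, grid, side='right') / len(x)
    cdf_y = np.searchsorted(y, grid, side='right') / len(y)
    return np.max(np.abs(cdf_x - cdf_y))

# Simulated log-fluorescence: in a ZAP-70-positive sample the B-cell
# distribution approaches the T-cell reference, so D is small; in a
# negative sample the distributions are far apart and D is large.
rng = np.random.default_rng(7)
t_cells = rng.normal(3.0, 0.4, 500)        # internal T-cell reference
b_cells_pos = rng.normal(2.9, 0.4, 500)    # ZAP-70-positive CLL B cells
b_cells_neg = rng.normal(1.5, 0.4, 500)    # ZAP-70-negative CLL B cells

d_pos = ks_statistic(t_cells, b_cells_pos)
d_neg = ks_statistic(t_cells, b_cells_neg)
```

    Using the whole distributions rather than a single positivity gate is what makes the KS approach less sensitive to the standardization problems the abstract mentions.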

  10. Mesophilic batch anaerobic co-digestion of fruit-juice industrial waste and municipal waste sludge: process and cost-benefit analysis.

    PubMed

    Hosseini Koupaie, E; Barrantes Leiva, M; Eskicioglu, C; Dutil, C

    2014-01-01

    The feasibility of anaerobic co-digestion of two juice-based beverage industrial wastes, screen cake (SC) and thickened waste activated sludge (TWAS), along with municipal sludge cake (MC) was investigated. Experiments were conducted in twenty mesophilic batch 160 mL serum bottles, with no inhibition observed. The statistical analysis proved that the substrate type had a statistically significant effect on both ultimate biogas and methane yields (P=0.0003<0.05). The maximum and minimum ultimate cumulative methane yields were 890.90 and 308.34 mL/g-VSremoved for the digesters containing only TWAS and only SC as substrate, respectively. A first-order reaction model described VS utilization well in all digesters. The first 2-day and 10-day specific biodegradation rate constants were statistically higher in the digesters containing SC (P=0.004<0.05) and MC (P=0.0005<0.05), respectively. The cost-benefit analysis showed that the capital, operating, and total costs can be decreased by 21.5%, 29.8%, and 27.6%, respectively, by using a co-digester rather than two separate digesters. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Illustrating Sampling Distribution of a Statistic: Minitab Revisited

    ERIC Educational Resources Information Center

    Johnson, H. Dean; Evans, Marc A.

    2008-01-01

    Understanding the concept of the sampling distribution of a statistic is essential for the understanding of inferential procedures. Unfortunately, this topic proves to be a stumbling block for students in introductory statistics classes. In efforts to aid students in their understanding of this concept, alternatives to a lecture-based mode of…
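    The sampling distribution of a statistic, the concept this entry targets, is straightforward to demonstrate by simulation rather than in Minitab: draw many samples, compute the statistic on each, and inspect the distribution of those values. The population and sample sizes below are arbitrary illustrative choices.

```python
import random
import statistics

def sampling_distribution_of_mean(population, n, reps=2000, seed=0):
    """Draw `reps` samples of size n (with replacement) and return the list
    of their sample means: an empirical sampling distribution."""
    rng = random.Random(seed)
    return [statistics.fmean(rng.choices(population, k=n)) for _ in range(reps)]

# A skewed population (exponential-like): the sampling distribution of the
# mean still centres on the population mean, and its spread shrinks with n.
rng = random.Random(123)
population = [rng.expovariate(1.0) for _ in range(10000)]
pop_mean = statistics.fmean(population)

means_n5 = sampling_distribution_of_mean(population, n=5)
means_n50 = sampling_distribution_of_mean(population, n=50)

sd_n5 = statistics.stdev(means_n5)
sd_n50 = statistics.stdev(means_n50)
```

    With n = 50 the spread of the simulated means is roughly sqrt(10) times smaller than with n = 5, which is the 1/sqrt(n) behaviour such a classroom demonstration aims to make visible.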

  12. Topical tranexamic acid in total knee replacement: a systematic review and meta-analysis.

    PubMed

    Panteli, Michalis; Papakostidis, Costas; Dahabreh, Ziad; Giannoudis, Peter V

    2013-10-01

    To examine the safety and efficacy of topical use of tranexamic acid (TA) in total knee arthroplasty (TKA). An electronic literature search of PubMed Medline, Ovid Medline, Embase, and the Cochrane Library was performed, identifying studies published in any language from 1966 to February 2013. The studies enrolled adults undergoing a primary TKA in which topical TA was used. An inverse variance statistical method and either a fixed- or random-effect model, depending on the absence or presence of statistical heterogeneity, were used; subgroup analysis was performed when possible. We identified a total of seven eligible reports for analysis. Our meta-analysis indicated that, compared with the control group, topical application of TA significantly limited postoperative drain output (mean difference: -268.36 ml), total blood loss (mean difference: -220.08 ml), and Hb drop (mean difference: -0.94 g/dL), and lowered the risk of transfusion requirements (risk ratio=0.47, 95% CI=0.26-0.84), without an increased risk of thromboembolic events. Sub-group analysis indicated that a higher dose of topical TA (>2 g) significantly reduced transfusion requirements. Although the present meta-analysis proved a statistically significant reduction of postoperative blood loss and transfusion requirements with topical use of TA in TKA, the clinical importance of the respective estimates of effect size should be interpreted with caution. Level of evidence: I, II. Copyright © 2013 Elsevier B.V. All rights reserved.
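    The inverse variance method named in the abstract weights each study by the reciprocal of its squared standard error. A minimal fixed-effect sketch follows; the effect sizes and standard errors are invented placeholders, not the meta-analysis data.

```python
import math

def inverse_variance_pool(effects, ses):
    """Fixed-effect inverse-variance pooling: each study contributes with
    weight 1/SE^2; returns the pooled effect and its standard error."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical per-study mean differences in drain output (ml) with
# standard errors; illustrative numbers only.
effects = [-250.0, -300.0, -220.0]
ses = [40.0, 60.0, 50.0]

pooled, pooled_se = inverse_variance_pool(effects, ses)
ci_low, ci_high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
```

    The pooled standard error is always smaller than that of the most precise single study, which is why pooling can reach significance when individual trials do not; a random-effects model would additionally inflate each SE by a between-study variance term when heterogeneity is present.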

  13. A graph theory approach to identify resonant and non-resonant transmission paths in statistical modal energy distribution analysis

    NASA Astrophysics Data System (ADS)

    Aragonès, Àngels; Maxit, Laurent; Guasch, Oriol

    2015-08-01

Statistical modal energy distribution analysis (SmEdA) extends classical statistical energy analysis (SEA) to the mid-frequency range by establishing power balance equations between modes in different subsystems. This circumvents the SEA requirement of modal energy equipartition and enables applying SmEdA to cases of low modal overlap and locally excited subsystems, as well as to complex heterogeneous subsystems. Yet, widening the range of application of SEA comes at the price of large models, because the number of modes per subsystem can become considerable as the frequency increases. Therefore, it would be worthwhile to have at one's disposal tools for quick identification and ranking of the resonant and non-resonant paths involved in modal energy transmission between subsystems. It will be shown that previously developed graph theory algorithms for transmission path analysis (TPA) in SEA can be adapted to SmEdA and prove useful for that purpose. The case of airborne transmission between two cavities separated by homogeneous and ribbed plates will first be addressed to illustrate the potential of the graph approach. A more complex case representing transmission between non-contiguous cavities in a shipbuilding structure will also be presented.

  14. Estimating size and scope economies in the Portuguese water sector using the Bayesian stochastic frontier analysis.

    PubMed

    Carvalho, Pedro; Marques, Rui Cunha

    2016-02-15

This study aims to search for economies of size and scope in the Portuguese water sector, applying Bayesian and classical statistics to make inference in stochastic frontier analysis (SFA). This study proves the usefulness and advantages of applying Bayesian statistics for making inference in SFA over traditional SFA, which relies solely on classical statistics. The resulting Bayesian methods allow overcoming some problems that arise in the application of traditional SFA, such as bias in small samples and skewness of residuals. In the present case study of the water sector in Portugal, these Bayesian methods provide more plausible and acceptable results. Based on the results obtained, we found that there are important economies of output density, economies of size, economies of vertical integration and economies of scope in the Portuguese water sector, pointing to the considerable advantages of undertaking mergers by joining the retail and wholesale components and by joining the drinking water and wastewater services. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Differential gene expression detection and sample classification using penalized linear regression models.

    PubMed

    Wu, Baolin

    2006-02-15

Differential gene expression detection and sample classification using microarray data have received much research interest recently. Owing to the large number of genes p and small number of samples n (p >> n), microarray data analysis poses big challenges for statistical analysis. An obvious problem owing to the 'large p, small n' setting is over-fitting: just by chance, we are likely to find some non-differentially expressed genes that can classify the samples very well. The idea of shrinkage is to regularize the model parameters to reduce the effects of noise and produce reliable inferences. Shrinkage has been successfully applied in microarray data analysis. The SAM statistics proposed by Tusher et al. and the 'nearest shrunken centroid' proposed by Tibshirani et al. are ad hoc shrinkage methods. Both methods are simple, intuitive and have proved useful in empirical studies. Recently Wu proposed penalized t/F-statistics with shrinkage by formally using L1-penalized linear regression models for two-class microarray data, showing good performance. In this paper we systematically discuss the use of penalized regression models for analyzing microarray data. We generalize the two-class penalized t/F-statistics proposed by Wu to multi-class microarray data. We formally derive the ad hoc shrunken centroid used by Tibshirani et al. using L1-penalized regression models, and we show that penalized linear regression models provide a rigorous and unified statistical framework for sample classification and differential gene expression detection.
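The 'nearest shrunken centroid' idea referenced above amounts to soft-thresholding class centroids toward the overall centroid, which is exactly the update an L1 penalty induces. A minimal sketch on toy data follows; the matrix, shift size and threshold are all invented for illustration:

```python
import numpy as np

def soft_threshold(d, delta):
    """Shrink each value toward zero by delta; values smaller than delta
    in magnitude vanish. This is the soft-thresholding operator that an
    L1 penalty produces."""
    return np.sign(d) * np.maximum(np.abs(d) - delta, 0.0)

# Toy expression matrix: rows = genes, columns = samples, two classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
X[:5, 5:] += 6.0                       # five truly differential genes
labels = np.array([0]*5 + [1]*5)

# Per-class centroid deviations from the overall centroid, then shrink.
overall = X.mean(axis=1)
diffs = np.stack([X[:, labels == k].mean(axis=1) - overall
                  for k in (0, 1)], axis=1)
shrunk = soft_threshold(diffs, delta=1.5)

# Genes with any surviving non-zero shrunken difference are called
# 'differential'; noise genes are thresholded away.
selected = np.flatnonzero(np.abs(shrunk).max(axis=1) > 0)
print(selected)
```

In the real method delta is chosen by cross-validation rather than fixed, and the deviations are standardized by per-gene pooled standard errors.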

  16. [Pathogenetic therapy of mastopathies in the prevention of breast cancer].

    PubMed

    Iaritsyn, S S; Sidorenko, L N

    1979-01-01

The breast cancer morbidity among the population of the city of Leningrad has been analysed. It was shown that there is a tendency toward an increasing number of breast cancer patients. In this respect, attention is given to the prophylactic measures accomplished at the Leningrad City oncological dispensary. As proved statistically, the pathogenetic therapy of mastopathy is a factor contributing to a lower risk of malignant transformation. For the statistical analysis the authors used the data of 132 breast cancer patients previously operated upon for local fibroadenomatosis, and the data of 259 control patients. It was found that among the patients with fibroadenomatosis who subsequently developed cancer of the mammary gland, the proportion of untreated patients was 2.8 times that in the control group.

  17. Significant Association of Urinary Toxic Metals and Autism-Related Symptoms—A Nonlinear Statistical Analysis with Cross Validation

    PubMed Central

    Adams, James; Kruger, Uwe; Geis, Elizabeth; Gehn, Eva; Fimbres, Valeria; Pollard, Elena; Mitchell, Jessica; Ingram, Julie; Hellmers, Robert; Quig, David; Hahn, Juergen

    2017-01-01

Introduction A number of previous studies examined a possible association of toxic metals and autism, and over half of those studies suggest that toxic metal levels are different in individuals with Autism Spectrum Disorders (ASD). Additionally, several studies found that those levels correlate with the severity of ASD. Methods In order to further investigate these points, this paper performs the most detailed statistical analysis to date of a data set in this field. First morning urine samples were collected from 67 children and adults with ASD and 50 neurotypical controls of similar age and gender. The samples were analyzed to determine the levels of 10 urinary toxic metals (UTM). Autism-related symptoms were assessed with eleven behavioral measures. Statistical analysis was used to distinguish participants on the ASD spectrum from neurotypical participants based upon the UTM data alone. The analysis also included examining the association of autism severity with toxic metal excretion data using linear and nonlinear analysis. "Leave-one-out" cross-validation was used to ensure statistical independence of results. Results and Discussion Average excretion levels of several toxic metals (lead, tin, thallium, antimony) were significantly higher in the ASD group. ASD classification using univariate statistics proved difficult due to large variability; nonlinear multivariate statistical analysis, however, significantly improved ASD classification, with Type I/II errors of 15% and 18%, respectively. These results clearly indicate that the urinary toxic metal excretion profiles of participants in the ASD group were significantly different from those of the neurotypical participants. Similarly, nonlinear methods determined a significantly stronger association between the behavioral measures and toxic metal excretion. 
The association was strongest for the Aberrant Behavior Checklist (including subscales on Irritability, Stereotypy, Hyperactivity, and Inappropriate Speech), but significant associations were found for UTM with all eleven autism-related assessments with cross-validation R2 values ranging from 0.12–0.48. PMID:28068407
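The "leave-one-out" cross-validation used above can be sketched generically: each sample is held out once, the model is refit on the remainder, and the held-out sample is scored, so every prediction is out-of-sample. The nearest-centroid classifier and synthetic two-group data below are illustrative stand-ins, not the study's actual method or data:

```python
import numpy as np

def loo_accuracy(X, y):
    """Leave-one-out CV for a nearest-centroid classifier: hold out each
    sample in turn, refit centroids on the rest, classify the held-out
    sample, and report the fraction classified correctly."""
    correct = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        Xtr, ytr = X[mask], y[mask]
        cents = {k: Xtr[ytr == k].mean(axis=0) for k in np.unique(ytr)}
        pred = min(cents, key=lambda k: np.linalg.norm(X[i] - cents[k]))
        correct += int(pred == y[i])
    return correct / len(y)

# Two synthetic groups in 4 dimensions, separated by a mean shift.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (20, 4)), rng.normal(2, 1, (20, 4))])
y = np.array([0]*20 + [1]*20)

acc = loo_accuracy(X, y)
print(f"leave-one-out accuracy: {acc:.2f}")
```

Because the held-out sample never influences the fitted centroids, the accuracy estimate is protected from the over-fitting that in-sample evaluation would allow.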

  18. Multivariate analysis for stormwater quality characteristics identification from different urban surface types in macau.

    PubMed

    Huang, J; Du, P; Ao, C; Ho, M; Lei, M; Zhao, D; Wang, Z

    2007-12-01

Statistical analysis of stormwater runoff data enables general identification of runoff characteristics. Six catchments with different urban surface types, including roofs, roadway, park, and residential/commercial, in Macau were selected for sampling and study during the period from June 2005 to September 2006. Based on univariate statistical analysis of the sampled data, the major pollutants discharged from the different urban surface types were identified. For iron roof runoff, Zn is the most significant pollutant. The major pollutants from urban roadway runoff are TSS and COD. Stormwater runoff from the commercial/residential and park catchments shows high levels of COD, TN, and TP concentration. Principal component analysis (PCA) was then performed to identify linkages between stormwater quality and urban surface types. Two potential pollution sources were identified for the study catchments: the first is referred to as nutrient losses, soil losses and organic pollutant discharges; the second is related to heavy metal losses. PCA proved to be a viable tool for explaining the types of pollution sources and their mechanisms for catchments with different urban surface types.
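A minimal sketch of PCA on standardized data, assuming a hypothetical runoff matrix driven by two latent sources (mirroring the nutrient vs. heavy-metal interpretation above); the determinand names and all values are invented:

```python
import numpy as np

rng = np.random.default_rng(2)
nutr = rng.normal(size=(30, 1))    # latent "nutrient/organic" source
metal = rng.normal(size=(30, 1))   # latent "heavy metal" source
# Five hypothetical determinands: three load on the nutrient source
# (e.g. TSS, COD, TN), two on the metal source (e.g. Zn, Pb).
X = np.hstack([nutr + 0.2 * rng.normal(size=(30, 1)) for _ in range(3)] +
              [metal + 0.2 * rng.normal(size=(30, 1)) for _ in range(2)])

# Standardize each determinand, then PCA via SVD of the data matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)
print(explained[:2])   # two components capture most of the variance
```

With two latent sources, the first two principal components absorb nearly all the variance, and the loadings in `Vt` group the determinands by source, which is the kind of interpretation the abstract draws.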

  19. Initial Adsorption of Fe on an Ethanol-Saturated Si(111)7 × 7 Surface: Statistical Analysis in Scanning Tunneling Microscopy

    NASA Astrophysics Data System (ADS)

    Yang, Haoyu; Hattori, Ken

    2018-03-01

    We studied the initial stage of iron deposition on an ethanol-saturated Si(111)7 × 7 surface at room temperature using scanning tunneling microscopy (STM). The statistical analysis of the Si adatom height at empty states for Si(111)-C2H5OH before and after the Fe deposition showed different types of adatoms: type B (before the deposition) and type B' (after the deposition) assigned to bare adatoms, type D and type D' to C2H5O-terminated adatoms, and type E' to adatoms with Fe. The analysis of the height distribution revealed the protection of the molecule termination for the Fe capture at the initial stage. The analysis also indicated the preferential capture of a single Fe atom to a bare center-adatom rather than a bare corner-adatom which remain after the C2H5OH saturation, but no selectivity was observed in faulted and unfaulted half unit-cells. This is the first STM-based report proving that a remaining bare adatom, but not a molecule-terminated adatom, captures a metal.

  20. Factor analysis in optimization of formulation of high content uniformity tablets containing low dose active substance.

    PubMed

    Lukášová, Ivana; Muselík, Jan; Franc, Aleš; Goněc, Roman; Mika, Filip; Vetchý, David

    2017-11-15

Warfarin is an intensively discussed drug with a narrow therapeutic range. There have been cases of bleeding attributed to varying content or altered quality of the active substance. Factor analysis is useful for finding suitable technological parameters leading to high content uniformity of tablets containing a low amount of active substance. The composition of the tabletting blend and the technological procedure were set with respect to factor analysis of previously published results. The correctness of the set parameters was checked by manufacturing and evaluating tablets containing 1-10 mg of warfarin sodium. The robustness of the suggested technology was checked by using a "worst case scenario" and statistical evaluation of European Pharmacopoeia (EP) content uniformity limits with respect to the Bergum division and process capability index (Cpk). To evaluate the quality of the active substance and tablets, a dissolution method was developed (water; EP apparatus II; 25 rpm), allowing for statistical comparison of dissolution profiles. The results obtained prove the suitability of factor analysis to optimize the composition with respect to previously manufactured batches, and thus the use of meta-analysis under industrial conditions is feasible. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Does daily nurse staffing match ward workload variability? Three hospitals' experiences.

    PubMed

    Gabbay, Uri; Bukchin, Michael

    2009-01-01

Nurse shortage and rising healthcare resource burdens mean that appropriate workforce use is imperative. This paper aims to evaluate whether daily nurse staffing meets ward workload needs. Nurse attendance and daily nurses' workload capacity in three hospitals were evaluated. Statistical process control was used to evaluate intra-ward nurse workload capacity and day-to-day variations. Statistical process control is a statistics-based method for process monitoring that uses charts with a predefined target measure and control limits. Standardization was performed for inter-ward analysis by converting ward-specific crude measures to ward-specific relative measures, dividing observed by expected values. Two charts, for acceptable and tolerable daily nurse workload intensity, were defined. Appropriate staffing indicators were defined as those exceeding predefined rates within acceptable and tolerable limits (50 percent and 80 percent respectively). A total of 42 percent of the overall days fell within acceptable control limits and 71 percent within tolerable control limits. Appropriate staffing indicators were met in only 33 percent of wards regarding acceptable nurse workload intensity and in only 45 percent of wards regarding tolerable workloads. The study did not differentiate crude nurse attendance, and it did not take into account patient severity, since crude bed occupancy was used. Double statistical process control charts and certain staffing indicators were used, which is open to debate. Wards that met appropriate staffing indicators prove the method's feasibility. Wards that did not meet appropriate staffing indicators prove the importance of, and the need for, process evaluations and monitoring. 
Methods presented for monitoring daily staffing appropriateness are simple to implement either for intra-ward day-to-day variation by using nurse workload capacity statistical process control charts or for inter-ward evaluation using standardized measure of nurse workload intensity. The real challenge will be to develop planning systems and implement corrective interventions such as dynamic and flexible daily staffing, which will face difficulties and barriers. The paper fulfils the need for workforce utilization evaluation. A simple method using available data for daily staffing appropriateness evaluation, which is easy to implement and operate, is presented. The statistical process control method enables intra-ward evaluation, while standardization by converting crude into relative measures enables inter-ward analysis. The staffing indicator definitions enable performance evaluation. This original study uses statistical process control to develop simple standardization methods and applies straightforward statistical tools. This method is not limited to crude measures, rather it uses weighted workload measures such as nursing acuity or weighted nurse level (i.e. grade/band).
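A statistical process control chart of the kind described above can be sketched as follows. The workload ratios are simulated and the Shewhart-style 3-sigma limits are a common default, not necessarily the limits the study used:

```python
import numpy as np

def control_limits(x, k=3.0):
    """Shewhart-style limits: centre line at the mean, lower/upper
    control limits at mean -/+ k sample standard deviations."""
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    return mu - k * sigma, mu + k * sigma

# Hypothetical standardized daily workload ratios (observed/expected),
# one value per day for a single ward.
rng = np.random.default_rng(3)
ratios = rng.normal(1.0, 0.1, size=60)

lcl, ucl = control_limits(ratios)
within = np.mean((ratios >= lcl) & (ratios <= ucl))
print(f"LCL={lcl:.2f}, UCL={ucl:.2f}, {within:.0%} of days in control")
```

Days falling outside the limits flag staffing that does not match workload; comparing the in-control fraction against a predefined rate (e.g. the 50/80 percent indicators above) gives the appropriateness verdict per ward.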

  2. Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis

    PubMed Central

    Steele, Joe; Bastola, Dhundy

    2014-01-01

    Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require an analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools, which stem from alignment-free methods based on statistical analysis from word frequencies. We provide several clear examples to demonstrate applications and the interpretations over several different areas of alignment-free analysis such as base–base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide detailed discussion and an example of analysis by Lempel–Ziv techniques from data compression. PMID:23904502
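A minimal sketch of the word-frequency approach discussed in the review: k-mer profiles compared by Euclidean distance. The sequences are invented, and this is only one of several alignment-free distances the review covers:

```python
from collections import Counter
import math

def kmer_profile(seq, k=3):
    """Relative frequency of each k-length word in the sequence."""
    counts = Counter(seq[i:i+k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def profile_distance(p, q):
    """Euclidean distance between two word-frequency profiles; no
    alignment is ever computed."""
    words = set(p) | set(q)
    return math.sqrt(sum((p.get(w, 0.0) - q.get(w, 0.0))**2 for w in words))

# Invented sequences: b is a near-copy of a, c is compositionally different.
a = "ATGCGATACGCTTGCATGCGA"
b = "ATGCGATACGCTAGCATGCGA"
c = "TTTTTTAAAAACCCCCGGGGG"

d_ab = profile_distance(kmer_profile(a), kmer_profile(b))
d_ac = profile_distance(kmer_profile(a), kmer_profile(c))
print(d_ab < d_ac)   # similar sequences are closer in word-frequency space
```

Because only word counts are compared, rearranged but compositionally similar sequences stay close, which is why such methods sidestep the synteny-related problems mentioned above.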

  3. Statistical analysis of 4 types of neck whiplash injuries based on classical meridian theory.

    PubMed

    Chen, Yemeng; Zhao, Yan; Xue, Xiaolin; Li, Hui; Wu, Xiuyan; Zhang, Qunce; Zheng, Xin; Wang, Tianfang

    2015-01-01

    As one component of the Chinese medicine meridian system, the meridian sinew (Jingjin, (see text), tendino-musculo) is specially described as being for acupuncture treatment of the musculoskeletal system because of its dynamic attributes and tender point correlations. In recent decades, the therapeutic importance of the sinew meridian has become revalued in clinical application. Based on this theory, the authors have established therapeutic strategies of acupuncture treatment in Whiplash-Associated Disorders (WAD) by categorizing four types of neck symptom presentations. The advantage of this new system is to make it much easier for the clinician to find effective acupuncture points. This study attempts to prove the significance of the proposed therapeutic strategies by analyzing data collected from a clinical survey of various WAD using non-supervised statistical methods, such as correlation analysis, factor analysis, and cluster analysis. The clinical survey data have successfully verified discrete characteristics of four neck syndromes, based upon the range of motion (ROM) and tender point location findings. A summary of the relationships among the symptoms of the four neck syndromes has shown the correlation coefficient as having a statistical significance (P < 0.01 or P < 0.05), especially with regard to ROM. Furthermore, factor and cluster analyses resulted in a total of 11 categories of general symptoms, which implies syndrome factors are more related to the Liver, as originally described in classical theory. The hypothesis of meridian sinew syndromes in WAD is clearly supported by the statistical analysis of the clinical trials. This new discovery should be beneficial in improving therapeutic outcomes.

  4. Statistical and Graphical Assessment of Circumferential and Radial Hardness Variation of AISI 4140, AISI 1020 and AA 6082 Aluminum Alloy

    PubMed Central

    Al-Khalid, Hamad; Alaskari, Ayman; Oraby, Samy

    2011-01-01

Hardness homogeneity of the commonly used structural ferrous and nonferrous engineering materials is of vital importance at the design stage; therefore, reliable information regarding material property homogeneity should be validated and any deviation should be addressed. In the current study, the hardness variation over a wide spectrum of radial locations of some ferrous and nonferrous structural engineering materials was investigated. Measurements were performed over both faces (cross-sections) of each stock bar according to a pre-specified stratified design, ensuring coverage of the entire area in both radial and circumferential directions. Additionally, the credibility of the apparatus and measuring procedures was examined through a statistically based calibration process of the hardness reference block. Statistical and response-surface graphical analyses are used to examine the nature, adequacy and significance of the measured hardness values. Calibration of the apparatus reference block proved the reliability of the measuring system, where no strong evidence was found against the stochastic nature of hardness measures over the various stratified locations. Also, outlier elimination procedures proved to be beneficial only at a few measured points. Hardness measurements showed a dispersion domain that is within the acceptable confidence interval. For AISI 4140 and AISI 1020 steels, hardness is found to have a slight decreasing trend as the diameter is reduced, while the opposite behavior is observed for AA 6082 aluminum alloy. However, no definite significant behavior was noticed regarding the effect of the sector sequence (circumferential direction). PMID:28817030

  5. Statistical and Graphical Assessment of Circumferential and Radial Hardness Variation of AISI 4140, AISI 1020 and AA 6082 Aluminum Alloy.

    PubMed

    Al-Khalid, Hamad; Alaskari, Ayman; Oraby, Samy

    2011-12-23

Hardness homogeneity of the commonly used structural ferrous and nonferrous engineering materials is of vital importance at the design stage; therefore, reliable information regarding material property homogeneity should be validated and any deviation should be addressed. In the current study, the hardness variation over a wide spectrum of radial locations of some ferrous and nonferrous structural engineering materials was investigated. Measurements were performed over both faces (cross-sections) of each stock bar according to a pre-specified stratified design, ensuring coverage of the entire area in both radial and circumferential directions. Additionally, the credibility of the apparatus and measuring procedures was examined through a statistically based calibration process of the hardness reference block. Statistical and response-surface graphical analyses are used to examine the nature, adequacy and significance of the measured hardness values. Calibration of the apparatus reference block proved the reliability of the measuring system, where no strong evidence was found against the stochastic nature of hardness measures over the various stratified locations. Also, outlier elimination procedures proved to be beneficial only at a few measured points. Hardness measurements showed a dispersion domain that is within the acceptable confidence interval. For AISI 4140 and AISI 1020 steels, hardness is found to have a slight decreasing trend as the diameter is reduced, while the opposite behavior is observed for AA 6082 aluminum alloy. However, no definite significant behavior was noticed regarding the effect of the sector sequence (circumferential direction).

  6. Using Pictures to Enhance Students' Understanding of Bayes' Theorem

    ERIC Educational Resources Information Center

    Trafimow, David

    2011-01-01

    Students often have difficulty understanding algebraic proofs of statistics theorems. However, it sometimes is possible to prove statistical theorems with pictures in which case students can gain understanding more easily. I provide examples for two versions of Bayes' theorem.
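Alongside pictorial proofs, a worked numerical instance of Bayes' theorem can help students; the textbook-style screening numbers below are hypothetical:

```python
# Classic screening example: prevalence P(D) = 0.01, sensitivity
# P(+|D) = 0.95, false-positive rate P(+|not D) = 0.05. Bayes' theorem
# gives the posterior P(D|+) -- far lower than the sensitivity.
p_d = 0.01
p_pos_given_d = 0.95
p_pos_given_not_d = 0.05

# Law of total probability for the denominator, then Bayes' theorem.
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)
posterior = p_pos_given_d * p_d / p_pos
print(round(posterior, 3))   # prints 0.161
```

The surprise that a 95%-sensitive test yields only a ~16% posterior is exactly the intuition gap that picture-based proofs (e.g. frequency grids) are meant to close.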

  7. Factors influencing job satisfaction in post-transition economies: the case of the Czech Republic.

    PubMed

    Čábelková, Inna; Abrhám, Josef; Strielkowski, Wadim

    2015-01-01

This paper presents an analysis of factors influencing job satisfaction in post-transition economies, using the example of the Czech Republic. Our research shows that women reported higher levels of job satisfaction than men. Education proved to be statistically significant for one of three indicators of job satisfaction. Personal income and workplace relationships proved to be positively and significantly related to all three indicators of job satisfaction. Most of the occupational dummies were significantly related to two out of three indicators of job satisfaction. In addition, we found that Czech entrepreneurs enjoy and value their jobs, which indicates strong self-selection for doing business in post-transition economies. However, human capital expressed by the level of education was a significant factor for job satisfaction, meaning that well-educated people might not be satisfied with their jobs or may feel that their education and experience are wasted in the market economy.

  8. From Constraints to Resolution Rules Part II : chains, braids, confluence and T&E

    NASA Astrophysics Data System (ADS)

    Berthier, Denis

In this Part II, we apply the general theory developed in Part I to a detailed analysis of the Constraint Satisfaction Problem (CSP). We show how specific types of resolution rules can be defined. In particular, we introduce the general notions of a chain and a braid. As in Part I, these notions are illustrated in detail with the Sudoku example - a problem known to be NP-complete and which is therefore typical of a broad class of hard problems. For Sudoku, we also show how far one can go in "approximating" a CSP with a resolution theory, and we give an empirical statistical analysis of how the various puzzles, corresponding to different sets of entries, can be classified along a natural scale of complexity. For any CSP, we also prove the confluence property of some resolution theories based on braids and show how it can be used to define different resolution strategies. Finally, we prove that, in any CSP, braids have the same solving capacity as Trial-and-Error (T&E) with no guessing, and we comment on this result in the Sudoku case.

  9. Functional data analysis on ground reaction force of military load carriage increment

    NASA Astrophysics Data System (ADS)

    Din, Wan Rozita Wan; Rambely, Azmin Sham

    2014-06-01

Analysis of ground reaction force (GRF) on military load carriage is done through the functional data analysis (FDA) statistical technique. The main objective of the research is to investigate the effect of a 10% load increment and to find the maximum suitable load for the Malaysian military. Ten military soldiers aged 31 ± 6.2 years, weighing 71.6 ± 10.4 kg and with a height of 166.3 ± 5.9 cm, carrying different military loads ranging from 0% body weight (BW) up to 40% BW, participated in an experiment to gather the GRF and kinematic data using a Vicon Motion Analysis System, Kistler force plates and thirty-nine body markers. The analysis is conducted in the sagittal, medial-lateral and anterior-posterior planes. The results show that a 10% BW load increment has an effect at heel strike and toe-off for all three planes analyzed, with P-values less than 0.001 at the 0.05 significance level. FDA proves to be one of the best statistical techniques for analyzing functional data. It has the ability to handle filtering, smoothing and curve aligning according to curve features and points of interest.

  10. Research on Visual Analysis Methods of Terrorism Events

    NASA Astrophysics Data System (ADS)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

Given that terrorism events occur more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of a government's governing ability. Visual analysis has become an important method of event analysis because of its intuitiveness and effectiveness. To analyse events' spatio-temporal distribution characteristics, correlations among event items, and development trends, terrorism events' spatio-temporal characteristics are discussed. A suitable event data table structure based on "5W" theory is designed. Then, six types of visual analysis are proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments have been carried out using data provided by the Global Terrorism Database, and the results prove the feasibility of the methods.

  11. Detailed Analysis of the Interoccurrence Time Statistics in Seismic Activity

    NASA Astrophysics Data System (ADS)

    Tanaka, Hiroki; Aizawa, Yoji

    2017-02-01

The interoccurrence time statistics of seismicity is studied theoretically as well as numerically by taking into account the conditional probability and the correlations among many earthquakes in different magnitude levels. It is known so far that the interoccurrence time statistics is well approximated by the Weibull distribution, but more detailed information about the interoccurrence times can be obtained from the analysis of the conditional probability. Firstly, we propose the Embedding Equation Theory (EET), in which the conditional probability is described by two kinds of correlation coefficients: one is the magnitude correlation and the other is the inter-event time correlation. Furthermore, the scaling law of each correlation coefficient is clearly determined from the numerical data analysis carried out with the Preliminary Determination of Epicenter (PDE) Catalog and the Japan Meteorological Agency (JMA) Catalog. Secondly, the EET is examined to derive the magnitude dependence of the interoccurrence time statistics, and the multi-fractal relation is successfully formulated. Theoretically we cannot prove the universality of the multi-fractal relation in seismic activity; nevertheless, the theoretical results reproduce all numerical data in our analysis well, where several common features or invariant aspects are clearly observed. Especially in the case of stationary ensembles the multi-fractal relation seems to obey an invariant curve, and furthermore in the case of non-stationary (moving time) ensembles for the aftershock regime the multi-fractal relation seems to satisfy a certain invariant curve at any moving time. It is emphasized that the multi-fractal relation plays an important role in unifying the statistical laws of seismicity: the Gutenberg-Richter law and the Weibull distribution are in fact unified in the multi-fractal relation, and some universality conjectures regarding seismicity are briefly discussed.
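The Weibull approximation of interoccurrence times mentioned above can be illustrated with a probability-plot shape estimate on simulated data; the parameters are invented and the estimator is a generic one, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(4)
shape, scale = 0.8, 10.0   # hypothetical Weibull parameters
# Inverse-transform sampling: T = scale * (-ln U)^(1/shape) has CDF
# F(t) = 1 - exp(-(t/scale)^shape), i.e. a Weibull distribution.
times = scale * (-np.log(rng.uniform(size=2000)))**(1.0 / shape)

# Weibull probability plot: ln(-ln(1-F)) vs ln(t) is linear with
# slope equal to the shape parameter, so a least-squares fit on the
# empirical CDF recovers it.
t_sorted = np.sort(times)
F = (np.arange(1, len(times) + 1) - 0.5) / len(times)
slope, intercept = np.polyfit(np.log(t_sorted), np.log(-np.log(1 - F)), 1)
print(f"estimated Weibull shape = {slope:.2f}")
```

A shape parameter below 1, as is typically reported for interoccurrence times, means the hazard decreases with elapsed time: the longer the quiescence, the longer the expected wait to the next event.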

  12. Identification of key micro-organisms involved in Douchi fermentation by statistical analysis and their use in an experimental fermentation.

    PubMed

    Chen, C; Xiang, J Y; Hu, W; Xie, Y B; Wang, T J; Cui, J W; Xu, Y; Liu, Z; Xiang, H; Xie, Q

    2015-11-01

    To screen and identify safe micro-organisms used during Douchi fermentation, and verify the feasibility of producing high-quality Douchi using these identified micro-organisms. PCR-denaturing gradient gel electrophoresis (DGGE) and automatic amino-acid analyser were used to investigate the microbial diversity and free amino acids (FAAs) content of 10 commercial Douchi samples. The correlations between microbial communities and FAAs were analysed by statistical analysis. Ten strains with significant positive correlation were identified. Then an experiment on Douchi fermentation by identified strains was carried out, and the nutritional composition in Douchi was analysed. Results showed that FAAs and relative content of isoflavone aglycones in verification Douchi samples were generally higher than those in commercial Douchi samples. Our study indicated that fungi, yeasts, Bacillus and lactic acid bacteria were the key players in Douchi fermentation, and with identified probiotic micro-organisms participating in fermentation, a higher quality Douchi product was produced. This is the first report to analyse and confirm the key micro-organisms during Douchi fermentation by statistical analysis. This work proves fermentation micro-organisms to be the key influencing factor of Douchi quality, and demonstrates the feasibility of fermenting Douchi using identified starter micro-organisms. © 2015 The Society for Applied Microbiology.

  13. Prove It! Putting Together the Evidence-Based Practice Puzzle

    ERIC Educational Resources Information Center

    Little, Hannah Byrd

    2015-01-01

    Why is it important to prove that school libraries add value to the school program? The National Center for Education Statistics reports that 20 percent of U.S. public schools lack a full or part-time certified librarian (NCES 2013). In California the ratio of certified school librarians to students is 1:7,374 (California Department of Education…

  14. Estimating the effects of harmonic voltage fluctuations on the temperature rise of squirrel-cage motors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emanuel, A.E.

    1991-03-01

    This article presents a preliminary analysis of the effect of randomly varying harmonic voltages on the temperature rise of squirrel-cage motors. The stochastic process of random variations of harmonic voltages is defined by means of simple statistics (mean, standard deviation, type of distribution). Computational models based on a first-order approximation of the motor losses and on the Monte Carlo method yield results which prove that equipment with a large thermal time constant can withstand, for short periods of time, distortions larger than THD = 5%.

  15. Impact of malicious servers over trust and reputation models in wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Verma, Vinod Kumar; Singh, Surinder; Pathak, N. P.

    2016-03-01

    This article deals with the impact of malicious servers on different trust and reputation models in wireless sensor networks. First, we analyse five trust and reputation models, namely BTRM-WSN, EigenTrust, PeerTrust, PowerTrust, and the linguistic fuzzy trust model. Further, we propose a wireless sensor network design for the optimisation of these models. Finally, the influence of malicious servers on the behaviour of the above-mentioned trust and reputation models is discussed. Statistical analysis has been carried out to prove the validity of our proposal.

  16. The effect of funding policy on day of week admissions and discharges in hospitals: the cases of Austria and Canada.

    PubMed

    Leonard, Kevin J; Rauner, Marion S; Schaffhauser-Linzatti, Michaela Maria; Yap, Richard

    2003-03-01

    This paper compares two different funding policies for inpatients, the case-based approach in Austria versus the global budgeting approach in Canada. It examines the impact of these funding policies on length of stay of inpatients as one key measure of health outcome. In our study, six major clinical categories for inpatients are selected in which the day of the week for admission is matched to the particular day of the week of discharge for each individual case. The strategic statistical analysis proves that funding policies have a significant impact on the expected length of stay of inpatients. For all six clinical categories, Austrian inpatients stayed longer in hospitals compared to Canadian inpatients. Moreover, inpatients were not admitted and discharged equally throughout the week. We also statistically prove for certain clinical categories that more inpatients are discharged on certain days such as Mondays or Fridays depending on the funding policy. Our study is unique in the literature and our conclusions indicate that, with the right incentives in place, the length of stay can be decreased and discharge anomalies can be eliminated, which ultimately leads to a decrease in healthcare expenditures and an increase in healthcare effectiveness.

  17. Statistical Significance for Hierarchical Clustering

    PubMed Central

    Kimes, Patrick K.; Liu, Yufeng; Hayes, D. Neil; Marron, J. S.

    2017-01-01

    Cluster analysis has proved to be an invaluable tool for the exploratory and unsupervised analysis of high-dimensional datasets. Among methods for clustering, hierarchical approaches have enjoyed substantial popularity in genomics and other fields for their ability to simultaneously uncover multiple layers of clustering structure. A critical and challenging question in cluster analysis is whether the identified clusters represent important underlying structure or are artifacts of natural sampling variation. Few approaches have been proposed for addressing this problem in the context of hierarchical clustering, for which the problem is further complicated by the natural tree structure of the partition, and the multiplicity of tests required to parse the layers of nested clusters. In this paper, we propose a Monte Carlo based approach for testing statistical significance in hierarchical clustering which addresses these issues. The approach is implemented as a sequential testing procedure guaranteeing control of the family-wise error rate. Theoretical justification is provided for our approach, and its power to detect true clustering structure is illustrated through several simulation studies and applications to two cancer gene expression datasets. PMID:28099990
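
    A minimal one-dimensional sketch of the Monte Carlo idea (illustrative only: the paper's actual procedure tests nested clusters in a dendrogram while controlling the family-wise error rate, and these function names are hypothetical): compare the strength of the best two-group split of the data against splits found in samples from a fitted null Gaussian.

```python
import random
import statistics

def cluster_index(xs):
    """Within-group SS of the best two-group split of sorted 1-D data,
    divided by the total SS; smaller values mean stronger 2-cluster structure."""
    xs = sorted(xs)
    mu = statistics.fmean(xs)
    tot = sum((x - mu) ** 2 for x in xs)

    def ss(group):
        m = statistics.fmean(group)
        return sum((x - m) ** 2 for x in group)

    return min(ss(xs[:i]) + ss(xs[i:]) for i in range(1, len(xs))) / tot

def mc_p_value(xs, n_sim=200, seed=0):
    """Monte Carlo p-value: how often a Gaussian fitted to the data produces
    two-cluster structure at least as strong as that observed."""
    rng = random.Random(seed)
    mu, sd = statistics.fmean(xs), statistics.stdev(xs)
    obs = cluster_index(xs)
    hits = sum(cluster_index([rng.gauss(mu, sd) for _ in xs]) <= obs
               for _ in range(n_sim))
    return (hits + 1) / (n_sim + 1)
```

    A clearly bimodal sample yields a small p-value; the add-one correction keeps the estimate strictly positive.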

  18. Forecasting the discomfort levels within the greater Athens area, Greece using artificial neural networks and multiple criteria analysis

    NASA Astrophysics Data System (ADS)

    Vouterakos, P. A.; Moustris, K. P.; Bartzokas, A.; Ziomas, I. C.; Nastos, P. T.; Paliatsos, A. G.

    2012-12-01

    In this work, artificial neural networks (ANNs) were developed and applied in order to forecast the discomfort levels due to the combination of high temperature and air humidity, during the hot season of the year, in eight different regions within the Greater Athens area (GAA), Greece. For the selection of the best type and architecture of ANN forecasting models, the multiple criteria analysis (MCA) technique was applied. Three different types of ANNs were developed and tested with the MCA method: the multilayer perceptron, the generalized feed forward networks (GFFN), and the time-lag recurrent networks. Results showed that the best performance was achieved by the GFFN model for the prediction of discomfort levels due to high temperature and air humidity within the GAA. For the evaluation of the constructed ANNs, appropriate statistical indices were used. The analysis proved that the forecasting ability of the developed ANN models is very satisfactory at a statistically significant level (p < 0.01).

  19. Introduction, comparison, and validation of Meta‐Essentials: A free and simple tool for meta‐analysis

    PubMed Central

    van Rhee, Henk; Hak, Tony

    2017-01-01

    We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932
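
    To make the underlying computation concrete, here is a minimal sketch of DerSimonian-Laird random-effects pooling (without the Knapp-Hartung adjustment that Meta-Essentials applies to the confidence interval); the function and variable names are illustrative, not Meta-Essentials code.

```python
def dersimonian_laird(effects, variances):
    """Pooled effect and between-study variance (tau^2) by the
    DerSimonian-Laird moment estimator."""
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)          # truncated at zero
    w_re = [1.0 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    return pooled, tau2
```

    Perfectly homogeneous studies give tau^2 = 0, and the pooling reduces to the fixed-effect answer.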

  20. A statistical test to show negligible trend

    Treesearch

    Philip M. Dixon; Joseph H.K. Pechmann

    2005-01-01

    The usual statistical tests of trend are inappropriate for demonstrating the absence of trend. This is because failure to reject the null hypothesis of no trend does not prove that null hypothesis. The appropriate statistical method is based on an equivalence test. The null hypothesis is that the trend is not zero, i.e., outside an a priori specified equivalence region...
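
    The equivalence-test logic can be sketched as two one-sided tests of a regression slope against a pre-specified margin ±delta. This is a simplified illustration (hypothetical names, and a normal critical value in place of the exact t quantile), not the authors' procedure.

```python
import math

def slope_and_se(xs, ys):
    """Least-squares slope and its standard error for simple linear regression."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    a = my - b * mx
    s2 = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys)) / (n - 2)
    return b, math.sqrt(s2 / sxx)

def negligible_trend(xs, ys, delta, z_crit=1.645):
    """Two one-sided tests of H0: |slope| >= delta vs H1: |slope| < delta.
    Rejecting both one-sided nulls demonstrates a negligible trend."""
    b, se = slope_and_se(xs, ys)
    return (b + delta) / se > z_crit and (delta - b) / se > z_crit
```

    Note the reversed burden of proof the record describes: failing to reject the usual no-trend null proves nothing, whereas here the null is that a non-negligible trend exists.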

  1. EMMPRIN Is an Independent Negative Prognostic Factor for Patients with Astrocytic Glioma

    PubMed Central

    Chen, Yu; Cai, Min; Dong, Hailong; Xiong, Lize

    2013-01-01

    Extracellular matrix metalloproteinase inducer (EMMPRIN), also known as CD147, is a member of the immunoglobulin superfamily that is present on the surface of tumor cells and stimulates adjacent fibroblasts to produce matrix metalloproteinases (MMPs). It has been proved to be associated with tumor invasion and metastasis in various human malignancies. In our study, the protein expression level of EMMPRIN in 306 cases of astrocytic glioma was investigated by immunohistochemistry assay. Statistical analysis was utilized to evaluate the association of EMMPRIN with the clinicopathological characteristics and prognosis of patients. It was proved that EMMPRIN protein expression was increased in glioma compared with that in normal brain tissue. Moreover, EMMPRIN immunohistochemical staining was correlated with WHO grade and Karnofsky performance score (KPS), as strong positive EMMPRIN staining was more frequently detected in glioma of advanced grade or low KPS score. It was also demonstrated that EMMPRIN could be an independent negative prognostic factor in glioma, as patients with glioma showing strong EMMPRIN staining tend to have a higher risk of death. These results proved that EMMPRIN is associated with the prognosis of glioma, which may also suggest a potential role for EMMPRIN in glioma management. PMID:23516431

  2. EMMPRIN is an independent negative prognostic factor for patients with astrocytic glioma.

    PubMed

    Tian, Li; Zhang, Yang; Chen, Yu; Cai, Min; Dong, Hailong; Xiong, Lize

    2013-01-01

    Extracellular matrix metalloproteinase inducer (EMMPRIN), also known as CD147, is a member of the immunoglobulin superfamily that is present on the surface of tumor cells and stimulates adjacent fibroblasts to produce matrix metalloproteinases (MMPs). It has been proved to be associated with tumor invasion and metastasis in various human malignancies. In our study, the protein expression level of EMMPRIN in 306 cases of astrocytic glioma was investigated by immunohistochemistry assay. Statistical analysis was utilized to evaluate the association of EMMPRIN with the clinicopathological characteristics and prognosis of patients. It was proved that EMMPRIN protein expression was increased in glioma compared with that in normal brain tissue. Moreover, EMMPRIN immunohistochemical staining was correlated with WHO grade and Karnofsky performance score (KPS), as strong positive EMMPRIN staining was more frequently detected in glioma of advanced grade or low KPS score. It was also demonstrated that EMMPRIN could be an independent negative prognostic factor in glioma, as patients with glioma showing strong EMMPRIN staining tend to have a higher risk of death. These results proved that EMMPRIN is associated with the prognosis of glioma, which may also suggest a potential role for EMMPRIN in glioma management.

  3. Central Limit Theorem for Exponentially Quasi-local Statistics of Spin Models on Cayley Graphs

    NASA Astrophysics Data System (ADS)

    Reddy, Tulasi Ram; Vadlamani, Sreekar; Yogeshwaran, D.

    2018-04-01

    Central limit theorems for linear statistics of lattice random fields (including spin models) are usually proven under suitable mixing conditions or quasi-associativity. Many interesting examples of spin models do not satisfy mixing conditions, and on the other hand, it does not seem easy to show a central limit theorem for local statistics via quasi-associativity. In this work, we prove general central limit theorems for local statistics and exponentially quasi-local statistics of spin models on discrete Cayley graphs with polynomial growth. Further, we supplement these results by proving similar central limit theorems for random fields on discrete Cayley graphs taking values in a countable space, but under the stronger assumptions of α-mixing (for local statistics) and exponential α-mixing (for exponentially quasi-local statistics). All our central limit theorems assume a suitable variance lower bound, like many others in the literature. We illustrate our general central limit theorem with specific examples of lattice spin models and statistics arising in computational topology, statistical physics and random networks. Examples of clustering spin models include quasi-associated spin models with fast decaying covariances like the off-critical Ising model, level sets of Gaussian random fields with fast decaying covariances like the massive Gaussian free field, and determinantal point processes with fast decaying kernels. Examples of local statistics include intrinsic volumes, face counts, and component counts of random cubical complexes, while exponentially quasi-local statistics include nearest neighbour distances in spin models and Betti numbers of sub-critical random cubical complexes.

  4. Nonclassical point of view of the Brownian motion generation via fractional deterministic model

    NASA Astrophysics Data System (ADS)

    Gilardi-Velázquez, H. E.; Campos-Cantón, E.

    In this paper, we present a dynamical system based on the Langevin equation without a stochastic term and using fractional derivatives that exhibits properties of Brownian motion, i.e. a deterministic model to generate Brownian motion is proposed. The stochastic process is replaced by considering an additional degree of freedom in the second-order Langevin equation. Thus, it is transformed into a system of three first-order linear differential equations; additionally, α-fractional derivatives are considered, which allow us to obtain better statistical properties. Switching surfaces are established as a part of the fluctuating acceleration. The final system of three α-order linear differential equations does not contain a stochastic term, so the system generates motion in a deterministic way. Nevertheless, from the time series analysis, we found that the behavior of the system exhibits statistical properties of Brownian motion, such as a linear growth in time of the mean square displacement and a Gaussian distribution. Furthermore, we use detrended fluctuation analysis to prove the Brownian character of this motion.
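
    One of the diagnostics mentioned, linear growth in time of the mean square displacement, is easy to check numerically. The sketch below does so for an ordinary stochastic Gaussian random walk as a reference case (not the paper's deterministic fractional system); the helper name is hypothetical.

```python
import random

def msd(track, lag):
    """Mean square displacement of a 1-D trajectory at the given lag."""
    n = len(track) - lag
    return sum((track[i + lag] - track[i]) ** 2 for i in range(n)) / n

# Reference trajectory: unit-variance Gaussian random walk.
rng = random.Random(42)
walk = [0.0]
for _ in range(20000):
    walk.append(walk[-1] + rng.gauss(0.0, 1.0))
```

    For Brownian motion the MSD grows linearly with lag, so doubling the lag should roughly double the MSD; the same check applied to the deterministic system is what the abstract refers to.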

  5. Statistical analysis of the pulse-coupled synchronization strategy for wireless sensor networks

    PubMed Central

    Wang, Yongqiang; Núñez, Felipe; Doyle, Francis J.

    2013-01-01

    Pulse-coupled synchronization is attracting increased attention in the sensor network community, yet its properties have not been fully investigated. Using statistical analysis, we prove analytically that by controlling the number of connections at each node, synchronization can be guaranteed for general pulse-coupled oscillators even in the presence of a refractory period. The approach does not require the initial phases to reside in half an oscillation cycle, which improves existing results. We also find that a refractory period can be strategically included to reduce idle listening at almost no sacrifice to the synchronization probability. Given that reduced idle listening leads to higher energy efficiency in the synchronization process, the strategically added refractory period makes the synchronization scheme appealing for cheap sensor nodes, where energy is a precious system resource. We also analyze pulse-coupled synchronization in the presence of unreliable communication links and obtain similar results. QualNet experimental results are given to confirm the effectiveness of the theoretical predictions. PMID:24324322
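
    A toy all-to-all model conveys the flavor of pulse-coupled synchronization. This is a deliberately simplified sketch, not the scheme analyzed in the paper: the refractory period, unreliable links, and the statistical analysis are all omitted, and the multiplicative phase-response rule below is an assumption chosen so that the toy actually synchronizes.

```python
def simulate(phases, eps=0.2, steps=3000, dt=0.01):
    """All-to-all pulse-coupled oscillators. Phases advance uniformly; when a
    node reaches the threshold 1.0 it fires and resets, and every other node's
    phase p jumps to p * (1 + eps). A node pushed past the threshold by the
    jump fires as well and is absorbed into the same firing group."""
    ph = list(phases)
    for _ in range(steps):
        ph = [p + dt for p in ph]
        if max(ph) >= 1.0:
            ph = [p if p >= 1.0 else p * (1.0 + eps) for p in ph]
            ph = [0.0 if p >= 1.0 else p for p in ph]
    return ph
```

    With this phase-dependent jump, two oscillators started out of phase are eventually absorbed into a common firing group and remain synchronized thereafter.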

  6. Statistical approach to partial equilibrium analysis

    NASA Astrophysics Data System (ADS)

    Wang, Yougui; Stanley, H. E.

    2009-04-01

    A statistical approach to market equilibrium and efficiency analysis is proposed in this paper. One factor that governs the exchange decisions of traders in a market, termed the willingness price, is highlighted and underpins the whole theory. The supply and demand functions are formulated as the distributions of the corresponding willing exchange over the willingness price. The laws of supply and demand can be derived directly from these distributions. The characteristics of the excess demand function are analyzed, and the necessary conditions for the existence and uniqueness of the equilibrium point of the market are specified. The rationing rates of buyers and sellers are introduced to describe the ratio of realized exchange to willing exchange, and their dependence on the market price is studied in the cases of shortage and surplus. The realized market surplus, which is the criterion of market efficiency, can be written as a function of the distributions of willing exchange and the rationing rates. With this approach we can strictly prove that a market is efficient in the state of equilibrium.

  7. Systematic analysis of coding and noncoding DNA sequences using methods of statistical linguistics

    NASA Technical Reports Server (NTRS)

    Mantegna, R. N.; Buldyrev, S. V.; Goldberger, A. L.; Havlin, S.; Peng, C. K.; Simons, M.; Stanley, H. E.

    1995-01-01

    We compare the statistical properties of coding and noncoding regions in eukaryotic and viral DNA sequences by adapting two tests developed for the analysis of natural languages and symbolic sequences. The data set comprises all 30 sequences of length above 50 000 base pairs in GenBank Release No. 81.0, as well as the recently published sequences of C. elegans chromosome III (2.2 Mbp) and yeast chromosome XI (661 Kbp). We find that for the three chromosomes we studied, the statistical properties of noncoding regions appear to be closer to those observed in natural languages than those of coding regions. In particular, (i) an n-tuple Zipf analysis of noncoding regions reveals a regime close to power-law behavior, whereas the coding regions show logarithmic behavior over a wide interval, and (ii) an n-gram entropy measurement shows that the noncoding regions have a lower n-gram entropy (and hence a larger "n-gram redundancy") than the coding regions. In contrast to the three chromosomes, we find that for vertebrates such as primates and rodents and for viral DNA, the difference between the statistical properties of coding and noncoding regions is not pronounced, and therefore the results of the analyses of the investigated sequences are less conclusive. After noting the intrinsic limitations of the n-gram redundancy analysis, we also briefly discuss the failure of the zeroth- and first-order Markovian models or simple nucleotide repeats to account fully for these "linguistic" features of DNA. Finally, we emphasize that our results by no means prove the existence of a "language" in noncoding DNA.
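
    The n-gram entropy measure used in the comparison can be sketched directly (an illustrative helper, not the authors' code): lower entropy means larger n-gram redundancy.

```python
import math
from collections import Counter

def ngram_entropy(seq, n):
    """Shannon entropy (bits) of the n-gram distribution of a symbol sequence."""
    counts = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```

    A highly repetitive sequence has lower n-gram entropy (more redundancy) than a varied one, which is the direction the paper reports for noncoding versus coding DNA.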

  8. Conceptual and statistical problems associated with the use of diversity indices in ecology.

    PubMed

    Barrantes, Gilbert; Sandoval, Luis

    2009-09-01

    Diversity indices, particularly the Shannon-Wiener index, have been used extensively in analyzing patterns of diversity at different geographic and ecological scales. These indices have serious conceptual and statistical problems which make comparisons of species richness or species abundances across communities nearly impossible. There is often no single statistical method that retains all the information needed to answer even a simple question. However, multivariate analyses, such as cluster analyses or multiple regressions, could be used instead of diversity indices. More complex multivariate analyses, such as Canonical Correspondence Analysis, provide very valuable information on the environmental variables associated with the presence and abundance of the species in a community. In addition, particular hypotheses associated with changes in species richness across localities, or changes in the abundance of one or a group of species, can be tested using univariate, bivariate, and/or rarefaction statistical tests. The rarefaction method has proved to be robust for standardizing all samples to a common size. Even the simplest method, reporting the number of species per taxonomic category, possibly provides more information than a diversity index value.
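
    The rarefaction method mentioned at the end has a closed form: the expected species richness in a random subsample of m individuals follows the classical hypergeometric rarefaction formula. A minimal sketch with illustrative names:

```python
from math import comb

def rarefied_richness(counts, m):
    """Expected number of species observed in a random subsample of m
    individuals drawn without replacement from the full sample.
    counts[i] is the abundance of species i."""
    n = sum(counts)
    # A species is missed with probability C(n - c, m) / C(n, m);
    # math.comb returns 0 when m > n - c, so such species are always seen.
    return sum(1 - comb(n - c, m) / comb(n, m) for c in counts)
```

    Subsampling the whole sample recovers every species, and a one-individual subsample contains exactly one species on average; this is what makes richness comparable across samples of different size.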

  9. Automatic Classification of Medical Text: The Influence of Publication Form

    PubMed Central

    Cole, William G.; Michael, Patricia A.; Stewart, James G.; Blois, Marsden S.

    1988-01-01

    Previous research has shown that within the domain of medical journal abstracts the statistical distribution of words is neither random nor uniform, but is highly characteristic. Many words are used mainly or solely by one medical specialty or when writing about one particular level of description. Due to this regularity of usage, automatic classification within journal abstracts has proved quite successful. The present research asks two further questions. It investigates whether this statistical regularity and automatic classification success can also be achieved in medical textbook chapters. It then goes on to see whether the statistical distribution found in textbooks is sufficiently similar to that found in abstracts to permit accurate classification of abstracts based solely on previous knowledge of textbooks. 14 textbook chapters and 45 MEDLINE abstracts were submitted to an automatic classification program that had been trained only on chapters drawn from a standard textbook series. Statistical analysis of the properties of abstracts vs. chapters revealed important differences in word use. Automatic classification performance was good for chapters, but poor for abstracts.

  10. Satyendranath Bose: Co-Founder of Quantum Statistics

    ERIC Educational Resources Information Center

    Blanpied, William A.

    1972-01-01

    Satyendranath Bose was the first to prove Planck's law by treating radiation as an ideal quantum gas. Einstein credited Bose for this first step in the development of quantum statistical mechanics. Bose did not realize the importance of his work, perhaps because of the peculiar academic setting in India under British rule. (PS)

  11. The Importance of Proving the Null

    PubMed Central

    Gallistel, C. R.

    2010-01-01

    Null hypotheses are simple, precise, and theoretically important. Conventional statistical analysis cannot support them; Bayesian analysis can. The challenge in a Bayesian analysis is to formulate a suitably vague alternative, because the vaguer the alternative is (the more it spreads out the unit mass of prior probability), the more the null is favored. A general solution is a sensitivity analysis: Compute the odds for or against the null as a function of the limit(s) on the vagueness of the alternative. If the odds on the null approach 1 from above as the hypothesized maximum size of the possible effect approaches 0, then the data favor the null over any vaguer alternative to it. The simple computations and the intuitive graphic representation of the analysis are illustrated by the analysis of diverse examples from the current literature. They pose 3 common experimental questions: (a) Are 2 means the same? (b) Is performance at chance? (c) Are factors additive? PMID:19348549
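
    Question (b), "Is performance at chance?", can be sketched with a binomial example: compare the likelihood of the data under the point null p = 0.5 with its average likelihood under a uniform alternative of adjustable half-width, and watch how the odds behave as the alternative narrows. This is an illustrative sketch of the sensitivity-analysis idea, not the paper's implementation; the names are hypothetical.

```python
import math

def binom_pmf(k, n, p):
    """Binomial probability of k successes in n trials."""
    return math.comb(n, k) * p ** k * (1.0 - p) ** (n - k)

def odds_for_null(k, n, half_width, steps=1000):
    """Odds for H0: p = 0.5 against p ~ Uniform(0.5 - a, 0.5 + a), a = half_width.
    The alternative's marginal likelihood is a midpoint-rule average."""
    a = half_width
    lo, width = 0.5 - a, 2.0 * a
    marginal = sum(binom_pmf(k, n, lo + (i + 0.5) * width / steps)
                   for i in range(steps)) / steps
    return binom_pmf(k, n, 0.5) / marginal
```

    For data at chance, the odds approach 1 from above as the half-width shrinks to 0, exactly the limiting behavior the sensitivity analysis inspects: a vaguer alternative is penalized for spreading prior mass over values the data rule out.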

  12. Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis.

    PubMed

    Bonham-Carter, Oliver; Steele, Joe; Bastola, Dhundy

    2014-11-01

    Modern sequencing and genome assembly technologies have provided a wealth of data which will soon require comparative analysis for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used, but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are also prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools which stem from alignment-free methods based on statistical analysis of word frequencies. We provide several clear examples to demonstrate applications and interpretations over several different areas of alignment-free analysis, such as base-base correlations, feature frequency profiles, compositional vectors, improved string composition and the D2 statistic. Additionally, we provide a detailed discussion and an example of analysis by Lempel-Ziv techniques from data compression. © The Author 2013. Published by Oxford University Press.
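
    In its basic form, the D2 statistic mentioned above is the inner product of the two sequences' k-mer (word) count vectors; a minimal sketch with illustrative names:

```python
from collections import Counter

def kmer_counts(seq, k):
    """Counts of each overlapping k-length word in the sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2(seq_a, seq_b, k):
    """Basic D2 statistic: inner product of the two k-mer count vectors."""
    ca, cb = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
    return sum(ca[w] * cb[w] for w in ca.keys() & cb.keys())
```

    Identical sequences share every word, while sequences over disjoint alphabets score zero; in practice D2 is usually centred and normalised before use, which this sketch omits.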

  13. Mathematical Analysis of Vehicle Delivery Scale of Bike-Sharing Rental Nodes

    NASA Astrophysics Data System (ADS)

    Zhai, Y.; Liu, J.; Liu, L.

    2018-04-01

    Aiming at the lack of scientific and reasonable judgment of vehicle delivery scale and the insufficient optimization of scheduling decisions, and based on features of bike-sharing usage, this paper analyses the applicability of a discrete-time, discrete-state Markov chain and proves it to be irreducible, aperiodic and positive recurrent. Based on this analysis, the paper concludes that the limit-state (steady-state) probability of the bike-sharing Markov chain exists and is independent of the initial probability distribution. The paper then analyses the difficulty of estimating the transition probability matrix parameters and of solving the system of linear equations in the traditional solution algorithm for the bike-sharing Markov chain. To improve feasibility, this paper proposes a "virtual two-node vehicle scale solution" algorithm, which treats all nodes other than the node to be solved as a single virtual node, and derives the transition probability matrix, the steady-state linear equations, and the computational methods for the steady-state scale, steady-state arrival time and scheduling decisions of the node to be solved. Finally, the paper evaluates the rationality and accuracy of the steady-state probability of the proposed algorithm by comparison with the traditional algorithm. By solving the steady-state scale of the nodes one by one, the proposed algorithm is shown to be highly feasible, because it lowers the computational difficulty and reduces the number of statistics required, which will help bike-sharing companies optimize the scale and scheduling of nodes.
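
    The steady-state distribution whose existence the paper establishes can be computed for any small chain by power iteration, since for an irreducible, aperiodic, positive recurrent chain the limit is independent of the starting distribution. This is a generic sketch, not the paper's virtual two-node algorithm:

```python
def steady_state(P, iters=500):
    """Stationary distribution of a finite Markov chain by power iteration.
    P is a row-stochastic transition matrix given as nested lists."""
    n = len(P)
    pi = [1.0 / n] * n          # any starting distribution converges
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi
```

    For example, a two-state chain with P = [[0.9, 0.1], [0.5, 0.5]] has stationary distribution (5/6, 1/6) regardless of where the iteration starts.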

  14. Fragment size distribution statistics in dynamic fragmentation of laser shock-loaded tin

    NASA Astrophysics Data System (ADS)

    He, Weihua; Xin, Jianting; Zhao, Yongqiang; Chu, Genbai; Xi, Tao; Shui, Min; Lu, Feng; Gu, Yuqiu

    2017-06-01

    This work investigates a geometric statistics method to characterize the size distribution of tin fragments produced in the laser shock-loaded dynamic fragmentation process. In the shock experiments, the ejecta from a tin sample with a V-shaped groove etched into its free surface are collected by a soft recovery technique. The produced fragments are then automatically detected with fine post-shot analysis techniques, including X-ray micro-tomography and an improved watershed method. To characterize the size distributions of the fragments, a theoretical random geometric statistics model based on Poisson mixtures is derived for the dynamic heterogeneous fragmentation problem, which yields a linear combination of exponential distributions. The experimental fragment size distributions of the laser shock-loaded tin sample are examined with the proposed theoretical model, and its fitting performance is compared with that of other state-of-the-art fragment size distribution models. The comparison proves that our proposed model provides a far more reasonable fit for the laser shock-loaded tin.

  15. Breast cancer statistics and prediction methodology: a systematic review and analysis.

    PubMed

    Dubey, Ashutosh Kumar; Gupta, Umesh; Jain, Sonal

    2015-01-01

    Breast cancer is a menacing cancer, primarily affecting women. Continuous research is ongoing into detecting breast cancer at an early stage, as the possibility of a cure in the early stages is high. This study has two main objectives: first, to establish statistics for breast cancer, and second, to identify methodologies that can be helpful in early-stage detection of breast cancer based on previous studies. The breast cancer statistics for incidence and mortality in the UK, US, India and Egypt were considered for this study. The findings show that the overall mortality rates of the UK and US have improved because of awareness, improved medical technology and screening, but in the case of India and Egypt the situation is less positive because of a lack of awareness. The methodological findings of this study suggest a combined framework based on data mining and evolutionary algorithms. It provides a strong bridge to improving the classification and detection accuracy of breast cancer data.

  16. Statistical Process Control in the Practice of Program Evaluation.

    ERIC Educational Resources Information Center

    Posavac, Emil J.

    1995-01-01

    A technique developed to monitor the quality of manufactured products, statistical process control (SPC), incorporates several features that may prove attractive to evaluators. This paper reviews the history of SPC, suggests how the approach can enrich program evaluation, and illustrates its use in a hospital-based example. (SLD)

  17. Finding the Root Causes of Statistical Inconsistency in Community Earth System Model Output

    NASA Astrophysics Data System (ADS)

    Milroy, D.; Hammerling, D.; Baker, A. H.

    2017-12-01

    Baker et al (2015) developed the Community Earth System Model Ensemble Consistency Test (CESM-ECT) to provide a metric for software quality assurance by determining statistical consistency between an ensemble of CESM outputs and new test runs. The test has proved useful for detecting statistical differences caused by compiler bugs and errors in physical modules. However, detection is only the necessary first step in finding the causes of statistical difference. The CESM is a vastly complex model comprising millions of lines of code, developed and maintained by a large community of software engineers and scientists. Any root cause analysis is correspondingly challenging. We propose a new capability for CESM-ECT: identifying the sections of code that cause statistical distinguishability. The first step is to discover CESM variables that cause CESM-ECT to classify new runs as statistically distinct, which we achieve via Randomized Logistic Regression. Next we use a tool developed to identify CESM components that define or compute the variables found in the first step. Finally, we employ the application Kernel GENerator (KGEN) created in Kim et al (2016) to detect fine-grained floating point differences. We demonstrate an example of the procedure and advance a plan to automate this process in our future work.

  18. QSAR and 3D QSAR of inhibitors of the epidermal growth factor receptor

    NASA Astrophysics Data System (ADS)

    Pinto-Bazurco, Mariano; Tsakovska, Ivanka; Pajeva, Ilza

    This article reports quantitative structure-activity relationships (QSAR) and 3D QSAR models of 134 structurally diverse inhibitors of the epidermal growth factor receptor (EGFR) tyrosine kinase. Free-Wilson analysis was used to derive the QSAR model. It identified the substituents in aniline, the polycyclic system, and the substituents at the 6- and 7-positions of the polycyclic system as the most important structural features. Comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) were used in the 3D QSAR modeling. The steric and electrostatic interactions proved the most important for the inhibitory effect. Both QSAR and 3D QSAR models led to consistent results. On the basis of the statistically significant models, new structures were proposed and their inhibitory activities were predicted.

  19. Comparison of algebraic and analytical approaches to the formulation of the statistical model-based reconstruction problem for X-ray computed tomography.

    PubMed

    Cierniak, Robert; Lorent, Anna

    2016-09-01

    The main aim of this paper is to investigate the conditioning-related properties of our originally formulated statistical model-based iterative approach to the problem of image reconstruction from projections and, in this manner, to demonstrate its superiority over approaches recently used by other authors. The reconstruction algorithm based on this conception uses maximum likelihood estimation with an objective adjusted to the probability distribution of signals measured by an X-ray computed tomography system with parallel-beam geometry. The analysis and experimental results presented here show that our analytical approach outperforms the referential algebraic methodology that is explored widely in the literature and exploited in various commercial implementations. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Energy transfer mechanism and probability analysis of submarine pipe laterally impacted by dropped objects

    NASA Astrophysics Data System (ADS)

    Liang, Jing; Yu, Jian-xing; Yu, Yang; Lam, W.; Zhao, Yi-yu; Duan, Jing-hui

    2016-06-01

    The energy transfer ratio is the basic factor determining the level of pipe damage during an impact between a dropped object and a submarine pipe. To study the energy transfer and damage mechanism of submarine pipes impacted by dropped objects, a series of experiments was designed and carried out. The effective yield strength is deduced to make the quasi-static analysis more reliable, and the normal distribution of the energy transfer ratio caused by lateral impact on pipes is established by statistical analysis of the experimental results based on the effective yield strength, which provides an experimental and theoretical basis for the risk analysis of submarine pipe systems impacted by dropped objects. Failure strains of the pipe material are confirmed by comparing experimental results with finite element simulation. In addition, impact contact area and impact time are shown to be the major factors influencing energy transfer by a sensitivity analysis of the finite element simulation.
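The risk-analysis use of a fitted normal distribution can be sketched as follows; the ratio values are invented for illustration, not the paper's measurements, and the exceedance threshold is a hypothetical design limit.

```python
import math
import numpy as np

# Hypothetical energy-transfer ratios from repeated drop tests (illustrative only).
ratios = np.array([0.52, 0.47, 0.61, 0.55, 0.49, 0.58, 0.44, 0.53, 0.57, 0.50])

mu = float(np.mean(ratios))             # sample mean of the ratio
sigma = float(np.std(ratios, ddof=1))   # sample standard deviation

def exceedance(x, mu, sigma):
    """P(ratio > x) under the fitted normal model, for risk screening."""
    return 0.5 * math.erfc((x - mu) / (sigma * math.sqrt(2.0)))

# Chance that the pipe absorbs more than 60% of the impact energy.
p_high = exceedance(0.60, mu, sigma)
```

A designer could feed such an exceedance probability into a dropped-object risk assessment alongside the impact-frequency estimate.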

  1. Evaluation and statistical judgement of neural responses to sinusoidal stimulation in cases with superimposed drift and noise.

    PubMed

    Jastreboff, P W

    1979-06-01

    Time histograms of neural responses evoked by sinusoidal stimulation often contain a slow drift and irregular noise that disturb Fourier analysis of these responses. Section 2 of this paper evaluates the extent to which a linear drift influences the Fourier analysis and develops a combined Fourier and linear regression analysis for detecting and correcting such a drift. The usefulness of this correction method is demonstrated for time histograms of actual eye movements and Purkinje cell discharges evoked by sinusoidal rotation of rabbits in the horizontal plane. In Section 3, the analysis of variance is adopted for estimating the probability that the response curve extracted by Fourier analysis arose randomly from noise. This method proved useful for avoiding false judgements as to whether the response curve was meaningful, particularly when the response was small relative to the contaminating noise.
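The combined Fourier and linear-regression idea can be sketched as one joint least-squares fit: estimating the harmonic at the stimulus frequency together with an offset and a linear trend, so the trend cannot bias the harmonic. The synthetic signal and parameter values below are illustrative assumptions, not the paper's data.

```python
import numpy as np

def harmonic_with_drift(t, y, f):
    """Jointly fit y ≈ a·cos(2πft) + b·sin(2πft) + c + d·t by least squares,
    so that a linear drift does not bias the response component at frequency f."""
    A = np.column_stack([np.cos(2 * np.pi * f * t), np.sin(2 * np.pi * f * t),
                         np.ones_like(t), t])
    (a, b, c, d), *_ = np.linalg.lstsq(A, y, rcond=None)
    amp = np.hypot(a, b)
    phase = np.arctan2(-b, a)   # response = amp·cos(2πft + phase)
    return amp, phase, d        # d is the estimated drift slope

# Synthetic "histogram": 2-unit response at 1 Hz, phase 0.3 rad, drift 0.5/s.
t = np.linspace(0.0, 5.0, 500)
rng = np.random.default_rng(0)
y = (2.0 * np.cos(2 * np.pi * 1.0 * t + 0.3) + 0.5 * t + 1.0
     + 0.05 * rng.normal(size=t.size))
amp, phase, slope = harmonic_with_drift(t, y, 1.0)
```

The recovered amplitude, phase, and slope should match the generating values, whereas a plain Fourier transform of the undetrended signal would mix the drift into the low-frequency components.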

  2. From the necessary to the possible: the genesis of the spin-statistics theorem

    NASA Astrophysics Data System (ADS)

    Blum, Alexander

    2014-12-01

    The spin-statistics theorem, which relates the intrinsic angular momentum of a single particle to the type of quantum statistics obeyed by a system of many such particles, is one of the central theorems in quantum field theory and the physics of elementary particles. It was first formulated in 1939/40 by Wolfgang Pauli and his assistant Markus Fierz. This paper discusses the developments that led up to this first formulation, starting from early attempts in the late 1920s to explain why charged matter particles obey Fermi-Dirac statistics, while photons obey Bose-Einstein statistics. It is demonstrated how several important developments paved the way from such general philosophical musings to a general (and provable) theorem, most notably the use of quantum field theory, the discovery of new elementary particles, and the generalization of the notion of spin. It is also discussed how the attempts to prove a spin-statistics connection were driven by Pauli from formal to more physical arguments, culminating in Pauli's 1940 proof. This proof was a major success for the beleaguered theory of quantum fields, and the methods Pauli employed proved essential for the renaissance of quantum field theory and the development of renormalization techniques in the late 1940s.

  3. Forgery at the Snite Museum of Art? Improving AMS Radiocarbon Dating at the University of Notre Dame

    NASA Astrophysics Data System (ADS)

    Troyer, Laura; Bagwell, Connor; Anderson, Tyler; Clark, Adam; Nelson, Austin; Skulski, Michael; Collon, Philippe

    2017-09-01

    The Snite Museum of Art recently obtained several donations of artifacts. Five of the pieces lack sufficient background information to prove authenticity and require further analysis to positively determine the artwork's age. One method to determine the artwork's age is radiocarbon dating via Accelerator Mass Spectrometry (AMS) performed at the University of Notre Dame's Nuclear Science Laboratory. Samples are prepared by combustion of a small amount of material and subsequent reduction to carbon into an iron powder matrix (graphitization). The graphitization procedure affects the maximum measurement rate, and a poor graphitization can be detrimental to the AMS measurement of the sample. Previous graphitization procedures resulted in a particle current too low or inconsistent to optimize AMS measurements. Thus, there was a desire to design and refine the graphitization system. The finalized process yielded physically darker samples and increased sample currents by two orders of magnitude. Additionally, the first testing of the samples was successful, yet analysis of the dates proved inconclusive. AMS measurements will be performed again to obtain better sampling statistics in the hopes of narrowing the reported date ranges. NSF and JINA-CEE.

  4. On the occurrence of hypothyroidism after radioiodine resection (in German)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farschidpur, D.; Meiisel, P.

    1963-02-01

    An attempt was made to determine the dose of radioiodine in the treatment of hyperthyroidism with reference to the administered millicurie and rep values. It proved desirable to administer no more than 8 mC and 30,000 rep. An analysis of patients treated between 1956 and 1961 revealed six permanent cases of hypothyroidism; each had been treated with a dose greater than the above. The occurrence of hypofunction in cases treated with a dose greater than the above is compared statistically with cases treated with smaller doses. A number of examples are discussed.

  5. Biomedical engineering tasks. [electrode development for electrocardiography and electroencephalography

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Electrocardiographic and vectorcardiographic bioinstrumentation work centered on the development of a new electrode system harness for Project Skylab. Evaluation of several silver electrode configurations demonstrated superior impedance and voltage performance for silver/silver chloride electrodes mounted flush by using a paste adhesive. A portable ECG processor has been designed and a breadboard unit has been built to sample ECG input data at a rate of 500 samples per second for arrhythmia detection. A small real-time display driver program has been developed for statistical analysis of selected QRS features. Engineering work on a sleep monitoring cap assembly continued.

  6. Mathematical Analysis of a Coarsening Model with Local Interactions

    NASA Astrophysics Data System (ADS)

    Helmers, Michael; Niethammer, Barbara; Velázquez, Juan J. L.

    2016-10-01

    We consider particles on a one-dimensional lattice whose evolution is governed by nearest-neighbor interactions where particles that have reached size zero are removed from the system. Concentrating on configurations with infinitely many particles, we prove existence of solutions under a reasonable density assumption on the initial data and show that the vanishing of particles and the localized interactions can lead to non-uniqueness. Moreover, we provide a rigorous upper coarsening estimate and discuss generic statistical properties as well as some non-generic behavior of the evolution by means of heuristic arguments and numerical observations.

  7. PROM and Labour Effects on Urinary Metabolome: A Pilot Study

    PubMed Central

    Meloni, Alessandra; Palmas, Francesco; Mereu, Rossella; Deiana, Sara Francesca; Fais, Maria Francesca; Mussap, Michele; Ragusa, Antonio; Pintus, Roberta; Fanos, Vassilios; Melis, Gian Benedetto

    2018-01-01

    Since pathologies and complications occurring during pregnancy and/or during labour may cause adverse outcomes for both newborns and mothers, there is growing interest in metabolomic applications in pregnancy research. Indeed, metabolomics has proved to be an efficient strategy for the description of several perinatal conditions. In particular, this study focuses on premature rupture of membranes (PROM) in pregnancy at term. For this project, urine samples were collected under three different clinical conditions: out of labour before PROM occurrence (Ph1), out of labour with PROM (Ph2), and during labour with PROM (Ph3). GC-MS analysis, followed by univariate and multivariate statistical analysis, was able to discriminate among the different classes, highlighting the metabolites most involved in the discrimination. PMID:29511388

  8. Proving Causation in Toxic Tort Claims: Will the Judiciary Bend?

    DTIC Science & Technology

    1991-04-01

    birth control drug Bendectin, the court had determined that statistically significant epidemiological proof that the drug was a teratogen was...during pregnancy, Bendectin, caused various deformities to their child's hands and feet. The plaintiffs attempted to prove teratogenicity (that the drug...other evidence besides that of epidemiological studies and were prepared to testify to a reasonable degree of medical certainty that Bendectin is

  9. Outbreak of resistant Acinetobacter baumannii- measures and proposal for prevention and control.

    PubMed

    Romanelli, Roberta Maia de Castro; Jesus, Lenize Adriana de; Clemente, Wanessa Trindade; Lima, Stella Sala Soares; Rezende, Edna Maria; Coutinho, Rosane Luiza; Moreira, Ricardo Luiz Fontes; Neves, Francelli Aparecida Cordeiro; Brás, Nelma de Jesus

    2009-10-01

    Acinetobacter baumannii colonization and infection, frequent in Intensive Care Unit (ICU) patients, is commonly associated with high morbidity and mortality. Several outbreaks due to multidrug-resistant (MDR) A. baumannii have been reported, but few of them in Brazil. This study aimed to identify risk factors associated with colonization and infection by MDR and carbapenem-resistant A. baumannii strains isolated from patients admitted to the adult ICU at HC/UFMG. A case-control study was performed from January 2007 to June 2008. Cases were defined as patients colonized or infected by MDR/carbapenem-resistant A. baumannii, and controls were patients without MDR/carbapenem-resistant A. baumannii isolation, in a 1:2 ratio. For statistical analysis, due to changes in infection control guidelines, infection criteria, and the notification process, this study was divided into two periods. During the first period analyzed, from January to December 2007, colonization or infection by MDR/carbapenem-resistant A. baumannii was associated with prior infection, invasive device utilization, prior carbapenem use, and clinical severity. In the multivariate analysis, prior infection and mechanical ventilation proved to be statistically significant risk factors. Carbapenem use showed a tendency towards a statistical association. During the second study period, from January to June 2008, variables with a significant association with MDR/carbapenem-resistant A. baumannii colonization/infection were catheter utilization, carbapenem and third-generation cephalosporin use, hepatic transplantation, and clinical severity. In the multivariate analysis, only CVC use showed a statistical difference. Carbapenem and third-generation cephalosporin use displayed a tendency to be risk factors. Infection control and prevention measures must focus on these risk factors, considering A. baumannii dissemination.

  10. Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory

    DTIC Science & Technology

    2016-05-12

    valued time series from a sample. (A practical algorithm to compute the estimator is a work in progress.) Third, finitely-valued spatial processes...mathematical statistics; time series; Markov chains; random...proved. Second, a statistical method is developed to estimate the memory depth of discrete-time and continuously-valued time series from a sample. (A

  11. Neandertal admixture in Eurasia confirmed by maximum-likelihood analysis of three genomes.

    PubMed

    Lohse, Konrad; Frantz, Laurent A F

    2014-04-01

    Although there has been much interest in estimating histories of divergence and admixture from genomic data, it has proved difficult to distinguish recent admixture from long-term structure in the ancestral population. Thus, recent genome-wide analyses based on summary statistics have sparked controversy about the possibility of interbreeding between Neandertals and modern humans in Eurasia. Here we derive the probability of full mutational configurations in nonrecombining sequence blocks under both admixture and ancestral structure scenarios. Dividing the genome into short blocks gives an efficient way to compute maximum-likelihood estimates of parameters. We apply this likelihood scheme to triplets of human and Neandertal genomes and compare the relative support for a model of admixture from Neandertals into Eurasian populations after their expansion out of Africa against a history of persistent structure in their common ancestral population in Africa. Our analysis allows us to conclusively reject a model of ancestral structure in Africa and instead reveals strong support for Neandertal admixture in Eurasia at a higher rate (3.4-7.3%) than suggested previously. Using analysis and simulations we show that our inference is more powerful than previous summary statistics and robust to realistic levels of recombination.

  12. Neandertal Admixture in Eurasia Confirmed by Maximum-Likelihood Analysis of Three Genomes

    PubMed Central

    Lohse, Konrad; Frantz, Laurent A. F.

    2014-01-01

    Although there has been much interest in estimating histories of divergence and admixture from genomic data, it has proved difficult to distinguish recent admixture from long-term structure in the ancestral population. Thus, recent genome-wide analyses based on summary statistics have sparked controversy about the possibility of interbreeding between Neandertals and modern humans in Eurasia. Here we derive the probability of full mutational configurations in nonrecombining sequence blocks under both admixture and ancestral structure scenarios. Dividing the genome into short blocks gives an efficient way to compute maximum-likelihood estimates of parameters. We apply this likelihood scheme to triplets of human and Neandertal genomes and compare the relative support for a model of admixture from Neandertals into Eurasian populations after their expansion out of Africa against a history of persistent structure in their common ancestral population in Africa. Our analysis allows us to conclusively reject a model of ancestral structure in Africa and instead reveals strong support for Neandertal admixture in Eurasia at a higher rate (3.4−7.3%) than suggested previously. Using analysis and simulations we show that our inference is more powerful than previous summary statistics and robust to realistic levels of recombination. PMID:24532731

  13. Identification of Intensity Ratio Break Points from Photon Arrival Trajectories in Ratiometric Single Molecule Spectroscopy

    PubMed Central

    Bingemann, Dieter; Allen, Rachel M.

    2012-01-01

    We describe a model-free statistical method to analyze dual-channel photon arrival trajectories from single-molecule spectroscopy and identify break points in the intensity ratio. Photons are binned with a short bin size to calculate the logarithm of the intensity ratio for each bin. Stochastic photon counting noise leads to a near-normal distribution of this logarithm, and the standard Student's t-test is used to find statistically significant changes in this quantity. In stochastic simulations we determine the significance threshold for the t-test's p-value at a given level of confidence. We test the method's sensitivity and accuracy, showing that the analysis reliably locates break points with significant changes in the intensity ratio, with little or no error, in realistic trajectories with large numbers of change points, while still identifying a large fraction of the frequent break points with small intensity changes. Based on these results we present an approach to estimate confidence intervals for the identified break point locations and recommend a bin size to choose for the analysis. The method proves powerful and reliable in the analysis of simulated and actual data of single-molecule reorientation in a glassy matrix. PMID:22837704
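The core of such a break-point search can be sketched model-free with numpy: bin the two channels, take the log intensity ratio with pseudo-counts, and scan candidate split points for the largest two-sample t statistic. The Poisson simulation, bin counts, and rate change below are illustrative assumptions; the published method additionally calibrates a p-value threshold from the Student t distribution.

```python
import numpy as np

def log_ratio(n_a, n_b):
    # 0.5 pseudo-counts keep the logarithm finite for empty bins
    return np.log((n_a + 0.5) / (n_b + 0.5))

def best_break(x, min_seg=5):
    """Return the split index that maximizes the two-sample (Welch) t
    statistic between the left and right segments of the binned log ratio."""
    best_i, best_t = None, 0.0
    for i in range(min_seg, len(x) - min_seg):
        left, right = x[:i], x[i:]
        se = np.sqrt(left.var(ddof=1) / len(left)
                     + right.var(ddof=1) / len(right))
        t = abs(left.mean() - right.mean()) / se
        if t > best_t:
            best_i, best_t = i, t
    return best_i, best_t

# Simulated photon counts: channel A brightens threefold at bin 60 of 120.
rng = np.random.default_rng(2)
n_a = np.concatenate([rng.poisson(20, 60), rng.poisson(60, 60)])
n_b = rng.poisson(20, 120)
i, t = best_break(log_ratio(n_a, n_b))
```

In practice the scan would be applied recursively to each segment until no split exceeds the calibrated significance threshold.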

  14. Some insight on censored cost estimators.

    PubMed

    Zhao, H; Cheng, Y; Bang, H

    2011-08-30

    Censored survival data analysis has been studied for many years. Yet the analysis of censored mark variables, such as medical cost, quality-adjusted lifetime, and repeated events, faces a unique challenge that makes standard survival analysis techniques invalid. Because of the 'informative' censoring embedded in censored mark variables, use of the Kaplan-Meier estimator (Journal of the American Statistical Association 1958; 53:457-481), as an example, will produce biased estimates. Innovative estimators have been developed in the past decade to handle this issue. Even though consistent estimators have been proposed, the formulations and interpretations of some estimators are less intuitive to practitioners. On the other hand, more intuitive estimators have been proposed, but their mathematical properties have not been established. In this paper, we prove the analytic identity between some estimators (a statistically motivated estimator and an intuitive estimator) for censored cost data. Efron (1967) made a similar investigation for censored survival data (between the Kaplan-Meier estimator and the redistribute-to-the-right algorithm). Therefore, we view our study as an extension of Efron's work to informatively censored data, so that our findings can be applied to other mark variables. Copyright © 2011 John Wiley & Sons, Ltd.
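Efron's identity for the survival case can be checked numerically: the mass the Kaplan-Meier estimator places at each event time equals the weight the redistribute-to-the-right algorithm assigns to it. A minimal sketch, assuming distinct follow-up times already sorted in ascending order:

```python
import numpy as np

def km_jumps(times, events):
    """Kaplan-Meier: probability mass assigned to each observation
    (zero for censored ones). Assumes distinct, ascending times."""
    n = len(times)
    S, jumps, at_risk = 1.0, np.zeros(n), n
    for i in range(n):
        if events[i]:
            new_S = S * (1.0 - 1.0 / at_risk)
            jumps[i] = S - new_S
            S = new_S
        at_risk -= 1
    return jumps

def rttr_weights(times, events):
    """Efron's redistribute-to-the-right: each censored observation's mass
    is shared equally among all strictly later observations."""
    n = len(times)
    w = np.full(n, 1.0 / n)
    for i in range(n):
        if not events[i] and i < n - 1:
            w[i + 1:] += w[i] / (n - i - 1)
            w[i] = 0.0
    return w

t = np.array([1.0, 2.0, 3.0, 4.0])
d = np.array([1, 0, 1, 1])   # the observation at t=2 is censored
jumps = km_jumps(t, d)
w = rttr_weights(t, d)
```

Here the censored quarter of mass at t=2 is split between t=3 and t=4, reproducing the Kaplan-Meier jumps exactly; the paper's contribution is proving the analogous identity for cost estimators, where censoring is informative.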

  15. HPTLC Determination of Artemisinin and Its Derivatives in Bulk and Pharmaceutical Dosage

    NASA Astrophysics Data System (ADS)

    Agarwal, Suraj P.; Ahuja, Shipra

    A simple, selective, accurate, and precise high-performance thin-layer chromatographic (HPTLC) method has been established and validated for the analysis of artemisinin and its derivatives (artesunate, artemether, and arteether) in bulk drugs and formulations. Artemisinin, artesunate, artemether, and arteether were separated on aluminum-backed silica gel 60 F254 plates with toluene:ethyl acetate (10:1), toluene:ethyl acetate:acetic acid (2:8:0.2), toluene:butanol (10:1), and toluene:dichloromethane (0.5:10) mobile phases, respectively. The detector response for concentrations between 100 and 600 ng/spot showed a good linear relationship, with r values of 0.9967, 0.9989, 0.9981, and 0.9989 for artemisinin, artesunate, artemether, and arteether, respectively. Statistical analysis proves that the method is precise, accurate, and reproducible and hence can be employed for routine analysis.
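The linearity check behind such an r value is an ordinary least-squares calibration line; a sketch with invented peak areas (the concentrations match the stated 100-600 ng/spot range, but the responses are illustrative, not the paper's data):

```python
import numpy as np

# Hypothetical calibration: peak areas at the six working levels (ng/spot).
conc = np.array([100, 200, 300, 400, 500, 600], dtype=float)
area = np.array([1250, 2410, 3660, 4800, 6050, 7210], dtype=float)

slope, intercept = np.polyfit(conc, area, 1)   # least-squares calibration line
r = np.corrcoef(conc, area)[0, 1]              # correlation coefficient

def predict(a):
    """Back-calculate concentration from a measured peak area."""
    return (a - intercept) / slope
```

An r value close to 1 over the working range is what justifies using the inverted line to quantify unknowns in routine analysis.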

  16. Introduction, comparison, and validation of Meta-Essentials: A free and simple tool for meta-analysis.

    PubMed

    Suurmond, Robert; van Rhee, Henk; Hak, Tony

    2017-12-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
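The random-effects computation the abstract names can be sketched directly: the DerSimonian-Laird estimate of the between-study variance, followed by the Knapp-Hartung variance for the pooled effect. The study effects and variances below are invented for illustration.

```python
import numpy as np

def dl_knapp_hartung(y, v):
    """DerSimonian-Laird tau^2 plus the Knapp-Hartung variance of the pooled
    random-effects estimate; the CI would use a t quantile with k-1 df."""
    k = len(y)
    w = 1.0 / v                                  # fixed-effect (inverse-variance) weights
    ybar = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - ybar) ** 2)              # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / c)           # DL between-study variance
    wstar = 1.0 / (v + tau2)                     # random-effects weights
    mu = np.sum(wstar * y) / np.sum(wstar)
    q = np.sum(wstar * (y - mu) ** 2) / (k - 1)  # Knapp-Hartung scale factor
    return mu, tau2, q / np.sum(wstar)

# Hypothetical study effect sizes and within-study variances.
y = np.array([0.2, 0.5, 0.4, 0.8, 0.1])
v = np.array([0.04, 0.09, 0.05, 0.16, 0.02])
mu, tau2, var_kh = dl_knapp_hartung(y, v)
```

The Knapp-Hartung adjustment replaces the usual normal-theory variance with this residual-scaled version, which generally widens the confidence interval when the studies are heterogeneous.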

  17. If You Build (and Moderate) It, They Will Come: The Smokefree Women Facebook Page

    PubMed Central

    2013-01-01

    This analysis explores the impact of modifying the Smokefree Women Facebook social media strategy, from primarily promoting resources to encouraging participation in communications about smoking cessation by posting user-generated content. Analyses were performed using data from the Smokefree Women Facebook page to assess the impact of the revised strategy on reach and engagement. Fan engagement increased 430%, and a strong and statistically significant correlation (P < .05) between the frequency of moderator posts and community engagement was observed. The reach of the page also increased by 420%. Our findings indicate that the strategy shift had a statistically significant and positive effect on the frequency of interactions on the Facebook page, providing an example of an approach that may prove useful for reaching and engaging users in online communities. Additional research is needed to assess the association between engagement in virtual communities and health behavior outcomes. PMID:24395993

  18. [Changes in the serum lipid spectrum after meals in girls of different somatotypes].

    PubMed

    Fefelova, Iu A

    2010-01-01

    State Educational Institution for Professional Education - Prof. Voyno-Yasenetzkiy's High School of Krasnoyarsk State Medical Academy of the Russian Public Health Ministry. We analyzed the changes in the spectrum of neutral lipids and phospholipids in blood serum in response to meals in girls of different somatotypes. We found a statistically significant decrease in fatty acid content in representatives of all examined somatotypes after meals. A statistically significant increase in the easily oxidized phospholipid fractions in girls of sub-athletic and athletic somatotypes indicates a change in the ratio of the dynamic components of the lipid spectrum of lipoproteins. The balanced phospholipid fractions, as well as free cholesterol, are the main structural components of lipoprotein membranes, and they did not change in any of the studied somatotypes in response to meals. This proves the stability of the membrane structure of lipoprotein complexes in response to this physiological stimulus.

  19. Development of chemistry attitudes and experiences questionnaire (CAEQ)

    NASA Astrophysics Data System (ADS)

    Dalgety, Jacinta; Coll, Richard K.; Jones, Alister

    2003-09-01

    In this article we describe the development of the Chemistry Attitudes and Experiences Questionnaire (CAEQ), which measures first-year university chemistry students' attitude toward chemistry, chemistry self-efficacy, and learning experiences. The instrument was developed as part of a larger study and sought to fulfill a need for an instrument to investigate factors that influence student enrollment choice. We set out to design the instrument in a manner that would maximize construct validity. The CAEQ was piloted with a cohort of science and technology students (n = 129) at the end of their first year. Based on statistical analysis the instrument was modified and subsequently administered on two occasions at two tertiary institutions (n = 669). Statistical data along with additional data gathered from interviews suggest that the CAEQ possesses good construct validity and will prove a useful tool for tertiary-level educators who wish to gain an understanding of factors that influence student choice of chemistry enrollment.

  20. Analyzing Carbohydrate-Protein Interaction Based on Single Plasmonic Nanoparticle by Conventional Dark Field Microscopy.

    PubMed

    Jin, Hong-Ying; Li, Da-Wei; Zhang, Na; Gu, Zhen; Long, Yi-Tao

    2015-06-10

    We demonstrated a practical method to analyze carbohydrate-protein interactions on single plasmonic nanoparticles by conventional dark-field microscopy (DFM). The protein concanavalin A (ConA) was modified on large gold nanoparticles (AuNPs), and dextran was conjugated on small AuNPs. As the interaction between ConA and dextran coupled the two kinds of gold nanoparticles together, which coupled their plasmonic oscillations, apparent color changes (from green to yellow) of the single AuNPs were observed through DFM. The color information was then instantly transformed into a statistical peak-wavelength distribution in less than 1 min by a self-developed statistical program (nanoparticleAnalysis). In addition, the interaction between ConA and dextran was shown to be biospecific. This high-throughput, real-time approach is a convenient method to analyze carbohydrate-protein interactions efficiently at the single-nanoparticle level.

  1. Progress toward openness, transparency, and reproducibility in cognitive neuroscience.

    PubMed

    Gilmore, Rick O; Diaz, Michele T; Wyble, Brad A; Yarkoni, Tal

    2017-05-01

    Accumulating evidence suggests that many findings in psychological science and cognitive neuroscience may prove difficult to reproduce; statistical power in brain imaging studies is low and has not improved recently; software errors in analysis tools are common and can go undetected for many years; and, a few large-scale studies notwithstanding, open sharing of data, code, and materials remain the rare exception. At the same time, there is a renewed focus on reproducibility, transparency, and openness as essential core values in cognitive neuroscience. The emergence and rapid growth of data archives, meta-analytic tools, software pipelines, and research groups devoted to improved methodology reflect this new sensibility. We review evidence that the field has begun to embrace new open research practices and illustrate how these can begin to address problems of reproducibility, statistical power, and transparency in ways that will ultimately accelerate discovery. © 2017 New York Academy of Sciences.

  2. If you build (and moderate) it, they will come: the Smokefree Women Facebook page.

    PubMed

    Post, Samantha D; Taylor, Shani C; Sanders, Amy E; Goldfarb, Jeffrey M; Hunt, Yvonne M; Augustson, Erik M

    2013-12-01

    This analysis explores the impact of modifying the Smokefree Women Facebook social media strategy, from primarily promoting resources to encouraging participation in communications about smoking cessation by posting user-generated content. Analyses were performed using data from the Smokefree Women Facebook page to assess the impact of the revised strategy on reach and engagement. Fan engagement increased 430%, and a strong and statistically significant correlation (P < .05) between the frequency of moderator posts and community engagement was observed. The reach of the page also increased by 420%. Our findings indicate that the strategy shift had a statistically significant and positive effect on the frequency of interactions on the Facebook page, providing an example of an approach that may prove useful for reaching and engaging users in online communities. Additional research is needed to assess the association between engagement in virtual communities and health behavior outcomes.

  3. Enabling High-Energy, High-Voltage Lithium-Ion Cells: Standardization of Coin-Cell Assembly, Electrochemical Testing, and Evaluation of Full Cells

    DOE PAGES

    Long, Brandon R.; Rinaldo, Steven G.; Gallagher, Kevin G.; ...

    2016-11-09

    Coin-cells are often the test format of choice for laboratories engaged in battery research and development, as they provide a convenient platform for rapid testing of new materials on a small scale. However, obtaining reliable, reproducible data via the coin-cell format is inherently difficult, particularly in the full-cell configuration. In addition, statistical evaluation to prove the consistency and reliability of such data is often neglected. Herein we report on several studies aimed at formalizing physical process parameters and coin-cell construction related to full cells. Statistical analysis and performance benchmarking approaches are advocated as a means to more confidently track changes in cell performance. Finally, we show that trends in the electrochemical data obtained from coin-cells can be reliable and informative when standardized approaches are implemented in a consistent manner.

  4. Histological and profilometric evaluation of the root surface after instrumentation with a new piezoelectric device - ex vivo study.

    PubMed

    Silva, D; Martins, O; Matos, S; Lopes, P; Rolo, T; Baptista, I

    2015-05-01

    An ex vivo model was designed to profilometrically and histologically assess root changes resulting from scaling with a new ultrasonic device, designed for piezoelectric bone surgery, in comparison with curettes. Three groups of 10 periodontally hopeless teeth were each subjected to a different root instrumentation: Gracey curettes (CUR); ultrasonic piezoelectric device, Perio 100% setting, level 8 (P100); and ultrasonic piezoelectric device, Surg 50% setting, level 1 (S50). After extraction, all teeth were photographed to visually assess the presence of dental calculus. The treated root surfaces were profilometrically evaluated (Ra, Rz, Rmax). Undecalcified histological sections were prepared to assess qualitative changes in cementum thickness. Statistical analysis was carried out using a one-way ANOVA test at a 95% confidence level. Both instruments proved to be effective in the complete removal of calculus. The CUR group presented the lowest Ra [2.28 μm (±0.58)] and S50 the highest [3.01 μm (±0.61)]. No statistically significant differences were detected among the three groups for Ra, Rz, and Rmax. Histologically, there was a reduction in cementum thickness in all groups, which was greater and more irregular in the S50 group. Within the limits of this study, there were no statistically significant differences in the roughness parameters analyzed between curettes and the ultrasonic piezoelectric unit. This new instrument removes a smaller amount of cementum, mainly at the Perio 100% power setting, which appears to be the least damaging. The ultrasonic device is effective in calculus removal, proving to be as effective as curettes. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
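The group comparison described above is a one-way ANOVA across the three instrumentation groups; its F statistic can be sketched with numpy (the p-value would come from the F distribution with the returned degrees of freedom). The toy readings below are invented to show the mechanics, not the study's Ra data.

```python
import numpy as np

def one_way_anova(*groups):
    """F statistic and degrees of freedom for a one-way ANOVA
    (ratio of between-group to within-group mean squares)."""
    data = [np.asarray(g, dtype=float) for g in groups]
    grand = np.concatenate(data).mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in data)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in data)
    df_b = len(data) - 1
    df_w = sum(len(g) for g in data) - len(data)
    return (ss_between / df_b) / (ss_within / df_w), df_b, df_w

# Three toy groups (e.g. Ra readings for CUR, P100, S50), invented values.
F, df_b, df_w = one_way_anova([1.0, 2.0, 3.0], [2.0, 3.0, 4.0], [3.0, 4.0, 5.0])
```

If F falls below the critical value of the F(df_b, df_w) distribution at the chosen level, the group means are declared not significantly different, as the study reports for Ra, Rz, and Rmax.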

  5. Length and Rate of Individual Participation in Various Activities on Recreation Sites and Areas

    Treesearch

    Gary L. Tyre; George A. James

    1971-01-01

    While statistically reliable methods exist for estimating recreation use on large areas, they often prove prohibitively expensive. Inexpensive alternatives involving the length and rate of individual participation in specific activities are presented, together with data and statistics on the recreational use of three large areas in the National Forests. This...

  6. Nuclear magnetic resonance (NMR)-based metabolomics for cancer research.

    PubMed

    Ranjan, Renuka; Sinha, Neeraj

    2018-05-07

    Nuclear magnetic resonance (NMR) has emerged as an effective tool in various spheres of biomedical research, amongst which metabolomics is an important method for the study of various types of disease. Metabolomics has proved its stronghold in cancer research by the development of different NMR methods over time for the study of metabolites, thus identifying key players in the aetiology of cancer. A plethora of one-dimensional and two-dimensional NMR experiments (in solids, semi-solids and solution phases) are utilized to obtain metabolic profiles of biofluids, cell extracts and tissue biopsy samples, which can further be subjected to statistical analysis. Any alteration in the assigned metabolite peaks gives an indication of changes in metabolic pathways. These defined changes demonstrate the utility of NMR in the early diagnosis of cancer and provide further measures to combat malignancy and its progression. This review provides a snapshot of the trending NMR techniques and the statistical analysis involved in the metabolomics of diseases, with emphasis on advances in NMR methodology developed for cancer research. Copyright © 2018 John Wiley & Sons, Ltd.

  7. Oil, gas field growth projections: Wishful thinking or reality?

    USGS Publications Warehouse

    Attanasi, E.D.; Mast, R.F.; Root, D.H.

    1999-01-01

    The observed 'field growth' for the period from 1992 through 1996 is compared with the US Geological Survey's (USGS) predicted field growth for the same period. Known recovery of a field is defined as the sum of past cumulative field production and the field's proved reserves. Proved reserves are estimated quantities of hydrocarbons that geologic and engineering data demonstrate, with reasonable certainty, to be recoverable from known fields under existing economic and operating conditions. Proved reserve estimates calculated with this definition are typically conservative. The modeling approach used by the USGS to characterize the 'field growth' phenomenon is statistical rather than geologic in nature.

  8. Fractal geometry as a new approach for proving nanosimilarity: a reflection note.

    PubMed

    Demetzos, Costas; Pippa, Natassa

    2015-04-10

    Nanosimilars are considered new medicinal products combining a generic drug with the nanocarrier as an innovative excipient, in order to evaluate them as final products. They belong to the grey area - concerning the evaluation process - between generic drugs and biosimilar medicinal products. Generic drugs are well documented and a huge number of them are on the market, effectively replacing off-patent drugs. The scientific approach for releasing them to the market is based on bioequivalence studies, which are well documented and accepted by the regulatory agencies. On the other hand, the structural complexity of biological/biotechnology-derived products demands a new approach for the approval process, taking into consideration that bioequivalence studies are not considered sufficient, as they are for generic drugs, and new clinical trials are needed to support the product's approval for the market. Similarly, owing to the technological complexity of nanomedicines, the approaches for proving statistical identity or similarity (for generic and biosimilar products, respectively) with the prototypes are not considered effective for nanosimilar products. The aim of this note is to propose a complementary approach that can provide realistic evidence concerning nanosimilarity, based on fractal analysis. This approach fits well with the structural complexity of nanomedicines and smooths the difficulties of proving similarity between off-patent and nanosimilar products. Fractal analysis could be considered an approach that completely characterizes the physicochemical/morphological characteristics of nanosimilar products and could be proposed as a starting point for a deeper discussion on nanosimilarity. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. On Fluctuations of Eigenvalues of Random Band Matrices

    NASA Astrophysics Data System (ADS)

    Shcherbina, M.

    2015-10-01

    We consider the fluctuations of linear eigenvalue statistics of random band matrices whose entries have the form with i.i.d. possessing the th moment, where the function u has a finite support , so that M has only nonzero diagonals. The parameter b (called the bandwidth) is assumed to grow with n in a way such that . Without any additional assumptions on the growth of b, we prove a CLT for linear eigenvalue statistics for a rather wide class of test functions. Thus we improve and generalize the results of the previous papers (Jana et al., arXiv:1412.2445; Li et al., Random Matrices 2:04, 2013), where the CLT was proven under the assumption . Moreover, we develop a method which allows one to prove automatically the CLT for linear eigenvalue statistics of smooth test functions for almost all classical models of random matrix theory: deformed Wigner and sample covariance matrices, sparse matrices, diluted random matrices, matrices with heavy tails, etc.

  10. Descriptive and inferential statistical methods used in burns research.

    PubMed

    Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars

    2010-05-01

    Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case-control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The six most common tests used (Student's t-test (53%), analysis of variance/covariance (33%), χ² test (27%), Wilcoxon and Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43 (88%) and exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques. Copyright 2009 Elsevier Ltd and ISBI. All rights reserved.
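The descriptive quantities the survey tallies most often (mean, SD, standard error of the mean) can be computed with Python's standard library alone; the sample values below are invented for illustration, not drawn from the study:

```python
import math
import statistics

# Hypothetical sample: length of hospital stay (days) for ten burn patients.
stays = [12, 9, 15, 22, 7, 11, 18, 14, 10, 16]

mean = statistics.mean(stays)        # central tendency
sd = statistics.stdev(stays)         # sample standard deviation (dispersion)
sem = sd / math.sqrt(len(stays))     # standard error of the mean

print(f"mean={mean:.1f}  SD={sd:.2f}  SEM={sem:.2f}")
```

Reporting SD describes the spread of the data themselves, while SEM describes the precision of the estimated mean; the survey's finding that many articles quote SEM where dispersion is intended makes the distinction worth keeping in mind.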

  11. The Strengths and Weaknesses of Logic Formalisms to Support Mishap Analysis

    NASA Technical Reports Server (NTRS)

    Johnson, C. W.; Holloway, C. M.

    2002-01-01

    The increasing complexity of many safety-critical systems poses new problems for mishap analysis. Techniques developed in the sixties and seventies cannot easily scale up to analyze incidents involving tightly integrated software and hardware components. Similarly, the realization that many failures have systemic causes has widened the scope of many mishap investigations. Organizations, including NASA and the NTSB, have responded by starting research and training initiatives to ensure that their personnel are well equipped to meet these challenges. One strand of research has identified a range of mathematically based techniques that can be used to reason about the causes of complex, adverse events. The proponents of these techniques have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. Mathematical proofs can reduce the bias that is often perceived to affect the interpretation of adverse events. Others have opposed the introduction of these techniques by identifying social and political aspects of incident investigation that cannot easily be reconciled with a logic-based approach. Traditional theorem-proving mechanisms cannot accurately capture the wealth of inductive, deductive and statistical forms of inference that investigators routinely use in their analysis of adverse events. This paper summarizes some of the benefits that logics provide, describes their weaknesses, and proposes a number of directions for future research.

  12. Design of an image encryption scheme based on a multiple chaotic map

    NASA Astrophysics Data System (ADS)

    Tong, Xiao-Jun

    2013-07-01

    In order to solve the problems that chaos degenerates in limited computer precision and that the Cat map has a small key space, this paper presents a chaotic map based on topological conjugacy, whose chaotic characteristics are proved by the Devaney definition. In order to produce a large key space, a Cat map named the block Cat map is also designed for the permutation process, based on multiple-dimensional chaotic maps. The image encryption algorithm is based on permutation-substitution, and each key is controlled by different chaotic maps. Entropy analysis, differential analysis, weak-key analysis, statistical analysis, cipher randomness analysis, and cipher sensitivity analysis depending on key and plaintext are introduced to test the security of the new image encryption scheme. Through comparison of the proposed scheme with the AES, DES and Logistic encryption methods, we come to the conclusion that the image encryption method solves the problem of the low precision of one-dimensional chaotic functions and has higher speed and higher security.

  13. The importance of proving the null.

    PubMed

    Gallistel, C R

    2009-04-01

    Null hypotheses are simple, precise, and theoretically important. Conventional statistical analysis cannot support them; Bayesian analysis can. The challenge in a Bayesian analysis is to formulate a suitably vague alternative, because the vaguer the alternative is (the more it spreads out the unit mass of prior probability), the more the null is favored. A general solution is a sensitivity analysis: compute the odds for or against the null as a function of the limit(s) on the vagueness of the alternative. If the odds on the null approach 1 from above as the hypothesized maximum size of the possible effect approaches 0, then the data favor the null over any vaguer alternative to it. The simple computations and the intuitive graphic representation of the analysis are illustrated by the analysis of diverse examples from the current literature. They pose three common experimental questions: (a) Are two means the same? (b) Is performance at chance? (c) Are factors additive? © 2009 APA, all rights reserved
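The sensitivity analysis described above can be sketched for the simplest case of a normal mean with known standard error: compute the odds on the point null mu = 0 against a uniform alternative on [-a, a] as a function of the vagueness limit a. All numbers below are illustrative assumptions, not values from the paper:

```python
from statistics import NormalDist

def odds_for_null(xbar, se, a):
    """Bayes factor BF01: point null mu = 0 versus a uniform prior on [-a, a]."""
    null_lik = NormalDist(0.0, se).pdf(xbar)          # likelihood of the data under the null
    post = NormalDist(xbar, se)
    alt_lik = (post.cdf(a) - post.cdf(-a)) / (2 * a)  # marginal likelihood under the vague prior
    return null_lik / alt_lik

# Sensitivity analysis: odds on the null as the alternative becomes vaguer.
for a in (0.5, 1.0, 2.0, 5.0):
    print(f"a={a}: odds on null = {odds_for_null(xbar=0.1, se=0.2, a=a):.2f}")
```

The vaguer the alternative (larger a), the more the prior mass is spread over implausible effect sizes and the more the null is favored, which is exactly the behavior the abstract describes.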

  14. Brownian motion or Lévy walk? Stepping towards an extended statistical mechanics for animal locomotion.

    PubMed

    Gautestad, Arild O

    2012-09-07

    Animals moving under the influence of spatio-temporal scaling and long-term memory generate a kind of space-use pattern that has proved difficult to model within a coherent theoretical framework. An extended kind of statistical mechanics is needed, accounting for both the effects of spatial memory and scale-free space use, and placed in the context of ecological conditions. Simulations illustrating the distinction between scale-specific and scale-free locomotion are presented. The results show how observational scale (the time lag between relocations of an individual) may critically influence the interpretation of the underlying process. In this respect, a novel protocol is proposed as a method to distinguish between some main movement classes. For example, the 'power law in disguise' paradox - arising from a composite Brownian motion consisting of a superposition of independent movement processes at different scales - may be resolved by shifting the focus from pattern analysis at one particular temporal resolution towards a more process-oriented approach involving several scales of observation. A more explicit consideration of system complexity within a statistical mechanical framework, supplementing the more traditional mechanistic modelling approach, is advocated.

  15. Development of Software for Automatic Analysis of Intervention in the Field of Homeopathy.

    PubMed

    Jain, Rajesh Kumar; Goyal, Shagun; Bhat, Sushma N; Rao, Srinath; Sakthidharan, Vivek; Kumar, Prasanna; Sajan, Kannanaikal Rappayi; Jindal, Sameer Kumar; Jindal, Ghanshyam D

    2018-05-01

    To study the effect of homeopathic medicines (in higher potencies) in normal subjects, Peripheral Pulse Analyzer (PPA) has been used to record physiologic variability parameters before and after administration of the medicine/placebo in 210 normal subjects. Data have been acquired in seven rounds; placebo was administered in rounds 1 and 2 and medicine in potencies 6, 30, 200, 1 M, and 10 M was administered in rounds 3 to 7, respectively. Five different medicines in the said potencies were given to a group of around 40 subjects each. Although processing of data required human intervention, a software application has been developed to analyze the processed data and detect the response to eliminate the undue delay as well as human bias in subjective analysis. This utility named Automatic Analysis of Intervention in the Field of Homeopathy is run on the processed PPA data and the outcome has been compared with the manual analysis. The application software uses adaptive threshold based on statistics for detecting responses in contrast to fixed threshold used in manual analysis. The automatic analysis has detected 12.96% higher responses than subjective analysis. Higher response rates have been manually verified to be true positive. This indicates robustness of the application software. The automatic analysis software was run on another set of pulse harmonic parameters derived from the same data set to study cardiovascular susceptibility and 385 responses were detected in contrast to 272 of variability parameters. It was observed that 65% of the subjects, eliciting response, were common. This not only validates the software utility for giving consistent yield but also reveals the certainty of the response. This development may lead to electronic proving of homeopathic medicines (e-proving).

  16. Expected p-values in light of an ROC curve analysis applied to optimal multiple testing procedures.

    PubMed

    Vexler, Albert; Yu, Jihnhee; Zhao, Yang; Hutson, Alan D; Gurevich, Gregory

    2017-01-01

    Many statistical studies report p-values for inferential purposes. In several scenarios, the stochastic aspect of p-values is neglected, which may contribute to drawing wrong conclusions in real data experiments. The stochastic nature of p-values makes it difficult to use them to examine the performance of given testing procedures or associations between investigated factors. We turn our focus to the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we prove that the EPV can be considered in the context of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. The ROC-based framework provides a new and efficient methodology for investigating and constructing statistical decision-making procedures, including: (1) evaluation and visualization of properties of the testing mechanisms, considering, e.g., partial EPVs; (2) development of optimal tests via the minimization of EPVs; (3) creation of novel methods for optimally combining multiple test statistics. We demonstrate that the proposed EPV-based approach allows us to maximize the integrated power of testing algorithms with respect to various significance levels. In an application, we use the proposed method to construct the optimal test and analyze a myocardial infarction disease dataset. We outline the usefulness of the "EPV/ROC" technique for evaluating different decision-making procedures and their constructions and properties, with an eye towards practical applications.
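For a one-sided z-test, the EPV under a fixed alternative has a closed form that reflects its ROC interpretation (the EPV equals the probability that the null statistic exceeds the alternative statistic). A quick seeded simulation, with an assumed effect size, illustrates the agreement:

```python
import random
from statistics import NormalDist

Phi = NormalDist().cdf
random.seed(42)

delta = 1.0        # assumed standardized effect size under the alternative (illustrative)
n_sim = 20000

# p-value of a one-sided z-test for each draw of the test statistic under H1
pvals = [1 - Phi(random.gauss(delta, 1)) for _ in range(n_sim)]
epv = sum(pvals) / n_sim

# Closed form: EPV = P(Z0 > Z1) = Phi(-delta / sqrt(2)) for independent unit-variance normals
closed = Phi(-delta / 2 ** 0.5)
print(f"simulated EPV={epv:.3f}  closed form={closed:.3f}")
```

A smaller EPV indicates a better-performing test: under the alternative, its p-values concentrate closer to zero.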

  17. A comparative evaluation of efficacy of protaper universal rotary retreatment system for gutta-percha removal with or without a solvent.

    PubMed

    Kumar, M Sita Ram; Sajjan, Girija S; Satish, Kalyan; Varma, K Madhu

    2012-09-01

    The aim was to evaluate and compare the efficacy of the ProTaper Universal rotary retreatment system, with or without solvent, and stainless steel hand files for removing endodontic filling material from root canals, and also to compare the retreatment time for each system. Thirty extracted mandibular premolars with single straight canals were endodontically treated. Teeth were divided into three groups of 10 specimens each. Obturating material was removed in group 1 by stainless steel hand files with RC Solve, in group 2 by ProTaper Universal retreatment instruments, and in group 3 by ProTaper Universal retreatment instruments along with RC Solve. Retreatment was considered complete for all groups when no filling material was observed on the instruments. The retreatment time was recorded for each tooth. All specimens were grooved longitudinally in a buccolingual direction. The split halves were examined under a stereomicroscope and images were captured and analyzed. The remaining filling debris area ratios were considered for statistical analysis. ANOVA showed no statistically significant difference in the amount of filling remnants between the groups (P > 0.05). The differences between group means for retreatment time were statistically significant. Irrespective of the technique used, all specimens had some remnants on the root canal wall. ProTaper Universal retreatment system files alone proved to be faster than the other experimental groups.

  18. Extreme Statistics of Storm Surges in the Baltic Sea

    NASA Astrophysics Data System (ADS)

    Kulikov, E. A.; Medvedev, I. P.

    2017-11-01

    Statistical analysis of the extreme values of the Baltic Sea level has been performed on series of observations spanning 15-125 years at 13 tide gauge stations. It is shown that the empirical relation between the value of extreme sea level rises or ebbs (caused by storm events) and their return period in the Baltic Sea can be well approximated by the Gumbel probability distribution. The maximum values of extreme floods/ebbs of 100-year recurrence were observed in the Gulf of Finland and the Gulf of Riga. The two longest data series, observed in Stockholm and Vyborg over 125 years, showed a significant deviation from the Gumbel distribution for the rarest events. Statistical analysis of the hourly sea level data series reveals some asymmetry in the variability of the Baltic Sea level: the probability of rises proved higher than that of ebbs, and the magnitude of the 100-year recurrence surge considerably exceeded the magnitude of ebbs almost everywhere. This asymmetry effect can be attributed to the influence of low atmospheric pressure during storms. A statistical study of extreme values has also been applied to sea level series for Narva over the period 1994-2000, simulated by the ROMS numerical model. Comparisons of the "simulated" and "observed" extreme sea level distributions show that the model reproduces extreme floods of "moderate" magnitude quite satisfactorily; however, it underestimates sea level changes for the most powerful storm surges.
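A Gumbel fit of the kind used above turns directly into return-level estimates: the level exceeded on average once every T years solves F(x) = 1 - 1/T. The location and scale parameters below are hypothetical, not the study's fitted values:

```python
import math

def gumbel_return_level(mu, beta, T):
    """Level exceeded on average once every T years under a Gumbel fit,
    solving F(x) = 1 - 1/T for the CDF F(x) = exp(-exp(-(x - mu) / beta))."""
    return mu - beta * math.log(-math.log(1 - 1 / T))

# Hypothetical location (mu) and scale (beta), in cm, for annual sea level maxima.
mu, beta = 80.0, 25.0
for T in (10, 50, 100):
    print(f"{T}-year return level: {gumbel_return_level(mu, beta, T):.1f} cm")
```

Because the Gumbel quantile grows roughly linearly in log T, doubling the return period adds a nearly constant increment to the estimated level, which is what makes the distribution convenient for extrapolating to rare events.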

  19. A novel bi-level meta-analysis approach: applied to biological pathway analysis.

    PubMed

    Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin

    2016-02-01

    The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present real challenges in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework with classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as with a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases: acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors in correctly identifying pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors (sorin@wayne.edu). Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
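Two of the p-value combination methods the abstract compares, Stouffer's and the additive method with its CLT approximation, can be sketched in a few lines (the p-values are illustrative, not from the paper's datasets):

```python
import math
from statistics import NormalDist

Phi = NormalDist().cdf
Phi_inv = NormalDist().inv_cdf

def stouffer(pvals):
    """Stouffer's method: combine one-sided p-values via normal quantiles."""
    z = sum(Phi_inv(1 - p) for p in pvals) / math.sqrt(len(pvals))
    return 1 - Phi(z)

def additive(pvals):
    """Additive method: under H0 the sum of k uniform p-values has mean k/2 and
    variance k/12, so the CLT gives an approximate combined p-value."""
    k = len(pvals)
    z = (k / 2 - sum(pvals)) / math.sqrt(k / 12)
    return 1 - Phi(z)

pvals = [0.04, 0.10, 0.03, 0.20, 0.07]
print(f"Stouffer: {stouffer(pvals):.4f}  additive: {additive(pvals):.4f}")
```

The additive statistic depends only on the sum of the p-values, with no transform that diverges near 0 or 1, which is one intuition for its relative robustness to a single outlying study.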

  20. Molecular Analysis of Date Palm Genetic Diversity Using Random Amplified Polymorphic DNA (RAPD) and Inter-Simple Sequence Repeats (ISSRs).

    PubMed

    El Sharabasy, Sherif F; Soliman, Khaled A

    2017-01-01

    The date palm is an ancient domesticated plant with great diversity that has been cultivated in the Middle East and North Africa for at least 5000 years. Date palm cultivars are classified, based on fruit moisture content, as dry, semidry, and soft dates. A number of biochemical and molecular techniques are available for characterization of date palm variation. This chapter focuses on the DNA-based marker techniques random amplified polymorphic DNA (RAPD) and inter-simple sequence repeats (ISSR), in addition to biochemical markers based on isozyme analysis. These techniques, coupled with appropriate statistical tools, have proved useful for determining phylogenetic relationships among date palm cultivars and provide information resources for date palm gene banks.

  1. Feynman graphs and the large dimensional limit of multipartite entanglement

    NASA Astrophysics Data System (ADS)

    Di Martino, Sara; Facchi, Paolo; Florio, Giuseppe

    2018-01-01

    In this paper, we extend the analysis of multipartite entanglement, based on techniques from classical statistical mechanics, to a system composed of n d-level parties (qudits). We introduce a suitable partition function at a fictitious temperature with the average local purity of the system as Hamiltonian. In particular, we analyze the high-temperature expansion of this partition function, prove the convergence of the series, and study its asymptotic behavior as d → ∞. We make use of a diagrammatic technique, classify the graphs, and study their degeneracy. We are thus able to evaluate their contributions and estimate the moments of the distribution of the local purity.

  2. Localized rosuvastatin via implantable bioerodible sponge and its potential role in augmenting bone healing and regeneration.

    PubMed

    Ibrahim, Howida Kamal; Fahmy, Rania Hassan

    2016-11-01

    Statins have shown potential bone-healing properties. Rosuvastatin is a synthetic, hydrophilic, potent and highly efficacious statin. In the current work, an attempt was made to develop and evaluate various bioerodible composite sponges enclosing rosuvastatin and to explore their potential in augmenting bone healing and regeneration. Twelve lyophilized sponge formulae were prepared adopting a 4¹·3¹ full factorial design. Xanthan gum, polycarbophil, Carbopol® and sodium alginate were investigated as anionic polymers, each at three chitosan:anionic polymer ratios (1:3, 1:1, 3:1). Visual and microscopic examination showed flexible, homogeneous porous structures with considerable bending ability. Polyelectrolyte complex formation was proved by DSC and FT-IR for all chitosan/anionic combinations except with xanthan gum, where chitosan probably bound to the drug rather than to xanthan gum. Statistical analysis proved that the anionic polymer type and the chitosan:polymer ratio, as well as their interactions, exhibited significant effects on the release parameters at p ≤ 0.05. The optimum chitosan/anionic polymer complexation ratios were 3:1 for polycarbophil and 1:1 for Carbopol and alginate. The release at these ratios followed Fickian diffusion, while other ratios showed anomalous diffusion. Imwitor® 900K and HPMC K100M were added as release retardants for further release optimization. The formula of choice was implanted in fractured rat femora. Histopathological examination revealed advanced stages of healing in treated femora compared to control ones. Biodegradable sponges for local rosuvastatin delivery conferred significantly enhanced healing and regeneration on fractured bones.

  3. Simulation on a car interior aerodynamic noise control based on statistical energy analysis

    NASA Astrophysics Data System (ADS)

    Chen, Xin; Wang, Dengfeng; Ma, Zhengdong

    2012-09-01

    How to simulate interior aerodynamic noise accurately is an important question in car interior noise reduction. The unsteady aerodynamic pressure on body surfaces is proved to be the key factor in car interior aerodynamic noise control at high frequency and high speed. In this paper, a detailed statistical energy analysis (SEA) model is built, and the vibro-acoustic power inputs are loaded onto the model to obtain valid results for car interior noise analysis. The model is the solid foundation for further optimization of car interior noise control. After comprehensive SEA analysis identifies the subsystems whose power contributions to car interior noise are most sensitive, the sound pressure level of car interior aerodynamic noise can be reduced by improving their sound and damping characteristics. Further vehicle testing results show that the interior acoustic performance can be improved by using the detailed SEA model, comprising more than 80 subsystems, together with the calculation of unsteady aerodynamic pressure on body surfaces and the improvement of the materials' sound/damping properties. A reduction of more than 2 dB can be achieved at centre frequencies above 800 Hz in the spectrum. The proposed optimization method can be regarded as a reference for car interior aerodynamic noise control using a detailed SEA model integrated with unsteady computational fluid dynamics (CFD) and sensitivity analysis of acoustic contributions.

  4. Condensate statistics in interacting and ideal dilute bose gases

    PubMed

    Kocharovsky; Kocharovsky; Scully

    2000-03-13

    We obtain analytical formulas for the statistics, in particular, for the characteristic function and all cumulants, of the Bose-Einstein condensate in dilute weakly interacting and ideal equilibrium gases in the canonical ensemble via the particle-number-conserving operator formalism of Girardeau and Arnowitt. We prove that the ground-state occupation statistics is not Gaussian even in the thermodynamic limit. We calculate the effect of Bogoliubov coupling on suppression of ground-state occupation fluctuations and show that they are governed by a pair-correlation, squeezing mechanism.

  5. Detecting most influencing courses on students grades using block PCA

    NASA Astrophysics Data System (ADS)

    Othman, Osama H.; Gebril, Rami Salah

    2014-12-01

    One of the modern solutions adopted for dealing with the problem of a large number of variables in statistical analyses is Block Principal Component Analysis (Block PCA). This modified technique can be used to reduce the vertical dimension (variables) of the data matrix Xn×p by selecting a smaller number of variables (say m) containing most of the statistical information. These selected variables can then be employed in further investigations and analyses. Block PCA is an adapted multistage technique based on the original PCA. It involves the application of Cluster Analysis (CA) and variable selection through sub principal component scores (PCs). The application of Block PCA in this paper is a modified version of the original work of Liu et al (2002). The main objective was to apply PCA to each group of variables (established using cluster analysis), instead of involving the whole large set of variables, which was proved to be unreliable. In this work, Block PCA is used to reduce the size of a huge data matrix ((n = 41) × (p = 251)) consisting of the Grade Point Averages (GPA) of students in 251 courses (variables) in the Faculty of Science at Benghazi University. In other words, we construct a smaller analytical data matrix of the students' GPAs with fewer variables containing most of the variation (statistical information) in the original database. By applying Block PCA, 12 courses were found to 'absorb' most of the variation or influence in the original data matrix, and hence are worth keeping for future statistical exploration and analytical studies. In addition, the course Independent Study (Math.) was found to be the most influential course on students' GPA among the 12 selected courses.
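A toy sketch of the block idea, with invented course data: variables are first grouped by correlation (a crude stand-in for the cluster analysis step), then one representative is kept per block (here the highest-variance variable, standing in for the block's first principal component). The course names, grades, and threshold are all hypothetical:

```python
from statistics import mean, pstdev

def corr(x, y):
    """Pearson correlation of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def block_reduce(data, names, threshold=0.8):
    """Greedy correlation clustering of variables, then one representative per
    block (the highest-variance variable, a stand-in for the block's first PC)."""
    blocks = []
    for i, col in enumerate(data):
        for block in blocks:
            if abs(corr(col, data[block[0]])) >= threshold:
                block.append(i)
                break
        else:
            blocks.append([i])
    reps = [max(block, key=lambda i: pstdev(data[i])) for block in blocks]
    return [names[i] for i in reps]

# Toy GPA-like data: each inner list is one course's grades over five students.
data = [
    [3.0, 2.5, 3.8, 2.0, 3.4],   # Calculus
    [3.0, 2.6, 3.7, 2.2, 3.4],   # Algebra (tracks Calculus closely)
    [2.0, 3.9, 2.2, 3.5, 2.8],   # Literature
]
print(block_reduce(data, ["Calculus", "Algebra", "Literature"]))
```

The two highly correlated mathematics courses collapse into one block represented by a single course, mirroring how Block PCA keeps one informative variable per cluster instead of the full set.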

  6. Biomonitoring of pollen grains of a river bank suburban city, Konnagar, Calcutta, India, and its link and impact on local people.

    PubMed

    Ghosal, Kavita; Pandey, Naren; Bhattacharya, Swati Gupta

    2015-01-01

    Pollen grains released by plants are dispersed into the air and can become trapped in human nasal mucosa, causing immediate release of allergens and triggering severe Type 1 hypersensitivity reactions in susceptible allergic patients. Recent epidemiologic data show that 11-12% of people in India suffer from this type of disorder. Hence, it is important to examine whether pollen grains have a role in respiratory problems, including allergy and asthma, in a subtropical suburban city. Meteorological data were collected for a period of two years, together with aerobiological sampling with a Burkard sampler. A pollen calendar was prepared for the city. A health survey and the hospitalization rate of local people for the above problems were documented, followed by statistical analysis relating pollen counts to the data from these two sources. Skin prick tests and indirect ELISA were performed for the identification of allergenic pollen grains. Bio-monitoring results showed that a total of 36 species of pollen grains were present in the air of the study area, where their presence is controlled by many important meteorological parameters, as proved by SPSS statistical analysis, and by their blooming periods. Statistical analysis showed a high positive correlation of monthly pollen counts with the data from the survey and the hospital. Biochemical tests revealed the allergenic nature of the pollen grains of many local species found in the sampler. Bio-monitoring, together with the statistical and biochemical results, leaves no doubt about the role of pollen as a bio-pollutant. General knowledge about pollen allergy and the specific allergenic pollen grains of a particular locality could be a good step towards better health for the cosmopolitan suburban city.

  7. A Non-Destructive Method for Distinguishing Reindeer Antler (Rangifer tarandus) from Red Deer Antler (Cervus elaphus) Using X-Ray Micro-Tomography Coupled with SVM Classifiers

    PubMed Central

    Lefebvre, Alexandre; Rochefort, Gael Y.; Santos, Frédéric; Le Denmat, Dominique; Salmon, Benjamin; Pétillon, Jean-Marc

    2016-01-01

    Over the last decade, biomedical 3D-imaging tools have gained widespread use in the analysis of prehistoric bone artefacts. While initial attempts to characterise the major categories used in osseous industry (i.e. bone, antler, and dentine/ivory) have been successful, the taxonomic determination of prehistoric artefacts remains to be investigated. The distinction between reindeer and red deer antler can be challenging, particularly in cases of anthropic and/or taphonomic modifications. In addition to the range of destructive physicochemical identification methods available (mass spectrometry, isotopic ratio, and DNA analysis), X-ray micro-tomography (micro-CT) provides convincing non-destructive 3D images and analyses. This paper presents the experimental protocol (sample scans, image processing, and statistical analysis) we have developed in order to identify modern and archaeological antler collections (from Isturitz, France). This original method is based on bone microstructure analysis combined with advanced statistical support vector machine (SVM) classifiers. A combination of six microarchitecture biomarkers (bone volume fraction, trabecular number, trabecular separation, trabecular thickness, trabecular bone pattern factor, and structure model index) was screened using micro-CT in order to characterise internal alveolar structure. Overall, reindeer alveoli presented a tighter mesh than red deer alveoli, and statistical analysis allowed us to distinguish archaeological antler by species with an accuracy of 96%, regardless of anatomical location on the antler. In conclusion, micro-CT combined with SVM classifiers proves to be a promising additional non-destructive method for antler identification, suitable for archaeological artefacts whose degree of human modification and cultural heritage or scientific value has previously made identification impossible (tools, ornaments, etc.). PMID:26901355
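The classification step described in the record above can be sketched with a small support-vector-machine example. The data below are synthetic stand-ins, not the published micro-CT measurements; only the idea of separating two species from six microarchitecture biomarkers is illustrated, and the group means are invented.

```python
# Sketch: species classification from six microarchitecture biomarkers with an
# SVM. Synthetic data; feature order follows the six biomarkers named above.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 40
# Hypothetical group means: reindeer alveoli have a "tighter mesh".
reindeer = rng.normal(loc=[0.45, 2.2, 0.30, 0.12, 1.1, 0.8], scale=0.05, size=(n, 6))
red_deer = rng.normal(loc=[0.30, 1.5, 0.50, 0.18, 1.6, 1.4], scale=0.05, size=(n, 6))
X = np.vstack([reindeer, red_deer])
y = np.array([0] * n + [1] * n)  # 0 = reindeer, 1 = red deer

# Standardise the biomarkers, then fit an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X, y)
accuracy = clf.score(X, y)
```

With well-separated synthetic groups the classifier separates the species cleanly; real antler data would of course require cross-validation, as in the paper's 96% figure.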

  8. Regression: The Apple Does Not Fall Far From the Tree.

    PubMed

    Vetter, Thomas R; Schober, Patrick

    2018-05-15

    Researchers and clinicians are frequently interested in either: (1) assessing whether there is a relationship or association between 2 or more variables and quantifying this association; or (2) determining whether 1 or more variables can predict another variable. The strength of such an association is mainly described by the correlation. However, regression analysis and regression models can be used not only to identify whether there is a significant relationship or association between variables but also to generate estimations of such a predictive relationship between variables. This basic statistical tutorial discusses the fundamental concepts and techniques related to the most common types of regression analysis and modeling, including simple linear regression, multiple regression, logistic regression, ordinal regression, and Poisson regression, as well as the common yet often underrecognized phenomenon of regression toward the mean. The various types of regression analysis are powerful statistical techniques, which when appropriately applied, can allow for the valid interpretation of complex, multifactorial data. Regression analysis and models can assess whether there is a relationship or association between 2 or more observed variables and estimate the strength of this association, as well as determine whether 1 or more variables can predict another variable. Regression is thus being applied more commonly in anesthesia, perioperative, critical care, and pain research. However, it is crucial to note that regression can identify plausible risk factors; it does not prove causation (a definitive cause and effect relationship). The results of a regression analysis instead identify independent (predictor) variable(s) associated with the dependent (outcome) variable. As with other statistical methods, applying regression requires that certain assumptions be met, which can be tested with specific diagnostics.
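The simplest of the regression types listed in the record above, simple linear regression, can be sketched in a few lines of ordinary least squares; the data and true coefficients below are invented for illustration.

```python
# Minimal sketch of simple linear regression: estimate the intercept and slope
# of y = a + b*x + noise by ordinary least squares on made-up data.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 200)
y = 2.0 + 0.5 * x + rng.normal(0, 0.1, 200)  # true intercept 2.0, slope 0.5

# Design matrix with a column of ones for the intercept term.
X = np.column_stack([np.ones_like(x), x])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, slope = coef
```

The fitted coefficients recover the true values to within sampling error; assessing whether they do so validly requires the residual diagnostics the tutorial mentions.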

  9. Isolating the anthropogenic component of Arctic warming

    DOE PAGES

    Chylek, Petr; Hengartner, Nicholas; Lesins, Glen; ...

    2014-05-28

    Structural equation modeling is used in statistical applications as both a confirmatory and an exploratory tool to test models and to suggest the most plausible explanation for a relationship between independent and dependent variables. Although structural analysis cannot prove causation, it can suggest the most plausible set of factors that influence the observed variable. Here, we apply structural model analysis to the annual mean Arctic surface air temperature from 1900 to 2012 to find the most effective set of predictors and to isolate the anthropogenic component of the recent Arctic warming by subtracting the effects of natural forcing and variability from the observed temperature. We also find that anthropogenic greenhouse gas and aerosol radiative forcing and the Atlantic Multidecadal Oscillation internal mode dominate Arctic temperature variability. Finally, our structural model analysis of observational data suggests that about half of the recent Arctic warming of 0.64 K/decade may have anthropogenic causes.

  10. Software phantom with realistic speckle modeling for validation of image analysis methods in echocardiography

    NASA Astrophysics Data System (ADS)

    Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten

    2014-03-01

    Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, render the majority of standard image analysis methods suboptimal. Furthermore, validation of adapted computer vision methods proves to be difficult due to missing ground truth information. There is no widely accepted software phantom in the community, and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with a realistic speckle pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy of the resulting textures.
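One common speckle model of the kind such a phantom could support is multiplicative Rayleigh noise on an echogenicity map; the sketch below is a generic illustration under that assumption, not the phantom's actual simulation pipeline.

```python
# Hedged sketch of a standard ultrasound speckle model: multiplicative
# Rayleigh-distributed noise on a (here uniform) tissue echogenicity map.
import numpy as np

rng = np.random.default_rng(2)
tissue = np.full((64, 64), 0.5)           # hypothetical echogenicity map
speckle = rng.rayleigh(scale=1.0, size=tissue.shape)
image = tissue * speckle                   # multiplicative speckle

# Statistical check of the texture: for fully developed Rayleigh speckle the
# point SNR (mean/std) is a constant, about 1.91, independent of echogenicity.
snr = image.mean() / image.std()
```

Checks like this SNR constant are one way to quantify the "realism" of a simulated texture, as the record's statistical analysis does for its phantom.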

  11. Differentiation of wines according to grape variety and geographical origin based on volatiles profiling using SPME-MS and SPME-GC/MS methods.

    PubMed

    Ziółkowska, Angelika; Wąsowicz, Erwin; Jeleń, Henryk H

    2016-12-15

    Among methods to detect wine adulteration, profiling volatiles is one with great potential regarding robustness, analysis time, and the abundance of information for subsequent data treatment. Volatile fraction fingerprinting by solid-phase microextraction with direct analysis by mass spectrometry, without compound separation (SPME-MS), was used to differentiate white as well as red wines. The aim was to differentiate between the varieties used for wine production and also to differentiate wines by country of origin. The results obtained were compared to SPME-GC/MS analysis, in which compounds were resolved by gas chromatography. For both approaches the same statistical procedure was used to compare samples: principal component analysis (PCA) followed by linear discriminant analysis (LDA). White wines (38) and red wines (41) representing different grape varieties and various regions of origin were analysed. SPME-MS proved advantageous due to better discrimination and higher sample throughput. Copyright © 2016 Elsevier Ltd. All rights reserved.
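The PCA-followed-by-LDA procedure used in the record above can be sketched on invented "spectral" data, with a two-class Fisher linear discriminant standing in for the LDA step; dimensions and class shifts are hypothetical.

```python
# Sketch of a PCA -> LDA chain on synthetic fingerprints: reduce with PCA,
# then separate two classes with a two-class Fisher discriminant.
import numpy as np

rng = np.random.default_rng(3)
n, p = 30, 50                    # 30 samples per class, 50 "m/z channels"
class_a = rng.normal(0.0, 1.0, (n, p)); class_a[:, :5] += 2.0
class_b = rng.normal(0.0, 1.0, (n, p)); class_b[:, :5] -= 2.0
X = np.vstack([class_a, class_b])
y = np.array([0] * n + [1] * n)

# PCA via SVD of the mean-centred data; keep 5 components.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:5].T

# Fisher LDA on the PCA scores: w = Sw^{-1} (m0 - m1).
m0 = scores[y == 0].mean(axis=0)
m1 = scores[y == 1].mean(axis=0)
Sw = np.cov(scores[y == 0], rowvar=False) + np.cov(scores[y == 1], rowvar=False)
w = np.linalg.solve(Sw, m0 - m1)
proj = scores @ w
threshold = (proj[y == 0].mean() + proj[y == 1].mean()) / 2
pred = (proj < threshold).astype(int)
accuracy = (pred == y).mean()
```

PCA first removes the collinearity of the channels so the within-class scatter matrix in the LDA step is well conditioned, which is the usual motivation for this two-stage chain.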

  12. A Localized Ensemble Kalman Smoother

    NASA Technical Reports Server (NTRS)

    Butala, Mark D.

    2012-01-01

    Numerous geophysical inverse problems prove difficult because the available measurements are only indirectly related to the underlying unknown dynamic state, and the physics governing the system may involve imperfect models or unobserved parameters. Data assimilation addresses these difficulties by combining the measurements with physical knowledge. The main challenge in such problems usually lies in their high dimensionality, which renders standard statistical methods computationally intractable. This paper develops a new high-dimensional Monte-Carlo approach, the localized ensemble Kalman smoother, and addresses its theoretical convergence.

  13. The maximum entropy production principle: two basic questions.

    PubMed

    Martyushev, Leonid M

    2010-05-12

    The overwhelming majority of maximum entropy production applications to ecological and environmental systems are based on thermodynamics and statistical physics. Here, we briefly discuss the maximum entropy production principle and raise two questions: (i) can this principle be used as the basis for non-equilibrium thermodynamics and statistical mechanics, and (ii) is it possible to 'prove' the principle? We adduce one more proof, which is the most concise available today.

  14. Do physicians understand cancer screening statistics? A national survey of primary care physicians in the United States.

    PubMed

    Wegwarth, Odette; Schwartz, Lisa M; Woloshin, Steven; Gaissmaier, Wolfgang; Gigerenzer, Gerd

    2012-03-06

    Unlike reduced mortality rates, improved survival rates and increased early detection do not prove that cancer screening tests save lives. Nevertheless, these 2 statistics are often used to promote screening. To learn whether primary care physicians understand which statistics provide evidence about whether screening saves lives. Parallel-group, randomized trial (randomization controlled for order effect only), conducted by Internet survey. (ClinicalTrials.gov registration number: NCT00981019) National sample of U.S. primary care physicians from a research panel maintained by Harris Interactive (79% cooperation rate). 297 physicians who practiced both inpatient and outpatient medicine were surveyed in 2010, and 115 physicians who practiced exclusively outpatient medicine were surveyed in 2011. Physicians received scenarios about the effect of 2 hypothetical screening tests: The effect was described as improved 5-year survival and increased early detection in one scenario and as decreased cancer mortality and increased incidence in the other. Physicians' recommendation of screening and perception of its benefit in the scenarios and general knowledge of screening statistics. Primary care physicians were more enthusiastic about the screening test supported by irrelevant evidence (5-year survival increased from 68% to 99%) than about the test supported by relevant evidence (cancer mortality reduced from 2 to 1.6 in 1000 persons). When presented with irrelevant evidence, 69% of physicians recommended the test, compared with 23% when presented with relevant evidence (P < 0.001). When asked general knowledge questions about screening statistics, many physicians did not distinguish between irrelevant and relevant screening evidence; 76% versus 81%, respectively, stated that each of these statistics proves that screening saves lives (P = 0.39). 
About one half (47%) of the physicians incorrectly said that finding more cases of cancer in screened as opposed to unscreened populations "proves that screening saves lives." Physicians' recommendations for screening were based on hypothetical scenarios, not actual practice. Most primary care physicians mistakenly interpreted improved survival and increased detection with screening as evidence that screening saves lives. Few correctly recognized that only reduced mortality in a randomized trial constitutes evidence of the benefit of screening. Harding Center for Risk Literacy, Max Planck Institute for Human Development.
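The arithmetic behind the relevant-evidence scenario in the record above (mortality reduced from 2 to 1.6 per 1000 persons) is straightforward, and shows why mortality, not survival, quantifies the benefit of screening:

```python
# Worked arithmetic for the "relevant evidence" scenario: cancer mortality
# falling from 2 to 1.6 per 1000 persons in a randomized trial.
deaths_control = 2 / 1000      # mortality without screening
deaths_screened = 1.6 / 1000   # mortality with screening

arr = deaths_control - deaths_screened   # absolute risk reduction (0.0004)
rrr = arr / deaths_control               # relative risk reduction (20%)
nns = 1 / arr                            # number needed to screen (2500)
```

A 5-year survival improving from 68% to 99% permits no such calculation, because survival statistics are inflated by lead-time and overdiagnosis bias even when no deaths are averted.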

  15. Electrocoagulation with polarity switch for fast oil removal from oil in water emulsions.

    PubMed

    Gobbi, Lorena C A; Nascimento, Izabela L; Muniz, Eduardo P; Rocha, Sandra M S; Porto, Paulo S S

    2018-05-01

    An electrocoagulation technique using a 3.5 L reactor, with aluminum electrodes in a monopolar arrangement whose polarity was switched every 10 s, was used to separate oil from synthetic oily water similar in oil concentration to produced water from offshore platforms. Up to 98% oil removal was achieved after 20 min of processing. The dependence of oil removal and pH on processing time was measured and successfully fitted to exponential models, indicating pseudo-first-order behavior. Statistical analysis showed that electrical conductivity and total solids depend significantly on the concentration of electrolyte (NaCl) in the medium. Oil removal depends mostly on the distance between the electrodes but is proportional to electrolyte concentration when the initial pH is 8. Electrocoagulation with polarity switching maximizes the lifetime of the electrodes. The process reduced oil concentration to a value below that stipulated by law, proving it can be an efficient technology for minimizing the environmental impact of offshore drilling. Copyright © 2018 Elsevier Ltd. All rights reserved.
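The pseudo-first-order fit mentioned in the record above can be sketched by linearizing C(t) = C0·exp(-kt); the rate constant and noise level below are hypothetical, chosen only so that removal at 20 min lands near the reported 98%.

```python
# Sketch of a pseudo-first-order fit: oil concentration decaying as
# C(t) = C0 * exp(-k t), with k estimated by linear regression on ln C.
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0, 20, 21)                  # minutes
k_true, c0 = 0.2, 100.0                     # hypothetical values
c = c0 * np.exp(-k_true * t) * rng.normal(1.0, 0.01, t.size)

slope, intercept = np.polyfit(t, np.log(c), 1)
k_est = -slope                              # pseudo-first-order rate constant
removal_20min = 1 - c[-1] / c0              # fractional oil removal at 20 min
```

A straight line in ln C versus t is the usual diagnostic that the first-order model is adequate; curvature would suggest a different reaction order.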

  16. Cupriavidus metallidurans biomineralization ability and its application as a bioconsolidation enhancer for ornamental marble stone.

    PubMed

    Daskalakis, Markos I; Magoulas, Antonis; Kotoulas, Georgios; Katsikis, Ioannis; Bakolas, Asterios; Karageorgis, Aristomenis P; Mavridou, Athena; Doulia, Danae; Rigas, Fotis

    2014-08-01

    Bacterially induced calcium carbonate precipitation by a Cupriavidus metallidurans isolate was investigated to develop an environmentally friendly method for the restoration and preservation of ornamental stones. Biomineralization was carried out in a growth medium via a Design of Experiments (DoE) approach using temperature, growth medium concentration, and inoculum concentration as design factors. The optimum conditions were determined with the aid of consecutive experiments based on response surface methodology (RSM) and were successfully validated thereafter. Statistical analysis can be utilized as a tool for screening bacterial bioprecipitation, as it considerably reduced the experimental time and effort needed for bacterial evaluation. Analytical methods provided insight into the biomineral characteristics, and sonication tests proved that our isolate could create a solid new layer of vaterite on a marble substrate withstanding sonication forces. C. metallidurans ACA-DC 4073 provided a compact vaterite layer on the marble substrate with morphological characteristics that assisted in its differentiation. The latter proved valuable when spraying a minimal amount of inoculated media on the marble substrate under conditions close to an in situ application. A sufficient and clearly distinguishable layer was identified.

  17. Low level laser therapy in healing tendon

    NASA Astrophysics Data System (ADS)

    Carvalho, P. T. C.; Batista, Cheila O. C.; Fabíola, C.

    2005-11-01

    This study aims to verify the effects of a GaAs laser on the scarring of tendon lesions in rats in a low-nourishment condition, and to analyze the ideal light density by means of histopathologic findings highlighted by light microscopy. After the proposed nutritional condition was verified, the animals were divided into 3 groups: GI, control; GII, laser at 1 J/cm²; and GIII, laser at 4 J/cm². The lesions were induced by means of a routine surgical process for tendon exposure: crushing with Allis forceps followed by a saturated incision. The data obtained on the amounts of macrophages, leukocytes, fibroblasts, vessel neoformation, fibrosis, and collagen were submitted to parametric statistical procedures (analysis of variance and Tukey's test), with significance at p < 0.05. According to the results obtained, it can be concluded that low-power laser therapy proved efficient in tendon repair even though the animals suffered from malnutrition, and that the 1 J/cm² energy density proved more efficient in this case.

  18. Real-time monitoring of a coffee roasting process with near infrared spectroscopy using multivariate statistical analysis: A feasibility study.

    PubMed

    Catelani, Tiago A; Santos, João Rodrigo; Páscoa, Ricardo N M J; Pezza, Leonardo; Pezza, Helena R; Lopes, João A

    2018-03-01

    This work proposes the use of near infrared (NIR) spectroscopy in diffuse reflectance mode and multivariate statistical process control (MSPC) based on principal component analysis (PCA) for real-time monitoring of the coffee roasting process. The main objective was the development of an MSPC methodology able to detect disturbances to the roasting process early, based on real-time acquisition of NIR spectra. A total of fifteen roasting batches were defined according to an experimental design to develop the MSPC models. The methodology was tested on a set of five batches in which disturbances of different natures were imposed to simulate real faulty situations. Some of these batches were used to optimize the model, while the remainder were used to test the methodology. A modelling strategy based on a sliding time window provided the best results in distinguishing batches with and without disturbances, using typical MSPC charts: Hotelling's T² and squared prediction error statistics. A PCA model encompassing a time window of four minutes with three principal components was able to efficiently detect all disturbances assayed. NIR spectroscopy combined with the MSPC approach proved to be an adequate auxiliary tool for coffee roasters to detect faults in a conventional roasting process in real time. Copyright © 2017 Elsevier B.V. All rights reserved.
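The MSPC charts described in the record above, Hotelling's T² and the squared prediction error (SPE, or Q), can be sketched on synthetic stand-in "spectra"; the model size and the disturbance below are invented for illustration.

```python
# Minimal sketch of PCA-based MSPC: fit a PCA model on in-control data, then
# score a new observation with Hotelling's T^2 (variation within the model
# plane) and SPE/Q (residual variation off the plane).
import numpy as np

rng = np.random.default_rng(5)
n, p, k = 100, 20, 3                      # samples, "wavelengths", components
X = rng.normal(0, 1, (n, k)) @ rng.normal(0, 1, (k, p)) \
    + rng.normal(0, 0.05, (n, p))         # rank-3 structure plus noise

mu = X.mean(axis=0)
Xc = X - mu
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:k].T                              # loadings (p x k)
lam = (S[:k] ** 2) / (n - 1)              # variances of the scores

def mspc(x):
    t = (x - mu) @ P                      # scores of the new observation
    t2 = np.sum(t ** 2 / lam)             # Hotelling's T^2
    resid = (x - mu) - t @ P.T            # part the model cannot reconstruct
    spe = resid @ resid                   # squared prediction error (Q)
    return t2, spe

t2_ok, spe_ok = mspc(X[0])                # an in-control "spectrum"
# A disturbed batch: added structure orthogonal to the model inflates SPE.
t2_bad, spe_bad = mspc(X[0] + np.eye(p)[7])
```

In a monitoring chart each statistic is compared against a control limit estimated from the in-control batches; the sliding-window variant in the record applies the same scoring within successive time windows.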

  19. Fresh Biomass Estimation in Heterogeneous Grassland Using Hyperspectral Measurements and Multivariate Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Darvishzadeh, R.; Skidmore, A. K.; Mirzaie, M.; Atzberger, C.; Schlerf, M.

    2014-12-01

    Accurate estimation of grassland biomass at peak productivity can provide crucial information regarding the functioning and productivity of rangelands. Hyperspectral remote sensing has proved valuable for the estimation of vegetation biophysical parameters such as biomass using different statistical techniques. However, in statistical analysis of hyperspectral data, multicollinearity is a common problem due to the large number of correlated hyperspectral reflectance measurements. The aim of this study was to examine the prospect of above-ground biomass estimation in a heterogeneous Mediterranean rangeland employing multivariate calibration methods. Canopy spectral measurements were made in the field using a GER 3700 spectroradiometer, along with concomitant in situ measurements of above-ground biomass for 170 sample plots. Multivariate calibrations including partial least squares regression (PLSR), principal component regression (PCR), and least-squares support vector machine (LS-SVM) regression were used to estimate the above-ground biomass. The prediction accuracy of the multivariate calibration methods was assessed using cross-validated R² and RMSE. The best model performance was obtained using LS-SVM and then PLSR, both calibrated with the first-derivative reflectance dataset (R²cv = 0.88 and 0.86, RMSEcv = 1.15 and 1.07, respectively). The weakest prediction accuracy appeared when PCR was used (R²cv = 0.31, RMSEcv = 2.48). The obtained results highlight the importance of multivariate calibration methods for biomass estimation when hyperspectral data are used.

  20. Transanal hemorrhoidal dearterialization with mucopexy versus open hemorrhoidectomy in the treatment of hemorrhoids: a meta-analysis of randomized control trials.

    PubMed

    Xu, L; Chen, H; Lin, G; Ge, Q; Qi, H; He, X

    2016-12-01

    The aim of this study was to analyse the outcomes of transanal hemorrhoidal dearterialization with mucopexy (THDm) versus open hemorrhoidectomy (OH) in the management of hemorrhoids. Randomized controlled trials in English were identified by searching PubMed, Web of Science, EMBASE, and the Cochrane Library databases. Trials that compared THDm with OH were identified. Data were extracted independently for each study, and a meta-analysis was performed using fixed- and random-effects models. Four trials, including 316 patients, met the inclusion criteria. No statistically significant differences were noted in total complications or in postoperative bleeding, incontinence, recurrent prolapse, or urinary retention rates. Operative time was significantly longer for THDm with Doppler guidance than for THDm without Doppler guidance. Patients returned to normal activities faster after THDm than after OH. No statistically significant differences between THDm and OH were noted with regard to recurrence and reoperation rates. Our meta-analysis shows that THDm and OH are equally effective and can be attempted for the management of hemorrhoids. However, THDm with Doppler guidance requires more instruments and a longer operative time. Future large-scale, high-quality, multicenter trials with long-term outcomes are needed to confirm these results and to determine whether Doppler guidance in THD is truly necessary.

  1. More Results from the Opera Experiment at the Gran Sasso Underground Lab

    NASA Astrophysics Data System (ADS)

    Kamiscioglu, Mustafa

    The OPERA experiment reached its main goal by proving the appearance of ντ in the CNGS νμ beam. Five ντ candidates fulfilling the analysis defined in the proposal were detected with an S/B ratio of about ten, allowing rejection of the null hypothesis at 5.1σ. The search has been extended by loosening the selection criteria in order to obtain a statistically enhanced, lower-purity signal sample. One such interesting neutrino interaction, with a double-vertex topology having a high probability of being a ντ interaction with charm production, is reported. Based on the enlarged data sample, the estimation of Δm²32 in appearance mode is presented. The search for νe interactions has been extended over the full data set, with a more than twofold increase in statistics with respect to published data. The analysis of the νμ → νe channel is updated, and the implications of the electron neutrino sample in the framework of the 3+1 neutrino model are discussed. An analysis of νμ → ντ interactions in the framework of the sterile neutrino model has also been performed. Finally, the results of the study of charged hadron multiplicity distributions are presented.

  2. Independent Component Analysis-motivated Approach to Classificatory Decomposition of Cortical Evoked Potentials

    PubMed Central

    Smolinski, Tomasz G; Buchanan, Roger; Boratyn, Grzegorz M; Milanova, Mariofanna; Prinz, Astrid A

    2006-01-01

    Background Independent Component Analysis (ICA) proves to be useful in the analysis of neural activity, as it allows for identification of distinct sources of activity. Applied to measurements registered in a controlled setting and under exposure to an external stimulus, it can facilitate analysis of the impact of the stimulus on those sources. The link between the stimulus and a given source can be verified by a classifier that is able to "predict" the condition a given signal was registered under, solely based on the components. However, the ICA's assumption about statistical independence of sources is often unrealistic and turns out to be insufficient to build an accurate classifier. Therefore, we propose to utilize a novel method, based on hybridization of ICA, multi-objective evolutionary algorithms (MOEA), and rough sets (RS), that attempts to improve the effectiveness of signal decomposition techniques by providing them with "classification-awareness." Results The preliminary results described here are very promising, and further investigation of other MOEAs and/or RS-based classification accuracy measures should be pursued. Even a quick visual analysis of those results can provide an interesting insight into the problem of neural activity analysis. Conclusion We present a methodology for the classificatory decomposition of signals. One of the main advantages of our approach is that, rather than relying solely on often unrealistic assumptions about the statistical independence of sources, components are generated in the light of the underlying classification problem itself. PMID:17118151

  3. Black sea surface temperature anomaly on 5th August 1998 and the ozone layer thickness

    NASA Astrophysics Data System (ADS)

    Manev, A.; Palazov, K.; Raykov, St.; Ivanov, V.

    2003-04-01

    This paper focuses on the peculiarities of the Black Sea surface temperature anomaly of 05.08.1998. Researching the daily temperature changes in a number of control fields over the course of 8-10 years, we have found hidden correlations and anomalous deviations in sea surface temperatures on a global scale. The research proves the statistical reliability of the temperature anomaly registered over the entire Black Sea surface on 04-05.08.1998. During the six days around these dates the temperatures were up to 2°C higher than the maximum temperatures in this period in the other seven years. A more detailed analysis of the dynamics of the anomaly required the investigation of five characteristic Black Sea surface zones of 75x75 km. The analysis covers a 20-day period: 10 days before and 10 days after the anomaly. The investigations aimed at interpreting the reasons for the anomalous heating of the surface waters. We have tried to analyze the correlation between sea surface temperature and the global ozone above the Black Sea by simultaneously using data from the two satellite systems NOAA and TOMS. Methods of processing and comparing the data from the two satellite systems are described. The correlation coefficients for the five characteristic zones are very high and close in value, which proves that the character of the ozone - sea surface temperature correlation is the same for the entire Black Sea surface. Despite the high correlation coefficient, we have shown that causality between the two phenomena at the time of the anomaly does not exist.

  4. A twelve-year profile of students' SAT scores, GPAs, and MCAT scores from a small university's premedical program.

    PubMed

    Montague, J R; Frei, J K

    1993-04-01

    To determine whether significant correlations existed among quantitative and qualitative predictors of students' academic success and quantitative outcomes of such success over a 12-year period in a small university's premedical program. A database was assembled from information on the 199 graduates who earned BS degrees in biology from Barry University's School of Natural and Health Sciences from 1980 through 1991. The quantitative variables were year of BS degree, total score on the Scholastic Aptitude Test (SAT), various measures of undergraduate grade-point averages (GPAs), and total score on the Medical College Admission Test (MCAT); and the qualitative variables were minority (54% of the students) or majority status and transfer (about one-third of the students) or nontransfer status. The statistical methods were multiple analysis of variance and stepwise multiple regression. Statistically significant positive correlations were found among SAT total scores, final GPAs, biology GPAs versus nonbiology GPAs, and MCAT total scores. These correlations held for transfer versus nontransfer students and for minority versus majority students. Over the 12-year period there were significant fluctuations in mean MCAT scores. The students' SAT scores and GPAs proved to be statistically reliable predictors of MCAT scores, but the minority or majority status and the transfer or nontransfer status of the students were statistically insignificant.

  5. Phenolic Analysis and Theoretic Design for Chinese Commercial Wines' Authentication.

    PubMed

    Li, Si-Yu; Zhu, Bao-Qing; Reeves, Malcolm J; Duan, Chang-Qing

    2018-01-01

    To develop a robust tool for the varietal, regional, and vintage authentication of Chinese commercial wines, phenolic compounds in 121 Chinese commercial dry red wines were detected and quantified using high-performance liquid chromatography triple-quadrupole mass spectrometry (HPLC-QqQ-MS/MS), and the differentiation abilities of principal component analysis (PCA), partial least squares discriminant analysis (PLS-DA), and orthogonal partial least squares discriminant analysis (OPLS-DA) were compared. Outperforming PCA and PLS-DA, OPLS-DA models were successfully established to differentiate wines according to variety (Cabernet Sauvignon or other varieties), region (east or west Cabernet Sauvignon wines), and vintage (young or old Cabernet Sauvignon wines). The S-plot provided in the OPLS-DA models showed the key phenolic compounds that were both statistically and biochemically significant in sample differentiation. In addition, the potential of the OPLS-DA models for deeper sample differentiation using more detailed regional and vintage information appeared promising. On the basis of our results, a promising theoretic design for wine authentication was further proposed for the first time, which might be helpful in the practical authentication of more commercial wines. The phenolic data of the 121 Chinese commercial dry red wines were processed with different statistical tools for varietal, regional, and vintage differentiation. A promising theoretical design was summarized, which might be helpful for wine authentication in practical situations. © 2017 Institute of Food Technologists®.

  6. An investigation on thermal patterns in Iran based on spatial autocorrelation

    NASA Astrophysics Data System (ADS)

    Fallah Ghalhari, Gholamabbas; Dadashi Roudbari, Abbasali

    2018-02-01

    The present study aimed at investigating the temporal-spatial and monthly patterns of temperature in Iran using spatial statistical methods such as cluster and outlier analysis and hotspot analysis. To do so, the monthly average temperatures of 122 synoptic stations were assessed. Statistical analysis showed that January, at 120.75%, had the greatest fluctuation among the studied months. Global Moran's I revealed that yearly changes of temperature in Iran follow a strongly spatially clustered pattern. Findings showed that the strongest thermal cluster pattern in Iran, 0.975388, occurred in May. Cluster and outlier analyses showed that thermal homogeneity in Iran decreases in cold months and increases in warm months, owing to the radiation angle and the synoptic systems that strongly influence thermal order in Iran. Elevation, however, plays the most notable part, as shown by a geographically weighted regression model. Hotspot analysis of Iran's temperatures showed that hot thermal patterns (very hot, hot, and semi-hot) are dominant in the south, covering an area of 33.5% (about 552,145.3 km²). Regions such as foothills and lowlands lack any significant spatial autocorrelation (25.2%, covering about 415,345.1 km²). The last is the cold thermal area (very cold, cold, and semi-cold), with about 25.2% covering about 552,145.3 km² of the whole area of Iran.
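Global Moran's I, the spatial-autocorrelation statistic used in the record above, can be computed directly; the toy one-dimensional "station" values below are invented to show how a clustered pattern yields a strongly positive I.

```python
# Sketch of global Moran's I on a toy 1-D chain of stations whose
# temperatures form two clusters (values invented for illustration).
import numpy as np

temps = np.array([10.0, 11.0, 10.5, 30.0, 31.0, 29.5])  # two clusters

# Binary contiguity weights: neighbours are adjacent stations in the chain.
n = temps.size
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0

z = temps - temps.mean()
moran_i = (n / W.sum()) * (z @ W @ z) / (z @ z)
```

Values near +1 indicate clustering (similar values adjacent), values near -1 a checkerboard pattern, and values near -1/(n-1) spatial randomness; real applications use 2-D contiguity or distance-based weights.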

  7. Supervised chemical pattern recognition in almond (Prunus dulcis) Portuguese PDO cultivars: PCA- and LDA-based triennial study.

    PubMed

    Barreira, João C M; Casal, Susana; Ferreira, Isabel C F R; Peres, António M; Pereira, José Alberto; Oliveira, M Beatriz P P

    2012-09-26

    Almonds harvested in three years in Trás-os-Montes (Portugal) were characterized to find differences between Protected Designation of Origin (PDO) Amêndoa Douro and commercial non-PDO cultivars. Nutritional parameters, fiber (neutral and acid detergent fibers, acid detergent lignin, and cellulose), fatty acids, triacylglycerols (TAG), and tocopherols were evaluated. Fat was the major component, followed by carbohydrates, protein, and moisture. Fatty acids were mostly detected in monounsaturated and polyunsaturated forms, particularly oleic and linoleic acids. Accordingly, 1,2,3-trioleoylglycerol and 1,2-dioleoyl-3-linoleoylglycerol were the major TAG. α-Tocopherol was the leading tocopherol. To verify statistical differences between PDO and non-PDO cultivars independent of the harvest year, data were analyzed through analysis of variance, principal component analysis, and linear discriminant analysis (LDA). The identified differences yielded classification parameters, providing an important tool for authenticity purposes. The best results were achieved with TAG analysis coupled with LDA, which proved effective in discriminating almond cultivars.
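    To make the LDA step concrete, here is a minimal sketch on synthetic data: two cultivar groups whose hypothetical "TAG profiles" differ in mean are classified with linear discriminant analysis under cross-validation. The group sizes, the eight variables, and the mean shift are illustrative assumptions, not the paper's measurements.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(2)
    n = 30  # samples per group (hypothetical)
    shift = np.array([3.0, 0.0, 2.0, 0.0, 0.0, 1.0, 0.0, 0.0])  # assumed mean difference

    pdo = rng.normal(0.0, 1.0, (n, 8)) + shift   # "PDO" profiles
    non_pdo = rng.normal(0.0, 1.0, (n, 8))       # "non-PDO" profiles
    X = np.vstack([pdo, non_pdo])
    y = np.array([1] * n + [0] * n)

    # 5-fold cross-validated classification accuracy of LDA
    acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
    ```
    
    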

  8. Compressive strength of human openwedges: a selection method

    NASA Astrophysics Data System (ADS)

    Follet, H.; Gotteland, M.; Bardonnet, R.; Sfarghiu, A. M.; Peyrot, J.; Rumelhart, C.

    2004-02-01

    A series of 44 samples of bone wedges of human origin, intended for allograft openwedge osteotomy and obtained without particular precautions during hip arthroplasty, were re-examined. After chemical viral-inactivation treatment, lyophilisation, and radio-sterilisation (intended to ensure optimal health safety), the compressive strength, independent of age, sex, and sample height (or angle of cut), proved too widely dispersed [10-158 MPa] in the first study. We propose a method for selecting samples that takes into account their geometry (width, length, thicknesses, cortical surface area). Statistical methods (principal components analysis (PCA), hierarchical cluster analysis, multilinear regression) allowed a final selection of 29 samples with a mean compressive strength σ_max = 103 ± 26 MPa and a range of [61-158 MPa]. These results are equivalent to or greater than those of materials currently used in openwedge osteotomy.

  9. Neural networks using broadband spectral discriminators reduce illumination required for broccoli identification in weedy fields

    NASA Astrophysics Data System (ADS)

    Hahn, Federico

    1996-03-01

    Statistical discriminant analysis and neural networks were used to prove that crop/weed/soil discrimination by optical reflectance is feasible. The wavelengths selected as neural-network inputs were ten nanometers wide, reducing the total radiation collected by the sensor. Spectral data collected from several farms with different weed populations were introduced to discriminant analysis. The best discriminant wavelengths were used to build a wavelength histogram, from which the three best spectral broadbands for broccoli/weed/soil discrimination were selected. The broadbands were analyzed using a new single-broadband discriminator index, the discriminative integration index (DII), and the DII values obtained were used to train a neural network. This paper introduces the index concept, its results, and its use for minimizing artificial lighting requirements with broadband spectral measurements for broccoli/weed/soil discrimination.

  10. Evaluation of orthognathic surgery on articular disc position and temporomandibular joint symptoms in skeletal class II patients: A Magnetic Resonance Imaging study.

    PubMed

    Firoozei, Gholamreza; Shahnaseri, Shirin; Momeni, Hasan; Soltani, Parisa

    2017-08-01

    The purpose of orthognathic surgery is to correct facial deformity and dental malocclusion and to obtain normal orofacial function. However, there is controversy over whether orthognathic surgery might have a negative influence on the temporomandibular (TM) joint. The purpose of this study was to evaluate the influence of orthognathic surgery on articular disc position and temporomandibular joint symptoms of skeletal Class II patients by means of magnetic resonance imaging. For this purpose, fifteen patients with skeletal Class II malocclusion, aged 19-32 years (mean 23 years), 10 women and 5 men, from the Isfahan Department of Oral and Maxillofacial Surgery were studied. All received LeFort I and bilateral sagittal split osteotomy (BSSO) osteotomies, and all patients received pre- and post-surgical orthodontic treatment. Magnetic resonance imaging was performed 1 day preoperatively and 3 months postoperatively. Descriptive statistics and Wilcoxon and McNemar tests were used for statistical analysis; p<0.05 was considered significant. Disc position ranged between 4.25 and 8.09 prior to surgery (mean=5.74±1.21); after surgery the disc position ranged from 4.36 to 7.40 (mean=5.65±1.06). Statistical analysis showed that although the TM disc tended to move anteriorly after BSSO surgery, this difference was not statistically significant (p>0.05). The findings of the present study revealed that orthognathic surgery does not alter the disc and condyle relationship; therefore, it has minimal effects on an intact and functional TM joint. Key words: Orthognathic surgery, skeletal class 2, magnetic resonance imaging, temporomandibular disc.

  11. Short-term monitoring of benzene air concentration in an urban area: a preliminary study of application of Kruskal-Wallis non-parametric test to assess pollutant impact on global environment and indoor.

    PubMed

    Mura, Maria Chiara; De Felice, Marco; Morlino, Roberta; Fuselli, Sergio

    2010-01-01

    In step with the need to develop statistical procedures for managing small environmental samples, in this work we used concentration values of benzene (C6H6), concurrently detected by seven outdoor and indoor monitoring stations over 12 000 minutes, to assess the representativeness of the collected data and the impact of the pollutant on the indoor environment. Clearly, the former issue is strictly connected to sampling-site geometry, which proves critical to correctly retrieving information from the analysis of pollutants of sanitary interest. Therefore, according to current criteria for network planning, the stations were interpreted as nodes of a set of adjoining triangles; then, (a) node pairs were examined to estimate pollutant stationarity along triangle sides, (b) node triplets to statistically associate air-monitoring data with the corresponding territory area, and (c) node sextuplets to assess, for each area, the probability of the outdoor pollutant impacting the indoor environment. The distributions from the various node combinations are all non-Gaussian; consequently, Kruskal-Wallis (KW) non-parametric statistics were used to test variability of the continuous density function for each pair, triplet, and sextuplet. The results of this statistical analysis showed that site selection was random, which did not allow reliable generalization of the monitoring data to the entire selected territory, except for a single "forced" case (70%); most importantly, the results suggest a possible procedure for optimizing network design.
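    The Kruskal-Wallis step can be sketched directly with SciPy. The node concentrations below are synthetic stand-ins for the benzene series, with one deliberately shifted node so that the test rejects homogeneity across the triangle.

    ```python
    import numpy as np
    from scipy.stats import kruskal

    rng = np.random.default_rng(0)
    # Hypothetical benzene concentrations (arbitrary units) at three triangle nodes
    node_a = rng.normal(3.0, 0.8, 60)
    node_b = rng.normal(3.1, 0.8, 60)
    node_c = rng.normal(5.5, 0.8, 60)  # outlying node: stationarity fails on its sides

    h, p = kruskal(node_a, node_b, node_c)
    # A small p-value rejects the hypothesis that all nodes share one distribution,
    # i.e. the pollutant cannot be treated as stationary over this triangle.
    ```
    
    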

  12. On the use of administrative databases to support planning activities: the case of the evaluation of neonatal case-mix in the Emilia-Romagna region using DRG and APR-DRG classification systems.

    PubMed

    Fantini, M P; Cisbani, L; Manzoli, L; Vertrees, J; Lorenzoni, L

    2003-06-01

    There are several versions of the Diagnosis Related Group (DRG) classification systems that are used for case-mix analysis, utilization review, prospective payment, and planning applications. The objective of this study was to assess the adequacy of two of these DRG systems--Medicare DRG and All Patient Refined DRG--for classifying neonatal patients. The first part of the paper contains a descriptive analysis that outlines the major differences between the two systems in terms of classification logic and the variables used in the assignment process. The second part examines the statistical performance of each system on the basis of administrative data collected in all public hospitals of the Emilia-Romagna region for neonates discharged in 1997 and 1998. The Medicare DRGs are less developed in terms of classification structure and yield a poorer statistical performance in terms of reduction in variance for length of stay. This is important because, for specific areas, a more refined system can prove useful at the regional level to remove systematic biases in the measurement of case-mix due to the structural characteristics of the Medicare DRG classification system.

  13. Automatic recognition of surface landmarks of anatomical structures of back and posture

    NASA Astrophysics Data System (ADS)

    Michoński, Jakub; Glinkowski, Wojciech; Witkowski, Marcin; Sitnik, Robert

    2012-05-01

    Faulty postures, scoliosis and sagittal plane deformities should be detected as early as possible to apply preventive and treatment measures against major clinical consequences. To support documentation of the severity of deformity and diminish x-ray exposures, several solutions utilizing analysis of back surface topography data were introduced. A novel approach to automatic recognition and localization of anatomical landmarks of the human back is presented that may provide more repeatable results and speed up the whole procedure. The algorithm was designed as a two-step process involving a statistical model built upon expert knowledge and analysis of three-dimensional back surface shape data. A Voronoi diagram is used to connect mean geometric relations, which provide a first approximation of the positions, with the surface curvature distribution, which further guides the recognition process and gives the final locations of landmarks. Positions obtained using the developed algorithms are validated with respect to the accuracy of manual landmark indication by experts. Preliminary validation proved that the landmarks were localized correctly, with accuracy depending mostly on the characteristics of a given structure. It was concluded that recognition should mainly take into account the shape of the back surface, putting as little emphasis on the statistical approximation as possible.

  14. Generalized statistical complexity measures: Geometrical and analytical properties

    NASA Astrophysics Data System (ADS)

    Martin, M. T.; Plastino, A.; Rosso, O. A.

    2006-09-01

    We discuss bounds on the values adopted by the generalized statistical complexity measures [M.T. Martin et al., Phys. Lett. A 311 (2003) 126; P.W. Lamberti et al., Physica A 334 (2004) 119] introduced by López Ruiz et al. [Phys. Lett. A 209 (1995) 321] and Shiner et al. [Phys. Rev. E 59 (1999) 1459]. Several new theorems are proved and illustrated with reference to the celebrated logistic map.

  15. [Intranarcotic infusion therapy -- a computer interpretation using the program package SPSS (Statistical Package for the Social Sciences)].

    PubMed

    Link, J; Pachaly, J

    1975-08-01

    In a retrospective 18-month study, the infusion therapy applied in a large anesthesia institute was examined. The data on the course of anesthesia, routinely recorded on magnetic tape, were analysed for this purpose by computer with the statistical program SPSS. It could be shown that the behaviour of individual anesthetists differs considerably. Various correlations are discussed.

  16. Effect of cotton bollworm (Helicoverpa armigera Hübner) caused injury on maize grain content, especially regarding to the protein alteration.

    PubMed

    Keszthelyi, S; Pál-Fám, F; Kerepesi, I

    2011-03-01

    The cotton bollworm (Helicoverpa armigera Hübner), which has migrated into the Carpathian Basin from the Mediterranean in recent decades, is becoming an increasingly serious problem for maize producers in Hungary. In several regions the damage it causes has reached the threshold of economic loss, especially in sweet maize cultivation. The aim of the research was to determine the changes in ear weight and the in-kernel accumulation and alteration in the grain as a function of cotton bollworm mastication. Our investigation confirmed that cotton bollworm feeding changes the in-kernel composition and protein pattern of maize grain. Our results proved significant damage to each part of the ear by cotton bollworm mastication (average weight loss of ears: 13.99%; of grains: 14.03%; of cobs: 13.74%), with the exception of an increase in the grain-cob ratio. Our examinations did not prove water loss - that is, "forced maturing" - caused by the damage. Decreases in raw fat (control: 2.8%; part-damaged: 2.6%; damaged: 2.4%) and starch content (control: 53.1%; part-damaged: 46.6%; damaged: 44.7%) were registered as a function of injury. In contrast, the raw protein content was increased (control: 4.7%; part-damaged: 5.3%; damaged: 7.4%) by ear mastication. The most conspicuous effect on protein composition was shown by SDS-PAGE comparison of damaged grain samples. Increased amounts of proteins of 114, 50, 46, and 35 kDa molecular mass were detected, which explains the more than 50% elevation in raw protein content. The statistical analysis of molecular weights also proved protein realignment as a function of the pest injuries.

  17. Health inequalities among rural and urban population of Eastern Poland in the context of sustainable development.

    PubMed

    Pantyley, Viktoriya

    2017-09-21

    The primary goals of the study were a critical analysis of the concepts associated with health from the perspective of sustainable development, and an empirical analysis of health and health-related issues among the rural and urban residents of Eastern Poland in the context of the sustainable development of the region. The study was based on the following research methods: a systemic approach; selection and analysis of the literature and statistical data; development of a special questionnaire concerning socio-economic and health inequalities among the population of the studied area; and field research with an interview questionnaire conducted on randomly selected respondents (N=1,103) in randomly selected areas of the Lubelskie, Podkarpackie, Podlaskie, and eastern Mazowieckie Provinces (divided into provincial capital cities, county capital cities, other cities, and rural areas). Statistical surveys of the studied area using the chi-square test and contingency coefficients indicated a correlation between the state of health and the following independent variables: age, quality of life, social position, and financial situation (Pearson's C coefficient above 0.300); a statistically significant yet weak correlation was recorded for gender, household size, place of residence, and amount of free time. The analysis proved the existence of a huge gap between the states of health of the urban and rural populations. In order to eliminate unfavourable differences in the state of health among the residents of Eastern Poland, and to provide equal, sustainable development in the urban and rural parts of the examined areas, special preventive programmes aimed at the residents of peripheral, marginalized rural areas should be implemented. These programmes should pay attention to preventive measures, early diagnosis of basic civilization and social diseases, and better accessibility of medical services for residents.

  18. Contrast-Enhanced Ultrasonography in Differential Diagnosis of Benign and Malignant Ovarian Tumors

    PubMed Central

    Qiao, Jing-Jing; Yu, Jing; Yu, Zhe; Li, Na; Song, Chen; Li, Man

    2015-01-01

    Objective To evaluate the accuracy of contrast-enhanced ultrasonography (CEUS) in differential diagnosis of benign and malignant ovarian tumors. Methods The scientific literature databases PubMed, Cochrane Library and CNKI were comprehensively searched for studies relevant to the use of CEUS technique for differential diagnosis of benign and malignant ovarian cancer. Pooled summary statistics for specificity (Spe), sensitivity (Sen), positive and negative likelihood ratios (LR+/LR−), and diagnostic odds ratio (DOR) and their 95%CIs were calculated. Software for statistical analysis included STATA version 12.0 (Stata Corp, College Station, TX, USA) and Meta-Disc version 1.4 (Universidad Complutense, Madrid, Spain). Results Following a stringent selection process, seven high quality clinical trials were found suitable for inclusion in the present meta-analysis. The 7 studies contained a combined total of 375 ovarian cancer patients (198 malignant and 177 benign). Statistical analysis revealed that CEUS was associated with the following performance measures in differential diagnosis of ovarian tumors: pooled Sen was 0.96 (95%CI = 0.92∼0.98); the summary Spe was 0.91 (95%CI = 0.86∼0.94); the pooled LR+ was 10.63 (95%CI = 6.59∼17.17); the pooled LR− was 0.04 (95%CI = 0.02∼0.09); and the pooled DOR was 241.04 (95% CI = 92.61∼627.37). The area under the SROC curve was 0.98 (95% CI = 0.20∼1.00). Lastly, publication bias was not detected (t = −0.52, P = 0.626) in the meta-analysis. Conclusions Our results revealed the high clinical value of CEUS in differential diagnosis of benign and malignant ovarian tumors. Further, CEUS may also prove to be useful in differential diagnosis at early stages of this disease. PMID:25764442
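    As a reminder of how the pooled measures above are built up from study-level counts, the snippet below derives Sen, Spe, LR+, LR− and the DOR from one hypothetical 2x2 table; the counts are invented for illustration, not taken from the seven included trials.

    ```python
    # Hypothetical 2x2 counts for one study: true/false positives and negatives
    tp, fp, fn, tn = 95, 16, 4, 85

    sen = tp / (tp + fn)        # sensitivity
    spe = tn / (tn + fp)        # specificity
    lr_pos = sen / (1 - spe)    # positive likelihood ratio
    lr_neg = (1 - sen) / spe    # negative likelihood ratio
    dor = lr_pos / lr_neg       # diagnostic odds ratio, equal to (tp*tn)/(fp*fn)
    ```

    In the meta-analysis these per-study measures are then pooled across trials (e.g. with a random-effects model), which is what yields the summary estimates and 95% CIs reported above.
    
    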

  19. [Occupational hearing loss--problem of health and safety].

    PubMed

    Denisov, É I; Adeninskaia, E E; Eremin, A L; Kur'erov, N N

    2014-01-01

    On the basis of a literature review, a critical analysis of the recommendations (letter of the Ministry of Health of Russia of 6/11/2012 N 14-1/10/2-3508) on the assessment of occupational noise-induced hearing loss (HL) is presented. The need for stricter HL assessment criteria for workers than for the general population, in accordance with the ICF (WHO, 2001), is argued in order to avoid growth in accident and injury rates. The illegitimacy of deducting statistical presbycusis values from individual audiograms is stressed as a human rights violation. Some terminological defects are noted. It is necessary to cancel the recommendations and to develop sanitary norms or a state standard with a hearing conservation program for the workplace.

  20. A new approach for biological online testing of stack gas condensate from municipal waste incinerators.

    PubMed

    Elsner, Dorothea; Fomin, Anette

    2002-01-01

    A biological testing system for the monitoring of stack gas condensates of municipal waste incinerators has been developed using Euglena gracilis as a test organism. The motility, velocity and cellular form of the organisms were the endpoints, calculated by an image analysis system. All endpoints showed statistically significant changes in a short time when organisms were exposed to samples collected during combustion situations with increased pollutant concentrations. The velocity of the organisms proved to be the most appropriate endpoint. A semi-continuous system with E. gracilis for monitoring stack gas condensate is proposed, which could result in an online system for testing stack gas condensates in the future.

  1. Using Self-Reflection To Increase Science Process Skills in the General Chemistry Laboratory

    NASA Astrophysics Data System (ADS)

    Veal, William R.; Taylor, Dawne; Rogers, Amy L.

    2009-03-01

    Self-reflection is a tool of instruction that has been used in the science classroom. Research has shown great promise in using video as a learning tool in the classroom. However, the integration of self-reflective practice using video in the general chemistry laboratory to help students develop process skills has not been done. Immediate video feedback and direct instruction were employed in a general chemistry laboratory course to improve students' mastery and understanding of basic and advanced process skills. Qualitative results and statistical analysis of quantitative data proved that self-reflection significantly helped students develop basic and advanced process skills, yet did not seem to influence the general understanding of the science content.

  2. Statistical distribution of amino acid sequences: a proof of Darwinian evolution.

    PubMed

    Eitner, Krystian; Koch, Uwe; Gaweda, Tomasz; Marciniak, Jedrzej

    2010-12-01

    The article presents counts of amino acids, dipeptides, and tripeptides for all proteins available in the UNIPROT-TREMBL database, as well as for selected species and enzymes. UNIPROT-TREMBL contains protein sequences associated with computationally generated annotations and large-scale functional characterization. Owing to the distinct metabolic pathways of amino acid synthesis and their physicochemical properties, the quantities of subpeptides in proteins vary. We have proved that the distribution of amino acids, dipeptides, and tripeptides is statistical, which confirms that the model of evolutionary biodiversity development is subject to the theory of independent events. Interestingly, certain short peptide combinations occur relatively rarely or not at all. First, this supports the Darwinian theory of evolution; second, it opens up opportunities for designing pharmaceuticals among rarely represented short peptide combinations. Furthermore, an innovative approach to the mass analysis of bioinformatic data is presented. eitner@amu.edu.pl Supplementary data are available at Bioinformatics online.

  3. Impact of the buildings areas on the fire incidence.

    PubMed

    Srekl, Jože; Golob, Janvit

    2010-03-01

    A survey of statistical studies shows that the probability of fire is expressed by the equation P(A) = K·A^α, where A is the total floor area of the building and K and α are constants for an individual group or risk category. This equation, which is based on statistical data on fires in Great Britain, does not include impact factors such as the number of employees and the activities carried out in the buildings. In order to find possible correlations between the activities carried out in buildings, the characteristics of the buildings, and the number of fires, we used a random sample of 134 buildings, including industrial facilities, hotels, restaurants, warehouses, and shopping malls. Our study shows that the floor area of a building has a low impact on the incidence of fires. After analysing the sample of buildings using multivariate analysis, we proved a correlation between the number of fires, the floor area, the daily work-operation period, and the number of employees.
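    The power-law form P(A) = K·A^α is linear in log-log space, so K and α can be recovered by ordinary least squares. The area/probability pairs below are synthetic, generated from assumed constants purely to show the fitting mechanics.

    ```python
    import numpy as np

    # Hypothetical (floor area m^2, annual fire probability) pairs for one risk category
    areas = np.array([200.0, 500.0, 1000.0, 2000.0, 5000.0])
    true_K, true_alpha = 1e-4, 0.9
    probs = true_K * areas ** true_alpha

    # P(A) = K * A**alpha  <=>  log P = log K + alpha * log A
    alpha, log_K = np.polyfit(np.log(areas), np.log(probs), 1)
    K = np.exp(log_K)  # recovers the assumed constants on this noise-free data
    ```
    
    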

  4. Efficacy of cevimeline vs. pilocarpine in the secretion of saliva: a pilot study.

    PubMed

    Brimhall, Jae; Jhaveri, Malhar A; Yepes, Juan F

    2013-01-01

    To determine the efficacy and compare the side-effects of cevimeline and pilocarpine in the secretion of saliva in patients with xerostomia, a randomized, cross-over, double-blind study was designed. Fifteen patients with a diagnosis of xerostomia were assigned to take either 5 mg of pilocarpine or 30 mg of cevimeline three times a day for four weeks. Salivary flow rates were measured at the initial baseline and at the first- and second-month appointments. Statistical analysis was carried out with ANOVA and post hoc t-tests. Twelve patients completed both medication treatments. Although both medications proved to increase salivary secretion, there was no significant difference between pilocarpine and cevimeline, and the perceived side-effects of the two medications were similar. Both medications increased the secretion of saliva at the end of four weeks; there was a slightly higher increment in saliva with pilocarpine, but the difference was not statistically significant. ©2013 Special Care Dentistry Association and Wiley Periodicals, Inc.

  5. Comparative Study on the Efficacy of Gingival Retraction using Polyvinyl Acetate Strips and Conventional Retraction Cord - An in Vivo Study.

    PubMed

    Shivasakthy, M; Asharaf Ali, Syed

    2013-10-01

    A new material, in the form of strips, has been proposed in dentistry for producing gingival retraction, but its clinical efficacy remains untested. This study aimed to determine whether polyvinyl acetate strips can effectively displace the gingival tissues in comparison with the conventional retraction cord. Complete metal-ceramic preparation with a supra-gingival margin was performed on fourteen maxillary incisors, and gingival retraction was performed using Merocel strips and conventional retraction cords alternately at a 2-week interval. The amount of displacement was compared using a digital vernier caliper with 0.01 mm accuracy, and the results were analyzed statistically using the paired Student's t-test. The analysis revealed that both the conventional retraction cord and the Merocel strip produce significant retraction; of the two materials, Merocel proved significantly more effective. The Merocel strip produces more gingival displacement than the conventional retraction cord.

  6. Comparative Study on the Efficacy of Gingival Retraction using Polyvinyl Acetate Strips and Conventional Retraction Cord – An in Vivo Study

    PubMed Central

    Shivasakthy, M.; Asharaf Ali, Syed

    2013-01-01

    Statement of Problem: A new material is proposed in dentistry in the form of strips for producing gingival retraction. The clinical efficacy of the material remains untested. Purpose of the Study: This study aimed to determine whether polyvinyl acetate strips are able to effectively displace the gingival tissues in comparison with the conventional retraction cord. Material and Methods: Complete metal-ceramic preparation with a supra-gingival margin was performed on fourteen maxillary incisors, and gingival retraction was performed using Merocel strips and conventional retraction cords alternately at a 2-week interval. The amount of displacement was compared using a digital vernier caliper with 0.01 mm accuracy. Results were analyzed statistically using the paired Student's t-test. Results: The statistical analysis of the data revealed that both the conventional retraction cord and the Merocel strip produce significant retraction. Of the two materials, Merocel proved to be significantly more effective. Conclusion: The Merocel strip produces more gingival displacement than the conventional retraction cord. PMID:24298531

  7. Impact analysis of two kinds of failure strategies in Beijing road transportation network

    NASA Astrophysics Data System (ADS)

    Zhang, Zundong; Xu, Xiaoyang; Zhang, Zhaoran; Zhou, Huijuan

    The Beijing road transportation network (BRTN), as a large-scale technological network, exhibits very complex behaviour during daily operation, and how its statistical characteristics (e.g. average path length and global network efficiency) change as the network evolves has been widely studied. In this paper, using different modeling concepts, three network models of the BRTN are constructed from topological data and real detected flow data: an abstract network model, a static network model weighted by road mileage, and a dynamic network model weighted by travel time. The degree distributions of the three models are analyzed, which shows that the urban road infrastructure network and the dynamic network behave like scale-free networks. Analysis and comparison of the important statistical characteristics of the three models under random and intentional attacks show that the urban road infrastructure network and the dynamic network of the BRTN are both robust and vulnerable.

  8. Does objective cluster analysis serve as a useful precursor to seasonal precipitation prediction at local scale? Application to western Ethiopia

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Moges, Semu; Block, Paul

    2018-01-01

    Prediction of seasonal precipitation can provide actionable information to guide management of various sectoral activities. For instance, it is often translated into hydrological forecasts for better water resources management. However, many studies assume homogeneity in precipitation across an entire study region, which may prove ineffective for operational and local-level decisions, particularly for locations with high spatial variability. This study proposes advancing local-level seasonal precipitation predictions by first conditioning on regional-level predictions, as defined through objective cluster analysis, for western Ethiopia. To our knowledge, this is the first study predicting seasonal precipitation at high resolution in this region, where lives and livelihoods are vulnerable to precipitation variability given the high reliance on rain-fed agriculture and limited water resources infrastructure. The combination of objective cluster analysis, spatially high-resolution prediction of seasonal precipitation, and a modeling structure spanning statistical and dynamical approaches makes clear advances in prediction skill and resolution, as compared with previous studies. The statistical model improves versus the non-clustered case or dynamical models for a number of specific clusters in northwestern Ethiopia, with clusters having regional average correlation and ranked probability skill score (RPSS) values of up to 0.5 and 33 %, respectively. The general skill (after bias correction) of the two best-performing dynamical models over the entire study region is superior to that of the statistical models, although the dynamical models issue predictions at a lower resolution and the raw predictions require bias correction to guarantee comparable skills.

  9. Brownian motion or Lévy walk? Stepping towards an extended statistical mechanics for animal locomotion

    PubMed Central

    Gautestad, Arild O.

    2012-01-01

    Animals moving under the influence of spatio-temporal scaling and long-term memory generate a kind of space-use pattern that has proved difficult to model within a coherent theoretical framework. An extended kind of statistical mechanics is needed, accounting for both the effects of spatial memory and scale-free space use, and put into a context of ecological conditions. Simulations illustrating the distinction between scale-specific and scale-free locomotion are presented. The results show how observational scale (time lag between relocations of an individual) may critically influence the interpretation of the underlying process. In this respect, a novel protocol is proposed as a method to distinguish between some main movement classes. For example, the ‘power law in disguise’ paradox—from a composite Brownian motion consisting of a superposition of independent movement processes at different scales—may be resolved by shifting the focus from pattern analysis at one particular temporal resolution towards a more process-oriented approach involving several scales of observation. A more explicit consideration of system complexity within a statistical mechanical framework, supplementing the more traditional mechanistic modelling approach, is advocated. PMID:22456456

  10. Fractal planetary rings: Energy inequalities and random field model

    NASA Astrophysics Data System (ADS)

    Malyarenko, Anatoliy; Ostoja-Starzewski, Martin

    2017-12-01

    This study is motivated by a recent observation, based on photographs from the Cassini mission, that Saturn’s rings have a fractal structure in radial direction. Accordingly, two questions are considered: (1) What Newtonian mechanics argument in support of such a fractal structure of planetary rings is possible? (2) What kinematics model of such fractal rings can be formulated? Both challenges are based on taking planetary rings’ spatial structure as being statistically stationary in time and statistically isotropic in space, but statistically nonstationary in space. An answer to the first challenge is given through an energy analysis of circular rings having a self-generated, noninteger-dimensional mass distribution [V. E. Tarasov, Int. J. Mod. Phys. B 19, 4103 (2005)]. The second issue is approached by taking the random field of angular velocity vector of a rotating particle of the ring as a random section of a special vector bundle. Using the theory of group representations, we prove that such a field is completely determined by a sequence of continuous positive-definite matrix-valued functions defined on the Cartesian square F² of the radial cross-section F of the rings, where F is a fat fractal.

  11. K-nearest neighbors based methods for identification of different gear crack levels under different motor speeds and loads: Revisited

    NASA Astrophysics Data System (ADS)

    Wang, Dong

    2016-03-01

    Gears are the most commonly used components in mechanical transmission systems. Their failures may cause transmission system breakdown and result in economic loss. Identification of different gear crack levels is important to prevent any unexpected gear failure, because gear cracks lead to gear tooth breakage. Signal processing based methods mainly require expertise to explain gear fault signatures, which is usually not easily achieved by ordinary users. In order to automatically identify different gear crack levels, intelligent gear crack identification methods should be developed. Previous case studies experimentally proved that K-nearest neighbors based methods exhibit high prediction accuracies for identification of 3 different gear crack levels under different motor speeds and loads. In this short communication, to further enhance the prediction accuracies of existing K-nearest neighbors based methods and to extend identification from 3 to 5 different gear crack levels, redundant statistical features are constructed by using the Daubechies 44 (db44) binary wavelet packet transform at different wavelet decomposition levels, prior to the use of a K-nearest neighbors method. The dimensionality of the redundant statistical features is 620, which provides richer gear fault signatures. Since many of these statistical features are redundant and highly correlated with each other, dimensionality reduction is conducted to obtain new significant statistical features. Finally, the K-nearest neighbors method is used to identify 5 different gear crack levels under different motor speeds and loads. A case study including 3 experiments is investigated to demonstrate that the developed method provides higher prediction accuracies than the existing K-nearest neighbors based methods for recognizing different gear crack levels under different motor speeds and loads. Based on the new significant statistical features, some other popular statistical models, including linear discriminant analysis, quadratic discriminant analysis, classification and regression tree and naive Bayes classifier, are compared with the developed method. The results show that the developed method has the highest prediction accuracies among these statistical models. Additionally, selection of the number of new significant features and parameter selection of K-nearest neighbors are thoroughly investigated.
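    A minimal sketch of the K-nearest neighbors classification step, using synthetic stand-in features (the paper's actual inputs are 620 wavelet-packet statistics from a db44 transform, reduced before classification; the cluster layout below is hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in features: five crack levels as Gaussian clusters.
n_per_class, n_feat, n_classes = 40, 8, 5
X_parts, y_parts = [], []
for c in range(n_classes):
    X_parts.append(rng.normal(loc=c, scale=0.5, size=(n_per_class, n_feat)))
    y_parts.append(np.full(n_per_class, c))
X, y = np.vstack(X_parts), np.concatenate(y_parts)

def knn_predict(X_train, y_train, X_test, k=5):
    """Majority vote among the k nearest training samples (Euclidean)."""
    d = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
    nearest = np.argsort(d, axis=1)[:, :k]
    return np.array([np.bincount(y_train[idx]).argmax() for idx in nearest])

# Hold out 50 samples for testing
test_idx = rng.choice(len(X), size=50, replace=False)
train_mask = np.ones(len(X), dtype=bool)
train_mask[test_idx] = False
pred = knn_predict(X[train_mask], y[train_mask], X[test_idx])
acc = float(np.mean(pred == y[test_idx]))
print(acc)
```

    With well-separated features the vote is usually unanimous, which is why the quality of the extracted statistical features matters more than the value of k.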

  12. Different mathematical processing of absorption, ratio and derivative spectra for quantification of mixtures containing minor component: An application to the analysis of the recently co-formulated antidiabetic drugs; canagliflozin and metformin

    NASA Astrophysics Data System (ADS)

    Lotfy, Hayam M.; Mohamed, Dalia; Elshahed, Mona S.

    2018-01-01

    In the presented work, several spectrophotometric methods were performed for the quantification of canagliflozin (CGZ) and metformin hydrochloride (MTF) simultaneously in their binary mixture. Two of these methods, response correlation (RC) and advanced balance point-spectrum subtraction (ABP-SS), were developed and introduced for the first time in this work, where the latter method (ABP-SS) was performed on both the zero order and the first derivative spectra of the drugs. In addition, two recently established methods, advanced amplitude modulation (AAM) and advanced absorbance subtraction (AAS), were also accomplished. All the proposed methods were validated in accordance with the ICH guidelines, where all methods were proved to be accurate and precise. Additionally, the linearity range, limit of detection and limit of quantification were determined, and the selectivity was examined through the analysis of laboratory-prepared mixtures and the combined dosage form of the drugs. The proposed methods were capable of determining the two drugs in the ratio present in the pharmaceutical formulation CGZ:MTF (1:17) without the requirement of any preliminary separation, further dilution or standard spiking. The results obtained by the proposed methods were in compliance with the reported chromatographic method when compared statistically, proving the absence of any significant difference in accuracy and precision between the proposed and reported methods.

  13. Testing the interaction between analytical modules: an example with Roundup Ready® soybean line GTS 40-3-2

    PubMed Central

    2010-01-01

    Background The modular approach to analysis of genetically modified organisms (GMOs) relies on the independence of the modules combined (i.e. DNA extraction and GM quantification). The validity of this assumption has to be proved on the basis of specific performance criteria. Results An experiment was conducted using, as a reference, the validated quantitative real-time polymerase chain reaction (PCR) module for detection of glyphosate-tolerant Roundup Ready® GM soybean (RRS). Different DNA extraction modules (CTAB, Wizard and Dellaporta) were used to extract DNA from different food/feed matrices (feed, biscuit and certified reference material [CRM 1%]) containing the target of the real-time PCR module used for validation. Purity and structural integrity (absence of inhibition) were used as basic criteria that a DNA extraction module must satisfy in order to provide suitable template DNA for quantitative real-time (RT) PCR-based GMO analysis. When performance criteria were applied (removal of non-compliant DNA extracts), the independence of GMO quantification from the extraction method and matrix was statistically proved, except in the case of Wizard applied to biscuit. A fuzzy logic-based procedure also confirmed the relatively poor performance of the Wizard/biscuit combination. Conclusions For RRS, this study recognises that modularity can be generally accepted, with the limitation of avoiding combining highly processed material (i.e. biscuit) with a magnetic-beads system (i.e. Wizard). PMID:20687918

  14. Modalities, Relations, and Learning

    NASA Astrophysics Data System (ADS)

    Müller, Martin Eric

    While the popularity of statistical, probabilistic and exhaustive machine learning techniques continues to increase, relational and logic approaches remain a niche market in research. Whereas the former approaches focus on predictive accuracy, the latter prove indispensable in knowledge discovery.

  15. Automatic brain tumor detection in MRI: methodology and statistical validation

    NASA Astrophysics Data System (ADS)

    Iftekharuddin, Khan M.; Islam, Mohammad A.; Shaik, Jahangheer; Parra, Carlos; Ogg, Robert

    2005-04-01

    Automated brain tumor segmentation and detection are immensely important in medical diagnostics because they provide information on anatomical structures as well as potential abnormal tissue needed to delineate appropriate surgical planning. In this work, we propose a novel automated brain tumor segmentation technique based on multiresolution texture information that combines fractal Brownian motion (fBm) and wavelet multiresolution analysis. Our wavelet-fractal technique combines the excellent multiresolution localization property of wavelets with the texture-extraction capability of fractals. We prove the efficacy of our technique by successfully segmenting pediatric brain MR images (MRIs) from St. Jude Children's Research Hospital. We use a self-organizing map (SOM) as our clustering tool, wherein we exploit both pixel intensity and multiresolution texture features to obtain the segmented tumor. Our test results show that our technique successfully segments abnormal brain tissues in a set of T1 images. In the next step, we design a classifier using a feed-forward (FF) neural network to statistically validate the presence of tumor in MRI using both the multiresolution texture and the pixel intensity features. We estimate the corresponding receiver operating characteristic (ROC) curve from the true positive fractions and false positive fractions produced by our classifier at different threshold values. An ROC curve, which can be considered a gold standard for proving the competence of a classifier, is obtained to ascertain the sensitivity and specificity of our classifier. We observe that at threshold 0.4 we achieve a true positive value of 1.0 (100%) while sacrificing only a 0.16 (16%) false positive value for the set of 50 T1 MRIs analyzed in this experiment.
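    The threshold sweep behind the ROC analysis can be sketched as follows (hypothetical classifier scores on 50 images; the study's own features and network are not reproduced):

```python
import numpy as np

def roc_point(scores, labels, threshold):
    """True positive and false positive fractions at one threshold."""
    pred = scores >= threshold
    tpf = float(np.mean(pred[labels == 1]))  # sensitivity
    fpf = float(np.mean(pred[labels == 0]))  # 1 - specificity
    return tpf, fpf

# Hypothetical classifier outputs for 50 images (label 1 = tumor present):
# tumor cases score high on average, non-tumor cases lower.
rng = np.random.default_rng(1)
labels = rng.integers(0, 2, size=50)
scores = np.clip(labels * 0.5 + rng.normal(0.3, 0.2, size=50), 0.0, 1.0)
tpf, fpf = roc_point(scores, labels, 0.4)
print(tpf, fpf)
```

    Sweeping the threshold from 0 to 1 and plotting (fpf, tpf) pairs traces the full ROC curve; the abstract's reported operating point corresponds to one such threshold (0.4).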

  16. Validation of the Hospital Ethical Climate Survey for older people care.

    PubMed

    Suhonen, Riitta; Stolt, Minna; Katajisto, Jouko; Charalambous, Andreas; Olson, Linda L

    2015-08-01

    The exploration of the ethical climate in care settings for older people is highlighted in the literature, and it has been associated with various aspects of clinical practice and nurses' jobs. However, ethical climate is seldom studied in the older people care context. Valid, reliable, feasible measures are needed for the measurement of ethical climate. This study aimed to test the reliability, validity, and sensitivity of the Hospital Ethical Climate Survey in healthcare settings for older people. A non-experimental cross-sectional study design was employed, and a survey using questionnaires, including the Hospital Ethical Climate Survey, was used for data collection. Data were analyzed using descriptive statistics, inferential statistics, and multivariable methods. Survey data were collected from a sample of nurses working in care settings for older people in Finland (N = 1513, n = 874, response rate = 58%) in 2011. This study was conducted according to good scientific inquiry guidelines, and ethical approval was obtained from the university ethics committee. The mean score for the Hospital Ethical Climate Survey total was 3.85 (standard deviation = 0.56). Cronbach's alpha was 0.92. Principal component analysis provided evidence for factorial validity. LISREL provided evidence for construct validity based on goodness-of-fit statistics. Pearson's correlations of 0.68-0.90 were found between the sub-scales and the Hospital Ethical Climate Survey. The Hospital Ethical Climate Survey proved to be a valid and reliable tool for measuring ethical climate in care settings for older people, and sensitive enough to reveal variations across various clinical settings. The Finnish version of the Hospital Ethical Climate Survey, previously used mainly in hospital settings, proved to be a valid instrument for use in care settings for older people. Further studies are needed to analyze the factor structure and some items of the Hospital Ethical Climate Survey. © The Author(s) 2014.
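    The reported Cronbach's alpha can be illustrated with a minimal computation on toy Likert-type data (hypothetical scores driven by one latent factor, not the survey data):

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Toy data: six items that all reflect one latent trait yield high alpha.
rng = np.random.default_rng(7)
latent = rng.normal(size=(200, 1))
items = latent + rng.normal(scale=0.5, size=(200, 6))
alpha = cronbach_alpha(items)
print(round(alpha, 2))
```

    Alpha rises as items covary more strongly relative to their individual variances, which is why a value of 0.92 indicates high internal consistency.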

  17. TRANSIT TIMING OBSERVATIONS FROM KEPLER. VI. POTENTIALLY INTERESTING CANDIDATE SYSTEMS FROM FOURIER-BASED STATISTICAL TESTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steffen, Jason H.; Ford, Eric B.; Rowe, Jason F.

    2012-09-10

    We analyze the deviations of transit times from a linear ephemeris for the Kepler Objects of Interest (KOI) through quarter six of science data. We conduct two statistical tests for all KOIs and a related statistical test for all pairs of KOIs in multi-transiting systems. These tests identify several systems which show potentially interesting transit timing variations (TTVs). Strong TTV systems have been valuable for the confirmation of planets and their mass measurements. Many of the systems identified in this study should prove fruitful for detailed TTV studies.

  18. Transit Timing Observations from Kepler: VII. Potentially interesting candidate systems from Fourier-based statistical tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steffen, Jason H.; /Fermilab; Ford, Eric B.

    2012-01-01

    We analyze the deviations of transit times from a linear ephemeris for the Kepler Objects of Interest (KOI) through Quarter six (Q6) of science data. We conduct two statistical tests for all KOIs and a related statistical test for all pairs of KOIs in multi-transiting systems. These tests identify several systems which show potentially interesting transit timing variations (TTVs). Strong TTV systems have been valuable for the confirmation of planets and their mass measurements. Many of the systems identified in this study should prove fruitful for detailed TTV studies.
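    A minimal sketch of a Fourier-based TTV test of this kind (synthetic transit times, not Kepler data; period, epoch and TTV amplitude below are hypothetical): fit a linear ephemeris, then inspect the amplitude spectrum of the O-C residuals:

```python
import numpy as np

# Synthetic transit times: linear ephemeris plus a sinusoidal TTV signal.
n_transits = 256
period, t0 = 10.0, 5.0                        # days (hypothetical)
epochs = np.arange(n_transits)
ttv = 0.01 * np.sin(2 * np.pi * epochs / 32)  # 0.01-day TTV, 32-epoch cycle
times = t0 + period * epochs + ttv

# O-C residuals against the best-fit linear ephemeris
coeffs = np.polyfit(epochs, times, 1)
resid = times - np.polyval(coeffs, epochs)

# Fourier amplitude spectrum of the residuals
amp = np.abs(np.fft.rfft(resid)) / (n_transits / 2)
freqs = np.fft.rfftfreq(n_transits, d=1.0)    # cycles per epoch
peak_freq = freqs[np.argmax(amp[1:]) + 1]
print(peak_freq)  # 0.03125 = 1/32 cycles per epoch, the injected TTV period
```

    A significant peak in this spectrum, assessed against the noise level, is the kind of signature the statistical tests above are designed to flag.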

  19. Attitudes of Students of Medicine, University of Mostar According to Induced Abortion.

    PubMed

    Trninić, Zoran; Bender, Marija; Šutalo, Nikica; Kozomara, Davorin; Lasić, Valentina; Bevanda, Danijel; Galić, Gordan

    2017-12-01

    The aim of this study was to establish attitudes of medical students on induced abortion and the connection of those attitudes with religiousness, length of their studies, sex and various circumstances of pregnancy. In total, 148 students of the first, second, fifth and sixth year of the medical faculty participated in the research. The study was conducted at the Medical Faculty of the University in Mostar. While collecting the data, we used a survey taken over from the literature. The data were tested with adequate statistical methods afterwards. In total, 81.1% of students would perform an abortion under certain circumstances (χ²=57.189; P<0.001). Most students answered that they would perform an abortion in case a fetus had malformations (χ²=3.892; P=0.49) or if the mother's life were endangered (χ²=47.676; P<0.001). By comparison of students' readiness to perform an abortion under various circumstances of pregnancy depending on the length of medical education, a statistically significant difference was proved in the following circumstances: rape (χ²=6.097; P=0.014) and if the pregnancy would endanger the mother's mental health (χ²=4.488; P=0.034). Students with shorter medical education expressed more liberal attitudes in the above stated circumstances. By comparison of students' readiness to perform an abortion under various circumstances of pregnancy depending on religiousness, a statistically significant difference was proved in the following circumstances: 'abortion on demand', no matter the reason (χ²=11.908; P=0.012), teenage pregnancy (χ²=33.308; P<0.001) and if the pregnancy would interfere with the mother's career (χ²=35.897; P<0.001). Unreligious students expressed more liberal attitudes. Influence of length of medical education and sex on attitudes on abortion was not proved statistically. Impact of religiousness on that attitude cannot be commented on due to the very small share of unreligious students in the sample.
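    The first chi-square statistic quoted above can be reproduced as a goodness-of-fit test, assuming the reported 81.1% corresponds to 120 of 148 students and a 50:50 null split:

```python
import numpy as np

# 120 of 148 students (81.1%) would perform an abortion under some
# circumstances, tested against an expected 50:50 split.
observed = np.array([120, 28])
expected = np.full(2, observed.sum() / 2)
chi2 = float(np.sum((observed - expected) ** 2 / expected))
print(round(chi2, 3))  # 57.189, matching the reported statistic
```

    With 1 degree of freedom, a statistic of 57.189 corresponds to P < 0.001, consistent with the abstract.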

  20. Global hospital bed utilization crisis. A different approach.

    PubMed

    Waness, Abdelkarim; Akbar, Jalal U; Kharal, Mubashar; BinSalih, Salih; Harakati, Mohammed

    2010-04-01

    To test the effect of improved physician availability on hospital bed utilization. A prospective cohort study was conducted from 1st January 2009 to 31st March 2009 in the Division of Internal Medicine (DIM), King Abdul-Aziz Medical City (KAMC), Riyadh, Kingdom of Saudi Arabia. Two clinical teaching units (CTU) were compared head-to-head. Each CTU has 3 consultants. The CTU-control provides standard care, while the CTU-intervention was designed to provide better physician-consultant availability. Three outcomes were evaluated: patient outsourcing to another hospital, patient discharge during weekends, and overall admissions. Statistical analysis was carried out with the electronic statistics calculator from the Center for Evidence-Based Medicine. Three hundred and thirty-four patients were evaluated for admission at the Emergency Room by both CTUs. One hundred and eighty-three patients were seen by the CTU-control: 6 patients were outsourced, and 177 were admitted. One hundred and fifty-one patients were seen by the CTU-intervention: 39 of them were outsourced, and 112 were admitted. Forty-eight weekend patient discharges occurred during this period: 21 by the CTU-control, and 27 by the CTU-intervention. Odds-ratio analysis of both the outsourcing rate and weekend discharges showed statistical significance in favor of the intervention group. The continuous availability of a physician-consultant for patient admission evaluation, outsourcing, or discharge during regular weekdays and weekends at DIM, KAMC proved to have a positive impact on bed utilization.
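    From the counts reported above, the odds ratio for outsourcing under the intervention versus the control can be sketched as follows (the abstract does not state which method its calculator used; a Woolf-type confidence interval is assumed here):

```python
import math

# Counts reported above: outsourcing under CTU-intervention vs CTU-control.
outsourced_int, admitted_int = 39, 112   # CTU-intervention: 151 seen
outsourced_ctl, admitted_ctl = 6, 177    # CTU-control: 183 seen

odds_ratio = (outsourced_int / admitted_int) / (outsourced_ctl / admitted_ctl)

# Approximate 95% CI on the log scale (Woolf method)
se = math.sqrt(1/outsourced_int + 1/admitted_int + 1/outsourced_ctl + 1/admitted_ctl)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)
print(round(odds_ratio, 2), round(lo, 2), round(hi, 2))
```

    The interval excludes 1, consistent with the statistically significant difference in outsourcing the abstract reports.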

  1. Universal Algorithm for Identification of Fractional Brownian Motion. A Case of Telomere Subdiffusion

    PubMed Central

    Burnecki, Krzysztof; Kepten, Eldad; Janczura, Joanna; Bronshtein, Irena; Garini, Yuval; Weron, Aleksander

    2012-01-01

    We present a systematic statistical analysis of the recently measured individual trajectories of fluorescently labeled telomeres in the nucleus of living human cells. The experiments were performed in the U2OS cancer cell line. We propose an algorithm for identification of the telomere motion. By expanding the previously published data set, we are able to explore the dynamics in six time orders, a task not possible earlier. As a result, we establish a rigorous mathematical characterization of the stochastic process and identify the basic mathematical mechanisms behind the telomere motion. We find that the increments of the motion are stationary, Gaussian, ergodic, and even more chaotic—mixing. Moreover, the obtained memory parameter estimates, as well as the ensemble average mean square displacement reveal subdiffusive behavior at all time spans. All these findings statistically prove a fractional Brownian motion for the telomere trajectories, which is confirmed by a generalized p-variation test. Taking into account the biophysical nature of telomeres as monomers in the chromatin chain, we suggest polymer dynamics as a sufficient framework for their motion with no influence of other models. In addition, these results shed light on other studies of telomere motion and the alternative telomere lengthening mechanism. We hope that identification of these mechanisms will allow the development of a proper physical and biological model for telomere subdynamics. This array of tests can be easily implemented to other data sets to enable quick and accurate analysis of their statistical characteristics. PMID:23199912
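    The ensemble-average mean square displacement test mentioned above can be sketched on simulated trajectories (ordinary Brownian motion with H = 0.5 rather than the telomere data; for fractional Brownian motion the MSD scales as t^(2H)):

```python
import numpy as np

# Ensemble of simulated Brownian trajectories (H = 0.5).
rng = np.random.default_rng(3)
n_paths, n_steps = 500, 1024
paths = np.cumsum(rng.normal(size=(n_paths, n_steps)), axis=1)

# Ensemble-averaged MSD over a range of time lags
lags = np.array([1, 2, 4, 8, 16, 32, 64])
msd = np.array([np.mean((paths[:, lag:] - paths[:, :-lag]) ** 2) for lag in lags])

# Slope of log MSD vs log lag estimates 2H
slope = float(np.polyfit(np.log(lags), np.log(msd), 1)[0])
print(slope)  # ~1.0, i.e. H ~ 0.5
```

    Subdiffusion of the kind reported for telomeres would show a slope below 1 (2H < 1), which is what the memory-parameter estimates in the study capture.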

  2. Efficacy and Physicochemical Evaluation of an Optimized Semisolid Formulation of Povidone Iodine Proposed by Extreme Vertices Statistical Design; a Practical Approach

    PubMed Central

    Lotfipour, Farzaneh; Valizadeh, Hadi; Shademan, Shahin; Monajjemzadeh, Farnaz

    2015-01-01

    One of the most significant issues in pharmaceutical industries, prior to commercialization of a pharmaceutical preparation, is the "preformulation" stage. However, far too little attention has been paid to verification of software-assisted statistical designs in preformulation studies. The main aim of this study was to report a step-by-step preformulation approach for a semisolid preparation based on a statistical mixture design and to verify the predictions made by the software with an in-vitro efficacy bioassay test. An extreme vertices mixture design (4 factors, 4 levels) was applied for preformulation of a semisolid Povidone Iodine preparation as a water-removable ointment using different polyethylene glycols. Software-assisted (Minitab) analysis was then performed using four practically assessed response values: available iodine, viscosity (N index and yield value) and water absorption capacity. Subsequently, mixture analysis was performed and, finally, an optimized formulation was proposed. The efficacy of this formulation was bio-assayed using microbial tests in-vitro, and MIC values were calculated for Escherichia coli, Pseudomonas aeruginosa, Staphylococcus aureus and Candida albicans. Results indicated the acceptable conformity of the measured responses. Thus, it can be concluded that the proposed design had adequate power to predict the responses in practice. Stability studies proved no significant change during the one-year study for the optimized formulation. Efficacy was acceptable for all tested species, and in the case of Staphylococcus aureus the prepared semisolid formulation was even more effective. PMID:26664368

  3. Radiosonde and satellite observations of topographic flow off the Norwegian coast

    NASA Astrophysics Data System (ADS)

    Rugaard Furevik, Birgitte; Dagestad, Knut-Frode; Olafsson, Haraldur

    2015-04-01

    Winds in Norway are strongly affected by the complex topography, and in some areas the average wind speed in the fjords may exceed that on the coast. Such effects are revealed through a statistical analysis of wind speed derived from ~8500 Synthetic Aperture Radar (SAR) scenes covering the Norwegian coast. We have compared the results with modelled winds from the operational atmosphere model at MET (horizontal grid spacing of 2.5 km) and 3 years of measurements from "M/S Trollfjord", a ferry traversing a 2400 km coastal route between the cities of Bergen and Kirkenes. The analysis reveals many coastal details of the wind field not observed from the meteorological station network of Norway. The data set proves useful for verification of offshore winds in the model. High temporal resolution radiosonde winds from two locations are used to analyse the topographic effects.

  4. Cardiac data mining (CDM); organization and predictive analytics on biomedical (cardiac) data

    NASA Astrophysics Data System (ADS)

    Bilal, M. Musa; Hussain, Masood; Basharat, Iqra; Fatima, Mamuna

    2013-10-01

    Data mining and data analytics have been of immense importance to many different fields as we witness the evolution of data sciences over recent years. Biostatistics and medical informatics have proved to be the foundation of many modern biological theories and analysis techniques. These are the fields that apply data mining practices along with statistical models to discover hidden trends in data comprising biological experiments or procedures on different entities. The objective of this research study is to develop a system for the efficient extraction, transformation and loading of such data from cardiologic procedure reports given by the Armed Forces Institute of Cardiology. It also aims to devise a model for the predictive analysis and classification of this data into important classes as required by cardiologists all around the world. This includes predicting patient impressions and other important features.

  5. Comparative study on the selectivity of various spectrophotometric techniques for the determination of binary mixture of fenbendazole and rafoxanide.

    PubMed

    Saad, Ahmed S; Attia, Ali K; Alaraki, Manal S; Elzanfaly, Eman S

    2015-11-05

    Five different spectrophotometric methods were applied for simultaneous determination of fenbendazole and rafoxanide in their binary mixture; namely first derivative, derivative ratio, ratio difference, dual wavelength and H-point standard addition spectrophotometric methods. Different factors affecting each of the applied spectrophotometric methods were studied and the selectivity of the applied methods was compared. The applied methods were validated as per the ICH guidelines, and good accuracy, specificity and precision were proven within the concentration range of 5-50 μg/mL for both drugs. Statistical analysis using one-way ANOVA proved no significant differences among the proposed methods for the determination of the two drugs. The proposed methods successfully determined both drugs in laboratory-prepared and commercially available binary mixtures, and were found applicable for routine analysis in quality control laboratories. Copyright © 2015 Elsevier B.V. All rights reserved.
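    The one-way ANOVA comparison can be sketched by computing the F statistic directly (hypothetical recovery data for five methods; the paper's measurements are not reproduced):

```python
import numpy as np

def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA across the given groups."""
    all_data = np.concatenate(groups)
    grand = all_data.mean()
    k, N = len(groups), len(all_data)
    ssb = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)   # between
    ssw = sum(((g - g.mean()) ** 2).sum() for g in groups)        # within
    return (ssb / (k - 1)) / (ssw / (N - k))

# Hypothetical recovery data (%) from five methods with no true difference
rng = np.random.default_rng(5)
groups = [100 + rng.normal(0, 1.5, 6) for _ in range(5)]
F = float(one_way_anova_F(groups))
print(F)
```

    An F near 1 (relative to the F(4, 25) reference distribution) supports "no significant difference among methods", the conclusion the abstract reports.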

  6. Multi-tissue partial volume quantification in multi-contrast MRI using an optimised spectral unmixing approach.

    PubMed

    Collewet, Guylaine; Moussaoui, Saïd; Deligny, Cécile; Lucas, Tiphaine; Idier, Jérôme

    2018-06-01

    Multi-tissue partial volume estimation in MRI images is investigated from a viewpoint related to spectral unmixing as used in hyperspectral imaging. The main contribution of this paper is twofold. It firstly proposes a theoretical analysis of the statistical optimality conditions of the proportion estimation problem, which in the context of multi-contrast MRI data acquisition allows the imaging sequence parameters to be set appropriately. Secondly, an efficient proportion quantification algorithm based on the minimisation of a penalised least-square criterion incorporating a regularity constraint on the spatial distribution of the proportions is proposed. Furthermore, the resulting developments are discussed using empirical simulations. The practical usefulness of the spectral unmixing approach for partial volume quantification in MRI is illustrated through an application to food analysis on the proving of a Danish pastry. Copyright © 2018 Elsevier Inc. All rights reserved.
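    A minimal sketch of proportion estimation by penalised least squares (hypothetical tissue signatures; the paper's criterion uses a spatial regularity constraint across voxels rather than the simple ridge penalty assumed here):

```python
import numpy as np

# Rows: acquisition contrasts; columns: pure-tissue signal signatures
# (hypothetical values for a single voxel).
A = np.array([[1.0, 0.2, 0.1],
              [0.3, 1.0, 0.2],
              [0.1, 0.4, 1.0],
              [0.6, 0.5, 0.3]])
true_p = np.array([0.5, 0.3, 0.2])  # ground-truth tissue proportions
y = A @ true_p                      # observed multi-contrast signal

# Penalised least squares: (A^T A + lam*I) p = A^T y
lam = 1e-3  # small ridge penalty standing in for the regularity term
p_hat = np.linalg.solve(A.T @ A + lam * np.eye(3), A.T @ y)
print(p_hat)  # close to [0.5, 0.3, 0.2]
```

    Having more contrasts than tissues (4 rows, 3 columns here) is what makes the system over-determined; the paper's optimality analysis tells you which contrasts make this inversion well conditioned.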

  7. The Freight Transportation Services Index as a leading economic indicator

    DOT National Transportation Integrated Search

    2009-09-01

    The Bureau of Transportation Statistics (BTS) freight Transportation Services Index (TSI) showed a decline a full year and a half prior to the start of the current recession. This downturn suggests the TSI may prove particularly useful as ...

  8. STUDY OF TURBULENT ENERGY OVER COMPLEX TERRAIN: STATE, 1978

    EPA Science Inventory

    The complex structure of the earth's surface influenced atmospheric parameters pertinent to modeling the diffusion process during the 1978 'STATE' field study. The Information Theory approach of statistics proved useful for analyzing the complex structures observed in the radiome...

  9. Analysis of polymeric phenolics in red wines using different techniques combined with gel permeation chromatography fractionation.

    PubMed

    Guadalupe, Zenaida; Soldevilla, Alberto; Sáenz-Navajas, María-Pilar; Ayestarán, Belén

    2006-04-21

    A multiple-step analytical method was developed to improve the analysis of polymeric phenolics in red wines. With a common initial step based on the fractionation of wine phenolics by gel permeation chromatography (GPC), different analytical techniques were used: high-performance liquid chromatography-diode array detection (HPLC-DAD), HPLC-mass spectrometry (MS), capillary zone electrophoresis (CZE) and spectrophotometry. This method proved to be valid for analyzing different families of phenolic compounds, such as monomeric phenolics and their derivatives, polymeric pigments and proanthocyanidins. The analytical characteristics of fractionation by GPC were studied and the method was fully validated, yielding satisfactory statistical results. GPC fractionation substantially improved the analysis of polymeric pigments by CZE, in terms of response, repeatability and reproducibility. It also represented an improvement in the traditional vanillin assay used for proanthocyanidin (PA) quantification. Astringent proanthocyanidins were also analyzed using a simple combined method that allowed these compounds, for which only general indexes were available, to be quantified.

  10. Evidence of codon usage in the nearest neighbor spacing distribution of bases in bacterial genomes

    NASA Astrophysics Data System (ADS)

    Higareda, M. F.; Geiger, O.; Mendoza, L.; Méndez-Sánchez, R. A.

    2012-02-01

    Statistical analysis of whole genomic sequences usually assumes a homogeneous nucleotide density throughout the genome, an assumption that has been proved incorrect for several organisms, since the nucleotide density is only locally homogeneous. To avoid giving a single numerical value to this variable property, we propose the use of spectral statistics, which characterizes the density of nucleotides as a function of position in the genome. We show that the cumulative density of bases in bacterial genomes can be separated into an average (or secular) part plus a fluctuating part. Bacterial genomes can be divided into two groups according to the qualitative description of their secular part: linear and piecewise linear. These two groups of genomes show different properties when their nucleotide spacing distribution is studied. In order to analyze genomes with variable nucleotide density statistically, unfolding is necessary, i.e., separating the secular part from the fluctuations. The unfolding allows an adequate comparison with the statistical properties of other genomes. With this methodology, four genomes were analyzed: Burkholderia, Bacillus, Clostridium and Corynebacterium. Interestingly, the nearest neighbor spacing distributions or detrended distance distributions are very similar for species within the same genus, but they are very different for species from different genera. This difference can be attributed to the difference in codon usage.
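    The unfolding step can be sketched as follows (synthetic base positions with a smoothly varying density, not genomic data): fit a smooth secular trend to the cumulative count and map each position through it, so that the unfolded spacings have mean 1 and distributions from different genomes become comparable:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 5000
# Non-homogeneous positions along a (normalized) genome: denser toward the end.
positions = np.sort(rng.uniform(0, 1, n) ** 0.5)

# Secular part: smooth (cubic) fit to the cumulative count of bases.
counts = np.arange(1, n + 1)
coeffs = np.polyfit(positions, counts, 3)
unfolded = np.polyval(coeffs, positions)

# After unfolding, the mean nearest-neighbor spacing is ~1 by construction;
# the *shape* of the spacing distribution carries the genome's signature.
spacings = np.diff(unfolded)
print(round(float(spacings.mean()), 2))
```

    The choice of secular fit (linear vs piecewise linear in the abstract's classification) is exactly what distinguishes the two groups of bacterial genomes.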

  11. The effects of GeoGebra software on pre-service mathematics teachers' attitudes and views toward proof and proving

    NASA Astrophysics Data System (ADS)

    Zengin, Yılmaz

    2017-11-01

    The purpose of this study is to determine the effect of GeoGebra software on pre-service mathematics teachers' attitudes towards proof and proving and to determine pre-service teachers' pre- and post-views regarding proof. The study lasted nine weeks and the participants of the study consisted of 24 pre-service mathematics teachers. The study used the 'Attitude Scale Towards Proof and Proving' and an open-ended questionnaire that were administered before and after the intervention as data collection tools. Paired samples t-test analysis was used for the analysis of quantitative data and content and descriptive analyses were utilized for the analysis of qualitative data. As a result of the data analysis, it was determined that GeoGebra software was an effective tool in increasing pre-service teachers' attitudes towards proof and proving.

  12. Comment on the asymptotics of a distribution-free goodness of fit test statistic.

    PubMed

    Browne, Michael W; Shapiro, Alexander

    2015-03-01

    In a recent article Jennrich and Satorra (Psychometrika 78: 545-552, 2013) showed that a proof by Browne (British Journal of Mathematical and Statistical Psychology 37: 62-83, 1984) of the asymptotic distribution of a goodness of fit test statistic is incomplete because it fails to prove that the orthogonal component function employed is continuous. Jennrich and Satorra (Psychometrika 78: 545-552, 2013) showed how Browne's proof can be completed satisfactorily but this required the development of an extensive and mathematically sophisticated framework for continuous orthogonal component functions. This short note provides a simple proof of the asymptotic distribution of Browne's (British Journal of Mathematical and Statistical Psychology 37: 62-83, 1984) test statistic by using an equivalent form of the statistic that does not involve orthogonal component functions and consequently avoids all complicating issues associated with them.

  13. The effect of intravesical instillations with Hyaluronic Acid on sexual dysfunction in women with recurrent urinary tract infections (RUTI).

    PubMed

    Nightingale, Gemma; Shehab, Qasem; Kandiah, Chandrakumaran; Rush, Lorraine; Rowe-Jones, Clare; Phillips, Christian H

    2018-02-01

    To determine whether sexual dysfunction in women with recurrent urinary tract infections (RUTI) improved following treatment with intravesical hyaluronic acid (HA) instillations. Ethical approval was obtained for a prospective study. Patients referred for bladder instillations to treat RUTI, and who were sexually active, were recruited to the study. A selection of validated questionnaires (ICIQ-UI, ICIQ-VS, FSDS-R, ICIQ-FLUTS, O'Leary/Sant and PGI-I) was completed at baseline and at three, six and 12 months after initiation of treatment with bladder instillations. Treatment consisted of weekly bladder instillations with a preparation containing HA for four weeks, then monthly for two further treatments. Results were entered into SPSS for statistical analysis, and the study was powered for statistical significance at 22 patients. Thirty women were included in the study. FSDS-R was used to assess sexual dysfunction and showed that 57% of patients with RUTI had significant sexual distress. There was a significant improvement in FSDS-R at three, six and 12 months compared to baseline (Friedman two-way analysis, p < 0.001). ICIQ-FLUTS F and I scores, O'Leary/Sant, ICIQ-VS and PGI-I also showed statistically significant improvements throughout the period of follow-up. A statistically significant negative correlation was found between FSDS-R and PGI-I at 12 months (r = -0.468, p = 0.009). We have reinforced previous work showing the association between RUTI and sexual dysfunction, and an improvement in bladder symptoms following treatment with HA. To our knowledge, this is the first study to demonstrate an improvement in sexual dysfunction following intravesical treatment with HA that is sustained for up to 12 months. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
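    The Friedman two-way analysis cited above is a rank-based repeated-measures test across the four time points. A minimal sketch with SciPy; the FSDS-R scores below are hypothetical, constructed only to show the mechanics.

    ```python
    # Friedman test on hypothetical FSDS-R scores for 8 subjects at 4 time points.
    from scipy.stats import friedmanchisquare

    baseline = [30, 28, 35, 32, 27, 33, 29, 31]
    month3   = [22, 20, 27, 25, 21, 26, 23, 24]
    month6   = [18, 17, 22, 21, 16, 20, 19, 18]
    month12  = [15, 14, 20, 18, 13, 17, 16, 15]

    # Ranks each subject's scores across the four visits, then tests whether
    # the mean ranks differ between visits.
    stat, p = friedmanchisquare(baseline, month3, month6, month12)
    print(f"chi2 = {stat:.1f}, p = {p:.2g}")
    ```

    Because every hypothetical subject improves monotonically, the ranks are identical across subjects and the statistic reaches its maximum for n = 8, k = 4.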

  14. Depressive symptom patterns and their consequences for diagnosis of affective disorders in cancer patients.

    PubMed

    Reuter, Katrin; Raugust, Simone; Bengel, Jürgen; Härter, Martin

    2004-12-01

    In order to obtain references for adequate diagnostic procedures for depressive syndromes in cancer patients, the present study first analyzes the prevalence of somatic, emotional, and cognitive symptoms of depression. Second, the ability of diagnostic procedures to discriminate between patients with and without a comorbid affective disorder is investigated. From a cross-sectional survey investigating comorbid mental disorders in cancer patients with standardized clinical assessment, a subsample of 71 patients with current affective disorders and depressive symptoms according to the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV), was analyzed. In addition to patients' symptom patterns, a discriminant analysis including all depressive symptoms was conducted. Cognitive symptoms are less prevalent in cancer patients than somatic and emotional symptoms. Loss of interest discriminated best between patients with and without a diagnosis of comorbid affective disorder. Additionally, decreased energy and fatigue proved to have discriminatory value. Cognitive symptoms should receive special attention in diagnostic procedures for affective disorders in cancer patients. In spite of possible symptom overlap with the cancer disease and its treatment, fatigue proves to be a useful criterion for the diagnosis of depression.

  15. A novel sample preparation method to avoid influence of embedding medium during nano-indentation

    NASA Astrophysics Data System (ADS)

    Meng, Yujie; Wang, Siqun; Cai, Zhiyong; Young, Timothy M.; Du, Guanben; Li, Yanjun

    2013-02-01

    The effect of the embedding medium on the nano-indentation measurements of lignocellulosic materials was investigated experimentally using nano-indentation. Both the reduced elastic modulus and the hardness of non-embedded cell walls were found to be lower than those of the embedded samples, proving that the embedding medium used for specimen preparation on cellulosic material during nano-indentation can modify cell-wall properties. This leads to structural and chemical changes in the cell-wall constituents, changes that may significantly alter the material properties. Further investigation was carried out to detect the influence of different vacuum times on the cell-wall mechanical properties during the embedding procedure. Interpretation of the statistical analysis revealed no linear relationships between vacuum time and the mechanical properties of cell walls. The quantitative measurements confirm that low-viscosity resin has a rapid penetration rate early in the curing process. Finally, a novel sample preparation method aimed at preventing resin diffusion into lignocellulosic cell walls was developed using a plastic film to wrap the sample before embedding. This method proved to be accessible and straightforward for many kinds of lignocellulosic material, but is especially suitable for small, soft samples.

  16. Identification of factors affecting birth rate in Czech Republic

    NASA Astrophysics Data System (ADS)

    Zámková, Martina; Blašková, Veronika

    2013-10-01

    This article is concerned primarily with identifying economic factors that affect birth rates in the Czech Republic. To find the relationship between these variables, we used multivariate regression analysis, modeling a time series of annual values (1994-2011) of both economic and demographic indicators. Due to potential problems with spurious dependence, we first took first differences of all series obtained from the Czech Statistical Office. It is clear from the final model, which meets all assumptions, that there is a positive correlation between birth rates and the financial situation of households. We described the financial situation of households by GDP per capita, gross wages and the consumer price index. As expected, a positive correlation was proved for GDP per capita and gross wages, and a negative dependence was proved for the consumer price index. In addition to these economic variables, the model also used demographic characteristics of the workforce and the number of employed people. It can be stated that if the Czech Republic wants to support an increase in the birth rate, it is necessary to consider financial support for households with small children.
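    The procedure described above, first-differencing trending annual series and then fitting a multivariate regression, can be sketched as follows. The series here are synthetic stand-ins generated so that births rise with GDP and wages and fall with CPI, merely to illustrate the mechanics; none of the numbers are from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 18  # annual observations, e.g. 1994-2011

    # Synthetic trending level series (stand-ins for the real indicators).
    gdp   = np.cumsum(rng.normal(2.0, 1.0, n))
    wages = np.cumsum(rng.normal(1.5, 1.0, n))
    cpi   = np.cumsum(rng.normal(1.0, 0.5, n))
    # Births constructed to rise with GDP/wages and fall with CPI.
    births = 0.8 * gdp + 0.5 * wages - 0.8 * cpi + rng.normal(0.0, 0.2, n)

    # First-difference every series to remove the shared trend (the source of
    # apparent dependence), then fit OLS on the differenced data.
    d = np.diff
    X = np.column_stack([np.ones(n - 1), d(gdp), d(wages), d(cpi)])
    y = d(births)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("intercept, GDP, wages, CPI coefficients:", beta.round(2))
    ```

    On the differenced data the signs of the fitted coefficients recover the constructed relationships, which is the pattern the article reports for the real series.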

  17. The Mutation Breeding and Mutagenic Effect of Air Plasma on Penicillium Chrysogenum

    NASA Astrophysics Data System (ADS)

    Gui, Fang; Wang, Hui; Wang, Peng; Liu, Hui; Cai, Xiaochun; Hu, Yihua; Yuan, Chengling; Zheng, Zhiming

    2012-04-01

    Low temperature air plasma was used as the mutation tool for the penicillin-producing strain Penicillium chrysogenum. The discharge conditions were an RF power of 360 W, a temperature of 40°C in a sealed chamber, and a pressure of 10 Pa to 30 Pa. The results showed that the kinetics of the survival rate followed a typical saddle-shaped curve. Based on a statistical analysis, at a treatment duration of 10 min the positive mutation rate was as high as 37.5% while the negative mutation rate was low. The colonial morphology changed markedly when the plasma treatment duration reached or exceeded 45 min. After both primary and secondary screening, a mutant designated aPc051310 with high productivity of penicillin was obtained, and a strong mutagenic effect on P. chrysogenum was observed in the process. It was proved that after five generations the mutant aPc051310 still exhibited high productivity. All the results prove that the plasma mutation method could be developed as a convenient and effective tool to breed high-yield strains in the fermentation industry, while expanding plasma applications at the same time.

  18. Paclitaxel-Coated Balloons for the Treatment of Dysfunctional Dialysis Access. Results from a Single-Center, Retrospective Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kitrou, Panagiotis M., E-mail: panoskitrou@gmail.com; Spiliopoulos, Stavros; Papadimatos, Panagiotis

    Purpose: To investigate the safety and effectiveness of the Lutonix paclitaxel-coated balloon (PCB) for the treatment of dysfunctional dialysis access. Materials and Methods: This was a single-center, single-arm, retrospective analysis of 39 patients (23 male, 59 %) undergoing 61 interventions using 69 PCBs in a 20-month period. There was a balance between arteriovenous fistulae (AVF) and grafts (AVG) (20 AVFs, 19 AVGs), and the majority of lesions were restenotic (25/39, 64.1 %). Mean balloon diameter used was 6.6 mm and length 73.4 mm. The primary outcome measure was target lesion primary patency (TLPP) at 6 months, while secondary outcome measures included factors affecting TLPP and major complications. As some lesions were treated more than once with PCB, the authors also compared patency results after first and second PCB angioplasty. Results: TLPP was 72.2 % at 6 months with a median patency of 260 days according to the Kaplan-Meier survival analysis. No major complications occurred. TLPP was similar between AVFs and AVGs (311 vs. 237 days, respectively; p = 0.29) and between de novo and restenotic lesions (270.5 vs. 267.5 days, respectively; p = 0.50). In 14 cases in which lesions were treated with two PCB angioplasties, a statistically significant difference in TLPP after the second treatment was noted (first intervention 179.5 days vs. second intervention 273.5 days; p = 0.032). Conclusion: In this retrospective analysis, the Lutonix PCB proved to be safe and effective in treating restenosis in dysfunctional dialysis access, with results comparable to the available literature. Larger studies are needed to confirm these results.
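    A Kaplan-Meier patency estimate like the TLPP analysis above can be computed with a minimal estimator: at each event time, survival is multiplied by the fraction of at-risk lesions that keep patency. The patency times below are hypothetical, chosen only to illustrate how the curve and its median are read off.

    ```python
    import numpy as np

    def kaplan_meier(times, events):
        """Minimal Kaplan-Meier estimator.

        times  : follow-up time for each subject
        events : 1 if the endpoint (loss of patency) occurred, 0 if censored
        Returns (event times, survival probability after each event time).
        """
        times = np.asarray(times, dtype=float)
        events = np.asarray(events, dtype=bool)
        order = np.argsort(times)
        times, events = times[order], events[order]
        surv, at_risk = 1.0, len(times)
        out_t, out_s = [], []
        i = 0
        while i < len(times):
            t, d, c = times[i], 0, 0
            while i < len(times) and times[i] == t:  # group ties at time t
                d, c = d + events[i], c + (not events[i])
                i += 1
            if d > 0:                      # survival drops only at event times
                surv *= 1 - d / at_risk
                out_t.append(t)
                out_s.append(surv)
            at_risk -= d + c
        return out_t, out_s

    # Hypothetical patency times in days (1 = loss of patency, 0 = censored).
    times  = [120, 180, 200, 260, 260, 300, 320, 400]
    events = [1,   1,   0,   1,   1,   0,   1,   0]
    t, s = kaplan_meier(times, events)
    median = next(tt for tt, ss in zip(t, s) if ss <= 0.5)
    print("survival steps:", list(zip(t, s)), "median:", median)
    ```

    The median patency is the first time the estimated survival falls to 0.5 or below; here the toy data yield 260 days.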

  19. Comparison and statistical analysis of four write stability metrics in bulk CMOS static random access memory cells

    NASA Astrophysics Data System (ADS)

    Qiu, Hao; Mizutani, Tomoko; Saraya, Takuya; Hiramoto, Toshiro

    2015-04-01

    The four commonly used metrics for write stability were measured and compared on the same set of 2048 (2k) six-transistor (6T) static random access memory (SRAM) cells fabricated in a 65 nm bulk technology. A preferred metric should be effective for yield estimation and help predict the edge of stability. The results demonstrate that all metrics share the same worst SRAM cell. On the other hand, compared to the butterfly curve, which exhibits non-normality, and the write N-curve, where no cell state flip happens, the bit-line and word-line margins show good normality as well as almost perfect correlation. As a result, both the bit-line and word-line methods prove to be the preferred write stability metrics.

  20. Design and evaluation of an oral multiparticulate system for dual delivery of amoxicillin and Lactobacillus acidophilus.

    PubMed

    Govender, Mershen; Choonara, Yahya E; van Vuuren, Sandy; Kumar, Pradeep; du Toit, Lisa C; Pillay, Viness

    2016-09-01

    A delayed-release dual delivery system for amoxicillin and the probiotic Lactobacillus acidophilus was developed and evaluated. A cross-linked denatured ovalbumin protective matrix was first synthesized and statistically optimized using a Box-Behnken experimental design prior to encapsulation with glyceryl monostearate. The encapsulated ovalbumin matrix was thereafter incorporated with amoxicillin in a gastro-resistant capsule. In vitro characterization and stability analysis of the ovalbumin and encapsulated components were also performed. Protection of the L. acidophilus probiotic against the bactericidal effects of amoxicillin within the dual formulation was determined. The dual formulation in this study proved effective and provides insight for current microbiome research seeking to identify, classify and use functional healthy bacteria to develop novel probiotic delivery technologies.

  1. A Theorem on the Rank of a Product of Matrices with Illustration of Its Use in Goodness of Fit Testing.

    PubMed

    Satorra, Albert; Neudecker, Heinz

    2015-12-01

    This paper develops a theorem that facilitates computing the degrees of freedom of Wald-type chi-square tests for moment restrictions when there is rank deficiency of key matrices involved in the definition of the test. An if and only if (iff) condition is developed for a simple rule of difference of ranks to be used when computing the desired degrees of freedom of the test. The theorem is developed exploiting basic tools of matrix algebra. The theorem is shown to play a key role in proving the asymptotic chi-squaredness of a goodness of fit test in moment structure analysis, and in finding the degrees of freedom of this chi-square statistic.

  2. Somatic mutation load of estrogen receptor-positive breast tumors predicts overall survival: an analysis of genome sequence data.

    PubMed

    Haricharan, Svasti; Bainbridge, Matthew N; Scheet, Paul; Brown, Powel H

    2014-07-01

    Breast cancer is one of the most commonly diagnosed cancers in women. While there are several effective therapies for breast cancer and important single gene prognostic/predictive markers, more than 40,000 women die from this disease every year. The increasing availability of large-scale genomic datasets provides opportunities for identifying factors that influence breast cancer survival in smaller, well-defined subsets. The purpose of this study was to investigate the genomic landscape of various breast cancer subtypes and its potential associations with clinical outcomes. We used statistical analysis of sequence data generated by the Cancer Genome Atlas initiative including somatic mutation load (SML) analysis, Kaplan-Meier survival curves, gene mutational frequency, and mutational enrichment evaluation to study the genomic landscape of breast cancer. We show that ER(+), but not ER(-), tumors with high SML associate with poor overall survival (HR = 2.02). Further, these high mutation load tumors are enriched for coincident mutations in both DNA damage repair and ER signature genes. While it is known that somatic mutations in specific genes affect breast cancer survival, this study is the first to identify that SML may constitute an important global signature for a subset of ER(+) tumors prone to high mortality. Moreover, although somatic mutations in individual DNA damage genes affect clinical outcome, our results indicate that coincident mutations in DNA damage response and signature ER genes may prove more informative for ER(+) breast cancer survival. Next generation sequencing may prove an essential tool for identifying pathways underlying poor outcomes and for tailoring therapeutic strategies.

  3. Setting Environmental Standards

    ERIC Educational Resources Information Center

    Fishbein, Gershon

    1975-01-01

    Recent court decisions have pointed out the complexities involved in setting environmental standards. Environmental health is composed of multiple causative agents, most of which work over long periods of time. This makes the cause-and-effect relationship between health statistics and environmental contaminant exposures difficult to prove in…

  4. A Solution Space for a System of Null-State Partial Differential Equations: Part 1

    NASA Astrophysics Data System (ADS)

    Flores, Steven M.; Kleban, Peter

    2015-01-01

    This article is the first of four that completely and rigorously characterize a solution space for a homogeneous system of 2N + 3 linear partial differential equations (PDEs) in 2N variables that arises in conformal field theory (CFT) and multiple Schramm-Löwner evolution (SLE). In CFT, these are null-state equations and conformal Ward identities. They govern partition functions for the continuum limit of a statistical cluster or loop-gas model, such as percolation, or more generally the Potts models and O(n) models, at the statistical mechanical critical point. (SLE partition functions also satisfy these equations.) For such a lattice model in a polygon with its 2N sides exhibiting a free/fixed side-alternating boundary condition, this partition function is proportional to a CFT correlation function of one-leg corner operators inserted at the vertices w_i of the polygon. (Partition functions for "crossing events" in which clusters join the fixed sides of the polygon in some specified connectivity are linear combinations of such correlation functions.) When conformally mapped onto the upper half-plane, methods of CFT show that this correlation function satisfies the system of PDEs that we consider. In this first article, we use methods of analysis to prove that the dimension of this solution space is no more than C_N, the Nth Catalan number. While our motivations are based in CFT, our proofs are completely rigorous. This proof is contained entirely within this article, except for the proof of Lemma 14, which constitutes the second article (Flores and Kleban, in Commun Math Phys, arXiv:1404.0035, 2014). In the third article (Flores and Kleban, in Commun Math Phys, arXiv:1303.7182, 2013), we use the results of this article to prove that the solution space of this system of PDEs has dimension C_N and is spanned by solutions constructed with the CFT Coulomb gas (contour integral) formalism. In the fourth article (Flores and Kleban, in Commun Math Phys, arXiv:1405.2747, 2014), we prove further CFT-related properties of these solutions, some useful for calculating cluster-crossing probabilities of critical lattice models in polygons.

  5. Applications of "Integrated Data Viewer'' (IDV) in the classroom

    NASA Astrophysics Data System (ADS)

    Nogueira, R.; Cutrim, E. M.

    2006-06-01

    Conventionally, weather products utilized in synoptic meteorology reduce phenomena occurring in four dimensions to a 2-dimensional form. This constitutes a roadblock for non-atmospheric-science majors who need to take meteorology as a non-mathematical course complementary to their major programs. This research examines the use of the Integrated Data Viewer (IDV) as a teaching tool, as it allows a 4-dimensional representation of weather products. IDV was tested in the teaching of synoptic meteorology, weather analysis, and weather map interpretation to non-science students in the laboratory sessions of an introductory meteorology class at Western Michigan University. Student exam scores were compared according to the laboratory teaching technique, i.e., traditional lab manual versus IDV, for both short- and long-term learning. Results of the statistical analysis show that the Fall 2004 students in the IDV-based lab session retained learning. In the Spring 2005, however, the exam scores did not reflect retention in learning when IDV-based and MANUAL-based lab scores were compared (short-term learning, i.e., an exam taken one week after the lab exercise). Testing long-term learning, with seven weeks between the two exams in the Spring 2005, showed no statistically significant difference between IDV-based and MANUAL-based group scores. However, the IDV group obtained an exam score average slightly higher than the MANUAL group. Statistical testing of the principal hypothesis in this study leads to the conclusion that the IDV-based method did not prove to be a better teaching tool than the traditional paper-based method. Future studies could potentially find significant differences in the effectiveness of the two methods if conditions were more controlled; that is, students in the control group should not be exposed to weather analysis using IDV during lecture.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solc, J.

    The reclamation effort typically deals with consequences of mining activity instead of being planned well before the mining. A detailed assessment of the principal hydro- and geochemical processes participating in the evolution of pore and groundwater chemistry was carried out at three surface mine localities in North Dakota: the Fritz mine, the Indian Head mine, and the Velva mine. The geochemical model MINTEQA2 and advanced statistical analysis coupled with traditional interpretive techniques were used to determine site-specific environmental characteristics and to compare the differences between study sites. Multivariate statistical analysis indicates that sulfate, magnesium, calcium, the gypsum saturation index, and sodium contribute the most to overall differences in groundwater chemistry between study sites. Soil paste extract pH and EC measurements performed on over 3700 samples document extremely acidic soils at the Fritz mine. The number of samples with pH < 5.5 reaches 80%-90% of total samples from discrete depths near the top of the soil profile at the Fritz mine. Soil samples from Indian Head and Velva do not indicate acidity below the pH 5.5 limit. The percentage of samples with EC > 3 mS cm^-1 is between 20% and 40% at the Fritz mine and below 20% for samples from Indian Head and Velva. The results of geochemical modeling indicate an increased tendency for gypsum saturation within the vadose zone, particularly within lands disturbed by mining activity. This trend is directly associated with increased concentrations of sulfate anions as a result of mineral oxidation. Geochemical modeling, statistical analysis, and soil extract pH and EC measurements proved to be reliable, fast, and relatively cost-effective tools for the assessment of soil acidity, the extent of the oxidation zone, and the potential for negative impact on pore and groundwater chemistry.

  7. Unicorns do exist: a tutorial on "proving" the null hypothesis.

    PubMed

    Streiner, David L

    2003-12-01

    Introductory statistics classes teach us that we can never prove the null hypothesis; all we can do is reject or fail to reject it. However, there are times when it is necessary to try to prove the nonexistence of a difference between groups. This most often happens within the context of comparing a new treatment against an established one and showing that the new intervention is not inferior to the standard. This article first outlines the logic of "noninferiority" testing by differentiating between the null hypothesis (that which we are trying to nullify) and the "nil" hypothesis (that there is no difference), reversing the role of the null and alternative hypotheses, and defining an interval within which groups are said to be equivalent. We then work through an example and show how to calculate sample sizes for noninferiority studies.
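    A minimal sketch of the reversed-hypothesis logic described above, assuming a pooled two-sample t-test with a prespecified noninferiority margin delta (all numbers are hypothetical): the null hypothesis now states that the new treatment is worse than the standard by at least the margin, and rejecting it establishes noninferiority.

    ```python
    # One-sided noninferiority test: H0: mu_new - mu_std <= -delta
    #                                H1: mu_new - mu_std  > -delta
    import math
    from scipy import stats

    def noninferiority_test(mean_new, mean_std, sd, n_new, n_std, delta):
        """Pooled-SD one-sided t-test; rejecting H0 establishes noninferiority."""
        se = sd * math.sqrt(1 / n_new + 1 / n_std)
        t_stat = (mean_new - mean_std + delta) / se   # shift by the margin
        df = n_new + n_std - 2
        p = stats.t.sf(t_stat, df)                    # one-sided upper tail
        return t_stat, p

    # Hypothetical trial: new treatment scores 2 points lower on average,
    # but the prespecified margin allows up to 5 points.
    t_stat, p = noninferiority_test(mean_new=48.0, mean_std=50.0, sd=10.0,
                                    n_new=80, n_std=80, delta=5.0)
    print(f"t = {t_stat:.2f}, one-sided p = {p:.3f}")
    if p < 0.05:
        print("reject H0: the new treatment is noninferior within the margin")
    ```

    Note the inversion: a small p here does not show a difference, it shows that any disadvantage of the new treatment is smaller than the margin delta.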

  8. Estimation of salivary flow rate, pH, buffer capacity, calcium, total protein content and total antioxidant capacity in relation to dental caries severity, age and gender.

    PubMed

    Pandey, Pallavi; Reddy, N Venugopal; Rao, V Arun Prasad; Saxena, Aditya; Chaudhary, C P

    2015-03-01

    The aim of the study was to evaluate salivary flow rate, pH, buffering capacity, calcium, total protein content and total antioxidant capacity in relation to dental caries, age and gender. The study population consisted of 120 healthy children aged 7-15 years, divided into two age groups: 7-10 years and 11-15 years. The subjects were further divided into two groups: Group A, children with DMFS/dfs = 0 (caries-free), and Group B, children with DMFS/dfs ≥5 (caries active), with 60 children in each. Unstimulated saliva samples were collected from all groups. Flow rates were determined, and samples were analyzed for pH, buffer capacity, calcium, total protein and total antioxidant status. Salivary antioxidant activity was measured spectrophotometrically by an adaptation of the 2,2'-azino-di-(3-ethylbenzthiazoline-6-sulphonate) assay. In the age group 7-10 years, the mean difference between the caries-free and caries-active groups proved to be statistically significant (P < 0.05) for salivary calcium, total protein and total antioxidant levels for both sexes; in the age group 11-15 years, the difference was statistically significant (P < 0.05) for salivary calcium level for both sexes, while salivary total protein and total antioxidant levels were statistically significant for male children only. In general, total protein and total antioxidants in saliva increased with caries activity. The calcium content of saliva was found to be higher in the caries-free group and increased with age.

  9. Test Planning, Collection, and Analysis of Pressure Data Resulting from Army Weapon Systems. Volume IV. Data Analysis of the M198 and M109 May 1979 Firings.

    DTIC Science & Technology

    1980-05-01

    the M203 charge during May 1979 at Aberdeen Proving Ground. The data collection and analysis effort is part of a continuing program undertaken by...May to 18 May 1979 the M198 towed howitzer and the M109 self-propelled howitzer were fired with the M203 charge at the Aberdeen Proving Ground...howitzer and the M109 self-propelled howitzer were fired with the M203 charge at the Aberdeen Proving Ground. This section of the report gives the

  10. M1A2 Adjunct Analysis (POSNOV Volume)

    DTIC Science & Technology

    1989-12-01

    MD 20814-2797 Director 2 U.S. Army Materiel Systems Analysis Activity ATTN: AMXSY-CS, AMXSY-GA Aberdeen Proving Ground, MD 21005-5071 U.S. Army...Leonard Wood, MO Commander U.S. Army Ordnance Center & School ATTN: ATSL-CD-CS Aberdeen Proving Ground, MD 21005 Commander 2 U.S. Army Soldier Support...NJ Commander U.S. Army Test and Evaluation Command ATTN: AMSTE-CM-R Aberdeen Proving Ground, MD 21005 Commander U.S. Army Tank Automotive Command

  11. Performance of Blind Source Separation Algorithms for FMRI Analysis using a Group ICA Method

    PubMed Central

    Correa, Nicolle; Adali, Tülay; Calhoun, Vince D.

    2007-01-01

    Independent component analysis (ICA) is a popular blind source separation (BSS) technique that has proven to be promising for the analysis of functional magnetic resonance imaging (fMRI) data. A number of ICA approaches have been used for fMRI data analysis, and even more ICA algorithms exist; however, the impact of using different algorithms on the results is largely unexplored. In this paper, we study the performance of four major classes of algorithms for spatial ICA, namely information maximization, maximization of non-gaussianity, joint diagonalization of cross-cumulant matrices, and second-order correlation based methods when they are applied to fMRI data from subjects performing a visuo-motor task. We use a group ICA method to study the variability among different ICA algorithms and propose several analysis techniques to evaluate their performance. We compare how different ICA algorithms estimate activations in expected neuronal areas. The results demonstrate that the ICA algorithms using higher-order statistical information prove to be quite consistent for fMRI data analysis. Infomax, FastICA, and JADE all yield reliable results, each having its strengths in specific areas. EVD, an algorithm using second-order statistics, does not perform reliably for fMRI data. Additionally, for the iterative ICA algorithms, it is important to investigate the variability of the estimates from different runs. We test the consistency of the iterative algorithms, Infomax and FastICA, by running the algorithm a number of times with different initializations and note that they yield consistent results over these multiple runs. Our results greatly improve our confidence in the consistency of ICA for fMRI data analysis. PMID:17540281

  12. Performance of blind source separation algorithms for fMRI analysis using a group ICA method.

    PubMed

    Correa, Nicolle; Adali, Tülay; Calhoun, Vince D

    2007-06-01

    Independent component analysis (ICA) is a popular blind source separation technique that has proven to be promising for the analysis of functional magnetic resonance imaging (fMRI) data. A number of ICA approaches have been used for fMRI data analysis, and even more ICA algorithms exist; however, the impact of using different algorithms on the results is largely unexplored. In this paper, we study the performance of four major classes of algorithms for spatial ICA, namely, information maximization, maximization of non-Gaussianity, joint diagonalization of cross-cumulant matrices and second-order correlation-based methods, when they are applied to fMRI data from subjects performing a visuo-motor task. We use a group ICA method to study variability among different ICA algorithms, and we propose several analysis techniques to evaluate their performance. We compare how different ICA algorithms estimate activations in expected neuronal areas. The results demonstrate that the ICA algorithms using higher-order statistical information prove to be quite consistent for fMRI data analysis. Infomax, FastICA and joint approximate diagonalization of eigenmatrices (JADE) all yield reliable results, with each having its strengths in specific areas. Eigenvalue decomposition (EVD), an algorithm using second-order statistics, does not perform reliably for fMRI data. Additionally, for iterative ICA algorithms, it is important to investigate the variability of estimates from different runs. We test the consistency of the iterative algorithms Infomax and FastICA by running the algorithm a number of times with different initializations, and we note that they yield consistent results over these multiple runs. Our results greatly improve our confidence in the consistency of ICA for fMRI data analysis.
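    A minimal deflationary FastICA (tanh nonlinearity, one of the maximization-of-non-Gaussianity algorithms discussed above) can be sketched with NumPy on two synthetic mixed signals. This is a toy illustration, not the fMRI pipeline from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Two non-Gaussian sources (a sine and a square wave), linearly mixed.
    t = np.linspace(0, 8, 2000)
    s1 = np.sin(2 * np.pi * t)
    s2 = np.sign(np.sin(3 * np.pi * t))
    S = np.vstack([s1, s2])
    A = np.array([[1.0, 0.5], [0.5, 1.0]])   # mixing matrix
    X = A @ S

    # Whiten the mixtures: zero mean, identity covariance.
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    X_w = (E @ np.diag(d ** -0.5) @ E.T) @ X

    # Deflationary FastICA: extract one unit vector at a time, each maximizing
    # non-Gaussianity of w @ X_w via the fixed-point update with g = tanh.
    W = np.zeros((2, 2))
    for i in range(2):
        w = rng.standard_normal(2)
        w /= np.linalg.norm(w)
        for _ in range(200):
            wx = w @ X_w
            g, g_prime = np.tanh(wx), 1.0 - np.tanh(wx) ** 2
            w_new = (X_w * g).mean(axis=1) - g_prime.mean() * w
            w_new -= W[:i].T @ (W[:i] @ w_new)   # decorrelate from found rows
            w_new /= np.linalg.norm(w_new)
            converged = abs(abs(w_new @ w) - 1.0) < 1e-10
            w = w_new
            if converged:
                break
        W[i] = w

    S_est = W @ X_w   # recovered sources (up to sign and permutation)
    ```

    Each recovered component should correlate almost perfectly with one of the original sources, up to the sign and ordering ambiguity inherent in ICA.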

  13. Electronic medical record system at an opioid agonist treatment programme: study design, pre-implementation results and post-implementation trends.

    PubMed

    Kritz, Steven; Brown, Lawrence S; Chu, Melissa; John-Hull, Carlota; Madray, Charles; Zavala, Roberto; Louie, Ben

    2012-08-01

    Electronic medical record (EMR) systems are commonly included in health care reform discussions. However, their embrace by the health care community has been slow. At Addiction Research and Treatment Corporation, an outpatient opioid agonist treatment programme that also provides primary medical care, HIV medical care and case management, substance abuse counselling and vocational services, we studied the implementation of an EMR in the domains of quality, productivity, satisfaction, risk management and financial performance utilizing a prospective pre- and post-implementation study design. This report details the research approach, pre-implementation findings for all five domains, analysis of the pre-implementation findings and some preliminary post-implementation results in the domains of quality and risk management. For quality, there was a highly statistically significant improvement in timely performance of annual medical assessments (P < 0.001) and annual multidiscipline assessments (P < 0.0001). For risk management, the number of events was not sufficient to perform valid statistical analysis. The preliminary findings in the domain of quality are very promising. Should the findings in the other domains prove to be positive, then the impetus to implement EMR in similar health care facilities will be advanced. © 2011 Blackwell Publishing Ltd.

  14. Energy Cascade Analysis: from Subscale Eddies to Mean Flow

    NASA Astrophysics Data System (ADS)

    Cheikh, Mohamad Ibrahim; Wonnell, Louis; Chen, James

    2017-11-01

    Understanding the energy transfer between eddies and mean flow can provide insights into the energy cascade process. Much work has been done to investigate the energy cascade at the level of the smallest eddies using different numerical techniques derived from the Navier-Stokes equations. These methodologies, however, prove to be computationally inefficient when producing energy spectra for a wide range of length scales. In this regard, Morphing Continuum Theory (MCT) resolves the length-scale issues by assuming the fluid continuum to be composed of inner structures that play the role of subscale eddies. The current study showcases the capabilities of MCT in capturing the dynamics of the energy cascade at the level of subscale eddies, through a supersonic turbulent flow of Mach 2.93 over an 8× compression ramp. Analysis of the results using a statistical averaging procedure shows the existence of a statistical coupling of the internal and translational kinetic energy fluctuations with the corresponding rotational kinetic energy of the subscale eddies, indicating a multiscale transfer of energy. The results show that MCT gives a new characterization of the energy cascade within compressible turbulence without the use of excessive computational resources. This material is based upon work supported by the Air Force Office of Scientific Research under Award Number FA9550-17-1-0154.

  15. Ultrastructural analysis of oral exfoliated epithelial cells of tobacco smokers and betel nut chewers: A scanning electron microscopy study.

    PubMed

    Khan, Sameera Shamim; Shreedhar, Balasundari; Kamboj, Mala

    2016-01-01

    The study was undertaken to correlate epithelial surface pattern changes in the oral exfoliated cells of tobacco smokers and betel nut chewers and to compare them with those of oral squamous cell carcinoma (OSCC) patients and healthy individuals. A total of fifty persons were included in this cross-sectional study, of whom thirty formed the study group (15 tobacco smokers and 15 betel nut chewers) and twenty formed the control group (ten OSCC patients as positive controls and ten persons with normal buccal mucosa as negative controls). Their oral exfoliated cells were scraped, fixed, and studied under a scanning electron microscope (SEM). Statistical analysis was performed using ANOVA, the Tukey honestly significant difference test, and the Chi-square test in SPSS software, with P < 0.05 considered significant. The individual cell modifications, intercellular relationships, and surface characteristics observed by SEM in OSCC patients, tobacco smokers, and betel nut chewers, compared with normal oral mucosa, have been tabulated. In normal oral mucosa, cell surface morphology depends on the state of keratinization of the tissue. Thus, SEM could prove helpful in detecting any carcinomatous change at its incipient stage and also give an insight into the ultrastructural details of cellular differentiation in epithelial tissues.

  16. Assessment of the antidandruff activity of a new shampoo: a randomized, double-blind, controlled study by clinical and instrumental evaluations.

    PubMed

    Sparavigna, Adele; Setaro, Michele; Caserini, Maurizio; Bulgheroni, Anna

    2013-01-01

    The aim of this randomized, double-blind, controlled study was to evaluate the antidandruff activity exerted by a new shampoo on patients affected by dandruff and/or mild seborrheic dermatitis by means of both D-squame technique coupled with image analysis and clinical assessments. Thirty-four patients were enrolled and 1:1 randomly assigned to either a test shampoo or a comparative shampoo group. Treatment schedule was twice a week for 4 weeks. The D-squame technique was shown to be able to objectively record variations in scalp desquamation both between test and comparative groups and within the same group over time. The results obtained with this instrumental approach showed a statistically significant reduction by 52% vs baseline after 2 weeks of treatment. There was an even greater reduction after 4 weeks (-66%). This reduction was statistically significant compared with the comparative group at the same time points. The analysis of all the other parameters (except Wood's lamp) confirmed the superiority of the test vs the comparative shampoo. The test shampoo proved to be safe, well tolerated, and accepted by the patients for cosmetic acceptability and efficacy. The study confirmed the antidandruff efficacy of the test shampoo and its superiority vs the comparative shampoo.

  17. Food security and the nutritional status of children in foster care: new horizons in the protection of a fragile population.

    PubMed

    Ferrara, Pietro; Scancarello, Marta; Khazrai, Yeganeh M; Romani, Lorenza; Cutrona, Costanza; DE Gara, Laura; Bona, Gianni

    2016-10-12

    The nutritional status of foster children, the quality of daily menus in group homes, and the Food Security inside these organizations have been poorly studied, and the present study investigates them. A sample of 125 children, ranging in age from 0-17 years, from seven group homes (group A) was compared with 121 children of the general population (group B). To evaluate nutritional status, BMI percentiles were used. Mean percentiles of both groups were compared through statistical analysis. Both nutritional and caloric daily distributions in each organization were obtained using the 24-hour recall method. A specific questionnaire was administered to evaluate Food Security. From the analysis of mean BMI-for-age (or weight-for-length) percentiles, we did not observe statistically significant differences between group A and group B. The average daily nutrient and calorie distribution in group homes proves to be nearly optimal, with the exception of a slight excess of protein and a slight deficiency in PUFAs. Moreover, a low intake of iron and calcium was revealed. All organizations obtained a "High Food Security" profile. The nutritional condition of foster children is no worse than that of children of the general population. Foster care provides the necessary conditions to support their growth.

  18. Surface free energy analysis of oil palm empty fruit bunches fiber reinforced biocomposites

    NASA Astrophysics Data System (ADS)

    Suryadi, G. S.; Nikmatin, S.; Sudaryanto; Irmansyah; Sukaryo, S. G.

    2017-05-01

    The effect of the size of natural fiber from oil palm empty fruit bunches (OPEFB), used as filler, on the contact angle and surface free energy of fiber-reinforced biocomposites has been studied. The OPEFB fibers were prepared by mechanical milling and sieving to obtain various fiber sizes (long fiber, medium fiber, short fiber, and microparticle). The biocomposites were produced by extrusion in a single-screw extruder with EFB fiber as filler, recycled Acrylonitrile Butadiene Styrene (ABS) polymer as matrix, and a primary antioxidant, an acid scavenger, and a coupling agent as additives. The biocomposites, obtained in granular form, were made into test pieces by injection molding. Contact angles of water, methanol, and hexane on the surface of the biocomposites at room temperature were measured using a Phoenix 300 Contact Angle Analyzer. The surface free energy (SFE) and its components were calculated using three previously known methods (Girifalco-Good-Fowkes-Young (GGFY), Owens-Wendt, and van Oss-Chaudhury-Good (vOCG)). The results showed that the total SFE of recycled ABS as control was about 24.38 mJ/m2, and that the SFE of the biocomposites was lower than the control, decreasing with decreasing EFB fiber size. The statistical analysis showed that there were no statistically significant differences in the SFE values calculated with the three different methods.
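As a hedged sketch of one of the three SFE methods named in the abstract above, a minimal Owens-Wendt (OWRK) fit might look as follows. The liquid surface-tension components are standard literature-style values, and the contact angles and the two-liquid choice are invented for illustration, not the study's measurements:

```python
import numpy as np

# Probe-liquid surface-tension components (total, dispersive, polar) in mJ/m^2;
# literature-style figures, used here only for illustration.
liquids = {
    "water":  (72.8, 21.8, 51.0),
    "hexane": (18.4, 18.4, 0.0),
}
# Hypothetical measured contact angles (degrees) on the composite surface.
angles = {"water": 85.0, "hexane": 5.0}

def owens_wendt(liquids, angles):
    """Least-squares fit of the Owens-Wendt relation
    gamma_l*(1+cos t) = 2*sqrt(gd_s*gd_l) + 2*sqrt(gp_s*gp_l),
    linearized so that slope^2 = polar and intercept^2 = dispersive
    component of the solid's surface free energy."""
    x, y = [], []
    for name, (gl, gld, glp) in liquids.items():
        theta = np.radians(angles[name])
        y.append(gl * (1 + np.cos(theta)) / (2 * np.sqrt(gld)))
        x.append(np.sqrt(glp / gld))
    slope, intercept = np.polyfit(x, y, 1)
    gd_s, gp_s = intercept**2, slope**2
    return gd_s, gp_s, gd_s + gp_s

gd, gp, total = owens_wendt(liquids, angles)
print(f"dispersive={gd:.1f}  polar={gp:.1f}  total={total:.1f} mJ/m^2")
```

With only two probe liquids the fit is exact; with three or more (as when water, methanol, and hexane are all used), `np.polyfit` gives the usual least-squares estimate.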

  19. Histomorphometric, fractal and lacunarity comparative analysis of sheep (Ovis aries), goat (Capra hircus) and roe deer (Capreolus capreolus) compact bone samples.

    PubMed

    Gudea, A I; Stefan, A C

    2013-08-01

    Quantitative and qualitative studies dealing with the histomorphometry of bone tissue play a new role in modern legal/forensic medicine and archaeozoology. This study deals with the differences found in the humerus and metapodial bones of recent sheep (Ovis aries), goat (Capra hircus), and roe deer (Capreolus capreolus) specimens, from a qualitative point of view but mainly from a quantitative perspective. A novel perspective given by fractal analysis performed on the digital histological images is approached. This study shows that qualitative assessment may not be reliable due to the close resemblance of the structures. From the quantitative perspective (several measurements performed on osteonal units and statistical processing of the data), some of the measured elements show significant differences among the three species (the primary osteon diameter, etc.). The fractal analysis and the lacunarity of the images show a great deal of potential, proving that this type of analysis can be of great help in separating the material from this perspective.

  20. An 'electronic' extramural course in epidemiology and medical statistics.

    PubMed

    Ostbye, T

    1989-03-01

    This article describes an extramural university course in epidemiology and medical statistics taught using a computer conferencing system, microcomputers and data communications. Computer conferencing was shown to be a powerful, yet quite easily mastered, vehicle for distance education. It allows health personnel unable to attend regular classes due to geographical or time constraints, to take part in an interactive learning environment at low cost. This overcomes part of the intellectual and social isolation associated with traditional correspondence courses. Teaching of epidemiology and medical statistics is well suited to computer conferencing, even if the asynchronicity of the medium makes discussion of the most complex statistical concepts a little cumbersome. Computer conferencing may also prove to be a useful tool for teaching other medical and health related subjects.

  1. Statistical transmutation in doped quantum dimer models.

    PubMed

    Lamas, C A; Ralko, A; Cabra, D C; Poilblanc, D; Pujol, P

    2012-07-06

    We prove a "statistical transmutation" symmetry of doped quantum dimer models on the square, triangular, and kagome lattices: the energy spectrum is invariant under a simultaneous change of statistics (i.e., bosonic into fermionic or vice versa) of the holes and of the signs of all the dimer resonance loops. This exact transformation enables us to define the duality equivalence between doped quantum dimer Hamiltonians and provides the analytic framework to analyze dynamical statistical transmutations. We investigate numerically the doping of the triangular quantum dimer model with special focus on the topological Z(2) dimer liquid. Doping leads to four (instead of two for the square lattice) inequivalent families of Hamiltonians. Competition between phase separation, superfluidity, supersolidity, and fermionic phases is investigated in the four families.

  2. [FROM STATISTICAL ASSOCIATIONS TO SCIENTIFIC CAUSALITY].

    PubMed

    Golan, Daniel; Linn, Shay

    2015-06-01

    The pathogenesis of most chronic diseases is complex and probably involves the interaction of multiple genetic and environmental risk factors. One way to learn about disease triggers is from statistically significant associations in epidemiological studies. However, associations do not necessarily prove causation. Associations can commonly result from bias, confounding and reverse causation. Several paradigms for causality inference have been developed. Henle-Koch postulates are mainly applied for infectious diseases. Austin Bradford Hill's criteria may serve as a practical tool to weigh the evidence regarding the probability that a single new risk factor for a given disease is indeed causal. These criteria are irrelevant for estimating the causal relationship between exposure to a risk factor and disease whenever biological causality has been previously established. Thus, it is highly probable that past exposure of an individual to definite carcinogens is related to his cancer, even without proving an association between this exposure and cancer in his group. For multifactorial diseases, Rothman's model of interacting sets of component causes can be applied.

  3. On the Equivalence of a Likelihood Ratio of Drasgow, Levine, and Zickar (1996) and the Statistic Based on the Neyman-Pearson Lemma of Belov (2016).

    PubMed

    Sinharay, Sandip

    2017-03-01

    Levine and Drasgow (1988) suggested an approach based on the Neyman-Pearson lemma to detect examinees whose response patterns are "aberrant" due to cheating, language issues, and so on. Belov (2016) used the approach of Levine and Drasgow (1988) to suggest a statistic based on the Neyman-Pearson Lemma (SBNPL) to detect item preknowledge when the investigator knows which items are compromised. This brief report proves that the SBNPL of Belov (2016) is equivalent to a statistic suggested for the same purpose by Drasgow, Levine, and Zickar 20 years ago.
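As a hedged illustration of the general idea behind such statistics (not Belov's or Drasgow, Levine, and Zickar's exact formulations), a Neyman-Pearson log-likelihood ratio for item preknowledge under a Rasch model might be sketched as:

```python
import math

def rasch_p(theta, b):
    """Rasch model probability of a correct response for ability theta
    and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def np_log_lr(responses, difficulties, theta, p_preknow=0.9):
    """Neyman-Pearson log-likelihood ratio over the compromised items.
    Null: responses follow the honest Rasch model at ability theta.
    Alternative (an illustrative assumption made here): a flat success
    probability p_preknow on every compromised item.
    Large positive values favour preknowledge."""
    llr = 0.0
    for x, b in zip(responses, difficulties):
        p0, p1 = rasch_p(theta, b), p_preknow
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
    return llr

# A low-ability examinee answering three hard compromised items correctly.
suspicious = np_log_lr([1, 1, 1], [1.0, 1.5, 2.0], theta=-1.0)
# The same examinee missing all three, consistent with honest responding.
honest = np_log_lr([0, 0, 0], [1.0, 1.5, 2.0], theta=-1.0)
print(f"suspicious={suspicious:.2f}  honest={honest:.2f}")
```

By the Neyman-Pearson lemma, thresholding this ratio is the most powerful test of the two simple hypotheses it compares; the choice of alternative model is what distinguishes the published variants.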

  4. Measuring forest landscape patterns in the Cascade Range of Oregon, USA

    NASA Technical Reports Server (NTRS)

    Ripple, William J.; Bradshaw, G. A.; Spies, Thomas A.

    1995-01-01

    This paper describes the use of a set of spatial statistics to quantify the landscape pattern caused by the patchwork of clearcuts made over a 15-year period in the western Cascades of Oregon. Fifteen areas were selected at random to represent a diversity of landscape fragmentation patterns. Managed forest stands (patches) were digitized and analyzed to produce both tabular and mapped information describing patch size, shape, abundance and spacing, and matrix characteristics of a given area. In addition, a GIS fragmentation index was developed which was found to be sensitive to patch abundance and to the spatial distribution of patches. Use of the GIS-derived index provides an automated method of determining the level of forest fragmentation and can be used to facilitate spatial analysis of the landscape for later coordination with field and remotely sensed data. A comparison of the spatial statistics calculated for the two years indicates an increase in forest fragmentation as characterized by an increase in mean patch abundance and a decrease in interpatch distance, amount of interior natural forest habitat, and the GIS fragmentation index. Such statistics capable of quantifying patch shape and spatial distribution may prove important in the evaluation of the changing character of interior and edge habitats for wildlife.

  5. Ballistic Analysis of Firing Table Data for 155MM, M825 Smoke Projectile

    DTIC Science & Technology

    1990-09-01

    U.S. Army Ballistic Research Laboratory (ATTN: SLCBR-DD-T), Aberdeen Proving Ground, Maryland; report number BRL-R-3865. The analysis draws on firing data collected through September 1988 at Dugway Proving Ground and considers whether the M825 MOD PIP Base projectile is ballistically matched or...

  6. Midline corpus callosum is a neuroanatomical focus of fetal alcohol damage.

    PubMed

    Bookstein, Fred L; Sampson, Paul D; Connor, Paul D; Streissguth, Ann P

    2002-06-15

    Prenatal exposure to high levels of alcohol often induces birth defects that combine morphological stigmata with neurological or neuropsychological deficits. But it has proved problematic to diagnose these syndromes in adolescents and adults, in whom the morphological signs are absent or attenuated, the behavioral deficits nonspecific, and the exposure history often difficult to reconstruct. Localizing the associated brain abnormalities might circumvent most of these difficulties. To this end, three-dimensional (3D) locations were recorded for 67 homologous points on or near the corpus callosum in magnetic resonance (MR) brain images from 60 adolescents and adults who were normal, 60 diagnosed with fetal alcohol syndrome, and 60 diagnosed with fetal alcohol effects. We combined the standard statistical approach to this type of geometric data, Procrustes analysis, with a multivariate strategy focusing on differences in variability. In this data set, the shape of the corpus callosum and its vicinity proves systematically much more variable in the alcohol-affected brains than in those of the normal subjects. From this excess variability follows a promising classification rule, having both high sensitivity (100 out of 117) and high specificity (49 out of 60) in this sample. The discrimination uses four landmark points and two summary scores of callosal outline shape. The information from the corpus callosum and vicinity, as viewed in MR brain images of full-grown subjects, may serve as a permanent record of the prenatal effects of alcohol, even in patients who are first suspected of these syndromes relatively late in life or who lack the facial signs of prenatal alcohol damage. The statistical pattern underlying the callosal diagnosis also leads to speculations on mechanisms of the prenatal damage. Copyright 2002 Wiley-Liss, Inc.
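The Procrustes step named in the abstract above can be sketched as follows; this is a minimal ordinary-Procrustes implementation with invented 2-D landmark data, not the authors' 3D pipeline or classification rule:

```python
import numpy as np

def procrustes_disparity(a, b):
    """Ordinary Procrustes distance between two landmark configurations:
    centre both, scale each to unit Frobenius norm, rotate b optimally
    onto a (the rotation may include a reflection), and return the
    residual sum of squares."""
    a = a - a.mean(axis=0)
    b = b - b.mean(axis=0)
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    u, s, vt = np.linalg.svd(b.T @ a)   # maximises tr(R^T b^T a)
    r = u @ vt                          # optimal rotation of b onto a
    return float(np.sum((a - b @ r) ** 2))

# Hypothetical landmark study: noisier shapes (a stand-in for the more
# variable group) show a larger mean disparity from a reference shape
# than less noisy ones (a control-like group).
rng = np.random.default_rng(0)
reference = rng.normal(size=(6, 2))     # 6 made-up 2-D landmarks

def mean_disparity(noise, n=30):
    return np.mean([
        procrustes_disparity(reference,
                             reference + rng.normal(scale=noise, size=(6, 2)))
        for _ in range(n)
    ])

low, high = mean_disparity(0.02), mean_disparity(0.2)
print(f"control-like {low:.4f}  affected-like {high:.4f}")
```

Comparing residual disparities after alignment is one simple way to quantify the kind of excess shape variability the study reports; the paper's actual discrimination combines landmark coordinates with outline-shape scores.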

  7. A theoretical case study of type I and type II beta-turns.

    PubMed

    Czinki, Eszter; Császár, Attila G; Perczel, András

    2003-03-03

    NMR chemical shielding anisotropy tensors have been computed by employing a medium size basis set and the GIAO-DFT(B3LYP) formalism of electronic structure theory for all of the atoms of type I and type II beta-turn models. The models contain all possible combinations of the amino acid residues Gly, Ala, Val, and Ser, with all possible side-chain orientations where applicable in a dipeptide. The several hundred structures investigated contain either constrained or optimized phi, psi, and chi dihedral angles. A statistical analysis of the resulting large database was performed and multidimensional (2D and 3D) chemical-shift/chemical-shift plots were generated. The ¹Hα-¹³Cα, ¹³Cα-¹Hα-¹³Cβ, and ¹³Cα-¹Hα-¹³C′ 2D and 3D plots have the notable feature that the conformers clearly cluster in distinct regions. This allows straightforward identification of the backbone and side-chain conformations of the residues forming beta-turns. Chemical shift calculations on larger For-(L-Ala)n-NH2 (n = 4, 6, 8) models, containing a single type I or type II beta-turn, prove that the simple models employed are adequate. A limited number of chemical shift calculations performed at the highly correlated CCSD(T) level prove the adequacy of the computational method chosen. For all nuclei, statistically averaged theoretical and experimental shifts taken from the BioMagnetic Resonance Bank (BMRB) exhibit good correlation. These results confirm and extend our previous findings that chemical shift information from selected multiple-pulse NMR experiments could be employed directly to extract folding information for polypeptides and proteins.

  8. Land cover mapping with emphasis to burnt area delineation using co-orbital ALI and Landsat TM imagery

    NASA Astrophysics Data System (ADS)

    Petropoulos, George P.; Kontoes, Charalambos C.; Keramitsoglou, Iphigenia

    2012-08-01

    In this study, the potential of the EO-1 Advanced Land Imager (ALI) radiometer for land cover and especially burnt area mapping from a single image analysis is investigated. Co-orbital imagery from the Landsat Thematic Mapper (TM) was also utilised for comparison purposes. Both images were acquired shortly after the suppression of a fire that occurred during the summer of 2009 north-east of Athens, the capital of Greece. The Maximum Likelihood (ML), Artificial Neural Networks (ANNs) and Support Vector Machines (SVMs) classifiers were parameterised and subsequently applied to the acquired satellite datasets. Evaluation of the land use/cover mapping accuracy was based on the error matrix statistics. Also, the McNemar test was used to evaluate the statistical significance of the differences between the approaches tested. Derived burnt area estimates were validated against the operationally deployed Services and Applications For Emergency Response (SAFER) Burnt Scar Mapping service. All classifiers applied to either ALI or TM imagery proved flexible enough to map land cover and also to extract the burnt area from other land surface types. The highest total classification accuracy and burnt area detection capability was returned from the application of SVMs to ALI data, owing to the SVMs' ability to identify an optimal separating hyperplane for best class separation, which could better exploit ALI's advanced technological characteristics in comparison with those of the TM sensor. This study is, to our knowledge, the first of its kind, effectively demonstrating the benefits of applying SVMs to ALI data and further implying that ALI technology may prove highly valuable in mapping burnt areas and land use/cover if it is incorporated into the development of the Landsat 8 mission, planned to be launched in the coming years.

  9. Evaluation of apoptosis indexes in currently used oral alpha-blockers in prostate: a pilot study.

    PubMed

    Demir, Mehmet; Akin, Yigit; Terim, Kubra Asena Kapakin; Gulum, Mehmet; Buyukfirat, Evren; Ciftci, Halil; Yeni, Ercan

    2018-01-01

    The apoptotic effect of oral alpha-blockers on the prostate is known, but the apoptosis index of silodosin has not yet been established. The aims were to determine the apoptosis index of silodosin in the prostate and to compare it with the apoptosis indexes of other currently used alpha-blockers, together with their clinical effects. Benign prostatic hyperplasia (BPH) patients were enrolled among those admitted to the urology outpatient clinic between June 2014 and June 2015. Study groups were created according to randomly prescribed oral alpha-blocker drugs: silodosin 8 mg (Group 1; n=24), tamsulosin 0.4 mg (Group 2; n=30), alfuzosin 10 mg (Group 3; n=25), doxazosin 8 mg (Group 4; n=22), and terazosin 5 mg (Group 5; n=15). Patients who refused to use any alpha-blocker drug were included in Group 6 as the control group (n=16). We investigated the apoptosis indexes of the drugs in prostatic tissue taken at surgery (transurethral resection of the prostate) and/or prostate biopsy. Immunohistochemical staining, light microscopy, and Image Processing and Analysis in Java were used for the evaluations. Statistical significance was set at p<0.05. There were 132 patients, with a mean follow-up of 4.2±2.1 months. The pathologist examined ten randomly selected areas in each microscope set. Group 1 showed a statistically significantly higher apoptosis index on immunohistochemical TUNEL staining and image analysis (p<0.001). Moreover, Group 1 showed significantly greater improvement in uroflowmetry, quality-of-life scores, and the International Prostate Symptom Score. Silodosin has a stronger apoptotic effect than the other alpha-blockers in the prostate; the clinical improvement with silodosin was thus supported by histologic findings. In addition, the static component of BPH may be addressed by inducing apoptosis. Copyright® by the International Brazilian Journal of Urology.

  10. Outcomes of Mini vs Roux-en-Y gastric bypass: A meta-analysis and systematic review.

    PubMed

    Wang, Fu-Gang; Yan, Wen-Mao; Yan, Ming; Song, Mao-Min

    2018-05-10

    Mini gastric bypass has been shown by numerous published studies to achieve excellent metabolic results. Compared with Roux-en-Y gastric bypass, mini gastric bypass is a technically simpler and reversible procedure. However, comparative outcomes of the effectiveness of mini gastric bypass versus Roux-en-Y gastric bypass remain unclear. A systematic literature search was performed in PubMed, Embase, and the Cochrane Library from inception to February 9, 2018. For assessment of method quality, the NOS (Newcastle-Ottawa Scale) and the Cochrane Collaboration's tool for assessing risk of bias were used for cohort studies and randomized controlled trials, respectively. The meta-analysis was performed with RevMan 5.3 software. Ten cohort studies and one randomized controlled trial were included in our meta-analysis. The ten cohort studies were rated as high quality according to the Newcastle-Ottawa Scale, and the randomized controlled trial was found to have a low risk of bias according to the Cochrane Collaboration's assessment. Patients receiving mini gastric bypass had advantages on multiple indexes compared with patients receiving Roux-en-Y gastric bypass: a higher 1-year EWL% (P < 0.05), a higher 2-year EWL% (P < 0.05), a higher type 2 diabetes mellitus remission rate, and a shorter operation time (P < 0.05). No statistically significant difference was observed in hypertension remission rate, mortality, leakage rate, GERD rate, or hospital stay between mini gastric bypass and Roux-en-Y gastric bypass. Mini gastric bypass seems to be a simpler procedure with a better weight-reduction effect, and the same seems to hold for remission of type 2 diabetes mellitus. A small sample size and biased data may have influenced the stability of our results; in light of this, surgeons should treat our results conservatively. Larger sample sizes and multi-center randomized controlled trials are needed to compare the effectiveness and safety of mini gastric bypass and Roux-en-Y gastric bypass. Copyright © 2018. Published by Elsevier Ltd.
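The inverse-variance pooling at the heart of such a meta-analysis (the standard fixed-effect model) can be sketched as follows, with made-up study effects rather than the review's data:

```python
import math

# Fixed-effect inverse-variance pooling of hypothetical study effects
# (e.g., mean differences in 1-year EWL% between procedures); the numbers
# below are invented for illustration only.
effects   = [5.2, 3.8, 6.1, 4.4]   # per-study effect estimates
variances = [1.0, 0.8, 1.5, 0.6]   # per-study sampling variances

# Each study is weighted by the inverse of its variance, so more precise
# studies pull the pooled estimate harder.
weights = [1 / v for v in variances]
pooled  = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
se      = math.sqrt(1 / sum(weights))           # standard error of the pool
ci_low, ci_high = pooled - 1.96 * se, pooled + 1.96 * se
print(f"pooled effect {pooled:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```

Software such as RevMan also offers random-effects pooling, which additionally estimates between-study heterogeneity; the fixed-effect version above is the simplest case.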

  11. Magnetorotational dynamo chimeras. The missing link to turbulent accretion disk dynamo models?

    NASA Astrophysics Data System (ADS)

    Riols, A.; Rincon, F.; Cossu, C.; Lesur, G.; Ogilvie, G. I.; Longaretti, P.-Y.

    2017-02-01

    In Keplerian accretion disks, turbulence and magnetic fields may be jointly excited through a subcritical dynamo mechanism involving magnetorotational instability (MRI). This dynamo may notably contribute to explaining the time-variability of various accreting systems, as high-resolution simulations of MRI dynamo turbulence exhibit statistical self-organization into large-scale cyclic dynamics. However, understanding the physics underlying these statistical states and assessing their exact astrophysical relevance is theoretically challenging. The study of simple periodic nonlinear MRI dynamo solutions has recently proven useful in this respect, and has highlighted the role of turbulent magnetic diffusion in the seeming impossibility of a dynamo at low magnetic Prandtl number (Pm), a common regime in disks. Arguably though, these simple laminar structures may not be fully representative of the complex, statistically self-organized states expected in astrophysical regimes. Here, we aim at closing this seeming discrepancy by reporting the numerical discovery of exactly periodic, yet semi-statistical "chimeral MRI dynamo states" which are the organized outcome of a succession of MRI-unstable, non-axisymmetric dynamical stages of different forms and amplitudes. Interestingly, these states, while reminiscent of the statistical complexity of turbulent simulations, involve the same physical principles as simpler laminar cycles, and their analysis further confirms the theory that subcritical turbulent magnetic diffusion impedes the sustainment of an MRI dynamo at low Pm. Overall, chimera dynamo cycles therefore offer an unprecedented dual physical and statistical perspective on dynamos in rotating shear flows, which may prove useful in devising more accurate, yet intuitive mean-field models of time-dependent turbulent disk dynamos. Movies associated with Fig. 1 are available at http://www.aanda.org

  12. Meso-scale turbulence in living fluids

    NASA Astrophysics Data System (ADS)

    Dunkel, Jorn; Wensink, Rik; Heidenreich, Sebastian; Drescher, Knut; Goldstein, Ray; Loewen, Hartmut; Yeomans, Julia

    2012-11-01

    The mathematical characterization of turbulence phenomena in active non-equilibrium fluids proves even more difficult than for conventional liquids or gases. It is not known which features of turbulent phases in living matter are universal or system-specific, or which generalizations of the Navier-Stokes equations are able to describe them adequately. We combine experiments, particle simulations, and continuum theory to identify the statistical properties of self-sustained meso-scale turbulence in active systems. To study how dimensionality and boundary conditions affect collective bacterial dynamics, we measured energy spectra and structure functions in dense Bacillus subtilis suspensions in quasi-2D and 3D geometries. Our experimental results for the bacterial flow statistics agree well with predictions from a minimal model for self-propelled rods, suggesting that at high concentrations the collective motion of the bacteria is dominated by short-range interactions. To provide a basis for future theoretical studies, we propose a minimal continuum model for incompressible bacterial flow. A detailed numerical analysis of the 2D case shows that this theory can reproduce many of the experimentally observed features of self-sustained active turbulence. Supported by the ERC, EPSRC and DFG.

  13. Development Of Educational Programs In Renewable And Alternative Energy Processing: The Case Of Russia

    NASA Astrophysics Data System (ADS)

    Svirina, Anna; Shindor, Olga; Tatmyshevsky, Konstantin

    2014-12-01

    The paper deals with the main problems of Russian energy system development, which make it necessary to provide educational programs in the field of renewable and alternative energy. The process of developing curricula and defining teaching techniques on the basis of expert opinion evaluation is described, and a competence model for master's students in renewable and alternative energy processing is suggested. Data for statistical analysis were obtained from a distributed questionnaire and in-depth interviews. On the basis of these data, the curricula structure was optimized, and three models for optimizing the structure of teaching techniques were developed. The suggested educational program structure, which was adopted by employers, is presented in the paper. The findings include a quantitative estimate of the importance of systemic thinking and of professional skills and knowledge as basic competences of a master's program graduate; a statistically estimated necessity of a practice-based learning approach; and optimization models for structuring curricula in renewable and alternative energy processing. These findings allow the establishment of a platform for the development of educational programs.

  14. Clinical and radiographic assessment of various predictors for healing outcome 1 year after periapical surgery.

    PubMed

    von Arx, Thomas; Jensen, Simon Storgård; Hänni, Stefan

    2007-02-01

    This clinical study prospectively evaluated the influence of various predictors on healing outcome 1 year after periapical surgery. The study cohort included 194 teeth in an equal number of patients. Three teeth were lost for the follow-up (1.5% drop-out rate). Clinical and radiographic measures were used to determine the healing outcome. For statistical analysis, results were dichotomized (healed versus nonhealed). The overall success rate was 83.8% (healed cases). The only individual predictors to prove significant for the outcome were pain at initial examination (p=0.030) and other clinical signs or symptoms at initial examination (p=0.042), meaning that such teeth had lower healing rates 1 year after periapical surgery compared with teeth without such signs or symptoms. Logistic regression revealed that pain at initial examination (odds ratio=2.59, confidence interval=1.2-5.6, p=0.04) was the only predictor reaching significance. Several predictors almost reached statistical significance: lesion size (p=0.06), retrofilling material (p=0.06), and postoperative healing course (p=0.06).

  15. ["The most ill go into psychoanalytic treatment"? Critical comments on an article in Report Psychologie].

    PubMed

    Richter, R; Hartmann, A; Meyer, A E; Rüger, U

    1994-01-01

    Thomas and Schmitz claim that their study "delivers a proof of the effectiveness of humanistic methods" (p. 25). However, they did not, or could not, substantiate this claim, for several reasons: The authors did not say whether, and if so to what extent, the treatments carried out within the framework of the TK-regulation were treatments using humanistic methods. The validity of the only criterion used by the authors, the average duration of inability to work, must be questioned. The inferential statistical treatment of the data is insufficient; a non-parametric evaluation is necessary. Especially missing are details concerning the treatment groups (age, sex, occupation, method, duration and frequency of therapy), which are indispensable for a differentiated interpretation. In addition there are numerous formal faults (wrong quotations, mistakes in tables, unclear terms, etc.). In view of this criticism we conclude that the results are to a large degree worthless, at least until several of our objections have been refuted by further information and adequate inferential statistical methods. The study is especially unsuitable for proving any "effectiveness of out-patient psychotherapies", however defined; it is therefore also unsuitable for proving the effectiveness of treatments conducted within the framework of the TK-regulation, and especially unsuitable for proving the superiority of humanistic methods over psychoanalytic methods and behavioural therapy.

  16. Antimicrobial properties of Cocos nucifera (coconut) husk: An extrapolation to oral health.

    PubMed

    Jose, Maji; Cyriac, Maria B; Pai, Vidya; Varghese, Ipe; Shantaram, Manjula

    2014-07-01

    Brushing the teeth with the fibrous husk of Cocos nucifera (coconut) is a common oral hygiene practice among people in rural areas of South India. However, the probable antimicrobial properties of this plant material against common oral pathogens have not been proved scientifically; therefore, the present study was designed. An alcoholic extract of the husk of Cocos nucifera was prepared, and its antimicrobial activity against common oral pathogens such as cariogenic bacteria, periodontal pathogens, and candidal organisms was assessed by the agar well diffusion method. The results were then subjected to statistical analysis using one-way analysis of variance (ANOVA) and Tukey's Honestly Significant Difference (HSD) test. The alcoholic extract of Cocos nucifera showed significant concentration-dependent antimicrobial activity, expressed as a zone of inhibition, against all tested organisms except Actinomyces species. The inhibitory effect was most pronounced against the majority of cariogenic organisms and Candida, with zones of inhibition ranging from 4.6 mm to 16.3 mm, although the effect of Cocos nucifera was weaker than that of chlorhexidine. The minimum inhibitory concentration (MIC) ranged from 50 mg/ml to 75 mg/ml. Cocos nucifera has a significant inhibitory action against common oral pathogens, indicating the presence of highly effective antimicrobial compounds, and its use can therefore contribute to oral health. Identification of these active compounds provides scope for incorporating them into modern oral care products to control oral diseases.
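The one-way ANOVA step can be illustrated by computing the F statistic directly from sums of squares. The zone-of-inhibition values below are invented for illustration and are not the study's measurements:

```python
# Toy zone-of-inhibition data (mm) for three extract concentrations;
# values are illustrative only, not the study's data.
groups = {
    "25 mg/ml": [4.6, 5.1, 4.8],
    "50 mg/ml": [9.0, 8.7, 9.4],
    "75 mg/ml": [16.0, 16.3, 15.8],
}

def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA from between/within sums of squares."""
    all_vals = [v for g in groups.values() for v in g]
    grand = sum(all_vals) / len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2
                     for g in groups.values())
    ss_within = sum((v - sum(g) / len(g)) ** 2
                    for g in groups.values() for v in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

f_stat = one_way_anova_f(groups)
print(f"F = {f_stat:.1f}")
```

A large F would then be followed by Tukey's HSD to locate which concentration pairs differ.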

  17. Können Computer das Sprachproblem lösen (Can Computers Solve the Language Problem)?

    ERIC Educational Resources Information Center

    Zeilinger, Michael

    1972-01-01

    Various computer applications in linguistics, primarily speech synthesis and machine translation, are reviewed. Although the computer proves useful for statistics, dictionary building and programmed instruction, the promulgation of a world auxiliary language is considered a more human and practical solution to the international communication…

  18. Intuition in Business: Empirical Base

    ERIC Educational Resources Information Center

    Fomin, Eugeniy P.; Alekseev, Andrey A.; Fomina, Natalia E.; Rensh, Marina A.; Zaitseva, Ekaterina V.

    2016-01-01

    In this article, the authors propose an economic projection of Daniel Kahneman's views on intuition. The authors take intuition to act as an operative category in entrepreneurship. The results of the reported statistical experiment support the viability of the phenomenon of intuition when making investment decisions. Two independent mechanisms for…

  19. In-Flight Performance Evaluation of Experimental Information Displays

    DTIC Science & Technology

    1979-05-01

    Chemical Systems Laboratory Experimentation Command Aberdeen Proving Ground, MD Technical Library 21010 (1) Box 22 Fort Ord, CA 93941 (1) 21 US Army Materiel...US Army Missile R&D Command Library, Bldg 3071 Redstone Arsenal, AL 35809 (1) ATTN: ATSL-DOSL Aberdeen Proving Ground, MD US Army Yuma Proving Ground ...Systems Chief Analysis Agency Benet Weapons Laboratory ATTN: Reports Distribution LCWSL, USA ARRADCOM Aberdeen Proving Ground, MD ATTN: DRDAR-LCB-TL

  20. Classification of Malaysia aromatic rice using multivariate statistical analysis

    NASA Astrophysics Data System (ADS)

    Abdullah, A. H.; Adom, A. H.; Shakaff, A. Y. Md; Masnan, M. J.; Zakaria, A.; Rahim, N. A.; Omar, O.

    2015-05-01

    Aromatic rice (Oryza sativa L.) is considered the best quality premium rice. The varieties are preferred by consumers because of criteria such as shape, colour, distinctive aroma and flavour. The price of aromatic rice is higher than that of ordinary rice due to the special growing conditions it requires, for instance a specific climate and soil. Presently, aromatic rice quality is identified by using its key elements and isotopic variables. The rice can also be classified via Gas Chromatography Mass Spectrometry (GC-MS) or human sensory panels. However, human sensory panels have significant drawbacks: lengthy training time, proneness to fatigue as the number of samples increases, and inconsistency. The GC-MS analysis techniques, on the other hand, require detailed procedures and lengthy analysis, and are quite costly. This paper presents the application of an in-house developed electronic nose (e-nose) to classify new aromatic rice varieties based on the samples' odour. The instrument utilizes multivariate statistical data analysis, including Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA) and K-Nearest Neighbours (KNN), to classify unknown rice samples. The Leave-One-Out (LOO) validation approach is applied to evaluate the ability of KNN to recognize and classify unspecified samples. Visual observation of the PCA and LDA plots shows that the instrument was able to separate the samples into distinct clusters. The results of LDA and KNN, with low misclassification error, support these findings, and we may conclude that the e-nose was successfully applied to the classification of the aromatic rice varieties.
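The KNN-with-leave-one-out validation step can be sketched in a few lines. The two-feature "sensor response" vectors below are synthetic stand-ins; a real e-nose yields many sensor channels:

```python
# Synthetic 2-feature response vectors for two hypothetical rice varieties.
samples = [
    ((1.0, 0.9), "A"), ((1.1, 1.0), "A"), ((0.9, 1.1), "A"), ((1.2, 0.8), "A"),
    ((3.0, 2.9), "B"), ((3.1, 3.2), "B"), ((2.8, 3.0), "B"), ((3.2, 3.1), "B"),
]

def knn_predict(train, x, k=3):
    """Majority vote among the k nearest training points (Euclidean distance)."""
    nearest = sorted(train, key=lambda s: sum((a - b) ** 2 for a, b in zip(s[0], x)))
    votes = [label for _, label in nearest[:k]]
    return max(set(votes), key=votes.count)

# Leave-one-out: hold each sample out and classify it with the remaining ones.
correct = sum(
    knn_predict(samples[:i] + samples[i + 1:], x) == label
    for i, (x, label) in enumerate(samples)
)
loo_accuracy = correct / len(samples)
print(f"LOO accuracy: {loo_accuracy:.2f}")
```

With well-separated clusters, as in the paper's PCA/LDA plots, LOO accuracy approaches 1.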

  1. Reasoning and Proving Opportunities in Textbooks: A Comparative Analysis

    ERIC Educational Resources Information Center

    Hong, Dae S.; Choi, Kyong Mi

    2018-01-01

    In this study, we analyzed and compared reasoning and proving opportunities in geometry lessons from American standard-based textbooks and Korean textbooks to understand how these textbooks provide student opportunities to engage in reasoning and proving activities. Overall, around 40% of exercise problems in Core Plus Mathematics Project (CPMP)…

  2. A statistical approach to the life cycle analysis of cumulus clouds selected in a virtual reality environment

    NASA Astrophysics Data System (ADS)

    Heus, Thijs; Jonker, Harm J. J.; van den Akker, Harry E. A.; Griffith, Eric J.; Koutek, Michal; Post, Frits H.

    2009-03-01

    In this study, a new method is developed to investigate the entire life cycle of shallow cumuli in large eddy simulations. Although trained observers have no problem in distinguishing the different life stages of a cloud, this process proves difficult to automate, because cloud-splitting and cloud-merging events complicate the distinction between a single system divided in several cloudy parts and two independent systems that collided. Because the human perception is well equipped to capture and to make sense of these time-dependent three-dimensional features, a combination of automated constraints and human inspection in a three-dimensional virtual reality environment is used to select clouds that are exemplary in their behavior throughout their entire life span. Three specific cases (ARM, BOMEX, and BOMEX without large-scale forcings) are analyzed in this way, and the considerable number of selected clouds warrants reliable statistics of cloud properties conditioned on the phase in their life cycle. The most dominant feature in this statistical life cycle analysis is the pulsating growth that is present throughout the entire lifetime of the cloud, independent of the case and of the large-scale forcings. The pulses are a self-sustained phenomenon, driven by a balance between buoyancy and horizontal convergence of dry air. The convective inhibition just above the cloud base plays a crucial role as a barrier for the cloud to overcome in its infancy stage, and as a buffer region later on, ensuring a steady supply of buoyancy into the cloud.

  3. Universal algorithm for identification of fractional Brownian motion. A case of telomere subdiffusion.

    PubMed

    Burnecki, Krzysztof; Kepten, Eldad; Janczura, Joanna; Bronshtein, Irena; Garini, Yuval; Weron, Aleksander

    2012-11-07

    We present a systematic statistical analysis of recently measured individual trajectories of fluorescently labeled telomeres in the nucleus of living human cells. The experiments were performed in the U2OS cancer cell line. We propose an algorithm for identification of the telomere motion. By expanding the previously published data set, we are able to explore the dynamics over six orders of magnitude in time, a task not possible earlier. As a result, we establish a rigorous mathematical characterization of the stochastic process and identify the basic mathematical mechanisms behind the telomere motion. We find that the increments of the motion are stationary, Gaussian, ergodic, and even mixing, a property stronger than ergodicity. Moreover, the obtained memory parameter estimates, as well as the ensemble average mean square displacement, reveal subdiffusive behavior at all time spans. All these findings statistically prove a fractional Brownian motion for the telomere trajectories, which is confirmed by a generalized p-variation test. Taking into account the biophysical nature of telomeres as monomers in the chromatin chain, we suggest polymer dynamics as a sufficient framework for their motion, with no need for other models. In addition, these results shed light on other studies of telomere motion and the alternative telomere lengthening mechanism. We hope that identification of these mechanisms will allow the development of a proper physical and biological model for telomere subdynamics. This array of tests can easily be applied to other data sets to enable quick and accurate analysis of their statistical characteristics.
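The subdiffusion analysis rests on the scaling MSD(τ) ∝ τ^α. A minimal sketch estimating α from the time-averaged MSD of a simulated ordinary Brownian trajectory (where α ≈ 1); a subdiffusive fBm trajectory, as found for telomeres, would instead give α < 1:

```python
import math
import random

random.seed(42)

# Simulate an ordinary 1-D Brownian trajectory (expected exponent alpha ~ 1).
n = 20000
x = [0.0]
for _ in range(n):
    x.append(x[-1] + random.gauss(0.0, 1.0))

def msd(traj, lag):
    """Time-averaged mean square displacement at a given lag."""
    m = len(traj) - lag
    return sum((traj[i + lag] - traj[i]) ** 2 for i in range(m)) / m

# Estimate alpha as the least-squares slope of log(MSD) versus log(lag).
lags = [1, 2, 4, 8, 16, 32, 64]
xs = [math.log(l) for l in lags]
ys = [math.log(msd(x, l)) for l in lags]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
alpha = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / \
        sum((a - mx) ** 2 for a in xs)
print(f"estimated diffusion exponent alpha = {alpha:.2f}")
```

The paper's full battery (stationarity, ergodicity, p-variation) goes well beyond this slope fit, which only checks the scaling exponent.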

  4. Standardized UXO Technology Demonstration Site Blind Grid Record No. 904 (Sky Research, Inc.). EM61 MKII/Towed Array

    DTIC Science & Technology

    2008-09-01

    DEMONSTRATOR’S FIELD PERSONNEL Geophysicist: Craig Hyslop Geophysicist: John Jacobsen Geophysicist: Rob Mehl 3.7 DEMONSTRATOR’S FIELD SURVEYING...Yuma Proving Ground Soil Survey Report, May 2003. 5. Practical Nonparametric Statistics, W.J. Conover, John Wiley & Sons, 1980, pages 144 through

  5. Blowing Away Bennett's Blob.

    ERIC Educational Resources Information Center

    Bridgman, Anne

    1987-01-01

    Bureau of Labor Statistics figures show that schools are not top-heavy with administrators, contrary to the myth and Secretary William Bennett's assertion. Administrators comprise 6.6 percent of school employees, and public education ranks 28th out of 35 occupations in terms of the percentage of administrative personnel. Accounting and bookkeeping lead with…

  6. Linkage analysis of chromosome 22q12-13 in a United Kingdom/Icelandic sample of 23 multiplex schizophrenia families

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalsi, G.; Read, T.; Butler, R.

    A possible linkage to a genetic subtype of schizophrenia and related disorders has been reported on the long arm of chromosome 22 at q12-13. However, formal statistical tests in a combined sample could not reject homogeneity and thus could not prove that there was a linked subgroup of families. We have studied 23 schizophrenia pedigrees to test whether some multiplex schizophrenia families may be linked to the microsatellite markers D22S274 and D22S283, which span the 22q12-13 region. Two-point followed by multipoint lod and non-parametric linkage analyses under the assumption of heterogeneity provided no evidence for linkage over the relevant region. 16 refs., 4 tabs.

  7. Developing Matlab scripts for image analysis and quality assessment

    NASA Astrophysics Data System (ADS)

    Vaiopoulos, A. D.

    2011-11-01

    Image processing is a very helpful tool in many fields of modern science that involve digital image examination and interpretation. Processed images, however, often need to be correlated with the original image, in order to ensure that the resulting image fulfills its purpose. Aside from visual examination, which is mandatory, image quality indices (such as the correlation coefficient, entropy, and others) are very useful when deciding which processed image is the most satisfactory. For this reason, a single program (script) was written in the Matlab language, which automatically calculates eight indices by utilizing eight respective functions (independent function scripts). The program was tested on both fused hyperspectral (Hyperion-ALI) and multispectral (ALI, Landsat) imagery and proved to be efficient. The indices were found to be in agreement with visual examination and statistical observations.
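Two of the indices mentioned, the correlation coefficient and entropy, are simple to compute. A Python sketch on toy flattened "images" (the paper's own scripts are in Matlab; the pixel values here are made up):

```python
import math

def correlation(a, b):
    """Pearson correlation coefficient between two equal-size flattened images."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def entropy(img, levels=256):
    """Shannon entropy (bits) of an image's grey-level histogram."""
    hist = [0] * levels
    for v in img:
        hist[v] += 1
    n = len(img)
    return -sum(c / n * math.log2(c / n) for c in hist if c)

original = [10, 20, 30, 40, 50, 60, 70, 80]
processed = [12, 22, 29, 41, 52, 58, 71, 83]   # e.g. after fusion/sharpening

print(f"correlation: {correlation(original, processed):.3f}")
print(f"entropy original:  {entropy(original):.2f} bits")
```

A correlation near 1 indicates the processed image preserved the original's structure; higher entropy indicates richer grey-level content.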

  8. Phenomenological approach to mechanical damage growth analysis.

    PubMed

    Pugno, Nicola; Bosia, Federico; Gliozzi, Antonio S; Delsanto, Pier Paolo; Carpinteri, Alberto

    2008-10-01

    The problem of characterizing damage evolution in a generic material is addressed with the aim of tracing it back to existing growth models in other fields of research. Based on energetic considerations, a system evolution equation is derived for a generic damage indicator describing a material system subjected to an increasing external stress. The latter is found to fit into the framework of a recently developed phenomenological universality (PUN) approach and, more specifically, the so-called U2 class. Analytical results are confirmed by numerical simulations based on a fiber-bundle model and statistically assigned local strengths at the microscale. The fits with numerical data prove, with an excellent degree of reliability, that the typical evolution of the damage indicator belongs to the aforementioned PUN class. Applications of this result are briefly discussed and suggested.

  9. A meta-analysis of vitrectomy with or without internal limiting membrane peeling for macular hole retinal detachment in the highly myopic eyes.

    PubMed

    Gao, Xinxiao; Guo, Jia; Meng, Xin; Wang, Jun; Peng, Xiaoyan; Ikuno, Yasushi

    2016-06-13

    To evaluate the anatomical and visual outcomes of pars plana vitrectomy with or without internal limiting membrane (ILM) peeling in highly myopic eyes with macular hole retinal detachment (MHRD). MEDLINE (Ovid, PubMed) and EMBASE were searched for data up to September 30, 2015. Anatomical success (retinal reattachment), macular hole closure, and improved best corrected visual acuity (BCVA) at or beyond 6 months after operation were assessed as the primary outcome measures. The meta-analysis was performed with the fixed-effects model. Seven comparative analyses involving a total of 373 patients were included in the present meta-analysis. The pooled data showed a statistically significant relative risk (RR) for primary reattachment between the ILM peeling and non-peeling groups (RR, 1.19; 95 % CI, 1.04 to 1.36; P = 0.012). An effect favoring ILM peeling with regard to macular hole closure was also detected (RR, 1.71; 95 % CI, 1.20 to 2.43; P = 0.003). However, no statistically significant difference was found in improved BCVA (logarithm of the minimum angle of resolution) at 6 months or more (95 % CI, -0.31 to 0.44; P = 0.738). Thus there is no proven benefit in postoperative visual improvement, but the available evidence from this study suggests the superiority of ILM peeling over no peeling for myopic patients with MHRD.
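Fixed-effects pooling of relative risks uses inverse-variance weights on the log scale. A sketch with invented study counts, not the trials included in the paper:

```python
import math

def pooled_rr(studies, z=1.96):
    """Inverse-variance fixed-effect pooling of log relative risks.

    Each study is (events_treat, n_treat, events_ctrl, n_ctrl).
    """
    num = den = 0.0
    for a, n1, c, n2 in studies:
        log_rr = math.log((a / n1) / (c / n2))
        var = 1/a - 1/n1 + 1/c - 1/n2    # variance of log(RR)
        w = 1 / var
        num += w * log_rr
        den += w
    log_pooled = num / den
    se = math.sqrt(1 / den)
    return (math.exp(log_pooled),
            math.exp(log_pooled - z * se),
            math.exp(log_pooled + z * se))

# Illustrative counts only, not the trials analysed in the paper:
studies = [(28, 30, 22, 30), (45, 50, 38, 50), (25, 28, 20, 27)]
rr, lo, hi = pooled_rr(studies)
print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A pooled CI excluding 1, as in the paper's reattachment result, indicates a statistically significant effect.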

  10. A Computer Program to Implement the Chen Method of Dimensional Analysis

    DTIC Science & Technology

    1990-01-01

    Director: AXHE-S (Mr. B. Corona) U.S. Army TRADOC Systems Analysis Activity ATTN: AXrE-IS (Mr. B. Corona) ATTN: ATOR-TSL Aberdeen Proving Ground, MD 21005-5001...Laboratory 1 Aberdeen Proving Ground, MD 21005-5066 ATTN: AMSMI-ROC Redstone Arsenal, AL 35898-5242 Director U.S. Army Human Engineering Laboratory 1...Kokinakis) U.S. Army Missile Laboratory Aberdeen Proving Ground, MD 21005-5066 ATTN: AMSMI-R Redstone Arsenal, AL 35898-5242 Director Director 1 U.S. Army

  11. What are the most fire-dangerous atmospheric circulations in the Eastern-Mediterranean? Analysis of the synoptic wildfire climatology.

    PubMed

    Paschalidou, A K; Kassomenos, P A

    2016-01-01

    Wildfire management is closely linked to robust forecasts of changes in wildfire risk related to meteorological conditions. This link can be bridged either through fire weather indices or through statistical techniques that directly relate atmospheric patterns to wildfire activity. In the present work the COST-733 classification schemes are applied in order to link wildfires in Greece with synoptic circulation patterns. The analysis reveals that the majority of wildfire events can be explained by a small number of specific synoptic circulations, hence reflecting the synoptic climatology of wildfires. All eight classification schemes used show that the most fire-dangerous conditions in Greece are characterized by a combination of high atmospheric pressure systems located N to NW of Greece, coupled with lower pressures located over the very eastern part of the Mediterranean, an atmospheric pressure pattern closely linked to the local Etesian winds over the Aegean Sea. During these events, the atmospheric pressure has been reported to be anomalously high, while anomalously low 500 hPa geopotential heights and negative total column water anomalies were also observed. Among the various classification schemes used, the two Principal Component Analysis-based classifications, namely the PCT and the PXE, as well as the Leader Algorithm classification LND, proved to be the best options, in terms of being capable of isolating the vast majority of fire events in a small number of classes with increased frequency of occurrence. It is estimated that these three schemes, in combination with medium-range to seasonal climate forecasts, could be used by wildfire risk managers to provide increased wildfire prediction accuracy.

  12. Effects of complex interventions in ‘skin cancer prevention and treatment’: protocol for a mixed-method systematic review with qualitative comparative analysis

    PubMed Central

    Breitbart, Eckhard; Köberlein-Neu, Juliane

    2017-01-01

    Introduction Owing to ultraviolet radiation combined with depleted ozone levels, uncritical sun exposure and the use of tanning beds, an increasing number of people are affected by different types of skin cancer. However, preventive interventions such as skin cancer screening still lack evidence of effectiveness and are therefore criticised. Fundamental to an appropriate course of action is a critical examination of the parameters that are defined as measures of effectiveness. This research seeks to establish, through the available literature, the effects and conditions that prove the effectiveness of prevention strategies in skin cancer. Method and analysis A mixed-method approach is employed to combine quantitative and qualitative methods and to answer which effects can demonstrate effectiveness considering time horizon, perspective and organisational level, and which conditions are essential and sufficient to prove effectiveness and cost-effectiveness in skin cancer prevention strategies. A systematic review will be performed to identify studies of any design and assess the data quantitatively and qualitatively. Included studies for each key question will be summarised by characteristics such as population, intervention, comparison, outcomes, study design, endpoints, effect estimator and so on. Besides the statistical synthesis for a systematic review, the qualitative method of qualitative comparative analysis (QCA) will be performed. The expected outcomes of this review and QCA are the presence and absence of effects that are appropriate for application in effectiveness assessments and further cost-effectiveness assessment. Ethics and dissemination Formal ethical approval is not required as primary data will not be collected. Trial registration number International Prospective Register for Systematic Reviews number CRD42017053859. PMID:28877950

  13. Development of one novel multiple-target plasmid for duplex quantitative PCR analysis of roundup ready soybean.

    PubMed

    Zhang, Haibo; Yang, Litao; Guo, Jinchao; Li, Xiang; Jiang, Lingxi; Zhang, Dabing

    2008-07-23

    To enforce labeling regulations for genetically modified organisms (GMOs), the application of reference molecules as calibrators is becoming essential for practical quantification of GMOs. However, the reported reference molecules carrying multiple targets in tandem have proved unsuitable for duplex PCR analysis. In this study, we developed one unique plasmid molecule based on a pMD-18T vector with three exogenous target DNA fragments of Roundup Ready soybean GTS 40-3-2 (RRS), that is, the CaMV35S, NOS, and RRS event fragments, plus one fragment of the soybean endogenous Lectin gene. This Lectin gene fragment was separated from the three exogenous target DNA fragments of RRS by inserting one 2.6 kb DNA fragment unrelated to the RRS detection targets into the resultant plasmid. We then showed that this design allows the quantification of RRS using three duplex real-time PCR assays targeting the CaMV35S, NOS, and RRS events, employing this reference molecule as the calibrator. In these duplex PCR assays, the limits of detection (LOD) and quantification (LOQ) were 10 and 50 copies, respectively. For the quantitative analysis of practical RRS samples, accuracy and precision were similar to those of simplex PCR assays; for instance, at the 1% level the mean bias of the simplex and duplex PCR was 4.0% and 4.6%, respectively, and statistical analysis (t-test) showed that the quantitative data from duplex and simplex PCR had no significant discrepancy for each soybean sample. Duplex PCR analysis has the advantages of saving PCR reaction costs and reducing the experimental errors of simplex PCR testing. The strategy reported in the present study will be helpful for the development of new reference molecules suitable for duplex PCR quantitative assays of GMOs.

  14. Quantifying networks complexity from information geometry viewpoint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felice, Domenico, E-mail: domenico.felice@unicam.it; Mancini, Stefano; INFN-Sezione di Perugia, Via A. Pascoli, I-06123 Perugia

    We consider a Gaussian statistical model whose parameter space is given by the variances of random variables. Underlying this model we identify networks by interpreting random variables as sitting on vertices and their correlations as weighted edges among vertices. We then associate to the parameter space a statistical manifold endowed with a Riemannian metric structure (that of Fisher-Rao). Then, in analogy with the microcanonical definition of entropy in Statistical Mechanics, we introduce an entropic measure of network complexity. We prove that it is invariant under network isomorphism. Above all, considering networks as simplicial complexes, we evaluate this entropy on simplexes and find that it monotonically increases with their dimension.

  15. Adaptive interference cancel filter for evoked potential using high-order cumulants.

    PubMed

    Lin, Bor-Shyh; Lin, Bor-Shing; Chong, Fok-Ching; Lai, Feipei

    2004-01-01

    This paper presents evoked potential (EP) processing using an adaptive interference cancel (AIC) filter with second- and higher-order cumulants. In the conventional ensemble averaging method, experiments must be repeated many times to record the required data. Recently, the use of the AIC structure with second-order statistics in processing EPs has proved more efficient than the traditional averaging method, but it is sensitive both to the reference signal statistics and to the choice of step size. We therefore propose a higher-order-statistics-based AIC method to overcome these disadvantages. The study was carried out on somatosensory EPs corrupted with EEG. A gradient-type algorithm is used in the AIC method. Comparisons of the AIC filter with second-, third-, and fourth-order statistics are also presented in this paper. We observed that the AIC filter with third-order statistics has better convergence performance for EP processing and is not sensitive to the selection of step size and reference input.
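The baseline the paper improves upon is a second-order-statistics adaptive canceller. A minimal LMS sketch (the paper's cumulant-based update is not reproduced here) with a simulated sinusoidal "EP" buried in correlated noise; all signals and parameters are invented for illustration:

```python
import math
import random

random.seed(7)

# Primary channel: evoked-potential-like signal plus correlated noise.
# Reference channel: the noise source alone, as in an interference-cancel setup.
n = 4000
ref = [random.gauss(0.0, 1.0) for _ in range(n)]
signal = [math.sin(2 * math.pi * i / 50) for i in range(n)]
# Noise reaching the primary sensor is a filtered copy of the reference.
noise = [0.8 * ref[i] + 0.4 * (ref[i - 1] if i else 0.0) for i in range(n)]
primary = [s + v for s, v in zip(signal, noise)]

# LMS adaptive filter: learn weights mapping ref -> noise; residual = EP estimate.
taps, mu = 4, 0.01
w = [0.0] * taps
out = []
for i in range(taps, n):
    x = ref[i - taps + 1:i + 1][::-1]            # most recent sample first
    y = sum(wi * xi for wi, xi in zip(w, x))     # noise estimate
    e = primary[i] - y                           # residual = signal estimate
    w = [wi + 2 * mu * e * xi for wi, xi in zip(w, x)]
    out.append(e)

# After convergence the residual should be close to the clean signal.
tail = 1000
mse_before = sum(v ** 2 for v in noise[-tail:]) / tail
mse_after = sum((e - s) ** 2 for e, s in zip(out[-tail:], signal[-tail:])) / tail
print(f"noise power before: {mse_before:.3f}, after LMS: {mse_after:.3f}")
```

The cumulant-based variant replaces the second-order correlation in the weight update with third- or fourth-order statistics, reducing sensitivity to the step size and reference statistics.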

  16. JPSS Preparations at the Satellite Proving Ground for Marine, Precipitation, and Satellite Analysis

    NASA Technical Reports Server (NTRS)

    Folmer, Michael J.; Berndt, E.; Clark, J.; Orrison, A.; Kibler, J.; Sienkiewicz, J.; Nelson, J.; Goldberg, M.; Sjoberg, W.

    2016-01-01

    The Ocean Prediction Center, the National Hurricane Center's Tropical Analysis and Forecast Branch, the Weather Prediction Center, and the Satellite Analysis Branch of NESDIS make up the Satellite Proving Ground for Marine, Precipitation and Satellite Analysis. These centers had early exposure to JPSS products using the S-NPP satellite, launched in 2011. Forecasters continue to evaluate new products in anticipation of the launch of JPSS-1 sometime in 2017.

  17. Nanoengineered capsules for selective SERS analysis of biological samples

    NASA Astrophysics Data System (ADS)

    You, Yil-Hwan; Schechinger, Monika; Locke, Andrea; Coté, Gerard; McShane, Mike

    2018-02-01

    Metal nanoparticles conjugated with DNA oligomers have been intensively studied for a variety of applications, including optical diagnostics. Assays based on aggregation of DNA-coated particles in proportion to the concentration of a target analyte have not been widely adopted for clinical analysis, however, largely due to the nonspecific responses observed in complex biofluids. While sample pre-preparation such as dialysis helps to enable selective sensing, here we sought to show that assay encapsulation in hollow microcapsules could remove this requirement and thereby facilitate more rapid analysis of complex samples. Gold nanoparticle-based assays were incorporated into capsules composed of polyelectrolyte multilayers (PEMs), and the responses to small-molecule targets and larger proteins were compared. Encapsulated gold nanoparticles were able to selectively sense small Raman dyes (Rhodamine 6G) in the presence of large protein molecules (BSA). A ratiometric microRNA-17 sensing assay exhibited a drastic reduction in response after encapsulation, with statistically significant relative Raman intensity changes only at a microRNA-17 concentration of 10 nM, compared to a range of 0-500 nM for the corresponding solution-phase response.

  18. Ergonomical valorization of working spaces in multipurpose ships.

    PubMed

    Seif, Mehdi; Degiuli, Nastija; Muftić, Osman

    2003-06-01

    This work shows that anthropometric data are among the most important factors in the ergonomic evaluation of crew working spaces. A ship's working and living environment involves many unique human factors, notably the limited crew space considered here. Ships of different years of construction were chosen to illustrate this tendency. As a micro study, a work posture analysis using a pulling force experiment was performed in order to determine the lumbar moment and intra-abdominal pressure as measures for evaluating and comparing different crew work positions. As a macro study, a "crew work posture analysis" was carried out using data collected from real cases. The most probable work postures in different spaces of a ship were classified, and after some corrections of the workplace, the profile and its grade were determined. A "statistical analysis for real ship spaces" was also performed, as a further macro study, in order to assess some real ship space designs from the point of view of the allocated volume.

  19. Detection and discrimination of microorganisms on various substrates with quantum cascade laser spectroscopy

    NASA Astrophysics Data System (ADS)

    Padilla-Jiménez, Amira C.; Ortiz-Rivera, William; Rios-Velazquez, Carlos; Vazquez-Ayala, Iris; Hernández-Rivera, Samuel P.

    2014-06-01

    Investigations focusing on devising rapid and accurate methods for developing signatures for the detection, identification, and discrimination of microorganisms that could be used as biological warfare agents have recently increased significantly. Quantum cascade laser (QCL)-based spectroscopic systems have revolutionized many areas of defense and security, including this area of research. In this contribution, infrared spectroscopy based on QCLs was used to obtain the mid-infrared (MIR) spectral signatures of Bacillus thuringiensis, Escherichia coli, and Staphylococcus epidermidis. These bacteria were used as faithful simulants of biothreat microorganisms (biosimulants). The experiments were conducted in reflection mode with biosimulants deposited on various substrates including cardboard, glass, travel bags, wood, and stainless steel. Chemometric multivariate statistical routines, such as principal component regression and partial least squares coupled to discriminant analysis, were used to analyze the MIR spectra. Overall, the investigated infrared vibrational techniques were useful for detecting the target microorganisms on the studied substrates, and the multivariate data analysis techniques proved very efficient for classifying the bacteria and discriminating them in the presence of highly IR-interfering media.

  20. RNA-Seq Mouse Brain Regions Expression Data Analysis: Focus on ApoE Functional Network

    PubMed

    Babenko, Vladimir N; Smagin, Dmitry A; Kudryavtseva, Natalia N

    2017-09-13

    ApoE expression status has been proved to be a highly specific marker of the energy metabolism rate in the brain. Along with its neighbor, Translocase of Outer Mitochondrial Membrane 40 kDa (TOMM40), which is involved in mitochondrial metabolism, the corresponding genomic region constitutes a neuroenergetic hotspot. Using RNA-Seq data from a murine model of chronic stress, a significant positive expression coordination of seven neighboring genes in the ApoE locus was observed in five brain regions. ApoE maintains one of the highest absolute expression values genome-wide, implying that ApoE may be the driver of the neighboring gene expression alteration observed under stressful loads. Notably, we revealed a highly statistically significant increase of ApoE expression in the hypothalamus of chronically aggressive (FDR < 0.007) and defeated (FDR < 0.001) mice compared to controls. Correlation analysis revealed a close association between the ApoE and proopiomelanocortin (Pomc) gene expression profiles, implying a putative neuroendocrine stress response background for the observed ApoE expression elevation.

  1. Multiresolution analysis (discrete wavelet transform) through Daubechies family for emotion recognition in speech.

    NASA Astrophysics Data System (ADS)

    Campo, D.; Quintero, O. L.; Bastidas, M.

    2016-04-01

    We propose a study of the mathematical properties of voice as an audio signal. This work includes signals in which the channel conditions are not ideal for emotion recognition. Multiresolution analysis (discrete wavelet transform) was performed using the Daubechies wavelet family (Db1-Haar, Db6, Db8, Db10), allowing the decomposition of the initial audio signal into sets of coefficients from which a set of features was extracted and analyzed statistically in order to differentiate emotional states. Artificial neural networks (ANNs) proved to be a system that allows an appropriate classification of such states. This study shows that the features extracted using wavelet decomposition suffice to analyze and extract emotional content in audio signals, yielding a high accuracy rate in the classification of emotional states without the need for other kinds of classical frequency-time features. Accordingly, this paper seeks to characterize mathematically the six basic emotions in humans: boredom, disgust, happiness, anxiety, anger and sadness, plus neutrality, for a total of seven states to identify.
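The Db1 (Haar) member of the family is simple enough to implement directly. A sketch of multilevel decomposition with per-band energies as features; higher-order Daubechies filters (Db6-Db10) require longer filter banks and are omitted, and the 8-sample frame is a toy stand-in for a speech segment:

```python
import math

def haar_step(signal):
    """One level of the orthonormal Haar (Db1) transform: pairwise
    scaled averages (approximation) and differences (detail)."""
    approx = [(signal[i] + signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal), 2)]
    return approx, detail

def wavedec(signal, levels):
    """Multilevel decomposition; returns [detail_1, ..., detail_L, approx_L]."""
    coeffs = []
    approx = list(signal)
    for _ in range(levels):
        approx, detail = haar_step(approx)
        coeffs.append(detail)
    coeffs.append(approx)
    return coeffs

def band_energies(coeffs):
    """Per-band energy: a simple feature vector for emotion classification."""
    return [sum(c * c for c in band) for band in coeffs]

# Toy 8-sample frame standing in for a windowed speech segment.
frame = [1.0, 3.0, 2.0, 6.0, 5.0, 4.0, 0.0, 2.0]
coeffs = wavedec(frame, levels=3)
features = band_energies(coeffs)
print("band energies:", [round(e, 3) for e in features])
```

Because the transform is orthonormal, total energy is preserved across the bands, so the feature vector redistributes, rather than distorts, the signal's energy.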

  2. Generating a Multiphase Equation of State with Swarm Intelligence

    NASA Astrophysics Data System (ADS)

    Cox, Geoffrey

    2017-06-01

    Hydrocode calculations require knowledge of how a material's pressure varies with density and temperature, which is given by the equation of state. An accurate model needs to account for discontinuities in energy, density and material properties across a phase boundary. When generating a multiphase equation of state, the modeller attempts to balance agreement between the available data for compression, expansion and phase-boundary location. However, this can prove difficult because minor adjustments to the equation of state for a single phase can have a large impact on the overall phase diagram. Recently, Cox and Christie described a method for combining statistical-mechanics-based condensed matter physics models with a stochastic optimisation technique called particle swarm optimisation. The models produced show good agreement with experiment over a wide range of pressure-temperature space. This talk details the general implementation of this technique, shows example results, and describes the types of analysis that can be performed with this method.
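Particle swarm optimisation itself is straightforward to illustrate. The sketch below is a generic box-bounded PSO minimiser in Python; the objective, bounds and swarm parameters are illustrative placeholders, not the equation-of-state cost function used by Cox and Christie.

```python
import random

def pso(objective, bounds, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimiser over a box-bounded search space.

    Each particle keeps a velocity, its personal best, and is pulled
    toward both that personal best and the global best position."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Clamp back into the box after the velocity update.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

On a smooth 2-D quadratic this converges to the minimum in a few hundred iterations; fitting a real multiphase equation of state would substitute a misfit against compression, expansion and phase-boundary data as the objective.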

  3. Windshield splatter analysis with the Galaxy metagenomic pipeline

    PubMed Central

    Kosakovsky Pond, Sergei; Wadhawan, Samir; Chiaromonte, Francesca; Ananda, Guruprasad; Chung, Wen-Yu; Taylor, James; Nekrutenko, Anton

    2009-01-01

    How many species inhabit our immediate surroundings? A straightforward collection technique suitable for answering this question is known to anyone who has ever driven a car at highway speeds. The windshield of a moving vehicle is subjected to numerous insect strikes and can be used as a collection device for representative sampling. Unfortunately, the analysis of biological material collected in this manner, as with most metagenomic studies, proves rather demanding due to the large number of required tools and the considerable computational infrastructure. In this study, we use organic matter collected by a moving vehicle to design and test a comprehensive pipeline for phylogenetic profiling of metagenomic samples that covers all steps from the processing and quality control of data generated by next-generation sequencing technologies to statistical analyses and data visualization. To the best of our knowledge, this is also the first publication to feature a live online supplement providing access to the exact analyses and workflows used in the article. PMID:19819906

  4. “At 150 kg, you can't run”: men's weight loss stories in a popular health magazine provide appropriate examples of good health practice

    PubMed Central

    Couch, Danielle; Han, Gil-Soo; Robinson, Priscilla; Komesaroff, Paul

    2014-01-01

    We explore weight loss stories from 47 men collected from the Australian edition of Men's Health magazine between January 2009 and December 2012. Our analysis uses a mixed methods approach that combines thematic analysis and descriptive statistics to examine weight loss strategies against clinical practice guidelines for the management of overweight and obesity. All the stories reported the use of physical activity for weight loss and most stories detailed dietary changes for weight loss. Our findings indicate that most of the men reportedly used some form of behavioural strategies to assist them in their behaviour change efforts. The weight loss methods used were consistent with clinical practice guidelines, with the exception of some dietary practices. As narratives may assist with behaviour change, stories like those examined in this study could prove to be very useful in promoting weight loss to men. PMID:25750780

  5. Radiological and histopathological evaluation of experimentally-induced periapical lesion in rats

    PubMed Central

    TEIXEIRA, Renata Cordeiro; RUBIRA, Cassia Maria Fischer; ASSIS, Gerson Francisco; LAURIS, José Roberto Pereira; CESTARI, Tania Mary; RUBIRA-BULLEN, Izabel Regina Fischer

    2011-01-01

    Objective This study evaluated experimentally-induced periapical bone loss sites using digital radiographic and histopathologic parameters. Material and Methods Twenty-seven Wistar rats were submitted to coronal opening of their mandibular right first molars. They were radiographed at 2, 15 and 30 days after the operative procedure using two digital radiographic storage phosphor plate systems (Digora®). The images were analyzed by creating a region of interest at the periapical region of each tooth (ImageJ) and registering the corresponding pixel values. After sacrifice, the specimens were submitted to microscopic analysis in order to confirm the pulpal and periapical status of each tooth. Results There was a statistically significant difference between the control and test sides in all experimental periods regarding the pixel values (two-way ANOVA; p<0.05). Conclusions The microscopic analysis confirmed that periapical disease developed during the experimental periods, evolving from pulpal necrosis to periapical bone resorption. PMID:21922123

  6. Generating and Using Examples in the Proving Process

    ERIC Educational Resources Information Center

    Sandefur, J.; Mason, J.; Stylianides, G. J.; Watson, A.

    2013-01-01

    We report on our analysis of data from a dataset of 26 videotapes of university students working in groups of 2 and 3 on different proving problems. Our aim is to understand the role of example generation in the proving process, focusing on deliberate changes in representation and symbol manipulation. We suggest and illustrate four aspects of…

  7. Commognitive Analysis of Undergraduate Mathematics Students' Responses in Proving Subgroup's Non-Emptiness

    ERIC Educational Resources Information Center

    Ioannou, Marios

    2016-01-01

    Proving that a given set is indeed a subgroup, one needs to show that it is non-empty, and closed under operation and inverses. This study focuses on the first condition, analysing students' responses to this task. Results suggest that there are three distinct problematic responses: the total absence of proving this condition, the problematic…

  8. Using geovisual analytics in Google Earth to understand disease distribution: a case study of campylobacteriosis in the Czech Republic (2008-2012).

    PubMed

    Marek, Lukáš; Tuček, Pavel; Pászto, Vít

    2015-01-28

    Visual analytics aims to connect the processing power of information technologies with the user's capacity for logical thinking and reasoning through complex visual interaction. Moreover, most data contain a spatial component, so the need for geovisual tools and methods arises. One can either develop one's own system, in which case dissemination of findings and usability might be problematic, or utilize a widespread and well-known platform. The aim of this paper is to demonstrate the applicability of Google Earth™ software as a tool for geovisual analytics that helps to understand spatio-temporal patterns of disease distribution. We combined complex joint spatio-temporal analysis with comprehensive visualisation. We analysed the spatio-temporal distribution of campylobacteriosis in the Czech Republic between 2008 and 2012. We applied three main approaches in the study: (1) geovisual analytics of the surveillance data, visualised in the form of a bubble chart; (2) geovisual analytics of the disease's weekly incidence surfaces computed by spatio-temporal kriging; and (3) spatio-temporal scan statistics, employed in order to identify clusters of affected municipalities with high or low rates. The final data are stored in Keyhole Markup Language files and visualised in Google Earth™ in order to apply geovisual analytics. Using geovisual analytics we were able to display and retrieve information from a complex dataset efficiently. Instead of searching for patterns in a series of static maps or relying on numerical statistics alone, we created a set of interactive visualisations in order to explore and communicate the results of the analyses to a wider audience. The geovisual analytics identified periodical patterns in the behaviour of the disease as well as fourteen spatio-temporal clusters of increased relative risk.
We prove that Google Earth™ software is a usable tool for geovisual analysis of disease distribution. Google Earth™ has many indisputable advantages (it is widespread, freely available, and intuitive, with space-time visualisation capabilities, animations, and easy communication of results); nevertheless, it still needs to be combined with pre-processing tools that prepare the data into a form suitable for the geovisual analytics itself.

  9. Development of an instrument to analyze organizational characteristics in multidisciplinary care pathways; the case of colorectal cancer.

    PubMed

    Pluimers, Dorine J; van Vliet, Ellen J; Niezink, Anne Gh; van Mourik, Martijn S; Eddes, Eric H; Wouters, Michel W; Tollenaar, Rob A E M; van Harten, Wim H

    2015-04-09

    To analyze the organization of multidisciplinary care pathways such as colorectal cancer care, an instrument was developed based on a recently published framework earlier used to analyze (monodisciplinary) specialist cataract care from a lean perspective. The instrument was constructed using semi-structured interviews and direct observation of the colorectal care process based on a Rapid Plant Assessment. Six previously established lean aspects that strongly impact process design were investigated: operational focus, autonomous work cell, physical layout of resources, multi-skilled team, pull planning and non-value-adding activities. To test the reliability, clarity and face validity of the instrument, a pilot study was performed in eight Dutch hospitals. In the pilot it proved feasible to apply the instrument and generate the intended information. The instrument consisted of 83 quantitative and 24 qualitative items. Example results show differences in operational focus, the number of patient visits needed for diagnosis, the number of staff involved in treatment, the implementation of protocols and the utilization of one-stop shops. Identification of waste and non-value-adding activities may need further attention. Based on feedback from the clinicians involved, the face validity was acceptable and the results provided useful feedback and benchmark data. The instrument proved to be reliable and valid for broader implementation in Dutch health care. The limited number of cases made statistical analysis impossible, and further validation studies may shed better light on variation. This paper demonstrates the use of an instrument to analyze organizational characteristics in colorectal cancer care from a lean perspective. Wider use might help to identify best organizational practices for colorectal surgery. In larger series the instrument might be used for in-depth research into the relation between organization and patient outcomes.
Although we found no reason to adapt the underlying framework, recommendations were made for further development to enable use for different tumor types and treatment modalities and in larger (international) samples that allow more advanced statistical analysis. Capturing waste from defective care or from wasted human potential will require further elaboration of the instrument.

  10. Role of sufficient statistics in stochastic thermodynamics and its implication to sensory adaptation

    NASA Astrophysics Data System (ADS)

    Matsumoto, Takumi; Sagawa, Takahiro

    2018-04-01

    A sufficient statistic is a central concept in statistics: a random variable that carries all the information required for a given inference task. We investigate the roles of sufficient statistics and related quantities in stochastic thermodynamics. Specifically, we prove that for general continuous-time bipartite networks, the existence of a sufficient statistic implies that an informational quantity called the sensory capacity takes its maximum value. Since maximal sensory capacity imposes the constraint that the energetic efficiency cannot exceed one-half, our result implies that the existence of a sufficient statistic is inevitably accompanied by energetic dissipation. We also show that, in a particular parameter region of linear Langevin systems, there exists an optimal noise intensity at which the sensory capacity, the information-thermodynamic efficiency, and the total entropy production are optimized simultaneously. We apply our general result to a model of sensory adaptation in E. coli and find that the sensory capacity is nearly maximal for experimentally realistic parameters.

  11. A Diagrammatic Exposition of Regression and Instrumental Variables for the Beginning Student

    ERIC Educational Resources Information Center

    Foster, Gigi

    2009-01-01

    Some beginning students of statistics and econometrics have difficulty with traditional algebraic approaches to explaining regression and related techniques. For these students, a simple and intuitive diagrammatic introduction as advocated by Kennedy (2008) may prove a useful framework to support further study. The author presents a series of…

  12. Amino acid pair- and triplet-wise groupings in the interior of α-helical segments in proteins.

    PubMed

    de Sousa, Miguel M; Munteanu, Cristian R; Pazos, Alejandro; Fonseca, Nuno A; Camacho, Rui; Magalhães, A L

    2011-02-21

    A statistical approach has been applied to analyse primary-structure patterns at inner positions of α-helices in proteins. A systematic survey was carried out over a recent sample of non-redundant proteins selected from the Protein Data Bank, whose α-helix structures were analysed for amino acid pairing patterns. Only residues more than three positions from either terminus of the α-helix were considered inner. Amino acid pairings i, i+k (k = 1, 2, 3, 4, 5) were analysed and the corresponding 20×20 matrices of relative global propensities were constructed. An analysis of (i, i+4, i+8) and (i, i+3, i+4) triplet patterns was also performed. These analyses yielded information on a series of amino acid patterns (pairings and triplets) showing either high or low preference for α-helical motifs and suggested a novel approach to protein alphabet reduction. In addition, it has been shown that individual amino acid propensities are not enough to define the statistical distribution of these patterns: global pair propensities also depend on the type of pattern, its composition and its orientation in the protein sequence. The data presented should prove useful for obtaining and refining predictive rules that can further the development and fine-tuning of protein structure prediction algorithms and tools. Copyright © 2010 Elsevier Ltd. All rights reserved.
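The relative global propensity of a pairing (i, i+k) is, in essence, the observed pair frequency divided by the frequency expected from the individual residue abundances. The Python sketch below illustrates that idea; the paper's exact normalisation and counting windows may differ.

```python
from collections import Counter
from itertools import product

def pair_propensities(sequences, k=1):
    """Relative global propensity of residue pairings (i, i+k).

    Propensity > 1 means the pair occurs more often than expected
    from the individual residue frequencies; < 1 means less often."""
    singles = Counter()
    pairs = Counter()
    for seq in sequences:
        singles.update(seq)
        pairs.update((seq[i], seq[i + k]) for i in range(len(seq) - k))
    n_single = sum(singles.values())
    n_pair = sum(pairs.values())
    props = {}
    for a, b in product(singles, repeat=2):
        expected = (singles[a] / n_single) * (singles[b] / n_single)
        observed = pairs[(a, b)] / n_pair if n_pair else 0.0
        props[(a, b)] = observed / expected if expected else 0.0
    return props
```

Applied to the 20 standard amino acids this yields the 20×20 propensity matrix described in the abstract, one matrix per separation k.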

  13. A novel lenticular arena to quantify locomotor competence in walking fruit flies.

    PubMed

    Tom Mekdara, Nalong; Goto, Joy June; Choudhury, Songita; Jerry Mekdara, Prasong; Yingst, Nicholas; Cao, Yu; Berg, Otto; Katharina Müller, Ulrike

    2012-07-01

    Drosophila melanogaster has become an important invertebrate model organism in biological and medical research, for mutational and genetic analysis, and in toxicological screening. Many screening assays have been developed that assess the flies' mortality, reproduction, development, morphology, or behavioral competence. In this study, we describe a new assay for locomotor competence. It comprises a circular walking arena with a lenticular floor and a flat cover (the slope of the floor increases gradually from the center to the edge of the arena) plus automated fly tracking and statistical analysis. This simple modification of a flat arena presents a graduated physical challenge with which we can assess fine gradations of motor ability, since a fly's time-averaged radial distance from the arena center is a direct indicator of its climbing ability. The time-averaged distribution of flies as a function of slope, activity level, and walking speed yields a fine-grained picture of locomotor ability and motivation. We demonstrate the strengths and weaknesses of this assay (compared with a conventional tap-down test) by observing flies treated with a neurotoxin (BMAA) that acts as a glutamate agonist. The assay proves well suited to detecting dose and progression effects with higher statistical power than the traditional tap-down test, but it has a higher detection limit, making it less sensitive to treatment effects. © 2012 WILEY PERIODICALS, INC.

  14. Vision-based gait impairment analysis for aided diagnosis.

    PubMed

    Ortells, Javier; Herrero-Ezquerro, María Trinidad; Mollineda, Ramón A

    2018-02-12

    Gait is a firsthand reflection of health condition. This belief has inspired recent research efforts to automate the analysis of pathological gait in order to assist physicians in decision-making. However, most of these efforts rely on gait descriptions that are difficult for humans to understand, or on sensing technologies hardly available in ambulatory services. This paper proposes a number of semantic, normalized gait features computed from a single video acquired by a low-cost sensor. Far from being conventional spatio-temporal descriptors, the features are aimed at quantifying gait impairment, such as gait asymmetry from several perspectives, or falling risk. They were designed to be invariant to frame rate and image size, allowing cross-platform comparisons. Experiments were formulated in terms of two databases: a well-known general-purpose gait dataset used to establish normal references for the features, and a new database, introduced in this work, providing samples under eight different walking styles (one normal and seven impaired patterns). A number of statistical studies were carried out to demonstrate the sensitivity of the features in measuring the expected pathologies, providing sufficient evidence of their accuracy. Graphical Abstract: at the top, a robust, semantic and easy-to-interpret feature set to describe impaired gait patterns; at the bottom, a new dataset consisting of video recordings of volunteers simulating different patterns of pathological gait, on which the features were statistically assessed.

  15. Bayesian Reconstruction of Disease Outbreaks by Combining Epidemiologic and Genomic Data

    PubMed Central

    Jombart, Thibaut; Cori, Anne; Didelot, Xavier; Cauchemez, Simon; Fraser, Christophe; Ferguson, Neil

    2014-01-01

    Recent years have seen progress in the development of statistically rigorous frameworks to infer outbreak transmission trees (“who infected whom”) from epidemiological and genetic data. Making use of pathogen genome sequences in such analyses remains a challenge, however, with a variety of heuristic approaches having been explored to date. We introduce a statistical method exploiting both pathogen sequences and collection dates to unravel the dynamics of densely sampled outbreaks. Our approach identifies likely transmission events and infers dates of infections, unobserved cases and separate introductions of the disease. It also proves useful for inferring numbers of secondary infections and identifying heterogeneous infectivity and super-spreaders. After testing our approach using simulations, we illustrate the method with the analysis of the beginning of the 2003 Singaporean outbreak of Severe Acute Respiratory Syndrome (SARS), providing new insights into the early stage of this epidemic. Our approach is the first tool for disease outbreak reconstruction from genetic data widely available as free software, the R package outbreaker. It is applicable to various densely sampled epidemics, and improves previous approaches by detecting unobserved and imported cases, as well as allowing multiple introductions of the pathogen. Because of its generality, we believe this method will become a tool of choice for the analysis of densely sampled disease outbreaks, and will form a rigorous framework for subsequent methodological developments. PMID:24465202

  16. Performance evaluation of dispersion parameterization schemes in the plume simulation of FFT-07 diffusion experiment

    NASA Astrophysics Data System (ADS)

    Pandey, Gavendra; Sharan, Maithili

    2018-01-01

    Application of atmospheric dispersion models in air quality analysis requires a proper representation of the vertical and horizontal growth of the plume. For this purpose, various schemes for the parameterization of the dispersion parameters σ are described for both stable and unstable conditions. These schemes differ in (i) the extent to which on-site measurements are used, (ii) formulations developed for other sites and (iii) empirical relations. The performance of these schemes is evaluated in an earlier developed IIT (Indian Institute of Technology) dispersion model using data from single and multiple releases conducted at the Fusion Field Trials, Dugway Proving Ground, Utah, 2007. Qualitative and quantitative evaluation of the relative performance of all the schemes is carried out in both stable and unstable conditions in light of (i) peak/maximum concentrations and (ii) the overall concentration distribution. The blocked bootstrap resampling technique is adopted to investigate the statistical significance of the differences in performance of the schemes by computing 95% confidence limits on the parameters FB (fractional bias) and NMSE (normalised mean square error). The various analyses based on the selected statistical measures indicated consistency between the qualitative and quantitative performances of the σ schemes. The scheme based on the standard deviation of wind velocity fluctuations and Lagrangian time scales exhibits relatively better performance in predicting the peak as well as the lateral spread.
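FB and NMSE have standard definitions in dispersion-model evaluation: FB = 2(mean_obs − mean_pred)/(mean_obs + mean_pred) and NMSE = mean((obs − pred)²)/(mean_obs · mean_pred). The sketch below implements them together with a plain percentile bootstrap for the confidence limits; note that the paper uses a *blocked* bootstrap, which additionally preserves serial correlation and is not reproduced here.

```python
import random

def fb(obs, pred):
    """Fractional bias: 0 for a perfectly unbiased model."""
    mo = sum(obs) / len(obs)
    mp = sum(pred) / len(pred)
    return 2.0 * (mo - mp) / (mo + mp)

def nmse(obs, pred):
    """Normalised mean square error: 0 for a perfect model."""
    mo = sum(obs) / len(obs)
    mp = sum(pred) / len(pred)
    mse = sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs)
    return mse / (mo * mp)

def bootstrap_ci(obs, pred, stat, n_boot=1000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for a paired statistic,
    resampling (obs, pred) pairs with replacement."""
    rng = random.Random(seed)
    pairs = list(zip(obs, pred))
    vals = []
    for _ in range(n_boot):
        sample = [rng.choice(pairs) for _ in pairs]
        o, p = zip(*sample)
        vals.append(stat(o, p))
    vals.sort()
    lo = vals[int(alpha / 2 * n_boot)]
    hi = vals[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

Two schemes are then judged significantly different when their bootstrap confidence intervals for FB or NMSE do not overlap.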

  17. A Dilemma That Underlies an Existence Proof in Geometry

    ERIC Educational Resources Information Center

    Samper, Carmen; Perry, Patricia; Camargo, Leonor; Sáenz-Ludlow, Adalira; Molina, Óscar

    2016-01-01

    Proving an existence theorem is less intuitive than proving other theorems. This article presents a semiotic analysis of significant fragments of classroom meaning-making which took place during the class-session in which the existence of the midpoint of a line-segment was proven. The purpose of the analysis is twofold. First follow the evolution…

  18. Establishing the credibility of archaeoastronomical sites

    NASA Astrophysics Data System (ADS)

    Ruggles, Clive

    2016-10-01

    In 2011, an attempt to nominate a prehistoric ``observatory'' site onto the World Heritage List proved unsuccessful because UNESCO rejected the interpretation as statistically and archaeologically unproven. The case highlights an issue at the heart of archaeoastronomical methodology and interpretation: the mere existence of astronomical alignments in ancient sites does not prove that they were important to those who constructed and used the sites, let alone giving us insights into their likely significance and meaning. The fact that more archaeoastronomical sites are now appearing on national tentative lists prior to their WHL nomination means that this is no longer just an academic issue; establishing the credibility of the archaeoastronomical interpretations is crucial to any assessment of their value in heritage terms.

  19. Efficacy of local drug delivery of Achyranthes aspera gel in the management of chronic periodontitis: A clinical study

    PubMed Central

    Boyapati, Ramanarayana; Gojja, Prathibha; Chintalapani, Srikanth; Nagubandi, Kirankumar; Ramisetti, Arpita; Salavadhi, Shyam Sunder

    2017-01-01

    Context: Periodontitis is an inflammatory disease of microbial origin. Locally delivered antimicrobials reduce the subgingival flora. Achyranthes aspera gel has antimicrobial, antioxidant, anti-inflammatory, and immunostimulant effects. Aims: To evaluate the efficacy of local drug delivery of A. aspera gel in the management of chronic periodontitis. Materials and Methods: Thirty patients with chronic periodontitis were included in the study and categorized into two equal groups (Group A: scaling and root planing (SRP) with A. aspera gel; Group B: SRP with placebo gel). Patients were recruited from the Department of Periodontics, Mamata Dental College and Hospital. The clinical parameters (gingival index, bleeding on probing, probing pocket depth, and clinical attachment level) were recorded at baseline and 3 months. Statistical Analysis Used: All data were analyzed using SPSS version 18. Results: Clinical parameters were compared within each group from baseline to 3 months using the paired t-test; intergroup comparisons were made using the independent-samples t-test. Conclusions: A. aspera gel delivered locally along with SRP showed a beneficial effect. As a non-surgical local drug delivery system, A. aspera gel proved to be free of side effects in the management of periodontitis. A. aspera gel has strong anti-inflammatory effects in addition to its antioxidant activity. PMID:29386800
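The paired t-test used for the within-group comparison reduces to a one-sample t-test on the within-subject differences. A minimal Python sketch with illustrative data (not the study's):

```python
import math
from statistics import mean, stdev

def paired_t(baseline, followup):
    """Paired t statistic and degrees of freedom for pre/post measurements.

    t = mean(d) / (sd(d) / sqrt(n)), where d are the paired differences;
    compare |t| against the t distribution with n - 1 degrees of freedom."""
    diffs = [b - f for b, f in zip(baseline, followup)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
    return t, n - 1

# Hypothetical pocket-depth readings (mm) at baseline and 3 months:
t_stat, df = paired_t([5, 6, 7, 8, 9], [4, 4, 6, 5, 8])
```

The between-group comparison would instead use an independent-samples t-test, which pools the variances of the two groups rather than differencing paired readings.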

  20. Regional atrophy of the basal ganglia and thalamus in idiopathic generalized epilepsy.

    PubMed

    Du, Hanjian; Zhang, Yuanchao; Xie, Bing; Wu, Nan; Wu, Guocai; Wang, Jian; Jiang, Tianzi; Feng, Hua

    2011-04-01

    To determine the regional changes in the shapes of subcortical structures in idiopathic generalized epilepsy using a vertex-based analysis method. Earlier studies found that gray matter volume in the frontal, parietal, and temporal lobes is significantly altered in idiopathic generalized epilepsy (IGE). Research has indicated that a relationship exists between the brain's subcortical structures and epilepsy. However, little is known about possible changes in the subcortical structures in IGE. This study aims to determine the changes in the shape of subcortical structures in IGE using vertex analysis. Fourteen male patients with IGE and 28 age- and sex-matched healthy controls were included in this study, which used high-resolution magnetic resonance imaging. We performed a vertex-based shape analysis, in which we compared patients with IGE with the controls, on the subcortical structures that we had obtained from the MRI data. Statistical analysis showed significant regional atrophy in the left thalamus, left putamen and bilateral globus pallidus in patients with IGE. These results indicate that regional atrophy of the basal ganglia and the thalamus may be related to seizure disorder. In the future, these findings may prove useful for choosing new therapeutic regimens. Copyright © 2011 Wiley-Liss, Inc.

  1. Phytochemical diversity of cranberry (Vaccinium macrocarpon Aiton) cultivars by anthocyanin determination and metabolomic profiling with chemometric analysis.

    PubMed

    Brown, Paula N; Murch, Susan J; Shipley, Paul

    2012-01-11

    Originally native to the eastern United States, the American cranberry (Vaccinium macrocarpon Aiton, family Ericaceae) is now cultivated, in native and hybrid varieties, across North America. Herein is reported the phytochemical diversity of five cranberry cultivars (Stevens, Ben Lear, Bergman, Pilgrim, and GH1) collected in the Greater Vancouver Regional District, by anthocyanin content and UPLC-TOF-MS metabolomic profiling. The anthocyanin content for biological replicates (n = 5) was determined as 7.98 ± 5.83, Ben Lear; 7.02 ± 1.75, Bergman; 6.05 ± 2.51, GH1; 3.28 ± 1.88, Pilgrim; and 2.81 ± 0.81, Stevens. Using subtractive metabolomic algorithms, 6481 compounds were found to be conserved across all varietals, with 136 (Ben Lear), 84 (Bergman), 91 (GH1), 128 (Pilgrim), and 165 (Stevens) unique compounds observed. Principal component analysis (PCA) did not differentiate the varieties, whereas partial least-squares discriminant analysis (PLS-DA) exhibited clustering patterns. Univariate statistical approaches were applied to the data set to establish the significance of values and assess the quality of the models. Metabolomic profiling with chemometric analysis proved useful for characterizing metabolomic changes across cranberry varieties.

  2. [The glaucoma pharmacological treatment and biomechanical properties of the cornea].

    PubMed

    Liehneová, I; Karlovská, S

    2014-10-01

    To evaluate and compare the impact of long-term use of intraocular-pressure-lowering medication on the biomechanical properties of the cornea. A group of 305 eyes of 154 patients newly diagnosed with primary open angle glaucoma (POAG, n = 68) or ocular hypertension (OH, n = 6) was enrolled in a prospective cohort study. The control group comprised 80 untreated eyes of 40 patients with ocular hypertension and 80 eyes of 40 patients with no ocular pathology. The following parameters were evaluated: intraocular pressure (IOPg, IOPcc), corneal hysteresis (CH), corneal resistance factor (CRF) and central corneal thickness (CCT). The parameters were evaluated at baseline (untreated) and at follow-up intervals of 3, 6, 9 and 12 months; the same schedule was used for eyes in the control group. Eyes with POAG or OH were sorted into two groups depending on the type of medication applied: prostaglandin analogues, or carbonic anhydrase inhibitors alone or combined with betablockers. We did not prove any statistically significant difference in hysteresis between patients with newly diagnosed (yet untreated) POAG and the normal eyes in the control group (p = 0.238). We proved significantly higher values of CRF (p = 0.032) and CCT (p = 0.013) in the control group of untreated patients with ocular hypertension, confirming a higher number of patients with stiffer and thicker corneas. A statistically significant difference in CH and CRF was proved (p < 0.0001) in eyes treated with prostaglandin analogues during the follow-up period; in these eyes we also demonstrated a reduction of CCT (p < 0.001). We did not record any other statistically significant change in the remaining parameters. An increase of CH and CRF can indicate a change in the biomechanical properties of the cornea after long-term use of prostaglandin analogues. The biomechanical properties of the cornea were not affected by carbonic anhydrase inhibitors.
Further studies are required to establish the effect of long-term use of prostaglandin analogues on the accuracy of IOP measurements.

  3. On the application of the Principal Component Analysis for an efficient climate downscaling of surface wind fields

    NASA Astrophysics Data System (ADS)

    Chavez, Roberto; Lozano, Sergio; Correia, Pedro; Sanz-Rodrigo, Javier; Probst, Oliver

    2013-04-01

    With the purpose of efficiently and reliably generating long-term wind resource maps for the wind energy industry, this work presents the application and verification of a statistical methodology for the climate downscaling of wind fields at surface level. The procedure combines the Monte Carlo method with Principal Component Analysis (PCA). First, the Monte Carlo method is used to create a large number of daily-based annual time series, so-called climate representative years, by stratified sampling of a 33-year time series corresponding to the available period of the NCAR/NCEP global reanalysis data set (R-2). Second, the representative years are evaluated so that the best set is chosen according to its capability to recreate the sea level pressure (SLP) temporal and spatial fields of the R-2 data set. The measure of this correspondence is based on the Euclidean distance between the Empirical Orthogonal Function (EOF) spaces generated by PCA decomposition of the SLP fields from the long-term and the representative-year data sets. The methodology was verified by comparing the selected 365-day period against a 9-year period of wind fields generated by dynamically downscaling the Global Forecast System data with the mesoscale model SKIRON for the Iberian Peninsula. These results showed that, compared to the traditional method of dynamically downscaling a random 365-day period, the error in the average wind velocity obtained with the PCA representative year was reduced by almost 30%. Moreover, the mean absolute errors (MAE) in the monthly and daily wind profiles were also reduced by almost 25% across all SKIRON grid points. The methodology presented maximum errors in mean wind speed of 0.8 m/s and maximum MAE in the monthly curves of 0.7 m/s.
Besides the bulk numbers, this work shows the spatial distribution of the errors across the Iberian domain and additional wind statistics such as the velocity and directional frequency. Additional repetitions were performed to prove the reliability and robustness of this kind-of statistical-dynamical downscaling method.
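
    The EOF-space comparison at the core of this selection step can be sketched briefly. The following is an illustrative reconstruction, not the authors' code: the field shapes, the number of retained modes, and the sign-alignment step are assumptions.

```python
import numpy as np

def leading_eofs(field, n_modes=3):
    """PCA/EOF decomposition of a (time, space) field: SVD of the anomalies."""
    anomalies = field - field.mean(axis=0)
    _, _, vt = np.linalg.svd(anomalies, full_matrices=False)
    return vt[:n_modes]  # rows are the leading spatial EOF patterns

def eof_distance(field_a, field_b, n_modes=3):
    """Euclidean distance between the leading EOF patterns of two data sets."""
    ea = leading_eofs(field_a, n_modes)
    eb = leading_eofs(field_b, n_modes)
    # EOF signs are arbitrary, so align each pair of patterns before differencing
    signs = np.sign(np.sum(ea * eb, axis=1, keepdims=True))
    return float(np.linalg.norm(ea - signs * eb))

rng = np.random.default_rng(0)
long_term = rng.standard_normal((33 * 365, 50))   # stand-in for ~33 years of daily SLP fields
candidate_year = long_term[:365]                  # one candidate representative year
print(eof_distance(long_term, candidate_year))
```

    The candidate year whose EOF space lies closest to that of the full record would be retained as the representative year.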

  4. Measuring trust in nurses - Psychometric properties of the Trust in Nurses Scale in four countries.

    PubMed

    Stolt, Minna; Charalambous, Andreas; Radwin, Laurel; Adam, Christina; Katajisto, Jouko; Lemonidou, Chryssoula; Patiraki, Elisabeth; Sjövall, Katarina; Suhonen, Riitta

    2016-12-01

    The purpose of this study was to examine the psychometric properties of three translated versions of the Trust in Nurses Scale (TNS) and cancer patients' perceptions of trust in nurses in a sample of cancer patients from four European countries. A cross-sectional, cross-cultural, multi-site survey design was used. The data were collected with the Trust in Nurses Scale from patients with different types of malignancies in 17 units within five clinical sites (n = 599) between 09/2012 and 06/2014. Data were analyzed using descriptive and inferential statistics, multivariate methods and psychometric techniques, including exploratory factor analysis, Cronbach's alpha coefficients, item analysis and Rasch analysis. The psychometric properties of the data were consistent in all countries. Within the exploratory factor analysis, the principal component analysis supported the one-component structure (unidimensionality) of the TNS. The internal consistency reliability was acceptable. The Rasch analysis supported the unidimensionality of the TNS cross-culturally. All items of the TNS demonstrated acceptable goodness-of-fit to the Rasch model. Cancer patients trusted nurses to a great extent, although between-country differences were found. The Trust in Nurses Scale proved to be a valid and reliable tool for measuring patients' trust in nurses in oncological settings in international contexts. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Antimicrobial properties of Cocos nucifera (coconut) husk: An extrapolation to oral health

    PubMed Central

    Jose, Maji; Cyriac, Maria B; Pai, Vidya; Varghese, Ipe; Shantaram, Manjula

    2014-01-01

    Background and Objectives: Brushing the teeth with the fibrous husk of Cocos nucifera (coconut) is a common oral hygiene practice among people of rural areas of South India. However, the probable antimicrobial properties of this plant material against common oral pathogens have not been proved scientifically; therefore, the present study was designed. Materials and Methods: An alcoholic extract of the husk of Cocos nucifera was prepared, and its antimicrobial activity against common oral pathogens, such as cariogenic bacteria, periodontal pathogens, and candidal organisms, was assessed by the agar well diffusion method. The results obtained were then subjected to statistical analysis using one-way analysis of variance (ANOVA) and Tukey's Honestly Significant Difference (HSD) test. Results: The alcoholic extract of Cocos nucifera showed a significant concentration-dependent antimicrobial activity, expressed as a zone of inhibition, with respect to all tested organisms except Actinomyces species. The inhibitory effect was more pronounced with the majority of cariogenic organisms and Candida, with zones of inhibition ranging from 4.6 mm to 16.3 mm, although the effect of Cocos nucifera was smaller than that of chlorhexidine. The minimum inhibitory concentration (MIC) ranged from 50 mg/ml to 75 mg/ml. Conclusion: Cocos nucifera has a significant inhibitory action against common oral pathogens, indicating the presence of highly effective antimicrobial compounds, so its use can contribute to oral health to a great extent. Identification of these active compounds provides the scope for incorporating them into modern oral care products, so as to control oral diseases. PMID:25097415

  6. Negative pressure wound therapy with instillation, a cost-effective treatment for abdominal mesh exposure.

    PubMed

    Deleyto, E; García-Ruano, A; González-López, J R

    2018-04-01

    Negative pressure wound therapy with instillation (NPWTi) has proved to be a safe and effective treatment option for abdominal wall wound dehiscence with mesh exposure. Our aim in this study was to examine whether it is also cost-effective. We performed a retrospective cohort study of 45 patients treated for postoperative abdominal wall wound dehiscence with an exposed mesh: 34 were treated with conventional wound therapy (CWT) and 11 with NPWTi. We carried out a cost analysis for each treatment group using the Diagnosis-Related Group (DRG) system and a second evaluation using the calculated costs "per hospital stay". The differences between NPWTi and CWT were calculated with both evaluation systems, and comparative analysis was performed using the Mann-Whitney U test. Mean costs using the DRG estimation were 29,613.71€ for the CWT group and 15,093.37€ for the NPWTi group, and according to the calculated expenses "per hospital stay", 17,322.88€ for the CWT group and 15,284.22€ for the NPWTi group. NPWTi showed a reduction in the total cost of treatment, related to a reduction in episodes of hospitalization and in the number of surgeries required to achieve wound closure; however, the differences were not statistically significant in our sample. NPWTi proves to be an efficient treatment option for abdominal wall wound dehiscence with mesh exposure, compared to CWT. More trials aimed at optimizing treatment protocols should lead to an additional increase in NPWTi efficiency, and further studies with larger samples would be necessary to generalize our results.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tandon, Lav; Colletti, Lisa M.; Drake, Lawrence R.

    This report discusses the process used to prove in the SRNL-Rev.2 coulometer for isotopic data analysis used in the special plutonium material project. In May of 2012, the PAR 173 coulometer system that had been the workhorse of the Plutonium Assay team since the early 1970s became inoperable. A new coulometer system had been purchased from Savannah River National Laboratory (SRNL) and installed in August of 2011, but due to funding issues the new system was not qualified at that time. Following the failure of the PAR 173, it became necessary to qualify the new system for use in Process 3401a, Plutonium Assay by Controlled Coulometry. A qualification plan similar to what is described in PQR-141a was followed. Experiments were performed to establish a statistical summary of the performance of the new system by monitoring the repetitive analysis of the quality control sample, PEOL, and the assay of plutonium metals obtained from the Plutonium Exchange Program. The data for the experiments were acquired using work instructions ANC125 and ANC195. Figure 1 shows approximately 2 years of data for the PEOL material obtained using the PAR 173. The required acceptance criterion for the sample is that it returns the correct value for the quality control material of 88.00% within 2 sigma (95% confidence interval). It must also meet daily precision standards that are set from the historical analysis of decades of data. The 2 sigma value currently used is 0.146%, as evaluated by the Statistical Science Group, CCS-6. The average value of the PEOL quality control material run on 10 separate days on the SRNL-03 coulometer is 87.98%, with a relative standard deviation of 0.04 at the 95% confidence interval; these data were acquired between 5/23/2012 and 8/1/2012. The control samples are run every day that experiments using the coulometer are carried out, and are also used to prove that the instrument is in statistical control before any experiments are undertaken.
The total number of replicate controls run with the new coulometer to date is n = 18. This value is identical to that calculated by the LANL statistical group for this material from data produced by the PAR 173 system over the period of October 2007 to May 2011. The final validation/verification test was to run a blind sample over multiple days. AAC participates in a plutonium exchange program which supplies blind Pu metal samples to the group on a regular basis. The Pu material supplied for this study was run using the PAR 173 in the past and more recently with the new system. Table 1a contains the values determined through the use of the PAR 173 and Table 1b contains the values obtained with the new system. The Pu assay value obtained on the SRNL system is for paired analysis and had a value of 98.88 +/- 0.07% RSD at 95% CI. The Pu assay value (decay corrected to July 2012) of the material determined in prior measurements using the PAR 173 is 99.05 +/- 0.06% RSD at 95% CI. We believe that the instrument is adequate to meet the needs of the program.

  8. Regression Rates Following the Treatment of Aggressive Posterior Retinopathy of Prematurity with Bevacizumab Versus Laser: 8-Year Retrospective Analysis

    PubMed Central

    Nicoară, Simona D.; Ştefănuţ, Anne C.; Nascutzy, Constanta; Zaharie, Gabriela C.; Toader, Laura E.; Drugan, Tudor C.

    2016-01-01

    Background: Retinopathy is a serious complication of prematurity and a leading cause of childhood blindness. The aggressive posterior form of retinopathy of prematurity (APROP) has a worse anatomical and functional outcome following laser therapy than the classic form of the disease. The main outcome measures are the APROP regression rate, structural outcomes, and complications associated with intravitreal bevacizumab (IVB) versus laser photocoagulation in APROP. Material/Methods: This is a retrospective case series that includes infants with APROP who received either IVB or laser photocoagulation and had a follow-up of at least 60 weeks (for the laser photocoagulation group) or 80 weeks (for the IVB group). In the first group, laser photocoagulation of the retina was carried out, and in the second group, 1 bevacizumab injection was administered intravitreally. The following parameters were analyzed in each group: sex, gestational age, birth weight, postnatal age and postmenstrual age at treatment, APROP regression, sequelae, and complications. Statistical analysis was performed using Microsoft Excel and IBM SPSS (version 23.0). Results: The laser photocoagulation group consisted of 6 premature infants (12 eyes) and the IVB group consisted of 17 premature infants (34 eyes). Within the laser photocoagulation group, the evolution was favorable in 9 eyes (75%) and unfavorable in 3 eyes (25%). Within the IVB group, APROP regressed in 29 eyes (85.29%) and failed to regress in 5 eyes (14.71%). These differences are statistically significant, as proved by the McNemar test (P<0.001). Conclusions: In our series, the IVB group had a statistically significantly better outcome in APROP than the laser photocoagulation group. PMID:27062023

  9. Nutritional status and CD4 cell counts in patients with HIV/AIDS receiving antiretroviral therapy.

    PubMed

    Santos, Ana Célia Oliveira dos; Almeida, Ana Maria Rampeloti

    2013-01-01

    Even with current highly active antiretroviral therapy, individuals with AIDS continue to exhibit important nutritional deficits and reduced levels of albumin and hemoglobin, which may be directly related to their cluster of differentiation 4 (CD4) cell counts. The aim of this study was to characterize the nutritional status of individuals with human immunodeficiency virus/acquired immunodeficiency syndrome (HIV/AIDS) and relate the findings to the albumin level, hemoglobin level and CD4 cell count. Patients over 20 years of age with AIDS who were hospitalized in a university hospital and were receiving antiretroviral therapy were studied with regard to clinical, anthropometric, biochemical and sociodemographic characteristics. Body mass index, percentage of weight loss, arm circumference, triceps skinfold and arm muscle circumference were analyzed. Data on albumin, hemoglobin, hematocrit and CD4 cell count were obtained from patient charts. Statistical analysis was performed using Fisher's exact test, Student's t-test for independent variables and the Mann-Whitney U-test. The level of significance was set to 0.05 (α = 5%). Statistical analysis was performed using Statistical Package for the Social Sciences (SPSS) 17.0 software for Windows. Of the 50 patients evaluated, 70% were male. The prevalence of malnutrition was higher when the definition was based on arm circumference and triceps skinfold measurement. The concentrations of all biochemical variables were significantly lower among patients with a body mass index of less than 18.5 kg/m². The CD4 cell count, albumin, hemoglobin, hematocrit and anthropometric measures were directly related to each other. These findings underscore the importance of nutritional follow-up for underweight patients with AIDS, as nutritional status proved to be related to important biochemical alterations.

  10. Orthotopic bladder substitution in men revisited: identification of continence predictors.

    PubMed

    Koraitim, M M; Atta, M A; Foda, M K

    2006-11-01

    We determined the impact of the functional characteristics of the neobladder and urethral sphincter on continence results, and determined the most significant predictors of continence. A total of 88 male patients 29 to 70 years old underwent orthotopic bladder substitution with tubularized ileocecal segment (40) and detubularized sigmoid (25) or ileum (23). Uroflowmetry, cystometry and urethral pressure profilometry were performed at 13 to 36 months (mean 19) postoperatively. The correlation between urinary continence and 28 urodynamic variables was assessed. Parameters that correlated significantly with continence were entered into a multivariate analysis using a logistic regression model to determine the most significant predictors of continence. Maximum urethral closure pressure was the only parameter that showed a statistically significant correlation with diurnal continence. Nocturnal continence had not only a statistically significant positive correlation with maximum urethral closure pressure, but also statistically significant negative correlations with maximum contraction amplitude, and baseline pressure at mid and maximum capacity. Three of these 4 parameters, including maximum urethral closure pressure, maximum contraction amplitude and baseline pressure at mid capacity, proved to be significant predictors of continence on multivariate analysis. While daytime continence is determined by maximum urethral closure pressure, during the night it is the net result of 2 forces that have about equal influence but in opposite directions, that is maximum urethral closure pressure vs maximum contraction amplitude plus baseline pressure at mid capacity. 
Two equations were derived from the logistic regression model to predict the probability of continence after orthotopic bladder substitution, including Z1 (diurnal) = 0.605 + 0.0085 maximum urethral closure pressure and Z2 (nocturnal) = 0.841 + 0.01 [maximum urethral closure pressure - (maximum contraction amplitude + baseline pressure at mid capacity)].
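
    Assuming the standard logistic link (the abstract reports the linear predictors Z1 and Z2 but does not spell out the link function), the two predictive equations can be evaluated as follows; the example input pressures, in cm H2O, are hypothetical.

```python
import math

def logistic(z):
    """Map a linear predictor Z to a probability via the logistic function."""
    return 1.0 / (1.0 + math.exp(-z))

def p_continence_diurnal(mucp):
    # Z1 (diurnal) = 0.605 + 0.0085 * maximum urethral closure pressure
    return logistic(0.605 + 0.0085 * mucp)

def p_continence_nocturnal(mucp, mca, bp_mid):
    # Z2 (nocturnal) = 0.841 + 0.01 * [MUCP - (MCA + baseline pressure at mid capacity)]
    return logistic(0.841 + 0.01 * (mucp - (mca + bp_mid)))

# Hypothetical urodynamic readings
print(round(p_continence_diurnal(80.0), 3))
print(round(p_continence_nocturnal(80.0, 30.0, 20.0), 3))
```

    As the text describes, raising maximum urethral closure pressure increases both probabilities, while raising contraction amplitude or baseline pressure lowers only the nocturnal one.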

  11. Physics of negative absolute temperatures.

    PubMed

    Abraham, Eitan; Penrose, Oliver

    2017-01-01

    Negative absolute temperatures were introduced into experimental physics by Purcell and Pound, who successfully applied this concept to nuclear spins; nevertheless, the concept has proved controversial: a recent article aroused considerable interest by its claim, based on a classical entropy formula (the "volume entropy") due to Gibbs, that negative temperatures violated basic principles of statistical thermodynamics. Here we give a thermodynamic analysis that confirms the negative-temperature interpretation of the Purcell-Pound experiments. We also examine the principal arguments that have been advanced against the negative temperature concept; we find that these arguments are not logically compelling, and moreover that the underlying "volume" entropy formula leads to predictions inconsistent with existing experimental results on nuclear spins. We conclude that, despite the counterarguments, negative absolute temperatures make good theoretical sense and did occur in the experiments designed to produce them.

  12. Two models of the sound-signal frequency dependence on the animal body size as exemplified by the ground squirrels of Eurasia (mammalia, rodentia).

    PubMed

    Nikol'skii, A A

    2017-11-01

    Dependence of the sound-signal frequency on the animal body length was studied in 14 ground squirrel species (genus Spermophilus) of Eurasia. Regression analysis of the total sample yielded a low determination coefficient (R² = 26%), because the total sample proved to be heterogeneous in terms of signal frequency within the dimension classes of animals. When the total sample was divided into two groups according to signal frequency, two statistically significant models (regression equations) were obtained in which signal frequency depended on the body size with high determination coefficients (R² = 73% and 94%, versus 26% for the total sample). Thus, the problem of correlation between animal body size and the frequency of their vocal signals does not have a unique solution.

  13. A contribution to the calculation of measurement uncertainty and optimization of measuring strategies in coordinate measurement

    NASA Astrophysics Data System (ADS)

    Waeldele, F.

    1983-01-01

    The influence of sample shape deviations on the measurement uncertainties and the optimization of computer aided coordinate measurement were investigated for a circle and a cylinder. Using the complete error propagation law in matrix form the parameter uncertainties are calculated, taking the correlation between the measurement points into account. Theoretical investigations show that the measuring points have to be equidistantly distributed and that for a cylindrical body a measuring point distribution along a cross section is better than along a helical line. The theoretically obtained expressions to calculate the uncertainties prove to be a good estimation basis. The simple error theory is not satisfactory for estimation. The complete statistical data analysis theory helps to avoid aggravating measurement errors and to adjust the number of measuring points to the required measuring uncertainty.
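
    As a rough illustration of propagating point uncertainties into the fitted parameters of a circle, the sketch below uses a simple algebraic least-squares fit with the standard linear covariance formula. It is a simplification, not the complete error propagation law in matrix form used in the study (which also accounts for correlation between measuring points), and the measuring points are synthetic.

```python
import numpy as np

def fit_circle(x, y, sigma):
    """Algebraic least-squares circle fit with linear error propagation.

    Solves 2*a*x + 2*b*y + c = x**2 + y**2 for the center (a, b); the
    radius is sqrt(c + a**2 + b**2).  The covariance of (a, b, c) is
    sigma**2 * inv(A.T @ A), the standard linear least-squares formula
    for uncorrelated, equal-variance measurement noise.
    """
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    radius = np.sqrt(c + a**2 + b**2)
    cov = sigma**2 * np.linalg.inv(A.T @ A)
    return (a, b, radius), cov

# Equidistantly distributed measuring points, as the analysis recommends
theta = np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)
rng = np.random.default_rng(1)
sigma = 0.01  # assumed per-point measurement noise
x = 5.0 + 10.0 * np.cos(theta) + rng.normal(0.0, sigma, theta.size)
y = -3.0 + 10.0 * np.sin(theta) + rng.normal(0.0, sigma, theta.size)
(cx, cy, r), cov = fit_circle(x, y, sigma)
print(cx, cy, r)
```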

  14. Has neutrinoless double β decay of 76Ge really been observed?

    NASA Astrophysics Data System (ADS)

    Zdesenko, Yu. G.; Danevich, F. A.; Tretyak, V. I.

    2002-10-01

    The claim of discovery of the neutrinoless double beta (0ν2β) decay of 76Ge [Mod. Phys. Lett. A 16 (2001) 2409] is considered critically, and a firm conclusion that such a claim is, at the least, premature is derived on the basis of a simple statistical analysis of the measured spectra. This result is also proved by analyzing the cumulative data sets of the Heidelberg-Moscow and IGEX experiments. Besides, it allows us to establish the highest worldwide half-life limit on the 0ν2β decay of 76Ge: T1/2(0ν) ⩾ 2.5 (4.2)×10²⁵ yr at 90% (68%) C.L. This bound corresponds to the most stringent constraint on the Majorana neutrino mass: mν ⩽ 0.3 (0.2) eV at 90% (68%) C.L.

  15. Analyzing thematic maps and mapping for accuracy

    USGS Publications Warehouse

    Rosenfield, G.H.

    1982-01-01

    Two problems exist when attempting to test the accuracy of thematic maps and mapping: (1) evaluating the accuracy of thematic content, and (2) evaluating the effects of the variables on thematic mapping. Statistical analysis techniques are applicable to both of these problems and include techniques for sampling the data and determining their accuracy. In addition, techniques for hypothesis testing, or inferential statistics, are used when comparing the effects of variables. A comprehensive and valid accuracy test of a classification project, such as thematic mapping from remotely sensed data, includes the following components of statistical analysis: (1) sample design, including the sample distribution, sample size, size of the sample unit, and sampling procedure; and (2) accuracy estimation, including estimation of the variance and confidence limits. Careful consideration must be given to the minimum sample size necessary to validate the accuracy of a given classification category. The results of an accuracy test are presented in a contingency table, sometimes called a classification error matrix. Usually the rows represent the interpretation, and the columns represent the verification. The diagonal elements represent the correct classifications. The remaining elements of the rows represent errors of commission, and the remaining elements of the columns represent errors of omission. For tests of hypothesis that compare variables, the general practice has been to use only the diagonal elements from several related classification error matrices. These data are arranged in the form of another contingency table. The columns of the table represent the different variables being compared, such as different scales of mapping. The rows represent the blocking characteristics, such as the various categories of classification.
The values in the cells of the tables might be the counts of correct classification or the binomial proportions of these counts divided by either the row totals or the column totals from the original classification error matrices. In hypothesis testing, when the results of tests of multiple sample cases prove to be significant, some form of statistical test must be used to separate any results that differ significantly from the others. In the past, many analyses of the data in this error matrix were made by comparing the relative magnitudes of the percentage of correct classifications, for individual categories, the entire map, or both. More rigorous analyses have used data transformations and (or) two-way classification analysis of variance. A more sophisticated data analysis step would be to analyze the entire classification error matrices using the methods of discrete multivariate analysis or of multivariate analysis of variance.
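
    The row/column conventions described above can be made concrete with a small example. The matrix below is hypothetical; it only illustrates how overall accuracy and the errors of commission and omission are read off a classification error matrix.

```python
import numpy as np

# Hypothetical classification error matrix:
# rows = interpretation, columns = verification, diagonal = correct.
error_matrix = np.array([
    [48,  2,  0],
    [ 3, 44,  5],
    [ 1,  4, 43],
])

overall_accuracy = np.trace(error_matrix) / error_matrix.sum()

# Errors of commission: off-diagonal share of each row (interpretation)
commission = 1 - np.diag(error_matrix) / error_matrix.sum(axis=1)
# Errors of omission: off-diagonal share of each column (verification)
omission = 1 - np.diag(error_matrix) / error_matrix.sum(axis=0)

print(overall_accuracy)
print(commission)
print(omission)
```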

  16. On the dispute between Boltzmann and Gibbs entropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buonsante, Pierfrancesco; Franzosi, Roberto, E-mail: roberto.franzosi@ino.it; Smerzi, Augusto

    2016-12-15

The validity of the concept of negative temperature has been recently challenged by arguing that the Boltzmann entropy (which allows negative temperatures) is inconsistent from a mathematical and statistical point of view, whereas the Gibbs entropy (which does not admit negative temperatures) provides the correct definition of the microcanonical entropy. Here we prove that the Boltzmann entropy is thermodynamically and mathematically consistent. Analytical results on two systems supporting negative temperatures illustrate the scenario we propose. In addition, we numerically study a lattice system to show that negative-temperature equilibrium states are accessible and obey standard statistical mechanics predictions.
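
    A toy two-level spin model makes the disputed distinction concrete. This sketch is illustrative only and is not from the paper: above half filling, the Boltzmann (surface) entropy decreases with energy, giving a negative temperature, while the Gibbs (volume) entropy is monotone in energy and therefore can never yield one.

```python
from math import comb, log

N = 100  # two-level "spins", each carrying 0 or 1 unit of energy

def omega(n):
    """Microstate count at energy E = n (n excited spins)."""
    return comb(N, n)

def s_boltzmann(n):
    """Boltzmann (surface) entropy: log of the states AT energy n."""
    return log(omega(n))

def s_gibbs(n):
    """Gibbs (volume) entropy: log of the states with energy <= n."""
    return log(sum(omega(k) for k in range(n + 1)))

def beta_boltzmann(n):
    """Central-difference inverse temperature 1/T = dS/dE."""
    return (s_boltzmann(n + 1) - s_boltzmann(n - 1)) / 2.0

# Below half filling the Boltzmann temperature is positive; above it,
# the entropy decreases with energy and the temperature turns negative.
print(beta_boltzmann(20), beta_boltzmann(80))
# The Gibbs entropy keeps increasing with energy even above half filling.
print(s_gibbs(81) > s_gibbs(80))
```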

  17. Central Limit Theorems for Linear Statistics of Heavy Tailed Random Matrices

    NASA Astrophysics Data System (ADS)

    Benaych-Georges, Florent; Guionnet, Alice; Male, Camille

    2014-07-01

    We show central limit theorems (CLT) for the linear statistics of symmetric matrices with independent heavy-tailed entries, including entries in the domain of attraction of α-stable laws and entries with moments exploding with the dimension, as in the adjacency matrices of Erdős-Rényi graphs. For the second model, we also prove a central limit theorem for the moments of its empirical eigenvalue distribution. The limit laws are Gaussian, but unlike the case of standard Wigner matrices, the normalization is that of the classical CLT for independent random variables.

  18. ASSESSMENT OF GOOD PRACTICES IN HOSPITAL FOOD SERVICE BY COMPARING EVALUATION TOOLS.

    PubMed

    Macedo Gonçalves, Juliana; Lameiro Rodrigues, Kelly; Santiago Almeida, Ângela Teresinha; Pereira, Giselda Maria; Duarte Buchweitz, Márcia Rúbia

    2015-10-01

    Since food service in hospitals complements medical treatment, it should be produced in proper hygienic and sanitary conditions; it is a well-known fact that food-transmitted illnesses affect hospitalized and immunosuppressed patients with greater severity. In this study, good practices in hospital food service were evaluated by comparing assessment instruments. Good practices were evaluated by a verification list following Resolution of Collegiate Directory n. 216 of the Brazilian Agency for Sanitary Vigilance. Interpretation of the listed items followed the parameters of RCD 216 and of the Brazilian Association of Collective Meals Enterprises (BACME). Fisher's exact test was applied to detect whether there were statistically significant differences. Analysis of data grouping was undertaken with Unweighted Pair-Group using Arithmetic Averages, coupled to a correlation study between dissimilarity matrices to verify disagreement between the two methods. Good practice was classified with mean total rates above 75% by the two methods. There were statistically significant differences between the services and foods evaluated by the BACME instrument. The hospital food services proved to show conditions of acceptable good practices, and the comparison of interpretation tools based on RCD n. 216 and BACME provided similar results for the two classifications. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  19. Experimental measurement of flexion-extension movement in normal and corpse prosthetic elbow joint.

    PubMed

    TarniŢă, Daniela; TarniŢă, DănuŢ Nicolae

    2016-01-01

    This paper presents a comparative experimental study of the flexion-extension movement in the healthy elbow and in a prosthetic elbow joint fixed on an original experimental bench. Measurements were carried out in order to validate the functional morphology of a new ball-head-type elbow prosthesis. The three-dimensional (3D) model and the physical prototype of the experimental bench used to test the elbow endoprosthesis in flexion-extension and pronation-supination movements are presented. The measurements were carried out on a group of nine healthy subjects and on the cadaveric prosthetic elbow, the experimental data being obtained for flexion-extension movement cycles. Experimental data for the two different flexion-extension tests, for the nine subjects and for the cadaveric prosthetic elbow, were acquired using the SimiMotion video system and processed statistically. The corresponding graphs were obtained for all subjects in the experimental group and for the cadaveric prosthetic elbow for both flexion-extension tests. The statistical analysis proved that the flexion angles of the healthy elbows were significantly close to the values measured at the prosthetic elbow fixed on the experimental bench. The studied elbow prosthesis manages to re-establish a mobility of the elbow joint close to that of the normal one.

  20. Web-based data collection: detailed methods of a questionnaire and data gathering tool

    PubMed Central

    Cooper, Charles J; Cooper, Sharon P; del Junco, Deborah J; Shipp, Eva M; Whitworth, Ryan; Cooper, Sara R

    2006-01-01

    There have been dramatic advances in the development of web-based data collection instruments. This paper outlines a systematic web-based approach developed to facilitate this process through locally developed code, and describes the results of using this process after two years of data collection. We provide a detailed example of a web-based method that we developed for a study in Starr County, Texas, assessing high school students' work and health status. This web-based application includes the data instrument design, data entry and management, and the data tables needed to store the results, and attempts to maximize the advantages of this data collection method. The software also efficiently produces a coding manual, web-based statistical summary and crosstab reports, as well as input templates for use by statistical packages. Overall, web-based data entry using a dynamic approach proved to be a very efficient and effective data collection system. This data collection method expedited data processing and analysis and eliminated the need for cumbersome and expensive transfer and tracking of forms, data entry, and verification. The code has been made available for non-profit use only to the public health research community as a free download [1]. PMID:16390556

  1. Gaining Insights on Nasopharyngeal Carcinoma Treatment Outcome Using Clinical Data Mining Techniques.

    PubMed

    Ghaibeh, A Ammar; Kasem, Asem; Ng, Xun Jin; Nair, Hema Latha Krishna; Hirose, Jun; Thiruchelvam, Vinesh

    2018-01-01

    The analysis of Electronic Health Records (EHRs) is attracting a lot of research attention in the medical informatics domain. Hospitals and medical institutes started to use data mining techniques to gain new insights from the massive amounts of data that can be made available through EHRs. Researchers in the medical field have often used descriptive statistics and classical statistical methods to prove assumed medical hypotheses. However, discovering new insights from large amounts of data solely based on experts' observations is difficult. Using data mining techniques and visualizations, practitioners can find hidden knowledge, identify interesting patterns, or formulate new hypotheses to be further investigated. This paper describes a work in progress on using data mining methods to analyze clinical data of Nasopharyngeal Carcinoma (NPC) cancer patients. NPC is the fifth most common cancer among Malaysians, and the data analyzed in this study was collected from three states in Malaysia (Kuala Lumpur, Sabah and Sarawak), and is considered to be the largest up-to-date dataset of its kind. This research is addressing the issue of cancer recurrence after the completion of radiotherapy and chemotherapy treatment. We describe the procedure, problems, and insights gained during the process.

  2. Retrieval Capabilities of Hierarchical Networks: From Dyson to Hopfield

    NASA Astrophysics Data System (ADS)

    Agliari, Elena; Barra, Adriano; Galluzzi, Andrea; Guerra, Francesco; Tantari, Daniele; Tavani, Flavia

    2015-01-01

    We consider statistical-mechanics models for spin systems built on hierarchical structures, which provide a simple example of a non-mean-field framework. We show that the coupling decay with spin distance can give rise to peculiar features and phase diagrams much richer than their mean-field counterparts. In particular, we consider the Dyson model, mimicking ferromagnetism in lattices, and we prove the existence of a number of metastabilities, beyond the ordered state, which become stable in the thermodynamic limit. This feature is retained when the hierarchical structure is coupled with the Hebb rule for learning, hence mimicking the modular architecture of neurons, and gives rise to an associative network able to perform single-pattern as well as multiple-pattern retrieval, depending crucially on the external stimuli and on the rate of interaction decay with distance; however, these emergent multitasking features reduce the network capacity with respect to the mean-field counterpart. The analysis is accomplished through statistical mechanics, Markov chain theory, the signal-to-noise ratio technique, and numerical simulations, in full consistency. Our results shed light on the biological complexity shown by real networks, and suggest future directions for understanding more realistic models.

  3. Study of Montmorillonite Clay for the Removal of Copper (II) by Adsorption: Full Factorial Design Approach and Cascade Forward Neural Network

    PubMed Central

    Turan, Nurdan Gamze; Ozgonenel, Okan

    2013-01-01

    An intensive study has been made of the removal efficiency of Cu(II) from industrial leachate by biosorption onto montmorillonite. A 2⁴ factorial design and a cascade forward neural network (CFNN) were used to display the significance levels of the analyzed factors on the removal efficiency. The model obtained from the 2⁴ factorial design was tested using standard statistical methods. The statistical analysis proves that the main effects of the analyzed parameters were significant within a 95% confidence interval for the fitted linear model. The proposed CFNN model requires less experimental data and minimal calculation. Moreover, it is found to be cost-effective due to the inherent advantages of its network structure. Optimization of the levels of the analyzed factors was achieved by minimizing adsorbent dosage and contact time, which were costly, and maximizing Cu(II) removal efficiency. The suggested optimum conditions are initial pH of 6, adsorbent dosage of 10 mg/L, and contact time of 10 min using raw montmorillonite, with a Cu(II) removal of 80.7%. At the optimum values, removal efficiency increased to 88.91% when the modified montmorillonite was used.

  4. Association Between Plantar Fasciitis and Isolated Gastrocnemius Tightness.

    PubMed

    Nakale, Ngenomeulu T; Strydom, Andrew; Saragas, Nick P; Ferrao, Paulo N F

    2018-03-01

    An association between plantar fasciitis and isolated gastrocnemius tightness (IGT) has been postulated in the literature; however, there have been few studies to prove this relationship. This prospective cross-sectional cohort study was aimed at determining the association between plantar fasciitis and IGT. Three groups comprising 45 patients with plantar fasciitis (group 1), 117 patients with foot and ankle pathology other than plantar fasciitis (group 2), and 61 patients without foot and ankle pathology (group 3) were examined for the presence of IGT using the Silfverskiöld test. Statistical tests included chi-square test, Student t test, and analysis of variance. Of the patients, 101 (45.3%) had IGT: 36 (80%) in group 1, 53 (45.3%) in group 2, and 12 (19.7%) in group 3. The difference in IGT prevalence between the groups was statistically significant at P < .001. The prevalence of IGT was similar between acute and chronic plantar fasciitis at 78.9% and 80.6%, respectively. There was a very strong association between plantar fasciitis and IGT using group 3 as a reference. This study suggests that IGT should be actively sought out and managed in patients with plantar fasciitis. Level II, cross-sectional cohort prospective study.
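
    The group counts reported above (36/45, 53/117 and 12/61 with IGT) are enough to reproduce the chi-square test of independence. The following is a stdlib-only sketch; in practice a library routine such as SciPy's chi2_contingency would be used.

```python
# Contingency table from the reported counts (IGT vs. no IGT per group):
# group 1 (plantar fasciitis): 36 of 45; group 2 (other foot/ankle
# pathology): 53 of 117; group 3 (no pathology): 12 of 61.
observed = [[36, 45 - 36], [53, 117 - 53], [12, 61 - 12]]

def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (obs - expected) ** 2 / expected
    return stat

chi2 = chi_square(observed)
# df = (3 - 1) * (2 - 1) = 2; the critical value at p = .001 is 13.82,
# so a statistic near 38 is consistent with the reported P < .001.
```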

  5. Approach toward enhancement of halophilic protease production by Halobacterium sp. strain LBU50301 using statistical design response surface methodology.

    PubMed

    Chuprom, Julalak; Bovornreungroj, Preeyanuch; Ahmad, Mehraj; Kantachote, Duangporn; Dueramae, Sawitree

    2016-06-01

    A new potent halophilic protease producer, Halobacterium sp. strain LBU50301, was isolated from salt-fermented fish samples (budu) and identified by phenotypic analysis and 16S rDNA gene sequencing. Thereafter, a sequential statistical strategy was used to optimize halophilic protease production from Halobacterium sp. strain LBU50301 by shake-flask fermentation. The classical one-factor-at-a-time (OFAT) approach determined that gelatin was the best nitrogen source. Based on a Plackett–Burman (PB) experimental design, gelatin, MgSO₄·7H₂O, NaCl and pH significantly influenced halophilic protease production. Central composite design (CCD) determined the optimum levels of the medium components. Subsequently, an 8.78-fold increase in halophilic protease yield (156.22 U/mL) was obtained, compared with that produced in the original medium (17.80 U/mL). Validation experiments proved the adequacy and accuracy of the model; the predicted values agreed well with the experimental values. An overall 13-fold increase in halophilic protease yield (231.33 U/mL) was achieved using a 3 L laboratory fermenter and the optimized medium.

  6. [Respondent-Driven Sampling: a new sampling method to study visible and hidden populations].

    PubMed

    Mantecón, Alejandro; Juan, Montse; Calafat, Amador; Becoña, Elisardo; Román, Encarna

    2008-01-01

    The paper introduces a variant of chain-referral sampling: respondent-driven sampling (RDS). This sampling method shows that methods based on network analysis can be combined with the statistical validity of standard probability sampling methods. In this sense, RDS appears to be a mathematical improvement of snowball sampling oriented to the study of hidden populations. However, we test its validity with populations that are not covered by a sampling frame but can nonetheless be contacted without difficulty. The basics of RDS are explained through our research on young people (aged 14 to 25) who go clubbing, consume alcohol and other drugs, and have sex. Fieldwork was carried out between May and July 2007 in three Spanish regions: Baleares, Galicia and Comunidad Valenciana. The study demonstrates the utility of this type of sampling when the population is accessible but no sampling frame is available. However, the sample obtained is not a statistically representative random sample of the target population. It must be acknowledged that the final sample is representative of a 'pseudo-population' that approximates the target population but is not identical to it.

  7. Implantable cardioverter defibrillators for primary prevention in patients with nonischemic cardiomyopathy: A systematic review and meta-analysis.

    PubMed

    Akel, Tamer; Lafferty, James

    2017-06-01

    Implantable cardioverter defibrillators (ICDs) have proved to improve survival in selected patients with cardiomyopathy. Although previous meta-analyses have shown benefit for their use in primary prevention, the evidence remains less robust for patients with nonischemic cardiomyopathy (NICM) than for patients with coronary artery disease (CAD). The objective was to evaluate the effect of ICD therapy on reducing all-cause mortality and sudden cardiac death (SCD) in patients with NICM. Sources searched were PubMed (1993-2016), the Cochrane Central Register of Controlled Trials (2000-2016), reference lists of relevant articles, and previous meta-analyses. Search terms included defibrillator, heart failure, cardiomyopathy, randomized controlled trials, and clinical trials. Eligible trials were randomized controlled trials with at least one ICD arm and one medical-therapy arm that enrolled some patients with NICM, and whose primary endpoint included all-cause mortality or mortality from SCD. Hazard ratios (HRs) for all-cause mortality and mortality from SCD were either extracted or calculated along with their standard errors. Of the 1047 abstracts retained by the initial screen, eight randomized controlled trials were identified. Five of these trials reported relevant data on patients with NICM and were included in this meta-analysis. Pooled analysis of HRs showed a statistically significant reduction in all-cause mortality among a total of 2573 patients randomized to ICD vs medical therapy (HR 0.80; 95% CI, 0.67-0.96; P=.02). Pooled analysis of HRs for mortality from SCD was also statistically significant (n=1677) (HR 0.51; 95% CI, 0.34-0.76; P=.001). ICD implantation is beneficial in terms of all-cause mortality and mortality from SCD in certain subgroups of patients with NICM. © 2017 John Wiley & Sons Ltd.
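
    Pooling hazard ratios across trials, as described above, is typically done by inverse-variance weighting on the log scale. A minimal fixed-effect sketch follows; the input values below are hypothetical illustrations, not the trial data.

```python
from math import exp, log, sqrt

def pool_hazard_ratios(hrs, ses):
    """Fixed-effect inverse-variance pooling of hazard ratios.

    hrs: per-trial hazard ratios; ses: standard errors of log(HR).
    Returns (pooled HR, 95% CI lower bound, 95% CI upper bound).
    """
    weights = [1.0 / se ** 2 for se in ses]
    log_pooled = sum(w * log(hr) for w, hr in zip(weights, hrs)) / sum(weights)
    se_pooled = 1.0 / sqrt(sum(weights))
    return (exp(log_pooled),
            exp(log_pooled - 1.96 * se_pooled),
            exp(log_pooled + 1.96 * se_pooled))

# Hypothetical per-trial HRs and standard errors (illustration only):
hr, lo, hi = pool_hazard_ratios([0.73, 0.87, 0.77], [0.15, 0.12, 0.20])
```

    A random-effects version would additionally estimate between-trial heterogeneity (e.g., the DerSimonian-Laird tau-squared) and inflate the weights accordingly.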

  8. Radiation shielding quality assurance

    NASA Astrophysics Data System (ADS)

    Um, Dallsun

    For radiation shielding quality assurance, the validity and reliability of the neutron transport code MCNP, now one of the most widely used radiation shielding analysis codes, were checked against a number of benchmark experiments; the following work was performed in this thesis as a practical example. An integral neutron transport experiment to measure the effect of neutron streaming in iron and void was performed with the Dog-Legged Void Assembly at Knolls Atomic Power Laboratory in 1991. Neutron flux was measured at six different locations with methane detectors and a BF₃ detector. The main purpose of the measurements was to provide a benchmark against which various neutron transport calculation tools could be compared. These data were used to verify the Monte Carlo Neutron & Photon transport code, MCNP, with a model of the assembly. Experimental and calculated results were compared in two ways: as the total neutron flux integrated over the energy range from 10 keV to 2 MeV, and as the neutron spectrum across that energy range. The two agreed within the statistical error of ±20%. MCNP results were also compared with those of TORT, a three-dimensional discrete ordinates code developed by Oak Ridge National Laboratory. MCNP results were superior to the TORT results at all detector locations except one. MCNP thus proved to be a very powerful tool for the analysis of neutron transport through iron and air, and could further serve as a powerful tool for radiation shielding analysis. As an application of the analysis of variance (ANOVA) to neutron and gamma transport problems, uncertainties in the calculated values of the criticality constant k were evaluated by ANOVA on the statistical data.

  9. Separate modal analysis for tumor detection with a digital image elasto tomography (DIET) breast cancer screening system.

    PubMed

    Kashif, Amer S; Lotz, Thomas F; Heeren, Adrianus M W; Chase, James G

    2013-11-01

    It is estimated that every year 1 × 10⁶ women are diagnosed with breast cancer, and more than 410,000 die annually worldwide. Digital Image Elasto Tomography (DIET) is a new noninvasive breast cancer screening modality that induces mechanical vibrations in the breast and images its surface motion with digital cameras to detect changes in stiffness. This research develops a new automated approach for diagnosing breast cancer using DIET based on a modal analysis model. The first and second natural frequencies of silicone phantom breasts are analyzed. Separate modal analysis is performed for each region of the phantom to estimate the modal parameters using imaged motion data over several input frequencies. Statistical methods are used to assess the likelihood of a frequency shift, which can indicate tumor location. Phantoms with 5, 10, and 20 mm stiff inclusions are tested, as well as a homogeneous (healthy) phantom. Inclusions are placed at four locations at different depths. The second natural frequency proves to be a reliable metric with the potential to clearly distinguish lesion-like inclusions of different stiffness, as well as providing an approximate location for the tumor-like inclusions. The 10 and 20 mm inclusions are always detected regardless of depth. The 5 mm inclusions are only detected near the surface. The homogeneous phantom always yields a negative result, as expected. Detection is based on a statistical likelihood analysis to determine the presence of a significantly different frequency response over the phantom, which is a novel approach to this problem. The overall results show promise and justify proof-of-concept trials with human subjects.

  10. Task-Related Edge Density (TED)—A New Method for Revealing Dynamic Network Formation in fMRI Data of the Human Brain

    PubMed Central

    Lohmann, Gabriele; Stelzer, Johannes; Zuber, Verena; Buschmann, Tilo; Margulies, Daniel; Bartels, Andreas; Scheffler, Klaus

    2016-01-01

    The formation of transient networks in response to external stimuli or as a reflection of internal cognitive processes is a hallmark of human brain function. However, its identification in fMRI data of the human brain is notoriously difficult. Here we propose a new method of fMRI data analysis that tackles this problem by considering large-scale, task-related synchronisation networks. Networks consist of nodes and edges connecting them, where nodes correspond to voxels in fMRI data, and the weight of an edge is determined via task-related changes in dynamic synchronisation between their respective time series. Based on these definitions, we developed a new data analysis algorithm that identifies edges that show differing levels of synchrony between two distinct task conditions and that occur in dense packs with similar characteristics. Hence, we call this approach “Task-related Edge Density” (TED). TED proved to be a very strong marker for dynamic network formation that easily lends itself to statistical analysis using large-scale statistical inference. A major advantage of TED compared to other methods is that it does not depend on any specific hemodynamic response model, and it also does not require a presegmentation of the data for dimensionality reduction, as it can handle large networks consisting of tens of thousands of voxels. We applied TED to fMRI data of a finger-tapping and an emotion processing task provided by the Human Connectome Project. TED revealed network-based involvement of a large number of brain areas that evaded detection using traditional GLM-based analysis. We show that our proposed method provides an entirely new window into the immense complexity of human brain function. PMID:27341204

  12. Applying the LANL Statistical Pattern Recognition Paradigm for Structural Health Monitoring to Data from a Surface-Effect Fast Patrol Boat

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoon Sohn; Charles Farrar; Norman Hunter

    2001-01-01

    This report summarizes the analysis of fiber-optic strain gauge data obtained from a surface-effect fast patrol boat being studied by the staff at the Norwegian Defense Research Establishment (NDRE) in Norway and the Naval Research Laboratory (NRL) in Washington, D.C. Data from two different structural conditions were provided to the staff at Los Alamos National Laboratory. The problem was then approached from a statistical pattern recognition paradigm. This paradigm can be described as a four-part process: (1) operational evaluation, (2) data acquisition and cleansing, (3) feature extraction and data reduction, and (4) statistical model development for feature discrimination. Given that the first two portions of this paradigm were mostly completed by the NDRE and NRL staff, this study focused on data normalization, feature extraction, and statistical modeling for feature discrimination. The feature extraction process began by looking at relatively simple statistics of the signals and progressed to using the residual errors from auto-regressive (AR) models fit to the measured data as the damage-sensitive features. Data normalization proved to be the most challenging portion of this investigation. A novel approach to data normalization, where the residual errors in the AR model are considered to be an unmeasured input and an auto-regressive model with exogenous inputs (ARX) is then fit to portions of the data exhibiting similar waveforms, was successfully applied to this problem. With this normalization procedure, a clear distinction between the two different structural conditions was obtained. A false-positive study was also run, and the procedure developed herein did not yield any false-positive indications of damage. Finally, the results must be qualified by the fact that this procedure has only been applied to very limited data samples. A more complete analysis of additional data taken under various operational and environmental conditions, as well as other structural conditions, is necessary before one can definitively state that the procedure is robust enough to be used in practice.
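
    The core feature described above (residual errors of an AR model fit to baseline data, which inflate when the structural dynamics change) can be sketched as follows. This is an illustration on synthetic AR(2) signals, not the report's ARX normalization step; all coefficients and noise levels are hypothetical.

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares fit of AR(p) coefficients to a 1-D signal x."""
    X = np.column_stack([x[p - k: len(x) - k] for k in range(1, p + 1)])
    coeffs, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coeffs

def residual_rms(x, coeffs):
    """RMS of one-step-ahead prediction errors of the AR model on x."""
    p = len(coeffs)
    X = np.column_stack([x[p - k: len(x) - k] for k in range(1, p + 1)])
    return float(np.sqrt(np.mean((x[p:] - X @ coeffs) ** 2)))

def simulate_ar2(a1, a2, n, rng, noise=0.1):
    """Synthetic AR(2) signal standing in for a strain-gauge channel."""
    x = np.zeros(n)
    for t in range(2, n):
        x[t] = a1 * x[t - 1] + a2 * x[t - 2] + noise * rng.standard_normal()
    return x

rng = np.random.default_rng(0)
baseline = simulate_ar2(0.6, -0.2, 3000, rng)  # "undamaged" condition
changed = simulate_ar2(0.2, 0.5, 3000, rng)    # altered dynamics as a stand-in for damage

coeffs = fit_ar(baseline, 2)
rms_base = residual_rms(baseline, coeffs)      # near the driving-noise level
rms_changed = residual_rms(changed, coeffs)    # inflated by model mismatch
```

    A damage indicator then amounts to flagging records whose residual RMS exceeds the baseline distribution by a chosen statistical margin.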

  13. Gender Differences in Research Patterns among PhD Economists

    ERIC Educational Resources Information Center

    Barbezat, Debra A.

    2006-01-01

    This study is based on a 1996 survey of PhD economists working in the academic and nonacademic sectors since 1989. Despite a raw gender difference in all types of research output, the male dummy variable proves statistically significant in predicting only one publication measure. In a full sample and faculty subsample, number of years since…

  14. Fusion And Inference From Multiple And Massive Disparate Distributed Dynamic Data Sets

    DTIC Science & Technology

    2017-07-01

    ...principled methodology for two-sample graph testing; designed a provably almost-surely perfect vertex clustering algorithm for block model graphs; proved... Report topics include semi-supervised clustering methodology and robust hypothesis testing. Embedding graphs into ...dimensional Euclidean space allows the full arsenal of statistical and machine learning methodology for multivariate Euclidean data to be deployed.

  15. Stress in Professionals and Non-Professionals, Men and Women.

    ERIC Educational Resources Information Center

    Barko, Naomi

    1983-01-01

    Researchers are finding that high demands plus low control over how the job is done make a job stressful. According to Professor Robert Karasek of Columbia University, the statistics on heart disease and high blood pressure prove that nonprofessional workers such as typists are under more stress than professional workers such as teachers. Dr.…

  16. Does Affirmative Action Really Hurt Blacks and Latinos in U.S. Law Schools? TRPI Policy Brief

    ERIC Educational Resources Information Center

    Kidder, William C.

    2005-01-01

    In a "Stanford Law Review" article, University of California, Los Angeles (UCLA) law professor Richard Sander claimed to statistically prove that affirmative action at American law schools actually depressed the number of African Americans who become lawyers by "mismatching" them at schools where they were in over their heads academically. This…

  17. Principal component analysis of normalized full spectrum mass spectrometry data in multiMS-toolbox: An effective tool to identify important factors for classification of different metabolic patterns and bacterial strains.

    PubMed

    Cejnar, Pavel; Kuckova, Stepanka; Prochazka, Ales; Karamonova, Ludmila; Svobodova, Barbora

    2018-06-15

    Explorative statistical analysis of mass spectrometry data is still a time-consuming step. We analyzed critical factors for the application of principal component analysis (PCA) in mass spectrometry and focused on two whole-spectrum normalization techniques and their application in the analysis of registered peak data and, in comparison, in full spectrum data analysis. We used this technique to identify different metabolic patterns in the bacterial culture of Cronobacter sakazakii, an important foodborne pathogen. Two software utilities were implemented: ms-alone, a Python-based utility for mass spectrometry data preprocessing and peak extraction, and the multiMS-toolbox, an R software tool for advanced peak registration and detailed explorative statistical analysis. The bacterial culture of Cronobacter sakazakii was cultivated on Enterobacter sakazakii Isolation Agar, Blood Agar Base and Tryptone Soya Agar for 24 h and 48 h and applied by the smear method on an Autoflex Speed MALDI-TOF mass spectrometer. For the three tested cultivation media, only two different metabolic patterns of Cronobacter sakazakii were identified using PCA applied to data normalized by the two different normalization techniques. Results from matched peak data and subsequent detailed full spectrum analysis likewise identified only two different metabolic patterns: cultivation on Enterobacter sakazakii Isolation Agar showed significant differences to cultivation on the other two tested media. The metabolic patterns for all tested cultivation media also proved dependent on cultivation time. Both whole-spectrum normalization techniques, together with full spectrum PCA, allow identification of important discriminative factors in experiments with several variable condition factors, avoiding problems with improper identification of peaks or undue emphasis on below-threshold peak data. The amount of processed data remains manageable.
Both implemented software utilities are available free of charge from http://uprt.vscht.cz/ms. Copyright © 2018 John Wiley & Sons, Ltd.
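
    The pipeline described (whole-spectrum normalization followed by PCA on full spectra) can be sketched in a few lines. This is a minimal illustration on synthetic spectra, not the authors' ms-alone/multiMS-toolbox implementation; the normalization shown (scaling each spectrum to unit total intensity) is one common whole-spectrum technique.

```python
import numpy as np

def normalize_total(spectra):
    """Whole-spectrum normalization: scale each spectrum to unit total intensity."""
    spectra = np.asarray(spectra, dtype=float)
    return spectra / spectra.sum(axis=1, keepdims=True)

def pca_scores(spectra, n_components=2):
    """PCA via SVD of the mean-centered data matrix; returns sample scores."""
    X = spectra - spectra.mean(axis=0)
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :n_components] * S[:n_components]

# Synthetic example: two 'metabolic patterns' differing by one peak position.
rng = np.random.default_rng(1)
group_a = np.stack([np.ones(50) + rng.normal(0, 0.02, 50) for _ in range(10)])
group_b = np.stack([np.ones(50) + rng.normal(0, 0.02, 50) for _ in range(10)])
group_a[:, 10] += 5.0   # discriminating peak in bin 10
group_b[:, 30] += 5.0   # discriminating peak in bin 30
scores = pca_scores(normalize_total(np.vstack([group_a, group_b])))
# The first principal component separates the two groups.
```

    Inspecting the loadings (rows of Vt) then identifies which spectral bins drive the separation, which is the "important discriminative factors" step of the abstract.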

  18. Analysis of Rare, Exonic Variation amongst Subjects with Autism Spectrum Disorders and Population Controls

    PubMed Central

    Liu, Li; Sabo, Aniko; Neale, Benjamin M.; Nagaswamy, Uma; Stevens, Christine; Lim, Elaine; Bodea, Corneliu A.; Muzny, Donna; Reid, Jeffrey G.; Banks, Eric; Coon, Hillary; DePristo, Mark; Dinh, Huyen; Fennel, Tim; Flannick, Jason; Gabriel, Stacey; Garimella, Kiran; Gross, Shannon; Hawes, Alicia; Lewis, Lora; Makarov, Vladimir; Maguire, Jared; Newsham, Irene; Poplin, Ryan; Ripke, Stephan; Shakir, Khalid; Samocha, Kaitlin E.; Wu, Yuanqing; Boerwinkle, Eric; Buxbaum, Joseph D.; Cook, Edwin H.; Devlin, Bernie; Schellenberg, Gerard D.; Sutcliffe, James S.; Daly, Mark J.; Gibbs, Richard A.; Roeder, Kathryn

    2013-01-01

    We report on results from whole-exome sequencing (WES) of 1,039 subjects diagnosed with autism spectrum disorders (ASD) and 870 controls selected from the NIMH repository to be of similar ancestry to cases. The WES data came from two centers using different methods to produce sequence and to call variants from it. Therefore, an initial goal was to ensure the distribution of rare variation was similar for data from different centers. This proved straightforward by filtering called variants by fraction of missing data, read depth, and balance of alternative to reference reads. Results were evaluated using seven samples sequenced at both centers and by results from the association study. Next we addressed how the data and/or results from the centers should be combined. Gene-based analyses of association were an obvious choice, but should statistics for association be combined across centers (meta-analysis) or should data be combined and then analyzed (mega-analysis)? Because of the nature of many gene-based tests, we showed by theory and simulations that mega-analysis has better power than meta-analysis. Finally, before analyzing the data for association, we explored the impact of population structure on rare variant analysis in these data. Like other recent studies, we found evidence that population structure can confound case-control studies by the clustering of rare variants in ancestry space; yet, unlike some recent studies, for these data we found that principal component-based analyses were sufficient to control for ancestry and produce test statistics with appropriate distributions. After using a variety of gene-based tests and both meta- and mega-analysis, we found no new risk genes for ASD in this sample. Our results suggest that standard gene-based tests will require much larger samples of cases and controls before being effective for gene discovery, even for a disorder like ASD. PMID:23593035

  19. Performance assessment of methods for estimation of fractal dimension from scanning electron microscope images.

    PubMed

    Risović, Dubravko; Pavlović, Zivko

    2013-01-01

    Processing of gray scale images in order to determine the corresponding fractal dimension is very important due to the widespread use of imaging technologies and the application of fractal analysis in many areas of science, technology, and medicine. To this end, many methods for estimation of fractal dimension from gray scale images have been developed and are routinely used. Unfortunately, different methods (dimension estimators) often yield significantly different results, in a manner that makes interpretation difficult. Here, we report results of a comparative assessment of the performance of several of the most frequently used algorithms/methods for estimation of fractal dimension. For that purpose, we used scanning electron microscope images of aluminum oxide surfaces with different fractal dimensions. The performance of the algorithms/methods was evaluated using the statistical Z-score approach. The differences between the performances of six methods are discussed and further compared with results obtained by electrochemical impedance spectroscopy (EIS) on the same samples. The analysis of results shows that the performance of the investigated algorithms varies considerably and that systematically erroneous fractal dimensions could be estimated using certain methods. The differential cube counting, triangulation, and box counting algorithms showed satisfactory performance in the whole investigated range of fractal dimensions. The difference statistic proved less reliable, generating 4% unsatisfactory results. The performances of the power spectrum, partitioning, and EIS methods were unsatisfactory in 29%, 38%, and 75% of estimations, respectively. The results of this study should be useful and provide guidelines to researchers using/attempting fractal analysis of images obtained by scanning microscopy or atomic force microscopy. © Wiley Periodicals, Inc.
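
    Box counting, one of the better-performing estimators above, reduces to counting occupied grid cells at several scales and fitting the slope of log N(s) versus log(1/s). A minimal stdlib sketch on a binary point set (the gray-scale variants of these estimators work analogously on intensity surfaces):

```python
from math import log

def box_count_dimension(points, scales):
    """Estimate the fractal dimension of a set of (x, y) integer
    coordinates by box counting.

    scales: box edge lengths (each should divide the grid extent).
    Returns the least-squares slope of log N(s) vs. log(1/s).
    """
    xs, ys = [], []
    for s in scales:
        boxes = {(x // s, y // s) for x, y in points}  # occupied boxes
        xs.append(log(1.0 / s))
        ys.append(log(len(boxes)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Sanity check: a filled square is 2-dimensional.
filled = {(x, y) for x in range(64) for y in range(64)}
d = box_count_dimension(filled, [1, 2, 4, 8])  # slope is exactly 2 here
```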

  20. Analysis of biochemical genetic data on Jewish populations: II. Results and interpretations of heterogeneity indices and distance measures with respect to standards.

    PubMed

    Karlin, S; Kenett, R; Bonné-Tamir, B

    1979-05-01

    A nonparametric statistical methodology is used for the analysis of biochemical frequency data observed on a series of nine Jewish and six non-Jewish populations. Two categories of statistics are used: heterogeneity indices and various distance measures with respect to a standard. The latter are more discriminating in exploiting historical, geographical and culturally relevant information. A number of partial orderings and distance relationships among the populations are determined. Our concern in this study is to analyze similarities and differences among the Jewish populations, in terms of the gene frequency distributions for a number of genetic markers. Typical questions discussed are as follows: These Jewish populations differ in certain morphological and anthropometric traits. Are there corresponding differences in biochemical genetic constitution? How can we assess the extent of heterogeneity between and within groupings? Which class of markers (blood typings or protein loci) discriminates better among the separate populations? The results are quite surprising. For example, we found the Ashkenazi, Sephardi and Iraqi Jewish populations to be consistently close in genetic constitution and distant from all the other populations, namely the Yemenite and Cochin Jews, the Arabs, and the non-Jewish German and Russian populations. We found the Polish Jewish community the most heterogeneous among all Jewish populations. The blood loci discriminate better than the protein loci. A number of possible interpretations and hypotheses for these and other results are offered. The method devised for this analysis should prove useful in studying similarities and differences for other groups of populations for which substantial biochemical polymorphic data are available.

  1. Prevalence of oropharyngeal dysphagia in Parkinson's disease: a meta-analysis.

    PubMed

    Kalf, J G; de Swart, B J M; Bloem, B R; Munneke, M

    2012-05-01

    Dysphagia is a potentially harmful feature, including in Parkinson's disease (PD). As published prevalence rates vary widely, we aimed to estimate the prevalence of oropharyngeal dysphagia in PD in a meta-analysis. We conducted a systematic literature search in February 2011, and two independent reviewers selected the papers. We computed estimates of the pooled prevalence weighted by sample size. Twelve studies were suitable for calculating prevalence rates. Ten studies provided an estimate based on subjective outcomes, which proved statistically heterogeneous (p < 0.001), with a pooled prevalence estimate with random-effects analysis of 35% (95% CI 28-41). Four studies provided an estimate based on objective measurements, which were statistically homogeneous (p = 0.23), with a pooled prevalence estimate of 82% (95% CI 77-87). In controls the pooled subjective prevalence was 9% (95% CI 2-17), while the pooled objective prevalence was 23% (95% CI 13-32). The pooled relative risk was 3.2 for both subjective outcomes (95% CI 2.32-4.41) and objective outcomes (95% CI 2.08-4.98). Clinical heterogeneity between studies was chiefly explained by differences in disease severity. Subjective dysphagia occurs in one third of community-dwelling PD patients. Objectively measured dysphagia rates were much higher, with 4 out of 5 patients being affected. This suggests that dysphagia is common in PD, but patients do not always report swallowing difficulties unless asked. This underreporting calls for a proactive clinical approach to dysphagia, particularly in light of the serious clinical consequences. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Analysis of prognostic factors after resection of solitary liver metastasis in colorectal cancer: a 22-year bicentre study.

    PubMed

    Acciuffi, Sara; Meyer, Frank; Bauschke, Astrid; Settmacher, Utz; Lippert, Hans; Croner, Roland; Altendorf-Hofmann, Annelore

    2018-03-01

    The aim was to investigate predictors of outcome after hepatic resection for solitary colorectal liver metastasis. We recruited 350 patients with solitary colorectal liver metastasis at the University Hospitals of Jena and Magdeburg who underwent curative liver resection between 1993 and 2014. All patients were followed up until death or until summer 2016; follow-up data were available for 96.6% of patients. The 5- and 10-year overall survival rates were 47 and 28%, respectively. The 5- and 10-year disease-free survival rates were 30 and 20%, respectively. The analysis of prognostic factors revealed that the pT category of the primary tumour, size and grade of the metastasis, and extent of liver resection had no statistically significant impact on survival and recurrence rates. In multivariate analysis, age, lymph node status of the primary tumour, location of the primary tumour, time of appearance of the metastasis, use of preoperative chemotherapy and presence of extrahepatic tumour proved to be independent, statistically significant predictors of prognosis. Moreover, patients with rectal cancer had a lower intrahepatic recurrence rate but a higher extrahepatic recurrence rate. The long-term outcome of patients with R0-resected liver metastasis is multifactorially influenced. Age and comorbidity play a role only in overall survival. More than three lymph node metastases reduced both overall and disease-free survival. Extrahepatic tumour had a negative influence on extrahepatic recurrence and on overall survival. Neither overall survival nor recurrence rates were improved using neoadjuvant chemotherapy.

  3. The novel application of Benford's second order analysis for monitoring radiation output in interventional radiology.

    PubMed

    Cournane, S; Sheehy, N; Cooke, J

    2014-06-01

    Benford's law is an empirical observation which predicts the expected frequency of digits in naturally occurring datasets spanning multiple orders of magnitude; the law has been most successfully applied as an audit tool in accountancy. This study investigated the sensitivity of the technique in identifying system output changes using simulated changes in interventional radiology dose-area product (DAP) data, with any deviations from Benford's distribution identified using z-statistics. The radiation output of interventional radiology X-ray equipment is monitored annually during quality control testing; however, for a considerable portion of the year an increased output of the system, potentially caused by engineering adjustments or spontaneous system faults, may go unnoticed, leading to a potential increase in the radiation dose to patients. In normal operation, recorded examination radiation outputs vary over multiple orders of magnitude, rendering normal statistics ineffective for detecting systematic changes in the output. In this work, the annual DAP datasets complied with Benford's first-order law for the first, second and combined first and second digits. Further, a continuous 'rolling' second-order technique was devised for trending simulated changes over shorter timescales. This distribution analysis, the first employment of the method for radiation output trending, detected significant changes simulated on the original data, proving the technique useful in this case. The potential is demonstrated for implementation of this novel analysis for monitoring and identifying change in suitable datasets for the purpose of system process control. Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
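    The first-digit test described above can be sketched as follows: Benford's law gives the expected frequency log10(1 + 1/d) for leading digit d, and each observed frequency is compared via a continuity-corrected z-statistic. The helper name and the string-based digit extraction are illustrative shortcuts, not the authors' implementation.

```python
import math
from collections import Counter

def benford_first_digit_z(values):
    """z-statistics for first-digit frequencies vs Benford's law (sketch)."""
    # Crude first-digit extraction (handles plain and small decimal values)
    digits = [int(str(abs(v)).lstrip("0.")[0]) for v in values if v]
    n = len(digits)
    counts = Counter(digits)
    z = {}
    for d in range(1, 10):
        p_exp = math.log10(1 + 1 / d)          # Benford expected frequency
        p_obs = counts.get(d, 0) / n
        se = math.sqrt(p_exp * (1 - p_exp) / n)
        # Continuity-corrected z-statistic for digit d
        z[d] = max(0.0, abs(p_obs - p_exp) - 1 / (2 * n)) / se
    return z
```

    A log-uniform dataset spanning several decades conforms to Benford's law, so all nine z-statistics should stay well below conventional significance thresholds.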

  4. Effects of pH, lactate, hematocrit and potassium level on the accuracy of continuous glucose monitoring (CGM) in pediatric intensive care unit.

    PubMed

    Marics, Gábor; Koncz, Levente; Eitler, Katalin; Vatai, Barbara; Szénási, Boglárka; Zakariás, David; Mikos, Borbála; Körner, Anna; Tóth-Heyn, Péter

    2015-03-19

    Continuous glucose monitoring (CGM) was originally developed for diabetic patients, and it may be a useful tool for monitoring glucose changes in the pediatric intensive care unit (PICU). Its use is, however, limited by the lack of sufficient data on its reliability under insufficient peripheral perfusion. We aimed to correlate the accuracy of CGM with laboratory markers relevant to disturbed tissue perfusion. In 38 pediatric patients (age range, 0-18 years) requiring intensive care we tested the effect of pH, lactate, hematocrit and serum potassium on the difference between CGM and meter glucose measurements. Guardian® (Medtronic®) CGM results were compared to GEM 3000 (Instrumentation Laboratory®) and point-of-care measurements. The clinical accuracy of CGM was evaluated by Clarke error grid and Bland-Altman analyses and Pearson's correlation. We used the Friedman test for statistical analysis (statistical significance was established as p < 0.05). CGM values exhibited considerable variability without any correlation with the examined laboratory parameters. Clarke and Bland-Altman analyses and Pearson's correlation coefficient demonstrated good clinical accuracy of CGM (zones A and B = 96%; the mean difference between reference and CGM glucose was 1.3 mg/dL, with 48 of the 780 calibration pairs exceeding 2 standard deviations; Pearson's correlation coefficient: 0.83). The accuracy of CGM measurements is independent of laboratory parameters relevant to tissue hypoperfusion. CGM may prove a reliable tool for continuous monitoring of glucose changes in PICUs, little influenced by tissue perfusion, but it is still not appropriate as the sole basis for clinical decisions.
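    The Bland-Altman analysis named above reduces to a mean bias and 95% limits of agreement (bias ± 1.96 SD of the paired differences). A minimal sketch with hypothetical paired readings; the study's 780 calibration pairs are not reproduced here.

```python
from statistics import mean, stdev

def bland_altman(reference, test):
    """Mean bias and 95% limits of agreement for paired measurements."""
    diffs = [r - t for r, t in zip(reference, test)]
    bias = mean(diffs)
    sd = stdev(diffs)                     # sample SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical usage with four reference/CGM glucose pairs (mg/dL):
# bland_altman([100, 110, 120, 130], [98, 109, 122, 128])
```

    Points falling outside the limits of agreement correspond to the "pairs exceeding 2 standard deviations" counted in the abstract.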

  5. The (mis)reporting of statistical results in psychology journals.

    PubMed

    Bakker, Marjan; Wicherts, Jelte M

    2011-09-01

    In order to study the prevalence, nature (direction), and causes of reporting errors in psychology, we checked the consistency of reported test statistics, degrees of freedom, and p values in a random sample of high- and low-impact psychology journals. In a second study, we established the generality of reporting errors in a random sample of recent psychological articles. Our results, on the basis of 281 articles, indicate that around 18% of statistical results in the psychological literature are incorrectly reported. Inconsistencies were more common in low-impact journals than in high-impact journals. Moreover, around 15% of the articles contained at least one statistical conclusion that proved, upon recalculation, to be incorrect; that is, recalculation rendered the previously significant result insignificant, or vice versa. These errors were often in line with researchers' expectations. We classified the most common errors and contacted authors to shed light on the origins of the errors.
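    Consistency checks of this kind recompute the p value from the reported test statistic and compare it with the reported p. A stdlib-only sketch for the simple case of z statistics (t statistics would additionally need a t distribution, e.g. from scipy.stats); both function names are hypothetical.

```python
import math

def p_from_z(z):
    """Two-sided p value for a z statistic (stdlib only)."""
    return math.erfc(abs(z) / math.sqrt(2))

def gross_inconsistency(z, reported_p, alpha=0.05):
    """Flag the error type studied above: the recomputed p falls on
    the opposite side of alpha from the reported p."""
    return (p_from_z(z) < alpha) != (reported_p < alpha)
```

    For example, a reported z = 2.5 alongside a reported p = 0.20 would be flagged, since the recomputed p is about 0.012.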

  6. Comparison of mid-infrared transmission spectroscopy with biochemical methods for the determination of macronutrients in human milk.

    PubMed

    Silvestre, Dolores; Fraga, Miriam; Gormaz, María; Torres, Ester; Vento, Máximo

    2014-07-01

    The variability of human milk (HM) composition renders analysis of its components essential for optimal nutrition of preterm infants fed either donor milk or their own mother's milk. To fulfil this requirement, various analytical instruments have been subjected to scientific and clinical evaluation. The objective of this study was to evaluate the suitability of a rapid method for the analysis of macronutrients in HM as compared with the analytical methods applied by the cow's milk industry. Mature milk from 39 donors was analysed using an infrared human milk analyser (HMA) and compared with biochemical reference laboratory methods. The statistical analysis was based on the use of paired data tests. The use of an infrared HMA for the analysis of lipids, proteins and lactose in HM proved satisfactory as regards rapidity, simplicity and the required sample volume. The instrument afforded good linearity and precision for all three nutrients. However, accuracy was not acceptable when compared with the reference methods, with overestimation of the lipid content and underestimation of the protein and lactose contents. The use of mid-infrared HMA might become the standard for rapid analysis of HM once standardisation and rigorous, systematic calibration are provided. © 2012 John Wiley & Sons Ltd.

  7. Individual and population pharmacokinetic compartment analysis: a graphic procedure for quantification of predictive performance.

    PubMed

    Eksborg, Staffan

    2013-01-01

    Pharmacokinetic studies are important for optimizing drug dosing, but require proper validation of the pharmacokinetic procedures used. However, simple and reliable statistical methods suitable for evaluating the predictive performance of pharmacokinetic analysis are essentially lacking. The aim of the present study was to construct and evaluate a graphic procedure for quantification of the predictive performance of individual and population pharmacokinetic compartment analysis. Original data from previously published pharmacokinetic compartment analyses after intravenous, oral, and epidural administration, and digitized data obtained from published scatter plots of observed vs predicted drug concentrations from population pharmacokinetic studies using the NPEM algorithm, the NONMEM computer program and Bayesian forecasting procedures, were used for estimating the predictive performance according to the proposed graphical method and by the method of Sheiner and Beal. The graphical plot proposed in the present paper proved to be a useful tool for evaluation of the predictive performance of both individual and population compartment pharmacokinetic analysis. The proposed method is simple to use and gives valuable information concerning time- and concentration-dependent inaccuracies that might occur in individual and population pharmacokinetic compartment analysis. Predictive performance can be quantified by the fraction of concentration ratios within arbitrarily specified ranges, e.g. within the range 0.8-1.2.
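    The closing metric can be sketched directly: the fraction of predicted/observed concentration ratios falling inside a specified range such as 0.8-1.2. A minimal sketch with a hypothetical helper name and made-up concentrations.

```python
def fraction_in_range(observed, predicted, lo=0.8, hi=1.2):
    """Fraction of predicted/observed concentration ratios inside [lo, hi],
    the predictive-performance summary suggested above (sketch)."""
    ratios = [p / o for p, o in zip(predicted, observed) if o > 0]
    return sum(lo <= r <= hi for r in ratios) / len(ratios)

# Hypothetical usage: fraction_in_range([10, 10, 10, 10], [9, 11, 15, 8])
```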

  8. Towards an automatic wind speed and direction profiler for Wide Field adaptive optics systems

    NASA Astrophysics Data System (ADS)

    Sivo, G.; Turchi, A.; Masciadri, E.; Guesalaga, A.; Neichel, B.

    2018-05-01

    Wide Field Adaptive Optics (WFAO) systems are among the most sophisticated adaptive optics (AO) systems available today on large telescopes. Knowledge of the vertical spatio-temporal distribution of wind speed (WS) and wind direction (WD) is fundamental to optimizing the performance of such systems. Previous studies have already shown that the Gemini Multi-Conjugate AO system (GeMS) is able to retrieve measurements of the WS and WD stratification using the SLOpe Detection And Ranging (SLODAR) technique and to store these measurements in the telemetry data. In order to assess the reliability of these estimates and of the SLODAR technique applied to such complex AO systems, in this study we compared WS and WD values retrieved from GeMS with those obtained with the atmospheric model Meso-NH on a rich statistical sample of nights. The latter technique has previously been shown to provide excellent agreement with a large sample of radiosoundings, both in statistical terms and on individual flights, and can therefore be considered an independent reference. The excellent agreement between GeMS measurements and the model that we find in this study proves the robustness of the SLODAR approach. To bypass the complex procedures necessary to achieve automatic measurements of the wind with GeMS, we propose a simple automatic method to monitor nightly WS and WD using Meso-NH model estimates. Such a method can be applied to any present or new-generation facility supported by WFAO systems. The interest of this study is, therefore, well beyond the optimization of GeMS performance.

  9. Safety and Efficacy of Cortisol Phosphate in Hyaluronic Acid Vehicle in the Treatment of Dry Eye in Sjogren Syndrome.

    PubMed

    Rolando, Maurizio; Vagge, Aldo

    2017-06-01

    Evaluation of 0.3% cortisol phosphate eye drops in hyaluronic acid vehicle in the treatment of dry eye in Sjogren Syndrome. This prospective, single-center, masked (single blind), randomized controlled study included 40 female patients divided into 2 groups: group 1 treated with Idracemi, 0.3% cortisol phosphate eye drops twice a day, and group 2 treated with Cortivis, 0.3% cortisol phosphate in hyaluronic acid vehicle, with the same posology. Screening (day -7), randomization (day 0), follow-up (day 7), and termination (day 28) visits were conducted. A symptoms (VAS) questionnaire, tear film breakup time, corneo-conjunctival staining, intraocular pressure (IOP) measurement, and fundus examination were performed at each visit. Conjunctival impression cytology for human leukocyte antigen-DR (HLA-DR) expression was also performed at visits 1 and 4. No changes in IOP or fundus examination were observed in either group at any time point. Group 1 showed a statistically significant amelioration of symptoms and reduction of HLA-DR expression at day 28. Group 2 showed statistically significant improvement of corneal and conjunctival staining at day 7 versus baseline and versus group 1; the symptom score was also statistically significantly better than baseline and than group 1 after 28 days. The HLA-DR expression and the epithelial cell area were statistically significantly reduced versus baseline and versus group 1 at the same time point. Cortisol phosphate proved to be safe and effective in treating dry eye in Sjogren Syndrome patients in both formulations. However, the formula with hyaluronic acid vehicle proved to be more effective. Both formulations were very well tolerated.

  10. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory authorities such as Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005 provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessment, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles can achieve six sigma-capable processes. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters.
Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study on process. Interpretation of such a study provides information about stability, process variability, changing of trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts provides the least capable and most variable process that is liable for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
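    The capability figures discussed above can be sketched as Cp, Cpk and an approximate short-term sigma level (taken here as 3·Cpk). The specification limits and sample values below are hypothetical, not the tablet-process data.

```python
from statistics import mean, stdev

def process_capability(samples, lsl, usl):
    """Cp, Cpk and an approximate short-term sigma level (sketch).

    lsl/usl are the lower/upper specification limits."""
    m, s = mean(samples), stdev(samples)
    cp = (usl - lsl) / (6 * s)                 # potential capability
    cpk = min(usl - m, m - lsl) / (3 * s)      # capability allowing off-center mean
    return cp, cpk, 3 * cpk

# Hypothetical usage: process_capability([99, 100, 101, 100, 99, 101], 97, 103)
```

    For a perfectly centered process Cp equals Cpk; Cpk ≥ 2 corresponds roughly to a short-term six sigma-capable process under this convention.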

  11. Alteration of functional state of peripheral blood erythrocytes in women of different age groups at dislipidemia conditions.

    PubMed

    Ratiani, L; Intskirveli, N; Ormotsadze, G; Sanikidze, T

    2011-12-01

    The aim of the study was the identification of statistically reliable correlations and cause-effect relationships between the viability of red blood cells and dyslipidemia parameters and/or metabolic disorders induced by age-related alterations of estrogen content in women of different ages (reproductive, menopausal). On the basis of the analysis of the results we can conclude that the different age groups of women with atherosclerosis-induced cardiovascular diseases revealed an estrogen-related dependence between triglyceride (Tg) and HDL content, the functional status of peripheral blood erythrocytes and the severity of dyslipidemia. The atherogenic index Tg/HDL proved to be a sensitive marker of dyslipidemia in women of reproductive age, but it does not reflect disorders of lipid metabolism in postmenopausal women. A reliable correlation between the red blood cell dysfunction indicator, spherulation quality, and the atherogenic index Tg/HDL was demonstrated; however, the correlation coefficient is 2 times higher in reproductive age than in menopause. The spherulation quality of red blood cells at low HDL content showed a fast growth rate in women of reproductive age, and was insensitive to HDL content in postmenopausal women. It was concluded that the age-related lack of estrogens in postmenopausal women indirectly contributes to decreased protection of red blood cells against oxidative damage, reduces their deformability and disturbs their rheological properties. Thus, the spherulation quality of red blood cells may be used as a diagnostic marker of the severity of atherosclerosis.

  12. Diffusion-weighted magnetic resonance imaging in the characterization of testicular germ cell neoplasms: Effect of ROI methods on apparent diffusion coefficient values and interobserver variability.

    PubMed

    Tsili, Athina C; Ntorkou, Alexandra; Astrakas, Loukas; Xydis, Vasilis; Tsampalas, Stavros; Sofikitis, Nikolaos; Argyropoulou, Maria I

    2017-04-01

    To evaluate the difference in apparent diffusion coefficient (ADC) measurements at diffusion-weighted (DW) magnetic resonance imaging of differently shaped regions of interest (ROIs) in testicular germ cell neoplasms (TGCNs), the diagnostic ability of differently shaped ROIs in differentiating seminomas from nonseminomatous germ cell neoplasms (NSGCNs), and the interobserver variability. Thirty-three TGCNs were retrospectively evaluated. Patients underwent MR examinations, including DWI, on a 1.5-T MR system. Two observers measured mean tumor ADCs using four distinct ROI methods: round, square, freehand and multiple small round ROIs. The intraclass correlation coefficient was analyzed to assess interobserver variability. Statistical analysis was used to compare mean ADC measurements among observers, methods and histologic types. All ROI methods showed excellent interobserver agreement, with excellent correlation (P<0.001). Multiple small ROIs provided the lowest mean ADC in TGCNs. Seminomas had lower mean ADC compared to NSGCNs for each ROI method (P<0.001). The round ROI proved the most accurate method in characterizing TGCNs. Interobserver variability in ADC measurement is excellent, irrespective of the ROI shape. Multiple small round ROIs and the round ROI proved the most accurate methods for ADC measurement in the characterization of TGCNs and in the differentiation between seminomas and NSGCNs, respectively. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. The Influence of Role-Players on the Character-Development and Character-Building of South African College Students

    ERIC Educational Resources Information Center

    Freeks, Fazel Ebrihiam

    2015-01-01

    The present world is in a moral crisis, and educational institutions seem to experience both challenges and enormous behavioural problems. Statistics prove that there is a drastic decline in morals, values, standards, ethics, character and behaviour, and schools, colleges and even universities seem to lurch from crisis to crisis.…

  14. Predictive Validity of a Continuous Alternative to Nominal Subtypes of Attention-Deficit/Hyperactivity Disorder for "DSM-V"

    ERIC Educational Resources Information Center

    Lahey, Benjamin B.; Willcutt, Erik G.

    2010-01-01

    Three subtypes of attention-deficit/hyperactivity disorder (ADHD) based on numbers of symptoms of inattention (I) and hyperactivity-impulsivity (HI) were defined in the "Diagnostic and Statistical Manual of Mental Disorders" (4th ed.) to reduce heterogeneity of the disorder, but the subtypes proved to be highly unstable over time. A continuous…

  15. Validity and reliability of a method for assessment of cervical vertebral maturation.

    PubMed

    Zhao, Xiao-Guang; Lin, Jiuxiang; Jiang, Jiu-Hui; Wang, Qingzhu; Ng, Sut Hong

    2012-03-01

    To evaluate the validity and reliability of the cervical vertebral maturation (CVM) method with a longitudinal sample. Eighty-six cephalograms from 18 subjects (5 males and 13 females) were selected from the longitudinal database. Total mandibular length was measured on each film; its rate of increase served as the gold standard in examining the validity of the CVM method. Eleven orthodontists, after receiving intensive training in the CVM method, evaluated all films twice. Kendall's W and the weighted kappa statistic were employed. Kendall's W values were higher than 0.8 at both times, indicating strong interobserver reproducibility, but interobserver agreement was below 50% on both occasions. A wide range of intraobserver agreement was noted (40.7%-79.1%), and substantial intraobserver reproducibility was confirmed by kappa values (0.53-0.86). With regard to validity, moderate agreement was found between the gold standard and observer staging at the initial time (kappa values 0.44-0.61). However, agreement seemed to be unacceptable for clinical use, especially in cervical stage 3 (26.8%). Even though the validity and reliability of the CVM method proved statistically acceptable, we suggest that other growth indicators should also be taken into consideration when evaluating adolescent skeletal maturation.
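    The weighted kappa statistic used above can be sketched with quadratic weights over k ordinal stages. This is a generic implementation of the statistic, not the study's software; stages are zero-indexed, and the rating lists are hypothetical.

```python
def weighted_kappa(r1, r2, k):
    """Quadratically weighted kappa for two raters over k ordinal stages."""
    n = len(r1)
    # Observed proportion matrix
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(r1, r2):
        obs[a][b] += 1 / n
    # Marginal distributions of each rater
    p1 = [sum(row) for row in obs]
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    # Quadratic disagreement weights
    w = [[(i - j) ** 2 / (k - 1) ** 2 for j in range(k)] for i in range(k)]
    d_obs = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    d_exp = sum(w[i][j] * p1[i] * p2[j] for i in range(k) for j in range(k))
    return 1 - d_obs / d_exp
```

    Perfect agreement yields kappa = 1; chance-level agreement yields kappa near 0, with larger stage discrepancies penalized quadratically.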

  16. Educational Gaming for Pharmacy Students - Design and Evaluation of a Diabetes-themed Escape Room.

    PubMed

    Eukel, Heidi N; Frenzel, Jeanne E; Cernusca, Dan

    2017-09-01

    Objective. To design an educational game that will increase third-year professional pharmacy students' knowledge of diabetes mellitus disease management, and to evaluate their perceived value of the game. Methods. Faculty members created an innovative educational game, the diabetes escape room. An authentic escape room gaming environment was established through the use of a locked room, an escape time limit, and game rules within which student teams completed complex puzzles focused on diabetes disease management. To evaluate the impact, students completed a pre-test and post-test to measure knowledge gained, and a perception survey to identify moderating factors that could help instructors improve the game's effectiveness and utility. Results. Students showed statistically significant increases in knowledge after completion of the game. A one-sample t-test indicated that students' mean perception was statistically significantly higher than the mean value of the evaluation scale. This statistically significant result suggests that the game offers an instructional benefit beyond its novelty. Conclusion. The diabetes escape room proved to be a valuable educational game that increased students' knowledge of diabetes mellitus disease management and showed a positive perceived overall value by student participants.

  17. [Characteristic features of urinary tract infection in malnourished children].

    PubMed

    Stârcea, Magdalena; Munteanu, Mihaela; Brumariu, O

    2010-01-01

    The aim of this study was to demonstrate a relationship between urinary tract infection and malnutrition in children 0-3 years old, hospitalized in the IVI Pediatric Clinic, Hospital St. Mary Iaşi, between January 2000 and December 2004. We performed a retrospective study of 298 infants and young children with urinary tract infection, 237 eutrophic and 61 malnourished. We compared both groups with urinary tract infection (UTI) and applied statistical methods to the results. The statistical analysis showed that the relative risk of UTI increases with malnutrition; the positive predictive value was 72.5%. Clinical manifestations were similar in malnourished and eutrophic children, but many comorbidities were associated with dystrophic status. Malformations of the urinary tract were twice as frequent in malnourished children. The etiology of infection was dominated by Escherichia coli, Proteus and Klebsiella pneumoniae. In malnourished children, 5% of UTIs were caused by opportunistic etiological agents such as Enterobacter, Enterococcus and Acinetobacter. More frequently, bacteria developed resistance to antibiotics such as aminopenicillins, trimethoprim and cephalosporins. Accurate and fast diagnosis and treatment of UTI in infants and children with malnutrition is the best way to achieve nutritional rehabilitation and prevent serious consequences.

  18. On the probability density function and characteristic function moments of image steganalysis in the log prediction error wavelet subband

    NASA Astrophysics Data System (ADS)

    Bao, Zhenkun; Li, Xiaolong; Luo, Xiangyang

    2017-01-01

    Extracting informative statistical features is the most essential technical issue of steganalysis. Among various steganalysis methods, probability density function (PDF) and characteristic function (CF) moments are two important types of features owing to their excellent ability to distinguish cover images from stego ones. The two types of features are quite similar in definition; the only difference is that the PDF moments are computed in the spatial domain, while the CF moments are computed in the Fourier-transformed domain. The comparison between PDF and CF moments is therefore an interesting question of steganalysis. Several theoretical results have been derived, and CF moments have been proved better than PDF moments in some cases. However, in the log prediction error wavelet subband of wavelet decomposition, some experiments show that the result is the opposite, and a rigorous explanation has been lacking. To solve this problem, a comparison result based on rigorous proof is presented: the first-order PDF moment is proved better than the CF moment, while the second-order CF moment is better than the PDF moment. This work aims to open the theoretical discussion on steganalysis and the question of finding suitable statistical features.
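    The two feature families compared above can be sketched from a subband histogram: the PDF moment is computed in the spatial (histogram) domain and the CF moment from the Fourier transform of that histogram. The normalisation below is illustrative and not necessarily the paper's exact definition.

```python
import numpy as np

def pdf_and_cf_moments(coeffs, bins=64, order=1):
    """PDF moment (spatial domain) and CF moment (Fourier domain) of a
    wavelet-subband histogram (illustrative sketch)."""
    hist, edges = np.histogram(coeffs, bins=bins)
    p = hist / hist.sum()                        # empirical PDF
    centers = 0.5 * (edges[:-1] + edges[1:])
    pdf_moment = np.sum(np.abs(centers) ** order * p)
    cf = np.fft.fft(p)[: bins // 2]              # characteristic function samples
    k = np.arange(bins // 2)
    cf_moment = np.sum(k ** order * np.abs(cf)) / np.sum(np.abs(cf))
    return pdf_moment, cf_moment
```

    For symmetric data the first-order PDF moment approximates the mean absolute coefficient value, while the CF moment weights the frequency content of the histogram.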

  19. Extending GIS Technology to Study Karst Features of Southeastern Minnesota

    NASA Astrophysics Data System (ADS)

    Gao, Y.; Tipping, R. G.; Alexander, E. C.; Alexander, S. C.

    2001-12-01

    This paper summarizes ongoing research on karst feature distribution of southeastern Minnesota. The main goals of this interdisciplinary research are: 1) to look for large-scale patterns in the rate and distribution of sinkhole development; 2) to conduct statistical tests of hypotheses about the formation of sinkholes; 3) to create management tools for land-use managers and planners; and 4) to deliver geomorphic and hydrogeologic criteria for making scientifically valid land-use policies and ethical decisions in karst areas of southeastern Minnesota. Existing county and sub-county karst feature datasets of southeastern Minnesota have been assembled into a large GIS-based database capable of analyzing the entire data set. The central database management system (DBMS) is a relational GIS-based system interacting with three modules: GIS, statistical and hydrogeologic modules. ArcInfo and ArcView were used to generate a series of 2D and 3D maps depicting karst feature distributions in southeastern Minnesota. IRIS Explorer™ was used to produce detailed 3D maps and animations using data exported from the GIS-based database. Nearest-neighbor analysis has been used to test sinkhole distributions in different topographic and geologic settings. All nearest-neighbor analyses to date indicate that sinkholes in southeastern Minnesota are not evenly distributed in this area (i.e., they tend to be clustered). More detailed statistical methods such as cluster analysis, histograms, probability estimation, correlation and regression have been used to study the spatial distributions of some mapped karst features of southeastern Minnesota. A sinkhole probability map for Goodhue County has been constructed based on sinkhole distribution, bedrock geology, depth to bedrock, GIS buffer analysis and nearest-neighbor analysis.
    A series of karst features for Winona County, including sinkholes, springs, seeps, stream sinks and outcrops, has been mapped and entered into the Karst Feature Database of Southeastern Minnesota. The Karst Feature Database of Winona County is being expanded to include all the mapped karst features of southeastern Minnesota. Air photos from the 1930s to the 1990s of the Spring Valley Cavern area in Fillmore County were scanned and geo-referenced into our GIS system. This technology has proved very useful for identifying sinkholes and studying the rate of sinkhole development.
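    The nearest-neighbor analysis mentioned above is commonly summarized by the Clark-Evans ratio R of the observed mean nearest-neighbor distance to the value expected under complete spatial randomness: R ≈ 1 for a random pattern, R < 1 for clustering, R > 1 for dispersion. A minimal sketch, without the edge corrections a real karst analysis would need; the point sets in the usage are hypothetical.

```python
import math

def clark_evans_r(points, area):
    """Clark-Evans nearest-neighbour ratio for 2D points (sketch)."""
    n = len(points)
    nn = []
    for i, (xi, yi) in enumerate(points):
        # Distance from point i to its nearest neighbour
        d = min(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj) in enumerate(points) if j != i)
        nn.append(d)
    observed = sum(nn) / n
    expected = 0.5 / math.sqrt(n / area)   # CSR expectation for density n/area
    return observed / expected
```

    A regular grid of points yields R well above 1, while tightly grouped clusters of sinkhole-like points yield R well below 1, matching the clustering reported above.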

  20. Seroprevalence and risk factors of Neospora spp. and Toxoplasma gondii infections among horses and donkeys in Nigeria, West Africa.

    PubMed

    Bártová, Eva; Sedlák, Kamil; Kobédová, Kateřina; Budíková, Marie; Joel Atuman, Yakubu; Kamani, Joshua

    2017-09-26

    Neospora spp. and Toxoplasma gondii are globally distributed parasites affecting a wide range of warm-blooded animals. Neosporosis has caused clinical illness in horses, and consumption of horse meat has been epidemiologically linked to clinical toxoplasmosis in humans. This study was conducted to determine Neospora spp. and T. gondii antibodies and risk factors of infection in horses and donkeys from three states of Nigeria. A total of 144 samples were collected from clinically healthy animals (120 horses and 24 donkeys). The sera were tested for antibodies to Neospora spp. and T. gondii by indirect fluorescence antibody test; a titer ≥ 50 was considered positive. Seroprevalence data were statistically analyzed, considering the variables of gender, age, use, state, origin of breed and type of management. Antibodies to Neospora spp. and T. gondii were detected in 8% of horses (titer 50) and 24% of horses (titers 50-800), respectively. Co-infection with both parasites was demonstrated in three horses (3%). Statistical differences were found only for T. gondii seroprevalence in horses of different use, locality, origin and management (p-value ≤ 0.05). Antibodies to T. gondii were detected in four (17%) of 24 donkeys, with a statistical difference (p-value ≤ 0.05) in animals of different use; antibodies to Neospora spp. were not detected in any of the donkeys. This is the first seroprevalence study of Neospora spp. and T. gondii in equids from Nigeria.

  1. DMT-TAFM: a data mining tool for technical analysis of futures market

    NASA Astrophysics Data System (ADS)

    Stepanov, Vladimir; Sathaye, Archana

    2002-03-01

    Technical analysis of financial markets describes many patterns of market behavior. For practical use, all these descriptions need to be adjusted for each particular trading session. In this paper, we develop a data mining tool for technical analysis of the futures markets (DMT-TAFM), which dynamically generates rules based on the notion of price pattern similarity. The tool consists of three main components. The first component provides visualization of data series on a chart with different ranges, scales, and chart sizes and types. The second component constructs pattern descriptions using sets of polynomials. The third component specifies the training set for mining, defines the similarity notion, and searches for a set of similar patterns. DMT-TAFM is useful for preparing the data, and then revealing and systemizing statistical information about similar patterns found in any type of historical price series. We performed experiments with our tool on three decades of trading data for a hundred types of futures. Our results for this data set show that we can confirm or refute many well-known patterns on the basis of real data, as well as reveal new ones, and use the set of relatively consistent patterns found during data mining to develop better futures trading strategies.

  2. Genome-wide QTL analysis for anxiety trait in bipolar disorder type I.

    PubMed

    Contreras, J; Hare, E; Chavarría-Soley, G; Raventós, H

    2018-07-01

    Genetic studies have consistently shown that bipolar disorder type I (BPI) runs in families and that this familial aggregation is strongly influenced by genes. In a preliminary study, we showed that anxiety trait meets endophenotype criteria for BPI. We assessed 619 individuals from the Central Valley of Costa Rica (CVCR) who had been evaluated for anxiety following the same methodological procedure used for the initial pilot study. Our goal was to conduct a multipoint quantitative trait linkage analysis to identify quantitative trait loci (QTLs) related to anxiety trait in subjects with BPI. We conducted the statistical analyses using the variance-components quantitative trait loci method implemented in Sequential Oligogenic Linkage Analysis Routines (SOLAR), with 5606 single nucleotide polymorphisms (SNPs). We identified a suggestive linkage signal with a LOD score of 2.01 at chromosome 2 (2q13-q14). Since confounding factors such as substance abuse, medical illness and medication history were not assessed in our study, these conclusions should be taken as preliminary. We conclude that region 2q13-q14 may harbor a candidate gene(s) with an important role in the pathophysiology of BPI and anxiety. Published by Elsevier B.V.

  3. Wavelet analysis of frequency chaos game signal: a time-frequency signature of the C. elegans DNA.

    PubMed

    Messaoudi, Imen; Oueslati, Afef Elloumi; Lachiri, Zied

    2014-12-01

    Challenging tasks are encountered in the field of bioinformatics, and the choice of the genomic sequence's mapping technique is one of the most fastidious. A judicious choice serves to examine the distribution of periodic patterns that concord with the underlying structure of genomes. Despite that, searching for a coding technique that can highlight all the information contained in the DNA has not yet attracted the attention it deserves. In this paper, we propose a new mapping technique based on chaos game theory that we call the frequency chaos game signal (FCGS). The particularity of the FCGS coding resides in exploiting the statistical properties of the genomic sequence itself, which may reflect important structural and organizational features of DNA. To prove the usefulness of the FCGS approach in the detection of different local periodic patterns, we use wavelet analysis, because it provides access to information that can be obscured by other time-frequency methods such as Fourier analysis. Thus, we apply the continuous wavelet transform (CWT) with the complex Morlet wavelet as the mother wavelet function. Scalograms for the organism Caenorhabditis elegans (C. elegans) exhibit a multitude of periodic organizations of specific DNA sequences.
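    The FCGS construction itself is not detailed in the abstract. As background, here is a minimal sketch of the classic chaos game representation it builds on: each base pulls the current point halfway toward its assigned square corner (the corner convention below is the standard one; the frequency weighting specific to FCGS is omitted).

```python
# Standard corner assignment for the DNA chaos game representation.
CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

def chaos_game(seq):
    """Map a DNA string to a trajectory in the unit square: starting at
    the centre, each base moves the point halfway toward its corner."""
    x, y = 0.5, 0.5
    path = []
    for base in seq:
        cx, cy = CORNERS[base]
        x, y = (x + cx) / 2, (y + cy) / 2
        path.append((x, y))
    return path

path = chaos_game("ACGT")
# the first base, A, pulls the centre point toward corner (0, 0)
assert path[0] == (0.25, 0.25)
```

    Because each step halves the remaining distance to a corner, subsequences sharing a suffix land in the same sub-square, which is what makes frequency statistics over this map informative about k-mer structure.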

  4. Predictive factors of clinical response in steroid-refractory ulcerative colitis treated with granulocyte-monocyte apheresis

    PubMed Central

    D'Ovidio, Valeria; Meo, Donatella; Viscido, Angelo; Bresci, Giampaolo; Vernia, Piero; Caprilli, Renzo

    2011-01-01

    AIM: To identify factors predicting the clinical response of ulcerative colitis patients to granulocyte-monocyte apheresis (GMA). METHODS: Sixty-nine ulcerative colitis patients (39 F, 30 M) dependent upon/refractory to steroids were treated with GMA. Steroid dependency; baseline values of the clinical activity index (CAI), C-reactive protein (CRP) level and erythrocyte sedimentation rate (ESR); use of immunosuppressants; duration of disease; and age and extent of disease were considered for statistical analysis as predictive factors of clinical response. Univariate and multivariate logistic regression models were used. RESULTS: In the univariate analysis, CAI (P = 0.039) and ESR (P = 0.017) levels at baseline were singled out as predictive of clinical remission. In the multivariate analysis, steroid dependency [odds ratio (OR) = 0.390, 95% confidence interval (CI): 0.176-0.865, Wald 5.361, P = 0.016] and low CAI levels at baseline (4 < CAI < 7) (OR = 0.770, 95% CI: 0.425-1.394, Wald 3.747, P = 0.028) proved to be effective as factors predicting clinical response. CONCLUSION: GMA may be a valid therapeutic option for steroid-dependent ulcerative colitis patients with mild-moderate disease, and its clinical efficacy seems to persist for 12 mo. PMID:21528055

  5. Spectral Properties and Dynamics of Gold Nanorods Revealed by EMCCD Based Spectral-Phasor Method

    PubMed Central

    Chen, Hongtao; Digman, Michelle A.

    2015-01-01

    Gold nanorods (NRs) with tunable plasmon-resonant absorption in the near-infrared region have considerable advantages over organic fluorophores as imaging agents. However, the luminescence spectral properties of NRs have not been fully explored at the single-particle level in bulk due to the lack of proper analytic tools. Here we present a global spectral phasor analysis method which allows investigation of NRs' spectra at the single-particle level, together with their statistical behavior and spatial information during imaging. The wide phasor distribution obtained by the spectral phasor analysis indicates that the spectra of NRs differ from particle to particle. NRs with different spectra can be identified graphically in the corresponding spatial images with high spectral resolution. Furthermore, the spectral behaviors of NRs under different imaging conditions, e.g. different excitation powers and wavelengths, were carefully examined by our laser-scanning multiphoton microscope with spectral imaging capability. Our results prove that the spectral phasor method is an easy and efficient tool in hyperspectral imaging analysis to unravel subtle changes of the emission spectrum. Moreover, we applied this method to study the spectral dynamics of NRs during direct optical trapping and by optothermal trapping. Interestingly, spectral shifts were observed in both trapping phenomena. PMID:25684346
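    The spectral phasor transform mentioned above has a standard form: each pixel's spectrum is reduced to the first Fourier harmonic coordinates (G, S), so narrow spectra land near the unit circle and broad spectra near the origin, with spectral shifts rotating the phase. A minimal sketch (harmonic choice and normalization are assumptions of this sketch, not necessarily the authors' exact implementation):

```python
import math

def spectral_phasor(intensities, harmonic=1):
    """First-harmonic spectral phasor coordinates (G, S) of a sampled
    emission spectrum: intensity-weighted cosine and sine averages."""
    n = len(intensities)
    total = sum(intensities)
    g = sum(v * math.cos(2 * math.pi * harmonic * k / n)
            for k, v in enumerate(intensities)) / total
    s = sum(v * math.sin(2 * math.pi * harmonic * k / n)
            for k, v in enumerate(intensities)) / total
    return g, s

# a single-channel "delta" spectrum maps onto the unit circle...
g, s = spectral_phasor([1.0, 0.0, 0.0, 0.0])
assert abs(math.hypot(g, s) - 1.0) < 1e-9
# ...while a flat (maximally broad) spectrum maps to the origin
g2, s2 = spectral_phasor([1.0, 1.0, 1.0, 1.0])
assert abs(g2) < 1e-9 and abs(s2) < 1e-9
```

    Plotting every pixel's (G, S) pair then turns particle-to-particle spectral differences into visually separable clusters, which is the graphical identification the abstract describes.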

  6. Efficiency of timing delays and electrode positions in optimization of biventricular pacing: a simulation study.

    PubMed

    Miri, Raz; Graf, Iulia M; Dössel, Olaf

    2009-11-01

    Electrode positions and timing delays influence the efficacy of biventricular pacing (BVP). Accordingly, this study focuses on BVP optimization, using a detailed 3-D electrophysiological model of the human heart, which is adapted to patient-specific anatomy and pathophysiology. The research is carried out on ten heart models with left bundle branch block and myocardial infarction derived from magnetic resonance and computed tomography data. Cardiac electrical activity is simulated with the ten Tusscher cell model and an adaptive cellular automaton at physiological and pathological conduction levels. The optimization methods are based on a comparison between the electrical response of the healthy and diseased heart models, measured in terms of the root mean square error (E(RMS)) of the excitation front and the QRS duration error (E(QRS)). Intra- and intermethod associations of the pacing electrode and timing delay variables were analyzed with statistical methods, i.e., the t-test for dependent data, one-way analysis of variance for electrode pairs, and the Pearson model for equivalent parameters from the two optimization methods. The results indicate that the lateral left ventricle and the upper or middle septal area are frequently (60% of cases) the optimal positions of the left and right electrodes, respectively. Statistical analysis proves that the two optimization methods are in good agreement. In conclusion, a noninvasive preoperative BVP optimization strategy based on computer simulations can be used to identify the most beneficial patient-specific electrode configuration and timing delays.

  7. Optimizing fixed observational assets in a coastal observatory

    NASA Astrophysics Data System (ADS)

    Frolov, Sergey; Baptista, António; Wilkin, Michael

    2008-11-01

    Proliferation of coastal observatories necessitates an objective approach to managing observational assets. In this article, we used our experience in the coastal observatory for the Columbia River estuary and plume to identify and address common problems in managing fixed observational assets, such as salinity, temperature, and water level sensors attached to pilings and moorings. Specifically, we addressed the following problems: assessing the quality of an existing array, adding stations to an existing array, removing stations from an existing array, validating an array design, and targeting an array toward data assimilation or monitoring. Our analysis was based on a combination of methods from the oceanographic and statistical literature, mainly on the statistical machinery of the best linear unbiased estimator. The key information required for our analysis was the covariance structure for a field of interest, which was computed from the output of assimilated and non-assimilated models of the Columbia River estuary and plume. The network optimization experiments in the Columbia River estuary and plume proved to be successful, largely withstanding the scrutiny of sensitivity and validation studies, and hence providing valuable insight into optimization and operation of the existing observational network. Our success in the Columbia River estuary and plume suggests that algorithms for optimal placement of sensors are reaching maturity and are likely to play a significant role in the design of emerging ocean observatories, such as the United States' Ocean Observatories Initiative (OOI) and Integrated Ocean Observing System (IOOS) observatories, and smaller regional observatories.
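    The best-linear-unbiased-estimator machinery can be illustrated in its simplest scalar form: assimilating one sensor reduces the error variance at a target location by cov²/var, so candidate stations can be ranked by how much variance they remove. This is a drastic simplification of the article's array-design analysis (which works with full covariance matrices), and the station names and covariances below are invented for illustration.

```python
def posterior_variance(var_target, cov_ts, var_sensor):
    """Error variance at a target location after assimilating a single
    sensor with the scalar best linear unbiased estimator:
    var_post = var_target - cov(target, sensor)**2 / var_sensor."""
    return var_target - cov_ts ** 2 / var_sensor

def best_sensor(var_target, candidates):
    """Greedy array design step: pick the candidate station
    (name, cov_ts, var_sensor) that most reduces the target variance."""
    return min(candidates,
               key=lambda c: posterior_variance(var_target, c[1], c[2]))

# hypothetical candidate stations and their covariance with the target
candidates = [("pier_A", 0.9, 1.0), ("mooring_B", 0.3, 1.0)]
assert best_sensor(1.0, candidates)[0] == "pier_A"
```

    Repeating the greedy step on the updated covariances gives one simple way to add stations to an existing array; removing stations inverts the criterion.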

  8. [An analysis of residents' self-evaluation and faculty-evaluation in internal medicine standardized residency training program using Milestones evaluation system].

    PubMed

    Zhang, Y; Chu, X T; Zeng, X J; Li, H; Zhang, F C; Zhang, S Y; Shen, T

    2018-06-01

    Objective: To assess the value of the internal medicine residency training program at Peking Union Medical College Hospital (PUMCH) and the feasibility of applying a revised Milestones evaluation system. Methods: Postgraduate-year-one to four (PGY-1 to PGY-4) residents at PUMCH completed the revised Milestones evaluation scales in September 2017. Residents' self-evaluation and faculty-evaluation scores were calculated and analyzed statistically. Results: A total of 207 residents were enrolled in this cross-sectional study. Both self and faculty scores showed an increasing trend in senior residents. PGY-1 residents, assessed during their first month of residency, had scores of 4 points or higher, suggesting that residents start from a high level. More strikingly, the mean score in PGY-4 was 7 points or higher, demonstrating the career development fostered by the residency training program. There was no statistically significant difference between total self- and faculty-evaluation scores. Evaluation scores for learning ability and communication ability were lower in the faculty group (t = -2.627, -4.279; all P < 0.05). The scores of graduate students were lower than those of standardized training residents. Conclusions: The goal of national standardized residency training is to improve the quality of healthcare and residents' career development. The evaluation results would guide curriculum design and emphasize the importance and necessity of multi-level teaching. Self-evaluation contributes to the understanding of training objectives and personal cognition.

  9. Bayesian networks for evaluation of evidence from forensic entomology.

    PubMed

    Andersson, M Gunnar; Sundström, Anders; Lindström, Anders

    2013-09-01

    In the aftermath of a CBRN incident, there is an urgent need to reconstruct events in order to bring the perpetrators to court and to take preventive actions for the future. The challenge is to discriminate, based on available information, between alternative scenarios. Forensic interpretation is used to evaluate to what extent results from the forensic investigation favor the prosecutors' or the defendants' arguments, using the framework of Bayesian hypothesis testing. Recently, several new scientific disciplines have been used in a forensic context. In the AniBioThreat project, the framework was applied to veterinary forensic pathology, tracing of pathogenic microorganisms, and forensic entomology. Forensic entomology is an important tool for estimating the postmortem interval in, for example, homicide investigations as a complement to more traditional methods. In this article, we demonstrate the applicability of the Bayesian framework for evaluating entomological evidence in a forensic investigation through the analysis of a hypothetical scenario involving suspect movement of carcasses from a clandestine laboratory. Probabilities of different findings under the alternative hypotheses were estimated using a combination of statistical analysis of data, expert knowledge, and simulation, and entomological findings are used to update the beliefs about the prosecutors' and defendants' hypotheses and to calculate the value of evidence. The Bayesian framework proved useful for evaluating complex hypotheses using findings from several insect species, accounting for uncertainty about development rate, temperature, and precolonization. The applicability of the forensic statistics approach to evaluating forensic results from a CBRN incident is discussed.
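    The "value of evidence" in this framework is the standard forensic likelihood ratio: how much more probable the findings are under the prosecution hypothesis than under the defence hypothesis, which then scales the prior odds via Bayes' rule in odds form. A minimal sketch with hypothetical numbers (the probabilities below are invented, not taken from the study):

```python
def likelihood_ratio(p_e_given_hp, p_e_given_hd):
    """Value of evidence: P(findings | prosecution hypothesis) divided
    by P(findings | defence hypothesis)."""
    return p_e_given_hp / p_e_given_hd

def posterior_odds(prior_odds, lr):
    """Bayes' rule in odds form: posterior odds = prior odds * LR."""
    return prior_odds * lr

# hypothetical: the insect findings are 10x more probable under Hp
lr = likelihood_ratio(0.2, 0.02)
assert lr == 10.0
assert posterior_odds(0.5, lr) == 5.0
```

    In practice the two conditional probabilities come from the kind of data analysis, expert elicitation, and simulation the abstract describes; the arithmetic that combines them is just this.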

  10. Analysis of temporal gene expression profiles: clustering by simulated annealing and determining the optimal number of clusters.

    PubMed

    Lukashin, A V; Fuchs, R

    2001-05-01

    Cluster analysis of genome-wide expression data from DNA microarray hybridization studies has proved to be a useful tool for identifying biologically relevant groupings of genes and samples. In the present paper, we focus on several important issues related to clustering algorithms that have not yet been fully studied. We describe a simple and robust algorithm for the clustering of temporal gene expression profiles that is based on the simulated annealing procedure. In general, this algorithm is guaranteed eventually to find the globally optimal distribution of genes over clusters. We introduce an iterative scheme that serves to evaluate quantitatively the optimal number of clusters for each specific data set. The scheme is based on standard approaches used in regular statistical tests. The basic idea is to organize the search for the optimal number of clusters simultaneously with the optimization of the distribution of genes over clusters. The efficiency of the proposed algorithm has been evaluated by means of a reverse engineering experiment, that is, a situation in which the correct distribution of genes over clusters is known a priori. This statistically rigorous test has shown that our algorithm places more than 90% of genes into the correct clusters. Finally, the algorithm has been tested on real gene expression data (expression changes during the yeast cell cycle), for which the fundamental patterns of gene expression and the assignment of genes to clusters are well understood from numerous previous studies.
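    The paper's algorithm is not reproduced in the abstract; the toy sketch below shows only the core simulated-annealing idea it rests on: propose moving one item to another cluster, always accept improvements, accept uphill moves with the Metropolis probability, and cool the temperature. The data, cost function (within-cluster sum of squares on 1-D points), and schedule parameters are arbitrary choices for illustration.

```python
import math
import random

def wcss(points, labels, k):
    """Within-cluster sum of squared deviations from cluster means."""
    cost = 0.0
    for c in range(k):
        members = [p for p, l in zip(points, labels) if l == c]
        if members:
            mu = sum(members) / len(members)
            cost += sum((p - mu) ** 2 for p in members)
    return cost

def anneal_clusters(points, k, steps=2000, t0=1.0, cooling=0.995, seed=0):
    """Toy simulated-annealing clustering: propose reassigning one
    point, accept with the Metropolis rule, geometrically cool."""
    rng = random.Random(seed)
    labels = [rng.randrange(k) for _ in points]
    cost = wcss(points, labels, k)
    t = t0
    for _ in range(steps):
        i = rng.randrange(len(points))
        old = labels[i]
        labels[i] = rng.randrange(k)
        new_cost = wcss(points, labels, k)
        if new_cost > cost and rng.random() >= math.exp((cost - new_cost) / t):
            labels[i] = old          # reject the uphill move
        else:
            cost = new_cost          # accept
        t *= cooling
    return labels, cost

# two well-separated 1-D groups: the optimal 2-cluster cost is 0.04,
# while any mis-assigned configuration costs far more
points = [0.0, 0.1, 0.2, 5.0, 5.1, 5.2]
labels, cost = anneal_clusters(points, k=2)
assert cost < 1.0
```

    The "guaranteed eventually" claim in the abstract corresponds to annealing with a sufficiently slow cooling schedule; real gene-expression runs would use profile correlation rather than 1-D squared distance.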

  11. Multi-decadal evolution characteristics of global surface temperature anomaly data shown by observation and CMIP5 models

    NASA Astrophysics Data System (ADS)

    Zhu, X.

    2017-12-01

    Based on methods of statistical analysis, the time series of global surface air temperature (SAT) anomalies from 1860-2014 has been characterized by three types of phase changes, identified by dividing the temperature record into stages. The representation of these three types of phase changes by CMIP5 models was evaluated. The conclusions are as follows. The SAT record from 1860-2014 can be divided into six stages according to trend differences, and this subdivision is proved to be statistically significant. Based on trend analysis and the distribution of slopes between any two points (two-point slopes) in every stage, the six stages can be summarized as three types of phase changes: warming, cooling, and hiatus. Between 1860 and 2014, the world experienced three warming phases (1860-1878, 1909-1942, 1975-2004), one cooling phase (1878-1909), and two hiatus phases (1942-1975, 2004-2014). Using this definition method, whether the next year belongs to the previous phase can be estimated; the temperature in 2015 was used as an example to validate the feasibility of the method. The CMIP5 models simulate the warming phases well; however, the characteristics shown by SAT during the cooling and hiatus phases are not reproduced by the models. As such, projections of future warming phases using the CMIP5 models are credible, but for cooling and hiatus events they are unreliable.

  12. Effect of two complex training protocols of back squats in blood indicators of muscular damage in military athletes

    PubMed Central

    Ojeda, Álvaro Huerta; Ríos, Luis Chirosa; Barrilao, Rafael Guisado; Ríos, Ignacio Chirosa; Serrano, Pablo Cáceres

    2016-01-01

    [Purpose] The aim of this study was to determine the variations in the blood muscular damage indicators post application of two complex training programs for back squats. [Subjects and Methods] Seven military athletes were the subjects of this study. The study had a quasi-experimental cross-over intra-subject design. Two complex training protocols were applied, and the variables to be measured were cortisol, metabolic creatine kinase, and total creatine kinase. For the statistical analysis, Student’s t-test was used. [Results] Twenty-four hours post effort, a significant decrease in cortisol level was shown for both protocols; however, the metabolic creatine kinase and total creatine kinase levels showed a significant increase. [Conclusion] Both protocols lowered the indicator of main muscular damage in the blood supply (cortisol). This proved that the work weight did not generate significant muscular damage in the 24-hour post-exercise period. PMID:27313356

  13. Measures of dependence for multivariate Lévy distributions

    NASA Astrophysics Data System (ADS)

    Boland, J.; Hurd, T. R.; Pivato, M.; Seco, L.

    2001-02-01

    Recent statistical analysis of a number of financial databases is summarized. Increasing agreement is found that logarithmic equity returns show a certain type of asymptotic behavior of the largest events, namely that the probability density functions have power law tails with an exponent α≈3.0. This behavior does not vary much over different stock exchanges or over time, despite large variations in trading environments. The present paper proposes a class of multivariate distributions which generalizes the observed qualities of univariate time series. A new consequence of the proposed class is the "spectral measure" which completely characterizes the multivariate dependences of the extreme tails of the distribution. This measure on the unit sphere in M-dimensions, in principle completely general, can be determined empirically by looking at extreme events. If it can be observed and determined, it will prove to be of importance for scenario generation in portfolio risk management.
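    The tail exponent α≈3.0 cited above is typically estimated from the largest order statistics of the return distribution. One standard estimator for this (not necessarily the one the authors used) is the Hill estimator; the sketch below checks it on deterministic Pareto(α = 3) quantiles rather than random data.

```python
import math

def hill_estimator(data, k):
    """Hill estimate of the power-law tail exponent alpha from the k
    largest observations: k divided by the sum of log-ratios to the
    (k+1)-th largest value."""
    tail = sorted(data, reverse=True)[:k + 1]
    logs = [math.log(x / tail[k]) for x in tail[:k]]
    return k / sum(logs)

# deterministic Pareto(alpha = 3) quantiles: x = (1 - u)**(-1/3)
n = 5000
sample = [(1 - i / n) ** (-1.0 / 3.0) for i in range(n)]
alpha = hill_estimator(sample, k=250)
assert 2.5 < alpha < 3.5
```

    The choice of k trades bias against variance; tail studies of equity returns usually report estimates over a range of k values rather than a single cut-off.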

  14. Single-particle detection of products from atomic and molecular reactions in a cryogenic ion storage ring

    NASA Astrophysics Data System (ADS)

    Krantz, C.; Novotný, O.; Becker, A.; George, S.; Grieser, M.; Hahn, R. von; Meyer, C.; Schippers, S.; Spruck, K.; Vogel, S.; Wolf, A.

    2017-04-01

    We have used a single-particle detector system, based on secondary electron emission, for counting low-energy (∼keV/u) massive products originating from atomic and molecular ion reactions in the electrostatic Cryogenic Storage Ring (CSR). The detector is movable within the cryogenic vacuum chamber of the CSR and was used to measure production rates of a variety of charged and neutral daughter particles. In operation at a temperature of ∼6 K, the detector is characterised by a high dynamic range, combining a low dark event rate with good high-rate particle counting capability. On-line measurement of the pulse height distributions proved to be an important monitor of the detector response at low temperature. Statistical pulse-height analysis allows one to infer the particle detection efficiency, which has been found to be close to unity even in cryogenic operation at 6 K.

  15. Learnability and generalisation of Arabic broken plural nouns

    PubMed Central

    Dawdy-Hesterberg, Lisa Garnand; Pierrehumbert, Janet Breckenridge

    2014-01-01

    The noun plural system in Modern Standard Arabic lies at a nexus of critical issues in morphological learnability. The suffixing “sound” plural competes with as many as 31 non-concatenative “broken” plural patterns. Our computational analysis of singular–plural pairs in the Corpus of Contemporary Arabic explores what types of linguistic information are statistically relevant to morphological generalisation for this highly complex system. We show that an analogical approach with the generalised context model is highly successful in predicting the plural form for any given singular form. This model proves to be robust to variation, as evidenced by its stability across 10 rounds of cross-validation. The predictive power is carried almost entirely by the CV template, a representation which specifies a segment's status as a consonant or vowel only, providing further support for the abstraction of prosodic templates in the Arabic morphological system as proposed by McCarthy and Prince. PMID:25346932
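    The generalised context model scores each candidate class by the summed similarity of the item to that class's stored exemplars. A minimal sketch, assuming an exponential similarity kernel over edit distance between CV templates; the templates and class labels below are invented for illustration, not drawn from the Arabic corpus.

```python
import math
from itertools import product

def edit_distance(a, b):
    """Levenshtein distance via dynamic programming."""
    d = [[i + j if i * j == 0 else 0 for j in range(len(b) + 1)]
         for i in range(len(a) + 1)]
    for i, j in product(range(1, len(a) + 1), range(1, len(b) + 1)):
        d[i][j] = min(d[i - 1][j] + 1,          # deletion
                      d[i][j - 1] + 1,          # insertion
                      d[i - 1][j - 1] + (a[i - 1] != b[j - 1]))  # substitution
    return d[len(a)][len(b)]

def gcm_classify(item, exemplars, c=1.0):
    """Generalised context model: choose the class whose exemplars have
    the greatest summed similarity exp(-c * distance) to the item."""
    scores = {}
    for template, cls in exemplars:
        sim = math.exp(-c * edit_distance(item, template))
        scores[cls] = scores.get(cls, 0.0) + sim
    return max(scores, key=scores.get)

# hypothetical CV templates paired with plural-class labels
exemplars = [("CVCC", "broken_1"), ("CVCVC", "broken_2"), ("CVVC", "sound")]
assert gcm_classify("CVCVC", exemplars) == "broken_2"
```

    The finding that the CV template alone carries most of the predictive power corresponds, in this sketch, to computing the distance over C/V symbols while ignoring segment identity.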

  16. Effect of two complex training protocols of back squats in blood indicators of muscular damage in military athletes.

    PubMed

    Ojeda, Álvaro Huerta; Ríos, Luis Chirosa; Barrilao, Rafael Guisado; Ríos, Ignacio Chirosa; Serrano, Pablo Cáceres

    2016-05-01

    [Purpose] The aim of this study was to determine the variations in the blood muscular damage indicators post application of two complex training programs for back squats. [Subjects and Methods] Seven military athletes were the subjects of this study. The study had a quasi-experimental cross-over intra-subject design. Two complex training protocols were applied, and the variables to be measured were cortisol, metabolic creatine kinase, and total creatine kinase. For the statistical analysis, Student's t-test was used. [Results] Twenty-four hours post effort, a significant decrease in cortisol level was shown for both protocols; however, the metabolic creatine kinase and total creatine kinase levels showed a significant increase. [Conclusion] Both protocols lowered the indicator of main muscular damage in the blood supply (cortisol). This proved that the work weight did not generate significant muscular damage in the 24-hour post-exercise period.

  17. Conceptual clusters in figurative language production.

    PubMed

    Corts, Daniel P; Meyers, Kristina

    2002-07-01

    Although most prior research on figurative language examines comprehension, several recent studies on the production of such language have proved to be informative. One of the most noticeable traits of figurative language production is that it is produced at a somewhat random rate with occasional bursts of highly figurative speech (e.g., Corts & Pollio, 1999). The present article seeks to extend these findings by observing production during speech that involves a very high base rate of figurative language, making statistically defined bursts difficult to detect. In an analysis of three Baptist sermons, burst-like clusters of figurative language were identified. Further study indicated that these clusters largely involve a central root metaphor that represents the topic under consideration. An interaction of the coherence, along with a conceptual understanding of a topic and the relative importance of the topic to the purpose of the speech, is offered as the most likely explanation for the clustering of figurative language in natural speech.

  18. Saudi Arabia Country Analysis Brief

    EIA Publications

    2014-01-01

    Saudi Arabia is the world's largest holder of crude oil proved reserves and was the largest exporter of total petroleum liquids in 2013. In 2013, Saudi Arabia was the world's second-largest petroleum liquids producer behind the United States and was the world's second-largest crude oil producer behind Russia. Saudi Arabia's economy remains heavily dependent on petroleum. Petroleum exports accounted for 85% of total Saudi export revenues in 2013, according to the Organization of the Petroleum Exporting Countries (OPEC)'s Annual Statistical Bulletin 2014. With the largest oil projects nearing completion, Saudi Arabia is expanding its natural gas, refining, petrochemicals, and electric power industries. Saudi Arabia's oil and natural gas operations are dominated by Saudi Aramco, the national oil and gas company and the world's largest oil company in terms of production. Saudi Arabia's Ministry of Petroleum and Mineral Resources and the Supreme Council for Petroleum and Minerals have oversight of the oil and natural gas sector and Saudi Aramco.

  19. A comparative empirical study on mobile ICT services, social responsibility and the protection of children.

    PubMed

    De-Miguel-Molina, María; Martínez-Gómez, Mónica

    2011-06-01

    The purpose of this paper is to analyse the Spanish mobile phone industry to determine how mobile phone companies and certain institutions can improve protection for children who use mobile phones. We carried out a multivariate statistical analysis using anonymous primary data from mobile phone companies, and institutions and associations that protect children, to compare these stakeholders' opinions and to put forward solutions. We proved that, even though some European countries have made an effort to provide safer ICT services, all stakeholders still need to cooperate and agree on solutions to the commercial problems associated with children using mobile phones. This can be done by signing codes of conduct. We found that even though some companies implement measures to protect children from accessing harmful content via their mobile phones, they do so for reasons of legal and not social responsibility.

  20. Statistical models and time series forecasting of sulfur dioxide: a case study Tehran.

    PubMed

    Hassanzadeh, S; Hosseinibalam, F; Alizadeh, R

    2009-08-01

    This study performed a time-series analysis, frequency distribution and prediction of SO(2) levels for five stations (Pardisan, Vila, Azadi, Gholhak and Bahman) in Tehran for the period 2000-2005. Most sites show quite similar characteristics, with the highest pollution in autumn-winter and the least pollution in spring-summer. The frequency distributions show higher peaks at two residential sites. The potential for SO(2) problems is high because of high emissions and the close geographical proximity of the major industrial and urban centers. The ACF and PACF are nonzero for several lags, indicating a mixed (ARMA) model; an ARMA model was therefore used for forecasting SO(2) at the Bahman station. The partial autocorrelations become close to 0 after about 5 lags, while the autocorrelations remain strong through all the lags shown. The results proved that the ARMA(2,2) model can provide reliable, satisfactory predictions for the time series.
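    Fitting an ARMA(2,2) model requires numerical likelihood optimization, but the one-step-forecast idea can be shown with the simpler AR(1) special case, fitted in closed form from the lag-1 autocorrelation (Yule-Walker). The SO2 values below are invented for illustration, not taken from the Tehran data.

```python
def fit_ar1(series):
    """Fit x_t - m = phi * (x_{t-1} - m) + e_t by the lag-1
    autocovariance ratio (Yule-Walker estimate of phi)."""
    m = sum(series) / len(series)
    dev = [x - m for x in series]
    num = sum(a * b for a, b in zip(dev[1:], dev[:-1]))
    den = sum(d * d for d in dev[:-1])
    return m, num / den

def forecast_next(series):
    """One-step-ahead forecast from the fitted AR(1) model."""
    m, phi = fit_ar1(series)
    return m + phi * (series[-1] - m)

# a persistent (positively autocorrelated) series forecasts near its
# last value rather than snapping back to the mean
so2 = [30, 32, 35, 40, 44, 47, 50, 52]
m, phi = fit_ar1(so2)
assert 0.5 < phi < 1.5
assert 45 < forecast_next(so2) < 55
```

    An ARMA(2,2) fit adds two moving-average terms on past forecast errors; the forecasting step, however, has the same shape: project the deviation from the mean forward through the fitted coefficients.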

  1. The clinical evaluation of platelet-rich plasma on free gingival graft's donor site wound healing.

    PubMed

    Samani, Mahmoud Khosravi; Saberi, Bardia Vadiati; Ali Tabatabaei, S M; Moghadam, Mahdjoube Goldani

    2017-01-01

    It has been proved that platelet-rich plasma (PRP) can promote wound healing, so PRP can be advantageous in periodontal plastic surgeries, free gingival graft (FGG) being one such surgery. In this randomized split-mouth controlled trial, 10 patients who needed bilateral FGG were selected, and the two donor sites were randomly assigned to experience either natural healing or healing assisted with PRP. The outcome was assessed by comparing the extent of wound closure, the Manchester scale, the Landry healing scale, a visual analog scale, and tissue thickness between the study groups at different time intervals. Repeated-measures analysis of variance and paired t-tests were used; statistical significance was set at P ≤ 0.05. Significant differences between the study groups, and also across different time intervals, were seen in all parameters except for changes in tissue thickness. PRP accelerates the healing process of wounds and reduces the healing time.

  2. The Evolution of Your Success Lies at the Centre of Your Co-Authorship Network

    PubMed Central

    Servia-Rodríguez, Sandra; Noulas, Anastasios; Mascolo, Cecilia; Fernández-Vilas, Ana; Díaz-Redondo, Rebeca P.

    2015-01-01

    Collaboration among scholars and institutions is progressively becoming essential to the success of research grant procurement and to the emergence and evolution of scientific disciplines. Our work focuses on analysing whether the volume of an author's collaborations, together with the relevance of his collaborators, is related to his research performance over time. In order to prove this relation, we collected the temporal distributions of scholars' publications and citations from the Google Scholar platform and the co-authorship network (of computer scientists) underlying the well-known DBLP bibliographic database. By applying time series clustering, social network analysis and non-parametric statistics, we observe that scholars with similar publication (citation) patterns also tend to have a similar centrality in the co-authorship network. To our knowledge, this is the first work that considers success evolution with respect to co-authorship. PMID:25760732

  3. Information content of the space-frequency filtering of blood plasma layers laser images in the diagnosis of pathological changes

    NASA Astrophysics Data System (ADS)

    Ushenko, A. G.; Boychuk, T. M.; Mincer, O. P.; Bodnar, G. B.; Kushnerick, L. Ya.; Savich, V. O.

    2013-12-01

    The bases of the method of space-frequency filtering of the phase distributions of blood plasma pellicles are given here. A model of the optical-anisotropic properties of the albumen network of the blood plasma pellicle, accounting for the linear and circular double refraction of albumin and globulin crystals, is proposed. Comparative studies of the effectiveness of direct polarization mapping of the azimuth images of blood plasma pellicle layers and of space-frequency polarimetry of the laser radiation transformed by divaricate and hole-like optical-anisotropic networks of blood plasma pellicles were carried out. On the basis of complex statistical, correlation and fractal analysis of the filtered frequency-dimensional polarization azimuth maps of the blood plasma pellicle structure, a set of criteria for the change in the double refraction of the albumen networks caused by prostate cancer was traced and proved.

  4. Using decision trees to understand structure in missing data

    PubMed Central

    Tierney, Nicholas J; Harden, Fiona A; Harden, Maurice J; Mengersen, Kerrie L

    2015-01-01

    Objectives Demonstrate the application of decision trees—classification and regression trees (CARTs), and their cousins, boosted regression trees (BRTs)—to understand structure in missing data. Setting Data taken from employees at 3 different industrial sites in Australia. Participants 7915 observations were included. Materials and methods The approach was evaluated using an occupational health data set comprising results of questionnaires, medical tests and environmental monitoring. Statistical methods included standard statistical tests and the ‘rpart’ and ‘gbm’ packages for CART and BRT analyses, respectively, from the statistical software ‘R’. A simulation study was conducted to explore the capability of decision tree models in describing data with missingness artificially introduced. Results CART and BRT models were effective in highlighting a missingness structure in the data, related to the type of data (medical or environmental), the site in which it was collected, the number of visits, and the presence of extreme values. The simulation study revealed that CART models were able to identify variables and values responsible for inducing missingness. There was greater variation in variable importance for unstructured as compared to structured missingness. Discussion Both CART and BRT models were effective in describing structural missingness in data. CART models may be preferred over BRT models for exploratory analysis of missing data, and selecting variables important for predicting missingness. BRT models can show how values of other variables influence missingness, which may prove useful for researchers. Conclusions Researchers are encouraged to use CART and BRT models to explore and understand missing data. PMID:26124509

  5. Sensitivity assessment of freshwater macroinvertebrates to pesticides using biological traits.

    PubMed

    Ippolito, A; Todeschini, R; Vighi, M

    2012-03-01

    Assessing the sensitivity of different species to chemicals is one of the key points in predicting the effects of toxic compounds in the environment. Trait-based prediction methods have proved to be extremely efficient for assessing the sensitivity of macroinvertebrates toward compounds with nonspecific toxicity (narcotics). Nevertheless, predicting the sensitivity of organisms toward compounds with specific toxicity is much more complex, since it depends on the mode of action of the chemical. The aim of this work was to predict the sensitivity of several freshwater macroinvertebrates toward three classes of plant protection products: organophosphates, carbamates and pyrethroids. Two databases were built: one with sensitivity data (retrieved, evaluated and selected from the U.S. Environmental Protection Agency ECOTOX database) and the other with biological traits. Aside from the "traditional" traits usually considered in ecological analysis (i.e. body size, respiration technique, feeding habits, etc.), multivariate analysis was used to relate the sensitivity of organisms to other characteristics that may be involved in the process of intoxication. Results confirmed that, besides traditional biological traits related to uptake capability (e.g. body size and body shape), some traits more related to particular metabolic characteristics or patterns have good predictive capacity for the sensitivity to these kinds of toxic substances. For example, behavioral complexity, assumed as an indicator of nervous system complexity, proved to be an important predictor of sensitivity towards these compounds. These results confirm the need for more complex traits to predict the effects of highly specific substances. One key point for achieving a complete mechanistic understanding of the process is the choice of traits, whose role in the discrimination of sensitivity should be clearly interpretable, and not only statistically significant.

  6. The problem of 2,4,6-trichloroanisole in cork planks studied by attenuated total reflection infrared spectroscopy: proof of concept.

    PubMed

    Garcia, Ana R; Lopes, Luís F; Brito de Barros, Ricardo; Ilharco, Laura M

    2015-01-14

    Attenuated total reflection infrared spectroscopy (ATR-IR) proved to be a promising detection technique for 2,4,6-trichloroanisole (TCA), which confers organoleptic defects to bottled alcoholic beverages, allowing the proposal of a criterion for cork plank acceptance when meant for stopper production. By analysis of a significant number of samples, it was shown that the presence of TCA, even in very low concentrations, imparts subtle changes to the cork spectra, namely, the growth of two new bands at ∼1417 (νC═C of TCA ring) and 1314 cm–1 (a shifted νCC of TCA) and an increase in the relative intensities of the bands at ∼1039 cm–1 (δCO of polysaccharides) and ∼813 cm–1 (τCH of suberin), the latter by overlapping with intense bands of TCA. These relative intensities were evaluated against a fingerprint band of suberin (νasC–O–C) at 1161 cm–1. On the basis of those spectral variables, a multivariate linear discriminant analysis (LDA) was performed to obtain a discriminant function that classifies the samples according to whether or not they contain TCA. The proposed methodology constitutes a demanding acceptance criterion for cork planks destined for stopper production (with the guarantee of nonexistence of TCA) that results from combining the quantitative results with the absence of the two TCA-correlated bands. ATR infrared spectroscopy is a nondestructive and easy-to-apply technique, both on cork planks and on stoppers, and has proven more restrictive than other techniques used in the cork industry that analyze the cleaning solutions. At the level of proof of concept, the method proposed here is appealing for high-value stopper applications.
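The classification step described above — an LDA discriminant function built from a few band-intensity ratios — can be sketched as follows. The feature values, class means and sample labels below are synthetic illustrations, not data from the study:

```python
# Sketch: LDA classification of cork samples from ATR-IR band intensities.
# All numbers here are invented for illustration.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical features: relative intensities of the ~1039 and ~813 cm^-1
# bands, each normalized to the suberin reference band at 1161 cm^-1.
clean = rng.normal([0.50, 0.30], 0.05, size=(40, 2))    # TCA-free planks
tainted = rng.normal([0.65, 0.45], 0.05, size=(40, 2))  # TCA-contaminated

X = np.vstack([clean, tainted])
y = np.array([0] * 40 + [1] * 40)  # 0 = clean, 1 = contains TCA

lda = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", lda.score(X, y))
print("predicted class for a new sample:", lda.predict([[0.66, 0.44]])[0])
```

The fitted discriminant function plays the role of the acceptance criterion: a plank whose spectral ratios fall on the "contains TCA" side is rejected.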

  7. Anterolateral minithoracotomy versus median sternotomy for the treatment of congenital heart defects: a meta-analysis and systematic review.

    PubMed

    Ding, Chao; Wang, Chunmao; Dong, Aiqiang; Kong, Minjian; Jiang, Daming; Tao, Kaiyu; Shen, Zhonghua

    2012-05-04

    Anterolateral minithoracotomy (ALMT) for the radical correction of congenital heart defects is an alternative to median sternotomy (MS), as it reduces operative trauma, accelerates recovery and yields a better cosmetic outcome after surgery. Our purpose was to determine, through a meta-analysis of case-control studies published in English-language journals, whether ALMT brings more short-term benefits to patients than conventional MS. Six case-control studies published in English from 1997 to 2011 were identified and synthesized to compare short-term postoperative outcomes between ALMT and MS: cardiopulmonary bypass time, aortic cross-clamp time, intubation time, intensive care unit (ICU) stay and postoperative hospital stay. ALMT had significantly longer cardiopulmonary bypass times (8.00 min more, 95% CI 0.36 to 15.64 min, p = 0.04). There was some evidence that aortic cross-clamp time was longer with ALMT, though not significantly (2.38 min more, 95% CI -0.15 to 4.91 min, p = 0.06). In addition, ALMT had significantly shorter intubation times (1.66 hrs less, 95% CI -3.05 to -0.27 hrs, p = 0.02). Postoperative hospital stay was significantly shorter with ALMT (1.52 days less, 95% CI -2.71 to -0.33 days, p = 0.01). Some evidence suggested a reduction in ICU stay in the ALMT group, but this did not prove statistically significant (0.88 days less, 95% CI -0.81 to 0.04 days, p = 0.08). ALMT can bring more benefits to patients with congenital heart defects by reducing intubation time and postoperative hospital stay, though it has longer CPB and aortic cross-clamp times.
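Pooled estimates like "8.00 min more, 95% CI 0.36 to 15.64 min" typically come from inverse-variance weighting of per-study mean differences. A minimal fixed-effect sketch, with made-up per-study values (the abstract does not report them):

```python
# Sketch: fixed-effect inverse-variance pooling of mean differences.
# The (mean difference, standard error) pairs below are hypothetical.
import math

studies = [(6.0, 5.0), (10.0, 6.0), (8.5, 7.0)]

weights = [1.0 / se**2 for _, se in studies]  # inverse-variance weights
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled difference: {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

If the 95% CI excludes zero, the pooled difference is reported as statistically significant, which is exactly how the CPB-time and hospital-stay results above are read.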

  8. Comparing the Efficiency of Two Different Extraction Techniques in Removal of Maxillary Third Molars: A Randomized Controlled Trial.

    PubMed

    Edward, Joseph; Aziz, Mubarak A; Madhu Usha, Arjun; Narayanan, Jyothi K

    2017-12-01

    Extractions are routine procedures in dental surgery. Traditional extraction techniques use a combination of severing the periodontal attachment, luxation with an elevator, and removal with forceps. This study introduces a new technique for extraction of maxillary third molars, the Joedds technique, and compares it with the conventional technique. One hundred people were included in the study and divided into two groups by simple random sampling. In one group the conventional technique of maxillary third molar extraction was used; in the second, the Joedds technique was used. Statistical analysis was carried out with Student's t-test. Analysis of the 100 patients showed that the novel Joedds technique caused minimal trauma to surrounding tissues and fewer tuberosity and root fractures, and the time taken for extraction was <2 min compared with the other group. This novel technique proved to be better than the conventional third molar extraction technique, with minimal complications, provided cases are properly selected and the technique is correctly applied.

  9. A Comparative Study to Evaluate the Effect of Crook Lying Position versus Sitting Position on Forced Vital Capacity (FVC) in Healthy Individuals.

    PubMed

    Sudan, Dharampal Singh; Singh, Harvinder

    2014-02-01

    To assess the effect of body position on Forced Vital Capacity (FVC) and to determine the better of the sitting and crook lying positions; both are considered favorable respiratory positions as far as FVC is concerned, but no research has compared the two. We analyzed the FVC of 100 randomly selected subjects (both males and females) in the sitting and crook lying positions. A computerized Pulmonary Function Testing (PFT) apparatus was used for analysis, with three FVC readings taken in each position, of which the best was used for analysis. Mean FVC in the crook lying position was 88.83%, compared with 99.07% in the sitting position, a difference of 10.24%. The values were analyzed using the standard t-test, which gave a t-value of 18.4316 and a p-value of 0.0001, which is statistically significant. The results show that FVC was higher in the sitting position than in the crook lying position.
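Since each subject is measured in both positions, the comparison above is a paired design. A minimal sketch of the corresponding paired t-test, on synthetic per-subject FVC values (not the study's data):

```python
# Sketch: paired t-test on per-subject FVC measured in two positions.
# The values are synthetic stand-ins with a ~10-point difference.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 100
fvc_sitting = rng.normal(99.0, 5.0, n)              # % predicted, hypothetical
fvc_crook = fvc_sitting - rng.normal(10.0, 2.0, n)  # ~10 points lower

t, p = stats.ttest_rel(fvc_sitting, fvc_crook)
print(f"t = {t:.2f}, p = {p:.2e}")
```

A paired test uses each subject as their own control, which is why a 10-point mean difference yields such a large t-statistic.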

  10. Dynamics and spatio-temporal variability of environmental factors in Eastern Australia using functional principal component analysis

    USGS Publications Warehouse

    Szabo, J.K.; Fedriani, E.M.; Segovia-Gonzalez, M. M.; Astheimer, L.B.; Hooper, M.J.

    2010-01-01

    This paper introduces a new technique in ecology to analyze spatial and temporal variability in environmental variables. Using simple statistics, we explore the relations between abiotic and biotic variables that influence animal distributions. However, spatial and temporal variability in rainfall, a key variable in ecological studies, can cause difficulties for any basic model that includes time evolution. The study was at a landscape scale (three million square kilometers in eastern Australia), mainly over the period 1998-2004. We simultaneously considered qualitative spatial (soil and habitat types) and quantitative temporal (rainfall) variables in a Geographical Information System environment. In addition to some techniques commonly used in ecology, we applied a new method, Functional Principal Component Analysis, which proved to be very suitable for this case, as it explained more than 97% of the total variance of the rainfall data, providing us with substitute variables that are easier to manage and are even able to explain rainfall patterns. The main variable came from a habitat classification that showed strong correlations with rainfall values and soil types. © 2010 World Scientific Publishing Company.
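The core of (functional) PCA is an eigendecomposition of the covariance of curves sampled on a common grid; the share of variance captured by the leading components is what the ">97%" figure above refers to. A sketch on synthetic rainfall curves (not the study's data):

```python
# Sketch: variance explained by leading principal components of a set of
# curves sampled on a common (e.g. weekly) grid. Curves are synthetic.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 52)  # weekly grid, one "year"
# 60 synthetic rainfall curves: one dominant seasonal mode + small noise
curves = (np.outer(rng.normal(100, 20, 60), np.sin(np.pi * t))
          + rng.normal(0, 2.0, (60, 52)))

centered = curves - curves.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals = np.linalg.eigvalsh(cov)[::-1]  # eigenvalues, descending

explained = eigvals[:3].sum() / eigvals.sum()
print(f"variance explained by first 3 components: {explained:.1%}")
```

The component scores then serve as the "substitute variables" mentioned above: a few numbers per site that summarize an entire rainfall curve.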

  11. Photoacoustic Analysis of the Penetration Kinetics of Cordia verbenacea DC in Human Skin

    NASA Astrophysics Data System (ADS)

    Carvalho, S. S.; Barja, P. R.

    2012-11-01

    Phonophoresis consists of the use of ultrasound radiation in association with pharmacological agents in order to enhance transdermal penetration of applied drugs. It is a widely employed resource in physiotherapy practice, normally associated with anti-inflammatory drugs such as Acheflan. This drug was developed in Brazil from the essential oil of Cordia verbenacea DC, a plant native to the southern Brazilian coast. In previous studies, the photoacoustic (PA) technique proved effective in the study of the penetration kinetics of topically applied products and in the evaluation of drug delivery after phonophoresis application. The present work aimed to evaluate the penetration kinetics of Acheflan in human skin, employing in vivo PA measurements after massage or phonophoresis application. Ten volunteers (aged between 18 and 30 years) took part in the study. The time evolution of the PA signal was fitted to an S-shaped Boltzmann curve. After statistical analysis, PA measurements showed drug penetration for both application forms, but drug delivery was more evident after phonophoresis application, with a characteristic penetration time of less than 15 min for the stratum corneum.

  12. An efficient and scalable analysis framework for variant extraction and refinement from population-scale DNA sequence data.

    PubMed

    Jun, Goo; Wing, Mary Kate; Abecasis, Gonçalo R; Kang, Hyun Min

    2015-06-01

    The analysis of next-generation sequencing data is computationally and statistically challenging because of the massive volume of data and imperfect data quality. We present GotCloud, a pipeline for efficiently detecting and genotyping high-quality variants from large-scale sequencing data. GotCloud automates sequence alignment, sample-level quality control, variant calling, filtering of likely artifacts using machine-learning techniques, and genotype refinement using haplotype information. The pipeline can process thousands of samples in parallel and requires less computational resources than current alternatives. Experiments with whole-genome and exome-targeted sequence data generated by the 1000 Genomes Project show that the pipeline provides effective filtering against false positive variants and high power to detect true variants. Our pipeline has already contributed to variant detection and genotyping in several large-scale sequencing projects, including the 1000 Genomes Project and the NHLBI Exome Sequencing Project. We hope it will now prove useful to many medical sequencing studies. © 2015 Jun et al.; Published by Cold Spring Harbor Laboratory Press.

  13. Parametric Model Based On Imputations Techniques for Partly Interval Censored Data

    NASA Astrophysics Data System (ADS)

    Zyoud, Abdallah; Elfaki, F. A. M.; Hrairi, Meftah

    2017-12-01

    The term ‘survival analysis’ has been used in a broad sense to describe a collection of statistical procedures for data analysis in which the outcome variable of interest is the time until an event occurs, where the time to failure of a specific experimental unit might be right, left, interval, or partly interval censored (PIC). In this paper, analysis of this model was conducted based on a parametric Cox model via PIC data. Moreover, several imputation techniques were used: midpoint, left & right point, random, mean, and median. Maximum likelihood estimation was used to obtain the estimated survival function. These estimates were then compared with existing models (the Turnbull and Cox models) on clinical trial data (breast cancer data), which showed the validity of the proposed model. Results on the data set indicated that the parametric Cox model proved superior in terms of estimation of survival functions, likelihood ratio tests, and their p-values. Moreover, among the imputation techniques, the midpoint, random, mean, and median showed better results with respect to the estimation of the survival function.
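The imputation step named above replaces each interval-censored observation (L, R] by a single event time before fitting the parametric model. A sketch of two of the listed schemes, midpoint and random, on hypothetical intervals:

```python
# Sketch: single-value imputation for interval-censored event times.
# Each row is a hypothetical censoring interval (L, R); (3, 3) is an
# exactly observed time.
import numpy as np

rng = np.random.default_rng(3)
intervals = np.array([(2.0, 4.0), (1.0, 5.0), (3.0, 3.0),
                      (0.0, 2.0), (4.0, 8.0)])

midpoint = intervals.mean(axis=1)                           # (L + R) / 2
random_imp = rng.uniform(intervals[:, 0], intervals[:, 1])  # draw within (L, R)

print("midpoint imputation:", midpoint)
print("random imputation:  ", np.round(random_imp, 2))
```

After imputation the data look like ordinary (possibly right-censored) survival times, so standard maximum likelihood machinery applies directly.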

  14. Propagation of Visible and Infrared Radiation in Fog, Rain, and Snow

    DTIC Science & Technology

    1982-07-01


  15. Classification of type 2 diabetes rats based on urine amino acids metabolic profiling by liquid chromatography coupled with tandem mass spectrometry.

    PubMed

    Wang, Chunyan; Zhu, Hongbin; Pi, Zifeng; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying

    2013-09-15

    An analytical method for quantifying underivatized amino acids (AAs) in rat urine samples was developed using liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS). Classification of type 2 diabetes rats was based on urine amino acid metabolic profiling. LC-MS/MS analysis used chromatographic separation and multiple reaction monitoring (MRM) transitions. Multivariate profile-wide predictive models were constructed using partial least squares discriminant analysis (PLS-DA) with the SIMCA-P 11.5 software package and hierarchical cluster analysis (HCA) with SPSS 18.0. Several amino acids in rat urine showed significant changes. The results of the present study show that this method can quantify free AAs in rat urine by LC-MS/MS. In summary, the PLS-DA and HCA statistical analyses in our research were able to differentiate healthy rats from type 2 diabetes rats by quantification of AAs in their urine samples. In addition, compared with the healthy group, the seven elevated amino acids in the urine of type 2 diabetic rats returned to normal under treatment with acarbose. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. A metabolomics-based method for studying the effect of yfcC gene in Escherichia coli on metabolism.

    PubMed

    Wang, Xiyue; Xie, Yuping; Gao, Peng; Zhang, Sufang; Tan, Haidong; Yang, Fengxu; Lian, Rongwei; Tian, Jing; Xu, Guowang

    2014-04-15

    Metabolomics is a potent tool to assist in identifying the function of unknown genes through analysis of metabolite changes in the context of varied genetic backgrounds. However, the availability of a universal unbiased profiling analysis is still a big challenge. In this study, we report an optimized metabolic profiling method based on gas chromatography-mass spectrometry for Escherichia coli. It was found that physiological saline at -80°C could ensure satisfactory metabolic quenching with less metabolite leakage. A solution of methanol/water (21:79, v/v) proved efficient for intracellular metabolite extraction. This method was applied to investigate the metabolome differences among wild-type E. coli and its yfcC deletion and overexpression mutants. Statistical and bioinformatic analysis of the metabolic profiling data indicated that the expression of yfcC potentially affected the metabolism of the glyoxylate shunt. This finding was further validated by real-time quantitative polymerase chain reactions showing that expression of aceA and aceB, the key genes in the glyoxylate shunt, was upregulated by yfcC. This study exemplifies the robustness of the proposed metabolic profiling analysis strategy and its potential role in investigating unknown gene functions in view of metabolome differences. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. NSK reciprocating handpiece: in vitro comparative analysis of dentinal removal during root canal preparation by different operators.

    PubMed

    Wagner, Márcia Helena; Barletta, Fernando Branco; Reis, Magda de Souza; Mello, Luciano Loureiro; Ferreira, Ronise; Fernandes, Antônio Luiz Rocha

    2006-01-01

    The purpose of this study was to assess dentin removal during root canal preparation by different operators using a NSK reciprocating handpiece. Eighty-four human single-rooted mandibular premolars were hand instrumented using Triple-Flex stainless-steel files (Kerr) up to #30, weighed in an analytical balance and randomly assigned to 4 groups (n=21). All specimens were mechanically prepared at the working length with #35 to #45 Triple-Flex files (Kerr) coupled to a NSK (TEP-E10R, Nakanishi Inc.) reciprocating handpiece powered by an electric motor (Endo Plus; VK Driller). Groups 1 to 4 were prepared by a professor of Endodontics, an endodontist, a third-year dental student and a general dentist, respectively. Teeth were reweighed after root canal preparation. The difference between weights was calculated and the means of dentin removal in each group were analyzed statistically by ANOVA and Tukey's test at the 5% significance level. The greatest amount of dentin removal was found in group 4, followed by groups 2, 3 and 1. Group 4 differed statistically from the other groups regarding dentin removal means [p<0.001 (group 1); p=0.005 (group 2); and p=0.001 (group 3)]. No statistically significant difference was found between groups 1 and 2 (p=0.608), 1 and 3 (p=0.914) and 2 and 3 (p=0.938). In conclusion, although the group prepared by a general dentist differed statistically from the other groups in terms of amount of dentin removal, this difference was clinically irrelevant. The NSK reciprocating handpiece powered by an electric motor proved to be an effective auxiliary tool in root canal preparation, regardless of the operator's skills.
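The analysis above — one-way ANOVA across the four operator groups followed by Tukey's pairwise comparisons — can be sketched as follows; the dentin-removal weights (in grams) are synthetic, with group 4 deliberately shifted:

```python
# Sketch: one-way ANOVA plus Tukey's HSD over four operator groups.
# Weights are invented; only the structure mirrors the study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
professor = rng.normal(0.010, 0.002, 21)
endodontist = rng.normal(0.012, 0.002, 21)
student = rng.normal(0.011, 0.002, 21)
general_dentist = rng.normal(0.018, 0.002, 21)  # largest removal, as reported

f, p = stats.f_oneway(professor, endodontist, student, general_dentist)
tukey = stats.tukey_hsd(professor, endodontist, student, general_dentist)

print(f"ANOVA: F = {f:.1f}, p = {p:.2e}")
print("Tukey p (group 4 vs group 1):", f"{tukey.pvalue[3, 0]:.4f}")
```

Tukey's HSD controls the family-wise error rate across all six pairwise comparisons, which is why it follows rather than replaces the omnibus ANOVA.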

  18. The non-equilibrium statistical mechanics of a simple geophysical fluid dynamics model

    NASA Astrophysics Data System (ADS)

    Verkley, Wim; Severijns, Camiel

    2014-05-01

    Lorenz [1] has devised a dynamical system that has proved to be very useful as a benchmark system in geophysical fluid dynamics. The system in its simplest form consists of a periodic array of variables that can be associated with an atmospheric field on a latitude circle. The system is driven by a constant forcing, is damped by linear friction and has a simple advection term that causes the model to behave chaotically if the forcing is large enough. Our aim is to predict the statistics of Lorenz' model on the basis of a given average value of its total energy - obtained from a numerical integration - and the assumption of statistical stationarity. Our method is the principle of maximum entropy [2] which in this case reads: the information entropy of the system's probability density function shall be maximal under the constraints of normalization, a given value of the average total energy and statistical stationarity. Statistical stationarity is incorporated approximately by using `stationarity constraints', i.e., by requiring that the average first and possibly higher-order time-derivatives of the energy are zero in the maximization of entropy. The analysis [3] reveals that, if the first stationarity constraint is used, the resulting probability density function rather accurately reproduces the statistics of the individual variables. If the second stationarity constraint is used as well, the correlations between the variables are also reproduced quite adequately. The method can be generalized straightforwardly and holds the promise of a viable non-equilibrium statistical mechanics of the forced-dissipative systems of geophysical fluid dynamics. [1] E.N. Lorenz, 1996: Predictability - A problem partly solved, in Proc. Seminar on Predictability (ECMWF, Reading, Berkshire, UK), Vol. 1, pp. 1-18. [2] E.T. Jaynes, 2003: Probability Theory - The Logic of Science (Cambridge University Press, Cambridge). [3] W.T.M. Verkley and C.A. 
Severijns, 2014: The maximum entropy principle applied to a dynamical system proposed by Lorenz, Eur. Phys. J. B, 87:7, http://dx.doi.org/10.1140/epjb/e2013-40681-2 (open access).
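The maximum-entropy construction described above can be written compactly. Under the constraints the abstract names (normalization, a prescribed mean energy, and the first stationarity constraint, a vanishing mean energy tendency), the entropy-maximizing density is exponential in the constrained quantities, with Lagrange multipliers λ₁, λ₂:

```latex
\max_{p}\; S[p] = -\int p(\mathbf{x})\,\ln p(\mathbf{x})\,\mathrm{d}\mathbf{x}
\quad\text{subject to}\quad
\int p\,\mathrm{d}\mathbf{x} = 1,\qquad
\int p\,E\,\mathrm{d}\mathbf{x} = \langle E \rangle,\qquad
\int p\,\dot{E}\,\mathrm{d}\mathbf{x} = 0,
\qquad\Longrightarrow\qquad
p(\mathbf{x}) \;\propto\; \exp\!\left(-\lambda_{1}\,E(\mathbf{x}) - \lambda_{2}\,\dot{E}(\mathbf{x})\right).
```

Adding the second stationarity constraint, on the mean of the second time derivative of the energy, introduces a further multiplier in the exponent in the same way.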

  19. Convergent Validity of the Autism Spectrum Disorder-Diagnostic for Children (ASD-DC) and Autism Diagnostic Interview-Revised (ADI-R)

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Hess, Julie A.; Mahan, Sara; Fodstad, Jill C.

    2010-01-01

    The purpose of this paper was to further establish the validity of the Autism Spectrum Disorder-Diagnostic for Children (ASD-DC). The methodology consisted of testing the similarity of findings between the ASD-DC and the Autism Diagnostic Interview-Revised (ADI-R), which proved to be statistically significant for subscale content scores on social,…

  20. Territorial Developments Based on Graffiti: a Statistical Mechanics Approach

    DTIC Science & Technology

    2011-10-28

    defined on a lattice. We introduce a two-gang Hamiltonian model where agents have red or blue affiliation but are otherwise indistinguishable. In this...ramifications of our results. Keywords: Territorial Formation, Spin Systems, Phase Transitions 1. Introduction Lattice models have been extensively used in...inconsequential. In short, lattice models have proved extremely useful in the context of the physical, biological and even chemical sciences. In more

  1. Recurrence theorems: A unified account

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wallace, David, E-mail: david.wallace@balliol.ox.ac.uk

    I discuss classical and quantum recurrence theorems in a unified manner, treating both as generalisations of the fact that a system with a finite state space only has so many places to go. Along the way, I prove versions of the recurrence theorem applicable to dynamics on linear and metric spaces and make some comments about applications of the classical recurrence theorem in the foundations of statistical mechanics.

  2. High quality GaAs single photon emitters on Si substrate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bietti, S.; Sanguinetti, S.; Cavigli, L.

    2013-12-04

    We describe a method for the direct epitaxial growth of a single photon emitter, based on GaAs quantum dots fabricated by droplet epitaxy, working at liquid nitrogen temperatures on Si substrates. The achievement of quantum photon statistics up to T=80 K is directly proved by antibunching in the second order correlation function as measured with a Hanbury Brown and Twiss interferometer.

  3. Analysis of biochemical genetic data on Jewish populations: II. Results and interpretations of heterogeneity indices and distance measures with respect to standards.

    PubMed Central

    Karlin, S; Kenett, R; Bonné-Tamir, B

    1979-01-01

    A nonparametric statistical methodology is used for the analysis of biochemical frequency data observed on a series of nine Jewish and six non-Jewish populations. Two categories of statistics are used: heterogeneity indices and various distance measures with respect to a standard. The latter are more discriminating in exploiting historical, geographical and culturally relevant information. A number of partial orderings and distance relationships among the populations are determined. Our concern in this study is to analyze similarities and differences among the Jewish populations, in terms of the gene frequency distributions for a number of genetic markers. Typical questions discussed are as follows: These Jewish populations differ in certain morphological and anthropometric traits. Are there corresponding differences in biochemical genetic constitution? How can we assess the extent of heterogeneity between and within groupings? Which class of markers (blood typings or protein loci) discriminates better among the separate populations? The results are quite surprising. For example, we found the Ashkenazi, Sephardi and Iraqi Jewish populations to be consistently close in genetic constitution and distant from all the other populations, namely the Yemenite and Cochin Jews, the Arabs, and the non-Jewish German and Russian populations. We found the Polish Jewish community the most heterogeneous among all Jewish populations. The blood loci discriminate better than the protein loci. A number of possible interpretations and hypotheses for these and other results are offered. The method devised for this analysis should prove useful in studying similarities and differences for other groups of populations for which substantial biochemical polymorphic data are available. PMID:380330

  4. Combining empirical approaches and error modelling to enhance predictive uncertainty estimation in extrapolation for operational flood forecasting. Tests on flood events on the Loire basin, France.

    NASA Astrophysics Data System (ADS)

    Berthet, Lionel; Marty, Renaud; Bourgin, François; Viatgé, Julie; Piotte, Olivier; Perrin, Charles

    2017-04-01

    An increasing number of operational flood forecasting centres assess the predictive uncertainty associated with their forecasts and communicate it to end users. This information can match the end users' needs (i.e. prove useful for efficient crisis management) only if it is reliable: reliability is therefore a key quality of operational flood forecasts. In 2015, the French flood forecasting national and regional services (Vigicrues network; www.vigicrues.gouv.fr) implemented a framework to compute quantitative discharge and water level forecasts and to assess the predictive uncertainty. Among the possible technical options to achieve this goal, a statistical analysis of past forecasting errors of deterministic models was selected (QUOIQUE method, Bourgin, 2014). It is a data-based and non-parametric approach based on as few assumptions as possible about the mathematical structure of the forecasting error. In particular, a very simple assumption is made regarding the predictive uncertainty distributions for large events outside the range of the calibration data: the multiplicative error distribution is assumed to be constant, whatever the magnitude of the flood. Indeed, the predictive distributions may not be reliable in extrapolation. However, estimating the predictive uncertainty for these rare events is crucial when major floods are of concern. In order to improve forecast reliability for major floods, an attempt is made to combine the operational strength of the empirical statistical analysis with simple error modelling. Since the heteroscedasticity of forecast errors can considerably weaken predictive reliability for large floods, this error modelling is based on the log-sinh transformation, which proved to reduce significantly the heteroscedasticity of the transformed error in a simulation context, even for flood peaks (Wang et al., 2012). Exploratory tests on some operational forecasts issued during the recent floods experienced in France (major spring floods in June 2016 on the Loire river tributaries and flash floods in fall 2016) will be shown and discussed. References: Bourgin, F. (2014). How to assess the predictive uncertainty in hydrological modelling? An exploratory work on a large sample of watersheds, AgroParisTech. Wang, Q. J., Shrestha, D. L., Robertson, D. E. and Pokhrel, P. (2012). A log-sinh transformation for data normalization and variance stabilization. Water Resources Research, W05514, doi:10.1029/2011WR010973.
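The log-sinh transformation of Wang et al. (2012) mentioned above has the form z = (1/b) log(sinh(a + b·y)). A sketch with arbitrary example parameter values (a and b would in practice be estimated from the data):

```python
# Sketch: the log-sinh variance-stabilizing transformation and its
# inverse. Parameter values a, b are arbitrary examples.
import numpy as np

def log_sinh(y, a, b):
    """Transform flows y; large values are compressed roughly linearly,
    small values roughly logarithmically, stabilizing error variance."""
    return np.log(np.sinh(a + b * y)) / b

def log_sinh_inverse(z, a, b):
    """Map a transformed value back to the flow scale."""
    return (np.arcsinh(np.exp(b * z)) - a) / b

flows = np.array([10.0, 100.0, 1000.0])
z = log_sinh(flows, a=0.1, b=0.01)
print("transformed:", np.round(z, 3))
print("round trip: ", np.round(log_sinh_inverse(z, a=0.1, b=0.01), 3))
```

Errors are modelled on the transformed scale, where they are closer to homoscedastic, and forecasts are mapped back through the inverse.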

  5. Wire Roughness Assessment of 0.016'' × 0.022'' Archwires for the Lingual Orthodontics Technique.

    PubMed

    Facchini, Fátima Mm; Filho, Mario Vedovello; Vedovello, Silvia As; Cotrim, Flávio A; Cotrim-Ferreira, Andréa; Tubel, Carlos Am

    2017-04-01

    To evaluate the difference in surface roughness of stainless steel archwires of different commercial brands used in lingual orthodontics. Precontoured arches measuring 0.016'' × 0.022'' were selected from the following brands: Tecnident, Adenta, G&H, Highland Metals Inc., Ormco, Incognito, and Ebraces. Quantitative evaluation of the surface roughness of the archwires was performed by means of an atomic force microscope in contact mode. Three surface readouts were taken of each sample, analyzing areas of 20 × 20 μm. Each scan of the samples produced a readout of 512 lines, generating three-dimensional images of the wires. The analysis of variance statistical test was applied to identify significant differences (p < 0.05), with H0 being rejected and H1 accepted. The Incognito brand showed the lowest surface roughness. The archwires of brands Adenta, Tecnident, Highland, and Ormco showed similar values among themselves, all close to those obtained by the Incognito brand. The archwires of the Ebraces brand showed the highest surface roughness, with values close to those of the G&H brand. There was a statistically significant difference in surface roughness of orthodontic archwires among the brands studied. Companies should pay attention to the quality control of their materials, as these may directly affect the quality of orthodontic treatment.

  6. Comparative evaluation of terminalia chebula extract mouthwash and chlorhexidine mouthwash on plaque and gingival inflammation - 4-week randomised control trial.

    PubMed

    Gupta, Devanand; Gupta, Rajendra Kumar; Bhaskar, Dara John; Gupta, Vipul

    2015-01-01

    The present study was conducted to assess the effectiveness of Terminalia chebula on plaque and gingival inflammation and compare it with the gold standard chlorhexidine (CHX 0.2%) and distilled water as control (placebo). A double-blind randomised controlled trial was conducted among undergraduate students who volunteered. They were randomly allocated into three study groups: 1) Terminalia chebula mouthwash (n = 30); 2) chlorhexidine (active control) (n = 30); 3) distilled water (placebo) (n = 30). Assessment was carried out according to plaque score and gingival score. Statistical analysis was carried out to compare the effect of both mouthwashes. ANOVA and post-hoc LSD tests were performed using SPSS version 17, with p ≤ 0.05 considered statistically significant. Our results showed that Terminalia chebula mouthrinse is as effective as chlorhexidine in reducing dental plaque and gingival inflammation. The results demonstrated a significant reduction of gingival bleeding and plaque indices in both groups over a period of 15 and 30 days as compared to the placebo. The results of the present study indicate that Terminalia chebula may prove to be an effective mouthwash. Terminalia chebula extract mouthrinse can be used as an alternative to chlorhexidine mouthrinse as it has similar properties without the side-effects of the latter.

  7. [Sanitation and racial inequality conditions in urban Brazil: an analysis focused on the indigenous population based on the 2010 Population Census].

    PubMed

    Raupp, Ludimila; Fávaro, Thatiana Regina; Cunha, Geraldo Marcelo; Santos, Ricardo Ventura

    2017-01-01

    The aims of this study were to analyze and describe the presence and infrastructure of basic sanitation in the urban areas of Brazil, contrasting indigenous with non-indigenous households. A cross-sectional study based on microdata from the 2010 Census was conducted. The analyses were based on descriptive statistics (prevalence) and the construction of multiple logistic regression models (adjusted for socioeconomic and demographic covariates). Odds ratios were estimated for the association between the explanatory variables (covariates) and the outcome variables (water supply, sewage, garbage collection, and adequate sanitation). The statistical significance level was set at 5%. Among the analyzed services, sewage proved to be the most precarious. Regarding race or color, indigenous households presented the lowest rate of sanitary infrastructure in urban Brazil. The adjusted regression showed that, in general, indigenous households were at a disadvantage when compared to other categories of race or color, especially in terms of the presence of garbage collection services. These inequalities were much more pronounced in the South and Southeast regions. The analyses of this study not only confirm the profile of poor conditions and infrastructure of basic sanitation in indigenous households in urban areas, but also demonstrate the persistence of inequalities associated with race or color in the country.
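Adjusted odds ratios of the kind reported above come from exponentiating the coefficients of a multiple logistic regression. A sketch on synthetic data (outcome: adequate sanitation; exposure: indigenous household; one standardized covariate standing in for the socioeconomic adjustments):

```python
# Sketch: adjusted odds ratios from a multiple logistic regression.
# The data-generating effects below are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 2000
indigenous = rng.integers(0, 2, n)
income = rng.normal(0, 1, n)  # standardized covariate, hypothetical

# Synthetic outcome: lower odds of sanitation for indigenous households
logit = 0.5 - 0.8 * indigenous + 0.6 * income
sanitation = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([indigenous, income])
model = LogisticRegression().fit(X, sanitation)
odds_ratios = np.exp(model.coef_.ravel())
print("adjusted ORs (indigenous, income):", np.round(odds_ratios, 2))
```

An OR below 1 for the indigenous indicator, after adjustment, is exactly the "disadvantage" pattern the abstract describes.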

  8. A 12-Day Course of Imiquimod 5% for the Treatment of Actinic Keratosis: Effectiveness and Local Reactions.

    PubMed

    Serra-Guillén, C; Nagore, E; Llombart, B; Sanmartín, O; Requena, C; Calomarde, L; Guillén, C

    2018-04-01

    Imiquimod is an excellent option for patients with actinic keratosis, although its use may be limited by the long course of treatment required (4 weeks) and the likelihood of local skin reactions. The objectives of the present study were to demonstrate the effectiveness of a 12-day course of imiquimod 5% for the treatment of actinic keratosis and to examine the association between treatment effectiveness and severity of local reactions. We included patients with at least 8 actinic keratoses treated with imiquimod 5% cream for 12 consecutive days. Local reactions were classified as mild, moderate, or severe. The statistical analysis of the association between local reactions and clinical response was based on the Pearson χ² test and the Spearman rank correlation test. Sixty-five patients completed the study. Complete response was recorded in 52.3% and partial response in 75.4%. We found a statistically significant association between severity of the local reaction and response to treatment in both the Pearson χ² test and the Spearman rank correlation test. A 12-day course of imiquimod 5% proved effective for the treatment of actinic keratosis. Severity of local reactions during treatment was correlated with clinical response. Copyright © 2017 AEDV. Published by Elsevier España, S.L.U. All rights reserved.
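    A sketch of the two association tests named above, run on a hypothetical severity x response table (the 65-patient total matches the study, but the cell counts are invented):

```python
# Pearson chi-squared on the contingency table, plus Spearman's rho on the
# underlying ordinal scores. Counts are illustrative only.
import numpy as np
from scipy import stats

# rows: mild / moderate / severe reaction; cols: no / partial / complete response
table = np.array([[8,  6,  2],
                  [4, 10,  8],
                  [1,  5, 21]])
chi2, p_chi2, dof, _ = stats.chi2_contingency(table)

# Expand the table into one (severity, response) pair per patient for Spearman
sev, resp = zip(*[(i, j) for i in range(3) for j in range(3)
                  for _ in range(table[i, j])])
rho, p_rho = stats.spearmanr(sev, resp)
print(f"chi2 = {chi2:.2f} (p = {p_chi2:.3g}); rho = {rho:.2f} (p = {p_rho:.3g})")
```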

  9. A study of two unsupervised data driven statistical methodologies for detecting and classifying damages in structural health monitoring

    NASA Astrophysics Data System (ADS)

    Tibaduiza, D.-A.; Torres-Arredondo, M.-A.; Mujica, L. E.; Rodellar, J.; Fritzen, C.-P.

    2013-12-01

    This article is concerned with the practical use of Multiway Principal Component Analysis (MPCA), the Discrete Wavelet Transform (DWT), Squared Prediction Error (SPE) measures and Self-Organizing Maps (SOM) to detect and classify damages in mechanical structures. The formalism is based on a distributed piezoelectric active sensor network for the excitation and detection of structural dynamic responses. Statistical models are built using PCA when the structure is known to be healthy, either directly from the dynamic responses or from wavelet coefficients at different scales representing time-frequency information. Different damages on the tested structures are simulated by adding masses at different positions. The data from the structure in different states (damaged or not) are then projected into the different principal component models for each actuator in order to obtain the input feature vectors for a SOM from the scores and the SPE measures. An aircraft fuselage from an Airbus A320 and a multi-layered carbon fiber reinforced plastic (CFRP) plate are used as examples to test the approaches. Results are presented, compared and discussed in order to determine their potential in structural health monitoring. These results showed that all the simulated damages were detectable and that the selected features proved capable of separating all damage conditions from the undamaged state for both approaches.
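    The SPE-based detection step can be illustrated in a few lines: fit a principal-component model on baseline ("healthy") feature vectors, then flag samples whose residual after projection exceeds a baseline percentile. The data here are simulated Gaussian features rather than sensor responses, and the SOM classification stage is omitted:

```python
# PCA baseline model + squared prediction error (SPE) damage indicator.
import numpy as np

rng = np.random.default_rng(1)
healthy = rng.normal(0, 1, (200, 20))                  # baseline feature vectors
damaged = healthy[:50] + rng.normal(0, 1.5, (50, 20))  # perturbed copies

mean = healthy.mean(axis=0)
_, _, Vt = np.linalg.svd(healthy - mean, full_matrices=False)
P = Vt[:5].T                                           # 5 retained components

def spe(x):
    """Residual sum of squares after projection onto the PCA subspace."""
    centered = x - mean
    resid = centered - centered @ P @ P.T
    return (resid ** 2).sum(axis=1)

threshold = np.percentile(spe(healthy), 99)            # baseline control limit
flagged = (spe(damaged) > threshold).mean()
print(f"fraction of damaged samples flagged: {flagged:.2f}")
```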

  10. Probabilistic Seasonal Forecasts to Deterministic Farm-Level Decisions: An Innovative Approach

    NASA Astrophysics Data System (ADS)

    Mwangi, M. W.

    2015-12-01

    Climate change and vulnerability are major challenges in ensuring household food security. Climate information services have the potential to cushion rural households from extreme climate risks. However, the probabilistic nature of climate information products is not easily understood by the majority of smallholder farmers. Despite this, climate information has proved to be a valuable climate risk adaptation strategy at the farm level. This calls for innovative ways to help farmers understand and apply climate information services to inform their farm-level decisions. The study endeavored to co-design and test appropriate innovation systems for the uptake and scale-up of climate information services necessary for achieving climate-resilient development. In addition, it determined the conditions necessary to support the effective performance of the proposed innovation system. Data and information sources included a systematic literature review, secondary sources, government statistics, focus group discussions, household surveys and semi-structured interviews. Data was analyzed using both quantitative and qualitative techniques. Quantitative data was analyzed using the Statistical Package for the Social Sciences (SPSS) software. Qualitative data was analyzed by establishing categories and themes, relationships/patterns and conclusions in line with the study objectives. Sustainable livelihoods, reduced household poverty and climate change resilience were the impacts that resulted from the study.

  11. Comparative statistical component analysis of transgenic, cyanophycin-producing potatoes in greenhouse and field trials.

    PubMed

    Schmidt, Kerstin; Schmidtke, Jörg; Mast, Yvonne; Waldvogel, Eva; Wohlleben, Wolfgang; Klemke, Friederike; Lockau, Wolfgang; Hausmann, Tina; Hühns, Maja; Broer, Inge

    2017-08-01

    Potatoes are a promising system for industrial production of the biopolymer cyanophycin as a second compound in addition to starch. To assess the efficiency in the field, we analysed the stability of the system, specifically its sensitivity to environmental factors. Field and greenhouse trials with transgenic potatoes (two independent events) were carried out for three years. The influence of environmental factors was measured and target compounds in the transgenic plants (cyanophycin, amino acids) were analysed for differences to control plants. Furthermore, non-target parameters (starch content, number, weight and size of tubers) were analysed for equivalence with control plants. The huge amount of data received was handled using modern statistical approaches to model the correlation between influencing environmental factors (year of cultivation, nitrogen fertilization, origin of plants, greenhouse or field cultivation) and key components (starch, amino acids, cyanophycin) and agronomic characteristics. General linear models were used for modelling, and standard effect sizes were applied to compare conventional and genetically modified plants. Altogether, the field trials prove that significant cyanophycin production is possible without reduction of starch content. Non-target compound composition seems to be equivalent under varying environmental conditions. Additionally, a quick test to measure cyanophycin content gives similar results compared to the extensive enzymatic test. This work facilitates the commercial cultivation of cyanophycin potatoes.
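    The "standard effect sizes" used above to compare conventional and genetically modified plants can be sketched with Cohen's d on a non-target trait; the starch values and the conventional |d| < 0.5 screen below are illustrative assumptions, not the trial's data or exact criterion:

```python
# Standardized effect size (Cohen's d) for a non-target trait, GM vs control.
# All values are invented, not the trial's measurements.
import numpy as np

rng = np.random.default_rng(2)
starch_gm   = rng.normal(14.0, 1.2, 40)   # hypothetical starch content, %
starch_ctrl = rng.normal(14.1, 1.2, 40)

def cohens_d(a, b):
    pooled_var = ((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1)) \
                 / (len(a) + len(b) - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

d = cohens_d(starch_gm, starch_ctrl)
print(f"Cohen's d = {d:.2f}")  # |d| below ~0.5 suggests no relevant difference
```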

  12. Soil genotoxicity assessment: a new strategy based on biomolecular tools and plant bioindicators.

    PubMed

    Citterio, Sandra; Aina, Roberta; Labra, Massimo; Ghiani, Alessandra; Fumagalli, Pietro; Sgorbati, Sergio; Santagostino, Angela

    2002-06-15

    The setting up of efficient early-warning systems is a research challenge for preventing environmental alteration and human disease. In this paper, we report the development and field application of a new biomonitoring methodology for assessing soil genotoxicity. In the first part, we explain the use of amplified fragment length polymorphism and flow cytometry techniques to detect DNA damage induced by soils artificially contaminated with heavy metals as potentially genotoxic compounds. Results show that the combination of the two techniques leads to efficient detection of the sublethal genotoxic effect induced in the plant bioindicator by contaminated soil. By contrast, the classic vegetative endpoints of mortality and root and shoot growth prove inappropriate for assessing soil genotoxicity, because some heavy metals, although they cause genotoxic damage, do not negatively affect sentinel plant development. The statistical elaboration of the data led to a predictive model which differentiates four levels of soil genotoxic pollution and can be applied elsewhere. The second part deals with the application of the biomonitoring protocol to the genotoxic assessment of two areas surrounding a steelworks in northern Italy, demonstrating the effectiveness of this methodology. In these areas, the predictive model reveals a pollution level strictly correlated with the heavy metal concentrations found by traditional chemical analysis.

  13. A la Recherche du Temps Perdu: extracting temporal relations from medical text in the 2012 i2b2 NLP challenge.

    PubMed

    Cherry, Colin; Zhu, Xiaodan; Martin, Joel; de Bruijn, Berry

    2013-01-01

    An analysis of the timing of events is critical for a deeper understanding of the course of events within a patient record. The 2012 i2b2 NLP challenge focused on the extraction of temporal relationships between concepts within textual hospital discharge summaries. The team from the National Research Council Canada (NRC) submitted three system runs to the second track of the challenge: typifying the time-relationship between pre-annotated entities. The NRC system was designed around four specialist modules containing statistical machine learning classifiers. Each specialist targeted distinct sets of relationships: local relationships, 'sectime'-type relationships, non-local overlap-type relationships, and non-local causal relationships. The best NRC submission achieved a precision of 0.7499, a recall of 0.6431, and an F1 score of 0.6924, resulting in a statistical tie for first place. Post hoc improvements led to a precision of 0.7537, a recall of 0.6455, and an F1 score of 0.6954, giving the highest scores reported on this task to date. Methods for general relation extraction extended well to temporal relations, and gave top-ranked state-of-the-art results. Careful ordering of predictions within result sets proved critical to this success.
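    As an arithmetic check, F1 is the harmonic mean of precision and recall, and the quoted scores are internally consistent:

```python
# Recomputing F1 from the reported precision/recall pairs.
for name, p, r in [("best submission", 0.7499, 0.6431),
                   ("post hoc", 0.7537, 0.6455)]:
    f1 = 2 * p * r / (p + r)
    print(f"{name}: F1 = {f1:.4f}")   # matches the reported 0.6924 and 0.6954
```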

  14. Statistical properties of the anomalous scaling exponent estimator based on time-averaged mean-square displacement

    NASA Astrophysics Data System (ADS)

    Sikora, Grzegorz; Teuerle, Marek; Wyłomańska, Agnieszka; Grebenkov, Denis

    2017-08-01

    The most common way of estimating the anomalous scaling exponent from single-particle trajectories consists of a linear fit of the dependence of the time-averaged mean-square displacement on the lag time at the log-log scale. We investigate the statistical properties of this estimator in the case of fractional Brownian motion (FBM). We determine the mean value, the variance, and the distribution of the estimator. Our theoretical results are confirmed by Monte Carlo simulations. In the limit of long trajectories, the estimator is shown to be asymptotically unbiased, consistent, and with vanishing variance. These properties ensure an accurate estimation of the scaling exponent even from a single (long enough) trajectory. As a consequence, we prove that the usual way to estimate the diffusion exponent of FBM is correct from the statistical point of view. Moreover, the knowledge of the estimator distribution is the first step toward new statistical tests of FBM and toward a more reliable interpretation of the experimental histograms of scaling exponents in microbiology.
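    The estimator under study can be reproduced in a few lines. For simplicity the sketch below uses ordinary Brownian motion (FBM with H = 0.5, true exponent alpha = 1) rather than general FBM:

```python
# Time-averaged MSD of a single trajectory, fit by least squares on log-log scale.
import numpy as np

rng = np.random.default_rng(3)
x = np.cumsum(rng.normal(0, 1, 100_000))   # single 1-D Brownian trajectory

lags = np.arange(1, 101)
tamsd = np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])

# Slope of log(TA-MSD) vs log(lag) estimates the anomalous exponent alpha
alpha, log_pref = np.polyfit(np.log(lags), np.log(tamsd), 1)
print(f"estimated scaling exponent: {alpha:.3f}")
```

    For a long trajectory the fitted slope lands close to the true exponent, consistent with the asymptotic unbiasedness and vanishing variance the paper proves.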

  15. Evaluation of depleted uranium in the environment at Aberdeen Proving Grounds, Maryland and Yuma Proving Grounds, Arizona. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kennedy, P.L.; Clements, W.H.; Myers, O.B.

    1995-01-01

    This report presents an evaluation of depleted uranium (DU) introduced into the environment at the Aberdeen Proving Grounds (APG), Maryland and Yuma Proving Grounds (YPG), Arizona. This was a cooperative project between the Environmental Sciences and Statistical Analyses Groups at LANL and the Department of Fishery and Wildlife Biology at Colorado State University. The project represents a unique approach to assessing the environmental impact of DU in two dissimilar ecosystems. Ecological exposure models were created for each ecosystem, and sensitivity/uncertainty analyses were conducted to identify the exposure pathways most influential in the fate and transport of DU in the environment. Research included field sampling, field exposure experiments, and laboratory experiments. The first section addresses DU at the APG site. Chapter topics include: bioenergetics-based food web model; field exposure experiments; bioconcentration by phytoplankton and the toxicity of U to zooplankton; physical processes governing the desorption of uranium from sediment to water; transfer of uranium from sediment to benthic invertebrates; speed of adsorption by benthic invertebrates; uptake of uranium by fish. The final section of the report addresses DU at the YPG site. Chapters include the following information: DU transport processes and pathway model; field studies of performance of exposure model; uptake and elimination rates for kangaroo rats; chemical toxicity in kangaroo rat kidneys.

  16. Evaluation of Flow-Injection Tandem Mass Spectrometry for Rapid and High-Throughput Quantitative Determination of B-Vitamins in Nutritional Supplements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhandari, Deepak; Van Berkel, Gary J

    2012-01-01

    The use of flow-injection electrospray ionization tandem mass spectrometry for rapid and high-throughput mass spectral analysis of selected B-vitamins, viz. B1, B2, B3, B5, and B6, in nutritional formulations was demonstrated. A simple and rapid (~5 min) in-tube sample preparation was performed by adding extraction solvent to a powdered sample aliquot followed by agitation, centrifugation, and filtration to recover an extract for analysis. Automated flow injection introduced 1 μL of the extracts directly into the mass spectrometer ion source without chromatographic separation. Sample-to-sample analysis time was 60 s, representing a significant improvement over conventional liquid chromatography approaches, which typically require 25-45 min and often more extensive sample preparation procedures. The quantitative capabilities of the flow-injection analysis were tested using the method of standard additions and the NIST standard reference material (SRM 3280) multivitamin/multielement tablets. The quantity determined for each B-vitamin in SRM 3280 was within the statistical range provided for the respective certified values. The same sample preparation and analysis approach was also applied to two different commercial vitamin supplement tablets and proved successful in the quantification of the selected B-vitamins, as evidenced by agreement with the label values and with the results obtained using isotope dilution liquid chromatography/mass spectrometry.
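    The method of standard additions used for quantitation works by spiking known analyte amounts into the sample, fitting signal versus added concentration, and reading the unknown concentration off the x-intercept. A minimal sketch with invented numbers:

```python
# Standard additions: unknown concentration = intercept / slope of the fit.
import numpy as np

added  = np.array([0.0, 5.0, 10.0, 20.0])     # spiked analyte, ug/mL (hypothetical)
signal = np.array([42.0, 63.0, 84.0, 126.0])  # instrument response (hypothetical)

slope, intercept = np.polyfit(added, signal, 1)
conc = intercept / slope                      # magnitude of the x-intercept
print(f"estimated concentration: {conc:.1f} ug/mL")
```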

  17. Fast analysis of wood preservers using laser induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Uhl, A.; Loebe, K.; Kreuchwig, L.

    2001-06-01

    Laser-induced breakdown spectroscopy (LIBS) is used for the investigation of wood preservers in timber and in furniture. Both laboratory experiments and practical applications in recycling facilities and on a building site prove the new possibilities for the fast detection of harmful agents in wood. A commercial system was developed for mobile laser-plasma analysis as well as for industrial use in sorting plants. The universal measuring principle in combination with Echelle optics permits truly simultaneous multi-element analysis in the range of 200-780 nm with a resolution of a few picometers. It enables the user to detect main and trace elements in wood within a few seconds, nearly independent of the matrix, given that different kinds of wood show an equal elemental composition. Sample preparation is not required. The quantitative analysis of inorganic wood preservers (containing, e.g. Cu, Cr, B, As, Pb, Hg) has been performed accurately using carbon as the reference element. It can be shown that the detection limits for heavy metals in wood are in the ppm range. Additional information is given concerning the quantitative analysis. Statistical data, e.g. the standard deviation (S.D.), were determined and calibration curves were used for each particular element. A comparison between ICP-AES and LIBS is given using depth-profile correction factors regarding the different penetration depths, with respect to the different volumes of wood analyzed by the two analytical methods.

  18. Outperforming whom? A multilevel study of performance-prove goal orientation, performance, and the moderating role of shared team identification.

    PubMed

    Dietz, Bart; van Knippenberg, Daan; Hirst, Giles; Restubog, Simon Lloyd D

    2015-11-01

    Performance-prove goal orientation affects performance because it drives people to try to outperform others. A proper understanding of the performance-motivating potential of performance-prove goal orientation requires, however, that we consider the question of whom people desire to outperform. In a multilevel analysis of this issue, we propose that the shared team identification of a team plays an important moderating role here, directing the performance-motivating influence of performance-prove goal orientation to either the team level or the individual level of performance. A multilevel study of salespeople nested in teams supports this proposition, showing that performance-prove goal orientation motivates team performance more with higher shared team identification, whereas performance-prove goal orientation motivates individual performance more with lower shared team identification. Establishing the robustness of these findings, a second study replicates them with individual and team performance in an educational context. (c) 2015 APA, all rights reserved.

  19. Logical errors on proving theorem

    NASA Astrophysics Data System (ADS)

    Sari, C. K.; Waluyo, M.; Ainur, C. M.; Darmaningsih, E. N.

    2018-01-01

    At the tertiary level, students in mathematics education departments attend abstract courses such as Introduction to Real Analysis, which requires the ability to prove mathematical statements almost all the time. In fact, many students have not mastered this ability. In their Introduction to Real Analysis tests, even though they completed their proofs of theorems, they achieved unsatisfactory scores. They thought that they had succeeded, but their proofs were not valid. In this study, qualitative research was conducted to describe the logical errors that students made in proving a theorem about cluster points. The theorem was given to 54 students. Misconceptions seem to occur in understanding the definitions of cluster point, limit of a function, and limit of a sequence. The habit of using routine symbols might cause these misconceptions. Suggestions for dealing with this condition are described as well.

  20. Parameterisation of non-homogeneities in buried object detection by means of thermography

    NASA Astrophysics Data System (ADS)

    Stepanić, Josip; Malinovec, Marina; Švaić, Srećko; Krstelj, Vjera

    2004-05-01

    Landmines and their natural environment form a system of complex dynamics with variable characteristics. A manifestation of that complexity within the context of thermography-based landmine detection is excessive noise in thermograms, which has severely limited the application of thermography to landmine detection for the purposes of humanitarian demining. (To be differentiated from military demining and demining for military operations other than war [Land Mine Detection DOD's Research Program Needs a Comprehensive Evaluation Strategy, US GAO Report, GAO-01 239, 2001; International Mine Action Standards, Chapter 4.--Glossary. Available at: < http://www.mineactionstandards.org/IMAS_archive/Final/04.10.pdf>].) The discrepancy between the existing role and the actual potential of thermography in humanitarian demining motivated a systematic approach to the sources of noise in thermograms of buried objects. These sources are variations in mine orientation relative to the soil normal, which modify the shape of the mine signature on thermograms, as well as non-homogeneities in the soil and vegetation layer above the mine, which degrade the overall quality of thermograms. This paper analyses the influence of variations in mine orientation and, more generally, in the orientation of axially symmetric buried objects, on the quality of the object's signature on thermograms. The following two angles serve as parameters describing variation in orientation: (i) θ, the angle between the local vertical axis and the mine symmetry axis, and (ii) ψ, the angle between the local vertical axis and the soil surface normal. Their influence is compared to that of (iii) d, the change in object depth, which serves as a control parameter. The influences are quantified and ranked within a statistically planned experiment. The analysis proved that, among the parameters listed, the most influential is the statistical interaction dψ, followed by the statistical interaction dθ. According to statistical tests, these two combinations are the most significant influences. The results show that the currently applied analysis of thermography in humanitarian demining must be broadened to include variations in mine orientation; otherwise a decrease in the probability of mine detection occurs, due to the presence of a systematic error.

  1. Out-of-time-order fluctuation-dissipation theorem

    NASA Astrophysics Data System (ADS)

    Tsuji, Naoto; Shitara, Tomohiro; Ueda, Masahito

    2018-01-01

    We prove a generalized fluctuation-dissipation theorem for a certain class of out-of-time-ordered correlators (OTOCs) with a modified statistical average, which we call bipartite OTOCs, for general quantum systems in thermal equilibrium. The difference between the bipartite and physical OTOCs defined by the usual statistical average is quantified by a measure of quantum fluctuations known as the Wigner-Yanase skew information. Within this difference, the theorem describes a universal relation between chaotic behavior in quantum systems and a nonlinear-response function that involves a time-reversed process. We show that the theorem can be generalized to higher-order n -partite OTOCs as well as in the form of generalized covariance.

  2. Genome-Wide Identification and Evaluation of Reference Genes for Quantitative RT-PCR Analysis during Tomato Fruit Development.

    PubMed

    Cheng, Yuan; Bian, Wuying; Pang, Xin; Yu, Jiahong; Ahammed, Golam J; Zhou, Guozhi; Wang, Rongqing; Ruan, Meiying; Li, Zhimiao; Ye, Qingjing; Yao, Zhuping; Yang, Yuejian; Wan, Hongjian

    2017-01-01

    Gene expression analysis in tomato fruit has drawn increasing attention. Quantitative real-time PCR (qPCR) is a routine technique for gene expression analysis. In qPCR operation, the reliability of results largely depends on the choice of appropriate reference genes (RGs). Although tomato is a model for fruit biology, few RGs for qPCR analysis in tomato fruit have yet been developed. In this study, we initially identified the 38 most stably expressed genes based on a tomato transcriptome data set, and their expression stabilities were further determined in a set of tomato fruit samples from four developmental stages (immature, mature green, breaker, mature red) using qPCR analysis. Two statistical algorithms, geNorm and Normfinder, concordantly determined the suitability of these putative RGs. Notably, SlFRG05 (Solyc01g104170), SlFRG12 (Solyc04g009770), SlFRG16 (Solyc10g081190), SlFRG27 (Solyc06g007510), and SlFRG37 (Solyc11g005330) proved to be suitable RGs for tomato fruit development studies. Further analysis using geNorm indicates that the combined use of SlFRG03 (Solyc02g063070) and SlFRG27 would provide more reliable normalization in qPCR experiments. The RGs identified in this study will be beneficial for future qPCR analyses of tomato fruit development, as well as for the identification of optimal normalization controls in other plant species.
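    The geNorm algorithm mentioned above ranks candidates by a stability measure M: for each gene, the average standard deviation of its log2 expression ratios against every other candidate (lower M = more stable). A self-contained sketch on simulated expression values:

```python
# geNorm-style stability measure M for candidate reference genes.
# Expression values are simulated, not the study's qPCR data.
import numpy as np

rng = np.random.default_rng(4)
n_samples = 12
expr = {
    "stable_1": 100 * 2 ** rng.normal(0, 0.1, n_samples),
    "stable_2": 500 * 2 ** rng.normal(0, 0.1, n_samples),
    "unstable": 200 * 2 ** rng.normal(0, 1.0, n_samples),
}

def genorm_m(gene):
    """Mean SD of pairwise log2 ratios against all other candidates."""
    return np.mean([np.std(np.log2(expr[gene] / expr[other]), ddof=1)
                    for other in expr if other != gene])

for g in expr:
    print(f"{g}: M = {genorm_m(g):.2f}")
```

    The constant-level genes come out with the lowest M, which is exactly how stable RG pairs such as SlFRG03/SlFRG27 would be selected.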

  3. Genome-Wide Identification and Evaluation of Reference Genes for Quantitative RT-PCR Analysis during Tomato Fruit Development

    PubMed Central

    Cheng, Yuan; Bian, Wuying; Pang, Xin; Yu, Jiahong; Ahammed, Golam J.; Zhou, Guozhi; Wang, Rongqing; Ruan, Meiying; Li, Zhimiao; Ye, Qingjing; Yao, Zhuping; Yang, Yuejian; Wan, Hongjian

    2017-01-01

    Gene expression analysis in tomato fruit has drawn increasing attention. Quantitative real-time PCR (qPCR) is a routine technique for gene expression analysis. In qPCR operation, the reliability of results largely depends on the choice of appropriate reference genes (RGs). Although tomato is a model for fruit biology, few RGs for qPCR analysis in tomato fruit have yet been developed. In this study, we initially identified the 38 most stably expressed genes based on a tomato transcriptome data set, and their expression stabilities were further determined in a set of tomato fruit samples from four developmental stages (immature, mature green, breaker, mature red) using qPCR analysis. Two statistical algorithms, geNorm and Normfinder, concordantly determined the suitability of these putative RGs. Notably, SlFRG05 (Solyc01g104170), SlFRG12 (Solyc04g009770), SlFRG16 (Solyc10g081190), SlFRG27 (Solyc06g007510), and SlFRG37 (Solyc11g005330) proved to be suitable RGs for tomato fruit development studies. Further analysis using geNorm indicates that the combined use of SlFRG03 (Solyc02g063070) and SlFRG27 would provide more reliable normalization in qPCR experiments. The RGs identified in this study will be beneficial for future qPCR analyses of tomato fruit development, as well as for the identification of optimal normalization controls in other plant species. PMID:28900431

  4. GECKO: a complete large-scale gene expression analysis platform.

    PubMed

    Theilhaber, Joachim; Ulyanov, Anatoly; Malanthara, Anish; Cole, Jack; Xu, Dapeng; Nahf, Robert; Heuer, Michael; Brockel, Christoph; Bushnell, Steven

    2004-12-10

    Gecko (Gene Expression: Computation and Knowledge Organization) is a complete, high-capacity centralized gene expression analysis system, developed in response to the needs of a distributed user community. Based on a client-server architecture, with a centralized repository of typically many tens of thousands of Affymetrix scans, Gecko includes automatic processing pipelines for uploading data from remote sites, a data base, a computational engine implementing approximately 50 different analysis tools, and a client application. Among available analysis tools are clustering methods, principal component analysis, supervised classification including feature selection and cross-validation, multi-factorial ANOVA, statistical contrast calculations, and various post-processing tools for extracting data at given error rates or significance levels. On account of its open architecture, Gecko also allows for the integration of new algorithms. The Gecko framework is very general: non-Affymetrix and non-gene expression data can be analyzed as well. A unique feature of the Gecko architecture is the concept of the Analysis Tree (actually, a directed acyclic graph), in which all successive results in ongoing analyses are saved. This approach has proven invaluable in allowing a large (approximately 100 users) and distributed community to share results, and to repeatedly return over a span of years to older and potentially very complex analyses of gene expression data. The Gecko system is being made publicly available as free software http://sourceforge.net/projects/geckoe. In totality or in parts, the Gecko framework should prove useful to users and system developers with a broad range of analysis needs.

  5. Identifying when tagged fishes have been consumed by piscivorous predators: application of multivariate mixture models to movement parameters of telemetered fishes

    USGS Publications Warehouse

    Romine, Jason G.; Perry, Russell W.; Johnston, Samuel V.; Fitzer, Christopher W.; Pagliughi, Stephen W.; Blake, Aaron R.

    2013-01-01

    Mixture models proved valuable as a means to differentiate between salmonid smolts and predators that consumed salmonid smolts. However, successful application of this method requires that telemetered fishes and their predators exhibit measurable differences in movement behavior. Our approach is flexible, allows inclusion of multiple track statistics and improves upon rule-based manual classification methods.
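    The core idea, classifying tracks by fitting a mixture model to movement statistics, can be sketched with a two-component Gaussian mixture on a single synthetic "speed" feature; the actual analysis is multivariate and uses telemetry-derived track parameters:

```python
# Two-component Gaussian mixture separating smolt-like from predator-like tracks.
# Feature values are invented for illustration.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
smolt_speed    = rng.normal(0.4, 0.1, 200)   # hypothetical movement statistic
predator_speed = rng.normal(1.2, 0.2, 60)
speeds = np.concatenate([smolt_speed, predator_speed]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(speeds)
labels = gmm.predict(speeds)

# The component with the larger mean is taken as "predator-like"
predator_comp = int(np.argmax(gmm.means_.ravel()))
n_predator = int((labels == predator_comp).sum())
print(f"tracks classified as predator-like: {n_predator} of {len(speeds)}")
```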

  6. Comparison of traditional and sensor-based electronic stethoscopes in beagle dogs.

    PubMed

    Szilvási, Viktória; Vörös, Károly; Manczur, Ferenc; Reiczigel, Jenő; Novák, István; Máthé, Akos; Fekete, Dániel

    2013-03-01

    The objective of this study was to compare the auscultatory findings obtained with traditional and electronic sensor-based stethoscopes. Thirty-three adult healthy Beagles (20 females, 13 males, mean age 4.8 years, range 1.4-8 years) were auscultated independently by four investigators with different levels of experience (INVEST-1, -2, -3 and -4) using both stethoscopes. Final cardiological diagnoses were established by echocardiography. Mitral murmurs were heard with both stethoscopes by all investigators, and echocardiography revealed mild mitral valve insufficiency in 7 dogs (21%, 4 females, 3 males). Averaged over the four examiners, the statistical sensitivity (Se) in recognising cardiac murmurs proved to be 82% with the traditional stethoscope and 75% with the electronic one, whilst the statistical specificity (Sp) was 99% with the traditional and 100% with the electronic stethoscope. The mean auscultatory sensitivity differences between the two stethoscopes were 0.36 on the left and 0.59 on the right hemithorax, an advantage for the electronic stethoscope that was more obvious over the right hemithorax (P = 0.0340). The electronic stethoscope proved superior to the traditional one in excluding cardiac murmurs, especially on auscultation over the right hemithorax. Mitral valve disease was relatively common in this clinically healthy research Beagle population.
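    The two screening statistics reported (Se and Sp) follow directly from a confusion table of murmur detection against the echocardiographic diagnosis; the counts below are illustrative, chosen only to match the study's 7-of-33 prevalence:

```python
# Sensitivity and specificity from hypothetical auscultation counts.
tp, fn = 6, 1     # murmur heard / missed among the 7 affected dogs
tn, fp = 26, 0    # correct negatives / false alarms among 26 healthy dogs

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"Se = {sensitivity:.0%}, Sp = {specificity:.0%}")
```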

  7. Context-invariant quasi hidden variable (qHV) modelling of all joint von Neumann measurements for an arbitrary Hilbert space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loubenets, Elena R.

    We prove the existence for each Hilbert space of two new quasi hidden variable (qHV) models, statistically noncontextual and context-invariant, reproducing all the von Neumann joint probabilities via non-negative values of real-valued measures and all the quantum product expectations via the qHV (classical-like) average of the product of the corresponding random variables. In a context-invariant model, a quantum observable X can be represented by a variety of random variables satisfying the functional condition required in quantum foundations, but each of these random variables equivalently models X under all joint von Neumann measurements, regardless of their contexts. The proved existence of this model negates the general opinion that, in terms of random variables, the Hilbert space description of all the joint von Neumann measurements for dim H ≥ 3 can be reproduced only contextually. The existence of a statistically noncontextual qHV model, in particular, implies that every N-partite quantum state admits a local quasi hidden variable model introduced in Loubenets [J. Math. Phys. 53, 022201 (2012)]. The new results of the present paper point also to the generality of the quasi-classical probability model proposed in Loubenets [J. Phys. A: Math. Theor. 45, 185306 (2012)].

  8. Reduction of interferences in graphite furnace atomic absorption spectrometry by multiple linear regression modelling

    NASA Astrophysics Data System (ADS)

    Grotti, Marco; Abelmoschi, Maria Luisa; Soggia, Francesco; Tiberiade, Christian; Frache, Roberto

    2000-12-01

    The multivariate effects of Na, K, Mg and Ca as nitrates on the electrothermal atomisation of manganese, cadmium and iron were studied by multiple linear regression modelling. Since the models proved to efficiently predict the effects of the considered matrix elements over a wide range of concentrations, they were applied to correct the interferences occurring in the determination of trace elements in seawater after pre-concentration of the analytes. In order to obtain a statistically significant number of samples, a large volume of the certified seawater reference materials CASS-3 and NASS-3 was treated with Chelex-100 resin; the chelating resin was then separated from the solution and divided into several sub-samples, each of which was eluted with nitric acid and analysed by electrothermal atomic absorption spectrometry (for trace element determinations) and inductively coupled plasma optical emission spectrometry (for matrix element determinations). To minimise any systematic error other than that due to matrix effects, the accuracy of the pre-concentration step and the contamination levels of the procedure were checked by inductively coupled plasma mass spectrometric measurements. Analytical results obtained by applying the multiple linear regression models were compared with those obtained with other calibration methods, such as external calibration using acid-based standards, external calibration using matrix-matched standards and the analyte addition technique. Empirical models proved to efficiently reduce interferences occurring in the analysis of real samples, yielding better accuracy than the other calibration methods.
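
    The correction scheme described above can be sketched as a multiple linear regression in which an analyte's recovery is modelled from the matrix-element concentrations and then divided out of the raw signal. All concentrations, recoveries and coefficients below are invented for illustration, not taken from the study:

    ```python
    import numpy as np

    # Hypothetical calibration data: analyte recovery (fraction of the true
    # signal) observed at known matrix-element concentrations (mg/L).
    # Columns: Na, K, Mg, Ca.
    X = np.array([
        [0,    0,   0,   0],
        [500,  0,   0,   0],
        [0,  200,   0,   0],
        [0,    0, 100,   0],
        [0,    0,   0, 150],
        [500, 200, 100, 150],
    ], dtype=float)
    recovery = np.array([1.00, 0.90, 0.95, 0.92, 0.94, 0.72])

    # Multiple linear regression: recovery ~ b0 + b . concentrations
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, recovery, rcond=None)

    def corrected_signal(raw_signal, matrix_conc):
        """Divide the raw signal by the recovery predicted for this matrix."""
        predicted_recovery = coef[0] + matrix_conc @ coef[1:]
        return raw_signal / predicted_recovery

    # A sample measured in a matrix of 250 mg/L Na and 75 mg/L Ca:
    print(round(corrected_signal(0.85, np.array([250, 0, 0, 75])), 3))
    ```
    
    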

  9. Multiplicative Multitask Feature Learning

    PubMed Central

    Wang, Xin; Bi, Jinbo; Yu, Shipeng; Sun, Jiangwen; Song, Minghu

    2016-01-01

    We investigate a general framework of multiplicative multitask feature learning which decomposes each individual task’s model parameters into a multiplication of two components. One of the components is used across all tasks and the other component is task-specific. Several previous methods can be proved to be special cases of our framework. We study the theoretical properties of this framework when different regularization conditions are applied to the two decomposed components. We prove that this framework is mathematically equivalent to the widely used multitask feature learning methods that are based on a joint regularization of all model parameters, but with a more general form of regularizers. Further, an analytical formula is derived for the across-task component as related to the task-specific component for all these regularizers, leading to a better understanding of the shrinkage effects of different regularizers. Study of this framework motivates new multitask learning algorithms. We propose two new learning formulations by varying the parameters in the proposed framework. An efficient blockwise coordinate descent algorithm, suitable for solving the entire family of formulations, is developed with a rigorous convergence analysis. Simulation studies have identified the statistical properties of data that favor the new formulations. Extensive empirical studies on various classification and regression benchmark data sets have revealed the relative advantages of the two new formulations by comparing with the state of the art, which provides instructive insights into the feature learning problem with multiple tasks. PMID:28428735
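
    A minimal sketch of the multiplicative decomposition described above, w_t = c * v_t with a shared component c and task-specific components v_t, fitted by alternating blockwise (ridge) updates on synthetic data. The toy dimensions, ridge penalty and data generation are illustrative assumptions, not the paper's formulations or regularizers:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Two toy regression tasks sharing a common sparse feature pattern.
    d, n = 10, 50
    c_true = np.zeros(d)
    c_true[:3] = 1.0                         # features shared by all tasks
    tasks = []
    for _ in range(2):
        w = c_true * rng.normal(size=d)      # task weights = shared * specific
        X = rng.normal(size=(n, d))
        y = X @ w + 0.01 * rng.normal(size=n)
        tasks.append((X, y))

    # Blockwise coordinate descent on w_t = c * v_t with ridge penalties
    # on both components (each block update is an exact ridge solve).
    lam = 0.1
    c = np.ones(d)
    V = [np.zeros(d) for _ in tasks]
    for _ in range(50):
        # Update each task-specific v_t with c fixed.
        for t, (X, y) in enumerate(tasks):
            Xc = X * c
            V[t] = np.linalg.solve(Xc.T @ Xc + lam * np.eye(d), Xc.T @ y)
        # Update the shared c with all v_t fixed.
        G = sum((X * V[t]).T @ (X * V[t]) for t, (X, _) in enumerate(tasks))
        b = sum((X * V[t]).T @ y for t, (X, y) in enumerate(tasks))
        c = np.linalg.solve(G + lam * np.eye(d), b)

    w_est = [c * v for v in V]               # recovered per-task weights
    ```

    Each block update minimizes the joint objective exactly in one component, so the objective decreases monotonically, which is the usual argument behind convergence of such alternating schemes.
    
    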

  10. Microplate technique for determining accumulation of metals by algae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hassett, J.M.; Jennett, J.C.; Smith, J.E.

    1981-05-01

    A microplate technique was developed to determine the conditions under which pure cultures of algae removed heavy metals from aqueous solutions. Variables investigated included algal species and strain, culture age (11 and 44 days), metal (mercury, lead, cadmium, and zinc), pH, effects of different buffer solutions, and time of exposure. Plastic, U-bottomed microtiter plates were used in conjunction with heavy metal radionuclides to determine concentration factors for metal-alga combinations. The technique developed was rapid, statistically reliable, and economical of materials and cells. All species of algae studied removed mercury from solution. Green algae proved better at accumulating cadmium than didmore » blue-green algae. No alga studied removed zinc, perhaps because cells were maintained in the dark during the labeling period. Chlamydomonas sp. proved superior in ability to remove lead from solution.« less

  11. Scan Order in Gibbs Sampling: Models in Which it Matters and Bounds on How Much.

    PubMed

    He, Bryan; De Sa, Christopher; Mitliagkas, Ioannis; Ré, Christopher

    2016-01-01

    Gibbs sampling is a Markov Chain Monte Carlo sampling technique that iteratively samples variables from their conditional distributions. There are two common scan orders for the variables: random scan and systematic scan. Due to the benefits of locality in hardware, systematic scan is commonly used, even though most statistical guarantees are only for random scan. While it has been conjectured that the mixing times of random scan and systematic scan do not differ by more than a logarithmic factor, we show by counterexample that this is not the case, and we prove that the mixing times do not differ by more than a polynomial factor under mild conditions. To prove these relative bounds, we introduce a method of augmenting the state space to study systematic scan using conductance.
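
    The two scan orders can be illustrated on a toy two-spin model, where both samplers should recover the same stationary probability; the model and its coupling strength are illustrative assumptions, not the counterexample constructed in the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Two coupled binary spins with joint p(x1, x2) proportional to
    # exp(J * x1 * x2), where x1, x2 are in {-1, +1}.
    J = 0.8

    def cond_sample(other):
        # p(x_i = +1 | x_other) = sigmoid(2 * J * x_other)
        p = 1.0 / (1.0 + np.exp(-2 * J * other))
        return 1 if rng.random() < p else -1

    def gibbs(n_sweeps, systematic):
        """Estimate P(x1 == x2) with systematic or random scan."""
        x = np.array([1, 1])
        agree = 0
        for _ in range(n_sweeps):
            # Systematic scan: fixed order. Random scan: uniform picks.
            order = [0, 1] if systematic else rng.integers(0, 2, size=2)
            for i in order:
                x[i] = cond_sample(x[1 - i])
            agree += x[0] == x[1]
        return agree / n_sweeps

    # Exact stationary value: P(x1 == x2) = e^J / (e^J + e^-J)
    exact = np.exp(J) / (np.exp(J) + np.exp(-J))
    sys_est = gibbs(100_000, systematic=True)
    rand_est = gibbs(100_000, systematic=False)
    ```
    
    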

  12. Scan Order in Gibbs Sampling: Models in Which it Matters and Bounds on How Much

    PubMed Central

    He, Bryan; De Sa, Christopher; Mitliagkas, Ioannis; Ré, Christopher

    2016-01-01

    Gibbs sampling is a Markov Chain Monte Carlo sampling technique that iteratively samples variables from their conditional distributions. There are two common scan orders for the variables: random scan and systematic scan. Due to the benefits of locality in hardware, systematic scan is commonly used, even though most statistical guarantees are only for random scan. While it has been conjectured that the mixing times of random scan and systematic scan do not differ by more than a logarithmic factor, we show by counterexample that this is not the case, and we prove that the mixing times do not differ by more than a polynomial factor under mild conditions. To prove these relative bounds, we introduce a method of augmenting the state space to study systematic scan using conductance. PMID:28344429

  13. Binding Site and Potency Prediction of Teixobactin and other Lipid II Ligands by Statistical Base Scoring of Conformational Space Maps.

    PubMed

    Lungu, Claudiu N; Diudea, Mircea V

    2018-01-01

    Lipid II, a peptidoglycan precursor, plays a key role in bacterial cell wall synthesis. It has both hydrophilic and lipophilic properties. The molecule translocates across the bacterial membrane to deliver and incorporate disaccharide-pentapeptide "building blocks" into the peptidoglycan wall. Lipid II is a valid antibiotic target. A receptor binding pocket may be occupied by a ligand in various plausible conformations, among which only a few are energetically related to a biological activity in the physiological efficiency domain. This paper reports the mapping of the conformational space of Lipid II in its interaction with Teixobactin and other Lipid II ligands. In order to study the complex between Lipid II and its ligands computationally, a docking study was first carried out, with the docking site retrieved from the literature. After docking, 5 ligand conformations and a further 5 complexes (denoted 00 to 04) for each molecule were taken into account. For each structure, conformational studies were performed. Statistical analysis, conformational analysis and molecular dynamics based clustering were used to predict the potency of these compounds. A score for potency prediction was developed. Applying classification according to Lipid II conformational energy, a conformation of Teixobactin proved to be energetically favorable, followed by Oritavancin, Dalbavancin, Telavancin, Teicoplanin and Vancomycin, respectively. Scoring of molecules according to cluster band and PCA produced the same result. Molecules classified according to standard deviations showed Dalbavancin as the most favorable conformation, followed by Teicoplanin, Telavancin, Teixobactin, Oritavancin and Vancomycin, respectively. The total score, reflecting the best energetic efficiency of complex formation, shows Teixobactin to have the best conformation (a score of 15 points) followed by Dalbavancin (14 points), Oritavancin (12 points), Telavancin (10 points), Teicoplanin (9 points) and Vancomycin (3 points).
Statistical analysis of conformations can be used to predict the efficiency of ligand-target interaction and, consequently, to gain insight into ligand potency and the favorable conformations of the ligand at the binding site. In this study it was shown that Teixobactin is more efficient in binding to Lipid II than Vancomycin, a result confirmed by experimental data reported in the literature.

  14. A comparative evaluation of Oratest with the microbiological method of assessing caries activity in children

    PubMed Central

    Sundaram, Meenakshi; Nayak, Ullal Anand; Ramalingam, Krishnakumar; Reddy, Venugopal; Rao, Arun Prasad; Mathian, Mahesh

    2013-01-01

    Aims: The aim of this study was to determine whether Oratest can be used as a diagnostic tool for assessing caries activity, by evaluating its relationship to the existing caries status and the salivary Streptococcus mutans level. Materials and Methods: The study sample consisted of 90 children divided into two groups: Group I (test group) with 60 children and Group II (control group) with 30 children. Unstimulated saliva for the estimation of Streptococcus mutans was sampled as per the method suggested by Kohler and Bratthall. The plates were then incubated. Rough-surface colonies identified as Streptococcus mutans were counted on a pre-determined area (approximately 1.5 cm²) of each side of the spatula pressed against mitis salivarius bacitracin agar, using a digital colony counter. The results were expressed in colony-forming units (CFU). Oratest was carried out in the same patients after the collection of the salivary sample for the microbiological method, to evaluate the relationship between the two tests. Statistical Analysis Used: The tests used were ANOVA, the Pearson chi-square test, Pearson's correlation analysis, the Mann-Whitney U test and Student's independent t-test. Results: In both the control and test groups, when the Streptococcus mutans count (CFU) and Oratest time (minutes) were correlated using Pearson's correlation analysis, the Streptococcus mutans count was found to be in a statistically significant negative linear relationship with the Oratest time. When the caries status of the children in the test group was correlated with the mutans count (CFU) and Oratest time, caries status was found to be in a statistically significant positive linear relationship with the Streptococcus mutans count and in a significant negative linear relationship with the Oratest time.
Conclusions: The test proved to be a simple, inexpensive and rapid technique for assessing caries activity, since a significant relationship exists clinically with caries status and microbiologically with the Streptococcus mutans count of the individual. PMID:23946577

  15. On Allometry Relations

    NASA Astrophysics Data System (ADS)

    West, Damien; West, Bruce J.

    2012-07-01

    There are a substantial number of empirical relations that began with the identification of a pattern in data; were shown to have a terse power-law description; were interpreted using existing theory; reached the level of "law" and were given a name; only to fade away subsequently when it proved impossible to connect the "law" with a larger body of theory and/or data. Various forms of allometry relations (ARs) have followed this path. The ARs in biology are nearly two hundred years old, and those in ecology, geophysics, physiology and other areas of investigation are not that much younger. In general, if X is a measure of the size of a complex host network and Y is a property of a complex subnetwork embedded within the host network, a theoretical AR exists between the two when Y = aX^b. We emphasize that the reductionistic models of AR interpret X and Y as dynamic variables, albeit the ARs themselves are explicitly time independent even though in some cases the parameter values change over time. On the other hand, the phenomenological models of AR are based on the statistical analysis of data and interpret X and Y as averages to yield the empirical AR: ⟨Y⟩ = a⟨X⟩^b. Modern explanations of AR begin with the application of fractal geometry and fractal statistics to scaling phenomena. The detailed application of fractal geometry to the explanation of theoretical ARs in living networks is slightly more than a decade old and, although well received, it has not been universally accepted. An alternate perspective is given by the empirical AR that is derived using linear regression analysis of fluctuating data sets. We emphasize that the theoretical and empirical ARs are not the same and review theories "explaining" AR from both the reductionist and statistical fractal perspectives. The probability calculus is used to systematically incorporate both views into a single modeling strategy.
We conclude that the empirical AR is entailed by the scaling behavior of the probability density, which is derived using the probability calculus.
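
    The empirical AR described above, obtained by linear regression of fluctuating data, can be sketched on synthetic power-law data: taking logarithms turns Y = aX^b into the straight line log Y = log a + b log X. The exponent, prefactor and noise level below are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic data following a power-law AR, Y = a * X**b, with
    # multiplicative (lognormal) noise. a = 0.5, b = 0.75 are assumptions.
    a_true, b_true = 0.5, 0.75
    X = 10 ** rng.uniform(0, 4, size=300)          # sizes spanning 4 decades
    Y = a_true * X ** b_true * np.exp(0.1 * rng.normal(size=300))

    # Empirical AR via linear regression on the log-transformed data:
    # log Y = log a + b * log X
    b_est, loga_est = np.polyfit(np.log(X), np.log(Y), 1)
    a_est = np.exp(loga_est)
    ```
    
    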

  16. Compared efficacy of preservation solutions on the outcome of liver transplantation: Meta-analysis.

    PubMed

    Szilágyi, Ágnes Lilla; Mátrai, Péter; Hegyi, Péter; Tuboly, Eszter; Pécz, Daniella; Garami, András; Solymár, Margit; Pétervári, Erika; Balaskó, Márta; Veres, Gábor; Czopf, László; Wobbe, Bastian; Szabó, Dorottya; Wagner, Juliane; Hartmann, Petra

    2018-04-28

    To compare the effects of the four most commonly used preservation solutions on the outcome of liver transplantations. A systematic literature search was performed using the MEDLINE, Scopus, EMBASE and Cochrane Library databases up to January 31st, 2017. The inclusion criteria were comparative, randomized controlled trials (RCTs) for deceased donor liver (DDL) allografts with adult and pediatric donors using the gold-standard University of Wisconsin (UW) solution or the histidine-tryptophan-ketoglutarate (HTK), Celsior (CS) and Institut Georges Lopez (IGL-1) solutions. Fifteen RCTs (1830 livers) were included; the primary outcomes were primary non-function (PNF) and one-year post-transplant graft survival (OGS-1). All trials were homogenous with respect to donor and recipient characteristics. There was no statistical difference in the incidence of PNF with the use of UW, HTK, CS and IGL-1 (RR = 0.02, 95%CI: 0.01-0.03, P = 0.356). Comparing OGS-1 also failed to reveal any difference between UW, HTK, CS and IGL-1 (RR = 0.80, 95%CI: 0.80-0.80, P = 0.369). Two trials demonstrated higher PNF levels for UW in comparison with the HTK group, and individual studies described higher rates of biliary complications where HTK and CS were used compared to the UW and IGL-1 solutions. However, the meta-analysis of the data did not prove a statistically significant difference: the UW, CS, HTK and IGL-1 solutions were associated with nearly equivalent outcomes. Alternative solutions to UW yield the same degree of safety and effectiveness for the preservation of DDLs, but further well-designed clinical trials are warranted.
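
    Pooling relative risks (RR) across trials, as in the meta-analysis above, is commonly done by inverse-variance weighting of the log relative risks (a fixed-effect sketch; the 2x2 trial counts below are hypothetical, not the data of the fifteen RCTs):

    ```python
    import math

    # Hypothetical (events, total) pairs for arm A vs arm B in three trials.
    trials = [((4, 100), (5, 102)),
              ((2, 80), (3, 85)),
              ((6, 150), (4, 140))]

    log_rrs, weights = [], []
    for (e1, n1), (e2, n2) in trials:
        rr = (e1 / n1) / (e2 / n2)
        # Standard large-sample variance of the log relative risk.
        var = 1 / e1 - 1 / n1 + 1 / e2 - 1 / n2
        log_rrs.append(math.log(rr))
        weights.append(1 / var)

    # Fixed-effect pooled estimate and 95% confidence interval.
    pooled = sum(w * l for w, l in zip(weights, log_rrs)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
    ```
    
    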

  17. Efficacy of monopolar radiofrequency on skin collagen remodeling: a veterinary study.

    PubMed

    Fritz, Klaus; Bernardy, Jan; Tiplica, George Sonn; Machovcova, Alena

    2015-01-01

    The aesthetic market offers various radiofrequency treatments for the reduction of wrinkles and rhytids. Even though this is not an uncommon aesthetic therapy, there is a considerable lack of clinical evidence on the various energy delivery systems available (unipolar, bipolar, tripolar, multipolar, etc.). The purpose of this study was to demonstrate the efficacy of a monopolar radiofrequency device (Exilis Elite, BTL Industries Inc., Boston, MA, USA) on skin collagen in an animal model. The study treatment was applied to the abdominal area of potbellied Vietnamese mini pigs in the Veterinary Research Institute facility. All pigs were treated once per week for 4 weeks. The treatment area measured 20 × 10 cm. The surface temperature was kept in the therapeutic interval from 39 °C to 43 °C and the therapy lasted for 10 minutes after reaching the therapeutic temperature. Biopsy samples were taken before the therapy and at the 3-month follow-up. The histology samples were stained and magnified (×400) before computer processing. The collagen volume was calculated using stereological analysis and the data were statistically processed (using a two-sample t-test). The collagen content of the tissue increased from an average of 9.0% before the therapy up to 25.9% at the 3-month follow-up. The statistical comparison of 54 samples taken before and after the treatment confirmed a significant difference (p = 0.018). The stereological analysis proved a large-scale improvement of collagen in the treated area. We have observed that monopolar radiofrequency therapy significantly increases collagen remodeling. © 2015 Wiley Periodicals, Inc.

  18. A theoretical study in extracting the essential features and dynamics of molecular motions: Intrinsic geometry methods for PF(5) pseudorotations and statistical methods for argon clusters

    NASA Astrophysics Data System (ADS)

    Panahi, Nima S.

    We studied the problem of understanding and computing the essential features and dynamics of molecular motions through the development of two theories for two different systems. First, we studied the process of the Berry pseudorotation of PF5 and the rotations it induces in the molecule through its natural and intrinsic geometric nature, by setting it in the language of fiber bundles and graph theory. With these tools, we successfully extracted the essentials of the process' loops and induced rotations. The infinite number of pseudorotation loops were broken down into a small set of essential loops called "super loops", whose intrinsic properties and link to the physical movements of the molecule were extensively studied. In addition, only the three "self-edge loops" generated any induced rotations, and then only a finite number of classes of them. Second, we studied applying the statistical methods of Principal Components Analysis (PCA) and Principal Coordinate Analysis (PCO) to Argon clusters: PCA to capture only the most important changes and so reduce computational costs, and PCO to graph the potential energy surface (PES) in three dimensions. Both methods proved useful, though PCA was only partially successful, since its advantages appear only for PES databases far larger than those currently studied or likely to be computationally tractable in the next few decades. In addition, PCA is only needed for the very rare case of a PES database that does not already include Hessian eigenvalues.
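
    The PCA step described above, capturing only the dominant changes in an ensemble of cluster configurations, can be sketched on synthetic data: diagonalize the covariance of the mean-centred coordinates and keep the leading eigenvectors. The cluster size, number of frames, "soft mode" construction and noise level are illustrative assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical ensemble of 7-atom cluster configurations: small random
    # displacements of a reference geometry along two dominant directions.
    n_atoms, n_frames = 7, 500
    ref = rng.normal(size=3 * n_atoms)
    modes = np.linalg.qr(rng.normal(size=(3 * n_atoms, 2)))[0]  # orthonormal modes
    amps = rng.normal(size=(n_frames, 2)) * [2.0, 1.0]          # mode amplitudes
    frames = ref + amps @ modes.T + 0.05 * rng.normal(size=(n_frames, 3 * n_atoms))

    # PCA: eigendecomposition of the covariance of mean-centred coordinates.
    centred = frames - frames.mean(axis=0)
    cov = centred.T @ cov_input if False else centred.T @ centred / (n_frames - 1)
    evals, evecs = np.linalg.eigh(cov)
    evals = evals[::-1]                       # descending eigenvalues

    explained = evals[:2].sum() / evals.sum() # variance captured by top 2 PCs
    reduced = centred @ evecs[:, ::-1][:, :2] # 2-D projection of each frame
    ```
    
    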

  19. Chip-LC-MS for label-free profiling of human serum.

    PubMed

    Horvatovich, Peter; Govorukhina, Natalia I; Reijmers, Theo H; van der Zee, Ate G J; Suits, Frank; Bischoff, Rainer

    2007-12-01

    The discovery of biomarkers in easily accessible body fluids such as serum is one of the most challenging topics in proteomics, requiring highly efficient separation and detection methodologies. Here, we present the application of a microfluidics-based LC-MS system (chip-LC-MS) to the label-free profiling of immunodepleted, trypsin-digested serum in comparison to conventional capillary LC-MS (cap-LC-MS). Both systems proved to have a repeatability of approximately 20% RSD for peak area, all sample preparation steps included, while the repeatability of the LC-MS part by itself was less than 10% RSD for the chip-LC-MS system. Importantly, the chip-LC-MS system had twice the resolution in the LC dimension and resulted in a lower average charge state of the tryptic peptide ions generated in the ESI interface when compared to cap-LC-MS, while requiring approximately 30 times less (~5 pmol) sample. In order to characterize both systems for their capability to find discriminating peptides in trypsin-digested serum samples, five out of ten individually prepared, identical sera were spiked with horse heart cytochrome c. A comprehensive data processing methodology was applied including 2-D smoothing, resolution reduction, peak picking, time alignment, and matching of the individual peak lists to create an aligned peak matrix amenable to statistical analysis. Statistical analysis by supervised classification and variable selection showed that both LC-MS systems could discriminate the two sample groups. However, the chip-LC-MS system allowed 55% of the overall signal to be assigned to selected peaks, against 32% for the cap-LC-MS system.
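
    The repeatability figures quoted above are percent relative standard deviations (%RSD) of peak areas across replicates. A minimal sketch of the computation, on invented peak areas:

    ```python
    import numpy as np

    # Hypothetical peak areas of one tryptic peptide across 10 replicate
    # serum preparations (arbitrary units).
    areas = np.array([1.02e6, 0.88e6, 1.10e6, 0.95e6, 1.05e6,
                      0.91e6, 1.08e6, 0.97e6, 1.21e6, 0.84e6])

    # %RSD = 100 * sample standard deviation / mean.
    rsd = 100 * areas.std(ddof=1) / areas.mean()
    print(f"peak-area repeatability: {rsd:.1f}% RSD")
    ```
    
    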

  20. Insect Venom Immunotherapy: Analysis of the Safety and Tolerance of 3 Buildup Protocols Frequently Used in Spain.

    PubMed

    Gutiérrez Fernández, D; Moreno-Ancillo, A; Fernández Meléndez, S; Domínguez-Noche, C; Gálvez Ruiz, P; Alfaya Arias, T; Carballada González, F; Alonso Llamazares, A; Marques Amat, L; Vega Castro, A; Antolín Amérigo, D; Cruz Granados, S; Ruiz León, B; Sánchez Morillas, L; Fernández Sánchez, J; Soriano Gomis, V; Borja Segade, J; Dalmau Duch, G; Guspi Bori, R; Miranda Páez, A

    2016-01-01

    Hymenoptera venom immunotherapy (VIT) is an effective treatment but not one devoid of risk, as both local and systemic adverse reactions may occur, especially in the initial phases. We compared the tolerance to 3 VIT buildup protocols and analyzed risk factors associated with adverse reactions during this phase. We enrolled 165 patients divided into 3 groups based on the buildup protocol used (3, 4, and 9 weeks). The severity of systemic reactions was evaluated according to the World Allergy Organization model. Results were analyzed using exploratory descriptive statistics, and variables were compared using analysis of variance. Adverse reactions were recorded in 53 patients (32%) (43 local and 10 systemic). Local reactions were immediate in 27 patients (63%) and delayed in 16 (37%). The severity of the local reaction was slight/moderate in 15 patients and severe in 13. Systemic reactions were grade 1-2. No significant association was found between the treatment modality and the onset of local or systemic adverse reactions or the type of local reaction. We only found a statistically significant association between severity of the local reaction and female gender. As for the risk factors associated with systemic reactions during the buildup phase, we found no significant differences in values depending on the protocol used or the insect responsible. The buildup protocols compared proved to be safe and did not differ significantly from one another. In the population studied, patients undergoing the 9-week schedule presented no systemic reactions. Therefore, this protocol can be considered the safest approach.

  1. Flexible Adaptive Paradigms for fMRI Using a Novel Software Package ‘Brain Analysis in Real-Time’ (BART)

    PubMed Central

    Hellrung, Lydia; Hollmann, Maurice; Zscheyge, Oliver; Schlumm, Torsten; Kalberlah, Christian; Roggenhofer, Elisabeth; Okon-Singer, Hadas; Villringer, Arno; Horstmann, Annette

    2015-01-01

    In this work we present a new open source software package offering a unified framework for the real-time adaptation of fMRI stimulation procedures. The software provides a straightforward setup and a highly flexible approach to adapting fMRI paradigms while the experiment is running. The general framework comprises the inclusion of parameters reflecting the subject’s compliance, such as directing gaze to visually presented stimuli, and physiological fluctuations such as blood pressure or pulse. Additionally, this approach opens possibilities for investigating complex scientific questions, for example the influence of EEG rhythms or of the fMRI signals themselves. To prove the concept of this approach, we used our software in a usability example for an fMRI experiment where the presentation of emotional pictures was dependent on the subject’s gaze position, a factor that can have a significant impact on the results. So far, if this is taken into account during fMRI data analysis, it is commonly done by the post-hoc removal of erroneous trials. Here, we propose an a priori adaptation of the paradigm during the experiment’s runtime. Our fMRI findings clearly show the benefits of an adapted paradigm in terms of statistical power and higher effect sizes in emotion-related brain regions. This can be of special interest for all experiments with low statistical power due to a limited number of subjects, a limited amount of time, costs or available data to analyze, as is the case with real-time fMRI. PMID:25837719

  2. EEG analysis of the brain activity during the observation of commercial, political, or public service announcements.

    PubMed

    Vecchiato, Giovanni; Astolfi, Laura; Tabarrini, Alessandro; Salinari, Serenella; Mattia, Donatella; Cincotti, Febo; Bianchi, Luigi; Sorrentino, Domenica; Aloise, Fabio; Soranzo, Ramon; Babiloni, Fabio

    2010-01-01

    The use of modern brain imaging techniques could be useful to understand what brain areas are involved in the observation of video clips related to commercial advertising, as well as for the support of political campaigns, and also the areas of Public Service Announcements (PSAs). In this paper we describe the capability of tracking brain activity during the observation of commercials, political spots, and PSAs with advanced high-resolution EEG statistical techniques in time and frequency domains in a group of normal subjects. We analyzed the statistically significant cortical spectral power activity in different frequency bands during the observation of a commercial video clip related to the use of a beer in a group of 13 normal subjects. In addition, a TV speech of the Prime Minister of Italy was analyzed in two groups of swing and "supporter" voters. Results suggested that the cortical activity during the observation of commercial spots could vary consistently across the spot. This fact suggests the possibility of removing the parts of the spot that are not particularly attractive by using those cerebral indexes. The cortical activity during the observation of the political speech indicated greater cortical activity in the supporters group when compared to the swing voters. In this case, it is possible to conclude that the communication proposed has failed to raise attention or interest in swing voters. In conclusion, high-resolution EEG statistical techniques proved able to generate useful insights about the particular fruition of TV messages, related to both commercial as well as political fields.
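
    The spectral power in classical EEG frequency bands analyzed above can be sketched with a simple one-sided periodogram on a synthetic signal; the sampling rate, epoch length and the 10 Hz test rhythm are illustrative assumptions, not the study's recording setup:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    fs, dur = 250, 10                       # 250 Hz sampling, 10-second epoch
    t = np.arange(fs * dur) / fs
    # Hypothetical one-channel EEG: a 10 Hz alpha rhythm buried in noise.
    eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(size=t.size)

    # One-sided periodogram (factor 2 folds in the negative frequencies).
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    psd = 2 * np.abs(np.fft.rfft(eeg)) ** 2 / (fs * t.size)
    df = freqs[1] - freqs[0]

    def band_power(lo, hi):
        mask = (freqs >= lo) & (freqs < hi)
        return psd[mask].sum() * df

    bands = {"theta": band_power(4, 8),
             "alpha": band_power(8, 13),
             "beta": band_power(13, 30)}
    ```
    
    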

  3. EEG Analysis of the Brain Activity during the Observation of Commercial, Political, or Public Service Announcements

    PubMed Central

    Vecchiato, Giovanni; Astolfi, Laura; Tabarrini, Alessandro; Salinari, Serenella; Mattia, Donatella; Cincotti, Febo; Bianchi, Luigi; Sorrentino, Domenica; Aloise, Fabio; Soranzo, Ramon; Babiloni, Fabio

    2010-01-01

    The use of modern brain imaging techniques could be useful to understand what brain areas are involved in the observation of video clips related to commercial advertising, as well as for the support of political campaigns, and also the areas of Public Service Announcements (PSAs). In this paper we describe the capability of tracking brain activity during the observation of commercials, political spots, and PSAs with advanced high-resolution EEG statistical techniques in time and frequency domains in a group of normal subjects. We analyzed the statistically significant cortical spectral power activity in different frequency bands during the observation of a commercial video clip related to the use of a beer in a group of 13 normal subjects. In addition, a TV speech of the Prime Minister of Italy was analyzed in two groups of swing and “supporter” voters. Results suggested that the cortical activity during the observation of commercial spots could vary consistently across the spot. This fact suggests the possibility of removing the parts of the spot that are not particularly attractive by using those cerebral indexes. The cortical activity during the observation of the political speech indicated greater cortical activity in the supporters group when compared to the swing voters. In this case, it is possible to conclude that the communication proposed has failed to raise attention or interest in swing voters. In conclusion, high-resolution EEG statistical techniques proved able to generate useful insights about the particular fruition of TV messages, related to both commercial as well as political fields. PMID:20069055

  4. An innovative way to highlight the power of each polymorphism on elite athletes' phenotype expression.

    PubMed

    Contrò, Valentina; Schiera, Gabriella; Abbruzzo, Antonino; Bianco, Antonino; Amato, Alessandra; Sacco, Alessia; Macchiarella, Alessandra; Palma, Antonio; Proia, Patrizia

    2018-01-12

    The purpose of this study was to determine the probability of soccer players having the best genetic background for increased performance, evaluating polymorphisms that are considered Performance Enhancing Polymorphisms (PEPs) distributed over five genes: PPARα, PPARGC1A, NRF2, ACE and CKMM. In particular, we investigated how each polymorphism works, directly or through another polymorphism, to distinguish elite athletes from the non-athletic population. Sixty professional soccer players (age 22.5 ± 2.2) and sixty healthy volunteers (age 21.2 ± 2.3) were enrolled. Samples of venous blood were used to prepare genomic DNA. The polymorphic sites were scanned using PCR-RFLP protocols with different enzymes. We used a multivariate logistic regression analysis to demonstrate an association between the five PEPs and the elite phenotype. We found statistical significance in the NRF2 (AG/GG genotype) polymorphism/soccer player association (p < 0.05) as well as a stronger association for the ACE polymorphism (p = 0.02). In particular, we noticed that the ACE ID genotype, and even more so the II genotype, is associated with the soccer player phenotype. Although the other PEPs showed no statistical significance, we found that some of them may act indirectly, amplifying the effect of another polymorphism; for example, PPARα seems to act on NRF2 (GG), enhancing the effect of the latter, even though it did not itself reach statistical significance. In conclusion, to establish whether a polymorphism can influence performance, it is necessary to understand how polymorphisms act and interact, directly and indirectly, on each other.

  5. Intraday Seasonalities and Nonstationarity of Trading Volume in Financial Markets: Individual and Cross-Sectional Features.

    PubMed

    Graczyk, Michelle B; Duarte Queirós, Sílvio M

    2016-01-01

    We study the intraday behaviour of the statistical moments of the trading volume of the blue chip equities that composed the Dow Jones Industrial Average index between 2003 and 2014. By splitting that time interval into semesters, we provide a quantitative account of the nonstationary nature of the intraday statistical properties as well. Explicitly, we show that the well-known ∪-shape exhibited by the average trading volume, as well as by the volatility of the price fluctuations, experienced a significant change from 2008 (the year of the "subprime" financial crisis) onwards. That has resulted in a faster relaxation after the market opening and relates to a consistent decrease in the convexity of the average trading volume intraday profile. Simultaneously, the last part of the session has become steeper as well, a modification that is likely to have been triggered by the new short-selling rules that were introduced in 2007 by the Securities and Exchange Commission. The combination of both results reveals that the ∪ has been turning into a ⊔. Additionally, the analysis of higher-order cumulants, namely the skewness and the kurtosis, shows that the morning and the afternoon parts of the trading session are each clearly associated with different statistical features and hence dynamical rules. Concretely, we claim that the large initial trading volume is due to wayward stocks whereas the large volume during the last part of the session hinges on a cohesive increase of the trading volume. That dissimilarity between the two parts of the trading session is stressed in periods of higher uproar in the market.
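
    The intraday statistical moments discussed above (mean, standard deviation, skewness, kurtosis per minute of the session) can be sketched on synthetic minute-by-minute volumes with an imposed ∪-shaped profile; all parameters below are invented for illustration, not estimates from the DJIA data:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Hypothetical one-minute volumes over 100 sessions of 390 minutes,
    # with a ∪-shaped deterministic profile times lognormal fluctuations.
    minutes = np.arange(390)
    u_shape = 1.0 + 2.0 * ((minutes - 195) / 195) ** 2   # high at open/close
    volume = u_shape * rng.lognormal(mean=0.0, sigma=0.5, size=(100, 390))

    # Intraday moments, computed minute by minute across sessions.
    mean = volume.mean(axis=0)
    std = volume.std(axis=0)
    centred = volume - mean
    skew = (centred ** 3).mean(axis=0) / std ** 3
    kurt = (centred ** 4).mean(axis=0) / std ** 4 - 3    # excess kurtosis
    ```
    
    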

  6. Multi-criteria decision analysis and spatial statistic: an approach to determining human vulnerability to vector transmission of Trypanosoma cruzi.

    PubMed

    Montenegro, Diego; Cunha, Ana Paula da; Ladeia-Andrade, Simone; Vera, Mauricio; Pedroso, Marcel; Junqueira, Angela

    2017-10-01

    Chagas disease (CD), caused by the protozoan Trypanosoma cruzi, is a neglected human disease. It is endemic to the Americas and is estimated to have an economic impact, including lost productivity and disability, of 7 billion dollars per year on average. The aim was to assess vulnerability to vector-borne transmission of T. cruzi in domiciliary environments within an area undergoing domiciliary vector interruption of T. cruzi in Colombia. Multi-criteria decision analysis [the preference ranking method for enrichment evaluation (PROMETHEE) and geometrical analysis for interactive assistance (GAIA) methods] and spatial statistics were performed on data from a socio-environmental questionnaire and an entomological survey. In the construction of the multi-criteria descriptors, decision-making processes and indicators of five determinants of the CD vector pathway were defined, including: (1) house indicator (HI); (2) triatominae indicator (TI); (3) host/reservoir indicator (Ho/RoI); (4) ecotope indicator (EI); and (5) socio-cultural indicator (S-CI). Determination of vulnerability to CD is mostly influenced by TI, with 44.96% of the total weight in the model, while the lowest contribution comes from S-CI, with 7.15%. The five indicators comprise 17 indices and include 78 of the original 104 priority criteria and variables. The PROMETHEE and GAIA methods proved very efficient for prioritisation and quantitative categorisation of socio-environmental determinants, and for better determining which criteria should be considered for interrupting the man-T. cruzi-vector relationship in endemic areas of the Americas. The analysis of spatial autocorrelation makes clear that there is spatial dependence in establishing categories of vulnerability; therefore, the effect of neighbouring settings (border areas) on local values should be incorporated into disease management when establishing programs of surveillance and control of vector-borne CD. The study model proposed here is flexible, can be adapted to various eco-epidemiological profiles, and is suitable for focusing anti-T. cruzi serological surveillance programs on vulnerable human populations.
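A minimal sketch of the PROMETHEE II outranking calculation named in this abstract, using the "usual" preference function (P = 1 when the difference favours alternative a, else 0). The weights and scores are invented, loosely echoing the roles of the five indicators (HI, TI, Ho/RoI, EI, S-CI); they are not the study's values:

```python
import numpy as np

weights = np.array([0.20, 0.45, 0.15, 0.13, 0.07])  # must sum to 1
# Rows: dwellings (alternatives); columns: vulnerability indicators,
# higher score = more vulnerable. Purely illustrative numbers.
scores = np.array([
    [0.8, 0.9, 0.4, 0.7, 0.5],
    [0.3, 0.2, 0.6, 0.4, 0.9],
    [0.6, 0.7, 0.7, 0.5, 0.4],
])
n = scores.shape[0]

# Aggregated pairwise preference index pi(a, b).
pi = np.zeros((n, n))
for a in range(n):
    for b in range(n):
        if a != b:
            pi[a, b] = np.sum(weights * (scores[a] > scores[b]))

phi_plus = pi.sum(axis=1) / (n - 1)   # positive outranking flow
phi_minus = pi.sum(axis=0) / (n - 1)  # negative outranking flow
net_flow = phi_plus - phi_minus       # PROMETHEE II complete ranking

ranking = np.argsort(-net_flow)
print("most vulnerable first:", ranking)
```

Real applications replace the "usual" preference function with richer shapes (linear, Gaussian, with indifference thresholds), but the flow computation is the same; net flows always sum to zero.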

  7. Analysis by the Residual Method for Estimate Market Value of Land on the Areas with Mining Exploitation in Subsoil under Future New Building

    NASA Astrophysics Data System (ADS)

    Gwozdz-Lason, Monika

    2017-12-01

    This paper attempts to answer the following questions: what is the main selling advantage of a plot of land in areas with mining exploitation? Which attributes influence the market value the most? And how can the influence of mining in the subsoil under a future new building be accounted for in the market value of a plot with commercial use? This focus is not accidental, as the paper sets out to prove that the subsoil load-bearing capacity, as directly inferred from the local geotechnical properties affected by mining exploitation, considerably influences the market value of this type of real estate. The analysis and calculations presented here are part of ongoing development work aimed at proposing a new technology and procedures for estimating the value of land belonging to the third geotechnical category. The question was examined both theoretically and empirically. Results and final conclusions were defined on the basis of the analysed code calculations in the residual method, together with numerical, statistical and econometric analyses. A market analysis yielded a group of subsoil stabilization costs which depend on the interaction of the mining operations, the subsoil parameters, the type of the contemplated structure, its foundations, the selected stabilization method, and its overall area and shape.

  8. Cadmium, lead, and mercury levels in feathers of small passerine birds: noninvasive sampling strategy.

    PubMed

    Bianchi, Nicola; Ancora, Stefania; di Fazio, Noemi; Leonzio, Claudio

    2008-10-01

    Bird feathers have been widely used as a nondestructive biological material for monitoring heavy metals. Sources of metals taken up by feathers include diet (metals are incorporated during feather formation), preening, and direct contact with metals in water, air, dust, and plants. In the literature, data regarding the origin of trace elements in feathers are not univocal. Only in the vast literature concerning mercury (as methyl mercury) has endogenous origin been determined. In the present study, we investigate cadmium, lead, and mercury levels in feathers of prey of Falco eleonorae in relation to the ecological characteristics (molt, habitat, and contamination by soil) of the different species. Cluster analysis identified two main groups of species. Differences and correlations within and between groups identified by cluster analysis were then checked by nonparametric statistical analysis. The results showed that mercury levels had a pattern significantly different from those of cadmium and lead, which in turn showed a significant positive correlation, suggesting different origins. Nests of F. eleonorae proved to be a good source for feathers of small trans-Saharan passerines collected by a noninvasive method. They provided abundant feathers of the various species in a relatively small area, in this case the falcon colony on the Isle of San Pietro, Sardinia, Italy.

  9. Hydroxychloroquine for the prevention of fetal growth restriction and prematurity in lupus pregnancy: A systematic review and meta-analysis.

    PubMed

    Vivien, Guillotin; Alice, Bouhet; Thomas, Barnetche; Christophe, Richez; Marie-Elise, Truchetet; Julien, Seneschal; Pierre, Duffau; Estibaliz, Lazaro

    2018-04-06

    Systemic lupus erythematosus (SLE) is a chronic autoimmune disease that primarily affects women of childbearing age. While the impact of hydroxychloroquine (HCQ) on SLE activity and neonatal lupus occurrence has been evaluated in several studies, its role in prematurity and intrauterine growth restriction (IUGR) remains uncertain. The aim of this study was to assess the impact of HCQ exposure on prematurity and IUGR during pregnancy in women with SLE. We conducted a systematic review and a meta-analysis comparing prematurity and IUGR in SLE pregnancies exposed or not exposed to HCQ. The odds ratios of IUGR and prematurity were calculated and compared between pregnancies in each group according to HCQ treatment. Six studies were included (3 descriptive cohort studies and 3 case series), totalling 870 pregnancies. Of the SLE pregnancies, 308 were exposed to HCQ and were compared to 562 not exposed to HCQ. There was no statistical difference in prematurity or IUGR between groups. This meta-analysis failed to prove the efficacy of HCQ in the prevention of prematurity or IUGR during SLE pregnancies. Due to the heterogeneity of the studies, these results should be interpreted cautiously. Copyright © 2018 Société française de rhumatologie. Published by Elsevier SAS. All rights reserved.

  10. Quantitative analysis of the renal aging in rats. Stereological study.

    PubMed

    Melchioretto, Eduardo Felippe; Zeni, Marcelo; Veronez, Djanira Aparecida da Luz; Martins, Eduardo Lopes; Fraga, Rogério de

    2016-05-01

    To evaluate renal function and renal histological alterations through stereology and morphometrics in rats submitted to the natural process of aging. Seventy-two Wistar rats were divided into six groups. Each group was sacrificed at a different age: 3, 6, 9, 12, 18 and 24 months. Right nephrectomy was performed, followed by stereological and morphometric analysis of the renal tissue (renal volume and weight, volume density (Vv[glom]) and numerical density (Nv[glom]) of the renal glomeruli, and average glomerular volume (Vol[glom])); renal function was evaluated by measuring serum creatinine and urea. There was a significant decrease of renal function in the oldest rats. The renal volume increased gradually during the development of the rats, with the largest values registered in the group of animals at 12 months of age, followed by a significant progressive decrease in older animals. Vv[glom] showed a statistically significant gradual reduction between the groups, and Nv[glom] also decreased significantly. Renal function proved to be inferior in senile rats when compared to young rats. The morphometric and stereological analysis evidenced renal atrophy and a gradual reduction of the volume density and numerical density of the renal glomeruli associated with the aging process.

  11. Cost-benefit analysis of passive fire protections in road LPG transportation.

    PubMed

    Paltrinieri, Nicola; Bonvicini, Sarah; Spadoni, Gigliola; Cozzani, Valerio

    2012-02-01

    The cost-benefit evaluation of passive fire protection adoption in the road transport of liquefied petroleum gas (LPG) was investigated. In a previous study, mathematical simulations of real scale fire scenarios proved the effectiveness of passive fire protections in preventing the "fired" boiling liquid expanding vapor explosion (BLEVE), thus providing a significant risk reduction. In the present study the economical aspects of the adoption of fire protections are analyzed and an approach to cost-benefit analysis (CBA) is proposed. The CBA model is based on the comparison of the risk reduction due to fire protections (expressed in monetary terms by the value of a statistical life) and the cost of the application of fire protections to a fleet of tankers. Different types of fire protections were considered, as well as the possibility to apply protections to the entire fleet or only to a part of it. The application of the proposed model to a real-life case study is presented and discussed. Results demonstrate that the adoption of passive fire protections on road tankers, though not compulsory in Europe, can be economically feasible, thus representing a concrete measure to achieve control of the "major hazard accidents" cited by the European legislation. © 2011 Society for Risk Analysis.
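The cost-benefit criterion described in this abstract reduces to comparing the monetised risk reduction (via the value of a statistical life) with the protection cost. The sketch below uses invented placeholder numbers throughout, not values from the study:

```python
# Back-of-the-envelope CBA sketch: passive fire protection is worthwhile
# when the monetised risk reduction exceeds its cost. All figures are
# assumptions for illustration.

VSL = 6.0e6           # value of a statistical life, EUR (assumption)
fleet_size = 100      # number of protected tankers (assumption)
years = 15            # service life over which the benefit accrues

# Expected fatalities per tanker-year with and without thermal protection
# (invented rates).
risk_unprotected = 5.0e-5
risk_protected = 0.5e-5

benefit = (risk_unprotected - risk_protected) * VSL * fleet_size * years
cost = 2_000 * fleet_size  # protection cost per tanker (assumption)

print(f"benefit EUR {benefit:,.0f} vs cost EUR {cost:,.0f}")
print("protection economically feasible:", benefit > cost)
```

The real analysis additionally varies the protection type and the fraction of the fleet protected; the decision rule stays the same.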

  12. Energy spectra of X-ray clusters of galaxies

    NASA Technical Reports Server (NTRS)

    Avni, Y.

    1976-01-01

    A procedure for estimating the ranges of parameters that describe the spectra of X-rays from clusters of galaxies is presented. The applicability of the method is proved by statistical simulations of cluster spectra; such a proof is necessary because of the nonlinearity of the spectral functions. Implications for the spectra of the Perseus, Coma, and Virgo clusters are discussed. The procedure can be applied in more general problems of parameter estimation.

  13. Neutron Protection Factor Determination and Validation for a Vehicle Surrogate Using a Californium Fission Source

    DTIC Science & Technology

    2017-06-01

    protection factors . The success of this research is a direct result of the immense collaboration across a number of institutions that all shared a...at post detonation neutron transport, an exact solution is not needed. Instead, the RPF research campaign uses a statistical-based method through a... factors of selected light vehicles against residual radiation,” United States Army Ballistic Research Laboratory, Aberdeen Proving Ground, MD, 1988

  14. An investigation of the use of discriminant analysis for the classification of blade edge type from cut marks made by metal and bamboo blades.

    PubMed

    Bonney, Heather

    2014-08-01

    Analysis of cut marks in bone is largely limited to two dimensional qualitative description. Development of morphological classification methods using measurements from cut mark cross sections could have multiple uses across palaeoanthropological and archaeological disciplines, where cutting edge types are used to investigate and reconstruct behavioral patterns. An experimental study was undertaken, using porcine bone, to determine the usefulness of discriminant function analysis in classifying cut marks by blade edge type, from a number of measurements taken from their cross-sectional profile. The discriminant analysis correctly classified 86.7% of the experimental cut marks into serrated, non-serrated and bamboo blade types. The technique was then used to investigate a series of cut marks of unknown origin from a collection of trophy skulls from the Torres Strait Islands, to investigate whether they were made by bamboo or metal blades. Nineteen out of twenty of the cut marks investigated were classified as bamboo which supports the non-contemporaneous ethnographic accounts of the knives used for trophy taking and defleshing remains. With further investigation across a variety of blade types, this technique could prove a valuable tool in the interpretation of cut mark evidence from a wide variety of contexts, particularly in forensic anthropology where the requirement for presentation of evidence in a statistical format is becoming increasingly important. © 2014 Wiley Periodicals, Inc.
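The discriminant-function workflow this abstract describes (train on measured cut-mark cross sections, report classification accuracy, then classify unknown marks) can be sketched as follows. The measurements and class means are simulated, not the study's porcine-bone data:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

# Hypothetical cross-sectional measurements of experimental cut marks
# (e.g. width, depth, opening angle, floor radius) for three blade classes.
def simulate(mean, n=40):
    return rng.normal(mean, 0.25, size=(n, 4))

X = np.vstack([
    simulate([1.0, 0.5, 3.5, 0.2]),  # serrated metal
    simulate([0.8, 0.7, 2.5, 0.1]),  # non-serrated metal
    simulate([1.6, 0.4, 5.0, 0.5]),  # bamboo
])
y = np.repeat(["serrated", "non-serrated", "bamboo"], 40)

# Discriminant function analysis: cross-validated accuracy on the
# experimental marks, then classification of an unknown mark.
lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, y, cv=5).mean()
lda.fit(X, y)
unknown = np.array([[1.5, 0.45, 4.8, 0.45]])
print(f"cv accuracy {acc:.2f}; unknown classified as {lda.predict(unknown)[0]}")
```

Reporting a cross-validated accuracy (the study's 86.7% plays this role) is what lets the classification of archaeological marks be presented in statistical terms.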

  15. The “Dry-Run” Analysis: A Method for Evaluating Risk Scores for Confounding Control

    PubMed Central

    Wyss, Richard; Hansen, Ben B.; Ellis, Alan R.; Gagne, Joshua J.; Desai, Rishi J.; Glynn, Robert J.; Stürmer, Til

    2017-01-01

    Abstract A propensity score (PS) model's ability to control confounding can be assessed by evaluating covariate balance across exposure groups after PS adjustment. The optimal strategy for evaluating a disease risk score (DRS) model's ability to control confounding is less clear. DRS models cannot be evaluated through balance checks within the full population, and they are usually assessed through prediction diagnostics and goodness-of-fit tests. A proposed alternative is the “dry-run” analysis, which divides the unexposed population into “pseudo-exposed” and “pseudo-unexposed” groups so that differences on observed covariates resemble differences between the actual exposed and unexposed populations. With no exposure effect separating the pseudo-exposed and pseudo-unexposed groups, a DRS model is evaluated by its ability to retrieve an unconfounded null estimate after adjustment in this pseudo-population. We used simulations and an empirical example to compare traditional DRS performance metrics with the dry-run validation. In simulations, the dry run often improved assessment of confounding control, compared with the C statistic and goodness-of-fit tests. In the empirical example, PS and DRS matching gave similar results and showed good performance in terms of covariate balance (PS matching) and controlling confounding in the dry-run analysis (DRS matching). The dry-run analysis may prove useful in evaluating confounding control through DRS models. PMID:28338910

  16. [Atomic absorption fingerprint and identification studies of Da Huo Luo pill. I. Exploration of inorganic elements fingerprint for establishment of industrial standard].

    PubMed

    Zhang, Qi-Feng; Zhu, Long-Yin; Ding, Shu-Liang; Wang, Chen; Tu, Long-Fei

    2008-03-01

    The fingerprints for most Chinese medicines, based on their organic compositions, have been well established. Nevertheless, there are very few known fingerprints based on inorganic elements. In order to distinguish Da Huo Luo Dan and its efficacy from other Chinese medicines, the authors attempted to set up a fingerprint determined by the measurement of inorganic elements in Da Huo Luo Dan and other Chinese medicines. In the present study, the authors first employed 28 batches of Da Huo Luo Dan produced by the Zhang-Shu Pharmaceutical Company in Jiangxi Province to screen 12 kinds of inorganic elements measured by atomic absorption spectrophotometer, and established the atomic absorption fingerprints. Secondly, the authors tried to identify Da Huo Luo Dan among other Chinese medicines by using the similarity analysis of vectors and the statistical analysis of compositional data. The results showed that the methods used here could reliably distinguish Da Huo Luo Dan from other medicines. The study also proves that establishing a standard for quality control by analysis of inorganic elements in Chinese medicines is feasible. The present study provides a new idea and a new technique serving the establishment of industrial standards for the analysis of inorganic element fingerprints to explore the effects of Chinese medicines.

  17. Fast and simultaneous determination of 12 polyphenols in apple peel and pulp by using chemometrics-assisted high-performance liquid chromatography with diode array detection.

    PubMed

    Wang, Tong; Wu, Hai-Long; Xie, Li-Xia; Zhu, Li; Liu, Zhi; Sun, Xiao-Dong; Xiao, Rong; Yu, Ru-Qin

    2017-04-01

    In this work, a smart chemometrics-enhanced strategy, high-performance liquid chromatography with diode array detection coupled with a second-order calibration method based on the alternating trilinear decomposition algorithm, was proposed to simultaneously quantify 12 polyphenols in different kinds of apple peel and pulp samples. The proposed strategy proved to be a powerful tool for solving the problems of coelution, unknown interferences, and chromatographic shifts in the process of high-performance liquid chromatography analysis, making it possible to determine the 12 polyphenols in complex apple matrices within 10 min under simple elution conditions. The average recoveries with standard deviations, and figures of merit including sensitivity, selectivity, limit of detection, and limit of quantitation, were calculated to validate the accuracy of the proposed method. Compared to the quantitative analysis results from the classic high-performance liquid chromatography method, the statistical and graphical analysis showed that our proposed strategy obtained more reliable results. All results indicated that our proposed method for the quantitative analysis of apple polyphenols is accurate, fast, universal, simple, and green, and it is expected to be developed as an attractive alternative method for simultaneous determination of multitargeted analytes in complex matrices. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. The Nuremberg mind redeemed: a comprehensive analysis of the Rorschachs of Nazi war criminals.

    PubMed

    Resnick, M N; Nunno, V J

    1991-08-01

    We examined a blind, actuarial analysis of the Rorschach data of the Nuremberg war criminals (NWC) using Exner's (1974) Comprehensive System in an attempt to prove the convergence of the NWC construct along dimensions of psychological (personality) functioning and to prove its discriminability from other appropriate psychiatric and nonpsychiatric comparison groups. The weaknesses of previous research methodologies are examined and discussed vis-à-vis the historical and theoretical developments of the concepts of authoritarianism, dogmatism, obedience to authority, and the development of the Rorschach Inkblot Technique.

  19. Effect of the mass center shift for force-free flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Meirovitch, L.; Juang, J.-N.

    1975-01-01

    For a spinning flexible spacecraft the mass center generally shifts relative to the nominal undeformed position. It is thought that this shift of center complicates spacecraft stability analysis. It is proved, on the basis of results achieved by Meirovitch and Calico (1972), that for the general class of force-free single-spin flexible spacecraft it is possible to ignore this shift of center without affecting the stability criteria in any significant way. A new theorem on inequalities for quadratic forms is proved to demonstrate the validity of the stability analysis.

  20. Mathematical analysis on the cosets of subgroup in the group of E-convex sets

    NASA Astrophysics Data System (ADS)

    Abbas, Nada Mohammed; Ajeena, Ruma Kareem K.

    2018-05-01

    In this work, the analysis of the cosets of a subgroup in the group of L-convex sets is presented as a new and powerful tool in the topics of convex analysis and abstract algebra. The properties of these cosets on L-convex sets are proved mathematically. The most important theorem on finite groups in the theory of L-convex sets, Lagrange's Theorem, has been proved. In addition, a mathematical proof for the quotient group of L-convex sets is presented.

  1. Dimensional Analysis in Mathematical Modeling Systems: A Simple Numerical Method

    DTIC Science & Technology

    1991-02-01

    US Army Ballistic Research Laboratories, Aberdeen Proving Ground, MD, August 1975. [18] Hürlimann, T., and J. Kohlas "LPL: A Structured Language...such systems can prove that (a² + ab + b² + ba) = (a + b)². With some effort, since the laws of physical algebra are a minor variant on those of

  2. Conjecturing, Generalizing and Justifying: Building Theory around Teacher Knowledge of Proving

    ERIC Educational Resources Information Center

    Lesseig, Kristin

    2016-01-01

    The purpose of this study was to detail teachers' proving activity and contribute to a framework of Mathematical Knowledge for Teaching Proof (MKT for Proof). While working to justify claims about sums of consecutive numbers, teachers searched for key ideas and productively used examples to make, test and refine conjectures. Analysis of teachers'…

  3. Linear Discriminant Analysis Achieves High Classification Accuracy for the BOLD fMRI Response to Naturalistic Movie Stimuli

    PubMed Central

    Mandelkow, Hendrik; de Zwart, Jacco A.; Duyn, Jeff H.

    2016-01-01

    Naturalistic stimuli like movies evoke complex perceptual processes, which are of great interest in the study of human cognition by functional MRI (fMRI). However, conventional fMRI analysis based on statistical parametric mapping (SPM) and the general linear model (GLM) is hampered by a lack of accurate parametric models of the BOLD response to complex stimuli. In this situation, statistical machine-learning methods, a.k.a. multivariate pattern analysis (MVPA), have received growing attention for their ability to generate stimulus response models in a data-driven fashion. However, machine-learning methods typically require large amounts of training data as well as computational resources. In the past, this has largely limited their application to fMRI experiments involving small sets of stimulus categories and small regions of interest in the brain. By contrast, the present study compares several classification algorithms known as Nearest Neighbor (NN), Gaussian Naïve Bayes (GNB), and (regularized) Linear Discriminant Analysis (LDA) in terms of their classification accuracy in discriminating the global fMRI response patterns evoked by a large number of naturalistic visual stimuli presented as a movie. Results show that LDA regularized by principal component analysis (PCA) achieved high classification accuracies, above 90% on average for single fMRI volumes acquired 2 s apart during a 300 s movie (chance level 0.7% = 2 s/300 s). The largest source of classification errors were autocorrelations in the BOLD signal compounded by the similarity of consecutive stimuli. All classifiers performed best when given input features from a large region of interest comprising around 25% of the voxels that responded significantly to the visual stimulus. Consistent with this, the most informative principal components represented widespread distributions of co-activated brain regions that were similar between subjects and may represent functional networks. 
In light of these results, the combination of naturalistic movie stimuli and classification analysis in fMRI experiments may prove to be a sensitive tool for the assessment of changes in natural cognitive processes under experimental manipulation. PMID:27065832
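The PCA-regularized LDA pipeline this abstract evaluates can be sketched on toy data. The regime is the interesting part: many correlated features ("voxels"), many stimulus classes, few samples per class, where plain LDA is ill-conditioned and a PCA projection regularizes it. The dimensions below are invented, far smaller than real fMRI data:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

# Toy stand-in: 20 stimulus classes, 6 samples each, 500 "voxels".
n_classes, per_class, n_voxels = 20, 6, 500
centers = rng.normal(0, 1.0, size=(n_classes, n_voxels))
X = np.vstack([c + rng.normal(0, 1.0, size=(per_class, n_voxels)) for c in centers])
y = np.repeat(np.arange(n_classes), per_class)

# LDA regularized by PCA: project onto the leading principal components
# first, then fit the linear discriminant in that lower-dimensional space.
clf = make_pipeline(PCA(n_components=40), LinearDiscriminantAnalysis())
acc = cross_val_score(clf, X, y, cv=3).mean()
print(f"classification accuracy: {acc:.2f} (chance = {1 / n_classes:.2f})")
```

Comparing this cross-validated accuracy to chance level (the study's 0.7% chance versus >90% accuracy plays this role) is what quantifies how much stimulus information the response patterns carry.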

  4. Which level of evidence does the US National Toxicology Program provide? Statistical considerations using the Technical Report 578 on Ginkgo biloba as an example.

    PubMed

    Gaus, Wilhelm

    2014-09-02

    The US National Toxicology Program (NTP) is assessed by a statistician. In the NTP program, groups of rodents are fed for a certain period of time with different doses of the substance that is being investigated. Then the animals are sacrificed and all organs are examined pathologically. Such an investigation facilitates many statistical tests. Technical Report TR 578 on Ginkgo biloba is used as an example. More than 4800 statistical tests are possible with the investigations performed. A thought experiment shows that we should expect >240 falsely significant tests. In actuality, 209 significant pathological findings were reported. The readers of Toxicology Letters should carefully distinguish between confirmative and explorative statistics. A confirmative interpretation of a significant test rejects the null-hypothesis and delivers "statistical proof". It is only allowed if (i) a precise hypothesis was established independently of the data used for the test and (ii) the computed p-values are adjusted for multiple testing if more than one test was performed. Otherwise an explorative interpretation generates a hypothesis. We conclude that NTP reports - including TR 578 on Ginkgo biloba - deliver explorative statistics, i.e. they generate hypotheses, but do not prove them. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
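The thought experiment in this abstract is a one-line calculation: with m tests at level alpha and no true effects, the expected number of false significant results is m * alpha, and a confirmative interpretation would require adjusting the per-test level (e.g. Bonferroni):

```python
# Expected false positives under multiple testing, as in the abstract.
m = 4800          # number of statistical tests possible in the NTP report
alpha = 0.05      # conventional significance level

expected_false_positives = m * alpha
print(f"expected false positives: {expected_false_positives:.0f}")

# Bonferroni adjustment: per-test level needed to keep the family-wise
# error rate at 0.05 across all 4800 tests.
alpha_bonferroni = alpha / m
print(f"Bonferroni per-test level: {alpha_bonferroni:.2e}")
```

The 240 expected false positives exceed the 209 significant findings actually reported, which is the core of the argument that the report delivers explorative, not confirmative, statistics.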

  5. Capturing Fine Details Involving Low-Cost Sensors -a Comparative Study

    NASA Astrophysics Data System (ADS)

    Rehany, N.; Barsi, A.; Lovas, T.

    2017-11-01

    Capturing the fine details on the surface of small objects is a real challenge for many conventional surveying methods. Our paper discusses the investigation of several data acquisition technologies, such as arm scanner, structured light scanner, terrestrial laser scanner, object line-scanner, DSLR camera, and mobile phone camera. A palm-sized embossed sculpture reproduction was used as a test object; it has been surveyed by all the instruments. The resulting point clouds and meshes were then analyzed, using the arm scanner's dataset as reference. In addition to general statistics, the results have been evaluated based both on 3D deviation maps and 2D deviation graphs; the latter allows even more accurate analysis of the characteristics of the different data acquisition approaches. Additionally, self-developed local minimum maps were created that nicely visualize the potential level of detail provided by the applied technologies. Besides the usual geometric assessment, the paper discusses the different resource needs (cost, time, expertise) of the discussed techniques. Our results proved that even amateur sensors operated by amateur users can provide high-quality datasets that enable engineering analysis. Based on the results, the paper contains an outlook to potential future investigations in this field.

  6. TU-B-304-01: The Aftermath of TG-142

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, E.

    2015-06-15

    Although published in 2009, the AAPM TG-142 report on accelerator quality assurance still proves a challenge for full clinical implementation. The choice of methodologies to satisfy TG-142 requirements is critical to a successful application. Understanding the philosophy of TG-142 can help in creating an institution-specific QA practice that is both efficient and effective. The concept of maintaining commissioned beam profiles is still found confusing. The physicist must also consider technologies not covered by TG-142 (i.e. arc therapy techniques). On the horizon is the TG-198 report on implementing TG-142. Although the community still lacks a final TG-100 report, performing a failure-mode-and-effects analysis and statistical process control analysis to determine the institution-specific clinical impact of each TG-142 test may be useful for identifying trends for pro-active surveillance. Learning Objectives: To better understand the confusing and controversial aspects of TG-142. To understand what is still missing from TG-142 and how to account for these tests in clinical practice. To describe which QA tests in TG-142 yield the largest potential clinical result if not discovered.

  7. TU-B-304-02: Quantitative FMEA of TG-142

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O’Daniel, J.

    2015-06-15

    Although published in 2009, the AAPM TG-142 report on accelerator quality assurance still proves a challenge for full clinical implementation. The choice of methodologies to satisfy TG-142 requirements is critical to a successful application. Understanding the philosophy of TG-142 can help in creating an institution-specific QA practice that is both efficient and effective. The concept of maintaining commissioned beam profiles is still found confusing. The physicist must also consider technologies not covered by TG-142 (i.e. arc therapy techniques). On the horizon is the TG-198 report on implementing TG-142. Although the community still lacks a final TG-100 report, performing a failure-mode-and-effects analysis and statistical process control analysis to determine the institution-specific clinical impact of each TG-142 test may be useful for identifying trends for pro-active surveillance. Learning Objectives: To better understand the confusing and controversial aspects of TG-142. To understand what is still missing from TG-142 and how to account for these tests in clinical practice. To describe which QA tests in TG-142 yield the largest potential clinical result if not discovered.

  8. TU-B-304-00: The Aftermath of TG-142

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2015-06-15

    Although published in 2009, the AAPM TG-142 report on accelerator quality assurance still proves a challenge for full clinical implementation. The choice of methodologies to satisfy TG-142 requirements is critical to a successful application. Understanding the philosophy of TG-142 can help in creating an institution-specific QA practice that is both efficient and effective. The concept of maintaining commissioned beam profiles is still found confusing. The physicist must also consider technologies not covered by TG-142 (i.e. arc therapy techniques). On the horizon is the TG-198 report on implementing TG-142. Although the community still lacks a final TG-100 report, performing a failure-mode-and-effects analysis and statistical process control analysis to determine the institution-specific clinical impact of each TG-142 test may be useful for identifying trends for pro-active surveillance. Learning Objectives: To better understand the confusing and controversial aspects of TG-142. To understand what is still missing from TG-142 and how to account for these tests in clinical practice. To describe which QA tests in TG-142 yield the largest potential clinical result if not discovered.

  9. Assessment of the impact of a change of samplers on the uncertainty related to geothermal water sampling

    NASA Astrophysics Data System (ADS)

    Wątor, Katarzyna; Mika, Anna; Sekuła, Klaudia; Kmiecik, Ewa

    2018-02-01

    The aim of this study is to assess the impact of a change of samplers on the uncertainty associated with the process of geothermal water sampling. The study was carried out on geothermal water exploited in the Podhale region, southern Poland (Małopolska province). To estimate the uncertainty associated with sampling, the results of determinations of metasilicic acid (H2SiO3) in normal and duplicate samples collected in two series were used (in each series the samples were collected by a qualified sampler). Chemical analyses were performed using the ICP-OES method in the certified Hydrogeochemical Laboratory of the Hydrogeology and Engineering Geology Department at the AGH University of Science and Technology in Krakow (Certificate of Polish Centre for Accreditation No. AB 1050). To evaluate the uncertainty arising from sampling, the empirical approach was implemented, based on double analysis of normal and duplicate samples taken from the same well in the series of testing. The analysis of the results was done using ROBAN software, based on the robust statistics analysis of variance (rANOVA) technique. The research proved that, in the case of qualified and experienced samplers, the uncertainty connected with sampling can be reduced, which results in a small measurement uncertainty.
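The empirical duplicate approach this abstract describes can be sketched with the classical (non-robust) duplicate formula; the study itself uses robust ANOVA (rANOVA) via ROBAN, which downweights outlying pairs. The H2SiO3 concentrations below are invented for illustration:

```python
import numpy as np

# Hypothetical normal/duplicate H2SiO3 determinations (mg/L) from
# repeated sampling of the same well -- not the study's data.
normal    = np.array([58.1, 57.6, 58.4, 57.9, 58.2, 57.7, 58.0, 58.3])
duplicate = np.array([57.9, 57.8, 58.1, 58.0, 58.4, 57.5, 58.2, 58.1])

# For n duplicate pairs, the within-pair variance estimates the combined
# sampling + analysis variance: s^2 = sum(d_i^2) / (2n).
d = normal - duplicate
s_meas = np.sqrt(np.sum(d ** 2) / (2 * len(d)))

# Expanded relative measurement uncertainty (coverage factor k = 2).
mean_conc = np.mean(np.concatenate([normal, duplicate]))
U_rel = 2 * s_meas / mean_conc * 100
print(f"s_meas = {s_meas:.3f} mg/L, expanded relative uncertainty = {U_rel:.2f}%")
```

Comparing this uncertainty across the two sampling series (one per sampler) is what shows whether a change of samplers inflates the sampling contribution.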

  10. Validation of different spectrophotometric methods for determination of vildagliptin and metformin in binary mixture

    NASA Astrophysics Data System (ADS)

    Abdel-Ghany, Maha F.; Abdel-Aziz, Omar; Ayad, Miriam F.; Tadros, Mariam M.

    New, simple, specific, accurate, precise and reproducible spectrophotometric methods have been developed and subsequently validated for the determination of vildagliptin (VLG) and metformin (MET) in a binary mixture. The first method, zero-order spectrophotometry, was used for the determination of MET in the range of 2-12 μg mL-1 by measuring the absorbance at 237.6 nm. The second method, derivative spectrophotometry, was utilized for the determination of MET at 247.4 nm in the range of 1-12 μg mL-1. The third, a derivative ratio spectrophotometric method, was used for the determination of VLG in the range of 4-24 μg mL-1 at 265.8 nm. The fourth and fifth methods, adopted for the determination of VLG in the range of 4-24 μg mL-1, were ratio subtraction and mean centering spectrophotometric methods, respectively. All results were statistically compared with those of reported methods using one-way analysis of variance (ANOVA). The developed methods were satisfactorily applied to the analysis of the investigated drugs and proved to be specific and accurate for their quality control in pharmaceutical dosage forms.

  11. Application of Visual Attention in Seismic Attribute Analysis

    NASA Astrophysics Data System (ADS)

    He, M.; Gu, H.; Wang, F.

    2016-12-01

    It has been proved that seismic attributes can be used to predict reservoir properties. The combination of multi-attribute analysis with geological statistics, data mining and artificial intelligence has further promoted the development of seismic attribute analysis. However, the existing methods tend to suffer from non-unique solutions and insufficient generalization ability, which is due mainly to the complex relationship between seismic data and geological information, and partly to the methods applied. Visual attention is a mechanism model of the human visual system that can rapidly concentrate on a few significant visual objects, even in a cluttered scene; the model therefore offers good target detection and recognition ability. In our study, the targets to be predicted are treated as visual objects, and an object representation based on well data is built in the attribute dimensions. Then, in the same attribute space, this representation serves as a criterion to search for potential targets away from the wells. The method does not need to predict properties by building a complicated relation between attributes and reservoir properties, but instead refers to the previously determined standard. It therefore has good generalization ability, and the problem of non-uniqueness can be reduced by defining a similarity threshold.
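
    A minimal sketch of this template-matching idea, with entirely hypothetical attribute values: the mean attribute vector at the wells acts as the "visual object" representation, and locations outside the wells are kept only if their similarity to it exceeds a threshold:

```python
import math

# Hypothetical normalized attribute vectors observed at producing wells.
well_attrs = [[0.82, 0.31, 0.55], [0.78, 0.35, 0.60], [0.85, 0.28, 0.52]]
# The "visual object" template: the mean attribute vector of the wells.
template = [sum(col) / len(well_attrs) for col in zip(*well_attrs)]

def similarity(vec, ref):
    """Inverse-distance similarity in attribute space (1.0 = identical)."""
    d = math.sqrt(sum((v - r) ** 2 for v, r in zip(vec, ref)))
    return 1.0 / (1.0 + d)

# Candidate locations outside the wells, scored against the template;
# only those above a similarity threshold are flagged as potential targets.
candidates = {"locA": [0.80, 0.33, 0.56], "locB": [0.20, 0.90, 0.10]}
threshold = 0.9
targets = [name for name, vec in candidates.items()
           if similarity(vec, template) >= threshold]
print(targets)
```

    The threshold plays the role described in the abstract: raising it tightens the match to the well-derived standard and suppresses spurious solutions.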

  12. Evaluation of Oil-Palm Fungal Disease Infestation with Canopy Hyperspectral Reflectance Data

    PubMed Central

    Lelong, Camille C. D.; Roger, Jean-Michel; Brégand, Simon; Dubertret, Fabrice; Lanore, Mathieu; Sitorus, Nurul A.; Raharjo, Doni A.; Caliman, Jean-Pierre

    2010-01-01

    Fungal disease detection in perennial crops is a major issue in estate management and production. However, nowadays such diagnostics are long and difficult when made only from visual symptom observation, and very expensive and damaging when based on chemical analysis of root or stem tissue. As an alternative, we propose in this study to evaluate the potential of hyperspectral reflectance data to help detect the disease efficiently without destruction of tissues. This study focuses on the calibration of a statistical model of discrimination between several stages of Ganoderma attack on oil palm trees, based on field hyperspectral measurements at tree scale. Field protocol and measurements are first described. Then, combinations of pre-processing, partial least squares regression and linear discriminant analysis are tested on about one hundred samples to prove the efficiency of canopy reflectance in providing information about the plant sanitary status. A robust algorithm is thus derived, allowing oil palms to be classified into a 4-level typology based on disease severity, from healthy to critically sick stages, with a global performance close to 94%. Moreover, this model discriminates sick from healthy trees with a confidence level of almost 98%. Applications and further improvements of this experiment are finally discussed. PMID:22315565

  13. phenoVein—A Tool for Leaf Vein Segmentation and Analysis

    PubMed Central

    Pflugfelder, Daniel; Huber, Gregor; Scharr, Hanno; Hülskamp, Martin; Koornneef, Maarten; Jahnke, Siegfried

    2015-01-01

    Precise measurements of leaf vein traits are an important aspect of plant phenotyping for ecological and genetic research. Here, we present a powerful and user-friendly image analysis tool named phenoVein. It is dedicated to the automated segmentation and analysis of leaf veins in images acquired with different imaging modalities (microscope, macrophotography, etc.), including options for comfortable manual correction. Advanced image filtering emphasizes veins against the background and compensates for local brightness inhomogeneities. The most important traits calculated are total vein length, vein density, piecewise vein lengths and widths, areole area, and skeleton graph statistics such as the number of branching or ending points. For the determination of vein widths, a model-based vein edge estimation approach has been implemented. Validation was performed for the measurement of vein length, vein width, and vein density of Arabidopsis (Arabidopsis thaliana), proving the reliability of phenoVein. We demonstrate the power of phenoVein on a set of previously described vein structure mutants of Arabidopsis (hemivenata, ondulata3, and asymmetric leaves2-101) compared with the wild-type accessions Columbia-0 and Landsberg erecta-0. phenoVein is freely available as open-source software. PMID:26468519

  14. The application of LANDSAT-1 imagery for monitoring strip mines in the new river watershed in northeast Tennessee, part 2

    NASA Technical Reports Server (NTRS)

    Shahrokhi, F. (Principal Investigator); Sharber, L. A.

    1977-01-01

    The author has identified the following significant results. LANDSAT imagery and supplementary aircraft photography of the New River drainage basin were subjected to a multilevel analysis using conventional photointerpretation methods, densitometric techniques, multispectral analysis, and statistical tests to determine the accuracy of LANDSAT-1 imagery for measuring strip mines of common size. The LANDSAT areas were compared with low altitude measurements. The average accuracy over all the mined land sample areas mapped from LANDSAT-1 was 90%. The discrimination of strip mine subcategories is somewhat limited on LANDSAT imagery. A mine site, whether active or inactive, can be inferred from lack of vegetation, shape, or image texture. Mine ponds are difficult or impossible to detect because of their small size and turbidity. Unless bordered by and contrasted with vegetation, haulage roads are impossible to delineate. Preparation plants and refuse areas are not detectable. Density slicing of LANDSAT band 7 proved most useful in the detection of reclamation progress within the mined areas. For most state requirements for year-round monitoring of surface-mined land, LANDSAT is of limited value. However, for periodic updating of regional surface mine maps, LANDSAT may provide sufficient accuracy for some users.

  15. Statistical Approach To Estimate Vaccinia-Specific Neutralizing Antibody Titers Using a High-Throughput Assay

    PubMed Central

    Kennedy, Richard; Pankratz, V. Shane; Swanson, Eric; Watson, David; Golding, Hana; Poland, Gregory A.

    2009-01-01

    Because of the bioterrorism threat posed by agents such as variola virus, considerable time, resources, and effort have been devoted to biodefense preparation. One avenue of this research has been the development of rapid, sensitive, high-throughput assays to validate immune responses to poxviruses. Here we describe the adaptation of a β-galactosidase reporter-based vaccinia virus neutralization assay to large-scale use in a study that included over 1,000 subjects. We also describe the statistical methods involved in analyzing the large quantity of data generated. The assay and its associated methods should prove useful tools in monitoring immune responses to next-generation smallpox vaccines, studying poxvirus immunity, and evaluating therapeutic agents such as vaccinia virus immune globulin. PMID:19535540

  16. q-triplet for Brazos River discharge: The edge of chaos?

    NASA Astrophysics Data System (ADS)

    Stosic, Tatijana; Stosic, Borko; Singh, Vijay P.

    2018-04-01

    We study the daily discharge data of Brazos River in Texas, USA, from 1900 to 2017, in terms of concepts drawn from the non-extensive statistics recently introduced by Tsallis. We find that the Brazos River discharge indeed follows non-extensive statistics regarding equilibrium, relaxation and sensitivity. Besides being the first such finding of a full-fledged q-triplet in hydrological data with possible future impact on water resources management, the fact that all three Tsallis q-triplet values are remarkably close to those of the logistic map at the onset of chaos opens up new questions towards a deeper understanding of the Brazos River dynamics, that may prove relevant for hydrological research in a more general sense.

  17. A Geometrical Approach to Bell's Theorem

    NASA Technical Reports Server (NTRS)

    Rubincam, David Parry

    2000-01-01

    Bell's theorem can be proved through simple geometrical reasoning, without the need for the Psi function, probability distributions, or calculus. The proof is based on N. David Mermin's explication of the Einstein-Podolsky-Rosen-Bohm experiment, which involves Stern-Gerlach detectors which flash red or green lights when detecting spin-up or spin-down. The statistics of local hidden variable theories for this experiment can be arranged in colored strips from which simple inequalities can be deduced. These inequalities lead to a demonstration of Bell's theorem. Moreover, all local hidden variable theories can be graphed in such a way as to enclose their statistics in a pyramid, with the quantum-mechanical result lying a finite distance beneath the base of the pyramid.

  18. Societal Statistics by virtue of the Statistical Drake Equation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2012-09-01

    The Drake equation, first proposed by Frank D. Drake in 1961, is the foundational equation of SETI. It yields an estimate of the number N of extraterrestrial communicating civilizations in the Galaxy given by the product N=Ns×fp×ne×fl×fi×fc×fL, where: Ns is the number of stars in the Milky Way Galaxy; fp is the fraction of stars that have planetary systems; ne is the number of planets in a given system that are ecologically suitable for life; fl is the fraction of otherwise suitable planets on which life actually arises; fi is the fraction of inhabited planets on which an intelligent form of life evolves; fc is the fraction of planets inhabited by intelligent beings on which a communicative technical civilization develops; and fL is the fraction of planetary lifetime graced by a technical civilization. The first three terms may be called "the astrophysical terms" in the Drake equation since their numerical value is provided by astrophysical considerations. The fourth term, fl, may be called "the origin-of-life term" and entails biology. The last three terms may be called "the societal terms" inasmuch as their respective numerical values are provided by anthropology, telecommunication science and "futuristic science", respectively. In this paper, we seek to provide a statistical estimate of the three societal terms in the Drake equation basing our calculations on the Statistical Drake Equation first proposed by this author at the 2008 IAC. In that paper the author extended the simple 7-factor product so as to embody Statistics. He proved that, no matter which probability distribution may be assigned to each factor, if the number of factors tends to infinity, then the random variable N follows the lognormal distribution (central limit theorem of Statistics). This author also proved at the 2009 IAC that the Dole (1964) [7] equation, yielding the number of Habitable Planets for Man in the Galaxy, has the same mathematical structure as the Drake equation. 
So the number of Habitable Planets follows the lognormal distribution as well. But the Dole equation is described by the first FOUR factors of the Drake equation. Thus, we may "divide" the 7-factor Drake equation by the 4-factor Dole equation getting the probability distribution of the last-3-factor Drake equation, i.e. the probability distribution of the SOCIETAL TERMS ONLY. These we study in detail in this paper, achieving new statistical results about the SOCIETAL ASPECTS OF SETI.
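
    The lognormal limit invoked above is the central limit theorem applied to log N, the sum of the logarithms of the factors. A quick Monte Carlo sketch (with arbitrary illustrative uniform ranges, not actual Drake inputs) shows how close to normal log N already is for a 7-factor product:

```python
import math
import random
import statistics

random.seed(42)

# Product of 7 independent positive factors standing in for
# Ns, fp, ne, fl, fi, fc, fL -- the ranges are purely illustrative.
def draw_N():
    return math.prod(random.uniform(0.1, 1.0) for _ in range(7))

# By the central limit theorem, log N (a sum of 7 independent terms)
# should already be close to normal, i.e. N close to lognormal.
logs = [math.log(draw_N()) for _ in range(100_000)]
mu = statistics.fmean(logs)
sigma = statistics.stdev(logs)

# Sample skewness of log N: a single log-uniform factor has skewness of
# roughly -0.78, while the 7-factor sum shrinks it by a factor sqrt(7).
skew = statistics.fmean(((x - mu) / sigma) ** 3 for x in logs)
print(f"log N: mean={mu:.3f}, sd={sigma:.3f}, skewness={skew:.3f}")
```

    With more factors the skewness of log N shrinks further, which is the sense in which N tends to the lognormal distribution as the number of factors grows.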

  19. Development of computer-assisted instruction application for statistical data analysis android platform as learning resource

    NASA Astrophysics Data System (ADS)

    Hendikawati, P.; Arifudin, R.; Zahid, M. Z.

    2018-03-01

    This study aims to design an Android statistics data analysis application that can be accessed through mobile devices, making it easier for users to access. The application covers various topics in basic statistics along with parametric statistical data analysis. The output of the system is parametric statistical analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is written in the Java programming language. The server side uses PHP with the CodeIgniter framework, and the database used is MySQL. The system development methodology is the waterfall methodology, with the stages of analysis, design, coding, testing, implementation and system maintenance. This statistical data analysis application is expected to support statistics lectures and to make it easier for students to understand statistical analysis on mobile devices.

  20. Orphan therapies: making best use of postmarket data.

    PubMed

    Maro, Judith C; Brown, Jeffrey S; Dal Pan, Gerald J; Li, Lingling

    2014-08-01

    Postmarket surveillance of the comparative safety and efficacy of orphan therapeutics is challenging, particularly when multiple therapeutics are licensed for the same orphan indication. To make best use of product-specific registry data collected to fulfill regulatory requirements, we propose the creation of a distributed electronic health data network among registries. Such a network could support sequential statistical analyses designed to detect early warnings of excess risks. We use a simulated example to explore the circumstances under which a distributed network may prove advantageous. We perform sample size calculations for sequential and non-sequential statistical studies aimed at comparing the incidence of hepatotoxicity following initiation of two newly licensed therapies for homozygous familial hypercholesterolemia. We calculate the sample size savings ratio, or the proportion of sample size saved if one conducted a sequential study as compared to a non-sequential study. Then, using models to describe the adoption and utilization of these therapies, we simulate when these sample sizes are attainable in calendar years. We then calculate the analytic calendar time savings ratio, analogous to the sample size savings ratio. We repeat these analyses for numerous scenarios. Sequential analyses detect effect sizes earlier or at the same time as non-sequential analyses. The most substantial potential savings occur when the market share is more imbalanced (i.e., 90% for therapy A) and the effect size is closest to the null hypothesis. However, due to low exposure prevalence, these savings are difficult to realize within the 30-year time frame of this simulation for scenarios in which the outcome of interest occurs at or more frequently than one event/100 person-years. 
We illustrate a process to assess whether sequential statistical analyses of registry data performed via distributed networks may prove a worthwhile infrastructure investment for pharmacovigilance.
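
    The sample size savings ratio defined above can be illustrated with a toy calculation. Every number here is hypothetical (the event rates, the power, and the sequential design's expected sample size); the fixed-design size uses the standard normal-approximation formula for comparing two proportions:

```python
import math
from statistics import NormalDist

def fixed_n_per_group(p0, p1, alpha=0.05, power=0.9):
    """Per-group size for a two-proportion z-test (normal approximation)."""
    za = NormalDist().inv_cdf(1 - alpha / 2)
    zb = NormalDist().inv_cdf(power)
    pbar = (p0 + p1) / 2
    num = (za * math.sqrt(2 * pbar * (1 - pbar))
           + zb * math.sqrt(p0 * (1 - p0) + p1 * (1 - p1))) ** 2
    return math.ceil(num / (p0 - p1) ** 2)

# Hypothetical hepatotoxicity risks on therapies A and B.
n_fixed = fixed_n_per_group(0.01, 0.03)

# Hypothetical expected sample size of a sequential monitoring design
# that stops early once a safety signal (or futility) is established.
n_seq_expected = 600
savings_ratio = 1 - n_seq_expected / n_fixed
print(n_fixed, f"savings ratio = {savings_ratio:.3f}")
```

    The paper's point is that this ratio translates into saved calendar time only when exposure accrues fast enough, which is why the low prevalence of orphan-drug use limits the savings in practice.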

  1. [Comparative analysis of mental health treatment of various groups of competitive athletes].

    PubMed

    Kaufmann, C; Reiter, N; Barolin, G S

    1995-01-01

    1) There is both sense in and necessity for psychohygiene in sports. This could be established with three compared groups: a national team, a team of handicapped athletes, and a team of juveniles. 2) We used a complex program with integrated autogenic training (not, however, in an isolated way). The last author's earlier experience over many years with hundreds of athletes was integrated into our evaluation. 3) To our knowledge, this is the first time in the world literature that the effectiveness of psychohygiene in sports could be proved with statistical significance, using world cup points within comparable groups. 4) Only what is possible can be achieved; psychohygiene helps to approach this optimum but not to exceed it. This is the main differentiation from doping. The human being and humanity are the main goal, not the so-called "necessities" of a dictatorship of sport. It is a good sign that several of our candidates told us that our psychohygienic program has carried over from their period of sports activity into their "normal" lives.

  2. Assessment of vertical excursions and open-sea psychological performance at depths to 250 fsw.

    PubMed

    Miller, J W; Bachrach, A J; Walsh, J M

    1976-12-01

    A series of 10 two-man descending vertical excursion dives was carried out in the open sea from an ocean-floor habitat off the coast of Puerto Rico by four aquanauts saturated on a normoxic-nitrogen breathing mixture at a depth of 106 fsw. The purpose of these dives was two-fold: to validate laboratory findings with respect to decompression schedules and to determine whether such excursions would produce evidence of adaptation to nitrogen narcosis. For the latter, tests designed to measure time estimation, short-term memory, and auditory vigilance were used. The validation of the experimental excursion tables was carried out without incidence of decompression sickness. Although no signs of nitrogen narcosis were noted during testing, all subjects made significantly longer time estimates in the habitat and during the excursions than on the surface. Variability and incomplete data prevented a statistical analysis of the short-term memory results, and the auditory vigilance test proved unusable in the water.

  3. The quality of our drinking water: aluminium determination with an acoustic wave sensor.

    PubMed

    Veríssimo, Marta I S; Gomes, M Teresa S R

    2008-06-09

    A new methodology based on an inexpensive aluminium acoustic wave sensor is presented. Although the aluminium sensor has already been reported and the composition of the selective membrane is known, the low detection limits required for the analysis of drinking water demanded the inclusion of a preconcentration stage, as well as an optimization of the sensor. The necessary coating amount was established, as was the best preconcentration protocol in terms of oxidation of organic matter and aluminium elution from the Chelex-100 resin. The methodology developed with the acoustic wave sensor allowed aluminium quantitation above 0.07 mg L(-1). Several water samples from Portugal were analysed using the acoustic wave sensor as well as by UV-vis spectrophotometry. Results obtained with the two methodologies were not statistically different (alpha = 0.05) in terms of either accuracy or precision. The new methodology proved adequate for aluminium quantitation in drinking water and was faster and less reagent-consuming than the UV spectrophotometric methodology.

  4. Extended maximum likelihood halo-independent analysis of dark matter direct detection data

    DOE PAGES

    Gelmini, Graciela B.; Georgescu, Andreea; Gondolo, Paolo; ...

    2015-11-24

    We extend and correct a recently proposed maximum-likelihood halo-independent method to analyze unbinned direct dark matter detection data. Instead of the recoil energy as independent variable we use the minimum speed a dark matter particle must have to impart a given recoil energy to a nucleus. This has the advantage of allowing us to apply the method to any type of target composition and interaction, e.g. with general momentum and velocity dependence, and with elastic or inelastic scattering. We prove the method and provide a rigorous statistical interpretation of the results. As first applications, we find that for dark matter particles with elastic spin-independent interactions and neutron to proton coupling ratio fn/fp=-0.7, the WIMP interpretation of the signal observed by CDMS-II-Si is compatible with the constraints imposed by all other experiments with null results. We also find a similar compatibility for exothermic inelastic spin-independent interactions with fn/fp=-0.8.

  5. GENOME-WIDE GENETIC INTERACTION ANALYSIS OF GLAUCOMA USING EXPERT KNOWLEDGE DERIVED FROM HUMAN PHENOTYPE NETWORKS

    PubMed Central

    HU, TING; DARABOS, CHRISTIAN; CRICCO, MARIA E.; KONG, EMILY; MOORE, JASON H.

    2014-01-01

    The large volume of GWAS data poses great computational challenges for analyzing genetic interactions associated with common human diseases. We propose a computational framework for characterizing epistatic interactions among large sets of genetic attributes in GWAS data. We build the human phenotype network (HPN) and focus on the neighborhood around a disease of interest. In this study, we use the GLAUGEN glaucoma GWAS dataset and apply the HPN as a biological knowledge-based filter to prioritize genetic variants. Then, we use the statistical epistasis network (SEN) to identify a significant connected network of pairwise epistatic interactions among the prioritized SNPs. These clearly highlight the complex genetic basis of glaucoma. Furthermore, we identify key SNPs by quantifying structural network characteristics. Through functional annotation of these key SNPs using Biofilter, a software tool that accesses multiple publicly available human genetic data sources, we find supporting biomedical evidence linking glaucoma to an array of genetic diseases, proving our concept. We conclude by suggesting hypotheses for a better understanding of the disease. PMID:25592582

  6. Molecular diversity of arbuscular mycorrhizal fungi in relation to soil chemical properties and heavy metal contamination.

    PubMed

    Zarei, Mehdi; Hempel, Stefan; Wubet, Tesfaye; Schäfer, Tina; Savaghebi, Gholamreza; Jouzani, Gholamreza Salehi; Nekouei, Mojtaba Khayam; Buscot, François

    2010-08-01

    Abundance and diversity of arbuscular mycorrhizal fungi (AMF) associated with dominant plant species were studied along a transect from highly lead (Pb) and zinc (Zn) polluted to non-polluted soil at the Anguran open pit mine in Iran. Using an established primer set for AMF in the internal transcribed spacer (ITS) region of rDNA, nine different AMF sequence types were distinguished after phylogenetic analyses, showing remarkable differences in their distribution patterns along the transect. With decreasing Pb and Zn concentrations the number of AMF sequence types increased; however, one sequence type was found only in the highly contaminated area. Multivariate statistical analysis revealed that factors other than heavy-metal soil concentration affect the AMF community at contaminated sites. Specifically, the soils' calcium carbonate equivalent and available P proved to be of importance, which illustrates that field studies on AMF distribution should also consider important environmental factors and their possible interactions.

  7. Predicting The Type Of Pregnancy Using Flexible Discriminant Analysis And Artificial Neural Networks: A Comparison Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hooman, A.; Mohammadzadeh, M

    Some medical and epidemiological surveys are designed to predict a nominal response variable with several levels. With regard to the type of pregnancy there are four possible states: wanted, unwanted by wife, unwanted by husband and unwanted by couple. In this paper, we have predicted the type of pregnancy, as well as the factors influencing it, using three different models and comparing them. For this multi-level response, we fitted a multinomial logistic regression, a neural network and a flexible discriminant analysis to the data and compared their results using two statistical indices: the area under the ROC curve and the kappa coefficient. Based on these two indices, flexible discriminant analysis proved a better fit for prediction on these data in comparison to the other methods. When the relations among variables are complex, one can use flexible discriminant analysis instead of multinomial logistic regression or a neural network to predict nominal response variables with several levels, in order to obtain more accurate predictions.

  8. Effect of Friction Stir Process Parameters on the Mechanical and Thermal Behavior of 5754-H111 Aluminum Plates.

    PubMed

    Serio, Livia Maria; Palumbo, Davide; De Filippis, Luigi Alberto Ciro; Galietti, Umberto; Ludovico, Antonio Domenico

    2016-02-23

    A study of the Friction Stir Welding (FSW) process was carried out in order to evaluate the influence of process parameters on the mechanical properties of aluminum plates (AA5754-H111). The process was monitored during each test by means of infrared cameras in order to correlate temperature information with possible changes in the mechanical properties of the joints. In particular, two process parameters were considered for the tests: the welding tool rotation speed and the welding tool traverse speed. The quality of the joints was evaluated by means of destructive and non-destructive tests. In this regard, the presence of defects and the ultimate tensile strength (UTS) were investigated for each combination of the process parameters. A statistical analysis was carried out to assess the correlation between the thermal behavior of the joints and the process parameters, also proving the capability of infrared thermography for on-line monitoring of joint quality.

  9. Statistical properties of color-signal spaces.

    PubMed

    Lenz, Reiner; Bui, Thanh Hai

    2005-05-01

    In applications of principal component analysis (PCA) it has often been observed that the eigenvector with the largest eigenvalue has only nonnegative entries when the vectors of the underlying stochastic process have only nonnegative values. This has been used to show that the coordinate vectors in PCA are all located in a cone. We prove that the nonnegativity of the first eigenvector follows from Perron-Frobenius theory (and Krein-Rutman theory). Experiments also show that for stochastic processes with nonnegative signals the mean vector is often very similar to the first eigenvector. This is not true in general, but we first give a heuristic explanation of why such a similarity can be expected. We then derive a connection between the dominance of the first eigenvalue and the similarity between the mean and the first eigenvector, and we show how to check the relative size of the first eigenvalue without actually computing it. In the last part of the paper we discuss the implications of the theoretical results for multispectral color processing.
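
    The nonnegativity claim is easy to reproduce numerically. The sketch below (hypothetical data, pure-Python power iteration on the second-moment matrix, whose entries are all nonnegative for nonnegative signals, so Perron-Frobenius applies) recovers a nonnegative dominant eigenvector that nearly coincides with the mean direction:

```python
import math
import random

random.seed(7)

# 400 hypothetical nonnegative 5-band signals (e.g. reflectance values).
signals = [[random.random() for _ in range(5)] for _ in range(400)]
d = 5

# Second-moment matrix S = (1/n) * sum of x x^T: every entry nonnegative.
S = [[sum(x[i] * x[j] for x in signals) / len(signals) for j in range(d)]
     for i in range(d)]

def normalize(v):
    nrm = math.sqrt(sum(c * c for c in v))
    return [c / nrm for c in v]

# Power iteration converges to the eigenvector of the largest eigenvalue.
v = normalize([1.0] * d)
for _ in range(200):
    v = normalize([sum(S[i][j] * v[j] for j in range(d)) for i in range(d)])

# Compare with the (normalized) mean vector of the signals.
mean_dir = normalize([sum(x[i] for x in signals) / len(signals)
                      for i in range(d)])
cosine = sum(a * b for a, b in zip(v, mean_dir))
print("nonnegative:", all(c >= 0 for c in v), f"cos = {cosine:.4f}")
```

    The near-unit cosine illustrates the mean/first-eigenvector similarity the paper analyzes; as the authors note, this closeness holds often but not in general.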

  11. Optimization of succinic acid fermentation with Actinobacillus succinogenes by response surface methodology (RSM)*

    PubMed Central

    Zhang, Yun-jian; Li, Qiang; Zhang, Yu-xiu; Wang, Dan; Xing, Jian-min

    2012-01-01

    Succinic acid is considered as an important platform chemical. Succinic acid fermentation with Actinobacillus succinogenes strain BE-1 was optimized by central composite design (CCD) using a response surface methodology (RSM). The optimized production of succinic acid was predicted and the interactive effects between glucose, yeast extract, and magnesium carbonate were investigated. As a result, a model for predicting the concentration of succinic acid production was developed. The accuracy of the model was confirmed by the analysis of variance (ANOVA), and the validity was further proved by verification experiments showing that percentage errors between actual and predicted values varied from 3.02% to 6.38%. In addition, it was observed that the interactive effect between yeast extract and magnesium carbonate was statistically significant. In conclusion, RSM is an effective and useful method for optimizing the medium components and investigating the interactive effects, and can provide valuable information for succinic acid scale-up fermentation using A. succinogenes strain BE-1. PMID:22302423

  12. [What motivates smoking and alcohol drinking of young people? A behavioural epidemiologic study].

    PubMed

    Pikó, Bettina; Varga, Szabolcs

    2014-01-19

    Adolescence is a life period of experimenting with harmful habits, so mapping young people's motivations is helpful for prevention. The main goal of the present study was to investigate high school students' motivations related to alcohol and cigarette use. A questionnaire survey was performed in Debrecen including students from four high schools (n = 501; age range, 14-22 years; mean age, 16.4 years; 34% boys and 66% girls). Beyond descriptive statistics, logistic regression analysis was used to estimate odds ratios describing relationships between substance use and motivations. Besides a slight gender difference, there were significant differences in the structure of motivations by substance-user status. For alcohol use, social motivation proved to be a predictor. For cigarette smoking, besides social motivation, boredom relief and affect regulation (coping) were also significant. These data suggest that young people start to smoke cigarettes and drink alcohol in social situations due to peer pressure. Therefore, prevention strategies should be built on social skills training.

  13. Multivariate pattern recognition for diagnosis and prognosis in clinical neuroimaging: state of the art, current challenges and future trends.

    PubMed

    Haller, Sven; Lovblad, Karl-Olof; Giannakopoulos, Panteleimon; Van De Ville, Dimitri

    2014-05-01

    Many diseases are associated with systematic modifications in brain morphometry and function. These alterations may be subtle, in particular at early stages of the disease progress, and thus not evident by visual inspection alone. Group-level statistical comparisons have dominated neuroimaging studies for many years, providing fascinating insights into brain regions involved in various diseases. However, such group-level results do not translate into diagnostic value for individual patients. Recently, pattern recognition approaches have led to a fundamental shift in paradigm, bringing multivariate analysis and predictive results, notably for the early diagnosis of individual patients. We review the state-of-the-art fundamentals of pattern recognition including feature selection, cross-validation and classification techniques, as well as limitations including inter-individual variation in normal brain anatomy and neurocognitive reserve. We conclude with a discussion of future trends including multi-modal pattern recognition, multi-center approaches with data-sharing and cloud-computing.
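    Cross-validation, listed among the fundamentals here, partitions subjects into folds so that each subject is used exactly once for testing; a minimal index-splitting sketch (generic, not tied to any particular classifier):

```python
def kfold_indices(n, k):
    """Yield (train, test) index lists for k-fold cross-validation over n subjects."""
    folds, start = [], 0
    for i in range(k):
        # Distribute the remainder so fold sizes differ by at most one.
        size = n // k + (1 if i < n % k else 0)
        folds.append(list(range(start, start + size)))
        start += size
    for i in range(k):
        test = folds[i]
        train = [idx for j, fold in enumerate(folds) if j != i for idx in fold]
        yield train, test
```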

  14. A characterization of horizontal visibility graphs and combinatorics on words

    NASA Astrophysics Data System (ADS)

    Gutin, Gregory; Mansour, Toufik; Severini, Simone

    2011-06-01

    A Horizontal Visibility Graph (HVG) is defined in association with an ordered set of non-negative reals. HVGs realize a methodology in the analysis of time series, their degree distribution being a good discriminator between randomness and chaos [B. Luque, L. Lacasa, F. Ballesteros, J. Luque, Horizontal visibility graphs: exact results for random time series, Phys. Rev. E 80 (2009), 046103]. We prove that a graph is an HVG if and only if it is outerplanar and has a Hamilton path. Therefore, an HVG is a noncrossing graph, as defined in algebraic combinatorics [P. Flajolet, M. Noy, Analytic combinatorics of noncrossing configurations, Discrete Math. 204 (1999) 203-229]. Our characterization of HVGs implies a linear-time recognition algorithm. Treating ordered sets as words, we characterize subfamilies of HVGs, highlighting various connections with combinatorial statistics and introducing the notion of a visible pair. With this technique, we determine asymptotically the average number of edges of HVGs.
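    The HVG of an ordered set follows directly from the horizontal visibility rule: two data points see each other if every value strictly between them is smaller than both. A simple quadratic-time sketch (the paper's own recognition algorithm is linear-time; this is only the definition made executable):

```python
def hvg_edges(series):
    """Edges of the horizontal visibility graph of an ordered set of reals:
    i and j are adjacent iff every value strictly between them is smaller
    than both series[i] and series[j]."""
    n = len(series)
    return [(i, j)
            for i in range(n)
            for j in range(i + 1, n)
            if all(series[m] < min(series[i], series[j])
                   for m in range(i + 1, j))]
```

Consecutive points are always adjacent (the intermediate range is empty), so every HVG contains the Hamilton path 0-1-…-(n-1), consistent with the characterization above.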

  15. Influence of Elevation Data Source on 2D Hydraulic Modelling

    NASA Astrophysics Data System (ADS)

    Bakuła, Krzysztof; Stępnik, Mateusz; Kurczyński, Zdzisław

    2016-08-01

    The aim of this paper is to analyse the influence of various elevation data sources on hydraulic modelling in open channels. In the research, digital terrain models from different datasets were evaluated and used in two-dimensional hydraulic models. The following aerial and satellite elevation data were used to create the terrain representation (digital terrain model): airborne laser scanning (ALS), image matching, elevation data collected in the LPIS, EuroDEM, and ASTER GDEM. From the results of five 2D hydrodynamic models with different input elevation data, the maximum depth and flow velocity of water were derived and compared with the results obtained with the most accurate ALS data. For this analysis, a statistical evaluation of the differences between the hydraulic modelling results was prepared. The presented research proved the importance of elevation data quality in hydraulic modelling and showed that only ALS and photogrammetric data can serve as the most reliable elevation data sources for accurate 2D hydraulic modelling.
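    A statistical comparison of model outputs against a reference run (here the ALS-based model) typically reduces to error metrics computed cell by cell over the grid; a minimal sketch with hypothetical depth values, not the study's data:

```python
import math

def rmse(reference, candidate):
    """Root-mean-square difference between two equally sized grids of
    maximum water depth, flattened to 1-D lists (values in metres)."""
    n = len(reference)
    return math.sqrt(sum((r - c) ** 2 for r, c in zip(reference, candidate)) / n)
```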

  16. Vibrational monitor of early demineralization in tooth enamel after in vitro exposure to phosphoridic liquid

    NASA Astrophysics Data System (ADS)

    Pezzotti, Giuseppe; Adachi, Tetsuya; Gasparutti, Isabella; Vincini, Giulio; Zhu, Wenliang; Boffelli, Marco; Rondinella, Alfredo; Marin, Elia; Ichioka, Hiroaki; Yamamoto, Toshiro; Marunaka, Yoshinori; Kanamura, Narisato

    2017-02-01

    The Raman spectroscopic method has been applied to quantitatively assess the in vitro degree of demineralization in healthy human teeth. Based on previous evaluations of Raman selection rules (empowered by an orientation distribution function (ODF) statistical algorithm) and on a newly proposed analysis of phonon density of states (PDOS) for selected vibrational modes of the hexagonal structure of hydroxyapatite, a molecular-scale evaluation of the demineralization process upon in vitro exposure to a highly acidic beverage (i.e., CocaCola™ Classic, pH = 2.5) could be obtained. The Raman method proved quite sensitive, and spectroscopic features could be directly related to an increase in off-stoichiometry of the enamel surface structure from the very early stages of the demineralization process (i.e., while still invisible to other conventional analytical techniques). The proposed Raman spectroscopic algorithm might possess some generality for caries risk assessment, allowing a prompt non-contact diagnostic practice in dentistry.

  17. Study of Dimple Effect on the Friction Characteristics of a Journal Bearing using Taguchi Method

    NASA Astrophysics Data System (ADS)

    Murthy, A. Amar; Raghunandana, Dr.

    2018-02-01

    The effect of dimples, produced by chemical etching or machining on the bushing surface of a journal bearing, on friction reduction is investigated using the Taguchi method. The data used in the present analysis are based on results from a series of experiments conducted to study the effect of dimples on the Stribeck curve. It is statistically proved that producing dimples on the bushing surface of a journal bearing has a significant effect on the friction coefficient when used with light oils. An interaction effect is also observed between speed-load and load-dimples. Hence the interaction effects, which are usually neglected, should be considered during actual experiments, as they contribute significantly to reducing friction in the mixed lubrication regime. Had the experiments been designed with the Taguchi method, the number of experiments would have been reduced to half of the set actually conducted.
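    The halving of experimental effort comes from replacing a full factorial with an orthogonal array: for three two-level factors the standard Taguchi L4 array needs 4 runs instead of the full factorial's 2³ = 8. A generic sketch (coded levels 0/1, not the study's actual factors or levels):

```python
from itertools import combinations

# Taguchi L4 orthogonal array: 4 runs for three two-level factors,
# versus 2**3 = 8 runs in the full factorial.
L4 = [
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]

def is_orthogonal(array):
    """True if every pair of columns contains each level combination
    equally often (the balance property of an orthogonal array)."""
    cols = list(zip(*array))
    for c1, c2 in combinations(cols, 2):
        pairs = list(zip(c1, c2))
        counts = {p: pairs.count(p) for p in set(pairs)}
        if len(set(counts.values())) != 1:
            return False
    return True
```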

  18. Effect of Friction Stir Process Parameters on the Mechanical and Thermal Behavior of 5754-H111 Aluminum Plates

    PubMed Central

    Serio, Livia Maria; Palumbo, Davide; De Filippis, Luigi Alberto Ciro; Galietti, Umberto; Ludovico, Antonio Domenico

    2016-01-01

    A study of the Friction Stir Welding (FSW) process was carried out in order to evaluate the influence of process parameters on the mechanical properties of aluminum plates (AA5754-H111). The process was monitored during each test by means of infrared cameras in order to correlate temperature information with possible changes in the mechanical properties of joints. In particular, two process parameters were considered for tests: the welding tool rotation speed and the welding tool traverse speed. The quality of joints was evaluated by means of destructive and non-destructive tests. In this regard, the presence of defects and the ultimate tensile strength (UTS) were investigated for each combination of the process parameters. A statistical analysis was carried out to assess the correlation between the thermal behavior of joints and the process parameters, also proving the capability of Infrared Thermography for on-line monitoring of the quality of joints. PMID:28773246

  19. An advection-diffusion-reaction size-structured fish population dynamics model combined with a statistical parameter estimation procedure: application to the Indian ocean skipjack tuna fishery.

    PubMed

    Faugeras, Blaise; Maury, Olivier

    2005-10-01

    We develop an advection-diffusion size-structured fish population dynamics model and apply it to simulate the skipjack tuna population in the Indian Ocean. The model is fully spatialized, and movements are parameterized with oceanographical and biological data; thus it naturally reacts to environment changes. We first formulate an initial-boundary value problem and prove existence of a unique positive solution. We then discuss the numerical scheme chosen for the integration of the simulation model. In a second step we address the parameter estimation problem for such a model. With the help of automatic differentiation, we derive the adjoint code which is used to compute the exact gradient of a Bayesian cost function measuring the distance between the outputs of the model and catch and length frequency data. A sensitivity analysis shows that not all parameters can be estimated from the data. Finally, twin experiments in which perturbed parameters are recovered from simulated data are successfully conducted.
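    An adjoint-derived "exact" gradient of a cost function is routinely verified against finite differences before being used in estimation; a minimal sketch of such a check on a toy quadratic cost (purely illustrative, not the paper's model or cost function):

```python
def finite_difference_gradient(cost, params, h=1e-6):
    """Central-difference approximation of the gradient of a scalar cost,
    used to cross-check an exact (e.g. adjoint-computed) gradient."""
    grad = []
    for i in range(len(params)):
        up = list(params); up[i] += h
        down = list(params); down[i] -= h
        grad.append((cost(up) - cost(down)) / (2 * h))
    return grad

# Toy quadratic cost J(p) = sum(p_i^2), whose exact gradient is 2*p.
cost = lambda p: sum(x * x for x in p)
g = finite_difference_gradient(cost, [1.0, -2.0])
```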

  20. Development of new method for simultaneous analysis of piracetam and levetiracetam in pharmaceuticals and biological fluids: application in stability studies.

    PubMed

    Siddiqui, Farhan Ahmed; Sher, Nawab; Shafi, Nighat; Wafa Sial, Alisha; Ahmad, Mansoor; Mehjebeen; Naseem, Huma

    2014-01-01

    An RP-HPLC method with ultraviolet detection for the simultaneous quantification of piracetam and levetiracetam has been developed and validated. Chromatography was performed on a Nucleosil C18 column (25 cm × 0.46 cm, 10 μm). The mobile phase was a (70:30 v/v) mixture of 0.1 g/L triethylamine and acetonitrile, the flow rate was set at 1 mL/min, and detection was carried out at 205 nm. Results were evaluated through statistical parameters confirming the method's reproducibility and selectivity for the quantification of piracetam, levetiracetam, and their impurities, hence proving its stability-indicating properties. The proposed method is significantly important, permitting the separation of the main constituent piracetam from levetiracetam. Linear behavior was observed between 20 ng/mL and 10,000 ng/mL for both drugs. The proposed method was applied to bulk drugs, dosage formulations, physiological conditions, and clinical investigations, with excellent outcomes.
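    Linearity over a range like 20-10,000 ng/mL is normally established by least-squares regression of detector response on concentration; a minimal sketch of the fit, with hypothetical, perfectly linear calibration data:

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for a calibration curve
    (x = concentration, y = detector response)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Hypothetical calibration standards spanning the stated linear range.
conc = [20.0, 100.0, 1000.0, 10000.0]
resp = [40.0, 200.0, 2000.0, 20000.0]
slope, intercept = linear_fit(conc, resp)
```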
