A Study on Predictive Analytics Application to Ship Machinery Maintenance
2013-09-01
Looking at the nature of the time series forecasting method, it would be better applied to offline analysis. The application for real-time online...other system attributes in future. Two techniques of statistical analysis, mainly time series models and cumulative sum control charts, are discussed in...statistical tool employed for the two techniques of statistical analysis. Both time series forecasting and CUSUM control charts are shown to be
Antweiler, Ronald C.; Taylor, Howard E.
2008-01-01
The main classes of statistical treatment of below-detection limit (left-censored) environmental data for the determination of basic statistics that have been used in the literature are substitution methods, maximum likelihood, regression on order statistics (ROS), and nonparametric techniques. These treatments, along with using all instrument-generated data (even those below detection), were evaluated by examining data sets in which the true values of the censored data were known. It was found that for data sets with less than 70% censored data, the best technique overall for determination of summary statistics was the nonparametric Kaplan-Meier technique. ROS and the two substitution methods of assigning one-half the detection limit value to censored data or assigning a random number between zero and the detection limit to censored data were adequate alternatives. The use of these two substitution methods, however, requires a thorough understanding of how the laboratory censored the data. The technique of employing all instrument-generated data - including numbers below the detection limit - was found to be less adequate than the above techniques. At high degrees of censoring (greater than 70% censored data), no technique provided good estimates of summary statistics. Maximum likelihood techniques were found to be far inferior to all other treatments except substituting zero or the detection limit value to censored data.
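As a rough illustration of the substitution treatments compared above, the following Python sketch (with made-up concentrations and an assumed detection limit) computes summary statistics after substituting either one-half the detection limit or a random value between zero and the detection limit for the censored observations; the Kaplan-Meier treatment favoured in the abstract requires dedicated survival-analysis tooling and is not shown.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical example data: measured concentrations with a detection limit (DL).
# Values reported as "<DL" are flagged as censored.
values = np.array([0.8, 1.2, 0.3, 2.5, 0.3, 4.1, 0.9, 0.3, 1.7, 3.3])
censored = np.array([False, False, True, False, True, False, False, True, False, False])
detection_limit = 0.3

def summary_with_substitution(values, censored, sub):
    """Replace censored observations with `sub` and compute summary statistics."""
    filled = np.where(censored, sub, values)
    return filled.mean(), filled.std(ddof=1), np.median(filled)

# Substitution method 1: one-half the detection limit.
mean_half, sd_half, med_half = summary_with_substitution(values, censored, detection_limit / 2)

# Substitution method 2: a random number between zero and the detection limit.
random_sub = rng.uniform(0.0, detection_limit, size=values.size)
filled_rand = np.where(censored, random_sub, values)
mean_rand, sd_rand = filled_rand.mean(), filled_rand.std(ddof=1)

print(f"DL/2 substitution:   mean={mean_half:.3f}, sd={sd_half:.3f}, median={med_half:.3f}")
print(f"random substitution: mean={mean_rand:.3f}, sd={sd_rand:.3f}")
```

As the abstract stresses, either substitution is only adequate when the laboratory's censoring rules are well understood and censoring is not too heavy.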
NASA Astrophysics Data System (ADS)
O'Shea, Bethany; Jankowski, Jerzy
2006-12-01
The major ion composition of Great Artesian Basin groundwater in the lower Namoi River valley is relatively homogeneous. Traditional graphical techniques have been combined with multivariate statistical methods to determine whether subtle differences in the chemical composition of these waters can be delineated. Hierarchical cluster analysis and principal components analysis were successful in delineating minor variations within the groundwaters of the study area that were not visually identified in the graphical techniques applied. Hydrochemical interpretation allowed geochemical processes to be identified in each statistically defined water type and illustrated how these groundwaters differ from one another. Three main geochemical processes were identified in the groundwaters: ion exchange, precipitation, and mixing between waters from different sources. Both statistical methods delineated an anomalous sample suspected of being influenced by magmatic CO2 input. The use of statistical methods to complement traditional graphical techniques for waters appearing homogeneous is emphasized for all investigations of this type.
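A minimal sketch of how hierarchical cluster analysis and principal components analysis are commonly combined for hydrochemical data of this kind, using standard Python libraries and purely synthetic ion concentrations (the study's actual data and preprocessing are not reproduced here):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical major-ion matrix: rows = groundwater samples, columns = ions
# (e.g. Na, K, Ca, Mg, Cl, HCO3, SO4 in meq/L); values are illustrative only.
rng = np.random.default_rng(0)
X = rng.lognormal(mean=1.0, sigma=0.4, size=(30, 7))

# Standardize so that no single ion dominates the distance metric.
Z = StandardScaler().fit_transform(X)

# Hierarchical cluster analysis (Ward linkage) to delineate water types.
tree = linkage(Z, method="ward")
water_type = fcluster(tree, t=3, criterion="maxclust")

# Principal components analysis to see which ions drive the separation.
pca = PCA(n_components=2)
scores = pca.fit_transform(Z)
print("explained variance ratio:", pca.explained_variance_ratio_)
print("cluster labels:", water_type)
```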
Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis.
Flores, X; Comas, J; Roda, I R; Jiménez, L; Gernaey, K V
2007-01-01
The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) to determine natural groups or clusters of control strategies with a similar behaviour, ii) to find and interpret hidden, complex and casual relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation of the complex multicriteria data sets and allows an improved use of information for effective evaluation of control strategies.
NASA Astrophysics Data System (ADS)
Walz, Michael; Leckebusch, Gregor C.
2016-04-01
Extratropical wind storms pose one of the most dangerous and loss-intensive natural hazards for Europe. However, with only 50 years of high-quality observational data, it is difficult to assess the statistical uncertainty of these sparse events based on observations alone. Over the last decade seasonal ensemble forecasts have become indispensable in quantifying the uncertainty of weather prediction on seasonal timescales. In this study seasonal forecasts are used in a climatological context: by making use of the up to 51 ensemble members, a broad and physically consistent statistical base can be created. This base can then be used to assess the statistical uncertainty of extreme wind storm occurrence more accurately. In order to determine the statistical uncertainty of storms with different paths of progression, a probabilistic clustering approach using regression mixture models is used to objectively assign storm tracks (either based on core pressure or on extreme wind speeds) to different clusters. The advantage of this technique is that the entire lifetime of a storm is considered by the clustering algorithm. Quadratic curves are found to describe the storm tracks most accurately. Three main clusters (diagonal, horizontal or vertical progression of the storm track) can be identified, each of which has its own particular features. Basic storm features like average velocity and duration are calculated and compared for each cluster. The main benefit of this clustering technique, however, is to evaluate whether the clusters show different degrees of uncertainty, e.g. more (less) spread for tracks approaching Europe horizontally (diagonally). This statistical uncertainty is compared for different seasonal forecast products.
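The paper fits regression mixture models to whole storm tracks; as a simplified stand-in, the sketch below fits a quadratic curve to each (synthetic) track and then clusters the fitted coefficients with a Gaussian mixture. This conveys the idea of grouping tracks by their path of progression without reproducing the authors' exact method; all track data and parameters are invented.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Hypothetical storm tracks: each is a sequence of (lon, lat) positions.
def make_track(a, b, c, n=20):
    lon = np.linspace(-60, 20, n)
    lat = a * lon**2 + b * lon + c + rng.normal(0, 0.5, n)
    return lon, lat

tracks = [make_track(*rng.normal([0.001, 0.2, 50], [0.0005, 0.05, 3])) for _ in range(40)]

# Stage 1: describe each track by the coefficients of a quadratic fit lat(lon).
coefs = np.array([np.polyfit(lon, lat, deg=2) for lon, lat in tracks])

# Stage 2: cluster the coefficient vectors into three groups of track shapes.
gmm = GaussianMixture(n_components=3, random_state=0).fit(coefs)
labels = gmm.predict(coefs)
print("tracks per cluster:", np.bincount(labels))
```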
BaTMAn: Bayesian Technique for Multi-image Analysis
NASA Astrophysics Data System (ADS)
Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.
2016-12-01
Bayesian Technique for Multi-image Analysis (BaTMAn) characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). The output segmentations successfully adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. BaTMAn identifies (and keeps) all the statistically-significant information contained in the input multi-image (e.g. an IFS datacube). The main aim of the algorithm is to characterize spatially-resolved data prior to their analysis.
A Monte Carlo study of Weibull reliability analysis for space shuttle main engine components
NASA Technical Reports Server (NTRS)
Abernethy, K.
1986-01-01
The incorporation of a number of additional capabilities into an existing Weibull analysis computer program and the results of a Monte Carlo computer simulation study to evaluate the usefulness of the Weibull methods using samples with a very small number of failures and extensive censoring are discussed. Since the censoring mechanism inherent in the Space Shuttle Main Engine (SSME) data is hard to analyze, it was decided to use a random censoring model, generating censoring times from a uniform probability distribution. Some of the statistical techniques and computer programs that are used in the SSME Weibull analysis are described. The previously documented methods were supplemented by adding computer calculations of approximate confidence intervals (using iterative methods) for several parameters of interest. These calculations are based on a likelihood ratio statistic which is asymptotically a chi-squared statistic with one degree of freedom. The assumptions built into the computer simulations are described. The simulation program and the techniques used in it are also described. Simulation results are tabulated for various combinations of Weibull shape parameters and the numbers of failures in the samples.
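A hedged sketch of the core simulation idea described above: failure times are drawn from a Weibull distribution, censored by uniformly distributed censoring times, and the censored sample is then fitted by maximum likelihood. The parameter values are invented; the likelihood-ratio confidence intervals mentioned in the abstract would be obtained by profiling this same log-likelihood against a chi-squared statistic with one degree of freedom.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(7)

# Simulate failure times from a Weibull distribution (shape k, scale lam)
# and censor them with uniformly distributed censoring times.
k_true, lam_true, n = 1.8, 100.0, 25
failure = weibull_min.rvs(k_true, scale=lam_true, size=n, random_state=rng)
censor_time = rng.uniform(0, 2 * lam_true, size=n)
time = np.minimum(failure, censor_time)
observed = failure <= censor_time          # True = failure seen, False = censored

def neg_log_lik(params):
    """Negative log-likelihood for right-censored Weibull data."""
    k, lam = params
    if k <= 0 or lam <= 0:
        return np.inf
    logpdf = weibull_min.logpdf(time[observed], k, scale=lam)
    logsf = weibull_min.logsf(time[~observed], k, scale=lam)   # survival term for censored times
    return -(logpdf.sum() + logsf.sum())

fit = minimize(neg_log_lik, x0=[1.0, time.mean()], method="Nelder-Mead")
print("estimated (shape, scale):", fit.x, " true:", (k_true, lam_true))
```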
[The evaluation of costs: standards of medical care and clinical statistic groups].
Semenov, V Iu; Samorodskaia, I V
2014-01-01
The article presents a comparative analysis of techniques for evaluating the costs of hospital treatment using medical economic standards of medical care and clinical statistical groups. The technique of evaluating costs on the basis of clinical statistical groups was developed almost fifty years ago and is widely applied in a number of countries. Nowadays, in Russia, payment for a completed case of treatment on the basis of medical economic standards is the main mode of payment for hospital care. It is only loosely a Russian analogue of the internationally prevalent system of diagnosis-related groups. The tariffs for these cases of treatment, as opposed to clinical statistical groups, are calculated on the basis of the standards of provision of medical care approved by the Minzdrav of Russia. Information derived from the generalization of real patients' treatment cases is not used.
Index of Selected Publications Through December 1983,
1984-03-01
substantiating methodology, and is designed mainly for readers with a professional interest in the subject but who do not have a primary responsibility in that... Navy in postwar American security policy -- computer subroutines -- CRC 20 H 1052 -- experimental design techniques, computer... North Atlantic-Norwegian... statistical experimental design technique aids the analysis of... and Congestion, With an Example from Southern California, 27 pp., Jan 1971, AD 719 906
Signal analysis techniques for incipient failure detection in turbomachinery
NASA Technical Reports Server (NTRS)
Coffin, T.
1985-01-01
Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.
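The sketch below illustrates, on a synthetic vibration signal, the kind of time-domain statistical moments and spectral descriptors the report reviews; the signal model and feature names are illustrative assumptions, not the report's actual processing chain.

```python
import numpy as np
from scipy.signal import welch
from scipy.stats import kurtosis, skew

rng = np.random.default_rng(3)

# Hypothetical vibration signal: broadband noise plus a tone whose amplitude
# grows over the record, loosely imitating an emerging defect frequency.
fs = 10_000.0
t = np.arange(0, 2.0, 1 / fs)
signal = rng.normal(0, 1.0, t.size) + (0.2 + 0.4 * t / t[-1]) * np.sin(2 * np.pi * 1_250 * t)

# Time-domain statistical descriptors (moments) often used for trend monitoring.
features = {
    "rms": np.sqrt(np.mean(signal**2)),
    "skewness": skew(signal),
    "kurtosis": kurtosis(signal),        # sensitive to impulsive faults
    "crest_factor": np.max(np.abs(signal)) / np.sqrt(np.mean(signal**2)),
}

# Spectral description via Welch's method.
freqs, psd = welch(signal, fs=fs, nperseg=2048)
features["dominant_freq_hz"] = freqs[np.argmax(psd)]
print(features)
```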
ERIC Educational Resources Information Center
Kostadinov, Boyan
2013-01-01
This article attempts to introduce the reader to computational thinking and solving problems involving randomness. The main technique being employed is the Monte Carlo method, using the freely available software "R for Statistical Computing." The author illustrates the computer simulation approach by focusing on several problems of…
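The article works in R; as an analogous illustration (written in Python for consistency with the other sketches in this rewrite), here is a Monte Carlo estimate of a classic probability involving randomness:

```python
import numpy as np

rng = np.random.default_rng(2024)

def birthday_collision_prob(n_people=23, n_trials=100_000):
    """Monte Carlo estimate of P(at least two of n_people share a birthday)."""
    birthdays = rng.integers(0, 365, size=(n_trials, n_people))
    has_repeat = np.array([len(set(row)) < n_people for row in birthdays])
    return has_repeat.mean()

print(birthday_collision_prob())   # the analytic value is about 0.507
```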
NASA Astrophysics Data System (ADS)
Gaitán Fernández, E.; García Moreno, R.; Pino Otín, M. R.; Ribalaygua Batalla, J.
2012-04-01
Climate and soil are two of the most important limiting factors for agricultural production. Nowadays climate change has been documented in many geographical locations, affecting different cropping systems. General Circulation Models (GCMs) have become important tools to simulate the most relevant aspects of the climate expected for the 21st century under climate change. These models are able to reproduce the general features of the atmospheric dynamics, but their low resolution (about 200 km) prevents a proper simulation of lower-scale meteorological effects. Downscaling techniques overcome this problem by adapting the model outcomes to the local scale. In this context, FIC (Fundación para la Investigación del Clima) has developed a statistical downscaling technique based on a two-step analogue method. This methodology has been broadly tested in national and international settings, leading to excellent results for future climate modelling. In a collaborative project, this statistical downscaling technique was applied to predict future scenarios for grape-growing systems in Spain. The application of such a model is very important for predicting the expected climate for the different crops, mainly grape, where the success of different varieties is highly related to climate and soil. The model allowed the implementation of agricultural conservation practices in crop production, the detection of areas highly sensitive to negative impacts produced by any modification of climate in the different regions, mainly those under a protected designation of origin, and the definition of new production areas with optimal edaphoclimatic conditions for the different varieties.
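The FIC two-step analogue method itself is not reproduced here; the sketch below shows a simplified one-step analogue downscaling on synthetic data, in which the local observation for a model-simulated day is estimated from the most similar large-scale situations in a historical archive. All variable names and sizes are illustrative assumptions.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(5)

# Hypothetical training archive: coarse-scale predictor fields (flattened to
# feature vectors) paired with the local observation on the same day.
n_days, n_features = 2000, 12
predictors_hist = rng.normal(size=(n_days, n_features))        # e.g. circulation-pattern features
local_obs_hist = predictors_hist[:, 0] * 2.0 + rng.normal(0, 0.5, n_days)  # e.g. station temperature

# Target day simulated by a GCM for a future scenario.
predictor_future = rng.normal(size=(1, n_features))

# One-step analogue: find the most similar historical large-scale situations
# and transfer their local observations (a second refinement step, as in the
# FIC methodology, is omitted here).
nn = NearestNeighbors(n_neighbors=10).fit(predictors_hist)
_, idx = nn.kneighbors(predictor_future)
downscaled_estimate = local_obs_hist[idx[0]].mean()
print("analogue-based local estimate:", downscaled_estimate)
```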
Signal Detection Techniques for Diagnostic Monitoring of Space Shuttle Main Engine Turbomachinery
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Jong, Jen-Yi
1986-01-01
An investigation to develop, implement, and evaluate signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery is reviewed. A brief description of the Space Shuttle Main Engine (SSME) test/measurement program is presented. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques have been implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. A unique coherence function (the hyper-coherence) was developed through the course of this investigation, which appears promising as a diagnostic tool. This technique and several other non-linear methods of signal analysis are presented and illustrated by application. Software for application of these techniques has been installed on the signal processing system at the NASA/MSFC Systems Dynamics Laboratory.
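The hyper-coherence developed in this investigation is a non-linear extension of the ordinary coherence function; as a baseline illustration only, the sketch below computes ordinary magnitude-squared coherence between two synthetic accelerometer channels that share a common tone.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(11)

# Two hypothetical accelerometer channels sharing a common 800 Hz component
# (e.g. a synchronous rotor tone) buried in independent noise.
fs = 10_000.0
t = np.arange(0, 4.0, 1 / fs)
common = np.sin(2 * np.pi * 800 * t)
x = common + rng.normal(0, 1.0, t.size)
y = 0.7 * common + rng.normal(0, 1.0, t.size)

# Ordinary (linear) magnitude-squared coherence between the two measurements.
f, Cxy = coherence(x, y, fs=fs, nperseg=1024)
print("peak coherence %.2f at %.0f Hz" % (Cxy.max(), f[np.argmax(Cxy)]))
```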
Fráter, Mark; Forster, András; Jantyik, Ádám; Braunitzer, Gábor; Nagy, Katalin
2015-12-01
The purpose of this in vitro investigation was to evaluate the reinforcing effect of different fibre-reinforced composite (FRC) posts and insertion techniques in premolar teeth when using minimally invasive post space preparation. Thirty-two extracted and endodontically treated premolar teeth were used and divided into four groups (n = 8) depending on the post used (Groups 1-4). Group 1: one single conventional post; Group 2: one main conventional and one collateral post; Group 3: one flexible post; Group 4: one main flexible and one collateral post. After cementation and core build-up, the specimens were submitted to a static fracture toughness test. Fracture thresholds and fracture patterns were recorded and evaluated. The multi-post techniques (Groups 2 and 4) showed statistically higher fracture resistance compared with Group 1. Regarding fracture patterns, there was no statistically significant difference between the tested groups. The application of multiple posts seems to be beneficial regarding fracture resistance, independent of the FRC post used. Fracture pattern was not influenced by the elasticity of the post.
A survey of statistics in three UK general practice journals
Rigby, Alan S; Armstrong, Gillian K; Campbell, Michael J; Summerton, Nick
2004-01-01
Background Many medical specialities have reviewed the statistical content of their journals. To our knowledge this has not been done in general practice. Given the main role of a general practitioner as a diagnostician we thought it would be of interest to see whether the statistical methods reported reflect the diagnostic process. Methods Hand search of three UK journals of general practice namely the British Medical Journal (general practice section), British Journal of General Practice and Family Practice over a one-year period (1 January to 31 December 2000). Results A wide variety of statistical techniques were used. The most common methods included t-tests and Chi-squared tests. There were few articles reporting likelihood ratios and other useful diagnostic methods. There was evidence that the journals with the more thorough statistical review process reported a more complex and wider variety of statistical techniques. Conclusions The BMJ had a wider range and greater diversity of statistical methods than the other two journals. However, in all three journals there was a dearth of papers reflecting the diagnostic process. Across all three journals there were relatively few papers describing randomised controlled trials thus recognising the difficulty of implementing this design in general practice. PMID:15596014
Rodríguez-Entrena, Macario; Schuberth, Florian; Gelhard, Carsten
2018-01-01
Structural equation modeling using partial least squares (PLS-SEM) has become a mainstream modeling approach in various disciplines. Nevertheless, prior literature still lacks practical guidance on how to properly test for differences between parameter estimates. Whereas existing techniques such as parametric and non-parametric approaches in PLS multi-group analysis only allow assessing differences between parameters that are estimated for different subpopulations, the study at hand introduces a technique that also allows assessing whether two parameter estimates derived from the same sample are statistically different. To illustrate this advancement to PLS-SEM, we particularly refer to a reduced version of the well-established technology acceptance model.
Attitude of teaching faculty towards statistics at a medical university in Karachi, Pakistan.
Khan, Nazeer; Mumtaz, Yasmin
2009-01-01
Statistics is mainly used in biological research to verify clinicians' and researchers' findings and feelings, and to give scientific validity to their inferences. In Pakistan, the educational curriculum is structured in such a way that students who are interested in entering the field of biological sciences do not study mathematics after grade 10. Therefore, owing to this fragile background in mathematical skills, Pakistani medical professionals feel that they do not have an adequate base to understand the basic concepts of statistical techniques when they try to use them in their research or read a scientific article. The aim of the study was to assess the attitude of medical faculty towards statistics. A questionnaire containing 42 close-ended and 4 open-ended questions, related to the attitude towards and knowledge of statistics, was distributed among the teaching faculty of Dow University of Health Sciences (DUHS). One hundred and sixty-seven completed questionnaires were returned from 374 faculty members (response rate 44.7%). Forty-three percent of the respondents claimed that they had taken 'introductory'-level statistics courses, 63% strongly agreed that a good researcher must have some training in statistics, and 82% of the faculty were in favour (strongly agreed or agreed) of the view that statistics is really useful for research. Only 17% correctly stated that statistics is the science of uncertainty. Half of the respondents admitted that they have problems writing the statistical section of an article. Sixty-four percent of the subjects indicated that statistical teaching methods were the main reason for the impression that the subject is difficult, and 53% of the faculty indicated that co-authorship for a statistician should depend upon his/her contribution to the study. Gender did not show any significant difference in the responses. However, senior faculty attached greater importance to the use of statistics and reported more difficulty in writing the results section of articles than junior faculty. The study showed a low level of knowledge but a high level of awareness of the use of statistical techniques in research, and exhibited a good level of motivation for further training.
NASA Astrophysics Data System (ADS)
Ishida, Shigeki; Mori, Atsuo; Shinji, Masato
The main method of reducing the blasting charge noise that occurs in a tunnel under construction is to install a sound insulation door in the tunnel. However, a numerical analysis technique for accurately predicting the transmission loss of the sound insulation door has not been established. In this study, we measured the blasting charge noise and the vibration of the sound insulation door in a tunnel during blasting, performed an analysis, and modified the acoustic characteristics accordingly. In addition, we reproduced the noise reduction effect of the sound insulation door with the statistical energy analysis method and confirmed that numerical simulation is possible with this procedure.
NASA Astrophysics Data System (ADS)
Peresan, Antonella; Gentili, Stefania
2017-04-01
Identification and statistical characterization of seismic clusters may provide useful insights about the features of seismic energy release and their relation to physical properties of the crust within a given region. Moreover, a number of studies based on spatio-temporal analysis of main-shock occurrence require preliminary declustering of the earthquake catalogs. Since various methods, relying on different physical/statistical assumptions, may lead to diverse classifications of earthquakes into main events and related events, we aim to investigate the classification differences among different declustering techniques. Accordingly, a formal selection and comparative analysis of earthquake clusters is carried out for the most relevant earthquakes in North-Eastern Italy, as reported in the local OGS-CRS bulletins, compiled at the National Institute of Oceanography and Experimental Geophysics since 1977. The comparison is then extended to selected earthquake sequences associated with a different seismotectonic setting, namely to events that occurred in the region struck by the recent Central Italy destructive earthquakes, making use of INGV data. Various techniques, ranging from classical space-time window methods to ad hoc manual identification of aftershocks, are applied for detection of earthquake clusters. In particular, a statistical method based on nearest-neighbor distances of events in the space-time-energy domain is considered. Results from cluster identification by the nearest-neighbor method turn out to be quite robust with respect to the time span of the input catalogue, as well as to the minimum magnitude cutoff. The identified clusters for the largest events reported in North-Eastern Italy since 1977 are well consistent with those reported in earlier studies, which were aimed at detailed manual aftershock identification. The study shows that the data-driven approach, based on the nearest-neighbor distances, can be satisfactorily applied to decompose the seismic catalog into background seismicity and individual sequences of earthquake clusters, also in areas characterized by moderate seismic activity, where the standard declustering techniques may turn out to be rather gross approximations. With these results acquired, the main statistical features of seismic clusters are explored, including the complex interdependence of related events, with the aim of characterizing the space-time patterns of earthquake occurrence in North-Eastern Italy and capturing their basic differences from the Central Italy sequences.
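A compact sketch of the nearest-neighbor space-time-energy distance that underlies this class of declustering methods, applied to a synthetic catalog; the b-value, fractal dimension and threshold below are illustrative assumptions rather than values from the study.

```python
import numpy as np

rng = np.random.default_rng(13)

# Hypothetical catalog columns: origin time (days), x, y (km, projected), magnitude.
n = 200
cat = np.column_stack([
    np.sort(rng.uniform(0, 3650, n)),      # origin times
    rng.uniform(0, 300, n),                # x
    rng.uniform(0, 300, n),                # y
    rng.exponential(0.5, n) + 2.0,         # magnitudes (Gutenberg-Richter-like)
])

b, d_f = 1.0, 1.6   # assumed b-value and fractal dimension of epicentres

def nearest_neighbor_distances(cat):
    """Space-time-magnitude nearest-neighbour distance (Zaliapin-style) per event."""
    t, x, y, m = cat.T
    eta = np.full(len(cat), np.inf)
    parent = np.full(len(cat), -1)
    for j in range(1, len(cat)):
        dt = t[j] - t[:j]                              # days to earlier events
        r = np.hypot(x[j] - x[:j], y[j] - y[:j])       # epicentral distance, km
        eta_ij = dt * np.maximum(r, 0.1) ** d_f * 10.0 ** (-b * m[:j])
        k = np.argmin(eta_ij)
        eta[j], parent[j] = eta_ij[k], k
    return eta, parent

eta, parent = nearest_neighbor_distances(cat)
# Events with eta below a threshold (in practice chosen from the bimodal
# histogram of log eta) are linked to their parent as cluster members;
# the remainder constitutes the background seismicity.
threshold = np.median(eta[np.isfinite(eta)]) / 100
print("fraction flagged as clustered:", np.mean(eta < threshold))
```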
Stellar populations in local star-forming galaxies
NASA Astrophysics Data System (ADS)
Perez-Gonzalez, P. G.
2003-11-01
The main goal of this thesis work is to study the main properties of the stellar populations embedded in a statistically complete sample of local active star-forming galaxies: the Universidad Complutense de Madrid (UCM) Survey of emission-line galaxies. This sample contains 191 local star-forming galaxies at an average redshift of 0.026. The survey was carried out using an objective-prism technique centered at the wavelength of the Halpha nebular emission line (a common tracer of recent star formation). (continues)
Finding Statistically Significant Communities in Networks
Lancichinetti, Andrea; Radicchi, Filippo; Ramasco, José J.; Fortunato, Santo
2011-01-01
Community structure is one of the main structural features of networks, revealing both their internal organization and the similarity of their elementary units. Despite the large variety of methods proposed to detect communities in graphs, there is a great need for multi-purpose techniques able to handle different types of datasets and the subtleties of community structure. In this paper we present OSLOM (Order Statistics Local Optimization Method), the first method capable of detecting clusters in networks while accounting for edge directions, edge weights, overlapping communities, hierarchies and community dynamics. It is based on the local optimization of a fitness function expressing the statistical significance of clusters with respect to random fluctuations, which is estimated with tools of Extreme and Order Statistics. OSLOM can be used alone or as a refinement procedure of partitions/covers delivered by other techniques. We have also implemented sequential algorithms combining OSLOM with other fast techniques, so that the community structure of very large networks can be uncovered. Our method has performance comparable to that of the best existing algorithms on artificial benchmark graphs. Several applications on real networks are shown as well. OSLOM is implemented in freely available software (http://www.oslom.org), and we believe it will be a valuable tool in the analysis of networks. PMID:21559480
[Introduction to Exploratory Factor Analysis (EFA)].
Martínez, Carolina Méndez; Sepúlveda, Martín Alonso Rondón
2012-03-01
Exploratory Factor Analysis (EFA) has become one of the most frequently used statistical techniques, especially in the medical and social sciences. Given its popularity, it is essential to understand the basic concepts necessary for its proper application and to take into consideration the main strengths and weaknesses of this technique. The objectives are to present in a clear and concise manner the main applications of this technique, to determine the basic requirements for its use, providing a step-by-step description of its methodology, and to establish the elements that must be taken into account during its preparation in order not to arrive at erroneous results and interpretations. Narrative review. This review identifies the basic concepts and briefly describes the objectives, design, assumptions, and methodology to achieve factor derivation, global adjustment evaluation, and adequate interpretation of results. Copyright © 2012 Asociación Colombiana de Psiquiatría. Published by Elsevier España. All rights reserved.
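A minimal EFA sketch on synthetic questionnaire data with two latent factors, using scikit-learn's factor-analysis implementation with varimax rotation; a real application would precede this with the adequacy checks and factor-number criteria discussed in the review.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(21)

# Hypothetical questionnaire: 200 respondents, 6 items driven by 2 latent factors.
n = 200
factors = rng.normal(size=(n, 2))
loadings = np.array([[0.8, 0.0], [0.7, 0.1], [0.9, 0.0],
                     [0.0, 0.8], [0.1, 0.7], [0.0, 0.9]])
items = factors @ loadings.T + rng.normal(0, 0.4, size=(n, 6))

Z = StandardScaler().fit_transform(items)
efa = FactorAnalysis(n_components=2, rotation="varimax").fit(Z)
print(np.round(efa.components_.T, 2))   # estimated loadings, items x factors
```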
A rheumatoid arthritis study by Fourier transform infrared spectroscopy
NASA Astrophysics Data System (ADS)
Carvalho, Carolina S.; Silva, Ana Carla A.; Santos, Tatiano J. P. S.; Martin, Airton A.; dos Santos Fernandes, Ana Célia; Andrade, Luís E.; Raniero, Leandro
2012-01-01
Rheumatoid arthritis is a systemic inflammatory disease of unknown cause, and new methods to identify it in its early stages are needed. The main purpose of this work is the biochemical differentiation of sera between normal and RA patients through the establishment of a statistical method that can be appropriately used for serological analysis. Human sera from 39 healthy donors and 39 rheumatic donors were collected and analyzed by Fourier transform infrared spectroscopy. The results show significant spectral variations, with p<0.05, in regions corresponding to proteins, lipids and immunoglobulins. An indirect agglutination technique using latex particles coated with human IgG and monoclonal anti-CRP (the FR and CRP tests) was performed to check for possible false-negative results within the groups, facilitating the statistical interpretation and validation of the technique.
NASA Astrophysics Data System (ADS)
Yi, Yong; Chen, Zhengying; Wang, Liming
2018-05-01
Corona discharge on DC transmission lines is the main cause of the radiated electromagnetic interference (EMI) field in the vicinity of the lines. A joint time-frequency analysis technique was proposed to extract the radiated EMI current (excitation current) of DC corona based on statistical measurements of the corona current. A reduced-scale experimental platform was set up to measure the statistical distributions of the current waveform parameters of an aluminum conductor steel-reinforced (ACSR) conductor. Based on the measured results, the peak value, root-mean-square value and average value of the 0.5 MHz radiated EMI current, with 9 kHz and 200 Hz bandwidths, were calculated by the proposed technique and validated against the conventional excitation function method. Radio interference (RI) was calculated from the radiated EMI current, and a wire-to-plate platform was built to check the validity of the RI computation results. The reasons for the deviation between the computations and the measurements were analyzed in detail.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Somerville, Richard
2013-08-22
The long-range goal of several past and current projects in our DOE-supported research has been the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data, and the implementation and testing of these parameterizations in global models. The main objective of the present project being reported on here has been to develop and apply advanced statistical techniques, including Bayesian posterior estimates, to diagnose and evaluate features of both observed and simulated clouds. The research carried out under this project has been novel in two important ways. The first is that it is a key step in the development of practical stochastic cloud-radiation parameterizations, a new category of parameterizations that offers great promise for overcoming many shortcomings of conventional schemes. The second is that this work has brought powerful new tools to bear on the problem, because it has been a collaboration between a meteorologist with long experience in ARM research (Somerville) and a mathematician who is an expert on a class of advanced statistical techniques that are well-suited for diagnosing model cloud simulations using ARM observations (Shen).
NASA Astrophysics Data System (ADS)
Roy, P. K.; Pal, S.; Banerjee, G.; Biswas Roy, M.; Ray, D.; Majumder, A.
2014-12-01
Rivers are considered one of the main sources of freshwater all over the world. Hence, analysis and maintenance of this water resource is globally considered a matter of major concern. This paper deals with the assessment of the surface water quality of the Ichamati River using multivariate statistical techniques. Eight distinct surface water quality observation stations were located and samples were collected. Statistical techniques were applied to the physico-chemical parameters and the depth of siltation of the collected samples. In this paper, cluster analysis is used to determine the relations between surface water quality and siltation depth of the Ichamati River. Multiple regression and mathematical equation modeling have been used to characterize the surface water quality of the Ichamati River on the basis of physico-chemical parameters. It was found that the water quality of the downstream river differed from that of the upstream. The analysis of the water quality parameters of the Ichamati River clearly indicates a high pollution load on the river water, which can be attributed to agricultural discharge, tidal effect and soil erosion. The results further reveal that with increasing depth of siltation, water quality degrades.
Krami, Loghman Khoda; Amiri, Fazel; Sefiyanian, Alireza; Shariff, Abdul Rashid B Mohamed; Tabatabaie, Tayebeh; Pradhan, Biswajeet
2013-12-01
One hundred and thirty composite soil samples were collected from Hamedan county, Iran to characterize the spatial distribution and trace the sources of heavy metals including As, Cd, Co, Cr, Cu, Ni, Pb, V, Zn, and Fe. The multivariate gap statistical analysis was used; for interrelation of spatial patterns of pollution, the disjunctive kriging and geoenrichment factor (EF(G)) techniques were applied. Heavy metals and soil properties were grouped using agglomerative hierarchical clustering and gap statistic. Principal component analysis was used for identification of the source of metals in a set of data. Geostatistics was used for the geospatial data processing. Based on the comparison between the original data and background values of the ten metals, the disjunctive kriging and EF(G) techniques were used to quantify their geospatial patterns and assess the contamination levels of the heavy metals. The spatial distribution map combined with the statistical analysis showed that the main source of Cr, Co, Ni, Zn, Pb, and V in group A land use (agriculture, rocky, and urban) was geogenic; the origin of As, Cd, and Cu was industrial and agricultural activities (anthropogenic sources). In group B land use (rangeland and orchards), the origin of metals (Cr, Co, Ni, Zn, and V) was mainly controlled by natural factors and As, Cd, Cu, and Pb had been added by organic factors. In group C land use (water), the origin of most heavy metals is natural without anthropogenic sources. The Cd and As pollution was relatively more serious in different land use. The EF(G) technique used confirmed the anthropogenic influence of heavy metal pollution. All metals showed concentrations substantially higher than their background values, suggesting anthropogenic pollution.
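A small sketch of the enrichment-factor idea used here, in its common reference-element form (normalization to Fe); the concentrations and background values are invented for illustration.

```python
import numpy as np

# Hypothetical concentrations (mg/kg) for one soil sample and regional background.
sample = {"As": 14.0, "Cd": 0.6, "Pb": 35.0, "Fe": 28_000.0}
background = {"As": 6.0, "Cd": 0.2, "Pb": 20.0, "Fe": 30_000.0}

def enrichment_factor(metal, sample, background, ref="Fe"):
    """EF = (C_metal / C_ref)_sample / (C_metal / C_ref)_background."""
    return (sample[metal] / sample[ref]) / (background[metal] / background[ref])

for metal in ("As", "Cd", "Pb"):
    ef = enrichment_factor(metal, sample, background)
    print(f"{metal}: EF = {ef:.2f}", "(EF well above 1 suggests anthropogenic input)")
```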
Discrimination of dynamical system models for biological and chemical processes.
Lorenz, Sönke; Diederichs, Elmar; Telgmann, Regina; Schütte, Christof
2007-06-01
In technical chemistry, systems biology and biotechnology, the construction of predictive models has become an essential step in process design and product optimization. Accurate modelling of the reactions requires detailed knowledge about the processes involved. However, when concerned with the development of new products and production techniques for example, this knowledge often is not available due to the lack of experimental data. Thus, when one has to work with a selection of proposed models, the main tasks of early development is to discriminate these models. In this article, a new statistical approach to model discrimination is described that ranks models wrt. the probability with which they reproduce the given data. The article introduces the new approach, discusses its statistical background, presents numerical techniques for its implementation and illustrates the application to examples from biokinetics.
Oelze, Michael L.; Mamou, Jonathan
2017-01-01
Conventional medical imaging technologies, including ultrasound, have continued to improve over the years. For example, in oncology, medical imaging is characterized by high sensitivity, i.e., the ability to detect anomalous tissue features, but the ability to classify these tissue features from images often lacks specificity. As a result, a large number of biopsies of tissues with suspicious image findings are performed each year with a vast majority of these biopsies resulting in a negative finding. To improve specificity of cancer imaging, quantitative imaging techniques can play an important role. Conventional ultrasound B-mode imaging is mainly qualitative in nature. However, quantitative ultrasound (QUS) imaging can provide specific numbers related to tissue features that can increase the specificity of image findings leading to improvements in diagnostic ultrasound. QUS imaging techniques can encompass a wide variety of techniques including spectral-based parameterization, elastography, shear wave imaging, flow estimation and envelope statistics. Currently, spectral-based parameterization and envelope statistics are not available on most conventional clinical ultrasound machines. However, in recent years QUS techniques involving spectral-based parameterization and envelope statistics have demonstrated success in many applications, providing additional diagnostic capabilities. Spectral-based techniques include the estimation of the backscatter coefficient, estimation of attenuation, and estimation of scatterer properties such as the correlation length associated with an effective scatterer diameter and the effective acoustic concentration of scatterers. Envelope statistics include the estimation of the number density of scatterers and quantification of coherent to incoherent signals produced from the tissue. Challenges for clinical application include correctly accounting for attenuation effects and transmission losses and implementation of QUS on clinical devices. Successful clinical and pre-clinical applications demonstrating the ability of QUS to improve medical diagnostics include characterization of the myocardium during the cardiac cycle, cancer detection, classification of solid tumors and lymph nodes, detection and quantification of fatty liver disease, and monitoring and assessment of therapy. PMID:26761606
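As an illustration of the envelope-statistics branch of QUS described above, the sketch below fits a Nakagami distribution to a synthetic echo envelope; fully developed speckle corresponds to a shape parameter near one, and departures from it are what such parameters are used to quantify. The data are simulated, not from any clinical system.

```python
import numpy as np
from scipy.stats import nakagami

rng = np.random.default_rng(8)

# Hypothetical echo envelope: fully developed speckle gives a Rayleigh envelope,
# which is the Nakagami distribution with shape parameter m = 1.
n = 5000
envelope = np.abs(rng.normal(size=n) + 1j * rng.normal(size=n))

# Fit the Nakagami model; the shape m and scale are typical envelope statistics
# used to infer scatterer number density and coherent-to-incoherent signal ratio.
m, loc, scale = nakagami.fit(envelope, floc=0)
print(f"Nakagami shape m = {m:.2f} (expect about 1 for fully developed speckle)")
```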
Statistical Analysis for the Solomon Four-Group Design. Research Report 99-06.
ERIC Educational Resources Information Center
van Engelenburg, Gijsbert
The Solomon four-group design (R. Solomon, 1949) is a very useful experimental design to investigate the main effect of a pretest and the interaction of pretest and treatment. Although the design was proposed half a century ago, no proper data analysis techniques have been available. This paper describes how data from the Solomon four-group design…
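One conventional way to analyse the Solomon four-group design, consistent with the aims stated above, is a 2 x 2 ANOVA on posttest scores with treatment and pretest presence as factors; the interaction term captures pretest sensitization. The sketch below uses invented scores and effect sizes.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)

# Hypothetical posttest scores from the four Solomon groups (n = 25 per cell).
n = 25
rows = []
for pretested in (0, 1):
    for treated in (0, 1):
        scores = 50 + 5 * treated + 1.5 * pretested + rng.normal(0, 8, n)
        rows += [{"posttest": s, "pretested": pretested, "treated": treated} for s in scores]
df = pd.DataFrame(rows)

# 2 x 2 ANOVA on posttest scores: main effects of treatment and pretesting,
# plus their interaction (pretest sensitization).
model = smf.ols("posttest ~ C(treated) * C(pretested)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```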
Statistical physics of vehicular traffic and some related systems
NASA Astrophysics Data System (ADS)
Chowdhury, Debashish; Santen, Ludger; Schadschneider, Andreas
2000-05-01
In the so-called “microscopic” models of vehicular traffic, attention is paid explicitly to each individual vehicle, each of which is represented by a “particle”; the nature of the “interactions” among these particles is determined by the way the vehicles influence each others’ movement. Therefore, vehicular traffic, modeled as a system of interacting “particles” driven far from equilibrium, offers the possibility of studying various fundamental aspects of truly nonequilibrium systems which are of current interest in statistical physics. Analytical as well as numerical techniques of statistical physics are being used to study these models to understand the rich variety of physical phenomena exhibited by vehicular traffic. Some of these phenomena, observed in vehicular traffic under different circumstances, include transitions from one dynamical phase to another, criticality and self-organized criticality, metastability and hysteresis, phase segregation, etc. In this critical review, written from the perspective of statistical physics, we explain the guiding principles behind all the main theoretical approaches. But we present detailed discussions of the results obtained mainly from the so-called “particle-hopping” models, particularly emphasizing those which have been formulated in recent years using the language of cellular automata.
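A minimal example of a particle-hopping cellular-automaton traffic model in the spirit discussed above (a Nagel-Schreckenberg-type update rule on a ring road), with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(6)

# Minimal Nagel-Schreckenberg-type particle-hopping model on a circular road.
L, n_cars, v_max, p_slow, steps = 100, 30, 5, 0.3, 500
pos = np.sort(rng.choice(L, n_cars, replace=False))   # cell index of each car
vel = np.zeros(n_cars, dtype=int)

total_moves = 0
for _ in range(steps):
    gaps = (np.roll(pos, -1) - pos - 1) % L   # empty cells in front of each car
    vel = np.minimum(vel + 1, v_max)          # 1) accelerate
    vel = np.minimum(vel, gaps)               # 2) brake to keep a safe gap
    slow = rng.random(n_cars) < p_slow        # 3) random slowdown
    vel[slow] = np.maximum(vel[slow] - 1, 0)
    pos = (pos + vel) % L                     # 4) move all cars in parallel
    total_moves += vel.sum()

print("mean flow (car moves per cell per step):", total_moves / (L * steps))
```

Varying the density n_cars/L in such a model reproduces the transition from free flow to congested flow that the review discusses.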
21 CFR 820.250 - Statistical techniques.
Code of Federal Regulations, 2011 CFR
2011-04-01
Title 21 Food and Drugs, Vol. 8 (2011-04-01): MEDICAL DEVICES, QUALITY SYSTEM REGULATION, Statistical Techniques -- § 820.250 Statistical techniques. (a) ...statistical techniques required for establishing, controlling, and verifying the acceptability of process...
21 CFR 820.250 - Statistical techniques.
Code of Federal Regulations, 2010 CFR
2010-04-01
Title 21 Food and Drugs, Vol. 8 (2010-04-01): MEDICAL DEVICES, QUALITY SYSTEM REGULATION, Statistical Techniques -- § 820.250 Statistical techniques. (a) ...statistical techniques required for establishing, controlling, and verifying the acceptability of process...
Statistical strategy for anisotropic adventitia modelling in IVUS.
Gil, Debora; Hernández, Aura; Rodriguez, Oriol; Mauri, Josepa; Radeva, Petia
2006-06-01
Vessel plaque assessment by analysis of intravascular ultrasound sequences is a useful tool for cardiac disease diagnosis and intervention. Manual detection of luminal (inner) and media-adventitia (external) vessel borders is the main activity of physicians in the process of lumen narrowing (plaque) quantification. The difficulty of defining vessel border descriptors, together with shadows, artifacts, and blurred signal response due to the physical properties of ultrasound, hampers automated adventitia segmentation. In order to efficiently approach such a complex problem, we propose blending advanced anisotropic filtering operators and statistical classification techniques into a vessel border modelling strategy. Our systematic statistical analysis shows that the reported adventitia detection achieves an accuracy in the range of interobserver variability regardless of plaque nature, vessel geometry, and incomplete vessel borders.
Linear regression models and k-means clustering for statistical analysis of fNIRS data.
Bonomini, Viola; Zucchelli, Lucia; Re, Rebecca; Ieva, Francesca; Spinelli, Lorenzo; Contini, Davide; Paganoni, Anna; Torricelli, Alessandro
2015-02-01
We propose a new algorithm, based on a linear regression model, to statistically estimate the hemodynamic activations in fNIRS data sets. The main concern guiding the algorithm development was the minimization of assumptions and approximations made on the data set for the application of statistical tests. Further, we propose a K-means method to cluster fNIRS data (i.e. channels) as activated or not activated. The methods were validated both on simulated and in vivo fNIRS data. A time domain (TD) fNIRS technique was preferred because of its high performances in discriminating cortical activation and superficial physiological changes. However, the proposed method is also applicable to continuous wave or frequency domain fNIRS data sets.
Linear regression models and k-means clustering for statistical analysis of fNIRS data
Bonomini, Viola; Zucchelli, Lucia; Re, Rebecca; Ieva, Francesca; Spinelli, Lorenzo; Contini, Davide; Paganoni, Anna; Torricelli, Alessandro
2015-01-01
We propose a new algorithm, based on a linear regression model, to statistically estimate the hemodynamic activations in fNIRS data sets. The main concern guiding the algorithm development was the minimization of assumptions and approximations made on the data set for the application of statistical tests. Further, we propose a K-means method to cluster fNIRS data (i.e. channels) as activated or not activated. The methods were validated both on simulated and in vivo fNIRS data. A time domain (TD) fNIRS technique was preferred because of its high performances in discriminating cortical activation and superficial physiological changes. However, the proposed method is also applicable to continuous wave or frequency domain fNIRS data sets. PMID:25780751
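A schematic version of the two ingredients described in the abstract above, a per-channel linear regression model followed by K-means labelling of channels as activated or not, applied to synthetic fNIRS-like time series; the actual regressors, preprocessing and statistical tests of the paper are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(9)

# Hypothetical fNIRS experiment: 20 channels, block design (30 s task / 30 s rest),
# 10 Hz sampling; half of the channels carry a task-related hemodynamic increase.
fs, duration = 10.0, 300.0
t = np.arange(0, duration, 1 / fs)
task = ((t % 60) < 30).astype(float)             # boxcar task regressor
drift = np.vstack([np.ones_like(t), t / t[-1]])  # constant + linear drift

n_channels = 20
data = rng.normal(0, 1.0, (n_channels, t.size))
data[:10] += 0.8 * task                          # "activated" channels

# Linear regression model per channel: y = X beta + noise.
X = np.column_stack([task, drift.T])
betas = np.linalg.lstsq(X, data.T, rcond=None)[0]   # shape (3, n_channels)
task_beta = betas[0]

# K-means with two clusters labels channels as activated / not activated.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(task_beta.reshape(-1, 1))
activated_cluster = labels[np.argmax(task_beta)]    # cluster containing the strongest response
print("channels labelled activated:", np.where(labels == activated_cluster)[0])
```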
Antweiler, Ronald C.
2015-01-01
The main classes of statistical treatments that have been used to determine if two groups of censored environmental data arise from the same distribution are substitution methods, maximum likelihood (MLE) techniques, and nonparametric methods. These treatments along with using all instrument-generated data (IN), even those less than the detection limit, were evaluated by examining 550 data sets in which the true values of the censored data were known, and therefore “true” probabilities could be calculated and used as a yardstick for comparison. It was found that technique “quality” was strongly dependent on the degree of censoring present in the groups. For low degrees of censoring (<25% in each group), the Generalized Wilcoxon (GW) technique and substitution of √2/2 times the detection limit gave overall the best results. For moderate degrees of censoring, MLE worked best, but only if the distribution could be estimated to be normal or log-normal prior to its application; otherwise, GW was a suitable alternative. For higher degrees of censoring (each group >40% censoring), no technique provided reliable estimates of the true probability. Group size did not appear to influence the quality of the result, and no technique appeared to become better or worse than other techniques relative to group size. Finally, IN appeared to do very well relative to the other techniques regardless of censoring or group size.
Sensor data validation and reconstruction. Phase 1: System architecture study
NASA Technical Reports Server (NTRS)
1991-01-01
The sensor validation and data reconstruction task reviewed relevant literature and selected applicable validation and reconstruction techniques for further study; analyzed the selected techniques and emphasized those which could be used for both validation and reconstruction; analyzed Space Shuttle Main Engine (SSME) hot fire test data to determine statistical and physical relationships between various parameters; developed statistical and empirical correlations between parameters to perform validation and reconstruction tasks, using a computer aided engineering (CAE) package; and conceptually designed an expert system based knowledge fusion tool, which allows the user to relate diverse types of information when validating sensor data. The host hardware for the system is intended to be a Sun SPARCstation, but could be any RISC workstation with a UNIX operating system and a windowing/graphics system such as Motif or Dataviews. The information fusion tool is intended to be developed using the NEXPERT Object expert system shell, and the C programming language.
2014-01-01
Background The chemical composition of aerosols and particle size distributions are the most significant factors affecting air quality. In particular, exposure to finer particles can cause short- and long-term effects on human health. In the present paper, the trends of PM10 (particulate matter with aerodynamic diameter lower than 10 μm), CO, NOx (NO and NO2), Benzene and Toluene monitored in six monitoring stations of Bari province are shown. The data set was composed of bi-hourly means for all parameters (12 bi-hourly means per day for each parameter) and refers to the period from January 2005 to May 2007. The main aim of the paper is to provide a clear illustration of how large data sets from monitoring stations can give information about the number and nature of the pollutant sources, and mainly to assess the contribution of the traffic source to the PM10 concentration level by using multivariate statistical techniques such as Principal Component Analysis (PCA) and Absolute Principal Component Scores (APCS). Results Comparison of the night and day mean concentrations for each parameter pointed out that some parameters, such as CO, Benzene and Toluene, show a different night and day behaviour than PM10. This suggests that CO, Benzene and Toluene concentrations are mainly connected with transport systems, whereas PM10 is mostly influenced by different factors. The statistical techniques identified three recurrent sources, associated with vehicular traffic and particulate transport, covering over 90% of the variance. The simultaneous analysis of the gases and PM10 allowed the differences between the sources of these pollutants to be underlined. Conclusions The analysis of pollutant trends from large data sets and the application of multivariate statistical techniques such as PCA and APCS can give useful information about air quality and pollutant sources. This knowledge can provide useful advice for environmental policies in order to reach the WHO recommended levels. PMID:24555534
Measurement of the relationship between perceived and computed color differences
NASA Astrophysics Data System (ADS)
García, Pedro A.; Huertas, Rafael; Melgosa, Manuel; Cui, Guihua
2007-07-01
Using simulated data sets, we have analyzed some mathematical properties of different statistical measurements that have been employed in previous literature to test the performance of different color-difference formulas. Specifically, the properties of the combined index PF/3 (performance factor obtained as average of three terms), widely employed in current literature, have been considered. A new index named standardized residual sum of squares (STRESS), employed in multidimensional scaling techniques, is recommended. The main difference between PF/3 and STRESS is that the latter is simpler and allows inferences on the statistical significance of two color-difference formulas with respect to a given set of visual data.
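The STRESS index recommended above is commonly written as STRESS = 100·sqrt(Σ(ΔE_i − F1·ΔV_i)² / Σ F1²·ΔV_i²) with F1 = Σ ΔE_i ΔV_i / Σ ΔV_i², where ΔE are computed and ΔV visual colour differences; the sketch below evaluates it for two hypothetical formulas against the same visual data (lower is better, 0 means perfect agreement).

```python
import numpy as np

def stress(dE, dV):
    """STRESS index between computed (dE) and visual (dV) colour differences."""
    dE, dV = np.asarray(dE, float), np.asarray(dV, float)
    F1 = np.sum(dE * dV) / np.sum(dV**2)
    return 100.0 * np.sqrt(np.sum((dE - F1 * dV) ** 2) / np.sum(F1**2 * dV**2))

# Hypothetical data: visual differences and two formulas' computed differences.
rng = np.random.default_rng(10)
dV = rng.uniform(0.5, 5.0, 50)
dE_formula_A = 1.1 * dV + rng.normal(0, 0.3, 50)   # closely follows the visual data
dE_formula_B = dV**1.3 + rng.normal(0, 0.6, 50)    # systematically distorted

print("STRESS, formula A: %.1f" % stress(dE_formula_A, dV))
print("STRESS, formula B: %.1f" % stress(dE_formula_B, dV))
```

Because STRESS is scale-invariant, the ratio of two STRESS values can be compared with an F distribution to judge whether one formula performs significantly better than another, which is the kind of inference the abstract highlights.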
Techniques for recognizing identity of several response functions from the data of visual inspection
NASA Astrophysics Data System (ADS)
Nechval, Nicholas A.
1996-08-01
The purpose of this paper is to present some efficient techniques for recognizing from the observed data whether several response functions are identical to each other. For example, in an industrial setting the problem may be to determine whether the production coefficients established in a small-scale pilot study apply to each of several large- scale production facilities. The techniques proposed here combine sensor information from automated visual inspection of manufactured products which is carried out by means of pixel-by-pixel comparison of the sensed image of the product to be inspected with some reference pattern (or image). Let (a1, . . . , am) be p-dimensional parameters associated with m response models of the same type. This study is concerned with the simultaneous comparison of a1, . . . , am. A generalized maximum likelihood ratio (GMLR) test is derived for testing equality of these parameters, where each of the parameters represents a corresponding vector of regression coefficients. The GMLR test reduces to an equivalent test based on a statistic that has an F distribution. The main advantage of the test lies in its relative simplicity and the ease with which it can be applied. Another interesting test for the same problem is an application of Fisher's method of combining independent test statistics which can be considered as a parallel procedure to the GMLR test. The combination of independent test statistics does not appear to have been used very much in applied statistics. There does, however, seem to be potential data analytic value in techniques for combining distributional assessments in relation to statistically independent samples which are of joint experimental relevance. In addition, a new iterated test for the problem defined above is presented. A rejection of the null hypothesis by this test provides some reason why all the parameters are not equal. A numerical example is discussed in the context of the proposed procedures for hypothesis testing.
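The GMLR test described above reduces to an F statistic; a closely related classical construction is the Chow-type F-test that compares a pooled fit (one common coefficient vector) against separate per-group fits. The sketch below applies it to synthetic data and is not the paper's exact procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(12)

def ssr(X, y):
    """Residual sum of squares of an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

# Hypothetical data: m groups (e.g. production facilities), each with n responses
# and the same p regressors (intercept + 2 predictors).
m, n, p = 3, 40, 3
groups = []
for _ in range(m):
    Z = rng.normal(size=(n, 2))
    X = np.column_stack([np.ones(n), Z])
    beta = np.array([1.0, 2.0, -1.0])          # identical in every group under H0
    y = X @ beta + rng.normal(0, 1.0, n)
    groups.append((X, y))

# Unrestricted: separate coefficients per group.  Restricted: one common vector.
ssr_sep = sum(ssr(X, y) for X, y in groups)
X_all = np.vstack([X for X, _ in groups])
y_all = np.concatenate([y for _, y in groups])
ssr_pooled = ssr(X_all, y_all)

df1, df2 = (m - 1) * p, m * n - m * p
F = ((ssr_pooled - ssr_sep) / df1) / (ssr_sep / df2)
p_value = stats.f.sf(F, df1, df2)
print(f"F = {F:.3f}, p = {p_value:.3f}")       # large p: no evidence the parameters differ
```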
Oelze, Michael L; Mamou, Jonathan
2016-02-01
Conventional medical imaging technologies, including ultrasound, have continued to improve over the years. For example, in oncology, medical imaging is characterized by high sensitivity, i.e., the ability to detect anomalous tissue features, but the ability to classify these tissue features from images often lacks specificity. As a result, a large number of biopsies of tissues with suspicious image findings are performed each year with a vast majority of these biopsies resulting in a negative finding. To improve specificity of cancer imaging, quantitative imaging techniques can play an important role. Conventional ultrasound B-mode imaging is mainly qualitative in nature. However, quantitative ultrasound (QUS) imaging can provide specific numbers related to tissue features that can increase the specificity of image findings leading to improvements in diagnostic ultrasound. QUS imaging can encompass a wide variety of techniques including spectral-based parameterization, elastography, shear wave imaging, flow estimation, and envelope statistics. Currently, spectral-based parameterization and envelope statistics are not available on most conventional clinical ultrasound machines. However, in recent years, QUS techniques involving spectral-based parameterization and envelope statistics have demonstrated success in many applications, providing additional diagnostic capabilities. Spectral-based techniques include the estimation of the backscatter coefficient (BSC), estimation of attenuation, and estimation of scatterer properties such as the correlation length associated with an effective scatterer diameter (ESD) and the effective acoustic concentration (EAC) of scatterers. Envelope statistics include the estimation of the number density of scatterers and quantification of coherent to incoherent signals produced from the tissue. Challenges for clinical application include correctly accounting for attenuation effects and transmission losses and implementation of QUS on clinical devices. Successful clinical and preclinical applications demonstrating the ability of QUS to improve medical diagnostics include characterization of the myocardium during the cardiac cycle, cancer detection, classification of solid tumors and lymph nodes, detection and quantification of fatty liver disease, and monitoring and assessment of therapy.
Ben Chaabane, Salim; Fnaiech, Farhat
2014-01-23
Color image segmentation has so far been applied in many areas; hence, many different techniques have recently been developed and proposed. In the medical imaging area, image segmentation may help doctors follow up a patient's disease from processed breast cancer images. The main objective of this work is to rebuild and also to enhance each cell from the three component images provided by an input image. Indeed, starting from an initial segmentation obtained using statistical features and histogram threshold techniques, the resulting segmentation may accurately represent the incomplete and merged (pasted) cells and enhance them. This provides real help to doctors, as these cells become clear and easy to count. A novel method for color edge extraction based on statistical features and an automatic threshold is presented. The traditional edge detector, based on the first- and second-order neighborhood describing the relationship between the current pixel and its neighbors, is extended to the statistical domain. Hence, color edges in an image are obtained by combining the statistical features and the automatic threshold techniques. Finally, on the obtained color edges with specific primitive color, a combination rule is used to integrate the edge results over the three color components. Breast cancer cell images were used to evaluate the performance of the proposed method both quantitatively and qualitatively. Hence, a visual and a numerical assessment based on the probability of correct classification (PC), the false classification (Pf), and the classification accuracy (Sens(%)) are presented and compared with existing techniques. The proposed method shows its superiority in the detection of points which really belong to the cells, and also in the ease of counting the processed cells. Computer simulations highlight that the proposed method substantially enhances the segmented image, with smaller error rates than other existing algorithms under the same settings (patterns and parameters). Moreover, it provides high classification accuracy, reaching a rate of 97.94%. Additionally, the segmentation method may be extended to other medical imaging types having similar properties.
Zhou, Bing; Li, Ming-Hua; Wang, Wu; Xu, Hao-Wen; Cheng, Yong-De; Wang, Jue
2010-03-01
The authors conducted a study to evaluate the advantages of a 3D volume-rendering technique (VRT) in follow-up digital subtraction (DS) angiography of coil-embolized intracranial aneurysms. One hundred nine patients with 121 intracranial aneurysms underwent endovascular coil embolization and at least 1 follow-up DS angiography session at the authors' institution. Two neuroradiologists independently evaluated the conventional 2D DS angiograms, rotational angiograms, and 3D VRT images obtained at the interventional procedures and DS angiography follow-up. If multiple follow-up sessions were performed, the final follow-up was mainly considered. The authors compared the 3 techniques for their ability to detect aneurysm remnants (including aneurysm neck and sac remnants) and parent artery stenosis based on the angiographic follow-up. The Kruskal-Wallis test was used for group comparisons, and the kappa test was used to measure interobserver agreement. Statistical analyses were performed using commercially available software. There was a high statistical significance among 2D DS angiography, rotational angiography, and 3D VRT results (χ² = 9.9613, p = 0.0069) when detecting an aneurysm remnant. Further comparisons disclosed a statistical significance between 3D VRT and rotational angiography (χ² = 4.9754, p = 0.0257); a high statistical significance between 3D VRT and 2D DS angiography (χ² = 8.9169, p = 0.0028); and no significant difference between rotational angiography and 2D DS angiography (χ² = 0.5648, p = 0.4523). There was no statistical significance among the 3 techniques when detecting parent artery stenosis (χ² = 2.5164, p = 0.2842). One case, in which parent artery stenosis was diagnosed by 2D DS angiography and rotational angiography, was excluded by 3D VRT following observations of multiple views. The kappa test showed good agreement between the 2 observers. The 3D VRT is more sensitive in detecting aneurysm remnants than 2D DS angiography and rotational angiography and is helpful for identifying parent artery stenosis. The authors recommend this technique for the angiographic follow-up of patients with coil-embolized aneurysms.
Spectroscopic signatures of localization with interacting photons in superconducting qubits
NASA Astrophysics Data System (ADS)
Roushan, P.; Neill, C.; Tangpanitanon, J.; Bastidas, V. M.; Megrant, A.; Barends, R.; Chen, Y.; Chen, Z.; Chiaro, B.; Dunsworth, A.; Fowler, A.; Foxen, B.; Giustina, M.; Jeffrey, E.; Kelly, J.; Lucero, E.; Mutus, J.; Neeley, M.; Quintana, C.; Sank, D.; Vainsencher, A.; Wenner, J.; White, T.; Neven, H.; Angelakis, D. G.; Martinis, J.
2017-12-01
Quantized eigenenergies and their associated wave functions provide extensive information for predicting the physics of quantum many-body systems. Using a chain of nine superconducting qubits, we implement a technique for resolving the energy levels of interacting photons. We benchmark this method by capturing the main features of the intricate energy spectrum predicted for two-dimensional electrons in a magnetic field—the Hofstadter butterfly. We introduce disorder to study the statistics of the energy levels of the system as it undergoes the transition from a thermalized to a localized phase. Our work introduces a many-body spectroscopy technique to study quantum phases of matter.
Berger, Cezar Augusto Sarraf; Freitas, Renato da Silva; Malafaia, Osvaldo; Pinto, José Simão de Paula; Macedo Filho, Evaldo Dacheux; Mocellin, Marcos; Fagundes, Marina Serrato Coelho
2014-01-01
Introduction The knowledge and study of surgical techniques and anthropometric measurements of the nose make possible a qualitative and quantitative analysis of surgical results. Objective Study the main technique used in rhinoplasty on Caucasian noses and compare preoperative and postoperative anthropometric measurements of the nose. Methods A prospective study with 170 patients was performed at a private hospital. Data were collected using the Electronic System Integrated of Protocols software (Sistema Integrado de Protocolos Eletrônicos, SINPE©). The surgical techniques used in the nasal dorsum and tip were evaluated. Preoperative and 12-month follow-up photos as well as the measurements compared with the ideal aesthetic standard of a Caucasian nose were analyzed objectively. Student t test and standard deviation test were applied. Results There was a predominance of endonasal access (94.4%). The most common dorsum technique was hump removal (33.33%), and the predominance of sutures (24.76%) was observed on the nasal tip, with the lateral intercrural the most frequent (32.39%). Comparison between preoperative and postoperative photos found statistically significant alterations on the anthropometric measurements of the noses. Conclusion The main surgical techniques on Caucasian noses were evaluated, and a great variety was found. The evaluation of anthropometric measurements of the nose proved the efficiency of the performed procedures. PMID:25992149
Dimitriadis, Konstantinos; Spyropoulos, Konstantinos; Papadopoulos, Triantafillos
2018-02-01
The aim of the present study was to record the metal-ceramic bond strength of a feldspathic dental porcelain and a Co-Cr alloy, using the Direct Metal Laser Sintering (DMLS) technique for the fabrication of metal substrates. Ten metal substrates were fabricated with powder of a dental Co-Cr alloy using the DMLS technique (test group) in dimensions according to ISO 9693. Another ten substrates were fabricated with a casting dental Co-Cr alloy using the classic casting technique (control group) for comparison. Another three substrates were fabricated using each technique to record the Modulus of Elasticity (E) of the alloys used. All substrates were examined to record external and internal porosity. Feldspathic porcelain was applied on the substrates. Specimens were tested using the three-point bending test. The failure mode was determined using optical and scanning electron microscopy. The statistical analysis was performed using the t-test. Substrates prepared using the DMLS technique did not show internal porosity as compared to those produced using the casting technique. The E of the control and test groups was 222 ± 5.13 GPa and 227 ± 3 GPa, respectively. The bond strength was 51.87 ± 7.50 MPa for the test group and 54.60 ± 6.20 MPa for the control group. No statistically significant differences between the two groups were recorded. The mode of failure was mainly cohesive for all specimens. Specimens produced by the DMLS technique meet the lowest acceptable metal-ceramic bond strength of 25 MPa specified in ISO 9693 and present satisfactory bond strength for clinical use.
Schramm, Jesper; Andersen, Morten; Vach, Kirstin; Kragstrup, Jakob; Peter Kampmann, Jens; Søndergaard, Jens
2007-01-01
Objective To examine the extent and composition of pharmaceutical industry representatives' marketing techniques, with a particular focus on drug sampling in relation to drug age. Design A group of 47 GPs prospectively collected data on drug promotional activities during a six-month period, and a sub-sample of 10 GPs furthermore recorded the representatives' marketing techniques in detail. Setting Primary healthcare. Subjects General practitioners in the County of Funen, Denmark. Main outcome measures Promotional visits and corresponding marketing techniques. Results The 47 GPs recorded 1050 visits, corresponding to a median of 19 (range 3 to 63) per GP over the six months. The majority of drugs promoted (52%) had been marketed more than five years earlier. There was a statistically significant decline with drug age in the proportion of visits where drug samples were offered, but the decline was small (OR 0.97 per year, 95% CI 0.95 to 0.98). Leaflets (68%), suggestions on how to improve therapy for a specific patient registered with the practice (53%), drug samples (48%), and gifts (36%) were the most frequently used marketing techniques. Conclusion Drug-industry representatives use a variety of promotional methods. The tendency to hand out drug samples was statistically significantly associated with drug age, but the decline was small. PMID:17497486
The Global Signature of Ocean Wave Spectra
NASA Astrophysics Data System (ADS)
Portilla-Yandún, Jesús
2018-01-01
A global atlas of ocean wave spectra is developed and presented. The development is based on a new technique for deriving wave spectral statistics, which is applied to the extensive ERA-Interim database from the European Centre for Medium-Range Weather Forecasts. The spectral statistics are based on the idea of long-term wave systems, which are unique and distinct at every geographical point. The identification of those wave systems allows their separation from the overall spectrum using the partition technique. Their further characterization is made using standard integrated parameters, which turn out to be much more meaningful when applied to the individual components than to the total spectrum. The parameters developed include the density distribution of spectral partitions, which is the main descriptor; the identified wave systems; the individual distributions of the characteristic frequencies, directions, wave height and wave age; the seasonal variability of wind and waves; return periods derived from extreme value analysis; and crossing-sea probabilities. This information is made available in web format for public use at http://www.modemat.epn.edu.ec/#/nereo. It is found that wave spectral statistics offers the possibility to synthesize data while providing a direct and comprehensive view of the local and regional wave conditions.
Spectral region optimization for Raman-based optical biopsy of inflammatory lesions.
de Carvalho, Luis Felipe das Chagas E Silva; Bitar, Renata Andrade; Arisawa, Emília Angela Loschiavo; Brandão, Adriana Aigotti Haberbeck; Honório, Kathia Maria; Cabral, Luiz Antônio Guimarães; Martin, Airton Abrahão; Martinho, Herculano da Silva; Almeida, Janete Dias
2010-08-01
The biochemical alterations between inflammatory fibrous hyperplasia (IFH) and normal tissues of buccal mucosa were probed using the FT-Raman spectroscopy technique. The aim was to find the minimal set of Raman bands that would furnish the best discrimination. Raman-based optical biopsy is a widely recognized potential technique for noninvasive real-time diagnosis. However, few studies have been devoted to the discrimination of very common subtle or early pathologic states, such as inflammatory processes, that are always present on, for example, cancer lesion borders. Seventy spectra of IFH from 14 patients were compared with 30 spectra of normal tissues from six patients. The statistical analysis was performed with principal component analysis and soft independent modeling of class analogy, using cross-validated, leave-one-out methods. Bands close to 574, 1,100, 1,250 to 1,350, and 1,500 cm(-1) (mainly amino acid and collagen bands) showed the main intragroup variations, which are due to the acanthosis process in the IFH epithelium. The 1,200 (C-C aromatic/DNA), 1,350 (CH(2) bending/collagen 1), and 1,730 cm(-1) (collagen III) regions presented the main intergroup variations. This finding was interpreted as originating in an extracellular matrix degeneration process occurring in the inflammatory tissues. The statistical analysis results indicated that the best discrimination capability (sensitivity of 95% and specificity of 100%) was found by using the 530-580 cm(-1) spectral region. The existence of this narrow spectral window enabling normal and inflammatory diagnosis also had useful implications for an in vivo dispersive Raman setup for clinical applications.
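A minimal, hedged sketch of this type of analysis: PCA on a restricted spectral window followed by leave-one-out cross-validated classification. The class-analogy step (SIMCA) is replaced here by a plain k-nearest-neighbors classifier, and the spectra, labels, and the 530-580 cm⁻¹ window are synthetic placeholders rather than the study's data.

```python
# Sketch: PCA + leave-one-out classification of Raman spectra restricted
# to a candidate spectral window. Data shapes and labels are illustrative.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
wavenumbers = np.arange(400, 1800, 2.0)            # cm^-1 axis (synthetic)
X = rng.normal(size=(100, wavenumbers.size))       # 100 synthetic spectra
y = np.r_[np.zeros(30, dtype=int), np.ones(70, dtype=int)]  # 0=normal, 1=IFH

window = (wavenumbers >= 530) & (wavenumbers <= 580)
pipe = make_pipeline(PCA(n_components=5), KNeighborsClassifier(n_neighbors=3))
pred = cross_val_predict(pipe, X[:, window], y, cv=LeaveOneOut())

tp = np.sum((pred == 1) & (y == 1)); fn = np.sum((pred == 0) & (y == 1))
tn = np.sum((pred == 0) & (y == 0)); fp = np.sum((pred == 1) & (y == 0))
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```

With real spectra, sensitivity and specificity computed this way can be compared across candidate spectral windows to locate the most discriminative region.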
Barry, Robert L.; Klassen, L. Martyn; Williams, Joy M.; Menon, Ravi S.
2008-01-01
A troublesome source of physiological noise in functional magnetic resonance imaging (fMRI) is due to the spatio-temporal modulation of the magnetic field in the brain caused by normal subject respiration. fMRI data acquired using echo-planar imaging is very sensitive to these respiratory-induced frequency offsets, which cause significant geometric distortions in images. Because these effects increase with main magnetic field, they can nullify the gains in statistical power expected by the use of higher magnetic fields. As a study of existing navigator correction techniques for echo-planar fMRI has shown that further improvements can be made in the suppression of respiratory-induced physiological noise, a new hybrid two-dimensional (2D) navigator is proposed. Using a priori knowledge of the slow spatial variations of these induced frequency offsets, 2D field maps are constructed for each shot using spatial frequencies between ±0.5 cm−1 in k-space. For multi-shot fMRI experiments, we estimate that the improvement of hybrid 2D navigator correction over the best performance of one-dimensional navigator echo correction translates into a 15% increase in the volume of activation, 6% and 10% increases in the maximum and average t-statistics, respectively, for regions with high t-statistics, and 71% and 56% increases in the maximum and average t-statistics, respectively, in regions with low t-statistics due to contamination by residual physiological noise. PMID:18024159
Protocol Design Challenges in the Detection of Awareness in Aware Subjects Using EEG Signals.
Henriques, J; Gabriel, D; Grigoryeva, L; Haffen, E; Moulin, T; Aubry, R; Pazart, L; Ortega, J-P
2016-10-01
Recent studies have evidenced serious difficulties in detecting covert awareness with electroencephalography-based techniques, both in unresponsive patients and in healthy control subjects. This work reproduces the protocol design of two recent mental imagery studies with a larger group comprising 20 healthy volunteers. The main goal is to assess whether modifications in the signal extraction techniques, training-testing/cross-validation routines, and hypotheses evoked in the statistical analysis can provide solutions to the serious difficulties documented in the literature. The lack of robustness in the results argues for a further search for alternative protocols more suitable for machine learning classification and for better-performing signal treatment techniques. Specific recommendations are made using the findings in this work. © EEG and Clinical Neuroscience Society (ECNS) 2014.
NASA Astrophysics Data System (ADS)
Arif, Sajjad; Tanwir Alam, Md; Ansari, Akhter H.; Bilal Naim Shaikh, Mohd; Arif Siddiqui, M.
2018-05-01
The tribological performance of aluminium hybrid composites reinforced with micro SiC (5 wt%) and nano zirconia (0, 3, 6 and 9 wt%), fabricated through the powder metallurgy technique, was investigated using statistical and artificial neural network (ANN) approaches. The influence of zirconia reinforcement, sliding distance and applied load was analyzed with tests based on a full factorial design of experiments. Analysis of variance (ANOVA) was used to evaluate the percentage contribution of each process parameter to wear loss. The ANOVA approach suggested that wear loss is mainly influenced by sliding distance, followed by zirconia reinforcement and applied load. Further, a feed-forward back-propagation neural network was applied to the input/output data for predicting and analyzing the wear behaviour of the fabricated composite. A very close correlation between experimental and ANN outputs was achieved by implementing the model. Finally, the ANN model was effectively used to find the influence of the various control factors on the wear behaviour of the hybrid composites.
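As an illustration of the ANN step, the sketch below fits a small feed-forward network mapping (zirconia wt%, applied load, sliding distance) to wear loss. This is not the paper's model: the architecture, scaling, and training data are assumptions, with a made-up wear relation used only so the example runs end to end.

```python
# Illustrative feed-forward regression of wear loss on process parameters.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
# columns: zirconia wt%, applied load (N), sliding distance (m) -- synthetic
X = rng.uniform([0, 10, 500], [9, 50, 3000], size=(60, 3))
y = 0.02 * X[:, 2] - 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 5, 60)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                                   random_state=1))
model.fit(X, y)
print("R^2 on training data:", model.score(X, y))
print("predicted wear at 6 wt%, 30 N, 1500 m:",
      model.predict([[6, 30, 1500]])[0])
```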
Recognition of speaker-dependent continuous speech with KEAL
NASA Astrophysics Data System (ADS)
Mercier, G.; Bigorgne, D.; Miclet, L.; Le Guennec, L.; Querre, M.
1989-04-01
A description of the speaker-dependent continuous speech recognition system KEAL is given. An unknown utterance is recognized by means of the following procedures: acoustic analysis, phonetic segmentation and identification, and word and sentence analysis. The combination of feature-based, speaker-independent coarse phonetic segmentation with speaker-dependent statistical classification techniques is one of the main design features of the acoustic-phonetic decoder. The lexical access component is essentially based on a statistical dynamic programming technique which aims at matching a phonemic lexical entry containing various phonological forms against a phonetic lattice. Sentence recognition is achieved by use of a context-free grammar and a parsing algorithm derived from Earley's parser. A speaker adaptation module allows some of the system parameters to be adjusted by matching known utterances with their acoustical representation. The task to be performed, described by its vocabulary and its grammar, is given as a parameter of the system. Continuously spoken sentences extracted from a 'pseudo-Logo' language are analyzed and results are presented.
Lachapelle, J M; Gouverneur, J C; Boulet, M; Tennstedt, D
1977-07-01
A technical modification of skin surface biopsy has been introduced by using plastic tape instead of glass as holder, mainly to investigate mycological infections of skin folds. Among various brands of plastic sheets, a polyester film (Melinex O UCB-SIDAC) has been demonstrated as the most suitable. A direct microscopic comparison has been made between our modified technique and conventional scraping as procedures for collecting material from interdigital spaces in 30 patients with bilateral athlete's foot. It has been shown that the skin surface biopsy gives a slightly greater number of positive results (presence of dermatophytes or Candida species) than the conventional scraping technique, although the difference between both techniques is not statistically significant at the 0.05 level (0.05 < P < 0.10). Some advantages of the modified skin surface biopsy are emphasized.
NASA Astrophysics Data System (ADS)
Fernandez, Carlos; Platero, Carlos; Campoy, Pascual; Aracil, Rafael
1994-11-01
This paper describes some texture-based techniques that can be applied to quality assessment of continuously produced flat products (metal strips, wooden surfaces, cork, textile products, ...). Since the most difficult task is that of inspecting for product appearance, human-like inspection ability is required. A common feature of all these products is the presence of non-deterministic texture on their surfaces. Two main subjects are discussed: statistical techniques for both surface finishing determination and surface defect analysis, as well as real-time implementation for on-line inspection in high-speed applications. For surface finishing determination, a Gray Level Difference technique that operates on low-resolution (non-zoomed) images is presented. Defect analysis is performed by means of statistical texture analysis over defective portions of the surface. On-line implementation is accomplished by means of neural networks. When a defect arises, textural analysis is applied, which results in a data vector acting as the input of a neural net previously trained in a supervised way. This approach tries to reach on-line performance in automated visual inspection applications when texture is present on flat product surfaces.
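A small sketch of what gray-level-difference texture features can look like in practice; the displacement vector, number of gray levels, and the feature set (mean, contrast, energy, entropy) follow a common GLD formulation and are assumptions rather than the exact features used in the paper.

```python
# Gray-level-difference texture features for one displacement (dx, dy).
import numpy as np

def gld_features(img, dx=1, dy=0, levels=256):
    """Histogram of absolute gray-level differences at displacement (dx, dy)."""
    h, w = img.shape
    a = img[0:h - dy, 0:w - dx]
    b = img[dy:h, dx:w]
    diff = np.abs(a.astype(int) - b.astype(int))
    p, _ = np.histogram(diff, bins=np.arange(levels + 1), density=True)
    i = np.arange(levels)
    return {
        "mean": float((i * p).sum()),
        "contrast": float((i ** 2 * p).sum()),
        "energy": float((p ** 2).sum()),
        "entropy": float(-(p[p > 0] * np.log2(p[p > 0])).sum()),
    }

# Example on a synthetic low-resolution surface image
rng = np.random.default_rng(2)
surface = rng.integers(0, 256, size=(128, 128))
print(gld_features(surface, dx=1, dy=0))
```

In an inspection pipeline, feature vectors of this kind (computed over image patches) would feed the supervised neural network mentioned above.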
Al-Kindi, Khalifa M; Kwan, Paul; R Andrew, Nigel; Welch, Mitchell
2017-01-01
In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as analyse their current biographical patterns and predict their future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human practice related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods in designing both local and regional level integrated pest management strategies for palm trees and other affected cultivated crops.
Ashok, Anup; Kumar, Devarai Santhosh
2017-10-01
Optimization techniques can be considered part of nature's way of adjusting to the changes happening around it. There are different factors that establish the optimum working conditions for the production of any value-added product. A model is accepted for a particular process after its sustainability has been verified on a statistical and analytical level. Optimization techniques can be divided into statistical, nature-inspired and artificial neural network categories, each with its own benefits and usage in particular cases. A brief introduction to the subcategories of the different techniques that are available and their computational effectiveness is given. The main focus of the study revolves around the applicability of these techniques to particular operations such as submerged fermentation (SmF) and solid-state fermentation (SSF), their ability to produce secondary metabolites, and their usefulness at the laboratory and industrial level. Primary studies to determine the enzyme activity of different microorganisms such as bacteria, fungi and yeast are also discussed. l-Asparaginase, one of the most commonly used drugs in the treatment of acute lymphoblastic leukemia (ALL), is considered as an example, and models used for its production by SmF and SSF are briefly discussed to illustrate the optimization techniques being dealt with. It is expected that this discussion will help in determining the proper technique to use in running any optimization process for different purposes, and will help in making these processes less time-consuming with better output.
Statistical modeling of optical attenuation measurements in continental fog conditions
NASA Astrophysics Data System (ADS)
Khan, Muhammad Saeed; Amin, Muhammad; Awan, Muhammad Saleem; Minhas, Abid Ali; Saleem, Jawad; Khan, Rahimdad
2017-03-01
Free-space optics is an innovative technology that uses the atmosphere as a propagation medium to provide higher data rates. These links are heavily affected by the atmospheric channel, mainly because fog and clouds scatter and even block the modulated beam of light from reaching the receiver end, hence imposing severe attenuation. A comprehensive statistical study of fog effects and a deep physical understanding of the fog phenomena are very important for suggesting improvements (reliability and efficiency) in such communication systems. In this regard, six months of real-time measured fog attenuation data are considered and statistically investigated. A detailed statistical analysis related to each fog event in that period is presented; the best probability density functions are selected on the basis of the Akaike information criterion, while the estimates of the unknown parameters are computed by the maximum likelihood estimation technique. The results show that most fog attenuation events follow a normal mixture distribution and some follow the Weibull distribution.
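A hedged sketch of this model-selection workflow: fit a few candidate distributions to an attenuation record by maximum likelihood and rank them by AIC. The attenuation values below are synthetic, the candidate set is an assumption, and a two-component Gaussian mixture is fitted with scikit-learn because SciPy does not expose it as a standard frozen distribution.

```python
# Fit candidate distributions to fog attenuation (dB) and rank by AIC.
import numpy as np
from scipy import stats
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
atten = np.concatenate([rng.normal(15, 3, 400), rng.normal(40, 6, 200)])  # dB

candidates = {"weibull": stats.weibull_min, "lognormal": stats.lognorm,
              "gamma": stats.gamma}
aic = {}
for name, dist in candidates.items():
    params = dist.fit(atten)                       # maximum likelihood fit
    loglik = dist.logpdf(atten, *params).sum()
    aic[name] = 2 * len(params) - 2 * loglik

gmm = GaussianMixture(n_components=2).fit(atten.reshape(-1, 1))
aic["normal mixture"] = gmm.aic(atten.reshape(-1, 1))

print(sorted(aic.items(), key=lambda kv: kv[1]))   # smaller AIC = better fit
```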
NASA Astrophysics Data System (ADS)
Karali, Anna; Giannakopoulos, Christos; Frias, Maria Dolores; Hatzaki, Maria; Roussos, Anargyros; Casanueva, Ana
2013-04-01
Forest fires have always been present in Mediterranean ecosystems, and thus they constitute a major ecological and socio-economic issue. Over the last few decades, though, the number of forest fires has significantly increased, as well as their severity and impact on the environment. Local fire danger projections are often required when dealing with wildfire research. In the present study, statistical downscaling and spatial interpolation methods were applied to the Canadian Fire Weather Index (FWI) in order to assess forest fire risk in Greece. The FWI is used worldwide (including the Mediterranean basin) to estimate fire danger in a generalized fuel type, based solely on weather observations. The meteorological inputs to the FWI System are noon values of dry-bulb temperature, air relative humidity, 10 m wind speed and precipitation during the previous 24 hours. The statistical downscaling methods are based on a statistical model that takes into account empirical relationships between large-scale variables (used as predictors) and local-scale variables. In the framework of the current study, the statistical downscaling portal developed by the Santander Meteorology Group (https://www.meteo.unican.es/downscaling) in the framework of the EU project CLIMRUN (www.climrun.eu) was used to downscale non-standard parameters related to forest fire risk. Two different approaches were adopted. First, the analogue downscaling technique was applied directly to the FWI index values, and second, the same downscaling technique was applied indirectly through the meteorological inputs of the index. In both cases, the statistical downscaling portal was used with the ERA-Interim reanalysis as predictands due to the lack of observations at noon. Additionally, a three-dimensional (3D) interpolation method of position and elevation, based on thin plate splines (TPS), was used to interpolate the ERA-Interim data used to calculate the index. Results from this method were compared with the statistical downscaling results obtained from the portal. Finally, the FWI was computed using weather observations obtained from the Hellenic National Meteorological Service, mainly in the southern continental part of Greece, and a comparison with the previous results was performed.
Statistical Validation for Clinical Measures: Repeatability and Agreement of Kinect™-Based Software.
Lopez, Natalia; Perez, Elisa; Tello, Emanuel; Rodrigo, Alejandro; Valentinuzzi, Max E
2018-01-01
The rehabilitation process is a fundamental stage for recovery of people's capabilities. However, the evaluation of the process is performed by physiatrists and medical doctors, mostly based on their observations, that is, a subjective appreciation of the patient's evolution. This paper proposes a tracking platform of the movement made by an individual's upper limb using Kinect sensor(s) to be applied for the patient during the rehabilitation process. The main contribution is the development of quantifying software and the statistical validation of its performance, repeatability, and clinical use in the rehabilitation process. The software determines joint angles and upper limb trajectories for the construction of a specific rehabilitation protocol and quantifies the treatment evolution. In turn, the information is presented via a graphical interface that allows the recording, storage, and report of the patient's data. For clinical purposes, the software information is statistically validated with three different methodologies, comparing the measures with a goniometer in terms of agreement and repeatability. The agreement of joint angles measured with the proposed software and goniometer is evaluated with Bland-Altman plots; all measurements fell well within the limits of agreement, meaning interchangeability of both techniques. Additionally, the results of Bland-Altman analysis of repeatability show 95% confidence. Finally, the physiotherapists' qualitative assessment shows encouraging results for the clinical use. The main conclusion is that the software is capable of offering a clinical history of the patient and is useful for quantification of the rehabilitation success. The simplicity, low cost, and visualization possibilities enhance the use of the software Kinect for rehabilitation and other applications, and the expert's opinion endorses the choice of our approach for clinical practice. Comparison of the new measurement technique with established goniometric methods determines that the proposed software agrees sufficiently to be used interchangeably.
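To make the agreement analysis concrete, here is a minimal Bland-Altman sketch comparing paired joint-angle measurements. The bias and the 1.96·SD limits of agreement follow the standard Bland-Altman formulation, and the paired data are synthetic placeholders rather than the study's measurements.

```python
# Bland-Altman agreement between Kinect-based and goniometer joint angles.
import numpy as np

rng = np.random.default_rng(4)
goniometer = rng.uniform(10, 150, 50)                  # degrees (synthetic)
kinect = goniometer + rng.normal(0.5, 2.0, 50)         # small bias + noise

diff = kinect - goniometer
mean_pair = (kinect + goniometer) / 2.0
bias = diff.mean()
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)             # 95% limits of agreement

print(f"bias = {bias:.2f} deg, limits of agreement = "
      f"[{loa[0]:.2f}, {loa[1]:.2f}] deg")
# A Bland-Altman plot is simply diff vs. mean_pair with horizontal lines
# drawn at the bias and at the two limits of agreement.
```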
Statistical methods for thermonuclear reaction rates and nucleosynthesis simulations
NASA Astrophysics Data System (ADS)
Iliadis, Christian; Longland, Richard; Coc, Alain; Timmes, F. X.; Champagne, Art E.
2015-03-01
Rigorous statistical methods for estimating thermonuclear reaction rates and nucleosynthesis are becoming increasingly established in nuclear astrophysics. The main challenge being faced is that experimental reaction rates are highly complex quantities derived from a multitude of different measured nuclear parameters (e.g., astrophysical S-factors, resonance energies and strengths, particle and γ-ray partial widths). We discuss the application of the Monte Carlo method to two distinct, but related, questions. First, given a set of measured nuclear parameters, how can one best estimate the resulting thermonuclear reaction rates and associated uncertainties? Second, given a set of appropriate reaction rates, how can one best estimate the abundances from nucleosynthesis (i.e., reaction network) calculations? The techniques described here provide probability density functions that can be used to derive statistically meaningful reaction rates and final abundances for any desired coverage probability. Examples are given for applications to s-process neutron sources, core-collapse supernovae, classical novae, and Big Bang nucleosynthesis.
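As a simplified illustration of the first question, the sketch below propagates assumed uncertainties of a single narrow resonance (Gaussian for the energy, lognormal for the strength) through the standard narrow-resonance rate expression and summarizes the resulting rate distribution. The resonance parameters, reduced mass, temperature, and numerical constant are placeholders and assumptions, not evaluated nuclear data.

```python
# Monte Carlo propagation of resonance-parameter uncertainties into a rate.
import numpy as np

N = 10000
T9 = 0.1                     # temperature in GK (placeholder)
mu = 0.9                     # reduced mass in amu (placeholder)
rng = np.random.default_rng(5)

# One narrow resonance: energy (MeV) with Gaussian error, strength (MeV)
# with lognormal error (a common choice for strictly positive quantities).
E_r = rng.normal(0.150, 0.002, N)
wg = np.exp(rng.normal(np.log(1.0e-8), 0.2, N))

# Assumed narrow-resonance rate expression:
# N_A<sigma v> = 1.5399e11 * (mu*T9)^(-3/2) * wg * exp(-11.605*E_r/T9)
rate = 1.5399e11 * (mu * T9) ** -1.5 * wg * np.exp(-11.605 * E_r / T9)

lo, med, hi = np.percentile(rate, [16, 50, 84])
print(f"rate = {med:.3e} (+{hi - med:.2e} / -{med - lo:.2e}) cm^3 mol^-1 s^-1")
```

The same sampled rates can then be fed, sample by sample, into a reaction-network calculation to obtain probability density functions for the final abundances.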
Georgouli, Konstantia; Martinez Del Rincon, Jesus; Koidis, Anastasios
2017-02-15
The main objective of this work was to develop a novel dimensionality reduction technique as a part of an integrated pattern recognition solution capable of identifying adulterants such as hazelnut oil in extra virgin olive oil at low percentages based on spectroscopic chemical fingerprints. A novel Continuous Locality Preserving Projections (CLPP) technique is proposed which allows the modelling of the continuous nature of the produced in-house admixtures as data series instead of discrete points. The maintenance of the continuous structure of the data manifold enables the better visualisation of this examined classification problem and facilitates the more accurate utilisation of the manifold for detecting the adulterants. The performance of the proposed technique is validated with two different spectroscopic techniques (Raman and Fourier transform infrared, FT-IR). In all cases studied, CLPP accompanied by k-Nearest Neighbors (kNN) algorithm was found to outperform any other state-of-the-art pattern recognition techniques. Copyright © 2016 Elsevier Ltd. All rights reserved.
Spyropoulos, Konstantinos
2018-01-01
PURPOSE The aim of the present study was to record the metal-ceramic bond strength of a feldspathic dental porcelain and a Co-Cr alloy, using the Direct Metal Laser Sintering technique (DMLS) for the fabrication of metal substrates. MATERIALS AND METHODS Ten metal substrates were fabricated with powder of a dental Co-Cr alloy using DMLS technique (test group) in dimensions according to ISO 9693. Another ten substrates were fabricated with a casing dental Co-Cr alloy using classic casting technique (control group) for comparison. Another three substrates were fabricated using each technique to record the Modulus of Elasticity (E) of the used alloys. All substrates were examined to record external and internal porosity. Feldspathic porcelain was applied on the substrates. Specimens were tested using the three-point bending test. The failure mode was determined using optical and scanning electron microscopy. The statistical analysis was performed using t-test. RESULTS Substrates prepared using DMLS technique did not show internal porosity as compared to those produced using the casting technique. The E of control and test group was 222 ± 5.13 GPa and 227 ± 3 GPa, respectively. The bond strength was 51.87 ± 7.50 MPa for test group and 54.60 ± 6.20 MPa for control group. No statistically significant differences between the two groups were recorded. The mode of failure was mainly cohesive for all specimens. CONCLUSION Specimens produced by the DMLS technique cover the lowest acceptable metal-ceramic bond strength of 25 MPa specified in ISO 9693 and present satisfactory bond strength for clinical use. PMID:29503711
Karimi, Mohammad H; Asemani, Davud
2014-05-01
Ceramic and tile industries must include a grading stage to quantify the quality of products. In practice, human inspection is often used for grading purposes. An automatic grading system is essential to enhance the quality control and marketing of the products. Since there generally exist six different types of defects, originating from various stages of tile manufacturing lines, with distinct textures and morphologies, many image processing techniques have been proposed for defect detection. In this paper, a survey has been made of the pattern recognition and image processing algorithms which have been used to detect surface defects. Each method appears to be limited to detecting some subgroup of defects. The detection techniques may be divided into three main groups: statistical pattern recognition, feature vector extraction and texture/image classification. Methods such as the wavelet transform, filtering, morphology and the contourlet transform are more effective for pre-processing tasks. Others, including statistical methods, neural networks and model-based algorithms, can be applied to extract the surface defects. Although statistical methods are often appropriate for identification of large defects such as spots, techniques such as wavelet processing provide an acceptable response for detection of small defects such as pinholes. A thorough survey is made in this paper of the existing algorithms in each subgroup. Also, the evaluation parameters are discussed, including supervised and unsupervised parameters. Using various performance parameters, different defect detection algorithms are compared and evaluated. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
El Sharawy, Mohamed S.; Gaafar, Gamal R.
2016-12-01
Both reservoir engineers and petrophysicists have been concerned with dividing a reservoir into zones for engineering and petrophysical purposes. Over the decades, several techniques and approaches have been introduced. Among them, the statistical reservoir zonation, stratigraphic modified Lorenz (SML) plot, and principal component and clustering analysis techniques were chosen and applied to the Nubian sandstone reservoir of Palaeozoic - Lower Cretaceous age, Gulf of Suez, Egypt, using five adjacent wells. The studied reservoir consists mainly of sandstone with some intercalated shale layers whose thickness varies from one well to another. The permeability ranged from less than 1 md to more than 1000 md. The statistical reservoir zonation technique, based on core permeability, indicated that the cored interval of the studied reservoir can be divided into two zones. Using reservoir properties such as porosity, bulk density, acoustic impedance and interval transit time also indicated two zones, with an obvious variation in separation depth and zone continuity. The stratigraphic modified Lorenz (SML) plot indicated the presence of more than 9 flow units in the cored interval, as well as a high degree of microscopic heterogeneity. On the other hand, principal component and cluster analyses, based on well logging data (gamma ray, sonic, density and neutron), indicated that the whole reservoir can be divided into at least four electrofacies having a noticeable variation in reservoir quality, as correlated with the measured permeability. Furthermore, continuity or discontinuity of the reservoir zones can be determined using this analysis.
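The electrofacies step can be sketched as follows: standardize the four logs, reduce them with principal components, and cluster the samples. The logs below are synthetic, and the choice of two components and four clusters is an assumption made to mirror the four electrofacies mentioned above, not the authors' exact workflow.

```python
# PCA + k-means grouping of well-log samples into electrofacies (synthetic logs).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(6)
logs = np.column_stack([
    rng.normal(60, 25, 500),      # GR   (API)
    rng.normal(80, 10, 500),      # DT   (us/ft)
    rng.normal(2.45, 0.1, 500),   # RHOB (g/cc)
    rng.normal(0.18, 0.06, 500),  # NPHI (v/v)
])

pipe = make_pipeline(StandardScaler(), PCA(n_components=2),
                     KMeans(n_clusters=4, n_init=10, random_state=0))
electrofacies = pipe.fit_predict(logs)
print(np.bincount(electrofacies))   # samples assigned to each electrofacies
```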
Specialized data analysis of SSME and advanced propulsion system vibration measurements
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi
1993-01-01
The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.
Deformable Medical Image Registration: A Survey
Sotiras, Aristeidis; Davatzikos, Christos; Paragios, Nikos
2013-01-01
Deformable image registration is a fundamental task in medical image processing. Among its most important applications, one may cite: i) multi-modality fusion, where information acquired by different imaging devices or protocols is fused to facilitate diagnosis and treatment planning; ii) longitudinal studies, where temporal structural or anatomical changes are investigated; and iii) population modeling and statistical atlases used to study normal anatomical variability. In this paper, we attempt to give an overview of deformable registration methods, putting emphasis on the most recent advances in the domain. Additional emphasis has been given to techniques applied to medical images. In order to study image registration methods in depth, their main components are identified and studied independently. The most recent techniques are presented in a systematic fashion. The contribution of this paper is to provide an extensive account of registration techniques in a systematic manner. PMID:23739795
Sequential neural text compression.
Schmidhuber, J; Heil, S
1996-01-01
The purpose of this paper is to show that neural networks may be promising tools for data compression without loss of information. We combine predictive neural nets and statistical coding techniques to compress text files. We apply our methods to certain short newspaper articles and obtain compression ratios exceeding those of the widely used Lempel-Ziv algorithms (which form the basis of the UNIX utilities "compress" and "gzip"). The main disadvantage of our methods is that they are about three orders of magnitude slower than standard methods.
NASA Astrophysics Data System (ADS)
El Kanawati, W.; Létang, J. M.; Dauvergne, D.; Pinto, M.; Sarrut, D.; Testa, É.; Freud, N.
2015-10-01
A Monte Carlo (MC) variance reduction technique is developed for prompt-γ emitter calculations in proton therapy. Prompt-γ rays emitted through nuclear fragmentation reactions and exiting the patient during proton therapy could play an important role in helping to monitor the treatment. However, estimating the number and energy of emitted prompt-γ per primary proton with MC simulations is a slow process. In order to estimate the local distribution of prompt-γ emission in a volume of interest for a given proton beam of the treatment plan, a MC variance reduction technique based on a specific track length estimator (TLE) has been developed. First, an elemental database of prompt-γ emission spectra is established in the clinical energy range of incident protons for all elements in the composition of human tissues. This database of prompt-γ spectra is built offline with high statistics. In the implementation of the prompt-γ TLE MC tally, each proton deposits along its track the expectation of the prompt-γ spectra from the database according to the proton kinetic energy and the local material composition. A detailed statistical study shows that the relative efficiency mainly depends on the geometrical distribution of the track length. Benchmarking of the proposed prompt-γ TLE MC technique against an analogous MC technique is carried out. A large relative efficiency gain is reported, ca. 10^5.
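A conceptual sketch of the track-length-estimator scoring described above: each proton step contributes its step length multiplied by the expected prompt-γ spectrum looked up from a precomputed database indexed by material and proton energy bin. The database values, binning, and example track are placeholders; this is not the authors' actual implementation.

```python
# Track-length-estimator style scoring of expected prompt-gamma emission.
import numpy as np

n_Egamma = 50                                 # prompt-gamma energy bins
E_bins = np.linspace(0, 200, 21)              # proton kinetic energy bins (MeV)
# db[material][proton energy bin] -> expected gamma spectrum per unit path length
db = {"water": np.ones((20, n_Egamma)) * 1e-4,
      "bone":  np.ones((20, n_Egamma)) * 2e-4}

def score_track(steps, score):
    """steps: list of (material, proton_energy_MeV, step_length_cm)."""
    for material, E, length in steps:
        k = int(np.clip(np.searchsorted(E_bins, E) - 1, 0, 19))
        score += length * db[material][k]     # deposit the expected spectrum
    return score

score = np.zeros(n_Egamma)
track = [("water", 150.0, 0.2), ("bone", 120.0, 0.1), ("water", 80.0, 0.3)]
score = score_track(track, score)
print("expected prompt-gamma counts per primary, summed over energy:", score.sum())
```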
Left atrial appendage closure: a new technique for clinical practice.
John Camm, A; Colombo, Antonio; Corbucci, Giorgio; Padeletti, Luigi
2014-03-01
Atrial fibrillation (AF) is the most common sustained cardiac arrhythmia. It is associated with increased risk for stroke mainly due to cardiac embolism from the left atrial appendage (LAA). Occlusion of the LAA by means of a device represents a valid alternative to oral anticoagulation, mainly in patients who cannot tolerate this therapy because of a high bleeding risk. Recent data on the endocardial device WATCHMAN show encouraging results for this patient population in terms of stroke risk reduction compared to the expected rate as well as in terms of implant success. This article reviews all relevant publications related to the main surgical and transcatheter devices used for LAA closure (LAAC). PROTECT-AF, the first prospective randomized trial conducted on this technique, showed that LAA occlusion using the WATCHMAN was noninferior to warfarin for a combined end-point in patients with nonvalvular AF. There is a lack of large-scale randomized trials on long-term stroke risk in patients submitted to LAAC. Most studies are relatively small and focus on the comparison of different surgical techniques with regard to complete/incomplete closure success. More recently, PROTECT-AF long-term results (4-year follow-up) demonstrated that LAAC was statistically superior to warfarin in terms of efficacy. This review concludes that it is now appropriate to consider these techniques for patients with AF who are at high risk for stroke for whom effective conventional or novel anticoagulant therapy is not available or who present problems in managing drug treatment. Copyright © 2014 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
Rodríguez-Arias, Miquel Angel; Rodó, Xavier
2004-03-01
Here we describe a practical, step-by-step primer to scale-dependent correlation (SDC) analysis. The analysis of transitory processes is an important but often neglected topic in ecological studies, because only a few statistical techniques appear to detect temporary features accurately enough. We introduce here the SDC analysis, a statistical and graphical method to study transitory processes at any temporal or spatial scale. SDC analysis, thanks to the combination of conventional procedures and simple, well-known statistical techniques, becomes an improved time-domain analogue of wavelet analysis. We use several simple synthetic series to describe the method, a more complex example, full of transitory features, to compare SDC and wavelet analysis, and finally we analyze some selected ecological series to illustrate the methodology. The SDC analysis of time series of copepod abundances in the North Sea indicates that ENSO is the main climatic driver of short-term changes in population dynamics. SDC also uncovers some long-term, unexpected features in the population. Similarly, the SDC analysis of Nicholson's blowflies data locates where the proposed models fail and provides new insights about the mechanism that drives the apparent vanishing of the population cycle during the second half of the series.
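A simplified sketch of the core idea, assuming the essence of SDC is the correlation of all pairs of equally sized fragments of two series at a chosen scale: the code below builds that fragment-by-fragment correlation matrix and reports the strongest local match. The published method adds significance testing and two-way graphical displays that are omitted here.

```python
# Fragment-by-fragment (scale-dependent) correlation of two series at scale s.
import numpy as np

def sdc_matrix(x, y, s):
    n = len(x) - s + 1
    r = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            r[i, j] = np.corrcoef(x[i:i + s], y[j:j + s])[0, 1]
    return r   # r[i, j]: correlation of x-fragment at i with y-fragment at j

rng = np.random.default_rng(7)
t = np.arange(200)
x = np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.3, t.size)
y = np.roll(x, 5) + rng.normal(0, 0.3, t.size)       # y lags x by 5 steps

r = sdc_matrix(x, y, s=24)
i, j = np.unravel_index(np.argmax(r), r.shape)
print(f"strongest local correlation r={r[i, j]:.2f} at lag {j - i}")
```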
Understanding photon sideband statistics and correlation for determining phonon coherence
NASA Astrophysics Data System (ADS)
Ding, Ding; Yin, Xiaobo; Li, Baowen
2018-01-01
Generating and detecting coherent high-frequency heat-carrying phonons have been topics of great interest in recent years. Although there have been successful attempts in generating and observing coherent phonons, rigorous techniques to characterize and detect phonon coherence in a crystalline material have been lagging compared to what has been achieved for photons. One main challenge is a lack of detailed understanding of how detection signals for phonons can be related to coherence. The quantum theory of photoelectric detection has greatly advanced the ability to characterize photon coherence in the past century, and a similar theory for phonon detection is necessary. Here, we reexamine the optical sideband fluorescence technique that has been used to detect high-frequency phonons in materials with optically active defects. We propose a quantum theory of phonon detection using the sideband technique and found that there are distinct differences in sideband counting statistics between thermal and coherent phonons. We further propose a second-order correlation function unique to sideband signals that allows for a rigorous distinction between thermal and coherent phonons. Our theory is relevant to a correlation measurement with nontrivial response functions at the quantum level and can potentially bridge the gap of experimentally determining phonon coherence to be on par with that of photons.
NASA Astrophysics Data System (ADS)
Svirina, Anna; Shindor, Olga; Tatmyshevsky, Konstantin
2014-12-01
The paper deals with the main problems of Russian energy system development that make it necessary to provide educational programs in the field of renewable and alternative energy. The paper describes the process of curriculum development and the definition of teaching techniques on the basis of expert opinion evaluation, and suggests a competence model for master's students in renewable and alternative energy processing. On the basis of a distributed questionnaire and in-depth interviews, data for statistical analysis were obtained. On the basis of these data, an optimization of the curriculum structure was performed, and three structural models for optimizing teaching techniques were developed. The suggested educational program structure, which was adopted by employers, is presented in the paper. The findings include a quantitative estimate of the importance of systemic thinking and professional skills and knowledge as basic competences of a master's program graduate; a statistically estimated necessity of a practice-based learning approach; and optimization models for structuring curricula in renewable and alternative energy processing. These findings allow the establishment of a platform for the development of educational programs.
Analysis of Acoustic Emission Parameters from Corrosion of AST Bottom Plate in Field Testing
NASA Astrophysics Data System (ADS)
Jomdecha, C.; Jirarungsatian, C.; Suwansin, W.
Field testing of aboveground storage tanks (AST) to monitor corrosion of the bottom plate is presented in this chapter. Acoustic emission (AE) testing data from ten ASTs of different sizes, materials, and stored products were employed to monitor the bottom plate condition. AE sensors of 30 and 150 kHz were used to monitor the corrosion activity over up to 24 channels, including guard sensors. AE parameters were analyzed to explore the parameter patterns of occurring corrosion compared to laboratory results. Amplitude, count, duration, and energy were the main parameters of the analysis. A pattern recognition technique combined with statistical analysis was implemented to eliminate electrical and environmental noise. The results showed specific AE patterns of corrosion activities related to the empirical results. In addition, a planar location algorithm was utilized to locate the significant AE events from corrosion. Both the parameter patterns and the AE event locations can be used to interpret and locate the corrosion activities. Finally, a basic statistical grading technique was used to evaluate the bottom plate condition of the ASTs.
Martian Chronology: Goals for Investigations from a Recent Multidisciplinary Workshop
NASA Technical Reports Server (NTRS)
Nyquist, L.; Doran, P. T.; Cerling, T. E.; Clifford, S. M.; Forman, S. L.; Papanastassiou, D. A.; Stewart, B. W.; Sturchio, N. C.; Swindle, T. D.
2000-01-01
The absolute chronology of Martian rocks and events is based mainly on crater statistics and remains highly uncertain. Martian chronology will be critical to building a time scale comparable to Earth's to address questions about the early evolution of the planets and their ecosystems. In order to address issues and strategies specific to Martian chronology, a workshop was held, 4-7 June 2000, with invited participants from the planetary, geochronology, geochemistry, and astrobiology communities. The workshop focused on identifying: a) key scientific questions of Martian chronology; b) chronological techniques applicable to Mars; c) unique processes on Mars that could be exploited to obtain rates, fluxes, ages; and d) sampling issues for these techniques. This is an overview of the workshop findings and recommendations.
A BAYESIAN APPROACH TO DERIVING AGES OF INDIVIDUAL FIELD WHITE DWARFS
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Malley, Erin M.; Von Hippel, Ted; Van Dyk, David A., E-mail: ted.vonhippel@erau.edu, E-mail: dvandyke@imperial.ac.uk
2013-09-20
We apply a self-consistent and robust Bayesian statistical approach to determine the ages, distances, and zero-age main sequence (ZAMS) masses of 28 field DA white dwarfs (WDs) with ages of approximately 4-8 Gyr. Our technique requires only quality optical and near-infrared photometry to derive ages with <15% uncertainties, generally with little sensitivity to our choice of modern initial-final mass relation. We find that age, distance, and ZAMS mass are correlated in a manner that is too complex to be captured by traditional error propagation techniques. We further find that the posterior distributions of age are often asymmetric, indicating that the standard approach to deriving WD ages can yield misleading results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre
Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and χ² independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics (which we discussed in [1]) where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
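For reference, here is a small single-node sketch of the derived statistics listed above, computed from one contingency table with NumPy and SciPy; in the distributed setting described in the paper, per-processor tables would simply be summed element-wise before these quantities are evaluated. The counts are made up.

```python
# Derived statistics from a single contingency table of observed counts.
import numpy as np
from scipy.stats import chi2_contingency

counts = np.array([[30, 10, 5],
                   [ 8, 25, 12]])                  # observed co-occurrence counts

joint = counts / counts.sum()                      # joint probability p(x, y)
px = joint.sum(axis=1, keepdims=True)              # marginal p(x)
py = joint.sum(axis=0, keepdims=True)              # marginal p(y)

with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.where(joint > 0, np.log2(joint / (px * py)), 0.0)   # point-wise MI
entropy = -(joint[joint > 0] * np.log2(joint[joint > 0])).sum()  # joint entropy
chi2, p_value, dof, _ = chi2_contingency(counts)                 # independence test

print("PMI:\n", np.round(pmi, 3))
print(f"joint entropy = {entropy:.3f} bits, chi2 = {chi2:.2f}, p = {p_value:.4f}")
```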
Review of chart recognition in document images
NASA Astrophysics Data System (ADS)
Liu, Yan; Lu, Xiaoqing; Qin, Yeyang; Tang, Zhi; Xu, Jianbo
2013-01-01
As an effective way of transmitting information, charts are widely used to represent scientific statistics in books, research papers, newspapers, etc. Though textual information is still the major source of data, there has been an increasing trend of introducing graphs, pictures, and figures into the information pool. Text recognition for documents has been accomplished using optical character recognition (OCR) software. Chart recognition techniques, a necessary supplement to OCR for document images, remain an unsolved problem due to the great subjectiveness and variety of chart styles. This paper reviews the development of chart recognition techniques over the past decades and presents the focus of current research. The whole process of chart recognition is presented systematically and mainly includes three parts: chart segmentation, chart classification, and chart interpretation. In each part, the latest research work is introduced. Finally, the paper concludes with a summary and promising directions for future research.
Bayesian component separation: The Planck experience
NASA Astrophysics Data System (ADS)
Wehus, Ingunn Kathrine; Eriksen, Hans Kristian
2018-05-01
Bayesian component separation techniques have played a central role in the data reduction process of Planck. The most important strength of this approach is its global nature, in which a parametric and physical model is fitted to the data. Such physical modeling allows the user to constrain very general data models, and jointly probe cosmological, astrophysical and instrumental parameters. This approach also supports statistically robust goodness-of-fit tests in terms of data-minus-model residual maps, which are essential for identifying residual systematic effects in the data. The main challenges are high code complexity and computational cost. Whether or not these costs are justified for a given experiment depends on its final uncertainty budget. We therefore predict that the importance of Bayesian component separation techniques is likely to increase with time for intensity mapping experiments, similar to what has happened in the CMB field, as observational techniques mature, and their overall sensitivity improves.
Noninvasive fetal QRS detection using an echo state network and dynamic programming.
Lukoševičius, Mantas; Marozas, Vaidotas
2014-08-01
We address a classical fetal QRS detection problem from abdominal ECG recordings with a data-driven statistical machine learning approach. Our goal is to have a powerful, yet conceptually clean, solution. There are two novel key components at the heart of our approach: an echo state recurrent neural network that is trained to indicate fetal QRS complexes, and several increasingly sophisticated versions of statistics-based dynamic programming algorithms, which are derived from and rooted in probability theory. We also employ a standard technique for preprocessing and removing maternal ECG complexes from the signals, but do not take this as the main focus of this work. The proposed approach is quite generic and can be extended to other types of signals and annotations. Open-source code is provided.
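The echo state network component can be sketched in a few lines, assuming the usual leaky-integrator reservoir with a ridge-regression readout trained against a binary QRS annotation; the reservoir size, leak rate, regularization, input signal, and target below are all illustrative placeholders rather than the authors' configuration.

```python
# Minimal echo state network: fixed random reservoir + ridge-regression readout.
import numpy as np

rng = np.random.default_rng(8)
n_res, leak, ridge = 200, 0.3, 1e-6

Win = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))      # spectral radius < 1

def run_reservoir(u):
    x, states = np.zeros(n_res), []
    for u_t in u:
        x = (1 - leak) * x + leak * np.tanh(Win[:, 0] * u_t + W @ x)
        states.append(x.copy())
    return np.array(states)

# Synthetic training signal and target annotation (1 at QRS locations)
u = rng.normal(0, 1, 2000)
target = np.zeros(2000); target[::250] = 1.0

X = run_reservoir(u)
Wout = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ target)
indicator = run_reservoir(u) @ Wout                  # QRS-likelihood trace
print("peak indicator value:", indicator.max())
```

In the full approach described above, the peaks of such an indicator trace would then be passed to the dynamic-programming stage to enforce physiologically plausible beat-to-beat intervals.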
Astolfi, Laura; Vecchiato, Giovanni; De Vico Fallani, Fabrizio; Salinari, Serenella; Cincotti, Febo; Aloise, Fabio; Mattia, Donatella; Marciani, Maria Grazia; Bianchi, Luigi; Soranzo, Ramon; Babiloni, Fabio
2009-01-01
We estimate cortical activity in normal subjects during the observation of TV commercials inserted within a movie by using high-resolution EEG techniques. The brain activity was evaluated in both the time and frequency domains by solving the associated inverse problem of EEG with the use of realistic head models. In particular, we recover statistically significant information about the cortical areas engaged by particular scenes inserted within the proposed TV commercial, with respect to the brain activity estimated while watching a documentary. Results obtained in the population investigated suggest that the statistically significant brain activity during the observation of the TV commercial was mainly concentrated in frontoparietal cortical areas, roughly coincident with Brodmann areas 8, 9, and 7. PMID:19584910
Processes and subdivisions in diogenites, a multivariate statistical analysis
NASA Technical Reports Server (NTRS)
Harriott, T. A.; Hewins, R. H.
1984-01-01
Multivariate statistical techniques used on diogenite orthopyroxene analyses show the relationships that occur within diogenites and the two orthopyroxenite components (class I and II) in the polymict diogenite Garland. Cluster analysis shows that only Peckelsheim is similar to Garland class I (Fe-rich) and the other diogenites resemble Garland class II. The unique diogenite Y 75032 may be related to type I by fractionation. Factor analysis confirms the subdivision and shows that Fe does not correlate with the weakly incompatible elements across the entire pyroxene composition range, indicating that igneous fractionation is not the process controlling total diogenite composition variation. The occurrence of two groups of diogenites is interpreted as the result of sampling or mixing of two main sequences of orthopyroxene cumulates with slightly different compositions.
Bibliometric indexes, databases and impact factors in cardiology
Bienert, Igor R C; de Oliveira, Rogério Carvalho; de Andrade, Pedro Beraldo; Caramori, Carlos Antonio
2015-01-01
Bibliometry is a quantitative statistical technique to measure levels of production and dissemination of knowledge, as well as a useful tool to track the development of a scientific area. The valuation of production required for the recognition of researchers and journals is accomplished through tools called bibliometric indexes, divided into quality indicators and indicators of scientific impact. Initially developed as statistical measures for monographs, especially in libraries, bibliometrics is today mainly used to evaluate the productivity of authors and the repercussion of citations. However, these tools have limitations and sometimes provoke controversies about indiscriminate application, which has led to the development of newer indexes. It is important for researchers to know the most common indexes and to use them properly, while acknowledging their limitations, since they have a direct impact on daily practice, reputation and the securing of funds. PMID:26107458
Tholkappian, M; Ravisankar, R; Chandrasekaran, A; Jebakumar, J Prince Prakash; Kanagasabapathy, K V; Prasad, M V R; Satapathy, K K
2018-01-01
The concentration of some heavy metals: Al, Ca, K, Fe, Ti, Mg, Mn, V, Cr, Zn, Ni and Co in sediments from Pulicat Lake to Vadanemmeli along the Chennai Coast, Tamil Nadu, has been determined using the EDXRF technique. The mean concentrations of Mg, Al, K, Ca, Ti, Fe, V, Cr, Mn, Co, Ni, and Zn were found to be 1918, 25436, 9832, 9859, 2109, 8209, 41.58, 34.14, 160.80, 2.85, 18.79 and 29.12 mg kg-1, respectively. These mean concentrations do not exceed the world crustal average. The level of pollution attributed to heavy metals was evaluated using several pollution indicators in order to determine anthropogenically derived contamination. The Enrichment Factor (EF), Geoaccumulation Index (Igeo), Contamination Factor (CF) and Pollution Load Index (PLI) were used in evaluating the contamination status of the sediments. Enrichment Factors reveal anthropogenic sources of V, Cr, Ni and Zn. Geoaccumulation Index results reveal that the study area is not contaminated by the heavy metals, and similar results were obtained using the Pollution Load Index. The pollution indices indicate that most of the locations were not polluted by heavy metals. Multivariate statistical analysis using principal component and clustering techniques was performed to identify the sources of the heavy metals; the results indicate that heavy metals in the sediments are mainly of natural origin. This study provides a relatively novel technique for identifying and mapping the distribution of metal pollutants and their sources in sediment.
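The pollution indicators named in this abstract (EF, Igeo, CF and PLI) follow standard textbook formulas. The sketch below computes them for a single hypothetical sediment sample; the background values and the Fe normalisation are assumptions for illustration, not the study's inputs.

```python
# Hedged numerical sketch of the standard pollution-index formulas.
import numpy as np

def enrichment_factor(c_metal, c_ref, bg_metal, bg_ref):
    """EF = (C_metal / C_ref)_sample / (C_metal / C_ref)_background."""
    return (c_metal / c_ref) / (bg_metal / bg_ref)

def geoaccumulation_index(c_metal, bg_metal):
    """Igeo = log2(C / (1.5 * B)); the 1.5 factor absorbs lithogenic variability."""
    return np.log2(c_metal / (1.5 * bg_metal))

def contamination_factor(c_metal, bg_metal):
    return c_metal / bg_metal

def pollution_load_index(cf_values):
    """PLI = n-th root of the product of the contamination factors."""
    cf = np.asarray(cf_values, dtype=float)
    return cf.prod() ** (1.0 / cf.size)

# Example with made-up concentrations (mg/kg) for one sediment sample.
sample = {"Zn": 29.1, "Cr": 34.1, "Fe": 8209.0}
background = {"Zn": 95.0, "Cr": 100.0, "Fe": 47200.0}   # assumed background values

ef_zn = enrichment_factor(sample["Zn"], sample["Fe"], background["Zn"], background["Fe"])
igeo_zn = geoaccumulation_index(sample["Zn"], background["Zn"])
cf = [contamination_factor(sample[m], background[m]) for m in ("Zn", "Cr")]
print(ef_zn, igeo_zn, pollution_load_index(cf))
```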
NASA Technical Reports Server (NTRS)
Lam, N.; Qiu, H.-I.; Quattrochi, Dale A.; Zhao, Wei
1997-01-01
With the rapid increase in spatial data, especially in the NASA-EOS (Earth Observing System) era, it is necessary to develop efficient and innovative tools to handle and analyze these data so that environmental conditions can be assessed and monitored. A main difficulty facing geographers and environmental scientists in environmental assessment and measurement is that spatial analytical tools are not easily accessible. We have recently developed a remote sensing/GIS software module called Image Characterization and Modeling System (ICAMS) to provide specialized spatial analytical tools for the measurement and characterization of satellite and other forms of spatial data. ICAMS runs on both the Intergraph-MGE and Arc/info UNIX and Windows-NT platforms. The main techniques in ICAMS include fractal measurement methods, variogram analysis, spatial autocorrelation statistics, textural measures, aggregation techniques, normalized difference vegetation index (NDVI), and delineation of land/water and vegetated/non-vegetated boundaries. In this paper, we demonstrate the main applications of ICAMS on the Intergraph-MGE platform using Landsat Thematic Mapper images from the city of Lake Charles, Louisiana. While the utilities of ICAMS' spatial measurement methods (e.g., fractal indices) in assessing environmental conditions remain to be researched, making the software available to a wider scientific community can permit the techniques in ICAMS to be evaluated and used for a diversity of applications. The findings from these various studies should lead to improved algorithms and more reliable models for environmental assessment and monitoring.
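Among the ICAMS techniques listed, NDVI is simple enough to sketch directly. The snippet below is a generic NumPy illustration for a Landsat TM scene (band 3 red, band 4 near-infrared); the arrays and threshold are placeholders, not the ICAMS code.

```python
# Generic NDVI sketch; array names and the vegetation threshold are illustrative.
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """NDVI = (NIR - Red) / (NIR + Red), guarded against a vanishing denominator."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

red_band = np.random.randint(0, 255, size=(512, 512))   # placeholder TM band 3
nir_band = np.random.randint(0, 255, size=(512, 512))   # placeholder TM band 4
vi = ndvi(nir_band, red_band)
vegetated = vi > 0.3    # simple vegetated/non-vegetated boundary; threshold is illustrative
```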
Confocal Imaging of porous media
NASA Astrophysics Data System (ADS)
Shah, S.; Crawshaw, D.; Boek, D.
2012-12-01
Carbonate rocks, which hold approximately 50% of the world's oil and gas reserves, have a very complicated and heterogeneous structure in comparison with sandstone reservoir rock. We present advances with different techniques to image, reconstruct, and statistically characterize the micro-geometry of carbonate pores. The main goal here is to develop a technique to obtain two-dimensional and three-dimensional images using Confocal Laser Scanning Microscopy (CLSM). CLSM is used in epi-fluorescent imaging mode, allowing very high optical resolution of features well below 1 μm in size. Images of pore structures were captured with CLSM after the pore spaces in the carbonate samples were impregnated with a fluorescent, dyed epoxy-resin and scanned in the x-y plane by a laser probe. We discuss in detail the sample preparation required to obtain sub-micron resolution images of heterogeneous carbonate rocks. We also discuss the technical and practical aspects of this imaging technique, including its advantages and limitations. We present several examples of the application, including studying pore geometry in carbonates and characterizing sub-resolution porosity in two-dimensional images. We then describe approaches to extract statistical information about porosity using image processing and a spatial correlation function. With the current capabilities and limitations of the CLSM technique we have managed to obtain only limited depth information along the z-axis (~50 μm) for building three-dimensional images of carbonate rocks. Hence, we plan a novel technique to obtain greater depth information and thereby high-quality three-dimensional images, with sub-micron resolution possible in the lateral and axial planes.
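A common way to extract the spatial correlation information mentioned above is the two-point correlation function of a segmented (binary) pore image. The FFT-based sketch below is a generic illustration under the assumption of a periodic image, not the authors' exact pipeline.

```python
# Sketch of a two-point (spatial) correlation estimate on a binary pore image.
import numpy as np

def two_point_correlation(binary_image):
    """Return porosity and the normalized autocorrelation of a 2D pore-phase image."""
    phi = binary_image.mean()                        # porosity (pore fraction)
    f = binary_image - phi                           # remove the mean
    F = np.fft.fft2(f)
    corr = np.real(np.fft.ifft2(F * np.conj(F))) / f.size   # circular autocorrelation
    return phi, np.fft.fftshift(corr) / corr.max()

pores = (np.random.random((256, 256)) < 0.2).astype(float)  # placeholder segmented CLSM image
porosity, s2 = two_point_correlation(pores)
print("porosity:", porosity)
```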
Survey of statistical techniques used in validation studies of air pollution prediction models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bornstein, R D; Anderson, S F
1979-03-01
Statistical techniques used by meteorologists to validate predictions made by air pollution models are surveyed. Techniques are divided into the following three groups: graphical, tabular, and summary statistics. Some of the practical problems associated with verification are also discussed. Characteristics desired in any validation program are listed and a suggested combination of techniques that possesses many of these characteristics is presented.
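A typical set of summary statistics used in such validation work can be sketched as follows; the metric list (bias, RMSE, correlation, factor-of-two fraction) is a common choice and is not claimed to be the exact set surveyed in the report.

```python
# Hedged sketch of summary statistics for comparing model predictions to observations.
import numpy as np

def validation_summary(observed, predicted):
    o = np.asarray(observed, dtype=float)
    p = np.asarray(predicted, dtype=float)
    bias = np.mean(p - o)                                  # mean bias
    rmse = np.sqrt(np.mean((p - o) ** 2))                  # root-mean-square error
    r = np.corrcoef(o, p)[0, 1]                            # linear correlation
    fac2 = np.mean((p / o > 0.5) & (p / o < 2.0))          # fraction within a factor of 2
    return {"bias": bias, "rmse": rmse, "r": r, "fac2": fac2}

obs = np.array([12.0, 30.0, 55.0, 80.0, 110.0])            # placeholder concentrations
pred = np.array([15.0, 25.0, 60.0, 70.0, 130.0])
print(validation_summary(obs, pred))
```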
Chadeau-Hyam, Marc; Campanella, Gianluca; Jombart, Thibaut; Bottolo, Leonardo; Portengen, Lutzen; Vineis, Paolo; Liquet, Benoit; Vermeulen, Roel C H
2013-08-01
Recent technological advances in molecular biology have given rise to numerous large-scale datasets whose analysis imposes serious methodological challenges mainly relating to the size and complex structure of the data. Considerable experience in analyzing such data has been gained over the past decade, mainly in genetics, from the Genome-Wide Association Study era, and more recently in transcriptomics and metabolomics. Building upon the corresponding literature, we provide here a nontechnical overview of well-established methods used to analyze OMICS data within three main types of regression-based approaches: univariate models including multiple testing correction strategies, dimension reduction techniques, and variable selection models. Our methodological description focuses on methods for which ready-to-use implementations are available. We describe the main underlying assumptions, the main features, and advantages and limitations of each of the models. This descriptive summary constitutes a useful tool for driving methodological choices while analyzing OMICS data, especially in environmental epidemiology, where the emergence of the exposome concept clearly calls for unified methods to analyze marginally and jointly complex exposure and OMICS datasets. Copyright © 2013 Wiley Periodicals, Inc.
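As a concrete example of the univariate family with multiple testing correction, the sketch below runs feature-wise t-tests on a simulated OMICS matrix and applies the Benjamini-Hochberg false discovery rate procedure; the data and threshold are illustrative only.

```python
# Univariate tests per feature plus Benjamini-Hochberg FDR correction (illustrative data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 500))        # 100 samples x 500 omics features
y = rng.integers(0, 2, size=100)           # binary phenotype

# Two-sample t-test for each feature.
pvals = np.array([stats.ttest_ind(X[y == 0, j], X[y == 1, j]).pvalue
                  for j in range(X.shape[1])])

def benjamini_hochberg(p, alpha=0.05):
    """Return a boolean mask of features rejected at the given FDR level."""
    m = p.size
    order = np.argsort(p)
    thresholds = alpha * (np.arange(1, m + 1) / m)
    passed = p[order] <= thresholds
    reject = np.zeros(m, dtype=bool)
    if passed.any():
        k = np.nonzero(passed)[0].max()    # largest index satisfying the BH condition
        reject[order[:k + 1]] = True
    return reject

significant = benjamini_hochberg(pvals)
print(significant.sum(), "features pass the FDR threshold")
```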
Sousa, F S; Hummel, A D; Maciel, R F; Cohrs, F M; Falcão, A E J; Teixeira, F; Baptista, R; Mancini, F; da Costa, T M; Alves, D; Pisa, I T
2011-05-01
The replacement of defective organs with healthy ones is an old problem, but only in recent years has this idea been put into practice. Improvements in the whole transplantation process have become increasingly important in clinical practice. In this context are clinical decision support systems (CDSSs), which reflect a significant amount of work using mathematical and intelligent techniques. The aim of this article was to review the intelligent techniques used in recent years (2009 and 2010) to analyze organ transplant databases. To this end, we searched the PubMed and Institute for Scientific Information (ISI) Web of Knowledge databases for articles published in 2009 and 2010 on intelligent techniques applied to transplantation databases. Among 69 retrieved articles, a subset was selected according to inclusion and exclusion criteria. The main techniques were Artificial Neural Networks (ANN), Logistic Regression (LR), Decision Trees (DT), Markov Models (MM), and Bayesian Networks (BN). Most articles used ANN. Some publications described comparisons between techniques or the use of several techniques together. The use of intelligent techniques to extract knowledge from healthcare databases is increasingly common. Although authors preferred to use ANN, statistical techniques were equally effective for this enterprise. Copyright © 2011 Elsevier Inc. All rights reserved.
Bruni, Aline Thaís; Velho, Jesus Antonio; Ferreira, Arthur Serra Lopes; Tasso, Maria Júlia; Ferrari, Raíssa Santos; Yoshida, Ricardo Luís; Dias, Marcos Salvador; Leite, Vitor Barbanti Pereira
2014-08-01
This study uses statistical techniques to evaluate reports on suicide scenes; it utilizes 80 reports from different locations in Brazil, randomly collected from both federal and state jurisdictions. We aimed to assess a heterogeneous group of cases in order to obtain an overall perspective of the problem. We evaluated variables regarding the characteristics of the crime scene, such as the detected traces (blood, instruments and clothes) that were found, and we addressed the methodology employed by the experts. A qualitative approach using basic statistics revealed a wide distribution in how the issue was addressed in the documents. We then examined a quantitative approach involving an empirical equation and used multivariate procedures to validate the proposed quantitative methodology. The methodology successfully identified the main differences in the information presented in the reports, showing that there is no standardized method of analyzing the evidence. Copyright © 2014 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
Utilizing the N beam position monitor method for turn-by-turn optics measurements
NASA Astrophysics Data System (ADS)
Langner, A.; Benedetti, G.; Carlà, M.; Iriso, U.; Martí, Z.; de Portugal, J. Coello; Tomás, R.
2016-09-01
The N beam position monitor method (N-BPM), which was recently developed for the LHC, has significantly improved the precision of optics measurements that are based on BPM turn-by-turn data. The main improvement is due to the consideration of correlations for statistical and systematic error sources, as well as to increasing the number of BPM combinations used to derive the β-function at one location. We present how this technique can be applied at light sources like ALBA, and compare the results with other methods.
Mor, Suman; Singh, Surender; Yadav, Poonam; Rani, Versha; Rani, Pushpa; Sheoran, Monika; Singh, Gurmeet; Ravindra, Khaiwal
2009-12-01
Various physico-chemical parameters, including fluoride (F(-)), were analyzed to understand the hydro-geochemistry of an aquifer in a semi-arid region of India. Furthermore, the quality of the shallow and deep aquifer (using tube well and hand pumps) was also investigated for their best ecological use including drinking, domestic, agricultural and other activities. Different multivariate techniques were applied to understand the groundwater chemistry of the aquifer. Findings of the correlation matrix were strengthened by the factor analysis, and this shows that salinity is mainly caused by magnesium salts as compared to calcium salts in the aquifer. The problem of salinization seems mainly compounded by the contamination of the shallow aquifers by the recharging water. High factor loading of total alkalinity and bicarbonates indicates that total alkalinity was mainly due to carbonates and bicarbonates of sodium. The concentration of F(-) was found more in the deep aquifer than the shallow aquifer. Further, only a few groundwater samples lie below the permissible limit of F(-), and this indicates a risk of dental caries in the populace of the study area. The present study indicates that regular monitoring of groundwater is an important step to avoid human health risks and to assess its quality for various ecological purposes.
Szabo, J.K.; Fedriani, E.M.; Segovia-Gonzalez, M. M.; Astheimer, L.B.; Hooper, M.J.
2010-01-01
This paper introduces a new technique in ecology to analyze spatial and temporal variability in environmental variables. By using simple statistics, we explore the relations between abiotic and biotic variables that influence animal distributions. However, spatial and temporal variability in rainfall, a key variable in ecological studies, can cause difficulties for any basic model that includes time evolution. The study was at a landscape scale (three million square kilometers in eastern Australia), mainly over the period 1998-2004. We simultaneously considered qualitative spatial (soil and habitat types) and quantitative temporal (rainfall) variables in a Geographical Information System environment. In addition to some techniques commonly used in ecology, we applied a new method, Functional Principal Component Analysis, which proved to be very suitable for this case, as it explained more than 97% of the total variance of the rainfall data, providing us with substitute variables that are easier to manage and are even able to explain rainfall patterns. The main variable came from a habitat classification that showed strong correlations with rainfall values and soil types. © 2010 World Scientific Publishing Company.
NASA Astrophysics Data System (ADS)
Ye, M.; Pacheco Castro, R. B.; Pacheco Avila, J.; Cabrera Sansores, A.
2014-12-01
The karstic aquifer of Yucatan is a vulnerable and complex system. The first fifteen meters of this aquifer have been polluted; protection of this resource is therefore important because it is the only source of potable water for the entire state. Through the assessment of groundwater quality we can gain knowledge about the main processes governing water chemistry, as well as about spatial patterns that are important for establishing protection zones. In this work multivariate statistical techniques are used to assess the groundwater quality of the supply wells (30 to 40 meters deep) in the hydrogeologic region of the Ring of Cenotes, located in Yucatan, Mexico. Cluster analysis and principal component analysis are applied to groundwater chemistry data of the study area. Results of the principal component analysis show that the main sources of variation in the data are due to sea water intrusion, the interaction of the water with the carbonate rocks of the system, and some pollution processes. The cluster analysis shows that the data can be divided into four clusters. The spatial distribution of the clusters appears random, but is consistent with sea water intrusion and pollution with nitrates. The overall results show that multivariate statistical analysis can be successfully applied in the groundwater quality assessment of this karstic aquifer.
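The two multivariate steps described above can be sketched generically as standardization followed by principal component analysis and k-means clustering; the simulated ion table, transformations and number of clusters below are assumptions, not the study's data or settings.

```python
# Generic PCA + cluster analysis sketch for a hydrochemistry table (simulated data).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
# Placeholder matrix: wells x major ions (e.g. Na, K, Ca, Mg, Cl, HCO3, SO4, NO3).
ions = rng.lognormal(mean=3.0, sigma=0.5, size=(40, 8))

Z = StandardScaler().fit_transform(np.log(ions))       # log-transform and standardize
pca = PCA(n_components=3).fit(Z)
scores = pca.transform(Z)
print("explained variance ratios:", pca.explained_variance_ratio_)

clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
print("wells per cluster:", np.bincount(clusters))
```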
System of Indicators in the Innovation Management: Business Intelligence Applied to Tourism
NASA Astrophysics Data System (ADS)
Lozada, Dayana; Araque, Francisco; Castillo, Jose Manuel; Salguero, Alberto; Delgado, Cecilia; Noda, Marcia; Hernández, Gilberto
The work presents an approach to studying mechanisms that allow managers to measure Innovation Management (IM). The main motivation is the analysis of patterns for the design of an integral system of indicators. A methodology that integrates the thought process, focusing on Business Intelligence and the Balanced Scorecard, is presented. A group of indexes based on the multidimensionality of IM in organizations of the tourism sector is proposed. To approach this quality it is necessary to contextualize, in the conditions of sectoral operation, the theories, models and systems used in our approach. Intervention methods have been used, such as expert criteria, consensus-seeking techniques by means of surveys, consultation of documents, and statistical methods such as principal component analysis.
Some aspects of robotics calibration, design and control
NASA Technical Reports Server (NTRS)
Tawfik, Hazem
1990-01-01
The main objective is to introduce techniques in the areas of testing and calibration, design, and control of robotic systems. A statistical technique is described that analyzes a robot's performance and provides quantitative three-dimensional evaluation of its repeatability, accuracy, and linearity. Based on this analysis, a corrective action should be taken to compensate for any existing errors and enhance the robot's overall accuracy and performance. A comparison between robotics simulation software packages that were commercially available (SILMA, IGRIP) and that of Kennedy Space Center (ROBSIM) is also included. These computer codes simulate the kinematics and dynamics patterns of various robot arm geometries to help the design engineer in sizing and building the robot manipulator and control system. A brief discussion on an adaptive control algorithm is provided.
Performance Analysis of Garbage Collection and Dynamic Reordering in a Lisp System. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Llames, Rene Lim
1991-01-01
Generation based garbage collection and dynamic reordering of objects are two techniques for improving the efficiency of memory management in Lisp and similar dynamic language systems. An analysis of the effect of generation configuration is presented, focusing on the effect of the number of generations and generation capacities. Analytic timing and survival models are used to represent garbage collection runtime and to derive structural results on its behavior. The survival model provides bounds on the age of objects surviving a garbage collection at a particular level. Empirical results show that execution time is most sensitive to the capacity of the youngest generation. A technique called scanning for transport statistics, for evaluating the effectiveness of reordering independent of main memory size, is presented.
Ahmad, Sheikh Saeed; Aziz, Neelam; Butt, Amna; Shabbir, Rabia; Erum, Summra
2015-09-01
One of the features of medical geography that has made it so useful in health research is statistical spatial analysis, which enables the quantification and qualification of health events. The main objective of this research was to study the spatial distribution patterns of malaria in Rawalpindi district using spatial statistical techniques to identify hot spots and possible risk factors. Spatial statistical analyses were done in ArcGIS, and satellite images for land use classification were processed in ERDAS Imagine. Four hundred and fifty water samples were also collected from the study area to identify the presence or absence of any microbial contamination. The results of this study indicated that malaria incidence varied according to geographical location and eco-climatic conditions, showing significant positive spatial autocorrelation. Hotspots, or locations of clusters, were identified using the Getis-Ord Gi* statistic. Significant clustering of malaria incidence occurred in the rural central part of the study area, including Gujar Khan, Kaller Syedan, and parts of Kahuta and Rawalpindi Tehsil. Ordinary least squares (OLS) regression analysis was conducted to analyze the relationship of risk factors with the disease cases. The relationship of different land covers with the disease cases indicated that malaria was more related to the agriculture, low vegetation, and water classes. Temporal variation of malaria cases showed a significant positive association with meteorological variables, including average monthly rainfall and temperature. The results of the study further suggested that the water supply and sewage system and the solid waste collection system need serious attention to prevent any outbreak in the study area.
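The Getis-Ord Gi* hot-spot statistic used above can be computed directly from its standard formula. The sketch below hand-rolls it with binary distance-band weights on simulated case counts; real analyses (as here, in ArcGIS) would rely on dedicated spatial statistics tooling, and the coordinates, counts and distance band are illustrative.

```python
# Hand-rolled Getis-Ord Gi* sketch with a fixed distance band (illustrative data).
import numpy as np

def getis_ord_gi_star(values, coords, band):
    """Gi* z-scores with binary distance-band weights that include the focal site itself."""
    x = np.asarray(values, dtype=float)
    n = x.size
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    w = (d <= band).astype(float)                     # focal site included (d_ii = 0)
    xbar, s = x.mean(), x.std(ddof=0)
    wx = w @ x
    sw = w.sum(axis=1)
    sw2 = (w ** 2).sum(axis=1)
    denom = s * np.sqrt((n * sw2 - sw ** 2) / (n - 1))
    return (wx - xbar * sw) / denom                   # large positive z-score = hot spot

coords = np.random.random((50, 2)) * 100.0            # placeholder site coordinates (km)
cases = np.random.poisson(20, size=50)                # placeholder malaria counts
z = getis_ord_gi_star(cases, coords, band=15.0)
print("hot spots (z > 1.96):", np.where(z > 1.96)[0])
```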
Change detection from remotely sensed images: From pixel-based to object-based approaches
NASA Astrophysics Data System (ADS)
Hussain, Masroor; Chen, Dongmei; Cheng, Angela; Wei, Hui; Stanley, David
2013-06-01
The appetite for up-to-date information about the earth's surface is ever increasing, as such information provides a base for a large number of applications, including local, regional and global resources monitoring, land-cover and land-use change monitoring, and environmental studies. The data from remote sensing satellites provide opportunities to acquire information about land at varying resolutions and have been widely used for change detection studies. A large number of change detection methodologies and techniques utilizing remotely sensed data have been developed, and newer techniques are still emerging. This paper begins with a discussion of the traditional pixel-based and (mostly) statistics-oriented change detection techniques, which focus mainly on the spectral values and mostly ignore the spatial context. This is succeeded by a review of object-based change detection techniques. Finally there is a brief discussion of spatial data mining techniques in image processing and change detection from remote sensing data. The merits and issues of different techniques are compared. The importance of the exponential increase in image data volume and multiple sensors, and the associated challenges for the development of change detection techniques, are highlighted. With the wide use of very-high-resolution (VHR) remotely sensed images, object-based methods and data mining techniques may have more potential in change detection.
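As a minimal example of the pixel-based, statistics-oriented end of the spectrum discussed above, the sketch below differences two co-registered images and thresholds the difference at a multiple of its standard deviation; the images and threshold are synthetic placeholders.

```python
# Simple image-differencing change detection sketch (synthetic data).
import numpy as np

def difference_change_map(band_t1, band_t2, k=2.0):
    """Flag pixels whose temporal difference departs k standard deviations from the mean."""
    diff = band_t2.astype(float) - band_t1.astype(float)
    mu, sigma = diff.mean(), diff.std()
    return np.abs(diff - mu) > k * sigma               # boolean change mask

t1 = np.random.randint(0, 255, size=(400, 400))        # placeholder image, date 1
t2 = t1 + np.random.normal(0, 5, size=(400, 400))      # placeholder image, date 2
t2[100:150, 100:150] += 60                             # synthetic land-cover change
changed = difference_change_map(t1, t2)
print("changed pixels:", int(changed.sum()))
```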
NASA Astrophysics Data System (ADS)
Calì, M.; Santarelli, M. G. L.; Leone, P.
Gas Turbine Technologies (GTT) and Politecnico di Torino, both located in Torino (Italy), have been involved in the design and installation of a SOFC laboratory in order to analyse the operation, in cogenerative configuration, of the CHP 100 kWe SOFC Field Unit built by Siemens-Westinghouse Power Corporation (SWPC), which at present (May 2005) is starting its operation and which will supply electric and thermal power to the GTT factory. In order to take better advantage of the analysis of the on-site operation, and especially to correctly design the scheduled experimental tests on the system, we developed a mathematical model and ran a simulated experimental campaign, applying a rigorous statistical approach to the analysis of the results. The aim of this work is the computer experimental analysis, through a statistical methodology (2^k factorial experiments), of the CHP 100 performance. First, the mathematical model was calibrated with the results acquired during the first CHP100 demonstration at EDB/ELSAM in Westerwoort. Afterwards, the simulated tests were performed in the form of computer experimental sessions, and the measurement uncertainties were simulated with perturbations imposed on the model's independent variables. The statistical methodology used for the computer experimental analysis is factorial design (Yates' technique): using the ANOVA technique, the effect of the main independent variables (air utilization factor U_ox, fuel utilization factor U_F, internal fuel and air preheating, and anodic recycling flow rate) has been investigated in a rigorous manner. The analysis accounts for the effects of the parameters on stack electric power, thermal recovered power, single cell voltage, cell operative temperature, consumed fuel flow and steam-to-carbon ratio. Each main effect and interaction effect of the parameters is shown, with particular attention to generated electric power and stack heat recovered.
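The factorial-design-plus-ANOVA methodology described above can be illustrated with a toy 2^2 experiment in statsmodels. Factor names echo the abstract (U_ox, U_F), but the response values are simulated, not CHP100 results.

```python
# Toy 2^2 factorial computer experiment analyzed with ANOVA (simulated response).
import itertools
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
levels = [-1, 1]
rows = []
for u_ox, u_f in itertools.product(levels, levels):
    for _ in range(3):                                  # 3 replicates per treatment
        # Assumed response surface: main effects plus an interaction plus noise.
        power = 100 + 4.0 * u_ox + 7.0 * u_f + 1.5 * u_ox * u_f + rng.normal(0, 1)
        rows.append({"U_ox": u_ox, "U_F": u_f, "power": power})
df = pd.DataFrame(rows)

model = smf.ols("power ~ C(U_ox) * C(U_F)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))                  # main effects and interaction
```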
Statistical Validation for Clinical Measures: Repeatability and Agreement of Kinect™-Based Software
Tello, Emanuel; Rodrigo, Alejandro; Valentinuzzi, Max E.
2018-01-01
Background The rehabilitation process is a fundamental stage in the recovery of people's capabilities. However, the evaluation of the process is performed by physiatrists and medical doctors, mostly based on their observations, that is, a subjective appreciation of the patient's evolution. This paper proposes a platform for tracking the movement of an individual's upper limb using Kinect sensor(s), to be applied to the patient during the rehabilitation process. The main contribution is the development of quantifying software and the statistical validation of its performance, repeatability, and clinical use in the rehabilitation process. Methods The software determines joint angles and upper limb trajectories for the construction of a specific rehabilitation protocol and quantifies the treatment evolution. In turn, the information is presented via a graphical interface that allows the recording, storage, and reporting of the patient's data. For clinical purposes, the software information is statistically validated with three different methodologies, comparing the measures with a goniometer in terms of agreement and repeatability. Results The agreement of joint angles measured with the proposed software and the goniometer is evaluated with Bland-Altman plots; all measurements fell well within the limits of agreement, implying interchangeability of both techniques. Additionally, the results of the Bland-Altman analysis of repeatability show 95% confidence. Finally, the physiotherapists' qualitative assessment shows encouraging results for clinical use. Conclusion The main conclusion is that the software is capable of offering a clinical history of the patient and is useful for quantification of the rehabilitation success. The simplicity, low cost, and visualization possibilities enhance the use of the Kinect-based software for rehabilitation and other applications, and the experts' opinion endorses the choice of our approach for clinical practice. Comparison of the new measurement technique with established goniometric methods determines that the proposed software agrees sufficiently to be used interchangeably. PMID:29750166
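The Bland-Altman agreement analysis used in the validation can be sketched in a few lines; the angle measurements below are simulated placeholders, not the study's data.

```python
# Bland-Altman sketch: bias and 95% limits of agreement between two measurement methods.
import numpy as np

def bland_altman(method_a, method_b):
    """Return the mean difference (bias) and the 95% limits of agreement."""
    a = np.asarray(method_a, dtype=float)
    b = np.asarray(method_b, dtype=float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

goniometer = np.array([30.0, 45.0, 60.0, 90.0, 120.0, 135.0])     # degrees (placeholder)
kinect = goniometer + np.random.normal(0, 2.0, size=goniometer.size)
bias, (lo, hi) = bland_altman(kinect, goniometer)
print(f"bias = {bias:.2f} deg, limits of agreement = [{lo:.2f}, {hi:.2f}] deg")
```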
CAPSAS: Computer Assisted Program for the Selection of Appropriate Statistics.
ERIC Educational Resources Information Center
Shermis, Mark D.; Albert, Susan L.
A computer-assisted program has been developed for the selection of statistics or statistical techniques by both students and researchers. Based on Andrews, Klem, Davidson, O'Malley and Rodgers "A Guide for Selecting Statistical Techniques for Analyzing Social Science Data," this FORTRAN-compiled interactive computer program was…
Basic biostatistics for post-graduate students
Dakhale, Ganesh N.; Hiware, Sachin K.; Shinde, Abhijit T.; Mahatme, Mohini S.
2012-01-01
Statistical methods are important to draw valid conclusions from the obtained data. This article provides background information on fundamental methods and techniques in biostatistics for the use of postgraduate students. The main focus is given to types of data, measures of central tendency and variation, and basic tests, which are useful for the analysis of different types of observations. A few topics, such as the normal distribution, calculation of sample size, level of significance, the null hypothesis, indices of variability, and different tests, are explained in detail with suitable examples. Using these guidelines, we are confident that postgraduate students will be able to classify the distribution of data and apply the proper test. Information is also given regarding various free software programs and websites useful for statistical calculations. Thus, postgraduate students will benefit whether they opt for academia or industry. PMID:23087501
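One of the basics mentioned above, sample size calculation, can be illustrated with the standard normal-approximation formula for comparing two means, n = 2*(z_alpha/2 + z_beta)^2 * sigma^2 / delta^2 per group; the effect size and standard deviation below are arbitrary examples.

```python
# Sample size per group for a two-sided comparison of two means (normal approximation).
import math
from scipy.stats import norm

def sample_size_two_means(delta, sigma, alpha=0.05, power=0.80):
    """Approximate sample size per group for detecting a mean difference delta."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    return math.ceil(2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2)

print(sample_size_two_means(delta=5.0, sigma=10.0))   # roughly 63 subjects per group
```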
A critical review of the neuroimaging literature on synesthesia
Hupé, Jean-Michel; Dojat, Michel
2015-01-01
Synesthesia refers to additional sensations experienced by some people for specific stimulations, such as the systematic arbitrary association of colors to letters for the most studied type. Here, we review all the studies (based mostly on functional and structural magnetic resonance imaging) that have searched for the neural correlates of this subjective experience, as well as structural differences related to synesthesia. Most differences claimed for synesthetes are unsupported, due mainly to low statistical power, statistical errors, and methodological limitations. Our critical review therefore casts some doubts on whether any neural correlate of the synesthetic experience has been established yet. Rather than being a neurological condition (i.e., a structural or functional brain anomaly), synesthesia could be reconsidered as a special kind of childhood memory, whose signature in the brain may be out of reach with present brain imaging techniques. PMID:25873873
NASA Technical Reports Server (NTRS)
Park, Steve
1990-01-01
A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.
Metamodels for Computer-Based Engineering Design: Survey and Recommendations
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
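One of the surveyed metamodeling techniques, kriging, can be sketched with a Gaussian process surrogate from scikit-learn; the one-dimensional test function below merely stands in for an expensive analysis code.

```python
# Kriging-style metamodel sketch via a Gaussian process surrogate (illustrative test function).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_analysis(x):
    """Placeholder for a costly computer analysis code."""
    return np.sin(3.0 * x) + 0.5 * x

X_train = np.linspace(0.0, 3.0, 8).reshape(-1, 1)       # a small design of experiments
y_train = expensive_analysis(X_train).ravel()

kernel = ConstantKernel(1.0) * RBF(length_scale=1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train, y_train)

X_new = np.linspace(0.0, 3.0, 100).reshape(-1, 1)
y_pred, y_std = gp.predict(X_new, return_std=True)      # cheap surrogate predictions + uncertainty
print("max predictive std:", y_std.max())
```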
Simultaneous binary hash and features learning for image retrieval
NASA Astrophysics Data System (ADS)
Frantc, V. A.; Makov, S. V.; Voronin, V. V.; Marchuk, V. I.; Semenishchev, E. A.; Egiazarian, K. O.; Agaian, S.
2016-05-01
Content-based image retrieval systems have many applications in the modern world, the most important being image search by query image or by semantic description. Approaches to this problem are employed in personal photo-collection management systems, web-scale image search engines, medical systems, etc. Automatic analysis of large unlabeled image datasets is virtually impossible without a satisfactory image-retrieval technique, which is the main reason this kind of automatic image processing has attracted so much attention in recent years. Despite considerable progress in the field, semantically meaningful image retrieval remains a challenging task. The main issue is the demand to provide reliable results in a short amount of time. This paper addresses the problem with a novel technique for simultaneous learning of global image features and binary hash codes. Our approach provides a mapping from a pixel-based image representation to a hash-value space while trying to preserve as much of the semantic image content as possible. We use deep learning methodology to generate image descriptions with the properties of similarity preservation and statistical independence. The main advantage of our approach over existing ones is the ability to fine-tune the retrieval procedure for a very specific application, which allows us to provide better results than general techniques. The framework for data-dependent image hashing presented in this paper is based on two kinds of neural networks: convolutional neural networks for image description and an autoencoder for the feature-to-hash-space mapping. Experimental results confirm that our approach shows promising results compared to other state-of-the-art methods.
Functional data analysis on ground reaction force of military load carriage increment
NASA Astrophysics Data System (ADS)
Din, Wan Rozita Wan; Rambely, Azmin Sham
2014-06-01
Analysis of ground reaction force (GRF) during military load carriage is done through the functional data analysis (FDA) statistical technique. The main objective of the research is to investigate the effect of 10% load increments and to find the maximum suitable load for the Malaysian military. Ten soldiers (age 31 ± 6.2 years, weight 71.6 ± 10.4 kg, height 166.3 ± 5.9 cm) carrying military loads ranging from 0% body weight (BW) up to 40% BW participated in an experiment to gather GRF and kinematic data using a Vicon Motion Analysis System, Kistler force plates and thirty-nine body markers. The analysis is conducted in the sagittal, medial-lateral and anterior-posterior planes. The results show that a 10% BW load increment has an effect at heel strike and toe-off for all three planes analyzed, with P-values less than 0.001 at the 0.05 significance level. FDA proves to be one of the best statistical techniques for analyzing functional data. It has the ability to handle filtering, smoothing and curve alignment according to curve features and points of interest.
Aquino-Pérez, Dulce María; Peña-Cadena, Daniel; Trujillo-García, José Ubaldo; Jiménez-Sandoval, Jaime Omar; Machorro-Muñoz, Olga Stephanie
2013-01-01
The use of a metered dose inhaler (MDI) is key in the treatment of asthma; its effectiveness is related to proper technique. The purpose of this study was to evaluate the metered dose inhaler technique of the parents or guardians of school children with asthma. In this cross-sectional study, we used a sample of 221 individual caregivers (parent or guardian) of asthmatic children from 5 to 12 years old who use an MDI. We designed a validated questionnaire consisting of 27 items which addressed the handling of the inhaler technique. Descriptive statistics were used. Caregivers rated as having a "good technique" comprised 41 fathers (18.6%), 77 mothers (34.8%) and 9 guardians (4.1%), and those with a "regular technique" comprised 32 fathers (14.5%), 48 mothers (21.2%) and 14 guardians (6.3%). Asthmatic children aged 9 were rated as having a "good technique" in 24 cases (10.9%). According to gender, we found a "good technique" in 80 boys (36.2%) and 47 girls (21.3%) and a "regular technique" in 59 boys (26.7%) and 35 girls (15.8%), P 0.0973, RP 0.9. A "regular technique" was found mainly in those asthmatic children diagnosed at ages between 1 and 3 years. Most of the participants had a good technical qualification; however, major mistakes were made at key points in its performance.
Advanced signal processing based on support vector regression for lidar applications
NASA Astrophysics Data System (ADS)
Gelfusa, M.; Murari, A.; Malizia, A.; Lungaroni, M.; Peluso, E.; Parracino, S.; Talebzadeh, S.; Vega, J.; Gaudio, P.
2015-10-01
The LIDAR technique has recently found many applications in atmospheric physics and remote sensing. One of the main issues in the deployment of systems based on LIDAR is the filtering of the backscattered signal to alleviate the problems generated by noise. Improvement in the signal-to-noise ratio is typically achieved by averaging a quite large number (of the order of hundreds) of successive laser pulses. This approach can be effective but presents significant limitations. First of all, it places great stress on the laser source, particularly in the case of systems for automatic monitoring of large areas for long periods. Secondly, this solution can become difficult to implement in applications characterised by rapid variations of the atmosphere, for example in the case of pollutant emissions, or by abrupt changes in the noise. In this contribution, a new method for the software filtering and denoising of LIDAR signals is presented. The technique is based on support vector regression. The proposed new method is insensitive to the statistics of the noise and is therefore fully general and quite robust. The developed numerical tool has been systematically compared with the most powerful techniques available, using both synthetic and experimental data. Its performance has been tested for various statistical distributions of the noise and also for other disturbances of the acquired signal such as outliers. The competitive advantages of the proposed method are fully documented. The potential of the proposed approach to widen the capability of the LIDAR technique, particularly in the detection of widespread smoke, is discussed in detail.
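A minimal sketch of support vector regression used as a signal filter is given below; the synthetic decaying trace stands in for a LIDAR backscatter profile, and the hyperparameters are illustrative rather than tuned values from the paper.

```python
# Support vector regression as a software filter for a noisy, LIDAR-like trace (synthetic).
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(5)
t = np.linspace(0.0, 1.0, 500)
clean = np.exp(-4.0 * t) * (1.0 + 0.3 * np.sin(25.0 * t))   # placeholder backscatter profile
noisy = clean + rng.normal(0.0, 0.05, size=t.size)

# epsilon-SVR with an RBF kernel; epsilon and C trade fidelity to the samples
# against smoothness of the reconstructed signal.
svr = SVR(kernel="rbf", C=10.0, epsilon=0.02, gamma=50.0)
svr.fit(t.reshape(-1, 1), noisy)
denoised = svr.predict(t.reshape(-1, 1))

print("RMS residual vs clean signal:", np.sqrt(np.mean((denoised - clean) ** 2)))
```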
Cai, Limei; Xu, Zhencheng; Ren, Mingzhong; Guo, Qingwei; Hu, Xibang; Hu, Guocheng; Wan, Hongfu; Peng, Pingan
2012-04-01
One hundred and four surface samples and 40 profile samples of agricultural soils collected from Huizhou in south-east China were monitored for total contents of 8 heavy metals, and analyzed by multivariate statistical techniques and the enrichment factor (EF) in order to investigate their origins. The results indicate that the concentrations of Cu, Zn, Ni, Cr, Pb, Cd, As and Hg in the soils are 16.74, 57.21, 14.89, 27.61, 44.66, 0.10, 10.19 and 0.22 mg/kg, respectively. Compared to the soil background contents in Guangdong Province, the mean concentrations of Hg, Cd, Zn, Pb and As in the soils of Huizhou are higher, especially Hg and Cd, which are 2.82 and 1.79 times the background values, respectively. Cr, Ni, Cu and, partially, Zn and Pb mainly originate from natural sources. Cd, As and, partially, Zn mainly come from agricultural practices. However, Hg and, partially, Pb originate mainly from industrial and traffic sources. Copyright © 2011 Elsevier Inc. All rights reserved.
Comparative Analysis Between Computed and Conventional Inferior Alveolar Nerve Block Techniques.
Araújo, Gabriela Madeira; Barbalho, Jimmy Charles Melo; Dias, Tasiana Guedes de Souza; Santos, Thiago de Santana; Vasconcellos, Ricardo José de Holanda; de Morais, Hécio Henrique Araújo
2015-11-01
The aim of this randomized, double-blind, controlled trial was to compare the computed and conventional inferior alveolar nerve block techniques in symmetrically positioned inferior third molars. Both computed and conventional anesthetic techniques were performed in 29 healthy patients (58 surgeries) aged between 18 and 40 years. The anesthetic of choice was 2% lidocaine with 1:200,000 epinephrine. The Visual Analogue Scale was used to assess the pain variable after anesthetic infiltration. Patient satisfaction was evaluated using the Likert Scale. Heart and respiratory rates, mean time to perform the technique, and the need for additional anesthesia were also evaluated. Mean pain scores were higher for the conventional technique than for the computed technique, 3.45 ± 2.73 and 2.86 ± 1.96, respectively, but no statistically significant differences were found (P > 0.05). Patient satisfaction showed no statistically significant differences. The average runtimes of the computed and conventional techniques were 3.85 and 1.61 minutes, respectively, a statistically significant difference (P < 0.001). The computed anesthetic technique showed lower mean pain perception, but did not show statistically significant differences when contrasted with the conventional technique.
NASA Astrophysics Data System (ADS)
Smid, Marek; Costa, Ana; Pebesma, Edzer; Granell, Carlos; Bhattacharya, Devanjan
2016-04-01
Humankind is now predominantly urban-based, and the majority of continuing population growth will take place in urban agglomerations. Urban systems are not only major drivers of climate change, but also hot spots of climate change impacts. Furthermore, climate change impacts are commonly managed at the city scale. Therefore, assessing climate change impacts on urban systems is a very relevant subject of research. Climate and its impacts at all levels (local, meso and global scales), and also the inter-scale dependencies of those processes, should be subject to detailed analysis. While global and regional projections of future climate are currently available, local-scale information is lacking. Hence, statistical downscaling methodologies represent a potentially efficient way to help close this gap. In general, methodological reviews of downscaling procedures cover the various methods according to their application (e.g. downscaling for hydrological modelling). Some of the most recent and comprehensive studies, such as the ESSEM COST Action ES1102 (VALUE), use the concepts of Perfect Prog and MOS. Other classification schemes of downscaling techniques consider three main categories: linear methods, weather classifications and weather generators. Downscaling and climate modelling represent a multidisciplinary field, where researchers from various backgrounds intersect their efforts, resulting in specific terminology which may be somewhat confusing. For instance, Polynomial Regression (also called Surface Trend Analysis) is a statistical technique; in the context of spatial interpolation procedures, it is commonly classified as a deterministic technique, whereas kriging approaches are classified as stochastic. Furthermore, the terms "statistical" and "stochastic" (frequently used as names of sub-classes in downscaling methodological reviews) are not always considered synonymous, even though both could be seen as identical since they refer to methods that handle input modelling factors as variables with certain probability distributions. In addition, recent development is moving towards multi-step methodologies containing deterministic and stochastic components. This evolution leads to the introduction of new terms like hybrid or semi-stochastic approaches, which makes the effort of systematically classifying downscaling methods into the previously defined categories even more challenging. This work presents a review of statistical downscaling procedures which classifies the methods in two steps. In the first step, we describe several techniques that produce a single climatic surface based on observations; these methods are classified into two categories using an approximation to the broadest consensual statistical terms: linear and non-linear methods. The second step covers techniques that use simulations to generate alternative surfaces corresponding to different realizations of the same processes. Those simulations are essential because the amount of real observational data is limited, and such procedures are crucial for modelling extremes. This work emphasises the link between statistical downscaling methods and research on climate change impacts at the city scale.
Speckle statistics in adaptive optics images at visible wavelengths
NASA Astrophysics Data System (ADS)
Stangalini, Marco; Pedichini, Fernando; Ambrosino, Filippo; Centrone, Mauro; Del Moro, Dario
2016-07-01
Residual speckles in adaptive optics (AO) images represent a well-known limitation to achieving the contrast needed for the detection of faint stellar companions. Speckles in AO imagery can be the result of either residual atmospheric aberrations, not corrected by the AO, or slowly evolving aberrations induced by the optical system. In this work we take advantage of new high temporal cadence (1 ms) data acquired by the SHARK forerunner experiment at the Large Binocular Telescope (LBT) to characterize the AO residual speckles at visible wavelengths. By means of an automatic identification of speckles, we study the main statistical properties of the AO residuals. In addition, we also study the memory of the process, and thus the clearance time of the atmospheric aberrations, by using information theory. This information is useful for increasing the realism of numerical simulations aimed at assessing instrumental performance, and for the application of post-processing techniques to AO imagery.
The statistical analysis of energy release in small-scale coronal structures
NASA Astrophysics Data System (ADS)
Ulyanov, Artyom; Kuzin, Sergey; Bogachev, Sergey
We present the results of a statistical analysis of impulsive flare-like brightenings, which occur in large numbers in the quiet regions of the solar corona. For our study, we utilized high-cadence observations performed with two EUV telescopes, TESIS/Coronas-Photon and AIA/SDO. In total, we processed 6 sequences of images registered throughout the period between 2009 and 2013, covering the rising phase of the 24th solar cycle. Based on a high-speed DEM estimation method, we developed a new technique to evaluate the main parameters of the detected events (geometrical sizes, duration, temperature and thermal energy). We then obtained the statistical distributions of these parameters and examined their variations depending on the level of solar activity. The results imply that near the minimum of the solar cycle the energy release in the quiet corona is mainly provided by small-scale events (nanoflares), whereas larger events (microflares) prevail at the peak of activity. Furthermore, we investigated the coronal conditions that governed the formation and triggering of the registered flares. By means of photospheric magnetograms obtained with the MDI/SoHO and HMI/SDO instruments, we examined the topology of the local magnetic fields at different stages: the pre-flare phase, the peak of intensity and the ending phase. To do so, we introduced a number of topological parameters, including the total magnetic flux, the distance between magnetic sources and their mutual arrangement. The correlation found between the change of these parameters and the formation of flares may offer an important tool for flare forecasting.
Alexander, Terry W.; Wilson, Gary L.
1995-01-01
A generalized least-squares regression technique was used to relate the 2- to 500-year flood discharges from 278 selected streamflow-gaging stations to statistically significant basin characteristics. The regression relations (estimating equations) were defined for three hydrologic regions (I, II, and III) in rural Missouri. Ordinary least-squares regression analyses indicate that drainage area (Regions I, II, and III) and main-channel slope (Regions I and II) are the only basin characteristics needed for computing the 2- to 500-year design-flood discharges at gaged or ungaged stream locations. The resulting generalized least-squares regression equations provide a technique for estimating the 2-, 5-, 10-, 25-, 50-, 100-, and 500-year flood discharges on unregulated streams in rural Missouri. The regression equations for Regions I and II were developed from streamflow-gaging stations with drainage areas ranging from 0.13 to 11,500 square miles and 0.13 to 14,000 square miles, and main-channel slopes ranging from 1.35 to 150 feet per mile and 1.20 to 279 feet per mile, respectively. The regression equations for Region III were developed from streamflow-gaging stations with drainage areas ranging from 0.48 to 1,040 square miles. Standard errors of estimate for the generalized least-squares regression equations in Regions I, II, and III ranged from 30 to 49 percent.
NASA Astrophysics Data System (ADS)
Müller-Hansen, Finn; Schlüter, Maja; Mäs, Michael; Donges, Jonathan F.; Kolb, Jakob J.; Thonicke, Kirsten; Heitzig, Jobst
2017-11-01
Today, humans have a critical impact on the Earth system and vice versa, which can generate complex feedback processes between social and ecological dynamics. Integrating human behavior into formal Earth system models (ESMs), however, requires crucial modeling assumptions about actors and their goals, behavioral options, and decision rules, as well as modeling decisions regarding human social interactions and the aggregation of individuals' behavior. Here, we review existing modeling approaches and techniques from various disciplines and schools of thought dealing with human behavior at different levels of decision making. We demonstrate modelers' often vast degrees of freedom but also seek to make modelers aware of the often crucial consequences of seemingly innocent modeling assumptions. After discussing which socioeconomic units are potentially important for ESMs, we compare models of individual decision making that correspond to alternative behavioral theories and that make diverse modeling assumptions about individuals' preferences, beliefs, decision rules, and foresight. We review approaches to model social interaction, covering game theoretic frameworks, models of social influence, and network models. Finally, we discuss approaches to studying how the behavior of individuals, groups, and organizations can aggregate to complex collective phenomena, discussing agent-based, statistical, and representative-agent modeling and economic macro-dynamics. We illustrate the main ingredients of modeling techniques with examples from land-use dynamics as one of the main drivers of environmental change bridging local to global scales.
MetaGenyo: a web tool for meta-analysis of genetic association studies.
Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro
2017-12-16
Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of studies of this type has increased exponentially, but the results are not always reproducible due to experimental designs, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools to combine results across studies in order to increase statistical power and to resolve discrepancies in genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although there are several software packages that implement meta-analysis, none of them is specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering the Hardy-Weinberg test, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool to conduct comprehensive genetic association meta-analyses. The application is freely available at http://bioinfo.genyo.es/metagenyo/ .
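The core pooling step of such a meta-analysis can be sketched as inverse-variance (fixed-effect) combination of per-study log odds ratios together with Cochran's Q for heterogeneity; the 2x2 tables below are invented examples, and this is not MetaGenyo's code.

```python
# Fixed-effect meta-analysis of odds ratios with Cochran's Q (invented study tables).
import numpy as np
from scipy.stats import chi2, norm

# Each row: (cases with allele, cases without, controls with allele, controls without).
tables = np.array([
    [120,  80, 100, 100],
    [ 60,  40,  55,  45],
    [200, 150, 180, 170],
], dtype=float)

a, b, c, d = tables.T
log_or = np.log((a * d) / (b * c))
var = 1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d               # variance of each log OR
w = 1.0 / var

pooled = np.sum(w * log_or) / np.sum(w)                   # fixed-effect estimate
se = np.sqrt(1.0 / np.sum(w))
z = pooled / se
p_assoc = 2 * (1 - norm.cdf(abs(z)))

q = np.sum(w * (log_or - pooled) ** 2)                    # Cochran's Q
p_het = 1 - chi2.cdf(q, df=len(log_or) - 1)

print(f"pooled OR = {np.exp(pooled):.2f}, p = {p_assoc:.3g}, heterogeneity p = {p_het:.3g}")
```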
A Streamflow Statistics (StreamStats) Web Application for Ohio
Koltun, G.F.; Kula, Stephanie P.; Puskas, Barry M.
2006-01-01
A StreamStats Web application was developed for Ohio that implements equations for estimating a variety of streamflow statistics including the 2-, 5-, 10-, 25-, 50-, 100-, and 500-year peak streamflows, mean annual streamflow, mean monthly streamflows, harmonic mean streamflow, and 25th-, 50th-, and 75th-percentile streamflows. StreamStats is a Web-based geographic information system application designed to facilitate the estimation of streamflow statistics at ungaged locations on streams. StreamStats can also serve precomputed streamflow statistics determined from streamflow-gaging station data. The basic structure, use, and limitations of StreamStats are described in this report. To facilitate the level of automation required for Ohio's StreamStats application, the technique used by Koltun (2003) for computing main-channel slope was replaced with a new computationally robust technique. The new channel-slope characteristic, referred to as SL10-85, differed from the National Hydrography Data-based channel slope values (SL) reported by Koltun (2003) by an average of -28.3 percent, with the median change being -13.2 percent. In spite of the differences, the two slope measures are strongly correlated. The change in channel slope values resulting from the change in computational method necessitated revision of the full-model equations for flood-peak discharges originally presented by Koltun (2003). Average standard errors of prediction for the revised full-model equations presented in this report increased by a small amount over those reported by Koltun (2003), with increases ranging from 0.7 to 0.9 percent. Mean percentage changes in the revised regression and weighted flood-frequency estimates relative to regression and weighted estimates reported by Koltun (2003) were small, ranging from -0.72 to -0.25 percent and -0.22 to 0.07 percent, respectively.
Detecting most influencing courses on students grades using block PCA
NASA Astrophysics Data System (ADS)
Othman, Osama H.; Gebril, Rami Salah
2014-12-01
One of the modern solutions adopted in dealing with the problem of a large number of variables in statistical analyses is Block Principal Component Analysis (Block PCA). This modified technique can be used to reduce the vertical dimension (variables) of the data matrix X (n × p) by selecting a smaller number of variables (say m) containing most of the statistical information. These selected variables can then be employed in further investigations and analyses. Block PCA is an adapted multistage technique based on the original PCA. It involves the application of Cluster Analysis (CA) and variable selection through sub-principal component scores (PCs). The application of Block PCA in this paper is a modified version of the original work of Liu et al (2002). The main objective was to apply PCA on each group of variables (established using cluster analysis) instead of involving the whole large set of variables, which proved to be unreliable. In this work, Block PCA is used to reduce the size of a huge data matrix ((n = 41) × (p = 251)) consisting of the Grade Point Averages (GPA) of students in 251 courses (variables) in the Faculty of Science at Benghazi University. In other words, we construct a smaller analytical data matrix of the students' GPAs with fewer variables containing most of the variation (statistical information) in the original database. By applying Block PCA, 12 courses were found to 'absorb' most of the variation or influence in the original data matrix, and hence are worth keeping for future exploratory and analytical studies. In addition, the course Independent Study (Math.) was found to be the most influential course on students' GPA among the 12 selected courses.
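A rough sketch of the Block PCA workflow, under the assumption that variables are first grouped by correlation-based hierarchical clustering and that one representative course is kept per block, is given below; the grade matrix is simulated and the selection rule is a simplification of the paper's procedure.

```python
# Block PCA sketch: cluster variables, run PCA per block, keep the top-loading variable.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA

rng = np.random.default_rng(11)
grades = rng.normal(70, 10, size=(41, 60))                # placeholder GPA matrix (students x courses)

# Cluster courses by correlation distance (condensed upper-triangle form for linkage).
corr = np.corrcoef(grades, rowvar=False)
dist = 1.0 - np.abs(corr)
condensed = dist[np.triu_indices_from(dist, k=1)]
blocks = fcluster(linkage(condensed, method="average"),
                  t=12, criterion="maxclust")             # assume 12 blocks, one retained course each

selected = []
for block_id in np.unique(blocks):
    cols = np.where(blocks == block_id)[0]
    pca = PCA(n_components=1).fit(grades[:, cols])
    best = cols[np.argmax(np.abs(pca.components_[0]))]    # course with the strongest loading
    selected.append(best)

print("retained course indices:", sorted(selected))
```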
Regression modeling of ground-water flow
Cooley, R.L.; Naff, R.L.
1985-01-01
Nonlinear multiple regression methods are developed to model and analyze groundwater flow systems. Complete descriptions of regression methodology as applied to groundwater flow models allow scientists and engineers engaged in flow modeling to apply the methods to a wide range of problems. Organization of the text proceeds from an introduction that discusses the general topic of groundwater flow modeling, to a review of basic statistics necessary to properly apply regression techniques, and then to the main topic: exposition and use of linear and nonlinear regression to model groundwater flow. Statistical procedures are given to analyze and use the regression models. A number of exercises and answers are included to exercise the student on nearly all the methods that are presented for modeling and statistical analysis. Three computer programs implement the more complex methods. These three are a general two-dimensional, steady-state regression model for flow in an anisotropic, heterogeneous porous medium, a program to calculate a measure of model nonlinearity with respect to the regression parameters, and a program to analyze model errors in computed dependent variables such as hydraulic head. (USGS)
Descriptive Statistical Techniques for Librarians. 2nd Edition.
ERIC Educational Resources Information Center
Hafner, Arthur W.
A thorough understanding of the uses and applications of statistical techniques is integral in gaining support for library funding or new initiatives. This resource is designed to help practitioners develop and manipulate descriptive statistical information in evaluating library services, tracking and controlling limited resources, and analyzing…
ERIC Educational Resources Information Center
Williams, Immanuel James; Williams, Kelley Kim
2016-01-01
Understanding summary statistics and graphical techniques is a building block to comprehending concepts beyond basic statistics. It is known that motivated students perform better in school. Using examples that students find engaging allows them to understand the concepts at a deeper level.
Evaluation of Anomaly Detection Method Based on Pattern Recognition
NASA Astrophysics Data System (ADS)
Fontugne, Romain; Himura, Yosuke; Fukuda, Kensuke
The number of threats on the Internet is rapidly increasing, and anomaly detection has become of increasing importance. High-speed backbone traffic is particularly degraded, but its analysis is a complicated task due to the amount of data, the lack of payload data, asymmetric routing and the use of sampling techniques. Most anomaly detection schemes focus on the statistical properties of network traffic and highlight anomalous traffic through their singularities. In this paper, we concentrate on unusual traffic distributions, which are easily identifiable in temporal-spatial space (e.g., time/address or port). We present an anomaly detection method that uses a pattern recognition technique to identify anomalies in pictures representing traffic. The main advantage of this method is its ability to detect attacks involving mice flows. We evaluate the parameter set and the effectiveness of this approach by analyzing six years of Internet traffic collected from a trans-Pacific link. We show several examples of detected anomalies and compare our results with those of two other methods. The comparison indicates that the anomalies detected only by the pattern-recognition-based method are mainly malicious traffic consisting of a few packets.
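The paper's pattern-recognition algorithm is not reproduced here; the sketch below only illustrates the underlying idea of rendering traffic as a temporal-spatial picture (time versus destination port) and flagging unusually dense regions with a robust threshold. The bin sizes, threshold and synthetic traffic are assumptions.

```python
import numpy as np

def traffic_picture_anomalies(times, ports, t_bins=60, p_bins=256, k=6.0):
    """Render traffic as a 2-D histogram (time vs. destination port) and flag
    cells that are unusually dense compared with a robust background level."""
    hist, _, _ = np.histogram2d(times, ports, bins=[t_bins, p_bins])
    med = np.median(hist)
    mad = np.median(np.abs(hist - med)) + 1e-9   # robust spread estimate
    return np.argwhere(hist > med + k * mad)     # (time-bin, port-bin) indices

# Synthetic example: background traffic plus a short burst toward one port.
rng = np.random.default_rng(2)
times = rng.uniform(0, 600, 20000)               # seconds
ports = rng.integers(0, 65536, 20000)
times = np.concatenate([times, rng.uniform(300, 310, 2000)])
ports = np.concatenate([ports, np.full(2000, 445)])
print(traffic_picture_anomalies(times, ports)[:5])
```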
Nitti, Mariangela; Ciavolino, Enrico; Salvatore, Sergio; Gennaro, Alessandro
2010-09-01
The authors propose a method for analyzing the psychotherapy process: discourse flow analysis (DFA). DFA is a technique representing the verbal interaction between therapist and patient as a discourse network, aimed at measuring the therapist-patient discourse ability to generate new meanings through time. DFA assumes that the main function of psychotherapy is to produce semiotic novelty. DFA is applied to the verbatim transcript of the psychotherapy. It defines the main meanings active within the therapeutic discourse by means of the combined use of text analysis and statistical techniques. Subsequently, it represents the dynamic interconnections among these meanings in terms of a "discursive network." The dynamic and structural indexes of the discursive network have been shown to provide a valid representation of the patient-therapist communicative flow as well as an estimation of its clinical quality. Finally, a neural network is designed specifically to identify patterns of functioning of the discursive network and to verify the clinical validity of these patterns in terms of their association with specific phases of the psychotherapy process. An application of the DFA to a case of psychotherapy is provided to illustrate the method and the kinds of results it produces.
Development of a technique for determination of pulmonary artery pulse wave velocity in horses.
Silva, Gonçalo Teixeira de Almeida; Guest, Bruce B; Gomez, Diego E; McGregor, Martine; Viel, Laurent; O'Sullivan, M Lynne; Runciman, John; Arroyo, Luis G
2017-05-01
Calcification of the tunica media of the axial pulmonary arteries (PA) has been reported in a large proportion of racehorses. In humans, medial calcification is a significant cause of arterial stiffening and is implicated in the pathogenesis of cardiac, cerebral, and renal microvascular diseases. Pulse wave velocity (PWV) provides a measure of arterial stiffness. This study aimed to develop a technique to determine PA-PWV in horses and, secondarily, to investigate a potential association between PA-PWV and arterial fibro-calcification. A dual-pressure sensor catheter (PSC) was placed in the main PA of 10 sedated horses. The pressure waves were used to determine PWV along the PA, using the statistical phase offset method. Histological analysis of the PA was performed to investigate the presence of fibro-calcified lesions. The mean (±SD) PWV was 2.3 ± 0.7 m/s in the proximal PA trunk and 1.1 ± 0.1 m/s further distal (15 cm) in a main PA branch. The mean (±SD) of mean arterial pressures in the proximal PA trunk was 30.1 ± 5.2 mmHg, and 22.0 ± 6.0 mmHg further distal (15 cm) within the main PA branch. The mean (±SD) pulse pressure in the proximal PA trunk was 15.0 ± 4.7 mmHg, and 13.5 ± 3.3 mmHg further distal (15 cm) within the main PA branch. Moderate to severe lesions of the tunica media of the PAs were observed in seven horses, but a correlation with PWV could not be established yet. Pulmonary artery PWV may be determined in standing horses. The technique described may allow further investigation of the effect of calcification of large PAs in the pathogenesis of equine pulmonary circulatory disorders. NEW & NOTEWORTHY Pulmonary artery pulse wave velocity was determined safely in standing sedated horses. The technique described may allow further investigation of the effect of calcification of large pulmonary arteries in the pathogenesis of pulmonary circulatory disorders in horses. Copyright © 2017 the American Physiological Society.
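The statistical phase offset method itself is not detailed in the abstract; as an illustrative stand-in, the transit-time delay between the two pressure waveforms can be estimated by cross-correlation and divided into the sensor separation. The sketch below uses synthetic pulses recorded 15 cm apart; the sampling rate and waveform shapes are assumptions.

```python
import numpy as np

def pwv_from_pressures(p_prox, p_dist, fs_hz, separation_m):
    """Pulse wave velocity from two pressure waveforms recorded a known
    distance apart: separation / transit-time delay (cross-correlation lag)."""
    a = p_prox - p_prox.mean()
    b = p_dist - p_dist.mean()
    xcorr = np.correlate(b, a, mode="full")
    lag = np.argmax(xcorr) - (len(a) - 1)        # samples by which b lags a
    return separation_m / (lag / fs_hz)

# Synthetic pulses 15 cm apart travelling at ~2 m/s (75 ms delay).
fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
pulse = lambda t0: np.exp(-((t - t0) ** 2) / (2 * 0.02 ** 2))
p_proximal = pulse(0.5) + pulse(1.5)
p_distal = pulse(0.575) + pulse(1.575)
print(f"PWV ~ {pwv_from_pressures(p_proximal, p_distal, fs, 0.15):.2f} m/s")
```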
Minimally Invasive and Open Distal Chevron Osteotomy for Mild to Moderate Hallux Valgus.
Brogan, Kit; Lindisfarne, Edward; Akehurst, Harold; Farook, Usama; Shrier, Will; Palmer, Simon
2016-11-01
Minimally invasive surgical (MIS) techniques are increasingly being used in foot and ankle surgery, but it is important that they are adopted only once they have been shown to be equivalent or superior to open techniques. We believe that the main advantages of MIS are found in the early postoperative period, but in order to adopt it as a technique, longer-term studies are required. The aim of this study was to compare the 2-year outcomes of a third-generation MIS distal chevron osteotomy with a comparable traditional open distal chevron osteotomy for mild-moderate hallux valgus. Our null hypothesis was that the 2 techniques would yield equivalent clinical and radiographic results at 2 years. This was a retrospective cohort study. Eighty-one consecutive feet (49 MIS and 32 open distal chevron osteotomies) were followed up for a minimum of 24 months (range 24-58). All patients were clinically assessed using the Manchester-Oxford Foot Questionnaire. Radiographic measures included hallux valgus angle, the intermetatarsal angle, hallux interphalangeal angle, metatarsal phalangeal joint angle, distal metatarsal articular angle, tibial sesamoid position, shape of the first metatarsal head, and plantar offset. Statistical analysis was done using the Student t test or the Wilcoxon rank-sum test for continuous data and the Pearson chi-square test for categorical data. Clinical and radiologic postoperative scores in all domains were substantially improved in both groups (P < .001), but there was no statistically significant difference in improvement of any domain between open and MIS groups (P > .05). There were no significant differences in complications between the 2 groups (P > .5). The midterm results of this third-generation technique show that it was a safe procedure with good clinical outcomes and comparable to traditional open techniques for symptomatic mild-moderate hallux valgus. Level III, retrospective comparative study. © The Author(s) 2016.
NASA Astrophysics Data System (ADS)
Ars, Sébastien; Broquet, Grégoire; Yver Kwok, Camille; Roustan, Yelva; Wu, Lin; Arzoumanian, Emmanuel; Bousquet, Philippe
2017-12-01
This study presents a new concept for estimating the pollutant emission rates of a site and its main facilities using a series of atmospheric measurements across the pollutant plumes. This concept combines the tracer release method, local-scale atmospheric transport modelling and a statistical atmospheric inversion approach. The conversion between the controlled emission and the measured atmospheric concentrations of the released tracer across the plume places valuable constraints on the atmospheric transport. This is used to optimise the configuration of the transport model parameters and the model uncertainty statistics in the inversion system. The emission rates of all sources are then inverted to optimise the match between the concentrations simulated with the transport model and the pollutants' measured atmospheric concentrations, accounting for the transport model uncertainty. In principle, by using atmospheric transport modelling, this concept does not strongly rely on the good colocation between the tracer and pollutant sources and can be used to monitor multiple sources within a single site, unlike the classical tracer release technique. The statistical inversion framework and the use of the tracer data for the configuration of the transport and inversion modelling systems should ensure that the transport modelling errors are correctly handled in the source estimation. The potential of this new concept is evaluated with a relatively simple practical implementation based on a Gaussian plume model and a series of inversions of controlled methane point sources using acetylene as a tracer gas. The experimental conditions are chosen so that they are suitable for the use of a Gaussian plume model to simulate the atmospheric transport. In these experiments, different configurations of methane and acetylene point source locations are tested to assess the efficiency of the method in comparison to the classic tracer release technique in coping with the distances between the different methane and acetylene sources. The results from these controlled experiments demonstrate that, when the targeted and tracer gases are not well collocated, this new approach provides a better estimate of the emission rates than the tracer release technique. As an example, the relative error between the estimated and actual emission rates is reduced from 32 % with the tracer release technique to 16 % with the combined approach in the case of a tracer located 60 m upwind of a single methane source. Further studies and more complex implementations with more advanced transport models and more advanced optimisations of their configuration will be required to generalise the applicability of the approach and strengthen its robustness.
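A much-simplified sketch of the combined approach: simulate receptor concentrations with a Gaussian plume model and retrieve the source rates by least squares. The dispersion parameterization, wind speed, geometry and noise level are assumptions, and the statistical treatment of transport-model uncertainty used in the study is omitted.

```python
import numpy as np

def gaussian_plume(q, y_src, y_obs, x_downwind, u=3.0, h=2.0):
    """Ground-level concentration (g/m^3) from a point source of rate q (g/s)
    for receptors on a crosswind transect x_downwind metres downwind.
    Dispersion growth with distance is a rough assumed parameterization."""
    sigma_y = 0.08 * x_downwind
    sigma_z = 0.06 * x_downwind
    dy = y_obs - y_src
    return (q / (2 * np.pi * u * sigma_y * sigma_z)
            * np.exp(-dy**2 / (2 * sigma_y**2))
            * 2 * np.exp(-h**2 / (2 * sigma_z**2)))

# Two methane sources 60 m apart (crosswind); receptors 200 m downwind.
y_obs = np.linspace(-100.0, 100.0, 41)
source_y = [-30.0, 30.0]
true_rates = np.array([1.5, 0.7])                       # g/s

# Jacobian: concentration per unit emission rate for each source.
G = np.column_stack([gaussian_plume(1.0, ys, y_obs, 200.0) for ys in source_y])
obs = G @ true_rates + np.random.default_rng(3).normal(scale=1e-5, size=y_obs.size)

# Invert the emission rates by least squares (here without the transport-model
# uncertainty statistics that the full inversion system would carry).
est, *_ = np.linalg.lstsq(G, obs, rcond=None)
print("estimated rates (g/s):", est.round(2))
```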
Identification of reliable gridded reference data for statistical downscaling methods in Alberta
NASA Astrophysics Data System (ADS)
Eum, H. I.; Gupta, A.
2017-12-01
Climate models provide essential information to assess impacts of climate change at regional and global scales. However, statistical downscaling methods have been applied to prepare climate model data for various applications, such as hydrologic and ecologic modelling at a watershed scale. As the reliability and (spatial and temporal) resolution of statistically downscaled climate data depend mainly on the reference data, identifying the most reliable reference data is crucial for statistical downscaling. A growing number of gridded climate products are available for key climate variables, which are the main input data to regional modelling systems. However, inconsistencies in these climate products - for example, different combinations of climate variables, varying data domains and data lengths, and data accuracy varying with the physiographic characteristics of the landscape - have caused significant challenges in selecting the most suitable reference climate data for various environmental studies and modelling. Employing several observation-based daily gridded climate products available in the public domain, i.e., thin-plate spline regression products (ANUSPLIN and TPS), an inverse distance method (Alberta Townships), a numerical climate model (North American Regional Reanalysis) and an optimum interpolation technique (Canadian Precipitation Analysis), this study evaluates the accuracy of the climate products at each grid point by comparison with the Adjusted and Homogenized Canadian Climate Data (AHCCD) observations for precipitation and minimum and maximum temperature over the province of Alberta. Based on the performance of the climate products at AHCCD stations, we ranked the reliability of these publicly available climate products with respect to the elevations of the stations, discretized into several classes. According to the rank of the climate products for each elevation class, we identified the most reliable climate products based on the elevation of target points. A web-based system was developed to allow users to easily select the most reliable reference climate data at each target point based on the elevation of the grid cell. By constructing the best combination of reference data for the study domain, the accuracy and reliability of statistically downscaled climate projections could be significantly improved.
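A minimal sketch of the ranking idea: score each gridded product against station observations, group the stations into elevation classes, and rank the products within each class. The table, error metric and class boundaries below are hypothetical placeholders, not the study's data.

```python
import numpy as np
import pandas as pd

# Hypothetical long-format table: one row per (station, product) pair with the
# product value interpolated to the station and the AHCCD observation.
df = pd.DataFrame({
    "station":    ["s1", "s1", "s2", "s2", "s3", "s3"],
    "elev_m":     [650, 650, 1100, 1100, 1900, 1900],
    "product":    ["ANUSPLIN", "CaPA", "ANUSPLIN", "CaPA", "ANUSPLIN", "CaPA"],
    "obs_mm":     [420.0, 420.0, 610.0, 610.0, 880.0, 880.0],
    "product_mm": [400.0, 431.0, 650.0, 600.0, 790.0, 860.0],
})

df["abs_err"] = (df["product_mm"] - df["obs_mm"]).abs()
df["elev_class"] = pd.cut(df["elev_m"], bins=[0, 800, 1500, 3000],
                          labels=["low", "mid", "high"])

# Mean absolute error per product within each elevation class, then rank.
score = df.groupby(["elev_class", "product"], observed=True)["abs_err"].mean()
ranks = score.groupby(level="elev_class", observed=True).rank()
print(ranks)
```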
NASA Astrophysics Data System (ADS)
Bouhaj, M.; von Estorff, O.; Peiffer, A.
2017-09-01
In the application of Statistical Energy Analysis (SEA) to complex assembled structures, a purely predictive model often exhibits errors. These errors are mainly due to a lack of accurate modelling of the power transmission mechanism described through the Coupling Loss Factors (CLF). Experimental SEA (ESEA) is used in practice by the automotive and aerospace industries to verify and update the model or to derive the CLFs for use in an SEA predictive model when analytical estimates cannot be made. This work is particularly motivated by the lack of procedures that allow an estimate to be made of the variance and confidence intervals of the statistical quantities when using the ESEA technique. The aim of this paper is to introduce procedures enabling a statistical description of the measured power input, vibration energies and the derived SEA parameters. Particular emphasis is placed on the identification of structural CLFs of complex built-up structures, comparing different methods. By adopting a Stochastic Energy Model (SEM), the ensemble average in ESEA is also addressed. For this purpose, expressions are obtained to randomly perturb the energy matrix elements and generate individual samples for the Monte Carlo (MC) technique applied to derive the ensemble-averaged CLF. From the results of ESEA tests conducted on an aircraft fuselage section, the SEM approach provides better estimates of the CLFs than classical matrix inversion methods. The expected range of CLF values and the synthesized energy are used as quality criteria for the matrix inversion, allowing critical SEA subsystems to be identified which might require a more refined statistical description of the excitation and the response fields. Moreover, the impact of the variance of the normalized vibration energy on the uncertainty of the derived CLFs is outlined.
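A sketch of the classical ESEA matrix inversion (P = ω L E, so L = P E⁻¹/ω) together with a simple Monte Carlo perturbation of the energy matrix, in the spirit of the ensemble treatment described above. It is not the authors' stochastic energy model; the three-subsystem energy values and the 5% perturbation level are made up.

```python
import numpy as np

omega = 2 * np.pi * 1000.0          # analysis band centre frequency (rad/s), assumed

# Measurement-style data for 3 subsystems: E[i, j] = energy of subsystem i when
# unit power is injected into subsystem j; P is the (diagonal) input power matrix.
E = np.array([[2.0e-3, 4.0e-4, 1.0e-4],
              [3.0e-4, 1.5e-3, 3.5e-4],
              [8.0e-5, 2.5e-4, 1.8e-3]])
P = np.eye(3)                       # 1 W injected in each experiment

def clf_from_esea(E, P, omega):
    """Classical ESEA: loss-factor matrix L = P E^-1 / omega. Off-diagonal
    entries satisfy L[i, j] = -eta_{j->i}, so eta[j, i] = -L[i, j]; the
    diagonal holds each subsystem's total loss factor."""
    L = P @ np.linalg.inv(E) / omega
    eta = -L.T
    np.fill_diagonal(eta, np.diag(L))
    return eta

# Monte Carlo: perturb the energy matrix entries and collect ensemble statistics.
rng = np.random.default_rng(4)
samples = np.array([clf_from_esea(E * rng.normal(1.0, 0.05, E.shape), P, omega)
                    for _ in range(2000)])
print("ensemble-mean CLF / total-loss matrix:\n", samples.mean(axis=0))
# eta[0, 1]: coupling loss factor from the first to the second subsystem.
print("relative std of that CLF:",
      samples[:, 0, 1].std() / samples[:, 0, 1].mean())
```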
Biochemical Imaging of Gliomas Using MR Spectroscopic Imaging for Radiotherapy Treatment Planning
NASA Astrophysics Data System (ADS)
Heikal, Amr Ahmed
This thesis discusses the main obstacles facing wide clinical implementation of magnetic resonance spectroscopic imaging (MRSI) as a tumor delineation tool for radiotherapy treatment planning, particularly for gliomas. These main obstacles are identified as (1) observer bias and poor interpretational reproducibility of the results of MRSI scans, and (2) the long scan times required to conduct MRSI scans. An examination of an existing user-independent MRSI tumor delineation technique known as the choline-to-NAA index (CNI) is conducted to assess its utility in providing a tool for reproducible interpretation of MRSI results. While working with spatial resolutions typically twice those on which the CNI model was originally designed, a region of statistical uncertainty was discovered between the tumor and normal tissue populations, and as such a modification to the CNI model was introduced to clearly identify that region. To address the issue of long scan times, a series of studies were conducted to adapt a scan acceleration technique, compressed sensing (CS), to work with MRSI and to quantify the effects of such a novel technique on the modulation transfer function (MTF), an important quantitative imaging metric. The studies included the development of the first phantom-based method of measuring the MTF for MRSI data, a study of the correlation between the k-space sampling patterns used for compressed sensing and the resulting MTFs, and the introduction of a technique circumventing some of the side-effects of compressed sensing by exploiting the conjugate symmetry property of k-space. The work in this thesis provides two essential steps towards wide clinical implementation of MRSI-based tumor delineation. The proposed modifications to the CNI method coupled with the application of CS to MRSI address the two main obstacles outlined. However, there continues to be room for improvement and questions that need to be answered by future research.
Chi-squared and C statistic minimization for low count per bin data
NASA Astrophysics Data System (ADS)
Nousek, John A.; Shue, David R.
1989-07-01
Results are presented from a computer simulation comparing two statistical fitting techniques on data samples with large and small counts per bin; the results are then related specifically to X-ray astronomy. The Marquardt and Powell minimization techniques are compared by using both to minimize the chi-squared statistic. In addition, Cash's C statistic is applied, with Powell's method, and it is shown that the C statistic produces better fits in the low-count regime than chi-squared.
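An illustrative re-creation of the comparison on synthetic low-count Poisson data: both the chi-squared statistic and Cash's C statistic are minimized with Powell's method via SciPy. The exponential model, bin layout and counts are assumptions, not the paper's simulation setup.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import xlogy

rng = np.random.default_rng(5)
x = np.linspace(0.0, 10.0, 40)
counts = rng.poisson(3.0 * np.exp(-x / 4.0))     # low counts per bin

def model(params):
    amplitude, scale = params
    return amplitude * np.exp(-x / scale)

def chi2(params):
    m = model(params)
    return np.sum((counts - m) ** 2 / np.maximum(m, 1e-12))

def cash(params):
    m = model(params)
    # C statistic (Cash 1979), written so that it is zero for a perfect fit;
    # xlogy handles empty bins (n = 0) gracefully.
    return 2.0 * np.sum(m - counts + xlogy(counts, counts / np.maximum(m, 1e-300)))

for name, stat in [("chi-squared", chi2), ("C statistic", cash)]:
    res = minimize(stat, x0=[2.0, 3.0], method="Powell",
                   bounds=[(0.1, 10.0), (0.5, 20.0)])   # keep the scale positive
    print(f"{name:12s} fit: amplitude = {res.x[0]:.2f}, scale = {res.x[1]:.2f}")
```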
NASA Astrophysics Data System (ADS)
Gutiérrez, Jose Manuel; Maraun, Douglas; Widmann, Martin; Huth, Radan; Hertig, Elke; Benestad, Rasmus; Roessler, Ole; Wibig, Joanna; Wilcke, Renate; Kotlarski, Sven
2016-04-01
VALUE is an open European network to validate and compare downscaling methods for climate change research (http://www.value-cost.eu). A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. This framework is based on a user-focused validation tree, guiding the selection of relevant validation indices and performance measures for different aspects of the validation (marginal, temporal, spatial, multi-variable). Moreover, several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur (assessment of intrinsic performance, effect of errors inherited from the global models, effect of non-stationarity, etc.). The list of downscaling experiments includes 1) cross-validation with perfect predictors, 2) GCM predictors - aligned with the EURO-CORDEX experiment - and 3) pseudo-reality predictors (see Maraun et al. 2015, Earth's Future, 3, doi:10.1002/2014EF000259, for more details). The results of these experiments are gathered, validated and publicly distributed through the VALUE validation portal, allowing for a comprehensive community-open downscaling intercomparison study. In this contribution we describe the overall results from Experiment 1), consisting of a European-wide 5-fold cross-validation (with consecutive 6-year periods from 1979 to 2008) using predictors from ERA-Interim to downscale precipitation and temperatures (minimum and maximum) over a set of 86 ECA&D stations representative of the main geographical and climatic regions in Europe. As a result of the open call for contributions to this experiment (closed in Dec. 2015), over 40 methods representative of the main approaches (MOS and Perfect Prognosis, PP) and techniques (linear scaling, quantile mapping, analogs, weather typing, linear and generalized regression, weather generators, etc.) were submitted, including both data (downscaled values) and metadata (characterizing different aspects of the downscaling methods). This constitutes the largest and most comprehensive intercomparison of statistical downscaling methods to date. Here, we present an overall validation, analyzing marginal and temporal aspects to assess the intrinsic performance and added value of statistical downscaling methods at both annual and seasonal levels. This validation takes into account the different properties/limitations of different approaches and techniques (as reported in the provided metadata) in order to perform a fair comparison. It is pointed out that this experiment alone is not sufficient to evaluate the limitations of (MOS) bias correction techniques. Moreover, it also does not fully validate PP, since we do not learn whether we have the right predictors and whether the PP assumption is valid. These problems will be analyzed in the subsequent community-open VALUE experiments 2) and 3), which will be open for participation throughout the present year.
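A sketch of the Experiment 1 design: 5-fold cross-validation over consecutive 6-year blocks of 1979-2008, training a simple perfect-prognosis-style regression on large-scale predictors. The predictors, predictand and linear model are synthetic placeholders, not VALUE or ERA-Interim data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(6)
years = np.arange(1979, 2009)                       # 30 years
year_of_day = np.repeat(years, 365)
X = rng.normal(size=(year_of_day.size, 4))          # stand-in large-scale predictors
y = X @ np.array([1.0, -0.5, 0.3, 0.0]) + rng.normal(scale=0.5, size=year_of_day.size)

folds = [years[i:i + 6] for i in range(0, 30, 6)]   # five consecutive 6-year blocks
for k, test_years in enumerate(folds, start=1):
    test = np.isin(year_of_day, test_years)
    model = LinearRegression().fit(X[~test], y[~test])
    rmse = np.sqrt(np.mean((model.predict(X[test]) - y[test]) ** 2))
    print(f"fold {k} ({test_years[0]}-{test_years[-1]}): RMSE = {rmse:.2f}")
```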
The Importance of Introductory Statistics Students Understanding Appropriate Sampling Techniques
ERIC Educational Resources Information Center
Menil, Violeta C.
2005-01-01
In this paper the author discusses the meaning of sampling, the reasons for sampling, the Central Limit Theorem, and the different techniques of sampling. Practical and relevant examples are given to make the appropriate sampling techniques understandable to students of Introductory Statistics courses. With a thorough knowledge of sampling…
Should I Pack My Umbrella? Clinical versus Statistical Prediction of Mental Health Decisions
ERIC Educational Resources Information Center
Aegisdottir, Stefania; Spengler, Paul M.; White, Michael J.
2006-01-01
In this rejoinder, the authors respond to the insightful commentary of Strohmer and Arm, Chwalisz, and Hilton, Harris, and Rice about the meta-analysis on statistical versus clinical prediction techniques for mental health judgments. The authors address issues including the availability of statistical prediction techniques for real-life psychology…
Change Detection in Rough Time Series
2014-09-01
...distribution that can present significant challenges to conventional statistical tracking techniques. To address this problem the proposed method...applies hybrid fuzzy statistical techniques to series granules instead of to individual measures. Three examples demonstrated the robust nature of the...
Enhancing Students' Ability to Use Statistical Reasoning with Everyday Problems
ERIC Educational Resources Information Center
Lawson, Timothy J.; Schwiers, Michael; Doellman, Maureen; Grady, Greg; Kelnhofer, Robert
2003-01-01
We discuss a technique for teaching students everyday applications of statistical concepts. We used this technique with students (n = 50) enrolled in several sections of an introductory statistics course; students (n = 45) in other sections served as a comparison group. A class of introductory psychology students (n = 24) served as a second…
Xavier, Pedro; Ayres-De-Campos, Diogo; Reynolds, Ana; Guimarães, Mariana; Costa-Santos, Cristina; Patrício, Belmiro
2005-09-01
Modifications to the classic cesarean section technique described by Pfannenstiel and Kerr have been proposed in the last few years. The objective of this trial was to compare intraoperative and short-term postoperative outcomes between the Pfannenstiel-Kerr and the modified Misgav-Ladach (MML) techniques for cesarean section. This prospective randomized trial involved 162 patients undergoing transverse lower uterine segment cesarean section. Patients were allocated to one of the two arms: 88 to the MML technique and 74 to the Pfannenstiel-Kerr technique. Main outcome measures were defined as the duration of surgery, analgesic requirements, and bowel restitution by the second postoperative day. Additional outcomes evaluated were febrile morbidity, postoperative antibiotic use, postpartum endometritis, and wound complications. Student's t, Mann-Whitney, and Chi-square tests were used for statistical analysis of the results, and p < 0.05 was considered statistically significant. No differences between groups were noted in analgesic requirements, bowel restitution by the second postoperative day, febrile morbidity, antibiotic requirements, endometritis, or wound complications. The MML technique took on average 12 min less to complete (p = 0.001). The MML technique is faster to perform and similar in terms of febrile morbidity, time to bowel restitution, and need for postoperative medications. It is likely to be more cost-effective.
Virtual lab demonstrations improve students' mastery of basic biology laboratory techniques.
Maldarelli, Grace A; Hartmann, Erica M; Cummings, Patrick J; Horner, Robert D; Obom, Kristina M; Shingles, Richard; Pearlman, Rebecca S
2009-01-01
Biology laboratory classes are designed to teach concepts and techniques through experiential learning. Students who have never performed a technique must be guided through the process, which is often difficult to standardize across multiple lab sections. Visual demonstration of laboratory procedures is a key element in teaching pedagogy. The main goals of the study were to create videos explaining and demonstrating a variety of lab techniques that would serve as teaching tools for undergraduate and graduate lab courses and to assess the impact of these videos on student learning. Demonstrations of individual laboratory procedures were videotaped and then edited with iMovie. Narration for the videos was edited with Audacity. Undergraduate students were surveyed anonymously prior to and following screening to assess the impact of the videos on student lab performance by completion of two Participant Perception Indicator surveys. A total of 203 and 171 students completed the pre- and posttesting surveys, respectively. Statistical analyses were performed to compare student perceptions of knowledge of, confidence in, and experience with the lab techniques before and after viewing the videos. Eleven demonstrations were recorded. Chi-square analysis revealed a significant increase in the number of students reporting increased knowledge of, confidence in, and experience with the lab techniques after viewing the videos. Incorporation of instructional videos as prelaboratory exercises has the potential to standardize techniques and to promote successful experimental outcomes.
Rhinoplasty Complications and Reoperations: Systematic Review
Crosara, Paulo Fernando Tormin Borges; Nunes, Flávio Barbosa; Rodrigues, Danilo Santana; Figueiredo, Ana Rosa Pimentel; Becker, Helena Maria Gonçalves; Becker, Celso Goncalves; Guimarães, Roberto Eustáquio Santos
2016-01-01
Introduction This article addresses complications of rhinoplasty and the main causes of reoperation. Objectives The objective of this study is to perform a systematic review of the literature on complications in rhinoplasty. Data Synthesis The authors conducted a survey of articles related to key terms in the literature using three important databases over an 11-year period, between January 2002 and January 2013. We found 1,271 abstracts and selected 49 articles for this review. Conclusion The main results showed that the number of primary open rhinoplasties was 7,902 (89%) and of closed rhinoplasties 765 (11%); the reoperation rate was 2.73% for complete primary open rhinoplasties and 1.56% for complete closed rhinoplasties. The statistical analysis revealed a value of p = 0.071. The standardization of terms can improve the quality of scientific publications about rhinoplasty. There is no difference between primary open and closed rhinoplasty techniques with respect to reoperations. PMID:28050215
Validation of X1 motorcycle model in industrial plant layout by using WITNESSTM simulation software
NASA Astrophysics Data System (ADS)
Hamzas, M. F. M. A.; Bareduan, S. A.; Zakaria, M. Z.; Tan, W. J.; Zairi, S.
2017-09-01
This paper presents a case study on simulation, modelling and analysis for the X1 motorcycle model. In this research, a motorcycle assembly plant was selected as the main site of the study. Simulation techniques using the Witness software were applied to evaluate the performance of the existing manufacturing system. The main objective is to validate the data and determine their significance for the overall performance of the system and for future improvement. The validation process starts with identification of the assembly-line layout. All components are evaluated to establish whether the data are significant for future improvement. Machine and labor statistics are among the parameters evaluated for process improvement. The average total cycle time for the given workstations is used as the criterion for comparison of possible variants. The simulation process shows that the data used are appropriate and meet the criteria for two-sided assembly-line problems.
Technical Note: The Initial Stages of Statistical Data Analysis
Tandy, Richard D.
1998-01-01
Objective: To provide an overview of several important data-related considerations in the design stage of a research project and to review the levels of measurement and their relationship to the statistical technique chosen for the data analysis. Background: When planning a study, the researcher must clearly define the research problem and narrow it down to specific, testable questions. The next steps are to identify the variables in the study, decide how to group and treat subjects, and determine how to measure, and the underlying level of measurement of, the dependent variables. Then the appropriate statistical technique can be selected for data analysis. Description: The four levels of measurement in increasing complexity are nominal, ordinal, interval, and ratio. Nominal data are categorical or “count” data, and the numbers are treated as labels. Ordinal data can be ranked in a meaningful order by magnitude. Interval data possess the characteristics of ordinal data and also have equal distances between levels. Ratio data have a natural zero point. Nominal and ordinal data are analyzed with nonparametric statistical techniques and interval and ratio data with parametric statistical techniques. Advantages: Understanding the four levels of measurement and when it is appropriate to use each is important in determining which statistical technique to use when analyzing data. PMID:16558489
Territories typification technique with use of statistical models
NASA Astrophysics Data System (ADS)
Galkin, V. I.; Rastegaev, A. V.; Seredin, V. V.; Andrianov, A. V.
2018-05-01
Typification of territories is required for the solution of many problems. The results of geological zoning obtained by various methods do not always agree. The main goal of this research is therefore to develop a technique for obtaining a multidimensional standard classified indicator for geological zoning. In the course of the research, a probabilistic approach was used. In order to increase the reliability of the classification of geological information, the authors suggest using the complex multidimensional probabilistic indicator P_K as a classification criterion. The second criterion chosen is the multidimensional standard classified indicator Z. These can serve as classification characteristics in geological-engineering zoning. The above-mentioned indicators P_K and Z are strongly correlated: the correlation coefficient for the entire territory, regardless of structural solidity, equals r = 0.95, so each indicator can be used in geological-engineering zoning. The suggested method has been tested and a schematic zoning map has been drawn.
SEGMENTATION OF MITOCHONDRIA IN ELECTRON MICROSCOPY IMAGES USING ALGEBRAIC CURVES.
Seyedhosseini, Mojtaba; Ellisman, Mark H; Tasdizen, Tolga
2013-01-01
High-resolution microscopy techniques have been used to generate large volumes of data with enough details for understanding the complex structure of the nervous system. However, automatic techniques are required to segment cells and intracellular structures in these multi-terabyte datasets and make anatomical analysis possible on a large scale. We propose a fully automated method that exploits both shape information and regional statistics to segment irregularly shaped intracellular structures such as mitochondria in electron microscopy (EM) images. The main idea is to use algebraic curves to extract shape features together with texture features from image patches. Then, these powerful features are used to learn a random forest classifier, which can predict mitochondria locations precisely. Finally, the algebraic curves together with regional information are used to segment the mitochondria at the predicted locations. We demonstrate that our method outperforms the state-of-the-art algorithms in segmentation of mitochondria in EM images.
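A sketch of the classification step only: a random forest trained on per-patch feature vectors to predict mitochondria locations. The algebraic-curve shape features and texture features, which are the paper's main contribution, are replaced here by random placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n_patches = 2000
# Placeholder feature vectors standing in for algebraic-curve shape features
# concatenated with texture statistics for each EM image patch.
features = rng.normal(size=(n_patches, 32))
labels = (features[:, :3].sum(axis=1)
          + rng.normal(scale=0.5, size=n_patches) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(features, labels,
                                          test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("patch-level accuracy:", round(clf.score(X_te, y_te), 3))
```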
Is it possible to identify a trend in problem/failure data
NASA Technical Reports Server (NTRS)
Church, Curtis K.
1990-01-01
One of the major obstacles in identifying and interpreting a trend is the small number of data points. Future trending reports will begin with 1983 data. As the problem/failure data are aggregated by year, there are just seven observations (1983 to 1989) for the 1990 reports. Any statistical inferences with a small amount of data will have a large degree of uncertainty. Consequently, a regression technique approach to identify a trend is limited. Though trend determination by failure mode may be unrealistic, the data may be explored for consistency or stability and the failure rate investigated. Various alternative data analysis procedures are briefly discussed. Techniques that could be used to explore problem/failure data by failure mode are addressed. The data used are taken from Section One, Space Shuttle Main Engine, of the Calspan Quarterly Report dated April 2, 1990.
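To make the point about uncertainty concrete, the sketch below fits a linear trend to seven hypothetical yearly counts (1983-1989) and reports the 95% confidence interval on the slope, which is typically wide with so few points. The counts are invented, not Space Shuttle Main Engine data.

```python
import numpy as np
from scipy import stats

years = np.arange(1983, 1990)                      # seven observations
failures = np.array([12, 9, 14, 10, 11, 8, 13])    # hypothetical yearly counts

res = stats.linregress(years, failures)
# 95% confidence interval on the slope with n - 2 = 5 degrees of freedom.
t_crit = stats.t.ppf(0.975, df=len(years) - 2)
ci = (res.slope - t_crit * res.stderr, res.slope + t_crit * res.stderr)
print(f"slope = {res.slope:.2f} per year, 95% CI = [{ci[0]:.2f}, {ci[1]:.2f}]")
```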
Valuing improved wetland quality using choice modeling
NASA Astrophysics Data System (ADS)
Morrison, Mark; Bennett, Jeff; Blamey, Russell
1999-09-01
The main stated preference technique used for estimating environmental values is the contingent valuation method. In this paper the results of an application of an alternative technique, choice modeling, are reported. Choice modeling has been developed in marketing and transport applications but has only been used in a handful of environmental applications, most of which have focused on use values. The case study presented here involves the estimation of the nonuse environmental values provided by the Macquarie Marshes, a major wetland in New South Wales, Australia. Estimates of the nonuse value the community places on preventing job losses are also presented. The reported models are robust, having high explanatory power and variables that are statistically significant and consistent with expectations. These results provide support for the hypothesis that choice modeling can be used to estimate nonuse values for both environmental and social consequences of resource use changes.
Bagur, M G; Morales, S; López-Chicano, M
2009-11-15
Unsupervised and supervised pattern recognition techniques such as hierarchical cluster analysis, principal component analysis, factor analysis and linear discriminant analysis have been applied to water samples collected in the Rodalquilar mining district (Southern Spain) in order to identify different sources of environmental pollution caused by the abandoned mining industry. The effect of the mining activity on waters was monitored by determining the concentration of eleven elements (Mn, Ba, Co, Cu, Zn, As, Cd, Sb, Hg, Au and Pb) by inductively coupled plasma mass spectrometry (ICP-MS). The Box-Cox transformation was used to bring the data set toward normal form in order to minimize the non-normality of the geochemical data. The environmental impact is driven mainly by the mining activity carried out in the zone, the acid drainage and, finally, the chemical treatment used for gold beneficiation.
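A sketch of the Box-Cox step on a single skewed element concentration, with the transformation parameter estimated by maximum likelihood in SciPy; the concentration values are synthetic stand-ins for the ICP-MS data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
zn_ppb = rng.lognormal(mean=3.0, sigma=1.0, size=120)   # skewed, strictly positive

transformed, lam = stats.boxcox(zn_ppb)                 # lambda by maximum likelihood
print(f"estimated lambda = {lam:.2f}")
print("skewness before/after:",
      round(stats.skew(zn_ppb), 2), round(stats.skew(transformed), 2))
```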
Costa, Sofia R; Kerry, Brian R; Bardgett, Richard D; Davies, Keith G
2006-12-01
The Pasteuria group of endospore-forming bacteria has been studied as a biocontrol agent of plant-parasitic nematodes. Techniques have been developed for its detection and quantification in soil samples, and these mainly focus on observations of endospore attachment to nematodes. Characterization of Pasteuria populations has recently been performed with DNA-based techniques, which usually require the extraction of large numbers of spores. We describe a simple immunological method for the quantification and characterization of Pasteuria populations. Bayesian statistics were used to determine an extraction efficiency of 43% and a threshold of detection of 210 endospores per gram of sand. This provided a robust means of estimating numbers of endospores in small-volume samples from a natural system. Based on visual assessment of endospore fluorescence, a quantitative method was developed to characterize endospore populations, which were shown to vary according to their host.
Spatial uncertainty of a geoid undulation model in Guayaquil, Ecuador
NASA Astrophysics Data System (ADS)
Chicaiza, E. G.; Leiva, C. A.; Arranz, J. J.; Buenaño, X. E.
2017-06-01
Geostatistics is a discipline that deals with the statistical analysis of regionalized variables. In this case study, geostatistics is used to estimate geoid undulation in the rural area of the town of Guayaquil in Ecuador. The geostatistical approach was chosen because it provides the estimation error of the prediction map. The open-source statistical software R was used, mainly the geoR, gstat and RGeostats libraries. Exploratory data analysis (EDA), trend analysis and structural analysis were carried out. Automatic model fitting by iterative least squares and other fitting procedures were employed to fit the variogram. Finally, kriging using the Bouguer gravity anomaly as an external drift and universal kriging were used to obtain a detailed map of geoid undulation. The estimation errors lay within the interval [-0.5, +0.5] m, with a maximum estimation standard deviation of 2 mm in relation to the interpolation method applied. The error distribution of the geoid undulation map obtained in this study provides a better result than publicly available Earth gravitational models for the study area, according to the comparison with independent validation points. The main goal of this paper is to confirm the feasibility of using geoid undulations from Global Navigation Satellite Systems and levelling field measurements, together with geostatistical techniques, in high-accuracy engineering projects.
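The study used kriging with an external drift and universal kriging in R; as a simplified stand-in, the sketch below implements ordinary kriging with an exponential semivariogram directly in NumPy, returning both the prediction and the kriging variance (the estimation-error information mentioned above). Coordinates, observations and variogram parameters are synthetic assumptions.

```python
import numpy as np

def exp_variogram(h, nugget, sill, range_m):
    """Exponential semivariogram; gamma(0) = 0, with a nugget discontinuity."""
    return np.where(h > 0, nugget + sill * (1.0 - np.exp(-h / range_m)), 0.0)

def ordinary_kriging(xy_obs, z_obs, xy_new, nugget=0.001, sill=0.01, range_m=4000.0):
    """Ordinary kriging prediction and kriging variance at the points xy_new."""
    n = len(z_obs)
    d_obs = np.linalg.norm(xy_obs[:, None, :] - xy_obs[None, :, :], axis=-1)
    # Kriging system: [Gamma 1; 1' 0] [w; mu] = [gamma0; 1]
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d_obs, nugget, sill, range_m)
    A[-1, -1] = 0.0
    preds, variances = [], []
    for p in xy_new:
        gamma0 = exp_variogram(np.linalg.norm(xy_obs - p, axis=1), nugget, sill, range_m)
        sol = np.linalg.solve(A, np.append(gamma0, 1.0))
        w, mu = sol[:n], sol[-1]
        preds.append(w @ z_obs)
        variances.append(w @ gamma0 + mu)      # ordinary-kriging variance
    return np.array(preds), np.array(variances)

# Synthetic geoid-undulation observations (metres) at scattered points (metres).
rng = np.random.default_rng(9)
xy = rng.uniform(0, 10000, size=(25, 2))
z = 25.0 + 0.0004 * xy[:, 0] + rng.normal(scale=0.05, size=25)
grid = np.array([[2000.0, 3000.0], [7000.0, 8000.0]])
zhat, kvar = ordinary_kriging(xy, z, grid)
print("predictions:", zhat.round(3), " kriging std:", np.sqrt(kvar).round(3))
```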
Climate change adaptation: a panacea for food security in Ondo State, Nigeria
NASA Astrophysics Data System (ADS)
Fatuase, A. I.
2017-08-01
This paper examines the perceived causes of climate change, the adaptation strategies employed and the technical inefficiency of arable crop farmers in Ondo State, Nigeria. Data were obtained from primary sources using a structured questionnaire supported by an interview schedule. A multistage sampling technique was used. Data were analyzed using descriptive statistics and the stochastic frontier production function. The findings showed that the majority of respondents (59.1 %) still believed that climate change is a natural phenomenon that is beyond man's power to abate, while industrial emissions, improper sewage disposal, fossil fuel use, deforestation and bush burning were perceived as the main human factors influencing climate change by the category that chose human activities (40.9 %) as the main cause of climate change. The main adaptation strategies employed by the farmers were mixed cropping, planting early-maturing crops, planting resistant crops and the use of agrochemicals. The arable crop farmers were relatively technically efficient, with about 53 % of them having technical efficiency above the study-area average of 0.784. The study observed that education, adaptation, perception, climate information and farming experience were statistically significant in decreasing the inefficiency of arable crop production. Therefore, advocacy on climate change and its adaptation strategies should be intensified in the study area.
Rughiniș, Cosima; Humă, Bogdana
2015-12-01
In this paper we argue that quantitative survey-based social research essentializes age, through specific rhetorical tools. We outline the device of 'socio-demographic variables' and we discuss its argumentative functions, looking at scientific survey-based analyses of adult scientific literacy, in the Public Understanding of Science research field. 'Socio-demographics' are virtually omnipresent in survey literature: they are, as a rule, used and discussed as bundles of independent variables, requiring little, if any, theoretical and measurement attention. 'Socio-demographics' are rhetorically effective through their common-sense richness of meaning and inferential power. We identify their main argumentation functions as 'structure building', 'pacification', and 'purification'. Socio-demographics are used to uphold causal vocabularies, supporting the transmutation of the descriptive statistical jargon of 'effects' and 'explained variance' into 'explanatory factors'. Age can also be studied statistically as a main variable of interest, through the age-period-cohort (APC) disambiguation technique. While this approach has generated interesting findings, it did not mitigate the reductionism that appears when treating age as a socio-demographic variable. By working with age as a 'socio-demographic variable', quantitative researchers convert it (inadvertently) into a quasi-biological feature, symmetrical, as regards analytical treatment, with pathogens in epidemiological research. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Nieto, Paulino José García; Antón, Juan Carlos Álvarez; Vilán, José Antonio Vilán; García-Gonzalo, Esperanza
2014-10-01
The aim of this research work is to build a regression model of the particulate matter up to 10 micrometers in size (PM10) by using the multivariate adaptive regression splines (MARS) technique in the Oviedo urban area (Northern Spain) at local scale. This research work explores the use of a nonparametric regression algorithm known as multivariate adaptive regression splines (MARS) which has the ability to approximate the relationship between the inputs and outputs, and express the relationship mathematically. In this sense, hazardous air pollutants or toxic air contaminants refer to any substance that may cause or contribute to an increase in mortality or serious illness, or that may pose a present or potential hazard to human health. To accomplish the objective of this study, the experimental datasets of nitrogen oxides (NOx), carbon monoxide (CO), sulfur dioxide (SO2), ozone (O3) and dust (PM10) were collected over 3 years (2006-2008) and used to create a highly nonlinear model of the PM10 in the Oviedo urban nucleus (Northern Spain) based on the MARS technique. One main objective of this model is to obtain a preliminary estimate of the dependence between the PM10 pollutant and the other pollutants in the Oviedo urban area at local scale. A second aim is to determine the factors with the greatest bearing on air quality with a view to proposing health and lifestyle improvements. The United States National Ambient Air Quality Standards (NAAQS) establish the limit values of the main pollutants in the atmosphere in order to ensure the health of healthy people. Firstly, this MARS regression model captures the main insight of statistical learning theory in order to obtain a good prediction of the dependence among the main pollutants in the Oviedo urban area. Secondly, the main advantages of MARS are its capacity to produce simple, easy-to-interpret models, its ability to estimate the contributions of the input variables, and its computational efficiency. Finally, on the basis of these numerical calculations, using the multivariate adaptive regression splines (MARS) technique, the conclusions of this research work are presented.
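A sketch of fitting a MARS model, assuming the open-source py-earth package (a MARS implementation with a scikit-learn-style interface) is available; the pollutant predictors and PM10 values are synthetic stand-ins for the 2006-2008 monitoring data.

```python
import numpy as np
from pyearth import Earth          # py-earth: an open-source MARS implementation (assumed installed)

rng = np.random.default_rng(10)
n = 1000
# Stand-ins for the predictors: NOx, CO, SO2, O3 (arbitrary units).
X = rng.uniform(0, 1, size=(n, 4))
# Synthetic PM10 with a piecewise, nonlinear dependence plus noise.
pm10 = (30 + 25 * np.maximum(X[:, 0] - 0.4, 0)
        + 10 * X[:, 1] * X[:, 3] + rng.normal(scale=2.0, size=n))

model = Earth(max_degree=2)        # allow pairwise interactions of hinge functions
model.fit(X, pm10)
pred = model.predict(X)
print("R^2 =", round(1 - np.var(pm10 - pred) / np.var(pm10), 3))
```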
[Treatment of idiopathic varicocele: comparative study of three techniques about 128 cases].
Khouni, Hassen; Bouchiba, Nizar; Khelifa, Melik Melek; Ben Ali, Moez; Sebai, Akrem; Dali, Meriem; Charfi, Mehdi; Chouchene, Adnene; El Kateb, Faycel; Bouhaouala, Habib; Balti, Med Hedi
2011-12-01
Several modalities of varicocele treatment are available; however, no therapeutic technique has shown superiority over the others. The aim was to compare the results of three techniques of varicocelectomy. This was a retrospective analytical and comparative study of 128 patients treated by one of three techniques of varicocelectomy between March 2001 and January 2009: open surgery by the retroperitoneal approach for 42 patients (GI), laparoscopic varicocelectomy for 41 patients (GII) and antegrade scrotal sclerotherapy for 45 patients (GIII). The mean age was 28 years. The main reason for consultation was painful varicocele in 67% of cases, followed by hypofertility in 20.3% of cases and both in 12.5% of cases. The varicocele was left-sided in 71.1% of cases, right-sided in 5.4% of cases and bilateral in 23.43% of cases. The varicocele was subclinical in 6 patients, grade 1 in 16 sides, grade 2 in 105 sides and grade 3 in 31 sides. Sperm count, motility and morphology were comparable between the three groups. Results: The overall success rate was 81.2%, with the highest rate found in group III, treated by antegrade scrotal sclerotherapy (84.4%). Improvement of the semen parameters was noted in all three groups; however, a statistically significant difference was found only in patients treated by antegrade scrotal sclerotherapy, mainly concerning sperm count and motility. The highest pregnancy rate was recorded in patients treated by antegrade scrotal sclerotherapy (13.3%). The main postoperative complications were hydrocele (16%) followed by testicular hypotrophy (3 patients). The three techniques of varicocele treatment offer a similar success rate and improvement of semen parameters. However, antegrade scrotal sclerotherapy appears to be the best first-intention treatment to propose, given its efficiency, ease of performance, moderate cost and feasibility in case of recurrence after treatment by the open approach.
The impact of Lean bundles on hospital performance: does size matter?
Al-Hyari, Khalil; Abu Hammour, Sewar; Abu Zaid, Mohammad Khair Saleem; Haffar, Mohamed
2016-10-10
Purpose The purpose of this paper is to study the effect of the implementation of Lean bundles on hospital performance in private hospitals in Jordan and to evaluate how much the size of the organization can affect the relationship between Lean bundle implementation and hospital performance. Design/methodology/approach The research follows a quantitative approach (descriptive and hypothesis testing). Three statistical techniques were adopted to analyse the data. Structural equation modeling techniques and multi-group analysis were used to examine the research hypotheses and to perform the required statistical analysis of the survey data. Reliability analysis and confirmatory factor analysis were used to test construct validity, reliability and measurement loadings. Findings Lean bundles have been identified as an effective approach that can dramatically improve the organizational performance of private hospitals in Jordan. The main Lean bundles - just-in-time, human resource management and total quality management - are applicable to large, small and medium hospitals, without significant size-dependent differences in the benefits obtained. Originality/value To the researchers' best knowledge, this is the first study of the impact of Lean bundle implementation in the healthcare sector in Jordan. This research also makes a significant contribution by increasing the awareness of Lean bundles among decision makers in healthcare.
How to assess the efficiency of synchronization experiments in tokamaks
NASA Astrophysics Data System (ADS)
Murari, A.; Craciunescu, T.; Peluso, E.; Gelfusa, M.; Lungaroni, M.; Garzotti, L.; Frigione, D.; Gaudio, P.; Contributors, JET
2016-07-01
Control of instabilities such as ELMs and sawteeth is considered an important ingredient in the development of reactor-relevant scenarios. Various forms of ELM pacing have been tried in the past to influence their behavior using external perturbations. One of the main problems with these synchronization experiments resides in the fact that ELMs are periodic or quasi-periodic in nature. Therefore, after any pulsed perturbation, if one waits long enough, an ELM is always bound to occur. To evaluate the effectiveness of ELM pacing techniques, it is crucial to determine an appropriate interval over which they can have a real influence and an effective triggering capability. In this paper, three independent statistical methods are described to address this issue: Granger causality, transfer entropy and recurrence plots. The obtained results for JET with the ITER-like wall (ILW) indicate that the proposed techniques agree very well and provide much better estimates than the traditional heuristic criteria reported in the literature. Moreover, their combined use allows for the improvement of the time resolution of the assessment and determination of the efficiency of the pellet triggering in different phases of the same discharge. Therefore, the developed methods can be used to provide a quantitative and statistically robust estimate of the triggering efficiency of ELM pacing under realistic experimental conditions.
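A sketch of one of the three tools, Granger causality, applied to a pellet-arrival indicator and an ELM indicator using statsmodels; the series are synthetic, with an artificial tendency for ELMs to follow pellets by a few samples, and the lag choices are arbitrary.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(11)
n = 2000
pellets = (rng.uniform(size=n) < 0.05).astype(float)      # pellet arrival indicator
# Synthetic ELM indicator: spontaneous quasi-random activity plus a response
# that tends to follow a pellet by ~3 samples.
elms = (rng.uniform(size=n) < 0.03).astype(float)
elms[3:] = np.clip(elms[3:] + (rng.uniform(size=n - 3) < 0.6) * pellets[:-3], 0, 1)

# Column order matters: the test asks whether the 2nd column Granger-causes the 1st.
data = pd.DataFrame({"elm": elms, "pellet": pellets})
res = grangercausalitytests(data[["elm", "pellet"]], maxlag=5, verbose=False)
for lag in (1, 3, 5):
    p = res[lag][0]["ssr_ftest"][1]
    print(f"lag {lag}: p-value (pellet -> ELM) = {p:.3g}")
```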
THE MEASUREMENT OF BONE QUALITY USING GRAY LEVEL CO-OCCURRENCE MATRIX TEXTURAL FEATURES.
Shirvaikar, Mukul; Huang, Ning; Dong, Xuanliang Neil
2016-10-01
In this paper, statistical methods for the estimation of bone quality to predict the risk of fracture are reported. Bone mineral density and bone architecture properties are the main contributors of bone quality. Dual-energy X-ray Absorptiometry (DXA) is the traditional clinical measurement technique for bone mineral density, but does not include architectural information to enhance the prediction of bone fragility. Other modalities are not practical due to cost and access considerations. This study investigates statistical parameters based on the Gray Level Co-occurrence Matrix (GLCM) extracted from two-dimensional projection images and explores links with architectural properties and bone mechanics. Data analysis was conducted on Micro-CT images of 13 trabecular bones (with an in-plane spatial resolution of about 50μm). Ground truth data for bone volume fraction (BV/TV), bone strength and modulus were available based on complex 3D analysis and mechanical tests. Correlation between the statistical parameters and biomechanical test results was studied using regression analysis. The results showed Cluster-Shade was strongly correlated with the microarchitecture of the trabecular bone and related to mechanical properties. Once the principle thesis of utilizing second-order statistics is established, it can be extended to other modalities, providing cost and convenience advantages for patients and doctors.
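A sketch of the feature extraction: build a normalized GLCM from a two-dimensional projection image with scikit-image (graycomatrix in recent releases) and compute Cluster Shade directly from the matrix, since it is not one of the library's built-in properties. The image, gray-level count and pixel offset are assumptions.

```python
import numpy as np
from skimage.feature import graycomatrix

rng = np.random.default_rng(12)
img = (rng.uniform(size=(128, 128)) * 32).astype(np.uint8)   # stand-in projection image

# Normalized, symmetric GLCM for a 1-pixel horizontal offset and 32 gray levels.
glcm = graycomatrix(img, distances=[1], angles=[0], levels=32,
                    symmetric=True, normed=True)[:, :, 0, 0]

# Cluster Shade: sum over (i, j) of (i + j - mu_i - mu_j)^3 * P(i, j).
i, j = np.meshgrid(np.arange(32), np.arange(32), indexing="ij")
mu_i = np.sum(i * glcm)
mu_j = np.sum(j * glcm)
cluster_shade = np.sum(((i + j - mu_i - mu_j) ** 3) * glcm)
print("Cluster Shade =", round(float(cluster_shade), 4))
```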
Fusco, Diana; Barnum, Timothy J.; Bruno, Andrew E.; Luft, Joseph R.; Snell, Edward H.; Mukherjee, Sayan; Charbonneau, Patrick
2014-01-01
X-ray crystallography is the predominant method for obtaining atomic-scale information about biological macromolecules. Despite the success of the technique, obtaining well diffracting crystals still critically limits going from protein to structure. In practice, the crystallization process proceeds through knowledge-informed empiricism. Better physico-chemical understanding remains elusive because of the large number of variables involved, hence little guidance is available to systematically identify solution conditions that promote crystallization. To help determine relationships between macromolecular properties and their crystallization propensity, we have trained statistical models on samples for 182 proteins supplied by the Northeast Structural Genomics consortium. Gaussian processes, which capture trends beyond the reach of linear statistical models, distinguish between two main physico-chemical mechanisms driving crystallization. One is characterized by low levels of side chain entropy and has been extensively reported in the literature. The other identifies specific electrostatic interactions not previously described in the crystallization context. Because evidence for two distinct mechanisms can be gleaned both from crystal contacts and from solution conditions leading to successful crystallization, the model offers future avenues for optimizing crystallization screens based on partial structural information. The availability of crystallization data coupled with structural outcomes analyzed through state-of-the-art statistical models may thus guide macromolecular crystallization toward a more rational basis. PMID:24988076
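A sketch of training a Gaussian-process model on protein physico-chemical features to predict crystallization propensity, using scikit-learn's GaussianProcessClassifier with an RBF kernel as a stand-in for the paper's Gaussian-process models; the two features and the labels are synthetic.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(13)
n = 182
# Placeholder physico-chemical features, e.g. side-chain entropy and a net-charge proxy.
side_chain_entropy = rng.normal(size=n)
charge_feature = rng.normal(size=n)
X = np.column_stack([side_chain_entropy, charge_feature])
# Synthetic "crystallized" labels with a nonlinear dependence on both features.
p = 1 / (1 + np.exp(2 * side_chain_entropy - 1.5 * np.abs(charge_feature)))
y = (rng.uniform(size=n) < p).astype(int)

gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0))
print("cross-validated accuracy:", round(cross_val_score(gpc, X, y, cv=5).mean(), 3))
```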
Statistical baseline assessment in cardiotocography.
Agostinelli, Angela; Braccili, Eleonora; Marchegiani, Enrico; Rosati, Riccardo; Sbrollini, Agnese; Burattini, Luca; Morettini, Micaela; Di Nardo, Francesco; Fioretti, Sandro; Burattini, Laura
2017-07-01
Cardiotocography (CTG) is the most common non-invasive diagnostic technique to evaluate fetal well-being. It consists of recording fetal heart rate (FHR; bpm) and maternal uterine contractions. Among the main parameters characterizing FHR, baseline (BL) is fundamental to determine fetal hypoxia and distress. In computerized applications, BL is typically computed as mean FHR±ΔFHR, with ΔFHR=8 bpm or ΔFHR=10 bpm, both values being experimentally fixed. In this context, the present work aims: to propose a statistical procedure for ΔFHR assessment; to quantitatively determine the ΔFHR value by applying such procedure to clinical data; and to compare the statistically-determined ΔFHR value against the experimentally-determined ΔFHR values. To these aims, the 552 recordings of the "CTU-UHB intrapartum CTG database" from Physionet were submitted to an automatic procedure, which consisted of an FHR preprocessing phase and a statistical BL assessment. During preprocessing, FHR time series were divided into 20-min sliding windows, in which missing data were removed by linear interpolation. Only windows with a correction rate lower than 10% were further processed for BL assessment, according to which ΔFHR was computed as the FHR standard deviation. The total number of accepted windows was 1192 (38.5%) over 383 recordings (69.4%) with at least one accepted window. The statistically-determined ΔFHR value was 9.7 bpm. Such value was statistically different from 8 bpm (P < 10^-19) but not from 10 bpm (P=0.16). Thus, ΔFHR=10 bpm is preferable to 8 bpm because it is both experimentally and statistically validated.
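A rough illustration of the windowing and standard-deviation-based ΔFHR assessment described above (not the authors' code; the sampling rate and the non-overlapping stepping are simplifying assumptions):

```python
# Sketch of the statistical baseline-width assessment: 20-min windows,
# linear interpolation of missing samples, windows kept only if <10% of
# samples needed correction, ΔFHR taken as the FHR standard deviation.
# The 4 Hz sampling rate and non-overlapping stepping are assumptions.
import numpy as np

def delta_fhr(fhr, fs=4.0, win_min=20, max_correction=0.10):
    win = int(win_min * 60 * fs)
    deltas = []
    for start in range(0, len(fhr) - win + 1, win):
        seg = fhr[start:start + win].astype(float)
        missing = ~np.isfinite(seg) | (seg <= 0)
        if missing.mean() >= max_correction:
            continue  # reject windows requiring too much correction
        idx = np.arange(win)
        seg[missing] = np.interp(idx[missing], idx[~missing], seg[~missing])
        deltas.append(seg.std())
    return np.array(deltas)  # one ΔFHR estimate per accepted window
```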
Are Assumptions of Well-Known Statistical Techniques Checked, and Why (Not)?
Hoekstra, Rink; Kiers, Henk A. L.; Johnson, Addie
2012-01-01
A valid interpretation of most statistical techniques requires that one or more assumptions be met. In published articles, however, little information tends to be reported on whether the data satisfy the assumptions underlying the statistical techniques used. This could be due to self-selection: Only manuscripts with data fulfilling the assumptions are submitted. Another explanation could be that violations of assumptions are rarely checked for in the first place. We studied whether and how 30 researchers checked fictitious data for violations of assumptions in their own working environment. Participants were asked to analyze the data as they would their own data, for which often used and well-known techniques such as the t-procedure, ANOVA and regression (or non-parametric alternatives) were required. It was found that the assumptions of the techniques were rarely checked, and that if they were, it was typically done by means of a statistical test. Interviews afterward revealed a general lack of knowledge about assumptions, the robustness of the techniques with regard to the assumptions, and how (or whether) assumptions should be checked. These data suggest that checking for violations of assumptions is not a well-considered choice, and that the use of statistics can be described as opportunistic. PMID:22593746
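For readers wondering what such checks can look like in practice, a minimal sketch (assuming SciPy and a two-group comparison) of testing normality and homogeneity of variance before choosing between a t-test and a non-parametric alternative; note that the article itself points out that checking assumptions via significance tests is debatable.

```python
# Sketch: check normality (Shapiro-Wilk) and equal variances (Levene)
# before a two-sample comparison, falling back to Mann-Whitney U when the
# assumptions look violated. Thresholds are illustrative.
import numpy as np
from scipy import stats

def compare_groups(a, b, alpha=0.05):
    a, b = np.asarray(a, float), np.asarray(b, float)
    normal = (stats.shapiro(a).pvalue > alpha) and (stats.shapiro(b).pvalue > alpha)
    equal_var = stats.levene(a, b).pvalue > alpha
    if normal:
        return stats.ttest_ind(a, b, equal_var=equal_var)
    return stats.mannwhitneyu(a, b, alternative="two-sided")
```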
Evaluation of Three Different Processing Techniques in the Fabrication of Complete Dentures
Chintalacheruvu, Vamsi Krishna; Balraj, Rajasekaran Uttukuli; Putchala, Lavanya Sireesha; Pachalla, Sreelekha
2017-01-01
Aims and Objectives: The objective of the present study is to compare the effectiveness of three different processing techniques and to find out the accuracy of processing techniques through the number of occlusal interferences and the increase in vertical dimension after denture processing. Materials and Methods: A cross-sectional study was conducted on a sample of 18 patients indicated for complete denture fabrication, who were divided into three subgroups. Three processing techniques, compression molding and injection molding using prepolymerized resin and unpolymerized resin, were used to fabricate dentures for each of the groups. After processing, laboratory-remounted dentures were evaluated for the number of occlusal interferences in centric and eccentric relations and the change in vertical dimension through vertical pin rise in the articulator. Data were analyzed using one-way ANOVA in SPSS software version 19.0 (IBM). Results: Data obtained from the three groups were subjected to the one-way ANOVA test. After the ANOVA test, results with significant variations were subjected to a post hoc test. The number of occlusal interferences with the compression molding technique was reported to be greater in both centric and eccentric positions as compared to the two injection molding techniques, with statistical significance in centric, protrusive, right lateral nonworking, and left lateral working positions (P < 0.05). Mean vertical pin rise (0.52 mm) was reported to be greater in the compression molding technique as compared to the injection molding techniques, which is statistically significant (P < 0.001). Conclusions: Within the limitations of this study, injection molding techniques exhibited fewer processing errors as compared to the compression molding technique, with statistical significance. There was no statistically significant difference in processing errors reported within the two injection molding systems. PMID:28713763
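A minimal sketch of the reported analysis pattern, one-way ANOVA followed by a post-hoc comparison, using placeholder values rather than the study's data and Tukey's HSD as a stand-in for the unspecified post-hoc test:

```python
# Sketch: one-way ANOVA across three processing groups, then a post-hoc
# comparison when the ANOVA is significant. Values are placeholders.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

compression = np.array([0.55, 0.60, 0.48, 0.52, 0.58, 0.50])   # pin rise, mm
injection_prepoly = np.array([0.30, 0.28, 0.35, 0.33, 0.29, 0.31])
injection_unpoly = np.array([0.32, 0.27, 0.34, 0.30, 0.28, 0.33])

f_stat, p = f_oneway(compression, injection_prepoly, injection_unpoly)
if p < 0.05:
    values = np.concatenate([compression, injection_prepoly, injection_unpoly])
    groups = (["compression"] * 6 + ["injection_prepoly"] * 6
              + ["injection_unpoly"] * 6)
    print(pairwise_tukeyhsd(values, groups))   # pairwise group differences
```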
In analyses supporting the development of numeric nutrient criteria, multiple statistical techniques can be used to extract critical values from stressor response relationships. However there is little guidance for choosing among techniques, and the extent to which log-transfor...
Incorporating Nonparametric Statistics into Delphi Studies in Library and Information Science
ERIC Educational Resources Information Center
Ju, Boryung; Jin, Tao
2013-01-01
Introduction: The Delphi technique is widely used in library and information science research. However, many researchers in the field fail to employ standard statistical tests when using this technique. This makes the technique vulnerable to criticisms of its reliability and validity. The general goal of this article is to explore how…
ERIC Educational Resources Information Center
Karadag, Engin
2010-01-01
To assess research methods and analysis of statistical techniques employed by educational researchers, this study surveyed unpublished doctoral dissertation from 2003 to 2007. Frequently used research methods consisted of experimental research; a survey; a correlational study; and a case study. Descriptive statistics, t-test, ANOVA, factor…
Statistics in the Workplace: A Survey of Use by Recent Graduates with Higher Degrees
ERIC Educational Resources Information Center
Harraway, John A.; Barker, Richard J.
2005-01-01
A postal survey was conducted regarding statistical techniques, research methods and software used in the workplace by 913 graduates with PhD and Masters degrees in the biological sciences, psychology, business, economics, and statistics. The study identified gaps between topics and techniques learned at university and those used in the workplace,…
Cytocompatibility, cytotoxicity and genotoxicity analysis of dental implants
NASA Astrophysics Data System (ADS)
Reigosa, M.; Labarta, V.; Molinari, G.; Bernales, D.
2007-11-01
Several types of materials are frequently used for dental prostheses in dental medicine. Different treatments with titanium are the most used. The aim of the present study was to analyze, by means of cytotoxicity and cytocompatibility techniques, the capacity of dental implants to integrate with bone tissue. Cultures of the UMR 106 cell line derived from an osteosarcoma were used for bioassays, mainly because they show many of the properties of osteoblasts. Dental implant samples provided by the B&W company were compared with others of recognized trademarks. The first ones contain ASTM titanium (8348 GR2) with acid etching. Cytotoxicity was analyzed by means of lysosome activity, using the neutral red technique and alkaline phosphatase enzyme activity. Cell viability was determined by means of the acridine orange-ethidium bromide technique. One-way ANOVA and Bonferroni and Duncan post-ANOVA tests were used for the statistical analysis. The assays did not show significant differences among the dental implants analyzed. Our findings show that the dental prostheses studied present high biocompatibility, quantified by the bioassays performed. The techniques employed proved to be useful tools for the analysis of other materials for dental medicine use.
Partial discharge detection and analysis in low pressure environments
NASA Astrophysics Data System (ADS)
Liu, Xin
Typical aerospace vehicles (aircraft and spacecraft) experience a wide range of operating pressures during ascending and returning to earth. Compared to the sea-level atmospheric pressure (760 Torr), the pressure at about 60 km altitude is 2 Torr. The performance of the electric power system components of the aerospace vehicles must remain reliable even under such sub-atmospheric operating conditions. It is well known that the dielectric strength of gaseous insulators, while the electrode arrangement remains unchanged, is pressure dependent. Therefore, characterization of the performance and behavior of the electrical insulation in flight vehicles in low-pressure environments is extremely important. Partial discharge testing is one of the practical methods for evaluating the integrity of electrical insulation in aerospace vehicles. This dissertation describes partial discharge (PD) measurements performed mainly with 60 Hz ac energization in air, argon and helium, for pressures between 2 and 760 Torr. Two main electrode arrangements were used. One was a needle-plane electrode arrangement with a Teflon insulating barrier. The other one was a twisted pair of insulated conductors taken from a standard aircraft wiring harness. The measurement results are presented in terms of typical PD current pulse waveforms and waveform analysis for both main electrode arrangements. The evaluation criteria are the waveform polarity, magnitude, shape, rise time, and phase angle (temporal location) relative to the source voltage. Two-variable histograms and statistical averages of the PD parameters are presented. The PD physical mechanisms are analyzed. For PD pattern recognition, both statistical methods (such as discharge parameter dot pattern representation, discharge parameter phase distribution, statistical operator calculations, and PD fingerprint development) and wavelet transform applications are investigated. The main conclusions of the dissertation include: (1) The PD current pulse waveforms are dependent on the pressure. (2) The rise time of the waveform is another effective PD current pulse characteristic indicator. (3) PD fingerprint patterns that are already available for atmospheric pressure (760 Torr) conditions are inadequate for the evaluation of PD pulses at low pressures. (4) Various wavelet transform techniques can be used effectively for PD pulse signal denoising purposes, and for PD pulse waveform transient feature recognition.
Statistical approach for selection of biologically informative genes.
Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N
2018-05-20
Selection of informative genes from high dimensional gene expression data has emerged as an important research area in genomics. Many gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has been assessed through post-selection classification accuracy computed with a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, i.e. Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biological sufficiency criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique with 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes which are more biologically relevant. The proposed technique is also found to be quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that under the multiple criteria decision-making setup, the proposed technique is best for informative gene selection over the available alternatives. Based on the proposed approach, an R package, i.e. BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study will provide a practical guide to select statistical techniques for selecting informative genes from high dimensional expression data for breeding and systems biology studies. Published by Elsevier B.V.
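The published method is available as the BootMRMR R package cited above. Purely as an illustration of the underlying idea, a bootstrap relevance-minus-redundancy score could be sketched in Python as follows; the correlation-based measures are an assumption, not the paper's exact formulation.

```python
# Illustrative sketch of a bootstrap max-relevance / min-redundancy score
# for gene ranking (not the published Boot-MRMR implementation).
import numpy as np

def mrmr_scores(X, y):
    # relevance: |correlation of each gene with the class label|
    rel = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    # redundancy: mean |correlation| of each gene with all other genes
    c = np.abs(np.corrcoef(X, rowvar=False))
    red = (c.sum(axis=1) - 1.0) / (X.shape[1] - 1)
    return rel - red

def bootstrap_rank(X, y, n_boot=100, seed=0):
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    scores = np.zeros((n_boot, X.shape[1]))
    for b in range(n_boot):
        idx = rng.integers(0, n, n)          # bootstrap sample of subjects
        scores[b] = mrmr_scores(X[idx], y[idx])
    return scores.mean(axis=0)               # higher = more informative gene
```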
Sarrami-Foroushani, Ali; Nasr Esfahany, Mohsen; Nasiraei Moghaddam, Abbas; Saligheh Rad, Hamidreza; Firouznia, Kavous; Shakiba, Madjid; Ghanaati, Hossein; Wilkinson, Iain David; Frangi, Alejandro Federico
2015-01-01
Background: Understanding the hemodynamic environment in vessels is important for realizing the mechanisms leading to vascular pathologies. Objectives: The three-dimensional velocity vector field in the carotid bifurcation is visualized using TR 3D phase-contrast magnetic resonance imaging (TR 3D PC MRI) and computational fluid dynamics (CFD). This study aimed to present a qualitative and quantitative comparison of the velocity vector field obtained by each technique. Subjects and Methods: MR imaging was performed on a 30-year-old normal male subject. TR 3D PC MRI was performed on a 3 T scanner to measure velocity in the carotid bifurcation. A 3D anatomical model for CFD was created using images obtained from time-of-flight MR angiography. The velocity vector field in the carotid bifurcation was predicted using CFD and PC MRI techniques. A statistical analysis was performed to assess the agreement between the two methods. Results: Although the main flow patterns were the same for both techniques, CFD showed a greater resolution in mapping the secondary and circulating flows. Overall root mean square (RMS) errors for all the corresponding data points in PC MRI and CFD were 14.27% at peak systole and 12.91% at end diastole, relative to the maximum velocity measured at each cardiac phase. Bland-Altman plots showed a very good agreement between the two techniques. However, this study did not aim to validate either method; instead, the consistency was assessed to accentuate the similarities and differences between time-resolved PC MRI and CFD. Conclusion: Both techniques provided quantitatively consistent results of in vivo velocity vector fields in the right internal carotid artery (RCA). PC MRI represented a good estimation of the main flow patterns inside the vasculature, which seems to be acceptable for clinical use. However, the limitations of each technique should be considered while interpreting results. PMID:26793288
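A minimal sketch of the agreement metrics mentioned above, relative RMS error and Bland-Altman limits of agreement, assuming point-wise velocity magnitudes from both modalities are available as arrays:

```python
# Sketch: relative RMS error (as a percentage of the maximum measured
# velocity) and Bland-Altman bias / limits of agreement between
# corresponding PC-MRI and CFD velocity values.
import numpy as np

def relative_rms_error(v_mri, v_cfd):
    return 100.0 * np.sqrt(np.mean((v_mri - v_cfd) ** 2)) / np.max(v_mri)

def bland_altman(v_mri, v_cfd):
    mean = (v_mri + v_cfd) / 2.0
    diff = v_mri - v_cfd
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    return mean, diff, bias, (bias - loa, bias + loa)
```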
NASA Astrophysics Data System (ADS)
Yousefian Jazi, Nima
Spatial filtering and directional discrimination have been shown to be an effective pre-processing approach for noise reduction in microphone array systems. In dual-microphone hearing aids, fixed and adaptive beamforming techniques are the most common solutions for enhancing the desired speech and rejecting unwanted signals captured by the microphones. In fact, beamformers are widely utilized in systems where the spatial properties of the target source (usually in front of the listener) are assumed to be known. In this dissertation, some dual-microphone coherence-based speech enhancement techniques applicable to hearing aids are proposed. All proposed algorithms operate in the frequency domain and (like traditional beamforming techniques) are purely based on the spatial properties of the desired speech source and do not require any knowledge of noise statistics for calculating the noise reduction filter. This benefit gives our algorithms the ability to address adverse noise conditions, such as situations where interfering talkers speak simultaneously with the target speaker. In such cases, (adaptive) beamformers lose their effectiveness in suppressing interference, since the noise (reference) channel cannot be built and updated accordingly. This difference is the main advantage of the proposed techniques over traditional adaptive beamformers. Furthermore, since the suggested algorithms are independent of noise estimation, they offer significant improvement in scenarios where the power level of interfering sources is much greater than that of the target speech. The dissertation also shows that the premise behind the proposed algorithms can be extended and employed in binaural hearing aids. The main purpose of the investigated techniques is to enhance the intelligibility level of speech, measured through subjective listening tests with normal hearing and cochlear implant listeners. However, the improvement in quality of the output speech achieved by the algorithms is also presented to show that the proposed methods can be potential candidates for future use in commercial hearing aids and cochlear implant devices.
Techniques in teaching statistics : linking research production and research use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martinez-Moyano, I.; Smith, A. (Univ. of Massachusetts at Boston)
In the spirit of closing the 'research-practice gap,' the authors extend evidence-based principles to statistics instruction in social science graduate education. The authors employ a Delphi method to survey experienced statistics instructors to identify teaching techniques to overcome the challenges inherent in teaching statistics to students enrolled in practitioner-oriented master's degree programs. Among the teaching techniques identified as essential are using real-life examples, requiring data collection exercises, and emphasizing interpretation rather than results. Building on existing research, preliminary interviews, and the findings from the study, the authors develop a model describing antecedents to the strength of the link between research and practice.
Assessment of water quality parameters using multivariate analysis for Klang River basin, Malaysia.
Mohamed, Ibrahim; Othman, Faridah; Ibrahim, Adriana I N; Alaa-Eldin, M E; Yunus, Rossita M
2015-01-01
This case study uses several univariate and multivariate statistical techniques to evaluate and interpret a water quality data set obtained from the Klang River basin located within the state of Selangor and the Federal Territory of Kuala Lumpur, Malaysia. The river drains an area of 1,288 km(2), from the steep mountain rainforests of the main Central Range along Peninsular Malaysia to the river mouth in Port Klang, into the Straits of Malacca. Water quality was monitored at 20 stations, nine of which are situated along the main river and 11 along six tributaries. Data was collected from 1997 to 2007 for seven parameters used to evaluate the status of the water quality, namely dissolved oxygen, biochemical oxygen demand, chemical oxygen demand, suspended solids, ammoniacal nitrogen, pH, and temperature. The data were first investigated using descriptive statistical tools, followed by two practical multivariate analyses that reduced the data dimensions for better interpretation. The analyses employed were factor analysis and principal component analysis, which explain 60 and 81.6% of the total variation in the data, respectively. We found that the resulting latent variables from the factor analysis are interpretable and beneficial for describing the water quality in the Klang River. This study presents the usefulness of several statistical methods in evaluating and interpreting water quality data for the purpose of monitoring the effectiveness of water resource management. The results should provide more straightforward data interpretation as well as valuable insight for managers to conceive optimum action plans for controlling pollution in river water.
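As an illustration of the dimension-reduction step, a short sketch of PCA on standardized water quality parameters; the data here are synthetic placeholders for the seven monitored variables:

```python
# Sketch: standardize the seven monitored parameters and inspect how much
# variance the leading principal components explain.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# columns: DO, BOD, COD, SS, NH3-N, pH, temperature (placeholder values)
X = rng.normal(size=(500, 7))

Z = StandardScaler().fit_transform(X)
pca = PCA().fit(Z)
print(np.cumsum(pca.explained_variance_ratio_))  # e.g. retain PCs covering ~80%
```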
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nedic, Vladimir, E-mail: vnedic@kg.ac.rs; Despotovic, Danijela, E-mail: ddespotovic@kg.ac.rs; Cvetanovic, Slobodan, E-mail: slobodan.cvetanovic@eknfak.ni.ac.rs
2014-11-15
Traffic is the main source of noise in urban environments and significantly affects human mental and physical health and labor productivity. Therefore it is very important to model the noise produced by various vehicles. Techniques for traffic noise prediction are mainly based on regression analysis, which generally is not good enough to describe the trends of noise. In this paper the application of artificial neural networks (ANNs) for the prediction of traffic noise is presented. As input variables of the neural network, the proposed structure of the traffic flow and the average speed of the traffic flow are chosen. The output variable of the network is the equivalent noise level in the given time period, Leq. Based on these parameters, the network is modeled, trained and tested through a comparative analysis of the calculated values and measured levels of traffic noise using the originally developed user-friendly software package. It is shown that artificial neural networks can be a useful tool for the prediction of noise with sufficient accuracy. In addition, the measured values were also used to calculate the equivalent noise level by means of classical methods, and a comparative analysis is given. The results clearly show that the ANN approach is superior in traffic noise level prediction to any other statistical method. - Highlights: • We proposed an ANN model for prediction of traffic noise. • We developed an originally designed user-friendly software package. • The results are compared with classical statistical methods. • The results show much better predictive capabilities of the ANN model.
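A minimal sketch of the modelling idea, a small feed-forward network mapping traffic-flow descriptors and average speed to Leq; the synthetic data and network architecture are illustrative choices, not those of the paper:

```python
# Sketch: ANN regression of equivalent noise level Leq from traffic
# descriptors (synthetic data; inputs and architecture are assumptions).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.uniform(size=(400, 4))      # e.g. cars/h, heavy vehicles/h, bikes/h, speed
leq = 55 + 10 * X[:, 0] + 6 * X[:, 1] + 3 * X[:, 3] + rng.normal(0, 1, 400)

X_tr, X_te, y_tr, y_te = train_test_split(X, leq, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
ann.fit(X_tr, y_tr)
print("R^2 on held-out data:", ann.score(X_te, y_te))
```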
Reddy, Pramod P; Reddy, Trisha P; Roig-Francoli, Jennifer; Cone, Lois; Sivan, Bezalel; DeFoor, W Robert; Gaitonde, Krishnanath; Noh, Paul H
2011-10-01
One of the main ergonomic challenges during surgical procedures is surgeon posture. There have been reports of a high number of work related injuries in laparoscopic surgeons. The Alexander technique is a process of psychophysical reeducation of the body to improve postural balance and coordination, permitting movement with minimal strain and maximum ease. We evaluated the efficacy of the Alexander technique in improving posture and surgical ergonomics during minimally invasive surgery. We performed a prospective cohort study in which subjects served as their own controls. Informed consent was obtained. Before Alexander technique instruction/intervention subjects underwent assessment of postural coordination and basic laparoscopic skills. All subjects were educated about the Alexander technique and underwent post-instruction/intervention assessment of posture and laparoscopic skills. Subjective and objective data obtained before and after instruction/intervention were tabulated and analyzed for statistical significance. All 7 subjects completed the study. Subjects showed improved ergonomics and improved ability to complete FLS™ as well as subjective improvement in overall posture. The Alexander technique training program resulted in a significant improvement in posture. Improved surgical ergonomics, endurance and posture decrease surgical fatigue and the incidence of repetitive stress injuries to laparoscopic surgeons. Further studies of the influence of the Alexander technique on surgical posture, minimally invasive surgery ergonomics and open surgical techniques are warranted to explore and validate the benefits for surgeons. Copyright © 2011 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
Statistical Symbolic Execution with Informed Sampling
NASA Technical Reports Server (NTRS)
Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco
2014-01-01
Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that the informed sampling obtains more precise results and converges faster than a purely statistical analysis and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
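A sketch of the statistical side only, not Symbolic PathFinder itself: Monte Carlo sampling of paths combined with a Beta-Binomial Bayesian estimate, with the exact probability mass of pruned paths folded back in. The interface, the uniform prior, and the bookkeeping of pruned paths are assumptions made for illustration.

```python
# Sketch: Monte Carlo path sampling plus Bayesian (Beta-Binomial) estimation
# of the probability of reaching a target event, combined with an exact
# contribution from already-pruned paths.
import numpy as np
from scipy import stats

def estimate_target_probability(sample_path, n_samples, pruned_mass=0.0,
                                pruned_hits=0.0, seed=0):
    """sample_path(rng) -> True if the sampled path reaches the target event.
    pruned_mass: total probability mass of exactly-analyzed (pruned) paths.
    pruned_hits: exact probability of reaching the target along pruned paths."""
    rng = np.random.default_rng(seed)
    hits = sum(sample_path(rng) for _ in range(n_samples))
    # Beta(1,1) prior over the hit probability within the *unpruned* region.
    posterior = stats.beta(1 + hits, 1 + n_samples - hits)
    remaining = 1.0 - pruned_mass
    lo, hi = posterior.interval(0.95)
    estimate = pruned_hits + remaining * posterior.mean()
    return estimate, (pruned_hits + remaining * lo, pruned_hits + remaining * hi)
```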
NASA Astrophysics Data System (ADS)
Khazaeli, S.; Ravandi, A. G.; Banerji, S.; Bagchi, A.
2016-04-01
Recently, data-driven models for Structural Health Monitoring (SHM) have been of great interest among many researchers. In data-driven models, the sensed data are processed to determine the structural performance and evaluate the damage of an instrumented structure without necessitating mathematical modeling of the structure. A framework of data-driven models for online assessment of the condition of a structure has been developed here. The developed framework is intended for automated evaluation of the monitoring data and structural performance using Internet technology and resources. The main challenges in developing such a framework include: (a) utilizing the sensor measurements to estimate and localize the induced damage in a structure by means of signal processing and data mining techniques, and (b) optimizing the computing and storage resources with the aid of cloud services. The main focus in this paper is to demonstrate the efficiency of the proposed framework for real-time damage detection of a multi-story shear-building structure in two damage scenarios (change in mass and stiffness) in various locations. Several features are extracted from the sensed data by signal processing techniques and statistical methods. Machine learning algorithms are deployed to select damage-sensitive features as well as to classify the data to trace anomalies in the response of the structure. Here, the cloud computing resources from Amazon Web Services (AWS) have been used to implement the proposed framework.
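As an illustration of the data-driven pipeline, a sketch using simple per-window statistical features and a random forest classifier on synthetic acceleration windows; the features and classifier are stand-ins, not the paper's exact choices:

```python
# Sketch: statistical features per acceleration window, then a classifier
# separating healthy and damaged responses (synthetic data).
import numpy as np
from scipy.stats import kurtosis, skew
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def window_features(acc):                 # acc: 1-D acceleration window
    return [acc.mean(), acc.std(), np.abs(acc).max(), skew(acc), kurtosis(acc)]

rng = np.random.default_rng(3)
healthy = [window_features(rng.normal(0, 1.0, 1024)) for _ in range(200)]
damaged = [window_features(rng.normal(0, 1.3, 1024)) for _ in range(200)]
X = np.vstack([healthy, damaged])
y = np.array([0] * 200 + [1] * 200)

clf = RandomForestClassifier(random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```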
Claveria, Oscar; Poluzzi, Alessio
2016-06-01
The first decade of the present century has been characterized by several economic shocks such as the 2008 financial crisis. In this data article we present the annual percentage growth rates of the main tourism indicators in the world's top tourist destinations: the United States, China, France, Spain, Italy, the United Kingdom, Germany, Turkey, Mexico and Austria. We use data from the Compendium of Tourism Statistics provided by the World Tourism Organization (http://www2.unwto.org/content/data-0). In the previous study "Positioning and clustering of the world's top tourist destinations by means of dimensionality reduction techniques for categorical data" (Claveria and Poluzzi, 2016, [1]), it was demonstrated that the dynamics of growth in the tourism industry pose different challenges to each destination. We provide a descriptive analysis of the variables over the period between 2000 and 2010. We complement the analysis by graphing the evolution of the main variables so as to visually represent the co-movements between tourism variables and economic growth.
NASA Astrophysics Data System (ADS)
Trigila, Alessandro; Iadanza, Carla; Esposito, Carlo; Scarascia-Mugnozza, Gabriele
2015-11-01
The aim of this work is to define reliable susceptibility models for shallow landslides using Logistic Regression and Random Forests multivariate statistical techniques. The study area, located in North-East Sicily, was hit on October 1st 2009 by a severe rainstorm (225 mm of cumulative rainfall in 7 h) which caused flash floods and more than 1000 landslides. Several small villages, such as Giampilieri, were hit with 31 fatalities, 6 missing persons and damage to buildings and transportation infrastructures. Landslides, mainly types such as earth and debris translational slides evolving into debris flows, were triggered on steep slopes and involved colluvium and regolith materials which cover the underlying metamorphic bedrock. The work has been carried out with the following steps: i) realization of a detailed event landslide inventory map through field surveys coupled with observation of high resolution aerial colour orthophoto; ii) identification of landslide source areas; iii) data preparation of landslide controlling factors and descriptive statistics based on a bivariate method (Frequency Ratio) to get an initial overview on existing relationships between causative factors and shallow landslide source areas; iv) choice of criteria for the selection and sizing of the mapping unit; v) implementation of 5 multivariate statistical susceptibility models based on Logistic Regression and Random Forests techniques and focused on landslide source areas; vi) evaluation of the influence of sample size and type of sampling on results and performance of the models; vii) evaluation of the predictive capabilities of the models using ROC curve, AUC and contingency tables; viii) comparison of model results and obtained susceptibility maps; and ix) analysis of temporal variation of landslide susceptibility related to input parameter changes. Models based on Logistic Regression and Random Forests have demonstrated excellent predictive capabilities. Land use and wildfire variables were found to have a strong control on the occurrence of very rapid shallow landslides.
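A minimal sketch of the susceptibility-modelling step, fitting Logistic Regression and Random Forests and comparing AUC on held-out mapping units; the predictors and data are synthetic placeholders rather than the Sicilian case study's causative factors:

```python
# Sketch: fit Logistic Regression and Random Forests on landslide /
# non-landslide mapping units and compare predictive capability via AUC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
X = rng.normal(size=(2000, 5))            # e.g. slope, curvature, land use, ...
y = (X[:, 0] + 0.8 * X[:, 2] + rng.normal(0, 1, 2000)) > 0.5

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
for model in (LogisticRegression(max_iter=1000),
              RandomForestClassifier(n_estimators=300, random_state=0)):
    prob = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    print(type(model).__name__, "AUC =", round(roc_auc_score(y_te, prob), 3))
```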
Uncertainty analysis technique for OMEGA Dante measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
May, M. J.; Widmann, K.; Sorce, C.
2010-10-15
The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
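A sketch of the Monte Carlo parameter-variation idea with a trivial stand-in for the real unfold algorithm; the channel count and error magnitudes below are illustrative only:

```python
# Sketch: perturb each channel's voltage by its one-sigma Gaussian error,
# run the (stand-in) unfold, and take the spread of the resulting fluxes
# as the error bar on the measurement.
import numpy as np

def monte_carlo_flux(voltages, sigmas, unfold, n_trials=1000, seed=0):
    """voltages, sigmas: per-channel values; unfold(v) -> scalar flux."""
    rng = np.random.default_rng(seed)
    fluxes = np.empty(n_trials)
    for i in range(n_trials):
        trial = voltages + rng.normal(0.0, sigmas)   # one test voltage set
        fluxes[i] = unfold(trial)
    return fluxes.mean(), fluxes.std(ddof=1)

# Example with a trivial stand-in for the real unfold algorithm:
v = np.linspace(0.1, 1.0, 18)
mean_flux, flux_err = monte_carlo_flux(v, 0.05 * v, unfold=lambda x: x.sum())
```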
Symbolic dynamics techniques for complex systems: Application to share price dynamics
NASA Astrophysics Data System (ADS)
Xu, Dan; Beck, Christian
2017-05-01
The symbolic dynamics technique is well known for low-dimensional dynamical systems and chaotic maps, and lies at the roots of the thermodynamic formalism of dynamical systems. Here we show that this technique can also be successfully applied to time series generated by complex systems of much higher dimensionality. Our main example is the investigation of share price returns in a coarse-grained way. A nontrivial spectrum of Rényi entropies is found. We study how the spectrum depends on the time scale of returns, the sector of stocks considered, as well as the number of symbols used for the symbolic description. Overall our analysis confirms that in the symbol space transition probabilities of observed share price returns depend on the entire history of previous symbols, thus emphasizing the need for a modelling based on non-Markovian stochastic processes. Our method allows for quantitative comparisons of entirely different complex systems, for example the statistics of symbol sequences generated by share price returns using 4 symbols can be compared with that of genomic sequences.
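As an illustration of the symbolic-dynamics computation, a sketch that coarse-grains a return series into a small alphabet and evaluates Rényi entropies of symbol words; the quantile binning and word length are assumptions, not the paper's exact choices:

```python
# Sketch: symbolize returns into an n-symbol alphabet and compute Rényi
# entropies of length-n symbol words from their empirical probabilities.
import numpy as np
from collections import Counter

def symbolize(returns, n_symbols=4):
    # equal-probability bins via quantiles -> symbols 0..n_symbols-1
    edges = np.quantile(returns, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(returns, edges)

def renyi_entropy(symbols, word_len=3, q=2.0):
    words = [tuple(symbols[i:i + word_len])
             for i in range(len(symbols) - word_len + 1)]
    p = np.array(list(Counter(words).values()), float)
    p /= p.sum()
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))            # Shannon limit
    return np.log(np.sum(p ** q)) / (1.0 - q)
```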
Bond, John W; Weart, Jocelyn R
2017-05-01
Recovery, profiling, and speculative searching of trace DNA (not attributable to a body fluid/cell type) over a twelve-month period in a U.S. Crime Laboratory and U.K. police force are compared. Results show greater numbers of U.S. firearm-related items submitted for analysis compared with the U.K., where greatest numbers were submitted from burglary or vehicle offenses. U.S. multiple recovery techniques (double swabbing) occurred mainly during laboratory examination, whereas the majority of U.K. multiple recovery techniques occurred at the scene. No statistical difference was observed for useful profiles from single or multiple recovery. Database loading of interpretable profiles was most successful for U.K. items related to burglary or vehicle offenses. Database associations (matches) represented 7.0% of all U.S. items and 13.1% of all U.K. items. The U.K. strategy for burglary and vehicle examination demonstrated that careful selection of both items and sampling techniques is crucial to obtaining the observed results. © 2016 American Academy of Forensic Sciences.
Novelli, M D; Barreto, E; Matos, D; Saad, S S; Borra, R C
1997-01-01
The authors present the experimental results of the computerized quantification of tissue structures involved in the reparative process of colonic anastomosis performed by manual suture and biofragmentable ring. The quantified variables in this study were: oedema fluid, myofiber tissue, blood vessels and cellular nuclei. An image processing software package developed at the Laboratório de Informática Dedicado à Odontologia (LIDO) was used to quantify the pathognomonic alterations in the inflammatory process in colonic anastomoses performed in 14 dogs. The results were compared to those obtained through traditional diagnosis by two pathologists as a counterproof measure. The criteria for these diagnoses were defined in levels represented by absent, light, moderate and intensive, which were compared to the analysis performed by the computer. There was a statistically significant difference between the two techniques: the biofragmentable ring technique exhibited less oedema fluid, more organized myofiber tissue and a higher number of elongated cellular nuclei in relation to the manual suture technique. The analysis of histometric variables through computational image processing was considered efficient and powerful for quantifying the main tissular inflammatory and reparative changes.
Earth Observation System Flight Dynamics System Covariance Realism
NASA Technical Reports Server (NTRS)
Zaidi, Waqar H.; Tracewell, David
2016-01-01
This presentation applies a covariance realism technique to the National Aeronautics and Space Administration (NASA) Earth Observation System (EOS) Aqua and Aura spacecraft based on inferential statistics. The technique consists of three parts: collection and calculation of definitive state estimates through orbit determination, calculation of covariance realism test statistics at each covariance propagation point, and proper assessment of those test statistics.
The Shock and Vibration Digest. Volume 16, Number 1
1984-01-01
investigation of the measurement of frequency band average loss factors of structural components for use in the statistical energy analysis method of...stiffness. Matrix methods. Key Words: Finite element technique, statistical energy analysis, experimental techniques, framed structures, computer programs. In order to further understand the practical application of the statistical energy analysis, a two-section plate-like frame structure is...
Rigatelli, Gianluca; Dell'Avvocata, Fabio; Zuin, Marco; Giatti, Sara; Duong, Khanh; Pham, Trung; Tuan, Nguyen Si; Vassiliev, Dobrin; Daggubati, Ramesh; Nguyen, Thach
2017-12-01
Provisional and culotte are the most commonly used techniques in left main (LM) stenting. The impact of different post-dilation techniques on the fluid dynamics of the LM bifurcation has not yet been investigated. The aim of this study is to evaluate, by means of computational fluid dynamic (CFD) analysis, the impact of different post-dilation techniques, including the proximal optimization technique (POT), kissing balloon (KB), POT-side-POT and POT-KB-POT, 2-step kissing (2SK) and snuggle kissing balloon (SKB), on the flow dynamic profile after LM provisional or culotte stenting. We considered an LM-LAD-LCX bifurcation reconstructed after reviewing 100 consecutive patients (mean age 71.4 ± 9.3 years, 49 males) with LM distal disease. The diameters of the LAD and LCX were modelled according to Finet's law as follows: LM 4.5 mm, LAD 3.5 mm, LCX 2.75 mm, with the bifurcation angle set at 55°. A Xience third-generation stent (Abbott Inc., USA) was reconstructed and virtually implanted in provisional/cross-over and culotte fashion. POT, KB, POT-side-POT, POT-KB-POT, 2SK and SKB were virtually applied and analyzed in terms of the wall shear stress (WSS). Analyzing the provisional stenting, the 2SK and KB techniques had a statistically significant lower impact on the WSS at the carina, while POT seemed to obtain a neutral effect. At the wall opposite to the carina, the more physiological profile was obtained by KB and POT, with higher WSS values and a smaller surface area of low WSS. In culotte stenting, at the carina, POT-KB-POT and 2SK had a very physiological profile, while at the wall opposite to the carina, 2SK and POT-KB-POT significantly decreased the surface area of low WSS compared to the other techniques. From the fluid dynamic point of view, in LM provisional stenting, POT, 2SK and KB showed a similar beneficial impact on the bifurcation rheology, while in LM culotte stenting, POT-KB-POT and 2SK performed slightly better than the other techniques, probably reflecting better strut apposition.
NASA Technical Reports Server (NTRS)
Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)
2000-01-01
The use of the Principal Component Analysis technique for the analysis of geophysical time series has been questioned, in particular for its tendency to extract components that mix several physical phenomena even when the signal is just their linear sum. We demonstrate with a data simulation experiment that Independent Component Analysis, a recently developed technique, is able to solve this problem. This new technique requires the statistical independence of components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. Furthermore, ICA does not require additional a priori information such as the localization constraint used in Rotational Techniques.
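A short sketch of the contrast being drawn, applied to a synthetic two-source mixture, using scikit-learn's PCA and FastICA:

```python
# Sketch: a linear mixture of two independent sources is recovered by ICA
# (higher-order statistics) but remains mixed under PCA (decorrelation only).
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(5)
t = np.linspace(0, 8, 2000)
sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]      # independent signals
X = sources @ np.array([[1.0, 0.5], [0.4, 1.0]]).T           # observed mixtures

components_pca = PCA(n_components=2).fit_transform(X)
components_ica = FastICA(n_components=2, random_state=0).fit_transform(X)
# components_ica recovers the sources up to scale/order; components_pca does not.
```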
Forest wildlife habitat statistics for Maine - 1982
Robert T. Brooks; Thomas S. Frieswyk; Arthur Ritter
1986-01-01
A statistical report on the first forest wildlife habitat survey of Maine (1982). Eighty-five tables show estimates of forest area and several attributes of forest land wildlife habitat. Data are presented at two levels: state and geographic sampling unit.
NASA Astrophysics Data System (ADS)
Caporali, E.; Chiarello, V.; Galeati, G.
2014-12-01
Peak discharge estimates for a given return period are of primary importance in engineering practice for risk assessment and hydraulic structure design. Different statistical methods are chosen here for the assessment of the flood frequency curve: one indirect technique based on extreme rainfall event analysis, and the Peak Over Threshold (POT) model and the Annual Maxima approach as direct techniques using river discharge data. In the framework of the indirect method, a Monte Carlo simulation approach is adopted to determine a derived frequency distribution of peak runoff using a probabilistic formulation of the SCS-CN method as a stochastic rainfall-runoff model. A Monte Carlo simulation is used to generate a sample of different runoff events from different stochastic combinations of rainfall depth, storm duration, and initial loss inputs. The distribution of the rainfall storm events is assumed to follow the GP law, whose parameters are estimated through the GEV parameters of the annual maximum data. The evaluation of the initial abstraction ratio is investigated since it is one of the most questionable assumptions in the SCS-CN model and plays a key role in river basins characterized by high-permeability soils, mainly governed by the infiltration excess mechanism. In order to take into account the uncertainty of the model parameters, this modified approach, which is able to revise and re-evaluate the original value of the initial abstraction ratio, is implemented. In the POT model the choice of the threshold has been an essential issue, mainly based on a compromise between bias and variance. The Generalized Extreme Value (GEV) distribution fitted to the annual maximum discharges is therefore compared with the Pareto distributed peaks to check the suitability of the frequency of occurrence representation. The methodology is applied to a large dam in the Serchio river basin, located in the Tuscany Region. The application has shown that the Monte Carlo simulation technique can be a useful tool, providing more robust estimates than those obtained by direct statistical methods.
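A sketch of the derived-distribution (indirect) step, sampling storm depth from a Generalized Pareto law and an uncertain initial-abstraction ratio, then routing each storm through the SCS-CN relation; all parameter values are illustrative, not those of the Serchio basin study:

```python
# Sketch: Monte Carlo derived distribution of event runoff via the SCS-CN
# relation, with an uncertain initial-abstraction ratio lambda.
import numpy as np
from scipy.stats import genpareto

def scs_cn_runoff(P, cn, lam):
    S = 25400.0 / cn - 254.0           # potential retention, mm
    Ia = lam * S                        # initial abstraction
    return np.where(P > Ia, (P - Ia) ** 2 / (P - Ia + S), 0.0)

rng = np.random.default_rng(6)
P = genpareto.rvs(c=0.1, loc=30.0, scale=25.0, size=100_000, random_state=0)
lam = rng.uniform(0.05, 0.2, size=P.size)     # uncertain abstraction ratio
Q = scs_cn_runoff(P, cn=70.0, lam=lam)
# Crude design value: mapping quantiles to return periods here assumes
# roughly one storm event per year (an illustrative simplification).
print("runoff with ~100-yr exceedance:", np.quantile(Q, 1 - 1 / 100))
```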
Texture analysis with statistical methods for wheat ear extraction
NASA Astrophysics Data System (ADS)
Bakhouche, M.; Cointault, F.; Gouton, P.
2007-01-01
In the agronomic domain, the simplification of crop counting, necessary for yield prediction and agronomic studies, is an important project for technical institutes such as Arvalis. Although the main objective of our global project is to conceive a mobile robot for natural image acquisition directly in a field, Arvalis first proposed that we detect by image processing the number of wheat ears in images before counting them, which will allow us to obtain the first component of the yield. In this paper we compare different texture image segmentation techniques based on feature extraction by first and higher order statistical methods which have been applied on our images. The extracted features are used for unsupervised pixel classification to obtain the different classes in the image. So, the K-means algorithm is implemented before the choice of a threshold to highlight the ears. Three methods have been tested in this feasibility study, with an average error of 6%. Although the evaluation of the quality of the detection is done visually, automatic evaluation algorithms are currently being implemented. Moreover, other statistical methods of higher order will be implemented in the future jointly with methods based on spatio-frequential transforms and specific filtering.
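A minimal sketch of such a chain, window-based first-order statistics, K-means clustering, and a final intensity-based selection of the ear class; this is illustrative, not the implementation used in the study:

```python
# Sketch: per-pixel first-order statistics in a sliding window, K-means
# clustering of the feature vectors, then the brightest cluster is kept as
# the candidate 'ear' class.
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.cluster import KMeans

def segment_ears(gray, win=9, n_clusters=3):
    mean = uniform_filter(gray.astype(float), win)
    sq_mean = uniform_filter(gray.astype(float) ** 2, win)
    std = np.sqrt(np.maximum(sq_mean - mean ** 2, 0.0))
    feats = np.stack([mean.ravel(), std.ravel()], axis=1)
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=0).fit_predict(feats)
    # keep the cluster with the highest mean intensity as candidate ears
    ear_cluster = np.argmax([mean.ravel()[labels == k].mean()
                             for k in range(n_clusters)])
    return (labels == ear_cluster).reshape(gray.shape)
```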
NASA Astrophysics Data System (ADS)
Wan, Xiaoqing; Zhao, Chunhui; Wang, Yanchun; Liu, Wu
2017-11-01
This paper proposes a novel classification paradigm for hyperspectral image (HSI) using feature-level fusion and deep learning-based methodologies. The operation is carried out in three main steps. First, during a pre-processing stage, wave atoms are introduced into a bilateral filter to smooth the HSI, and this strategy can effectively attenuate noise and restore texture information. Meanwhile, high quality spectral-spatial features can be extracted from the HSI by taking geometric closeness and photometric similarity among pixels into consideration simultaneously. Second, higher order statistics techniques are introduced into hyperspectral data classification for the first time to characterize the phase correlations of spectral curves. Third, multifractal spectrum features are extracted to characterize the singularities and self-similarities of spectra shapes. To this end, a feature-level fusion is applied to the extracted spectral-spatial features along with the higher order statistics and multifractal spectrum features. Finally, a stacked sparse autoencoder is utilized to learn more abstract and invariant high-level features from the multiple feature sets, and then a random forest classifier is employed to perform supervised fine-tuning and classification. Experimental results on two real hyperspectral data sets demonstrate that the proposed method outperforms some traditional alternatives.
The exploit of cereal embryo structure for productive reasons by in vitro techniques
NASA Astrophysics Data System (ADS)
Savaskan, C.
2017-07-01
Our work exploiting embryo structure in durum wheat and some other cereals has two main aims. The first is haploid (or doubled haploid) embryo production using anther or microspore culture or intergeneric crosses, to fix desirable characters in a genetically homozygous state. The second is to develop a convenient embryo culture technique so that genotypes can be stored and cultivated for longer periods without alien pollination and other risks of field conditions. For that reason, two different auxins, and also their combination with kinetin, were used on mature embryos of wheat genotypes (hexaploid and tetraploid) to determine the most efficient dose for callus production and plant regeneration in plant tissue culture. Modified MS media were used for regeneration, with and without a single dose of arabinogalactan protein (AGP). In a further step of this study, the most efficient auxin+kinetin combination, determined in the previous experiment, was used in the same modified MS medium for callus production and plant regeneration in three different genotypes (hexaploid and tetraploid wheat and diploid barley). Data were recorded at five different developmental stages of the treatments. All statistical analyses of the data were performed and means were compared with Duncan's test. The genetic and morphological effects of AGP on the genotypes are discussed on the basis of the results of the variance analysis. The simple correlation coefficient (r) was calculated based on the mean values of the replications.
Teh, V; Sim, K S; Wong, E K
2016-11-01
According to statistics from the World Health Organization (WHO), stroke is one of the major causes of death globally. Computed tomography (CT) scanning is one of the main medical diagnostic systems used for the diagnosis of ischemic stroke. CT scans provide brain images in Digital Imaging and Communications in Medicine (DICOM) format. The presentation of CT brain images relies mainly on the window setting (window center and window width), which converts an image from DICOM format into normal grayscale format. Nevertheless, the ordinary window parameters do not deliver a proper contrast on CT brain images for ischemic stroke detection. In this paper, a new proposed method, namely gamma correction extreme-level eliminating with weighting distribution (GCELEWD), is implemented to improve the contrast of CT brain images. GCELEWD is capable of highlighting the hypodense region for diagnosis of ischemic stroke. The performance of this new proposed technique, GCELEWD, is compared with four existing contrast enhancement techniques: brightness preserving bi-histogram equalization (BBHE), dualistic sub-image histogram equalization (DSIHE), extreme-level eliminating histogram equalization (ELEHE), and adaptive gamma correction with weighting distribution (AGCWD). GCELEWD shows better visualization for ischemic stroke detection and higher scores with image quality assessment (IQA) modules. SCANNING 38:842-856, 2016. © 2016 Wiley Periodicals, Inc.
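The GCELEWD algorithm itself is not reproduced here; as background, a sketch of the general adaptive gamma correction with weighting distribution (AGCWD) family it builds on, for an 8-bit grayscale image. The extreme-level-eliminating step of the paper is omitted.

```python
# Sketch of AGCWD-style enhancement: weight the intensity histogram, build a
# cumulative weighting distribution, and apply a per-level gamma mapping.
import numpy as np

def agcwd(gray, alpha=0.5):
    """gray: uint8 image; returns a contrast-enhanced uint8 image."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    pdf = hist / hist.sum()
    pdf_w = pdf.max() * ((pdf - pdf.min()) / (pdf.max() - pdf.min() + 1e-12)) ** alpha
    cdf_w = np.cumsum(pdf_w) / pdf_w.sum()
    gamma = 1.0 - cdf_w                          # per-level gamma exponent
    levels = np.arange(256) / 255.0
    mapping = np.round(255.0 * levels ** gamma).astype(np.uint8)
    return mapping[gray]
```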
Analysis of real-time vibration data
Safak, E.
2005-01-01
In recent years, a few structures have been instrumented to provide continuous vibration data in real time, recording not only large-amplitude motions generated by extreme loads, but also small-amplitude motions generated by ambient loads. The main objective in continuous recording is to track any changes in structural characteristics, and to detect damage after an extreme event, such as an earthquake or explosion. The Fourier-based spectral analysis methods have been the primary tool to analyze vibration data from structures. In general, such methods do not work well for real-time data, because real-time data are mainly composed of ambient vibrations with very low amplitudes and signal-to-noise ratios. The long duration, linearity, and the stationarity of ambient data, however, allow us to utilize statistical signal processing tools, which can compensate for the adverse effects of low amplitudes and high noise. The analysis of real-time data requires tools and techniques that can be applied in real-time; i.e., data are processed and analyzed while being acquired. This paper presents some of the basic tools and techniques for processing and analyzing real-time vibration data. The topics discussed include utilization of running time windows, tracking mean and mean-square values, filtering, system identification, and damage detection.
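As an illustration of the running statistics mentioned above, a sketch of exponentially weighted mean and RMS tracking that can be updated sample by sample; the forgetting factor is an arbitrary choice:

```python
# Sketch: exponentially weighted running mean and mean-square (RMS)
# estimates, updated one sample at a time as data are acquired.
import numpy as np

class RunningStats:
    def __init__(self, forgetting=0.999):
        self.lam = forgetting
        self.mean = 0.0
        self.mean_sq = 0.0

    def update(self, x):
        self.mean = self.lam * self.mean + (1 - self.lam) * x
        self.mean_sq = self.lam * self.mean_sq + (1 - self.lam) * x * x
        return self.mean, np.sqrt(self.mean_sq)   # running mean and RMS

tracker = RunningStats()
for sample in np.random.default_rng(7).normal(0.0, 0.01, 10_000):
    mean, rms = tracker.update(sample)
```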
NASA Technical Reports Server (NTRS)
Wolf, S. F.; Lipschutz, M. E.
1993-01-01
Multivariate statistical analysis techniques (linear discriminant analysis and logistic regression) can provide powerful discrimination tools which are generally unfamiliar to the planetary science community. Fall parameters were used to identify a group of 17 H chondrites (Cluster 1) that were part of a coorbital stream which intersected Earth's orbit in May, from 1855 - 1895, and can be distinguished from all other H chondrite falls. Using multivariate statistical techniques, it was demonstrated that, by a totally different criterion, labile trace element contents - hence thermal histories - 13 Cluster 1 meteorites are distinguishable from 45 non-Cluster 1 H chondrites. Here, we focus upon the principles of multivariate statistical techniques and illustrate their application using non-meteoritic and meteoritic examples.
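A minimal sketch of that kind of discrimination, linear discriminant analysis and logistic regression separating a 13-member group from 45 other samples, using synthetic stand-in data rather than the published trace element contents:

```python
# Sketch: LDA and logistic regression as two-group discrimination tools
# (synthetic stand-in for labile trace element contents).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(8)
cluster1 = rng.normal(loc=0.0, scale=1.0, size=(13, 6))     # 13 Cluster 1 samples
others = rng.normal(loc=0.8, scale=1.0, size=(45, 6))        # 45 non-Cluster 1
X = np.vstack([cluster1, others])
y = np.array([1] * 13 + [0] * 45)

for model in (LinearDiscriminantAnalysis(), LogisticRegression(max_iter=1000)):
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(type(model).__name__, "cross-validated accuracy:", round(acc, 2))
```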
Code of Federal Regulations, 2010 CFR
2010-07-01
..., other techniques, such as the use of statistical models or the use of historical data could be..., mathematical techniques should be applied to account for the trends to ensure that the expected annual values... emission patterns, either the most recent representative year(s) could be used or statistical techniques or...
Li, Da; Liang, Li; Zhang, Jing; Kang, Tingguo
2015-01-01
Background: Quality control is one of the bottleneck problems limiting the application and development of traditional Chinese medicine (TCM). In recent years, microscopy and high-performance liquid chromatography (HPLC) techniques have been frequently applied in the quality control of TCM. However, studies combining conventional microscopy and HPLC techniques for the quality control of the flower bud of Tussilago farfara L. (Kuandonghua) have not been reported. Objective: This study was undertaken to evaluate the quality of the flower bud of T. farfara L. and to establish the relationships between the quantity of pollen grains and four main bioactive constituents: tussilagone, chlorogenic acid, rutin and isoquercitrin. Materials and Methods: In this study, microscopic examination was used to quantify microscopic characteristics of the flower bud of T. farfara L., and the chemical components were determined by HPLC. The data were analyzed by Statistical Package for the Social Sciences statistics software. Results: The results of the analysis showed that tussilagone, chlorogenic acid, rutin and isoquercitrin were significantly correlated with the quantity of pollen grains in the flower bud of T. farfara L. There is a positive correlation between them. From these results, it can be deduced that the flower bud of T. farfara L. with a greater quantity of pollen grains should be of better quality. Conclusion: The study showed that the established method can be helpful for evaluating the quality of the flower bud of T. farfara L. based on microscopic characteristic constants and chemical quantitation. PMID:26246737
Forest Statistics for Maine, 1995
Douglas M. Griffith; Carol L. Alerich
1996-01-01
A statistical report on the fourth forest inventory of Maine conducted in 1994-96. Findings are displayed in 117 tables containing estimates of forest area, numbers of trees, timber volume, and growth. Data are presented at three levels: state, geographic unit, and county.
The Use of a Context-Based Information Retrieval Technique
2009-07-01
provided in context. Latent Semantic Analysis (LSA) is a statistical technique for inferring contextual and structural information, and previous studies...WAIS). Latent Semantic Analysis: LSA, which is also known as latent semantic indexing (LSI), uses a statistical and... Language Models: In contrast, natural language models apply algorithms that combine statistical information with semantic information. Semantic
[Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].
Golder, W
1999-09-01
To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basal and advanced statistics. Sample size, significance level and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basal, and 0.5% advanced statistics. The corresponding figures in 1993 and 1998 are 62 and 49%, 32 and 41%, and 6 and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors increasingly make use of statistical experts' opinions and programs. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.
NASA Technical Reports Server (NTRS)
Fomenkova, M. N.
1997-01-01
The computer-intensive project consisted of the analysis and synthesis of existing data on composition of comet Halley dust particles. The main objective was to obtain a complete inventory of sulfur containing compounds in the comet Halley dust by building upon the existing classification of organic and inorganic compounds and applying a variety of statistical techniques for cluster and cross-correlational analyses. A student hired for this project wrote and tested the software to perform cluster analysis. The following tasks were carried out: (1) selecting the data from existing database for the proposed project; (2) finding access to a standard library of statistical routines for cluster analysis; (3) reformatting the data as necessary for input into the library routines; (4) performing cluster analysis and constructing hierarchical cluster trees using three methods to define the proximity of clusters; (5) presenting the output results in different formats to facilitate the interpretation of the obtained cluster trees; (6) selecting groups of data points common for all three trees as stable clusters. We have also considered the chemistry of sulfur in inorganic compounds.
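A minimal sketch of the clustering workflow described above, assuming a generic feature matrix in place of the compositional database: hierarchical trees are built with three different proximity (linkage) definitions, and point pairs co-clustered in all three trees are treated as stable clusters.

```python
# Illustrative sketch of the clustering workflow described above: build
# hierarchical cluster trees with three different linkage (proximity)
# definitions and keep groups of points that stay together in all three.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
X = rng.standard_normal((60, 5))            # stand-in for dust-particle compositions

labels = {}
for method in ("single", "complete", "average"):   # three proximity definitions
    Z = linkage(X, method=method)
    labels[method] = fcluster(Z, t=4, criterion="maxclust")

# "Stable" pairs: points assigned to the same cluster under all three methods
n = len(X)
stable_pairs = [(i, j) for i in range(n) for j in range(i + 1, n)
                if all(labels[m][i] == labels[m][j] for m in labels)]
print(len(stable_pairs), "point pairs are co-clustered in all three trees")
```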
NASA Astrophysics Data System (ADS)
Yuksel, Kivanc; Chang, Xin; Skarbek, Władysław
2017-08-01
A novel smile recognition algorithm is presented based on extraction of 68 facial salient points (fp68) using an ensemble of regression trees. The smile detector exploits a Support Vector Machine linear model. It is trained with a few hundred exemplar images by the SVM algorithm working in a 136-dimensional space. It is shown by strict statistical data analysis that such a geometric detector strongly depends on the geometry of the mouth opening area, measured by triangulation of the outer lip contour. To this end, two Bayesian detectors were developed and compared with the SVM detector. The first uses the mouth area in the 2D image, while the second refers to the mouth area in a 3D animated face model. The 3D modeling is based on the Candide-3 model and it is performed in real time along with the three smile detectors and statistics estimators. The mouth area/Bayesian detectors exhibit high correlation with the fp68/SVM detector in the range [0.8, 1.0], depending mainly on light conditions and individual features, with an advantage for the 3D technique, especially in hard light conditions.
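A hedged sketch of the SVM stage, assuming 136-dimensional landmark vectors (68 points times 2 coordinates) as input features; the training data below are synthetic stand-ins, not the exemplar images used by the authors.

```python
# Minimal sketch of a linear SVM smile detector trained on 136-dimensional
# facial-landmark vectors (68 points x 2 coordinates); the data here are
# synthetic stand-ins, not the exemplar images used in the paper.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(7)
n_smile, n_neutral = 300, 300
X_smile = rng.normal(0.2, 1.0, size=(n_smile, 136))
X_neutral = rng.normal(-0.2, 1.0, size=(n_neutral, 136))
X = np.vstack([X_smile, X_neutral])
y = np.array([1] * n_smile + [0] * n_neutral)

detector = LinearSVC(C=1.0, max_iter=10000).fit(X, y)   # SVM linear model in 136-D space
print("training accuracy:", detector.score(X, y))
```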
Aoyagi, Miki; Nagata, Kenji
2012-06-01
The term algebraic statistics arises from the study of probabilistic models and techniques for statistical inference using methods from algebra and geometry (Sturmfels, 2009). The purpose of our study is to consider the generalization error and stochastic complexity in learning theory by using the log-canonical threshold in algebraic geometry. Such thresholds correspond to the main term of the generalization error in Bayesian estimation, which is called a learning coefficient (Watanabe, 2001a, 2001b). The learning coefficient serves to measure the learning efficiencies in hierarchical learning models. In this letter, we consider learning coefficients for Vandermonde matrix-type singularities, by using a new approach: focusing on the generators of the ideal, which defines singularities. We give tight new bound values of learning coefficients for the Vandermonde matrix-type singularities and the explicit values with certain conditions. By applying our results, we can show the learning coefficients of three-layered neural networks and normal mixture models.
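For readers unfamiliar with the role of the learning coefficient, the following hedged summary of Watanabe's singular learning theory (a general result, not one specific to this letter) shows where the log-canonical threshold lambda enters the generalization error and the stochastic complexity:

```latex
% Hedged summary of the role of the learning coefficient \lambda (the
% log-canonical threshold) in Bayesian singular learning theory:
\[
  \mathbb{E}[G_n] \;=\; \frac{\lambda}{n} + o\!\left(\frac{1}{n}\right),
  \qquad
  F_n \;=\; n S_n + \lambda \log n - (m-1)\log\log n + O_p(1),
\]
% where G_n is the generalization error, F_n the stochastic complexity
% (Bayesian free energy), S_n the empirical entropy, and m the multiplicity
% of the log-canonical threshold.
```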
A Monte Carlo–Based Bayesian Approach for Measuring Agreement in a Qualitative Scale
Pérez Sánchez, Carlos Javier
2014-01-01
Agreement analysis has been an active research area whose techniques have been widely applied in psychology and other fields. However, statistical agreement among raters has been mainly considered from a classical statistics point of view. Bayesian methodology is a viable alternative that allows the inclusion of subjective initial information coming from expert opinions, personal judgments, or historical data. A Bayesian approach is proposed by providing a unified Monte Carlo–based framework to estimate all types of measures of agreement in a qualitative scale of response. The approach is conceptually simple and it has a low computational cost. Both informative and non-informative scenarios are considered. In case no initial information is available, the results are in line with the classical methodology, but provide more information on the measures of agreement. For the informative case, some guidelines are presented to elicit the prior distribution. The approach has been applied to two applications related to schizophrenia diagnosis and sensory analysis. PMID:29881002
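The following sketch illustrates the general idea of a Monte Carlo-based Bayesian agreement analysis, using Cohen's kappa as one example agreement measure with a Dirichlet posterior over the cells of a two-rater contingency table; the counts and the non-informative prior are illustrative assumptions, not the paper's exact framework.

```python
# Conceptual sketch of a Monte Carlo-based Bayesian agreement analysis:
# place a Dirichlet prior on the cells of a two-rater contingency table,
# sample from the posterior, and compute Cohen's kappa for each draw.
# (Cohen's kappa is one agreement measure; the paper covers a wider family.)
import numpy as np

counts = np.array([[40, 5, 2],
                   [6, 30, 4],
                   [1, 3, 20]])            # hypothetical 3-category ratings
prior = np.ones_like(counts, dtype=float)  # non-informative Dirichlet(1) prior

rng = np.random.default_rng(0)
draws = rng.dirichlet((counts + prior).ravel(), size=10000)

kappas = []
for p in draws:
    P = p.reshape(counts.shape)
    po = np.trace(P)                               # observed agreement
    pe = float(P.sum(axis=1) @ P.sum(axis=0))      # chance agreement
    kappas.append((po - pe) / (1.0 - pe))

kappas = np.array(kappas)
print("posterior mean kappa:", kappas.mean())
print("95% credible interval:", np.percentile(kappas, [2.5, 97.5]))
```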
NASA Astrophysics Data System (ADS)
Pan, X.; Uno, I.; Wang, Z.; Nishizawa, T.; Sugimoto, N.; Yamamoto, S.; Kobayashi, H.; Sun, Y.; Fu, P.; Tang, X.; Wang, Z.
2017-12-01
Natural mineral dust and heavy anthropogenic pollution and their complex interactions cause significant environmental problems in East Asia. Due to restrictions of observing techniques, the real-time morphological change in Asian dust particles owing to the coating process of anthropogenic pollutants is still statistically unclear. Here, we first used a newly developed single-particle polarization detector to quantitatively investigate the evolution of the polarization property of backscattered light from dust particles as they mixed with anthropogenic pollutants in North China. The decrease in observed depolarization ratio is mainly attributed to the decrease of the aspect ratio of the dust particles as a result of continuous coating processes. Hygroscopic growth of calcium nitrate (Ca(NO3)2) on the surface of the dust particles played a vital role, particularly when they were stagnant in the polluted region under high RH conditions. Reliable statistics highlight the significant importance of internally mixed, 'quasi-spherical' Asian dust particles, which markedly act as cloud condensation nuclei and exert effects on regional climate.
Pan, Xiaole; Uno, Itsushi; Wang, Zhe; Nishizawa, Tomoaki; Sugimoto, Nobuo; Yamamoto, Shigekazu; Kobayashi, Hiroshi; Sun, Yele; Fu, Pingqing; Tang, Xiao; Wang, Zifa
2017-03-23
Natural mineral dust and heavy anthropogenic pollution and their complex interactions cause significant environmental problems in East Asia. Due to restrictions of observing techniques, the real-time morphological change in Asian dust particles owing to the coating process of anthropogenic pollutants is still statistically unclear. Here, we first used a newly developed single-particle polarization detector to quantitatively investigate the evolution of the polarization property of backscattered light from dust particles as they mixed with anthropogenic pollutants in North China. The decrease in observed depolarization ratio is mainly attributed to the decrease of the aspect ratio of the dust particles as a result of continuous coating processes. Hygroscopic growth of calcium nitrate (Ca(NO3)2) on the surface of the dust particles played a vital role, particularly when they were stagnant in the polluted region under high RH conditions. Reliable statistics highlight the significant importance of internally mixed, 'quasi-spherical' Asian dust particles, which markedly act as cloud condensation nuclei and exert effects on regional climate.
Response statistics of rotating shaft with non-linear elastic restoring forces by path integration
NASA Astrophysics Data System (ADS)
Gaidai, Oleg; Naess, Arvid; Dimentberg, Michael
2017-07-01
Extreme statistics of random vibrations is studied for a Jeffcott rotor under uniaxial white noise excitation. The restoring force is modelled as elastic and non-linear; a comparison is made with a linearized restoring force to see the effect of the force non-linearity on the response statistics. While for the linear model analytical solutions and stability conditions are available, this is not generally the case for the non-linear system except in some special cases. The statistics of the non-linear case are studied by applying the path integration (PI) method, which is based on the Markov property of the coupled dynamic system. The Jeffcott rotor response statistics can be obtained by solving the Fokker-Planck (FP) equation of the 4D dynamic system. An efficient implementation of the PI algorithm is applied; namely, the fast Fourier transform (FFT) is used to simulate the dynamic system's additive noise. The latter allows a significant reduction in computational time compared to classical PI. Excitation is modelled as Gaussian white noise; however, white noise with any distribution can be implemented with the same PI technique. Multidirectional Markov noise can also be modelled with PI in the same way as unidirectional noise. PI is accelerated by using a Monte Carlo (MC) estimated joint probability density function (PDF) as the initial input. Symmetry of the dynamic system was utilized to afford higher mesh resolution. Both internal (rotating) and external damping are included in the mechanical model of the rotor. The main advantage of using PI rather than MC is that PI offers high accuracy in the tail of the probability distribution. The latter is of critical importance for, e.g., extreme value statistics, system reliability, and first passage probability.
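As a reminder of how path integration works, the one-step propagation of the response probability density (a generic sketch of the scheme, not the paper's specific discretization) can be written as:

```latex
% One-step propagation of the response PDF used in path integration
% (a sketch of the general scheme, not the paper's exact discretization):
\[
  p\!\left(\mathbf{x},\, t + \Delta t\right)
  \;=\;
  \int_{\mathbb{R}^{4}}
  p\!\left(\mathbf{x},\, t + \Delta t \,\middle|\, \mathbf{x}',\, t\right)\,
  p\!\left(\mathbf{x}',\, t\right)\, d\mathbf{x}',
\]
% where the transition density follows from the short-time Markov property
% of the 4D state vector; iterating this step yields the stationary PDF
% solving the associated Fokker-Planck equation.
```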
Factors contributing to academic achievement: a Bayesian structure equation modelling study
NASA Astrophysics Data System (ADS)
Payandeh Najafabadi, Amir T.; Omidi Najafabadi, Maryam; Farid-Rohani, Mohammad Reza
2013-06-01
In Iran, high school graduates enter university after taking a very difficult entrance exam called the Konkoor. Therefore, only the top-performing students are admitted by universities to continue their bachelor's education in statistics. Surprisingly, statistically, most such students fall into the following categories: (1) do not succeed in their education despite their excellent performance on the Konkoor and in high school; (2) graduate with a grade point average (GPA) that is considerably lower than their high school GPA; (3) continue their master's education in majors other than statistics; and (4) try to find jobs unrelated to statistics. This article employs a well-known and powerful statistical technique, Bayesian structural equation modelling (SEM), to study the academic success of recent graduates who have studied statistics at Shahid Beheshti University in Iran. This research: (i) considered academic success as a latent variable, which was measured by GPA and other indicators of academic success (see below) of students in the target population; (ii) employed Bayesian SEM, which works properly for small sample sizes and ordinal variables; (iii) developed, based on the literature, five main factors that affect academic success; and (iv) considered several standard psychological tests and measured characteristics such as 'self-esteem' and 'anxiety'. We then study the impact of such factors on the academic success of the target population. Six factors that positively impact student academic success were identified in the following order of relative impact (from greatest to least): 'Teaching-Evaluation', 'Learner', 'Environment', 'Family', 'Curriculum' and 'Teaching Knowledge'. Particularly influential variables within each factor have also been noted.
Onay, Ulaş; Akpınar, Sercan; Akgün, Rahmi Can; Balçık, Cenk; Tuncay, Ismail Cengiz
2013-01-01
The aim of this study was to compare new knotless single-row and double-row suture anchor techniques with traditional transosseous suture techniques for different sized rotator cuff tears in an animal model. The study included 56 cadaveric sheep shoulders. Supraspinatus cuff tears of 1 cm repaired with new knotless single-row suture anchor technique and supraspinatus and infraspinatus rotator cuff tears of 3 cm repaired with double-row suture anchor technique were compared to traditional transosseous suture techniques and control groups. The repaired tendons were loaded with 5 mm/min static velocity with 2.5 kgN load cell in Instron 8874 machine until the repair failure. The 1 cm transosseous group was statistically superior to 1 cm control group (p=0.021, p<0.05) and the 3 cm SpeedBridge group was statistically superior to the 1 cm SpeedFix group (p=0.012, p<0.05). The differences between the other groups were not statistically significant. No significant difference was found between the new knotless suture anchor techniques and traditional transosseous suture techniques.
ERIC Educational Resources Information Center
Soule, Margaret
This survey of the current status of public school libraries in Maine was intended to provide statistical data as a basis for improving the school library media center program in these schools. Information was gathered that detailed how resources and delivery of services differed across grade level; across variation in size of school; between…
Thompson, Geoffrey A; Luo, Qing; Hefti, Arthur
2013-12-01
Previous studies have shown casting methodology to influence the as-cast properties of dental casting alloys. It is important to consider clinically important mechanical properties so that the influence of casting can be clarified. The purpose of this study was to evaluate how torch/centrifugal and induction/vacuum-pressure casting machines may affect the castability, microhardness, chemical composition, and microstructure of 2 high noble, 1 noble, and 1 base metal dental casting alloys. Two commonly used methods for casting were selected for comparison: torch/centrifugal casting and inductively heated/vacuum-pressure casting. One hundred and twenty castability patterns were fabricated and divided into 8 groups. Four groups were torch/centrifugally cast in Olympia (O), Jelenko O (JO), Genesis II (G), and Liberty (L) alloys. Similarly, 4 groups were cast in O, JO, G, and L by an induction/vacuum-pressure casting machine. Each specimen was evaluated for casting completeness to determine a castability value, while porosity was determined by standard x-ray techniques. Each group was metallographically prepared for further evaluation that included chemical composition, Vickers microhardness, and grain analysis of microstructure. Two-way ANOVA was used to determine significant differences among the main effects. Statistically significant effects were examined further with the Tukey HSD procedure for multiple comparisons. Data obtained from the castability experiments were non-normal and the variances were unequal. They were analyzed statistically with the Kruskal-Wallis rank sum test. Significant results were further investigated statistically with the Steel-Dwass method for multiple comparisons (α=.05). The alloy type had a significant effect on surface microhardness (P<.001). In contrast, the technique used for casting did not affect the microhardness of the test specimen (P=.465). Similarly, the interaction between the alloy and casting technique was not significant (P=.119). A high level of castability (98.5% on average) was achieved overall. The frequency of casting failures as a function of alloy type and casting method was determined. Failure was defined as a castability index score of <100%. Three of 28 possible comparisons between alloy and casting combinations were statistically significant. The results suggested that casting technique affects the castability index of alloys. Radiographic analysis detected large porosities in regions near the edge of the castability pattern and infrequently adjacent to noncast segments. All castings acquired traces of elements found in the casting crucibles. The grain size for each dental casting alloy was generally finer for specimens produced by the induction/vacuum-pressure method. The difference was substantial for JO and L. This study demonstrated a relation between casting techniques and some physical properties of metal ceramic casting alloys. Copyright © 2013 Editorial Council for the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.
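A hedged sketch of the statistical workflow described above (two-way ANOVA on microhardness with alloy and casting technique as factors, followed by Tukey HSD multiple comparisons), using a synthetic data frame in place of the measured values:

```python
# Hedged sketch of the statistical workflow described above: a two-way ANOVA
# on microhardness with alloy and casting technique as factors, followed by
# Tukey HSD multiple comparisons. The data frame below is synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(3)
alloys = ["O", "JO", "G", "L"]
methods = ["torch", "induction"]
rows = [{"alloy": a, "method": m,
         "hardness": 200 + 20 * alloys.index(a) + rng.normal(0, 5)}
        for a in alloys for m in methods for _ in range(5)]
df = pd.DataFrame(rows)

model = smf.ols("hardness ~ C(alloy) * C(method)", data=df).fit()
print(anova_lm(model, typ=2))                        # main effects and interaction
print(pairwise_tukeyhsd(df["hardness"], df["alloy"]))
```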
Use of Statistical Analyses in the Ophthalmic Literature
Lisboa, Renato; Meira-Freitas, Daniel; Tatham, Andrew J.; Marvasti, Amir H.; Sharpsten, Lucie; Medeiros, Felipe A.
2014-01-01
Purpose: To identify the most commonly used statistical analyses in the ophthalmic literature and to determine the likely gain in comprehension of the literature that readers could expect if they were to sequentially add knowledge of more advanced techniques to their statistical repertoire. Design: Cross-sectional study. Methods: All articles published from January 2012 to December 2012 in Ophthalmology, American Journal of Ophthalmology and Archives of Ophthalmology were reviewed. A total of 780 peer-reviewed articles were included. Two reviewers examined each article and assigned categories to each one depending on the type of statistical analyses used. Discrepancies between reviewers were resolved by consensus. Main Outcome Measures: Total number and percentage of articles containing each category of statistical analysis were obtained. Additionally we estimated the accumulated number and percentage of articles that a reader would be expected to be able to interpret depending on their statistical repertoire. Results: Readers with little or no statistical knowledge would be expected to be able to interpret the statistical methods presented in only 20.8% of articles. In order to understand more than half (51.4%) of the articles published, readers were expected to be familiar with at least 15 different statistical methods. Knowledge of 21 categories of statistical methods was necessary to comprehend 70.9% of articles, while knowledge of more than 29 categories was necessary to comprehend more than 90% of articles. Articles in retina and glaucoma subspecialties showed a tendency for using more complex analysis when compared to cornea. Conclusions: Readers of clinical journals in ophthalmology need to have substantial knowledge of statistical methodology to understand the results of published studies in the literature. The frequency of use of complex statistical analyses also indicates that those involved in the editorial peer-review process must have sound statistical knowledge in order to critically appraise articles submitted for publication. The results of this study could provide guidance to direct the statistical learning of clinical ophthalmologists, researchers and educators involved in the design of courses for residents and medical students. PMID:24612977
Design of order statistics filters using feedforward neural networks
NASA Astrophysics Data System (ADS)
Maslennikova, Yu. S.; Bochkarev, V. V.
2016-08-01
In recent years significant progress has been made in the development of nonlinear data processing techniques. Such techniques are widely used in digital data filtering and image enhancement. Many of the most effective nonlinear filters are based on order statistics. The widely used median filter is the best known order statistic filter. A generalized form of these filters can be derived based on Lloyd's statistics. Filters based on order statistics have excellent robustness properties in the presence of impulsive noise. In this paper, we present a special approach for the synthesis of order statistics filters using artificial neural networks. Optimal Lloyd's statistics are used for selecting the initial weights of the neural network. The adaptive properties of neural networks provide opportunities to optimize order statistics filters for data with asymmetric distribution functions. Different examples demonstrate the properties and performance of the presented approach.
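A minimal sketch of an order-statistics (L-) filter may help: the output is a weighted sum of the sorted samples in a sliding window, with the median filter as the special case that puts all weight on the middle order statistic. In the paper the weights are optimized by a neural network initialized from Lloyd's statistics; here they are simply fixed for illustration.

```python
# Minimal sketch of an order-statistics (L-) filter: sort the samples in a
# sliding window and take a weighted sum of the sorted values. The median
# filter is the special case with all weight on the middle order statistic.
import numpy as np

def l_filter(x, weights):
    """Apply an L-filter with the given weights over the sorted window."""
    w = np.asarray(weights, dtype=float)
    k = len(w)
    pad = k // 2
    xp = np.pad(x, pad, mode="edge")
    y = np.empty_like(x, dtype=float)
    for i in range(len(x)):
        window = np.sort(xp[i:i + k])
        y[i] = window @ w
    return y

signal = np.sin(np.linspace(0, 6, 200))
noisy = signal.copy()
noisy[::17] += 3.0                                  # impulsive noise
median_weights = np.array([0, 0, 1, 0, 0])          # median filter (window of 5)
print(np.max(np.abs(l_filter(noisy, median_weights) - signal)))
```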
Statistical evaluation of vibration analysis techniques
NASA Technical Reports Server (NTRS)
Milner, G. Martin; Miller, Patrice S.
1987-01-01
An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
1985-09-01
A Feasibility Study of the Collection of Unscheduled Maintenance Data Using Statistical Sampling Techniques. Thesis, Robert A. Heinlein, Captain, USAF, AFIT/GLM/LSM/85S-32. Distribution Statement A: approved for public release.
SVS: data and knowledge integration in computational biology.
Zycinski, Grzegorz; Barla, Annalisa; Verri, Alessandro
2011-01-01
In this paper we present a framework for structured variable selection (SVS). The main concept of the proposed schema is to take a step towards the integration of two different aspects of data mining: database and machine learning perspective. The framework is flexible enough to use not only microarray data, but other high-throughput data of choice (e.g. from mass spectrometry, microarray, next generation sequencing). Moreover, the feature selection phase incorporates prior biological knowledge in a modular way from various repositories and is ready to host different statistical learning techniques. We present a proof of concept of SVS, illustrating some implementation details and describing current results on high-throughput microarray data.
Electron Impact Excitation of the Electronic States of Water
NASA Astrophysics Data System (ADS)
Thorn, Penny; Diakomichalis, N.; Brunger, M. J.; Campbell, L.; Teubner, P. J. O.; Kato, H.; Makochekanwa, C.; Hoshino, M.; Tanaka, H.
2006-10-01
We report differential and integral cross sections for excitation of the lowest lying ^3B1, ^1B1, ^3A1 and ^1A1 electronic states of water. The energy range of these measurements is 15-50 eV and the angular range of the DCS measurements is 10-90°. From these DCS the corresponding ICS are calculated using a molecular phase shift analysis technique. Where possible, comparison is made to the results of available theory. One of the main objectives of this study is to perform statistical equilibrium calculations to determine whether the origin of the OH Meinel bands in our atmosphere is due to electron driven processes.
Rathbun, R.E.; Grant, R. Stephen
1978-01-01
There are advantages and disadvantages to both the radioactive and modified tracer techniques. The main advantage of the radioactive technique is that the tracer gas is chemically inert; the main disadvantage is that a radioactive isotope of the gas must be used to obtain the necessary analytical sensitivity. The main advantage of the modified technique is that radioactive tracers are not necessary; the main disadvantage is that the hydrocarbon tracer gases may be subject to biological degradation and sorption losses. Results of this comparison study suggest that the modified technique is a promising alternative to the use of radioactive tracers.
Electron Dropout Echoes Induced by Interplanetary Shock: A Statistical Study
NASA Astrophysics Data System (ADS)
Liu, Z.; Zong, Q.; Hao, Y.; Zhou, X.; Ma, X.; Liu, Y.
2017-12-01
"Electron dropout echo" as indicated by repeated moderate dropout and recovery signatures of the flux of energetic electron in the out radiation belt region has been investigated systematically. The electron dropout and its echoes are usually found for higher energy (> 300 keV) channels fluxes, whereas the flux enhancements are obvious for lower energy electrons simultaneously after the interplanetary shock arrives at the Earth's geosynchronous orbit. 104 dropout echo events have been found from 215 interplanetary shock events from 1998 to 2007 based on LANL satellite data. In analogy to substorm injections, these 104 events could be naturally divided into two categories: dispersionless (49 events) or dispersive (55 events) according to the energy dispersion of the initial dropout. It is found that locations of dispersionless events are distributed mainly in the duskside magnetosphere. Further, the obtained locations derived from dispersive events with the time-of-flight technique of the initial dropout regions are mainly located at the duskside as well. Statistical studies have shown that the effect of shock normal, interplanetary magnetic field Bz and solar wind dynamic pressure may be insignificant to these electron dropout events. We suggest that the electric field impulse induced by the IP shock produces a more pronounced inward migration of electrons at the dusk side, resulting in the observed dusk-side moderate dropout of electron flux and its consequent echoes.
NASA Astrophysics Data System (ADS)
Zan, Tao; Wang, Min; Hu, Jianzhong
2010-12-01
Multi-sensor machining status monitoring can acquire and analyze machining process information to implement abnormality diagnosis and fault warning. Statistical quality control is normally used to distinguish abnormal fluctuations from normal fluctuations by statistical methods. In this paper, by comparing the advantages and disadvantages of the two methods, the necessity and feasibility of their integration and fusion are discussed. An approach is then put forward that integrates multi-sensor status monitoring and statistical process control based on artificial intelligence, internet and database techniques. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and AE signal information of the wheel dressing process, the cause of machining quality fluctuation was identified. The experimental results indicate that the approach is suitable for the status monitoring and analysis of machining processes.
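To illustrate the statistical-process-control half of the integrated approach, the sketch below computes Shewhart-style 3-sigma control limits from an in-control reference window and flags out-of-control monitoring samples; the signal and limits are illustrative assumptions, not the MoniSysOnline implementation.

```python
# Illustrative sketch of the statistical-process-control half of the
# approach: Shewhart-style 3-sigma control limits estimated from an
# in-control reference window, then applied to new monitoring samples.
import numpy as np

rng = np.random.default_rng(5)
reference = rng.normal(10.0, 0.2, size=200)     # in-control machining signal
mu, sigma = reference.mean(), reference.std(ddof=1)
ucl, lcl = mu + 3 * sigma, mu - 3 * sigma       # 3-sigma control limits

new_samples = rng.normal(10.0, 0.2, size=50)
new_samples[30:] += 1.0                         # simulated abnormal fluctuation
alarms = np.flatnonzero((new_samples > ucl) | (new_samples < lcl))
print("first out-of-control sample index:", alarms[0] if alarms.size else None)
```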
39 CFR 3050.1 - Definitions applicable to this part.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., mathematical, or statistical theory, precept, or assumption applied by the Postal Service in producing a... manipulation technique whose validity does not require the acceptance of a particular economic, mathematical, or statistical theory, precept, or assumption. A change in quantification technique should not change...
NASA Astrophysics Data System (ADS)
Han, Xiuzhen; Ma, Jianwen; Bao, Yuhai
2006-12-01
Currently, operational locust monitoring systems focus mainly on post-hazard monitoring and assessment; finding an effective way to perform early warning and prediction has greater practical value. Based on two years (2001-2002) of continuous field sampling and statistics covering the three phases of locust egg hatching, nymph growth and adults, together with spectral measurements and synchronous remote sensing data processing, we propose the viewpoint of three-stage remote sensing monitoring of locust hazards. Based on this viewpoint we designed remote sensing monitoring in three stages: (1) during the egg hatching phase, remote sensing can retrieve parameters of land surface temperature (LST) and soil moisture; (2) during the nymph growth phase, when locust appetite increases greatly, remote sensing can calculate vegetation index, leaf area index and vegetation cover and analyze their changes; (3) during the adult phase, locusts move and assemble towards ponds and water ditches as well as areas with less than 75% vegetation cover, and remote sensing combined with field data can monitor and predict potential areas for adult locusts to assemble. In this way the advantages of remote sensing technology are exploited effectively, and it also provides technical support for the locust monitoring system. The ideas and techniques used in the study can also be used as a reference for other plant diseases and insect pests.
Li, Siyue; Zhang, Quanfa
2010-04-15
A data matrix (4032 observations), obtained during a 2-year monitoring period (2005-2006) from 42 sites in the upper Han River, is subjected to various multivariate statistical techniques including cluster analysis, principal component analysis (PCA), factor analysis (FA), correlation analysis and analysis of variance to determine the spatial characterization of dissolved trace elements and heavy metals. Our results indicate that waters in the upper Han River are primarily polluted by Al, As, Cd, Pb, Sb and Se, and the potential pollutants include Ba, Cr, Hg, Mn and Ni. The spatial distribution of trace metals indicates the polluted sections mainly concentrate in the Danjiang, Danjiangkou Reservoir catchment and Hanzhong Plain, and the most contaminated river is in the Hanzhong Plain. Q-mode clustering depends on the geographical location of sampling sites and groups the 42 sampling sites into four clusters, i.e., Danjiang, Danjiangkou Reservoir region (lower catchment), upper catchment and one river in the headwaters, pertaining to water quality. The headwaters, Danjiang and lower catchment, and upper catchment correspond to highly polluted, moderately polluted and relatively low-polluted regions, respectively. Additionally, PCA/FA and correlation analysis demonstrates that Al, Cd, Mn, Ni, Fe, Si and Sr are controlled by natural sources, whereas the other metals appear to be primarily controlled by anthropogenic origins, though geogenic sources also contribute to them. 2009 Elsevier B.V. All rights reserved.
Securizing data linkage in french public statistics.
Guesdon, Maxence; Benzenine, Eric; Gadouche, Kamel; Quantin, Catherine
2016-10-06
Administrative records in France, especially medical and social records, have huge potential for statistical studies. The NIR (a national identifier) is widely used in medico-social administrations, and this would theoretically provide considerable scope for data matching, on condition that the legislation on such matters was respected. The law, however, forbids the processing of non-anonymized medical data, thus making it difficult to carry out studies that require several sources of social and medical data. We would like to benefit from computer techniques introduced since the 1970s to provide safe linkage of anonymized files and to relax the current constraints of such procedures. We propose an organization and a data workflow, based on hashing and cryptographic techniques, to strongly compartmentalize identifying and non-identifying data. The proposed method offers strong control over who is in possession of which information, using different hashing keys for each linkage. This makes it possible to prevent unauthorized linkage of data and to protect anonymity by preventing the accumulation of non-identifying data, which can become identifying when linked. Our proposal would make it possible to conduct such studies more easily, more regularly and more precisely while preserving a high enough level of anonymity. The main obstacle to setting up such a system, in our opinion, is not technical, but rather organizational, in that it is based on the existence of a Key-Management Authority.
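A conceptual sketch of the keyed-hashing idea, assuming one secret key per authorized linkage so that pseudonyms from different linkages cannot be cross-matched; this is a simplification for illustration, not the exact workflow or key-management protocol proposed in the paper.

```python
# Conceptual sketch of keyed hashing for linkage: each authorized linkage
# uses its own secret key, so pseudonyms from different linkages cannot be
# matched against each other. This is a simplification of the organization
# described above, not its exact protocol.
import hmac
import hashlib

def pseudonym(identifier: str, linkage_key: bytes) -> str:
    """Derive a linkage-specific pseudonym from an identifier (e.g. the NIR)."""
    return hmac.new(linkage_key, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

key_study_a = b"secret-key-held-by-key-management-authority-A"
key_study_b = b"secret-key-held-by-key-management-authority-B"

nir = "1850775123456"                          # fictitious identifier
print(pseudonym(nir, key_study_a))             # usable only within study A
print(pseudonym(nir, key_study_b))             # different pseudonym for study B
```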
A new statistical framework to assess structural alignment quality using information compression
Collier, James H.; Allison, Lloyd; Lesk, Arthur M.; Garcia de la Banda, Maria; Konagurthu, Arun S.
2014-01-01
Motivation: Progress in protein biology depends on the reliability of results from a handful of computational techniques, structural alignments being one. Recent reviews have highlighted substantial inconsistencies and differences between alignment results generated by the ever-growing stock of structural alignment programs. The lack of consensus on how the quality of structural alignments must be assessed has been identified as the main cause for the observed differences. Current methods assess structural alignment quality by constructing a scoring function that attempts to balance conflicting criteria, mainly alignment coverage and fidelity of structures under superposition. This traditional approach to measuring alignment quality, the subject of considerable literature, has failed to solve the problem. Further development along the same lines is unlikely to rectify the current deficiencies in the field. Results: This paper proposes a new statistical framework to assess structural alignment quality and significance based on lossless information compression. This is a radical departure from the traditional approach of formulating scoring functions. It links the structural alignment problem to the general class of statistical inductive inference problems, solved using the information-theoretic criterion of minimum message length. Based on this, we developed an efficient and reliable measure of structural alignment quality, I-value. The performance of I-value is demonstrated in comparison with a number of popular scoring functions, on a large collection of competing alignments. Our analysis shows that I-value provides a rigorous and reliable quantification of structural alignment quality, addressing a major gap in the field. Availability: http://lcb.infotech.monash.edu.au/I-value Contact: arun.konagurthu@monash.edu Supplementary information: Online supplementary data are available at http://lcb.infotech.monash.edu.au/I-value/suppl.html PMID:25161241
Methods for trend analysis: Examples with problem/failure data
NASA Technical Reports Server (NTRS)
Church, Curtis K.
1989-01-01
Statistics are emphasized as playing an important role in quality control and reliability. Consequently, the NASA standard Trend Analysis Techniques recommended a variety of statistical methodologies that could be applied to time series data. The major goal of this working handbook, using data from the MSFC Problem Assessment System, is to illustrate some of the techniques in the NASA standard and some additional techniques, and to identify patterns in the data. The trend-estimation techniques used are regression (exponential, power, reciprocal, straight line) and Kendall's rank correlation coefficient. The important details of a statistical strategy for estimating a trend component are covered in the examples. However, careful analysis and interpretation are necessary because of small samples and frequent zero problem reports in a given time period. Further investigations to deal with these issues are being conducted.
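A hedged sketch of two of the trend-estimation techniques named above (a straight-line least-squares fit and Kendall's rank correlation), applied to a synthetic monthly problem-report series rather than the MSFC data:

```python
# Hedged sketch of two of the trend-estimation techniques named above:
# a straight-line least-squares fit and Kendall's rank correlation applied
# to a synthetic monthly problem-report count series.
import numpy as np
from scipy.stats import kendalltau, linregress

rng = np.random.default_rng(11)
t = np.arange(36)                                   # 36 monthly periods
reports = np.clip(np.round(5 + 0.2 * t + rng.normal(0, 2, size=t.size)), 0, None)

fit = linregress(t, reports)                        # straight-line trend
tau, p_value = kendalltau(t, reports)               # nonparametric trend test
print(f"slope={fit.slope:.3f}, tau={tau:.3f}, p={p_value:.4f}")
```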
Palazón, L; Navas, A
2017-06-01
Information on sediment contribution and transport dynamics from the contributing catchments is needed to develop management plans to tackle environmental problems related to the effects of fine sediment, such as reservoir siltation. In this respect, the fingerprinting technique is an indirect technique known to be valuable and effective for sediment source identification in river catchments. Large variability in sediment delivery was found in previous studies in the Barasona catchment (1509 km², Central Spanish Pyrenees). Simulation results with SWAT and fingerprinting approaches identified badlands and agricultural uses as the main contributors to sediment supply in the reservoir. In this study the <63 μm sediment fraction from the surface reservoir sediments (2 cm) is investigated following the fingerprinting procedure to assess how the use of different statistical procedures affects the estimated source contributions. Three optimum composite fingerprints were selected to discriminate between source contributions based on land uses/land covers from the same dataset by the application of (1) discriminant function analysis, and its combination (as a second step) with (2) the Kruskal-Wallis H-test and (3) principal components analysis. Source contribution results differed between the assessed options, with the greatest differences observed for the option using #3, the two-step process of principal components analysis and discriminant function analysis. The characteristics of the solutions from the applied mixing model and the conceptual understanding of the catchment showed that the most reliable solution was achieved using #2, the two-step process of the Kruskal-Wallis H-test and discriminant function analysis. The assessment showed the importance of the statistical procedure used to define the optimum composite fingerprint for sediment fingerprinting applications. Copyright © 2016 Elsevier Ltd. All rights reserved.
Li, Lian Yong; Yang, Yong Tao; Qu, Chang Min; Liang, Shu Wen; Zhong, Chang Qing; Wang, Xiao Ying; Chen, Yan; Spandorfer, Robert M; Christofaro, Sarah; Cai, Qiang
2018-04-01
The aim of this study was to assess the efficacy and safety of endoscopic management of Zenker's diverticulum (ZD) using a needle-knife technique. A systematic search of PubMed, Embase and Cochrane library databases was performed. All original studies reporting the efficacy and safety of the needle-knife technique for treatment of ZD were included. Pooled event rates across studies were expressed with summative statistics. Main outcomes, such as rates of immediate symptomatic response (ISR), adverse events and recurrence, were extracted, pooled and analyzed. Heterogeneity among studies was assessed using the R statistic. The random effects model was used and results were expressed with forest plots and summative statistics. Thirteen studies including 589 patients were enrolled. Pooled event rates for ISR, overall complication, bleeding and perforation were 88% (95% confidence interval [CI] 79-94%), 13% (95% CI 8-22%), 5% (95% CI 3-10%) and 7% (95% CI 4-12%), respectively. The pooled data demonstrated an overall recurrence rate of 14% (95% CI 9-21%). Diverticulum sizes of at least 4 cm and less than 4 cm demonstrated pooled adverse event rates of 17% (95% CI 10-27%) and 7% (95% CI 2-18%), respectively. When using a diverticuloscope as an accessory, pooled ISR and adverse event rates were 84% (95% CI 58-95%) and 10% (95% CI 3-26%), respectively. Flexible endoscopic procedures using the needle-knife offer a relatively safe and effective treatment of symptomatic ZD, especially for ZD of <4 cm in diameter. © 2018 Chinese Medical Association Shanghai Branch, Chinese Society of Gastroenterology, Renji Hospital Affiliated to Shanghai Jiaotong University School of Medicine and John Wiley & Sons Australia, Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, Samuel S. P.
2013-09-01
The long-range goal of several past and current projects in our DOE-supported research has been the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data, and the implementation and testing of these parameterizations in global models. The main objective of the present project being reported on here has been to develop and apply advanced statistical techniques, including Bayesian posterior estimates, to diagnose and evaluate features of both observed and simulated clouds. The research carried out under this project has been novel in two important ways. The first is that it is a key step in the development of practical stochastic cloud-radiation parameterizations, a new category of parameterizations that offers great promise for overcoming many shortcomings of conventional schemes. The second is that this work has brought powerful new tools to bear on the problem, because it has been an interdisciplinary collaboration between a meteorologist with long experience in ARM research (Somerville) and a mathematician who is an expert on a class of advanced statistical techniques that are well-suited for diagnosing model cloud simulations using ARM observations (Shen). The motivation and long-term goal underlying this work is the utilization of stochastic radiative transfer theory (Lane-Veron and Somerville, 2004; Lane et al., 2002) to develop a new class of parametric representations of cloud-radiation interactions and closely related processes for atmospheric models. The theoretical advantage of the stochastic approach is that it can accurately calculate the radiative heating rates through a broken cloud layer without requiring an exact description of the cloud geometry.
NASA Astrophysics Data System (ADS)
Lemaire, Vincent; Colette, Augustin; Menut, Laurent
2016-04-01
Because of its sensitivity to weather patterns, climate change will have an impact on air pollution, so that, in the future, a climate penalty could jeopardize the expected efficiency of air pollution mitigation measures. A common method to assess the impact of climate on air quality consists in implementing chemistry-transport models forced by climate projections. However, at present, such impact assessments lack multi-model ensemble approaches to address uncertainties because of the substantial computing cost. Therefore, as a preliminary step towards exploring large climate ensembles with air quality models, we developed an ensemble exploration technique in order to point out the climate models that should be investigated as a priority. By using a training dataset from a deterministic projection of climate and air quality over Europe, we identified the main meteorological drivers of air quality for 8 regions in Europe and developed statistical models that could be used to estimate future air pollutant concentrations. Applying this statistical model to the whole EuroCordex ensemble of climate projections, we find a climate penalty for six subregions out of eight (Eastern Europe, France, Iberian Peninsula, Mid Europe and Northern Italy). On the contrary, a climate benefit for PM2.5 was identified for three regions (Eastern Europe, Mid Europe and Northern Italy). The uncertainty of this statistical model limits, however, the confidence we can attribute to the associated quantitative projections. This technique nevertheless allows selecting a subset of relevant regional climate model members that should be used in priority for future deterministic projections to propose an adequate coverage of uncertainties. We are thereby proposing a smart ensemble exploration strategy that can also be used for other impact studies beyond air quality.
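The kind of statistical model described above can be illustrated with a hedged sketch: a multiple linear regression linking assumed meteorological drivers (temperature, wind speed, boundary-layer height) to a surface pollutant concentration, trained on one "deterministic" run and then applied to another ensemble member; all variables and coefficients are invented for illustration only.

```python
# Illustrative sketch of the kind of statistical model described above:
# a multiple linear regression linking meteorological drivers to a surface
# pollutant concentration, trained on a deterministic "training" run and
# then applied to other climate-model members. Variable names and values
# are assumptions for illustration only.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2024)
n_days = 3000
temperature = rng.normal(288, 6, n_days)          # K
wind_speed = rng.gamma(2.0, 2.0, n_days)          # m/s
boundary_layer = rng.gamma(3.0, 300.0, n_days)    # m
ozone = 40 + 1.5 * (temperature - 288) - 0.8 * wind_speed + rng.normal(0, 5, n_days)

X_train = np.column_stack([temperature, wind_speed, boundary_layer])
model = LinearRegression().fit(X_train, ozone)

# Apply to another (here synthetic) climate projection member
X_member = np.column_stack([rng.normal(289, 6, 365),
                            rng.gamma(2.0, 2.0, 365),
                            rng.gamma(3.0, 300.0, 365)])
print("projected mean ozone proxy:", model.predict(X_member).mean())
```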
Hardiman, S; Miller, K; Murphy, M
1993-01-01
Safety observations during the clinical development of Mentane (velnacrine maleate) have included the occurrence of generally asymptomatic liver enzyme elevations confined to patients with Alzheimer's disease (AD). The clinical presentation of this reversible hepatocellular injury is analogous to that reported for tetrahydroaminoacridine (THA). Direct liver injury, possibly associated with the production of a toxic metabolite, would be consistent with reports of aberrant xenobiotic metabolism in Alzheimer's disease patients. Since a patient related aberration in drug metabolism was suspected, a biostatistical strategy was developed with the objective of predicting hepatotoxicity in individual patients prior to exposure to velnacrine maleate. The method used logistic regression techniques with variable selection restricted to those items which could be routinely and inexpensively accessed at screen evaluation for potential candidates for treatment. The model was to be predictive (a marker for eventual hepatotoxicity) rather than a causative model, and techniques employed "goodness of fit", percentage correct, and positive and negative predictive values. On the basis of demographic and baseline laboratory data from 942 patients, the PROPP statistic was developed (the Physician Reference Of Predicted Probabilities). Main effect variables included age, gender, and nine hematological and serum chemistry variables. The sensitivity of the current model is approximately 49%, specificity approximately 88%. Using prior probability estimates, however, in which the patient's likelihood of liver toxicity is presumed to be at least 30%, the positive predictive value ranged between 64-77%. Although the clinical utility of this statistic will require refinements and additional prospective confirmation, its potential existence speaks to the possibility of markers for idiosyncratic drug metabolism in patients with Alzheimer's disease.
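The reported positive predictive value can be checked with Bayes' rule using the stated sensitivity (0.49), specificity (0.88) and an assumed 30% prior probability of liver toxicity:

```latex
% Worked check of the reported positive predictive value at a 30% prior
% probability of liver toxicity, using the stated sensitivity (0.49) and
% specificity (0.88):
\[
  \mathrm{PPV}
  = \frac{\mathrm{sens}\cdot\pi}{\mathrm{sens}\cdot\pi + (1-\mathrm{spec})(1-\pi)}
  = \frac{0.49 \times 0.30}{0.49 \times 0.30 + 0.12 \times 0.70}
  \approx 0.64,
\]
% consistent with the lower end of the 64-77% range quoted above.
```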
Regression: The Apple Does Not Fall Far From the Tree.
Vetter, Thomas R; Schober, Patrick
2018-05-15
Researchers and clinicians are frequently interested in either: (1) assessing whether there is a relationship or association between 2 or more variables and quantifying this association; or (2) determining whether 1 or more variables can predict another variable. The strength of such an association is mainly described by the correlation. However, regression analysis and regression models can be used not only to identify whether there is a significant relationship or association between variables but also to generate estimations of such a predictive relationship between variables. This basic statistical tutorial discusses the fundamental concepts and techniques related to the most common types of regression analysis and modeling, including simple linear regression, multiple regression, logistic regression, ordinal regression, and Poisson regression, as well as the common yet often underrecognized phenomenon of regression toward the mean. The various types of regression analysis are powerful statistical techniques, which when appropriately applied, can allow for the valid interpretation of complex, multifactorial data. Regression analysis and models can assess whether there is a relationship or association between 2 or more observed variables and estimate the strength of this association, as well as determine whether 1 or more variables can predict another variable. Regression is thus being applied more commonly in anesthesia, perioperative, critical care, and pain research. However, it is crucial to note that regression can identify plausible risk factors; it does not prove causation (a definitive cause and effect relationship). The results of a regression analysis instead identify independent (predictor) variable(s) associated with the dependent (outcome) variable. As with other statistical methods, applying regression requires that certain assumptions be met, which can be tested with specific diagnostics.
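As a small worked example of one regression type from the list above, the sketch below fits a Poisson regression (for count outcomes) to synthetic data with statsmodels; the predictors and coefficients are invented for illustration.

```python
# Small sketch of one regression type from the list above (Poisson
# regression for count outcomes), using statsmodels on synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 500
x1 = rng.normal(size=n)                 # e.g. a continuous predictor
x2 = rng.integers(0, 2, size=n)         # e.g. a binary predictor
rate = np.exp(0.3 + 0.5 * x1 - 0.4 * x2)
y = rng.poisson(rate)                   # count outcome

X = sm.add_constant(np.column_stack([x1, x2]))
model = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print(model.params)                     # estimates should be near [0.3, 0.5, -0.4]
```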
Ghoneim, Mai; Saber, Shehab ElDin; El-Badry, Tarek; Obeid, Maram; Hassib, Nehal
2016-12-15
Diabetes mellitus is a multisystem disease which weakens human immunity. Subsequently, it worsens the sequelae of apical periodontitis by raising a fierce bacterial trait due to the impaired host response. This study aimed to estimate bacterial reduction after using different irrigation techniques in systemically healthy and diabetic patients with asymptomatic apical periodontitis. Enterococcus faecalis, Peptostreptococcus micros, and Fusobacterium nucleatum bacteria were chosen, as they are the most common and prevailing strains found in periodontitis. Bacterial samples were retrieved from necrotic root canals of systemically healthy and diabetic patients, before and after endodontic cleaning and shaping using two different irrigation techniques: the conventional one and the EndoVac system. Quantitative polymerase chain reaction (qPCR) was utilised to detect the reduction in the bacterial count. The EndoVac irrigation system was effective in reducing bacteria, especially Peptostreptococcus micros in the diabetic group, when compared to the conventional irrigation technique, with a statistically significant difference. The EndoVac can be considered as a promising tool in combination with irrigant solution to defeat the bacterial colonies living in the root canal system. Additional studies ought to be done to improve the means of bacterial clearance, mainly in immune-compromised individuals.
Ghoneim, Mai; Saber, Shehab ElDin; El-Badry, Tarek; Obeid, Maram; Hassib, Nehal
2016-01-01
BACKGROUND: Diabetes mellitus is a multisystem disease which weakens human immunity. Subsequently, it worsens the sequelae of apical periodontitis by raising a fierce bacterial trait due to the impaired host response. AIM: This study aimed to estimate bacterial reduction after using different irrigation techniques in systemically healthy and diabetic patients with asymptomatic apical periodontitis. MATERIAL AND METHODS: Enterococcus faecalis, Peptostreptococcus micros, and Fusobacterium nucleatum bacteria were chosen, as they are the most common and prevailing strains found in periodontitis. Bacterial samples were retrieved from necrotic root canals of systemically healthy and diabetic patients, before and after endodontic cleaning and shaping using two different irrigation techniques: the conventional one and the EndoVac system. Quantitative polymerase chain reaction (qPCR) was utilised to detect the reduction in the bacterial count. RESULTS: The EndoVac irrigation system was effective in reducing bacteria, especially Peptostreptococcus micros in the diabetic group, when compared to the conventional irrigation technique, with a statistically significant difference. CONCLUSION: The EndoVac can be considered as a promising tool in combination with irrigant solution to defeat the bacterial colonies living in the root canal system. Additional studies ought to be done to improve the means of bacterial clearance, mainly in immune-compromised individuals. PMID:28028421
NASA Astrophysics Data System (ADS)
Karabelchtchikova, Olga; Rivero, Iris V.
2005-02-01
The distribution of residual stresses (RS) and surface integrity generated in heat treatment and subsequent multipass grinding was investigated in this experimental study to examine the source of variability and the nature of the interactions of the experimental factors. A nested experimental design was implemented to (a) compare the sources of the RS variability, (b) to examine RS distribution and tensile peak location due to experimental factors, and (c) to analyze the superposition relationship in the RS distribution due to multipass grinding technique. To characterize the material responses, several techniques were used, including microstructural analysis, hardness-toughness and roughness examinations, and retained austenite and RS measurements using x-ray diffraction. The causality of the RS was explained through the strong correlation of the surface integrity characteristics and RS patterns. The main sources of variation were the depth of the RS distribution and the multipass grinding technique. The grinding effect on the RS was statistically significant; however, it was mostly predetermined by the preexisting RS induced in heat treatment. Regardless of the preceding treatments, the effect of the multipass grinding technique exhibited similar RS patterns, which suggests the existence of the superposition relationship and orthogonal memory between the passes of the grinding operation.
Comparison of different techniques for obturating experimental internal resorptive cavities.
Goldberg, F; Massone, E J; Esmoris, M; Alfie, D
2000-06-01
Forty extracted maxillary central incisors were instrumented at the working length to a #50 file. The roots were sectioned transversely with a diamond disk at 7 mm from the anatomical apex. At the opening of the root canal of each section, hemicircular cavities were drilled with a specially designed bur. The corresponding root sections were cemented with glue, thus obtaining root canals with similar cavities that simulated internal resorptions. Teeth were embedded in plaster casts to facilitate their handling. The specimens were randomly separated into four groups of 10. The following obturation techniques were evaluated: lateral compaction (group A), hybrid technique (group B), Obtura II (group C), and Thermafil (group D). AH26 was used as the sealer. After obturation, the plaster was removed and the teeth were radiographed in buccolingual and mesiodistal directions to evaluate the quality of the obturation at the IRC. The incisors were then cut with a scalpel at the same level as the previous section, to examine, under a stereomicroscope, the type of material that filled the IRC. Obtura II gave the best results and in most of the specimens obturated with this technique, the IRC were filled mainly with gutta-percha. Statistical analysis of the data indicated that the differences between group C and the other groups were significant (P < 0.05).
Okur, O M; Şener, A; Kavakli, H Ş; Çelik, G K; Doğan, N Ö; Içme, F; Günaydin, G P
2017-12-01
We aimed to compare two digital nerve block techniques in patients with traumatic digital lacerations. This was a randomized controlled study designed prospectively in the emergency department of a university-based training and research hospital. Randomization was achieved by sealed envelopes. Half of the patients were randomized to the traditional (two-injection) digital nerve block technique, while the single-injection digital nerve block technique was applied to the other half. Pain score due to anesthetic infiltration and suturing, onset time of total anesthesia, and need for an additional rescue injection were the parameters evaluated in both groups. An epinephrine-added lidocaine hydrochloride preparation was used for the anesthetic application. A visual analog scale was used for the evaluation of pain scores. Outcomes were compared using the Mann-Whitney U test and Student t-test. Fifty emergency department patients ≥18 years requiring digital nerve block were enrolled in the study. The mean age of the patients was 33 (min-max: 19-86) and 39 (78%) were male. No statistically significant difference was found between the two groups in terms of our main parameters: anesthesia pain score, suturing pain score, onset time of total anesthesia and rescue injection need. The single-injection volar digital nerve block technique is a suitable alternative for digital anesthesia in emergency departments.
Secondary Analysis of Qualitative Data.
ERIC Educational Resources Information Center
Turner, Paul D.
The reanalysis of data to answer the original research question with better statistical techniques or to answer new questions with old data is not uncommon in quantitative studies. Meta analysis and research syntheses have increased with the increase in research using similar statistical analyses, refinements of analytical techniques, and the…
Computer program uses Monte Carlo techniques for statistical system performance analysis
NASA Technical Reports Server (NTRS)
Wohl, D. P.
1967-01-01
Computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon the overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
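A minimal modern sketch of the idea behind such a program: sample each component disturbance from its full distribution and push the samples through a system model to obtain performance statistics by simulated random sampling. The tolerances and the system function here are hypothetical stand-ins, not the original program's model.

```python
# Monte Carlo propagation of component disturbances through a system model.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

gain_error = rng.normal(0.0, 0.02, N)        # hypothetical component tolerance
misalignment = rng.uniform(-0.5, 0.5, N)     # hypothetical misalignment, degrees
bias_drift = rng.normal(0.0, 0.1, N)         # hypothetical drift

def system_performance(g, m, b):
    # stand-in nonlinear system response combining the component effects
    return (1.0 + g) * np.cos(np.radians(m)) - 0.05 * b**2

perf = system_performance(gain_error, misalignment, bias_drift)
print(f"mean={perf.mean():.4f}, std={perf.std():.4f}, "
      f"P(perf < 0.97)={np.mean(perf < 0.97):.4f}")
```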
NASA Astrophysics Data System (ADS)
Ksoll, Victor F.; Gouliermis, Dimitrios A.; Klessen, Ralf S.; Grebel, Eva K.; Sabbi, Elena; Anderson, Jay; Lennon, Daniel J.; Cignoni, Michele; de Marchi, Guido; Smith, Linda J.; Tosi, Monica; van der Marel, Roeland P.
2018-05-01
The Hubble Tarantula Treasury Project (HTTP) has provided an unprecedented photometric coverage of the entire starburst region of 30 Doradus down to the half-solar-mass limit. We use the deep stellar catalogue of HTTP to identify all the pre-main-sequence (PMS) stars of the region, i.e., stars that have not started their lives on the main sequence yet. The photometric distinction of these stars from the more evolved populations is not a trivial task due to several factors that alter their colour-magnitude diagram positions. The identification of PMS stars thus requires sophisticated statistical methods. We employ Machine Learning Classification techniques on the HTTP survey of more than 800,000 sources to identify the PMS stellar content of the observed field. Our methodology consists of 1) carefully selecting the most probable low-mass PMS stellar population of the star-forming cluster NGC2070, 2) using this sample to train classification algorithms to build a predictive model for PMS stars, and 3) applying this model in order to identify the most probable PMS content across the entire Tarantula Nebula. We employ Decision Tree, Random Forest and Support Vector Machine classifiers to categorise the stars as PMS and Non-PMS. The Random Forest and Support Vector Machine provided the most accurate models, predicting about 20,000 sources with a candidate probability higher than 50 percent, and almost 10,000 PMS candidates with a probability higher than 95 percent. This is the richest and most accurate photometric catalogue of extragalactic PMS candidates across the extent of a whole star-forming complex.
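A hedged sketch of the classification step with scikit-learn, assuming a training table of HTTP photometry with PMS labels for NGC 2070. File names, feature columns and the 50%/95% probability cuts echo the abstract but are illustrative assumptions, not the authors' pipeline.

```python
# Train RF/SVM classifiers on a labelled sample, then score the full catalogue.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

train = pd.read_csv("ngc2070_training.csv")    # hypothetical file
features = ["m555", "m775", "colour_555_775"]  # hypothetical feature columns
X, y = train[features].values, train["is_pms"].values

rf = RandomForestClassifier(n_estimators=500, random_state=0)
svm = SVC(kernel="rbf", probability=True, random_state=0)
for name, clf in [("RF", rf), ("SVM", svm)]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: cross-validated accuracy = {acc:.3f}")

# Apply the trained forest to the full Tarantula catalogue and keep candidates
# above the probability thresholds quoted in the abstract.
full = pd.read_csv("tarantula_full_catalogue.csv")   # hypothetical file
rf.fit(X, y)
prob = rf.predict_proba(full[features].values)[:, 1]
candidates_50 = full[prob > 0.50]
candidates_95 = full[prob > 0.95]
```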
Multiparametric evaluation of risk factors associated to seroma formation in abdominal wall surgery.
Licari, L; Salamone, G; Parinisi, Z; Campanella, S; Sabatino, C; Ciolino, G; De Marco, P; Falco, N; Boventre, S; Gulotta, G
2017-01-01
Incisional hernia is one of the main topics in general surgery, since there is no unanimous consensus concerning the best surgical methodology to adopt. Prosthetic surgery appears to be the best technique, even if it is responsible for the development of periprosthetic seroma. The aim of this study is to assess whether preoperative abnormalities of bio-humoral parameters may be considered risk factors for seroma. From July 2016 to July 2017, at the "Policlinico Paolo Giaccone", Palermo, Department of Emergency Surgery, the 56 patients included in this study underwent laparotomic mesh repair. The inclusion criteria were: age > 18 years, incisional hernia W2R0 according to the Chevrel classification, and a monoperator technique. The main variables were sex, age, BMI, smoking, ASA score, and co-morbidities. The main serum variables included natraemia, kalaemia, chloraemia, calcaemia, PCR (C-reactive protein), and blood levels of glucose, creatinine, albumin and proteins. The data were analyzed using SPSS software. Univariate analysis highlighted hypo- and hyper-natraemia, hyper-kalaemia, hypo-chloraemia, high levels of PCR, hyper-glycemia, and low levels of serum albumin and proteins as statistically significant variables. Multivariate analysis revealed p<0.05 for PCR, hypo-albuminemia and total serum protein level. Alterations of preoperative bio-humoral parameters could be associated with a greater risk of seroma development. A better understanding of such alterations may lead to more efficient risk stratification methods, which could be essential for allocating medical resources more effectively and for reducing post-operative complications, outpatient follow-up visits, and the risk associated with seroma.
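A hedged sketch of the reported analysis flow (performed by the authors in SPSS): univariate screening of candidate serum variables followed by a multivariable logistic regression for seroma, with odds ratios taken from the fitted coefficients. The file and column names are assumptions.

```python
# Univariate screening then multivariable logistic regression for a 0/1 outcome.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("seroma_cohort.csv")  # hypothetical file with a 0/1 'seroma' column

candidate_vars = ["crp", "albumin", "total_protein", "glucose", "sodium"]
univariate_p = {}
for var in candidate_vars:
    model = sm.Logit(df["seroma"], sm.add_constant(df[[var]])).fit(disp=0)
    univariate_p[var] = model.pvalues[var]

selected = [v for v, p in univariate_p.items() if p < 0.05]
multi = sm.Logit(df["seroma"], sm.add_constant(df[selected])).fit(disp=0)
odds_ratios = np.exp(multi.params)          # OR per unit change in each variable
print(multi.summary())
print(odds_ratios)
```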
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, Richard O.
The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedure techniques, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
Message frames interact with motivational systems to determine depth of message processing.
Shen, Lijiang; Dillard, James Price
2009-09-01
Although several theoretical perspectives predict that negatively framed messages will be processed more deeply than positively framed messages, a recent meta-analysis found no such difference. In this article, the authors explore 2 explanations for this inconsistency. One possibility is methodological: the statistics used in the primary studies underestimated framing effects on depth of message processing because the data were maldistributed. The other is theoretical: the absence of a main effect is veridical, but framing interacts with individual differences that predispose individuals to greater or lesser depth of processing. Data from 2 experiments (Ns = 286 and 252) were analyzed via tobit regression, a technique designed to overcome the limitations of maldistributed data. One study showed the predicted main effect for framing, but the other did not. Both studies showed the anticipated interaction: Depth of processing correlated positively with a measure of the behavioral activation system in the advantage framing condition, whereas depth of processing correlated positively with the behavioral inhibition system in the disadvantage framing condition.
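For readers unfamiliar with tobit-style models, the sketch below fits a censored-normal regression by maximum likelihood, the general family the tobit regression used for maldistributed depth-of-processing scores belongs to. It is a generic illustration under the assumption of right-censoring at a known scale bound, not the authors' specification.

```python
# Minimal tobit-style (censored normal) regression fitted by maximum likelihood.
import numpy as np
from scipy import optimize, stats

def tobit_negloglik(theta, X, y, upper):
    beta, log_sigma = theta[:-1], theta[-1]
    sigma = np.exp(log_sigma)
    mu = X @ beta
    cens = y >= upper                                   # observations piled at the bound
    ll_obs = stats.norm.logpdf(y[~cens], loc=mu[~cens], scale=sigma)
    ll_cens = stats.norm.logsf(upper, loc=mu[cens], scale=sigma)  # P(latent >= upper)
    return -(ll_obs.sum() + ll_cens.sum())

def fit_tobit(X, y, upper):
    X = np.column_stack([np.ones(len(y)), X])           # add intercept
    theta0 = np.zeros(X.shape[1] + 1)                   # betas = 0, log(sigma) = 0
    res = optimize.minimize(tobit_negloglik, theta0, args=(X, y, upper),
                            method="BFGS")
    beta, sigma = res.x[:-1], np.exp(res.x[-1])
    return beta, sigma
```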
Sanchez-Gonzalez, Noe; Jaime-Fonseca, Monica R; San Martin-Martinez, Eduardo; Zepeda, L Gerardo
2013-12-11
Betalains were extracted and analyzed from Opuntia joconostle (the prickly pear known as xoconostle in Mexico). For the extraction, two solvent systems were used, methanol/water and ethanol/water. A three-variable Box-Behnken statistical design was used for extraction: solvent concentration (0-80%, v/v), temperature (5-30 °C), and treatment time (10-30 min). The extraction and stability of betalains from xoconostle were studied using response surface methodology (RSM). Techniques such as UV-vis, column chromatography, and HPLC were employed for the separation and analysis of the main pigments present in the extracts. Maximum pigment concentration (92 mg/100 g of fruit) was obtained at a temperature of 15 °C and a time of 10 min for methanol/water (20:80), whereas maximum stability of the pigment was observed at pH 5 and a temperature of 25 °C. HPLC chromatograms showed that the main betalains of xoconostle characterized were betanin, betanidin, and isobetanin.
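A sketch of a coded three-factor Box-Behnken design and the quadratic response-surface model that RSM would fit to it (solvent concentration, temperature, time versus pigment yield). The measured responses are not reproduced here, so the model fit is left as a commented template; the coded design itself is the standard 15-run BBD.

```python
# Build a coded 3-factor Box-Behnken design (12 edge runs + 3 centre runs)
# and outline the second-order response-surface fit.
from itertools import combinations, product
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf  # used in the commented fit below

def box_behnken(k=3, n_center=3):
    runs = []
    for i, j in combinations(range(k), 2):
        for a, b in product((-1, 1), repeat=2):
            row = [0] * k
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0] * k] * n_center
    return np.array(runs, dtype=float)

design = pd.DataFrame(box_behnken(), columns=["solvent", "temp", "time"])
# design["pigment"] = measured betalain concentration for each run (mg/100 g)

# Quadratic response-surface model (main effects, squares, two-way interactions):
# rsm = smf.ols("pigment ~ solvent + temp + time + I(solvent**2) + I(temp**2) "
#               "+ I(time**2) + solvent:temp + solvent:time + temp:time",
#               data=design).fit()
# print(rsm.summary())
```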
A data base and analysis program for shuttle main engine dynamic pressure measurements
NASA Technical Reports Server (NTRS)
Coffin, T.
1986-01-01
A dynamic pressure data base management system is described for measurements obtained from space shuttle main engine (SSME) hot firing tests. The data were provided in terms of engine power level and rms pressure time histories, and power spectra of the dynamic pressure measurements at selected times during each test. Test measurements and engine locations are defined along with a discussion of data acquisition and reduction procedures. A description of the data base management analysis system is provided and subroutines developed for obtaining selected measurement means, variances, ranges and other statistics of interest are discussed. A summary of pressure spectra obtained at SSME rated power level is provided for reference. Application of the singular value decomposition technique to spectrum interpolation is discussed and isoplots of interpolated spectra are presented to indicate measurement trends with engine power level. Program listings of the data base management and spectrum interpolation software are given. Appendices are included to document all data base measurements.
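A conceptual sketch of the SVD-based spectrum interpolation step: decompose the matrix of spectra measured at the tested power levels, interpolate the low-rank coefficients versus power level, and rebuild a spectrum at an intermediate level. Array contents and power levels are illustrative placeholders, not SSME data.

```python
# SVD decomposition of measured spectra followed by coefficient interpolation.
import numpy as np

# S[i, j] = PSD at frequency bin i, measured at engine power level levels[j]
levels = np.array([65.0, 90.0, 100.0, 104.0, 109.0])      # percent rated power (illustrative)
S = np.random.default_rng(1).random((512, levels.size))   # placeholder spectra

U, s, Vt = np.linalg.svd(S, full_matrices=False)
r = 3                                    # retain the dominant singular triplets
coeffs = np.diag(s[:r]) @ Vt[:r, :]      # r coefficient curves versus power level

target = 97.0                            # power level to interpolate to
c_interp = np.array([np.interp(target, levels, coeffs[k, :]) for k in range(r)])
spectrum_97 = U[:, :r] @ c_interp        # interpolated spectrum at 97% power
```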
NASA Astrophysics Data System (ADS)
Fosas de Pando, Miguel; Schmid, Peter J.; Sipp, Denis
2016-11-01
Nonlinear model reduction for large-scale flows is an essential component in many fluid applications such as flow control, optimization, parameter space exploration and statistical analysis. In this article, we generalize the POD-DEIM method, introduced by Chaturantabut & Sorensen [1], to address nonlocal nonlinearities in the equations without loss of performance or efficiency. The nonlinear terms are represented by nested DEIM-approximations using multiple expansion bases based on the Proper Orthogonal Decomposition. These extensions are imperative, for example, for applications of the POD-DEIM method to large-scale compressible flows. The efficient implementation of the presented model-reduction technique follows our earlier work [2] on linearized and adjoint analyses and takes advantage of the modular structure of our compressible flow solver. The efficacy of the nonlinear model-reduction technique is demonstrated on the flow around an airfoil and its acoustic footprint. We obtain an accurate and robust low-dimensional model that captures the main features of the full flow.
Physiological ICSI (PICSI) vs. Conventional ICSI in Couples with Male Factor: A Systematic Review.
Avalos-Durán, Georgina; Ángel, Ana María Emilia Cañedo-Del; Rivero-Murillo, Juana; Zambrano-Guerrero, Jaime Enoc; Carballo-Mondragón, Esperanza; Checa-Vizcaíno, Miguel Ángel
2018-04-19
To determine the efficacy of the physiological ICSI technique (PICSI) vs. conventional ICSI in the prognosis of couples, with respect to the following outcome measures: live births, clinical pregnancy, implantation, embryo quality, fertilization and miscarriage rates. A systematic review of the literature, extracting raw data and performing data analysis. Patient(s): Couples with male factor infertility who underwent in-vitro fertilization. Main Outcome Measures: rates of live births, clinical pregnancy, implantation, embryo quality, fertilization and miscarriage. In the systematic search, we found 2,918 studies and an additional study from other sources; only two studies fulfilled the inclusion criteria for this systematic review. The rates of live births, clinical pregnancy, implantation, embryo quality, fertilization and miscarriage were similar for both groups. There is no statistically significant difference between PICSI and ICSI for any of the outcomes analyzed in this study. There is still not enough information available to prove the efficacy of the PICSI technique over ICSI in couples with male factor infertility.
de Alencar, Paulo Gilberto Cimbalista; Vieira, Inácio Facó Ventura
2010-01-01
Bone banks are necessary for providing biological material for a series of orthopedic procedures. The growing need for musculoskeletal tissues for transplantation has been due to the development of new surgical techniques, and this has led to a situation in which a variety of hospital services have been willing to have their own source of tissue for transplantation. To increase the safety of transplanted tissues, standards for bone bank operation have been imposed by the government, which has limited the number of authorized institutions. The good performance in a bone bank depends on strict control over all stages, including: formation of well-trained harvesting teams; donor selection; conducting various tests on the tissues obtained; and strict control over the processing techniques used. Combination of these factors enables greater scope of use and numbers of recipient patients, while the incidence of tissue contamination becomes statistically insignificant, and there is traceability between donors and recipients. This paper describes technical considerations relating to how a bone bank functions, the use of grafts and orthopedic applications, the ethical issues and the main obstacles encountered.
Gauquelin, N; van den Bos, K H W; Béché, A; Krause, F F; Lobato, I; Lazar, S; Rosenauer, A; Van Aert, S; Verbeeck, J
2017-10-01
Nowadays, aberration-corrected transmission electron microscopy (TEM) is a popular method to characterise nanomaterials at the atomic scale. Here, atomically resolved images of nanomaterials are acquired, where the contrast depends on the illumination, imaging and detector conditions of the microscope. Visualization of light elements is possible when using low-angle annular dark field (LAADF) STEM, annular bright field (ABF) STEM, integrated differential phase contrast (iDPC) STEM, negative spherical aberration imaging (NCSI) and imaging STEM (ISTEM). In this work, images of a NdGaO3-La0.67Sr0.33MnO3 (NGO-LSMO) interface are quantitatively evaluated by using statistical parameter estimation theory. For imaging light elements, all techniques provide reliable results, while the techniques based on interference contrast, NCSI and ISTEM, are less robust in terms of accuracy for extracting heavy column locations. In terms of precision, sample drift and scan distortions mainly limit the STEM-based techniques as compared to NCSI. Post-processing techniques can, however, partially compensate for this. In order to provide an outlook to the future, simulated images of NGO, in which the unavoidable presence of Poisson noise is taken into account, are used to determine the ultimate precision. In this future counting-noise-limited scenario, NCSI and ISTEM imaging will provide more precise values as compared to the other techniques, which can be related to the mechanisms behind the image recording. Copyright © 2017 Elsevier B.V. All rights reserved.
Statistical reconstruction for cosmic ray muon tomography.
Schultz, Larry J; Blanpied, Gary S; Borozdin, Konstantin N; Fraser, Andrew M; Hengartner, Nicolas W; Klimenko, Alexei V; Morris, Christopher L; Orum, Chris; Sossong, Michael J
2007-08-01
Highly penetrating cosmic ray muons constantly shower the earth at a rate of about 1 muon per cm2 per minute. We have developed a technique which exploits the multiple Coulomb scattering of these particles to perform nondestructive inspection without the use of artificial radiation. In prior work [1]-[3], we have described heuristic methods for processing muon data to create reconstructed images. In this paper, we present a maximum likelihood/expectation maximization tomographic reconstruction algorithm designed for the technique. This algorithm borrows much from techniques used in medical imaging, particularly emission tomography, but the statistics of muon scattering dictates differences. We describe the statistical model for multiple scattering, derive the reconstruction algorithm, and present simulated examples. We also propose methods to improve the robustness of the algorithm to experimental errors and events departing from the statistical model.
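As a structural template, the sketch below implements the classical MLEM update for a Poisson emission-type model y ~ Poisson(A·λ); the paper's algorithm modifies the likelihood to reflect multiple-Coulomb-scattering statistics, so this is only the baseline the reconstruction builds on.

```python
# Classical maximum-likelihood / expectation-maximization (MLEM) update.
import numpy as np

def mlem(A, y, n_iter=50, eps=1e-12):
    """A: (n_measurements, n_voxels) system matrix; y: measured counts."""
    lam = np.ones(A.shape[1])              # nonnegative initial image
    sens = A.sum(axis=0) + eps             # sensitivity image, A^T 1
    for _ in range(n_iter):
        forward = A @ lam + eps            # expected measurements given current image
        lam *= (A.T @ (y / forward)) / sens
    return lam
```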
Technique for estimation of streamflow statistics in mineral areas of interest in Afghanistan
Olson, Scott A.; Mack, Thomas J.
2011-01-01
A technique for estimating streamflow statistics at ungaged stream sites in areas of mineral interest in Afghanistan using drainage-area-ratio relations of historical streamflow data was developed and is documented in this report. The technique can be used to estimate the following streamflow statistics at ungaged sites: (1) 7-day low flow with a 10-year recurrence interval, (2) 7-day low flow with a 2-year recurrence interval, (3) daily mean streamflow exceeded 90 percent of the time, (4) daily mean streamflow exceeded 80 percent of the time, (5) mean monthly streamflow for each month of the year, (6) mean annual streamflow, and (7) minimum monthly streamflow for each month of the year. Because they are based on limited historical data, the estimates of streamflow statistics at ungaged sites are considered preliminary.
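The drainage-area-ratio idea reduces to a one-line formula; the sketch below uses the simplest form with an exponent of 1.0 and illustrative numbers, whereas the report's relations are derived from the limited Afghan historical streamflow data.

```python
# Transfer a streamflow statistic from a gaged basin to an ungaged basin by
# scaling with the drainage-area ratio.
def area_ratio_estimate(stat_gaged, area_gaged_km2, area_ungaged_km2, exponent=1.0):
    return stat_gaged * (area_ungaged_km2 / area_gaged_km2) ** exponent

# e.g. a 7-day, 10-year low flow of 0.8 m^3/s at a 450 km^2 gaged basin,
# transferred to a nearby 300 km^2 ungaged basin (illustrative numbers):
q7_10_ungaged = area_ratio_estimate(0.8, 450.0, 300.0)
print(f"Estimated 7Q10 at ungaged site: {q7_10_ungaged:.2f} m^3/s")
```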
ERIC Educational Resources Information Center
Henry, Gary T.; And Others
1992-01-01
A statistical technique is presented for developing performance standards based on benchmark groups. The benchmark groups are selected using a multivariate technique that relies on a squared Euclidean distance method. For each observation unit (a school district in the example), a unique comparison group is selected. (SLD)
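A minimal sketch of benchmark-group selection by squared Euclidean distance: standardize the district characteristics, compute each district's distance to all others, and keep the k closest as its unique comparison group. The file, column and parameter choices are assumptions.

```python
# Unique comparison-group selection via squared Euclidean distances on
# standardized covariates.
import numpy as np
import pandas as pd

districts = pd.read_csv("district_characteristics.csv", index_col="district")  # hypothetical
Z = (districts - districts.mean()) / districts.std(ddof=0)   # standardized covariates

def benchmark_group(Z, target, k=10):
    diff = Z.values - Z.loc[target].values
    d2 = (diff ** 2).sum(axis=1)                 # squared Euclidean distances
    order = np.argsort(d2)
    peers = Z.index[order]
    return [p for p in peers if p != target][:k]

# Performance standards for a district would then be set from its peers' outcomes.
```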
Chou, Eva; Liu, Jun; Seaworth, Cathleen; Furst, Meredith; Amato, Malena M; Blaydon, Sean M; Durairaj, Vikram D; Nakra, Tanuj; Shore, John W
To compare revision rates for ptosis surgery between posterior-approach and anterior-approach ptosis repair techniques. This is a retrospective, consecutive cohort study. All patients undergoing ptosis surgery at a high-volume oculofacial plastic surgery practice over a 4-year period. A retrospective chart review was conducted of all patients undergoing posterior-approach and anterior-approach ptosis surgery for all etiologies of ptosis between 2011 and 2014. Etiology of ptosis, concurrent oculofacial surgeries, revision, and complications were analyzed. The main outcome measure is the ptosis revision rate. A total of 1519 patients were included in this study. The mean age was 63 ± 15.4 years. A total of 1056 (70%) of patients were female, 1451 (95%) had involutional ptosis, and 1129 (74.3%) had concurrent upper blepharoplasty. Five hundred thirteen (33.8%) underwent posterior-approach ptosis repair, and 1006 (66.2%) underwent anterior-approach ptosis repair. The degree of ptosis was greater in the anterior-approach ptosis repair group. The overall revision rate for all patients was 8.7%. Of the posterior group, 6.8% required ptosis revision; of the anterior group, 9.5% required revision surgery. The main reason for ptosis revision surgery was undercorrection of one or both eyelids. Concurrent brow lifting was associated with a decreased, but not statistically significant, rate of revision surgery. Patients who underwent unilateral ptosis surgery had a 5.1% rate of Hering's phenomenon requiring ptosis repair in the contralateral eyelid. Multivariable logistic regression for predictive factors shows that, when adjusted for gender and concurrent blepharoplasty, the revision rate in anterior-approach ptosis surgery is higher than in posterior-approach ptosis surgery (odds ratio = 2.08; p = 0.002). The overall revision rate in patients undergoing ptosis repair via posterior-approach or anterior-approach techniques is 8.7%. There is a statistically higher rate of revision with anterior-approach ptosis repair.
Experimental Mathematics and Computational Statistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bailey, David H.; Borwein, Jonathan M.
2009-04-30
The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include both applications of experimental mathematics in statistics and statistical methods applied to computational mathematics.
Engaging with the Art & Science of Statistics
ERIC Educational Resources Information Center
Peters, Susan A.
2010-01-01
How can statistics clearly be mathematical and yet distinct from mathematics? The answer lies in the reality that statistics is both an art and a science, and both aspects are important for teaching and learning statistics. Statistics is a mathematical science in that it applies mathematical theories and techniques. Mathematics provides the…
A Multidisciplinary Approach for Teaching Statistics and Probability
ERIC Educational Resources Information Center
Rao, C. Radhakrishna
1971-01-01
The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)
Boe, Debra Thingstad; Parsons, Helen
2009-01-01
Local public health agencies are challenged to continually improve service delivery, yet they frequently operate with constrained resources. Quality improvement methods and techniques such as statistical process control are commonly used in other industries, and they have recently been proposed as a means of improving service delivery and performance in public health settings. We analyzed a quality improvement project undertaken at a local Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) clinic to reduce waiting times and improve client satisfaction with a walk-in nutrition education service. We used statistical process control techniques to evaluate initial process performance, implement an intervention, and assess process improvements. We found that implementation of these techniques significantly reduced waiting time and improved clients' satisfaction with the WIC service. PMID:19608964
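A minimal individuals (X) control chart of the kind used in statistical process control, with the centre line at the mean waiting time and 3-sigma limits estimated from the average moving range. The waiting-time values are placeholders, not the clinic's data.

```python
# Individuals control chart with moving-range sigma estimate.
import numpy as np

wait_min = np.array([34, 41, 28, 39, 45, 37, 52, 30, 36, 44,
                     29, 38, 47, 33, 40, 35, 31, 42, 39, 36], dtype=float)

center = wait_min.mean()
moving_range = np.abs(np.diff(wait_min))
sigma_hat = moving_range.mean() / 1.128       # d2 constant for subgroups of size 2
ucl = center + 3 * sigma_hat
lcl = max(center - 3 * sigma_hat, 0.0)

out_of_control = np.where((wait_min > ucl) | (wait_min < lcl))[0]
print(f"CL={center:.1f} min, UCL={ucl:.1f}, LCL={lcl:.1f}, "
      f"signals at points {out_of_control.tolist()}")
```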
Almarakbi, Waleed A; Kaki, Abdullah M
2014-07-01
The main function of an endotracheal tube (ETT) cuff is to prevent aspiration. High cuff pressure is usually associated with postoperative complications. We compared cuff inflation guided by pressure-volume loop closure (PV-L) with the just-to-seal (JS) technique and assessed the postoperative incidence of sore throat, cough and hoarseness. In a prospective, randomized clinical trial, 100 patients' tracheas were intubated. In the first group (n = 50), ETT cuff inflation was guided by PV-L, while in the second group (n = 50) the ETT cuff was inflated using the JS technique. Intracuff pressures and volumes were measured. The incidence of postoperative cuff-related complications was reported. Demographic data and durations of intubation were comparable between the groups. The use of PV-L was associated with a lesser amount of intracuff air [4.05 (3.7-4.5) vs 5 (4.8-5.5), P < 0.001] and a lower cuff pressure than in the JS group [18.25 (18-19) vs 33 (32-35), P ≤ 0.001]. Postextubation cuff-related complications were significantly less frequent among the PV-L group patients as compared with the JS group patients (P ≤ 0.009), except for hoarseness of voice, which was less frequent among the PV-L group but not statistically significantly so (P ≤ 0.065). Multiple regression models for prediction of intracuff pressure after intubation and before extubation revealed a statistically significant association with the technique used for cuff inflation (P < 0.0001). The study confirms that PV-L-guided ETT cuff inflation is an effective way to seal the airway and is associated with a lower ETT cuff pressure and a lower incidence of cuff-related complications.
Forensic 3D Visualization of CT Data Using Cinematic Volume Rendering: A Preliminary Study.
Ebert, Lars C; Schweitzer, Wolf; Gascho, Dominic; Ruder, Thomas D; Flach, Patricia M; Thali, Michael J; Ampanozi, Garyfalia
2017-02-01
The 3D volume-rendering technique (VRT) is commonly used in forensic radiology. Its main function is to explain medical findings to state attorneys, judges, or police representatives. New visualization algorithms permit the generation of almost photorealistic volume renderings of CT datasets. The objective of this study is to present and compare a variety of radiologic findings to illustrate the differences between and the advantages and limitations of the current VRT and the physically based cinematic rendering technique (CRT). Seventy volunteers were shown VRT and CRT reconstructions of 10 different cases. They were asked to mark the findings on the images and rate them in terms of realism and understandability. A total of 48 of the 70 questionnaires were returned and included in the analysis. On the basis of most of the findings presented, CRT appears to be equal or superior to VRT with respect to the realism and understandability of the visualized findings. Overall, in terms of realism, the difference between the techniques was statistically significant (p < 0.05). Most participants perceived the CRT findings to be more understandable than the VRT findings, but that difference was not statistically significant (p > 0.05). CRT, which is similar to conventional VRT, is not primarily intended for diagnostic radiologic image analysis, and therefore it should be used primarily as a tool to deliver visual information in the form of radiologic image reports. Using CRT for forensic visualization might have advantages over using VRT if conveying a high degree of visual realism is of importance. Most of the shortcomings of CRT have to do with the software being an early prototype.
Effect of numbers vs pictures on perceived effectiveness of a public safety awareness advertisement.
Bochniak, S; Lammers, H B
1991-08-01
In a 2 x 2 completely randomized factorial experiment, 24 women and 16 men rated the perceived effectiveness of an earthquake preparedness advertisement which contained either a picture or no picture of prior earthquake damage and contained either statistics or no statistics on the likelihood of an earthquake. A main effect indicating the superiority of the picture was found. The presence of statistics had no main or interactive effects on the perceived effectiveness of the advertisement.
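A 2 x 2 between-subjects design of this sort is typically analyzed with a two-way ANOVA; the sketch below shows that analysis with statsmodels on placeholder ratings (the study's own analysis and data are not reproduced).

```python
# Two-way ANOVA for a 2 x 2 factorial design (picture x statistics).
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.DataFrame({
    "rating":        [5, 6, 7, 4, 6, 5, 3, 4, 2, 5, 4, 3, 6, 7, 5, 4],  # placeholder ratings
    "picture":       ["yes"] * 8 + ["no"] * 8,
    "stats_present": (["yes"] * 4 + ["no"] * 4) * 2,
})

model = smf.ols("rating ~ C(picture) * C(stats_present)", data=df).fit()
print(anova_lm(model, typ=2))   # main effects and the picture x statistics interaction
```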
Infrared evaluation of the heat-sink bipolar diathermy dissection technique.
Allan, J; Dusseldorp, J; Rabey, N G; Malata, C M; Goltsman, D; Phoon, A F
2015-08-01
The use of the bipolar diathermy dissection technique is widespread amongst surgeons performing flap perforator dissection and microvascular surgery. The 'heat-sink' modification uses DeBakey forceps as a heat-sinking interposition between the bipolar tip and the main (vascular or flap) pedicle, aiming to protect it from the thermal effects of the bipolar diathermy. This study examines the thermal effects of bipolar cautery upon the microvasculature and investigates the efficacy of heat sinking as a thermally protective technique in microsurgical dissection. A chicken thigh microsurgical training model was used to examine the effects of bipolar cautery. The effects of bipolar cautery were examined using high-definition, real-time infrared thermographic imaging (FLIR Systems), and temperature was quantitatively assessed at various distances from the point of bipolar cautery. A comparison was made using the heat-sink technique to determine whether it conferred a thermoprotective effect compared to the standard technique without a heat sink. Using paired t-test analysis (SPSS), the heat-sink modification of the bipolar dissection technique was found to have a highly statistically significant effect (P < 0.000000001) in reducing the conductive temperature along the vascular pedicle. This protective effect kept temperatures comparable to controls. Bipolar cautery is an extremely safe method of electrosurgery; however, when its use is required within 3 mm of important vascular architecture, the heat-sink method is a viable and easy technique to prevent thermal spread and limit potential coagulopathic changes. Copyright © 2015 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.
Statistical Techniques to Analyze Pesticide Data Program Food Residue Observations.
Szarka, Arpad Z; Hayworth, Carol G; Ramanarayanan, Tharacad S; Joseph, Robert S I
2018-06-26
The U.S. EPA conducts dietary-risk assessments to ensure that levels of pesticides on food in the U.S. food supply are safe. Often these assessments utilize conservative residue estimates, maximum residue levels (MRLs), and a high-end estimate derived from registrant-generated field-trial data sets. A more realistic estimate of consumers' pesticide exposure from food may be obtained by utilizing residues from food-monitoring programs, such as the Pesticide Data Program (PDP) of the U.S. Department of Agriculture. A substantial portion of food-residue concentrations in PDP monitoring programs are below the limits of detection (left-censored), which makes the comparison of regulatory-field-trial and PDP residue levels difficult. In this paper, we present a novel adaptation of established statistical techniques, the Kaplan-Meier estimator (K-M), robust regression on order statistics (ROS), and the maximum-likelihood estimator (MLE), to quantify the pesticide-residue concentrations in the presence of heavily censored data sets. The examined statistical approaches include the most commonly used parametric and nonparametric methods for handling left-censored data that have been used in the fields of medical and environmental sciences. This work presents a case study in which data of thiamethoxam residue on bell pepper generated from registrant field trials were compared with PDP-monitoring residue values. The results from the statistical techniques were evaluated and compared with commonly used simple substitution methods for the determination of summary statistics. It was found that the MLE is the most appropriate statistical method to analyze this residue data set. Using the MLE technique, the data analyses showed that the median and mean PDP bell pepper residue levels were approximately 19 and 7 times lower, respectively, than the corresponding statistics of the field-trial residues.
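A hedged sketch of the censored maximum-likelihood idea: detected residues contribute the lognormal density and non-detects contribute the CDF at their detection limits. This mirrors the MLE approach in spirit only; it is not the authors' implementation, and the lognormal choice is an assumption.

```python
# Maximum-likelihood fit of a lognormal to left-censored residue data.
import numpy as np
from scipy import optimize, stats

def censored_lognormal_mle(detects, detection_limits):
    """detects: measured residues; detection_limits: DLs of the non-detects."""
    detects = np.asarray(detects, dtype=float)
    dls = np.asarray(detection_limits, dtype=float)

    def negloglik(theta):
        mu, log_sigma = theta
        sigma = np.exp(log_sigma)
        # lognormal log-density of detects: normal logpdf of log(x) minus log(x)
        ll = stats.norm.logpdf(np.log(detects), mu, sigma).sum() - np.log(detects).sum()
        # non-detects: log P(X < DL) under the lognormal
        ll += stats.norm.logcdf((np.log(dls) - mu) / sigma).sum()
        return -ll

    res = optimize.minimize(negloglik, x0=np.array([np.log(detects).mean(), 0.0]),
                            method="Nelder-Mead")
    mu, sigma = res.x[0], np.exp(res.x[1])
    mean = np.exp(mu + 0.5 * sigma**2)      # lognormal mean
    median = np.exp(mu)
    return mean, median
```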
NASA Astrophysics Data System (ADS)
Tierz, Pablo; Sandri, Laura; Ramona Stefanescu, Elena; Patra, Abani; Marzocchi, Warner; Costa, Antonio; Sulpizio, Roberto
2014-05-01
Explosive volcanoes and, especially, Pyroclastic Density Currents (PDCs) pose an enormous threat to populations living in the surroundings of volcanic areas. Difficulties in the modeling of PDCs are related to (i) very complex and stochastic physical processes, intrinsic to their occurrence, and (ii) a lack of knowledge about how these processes actually form and evolve. This means that there are deep uncertainties (namely, of aleatory nature due to point (i) above, and of epistemic nature due to point (ii) above) associated with the study and forecast of PDCs. Consequently, the assessment of their hazard is better described in terms of probabilistic approaches rather than by deterministic ones. What is actually done to assess probabilistic hazard from PDCs is to couple deterministic simulators with statistical techniques that can, eventually, supply probabilities and inform about the uncertainties involved. In this work, some examples of both PDC numerical simulators (Energy Cone and TITAN2D) and uncertainty quantification techniques (Monte Carlo sampling -MC-, Polynomial Chaos Quadrature -PCQ- and Bayesian Linear Emulation -BLE-) are presented, and their advantages, limitations and future potential are underlined. The key point in choosing a specific method lies in the balance between its related computational cost, the physical reliability of the simulator and the pursued target of the hazard analysis (type of PDCs considered, time-scale selected for the analysis, particular guidelines received from decision-making agencies, etc.). Although current numerical and statistical techniques have brought important advances in probabilistic volcanic hazard assessment from PDCs, some of them may be further applicable to more sophisticated simulators. In addition, forthcoming improvements could be focused on three main multidisciplinary directions: 1) Validate the simulators frequently used (through comparison with PDC deposits and other simulators), 2) Decrease simulator runtimes (whether by increasing the knowledge about the physical processes or by doing more efficient programming, parallelization, ...) and 3) Improve uncertainty quantification techniques.
Student's Conceptions in Statistical Graph's Interpretation
ERIC Educational Resources Information Center
Kukliansky, Ida
2016-01-01
Histograms, box plots and cumulative distribution graphs are popular graphic representations for statistical distributions. The main research question that this study focuses on is how college students deal with interpretation of these statistical graphs when translating graphical representations into analytical concepts in descriptive statistics.…
NASA Astrophysics Data System (ADS)
Guadagnini, A.; Riva, M.; Dell'Oca, A.
2017-12-01
We propose to ground sensitivity of uncertain parameters of environmental models on a set of indices based on the main (statistical) moments, i.e., mean, variance, skewness and kurtosis, of the probability density function (pdf) of a target model output. This enables us to perform Global Sensitivity Analysis (GSA) of a model in terms of multiple statistical moments and yields a quantification of the impact of model parameters on features driving the shape of the pdf of model output. Our GSA approach includes the possibility of being coupled with the construction of a reduced complexity model that allows approximating the full model response at a reduced computational cost. We demonstrate our approach through a variety of test cases. These include a commonly used analytical benchmark, a simplified model representing pumping in a coastal aquifer, a laboratory-scale tracer experiment, and the migration of fracturing fluid through a naturally fractured reservoir (source) to reach an overlying formation (target). Our strategy allows discriminating the relative importance of model parameters to the four statistical moments considered. We also provide an appraisal of the error associated with the evaluation of our sensitivity metrics by replacing the original system model through the selected surrogate model. Our results suggest that one might need to construct a surrogate model with increasing level of accuracy depending on the statistical moment considered in the GSA. The methodological framework we propose can assist the development of analysis techniques targeted to model calibration, design of experiment, uncertainty quantification and risk assessment.
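A simple Monte Carlo sketch of moment-based sensitivity indices: bin each parameter's samples, compute conditional means and variances of the output per bin, and measure their departure from the unconditional moments. The exact index definitions and normalizations in the paper may differ; the test function below is an analytical toy, not one of the paper's test cases.

```python
# Moment-based global sensitivity indices estimated by conditional binning.
import numpy as np

def moment_based_gsa(X, y, n_bins=20):
    """X: (n_samples, n_params) parameter samples; y: corresponding model outputs."""
    n, d = X.shape
    mean_y, var_y = y.mean(), y.var()
    idx_mean, idx_var = np.zeros(d), np.zeros(d)
    for i in range(d):
        edges = np.quantile(X[:, i], np.linspace(0, 1, n_bins + 1))
        bins = np.clip(np.searchsorted(edges, X[:, i], side="right") - 1, 0, n_bins - 1)
        cond_means = np.array([y[bins == b].mean() for b in range(n_bins)])
        cond_vars = np.array([y[bins == b].var() for b in range(n_bins)])
        idx_mean[i] = np.mean(np.abs(cond_means - mean_y)) / abs(mean_y)
        idx_var[i] = np.mean(np.abs(cond_vars - var_y)) / var_y
    return idx_mean, idx_var

# Example with an Ishigami-like analytical toy function:
rng = np.random.default_rng(0)
X = rng.uniform(-np.pi, np.pi, size=(50_000, 3))
y = np.sin(X[:, 0]) + 7 * np.sin(X[:, 1])**2 + 0.1 * X[:, 2]**4 * np.sin(X[:, 0])
print(moment_based_gsa(X, y))
```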
Application of Scan Statistics to Detect Suicide Clusters in Australia
Cheung, Yee Tak Derek; Spittal, Matthew J.; Williamson, Michelle Kate; Tung, Sui Jay; Pirkis, Jane
2013-01-01
Background Suicide clustering occurs when multiple suicide incidents take place in a small area or/and within a short period of time. In spite of the multi-national research attention and particular efforts in preparing guidelines for tackling suicide clusters, the broader picture of epidemiology of suicide clustering remains unclear. This study aimed to develop techniques in using scan statistics to detect clusters, with the detection of suicide clusters in Australia as example. Methods and Findings Scan statistics was applied to detect clusters among suicides occurring between 2004 and 2008. Manipulation of parameter settings and change of area for scan statistics were performed to remedy shortcomings in existing methods. In total, 243 suicides out of 10,176 (2.4%) were identified as belonging to 15 suicide clusters. These clusters were mainly located in the Northern Territory, the northern part of Western Australia, and the northern part of Queensland. Among the 15 clusters, 4 (26.7%) were detected by both national and state cluster detections, 8 (53.3%) were only detected by the state cluster detection, and 3 (20%) were only detected by the national cluster detection. Conclusions These findings illustrate that the majority of spatial-temporal clusters of suicide were located in the inland northern areas, with socio-economic deprivation and higher proportions of indigenous people. Discrepancies between national and state/territory cluster detection by scan statistics were due to the contrast of the underlying suicide rates across states/territories. Performing both small-area and large-area analyses, and applying multiple parameter settings may yield the maximum benefits for exploring clusters. PMID:23342098
NASA Astrophysics Data System (ADS)
Rimov, A. A.; Chukanova, T. I.; Trofimov, Yu. V.
2016-12-01
Data on the variants of comparative quality analysis (benchmarking) of power installations applied in the power industry are systematized. It is shown that the most efficient way of implementing the benchmarking technique is to analyze the statistical distributions of the indicators within a composed homogeneous group of uniform power installations. Building on this approach, a benchmarking technique is developed that aims at revealing the available reserves for improving the reliability and heat-efficiency indicators of the power installations of thermal power plants. The technique makes it possible to reliably compare the quality of power installations within a homogeneous group of limited size and to adopt adequate decisions on improving particular technical characteristics of a given installation. The technique structures the list of comparison indicators and the internal factors affecting them in accordance with the requirements of the sectoral standards and with allowance for the price-formation characteristics of the Russian power industry. This structuring ensures traceability of the reasons for deviations of the internal influencing factors from their specified values. The starting point for a further detailed analysis of the lag of a given power installation's indicators behind best practice, expressed in specific monetary terms, is the positioning of that installation on the distribution of the key indicator, which is a convolution of the comparison indicators. The distribution of the key indicator is simulated by the Monte Carlo method once the actual distributions of the comparison indicators have been obtained: the specific lost profit due to the short supply of electric energy and the short delivery of power, the specific cost of losses due to nonoptimal expenditures for repairs, and the specific cost of excess fuel-equivalent consumption. Quality-loss indicators are developed to facilitate the analysis of the benchmarking results; they represent the quality loss of a given power installation as the difference between the actual value of the key indicator (or a comparison indicator) and the best quartile of the existing distribution. The uncertainty of the obtained quality-loss values was evaluated by transforming the standard uncertainties of the input values into expanded uncertainties of the output values at a confidence level of 95%. The efficiency of the technique is demonstrated by benchmarking the main thermal and mechanical equipment of the T-250 extraction power-generating units and of thermal power plant installations with a main steam pressure of 130 atm.
Seuba, Jordi; Deville, Sylvain; Guizard, Christian; Stevenson, Adam J
2016-01-01
Macroporous ceramics exhibit an intrinsic strength variability caused by the random distribution of defects in their structure. However, the precise role of microstructural features, other than pore volume, on reliability is still unknown. Here, we analyze the applicability of the Weibull analysis to unidirectional macroporous yttria-stabilized-zirconia (YSZ) prepared by ice-templating. First, we performed crush tests on samples with controlled microstructural features with the loading direction parallel to the porosity. The compressive strength data were fitted using two different fitting techniques, ordinary least squares and Bayesian Markov Chain Monte Carlo, to evaluate whether Weibull statistics are an adequate descriptor of the strength distribution. The statistical descriptors indicated that the strength data are well described by the Weibull statistical approach, for both fitting methods used. Furthermore, we assess the effect of different microstructural features (volume, size, densification of the walls, and morphology) on Weibull modulus and strength. We found that the key microstructural parameter controlling reliability is wall thickness. In contrast, pore volume is the main parameter controlling the strength. The highest Weibull modulus and mean strength (198.2 MPa) were obtained for the samples with the smallest and narrowest wall thickness distribution (3.1 μm) and lower pore volume (54.5%).
Curve fitting and modeling with splines using statistical variable selection techniques
NASA Technical Reports Server (NTRS)
Smith, P. L.
1982-01-01
The successful application of statistical variable selection techniques to fit splines is demonstrated. Major emphasis is given to knot selection, but order determination is also discussed. Two FORTRAN backward elimination programs, using the B-spline basis, were developed. The program for knot elimination is compared in detail with two other spline-fitting methods and several statistical software packages. An example is also given for the two-variable case using a tensor product basis, with a theoretical discussion of the difficulties of their use.
Fitting multidimensional splines using statistical variable selection techniques
NASA Technical Reports Server (NTRS)
Smith, P. L.
1982-01-01
This report demonstrates the successful application of statistical variable selection techniques to fit splines. Major emphasis is given to knot selection, but order determination is also discussed. Two FORTRAN backward elimination programs using the B-spline basis were developed, and the one for knot elimination is compared in detail with two other spline-fitting methods and several statistical software packages. An example is also given for the two-variable case using a tensor product basis, with a theoretical discussion of the difficulties of their use.
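A sketch of the backward-elimination idea for knot selection. For brevity it uses a truncated-power cubic basis rather than the B-spline basis of the reports; the elimination loop (drop the least significant knot term until all remaining knot terms are significant) is the same in spirit.

```python
# Backward elimination of spline knots by t-test significance of knot terms.
import numpy as np
from scipy import stats

def spline_basis(x, knots):
    cols = [np.ones_like(x), x, x**2, x**3]
    cols += [np.clip(x - k, 0.0, None) ** 3 for k in knots]   # truncated-power terms
    return np.column_stack(cols)

def backward_eliminate_knots(x, y, knots, alpha=0.05):
    knots = list(knots)
    while knots:
        X = spline_basis(x, knots)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        dof = len(y) - X.shape[1]
        sigma2 = resid @ resid / dof
        cov = sigma2 * np.linalg.inv(X.T @ X)
        t = beta / np.sqrt(np.diag(cov))
        p = 2 * stats.t.sf(np.abs(t), dof)
        knot_p = p[4:]                        # p-values of the knot terms only
        worst = int(np.argmax(knot_p))
        if knot_p[worst] <= alpha:            # all remaining knots significant
            break
        knots.pop(worst)                      # drop the least significant knot
    return knots
```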
Dell’Avvocata, Fabio; Zuin, Marco; Giatti, Sara; Duong, Khanh; Pham, Trung; Tuan, Nguyen Si; Vassiliev, Dobrin; Daggubati, Ramesh; Nguyen, Thach
2017-01-01
Background and Objectives: Provisional and culotte are the most commonly used techniques in left main (LM) stenting. The impact of different post-dilation techniques on the fluid dynamics of the LM bifurcation has not yet been investigated. The aim of this study is to evaluate, by means of computational fluid dynamic (CFD) analysis, the impact of different post-dilation techniques, including the proximal optimization technique (POT), kissing balloon (KB), POT-Side-POT, POT-KB-POT, 2-step kissing (2SK) and snuggle kissing balloon (SKB), on the flow dynamic profile after LM provisional or culotte stenting. Methods: We considered an LM-LAD-LCX bifurcation reconstructed after reviewing 100 consecutive patients (mean age 71.4 ± 9.3 years, 49 males) with LM distal disease. The diameters of the LAD and LCX were modelled according to Finet's law as follows: LM 4.5 mm, LAD 3.5 mm, LCX 2.75 mm, with the bifurcation angle set at 55°. A Xience third-generation stent (Abbott Inc., USA) was reconstructed and virtually implanted in provisional/cross-over and culotte fashion. POT, KB, POT-Side-POT, POT-KB-POT, 2SK and SKB were virtually applied and analyzed in terms of the wall shear stress (WSS). Results: In provisional stenting, the 2SK and KB techniques had a statistically significantly lower impact on the WSS at the carina, while POT seemed to have a neutral effect. At the wall opposite the carina, the more physiological profile was obtained by KB and POT, with a higher WSS value and a smaller surface area of low WSS. In culotte stenting, at the carina, POT-KB-POT and 2SK had a very physiological profile, while at the wall opposite the carina, 2SK and POT-KB-POT significantly decreased the surface area of low WSS compared to the other techniques. Conclusion: From the fluid dynamic point of view, in LM provisional stenting POT, 2SK and KB showed a similar beneficial impact on the bifurcation rheology, while in LM culotte stenting POT-KB-POT and 2SK performed slightly better than the other techniques, probably reflecting better strut apposition. PMID:29340277
Statistical Tests of Reliability of NDE
NASA Technical Reports Server (NTRS)
Baaklini, George Y.; Klima, Stanley J.; Roth, Don J.; Kiser, James D.
1987-01-01
Capabilities of advanced material-testing techniques analyzed. Collection of four reports illustrates statistical method for characterizing flaw-detecting capabilities of sophisticated nondestructive evaluation (NDE). Method used to determine reliability of several state-of-the-art NDE techniques for detecting failure-causing flaws in advanced ceramic materials considered for use in automobiles, airplanes, and space vehicles.
Statistical Techniques for Efficient Indexing and Retrieval of Document Images
ERIC Educational Resources Information Center
Bhardwaj, Anurag
2010-01-01
We have developed statistical techniques to improve the performance of document image search systems where the intermediate step of OCR based transcription is not used. Previous research in this area has largely focused on challenges pertaining to generation of small lexicons for processing handwritten documents and enhancement of poor quality…
Towards the implementation of a spectral database for the detection of biological warfare agents
NASA Astrophysics Data System (ADS)
Carestia, M.; Pizzoferrato, R.; Gelfusa, M.; Cenciarelli, O.; D'Amico, F.; Malizia, A.; Scarpellini, D.; Murari, A.; Vega, J.; Gaudio, P.
2014-10-01
The deliberate use of biological warfare agents (BWA) and other pathogens can jeopardize the safety of population, fauna and flora, and represents a concrete concern from the military and civil perspective. At present, the only commercially available tools for fast warning of a biological attack can perform point detection and require active or passive sampling collection. The development of a stand-off detection system would be extremely valuable to minimize the risk and the possible consequences of the release of biological aerosols in the atmosphere. Biological samples can be analyzed by means of several optical techniques, covering a broad region of the electromagnetic spectrum. Strong evidence proved that the informative content of fluorescence spectra could provide good preliminary discrimination among those agents and it can also be obtained through stand-off measurements. Such a system necessitates a database and a mathematical method for the discrimination of the spectral signatures. In this work, we collected fluorescence emission spectra of the main BWA simulants, to implement a spectral signature database and apply the Universal Multi Event Locator (UMEL) statistical method. Our preliminary analysis, conducted in laboratory conditions with a standard UV lamp source, considers the main experimental setups influencing the fluorescence signature of some of the most commonly used BWA simulants. Our work represents a first step towards the implementation of a spectral database and a laser-based biological stand-off detection and identification technique.
As above, so below? Towards understanding inverse models in BCI
NASA Astrophysics Data System (ADS)
Lindgren, Jussi T.
2018-02-01
Objective. In brain-computer interfaces (BCI), measurements of the user’s brain activity are classified into commands for the computer. With EEG-based BCIs, the origins of the classified phenomena are often considered to be spatially localized in the cortical volume and mixed in the EEG. We investigate if more accurate BCIs can be obtained by reconstructing the source activities in the volume. Approach. We contrast the physiology-driven source reconstruction with data-driven representations obtained by statistical machine learning. We explain these approaches in a common linear dictionary framework and review the different ways to obtain the dictionary parameters. We consider the effect of source reconstruction on some major difficulties in BCI classification, namely information loss, feature selection and nonstationarity of the EEG. Main results. Our analysis suggests that the approaches differ mainly in their parameter estimation. Physiological source reconstruction may thus be expected to improve BCI accuracy if machine learning is not used or where it produces less optimal parameters. We argue that the considered difficulties of surface EEG classification can remain in the reconstructed volume and that data-driven techniques are still necessary. Finally, we provide some suggestions for comparing approaches. Significance. The present work illustrates the relationships between source reconstruction and machine learning-based approaches for EEG data representation. The provided analysis and discussion should help in understanding, applying, comparing and improving such techniques in the future.
Mali, Matilda; Dell'Anna, Maria Michela; Mastrorilli, Piero; Damiani, Leonardo; Ungaro, Nicola; Belviso, Claudia; Fiore, Saverio
2015-11-01
Sediment contamination by metals poses significant risks to coastal ecosystems and is considered to be problematic for dredging operations. The determination of the background values of metal and metalloid distribution based on site-specific variability is fundamental in assessing pollution levels in harbour sediments. The novelty of the present work consists of addressing the scope and limitations of analysing port sediments through conventional statistical techniques (such as linear regression analysis, construction of cumulative frequency curves, and the iterative 2σ technique) that are commonly employed for assessing Regional Geochemical Background (RGB) values in coastal sediments. This study ascertained that although the outright use of such techniques to determine RGB values in harbour sediments seems appropriate (the chemical-physical parameters of port sediments fit the statistical equations well), it should nevertheless be avoided because it may be misleading and can mask key aspects of the study area that can only be revealed by further investigations, such as mineralogical and multivariate statistical analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.
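The iterative 2σ technique mentioned above fits in a few lines: repeatedly discard values outside mean ± 2σ until the data stop changing, then take the surviving mean ± 2σ as the background range. This is the generic procedure, without any of the site-specific adjustments the study argues for.

```python
# Iterative 2-sigma estimate of a geochemical background range.
import numpy as np

def iterative_2sigma(values, max_iter=100):
    x = np.asarray(values, dtype=float)
    for _ in range(max_iter):
        m, s = x.mean(), x.std(ddof=1)
        kept = x[(x >= m - 2 * s) & (x <= m + 2 * s)]
        if kept.size == x.size:          # no values removed: converged
            break
        x = kept
    return x.mean() - 2 * x.std(ddof=1), x.mean() + 2 * x.std(ddof=1)

# background_low, background_high = iterative_2sigma(metal_concentrations_mg_kg)
```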
A comparison of linear and nonlinear statistical techniques in performance attribution.
Chan, N H; Genovese, C R
2001-01-01
Performance attribution is usually conducted under the linear framework of multifactor models. Although commonly used by practitioners in finance, linear multifactor models are known to be less than satisfactory in many situations. After a brief survey of nonlinear methods, nonlinear statistical techniques are applied to performance attribution of a portfolio constructed from a fixed universe of stocks using factors derived from some commonly used cross sectional linear multifactor models. By rebalancing this portfolio monthly, the cumulative returns for procedures based on standard linear multifactor model and three nonlinear techniques-model selection, additive models, and neural networks-are calculated and compared. It is found that the first two nonlinear techniques, especially in combination, outperform the standard linear model. The results in the neural-network case are inconclusive because of the great variety of possible models. Although these methods are more complicated and may require some tuning, toolboxes are developed and suggestions on calibration are proposed. This paper demonstrates the usefulness of modern nonlinear statistical techniques in performance attribution.
NASA Technical Reports Server (NTRS)
Zimmerman, G. A.; Olsen, E. T.
1992-01-01
Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
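A sketch contrasting an order-statistic estimator with a single-pass threshold-and-count estimator, assuming exponentially distributed power samples (a common model for post-detection noise in a spectral bin); the threshold and sample size are illustrative choices, not HRMS settings.

```python
# Order-statistic vs. threshold-and-count noise-power estimation.
import numpy as np

rng = np.random.default_rng(7)
true_power = 2.0
samples = rng.exponential(true_power, size=4096)

# Order-statistic estimator: the median of an exponential is mu * ln(2),
# so rescale the sample median.
mu_order = np.median(samples) / np.log(2.0)

# Threshold-and-count estimator: count samples below a fixed threshold T in a
# single pass and invert P(X < T) = 1 - exp(-T / mu).
T = 1.5
frac_below = np.count_nonzero(samples < T) / samples.size
mu_count = -T / np.log1p(-frac_below)

print(f"order-statistic: {mu_order:.3f}, threshold-and-count: {mu_count:.3f}")
```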
Rasta, Seyed Hossein; Partovi, Mahsa Eisazadeh; Seyedarabi, Hadi; Javadzadeh, Alireza
2015-01-01
To investigate the effect of preprocessing techniques, including contrast enhancement and illumination correction, on retinal image quality, a comparative study was carried out. We studied and implemented several illumination correction and contrast enhancement techniques on color retinal images to find the best technique for optimum image enhancement. To compare and choose the best illumination correction technique, we analyzed the corrected red and green components of color retinal images statistically and visually. The two contrast enhancement techniques were analyzed using a vessel segmentation algorithm by calculating the sensitivity and specificity. The statistical evaluation of the illumination correction techniques was carried out by calculating the coefficients of variation. The dividing method using the median filter to estimate background illumination showed the lowest coefficients of variation in the red component. The quotient and homomorphic filtering methods, after the dividing method, presented good results based on their low coefficients of variation. Contrast-limited adaptive histogram equalization (CLAHE) increased the sensitivity of the vessel segmentation algorithm by up to 5% at the same level of accuracy, and has a higher sensitivity than the polynomial transformation operator as a contrast enhancement technique for vessel segmentation. Three techniques, the dividing method using the median filter to estimate background, the quotient-based method and homomorphic filtering, were found to be effective illumination correction techniques based on the statistical evaluation. Applying a local contrast enhancement technique such as CLAHE to fundus images showed good potential for enhancing vasculature segmentation.
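A short sketch of two pieces of the evaluation: CLAHE contrast enhancement of the green channel with scikit-image, and the coefficient of variation used to judge illumination correction. The file name is a placeholder, and the channel choice follows the abstract's focus on the red and green components.

```python
# CLAHE enhancement plus coefficient-of-variation evaluation of a retinal image.
import numpy as np
from skimage import io, exposure

rgb = io.imread("fundus.png")                  # hypothetical retinal image
green = rgb[:, :, 1].astype(float) / 255.0     # green channel scaled to [0, 1]

# Contrast-limited adaptive histogram equalization (CLAHE)
enhanced = exposure.equalize_adapthist(green, clip_limit=0.01)

def coefficient_of_variation(channel):
    return channel.std() / channel.mean()

print(f"CV before: {coefficient_of_variation(green):.3f}, "
      f"after CLAHE: {coefficient_of_variation(enhanced):.3f}")
```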
A Classification of Statistics Courses (A Framework for Studying Statistical Education)
ERIC Educational Resources Information Center
Turner, J. C.
1976-01-01
A classification of statistics courses is presented, with main categories of "course type," "methods of presentation," "objectives," and "syllabus." Examples and suggestions for uses of the classification are given. (DT)
Oztekin, Asil; Delen, Dursun; Kong, Zhenyu James
2009-12-01
Predicting the survival of heart-lung transplant patients has the potential to play a critical role in understanding and improving the matching procedure between the recipient and graft. Although voluminous data related to the transplantation procedures is being collected and stored, only a small subset of the predictive factors has been used in modeling heart-lung transplantation outcomes. The previous studies have mainly focused on applying statistical techniques to a small set of factors selected by the domain-experts in order to reveal the simple linear relationships between the factors and survival. The collection of methods known as 'data mining' offers significant advantages over conventional statistical techniques in dealing with the latter's limitations such as normality assumption of observations, independence of observations from each other, and linearity of the relationship between the observations and the output measure(s). There are statistical methods that overcome these limitations. Yet, they are computationally more expensive and do not provide fast and flexible solutions as do data mining techniques in large datasets. The main objective of this study is to improve the prediction of outcomes following combined heart-lung transplantation by proposing an integrated data-mining methodology. A large and feature-rich dataset (16,604 cases with 283 variables) is used to (1) develop machine learning based predictive models and (2) extract the most important predictive factors. Then, using three different variable selection methods, namely, (i) machine learning methods driven variables-using decision trees, neural networks, logistic regression, (ii) the literature review-based expert-defined variables, and (iii) common sense-based interaction variables, a consolidated set of factors is generated and used to develop Cox regression models for heart-lung graft survival. The predictive models' performance in terms of 10-fold cross-validation accuracy rates for two multi-imputed datasets ranged from 79% to 86% for neural networks, from 78% to 86% for logistic regression, and from 71% to 79% for decision trees. The results indicate that the proposed integrated data mining methodology using Cox hazard models better predicted the graft survival with different variables than the conventional approaches commonly used in the literature. This result is validated by the comparison of the corresponding Gains charts for our proposed methodology and the literature review based Cox results, and by the comparison of Akaike information criteria (AIC) values received from each. Data mining-based methodology proposed in this study reveals that there are undiscovered relationships (i.e. interactions of the existing variables) among the survival-related variables, which helps better predict the survival of the heart-lung transplants. It also brings a different set of variables into the scene to be evaluated by the domain-experts and be considered prior to the organ transplantation.
Teaching the Meaning of Statistical Techniques with Microcomputer Simulation.
ERIC Educational Resources Information Center
Lee, Motoko Y.; And Others
Students in an introductory statistics course are often preoccupied with learning the computational routines of specific summary statistics and thereby fail to develop an understanding of the meaning of those statistics or their conceptual basis. To help students develop a better understanding of the meaning of three frequently used statistics,…
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, together with process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The study provides a detailed discussion of the gaps between the process improvement and quantitative analysis techniques taught in U.S. systems engineering and computing science degree programs and the SEI's "healthy ingredients" of a process performance model, as well as gaps that exist in the literature. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
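A hedged sketch of the kind of Monte Carlo baseline-and-prediction exercise described above: the driver distributions, weights and percentile summaries are invented placeholders, not the ACSI model's actual structure or the dissertation's dashboard.

```python
import numpy as np

rng = np.random.default_rng(42)
n_runs = 100_000

# Hypothetical satisfaction drivers with assumed triangular distributions
# (the ranges and weights below are illustrative, not the ACSI coefficients).
quality     = rng.triangular(70, 82, 95, n_runs)
expectation = rng.triangular(65, 75, 90, n_runs)
value       = rng.triangular(60, 78, 92, n_runs)

weights = np.array([0.5, 0.2, 0.3])
csi = weights @ np.vstack([quality, expectation, value])

p5, p50, p95 = np.percentile(csi, [5, 50, 95])
print(f"baseline CSI: median={p50:.1f}, 90% interval=({p5:.1f}, {p95:.1f})")

# Crude sensitivity analysis: correlation of each driver with the simulated index.
for name, driver in [("quality", quality), ("expectation", expectation), ("value", value)]:
    print(name, round(np.corrcoef(driver, csi)[0, 1], 3))
```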
Confidence Intervals from Realizations of Simulated Nuclear Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Younes, W.; Ratkiewicz, A.; Ressler, J. J.
2017-09-28
Various statistical techniques are discussed that can be used to assign a level of confidence in the prediction of models that depend on input data with known uncertainties and correlations. The particular techniques reviewed in this paper are: 1) random realizations of the input data using Monte-Carlo methods, 2) the construction of confidence intervals to assess the reliability of model predictions, and 3) resampling techniques to impose statistical constraints on the input data based on additional information. These techniques are illustrated with a calculation of the keff value, based on the 235U(n, f) and 239Pu (n, f) cross sections.
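The three steps listed above can be illustrated with a small numerical sketch; the two-parameter "cross sections", their covariance, the linear response standing in for a keff calculation, and the Gaussian weighting used for resampling are all assumptions made for illustration, not the paper's evaluated data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Two correlated input quantities (stand-ins for evaluated cross sections)
# with assumed means, uncertainties and a correlation coefficient.
mean = np.array([1.00, 0.80])
sigma = np.array([0.03, 0.05])
rho = 0.6
cov = np.array([[sigma[0]**2, rho * sigma[0] * sigma[1]],
                [rho * sigma[0] * sigma[1], sigma[1]**2]])

# 1) Random realizations of the input data.
samples = rng.multivariate_normal(mean, cov, size=20_000)

# 2) Propagate through a model (here an arbitrary illustrative response).
k = 0.7 * samples[:, 0] + 0.4 * samples[:, 1]
lo, hi = np.percentile(k, [2.5, 97.5])
print(f"95% confidence interval on the model prediction: ({lo:.4f}, {hi:.4f})")

# 3) Resampling to impose an additional constraint: weight realizations by how
# well they reproduce an independent measurement of the response.
k_meas, k_unc = 1.02, 0.02
w = np.exp(-0.5 * ((k - k_meas) / k_unc) ** 2)
w /= w.sum()
idx = rng.choice(k.size, size=k.size, p=w)
lo_c, hi_c = np.percentile(k[idx], [2.5, 97.5])
print(f"constrained interval: ({lo_c:.4f}, {hi_c:.4f})")
```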
Metabolomics and Integrative Omics for the Development of Thai Traditional Medicine
Khoomrung, Sakda; Wanichthanarak, Kwanjeera; Nookaew, Intawat; Thamsermsang, Onusa; Seubnooch, Patcharamon; Laohapand, Tawee; Akarasereenont, Pravit
2017-01-01
In recent years, interest in studies of traditional medicine in Asian and African countries has gradually increased due to its potential to complement modern medicine. In this review, we provide an overview of the current development of Thai traditional medicine (TTM) and of ongoing TTM research activities related to metabolomics. The review also focuses on three important elements of systems biology analysis of TTM: analytical techniques, statistical approaches and bioinformatics tools for handling and analyzing untargeted metabolomics data. The main objective of this data analysis is to gain a comprehensive understanding of the system-wide effects that TTM has on individuals. Furthermore, potential applications of metabolomics and systems medicine in TTM are also discussed. PMID:28769804
Choice-Based Conjoint Analysis: Classification vs. Discrete Choice Models
NASA Astrophysics Data System (ADS)
Giesen, Joachim; Mueller, Klaus; Taneva, Bilyana; Zolliker, Peter
Conjoint analysis is a family of techniques that originated in psychology and later became popular in market research. The main objective of conjoint analysis is to measure an individual's or a population's preferences on a class of options that can be described by parameters and their levels. We consider preference data obtained in choice-based conjoint analysis studies, where one observes test persons' choices on small subsets of the options. There are many ways to analyze choice-based conjoint analysis data. Here we discuss the intuition behind a classification based approach, and compare this approach to one based on statistical assumptions (discrete choice models) and to a regression approach. Our comparison on real and synthetic data indicates that the classification approach outperforms the discrete choice models.
Data survey on the effect of product features on competitive advantage of selected firms in Nigeria.
Olokundun, Maxwell; Iyiola, Oladele; Ibidunni, Stephen; Falola, Hezekiah; Salau, Odunayo; Amaihian, Augusta; Peter, Fred; Borishade, Taiye
2018-06-01
The main objective of this study was to present a data article that investigates the effect of product features on a firm's competitive advantage. Few studies have examined how the features of a product could help in driving the competitive advantage of a firm. A descriptive research method was used. The Statistical Package for the Social Sciences (SPSS 22) was used for the analysis of one hundred and fifty (150) valid questionnaires completed by small business owners registered under the Small and Medium Enterprises Development Agency of Nigeria (SMEDAN). Stratified and simple random sampling techniques were employed; reliability and validity procedures were also confirmed. The field data set is made publicly available to enable critical or extended analysis.
HOS network-based classification of power quality events via regression algorithms
NASA Astrophysics Data System (ADS)
Palomares Salas, José Carlos; González de la Rosa, Juan José; Sierra Fernández, José María; Pérez, Agustín Agüera
2015-12-01
This work compares seven regression algorithms implemented in artificial neural networks (ANNs) supported by 14 power-quality features, which are based on higher-order statistics. Combining time and frequency domain estimators to deal with non-stationary measurement sequences, the final goal of the system is implementation in the future smart grid to guarantee compatibility between all connected equipment. The principal results are based on spectral kurtosis measurements, which easily adapt to the impulsive nature of power quality events. These results verify that the proposed technique is capable of offering interesting results for power quality (PQ) disturbance classification. The best results are obtained using radial basis networks, generalized regression, and multilayer perceptron, mainly due to the non-linear nature of the data.
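As a hedged sketch of one higher-order feature of the kind used above, the snippet below estimates spectral kurtosis per frequency bin from an STFT of a synthetic signal with a short impulsive disturbance; the signal, sampling rate, window length and estimator form are generic illustrative choices, not the paper's 14-feature set.

```python
import numpy as np
from scipy.signal import stft

fs = 3200.0                             # assumed sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
x = np.sin(2 * np.pi * 50 * t)          # 50 Hz mains component
x += 0.1 * np.random.default_rng(3).normal(size=t.size)
x[3000:3040] += 2.0                     # short impulsive disturbance

# Short-time Fourier transform: one spectrum per analysis window.
f, tt, Z = stft(x, fs=fs, nperseg=256)

# Spectral kurtosis per frequency bin: <|X|^4> / <|X|^2>^2 - 2, which is close
# to zero for stationary Gaussian noise and large for impulsive, non-stationary events.
p2 = np.mean(np.abs(Z) ** 2, axis=1)
p4 = np.mean(np.abs(Z) ** 4, axis=1)
sk = p4 / p2 ** 2 - 2.0

print("frequency of maximum spectral kurtosis: %.1f Hz" % f[np.argmax(sk)])
```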
ACCESS: The Arizona-CfA-Catolica Exoplanet Spectroscopy Survey
NASA Astrophysics Data System (ADS)
Lopez-Morales, Mercedes; Apai, Daniel; Jordan, Andres; Espinoza, Nestor; Rackham, Benjamin; Fraine, Jonathan D.; Rodler, Florian; Lewis, Nikole; Fortney, Jonathan J.; Osip, David J.
2014-06-01
The Arizona-CfA-Catolica Exoplanet Spectroscopy Survey (ACCESS) is an international, multi-institutional consortium with members from the Harvard-Smithsonian CfA, the University of Arizona, Pontificia Universidad Catolica in Chile, MIT and UC Santa Cruz and the Carnegie Institution. ACCESS' goal is to observe about two dozen planets covering a wide range of mass, radius, atmospheric temperatures and energy irradiation levels, with two main scientific goals: 1) to obtain, for the first time, a uniform sample of visible transmission spectra of exoplanets, allowing the study of their atmospheric characteristics as a statistically significant sample, and 2) to mature the technique of ground-based observations of exoplanetary atmospheres for future observations of small planets. Here we describe ACCESS and its first science results.
Creighton, Doug; Gruca, Mark; Marsh, Douglas; Murphy, Nancy
2014-11-01
Cervical mobilization and manipulation have been shown to improve cervical range of motion and pain. Rotatory thrust manipulation applied to the lower cervical segments is associated with controversy and the potential for eliciting adverse reactions (AR). The purpose of this clinical trial was to describe two translatory non-thrust mobilization techniques and evaluate their effect on cervical pain, motion restriction, and whether any adverse effects were reported when applied to the C7 segment. This trial included 30 participants with painful and restricted cervical rotation. Participants were randomly assigned to receive one of the two mobilization techniques. Active cervical rotation and pain intensity measurements were recorded pre- and post-intervention. Within group comparisons were determined using the Wilcoxon signed-rank test and between group comparisons were analyzed using the Mann-Whitney U test. Significance was set at P = 0.05. Thirty participants were evaluated immediately after one of the two mobilization techniques was applied. There was a statistically significant difference (improvement) for active cervical rotation after application of the C7 facet distraction technique for both right (P = 0.022) and left (P = 0.022) rotation. Statistically significant improvement was also found for the C7 facet gliding technique for both right (P = 0.022) and left rotation (P = 0.020). Pain reduction was statistically significant for both right and left rotation after application of both techniques. Both mobilization techniques produced similar positive effects and one was not statistically superior to the other. A single application of both C7 mobilization techniques improved active cervical rotation, reduced perceived pain, and did not produce any AR in 30 patients with neck pain and movement limitation. These two non-thrust techniques may offer clinicians an additional safe and effective manual intervention for patients with limited and painful cervical rotation. A more robust experimental design is recommended to further examine these and similar cervical translatory mobilization techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solaimani, Mohiuddin; Iftekhar, Mohammed; Khan, Latifur
Anomaly detection refers to the identification of an irregular or unusual pattern which deviates from what is standard, normal, or expected. Such deviated patterns typically correspond to samples of interest and are assigned different labels in different domains, such as outliers, anomalies, exceptions, or malware. Detecting anomalies in fast, voluminous streams of data is a formidable challenge. This paper presents a novel, generic, real-time distributed anomaly detection framework for heterogeneous streaming data where anomalies appear as a group. We have developed a distributed statistical approach to build a model and later use it to detect anomaly. As a case study, we investigate group anomaly detection for a VMware-based cloud data center, which maintains a large number of virtual machines (VMs). We have built our framework using Apache Spark to get higher throughput and lower data processing time on streaming data. We have developed a window-based statistical anomaly detection technique to detect anomalies that appear sporadically. We then relaxed this constraint with higher accuracy by implementing a cluster-based technique to detect sporadic and continuous anomalies. We conclude that our cluster-based technique outperforms other statistical techniques with higher accuracy and lower processing time.
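A minimal, single-machine sketch of a window-based statistical detector of the sort described above, flagging points whose z-score against a sliding window of recent history exceeds a threshold; the simulated VM metric, window length and threshold are assumptions, and the distributed Spark implementation is not reproduced here.

```python
import numpy as np
from collections import deque

def window_anomaly_detector(stream, window=60, z_thresh=3.0):
    """Yield (index, value) for points whose z-score against a sliding window
    of recent history exceeds a threshold; a simple window-based statistical model."""
    history = deque(maxlen=window)
    for i, x in enumerate(stream):
        if len(history) == window:
            mu, sd = np.mean(history), np.std(history) + 1e-9
            if abs(x - mu) / sd > z_thresh:
                yield i, x
        history.append(x)

rng = np.random.default_rng(5)
cpu_load = rng.normal(40, 5, 2000)      # simulated VM metric stream
cpu_load[1500:1510] += 40               # a short group anomaly

for idx, value in window_anomaly_detector(cpu_load):
    print(f"anomaly at t={idx}: {value:.1f}")
```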
Forest statistics for Maine: 1971 and 1982
Douglas S. Powell; David R. Dickson
1984-01-01
A statistical report on the third forest survey of Maine (1982) and reprocessed data from the second survey (1971). Results of the surveys are displayed in 169 tables containing estimates of forest and timberland area, numbers of trees, timber volume, tree biomass, timber products output, and components of average annual net change in growing-stock volume for the...
The Role of Statistics in Kosovo Enterprises
ERIC Educational Resources Information Center
Gjonbalaj, Muje; Dema, Marjan; Miftari, Iliriana
2009-01-01
Considering science as the main contributor to contemporary developments has encouraged us to raise a scientific discussion regarding the role of statistics in business decision-making and economic development. Statistics, as an applicative science, is growing and being widely applied in different fields and professions. Statistical thinking is…
Data Analysis Techniques for Physical Scientists
NASA Astrophysics Data System (ADS)
Pruneau, Claude A.
2017-10-01
Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.
Neyton, Lionel; Barth, Johannes; Nourissat, Geoffroy; Métais, Pierre; Boileau, Pascal; Walch, Gilles; Lafosse, Laurent
2018-05-19
To analyze graft and fixation (screw and EndoButton) positioning after the arthroscopic Latarjet technique with 2-dimensional computed tomography (CT) and to compare it with the open technique. We performed a retrospective multicenter study (March 2013 to June 2014). The inclusion criteria included patients with recurrent anterior instability treated with the Latarjet procedure. The exclusion criterion was the absence of a postoperative CT scan. The positions of the hardware, the positions of the grafts in the axial and sagittal planes, and the dispersion of values (variability) were compared. The study included 208 patients (79 treated with open technique, 87 treated with arthroscopic Latarjet technique with screw fixation [arthro-screw], and 42 treated with arthroscopic Latarjet technique with EndoButton fixation [arthro-EndoButton]). The angulation of the screws was different in the open group versus the arthro-screw group (superior, 10.3° ± 0.7° vs 16.9° ± 1.0° [P < .001]; inferior, 10.3° ± 0.8° vs 15.7° ± 0.9° [P < .0001]). The angulation of the EndoButtons was 5.7° ± 0.5°; this was different from that of open inferior screws (P = .003). In the axial plane (level of equator), the arthroscopic techniques resulted in lateral positions (arthro-screw, 1.5 ± 0.3 mm lateral [P < .001]; arthro-EndoButton, 0 ± 0.3 mm lateral [P < .0001]) versus the open technique (0.9 ± 0.2 mm medial). At the level of 25% of the glenoid height, the arthroscopic techniques resulted in lateral positions (arthro-screw, 0.3 ± 0.3 mm lateral [P < .001]); (arthro-EndoButton, 0.7 ± 0.3 mm lateral [P < .0001]) versus the open technique (1.0 ± 0.2 mm medial). Higher variability was observed in the arthro-screw group. In the sagittal plane, the arthro-screw technique resulted in higher positions (55% ± 3% of graft below equator) and the arthro-EndoButton technique resulted in lower positions (82% ± 3%, P < .0001) versus the open technique (71% ± 2%). Variability was not different. This study shows that the position of the fixation devices and position of the bone graft with the arthroscopic techniques are statistically significantly different from those with the open technique with 2-dimensional CT assessment. In the sagittal plane, the arthro-screw technique provides the highest positions, and the arthro-EndoButton technique, the lowest. Overall, the mean position of the bone block with the open Latarjet technique in the axial plane is slightly medial to the joint line, as recommended. Conversely, with the arthroscopic techniques, the bone grafts are more lateral with a slight overhang. The main differences are observed in the dispersion of the values (more extreme positions) with the arthro-screw technique, given the acknowledged limitations. Despite the statistical significance, the clinical significance of these differences is yet unknown. Level III, retrospective comparative study. Copyright © 2018 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
Alasbali, Tariq; Smith, Michael; Geffen, Noa; Trope, Graham E; Flanagan, John G; Jin, Yaping; Buys, Yvonne M
2009-01-01
To investigate the relationship between industry- vs nonindustry-funded publications comparing the efficacy of topical prostaglandin analogs by evaluating the correspondence between the statistical significance of the publication's main outcome measure and its abstract conclusions. Retrospective, observational cohort study. English publications comparing the ocular hypotensive efficacy between any or all of latanoprost, travoprost, and bimatoprost were searched from the MEDLINE database. Each article was reviewed by three independent observers and was evaluated for source of funding, study quality, statistically significant main outcome measure, correspondence between results of main outcome measure and abstract conclusion, number of intraocular pressure outcomes compared, and journal impact factor. Funding was determined by published disclosure or, in cases of no documented disclosure, the corresponding author was contacted directly to confirm industry funding. Discrepancies were resolved by consensus. The main outcome measure was correspondence between abstract conclusion and reported statistical significance of the publications' main outcome measure. Thirty-nine publications were included, of which 29 were industry funded and 10 were nonindustry funded. The published abstract conclusion was not consistent with the results of the main outcome measure in 18 (62%) of 29 of the industry-funded studies compared with zero (0%) of 10 of the nonindustry-funded studies (P = .0006). Twenty-six (90%) of the industry-funded studies had proindustry abstract conclusions. Twenty-four percent of the industry-funded publications had a statistically significant main outcome measure; however, 90% of the industry-funded studies had proindustry abstract conclusions. Both readers and reviewers should scrutinize publications carefully to ensure that data support the authors' conclusions.
Comparison of simulation modeling and satellite techniques for monitoring ecological processes
NASA Technical Reports Server (NTRS)
Box, Elgene O.
1988-01-01
In 1985 improvements were made in the world climatic data base for modeling and predictive mapping; in individual process models and the overall carbon-balance models; and in the interface software for mapping the simulation results. Statistical analysis of the data base was begun. In 1986 mapping was shifted to NASA-Goddard. The initial approach involving pattern comparisons was modified to a more statistical approach. A major accomplishment was the expansion and improvement of a global data base of measurements of biomass and primary production, to complement the simulation data. The main accomplishments during 1987 included: production of a master tape with all environmental and satellite data and model results for the 1600 sites; development of a complete mapping system used for the initial color maps comparing annual and monthly patterns of Normalized Difference Vegetation Index (NDVI), actual evapotranspiration, net primary productivity, gross primary productivity, and net ecosystem production; collection of more biosphere measurements for eventual improvement of the biological models; and development of some initial monthly models for primary productivity, based on satellite data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef, E-mail: ulm@mit.edu
2014-01-15
According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. Highlights: a new method of clinker characterization; combination of the electron probe technique with cluster analysis; simultaneous assessment of phase abundance, composition and bulk chemistry; experimental validation performed on industrial clinkers.
[Basic concepts for network meta-analysis].
Catalá-López, Ferrán; Tobías, Aurelio; Roqué, Marta
2014-12-01
Systematic reviews and meta-analyses have long been fundamental tools for evidence-based clinical practice. Initially, meta-analyses were proposed as a technique that could improve the accuracy and the statistical power of previous research from individual studies with small sample sizes. However, one of their main limitations has been that no more than two treatments can be compared in a single analysis, even when the clinical research question requires comparing multiple interventions. Network meta-analysis (NMA) uses novel statistical methods that incorporate information from both direct and indirect treatment comparisons in a network of studies examining the effects of various competing treatments, estimating comparisons between many treatments in a single analysis. Despite its potential limitations, NMA applications in clinical epidemiology can be of great value in situations where several treatments have been compared against a common comparator. NMA can also be relevant to a research or clinical question when many treatments must be considered or when there is a mix of both direct and indirect information in the body of evidence. Copyright © 2013 Elsevier España, S.L.U. All rights reserved.
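The simplest building block of the direct-plus-indirect logic described above is the adjusted indirect comparison through a common comparator; the sketch below shows it on invented log odds ratios and is not a full network meta-analysis.

```python
import math

def indirect_comparison(d_AC, se_AC, d_BC, se_BC):
    """Bucher-style adjusted indirect comparison of treatments A and B through a
    common comparator C, on a scale where effects are additive (e.g. log odds ratio)."""
    d_AB = d_AC - d_BC
    se_AB = math.sqrt(se_AC ** 2 + se_BC ** 2)
    ci = (d_AB - 1.96 * se_AB, d_AB + 1.96 * se_AB)
    return d_AB, se_AB, ci

# Illustrative log odds ratios of A vs C and B vs C from two sets of trials.
d_AB, se_AB, ci = indirect_comparison(d_AC=-0.40, se_AC=0.15, d_BC=-0.10, se_BC=0.20)
print(f"A vs B (indirect): {d_AB:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```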
Robust misinterpretation of confidence intervals.
Hoekstra, Rink; Morey, Richard D; Rouder, Jeffrey N; Wagenmakers, Eric-Jan
2014-10-01
Null hypothesis significance testing (NHST) is undoubtedly the most common inferential technique used to justify claims in the social sciences. However, even staunch defenders of NHST agree that its outcomes are often misinterpreted. Confidence intervals (CIs) have frequently been proposed as a more useful alternative to NHST, and their use is strongly encouraged in the APA Manual. Nevertheless, little is known about how researchers interpret CIs. In this study, 120 researchers and 442 students-all in the field of psychology-were asked to assess the truth value of six particular statements involving different interpretations of a CI. Although all six statements were false, both researchers and students endorsed, on average, more than three statements, indicating a gross misunderstanding of CIs. Self-declared experience with statistics was not related to researchers' performance, and, even more surprisingly, researchers hardly outperformed the students, even though the students had not received any education on statistical inference whatsoever. Our findings suggest that many researchers do not know the correct interpretation of a CI. The misunderstandings surrounding p-values and CIs are particularly unfortunate because they constitute the main tools by which psychologists draw conclusions from data.
Long-Term Marine Traffic Monitoring for Environmental Safety in the Aegean Sea
NASA Astrophysics Data System (ADS)
Giannakopoulos, T.; Gyftakis, S.; Charou, E.; Perantonis, S.; Nivolianitou, Z.; Koromila, I.; Makrygiorgos, A.
2015-04-01
The Aegean Sea is characterized by an extremely high marine safety risk, mainly due to the significant increase of the traffic of tankers from and to the Black Sea that pass through narrow straits formed by the 1600 Greek islands. Reducing the risk of a ship accident is therefore vital to all socio-economic and environmental sectors. This paper presents an online long-term marine traffic monitoring work-flow that focuses on extracting aggregated vessel risks using spatiotemporal analysis of multilayer information: vessel trajectories, vessel data, meteorological data, bathymetric / hydrographic data as well as information regarding environmentally important areas (e.g. protected high-risk areas, etc.). A web interface that enables user-friendly spatiotemporal queries is implemented at the frontend, while a series of data mining functionalities extracts aggregated statistics regarding: (a) marine risks and accident probabilities for particular areas (b) trajectories clustering information (c) general marine statistics (cargo types, etc.) and (d) correlation between spatial environmental importance and marine traffic risk. Towards this end, a set of data clustering and probabilistic graphical modelling techniques has been adopted.
Failure Analysis by Statistical Techniques (FAST). Volume 1. User’s Manual
1974-10-31
Failure Analysis by Statistical Techniques (FAST), Volume I, User's Manual (report number DNA 3336F-1). ...(SS2), and a facility (SS7). The other three diagrams break down the three critical subsystems. The median probability of survival of the
USDA-ARS?s Scientific Manuscript database
The mixed linear model (MLM) is currently among the most advanced and flexible statistical modeling techniques and its use in tackling problems in plant pathology has begun surfacing in the literature. The longitudinal MLM is a multivariate extension that handles repeatedly measured data, such as r...
ERIC Educational Resources Information Center
Martin, James L.
This paper reports on attempts by the author to construct a theoretical framework of adult education participation using a theory development process and the corresponding multivariate statistical techniques. Two problems are identified: the lack of theoretical framework in studying problems, and the limiting of statistical analysis to univariate…
ERIC Educational Resources Information Center
Vivo, Juana-Maria; Franco, Manuel
2008-01-01
This article attempts to present a novel application of a method of measuring accuracy for academic success predictors that could be used as a standard. This procedure is known as the receiver operating characteristic (ROC) curve, which comes from statistical decision techniques. The statistical prediction techniques provide predictor models and…
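A hedged sketch of the ROC procedure mentioned above, using simulated predictor scores and success labels rather than real academic data; the score distributions and the Youden-index cut-off rule are assumptions for illustration.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(11)

# Simulated predictor score (e.g. an entrance test) and academic success labels.
n = 400
success = rng.integers(0, 2, size=n)
score = rng.normal(loc=np.where(success == 1, 65, 55), scale=10)

fpr, tpr, thresholds = roc_curve(success, score)
auc = roc_auc_score(success, score)

# The Youden index picks the threshold that best separates the two groups.
best = np.argmax(tpr - fpr)
print(f"AUC = {auc:.2f}, suggested cut-off score = {thresholds[best]:.1f}")
```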
Statistical Techniques Used in Published Articles: A Historical Review of Reviews
ERIC Educational Resources Information Center
Skidmore, Susan Troncoso; Thompson, Bruce
2010-01-01
The purpose of the present study is to provide a historical account and metasynthesis of which statistical techniques are most frequently used in the fields of education and psychology. Six articles reviewing the "American Educational Research Journal" from 1969 to 1997 and five articles reviewing the psychological literature from 1948 to 2001…
A Technique for Merging Areas in Timber Mart-South Data
Jeffrey P. Prestemon; John M. Pye
2000-01-01
For over 20 yr, TimberMart-South (TMS) has been distributing prices of various wood products from southern forests. At the beginning of 1988, the reporting frequency changed from monthly to quarterly, a change readily addressed through a variety of established statistical techniques. A more significant statistical challenge is Timber Mart-South's change in 1992 from...
NASA Technical Reports Server (NTRS)
Djorgovski, George
1993-01-01
The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.
NASA Astrophysics Data System (ADS)
Poulain, Pierre-Marie; Luther, Douglas S.; Patzert, William C.
1992-11-01
Two techniques have been developed for estimating statistics of inertial oscillations from satellite-tracked drifters. These techniques overcome the difficulties inherent in estimating such statistics from data dependent upon space coordinates that are a function of time. Application of these techniques to tropical surface drifter data collected during the NORPAX, EPOCS, and TOGA programs reveals a latitude-dependent, statistically significant "blue shift" of inertial wave frequency. The latitudinal dependence of the blue shift is similar to predictions based on "global" internal wave spectral models, with a superposition of frequency shifting due to modification of the effective local inertial frequency by the presence of strongly sheared zonal mean currents within 12° of the equator.
NASA Technical Reports Server (NTRS)
Djorgovski, Stanislav
1992-01-01
The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.
Shaikh, Muhammad Mujtaba; Memon, Abdul Jabbar; Hussain, Manzoor
2016-09-01
In this article, we describe details of the data used in the research paper "Confidence bounds for energy conservation in electric motors: An economical solution using statistical techniques" [1]. The data presented in this paper is intended to show benefits of high efficiency electric motors over the standard efficiency motors of similar rating in the industrial sector of Pakistan. We explain how the data was collected and then processed by means of formulas to show cost effectiveness of energy efficient motors in terms of three important parameters: annual energy saving, cost saving and payback periods. This data can be further used to construct confidence bounds for the parameters using statistical techniques as described in [1].
Empirical performance of interpolation techniques in risk-neutral density (RND) estimation
NASA Astrophysics Data System (ADS)
Bahaludin, H.; Abdullah, M. H.
2017-03-01
The objective of this study is to evaluate the empirical performance of interpolation techniques in risk-neutral density (RND) estimation. Firstly, the empirical performance is evaluated using statistical analysis based on the implied mean and the implied variance of the RND. Secondly, the interpolation performance is measured based on pricing error. We propose using the leave-one-out cross-validation (LOOCV) pricing error for interpolation selection purposes. The statistical analyses indicate that there are statistical differences between the interpolation techniques: second-order polynomial, fourth-order polynomial and smoothing spline. The LOOCV pricing errors show that interpolation using the fourth-order polynomial provides the best fit to option prices, as it has the lowest error value.
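A hedged sketch of the LOOCV idea for choosing an interpolation order, applied to a synthetic implied-volatility smile rather than the study's option data; the strikes, smile shape and candidate polynomial degrees are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative implied-volatility smile sampled at a few strikes (synthetic data).
strikes = np.linspace(80, 120, 15)
vols = 0.25 + 0.00035 * (strikes - 100) ** 2 + rng.normal(0, 0.004, strikes.size)

def loocv_error(degree):
    """Leave-one-out cross-validated RMSE of a polynomial fit of a given degree."""
    errs = []
    for i in range(strikes.size):
        mask = np.arange(strikes.size) != i
        coefs = np.polyfit(strikes[mask], vols[mask], degree)
        errs.append((np.polyval(coefs, strikes[i]) - vols[i]) ** 2)
    return np.sqrt(np.mean(errs))

for degree in (2, 4, 6):
    print(f"degree {degree}: LOOCV RMSE = {loocv_error(degree):.5f}")
```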
NASA Astrophysics Data System (ADS)
Batté, Lauriane; Déqué, Michel
2016-06-01
Stochastic methods are increasingly used in global coupled model climate forecasting systems to account for model uncertainties. In this paper, we describe in more detail the stochastic dynamics technique introduced by Batté and Déqué (2012) in the ARPEGE-Climate atmospheric model. We present new results with an updated version of CNRM-CM using ARPEGE-Climate v6.1, and show that the technique can be used both as a means of analyzing model error statistics and accounting for model inadequacies in a seasonal forecasting framework. The perturbations are designed as corrections of model drift errors estimated from a preliminary weakly nudged re-forecast run over an extended reference period of 34 boreal winter seasons. A detailed statistical analysis of these corrections is provided, and shows that they are mainly made of intra-month variance, thereby justifying their use as in-run perturbations of the model in seasonal forecasts. However, the interannual and systematic error correction terms cannot be neglected. Time correlation of the errors is limited, but some consistency is found between the errors of up to 3 consecutive days. These findings encourage us to test several settings of the random draws of perturbations in seasonal forecast mode. Perturbations are drawn randomly but consistently for all three prognostic variables perturbed. We explore the impact of using monthly mean perturbations throughout a given forecast month in a first ensemble re-forecast (SMM, for stochastic monthly means), and test the use of 5-day sequences of perturbations in a second ensemble re-forecast (S5D, for stochastic 5-day sequences). Both experiments are compared in the light of a REF reference ensemble with initial perturbations only. Results in terms of forecast quality are contrasted depending on the region and variable of interest, but very few areas exhibit a clear degradation of forecasting skill with the introduction of stochastic dynamics. We highlight some positive impacts of the method, mainly on Northern Hemisphere extra-tropics. The 500 hPa geopotential height bias is reduced, and improvements project onto the representation of North Atlantic weather regimes. A modest impact on ensemble spread is found over most regions, which suggests that this method could be complemented by other stochastic perturbation techniques in seasonal forecasting mode.
Investigation of advanced phase-shifting projected fringe profilometry techniques
NASA Astrophysics Data System (ADS)
Liu, Hongyu
1999-11-01
The phase-shifting projected fringe profilometry (PSPFP) technique is a powerful tool in the profile measurements of rough engineering surfaces. Compared with other competing techniques, this technique is notable for its full-field measurement capacity, system simplicity, high measurement speed, and low environmental vulnerability. The main purpose of this dissertation is to tackle, with some new approaches, three important problems which severely limit the capability and the accuracy of the PSPFP technique. Chapter 1 briefly introduces background information on the PSPFP technique, including the measurement principles, basic features, and related techniques. The objectives and organization of the thesis are also outlined. Chapter 2 gives a theoretical treatment of absolute PSPFP measurement. The mathematical formulations and basic requirements of the absolute PSPFP measurement and its supporting techniques are discussed in detail. Chapter 3 introduces the experimental verification of the proposed absolute PSPFP technique. Some design details of a prototype system are discussed as supplements to the previous theoretical analysis. Various fundamental experiments performed for concept verification and accuracy evaluation are introduced together with some brief comments. Chapter 4 presents the theoretical study of speckle-induced phase measurement errors. In this analysis, the expression for speckle-induced phase errors is first derived based on the multiplicative noise model of image-plane speckles. The statistics and the system dependence of speckle-induced phase errors are then thoroughly studied through numerical simulations and analytical derivations. Based on the analysis, some suggestions on the system design are given to improve measurement accuracy. Chapter 5 discusses a new technique combating surface reflectivity variations. The formula used for error compensation is first derived based on a simplified model of the detection process. The techniques coping with two major effects of surface reflectivity variations are then introduced. Some fundamental problems in the proposed technique are studied through simulations. Chapter 6 briefly summarizes the major contributions of the current work and provides some suggestions for future research.
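A hedged sketch of the standard four-step phase-shifting calculation that underlies PSPFP, applied to simulated fringe images; the surface phase, background and modulation values are invented, and the dissertation's specific absolute-measurement and error-compensation procedures are not reproduced.

```python
import numpy as np

# Simulate four fringe images with phase shifts of 0, pi/2, pi and 3*pi/2.
x, y = np.meshgrid(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
true_phase = 6 * np.pi * (x ** 2 + 0.5 * y)      # illustrative surface-induced phase
a, b = 0.5, 0.4                                  # assumed background and modulation
shifts = [0, np.pi / 2, np.pi, 3 * np.pi / 2]
I1, I2, I3, I4 = [a + b * np.cos(true_phase + s) for s in shifts]

# Standard four-step phase-shifting formula (wrapped phase in (-pi, pi]).
wrapped = np.arctan2(I4 - I2, I1 - I3)

# Compare against the true phase modulo 2*pi.
err = np.angle(np.exp(1j * (wrapped - true_phase)))
print("max wrapping-consistent error:", np.abs(err).max())
```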
Radar error statistics for the space shuttle
NASA Technical Reports Server (NTRS)
Lear, W. M.
1979-01-01
Radar error statistics of C-band and S-band that are recommended for use with the groundtracking programs to process space shuttle tracking data are presented. The statistics are divided into two parts: bias error statistics, using the subscript B, and high frequency error statistics, using the subscript q. Bias errors may be slowly varying to constant. High frequency random errors (noise) are rapidly varying and may or may not be correlated from sample to sample. Bias errors were mainly due to hardware defects and to errors in correction for atmospheric refraction effects. High frequency noise was mainly due to hardware and due to atmospheric scintillation. Three types of atmospheric scintillation were identified: horizontal, vertical, and line of sight. This was the first time that horizontal and line of sight scintillations were identified.
Statistics Report on TEQSA Registered Higher Education Providers
ERIC Educational Resources Information Center
Australian Government Tertiary Education Quality and Standards Agency, 2015
2015-01-01
This statistics report provides a comprehensive snapshot of national statistics on all parts of the sector for the year 2013, by bringing together data collected directly by TEQSA with data sourced from the main higher education statistics collections managed by the Australian Government Department of Education and Training. The report provides…
Electron dropout echoes induced by interplanetary shock: A statistical study
NASA Astrophysics Data System (ADS)
Liu, Z. Y.; Zong, Q.-G.; Hao, Y. X.; Zhou, X.-Z.; Ma, X. H.; Liu, Y.
2017-08-01
"Electron dropout echo" as indicated by repeated moderate dropout and recovery signatures of the flux of energetic electron in the outer radiation belt region has been investigated systematically. The electron moderate dropout and its echoes are usually found for higher-energy (>300 keV) channel fluxes, whereas the flux enhancements are obvious for lower energy electrons simultaneously after the interplanetary shock arrives at the Earth's geosynchronous orbit. The electron dropout echo events are found to be usually associated with the interplanetary shocks arrival. The 104 dropout echo events have been found from 215 interplanetary shock events from 1998 to 2007 based on the Los Alamos National Laboratory satellite data. In analogy to substorm injections, these 104 events could be naturally divided into two categories: dispersionless (49 events) or dispersive (55 events) according to the energy dispersion of the initial dropout. It is found that locations of dispersionless events are distributed mainly in the duskside magnetosphere. Further, the obtained locations derived from dispersive events with the time-of-flight technique of the initial dropout regions are mainly located at the duskside as well. Statistical studies have shown that the effect of shock normal, interplanetary magnetic field Bz and solar wind dynamic pressure may be insignificant to these electron dropout events. We suggest that the ˜1 min electric field impulse induced by the interplanetary shock produces a more pronounced inward migration of electrons at the duskside, resulting in the observed duskside moderate dropout of electron flux and its consequent echoes.
Grouping of Bulgarian wines according to grape variety by using statistical methods
NASA Astrophysics Data System (ADS)
Milev, M.; Nikolova, Kr.; Ivanova, Ir.; Minkova, St.; Evtimov, T.; Krustev, St.
2017-12-01
68 different types of Bulgarian wines were studied with respect to 9 optical parameters: color parameters in the XYZ and CIE Lab color systems, lightness, hue angle, chroma, fluorescence intensity and emission wavelength. The main objective of this research is to use hierarchical cluster analysis to evaluate the similarity and the distance between the examined types of Bulgarian wines and to group them on the basis of physical parameters. We found that the wines are grouped into clusters according to the degree of identity between them. There are two main clusters, each with two subclusters: the first contains white wines and Sira, the second contains red wines and rose. The results of the cluster analysis are presented graphically as a dendrogram. The other statistical technique used is factor analysis performed by the method of principal components (PCA). The aim is to reduce the large number of variables to a few factors by grouping the correlated variables into one factor and subdividing the noncorrelated variables into different factors. Moreover, the factor analysis makes it possible to determine the parameters with the greatest influence on the distribution of samples in the different clusters. In our study, after rotation of the factors with the varimax method, the parameters were combined into two factors, which explain about 80% of the total variation. The first explains 61.49% and correlates with the color characteristics; the second explains 18.34% of the variation and correlates with the parameters connected with fluorescence spectroscopy.
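A hedged sketch of the same two-step workflow (hierarchical clustering followed by PCA) on synthetic data standing in for the 68 wines and 9 optical parameters; the group structure, Ward linkage and two-component summary are assumptions for illustration, not the study's measurements.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(8)

# Synthetic stand-in: 68 samples x 9 optical parameters with two underlying groups
# (e.g. red-like and white-like optical profiles).
group = rng.integers(0, 2, 68)
X = rng.normal(size=(68, 9)) + group[:, None] * np.array([3, 2, 1, 0, 0, 0, 1, 2, 1])

Xs = StandardScaler().fit_transform(X)

# Hierarchical (Ward) clustering, cut into two main clusters.
Z = linkage(Xs, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")

# PCA to check how much variance two components explain (cf. two varimax factors).
pca = PCA(n_components=2).fit(Xs)
print("cluster sizes:", np.bincount(labels)[1:])
print("variance explained by two components:", pca.explained_variance_ratio_.sum().round(2))
```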
Soares Medeiros, Lia Carolina; De Souza, Wanderley; Jiao, Chengge; Barrabin, Hector; Miranda, Kildare
2012-01-01
Different methods for three-dimensional visualization of biological structures have been developed and extensively applied by different research groups. In the field of electron microscopy, a new technique that has emerged is the use of a focused ion beam and scanning electron microscopy for 3D reconstruction at nanoscale resolution. The higher extent of volume that can be reconstructed with this instrument represent one of the main benefits of this technique, which can provide statistically relevant 3D morphometrical data. As the life cycle of Plasmodium species is a process that involves several structurally complex developmental stages that are responsible for a series of modifications in the erythrocyte surface and cytoplasm, a high number of features within the parasites and the host cells has to be sampled for the correct interpretation of their 3D organization. Here, we used FIB-SEM to visualize the 3D architecture of multiple erythrocytes infected with Plasmodium chabaudi and analyzed their morphometrical parameters in a 3D space. We analyzed and quantified alterations on the host cells, such as the variety of shapes and sizes of their membrane profiles and parasite internal structures such as a polymorphic organization of hemoglobin-filled tubules. The results show the complex 3D organization of Plasmodium and infected erythrocyte, and demonstrate the contribution of FIB-SEM for the obtainment of statistical data for an accurate interpretation of complex biological structures. PMID:22432024
Das, D K; Maiti, A K; Chakraborty, C
2015-03-01
In this paper, we propose a comprehensive image characterization and classification framework for malaria-infected stage detection using microscopic images of thin blood smears. The methodology mainly includes microscopic imaging of Leishman-stained blood slides, noise reduction and illumination correction, erythrocyte segmentation, and feature selection followed by machine classification. Amongst three image segmentation algorithms (namely, rule-based, Chan-Vese-based and marker-controlled watershed methods), the marker-controlled watershed technique provides better boundary detection of erythrocytes, especially in overlapping situations. Microscopic features at the intensity, texture and morphology levels are extracted to discriminate infected and noninfected erythrocytes. In order to obtain a subgroup of potential features, feature selection techniques, namely the F-statistic and information gain criteria, are considered here for ranking. Finally, five different classifiers, namely Naive Bayes, multilayer perceptron neural network, logistic regression, classification and regression tree (CART), and RBF neural network, have been trained and tested on 888 erythrocytes (infected and noninfected) for each feature subset. Performance evaluation of the proposed methodology shows that the multilayer perceptron network provides higher accuracy for malaria-infected erythrocyte recognition and infected stage classification. Results show that the top 90 features ranked by the F-statistic (specificity: 98.64%, sensitivity: 100%, PPV: 99.73% and overall accuracy: 96.84%) and the top 60 features ranked by information gain (specificity: 97.29%, sensitivity: 100%, PPV: 99.46% and overall accuracy: 96.73%) provide better results for malaria-infected stage classification. © 2014 The Authors Journal of Microscopy © 2014 Royal Microscopical Society.
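A hedged sketch of the F-statistic ranking plus neural-network classification step on synthetic data; the feature matrix, the choice of 90 retained features and the MLP architecture are stand-ins, not the paper's extracted microscopic features or tuned models.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for 888 erythrocytes described by intensity/texture/morphology features.
X, y = make_classification(n_samples=888, n_features=120, n_informative=30,
                           n_classes=2, random_state=0)

# Rank features by the F-statistic, keep the top 90, then train an MLP classifier.
model = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=90),
    MLPClassifier(hidden_layer_sizes=(50,), max_iter=1000, random_state=0),
)
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```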
Patterns and causes of geographic variation in bat echolocation pulses.
Jiang, Tinglei; Wu, Hui; Feng, Jiang
2015-05-01
Evolutionary biologists have a long-standing interest in how acoustic signals in animals vary geographically, because divergent ecology and sensory perception play an important role in speciation. Geographic comparisons are valuable in determining the factors that influence divergence of acoustic signals. Bats are social mammals and they depend mainly on echolocation pulses to locate prey, to navigate and to communicate. Mounting evidence shows that geographic variation of bat echolocation pulses is common, with mean differences of 5-10 kHz in peak frequency, and a high level of individual variation may be nested within this geographic variation. However, understanding the geographic variation of echolocation pulses in bats is very difficult, because of differences in sampling and statistical analysis techniques as well as the variety of factors shaping vocal geographic evolution. Geographic differences in echolocation pulses of bats generally lack latitudinal, longitudinal and elevational patterns, and little is known about vocal dialects. Evidence is accumulating that geographic variation in echolocation pulses of bats may be caused by genetic drift, cultural drift, ecological selection, sexual selection and social selection. Future studies could relate geographic differences in echolocation pulses to social adaptation, vocal learning strategies and patterns of dispersal. In addition, new statistical techniques and acoustic playback experiments may help to illustrate the causes and consequences of the geographic evolution of echolocation pulses in bats. © 2015 International Society of Zoological Sciences, Institute of Zoology/Chinese Academy of Sciences and Wiley Publishing Asia Pty Ltd.
Moment-based metrics for global sensitivity analysis of hydrological systems
NASA Astrophysics Data System (ADS)
Dell'Oca, Aronne; Riva, Monica; Guadagnini, Alberto
2017-12-01
We propose new metrics to assist global sensitivity analysis, GSA, of hydrological and Earth systems. Our approach allows assessing the impact of uncertain parameters on main features of the probability density function, pdf, of a target model output, y. These include the expected value of y, the spread around the mean and the degree of symmetry and tailedness of the pdf of y. Since reliable assessment of higher-order statistical moments can be computationally demanding, we couple our GSA approach with a surrogate model, approximating the full model response at a reduced computational cost. Here, we consider the generalized polynomial chaos expansion (gPCE), other model reduction techniques being fully compatible with our theoretical framework. We demonstrate our approach through three test cases, including an analytical benchmark, a simplified scenario mimicking pumping in a coastal aquifer and a laboratory-scale conservative transport experiment. Our results allow ascertaining which parameters can impact some moments of the model output pdf while being uninfluential to others. We also investigate the error associated with the evaluation of our sensitivity metrics by replacing the original system model through a gPCE. Our results indicate that the construction of a surrogate model with increasing level of accuracy might be required depending on the statistical moment considered in the GSA. The approach is fully compatible with (and can assist the development of) analysis techniques employed in the context of reduction of model complexity, model calibration, design of experiment, uncertainty quantification and risk assessment.
Jabari, Hamidreza; Sami, Ramin; Fakhri, Mohammad; Kiani, Arda
2012-01-01
Forceps biopsy is the standard procedure to obtain specimens from endobronchial lesions. New studies have proposed the flexible cryoprobe as an accepted alternative method to this technique. Although the diagnostic use of cryobiopsy has been confirmed in a few studies, there is a paucity of data with regard to an optimum protocol for this method, since one of the main considerations in cryobiopsy is the freezing time. The aims were to evaluate the diagnostic yield and safety of endobronchial biopsies using the flexible cryoprobe and to assess different freezing times in order to propose an optimized protocol for this diagnostic modality. For each patient with a confirmed intrabronchial lesion, the diagnostic value of forceps biopsy, cryobiopsy at three seconds, cryobiopsy at five seconds and the combined results of cryobiopsy at both timings were recorded. A total of 60 patients (39 males and 21 females; mean age 56.7 +/- 13.3) were included. Specimens obtained by cryobiopsy at five seconds were significantly larger than those of forceps biopsy and cryobiopsy at three seconds (p < 0.001). We showed that the diagnostic yields achieved for all three methods were not statistically different (p > 0.05). Simultaneous use of the samples produced by both cryobiopsies significantly improved the diagnostic yield (p = 0.02). Statistical analysis showed no significant differences in bleeding frequency among the three sampling methods. This study confirmed the safety and feasibility of cryobiopsy. Additionally, combining sampling with two different cold induction timings would significantly increase the sensitivity of this emerging technique.
NASA Astrophysics Data System (ADS)
Platonov, Vladimir; Kislov, Alexander; Rivin, Gdaly; Varentsov, Mikhail; Rozinkina, Inna; Nikitin, Mikhail; Chumakov, Mikhail
2017-04-01
Detailed hydrodynamic modelling of meteorological parameters over the last 30 years (1985-2014) was performed for the Okhotsk Sea and Sakhalin Island regions. The regional non-hydrostatic atmospheric model COSMO-CLM was used for this long-term simulation with 13.2, 6.6 and 2.2 km horizontal resolutions. The main objective of creating this dataset was to enable investigation of the statistical characteristics and physical mechanisms of extreme weather events (primarily wind speed extremes) on small spatio-temporal scales. COSMO-CLM is the climate version of the well-known mesoscale COSMO model, including some modifications and extensions adapting it to long-term numerical experiments. A downscaling technique was developed and applied for the long-term simulations with three consecutive nested domains. ERA-Interim reanalysis (0.75 degree resolution) was used as global forcing data for the starting domain (13.2 km horizontal resolution); these simulation data were then used as initial and boundary conditions for the next model run over the domain with 6.6 km resolution, and similarly for the next step to the 2.2 km domain. In addition, the COSMO-CLM model configuration for the 13.2 km run included the spectral nudging technique, i.e. an additional assimilation of reanalysis data not only at the boundaries but also inside the whole domain. Practically, this computational scheme was realized on the SGI Altix 4700 supercomputer system at the Main Computer Center of Roshydromet and used 2,400 hours of CPU time in total. The obtained dataset was verified against observation data. The estimates showed a mean temperature error of -0.5 °C with an RMSE of up to 2-3 °C, and an overestimation of wind speed (RMSE up to 2 m/s). Overall, the analysis showed that the downscaling technique with the COSMO-CLM model adequately reproduced the meteorological conditions, spatial distribution, and seasonal and synoptic variability of temperature and wind speed for the study area. Dependences between the reproduction quality of mesoscale atmospheric circulation features and the horizontal resolution of the model were revealed. In particular, it is shown that the use of 6 km resolution does not give any significant improvement compared to 13 km resolution, whereas 2.2 km resolution provides an appreciable quality enhancement. Detailed synoptic analysis of extreme wind speed situations identified the main types favorable to their genesis, associated with cyclones developing over the Japan Islands or the Primorsky Kray of Russia, and with intensified cyclones penetrating from the Pacific Ocean through the Kamchatka Peninsula, Kuril Islands or Japan Islands. The obtained dataset will continue to be used for a full and comprehensive analysis of the reproduction quality of hydrometeorological fields, their statistical estimates, climatological trends and many other objectives.
Autoregressive statistical pattern recognition algorithms for damage detection in civil structures
NASA Astrophysics Data System (ADS)
Yao, Ruigen; Pakzad, Shamim N.
2012-08-01
Statistical pattern recognition has recently emerged as a promising set of complementary methods to system identification for automatic structural damage assessment. Its essence is to use well-known concepts in statistics for boundary definition of different pattern classes, such as those for damaged and undamaged structures. In this paper, several statistical pattern recognition algorithms using autoregressive models, including statistical control charts and hypothesis testing, are reviewed as potentially competitive damage detection techniques. To enhance the performance of statistical methods, new feature extraction techniques using model spectra and residual autocorrelation, together with resampling-based threshold construction methods, are proposed. Subsequently, simulated acceleration data from a multi-degree-of-freedom system are generated to test and compare the efficiency of the existing and proposed algorithms. Data from laboratory experiments conducted on a truss and a large-scale bridge slab model are then used to further validate the damage detection methods and demonstrate the superior performance of the proposed algorithms.
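As an illustration of the kind of residual-based feature discussed above, the following sketch fits an autoregressive reference model to baseline acceleration data, computes the lag-1 autocorrelation of one-step prediction residuals on new data, and sets a resampling-based threshold from baseline segments. It is a minimal toy example with synthetic AR(1) signals and an arbitrary model order, not the authors' implementation.

```python
"""Toy AR-residual damage feature with a resampling-based threshold (illustrative only)."""
import numpy as np

def fit_ar(x, p):
    """Least-squares fit of an AR(p) model; returns the coefficient vector."""
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coef

def ar_residuals(x, coef):
    """One-step-ahead prediction residuals of x under the given AR coefficients."""
    p = len(coef)
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    return x[p:] - X @ coef

def lag1_autocorr(r):
    r = r - r.mean()
    return np.dot(r[:-1], r[1:]) / np.dot(r, r)

def simulate_ar1(phi, n, rng):
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

rng = np.random.default_rng(0)
baseline = simulate_ar1(0.7, 4000, rng)   # stand-in for undamaged-state acceleration
test = simulate_ar1(0.4, 4000, rng)       # response after a simulated stiffness change

coef = fit_ar(baseline, p=10)             # reference model from the undamaged structure
feature = lag1_autocorr(ar_residuals(test, coef))

# Resampling-based threshold: the same feature evaluated on contiguous baseline segments
boot = []
for _ in range(500):
    start = rng.integers(0, baseline.size - 1000)
    boot.append(lag1_autocorr(ar_residuals(baseline[start:start + 1000], coef)))
threshold = np.quantile(np.abs(boot), 0.99)

print("feature:", round(float(feature), 3), "threshold:", round(float(threshold), 3))
print("damage flag:", abs(feature) > threshold)
```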
Towards an integrated quality control procedure for eddy-covariance data
NASA Astrophysics Data System (ADS)
Vitale, Domenico; Papale, Dario
2017-04-01
The eddy-covariance technique is nowadays the most reliable and direct way to calculate the main fluxes of sensible and latent heat and of net ecosystem exchange, the latter being the difference between the CO2 assimilated by photosynthesis and that released to the atmosphere through ecosystem respiration. Despite improvements in the accuracy of measurement instruments and in software development, the eddy-covariance technique is not suitable under conditions that are non-ideal with respect to the instrument characteristics and the physical assumptions behind the technique, which mainly require well-developed and stationary turbulence. Under such conditions the calculated fluxes are not reliable and need to be flagged and discarded. In order to detect these unavoidable "bad" fluxes and build datasets of the highest quality, several tests applied both to high-frequency (10-20 Hz) raw data and to half-hourly time series have been developed in past years. Nevertheless, there is an increasing need for a standardized quality control procedure suitable not only for the analysis of long-term data, but also for near-real-time data processing. In this paper, we review established quality assessment procedures and present an innovative quality control strategy with the purpose of integrating the existing consolidated procedures with robust and advanced statistical tests more suitable for the analysis of time series data. The performance of the proposed quality control strategy is evaluated both on simulated data and on EC data distributed by the ICOS research infrastructure. It is concluded that the proposed strategy is able to flag and exclude unrealistic fluxes while being reproducible and retaining the largest possible amount of high quality data.
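As one example of the robust statistical tests such a strategy might include, the sketch below flags implausible half-hourly flux values using a running median and the median absolute deviation. The window length, threshold, and synthetic NEE series are illustrative assumptions; this is not the procedure adopted by ICOS.

```python
"""Illustrative robust spike flag for a half-hourly flux series (not the ICOS procedure)."""
import numpy as np

def mad_spike_flag(flux, window=13, z=5.0):
    """Flag values deviating from the running median by more than z robust sigmas."""
    flux = np.asarray(flux, dtype=float)
    flags = np.zeros(flux.size, dtype=bool)
    half = window // 2
    for i in range(flux.size):
        if np.isnan(flux[i]):
            continue
        seg = flux[max(0, i - half): i + half + 1]
        seg = seg[~np.isnan(seg)]
        if seg.size < 5:
            continue
        med = np.median(seg)
        mad = np.median(np.abs(seg - med))
        sigma = 1.4826 * mad if mad > 0 else np.inf
        flags[i] = abs(flux[i] - med) > z * sigma
    return flags

# One day of synthetic half-hourly NEE with two implausible spikes inserted
t = np.arange(48)
nee = -10 * np.sin(np.pi * t / 48) + np.random.default_rng(1).normal(0, 1, 48)
nee[10] += 40.0
nee[30] -= 40.0
print("flagged half-hours:", np.where(mad_spike_flag(nee))[0])
```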
Nd:YVO4 laser polishing on WC-Co HVOF coating
NASA Astrophysics Data System (ADS)
Giorleo, L.; Ceretti, E.; Montesano, L.; La Vecchia, G. M.
2017-10-01
WC/Co coatings are widely applied to different types of components due to their outstanding performance properties, including high hardness and wear resistance. In industrial applications the High Velocity Oxy-Fuel (HVOF) technique is extensively used to deposit hard metal coatings. The main advantage of HVOF compared to other thermal spray techniques is its ability to accelerate the melted powder particles of the feedstock material to a relatively high velocity, leading to good adhesion and a low porosity level. However, despite these benefits, the surface finish of WC-Co HVOF coatings tends to be poor (Ra higher than 5 µm), so a mechanical polishing process is often needed. The main problem is that the high hardness of the coating makes the polishing process expensive in terms of time and tool wear; moreover, polishing becomes difficult and not always possible in cases of limited part accessibility, micro dimensions or undercuts. A different technique available to improve surface roughness is laser polishing. The polishing principle is based on focused laser-beam radiation that melts a microscopic layer of surface material. Compared to conventional polishing processes (such as grinding) it avoids tool wear, produces less pollution (no abrasives or liquids) and no debris, requires less machining time and, coupled with a galvo system, is more suitable for complex 3D workpieces. In this paper the laser polishing process performed with an Nd:YVO4 laser was investigated: the effects of different process parameters, such as initial coating morphology, laser scan speed and number of loop cycles, were tested. Results were compared by a statistical approach in terms of average roughness, along with a morphological analysis carried out by Scanning Electron Microscope (SEM) investigation coupled with EDS spectra.
Shamim, Faisal; Asghar, Ali; Tauheed, Saman; Yahya, Muhammad
2017-01-01
Background: Radiofrequency ablation (RFA) is a minimally invasive technique of tumor destruction for patients with hepatic cancer who are not candidates for conventional therapy. The therapy requires general anesthesia (GA) or sedation to ensure patient safety and comfort. This study aimed to report and evaluate the factors that influenced the periprocedural anesthetic management, the drugs used, and the complications during and immediately after RFA procedures for hepatocellular carcinoma. Methods: For this retrospective study, we included 46 patients who underwent percutaneous RFA under GA or conscious sedation from January 2010 to June 2013 at Aga Khan University Hospital, Pakistan. The patients' characteristics, hepatic illness severity (Child-Pugh classification), anesthetic techniques, drugs, and procedural complications were collected on a predesigned approved form. The data were assessed and summarized using descriptive statistics. Results: The majority of patients were female (57%) and most were classified as American Society of Anesthesiologists III (65.2%). The preoperative hepatic illness severity in most patients was Child-Pugh Class A (76.10%). Thirty-eight patients (69.09%) had only a single lesion, and the majority of lesions were <3 cm (65.45%). GA was the main anesthetic technique (87%), with the laryngeal mask airway as the predominant airway adjunct (70%). The anesthetic agents mainly used for hypnosis and analgesia were propofol and fentanyl, respectively. Pain was the only significant complaint in the postoperative period, occurring in only nine (19%) patients and mild in nature. Conclusions: Percutaneous RFA is a safe treatment for hepatocellular cancer. The procedure requires good anesthetic support in the form of sedation-analgesia or complete GA to ensure maximum patient comfort and technical success of the procedure. PMID:28217048
Rasta, Seyed Hossein; Partovi, Mahsa Eisazadeh; Seyedarabi, Hadi; Javadzadeh, Alireza
2015-01-01
To investigate the effect of preprocessing techniques, including contrast enhancement and illumination correction, on retinal image quality, a comparative study was carried out. We studied and implemented several illumination correction and contrast enhancement techniques on color retinal images to find the best technique for optimum image enhancement. To compare and choose the best illumination correction technique we analyzed the corrected red and green components of color retinal images statistically and visually. The two contrast enhancement techniques were analyzed using a vessel segmentation algorithm by calculating the sensitivity and specificity. The statistical evaluation of the illumination correction techniques was carried out by calculating the coefficients of variation. The dividing method, using a median filter to estimate background illumination, showed the lowest coefficients of variation in the red component. The quotient-based and homomorphic filtering methods, after the dividing method, also presented good results based on their low coefficients of variation. Contrast limited adaptive histogram equalization (CLAHE) increased the sensitivity of the vessel segmentation algorithm by up to 5% at the same level of accuracy. The CLAHE technique has a higher sensitivity than the polynomial transformation operator as a contrast enhancement technique for vessel segmentation. Three techniques, namely the dividing method using a median filter to estimate the background, the quotient-based method, and homomorphic filtering, were found to be effective illumination correction techniques based on the statistical evaluation. Applying a local contrast enhancement technique, such as CLAHE, to fundus images showed good potential for enhancing vasculature segmentation. PMID:25709940
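A minimal sketch of two of the techniques named above, the dividing method with a median-filter background estimate and CLAHE, together with the coefficient-of-variation metric, is given below on a synthetic image; the kernel size and clip limit are arbitrary choices, not the values used in the study.

```python
"""Illustrative background-division correction and CLAHE on a synthetic fundus channel."""
import numpy as np
from scipy.ndimage import median_filter
from skimage import exposure

def divide_correction(channel, kernel=31):
    """Estimate background illumination with a large median filter and divide it out."""
    channel = channel.astype(float)
    background = median_filter(channel, size=kernel)
    corrected = channel / np.maximum(background, 1e-6)
    return corrected / corrected.max()          # rescale to [0, 1]

def coefficient_of_variation(img):
    return img.std() / img.mean()

rng = np.random.default_rng(0)
green = rng.uniform(0.2, 0.8, (256, 256))       # stand-in for the green component of a fundus image
green *= np.linspace(0.5, 1.0, 256)[None, :]    # synthetic uneven illumination

corrected = divide_correction(green)
enhanced = exposure.equalize_adapthist(corrected, clip_limit=0.01)   # CLAHE

print("CV before correction:", round(coefficient_of_variation(green), 3))
print("CV after correction :", round(coefficient_of_variation(corrected), 3))
```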
Sci-Thur PM - Colourful Interactions: Highlights 08: ARC TBI using Single-Step Optimized VMAT Fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hudson, Alana; Gordon, Deborah; Moore, Roseanne
Purpose: This work outlines a new TBI delivery technique to replace a lateral POP full-bolus technique. The new technique uses VMAT arc delivery, without bolus, treating the patient prone and supine. The benefits of the arc technique include: improved patient experience and safety, better dose conformity, better organ-at-risk sparing, decreased therapist time and reduction of therapist injuries. Methods: In this work we build on a technique developed by Jahnke et al. We use standard arc fields with gantry speeds corrected for varying distance to the patient, followed by a single-step VMAT optimization on a patient CT to reduce dose inhomogeneity and to reduce dose to the lungs (vs. blocks). To compare the arc TBI technique to our full-bolus technique, we produced plans on patient CTs for both techniques and evaluated several dosimetric parameters using an ANOVA test. Results and Conclusions: The arc technique is able to reduce both the hot areas in the body (D2% reduced from 122.2% to 111.8%, p<0.01) and the lungs (mean lung dose reduced from 107.5% to 99.1%, p<0.01), both statistically significant, while maintaining coverage (D98% = 97.8% vs. 94.6%, p=0.313, not statistically significant). We developed a more patient- and therapist-friendly TBI treatment technique that utilizes single-step optimized VMAT plans. This technique was found to be dosimetrically equivalent to our previous lateral technique in terms of coverage and statistically superior in terms of reduced lung dose.
Chys, Michael; Demeestere, Kristof; Ingabire, Ange Sabine; Dries, Jan; Van Langenhove, Herman; Van Hulle, Stijn W H
2017-07-01
Ozonation and three (biological) filtration techniques (trickling filtration (TF), slow sand filtration (SSF) and biological activated carbon (BAC) filtration) have been evaluated in different combinations as tertiary treatment for municipal wastewater effluent. The removal of 18 multi-class pharmaceuticals, as model trace organic contaminants (TrOCs), has been studied. (Biological) activated carbon filtration could reduce the amount of TrOCs significantly (>99%) but is cost-intensive for full-scale applications. Filtration techniques that depend mainly on biodegradation mechanisms (TF and SSF) are found to be inefficient for TrOC removal as stand-alone techniques. Ozonation resulted in 90% removal of the total amount of quantified TrOCs, but a post-ozonation step is needed to cope with an increased unselective toxicity. SSF following ozonation proved to be the only technique able to reduce the unselective toxicity to the same level as before ozonation. In view of process control, innovative correlation models developed for the monitoring and control of TrOC removal during ozonation are verified for their applicability during ozonation in combination with TF, SSF or BAC. Particularly for the poorly ozone-reactive TrOCs, statistically significant models were obtained that correlate TrOC removal with the reduction in UVA254 as an online-measured surrogate parameter.
NASA Astrophysics Data System (ADS)
Fisichella, M.; Shotter, A. C.; Di Pietro, A.; Figuera, P.; Lattuada, M.; Marchetta, C.; Privitera, V.; Romano, L.; Ruiz, C.; Zadro, M.
2015-12-01
For low-energy reaction studies involving radioactive ion beams, the experimental reaction yields are generally small due to the low intensity of the beams. For this reason, the stacked target technique has often been used to measure excitation functions. This technique offers considerable advantages since the reaction cross-section at several energies can be measured simultaneously. In a further effort to increase yields, thick targets are also employed. The main disadvantage of the method is the degradation of the beam quality as it passes through the stack, due to the statistical nature of energy loss processes and any nonuniformity of the stacked targets. This degradation can lead to ambiguities in associating effective beam energies with reaction product yields for the targets within the stack and, as a consequence, to an error in the determination of the excitation function for the reaction under study. A thorough investigation of these ambiguities is reported, and a best-practice procedure for analyzing data obtained using the stacked target technique with radioactive ion beams is recommended. Using this procedure, a re-evaluation of some previously published sub-barrier fusion data is reported in order to demonstrate the possibility of misinterpretation of derived excitation functions. In addition, this best-practice procedure has been used to evaluate, from a new data set, the sub-barrier fusion excitation function for the reaction 6Li+120Sn.
A multitechnique evaluation of topical corticosteroid treatment.
Josse, G; Rouvrais, C; Mas, A; Haftek, M; Delalleau, A; Ferraq, Y; Ossant, F; George, J; Lagarde, J M; Schmitt, A M
2009-02-01
Corticosteroids are widely prescribed for systemic or local treatment of inflammatory autoimmune disorders. Long-term therapy is associated with side effects and causes cutaneous atrophy of the epidermis and the dermis. The present study aims to evaluate, with several noninvasive techniques, the skin modifications observed during corticosteroid treatment. Skin mechanical measurement and ultrasound radio frequency (RF) signal analysis are proposed as new measures more closely related to the functional impairments. Thirteen young healthy women volunteers had two applications per day of topical clobetasol propionate 0.05% on one arm for 28 days, and they were followed for a further 28 days. Skin modifications were studied by high-frequency ultrasound imaging, ultrasound RF signal analysis, optical coherence tomography and the suction test. For all techniques, a statistically significant change was observed with treatment. Large variations, around 30%, were observed for all techniques, but smaller variations for ultrasound imaging (10%). Dermis and epidermis thickness presented stable measurements on the nontreated zone. At the end of the study, measures returned to normal. The dynamics were mainly observed within the first 14 days of treatment and within the first 14 days after its cessation. Similar dynamics of skin modification during corticosteroid treatment were observed with very different techniques. Moreover, the potential of RF ultrasound analysis and mechanical skin measurement for characterizing skin structural and functional impairments has been evaluated.
Effects of band selection on endmember extraction for forestry applications
NASA Astrophysics Data System (ADS)
Karathanassi, Vassilia; Andreou, Charoula; Andronis, Vassilis; Kolokoussis, Polychronis
2014-10-01
In spectral unmixing theory, data reduction techniques play an important role, as hyperspectral imagery contains an immense amount of data, posing many challenging problems such as data storage, computational efficiency, and the so-called "curse of dimensionality". Feature extraction and feature selection are the two main approaches to dimensionality reduction. Feature extraction techniques reduce the dimensionality of hyperspectral data by applying transforms to the data. Feature selection techniques retain the physical meaning of the data by selecting a set of bands from the input hyperspectral dataset which mainly contain the information needed for spectral unmixing. Although feature selection techniques are well known for their dimensionality reduction potential, they are rarely used in the unmixing process. The majority of the existing state-of-the-art dimensionality reduction methods set criteria on the spectral information derived from the whole wavelength range in order to define the optimum spectral subspace. These criteria are not associated with any particular application but with the data statistics, such as correlation and entropy values. However, each application is associated with specific land cover materials, whose spectral characteristics present variations at specific wavelengths. In forestry, for example, many applications focus on tree leaves, in which specific pigments such as chlorophyll, xanthophyll, etc. determine the wavelengths where tree species, diseases, etc. can be detected. For such applications, when the unmixing process is applied, the tree species, diseases, etc. are considered the endmembers of interest. This paper focuses on investigating the effects of band selection on endmember extraction by exploiting the information of the vegetation absorbance spectral zones. More precisely, it explores whether endmember extraction can be optimized when specific sets of initial bands related to leaf spectral characteristics are selected. The experiments comprise the application of well-known signal subspace estimation and endmember extraction methods to a hyperspectral image of a forest area. Evaluation of the extracted endmembers showed that more forest species can be extracted as endmembers using selected bands.
Reliability of diagnosis and clinical efficacy of visceral osteopathy: a systematic review.
Guillaud, Albin; Darbois, Nelly; Monvoisin, Richard; Pinsault, Nicolas
2018-02-17
In 2010, the World Health Organization published benchmarks for training in osteopathy in which osteopathic visceral techniques are included. The purpose of this study was to identify and critically appraise the scientific literature concerning the reliability of diagnosis and the clinical efficacy of techniques used in visceral osteopathy. The databases MEDLINE, OSTMED.DR, the Cochrane Library, Osteopathic Research Web, Google Scholar, the Journal of the American Osteopathic Association (JAOA) website, the International Journal of Osteopathic Medicine (IJOM) website, and the catalog of the Académie d'ostéopathie de France website were searched through December 2017. Only inter-rater reliability studies including at least two raters or intra-rater reliability studies including at least two assessments by the same rater were included. For efficacy studies, only randomized controlled trials (RCTs) or crossover studies on unhealthy subjects (any condition, duration and outcome) were included. Risk of bias was determined using a modified version of the quality appraisal tool for studies of diagnostic reliability (QAREL) in reliability studies. For the efficacy studies, the Cochrane risk of bias tool was used to assess their methodological design. Two authors performed data extraction and analysis. Eight reliability studies and six efficacy studies were included. The analysis of the reliability studies shows that the diagnostic techniques used in visceral osteopathy are unreliable. Regarding the efficacy studies, the least biased study shows no significant difference for the main outcome. The main risks of bias found in the included studies were due to the absence of blinding of the examiners, an unsuitable statistical method or an absence of a primary study outcome. The results of the systematic review lead us to conclude that well-conducted and sound evidence on the reliability and the efficacy of techniques in visceral osteopathy is absent. The review was registered with PROSPERO on 12 December 2016. The registration number is CRD4201605286.
Comparison between two surgical techniques for root coverage with an acellular dermal matrix graft.
Andrade, Patrícia F; Felipe, Maria Emília M C; Novaes, Arthur B; Souza, Sérgio L S; Taba, Mário; Palioto, Daniela B; Grisi, Márcio F M
2008-03-01
The aim of this randomized, controlled, clinical study was to compare two surgical techniques with the acellular dermal matrix graft (ADMG) to evaluate which technique could provide better root coverage. Fifteen patients with bilateral Miller Class I gingival recession areas were selected. In each patient, one recession area was randomly assigned to the control group, while the contra-lateral recession area was assigned to the test group. The ADMG was used in both groups. The control group was treated with a broader flap and vertical-releasing incisions, and the test group was treated with the proposed surgical technique, without releasing incisions. The clinical parameters evaluated before the surgeries and after 12 months were: gingival recession height, probing depth, relative clinical attachment level and the width and thickness of keratinized tissue. There were no statistically significant differences between the groups for all parameters at baseline. After 12 months, there was a statistically significant reduction in recession height in both groups, and there was no statistically significant difference between the techniques with regard to root coverage. Both surgical techniques provided significant reduction in gingival recession height after 12 months, and similar results in relation to root coverage.
Mann, Michael P.; Rizzardo, Jule; Satkowski, Richard
2004-01-01
Accurate streamflow statistics are essential to water resource agencies involved in both science and decision-making. When long-term streamflow data are lacking at a site, estimation techniques are often employed to generate streamflow statistics. However, procedures for accurately estimating streamflow statistics often are lacking. When estimation procedures are developed, they often are not evaluated properly before being applied. Use of unevaluated or underevaluated flow-statistic estimation techniques can result in improper water-resources decision-making. The California State Water Resources Control Board (SWRCB) uses two key techniques, a modified rational equation and drainage-basin area-ratio transfer, to estimate streamflow statistics at ungaged locations. These techniques have been implemented to varying degrees, but have not been formally evaluated. For estimating peak flows at the 2-, 5-, 10-, 25-, 50-, and 100-year recurrence intervals, the SWRCB uses the U.S. Geological Survey's (USGS) regional peak-flow equations. In this study, done cooperatively by the USGS and SWRCB, the SWRCB estimated several flow statistics at 40 USGS streamflow gaging stations in the north coast region of California. The SWRCB estimates were made without reference to USGS flow data. The USGS used the streamflow data from the 40 stations to generate flow statistics that could be compared with the SWRCB estimates for accuracy. While some SWRCB estimates compared favorably with USGS statistics, results were subject to varying degrees of error over the region. Flow-based estimation techniques generally performed better than rain-based methods, especially for estimation of December 15 to March 31 mean daily flows. The USGS peak-flow equations also performed well, but tended to underestimate peak flows. The USGS equations performed within reported error bounds, but will require updating in the future as peak-flow data sets grow larger. Little correlation was found between estimation errors and geographic location or various basin characteristics. However, for 25th-percentile-year mean-daily-flow estimates for December 15 to March 31, the greatest estimation errors were at east San Francisco Bay area stations with mean annual precipitation less than or equal to 30 inches and estimated 2-year/24-hour rainfall intensity less than 3 inches.
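The drainage-basin area-ratio transfer mentioned above is commonly written as Q_ungaged = Q_gaged * (A_ungaged / A_gaged)^b. The sketch below shows the calculation with illustrative numbers and an assumed exponent b = 1; it does not reproduce the SWRCB's calibrated parameters.

```python
"""Drainage-area ratio transfer of a flow statistic from a gaged to an ungaged site (illustrative)."""

def area_ratio_transfer(q_gaged, area_gaged_mi2, area_ungaged_mi2, b=1.0):
    """Q_ungaged = Q_gaged * (A_ungaged / A_gaged) ** b."""
    return q_gaged * (area_ungaged_mi2 / area_gaged_mi2) ** b

# Example: transfer a December 15 - March 31 mean daily flow of 120 ft^3/s
# from an 85 mi^2 gaged basin to a nearby 40 mi^2 ungaged basin.
print(area_ratio_transfer(120.0, 85.0, 40.0))        # about 56 ft^3/s with b = 1
```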
ERIC Educational Resources Information Center
Brossart, Daniel F.; Parker, Richard I.; Olson, Elizabeth A.; Mahadevan, Lakshmi
2006-01-01
This study explored some practical issues for single-case researchers who rely on visual analysis of graphed data, but who also may consider supplemental use of promising statistical analysis techniques. The study sought to answer three major questions: (a) What is a typical range of effect sizes from these analytic techniques for data from…
Arroyo-Hernández, M; Mellado-Romero, M A; Páramo-Díaz, P; Martín-López, C M; Cano-Egea, J M; Vilá Y Rico, J
2015-01-01
The purpose of this study was to analyze whether there is any difference between arthroscopic repair of full-thickness supraspinatus tears with the single-row technique versus the suture-bridge technique. We conducted a retrospective study of 123 patients with full-thickness supraspinatus tears treated between January 2009 and January 2013 in our hospital. There were 60 single-row repairs and 63 suture-bridge repairs. The mean age in the single-row group was 62.9 years, and in the suture-bridge group 63.3 years. There were more women than men in both groups (67%). All patients were evaluated using the Constant score. The mean Constant score in the suture-bridge group was 76.7, and in the single-row group 72.4. We also performed a statistical analysis of each Constant item. Strength was higher in the suture-bridge group, with a statistically significant difference (p = 0.04). The range of movement was also greater in the suture-bridge group, but the difference was not statistically significant. The suture-bridge technique showed better clinical results than single-row repair, but the difference was not statistically significant (p = 0.298).
Analysis of Variance in Statistical Image Processing
NASA Astrophysics Data System (ADS)
Kurz, Ludwik; Hafed Benteftifa, M.
1997-04-01
A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
NASA Technical Reports Server (NTRS)
Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)
2001-01-01
Independent Component Analysis (ICA) is a recently developed technique for component extraction. This new method requires the statistical independence of the extracted components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. This technique has recently been used for the analysis of geophysical time series with the goal of investigating the causes of variability in observed data (i.e. an exploratory approach). We demonstrate with a data simulation experiment that, if initialized with a Principal Component Analysis, the Independent Component Analysis performs a rotation of the classical PCA (or EOF) solution. This rotation uses no localization criterion, unlike other rotation techniques (RT); only the global generalization of decorrelation by statistical independence is used. This rotation of the PCA solution appears able to resolve the tendency of PCA to mix several physical phenomena, even when the signal is just their linear sum.
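The following toy example illustrates the point with scikit-learn: two independent, non-Gaussian sources are linearly mixed and then recovered by PCA (decorrelation only) and by ICA (statistical independence). The signals and mixing matrix are invented for the demonstration; this is not the simulation used in the paper.

```python
"""Toy comparison of PCA and ICA on a linear mixture of two independent sources."""
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
sources = np.column_stack([np.sin(2 * t),                    # two independent "physical" signals
                           np.sign(np.cos(3 * t))])
mixing = np.array([[1.0, 0.6], [0.4, 1.0]])
observed = sources @ mixing.T                                # linearly mixed observations

pcs = PCA(n_components=2, whiten=True).fit_transform(observed)   # decorrelated (EOF-like) components
ics = FastICA(n_components=2, random_state=0).fit_transform(observed)  # independent components

# The ICA components tend to align more closely with the individual sources than the PCs do.
def best_corr(components, source):
    return max(abs(np.corrcoef(components[:, i], source)[0, 1]) for i in range(2))

print("PCA vs source 1:", round(best_corr(pcs, sources[:, 0]), 3))
print("ICA vs source 1:", round(best_corr(ics, sources[:, 0]), 3))
```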
Statistical assessment of the learning curves of health technologies.
Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T
2001-01-01
(1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis, generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. 
The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
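As an example of the simple-series analyses referred to above, a negative-exponential learning curve can be fitted to consecutive operation times. The sketch below does this on synthetic data with invented plateau, initial-excess, and rate parameters; it is not taken from the report's case series.

```python
"""Illustrative fit of a negative-exponential learning curve to synthetic operation times."""
import numpy as np
from scipy.optimize import curve_fit

def learning_curve(n, t_plateau, delta, rate):
    """Operation time for case n: a plateau plus an excess that decays with experience."""
    return t_plateau + delta * np.exp(-rate * n)

rng = np.random.default_rng(0)
cases = np.arange(1, 191)                                   # e.g. 190 consecutive procedures
times = learning_curve(cases, 60, 45, 0.03) + rng.normal(0, 8, cases.size)

params, _ = curve_fit(learning_curve, cases, times, p0=(60, 40, 0.05))
t_plateau, delta, rate = params
print(f"plateau {t_plateau:.0f} min, initial excess {delta:.0f} min, "
      f"cases to halve the excess {np.log(2) / rate:.0f}")
```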
Statistical analysis of early failures in electromigration
NASA Astrophysics Data System (ADS)
Gall, M.; Capasso, C.; Jawarani, D.; Hernandez, R.; Kawasaki, H.; Ho, P. S.
2001-07-01
The detection of early failures in electromigration (EM) and the complicated statistical nature of this important reliability phenomenon have been difficult issues to treat in the past. A satisfactory experimental approach for the detection and the statistical analysis of early failures has not yet been established. This is mainly due to the rare occurrence of early failures and difficulties in testing of large sample populations. Furthermore, experimental data on the EM behavior as a function of varying number of failure links are scarce. In this study, a technique utilizing large interconnect arrays in conjunction with the well-known Wheatstone Bridge is presented. Three types of structures with a varying number of Ti/TiN/Al(Cu)/TiN-based interconnects were used, starting from a small unit of five lines in parallel. A serial arrangement of this unit enabled testing of interconnect arrays encompassing 480 possible failure links. In addition, a Wheatstone Bridge-type wiring using four large arrays in each device enabled simultaneous testing of 1920 interconnects. In conjunction with a statistical deconvolution to the single interconnect level, the results indicate that the electromigration failure mechanism studied here follows perfect lognormal behavior down to the four sigma level. The statistical deconvolution procedure is described in detail. Over a temperature range from 155 to 200 °C, a total of more than 75 000 interconnects were tested. None of the samples have shown an indication of early, or alternate, failure mechanisms. The activation energy of the EM mechanism studied here, namely the Cu incubation time, was determined to be Q=1.08±0.05 eV. We surmise that interface diffusion of Cu along the Al(Cu) sidewalls and along the top and bottom refractory layers, coupled with grain boundary diffusion within the interconnects, constitutes the Cu incubation mechanism.
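A common way to carry out an array-to-single-link deconvolution, assuming each array fails at the weakest of N independent, identical links, is F_single(t) = 1 - (1 - F_array(t))^(1/N). The sketch below applies this to synthetic array failure times and checks lognormality with a probit fit; the sample sizes and lognormal parameters are illustrative, and this is not necessarily the authors' exact procedure.

```python
"""Weakest-link deconvolution of array failure times to the single-interconnect level (illustrative)."""
import numpy as np
from scipy import stats

N_LINKS = 480                                   # possible failure links per array, as in the structures described

rng = np.random.default_rng(0)
single = rng.lognormal(mean=np.log(100.0), sigma=0.4, size=(200, N_LINKS))
array_ttf = single.min(axis=1)                  # each array fails at its earliest link failure

t = np.sort(array_ttf)
F_array = (np.arange(1, t.size + 1) - 0.3) / (t.size + 0.4)     # median-rank plotting positions
F_single = 1.0 - (1.0 - F_array) ** (1.0 / N_LINKS)             # deconvolved single-link CDF

# Lognormality check of the deconvolved tail: probit of F versus ln(t) should be linear.
slope, intercept, r, *_ = stats.linregress(np.log(t), stats.norm.ppf(F_single))
print(f"sigma ~ {1 / slope:.2f}, t50 ~ {np.exp(-intercept / slope):.0f}, r = {r:.3f}")
```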
NASA Astrophysics Data System (ADS)
von Storch, Hans; Zorita, Eduardo; Cubasch, Ulrich
1993-06-01
A statistical strategy to deduce regional-scale features from climate general circulation model (GCM) simulations has been designed and tested. The main idea is to interrelate the characteristic patterns of observed simultaneous variations of regional climate parameters and of the large-scale atmospheric flow using the canonical correlation technique. The large-scale North Atlantic sea level pressure (SLP) is related to the regional variable, the winter (DJF) mean Iberian Peninsula rainfall. The skill of the resulting statistical model is shown by reproducing, to a good approximation, the winter mean Iberian rainfall from 1900 to the present from the observed North Atlantic mean SLP distributions. It is shown that the observed relationship between these two variables is not well reproduced in the output of a general circulation model (GCM). The implications for Iberian rainfall changes as the response to increasing atmospheric greenhouse-gas concentrations simulated by two GCM experiments are examined with the proposed statistical model. In an instantaneous '2 x CO2' doubling experiment, using the simulated change of the mean North Atlantic SLP field to predict Iberian rainfall yields an insignificant increase of area-averaged rainfall of 1 mm/month, with maximum values of 4 mm/month in the northwest of the peninsula. In contrast, for the four GCM grid points representing the Iberian Peninsula, the directly simulated change is a decrease of 10 mm/month, reaching 19 mm/month in the southwest. In the second experiment, with the IPCC scenario A ("business as usual") increase of CO2, the statistical-model results partially differ from the directly simulated rainfall changes: over the experimental range of 100 years, the area-averaged rainfall decreases by 7 mm/month (statistical model) and by 9 mm/month (GCM); at the same time the amplitude of the interdecadal variability is quite different.
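The following schematic shows the statistical step of such a strategy with scikit-learn: the large-scale field is reduced to its leading principal components and coupled to station rainfall by canonical correlation analysis, which is then used for prediction. All data are synthetic stand-ins; the domain sizes, number of EOFs, and calibration split are arbitrary assumptions.

```python
"""Schematic canonical-correlation downscaling of rainfall from a large-scale SLP field (synthetic)."""
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n_winters, n_grid, n_stations = 90, 300, 12

# A synthetic two-dimensional "circulation" signal drives both fields
signal = rng.standard_normal((n_winters, 2))
slp = signal @ rng.standard_normal((2, n_grid)) + 0.5 * rng.standard_normal((n_winters, n_grid))
rain = signal @ rng.standard_normal((2, n_stations)) + 0.5 * rng.standard_normal((n_winters, n_stations))

# Reduce the SLP field to its leading EOFs, then couple the two fields with CCA
slp_pcs = PCA(n_components=10).fit_transform(slp)
cca = CCA(n_components=2).fit(slp_pcs[:60], rain[:60])          # calibrate on the first 60 winters
rain_hat = cca.predict(slp_pcs[60:])                            # "downscaled" rainfall for later winters

corr = [np.corrcoef(rain_hat[:, j], rain[60:, j])[0, 1] for j in range(n_stations)]
print("median skill (correlation):", round(float(np.median(corr)), 2))
```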
The Effect of Student-Driven Projects on the Development of Statistical Reasoning
ERIC Educational Resources Information Center
Sovak, Melissa M.
2010-01-01
Research has shown that even if students pass a standard introductory statistics course, they often still lack the ability to reason statistically. Many instructional techniques for enhancing the development of statistical reasoning have been discussed, although there is often little to no experimental evidence that they produce effective results…
ERIC Educational Resources Information Center
DeMark, Sarah F.; Behrens, John T.
2004-01-01
Whereas great advances have been made in the statistical sophistication of assessments in terms of evidence accumulation and task selection, relatively little statistical work has explored the possibility of applying statistical techniques to data for the purposes of determining appropriate domain understanding and to generate task-level scoring…
An Applied Statistics Course for Systematics and Ecology PhD Students
ERIC Educational Resources Information Center
Ojeda, Mario Miguel; Sosa, Victoria
2002-01-01
Statistics education is under review at all educational levels. Statistical concepts, as well as the use of statistical methods and techniques, can be taught in at least two contrasting ways. Specifically, (1) teaching can be theoretically and mathematically oriented, or (2) it can be less mathematically oriented being focused, instead, on…
NASA Astrophysics Data System (ADS)
Bencomo, Jose Antonio Fagundez
The main goal of this study was to relate physical changes in image quality, measured by the Modulation Transfer Function (MTF), to diagnostic accuracy. One hundred and fifty Kodak Min-R screen/film combination conventional craniocaudal mammograms obtained with the Pfizer Microfocus Mammographic system were selected from the files of the Department of Radiology at M.D. Anderson Hospital and Tumor Institute. The mammograms included 88 cases with a variety of benign diagnoses and 62 cases with a variety of malignant biopsy diagnoses. The average age of the patient population was 55 years. Seventy cases presented calcifications, with 30 cases having calcifications smaller than 0.5 mm. Forty-six cases presented irregular-bordered masses larger than 1 cm. Thirty cases presented smooth-bordered masses, 20 of them larger than 1 cm. Four separate copies of the original images were made, each having a different change in the MTF, using a defocusing technique whereby copies of the original were obtained by light exposure through different thicknesses (spacings) of transparent film base. The mammograms were randomized and evaluated by three experienced mammographers for the degree of visibility of various anatomical breast structures and pathological lesions (masses and calcifications), subjective image quality, and mammographic interpretation. The 3,000 separate evaluations were analyzed by several statistical techniques, including Receiver Operating Characteristic curve analysis, the McNemar test for differences between proportions, and the Landis et al. weighted kappa method of agreement for ordinal categorical data. Results from the statistical analysis show: (1) There were no statistically significant differences in the diagnostic accuracy of the observers when diagnosing from mammograms with the same MTF. (2) There were no statistically significant differences in diagnostic accuracy for each observer when diagnosing from mammograms with the different MTFs used in the study. (3) There were statistically significant differences in detail visibility between the copies and the originals; detail visibility was better in the originals. (4) Feature interpretations were not significantly different between the originals and the copies. (5) Perception of image quality did not affect image interpretation. Continuation and improvement of this research can be accomplished by using a case population more sensitive to MTF changes, i.e., asymptomatic women with minimal breast cancer, by having more observers (including less experienced radiologists and experienced technologists) collaborate in the study, and by using a minimum of 200 benign and 200 malignant cases.
Aarabi, Ardalan; Osharina, Victoria; Wallois, Fabrice
2017-07-15
Slow and rapid event-related designs are used in fMRI and functional near-infrared spectroscopy (fNIRS) experiments to temporally characterize the brain hemodynamic response to discrete events. Conventional averaging (CA) and the deconvolution method (DM) are the two techniques commonly used to estimate the Hemodynamic Response Function (HRF) profile in event-related designs. In this study, we conducted a series of simulations using synthetic and real NIRS data to examine the effect of the main confounding factors, including event sequence timing parameters, different types of noise, signal-to-noise ratio (SNR), temporal autocorrelation and temporal filtering on the performance of these techniques in slow and rapid event-related designs. We also compared systematic errors in the estimates of the fitted HRF amplitude, latency and duration for both techniques. We further compared the performance of deconvolution methods based on Finite Impulse Response (FIR) basis functions and gamma basis sets. Our results demonstrate that DM was much less sensitive to confounding factors than CA. Event timing was the main parameter largely affecting the accuracy of CA. In slow event-related designs, deconvolution methods provided similar results to those obtained by CA. In rapid event-related designs, our results showed that DM outperformed CA for all SNR, especially above -5 dB regardless of the event sequence timing and the dynamics of background NIRS activity. Our results also show that periodic low-frequency systemic hemodynamic fluctuations as well as phase-locked noise can markedly obscure hemodynamic evoked responses. Temporal autocorrelation also affected the performance of both techniques by inducing distortions in the time profile of the estimated hemodynamic response with inflated t-statistics, especially at low SNRs. We also found that high-pass temporal filtering could substantially affect the performance of both techniques by removing the low-frequency components of HRF profiles. Our results emphasize the importance of characterization of event timing, background noise and SNR when estimating HRF profiles using CA and DM in event-related designs. Copyright © 2017 Elsevier Inc. All rights reserved.
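A minimal version of the FIR-basis deconvolution compared above can be written as an ordinary least-squares problem on a design matrix of lagged event indicators; the sketch below contrasts it with conventional epoch averaging on synthetic data. The HRF shape, noise level, and event timing are invented and do not reproduce the paper's simulations.

```python
"""Minimal FIR-basis deconvolution of an event-related hemodynamic response (synthetic example)."""
import numpy as np

fs, n = 1.0, 600                       # 1 Hz sampling, 10 minutes of synthetic single-channel data
rng = np.random.default_rng(0)
onsets = np.sort(rng.choice(np.arange(20, n - 40), size=25, replace=False))   # rapid design

lags = 20                              # model the response over 0-19 s after each event
true_hrf = np.exp(-(np.arange(lags) - 6.0) ** 2 / 8.0)                        # toy HRF shape

stim = np.zeros(n)
stim[onsets] = 1.0
y = np.convolve(stim, true_hrf)[:n] + 0.3 * rng.standard_normal(n)            # overlapping responses + noise

# FIR design matrix: one column per post-event lag.
# Onsets are kept well away from the record edges, so np.roll wrap-around adds no spurious events.
X = np.column_stack([np.roll(stim, k) for k in range(lags)])
hrf_dm, *_ = np.linalg.lstsq(X, y, rcond=None)          # deconvolution (least-squares) estimate

# Conventional averaging for comparison: mean of the raw epochs (biased when responses overlap)
hrf_ca = np.mean([y[t:t + lags] for t in onsets], axis=0)

print("DM error:", round(float(np.linalg.norm(hrf_dm - true_hrf)), 2))
print("CA error:", round(float(np.linalg.norm(hrf_ca - true_hrf)), 2))
```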
An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques
2018-01-09
ARL-TR-8272, US Army Research Laboratory, January 2018.
The purpose of this memorandum is to inform recipients of concerns regarding Army Corps of Engineers statistical techniques, provide a list of installations and FWS where SiteStat/GridStats (SS/GS) have been used, and to provide direction on communicating with the public on the use of these 'tools' by USACE.
Application of multivariate statistical techniques in microbial ecology
Paliy, O.; Shankar, V.
2016-01-01
Recent advances in high-throughput methods of molecular analysis have led to an explosion of studies generating large-scale ecological datasets. An especially noticeable effect has been attained in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amounts of data, powerful statistical techniques of multivariate analysis are well suited to analyze and interpret these datasets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular dataset. In this review we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and dataset structure. PMID:26786791
The Statistical Package for the Social Sciences (SPSS) as an adjunct to pharmacokinetic analysis.
Mather, L E; Austin, K L
1983-01-01
Computer techniques for numerical analysis are well known to pharmacokineticists. Powerful techniques for data file management have been developed by social scientists but have, in general, been ignored by pharmacokineticists because of their apparent lack of ability to interface with pharmacokinetic programs. Extensive use has been made of the Statistical Package for the Social Sciences (SPSS) for its data handling capabilities, but at the same time, techniques have been developed within SPSS to interface with pharmacokinetic programs of the users' choice and to carry out a variety of user-defined pharmacokinetic tasks within SPSS commands, apart from the expected variety of statistical tasks. Because it is based on a ubiquitous package, this methodology has all of the benefits of excellent documentation, interchangeability between different types and sizes of machines and true portability of techniques and data files. An example is given of the total management of a pharmacokinetic study previously reported in the literature by the authors.
Water-rock interaction and geochemistry of groundwater from the Ain Azel aquifer, Algeria.
Belkhiri, Lazhar; Mouni, Lotfi; Tiri, Ammar
2012-02-01
Hydrochemical, multivariate statistical, and inverse geochemical modeling techniques were used to investigate the hydrochemical evolution within the Ain Azel aquifer, Algeria. Cluster analysis based on major ion contents defined 3 main chemical water types, reflecting different hydrochemical processes. The first group of waters, group 1, has low salinity (mean EC = 735 μS/cm). The second group of waters is classified as Cl-HCO(3)-alkaline earth type. The third group is made up of water samples whose cation composition is dominated by Ca and Mg, with an anion composition varying from dominantly Cl to dominantly HCO(3) plus SO(4). The varifactors obtained from R-mode FA indicate that the parameters responsible for groundwater quality variations are mainly related to the presence and dissolution of some carbonate, silicate, and evaporite minerals in the aquifer. Inverse geochemical modeling along groundwater flow paths indicates that the dominant processes are the consumption of CO(2) and the dissolution of dolomite, gypsum, and halite, along with the precipitation of calcite, Ca-montmorillonite, illite, kaolinite, and quartz. © Springer Science+Business Media B.V. 2011
On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
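As a concrete illustration of the kriging flavour of metamodeling, the sketch below fits a Gaussian-process surrogate to a handful of evaluations of a placeholder "analysis code" using scikit-learn; the test function, kernel, and design points are arbitrary choices, not an engineering example from the paper.

```python
"""Minimal kriging (Gaussian-process) metamodel of an 'expensive' analysis code (illustrative)."""
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_analysis(x):
    """Placeholder for a slow simulation: one scalar response of one design variable."""
    return np.sin(3 * x) + 0.5 * x

# A small design of experiments over the design space [0, 3]
x_train = np.linspace(0.0, 3.0, 8).reshape(-1, 1)
y_train = expensive_analysis(x_train).ravel()

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.5),
                              normalize_y=True)
gp.fit(x_train, y_train)

# The metamodel is now orders of magnitude cheaper to evaluate than the code it approximates.
x_new = np.array([[1.37]])
mean, std = gp.predict(x_new, return_std=True)
print(f"surrogate: {mean[0]:.3f} +/- {std[0]:.3f}, true: {float(expensive_analysis(1.37)):.3f}")
```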
Multispectral and geomorphic studies of processed Voyager 2 images of Europa
NASA Technical Reports Server (NTRS)
Meier, T. A.
1984-01-01
High resolution images of Europa taken by the Voyager 2 spacecraft were used to study a portion of Europa's dark lineations and the major white line feature Agenor Linea. Initial image processing of images 1195J2-001 (violet filter), 1198J2-001 (blue filter), 1201J2-001 (orange filter), and 1204J2-001 (ultraviolet filter) was performed at the U.S.G.S. Branch of Astrogeology in Flagstaff, Arizona. Processing was completed through the stages of image registration and color ratio image construction. Pixel printouts were used in a new technique of linear feature profiling to compensate for image misregistration through the mapping of features on the printouts. In all, 193 dark lineation segments were mapped and profiled. The more accurate multispectral data derived by this method was plotted using a new application of the ternary diagram, with orange, blue, and violet relative spectral reflectances serving as end members. Statistical techniques were then applied to the ternary diagram plots. The image products generated at LPI were used mainly to cross-check and verify the results of the ternary diagram analysis.
Parkinson's disease detection based on dysphonia measurements
NASA Astrophysics Data System (ADS)
Lahmiri, Salim
2017-04-01
Assessing dysphonic symptoms is a noninvasive and effective approach to detecting Parkinson's disease (PD) in patients. The main purpose of this study is to investigate the effect of different dysphonia measurements on PD detection by support vector machine (SVM). Seven categories of dysphonia measurements are considered. Experimental results from the ten-fold cross-validation technique demonstrate that vocal fundamental frequency statistics yield the highest accuracy of 88 % ± 0.04. When all dysphonia measurements are employed, the SVM classifier achieves 94 % ± 0.03 accuracy. A refinement of the original pattern space, obtained by removing dysphonia measurements with similar variation across healthy and PD subjects, allows 97.03 % ± 0.03 accuracy to be achieved. The latter performance is higher than what is reported in the literature on the same dataset with the ten-fold cross-validation technique. Finally, it was found that measures of the ratio of noise to tonal components in the voice are the most suitable dysphonic symptoms for detecting PD subjects, as they achieve 99.64 % ± 0.01 specificity. This finding is highly promising for understanding PD symptoms.
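A sketch of the classification set-up described above, an SVM evaluated with ten-fold cross-validation, is given below with scikit-learn on synthetic features standing in for the dysphonia measurements; the class sizes, feature count, and SVM hyperparameters are illustrative assumptions.

```python
"""SVM with ten-fold cross-validation on synthetic dysphonia-like features (illustrative)."""
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_pd, n_healthy, n_features = 147, 48, 22                   # illustrative sample sizes and feature count

X = np.vstack([rng.normal(0.0, 1.0, (n_healthy, n_features)),
               rng.normal(0.8, 1.2, (n_pd, n_features))])   # PD subjects shifted in feature space
y = np.array([0] * n_healthy + [1] * n_pd)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
scores = cross_val_score(clf, X, y, cv=10, scoring="accuracy")
print(f"accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```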
NASA Astrophysics Data System (ADS)
Kitaura, Francisco-Shu
2016-10-01
One of the main goals in cosmology is to understand how the Universe evolves, how it forms structures, why it expands, and what the nature of dark matter and dark energy is. In the next decade, large and expensive observational projects will bring information on the structure and distribution of many millions of galaxies at different redshifts, enabling us to make great progress in answering these questions. However, these data require a very special and complex set of analysis tools to extract the maximum valuable information. Statistical inference techniques are being developed that bridge the gaps between theory, simulations, and observations. In particular, we discuss the efforts to address the question: what is the underlying nonlinear matter distribution and dynamics at any cosmic time corresponding to a set of observed galaxies in redshift space? An accurate reconstruction of the initial conditions encodes the full phase-space information at any later cosmic time (given a particular structure formation model and a set of cosmological parameters). We present advances towards solving this problem in a self-consistent way with Big Data techniques for the Cosmic Web.
A stochastic approach to noise modeling for barometric altimeters.
Sabatini, Angelo Maria; Genovese, Vincenzo
2013-11-18
The question of whether barometric altimeters can be applied to accurately track human motions is still debated, since their measurement performance is rather poor due to either coarse resolution or drift problems. As a step toward accurate short-time tracking of changes in height (up to a few minutes), we develop a stochastic model that attempts to capture some statistical properties of the barometric altimeter noise. The barometric altimeter noise is decomposed into three components with different physical origins and properties: a deterministic time-varying mean, mainly correlated with global environment changes, whose effects are prominent for long-time motion tracking; a first-order Gauss-Markov (GM) random process, mainly accounting for short-term, local environment changes, whose effects are prominent for short-time motion tracking; and an uncorrelated random process, mainly due to wideband electronic noise, including quantization noise. Autoregressive moving-average (ARMA) system identification techniques are used to capture the correlation structure of the piecewise stationary GM component and to estimate its standard deviation, together with the standard deviation of the uncorrelated component. M-point moving average filters, used alone or in combination with whitening filters learnt from the ARMA model parameters, are further tested in a few dynamic motion experiments and discussed for their capability of short-time tracking of small-amplitude, low-frequency motions.
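Because a sampled first-order Gauss-Markov process plus white measurement noise has an ARMA(1,1) representation, the identification step can be sketched as below with statsmodels; the correlation time, noise levels, and sampling period are invented values, not the parameters identified in the paper.

```python
"""Identification of a first-order Gauss-Markov component buried in white noise (illustrative)."""
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
n, dt, tau = 20000, 0.1, 5.0                    # samples, sampling period [s], GM correlation time [s]
phi = np.exp(-dt / tau)                         # discrete-time AR(1) coefficient of the GM process

gm = np.zeros(n)
for k in range(1, n):                           # first-order Gauss-Markov component
    gm[k] = phi * gm[k - 1] + rng.normal(0, 0.05)
white = rng.normal(0, 0.3, n)                   # wideband electronic/quantization noise
noise = gm + white                              # altimeter noise after removing the slow mean

# AR(1) + white noise is ARMA(1,1); identify it from the composite series
model = ARIMA(noise, order=(1, 0, 1), trend="n").fit()
phi_hat = model.arparams[0]
print(f"true phi = {phi:.3f}, identified phi = {phi_hat:.3f}, "
      f"identified GM correlation time ~ {-dt / np.log(phi_hat):.1f} s")
```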
Meng, Xiaoteng; Peng, Zhigang; Hardebeck, Jeanne L.
2013-01-01
Earthquakes trigger other earthquakes, but the physical mechanism of the triggering is currently debated. Most studies of earthquake triggering rely on earthquakes listed in catalogs, which are known to be incomplete around the origin times of large earthquakes and therefore missing potentially triggered events. Here we apply a waveform matched-filter technique to systematically detect earthquakes along the Parkfield section of the San Andreas Fault from 46 days before to 31 days after the nearby 2003 Mw6.5 San Simeon earthquake. After removing all possible false detections, we identify ~8 times more earthquakes than in the Northern California Seismic Network catalog. The newly identified events along the creeping section of the San Andreas Fault show a statistically significant decrease following the San Simeon main shock, which correlates well with the negative static stress changes (i.e., stress shadow) cast by the main shock. In comparison, the seismicity rate around Parkfield increased moderately where the static stress changes are positive. The seismicity rate changes correlate well with the static shear stress changes induced by the San Simeon main shock, suggesting a low friction in the seismogenic zone along the Parkfield section of the San Andreas Fault.
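The core of the waveform matched-filter technique is a normalized cross-correlation of a template event against continuous data, with detections declared above a threshold expressed in multiples of the median absolute deviation of the correlation trace. The toy sketch below works on a single synthetic channel; the template, noise level, and nine-times-MAD threshold are illustrative choices, not the settings of this study.

```python
"""Toy single-channel matched-filter detection with a MAD-based threshold (illustrative)."""
import numpy as np

def normalized_cc(data, template):
    """Normalized cross-correlation of `template` at every starting sample of `data`."""
    m = template.size
    tpl = (template - template.mean()) / template.std()
    cc = np.empty(data.size - m + 1)
    for i in range(cc.size):
        win = data[i:i + m]
        s = win.std()
        cc[i] = 0.0 if s == 0 else np.dot((win - win.mean()) / s, tpl) / m
    return cc

rng = np.random.default_rng(0)
template = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100)) * np.hanning(100)   # known event waveform
data = 0.5 * rng.standard_normal(5000)                                        # continuous noise record
for t0 in (800, 2500, 4100):                          # bury three scaled copies (candidate triggered events)
    data[t0:t0 + 100] += 1.5 * template

cc = normalized_cc(data, template)
threshold = 9 * np.median(np.abs(cc - np.median(cc)))  # nine times the MAD of the correlation trace
print("detections at samples:", np.where(cc > threshold)[0])
```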
Best Merge Region Growing with Integrated Probabilistic Classification for Hyperspectral Imagery
NASA Technical Reports Server (NTRS)
Tarabalka, Yuliya; Tilton, James C.
2011-01-01
A new method for spectral-spatial classification of hyperspectral images is proposed. The method is based on the integration of probabilistic classification within the hierarchical best merge region growing algorithm. For this purpose, a preliminary probabilistic support vector machines classification is performed. Then, a hierarchical step-wise optimization algorithm is applied, iteratively merging regions with the smallest Dissimilarity Criterion (DC). The main novelty of this method consists in defining a DC between regions as a function of region statistical and geometrical features along with classification probabilities. Experimental results are presented for a 200-band AVIRIS image of the Northwestern Indiana's vegetation area and compared with those obtained by recently proposed spectral-spatial classification techniques. The proposed method improves classification accuracies when compared to other classification approaches.
NASA Astrophysics Data System (ADS)
Müller, M. F.; Thompson, S. E.
2015-09-01
The prediction of flow duration curves (FDCs) in ungauged basins remains an important task for hydrologists given the practical relevance of FDCs for water management and infrastructure design. Predicting FDCs in ungauged basins typically requires spatial interpolation of statistical or model parameters. This task is complicated if climate becomes non-stationary, as the prediction challenge now also requires extrapolation through time. In this context, process-based models for FDCs that mechanistically link the streamflow distribution to climate and landscape factors may have an advantage over purely statistical methods to predict FDCs. This study compares a stochastic (process-based) and statistical method for FDC prediction in both stationary and non-stationary contexts, using Nepal as a case study. Under contemporary conditions, both models perform well in predicting FDCs, with Nash-Sutcliffe coefficients above 0.80 in 75 % of the tested catchments. The main drivers of uncertainty differ between the models: parameter interpolation was the main source of error for the statistical model, while violations of the assumptions of the process-based model represented the main source of its error. The process-based approach performed better than the statistical approach in numerical simulations with non-stationary climate drivers. The predictions of the statistical method under non-stationary rainfall conditions were poor if (i) local runoff coefficients were not accurately determined from the gauge network, or (ii) streamflow variability was strongly affected by changes in rainfall. A Monte Carlo analysis shows that the streamflow regimes in catchments characterized by a strong wet-season runoff and a rapid, strongly non-linear hydrologic response are particularly sensitive to changes in rainfall statistics. In these cases, process-based prediction approaches are strongly favored over statistical models.
NASA Astrophysics Data System (ADS)
Müller, M. F.; Thompson, S. E.
2016-02-01
The prediction of flow duration curves (FDCs) in ungauged basins remains an important task for hydrologists given the practical relevance of FDCs for water management and infrastructure design. Predicting FDCs in ungauged basins typically requires spatial interpolation of statistical or model parameters. This task is complicated if climate becomes non-stationary, as the prediction challenge now also requires extrapolation through time. In this context, process-based models for FDCs that mechanistically link the streamflow distribution to climate and landscape factors may have an advantage over purely statistical methods to predict FDCs. This study compares a stochastic (process-based) and statistical method for FDC prediction in both stationary and non-stationary contexts, using Nepal as a case study. Under contemporary conditions, both models perform well in predicting FDCs, with Nash-Sutcliffe coefficients above 0.80 in 75 % of the tested catchments. The main drivers of uncertainty differ between the models: parameter interpolation was the main source of error for the statistical model, while violations of the assumptions of the process-based model represented the main source of its error. The process-based approach performed better than the statistical approach in numerical simulations with non-stationary climate drivers. The predictions of the statistical method under non-stationary rainfall conditions were poor if (i) local runoff coefficients were not accurately determined from the gauge network, or (ii) streamflow variability was strongly affected by changes in rainfall. A Monte Carlo analysis shows that the streamflow regimes in catchments characterized by frequent wet-season runoff and a rapid, strongly non-linear hydrologic response are particularly sensitive to changes in rainfall statistics. In these cases, process-based prediction approaches are favored over statistical models.
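The Nash-Sutcliffe coefficient used to score the FDC predictions above can be sketched in a few lines. The quantile levels and the synthetic flow series below are illustrative assumptions, not the study's Nepalese data.

    # Sketch: Nash-Sutcliffe efficiency (NSE) between observed and predicted flow
    # duration curves. The data and exceedance probabilities are invented.
    import numpy as np

    def flow_duration_curve(q, probs):
        # FDC value at exceedance probability p is the (1 - p) quantile of the flows
        return np.quantile(q, 1.0 - probs)

    def nash_sutcliffe(obs, sim):
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    probs = np.linspace(0.01, 0.99, 99)                        # exceedance probabilities
    rng = np.random.default_rng(2)
    q_obs = rng.lognormal(mean=1.0, sigma=0.8, size=3650)      # synthetic daily flows
    q_sim = q_obs * rng.lognormal(mean=0.0, sigma=0.1, size=q_obs.size)  # noisy "model"

    fdc_obs = flow_duration_curve(q_obs, probs)
    fdc_sim = flow_duration_curve(q_sim, probs)
    print("NSE on the FDC:", round(nash_sutcliffe(fdc_obs, fdc_sim), 3))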
Statistical and Economic Techniques for Site-specific Nematode Management.
Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L
2014-03-01
Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develop a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.
Incorporating principal component analysis into air quality ...
The efficacy of standard air quality model evaluation techniques is becoming compromised as the simulation periods continue to lengthen in response to ever increasing computing capacity. Accordingly, the purpose of this paper is to demonstrate a statistical approach called Principal Component Analysis (PCA) with the intent of motivating its use by the evaluation community. One of the main objectives of PCA is to identify, through data reduction, the recurring and independent modes of variations (or signals) within a very large dataset, thereby summarizing the essential information of that dataset so that meaningful and descriptive conclusions can be made. In this demonstration, PCA is applied to a simple evaluation metric – the model bias associated with EPA's Community Multi-scale Air Quality (CMAQ) model when compared to weekly observations of sulfate (SO42−) and ammonium (NH4+) ambient air concentrations measured by the Clean Air Status and Trends Network (CASTNet). The advantages of using this technique are demonstrated as it identifies strong and systematic patterns of CMAQ model bias across a myriad of spatial and temporal scales that are neither constrained to geopolitical boundaries nor monthly/seasonal time periods (a limitation of many current studies). The technique also identifies locations (station–grid cell pairs) that are used as indicators for a more thorough diagnostic evaluation thereby hastening and facilitating understanding of the prob
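The data-reduction step described above can be sketched as a PCA of a weeks-by-sites matrix of model bias. The matrix below is synthetic (a shared seasonal signal plus noise) and stands in for CMAQ-minus-CASTNet bias; it is not the paper's dataset.

    # Sketch: PCA of a (weeks x sites) bias matrix to extract dominant, recurring
    # spatial/temporal modes of model bias. All data are synthetic.
    import numpy as np

    rng = np.random.default_rng(3)
    n_weeks, n_sites = 260, 40
    seasonal = np.sin(2 * np.pi * np.arange(n_weeks) / 52.0)    # shared seasonal signal
    spatial = rng.normal(0, 1, n_sites)                         # site-specific loading
    bias = np.outer(seasonal, spatial) + rng.normal(0, 0.5, (n_weeks, n_sites))

    # PCA via SVD of the column-centered bias matrix
    b = bias - bias.mean(axis=0)
    u, s, vt = np.linalg.svd(b, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    print("variance explained by first 3 modes:", np.round(explained[:3], 3))
    scores = u[:, :3] * s[:3]        # temporal signals (principal components)
    loadings = vt[:3].T              # spatial patterns associated with each mode

The leading loadings identify the station-grid cell pairs that drive each recurring bias pattern, which is the kind of indicator the evaluation described above uses for further diagnosis.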
Hatam, Nahid; Kafashi, Shahnaz; Kavosi, Zahra
2015-07-01
The importance of health indicators in recent years has created challenges in resource allocation. Balanced and fair distribution of health resources is one of the main principles in achieving equity. The goal of this cross-sectional descriptive study, conducted in 2010, was to classify health structural indicators in the Fars province using the scalogram technique. Health structural indicators were selected and classified in three categories, namely institutional, human resources, and rural health. The data were obtained from the statistical yearbook of Iran and were analyzed according to the scalogram technique. The distribution map of the Fars province was drawn using ArcGIS (geographic information system). The results showed an informative health structural indicator map across the province. Our findings revealed that the city of Mohr, with a score of 85, and Zarindasht, with a score of 36, had the highest and the lowest scores, respectively. This information is valuable to provincial health policymakers to plan appropriately based on factual data and minimize chaos in allocating health resources. Based on such data and reflecting on local needs, one could develop equity-based resource allocation policies and prevent inequality. It is concluded that, as a top priority, the provincial policymakers should put in place dedicated deprivation programs for the Farashband, Eghlid and Zarindasht regions.
Effect of different mixing methods on the bacterial microleakage of calcium-enriched mixture cement.
Shahi, Shahriar; Jeddi Khajeh, Soniya; Rahimi, Saeed; Yavari, Hamid R; Jafari, Farnaz; Samiei, Mohammad; Ghasemi, Negin; Milani, Amin S
2016-10-01
Calcium-enriched mixture (CEM) cement is used in the field of endodontics. It is similar to mineral trioxide aggregate in its main ingredients. The present study investigated the effect of different mixing methods on the bacterial microleakage of CEM cement. A total of 55 single-rooted human permanent teeth were decoronated so that 14-mm-long samples were obtained and obturated with AH26 sealer and gutta-percha using the lateral condensation technique. Three millimeters of the root end were cut off, and the samples were randomly divided into 3 groups of 15 each (3 mixing methods: amalgamator, ultrasonic and conventional) and 2 negative and positive control groups (each containing 5 samples). A BHI (brain-heart infusion agar) suspension containing Enterococcus faecalis was used for bacterial leakage assessment. Statistical analysis was carried out using descriptive statistics, Kaplan-Meier survival analysis with censored data and the log rank test. Statistical significance was set at P<0.05. The survival means for the conventional, amalgamator and ultrasonic methods were 62.13±12.44, 68.87±12.79 and 77.53±12.52 days, respectively. The log rank test showed no significant differences between the groups. Based on the results of the present study, it can be concluded that different mixing methods had no significant effect on the bacterial microleakage of CEM cement.
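The analysis pattern described (Kaplan-Meier survival with right-censored observations plus a log-rank comparison between groups) can be sketched with the lifelines package. The durations and follow-up length below are synthetic placeholders, not the study's measurements.

    # Sketch: Kaplan-Meier survival with censoring and a log-rank test between two
    # mixing methods. Data are synthetic; the lifelines package is assumed installed.
    import numpy as np
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(4)
    t_conv = rng.exponential(60, 15)         # days to bacterial leakage, "conventional" (invented)
    t_ultra = rng.exponential(80, 15)        # days to bacterial leakage, "ultrasonic" (invented)
    follow_up = 90.0
    e_conv = t_conv <= follow_up             # event observed (leaked) vs right-censored
    e_ultra = t_ultra <= follow_up
    t_conv, t_ultra = np.minimum(t_conv, follow_up), np.minimum(t_ultra, follow_up)

    km = KaplanMeierFitter().fit(t_conv, event_observed=e_conv, label="conventional")
    print("median survival (days):", km.median_survival_time_)

    res = logrank_test(t_conv, t_ultra, event_observed_A=e_conv, event_observed_B=e_ultra)
    print("log-rank p-value:", res.p_value)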
NASA Technical Reports Server (NTRS)
Oravec, Heather Ann; Daniels, Christopher C.
2014-01-01
The National Aeronautics and Space Administration has been developing a novel docking system to meet the requirements of future exploration missions to low-Earth orbit and beyond. A dynamic gas pressure seal is located at the main interface between the active and passive mating components of the new docking system. This seal is designed to operate in the harsh space environment, but must also perform within strict loading requirements while maintaining an acceptable level of leak rate. In this study, a candidate silicone elastomer seal was designed, and multiple subscale test articles were manufactured for evaluation purposes. The force required to fully compress each test article at room temperature was quantified and found to be below the maximum allowable load for the docking system. However, a significant amount of scatter was observed in the test results. Due to the stochastic nature of the mechanical performance of this candidate docking seal, a statistical process control technique was implemented to isolate unusual compression behavior from typical mechanical performance. The results of this statistical analysis indicated a lack of process control, suggesting a variation in the manufacturing phase of the process. Further investigation revealed that changes in the manufacturing molding process had occurred, which may have influenced the mechanical performance of the seal. This knowledge improves the chance of this and future space seals to satisfy or exceed design specifications.
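The abstract does not say which control chart was used; one common statistical process control choice for this kind of data is an individuals/moving-range chart, sketched below with invented compression-load values.

    # Sketch: individuals/moving-range control chart, flagging loads outside
    # 3-sigma limits estimated from the moving range. Values are synthetic.
    import numpy as np

    loads = np.array([318, 322, 325, 319, 321, 324, 320, 352, 323, 318,
                      321, 326, 319, 322, 349, 320, 324, 321, 317, 323], float)

    mr = np.abs(np.diff(loads))              # moving ranges of consecutive measurements
    sigma_hat = mr.mean() / 1.128            # d2 = 1.128 for subgroups of size 2
    center = loads.mean()
    ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

    out_of_control = np.flatnonzero((loads > ucl) | (loads < lcl))
    print("control limits:", round(lcl, 1), round(ucl, 1))
    print("out-of-control points at indices:", out_of_control)

Points outside the limits (or systematic runs on one side of the center line) are the signals that would prompt the kind of manufacturing-process investigation described above.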
iTTVis: Interactive Visualization of Table Tennis Data.
Wu, Yingcai; Lan, Ji; Shu, Xinhuan; Ji, Chenyang; Zhao, Kejian; Wang, Jiachen; Zhang, Hui
2018-01-01
The rapid development of information technology paved the way for the recording of fine-grained data, such as stroke techniques and stroke placements, during a table tennis match. This data recording creates opportunities to analyze and evaluate matches from new perspectives. Nevertheless, the increasingly complex data poses a significant challenge to make sense of and gain insights into. Analysts usually employ tedious and cumbersome methods which are limited to watching videos and reading statistical tables. However, existing sports visualization methods cannot be applied to visualizing table tennis competitions due to different competition rules and particular data attributes. In this work, we collaborate with data analysts to understand and characterize the sophisticated domain problem of analysis of table tennis data. We propose iTTVis, a novel interactive table tennis visualization system, which to our knowledge, is the first visual analysis system for analyzing and exploring table tennis data. iTTVis provides a holistic visualization of an entire match from three main perspectives, namely, time-oriented, statistical, and tactical analyses. The proposed system with several well-coordinated views not only supports correlation identification through statistics and pattern detection of tactics with a score timeline but also allows cross analysis to gain insights. Data analysts have obtained several new insights by using iTTVis. The effectiveness and usability of the proposed system are demonstrated with four case studies.
Arul, P
2017-01-01
Asphalts are bitumens that consist of complex hydrocarbon mixtures and are used mainly in road construction and maintenance. This study was undertaken to evaluate the micronucleus (MN) assay of exfoliated buccal epithelial cells in road construction workers using a liquid-based cytology (LBC) preparation. Three different stains (May-Grunwald Giemsa, hematoxylin and eosin, and Papanicolaou) were used to evaluate the frequency of MN in exfoliated buccal epithelial cells of 100 participants (fifty road construction workers and fifty administrative staff) using the LBC preparation. Statistical analysis was performed with Student's t-test, and P < 0.05 was considered statistically significant. The mean frequency of MN for cases was significantly higher than that of controls (P = 0.001) regardless of the staining method used, and cases with an exposure period of more than 5 years also showed a statistically significant difference (P < 0.05) compared with cases with a shorter exposure period. Conclusion: The present study concluded that workers exposed to asphalts during road construction exhibit a higher frequency of MN in exfoliated buccal epithelial cells and are at significant risk of cytogenetic damage. The LBC preparation has potential application for the evaluation of the frequency of MN. This technique may be advocated in those who are occupationally exposed to potentially carcinogenic agents in view of the improvement in smear quality and visualization of cell morphology.
Trends in bromide wet deposition concentrations in the contiguous United States, 2001-2016.
Wetherbee, Gregory A; Lehmann, Christopher M B; Kerschner, Brian M; Ludtke, Amy S; Green, Lee A; Rhodes, Mark F
2018-02-01
Bromide (Br-) and other solute concentration data from wet deposition samples collected and analyzed by the National Atmospheric Deposition Program (NADP) from 2001 to 2016 were statistically analyzed for trends, both geographically and temporally, by precipitation type. Analysis was limited to NADP sites in the contiguous 48 United States. The Br- concentrations for this time period had a high number of values censored at the detection limits, with greater than 86 percent of sample concentrations below analytical detection. Bromide was more frequently detected at NADP sites in coastal regions. Analysis using specialized statistical techniques for censored data revealed that Br- concentrations varied by precipitation type, with higher concentrations usually observed in liquid precipitation than in precipitation containing snow. Negative temporal trends in Br- wet deposition concentrations were observed at a majority of NADP sites; approximately 25 percent of these trend values were statistically significant at significance levels of less than 0.05 to 0.10. Potential causes for the negative trends were explored, including annual and seasonal changes in precipitation depth, reduced emissions of methyl bromide (CH3Br) from coastal wetlands, and declining industrial use of bromine compounds. The results indicate that Br- in non-coastal wet deposition comes mainly from long-range transport, not local sources. Correlations between Br-, chloride, and nitrate concentrations also were evaluated. Published by Elsevier Ltd.
Salleh, Fatmah M; Al-Mekhlafi, Abdulsalam M; Nordin, Anisah; Yasin, 'Azlin M; Al-Mekhlafi, Hesham M; Moktar, Norhayati
2011-01-01
This study was conducted to evaluate the modification of the usual Gram-chromotrope staining technique developed in-house known as Gram-chromotrope Kinyoun (GCK) in comparison with the Weber Modified Trichrome (WMT) staining technique; as the reference technique. Two hundred and ninety fecal specimens received by the Microbiology Diagnostic Laboratory of Hospital Universiti Kebangsaan Malaysia were examined for the presence of microsporidial spores. The sensitivity and specificity of GCK compared to the reference technique were 98% and 98.3%, respectively. The positive and negative predictive values were 92.5% and 99.6%, respectively. The agreement between the reference technique and the GCK staining technique was statistically significant by Kappa statistics (K = 0.941, P < 0.001). It is concluded that the GCK staining technique has high sensitivity and specificity in the detection of microsporidial spores in fecal specimens. Hence, it is recommended to be used in the diagnosis of intestinal microsporidiosis. Copyright © 2011 Elsevier Inc. All rights reserved.
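The agreement statistics reported above (sensitivity, specificity, predictive values, Cohen's kappa) all follow from a 2x2 table of index-test versus reference-test results. The counts below were chosen to be consistent with the reported summary percentages for illustration; they are not taken from the paper.

    # Sketch: diagnostic accuracy and Cohen's kappa from a 2x2 table comparing an
    # index test (GCK) against a reference test (WMT). Counts are illustrative only.
    tp, fp, fn, tn = 49, 4, 1, 236          # assumed counts (index vs reference), total 290
    n = tp + fp + fn + tn

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)

    po = (tp + tn) / n                                            # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2   # chance agreement
    kappa = (po - pe) / (1 - pe)

    print(f"sensitivity={sensitivity:.3f} specificity={specificity:.3f}")
    print(f"PPV={ppv:.3f} NPV={npv:.3f} kappa={kappa:.3f}")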
In-hospital cost comparison between percutaneous pulmonary valve implantation and surgery
Mishra, Vinod; Lewandowska, Milena; Andersen, Jack Gunnar; Andersen, Marit Helen; Lindberg, Harald; Døhlen, Gaute; Fosse, Erik
2017-01-01
Abstract OBJECTIVES: Today, both surgical and percutaneous techniques are available for pulmonary valve implantation in patients with right ventricle outflow tract obstruction or insufficiency. In this controlled, non-randomized study the hospital costs per patient of the two treatment options were identified and compared. METHODS: During the period from June 2011 until October 2014, cost data for 20 patients treated with the percutaneous technique and 14 patients treated with open surgery were consecutively included. Two methods for cost analysis were used, a retrospective average cost estimate (overhead costs) and a direct prospective detailed cost acquisition related to each individual patient (patient-specific costs). RESULTS: The equipment cost, particularly for the stents and the valve itself, was by far the main cost-driving factor in the percutaneous pulmonary valve group, representing 96% of the direct costs, whereas in the open surgery group the main costs derived from the postoperative care and particularly the stay in the intensive care department. The device-related cost in this group represented 13.5% of the direct costs. Length-of-stay-related costs were a mean of $3885 (1618) in the percutaneous group and a mean of $17 848 (5060) in the open surgery group. The difference in postoperative stay between the groups was statistically significant (P ≤ 0.001). CONCLUSIONS: Given the high postoperative cost of open surgery, the percutaneous procedure could be cost saving even with a device cost of more than five times the cost of the surgical device. PMID:28007875
The soft computing-based approach to investigate allergic diseases: a systematic review.
Tartarisco, Gennaro; Tonacci, Alessandro; Minciullo, Paola Lucia; Billeci, Lucia; Pioggia, Giovanni; Incorvaia, Cristoforo; Gangemi, Sebastiano
2017-01-01
Early recognition of inflammatory markers and their relation to asthma, adverse drug reactions, allergic rhinitis, atopic dermatitis and other allergic diseases is an important goal in allergy. The vast majority of studies in the literature are based on classic statistical methods; however, developments in computational techniques such as soft computing-based approaches hold new promise in this field. The aim of this manuscript is to systematically review the main soft computing-based techniques such as artificial neural networks, support vector machines, bayesian networks and fuzzy logic to investigate their performances in the field of allergic diseases. The review was conducted following PRISMA guidelines and the protocol was registered within PROSPERO database (CRD42016038894). The research was performed on PubMed and ScienceDirect, covering the period starting from September 1, 1990 through April 19, 2016. The review included 27 studies related to allergic diseases and soft computing performances. We observed promising results with an overall accuracy of 86.5%, mainly focused on asthmatic disease. The review reveals that soft computing-based approaches are suitable for big data analysis and can be very powerful, especially when dealing with uncertainty and poorly characterized parameters. Furthermore, they can provide valuable support in case of lack of data and entangled cause-effect relationships, which make it difficult to assess the evolution of disease. Although most works deal with asthma, we believe the soft computing approach could be a real breakthrough and foster new insights into other allergic diseases as well.
Statistical research into low-power solar flares. Main phase duration
NASA Astrophysics Data System (ADS)
Borovik, Aleksandr; Zhdanov, Anton
2017-12-01
This paper is a sequel to earlier papers on time parameters of solar flares in the Hα line. Using data from the International Flare Patrol, an electronic database of solar flares for the period 1972-2010 has been created. The statistical analysis of the duration of the main phase has shown that it increases with increasing flare class and brightness. It has been found that the duration of the main phase depends on the type and features of development of solar flares. Flares with one brilliant point have the shortest main phase; flares with several intensity maxima and two-ribbon flares, the longest one. We have identified more than 3000 cases with an ultra-long duration of the main phase (more than 60 minutes). For 90% of such flares the duration of the main phase is 2-3 hrs, but sometimes it reaches 12 hrs.
McCormick, Frank; Gupta, Anil; Bruce, Ben; Harris, Josh; Abrams, Geoff; Wilson, Hillary; Hussey, Kristen; Cole, Brian J.
2014-01-01
Purpose: The purpose of this study was to measure and compare the subjective, objective, and radiographic healing outcomes of single-row (SR), double-row (DR), and transosseous equivalent (TOE) suture techniques for arthroscopic rotator cuff repair. Materials and Methods: A retrospective comparative analysis of arthroscopic rotator cuff repairs by one surgeon from 2004 to 2010 at a minimum 2-year follow-up was performed. Cohorts were matched for age, sex, and tear size. Subjective outcome variables included ASES, Constant, SST, UCLA, and SF-12 scores. Objective outcome variables included strength and active range of motion (ROM). Radiographic healing was assessed by magnetic resonance imaging (MRI). Statistical analysis was performed using analysis of variance (ANOVA), the Mann-Whitney and Kruskal-Wallis tests, and the Fisher exact probability test, with significance set at P < 0.05. Results: Sixty-three patients completed the study requirements (20 SR, 21 DR, 22 TOE). There was a clinically and statistically significant improvement in outcomes with all repair techniques (ASES mean improvement, P < 0.0001). The mean final ASES scores were: SR 83 (SD 21.4); DR 87 (SD 18.2); TOE 87 (SD 13.2) (P = 0.73). There was a statistically significant improvement in strength for each repair technique (P < 0.001). There was no significant difference between techniques across all secondary outcome assessments: ASES improvement, Constant, SST, UCLA, SF-12, ROM, strength, and MRI re-tear rates. There was a decrease in re-tear rates from single-row (22%) to double-row (18%) to transosseous equivalent (11%); however, this difference was not statistically significant (P = 0.6). Conclusions: Compared to preoperatively, arthroscopic rotator cuff repair using SR, DR, or TOE techniques yielded a clinically and statistically significant improvement in subjective and objective outcomes at a minimum 2-year follow-up. Level of Evidence: Therapeutic level 3. PMID:24926159
MANCOVA for one way classification with homogeneity of regression coefficient vectors
NASA Astrophysics Data System (ADS)
Mokesh Rayalu, G.; Ravisankar, J.; Mythili, G. Y.
2017-11-01
The MANOVA and MANCOVA are the extensions of the univariate ANOVA and ANCOVA techniques to multidimensional or vector-valued observations. The assumption of a Gaussian distribution is replaced with a multivariate Gaussian distribution for the vector-valued data and residual terms in the statistical models of these techniques. The objective of MANCOVA is to determine whether there are statistically reliable mean differences that can be demonstrated between groups after adjusting for the covariates. When random assignment of samples or subjects to groups is not possible, multivariate analysis of covariance (MANCOVA) provides statistical matching of groups by adjusting dependent variables as if all subjects scored the same on the covariates. In this research article, an extension is made to the MANCOVA technique with a larger number of covariates, and the homogeneity of the regression coefficient vectors is also tested.
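A minimal sketch of a one-way MANCOVA fit, using statsmodels' MANOVA with a covariate included in the model formula; the data and variable names are synthetic, not the article's.

    # Sketch: one-way MANCOVA via statsmodels' MANOVA with a covariate in the formula.
    # All data are synthetic and the variable names are illustrative.
    import numpy as np
    import pandas as pd
    from statsmodels.multivariate.manova import MANOVA

    rng = np.random.default_rng(5)
    n = 90
    group = np.repeat(["A", "B", "C"], n // 3)
    covariate = rng.normal(50, 10, n)
    shift = np.where(group == "A", 0.0, np.where(group == "B", 1.5, 3.0))
    y1 = 0.4 * covariate + shift + rng.normal(0, 2, n)
    y2 = 0.2 * covariate + 0.5 * shift + rng.normal(0, 2, n)
    df = pd.DataFrame({"group": group, "covariate": covariate, "y1": y1, "y2": y2})

    # Adjusting for the covariate turns the one-way MANOVA into a MANCOVA
    model = MANOVA.from_formula("y1 + y2 ~ group + covariate", data=df)
    print(model.mv_test())          # Wilks' lambda, Pillai's trace, etc. per term

A homogeneity-of-slopes check in the spirit of the article can be sketched by adding a group:covariate interaction to the formula and testing whether that term is negligible.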
Air Quality Forecasting through Different Statistical and Artificial Intelligence Techniques
NASA Astrophysics Data System (ADS)
Mishra, D.; Goyal, P.
2014-12-01
Urban air pollution forecasting has emerged as an acute problem in recent years because there is severe environmental degradation due to the increase in harmful air pollutants in the ambient atmosphere. In this study, different types of statistical as well as artificial intelligence techniques are used for forecasting and analysis of air pollution over the Delhi urban area. These techniques are principal component analysis (PCA), multiple linear regression (MLR) and artificial neural networks (ANN), and the forecasts are observed to be in good agreement with the concentrations observed by the Central Pollution Control Board (CPCB) at different locations in Delhi. However, such methods suffer from disadvantages: they provide limited accuracy, as they are unable to predict extreme points, i.e., the pollution maximum and minimum cut-offs cannot be determined using such approaches, and they are an inefficient approach for better forecasting. With the advancement of technology and research, an alternative to these traditional methods has been proposed: the coupling of statistical techniques with artificial intelligence (AI) can be used for forecasting purposes. The coupling of PCA, ANN and fuzzy logic is used for forecasting air pollutants over the Delhi urban area. The statistical measures, e.g., correlation coefficient (R), normalized mean square error (NMSE), fractional bias (FB) and index of agreement (IOA), of the proposed model are in better agreement than those of all the other models. Hence, the coupling of statistical and artificial intelligence techniques can be used for the forecasting of air pollutants over an urban area.
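The evaluation statistics named above (R, NMSE, FB, IOA) follow standard definitions and can be computed for any observed/predicted pair of series, as in the sketch below; the data are synthetic placeholders.

    # Sketch: forecast-evaluation statistics (R, NMSE, FB, Willmott's IOA).
    # Observed/predicted series below are synthetic.
    import numpy as np

    def evaluate(obs, pred):
        r = np.corrcoef(obs, pred)[0, 1]
        nmse = np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean())
        fb = 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())
        ioa = 1.0 - np.sum((obs - pred) ** 2) / np.sum(
            (np.abs(pred - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
        return {"R": r, "NMSE": nmse, "FB": fb, "IOA": ioa}

    rng = np.random.default_rng(6)
    observed = rng.gamma(shape=4.0, scale=20.0, size=365)        # e.g. a daily pollutant series
    predicted = 0.9 * observed + rng.normal(0, 10, observed.size)
    print({k: round(v, 3) for k, v in evaluate(observed, predicted).items()})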
Mavrodi, Alexandra; Ohanyan, Ani; Kechagias, Nikos; Tsekos, Antonis; Vahtsevanos, Konstantinos
2015-09-01
Post-operative complications of various degrees of severity are commonly observed in third molar impaction surgery. For this reason, a surgical procedure that decreases the trauma of bone and soft tissues should be a priority for surgeons. In the present study, we compare the efficacy and the post-operative complications of patients to whom two different surgical techniques were applied for impacted lower third molar extraction. Patients of the first group underwent the classical bur technique, while patients of the second group underwent another technique, in which an elevator was placed on the buccal surface of the impacted molar in order to luxate the alveolar socket more easily. Comparing the two techniques, we observed a statistically significant decrease in the duration of the procedure and in the need for tooth sectioning when applying the second surgical technique, while the post-operative complications were similar in the two groups. We also found a statistically significant lower incidence of lingual nerve lesions and only a slightly higher frequency of sharp mandibular bone irregularities in the second group, which however was not statistically significant. The results of our study indicate that the surgical technique using an elevator on the buccal surface of the tooth seems to be a reliable method to extract impacted third molars safely, easily, quickly and with the minimum trauma to the surrounding tissues.
NASA Astrophysics Data System (ADS)
Singh, Sarvesh Kumar; Kumar, Pramod; Rani, Raj; Turbelin, Grégory
2017-04-01
The study highlights a theoretical comparison and various interpretations of a recent inversion technique, called renormalization, developed for the reconstruction of unknown tracer emissions from their measured concentrations. The comparative interpretations are presented in relation to the other inversion techniques based on principle of regularization, Bayesian, minimum norm, maximum entropy on mean, and model resolution optimization. It is shown that the renormalization technique can be interpreted in a similar manner to other techniques, with a practical choice of a priori information and error statistics, while eliminating the need of additional constraints. The study shows that the proposed weight matrix and weighted Gram matrix offer a suitable deterministic choice to the background error and measurement covariance matrices, respectively, in the absence of statistical knowledge about background and measurement errors. The technique is advantageous since it (i) utilizes weights representing a priori information apparent to the monitoring network, (ii) avoids dependence on background source estimates, (iii) improves on alternative choices for the error statistics, (iv) overcomes the colocalization problem in a natural manner, and (v) provides an optimally resolved source reconstruction. A comparative illustration of source retrieval is made by using the real measurements from a continuous point release conducted in Fusion Field Trials, Dugway Proving Ground, Utah.
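The renormalization algorithm itself is not reproduced here; the sketch below only illustrates the weighted minimum-norm family of inversions it is compared with, using a synthetic source-receptor sensitivity matrix. The weight choice, regularization constant, and all values are assumptions for illustration.

    # Generic weighted minimum-norm source reconstruction (one of the inversion
    # families discussed above, not the renormalization technique). Synthetic data.
    import numpy as np

    rng = np.random.default_rng(7)
    n_receptors, n_sources = 20, 50
    A = np.abs(rng.normal(0, 1, (n_receptors, n_sources)))   # source-receptor sensitivities (toy)
    x_true = np.zeros(n_sources)
    x_true[17] = 5.0                                          # a single point release (assumed)
    y = A @ x_true + rng.normal(0, 0.05, n_receptors)         # measured concentrations

    W = np.diag(A.sum(axis=0))            # a priori weights seen by the monitoring network (toy)
    G = A @ np.linalg.inv(W) @ A.T        # weighted Gram matrix
    lam = 1e-3                            # small ridge term for numerical stability
    x_hat = np.linalg.inv(W) @ A.T @ np.linalg.solve(G + lam * np.eye(n_receptors), y)

    # The estimate should concentrate near the true source, though with few
    # receptors the minimum-norm solution is inevitably smeared.
    print("true source index:", 17, " estimated peak at:", int(np.argmax(x_hat)))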
Dentascan – Is the Investment Worth the Hype ???
Shah, Monali A; Shah, Sneha S; Dave, Deepak
2013-01-01
Background: Open Bone Measurement (OBM) and Bone Sounding (BS) are the most reliable but invasive clinical methods for Alveolar Bone Level (ABL) assessment, causing discomfort to the patient. Routinely, IOPAs & OPGs are the commonest radiographic techniques used, which tend to underestimate bone loss and obscure buccal/lingual defects. A novel technique like dentascan (CBCT) eliminates this limitation by giving images in 3 planes – sagittal, coronal and axial. Aim: To compare & correlate the non-invasive 3D radiographic technique of Dentascan with BS & OBM, and with IOPA and OPG, in assessing the ABL. Settings and Design: Cross-sectional diagnostic study. Material and Methods: Two hundred and five sites were subjected to clinical and radiographic diagnostic techniques. The relative distance between the alveolar bone crest and a reference wire was measured. All the measurements were compared and tested against the OBM. Statistical Analysis: Student’s t-test, ANOVA, Pearson correlation coefficient. Results: There is a statistically significant difference between Dentascan and OBM; only BS showed agreement with OBM (p < 0.05). Dentascan weakly correlated with OBM & BS lingually. All the other techniques showed statistically significant differences between them (p = 0.00). Conclusion: Within the limitations of this study, only BS seems to be comparable with OBM, with no superior result of Dentascan over the conventional techniques, except for lingual measurements. PMID:24551722
Sheikh, Adnan
2016-01-01
Objective: The aim of this study was to evaluate the impact of adaptive statistical iterative reconstruction (ASiR) technique on the image quality and radiation dose reduction. The comparison was made with the traditional filtered back projection (FBP) technique. Methods: We retrospectively reviewed 78 patients, who underwent cervical spine CT for blunt cervical trauma between 1 June 2010 and 30 November 2010. 48 patients were imaged using traditional FBP technique and the remaining 30 patients were imaged using the ASiR technique. The patient demographics, radiation dose, objective image signal and noise were recorded; while subjective noise, sharpness, diagnostic acceptability and artefacts were graded by two radiologists blinded to the techniques. Results: We found that the ASiR technique was able to reduce the volume CT dose index, dose–length product and effective dose by 36%, 36.5% and 36.5%, respectively, compared with the FBP technique. There was no significant difference in the image noise (p = 0.39), signal (p = 0.82) and signal-to-noise ratio (p = 0.56) between the groups. The subjective image quality was minimally better in the ASiR group but not statistically significant. There was excellent interobserver agreement on the subjective image quality and diagnostic acceptability for both groups. Conclusion: The use of ASiR technique allowed approximately 36% radiation dose reduction in the evaluation of cervical spine without degrading the image quality. Advances in knowledge: The present study highlights that the ASiR technique is extremely helpful in reducing the patient radiation exposure while maintaining the image quality. It is highly recommended to utilize this novel technique in CT imaging of different body regions. PMID:26882825
Patro, Satya N; Chakraborty, Santanu; Sheikh, Adnan
2016-01-01
The aim of this study was to evaluate the impact of adaptive statistical iterative reconstruction (ASiR) technique on the image quality and radiation dose reduction. The comparison was made with the traditional filtered back projection (FBP) technique. We retrospectively reviewed 78 patients, who underwent cervical spine CT for blunt cervical trauma between 1 June 2010 and 30 November 2010. 48 patients were imaged using traditional FBP technique and the remaining 30 patients were imaged using the ASiR technique. The patient demographics, radiation dose, objective image signal and noise were recorded; while subjective noise, sharpness, diagnostic acceptability and artefacts were graded by two radiologists blinded to the techniques. We found that the ASiR technique was able to reduce the volume CT dose index, dose-length product and effective dose by 36%, 36.5% and 36.5%, respectively, compared with the FBP technique. There was no significant difference in the image noise (p = 0.39), signal (p = 0.82) and signal-to-noise ratio (p = 0.56) between the groups. The subjective image quality was minimally better in the ASiR group but not statistically significant. There was excellent interobserver agreement on the subjective image quality and diagnostic acceptability for both groups. The use of ASiR technique allowed approximately 36% radiation dose reduction in the evaluation of cervical spine without degrading the image quality. The present study highlights that the ASiR technique is extremely helpful in reducing the patient radiation exposure while maintaining the image quality. It is highly recommended to utilize this novel technique in CT imaging of different body regions.
Radio Occultation Investigation of the Rings of Saturn and Uranus
NASA Technical Reports Server (NTRS)
Marouf, Essam A.
1997-01-01
The proposed work addresses two main objectives: (1) to pursue the development of the random diffraction screen model for analytical/computational characterization of the extinction and near-forward scattering by ring models that include particle crowding, uniform clustering, and clustering along preferred orientations (anisotropy). The characterization is crucial for proper interpretation of past (Voyager) and future (Cassini) ring occultation observations in terms of physical ring properties, and is needed to address outstanding puzzles in the interpretation of the Voyager radio occultation data sets; (2) to continue the development of spectral analysis techniques to identify and characterize the power scattered by all features of Saturn's rings that can be resolved in the Voyager radio occultation observations, and to use the results to constrain the maximum particle size and its abundance. Characterization of the variability of surface mass density among the main ring features and within individual features is important for constraining the ring mass and is relevant to investigations of ring dynamics and origin. We completed the development of the stochastic geometry (random screen) model for the interaction of electromagnetic waves with planetary ring models, and used the model to relate the oblique optical depth and the angular spectrum of the near-forward scattered signal to statistical averages of the stochastic geometry of the randomly blocked area. We developed analytical results based on the assumption of Poisson statistics for particle positions, and investigated the dependence of the oblique optical depth and angular spectrum on the fractional area blocked, vertical ring profile, and incidence angle when the volume fraction is small, demonstrating agreement with the classical radiative transfer predictions for oblique incidence. We also developed simulation procedures to generate statistical realizations of random screens corresponding to uniformly packed ring models, and used the results to characterize the dependence of the extinction and near-forward scattering on ring thickness, packing fraction, and the ring opening angle.
Testing prediction methods: Earthquake clustering versus the Poisson model
Michael, A.J.
1997-01-01
Testing earthquake prediction methods requires statistical techniques that compare observed success to random chance. One technique is to produce simulated earthquake catalogs and measure the relative success of predicting real and simulated earthquakes. The accuracy of these tests depends on the validity of the statistical model used to simulate the earthquakes. This study tests the effect of clustering in the statistical earthquake model on the results. Three simulation models were used to produce significance levels for a VLF earthquake prediction method. As the degree of simulated clustering increases, the statistical significance drops. Hence, the use of a seismicity model with insufficient clustering can lead to overly optimistic results. A successful method must pass the statistical tests with a model that fully replicates the observed clustering. However, a method can be rejected based on tests with a model that contains insufficient clustering. U.S. copyright. Published in 1997 by the American Geophysical Union.
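The testing logic described above can be sketched as follows: count how many real earthquakes fall inside the method's alarm windows, then compare that count with the distribution obtained from many simulated catalogs. The sketch uses Poisson (unclustered) catalogs, the case the abstract warns can yield overly optimistic significance; all times and alarm windows are invented.

    # Sketch: Monte Carlo significance of a prediction method against simulated
    # Poisson catalogs. Everything here is synthetic and illustrative.
    import numpy as np

    rng = np.random.default_rng(8)
    T, n_events = 3650.0, 120                    # catalog length [days] and event count (assumed)
    alarms = np.sort(rng.uniform(0, T, 40))      # alarm start times (hypothetical method output)
    alarm_len = 5.0                              # each alarm lasts 5 days (assumed)

    def hits(event_times):
        # an event counts as a "hit" if it falls inside any alarm window
        inside = [np.any((t >= alarms) & (t < alarms + alarm_len)) for t in event_times]
        return int(np.sum(inside))

    real_events = np.sort(rng.uniform(0, T, n_events))   # stand-in for the real catalog
    observed_hits = hits(real_events)

    sim_hits = np.array([hits(rng.uniform(0, T, n_events)) for _ in range(2000)])
    p_value = np.mean(sim_hits >= observed_hits)
    print("observed hits:", observed_hits, " Poisson-catalog p-value:", p_value)

Replacing the uniform simulated catalogs with clustered ones (e.g., an ETAS-type model) is exactly the step the study shows can raise the p-value and overturn an apparently significant result.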
Optimizing exoplanet transit searches
NASA Astrophysics Data System (ADS)
Herrero, E.; Ribas, I.; Jordi, C.
2013-05-01
Exoplanet searches using the transit technique are nowadays providing a great number of findings. Most exoplanet transit detection programs that are currently underway are focused on large catalogs of stars with no pre-selection. This necessarily makes such surveys quite inefficient, because huge amounts of data are processed for a relatively low transiting-planet yield. In this work we investigate a method to increase the efficiency of a targeted exoplanet search with the transit technique by preselecting a subset of candidates from large catalogs of stars. Assuming spin-orbit alignment, this can be done by considering stars that have a higher probability of being oriented nearly equator-on (inclination close to 90°). We use activity-rotation velocity relations for low-mass stars to study the dependence of the position in the activity - v sin(i) diagram on the stellar axis inclination. We compose a catalog of G-, K-, and M-type main sequence simulated stars using isochrones, an isotropic inclination distribution, and empirical relations to obtain their rotation periods and activity indexes. The activity - v sin(i) diagram is then filled, and statistics are applied to trace the areas containing the highest ratio of stars with inclinations above 80°. A similar statistical analysis is applied to stars from real catalogs with log(R'_{HK}) and v sin(i) data to find their probability of being equator-on. We present the method used to generate the simulated star catalog and the subsequent statistics to find the highly inclined stars from real catalogs using the activity - v sin(i) diagram. Several catalogs from the literature are analysed and a subsample of stars with the highest probability of being equator-on is presented. Assuming spin-orbit alignment, the efficiency of an exoplanet transit search in the resulting subsample of probably highly inclined stars is estimated to be two to three times higher than with a global search with no pre-selection.
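The isotropic-inclination assumption behind the pre-selection has a simple consequence: cos(i) is uniform on [0, 1], so the prior fraction of stars seen nearly equator-on (i > 80°) is cos(80°) ≈ 0.17. The short sketch below confirms this by sampling; the figures are illustrative of why enriching that fraction raises the expected transit yield by a factor of a few.

    # Sketch: fraction of isotropically oriented stars with inclination above 80 deg.
    import numpy as np

    rng = np.random.default_rng(9)
    cos_i = rng.uniform(0.0, 1.0, 1_000_000)     # isotropic spin-axis orientations
    incl = np.degrees(np.arccos(cos_i))

    frac_equator_on = np.mean(incl > 80.0)
    print("analytic:", round(np.cos(np.radians(80.0)), 3),
          " sampled:", round(frac_equator_on, 3))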
Hamchevici, Carmen; Udrea, Ion
2013-11-01
The concept of a basin-wide Joint Danube Survey (JDS) was launched by the International Commission for the Protection of the Danube River (ICPDR) as a tool for investigative monitoring under the Water Framework Directive (WFD), with a frequency of 6 years. The first JDS was carried out in 2001, and its success in providing key information for characterisation of the Danube River Basin District as required by the WFD led to the organisation of the second JDS in 2007, which was the world's biggest river research expedition in that year. The present paper presents an approach for improving the survey strategy for the next planned survey, JDS3 (2013), by means of several multivariate statistical techniques. In order to design the optimum structure in terms of parameters and sampling sites, principal component analysis (PCA), factor analysis (FA) and cluster analysis were applied to JDS2 data for 13 selected physico-chemical elements and one biological element measured at 78 sampling sites located on the main course of the Danube. Results from PCA/FA showed that most of the dataset variance (above 75%) was explained by five varifactors loaded with 8 out of 14 variables: physical (transparency and total suspended solids), relevant nutrients (N-nitrates and P-orthophosphates), feedback effects of primary production (pH, alkalinity and dissolved oxygen) and algal biomass. Taking into account the representation of the factor scores given by FA versus sampling sites and the major groups generated by the clustering procedure, the spatial network of the next survey could be carefully tailored, leading to a decrease in the number of sampling sites of more than 30%. The approach of a target-oriented sampling strategy based on the selected multivariate statistics can provide a strong reduction in the dimensionality of the original data, and in the corresponding costs as well, without any loss of information.
UNCERTAINTY ON RADIATION DOSES ESTIMATED BY BIOLOGICAL AND RETROSPECTIVE PHYSICAL METHODS.
Ainsbury, Elizabeth A; Samaga, Daniel; Della Monaca, Sara; Marrale, Maurizio; Bassinet, Celine; Burbidge, Christopher I; Correcher, Virgilio; Discher, Michael; Eakins, Jon; Fattibene, Paola; Güçlü, Inci; Higueras, Manuel; Lund, Eva; Maltar-Strmecki, Nadica; McKeever, Stephen; Rääf, Christopher L; Sholom, Sergey; Veronese, Ivan; Wieser, Albrecht; Woda, Clemens; Trompier, Francois
2018-03-01
Biological and physical retrospective dosimetry are recognised as key techniques to provide individual estimates of dose following unplanned exposures to ionising radiation. Whilst there has been a relatively large amount of recent development in the biological and physical procedures, development of statistical analysis techniques has failed to keep pace. The aim of this paper is to review the current state of the art in uncertainty analysis techniques across the 'EURADOS Working Group 10-Retrospective dosimetry' members, to give concrete examples of implementation of the techniques recommended in the international standards, and to further promote the use of Monte Carlo techniques to support characterisation of uncertainties. It is concluded that sufficient techniques are available and in use by most laboratories for acute, whole body exposures to highly penetrating radiation, but further work will be required to ensure that statistical analysis is always wholly sufficient for the more complex exposure scenarios.
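A minimal sketch of the Monte Carlo uncertainty propagation promoted above, applied to a toy dose estimate based on a linear-quadratic calibration Y = c + alpha*D + beta*D^2 (a common form in biodosimetry). All coefficient values, their uncertainties, and the observed yield are invented.

    # Sketch: Monte Carlo propagation of calibration and measurement uncertainty
    # to a dose estimate. All numbers are assumptions for illustration.
    import numpy as np

    rng = np.random.default_rng(10)
    n_mc = 100_000

    # Calibration coefficients with assumed uncertainties, and an observed yield
    c = rng.normal(0.001, 0.0005, n_mc)
    alpha = rng.normal(0.030, 0.005, n_mc)
    beta = rng.normal(0.060, 0.008, n_mc)
    y_obs = rng.normal(0.45, 0.05, n_mc)         # e.g. aberrations per cell (invented)

    # Solve beta*D^2 + alpha*D + (c - y) = 0 for the positive root
    disc = alpha**2 - 4.0 * beta * (c - y_obs)
    dose = (-alpha + np.sqrt(np.clip(disc, 0.0, None))) / (2.0 * beta)

    lo, med, hi = np.percentile(dose, [2.5, 50.0, 97.5])
    print(f"dose estimate: {med:.2f} Gy (95% interval {lo:.2f}-{hi:.2f} Gy)")

Percentile-based intervals of this kind are one way to report dose uncertainty when the analytic error propagation recommended in the standards becomes awkward, e.g., for the more complex exposure scenarios mentioned above.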
Simulation and statistics: Like rhythm and song
NASA Astrophysics Data System (ADS)
Othman, Abdul Rahman
2013-04-01
Simulation has been introduced to solve problems in the form of systems. By using this technique, the following two types of problems can be overcome. First, a problem that has an analytical solution, but for which the cost of running an experiment to solve it is high in terms of money and lives. Second, a problem that exists but has no analytical solution. In the field of statistical inference the second problem is often encountered. With the advent of high-speed computing devices, a statistician can now use resampling techniques such as the bootstrap and permutations to form a pseudo sampling distribution that will lead to the solution of a problem that cannot be solved analytically. This paper discusses how Monte Carlo simulation was and still is being used to verify analytical solutions in inference. This paper also discusses the resampling techniques as simulation techniques. The misunderstandings about these two techniques are examined. The successful usages of both techniques are also explained.
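The two resampling techniques mentioned above can be sketched in a few lines on synthetic data: a bootstrap percentile interval for a mean and a permutation test for a difference in means.

    # Sketch: bootstrap confidence interval and permutation test on synthetic data.
    import numpy as np

    rng = np.random.default_rng(11)
    x = rng.normal(10.0, 2.0, 30)
    y = rng.normal(11.0, 2.0, 30)

    # Bootstrap: 95% percentile interval for the mean of x
    boot_means = np.array([rng.choice(x, x.size, replace=True).mean() for _ in range(10_000)])
    print("bootstrap 95% CI for mean(x):", np.round(np.percentile(boot_means, [2.5, 97.5]), 2))

    # Permutation test: two-sided p-value for the difference in means between x and y
    observed = y.mean() - x.mean()
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(10_000):
        p = rng.permutation(pooled)
        diff = p[x.size:].mean() - p[:x.size].mean()
        if abs(diff) >= abs(observed):
            count += 1
    print("permutation two-sided p-value:", count / 10_000)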
da Silva, R C V; de Sá, C C; Pascual-Vaca, Á O; de Souza Fontes, L H; Herbella Fernandes, F A M; Dib, R A; Blanco, C R; Queiroz, R A; Navarro-Rodriguez, T
2013-07-01
The treatment of gastroesophageal reflux disease may be clinical or surgical. The clinical treatment consists basically of the use of drugs; however, there are new techniques to complement this treatment, and osteopathic intervention in the diaphragmatic muscle is one of these. The objective of the study is to compare pressure values in the esophageal manometry examination of the lower esophageal sphincter (LES) before and immediately after osteopathic intervention in the diaphragm muscle. Thirty-eight patients with gastroesophageal reflux disease - 16 submitted to a sham technique and 22 submitted to the osteopathic technique - were randomly selected. The average respiratory pressure (ARP) and the maximum expiratory pressure (MEP) of the LES were measured by manometry before and after the osteopathic technique at the point of highest pressure. Statistical analysis was performed using Student's t-test and the Mann-Whitney test, and the magnitude of the effect of the proposed technique was measured using Cohen's index. A statistically significant difference for the osteopathic technique, in relation to the group of patients who underwent the sham technique, was found in three out of four of the following measures of LES pressure: ARP, with P = 0.027. The MEP showed no statistical difference (P = 0.146). The values of Cohen's d for the same measures were: ARP, d = 0.80, and MEP, d = 0.52. The osteopathic manipulative technique produces a positive increment in the LES region soon after its performance. © 2012 Copyright the Authors. Journal compilation © 2012, Wiley Periodicals, Inc. and the International Society for Diseases of the Esophagus.
Vaidya, Sharad; Parkash, Hari; Bhargava, Akshay; Gupta, Sharad
2014-01-01
Abundant resources and techniques have been used for complete coverage crown fabrication. Conventional investing and casting procedures for phosphate-bonded investments require a 2- to 4-h procedure before completion. Accelerated casting techniques have been used, but may not result in castings with matching marginal accuracy. The study measured the marginal gap and determined the clinical acceptability of single cast copings invested in a phosphate-bonded investment with the use of conventional and accelerated methods. One hundred and twenty cast coping samples were fabricated using conventional and accelerated methods, with three finish lines: chamfer, shoulder, and shoulder with bevel. Sixty copings were prepared with each technique. Each coping was examined with a stereomicroscope at four predetermined sites, and measurements of marginal gaps were documented for each. A master chart was prepared for all the data, which were analyzed using the Statistical Package for the Social Sciences. Evidence of a marginal gap was then evaluated by t-test. Analysis of variance and post hoc analysis were used to compare the two groups as well as to make comparisons between the three subgroups. The measurements recorded showed no statistically significant difference between the conventional and accelerated groups. Among the three marginal designs studied, shoulder with bevel showed the best marginal fit with both conventional and accelerated casting techniques. The accelerated casting technique could be a vital alternative to the time-consuming conventional casting technique. The marginal fit between the two casting techniques showed no statistical difference.
Data mining and statistical inference in selective laser melting
Kamath, Chandrika
2016-01-11
Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.
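Two of the ingredients listed above, a data-driven surrogate model and feature ranking, can be sketched as follows. The process-parameter ranges and the toy "density" response are invented and do not represent SLM physics or the paper's data.

    # Sketch: random-forest surrogate of a part-quality response with feature ranking,
    # then cheap screening of candidate parameter sets. All values are invented.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(12)
    n = 400
    power = rng.uniform(150, 400, n)        # laser power [W]     (assumed range)
    speed = rng.uniform(500, 2500, n)       # scan speed [mm/s]   (assumed range)
    hatch = rng.uniform(0.05, 0.15, n)      # hatch spacing [mm]  (assumed range)
    energy = power / (speed * hatch)        # a simple energy-density proxy
    density = 99.5 - 5.0 * (energy - 2.0) ** 2 + rng.normal(0, 0.2, n)  # toy response

    X = np.column_stack([power, speed, hatch])
    surrogate = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, density)
    print("importance of power, speed, hatch:", np.round(surrogate.feature_importances_, 3))

    # The cheap surrogate can screen candidate parameter sets before running simulations
    candidates = np.column_stack([rng.uniform(150, 400, 1000),
                                  rng.uniform(500, 2500, 1000),
                                  rng.uniform(0.05, 0.15, 1000)])
    best = candidates[np.argmax(surrogate.predict(candidates))]
    print("surrogate-suggested parameters:", np.round(best, 3))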
Vexler, Albert; Tanajian, Hovig; Hutson, Alan D
In practice, parametric likelihood-ratio techniques are powerful statistical tools. In this article, we propose and examine novel and simple distribution-free test statistics that efficiently approximate parametric likelihood ratios to analyze and compare distributions of K groups of observations. Using the density-based empirical likelihood methodology, we develop a Stata package that applies to a test for symmetry of data distributions and compares K-sample distributions. Recognizing that recent statistical software packages do not sufficiently address K-sample nonparametric comparisons of data distributions, we propose a new Stata command, vxdbel, to execute exact density-based empirical likelihood-ratio tests using K samples. To calculate p-values of the proposed tests, we use the following methods: 1) a classical technique based on Monte Carlo p-value evaluations; 2) an interpolation technique based on tabulated critical values; and 3) a new hybrid technique that combines methods 1 and 2. The third, cutting-edge method is shown to be very efficient in the context of exact-test p-value computations. This Bayesian-type method considers tabulated critical values as prior information and Monte Carlo generations of test statistic values as data used to depict the likelihood function. In this case, a nonparametric Bayesian method is proposed to compute critical values of exact tests.
NASA Astrophysics Data System (ADS)
Lisa, U. F.; Jalina, M.; Marniati
2017-09-01
Based on interviews with some mothers who entered the first stage of labor, there is a lack of care from health workers in the effort to reduce labor pain. Hospital health care workers are ineffective in implementing maternity nursing interventions to reduce pain during the first stage of labor. There are two methods of pain reduction, pharmacological and non-pharmacological; the latter includes several techniques, among them relaxation and counterpressure massage, which are capable of reducing pain in the first stage of labor. Non-pharmacological care is one of the responsibilities that must be implemented by midwives, especially breathing relaxation techniques and massage. The research is a quasi-experiment with a pretest-posttest design. The statistical tests used were the paired and unpaired t-tests. The reduction in the level of pain before and after the relaxation technique was given yielded a p-value < 0.001, with a mean of 44.00 after treatment and a range of values of 10-90; the reduction in the level of pain before and after the counterpressure massage technique yielded a p-value < 0.001, with a mean of 42.67 after treatment and a range of values of 10-90. There is no significant difference between the relaxation and counterpressure massage techniques in reducing pain in the first stage of labor, because both techniques are highly effective in reducing labor pain (p-value = 0.891). The relaxation and counterpressure massage techniques are useful in providing affectionate maternal care because both techniques work effectively in reducing pain by focusing on the point of pain. Therefore, health workers, and especially those studying this area, should apply relaxation and massage in maternal care, mainly for primigravidae who are inexperienced in the process of labor.
Playing at Statistical Mechanics
ERIC Educational Resources Information Center
Clark, Paul M.; And Others
1974-01-01
Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
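In the spirit of the sorting game described, the microstate counts for the named distributions reduce to simple combinatorics; the sketch below counts the ways to place n indistinguishable particles in g single-particle states under Fermi-Dirac and Bose-Einstein rules (values chosen arbitrarily).

    # Counting sketch: microstate counts for n indistinguishable particles in g states.
    from math import comb

    def fermi_dirac(n, g):
        return comb(g, n)            # at most one particle per state

    def bose_einstein(n, g):
        return comb(n + g - 1, n)    # unlimited occupancy ("stars and bars")

    n, g = 3, 5
    print("Fermi-Dirac microstates:", fermi_dirac(n, g))       # C(5,3) = 10
    print("Bose-Einstein microstates:", bose_einstein(n, g))   # C(7,3) = 35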
ERIC Educational Resources Information Center
Wallman, Katherine K.
The main responsibility of the U.S. Bureau of the Census, Bureau of Labor Statistics, and the National Centers for Health and Education Statistics is to collect, process, analyze, and disseminate statistical data on the economic, physical, and social characteristics of the United States. Under the Paperwork Reduction Act of 1980, the federal…
Potentiation Following Ballistic and Nonballistic Complexes: The Effect of Strength Level.
Suchomel, Timothy J; Sato, Kimitake; DeWeese, Brad H; Ebben, William P; Stone, Michael H
2016-07-01
Suchomel, TJ, Sato, K, DeWeese, BH, Ebben, WP, and Stone, MH. Potentiation following ballistic and nonballistic complexes: the effect of strength level. J Strength Cond Res 30(7): 1825-1833, 2016-The purpose of this study was to compare the temporal profile of strong and weak subjects during ballistic and nonballistic potentiation complexes. Eight strong (relative back squat = 2.1 ± 0.1 times body mass) and 8 weak (relative back squat = 1.6 ± 0.2 times body mass) males performed squat jumps immediately and every minute up to 10 minutes following potentiation complexes that included ballistic or nonballistic concentric-only half-squat (COHS) performed at 90% of their 1 repetition maximum COHS. Jump height (JH) and allometrically scaled peak power (PPa) were compared using a series of 2 × 12 repeated measures analyses of variance. No statistically significant strength level main effects for JH (p = 0.442) or PPa (p = 0.078) existed during the ballistic condition. In contrast, statistically significant main effects for time existed for both JH (p = 0.014) and PPa (p < 0.001); however, no statistically significant pairwise comparisons were present (p > 0.05). Statistically significant strength level main effects existed for PPa (p = 0.039) but not for JH (p = 0.137) during the nonballistic condition. Post hoc analysis revealed that the strong subjects produced statistically greater PPa than the weaker subjects (p = 0.039). Statistically significant time main effects existed for PPa (p = 0.015), but not for JH (p = 0.178). No statistically significant strength level × time interaction effects for JH (p = 0.319) or PPa (p = 0.203) were present for the ballistic or nonballistic conditions. Practical significance indicated by effect sizes and the relationships between maximum potentiation and relative strength suggest that stronger subjects potentiate earlier and to a greater extent than weaker subjects during ballistic and nonballistic potentiation complexes.
Immigration Statistics for the 21st Century
Massey, Douglas S.
2013-01-01
Of the three main contributors to population growth—fertility, mortality, and net migration—the latter is by far the most difficult to capture statistically. This article discusses the main sources of federal statistical data on immigration, each with its own characteristic set of strengths, weaknesses, possibilities, and limitations in the context of the interested social scientist. Among the key limitations, the article argues, are the elimination of parental birthplace from the Census and the lack of complete data concerning the legal statuses of the U.S. population. This article will conclude with suggestions on remedying such deficiencies, at relatively low marginal cost, such as the inclusion of questions on parental birthplace, instituting a regular survey of randomly selected legal immigrants, and the use of the “two-card method” in statistical data. PMID:23990685
Analyzing Faculty Salaries When Statistics Fail.
ERIC Educational Resources Information Center
Simpson, William A.
The role played by nonstatistical procedures, in contrast to multivariate statistical approaches, in analyzing faculty salaries is discussed. Multivariate statistical methods are usually used to establish or defend against prima facie cases of gender and ethnic discrimination with respect to faculty salaries. These techniques are not applicable,…
Explorations in Statistics: Correlation
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2010-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This sixth installment of "Explorations in Statistics" explores correlation, a familiar technique that estimates the magnitude of a straight-line relationship between two variables. Correlation is meaningful only when the…
Is it Possible to Sanitize Athletes' Shoes?
Messina, Gabriele; Burgassi, Sandra; Russo, Carmela; Ceriale, Emma; Quercioli, Cecilia; Meniconi, Cosetta
2015-01-01
Context: Footwear should be designed to avoid trauma and injury to the skin of the feet that can favor bacterial and fungal infections. Procedures and substances for sanitizing the interior of shoes are uncommon but are important aspects of primary prevention against foot infections and unpleasant odor. Objective: To evaluate the efficacy of a sanitizing technique for reducing bacterial and fungal contamination of footwear. Design: Crossover study. Setting: Mens Sana basketball team. Patients or Other Participants: Twenty-seven male athletes and 4 coaches (62 shoes). Intervention(s): The experimental protocol required a first sample (swab), 1/shoe, at time 0 from inside the shoes of all athletes before the sanitizing technique began and a second sample at time 1, after about 4 weeks, April 2012 to May 2012, of daily use of the sanitizing technique. Main Outcome Measure(s): The differences before and after use of the sanitizing technique for total bacterial count at 36°C and 22°C for Staphylococcus spp, yeasts, molds, Enterococcus spp, Pseudomonas spp, Escherichia coli, and total coliform bacteria were evaluated. Results: Before use of the sanitizing technique, the total bacterial counts at 36°C and 22°C and for Staphylococcus spp were greater by a factor of 5.8 (95% confidence interval [CI] = 3.42, 9.84), 5.84 (95% CI = 3.45, 9.78), and 4.78 (95% CI = 2.84, 8.03), respectively. All the other comparisons showed a reduction in microbial loads, whereas E coli and coliforms were no longer detected. No statistically significant decrease in yeasts (P = .0841) or molds (P = .6913) was recorded probably because of low contamination. Conclusions: The sanitizing technique significantly reduced the bacterial presence in athletes' shoes. PMID:25415415
Data Mining: Going beyond Traditional Statistics
ERIC Educational Resources Information Center
Zhao, Chun-Mei; Luan, Jing
2006-01-01
The authors provide an overview of data mining, giving special attention to the relationship between data mining and statistics to unravel some misunderstandings about the two techniques. (Contains 1 figure.)
Noudel, R; Gomis, P; Duntze, J; Marnet, D; Bazin, A; Roche, P H
2009-08-01
Therapeutic options for vestibular schwannomas (VS) include microsurgery, stereotactic radiosurgery and conservative management. Early treatment of intracanalicular vestibular schwannomas (IVS) may be advisable because their spontaneous course will show hearing loss in most cases. Advanced microsurgical techniques and continuous intraoperative monitoring of cranial nerves may allow hearing preservation (HP) without facial nerve damage. However, there are still controversies about the definition of hearing preservation, and the best surgical approach that should be used. In this study, we reviewed the main data from the recent literature on IVS surgery and compared hearing, facial function and complication rates after the retrosigmoid (RS) and middle fossa (MF) approaches, respectively. The results showed that the average HP rate after IVS surgery ranged from 58% (RS) to 62% (MF). HP varied widely depending on the audiometric criteria that were used for definition of serviceable hearing. There was a trend to show that the MF approach offered a better quality of postoperative hearing (not statistically significant), whereas the RS approach offered a better facial nerve preservation and fewer complications (not statistically significant). We believe that the timing of treatment in the course of the disease and selection between radiosurgical versus microsurgical procedure are key issues in the management of IVS. Preservation of hearing and good facial nerve function in surgery for VS is a reasonable goal for many patients with intracanalicular tumors and serviceable hearing. Once open surgery has been decided, selection of the approach mainly depends on individual anatomical considerations and experience of the surgeon.
Voitenkov, Voitenkov Vladislav; Andrey, Klimkin; Natalia, Skripchenko; Anastasia, Aksenova
2017-01-01
Context: The diagnosis of polyneuropathy may be challenging at the early stages of the disease. Despite electromyography (EMG) efficacy in the establishment of polyneuropathy diagnosis, in some cases, results are dubious and neurophysiologists may implement additional techniques to ensure that conduction is affected. Aims: The aim of the study was to evaluate motor-evoked potential (MEP) characteristics in children with acute inflammatory demyelinating polyneuropathy (AIDP). Settings and Design: The study was conducted at a pediatric research and clinical center for infectious diseases. Subjects and Methods: Twenty healthy children (7–14 years old) without signs of neurological disorders were enrolled as controls. Thirty-seven patients (8–13 years old) with AIDP were enrolled as the main group. EMG and transcranial magnetic stimulation (TMS) were performed on the 3rd–7th days from the onset of the first symptoms. Statistical Analysis Used: Descriptive statistics and Student's t-test were used. The Bonferroni method was applied to implement appropriate corrections for multiple comparisons. Results: Significant differences between children with AIDP and controls in the latencies of both cortical and lumbar MEPs were registered. Cortical MEP shapes were disperse in 100% of the cases and lumbar MEPs were disperse in 57% of the cases. Conclusions: Diagnostic TMS at the early stage of AIDP in children may be implemented as an additional tool. The main finding in this population is lengthening of the latency of cortical and lumbar MEPs. A disperse shape of the lumbar MEPs may be used as an early sign of acute demyelination. PMID:28904571
Line identification studies using traditional techniques and wavelength coincidence statistics
NASA Technical Reports Server (NTRS)
Cowley, Charles R.; Adelman, Saul J.
1990-01-01
Traditional line identification techniques result in the assignment of individual lines to an atomic or ionic species. These methods may be supplemented by wavelength coincidence statistics (WCS). The strengths and weaknesses of these methods are discussed using spectra of a number of normal and peculiar B and A stars that have been studied independently by both methods. The present results support the overall findings of some earlier studies. WCS would be most useful in a first survey, before traditional methods have been applied. WCS can quickly make a global search for all species and in this way may enable identification of an unexpected spectrum that could easily be omitted entirely from a traditional study. This is illustrated by O I. WCS is subject to the well-known weaknesses of any statistical technique; for example, a predictable number of spurious results is to be expected. The dangers of small-number statistics are illustrated. WCS is at its best relative to traditional methods in finding a line-rich atomic species that is only weakly present in a complicated stellar spectrum.
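As a hedged sketch of the WCS idea (not the authors' code), the Python fragment below counts coincidences between an observed line list and a candidate species' laboratory lines within a wavelength tolerance, and estimates the chance-coincidence level by Monte Carlo; all wavelengths are invented.

```python
# Wavelength coincidence counting with a Monte Carlo estimate of the number
# of matches expected by chance alone.
import numpy as np

def n_coincidences(observed, lab, tol=0.05):
    """Observed lines lying within tol (Angstroms) of any laboratory line."""
    return sum(np.any(np.abs(lab - w) <= tol) for w in observed)

rng = np.random.default_rng(42)
observed = np.sort(rng.uniform(4000.0, 4500.0, 200))   # hypothetical stellar line list
lab      = np.sort(rng.uniform(4000.0, 4500.0, 80))    # hypothetical species line list

hits = n_coincidences(observed, lab)
null = np.array([n_coincidences(observed, rng.uniform(4000.0, 4500.0, lab.size))
                 for _ in range(1000)])                 # chance-coincidence distribution
p_value = (np.sum(null >= hits) + 1) / (null.size + 1)
print(hits, null.mean().round(1), p_value)
```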
High order statistical signatures from source-driven measurements of subcritical fissile systems
NASA Astrophysics Data System (ADS)
Mattingly, John Kelly
1998-11-01
This research focuses on the development and application of high order statistical analyses applied to measurements performed with subcritical fissile systems driven by an introduced neutron source. The signatures presented are derived from counting statistics of the introduced source and radiation detectors that observe the response of the fissile system. It is demonstrated that successively higher order counting statistics possess progressively higher sensitivity to reactivity. Consequently, these signatures are more sensitive to changes in the composition, fissile mass, and configuration of the fissile assembly. Furthermore, it is shown that these techniques are capable of distinguishing the response of the fissile system to the introduced source from its response to any internal or inherent sources. This ability combined with the enhanced sensitivity of higher order signatures indicates that these techniques will be of significant utility in a variety of applications. Potential applications include enhanced radiation signature identification of weapons components for nuclear disarmament and safeguards applications and augmented nondestructive analysis of spent nuclear fuel. In general, these techniques expand present capabilities in the analysis of subcritical measurements.
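A hedged sketch of what higher-order counting statistics can look like in practice (not the specific signatures developed in this work): normalized factorial moments of gated detector counts, which vanish for a pure Poisson source and grow when counts are correlated by fission-chain multiplication.

```python
# Second- and third-order normalized factorial moments of detector counts.
# For an uncorrelated (Poisson) source both quantities are near zero; excess
# values indicate correlated multiplication and hence sensitivity to reactivity.
import numpy as np

def normalized_factorial_moments(counts):
    c = np.asarray(counts, dtype=float)
    m1 = c.mean()
    m2 = (c * (c - 1)).mean()            # second factorial moment
    m3 = (c * (c - 1) * (c - 2)).mean()  # third factorial moment
    return m2 / m1**2 - 1.0, m3 / m1**3 - 1.0

rng = np.random.default_rng(7)
poisson_counts = rng.poisson(lam=20.0, size=100_000)   # stand-in for gated counts
print(normalized_factorial_moments(poisson_counts))    # both close to zero
```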
Piezosurgery versus Rotatory Osteotomy in Mandibular Impacted Third Molar Extraction
Bhati, Bharat; Kukreja, Pankaj; Kumar, Sanjeev; Rathi, Vidhi C.; Singh, Kanika; Bansal, Shipra
2017-01-01
Aim: The aim of this study is to compare piezoelectric surgery versus the rotatory osteotomy technique in removal of mandibular impacted third molars. Materials and Methods: The sample comprised 30 patients (18 males, 12 females) with a mean age of 27.43 ± 5.27 years. Bilateral extractions were required in all patients. All the patients were randomly allocated to two groups: in the control group, surgical extraction of the mandibular third molar was done using conventional rotatory osteotomy, and in the test group, extraction of the lower third molar was done using a Piezotome. Results: Parameters assessed in this study were mouth opening (interincisal opening), pain (visual analog scale [VAS] score), swelling, incidence of dry socket, paresthesia, and duration of surgery in both groups at baseline and on the 1st, 3rd, and 7th postoperative days. Comparison of the pain scores revealed a statistically significant difference between the two groups (P < 0.05). Mean surgical time was longer for the piezosurgery group (51.40 ± 17.9 minutes) than for the conventional rotatory group (37.33 ± 15.5 minutes), a statistically significant difference (P = 0.002). Conclusion: The main advantages of piezosurgery include soft tissue protection, optimal visibility in the surgical field, decreased blood loss, less vibration and noise, increased comfort for the patient, and protection of tooth structures. Therefore, although it takes longer than the conventional rotatory technique, the piezoelectric device was efficient in decreasing the short-term outcomes of pain and swelling and significantly reduces the postoperative sequelae associated with third molar surgery. PMID:28713729
Gene Identification Algorithms Using Exploratory Statistical Analysis of Periodicity
NASA Astrophysics Data System (ADS)
Mukherjee, Shashi Bajaj; Sen, Pradip Kumar
2010-10-01
Studying periodic patterns is expected to be a standard line of attack for recognizing DNA sequences in gene identification and similar problems, but surprisingly little significant work has been done in this direction. This paper studies statistical properties of DNA sequences of complete genomes using a new technique. A DNA sequence is converted to a numeric sequence using various types of mappings, and the standard Fourier technique is applied to study the periodicity. Distinct statistical behaviour of the periodicity parameters is found in coding and non-coding sequences, which can be used to distinguish between these parts. Here, DNA sequences of Drosophila melanogaster were analyzed with significant accuracy.
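A minimal sketch of the mapping-plus-Fourier idea (assumed details, not the authors' algorithm): map the DNA string to four binary indicator sequences, sum their Fourier power, and inspect the component at period 3, which tends to be elevated in coding regions.

```python
# Period-3 spectral signal of a DNA sequence via binary indicator mappings.
import numpy as np

def period3_ratio(seq):
    seq = seq.upper()
    n = len(seq)
    power = np.zeros(n)
    for base in "ACGT":
        indicator = np.array([1.0 if b == base else 0.0 for b in seq])
        power += np.abs(np.fft.fft(indicator)) ** 2
    k3 = n // 3                                  # frequency bin for period 3
    return power[k3] / power[1:n // 2].mean()    # peak-to-background ratio

rng = np.random.default_rng(1)
random_seq = "".join(rng.choice(list("ACGT"), size=300))
print(period3_ratio(random_seq))   # near 1 for random DNA; noticeably larger in coding regions
```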
Statistical innovations in the medical device world sparked by the FDA.
Campbell, Gregory; Yue, Lilly Q
2016-01-01
The world of medical devices while highly diverse is extremely innovative, and this facilitates the adoption of innovative statistical techniques. Statisticians in the Center for Devices and Radiological Health (CDRH) at the Food and Drug Administration (FDA) have provided leadership in implementing statistical innovations. The innovations discussed include: the incorporation of Bayesian methods in clinical trials, adaptive designs, the use and development of propensity score methodology in the design and analysis of non-randomized observational studies, the use of tipping-point analysis for missing data, techniques for diagnostic test evaluation, bridging studies for companion diagnostic tests, quantitative benefit-risk decisions, and patient preference studies.
Rule-based statistical data mining agents for an e-commerce application
NASA Astrophysics Data System (ADS)
Qin, Yi; Zhang, Yan-Qing; King, K. N.; Sunderraman, Rajshekhar
2003-03-01
Intelligent data mining techniques have useful e-Business applications. Because an e-Commerce application is related to multiple domains such as statistical analysis, market competition, price comparison, profit improvement and personal preferences, this paper presents a hybrid knowledge-based e-Commerce system fusing intelligent techniques, statistical data mining, and personal information to enhance QoS (Quality of Service) of e-Commerce. A Web-based e-Commerce application software system, eDVD Web Shopping Center, is successfully implemented using Java servlets and an Oracle 8i database server. Simulation results have shown that the hybrid intelligent e-Commerce system is able to make smart decisions for different customers.
Effects of preprocessing Landsat MSS data on derived features
NASA Technical Reports Server (NTRS)
Parris, T. M.; Cicone, R. C.
1983-01-01
Important to the use of multitemporal Landsat MSS data for earth resources monitoring, such as agricultural inventories, is the ability to minimize the effects of varying atmospheric and satellite viewing conditions, while extracting physically meaningful features from the data. In general, the approaches to the preprocessing problem have been derived from either physical or statistical models. This paper compares three proposed algorithms: XSTAR haze correction, Color Normalization, and Multiple Acquisition Mean Level Adjustment. These techniques represent physical, statistical, and hybrid physical-statistical models, respectively. The comparisons are made in the context of three feature extraction techniques: the Tasseled Cap, the Cate Color Cube, and the Normalized Difference.
Huang, Guanxing; Chen, Zongyu; Liu, Fan; Sun, Jichao; Wang, Jincui
2014-11-01
Anthropogenic factors resulting from urbanization may affect groundwater As in urbanized areas. Groundwater samples from Guangzhou city (South China) were collected for As and other parameter analyses in order to assess the impact of urbanization and natural processes on As distribution in aquifers. Nearly 25.5% of groundwater samples were above the WHO drinking water standard for As, and the As concentrations in the granular aquifer (GA) were generally far higher than those in the fractured bedrock aquifer (FBA). Samples were classified into four clusters by using hierarchical cluster analysis. Cluster 1 is mainly located in the FBA and controlled by natural processes. Anthropogenic pollution resulting from urbanization is responsible for the high As concentrations identified in cluster 2. Clusters 3 and 4 are mainly located in the GA and controlled by both natural processes and anthropogenic factors. Three main mechanisms control the source and mobilization of groundwater As in the study area. Firstly, the interaction of water and calcareous rocks appears to be responsible for As release in the FBA. Secondly, reduction of Fe/Mn oxyhydroxides and decomposition of organic matter are probably responsible for high As concentrations in the GA. Thirdly, during the process of urbanization, the infiltration of wastewater/leachate with a high As content is likely to be the main source of groundwater As, while NO3(-) contamination diminishes groundwater As.
A comparison of sequential and spiral scanning techniques in brain CT.
Pace, Ivana; Zarb, Francis
2015-01-01
To evaluate and compare image quality and radiation dose of sequential computed tomography (CT) examinations of the brain and spiral CT examinations of the brain imaged on a GE HiSpeed NX/I Dual Slice 2CT scanner. A random sample of 40 patients referred for CT examination of the brain was selected and divided into 2 groups. Half of the patients were scanned using the sequential technique; the other half were scanned using the spiral technique. Radiation dose data—both the computed tomography dose index (CTDI) and the dose length product (DLP)—were recorded on a checklist at the end of each examination. Using the European Guidelines on Quality Criteria for Computed Tomography, 4 radiologists conducted a visual grading analysis and rated the level of visibility of 6 anatomical structures considered necessary to produce images of high quality. The mean CTDI(vol) and DLP values were statistically significantly higher (P <.05) with the sequential scans (CTDI(vol): 22.06 mGy; DLP: 304.60 mGy • cm) than with the spiral scans (CTDI(vol): 14.94 mGy; DLP: 229.10 mGy • cm). The mean image quality rating scores for all criteria of the sequential scanning technique were statistically significantly higher (P <.05) in the visual grading analysis than those of the spiral scanning technique. In this local study, the sequential technique was preferred over the spiral technique for both overall image quality and differentiation between gray and white matter in brain CT scans. Other similar studies counter this finding. The radiation dose seen with the sequential CT scanning technique was significantly higher than that seen with the spiral CT scanning technique. However, image quality with the sequential technique was statistically significantly superior (P <.05).
Statistical characterization of short wind waves from stereo images of the sea surface
NASA Astrophysics Data System (ADS)
Mironov, Alexey; Yurovskaya, Maria; Dulov, Vladimir; Hauser, Danièle; Guérin, Charles-Antoine
2013-04-01
We propose a methodology to extract short-scale statistical characteristics of the sea surface topography by means of stereo image reconstruction. The possibilities and limitations of the technique are discussed and tested on a data set acquired from an oceanographic platform at the Black Sea. The analysis shows that reconstruction of the topography based on the stereo method is an efficient way to derive non-trivial statistical properties of short and intermediate surface waves (from about 1 centimeter to 1 meter). Most technical issues pertaining to this type of dataset (limited range of scales, lacunarity of data or irregular sampling) can be partially overcome by appropriate processing of the available points. The proposed technique also allows one to avoid linear interpolation, which dramatically corrupts properties of retrieved surfaces. The processing technique requires that the field of elevations be polynomially detrended, which has the effect of filtering out the large scales. Hence the statistical analysis can only address the small-scale components of the sea surface. The precise cut-off wavelength, which is approximately half the patch size, can be obtained by applying a high-pass frequency filter on the reference gauge time records. The results obtained for the one- and two-point statistics of small-scale elevations are shown to be consistent, at least in order of magnitude, with the corresponding gauge measurements as well as other experimental measurements available in the literature. The calculation of the structure functions provides a powerful tool to investigate spectral and statistical properties of the field of elevations. Experimental parametrization of the third-order structure function, the so-called skewness function, is one of the most important and original outcomes of this study. This function is of primary importance in analytical scattering models of the sea surface and was up to now unavailable in field conditions. Due to the lack of precise reference measurements for the small-scale wave field, we could not quantify exactly the accuracy of the retrieval technique. However, it appeared clearly that the obtained accuracy is good enough for the estimation of second-order statistical quantities (such as the correlation function), acceptable for third-order quantities (such as the skewness function) and insufficient for fourth-order quantities (such as kurtosis). Therefore, the stereo technique at the present stage should not be thought of as a self-contained universal tool to characterize the surface statistics. Instead, it should be used in conjunction with other well-calibrated but sparse reference measurements (such as wave gauges) for cross-validation and calibration. It then completes the statistical analysis inasmuch as it provides a snapshot of the three-dimensional field and allows for the evaluation of higher-order spatial statistics.
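To make the structure-function machinery concrete, the hedged sketch below computes second- and third-order structure functions (the latter being the skewness function discussed above) on a detrended synthetic 1-D elevation transect; it is an illustration, not the processing chain used by the authors.

```python
# Second- and third-order structure functions of a detrended elevation profile.
import numpy as np

def structure_functions(z, max_lag):
    S2, S3 = [], []
    for lag in range(1, max_lag + 1):
        dz = z[lag:] - z[:-lag]
        S2.append(np.mean(dz ** 2))   # related to the correlation function
        S3.append(np.mean(dz ** 3))   # the "skewness function"
    return np.array(S2), np.array(S3)

rng = np.random.default_rng(3)
x = np.linspace(0.0, 2.0, 2000)                               # metres along the transect
z = 0.01 * np.sin(2 * np.pi * x / 0.3) + 0.002 * rng.standard_normal(x.size)
z -= np.polyval(np.polyfit(x, z, 2), x)                       # polynomial detrending

S2, S3 = structure_functions(z, max_lag=200)
print(S2[:3], S3[:3])
```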
Resampling: A Marriage of Computers and Statistics. ERIC/TM Digest.
ERIC Educational Resources Information Center
Rudner, Lawrence M.; Shafer, Mary Morello
Advances in computer technology are making it possible for educational researchers to use simpler statistical methods to address a wide range of questions with smaller data sets and fewer, and less restrictive, assumptions. This digest introduces computationally intensive statistics, collectively called resampling techniques. Resampling is a…
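As a small, hedged illustration of one resampling technique (the bootstrap), the sketch below builds a confidence interval for a mean from invented scores without distributional assumptions.

```python
# Bootstrap confidence interval for the mean of a small sample.
import numpy as np

rng = np.random.default_rng(0)
scores = np.array([72, 85, 90, 66, 78, 88, 95, 70, 81, 77])   # invented data

boot_means = np.array([rng.choice(scores, size=scores.size, replace=True).mean()
                       for _ in range(10_000)])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"sample mean {scores.mean():.1f}, 95% bootstrap CI ({lo:.1f}, {hi:.1f})")
```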
ERIC Educational Resources Information Center
Foley, Gregory D.; Khoshaim, Heba Bakr; Alsaeed, Maha; Er, S. Nihan
2012-01-01
Attending professional development programmes can support teachers in applying new strategies for teaching mathematics and statistics. This study investigated (a) the extent to which the participants in a professional development programme subsequently used the techniques they had learned when teaching mathematics and statistics and (b) the…
Using Statistical Process Control to Make Data-Based Clinical Decisions.
ERIC Educational Resources Information Center
Pfadt, Al; Wheeler, Donald J.
1995-01-01
Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…
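A hedged sketch of one SPC tool mentioned above, an individuals control chart: the centre line is the mean and the control limits sit three estimated standard deviations away, with sigma estimated from the average moving range. The behavioural counts are invented.

```python
# Individuals (XmR-style) control chart limits and out-of-control signals.
import numpy as np

def control_limits(x):
    x = np.asarray(x, dtype=float)
    centre = x.mean()
    avg_moving_range = np.abs(np.diff(x)).mean()
    sigma_hat = avg_moving_range / 1.128       # d2 constant for subgroups of size 2
    return centre, centre - 3 * sigma_hat, centre + 3 * sigma_hat

daily_counts = np.array([12, 14, 11, 13, 15, 12, 10, 14, 13, 25, 12, 11])
centre, lcl, ucl = control_limits(daily_counts)
signals = np.where((daily_counts < lcl) | (daily_counts > ucl))[0]
print(f"centre={centre:.1f}, LCL={lcl:.1f}, UCL={ucl:.1f}, signals at index {signals}")
```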
The Role of the Sampling Distribution in Understanding Statistical Inference
ERIC Educational Resources Information Center
Lipson, Kay
2003-01-01
Many statistics educators believe that few students develop the level of conceptual understanding essential for them to apply correctly the statistical techniques at their disposal and to interpret their outcomes appropriately. It is also commonly believed that the sampling distribution plays an important role in developing this understanding.…
Statistical Analysis For Nucleus/Nucleus Collisions
NASA Technical Reports Server (NTRS)
Mcguire, Stephen C.
1989-01-01
Report describes use of several statistical techniques to characterize angular distributions of secondary particles emitted in collisions of atomic nuclei in energy range of 24 to 61 GeV per nucleon. Purpose of statistical analysis is to determine correlations between intensities of emitted particles and angles confirming existence of quark/gluon plasma.
Thermographic techniques and adapted algorithms for automatic detection of foreign bodies in food
NASA Astrophysics Data System (ADS)
Meinlschmidt, Peter; Maergner, Volker
2003-04-01
At the moment, foreign substances in food are detected mainly by using mechanical and optical methods as well as ultrasonic techniques, and are then removed from the process. These techniques detect a large portion of the foreign substances due to their different mass (mechanical sieving), their different colour (optical method) and their different surface density (ultrasonic detection). Despite the numerous different methods, a considerable portion of the foreign substances remains undetected. In order to recognise materials still undetected, a complementary detection method would be desirable to remove from the production process the foreign substances not registered by the above-mentioned methods. In a project with 13 partners from the food industry, the Fraunhofer-Institut für Holzforschung (WKI) and the Technische Universität are trying to adapt thermography for the detection of foreign bodies in the food industry. After the initial tests turned out to be very promising for the differentiation of foodstuffs and foreign substances, more detailed investigations were carried out to develop suitable algorithms for automatic detection of foreign bodies. In order to achieve not only visual detection of foreign substances but also automatic detection under production conditions, extensive experience in image processing and pattern recognition is exploited. Results for the detection of foreign bodies will be presented at the conference, showing the different advantages and disadvantages of using grey-level, statistical and morphological image processing techniques.
Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Shiyu, E-mail: shiyu.xu@gmail.com; Chen, Ying, E-mail: adachen@siu.edu; Lu, Jianping
2015-09-15
Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors’ goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications.
Three-dimensional accuracy of different correction methods for cast implant bars
Kwon, Ji-Yung; Kim, Chang-Whe; Lim, Young-Jun; Kwon, Ho-Beom
2014-01-01
PURPOSE The aim of the present study was to evaluate the accuracy of three techniques for correction of cast implant bars. MATERIALS AND METHODS Thirty cast implant bars were fabricated on a metal master model. All cast implant bars were sectioned at 5 mm from the left gold cylinder using a disk of 0.3 mm thickness, and then each group of ten specimens was corrected by gas-air torch soldering, laser welding, or the additional casting technique. Three-dimensional evaluation including horizontal, vertical, and twisting measurements was based on measurement and comparison of (1) gap distances of the right abutment replica-gold cylinder interface at the buccal, distal, and lingual sides, (2) changes of bar length, and (3) axis angle changes of the right gold cylinders at the post-correction measurement step in the three groups, with a contact and non-contact coordinate measuring machine. One-way analysis of variance (ANOVA) and paired t-test were performed at the significance level of 5%. RESULTS Gap distances of the cast implant bars after the correction procedure showed no statistically significant difference among groups. Changes in bar length between the pre-casting and post-correction measurements were statistically significant among groups. Axis angle changes of the right gold cylinders were not statistically significant among groups. CONCLUSION There was no statistically significant difference among the three techniques in horizontal, vertical and axial errors. However, the gas-air torch soldering technique showed the most consistent and accurate trend in the correction of implant bar error. The laser welding technique showed a large mean and standard deviation in the vertical and twisting measurements and might be a technique-sensitive method. PMID:24605205
Alkarkhi, Abbas F M; Ramli, Saifullah Bin; Easa, Azhar Mat
2009-01-01
Major (sodium, potassium, calcium, magnesium) and minor elements (iron, copper, zinc, manganese) and one heavy metal (lead) of Cavendish banana flour and Dream banana flour were determined, and data were analyzed using multivariate statistical techniques of factor analysis and discriminant analysis. Factor analysis yielded four factors explaining more than 81% of the total variance: the first factor explained 28.73%, comprising magnesium, sodium, and iron; the second factor explained 21.47%, comprising only manganese and copper; the third factor explained 15.66%, comprising zinc and lead; while the fourth factor explained 15.50%, comprising potassium. Discriminant analysis showed that magnesium and sodium exhibited a strong contribution in discriminating the two types of banana flour, affording 100% correct assignation. This study presents the usefulness of multivariate statistical techniques for analysis and interpretation of complex mineral content data from banana flour of different varieties.
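The two multivariate steps described above can be reproduced in outline as follows; the sketch uses synthetic mineral concentrations and scikit-learn's FactorAnalysis and LinearDiscriminantAnalysis, and is not the authors' analysis.

```python
# Factor analysis to group correlated elements, then discriminant analysis to
# separate the two flour types (synthetic data).
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
elements = ["Na", "K", "Ca", "Mg", "Fe", "Cu", "Zn", "Mn", "Pb"]
flour_a = rng.normal(loc=1.0, scale=0.2, size=(20, len(elements)))   # e.g. "Cavendish"
flour_b = rng.normal(loc=1.3, scale=0.2, size=(20, len(elements)))   # e.g. "Dream"
X = np.vstack([flour_a, flour_b])
y = np.array([0] * 20 + [1] * 20)

fa = FactorAnalysis(n_components=4, random_state=0).fit(X)
print("loadings (elements x factors):\n", fa.components_.T.round(2))

lda = LinearDiscriminantAnalysis().fit(X, y)
print("resubstitution classification accuracy:", lda.score(X, y))
```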
Ariew, André
2007-03-01
Charles Darwin, James Clerk Maxwell, and Francis Galton were all aware, by various means, of Adolphe Quetelet's pioneering work in statistics. Darwin, Maxwell, and Galton all had reason to be interested in Quetelet's work: they were all working on some instance of how large-scale regularities emerge from individual events that vary from one another; all were rejecting the divine interventionistic theories of their contemporaries; and Quetelet's techniques provided them with a way forward. Maxwell and Galton both explicitly endorse Quetelet's techniques in their work; Darwin does not incorporate any of the statistical ideas of Quetelet, although natural selection post-twentieth-century synthesis has. Why not Darwin? My answer is that by the time Darwin encountered Malthus's law of excess reproduction he had all he needed to account for the large-scale regularities in extinctions, speciation, and adaptation. He didn't need Quetelet.
Visualizing statistical significance of disease clusters using cartograms.
Kronenfeld, Barry J; Wong, David W S
2017-05-15
Health officials and epidemiological researchers often use maps of disease rates to identify potential disease clusters. Because these maps exaggerate the prominence of low-density districts and hide potential clusters in urban (high-density) areas, many researchers have used density-equalizing maps (cartograms) as a basis for epidemiological mapping. However, we do not have existing guidelines for visual assessment of statistical uncertainty. To address this shortcoming, we develop techniques for visual determination of statistical significance of clusters spanning one or more districts on a cartogram. We developed the techniques within a geovisual analytics framework that does not rely on automated significance testing, and can therefore facilitate visual analysis to detect clusters that automated techniques might miss. On a cartogram of the at-risk population, the statistical significance of a disease cluster can be determined from the rate, area and shape of the cluster under standard hypothesis testing scenarios. We develop formulae to determine, for a given rate, the area required for statistical significance of a priori and a posteriori designated regions under certain test assumptions. Uniquely, our approach enables dynamic inference of aggregate regions formed by combining individual districts. The method is implemented in interactive tools that provide choropleth mapping, automated legend construction and dynamic search tools to facilitate cluster detection and assessment of the validity of tested assumptions. A case study of leukemia incidence analysis in California demonstrates the ability to visually distinguish between statistically significant and insignificant regions. The proposed geovisual analytics approach enables intuitive visual assessment of statistical significance of arbitrarily defined regions on a cartogram. Our research prompts a broader discussion of the role of geovisual exploratory analyses in disease mapping and the appropriate framework for visually assessing the statistical significance of spatial clusters.
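As a hedged numerical companion to the formulae mentioned above (an assumed Poisson test for an a priori designated region, not the authors' exact derivation): on a cartogram of the at-risk population, a cluster's area is proportional to its population, so the smallest significant area for a given elevated rate can be found by scanning population sizes.

```python
# Smallest at-risk population (i.e. cartogram area) at which a region with a
# given elevated rate becomes significant at level alpha under a Poisson model.
import numpy as np
from scipy import stats

def min_population_for_significance(background_rate, cluster_rate, alpha=0.05):
    for pop in np.arange(1_000, 2_000_000, 1_000):
        expected = background_rate * pop
        observed = np.ceil(cluster_rate * pop)
        p_val = stats.poisson.sf(observed - 1, expected)   # P(X >= observed | background)
        if p_val < alpha:
            return int(pop)
    return None

# Background rate 10 per 100,000; cluster rate 15 per 100,000
print(min_population_for_significance(10e-5, 15e-5))
```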
NASA Astrophysics Data System (ADS)
Su, Wen-Ray; Tsai, Yuan-Fan; Huang, Kuei-Chin; Hsieh, Ching-En
2017-04-01
To facilitate disaster response and enhance the effectiveness of disaster prevention and relief, people and emergency response personnel should be able to rapidly acquire and understand information when disasters occur. However, in existing disaster platforms information is typically presented in text tables, static charts, and maps with points. These formats do not make it easy for users to understand the overall situation. Therefore, this study converts data into human-readable charts by using data visualization techniques, and builds a disaster information dashboard that is concise, attractive and flexible. This information dashboard integrates temporally and spatially correlated data, disaster statistics according to category and county, lists of disasters, and any other relevant information. The graphs are animated and interactive. The dashboard allows users to filter the data according to their needs and thus to assimilate the information more rapidly. In this study, we applied the information dashboard to the analysis of landslides during three typhoon events in 2016: Typhoon Nepartak, Typhoon Meranti and Typhoon Megi. According to the statistical results in the dashboard, the order of frequency of the disaster categories in all three events combined was rock fall, roadbed loss, slope slump, road blockage and debris flow. Disasters occurred mainly in the areas that received the most rainfall. Typhoons Nepartak and Meranti mainly affected Taitung, and Typhoon Megi mainly affected Kaohsiung. The towns Xiulin, Fengbin, Fenglin and Guangfu in Hualian County were all issued with debris flow warnings in all three typhoon events. The disaster information dashboard developed in this study allows the user to rapidly assess the overall disaster situation. It clearly and concisely reveals interactions between time, space and disaster type, and also provides comprehensive details about the disaster. The dashboard provides a foundation for future disaster visualization, since it can combine and present real-time information of various types; as such it will strengthen decision making in disaster prevention management.
Housing decision making methods for initiation development phase process
NASA Astrophysics Data System (ADS)
Zainal, Rozlin; Kasim, Narimah; Sarpin, Norliana; Wee, Seow Ta; Shamsudin, Zarina
2017-10-01
Late delivery and 'sick' housing project problems have been attributed to poor decision making. These problems stem from housing developers who prefer to create their own approaches based on their experience and expertise, taking the simplest route of merely applying the available standards and rules in decision making. This paper seeks to identify the decision-making methods for housing development at the initiation phase in Malaysia. The research applied the Delphi method, using a questionnaire survey of 50 developers as the sample in the primary data-collection stage. However, only 34 developers contributed to the second stage of the information-gathering process, and at the last stage only 12 developers were left for the final data-collection process. The findings affirm that Malaysian developers prefer to make their investment decisions based on simple interpolation of historical data and use simple statistical or mathematical techniques to produce the required reports. It was suggested that they seemed to skip several important decision-making functions at the primary development stage. These shortcomings were mainly due to time and financial constraints and the lack of statistical or mathematical expertise among the professional and management groups in the developer organisations.
PMMA/PS coaxial electrospinning: a statistical analysis on processing parameters
NASA Astrophysics Data System (ADS)
Rahmani, Shahrzad; Arefazar, Ahmad; Latifi, Masoud
2017-08-01
Coaxial electrospinning, as a versatile method for producing core-shell fibers, is known to be very sensitive to two classes of influential factors including material and processing parameters. Although coaxial electrospinning has been the focus of many studies, the effects of processing parameters on the outcomes of this method have not yet been well investigated. A good knowledge of the impacts of processing parameters and their interactions on coaxial electrospinning can make it possible to better control and optimize this process. Hence, in this study, the statistical technique of response surface method (RSM) using the design of experiments on four processing factors of voltage, distance, core and shell flow rates was applied. Transmission electron microscopy (TEM), scanning electron microscopy (SEM), oil immersion and Fluorescent microscopy were used to characterize fiber morphology. The core and shell diameters of fibers were measured and the effects of all factors and their interactions were discussed. Two polynomial models with acceptable R-squares were proposed to describe the core and shell diameters as functions of the processing parameters. Voltage and distance were recognized as the most significant and influential factors on shell diameter, while core diameter was mainly under the influence of core and shell flow rates besides the voltage.
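A minimal sketch of the response-surface step (synthetic data and an assumed functional form, not the authors' fitted models): a second-order polynomial in the four processing factors is fitted to a shell-diameter response.

```python
# Second-order response-surface fit for shell diameter vs. processing factors.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
# columns: voltage (kV), distance (cm), core flow rate, shell flow rate (mL/h)
X = rng.uniform([10, 10, 0.2, 1.0], [20, 25, 1.0, 3.0], size=(30, 4))
shell_diameter = (500 - 12 * X[:, 0] + 4 * X[:, 1] + 60 * X[:, 3]
                  + 0.5 * X[:, 0] * X[:, 1] + rng.normal(0, 10, 30))   # nm, synthetic

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, shell_diameter)
print("R^2 on the design points:", round(model.score(X, shell_diameter), 3))
```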
The Promises and Pitfalls of Genoeconomics*
Benjamin, Daniel J.; Cesarini, David; Chabris, Christopher F.; Glaeser, Edward L.; Laibson, David I.; Guðnason, Vilmundur; Harris, Tamara B.; Launer, Lenore J.; Purcell, Shaun; Smith, Albert Vernon; Johannesson, Magnus; Magnusson, Patrik K.E.; Beauchamp, Jonathan P.; Christakis, Nicholas A.; Atwood, Craig S.; Hebert, Benjamin; Freese, Jeremy; Hauser, Robert M.; Hauser, Taissa S.; Grankvist, Alexander; Hultman, Christina M.; Lichtenstein, Paul
2012-01-01
This article reviews existing research at the intersection of genetics and economics, presents some new findings that illustrate the state of genoeconomics research, and surveys the prospects of this emerging field. Twin studies suggest that economic outcomes and preferences, once corrected for measurement error, appear to be about as heritable as many medical conditions and personality traits. Consistent with this pattern, we present new evidence on the heritability of permanent income and wealth. Turning to genetic association studies, we survey the main ways that the direct measurement of genetic variation across individuals is likely to contribute to economics, and we outline the challenges that have slowed progress in making these contributions. The most urgent problem facing researchers in this field is that most existing efforts to find associations between genetic variation and economic behavior are based on samples that are too small to ensure adequate statistical power. This has led to many false positives in the literature. We suggest a number of possible strategies to improve and remedy this problem: (a) pooling data sets, (b) using statistical techniques that exploit the greater information content of many genes considered jointly, and (c) focusing on economically relevant traits that are most proximate to known biological mechanisms. PMID:23482589
NASA Astrophysics Data System (ADS)
Munawar, Iqra
2016-07-01
Crime mapping is a dynamic process. It can be used to assist all stages of the problem-solving process. Mapping crime can help police protect citizens more effectively. The decision to utilize a certain type of map or design element may change based on the purpose of a map, the audience or the available data. If the purpose of the crime analysis map is to assist in the identification of a particular problem, selected data may be mapped to identify patterns of activity that have previously gone undetected. The main objective of this research was to study the spatial distribution patterns of four common crimes, i.e., narcotics, arms, burglary and robbery, in Gujranwala City using spatial statistical techniques to identify the hotspots. Hotspots, or locations of clusters, were identified using the Getis-Ord Gi* statistic. Crime analysis mapping can be used to conduct a comprehensive spatial analysis of the problem. Graphic presentations of such findings provide a powerful medium to communicate conditions, patterns and trends, thus creating an avenue for analysts to bring about significant policy changes. Moreover, crime mapping also helps in reducing the crime rate.
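For readers unfamiliar with the statistic, the hedged sketch below computes Getis-Ord Gi* z-scores on a toy grid of crime counts with a simple distance-band neighbourhood; real hotspot analyses would typically be run in GIS software on geocoded incidents.

```python
# Getis-Ord Gi* hotspot statistic on a toy grid of crime counts.
import numpy as np

def getis_ord_gi_star(values, coords, radius=1.5):
    x = np.asarray(values, dtype=float)
    n = x.size
    xbar, s = x.mean(), x.std()
    z = np.zeros(n)
    for i in range(n):
        dist = np.linalg.norm(coords - coords[i], axis=1)
        w = (dist <= radius).astype(float)        # neighbourhood includes the focal cell
        sw, sw2 = w.sum(), (w ** 2).sum()
        numerator = (w * x).sum() - xbar * sw
        denominator = s * np.sqrt((n * sw2 - sw ** 2) / (n - 1))
        z[i] = numerator / denominator
    return z

side = 10
coords = np.array([(i, j) for i in range(side) for j in range(side)], dtype=float)
counts = np.random.default_rng(2).poisson(3.0, side * side).astype(float)
counts[44:47] += 15                               # inject a small cluster of incidents
z_scores = getis_ord_gi_star(counts, coords)
print("hotspot cells (Gi* z > 2.58):", np.where(z_scores > 2.58)[0])
```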
Approaching Career Criminals With An Intelligence Cycle
2015-12-01
…including arrest statistics, and "arrest statistics have been used as the main barometer of juvenile delinquent activity, (but) many juvenile…" …guided by theories about the causes of delinquent behavior, but there was no determination if those efforts achieved the… "children." However, the most evidence-based comparison of juvenile delinquency reduction programs is the statistical meta-analysis (a systematic…
Nasirudin, Radin A.; Mei, Kai; Panchev, Petar; Fehringer, Andreas; Pfeiffer, Franz; Rummeny, Ernst J.; Fiebich, Martin; Noël, Peter B.
2015-01-01
Purpose The exciting prospect of Spectral CT (SCT) using photon-counting detectors (PCD) will lead to new techniques in computed tomography (CT) that take advantage of the additional spectral information provided. We introduce a method to reduce metal artifact in X-ray tomography by incorporating knowledge obtained from SCT into a statistical iterative reconstruction scheme. We call our method Spectral-driven Iterative Reconstruction (SPIR). Method The proposed algorithm consists of two main components: material decomposition and penalized maximum likelihood iterative reconstruction. In this study, the spectral data acquisitions with an energy-resolving PCD were simulated using a Monte-Carlo simulator based on EGSnrc C++ class library. A jaw phantom with a dental implant made of gold was used as an object in this study. A total of three dental implant shapes were simulated separately to test the influence of prior knowledge on the overall performance of the algorithm. The generated projection data was first decomposed into three basis functions: photoelectric absorption, Compton scattering and attenuation of gold. A pseudo-monochromatic sinogram was calculated and used as input in the reconstruction, while the spatial information of the gold implant was used as a prior. The results from the algorithm were assessed and benchmarked with state-of-the-art reconstruction methods. Results Decomposition results illustrate that gold implant of any shape can be distinguished from other components of the phantom. Additionally, the result from the penalized maximum likelihood iterative reconstruction shows that artifacts are significantly reduced in SPIR reconstructed slices in comparison to other known techniques, while at the same time details around the implant are preserved. Quantitatively, the SPIR algorithm best reflects the true attenuation value in comparison to other algorithms. Conclusion It is demonstrated that the combination of the additional information from Spectral CT and statistical reconstruction can significantly improve image quality, especially streaking artifacts caused by the presence of materials with high atomic numbers. PMID:25955019
NASA Astrophysics Data System (ADS)
Krezhova, Dora; Krezhov, Kiril; Maneva, Svetla; Moskova, Irina; Petrov, Nikolay
2016-07-01
A hyperspectral remote sensing technique, based on reflectance measurements acquired in a large number of contiguous spectral bands in the visible and near-infrared spectral ranges, was used to detect the influence of environmental changes on vegetation ecosystems. Adverse physical and biological conditions give rise to morphological, physiological, and biochemical changes in plants that affect the manner in which they interact with light. All green vegetation species have unique spectral features, mainly because of chlorophyll, carotenoids and other pigments, and water content. Because spectral reflectance is a function of the illumination conditions, tissue optical properties and biochemical content of the plants, it may be used to collect information on several important biophysical parameters such as the color and spectral signature of features, vegetation chlorophyll absorption characteristics, vegetation moisture content, etc. Remotely sensed data collected by means of a portable fiber-optics spectrometer in the spectral range 350-1100 nm were used to extract information on the influence of some environmental changes. Stress factors such as enhanced UV radiation, salinity and viral infections were applied to young plant species (potato, tomato, plum). The test data were subjected to different digital data processing techniques, including statistical (Student's t-criterion), first-derivative and cluster analyses and some vegetation indices. Statistical analyses were carried out in the four most informative regions for the investigated species: green (520-580 nm), red (640-680 nm), red edge (680-720 nm) and near infrared (720-780 nm). The strong relationship found between the results from the remote sensing technique and some biochemical and serological analyses (stress markers, DAS-ELISA) indicates the importance of hyperspectral reflectance data for conducting, easily and without damage, rapid assessments of plant biophysical variables. Emphasis is put on the current capability and future potential of remote sensing for assessment of plant health and on the optimum spectral regions and vegetation indices for sensing these biophysical variables.
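As a hedged illustration of two steps mentioned above, the sketch below computes NDVI from mean red and near-infrared reflectance and compares control and stressed plants with Student's t-test; the spectra are synthetic placeholders for 350-1100 nm measurements.

```python
# NDVI from reflectance spectra plus a two-sample t-test between groups.
import numpy as np
from scipy import stats

wavelengths = np.arange(350, 1101)   # nm

def ndvi(spectrum):
    red = spectrum[(wavelengths >= 640) & (wavelengths <= 680)].mean()
    nir = spectrum[(wavelengths >= 720) & (wavelengths <= 780)].mean()
    return (nir - red) / (nir + red)

rng = np.random.default_rng(8)
def fake_spectrum(nir_level):
    s = np.full(wavelengths.size, 0.05)           # low reflectance in the visible
    s[wavelengths > 700] = nir_level              # near-infrared plateau
    return s + rng.normal(0, 0.005, wavelengths.size)

control  = np.array([ndvi(fake_spectrum(0.45)) for _ in range(10)])
stressed = np.array([ndvi(fake_spectrum(0.35)) for _ in range(10)])
t, p = stats.ttest_ind(control, stressed)
print(f"NDVI control={control.mean():.3f}, stressed={stressed.mean():.3f}, p={p:.2e}")
```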
39 CFR 3050.1 - Definitions applicable to this part.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., mathematical, or statistical theory, precept, or assumption applied by the Postal Service in producing a..., or statistical theory, precept, or assumption. A change in quantification technique should not change...
39 CFR 3050.1 - Definitions applicable to this part.
Code of Federal Regulations, 2013 CFR
2013-07-01
..., mathematical, or statistical theory, precept, or assumption applied by the Postal Service in producing a..., or statistical theory, precept, or assumption. A change in quantification technique should not change...
Barry, R J
1993-01-01
Two apparently new effects in human cardiac responding, "primary bradycardia" and "vagal inhibition", were first described by the Laceys. These effects have been considered by some researchers to reflect differential cardiac innervation, analogous to similar effects observed in animal preparations with direct vagal stimulation. However, it has been argued that such effects arise merely from the data-analytic techniques introduced by the Laceys, and hence are not genuine cardiac cycle effects. Jennings, van der Molen, Somsen and Ridderinkhoff (Psychophysiology, 28 (1991) 596-606) recently proposed a plotting technique and statistical procedure in an attempt to resolve this issue. The present paper demonstrates that the plotting technique fails to achieve their stated aim, since it identifies data from identical cardiac responses as showing cardiac-cycle effects. In addition, the statistical procedure is shown to be reducible to a trivial test of response occurrence. The implication of these demonstrations, in the context of other work, is that this area of investigation has reached a dead end.
Statistical Techniques for Assessing water‐quality effects of BMPs
Walker, John F.
1994-01-01
Little has been published on the effectiveness of various management practices in small rural lakes and streams at the watershed scale. In this study, statistical techniques were used to test for changes in water‐quality data from watersheds where best management practices (BMPs) were implemented. Reductions in data variability due to climate and seasonality were accomplished through the use of regression methods. This study discusses the merits of using storm‐mass‐transport data as a means of improving the ability to detect BMP effects on stream‐water quality. Statistical techniques were applied to suspended‐sediment records from three rural watersheds in Illinois for the period 1981–84. None of the techniques identified changes in suspended sediment, primarily because of the small degree of BMP implementation and because of potential errors introduced through the estimation of storm‐mass transport. A Monte Carlo sensitivity analysis was used to determine the level of discrete change that could be detected for each watershed. In all cases, the use of regressions improved the ability to detect trends.
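A hedged sketch of the Monte Carlo sensitivity idea (illustrative numbers, not the study's data): impose step reductions of increasing size on the post-BMP half of a synthetic, regression-adjusted record and note the smallest change the test detects reliably.

```python
# Detection power of a two-sample t-test for step changes of increasing size.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
n_pre, n_post, sigma = 60, 60, 1.0        # residual standard deviation (arbitrary units)

def detection_power(step, n_rep=2000, alpha=0.05):
    hits = 0
    for _ in range(n_rep):
        pre  = rng.normal(0.0, sigma, n_pre)
        post = rng.normal(-step, sigma, n_post)
        if stats.ttest_ind(pre, post).pvalue < alpha:
            hits += 1
    return hits / n_rep

for step in (0.2, 0.4, 0.6, 0.8):
    print(f"step change of {step:.1f} sigma -> detection power {detection_power(step):.2f}")
```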
Application of multivariate statistical techniques in microbial ecology.
Paliy, O; Shankar, V
2016-03-01
Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological data sets. In particular, noticeable effect has been attained in the field of microbial ecology, where new experimental approaches provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. © 2016 John Wiley & Sons Ltd.
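As a minimal sketch of one exploratory step discussed in the review (ordination by principal component analysis of relative abundances), the following uses a synthetic sample-by-taxon count table; it is not tied to any particular study.

```python
# PCA ordination of community profiles after a relative-abundance transform.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(9)
counts = rng.poisson(lam=rng.uniform(1, 50, size=(20, 100)))   # 20 samples x 100 taxa
rel_abund = counts / counts.sum(axis=1, keepdims=True)

pca = PCA(n_components=2)
scores = pca.fit_transform(rel_abund)
print("variance explained:", pca.explained_variance_ratio_.round(3))
print("first two sample scores:\n", scores[:2].round(3))
```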
The applications of statistical quantification techniques in nanomechanics and nanoelectronics.
Mai, Wenjie; Deng, Xinwei
2010-10-08
Although nanoscience and nanotechnology have been developing for approximately two decades and have achieved numerous breakthroughs, the experimental results from nanomaterials with a higher noise level and poorer repeatability than those from bulk materials still remain as a practical issue, and challenge many techniques of quantification of nanomaterials. This work proposes a physical-statistical modeling approach and a global fitting statistical method to use all the available discrete data or quasi-continuous curves to quantify a few targeted physical parameters, which can provide more accurate, efficient and reliable parameter estimates, and give reasonable physical explanations. In the resonance method for measuring the elastic modulus of ZnO nanowires (Zhou et al 2006 Solid State Commun. 139 222-6), our statistical technique gives E = 128.33 GPa instead of the original E = 108 GPa, and unveils a negative bias adjustment f(0). The causes are suggested by the systematic bias in measuring the length of the nanowires. In the electronic measurement of the resistivity of a Mo nanowire (Zach et al 2000 Science 290 2120-3), the proposed new method automatically identified the importance of accounting for the Ohmic contact resistance in the model of the Ohmic behavior in nanoelectronics experiments. The 95% confidence interval of resistivity in the proposed one-step procedure is determined to be 3.57 +/- 0.0274 x 10( - 5) ohm cm, which should be a more reliable and precise estimate. The statistical quantification technique should find wide applications in obtaining better estimations from various systematic errors and biased effects that become more significant at the nanoscale.
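The general idea of the global fitting approach, as far as it can be inferred from the abstract, is to fit all measured points jointly to a physical model that includes a bias term rather than estimating the parameter point by point. The sketch below illustrates that idea with an invented model and synthetic data; the functional form and all numbers are assumptions, not the authors' model.

```python
# Minimal sketch of the general idea (not the authors' code): fit all measurements
# jointly to a model containing both the physical parameter of interest (E) and a
# bias/offset term (f0), and report both with uncertainties.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)

def model(x, E, f0):
    # Hypothetical response: the physical parameter E scales sqrt(x); f0 is a bias term.
    return np.sqrt(E * x) + f0

x = np.linspace(1.0, 10.0, 30)
y = model(x, 120.0, -0.8) + rng.normal(0.0, 0.3, x.size)   # synthetic "measurements"

popt, pcov = curve_fit(model, x, y, p0=[100.0, 0.0])
perr = np.sqrt(np.diag(pcov))
print(f"E  = {popt[0]:.1f} +/- {perr[0]:.1f}")
print(f"f0 = {popt[1]:.2f} +/- {perr[1]:.2f}")
```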
A constant current charge technique for low Earth orbit life testing
NASA Technical Reports Server (NTRS)
Glueck, Peter
1991-01-01
A constant current charge technique for low earth orbit testing of nickel cadmium cells is presented. The method mimics the familiar taper charge of the constant potential technique while maintaining cell independence for statistical analysis. A detailed example application is provided and the advantages and disadvantages of this technique are discussed.
Statistical analysis of 59 inspected SSME HPFTP turbine blades (uncracked and cracked)
NASA Technical Reports Server (NTRS)
Wheeler, John T.
1987-01-01
The numerical results of statistical analysis of the test data of Space Shuttle Main Engine high pressure fuel turbopump second-stage turbine blades, including some with cracks are presented. Several statistical methods use the test data to determine the application of differences in frequency variations between the uncracked and cracked blades.
Weak value amplification considered harmful
NASA Astrophysics Data System (ADS)
Ferrie, Christopher; Combes, Joshua
2014-03-01
We show, using statistically rigorous arguments, that the technique of weak value amplification does not perform better than standard statistical techniques for the tasks of parameter estimation and signal detection. We show that using all data and considering the joint distribution of all measurement outcomes yields the optimal estimator. Moreover, we show that estimation using the maximum likelihood technique with weak values as small as possible produces better performance for quantum metrology. In doing so, we identify the optimal experimental arrangement to be the one which reveals the maximal eigenvalue of the square of system observables. We also show that these conclusions do not change in the presence of technical noise.
Improving Focal Depth Estimates: Studies of Depth Phase Detection at Regional Distances
NASA Astrophysics Data System (ADS)
Stroujkova, A.; Reiter, D. T.; Shumway, R. H.
2006-12-01
The accurate estimation of the depth of small, regionally recorded events continues to be an important and difficult explosion monitoring research problem. Depth phases (free surface reflections) are the primary tool that seismologists use to constrain the depth of a seismic event. When depth phases from an event are detected, an accurate source depth is easily found by using the delay times of the depth phases relative to the P wave and a velocity profile near the source. Cepstral techniques, including cepstral F-statistics, represent a class of methods designed for depth-phase detection and identification; however, they offer only a moderate level of success at epicentral distances less than 15°. This is due to complexities in the Pn coda, which can lead to numerous false detections in addition to the true phase detection. Therefore, cepstral methods cannot be used independently to reliably identify depth phases. Other evidence, such as apparent velocities, amplitudes and frequency content, must be used to confirm whether the phase is truly a depth phase. In this study we used a variety of array methods to estimate apparent phase velocities and arrival azimuths, including beam-forming, semblance analysis, MUltiple SIgnal Classification (MUSIC) (e.g., Schmidt, 1979), and cross-correlation (e.g., Cansi, 1995; Tibuleac and Herrin, 1997). To facilitate the processing and comparison of results, we developed a MATLAB-based processing tool, which allows application of all of these techniques (i.e., augmented cepstral processing) in a single environment. The main objective of this research was to combine the results of three focal-depth estimation techniques and their associated standard errors into a statistically valid unified depth estimate. The three techniques include: 1. Direct focal depth estimate from the depth-phase arrival times picked via augmented cepstral processing. 2. Hypocenter location from direct and surface-reflected arrivals observed on sparse networks of regional stations using a Grid-search, Multiple-Event Location method (GMEL; Rodi and Toksöz, 2000; 2001). 3. Surface-wave dispersion inversion for event depth and focal mechanism (Herrmann and Ammon, 2002). To validate our approach and provide quality control for our solutions, we applied the techniques to moderate-sized events (mb between 4.5 and 6.0) with known focal mechanisms. We illustrate the techniques using events observed at regional distances from the KSAR (Wonju, South Korea) teleseismic array and other nearby broadband three-component stations. Our results indicate that the techniques can produce excellent agreement between the various depth estimates. In addition, combining the techniques into a "unified" estimate greatly reduced location errors and improved robustness of the solution, even if results from the individual methods yielded large standard errors.
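One standard way to merge independent estimates that carry known standard errors, which may or may not match the weighting actually used in this work, is an inverse-variance weighted mean. A minimal sketch with hypothetical depth estimates:

```python
# Minimal sketch, assuming the three methods return independent depth estimates
# with Gaussian standard errors: an inverse-variance weighted mean as one way to
# form a unified estimate (the paper's exact combination rule may differ).
import numpy as np

depths = np.array([12.4, 10.9, 11.8])   # km, hypothetical: cepstral, GMEL, surface-wave
sigmas = np.array([1.5, 0.8, 1.2])      # km, associated standard errors (hypothetical)

w = 1.0 / sigmas**2
unified = np.sum(w * depths) / np.sum(w)
unified_se = np.sqrt(1.0 / np.sum(w))
print(f"unified depth = {unified:.2f} +/- {unified_se:.2f} km")
```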
Multispectral scanner system parameter study and analysis software system description, volume 2
NASA Technical Reports Server (NTRS)
Landgrebe, D. A. (Principal Investigator); Mobasseri, B. G.; Wiersma, D. J.; Wiswell, E. R.; Mcgillem, C. D.; Anuta, P. E.
1978-01-01
The author has identified the following significant results. The integration of the available methods provided the analyst with the unified scanner analysis package (USAP), the flexibility and versatility of which was superior to many previous integrated techniques. The USAP consisted of three main subsystems: (1) a spatial path, (2) a spectral path, and (3) a set of analytic classification accuracy estimators which evaluated the system performance. The spatial path consisted of satellite and/or aircraft data, data correlation analyzer, scanner IFOV, and random noise model. The output of the spatial path was fed into the analytic classification and accuracy predictor. The spectral path consisted of laboratory and/or field spectral data, EXOSYS data retrieval, optimum spectral function calculation, data transformation, and statistics calculation. The output of the spectral path was fed into the stratified posterior performance estimator.
The Malpractice of Statistical Interpretation
ERIC Educational Resources Information Center
Fraas, John W.; Newman, Isadore
1978-01-01
Problems associated with the use of gain scores, analysis of covariance, multicollinearity, part and partial correlation, and the lack of rectilinearity in regression are discussed. Particular attention is paid to the misuse of statistical techniques. (JKS)
Applying Regression Analysis to Problems in Institutional Research.
ERIC Educational Resources Information Center
Bohannon, Tom R.
1988-01-01
Regression analysis is one of the most frequently used statistical techniques in institutional research. Principles of least squares, model building, residual analysis, influence statistics, and multi-collinearity are described and illustrated. (Author/MSE)
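A compact illustration of the diagnostics named above, using synthetic data rather than institutional records, is sketched below: ordinary least squares followed by leverage, Cook's distance, and variance inflation factors for multicollinearity.

```python
# Minimal sketch (illustrative, not from the article): OLS with residual and
# influence diagnostics -- leverage (hat values), Cook's distance, and VIFs.
import numpy as np

rng = np.random.default_rng(3)
n = 60
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(scale=0.6, size=n)          # deliberately correlated with x1
y = 2.0 + 1.5 * x1 - 0.7 * x2 + rng.normal(scale=1.0, size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
p = X.shape[1]
sigma2 = resid @ resid / (n - p)

H = X @ np.linalg.inv(X.T @ X) @ X.T                   # hat matrix
lev = np.diag(H)
cooks = resid**2 / (p * sigma2) * lev / (1 - lev) ** 2 # Cook's distance per observation

def vif(X, j):
    """Variance inflation factor for column j (column 0 is the intercept)."""
    others = [k for k in range(X.shape[1]) if k != j]
    b, *_ = np.linalg.lstsq(X[:, others], X[:, j], rcond=None)
    e = X[:, j] - X[:, others] @ b
    r2 = 1 - e @ e / np.sum((X[:, j] - X[:, j].mean()) ** 2)
    return 1.0 / (1.0 - r2)

print("coefficients:", beta.round(2))
print("max leverage:", lev.max().round(3), " max Cook's D:", cooks.max().round(3))
print("VIF x1, x2:", round(vif(X, 1), 2), round(vif(X, 2), 2))
```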
42 CFR 1003.109 - Notice of proposed determination.
Code of Federal Regulations, 2010 CFR
2010-10-01
... briefly describe the statistical sampling technique utilized by the Inspector General); (3) The reason why... statistical sampling in accordance with § 1003.133 in which case the notice shall describe those claims and...
11 CFR 9036.4 - Commission review of submissions.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., in conducting its review, may utilize statistical sampling techniques. Based on the results of its... nonmatchable and the reason that it is not matchable; or if statistical sampling is used, the estimated amount...
STATISTICAL SAMPLING AND DATA ANALYSIS
Research is being conducted to develop approaches to improve soil and sediment sampling techniques, measurement design and geostatistics, and data analysis via chemometric, environmetric, and robust statistical methods. Improvements in sampling contaminated soil and other hetero...
NASA Astrophysics Data System (ADS)
Richfield, Jon; bookfeller
2016-07-01
In reply to Ralph Kenna and Pádraig Mac Carron's feature article “Maths meets myths” in which they describe how they are using techniques from statistical physics to characterize the societies depicted in ancient Icelandic sagas.
Wavelet analysis in ecology and epidemiology: impact of statistical tests
Cazelles, Bernard; Cazelles, Kévin; Chavez, Mario
2014-01-01
Wavelet analysis is now frequently used to extract information from ecological and epidemiological time series. Statistical hypothesis tests are conducted on associated wavelet quantities to assess the likelihood that they are due to a random process. Such random processes represent null models and are generally based on synthetic data that share some statistical characteristics with the original time series. This allows the comparison of null statistics with those obtained from original time series. When creating synthetic datasets, different techniques of resampling result in different characteristics shared by the synthetic time series. Therefore, it becomes crucial to consider the impact of the resampling method on the results. We have addressed this point by comparing seven different statistical testing methods applied with different real and simulated data. Our results show that statistical assessment of periodic patterns is strongly affected by the choice of the resampling method, so two different resampling techniques could lead to two different conclusions about the same time series. Moreover, our results clearly show the inadequacy of resampling series generated by white noise and red noise that are nevertheless the methods currently used in the wide majority of wavelets applications. Our results highlight that the characteristics of a time series, namely its Fourier spectrum and autocorrelation, are important to consider when choosing the resampling technique. Results suggest that data-driven resampling methods should be used such as the hidden Markov model algorithm and the ‘beta-surrogate’ method. PMID:24284892
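The paper's central point, that the null model built into the resampling scheme drives the conclusion, can be reproduced in miniature. The sketch below (illustrative, not the authors' code) tests the same series against white-noise surrogates and against AR(1) red-noise surrogates and reports the two p-values for its strongest periodogram peak.

```python
# Minimal sketch: the significance of a spectral/periodic feature depends strongly
# on how the surrogate series are generated. All data here are simulated.
import numpy as np

rng = np.random.default_rng(4)
n = 256
# "Observed" series: AR(1) background plus a weak periodic component
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + rng.normal()
x += 0.8 * np.sin(2 * np.pi * np.arange(n) / 32)

def peak_power(series):
    """Maximum periodogram power, excluding the zero frequency."""
    ps = np.abs(np.fft.rfft(series - series.mean())) ** 2
    return ps[1:].max()

obs = peak_power(x)

def surrogate_pvalue(make_surrogate, n_surr=999):
    exceed = sum(peak_power(make_surrogate()) >= obs for _ in range(n_surr))
    return (exceed + 1) / (n_surr + 1)

white = lambda: rng.normal(0, x.std(), n)

def red():
    # AR(1) surrogate with lag-1 autocorrelation matched to the observed series
    phi = np.corrcoef(x[:-1], x[1:])[0, 1]
    innov_sd = x.std() * np.sqrt(1 - phi**2)
    s = np.zeros(n)
    for t in range(1, n):
        s[t] = phi * s[t - 1] + rng.normal(0, innov_sd)
    return s

print("p-value vs white-noise surrogates:", surrogate_pvalue(white))
print("p-value vs red-noise surrogates:  ", surrogate_pvalue(red))
```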
Statistical process management: An essential element of quality improvement
NASA Astrophysics Data System (ADS)
Buckner, M. R.
Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis, and concurrent engineering are important elements of systematic planning and analysis that are needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within the members' own Westinghouse departments as well as within other US and foreign industry. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.
Interpolative modeling of GaAs FET S-parameter data bases for use in Monte Carlo simulations
NASA Technical Reports Server (NTRS)
Campbell, L.; Purviance, J.
1992-01-01
A statistical interpolation technique is presented for modeling GaAs FET S-parameter measurements for use in the statistical analysis and design of circuits. This is accomplished by interpolating among the measurements in a GaAs FET S-parameter data base in a statistically valid manner.
ERIC Educational Resources Information Center
Hardy, Melissa
2005-01-01
This article presents a response to Timothy Patrick Moran's article "The Sociology of Teaching Graduate Statistics." In his essay, Moran argues that exciting developments in techniques of quantitative analysis are currently coupled with a much less exciting formulaic approach to teaching sociology graduate students about quantitative analysis. The…
NCES Handbook of Survey Methods. NCES 2011-609
ERIC Educational Resources Information Center
Burns, Shelley, Ed.; Wang, Xiaolei, Ed.; Henning, Alexandra, Ed.
2011-01-01
Since its inception, the National Center for Education Statistics (NCES) has been committed to the practice of documenting its statistical methods for its customers and of seeking to avoid misinterpretation of its published data. The reason for this policy is to assure customers that proper statistical standards and techniques have been observed,…
Applying Statistical Process Quality Control Methodology to Educational Settings.
ERIC Educational Resources Information Center
Blumberg, Carol Joyce
A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the null X (mean), R (Range), X (individual observations), MR (moving…
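A minimal sketch of the X-bar and R charts mentioned above, using simulated subgrouped measurements and the standard tabulated constants for subgroups of size five (illustrative only, not tied to any educational data set):

```python
# Minimal sketch: classical X-bar and R control chart limits for subgrouped data.
import numpy as np

rng = np.random.default_rng(5)
subgroups = rng.normal(loc=10.0, scale=0.5, size=(25, 5))   # 25 subgroups of 5 measurements

xbar = subgroups.mean(axis=1)
ranges = subgroups.max(axis=1) - subgroups.min(axis=1)
xbarbar, rbar = xbar.mean(), ranges.mean()

A2, D3, D4 = 0.577, 0.0, 2.114          # tabulated constants for subgroup size n = 5
xbar_lcl, xbar_ucl = xbarbar - A2 * rbar, xbarbar + A2 * rbar
r_lcl, r_ucl = D3 * rbar, D4 * rbar

print(f"X-bar chart: CL = {xbarbar:.3f}, LCL = {xbar_lcl:.3f}, UCL = {xbar_ucl:.3f}")
print(f"R chart:     CL = {rbar:.3f}, LCL = {r_lcl:.3f}, UCL = {r_ucl:.3f}")
out = np.where((xbar < xbar_lcl) | (xbar > xbar_ucl))[0]
print("subgroups signalling on the X-bar chart:", out)
```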
Statistics for People Who (Think They) Hate Statistics. Third Edition
ERIC Educational Resources Information Center
Salkind, Neil J.
2007-01-01
This text teaches an often intimidating and difficult subject in a way that is informative, personable, and clear. The author takes students through various statistical procedures, beginning with correlation and graphical representation of data and ending with inferential techniques and analysis of variance. In addition, the text covers SPSS, and…
ERIC Educational Resources Information Center
Theoret, Julie M.; Luna, Andrea
2009-01-01
This action research combined qualitative and quantitative techniques to investigate two different types of writing assignments in an introductory undergraduate statistics course. The assignments were written in response to the same set of prompts but in two different ways: homework journal assignments or initial posts to a computer discussion…
Imura, N; Kato, A S; Novo, N F; Hata, G; Uemura, M; Toda, T
2001-10-01
The purpose of this study was to compare the effects of two engine-driven, nickel-titanium instrument systems with hand files in the final shape of slightly and moderately curved canals. A total of 72 mesial roots of extracted human mandibular molars were divided into three groups: ProFile .04 taper, Pow-R rotary systems, and Flex-R hand-filing technique. The roots were mounted and cross-sectioned at two different horizontal levels using a modified Bramante technique. Pre- and postinstrumented cross-sectional roots were imaged, recorded, and computer analyzed. Results showed that, at the middle third, in almost all groups, there was a tendency to cut more toward the mesial side, with only one exception: Pow-R cut more to the distal side (danger zone) (p < 0.02). At the apical third, Flex-R (p < 0.03) and ProFile (p < 0.001) transported to the mesial side (danger zone) when the curvature increased. When the three techniques were compared, analyzing each side and considering the two curvature groups, at the middle third in the moderately curved-canal group, Flex-R cut statistically more than Pow-R toward the lingual side. The other comparisons showed no statistically significant difference. When the techniques were compared in relation to the degree of curvature, in the apical third, ProFile .04 cut statistically more toward the mesial side in the moderately curved canal group than in the slightly curved canal group. The other comparisons showed no statistically significant difference. Canal preparation time was shorter with hand instrumentation (p < 0.05) in a few instances.
[Digital radiography in young children. Considerations based on experiences in practice].
Berkhout, W E R; Mileman, P A; Weerheijm, K L
2004-10-01
In dentistry, digital radiology techniques, such as a charge-coupled device and a storage phosphor plate, are gaining popularity. It was the objective of this study to assess the importance of the advantages and disadvantages of digital radiology techniques for bitewing radiography in young children, when compared to conventional film. A group of dentists received a questionnaire regarding their experiences with digital radiology techniques or conventional films among young children. Using the Simple Multi-Attributive Rating Technique (SMART), a final weighted score was calculated for the charge-coupled device, the phosphor plate, and conventional film. The scores were 7.40, 7.38, and 6.98, respectively. The differences were not statistically significant (p > 0.47). It could be concluded that, on the basis of experiences in practice, there are no statistically significant preferences for the use of digital radiology techniques for bitewing radiography in young children.
Lightweight and Statistical Techniques for Petascale Debugging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Barton
2014-06-30
This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale and substantial effort and computation cycles are wasted in either reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leading to a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems are purchased. We developed a new paradigm for debugging at scale: techniques that reduced the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that efficiently work on the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces, application-specific classification parameters, such as global variables, statistical data acquisition techniques and machine learning based approaches to perform root cause analysis. Work done under this project can be divided into two categories: new algorithms and techniques for scalable debugging, and foundation infrastructure work on our MRNet multicast-reduction framework for scalability, and Dyninst binary analysis and instrumentation toolkits.
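One of the ideas described above, reducing a large run to equivalence classes of tasks that share a stack trace so a conventional debugger only needs one representative per class, can be sketched in a few lines. The snippet below is a toy illustration with invented stack traces, not the STAT implementation.

```python
# Minimal sketch: group tasks into equivalence classes by their stack trace and pick
# one representative per class as a debug target. The trace data below is hypothetical.
from collections import defaultdict

# Hypothetical (task id -> stack trace) data, e.g. gathered from a hung MPI job.
stack_traces = {
    0: ("main", "solver_step", "MPI_Allreduce"),
    1: ("main", "solver_step", "MPI_Allreduce"),
    2: ("main", "io_flush", "MPI_File_write"),
    3: ("main", "solver_step", "MPI_Allreduce"),
    4: ("main", "io_flush", "MPI_File_write"),
    5: ("main", "solver_step", "compute_kernel"),   # the odd one out
}

classes = defaultdict(list)
for task, trace in stack_traces.items():
    classes[trace].append(task)

for trace, tasks in sorted(classes.items(), key=lambda kv: len(kv[1]), reverse=True):
    rep = tasks[0]
    print(f"{len(tasks):2d} task(s) at {' -> '.join(trace)}; debug representative: task {rep}")
```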
Use of statistical procedures in Brazilian and international dental journals.
Ambrosano, Gláucia Maria Bovi; Reis, André Figueiredo; Giannini, Marcelo; Pereira, Antônio Carlos
2004-01-01
A descriptive survey was performed in order to assess the statistical content and quality of Brazilian and international dental journals, and compare their evolution throughout the last decades. The authors identified the reporting and accuracy of statistical techniques in 1000 papers published from 1970 to 2000 in seven dental journals: three Brazilian (Brazilian Dental Journal, Revista de Odontologia da Universidade de Sao Paulo and Revista de Odontologia da UNESP) and four international journals (Journal of the American Dental Association, Journal of Dental Research, Caries Research and Journal of Periodontology). Papers were divided into two time periods: from 1970 to 1989, and from 1990 to 2000. A slight increase in the number of articles that presented some form of statistical technique was noticed for Brazilian journals (from 61.0 to 66.7%), whereas for international journals, a significant increase was observed (65.8 to 92.6%). In addition, a decrease in the number of statistical errors was verified. The most commonly used statistical tests as well as the most frequent errors found in dental journals were assessed. Hopefully, this investigation will encourage dental educators to better plan the teaching of biostatistics, and to improve the statistical quality of submitted manuscripts.
Gräfe, James L; McNeill, Fiona E
2018-06-28
This article briefly reviews the main measurement techniques for the non-invasive detection of residual gadolinium (Gd) in those exposed to gadolinium-based contrast agents (GBCAs). Approach and Main results: The current status of in vivo Gd measurement is discussed and is put into the context of concerns within the radiology community. The main techniques are based on applied atomic/nuclear medicine utilizing the characteristic atomic and nuclear spectroscopic signature of Gd. The main emission energies are in the 40-200 keV region and require spectroscopic detectors with good energy resolution. The two main techniques, prompt gamma neutron activation analysis and x-ray fluorescence, provide adequate detection limits for in vivo measurement, whilst delivering a low effective radiation dose on the order of a few µSv. Gadolinium is being detected in measureable quantities in people with healthy renal function who have received FDA approved GBCAs. The applied atomic/nuclear medicine techniques discussed in this review will be useful in determining the significance of this retention, and will help on advising future administration protocols.
Kim, Seokyeon; Jeong, Seongmin; Woo, Insoo; Jang, Yun; Maciejewski, Ross; Ebert, David S
2018-03-01
Geographic visualization research has focused on a variety of techniques to represent and explore spatiotemporal data. The goal of those techniques is to enable users to explore events and interactions over space and time in order to facilitate the discovery of patterns, anomalies and relationships within the data. However, it is difficult to extract and visualize data flow patterns over time for non-directional statistical data without trajectory information. In this work, we develop a novel flow analysis technique to extract, represent, and analyze flow maps of non-directional spatiotemporal data unaccompanied by trajectory information. We estimate a continuous distribution of these events over space and time, and extract flow fields for spatial and temporal changes utilizing a gravity model. Then, we visualize the spatiotemporal patterns in the data by employing flow visualization techniques. The user is presented with temporal trends of geo-referenced discrete events on a map. As such, overall spatiotemporal data flow patterns help users analyze geo-referenced temporal events, such as disease outbreaks, crime patterns, etc. To validate our model, we discard the trajectory information in an origin-destination dataset and apply our technique to the data and compare the derived trajectories and the original. Finally, we present spatiotemporal trend analysis for statistical datasets including twitter data, maritime search and rescue events, and syndromic surveillance.
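A loose sketch of the underlying intuition only (the paper's gravity model is more elaborate): estimate a smoothed event density for consecutive time slices and derive a flow proxy from the spatial gradient of the change in density. All data below are simulated.

```python
# Loose sketch of the general idea, not the authors' method: density per time slice,
# then a flow proxy from the gradient of the change between slices.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(6)
grid = 50

def density(center, n_events=400, sigma=2.0):
    """Smoothed 2D histogram of events scattered around a given center."""
    pts = rng.normal(loc=center, scale=5.0, size=(n_events, 2))
    hist, _, _ = np.histogram2d(pts[:, 0], pts[:, 1],
                                bins=grid, range=[[0, grid], [0, grid]])
    return gaussian_filter(hist, sigma)

d_t0 = density(center=(20.0, 20.0))     # events at time t
d_t1 = density(center=(25.0, 24.0))     # events at time t+1 (center has drifted)

change = d_t1 - d_t0
g0, g1 = np.gradient(change)            # derivatives of the density change along the two grid axes
mean_dir = np.array([g0[change > 0].mean(), g1[change > 0].mean()])
print("mean gradient direction in growing regions:", mean_dir.round(3))
```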
NASA Technical Reports Server (NTRS)
Poulain, Pierre-Marie; Luther, Douglas S.; Patzert, William C.
1992-01-01
Two techniques were developed for estimating statistics of inertial oscillations from satellite-tracked drifters that overcome the difficulties inherent in estimating such statistics from data dependent upon space coordinates that are a function of time. Application of these techniques to tropical surface drifter data collected during the NORPAX, EPOCS, and TOGA programs reveals a latitude-dependent, statistically significant 'blue shift' of inertial wave frequency. The latitudinal dependence of the blue shift is similar to predictions based on 'global' internal-wave spectral models, with a superposition of frequency shifting due to modification of the effective local inertial frequency by the presence of strongly sheared zonal mean currents within 12 deg of the equator.
Mathysen, Danny G P; Aclimandos, Wagih; Roelant, Ella; Wouters, Kristien; Creuzot-Garcher, Catherine; Ringens, Peter J; Hawlina, Marko; Tassignon, Marie-José
2013-11-01
To investigate whether introduction of item-response theory (IRT) analysis, in parallel to the 'traditional' statistical analysis methods available for performance evaluation of multiple T/F items as used in the European Board of Ophthalmology Diploma (EBOD) examination, has proved beneficial, and secondly, to study whether the overall assessment performance of the current written part of EBOD is sufficiently high (KR-20 ≥ 0.90) to be kept as the examination format in future EBOD editions. 'Traditional' analysis methods for individual MCQ item performance comprise P-statistics, Rit-statistics and item discrimination, while overall reliability is evaluated through KR-20 for multiple T/F items. The additional set of statistical analysis methods for the evaluation of EBOD comprises mainly IRT analysis. These analysis techniques are used to monitor whether the introduction of negative marking for incorrect answers (since EBOD 2010) has a positive influence on the statistical performance of EBOD as a whole and its individual test items in particular. Item-response theory analysis demonstrated that item performance parameters should not be evaluated individually, but should be related to one another. Before the introduction of negative marking, the overall EBOD reliability (KR-20) was good though with room for improvement (EBOD 2008: 0.81; EBOD 2009: 0.78). After the introduction of negative marking, the overall reliability of EBOD improved significantly (EBOD 2010: 0.92; EBOD 2011: 0.91; EBOD 2012: 0.91). Although many statistical performance parameters are available to evaluate individual items, our study demonstrates that the overall reliability assessment remains the only crucial parameter to be evaluated allowing comparison. While individual item performance analysis is worthwhile to undertake as a secondary analysis, drawing final conclusions seems to be more difficult. Performance parameters need to be related, as shown by IRT analysis. Therefore, IRT analysis has proved beneficial for the statistical analysis of EBOD. Introduction of negative marking has led to a significant increase in the reliability (KR-20 > 0.90), indicating that the current examination format can be kept for future EBOD examinations.
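KR-20, the reliability statistic quoted throughout, is straightforward to compute from a matrix of dichotomously scored items. The sketch below uses simulated candidate responses, not EBOD data.

```python
# Minimal sketch (simulated data): Kuder-Richardson formula 20 (KR-20) for a matrix
# of dichotomously scored items (rows = candidates, columns = items; 1 = correct).
import numpy as np

rng = np.random.default_rng(7)
ability = rng.normal(size=(200, 1))                 # latent candidate ability (toy model)
difficulty = rng.normal(size=40)                    # item difficulties
prob = 1.0 / (1.0 + np.exp(-(ability - difficulty)))
scores = (rng.random((200, 40)) < prob).astype(int)

def kr20(X):
    k = X.shape[1]                      # number of items
    p = X.mean(axis=0)                  # proportion correct per item
    q = 1 - p
    total_var = X.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - np.sum(p * q) / total_var)

print(f"KR-20 = {kr20(scores):.3f}")
```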
Exploring the Connection Between Sampling Problems in Bayesian Inference and Statistical Mechanics
NASA Technical Reports Server (NTRS)
Pohorille, Andrew
2006-01-01
The Bayesian and statistical mechanical communities often share the same objective in their work - estimating and integrating probability distribution functions (pdfs) describing stochastic systems, models or processes. Frequently, these pdfs are complex functions of random variables exhibiting multiple, well separated local minima. Conventional strategies for sampling such pdfs are inefficient, sometimes leading to an apparent non-ergodic behavior. Several recently developed techniques for handling this problem have been successfully applied in statistical mechanics. In the multicanonical and Wang-Landau Monte Carlo (MC) methods, the correct pdfs are recovered from uniform sampling of the parameter space by iteratively establishing proper weighting factors connecting these distributions. Trivial generalizations allow for sampling from any chosen pdf. The closely related transition matrix method relies on estimating transition probabilities between different states. All these methods proved to generate estimates of pdfs with high statistical accuracy. In another MC technique, parallel tempering, several random walks, each corresponding to a different value of a parameter (e.g. "temperature"), are generated and occasionally exchanged using the Metropolis criterion. This method can be considered as a statistically correct version of simulated annealing. An alternative approach is to represent the set of independent variables as a Hamiltonian system. Considerable progress has been made in understanding how to ensure that the system obeys the equipartition theorem or, equivalently, that coupling between the variables is correctly described. Then a host of techniques developed for dynamical systems can be used. Among them, probably the most powerful is the Adaptive Biasing Force method, in which thermodynamic integration and biased sampling are combined to yield very efficient estimates of pdfs. The third class of methods deals with transitions between states described by rate constants. These problems are isomorphic with chemical kinetics problems. Recently, several efficient techniques for this purpose have been developed based on the approach originally proposed by Gillespie. Although the utility of the techniques mentioned above for Bayesian problems has not been determined, further research along these lines is warranted.
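Of the techniques listed, parallel tempering is easy to sketch. The snippet below runs several Metropolis chains at different temperatures on a simple bimodal 1D target and occasionally swaps neighbouring chains; the target, ladder, and tuning values are illustrative assumptions, not taken from the text.

```python
# Minimal sketch of parallel tempering on a 1D bimodal target: hot chains cross
# between modes easily and swaps pass that mobility down to the cold (T = 1) chain.
import numpy as np

rng = np.random.default_rng(8)

def log_p(x):
    # Bimodal target: mixture of two well-separated Gaussians (unnormalized)
    return np.logaddexp(-0.5 * (x - 4.0) ** 2, -0.5 * (x + 4.0) ** 2)

temps = np.array([1.0, 2.0, 4.0, 8.0])        # temperature ladder (T = 1 is the target)
x = np.zeros(len(temps))
cold_samples = []

for step in range(10000):
    # Within-chain Metropolis updates, each chain sampling p(x)^(1/T)
    for i, T in enumerate(temps):
        prop = x[i] + rng.normal(0, 1.0)
        if np.log(rng.random()) < (log_p(prop) - log_p(x[i])) / T:
            x[i] = prop
    # Attempt a swap between a random pair of neighbouring temperatures
    i = rng.integers(len(temps) - 1)
    delta = (1 / temps[i] - 1 / temps[i + 1]) * (log_p(x[i + 1]) - log_p(x[i]))
    if np.log(rng.random()) < delta:
        x[i], x[i + 1] = x[i + 1], x[i]
    cold_samples.append(x[0])

cold = np.array(cold_samples[2000:])          # discard burn-in
print("fraction of cold-chain samples in each mode:",
      np.mean(cold > 0).round(2), np.mean(cold < 0).round(2))
```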
42 CFR 402.7 - Notice of proposed determination.
Code of Federal Regulations, 2010 CFR
2010-10-01
... and a brief description of the statistical sampling technique CMS or OIG used. (3) The reason why the... is relying upon statistical sampling to project the number and types of claims or requests for...
2012-01-01
Background Opiates are the main drugs of abuse, and Methadone Maintenance Treatment (MMT) is the most widely administered drug addiction treatment program in Iran. Our study aimed to investigate patterns of pre-treatment drug abuse, addiction treatment history and characteristics of patients in MMT in Tehran. Methods We applied a stratified cluster random sampling technique and conducted a cross-sectional survey utilizing a standard patient characteristic and addiction history form with patients (n = 810) in MMT. The Chi-square test and t-test served for statistical analyses. Results A clear majority of the participants were men (96%), more than 60% of whom were between 25 and 44 years of age, educated (89% had more than elementary education), and employed (>70%). The most commonly reported main drugs of abuse prior to MMT entry were opium (69%) and crystalline heroin (24%). The patients’ lifetime drug experience included opium (92%), crystalline heroin (28%), cannabis (16%), amphetamines (15%), and other drugs (33%). Crystalline heroin abusers were younger than opium users, had begun abusing drugs earlier, and reported a shorter history of opiate addiction. Conclusion Opium and crystalline heroin were the main drugs of abuse. A high rate of addiction using more dangerous opiate drugs such as crystalline heroin calls for more preventive efforts, especially among young men. PMID:22676557
NASA Astrophysics Data System (ADS)
Caballero, Rafael; Gil, Ángel; Fernández-Santos, Xavier
2008-08-01
European Large Scale Grazing Systems (LSGS) are at a crossroad with environmental, agronomic, and social factors interacting on their future viability. This research assesses the current environmental and socio-economic status of a wide range of European LSGS according to an agreed subset of sustainability criteria and indicators, which have been recognized by corresponding experts and privileged observers on their respective case-study system. A survey questionnaire was drafted containing five main criteria (pastoral use, environmental, economic, social, and market and development), with four conceptual-scored variables (indicators) within each criterion. Descriptive, analytical and clustering statistical techniques helped to draw a synthesis of the main result and to standardize sustainability variables across different biogeographical regions and management situations. The results show large multicollinearity among the 20 variables proposed. This dependence was revealed by the reduction to six main factor-components, which accounted for about 73% of the total variance in responses. Aggregation of point-score indicators across criteria to obtain a sustainability index can be of less policy relevance than responses to specific criteria or indicators. Affinity between case-study systems, as judged by collaborative-expert responses, was not related to biogeographical location, operating livestock sector, or population density in their areas. The results show larger weaknesses and constraints in the economic and social criteria than in the pastoral and environmental criteria, and the large heterogeneity of responses appears in the social criterion.
NASA Astrophysics Data System (ADS)
Kouhpeima, A.; Feiznia, S.; Ahmadi, H.; Hashemi, S. A.; Zareiee, A. R.
2010-09-01
The targeting of sediment management strategies is a key requirement in developing countries, including Iran, because of the limited resources available. This targeting is, however, hampered by the lack of reliable information on catchment sediment sources. This paper reports the results of using a quantitative composite fingerprinting technique to estimate the relative importance of the primary potential sources within the Amrovan and Royan catchments in Semnan Province, Iran. Fifteen tracers were first selected for tracing, and samples were analyzed in the laboratory for these parameters. Statistical methods were applied to the data, including the nonparametric Kruskal-Wallis test and Differentiation Function Analysis (DFA). For the Amrovan catchment, three parameters (N, Cr and Co) were found not to be significant in making the discrimination. The optimum fingerprint, comprising OC, pH, kaolinite and K, was able to correctly distinguish 100% of the source material samples. For the Royan catchment, all of the 15 properties were able to distinguish between the six source types, and the optimum fingerprint provided by stepwise DFA (chlorite, XFD, N and C) correctly classified 92.9% of the source material samples. The mean contributions from each sediment source obtained by the multivariate mixing model varied between the two catchments. For the Amrovan catchment, the Upper Red Formation is the main sediment source, supplying approximately 36% of the reservoir sediment, whereas the dominant sediment source for the Royan catchment is the Karaj Formation, which supplies 33% of the reservoir sediments. Results indicate that the source fingerprinting approach appears to work well in the study catchments and to generate reliable results.
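The un-mixing step of a composite fingerprinting approach can be illustrated with a simple constrained least-squares formulation; the multivariate mixing model used in the study may differ in detail, and the tracer concentrations below are invented.

```python
# Minimal sketch: estimate source contributions by un-mixing sediment tracer
# concentrations with non-negative least squares, with a heavily weighted extra
# equation pushing the proportions to sum to one. All values are hypothetical.
import numpy as np
from scipy.optimize import nnls

# Rows = tracers, columns = candidate sources (mean tracer concentration per source)
A = np.array([[12.0,  4.0,  7.5],
              [ 0.8,  2.1,  1.4],
              [30.0, 55.0, 41.0],
              [ 5.5,  3.2,  9.1]])
sediment = np.array([8.1, 1.5, 44.0, 6.3])     # tracer concentrations in reservoir sediment

weight = 100.0                                  # enforce sum-to-one softly
A_aug = np.vstack([A, weight * np.ones(A.shape[1])])
b_aug = np.append(sediment, weight)

proportions, residual = nnls(A_aug, b_aug)
for i, p in enumerate(proportions):
    print(f"source {i + 1}: {p:.2%} of the sediment mixture")
```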
Hugo, Sanet; Van Rensburg, Berndt J.; Van Wyk, Abraham E.; Steenkamp, Yolande
2012-01-01
The distributions of naturalised alien plant species that have invaded natural or semi-natural habitat are often geographically restricted by the environmental conditions in their new range, implying that alien species with similar environmental requirements and tolerances may form assemblages and characterise particular areas. The aim of this study was to use objective numerical techniques to reveal any possible alien phytogeographic regions (i.e. geographic areas with characteristic alien plant assemblages) in southern Africa. Quarter degree resolution presence records of naturalised alien plant species of South Africa, Lesotho, Swaziland, Namibia and Botswana were analysed through a divisive hierarchical classification technique, and the output was plotted on maps for further interpretation. The analyses revealed two main alien phytogeographic regions that could be subdivided into eight lower level phytogeographic regions. Along with knowledge of the environmental requirements of the characteristic species and supported by further statistical analyses, we hypothesised on the main drivers of alien phytogeographic regions, and suggest that environmental features such as climate and associated biomes were most important, followed by human activities that modify climatic and vegetation features, such as irrigation and agriculture. Most of the characteristic species are not currently well-known as invasive plant species, but many may have potential to become troublesome in the future. Considering the possibility of biotic homogenization, these findings have implications for predicting the characteristics of the plant assemblages of the future. However, the relatively low quality of the dataset necessitates further more in-depth studies with improved data before the findings could be directly beneficial for management. PMID:22574145
NASA Astrophysics Data System (ADS)
Belfort, Benjamin; Weill, Sylvain; Lehmann, François
2017-07-01
A novel, non-invasive imaging technique is proposed that determines 2D maps of water content in unsaturated porous media. This method directly relates digitally measured intensities to the water content of the porous medium. The method requires the classical image analysis steps, i.e., normalization, filtering, background subtraction, scaling and calibration. The main advantages of this approach are that no calibration experiment is needed, because the calibration curve relating water content and reflected light intensities is established during the main monitoring phase of each experiment, and that no tracer or dye is injected into the flow tank. The procedure enables effective processing of a large number of photographs and thus produces 2D water content maps at high temporal resolution. A drainage/imbibition experiment in a 2D flow tank with inner dimensions of 40 cm × 14 cm × 6 cm (L × W × D) is carried out to validate the methodology. The accuracy of the proposed approach is assessed using a statistical framework to perform an error analysis and numerical simulations with a state-of-the-art computational code that solves the Richards equation. Comparison of the cumulative mass leaving and entering the flow tank and of the water content maps produced by the photographic measurement technique and the numerical simulations demonstrates the efficiency and high accuracy of the proposed method for investigating vadose zone flow processes. Finally, the photometric procedure has been developed expressly for its extension to heterogeneous media. Other processes may be investigated through different laboratory experiments, which will serve as benchmarks for the validation of numerical codes.
Moreno-Merino, Sergio; Congregado, Miguel; Gallardo, Gregorio; Jimenez-Merchan, Rafael; Trivino, Ana; Cozar, Fernando; Lopez-Porras, Marta; Loscertales, Jesus
2012-01-01
Primary spontaneous pneumothorax is a pathology mainly affecting healthy young patients. Clinical guidelines do not specify the type of pleurodesis that should be conducted, due to the lack of comparative studies on the different techniques. The aim of this study was to compare talc poudrage and pleural abrasion in the treatment of spontaneous pneumothorax. A retrospective comparative study was performed, including 787 patients with primary spontaneous pneumothorax. The 787 patients were classified into two groups: Group A (pleural abrasion) n = 399 and Group B (talc pleurodesis) n = 388. The variables studied were recurrence, surgical time, morbidity and in-hospital length of stay. Statistical analysis was done by an unpaired t-test and Fisher's exact test (SPSS 18.0). Statistically significant differences were observed in the variables: surgical time (A: 46 ± 12.3; B: 37 ± 11.8 min; P < 0.001); length of stay (A: 4.7 ± 2.5; B: 4.3 ± 1.8 days; P = 0.01); apical air chamber (A: 25; B: 4; P < 0.001); pleural effusion (A: 6; B: 0; P = 0.05). Talc poudrage shows shorter surgical times and length of stay, and lower re-intervention rates. Morbidity is lower in patients with talc poudrage. Statistically significant differences were not observed in recurrence, persistent air leaks, atelectasis and haemothorax. PMID:22514256
[Seroprevalence of Q fever among the adult population of Lanzarote (Canary Islands)].
Pascual Velasco, F; Rodríguez Pérez, J C; Otero Ferrio, I; Borobio Enciso, M V
1992-09-01
Q fever is an endemic zoonosis in the Canary Islands. In 1986, we detected, in a pilot study, residual antibodies of the infection in 3% of the population from Lanzarote. In 1989, we performed a new study in order to assess seroprevalence of Q fever among the adult native population from the island. We studied 390 human serums obtained from a statistically representative sample. Age ranged from 30 to 64 years. Out of 390 serums, 196 (50.25%) were obtained from men and 194 (49.74%) from women. The serological technique used was complement fixation using Coxiella burnetii phase II antigens. Titres equal to or higher than 1/8 were considered positive. No statistically significant differences were observed with regard to seroprevalence rates by sex, age, or residence in or outside the island's capital city. However, when dividing the island's territory into three areas (north, centre and south), and assessing their respective seroprevalences independently, we observed relatively higher seroprevalences in the furthest areas (13.3% in the north and 13.5% in the south) than in the central area (4.7%), although only the higher seroprevalence in the south reached statistical significance when compared with the mean prevalence. These observations probably indicate that, although Q fever is present all over the island, it is a more frequent infection in the rural areas of Lanzarote, in the north and the south, than in the central area, where the main urban areas are located.
Automated localization and segmentation techniques for B-mode ultrasound images: A review.
Meiburger, Kristen M; Acharya, U Rajendra; Molinari, Filippo
2018-01-01
B-mode ultrasound imaging is used extensively in medicine. Hence, there is a need to have efficient segmentation tools to aid in computer-aided diagnosis, image-guided interventions, and therapy. This paper presents a comprehensive review on automated localization and segmentation techniques for B-mode ultrasound images. The paper first describes the general characteristics of B-mode ultrasound images. Then insight on the localization and segmentation of tissues is provided, both in the case in which the organ/tissue localization provides the final segmentation and in the case in which a two-step segmentation process is needed, due to the desired boundaries being too fine to locate from within the entire ultrasound frame. Subsequently, examples of some of the main techniques found in the literature are shown, including but not limited to shape priors, superpixel and classification, local pixel statistics, active contours, edge-tracking, dynamic programming, and data mining. Ten selected applications (abdomen/kidney, breast, cardiology, thyroid, liver, vascular, musculoskeletal, obstetrics, gynecology, prostate) are then investigated in depth, and the performances of a few specific applications are compared. In conclusion, future perspectives for B-mode based segmentation, such as the integration of RF information, the employment of higher frequency probes when possible, the focus on completely automatic algorithms, and the increase in available data are discussed.
Beck-Fruchter, Ronit; Shalev, Eliezer; Weiss, Amir
2016-03-01
The human oocyte is surrounded by hyaluronic acid, which acts as a natural selector of spermatozoa. Human sperm that express hyaluronic acid receptors and bind to hyaluronic acid have normal shape, minimal DNA fragmentation and low frequency of chromosomal aneuploidies. Use of hyaluronic acid binding assays in intracytoplasmic sperm injection (ICSI) cycles to improve clinical outcomes has been studied, although none of these studies had sufficient statistical power. In this systematic review and meta-analysis, electronic databases were searched up to June 2015 to identify studies of ICSI cycles in which spermatozoa able to bind hyaluronic acid was selected. The main outcomes were fertilization rate and clinical pregnancy rate. Secondary outcomes included cleavage rate, embryo quality, implantation rate, spontaneous abortion and live birth rate. Seven studies and 1437 cycles were included. Use of hyaluronic acid binding sperm selection technique yielded no improvement in fertilization and pregnancy rates. A meta-analysis of all available studies showed an improvement in embryo quality and implantation rate; an analysis of prospective studies only showed an improvement in embryo quality. Evidence does not support routine use of hyaluronic acid binding assays in all ICSI cycles. Identification of patients that might benefit from this technique needs further study.
NASA Astrophysics Data System (ADS)
Singh, Jitendra; Sekharan, Sheeba; Karmakar, Subhankar; Ghosh, Subimal; Zope, P. E.; Eldho, T. I.
2017-04-01
Mumbai, the commercial and financial capital of India, experiences incessant annual rain episodes, mainly attributable to erratic rainfall patterns during the monsoon and an urban heat-island effect due to escalating urbanization, leading to increasing vulnerability to frequent flooding. After the infamous episode of the 2005 Mumbai torrential rains, when only two rain gauging stations existed, the governing civic body, the Municipal Corporation of Greater Mumbai (MCGM), came forward with an initiative to install 26 automatic weather stations (AWS) in June 2006 (MCGM 2007), which later increased to 60 AWS. A comprehensive statistical analysis to understand the spatio-temporal pattern of rainfall over Mumbai or any other coastal city in India has never been attempted earlier. In the current study, a thorough analysis of available rainfall data for 2006-2014 from these stations was performed; the 2013-2014 sub-hourly data from 26 AWS were found useful for further analyses due to their consistency and continuity. The correlogram cloud indicated no pattern of significant correlation when we considered the closest to the farthest gauging station from the base station; this impression was also supported by the semivariogram plots. Gini index values, a statistical measure of temporal non-uniformity, were found to be above 0.8 for a visible majority of stations and showed an increasing trend at most gauging stations; this led us to conclude that inconsistency in daily rainfall gradually increases as the monsoon progresses. Interestingly, night rainfall was lower than daytime rainfall. The pattern-less high spatio-temporal variation observed in Mumbai rainfall data signifies the futility of independently applying advanced statistical techniques, and thus calls for simultaneous inclusion of physics-centred models such as different meso-scale numerical weather prediction systems, particularly the Weather Research and Forecasting (WRF) model.
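The Gini index used above as a measure of temporal non-uniformity can be computed directly from daily rainfall totals. A short sketch with synthetic seasons (not the Mumbai AWS data):

```python
# Minimal sketch: Gini index of daily rainfall within a season
# (0 = perfectly even rainfall, values near 1 = rain concentrated in a few days).
import numpy as np

def gini(values):
    v = np.sort(np.asarray(values, dtype=float))
    n = v.size
    cum = np.cumsum(v)
    # Standard formula based on the Lorenz curve
    return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

rng = np.random.default_rng(9)
even_season = rng.uniform(5, 15, size=120)                 # rain spread across the season
spiky_season = np.zeros(120)
spiky_season[rng.choice(120, size=8, replace=False)] = rng.uniform(80, 250, size=8)

print("Gini (evenly spread rainfall):", round(gini(even_season), 2))
print("Gini (few intense rain days): ", round(gini(spiky_season), 2))
```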
Impact of Different Surgeons on Dental Implant Failure.
Chrcanovic, Bruno Ramos; Kisch, Jenö; Albrektsson, Tomas; Wennerberg, Ann
To assess the influence of several factors on the prevalence of dental implant failure, with special consideration of the placement of implants by different dental surgeons. This retrospective study is based on 2,670 patients who received 10,096 implants at one specialist clinic. Only the data of patients and implants treated by surgeons who had inserted a minimum of 200 implants at the clinic were included. Kaplan-Meier curves were stratified with respect to the individual surgeon. A generalized estimating equation (GEE) method was used to account for the fact that repeated observations (several implants) were placed in a single patient. The factors bone quantity, bone quality, implant location, implant surface, and implant system were analyzed with descriptive statistics separately for each individual surgeon. A total of 10 surgeons were eligible. The differences between the survival curves of each individual were statistically significant. The multivariate GEE model showed the following variables to be statistically significant: surgeon, bruxism, intake of antidepressants, location, implant length, and implant system. The surgeon with the highest absolute number of failures was also the one who inserted the most implants in sites of poor bone and used turned implants in most cases, whereas the surgeon with the lowest absolute number of failures used mainly modern implants. Separate survival analyses of turned and modern implants stratified for the individual surgeon showed statistically significant differences in cumulative survival. Different levels of failure incidence could be observed between the surgeons, occasionally reaching significant levels. Although a direct causal relationship could not be ascertained, the results of the present study suggest that the surgeons' technique, skills, and/or judgment may negatively influence implant survival rates.
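Survival curves stratified by surgeon, of the kind compared in this study, reduce to the Kaplan-Meier estimator applied within each stratum. The sketch below uses made-up failure times, not the clinic's records, and omits the GEE modelling step.

```python
# Minimal sketch: Kaplan-Meier implant survival stratified by surgeon (toy data).
import numpy as np

def kaplan_meier(time, event):
    """Return (time, survival) pairs; event = 1 means failure, 0 means censored."""
    time, event = np.asarray(time, float), np.asarray(event, int)
    surv, s = [], 1.0
    for t in np.unique(time):
        d = np.sum((time == t) & (event == 1))   # failures at t
        n = np.sum(time >= t)                    # implants still at risk at t
        if d > 0:
            s *= 1 - d / n
        surv.append((t, s))
    return surv

rng = np.random.default_rng(10)
# Hypothetical follow-up times (months) and failure indicators for two surgeons
curves = {}
for surgeon, scale in [("A", 200.0), ("B", 90.0)]:
    t = rng.exponential(scale, size=120).clip(max=120)   # administrative censoring at 120 months
    e = (t < 120).astype(int)
    curves[surgeon] = kaplan_meier(t, e)

for surgeon, curve in curves.items():
    up_to_60 = [s for t, s in curve if t <= 60]
    print(f"surgeon {surgeon}: estimated 5-year implant survival = {up_to_60[-1]:.2f}")
```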
ERIC Educational Resources Information Center
Brezavšcek, Alenka; Šparl, Petra; Žnidaršic, Anja
2017-01-01
The aim of the paper is to investigate the main factors influencing the adoption and continuous utilization of statistical software among university social sciences students in Slovenia. Based on the Technology Acceptance Model (TAM), a conceptual model was derived where five external variables were taken into account: statistical software…
Eng, Kevin H; Schiller, Emily; Morrell, Kayla
2015-11-03
Researchers developing biomarkers for cancer prognosis from quantitative gene expression data are often faced with an odd methodological discrepancy: while Cox's proportional hazards model, the appropriate and popular technique, produces a continuous and relative risk score, it is hard to cast the estimate in clear clinical terms like median months of survival and percent of patients affected. To produce a familiar Kaplan-Meier plot, researchers commonly make the decision to dichotomize a continuous (often unimodal and symmetric) score. It is well known in the statistical literature that this procedure induces significant bias. We illustrate the liabilities of common techniques for categorizing a risk score and discuss alternative approaches. We promote the use of the restricted mean survival (RMS) and the corresponding RMS curve that may be thought of as an analog to the best fit line from simple linear regression. Continuous biomarker workflows should be modified to include the more rigorous statistical techniques and descriptive plots described in this article. All statistics discussed can be computed via standard functions in the Survival package of the R statistical programming language. Example R language code for the RMS curve is presented in the appendix.
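The restricted mean survival promoted here is simply the area under the Kaplan-Meier curve up to a horizon tau. The authors work with R's Survival package; the sketch below re-expresses the same quantity in Python with simulated data, splitting a continuous risk score into tertiles only for display, since the RMS itself never requires dichotomizing the score.

```python
# Minimal sketch (not the authors' R code): restricted mean survival time (RMS)
# as the area under the Kaplan-Meier step curve up to a horizon tau.
import numpy as np

def km_curve(time, event):
    """Step-function Kaplan-Meier estimate: arrays of event times and survival values."""
    time, event = np.asarray(time, float), np.asarray(event, int)
    ts, surv, s = [0.0], [1.0], 1.0
    for t in np.unique(time[event == 1]):
        d = np.sum((time == t) & (event == 1))
        n = np.sum(time >= t)
        s *= 1 - d / n
        ts.append(t)
        surv.append(s)
    return np.array(ts), np.array(surv)

def restricted_mean(time, event, tau):
    ts, surv = km_curve(time, event)
    ts = np.append(ts, tau)
    widths = np.diff(np.clip(ts, 0, tau))
    return np.sum(widths * surv)          # area under the step curve up to tau

rng = np.random.default_rng(11)
risk = rng.normal(size=300)                               # continuous risk score
t = rng.exponential(scale=60 * np.exp(-0.5 * risk))       # higher score -> shorter survival
e = (t < 72).astype(int)
t = np.minimum(t, 72)                                     # censor follow-up at 72 months

for name, mask in [("low", risk < -0.43), ("mid", (risk >= -0.43) & (risk < 0.43)),
                   ("high", risk >= 0.43)]:
    rms = restricted_mean(t[mask], e[mask], tau=60)
    print(f"{name:4s} risk tertile: RMS(60 months) = {rms:.1f} months")
```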
Gooding, Owen W
2004-06-01
The use of parallel synthesis techniques with statistical design of experiment (DoE) methods is a powerful combination for the optimization of chemical processes. Advances in parallel synthesis equipment and easy to use software for statistical DoE have fueled a growing acceptance of these techniques in the pharmaceutical industry. As drug candidate structures become more complex at the same time that development timelines are compressed, these enabling technologies promise to become more important in the future.
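A minimal sketch of the kind of screening design such a workflow might start from (not an example from the article): a two-level full factorial over three invented factors, with main effects estimated from a toy response.

```python
# Minimal sketch: generate a 2^3 full factorial design in coded units and estimate
# main effects from a simulated yield response. Factor names and effects are invented.
import itertools
import numpy as np

factors = ["temperature", "catalyst_loading", "solvent_ratio"]
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))  # 8 runs, coded units

rng = np.random.default_rng(12)
# Toy "measured" yield: temperature helps, catalyst loading helps slightly, solvent hurts
yield_pct = (70 + 8 * design[:, 0] + 3 * design[:, 1] - 5 * design[:, 2]
             + rng.normal(0, 1.5, len(design)))

for j, name in enumerate(factors):
    effect = yield_pct[design[:, j] == 1].mean() - yield_pct[design[:, j] == -1].mean()
    print(f"main effect of {name:17s}: {effect:+.1f} % yield")
```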
A Regression Design Approach to Optimal and Robust Spacing Selection.
1981-07-01
Hassanein (1968, 1969a, 1969b, 1971, 1972, 1977), Kulldorf (1963), Kulldorf and Vannman (1973), Rhodin (1976), Sarhan and Greenberg (1958, 1962) and... of d0 and Q0^-1 d0' are in the reproducing kernel Hilbert space (RKHS) generated by R, the techniques developed by Parzen (1961a, 1961b) may be... Greenberg, B.G. (1958). Estimation problems in the exponential distribution using order statistics. Proceedings of the Statistical Techniques in Missile
Analysis of defect structure in silicon. Characterization of samples from UCP ingot 5848-13C
NASA Technical Reports Server (NTRS)
Natesh, R.; Guyer, T.; Stringfellow, G. B.
1982-01-01
Statistically significant quantitative structural imperfection measurements were made on samples from ubiquitous crystalline process (UCP) Ingot 5848-13C. Important trends were noticed between the measured data, cell efficiency, and diffusion length. Grain boundary substructure appears to have an important effect on the conversion efficiency of solar cells from Semix material. Quantitative microscopy measurements give statistically significant information compared to other microanalytical techniques. A surface preparation technique to obtain proper contrast of structural defects suitable for QTM analysis was perfected.
Effectiveness of Various Methods of Teaching Proper Inhaler Technique.
Axtell, Samantha; Haines, Seena; Fairclough, Jamie
2017-04-01
To compare the effectiveness of 4 different instructional interventions in training proper inhaler technique. Randomized, noncrossover trial. Health fair and indigent clinic. Inhaler-naive adult volunteers who spoke and read English. Subjects were assigned to complete the following: (1) read a metered dose inhaler (MDI) package insert pamphlet, (2) watch a Centers for Disease Control and Prevention (CDC) video demonstrating MDI technique, (3) watch a YouTube video demonstrating MDI technique, or (4) receive direct instruction of MDI technique from a pharmacist. Inhaler use competency (completion of all 7 prespecified critical steps). Of the 72 subjects, 21 (29.2%) demonstrated competent inhaler technique. A statistically significant difference between pharmacist direct instruction and the remaining interventions, both combined (P < .0001) and individually (P ≤ .03), was evident. No statistically significant difference was detected among the remaining 3 intervention groups. Critical steps most frequently omitted or improperly performed were exhaling before inhalation and holding of breath after inhalation. A 2-minute pharmacist counseling session is more effective than other interventions in successfully educating patients on proper inhaler technique. Pharmacists can play a pivotal role in reducing the implications of improper inhaler use.
Li, Pengxiang; Doshi, Jalpa A.
2016-01-01
Objective: Since 2007, the Centers for Medicare and Medicaid Services have published 5-star quality rating measures to aid consumers in choosing Medicare Advantage Prescription Drug Plans (MAPDs). We examined the impact of these star ratings on Medicare Advantage Prescription Drug (MAPD) enrollment before and after 2012, when star ratings became tied to bonus payments for MAPDs that could be used to improve plan benefits and/or reduce premiums in the subsequent year. Methods: A longitudinal design and multivariable hybrid models were used to assess whether star ratings had a direct impact on concurrent year MAPD contract enrollment (by influencing beneficiary choice) and/or an indirect impact on subsequent year MAPD contract enrollment (because ratings were linked to bonus payments). The main analysis was based on contract-year level data from 2009–2015. We compared effects of star ratings in the pre-bonus payment period (2009–2011) and post-bonus payment period (2012–2015). Extensive sensitivity analyses varied the analytic techniques, unit of analysis, and sample inclusion criteria. Similar analyses were conducted separately using stand-alone PDP contract-year data; since PDPs were not eligible for bonus payments, they served as an external comparison group. Result: The main analysis included 3,866 MAPD contract-years. A change of star rating had no statistically significant effect on concurrent year enrollment in any of the pre-, post-, or pre-post combined periods. On the other hand, star rating increase was associated with a statistically significant increase in the subsequent year enrollment (a 1-star increase associated with +11,337 enrollees, p<0.001) in the post-bonus payment period but had a very small and statistically non-significant effect on subsequent year enrollment in the pre-bonus payment period. Further, the difference in effects on subsequent year enrollment was statistically significant between the pre- and post-periods (p = 0.011). Sensitivity analyses indicated that the findings were robust. No statistically significant effect of star ratings was found on concurrent or subsequent year enrollment in the pre- or post-period in the external comparison group of stand-alone PDP contracts. Conclusion: Star ratings had no direct impact on concurrent year MAPD enrollment before or after the introduction of bonus payments tied to star ratings. However, after the introduction of these bonus payments, MAPD star ratings had a significant indirect impact of increasing subsequent year enrollment, likely via the reinvestment of bonuses to provide lower premiums and/or additional member benefits in the following year. PMID:27149092
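For readers unfamiliar with hybrid models, the sketch below shows one common way to set up a within-between ("hybrid") specification in R. It is not the study's code; the data frame `mapd` with columns enrollment, star_rating, contract_id, and year is assumed purely for illustration.

## Illustrative sketch only: within-between ("hybrid") panel model for
## contract-year enrollment, splitting star rating into between- and
## within-contract components
library(dplyr)
library(lme4)

mapd <- mapd %>%
  group_by(contract_id) %>%
  mutate(star_mean = mean(star_rating),            # between-contract component
         star_dev  = star_rating - star_mean) %>%  # within-contract change
  ungroup()

## Random intercept per contract; star_dev captures the effect of a change in a
## contract's own rating, star_mean the differences across contracts
hybrid_fit <- lmer(enrollment ~ star_dev + star_mean + factor(year) +
                     (1 | contract_id), data = mapd)
summary(hybrid_fit)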
Compendium of abstracts on statistical applications in geotechnical engineering
NASA Astrophysics Data System (ADS)
Hynes-Griffin, M. E.; Deer, G. W.
1983-09-01
The results of a literature search of geotechnical and statistical abstracts are presented in tables listing specific topics, title of the abstract, main author and the file number under which the abstract can be found.
A Primer on Bayesian Analysis for Experimental Psychopathologists
Krypotos, Angelos-Miltiadis; Blanken, Tessa F.; Arnaudova, Inna; Matzke, Dora; Beckers, Tom
2016-01-01
The principal goals of experimental psychopathology (EPP) research are to offer insights into the pathogenic mechanisms of mental disorders and to provide a stable ground for the development of clinical interventions. The main message of the present article is that those goals are better served by the adoption of Bayesian statistics than by the continued use of null-hypothesis significance testing (NHST). In the first part of the article we list the main disadvantages of NHST and explain why those disadvantages limit the conclusions that can be drawn from EPP research. Next, we highlight the advantages of Bayesian statistics. To illustrate, we then pit NHST and Bayesian analysis against each other using an experimental data set from our lab. Finally, we discuss some challenges when adopting Bayesian statistics. We hope that the present article will encourage experimental psychopathologists to embrace Bayesian statistics, which could strengthen the conclusions drawn from EPP research. PMID:28748068
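As a toy illustration of the NHST-versus-Bayes contrast the authors describe (not their data or analysis), the R snippet below runs a classical independent-samples t-test next to a default Bayes factor from the BayesFactor package on simulated two-group data.

## Minimal sketch: the same two-group comparison under NHST and a default
## Bayesian t-test; the data are simulated purely for illustration
set.seed(42)
control   <- rnorm(30, mean = 0.0, sd = 1)   # e.g., fear ratings, control group
treatment <- rnorm(30, mean = 0.6, sd = 1)   # e.g., fear ratings, treatment group

## NHST: an independent-samples t-test yields a p-value
t.test(treatment, control)

## Bayesian alternative: a default (JZS-prior) Bayes factor quantifies the
## evidence for an effect relative to the null hypothesis
library(BayesFactor)
ttestBF(x = treatment, y = control)

Unlike the p-value, the Bayes factor can also express evidence in favor of the null (values below 1), which is one of the advantages the article highlights.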